Last time I tried using a local LLM (about a year ago) it generated only a couple of words per second and the answers were barely relevant. Also I don’t see how a local LLM can fulfill the glorified search engine role that people use LLMs for.
Almost every language does, does it not? Rust is special because it is safe and as fast as C++
He tried to play Linus, it seems
Kind of unrelated, but why does C sometimes fail to print if it hits a breakpoint right after a printf while debugging? Or if it segfaults right after too, iirc
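My guess (could be wrong) is stdout buffering: printf writes into a user-space buffer, and if the process dies before that buffer gets flushed, the text never reaches the terminal. A minimal repro of what I mean, assuming that’s the cause:

```c
#include <stdio.h>

int main(void) {
    printf("you may never see this");  /* no '\n', so it sits in stdout's buffer */
    int *p = NULL;
    *p = 42;  /* segfault kills the process before the buffer is flushed */
    return 0;
}
```

Adding a trailing '\n' (stdout is usually line-buffered on a terminal) or calling fflush(stdout) right after the printf makes the output show up reliably.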
This is not how motion blur works at all. Is there a specific game you’re talking about? Are you sure this is not monitor ghosting?
Motion blur in games costs next to no performance. It does use motion data, but not to generate in-between frames; it uses it to smear the pixels of the existing frame.
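Roughly: for each pixel, you sample the already-rendered frame a few times along that pixel’s motion vector and average the samples. A toy CPU sketch of the idea (real games do this in a post-process shader; the names and sample count here are made up for illustration):

```c
#include <stddef.h>

#define W 1920
#define H 1080
#define N_SAMPLES 8  /* how many taps along the motion vector */

typedef struct { float r, g, b; } Color;
typedef struct { float x, y; } Vec2;

/* Smear each pixel along its screen-space motion vector. `velocity`
   is the per-pixel motion data the engine already produced while
   rendering the frame. */
void motion_blur(const Color (*frame)[W], const Vec2 (*velocity)[W],
                 Color (*out)[W]) {
    for (int y = 0; y < H; y++) {
        for (int x = 0; x < W; x++) {
            Color acc = {0.0f, 0.0f, 0.0f};
            Vec2 v = velocity[y][x];
            for (int i = 0; i < N_SAMPLES; i++) {
                /* t runs from -0.5 to +0.5: sample behind and ahead */
                float t = (float)i / (N_SAMPLES - 1) - 0.5f;
                int sx = x + (int)(v.x * t);
                int sy = y + (int)(v.y * t);
                if (sx < 0) sx = 0; else if (sx >= W) sx = W - 1;
                if (sy < 0) sy = 0; else if (sy >= H) sy = H - 1;
                acc.r += frame[sy][sx].r;
                acc.g += frame[sy][sx].g;
                acc.b += frame[sy][sx].b;
            }
            out[y][x].r = acc.r / N_SAMPLES;
            out[y][x].g = acc.g / N_SAMPLES;
            out[y][x].b = acc.b / N_SAMPLES;
        }
    }
}
```

No extra frames are rendered anywhere in this, which is why it’s so cheap compared to frame generation.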
Motion blur off looks like those high-shutter-speed fight scenes from the Kingsman movies. Good for a striking action scene, but not pleasant to look at in general. Motion blur blends the motion that happens between frames, like how anti-aliasing blurs stairstepping.
Even if it says “licence” or whatever, I’d still not be fine with it not being permanent. The language isn’t the problem
I thought with CoW file systems programs didn’t have to explicitly reflink, since normal copies are already reflinks?
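For context, by “explicitly reflink” I mean something like the FICLONE ioctl on Linux (what cp --reflink uses under the hood); whether a plain copy gets this for free depends on the tool, not just the filesystem. A rough sketch, with error handling kept minimal:

```c
#include <fcntl.h>
#include <linux/fs.h>   /* FICLONE */
#include <stdio.h>
#include <sys/ioctl.h>
#include <unistd.h>

int main(int argc, char **argv) {
    if (argc != 3) {
        fprintf(stderr, "usage: %s src dst\n", argv[0]);
        return 1;
    }
    int src = open(argv[1], O_RDONLY);
    int dst = open(argv[2], O_WRONLY | O_CREAT | O_TRUNC, 0644);
    if (src < 0 || dst < 0) { perror("open"); return 1; }

    /* Clone src's data blocks into dst: the contents are shared
       copy-on-write, not duplicated. Fails on non-CoW filesystems. */
    if (ioctl(dst, FICLONE, src) < 0) { perror("FICLONE"); return 1; }

    close(src);
    close(dst);
    return 0;
}
```

Newer coreutils cp does default to reflinking when the filesystem supports it, but a program copying with a read()/write() loop still produces a full copy.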
This is kind of a strange article. Of course a court isn’t going to judge by someone’s character or good karma. If the judgement seems too high, the blame should be on making someone pay for others’ crimes (deterrent sentences) and on what the law books define as acceptable
They literally said the issue was an unintentional bug and then fixed it. How is that damage control?
That website was really tempting… until I remembered that these run on AA batteries and don’t have a gyro or a touchpad. M$ really is selling an $85 controller without the features of the $50 PS5 controller
Edit: apparently DualSenses are $75 now?? Wtf. And apparently they’ve never been $50? Just discard my comment
I see. I don’t consciously think about the map/level design when playing something, so my opinion of Doom comes from its mechanics and presentation, both of which are lacking in comparison to what indie boomer shooters have today.
find better examples of them (Bioshock
I don’t know if I played it wrong or something, but I really didn’t like Bioshock 1. It lasted like twice as long as it was fun, and as time passed enemies just got spongier. Ammo is super scarce in the beginning and super common at the end. The shooting isn’t very satisfying. And then there’s the existence of the elemental gun. Bioshock 2 was much better imo
I meant it as “yes why did you ask?”
I don’t have strong feelings about the level design. I think the levels I enjoyed the most were in other episodes. If this is about the keys, I’m neutral about them; I like exploring everywhere anyways, so I’d just collect them on the way. I don’t know what else to say
I feel the same way about SMB. It has historical importance, but it’s not up to the quality standards of today. I like the digital movement; it feels better than the analogue stick in newer games
Will it not use it even after the RAM fills up? I wouldn’t want the compressed part to be prioritised anyways
These are the answers they gave the first time.
Qwencoder is persistent after 6 rerolls.
Anyways, how do I make these use my GPU? The ollama logs say the model will fit into VRAM / offloading all layers, but GPU usage doesn’t change and the CPU gets the load. And regardless of the model size, VRAM usage never changes and RAM only goes up by a couple hundred megabytes. Any advice? (Linux / Nvidia) Edit: it didn’t have CUDA enabled apparently, fixed now