Reddit -> kbin.social -> kbin.run -> kbin.earth
4th time’s the charm, right?
re: The warning/grammar-checking system.
What you’re describing is called a linter, and they’ve existed for ages.
The only way I can really think of to improve them would be to give them a full understanding of your codebase as a whole, which would require a deeper understanding than current gen AI is capable of. There might be some marginal improvements possible with current gen, but it’s not going to be groundbreaking.
What I have found AI very useful for is basic repetitive stuff that isn’t easily automated in other ways or that I simply can’t be bothered to write again. eg: “Given this data model, generate a validated CRUD form” or “write a bash script that renames all the files in a folder to follow this pattern”
You still need to check what it produces though because it will happily hallucinate parameters that don’t exist, or entire validation libraries that don’t exist, but it’s usually close enough to be used as a starting point.
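For what it’s worth, that kind of rename task is also easy to sketch by hand. Here’s a minimal Python version; the specific pattern (lowercase, spaces to underscores) is just one I made up for illustration:

```python
import os

def rename_all(folder: str) -> None:
    """Rename every file in `folder` to lowercase, with spaces replaced by underscores."""
    for name in os.listdir(folder):
        new_name = name.lower().replace(" ", "_")
        if new_name != name:
            os.rename(os.path.join(folder, name),
                      os.path.join(folder, new_name))
```

The point stands either way: whether you write it or an AI does, it’s ten lines you’d rather not type again.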
Cool! What’s VR on Linux like generally? I’d like to get a headset again, but not if it means going back to Windows
I really feel for federal workers right now.
…
Some of us even voted for Trump this past election.
I really feel for most federal workers right now.
I think others have said it better than I could, but yes, I include the blatant cash grab that The Sims has become.
In case you weren’t aware, Paralives is a thing and looks really promising
Everything EA touches dies. Fuck EA
Oh, the movie! I was confused as hell for a second there. I thought there was a remaster or something that I’d missed…
…dear universe: I would buy the shit out of a well made Sonic 3 remaster
What? I respectfully disagree. The reason I struggled to stay on Pixelfed, and recently Loops too, is that as a new user all I was seeing was classical art and landscapes (and the odd weirdo), but nothing actually entertaining. Loops at least has people copying stuff from TikTok. I’m not condoning that behaviour; I’m just saying that, as a casual new user, that was pretty much the most compelling content.
Maybe I was doing it wrong, but IMHO the content creators are the core of any social media. Pixelfed and Loops will both live or die by the creators they attract
ETA: I love weirdos by the way, that wasn’t meant as a negative. If anything, more please!
We’ll figure that out later. In the meantime, let’s just keep them concentrated in one place, like a resort… or camp
LLMs not being able to tell us what bread tastes like has nothing to do with intelligence; that’s a quale. I think you meant it cannot KNOW what bread tastes like… although I still don’t understand why you’d think that’s a requirement for intelligence
I apologize for my last comment, I was drunk when I wrote it. I’d rather not put that kind of negativity into the world.
I do still disagree with you though.
On paper or not, the system supports it, which means they are very likely NOT supporting two lighting systems, which means that, yes, my point still stands. The Series S is only 5 years old. The minimum system requirements are for 7-year-old hardware.
EVERYTHING else is a matter of optimization, which no one here can comment on until the game is released. You just cannot know the game will perform badly until it is released.
As evidence of this, I will again point to the Indiana Jones game, which a) is ray traced, b) runs on the Series S, and c) runs at 60fps (although, admittedly, it’s apparently blurry)
This is my last reply because I just can’t with you anymore.
Those are some HELLA cherry picked examples. Both Dragon’s Dogma examples have the games running at MAX SETTINGS!
Even the Cyberpunk example, that was RAY TRACING on Medium.
I can’t read German, so no idea what is actually being said in the Indiana Jones article, but the closest equivalent I could find is this video which, frankly, tells me you’re completely full of shit.
It’s cute that you think that was a good counter argument
forcing gamers to have a sub par experience
The game isn’t even out yet and you’re commenting on performance! As someone else pointed out, the modern Doom games have a reputation for being extremely well optimised, so let’s wait and see how it actually performs on a 20 series card
As for needing a card > $1000, that’s just ridiculous. You can get a 4060 NEW for under $500, and again, the minimum here is a 2060.
Re: supporting old hardware, again: the minimum is 7-year-old hardware. I was also around in the 386 era, and to say that devs of that time supported hardware for longer is, at best, wildly exaggerated.
I can’t comment too deeply on consoles. I have no real experience with them, but at a very shallow level, the Series S was released in 2020, and Google suggests that it supports ray tracing.
So my point stands. Stop expecting your 10 year old hardware to run new games indefinitely
So, a couple things:
The first RTX cards were the 20 series, which came out in 2018
There was a time when volumetric lighting was also optional
There was a time when GRAPHICS CARDS were optional.
The Indiana Jones game was among the first to require RTX, as was the Avatar game.
Shit moves on. Did you expect your 1060 card from 2016 to last indefinitely? How long did you expect developers to support 2 different lighting systems?
There is so much to be angry about these days, but not this. This was inevitable. If you MUST be angry about it, at least be angry at the right devs
Also let’s be real. He’s not trying to help anyone but himself. If he ever does find some miracle cure that significantly slows aging, you can bet your ass that he’ll charge so much for it that only his billionaire cronies will be able to afford it
sigh all I’m hearing is prices for every fucking thing are going to skyrocket… globally…