Ah yes, the "GTA-style" demo that actually just looks exactly like GTA IV because it's just an unholy amalgamation of a bunch of GTA IV gameplay videos. Truly the next generation of gaming.
God forbid we have a little incremental progress, huh?
I think a good rule of thumb when deciding to criticize someone's project is to pretend it was created by your own children or your best friend. Would you be as harsh and close-minded if it were created by someone you love?
Incremental progress towards what? This is literally going backwards. I can play GTA IV on my PlayStation 3 right now; I could almost two decades ago.
But now I can instead play a version of GTA that resembles what dreaming about playing GTA would be like, in which I can press a button and, after 10 seconds of latency, watch my "character" awkwardly walk into a building as the world melts around him, all while consuming literally 100 times the computing resources that the original game required to run. And this is apparently revolutionary.
If this was created by someone I knew, I'd tell them to learn Unity or something and make an actual game.
Then go play GTA IV on your PlayStation 3, since you lack understanding of the goals of world models, cannot recognize incremental progress towards that vision, and are only interested in armchair criticism without a full understanding of the scene.
Every game company is going to use this tech in some form one day, whether in a production engine or during the prototyping phase. At some point, you're going to have to shed your biases. If you lack the vision or understanding to appreciate the research going into this, you're welcome to come back years from now when a finished product has reached the masses.
> If this was created by someone I knew, I'd tell them to learn Unity or something and make an actual game.
That's sad. Sad that you don't think actual games will be made with this technology, sad that you're gatekeeping what games even are by using vague qualifiers like "actual" which allow you to retreat from your position as needed, sad that you'd discourage them from continuing to research this amazing new avenue of creative possibilities.
Yes, we know, your groundbreaking AI startup is the future and it changes everything and it's going to take the world by storm and every company on earth will be using your technology by the end of the decade, just like every other tech startup ever. Save it for the VCs.
> gatekeeping what games even are by using vague qualifiers like "actual"
I really shouldn't need to define this, but "actual" games have gameplay. Slowly moving a character through an AI-generated "world" that melts around him, obeys no laws of physics, has no well-defined mechanics, and generally doesn't make any sense doesn't count as gameplay in my eyes, and I don't think that's a controversial opinion.
Once this "technology" allows me to play a real game, with real mechanics that make sense, that isn't just a dream-like version of an existing popular game, then you can call it an "amazing new avenue of creative possibilities." Until then, it's just a gimmick that doesn't serve much of a purpose beyond impressing the uneducated for a few minutes.
If all my years on the internet have taught me anything, it's that some people are just severely mentally unwell and will attempt to destroy anything they can get their hands on, purely because they can. Sometimes it's for attention, sometimes they just want to watch the world burn, but either way, asking "what did their target do to deserve it?" is pointless because the attacker likely never asked themselves that question either, and could very well just be a straight up sociopath.
As the internet grows, so grows the number of such people on it. In days gone by, these people would rightly have been shunned from society, and their ability to cause harm to others was severely limited, unless they were willing to resort to more... extreme methods that usually came with serious consequences. But the internet has given them a new outlet, a new way to ruin things for people across the world who would have been far, far beyond their reach before, usually without any risk of punishment.
The game should remain playable, even if via LAN only. How that is accomplished is the responsibility of the studio, not the player - maybe they should think twice before licensing proprietary components that players cannot run themselves.
If the company fails to do this, it is effectively committing theft and should be punished accordingly by law. If studio execs think this is an unreasonable demand, they're free to not release their games to the public and keep their proprietary services to themselves.
It's because of these extreme implications that I, as a gamer and software dev, haven't signed the initiative. A lot of these things are just not feasible, and they'll be much harder on indie devs than on the Bungies and Blizzards of the world.
I'm afraid if this is pushed through, the studios will just switch all online experiences to be fully subscription based. No more purchasing the game, you just pay for a month of the experience.
As a software developer, do you genuinely believe it is harder for indie game developers to build online infrastructure and pay its hosting costs than to build a LAN feature into the game, or to package local server binaries with the game, as was done just a couple of decades ago?
Most indie games I've played don't even run their own online infrastructure because of the costs. Why bother, when you can just use a storefront's matchmaking for free? And storefronts provide it as a means of soft lock-in: for example, one of my favorites, Deep Rock Galactic, doesn't have crossplay between the Steam PC version and the Xbox PC store version of the game.
And there's already software to emulate Steam's matchmaking because it's so common.
Not OP, but ever since Diablo 3 and its server issues, which often locked me out of what's supposed to be a single-player game, I absolutely refuse to buy online-only single-player games. Or anything from Blizzard.
If I can't obtain a DRM-free copy of a game I already paid for, I'll seek "alternative ways". I mostly play indie games though, and I haven't had much trouble with my archiving endeavors.
Similar to how the parents of today tell their children bedtime stories about the luddites who thought there would still be humans driving cars by 2020.
In college, newcomers will start with the basics of high level languages and then spend the rest of the time learning prompting.
Just like nowadays assembler is only a side note, C is only taught in specialized classes (OS, graphics) and most things are taught in high level languages.
The same way most of us review our compiler-generated code today (i.e., not at all). If it works, it works; if it doesn't, we fix the higher-level input and try again. I won't be surprised if in a few more generations the AI skips the human-readable code step and generates ASTs directly.
> if it doesn't, we fix the higher-level input and try again
How can I visit this fantasy world of yours where LLMs are as reliable and deterministic as compilers and any mistakes can be blamed solely on the user?
It's really easy to make unsubstantiated claims about what will happen decades from now, knowing your claims will be long forgotten when that time finally comes around.
Crawling up the abstraction ladder and 'forgetting' everything below has been the driving trend in programming since at least the 60s and probably before.
We for example have a whole generation of programmers who have no idea what the difference between a stack and a heap is and know nothing about how memory is allocated. They just assume that creating arbitrarily complex objects and data structures always works and memory never runs out. And they have successful careers, earning good money, delivering useful software. I see no reason why this won't continue.
It's really interesting to see the extreme contrast between the constant praise of AI coding tools here on HN vs the actual real world performance as seen recently on public Microsoft repos, where it utterly fails at even the most basic tasks.
I'm pretty surprised people here are saying anything good about Copilot, honestly. Its PR summaries and reviews are, for me, basically worthless, and I turned off the autocomplete snippets after the first day, when they were always tantalizingly close to being right but never actually worked.
You see the polarization in HN comments: to some people, AI is the most important engineering tool since the lever and the wheel. To others, it's basically useless. I've never seen such an extreme lack of consensus over a tech tool's usefulness.
> shouldn’t it have the right to control who or what interacts with it?
Does the manufacturer of your refrigerator have the right to control what food you're allowed to put into it? If not, why do you have different standards for computing devices? Why did it ever become okay for Apple to decide what you do with your device after they've sold it to you?