2024.07.05 While the gaming industry explores the potential of generative AI, Nintendo remains cautious due to concerns over IP rights and the company's preference for its own unique approach to game development.
So far research suggests the guardrail and hallucination problems are unsolvable, and we are seeing diminishing returns from increasing the complexity of these systems.
Hence devs will never have the control required to author an actual narrative. NPCs will end up talking about mechanics that don’t exist, or saying things that contradict the overall narrative.
Even with actual people, if you just throw them in a room and have them improv a world into existence, it never ends up quite as good as a properly authored narrative.
And LLMs are nowhere near achieving the level of internal consistency required for something like the worlds of Elden Ring or Mass Effect.
Baldur’s Gate 3 contains a truly staggering amount of writing, many times the length of classic literary works. The hallucination problem means that if all of that were AI generated, small parts of it might pass inspection, but trying to immerse yourself in it as a fictional world would have you noticing immersion-breaking continuity errors left and right.
Not gonna happen. Not really.