Over a year ago, in May 2023, Nvidia first showed off a demo of what it called ACE (Avatar Cloud Engine) for Games. It used a large language model to power NPCs that could interact with players in unscripted conversations, similar to how people chat with generative AI bots.
Today, as part of its Gamescom 2024 announcements, Nvidia revealed that its ACE for Games technology has been incorporated into a video demo for the upcoming sci-fi action game Mecha BREAK.
Nvidia stated it has developed its own on-device LLM, Nemotron-4 4B Instruct, designed specifically for role-playing, and that Mecha BREAK's developer, Amazing Seasun Games, has integrated that LLM into the game. Nvidia added:
In the ACE showcase demo, players can interact via natural language with game characters, most notably the player’s mechanic. Ask for advice on objectives, the ideal mech for the task, and more, before having their mech’s paint job quickly updated for maximum battlefield bling.
In addition to the LLM, the Mecha BREAK demo uses Nvidia's Audio2Face-3D NIM for character facial movements and OpenAI's Whisper for speech recognition. The game connects to ElevenLabs' cloud service to generate the AI NPC's voice. Mecha BREAK is officially set to launch in 2025, but there's no word yet on whether the ACE for Games features will make it into the final release.
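To give a rough idea of how such a stack fits together, here is a minimal sketch of a voice-to-voice NPC loop: speech recognition with the open-source Whisper library, a placeholder call standing in for the on-device Nemotron-4 4B Instruct model, and a placeholder text-to-speech step standing in for ElevenLabs' cloud API. The `query_npc_llm` and `synthesize_voice` functions below are hypothetical stand-ins, not actual Nvidia or ElevenLabs APIs, and the real demo's wiring is not public.

```python
import whisper  # pip install openai-whisper

# Load a small Whisper model for speech-to-text (the demo's actual model size is unknown).
stt_model = whisper.load_model("base")


def query_npc_llm(player_text: str, npc_role: str) -> str:
    """Hypothetical stand-in for the on-device role-playing LLM (e.g. Nemotron-4 4B Instruct).

    A real integration would run the model locally and condition it on the NPC's persona.
    """
    return f"[{npc_role}] Understood: {player_text!r}. A light mech should suit this objective."


def synthesize_voice(npc_text: str) -> bytes:
    """Hypothetical stand-in for a cloud text-to-speech call (e.g. ElevenLabs).

    Returns audio bytes that the game engine would play back while
    Audio2Face-3D drives the character's facial animation.
    """
    return npc_text.encode("utf-8")  # placeholder: real code would return synthesized audio


def npc_dialogue_turn(audio_path: str) -> bytes:
    # 1. Transcribe the player's spoken request.
    player_text = stt_model.transcribe(audio_path)["text"]
    # 2. Generate the NPC's reply with the role-playing LLM.
    npc_reply = query_npc_llm(player_text, npc_role="Mechanic")
    # 3. Convert the reply to speech for playback and lip-sync.
    return synthesize_voice(npc_reply)


if __name__ == "__main__":
    audio = npc_dialogue_turn("player_request.wav")  # assumes a recorded WAV file exists
    print(f"Generated {len(audio)} bytes of NPC audio")
```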
Nvidia also showed off another ACE for Games demo, called Legends, made by Perfect World Games. Nvidia states:
Within the demo, the character Yun Ni can see gamers and identify people and objects in the real world using the computer’s camera powered by GPT-4o, adding an augmented reality layer to the gameplay experience. These capabilities unlock a new level of immersion and accessibility for PC games.
There is no word on whether this combination of a chatbot and real-world camera interaction will appear in a specific future game.
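For readers curious what the camera-plus-vision-model part looks like at the API level, here is a minimal sketch that sends a single captured webcam frame to GPT-4o through OpenAI's public chat completions API and asks it to identify people and objects. This is only an illustration of the general technique; Nvidia has not detailed how the Legends demo actually feeds the camera into the model, and the file name here is a placeholder.

```python
import base64

from openai import OpenAI  # pip install openai

client = OpenAI()  # reads OPENAI_API_KEY from the environment


def describe_camera_frame(image_path: str) -> str:
    """Send one captured webcam frame to GPT-4o and ask it to identify people and objects."""
    with open(image_path, "rb") as f:
        frame_b64 = base64.b64encode(f.read()).decode("utf-8")

    response = client.chat.completions.create(
        model="gpt-4o",
        messages=[
            {
                "role": "user",
                "content": [
                    {"type": "text", "text": "Describe the people and objects visible in this frame."},
                    {"type": "image_url", "image_url": {"url": f"data:image/jpeg;base64,{frame_b64}"}},
                ],
            }
        ],
    )
    return response.choices[0].message.content


if __name__ == "__main__":
    # Assumes a frame has already been captured from the webcam and saved to disk.
    print(describe_camera_frame("webcam_frame.jpg"))
```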