With the Avatar Cloud Engine (ACE), rolled out broadly this year, Nvidia aims to breathe more life into NPCs in games, and new animation and audio functions have recently been added to enable more natural conversations and emotional expressions. With new cloud APIs for Automatic Speech Recognition (ASR), Text-to-Speech (TTS), Neural Machine Translation (NMT), and Audio2Face (A2F), developers should be able to build intelligent avatars and scale them across applications with ease.
Features available through the early access program aim to make it easier than ever to “create and deploy digital humans anywhere, at scale, using some of the most popular rendering tools like Unreal Engine 5.”
ACE’s latest AI-powered animation features and microservices aim to create more expressive digital humans, with newly added emotional support in A2F and a precise animation service for body, head, and eye movements. A2F quality improvements also include better lip sync for added realism, and more languages are now supported. More details can be found in the Nvidia blog entry.
Stalker 2 preview?
One of the first AAA titles expected to make prominent use of Nvidia’s ACE technology is said to be Stalker 2, as Wccftech mentioned. GSC Game World’s gritty open-world shooter is coming next year after a recent delay and could be the first large-scale showcase of AI-controlled NPCs in action. Despite the long wait, there has also been more positive news about the game recently.