Nvidia ACE: New AI features for lifelike NPCs

With the Avatar Cloud Engine (ACE), introduced earlier this year, Nvidia wants to breathe more life into NPCs in games. New animation and audio functions have recently been added to enable more natural conversations and emotional expressions. With new cloud APIs for Automatic Speech Recognition (ASR), Text-to-Speech (TTS), Neural Machine Translation (NMT), and Audio2Face (A2F), developers should now be able to implement intelligent avatars and scale them across applications with ease.
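The flow these services imply — player speech in, text out, translation, synthesized speech, and facial animation driven from that audio — can be sketched conceptually. The snippet below is a hypothetical illustration only: the client class, its methods, and the dialogue logic are placeholders standing in for the actual ACE cloud microservice calls (ASR, NMT, TTS, A2F), not Nvidia's real SDK.

```python
# Hypothetical sketch of the ASR -> NMT -> TTS -> Audio2Face flow described above.
# None of these classes or methods are Nvidia's real API; they are placeholders
# a developer would replace with the actual ACE cloud microservice clients.

from dataclasses import dataclass


@dataclass
class NPCResponse:
    text: str              # what the NPC says, in the player's language
    audio: bytes            # synthesized speech
    face_animation: bytes   # facial animation data driven from the audio


class HypotheticalAceClient:
    """Placeholder client; each method stands in for one ACE cloud microservice."""

    def speech_to_text(self, audio: bytes) -> str:
        # ASR: player's spoken input -> text (placeholder)
        return "Hello there, do you have any quests for me?"

    def translate(self, text: str, target_lang: str) -> str:
        # NMT: translate dialogue into the player's language (placeholder)
        return text

    def text_to_speech(self, text: str) -> bytes:
        # TTS: NPC dialogue text -> audio (placeholder)
        return b"\x00" * 16

    def audio_to_face(self, audio: bytes) -> bytes:
        # A2F: derive lip sync and facial expression from the audio (placeholder)
        return b"\x00" * 16


def npc_turn(client: HypotheticalAceClient, player_audio: bytes, lang: str) -> NPCResponse:
    """One conversational turn: listen, respond, translate, speak, animate."""
    player_text = client.speech_to_text(player_audio)
    npc_text = f"You said: {player_text}"   # the game's dialogue logic would go here
    npc_text = client.translate(npc_text, lang)
    npc_audio = client.text_to_speech(npc_text)
    face = client.audio_to_face(npc_audio)
    return NPCResponse(text=npc_text, audio=npc_audio, face_animation=face)


if __name__ == "__main__":
    response = npc_turn(HypotheticalAceClient(), player_audio=b"", lang="en")
    print(response.text)
```

In a real integration, each stage would be a network call to the corresponding cloud service, and the animation data would be streamed into the game engine's character rig.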

Features available through the early access program aim to make it easier than ever to “create and deploy digital humans anywhere, at scale, using some of the most popular rendering tools like Unreal Engine 5.”

ACE’s latest AI-powered animation features and microservices aim to create more expressive digital humans, with newly added emotional support in A2F and a precise animation service for body, head, and eye movements. A2F quality improvements also include better lip sync for added realism. In addition, more languages are now supported. More details can be found in the Nvidia blog entry.
