Nvidia-Backed Synthesia Unveils AI Avatars That Can Express Human Emotions

Synthesia, an artificial intelligence startup backed by Nvidia, on Thursday debuted a new generation of AI-generated digital avatars that can interpret users’ text input and express the corresponding human emotions.

According to the company, its “Expressive Avatars” blur the line between virtual and real-world personas. The technology aims to eliminate the costs of professional video production, such as cameras, microphones, actors, and lengthy editing. To train the system, actors recite lines in front of a green screen at Synthesia’s London studio.

In one demonstration, the company entered three lines of text into its platform: “I am happy. I am sad. I am frustrated.” The AI-generated actor in the video then read each line in the tone of the corresponding emotion.

According to Synthesia, its technology is used by more than 55,000 companies, including half of the Fortune 100, to create digital avatars for training videos and corporate presentations.

Founded in 2017, Synthesia is one of Britain’s newer AI “unicorns.” It raised $90 million from investors last year at a valuation of roughly $1 billion. Other investors include Accel, Kleiner Perkins, GV, FirstMark Capital, and MMC.

In response to concerns that its videos could be used to create fake news content, the company said publishers must register as enterprise clients in order to create synthetic avatars, and moderators review material produced with the technology.

Synthesia does not publicly disclose pricing for its enterprise clients.

To prevent bad actors from fabricating corporate profiles to spread misinformation, the company also requires all new customers to complete a comprehensive “Know Your Customer” process similar to the one used in the banking sector.

Synthesia says it is already preparing for the next round of international elections and has put a number of safeguards in place to prevent hostile actors from abusing its platform to manipulate those contests.

The company is also a member of the Coalition for Content Provenance and Authenticity, a group of AI companies working to certify and digitally “watermark” AI-generated content so that users know it was created by AI rather than by a human.