Square Enix spent over a decade working on Final Fantasy 7 Remake, but the passion project was not the only thing eating away at precious resources. The developer was also building an advanced piece of technology alongside it for one specific purpose: to make the in-game characters look as lifelike as possible.
Speaking with Edge for the latest issue, co-director Naoki Hamaguchi explained how artificial intelligence (AI) decides which expression each character makes during conversations. The program detects the emotion in every line of dialogue delivered in Final Fantasy 7 Remake and manipulates faces accordingly.
"You can generally tell the emotional content of any piece of dialogue from the intonation and patterns, if you look at it on a graph and at how the levels of tension go up and down," Hamaguchi said. "So we took a number of samples from various different voice data, downloaded them into a database, and then looked for the patterns to create a system where we could get a very high level of recognition on the actual emotional content of any piece of dialogue."
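Going by that description, the system resembles pattern matching against a library of labeled voice samples. As a rough illustration only, here is a minimal Python sketch of that idea, assuming hypothetical intonation contours, emotion labels, and a nearest-neighbor comparison; nothing below reflects Square Enix's actual implementation.

```python
# A minimal sketch of the pattern-matching idea Hamaguchi describes:
# recognizing emotion by comparing a line's intonation contour against
# a database of labeled voice samples. All feature values, labels, and
# helper names here are hypothetical illustrations, not the game's code.
import numpy as np

# Hypothetical database: (emotion label, intonation contour) pairs,
# where each contour is a fixed-length curve of tension/pitch levels.
SAMPLE_DB = [
    ("angry",   np.array([0.80, 0.90, 0.95, 0.90, 0.85])),
    ("sad",     np.array([0.30, 0.25, 0.20, 0.15, 0.10])),
    ("neutral", np.array([0.50, 0.50, 0.50, 0.50, 0.50])),
    ("excited", np.array([0.60, 0.70, 0.85, 0.90, 0.95])),
]

def classify_emotion(contour: np.ndarray) -> str:
    """Return the label of the stored contour closest to this line's
    intonation curve (1-nearest-neighbor on Euclidean distance)."""
    best_label, best_dist = "neutral", float("inf")
    for label, sample in SAMPLE_DB:
        dist = np.linalg.norm(contour - sample)
        if dist < best_dist:
            best_label, best_dist = label, dist
    return best_label

# A rising, high-tension contour should match the "excited" pattern.
line_contour = np.array([0.55, 0.70, 0.80, 0.92, 0.97])
print(classify_emotion(line_contour))  # -> "excited"
```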
Hamaguchi confirmed that the same AI also handles lip-syncing and can even calculate the optimal angle and distance from the characters speaking in a scene, automatically moving the camera to that spot. The technology was built "pretty much throughout the entire development cycle" and is therefore as old as Final Fantasy 7 Remake itself.
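The camera-placement part can be pictured as simple geometry: choose a framing distance and an angle relative to the speaker, then aim the camera back at them. The Python sketch below illustrates that with made-up framing values (distance, yaw offset, head height); the actual heuristics the game uses are not public.

```python
# A minimal sketch of the auto-camera idea: given the speaking character's
# position and facing direction, compute a camera position at a chosen
# distance and angle, looking back at the speaker. The tuning values are
# illustrative assumptions, not the game's real parameters.
import math
from dataclasses import dataclass

@dataclass
class Vec3:
    x: float
    y: float
    z: float

def camera_for_speaker(speaker: Vec3, facing_deg: float,
                       distance: float = 2.5, yaw_offset_deg: float = 30.0,
                       height: float = 1.6) -> tuple[Vec3, Vec3]:
    """Place the camera `distance` metres from the speaker, offset
    `yaw_offset_deg` from the direction they face, at head height.
    Returns (camera_position, look_at_target)."""
    yaw = math.radians(facing_deg + yaw_offset_deg)
    position = Vec3(speaker.x + distance * math.cos(yaw),
                    height,
                    speaker.z + distance * math.sin(yaw))
    target = Vec3(speaker.x, height, speaker.z)  # frame the speaker's face
    return position, target

# Example: a speaker at the origin facing +x; the camera sits ~2.5 m away,
# 30 degrees off their facing direction, looking back at them.
pos, target = camera_for_speaker(Vec3(0.0, 0.0, 0.0), facing_deg=0.0)
print(pos, target)
```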
More at: https://segmentnext.com/2020/04/02/final-fantasy-7-remake-ai-facial-expressions/
Tell me words aren't the only way to tell someone how you feel.