“The Future Is Here: Verbal Interaction With NPCs on Mobile.”
Description:
This demo showcases verbal communication as the future of user interaction in gaming. The main character converses with the user through a language-model-based brain [1] and performs actions through an MLP-based brain trained with reinforcement learning [2]. The demo is implemented in Unity and everything runs locally on a mobile device.
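A minimal sketch of the two-brain structure described above, assuming a TinyStories-scale causal language model [1] as the verbal brain and a small MLP policy of the kind trained with reinforcement learning via Unity ML-Agents [2] as the action brain. The checkpoint name, tokenizer, observation/action sizes, and wiring are illustrative assumptions; the demo itself runs these components on-device inside Unity, not in Python.

```python
import torch
import torch.nn as nn
from transformers import AutoModelForCausalLM, AutoTokenizer

# Verbal brain: a small causal LM. The checkpoint and tokenizer below are
# assumptions (a public TinyStories model [1]); the abstract does not name
# the exact model used in the demo.
tokenizer = AutoTokenizer.from_pretrained("EleutherAI/gpt-neo-125M")
lm = AutoModelForCausalLM.from_pretrained("roneneldan/TinyStories-33M")

def verbal_reply(player_utterance: str, max_new_tokens: int = 40) -> str:
    """Generate the character's spoken reply to the player's utterance."""
    inputs = tokenizer(player_utterance, return_tensors="pt")
    output = lm.generate(**inputs, max_new_tokens=max_new_tokens, do_sample=True)
    new_tokens = output[0][inputs["input_ids"].shape[1]:]  # drop the prompt
    return tokenizer.decode(new_tokens, skip_special_tokens=True)

# Action brain: an MLP policy of the kind trained with reinforcement
# learning (e.g. via Unity ML-Agents [2]). Observation and action sizes
# are placeholders.
class ActionPolicy(nn.Module):
    def __init__(self, obs_dim: int = 8, act_dim: int = 2, hidden: int = 64):
        super().__init__()
        self.net = nn.Sequential(
            nn.Linear(obs_dim, hidden), nn.ReLU(),
            nn.Linear(hidden, hidden), nn.ReLU(),
            nn.Linear(hidden, act_dim), nn.Tanh(),  # continuous actions in [-1, 1]
        )

    def forward(self, obs: torch.Tensor) -> torch.Tensor:
        return self.net(obs)

if __name__ == "__main__":
    # The character speaks...
    print(verbal_reply("Hello! Can you help me find the key?"))
    # ...and acts: a placeholder observation mapped to a movement command.
    policy = ActionPolicy()
    print(policy(torch.zeros(1, 8)))
```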
References:
[1] Ronen Eldan and Yuanzhi Li. 2023. TinyStories: How Small Can Language Models Be and Still Speak Coherent English? arXiv:2305.07759 [cs.CL]
[2] Arthur Juliani, Vincent-Pierre Berges, Ervin Teng, Andrew Cohen, Jonathan Harper, Chris Elion, Chris Goy, Yuan Gao, Hunter Henry, Marwan Mattar, and Danny Lange. 2020. Unity: A general platform for intelligent agents. arXiv:1809.02627 [cs.LG]. https://arxiv.org/pdf/1809.02627.pdf
[3] Koki Mitsunami. 2023. Part 1: Build dynamic and engaging mobile games with multi-agent reinforcement learning. Retrieved March 25, 2024 from https://community.arm.com/arm-community-blogs/b/ai-and-ml-blog/posts/p1-multi-agent-reinforcement-learning
[4] Nils Reimers and Iryna Gurevych. 2019. Sentence-BERT: Sentence Embeddings using Siamese BERT-Networks. arXiv:1908.10084 [cs.CL]