Meta Expands GenAI Toolkit with Environment Generation and Embodied NPCs

Source: Meta (Screenshot of video)
  • Environment Generation is now available in the Worlds Desktop Editor, building on Meta’s existing GenAI tools.
  • Embodied NPCs with live voice interactions and in-world behavior are launching soon.

Meta has expanded its GenAI toolkit for Horizon Worlds. Environment Generation is now live, letting creators build entire 3D scenes from a text prompt. Meta also previewed fully embodied NPCs that respond to player voice input and interact dynamically inside virtual worlds.

The upcoming NPCs will use large language models to support dynamic voice conversations with players. Creators can combine scripted and AI-driven responses, choose from a voice library, and define traits like name, personality, and backstory. The Character Builder is also getting new features to allow deeper customization and testing. These characters will be available in both VR and mobile worlds.

Environment Generation lets developers produce themed landscapes using AI prompts or detailed parameters. Each generated scene includes 3D meshes, textures, and decomposable assets, giving creators fine-grained control. These tools are all available in the Worlds Desktop Editor, alongside other GenAI features such as mesh, texture, skybox, ambient audio, sound effect, and code generation, plus a creator assistant for worldbuilding.


🌀 Tom’s Take:

GenAI is reshaping how worlds are built, and Meta is doubling down to speed up development for Horizon Worlds creators. Embodied NPCs and environment generation extend the powerful tools already rolling out in the Worlds Editor.


Disclosure: Tom Emrich has previously worked with or holds interests in companies mentioned. His commentary is based solely on public information and reflects his personal views.

Source: Meta Blog