
Envisioning is an emerging technology research institute and advisory.

2011 — 2026


Immersive Human-Machine Co-Presence

XR environments controlled directly by brain signals for hands-free interaction

Immersive human-machine co-presence systems are Extended Reality (XR) platforms, spanning virtual reality (VR), augmented reality (AR), and mixed reality (MR), that are driven by direct neural intent rather than hand controllers, gestures, or voice commands. A brain-computer interface (BCI) reads the user's intent and translates it into actions in the virtual or augmented environment, so users can manipulate virtual objects, navigate spaces, and interact with digital content using only their thoughts. The same neural control extends to shared mixed-reality workspaces, where multiple users collaborate in a single environment through thought alone.
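As a rough illustration of the decode-and-act loop described above, the sketch below classifies a one-second EEG window by comparing alpha-band (8–12 Hz) power against broadband power, then maps the decoded intent to an XR action. Every name, threshold, and synthetic signal here is an illustrative assumption, not any vendor's actual pipeline.

```python
import math

FS = 256  # assumed sample rate in Hz

def band_power(signal, fs, low, high):
    """Mean spectral power of `signal` across DFT bins in [low, high] Hz."""
    n = len(signal)
    total, count = 0.0, 0
    for k in range(n // 2 + 1):
        freq = k * fs / n
        if low <= freq <= high:
            re = sum(signal[i] * math.cos(2 * math.pi * k * i / n) for i in range(n))
            im = sum(signal[i] * math.sin(2 * math.pi * k * i / n) for i in range(n))
            total += re * re + im * im
            count += 1
    return total / count

def decode_intent(eeg_window, fs=FS):
    """Toy decoder: a dominant alpha (8-12 Hz) rhythm suggests rest
    ('idle'); alpha suppression relative to broadband power suggests
    active intent ('grab'). The 1.0 threshold is illustrative."""
    alpha = band_power(eeg_window, fs, 8, 12)
    broadband = band_power(eeg_window, fs, 1, 40)
    return "idle" if alpha / broadband > 1.0 else "grab"

# Hypothetical mapping from decoded intent to an XR engine call.
ACTIONS = {"grab": "attach_object_to_cursor", "idle": "no_op"}

# Synthetic one-second windows: a resting window dominated by a 10 Hz
# alpha rhythm, and an "intent" window where alpha is absent.
resting = [math.sin(2 * math.pi * 10 * i / FS) for i in range(FS)]
intent = [math.sin(2 * math.pi * 20 * i / FS) for i in range(FS)]

print(ACTIONS[decode_intent(resting)])  # no_op
print(ACTIONS[decode_intent(intent)])   # attach_object_to_cursor
```

Real decoders replace the threshold rule with a trained classifier over many channels, but the shape of the loop — window the signal, extract features, decode intent, dispatch an action — is the same.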

This innovation addresses a core limitation of current XR interfaces: controllers and gestures can feel unnatural and break immersion. Thought-based control could make XR experiences more direct and immersive, and both research institutions and commercial companies are actively developing the underlying technology.

The technology is particularly significant for XR, where natural interaction could dramatically improve user experience and enable new forms of collaboration. Major challenges remain, however: decoding must be responsive enough for real-time interaction, system complexity must be managed, and control must be reliable. At TRL 4, the approach still requires extensive development to reach the performance needed for practical use, and its success depends on proving that neural control enhances, rather than detracts from, the XR experience.
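One common way to trade latency for the reliability these challenges demand is to require the same intent to be decoded across several consecutive analysis windows before acting. A minimal sketch of such a debouncer (the class name and window count are illustrative assumptions):

```python
class IntentDebouncer:
    """Fire an intent only after it has been decoded in `required`
    consecutive analysis windows, suppressing one-off misclassifications
    at the cost of added latency."""

    def __init__(self, required=3):
        self.required = required
        self._last = None
        self._streak = 0

    def update(self, intent):
        """Feed one decoded intent; return it once confirmed, else None."""
        if intent == self._last:
            self._streak += 1
        else:
            self._last = intent
            self._streak = 1
        # Fire exactly once, at the moment the streak reaches the bar.
        return intent if self._streak == self.required else None

# A spurious single-window "grab" is ignored; a sustained one fires once.
debouncer = IntentDebouncer(required=3)
stream = ["idle", "grab", "idle", "grab", "grab", "grab", "idle"]
print([debouncer.update(x) for x in stream])
# [None, None, None, None, None, 'grab', None]
```

With a 250 ms analysis window, `required=3` adds roughly half a second of confirmation delay — the kind of responsiveness-versus-reliability trade-off these systems must tune.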

TRL: 4/9 (Formative)
Impact: 5/5
Investment: 5/5
Category: Applications

Related Organizations

Cognixion
United States · Startup · Developer · 95%
Builds AI-powered BCI headsets with AR displays for accessibility and communication.

OpenBCI
United States · Company · Developer · 95%
Creates open-source brain-computer interface tools and the Galea headset (integrating with VR) for researching physiological responses.

Meta Reality Labs
United States · Company · Developer · 90%
Develops the Quest Pro and research prototypes (Butterscotch, Starburst) focusing on foveated systems.

Wisear
France · Startup · Developer · 88%
Deeptech startup creating neural interface earbuds.

Pison
United States · Startup · Developer · 85%
Develops wrist-worn electroneurography (ENG) sensors for gesture control in XR environments without cameras.

Snap Inc.
United States · Company · Acquirer · 85%
Social media and camera company developing AR spectacles.

Emotiv
United States · Company · Developer · 80%
Produces EEG headsets and the BCI-OS platform, allowing developers to build applications that respond to cognitive stress and facial expressions.

MindMaze
Switzerland · Company · Developer · 80%
Develops gamified neurorehabilitation platforms for stroke and brain injury recovery.

HTC VIVE
Taiwan · Company · Deployer · 75%
Produces VR headsets and actively partners with BCI companies (like OpenBCI and MyndPlay) to integrate brain-sensing into their hardware ecosystem.

Varjo
Finland · Company · Developer · 75%
Manufacturer of 'bionic display' headsets that use a high-density focus display inside a peripheral context display.

Supporting Evidence

Evidence data is not available for this technology yet.

Connections

Applications · Neuro-Gaming Interfaces
Brain-computer interfaces that let players control games with thoughts and mental states
TRL 5/9 · Impact 3/5 · Investment 4/5

Software · Neural Prosthesis Control Systems
Software that translates brain and muscle signals into precise prosthetic limb movements
TRL 7/9 · Impact 5/5 · Investment 5/5

Applications · Brain-Guided Robotics
Robotic systems controlled by brain signals for surgery, hazardous work, or remote operations
TRL 5/9 · Impact 4/5 · Investment 4/5

Hardware · Next-Gen Noninvasive BCIs
Wearable brain sensors using magnetic fields and light to decode neural activity outside labs
TRL 6/9 · Impact 4/5 · Investment 4/5

Applications · Brain-to-Brain Communication
Direct neural transmission of thoughts or commands between brains via networked interfaces
TRL 2/9 · Impact 5/5 · Investment 2/5

Applications · Advanced Restorative Neuroprosthetics
Prosthetic limbs that respond to thought and transmit touch, pressure, and temperature back to the user
TRL 6/9 · Impact 5/5 · Investment 5/5
