
Envisioning is an emerging technology research institute and advisory.




Olfactory and Gustatory Interfaces

Devices that synthesize smell and taste sensations for immersive digital experiences

Olfactory and gustatory interfaces extend immersive experiences beyond the traditional audiovisual domain. These systems recreate smell and taste through two primary approaches: chemical delivery mechanisms that release actual scent molecules or flavor compounds, and electrical or thermal stimulation that directly activates olfactory and gustatory receptors. Chemical systems typically rely on cartridges of concentrated aromatic compounds or flavor essences, which are vaporized, atomized, or otherwise dispersed in precise combinations to produce specific sensations. Electrical approaches instead apply controlled currents to the tongue or nasal cavity, triggering neural responses that the brain interprets as taste or smell. Some hybrid systems combine both methods, while room-scale installations diffuse scents into the environment in sync with content playback. The core technical challenges are achieving rapid switching between sensory profiles, maintaining consistency across sessions, and ensuring safety when delivering substances or electrical signals to sensitive mucous membranes.
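The cartridge-and-playback model described above can be sketched as a simple timeline controller that decides which cartridge mixture should be active at each moment of content playback. Everything below is illustrative: the cartridge names, the 0.0–1.0 intensity scale, and the event format are assumptions for the sketch, not any vendor's actual device API.

```python
from dataclasses import dataclass

@dataclass(frozen=True)
class ScentEvent:
    t: float         # playback timestamp when the event starts (seconds)
    mix: dict        # cartridge name -> intensity, e.g. {"smoke": 0.8}
    duration: float  # seconds the mixture is held

class ScentTimeline:
    """Maps playback time to a combined cartridge mixture."""

    def __init__(self, events):
        # Keep events sorted by start time so playback can scan forward.
        self.events = sorted(events, key=lambda e: e.t)

    def active_mix(self, t):
        """Return combined cartridge intensities active at time t.
        Overlapping events sum per cartridge but are clamped to 1.0,
        so simultaneous cues cannot over-saturate a channel."""
        mix = {}
        for e in self.events:
            if e.t <= t < e.t + e.duration:
                for cartridge, level in e.mix.items():
                    mix[cartridge] = min(1.0, mix.get(cartridge, 0.0) + level)
        return mix

# Example: a smoke cue that a pine-forest cue briefly overlaps.
timeline = ScentTimeline([
    ScentEvent(0.0, {"smoke": 0.6}, 3.0),
    ScentEvent(2.0, {"smoke": 0.7, "pine": 0.5}, 2.0),
])
print(timeline.active_mix(2.5))  # {'smoke': 1.0, 'pine': 0.5}
```

A real controller would also have to model the slower physics the paragraph alludes to, such as scent lingering in the air after an event ends, which is one reason rapid profile switching remains hard in practice.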

The integration of smell and taste into extended reality environments addresses a fundamental limitation in current immersive technologies: the sensory gap that prevents truly convincing presence in virtual spaces. Research suggests that olfactory cues are particularly powerful triggers for memory and emotion, making them valuable tools for therapeutic applications such as PTSD treatment, phobia exposure therapy, and cognitive rehabilitation. In entertainment and training contexts, these interfaces enable more realistic simulations—from culinary education programs where students can experience flavors before working with actual ingredients, to military or emergency response training that incorporates the stress-inducing smells of smoke or chemical agents. The technology also opens new possibilities for remote social experiences, allowing people to share meals or visit virtual environments with sensory fidelity that approaches physical presence. For individuals with sensory impairments, these systems may offer alternative pathways to experience content, while marketing and retail applications are exploring how scent and taste previews might influence consumer behavior in virtual shopping environments.

Early commercial deployments have appeared in specialized entertainment venues, high-end VR arcades, and research laboratories exploring human-computer interaction. Several companies have developed consumer-grade olfactory devices for gaming and meditation applications, though widespread adoption remains limited by factors including cost, the complexity of scent cartridge systems, and the lack of standardized content formats. Pilot programs in culinary schools and fragrance design studios demonstrate professional applications, while therapeutic uses are being explored in clinical settings for conditions ranging from eating disorders to dementia care. As the metaverse concept gains traction and extended reality technologies mature, industry analysts note growing interest in multi-sensory interfaces as differentiators in competitive markets. The trajectory suggests a gradual evolution from novelty applications toward integrated sensory ecosystems, particularly as miniaturization advances and safety standards become established. This technology aligns with broader trends toward embodied computing and the recognition that human experience is fundamentally multi-sensory, pointing toward future interfaces that engage our full perceptual apparatus rather than privileging sight and sound alone.

TRL: 3/9 (Conceptual) · Impact: 4/5 · Investment: 3/5
Category: Hardware

Related Organizations

Meiji University (Homei Miyashita Lab)

Japan · University

95%

Research lab famous for developing the 'Taste the TV' (TTTV) prototype and electric taste-enhancing chopsticks.

Researcher
OVR Technology

United States · Startup

95%

Creates wearable scent technology (ION) for VR/AR headsets to deliver precise olfactory experiences.

Developer
Aromajoin Corporation

Japan · Company

92%

Develops the Aroma Shooter, a directional scent delivery device that uses solid-state cartridges.

Developer
Imagineering Institute

Malaysia · Research Lab

88%

Conducts advanced research into 'Digital Smell' and 'Digital Taste' via electrical stimulation of the tongue and nose.

Researcher
University of Chicago (Human Computer Integration Lab)

United States · University

85%

Research group led by Pedro Lopes exploring chemical haptics and intranasal trigeminal stimulation.

Researcher
eScent

United Kingdom · Startup

82%

Developing wearable scent technology for wellbeing and immersive experiences.

Developer

Moodify

Israel · Startup

80%

Develops 'functional scents' using AI to create 'white scent' (smell cancellation) and mood-altering olfactory signals.

Developer

Supporting Evidence

Evidence data is not available for this technology yet.

Connections

Hardware
Haptic Epidermal Interfaces

Skin-adhered sensors and actuators that deliver touch, pressure, and thermal feedback

TRL: 4/9 · Impact: 4/5 · Investment: 3/5
Hardware
Tangible Affective Interfaces

Physical objects that change shape, texture, or temperature to sense and express emotion

TRL: 4/9 · Impact: 4/5 · Investment: 3/5
Applications
Immersive Workspaces and Classrooms

Persistent XR environments blending physical and digital spaces for collaborative work and learning

TRL: 6/9 · Impact: 5/5 · Investment: 4/5
Hardware
Neuro-Affective Headsets

Wearable brain sensors that detect emotional states like stress, engagement, and frustration

TRL: 6/9 · Impact: 4/5 · Investment: 4/5
Ethics & Security
Immersive Consent and Safety Protocols

Real-time consent and boundary enforcement systems designed for XR social environments

TRL: 4/9 · Impact: 5/5 · Investment: 4/5
Applications
Therapeutic VR Exposure

Controlled virtual environments for gradual exposure therapy in PTSD, phobias, and anxiety disorders

TRL: 8/9 · Impact: 5/5 · Investment: 4/5
