Envisioning is an emerging technology research institute and advisory.


Olfactory Interface Modules

Scent-emitting devices that release game-triggered aromas to deepen immersion

Olfactory interface modules miniaturize scent cartridges, micro-pumps, and dispersion fans into devices that sit on a monitor bezel, headset strap, or room corner. Game engines trigger pre-authored scent profiles—pine forests, plasma vents, potion shops—and the device blends base aromatics into bursts that dissipate quickly thanks to catalytic scrubbers. Advanced units include photoionization detector (PID) sensors that monitor ambient concentration and adjust output so perfumed bosses don’t linger for hours.
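The sensor-driven output adjustment described above is a closed feedback loop. A minimal sketch, with all function names, units, and gain constants invented for illustration (no real device SDK is referenced):

```python
# Sketch of a closed-loop scent emitter: a sensor reading of ambient
# aroma concentration (ppm) drives the pump duty cycle toward a target.
# All names and constants here are illustrative, not from any real SDK.

def pump_duty(target_ppm: float, measured_ppm: float,
              gain: float = 0.5, max_duty: float = 1.0) -> float:
    """Proportional control: emit harder when below target,
    shut off when ambient scent already meets or exceeds it."""
    error = target_ppm - measured_ppm
    return max(0.0, min(max_duty, gain * error))

# A boss-fight cue requests 2.0 ppm of a "plasma vent" aroma.
print(pump_duty(2.0, 0.0))   # room is fresh: pump at full duty
print(pump_duty(2.0, 1.8))   # near target: gentle trickle
print(pump_duty(2.0, 2.5))   # scent lingering: pump off, scrubbers clear it
```

When the measured concentration overshoots the target, the duty cycle clamps to zero and the catalytic scrubbers do the rest.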

Location-based entertainment, horror streamers, and wellness games use scent to reinforce story beats: a heist mission smells of ozone when lasers arm, cozy farming sims release soil and citrus cues, and guided-breathing apps pair essential oils with calming visuals. Retail brands piggyback on the tech for experiential marketing, turning shoppable livestreams into multi-sensory showcases. Designers embed scent metadata into narrative scripts so localization teams can swap culturally relevant aromas without changing gameplay.
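Scent metadata that localization teams can swap without touching gameplay might be attached to a story beat roughly like this. A minimal sketch; the schema keys, scene name, and aroma identifiers are all hypothetical:

```python
# Sketch of locale-swappable scent metadata on a narrative beat.
# Keys, scene ids, and aroma names are invented for illustration.

scene = {
    "id": "harvest_festival",
    "dialogue": "grandma_greeting",
    "scent": {
        "default": ["soil", "citrus"],
        "ja-JP": ["yuzu", "tatami"],      # culturally tuned swap
        "fr-FR": ["lavender", "citrus"],
    },
}

def scent_for(scene: dict, locale: str) -> list:
    """Pick the locale-specific aroma set, falling back to the default,
    so localization never has to edit gameplay logic."""
    return scene["scent"].get(locale, scene["scent"]["default"])

print(scent_for(scene, "ja-JP"))   # ['yuzu', 'tatami']
print(scent_for(scene, "de-DE"))   # no override, falls back to ['soil', 'citrus']
```

Because the gameplay code only ever calls the lookup, adding a new locale is purely a data change.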

TRL 3–4 pilots highlight practical challenges: cartridge logistics, allergic reactions, and cross-venue safety codes. Standards groups (ASTM, ISO) are drafting exposure limits, while platform SDKs add scent timelines so creators author aromas like they do audio. As hardware shrinks and subscription cartridge services mature, olfactory modules will evolve from novelty add-ons into another creative brush for multisensory games and mixed-reality shows.
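A scent timeline authored "like audio" could look like a track of keyframed cues the engine queries each frame. A minimal sketch under that assumption; the `ScentCue` type and cue names are hypothetical, not from any shipping SDK:

```python
# Sketch of a scent "track" authored like an audio timeline: each cue
# has a start time, a duration, and an aroma id. Purely illustrative.

from dataclasses import dataclass

@dataclass
class ScentCue:
    start: float      # seconds into the scene
    duration: float   # seconds before scrubbers clear the burst
    aroma: str

track = [
    ScentCue(0.0, 5.0, "pine_forest"),
    ScentCue(12.0, 3.0, "ozone"),        # lasers arm
    ScentCue(30.0, 8.0, "potion_shop"),
]

def active_cues(track: list, t: float) -> list:
    """Return the aromas the emitter should blend at time t."""
    return [c.aroma for c in track if c.start <= t < c.start + c.duration]

print(active_cues(track, 13.0))   # ['ozone']
print(active_cues(track, 20.0))   # [] -> nothing emitting, room airs out
```

Overlapping cues simply return multiple aromas, which matches how the hardware blends base aromatics into a single burst.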

TRL: 3/9 (Conceptual)
Impact: 3/5
Investment: 2/5
Category: Hardware

Related Organizations

  • Aromajoin Corporation (Japan · Company · Developer · 95%): Develops the Aroma Shooter, a directional scent delivery device that uses solid-state cartridges.
  • GameScent (United States · Startup · Developer · 95%): A consumer device that uses AI to release scents based on audio cues from gameplay.
  • OVR Technology (United States · Startup · Developer · 95%): Creates wearable scent technology (ION) for VR/AR headsets to deliver precise olfactory experiences.
  • Olorama (Spain · Company · Developer · 90%): Specialists in digital scent technology for VR, cinemas, and events.
  • eScent (United Kingdom · Startup · Developer · 80%): Developing wearable scent technology for wellbeing and immersive experiences.
  • Osmo (United States · Startup · Researcher · 80%): A Google Research spinoff using AI to map the structure of molecules to odor perception (digitizing smell).

Supporting Evidence

Evidence data is not available for this technology yet.

Same technology in other hubs

  • Prism · Olfactory Media Synthesizers: Devices that synthesize and release scents in sync with media to deepen immersion

Connections

  • Hardware · Volumetric Fog Displays: Aerosol screens that project interactive 3D images suspended in mid-air (TRL 4/9, Impact 3/5, Investment 3/5)
  • Applications · Hyperpersonalized Interfaces: Game UIs that adjust visuals, pacing, and prompts based on real-time biometric and cognitive data (TRL 4/9, Impact 3/5, Investment 3/5)
  • Hardware · Haptic & Force-Feedback Materials: Wearable materials that simulate touch, weight, and texture through soft robotics and programmable surfaces (TRL 5/9, Impact 4/5, Investment 3/5)
  • Hardware · Spatial Computing Rigs: Lightweight XR headsets and sensor-embedded surfaces that blend VR, AR, and physical play (TRL 6/9, Impact 5/5, Investment 5/5)
  • Applications · Location-Based VR Arcades: Warehouse-scale VR arenas with haptic floors, tracked props, and multiplayer free-roam experiences (TRL 7/9, Impact 4/5, Investment 4/5)
  • Hardware · Eye-Tracking Game Controllers: Hardware that maps eye movement to in-game actions and UI navigation (TRL 7/9, Impact 4/5, Investment 4/5)
