
Envisioning is an emerging technology research institute and advisory.



Neuromorphic Vision Sensors

Event-driven vision chips with on-sensor neural processing for real-time motion and edge detection

Neuromorphic vision sensors extend beyond event cameras by embedding spiking neural networks next to the photodiodes, producing edge detections, optic flow, or gesture classifications directly on-sensor. Instead of dumping raw frames to a GPU, the chip emits sparse spike trains or metadata that describe motion trajectories and salience, drastically reducing computation for downstream systems. Some architectures, such as Intel’s Loihi-based add-ons or Prophesee’s Metavision stack, let developers upload custom neuromorphic models to the sensor itself.
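The data-reduction argument above can be made concrete with a back-of-the-envelope sketch. The event tuple below (pixel coordinates, microsecond timestamp, brightness-change polarity) follows the general shape of event-camera output, but the byte sizes and event rates are illustrative assumptions, not figures from any specific sensor:

```python
from dataclasses import dataclass

@dataclass
class Event:
    """One sensor event: pixel location, microsecond timestamp, polarity."""
    x: int
    y: int
    t_us: int
    polarity: int  # +1 for a brightness increase, -1 for a decrease

def bandwidth_ratio(events_per_s: int, width: int, height: int, fps: int) -> float:
    """Rough ratio of event-stream bytes to raw 8-bit frame bytes per second.

    Assumes each event packs into 8 bytes and each frame pixel into 1 byte;
    both are simplifications for illustration.
    """
    event_bytes = events_per_s * 8
    frame_bytes = width * height * fps
    return event_bytes / frame_bytes

# A scene with modest motion might emit ~200k events/s from a 1280x720
# sensor; compare against an equivalent 120 fps raw frame stream:
ratio = bandwidth_ratio(200_000, 1280, 720, 120)  # roughly 1-2% of the frame bandwidth
```

Under these assumptions the sparse stream carries about two orders of magnitude less data than raw frames, which is the headroom that makes on-sensor or near-sensor processing tractable.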

Media robotics teams use these sensors in autonomous camera rigs, drones, and stabilizers because they see fast motion without blur and can react before a mechanical gimbal moves. Sports broadcasters deploy them atop hoops or goalposts to detect impact points, while interactive art installations leverage their low latency for gesture-controlled projections. Since the sensors can output higher-level semantic cues, they also serve as input for adaptive live graphics that respond to performer motion in real time.
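The impact-detection use case can be sketched as a sliding-window event counter over a region of interest: when the event rate inside the region spikes, something fast just happened there. This is a minimal illustration; the window length and threshold are arbitrary, and real deployments would run equivalent logic on-sensor rather than in host Python:

```python
from collections import deque

class ImpactDetector:
    """Fires when the event rate inside a region of interest spikes,
    e.g. a ball striking near a hoop-mounted sensor.

    All parameters are illustrative, not vendor defaults.
    """

    def __init__(self, roi, window_us=2_000, threshold=5):
        self.roi = roi                # (x0, y0, x1, y1) pixel bounds
        self.window_us = window_us    # sliding time window, microseconds
        self.threshold = threshold    # events in window that count as impact
        self.times = deque()

    def feed(self, x, y, t_us) -> bool:
        """Consume one event; return True if the impact threshold is crossed."""
        x0, y0, x1, y1 = self.roi
        if not (x0 <= x < x1 and y0 <= y < y1):
            return False
        self.times.append(t_us)
        # Drop events that have aged out of the window.
        while self.times and t_us - self.times[0] > self.window_us:
            self.times.popleft()
        return len(self.times) >= self.threshold

det = ImpactDetector(roi=(100, 100, 200, 200))
hit = False
for i in range(10):  # a burst of events inside the ROI within 1 ms
    hit = det.feed(150, 150, i * 100) or hit
```

Because the decision depends only on a handful of recent events, latency is bounded by the event stream itself rather than by a frame interval, which is why such triggers can fire before a mechanical gimbal has started to move.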

Tooling is nascent: creative coders must learn neuromorphic programming paradigms, and there is no dominant standard for spike-based data interchange. IEEE P2846 and Khronos working groups are evaluating ways to encapsulate neuromorphic outputs alongside traditional video streams, and startups are building Unreal and TouchDesigner plugins that translate spikes to MIDI-like events. With TRL 4–5 prototypes already powering labs and select broadcast experiments, neuromorphic vision will gradually complement conventional cameras wherever ultra-low latency perception unlocks new forms of responsive media.
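Since no spike-based interchange standard exists yet, the spike-to-MIDI-like translation mentioned above has to be improvised per project. The mapping below is one hypothetical scheme (each horizontal band of the sensor gets a note; polarity sets velocity), not the behavior of any shipping plugin:

```python
def spikes_to_midi(events, grid_cols=8, sensor_width=1280):
    """Map spike events to MIDI-like note-on messages by column band.

    Illustrative only: there is no standard spike-to-MIDI mapping.
    `events` is an iterable of (x, y, t_us, polarity) tuples.
    """
    messages = []
    band_width = sensor_width // grid_cols
    for x, y, t_us, polarity in events:
        note = 60 + (x // band_width)          # C4 upward, one note per band
        velocity = 100 if polarity > 0 else 60  # louder for brightness increases
        messages.append({"type": "note_on", "note": note,
                         "velocity": velocity, "time_us": t_us})
    return messages

msgs = spikes_to_midi([(0, 10, 1_000, 1), (1279, 20, 2_000, -1)])
```

A real bridge would debounce and aggregate spikes before emitting messages, but the essential design choice is the same: collapse a dense spatiotemporal stream into a small vocabulary of discrete, timestamped events that existing creative tools already understand.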

TRL: 5/9 (Validated)
Impact: 4/5
Investment: 3/5
Category: Hardware

Related Organizations

  • Prophesee (France · Company · Developer · 100%): Pioneer in event-based vision sensors and associated neuromorphic processing algorithms.
  • iniVation (Switzerland · Company · Developer · 95%): Swiss company specializing in Dynamic Vision Sensors (DVS) and neuromorphic software for robotics.
  • Sony Semiconductor Solutions (Japan · Company · Developer · 95%): Develops stacked event-based vision sensors with integrated logic layers.
  • University of Zurich, Robotics and Perception Group (Switzerland · University · Researcher · 95%): Lab led by Davide Scaramuzza.
  • SynSense (Switzerland · Startup · Developer · 90%): Develops ultra-low-power mixed-signal neuromorphic processors and sensors for edge AI applications.
  • AlpsenTek (China · Startup · Developer · 85%): Vision sensor startup.
  • CelePixel (China · Startup · Developer · 85%): Sensor technology company.
  • Samsung Electronics (South Korea · Company · Developer · 85%): Global electronics leader.
  • BrainChip (United States · Company · Developer · 80%): Developer of the Akida neuromorphic processor IP and chips.
  • Western Sydney University (Australia · University · Researcher · 80%): Hosts the International Centre for Neuromorphic Systems (ICNS).

Supporting Evidence

Evidence data is not available for this technology yet.

Connections

  • Neuromorphic Event Cameras (Hardware): Vision sensors that record brightness changes as timestamped events instead of frames. TRL 5/9 · Impact 3/5 · Investment 3/5
  • Neural Light-Field Cameras (Hardware): Cameras that record light direction and intensity to enable post-capture focus and viewpoint editing. TRL 4/9 · Impact 4/5 · Investment 4/5
  • Real-Time NeRF Engines (Software): Live 3D scene capture and rendering from multiple camera angles in real time. TRL 6/9 · Impact 5/5 · Investment 5/5
  • Dream-to-Video Decoders (Software): Systems that reconstruct visual imagery from brain scans of dreams or perception. TRL 2/9 · Impact 3/5 · Investment 2/5
  • Brain-Computer Media Interfaces (BCMI) (Hardware): Neural interfaces that translate brain signals into media control and content creation commands. TRL 3/9 · Impact 5/5 · Investment 5/5
  • Wearable Biometric Emotion Recorders (Hardware): Wearable sensors that track emotional responses in real time to personalize media experiences. TRL 5/9 · Impact 3/5 · Investment 3/5

Book a research session

Bring this signal into a focused decision sprint with analyst-led framing and synthesis.