Envisioning is an emerging technology research institute and advisory.


TCN (Temporal Convolutional Networks)

Convolutional neural networks that model sequential data using dilated, causal convolutions.

Year: 2018 · Generality: 550

Temporal Convolutional Networks (TCNs) are a class of neural network architectures that apply convolutional operations across the time dimension to model sequential data. Rather than processing sequences step-by-step as recurrent models do, TCNs use causal convolutions — filters constrained so that the output at any time step depends only on current and past inputs, never future ones. This causality constraint makes TCNs appropriate for real-time and autoregressive tasks where future information is unavailable at inference time.
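The causality constraint can be made concrete with a minimal pure-Python sketch: each output depends only on the current and earlier inputs, implemented here by simply skipping filter taps that would reach before the start of the sequence (the function and variable names are illustrative, not from any particular library):

```python
def causal_conv1d(x, weights):
    """1D causal convolution: out[t] uses only x[0..t], never future inputs.

    weights[0] multiplies the current input, weights[1] the previous one, etc.
    Inputs before t=0 are treated as zero (left zero-padding).
    """
    out = []
    for t in range(len(x)):
        acc = 0.0
        for i, w in enumerate(weights):
            j = t - i  # index into the past only
            if j >= 0:
                acc += w * x[j]
        out.append(acc)
    return out


# Each output is the sum of the current and previous input:
print(causal_conv1d([1.0, 2.0, 3.0, 4.0], [1.0, 1.0]))  # → [1.0, 3.0, 5.0, 7.0]
```

In a real TCN the same effect is achieved by left-padding the sequence before a standard convolution, so the whole operation stays parallelizable across time steps.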

The key architectural innovation in TCNs is the use of dilated convolutions, where gaps are introduced between filter taps to exponentially expand the receptive field without increasing the number of parameters. By stacking layers with progressively larger dilation factors (e.g., 1, 2, 4, 8, ...), a TCN can capture dependencies spanning thousands of time steps while remaining computationally efficient. Residual connections are typically added between layers to stabilize training in deep networks and allow gradients to flow cleanly during backpropagation.
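The effect of dilation on the receptive field can be sketched in a few lines of plain Python (a toy illustration under the definitions above, not a production implementation): a dilated causal filter reaches back `i * dilation` steps per tap, and stacking layers with kernel size k and dilations d₁, d₂, … yields a receptive field of 1 + (k − 1) · Σdᵢ.

```python
def dilated_causal_conv1d(x, weights, dilation):
    """Causal convolution with gaps of `dilation` between filter taps."""
    out = []
    for t in range(len(x)):
        acc = 0.0
        for i, w in enumerate(weights):
            j = t - i * dilation  # reach further back as dilation grows
            if j >= 0:
                acc += w * x[j]
        out.append(acc)
    return out


def receptive_field(kernel_size, dilations):
    """Receptive field of stacked dilated layers: 1 + (k - 1) * sum(dilations)."""
    return 1 + (kernel_size - 1) * sum(dilations)


# With dilation 2, out[t] combines x[t] and x[t-2]:
print(dilated_causal_conv1d([1.0, 2.0, 3.0, 4.0, 5.0], [1.0, 1.0], 2))
# Four layers of kernel size 2 with dilations 1, 2, 4, 8 see 16 time steps:
print(receptive_field(2, [1, 2, 4, 8]))  # → 16
```

Doubling the dilation at each layer is what makes the receptive field grow exponentially with depth while the parameter count grows only linearly.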

TCNs offer several practical advantages over recurrent architectures such as LSTMs and GRUs. Because convolutions across a sequence can be computed in parallel, TCNs train significantly faster on modern hardware. They also sidestep the vanishing gradient problem that plagues deep RNNs, since gradients in convolutional networks travel through fixed-depth paths rather than unrolled time steps. A landmark 2018 study by Bai, Kolter, and Koltun demonstrated that TCNs matched or outperformed canonical recurrent models across a wide range of sequence modeling benchmarks, prompting broader adoption of convolutional approaches for temporal data.

TCNs have found application in time series forecasting, audio generation, anomaly detection, and natural language processing. While transformer-based architectures have since become dominant for many sequence tasks, TCNs remain a strong baseline — particularly in resource-constrained settings where their efficiency and simplicity are valuable. Their design principles, especially dilated causal convolutions, have also influenced hybrid architectures that blend convolutional and attention-based components.

Related

CNN (Convolutional Neural Network)

A deep learning architecture that learns spatial hierarchies of features from visual data.

Generality: 875

Causal Transformer

A transformer variant that enforces temporal causality by masking future information during training.

Generality: 453

GCN (Graph Convolutional Networks)

Neural networks that apply convolution-like operations to learn from graph-structured data.

Generality: 694

RNN (Recurrent Neural Network)

Neural networks with feedback connections that process sequential data using internal memory.

Generality: 838

Temporal Data

Data indexed by time, capturing sequences, durations, and the ordering of events.

Generality: 650

FCN (Fully Convolutional Network)

A neural network architecture that produces pixel-wise predictions for image segmentation.

Generality: 694