
Envisioning is an emerging technology research institute and advisory.


2011 — 2026


Radial Basis Function Network

A neural network using radial basis functions as hidden-layer activations for function approximation.

Year: 1988 · Generality: 563

A Radial Basis Function Network (RBFN) is a three-layer artificial neural network in which the hidden units apply radial basis functions — most commonly Gaussians — as their activation functions. Unlike standard feedforward networks where activations depend on a weighted sum of inputs, each hidden neuron in an RBFN computes a distance-based response: it measures how far an input vector lies from a learned center point and produces an output that decreases (or increases) monotonically with that distance. The final output layer then combines these responses through a simple linear weighted sum, making the overall mapping a linear combination of localized, radially symmetric basis functions.
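The distance-based forward pass described above can be sketched in a few lines of NumPy. This is an illustrative toy, not any particular library's API; the Gaussian form and the parameter names are assumptions:

```python
import numpy as np

def rbf_forward(x, centers, widths, weights):
    """Forward pass of a toy RBFN with Gaussian hidden units."""
    # Each hidden unit measures how far the input x lies from its center
    d2 = np.sum((centers - x) ** 2, axis=1)
    # Gaussian response: maximal at the center, decaying with distance
    phi = np.exp(-d2 / (2 * widths ** 2))
    # Output layer: a plain linear weighted sum of the local responses
    return phi @ weights

# A unit centered at 0 responds fully (activation 1.0) to the input 0,
# while a unit centered at 1 contributes only weakly.
y = rbf_forward(np.array([0.0]),
                centers=np.array([[0.0], [1.0]]),
                widths=np.array([1.0, 1.0]),
                weights=np.array([1.0, 0.0]))  # y == 1.0
```

Note that the only learned nonlinearity lives in the hidden layer; once the centers and widths are fixed, the output is linear in the weights, which is what makes the training procedure below so cheap.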

Training an RBFN typically proceeds in two stages. First, the centers of the radial basis functions are determined — often through unsupervised methods like k-means clustering on the training data, or by selecting a subset of training points directly. Second, the output-layer weights are fitted using linear least squares, which is computationally cheap and avoids the vanishing gradient problems that can plague deep networks trained end-to-end. This decoupled training procedure gives RBFNs a significant speed advantage over multilayer perceptrons in many settings, and the linear output stage guarantees a unique, globally optimal solution for the weights given fixed centers.
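The two-stage procedure can be sketched as follows, assuming Gaussian basis functions, a basic k-means loop for stage one, and `numpy.linalg.lstsq` for stage two. Hyperparameters such as `k`, `width`, and the sine-fitting demo are illustrative choices, not prescribed values:

```python
import numpy as np

def fit_rbfn(X, y, k=10, width=1.0, iters=20, seed=0):
    """Two-stage RBFN training: unsupervised centers, then linear weights."""
    rng = np.random.default_rng(seed)
    # Stage 1: place centers with a simple k-means loop over the inputs
    centers = X[rng.choice(len(X), size=k, replace=False)]
    for _ in range(iters):
        d2 = ((X[:, None, :] - centers[None, :, :]) ** 2).sum(axis=-1)
        labels = d2.argmin(axis=1)
        for j in range(k):
            if np.any(labels == j):
                centers[j] = X[labels == j].mean(axis=0)
    # Hidden-layer design matrix of Gaussian responses
    d2 = ((X[:, None, :] - centers[None, :, :]) ** 2).sum(axis=-1)
    Phi = np.exp(-d2 / (2 * width ** 2))
    # Stage 2: linear least squares, a unique global optimum for fixed centers
    weights, *_ = np.linalg.lstsq(Phi, y, rcond=None)
    return centers, weights

def predict_rbfn(X, centers, weights, width=1.0):
    d2 = ((X[:, None, :] - centers[None, :, :]) ** 2).sum(axis=-1)
    return np.exp(-d2 / (2 * width ** 2)) @ weights

# Demo: approximate one period of a sine wave from 100 samples
X = np.linspace(0, 2 * np.pi, 100)[:, None]
y = np.sin(X[:, 0])
centers, weights = fit_rbfn(X, y)
pred = predict_rbfn(X, centers, weights)
```

Because stage two is an ordinary linear regression on the hidden activations, no backpropagation is needed at all; only the center placement involves any iteration.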

RBFNs are particularly well-suited to interpolation and function approximation tasks because each basis function acts as a local detector, responding strongly only to inputs near its center. This locality means the network can model complex, nonlinear mappings while remaining interpretable: the contribution of each hidden unit is spatially bounded and easy to visualize. Applications have included time-series forecasting, control systems, classification, and density estimation. RBFNs also connect naturally to kernel methods and Gaussian processes, providing a bridge between the neural network and statistical learning perspectives.

Although deep learning has largely supplanted RBFNs for large-scale perception tasks, they remain relevant in settings where training data is limited, fast training is essential, or interpretability matters. Their theoretical properties — universal approximation, convex output-layer optimization, and clear geometric interpretation — make them a valuable conceptual reference point in the broader landscape of neural network architectures.

Related

Restricted Boltzmann Machines (RBMs)

Generative neural networks that learn probability distributions over input data using two layers.

Generality: 692
DBN (Deep Belief Network)

A generative neural network built from stacked Restricted Boltzmann Machines trained layer by layer.

Generality: 694
Neural Network

A layered system of interconnected nodes that learns patterns from data.

Generality: 947
Feedforward Neural Network

A neural network architecture where information flows strictly from input to output.

Generality: 838
RNN (Recurrent Neural Network)

Neural networks with feedback connections that process sequential data using internal memory.

Generality: 838
Boltzmann Machine

A stochastic recurrent network that learns probability distributions over binary variables.

Generality: 694