
Weighted Sum

A linear combination of inputs scaled by learned weights, fundamental to neural networks.

Year: 1958 · Generality: 820

A weighted sum is a mathematical operation in which each input value is multiplied by a corresponding weight and the resulting products are added together to produce a single scalar output. In neural networks, this operation sits at the heart of every artificial neuron: given an input vector x and a weight vector w, the weighted sum is computed as z = w₁x₁ + w₂x₂ + ... + wₙxₙ + b, where b is an optional bias term. This value is then typically passed through a nonlinear activation function to produce the neuron's output. The simplicity of the operation belies its power — stacking many such computations across layers allows neural networks to approximate arbitrarily complex functions.
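
As a concrete illustration, here is a minimal Python sketch of that computation; the input values, weights, and bias below are illustrative assumptions, not drawn from any particular model.

```python
import math

def weighted_sum(x, w, b=0.0):
    """Compute z = w1*x1 + w2*x2 + ... + wn*xn + b."""
    assert len(x) == len(w), "input and weight vectors must have equal length"
    return sum(wi * xi for wi, xi in zip(w, x)) + b

x = [0.5, -1.2, 3.0]   # input vector (illustrative)
w = [0.8, 0.1, -0.4]   # weight vector (illustrative)
b = 0.2                # bias term

z = weighted_sum(x, w, b)        # 0.4 - 0.12 - 1.2 + 0.2 = -0.72
output = math.tanh(z)            # typical nonlinear activation applied to z
print(z, output)
```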

The weights in a weighted sum encode the relative importance of each input to the final output. A large positive weight amplifies a feature's influence, a large negative weight suppresses or inverts it, and a weight near zero effectively ignores it. During training, these weights are learned through backpropagation and gradient descent: the network computes a loss measuring prediction error, calculates how that loss changes with respect to each weight, and nudges the weights in the direction that reduces error. Over many iterations, the weighted sums across all layers collectively learn to extract and combine features in ways that solve the target task.
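
The sketch below shows one such update for a single linear neuron under squared error; the data, target, and learning rate are hypothetical, and a real network applies the same rule to every weight in every layer via backpropagation.

```python
def step(x, w, b, target, lr=0.1):
    """One gradient-descent step for a linear neuron with loss 0.5*(z - target)^2."""
    z = sum(wi * xi for wi, xi in zip(w, x)) + b     # weighted sum (prediction)
    error = z - target                               # dLoss/dz
    w = [wi - lr * error * xi for wi, xi in zip(w, x)]   # dz/dwi = xi
    b = b - lr * error                               # dz/db = 1
    return w, b

w, b = [0.0, 0.0], 0.0
for _ in range(50):
    w, b = step([1.0, 2.0], w, b, target=3.0)
print(w, b)   # weights converge so that the weighted sum approaches 3.0
```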

Weighted sums are not unique to neural networks — they appear in linear regression, support vector machines, attention mechanisms, and ensemble methods like boosting, where base model predictions are combined using learned or heuristic weights. In the transformer architecture, the scaled dot-product attention mechanism is essentially a weighted sum of value vectors, where the weights are derived from query-key similarity scores. This makes the weighted sum one of the most pervasive primitives in all of machine learning.
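
A short NumPy sketch makes the attention connection explicit: the softmax of query-key scores produces the weights, and the output is a weighted sum of the value vectors. The shapes and random data here are assumptions chosen purely for illustration.

```python
import numpy as np

def attention(Q, K, V):
    """Scaled dot-product attention: a weighted sum of value vectors."""
    d_k = K.shape[-1]
    scores = Q @ K.T / np.sqrt(d_k)                       # query-key similarity
    weights = np.exp(scores - scores.max(axis=-1, keepdims=True))
    weights /= weights.sum(axis=-1, keepdims=True)        # softmax -> attention weights
    return weights @ V                                    # weighted sum of rows of V

rng = np.random.default_rng(0)
Q = rng.normal(size=(2, 4))   # 2 queries, dimension 4
K = rng.normal(size=(3, 4))   # 3 keys
V = rng.normal(size=(3, 4))   # 3 values
out = attention(Q, K, V)      # each output row is a weighted sum of V's rows
print(out.shape)              # (2, 4)
```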

Understanding weighted sums is essential for interpreting model behavior. Techniques like saliency maps and LIME approximate complex models locally as weighted sums to explain individual predictions. The concept also connects directly to the notion of linear separability — a single weighted sum with a threshold can classify any linearly separable dataset, while deeper compositions of weighted sums and nonlinearities handle far more complex decision boundaries.
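
As a minimal illustration of that last point, the sketch below classifies the linearly separable AND function with a single thresholded weighted sum; the weights are chosen by hand here rather than learned.

```python
def classify(x, w, b):
    """Threshold a weighted sum: the decision rule of a single linear unit."""
    z = sum(wi * xi for wi, xi in zip(w, x)) + b
    return 1 if z > 0 else 0

w, b = [1.0, 1.0], -1.5   # decision boundary: x1 + x2 = 1.5
for x in [(0, 0), (0, 1), (1, 0), (1, 1)]:
    print(x, classify(x, w, b))   # only (1, 1) is classified as 1
```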

Related

Weight
A learnable parameter that scales the influence of inputs within a model.
Generality: 850

Neural Network
A layered system of interconnected nodes that learns patterns from data.
Generality: 947

Local Weight Sharing
Reusing the same weights across spatial positions to detect patterns regardless of location.
Generality: 694

Artificial Neuron
The basic computational unit of neural networks, modeled on biological neurons.
Generality: 875

Perceptron
A single-neuron linear classifier that learns binary decisions by adjusting weighted inputs.
Generality: 795

Attention Mechanisms
Neural network components that dynamically weight input elements by their contextual relevance.
Generality: 865