
Research lab hosting Josh Tenenbaum's Computational Cognitive Science group, a leader in probabilistic programming and neuro-symbolic models.
Organization building tools for artist consent and data protection, including Kudurru which tracks scraping and offers defensive tools.
United Kingdom · University
The Centre for Cold Matter develops portable quantum accelerometers for navigation without satellite support.
Researchers involved in the development of Quipper, a scalable functional quantum programming language embedded in Haskell.
Developers of the Gemini family of models, which are trained from the start to be multimodal across text, images, video, and audio.
The global hub for open-source AI models and datasets. Founded by French entrepreneurs with a major office in Paris.
Adversarial noise cloaks algorithmically perturb pixels, textures, or audio spectra so computer-vision and voiceprint models misclassify what they see or hear while humans perceive little change. Tools such as Glaze, Nightshade, and PhotoGuard optimize perturbations against the feature extractors underlying state-of-the-art models, outputting overlays that travel with an image even after resizing or mild compression. For video, temporal cloaks spread perturbations across frames to avoid flicker, and audio cloaks hide carrier signals in frequencies that smartphones capture but humans ignore.
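The core mechanism can be sketched with a toy gradient-sign perturbation (FGSM-style) against a hypothetical linear "recognizer" — a deliberately simplified stand-in for the deep feature extractors that tools like Glaze actually target. All names and parameters here are illustrative assumptions, not any tool's real API:

```python
import numpy as np

# Toy stand-in for a recognizer: a linear scorer f(x) = w . x that
# predicts class 1 ("match") when the score is positive.
rng = np.random.default_rng(0)
w = rng.normal(size=64)          # fixed, known model weights

def predict(v):
    return int(w @ v > 0)

# Craft an input that the model classifies as 1 with a small margin:
# project a random "image" so that w . x lands exactly at +0.1.
x = rng.uniform(0.0, 1.0, size=64)
x = x - w * ((w @ x - 0.1) / (w @ w))   # now w @ x == 0.1

# FGSM-style cloak: nudge every pixel by at most eps in the direction
# that lowers the class-1 score, i.e. against sign(df/dx) = sign(w).
eps = 0.05                       # per-pixel L-infinity budget
cloaked = x - eps * np.sign(w)

print(predict(x), predict(cloaked))      # prediction flips: 1 -> 0
print(np.max(np.abs(cloaked - x)))       # max per-pixel change == eps
```

The design point the sketch illustrates: the perturbation is bounded per pixel (small to human eyes) yet aligned with the model's gradient, so its effect on the score accumulates across all dimensions and flips the decision. Real cloaking tools replace the linear scorer with a learned feature extractor and add constraints so the perturbation survives resizing and compression.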
Artists, journalists, and public figures deploy cloaks to stop style-transfer models from cloning their work or to keep biometric signatures out of unauthorized datasets. Newsrooms apply them to protest footage to protect demonstrators without blurring entire scenes, and fashion brands encode cloaks into lookbooks so counterfeiters can’t easily lift patterns. As generative models are open-sourced faster than legal frameworks can evolve, cloaks provide a grassroots defense that doesn’t require waiting for platform policy.
Yet the tactic sits at roughly technology readiness level (TRL) 4: validated in the lab, not yet proven in the field. Arms races ensue as model builders retrain on cloaked data, and some jurisdictions debate whether intentionally misleading algorithms violates anti-circumvention laws. Researchers push toward certified defenses with provable robustness guarantees, while policy groups argue for a right to “algorithmic camouflage.” Expect adversarial cloaks to be one layer in a broader strategy alongside provenance tags and licensing frameworks, especially for creators who cannot afford lengthy legal battles over data misuse.