One of Europe’s most prominent AI startups has released two AI models that are so tiny, they are named after a chicken’s brain and a fly’s brain.
Multiverse Computing claims these are the world’s smallest models that are still high performing and can handle chat, speech, and, in one case, even reasoning.
These new small models are intended to be embedded into internet of things devices, as well as to run locally on smartphones, tablets, and PCs.
“We can compress the models so much that they can fit on devices,” Orús told TechCrunch. “You can run them on premises, directly on your iPhone, or on your Apple Watch.”
As we previously reported, Multiverse Computing is a buzzy European AI startup headquartered in Donostia, Spain, with about 100 employees in offices worldwide. It was co-founded by a top European professor of quantum computing and physics, Román Orús; quantum computing expert Samuel Mugel; and Enrique Lizaso Olmos, the former deputy CEO of Unnim Banc.
It just raised €189 million (about $215 million) in June on the strength of a model compression technology it calls “CompactifAI.” (Since it was founded in 2019, it has raised about $250 million, Orús said.)
CompactifAI is a quantum-inspired compression algorithm that reduces the size of existing AI models without sacrificing their performance, Orús said.
“We have a compression technology that is not the typical compression technology that people from computer science or machine learning would use, because we come from quantum physics,” he said. “It’s a much more subtle and refined compression algorithm.”
The company has already released a long list of compressed versions of open-source models, particularly popular small models like Llama 4 Scout or Mistral Small 3.1. And it just launched compressed versions of OpenAI’s two new open models. It has also compressed some very large models – it offers a DeepSeek R1 Slim, for instance.
But since it’s in the business of making models smaller, it has focused extra attention on making the smallest yet most powerful models possible.
Its two new models are so small that they can bring chat AI capabilities to just about any IoT device and work without an internet connection, the company says. It humorously calls this family the Model Zoo because it’s naming the products based on animal brain sizes.
A model it calls SuperFly is a compressed version of Hugging Face’s open-source model SmolLM2 135. The original has 135M parameters and was developed for on-device uses. SuperFly is 94M parameters, which Orús likens to the size of a fly’s brain. “This is like having a fly, but a little bit more clever,” he said.
SuperFly is designed to be trained on very restricted data, like a device’s operations. Multiverse envisions it embedded into home appliances, allowing users to operate them with voice commands like “start quick wash” for a washing machine. Or users can ask troubleshooting questions. With a little processing power (like an Arduino), the model can handle a voice interface, as the company showed in a live demo to TechCrunch.
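For a sense of what running a model of this size locally looks like in practice, here is a minimal, hypothetical sketch using the open SmolLM2 base model (the model SuperFly is compressed from) with the Hugging Face transformers library. The checkpoint name HuggingFaceTB/SmolLM2-135M-Instruct is the assumed public model ID; this is not Multiverse’s SuperFly, whose weights and API are not described in this article.

# Illustrative only: loads the ~135M-parameter open SmolLM2 checkpoint and
# generates text entirely on the local machine (CPU is fine at this size).
# This is the base model SuperFly is derived from, not SuperFly itself.
from transformers import pipeline

generator = pipeline(
    "text-generation",
    model="HuggingFaceTB/SmolLM2-135M-Instruct",  # assumed public model ID
)

prompt = "To run the quick-wash cycle on this washing machine, you should"
result = generator(prompt, max_new_tokens=40)
print(result[0]["generated_text"])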
The other model is named ChickBrain and is much larger, at 3.2 billion parameters, but it is also far more capable and has reasoning capabilities. It’s a compressed version of Meta’s Llama 3.1 8B model, Multiverse says. Yet it’s small enough to run on a MacBook, no internet connection required.
More importantly, Orús said that ChickBrain actually slightly outperforms the original on several standard benchmarks, including the language-skill benchmark MMLU-Pro, the math-skills benchmarks Math 500 and GSM8K, and the general-knowledge benchmark GPQA Diamond.
Here are the results of Multiverse’s internal tests of ChickBrain on those benchmarks. The company didn’t offer benchmark results for SuperFly, but Multiverse also isn’t targeting SuperFly at use cases that require reasoning.

It’s important to note that Multiverse isn’t claiming that its Model Zoo will beat the largest state-of-the-art models on such benchmarks. Zoo performances might not even land on the leaderboards. The point, the company says, is that its tech can shrink model size without a performance hit.
Orús says the company is already in talks with all the leading device and appliance makers. “We are talking with Apple. We are talking with Samsung, also with Sony and with HP, obviously. HP came in as an investor in the last round,” he said. The round was led by well-known European VC firm Bullhound Capital, with participation from a lot of others, including HP Tech Ventures and Toshiba.
The startup also offers compression tech for other forms of machine learning, like image recognition, and in six years has landed clients like BASF, Ally, Moody’s, Bosch, and others.
In addition to selling its models directly to major device manufacturers, Multiverse offers its compressed models via an API hosted on AWS that any developer can use, often at lower token fees than competitors.