One of Europe’s most prominent AI startups has released two AI models so tiny they have been named after a chicken’s brain and a fly’s brain. Multiverse Computing claims these are the world’s smallest models that still deliver high performance, capable of handling chat, speech, and even reasoning in one case. These compact models are designed to be embedded into Internet of Things devices and run locally on smartphones, tablets, and PCs.
According to Román Orús, co-founder of Multiverse Computing, the models can be compressed to fit on devices, allowing them to operate directly on an iPhone or Apple Watch without needing an internet connection.
Multiverse Computing is a well-known European AI startup based in Donostia, Spain, with around 100 employees across global offices. It was co-founded by Román Orús, a leading professor in quantum computing and physics, alongside quantum computing expert Samuel Mugel and Enrique Lizaso Olmos, former deputy CEO of Unnim Banc.
In June, the company raised €189 million on the strength of its proprietary model compression technology, CompactifAI. Since its founding in 2019, Multiverse has secured approximately $250 million in funding.
CompactifAI is a quantum-inspired compression algorithm that reduces the size of existing AI models without compromising performance. Rather than relying on standard techniques such as quantization and pruning, it borrows tensor-network methods from quantum physics to compress models in a more fine-grained way.
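Multiverse has not published CompactifAI's internals, so the following is only a rough sketch of the general tensor-network idea rather than the company's actual algorithm: a layer's weight matrix is reshaped into a higher-order tensor and factored into a "tensor train" by sweeping truncated SVDs, trading a little reconstruction accuracy for a large drop in parameter count. The function name, shapes, and rank cap below are invented for the toy demo.

```python
import numpy as np

def tt_compress(weight, dims, max_rank):
    """Illustrative tensor-train (matrix product state) compression:
    reshape a weight matrix into a higher-order tensor, then sweep
    truncated SVDs to produce a chain of small cores."""
    tensor = weight.reshape(dims)                    # e.g. (1024, 1024) -> (32, 32, 32, 32)
    cores, rank = [], 1
    for d in dims[:-1]:
        mat = tensor.reshape(rank * d, -1)           # unfold the remaining tensor
        u, s, vt = np.linalg.svd(mat, full_matrices=False)
        keep = min(max_rank, len(s))                 # truncate to a bounded rank
        cores.append(u[:, :keep].reshape(rank, d, keep))
        tensor = np.diag(s[:keep]) @ vt[:keep]       # carry the remainder forward
        rank = keep
    cores.append(tensor.reshape(rank, dims[-1], 1))  # final core
    return cores

# Toy demo: one 1024x1024 layer (~1.05M parameters) reduced to a few small cores.
w = np.random.randn(1024, 1024).astype(np.float32)
cores = tt_compress(w, dims=(32, 32, 32, 32), max_rank=16)
kept = sum(c.size for c in cores)
print(f"original: {w.size:,} params  ->  compressed: {kept:,} params")
```

The appeal of the tensor-network framing is that the truncation rank gives a single dial for how aggressively each layer is squeezed, which is consistent with the company's claim of finer-grained control than blunt quantization.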
The company has already released compressed versions of several open-source models, including Llama 4 Scout and Mistral Small 3.1. It has also introduced compressed versions of OpenAI’s latest open models. While Multiverse works with large models, its focus is on creating the smallest yet most powerful models possible.
The two newest models, humorously named after animal brain sizes, are designed to bring AI capabilities to IoT devices without requiring an internet connection.
The first, SuperFly, is a compressed version of Hugging Face’s SmolLM2-135M, a model originally built for on-device use. With just 94 million parameters, Orús likens its size to that of a fly’s brain. SuperFly is optimized for training on limited data, such as a device’s operational logs. The company envisions it being embedded in home appliances, enabling voice commands like “start quick wash” for a washing machine or answering troubleshooting questions.
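SuperFly itself is not publicly downloadable under any name given here, but its base model is on the Hugging Face Hub, which makes it easy to get a feel for what a fly-brain-sized model running locally looks like. The sketch below loads the instruct variant of SmolLM2-135M with the transformers library and asks a troubleshooting-style question entirely on a laptop CPU; the prompt and generation settings are illustrative, not Multiverse's.

```python
from transformers import AutoModelForCausalLM, AutoTokenizer

# Base model that SuperFly is reportedly compressed from; at ~135M parameters
# it loads and runs comfortably on laptop- or phone-class hardware.
model_id = "HuggingFaceTB/SmolLM2-135M-Instruct"

tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForCausalLM.from_pretrained(model_id)

# A troubleshooting-style question of the kind described for embedded appliances.
messages = [{"role": "user", "content": "My washing machine shows error E4. What should I check?"}]
inputs = tokenizer.apply_chat_template(messages, add_generation_prompt=True, return_tensors="pt")

outputs = model.generate(inputs, max_new_tokens=80, do_sample=False)
print(tokenizer.decode(outputs[0][inputs.shape[-1]:], skip_special_tokens=True))
```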
The second model, ChickBrain, is larger at 3.2 billion parameters but far more capable, featuring reasoning abilities. It is a compressed version of Meta’s Llama 3.1 8B model and is small enough to run on a MacBook offline. Surprisingly, ChickBrain slightly outperforms the original model in several benchmarks, including language skills, math proficiency, and general knowledge tests.
Multiverse emphasizes that its Model Zoo is not intended to compete with the largest state-of-the-art models. Instead, the technology demonstrates that models can be significantly reduced in size without sacrificing performance.
The startup is already in discussions with major device manufacturers, including Apple, Samsung, Sony, and HP, the latter of which participated in its latest funding round led by Bullhound Capital.
Beyond AI models, Multiverse offers compression solutions for other machine learning applications, such as image recognition. Over six years, it has secured clients like BASF, Ally, Moody’s, and Bosch.
In addition to selling directly to manufacturers, Multiverse provides its compressed models via an AWS-hosted API, offering developers a cost-effective alternative with competitive token pricing.