Fundamental Raises $255 Million Series A With A New Take On Big Data Analysis

A man wearing a large black t-shirt stands in a sunlit office.

7:00 AM PST · February 5, 2026

An AI lab called Fundamental emerged from stealth on Thursday, offering a new foundation model to solve an old problem: how to draw insights from the vast quantities of structured data produced by enterprises. By combining the older systems of predictive AI with more modern tools, the company believes it can reshape how large enterprises analyze their data.

“While LLMs have been great at working with unstructured data, like text, audio, video, and code, they don’t work well with structured data like tables,” CEO Jeremy Fraenkel told TechCrunch. “With our model Nexus, we have built the best foundation model to handle that type of data.”

The idea has already drawn significant interest from investors. The company is emerging from stealth with $255 million in funding. The bulk of it comes from a new $225 million Series A round led by Oak HC/FT, Valor Equity Partners, Battery Ventures, and Salesforce Ventures; Hetz Ventures also participated in the Series A, with angel backing from Perplexity CEO Aravind Srinivas, Brex co-founder Henrique Dubugras, and Datadog CEO Olivier Pomel.

Called a Large Tabular Model (LTM) rather than a Large Language Model (LLM), Fundamental’s Nexus breaks from modern AI practice in a number of important ways. The model is deterministic, meaning it will give the same answer every time it is asked a given question, and it doesn’t rely on the transformer architecture that defines models from most modern AI labs. Fundamental calls it a foundation model because it goes through the normal steps of pre-training and fine-tuning, but the result is something profoundly different from what a customer would get when partnering with OpenAI or Anthropic.
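Fundamental has not published Nexus’s interface, so the snippet below is only a sketch of what “deterministic” means in practice; the `NexusClient` class, its `predict` method, and the table URI are invented for illustration. The point is simply that the same question asked of the same table always yields the same answer, with no sampling involved.

```python
import hashlib

class NexusClient:
    """Stand-in for a deterministic tabular-model client (not Fundamental's real API)."""

    def predict(self, table_uri: str, question: str) -> float:
        # A deterministic model returns the same output for the same inputs:
        # no sampling temperature, no random seed. Here the property is faked
        # with a hash of the inputs, purely to make the sketch runnable.
        digest = hashlib.sha256(f"{table_uri}|{question}".encode()).hexdigest()
        return int(digest[:8], 16) / 0xFFFFFFFF

client = NexusClient()
a = client.predict("s3://warehouse/orders.parquet", "expected churn rate next quarter")
b = client.predict("s3://warehouse/orders.parquet", "expected churn rate next quarter")
assert a == b  # identical question over identical data yields an identical answer
```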

Those differences matter because Fundamental is chasing a use case where modern AI models often falter. Because transformer-based AI models can only process data that fits within their context window, they often have trouble reasoning over extremely large datasets, such as a spreadsheet with billions of rows. But that kind of enormous structured dataset is common inside large enterprises, creating a significant opportunity for models that can handle the scale.
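For a rough sense of why context windows are the bottleneck here, a back-of-envelope calculation helps; the row count, tokens-per-row estimate, and context size below are illustrative assumptions, not figures from Fundamental.

```python
# Illustrative estimate of why a billion-row table overwhelms an LLM context window.
# All figures are assumptions chosen for the sake of the sketch.

ROWS = 1_000_000_000        # a table with a billion rows, as in the example above
TOKENS_PER_ROW = 20         # assume ~20 tokens to serialize one row as text
CONTEXT_WINDOW = 1_000_000  # a generous 1M-token context window

table_tokens = ROWS * TOKENS_PER_ROW
print(f"Tokens needed to serialize the table: {table_tokens:,}")
print(f"Roughly {table_tokens / CONTEXT_WINDOW:,.0f}x larger than a 1M-token context window")
```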

As Fraenkel sees it, that’s a huge opportunity for Fundamental. Using Nexus, the company can bring modern techniques to Big Data analysis, offering something more powerful and flexible than the algorithms currently in use.

“You can now have one model across all of your use cases, so you can now massively expand the number of use cases that you tackle,” he told TechCrunch. “And on every one of those use cases, you get better performance than what you would otherwise be able to do with an army of data scientists.”

That promise has already brought in a number of high-profile deals, including seven-figure contracts with Fortune 100 clients. The company has also entered into a strategic partnership with AWS that will allow AWS users to deploy Nexus directly from existing instances.

Russell Brandom has been covering the tech industry since 2012, with a focus on platform policy and emerging technologies. He previously worked at The Verge and Rest of World, and has written for Wired, The Awl, and MIT’s Technology Review. He can be reached at russell.brandom@techcrunch.com or on Signal at 412-401-5489.
