The explosion of AI companies has pushed demand for computing power to new extremes, and firms like CoreWeave, Together AI, and Lambda Labs have capitalized on that demand, attracting immense amounts of attention and capital for their ability to provide distributed compute capacity.
But most companies still store data with the big three cloud providers, AWS, Google Cloud, and Microsoft Azure, whose storage systems were built to keep data close to their own compute resources, not spread across multiple clouds or regions.
“Modern AI workloads and AI infrastructure are choosing distributed computing instead of big cloud,” Ovais Tariq, co-founder and CEO of Tigris Data, told TechCrunch. “We want to provide the same option for storage, because without storage, compute is nothing.”
Tigris, founded by the team that built Uber’s storage platform, is building a network of localized data storage centers that it claims can meet the distributed compute needs of modern AI workloads. The startup’s AI-native storage platform “moves with your compute, [allows] data [to] automatically replicate to where GPUs are, supports billions of small files, and provides low-latency access for training, inference, and agentic workloads,” Tariq said.
To do all of that, Tigris recently raised a $25 million Series A round led by Spark Capital with participation from existing investors, which include Andreessen Horowitz, TechCrunch has exclusively learned. The startup is going up against the incumbents, which Tariq calls “Big Cloud.”

Tariq believes these incumbents not only offer a more expensive data storage service, but also a less efficient one. AWS, Google Cloud, and Microsoft Azure have historically charged egress fees (dubbed a “cloud tax” in the industry) if a customer wants to migrate to another cloud provider, or download and move their data if they want to, say, use a cheaper GPU or train models in different parts of the world simultaneously. Think of it like having to pay your gym extra if you want to stop going there.
According to Batuhan Taskaya, head of engineering at Fal.ai, one of Tigris’ customers, those costs once accounted for the majority of Fal’s cloud spending.
Beyond egress fees, Tariq says there is still the problem of latency with larger cloud providers. “Egress fees were just one symptom of a deeper problem: centralized storage that can’t keep up with a decentralized, high-speed AI ecosystem,” he said.
Most of Tigris’ 4,000+ customers are like Fal.ai: generative AI startups building image, video, and voice models, which tend to have large, latency-sensitive datasets.
“Imagine talking to an AI agent that’s doing local audio,” Tariq said. “You want the lowest latency. You want your compute to be local, close by, and you want your storage to be local, too.”
Big clouds aren’t optimized for AI workloads, he added. Streaming huge datasets for training or running real-time inference across multiple regions can create latency bottlenecks, slowing model performance. But being able to access localized storage means data is retrieved faster, which means developers can run AI workloads reliably and more cheaply using decentralized clouds.
“Tigris lets us scale our workloads in any cloud by providing access to the same data filesystem from all these places without charging egress,” Fal’s Taskaya said.
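The article doesn’t detail how customers integrate with Tigris, but the pattern Taskaya describes resembles pointing compute in any cloud at a single object storage endpoint. Below is a minimal sketch of that idea, assuming an S3-compatible API; the endpoint URL, bucket, key, and environment variables are hypothetical placeholders, not Tigris’ actual configuration.

```python
# Sketch: fetch the same training shard from GPU nodes in any cloud by
# pointing at one storage endpoint (all names below are hypothetical).
import os

import boto3

s3 = boto3.client(
    "s3",
    endpoint_url=os.environ.get("STORAGE_ENDPOINT", "https://storage.example.com"),
    aws_access_key_id=os.environ["STORAGE_ACCESS_KEY"],
    aws_secret_access_key=os.environ["STORAGE_SECRET_KEY"],
)

# The same bucket and key are used whether the job runs on AWS, GCP, Azure,
# or a GPU cloud, so there is no per-cloud copy of the dataset to maintain.
response = s3.get_object(Bucket="training-data", Key="shards/shard-00042.tar")
payload = response["Body"].read()
print(f"fetched {len(payload)} bytes")
```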
There are other reasons why companies want to have data closer to their distributed cloud options. For example, in highly regulated fields like finance and healthcare, one big roadblock to adopting AI tools is that enterprises need to ensure data security.
Another motivation, says Tariq, is that companies increasingly want to own their data, pointing to how Salesforce earlier this year blocked its AI rivals from using Slack data. “Companies are becoming more and more aware of how important the data is, how it’s fueling the LLMs, how it’s fueling the AI,” Tariq said. “They want to be more in control. They don’t want someone else to be in charge of it.”
With the new funds, Tigris intends to continue building out its data storage centers to support increasing demand; Tariq says the startup has grown 8x annually since its founding in November 2021. Tigris already has three data centers, in Virginia, Chicago, and San Jose, and wants to keep expanding in the U.S. as well as in Europe and Asia, specifically in London, Frankfurt, and Singapore.