Lamini, a Palo Alto-based startup building a platform to help businesses develop generative AI technology, has raised $25 million from investors including Stanford computer science professor Andrew Ng.
Lamini, founded several years ago by Sharon Zhou and Greg Diamos, has an interesting sales pitch.
Many generative AI platforms are too generic, Zhou and Diamos argue, and lack the solutions and infrastructure tailored to companies' needs. Lamini, by contrast, was built from the ground up with enterprises in mind and focuses on delivering highly accurate and scalable generative AI.
“The top priority of almost every CEO, CIO and CTO is to take advantage of generative AI in their organization with maximum return on investment,” Zhou, CEO of Lamini, told TechCrunch. “But while it’s easy to get a working demo on a laptop for an individual developer, the road to production is littered with failures left and right.”
According to Zhou, many companies have expressed frustration at the barriers to meaningful adoption of generative AI in their business operations.
According to a March poll from MIT Insights, only 9% of organizations have widely adopted generative AI, even though 75% have experimented with it. Top barriers run the gamut from lack of IT infrastructure and capabilities to poor governance structures, insufficient skills and high implementation costs. Security is also a major factor: in a recent survey by Insight Enterprises, 38% of companies said security concerns were affecting their ability to leverage generative AI technology.
So what is Lamini’s answer?
Zhou says “every piece” of Lamini’s technology stack is optimized for enterprise-scale AI workloads, from the hardware to the software, including the engines used for model orchestration, fine-tuning, inference and training. “Optimized” is a vague word, admittedly, but Lamini pioneered a technique Zhou calls “memory tuning,” which trains a model on data in such a way that it recalls parts of that data exactly.
Memory tuning can potentially reduce hallucinations, Zhou claims, or instances where a model invents information in response to a query.
“Memory tuning is a training paradigm, as efficient as fine-tuning but going beyond it, for training a model on proprietary data that includes key facts, figures and numbers so that the model achieves high accuracy,” Nina Wei, an AI designer at Lamini, told me via email, “and can memorize and recall the exact match of any key information rather than generalizing or hallucinating.”
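Lamini hasn’t published the details of how memory tuning works, so purely for illustration, here is a minimal sketch of what a technique in that spirit could look like if it amounts to fine-tuning a language model with the loss concentrated on the tokens that carry the facts to be recalled. The model name, the toy facts and the loss masking below are my assumptions, not Lamini’s actual recipe.

```python
# Illustrative sketch only: fine-tune a small causal LM so it memorizes
# exact values from (hypothetical) proprietary facts, by computing the
# loss only over the answer tokens. Not Lamini's actual method.
import torch
from torch.nn import functional as F
from transformers import AutoModelForCausalLM, AutoTokenizer

model_name = "gpt2"  # placeholder small model for illustration
tokenizer = AutoTokenizer.from_pretrained(model_name)
model = AutoModelForCausalLM.from_pretrained(model_name)
optimizer = torch.optim.AdamW(model.parameters(), lr=5e-5)

# Hypothetical proprietary facts: (prompt, exact answer the model must recall)
facts = [
    ("Q: What is the part number for the X11 valve? A:", " PN-88213"),
    ("Q: What discount applies to contract tier Gold? A:", " 17.5%"),
]

model.train()
for epoch in range(3):
    for prompt, answer in facts:
        prompt_ids = tokenizer(prompt, return_tensors="pt").input_ids
        answer_ids = tokenizer(answer, return_tensors="pt").input_ids
        input_ids = torch.cat([prompt_ids, answer_ids], dim=1)

        logits = model(input_ids).logits
        # Shift so each position predicts the next token.
        shift_logits = logits[:, :-1, :]
        shift_labels = input_ids[:, 1:]

        # Loss only on the answer tokens: the aim is exact recall of the
        # key values, not generic next-token modeling of the prompt.
        answer_start = prompt_ids.shape[1] - 1  # first predicted answer token
        loss = F.cross_entropy(
            shift_logits[:, answer_start:, :].reshape(-1, shift_logits.size(-1)),
            shift_labels[:, answer_start:].reshape(-1),
        )

        optimizer.zero_grad()
        loss.backward()
        optimizer.step()
```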
I’m not sure I buy it. “Memory tuning” seems to be more of a marketing term than an academic one; there are no research papers on it, at least none that I could find. I’ll let Lamini prove that its “memory tuning” is better than the other hallucination-reduction techniques that have been or are being attempted.
Fortunately for Lamini, memory tuning isn’t its only differentiator.
Zhou says the platform can operate in highly secure environments, including those with an air gap. Lamini enables companies to run, tune and train models in a range of configurations, from on-premises data centers to public and private clouds. And it scales workloads “elastically,” reaching over 1,000 GPUs if the application or use case demands it, Zhou says.
“Incentives are currently not aligned in the market with closed-source models,” Zhou said. “We intend to put control back in the hands of more people, not just a few, starting with businesses that care most about control and have the most to lose from their proprietary data being owned by someone else.”
Lamini’s co-founders are, for what it’s worth, quite successful in the AI space. They have also separately brushed shoulders with Ng, which no doubt explains his investment.
Zhou was previously on faculty at Stanford, where she led a group researching generative AI. Before receiving her PhD in computer science under Ng, she was a machine learning product manager at Google Cloud.
Diamos, for his part, co-founded MLCommons, the engineering consortium dedicated to creating standard benchmarks for AI models and hardware, as well as the MLCommons benchmarking suite, MLPerf. He also led AI research at Baidu, where he worked with Ng while the latter was chief scientist there. Diamos was also a software architect on Nvidia’s CUDA team.
The co-founders’ industry connections seem to have given Lamini a leg up on the fundraising front. In addition to Ng, Figma CEO Dylan Field, Dropbox CEO Drew Houston, OpenAI co-founder Andrej Karpathy, and, surprisingly, Bernard Arnault, the CEO of luxury goods giant LVMH, have all invested in Lamini.
AMD Ventures is also an investor (a bit ironic considering Diamos’ Nvidia roots), as are First Round Capital and Amplify Partners. AMD got involved early, supplying Lamini with data center hardware, and today, Lamini runs many of its models on AMD Instinct GPUs, bucking the industry trend.
Lamini claims that its models’ training and inference performance on that hardware is on par with equivalent Nvidia GPUs, depending on the workload. Since we’re not equipped to test that claim, we’ll leave it to third parties.
To date, Lamini has raised $25 million in seed and Series A rounds (Amplify led the Series A). Zhou says the money is being used to triple the company’s 10-person team, expand its computing infrastructure and begin development on “deeper technical optimizations.”
There are several enterprise-oriented AI vendors that could compete with aspects of Lamini’s platform, including tech giants like Google, AWS, and Microsoft (through its partnership with OpenAI). Google, AWS, and OpenAI, in particular, have been aggressively courting the enterprise in recent months, introducing features such as streamlined fine-tuning, fine-tuning on private data, and more.
I asked Zhou about Lamini’s customers, revenue and overall momentum. She wasn’t willing to reveal much at this somewhat early juncture, but said that AMD (via its AMD Ventures connection), AngelList and NordicTrack are among the first (paid) users of Lamini, along with several undisclosed government agencies.
“We’re growing fast,” she added. “The number one challenge is customer service. We have only handled incoming demand because we are flooded. Given the interest in generative AI, we are not representative of the overall tech slowdown — unlike our peers in the AI world, we have margins that look more like a normal tech company’s.”
Amplify General Partner Mike Dauber said: “We believe there is a huge opportunity for generative artificial intelligence in the enterprise. While there are many AI infrastructure companies, Lamini is the first I’ve seen that takes business problems seriously and creates a solution that helps businesses unlock the enormous value of their proprietary data while satisfying even the most stringent compliance and security requirements.”