As consumers, businesses, and governments flock to the promise of cheap, fast, and seemingly magical AI tools, one question keeps getting in the way: How do I keep my data private?
Tech giants such as OpenAI, Anthropic, xAI, Google, and others quietly absorb and retain user data to improve their models or monitor for safety and security, even in some enterprise settings where companies assume their information is off limits. For highly regulated industries or companies operating at the frontier, that gray area can be a dealbreaker. Fears about where data goes, who can see it, and how it might be used are slowing AI adoption in sectors like healthcare, finance, and government.
Enter Confident Security, a San Francisco-based startup that aims to be "the Signal for AI." The company's product, CONFSEC, is an end-to-end encryption tool that wraps around foundational models, guaranteeing that prompts and metadata cannot be stored, seen, or used for AI training, even by the model provider or any third party.
"The second you give up your data to someone else, you've essentially diminished your privacy," Jonathan Mortensen, founder and CEO of Confident Security, told TechCrunch. "And the goal of our product is to remove that trade-off."
Confident Security came out of stealth on Thursday with $4.2 million in funding from Decibel, South Park Commons, Ex Ante, and Swyx, TechCrunch has learned exclusively. The company wants to serve as an intermediary vendor between AI vendors and their customers, such as hyperscalers, governments, and enterprises.
Even AI companies could see value in offering Confident Security's tool to enterprise customers as a way to unlock that market, Mortensen said. He added that CONFSEC is also well suited to the new AI browsers hitting the market, such as the recently released Comet, to give customers guarantees that their sensitive data isn't being stored on a server somewhere that the company or bad actors could access, or being used to train AI.
CONFSEC is modeled on the architecture of Apple's Private Cloud Compute (PCC), which Mortensen says "is 10x better than anything out there" at guaranteeing that Apple cannot see your data when it runs certain AI tasks securely in the cloud.
Like Apple's PCC, Confident Security's system works by first anonymizing data, encrypting it and routing it through services such as Cloudflare or Fastly, so servers never see the original source or content. It then uses advanced encryption that allows decryption only under strict conditions.
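To make that first step concrete, here is a minimal Python sketch of the general pattern the article describes, not CONFSEC's actual code: the client encrypts the prompt for the inference server and hands the ciphertext to a relay, so the relay knows who is asking but not what, while the server sees the content but not the source. All names and the shared key are hypothetical.

```python
# Conceptual sketch only: anonymizing relay + client-side encryption.
import os
from dataclasses import dataclass
from cryptography.hazmat.primitives.ciphers.aead import AESGCM

# Hypothetical key shared between client and inference server (in practice
# this would be negotiated, never known to the relay).
SERVER_KEY = AESGCM.generate_key(bit_length=256)

@dataclass
class RelayedRequest:
    nonce: bytes
    ciphertext: bytes  # opaque bytes; carries no client identity

def client_encrypt(prompt: str) -> RelayedRequest:
    nonce = os.urandom(12)
    ct = AESGCM(SERVER_KEY).encrypt(nonce, prompt.encode(), None)
    return RelayedRequest(nonce, ct)

def relay_forward(req: RelayedRequest) -> RelayedRequest:
    # A relay such as a CDN only forwards opaque bytes and strips the
    # client's network identity; it cannot decrypt without SERVER_KEY.
    return req

def server_decrypt(req: RelayedRequest) -> str:
    return AESGCM(SERVER_KEY).decrypt(req.nonce, req.ciphertext, None).decode()

print(server_decrypt(relay_forward(client_encrypt("sensitive prompt"))))
```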
"So you can basically say, you're only allowed to decrypt this if you're not going to log the data, you're not going to use it for training, and you're not going to let anyone see it," Mortensen said.
Finally, the software that runs the AI inference is publicly logged and open to review, so experts can verify its guarantees.
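The second and third steps, decryption only under strict conditions and publicly logged inference software, can be sketched the same way. The snippet below is an illustration under assumed, hypothetical names (TRANSPARENCY_LOG, may_release_key, the policy fields), not Confident Security's API: the client releases a decryption key only to a server whose attested software build appears in a public log and whose declared policy forbids logging, training, and human access.

```python
# Conceptual sketch only: attestation-gated key release.
from dataclasses import dataclass

# Stand-in for a public, auditable log of approved inference-software builds.
TRANSPARENCY_LOG = {"a1b2c3"}  # hypothetical build hashes

@dataclass
class Attestation:
    software_hash: str
    policy: dict  # server's declared data-handling policy

def may_release_key(att: Attestation) -> bool:
    approved_build = att.software_hash in TRANSPARENCY_LOG
    safe_policy = (
        not att.policy.get("log", True)
        and not att.policy.get("train", True)
        and not att.policy.get("human_access", True)
    )
    return approved_build and safe_policy

good = Attestation("a1b2c3", {"log": False, "train": False, "human_access": False})
bad = Attestation("a1b2c3", {"log": True, "train": False, "human_access": False})
print(may_release_key(good))  # True  -> key released, prompt can be decrypted
print(may_release_key(bad))   # False -> decryption refused
```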
"Confident Security is ahead of the curve in recognizing that AI's future depends on trust built into the infrastructure itself," said Jess Leão, a partner at Decibel. "Without solutions like this, many businesses simply can't move forward with AI."
It's still early days for the year-old company, but Mortensen said CONFSEC has been tested, externally audited, and is production-ready. The team is in talks with banks, browsers, and search engines, among other potential customers, about adding CONFSEC to their infrastructure stacks.
"You bring the AI, we bring the privacy," Mortensen said.
