Confident Security, ‘the Signal for AI,’ comes out of stealth with $4.2M


As consumers, businesses, and governments flock to the promise of cheap, fast, and seemingly magical AI tools, one question keeps getting in the way: How do I keep my data private?

Tech giants like OpenAI, Anthropic, xAI, Google, and others are quietly scooping up and retaining user data to improve their models or monitor for safety and security, even in some enterprise contexts where companies assume their information is off limits. For highly regulated industries or companies building on the frontier, that gray area could be a dealbreaker. Fears about where data goes, who can see it, and how it might be used are slowing AI adoption in sectors like healthcare, finance, and government.

Enter San Francisco-based startup Confident Security, which aims to be “the Signal for AI.” The company’s product, CONFSEC, is an end-to-end encryption tool that wraps around foundational models, guaranteeing that prompts and metadata can’t be stored, seen, or used for AI training, even by the model provider or any third party.

“The second that you give up your data to somebody else, you’ve essentially reduced your privacy,” Jonathan Mortensen, founder and CEO of Confident Security, told TechCrunch. “And our product’s goal is to remove that trade-off.”

Confident Security came out of stealth on Thursday with $4.2 million in seed funding from Decibel, South Park Commons, Ex Ante, and Swyx, TechCrunch has exclusively learned. The company wants to serve as an intermediary vendor between AI vendors and their customers, such as hyperscalers, governments, and enterprises.

Even AI companies might see the value in offering Confident Security’s tool to enterprise clients as a way to unlock that market, said Mortensen. He added that CONFSEC is also well suited to the new AI browsers hitting the market, like Perplexity’s recently launched Comet, to give customers guarantees that their sensitive data isn’t being stored on a server somewhere that the company or bad actors could access, or that their work-related prompts aren’t being used to “train AI to do your job.”

CONFSEC is modeled after Apple’s Private Cloud Compute (PCC) architecture, which Mortensen says “is 10x better than anything out there in terms of guaranteeing that Apple cannot see your data” when it runs certain AI tasks securely in the cloud.

Like Apple’s PCC, Confident Security’s system works by first anonymizing data, encrypting it and routing it through services like Cloudflare or Fastly so that servers never see the original source or content. Next, it uses advanced encryption that only allows decryption under strict conditions.

“So you can say you’re only allowed to decrypt this if you’re not going to log the data, and you’re not going to use it for training, and you’re not going to let anybody see it,” Mortensen said.
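
One way to picture that “decrypt only under these terms” idea: cryptographically bind a usage policy to the ciphertext itself, so a server presenting different terms simply cannot decrypt the prompt. The sketch below is a minimal illustration of that pattern, not CONFSEC’s actual protocol; the policy fields and flow are hypothetical, and it uses AES-GCM associated data as a stand-in for the attestation-gated key release a production system would need.

```python
import json
import os

from cryptography.exceptions import InvalidTag
from cryptography.hazmat.primitives.ciphers.aead import AESGCM

# Hypothetical usage policy the data owner attaches to a prompt.
policy = json.dumps(
    {"logging": False, "training": False, "human_review": False},
    sort_keys=True,
).encode()

key = AESGCM.generate_key(bit_length=256)  # in practice, released only to verified nodes
nonce = os.urandom(12)
prompt = b"summarize this confidential contract"

# Encrypt with the policy as associated data: the ciphertext is now
# cryptographically bound to these exact terms.
ciphertext = AESGCM(key).encrypt(nonce, prompt, policy)

# A node that presents the agreed policy can decrypt the prompt.
assert AESGCM(key).decrypt(nonce, ciphertext, policy) == prompt

# A node presenting different terms (say, training allowed) cannot.
bad_policy = policy.replace(b'"training": false', b'"training": true')
try:
    AESGCM(key).decrypt(nonce, ciphertext, bad_policy)
except InvalidTag:
    print("decryption refused: policy mismatch")
```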

Finally, the software running the AI inference is publicly logged and open to review so that experts can verify its guarantees.
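
That transparency step can be pictured as a client-side check: before a prompt ever leaves the device, the client confirms that the build the server claims to be running appears in a public, auditable log, the same pattern Apple’s PCC uses. The snippet below is a hypothetical sketch of that check; the log contents and measurement scheme are illustrative, not the audited CONFSEC pipeline.

```python
import hashlib


def measure(image: bytes) -> str:
    """Digest standing in for a hardware-attested build measurement."""
    return "sha256:" + hashlib.sha256(image).hexdigest()


# Publicly logged builds that outside experts have reviewed (illustrative).
audited_build = b"inference-stack build artifact"
transparency_log = {measure(audited_build)}


def verify_server(attested_image: bytes) -> bool:
    """Send prompts only to servers whose measured build appears in the log."""
    return measure(attested_image) in transparency_log


assert verify_server(audited_build)           # reviewed build: accepted
assert not verify_server(b"unlogged build")   # unknown build: rejected
```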

“Confident Security is ahead of the curve in recognizing that the future of AI depends on trust built into the infrastructure itself,” Jess Leão, partner at Decibel, said in a statement. “Without solutions like this, many enterprises simply can’t move forward with AI.”

It’s still early days for the year-old company, but Mortensen said CONFSEC has been tested, externally audited, and is production-ready. The team is in talks with banks, browsers, and search engines, among other potential clients, to add CONFSEC to their infrastructure stacks.

“You bring the AI, we bring the privacy,” said Mortensen.
