You’ve gone home with a Tinder date and things are escalating. You don’t really know or trust this guy and you don’t want to get an STD, so… now what?
A company called Calmara wants you to take a picture of the man’s penis, then use its artificial intelligence to tell you whether your partner is “pure” or not.
Let’s get this out of the way right off the bat: you shouldn’t take a picture of anyone’s genitals and scan it with an AI tool to decide whether or not to have sex.
The Calmara case has more red flags than a bad first date, and it only gets worse when you consider that the majority of STDs are asymptomatic. Your partner could very well have an STD, yet Calmara would tell you he’s in the clear. This is why real STD tests rely on blood and urine samples to detect infection, not visual examination.
Other startups are tackling the need for affordable STD testing in a more responsible way.
“With laboratory diagnosis, sensitivity and specificity are two key measures that help us understand the test’s propensity for missing infections and false positives,” Daphne Chen, founder of TBD Health, told TechCrunch. “There’s always some level of error, even with highly rigorous tests, but test manufacturers like Roche are upfront with their validation rates for a reason — so clinicians can interpret the results.”
In the fine print, Calmara cautions that its findings should not substitute for medical advice. But its marketing suggests otherwise. Before TechCrunch contacted Calmara, the title of its website read “Calmara: The Intimate Best for Unprotected Sex” (it has since been updated to say “Safer Sex”). And in a promotional video, Calmara bills itself as “THE PERFECT CONNECTING SITE!”
Co-founder and CEO Mei-Ling Lu told TechCrunch that Calmara was not intended as a serious medical tool. “Calmara is a lifestyle product, not a medical application. It does not include medical conditions or discussions within its context and no medical practitioner is involved with the current Calmara experience. It’s a free information service.”
“We are updating communications to better reflect our intentions at this time,” Lu added. “The clear idea is to start a conversation about the state and testing of sexually transmitted diseases.”
Calmara is part of HeHealth, which was founded in 2019. Calmara and HeHealth use the same underlying AI, which the company says is 65-90% accurate. HeHealth is framed as a first step in assessing sexual health; the platform then helps users connect with partner clinics in their area to schedule an appointment for a real, comprehensive screening.
HeHealth’s approach is more reassuring than Calmara’s, but that’s a low bar, and even then there’s a huge red flag waving: data privacy.
“It’s good to see that they offer an anonymous mode, where you don’t have to associate your photos with personally identifiable information,” Valentina Milanova, founder of tampon-based STI screening startup Daye, told TechCrunch. “This does not mean, however, that their service has been de-identified or anonymized, as your photos may still be traced to your email or IP address.”
HeHealth and Calmara also claim to comply with HIPAA, a regulation that protects patient privacy, because they use Amazon Web Services. That sounds reassuring, but in its privacy policy, Calmara writes that it shares user information with “service providers and partners who help operate the Services, including data hosting, analytics, marketing, payment processing and security.” The companies also don’t specify whether the AI scans happen on your device or in the cloud, and if the latter, how long that data stays in the cloud and what it’s used for. That’s too vague to reassure users that their intimate photos are safe.
These security questions aren’t just a concern for users; they’re a liability for the company itself. What happens if a minor uses the site to check for STDs? Calmara then ends up in possession of child sexual abuse material. Calmara’s answer to this moral and legal liability is a line in its Terms of Service prohibiting use by minors, but that disclaimer would carry little legal weight in practice.
Calmara represents the danger of overhyped technology: it looks like a publicity stunt for HeHealth to capitalize on the buzz around AI, but in real-world use, it just gives people a false sense of security about their sexual health, and the consequences of that are serious.
“Sexual health is a difficult space to innovate in, and I can see where their intentions are noble,” Chen said. “I just think they might be too quick to market with a solution that isn’t well built.”