Welcome to TechCrunch Exchange, a weekly startups-and-market newsletter. It’s inspired by the TechCrunch+ daily column it’s named after. Want it in your inbox every Saturday? Register here.
Technology’s ability to reinvent the wheel has its downsides: It can mean you’re ignoring glaring truths that others have already learned. But the good news is that new founders sometimes figure it out on their own faster than their predecessors. — Anna
AI, trust and safety
This year is an Olympic year, a leap year . . . and also an election year. And before you accuse me of U.S. defaultism, I’m not just thinking about the Biden vs. Trump rematch: More than 60 countries are holding national elections, not to mention the European Parliament.
It remains to be seen how each of these votes might impact tech companies; for instance, different parties tend to have different takes on AI regulation. But even before the elections are held, technology will have a role to play in guaranteeing their integrity.
Electoral integrity probably wasn’t on Mark Zuckerberg’s mind when he created Facebook, and maybe not even when he bought WhatsApp. But 20 and 10 years later, respectively, trust and safety is a responsibility that Meta and other tech giants can’t escape, whether they like it or not. That means working to prevent misinformation, fraud, hate speech, CSAM (child sexual abuse material), self-harm and more.
However, AI will likely make the task harder, and not just because of deepfakes or because it empowers more bad actors, says Lotan Levkowitz, general partner at Grove Ventures:
All these trust and safety platforms have this hash-sharing database, so I can upload what’s bad there, share it with all my communities, and together they’ll stop it. But today, I can train the model to try to avoid it. So even the most classic trust and safety work, because of gen AI, becomes harder and harder, because the algorithm can help bypass all these things.
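For readers unfamiliar with those hash-sharing databases, here is a minimal, hypothetical sketch in Python of the mechanism Levkowitz describes; SHARED_BAD_HASHES and both functions are illustrative stand-ins, not any platform’s or vendor’s actual API:

```python
import hashlib

# Hypothetical shared blocklist of digests contributed by partner platforms.
# Real systems typically use perceptual hashes (e.g., PhotoDNA, PDQ) rather
# than plain SHA-256, so that small edits to an image still match.
SHARED_BAD_HASHES: set[str] = set()

def register_bad_content(content: bytes) -> None:
    """Add a known-bad item's digest to the shared database."""
    SHARED_BAD_HASHES.add(hashlib.sha256(content).hexdigest())

def is_known_bad(upload: bytes) -> bool:
    """Exact-match lookup of an upload against the shared hash database."""
    return hashlib.sha256(upload).hexdigest() in SHARED_BAD_HASHES

# The weakness Levkowitz points to: freshly generated or subtly altered
# content produces a digest that was never shared, so this check lets it
# through even when the material is harmful.
```

The design choice behind such databases is that each platform only has to see a piece of harmful content once for every participant to block it; the catch, as the quote notes, is that generative models can keep producing material that falls outside any shared list.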
From hindsight to foresight
Although online forums had already learned a lot about content moderation, there was no social media playbook for Facebook to follow when it launched, so it’s somewhat understandable that it took the company some time to rise to the task. But it is disappointing to learn from internal Meta documents that, as recently as 2017, there was still internal reluctance to adopt measures that could better protect children.
Zuckerberg was one of five social media CEOs to appear at a recent U.S. Senate hearing on children’s online safety. The hearing was far from Meta’s first, but the fact that Discord was also summoned is worth noting: While the company has branched out beyond its gaming roots, it’s a reminder that threats to trust and safety can arise in many places online, and that a social gaming app, for example, could also expose its users to phishing or grooming.
Will younger companies learn faster than the FAANGs did? That’s not guaranteed: Founders often operate from first principles, which is both good and bad, and the content moderation learning curve is real. But OpenAI is much younger than Meta, so it’s encouraging to hear that it is forming a new team to study child safety, even if that may be a result of the scrutiny it’s under.
Some startups, however, aren’t waiting for signs of trouble before taking action. ActiveFence, a provider of AI-enabled trust and safety solutions and part of the Grove Ventures portfolio, is seeing more inbound requests, CEO Noam Schwartz told me.
“I’ve seen a lot of people approaching our team from companies that have just started or haven’t even started yet. They are thinking about the safety of their products during the design phase [and] adopting a concept called safety by design. They are baking safety into their products, the same way you think about security and privacy today when you build your features.”
ActiveFence is not the only startup in this space, which Wired described as “trust and safety as a service.” But it is one of the largest, especially since it acquired Spectrum Labs in September, so it’s good to hear that its clients include not only big names afraid of PR crises and political scrutiny, but also smaller companies that are just getting started. Tech, too, has an opportunity to learn from past mistakes.