AI is being forced into almost every aspect of life, from our phones and apps to search engines and even driving routes, for some reason. The fact that we are now getting web browsers with baked-in AI assistants and chatbots shows that the way some people search for and consume information online today is very different from even a few years ago.
But AI tools are increasingly asking for gross levels of access to your personal data under the guise of needing it to work. This kind of access is not normal, nor should it be normalized.
Not so long ago, you would have been right to question why a seemingly harmless free "flashlight" or "calculator" app in the app store would ask for access to your contacts, photos, and even real-time location data. These apps may not need that data to function, but they will ask for it if they think they can make a buck or two by monetizing your data.
These days, AI is not so different.
Take Perplexity's latest AI-powered web browser, Comet, as an example. Comet lets users find answers with its built-in AI search engine and automate routine tasks, such as summarizing emails and calendar events.
In a recent hands-on with the browser, TechCrunch found that when Comet requests access to a user's Google Calendar, it asks for a broad swath of permissions to the user's Google Account, including the ability to manage your calendar events and emails, access your contacts, and even take a copy of your entire company directory.
Perplexity says that much of this data is stored locally on your device, but you are still granting the company rights to access and use your personal information, including to improve its AI models for everyone else.
Perplexity isn't alone in asking for access to your data. There is a trend of AI apps that promise to save you time by transcribing your calls or meetings, for example, but that require an AI assistant to access your private conversations in real time, along with your calendars, contacts, and more. Meta, too, has been testing the limits of what its AI apps can ask for access to, including the photos stored in a user's camera roll that haven't yet been uploaded.
Signal president Meredith Whittaker recently likened the use of AI agents and assistants to "putting your brain in a jar." Whittaker explained how some AI products promise to handle all kinds of mundane tasks, such as reserving a table at a restaurant or booking a ticket to a concert. But to do this, the AI will say it needs your permission to open your browser and load the website (which can give the AI access to your stored passwords, bookmarks, and browsing history), your credit card to make the reservation, your calendar to mark the date, and perhaps your contacts so you can share the booking with a friend.
There are serious security and privacy risks associated with using AI assistants that rely on your data. By allowing access, you instantly and irreversibly hand over the rights to an entire snapshot of your most personal information as of that moment in time: your inbox, your messages, calendar entries dating back years, and more. All this to carry out a task that ostensibly saves you time, or, to Whittaker's point, saves you from having to actively think about it.
You are also granting the AI agent permission to act autonomously on your behalf, requiring you to place enormous trust in a technology that is already prone to getting things wrong or simply making things up. Using AI further requires you to trust the profit-seeking companies developing these products, which rely on your data to try to make their AI models perform better. And when things go wrong (and they do, a lot), it is common practice for humans at AI companies to look over your private prompts to figure out why things didn't work.
From a security and privacy standpoint, a simple cost-benefit analysis shows that connecting AI to your most personal data just isn't worth giving up access to your most private information. Any AI app asking for these levels of permissions should set off your alarm bells, just like the flashlight app that wants to know your location at all times.
Given how much data you are handing over to AI companies, ask yourself whether what you get out of it is really worth it.
