You shouldn’t trust any answers a chatbot sends you. And you probably shouldn’t trust it with your personal information either. That’s especially true for “AI girlfriends” or “AI boyfriends,” according to new research.

An analysis of 11 so-called romance and companion chatbots, published on Wednesday by the Mozilla Foundation, has found a litany of security and privacy concerns with the bots. Collectively, the apps, which have been downloaded more than 100 million times on Android devices, gather huge amounts of people’s data; use trackers that send information to Google, Facebook, and companies in Russia and China; allow users to set weak passwords; and lack transparency about their ownership and the AI models that power them.

Since OpenAI unleashed ChatGPT on the world in November 2022, developers have raced to deploy large language models and create chatbots that people can interact with and pay to subscribe to. The Mozilla research provides a glimpse into how this gold rush may have neglected people’s privacy, and into tensions between emerging technologies and how they gather and use data. It also indicates how people’s chat messages could be abused by hackers.

It is unclear who owns or runs some of the companies behind the chatbots. The website for one app, called Mimico-Your AI Friends, includes only the word “Hi.” Others do not list their owners or where they are located, or include only generic help or support contact email addresses.