Unsettling Reasons To Avoid Sharing Personal Information With ChatGPT

Artificial intelligence, or AI, has become a big topic these days, particularly for companies like OpenAI (the developer of ChatGPT) and Google, among many others. IBM defines AI as "technology that enables computers and machines to simulate human intelligence and problem-solving capabilities." While platforms that incorporate AI tout their value as tools for enhancing productivity and efficiency, there are potential downsides as well.

One key issue that's come with the rise of AI is privacy. ChatGPT, for instance, which can hold conversations, help you complete tasks, and answer a variety of queries, had to get its knowledge from somewhere. Where, you might ask? From all over the internet, often without asking for permission. Not only is nearly anything you've ever written on the internet up for grabs, but the questions and prompts you're entering into ChatGPT likely aren't private either. And the privacy concerns aren't limited to ChatGPT: Google's new AI is poised to listen to your phone calls, and many other companies have said they intend to use your activity on their platforms to train their AIs further.

AI's learning method is ethically questionable

One of the misconceptions about technology like ChatGPT, due in part to media and marketing hype, is that it's actually smart. The technology can accomplish impressive feats, such as helping you plan a vacation or write a limerick, along with the many other ways Google's Gemini AI can be useful every day, but "AI is not capable of independent thought or reasoning." ChatGPT learned by essentially dragging a giant net across the internet and scooping up every bit of information it could. That net wasn't focused, though, and it picked up everything in its path, whether personal details or copyrighted material.

Since AI learns by mimicking, anything you type into ChatGPT should be considered public-facing, as the chatbot doesn't distinguish between personal details and general knowledge. Personally identifiable information such as your Social Security number, financial information, and passwords should never be shared with ChatGPT. The chatbot will use personal details for learning, so anything you've posted online or discussed with it is fair game. Worse, it might unwittingly share that information with other people using the service, as The New York Times showed with its reporter's personal details last year.

Data access and leaks

The other uncomfortable truth surrounding ChatGPT is who has access to its data. OpenAI's developers may have access to the information their AI has gathered, which is another reason your data isn't private, whether you shared it directly with ChatGPT or posted it elsewhere on the internet.

Developers eyeing your messages aren't the only concern, as the platform has also been a target for hackers. Private conversations with the chatbot, sensitive information, usernames, passwords, and payment history have reportedly leaked in the past, exposing that data to other ChatGPT users. In the wrong hands, this information can be used to cause havoc. Here's what hackers are really doing with your info, and it involves more than just financial theft.

It's not just individuals who have become concerned about privacy, but major corporations as well. Samsung recently prohibited its employees from using ChatGPT after proprietary code written by the company's engineers found its way onto the AI platform.