AI tools such as ChatGPT, Copilot & Co. have long since arrived in our everyday lives – both privately and professionally. They promise efficiency, inspiration and relief. But amid all the enthusiasm, many people forget that whatever you type in can be stored, analysed and, in the worst case, misused. 🧠💻 To help you work safely with AI, here are 6 things you’d better not share with ChatGPT:
1️⃣ 📞 Personal data: Your name, address, telephone number or identification documents have no place in the prompt. Why? You never know exactly where the data will end up – or whether it will become public in the event of a data leak.
2️⃣ 🔐 Passwords & logins: Sounds obvious, but it happens more often than you think. Never enter login credentials, customer numbers or sensitive account information – even if the AI is supposed to help you with them.
3️⃣ 💸 Financial information: Bank statements, invoices, letters from your bank or even credit card details? None of these belong in an AI tool. These tools are not online banking portals – and offer no comparable protection.
4️⃣ 🩺 Health data: Googling initial symptoms – yes. But you should not share your medical history, diagnoses or personal health concerns unfiltered. AI does not replace a doctor – and is not a protected space for medical records.
5️⃣ 🏢 Confidential company information: Many people use AI tools at work – for text optimisation, analysis or idea generation. But be careful: if you paste in meeting notes, customer information or internal strategies, you run the risk of sensitive content being stored and reused.
6️⃣ 🚫 Illegal or questionable requests: Even if it’s meant as a “joke”, requests about criminal activity can be logged – and reported. AI is not an anonymous space without consequences.
📌 Conclusion: AI can provide support – but it is no substitute for common sense and data protection. The more general and anonymous your input, the safer you’ll be.
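If you want to put that last tip into practice, one option is to strip obvious personal data out of a prompt before it ever leaves your machine. Below is a minimal Python sketch of the idea – the redact helper and its regex patterns are illustrative assumptions, not a complete PII detector, and real-world redaction needs far more robust tooling:

```python
import re

# Illustrative patterns only (assumption): a handful of regexes will
# never catch all personal data. IBAN comes first on purpose, so its
# digit run isn't swallowed by the looser phone pattern.
PII_PATTERNS = {
    "IBAN": re.compile(r"\b[A-Z]{2}\d{2}[A-Z0-9]{11,30}\b"),
    "EMAIL": re.compile(r"[\w.+-]+@[\w-]+\.[\w.-]+"),
    "PHONE": re.compile(r"\+?\d[\d\s/()-]{7,}\d"),
}

def redact(text: str) -> str:
    """Replace common personal data with placeholders before the text
    leaves your machine."""
    for label, pattern in PII_PATTERNS.items():
        text = pattern.sub(f"[{label}]", text)
    return text

prompt = (
    "Please rewrite this: contact Jane at jane.doe@example.com or "
    "+49 170 1234567 about account DE89370400440532013000."
)
print(redact(prompt))
# -> Please rewrite this: contact Jane at [EMAIL] or [PHONE]
#    about account [IBAN].
```

The AI still gets enough context to rewrite the text – it just never sees the real contact details or account number.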