The article Prompts: You should not reveal these things to ChatGPT first appeared at the online magazine Basic Thinking.
AI tools can be useful in everyday life and at work. However, the quality of the answers depends largely on the user inputs, the so-called prompts. Basically, the more information you provide ChatGPT, the better the results. Some things, however, you should rather keep to yourself.
Hardly any digital product has attracted as much attention as ChatGPT. With around 500 million users worldwide, the AI tool has by now significantly changed not only the internet but also the everyday lives of many people.
Due to rapid technological development and growing competition in the field of artificial intelligence, most large language and AI models have gone beyond pure text processing for some time now. However, the quality of the answers, images, or analyses from ChatGPT depends largely on the user inputs, the so-called prompts.
ChatGPT: You should rather keep these things to yourself
ChatGPT can generally be used free of charge, even without an account. However, many functions are either only available to registered users or tied to certain limits that can only be lifted with a paid subscription. Yet the supposedly free use only applies in monetary terms: even without a subscription, ChatGPT users pay to a certain extent, namely with their inputs and data.
Caution is required with user inputs, the so-called prompts, since it is not entirely clear where this information ultimately ends up and how it is used. Personal data could also end up freely accessible on the internet through cyberattacks on servers, which has already happened.
It is therefore advisable not to disclose contact details such as telephone numbers, addresses, or names. Photographing and uploading a passport, driver's license, or ID card should also be an absolute no-go. AI models such as ChatGPT can also be linked to other apps or services.
You should take care not to enter login credentials directly into ChatGPT or to mention them in a voice input. The same applies to account or customer numbers. The reason: there is also a risk that data could end up on the darknet and be sold there after potential hacker attacks. In addition, you should not upload invoices, bank statements, or letters, as these also contain confidential information.
Health information and company data
According to studies, many people are still skeptical when it comes to asking ChatGPT and Co. for medical advice. Yet it is tempting to ask a chatbot, much like Dr. Google, for information on diseases and symptoms. However, you should rather keep your medical history and personal health information to yourself, because they concern no one but you and your doctor.
Disclosing company data can also be problematic, especially since more and more employees use AI tools for their work. If employers and employees place too much trust in AI models such as ChatGPT, sensitive information can end up on the net or be used for AI training. Under the EU's AI Act, companies and employees therefore need certain skills, which are to be imparted as part of training.
Caution is also advised with questions about potential crimes, even if the statements are meant in jest. The reason: in extreme cases, AI operators forward such inquiries to the investigative authorities, much like threats or announcements of criminal offenses on social media.
Also interesting:
- The current ChatGPT models in comparison
- AI from Europe: Everything you need to know about the Mistral AI "Le Chat"
- ChatGPT: 5 tips for better answers
- AI companies from Germany: These are the 5 pioneers