OpenAI restricts medical advice in ChatGPT: what is now prohibited

ChatGPT is no longer an advisor in medicine and law
Restrictions on medical advice in ChatGPT: what new rules are in effect?

OpenAI has updated its Terms of Use to make ChatGPT and its other artificial intelligence tools safer and more responsible to operate. Under the new provisions, ChatGPT will no longer provide personalized medical or legal advice, even at a user's request. Instead, the system will offer only general reference information and will direct users to the appropriate professionals – doctors, lawyers, or consultants.

OpenAI has also banned the use of its models for analyzing medical images to avoid misdiagnoses and reduce health risks for users. The company aims to balance technology development with the protection of the rights and safety of the people who use its products.

According to an analysis by Apptopia, growth of the ChatGPT mobile application may have peaked, as indicated by global download trends and user activity.

“We made ChatGPT quite restrictive to make sure we were being careful with mental health issues. Now that we have been able to mitigate those issues and have new tools at our disposal, we can safely relax the restrictions in most cases,” said OpenAI CEO Sam Altman.

With the update to ChatGPT's Terms of Use, OpenAI emphasizes the importance of safety and responsibility in the use of artificial intelligence. The changes aim to prevent potential misdiagnoses and to provide users with accurate, reliable information. In the company's view, the development of technology must go hand in hand with the protection of human rights.
