OpenAI, the company behind ChatGPT, is considering requiring users to verify their identity with an ID to log in. The idea is meant to keep kids safe online, but it has also sparked concern and debate among regular users of the app. Some students say it could be helpful, but most feel it would only make the app more complicated and harder to access.
“Requiring ID for ChatGPT could make it safer for kids but raises big privacy and access concerns for everyone,” said sophomore Sebastian Theodore.
The plan was reported by the British newspaper The Guardian, where reporter Josh Taylor wrote, “OpenAI said it will use age-prediction tech and request ID verification if a user appears under 18.”
“If ChatGPT started making people use an ID, I feel like it would be annoying since not everyone wants to give their info just to use it, but I get that they might want to make it safer,” said sophomore Andrew Brito.
Some people think requiring ID goes too far, since not everyone wants to share personal information online. There is also the risk that the data could be leaked, which would create even bigger problems. Many students said it doesn’t make sense to put privacy at risk when most kids use ChatGPT only for school assignments, homework help, or research.
“ChatGPT requiring ID would be really bad because it would make it more difficult to access for many people which would be more of an annoyance than do any good,” said sophomore Marcelo Lopez.
Still, OpenAI says the plan is focused on protecting kids and making the app safe for everyone. Nothing has been finalized, but the proposal has already sparked debate among students. Many are now wondering whether this could change how they use ChatGPT in the future, or whether they might stop using it entirely if showing an ID becomes required.