ChatGPT maker OpenAI teams up with children’s safety non-profit to create AI guidelines

OpenAI has come under scrutiny over data handling and the fear that AI could harm society.


OpenAI is partnering with a children’s safety organisation to create artificial intelligence (AI) guidelines and education materials for teens and families.

The move comes as the company behind ChatGPT faces scrutiny over data handling and the fear that AI could harm society.

The collaboration with the nonprofit Common Sense Media, which assesses whether media and technology are appropriate for children, was announced at an event in San Francisco on Monday.

“We want to figure out how to make this tool safely and responsibly and broadly available to teens and people who are going to use it as part of their educational experience,” OpenAI CEO Sam Altman told the audience.

ChatGPT has faced backlash for not protecting minors, as there is no way to verify a user’s age. It is also under scrutiny because its tools can hallucinate, confidently making things up, and can produce biased output due to the data used to train them.

However, Altman rejected the notion that AI is bad for children and that AI tools should be kept out of schools.

“Humans are tool users and we better teach people to use the tools that are going to be out in the world,” he said. “To not teach people to use those would be a mistake.”

Common Sense Media has been trying to develop an AI rating system for parents, children and educators.

Some academics have voiced concerns about how AI chatbots could be used to write essays. But children and minors are also using ChatGPT to help with personal issues.

According to a survey for the non-profit Center for Democracy and Technology, 29 per cent of children and teenagers in the US have used ChatGPT to help with anxiety or mental health issues, 22 per cent for problems with friends, and 16 per cent for family conflicts.

2024-01-30 11:57:49
