Google says it plans additional privacy measures to protect teenage users on YouTube and its search engine, becoming the latest tech giant to adopt stricter standards in the face of criticism that companies are not doing enough to protect children.
In a blog post, Google announced that videos uploaded to YouTube by users ages 13 to 17 would be private by default, allowing the content to be viewed only by the uploader and the people they designate.
Google will also begin allowing anyone under the age of 18, or a parent or guardian, to request the removal of that person’s images from Google’s image search results, the company said. It is unclear whether this process will be easy and responsive, given Google’s historical reluctance to remove items from search results.
Additionally, Google said it would disable location history for all users under the age of 18 and remove the option to enable it again.
The company said it plans to implement the changes in the “next few weeks.”
Globally, there is mounting political and regulatory pressure for tech companies to do more to protect children. In the United States, two recent bills would expand the Children’s Online Privacy Protection Act, known as COPPA, to restrict the tracking and targeting of teens.
Google has repeatedly faced scrutiny over its handling of data related to children. In 2019, it agreed to pay a $170 million fine for violating COPPA by collecting data from children under the age of 13 without parental consent.