
The EU wants to know more about the content ‘rabbit holes’ and the way TikTok, YouTube, and Snapchat operate

The European Commission has asked YouTube, Snapchat, and TikTok to provide details on how their algorithms recommend content to users, and on how these systems may amplify certain systemic risks, including those related to elections, mental health, and the protection of minors.

The requests were made under the Digital Services Act (DSA), an EU regulation adopted in 2022 that tackles illegal content, requires transparency in advertising, and targets disinformation.

Under the DSA, platforms must assess and effectively mitigate the risks associated with their recommender systems, including harms to users' mental health and the spread of harmful content driven by engagement-based designs.

YouTube and Snapchat have been asked to provide specific information regarding the factors their algorithms use to suggest content to users, as well as their role in intensifying particular systemic risks. These include risks tied to elections, civic discourse, mental well-being (e.g., addictive behaviors and falling into content ‘rabbit holes’), and the safety of minors.

The inquiry also extends to the platforms’ actions to limit the role of these systems in promoting illegal content, such as drug-related material and hate speech.

TikTok has been asked to clarify the steps it has taken to prevent bad actors from manipulating the platform and to mitigate risks related to elections, media pluralism, and public discourse, which can be amplified by its recommendation systems.

YouTube, Snapchat, and TikTok must submit the requested information by November 15. After reviewing their responses, the Commission will decide on next steps, which could include the formal opening of proceedings under the DSA.


According to the DSA, the Commission can impose fines for providing incorrect, incomplete, or misleading information in response to requests for information. Failure to respond may lead to a formal decision from the Commission, and missing the deadline could result in recurring penalty payments.

YouTube, for example, is designated as a Very Large Online Platform (VLOP).

Under the DSA, platforms or search engines with more than 45 million monthly users in the EU are classified as very large online platforms (VLOPs) or very large online search engines (VLOSEs). Once designated, they must comply with stricter rules, including establishing points of contact for authorities, reporting criminal activity, and ensuring transparency in advertising and content moderation. They must also identify and assess risks related to illegal content, fundamental rights, public security, and well-being, particularly in areas such as election integrity, media freedom, and the protection of minors.

VLOPs and VLOSEs are required to implement measures to address these risks, such as adjusting their services or enhancing their internal systems. Additionally, they must set up internal compliance functions, undergo independent audits annually, and share data with authorities to ensure compliance with the DSA. These platforms must also provide data access to approved researchers studying systemic risks and offer users an alternative recommendation system that does not rely on profiling. They are also obliged to maintain a public repository of advertisements.


