On the Digital Services Act
“The DSA has the potential to become a global gold standard and to encourage other countries to adopt new laws that protect our democracy,” Haugen emphasized at today’s hearing. However, she cautioned that the rules must be strong in terms of transparency, oversight, and enforcement, otherwise “we will lose this once-in-a-generation opportunity to align the future of technology and democracy.”
Protecting user rights and increasing accountability
Ms. Haugen’s testimony on how Facebook affects users and their fundamental rights was a source of concern for MEPs. They raised the exploitation of children and adolescents, as well as micro-targeting, including for political purposes. MEPs asked how platforms can be held accountable and whether the Digital Services Act (DSA) is strong enough to address abuses and counter threats to democracy.
Ms. Haugen commented on the need to regulate not only illegal but also harmful content, on content-moderation tools, and on targeted advertising. MEPs wanted to know what safeguards she would like to see in the EU’s digital legislation and whether the package on the table is sufficient. Enforcement tools, transparency of algorithms, and access to platform data for academic researchers, NGOs, and investigative journalists were among the issues raised at the hearing.
Disclosing information and making algorithms safer
In her responses, she emphasized the need for companies such as Facebook to publicly disclose what information they collect and how they use it (on content, advertising, and engagement metrics, for example), so that people can make informed decisions, and to prevent online “dark patterns”. She added that individuals in these companies, not committees, should be held accountable for their decisions.
She noted that Facebook is less transparent than other platforms in countering disinformation and minimizing harmful content, and could do more to make its algorithms safer: limiting how often content can be re-shared, providing its services in more languages, and relying on human moderation and direct interaction between users rather than ranking driven by artificial intelligence. She praised lawmakers for the DSA’s content-agnostic approach, but warned of potential loopholes, such as exemptions for media organizations and trade secrets.
In her speech, Ms. Haugen stressed the importance of protecting technology whistleblowers, as their testimony will be key to protecting people from future harm caused by digital technologies.
A video recording of the hearing is available here.
The hearing was organized by the European Parliament’s Committee on the Internal Market and Consumer Protection in collaboration with other committees, including the Industry, Legal Affairs, and Civil Liberties Committees, and the special committees on disinformation and on artificial intelligence.
Regulation of platforms underway in Parliament
The Committee on the Internal Market and Consumer Protection is discussing how to amend and improve the Digital Services Act proposed by the European Commission in December 2020. Ms. Haugen’s testimony will feed into the committee’s work on the DSA ahead of the vote (date to be confirmed). This law is an opportunity for Europe to shape the digital economy at EU level and to become a standard-setter for digital regulation internationally.
Frances Haugen is a former Facebook employee specializing in computer engineering and algorithmic product management. At Facebook, Ms. Haugen worked as lead product manager for the Civic Misinformation team, which tracked election interference around the world and worked on issues related to democracy and misinformation. Facebook dissolved the team after the 2020 US election, and Ms. Haugen spoke to the Wall Street Journal shortly afterwards, disclosing thousands of internal documents she had collected while working at Facebook. One of the most striking findings supported by the leaked documents is how the use of Instagram can seriously harm the mental health of adolescents, in particular by contributing to eating disorders and body-image problems. More broadly, the leaked documents show how Facebook’s public claims on various topics, beyond mental health, including its handling of hate speech and free speech, often conflicted with its own internal research. In essence, Ms. Haugen argues that Facebook (which also owns other widely used social media platforms such as Instagram) deliberately refrains from making these platforms safer for users because doing so would affect its profits.