
Big Tech's “black box” algorithms are set to come under regulatory oversight under the EU plan

Major internet platforms will have to open up their algorithms to regulatory oversight under proposals that European lawmakers are set to introduce next month.

In a speech today, Commission EVP Margrethe Vestager suggested that algorithmic accountability will be a key element of the forthcoming digital legislative package – with detailed draft rules set to require platforms to explain how their recommendation systems work and to give users more control over them.

“The rules we are preparing would give all digital services an obligation to cooperate with regulators. And the biggest platforms would have to provide more information about how their algorithms work when regulators ask for it,” she said, adding that platforms would also have to “give regulators and researchers access to the data they hold – including ad archives.”

While social media platforms such as Facebook have set up ad archives ahead of any legal requirement to do so, there are persistent complaints from third-party researchers about how the information is structured and how (in)accessible it is to independent study.

More information for users about how ads are targeted at them is another planned requirement, along with beefed-up reporting requirements for platforms to explain content moderation decisions, said Vestager, who also previewed what is coming in the Digital Services Act and the Digital Markets Act in another speech earlier this week.

Regional lawmakers are responding to concerns that black box algorithms can have damaging effects on individuals and societies through the way they process data and order and rank information – with risks such as discrimination, the amplification of bias, and the abuse of vulnerable people and groups.

The Commission has said it is working on binding transparency rules intended to force tech giants to take more responsibility for the content their platforms amplify and monetize. The devil will be in both the detail of the requirements and how effectively they are enforced, but a draft of the plan is due next month.

“One of the main goals of the Digital Services Act, which we will table in December, is to protect our democracy by making sure that platforms are transparent about the way these algorithms work – and that these platforms are more accountable for the decisions they make,” Vestager said today, speaking at an event organized by the non-profit research and advocacy group AlgorithmWatch.

“The proposals we're working on would mean that platforms have to tell users how their recommendation systems decide which content to show – so it's easier for us to judge whether or not to trust the picture of the world they are giving us.”

Under the planned rules, the most powerful internet platforms – so-called “gatekeepers” in EU parlance – will have to submit regular reports on “the content moderation tools they use, and the accuracy and results of those tools”, as Vestager put it.

There will also be specific disclosure requirements for ad targeting that go beyond the current fuzzy disclosures platforms like Facebook may already offer (in that case, via the “Why am I seeing this ad?” menu).

"Better information" needs to be provided, like platforms that communicate to users "Who ran a particular ad and why was it targeted at us?" The overall goal will be to ensure that users of such platforms “a Better idea of ​​who is trying to influence us – and a better chance of spotting when algorithms are discriminating against us, ”she added.

Today, a coalition of 46 civil society organizations, led by AlgorithmWatch, called on the Commission to ensure that the transparency requirements in the upcoming legislation are meaningful, urging it to put in place “comprehensive data access frameworks” that give watchdogs the tools they need to hold platforms accountable – and that empower journalists, academics and civil society to “challenge and question power”.

The group's recommendations call for binding disclosure obligations tied to the technical functioning of dominant platforms; a single EU institution “with a clear legal mandate to enable access to data and enforce transparency obligations”; and rules to ensure that data collection complies with EU data protection law.

Another way to rebalance the asymmetry of power between data-mining platform giants and the people they track, profile and target would be to let users switch off algorithmic feeds entirely if they so choose – removing the possibility of data-driven discrimination or manipulation. Whether EU lawmakers will go that far in the forthcoming proposals remains to be seen, however.

The only pointer Vestager offered on that front was to say the proposed rules “will also give users more power – so algorithms don't have the final say about what we see and what we don't see”.

Platforms must also "give users the ability to influence the decisions that recommendation systems make on our behalf," she said.

In further remarks, she confirmed there will be more detailed reporting requirements for digital services around content moderation and takedowns. Platforms will have to notify users when their content is removed and “give them effective rights to contest that removal”.

While there is broad public support across the bloc for rebooting the rules of play for digital giants, there is also a strong view that regulation should not interfere with online freedom of expression – for example, by encouraging platforms to shrink their regulatory risk by applying upload filters or taking down controversial content without good reason.

The proposals must be supported by the EU member states through the European Council and by elected representatives in the European Parliament.

The latter has already voted in favor of tighter rules on ad targeting. MEPs have also urged the Commission to reject the use of upload filters or any form of ex ante content control for harmful or illegal content, saying the final decision on whether content is legal should rest with an independent judiciary.

At the same time, the Commission is working on rules specifically for applications that use artificial intelligence – but that legislative package is not due until next year.

Vestager confirmed this will be introduced in early 2021, with the aim of creating an “ecosystem of trust” around AI.
