Facebook Endows AI Ethics Institute At German University TUM

The new center, which Facebook is funding with an initial grant of $7.5 million over five years, will investigate issues around AI safety, fairness, privacy and transparency, Joaquín Quiñonero Candela, Facebook’s director of applied machine learning, said in a blog post.

Christoph Lütge, a TUM professor specializing in business ethics, will lead the new Institute for Ethics in Artificial Intelligence, Candela said.

Facebook has found itself increasingly embattled in the past two years, facing criticism for failing to stop the spread of fake news, terrorist propaganda and hate speech, as well as for abusing users’ privacy. Chief Executive Officer Mark Zuckerberg has told the U.S. Congress that the company will increasingly lean on AI to police content on the social network. The company has a large AI research lab that employs some of the world’s top experts in machine learning. But the use of AI can itself raise serious ethical concerns.

“At Facebook, ensuring responsible and thoughtful use of AI is foundational to everything we do — from the data labels we use to the individual algorithms we build, to the systems they are a part of,” Candela said.

Candela said Facebook is also working on new tools, including one called Fairness Flow, that can help engineers working on machine learning systems evaluate them for hidden bias.

He said Facebook chose TUM because it is “one of the top-ranked universities worldwide in the field of artificial intelligence.”

In the past, Facebook has faced particular criticism from German lawmakers for failing to police hate speech and misinformation, and the company has responded, in part, by helping to fund German non-profits that work to combat right-wing propaganda.

While Facebook is providing the initial financing for the Institute, it will seek other funding partners too, according to the blog post. ―Bloomberg
