Facebook Sees Copyright Abuse as One of the Platform’s Main Challenges
When it comes to targeting infringement, Facebook has rolled out several anti-piracy initiatives in recent years.
In addition to processing regular takedown requests, the company has a “Rights Manager” tool that detects infringing material automatically and allows owners to take down or monetize the content.
In a recent meeting organized by the European Commission, Facebook explained in detail how this automated system works. The meeting was organized to create a dialogue between various parties about possible solutions for the implementation of Article 17.
In Facebook’s presentation Dave Axelgard, Product Manager for Rights Manager, explained how automated matching of copyrighted content takes place on the social media network. He also detailed what actions rightsholders can take in response, and how users can push back against misuse and abuse of the system.
The EU meeting was attended by a wide range of parties. In addition to copyright holders, it also included various people representing digital rights organizations. Facebook made it clear that it keeps the interests of all sides in mind. It specifically highlighted, however, that abuse of Rights Manager is a serious concern.
“We spend much of our time building systems to avoid blocking legitimate content,” Axelgard mentioned during his presentation.
“The way that inappropriate blocks occur is when rightsholders gain access to Rights Manager despite our application process, who attempt to upload content to the tool that they do not own.”
Another form of overblocking happens by mistake, for example when a copyright holder uploads a compilation video as a reference file that also includes content that isn’t theirs.
Facebook works hard to catch and prevent these types of misuse and abuse, to ensure that its automated detection system doesn’t remove legitimate content. This is also something to keep in mind for the implementation of possible ‘upload filters’ with the introduction of Article 17.
“Misuse is a significant issue and after operating Rights Manager for a number of years, we can tell you it is one of the most sensitive things that need to be accounted for in a proportionate system,” Axelgard says.
Facebook tries to limit abuse through a variety of measures. The company limits access to its Rights Manager tool to a select group of verified copyright holders. In addition, it always requires playable reference files, so all claims can be properly vetted.
The social media network also limits the availability of certain automated actions, such as removal or blocking, to a subset of Rights Manager users. This is in part because some smaller rightsholders may not fully understand copyright, which can lead to errors.
Finally, Facebook points out that misuse of its Rights Manager tool constitutes a breach of its Terms of Service. This allows the company to terminate access for rightsholders who repeatedly make mistakes.
“If we find that Rights Manager is being misused, then under our Rights Manager terms we have the ability to terminate someone’s access to the tool. We really do want to stress how important it is that platforms have the ability to adjust access and functionality related to these powerful technologies to avoid misuse,” Axelgard notes.
The strong focus on misuse was welcomed by digital rights groups, including Communia. However, it also raised some eyebrows among rightsholders.
Mathieu Moreuil of the English Premier League, who represented the Sports Rights Owner Coalition, asked Facebook whether the abuse of Rights Manager really is the company’s main challenge.
“I think it’s definitely one of our main challenges,” Axelgard confirmed, while noting that Facebook also keeps the interests of rightsholders in mind.
Overall, Facebook carefully explained the pros and cons of its system. Whether it is an ideal tool to implement Article 17 in EU countries is another question. In its current form it isn’t, as the tool isn’t open to all copyright holders.
Also, Rights Manager works with audio and video, but not with still images, which is another major restriction.
There are pitfalls from a consumer perspective as well. Automated systems may be very good at detecting copyrighted content, but Facebook confirmed that they currently can’t make a determination in respect of copyright exceptions such as parody and fair use.
“Our matching system is not able to take context into account. It’s just seeking to identify whether or not two pieces of content matched to one another,” Axelgard said, responding to a question from Communia’s Paul Keller.
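To illustrate the point, here is a minimal, hypothetical sketch (not Facebook’s actual system) of what a pure matching step looks like: it only measures how similar two fingerprints are, so context like parody or quotation never enters the decision.

```python
# Hypothetical illustration of context-free content matching.
# None of these names come from Rights Manager; they are assumptions for the sketch.
from dataclasses import dataclass
from typing import List


@dataclass
class Fingerprint:
    """Hypothetical per-second hashes derived from an audio or video track."""
    hashes: List[int]


def similarity(reference: Fingerprint, upload: Fingerprint) -> float:
    """Fraction of positions where the two hash sequences agree."""
    length = min(len(reference.hashes), len(upload.hashes))
    if length == 0:
        return 0.0
    matches = sum(
        1 for a, b in zip(reference.hashes[:length], upload.hashes[:length]) if a == b
    )
    return matches / length


def is_match(reference: Fingerprint, upload: Fingerprint, threshold: float = 0.8) -> bool:
    # The only question answered here is "do these two signals match?"
    # Whether the upload is a parody or a licensed excerpt is invisible to this check.
    return similarity(reference, upload) >= threshold


if __name__ == "__main__":
    ref = Fingerprint(hashes=[101, 102, 103, 104, 105])
    parody = Fingerprint(hashes=[101, 102, 103, 104, 999])  # same footage, different ending
    print(is_match(ref, parody))  # True: flagged as a match regardless of intent
```

Any judgment about exceptions would have to happen outside a comparison like this, which is exactly the gap the speakers describe below.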
This shortcoming of automated filters was also confirmed by Audible Magic, the popular music matching service that’s used by dozens of large companies to detect copyright infringements.
“Copyright exceptions require a high degree of intellectual judgment and an understanding and appreciation of context. We do not represent that any technology can solve this problem in an automated fashion. Ultimately these types of determinations must be handled by human judgment,” Audible Magic CEO Vance Ikezoye said.
As noted by Communia, the most recent stakeholder meeting once again showed that automated content recognition systems are extremely powerful and very limited at the same time.
If any of these technologies become the basis for implementing Europe’s Article 17 requirements, these shortcomings should be kept in mind. Or, as Facebook put it, a lot of time and effort should go into preventing legitimate content from being blocked.
—
A video of the full stakeholder meeting is available on the European Commission’s website. A copy of Facebook’s slides is available here (pdf).