Here’s how it works: A company makes an ad, or creates a shop, and submits it to Facebook for approval, an automated process. (If it’s a storefront, the products can also arrive via a feed, and each one must comply with Facebook’s rules.) If the system flags a potential violation, the ad or product is sent back to the company as noncompliant. But the precise word or part of the image that caused the problem is not identified, leaving the company to guess where the problem lies.

The company can then either appeal the ad or listing as is, or make a change to the image or wording that it hopes will satisfy Facebook’s rules. Either way, the submission goes back through the automated system, where it may be reviewed by another algorithm or by an actual person.

According to Facebook, it has added thousands of reviewers over the last few years, but three million businesses advertise on the platform, the majority of them small businesses. The Facebook spokeswoman did not say what triggers the escalation of an appeal to a human reviewer, or whether there is a codified process by which that happens. Often, small business owners feel caught in an endless machine-ruled loop.

“The problem we keep coming up against is channels of communication,” said Sinéad Burke, an inclusivity activist who consults with numerous brands and platforms, including Juniper. “Access needs to mean more than just digital access. And we have to understand who is in the room when these systems are created.”

The Facebook spokeswoman said there were employees with disabilities throughout the company, including at the executive level, and that an Accessibility team worked across Facebook to embed accessibility into the product development process. There is no question that the rules governing Facebook’s ad and store policies were designed in part to protect its communities from false medical claims and fake products. But those same rules are also, if inadvertently, blocking some of those very communities from accessing products created for them.

“This is one of the most typical problems we see,” said Tobias Matzner, a professor of media, algorithms and society at Paderborn University in Germany. “Algorithms solve the problem of efficiency at grand scale” — by detecting patterns and making assumptions — “but in doing that one thing, they do all sorts of other things, too, like hurting small businesses.”
