Indie Game Platform Itch.io Says Its Domain Was Nuked Due to ‘Trash AI-Powered’ Phishing Report


Itch.io says its website was temporarily taken offline early Monday morning following what it described as a “bogus phishing report” filed on behalf of Funko, a maker of pop culture collectibles. The Verge earlier reported on the situation.

A storefront for indie games, Itch.io is popular with developers because they can sell their work—such as games and zines—without paying hefty commissions to the platform. Itch.io lets developers dictate how much revenue the site receives, even allowing them to give it a 0% cut if they choose. Other platforms, like Steam, typically take a 30% cut.

In a tweet, Itch.io explained that its domain was disabled as a result of Funko using “some trash ‘AI-powered’ brand protection software” from a company called BrandShield. An Itch.io employee later clarified that Funko itself was likely not the bad actor here, instead blaming BrandShield’s automated technology for flagging unauthorized use of the Funko trademark. Rather than going through the standard DMCA process, which would have let Itch.io review and remove the offending content without being taken offline, BrandShield took the aggressive tactic of sending “fraud and phishing” reports to both Itch.io’s domain registrar and its hosting provider.

The site came back online after Itch.io contacted its domain registrar. It appears the website itself was never taken down; rather, the registrar simply disabled the domain.

“BrandShield serves as a trusted partner to many brands,” the company posted in a statement on X. “Our AI-driven platform detects potential threats and provides analysis, and in this case, an abuse was identified from an @itchio subdomain.” It added, “The temporary takedown of the website was a decision made by the service providers, not BrandShield.”

Abuse of automated moderation systems has run rampant online in recent years, with malicious actors filing fake copyright claims on sites like YouTube in order to attack users over petty grievances or content they simply don’t like. YouTube, in fact, has taken these malicious actors to court in recent years—in one case, it found an individual in the Minecraft community was filing fraudulent DMCA complaints against other creators in order to extort them.

Companies like YouTube field so many requests that they have come to rely on automated systems to manage them all. Technology like YouTube’s Content ID makes it possible to fingerprint and automatically identify content uploaded without permission, and even gives copyright holders the ability to monetize other users’ videos containing their material instead of having them taken down. But these automated systems are not perfect, and platforms like YouTube will often take down reported content without reviewing it in the interest of preserving their safe-harbor protections under the DMCA: platforms are not liable for damages if they remove reported content within a reasonable timeframe.
