While addressing disinformation may feel Sisyphean at times, it is important to recognise that the tactical playbooks in use are nothing new. In fact, they are often based on digital marketing practices that companies and political actors use to sell their brand. For those working to combat disinformation, this means we need to adopt a much wider definition of “platform” and focus on other channels that are used to market products, ideas and brands. This means turning our attention to the universe of e-commerce platforms.
Much as sports teams sell sweatshirts and campaigns sell bumper stickers, disinformation actors are selling merchandise to build their brand – and push their theories.
Politicians and pop culture icons are old hands at using merchandising to build their brands and sell their messages.
Just look at the award-winning American musician (and philanthropist) Chance the Rapper. He has a huge social media footprint: 9.7 million Instagram followers, 8.13 million Twitter followers, and 1.9 million Facebook fans. Chance, arguably the most successful independent rapper in modern history, has used his following to create a global brand around his “Chance 3” logo that can be monetised.
The financial gains are not limited to Chance himself, whose net worth has topped the US$30 million mark. Do-it-Yourself (DIY) e-commerce platforms like Redbubble, Etsy, Amazon, and TeePublic offer a collective space for people to make money from the concept of Chance the Rapper fandom. The bigger the celebrity, the greater the opportunity for fans to cash in on the brand through their own merchandise.
These same tactics that are being used to sell Chance’s hats have been employed by disinformation actors to sell hate.
Take the US-based QAnon conspiracy theory and the global brand it has become.
QAnon dates to 2017, when a person posting a string of messages on 4chan claimed to be a high-level political insider with “Q”-level US government security clearance. Many of the posts espoused conspiracy theories about secret plots by the “deep state” against President Trump. QAnon is considered an offshoot of the Pizzagate conspiracy, and support for it has grown exponentially through a network of social media echo chambers.
Some followers of Q have produced a range of DIY products so supporters can show their commitment to the cause. This merchandise – ranging from insulated coffee cups to baby onesies – can be bought on some of the top e-commerce platforms: eBay (13,131 QAnon products), Redbubble (2,134), Spreadshirt (1,200), Etsy (827) and Amazon Prime (872; all figures as of 2 May 2019).
The online marketplaces are making money off the “cause” as well. For example, Etsy collects US$0.20 for each item listed in its marketplace and, once the product is sold, takes a five percent transaction fee. Amazon charges its pro sellers a fee of US$39.99 per month to be on the marketplace, while individual sellers pay US$0.99 per sale.
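The fee structures above translate into simple arithmetic. As a rough, back-of-the-envelope illustration using only the figures cited in this piece (the `etsy_platform_revenue` helper is hypothetical, and real platform fees change over time):

```python
ETSY_LISTING_FEE = 0.20        # US$ charged per item listed
ETSY_TRANSACTION_RATE = 0.05   # five percent fee on each completed sale

def etsy_platform_revenue(listings: int, items_sold: int, avg_price: float) -> float:
    """Rough estimate of Etsy's take from a set of listings:
    flat listing fees plus the percentage fee on completed sales."""
    return listings * ETSY_LISTING_FEE + items_sold * avg_price * ETSY_TRANSACTION_RATE

# The 827 QAnon listings cited above generate listing-fee revenue
# even before a single sale is made.
print(f"US${etsy_platform_revenue(827, 0, 0.0):.2f}")  # prints US$165.40
```

Small per-item fees, multiplied across thousands of listings and repeated relistings, add up to a steady revenue stream for the marketplace regardless of what the merchandise promotes.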
The QAnon products are being marketed on mainstream sites as well as posted on fringe social media platforms such as 8chan, 4chan, Gab, and Voat. More recently, this content has returned to Facebook in the form of sponsored advertisements that appear in Facebook’s Ad Library as “related to politics or issues of importance” or as having “ran without a disclaimer”. The result is that the adverts not only market conspiracy theories but also circumvent the political ad monitoring system Facebook has put in place.
Newly identified pipelines for political disinformation have expanded the scale and scope of digital conspiracy theories. We must recognise that e-commerce platforms offer these communities both a means of circumventing moderation and a financial safe harbour.
Platforms, marketplaces and payment systems can all do more. A first but critical step is for them to update their community standards and content moderation policies to deal with products being used as a means to promote – and sell – objectionable content.
There is an inherent need for robust collaboration to deplatform these actors and ensure coherent content moderation across our digital ecosystem: from social media platforms to e-commerce providers. Disinformation actors need to see that there is no safe space for malicious, inaccurate content to thrive and be monetised online.
In the words of Chance:
“Sometimes the truth don’t rhyme. Sometimes the lies get millions of views….”