Twitter says an 'issue' is messing with Likes, retweets, and notifications

On Tuesday, Twitter revealed via its Support account that the platform is currently experiencing an "issue" with multiple aspects of its service, including Likes. The confirmation follows complaints from users who noticed unusual variations in the Likes on their tweets, changes that prompted speculation that Twitter was suspending large numbers of accounts or removing Likes for nefarious reasons.

The primary issue for some users revolves around the counter on tweets that indicates how many users have "liked" the content. A number of Twitter users posted tweets highlighting unexpectedly large drops in Likes, including conservative political commentator Ann Coulter, who suggested that Twitter was removing Likes from conservatives' tweets. Others, including Eric Weinstein, have expressed similar sentiments.

In response, the Twitter Support account published a tweet on Tuesday stating that an issue affecting Likes, retweets, and notifications is impacting users globally. The company didn't elaborate on the problem, nor did it say how long the issue has been occurring. Based on user complaints, the problem appears to have been around for only a day.

Despite the company's confirmation, conspiracy theories and speculation about deliberate censorship remain live on the site. Some replies to the Twitter Support tweet accuse the company of deliberately suppressing Likes on posts from certain political parties or groups.

Conspiracy theories remain a hot topic on social media platforms, where individuals and groups alike share posts, articles, and videos promoting a variety of fringe beliefs and paranoia.

YouTube has frequently faced criticism over the conspiracy theory videos its recommendation system surfaces to users. In response, the company said in January that it would start reducing the number of conspiracy theory videos it recommends. According to YouTube, the change would affect videos that may "misinform" users or that feature "borderline content."

That decision has itself proven controversial, drawing criticism from both sides of the debate. On one hand, some users have expressed concerns that the change could suppress what they consider valid information, though the affected videos often remain live on the platform.

Others have asked YouTube how it will handle videos about conspiracy theories that don't necessarily promote false information. One example is the difference between a video promoting flat Earth conspiracy theories and one that explores a popular conspiracy theory from an entertainment standpoint.

YouTube and Twitter aren't the only social platforms dealing with conspiracy theory content. Facebook has been heavily criticized in recent years for misinformation spread on its platform, covering everything from popular truther conspiracies to anti-vaxxer content and fake news about politics.

In a report on Tuesday, The Guardian pointed to closed anti-vaxxer groups that verify users before allowing them to participate, an exclusivity that turns the groups into echo chambers. According to the report, some of these groups are "large and sophisticated," including one called Stop Mandatory Vaccination that has more than 150,000 verified users. Officials and experts have called on Facebook to address such groups.