Those Cute Cats Online? They Help Spread Misinformation.

On Oct. 2, New Tang Dynasty Television, a station linked to the Chinese spiritual movement Falun Gong, posted a Facebook video of a woman saving a baby shark stranded on a shore. Next to the video was a link to subscribe to The Epoch Times, a newspaper that is tied to Falun Gong and that spreads anti-China and right-wing conspiracies. The post collected 33,000 likes, comments and shares.

The website of Dr. Joseph Mercola, an osteopathic physician who researchers say is a chief spreader of coronavirus misinformation online, regularly posts about cute animals that generate tens or even hundreds of thousands of interactions on Facebook. The stories include “Kitten and Chick Nap So Sweetly Together” and “Why Orange Cats May Be Different From Other Cats,” written by Dr. Karen Becker, a veterinarian.

And Western Journal, a right-wing publication that has published unproven claims about the benefits of using hydroxychloroquine to treat Covid-19, and spread falsehoods about fraud in the 2020 presidential election, owns Liftable Animals, a popular Facebook page. Liftable Animals posts stories from Western Journal’s main website alongside stories about golden retrievers and giraffes.

Videos and GIFs of cute animals — usually cats — have gone viral online for almost as long as the internet has been around. Many of the animals became famous: There’s Keyboard Cat, Grumpy Cat, Lil Bub and Nyan Cat, just to name a few.

Now, it is becoming increasingly clear how widely the old-school internet trick is being used by people and organizations peddling false information online, misinformation researchers say.

The posts with the animals do not directly spread false information. But they can draw a huge audience that can be redirected to a publication or site spreading false information about election fraud, unproven coronavirus cures and other baseless conspiracy theories entirely unrelated to the videos. Sometimes, following a feed of cute animals on Facebook unwittingly subscribes users to misleading posts from the same publisher.

Dr. Karen Becker, a veterinarian who shares pet advice on Dr. Joseph Mercola’s website, wrote the article “Why Orange Cats May Be Different From Other Cats.”

Melissa Ryan, chief executive of Card Strategies, a consulting firm that researches disinformation, said this kind of “engagement bait” helped misinformation actors generate clicks on their pages, which can make them more prominent in users’ feeds in the future. That prominence can drive a broader audience to content with inaccurate or misleading information, she said.

“The strategy works because the platforms continue to reward engagement over everything else,” Ms. Ryan said, “even when that engagement comes from” publications that also publish false or misleading content.

Perhaps no organization deploys the tactic as forcefully as Epoch Media, parent company of The Epoch Times. Epoch Media has published videos of cute animals in 12,062 posts on its 103 Facebook pages in the past year, according to an analysis by The New York Times. Those posts, which include links to other Epoch Media websites, racked up nearly four billion views. Trending World, one of Epoch’s Facebook pages, was the 15th most popular page on the platform in the United States between July and September.

One video, posted last month by The Epoch Times’s Taiwan page, shows a close-up of a golden retriever while a woman tries in vain to pry an apple from its mouth. It has over 20,000 likes, shares and comments on Facebook. Another post, on Trending World’s Facebook page, features a grinning seal posing for a picture with a family at a Sea World resort. The video has 12 million views.

Epoch Media did not respond to a request for comment.

“Dr. Becker is a veterinarian, her articles are about pets,” said an email from Dr. Mercola’s public relations team. “We reject any New York Times accusations of misleading any visitors, but are not surprised by it.”

The viral animal videos often come from places like Jukin Media and ViralHog. The companies identify extremely shareable videos and reach licensing deals with the people who made them. After securing the rights to the videos, Jukin Media and ViralHog license the clips to other media companies, giving a cut of the profits to the original creator.

Mike Skogmo, Jukin Media’s senior vice president for marketing and communications, said his company had a licensing deal with New Tang Dynasty Television, the station tied to Falun Gong.

“Jukin has licensing deals with hundreds of publishers worldwide, across the political spectrum and with a range of subject matters, under guidelines that protect the creators of the works in our library,” he said in a statement.

Understand the Facebook Papers


A tech giant in trouble. The leak of internal documents by a former Facebook employee has provided an intimate look at the operations of the secretive social media company and renewed calls for better regulations of the company’s wide reach into the lives of its users.

How it began. In September, The Wall Street Journal published The Facebook Files, a series of reports based on leaked documents. The series exposed evidence that Facebook, which on Oct. 28 assumed the corporate name of Meta, knew that Instagram, one of its products, was worsening body-image issues among teenagers.

The whistle-blower. During an interview with “60 Minutes” that aired Oct. 3, Frances Haugen, a Facebook product manager who left the company in May, revealed that she was responsible for the leak of those internal documents.

Ms. Haugen’s testimony in Congress. On Oct. 5, Ms. Haugen testified before a Senate subcommittee, saying that Facebook was willing to use hateful and harmful content on its site to keep users coming back. Facebook executives, including Mark Zuckerberg, called her accusations untrue.

The Facebook Papers. Ms. Haugen also filed a complaint with the Securities and Exchange Commission and provided the documents to Congress in redacted form. A congressional staff member then supplied the documents, known as the Facebook Papers, to several news organizations, including The New York Times.

New revelations. Documents from the Facebook Papers show the degree to which Facebook knew of extremist groups on its site trying to polarize American voters before the election. They also reveal that internal researchers had repeatedly determined how Facebook’s key features amplified toxic content on the platform.

Asked whether the company evaluated, when striking licensing deals, whether its clips might be used as engagement bait for misinformation, Mr. Skogmo said Jukin had nothing else to add.

“Once someone licenses our raw content, what they do with it is up to them,” said Ryan Bartholomew, founder of ViralHog. “ViralHog is not supporting or opposing any cause or objective — that would be outside of our scope of business.”

The use of animal videos presents a conundrum for tech platforms like Facebook, because the animal posts themselves do not contain misinformation. Facebook banned ads from Epoch Media after the network violated its political advertising policy, and it took down several hundred Epoch Media-affiliated accounts last year when it determined that the accounts had violated its “coordinated inauthentic behavior” policies.

“We’ve taken enforcement actions against Epoch Media and related groups several times already,” said Drew Pusateri, a Facebook spokesman. “If we discover that they’re engaging in deceptive actions in the future we will continue enforcing against them.” The company did not comment on the tactic of using cute animals to spread misinformation.

Rachel E. Moran, a researcher at the University of Washington who studies online misinformation, said it was unclear how often the animal videos led people to misinformation. But posting them continues to be a popular tactic because they run such a low risk of breaking a platform’s rules.

“Pictures of cute animals and videos of wholesome moments are the bread and butter of social media, and definitely won’t run afoul of any algorithmic content moderation detection,” Ms. Moran said.

“People are still using it every day,” she said.

Jacob Silver contributed research.