# Congress had a lot to say about TikTok. Much of it was wrong.

LOS ANGELES — On Thursday, Republican Rep. Earl L. “Buddy” Carter of Georgia lambasted TikTok CEO Shou Chew over viral challenges that he attributed to the app.

With a board behind him featuring so-called “sleepy chicken” (chicken sautéed in NyQuil), he claimed that the Chinese Communist Party was engaging in “psychological warfare through TikTok to deliberately influence U.S. children” specifically through TikTok challenges.

“We’ve heard from parents who are with us who have lost children,” he said. “Why is it that TikTok consistently fails to identify and moderate these kinds of harmful videos, why is it that you allow it to go on?”

When Chew tried to respond, Carter cut him off. “This is TikTok we’re talking about. TikTok. Tell me why this goes on.” It was a dramatic and heart-wrenching moment. It was also untrue.

NyQuil chicken was never a “TikTok challenge.” The idea originated on the fringe website 4chan in 2017, a year before TikTok launched in the United States. Known as “sleepy chicken,” the supposed recipe has been an internet meme for years, spreading in viral posts on Reddit, image boards and humor websites. Images and videos of chicken being cooked in NyQuil can also be found on YouTube and Instagram.

“if she makes you nyquil chicken … do NOT let her go,” read one viral tweet from 2017 that received nearly 10,000 shares. That was a year before TikTok was available in the United States.

On Monday, Carter’s office declined to say where the congressman had gotten the information or discuss last Thursday’s hearing.

Also declining to discuss assertions he’d made during the hearing was the office of Rep. Robert E. Latta (R-Ohio), who attributed the death of a 10-year-old girl to the “blackout challenge,” which he accused TikTok of promoting.

But the “blackout challenge,” also called the “choking game,” isn’t a recent phenomenon. In 2008, the Centers for Disease Control and Prevention reported that 82 children in the United States had died playing the choking game between 1995 and 2007. TikTok wasn’t founded until 2016 and didn’t launch in the United States until 2018.

It was not the first time inaccurate information was used to slam TikTok. Last year, The Washington Post reported that Meta, Facebook’s parent company, had hired a consulting firm to malign the app in local news media across the country. The firm, Targeted Victory, successfully planted op-eds in regional news outlets falsely tying TikTok to viral challenges that, in some cases, originated on Facebook. In one case, Targeted Victory worked to spread rumors of a “Slap a Teacher” TikTok challenge in local news. But no such challenge existed on TikTok; the rumor may have started on Facebook.

There was no evidence that Targeted Victory played a role in Thursday’s hearing, and legislators asked to comment Monday on how they’d come by the information they cited — none acknowledged being a TikTok user — declined to comment.

A spokeswoman for the House Energy and Commerce Committee chair, Rep. Cathy McMorris Rodgers (R-Wash.), declined to comment.

“Legislators using misinformation to back up their policies is not particularly new,” said Abbie Richards, a disinformation researcher at Accelerationism Research Consortium, a nonprofit studying the threat of far-right extremism to democratic societies. “We’re certainly seeing that when it comes to LGBTQ legislation that’s being implemented. They’re finding misinformation that backs up their points to justify their view that we should ban TikTok.”

Lawmakers made a number of other claims that were inaccurate or at least debatable.

When Chew denied that TikTok censors videos related to the Uyghur genocide or the Tiananmen Square massacre, McMorris Rodgers warned him: “Making false or misleading statements to Congress is a federal crime.” But a simple search on the app reveals dozens of videos bashing China and calling attention to the Uyghur genocide and the Tiananmen Square massacre.

Frustrated by that line of questioning, some TikTok users last week began uploading graphic content of the Tiananmen Square massacre to show that it would not be removed. “Oddly enough I tried posting this on Facebook and got a 24 hour ban,” one TikTok user commented on a TikTok video showing footage from the Tiananmen Square massacre that had been viewed more than 132,500 times.

TikTok also denied lawmakers’ assertions that the CEO of TikTok’s parent company, ByteDance, is a member of the Chinese Communist Party. He is not, the company said.

Other lines of questioning prompted TikTok users to rally to the company’s defense. Several mocked Rep. Richard Hudson (R-N.C.) after he asked, “Does TikTok access the home WiFi network?”

But Patrick Jackson, chief technology officer at Disconnect, a data privacy company, said the question might have had a basis in fact. While it’s hard to know exactly what Hudson was trying to ask (his office declined to comment), Jackson said he believes the congressman was attempting to question TikTok’s CEO about a setting in Apple’s iOS system where users are prompted to give apps permission to access other devices on their WiFi network.

“Most times it’s harmless,” Jackson said. “Maybe it’s a video app that wants to cast to your Chromecast, or send audio to your Sonos. By default, apps can only communicate to the internet, not to your local network.”

Apps can exploit that access, however, if a user grants it. For instance, giving an app access to a printer may allow it to print a document without asking the user. And data about what other devices a user has on their WiFi network is valuable. Other apps, such as Instagram and Signal, also prompt the user for permission to access their local network.
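For readers unfamiliar with the mechanism Jackson describes, here is a minimal sketch, in Swift, of how an iOS app typically triggers that local network prompt. It is an illustration under assumptions, not TikTok’s actual code: the Bonjour service type “_googlecast._tcp” (Chromecast discovery) is used only as an example, and the app would also need the corresponding usage-description entries in its Info.plist.

```swift
import Network

// Minimal sketch (not TikTok's code) of the iOS mechanism described above.
// Browsing for Bonjour services on the local network is what causes iOS 14+
// to show the "would like to find and connect to devices on your local
// network" prompt. The app must also declare NSLocalNetworkUsageDescription
// (and NSBonjourServices for the service types it browses) in its Info.plist.
// "_googlecast._tcp" (Chromecast discovery) is only an example service type.
let browser = NWBrowser(
    for: .bonjour(type: "_googlecast._tcp", domain: nil),
    using: NWParameters()
)

browser.browseResultsChangedHandler = { results, _ in
    // Each result is a device or service discovered on the user's WiFi,
    // exactly the kind of local-network data the permission is meant to gate.
    for result in results {
        print("Found local device/service:", result.endpoint)
    }
}

// Start browsing; the permission dialog appears the first time this runs.
browser.start(queue: .main)
```

Once granted, the permission can later be revoked per app in iOS’s privacy settings under Local Network, which appears to be the “allow or revoke permission at any time” option TikTok’s spokesperson refers to below.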

Jackson said he believes it’s on Apple to better police this sort of access and help consumers understand what data they’re giving and why. “Apple could do a better job communicating the risk and making sure developers justify the use of these APIs to Apple,” he said. “It should be part of the Apple review process.”

“TikTok follows industry norms, and like other apps, may ask permission to discover and connect to devices on the networks people use,” a company spokesperson said. “We do not sell personal information, and people can choose to allow or revoke permission at any time.”

Fighting misinformation about TikTok is especially difficult, said Richards, the Accelerationism Research Consortium researcher, because the false information often goes viral and plays into people’s preexisting beliefs. “It’s one of those classic debunking struggles,” she said. “It doesn’t matter how much you debunk it, because the lie has spread so much farther than the truth ever will.”

For instance, in 2020, a viral tweet accused TikTok of blocking the #BlackLivesMatter hashtag. While a temporary glitch hid view counts for all hashtags on the app for several hours one day, TikTok did not block the Black Lives Matter hashtag or stop counting views on it. In fact, the company promoted the #BlackLivesMatter hashtag repeatedly within the app throughout the summer of 2020, and videos containing the hashtag received tens of millions of views. A Post poll of TikTok users found that its user base is largely young people and people of color.

Jamie Cohen, an assistant professor of media studies at CUNY Queens College, said that the lawmakers displayed “willful ignorance of internet culture.” However, he added that the media also should take some responsibility for perpetuating the false information repeated at the hearing.

“Much like the way young people know how to game algorithms on social media, news media knows how to create panic to get viewership and ratings,” he said. “There would be no sleepy chicken behind a congressperson if the news media didn’t perpetuate the notion that it exists. It doesn’t exist, but it creates a fear factor.”

Amin Shaykho, founder of Kadama, a tutoring app for students that is promoted widely on TikTok, said he was disappointed that no member of the committee seemed interested in the potential negative impact of a ban. “I felt so bad that no member spoke up,” he said. “I’m going to have to lay off 5,000 of our tutors if TikTok gets banned, and millions of other businesses will also be impacted. Between 80 to 90 percent of our users discover us on TikTok.”
