Professor Farid Goes to Washington. Sort of.
Should big tech companies be reined in? A U.S. News and World Report survey found that roughly 74% of respondents say technology giants should have their power limited. The flashpoint surrounding regulation is Section 230 of the federal Communications Decency Act. Section 230 was enacted in 1996 — long before Facebook, Google, and Twitter — and it provides, in part, websites with protection from liability for content created and shared on their sites by users.
While some content isn’t protected, such as child pornography and intellectual property violations, for the most part websites aren’t responsible for posts by their users because, in the context of the law, a website isn’t a publisher; it’s simply the host. And this, some members of Congress argue, is a problem.
I School Professor Hany Farid thinks changes need to be made to Section 230 that would remove the blanket protection from liability these companies currently enjoy. On June 24, 2020, he testified before a joint hearing of the House Subcommittee on Communications and Technology and the Subcommittee on Consumer Protection and Commerce entitled “A Country in Crisis: How Disinformation Online is Dividing the Nation.”
Consumer Protection Subcommittee chair Jan Schakowsky (D-Ill.) opened her remarks by stating that Congress must step in because “the American people are dying and suffering as a result of online disinformation.”
Mike Doyle (D-Pa.), Communications Subcommittee chair, said that the pandemic highlighted “the flood of disinformation online — principally distributed by social media companies — and the dangerous and divisive impact it is having on our nation.”
In his opening statement to the subcommittees, Farid asked: “How, in 20 short years, did we go from the promise of the internet to democratize access to knowledge and make the world more understanding and enlightened, to this litany of daily horrors? Due to a combination of naivete, ideology, willful ignorance, and a mentality of growth at all costs, the titans of tech have simply failed to install proper safeguards on their services.”
“We can and we must do better when it comes to contending with some of the most violent, harmful, dangerous, hateful, and fraudulent content online. We can and we must do better when it comes to contending with the misinformation apocalypse that has emerged over the past few years.”
The hearing was held remotely, with Rep. Doyle presiding over a WebEx gallery of congressional members.
I School: You’ve testified in person before Congress in the past. Was this time a little surreal?
HF: Yes! Usually, it’s a very serious affair — you know, the halls are hushed, and it’s very formal. This time, members of Congress forgot to unmute their mics, some left them on and there was interference...it was definitely different.
I School: In the Q&A period (committee members were each allowed 5 minutes to question the experts), one of the points you made repeatedly is that there are a few inroads to force change on the tech giants.
HF: Yes. There are three pressure points:
The first is regulatory pressure. Name any industry — food, financial, automotive — and when there is regulatory pressure, those changes result in safer products. I’m old enough to remember when car manufacturers screamed that being required to install airbags would put them out of business… of course, they were wrong. And what happened? Car companies actually started selling their safety records! This only happened because of regulatory pressure.
Next is customers — the marketplace. Normally, if you’re not happy with a service or product, you can vote with your feet and buy something else. But with social media, we’re not the customers, we’re the product! We’re not paying for these services — I don’t pay a fee to use Facebook. And because of this, we don’t have the power we’d usually have as consumers...it’s not like we can just hop over to MySpace. Because of the nature of the service and the stranglehold these companies have, it’s very difficult for us as customers.
The last is advertising. We’re seeing this [what it looks like when companies withhold advertising dollars] play out in real time. Disney just announced that they’re the latest addition to the 1,000 or so companies boycotting Facebook and Instagram ads under Color of Change.1
This can be incredibly effective and has been in the past; for instance, Disney is one of the single largest advertisers on social media. In the past, they stopped advertising on YouTube because YouTube was incredibly unsafe for young children. They said: we’re out. And they pulled their ads until YouTube changed its policies.
Amplification & Better Business Models
One of the arguments made against changing Section 230 is that removing the protections afforded under the law will stifle innovation, thereby crippling tech companies. The way Farid sees it, regulatory pressure on the market-dominance side of things might actually allow a healthier tech ecosystem to thrive.
Chair Schakowsky asked Dr. Farid to explain why the big platforms “allow amplification of conspiracy theories and disinformation, and how this business model benefits them.”
“Social media,” Farid explained, “is in the engagement and attention business. They profit when we spend more time on the platform and they deliver ads. The companies didn’t set out to fuel disinformation. But that’s what the algorithms learned.” He went on to say that the business model is “poison” and that it’s “fundamentally at odds with our societal and democratic goals.”
I School: You’re not buying the claim that changes to Section 230 will kill creativity and stifle innovation?
HF: Google seems to forget that they were the beneficiary of regulation 25 years ago, when Microsoft ruled the earth and Google was a little company trying to break in. The Department of Justice came in and told Microsoft — who was bundling their desktops with their browser, Internet Explorer, making it difficult to install another — to knock it off!2 And what happened when Microsoft was forced to make changes? It allowed oxygen in the room so a little search engine called Google could flourish! Now they want to keep out competitors so they can dominate.
I School: Can you elaborate on the amplification by the algorithms that drive social media?
HF: Mark Zuckerberg (CEO of Facebook) will have you believe: ‘Oh, you don’t want me arbitrating what’s true and what’s not.’ But he needs to be reminded that right now, he’s the arbiter of what’s relevant.
When you go to Facebook, your news feed is highly curated. Facebook is acting like an editor, picking what you see out of the volumes of things you could see...and they’re deciding what you see based on what they think will keep you on the platform longer.
YouTube (whose parent company is Google) is no different: 70% of videos watched on the platform are those promoted by YouTube.
I School: I was stunned by that figure when you mentioned it in testimony! And then I thought about how I might visit YouTube for a specific video and then immediately I’m sucked into watching the next five videos…
HF： This isn’t an accident. And you can find some really bad stuff. Recent studies have shown that half of people who joined Facebook groups associated with white nationalists were recommended those groups by Facebook.
I School: Say you’re ‘suggested’ into one of these groups of bad actors by algorithmic amplification; you’re someone who becomes heavily involved and eventually takes action, goes out and shoots someone, etc. Do you think that we should hold these companies culpable?
HF: That’s the right question to ask. We don’t hold the telephone company responsible for using a phone to conspire to commit a crime, right?
But what happens when it’s not a neutral platform? What happens when Facebook drives you into the group and then keeps showing you things that are more and more radicalized? Are they not somehow culpable? Morally, I think they are. But legally they are not, because of Section 230 of the Communications Decency Act.
Moderation & Monstrosity
Rep. Brett Guthrie (R-Ky.) asked Farid if he believed that technology companies possessed the technological means to better moderate illicit content on their platforms, “and if they do, why aren’t they using them?”
“I don’t think they have the means,” Farid said. “And they don’t have the means because they haven’t prioritized it.” When the DMCA (Digital Millennium Copyright Act of 1998) was passed, Farid argued, “companies got very good at spotting and removing copyright infringement because of the law.”
I School: If you had your druthers, what would moderation on these platforms look like?
HF: First of all, these trillion-dollar companies need to stop throwing their hands up and saying, the internet is big, there’s nothing we can do. You built this monstrosity. You can’t then turn around and say: oh, I can’t control this thing.
I’d like to see two things. One, we simply have to change the blanket liability found in Section 230...it’s simply too broad. It has given sites like Backpage, which was knowingly trafficking young children, protection from liability under Section 230.3 That’s insane! There are clearly consequences of Section 230 that we did not anticipate.
And the second thing is that I certainly don’t want to hold social media companies responsible for every post and every video that their millions of users put up. That’s not reasonable. But we can hold them responsible for radicalizing people by driving them to extremism, driving them to sexual abuse material, and knowingly allowing this material to be uploaded and then distributed through their networks. They should be held liable for that.
My hope is that if you just crack the door open a bit, you know they’re going to change; it will take just a little movement to get there.
1Grassroots organization Color of Change is actively pressuring tech companies to take a stand against white nationalist hate and voter suppression on their platforms.
2In 1998, a judge found that Microsoft had violated parts of the Sherman Antitrust Act; the company and the DOJ settled.
3Backpage.com was a classifieds site that was shut down by the Justice Department in 2018.