Making Sense of DSA, 230, Thai Food and Swimming Pools

Nobody has explained Section 230 of the US Communications Decency Act better than Wired Magazine’s editor-in-chief Nick Thompson did the other day, in a tweet:

Why does section 230 allow tech companies to moderate content? Bc the previous law was like one saying swimming pools were liable for drowning deaths if they had lifeguards, but not if they didn’t. The point of 230 is to fix those nuts incentives.

This was in response to news that Republicans seek to amend Section 230. In a parallel policy development, which Netopia has commented on before, Big Tech is pushing to bring Section 230’s liability immunity into European policy, for example through the Digital Services Act.

Now, Thompson’s tweet raises a question: should we not worry more about the drownings than about the liability? The takeaway should not be “no liability anywhere”, but policy that minimizes the drownings. Speaking metaphorically.

This is the blind spot for Big Tech: the failure to acknowledge that there are actual problems that must be addressed. Instead, it worries about threats to “innovation” (read: the current business model) or “breaking the internet” (read: Big Tech’s monopoly on regulation). The supposedly threatening policies are a consequence of Silicon Valley’s failure to deal with the fallout of its own business. Google says Thai restaurants depend on its services to stay in business during the pandemic, while it fails to deal with disinformation campaigns – such as anti-vaccine messaging – that depend on those same services.

Clean up your swimming pool, Big Tech.