Encryption

Bill to Prevent Child Porn Could Kill Encryption


by Mark Rasch on February 7, 2020

New legislation introduced by Sen. Lindsey Graham, ostensibly to prevent the spread of child porn and other child sexual abuse material (CSAM), may have the effect of prohibiting the kinds of good security practices that prevent unauthorized access to medical records, credit card records, bank records and other personal information, and may make the internet a hell of a lot less secure. That’s because the bill, the Eliminating Abusive and Rampant Neglect of Interactive Technologies (or EARN IT) Act, would remove the limited immunity that online service providers currently enjoy for the acts of users on their systems and networks, and would require that these providers comply with “recommended best practices” for the prevention of online child exploitation, as determined by a federal commission.

So what’s wrong with that? I mean, we can all get behind preventing online child exploitation, right? Don’t we want to use “best” practices to prevent that? The commission to be created to establish these “best” practices will be made up of the most technically knowledgeable and sophisticated individuals with an appreciation for the nature of technology and privacy and security, as well as the needs for the protection of the rights of users.

Naw. I’m just kidding you.

The commission will consist of Attorney General Bill Barr, DHS Secretary Chad Wolf and FCC chair Ajit Pai (or their designees), as well as three people each designated by Sens. Mitch McConnell and Chuck Schumer and Reps. Nancy Pelosi and Kevin McCarthy. The commission would include at least four representatives of law enforcement agencies, two representatives of the technology industry, two representatives of “child safety organizations” and two computer scientists or software engineering experts. No privacy experts required. None on internet governance or regulation. Nobody representing users. Nobody representing device manufacturers. Nobody representing community organizations, social media subscribers, corporations that use the web or business representatives. Nope. And a practice can be deemed “best” if two-thirds of the body recommends it. So if the law enforcement and child safety organizations want something (with the assent of the AG) but the industry and technical people don’t, guess who wins?

This august body will decide whether ISPs, social media companies and other online providers are using the “best” practices (the statute requires the best practices) to curb CSAM. If it ain’t the best, the commission will require the best—not the most reasonable, not the ones that respect privacy, not what is workable. While the commission “shall consider” users’ interests in privacy, data security and product quality, the law provides no guidance on how much weight to give these interests. And remember, they have to find “best practices” to restrict CSAM. This assumes that we are not already using such “best” practices. Oh, and they don’t even decide which practices are “best”; they just recommend such practices to the AG, who makes his or her own determination whether to “certify” a practice as “best” or to overrule the commission.

And among these “best” practices might be outlawing end-to-end encryption.

EARN-IT

The concept behind the legislation is that the immunity currently conferred on online providers under Section 230 of the Communications Decency Act (230 immunity) means they have an incentive to not do anything about CSAM and they need to “earn” their immunity by doing more. In fact, they have to “earn” their immunity by proving to the commission that they are following all “best” practices to prevent the creation, storage and dissemination of CSAM. We want to enlist these entities as soldiers in the fight against child porn rather than allowing them to passively move data from place to place, some of which may be CSAM.

The bill would not only require these internet companies to adhere to these “best” practices, but it would make the companies and their executives liable for state crimes such as possession and distribution of child porn if the internet company failed to adhere to these best practices.

Just so we understand, current law, including the federal laws on child exploitation, makes it a crime to “knowingly” create, store or possess CSAM, and the law goes further to require internet companies to report any CSAM that they find or know about. And while they have no affirmative duty to scan for CSAM on the billions of text messages, e-mails, SMS and other communications they carry, if they find them (or even suspect they have found them) they must both preserve the files and report the potential child porn or go to jail themselves. Last year internet companies reported more than 45 million such images to law enforcement. So internet providers have no “immunity” from prosecution for knowingly possessing or distributing CSAM, except where they “preserve” it for law enforcement and turn it over to either the cops or to the National Center for Missing and Exploited Children (NCMEC).

Have You No Sense of Decency at Long Last?

At its core, Section 230 of the Communications Decency Act reverses a 1995 New York State case involving the so-called “Wolf of Wall Street” company Stratton Oakmont, which sued Prodigy (remember Prodigy?) for messages people posted on Prodigy’s messaging service that suggested that Stratton Oakmont wasn’t completely on the up and up. I’m shocked, shocked! Under then-existing law, the court ruled Prodigy was a “publisher” of the things people posted on the service, just as The New York Times is a “publisher” of letters to the editor, and therefore the internet company had a duty to monitor what was posted for things such as defamation, copyright infringement, export control violations and a host of other things for which a publisher might be liable. That would, at least in theory, include CSAM.

Section 230 recognized that internet providers simply are not publishers of what others say and do online, and that, from a societal perspective, we don’t want them to police free speech online. Other laws interpret the liability of internet companies for things such as copyright infringement, and a court could always order an internet company to take down or block content. But Section 230 simply says that the carrier is not the publisher.

In the case of CSAM, that means that the carrier is not liable for the CSAM that others disseminate unless they are aware of it (in which case they must preserve and report). They don’t have a duty to prevent it, to look for it, or to develop technologies to scan for and remove it.

That’s what EARN-IT is trying to reverse.

Sauce for the Goose

At its core, the EARN-IT Act would make internet providers, messaging services and potentially device manufacturers criminally liable for the CSAM that others send and receive using their services unless they can prove that they were using whatever “best practices” the commission mandated—or if not, that they were at least using “reasonable” practices to prevent CSAM. While the commission as a whole would prescribe the “best” practices, the bill would allow the attorney general to override the commission and mandate any practices he or she deemed reasonable.

So, if you were an AG determined to prevent CSAM, what would you require?

Hmm … anything?? Anything at all? Let’s see. First, we make use of the Dark Web illegal. All websites must be indexed and accessible—at least accessible to the FBI. And let’s shut down password-protected sites since child pornographers can hide there. Let’s require that all photographs identify the make and model of the camera used as well as the identity of the purchaser of the camera, so we can track down those who use digital cameras to make CSAM. Torrent sites and Tor sites/browsers would be illegal or heavily regulated (licensed?). Same for VPNs.

Next, we eliminate end-to-end encryption; prohibit layering encryption over any storage or communication medium; require that internet providers create and keep “back door” keys to any encrypted files stored on or transmitted through their networks; and for good measure, not allow internet providers to carry communications between any devices (I’m talking to you, Apple iPhone) that do not have a “back door” for law enforcement to access (currently device manufacturers have no 230 immunity, so this would revoke the 230 immunity of carriers for carrying messages between secure devices).

All this would be on my anti-CSAM “wish list,” or as the EARN-IT law would suggest, “best practices.”

The other thing we would require as “best practices” to combat CSAM is that all providers—from Google to AOL, to Hotmail, to every chat and messenger service—as a condition for the “privilege” of immunity scan every message, text, e-mail and file transfer for CSAM. Since they can’t actually identify what messages and files have CSAM, they would do what they are doing now: They would have to create a hash function of every message on the planet and compare that with the ever-growing database of hashes of “known” CSAM. For every message sent by everyone on the planet. Because 14 million reports last year was not enough. That would be another “best” practice to prevent CSAM.
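The hash-matching pipeline described above can be sketched in a few lines. This is a minimal illustration, assuming a plain SHA-256 exact-match lookup against a hypothetical hash list; real scanning systems such as Microsoft’s PhotoDNA use perceptual hashes that survive resizing and re-encoding, which a cryptographic hash does not.

```python
import hashlib

# Hypothetical database of hex digests of known prohibited files.
# (In practice, providers compare against hash lists of known CSAM.)
# This entry is simply the SHA-256 digest of the bytes b"foo".
known_hashes = {
    "2c26b46b68ffc68ff99b453c1d30413413422d706483bfa0f98a5e886266e7ae",
}

def sha256_hex(data: bytes) -> str:
    """Return the SHA-256 hex digest of a message or file payload."""
    return hashlib.sha256(data).hexdigest()

def is_flagged(data: bytes) -> bool:
    """Compare a payload's hash against the known-hash database."""
    return sha256_hex(data) in known_hashes

print(is_flagged(b"foo"))             # matches the database entry: True
print(is_flagged(b"unrelated text"))  # no match: False
```

Note the limitation this toy version shares with the article’s point: the provider must run this over the plaintext of every message, and with an exact cryptographic hash, changing a single byte defeats the match—which is why production scanners rely on perceptual hashing instead.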

And, of course, to do that, you would have to configure the transmission and storage of communications and files in a way that makes them accessible for examination by these providers. Which means no encryption.
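The incompatibility can be made concrete with a toy example. Here a trivial XOR stream “cipher” stands in for real end-to-end encryption (purely illustrative, not secure): once the endpoints encrypt with a key the provider never sees, the ciphertext passing through the network bears no relation to any plaintext hash list.

```python
import hashlib
from itertools import cycle

def toy_encrypt(data: bytes, key: bytes) -> bytes:
    """Toy XOR 'cipher' standing in for real E2E encryption.
    XOR is its own inverse, so the same call also decrypts."""
    return bytes(b ^ k for b, k in zip(data, cycle(key)))

message = b"some prohibited content"
key = b"shared-secret"  # known only to the two endpoints

plaintext_hash = hashlib.sha256(message).hexdigest()
ciphertext = toy_encrypt(message, key)

# The provider sees only the ciphertext; its hash is useless
# for matching against a database of plaintext hashes.
ciphertext_hash = hashlib.sha256(ciphertext).hexdigest()
print(plaintext_hash == ciphertext_hash)  # False
```

Only the endpoints, holding the key, can recover (and thus hash) the plaintext—which is exactly why mandated provider-side scanning and end-to-end encryption cannot coexist.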

So the bill does not actually say anything about security or encryption. But at its heart, by making the officers and directors of internet companies subject to arrest and prosecution for child porn and child exploitation, and forcing them to register as child sex offenders because they were running an internet company, it has the same effect.

Not About CSAM

At the end of the day, the EARN-IT bill, while pegged to the real, legitimate and growing problem of child sexual exploitation, is not about child porn, or child abduction or child abuse. It’s a camel’s nose in the tent. If we outlaw end-to-end encryption as a “best practice” to combat child porn, we can then use the weakened security to go after terrorists and those who finance terrorism, right? And who is in favor of terror, amirite? Oh, and then fraud. Fraud is bad. Let’s use the weaker security to go after fraudsters. And drug dealers. We want to get them too, right? I mean, drugs are bad, m’kay? Oh yeah, and tax cheats, welfare cheats, coupon fraudsters and whoever else comes next.

And that’s the real problem. We can’t have “weak” security and communications networks for CSAM and require these communications to be scanned without also having “weak” security and communications for everyone. And in the end, everyone is less secure with a less secure internet. And that includes children, too.
