from the will-it-protect-or-harm-the-children? dept
The Kids Online Safety Act (KOSA) was voted out of committee with a long list of amendments. Advocates had warned of some serious unintended consequences that could arise from this bill, the most concerning being that it could force tech companies to out LGBTQ+ minors to their parents, potentially against their will. The amendments were supposed to fix these issues and more. But did they?
The short answer is that it was an honest attempt, but I think it fails, and I think it fails for some specific reasons.
The context of the bill
In order to understand why this bill has significant problems, we must first cover some basics and separate the intentional and unintentional harms of the bill.
Let’s start with what the bill wants to do, which is to establish a baseline of protection for minors. It does this by creating a duty of care to act in the best interests of the minor. The bill then vaguely defines what that means and the categories of harm online platforms must protect minors from, and it requires the creation of certain tools parents can use to monitor their children, etc. It also gives platforms a lot of homework, like producing an annual report identifying the risks they think minors will encounter on their platform and what they are doing to mitigate those harms.
So why did I say this bill has intentional harms? Well, drafting a bill is hard: you have to describe what you mean when you say a business “must act in the best interests of a minor,” or “take reasonable steps” to “prevent and mitigate mental health issues” or “addiction.” The more detailed you get, the more convoluted it becomes; the vaguer you stay, the harder it is to apply to specific facts.
Let’s say I’m playing a game with VOIP and someone calls me a slur. Was it because the game company failed to take reasonable steps? If I want to spend all my free time playing a game, is it because the game is really good, or because it was intentionally designed to cause “compulsive use”? What even are “reasonable steps”? Especially when many of the things described in the bill affect different people differently.
KOSA’s intentionally vague language
The authors of KOSA essentially outsource to the courts the job of applying the bill’s vague language to real facts. Concretely, this means that if the bill passes, every platform will try to comply with what it thinks the text means. Then at least one platform will almost certainly be sued for falling short. Those companies will then have to slog through extensive discovery, and the courts will be left to sort it all out.
It will be a long, painful, expensive, and time-consuming process. But I think that’s intentional. Many members of Congress believe the platforms are not doing enough to protect children, even though they should have the resources to do so. They either don’t see, or don’t care about, the vast resources already devoted to trust and safety teams that protect all users, including minors. They see a problem that needs fixing immediately, and they believe a strong regulatory response will give platforms enough of a kick in the pants to figure it out. It’s the famous “nerd harder” demand so often leveled at Silicon Valley.
If you look at KOSA through this lens, everything makes sense. Never mind that it sets up a bunch of costly new compliance efforts that may or may not be productive. Never mind if it kills some companies or forces consolidation. Never mind that some platforms may try to ban minors entirely (and of course, we all know kids will figure out how to access those platforms anyway). It’s a big extrinsic shock that they hope will shake things up enough that the platforms end up nerding harder.
After all, enforcement of KOSA is limited to the FTC and state AGs. We can trust them to bring only the cases that advance the welfare of children, right?
KOSA’s extremely serious unintended harms
In Normal Times™, here’s how the debate over whether to pass KOSA would play out. On one side: this bill is a mess and will be too painful (and expensive) to sort out. On the other: who cares, the platforms can afford it, and we think it will at least help make the world a little better.
But these are not normal times, and advocates have warned that this bill will not only be painful to sort out, it also hands ideologues a legal avenue of attack against marginalized communities. This is a real threat that no lawmaker (especially Democrats) should be complicit in, especially since the overturning of Roe became a starting gun for using the legal system to wage culture wars and advance extreme ideologies.
The primary avenue of attack built into the original KOSA targeted the LGBTQ community: commenters at the time warned that it would out children to parents who might not be accepting, which could result in minors being kicked out of their homes or sent to conversion therapy. This is what advocates warned the drafters about, and what the new language sought to correct.
So was that fixed? Kind of. They added a provision saying the bill should not be interpreted as requiring the disclosure to parents of things like browsing behavior, search history, messages, or the content of communications. The tools platforms are required to provide to parents now seem to focus only on high-level things like time used, purchases, and opting out of the features implicated in the large section describing the harms they want to stop. Which option stops bullying? I’d like to know (maybe it will help me stop being t-bagged in multiplayer games).
Sorting this out may or may not still sweep in some sensitive data and out some children. Sometimes children keep secrets to protect themselves from their parents. That makes sense to me: I had a friend growing up who was sent to one of those reform schools Paris Hilton warned us about. Still, I am generally less concerned about forced outings than I was before the amendments.
What I am now more concerned about is that this bill invites a broad attack on platforms for allowing a child to view any pro-LGBTQ content. The culture war’s Eye of Sauron has turned toward harassing and vilifying this community, especially trans people, and it does so under the banner of child protection.
Unfortunately, the language these people use to vilify the LGBTQ community is pervasive in the bill. Being trans has been labeled a mental health disorder, and this bill says platforms are required to protect minors from such disorders. Merely seeing a drag queen has been described as sexual exploitation, grooming, and sexual abuse; again, covered by KOSA. Gender-affirming care has been labeled self-harm, which platforms must once more protect against under KOSA.
The vague language of the bill, which the drafters may have seen as an asset, is now a huge liability. And it’s not limited to anti-LGBTQ content. For example, a minor seeking information on how to obtain a safe abortion could also be characterized as engaging in self-harm.
The bill’s sponsors might think their bill is safe from being used in these culture wars because enforcement is limited to the FTC and state attorneys general. While I’m less worried about the FTC (for now), it’s easy to imagine some state AGs getting in front of the right judge and succeeding in blocking minors from accessing the basic information they need to understand what they are going through and how to get help, if they need it. Just look at Florida, where Governor DeSantis filed a complaint against a restaurant and bar that admitted children to a drag brunch, and said parents who allow their children to see a drag show could be targeted by Child Protective Services.
This bill throws a hand grenade into the middle of a particularly dark moment in our legal system. I don’t think that’s wise, or very smart politically, when the odds are actually quite high that someone will decide to weaponize this bill.
Matthew Lane is Senior Director at InSight Public Affairs.
Filed Under: congress, kosa, lgbtq, nerd harder, parents, privacy, protect the children, trust and safety, vague language