Social media apps are bad for kids’ mental health. We should treat them like cigarettes and label them. Parents need a way of controlling them without going whole-hog and taking away the phones (not that that would be a bad idea).
You may or may not agree with that sentiment, but rehashing the arguments here is not my purpose. You can find plenty of debate on the Web, and there is a bipartisan bill in Congress about it. Rather, as a techie, my interest is:
How can we actually implement that?
In many parts of the world (the EU, Australia, New Zealand, Japan, and others) food containing genetically modified organisms (GMOs) must be labelled. Cigarettes carry gruesome warnings about their health risks. Those products aren’t banned, but they carry labels to inform consumers. This is a proposal to do the same for social apps, so parents and teachers can get a handle on the problem.
This is crowd-sourcing. Unlike a bill in Congress, where the staffers write the bill with “help” from lobbyists, WE are writing it here. So speak up if something is in error.
Who Is This For?
If you have kids, or you’re responsible for them: this is for you. How would you keep your kids safe from online bullying, predators, and just plain “wasting their time staring at a phone instead of being kids”? You could try looking at their phone…
But you might not know what to look for! What if you could ask the kid to show you what social apps they have?
(These are AI-generated images. I know: you hate AI. But this way I don’t have to get permission to use an image of a real person. Sorry.)
You might get a QR code like this (not what’s in the picture):
which a decoder shows as:
Adult Social Apps Installed
TikTok Jan. 4, 2024 8:21 am
Instagram Jan. 5, 2024 7:15 pm
That’s what this is: a proposal for phones to disclose the Adult Social Apps installed on them. The definition of an adult social app is set at the federal level. If you want to do something when you discover a kid uses one, that’s up to you and your school.
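To make the disclosure concrete, here is a sketch of a decoder for a payload like the one above. The actual QR contents would be defined at the federal level; this JSON layout and every field name in it are purely hypothetical:

```python
import json

# Hypothetical ASAR QR payload. The real format would be fixed by
# statute; this layout is only an illustration.
payload = {
    "v": 1,  # format version
    "asas": [
        {"name": "TikTok", "installed": "2024-01-04T08:21:00"},
        {"name": "Instagram", "installed": "2024-01-05T19:15:00"},
    ],
}

def render_report(raw: str) -> str:
    """Decode a payload string and format it like the listing above."""
    data = json.loads(raw)
    lines = ["Adult Social Apps Installed"]
    for app in data["asas"]:
        lines.append(f'{app["name"]}  {app["installed"]}')
    return "\n".join(lines)

print(render_report(json.dumps(payload)))
```

Note that nothing in the payload identifies the phone’s owner; it is just a list of app names and install dates.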
What IS an Adult Social App?
Defining that is the core of this proposal, and it’s the Federal Government’s only role. A method is described for showing the adult social apps on a phone, without taking possession of the phone. That’s all.
All the rest is left up to parents, the states, school systems, and municipalities. Congress is not going to ban child use of social apps.
How is it Defined Now?
The app stores for Apple and Google do define “child” and “social network.” These will need to be defined legally by Congress, not left up to the companies.
Apple AppStore
“Kids” are generally defined as age 11 and under.
1.3 Kids Category
The Kids Category is a great way for people to easily find apps that are designed for children. If you want to participate in the Kids Category, you should focus on creating a great experience specifically for younger users. These apps must not include links out of the app, purchasing opportunities, or other distractions to kids unless reserved for a designated area behind a parental gate. Keep in mind that once customers expect your app to follow the Kids Category requirements, it will need to continue to meet these guidelines in subsequent updates, even if you decide to deselect the category. Learn more about parental gates.
(The Kids Category)
The Kids category on the App Store is a great way for people to easily find apps specifically designed for children ages 11 and under on iPhone, iPod touch, and iPad. You’ll place your app in one of three age bands based on its primary audience: 5 and under, 6 to 8, or 9 to 11. In addition, you’ll need to follow certain guidelines to ensure user safety.
“social networks” are also defined:
Apps that connect people by means of text, voice, photo, or video. Apps that contribute to community development.
For example: interpersonal connections, text messaging, voice messaging, video communication, photo & video sharing, dating, blogs, special interest communities, companion apps for traditional social networking services.
Google PlayStore
On the definition of “children” Google avoids the question:
Families
Before submitting an app that targets children to the Google Play Store, you are responsible for ensuring your app is appropriate for children and compliant with all relevant laws.
…
The word "children" can mean different things in different locales and in different contexts. It is important that you consult with your legal counsel to help determine what obligations and/or age-based restrictions may apply to your app. You know best how your app works so we are relying on you to help us make sure apps on Google Play are safe for families.
They do go into quite a bit more detail on social networks:
Social Apps & Features: If your app allows users to share or exchange information, you must accurately disclose these features in the content rating questionnaire on the Play Console.
Social Apps: A social app is an app where the main focus is to enable users to share freeform content or communicate with large groups of people. All social apps that include children in their target audience must provide an in-app reminder to be safe online and to be aware of the real world risk of online interaction before allowing child users to exchange freeform media or information. You must also require adult action before allowing child users to exchange personal information.
Social Features: A social feature is any additional app functionality that enables users to share freeform content or communicate with large groups of people. Any app that includes children in their target audience and has social features, must provide an in-app reminder to be safe online and to be aware of the real world risk of online interaction before allowing child users to exchange freeform media or information. You must also provide a method for adults to manage social features for child users, including, but not limited to, enabling/disabling the social feature or selecting different levels of functionality. Finally, you must require adult action before enabling features that allow children to exchange personal information.
Adult action means a mechanism to verify that the user is not a child and does not encourage children to falsify their age to gain access to areas of your app that are designed for adults (that is, an adult PIN, password, birthdate, email verification, photo ID, credit card, or SSN).
Social apps where the main focus of the app is to chat with people they do not know must not target children. Examples include: chat roulette style apps, dating apps, kids-focused open chat rooms, etc.
What This Proposal Does
If you look at Google’s and Apple’s definitions, they cry out for a uniform standard. No company should be able to gain an advantage by being slightly more lenient, thereby attracting all the kids. That’s why we have universal rules on water pollution, for example: a factory can’t save money by just dumping its waste in the river.
Apple defines “kids” as ages 11 and under, while Google doesn’t define the term at all. Google sets out some toothless rules about children and adults coexisting on the same platform, with feel-good warnings about being “safe” online. Apple leaves it vague, so that their AppStore reviewers can do whatever they like.
“Child”
“Child” will be defined as anyone under 18. This will be the subject of hard negotiation and lobbying and might end up as 16 or 14, of course, but 11 is way too low.
“Social App”
Google’s definitions seem like a very good start.
Educational, game, and hobby apps must also be defined and explicitly ruled not “social apps.”
“Adult Social App”
This is the sticking point of this proposal: an ASA is a Social App without rigorously enforced rules, including rules on who can join.
A Kid-Friendly Social App must obey stringent rules to be set out in legislation (moderation, anti-bullying, warnings, privacy, advertising, contact from strangers, etc.)
All of today’s popular social media apps are ASAs. Their owners (Meta, TikTok, etc.) may, in the future, choose to create separate Kids versions of their apps for children.
ASAs Installed on a Phone Are Detectable
We want to make it possible for a parent or teacher to inspect a child’s phone and detect whether any ASAs are installed, without taking physical or login possession of the phone, and without violating privacy.
How do we do this? That’s the subject of this paper.
Prelude: ASAs Are Not Like Terrorism
Often in discussing a way to prevent something bad, people say “it only takes one.” You can have the best airport security in the world, but it only takes one determined terrorist to wreak havoc. We have to be right every time, while they only have to be right once.
This is not like that. A Social App is valuable because nearly everyone the kids care about is on it. As soon as the users fall below a certain threshold, it becomes uninteresting. So saying that some bad kids will get around it is not an argument. The question is: how difficult and risky can we make it?
What does a small drop in users accomplish?
Metcalfe's law states that “the financial value or influence of a telecommunications network is proportional to the square of the number of connected users of the system.” (There is some debate on this, of course.)
For computer and math geeks like me, this kind of language is second nature, but maybe you’re a normal person and math makes your head hurt. So the only math we’re using here is “2 squared = 4” and “3 squared = 9.”
Metcalfe’s law implies that the value of a one-telephone network is almost nothing; let’s call it “one.” As soon as another person has a phone, you can call each other, and the value is not 2, but 4 (2 squared). When a third person gets a phone, the value is 9 (3 squared).
Here’s the key: suppose the number of ASA users doubles. Then its value goes up, not by 2 but by 4. It works in the other direction, too: what if we halve the number of users? Then the value of the ASA is only 1/4 of what it was!
So even a small decrease in TikTok usage has a big, big effect on its value to the kids. Their video is much less likely to be seen by “everybody.”
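If you’d like to check the arithmetic, here is the whole thing in a few lines of Python:

```python
# Metcalfe's law: a network's value grows as the square of its user count.
def network_value(users: int, unit_value: float = 1.0) -> float:
    return unit_value * users ** 2

assert network_value(2) == 4   # "2 squared = 4"
assert network_value(3) == 9   # "3 squared = 9"

# Halving the user base cuts the value to a quarter, not a half.
print(network_value(500) / network_value(1000))  # prints 0.25
```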
How to Detect ASAs
The current Senate bill relies on parental consent. A few moments’ thought shows that that will work poorly, or not at all. Social App makers can simply assert “We’re doing the best we can. It’s not our fault if the kids lie.” Parents can plead befuddlement at all this Internet stuff, and claim not to know how to control it.
Furthermore, parental consent as the barrier is not even desirable. A parent cannot “consent” to let their child buy beer or tobacco or marijuana, so a seller is justified in requiring a valid ID, whether the kid carries a “note from his mother” or not.
So let’s consider the practical problems one by one, and see if we can solve them.
Terminology: This document defines the use of an Adult Social App Reporter (ASAR). The ASAR is installed, by law, on all phones. It reports the ASAs installed on the device, via a QR code.
Since there is no personally-identifying information (PII) transmitted in the ASAR code, there’s no need for ultra high security on who can use it. Further, the recipient has to explicitly bring it up, so there is no remote spying.
Federalism: In nearly all aspects, the ASA ban is under local control, or school district control for educational personnel. If a city wishes to make 14 the legal age for ASAs, that’s up to them. Police and school policies for enforcing it are defined at the state and local levels.
The federal government’s role is limited to:
Defining “Adult Social Network”
Creating and funding the agency that collects and disseminates ASA registrations
Defining the procedures by which this agency may launch civil complaints against unregistered ASA makers
Defining the contents of the QR code
Supporting an open-source API by which ASAs register their presence on a user’s device.
I’ll explain in detail how all this works.
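As a taste of what the open-source registration API might look like on a device, here is a hypothetical sketch in Python. The real file name, location, and schema would be platform-specific and federally standardized; everything here, including `asar.json`, is illustrative:

```python
import json
import pathlib

# Placeholder location; the real path would be platform-specific.
ASAR_PATH = pathlib.Path("asar.json")

def register_asa(app_name: str, installed_at: str) -> None:
    """Append an ASA record to the device's ASAR file."""
    records = []
    if ASAR_PATH.exists():
        records = json.loads(ASAR_PATH.read_text())
    records.append({"name": app_name, "installed": installed_at})
    ASAR_PATH.write_text(json.dumps(records))

# An ASA would call this at install time:
register_asa("TikTok", "2024-01-04T08:21:00")
print(json.loads(ASAR_PATH.read_text()))
```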
Registering ASAs
A company offering an ASA (like TikTok) must register with a government agency. The registration is only “I’m offering an ASA at these domains.”
Offering an ASA and not registering will be a civil offense with extremely high penalties. Private litigation (i.e. asserting that someone’s app is really an ASA) is not supported; this is to prevent harassment of app makers for personal or political reasons.
Evasion of the “ASA” Definition
We can make use of the “virality” property here: suppose rebellious kids create an app that’s limited to their school or city, and it somehow doesn’t fit our ASA definition? I’d claim that not having a nationwide or worldwide audience would dramatically reduce its addictiveness. So would a limit on the number of enrolled users. Or a mandated delay before a post becomes available.
Finally, there is a big difference from drinking or pot use. High school kids will tell you “anyone who wants pot can get it, no problem.” However, Social Apps depend on reach in a way that pot use does not. If you want to smoke pot, you do it, maybe with your friends, and you don’t care whether 90% of the other kids do as well. But losing 90% of your audience, such that most kids won’t see your post, is a critical problem in a Social App. Therefore, some leakage is OK.
How is the “Ban” Enforced?
There is no federal “ban.” Rather, the federal government provides a uniform standard which parents and school districts can use.
This is where the ASAR is radically different from any of the other bills: a parent or educator (we’ll use the term Authority) does not need physical access to the child’s phone. The child does not have to give up their password.
There are no secret searches of kids’ phones. This is not like a license-plate reader or a geofencing search, which surveils people not suspected of a crime.
The Authority requests the phone owner to display their ASAR. He or she then checks for the use of ASAs.
When would the Authority do this?
A teacher or other school staff would do it routinely on every student. A parent would do it with their children.
If the subject is obviously over 16, or 18, or whatever the age is: then the Authority needn’t check.
Staring at your phone, like those kids in the opening picture, is considered suspicious behavior, and the Authority would be encouraged to inspect the ASAR on all of them.
So I’d propose to treat underage ASA usage exactly the same as underage drinking or marijuana use. Adults facilitating it are guilty of a criminal offense, and that suffices to keep the problem from becoming universal. A bar that routinely serves liquor to children will eventually find itself in a world of trouble, and this would be the same.
What if the child doesn’t have their phone, or the school impounds them?
I’d say that is “problem solved.” They’re not carrying a phone around with them.
Most smartphones emit regular radio signals, so a lie could be easily disproven.
What if the child refuses the query from a teacher?
Suppose a teacher says to a kid, “please show me your ASAR” and the kid refuses?
This will depend on school policy. Refusing could be grounds for suspension, expulsion, or confiscation of the phone. Or if this school district doesn’t agree with the policy: nothing. Remember, the federal government is not policing this.
What if the child refuses the query from a cop?
This applies only if the jurisdiction defines ASA possession by a minor as a citable offense. Many cities may choose to leave the police and courts out of it.
A request to see the ASAR on a citizen’s phone, like asking for identification now, is permitted only during a “lawful encounter.” If the citizen is suspected of committing a crime, then the request is lawful. A child coming out of a store which sells liquor may be suspected of underage possession of alcohol, and could be searched in many or most jurisdictions. We would handle underage Social App use exactly the same way.
If in a jurisdiction “underage ASA use” is a citable offense, the cop may ask to see the ASAR on the phone, or alternately, ask for proof of age. Actual criminal penalties are highly discouraged.
What if the phone does have ASAs?
Here again, the solution is locally determined. A citation could be issued and the parents notified, the phone could be confiscated and the apps deleted, or a warning issued. It’s a local matter.
How Do Kids Use a Social App?
Usually, apps are downloaded and installed from the App Store, the Play Store, or some other online venue. Sometimes they’re preinstalled on the phone (you can also get at them through a browser, but we’ll consider that below).
Installing an ASA requires attesting that you’re over 16 (if that’s the age). However, this need not be checked rigorously; it just serves to warn innocent or honest kids that they shouldn’t be doing this. This proposal doesn’t demand that Social App companies do the impossible.
The ASAR is required to report the Adult Social Apps installed on this phone. That’s all; no privacy-violating information is transmitted. It does not know the age of the person using the phone, or how much they use it, or what they said.
What If They Use It Through a Browser, Text, Phone, or Other Channel?
What if you can access an Adult Social App without installing anything? On my own Android phone, for example, I don’t have the Twitter app, but I can go to twitter.com in the browser and access it that way. We could imagine other teenage rule-breakers using SMS or some other general platform to use Adult Social Apps without installing anything.
Allowing this at all is the Adult Social App’s choice. I know that Twitter and LinkedIn try hard to get you to install their apps instead of using them in the browser. But they don’t insist. Perhaps now they would.
Therefore, we’d have to say that any Social App that is accessed through a browser or other general platform must leave a fingerprint on the phone (details TBD). That fingerprint would be reported by the ASAR.
The ASAR File
The ASAR file’s format and API are defined in open source and owned by a federal agency, possibly NIST (which might outsource it to the IETF or the ITU-T, formerly CCITT), and extensions are prohibited.
The file’s name and location are platform-specific. Android phones may have quite different treatment of the file, but the QR codes will be the same.
ASAR is tamper-evident but not tamper-resistant: if a bad actor or a software bug modifies the file, the change will be apparent in the QR code.
Since all of this is open source, obvious bugs can be found and reported by white hat hackers and honest citizens.
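To see what “tamper-evident” means in practice, here is a toy sketch using an HMAC over the file contents: the OS signs a digest of the ASAR report, and the signature rides along in the QR payload. Key management (secure hardware, attestation) is glossed over, and every name here is illustrative, not part of the proposal:

```python
import hashlib
import hmac
import json

# In reality this key would live in secure hardware, not in code.
DEVICE_KEY = b"example-device-key"

def sign(report: dict) -> str:
    """Produce a signature over a canonical serialization of the report."""
    blob = json.dumps(report, sort_keys=True).encode()
    return hmac.new(DEVICE_KEY, blob, hashlib.sha256).hexdigest()

def verify(report: dict, signature: str) -> bool:
    """Check that the report has not been altered since it was signed."""
    return hmac.compare_digest(sign(report), signature)

report = {"asas": [{"name": "TikTok", "installed": "2024-01-04"}]}
sig = sign(report)
assert verify(report, sig)        # untouched file checks out

report["asas"].clear()            # a "null ASAR" hack...
assert not verify(report, sig)    # ...is apparent on verification
```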
Penalties for abetting tampering
What about malicious people promising the kids that they can use an ASA and not get caught?
The federal law sets out criminal penalties for providing tools or information on how to tamper with the ASAR, but not for actually doing it. This is similar to “possession of drugs with intent to sell” being more serious than simple possession.
Tampering with the ASAR
“Kids will find a way around it” is the natural objection to this. There will be a null ASAR for sale, one that tells the Authority, “Hey, no ASAs installed here!”
The same is true for underage drinking: there are fake IDs and there are bars that don’t check age rigorously. I remember that when I was still 17 I ordered a beer at one particular bar in college.
For a partial answer to that, see the section above on how this is not like terrorism. It does not only take one. Rather, it takes “everybody” (for some value of that word); otherwise, your post can’t go viral. As long as we encourage compliance and discourage noncompliance, ASAs will be unattractive to kids.
What Legislators Normally Do
Politicians like to get their names and/or parties associated with an issue. This lets them raise funds from people and groups who are concerned with that issue. That is much more important than actually solving the problem; it’s similar to the way a drug company looks at a vaccine that gives lifetime protection:
You can only sell one dose! What they want is a drug you take every day, forever. Similarly, a lawyer doesn’t want to solve a problem; he wants to add it to his practice, and a politician wants to keep those contributions flowing in.
Thus, for the above bill, Senators Schatz, Murphy, Cotton, and Britt want to be able to say in their next reelection campaign, “I led the fight to protect our kids from dangerous smartphone apps!” They consider their work to be over when the bill passes and the President signs it.
The details of how something like this works are usually settled in secret. Lobbyists are consulted, and the public only finds out when it’s too late. We’re inverting that and working the details out now.
What happens after such a law goes into effect? That’s someone else’s job. If it fails to stop kids from staring at their phones, that’s not the politicians’ fault. Maybe, maybe they’ll hold “oversight hearings” in a year, where they’ll get the chance to go on TV again and flog the presidents of tech companies, like they just did.
“Will it actually work?” is way down their list of priorities. But not to us!
An Exercise with TikTok’s Market Cap
Let’s play a game here: suppose we take TikTok’s value as $75 billion, going by this. Their monthly active users on that same date were 1.8 billion. The square of that number is 3.24E+18 (a very big number).
“monthly active users” means the number of distinct users who are active at least once in a given month.
Metcalfe’s law doesn’t say that the value is equal to the square, but we’ll pretend it is. Let’s divide the value by the square: we get $0.0000000231481481, so each user is worth about two millionths of a cent.
Now suppose that this campaign succeeds and TikTok’s monthly actives drop to 1.7 billion. After all, this is a US-only campaign for now, and that drop is 100 million kids. We get a new valuation for TikTok of $66,898,148,148.15, a difference of $8,101,851,851.85. So losing 100 million kids would cost them $8.1 billion in valuation, though presumably they’d get most of that back via a new TikTok for Kids.
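The back-of-the-envelope numbers above can be reproduced in a few lines of Python (the $75 billion valuation and the user counts are this article’s assumptions, not audited figures):

```python
value = 75e9             # assumed TikTok valuation, $75 billion
users = 1.8e9            # monthly active users
k = value / users ** 2   # Metcalfe coefficient, dollars per squared user

print(f"{users ** 2:.3g}")   # ~3.24e+18
print(f"{k:.10f}")           # ~$0.0000000231 per squared user

new_value = k * (1.7e9) ** 2          # value after losing 100M users
print(f"${new_value:,.2f}")           # ~$66.9 billion
print(f"${value - new_value:,.2f}")   # ~$8.1 billion in lost valuation
```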
Frequently Asked Questions
What about Internet Cafes?
What if, under this law, an entrepreneur installs TikTok and Instagram on some devices, and rents minutes to underage kids?
As mentioned earlier, this would be treated the same way as a bar that serves alcohol to children.