Before there were algorithms, there were invitations
A COLUMN By Clayton “Tiger” Hulin
Early social media arrived with college email addresses and profile songs. It felt temporary, playful, almost innocent. A digital yearbook that never closed. We posted photographs, reconnected with classmates, shared jokes. We messaged friends we had not seen in a lifetime. It felt like waking from a long sleep and discovering everyone still existed somewhere. People from every chapter of your life appeared at once. It seemed magical then.
But magic is often just machinery we do not yet understand.
There was comfort in not knowing how it worked. In believing it was simply connection. In assuming the design was neutral.
There is a certain peace in ignorance.
And a cost.
The platforms did not fade. They matured. They scaled. They professionalized. What began as novelty became infrastructure. Children who once needed parental permission to create accounts are now adults raising children of their own inside the same digital systems. Social media is no longer new. It is old enough to be intergenerational.
That is what makes the current moment different.
When Tennessee released the internal exhibits in its case against Meta Platforms, it did more than file a lawsuit. It pulled back the curtain on how this environment was studied, measured, and refined from the inside.
Most readers remember the early days. Fewer have seen the research decks.
Instead of summaries and selective quotations, the state released the documents themselves. Internal presentations. Research materials. Communications between leadership and policy teams.
Read together, those materials reveal something consistent. Engagement systems were studied in psychological, developmental, and behavioral terms, particularly as they relate to youth.
Engagement Was Understood as Reinforcement
One exhibit includes a Guardian article titled “Never get high on your own supply – why social media bosses don’t use social media.”¹
In that article, former executives describe platforms as built around short-term, dopamine-driven feedback loops. The concept of engagement as reinforcement was not invented by critics. It was openly acknowledged within industry circles.
That matters because it shows awareness. Repeated interaction was not accidental. It was architected.
Youth Impact Was Studied Systematically
The Instagram Well Being Research materials included in the archive provide structured mapping of youth interaction with platform content.²
The documents outline:
• A spectrum of content ranging from “Fine or OK” to “Disturbing”
• Correlation between perceived intensity and report rates
• Overlap between aggressive content and non-recommendable categories
• Explicit mental-health definitions identifying teens and younger users as core affected demographics
This was not anecdotal review. It was structured measurement.
Research was also tied to product implementation. Internal communications reference Smart Defaults decisions and how research influenced roadmap alignment.³
Defaults shape behavior.
Behavior shapes time spent.
Time spent shapes exposure.
The architecture was not blind to developmental realities.
Disclosure Is Not the Same as Restraint
One day companies may point to these studies and say they researched the impacts and informed the public.
That may be factually true.
But disclosure is not the same as structural friction.
Publishing data about overuse does not automatically redesign systems to make overuse harder.
If a company acknowledges reinforcement loops and maps youth vulnerability, the public question becomes whether that awareness meaningfully altered incentives.
Knowledge without constraint leaves the architecture intact.
This Is Not About Censorship
This is not an argument for banning platforms.
It is about consent and capacity.
Every major social media platform requires agreement to terms of service. That agreement is a contract.
Contracts require legal capacity.
Under American law, minors lack full contractual capacity in most commercial settings. They cannot independently enter binding financial agreements. They cannot waive certain rights. They cannot legally purchase regulated substances.
Yet millions of minors click “I agree” on dense digital contracts governing:
• Data harvesting
• Behavioral profiling
• Arbitration clauses
• Liability limitations
• Exposure to algorithmic recommendation systems
We treat that click as informed consent.
But adolescents do not possess the same cognitive development as adults. Even the company’s own internal research emphasizes the developmental sensitivity of teens.²
A checkbox substitutes for capacity.
And when harm surfaces, responsibility shifts first to the parents.
Not to the system.
Not to the defaults.
Not to the engagement design.
To Mom.
To Dad.
That is the imbalance.
The Capability Question
Social media companies have demonstrated extraordinary technical precision when it aligns with business or policy goals.
Algorithms can:
• Downrank specific content
• Detect behavioral clusters
• Identify coordinated activity
• Optimize targeted advertising
• Adjust visibility at massive scale
These systems are sophisticated enough to categorize users by interest, sentiment, and predicted engagement.
Yet we are told meaningful age enforcement is too difficult.
If platforms can modulate participation and visibility with remarkable accuracy, the claim that they cannot meaningfully restrict underage access deserves scrutiny.
Is the issue capability?
Or is it prioritization?
Age verification introduces friction. Friction reduces growth. Growth affects valuation.
The age bar exists in policy.
The enforcement is minimal.
That tension erodes trust.
A Larger Civic Question
The Tennessee archive shifts the debate from accusation to architecture.
The reinforcement discussion exists.¹
Youth harm measurement frameworks exist.²
Research influencing defaults exists.³
The issue is no longer whether platforms affect youth.
It is how consciously those effects were studied and how deliberately they were managed.
Have we given up on our children?
No.
But we may have normalized a digital environment we would never have accepted in our own adolescence.
Bullying once ended at the bus stop.
Comparison once had geographic limits.
Distraction once required effort.
Now it is frictionless.
We regulate substances that exploit biological vulnerability.
We regulate gambling that exploits reinforcement cycles.
Yet when it comes to algorithmic systems described in terms of feedback loops,¹ we largely defer to optional tools and parental controls.
The question is not whether these platforms have a right to exist.
The question is whether the developmental environment of minors deserves stronger structural safeguards than a checkbox and a warning label.
The machine works.
The research exists.
The architecture is documented.
The remaining question is whether awareness will finally produce meaningful restraint.
That decision belongs to us.
Notes
1. “Never get high on your own supply – why social media bosses don’t use social media,” The Guardian, included as Exhibit 17 in Tennessee v. Meta Platforms, Inc., Middle District of Tennessee, June 2024.
2. Instagram Well Being Research Summary, Exhibit 16 and related materials in Tennessee v. Meta Platforms, Inc., Middle District of Tennessee, June 2024.
3. Internal email referencing Smart Defaults and bullying research alignment, Exhibit 16 materials, Tennessee v. Meta Platforms, Inc., Middle District of Tennessee, June 2024.
Supplemental Documents
Supplement 1: Index of All Internal Meta Inc. Studies
https://docs.google.com/document/d/1m41dVAZ4AJLOJ4Z41Fjb-IP3FbmS0x9jAB-trQizV58/edit?pli=1&tab=t.ooqe8ushd20q#heading=h.siaq7d4lvui1
Supplement 2: Tennessee v. Meta – Complete Exhibit Archive
https://docs.google.com/document/d/1IXNwoBXNsRxGvJk4qeFx4W3fAzYK7lqPSu6bklhW34s/edit?tab=t.fcgvyqvzg4zu#heading=h.ldfnmda9r7hh
Supplement 3: Court Cases Against Meta Inc.
https://docs.google.com/document/d/1dk_IEtvPanp4UvuZ_NpimmUxlQBxNAtO4bNPmWxBJpI/edit?tab=t.gkzst1c4wijk#heading=h.6stkm655ni2j