Breaking the Attention Fracking Economy
Regulating technology and media companies is hard.
Technology moves quickly, so any hyper-technical legislation risks being overtaken by the next breakthrough. Overly general regulations risk disincentivizing new innovations that may benefit people broadly. And not acting at all lets the negative consequences run unabated.
For today's purpose, let's ignore the kinds of bills that are introduced more for grandstanding and keep the focus on the things with positive intent.
Even though it's challenging to do, we've reached a point where social media platforms need some regulation. The evidence of harm is widely available and continues to pile up in both severity and pervasiveness.
Dr. Vivek Murthy, the Surgeon General of the United States, summarized much of this in his op-ed in the NYT last week, calling for a warning label to be appended to social platforms. While it is mostly geared towards the harm to teens, many of the potential dangers apply widely - and not just to social media. Give it a read; it's quite interesting.
In between detailing the dangers and gruesome results of platforms run amok, Dr. Murthy quietly slipped in a groundbreaking idea for national regulation that could minimize the harm of these platforms. I'll admit, I missed it on my first read.
"The measures should prevent platforms from collecting sensitive data from children and should restrict the use of features like push notifications, autoplay and infinite scroll, which prey on developing brains and contribute to excessive use."
Wait, that's it?
Pause. Take a breath.
Now think about how annoying and intrusive the auto-play ads on every news site are. Think about how often your phone dings during a conversation with a friend, pulling your focus away. Remember how often you've picked up your phone during a movie, gotten sucked into the endless scroll of IG Reels, and had to rewind for the parts you missed.
You almost forget that at every turn, you aren't really choosing your own adventure in today's media landscape. You're pulled, endlessly, from one experience to the next.
And now, imagine getting online in a way where you actually get to choose what you see next.
Sounds...calm, right?
Dr. Murthy's diagnosis of specific features that have a disproportionately negative effect on building and sustaining an addiction to social platforms is notable for (1) its simplicity and (2) its ability to be acted upon.
Distilling so much of social media's harm down to a set of three specific features that could be limited through legislation is a daunting challenge. This list of features is exceptionally notable because it does not get distracted by black-box technologies that would be harder to regulate (algorithmic curation/amplification of content, content-specific labels/warnings, AI-created or manipulated content, etc.).
These UX patterns can - and I'm coming around to the feeling of should - be eliminated from the modern web experience.
There would, of course, be unintended consequences. By definition, we would not likely be able to predict them all, but as it stands today, this not-really-a-proposal is a much clearer and more realistic path to regulation than current anti-monopoly efforts, national privacy regulation, or FCC & FTC limits on speech and advertising.
But I want to go one step further and craft some legislation that would give people back control over their online experience and limit some of the nastier effects of the modern social web (and, if you'll permit me, advertising too). We'll start with Dr. Murthy's list:
1. No Autoplay Content. Users must initiate the playback of any video or audio.
2. No Infinite Scroll. Digital platforms must engineer clear experience breaks that require users to express intent to continue further (this may need refinement for different formats).
3. No Push Notifications. All non-native-OS applications must refrain from using push notifications. If it came from the App Store, it doesn't have access to notification triggers (engineering a loophole for phone calls & messages, but again, this may need refinement since it could be exploited).
Going further, I would propose two more regulations to our "End Attention-Fracking Bill":
4. Legally capped ad loads for all content, regardless of channel. Whether it's TV or TikTok, no channel can devote more than 12 minutes of every hour's worth of content in the user experience to advertising. That means every channel has a fixed cap on the amount of advertising it can sell. If an ad's in view, that time counts against your cap. This disincentivizes increasing the frequency of ad breaks in content, and also incentivizes having clear breaks between core content and advertising - if you can't grow the amount of time (per user) dedicated to advertising on your platform, you make a higher-quality advertising experience within the time you can dedicate to it. If I could get around the stadium signage issue for sports, I'd also put a limit on the number of ads that can be in view simultaneously.
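To make the arithmetic concrete, here's a minimal sketch of how a platform might track ad time against that 12-minutes-per-hour cap. The ratio comes from the proposal above; the session model and function names are my own illustration, not anything in the bill.

```python
AD_SECONDS_PER_CONTENT_HOUR = 12 * 60  # proposed cap: 12 ad-minutes per hour of content


def remaining_ad_budget(content_seconds_served: float, ad_seconds_served: float) -> float:
    """Return how many more seconds of advertising may be shown to this user.

    Any second an ad is in view counts against the cap, so the budget scales
    with content actually consumed, not wall-clock time.
    """
    allowed = content_seconds_served / 3600 * AD_SECONDS_PER_CONTENT_HOUR
    return max(0.0, allowed - ad_seconds_served)


def can_show_ad(content_seconds_served: float, ad_seconds_served: float,
                ad_length_seconds: float) -> bool:
    """An ad may run only if it fits entirely within the remaining budget."""
    return ad_length_seconds <= remaining_ad_budget(content_seconds_served,
                                                    ad_seconds_served)
```

After 30 minutes of content, the budget is 360 seconds of ads - so a 30-second spot fits only until the platform has already served 330 seconds of advertising to that user.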
A quick aside for #5: I'll admit - I don't have a good idea of how to stem the tide of surveillance capitalism without burning down the web. Several proposed solutions have merit: restricting or preventing the use of any cross-device data for personalization; slowing down personalization (making it impossible to use any data signal for targeting for at least 30 days after capture, for example); allowing people to opt out of having their data stored or used; mandating explicit opt-in for any data used for targeting; and allowing each person to review all available data signals on a periodic basis. But they're all missing one crucial ingredient: fun. Let's choose a more shenanigans-filled way of making invasive privacy practices seem ludicrous.
5. Shame & Opt-Out. Every media company must be able to provide, on request, the total cost advertisers paid to reach each user of their services. Going further, line-item it out by advertiser to make it clear who's extra thirsty to reach you. Advertisers paying to license specific, often invasive, data will show up as higher spenders. Advertisers using extreme retargeting or over-bidding to reach niche audiences will show up as higher spenders. Side benefit: people get much more transparency about how much each platform is gaining from monetizing their attention. Does this then put negative pressure on subscription inflation? Does it become a status symbol? "Last week, companies spent $7,552 trying to reach me on Spotify & Meta?!" I'm here for the chaos.
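The accounting behind such a disclosure is simple aggregation. A minimal sketch, assuming the platform can export the clearing price of every auction a user's attention was sold in (the event shape and function name here are hypothetical):

```python
from collections import defaultdict


def spend_report(ad_events):
    """Summarize what advertisers paid to reach a single user.

    `ad_events` is an iterable of (advertiser, dollars_paid) pairs - e.g. the
    clearing prices of every ad impression served to this user. Returns
    (total, line_items), with line items sorted by spend so the extra-thirsty
    advertisers surface at the top.
    """
    by_advertiser = defaultdict(float)
    for advertiser, dollars in ad_events:
        by_advertiser[advertiser] += dollars
    line_items = sorted(by_advertiser.items(), key=lambda kv: kv[1], reverse=True)
    total = sum(by_advertiser.values())
    return total, line_items
```

Everything needed for this report already exists inside ad platforms' billing systems; the mandate would only require exposing it per user.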
That's it. That's how we start to stem the tide of attention-stealing, addiction-fueling, media dark patterns.
Will this make life harder for advertisers? Absolutely, and about time - it makes the smart agencies more valuable.
Could this help end platform power and incentivize investment in quality journalism, premium content, and slower living? Hopefully.
Would this make the web significantly more pleasant? Damn right.