Fan-In

In which attention is lensed.

For a long time, my go-to question when interviewing a candidate was about building Twitter. I was looking to see how they thought about scalability and so, in classic programming interview cargo cult style, I would try to lead the candidate down the road of designing a system which could handle the pub/sub-y part of Twitter’s timeline.

(This was years ago, mind you, back when Twitter’s product design was simple enough that telling engineers at Twitter how to do their jobs was a common hobby.)

The Bieber Problem

My question’s focus was the Bieber problem: given that Mr. Bieber is now followed by over 75 million people, how do you build a timeline system in which his adoring fans can, with low latency, see his tweets? Candidates would toss something up on the board, I’d ask questions about why they went with a particular approach or how they thought a particular bit would handle increased load, and they’d toss more stuff up on the board. It was great.
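
To make the setup concrete, the fan-out-on-write approach most candidates sketched looks something like the following. This is an illustrative sketch only; the names are mine, and none of it reflects Twitter's actual design.

    ;; Hypothetical fan-out-on-write: materialize each follower's timeline
    ;; at publish time so that reads are cheap.
    (defn publish-tweet
      "Appends a tweet to the stored timeline of each of the author's followers."
      [timelines followers author tweet]
      (reduce (fn [ts follower]
                (update ts follower (fnil conj []) tweet))
              timelines
              (followers author)))

    ;; Cheap for an author with a handful of followers; for Mr. Bieber,
    ;; a single tweet becomes 75 million timeline writes.
    (publish-tweet {} {"bieber" #{"fan-1" "fan-2"}} "bieber" "hello")
    ;; => {"fan-1" ["hello"], "fan-2" ["hello"]}

Reading a timeline is then just a lookup, which is where the low latency comes from; the interesting part was watching whether candidates noticed the write amplification.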

(Like almost all interview questions, it was ultimately just a vehicle for my own prejudices and superstitions but it passed for clever at the time and no one, including myself, noticed.)

The Other Bieber Problem

Some years later, I described this question to a friend of mine over drinks at one of the City bars where people argue about databases and eat tater tots. He was third-wave Twitter old blood (more or less, depending on how one counts his rings), and he smiled and nodded and told me about the other Bieber problem.

Fan-out’s rough, but that’s mostly parallelizable. The real rough part is that all of his fans reply.

I also smiled and nodded and, thinking only of computers, commiserated: “Wow, that sounds hard.” As with most things, I hadn’t actually thought it through.

The Small, Nice Thing

The other day I shared a small observation I’d made about programming in Clojure and compared it to a number of other languages I’ve used over the years. Unlike some of the other things I’ve written, it was not an advocacy piece. I had found a neat bit of Clojure where everything lined up perfectly—like the congruence of azimuth and street which slots the sunset neatly between skyscrapers on certain days—and wanted to share. I had hoped to return to my long-neglected blog by publishing a quick observation about some recent work of mine, the way I’d done back when I’d tried to explain what I liked about JAX-RS to folks in the Ruby community.

(Again, this was years ago, back when Rails was new and shiny enough that telling rails-core how to do their job was a common hobby.)

And so I wrote and published a post comparing the pair-wise addition of sequences of numbers in some of the languages with which I’ve worked. I had hoped to shake my reluctance to write, but what happened instead confirmed the dread I’d felt about writing at all.
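
For those who didn’t see it, the observation was small enough to sketch here: Clojure’s map accepts any number of sequences and walks them in lockstep, so pair-wise addition is just the addition function itself, with no wrapping lambda required.

    ;; Pair-wise addition of two sequences in Clojure.
    (map + [1 2 3] [4 5 6])
    ;; => (5 7 9)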

Some folks—friends, mostly—were happy to see me return to writing, which was nice to hear. A few—again, mostly friends—suggested more idiomatic solutions, and I updated the post with their suggestions. Others—mostly Clojure programmers—said they’d noticed the same thing or shared other little serendipitous bits of code, which made me smile.

Most people, though, wanted to argue. In retrospect, it’s a classic bikeshed setup—a simple bit of code about which everyone can have an opinion—and that’s exactly what happened. Folks mentioned me to tell me how it could be done in languages I haven’t used and don’t care about, to tell me how much better statically-typed languages are (as if I hadn’t spent years programming in statically-typed languages), to suggest that I cherry-picked this case in order to better advocate for Clojure, to shit-talk the languages I’ve used, &c. &c. &c.

This wasn’t entirely unexpected—lord knows I’ve written some fairly incendiary shit and lord knows I’ve argued on the internet about computers before—but what took me by surprise was the degree to which these fairly garden-variety responses fucked up my day.

In order to explain why, though, I need to back up a bit.

The Arc Of The Choral Universe

At the dawn of time, when the earth was still new and everyone hosted their own websites, responses to a piece of writing arrived as either emails or more pieces of public writing. If your work moved someone to respond—agreement, correction, outrage, whatever—they would either send you an email or publish something themselves.

Later, when the animals had been named and everyone was running their blogs on insecure bits of PHP, responses would arrive as comments on your post or pingbacks to new posts on other insecure bits of PHP.

Later still, when people began to walk on two legs and everyone gathered in central forums and link aggregators, responses would collect in long-running threads as commenters battled for karma and other imaginary things.

Now, as the stars begin to dim and humans dip and swerve in flocks of social media ephemera, responses are instantaneous and direct and physical, our nascent haptic helpers tugging gently at our sleeves to let us know that someone, somewhere, has had an opinion at us.

At each stage we gain something and we lose something. I miss what we’ve lost most recently. When comments were emails and other blog posts, they could be ignored. It’s a simple thing to add a filter to, e.g., mark all emails containing the word “bcrypt” as read and move them to a folder one never checks. Emails and websites don’t come find you; they don’t interrupt conversations of yours to interject their opinions; they don’t make your watch subtly tap you on the wrist. They wait.

Twitter, though, is different. It’s immediate. There are no messages left unread, no inbox, no filters, no delay, no curation. Tweets cause notifications, which are instantly pushed to the devices you carry with you daily. Twitter’s also convex. Everyone’s connected to everyone else, with only blocks and protected accounts to hide behind. You can find anyone, join any conversation, spectate any exchange, say anything. All conversations are accessible, to the point that a tweet about someone which doesn’t notify its referent isn’t a tweet—it’s a subtweet.

These are precisely the fishbowl dynamics which made me fall in love with Twitter, but they also have a deeply sinister side.

The Attention Lens

It’s a common enough phenomenon: someone says something on Twitter, perhaps about how cute sleepy dogs are, which suggests a particular reply. Someone sees the place where a joke would go, so they crack it. Everyone laughs. Someone else sees the same empty place where a joke would go, so they also make the joke. Everyone laughs. Another person makes the joke, and another, and another, and another.

(It’s a good joke, for what it’s worth.)

I’ve started thinking of this as an attention lens: small, human amounts of individual attention are refracted through social media to converge on a single person, producing the effect of infinite attention at the focal point. Even in the event that everyone means well, the experience is surreal for the person at the focal point of the lens.

This dynamic, like most, is objectively worse for people who are not stereotypically authoritative, which on the internet means everyone who isn’t a white, straight, cis guy between 18 and 50 years old. The inhibition against interrupting them is lower, the urge to interject is stronger, and as a consequence the mentions pile in that much more. Attention lenses also underlie the asymmetric dynamics of mass harassment on Twitter. Like an amplification DDoS attack, each individual participant need only contribute a handful of messages to flood the target’s mentions. Combine that with a small set of leaders indirectly coordinating the daily hate and you have the blueprints for a fuckboy mention-laser capable of melting steel beams with collimated rays of anime avatars.

Needless to say, this worries me.

The Empty Space

It’s tempting to say that this is a social problem, to add “check someone’s mentions before mentioning them” to the endless scroll of ignored netiquette, to sigh and lament that harassment will always be with us, but that’s fundamentally a cop-out. The problem is that we build social software with no affordances for the limits of individual attention in the name of expedience, of engagement, of the “marketplace of ideas”, of democracy itself.

The idea that we would not want to see something—to not consume something—rankles the Californian ideology. While our giant advertising companies perfect centralized spam-blocking technology, we debate whether it’s ethical for individuals to opt out of displaying ads in their browsers. (Something there is that doesn’t love a wall, it’s true, but only our walls. Theirs, obviously, make good neighbors of us all.)

It’s patently true that Twitter could do more but it’s also patently true that this isn’t one of Twitter’s priorities. They’re an advertising company with a sliding stock price, not a charity, and it doesn’t make business sense to bolt on a bunch of power-user-only features which can only result in fewer sponsored tweets hitting eyeballs. This is the kind of system I’ve let into my life, though. This is who intermediates my interactions with most of my friends.

I don’t have a solution for this. It’s a set of tensions I think humanity will have to resolve eventually, but it seems unlikely that boom/bust funding of winner-take-all companies will produce systems that respect the limits of human attention.

We’re all deeply in love with the possibilities of fan-out, but what’ll get us in the end is the fan-in.

Updated February 05, 2016

Almost a year ago, Nat Dudley wrote an insightful analysis of how Twitter’s UX leads to, as she puts it, “clusterfucks”:

We’re placing a huge expectation on individuals to strictly adhere to behaviour that is in direct contrast to the behaviour Twitter’s design encourages them to do.

As you’d expect from a UX designer, it’s a much more focused and practical article than this one and includes some suggested changes to help fix things.