Joanne McNeil’s Lurking serves as a warning for makers of digital products

I knew I was in for a wild ride when I saw Warren Ellis’ — of Transmetropolitan and Castlevania fame — quote-review on the inside front sleeve: “The first history of the social internet I’ve seen that has its authentic life and breadth.” It’s a valid claim.

If you’re a product manager, designer, or developer, that history is worth your while. McNeil includes a number of examples of product and feature fails which caused anxiety and anguish among users — especially marginalized ones. Many companies strive to create a “disruptive” product, one with the potential for a huge impact. In doing so, they often unknowingly adopt or amplify toxic elements of our society — racism, sexism, etc. Lives can also be disrupted.

McNeil frames this disruption and the history of the internet user through the lenses of searching, safety, privacy, identity, community, anonymity, and visibility. The internet’s capacity to make good on those functionalities ebbs and flows through time and across various identities. Lurking includes examples of how marginalized communities have been both uplifted and failed by the social internet and its products and services.

This book grabbed me, not just on an empathetic or techno-anthropological level but also because of how it mirrors my history as an internet denizen, first connecting with friends and like-minded people through BBSes, then facilitating those connections as a BBS “sysop”, joining the ranks of the internet on Usenet, creating a homepage shortly after the advent of HTTP and graphical browsers, then coming full circle and connecting with friends and strangers on evolving social media platforms, starting with blogs. It’s been quite a journey.

In Lurking, McNeil describes how online society has shifted over time. If you’ve been online since birth, the book offers a comprehensive look at online life before your time. If you’ve journeyed online since there’s been an “online”, this book pieces it all together and offers glimpses you otherwise might have missed.

Take McNeil’s inclusion of Echo, a for-fee, Reddit-like forum for the New York arts community. You’ve probably never heard of it — even if you lived in NYC in the early ’90s. Likewise, if you haven’t been harassed on social media, as many women have — especially racialized ones — you might not know about tech companies’ history of largely ignoring such threats. Lurking tells these stories and others from the 30+ year history of the social internet.

Many of these stories aren’t the sorts of experiences you want for your users. As a product manager or designer, you don’t want your product to amplify racism, sexism, or anxiety — right? Lurking is a backgrounder on how previous social products have failed certain segments of their users. We keep making the same hurtful mistakes in digital products.

One attitude that hasn’t changed, unfortunately, is the pass white supremacy often gets online. Like Facebook after it, AOL — that walled garden of the internet — defended its inaction as supporting free speech, despite limiting speech elsewhere. In McNeil’s words:

In the nineties, AOL even hosted a page for the Texas branch of the Ku Klux Klan. The online provider prohibited racial slurs in search and user profiles and yet this was a First Amendment issue, AOL insisted.

Time and again, companies miss the same opportunities. Often a guise of “authenticity” leads them to reject emergent user behavior. Take Friendster, an early social media site. People could connect individually, but not as a group around a shared interest. To compensate, users started creating fake accounts to engage with like-minded users on an authentic level. These accounts acted as stand-ins for celebrities, movies, or even concepts like war. Rather than embrace them, Friendster stamped them out:

But Friendster developers were unbudging about its purpose. Rather than capitalizing on emerging user behavior, they banked on their product as a sorta-kinda dating space that mapped how various people were connected to one another. Fakesters were an innocuous presence, but the company believed they contaminated the data the platform collected and provided as a hook.

Facebook, in its drive for authenticity with real names, also kicked off accounts that employees deemed fake. Unsurprisingly, some people had their accounts suspended despite using their real names. In her book, McNeil recounts the story of Lance Browneyes, an Oglala Lakota artist kicked off Facebook for using a “fake” name. To get his account reinstated he had to submit proof of ID, only to have a Facebook admin inexplicably change his name to “Lance Brown”.

That real name policy often alienated trans users, Indigenous people, Black users, and anyone with a name that wasn’t expressly white. The sad irony being that anyone could — and did — create a fake account with an “accepted” name, only to harass LGBTQ+ and racialized people, infiltrate their groups, or flag them for deletion. Meanwhile, some members of targeted groups couldn’t even use their real names without jumping through special hoops, like submitting passport photos or other official documents. Facebook later changed its name policy — in 2015, 11 years after its founding.

It’s the stories in Lurking which truly illustrate the emotional damage poorly designed features can cause. Take the sidebar of supposedly most-frequently contacted friends that Facebook added in 2010. There are stories of girls who started seeing profiles of boys — past boyfriends, hookups, crushes, and acquaintances — in that space, despite not being connected to those boys and never having viewed their profiles. These, the girls deduced through experiments, were their lurkers. As you can imagine, the presence of this feature induced both anxiety and curiosity.

Another example of unintended consequences is Facebook’s “People You May Know” feature. One woman once noticed her estranged father of twenty-seven years in that box. Facebook had inadvertently forced this man back into her life along with any feelings it dredged up.

As you read Lurking, consider your offering through the lenses of searching, safety, privacy, identity, community, anonymity, and visibility, as McNeil lays out. Where does your product fit in? Are you supporting your users’ outcomes in each category? In which ways might people from marginalized groups disagree with your assessment? How might your products erode outcomes or lead to anguish or anxiety in those categories?

These errors or oversights can be limited with teams that better represent a product’s users. Drawing people in from a wider community increases diversity of experience and, in turn, diversity of thought. To reduce group-think, teams should include members of differing genders, ethnicities, nationalities, physical abilities, racial identities, and socioeconomic classes. Recognizing possibly damaging unintended outcomes won’t be easy, but it will be worthwhile knowing your product stands a far lower chance of ruining someone’s day — or worse.

Originally published at https://ianstevens.ca.
