January 13, 2016

Last weekend, a bunch of linguists met up in Washington DC for their annual professional conference. Tucked in among debates on language rights advocacy and descriptions of new kinds of NOs and NOTs emerging on Twitter, the American Dialect Society voted on the 2015 Word of the Year. After some lively debate with words like ghost (to leave a relationship by abruptly cutting off communication; aka an Irish goodbye) and adulting (“I remembered to buy toothpaste! I’m adulting so well today!”), a winner was finally chosen. They went with they.

Singular they, to be specific (and its morphological twins, their and theirs), as in

  • “Someone left their cake out in the rain”
  • “The client will be happy they hired us”
  • “Where’s Jack? Are they working from home today?”

So what? Why should you care (besides the fact that 4K’s resident linguist says you should)? Because the triumph of “singular they” is a triumph of actual, in-the-wild use over the imagined use of prescribed (or proscribed) guidelines; it’s the triumph of the User over the Program.

The thing is, people have been using they as a singular pronoun for centuries, from Geoffrey Chaucer to Jane Austen to Oscar Wilde to Gloria Steinem. The Word of the Year vote simply recognizes that, because linguists know that language is a usage-driven framework. And what a great meditation for digital strategy. Today, let’s ask ourselves: how are people already using your product, regardless of how you intended them to use it?

The internet is filled with usage-driven frameworks; people decide how to use something, intention be damned. There’s the dustpan that doubles as a water spout, the eraser that works as an earring back, the bathroom heated with a clay pot and a candle, and we’ve all tried to dry a phone in a bag of rice by now, even though that’s not really rice’s raison d’être. So when it comes to user-driven usage, there’s good, bad, and everything in between. And whether or not you should capitalize on these novel uses isn’t always clear-cut.

For example, there are good ways to incorporate user-driven usage. When Japanese cellphone companies in the late 1990s recognized that their users were going beyond the basic :-) and o_O emoticons and inserting whole images to liven up their texts, they created a way to incorporate those images into the texting framework itself. Voilà! Emoji were born (sort of) from emoticons (kind of).

And there are bad ways to incorporate user-driven usage. Twitter recently announced that it would expand its platform from 140 characters to more than 10,000, but only as a “read more” button in an otherwise standard tweet. This is already what Twitter users were doing; we just inserted a link to longer content we had on our blogs. But now Twitter owns our content? Maybe? Good recognition of user-driven usage, but a bone-headed application of it (and only months after their favicon debacle).

So consider the triumph of “singular they” for a moment and meditate on the notion of a usage-driven framework. Now think about your thing. Are people using it the way you intended? If they aren’t, can a small tweak better align you with your users? Or should you just leave well enough alone?

Douglas Bigham is a Content Specialist at Four Kitchens; he's a writer and ex-academic with a background in digital publics and social language use. He likes dark beer, bright colors, and he speaks a little Klingon.
