Monday, February 2, 2015

Attention Spans are Not Getting Shorter

A great deal has been made in recent years about how the attention span of the average American has been dropping precipitously. This complaint is most often leveled by people who are also most likely to wax nostalgic about how much better things used to be, have a deep-seated mistrust of social media, and are on the long side of 30. That hasn't stopped the narrative from spreading, however, and it has been picked up and repeated in some form across the social spectrum, even by those members of the younger generations who are believed to suffer most notably from this malady.

While the cause of such a disease has yet to be formally identified by the medical community, a list of usual suspects is generally paraded about by the media for public condemnation. In the 80s & 90s it was that kids watched too much television and played video games that rotted their brains. The 2000s saw the internet stretch its tentacles into practically every household occupied by people under 60, with damnable effects like email, instant messaging, and YouTube videos. Then we saw the blame shifting to blogs, Twitter, and Facebook, as mediums that encouraged communication in an ever shrinking range of word and character counts. Now, with the popularity of emoji, Instagram and Snapchat, people don't even have to write anything to get a point across!

Kids today, amirite?


This has been coming up a lot at my work, both in terms of directing the creation of the content that we sell (should we be making shorter videos to accommodate these shrinking attention spans?) and in the way that we communicate with our users (are our emails too long? are we making enough image-only content?). Someone recently sent me an email about a report showing that Instagram users had overtaken Twitter users:

"Instagram (300 Mln) overtakes Twitter (284 Mln) in 2014, suggesting a trend towards an even shorter audience attention span - from 140 characters to a single picture!"

...to which I jokingly replied:

"Early human communication was visual, moving from literal pictographs (cave paintings) to metaphorical ones (hieroglyphics), then to cuneiform and eventually written words.  We are now moving back in the other direction to more visual forms of getting our messages across."

I didn't even think about it again, or realize that my joke might be taken seriously without a sarcastic tone of voice, until my response was cited in earnest weeks later. I didn't mean what I had said at all, and I am actually dismissive of the notion, but we have come to a point where a suggestion that human communication is regressing to a preliterate state can be taken at face value, and given the direction of public discourse on the matter, I couldn't blame the recipients of my flip remark.

Let's first get one thing straight: in a literal sense, there is no way that our "attention span" (whatever that is) could possibly have changed genetically in the space of a generation or two, as a result of the technology that we are exposed to. Evolution is simply not a process that happens that quickly. Children born today will have the same natural attention span as their great grandparents, no matter how many tweets their 30-something parents have sent. So, any change in the actual communication preferences of people today is habitual, not hereditary. Now, this is not to say that there can't be any kind of biological change over the course of a person's life; brain chemistry is mysterious beyond my personal ken. But I will leave that argument to scientists, rather than media theorists.

Perhaps the constant electrical stimulation of certain nerve clusters has indeed rendered us a bunch of easily distracted simpletons, but I submit that the growth of short-form mediums is in no way evidence of any such change. This is yet another example of a classic post hoc fallacy, an attempt to craft a narrative to explain an observed phenomenon. There are, however, alternative explanations that don't rely on faulty causal conclusions.

The simplest explanation for the proliferation of micro-communication has to do with resources, both technological and chronological. Essentially, pieces of media have gotten shorter because it is easier and cheaper to create and disseminate content.

Understand that the situation we are observing is nothing new, but the continuation of a process that has existed since the birth of human society. When writing required materials that were difficult to produce, very few things were written down, because the cost was prohibitive. Even with the invention of the printing press, mass production was difficult and expensive, so books were rare. As the technology for making paper and printing became better and cheaper, we saw broadsheets and newspapers come into being, and film and television likewise made it possible to create content that reached a wider audience, but there were still physical limitations to deal with.

Think about it this way: 30 years ago, if you wanted to make a movie and show it to millions of people, you needed not only the time and resources to produce the original content, but then you had to spend huge amounts of time and money to get it broadcast over the airwaves, or distributed in theater chains across the country. It literally cost millions of dollars and thousands of hours, with a substantial risk that you would not return even a fraction of those investments. 99.999% of people simply couldn't afford to take part in such an economy of scale, and for those who could, it wasn't efficient to create anything that short.

Now? Anyone can make a 30-second video with their phone and upload it to YouTube. The investment requirements in time and money are minimal, meaning there is almost no risk, and thus no real downside to creating something that no one actually cares about.

In the past, if I wanted to share a picture of my awesome dinner with the world, it would require taking out a magazine ad (which of course no one ever did). Now I can just post it to Instagram effortlessly. Because the barrier to entry in content publication is all but non-existent, there is essentially no threshold for evaluating the "worth" of any communication that can exist digitally.

[Image: eBooks are cheaper, so the same revenue means more authors]



Obviously, these are all issues of creation and distribution, and can easily explain why there would be such a growth in micro-content, without making any assumptions about attention spans. It doesn't explain consumption, however. A tree falling in the forest and all of that: a million cat videos (and counting) wouldn't lead to the current conversation if no one were watching them.

I would suggest that attention spans have not declined, simply that people devote about as much attention to any particular piece of media as it warrants. There is a great deal of content that takes only seconds to consume, and such pieces fit easily into the spaces that exist in a modern schedule while being conveniently formatted for the smartphones that are never far from our hands.

I think that an additional point that needs to be made (and shoe-horned into this post) is that too much of the conversation around new media and attention spans implies that length or medium is somehow equivalent to value. If an Instagram post is considered less valuable because it is visual rather than text-based, what of a classic painting? Is the time that something takes to consume related to the time that it takes to produce? That picture of a fancy dinner may only take a second to compose, but some chef could have spent a great deal of time on the dish.

What of a haiku, or a sonnet? From Seamus Heaney to Shakespeare, plenty of great poets created works ranging from a few lines to thousands. Some of the greatest prose writers in history utilized a variety of forms, from novels to novellas to short stories, adapting the medium to the muse, but no one questions their artistic merit.

For all of that, the same people who read thousands of texts and tweets, and scan the Reddit headlines without reading the linked articles (myself included), still somehow manage to marshal their mental faculties long enough to watch feature films and even remain gainfully employed. I have it on good authority that people even still read actual books, printed on paper.

It's perhaps a less compelling narrative, but the truth is that the medium simply fits the message, so as mediums are created that are more convenient for delivering short-form messages, it is only natural that more such messages will be crafted. A story that is worth telling, or is worth hearing, will receive the attention it deserves.

As a man in my thirties, I have grown up with all of the offending technologies that should have stunted my ability to focus, from the Nintendo I got as a child to the Twitter account with which I shared this post. On any given day I will check my email, Instagram, Twitter, and several forums, and watch a YouTube video or two. At the same time, I am ten weeks into an online statistics course, 500 pages into Anna Karenina, and recently watched a two-and-a-half-hour movie. I will spend hours cooking a meal, or a whole day exploring a short stretch of river out of cell range. I am not alone in this, because movies, books, and Rolling Stone still have audiences.

So I suppose that my point is, far too many words later, that our attention spans are fine, just so long as we turn them towards objects that are deserving.

[Of course, the fact that probably only one person in a hundred made it this far into the post doesn't say anything good about the value of this blog. If you are reading this, congratulations on your attention span!]