Alone Together – Socializing in the Social Media Era

Sitting with friends or family, a realization dawns as you look around – the whole group is staring down at their phones, silently engrossed. It is an experience with which we are all familiar. A recent BuzzFeed article lists “18 Tiny Ways Your Phone Has Ruined Your Life,” pointing out that, among other problems, “you’re pretty damn addicted to the thing” and “nothing can come between you [and your phone], not even sleep” (Barnicoat). There’s a certain irony in a website designed around mobile use (from its coding, to its advertising, to its article titles) asserting that smartphones, and more importantly the online worlds they connect us to, might have some ‘tiny’ harmful effects. But according to the author and MIT professor Sherry Turkle, these effects are not tiny at all. In fact, they are much greater than we currently imagine. In her most recent book, Alone Together, Turkle confronts the “triumphalist narrative” of Silicon Valley, “the reassuring story that people want to hear and that technologists want to tell” (Turkle 18). She asks the “nagging question” on all of our minds as we notice that those around us are lost in their gadgets: “does virtual intimacy degrade our experience of the other kind and, indeed, of all encounters, of any kind” (12)?

The famous mid-century physician Lewis Thomas witnessed many medical miracles in his lifetime, as his profession made discovery after discovery, curing previously incurable diseases and raising life expectancy by decades. However, he once mused, even though the “new medicine works” – works beyond the wildest expectations he had as a young doctor in Depression-era America – “there are costs…the close-up, reassuring, warm touch of the physician…the comfort and concern…this uniquely subtle, personal relationship” that he saw disappearing behind the “immense, automated apparatus” that is the modern hospital (Thomas 58-59). Often, as miraculous as technological and scientific advancement can appear, these costs become obscured. And Turkle is interested in bringing these costs, particularly as they relate to the social dynamics of online interaction, to light. As she writes in the introduction, “if you’re spending three, four, or five hours a day in an online game or virtual world…there’s got to be someplace you’re not” (Turkle 12). The internet has ostensibly brought the entire world closer together, but she wonders whether it is instead driving us apart.

At one time, not too many years ago, interconnectivity was a novelty. Online enthusiasts in the 1990s labeled themselves ‘cyborgs,’ and these “cyborgs were a new kind of nomad, wandering in and out of the physical real” (152). Parents would warn their children not to put too much of themselves online, and would limit their child’s time on the computer – the internet was the realm of ‘nerds’ and ‘geeks.’ But in 2017 we are all always online, parents and children alike, and “in simulation culture we [all] become cyborg, and it can be hard to return to anything less” (209). The question of ‘less’ is central to the book. Do we lose something by disconnecting? Do we lose something by connecting? Turkle is an anthropologist by training, and she approaches these questions through the ethnographic interviews and studies that remain central to her field. “Over 450 people have participated in my studies of connectivity,” she notes, and these people’s stories are spread throughout the text – lonely computer scientists, suicidal teenagers, and feuding siblings on opposite sides of the country (xiii). In his seminal textbook on new media, Lev Manovich worries that “analytical texts from our era…mostly contain speculations about the future rather than a record…of the present” (Manovich 33). It is an understandable concern, and one that Turkle takes seriously. She is very much interested in the present, in the ways that titanic technological advancements interact with our primitive primate brains, still biologically stuck in the Stone Age. “I’ve never taken opiates, but I imagine it’s an electronic version of that…this is an opiate,” one interviewee muses (Turkle 228).

If the internet is an opiate, it is also a place where you can buy opiates – where you can seemingly buy, or do, or be anything. Turkle discusses the world of Second Life, where individuals ‘become’ (and intermingle with other) idealized avatars; the site PostSecret, where users can anonymously vent their innermost thoughts to the world; and the random, no-strings-attached personal communication of Chatroulette. But much of this interaction is superficial – “when technology engineers intimacy, relationships can be reduced to mere connections…and then, easy connection becomes redefined as intimacy” (16). Turkle readily admits there are no “simple answers” to the questions and concerns she’s posing, and that the purpose of the book isn’t so much to provide solutions as it is to put forward “good terms with which to start a conversation” (277). And it is a conversation that many people seem to want to have. “There are days i [sic] wish cellphones had never been invented,” reads one comment below the BuzzFeed piece – likely typed out on a phone (Barnicoat). And that is the difficulty. These technologies have quickly become indispensable, and their very indispensability is frightening. “We have agreed to an experiment in which we are the human subjects,” Turkle worries (299). And the results of that experiment are yet to be fully realized.

 

Works Cited

Barnicoat, Becky. “18 Tiny Ways Your Phone Has Ruined Your Life.” BuzzFeed, 19 May 2017, www.buzzfeed.com/beckybarnicoat/tiny-ways-your-phone-has-ruined-your-life.

Manovich, Lev. The Language of New Media. MIT Press, 2001.

Thomas, Lewis. The Youngest Science: Notes of a Medicine-Watcher. The Viking Press, 1983.

Turkle, Sherry. Alone Together: Why We Expect More from Technology and Less from Each Other. Basic Books, 2017.

Gravestones as Media

You’ll find them in every American town, large or small. Sometimes they’ll be conspicuous, sitting prominently behind the church in the middle of a country village. Other times they’ll be out of the way, obscured by trees or the growth of the city around them. They are rarely visited; in fact, almost the only time people visit them is when they are first put up, when someone passes away. They are a sad sight, and sometimes a scary sight – the background setting for ghost stories and zombie attacks – but they serve a purpose. They mark the dead, and leave a notice of their former life. Gravestones are an “extension of ourselves,” as McLuhan put it in Understanding Media – not only of the person who has died, but of the people and the societies who bury and memorialize them.

The “classic” gravestone is made of sandstone, although granite is now more common – and you can spot a pricey one not just by its size and shape but also by its material, with expensive marble replacing more common, cheaper alternatives. Sandstone fades slowly over time, a physical representation of the community’s fading memories of its older dead. When it comes to their graves, at least, the dead do have an age. The writing on the older ones has often almost completely disappeared, leaving their names and dates indecipherable. Newer ones are regularly cleaned, and quite shiny – with flowers and pictures sometimes left by living relatives. Keeping graves clean is a business, usually called “grave care” or “grave maintenance.” And it is also charity work. In New Orleans, the group “Save Our Cemeteries” is trying to restore some of the many famous historical burial sites in the city. But fame and notoriety are not enough to save a graveyard. Paris’s Montmartre Cemetery may be well known, but it is still full of crumbling and crushed headstones. The physical construction of a grave tells us about the affluence and prominence of the person being buried, and the physical upkeep of that grave tells us about their continued prominence – or sometimes the lack thereof.

Speaking of prominence, the graves of the famous often serve as a sort of exception to the rule – as a spectacle, spots of regular gathering and visitation, in what is otherwise a place of mourning. To return to Paris, the singer Jim Morrison rests at Père Lachaise Cemetery, and his grave continues to attract crowds. Indeed, that cemetery is unique in that it is a major tourist attraction, which millions visit annually. Of course, there is a similar case in the US – that of Arlington National Cemetery, where many of the nation’s war dead are buried. The rows of plain white headstones that line not only Arlington but more than a hundred military cemeteries both at home and abroad hold a special significance. Their owners died in uniform, and are buried uniformly. And that uniformity, coupled with their number, immediately brings to mind the immense costs of war. But not every soldier has a gravestone, and the absence of one can often say as much as its presence. The age of the machine gun, the bomb, and the tank brought the death and destruction of warfare to a previously unimaginable scale. Among the millions of dead in the First World War, there were many rendered unidentifiable by these new technologies – blown to bits by artillery, or gunned down in a fruitless mass charge. They could not be remembered in a traditional sense, their mangled and unrecognizable bodies lying in a foreign land – their deaths could be roughly dated, but they could not be named. So the British government created the Tomb of the Unknown Warrior to serve as a site of collective, rather than personal, remembrance (although it is quite literally a grave, or rather a series of graves, with actual remains buried below) – and nations around the world soon followed suit.

Regimes of mass murder did away with the gravestone entirely. The Nazi government cremated its victims, or more specifically, made its victims (organized into units known as Sonderkommandos) cremate each other. The Soviet government supplied generous amounts of liquor to its camp guards before ordering them to bulldoze over and bury, unmarked, the frozen dead of its Gulag system, in order to make the grisly task bearable. These people were rendered inhuman in life, excluded from that category by totalitarian governments, and they were rendered inhuman in death – excluded from the signification of humanity, of remembered humanity, that a gravestone provides. To these regimes, they were meant to be forgotten. On that note, there is the curious case of the Bergfriedhof cemetery in Heidelberg, Germany. Like the society that created it, Bergfriedhof was strictly divided by religion – with Protestant, Catholic, and Jewish sections. Many of its gravestones share a certain similarity, namely, their death dates. The first half of the 1940s is heavily over-represented for every faith, for completely understandable reasons. And yet though the dates are similar, and though the deaths occurred under the auspices of the same conflict, the gravestones marked with a Jewish star differ both stylistically and temporally. That is, many of the Jewish gravestones were actually constructed well after the war by charity groups, the bodies of those they commemorate having been burned to ashes in the crematoria of the death camps, and their families being unable, for obvious reasons, to memorialize them. In one corner of the cemetery, across the way from the victims of the Holocaust, there is a gravestone, heavily covered in ivy but otherwise unremarkable, that reads “Albert Speer” – the Nazi war criminal known for being Hitler’s favorite architect and for serving as the wartime Minister of Armaments. The phrase “Never Forget” takes on a strangely dichotomous meaning at Bergfriedhof – victims and war criminals alike are marked by rectangular stones labeled with names and dates in the midst of the same tree-laden field, both never to be forgotten, for very different reasons.