Follow this and succeeding pages for the full archive of this web site.
A recent blog post by one of my favourite writers, Alexandra Samuel, and an article in the New York Times about research suggesting that teenagers who use the Internet at home are less likely to have good grades at school have motivated me to start a new series on Social Media.
Let’s assume for a moment that everyone is capable of being creative. This is a fair assumption based on an egalitarian model of human development. To varying degrees, people respond to complex situations in very creative ways. But is this enough to suggest that everyone can translate creativity into expressive forms with the power and import of art?
For example, art schools are hotbeds of creative engagement and, wherever they are available, their community-based creative courses attract a wide variety of the populace. Most people I speak to have a deep attraction to art and to artists. Many individuals harbor a secret desire to become artists. The same is true of the attraction to writing. One thing often forgotten in discussions of the period of history we live in is that the proliferation of web sites and blogs is perhaps one of the best indicators of the universal desire to create and communicate. This desire crosses national boundaries, class differences and religions.
The questions that flow from this seemingly superficial assumption about the universal desire to be creative are many, but among the most important is what do we mean by creativity?
First and foremost, creative engagement means producing something new and, most importantly, engaging with the world through less linear and more unpredictable means than the constraints of everyday life often allow. The ephemeral nature of discovery, combined with the excitement of working with ideas and materials, encourages fluidity of thought and an almost child-like delight in simple acts like shaping paper into a sculpture or creating movement from drawings that are still.
Artists are compelled to create. Their lives are burdened by the fact that there are rarely any alternatives to the depth of desire that they feel — the physical and mental need to explore their chosen craft or medium. Most writers cannot pass a day without engaging with words and sentences. Yet, not everyone is a writer or artist.
So, although everyone is capable of being creative, very few exercise their talent to the point of making creative engagement the centre of their lives. This is because the translation of creative desire into forms or materials requires a further step beyond the spontaneous production of artifacts. The secondary act of speculative and critical thinking that needs to be applied to creative production requires a profound understanding not only of history, but also of our place in history.
Painters come to an intimate understanding of the materials they use in the context of the history of art. The intellectual work that is necessary here far exceeds popular notions of spontaneous inspiration. Take a hard look at the many letters which Vincent Van Gogh wrote, and you see a man devoted not only to explaining his art but also to communicating his intentions. Alternatively, take a look at the many letters that Samuel Beckett wrote, and you become a witness to his intense and sometimes violent need to communicate his views of the world.
In all of this, art is produced through action and reflection, through interchange and community. Practice, repetition and rigour transform working with materials, ideas and media into complex acts of communication.
Creativity is therefore about more than what we do or how we think. It is about the application of knowledge to the production of artifacts, ideas and even moments in time. Everyone can be creative, but not everyone wants to spend the time and energy engaging with the demands that creative production requires.
Among its many errors of logic and argument, Nicholas Carr's book, *The Shallows: What the Internet Is Doing to Our Brains*, suggests that the plasticity of the brain — its malleability — means that the generation now heavily involved with, and indebted to, the Internet is having its brains rewired. Aside from the obvious problems in talking about the brain as an electrical system, the supposed plasticity of the human brain is far from proven, although it is an important area of research in the neurosciences. It is true that the brain is far more capable of adaptation than previously thought, and there is evidence to suggest that learning at all stages of life contributes to a "healthy" brain. However, to draw the conclusion, as Carr does, that we are in the midst of a crisis which is redrawing the boundaries of how people think (and, most importantly, what they do with their thoughts) is alarmist and counterproductive.
Carr's panic at what is happening to "us" — distracted multitaskers who no longer read or experience the world with any depth or rigour — perpetuates the centuries-old hysteria about the effects of new technologies on humans. Steven Pinker, who actually does research in the cognitive sciences, skewers the simplicity and reductiveness of people like Carr in a recent New York Times article. He says, "Critics of new media sometimes use science itself to press their case, citing research that shows how “experience can change the brain.” But cognitive neuroscientists roll their eyes at such talk. Yes, every time we learn a fact or skill, the wiring of the brain changes; it’s not as if the information is stored in the pancreas. But the existence of neural plasticity does not mean the brain is a blob of clay pounded into shape by experience."
Of even greater interest is Carr's transformation of Darwin's theories of evolution into claims about the speed with which the Internet is altering human biology. This fast-forward approach to human evolution has its attractions. After all, humans were not around to witness millions of years of evolution, so it is easy to offer simplistic explanations for shifts in human activities and modes of thinking.
Carr's moral panic (taken up and reproduced by hundreds of journalists in newspapers and blogs seemingly desperate for some explanation as to why they are hooked on a medium they haven't thought about with enough depth and historical range) suggests that evolution is like Lego blocks. Once you put a few blocks into place, you have a structure, and once you have a structure, presto! you have evolved!
Carr's argument is just a variation on intelligent design. Replace god with the Internet and you have a power so great that humans are not only its victims, they are growing new brains to accommodate its vicissitudes.
Why do balkanized versions of genuinely interesting and important research projects into human adaptability get transformed into this type of discourse? It is probably not sufficient to suggest that every new technology generates panic among those who least understand either its present use or future transformation.
After all, had Carr taken even a minimal peek at the 19th century, he would have noticed that, among other assertions, the telephone was described as a killer of conversation and human interaction (an attitude that lasted well into the 1960s). He would also have noticed that the cinema was described as a terrible distraction that, among its many effects, would probably lead to the death of literature and theatre. Photography was lambasted for its potential to lie and to convince the gullible masses that the truth of an event could be found in images.
But Carr is not the problem here; he is merely symptomatic of an ever-growing and worrying trend toward ahistoricism among so-called public intellectuals. Those who should be the most sensitive to the nuances of change and to the shifting relationships among individuals, their communities and the communications technologies they use are now sanctimoniously declaring that the public is being dumbed down. Carr, of course, never spent any time doing an empirical study, because it would have taken him years to complete. He accuses internet dwellers of swimming in a sea of illusions without asking any hard questions about how he came to that conclusion.
He attributes to internet users the very lack of attention to history that his own argument displays, and, in so doing, he imposes on this vast and ever-changing community, with all of its diversity and multi-national character, a superficiality of intent that he himself creates with his own very shallow arguments.
I will weave through a series of juxtapositions in this blog entry drawn from a number of experiences which I have had in the "field" of ethnography and documentary film — a kind of bricolage — or as James Clifford has put it, an 'ethnographic surrealism'.(1)
In retrospect these fragments are linked in ways which I could not have anticipated before I made the attempt to understand the connections. This kind of reconstruction interests me because it is a combination of personal history, fieldwork and theoretical exploration, evidence of an effort to explore and map the relationships among subjectivity, analysis and experience.
The etymological origin of the term documentary is rooted in the notion of the lesson and is connected to docility, doctrine, indoctrination and didacticism. Docility suggests someone willing to be taught and also someone who is teachable. To be responsive, to be taught, to be open to the information which is presented — information which, if it is to function as document must reproduce in as great detail as possible the world being pictured.
Search a bit further into the etymology and the stronger connection is to doctrine and indoctrination which have roots not only in teaching but in notions of specialization — in the idea of a specialist able to instruct, someone whose knowledge cannot be questioned, the documentary filmmaker.
I bring up this tableau of origins because, although a consensus has developed around the definition of "documentary," the debate at this stage pivots on questions of realism almost, though not completely, in opposition to questions of pedagogy. Documentaries, however, exist as object lessons in themselves of a desire to teach and thus to enter into (or create) a system of communication which links images with specific outcomes or results. The instrumental logic of the documentary is so deeply ingrained that the technology of image creation is now geared to increasing the probability of specific effects upon the viewer.
I am speaking here about the hybridization of light-weight video, film and computers, a kind of postmodern brew designed to make the experience of viewing more immersive, hence more real.
Let me contrast the above with the work of Eric Michaels, a documentary/ethnographic imagemaker who worked in Australia with Aboriginal peoples. He died in 1988, but the impact of his undertaking will continue to be felt for a long time. His essays and his brilliant monograph, *The Aboriginal Invention of Television* (1986), reveal a sensibility closely tied to some radical innovations in documentary and ethnographic thought over the last decade. (2)
Michaels explored the frontiers of one of my major interests, the impact of video and television on indigenous cultures. He achieved this by rethinking the notion of "effects" — the ways in which Western cultures control and attempt to dominate other societies — and not positing anything like a linear model for what happens when new technologies are thrust upon indigenous peoples. Michaels worked on both sides of a complex process. He was aware of the need for indigenous peoples to take control of the media they were being exposed to. He was also very sensitive to the specific choices which they made with respect to images. His approach interests me because he questioned the roots of instrumental thinking by looking at the way in which another culture responds to the logic of images — to modes of storytelling and representation.
His insights in this regard are very significant. In his essay on Hollywood iconography (Michaels 1988:119), Michaels points out many of the radical differences in understanding which the Aboriginal group, the Walpiri, have with regard to American films and television. Not only are the plots dealt with differently, but the characters in these films are reinterpreted according to the specific exigencies of Walpiri culture and social life.
All of this is of course a way of questioning the role of documentary images precisely as devices of teaching and learning that may have cross-cultural value. It is also about how to analyse the strategic choices which different cultures make in response to the influences which they have on each other. The question of vantage point — where and how these choices can be examined — was a central concern of Michaels. He tried to draw upon the experiences of non-print media and apply them to the process through which ethnographic knowledge is transferred and transformed into visual and oral documents. He noted the specificity of Aboriginal approaches to images and celebrated the differences in how they interpreted documentary claims of truth and realism.
This is made very clear in his article entitled, "How to Look at Us Looking at the Yanomami Looking at Us," (3) in which he says: "A solution is to address the entire process of visual media as a problem of communication, more specifically in cross-cultural translation." (Michaels 1982:145)
It may be that nothing of value to indigenous cultures can be yielded in the process of translation and that the role of visual media is more important for Western cultures than for colonised ones. But this would presume, as Michaels so often pointed out, that colonised cultures themselves have somehow escaped the influences of modern media, which as anyone who has been watching the growth and development of video for example, knows is not the case.
This still doesn't lessen one of the central dilemmas of ethnographic and documentary work with film and video. For the ethnographer it may be more important to uncover both the applicability and effects of the technology than to let the technology work its way through the society in question and let that society find the measure of its own response.
I think that it would not be too radical an assertion to say that the response of indigenous cultures to cultural phenomena cannot be ascertained clearly until those cultures have devised strategies of response, whatever form those responses might take.
Working its way through — what do I mean? A process, perhaps, which may not be open to external examination and, without wanting to push the point too far, a process which may produce forms of internal and culturally specific images which cannot be judged, evaluated or examined from the outside. I want to be careful here, because I am not suggesting that a vantage point couldn't be found which might permit one culture to examine another. But there is the matter, and I consider it to be an important one, of how we go about understanding our own history with respect to modern media, let alone the history of other cultures.
There is a tendency, manifest in many ethnographic and documentary projects, and even more so when film and video are put to use, to presume that what other cultures choose as images can actually be translated. It is this presumption which I think needs to be contested, because what is inevitably involved are complex sign systems which our own culture has had difficulty interpreting for itself, let alone for others.
This is a fascinating and perplexing problem. It suggests a kind of opaqueness which the universalizing tendencies of modern film and television production have not grappled with. We need to celebrate the complex and rather 'different' images which the Aboriginal peoples of Australia have produced, and which Eric Michaels documented. (3)
(1) See Chapter Four of *The Predicament of Culture* (1988) in which Clifford argues for a redefinition of the history of surrealism in order to show the close if not parallel development of ethnography and surrealist thinking.
(2) I am thinking of the work of Edmund Carpenter (1970); James Clifford (1988); Jean Comaroff (1985); Vincent Crapanzano (1980); Michel De Certeau (1984); Johannes Fabian (1983); Clifford Geertz (1988); George Marcus and Michael Fischer (1986); Paul Rabinow (1977).
(3) Eric Michaels (1982). This is an essay in a superb collection edited by Jay Ruby, entitled, A Crack in the Mirror (1982).
Much has been made of the iPad’s possible influence on the future of reading and writing. Many of the fears about the disappearance of physical books are justified, just as the worries about the future of newspapers need to be taken very seriously. There is no doubt that we have entered an unstable period of change as various traditional forms of media shift to accommodate the impact of the Internet and digital culture in general.
However, the idea that books will disappear, or that newspapers will lose their relevance because they may have to shift to devices like the iPad, is naïve at best and alarmist at worst. After all, books are really just pages and pages of discourse, sometimes fictional, often not. All the genres that make up what we call the modern novel are not dependent on the physical boundaries established by traditional book production. In fact, an argument can be made that the processes through which books have been brought to the attention of the reading public (ads, publicity campaigns and so on) are more in danger of dying than the books themselves. There is only one way in which books will die, and that is if we cease to speak, or if we shift so dramatically to an oral culture that the written word becomes redundant.
An argument could be made that people inundated by many different forms of media expression will relegate books to the attics in their homes and in their minds. And a further argument could be made that the decline of reading has been happening for some time, if we look at the number of books sold over the last decade. There is a real danger that books and the reading public will shrink even further.
Nevertheless, my sense is that reading has morphed onto the Web and other media, and that reading there is more about glances and headlines than about in-depth absorption of texts. We now have a multimedia space that links all manner of images with texts and vice versa. The nature of content is shifting, as are the venues in which that content can be read. The design of graphical spaces is often more important than words. Texts on the iPad can be embedded with moving images, sounds and so on, in much the same manner as we now do with web pages. However, this phantasmagoria of elements is still governed by language, discourse and expression.
Matt Richtel has an article in the New York Times that examines the effects of all of these divergent media on users. “At home, people consume 12 hours of media a day on average, when an hour spent with, say, the Internet and TV simultaneously counts as two hours. That compares with five hours in 1960, say researchers at the University of California, San Diego. Computer users visit an average of 40 Web sites a day, according to research by RescueTime, which offers time-management tools.” Richtel suggests that the intensity of these activities and the need to multitask are rewiring the human brain. I am not able to judge whether that is true or not, but regardless it would be foolhardy not to recognize that all of this activity increases the speed of interaction. Clearly, reading a non-fiction book is not about speed, and books in general cannot be read in the same way as we read web pages, especially if we are looking at book content on mobile phones.
The same can be said for newspapers, which over the years have been designed to entice readers into their pages through headlines, in order to slow down the tendency to glance or scan. This tells us something about the challenges of print. We tend to assume that the existence of a newspaper means that it is read. But there has always been a problem with attention spans. Newspapers are as much about a quick read as web pages are. Newspapers are generally read in a short time, on buses or trains — talk about multitasking.
As it turns out this is very much the same for many genres of the novel from thrillers to the millions of potboilers that people read and that are not generally counted when reference is made to the reading public. In fact, the speed of reading has accelerated over the last hundred years in large measure because of the increased amount of information that has become available and the need to keep up.
This is where e-books and the iPad come in. E-books are an amazing extension of books in general, another and important vehicle for the spread of ideas. The iPad will make it possible (if authors so desire) to extend their use of words into new realms. Remember, when the cinema was invented in 1895, among the very first comments in the British Parliament was that moving images would destroy theatre, books and music. Instead, the cinema has extended the role of all of these forms, either through adaptation or integration. Writers remain integral to all media.
Vancouver, Canada, June 3, 2010
M. Garcia, Consul-General of France
Ladies and Gentlemen
Good evening and thank you so much for coming!!
I stand here tonight feeling both proud and humbled. Proud because so much about culture and creativity is affirmed by this honour, and humbled because I have been chosen to receive this prestigious award from the French government.
My first words are to express my sincere thanks
to the French government, to the former Minister of Culture and Communication, Madame Christine Albanel, to the Minister of Culture and Communication, Monsieur Frédéric Mitterrand, to the Ambassador of France in Ottawa, Monsieur François Delattre, to the Consul of France in Vancouver, Monsieur Alexandre Garcia, and to his Cultural Attaché, Monsieur Hadrien Laroche.
Thank you for everything you have done to make this great distinction possible.
It is an honour to have been recognized and admitted to the order, an order that is unique in character and purpose among Western democracies.
Let me express my profound thanks to the French government, the former Minister of Culture and Communication, Madame Christine Albanel, the Minister, M. Frédéric Mitterrand, the French Ambassador in Ottawa, M. François Delattre, and M. Alexandre Garcia, the Consul in Vancouver, and his Cultural Attaché, M. Hadrien Laroche. Thank you for making this award possible, and thank you for the recognition and for supporting awards of this kind, which are unique in character and purpose in Western democracies. My deepest thanks also to my wife Martha and my two daughters, Maija and Katie, for their support and love.
One of the central purposes of French government cultural policy in the international arena is the promotion of cultural diversity among all nations. This policy is also at the heart of UNESCO’s cultural platform. Ninety-three nations, including Canada, signed an agreement to promote cultural diversity. France led this effort, and among the policy’s key statements are the following:
**Affirming** that cultural diversity is a defining characteristic of humanity;
**Conscious** that cultural diversity forms a common heritage of humanity and should be cherished and preserved for the benefit of all;
**Being aware** that cultural diversity creates a rich and varied world, which increases the range of choices and nurtures human capacities and values, and therefore is a mainspring for sustainable development for communities, peoples and nations;
**Recalling** that cultural diversity, flourishing within a framework of democracy, tolerance, social justice and mutual respect between peoples and cultures, is indispensable for peace and security at the local, national and international levels.
These statements and the values they put forward are in many respects, at the heart of my career and articulate far better than I ever could what has motivated me to spend a lifetime creating, promoting and defending culture in all its manifestations and forms.
I was born in London, England in a difficult post-war period of deprivation and familial challenge. My family and I immigrated to Canada in 1952, during a time of economic difficulty for all countries. The struggle of immigrants to find their place has only intensified since then, not only because of the increasing movement of peoples across many societies, but also because so many cultures have faced immense and sometimes insurmountable struggles to survive. Diasporic experiences were fundamental features of the 20th century and will continue to determine the direction of the 21st century.
My career is built upon and is a reflection of what I learned during that formative and early period of my life as we struggled to adapt to living in Montreal.
Over the last forty years, I have worked at a number of positions including a wonderful period at McGill University and five years in Australia at LaTrobe University. During my tenure as the President of Emily Carr University I have learned more than I could ever have imagined when I took the position fourteen years ago.
Seamus Heaney, the great Irish poet, credits poetry with teaching him to “walk on air against your better judgment.”
Albert Camus, whom I read in my teens and who had a formative impact on my life, said, “The artist forges himself midway between the beauty he cannot do without and the community he cannot tear himself away from.” And: “He who has often chosen his destiny as an artist because he felt himself to be different learns very quickly that he will nourish his art, and his difference, only by admitting his resemblance to all.”
When you walk on air, life is a continuous adventure. And, when you immerse yourself in beauty, even the saddest moments are learning experiences. Learning is at the heart of what I do every day. It is only possible to learn if one remains open: open to change, open to insights, open to difference. Even in this historical period characterized by many difficult challenges, I continue to believe that it is possible to walk on air.
I am privileged every day at Emily Carr to be among wonderful people and to experience their passionate excitement about creativity, invention and innovation: their extraordinary commitment to the materials of art, to the crafts of making and to the challenges of living and learning about the creative life; their passion for aesthetics, for colour and for form; their intense desire to produce meaning and communicate it. All of this has not only taught me a great deal but also given me a profound insight into the potential and importance of the creative process.
So, this award means the world to me because it also acknowledges the values of that creative life and the importance of sustaining creativity in every aspect of what we do every day of our lives.
The other great intellectual mentor in my life is Claude Lévi-Strauss. His work brings together all of my interests in anthropology, sign systems, linguistics and images. It is therefore not without some sense of the ironies of history that we find ourselves today amidst a renaissance in First Nations culture and cultural production.
Because it was Lévi-Strauss who brought Pacific Northwest native culture into French consciousness and did more than many to signal to Westerners the importance of culture to this extraordinary area of the world. And it is not without irony that, in one of his last books, brilliantly titled *Look, Listen, Read* (*Regarder, Écouter, Lire*), Lévi-Strauss celebrated the craft of basket weaving so integral to First Nations culture. He talks of craft in relation to myth and of the integrated nature of making, thinking and living. For me, making, thinking and creative engagement are at the core of what I do and how I live.
I will leave it to my other great intellectual hero, Michel Serres, to complete these remarks. In speaking about the social and cultural context that we now share, Serres mentions the endless noise of modern life, the sharp points of despair at the edge of chaos, and he contrasts this with art as the means through which we build society, create vision and make peace with each other and with the world we live in. And then, in talking about Lévi-Strauss, he says that Lévi-Strauss helps us see what we cannot see, and that through his stories he helps us understand the strange and beautiful social forms that surround us. Cheers to that, and cheers to you all!! And thank you again, all of you.
Recently, I have been thinking about the material nature of digital culture, perhaps best exemplified by the Web and its intensely spatial nature. The Internet is often understood by imagining or visualizing a vast lattice of lines connecting across the globe. Of course, latticeworks are by their very nature architectural: points in space connected by technologies, the built environment. At the same time as it creates the possibility of virtual interaction, the Internet is also very material. This materiality comes from the wires, servers, buildings and routers that form and shape the experiences of interaction without being in the foreground.
Web pages are designed using boxes for texts and images. The underlying HTML code for the Web is hidden but the manner in which a web page draws content from servers is visible every time we click. Writing for web pages is a material practice. The immateriality of computer screens is offset by the concrete nature of keyboards and mice. The iPad for example, is an object, although probably one of the most powerful objects I have ever held. The iPad hovers between its physical presence and the intense manner in which one’s eyes are drawn into its images, into the screen based worlds of games and photographs. Apps reach out from the screen into our daily lives either organizing our time or allowing us to write on glass enclosures. Software is written and tested within a material universe of employment and job pressures including sometimes unreasonable expectations of productivity.
The aerials that make Wi-Fi possible are produced in factories as are Apple computers. The assembly line in a computer factory looks like something out of the 19th century. Some of the most exotic minerals in the world are needed to make our computers and their screens work correctly. Many of those minerals are found in China and Africa. More often than not the working conditions for extraction and processing are terrible.
Materiality is not something that disappears because we now have so many ways in which to experience the world through virtual means. One of the criticisms of human relations on the Internet is that, because distance plays a significant role, there is likely something very superficial about the communications process. It is true that the Internet makes communications across varying distances possible and, as with Facebook, promotes interactions that are not face to face. And there are dangers in living a life in front of a screen, just as there were dangers in spending too much time on the telephone or watching too much television. There is nothing inherent in the technology that sustains the manner in which it is used; rather, there is something in the technology that attracts use from that material universe of people and communities.
The environments we share have never been pristine and civilizations have been built on the interactions between humans and their technologies.
First day of school, West Beverly High, 1990: Brenda and Brandon Walsh of the television show 90210, transplants from Random Town, Minnesota, have no idea what they will be up against in Beverly Hills. Ten years later, the show ends with two beloved characters getting married, which sweetly ties up the show in an unpretentious manner.
I found the show less interesting after high school graduation, because I only cared about Brenda and Kelly Taylor, as evidenced by my Brenda and [Kelly Barbie dolls](http://en.wikipedia.org/wiki/Kelly_(Barbie)), bought in the '90s in Florida. When Brenda and Kelly chopped their hair off, I chopped off their Barbies' hair, making them hideous and completely un-sellable should I ever choose to part with them (if I could find them — I suspect they're with my Babysitters Club collection "in the basement").
I really loved Beverly Hills 90210, yet for a million dollars I can't remember the character that Tiffani-Amber Thiessen played (OK, I just looked it up — it was Valerie Malone — phew). I believe what [IMDB](http://www.imdb.com) says since I have no memory of people calling out "Valerie". But I was completely wrapped up in that show. Ironically and to my surprise, I cannot name one secondary character.
OK, so the show started twenty years ago and ended in 2000 (apparently my family let an eight-year-old watch this show). Whose memory is that good, especially with respect to television?
I have found that when a show ends no one is ever pleased and most of our questions remain unanswered. In some ways, that's half the fun — we're left to discuss and wonder for years about our favorite characters: what they did with their lives, and where they would be now if a pathetic attempt at a reboot of the show were made (e.g., the present-day version of 90210).
By comparison, if I can't name every single character on Lost ten years from now then I will deem myself a failure as an observer of popular culture. And while the original 90210 can't be compared to Lost, it was an iconic show that I, and millions of other young people, watched for the duration of its run.
Shows come and go, but Lost is different. To me it stands out. The show practically cured me of my fear of flying — OK, a crash wouldn't be ideal but if it got me to the Island and if it got me to Jack/Sawyer/Desmond, well, I wouldn't complain. And a mango diet sounds good right about now.
Lost started as a "what if" — what if people crashed on an island that was a little different, a little weird? I wasn't hooked until the second season. I was in a hotel room waiting for a flight from London to Vancouver and there was an episode on one of the four British channels. Its title was "The Other 48 Days". I have a strange thing about TV shows — I don't like the introduction of new characters.
I inevitably don't love them as much as the original characters. I'm bitter towards them, defiant, wondering why they were suddenly brought into *my* show. Well, as a good Lost lover would know, "The Other 48 Days" involved only new characters — and yet I was transfixed. It never occurred to me that there were other survivors of the crash, and I had no idea what their experiences would be like. I went back to Vancouver thinking, "I should get back into Lost". Coincidentally, I was completely jet-lagged and staying at my parents' house. My parents had taped the Season 2 finale. I decided, in the middle of the night, to just watch it — why not? I didn't understand a thing, but between this new hot Scottish fellow and some random button-hatch-thing, I decided I was completely back on the Lost train and immediately bought Season 2 in its entirety and watched it over a very short period.
What other show has had the courage to play with plot lines and characters like this one? If someone had suggested that one of the key locales for the show would be a hatch with a man inside pressing a button every few minutes in order to save the world, I would have laughed. Yet we (most of us?) accepted this reality once we started watching, and I think we (all of us?) fell wholeheartedly for Desmond as a result (male or female, who DOESN'T like Desmond? Definitely the most likeable guy on the planet). And, was Desmond part of the original cast? No. Do I have unwavering love for him? Yes. Does this mean I should accept new characters into my life on TV shows? I guess (grumble grumble).
The series finale ended six years of turbulence. I've been on 14-hour flights, and even a few minutes of turbulence drives me nuts. Lost has been a turbulent experience. Lost is about stress and anxiety and it has made me scream and cry and wish I had never started watching it. I don't know what the Island is, but I think I know what it means to me, and it's not just a meeting place of attractive, shirtless men.
I have watched many people I love on the show die: Charlie, Daniel Faraday, Charlotte, Alex — even Juliet, whom I was adamantly against for so many years. I hated her even more when she shacked up with Sawyer, yet she wasn't worried because she knew I would love her eventually. And I did; and I cried when a) I thought she was dead at the end of Season 5 and b) she actually died at the beginning of Season 6. I don't even want to touch on Jack's death because I am in denial. Maybe one day, but not today. Complete and utter denial.
I had so many questions I assumed would get answered in the final season until I realized I didn't really need answers. Carlton Cuse and Damon Lindelof, the writers of the show, want us to keep the questions coming. They want us to debate the show and its outcome for years, if possible. They want to leave us with question marks surrounding all the mysterious elements that made up the show.
Once you accept Lost, once you know that there is a Smoke Monster, polar bears, and a giant wheel that can hide the Island and also allow people to escape it, you give yourself the freedom to simply enjoy the world created by the writers and directors. How and why would Daniel Faraday's mother kill him in the past? Well, I was never going to get an answer to that. Is Richard Alpert finally mortal now that he has a gray hair? How could I have hated Ben so much and by the end love him like a dear old friend?
But these are just questions, and they have allowed me to think about so many possible outcomes to the story. And Lost is about the debate between outcomes, reality and myth. The frustration we feel is also part of the joy that the story has brought us. We will always have so many questions, but isn't that the point? To question everything around us, to question each other? What other (network) show has brought up so many different ideas and points of view and left so many stories dangling?
Lost will live on as a show that divided people, but its true followers know that it's an exemplary show that took us far from anywhere we thought it would go when it began. If it had been a simple show about people crashing and trying to live together, without all of the supernatural forces in play, would the intrigue have lasted six brilliant seasons? Sure, I would have loved a few more episodes of the castaways just sitting around, cooking fish and rice, arguing, but Survivor got pretty old after a few seasons. We got more than we bargained for, and for that I am grateful.
The ties that bind are more often than not based on memories. Memories of events, people, relationships, daily life, and exceptional moments, both personal and historical. Our bodies are like scrapbooks. We write our memories all over our bodies in the course of a lifetime.
We are in the early days of lives lived at the edges of the virtual and the real. Notwithstanding the power of the computer screens we hold in our hands or the larger screens that now broadcast to us, all screens are flat and, in the case of the iPad, thin and beautiful. These mediated instances bring us closer to the people we love (through Skype or Facebook) and distance us at the same time from the physical pain and joy of touch and embodied dialogue. So close and yet so far away. As more information floods into our minds and bodies, and as more and more of the communications process is governed by mediators of greater and greater complexity, we have to start asking some hard questions about the fragile nature of what we are doing.
For example, much of the electronic information of the 1980s is lost. More importantly, so much of the material produced during that era cannot be easily transferred from its original form into more contemporary technologies. In fact, how much of the massive exchange of information that we are now producing will still be around and accessible ten years from now? The pace of change means that even if the information is available, will we be able to realize its importance? What interpretive tools will we be able to apply to processes that appear and disappear so quickly? History may provide us with narratives, but personal memories are unstructured and thus for the most part forgotten. Many people now have thousands of photographs stored on computers and hard drives. The challenge is how to manage all of that data. The even greater challenge is to link memories to the images as the pictures proliferate.
All of this is a roundabout way of saying that online communities of varying sorts are highly mediated not only by technology but by time. We tend to think of networks in spatial terms. Time is more difficult to picture because, in the case of Internet time, it is non-linear.
In other words, as we glance about picking this and that from the Twitter stream, or quickly reading a short piece on a web site and then just as quickly clicking through a series of links, we are creating a non-linear timeline. The results are more like a montage, abstract and real at the same time. I find it amusing that the Twitter stream is timed according to date and time of entry. Add hashtags and we are speedily scrolling through a web of links, comments and further comments. Even if the streams were preserved, the context would be lost. Even if our memories were perfect, the cumulative effect cannot be contained.
Non-linear processes are wonderful because they defy easy explanation. They cannot be packaged into neat or modular statements. So, the irony is that social media are drunk with the use of language and constrained by the fact that most of the discourse they produce is so specific to the moment that it cannot last. I am not one to argue for the end of history, but our memories are normally fragmentary, and even as we build narratives around those fragments, we lose far more than we retain. This raises a further point about the necessity of conversation through social media. Clearly, as an extension of existing friendships or as the base for building new ones, social media work. Conversations in this ever expanding universe are complex and of great utility to interlocutors. But the intensity of fragmentation has also been accelerated, and with it comes the danger of even greater loss.
Time is a strange creature. Virtual spaces make it seem as if time can be manipulated. (This is, after all, the central theme of William Gibson’s early work on cyberspace.) The interface of the real and the virtual makes it seem as if the preservation of memories can be achieved by archiving them. But no one anticipated the human obsession with data. How many of you would knowingly explore a three-year-old website? It just doesn’t feel relevant.
Social media are redefining this complex communications landscape. But what if that landscape has no solid geography? What if the history of its formation cannot be traced other than through a series of fragments that don’t connect? We are seeing the formation of a new kind of oral culture and as historians know, oral cultures retain the stories they want to hear and quickly dispense with everything else.
This is the last entry of this series. Follow me on Twitter @ronburnett
*Take a look at the video below*. The first cinematic encounter with wireless technologies from 1922!!
" Two women walk towards the camera on a city street. They stop beside a fire hydrant (this is presumably the United States of America). C/U of the women winding a wire around the top of the fire hydrant. One of the women holds a small box."
"It's Eve's portable wireless 'phone - in 1922." (from the Pathé archives)
The ties that bind connect people, families and communities but those ties remain limited and small in number however richly endowed they may appear to be within the context of discussions about social media. As I mentioned in my previous post, this is a fragile ecology that assumes among other things, that people will stay on top of their connections to each other and maintain the strength and frequency of their conversations over time.
It also means that the participatory qualities of social media can be sustained amidst the ever expanding information noise coming at individuals from many different sources. Remember, sharing information or even contributing to the production of information doesn’t mean that users will become more or less social. The assumption that social media users make is that they are being social because they are participating within various networks, but there is no way of knowing, other than through some really hard-edged research, whether that is really the case.
One of the most fascinating aspects of social media is what I would describe as statistical overload or inflation. “There are now more Facebook users in the Arab world than newspaper readers, a survey suggests. The research by Spot On Public Relations, a Dubai-based agency, says there are more than 15 million subscribers to the social network. The total number of newspaper copies in Arabic, English and French is just under 14 million.” (viewed on May 25, 2010). I am not sure how these figures were arrived at since no methodology was listed on their website. The company is essentially marketing itself by making these statistics available. There are hundreds of sites which make similar claims. Some of the more empirical studies that actually explain their methodologies still only sample a small number of users. Large scale studies will take years to complete.
The best way to think of this is to actually count the number of blogs that you visit on a regular basis or to look at the count of your tweets. Inevitably, there will be a narrowing not only of your range of interests but of the actual number of visits or tweets that you make in any given day. The point is that statistics of use tell us very little about reading, depth of concern or even effects.
The counter to this argument goes something like this. What about all those YouTube videos that have gone viral and have been seen by millions of viewers? Or, all the Blogs that so many people have developed? Or, the seemingly endless flow of tweets?
Jakob Nielsen at useit.com, who has been writing about usability for many years, makes the following claim. “In most online communities, 90% of users are lurkers who never contribute, 9% of users contribute a little, and 1% of users account for almost all the action. All large-scale, multi-user communities and online social networks that rely on users to contribute content or build services share one property: most users don't participate very much. Often, they simply lurk in the background. In contrast, a tiny minority of users usually accounts for a disproportionately large amount of the content and other system activity.” (viewed on May 25, 2010) Nielsen’s insights have to be taken seriously.
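Nielsen's 90-9-1 split can be illustrated with a quick simulation. This is a hypothetical sketch, not drawn from any real dataset: the group proportions and posting rates are assumptions chosen only to show how a tiny minority can end up producing almost all of the content.

```python
import random

random.seed(42)

def simulate(users=10000):
    """Assign each simulated user a post count under the 90-9-1 assumption."""
    posts = []
    for _ in range(users):
        r = random.random()
        if r < 0.90:
            posts.append(0)                       # lurkers: never post
        elif r < 0.99:
            posts.append(random.randint(1, 5))    # occasional contributors
        else:
            posts.append(random.randint(200, 500))  # heavy contributors
    return posts

posts = simulate()
total = sum(posts)
# Tally how much of all content comes from the most active 1% of users
top_1_percent = sum(sorted(posts, reverse=True)[: len(posts) // 100])
print(f"Share of content from the top 1%: {top_1_percent / total:.0%}")
```

With these made-up numbers, the most active 1% of simulated users produces the overwhelming majority of posts, which is the shape of the inequality Nielsen describes.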
The question is why we are engaging in this inflated talk about the effects and impact of social media. Part of the answer is the sheer excitement that comes from the mental images of all these people creating, participating, and speaking to each other, even if the number is smaller than we think. I see these mental images as projections, ways of looking at the world that more often than not align with our preconceptions rather than challenge them.
So, here is another worrying trend. When Facebook announces that 500 million people use its site(s), it suggests a significant explosion of desire to create and participate in some form of communications exchange. It says nothing about the content (except that Facebook has the algorithms to mine what we write) other than through the categories Facebook has, which do tend to define the nature of what we exchange. For example, many users list hundreds of friends, which becomes a telling sign of popularity and relevance. It is pretty clear that very few members of that group actually constitute the community of that individual. Yet there is an effect in having that many friends, and that effect is defined by status, activities and pictures as well as likes and dislikes.
None of this is necessarily negative. The problem with the figure 500 million is that it projects a gigantic network from which we as individuals can only draw small pieces of content. And, most of this network is of necessity virtual and detached from real encounters. This detachment is both what encourages communication and can also discourage social connections. This is why privacy is so important. It is also why the anti-Facebook movement is gathering strength. The honest desire to communicate has been supplanted by the systematic use of personal information for profit.
The Ties that Bind… The appearance of portable video in the late 1960s and early 1970s led to a variety of claims about the potential for community media. The most important claim was that video in the hands of community members would allow people in various disenfranchised communities to have a voice. This claim was always stated in contrast to mainstream media, which were viewed as one-way and intent on removing the rights of citizens to speak and be heard.
Keep in mind that communities are variously defined by the ties that bind people together. Cities are really agglomerations of villages, impersonal and personal at the same time. Urban environments are as much about the circulation of information as they are about the institutions that individuals share, work in and create. Cities are also very fragile environments largely dependent upon the good will of citizens at all levels of activity. So, communities change all of the time as do the means of communications that they use. There is a constant and ever widening and profoundly interactive exchange of information going on in any urban centre. The buzz is at many levels, from the most personal and familial to the public context of debate about local, national and international issues.
In the post-9/11 world, the two-way flow of information and communication has become even more central to urban life. It is not just the appearance and then massive increase in the use of mobile technologies that has altered what communities do and how they see themselves; it is the incessant commentaries by many different people on their own lives, on the lives of others, and on every aspect of the news that has altered both the mental and physical landscape we inhabit. All of this, however, is very fragile. In a world increasingly defined by the extended virtual spaces that we all use, social media platforms define the ties that bind.
In my last entry, I ended with the statement that only eleven percent of internet users actively engage with Twitter on a daily basis. Take a look at [this visualization](http://informationarchitects.jp/) and you will notice that there are 140 people or organizations that dominate Twitter usage. This doesn't mean that everyone else is not twittering; it just suggests that the community of relationships developed through Twitter is not as broad as one might imagine, nor is it as local as the notion of community would suggest. This idea of an extended space lengthens and widens the reach of a small number of people while everyone else essentially maintains the village approach to their usage. The key difference from earlier historical periods is that we imagine a far greater effect for our own words than is actually possible.
From time to time, such as during the Haiti crisis, the best elements of this new and extended social world come to the fore. However, if you take a hard look at some of the research on news blogs, you will discover that the vast majority link to legacy media and get most of their information from traditional sources. Even the categories used by bloggers retain the frameworks and terminology of the mainstream media.
Part of the irony here is that in order for blogs to move beyond these constraints, they would actually have to construct organizations capable of doing research and distinguishing between what is true and what is false. At the same time, the controlled anarchy of the Web allows information to seep through that might otherwise have been hidden or restrained. The total picture however is not as diverse as social media advocates would have us believe.
First let me say that I have really appreciated all of the carefully thought out comments sent in by readers. The last two entries including this one directly and indirectly reference your input.
Go to this site to follow the latest local news in your area. Much like Twitter, FWIX lets you follow the local news based on your interests. This is not dissimilar to the aggregate approach taken by many blogs like the Huffington Post and The Daily Beast. The core difference is that news sites select their bloggers, while FWIX relies on entries produced by locals. A site like NowPublic, which started in Vancouver but was bought out by a Denver-based investment firm, also relies on public participation, although there is a good deal more vetting than on other social news sites.
To what degree is the news different on these aggregate sites and what does this say about the use of media? What is the difference between traditional broadcast news and social news? Or, have we all become journalists, writers and commentators on the communities we live in and on the broader political stories that we share?
Part of what makes social media distinct is the *strength* of the ties between people and the stories and messages they exchange. The suggestion that living in a city makes you an expert on local stories depends on many factors not the least of which is what community you belong to, what your work is and where you live. There is no guarantee that being a local confers any greater depth upon a writer or observer. In fact, in some instances the opposite claim can be made. I would suggest that social news broadens the base of potential stories but that the vast majority of what is published is essentially hearsay. In general, with some exceptions, social news sites become a reflection of a small number of users and writers who effectively take on the job for the community of readers.
Digg uses submissions from readers to build a picture of the importance of some topics over others. Numbers count. In 2006 it became apparent that a small number of writers were manipulating the ratings in order to dominate not only the trends of the time, but also to promote their own blogs. An investigation showed that thirty users had taken over.
The internal picture that we have of the Internet makes it appear as if everything we do and say within its confines will have an audience. The network is so large, that news aggregation in particular gives off the impression of connectivity and currency. There is no obvious way of testing these claims other than through a quantitative analysis of visitors and some in-depth studies of usage patterns and learning experiences. Rating a story is not good enough. Feedback is essential to the lifeblood of social news but in reality only a few sites attract the traffic to make them relevant.
This is where Twitter comes in. The brilliance of this short messaging system was all too obvious during the crisis in Iran last year. It has also been very useful in other crisis situations in Africa and Asia. No claims are made to journalistic truth. Twitter entries are newsy without all the baggage of the news attached to them. Recent events in Thailand bore this out, as protesters were able to keep track of their own and the police's movements throughout Bangkok and news agencies used the Twitter entries to explain what was happening.
However, let's delve a bit more deeply into this. The following quote may articulate some of the complications here:
While the standard definition of a social network embodies the notion of all the people with whom one shares a social relationship, in reality people interact with very few of those "listed" as part of their network. One important reason behind this fact is that attention is the scarce resource in the age of the web. Users faced with many daily tasks and large number of social links default to interacting with those few that matter and that reciprocate their attention. For example, a recent study of Facebook showed that users only poke and message a small number of people while they have a large number of declared friends. And a casual search through recent calls made through any mobile phone usually reveals that a small percentage of the contacts stored in the phone are frequently contacted by the user.

(Bernardo A. Huberman, Daniel M. Romero and Fang Wu, Social Computing Lab, HP Laboratories, arXiv:0812.1045v1 [cs.CY], 4 Dec 2008)
In the same article, the authors talk about how, after analyzing thousands of Twitter users, they came to the conclusion that even with a large following, the central motivating factor in most tweets is to keep friends and family updated on both personal and public news. Their analysis also showed that the number of friends and family involved in the exchanges was quite small. Once again, the overall size of the network as a whole is making it appear as if more is actually going on than is possible given the daily habits of most users. As it turns out, a tiny number of Twitter personalities and sites gather in most of the usage. As with the news, over time, readers will default to a small number of acceptable sources.
A December 2008 Pew study showed that eleven percent of Americans who are online use Twitter. The mental image we have is of something far larger going on, and guess where that has come from? Broadcast media — in other words, television — and the twenty or so most visited news sites on the web, which are also the most traditional.
More on this in my next posting. Follow me on Twitter @ronburnett
The previous sections of Are social media, social? have examined a variety of sometimes complex and often simple elements within the world of social media. Let me now turn to one of the most important issues in this growing phenomenon.
What do we mean by social? Social is one of those words that is used in so many different ways and in so many different contexts that its meaning is now as variable as the individuals who make use of it. Of course, the literal meaning of social seems to be obvious, that is, people associating with each other to form groups, alliances or associations. A secondary assumption in the use of social is descriptive, and it is about people who ally with each other and have enough in common to identify themselves with a particular group.
Social as a term is about relationships, and relationships are inevitably about boundaries. Think of it this way. Groups, for better or worse, mark out their identities through language and their activities. Specific groups will have specific identities; other groups will be a bit more vague in order to attract lurkers and those on the margins. All groups end up defining themselves in one way or another. Those definitions can be as simple as a name or as complex as a broad-based activity with many layers and many sub-groups.
Identity is the key here. Any number of different identities can be expressed through social media, but a number of core assumptions remain. First, I will not be part of a group that I disagree with and second, I will not want to identify myself with a group that has beliefs that are diametrically opposed to my own. So, in this instance social comes to mean commonality.
Commonality of thought, ideology and interests, linked to the communal: a blending of interests, concerns and outlooks. So, social as a term is about blending differences into ways of thinking and living, and blending shared concerns into language so that people in groups can understand each other. The best current example of this is the Tea Party movement in the US. The driving energy in posts and blogs among the people who share the ideology of the Tea Party is based on solidifying shared assumptions, defining the enemy and consolidating dissent within the group.
In this process, a great deal has to be glossed over. The social space of conversation is dominated by a variety of metaphors that don't change. Keep in mind that commonality is based on a negation, that is, containing differences of opinion. And so we see, in formation, the development of ideology — a set of constraints with solid boundaries that adherents cannot diverge from. Put another way, why follow a group if you disagree with everything that they say? Of course, Tea Party has its own resonances, which are symbolic and steeped in American history.
The danger in the simple uses of the word social should be obvious. Why, you may ask should we deconstruct such a 'common' word? Well, that may become more obvious when I make some suggestions about the use of media in social media. Stay tuned.
In the 1930s radio was a crucial part of European and North American culture. It was a medium that anyone could listen to, and many people did. It was also a medium that was used by the Nazis, for example, as a propaganda vehicle. Communications systems are by their very nature open to abuse as well as good. Radio is now one of the most important media used in Africa for learning at a distance.
I bring up what seems like an archaic medium, radio, to suggest that social networks have always been at the heart of the many different ways in which humans communicate with each other. In each historical instance, as a new medium has appeared, there has been an exponential increase in the size of networks and the manner in which messages and information have been exchanged.
These increases have radiated outwards like a series of concentric circles sometimes encapsulating older forms and other times disrupting them. The fundamental desire to reach out and be understood remains the same. This is what we as humans do, even in our worst moments. We primarily use language and then layer other media not so much on top of language but within its very structure. The brilliance of working with 140 characters is that it takes us back (and may well be pushing us forward) to poetry. The psychology of engaging with an economy of words within the cacophony of messages directed towards us each moment of every day is encouraging a more precise appreciation of the power of individual words. In this sense, I am very heavily on the side of Twitter.
At the same time, Twitter is not a revolution. Information in whatever form, depending on context, can be dangerous or benign. But information exists in a very precise fashion within a defined context. Notice that the Twitterati in general identify themselves. Twitter is somewhere in between text messages and instant messages, an interlude that connects events and experiences through the web as a hub. Early on Twitter was described as microblogging. My next post will look at blogging and what has happened to the many claims made about it when blogging first appeared.
Heidi May has produced some important comments on the previous entries of Are Social Media, Social? May suggested a link to Network, A Networked Book about Network Art which is a fascinating example of the extensions that are possible when communities of interest establish a context to work together and collaborate. Heidi May also asks about the Diaspora project. Diaspora will attempt to build an open source version of Facebook. I wish them luck. This is an essential move to broaden the scope and expectations that we have about the role and usage of social networks, about privacy and most importantly about controlling the very code that governs how we relate within virtual spaces.
A good example of some of the challenges that we face within networked environments is what happened to the famous German philosopher, Jürgen Habermas. “In January, one of the world’s leading intellectuals fell prey to an internet hoax. An anonymous prankster set up a fake Twitter feed purporting to be by Jürgen Habermas, professor emeritus of philosophy at the Johann Wolfgang Goethe University of Frankfurt. ‘It irritated me because the sender’s identity was a fake,’ Habermas told me recently. Like Apple co-founder Steve Jobs, Zimbabwean president Robert Mugabe and former US secretary of state Condoleezza Rice before him, Habermas had been ‘twitterjacked’.” Stuart Jeffries, Financial Times, April 30, 2010.
As it turns out, the hoax account was removed, but not before the individual responsible was identified and apologized. Subsequently, Habermas was interviewed and made this comment:
“The internet generates a centrifugal force,” Habermas says. “It releases an anarchic wave of highly fragmented circuits of communication that infrequently overlap. Of course, the spontaneous and egalitarian nature of unlimited communication can have subversive effects under authoritarian regimes. But the web itself does not produce any public spheres. Its structure is not suited to focusing the attention of a dispersed public of citizens who form opinions simultaneously on the same topics and contributions which have been scrutinised and filtered by experts.”
Habermas suggests that power resides with the State even when social networks bring people together to protest and demonstrate. The results of these engagements are contingent and don’t necessarily lead to change or to the enlargement of the public sphere.
The question is how does the public become enlightened? What conditions will allow for and encourage rich interchanges that will drive new perceptions of power and new ideas about power relations?
The general assumption is that social networks facilitate the growth of constructive public debate. Yet, if that were true, how can one explain the nature of the debates in the US around health care, which were characterized by some of the most vitriolic exchanges in a generation? How do we explain the restrictive and generally anti-immigrant laws introduced by the state of Arizona? The utopian view of social networks tends to gloss over these contradictions. Yes, it is true that Twitter was banned in Iran during the popular uprising last year to prevent protestors from communicating with each other. Yes, social media can be used for good and bad. There is nothing inherent in social networks, nothing latent within their structure, that prevents them from being used for enhanced exchange and debate. For debates to be public, however, there has to be a sense that the debates are visible to a variety of different constituencies. The challenge is that the networks are not visible to each other; mapping them produces interesting lattice-like structures, but these say very little about the contents of the interactions.
The overall effect could be described as mythic, since we cannot connect to ten thousand people or know what they are saying to each other. At a minimum, the public sphere takes on a visible face through traditional forms of broadcast that can be experienced simultaneously by many different people. Twitter, on the other hand, allows us to see trends, but that may often not be enough to make a judgment about currency and our capacity to intervene. Is the headline structure of Twitter enough? Should it be?
The computer screen remains the main interface and mediator between the movement of ideas from discourse to action. And, as I have discussed in previous posts, networks are abstracted instances of complex, quantitatively driven relationships. We need more research, and perhaps establishing a social network devoted to this would help, on whether social media are actually driving us towards increasingly fragmented forms of interaction. A few questions. How many of your followers have you met? How many people leave comments on your blog, and what is the relationship between hits and comments? Beyond the ten or so web sites that everyone visits, how many users have settled into a regular routine not unlike the bulletin boards of old?
The recent election campaign won by President Obama, in which social media played a formidable role, suggests that my questions may have no pertinence to his success. Consumer campaigns and boycotts, made all the more practical and possible by social networks, suggest the opposite of what I am saying. The potential intimacy of dialogues among strangers working together to figure out problems and meet challenges may contradict my intuition that these are variations on existing networks, albeit with some dramatic enhancements.
A final thought. We often talk about the speed with which these phenomena develop without referencing their predecessors. For example, if the Web is just an extension of bulletin boards and HyperCard-like systems, then we need to understand how that continuity has been built and upon what premises. If Twitter is an extension of daily conversation and is helping to build the public sphere, then we need more research on what is being said, and we need to actually examine whether tweets translate into action.
Some non-profits are using Social Media for real results. They are raising the profiles of their charities as well as increasing the brand awareness of their work. They are connecting with a variety of communities inside and outside of their home environments. In the process, Twitter is enabling a variety of exchanges, many of which would not happen without the easy access that Twitter provides. These are examples of growth and change through the movement of ideas and projects. Twitter posts remind me of short telegrams and, as it turns out, that may well be the reason the 140-character limit works so well. Social networks facilitate new forms of interaction and often unanticipated contacts. It is in the nature of networks to create nodes, to generate relationships, and to encourage intercommunication. That is, after all, one of the key definitions of networks.
Alexandra Samuel suggests: “But here’s what’s different: you, as an audience member, can decide how social you want your social media to be. If you’re reading a newspaper or watching TV, you can talk back — shake your fist in the air! send a letter to the editor! — or you can talk about it (inviting friends to watch the game with you, chatting about the latest story over your morning coffee). But the opportunities for conversation and engagement don’t vary much from story to story, or content provider to content provider. On the social web, there are still lots of people who are using Twitter to have conversations, who are asking for your comments on that YouTube video, who are enabling — and participating in — wide-ranging conversations via blog and Facebook. You can engage with the people, organizations and brands who want to hear from you…or you can go back to being a passive broadcastee.”
These are crucial points, a synopsis of sorts of the foundational assumptions in the Twitterverse and the Blogosphere. At their root is an inference, or even an assertion, about traditional media that needs to be thought through. Traditional media are always portrayed as producing passive experiences, or at least experiences that are not as intensely interactive as those of social media.
Let’s reel back a bit. Take an iconic event like the assassination of John F. Kennedy. That was a broadcast event that everyone alive at the time experienced in a deeply personal fashion. The tears, the pain, people walking the streets of Washington and elsewhere in a daze: all of this was part and parcel of a series of complex reactions, as much social as private. Or 9/11, which was watched in real time within a broadcast context. People were on the phone with each other all over the world. Families watched and cried. I could go on and on. It is not the medium that induces passivity, but what we do with the experiences.
So, Twitter and most social media are simply *extensions* of existing forms of communication. This is not in any way to downplay their importance. It is simply to suggest that each generation seems to take ownership of its media as if history and continuity are not part of the process. Or, to put it another way, the telegram and the telegraph were as important to 19th-century society as the telephone was to the middle of the 20th century.
In part one of this essay, I linked Twitter and gossip. Gossip was fundamental to the 17th century and could lead to the building or destruction of careers. Gossip was a crucial aspect of the Dreyfus affair. Gossip has brought down movie stars and politicians. The reality is that all media are interactive, and the notion of the passive viewer was an invention of marketers to simplify the complexity of communications between images and people, between people and what they watch, and between advertisers and their market.
For some reason, the marketing model of communications has won the day, making it seem as if we need more and more complex forms of interaction to achieve rich yet simple experiences. All forms of communications, to varying degrees, are about interaction at different levels. Every form of communication begins with conversations and radiates outwards to media and then loops back. There is an exquisite beauty to this endless loop of information, talk, discussion, blogging, twittering and talking some more. The continuity between all of the parts is what makes communications processes so rich and engaging.
Okay. Lots of responses to my previous entry. Like I said at the end of the article, I am not trying to be negative. I am actually responding to the profoundly important critique of the digitally induced and digested world of communications that Jaron Lanier distills in his recent book, You Are Not a Gadget.
Mashable, a great web site, has an article entitled 21 Essential Social Media Resources You May Have Missed. Most of what the article describes is very important. This is truly the utopian side of the highly mediated universe that we now inhabit. But, as Lanier suggests, mediation does come with risks, not the least of which is a loss of identity. Who am I in the Twitterverse, or even within the confines of this blog? And why would you want to know?
According to Lanier, "A new generation has come of age with a reduced expectation of what a person can be, and of who each person might become." (I can't give you a page number because my Kindle doesn't show page numbers! Location 50-65, whatever that means.) The Mashable article would seem to contradict Lanier, describing as it does many instances of Social Media use that have genuinely benefitted a pretty large number of people. What Lanier is getting at goes beyond these immediate examples. He talks at length about a lock-in effect that comes from the repeated use of certain modes of thought and action within the virtual confines of a computer screen.
He is somewhat of a romantic, talking about the need for mystery and asking what cannot be represented by a computer. This is an important issue. The underlying structure of the web, and of the social media that piggyback on that structure, is pretty much the same as it was when Tim Berners-Lee transformed the ideas behind hypertext systems like the old Apple HyperCard into something far grander.
UNIX is core to the operating systems of most computers, and its command-line interface has not evolved that much since the 1980s. Open up the Terminal program on a Mac and take a look at it. Lanier's point is that this says something about how we use computers. Most people cannot change the underlying system that has been put in place. That is why open source programming is so exciting. But even open source software is developed by very few people.
Could we, for example, develop our own Twitter-like client? Could we, should we, become programmers with enough savvy to create a new and less commercially oriented version of Facebook? Even the SDK for the iPhone and the iPad requires a massive time investment if you want to learn how to develop an App. Yes, you can follow a set of instructions, but no, you cannot recreate the SDK to make it your own.
Now, some would say that the use of this software is more important than its underlying language. However, imagine applying that same principle to speech and to creativity. This is not about tools. This is about the structure, the embedded nature of the mechanisms that allow things to happen. And, as Lanier suggests, most people have been experiencing digital technology without understanding how that structure may influence their usage of the technology.
I have been thinking a great deal about social media these days not only because of their importance, but also because of their ubiquity. There are some fundamental contradictions at work here that need more discussion. Let's take Twitter. Some people have thousands of followers. What exactly are they following? And more crucially, what does the word follow mean in this context?
Twitter is an endless flow of news and links between friends and strangers. It allows and sometimes encourages exchanges that have varying degrees of value. Twitter is also a tool for people who don't know each other to learn about shared interests. These are valuable aspects of this tightly wrought medium that tend towards the interactivity of human conversation.
On the other hand, Twitter, like many blogs, is really a broadcast medium. Sure, followers can respond. And sometimes comments on blog entries suggest that a "reading" has taken place. But individual exchanges in both media tend to be short, anecdotal and piecemeal.
The general argument around the value of social media is that at least people can respond to the circulation of conversations, and that larger and larger circles of people can form to generate varied and often complex interactions. But responses of the nature and shortness that characterize Twitter are more like fragments: reactions that, in their totality, may say important things about what we are thinking, but that within the immediate context of their publication are, at best, broken sentences, declarative statements without the consequences that often arise during interpersonal discussions. So, on Twitter we can make claims or state what we feel with few of the direct results that might occur if we had to face our ‘followers’ in person.
Blogs and web sites live and die because they can trace and often declare the number of ‘hits’ they receive. What exactly is a hit? Hit is actually an interesting word, since its original meaning was to come upon something and to meet with…. In the 21st century, hits are about visits, and the more visits you have, the more likely you are to have an important web presence. Dig into Google Analytics and you will notice that it actually counts the amount of time ‘hitters’ spend on sites. The average across many sites is no more than a few seconds. Does this mean that a hit is really a glance? And what are the implications of glancing at this and that over the period of a day or a month? A glance is by definition short (like Twitter) and quickly forgotten. You don’t spend a long time glancing at someone.
Let’s look at the term Twitter a bit more closely. It is a noun that means “tremulous excitement.” But its real origins are related to gossiping. And gossiping is very much about voyeurism. There is also a pejorative sense to Twitter: chattering, chattering on and on about the same thing. So, we are atwitter with excitement about social media because they seem to extend our capacity to gossip about nearly everything, which may explain why Justin Bieber has been at the top of discussions within the twitterverse. I am Canadian and so is he. Enough said.
Back to follow for a moment. To follow also means to pursue. I will, for example, twitter about this blog entry in an effort to increase the readership of this article. In a sense, I want you, the reader, to pursue your interest in social media with enough energy to actually read this piece! To follow also means to align oneself, to be a follower. You may as a result wish to pursue me @ronburnett.
But the real intent of the word follow is to create a following. And the real intent of talking about hits is to increase the number of followers. All in all, this is about convincing people that you have something important and valuable to say, which means that social media are also about advertising and marketing. This explains why businesses are justifiably interested in using social media and why governments are entering the blogosphere and the twitterverse in such great numbers.
Here is the irony. After a while, the sheer quantity of tweets means that the circle of glances has to narrow. Trends become more important than the actual content. Quantity rules, just as it does with Google, where the greater the number of hits, the more likely you are to have a site that advertisers want to use. Remember, advertisers assume that a glance will have the impact they need to make you notice that their products exist. It is worth noting that glancing is also derived from a word meaning slippery.
As the circle of glances narrows, the interactions take on a fairly predictable tone with content that is for the most part, newsy and narcissistic. I am not trying to be negative here. Twitter me and find out.