Digital Culture Notes: Part Two

E-Books, iPads and Digital Things

 

Much has been made of the iPad’s possible influence on the future of reading and writing. Many of the fears about the disappearance of physical books are justified, just as the worries about the future of newspapers need to be taken very seriously. There is no doubt that we have entered an unstable period of change as various traditional forms of media shift to accommodate the impact of the Internet and digital culture in general.

However, the idea that books will disappear or that newspapers will lose their relevance because they may have to shift to devices like the iPad is naïve at best and alarmist at worst. After all, books are really just pages and pages of discourse, sometimes fictional, often not. All the genres that make up what we call the modern novel are not dependent on the physical boundaries established by traditional book production. In fact, an argument can be made that the processes through which books have been brought to the attention of the reading public (ads, publicity campaigns and so on) are more in danger of dying than the books themselves. There is only one way in which books will die, and that is if we cease to speak or if we shift so dramatically to an oral culture that the written word becomes redundant.

An argument could be made that people inundated by many different forms of media expression will relegate books to the attics in their homes and in their minds. And a further argument could be made that the decline of reading has been happening for some time, if we look at the number of books sold over the last decade. There is a real danger that books and the reading public will shrink even further.

Nevertheless, my sense is that reading has morphed onto the Web and other media, and that reading is now more about glances and headlines than about the in-depth absorption of texts. We now have a multimedia space that links all manner of images with texts and vice-versa. The nature of content is shifting, as are the venues in which that content can be read. The design of graphical spaces is often more important than words. Texts on the iPad can be embedded with moving images, sounds and so on, in much the same manner as we now do with web pages. However, this phantasmagoria of elements is still governed by language, discourse and expression.

Matt Richtel has an article in the New York Times that examines the interaction of all of these divergent media on users. “At home, people consume 12 hours of media a day on average, when an hour spent with, say, the Internet and TV simultaneously counts as two hours. That compares with five hours in 1960, say researchers at the University of California, San Diego. Computer users visit an average of 40 Web sites a day, according to research by RescueTime, which offers time-management tools.” Richtel suggests that the intensity of these activities and the need to multitask are rewiring the human brain. I am not able to judge whether that is true, but regardless, it would be foolhardy not to recognize that all of this activity increases the speed of interaction. Clearly, reading a non-fiction book is not about speed, and books in general cannot be read in the same way as we read web pages, especially if we are looking at book content on mobile phones.

The same can be said for newspapers, which over the years have been designed to entice readers into their pages through headlines, slowing down the tendency to glance or scan. This tells us something about the challenges of print. We tend to assume that the existence of a newspaper means that it is read. But there has always been a problem with attention spans. Newspapers are as much about a quick read as web pages are. Newspapers are generally read in a short time, on buses or trains — talk about multitasking.

As it turns out, this is very much the same for many genres of the novel, from thrillers to the millions of potboilers that people read and that are not generally counted when reference is made to the reading public. In fact, the speed of reading has accelerated over the last hundred years, in large measure because of the increased amount of information that has become available and the need to keep up.

This is where e-books and the iPad come in. E-books are an amazing extension of books in general, another and important vehicle for the spread of ideas. The iPad will make it possible (if authors so desire) to extend their use of words into new realms. Remember, when the cinema was invented in 1895, among the very first comments in the British Parliament was that moving images would destroy theatre, books and music. Instead, the cinema has extended the role of all of these forms, either through adaptation or integration. Writers remain integral to all media.

 

Digital Culture Notes (First of a series)

Recently, I have been thinking about the material nature of digital culture, perhaps best exemplified by the Web and its intensely spatial nature. The Internet is often understood by imagining or visualizing a vast lattice of lines connecting across the globe. Of course, latticeworks are by their very nature architectural: points in space connected by technologies, the built environment. At the same time as it creates the possibility of virtual interaction, the Internet is also very material. This materiality comes from the wires, servers, buildings and routers that form and shape the experiences of interaction without being in the foreground.

Web pages are designed using boxes for texts and images. The underlying HTML code for the Web is hidden, but the manner in which a web page draws content from servers is visible every time we click. Writing for web pages is a material practice. The immateriality of computer screens is offset by the concrete nature of keyboards and mice. The iPad, for example, is an object, although probably one of the most powerful objects I have ever held. The iPad hovers between its physical presence and the intense manner in which one’s eyes are drawn into its images, into the screen-based worlds of games and photographs. Apps reach out from the screen into our daily lives, either organizing our time or allowing us to write on glass enclosures. Software is written and tested within a material universe of employment and job pressures, including sometimes unreasonable expectations of productivity.

The aerials that make Wi-Fi possible are produced in factories as are Apple computers. The assembly line in a computer factory looks like something out of the 19th century. Some of the most exotic minerals in the world are needed to make our computers and their screens work correctly. Many of those minerals are found in China and Africa. More often than not the working conditions for extraction and processing are terrible.

Materiality is not something that disappears because we now have so many ways in which to experience the world through virtual means. One of the criticisms of human relations on the Internet is that because distance plays a significant role, there is likely something very superficial about the communications process. It is true that the Internet makes communications across varying distances not only possible but, as with Facebook, promotes interactions that are not face to face. And there are dangers in living a life in front of a screen, just as there were dangers in spending too much time on the telephone or watching too much television. There is nothing inherent to the technology that sustains the manner in which it is used. There is something in the technology that attracts use from that material universe of people and communities.

The environments we share have never been pristine and civilizations have been built on the interactions between humans and their technologies.

Part Two...

 

 

 


Are Social Media, Social? (Part Nine)

The ties that bind connect people, families and communities, but those ties remain limited and small in number, however richly endowed they may appear to be within the context of discussions about social media. As I mentioned in my previous post, this is a fragile ecology that assumes, among other things, that people will stay on top of their connections to each other and maintain the strength and frequency of their conversations over time.

It also assumes that the participatory qualities of social media can be sustained amidst the ever-expanding information noise coming at individuals from many different sources. Remember, sharing information or even contributing to the production of information doesn’t mean that users will become more or less social. The assumption that social media users make is that they are being social because they are participating within various networks, but there is no way of knowing, other than through some really hard-edged research, whether that is really the case.

One of the most fascinating aspects of social media is what I would describe as statistical overload or inflation. “There are now more Facebook users in the Arab world than newspaper readers, a survey suggests. The research by Spot On Public Relations, a Dubai-based agency, says there are more than 15 million subscribers to the social network. The total number of newspaper copies in Arabic, English and French is just under 14 million.” (viewed on May 25, 2010). I am not sure how these figures were arrived at since no methodology was listed on their website. The company is essentially marketing itself by making these statistics available. There are hundreds of sites which make similar claims. Some of the more empirical studies that actually explain their methodologies still only sample a small number of users. Large scale studies will take years to complete.

The best way to think of this is to actually count the number of blogs that you visit on a regular basis or to look at the count of your tweets. Inevitably, there will be a narrowing not only of your range of interests but of the actual number of visits or tweets that you make in any given day. The point is that statistics of use tell us very little about reading, depth of concern or even effects.

The counter to this argument goes something like this. What about all those YouTube videos that have gone viral and have been seen by millions of viewers? Or, all the Blogs that so many people have developed? Or, the seemingly endless flow of tweets?

Jakob Nielsen at useit.com, who has been writing about usability for many years, makes the following claim: “In most online communities, 90% of users are lurkers who never contribute, 9% of users contribute a little, and 1% of users account for almost all the action. All large-scale, multi-user communities and online social networks that rely on users to contribute content or build services share one property: most users don't participate very much. Often, they simply lurk in the background. In contrast, a tiny minority of users usually accounts for a disproportionately large amount of the content and other system activity.” (viewed on May 25, 2010) Nielsen’s insights have to be taken seriously.

The question is: why are we engaging in this inflated talk about the effects and impact of social media? Part of the answer is the sheer excitement that comes from the mental images of all these people creating, participating and speaking to each other, even if their number is smaller than we think. I see these mental images as projections, ways of looking at the world that more often than not align with our preconceptions rather than challenge them.

So, here is another worrying trend. When Facebook announces 500 million people using its site(s), this suggests a significant explosion of the desire to create and participate in some form of communications exchange. It says nothing about the content (except that Facebook has the algorithms to mine what we write) other than through the categories Facebook provides, which do tend to define the nature of what we exchange. For example, many users list hundreds of friends, which becomes a telling sign of popularity and relevance. It is pretty clear that very few members of that group actually constitute the community of that individual. Yet there is an effect in having that many friends, and that effect is defined by status, activities and pictures, as well as likes and dislikes.

None of this is necessarily negative. The problem with the figure 500 million is that it projects a gigantic network from which we as individuals can only draw small pieces of content. And, most of this network is of necessity virtual and detached from real encounters. This detachment is both what encourages communication and can also discourage social connections. This is why privacy is so important. It is also why the anti-Facebook movement is gathering strength. The honest desire to communicate has been supplanted by the systematic use of personal information for profit.

Part Ten... Follow me on Twitter @ronburnett

 


Are social media, social? (Part Six)

The previous sections of Are social media, social? have examined a variety of sometimes complex and often simple elements within the world of social media. Let me now turn to one of the most important issues in this growing phenomenon.

What do we mean by social? Social is one of those words that is used in so many different ways and in so many different contexts that its meaning is now as variable as the individuals who make use of it. Of course, the literal meaning of social seems to be obvious, that is people associating with each other to form groups, alliances or associations. A secondary assumption in the use of social is descriptive and it is about people who ally with each other and have enough in common to identify themselves with a particular group.

Social as a term is about relationships and relationships are inevitably about boundaries. Think of it this way. Groups for better or worse mark out their identities through language and their activities. Specific groups will have specific identities, other groups will be a bit more vague in order to attract lurkers and those on the margins. All groups end up defining themselves in one way or another. Those definitions can be as simple as a name or as complex as a broad-based activity with many layers and many sub-groups.

Identity is the key here. Any number of different identities can be expressed through social media, but a number of core assumptions remain. First, I will not be part of a group that I disagree with and second, I will not want to identify myself with a group that has beliefs that are diametrically opposed to my own. So, in this instance social comes to mean commonality.

This commonality of thought, ideology and interests is linked to the communal, a blending of interests, concerns and outlooks. So, social as a term is about blending differences into shared ways of thinking and living, and blending shared concerns into language so that people in groups can understand each other. The best current example of this is the Tea Party movement in the US. The driving energy in posts and blogs among the people who share the ideology of the Tea Party is based on solidifying shared assumptions, defining the enemy and consolidating dissent within the group.

In this process, a great deal has to be glossed over. The social space of conversation is dominated by a variety of metaphors that don't change. Keep in mind that commonality is based on a negation, that is, on containing differences of opinion. And so we see, in formation, the development of ideology — a set of constraints with solid boundaries from which adherents cannot diverge. Put another way, why follow a group if you disagree with everything it says? Of course, the Tea Party has its own resonances, symbolic and steeped in American history.

The danger in the simple uses of the word social should be obvious. Why, you may ask should we deconstruct such a 'common' word? Well, that may become more obvious when I make some suggestions about the use of media in social media. Stay tuned.

Part Seven 

 

 


Are social media, social? (Part Five)

In the 1930s, radio was a crucial part of European and North American culture. It was a medium that anyone could listen to, and many people did. It was also a medium that was used by the Nazis, for example, as a propaganda vehicle. Communications systems are by their very nature open to abuse as well as to good. Radio is now one of the most important media used in Africa for learning at a distance.

I bring up what seems like an archaic medium, radio, to suggest that social networks have always been at the heart of the many different ways in which humans communicate with each other. In each historical instance, as a new medium has appeared, there has been an exponential increase in the size of networks and the manner in which messages and information have been exchanged.

These increases have radiated outwards like a series of concentric circles sometimes encapsulating older forms and other times disrupting them. The fundamental desire to reach out and be understood remains the same. This is what we as humans do, even in our worst moments. We primarily use language and then layer other media not so much on top of language but within its very structure. The brilliance of working with 140 characters is that it takes us back (and may well be pushing us forward) to poetry. The psychology of engaging with an economy of words within the cacophony of messages directed towards us each moment of every day is encouraging a more precise appreciation of the power of individual words. In this sense, I am very heavily on the side of Twitter.

At the same time, Twitter is not a revolution. Information in whatever form, depending on context, can be dangerous or benign. But information exists in a very precise fashion within a defined context. Notice that the Twitterati in general identify themselves. Twitter is somewhere in between text messages and instant messages, an interlude that connects events and experiences through the web as a hub. Early on Twitter was described as microblogging. My next post will look at blogging and what has happened to the many claims made about it when blogging first appeared.

Part Six...

Are social media, social? (Part Three)

Some non-profits are using social media for real results. They are raising the profiles of their charities as well as increasing the brand awareness of their work. They are connecting with a variety of communities inside and outside of their home environments. In the process, Twitter is enabling a variety of exchanges, many of which would not happen without the easy access that Twitter provides. These are examples of growth and change through the movement of ideas and projects. Twitter posts remind me of short telegrams, and as it turns out, that may well be the reason the 140-character limit works so well. Social networks facilitate new forms of interaction and often unanticipated contacts. It is in the nature of networks to create nodes, to generate relationships, and to encourage intercommunication. That is, after all, one of the key definitions of networks.

Alexandra Samuel suggests: “But here’s what’s different: you, as an audience member, can decide how social you want your social media to be. If you’re reading a newspaper or watching TV, you can talk back — shake your fist in the air! send a letter to the editor! — or you can talk about it (inviting friends to watch the game with you, chatting about the latest story over your morning coffee). But the opportunities for conversation and engagement don’t vary much from story to story, or content provider to content provider. On the social web, there are still lots of people who are using Twitter to have conversations, who are asking for your comments on that YouTube video, who are enabling — and participating in — wide-ranging conversations via blog and Facebook. You can engage with the people, organizations and brands who want to hear from you…or you can go back to being a passive broadcastee.”

These are crucial points, a synopsis of sorts of the foundational assumptions in the Twitterverse and the Blogosphere. At their root is an inference or even assertion about traditional media that needs to be thought about. Traditional media are always portrayed as producing passive experiences or at least not as intensely interactive as social media.

Let’s reel back a bit. Take an iconic event like the assassination of John F. Kennedy. That was a broadcast event that everyone alive at the time experienced in a deeply personal fashion. The tears, the pain, people walking the streets of Washington and elsewhere in a daze — all of this was part and parcel of a series of complex reactions, as much social as private. Or take 9/11, which was watched in real time within a broadcast context. People were on the phone with each other all over the world. Families watched and cried. I could go on and on. It is not the medium which induces passivity, but what we do with the experiences.

So, Twitter and most social media are simply *extensions* of existing forms of communication. This is not in any way to downplay their importance. It is simply to suggest that each generation seems to take ownership of its media as if history and continuity were not part of the process. Or, to put it another way, the telegraph was as important to 19th-century society as the telephone was to the middle of the 20th century.

In part one of this essay, I linked Twitter and gossip. Gossip was fundamental to the 17th century and could lead to the building or destruction of careers. Gossip was a crucial aspect of the Dreyfus affair. Gossip has brought down movie stars and politicians. The reality is that all media are interactive and the notion of the passive viewer was an invention of marketers to simplify the complexity of communications between images and people, between people and what they watch and between advertisers and their market.

For some reason, the marketing model of communications has won the day making it seem as if we need more and more complex forms of interaction to achieve or arrive at rich yet simple experiences. All forms of communications to varying degrees are about interaction at different levels. Every form of communication begins with conversations and radiates outwards to media and then loops back. There is an exquisite beauty to this endless loop of information, talk, discussion, blogging, twittering and talking some more. The continuity between all of the parts is what makes communications processes so rich and engaging.

Part Four

Are social media, social? (Part Two)

Okay. Lots of responses to my previous entry. Like I said at the end of the article, I am not trying to be negative. I am actually responding to the profoundly important critique of the digitally induced and digested world of communications that Jaron Lanier distills in his recent book, You Are Not a Gadget.

Mashable, a great web site, has an article entitled 21 Essential Social Media Resources You May Have Missed. Most of what the article describes is very important. This is truly the utopian side of the highly mediated universe that we now inhabit. But, as Lanier suggests, mediation does come with risks, not the least of which is a loss of identity. Who am I in the Twitterverse, or even within the confines of this blog? And why would you want to know?

According to Lanier, "A new generation has come of age with a reduced expectation of what a person can be, and of who each person might become." (I can't give you a page number because my Kindle doesn't show page numbers! Location 50-65, whatever that means.) The Mashable article would seem to contradict Lanier, describing as it does many instances of social media use that have genuinely benefitted a pretty large number of people. What Lanier is getting at goes beyond these immediate examples. He talks at length about a lock-in effect that comes from the repeated use of certain modes of thought and action within the virtual confines of a computer screen.

He is somewhat of a romantic, talking about the need for mystery and asking what cannot be represented by a computer. This is an important issue. The underlying structure of the web, and of the social media that piggyback on that structure, is pretty much the same as it was when Tim Berners-Lee transformed the old Apple HyperCard system into something far grander.

UNIX is at the core of the operating systems of most computers, and its command-line conventions have not evolved much since the 1980s. Open up the Terminal program on a Mac and take a look. Lanier's point is that this says something about how we use computers. Most people cannot change the underlying system that has been put in place. That is why open source programming is so exciting. But even open source software is developed by very few people.

Could we for example develop our own Twitter-like client? Could we, should we become programmers with enough savvy to create a new and less commercially oriented version of Facebook? Even the SDK for the iPhone and the iPad requires a massive time investment if you want to learn how to develop an App. Yes, you can follow a set of instructions, but no you cannot recreate the SDK to make it your own.

Now, some would say that the use of this software is more important than its underlying language. However, imagine if you applied that same principle to speech and to creativity? This is not about tools. This is about the structure, the embedded nature of the mechanisms that allow things to happen. And, as Lanier suggests, most people have been experiencing digital technology without understanding how that structure may influence their usage of the technology.

Part Three

Are social media, social?

Warning: This is a long article and not necessarily suitable for a glance. (See below on glances.)

I have been thinking a great deal about social media these days not only because of their importance, but also because of their ubiquity. There are some fundamental contradictions at work here that need more discussion. Let's take Twitter. Some people have thousands of followers. What exactly are they following? And more crucially, what does the word follow mean in this context?

Twitter is an endless flow of news and links between friends and strangers. It allows and sometimes encourages exchanges that have varying degrees of value. Twitter is also a tool for people who don't know each other to learn about shared interests. These are valuable aspects of this tightly wrought medium that tend towards the interactivity of human conversation.

On the other hand, Twitter, like many blogs, is really a broadcast medium. Sure, followers can respond. And sometimes, comments on blog entries suggest that a "reading" has taken place. But individual exchanges in both mediums tend to be short, anecdotal and piecemeal.

The general argument around the value of social media is that at least people can respond to the circulation of conversations, and that larger and larger circles of people can form to generate varied and often complex interactions. But responses of the nature and shortness that characterize Twitter are more like fragments — reactions that in their totality may say important things about what we are thinking, but within the immediate context of their publication are, at best, broken sentences, declarative without the consequences that often arise during interpersonal discussions. So, on Twitter we can make claims or state what we feel with few of the direct results that might occur if we had to face our ‘followers’ in person.

Blogs and web sites live and die because they can trace and often declare the number of ‘hits’ they receive. What exactly is a hit? Hit is actually an interesting word, since its original meaning was to come upon something, to meet with. In the 21st century, hits are about visits, and the more visits you have, the more likely you are to have an important web presence. Dig into Google Analytics and you will notice that it actually counts the amount of time ‘hitters’ spend on sites. The average across many sites is no more than a few seconds. Does this mean that a hit is really a glance? And what are the implications of glancing at this and that over the period of a day or a month? A glance is by definition short (like Twitter) and quickly forgotten. You don’t spend a long time glancing at someone.

Let’s look at the term Twitter a bit more closely. It is a noun that means “tremulous excitement.” But, its real origins are related to gossiping. And, gossiping is very much about voyeurism. There is also a pejorative sense to Twitter, chattering, chattering on and on about the same thing. So, we are atwitter with excitement about social media because they seem to extend our capacity to gossip about nearly everything which may explain why Justin Bieber has been at the top of discussions within the twitterverse. I am Canadian and so is he. Enough said.

Back to follow for a moment. To follow also means to pursue. I will, for example, tweet about this blog entry in an effort to increase the readership of this article. In a sense, I want you, the reader, to pursue your interest in social media with enough energy to actually read this piece! To follow also means to align oneself, to be a follower. You may as a result wish to pursue me @ronburnett.

But the real intent of the word follow is to create a following. And the real intent of talking about hits is to increase the number of followers. All in all, this is about convincing people that you have something important and valuable to say which means that social media is also about advertising and marketing. This explains why businesses are justifiably interested in using social media and why governments are entering the blogosphere and the twitterverse in such great numbers.

Here is the irony. After a while, the sheer quantity of tweets means that the circle of glances has to narrow. Trends become more important than the actual content. Quantity rules, just as with Google, where the greater the number of hits, the more likely you are to have a site that advertisers want to use. Remember, advertisers assume that a glance will have the impact they need to make you notice that their products exist. It is worth noting that glancing is also derived from a word meaning slippery.

As the circle of glances narrows, the interactions take on a fairly predictable tone with content that is for the most part, newsy and narcissistic. I am not trying to be negative here. Twitter me and find out.

Part Two

Avatar, the Movie

It is always fascinating to read critical analyses of popular films when the writer actually dislikes popular culture, which raises the question: why write about something you hate? James Bowman writes for the journal The New Atlantis, and his pieces are generally anti-technology and anti-pop culture. His recent article on Avatar follows the usual arguments of critics disconnected from the culture they seem bent on critiquing. Bowman describes Avatar as a flight of fantasy, dangerous because, as with all fantasy films of this genre, it is both escapist and dangerously full of illusions, not only about society but also about the future. Interestingly, he claims that the film doesn’t follow the Western tradition of mimesis, that is, it makes no claim to imitate reality, and because of this has no merit as art.

Bowman also says that the only difference between Avatar and other films of the same type is the use of 3D, as if the medium of film and its transformation were not part of an important aesthetic shift, as well as an important shift in how stories are told. Bowman even criticizes James Cameron’s development of a new language for the indigenous people of Pandora, the Na’vi, whom Bowman describes as monkeys. Here is what he says: "The natives of Pandora are giant blue monkeys with sophisticated fiber optics in their tails and the natural world they inhabit is filled with floating mountains, huge dragon-birds whom the inhabitants ride like horses, hammer-headed hippos the size of houses, and other fantastical creatures too numerous to mention and impossible to exist on Earth." Of course, the ‘natives’ are constructions, and of course they don’t exist. As with all artifice, they are the products of Cameron’s rich imagination, but in Bowman’s world imagination is actually a dirty word.

But, enough about a bad review. To answer a question that must be creeping into your mind: why write about something I dislike? Avatar is an experiment in 3D, that is, an experiment with images that have a rather wispy feel, like the brilliant disappearing Cheshire Cat in Tim Burton’s Alice in Wonderland. 3D creates an intense feeling of pleasure in viewers largely because it is so ephemeral, not because it approximates reality. I have watched viewers try to grasp the images that come close to them. But the closeness is itself a function of the glasses we are wearing, a function of the desire to be in the image and to be a part of the experiences the images are generating.

Generating.

3D in its modern incarnation is about generative images, that is, about depth, distance and a more profound sense of perspective. 3D continues the long tradition of exploring our rather human capacity and desire to enter into worlds entirely made of images. 3D extends the Renaissance exploration of line, shape and colour. That is why Avatar is so important. Sure, its story has been told many times, but crucially not in this way. The film is an exploration of a new frontier, and aside from 3D, its real innovations lie in the use of motion capture technology to create not only a synthesis of the real and imaginary, but also synthetic worlds. Finally, we can be rid of the pretension that all art must show, in the most pedantic of ways, some relationship to the real! Painters rid themselves of this crisis when they explored entire canvases of one colour (Rothko), while filmmakers and film critics still think that a black screen goes against the essence of the cinema.

Of course, 3D is in its early days as a medium for exploring the power of storytelling. And Cameron actually got much of his inspiration for Avatar from his underwater explorations of the wreck of the Titanic. Cameron is really interested in creating new languages for conventional ways of seeing and describing the world. He didn’t need to invent a new language for the Na’vi, but he did. He didn’t have to shoot all those beautiful and magical scenes of Pandora either, except that if you have ever swum off a reef, you will have noticed many of the same colours and shapes, so why not recreate them if you can?

Bowman doesn’t talk about what the word avatar means. Yet that is at the heart of the film. Avatars are about substitution, that is, about substituting for what is missing, be it a body or a mind or a story. Avatars don’t replace their progenitors. That is, unless you decide, as Cameron did, that your main character has to be transformed from the two-dimensional world of the screen into a Na’vi, through a death and rebirth ritual that happens to be at the heart of what nearly all major religions in the world proselytize about on an hourly basis.

Let me switch terminology for a moment and suggest that Avatar is actually a commentary on the illusions of religion and on the impossible dreams of immortality that have haunted humans since they began to paint on the walls of caves. Avatar is about that inner world, our inner world that we keep alive in order to stay alive. It is the reverse of the Platonic cave where those who are blind to reality need to be saved. Rather, the film explores those who have reconciled themselves to their fate and who have created a world that is a reflection of their weaknesses and strengths. In other words, the Na’vi are us when we dream and lest we forget, we spend a good proportion of our lives dreaming.

Eric Topol: The wireless future of medicine

Emily Carr University is developing a Health Design Lab in association with the Children's Hospital in Vancouver. The use of wireless technologies both in developed and developing countries will be increasingly important to efficient and economic health care delivery. Eric Topol develops a brilliant argument for the wireless future of medicine in this TED presentation.

As director of the Scripps Translational Science Institute in La Jolla, California, Eric Topol uses the study of genomics to propel game-changing medical research. The Institute combines clinical investigation with scientific theory, training physicians and scientists for research-based careers. He also serves on the board of the West Wireless Health Institute, discovering how wireless technology can change the future of health care.

Learning in a Participatory Culture: A Conversation About New Media and Education

by Henry Jenkins, Professor at USC.

An important and timely discussion that explores the growing interdependence of learners with digital media and the need to examine how these media are working, what their influence is and how to teach in this new environment.

Jenkins interviews Pilar Lacasa, a Spanish researcher. His first question is: "Children and young people like to spend their free time in front of the screen. Could you give us some good reasons that could persuade educators to introduce new media and screens in schools?" Read more…

Facebook Lands Patent for News Feed

Social networking site is awarded patent for news feed activity stream, potentially giving it a monopoly on the technology behind an essential feature on dozens of sites across the social Web.

Social networking giant Facebook has won a patent for its news feed feature, locking in the intellectual property rights to one of its most popular features.

The patent describes "a method for displaying a news feed in a social network environment," detailing the flow and filtering of information about people's activities across the site. Read more…

Next-Generation Search

Scouring the Web for information is becoming faster and easier. Could this new rise in search tools and navigational technologies be a threat to Google's dominance?

A series of articles from Technology Review. One of the best summaries of the state of the field and the direction of search.

The Literate Future

 

At the conclusion of a short piece on text, literacy and the Internet, Nicholas Carr suggests the following about the digital age: "Writing will survive, but it will survive in a debased form. It will lose its richness. We will no longer read and write words. We will merely process them, the way our computers do."

I want to take issue with this pessimistic prediction. At every stage of technological change since the invention of the printing press, similar claims have been made. Most often, these claims originate with those people more likely than others to be both literate and dependent on traditional forms of explanation and exposition. The appearance of the telephone in the 1870's led to predictions of the death of conversation. The growth in the distribution of books and magazines in the 19th century led to predictions that writing, both as process and as creative activity, would be debased. More recently, the growth of digital tools and their pervasive use led to predictions that creative practices like painting would disappear. (The reverse is true. There has been a renaissance of interest in painting in most art schools and a significant rise in attendance at museums showing both contemporary works and paintings from different historical periods.) The invention of the cinema in the 1890's led both politicians and critics to suggest that the theater was dead.

In most cases, the advent of new technologies disrupts old ways of doing things. Equally, the disruption builds on the historical advantages conferred upon the medium through its use and modes of distribution. Text is everywhere in the digital age, and while it may be true that attention spans have decreased (although research in this area is very weak), that says nothing about how people use language to communicate whether in written or verbal form.

The example that is most often cited as evidence that there has been a decline in literacy is text messaging. What a red herring! Text messaging is simply the transposition of the oral into text form. It is a version of speech not of writing. It neither indicates a loss of ability nor an increase in literacy. Rather, and more importantly, text messaging is another and quite creative use of new technologies to increase the range and often the depth of communications among people.

The beauty of language is its flexibility and adaptability. The various modes of conversation to which we have become accustomed over centuries have a textured and rich quality that depends on our desire to communicate. That desire crosses nearly every cultural and political boundary on this shrinking earth. Rather than worry about whether text messaging will undermine literacy, we need to examine how to use all of the new modalities of communication now available to us to enhance the relationships we have with each other. That is the real challenge: the quality of exchange, what we say and why, and how all of that translates into modes of expression that can be understood and analyzed.

Up In The Air with Avatar

"Being in the air is the last refuge for those that wish to be alone." (Jason Reitman) There are profound connections between Avatar and Up in the Air. Both movies come at a time that can best be described as dystopic. From Afghanistan, Iraq and other countries mired in war, to the deepest and most serious recession since the 1930's, to the ongoing crisis of climate change, the first decade of the 21st century has been characterized by waves of loss, violence and instability.

What then allows any individual to compose their identity and to maintain their sense of self as the air around the planet gets thinner and thinner? How does the imagination work within a dystopia?

Up in the Air explores the tropes of loneliness and travel -- the in-between of airports and hotels, those places that are not places but nevertheless retain many of the trappings of home without the same responsibilities and challenges. There are consequences to being on the road 300 days of the year, and among them is the construction of an artificial universe to live in, like the metal tubes we describe as airplanes. Another consequence is that frequent travelers have to build imaginary lives that are fundamentally disconnected from intimacy and genuine conversation.

Ironically, Avatar imagines a world that is for a time dragged into the dystopia of 21st century life and where at the end of the day, a new vision is constructed. Avatar's use of 3D will be the subject of another article soon, but suffice to say that the worlds James Cameron constructs through motion capture and animation are among the most beautiful that the cinema has ever seen.

Hidden behind both films is a plaintive plea for love and genuine relationships. Avatar explores this through tales of transmigrating spirits and animistic notions that transform animals and nature itself into a vast Gaia-like system of communications and interaction. The Na’vi are a synthesis of Cameron's rather superficial understanding of Aboriginal peoples, although their language is a fascinating blend created by Paul Frommer of the University of Southern California.

The flesh of the avatars in the film is not virtual; as the main character, Jake Sully, discovers, the Na’vi are the true inheritors of the planet they live on, an exotic version of early Earth called Pandora. In Greek mythology, Pandora, whose name means 'all-giving,' was the first woman. The Pandora myth asks why there is evil in the world, a question that is central to Avatar.

Up in the Air asks the same question, but from the perspective of a rapacious corporation that sends its employees out to fire people for other companies or, as the main character Ryan Bingham says, to save weak managers from the tasks for which they were hired. The film also asks why there is evil in the world and suggests that any escape, even one that sees you flying all year, doesn't lead to salvation.

Both films explore the loss of meaning, morality and principles in worlds both real and unreal. Avatar provides the simplest solution: migrate from a humanoid body and spirit to a Na’vi one to discover not only who you are but how to live in the world. Up in the Air suggests that love will solve the dystopic, only to discover that casual relationships never lead to truth and friendship.

These are 21st century morality tales. Avatar is a semi-religious film of conversion not so much to truth but to the true God, who is now a mother. Up in the Air teaches Ryan that life is never complete when it is entirely an imaginary construction.

It is however, the reanimation of the human body in Avatar that is the most interesting reflection of the challenges of overcoming the impact of this first decade of the 21st century. Jake Sully is able to transcend his wheelchair and become another being, now connected to a tribe. He is able to return to a period of life when innocence and naivete enable and empower — when the wonders of living can be experienced without the mediations of history and loss. This of course is also the promise of 3D technology, to reanimate images such that they reach into the spectator's body, so we can share those moments as if we have transcended the limitations of our corporeal selves.

James Cameron's digital utopia, full of exotic colours, people, plants and animals, suggests that escape is possible in much the same way as Ryan Bingham imagines a world without the constraints that are its very essence. 3D technology promises to allow us to transcend our conventional notions of space and time, but it cannot bring the earth back to its pristine form nor reverse engineer evolution or history. At the same time, Avatar represents a shift in the way in which images are created, in the ways in which we watch them, and also in the potential to think differently about our imaginations and about our future. (Imagine a 3D film about the destruction of the Amazon!)


The role of research in the Creative Arts (1)

Ceramics is an extraordinary craft-based discipline. It is also an art and a science. The materials that ceramicists use have changed over the last century, but many of the core creative methods remain the same. None of what I have just said would be possible without some research into the history and practices of ceramic artists and the technologies they use. So, for example, when I mention to people that ceramic engineering is a crucial part of the digital age, they don’t know what I am talking about. Optical fibers make use of ceramic materials. The tiles that cover the underside of the Space Shuttle are made of ceramic materials, shaped and formed using a variety of heating and manufacturing methods.

Ceramics is increasingly being used in the creation of products (other than the traditional ones) and is linking itself to product and industrial design. There are medical applications and so on.

I mention this to point out that research is fundamental to any creative exploration and that research may take any form — and make use of any number of different materials. A reductive approach will not recognize the rather extensive way in which the practice of creation is deeply involved with everything from theory through to reflection and self-criticism. For too long, universities in particular have maintained distinctions between their professional and non-professional disciplines as a way of differentiating between applied and pure research. The latter is supposed to reflect a disinterested approach to knowledge in the hope that over time the research will produce some results. The former is supposed to direct itself towards results from the outset and to be more directly connected to industry and the community. Engineering schools, for example, are cloistered in separate buildings on university campuses and generally develop an applied approach to learning. In neither case, applied or pure, can the distinctions I have just mentioned work, since by its very nature research is **always** both applied and pure.

Creative practices are generally seen as applied because the focus is on materials even if they are virtual. The standardized and by now clichéd image of creative people driven by intuitions and/or inspiration actually covers up the years of apprenticeship that every artist has to engage in to become good at what they do.

Every creative discipline involves many different levels of research, some of which is directly derived from practices in the social sciences, as well as the sciences. In the next installment of this article, I will examine how creative practices are at the forefront of redefining not only the nature of research but the knowledge base for many disciplines.