Colonel Mustard and the Curious Case of the Missing Particle

 

 

 


Just two hours after it first appeared on British television, the long-awaited first episode of the third season of Sherlock was being shown, complete with Chinese subtitles, on Youku.com, China’s biggest video website. In the space of just 24 hours the Chinese version of Sherlock was viewed more than 5 million times, becoming the site’s most popular programme ever.
Sherlock’s success in China is such that when British Prime Minister David Cameron recently visited the country and opened an account on the social network Weibo to answer questions from the general public about relations between Britain and China, he was reportedly besieged by questions about the show.
Sherlock Holmes is not just the most famous detective of all time, he also represents one of the most compelling ideas in Western popular culture. With his dazzling use of empirical observation, deductive logic and forensic skills, he is the physical embodiment of what is often called ‘the scientific method’.
 Indeed, there is a curious symmetry to be seen in the relationship between the development of science in Western culture and the rise of detective fiction over the last century and a half.
At first glance one might presume that this is simply because progress in scientific knowledge has influenced the way that we like our murder mysteries. And yet perhaps there is more to this than meets the eye, and although it would be far too provocative to argue the reverse – that it is detective fiction that has influenced the narratives of modern science – perhaps the truth lies somewhere between these two extremes.
Western culture has gone from a position in Victorian times where it saw reality as an objective phenomenon to the opposite extreme by the late twentieth century, where reality was seen as a culturally created phenomenon. And if we explore the relationship between what is conventionally thought of as fact and what is conventionally thought of as fiction, this may reveal much about the culture that created both. As we do so, we will discover the curious relationship between the theories of Sherlock Holmes and the theories of Albert Einstein, and between the search for ‘the murderer’ in the popular board game Cluedo and the search for the mysterious particle known as the Higgs boson.
Welcome to the story of Colonel Mustard and the Curious Case of the Missing Particle.

 

 

When science ruled the world

 

An illustration from ‘Kunstformen der Natur’ (Art Forms of Nature) by Ernst Heinrich Haeckel (1834–1919), a scientist and follower of Darwin who discovered, described and named thousands of new species, and who created a grand genealogical tree relating all life forms to one another.

Middle- and upper-class Victorians seemed to be at the very pinnacle of human development. Dominant as they were over a vast working-class majority at home and over millions of “uncivilized” and “lesser” races abroad, it seemed that human history itself had been nothing if not an inevitable journey away from the dark origins of superstition, savagery and ignorance towards the triumph of scientific rationalism that was the glory of contemporary Victorian society.

The entire surface of the planet had been mapped – as well as the entirety of the heavens – and its contents were now being duly itemized, classified and catalogued. And Darwin’s theory of evolution – the very pinnacle of nineteenth-century scientific thought – seemed to support the inexorable advance of Western culture, legitimizing the great Victorian project of imperial domination of the ‘lesser races’ of the world.

It was thought that there was an underlying order to the universe which could be unlocked through mathematics and scientific enquiry. Indeed the triumph of science was such that it now seemed that everything in the universe was ultimately knowable, and comprehensible to the human mind, and that there was no mystery in the world that could not be penetrated by the persistent and systematic use of the scientific method.

Victorian science, however, was nothing if not practical, and the nineteenth century was also an age of unprecedented technological innovation.

Ernst Haeckel and his fellow scientist Nikolai Miklouho-Maclay, 1866

London, at the start of the 20th century, was the largest city on the face of the planet and the capital of the greatest empire the world had ever seen. It is estimated that in this period alone, around 10,000,000 square miles of territory and roughly 400 million people were added to the British Empire. At its height the Empire incorporated a quarter of the Earth’s habitable lands, and contained a fifth of its people.

And this vast enterprise was made possible by technological innovations like the telegraph, the steam turbine, and the Gatling gun.

 

 

A most remarkable invention

 

But the invention that perhaps had the greatest impact on global culture was neither an instrument of war, nor even the invention of a completely new medium like photography, the wireless, the telephone or the cinema, all of which were invented during this period.

It was in Britain in the 1860s that an exciting new form of literature began to appear. Often referred to as the ‘sensation novel’, or the ‘novel with a secret’, these narratives featured a radical new storytelling technique, in which the author would deliberately withhold information from the reader, forcing them to use their own powers of observation, logic and deduction to solve the puzzle – a crime shrouded in mystery – at the heart of the narrative.

These novels were called ‘sensational’ partly because of their content – usually a murder, or better still, a murder combined with a sexual transgression – which allowed the reader to experience the dark, mysterious, criminal underbelly of Victorian society from the safety of their favourite armchair.

Scientific rationalism had become a lens through which the darkest, most primitive, and savage actions of less civilized peoples could be viewed, and rectified, by polite society. Clue by clue, discovery by discovery, through the rigorous application of logic and scientific thinking, even the most sinister shadowy mystery could be penetrated and order restored.

In short, the modern detective story, in which the author shares the task of solving the crime with the reader, had been invented.

All detective stories are, by their very nature, about the unequal distribution of knowledge. Not just between the characters within the story, but also between the author and the reader.

Indeed it is only through the careful deployment of false clues, unreliable testimonies, and even barefaced lies that the author gradually allows the truth to emerge and the reader to ultimately share the clarity of their omniscient understanding of the narrative.

And as Patrick Brantlinger succinctly describes it in his essay What Is “Sensational” About the “Sensation Novel”?, from this point on ‘the forthright declarative statements of realistic fiction are, in a sense, now punctuated by question marks.’

 

 

Murder most modern

 

Despite the fact that it is often dismissed as a lesser form of literature, the detective story – or murder mystery as it is also called – has dominated the cultural landscape for over one hundred and fifty years.

Indeed, it is hard to overestimate the impact that this radical new narrative technique was to have on modern popular culture, paving the way, as it did, for the vast array of detective stories that surround us today – in books, of course, but also in TV shows, movies, stage plays, children’s stories and even board games and computer games. From the child-friendly adventures of Nancy Drew or Enid Blyton’s mystery books to the dark, brooding landscapes of Scandinavian dramas like The Bridge or The Killing, from the reassuringly idealised world of Midsomer Murders to the poisoned industrial wasteland of True Detective, and from the elegant and desirable Venetian lifestyle of police commissioner Guido Brunetti to the small-town desperation of Ystad’s Inspector Kurt Wallander, the detective story is clearly the dominant dramatic narrative form of the modern age.

The most popular television drama series in the world is CSI: Crime Scene Investigation, the murder mystery franchise produced by the Hollywood producer Jerry Bruckheimer. CSI first appeared on TV screens back in 2000, and its audience has grown steadily ever since: by 2009 its worldwide audience was estimated at more than seventy million viewers.

Agatha Christie’s And Then There Were None, with over a hundred million sales to date and still climbing, is one of the best-selling books of all time and, according to Publications International, the seventh best-selling book in the world (outstripped only by such essential publications as The Bible, The Thoughts of Chairman Mao Zedong, The Qur’an, the Xinhua Zidian (the Chinese dictionary), The Book of Mormon, and, of course, Harry Potter).

The longest-running play in the world is the murder mystery The Mousetrap – also written by Agatha Christie – which first opened in the West End of London in 1952 and has been running continuously ever since, with its 25,000th performance taking place on 18 November 2012. A feat all the more remarkable given that the audience are asked not to reveal the solution to the mystery after leaving the theatre.


The longest running play in the world is London’s West End production of The Mousetrap by Agatha Christie.

 

 

 

Scientific certainty: the answer to life, the universe and everything

 

And yet, despite the fact that its legacy dominates so much of contemporary culture, the sensation novel was also very much a product of the certainties of its own time and cultural context.

The Victorian faith in scientific progress was strengthened by the assumption that everything in the universe was ultimately knowable, and that scientific progress would eventually lead to a perfect knowledge of nature’s fundamental physical laws. Indeed, for many late Victorians the completion of this vast, god-like understanding of the universe was only a matter of time.

In a speech at the University of Chicago in 1894, Albert A. Michelson, the first American physicist to win the Nobel Prize, declared that:

‘The more important fundamental laws and facts of physical science have all been discovered, and these are now so firmly established that the possibility of their ever being supplanted in consequence of new discoveries is exceedingly remote…’

At around the same time, Lord Kelvin, the mathematical physicist, is reputed to have said that:

‘There is nothing new to be discovered in physics now. All that remains is more and more precise measurement’

And in 1900 the German mathematician David Hilbert, one of the most influential mathematicians of the 19th and early 20th centuries, put forward a list of 23 unsolved problems in mathematics at the International Congress of Mathematicians in Paris (you can read more about Hilbert’s ‘to-do’ list here). Once answered, it was believed, these outstanding questions would complete our mathematical understanding of the universe. Indeed, by tidying up these last remaining anomalies, we would finally arrive at a universal theory of everything.

What could be simpler? The nature of reality itself, just like the murder mystery novel, appeared to be a puzzle which could be solved, clue by clue, discovery by discovery, through the rigorous application of logic and scientific thinking.

The murder mystery detective, just like his real-life counterpart the explorer-scientist, was to become the fictional hero of this great age of scientific discovery. And the greatest of these was none other than the most celebrated fictional detective of all time… Sherlock Holmes.

 

Sherlock Holmes: the embodiment of the scientific method

 

Every age has its hero: a figure who, more than any other, embodies the deepest yearnings and greatest aspirations of the time. And for the modern age that figure is none other than the detective.

Sherlock Holmes is not just the most famous detective of all time, but he also represents one of the most potent and compelling ideas in modern popular culture.

With his dazzling use of empirical observation and deductive logic he is the absolute physical embodiment of the ‘scientific method’.

 


‘Sherlock Holmes Experimenting’ by Sidney Paget, 1893. Sherlock Holmes is not just the most famous detective of all time, but he also represents the physical embodiment of the ‘scientific method’. In 2002, Sherlock Holmes became the first fictional character to receive an Honorary Fellowship from the Royal Society of Chemistry, for ‘using science, courage and crystal clear thought processes to achieve his goals.’

The scientific method has characterized the pursuit of knowledge in the West for hundreds, if not thousands, of years, and consists of a process of systematic observation, measurement and experiment, and the formulation, testing and constant modification of theories.

The widespread adoption of the scientific method in the seventeenth century, with its pragmatic, evidence-based approach, stands in sharp contrast to the shadowy world of received wisdom, superstition and taboo that was supposed to have characterized most of pre-Enlightenment Europe.

Implicit within the idea of the scientific method is the rather optimistic belief that there is no phenomenon that cannot ultimately be understood… you simply have to make enough observations, measurements and experiments to reveal the truth.

The steps of the scientific method.
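
For the playfully minded, that loop can even be sketched in a few lines of code. The sketch below is purely illustrative – a toy ‘phenomenon’ and a toy scientist of my own invention, not anything from the sources discussed here – but it captures the iterative shape of the method: observe, hypothesize, test, and revise until the theory survives the experiment (for now).

```python
# A playful, minimal sketch of the scientific method as an iterative loop.
# The hidden "law of nature" and the tolerance below are illustrative toys.
import random

TRUE_VALUE = 9.81  # the hidden quantity our toy scientist is trying to pin down


def observe(n=10):
    """Make n noisy measurements of the phenomenon."""
    return [TRUE_VALUE + random.gauss(0, 0.05) for _ in range(n)]


def formulate_hypothesis(data):
    """Propose a theory: the quantity equals the mean of the observations so far."""
    return sum(data) / len(data)


def experiment_confirms(hypothesis, tolerance=0.05):
    """Test the theory's prediction against a fresh, independent measurement."""
    measurement = TRUE_VALUE + random.gauss(0, 0.05)
    return abs(measurement - hypothesis) <= tolerance


data = observe()
hypothesis = formulate_hypothesis(data)

# When an experiment contradicts the theory, gather more observations,
# modify the theory, and test again.
while not experiment_confirms(hypothesis):
    data += observe()
    hypothesis = formulate_hypothesis(data)

print(f"Provisionally accepted value: {hypothesis:.3f}")  # not proved, merely not yet refuted
```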

It is in just this way, then, that the famous “consulting detective” from 221B Baker Street is allowed to work away, unconstrained by tradition or petty social norms, his intellect working like a powerful torchlight that can penetrate the darkest secrets and deepest mysteries of a Victorian London permanently shrouded in fog and darkness.

And even though the last Holmes story appeared as late as 1927 – by which time, in reality, the streets of London were filled with motor cars rather than hansom cabs and ablaze with electric lights rather than the faint flickering of gas lamps – our hero continued to inhabit his mysterious Victorian world of fogbound streets where, as Vincent Starrett puts it in The Private Life of Sherlock Holmes, ‘it is always 1895’.

Of course, the stories of Sherlock Holmes benefit enormously from the fact that it is ‘always 1895’. By creating a character that quite literally embodies the scientific method and placing this character in a setting that represents the complete opposite – a dark, fog-bound shadowy world of prejudice, ignorance and assumption – Conan Doyle created one of the most famous fictional characters the world has ever seen.

In 2002, Sherlock Holmes became the first fictional character to receive an Honorary Fellowship from the Royal Society of Chemistry, for ‘using science, courage and crystal clear thought processes to achieve his goals.’

 

 

 

A tale of two stories: the unique structure of detective fiction

 

The classic detective story has dominated the popular culture of the English-speaking world for more than a hundred and fifty years now. So is there some dark impulse at the heart of the modern age that creates such an insatiable thirst for savagery and bloodshed? Or is it more about the sense of the civilized world restoring order, clue by clue, revelation by revelation and the satisfaction, the reward, of that wonderful moment, at the end of these stories when all is illuminated, and we finally understand the truth?

The answer is probably a bit of both.

Murder mystery stories tend to share a specific, defining structural characteristic, and to understand it we need to digress for a moment into the world of literary theory. The terms ‘Fabula’ and ‘Sujet’ were coined in the 1920s by the Russian Formalist literary theorists Vladimir Propp and Viktor Shklovsky to describe the difference between a story and its plot.

So, for example, if you were to take the movie Citizen Kane, which starts with the death of the main character, and then proceeds to tell his life story through a series of flashbacks that are intercut with a journalist’s investigation of Kane’s life in the present day, the ‘Fabula’ of the film is the story of Kane’s life – the way it happened in chronological order – while the ‘Sujet’ is the way that this story is actually told, reconstructed for greatest dramatic effect through flashbacks.

An even more extreme example of this separation of ‘Fabula’ from ‘Sujet’ is to be found in the structure of the film Memento, where the story is presented as two different sequences of scenes: a series in black-and-white that is shown chronologically, and a series of color sequences shown in reverse order. These two sequences “meet” at the end of the film, producing one common story.

A diagram of the timeline structure of Memento.

In The Poetics of Prose, the philosopher and literary theorist Tzvetan Todorov explains how this division between ‘Fabula’ and ‘Sujet’ is uniquely exploited in the narrative structure of the murder mystery story, in that it too:

‘contains not one but two stories: the story of the crime and the story of the investigation. In their purest form, these two stories have no point in common . . .

The first story, that of the crime, ends before the second begins. But what happens to the second? Not much. The characters of the second story, the story of the investigation, do not act, they learn.

…The hundred and fifty pages which separate the discovery of the crime from the revelation of the killer are devoted to a slow apprenticeship: we examine clue after clue, lead after lead…

This second story, the story of the investigation, . . . is often told by a friend of the detective, who explicitly acknowledges that he is writing a book; the second story consists, in fact, in explaining how this very book came to be written . . .

The first — the story of the crime — tells ‘what really happened’, whereas the second — the story of the investigation — explains ‘how the reader (or the narrator) has come to know about it.’

The first, that of the crime, is in fact the story of an absence: its characteristic is that it cannot be immediately present in the book…The status of the second story…(is) a story which has no importance in itself, which serves only as a mediator between the reader and the story of the crime…

We are concerned then in the whodunit with two stories of which one is absent but real, the other present but insignificant.’

As Todorov says, ‘The characters of the second story, the story of the investigation, do not act, they learn’, and of course the most important character in the story of the investigation, the one who does the ‘learning’, is the central figure of the murder mystery genre, the character through whom we, the audience, learn everything – the detective.

It is through the detective figure that we discover the central mystery – the puzzle that needs solving. And it is through the detective figure that we too speculate and create hypotheses as, bit by bit, parts of the solution to the mystery are gradually revealed.

Until that wonderful moment when all becomes clear, and, together with the detective we experience, for the moment at least, a complete, god-like understanding of everything.

Significantly, the ‘Fabula’ and ‘Sujet’ of a story tend to have very different characteristics.

In classic murder mystery stories, the ‘Fabula’ is primarily about the crime whereas the ‘Sujet’ is primarily about the investigation. So whilst the ‘Fabula’ tends to be about the destruction brought about by the forces of chaos, darkness and barbarism, the ‘Sujet’ on the other hand tends to be about the restoration of order and the all round benefits of civilization. And whilst the ‘Fabula’ is primarily about animal passions, such as love, jealousy, greed and fury, the ‘Sujet’ is primarily about detached observation, intellect and rational scientific thought.

This binary conflict between chaos and order, animal passions and rational scientific thought is also to be found at the very heart of Victorian culture in the second half of the nineteenth century… precisely at the time when the modern detective story first began to appear.

 

 

 

The origins of the scientific detective

 

The detective story as a distinct genre was, however, a product of Victorian culture, and only a tiny proportion of the detective fiction produced at the time is still available today; even less is still read for pleasure or studied by academics. There are therefore many theories about the precise genealogy of this form of narrative.

The prototype for the scientific detective – later made famous by Sherlock Holmes – was certainly the Chevalier Dupin, a character created by that master of tales of mystery and the macabre, Edgar Allan Poe. Dupin appears in three stories: The Murders in the Rue Morgue (1841), The Mystery of Marie Rogêt (1842) and The Purloined Letter (1845).

As T. J. Binyon puts it in “Detective in Fiction from Poe to the Present”:

“In Dupin, Poe created the prototype of the great detective, the eccentric genius with stupendous reasoning powers, whose brilliance is given added refulgence by the fact that he is always accompanied, and his investigative tours de force always set down, by a loyal admiring, but uncomprehending and imperceptive friend and assistant.”

Poe himself was aware of how innovative these stories were – “These tales of ratiocination,” he explained in 1846, “owe most of their popularity to being something in a new key” – and even in outline, the modern reader will recognize many of the features of the classic detective story. In The Murders in the Rue Morgue, Dupin comes across a newspaper account of the gruesome murder of Madame L’Espanaye and her daughter in their apparently locked lodgings in the Rue Morgue. Using his powers of logical deduction, he unravels the seemingly insoluble mystery: by methodically sifting through all the various accounts and considering all the possible hypotheses, he exposes the narrow-mindedness of the local prefect of police. Poe had given the form its initial shape and created its first great detective, complete with companion/narrator.

It was almost half a century later, in 1887, that Sir Arthur Conan Doyle, in a short novel called A Study in Scarlet, introduced his own version of this intrepid pair – the eccentric genius with extraordinary deductive powers, accompanied by his trusty but rather imperceptive assistant – in the shape of Sherlock Holmes and Dr. John H. Watson. The pair were, of course, to become perhaps the most famous duo in literary history, going on to feature in four full-length novels and 56 short stories, all but four of which are narrated by Holmes’s assistant, Dr. Watson.

However, the author responsible for creating so many of the elements that would later become the model for the classic detective stories of the early twentieth century was a writer of ‘sensation’ novels called Wilkie Collins, with the publication of what many consider the first and greatest example of the genre: The Moonstone, in 1868.

The renowned crime writer Dorothy L. Sayers once described The Moonstone as ‘probably the very finest detective story ever written’, whilst the distinguished modernist poet and literary critic T. S. Eliot called it “the first, the longest, and the best of modern English detective novels in a genre invented by Collins and not by Poe.”

Whoever was ultimately responsible for creating this new literary genre – and in reality there were probably many writers who deserve credit for their own individual contributions – it was the publication of The Moonstone in 1868 that clearly laid down the principles for all the ‘Golden Age’ murder mystery stories that were to follow. These principles can be summarized as follows:

  1. The story must form a puzzle, the solution to which is only revealed at the very end.
  2. An investigator with unusual forensic and deductive skills, who is seeking to establish ‘the truth’, must drive the plot.
  3. This investigator is placed in direct contrast to the conventional police, who are largely incompetent.
  4. It is essential that the reader discovers clues only as the investigator does.
  5. There should be a large array of false suspects, and false clues, to confuse and mislead the reader.
  6. The guilty party should always be the last person you might suspect.
  7. The crime to be solved should be a ‘locked room mystery’, in which a murder has been committed under apparently impossible circumstances.

 

 

The Golden Age of Detective Fiction

 

In the early part of the twentieth century this genre exploded into mainstream popularity – especially amongst the British middle classes. According to Carole Kismaric and Marvin Heiferman in The Mysterious Case of Nancy Drew & The Hardy Boys:

‘The golden age of detective fiction began with high-class amateur detectives sniffing out murderers lurking in rose gardens, down country lanes, and in picturesque villages. Many conventions of the detective-fiction genre evolved in this era, as numerous writers — from populist entertainers to respected poets — tried their hands at mystery stories.’

And Professor William D. Rubinstein, in his essay A Very British Crime Wave: How Detective Stories Captured the Imaginations of the British Middle Classes in the 20th Century, gives some sense of the scale of this phenomenon:

‘Between around 1910 and 1950 Britain was in the grip of a genteel crime wave; a seemingly endless output of murder mysteries… Such works formed a major component of middle-class culture in Britain at the time: for every person who read T.S. Eliot, D.H. Lawrence, or Virginia Woolf, probably 50 more read Agatha Christie and double that number Conan Doyle.’

It was during this time that some of the greatest works of the English murder mystery genre were created. The Grande Dame of the genre, Agatha Christie, was publishing classics with titles such as Murder on the Orient Express (1934), Death on the Nile (1937) and And Then There Were None (1939), as well as introducing her two most famous fictional detectives, Hercule Poirot and Miss Marple. At the same time, Dorothy L. Sayers was busy creating her archetypal British gentleman detective, Lord Peter Wimsey, in equally classic murder mysteries like Murder Must Advertise (1933), The Nine Tailors (1934) and Gaudy Night (1935).

But whilst this was a time of great creativity, it was also the period during which the conventions of the genre came to be set in stone, and in 1929 the British clergyman and amateur detective-story writer Ronald Knox set out his ‘Decalogue’ of rules for detective fiction.

These rules, which ensure that a detective story “must have as its main interest the unraveling of a mystery; a mystery whose elements are clearly presented to the reader at an early stage in the proceedings, and whose nature is such as to arouse curiosity, a curiosity which is gratified at the end”, were perhaps written with slightly more of a wry smile than is generally thought – Father Knox was also known for his love of pranks and practical jokes – but they are rules to which most writers within the genre still adhere.

Father Knox’s Decalogue: The Ten Rules of Detective Fiction

  1. The criminal must be someone mentioned in the early part of the story, but must not be anyone whose thoughts the reader has been allowed to follow.
  2. All supernatural or preternatural agencies are ruled out as a matter of course.
  3. Not more than one secret room or passage is allowable.
  4. No hitherto undiscovered poisons may be used, nor any appliance which will need a long scientific explanation at the end.
  5. No Chinaman must figure in the story.
  6. No accident must ever help the detective, nor must he ever have an unaccountable intuition which proves to be right.
  7. The detective must not himself commit the crime.
  8. The detective must not light on any clues which are not instantly produced for the inspection of the reader.
  9. The stupid friend of the detective, the Watson, must not conceal any thoughts which pass through his mind; his intelligence must be slightly, but very slightly, below that of the average reader.
  10. Twin brothers, and doubles generally, must not appear unless we have been duly prepared for them.

In her wonderfully insightful and informative Twentieth-Century Crime Fiction, Lee Horsley explains why so unashamedly artificial a genre has had such enduring popular appeal:

‘One answer might be that this reassuring object (a well-known kind of text) is also an invitation to playful readers to participate, challenging them to put a fictional world in order by the act of being, simply, a ‘good reader’. Such a person will judge the writer as ‘good’ partly because he or she manages to delay an appreciative audience’s recognition of the ‘true’ narrative. Seen in this way, the classic detective story combines the comforting familiarity of a repeated pattern with the surprising turns of a well-played game.’

Lee Horsley expands upon this idea by comparing the rise in popularity of golden age crime fiction with the increasing popularity of the cryptic crossword:

‘In the inter-war period, the flourishing of golden age crime fiction, epitomized by Christie and Sayers, coincided with the emergence of another favourite British game, the cryptic crossword. The cryptic crossword, it should be emphasized, differs from the variety of word puzzle which is solved by filling in famous names, general knowledge, or words that fit dictionary definitions. Rather, it consists of complicated wordplay (puns, anagrams, etc.), with completion of the puzzle involving a battle of wits between clue-setter and solver. Although the early cryptic crossword was somewhat anarchic, it soon became established that all good setters must abide by the fundamental principle of fair play. One of the best-known setters, Afrit, offered the dictum, ‘I need not mean what I say, but I must say what I mean.’ This would, I think, serve equally for any good detective storywriter. Both games mirror the nature of civilized discourse in their careful ironies, their nuances and clever evasions, and their attentiveness to the exact meanings of words (a particular skill, for example, of Christie’s Hercule Poirot). The correct answer should be accessible to the solver, but must be cleverly hidden, in such a manner that, once enlightened, he or she will ‘see that the solution had, in a sense, been staring him in the face’.’

 

 

Colonel Mustard and the Murder at Tudor Hall

 

In the 1930s, the game-like quality of English Golden Age detective fiction had inspired a craze for ‘murder mystery’ games, played out in hotels across the length and breadth of the country. These games would involve both actors and hotel guests playing the parts of characters in a murder mystery drama – one that centred around the ‘murder’ of one of the guests. The hotel, with its large number of sprawling rooms, took on the role of the country mansion, and when the ‘body’ was found, each of the guests would fall under suspicion. By piecing together the clues provided, the hotel guests would then have to solve the mystery over the course of the evening.

Anthony E. Pratt was a musician who made a living from playing piano in the country hotels where these murder games were played. Anthony himself was a huge fan of murder mystery stories and in particular Agatha Christie novels like The Body in the Library.  And as he watched these ‘Murder Mystery’ games being played out in front of him night after night, he began to have a rather brilliant idea…


Anthony and Elva Pratt in their back garden in the 1940s, around the time they were designing the game of Cluedo.

Anthony Pratt realized that he could translate these murder mystery games into a board game, and by 1943, with some help from his wife Elva, he had designed it. The game was called “Murder” and Elva designed the artwork for the board. The object of the game was for each player, in the guise of one of the game’s six characters, to move around the board, which featured the floor plan of an English country house known as Tudor Hall. As they did so, they would collect clues until they were able to announce who had committed the murder, in which room, and with which weapon. So, for example, the winner might utter the immortal words: ‘I suggest it was Colonel Mustard, in the Library, with the Lead Pipe.’
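
The deduction at the heart of the game is simple elimination, and it can be sketched in a few lines of code. The sketch below is purely illustrative (it uses the familiar modern card names and a hypothetical ‘notebook’ class, not anything from Pratt’s original design): every card a player sees rules out a candidate, and when only one suspect, one weapon and one room remain, the accusation can be made.

```python
# A minimal sketch of Cluedo-style deduction by elimination.
# Card names are the familiar modern ones, used purely for illustration.

SUSPECTS = {"Miss Scarlett", "Colonel Mustard", "Mrs. White",
            "Reverend Green", "Mrs. Peacock", "Professor Plum"}
WEAPONS = {"Candlestick", "Dagger", "Lead Pipe", "Revolver", "Rope", "Spanner"}
ROOMS = {"Kitchen", "Ballroom", "Conservatory", "Dining Room", "Billiard Room",
         "Library", "Lounge", "Hall", "Study"}


class Notebook:
    """A player's notebook: the candidates not yet eliminated in each category."""

    def __init__(self):
        self.candidates = {"suspect": set(SUSPECTS),
                           "weapon": set(WEAPONS),
                           "room": set(ROOMS)}

    def eliminate(self, card):
        """Cross off a card we have seen (in our own hand, or shown by a rival)."""
        for pool in self.candidates.values():
            pool.discard(card)

    def accusation(self):
        """Return (suspect, weapon, room) once each category has a single candidate."""
        if all(len(pool) == 1 for pool in self.candidates.values()):
            return tuple(next(iter(self.candidates[category]))
                         for category in ("suspect", "weapon", "room"))
        return None  # not enough clues yet


if __name__ == "__main__":
    notebook = Notebook()
    # Cards seen during play: dealt to us, or shown to refute our suggestions.
    for card in ["Miss Scarlett", "Mrs. White", "Reverend Green", "Mrs. Peacock",
                 "Professor Plum", "Candlestick", "Dagger", "Revolver", "Rope",
                 "Spanner", "Kitchen", "Ballroom", "Conservatory", "Dining Room",
                 "Billiard Room", "Lounge", "Hall", "Study"]:
        notebook.eliminate(card)
    print(notebook.accusation())  # ('Colonel Mustard', 'Lead Pipe', 'Library')
```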

Anthony Pratt filed his original patent application on 1 December 1944, and in February 1945 showed the game to Waddington’s, the largest British board-games manufacturer of the time. Waddington’s immediately saw the potential and, with a few very minor modifications (it was Waddington’s who renamed the game Cluedo, a combination of “Clue” and “Ludo”, the Latin for “I play”), decided to go ahead and manufacture it.

Of course, Cluedo – or Clue as it became known in the United States – went on to become a massive worldwide success. It is fitting, therefore, that one of the most enduring expressions of the English Golden Age of detective fiction is not a book, but a board game, and one with a very specific social setting. Because – in the same way that for Sherlock Holmes it is ‘always 1895’ – for players of Cluedo, it will always be 1926 at a country mansion somewhere in Hampshire.


The Cluedo board, featuring the floor plan of the English country house known as Tudor Hall, designed by Elva Pratt in 1943.

 

The end of the idea of ‘Human Progress’

Over the course of its first hundred years in Britain, detective fiction had developed many conventions that were in danger of fossilising this otherwise vibrant new artform and lending the genre a somewhat artificial quality.

This was all the more noticeable given the social and cultural changes that had taken place in Britain in the meantime.

During the course of the intervening years the old Victorian attitudes to progress began to change. For many writers, artists and poets it had been the horrors of the First World War that gave the lie to the simplistic nature of the late Victorian worldview. For others it was the increasing familiarity with other cultures and worldviews that occurred as Imperial power declined and new forms of media proliferated.

Whatever the precise cause, we can see that some kind of tipping point in the understanding of the nature of progress was reached in 1931, when the British historian Herbert Butterfield published a short but highly influential book called The Whig Interpretation of History. The title referred to the eighteenth-century conflict between ‘Whigs’ and ‘Tories’, in which the ‘Whigs’ – who were political liberals – believed in the concept of progress, while the ‘Tories’ – who were political conservatives – distrusted anything new and clung to the institutions of the past.

Butterfield’s little book demonstrated the major flaw in the Victorian representation of history: imagining, as it did, the past as an inevitable progression towards ever greater liberty and enlightenment, culminating in the finished forms of liberal democracy and constitutional monarchy that we have today. Butterfield’s thinking was rapidly and widely adopted in academic circles. Indeed, from then on, anyone in those circles offering a view of the world based on the Victorian notion of progress was in danger of being dismissed as ‘Whiggish’ in their approach.

And yet, Whiggish interpretations of history continued to influence the popular imagination, and to this very day can be found throughout popular culture in the form of films, television documentaries, and even history textbooks. This can be most dramatically seen in the story of the famous scientific illustration, commonly referred to as The March of Progress.


‘The March of Progress’ is not only one of the most famous scientific illustrations ever created, it is also one of the most misunderstood.

The illustration was originally commissioned for Time-Life Books in 1965 by the anthropologist F. Clark Howell and painted by the natural history artist Rudolph Zallinger. It was designed as a visual summary of 25 million years of human evolution, lining up a series of figures from the history of human evolution marching, from left to right, in a single line.

It was never the authors’ intention to imply a linear, ancestor-to-descendant parade, but as its popularity grew the image became known as The March of Progress, reinforcing, as it did so, the old, discredited Victorian idea of progress: that human evolution had developed in a linear, sequential fashion along a predetermined path towards our current ‘finished’ form as human beings.

Shocked by the scale of the popular misreading of the image, Howell later remarked that ‘the artist didn’t intend to reduce the evolution of man to a linear sequence, but it was read that way by viewers… The graphic overwhelmed the text. It was so powerful and emotional’.

 

 

The end of the dream of human omniscience

 

However, it wasn’t just in the realm of popular culture that the Victorian concept of progress persisted. Watching these Victorian ideas – like the idea of progress and the dream of an all-embracing ‘Theory of Everything’ – gradually evaporate during the twentieth century is a bit like watching the tide go out on a very shallow, shelving beach: in some areas they disappeared very quickly, whilst in others they lingered in large but increasingly isolated pools.

Ironically perhaps, the last remaining pools left behind by the receding tide were in the field of science itself. Twentieth-century science needed the idea of progress, since the notion that science is a body of knowledge passed down from one generation to the next was, for many scientists, a fundamental article of faith… a prerequisite for the idea of science itself.

But as science developed in the twentieth century, it began to become clear that far from steadily unraveling the nature of the universe, many of these new discoveries were, in fact, simply opening up more questions.

As late as 1930, David Hilbert (the mathematician who had issued the original challenge to solve the 23 unsolved problems of mathematics at the International Congress of Mathematicians in Paris in 1900) was once again attacking the idea that there were any limits to scientific knowledge – a position exemplified by the phrase ignoramus et ignorabimus, meaning “we do not know and will not know”. In a celebrated address to the Society of German Scientists and Physicians in Königsberg, he once again stated:

We must not believe those, who today, with philosophical bearing and deliberative tone, prophesy the fall of culture and accept the ignorabimus. For us there is no ignorabimus, and in my opinion none whatever in natural science. In opposition to the foolish ignorabimus our slogan shall be: Wir müssen wissen — wir werden wissen! (‘We must know — we will know!’)

Less than a year later, in 1931, a 25-year-old mathematician called Kurt Gödel, who had actually been present at this lecture, demonstrated that Hilbert’s ambitious grand plan to tidy up the remaining questions and anomalies within mathematics was, in fact, impossible. In what came to be known as Gödel’s Incompleteness Theorems, the young mathematician showed that any consistent formal system rich enough to express ordinary arithmetic must contain true statements that it cannot prove: such a system can be consistent, or it can be complete, but it can never be both.

Hilbert never accepted Gödel’s result. The words ‘Wir müssen wissen. Wir werden wissen’ are inscribed on Hilbert’s gravestone.


Hilbert’s grave with the simple words:
Wir müssen wissen. Wir werden wissen.

But perhaps the greatest barrier to the discovery of a grand, all-embracing ‘Theory of Everything’ was the fact that the two great achievements of twentieth-century physics – Einstein’s Theory of General Relativity and the theory of quantum mechanics – had been shown to be incompatible.

After years of research and experimentation, physicists had by the 1950s confirmed virtually every prediction made by the theory of the very large – General Relativity, which describes how gravity governs the behaviour of the universe at the scale of large, high-mass objects like stars and galaxies – and by the theory of the very small – quantum mechanics, which describes the behaviour of objects of small scale and low mass: sub-atomic particles, atoms, molecules and so on – each within its own domain.

The only problem was that physicists had also shown that General Relativity and quantum mechanics, as they are currently formulated, are mutually incompatible. In short, two of the greatest scientific breakthroughs of the twentieth century cannot both be right.

One answer to these seemingly intractable questions came in 1962, with the publication of The Structure of Scientific Revolutions by the physicist and historian of science Thomas Kuhn.

The Structure of Scientific Revolutions changed the way that many scientists think about science, and triggered an ongoing reassessment of what scientific progress really means… the effects of which are still being felt to this very day.

According to The Stanford Encyclopedia of Philosophy:

Thomas Samuel Kuhn (1922–1996) is one of the most influential philosophers of science of the twentieth century, perhaps the most influential. His 1962 book The Structure of Scientific Revolutions is one of the most cited academic books of all time. Kuhn’s contribution to the philosophy of science marked not only a break with several key positivist doctrines, but also inaugurated a new style of philosophy of science that brought it closer to the history of science.

Even though the majority of people have never heard of The Structure of Scientific Revolutions – or its author – their thinking has still been profoundly influenced by his ideas. Indeed, the term ‘paradigm shift’, coined by Kuhn to define one of the central ideas of this ground-breaking work, has become one of the most used, and abused, phrases in modern English.

Kuhn’s great achievement was to change, at a stroke, the way we think about mankind’s attempt to understand the world through science.

As The Stanford Encyclopedia of Philosophy puts it, before Kuhn our view of science had been dominated by a narrative of scientific progress as ‘the addition of new truths to the stock of old truths, or the increasing approximation of theories to the truth, and in the odd case, the correction of past errors’. In other words, we had seen science as providing an inevitable and heroic progression towards the ultimate “truth”: a progression in which each successive generation of scientists built on the discoveries and knowledge of previous generations by ‘standing on their shoulders’.

However, rather than science being this steady, cumulative “progress”, Kuhn saw the history of science as a series of revolutions within which conflicting paradigms overthrew one another. A paradigm is never overthrown until a replacement paradigm is waiting in the wings, and crucially this new paradigm is not necessarily any more ‘truthful’ than the one that it replaces.

According to Kuhn, one of the aims of science is to find models that will account for as many observations as possible within a coherent framework. So, for example, taken together, Galileo’s re-evaluation of the nature of motion and Keplerian cosmology represented a coherent framework that was capable of rivaling and replacing the Aristotelian/Ptolemaic framework.

Once a paradigm shift like this has taken place, the textbooks are rewritten. (And often, at this stage, the history of science is also rewritten, and presented as an inevitable process leading up to the newly established framework of thought). At this point in the establishment of a paradigm, there is a widely held belief that all hitherto-unexplained phenomena will, in due course, be accounted for in terms of this newly established framework of knowledge.

Controversially, Kuhn suggested that those scientists who choose to operate within an established paradigm spend their lives in a process of mere ‘puzzle-solving’, since the initial successes of the established paradigm tend to generate the belief that the paradigm both predicts and guarantees that solutions to these puzzles exist.

Kuhn calls this ‘puzzle-solving’ process ‘Normal Science’. However, this ‘Normal Science’ begins to run into problems as the paradigm is stretched to its limits and anomalies – i.e. failures of the current paradigm to take into account newly observed phenomena – begin to accumulate. Some of these anomalies may be dismissed as errors in observation, whilst others may require only a few minor adjustments to the prevailing paradigm. But no matter how many anomalies are found, the scientific community as a whole will not lose faith in the established paradigm until a credible alternative is available.

And yet, Kuhn maintained, in any community of scientists there would always be some individuals who would embrace these anomalies, and who would begin to practice what he calls ‘Revolutionary Science’… exploring alternatives to the long-held assumptions of the prevailing paradigm. Occasionally this ‘Revolutionary Science’ will create a new paradigm, a rival to the established framework of thought. And in time, if the majority of the scientific community adopts this challenger paradigm, it will completely replace the old paradigm, and a ‘paradigm shift’ will have occurred.

In this way, Kuhn argued that competing paradigms are “incommensurable”: that is to say, there exists no objective way of assessing their relative merits.

In other words, there is no single, objective ‘Theory of Everything’ waiting to be discovered by modern science; there is only the process of ‘puzzle-solving’ within the prevailing paradigm – which is itself simply the current shared belief system of the community of scientists… until a new paradigm shift takes place.

Obviously many scientists were, and still are, scandalized by the suggestion that modern science is a culturally constructed narrative, rather than the progression towards some kind of universal truth.

But perhaps more importantly, even though the majority of people have never heard of either Thomas S. Kuhn or The Structure of Scientific Revolutions, we have all unconsciously adopted his thinking… including the idea of ‘paradigms’ and the ‘paradigm shift’. And, unlike our Victorian forebears, we are now willing to believe that reality can be viewed from a number of cultural perspectives, each of which is equally valid.

 

 

 

The American Detective

 

British Golden Age crime fiction is often referred to as ‘Cozy’ crime fiction, since it tends to be set in an idealized version of middle- or upper-class England… a reassuringly ordered world that – although temporarily disturbed by the nuisance of an unsolved crime – is always restored to its natural state of peace and harmony in the end.

The act of solving the crime in a ‘Cozy’ is therefore doubly satisfying, since it represents both an intellectual accomplishment and an act which restores order and balance to the world. Furthermore, it is one in which the reader is an active participant in the ‘sujet’ of the narrative – the scientific search for the truth – and a detached observer of the ‘fabula’ – the dark story of the crime itself.

The ‘Cozy’ was the product of a different age, an age of scientific and social certainties, and as the twentieth century progressed through world war and economic depression, many of those certainties began to unravel, and this form of murder mystery began to look increasingly anachronistic and unrealistic.

And nowhere in the world did Golden Age British detective fiction look more artificial and anachronistic than in the United States during the Great Depression.

Taking the essential structure of the detective story, writers like Dashiell Hammett and Raymond Chandler were to fashion a new, more contemporary kind of fiction that would come to be known as ‘Hardboiled’.

‘Hardboiled’ detective fiction developed directly out of the American world of pulp detective magazines like True Detective, the pioneering American crime magazine that specialized in dramatizing real-life crime stories.

These pulp detective magazines had reached the peak of their popularity in the 1920s and 1930s, at a time when Prohibition was turning ordinary citizens into criminals and ordinary criminals into celebrities. Magazines like True Detective had become so popular – some would sell up to one million copies per issue – that real-life cops and robbers vied to see themselves in their pages. Even the FBI boss, J. Edgar Hoover himself, found time to write regularly for the pulp detective magazines.

The ‘Hardboiled’ world lacks the comforting certainty of the British ‘Cozy’, embedded as it is in the reality of crime and violence. And in contrast to the ‘Cozy’ tradition – where deeds of a sexual and violent nature often feature as part of the fabric of the ‘fabula’, but rarely as part of the ‘sujet’ – the ‘Hardboiled’ world is hostile, dangerous and morally ambiguous in both the fabula and the sujet. Furthermore, unlike their ‘Cozy’ British counterparts, these detectives solve mysteries by moving freely within the world of those who commit the crimes, and not just by observing them scientifically from a distance.

The first and, perhaps, the most famous example of the ‘Hardboiled’ detective is a character created by Dashiell Hammett called Sam Spade. As Hammett himself later described him:

Spade has no original. He is a dream man in the sense that he is what most of the private detectives I worked with would like to have been and in their cockier moments thought they approached. For your private detective does not — or did not ten years ago when he was my colleague — want to be an erudite solver of riddles in the Sherlock Holmes manner; he wants to be a hard and shifty fellow, able to take care of himself in any situation, able to get the best of anybody he comes in contact with, whether criminal, innocent by-stander or client.

A composite of many of Hammett’s previous detectives, Sam Spade was to become the prototype for a vast number of cynical, world-weary hard-boiled detectives. His is a fictional character that casts a very long shadow. His influence can be felt in characters as diverse as the retired police officer, Rick Deckard, in the movie Blade Runner, Raymond Chandler’s Philip Marlowe, Henning Mankell’s Kurt Wallander or private investigator J.J. “Jake” Gittes in the movie Chinatown.

Although Spade first appeared in The Maltese Falcon – serialized in 1929–30 in a pulp magazine called The Black Mask before being published as a novel in 1930 – it was not the American pulp magazines that were destined to make the hard-boiled detective famous. It was the movies.

And although the 1931 movie version of The Maltese Falcon had been a modest commercial and critical success, it was the great 1941 remake – directed by a young first-timer called John Huston and featuring a then relatively unknown 42-year-old actor called Humphrey Bogart as Sam Spade – that created the archetype of the modern hard-boiled detective, in a new, highly expressive film style that was to become known as ‘Film Noir’. Indeed, John Huston’s Maltese Falcon is widely regarded as the greatest detective movie of all time.

 

 

Into the darkness with Film Noir

 

Ever since the Wall Street Crash in 1929, America had been in the grip of the Great Depression. But for the big five Hollywood studios (MGM, Paramount, Fox, RKO and Warner Bros.) the Great Depression was a veritable boom time. Going to the movies was one way for people to escape from it all – at least for a while – and by 1939 the number of movie theaters in the United States had grown to over fifteen thousand.

The 1930s had seen amazing technical advances in both Technicolor and sound, as evidenced by epic movies like The Wizard of Oz and Gone with the Wind. But these films were unbelievably expensive to make because the color technology they employed was still in its infancy and the three-strip color process used in the production of these films required massive amounts of lighting and time to create.

To maximize their investment in these expensive blockbuster spectacles, the studios used a system called “block booking”. This meant that for cinemas to get the rights to show the big A-list movies, they would have to buy blocks of films which included an assortment of less desirable B-list movies. At the height of the studio era, these blocks could include up to a hundred films a year, purchased blindly and in advance by the theaters, often before the films had even gone into production.

Because of this need for large volumes of low-cost B-list movies, there was a massive demand for stories, and many of these were found in the pulp fiction of the time, which featured westerns, sci-fi, horror stories and, of course, the new ‘Hardboiled’ detective stories.

Because these were low-budget movies whose financial success was relatively assured, a certain amount of experimentation was allowed in how their stories were told. Directors like the German immigrant Fritz Lang – who had been at the forefront of German Expressionist cinema, with its highly stylized set design, unusual camera angles and dramatic lighting – were quick to seize the opportunity to create movies in a more expressive style, a style that came to be known as ‘film noir’.

The primary literary influence on film noir was the Hardboiled school of writers such as Dashiell Hammett and James M. Cain – both of whom had written for The Black Mask. The classic film noirs The Maltese Falcon and The Glass Key (1942) were based on novels by Hammett, whilst Cain’s novels provided the basis for Double Indemnity (1944), Mildred Pierce (1945), The Postman Always Rings Twice (1946) and Slightly Scarlet (1956). However, it was Raymond Chandler – another Black Mask writer – who soon became the most famous author of the Hardboiled school. Not only were Chandler’s novels turned into major noirs – Murder, My Sweet (1944), The Big Sleep (1946) and Lady in the Lake (1947) – but Chandler also became an important screenwriter in the genre, producing the scripts for Double Indemnity, The Blue Dahlia (1946) and Strangers on a Train (1951).


But it is the visual expression of ‘Noir’ movies that is perhaps their most striking feature. It was the French film critic Nino Frank who first coined the phrase ‘film noir’, in 1946. And, as the name suggests, this is a form of cinema in which darkness dominates the screen.

And within these enormous mysterious shadow areas we sense the realm of the unknown. Because in film noir, it is not what you can see, but what you cannot see, that sets the tone of the drama.

It is this deep and unmistakably modern truth that the American painter Edward Hopper articulates so beautifully in his painting Nighthawks – a painting that was heavily influenced by film lighting; indeed, it could very easily be a still from just such a movie. In turn, Nighthawks has gone on to be used as a reference for the lighting design of countless film noir movies.


ABOVE: “Study for Nighthawks” by Edward Hopper, 1941 or 1942, fabricated chalk and charcoal on paper, 28.3 × 38.1 cm (11 1/8 × 15 in.). BELOW: “Nighthawks” by Edward Hopper 1942, Oil on canvas 84.1 cm × 152.4 cm (33 1⁄8 in × 60 in)

Much has been made of the relationships between the three characters seated at the bar. Earlier in his career, Hopper had earned his living creating cover art for pulp detective magazines, and in the process he had taught himself how to condense the suggestion of a large, complex narrative into a single image. According to his biographer, Gail Levin, the painting was inspired by Ernest Hemingway’s story The Killers, in which two hit men arrive at a diner to murder a burnt-out prizefighter for some undisclosed offence. And yet we do not really need any backstory to read this painting. Whatever the relationship between the characters inside the diner, the real drama lies in the relationship between the light of the interior and the darkness of the exterior. Indeed, the darkness outside the diner is the real point of Nighthawks.

Inside it is safe, and there is certainty. But step outside, and nothing is certain… for who knows what danger lurks in those shadows? In these mysterious shadow areas we sense the realm of the unknown. And we know it is not what you can see, but the sheer enormity of what you cannot see, that will ultimately decide the fate of the characters taking temporary refuge in the illuminated interior.

 

Forget it, Jake. It’s Chinatown.

 

It has been said that the detective novel ‘brings light into dark places, and, in doing so, for a brief period at any rate, it washes the world clean’. And yet, with the advent of American film noir, it is clear that we have journeyed a great distance from the ‘cozy’ world of Lord Peter Wimsey and Hercule Poirot, from stories set in an idealized upper-class England… a reassuringly ordered world that, although temporarily discommoded by the presence of an unsolved crime, is always restored to its natural state of peace and harmony in the end.

And this is partly because the world had changed. We no longer believed in one ‘authorised’ version of the truth, we were less inclined to believe that we could know everything, and we no longer had a simple blind faith in the Victorian notion of progress.

Indeed, the ‘Hardboiled’ detective is like one of Thomas S. Kuhn’s renegade scientists who, finding anomalies in the prevailing paradigm, is forced to construct his own belief system outside of conventional society.

And although the essential structure of the ‘Sensation Novel’ is still present – the audience goes on a journey of discovery with the detective – the tone and manner have changed considerably. Here sex and violence feature as part of the fabric of both the ‘fabula’ and the ‘sujet’; indeed, the ‘Hardboiled’ protagonist is often as dangerous and morally ambiguous as the world in which he now thoroughly immerses himself. And perhaps more importantly, when the mystery is revealed and the crime solved, order is no longer necessarily restored, and evil is not necessarily removed from the world.

Nowhere is this more true than with Roman Polanski’s 1974 movie, Chinatown.

With the exception of John Huston’s The Maltese Falcon, Roman Polanski’s Chinatown (1974) is often considered the greatest detective movie ever made. The original screenplay was written by Robert Towne in the style of the classic ’30s and ’40s film noirs of Dashiell Hammett and Raymond Chandler, and features a protagonist who is clearly based on the likes of Sam Spade and Philip Marlowe… a private eye called J.J. “Jake” Gittes, a part that Towne had written specifically for the actor Jack Nicholson.

Robert Towne’s screenplay for the film has become a model for other writers and filmmakers, and is often cited as one of the finest examples of the craft. However, it was the director, Roman Polanski, who insisted on the fatal final scene, overruling Towne’s happier ending and thus transforming what might have been a good period detective story into one of the greatest movies of all time. “I knew that if Chinatown was to be special,” Polanski later said, “not just another thriller where the good guys triumph in the final reel, Evelyn had to die.”

Towne had worked on many high-profile movies such as Bonnie and Clyde (1967), The Godfather (1972), and The Last Detail (1973). In 1971, producer Robert Evans offered him $175,000 to write a screenplay for The Great Gatsby (1974), but Towne felt he could not improve on F. Scott Fitzgerald’s novel and instead asked Evans for a mere $25,000 to write the screenplay for his own original story, Chinatown.

 

 


“Forget it, Jake. It’s Chinatown.” Jack Nicholson as  J.J. “Jake” Gittes in the movie Chinatown.

Set in 1937, Chinatown describes the manipulation of a critical natural resource by a shadowy cadre of city oligarchs. It was to be the first part of a planned trilogy featuring J.J. Gittes as he investigates the suppression of the public interest by private greed through the manipulation of natural resources – in this case, water. (The second and third parts were to deal with the oligarchs’ appropriation of oil and land.) Although the story is set in 1937, Chinatown is based on real-life events in Los Angeles that became known as the Owens River Valley Scandal and actually took place in 1908.

J.J. “Jake” Gittes, a low-rent divorce detective, is hired to follow the LA water commissioner by a woman claiming to be his wife, who says he is cheating on her, at a time when Los Angeles is suffering from severe water shortages. We follow him as he gradually uncovers secrets that reveal layer upon layer of corruption and deception. The wickedness revealed is staggering, as we are slowly exposed to the corruption of politics, money, sex, innocence and even the land itself.

And although the central crime in the story is institutionalised patriarchal rape, as Margaret Leslie Davis says in her 1993 book Rivers in the Desert: William Mulholland and the Inventing of Los Angeles, this sexually charged film is also a metaphor for the “rape” of the Owens Valley.

When Jake finally solves the mystery, he is incapable of righting the wrongs he discovers. In the end, there is just a sense of futility and powerlessness, in the face of such absolutely corrupt and unassailable power.

As his partner says: ‘Forget it, Jake. It’s Chinatown.’

 

 

Dark Matter

 

 

As the twentieth century drew to a close, Western science began to become more and more aware of just how much it did not know. Indeed, in a complete reversal of Victorian thinking, which saw everything as knowable and a complete Theory of Everything as just over the horizon, scientists now began to realise that the vast majority of the universe is made up of stuff that we cannot see, detect or even comprehend.

The first inkling scientists had that there might be more mass in the universe than was previously realised came in the 1970s, when Vera Rubin, a young astronomer at the Department of Terrestrial Magnetism at the Carnegie Institution of Washington, began observing the speeds of stars at various positions in their galaxies. Traditional Newtonian physics predicts that stars on the outskirts of a galaxy should orbit more slowly than stars near the center, and yet Rubin’s observations found that all the stars in a galaxy seem to circle the center at roughly the same speed. Research by other astronomers confirmed the anomalies that Rubin had found and eventually, based on observations and computer models, a paradigm shift took place in true Kuhnian fashion: scientists concluded that there must be much more matter in galaxies than is visible or detectable. They called this material dark matter, and estimated that it accounts for around 23% of the mass-energy content of the universe.
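
To see why flat rotation curves were so surprising, it helps to sketch the textbook Newtonian expectation – a back-of-the-envelope calculation, not Rubin’s own published analysis. For a star of mass $m$ circling at radius $r$ around an enclosed mass $M(r)$, gravity supplies the centripetal force:

\[ \frac{G\,M(r)\,m}{r^{2}} \;=\; \frac{m\,v^{2}}{r} \qquad\Longrightarrow\qquad v(r) \;=\; \sqrt{\frac{G\,M(r)}{r}} \]

Beyond a galaxy’s visible edge, $M(r)$ should be roughly constant, so $v$ ought to fall off as $1/\sqrt{r}$. Rubin’s roughly flat curves ($v \approx$ constant) instead imply $M(r) \propto r$ – unseen mass that keeps growing with radius.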

 

In the forty years that have followed, scientists still haven’t been able to establish what dark matter is actually made of. But a more recent discovery, Dark Energy, is possibly even more mysterious. In the mid-1990s, two teams of researchers were measuring the light from distant exploding stars (Type Ia supernovae) to determine how fast the universe was expanding at various points in its lifetime.
Based on the prevailing paradigm, astronomers had predicted two possibilities: either the universe has been expanding at roughly the same rate throughout time, or its expansion has been slowing down as it gets older.
Shockingly, the researchers observed neither. Instead, the expansion of the universe appeared to be accelerating.

If the Big Bang theory is true, the gravity of all the mass in the cosmos should have been pulling the universe back inward, just as gravity pulls a ball back to Earth after it has been thrown into the air.
There was clearly some other force operating on a cosmic scale that was counteracting the force of gravity. This force has been called Dark Energy, and it is estimated to account for about 72% of the mass-energy content of the universe.

Together, then, Dark Energy and Dark Matter account for an extraordinary 95% of the mass-energy content of the universe. That which we can see, detect and attempt to comprehend amounts to a mere 5%.
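
Using the round figures quoted above, the cosmic budget is simple arithmetic:

\[ 23\%\ (\text{dark matter}) \;+\; 72\%\ (\text{dark energy}) \;=\; 95\%, \qquad 100\% - 95\% \;=\; 5\% \]

(Later measurements have nudged these percentages slightly, but the proportions remain essentially the same.)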

 

 

 

Knowledge vs Ignorance

 

With the screening of True Detective in 2014, the medium of television finally confirmed its position as the creative equal – some would say the creative superior – of its older sibling, the cinema.

Traditionally, cinema has always looked down on television as a lesser medium, with actors, writers and studio executives regarding movement from the small screen to the large screen as career advancement and movement from the large to the small as, well, not a good idea….

But that was before the extraordinary commercial and critical success of television series like The Sopranos, The Wire, and Breaking Bad. Not to mention the great Scandinavian mystery sagas like The Bridge or The Killing.

These television series have taken the core strength of the medium – the opportunity to tell stories, develop characters and expand upon themes over a much longer time span than the mere couple of hours that a conventional movie affords – and dramatically exploited this advantage to create an exciting new kind of narrative.

So much so that critics like Brett Martin, in ‘Difficult Men: Behind the Scenes of a Creative Revolution‘, believe that these television drama series have “become the significant American art form of the first decade of the 21st century, the equivalent of what the films of Scorsese, Altman, Coppola, and others had been to the 1970s…”

But really it was only when Hollywood ‘A Listers’ Matthew McConaughey and Woody Harrelson made a play for their own small screen franchise, with True Detective, that it seemed clear that some kind of tipping point had been reached.

In the same year that McConaughey won his Oscar for Best Actor, he and his good friend Woody Harrelson launched True Detective through the HBO network, teaming up with the highly regarded author and screenwriter Nic Pizzolatto and the equally respected film director Cary Joji Fukunaga to produce a drama that features some of the finest writing, acting and cinematography to appear on any screen… large or small.

Combining the core strengths of each medium – the lavish production values of a full-scale cinematic production (remarkably, Cary Joji Fukunaga insisted on shooting the whole drama on film) and the leisurely pace that television affords (the entire drama is eight hours long) – they created a narrative structure that expands on many levels to explore grand themes like the nature of truth, the nature of belief and the nature of space and time, through the exploration of a number of damaged male characters set amid the poisoned and polluted post-industrial landscapes of the Gulf Coast of Louisiana.

True Detective is many things at once—a powerful character study of damaged people in damaged landscapes, a gripping murder mystery, a tour de force for Woody Harrelson and Matthew McConaughey. But first and foremost it is about knowledge and ignorance.

There is a scene at the heart of True Detective that shows this in a startling and profoundly moving way.

Rust Cohle, played by Matthew McConaughey, is an ex-cop from Texas and a flickering ghost of a man. Still haunted by the loss of his two-year-old daughter, who died tragically more than twenty years ago, he is now a dishevelled-looking alcoholic being interviewed by two Louisiana detectives who want to know about a murder investigation he was involved in seventeen years earlier.

We learn that the case notes have all been lost in the wake of Hurricane Rita, and that we now have to rely on the testimony of Cohle – and of his ex-partner Marty Hart, whom the two detectives interview separately – to understand what happened all those years ago.

Cohle treats his interviewers with the disinterested disdain of a man who has ventured far beyond the realms of conventional society, and has little need of its illusory comforts or, indeed, any of its social niceties.

Knowing how much they need his testimony, he insists on being allowed to smoke in their non-smoking office, and drink his choice of Texan beer, before he explains things any further.

In contrast, Cohle’s ex-partner, Marty Hart, comes across as a regular guy, a good ole boy with simple Bible Belt family values. Unlike Cohle, he appears to be a well-adjusted member of society who exudes bonhomie and treats his interviewers with genial professionalism.

During the course of the two interviews, however, we begin to see that Cohle and Hart are not all that they seem… Cohle is a detached, outsider figure who seeks knowledge at any price. Intolerant of any kind of falsehood or superstition, he wants to know the truth, no matter how uncomfortable, inconvenient or downright dangerous that truth might turn out to be.

Marty Hart, on the other hand, lacks knowledge… not just of the world around him but, more importantly, of himself. Critically, he fails to understand those who should be closest to him, including his wife – to whom he is unfaithful – and his two daughters – from whom he is profoundly absent even when he is present in their company.

With a stack of empty beer cans in front of him, Cohle asks the two detectives if they are familiar with M-theory – the latest attempt by theoretical physicists to create a theory of everything and reconcile the conflicting requirements of quantum mechanics and general relativity.

Cohle says: You ever heard of something called the M-brane theory, detectives?
Detective 1 responds non-committally: No. That’s over my head.
Cohle says: It’s like in this universe, we process time linearly forward but outside of our spacetime, from what would be a fourth-dimensional perspective, time wouldn’t exist, and from that vantage, could we attain it we’d see our spacetime would look flattened…

At this point Cohle, with an empty beer can in one hand, extends his arms, brings his hands together and smashes the can into a flat silver disc. Cohle continues: …like a single sculpture with matter in a superposition of every place it ever occupied, our sentience just cycling through our lives like carts on a track. See, everything outside our dimension that’s eternity, eternity looking down on us. Now, to us, it’s a sphere, but to them it’s a circle.

We cut to Marty Hart’s two daughters, aged seven or eight, sitting on the front lawn of their suburban house. They are dressed in princess outfits – perhaps they are waiting for their dad to come home – but they have grown bored and are now bickering with one another. In frustration, one of the girls grabs the other’s tiara and throws it high up into the tree above them. The camera follows the tiara up into the tree and waits. And waits. We sense that time has passed before, eventually, the camera comes back to its original position to see that the lawn is now empty. A car pulls up into the driveway, the passenger door opens and a clatter of empty beer cans falls out, followed by a teenage girl who is so drunk she can barely stand.

It is Marty Hart’s daughter.

 

When Cohle first starts talking about M-theory, you’d be forgiven for thinking that this is nothing more than the ramblings of the town drunk, happy to waste police time for a few free beers. However, the film language tells us something very different. By telescoping time, as if to echo the collapsing of the beer can, the edit shows us what Cohle knows and Hart doesn’t: that time spent away from his young family is time that can never be recovered, and that this ignorance will have far-reaching consequences for all of them.

 

The missing particle

 

Just a few kilometers from Geneva, nestling in the foothills of the Jura Mountains, there is a massive underground nuclear research facility which would not look out of place in a James Bond movie. It is called CERN (the European Organization for Nuclear Research), and it is here that some of the world’s most brilliant scientists – the kind of people who invent things like the World Wide Web in their spare time – are probing the fundamental structure of the universe. Here, in a vast underground complex with a 27-kilometer circumference, rather modestly called the Large Hadron Collider, they are smashing particles of matter together at close to the speed of light in order to provide insights into the fundamental laws of nature, and specifically to confirm the existence of an elusive sub-atomic particle known as the Higgs boson.

The Large Hadron Collider took about a decade to construct, at a total cost of about $4.75 billion. Electricity costs alone for the LHC run to about $23.5 million per year, and its total operating budget runs to about $1 billion per year. The search for the Higgs boson involved the work of almost 3,000 physicists from 169 institutions, in 37 countries and on five continents.

On the 4th of July 2012, Fabiola Gianotti, the Italian particle physicist and spokesperson for the ATLAS experiment at the Large Hadron Collider, announced success. But as soon as she did so, she said: “We need more data.” As Stuart Firestein, who chairs the Department of Biological Sciences at Columbia University, puts it in Ignorance: How It Drives Science:

“We need more data.” With these words, Fabiola Gianotti wrapped up the triumphant announcement that the elusive Higgs boson particle had been detected. Gianotti is the physicist in charge of the experiment at the Large Hadron Collider where this unveiling was made. She added “surprise, surprise” to the end of that sentence, not as a damp squib, or faux humility, nor a beg for more grant money. She said these words because she understands that science is a process not a bank of knowledge…

As a culture, we have a voracious appetite for information. And it is perhaps the ultimate irony, that just as the digital revolution has given each and every one of us easier and faster access to exponentially larger and larger amounts of information, we are only now beginning to become aware of just how much we do not know, and in fact, just how much we cannot know.

Put simply, we are beginning to realize that larger and larger amounts of information do not necessarily guarantee larger and larger amounts of knowledge – indeed there is evidence to suggest that in many cases the opposite may often be true. This is a phenomenon that is often described as ‘the illusion of knowledge.’

The illusion of knowledge and its counterpart, the ignorance of ignorance, are two of the most important philosophical ideas of the digital age. They have found their simplest and best articulation in the rather unlikely form of a statement by Donald Rumsfeld, the then United States Secretary of Defense, when he responded to a journalist at a press briefing in February 2002 who had asked him about the lack of evidence for weapons of mass destruction in Iraq. What Rumsfeld said was:

‘…there are known knowns; there are things that we know that we know. We also know there are known unknowns; that is to say we know there are some things we do not know. But there are also unknown unknowns, the ones we don’t know we don’t know.’

Despite being the subject of much derision at the time – The Plain English Campaign gave Rumsfeld its Foot in Mouth Award – Rumsfeld’s statement has come to be seen by many as being (however unintentionally) one of the best summaries of the problem of quantifying ignorance, and the impact that ‘unknown unknowns’ may have on our world.

‘Unknown unknowns’ or the ignorance of ignorance are central to the thinking of Nassim Taleb and his use of 19th century philosopher John Stuart Mill’s metaphor of the black swan.

‘A black swan’ was a common expression in 16th century London as a statement of impossibility, much like the way we use the phrase ‘flying pigs’ today. This was based on the presumption that because all swans ever observed in the Northern hemisphere were white, ALL swans in the world must therefore be white. After the Dutch explorer Willem de Vlamingh discovered black swans in Western Australia in 1697, the term metamorphosed to mean that a seeming impossibility might later be disproven.

The idea of ‘black swan’ events – ones which cannot be predicted but which have a massive impact on human history – was introduced by Nassim Nicholas Taleb in his 2001 book Fooled By Randomness. His 2007 book The Black Swan extended the idea and argued that almost all major scientific discoveries, historical events, and artistic accomplishments are, in fact, “black swans”. The Internet, the personal computer, World War I, the dissolution of the Soviet Union, and the September 11 attacks are all examples of black swan events.

As a Lebanese whose family home was destroyed by an unforeseen war, Nassim Taleb is more aware than most of the dangers of ignoring ‘unknown unknowns’, and refers to our attitude to ignorance as epistemic arrogance, or ‘our hubris concerning the limits of our knowledge.’

Stuart Firestein eloquently describes how the presence of these ‘unknown unknowns’ means that science can never be a finite process, but must remain an ongoing process that constantly revises itself:

Science, then, is not like the onion in the often used analogy of stripping away layer after layer to get at some core, central, fundamental truth. Rather it’s like the magic well: no matter how many buckets of water you remove, there’s always another one to be had. Or even better, it’s like the widening ripples on the surface of a pond, the ever larger circumference in touch with more and more of what’s outside the circle, the unknown. This growing forefront is where science occurs… It is a mistake to bob around in the circle of facts instead of riding the wave to the great expanse lying outside the circle.

Three decades ago, Stephen Hawking famously declared that a “theory of everything” was on the horizon, with a 50 per cent chance of its completion by the year 2000. In 2002, however, Hawking gave a lecture entitled Gödel and the End of Physics, in which he described how he no longer believed that a “Theory of Everything” was possible, given Gödel’s incompleteness theorems, which show that any mathematical system powerful enough to describe arithmetic can be either consistent or complete, but never both.

In The Grand Design, written with Leonard Mlodinow, Stephen Hawking explains how, in the early 1990s, string theory was struggling with a multiplicity of separate and distinct theories: instead of a single theory of everything, there seemed to be five. Beginning in 1994, though, physicists noticed that, at low energies, some of these theories were mathematically equivalent to one another, suggesting that they might just be different descriptions of the same thing. Eventually, one string theory was shown to be mathematically equivalent to 11-dimensional supergravity, a theory that described not only strings but membranes too. Many physicists now believe that this supergravity theory is actually one piece of a hypothetical ultimate theory, which they call M-theory, of which all the different string theories offer us the merest glimpses.

Thus, according to Mlodinow and Hawking, the only way to understand reality is to employ a philosophy called “model-dependent realism”, in which we can never attain a single comprehensive theory of the universe. Instead, science offers many separate and sometimes overlapping windows onto a common reality.

Great science is not just there to answer questions and provide explanations. Indeed, the works of Gödel, Schrödinger, Heisenberg and Einstein have, perhaps, all raised greater questions than they have answered.

And that is something we should all celebrate rather than deny. For as Thomas Kuhn has shown, so much of experimental physics is simply puzzle solving within an existing paradigm, whereas what we should be doing is exploring the inconsistencies that may point the way towards a new paradigm. The Late Victorians had longed for a science that would provide the answer to life, the universe and everything. And although this yearning was to find its perfect expression in the fiction of the scientific detective, it would never be realized in the real world of experimental science.

Although detective fiction exists to solve and explain puzzles, perhaps the ultimate purpose of great science – like that of any great detective story – is not to solve mystery… but to create it.

As Professor Stephen Hawking puts it:

‘Some people will be very disappointed if there is not an ultimate theory that can be formulated as a finite number of principles. I used to belong to that camp, but I have changed my mind. I’m now glad that our search for understanding will never come to an end, and that we will always have the challenge of new discovery. Without it, we would stagnate. Gödel’s theorem ensured there would always be a job for mathematicians. I think M theory will do the same for physicists…’

 

 

 

 

 

 

26
Oct

Marshall McLuhan: Creativity in a world without boundaries

In 1996, Gary Wolf, a writer for Wired Magazine, noticed that someone calling himself Marshall McLuhan was posting comments on the Wired website.

This struck Gary as more than just a little curious, since Marshall McLuhan had been dead for more than sixteen years.

Not one to be put off by such details, and sensing one hell of a story, Gary emailed the deceased media guru and asked him to do an interview.

Marshall McLuhan agreed, and the highly unusual exchange that followed was published in Wired Magazine (you can read it here).

When questioned about the experience, Gary concluded, “If the author was not McLuhan himself, it was a bot programmed with an eerie command of McLuhan’s life and inimitable perspective.”

Now, whether or not you believe in Marshall McLuhan’s ability to conduct interviews from beyond the grave, the fact is, he was never a man to be limited by boundaries.

Unlike most Western knowledge.

For the last five hundred years, since the invention of the printing press, Western culture has divided human knowledge into a number of separate, discrete silos.

So for example, if you want to find a book in your local library, whether you realise it or not, you would be finding your way around by using something called the Dewey decimal classification system.

This ingenious little system was introduced by Melvil Dewey in 1876, and makes it really easy to find any book in the world by its subject matter. In order to do this, all human knowledge is divided into ten broad areas:

000 – Generalities

100 – Philosophy and Psychology

200 – Religion

300 – Social Sciences

400 – Language

500 – Natural Science and Mathematics

600 – Technology (Including Applied Sciences)

700 – The Arts

800 – Literature and Rhetoric

900 – History and Geography

These ten areas are then each divided into ten divisions, each having ten subsections. The structure is hierarchical and the numerical convention follows this structure. So each subdivision gives increasingly specific subjects within a broader subject area, for example:

500 Natural Science and Mathematics

510 Mathematics

516 Geometry

516.3 Analytic geometries

516.37 Metric differential geometries

516.375 Finsler Geometry

The great strength of the Dewey System is that it allows the reader to find any given subject and drill down into it to discover more and more specialized levels, within that subject.

But that is also its weakness.

Because whilst you can explore data vertically as much as you want, you cannot explore data horizontally.

So you cannot, for example, move easily from, say, Neuroscience to Advertising. And a book that has the audacity to cover both will be forced to choose one or the other.
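
A crude sketch may make the point clearer. The snippet below is an illustration only – it is not how real library software works, and the 150 and 659 entries are simply extra examples added to the numbers above – but it shows why drilling down a Dewey-style hierarchy is trivial, while moving sideways between subjects has no representation at all:

# A toy Dewey-style catalogue: the hierarchy lives entirely in the numbering.
dewey = {
    "500": "Natural Science and Mathematics",
    "510": "Mathematics",
    "516": "Geometry",
    "516.3": "Analytic geometries",
    "516.37": "Metric differential geometries",
    "516.375": "Finsler Geometry",
    "150": "Psychology",     # illustrative entry for the example below
    "659": "Advertising",    # illustrative entry for the example below
}

def drill_down(prefix):
    """Vertical exploration: every class whose number starts with `prefix`."""
    return sorted((code, name) for code, name in dewey.items()
                  if code.startswith(prefix))

# Easy: everything under 516 (Geometry) falls out of the numbering itself.
print(drill_down("516"))

# Hard: a book on the neuroscience of advertising must be shelved under
# exactly one number (say 659), and nothing in the scheme links it back
# to 150 -- the horizontal connection simply cannot be expressed.

The web, by contrast, lets any page link to any other – which is precisely the kind of horizontal movement McLuhan anticipated.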

The Dewey System is a perfect model of the way that we in the West have interacted with information since the invention of printing.

And this creates extraordinary limitations on the way that we do our research, the way that we make our scientific discoveries and the way that we educate our children.

But all of this was to change utterly with the advent of the digital age.

Fifty years before anybody ever updated their Facebook page or posted their whereabouts on Twitter, Marshall McLuhan not only predicted the creation of the internet, he also predicted most of its defining characteristics.

And he coined the phrase “surfing” to describe the way we would all navigate our way in a non-linear fashion through the sum total of human knowledge.


A double-page spread from “The Medium is the Massage: An Inventory of Effects” (1967). Fifty years ago, Marshall McLuhan had not only predicted the creation of the internet, he had also predicted most of its defining characteristics… including the phrase “surfing” to describe the way we would be able to navigate our way, in a non-linear fashion, through the sum total of human knowledge.

In this way, all human knowledge suddenly becomes interconnected in ways that were previously inconceivable.

And there is no better expression of this world without boundaries than the “Mashup”.

Mashups are perhaps the defining characteristic of late 20th and early 21st century culture. Whether it’s music, video, literature, or software a mashup combines material from two or more sources to create something that is simultaneously 100% derivative and 100% original.

Mashups work by linking two separate cultural expressions, and seeing how they inform and influence each other.

For sheer comedy value the farther apart the two elements are culturally, the funnier the result.

Try this simple thought experiment: think of a subject, any subject – let’s say, for example, Death Metal.

Think of another as far removed from that as possible, say, monks of the Benedictine order.

Now put them together and Hey Presto… you’ve either got the basis of a new TV comedy (Benedictine monks form Death Metal Band… with hilarious consequences) or the next novelty music hit (Death Metal meets Gregorian chant.)


Mashups are perhaps the defining characteristic of late 20th and early 21st century culture. They work by linking two separate cultural expressions, and seeing how they inform and influence each other. For sheer comedy value the farther apart the two elements are culturally, the funnier the result.

In many ways, mashup techniques have become the default methodology for creativity in the digital age because they work so consistently well.

In fact, no less an author than J.K. Rowling used the mashup technique when she took the narrative conventions of the genre commonly known as “Sword and Sorcery” and mashed them up with that most beloved genre of English children’s literature… the boarding school novel (a genre made famous by Anthony Buckeridge’s Jennings and Enid Blyton’s Malory Towers, amongst others) – and so created the most successful children’s books of all time… the Harry Potter novels.


J.K. Rowling created a classic mashup when she took the narrative conventions of “Sword and Sorcery” and mashed them up with that most beloved genre of English children’s literature… the boarding school novel.

But for McLuhan, the instantaneous nature of electronic media had implications far beyond the cultural mashup.

McLuhan had shot to international stardom in the 1960s with radical ideas about the effects of media on human consciousness.

For McLuhan, the introduction of any new medium – whether alphabetic writing, the printing press, or television – will always affect our central nervous system by becoming an extension of one or more of the five human senses.

Thus the introduction of any new medium has the effect of distorting the way in which we perceive reality.

So every time there is a significant new development in media, there will also be an equally significant impact on human consciousness.

For McLuhan, the most significant effect of electronic media was to dissolve the traditional barriers that segmented knowledge into separate compartments. And in particular the eradication of our traditional notions of “Time” and “Space”.

This “allatonceness” – as McLuhan terms it – created by digital media allows us to connect across the complete range of human knowledge and experience. Instantaneously dissolving the divisions between, say, language and mathematics, art and advertising, opera and pop music, distance and proximity and the living and the dead.

In “The Medium is the Massage: An Inventory of Effects” (1967), a book he co-created with the graphic designer Quentin Fiore, Marshall McLuhan describes the dramatic impact being brought about by the arrival of electronic media:

“Societies have always been shaped more by the nature of the media by which men communicate than by the content of the communication…

The alphabet and print technology fostered and encouraged a fragmenting process, a process of specialism and of detachment. Electric technology fosters and encourages unification and involvement…

Ours is a brand-new world of allatonceness. ‘Time’ has ceased, ‘space’ has vanished. We now live in a global village…a simultaneous happening.

Electric circuitry profoundly involves men with one another. Information pours upon us, instantaneously and continuously. As soon as information is acquired, it is very rapidly replaced by still newer information.

Our electrically configured world has forced us to move from the habit of data classification to the mode of pattern recognition. We can no longer build serially, block-by-block, step-by-step, because instant communication insures that all factors of the environment and of experience co-exist in a state of active interplay.”

And yet these ideas about the effects of media on human consciousness, although radical, were not entirely original.

McLuhan was particularly indebted to Harold Innis, professor of political economy at the University of Toronto and the author of a number of seminal works on media, and communication theory.

McLuhan was indebted to Innis not just for giving the young McLuhan a framework for his ideas about media, but also for giving him permission to ignore cultural boundaries in his search for a greater understanding of the effects of media on culture and consciousness:

“I remember the excitement I felt when I first realized I didn’t have to restrict my studies to literature. Innis taught me that I could roam through all history and all subjects in search of the true meaning of the medium is the message.

My friend… who teaches economics at Toronto University tells me that F. von Hayek says, “Nobody can be a great economist who is only an economist – and I am even tempted to add that the economist who is only an economist is likely to become a nuisance if not a positive danger.”

Likewise, no student of media studies can afford to be only a student of media studies. A man who only reads about TV is as good for a man as a steady diet of coke and chips.”

Ignoring all the usual boundaries between academia and popular culture, contemporary creativity and ancient literature, McLuhan became the most extraordinary synthesizer of the ideas of others.

From the Ukrainian scientist Vladimir Ivanovich Vernadsky and the French Jesuit philosopher Teilhard de Chardin he took the idea of the “Noosphere” as the basis for his notion of the “Global Village”. Walter Ong’s account of what he calls the “oral-to-visual” shift was, in his own words, “hammered out with great agony” in his 1958 book “Ramus, Method, and the Decay of Dialogue”, and was to greatly influence McLuhan’s first major publication, “The Mechanical Bride”.

But, without doubt, the greatest influence of all on Marshall McLuhan came neither from philosophers nor from media theorists… but from a little-known revolutionary art movement that had appeared in Britain on the eve of the First World War.

This movement was called “Vorticism”, and although it had little impact on the world at the time, the men behind it – Ezra Pound, Wyndham Lewis, James Joyce and T.S. Eliot – would go on to shake the culture of the English-speaking world to its very foundations.

These “Men of 1914”, as McLuhan was fond of calling them, set out to destroy many of the artificial boundaries that separated high art from low art, and would both define the key characteristics of the Modernist world and help bring it into being.

Through their influence on McLuhan, however, they would also prefigure and define the key characteristics of the digital age (including the invention of the cultural Mashup). And help bring the digital age into being.

Next: Allatonceness: McLuhan and the Men of 1914

23
Jun

How our culture blinds us to the world around us

Research indicates that, like the characters in C.S. Lewis’s Chronicles of Narnia, small children experience many things which adults can no longer see…

From the Wild Lands in the North, to the Great Deserts in the South, and the Majestic River of Telmar in the West, to the High Castle of Cair Paravel in the East, the land of Narnia is an extraordinary region populated by centaurs, dragons, talking animals and all manner of wonders which no adult human may ever see.

And although no adult may set foot in the land of Narnia, children may enter it through the famous wardrobe… as Lucy, Edmund, Susan and Peter famously discover in C.S. Lewis’s Chronicles of Narnia.

But they can do this only as long as they remain children.

It is with sadness that we see first Susan and Peter, then Edmund and Lucy, each learn that, beyond a certain age, they will never be able to return to Narnia.

This narrative of expulsion from paradise is as old as the story of the Garden of Eden, or the myth of “The Golden Age.”

Indeed, most adults, at some level would feel that the adult world somehow lacks the magic, the wonder and the sheer sense of possibility they once experienced as children.

We tend to see the cultural acclimatisation and education of our children as a process of opening their minds to more and more knowledge.

However, recent developments in neuroscience suggest that the opposite is in fact the case.

Because, extraordinary as it may seem, it is now clear that our awareness of the world around us, rather than expanding, actually diminishes in certain key areas as we grow older and become more socially acclimatised to the needs of our own particular tribe or social grouping.

Because, whilst education and the development of the social brain enable us to find our niche in society, this process is often at the expense of significant cognitive abilities.

To put it bluntly: we become blinded to anything other than that which our mother culture defines as reality.

Limited by language

We are, each of us, born with around 100 billion neurons in our brains… to imagine how enormous this number is, just think that it is about the same as the number of stars in the Milky Way.

And in a child’s first years of life, the brain is constantly being sculpted by its cultural surroundings, as it refines its circuits in response to environmental experiences.

Since brain circuits organize and reorganize themselves in response to an infant’s interactions with his or her environment, the synapses – the points where neurons connect – are either built and strengthened, or weakened and pruned away, as needed. (This process is often catchily described as “the neurons that fire together wire together”.)
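
The “fire together, wire together” slogan is shorthand for Hebbian learning. As a rough illustration – a toy model only, not a description of what the infant-brain studies below actually measured – the textbook update rule looks something like this:

# Toy Hebbian update: a synapse is strengthened when its input and output
# neurons are active together, and decays ("pruning") when it goes unused.
# The rule w += eta*x*y - decay*w is the standard textbook form; the numbers
# here are arbitrary illustrations.

def hebbian_step(w, x, y, eta=0.1, decay=0.01):
    """Return the updated synaptic weight after one input/output pairing."""
    return w + eta * x * y - decay * w

w = 0.5
for x, y in [(1, 1)] * 3:        # repeated co-activity: the weight grows
    w = hebbian_step(w, x, y)
print(round(w, 3))

for x, y in [(0, 0)] * 50:       # long disuse: the weight decays away
    w = hebbian_step(w, x, y)
print(round(w, 3))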


Patricia K. Kuhl is a Professor of Speech and Hearing Sciences at the Institute for Learning & Brain Sciences at the University of Washington. She specializes in language acquisition and the neural bases of language. Using magnetoencephalography (MEG, a relatively new technology that measures the magnetic fields generated by the activity of brain cells), Kuhl has, for the first time in human history, been able to show just how babies acquire language.

All spoken languages consist of basic units of sound called phonemes, which combine to form syllables. For example, in English, the consonant sound “t” and the vowel sound “o” are both phonemes, and they combine to form the syllable “to”, as in “tomato”.

In total there are more than 200 phonemes, representing all the sounds that the human voice can create. But most languages use only a fraction of this number. In some languages it can be as few as 15, whilst in English it is over 40.

Patricia K. Kuhl discovered that before 8 months of age, the neurons in a baby’s brain could recognise the phonemes of any language on the planet.

After this point, they quickly learn to ignore the vast majority of phonemes and concentrate only on those used in their native language. And within a few months they have lost this ability altogether.

In her 2011 TED talk, “The Linguistic Genius of Babies,” Kuhl describes this process as babies going from being “citizens of the world,” to becoming “language-bound” members of their own tribal grouping.

(Intriguingly, similar tests done on adults show that these neurons continue to fire in recognition of all 200 phonemes when presented with any “foreign” language. However, this information is no longer processed consciously, so the listener is not aware that they can “hear” them.)

Limited by culture

Similarly, in the visual domain, it has been shown that very young babies have cognitive abilities that become lost as they begin to grow into their culturally acclimatized selves.

According to a study led by Olivier Pascalis, a psychologist at England’s University of Sheffield, human babies start out with the ability to recognize a wide range of faces, even among races or species different from their own.

The study focused on the ability to recognize and categorize faces, determine identity and gender, and read emotions. The findings suggest that, in humans, whether or not you have this ability is a question of “use it or lose it.”


Michelle de Haan, one of the study’s authors, said: “We usually think about development as a process of gaining skills, so what is surprising about this case is that babies seem to be losing ability with age.This is probably a reflection of the brain’s ‘tuning in’ to the perceptual differences that are most important for telling human faces apart and losing the ability to detect those differences that are not so useful.”

In the study, six-month-old infants were able to recognize the faces of individuals of either different races or even different species – in this case, monkeys – something which most adults cannot do.

Babies who received specific visual training retained the ability. But those with no training lost the skill altogether by the time they were nine months old.

This is because by the time they’re nine months old, face recognition is based on a much narrower model, one that is based on the faces they see most often.

This more specialized view, in turn, diminishes our early ability to make distinctions among other species, and other races. For instance, if an infant is exposed to mainly Asian faces, he or she will grow to become less skilled at discerning among different, say, Caucasian faces.


Even if children were not to lose such cognitive abilities as they “tune in” to their contextual cultural norms, we also know that a large part of their cultural acclimatisation would prevent them from expressing views that are at odds with the social groupings in which they find themselves.

Limited by both masters and peers

Even when we see the world differently, our adult brains have all too often been wired to keep our thoughts to ourselves.

Research conducted by Solomon Asch of Swarthmore College has clearly demonstrated the degree to which an individual’s own opinions will conform to those of the group in which he finds himself.

Research conducted by Stanley Milgram of Yale, meanwhile, has shown how likely people are to obey authority figures even when their orders go against their own personal morality.

Perhaps this is why we love the ability of children to speak the truth. To say what we have all been thinking, even though it is not culturally acceptable.

After all, it is the child who is not blinded by culture who, on seeing the Emperor’s new clothes, says “But he isn’t wearing anything at all!”…

14
Apr

Death by PowerPoint

Apart from being subjected to constant attack from the Taliban, the US military is also subjected to daily PowerPoint presentations. Many senior US military figures are concerned that the PowerPoint presentations that all levels of the service are forced to sit through, sometimes daily, are curbing discussion, critical thinking and informed decision-making.
Photo Credit: Bryan Denton for The New York Times

When Marshall McLuhan famously declared that “the medium is the message”, he was not simply referring to the fact that the medium in which information is presented often affects its meaning; he was also making the far more radical point that the medium in itself is often far more meaningful and significant than the message it appears to convey.

For McLuhan, the very act of reading a newspaper is ultimately of greater significance than the information contained within that newspaper.

Similarly, Microsoft’s infamous presentation software, PowerPoint, is a medium that has found its niche in contemporary society as much for its ability to help one person dominate and “sell” to a passive audience as for its ability to convey information.

When it comes to understanding the real meaning of Powerpoint, perhaps the obvious clue is in the name.

As a sales tool, PowerPoint gives the presenter the “power” to make their “points”, in the form of bullet points. These are often linked less by internal logic or narrative coherence than by the illusion of hierarchical structure created by the program.

And it is this combination of an unrelenting hail of bullet points and the lack of a coherent narrative to follow that has made PowerPoint notorious for its unique ability to reduce audiences to catatonic states of mind-numbed boredom.


The man responsible for bringing PowerPoint into the world is actually a very nice chap called Robert Gaskins, who, on the twentieth anniversary of the launch of his program, wrote this piece, “Back to Basics”, in which he tried to rein in the excesses of over-exuberant PowerPointers around the world by placing his invention in its historical framework.

Digital media guru Seth Godin has also criticised bad use of PowerPoint and made some useful suggestions in this blog piece here. However, he is still fundamentally thinking about the program as a sales tool…  as Seth puts it in his best Web2.0 accent: “If you believe in your idea, sell it. Make your point as hard as you can… Your audience will thank you for it, because deep down, we all want to be sold.”

Which is all well and good when the purpose of the presentation is to sell.

But these days PowerPoint is no longer simply used as a sales tool: it is now used everywhere and anywhere that one person needs to address and impart information to others, including in life-and-death situations faced by organisations like NASA and the US military. And here PowerPoint’s inability to deal with complexity, and its rigid hierarchy, have meant that “Death by PowerPoint” is not simply a metaphor.

The amount of time spent creating PowerPoint presentations has made it a running joke among members of the US military, since it ties up a large number of junior officers (who are actually referred to as “PowerPoint Rangers”) in the daily preparation of slides at all levels of the service, whether for a high-level Joint Staff meeting in Washington or a platoon leader’s pre-mission combat briefing of his men in a remote valley in Afghanistan.

But apart from the time wasted, behind all the jokes about PowerPoint Rangers are some very serious concerns among senior staff that the program stifles discussion, critical thinking and thoughtful decision-making.

In an article in the New York Times entitled “We Have Met the Enemy and He Is PowerPoint”, the journalist Elisabeth Bumiller describes how the program has become deeply embedded in a military culture that has come to rely on PowerPoint’s hierarchical ordering of a confused world.

““PowerPoint makes us stupid,” Gen. James N. Mattis of the Marine Corps, the Joint Forces commander, said this month at a military conference in North Carolina. (He spoke without PowerPoint.)

Brig. Gen. H. R. McMaster, who banned PowerPoint presentations when he led the successful effort to secure the northern Iraqi city of Tal Afar in 2005, followed up at the same conference by likening PowerPoint to an internal threat. “It’s dangerous because it can create the illusion of understanding and the illusion of control,” General McMaster said…“Some problems in the world are not bullet-izable.”

In General McMaster’s view, PowerPoint’s worst offense is not a chart like the spaghetti graphic, which was first uncovered by NBC’s Richard Engel, but rigid lists of bullet points (in, say, a presentation on a conflict’s causes) that take no account of interconnected political, economic and ethnic forces. “If you divorce war from all of that, it becomes a targeting exercise,” General McMaster said.”

This unclassified slide from the Office of the Joint Chiefs of Staff, showing the U.S. military’s plan for “Afghanistan Stability/COIN Dynamics – Security”, became world-famous when Richard Engel posted it on NBC’s World News Blog under the title “So what is the actual surge strategy?”

Crucially, Elisabeth Bumiller concludes her piece with what certain senior staff in the US military see as the real strength of PowerPoint… as a propaganda tool: “handy when the goal is not imparting information, but the opposite, as in briefings for reporters…. The news media sessions often last 25 minutes, with 5 minutes left at the end for questions from anyone still awake… Those types of PowerPoint presentations are known as ‘hypnotizing chickens.’”

Perhaps the most infamous example of the US military “hypnotizing chickens” was when General Colin Powell, then Secretary of State in the Bush administration, made his pitch to the United Nations Security Council for war in the Middle East (“Go up there and sell it,” Vice President Dick Cheney is reported to have said to him beforehand), with a highly imaginative presentation of the “evidence” for the existence of weapons of mass destruction in Iraq.

"Hypnotizing Chickens" Part of Colin powell's highly imaginative presentation "proving" the existence of weapons of mass detruction in Iraq.

“Hypnotizing Chickens”: part of Colin Powell’s highly imaginative presentation “proving” the existence of weapons of mass destruction in Iraq.

But perhaps PowerPoint’s failings as a medium for the clear communication of information (rather than just a pitch tool) have been outlined most vividly by the Yale political scientist Edward Tufte.

Tufte is the world’s leading thinker in the visual display of information.

Since the publication of “The Visual Display of Quantitative Information” in 1982, Tufte has occupied a unique position in the worlds of statistics and graphic design, championing the importance of good information design and differentiating clearly between style and content. He has single-handedly created the modern scientific discipline of information graphics and the representation of data, and has been described by the New York Times as “The Leonardo da Vinci of data.”

In the wake of the tragic 2003 Columbia Space Shuttle disaster, Tufte published an article in Wired magazine entitled “PowerPoint Is Evil”, and followed this up with a more scholarly, less dramatically titled pamphlet, “The Cognitive Style of PowerPoint”.

Here Tufte argued that the program encourages “faux-analytical” thinking that favors the slickly produced “sales pitch” over the sober exchange of information.

“Imagine a widely used and expensive prescription drug that promised to make us beautiful but didn’t. Instead the drug had frequent, serious side effects: It induced stupidity, turned everyone into bores, wasted time, and degraded the quality and credibility of communication. These side effects would rightly lead to a worldwide product recall.

Yet slideware -computer programs for presentations… is everywhere: in corporate America, in government bureaucracies, even in our schools. Several hundred million copies of Microsoft PowerPoint are churning out trillions of slides each year. Slideware may help speakers outline their talks, but convenience for the speaker can be punishing to both content and audience. The standard PowerPoint presentation elevates format over content, betraying an attitude of commercialism that turns everything into a sales pitch.”

In “The Cognitive Style of PowerPoint” Tufte’s analysis reveals that Powerpoint has a number of specific design faults which render it incapable of communicating certain types of information.

Firstly, Tufte says, it is designed to guide and to reassure the presenter, rather than to communicate with the audience. Secondly, the outliner function causes ideas to be arranged in an unnecessarily deep hierarchy, itself subverted by the need to restate the hierarchy on every slide. (What’s more, the audience is forced to follow the presenter’s thinking in lockstep linear progression through this hierarchy – whereas with handouts, readers could browse and relate items at the same time). And thirdly, and perhaps most importantly, it has a natural tendency to oversimplify thinking, with ideas being squashed into bulleted lists, and stories with beginning, middle, and end being turned into a collection of disparate bullet points.

A central part of this analysis was the Columbia Space Shuttle disaster, which, Tufte demonstrates, was caused in part by the use of PowerPoint.

The Columbia Space Shuttle was destroyed on February 1, 2003, while attempting to re-enter the atmosphere after a 16-day scientific mission. A hole had been punctured in the leading edge of one of Columbia’s wings, which was made of a carbon composite. The hole had formed when a piece of insulating foam from the external fuel tank peeled off during the launch and struck the shuttle’s wing. During the intense heat of re-entry, hot gases penetrated the interior of the wing, destroying the support structure and causing the rest of the shuttle to break apart.

The seven crew members who died aboard the Columbia were: Rick Husband, Commander; William C. McCool, Pilot; Michael P. Anderson, Payload Commander; David M. Brown, Mission Specialist 1; Kalpana Chawla, Mission Specialist 2; Laurel Clark, Mission Specialist 4; and Ilan Ramon, Payload Specialist 1.

Death by PowerPoint? The seven crew members who died aboard the Columbia after an over-reliance on PowerPoint for communicating critical information.

Exhibit A in Tufte’s analysis is a PowerPoint slide which you can see here, and which had been presented to NASA senior managers in January 2003, while the space shuttle Columbia was in space and they were trying to understand the risk posed by the damage caused by the piece of insulating foam to the shuttle wings.

Unfortunately, as you can see from Tufte’s analysis, the critical piece of information is buried so deeply in the rigid PowerPoint format as to be useless.

“It is easy to understand how a senior manager might read this PowerPoint slide and not realize that it addresses a life-threatening situation,” the Columbia Accident Investigation Board concluded, citing Tufte’s work, and strongly criticized the culture within NASA in which, it said, “the endemic use of PowerPoint” had been substituted for rigorous technical analysis.

It is now more than twenty-five years since PowerPoint was first launched as a presentation aid. It is installed on over one billion computers around the world, and its use is no longer confined to the sales pitch. It is used everywhere: in church services, in military briefings, in hospitals and now, increasingly, in schools.

And you don’t have to be a NASA scientist to see that that’s a really dumb idea.

27
Feb

Understanding Consciousness in the Digital Age. Part 2


In the United States alone, there are over 5 million gamers who are playing an average of forty-five hours a week… the equivalent of a full-time job.

“It’s tough to make predictions,” the great Yogi Berra once said, “especially about the future.”

And yet, after just a decade or so of massive social and technological change, we are now beginning to see clear indications of how digital technologies are starting to dramatically affect both human culture and human consciousness.

Because we are now starting to see how human consciousness can be modified by exposure to new knowledge, experiences and technologies; not just on an evolutionary level, over the course of millennia, but also on an individual basis, over the course of just a few years.

Key to understanding the speed of this process are the recent discoveries in “neuroplasticity”: the phenomenon whereby dramatic physical and cognitive changes can take place not just in the infant human brain, but also in the adult human brain.

As Sharon Begley puts it so eloquently in  “How The Brain Rewires Itself”: For many “the conscious act of thinking about their thoughts in a particular way rearranged the brain. The discovery of neuroplasticity, in particular the power of the mind to change the brain, is still too new for scientists, let alone the rest of us, to grasp its full meaning. But even as it offers new therapies for illnesses of the mind, it promises something more fundamental: a new understanding of what it means to be human.”

A good barometer for the way technology is changing “what it means to be human” is, of course, TED. TED is an acronym for Technology, Entertainment and Design, a non-profit organization formed to support “ideas worth spreading.”

And this ability to spread new ideas has made TED an unparalleled phenomenon in the history of communications. Talks from the various TED conferences have been available online since June 2006. By January 2009 the talks had been viewed more than 50 million times. In June 2011, that figure stood at more than 500 million, and by November 2012, it was over one billion.

TED talks cover a wide variety of subjects, but the most watched TED talk of all time (with many millions of views) is still one of the first. Given six years ago by Sir Ken Robinson, it is an impassioned description of how our educational systems are no longer serving the needs of our societies.

Another much-watched TED talk, by Jane McGonigal, describes the extraordinary impact that digital gaming is beginning to have on our societies.


Jane McGonigal: “Game designers and developers are actively transforming what was once an intuitive art of optimizing human experience into an applied science. And as a result, they are becoming the most talented and powerful happiness engineers on the planet.”

As we have seen, two of the key drivers in the shape of things to come are going to be gaming and education. And, by TED standards at least, McGonigal and Robinson are the two people who are leading the conversations on each.

As McGonigal says, there is much suspicion about electronic gaming, particularly amongst non-gamers.

But in April 2009, players of the game “Halo 3” celebrated an extraordinary achievement… by any standards.

After 565 days of fighting in the third and final campaign in the Great War to protect the Earth from their enemy, The Covenant, players had achieved the milestone of 10 billion kills.

They had set themselves this goal, averaging 17.5 million kills a day…  and along the way they had assembled the largest army the Earth has ever seen. More than 15 million people had fought on behalf of United Nations Space Command. That’s a larger army than twenty-five of the largest armies in the world combined.


Halo 3: More than 15 million people fought on behalf of United Nations Space Command. That’s a larger army than twenty-five of the largest armies in the world combined.

Each week over three billion hours are spent playing video games.

In the United States alone, 183 million people are playing an average of thirteen hours a week.

Of course many of these are kids and teenagers with time on their hands, but many are also nine-to-fivers who come home to apply, in multiplayer online games, all of the talents that are underutilized at work.

Not to mention the 5 million gamers in the United States who are playing an average of forty-five hours a week, for whom it has become the equivalent of a full-time job.

As McGonigal puts it, these figures are: “a sign of something important, a truth that we urgently need to recognize. The truth is this: in today’s society, computer and video games are fulfilling genuine human needs that the real world is currently unable to satisfy. Games are providing rewards that reality is not. They are teaching and inspiring and engaging us in ways that reality is not. They are bringing us together in ways that reality is not.”

But, despite its title, McGonigal’s book “Reality is Broken” is not a rallying cry for mass rejection of the world and a retreat into digital fantasy. According to McGonigal, games can teach us how to live better in the real world.

Because games are potent engines for teaching us how to have enhanced emotional experiences.

And all great games deliver these enhanced emotional experiences by placing the player in a state of “flow”— a state of complete absorption in the activity at hand – and keeping them there.

The science of happiness and the emotional evolution of digital gameplay are two major historical trends that are now intersecting in this concept of “flow”.

The concept of “flow” was first articulated by Mihaly Csikszentmihalyi, one of the world’s leading researchers in positive psychology. In his book, “Flow: The Psychology of Optimal Experience”, Csikszentmihalyi explains how people are happiest when they are in a state of flow: a state in which they are so fully immersed in what they are doing that nothing else seems to matter, and in which their usual concerns, like keeping track of time or realizing that they need food or sleep, are simply ignored.

In an interview with Wired magazine, Csikszentmihalyi described being in “Flow” as “being completely involved in an activity for its own sake. The ego falls away. Time flies. Every action, movement, and thought follows inevitably from the previous one, like playing jazz. Your whole being is involved, and you’re using your skills to the utmost.”

To achieve a flow state, a balance must be struck between the challenge of the task and the skill of the performer. If the task is too easy, boredom sets in; if it is too difficult, anxiety takes over; and if both skill and challenge are low, apathy results. Flow requires that skill level and challenge level are matched, and that both are high.

It is the unique ability of electronic games to keep you poised at exactly this point, at just the right level of difficulty to keep you constantly engaged and in a state of flow, that is the key to their addictiveness.
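
To make the idea concrete, here is a minimal, purely illustrative sketch of the kind of feedback loop a game might use to keep a player inside the flow channel. None of it is taken from any real game engine: the function name, thresholds and step size below are all hypothetical.

# Illustrative only: nudge the difficulty so that the player keeps succeeding
# often enough to feel competent, but not so often that the game becomes boring.
def update_difficulty(difficulty, recent_success_rate,
                      anxiety_threshold=0.55, boredom_threshold=0.75, step=0.05):
    if recent_success_rate > boredom_threshold:    # player cruising: risk of boredom
        difficulty += step
    elif recent_success_rate < anxiety_threshold:  # player struggling: risk of anxiety
        difficulty -= step
    return max(0.0, min(1.0, difficulty))          # keep difficulty in the 0..1 range

# A player winning 90% of recent encounters gets a slightly harder game:
print(round(update_difficulty(difficulty=0.5, recent_success_rate=0.9), 2))   # 0.55

Real games measure far richer signals than a simple success rate, of course, but the principle is the same: watch the player, and keep the challenge just within reach.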

As McGonigal writes “As one journalist put it, the Microsoft game testing lab “looks more like a psychological research institute than a game studio”. This is no accident. Game designers and developers are actively transforming what was once an intuitive art of optimizing human experience into an applied science. And as a result, they are becoming the most talented and powerful happiness engineers on the planet.”

And a generation that has been exposed to these regular and extended “flow” experiences will have very different expectations from previous generations regarding the happiness potential of other aspects of their lives, such as education or work.

Not least because, through the phenomenon of “neuroplasticity”, their brains have been physically rewired by those experiences.

In many ways the concept of “flow” within the world of gaming adds real depth to the predictions of Ahonen and Moore in “Communities Dominate Brands”, and of Marc Prensky in his highly influential 2001 paper on education, “Digital Natives, Digital Immigrants”.

The concept of “flow” is also of great importance to our other TED speaker, Sir Ken Robinson.

Sir Ken Robinson’s TED talk is entitled “Do schools kill creativity?” and he opens the proceedings with this blistering broadside: “all kids have tremendous talents and we squander them… pretty ruthlessly.”


Sir Ken Robinson: “I believe this passionately… that we don’t grow into creativity, we grow out of it. Or rather we get educated out of it.”

To illustrate the fact that, left to their own devices, children are not frightened of making mistakes he tells a great story about “a little girl who was in a drawing lesson, she was 6 and she was at the back, drawing, and the teacher said this little girl hardly paid attention, and in this drawing lesson she did. The teacher was fascinated and she went over to her and she said, “What are you drawing?” and the girl said, “I’m drawing a picture of God.” And the teacher said, “But nobody knows what God looks like.” And the girl said, “They will in a minute.”

Kids will take a chance. If they don’t know, they’ll have a go. Because they’re not frightened of being wrong. And as Sir Ken says, “What we do know is, if you’re not prepared to be wrong, you’ll never come up with anything original. And by the time they get to be adults, most kids have lost that capacity. They have become frightened of being wrong… And the result is, we are educating people out of their creative capacities.”
He adds “Picasso once  said that “all children are born artists. The problem is to remain an artist as we grow up”. I believe this passionately, that we don’t grow into creativity, we grow out of it. Or rather we get educated out of it.”

And this is because the world’s great public systems of education were developed in the 19th century specifically to meet the needs of an industrialised society.

The hierarchy of subjects, with mathematics and languages at the top, humanities in the middle and the arts at the bottom, reflects the requirements of an industrial workforce.

But this system is now starting to show signs of its age, not just for students, but for society as a whole: “In the next 30 years, according to UNESCO, more people worldwide will be graduating through education than since the beginning of history. Suddenly degrees aren’t worth anything… kids with degrees are often heading home to carry on playing video games, because you need an MA where the previous job required a BA, and now you need a PhD for the other. It’s a process of academic inflation. And it indicates the whole structure of education is shifting beneath our feet.”

As Sir Ken says: “We need to radically rethink our view of intelligence.”

In most countries, industrialism has passed its peak as the major source of employment and wealth. In America in 1965, manufacturing accounted for something like 30% of employment, whereas today it accounts for less than 12%. Manufacturing output has increased and is still a very important part of the economy, but it doesn’t employ as many people.

Throughout the world, the real growth area is the intellectual industries, including the arts, software, science and technology. These are areas where new ideas matter most. Many countries now recognize that the future of national economies depends upon a steady flow of innovative ideas.

Elsewhere, (Presentation by Sir Ken Robinson to Education Commission of the United States National Forum of Education Policy, July 14, 2005) Sir Ken has written that: “We’re all trying to work out how to educate our children to survive in a world we can’t predict and to maintain a sense of cultural identity in a world that’s changing faster than ever.”

He tells another great story, which describes a child who was allowed to discover her sense of “flow”.

Gillian Lynne is the choreographer responsible for Cats and Phantom of the Opera. Sir Ken once asked Gillian how she had become a dancer.
“… She said it nearly didn’t happen. She said that when she was in the elementary school she was a terrible student. Her handwriting was awful, she didn’t concentrate, couldn’t apply herself and was always looking out the window and being disruptive. As a result she was constantly in trouble. Eventually, the school wrote to her parents and said, “We think Gillian has a serious learning disorder.” ( I think now, by the way, they’d say she had Attention Deficit Disorder and put her on Ritalin.)

Anyway, she remembers being sent to see a specialist with her mother… for about 20 minutes her mother described to him all the problems she was having at school and all the problems she was causing. All the time he was watching her intently. At the end of it, he stood up and came across and sat next to her. And he said, “Gillian, I have been listening to all the things your mother’s told me – all the problems you’re having at school and I really now need to speak to her privately, so I’m going to leave with her and leave you on your own, but we’ll be back. We won’t be very long – just wait for us.” She said okay and they got up and left the room. But as they went out of the room, he leant across the desk and turned the radio on that was sitting on his desk.

She found out later that as they got into the corridor he turned to her mother and said, “Just stand here for a moment and watch her.” There was a window back into the room. The moment they left the room, Gillian was on her feet moving to the music, all around the room. They watched for a few minutes and then he turned to her mother and said to her, “Mrs. Lynne, Gillian isn’t sick – she’s a dancer. Take her to dance school.”

Gillian never looked back. She has had a glittering career helping to create some of the most successful musical theater productions in history, she has given pleasure to millions, and she is probably a millionaire. Somebody else might have put her on medication and told her to calm down.

As Sir Ken says: “Now my point really is that there are millions of Gillians. We are all of us Gillians in our different ways looking to find the thing we can do. People achieve their best when they’re in their element – when they do the thing that they love”.

Sir Ken Robinson and Jane McGonigal are very different people, from very different backgrounds, but on this point these two TED speakers are in complete agreement.

The future is “flow”.


Gillian Lynne has had a glittering career helping to create some of the most successful musical theater productions in history. She has given pleasure to millions and is probably a millionaire. Somebody else might have put her on medication and told her to calm down.

 

31
Dec

Understanding Consciousness in the Digital Age. Part 1


What was she thinking? Raquel Welch in her fur bikini, ready to do battle with dinosaurs. We tend to see such historical figures as being mentally and physically the same as us. However, research using MRI technology suggests that the brains of humans in earlier eras may have been structurally and functionally very different from our own.

 

Let’s face it, with a gloriously daft plot that has Raquel Welch running around in a fur bikini and cave men doing battle with dinosaurs, the movie “One Million Years B.C.” was never going to win any prizes for historical accuracy.

However, the fact that it wasn’t completely laughed out of the cinema when it was released in 1966 illustrates a fascinating cultural bias…

That popular culture tends to see history as little more than an extended costume drama.

We tend to see historical figures as being physically the same as us… in both mind and body. Admittedly, we recognize that these fellows may be a bit behind with technology, and that they may have some curious superstitions and beliefs, not to mention some dubious personal hygiene habits. But, given a good bath, a few lessons in English, and of course, a change of clothes we imagine that most historical characters could be introduced to contemporary society with ease.

This is largely because, when we have given it any thought at all, we have always tended to see the adult human brain as something fixed and unchangeable.

And that, if changes do take place in the structure of the human brain, they tend to take place through natural selection, over the course of many generations and many millennia.

Not everyone thinks this way, however. Back in 1976, Julian Jaynes, a professor of psychology at Princeton, published a revolutionary new approach to the history of the mind in “The Origin of Consciousness in the Breakdown of the Bicameral Mind”.

In this extraordinary book, Jaynes argued that consciousness was, in fact, only a fairly recent development in human evolution, emerging as recently as the end of the second millennium BCE, barely three thousand years ago.

Before this time, Jaynes argues, people experienced the world rather like schizophrenics who experience “command hallucinations”, or “voices” that tell them what to do. (In fact, according to Jaynes, schizophrenia is simply a vestige of humanity’s earlier pre-conscious state.)

Jaynes called this cognitive state “bicameralism”, reasoning that the instructions or “voices” came from the right-brain counterparts of the left-brain language centres: specifically, Wernicke’s area and Broca’s area. These regions are somewhat dormant in the right brains of most modern humans, but auditory hallucinations have occasionally been seen to correspond to increased activity in these areas.

Using the earliest writings as evidence for his theory, Jaynes argued that the characters described in the Iliad do not exhibit the kind of self-awareness we normally associate with consciousness. Rather, these bicameral-minded individuals are guided by mental commands issued by external “gods”.


Julian Jaynes argued that consciousness is, in fact, only a fairly recent development in human evolution, emerging at the end of the second millennium BCE. Before this time, people like Helen of Troy in Homer’s Iliad experienced the world rather like schizophrenics who experience “command hallucinations”, or “voices” that tell them what to do. (A handy excuse when you have just sparked a major international incident).

 

And whilst no mention is made of any kind of introspection in the Iliad or the older books of the Old Testament, works written after this transition, like the later books of the Old Testament or the later Homeric work, the Odyssey, show signs of a very different kind of mentality… an early form of consciousness.

According to Jaynes, human consciousness first emerged as a neurological adaptation to developing social complexity, as the bicameral mind began to break down during the social chaos of the “Bronze Age Collapse” in the second millennium BCE.

Historians are unclear as to the cause of the “Bronze Age Collapse”, but between 1206 and 1150 BCE the cultures of the Mycenaean kingdoms, the Hittite Empire, and the New Kingdom of Egypt collapsed, and almost every city between Pylos and Gaza was violently destroyed and largely abandoned.

This was a time of large-scale economic collapse and mass migrations across the region (Christian and Hebrew scholars associate Moses and the story of Exodus with this time), creating social stresses that required societies to intermingle, forcing people to learn new languages and customs and, generally, to become more flexible and creative.

Jaynes argues that self-awareness, or human consciousness, was the culturally evolved solution to this problem. Jaynes further suggests that the concepts of prophecy and prayer arose during this breakdown period, as people attempted to summon instructions from the “gods” and the Biblical prophets bemoaned the fact that they no longer heard directly from their one, true god.

According to Jaynes, relics of the bicameral mind can still be seen today in cultural phenomena like religion, schizophrenia and the persistent need amongst human beings for external authority in decision-making.

Jaynes’s big idea is really quite breathtaking in its boldness. And although many of his other, related ideas have since been validated by modern brain imaging techniques, his bicameral hypothesis remains highly controversial.

At the time of publication, one of Jaynes’s more enthusiastic supporters was Marshall McLuhan, who was undoubtedly drawn to Jaynes’s ideas about the origins of consciousness and the light they shed on the origins of language and writing. (It is entirely possible that the Bronze Age Collapse was the result of the collapse of early media, and of the breakdown of the Bronze Age mind, rather than the cause of it.)

Marshall McLuhan is, of course, famous for his thinking on the impact of media on consciousness.

McLuhan and his onetime collaborator Walter Ong were both fascinated by how the shift from an oral-based stage of consciousness to one dominated by writing and print changed the way humans think.

McLuhan expanded on these ideas in “The Gutenberg Galaxy”, particularly the significance of the invention of moveable type and printing in terms of its impact on human consciousness.

It has been said that the digital age we are now entering is greater than any of these previous information revolutions, in that its impact may be no less than the equivalent of the invention of writing and the invention of the printing press… all at the same time.

However, until recently we have been lacking proof that even changes of this magnitude can have an immediate effect on human consciousness.

But over the last few years, new research using MRI scanners has revealed that the adult human brain can, in fact, be significantly transformed by experience.

“Neuroplasticity” is the term used to describe this newly discovered ability of the brain to transform itself structurally and functionally as a result of its environment. Significantly, the most widely recognized forms of neuroplasticity are related to learning and memory.

If you live in central London you will be very familiar with the sight of guys, looking a bit lost, as they ride around on mopeds with a clipboard attached to the handlebars.

These are would-be London taxi drivers doing what is known locally as “The Knowledge”.


London Taxi Drivers who do “The Knowledge” develop a larger, modified hippocampus – that part of the brain associated with memory and navigation.

 

In order to drive a traditional black London cab, these drivers must first pass a rigorous exam which requires a thorough knowledge of every street within a six-mile radius of Charing Cross. It usually takes around three years to do “The Knowledge” and memorise this vast labyrinth of streets in central London, and on average only a quarter of those who start the course manage to complete it. However, it now appears that those who manage to attain “The Knowledge” find themselves not only with a new job, but also with a modified brain.

A study published in 2000 by a research team at the Department of Imaging Neuroscience at University College London demonstrated that London taxi drivers who do “The Knowledge” develop a larger, modified hippocampus – that part of the brain associated with memory and navigation – than they had previously.

 

As Dr Eleanor Maguire, who led the research team put it, “There seems to be a definite relationship between the navigating they do as a taxi driver and the brain changes. The hippocampus has changed its structure to accommodate their huge amount of navigating experience. This is very interesting because we now see there can be structural changes in healthy human brains.”

Over the past decade, other researchers using MRI scanners have observed similar structural and functional changes to the brain. For example, one study (“Temporal and Spatial Dynamics of Brain Structure Changes during Extensive Learning”, Draganski et al., 2006) shows how medical students undergo significant changes to their brains whilst studying for exams.

If this type of transformation can be observed in the adult brain, imagine what changes might be occurring in the minds of children growing up in the midst of the digital revolution.

In 2001, the American educationalist Marc Prensky drew everyone’s attention to the impact this was beginning to have on the way our brains are wired, when he published an article called “Digital Natives, Digital Immigrants”.

As Prensky points out: “Our students have changed radically. Today’s students are no longer the people our educational system was designed to teach. Today’s students have not just changed incrementally from those of the past, nor simply changed their slang, clothes, body adornments, or styles, as has happened between generations previously.

A really big discontinuity has taken place. One might even call it a “singularity” – an event which changes things so fundamentally that there is absolutely no going back. This so-called “singularity” is the arrival and rapid dissemination of digital technology in the last decades of the 20th century. It is now clear that as a result of this ubiquitous environment and the sheer volume of their interaction with it, today’s students think and process information fundamentally differently from their predecessors.

These differences go far further and deeper than most educators suspect or realize… it is very likely that our students’ brains have physically changed – and are different from ours – as a result of how they grew up….

Digital Natives are used to receiving information really fast. They like to parallel process and multi-task. They prefer their graphics before their text rather than the opposite. They prefer random access (like hypertext). They function best when networked. They thrive on instant gratification and frequent rewards. They prefer games to “serious” work.”

These points that Prensky makes about the changing nature of consciousness are further developed in “Communities Dominate Brands”, published in 2005 by Tomi Ahonen and Alan Moore. Here, Ahonen and Moore identify two key drivers behind these changes:

Firstly, they identify the act of gaming as the key factor in the rewiring of the Digital Native’s neural circuitry. Since the mid-1990s children have been playing games on electronic devices rather than, like the previous generation, simply being passive consumers of broadcast media. And it is the interactive nature of electronic gaming, according to Ahonen and Moore, that is responsible for changing the brains of this generation structurally and functionally, and for creating a constant appetite for social interactivity.

Secondly, Ahonen and Moore emphasise the difference between the world of the PC and the fixed internet and the world of the mobile device and the mobile internet. The former is a world where you “log on” and “log off”; the latter a world where you are constantly connected. They characterize the former as “The Networked Age” (1992-2004) and the latter as “The Connected Age” (2002-2014?). And where the key characteristic of “The Networked Age” is “Acquiring”, the key characteristic of “The Connected Age” is “Sharing”.

It is this constant connection, and this desire to share and interact with other engaged minds, that Ahonen and Moore predicted would create new “elective”, dynamic communities, and these new social groupings, they argued, would be the engine of massive social change. The subsequent rise of social media and its impact on major events like “The Arab Spring” are testament to the validity of Ahonen and Moore’s predictions.

However, it is probably the world of gaming that has the most to teach us about the changes that are taking place in the wiring of the brains of our “Digital Natives”. And we will explore this in a little more detail in part two.

1
Dec

Invisible gorillas, erotic dancers, and what lies beyond the visible spectrum.

 

Most people know about the spectrum of colours that can be seen with the naked eye, and that beyond this visible spectrum of colour, there are things that we cannot see, like infra-red, for example.

In recent years, however, it has become apparent that there are many things that we do not consciously see that can have profound effects on our behaviour. These are things that the unconscious mind sees “under the radar” of consciousness.

In 2004, two researchers from Harvard University, Christopher Chabris and Daniel Simons, were awarded the Ig Nobel Prize in Psychology for the experiment known as “The Invisible Gorilla.”

In this experiment, participants are shown a video, featuring two teams, one wearing white shirts, the other black. The teams are moving around in a circle, passing basketballs to one another. In order to occupy your attention, you are asked to count the number of passes made by the team wearing white.

Halfway through the video, someone wearing a full-body gorilla suit walks slowly to the middle of the screen, pounds their chest, and then walks out of the frame.

If you were to watch the video without being asked to count the passes, you would, of course, see the gorilla. But in tests, when people were asked to concentrate on the passes, about half the people did not see the gorilla at all.

Chabris and Simons call this phenomenon ‘inattentional blindness’. It occurs when you direct your attention, like a mental spotlight, onto the basketball passes; because your attention is so focused on this one activity, everything else is left in darkness. In this state, even when you look straight at the gorilla you won’t see it, because it’s simply not what you’re looking for.
That is not to say, however, that at some level your mind hasn’t registered it.

Our brains are physical systems and hence have finite ­resources. Compared to a computer chip, which is capable of processing billions of bits of information every second, our conscious brains (that part of our thinking in which we are aware of thinking) can only process a mere 40 bits of information per second.


Tor Nørretranders

In “The User Illusion: Cutting Consciousness Down to Size”, Tor Nørretranders points out that our senses receive about 12 million bits of information every second. Of those 12 million bits, 10 million come from our eyes, 1 million from our sense of touch, and the rest from all the other senses: hearing, smell, taste, and spatial sensation.

And, this is the important bit: because our conscious brains can only process 40 bits per second, the remaining information is processed subconsciously.

That’s a ratio of something like 99.9997 percent subconscious processing to a mere 0.0003 percent actual conscious thinking.
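
For anyone who wants to check the arithmetic, here is a minimal back-of-the-envelope sketch using Nørretranders’s round figures (which are, of course, estimates rather than precise measurements):

# The conscious vs. subconscious bandwidth ratio, using the figures quoted above.
SENSORY_BITS_PER_SECOND = 12_000_000    # total input arriving from all the senses
CONSCIOUS_BITS_PER_SECOND = 40          # what conscious thinking can handle

conscious_share = CONSCIOUS_BITS_PER_SECOND / SENSORY_BITS_PER_SECOND
print(f"Conscious share:    {conscious_share:.4%}")         # about 0.0003%
print(f"Subconscious share: {1 - conscious_share:.4%}")     # about 99.9997%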

And this information we receive “under the radar” of consciousness would appear to have a powerful effect on behaviour.

According to research conducted by Professors Gavan Fitzsimons and Tanya Chartrand of Duke University and Gráinne Fitzsimons of the University of Waterloo, and published in the Journal of Consumer Research in April 2008, when people are subliminally exposed to either an IBM or an Apple logo, those exposed to the Apple logo behave in a more creative fashion than those who had been shown the IBM logo.

Gavan Fitzsimons explains: “Each of us is exposed to thousands of brand images every day, most of which are not related to paid advertising. We assume that incidental brand exposures do not affect us, but our work demonstrates that even fleeting glimpses of logos can affect us quite dramatically.”

To demonstrate the effects of brands on behavior, the researchers selected two household names, with contrasting and clearly defined brand characteristics. They asked the participants to complete what appeared to be a simple visual acuity task, during which either the Apple or IBM logo was flashed so quickly that they were completely unaware they had seen anything.

The participants were then asked to complete a task designed to evaluate how creative they were, by listing as many uses as possible for a brick other than the obvious ones, such as building a wall. And those who were exposed to the Apple logo generated significantly more unexpected, oblique and creative uses for the brick than those who had “seen” the IBM logo.

As Gráinne Fitzsimons puts it: “This is the first clear evidence that subliminal brand exposures can cause people to act in very specific ways.”

But perhaps even more dramatic than the discovery that subliminal exposure to brands can affect behaviour, was the research published by a group of evolutionary psychologists from the University of New Mexico, in their 2007 paper “Ovulatory cycle effects on tip earnings by lap dancers: economic evidence for human estrus?”

What they discovered, in fact, was that lap dancers’ earnings vary in direct proportion to the stages of their ovulatory cycles.

So, on average, a lap dancer would earn $335 per evening during estrus, that part of their ovulatory cycle when they are most likely to conceive, $260 per evening during the couple of weeks that form the luteal phase, and only $185 per evening during menstruation. (By contrast, participants using contraceptive pills showed no estrous earnings peak.)

As the researchers describe it in their paper: “All participants in this study worked as lap dancers in Albuquerque “gentlemen’s clubs” circa November 2006 through January 2007. The clubs serve alcohol; they are fairly dark, smoky, and loud (with a DJ playing rock, rap, or pop music). Most club patrons are Anglo or Hispanic men aged 20 to 60, ranging from semiskilled laborers to professionals; they typically start the evening by getting a stack of US$20 bills from the club’s on-site ATM and having a couple of drinks.

The dancers in these clubs perform topless but by law are required to wear underwear or a thong of some sort. During the evening, each dancer performs on an elevated central stage to advertise her presence, attractiveness, and availability for lap dances. These stage dances result in only modest tip earnings (typically $1–5 from the men seated closest to the stage, amounting to just 10% of her total earnings).

The rest of the time, she will walk around the club asking men if they want a “lap dance.” A lap dance typically costs $10 per 3-min song in the main club area or $20 in the more private VIP lounge. Lap dances require informal “tips” rather than having explicit “prices” (to avoid police charges of illegal “solicitation”), but the tipping is vigorously enforced by bouncers. Dancers thus maximize their earnings by providing as many lap dances as possible per shift.

The direct correlation between the tips earned and the ovulatory status of these women strongly suggests that this information was communicated to their customers through some form of non-verbal communication. And that it is perceived by the subconscious part of the brain that processes 12 million bits of information every second, rather than by the conscious part that is chugging along at a mere 40 bits per second.

What both the Apple vs. IBM research and the lap dancing research clearly show is that, in large part, our behaviours are driven by experiences that we are not consciously aware of.

And, that the vast majority of these experiences are primarily visual.

And that, in a nutshell, is why the traditional marketing practice of proposition testing doesn’t work.

It’s all a question of bandwidth. Consider this: we have seen that something like 99.9997 percent of our perception is subconscious processing, and of that processing capability, 10 million bits out of 12 million bits per second are purely visual.

So proposition testing only speaks to around 0.0003 percent of the available attention in the group.

In order to get real insights out of any focus group, you need to engage the whole human being, their conscious and subconscious selves, the rational and the emotional, or System 1 and System 2 thinking as Daniel Kahneman describes it in “Thinking, Fast and Slow”.

And you need to use visually rich stimulus.

A number of years ago our agency, Chemistry, developed a process that we call “Creative Planning” which does just this. It is based on the belief that consumers cannot relate in any meaningful way to propositions, but do respond to narratives placed in a visually rich context.

We find that using these methods in qualitative research creates much higher engagement with consumers, providing much better, more profound insights than the use of propositions out of context.

Now, “Creative Planning” isn’t perfect. But to be fair, consumers in focus groups are never going to be engaged to the same degree as the customers of a lap dancing club. Whatever the time of the month.

11
Nov

How Brer Rabbit survived the Black Holocaust. The resilience of narrative in social media.

Anyone hoping to understand the real power of social media would do well to consider the extraordinary tale of a certain individual who often goes by the name of Brer Rabbit.

Now, nobody knows exactly how old Brer Rabbit really is, but he is clearly many, many hundreds of years old.

He was smuggled across the Atlantic to America in stories told by African slaves, and there he found fame and fortune in popular books and movies, becoming a character beloved by generations of children around the world.

In more recent years, these books and movies have become mired in arguments about political correctness and all but disappeared from the popular imagination. But, remarkably, the ancient oral storytelling tradition that gave birth to this character, keeps his adventures alive to this very day.

The Atlantic slave trade was a human tragedy on a scale like no other. The “Black Holocaust” or “Maafa” (a word derived from the Swahili term for “disaster”, or “great tragedy”) lasted for almost four hundred years, and although we have no way of knowing exactly how many people died as a result, many modern historians estimate a staggering death toll of at least ten million men, women and children.

The most deadly part of the journey was the notorious “Middle Passage” where prisoners were held below decks in slave ships for months as they crossed the Atlantic Ocean.

Despite the appalling conditions in which they were transported, it is thought that around eleven million Africans survived the journey to become slaves in the Americas.

The majority came from the west coast of Africa, and they came from at least 45 separate racial groupings. These included the BaKongo, the Mandé, the Akan, the Wolof, the Igbo, the Mbundu and the Yoruba, to name but a few.

Most slaves came from the west coast of Africa, with at least 45 separate racial groupings, speaking over 1,400 different Niger-Congo languages

Mostly these people would have spoken one of the Niger-Congo family of languages (these days, some 85 percent of the population of Africa speak a Niger-Congo language). However, it is estimated that there are at least 1,400 of these Niger-Congo languages.

Huddled together, in chains, in the darkness of the great slave ships, many of these people could not even talk with one another.

Over the years, West African Pidgin English, also called Guinea Coast Creole English, became the lingua franca along the West African coast.

This language began its life among slave traders doing business along the coast, but it quickly spread up the river systems into the West African interior because of its value as a common trade language among different tribes, even amongst Africans who had never seen a white man.

It is still spoken to this day in West Africa.

Slaves in the Americas found West African Pidgin English as useful as a common language on the plantations as it had been back home in West Africa as a trading language. And when they had children, the children adopted their own version of West African Pidgin English as their native language, thus creating a number of American English-based creole languages.

One of these creole languages is called Gullah, and it is still spoken today by about 250,000 people in the Southern United States, specifically along the coasts of South Carolina and Georgia.

And it was in the language of Gullah that a young Irish American called Joel Chandler Harris was first to hear the animal stories and songs that were to bring him worldwide fame with the tales of Brer Rabbit.

Joel Chandler Harris was a journalist who wrote for a newspaper called “The Constitution” in Atlanta, Georgia, in the years immediately following the American Civil War. A war that had destroyed so much of the South, and had wreaked devastation on Atlanta in particular.

Harris published his first Brer Rabbit tale, “The Story of Mr. Rabbit and Mr. Fox as Told by Uncle Remus”, in a phonetic version of the Gullah language, in the July 20, 1879 issue of the newspaper, under the heading “Negro Folklore”. He would publish 184 more of these tales during the next 27 years.

He became a household name, not just across the States but also around the world, with readers who delighted in these strange tales told in the creole language of Gullah.

Because of this, Joel Chandler Harris’s position amongst American men of letters at the start of the 20th century was second only to that of Mark Twain.

And his influence on other writers was equally far-reaching; the children’s literature analyst John Goldthwaite has said that the Uncle Remus tales are “irrefutably the central event in the making of modern children’s story.” In terms of content, their influence on children’s writers such as Rudyard Kipling, A.A. Milne, Beatrix Potter, and Enid Blyton is substantial. Not to mention their stylistic influence on modernism in the works of Pound, Eliot, Joyce, and Faulkner.

And yet, today, few children would recognize the name Uncle Remus, let alone that of Joel Chandler Harris.

In the late 1960s most Brer Rabbit books were removed from schools and libraries in the States because they were deemed racist. And despite the enduring popularity of its signature song “Zip-a-Dee-Doo-Dah”, the Disney movie “Song of the South”, which was based on these stories, has not been seen in its entirety for decades, and has never been released on home video or DVD in the United States.

In 1981 the writer Alice Walker, author of “The Color Purple”, accused Harris of “stealing a good part of my heritage” in a blistering essay called “Uncle Remus, No Friend of Mine”. Strangely enough, and to be fair to Joel Chandler Harris, he would probably have agreed with much of what Alice Walker had to say.

Crucially, Harris saw himself as an ethnographic collector of the oral traditions of these former slaves rather than an original author of fictional literature in the style of Mark Twain. His tales were roundly praised by the leading folklore scholars of the day. He became intrigued by the new “science of ethnology” and became a charter member of the American Folklore Society (along with Twain). As he began to fill his library with ethnological texts, journals and folklore collections, he was struck by the fact that the tales he was collecting bore striking resemblances to tales from cultures in other parts of the world.

Which they clearly do.

In English “Brer Rabbit” means “Brother Rabbit”; likewise “Brer Fox”, “Brer Bear”, “Brer Wolf” and “Brer Buzzard” are in fact “Brother Fox”, “Brother Bear”, “Brother Wolf” and “Brother Buzzard”.

As such, the names of these characters betray their very ancient origins in Western Africa.

As Karen Armstrong has pointed out in her brilliant “Short History of Myth”, pre-agrarian, hunter gatherer societies exhibit a strong sense of identification with all living creatures, particularly those that are hunted for food. Seeing all animals as siblings is a common expression of this perception.

Brother Rabbit is a trickster, and as such is another iteration of Brother Spider, or Anansi. Brer Rabbit tales, like the Anansi stories, depict a physically small and vulnerable creature using his cunning intelligence to prevail over larger animals. Brer Rabbit originated in the folklore of the Bantu-speaking peoples of south and central Africa, whereas the Anansi tales, which are some of the best known in West Africa, are believed to have originated with the Ashanti people of Ghana.

Although many Brer Rabbit and Anansi stories are easily interchangeable, they often took on a whole new level of meaning on the plantations.

In the introduction to the first volume, Harris wrote: “…it needs no scientific investigation to show why (the Negro) selects as his hero the weakest and most harmless of all animals, and brings him out victorious in contests with the bear, the wolf, and the fox.” And Brer Rabbit, born into this world with “needer huff ner hawn” – neither hooves nor horns – has to use trickery to survive. The enjoyment of his amoral, and immoral, adventures was made all the more fun by a thinly veiled code: the black slave out-foxing his white masters.

It has been said that these stories were usually told by one adult to another. And children, if they were lucky would get to listen in.

And the adult tone of many of the stories reflects this. Stealing, lying, cheating, torture, savage beatings, and even cold-blooded murder are normal fare for what has been described as “this malevolent rabbit”.

Take “The Sad Fate of Mr. Fox,” in which Brer Rabbit not only tricks Brer Fox into getting himself beaten to death by Mr. Man, but then takes Brer Fox’s severed head back to his wife pretending that it’s beef for her soup pot. Another story has Brer Rabbit slowly scalding Brer Wolf to death, while yet another has him killing Brer Bear by engulfing him in a swarm of bees.

Several stories even touch on sex as a theme, usually with Brer Rabbit beating Brer Fox and the other animals for the attentions of “Miss Meadows and de gals,” who then make merry in a thinly disguised brothel.


Br’er Fox and Br’er Bear from Uncle Remus, His Songs and His Sayings: The Folk-Lore of the Old Plantation, by Joel Chandler Harris

But perhaps no tale better demonstrates Brer Rabbit’s supreme wickedness than “Mr. Rabbit Nibbles Up the Butter”, in which “lumberin'” Brer Possum gets burned to death in his own fire. The little white boy who is listening to old Uncle Remus tell this dark tale protests indignantly that since Brer Rabbit stole the butter, he should be the one to be punished for it, not poor Brer Possum. To which Remus shrugs and says: “In dis worl’, lots er folks is gotter suffer fer udder folks sins.”

In the late 1990s, I travelled along the coast of West Africa with a good friend of mine, Winston, a West Indian with African slave ancestors who had been born on the small Caribbean island of St Vincent.

On the westerly shores of Ghana, there is a beautiful stretch of beach, lined with palm trees, where the Atlantic surf crashes up on the golden sand and creates the very image of a tropical paradise. And here, on a promontory, a centuries-old European slaving castle stands like a dark, brooding equatorial Elsinore.

This is Cape Coast Castle, and for almost four centuries, this was the centre of the North Atlantic slave trade in West Africa.

Cape Coast Castle. Centre of the West African slave trade for over four hundred years.

The castle itself is a dark, oppressive place. The immeasurable human pain and suffering it has witnessed, over hundreds and hundreds of years, seems to be ingrained into the very fabric of the walls.

The Gate of No Return, Cape Coast Castle, Elmina

Within the bowels of this castle is a doorway that is known as “The Gate of No Return.” Through this doorway you can see the surf crashing on the golden beaches below.

It feels like a portal to another world.

And for millions of Africans it was just that, as they passed through this gate on their way to a life of slavery, over the horizon, in the Americas. If they did not perish on the way.

As Emily Raboteau puts it so powerfully in a piece called “The Throne of Zion. A Pilgrimage to São Jorge Da Mina, Ghana’s Oldest and Most Notorious Slave Castle”:

“This, then, was the door. It struck me as vaginal. You passed through it and onto a ship for Suriname or Curaçao, or through similar doorways for Cuba or Jamaica, Savannah or New Orleans. You passed through it, lost everything, and became something else. You lost your language. You lost your parents. You were no longer Asante or Krobo, Ewe or Ga. You became black. You were a slave. Your children inherited your condition. You lost your children. You lost your gods, as you had known them. You slaved. You suffered, like Christ, the new god you learned of. You learned of the Hebrew slaves of old. In the field, you sang about Moses and Pharaoh. You built a church, different from your masters’. You prayed for freedom. You wondered about the Promised Land, where that place might be.”

The only things they carried with them were their memories and their stories.

After a few hours in this dark claustrophobic castle, we were all quite relieved to get out into the late afternoon sunshine.

George, a local teacher who had offered to show us around the castle, suggested a place a little way back down the coast where we could get a cold beer.

An hour or so later, we were sitting outside a small wooden bar, on a beach a couple of miles east of Elmina, watching the sun set over the promontory and the castle, and swapping stories.

As the light began to fall George started to tell Anansi stories. It emerged that Winston had been told similar stories, by his grandmother, as a child on the Caribbean island of St Vincent. Our spirits revived with the cold beer, Winston told one of his Anansi stories. Then George told one of his. Then Winston responded with another.

This went on for a while, when, with the sun slowly setting behind the silhouette of Elmina Castle, something really extraordinary happened:

Winston told a particularly funny Anansi story…

One that George had never heard before…

And at that moment it struck us all like a thunderbolt… At some remote point in the last four hundred years, this story had travelled over the vast expanse of the Atlantic Ocean to the Caribbean island of St Vincent. (After, perhaps, passing through the “Gate of No Return” which stood ominously behind us in the gathering darkness.) There it was passed down, from generation to generation, until Winston brought it back across the Atlantic, to share with us that evening.

The fact is that these Brer Rabbit and Anansi stories have the most amazing ability to travel across vast swathes of space and time. And media.

Which is why these trickster tales are alive and well, and still being shared on a daily basis.

Despite the fact that many of the original books are out of print and the movie “Song of the South” is deemed by the executives at Disney to be too politically sensitive to re-release. And despite the fact that there have been many attempts to make the stories more socially acceptable by removing the Uncle Remus character and the use of the Gullah language. These stories are flourishing, not in traditional media, but in that original social media… the shared oral tradition.

The rabbit who survived the Black Holocaust, may well have a few more surprises for us yet.

21
Oct

The visual representation of data: how the Duke of Wellington was almost destroyed by an army of toy soldiers.

We tend to think of the visual display of quantitative information in terms of boring old graphs and pie charts. And yet the visual representation of a narrative is often a more dynamic and powerful storytelling medium than the use of mere words. What’s more, because it is often more readily understood than the complex data that it represents, a visual representation also tends to be believed more readily.

History is a complex affair. This is the story of how a visual representation of data almost changed the course of history.

According to popular history, the battle of Waterloo was a stunning victory for both Britain and the Duke of Wellington, and a crushing defeat for both France and Napoleon Bonaparte.

In fact, the truth was never quite this simple. The Duke of Wellington’s victory hung in the balance for many years after the event, and was only finally secured after decades of campaigning against an army composed, strangely enough… of thousands of toy soldiers.

Information Graphics, or “infographics” as they are more often called, owe much of their popularity to the fact that they can make even the most complex subject accessible, by translating large amounts of complex data into more readily understandable visual media.

In the early 19th century, Information Graphics began to be used as a means of giving complex quantitative information a clear visual narrative. A justly celebrated example is Charles Minard’s graphic representation of Napoleon Bonaparte’s disastrous march on Moscow in 1812. In his “The Visual Display of Quantitative Information”, Edward Tufte says it “may well be the best statistical graphic ever drawn”, because, at a glance, the viewer can see immediately the real, data-based story of the campaign.
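
By way of illustration only, here is a minimal sketch of the same idea in a few lines of modern charting code. This is obviously not Minard’s own method; the three figures used are simply the headline numbers most often quoted from his chart.

# Illustrative only: three widely quoted figures from Minard's chart are enough
# to tell the story of the 1812 campaign at a glance.
import matplotlib.pyplot as plt

stages = ["Crossing the Niemen", "Reaching Moscow", "Returning home"]
troops = [422_000, 100_000, 10_000]   # approximate, commonly cited figures

plt.plot(stages, troops, marker="o")
plt.ylabel("Men remaining in the Grande Armée")
plt.title("Napoleon's 1812 march on Moscow, in three data points")
plt.tight_layout()
plt.show()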

Because they make the complex simple for the casual viewer, visual displays of quantitative information of this sort are also extraordinarily powerful storytelling devices. And, because they are more readily understood than the data that they represent, these stories also tend to be believed more readily.

It was this simple fact that eventually led to the downfall of a contemporary of Charles Minard, an Englishman called William Siborne.

In a story that is a curiously British reflection of that of Charles Minard and Napoleon, William Siborne set about the task of creating a vast visual representation of the Duke of Wellington’s great victory over Napoleon at the Battle of Waterloo.

Unfortunately for Siborne, his visual representation directly questioned the Duke of Wellington’s, and the British Establishment’s, account of how the battle had been won… and indeed of who had actually won it.

William Siborne came upon his explosive discovery quite by accident. In 1829, the British army proposed the creation of a United Services Museum and wanted a scale model of the Battle of Waterloo as its central exhibit. Lieutenant Siborne, a brilliant topographer and military surveyor, then aged 32, received the commission.

Siborne was a meticulous researcher. He spent eight months surveying the battlefield in minute detail, and a further eight years clarifying each soldier’s position at each stage of the battle. He compared accounts from official dispatches and printed memoirs, and, by means of some rather modern-looking pre-printed questionnaires, conducted an unprecedented research programme with surviving veterans from all sides – English, French, and Prussian. It was the first time in history that a battle had been so carefully researched, and at the end of the process, Siborne had an unrivalled understanding of the position of almost every soldier on the field of battle.

This almost god-like, universal knowledge of the battle, gathered from so many sources, was translated into the scale model that Siborne built.

His model was planned on an enormous scale: covering almost 400 square feet, it would represent perfectly every tree, road, and contour of the field of battle. Some 75,000 tin models would represent the deployment of the various forces at the moment of the “Crisis of the Battle” – 7 pm on June 18, 1815 – when events turned decisively against the French Emperor. By this point, the 68,000 British troops with which the Duke of Wellington had started the battle were severely reduced, and his allies – 40,000 Prussians under Field Marshal Blucher von Wahlstadt – were staging their third major attack on the French positions in the village of Plancenoit.

At a time before aerial photography, or for that matter any photography at all, the diorama provided the viewer with a clear view of the battle that could be examined from any angle.

It was, in fact, Information Graphics in 3D.

After innumerable practical problems, the model eventually went on show in London, where it was greeted by rapturous reviews and visited by 100,000 people in its first year.

A detail of “The Large Model” (1838), Siborne’s 3D visualisation of the quantitative information that he had gathered from surviving eyewitnesses.

One person, however, failed to share the popular enthusiasm for the model. While Wellington had initially approved of the project as a monument to his military genius, he had become cooler, and eventually downright obstructive, as Siborne’s researches had progressed.

The key point at issue was the role of the Prussians in changing the course of the battle.

After comparing the records of the Prussian General Staff with those of Wellington’s own officers, Siborne had discovered serious inconsistencies in Wellington’s own account of the battle, which had become the official version of events.

Whilst Wellington had always insisted that the Prussians had arrived later in the day, when the battle was already won, Siborne could now prove, beyond any shadow of a doubt, that they had actually joined the battle several hours earlier than Wellington claimed, and consequently had played a far greater part in the victory than was credited to them by Wellington.

Wellington responded by insisting that Siborne was “mistaken” and demanding that most of the Prussian troops displayed on the model be removed.

In questioning Wellington’s version of events, Siborne was not only undermining a pillar of the British Establishment, but subverting what had become a central element of national mythology: the conviction that Britain alone – and the genius of the Iron Duke – saved Europe from the tyranny of Napoleon.

However, rather than concede to this kind of pressure, Siborne raised the stakes even further, and published, in two lavish octavo volumes with an accompanying folio volume of maps, irrefutable evidence of the decisive Prussian contribution to the victory at Waterloo.

A detail from the folio volume of maps that accompanied the two octavo volumes of Siborne’s “History of the War in France and Belgium in 1815”

Siborne paid heavily for this act of defiance. Wellington’s colleagues at the War Office declined to purchase the model they had commissioned. A proposal for the display of the model at the newly opened National Gallery in Trafalgar Square was quietly shelved.

And as if to underscore the fact that the model, as a visual representation of the data, was more effective than any written text: whilst his own account of the war sold relatively poorly, he had to watch as it was plagiarised by the Reverend George Gleig, a Wellington-backed rival, for a best-selling account of the campaign that shamelessly corroborated the official version.

But Wellington had good reason to campaign so aggressively to discredit both Siborne’s model and his reliability as an historian.

In the early 19th century Britain’s military reputation was still badly tarnished by its humiliating defeat in the Americas. And in 1815, Britain’s relations with Prussia were acutely strained by the suspicion that its ally was intent on territorial expansionism. As Hofschröer points out in “The Duke, the Model Maker and the Secret of Waterloo”, “had the Duke given the Prussians full credit for their role in the battle, it would probably have led to them making even louder demands for further territorial aggrandisement, upsetting the balance of power so laboriously established at the Congress of Vienna”.

So the Duke had little alternative but to seek to discredit Siborne, if Britain’s post-war military reputation were not to be revealed as being founded on a falsehood.

The Duke won his battle… eventually. And Siborne died in poverty and obscurity in 1849.

After almost 200 years, and after many years in storage, Siborne’s Large Model of Waterloo is once again on public display – in the National Army Museum in London. Ironically, it stands right beside the Chelsea Hospital, where Siborne ended his days.

The 4,000 Prussians that Siborne had so carefully hand-painted are still missing. But the model is an extraordinary testament to a time when an army of model soldiers struck fear into the hearts of the British Establishment. And to the fact that, in a contest between quantitative data and a clear visual representation of that data, it is always the visual that persuades more effectively.

8
Apr

Vapour Trails

I was driving across the burning desert
When I spotted six jet planes
Leaving six white vapor trails across the bleak terrain
It was the hexagram of the heavens
it was the strings of my guitar
Amelia, it was just a false alarm

The drone of flying engines
Is a song so wild and blue
It scrambles time and seasons if it gets thru to you
Then your life becomes a travelogue
Of picture post card charms
Amelia, it was just a false alarm

People will tell you where they’ve gone
They’ll tell you where to go
But till you get there yourself you never really know
Where some have found their paradise
Others just come to harm
Oh, Amelia it was just a false alarm

A ghost of aviation
She was swallowed by the sky
Or by the sea like me she had a dream to fly
Like Icarus ascending
On beautiful foolish arms
Amelia, it was just a false alarm

Maybe I’ve never really loved
I guess that is the truth
I’ve spent my whole life in clouds at icy altitude
And looking down on everything
I crashed into his arms
Amelia, it was just a false alarm

I pulled into the Cactus Tree Motel
To shower off the dust
And I slept on the strange pillows of my wanderlust
I dreamed of 747s
Over geometric farms
Dreams Amelia – dreams and false alarms

“Amelia” by Joni Mitchell

The more we know about people, socially, culturally and personally, the more we feel we can anticipate how they might respond to any given situation.

And yet, it is impossible to predict anyone else’s behaviour with certainty.

Despite how close we might feel to another human being, we can never really tell what they are going to do next.

Each of us has our own unique sense of being… the sense of an autobiographical self that is poised between the remembered past and the anticipated future.

And the nature of this sense of being, whether we realise it or not, is freedom.

And whilst it is sometimes hard to realise this sense of freedom in ourselves, it is practically impossible to experience it in others, because we can only ever experience their actions in the past.

This sense of freedom is central to the ideas of the Existentialist philosopher, Jean-Paul Sartre.

Hazel Barnes is probably best known as the person who introduced the works of Jean-Paul Sartre to a wider American audience. But as her writing shows, she was also something of a philosopher herself.

To explain the idea that freedom is at the heart of human experience, she came up with a wonderful visual analogy.

A visual analogy that is both evocative and deeply illuminating.

The way we experience other human beings, she explains, is a bit like what happens when you look up and see a jet aircraft flying across a clear blue sky.

You can see the white vapour trail stretching out for miles behind the plane, so you know where it has come from, and what the pilot’s previous actions have been.

In other words, you can experience his past. And from this past, you can also anticipate where the pilot might go in the future.

What you cannot really know, however, is what is happening in the mind of the pilot or indeed what the pilot’s next move will be.

As an existentialist, Barnes understood that the core reality of every human being is freedom.

And that our daily denial of this essential nature is what Sartre characterised as “Bad Faith”, since it is a fundamental betrayal of our true selves.

Just because the pilot is flying in one direction does not mean that he will always do so.

Every second he flies in that direction, he is doing so, not because he has no alternative, but because he is choosing to do so.

Because his true nature is freedom he has the potential to fly in any direction he wants.

And this is the unknowable part of any other human being.

Autumn foliage reflected in a lake, with aircraft vapour trails in the sky above.

It is 8:30am on a beautiful, clear Autumn morning in the pretty town of Albany, in upstate New York, where the trees are all changing colour to a brilliant blaze of red and gold.

Anyone who happens to look up at the clear blue sky might see the pure white line of a single vapour trail stretching across the sky from the East.

And, if they stop for a moment and carry on watching the plane, they will see something really quite unusual.

Because at this point the plane, which is a Boeing 767, banks sharply and turns south, leaving a vapour trail at right angles to its original flight path.

The aircraft is in fact American Airlines Flight 11, flying from Boston to Los Angeles, and the pilot is a man called Mohamed Atta.

Atta had been born on September 1, 1968 in the town of Kafr el-Sheikh, close to the shores of the Mediterranean Sea, in Northern Egypt.

His parents were ambitious for him, and, as a child, Atta was discouraged from socialising and spent most of his time at home studying. His father, Mohamed el-Amir Awad el-Sayed Atta, was a successful lawyer. His mother, Bouthayna Mohamed Mustapha Sheraqi, was also highly educated.

Atta had two sisters, and his parents were equally ambitious for them too. One would go on to become a medical doctor and the other a college professor.

In 1985, all this hard work paid off for Atta, when he secured a place at Cairo University to study engineering.

Atta excelled at his studies here, and in his final year he was admitted into the prestigious Cairo University architecture programme. In 1990, he graduated with a degree in architecture.

Nobody could have predicted that ten years later he would find himself at the controls of a Boeing 767, as he flew over the golden forests of upstate New York.

Nobody could have known that he would turn the plane away from its destination, Los Angeles, and begin flying due south down the Hudson River Valley towards New York.

And nobody could have imagined that around fifteen minutes later, at just after 8:46am, he would straighten up the plane, accelerate, and fly American Airlines Flight 11 straight into the side of the North Tower of the World Trade Center.

With his strong engineering background, Mohamed Atta of course knew that the Boeing 767, traveling at over 465 miles per hour and carrying more than 10,000 gallons of jet fuel, would explode immediately, killing everybody on board.

But nobody else could have predicted that.

Not even Atta’s father, Mohamed el-Amir Awad el-Sayed Atta, who, to this day, continues to deny that his son could ever have been involved in something so unthinkable.

Because, however close we feel to another human being, we can never really tell what they are going to do next.

World Trade Center Attacked