DBpedia mashup: the most important dead people according to Wikipedia
The timeline below shows the names of dead people and their lifespans, as retrieved from Wikipedia. They are arranged so that people nearer the top are the best linked-in on Wikipedia, as measured by the average number of clicks it would take to get from any Wikipedia page to the page of the person in question.
I had imagined that Wikipedia 'linked-in-ness' would serve as a proxy for celebrity, which it kind of does - but only in a loose way.
Values range from 3.72 (at the top) to 4.04 (at the bottom). This means that if you were to navigate from a large number of Wikipedia pages, using only internal Wikipedia links, it would take you, on average, 3.72 clicks to get to Pope John Paul II. This data set was made by Stephen Dolan, who explains the concept better than I do. Basically, it's the 6 degrees of Kevin Bacon on Wikipedia.
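To make the metric concrete, the sketch below (my toy illustration, not Stephen Dolan's actual code) computes that average for a target page by breadth-first search over a small hypothetical link graph, searching backwards from the target so that each BFS level is one click further away:

```python
# Toy sketch of the click-distance metric (illustrative only).
# Pages that can't reach the target at all are left out of the average.
from collections import deque

def average_click_distance(links, target):
    """Mean number of clicks needed to reach `target` from every other page.

    `links` maps each page title to the titles it links to.
    """
    # Invert the graph: which pages link *to* each page?
    backlinks = {}
    for page, outgoing in links.items():
        for dest in outgoing:
            backlinks.setdefault(dest, set()).add(page)

    # BFS backwards from the target; each level is one more click away.
    distances = {target: 0}
    queue = deque([target])
    while queue:
        page = queue.popleft()
        for source in backlinks.get(page, ()):
            if source not in distances:
                distances[source] = distances[page] + 1
                queue.append(source)

    others = [d for page, d in distances.items() if page != target]
    return sum(others) / len(others)

# Hypothetical four-page link graph, just to exercise the function.
toy = {
    "UK": ["Billie Jean King", "Pope John Paul II"],
    "Billie Jean King": ["UK"],
    "Tennis": ["Billie Jean King"],
    "Pope John Paul II": ["UK"],
}
print(average_click_distance(toy, "Pope John Paul II"))  # 2.0
```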
I looped through the data set and queried DBpedia to see if the Wikipedia article was about a person, and if so retrieved their dates of birth and death.
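For anyone who wants to reproduce this, the lookup amounts to one SPARQL query per title against DBpedia's public endpoint. The Python below is a minimal sketch of the approach (not my original script), assuming the standard dbo:Person, dbo:birthDate and dbo:deathDate terms:

```python
# Minimal sketch of the per-article DBpedia lookup (illustrative only).
import requests

DBPEDIA_SPARQL = "https://dbpedia.org/sparql"

def person_dates(wikipedia_title):
    """Return (birth, death) if the article is about a dead person, else None."""
    resource = "http://dbpedia.org/resource/" + wikipedia_title.replace(" ", "_")
    query = """
        SELECT ?birth ?death WHERE {
          <%s> a <http://dbpedia.org/ontology/Person> ;
               <http://dbpedia.org/ontology/birthDate> ?birth ;
               <http://dbpedia.org/ontology/deathDate> ?death .
        }""" % resource
    response = requests.get(
        DBPEDIA_SPARQL,
        params={"query": query, "format": "application/sparql-results+json"},
    )
    bindings = response.json()["results"]["bindings"]
    if not bindings:
        return None  # not a person, or dates missing from DBpedia
    return bindings[0]["birth"]["value"], bindings[0]["death"]["value"]

print(person_dates("Guy Fawkes"))  # e.g. ('1570-04-13', '1606-01-31')
```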
The timeline does show a certain amnesia on the part of Wikipedia: Shakespeare and Newton are absent, while the Romanian historian of religion Mircea Eliade comes 5th. If I had included people who are alive, tennis players would have dominated the list (I don't know why) - Billie Jean King is the second best-linked article on Wikipedia, one ahead of the USA (the UK is number one!).
Any mistakes (I have seen some) are due to the sketchiness of the DBpedia data, though I can't rule out having made some mistakes myself...
The results are limited to the top 1000, and they only go back to 1650. Almost no names from before 1650 appeared, the exceptions being Jesus (who was still miles down) and Guy Fawkes.
In case you were wondering 'Who's Saul Bellow below?', the answer is Rudolf Hess.
Sunday, April 11, 2010
Sunday, February 7, 2010
Smoke Stacks to Apple Macs
The digital revolution will not be televised - to the contrary, is it possible that no artist or medium can be said to have adequately addressed the information age?
Zizek once summarised Marx as having said that the invention of the steam engine caused more social change than any revolution ever would. Marx himself doesn't seem to have provided a useful soundbite to this effect (at least not one that I can find through Google), so I'm afraid it will have to remain second hand. It's a powerful sentiment, whoever originated it - which philosopher's views cannot be analysed as the product of the social and technological novelties of his day?
It's easy to see that the technology that is most salient in our age is the internet, which has been made possible by consumer electronics. Have our philosophers stepped forward to engage with the latest technological crop?
Moving on from philosophers, what of our artists? Will Gompertz recently posted to share an apparently widely held view that no piece of art has yet spoken eloquently from or about the internet. He cites Turner Prize-winning Jeremy Deller describing "a post-Warholian" era, presumably indicating that Warhol was the last person to adequately reference technological change, in the guise of mass production. I wonder if the Saatchi-fuelled inflorescence has also captured something of the marketing-led landscape we currently live in, but whatever the last sufficient reflection on cultural change afforded by art was, I think we may be on safe ground in stating that the first widely accepted visual aperçu of the digital era is still to come.
Which is something of a surprise when you consider, for example, how engaged the news agenda is with technology: I was amazed to see that Google's Wave technology (still barely incipient) got substantial coverage on BBC News.
With my employment centring on the web, and my pretensions to cultural engagement, this weekend I visited the Kinetica Art Fair. Kinetica is a museum which aims to 'encourage convergence of art and technology'. The fair certainly captured one aspect of the contemporary mood - a very reasonably priced bar was a welcome response to our collective financial deficit.
Standout pieces included a cleverly designed mechanical system for tracing the contours of a plaster bust onto a piece of paper, and a strangely terrifying triangular mirror with mechanically operated metal rods. It looked like a Buck Rogers-inspired torture device designed to inflict pain by a method so awful that you'd have to see it in operation before its evil would be comprehensible. Other works included an opportunity for punters to simulate pan-global urination (sadly not with real urine), by means of a jet of water and a globe in a urinal. I would defy anyone not to be entertained by spending time wandering round the fair.
However, Will Gompertz's challenge was not answered at Kinetica - the essence of technological modernity was not distilled into any of the work, not even slightly.
I've been mulling over various possible reasons for this failure, and quite a few suggestions spring to mind. Do computers naturally alienate artists? Is information technology too visually banal to be characterised succinctly?
I'd like to suggest that it's the transitory nature of our electronic lives that makes them so hard to pin down. Mobile phones, web sites, computers and operating systems from a decade ago all look ludicrously dated - it's almost impossible to capture the platonic form of these items because they have so little essential similarity. Moreover, their form is almost an accident, not connected with their more profound meaning in any way. The boats of the mercantile age and the smoke stacks of the industrial age all seem to denote something broader - how can communism be separated from its tractors? Yet the form factor of my computer is trivial. Form and functional significance are of necessity separated in digital goods; their flexibility is the source of their power.
In some way I think films give us a tacit acknowledgment of the contingent nature of the digital environment in which we spend much of our lives: no protagonist is ever seen using Windows on their computer; in films, computers' interfaces are always generic. When we see a Mac in a film it is impossible to see it as anything other than product placement.
So, the Kinetica Art Fair may not have been able to help society understand its relationship with technology, but actually, despite its rhetoric, I think it was a little unfair to expect it to. Really the fair was about works facilitated by technology, rather than about technology itself.
But, in case you think I've picked a straw man in Kinetica, let me say that the V&A's ongoing exhibition Decode really does no better, though its failures and successes are another topic.
Monday, February 1, 2010
Internet Protocol and HRH the Queen
Whatever we end up using the web for, don't the world's citizens lead more equal lives if they are all mediated by the same technology?
The Queen tweets. She's commissioned a special jewel-embossed netbook and a bespoke Twitter client skinned with ermine and sable.
I made that up. For starters, she hasn't actually started tweeting - there is a generic royal feed which announces the various visits and condescensions of Britain's feudal anachronism, but nothing from Miss Fancy Hat herself. Perhaps royal protocol means she can only use it if her followers can find a way of curtsying in 140 characters?
The feed does give an insight into how boring the Royals' lives might actually be - opening wards and meeting factory workers - when they aren't having a bloody good time shooting and riding. However, as a PR initiative it breaks the rule that for a Twitter account to be of any interest its tweets must emanate from the relevant horse's mouth, if you'll forgive the chimerical metaphor. If you can't have the lady herself, I don't really think there is much point in bothering. But that's not the point I'm here to make.
I'm more interested in the fact that, should any of us choose to, Bill Gates, Sir Ranulph Twisleton-Wykeham-Fiennes, 3rd Baronet OBE, Osama Bin Laden and I will have exactly the same experience when we use Twitter (assuming it's available in the relevant language).
I suppose Bin Laden might have quite a slow connection in Tora Bora, and probably Bill Gates has something faster than Tiscali's 2meg package. Details aside, everyone is doing the same thing.
Actually, not only will we be using the same website, we'll be using very similar devices. Bill probably doesn't have a Mac like me (he may be the richest man in the world, but he can still envy me one thing), yet all our computers will be very similar.
The reason for this is that both websites and computer technology have very high development costs and low marginal costs per user. Even the Queen can't afford to develop an iPhone, but everyone can afford to buy one.
If a lot of your life is mediated by technology then this is going to be very important to you. While there is healthy debate about the web's democratisation of publishing, I think we might reasonably add to the web's egalitarian reputation its ability to give people of disparate incomes identical online experiences.
That doesn't sound like a blow against inequality and tyranny in all its forms - but nonetheless I think it's important. Even people using OLPC computers [low-priced laptops aimed at the third world] have basically the same experience of the internet as you or I. That's to say, Uruguayan children will quite possibly spend a good part of their day doing exactly the same things as New York's office workers and Korea's pensioners. When you consider that until very recently there were probably no major similarities in these disparate lives, I think it does constitute a significant development.
Of course, for all I know a line of luxury websites will come along and exclude some strata of the social pile. In a way it's already happened - we've seen the thousand-dollar iPhone app - but it's hard to see this one-off as part of a pattern. This is not to say that the 'freemium' business model [basic website for free, pay to get the premium version] couldn't exclude certain people; it's more that this model can only exist when there isn't much pressure for a free version. At the moment, there aren't any widely used web applications that aren't available at zero cost - of course this may change if your audience is sufficiently well off to attract paid advertising, but there again it may not.
This is a phenomenon that's been observed before: technology tends to eliminate differences between cultures. It's been termed the Apparatgeist, a concept developed in response to the observation that mobile phone habits, once differentiated locally, are now more or less identical in all developed economies. As a concept it surely applies equally well to class and income - leaving us in a more equal experiential world. And perhaps also a monoculture - but then isn't that entailed in the new equalities that so many internet optimists evangelise?
Thursday, January 7, 2010
Dinner with Portillo [amuse bouche]
I'd like to summarise this post but it doesn't actually make a point. I was just hypnotised by Michael Portillo's rubbery features as he presented the Dinner with Portillo series - if you will, his bouche amused me.
Since a casual discussion of Michael Portillo arose in the office he seems suddenly to loom large in my psyche - and in my iPlayer. Recently he has had as many as three programmes on the BBC simultaneously.
Dinner with Portillo is a strange invitation, but none the less I joined him in the back room of some posh restaurant via the medium of television. He was hosting his various guests as they discussed the merits of the political diary. Portillo himself seemed to be animated by a passionate dislike of political diaries, and ruled out the publication of his own memoirs.
Surprisingly, the format worked, and there was certainly a sense that the food and the booze and the polite surroundings away from the harsh lights of the studio led to a more honest discussion. The success of the programme came from the window it afforded into Westminster life - at least, Westminster life as I imagine it: people referring to civil servants by first name, having read, as a matter of course, all the substantial political diaries, and gently playing out their political views by needling on a personal level those of other persuasions.
It's undeniable that Portillo was excellent at stepping in when necessary, and letting the conversation flow naturally when he wasn't guiding it.
Anthony Howard was something of a treat. If a fossil of his dental palate is discovered, palaeontologists will certainly classify him as a different species. We were treated to a shot of him browsing on some bread, which gave the sensation of a wildlife programme. In fact the whole thing had something of the aspect of trying not to interfere with the behaviour of an animal not normally seen in its natural environment, with waiters deftly sustaining the diners with booze and food without upsetting the flow of these rare specimens' various calls and rituals.
Everyone had an anecdote they wanted to get out, except Jean Seaton, who came across as a rather kinder, more impartial and more mature character than everyone else.
One question did hang in the air a little - why must the programme be bookended by segments that feel like they've been lifted from Newsround? In that crappy way that programmes about the world's worst car chases spin out their material, we were introduced to each of the guests twice; we started with a monologue from Portillo getting ready in his bedroom; and the awed tones in which each of the guests was explained to us gave the impression that the programme makers thought that, in the absence of the audience actually knowing who any of the guests were, they were free to make them sound as grand as they wanted.
But, it seems to me, if I'm watching this kind of programme then I'll know who most of the guests are, and I don't need it to have some trite question to peg the discussion to (allegedly the point was to find out if political diaries were dishonest, which was hardly the focus of the conversation), nor any of the other paraphernalia of crap TV. Can't TV be grown up just once? When you're making TV for BBC Four surely you can accept that you're not going to steal viewers from Two Pints of Lager and a Packet of Crisps?
Of course, what I really want to know was what they talked about down the pub afterwards, when they actually got drunk, and actually spilled the beans. Fortunately, if I cared enough I could easily find out by waiting for any one of their diarised accounts of the event to be published.
Monday, January 4, 2010
Dinner with Portillo [main course]
A criticism of the ‘transhumanist’ point raised in Michael Portillo’s prandial discussion of morality and science.
The Dinner with Portillo TV series continues to fascinate me - this time his guests were discussing the moral status of scientists. If you need someone to miss the point, ask for Susan Greenfield. Mr Portillo’s decision to invite her to his dinner party discussion was a point-missing case in point – but if you want more evidence for my hypothesis you can look to her widely published scepticism about the web, which has always struck me as poorly reasoned too.
In this circumstance it’s not fair to single her out though, because all the guests seemed to fumble with an important nub of the transhumanist issue.
In case you haven’t delved into this topic yourself, transhumanism is simply the idea that technology might be used to radically change and improve human capabilities beyond those bestowed on us by natural biological processes. In a common example, computer memory might be implanted in your brain, with the revolutionary effect of preventing you from ever losing the car keys (or, perhaps more importantly, preventing you from forgetting anything at all).
There are, as you might imagine, some proponents of the idea who fall into the mad scientist trope – for example Aubrey de Grey; but it’s an idea to take seriously too - Nick Bostrom is a good place to look to find a robust defence. (He has a great website here.)
Professor Bostrom, as a surefooted philosopher, would quickly have taken Susan Greenfield to pieces. Her argument held that because we could not specify what a perfect human looked like, we could not take transhumanism seriously. This is of course rubbish. We don’t know what the perfect painting looks like, but that doesn’t mean we aren’t able to rate painters, or strive to improve our painting ability. Portillo did try to tell her this, and predictably she wasn’t interested in hearing it.
But the underlying assumption of the discussion, the enthymeme in the room, was that transhumanism is somewhere in the future. Are we not already, in an important way, transhuman? In the previously noted example, there was horror at the thought that a computer might be implanted to give greatly enhanced cognitive abilities. But what is really qualitatively different between me reaching into my pocket for the calculator on my phone and me accessing a mathematical faculty that has been prosthetically added to my brain? In either case I can do a sum that 50 years ago was beyond the reach of even the most powerful computer. The difference is basically a question of ergonomics. Right now I need to have the phone with me; in the future I may have direct mental access. Surely the most important upgrade in human faculties has already happened. Being able to do the sum in my head would be the icing on the cake, not the beef in the beef wellington.
The same must surely be true of so many other things. I don’t have an artificially improved memory, but I do have the internet in my pocket, which has a similar effect on my ability to recover certain kinds of information.
I will admit that there is a big psychological barrier around the idea of having your actual body enhanced, as opposed to using a device to get the extra functionality, but the reality is that there would be little difference in practice.
I think a great example is the recently developed augmented reality goggles which give mechanics information on the things they are looking at and guide them through various repairs and procedures using Terminator-style graphic overlays. If someone suggested mechanics be implanted with an automotive knowledge chip, Susan Greenfield would probably organise a press conference to condemn it. But the goggles, which do exactly the same thing in all practical respects, barely warrant a ripple in the media.
It’s a shame that Susan Greenfield is so leery of enhancement, because it’s beginning to look like she might benefit from some herself… OK, I don’t mean it – I’m sure Baroness Greenfield, Professor of Synaptic Pharmacology at Oxford, deserves more credit than I’ve given her.
Tuesday, October 6, 2009
TEDx Manchester and its Discontents
Whenever I watch a TED video it's always so optimistic. Perhaps the independent TEDx event I attended in Manchester was under the pall of the city's ceaseless rain, because it focused on some less than rosy home truths.
Content? Are you? Not if your job is to produce content. The anodyne catch-all phrase for creativity as mediated by the web belies a bloodbath of job losses in newspapers, music and TV. The Evening Standard has recently accepted that what a consumer will pay for its product is zero, but it was last Friday at TEDx Manchester that a simple message came home to me.
It is conceivable that content is just something you can’t monetise in the era of the internet. Historically, publishing has been fraught with similar difficulties; some of the world’s most influential books were utterly unable to remunerate their creators. Dr Johnson required a royal pension to keep him afloat despite having written the first full-scale dictionary; likewise Diderot managed to remain poor after producing the West’s most famous encyclopaedia. No wonder publishers struggle when all the profundity they can muster is the Evening Standard. Are we simply returning to the equilibrium where creativity is next to impossible to convert into cash?
At TEDx, Guardian Digital Editor Sarah Hartley presented hyperlocal journalism (basically a local resident keeping a blog) as a possible future for news media, but she also admitted that she had no idea how journalists might earn a crust from this pursuit.
The next speaker to play into this theme was Marc Goodchild, head of Interactive for BBC Children's, who told us (amongst much else of interest) that from the age of 12 most kids start to spend their time predominantly on social networks and games - two areas where in effect you make the content yourself. He also told us that, for the first time, children's game play and internet use combined account for more hours than TV viewing.
Hugh Garry, a Radio 1 producer, made the point even more firmly. His talk focused on a project that involved handing out mobile phones at festivals and asking people to record their experiences. The material was gathered into a film called “Shoot The Summer”. This exercise illustrated an interesting technical fact: mobile phones can produce footage that is perfectly watchable at cinema size.
A more subtle point was that most of the recipients of the mobile phones had a great natural sense of what would make interesting footage. If you don’t believe me, check out the film. And if you think that he just has the good bits from millions of hours of people taking drugs in tents, well, you’re right. That’s exactly the point – where is the space for the professional when a million amateur YouTube clips can be relied upon to produce a thousand gems? Of course, the content generation generation will also have a more natural sense of how to use a video camera compared with those for whom such devices are fresher developments.
Against a backdrop of the inevitable Twitterfall, and the equally inevitable Mancunian rainfall, the possibility of the end of professional content production took root in my mind. What medium might remain immune? Film? Surely this is the medium with the highest barrier to entry protecting its profits. Perhaps, but in a projected video of a JJ Abrams TED talk we were all told we had no excuse not to be making films now that the relevant hardware is so cheap.
I don’t really doubt that there are a number of ways for paid journalists or film directors to survive, and it’s not news that the internet has put the squeeze on certain professions. There is a feeling, though, that we are just waiting for really cheap credit card transactions, or for Murdoch to spearhead online paid content, or for some other technological development to restore the professionals to their throne. That might be misplaced optimism. Indeed some top journalists may be reduced to giving talks to a load of half-arsed bloggers, perish the thought.
Monday, June 1, 2009
Matthew Parris' note from the underground
Do people live their lives differently to fulfil their obligations to writing? Is contriving your life to be Tweetable acceptable?
Matthew Parris’s melodic voice was easily called to mind as I read his recent article in The Spectator. In his soft-spoken lilt he detailed a moment of pique on the London Underground, the subject of his ire being TfL’s decision to close the connection between Bank and Monument stations in one direction, a rule enforced by an escalator that conveys passengers up but not down.
Our protagonist struck a blow against the system by refusing to return to street level to make the connection, as those without Mr Parris’s anguished relationship with public transport might, instead dashing down the escalator the wrong way. Parris may have stood on the right previously, but on this occasion commuters must have been surprised to see him descend on the left.
I had imagined that he might have struggled to make the distance, fooled by his soft voice and gentle demeanour; I now discover he is in fact the fastest living marathon runner to have sat as an MP. He was, he stated, fuelled by a burning sense of injustice.
But I think he was also fuelled by something else - the need to write an article for The Spectator. It would be too much to imply that petty rule-breaking is the only means for a man with Parris’s talents to conjure an article; at the same time I don’t doubt that the same journalistic bent must have automatically packaged this handy anecdote into 800 words as he battled against the receding treads.
Without conferring the pejorative term 'anecdotalist' on anyone, these types of stories are the meat and potatoes of much journalistic writing - no news there. Having to come up with a bite-sized morsel of zeitgeist on a regular basis must cause one to be constantly alive to the possibility that your weekly topic lurks in the article you are reading, the post office queue you are in, or a conversation you had at the school gates. You must, I would suggest, encourage journalisable events to occur, at least on a subconscious level.
And surely, if this is the case, as more and more people have a quota of written output to fulfil, more and more people will live their lives in this way. I’m not referring to an increase in the number of professional writers, which certainly isn’t on the cards, but many people have a responsibility to a Twitter account, a regime of Facebook updates to keep up or even a full blown blog to maintain.
Next time you see an unreasonable argument in a restaurant, a petty provocation of social norms, or perhaps even a novel act of kindness, you may be witnessing the need to construct a life that makes good reading. Now that Virgin Trains have introduced WiFi on trains, perhaps we will all have something sensational to read on them. And there again, perhaps not.