Saturday, 30 April 2011

Early-'90s Retro-Haunto-Shite Dumped


I'm not a QPR fan (by any stretch) but on the day they return to the Premier Land after 15 long years of exile, it seems like an apt moment to reflect on this, err, classic artefact. Says just about everything there is to say about the early-'90s in one small, boxfresh, paradisal space.

Sir Les, to you.

Thursday, 28 April 2011

Conforming To A Pattern


"And the bitter conclusion is that it is all irretrievably over with the arts of form of the West. The crisis of the nineteenth century was the death-struggle. Like the Apollinian, the Egyptian and every other, the Faustian art dies of senility, having actualised its inward possibilities and fulfilled its mission within the course of its Culture.

What is practised as art today - be it music after Wagner or painting after Manet, Cézanne, Leibl and Menzel - is impotence and falsehood. One thing is quite certain, that today every single art-school could be shut down without art being affected in the slightest. We can learn all we wish to know about the art-clamour which a megalopolis sets up in order to forget that its art is dead from the Alexandria of the year 200. There, as here in our world-cities, we find the pursuit of illusions of artistic progress, of personal peculiarity, of "the new style", of "unsuspected possibilities", theoretical babble, pretentious fashionable artists, weight-lifters with cardboard dumb-bells - the "Literary Man" in the Poet's place, the unabashed farce of Expressionism, which the art-trade has organised as a "phase of art history", thinking and feeling and forming as industrial art. Alexandria, too, had problem-dramatists and box-office artists whom it preferred to Sophocles, and painters who invented new tendencies and successfully bluffed their public. The final result is that endless industrious repetition of a stock of fixed forms which we see today in Indian, Chinese and Arabian-Persian art. Pictures and fabrics, verses and vessels, furniture, dramas and musical compositions - all is pattern work. We cease to be able to date anything within centuries, let alone decades, by the language of its ornamentation.

So it has been in the Last Act of all Cultures."

- Oswald Spengler, "The Decline Of The West"



The story of Western art in the 20th century was that of its desperate, hopeless struggle against ossification and irrelevance. In comparison with the previous 500 years, which had seen the gradual evolution of technique and tincture in painting, and of instrumentation and acoustic space in music, it saw a profusion of disparate styles, ever emerging, fusing and fissioning before tailing away into cul-de-sacs of obscurity or rarefaction. Burdened with an art that had crystallised into its final form, and would no longer yield mana, artists could only respond in an ad hoc and provisional manner, seeking any means they could to breathe life back into the dying disciplines.

These efforts usually entailed one of two strategies: firstly, to incorporate and adapt styles from cultures that had previously been despised and therefore neglected. In music this meant incorporating innovations from Black America and later the Caribbean. In the plastic arts of painting and sculpture it meant adopting "primitive" styles from anywhere from Africa to Polynesia. Secondly, whatever forms already existed were abstracted, so that any remaining mana could be iteratively sieved out. When miscegenation and abstraction had been exhausted, two other now-familiar artistic gestures would result - nihilism and shock tactics. Although nihilistic art movements such as Dada and Punk are often written of as responses to the extreme socio-political currents of the 20th century, they were even more a response to its artistic-cultural exhaustion - a howl in the face of expressive extinction. Classical music, jazz and rock all ended their productive phases with abstract-nihilist gestures that, though decades apart, sounded hauntingly similar in their brittle, atonal desolation.

It is this alternating pattern of miscegenation and abstraction that gave 20th-century culture its strange, giddy quality, as the adoption of a new form could suddenly generate a new optimism and vigour in the arts and popular culture, before its premature exhaustion within the rationalising cultural superstructure would lead to the bitter deflation of hope. New eras and new ages were perpetually declared and quickly found to be false dawns. The sense of desolation was amplified in the wake of the constant hyperbole and "buzz" that necessarily saturated the culture, inflating the mana-affect of exciting new styles while simultaneously attempting to ward off the underlying sense of doubt and anomie. The abstract-nihilist phase was invariably policed by particularly virulent progressivist rhetoric, partly to ward off the layman who might impolitely mistake the work for a cacophonous din, like the apocryphal cleaners who disposed of priceless conceptual works from modernist galleries, but mainly as an act of bad faith: the maintenance of the collective illusion that any 20th-century art movement was capable of going anywhere.

Britain in the 1990s was to witness two parallel movements that affected to reinvigorate the nation's artistic culture. One of them, Britpop, was a classic revitalisation movement in the tradition of the Ghost Dances of the Plains Indians of the late 19th century - a call to long-dead ancestors to replenish the spirit-well. The other, the Young British Artists, was a farce, a flurry of gestures as a disparate band of hucksters marketed their unlikely wares to plutocrats grown fat on the decade's credit binge.



British rock musicians in the early Nineties had been roused from a decade of torpor by the thundering arrival of Grunge from the US northwest. Although in themselves neither musically radical nor innovative, the Grunge bands managed to sound fresh to British ears largely due to their adoption of early-'70s hard rock influences that had been proscribed in the UK by Punk. However, in attempting to fashion both a response to Grunge and a new style of music to suit the times, the British musicians faced a predicament: they could no longer stylistically feed off their traditional sources of inspiration - the great engines of musical innovation and spiritual mana that were Black America and Jamaica. Black American music had, in the form of Hip-Hop, largely shed its mana-lode for the kind of nihilist aggression of which white music already possessed a surfeit. Also, as with the more mainstream R&B of the "swingbeat" producers, it was structurally complex, and not amenable to being broken down and organically re-fashioned. The intricate recording techniques of R&B producers such as Teddy Riley and Jam & Lewis functioned like the filigree designs on a banknote - as a protection against counterfeiting. If you wanted to adopt these guys' sounds, you had to work with them directly, and pay them well.

Two of the early notable British bands of the Nineties, the Manic Street Preachers and Suede, attempted to overcome this impasse by re-invoking those elements of radicalism that were deemed to be indigenous - in the former's case via the kind of Situationist sloganeering not seen since Punk, and in the latter's via recalling the shock-androgyny of the likes of Bowie and Bryan Ferry. What neither band could provide, however, was the necessary aural shock-of-the-new counterpart. The Preachers' adoption of a not-especially-bold combination of New Wave and the lighter end of Glam Metal sounded conservative even in comparison to Suede's blend of Ronson and Marr riffing. That said, both bands represented genuine efforts to squeeze out whatever vitality remained in the dying form, as indeed did the band that was to supplant them as the signature group of the 1990s.



Oasis are generally condemned for a putative conservatism in their harking back to the classic melodicism of the likes of The Beatles and The Jam, but initially, with songs like "Wonderwall" and "Live Forever", it really seemed that they might pull it off, that sheer talent might by itself be able to beat back the shadow of nightfall. Unfortunately they succumbed to that characteristic affliction of the era - a lack of staying power. When rock was young and vital, bands like The Rolling Stones and The Who could undertake coast-to-coast American tours while releasing one or two albums a year and a slew of hit singles. Oasis's American tours were notoriously reluctant and tardy, and by the time of their third album's constipated gestation it was clear that they had prematurely given their best. Moreover, diligent sleuthing on the part of their critics began to expose the jackdaw nature of the band's music. Noel Gallagher wasn't so much a great tunesmith as a connoisseur of other great tunesmiths.



After Oasis came Radiohead and Coldplay, the first of the zombie bands, and British rock, starved of sources of miscegenation, devolved into "landfill indie", a classic example of Spenglerian pattern work, in which bands sounded identical decade after decade, generation after generation, with only the most enthusiastic admirers being able to identify the detail-differences. Hip-hop itself also drifted into pattern work - perhaps the absence of white musicians purloining its structural and textural innovations was ironically a reason for its stasis, the confounding of thieves having been a good reason to keep the music changing. The popular music landscape of the early 21st century is of course one of disparate genres, or multi-genres in the case of Metal, each one labouring away in its silo to the breathless appreciation of its own adherents. Tellingly, any collaborations across genres tend to go under the moniker of "versus", signalling in advance that after the brief fusion, both participants will return to their respective corners.

If Britpop had been a futile attempt to ward off death, Britart, as the work of the Young British Artists was sometimes called, was a kind of grinning post-modern celebration of death - a dancing on art's grave. "Conceptual" art had long been in the business of the industrial-scale production of the baffling, the shocking and the merely titillating, but with Britart any lingering embarrassment regarding the business-commercial relationship between the artist, the patron, and the paying customer was definitively put to rest. It boasted a raft of almost Dickensian characters with its hungry working-class artist-entrepreneurs, its wide-boy dealers, its shady advertising-executive patrons, its dunderhead curators buying cans of "artist's shit" and its mysterious Russian customers purchasing works as part of complex tax-evasion schemes.

Perhaps the most important figure in Britart was the advertising mogul and gallery owner Charles Saatchi, who first witnessed the work of Damien Hirst at a property-developer-sponsored exhibition entitled Freeze in the late 1980s. Saatchi reputedly stood open-mouthed at a Hirst work that featured a cow's head being consumed by maggots. It was to be the start of a beautiful friendship, in which Hirst and his cronies would supply the tabloid-baiting works and Saatchi the marketing power and art trade connections to bring them to the widest possible audience. The worthless art of the YBAs dovetailed neatly into a London that was increasingly making vast amounts of money from worthless activity - from property speculation, insurance scams, reckless credit expansion and opaque financial transactions. The decadence of Hirst's work, his openness about his non-artistic background and use of assistants, no doubt resonated strongly with those who knew that their own wealth was equally the result of fraud and circumstance. It's difficult not to suspect that the riches blown at auction on Hirst and his cohorts' tat were some kind of subconscious potlatch, the plutocrats attempting to cleanse themselves by exchanging their filthy lucre for the most putrid and inconvenient exhibit possible. It is perhaps telling that when Hirst created an object deliberately intended to appeal to the wealthy, a diamond-encrusted skull entitled "For The Love Of God", it failed to find a buyer.

Nevertheless, the media furore surrounding the Young British Artists gathered its own momentum, and exhibitions such as Sensation proved to be popular draws, the public not so much going to see the works, as to see why everyone else was going to see them. Damien Hirst and Tracey Emin became household names, celebrities even. It would probably amaze future generations that such obvious charlatans could be taken seriously even for a few minutes, but then future generations will have far more important work to do than pore over the scrag-ends of our own dying civilisation.

Friday, 15 April 2011

And therefore must his choice be circumscrib'd: the wisdom of David Foster Wallace


David Foster Wallace is fast approaching the status of a "sage writer" (albeit a tragically posthumous one). As everyone knows, Wallace was the Guy Who Moved Things On From Postmodernism, so in a way it makes sense that he should be treated as a sort of modern-day Ruskin, a doler-out of soundbite ethical wisdom in an age trying to recapture sincerity and cohesiveness after the pomo-relativist flood. Middling rockist indie bands are wont to quote the bit in the 1993 essay "E Unibus Pluram" where Wallace talks about the likelihood of the next generation of radicals being a "weird bunch of anti-rebels ... who dare somehow to back away from ironic watching, who have the childish gall actually to endorse and instantiate single-entendre principles". Zadie Smith and Foals are notable celebrity admirers. Meanwhile, Guardian hacks post links on their vanity websites* to Wallace's now very widely quoted 2005 Kenyon commencement speech, a dazzling 20-minute morality lesson (now better known in its transcribed form as "This is Water"), which has become a sort of "Everybody's Free To Wear Sunscreen" for the 2010s. Fuck's sake, when I was scratching around for something pithy to say as a goodbye bow to my middling indie band a couple of years ago (long story), I posted the whole of the Kenyon speech to our Myspace blog. It wasn't really all that relevant to that particular moment of personal crisis, I now realise, but hell, it was the most sagacious thing I was reading at the time, and it seemed to fit in some vague way with the bloody-minded, stand-taking gesture I was making.

Right now, as Wallace's final work - the unfinished novel The Pale King - is about to be published, the Saint DFW tendency is reaching a spectacular climax, accompanied by the sort of inordinate PR hysteria we're all familiar with. The media idolatry is unfortunate, but then again, it's bound to be short-lived. More importantly, you get the impression that, when the culture and publishing industries have moved on to their next five-minute hero/victim, Wallace's voice and legacy will still be just about audible underneath the debris of post-mortem exploitation and expropriation (or that's the hope, anyway).

So what is the legacy? If Wallace was - and still has the potential to be - a modern sage, then what kind of wisdom did he impart? Taking the Kenyon speech/"This is Water"** as a sort of condensed moral manifesto (and despite Wallace's protestations that he shouldn't be regarded as a didactic "wise old fish", he clearly was just this - a willingness to be so was perhaps his greatest contribution to contemporary letters), the most striking thing for me is how damnably conflicted his argument is. The rhetoric, for once, is lucid, cogent, and pomo-free. But the message remains infernally difficult to hammer out. Broadly speaking, Wallace seems to be caught between a visionary perspicuity about "what is to be done" on the one hand, and a self-lacerating, high-sceptical tendency that to a large extent nullifies the affirmative potential of his astonishingly powerful insights. This is not the place to speculate about parallels between this sort of mindset and Wallace's tragic biography. What is certain, though, is that this expression of the struggle between hope and fear, unselfishness and self-directed masochism, positive utterance and negative qualification of it, is the key testament of one of the most harrowingly representative figures of our times. Wallace's struggle was, and remains, an epochal one.    

I'm with Wallace on about 90% of the statements he makes. "In the day to day trenches of adult life, there's no such thing as atheism": so simple, so dead on the mark. "And the world will not discourage you from operating on your default settings, because the world of men and money and power hums along quite nicely on the fuel of fear and contempt and frustration and craving and the worship of self": again, it would be churlish to try to gloss eloquence of this calibre. The speech concludes with the following brilliant penultimate paragraph, which seems to hit so many nails on the head it's not even funny:          
Our own present culture has harnessed these forces [of self/money/appearance/intellect worship] in ways that have yielded extraordinary wealth and comfort and personal freedom. The freedom to be lords of our own tiny skull-sized kingdoms, alone at the centre of all creation. This kind of freedom has much to recommend it. But there are all different kinds of freedom, and the kind that is most precious you will not hear much talked about in the great outside world of winning and achieving and displaying. The really important kind of freedom involves attention, and awareness, and discipline, and effort, and being able truly to care about other people and to sacrifice for them, over and over, in myriad petty little unsexy ways, every day. That is real freedom. The alternative is unconsciousness, the default setting, the "rat race" - the constant gnawing sense of having had and lost some infinite thing.
We're all used to the neoliberal malaise being explained in all kinds of complex theoretical ways, but where else will you find such an economical, profound, even mystical (in the best sense) critique of the radical selfishness and spiritual paucity of neoliberal culture?
     
Yet extraordinarily (and this is evident even in the above passage), the whole weight of Wallace's argument is ultimately predicated on a pronounced self-centredness that ends up merely replaying the victory of a worldview which imprisons us in "our own tiny skull-sized kingdoms". We do not have to look hard for the culprit, the snag that means Wallace cannot finally rise above the atomism that is his ostensible target. Where are we to look to try to get back in touch with real freedom? Ourselves. What is the only virtue, the antidote to self-worship, the "capital-T Truth" that remains after a "whole lot of rhetorical bullshit" has been "pared away", the secret to "making it to 30, or maybe 50, without wanting to shoot yourself in the head"? Choice, the word that is littered throughout the Kenyon speech like some blindingly obvious Freudian crux.

In the day to day trenches of adult life, Wallace argues, one must become a sort of heroic superman, dedicated to caring for others, but only achieving this civic awareness through preternatural self-discipline and the continual invocation of one's formidable moral-intellectual might. By simply straining hard enough, we will be able to transcend reality and inoculate ourselves against pain:
But if you've really learned how to think, how to pay attention, then you will know you have other options. It will be within your power to experience a crowded, loud, slow, consumer-hell-type situation as not only meaningful but sacred, on fire with the same force that lit the stars - compassion, love, the sub-surface unity of all things. Not that that mystical stuff's necessarily true: the only thing that's capital-T True is that you get to decide how you're going to try to see it. You get to consciously decide what has meaning and what doesn't. You get to decide what to worship.
No amount of undeniably beautiful phraseology can cover over the fact that Wallace's solution to surmounting a consumer-hell-type situation (and tellingly, he sets his parable in a supermarket) is a bizarre restatement of the terms of the marketplace: "you have other options", "you get to decide", "most days, if you're aware enough to give yourself a choice, you can choose to look differently at this fat, dead-eyed, over-made-up lady who just screamed at her little child in the checkout line". Isn't this sort of alchemical make-nice strategy exactly what the advertising executive is trying to promulgate on a daily basis? Contra Wallace, might there not be some value in acknowledging the awfulness of a consumer-hell-type situation for what it really is? Wouldn't this be the really true exercise of civic-minded consciousness: recognising that the solution does not lie with one's individual powers of imagination alone, that there might be a more social, less heroically isolated way of responding to the causes of depression and misery? Wallace appears to briefly identify the root cause of such suffering in "the world of men and money and power", but the focus of his solution is not on this matrix. It is turned bravely but violently back on a quite different target; that is, himself.
   
Crucially, in the actual Kenyon speech (as opposed to the transcribed version), Wallace is speaking to a class of graduating students, and the emphasis is on the value of a liberal arts education as a means of fostering the sort of consciousness that makes responsible choice possible. This makes the whole premise a lot saner and less like a proclamation of radical stoical individualism. That establishing context should probably be reinstated in future published versions of "This is Water", to make it clear that Wallace was not in fact addressing the world with a parti pris, but merely trying to say something intelligent and constructive to a group of young people, for whom a light reminder of the importance of responsibility to others cannot have been such a bad thing. Nevertheless, it seems that the "moral superman" motif was one that Wallace obsessed over and grappled with in his last years. I haven't read The Pale King yet, but the reviews suggest that it does in fact essentially reiterate the argument of the Kenyon speech. As one reviewer puts it (quoting Wallace), the "crucial conceit" of the novel is "that the soul-crushing boredom of tax work can lead to transcendent bliss, 'a second-by-second joy and gratitude at the gift of being alive'." To me, this amounts to something like philosophy-as-Prozac.

Woah, didn't know the late-nineties had been Penguin Classicked already!
The side of Wallace that I love doesn't have anything to do with this inverted, masochistic narcissism. For me, Wallace's sagacity lies in his willingness to get squarely behind a moral or ethical precept and stay there, the "childish gall actually to endorse and instantiate single-entendre principles", if you like. In the 1930s, T.S. Eliot said, "we await the great genius who shall triumphantly succeed in believing something", and by god, we're still waiting, which is just one reason why it's such a crying shame that Wallace had to go and top himself. Perhaps even more than this, though, the message that I think should be his legacy is the revival of a particular kind of novelistic tradition, one that runs through Dickens, The Brothers Karamazov, Ulysses, Mr Sammler's Planet, and a host of other works up to (and arguably ending with) the postmodern period, one that foregrounds as a sacred rite the utopian process of one human consciousness coming into contact and merging with another. This seems to me to be the grand underlying scheme in Wallace's masterpiece Infinite Jest (1996), nowhere so magically evident as in the scene towards the end of the novel in which Mario Incandenza poses as a homeless person, and waits for days until somebody comes up and touches his outstretched hand. The reawakening of this basic sentimental, moral commitment to socially-minded anti-individualism was Wallace's most profound gift to the culture. It's just that, as the Kenyon speech shows, he didn't seem to be able to equate this anti-individualism with the need to take the fight out of his own head.

 
*Is there anything more pernicious than the aspiring journalist's personal dot.com website? Why not just get a blog? They're free, and more interesting.
** This is only an abbreviated version of the text, published in the Guardian after Wallace's death. See penultimate paragraph for a link to the full audio of the speech.

Tuesday, 12 April 2011

"John Romero's about to make you his bitch”

In videogames, the view on the screen through which the player sees the action is euphemistically called the camera.

A first person game is where the player is the camera.

A first person shooter is when the camera has a gun attached.
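To make the metaphor concrete, here's a minimal sketch in C of the "player is the camera" idea (hypothetical names, not id Software's actual code): the scene is drawn from the player's own position and facing angle, and firing traces a ray out from that same eye point.

```c
#include <math.h>

/* The player and the camera are one struct: move the player and the
   view moves, because the scene is always drawn from (x, y) at `angle`. */
typedef struct {
    float x, y;    /* position on the map */
    float angle;   /* facing direction, in radians */
} Player;

/* The camera's line of sight: wherever the player is facing. */
void view_direction(const Player *p, float *dx, float *dy) {
    *dx = cosf(p->angle);
    *dy = sinf(p->angle);
}

/* "The camera has a gun attached": a shot is just a ray cast from the
   same eye point, along the same line of sight the view is drawn from. */
void fire(const Player *p) {
    float dx, dy;
    view_direction(p, &dx, &dy);
    /* hypothetical: trace_ray(p->x, p->y, dx, dy) hits what the camera sees */
}
```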

While the elements of the first person shooter had been floating around for years, it wasn’t until 1992’s Wolfenstein 3D that the style became named, recognisable and popular.

Wolfenstein 3D pitted a lone American soldier (the player character) against the Nazis in a nightmare bunker shaped like a cluster of swastikas. The player is up against Hitler, but the Führer, despite at one point being encased in armour, is not even the final boss. It was infamous for being bloody, for using the Horst-Wessel-Lied as a theme tune, for the aforementioned swastikas and for making dogs killable enemies. It was a huge hit.


Wolfenstein's developer, id Software, capitalised on the success of the game by releasing a sequel called Spear of Destiny. The sequel didn't require a new game engine to be built, so during development Wolfenstein's programmer John Carmack had an opportunity to experiment. Carmack, an introverted loner with little time for socialising, ensconced himself away from the rest of the small team at id to focus solely on his work. He came up with an engine that allowed for new levels of realism. Whereas Wolfenstein took place on a single flat plane, the new engine allowed for multiple platforms at different heights. Lighting and texture effects were also improved, allowing for more atmospheric environments.
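For the technically curious, the difference can be sketched in data. Wolfenstein's maps were effectively a flat grid of wall-or-not-wall cells with uniform heights and lighting, whereas the new engine divided the world into regions, each carrying its own floor height, ceiling height and light level. A rough illustration (field names are mine, not Carmack's source):

```c
/* Wolfenstein-style map cell: just "wall or not wall" on a flat grid,
   with one global floor height, ceiling height and light level. */
typedef struct {
    int is_wall;
} OldCell;

/* Doom-style sector: each region of the map carries its own geometry
   and lighting, which is what made staircases, pits, raised platforms
   and moody half-lit rooms possible. */
typedef struct {
    int floor_height;    /* platforms at different heights */
    int ceiling_height;  /* rooms taller or lower than their neighbours */
    int light_level;     /* per-area lighting for atmosphere */
} Sector;
```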

The game that eventually used this new engine was called Doom. The designers had taken inspiration from the movies Aliens and Evil Dead II, and from a session of Dungeons & Dragons they had played that had ended with a demonic planetary takeover. Doom was bloodier and more relentless than Wolfenstein, and while today it looks almost like a cartoon, at the time it seemed incredibly realistic.

One of the leading designers on Doom was John Romero. Romero had worked as a designer on Wolfenstein and was the opposite of the brusque Carmack. Romero had long hair, tucked his shirt into his jeans and drove a Ferrari. He looked like a cross between a metalhead and an asshole yuppie bad guy straight out of any number of late eighties/early nineties popcorn movies. Romero wanted Doom to be about the action and dismissed his fellow Wolfenstein designer Tom Hall’s attempts to create a detailed story for Doom by saying, "Story in a game is like story in a porn movie. It's expected to be there, but it's not that important."

Like porn, Doom appeals to realism while being extremely unrealistic. The world of Doom is strangely abstract: the architecture is Byzantine in layout; the player character moves impossibly fast for a human; the demons are numerous and varied and exist only to kill you. During development Tom Hall had built a number of levels based on military facilities. These were on one floor and had low ceilings in the Wolfenstein style, but the other designers preferred Romero’s spacious and alien levels. Hall was pushed out in ‘93.

Wolfenstein level map

Doom level map

Both Wolfenstein and Doom were released as shareware, meaning that part of the game was available for free. If the player wanted to play the rest of the game, she had to buy it. This distribution model was very successful, but another key element in the success of Doom was the modding community. Id enabled players to modify Doom and make their own levels, and a vast modding community sprang up, with players designing and swapping levels, some of which were later commercially released. Finally, there was the multiplayer function, christened "deathmatch" by Romero: players fought each other to the death in Doom levels via ethernet. Romero was particularly fond of deathmatches, claiming that the winner of a match won the right to humiliate the loser.

John Romero and Noel Stevens in the aftermath of a deathmatch
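The modding boom had a technical basis worth noting: Doom kept its assets in WAD files (reputedly "Where's All the Data"), a simple, openly readable container format. Fans could ship a "PWAD" (patch WAD) of custom levels that the engine layered over the main "IWAD". A minimal sketch of reading the 12-byte WAD header (assumes a little-endian machine, as Doom's were):

```c
#include <stdio.h>
#include <stdint.h>

/* The WAD header: a 4-byte signature, then two little-endian 32-bit ints. */
typedef struct {
    char    identification[4];  /* "IWAD" (main game) or "PWAD" (a mod) */
    int32_t numlumps;           /* how many data "lumps" the file holds */
    int32_t infotableofs;       /* file offset of the lump directory */
} WadHeader;

int main(int argc, char **argv) {
    if (argc < 2) { fprintf(stderr, "usage: %s file.wad\n", argv[0]); return 1; }
    FILE *f = fopen(argv[1], "rb");
    if (!f) { perror("fopen"); return 1; }
    WadHeader h;
    if (fread(&h, sizeof h, 1, f) == 1)
        printf("%.4s: %d lumps, directory at offset 0x%x\n",
               h.identification, (int)h.numlumps, (unsigned)h.infotableofs);
    fclose(f);
    return 0;
}
```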

Doom was immensely popular – the game was estimated to be installed on more computers than Windows 95. Just like the change in nineties porn, Doom was a watershed moment in videogames for repetitive, stripped-down brutality. Fittingly, fans of FPSs like Doom (along with sports, racing and fighting games) called themselves hard-core gamers.

Id followed up Doom with a sequel and a new FPS called Quake. Romero's vision for Quake was of a Lovecraftian "dark fantasy", a labyrinth of stone dungeons like a medieval hell. But John Carmack and the other designers wanted to continue Doom's mixture of demons with futuristic technology. The game was a struggle to make. Romero grew frustrated with working at the company and longed to branch out and start a studio of his own. Soon after Quake was finished he got the perfect opportunity - he was fired from id.


Before being let go, Romero had contacted Tom Hall and invited him to form a new developer with him. Joined by Jerry O'Flaherty and Todd Porter, they founded the resulting company, ION Storm, in 1996, and quickly signed a publishing deal with Eidos. Romero and Hall's vision for ION Storm was summed up by their new motto, "Design is Law". At id they had worked on one game at a time, but Romero didn't want to do things that way. He wanted ION Storm to be a hub of creativity where many games could be worked on at once, where he and Hall could work on their own games separately without interference. He also wanted it to have a plush office. The utilitarian John Carmack may have had his own Ferrari, but he kept the id offices modestly furnished. The way he saw it, "an office is just a place to hold our stuff".

The ION Storm offices in Dallas were in the top two floors of one of the tallest buildings in the city, the Chase Tower. The interiors were designed by the Russ Berger Design Group and featured a motion capture stage, a recording studio for voiceovers, a small cinema fitted with leather seats and a $50,000 projector, pool tables, arcade cabinets, a bank of 12 TVs for deathmatches, and a lobby fitted with elevators panelled in green dye-coated metal sheets and a matching company logo embedded into the terrazzo floor. The large skylights caused the offices to get very hot, and the glare made working on computers difficult.

The ION Storm lobby

The fancy new offices wouldn't be ready until 1998, but Romero was eager to start work on his ambitious new game, which would run on the Quake engine. Influenced by the JRPG Chrono Trigger, Daikatana was to be a time-travelling FPS with a vast array of enemies, worlds and weapons, as well as sidekicks that would accompany the player character. He gave this enormous project a seven-month deadline.

ION Storm's motion capture stage

Romero didn't want to poach talent from other developers, so many of the people he hired came from the modding subculture. Most had no professional experience. They were thrown into a large group and had to meet a tight schedule that would have been punishing even for seasoned pros. Some of the artists were hired from the comic book world and had no idea how to make images of a suitable size for nineties videogames. Romero would never settle for second best, and the game was delayed as the team struggled to update the code to fit the spectacular new Quake II engine that Carmack had turned out.


Romero spent a lot of time on marketing. Despite ION Storm not having released a single game, he gave many interviews to magazines like Rolling Stone, Wired, Newsweek and Time. He was hyping ION Storm and Daikatana before the company had moved into its offices and while development of the game had barely begun. Press photos were sent out of Romero sitting in a $9,000 antique chair. One of the early magazine adverts for Daikatana had no information or screenshots of the game. It merely read, "John Romero's about to make you his bitch. Suck it down."


The advert became infamous overnight and turned gamers against Daikatana and Romero in particular. Romero claimed that he only agreed to the slogan reluctantly, insisting that he would "never say that to anybody" because "that is gay". The ad does illustrate the strangely homoerotic nature of misogyny, especially as it was taken for granted back then that gamers – and especially players of Doom – were men. Or rather, boys. Suck it down.


The development of Daikatana was fraught with problems. The programmers didn't trust the artists, the game code was mangled from frequent engine changes, morale was low, there was no proper direction from Romero, and no one knew what was going on. Workers began leaving en masse and gossip about the company proliferated on the internet.

Todd Porter's management style in particular caused trouble. He would rage at and needle stressed workers to get the job done, while issuing last-minute design changes that contradicted Romero's instructions. Days before the high-profile trade show E3, Porter ordered changes to the demo of Daikatana while the team were struggling to finish it on time. In the chaos of getting the changes implemented and the demo completed, an error went unnoticed and the demo performed badly at E3.

While Daikatana was frequently delayed, the popularity of Doom only increased. By now the graphics looked out-of-date, but the modding and deathmatch communities saved it from being tossed down the memory-hole at the usual pace of accelerated obsolescence in videogames. This longevity made Doom stick out among bloody shooters and it was repeatedly blamed throughout the nineties – along with gangsta rap – for glorifying and causing real violence. In particular the apparent realism of the game led it to be seen as a kind of virtual reality, giving credence to ludicrous claims that it was a “mass murder simulator”. After the Columbine massacre it was discovered that Klebold and Harris had made their own Doom levels that Harris had uploaded to his website. Doom became one of many pop cultural scapegoats in the frantic rush to find someone other than teenage boys to blame for the killings. It was claimed that Harris had made Doom levels that were based on Columbine High School, but this was a myth.

Daikatana finally came out in 2000. It flopped. After Tom Hall's game Anachronox was released to good reviews but commercial indifference in 2001, he and Romero left the company and the Dallas office was closed. In 1997 a second ION Storm office had been founded in Austin at the request of Eidos. Away from the chaos of the Chase Tower penthouse, the Austin office produced a number of critically and commercially successful games, including Deus Ex and the third instalment of the Thief series. But it wasn't enough – ION Storm finally shut its doors in 2005. All in all, Eidos had spent more than $30m on the company.

On release, Daikatana was vigorously panned. Now that the dust has settled, the general consensus is that Daikatana – while flawed – is not that bad. There were two products on sale, as Romero put it, “One was [the] marketing and hype and the other was the game.” It was a case of the marketing tail wagging the videogame dog. He apologised unreservedly for the advert in 2010. While id remains stuck in the FPS mire (Doom 4, coming soon!), Romero has continued to evolve, making games in different genres for numerous platforms. But he never recovered his former lustre.


Doom changed games in more ways than one. There was the revolutionary programming and design – "the sound and the violence and the speed" (Romero). But there was also the macho, tough-guy posturing. Videogames today – and especially FPSs – are infested with trash-talking nerds. You don't have to look hard to find that particular mixture of misogyny, racism and social Darwinism espoused by geeks who grew up to become bullies. Eric Harris had written in his diary that "everyone should be put to a test. an ULTIMATE DOOM test, see who can survive in an environtment using only smarts and military skills" [sic]. But Doom was merely the window dressing for ideas that go back long before bloody videogames or Marilyn Manson, ideas that Harris was convinced were true. On the day of the Columbine massacre he wore a white t-shirt with a slogan written on the back: "NATURAL SELECTION".

One of the most pervasive myths of videogames (and computer culture in general) is that the industry is somehow an "outlaw" one. That videogames don't have to worry about the guys in suits – as a Wired article on id put it – because "there are no guys in suits". When John Romero was trying to get extra funding for ION Storm, every company he spoke to was enthusiastic about his vaguely structured and enormously ambitious (non)plan. With one exception: Virgin Interactive. Romero said, "Virgin was the only company that immediately said, 'You can't do that, it would fall apart!'" He didn't heed their warning, didn't need to. Why take the advice of a company where the guys in suits acted like guys in suits?

Sunday, 10 April 2011

Small Mercies

Are you all aware of DBC Pierre's concept of the fate song?

About 15 years ago, I attended, along with three other candidates, a "recruitment event" for the Mars Corporation (you know, the choccy bar manufacturers), who are in fact a deeply strange and sinister organisation.

The event consisted of us spending 3 days confined in a hotel (I mean really, we were forbidden to leave) in a small Norfolk market town while their human resources managers subjected us to various types of psychological stress/torture, including psychometric tests, mock legal disputes, and their UK vice-president insulting us to our faces in order to see how we would react.

About midway through Day 2, I thought "ah, fuck this", strode out of the hotel, across the busy market square (poignantly thronged with pensioners milling around the stalls, a world away from what I had emerged from) and into the nearest pub.

Sitting there, pint in hand, breathing in deeply the comforting atmosphere of normality, this suddenly came on the jukebox:



And, you know, I thought it was really quite deep.

Still do, in fact.

Tuesday, 5 April 2011

Testing, testing... is this thing on?



Children born in the school year that crosses 1983 and 1984 were the guinea pigs of UK education. They were the first to take SATs tests at age 7, the first to take SATs tests at age 11 and the first to take SATs tests at age 14 (these latter tests were dropped in 2009). If they took A-level examinations after their GCSEs, they were also the first to take AS-level exams.

That means that if they went on to sixth form they were under the pressures of serious examinations every year for five years (SATs at 14, mock GCSEs at 15, GCSEs at 16, AS-levels at 17 and A-levels at 18).



We are constantly told that students get higher exam results than before because the exams are too easy. It never occurs to journalists, politicians and the professionally outraged that students do better in exams because exams are what they spend most of their time at school doing. What they don't have much time for is learning.

At school we were assured by our teachers that we shouldn't worry, as "you can't fail SATs". So what were they for? The tests enabled league tables of school exam results to be compiled, and wealthy parents moved to the catchment areas of the highest-performing schools. This drove house prices up and - in Adam Curtis' memorable phrase - "kept the poor out".

We all know what happened to the housing market, but what happened (and continues to happen) to the children and young people who managed to complete the examination hurdles? They have gone from the constant anxieties of exams to the constant anxieties of the ‘flexible’ job market and (technically illegal) unpaid work. And the well-being of children in the UK is not good. The test results say one thing, but more sophisticated studies give the real story:

Monday, 4 April 2011

"Cataclysm"

Rob Cotter has been in touch to remind us of this priceless clip ("the exact second at which Britpop died"):


"1996 seems to have lasted about 15 years for some people", says Rob, rightly.

Friday, 1 April 2011

If you can remember the nineties, you weren't really there

The Major years witnessed the final demise of a particular kind of drug-based radical hedonism. Of course, the so-called radical aspects of drug taking have always been compromised by the fact that drugs tend to get in the way of social protest and other related important stuff like, well, critical thinking. Nevertheless, it's inarguable that the counterculture (however loosely defined) was at certain crucial moments and in certain important ways galvanized by a subversive, imaginative use of illegal substances. What's equally certain is that this particular approach to drug use is no longer with us, and that the nineties was the occasion for its repudiation.

The telltale sign that something is about to become extinct is always a hyperbolic, last-gasp flourishing of it. Reliably then, we might note the carmodism that the very same day Major was announced PM (27 November, 1990), this was released:


Within months there was this:


And about another year later, of course, this:


At this point, pre-Leah Betts (another last-gasp - this time of anti-drugs tabloid sanctimony), even the retro-conservative aspects of these records cannot cover over the fact that there is still something faintly meaningful, if not quite subversive, about invoking illegal substances. Drug culture was still vaguely an alternative one, and the nineties was going to give it one final fling.

There was an element of eighties hangover here. The unemployment pogroms of the eighties had not quite broken the spirit of the working classes and their ability to create collective identities. And one cogent response to being continually out of work and without money to even go to the pub is of course to resort to a kind of unbridled hedonism: cheap pills, powders, weed, petty larceny, playing loose shaggy music, listening to the classic records that happen to be lying around, scrapping, going out raving. This hedonism might end up killing you, and its long-term social effects can only be utterly pernicious. But for a while you might be able to derive a large amount of radical creative energy from it. Hence acid house, and the above indie-pop reductions of it.

As with so many things, the period 1995-'97 was the turning point. In September '95 Pulp release this, which sums up the opiate-of-the-people motif pretty eloquently:


In December, Leah Betts dies. A few weeks later in February '96, the filmic behemoth that is Trainspotting descends (Carl has said almost everything that needs to be said about this, so I'll limit myself to observing that it was timely and apposite partly for being a perfect pastiche/retro-annulment of post-war drug culture, with its reifications of Iggy, heroin, etc). In 1997 Blair comes to power, and the remainder of his time in office sees a rapid removal of the taboo on public figures and drug taking, to the point that an almost Dickensianly old-fashioned Tory can be elected Prime Minister in 2010 without his fondness for cocaine being anything of an issue. That's before we've even gotten onto that early-nineties escapade involving our chancellor, the call-girl, and the white stuff:

     
In 1995 Oasis said "where were you while we were getting high?" and for me, putting aside the band's copious culture crimes, there's a good deal of pathos in that line. It seems to compound a past tense of collectivity ("we") with a present in which some sort of extraneous betrayal ("you") has travestied solidarity and replaced it with pleasure-seeking egotism. Oasis's ramifications were entirely negative, but it's also true that their roots lay in a much more positive context of affirmation, brotherhood, and most relevantly for our purposes, in the hedonistic culture of the early nineties. This is most evident of course in the better tunes on Definitely Maybe (Supersonic, Cigarettes and Alcohol), which were, ineluctably, actually written on the dole, whatever lucrative poisons would subsequently come to turn the Brothers G into nouveau-Thatcherite monsters. Here's the elder Gallagher on the contexts underlying the composition of Live Forever:

"it was written in the middle of grunge and all that, and I remember Nirvana had a tune called I Hate Myself and I Want To Die, and I was like . . . seems to me that here was a guy who had everything, and was miserable about it. And we had fuck-all, and I still thought that getting up in the morning was the greatest fucking thing ever, because you didn't know where you'd end up at night. And we didn't have a pot to piss in, but it was fucking great, man."

 
This was radical hedonism. At its best Oasis's music is ultimately redeemed, I would argue, by its ability to encapsulate this tendency at the very moment it's about to be transformed into its opposite, morphing from a means of retaining some kind of empowerment and collective enjoyment in the middle of the dark night of neoliberalism, into an egocentric, acquisitive hedonism that is utterly, tragically complicit with the neoliberal status quo. The lyrical climax to Live Forever is perhaps the most compelling instance of this double-pull, as it moves in the space of a few syllables from a remarkable declaration of solidarity (maybe you're the same as me) to hubristic - but still oppositional - drug-speak (we see things they'll never see), before collapsing into a final, hardcore Thatcherite statement of Faustian self-regard (you and I are gonna live forever). Oasis's songs can be heartbreaking for the way they embody this shift from "you and I" to just "I". It's surely no coincidence that the promo video for Live Forever features a symbolic burial in what looks like the ruins of a council estate, as it was more than just drummer Tony McCarroll (soon to be screwed over by his bandmates) that was being buried here.

I've never been more than a very sporadic drug taker. But I've come to think recently that this is not just a matter of inherent temperament, but also a consequence of the Times. Put simply, drug taking and drug culture just aren't that interesting any more. Intoxication is still a central part of our culture, and probably always will be, but the subversive potential of opening the doors of perception seems to have been somehow nullified. Alcoholism, the eternal state-sanctioned, commercially profitable form of inebriation, is rapidly approaching pandemic proportions. Meanwhile, in January, an NHS survey found that illegal drug use (cannabis, ecstasy, heroin, cocaine) had fallen significantly over the past few years. A charity spokesperson said:

"There could well be a generational shift away from drugs going on ... Overall drug use has been declining significantly over the last six or seven years, which is encouraging, and we are seeing fewer young people reporting that they are using drugs. It could be to do with young people's culture and fashion..."

 
From Pete Doherty to Russell Brand to Skins to Ke$ha to David Cameron: it seems that the superficial, commodified aesthetics of drug taking and pseudo-radical lifestyle hedonism are more popular than ever. A casual admission of an appetite for drugs has become acceptable, just as an admission of one's actual wealth has become the most unacceptable taboo. The reality is that we're all impoverished, and no one's actually getting high any more, or daring to think outside the box.

[NB: credit for this post's title goes to Phil Knight.]