
‘Easy’ – everyone called him that. His name is Esse (it rhymes with ‘Jesse’), but his gift of charm and his easy-going ways were right there to be seen even when we were small. We were born nearly at the same time, yes, twins. But he charmed his way here first. I am frequently reminded of that. I was literally right behind him; they say I had my hand on his foot as we both emerged from our mother’s womb.

Everyone always liked Esse, even when we were kids together. They said I was too thoughtful, always wondering about the whys and the wherefores. Our father preferred him, of course. And as the elder son, Esse was set up from birth to inherit whatever little fortune the old man might be able to amass. The funny thing was, Esse didn’t care—if he had money, he spent it. He was crazy generous, too easy-going to be anyone’s head of household. About as responsible as a he-goat. And hairy like one, if the truth be told.

Strangely, we were never really pals. I know that is unusual in twins. I wanted out, I wanted to learn about the world. I wanted to see the city. Esse wanted nothing more than to wander the fields and date a girl from the next town. Or several.

[Part 3]

Jake’s Testimony

 

Once upon every time, each family hands down its stories. Many are never told outside of the small circles huddled around the last sleeping mat that old Auntie would ever lie in, or during that mourning time we had to spend together after the passing of Dear Old Father. You know.

After long hesitation, I’ve decided to tell you ours. Mine. This one.

My grandfather was a religious nut who went around breaking into other people’s sanctuaries and destroying their statues and holy relics. It is not surprising that he had to keep moving out of town and never settled down properly. He had two sons and abandoned one of them. They probably had some sisters, but I’ve never heard their names mentioned. I could blame my grandmother for the ‘lost’ great-uncle, since she was the jealous type, but I blame him. The patriarch we are all supposed to admire and speak no ill of.

My father, Izzy, now he was a piece of work. God’s gift to everyone, or so he would tell you. But what did he ever do, when it comes down to it? He found my mother and married her; I’ll give him that. God bless my mother, Becca; she had a lot to put up with. She saved me more than once. Izzy carried on the family line; he was good for that much. He told us about the time his father almost slit his throat, told us about wrecking all the statuary, and how they kept losing everything and being forced to move far, far away again and again. Izzy retired early and just grew older and older, being apparently in no hurry to die and leave us something to live on.

But it’s really about my brother. I think this whole story is really about my brother. Damn him.

[Part 2]

I’ve been thinking about it, off and on, since I read this book many years ago. And I’ve decided that, where there is no reciprocity, there is no friendship. This is a saddening realization.

The Gift: The Form and Reason for Exchange in Archaic Societies

By Marcel Mauss (1954)

Add to that these observations by C. S. Lewis, who was himself the chief connecting force among the Inklings: CSL on ‘friendship’ (philia), from his book ‘The Four Loves’ (1960).

On the magisterial tone, and why everyone should (mostly) avoid it
TL;DR: (I believe) In conversation, you should put yourself forward as an authority, or as the judge or arbiter of something, only when you really, really are competent and expert on the point you are making. Otherwise, it is an unforced rhetorical error. (Generally speaking.) What I call “the magisterial tone” is a well-worn form of expression in English, and in other languages as well (especially German and Latin), in which, through choice of words and phrases, the speaker attempts to imply that they really, really know what they are talking about, that other views on the point are inadequate, and that the matter ought thereafter to be closed to discussion. Here come de judge — and the judge has banged the gavel, rendering a decision.
This is something articulate and voluble people of all ages do, and I encounter it frequently on Facebook and elsewhere. One easy example: when the writer — or speaker, in conversation — commonly resorts to forms of “to be” verbs in describing some assessment of their own. It sounds like the judgement is rendered from “on high” — “This is That. It is Nothing But That. Shut Up.”
Life — your life and my life — is not overseen by an omniscient narrator; at least, not one that you or I can reliably speak for. Almost everything we post can, and possibly should, be viewed as carrying a silent ‘…or so it seems to me.’ Or maybe an explicit one. Now, there are certainly times when the magisterial, or authoritative, voice is called for and is the best choice: when you are speaking from actual expertise, for example. “That is not how the PCI bus works” * may be perfectly valid, accurate, and the most economical way to move the discussion along. But adding snark that implies, “But you dumb fucks don’t really get this, do you?” wrecks the otherwise useful point you might have established. It’s verbal bullying, pure and simple, and no one should do it. And it doesn’t work.
Now, the opposite of the magisterial tone is what I will call Namby-Pamby-speak. This sort of talk includes a lot of “Sorry, ….” and “I know, but…” This can also easily be overdone. “Sorry, but I really think that is an explosive combination, would you consider putting that down?” is not what you want to be saying, just then, right?
And, some equal time for opposing views: http://www.businessinsider.com/stop… “It’s just that…” & “Never” & “Always” and even “Clearly” are traps, but not necessarily tied to one gender or the other. I will always remember Dick Cheney using “Clearly” as his ‘tell’ — what it meant was “I have absolutely no factual basis for the next words coming out of my mouth.” See also: “In fact.” **
*I made that up. I actually don’t have much knowledge of PCI buses, except that some things fit in them and others don’t. Being able to read Wikipedia entries, etc. doesn’t make anyone an expert on anything significant.
**This authoritative ‘tone’ and ‘voice’ thing is related to the nonce-word ‘mansplaining’ but is, I think, a broader application of it. Everybody does it, sometimes.

[This essay first appeared on the (now, sadly defunct) web site of Arriviste Press, in 2006. It was ably edited by Rick Miller. Its original title was “Moby Dick in an X-Box?” ] 

It has been commonplace to dismiss video games as trash entertainment, but can video games become recognized as classics — in the same sense that ‘The Great Gatsby’, ‘Citizen Kane’, and ‘The Old Man and The Sea’ are?

 

Standard dictionaries define a classic as “a work of enduring interest and appeal – used especially [in] literature, art, and music.” Beyond that, the criteria for defining a ‘classic’ work are subject to debate, but most scholars expect a work to provide plot, dramatic tension, crisis and resolution. Certainly, most video games today are merely diversions (think Minesweeper or Tetris). They do not aspire to more than simple amusement — and nothing says they need to.

 

Some games, major projects with large budgets and legions of contributing staff, clearly aspire to more. But when might we expect the literary or artistic experiences, expressed through the medium of video games, to achieve ‘classic’ status?

 

Edison’s kinetoscope parlor (i.e., movie theater) first opened to a paying public in April 1894. Edwin S. Porter crafted ‘The Great Train Robbery’ in 1903; talkies arrived in 1927; and such films as ‘King Kong’ (1933), ‘The Wizard of Oz’ (1939) and ‘Gone With the Wind’ (1939) still draw interest and are generally viewable today. It took four decades of development and experimentation, but film had arrived as a full-fledged dramatic medium.

As entertainment technology progresses, and as audiences grow more sophisticated in their appreciation of new media, ‘works of enduring interest and appeal’ will emerge in the medium of games as they did with cinema.

“Developers have realized they must move beyond the ‘zombie’ effects of really beautiful characters who have no social and emotional connection to the player. And this requires different ways of thinking about the game play itself,” says Katherine Isbister, Associate Professor at Rensselaer Polytechnic Institute (RPI) in upstate New York.

Indeed, the quintessential mindless video game, ‘Pong’, became available to consumers in the mid-1970s; yet by 1981, even the most formally dramatic games had hardly progressed beyond ‘Colossal Cave Adventure’ or ‘Zork’, and the more innovative ‘Pac-Man’ and ‘Donkey Kong’. (At least those had characters, of sorts.)

 

So what games might still be played 25 or 50 years from now? What, if anything, may be enduring about this new, more interactive medium for sight, sound, motion and text? The core characteristic games have — that more traditional media don’t — is interactivity. In a game, not only does the main character drive the action, the player drives the character. Successful games engage the player in making the choices. By making full use of interactivity, in concert with the other dramatic elements, games have potential for more deeply engaging narratives, ones that force the player toward an axiom of choice.

 

But will the dimension of interactivity impede, rather than enhance, a ‘good story, well told’?

 

The video game industry by and large (and self-admittedly) lacks a rationale or any sort of roadmap for providing a new and uniquely powerful form of literature.

 

“Creating detailed, realistic, and expressive content takes a lot of people, time, and money,” says industry expert Andrew Glassner. “These costs create a powerful argument to play it safe. The major [game] studios do take some risks, but generally they need to be conservative and stick to what they are confident will sell.”

But the industry may be getting closer. Several recently released games may be the forebears of the first critically accepted classic in the gaming genre:

[2013: These examples are now dated.]

Indigo Prophecy

Indigo Prophecy (Quantic Dream/Atari, 2005) is an innovative and interesting game in which the player participates in solving a supernatural murder mystery from the points of view of multiple characters. Indigo Prophecy employs the suspense/horror genre much in the way an earlier game, Max Payne, employed film noir. From the first cut-scene — where the crucial character (Lucas Kane) jerkily lurches forward to attack and quickly stab a man to death — to the initial return to player control (“What have I done? I’ve got to get out of here!”), we are engaged both as viewer and player.

 

In an echo of Hitchcock’s introductory scenes at the beginning of each episode of his 1955-62 television show, game director David Cage introduces the game and its mechanisms via the tutorial and leaves us with two pearls of wisdom to apply as we explore the game’s challenges: “Every action has consequences,” and “Things are never quite what they seem.”

 

Indigo Prophecy provides players an innovative select-and-commit interface for interacting with the game environment or other characters. These choices play out in unexpected ways — asking one question may preclude your opportunity to ask a different question, or a time limit may pass before you have decided which tack to take. For example, early on in the crime investigation, the detective characters encounter a wino in an alley near the crime scene. Once the player initiates conversation with the wino, a timer kicks in. The choices of conversational line will branch, and then irrevocably drop out.

 

The plot device of invoking multiple points-of-view — first, that of an unwilling killer who seeks to find out what mysterious forces caused him to kill, then second, that of the detectives who are investigating the crime — involves us in the characters’ struggles and moves the plot to its crisis while avoiding repetition.

 

Within the constraints of current game design, Indigo Prophecy constitutes, at best, a promising beginning. What is significant about this game is not merely the rich plot and the characterizations, but the high degree to which it succeeds in meshing the metaphor of a supernatural mystery/suspense movie plot, and its attendant conventions, with a (mostly) playable and challenging game experience.

 

David Cage is on record as saying, “…Video games were only exploiting a tiny part of their amazing creative potential, because they concentrated on ‘Action’ and totally neglected a fundamental element of human experience – emotion.” It is on the basis of this worthy observation, as well as on entertainment value, playability, and the like, that this game deserves consideration.

 

The Godfather

The Godfather (EA Games, 2006) is a big-budget game based on one of the greatest film properties of all time. It illustrates what can occur when the game publishers have a great property to work from — a classic of its original genre — and ‘sky’s the limit’ on development.

 


 

[Screenshot of gameplay action, illustrating The Godfather’s brilliant development.]

However, the dramatic elements that made the movie great are distinctly lacking from this otherwise highly immersive and compelling game. Principally, what is missing is adherence to the dramatic rule that the main character must move the action. (See Robert McKee’s textbook on the screenwriting craft, Story: Substance, Structure, Style and The Principles of Screenwriting (Regan Books, 1997).) The player’s character never faces the moral crises that beset Michael Corleone, and those choices and crises are what made the movie great. In the game, the player’s character is not a Corleone; he is some kid from the streets whom The Family takes under its wing and who, through a series of successful missions, eventually rises to become ‘Don of New York’. Although adherence to the canon of the film is strict, actual crisis for this character is lacking or contrived. The player’s character throughout is simply ‘a man on the make’ – no moral choice.

Despite great production values and a solid-gold license on the source content, this game is not destined to become a cultural classic. An opportunity lost?

Oblivion

Oblivion (Bethesda Softworks/2K Games, 2006) is a fantasy role-playing game based on the standard premise that Joseph Campbell termed ‘The Hero’s Journey’. In Oblivion, McKee’s commandment ‘that the protagonist shall move the action towards the crisis’ is upheld; however, if the protagonist opts not to move towards the crisis just yet, the crisis will wait.

 

Roger Ebert addressed this issue directly: “There is a structural reason for [why video games don’t seem to have any classics yet]: Video games by their nature require player choices, which is the opposite of the strategy of serious film and literature, which requires authorial control.”

 

Movies and novels move along at a pace determined by their crafters. A game player’s objectives and progress, by contrast, are essentially self-directed, through his choice of whom to assist and on whom to wreak vengeance. The main plot in Oblivion can wait, and will wait, but it is the job of the designers, and of their minions, the characters throughout the game, to nudge the player towards taking up the quest perilous and, at some point, towards ceasing to be distracted by all the little ‘errands of advancement’ along the way. Thus, both the principle of player choice and the principle of the call of destiny are upheld.

 

The most forward-looking element of Oblivion is its use of nuanced — and variable — relationships. Not only are competing interests part of the game, but the attitude of the non-player characters can be influenced for good or ill by the player through a series of mini-conversations and, in some cases, cash bribes. Perhaps in a way, Oblivion’s designers are attempting something more challenging than envisioned for many films — they attempt to draw the player in, to get him to take up the core challenge, rather than have the film’s director finally have to show him how it all comes out.  Maybe the player comes to care for these silly people whose world he is saving from destruction.

 


 

[Screenshot: Oblivion’s great graphics are complemented by unique relationship gameplay, but not enough to make a classic.]

Yet none of these games fully rise to the challenge and potential of a dramatic interactive narrative fiction played out in a game.

 

Any game that seeks to draw the player in (invoking Aristotle’s catharsis) faces the large storytelling and development challenges associated with making the player feel something about the choices they make, and about the emotional investment and identification they develop with the character and the storyline. Beyond deploying interactivity to invoke catharsis and identification, modern games typically enable non-linear plot and discovery choices not viable in films.

 

Although ambitious, innovative, and to various degrees successful in their own right, Indigo Prophecy, The Godfather, and Oblivion each lack some essential quality needed to make a true classic of the medium. The talent and technique, and perhaps the audience, necessary to pull this off may not yet exist. But surely they will.

What we need are, as Robert McKee says, “Good stories, well told.” Or, adapting slightly, what we hope for are “Good games, great stories, well played.”

 

We are only a few years away, I think, from the first truly classic character-driven dramatic game. Distinct from a film or a play, which are watched, or a book, which is read, a game is played. When a game is created with characters who struggle to a crisis in a new and powerful story, and these struggles are combined with the remarkable interactivity the industry is capable of designing, then a classic game will be born. But it’s all in the playing. And we are not there yet.