A few things I really like about Laurel’s essays (apart from their liberal use of the word “bastard”): For one, reading Star Raiders transports me back to 1982, when the Atari 2600 gaming system was in its heyday.
How many hours I spent playing the arcade knock-off versions of Frogger, Pac-Man, and Joust, I’ll never know; furthermore, it’s a wonder I didn’t develop repetitive stress injuries in my hands for all the times I tried to defeat the bomb-dropping Mad Bomber in Kaboom! (anybody remember that one?).
Like so many of the other writings we’ve read, the essay is hard to grasp fully on a first read, and so my understanding of The Six Elements is very awkward at this point; but I can at least glean the thumbnail sketch: Laurel applies the Aristotelian model of drama, and our understanding of what makes it work, to our relationship with computers. Interestingly, the recurring concept of “agency” from our seminar is once again articulated by Laurel.
To my thinking, the agency Laurel’s talking about as it relates specifically to character and thought gets to the heart of our expectations about “the ways in which things should work or exactly how they have gone awry” in computer design (I’m thinking less about computer games than native functionality, though). By the way, Sherry brilliantly explains how we build these expectations into our gadgets, even beyond the point of practical use, to fulfill our understanding of, or need for, what Laurel might call the full “spectacle” or “performance” of the machine:
It seems that our tendencies towards presupposed existence of spaces which exist in their entirety extend to the outer representations of our machines, not just the inner workings of them.
This phrase alone makes Laurel’s essay come alive for me! Aptly titled “Anthropomorphism,” Sherry’s post once again moves us closer to considering the fine line between humans and computers and our desire to make them in our own image.
Speaking of, another thing in Laurel’s essay that really jumps out at me is the idea of consistency in character, and how–just as in stories–inconsistencies in the user experience violate something akin to dramatic order. Laurel’s example of the spell-check-gone-bad illustrates how even well-intended features in computing can, without our prior knowledge of them, upset this innate sense of order if “this behavior is not represented to you in some way … ” (Where this is concerned, Microsoft is king IMHO. Animated paper clips emerging unexpectedly to *help* you?! But to be fair, Apple’s auto-correct feature on the iPhone is just as disruptive). Laurel’s point is well taken: When agency, thought, and character all conspire against the viewer-user, the result is either a really bad B movie or a horribly designed computer application.
Finally, Laurel points out something that really resonates with me, and it reminds me why I’m the kind of person who never, ever starts watching a movie in the middle (and, I think, why I’m also not much of a gamer, since I generally don’t have the time to understand the complexities of modern video and alternate reality games). I think what Laurel is saying is that, whether you’re watching Dr. Zhivago or playing World of Warcraft, full enjoyment of the interactive experience depends on one’s expectations of what that experience should be—via the Aristotelian model, if you like—and then on how well the experience conforms to those expectations. To illustrate this concept as it applies to computing, Laurel uses her Macintosh as an example:
My favorite Macintosh example is an error message that I sometimes encounter while running Multifinder: “Excel (or some other application) has unexpectedly quit.” “Well,” I usually reply, “the capricious little bastard!” Providing graceful beginnings and endings for human-computer activities is most often a nontrivial problem …