Hyperreality: The Battle at Play

Photo "Ipad Dream" by Lance Shields via Flickr Creative Commons

This past summer, I was a nanny. The ten-year-old I supervised loved The Simpsons, and so we spent some time viewing this dysfunctional cartoon family. Of the slew of episodes we consumed, one in particular stood out. The basic premise: Milhouse, Bart's dorky best friend, stars in a superhero movie. On the set, a worker paints a horse to appear as a cow. "Why not just use a cow?" the blue-haired, bespectacled boy inquires.

“No, we’ve gotta use a painted horse to make it believable.” A nonplussed pause ensues.

“Well, what do you use for a horse, then?”

“Oh, we usually tape a bunch of cats together for that,” the worker replies.

The tween and I both chuckled at the ludicrous mental image. But the exchange reminded me of the common cinematic practice of seeking out sounds better suited to the actions depicted, ones more believable than the noises that actually accompany the acts. So shampooing hair might pair with the sound of squishing al dente pasta in a fist.

There's a name for this. Jean Baudrillard calls it simulacra: ascribing more realness to a representation of a thing than to the thing itself. We believe more in the copy than in the original. The concept shows up in a lot of places, from Plato's Allegory of the Cave, with verisimilar images flickering on the cave wall, to The Matrix, with virtual life inoculating against the desert of the real.

I remember a high school teacher putting forth a tentative theory that, these days, we concern ourselves more with mapping out the semblance of a desirable self than with actually developing one. Some rolled with the theory, citing social media, which may report 866 friends and trillions of likes yet neither negate the real fear of friendlessness nor assuage the insecurity about one's inherent unlikability. But others pushed back. It's not like we actually mistake Facebook for real life, they protested.

Maybe. But so many of us spend so much time inhabiting the hyperreal world of the web, keeping stride with the hyperkinetic pace our devices set. Perhaps we still distinguish between the pile of cats and the horse. But that doesn’t mean we don’t pass a lot of hours with a hodgepodge of felines, pasturing the real deal a distance from daily living.

Back to the boys I sit for: the youngest really is hooked. He wakes up and, after brushing his teeth, immediately plugs in. He dons headphones, and I don't really know where he goes; his iPod touch conveys him to a digital elsewhere. When pried away from it, he's usually distraught. If permitted, I think he'd while away the whole day there.

Recently, a documentary called Web Junkie aired on PBS, examining the effects of dozens of hours of gaming on teens. "Many come to view the real world as fake," the documentary propounds. They've paid their respects to the Real; RIP the horse; it's full-fledged simulacra now, a copy with no original draft. The generation of my boys, likely weaned on the screen more than any preceding it (given the increasing omnipresence of the neon god we've made, manifesting in the form of tablets, smartphones, Kindles, and the like), frightens me.

A multitude of studies enumerate the drawbacks of our entanglement with screens. That playing graphic video games correlates with a greater proclivity toward violent behavior has become a near truism.

Last Thursday, the little one excitedly wanted to show me something: a computer game by the name of Garry's Mod. I couldn't discern its objective. Maybe he couldn't either. He just kept shooting himself and jumping off buildings, turning the filter blood red and proclaiming that the figure, a repeatedly mangled, free-falling corpse, was flying. I interjected with a Hmm, not sure I like this. Well, you can look away, he offered. A few days later, at the mall, we walked past a large L'Oréal poster of a woman caressing glossy hair; the boy pointed a hand imitating a gun at her; boom, shot her in the eye, he muttered, to the laughter of his older brother. And then, just yesterday, walking by the arcade games in the bowling alley, the little one asked to pretend to play. Okay, sure. We mimed racing against each other on a pair of plastic motorcycles. After a few minutes, I felt something pressed to the side of my forehead: a pistol attached to the neighboring game, which my boy had picked up and directed my way. I told him no, and he set it down. But it unsettled me slightly. Yeah, it was just a toy. But the triumphant smile playing across his face inspired a mild case of the heebie-jeebies.

A brief aside: this younger boy is hypersensitive. If he's first to the house, he'll ask, teary-eyed, why he always has to open the door; situate him in back, and he'll weepily opine that we always make him close the door. He tells me I do not understand his horrible life. That his teachers take away all his days with worksheets. That his peers don't always laugh at his jokes. That his brothers call him baby. And, when upset, he launches himself even more fervidly into the virtual; I suspect it's a kind of coping mechanism, because, from his vantage, wouldn't coming to view the real world, his "horrible life," as fake prove a comfort? The fact that white men commit the majority of mass shootings in America niggles at the back of my mind. And boy, with that combination (the sense of entitlement that accompanies white maleness, a screen addiction rendering the real world false, upped violent tendencies, a stymied ability to regulate emotions resulting in hypersensitivity), can it come as a surprise?

Another point of fright for me: with screens mediating their worlds, kids tend to interact less and less with nature. It takes a lot to coax my boys out of doors. And when we venture out, we often have some hang-ups. In the mornings, we occasionally lay a picnic blanket in the backyard, so as to laze somewhere other than the couch. I bring a book, and they typically grab iPods. Whenever an ant chooses to mosey onto our blanketed area, the boys freak. An ant, an ant, they yell, alarmed. They swipe it off the quilt and smoosh its frantic insect body into the ground. I know the actual act is unremarkable. People kill bugs all the time. But I think it important to remark on the alarm that spurred the killing. This alarm, cousin of fear, suggests a lack of exposure: the natural world, to them, remains largely unknown. And for better or, I suspect, worse, we've long since learned to fear the unknown.

I've heard it argued that the industrialized, Westernized, technologized lifestyle, cordoning folks off from much of the natural world and cinching its unknowability, has an ill effect on the climate change cause. If we're apart from it, afforded distance enough to otherize it, then we fail to cultivate much care for it. We tend to treat nature as our dominion instead of a dwelling, exploiting rather than experiencing its wonders. And, instead of acknowledging the scary cost of our affluenza, we swipe away our culpability so as to continue unsustainably on with the willful ignorance assured by our image-laden lives.

An offshoot of this concern, the whole technology-assisting-our-removal-from-nature dilemma, is that screens let kids dodge practicing delayed gratification. Now, usually I'm not about delayed gratification. It seems sad that we'd frame so much of our life narratives as trudging through unsexy, everyday necessities so as to, in fine Pavlovian fashion, receive some "gratifying" external reward. I see the skillset collocated with beefed-up SAT scores, which in turn correspond with increased financial success.[1]

But I also quibble with immediate gratification. Underneath the umbrella of immediate gratification, we place activities requiring little to nothing of us. And I feel like most things worth doing require something. To begin to build a nourishing relationship with a person or activity demands some amount of time or energy or effort—it asks that we struggle.

I suppose we might do well to toss out delayed versus immediate gratification altogether. Maybe engaging meaningfully with each other and the world evades either category; it combines the hard work of the delayed and the easy play of the immediate. I contend meaningful activity resides in the space between work and play.

I can't help but recall summer's close. Exploring a nearby college campus, the boys and I stumbled on twelve or so giant cardboard boxes. They grinned with elation. We proceeded to drag them, two at a time, to the backyard. The high-noon sun danced against the backs of our necks, and we all perspired with the effort. With the aid of duct tape and scissors, we created a pretty expansive cardboard fort. It took a couple of days. We added blankets as rugs, hung little camping lamps, installed windows with Saran wrap, tacked on a lookout tower, and stocked a little kitchen. We spent more consecutive hours outside than we ever had. The iPods, not in hand, remained in the house. We whiled away the days with some prime make-believe. It felt hard, and fun, and real.

---

[1] Both of which, standardized tests and the almighty dollar, strike me as a tad hyperreal: our education system stresses the gravity of some "objective" representation of intelligence far more than it does any other kind (emotional, social, divergent, you name it); and the money we exchange nowadays, once composed of precious metals, consists mostly of paper or virtual versions of currency.
