Archive for the ‘Uncategorized’ Category

framed

Posted: June 29, 2013 in Uncategorized

So, with the help of a second vodka and soda since takeoff, I was sitting at 30,000 feet trying to forget that much of what apparently separated me from a sudden and violent trajectory toward certain oblivion was a series of mathematical equations concocted by a series of strangers to whom, in any other context, I would not be entrusting my life. But just as I was settling into the numbness in my head from the drinks and the numbness in my back from sitting nearly side-saddle so as not to allow my flesh to flow over the line in the sand drawn by the division between the seats, I got another visit from my upstairs neighbor.

I’ve mentioned her before. She’s the talkative one with, let’s just say, a well-endowed back porch that I really don’t recall her having enough fun to have earned. I never actually know when she’s gonna pop over, but it’s often when one of us has had too much to drink. Alcohol wags her tongue faster than the hips of a cocaine ho at treat time. And I have to admit the lure of her verbal pendulum spoke louder than the massaging socks offered in SkyMall. So I invited her to sit down.

On an airplane, all the space is outside. Sure, the same could be said for any vessel we occupy, including our own bodies. But in an airplane, I suspect it’s the same set of strangers who flexed their mathleticism to make the thing fly who likewise designed all the adorable micro-versions of everyday items dotting the interior, doubtless to ensure the success of those flying theorems by keeping things light. I also suspect that there is an inverse relationship between a passenger’s size and his or her perception of a plane’s exaggeratedly microscopic inner accouterments. Seats aren’t designed for big asses. Seatbelts aren’t designed for big tummies. Meals aren’t designed for big appetites. And drinks aren’t designed for big thirst. Somehow the rules of aerodynamics change with a first-class upgrade, but theoretically the proletariat should be skinny. And realistically, that was another matter altogether. So my neighbor and I were left to cramp closer together, drain tiny bottles and ride the wakes of a big sky.

Often when my neighbor visits, setting becomes character. And what this one seemed to want to reveal to us were its frames, the ones confining our view of the limitless firmament outside. Leaning close to the window—again, so that my ass wouldn’t instigate an interpersonal Alamo right there in row 26—it occurred to me that I had never, ever, ever seen anything without a frame around it.

There are the obvious ones: picture frames, frames around paintings, window frames—each at once an instrument of focus and a boundary past which what you’re looking at ends. The television show you enjoy weekly has a frame, as does the “big screen” blockbuster. A stage play has its edges, its top and bottom, left and right, onstage and off.

Perhaps this is why we’re brought to tearful awe by a big sky or big water, seemingly unbroken vistas before which we stand mute. But our eyes are only so big; they come with their own frames. And beyond that, we bring our perceptions to any sight. Perceptions may be symbolic frames, mental and physiological constructs, but they confine our view no less concretely. Inanely cutesy or not, this would be our frame of reference. And don’t even get me started on timeframes.

Perhaps I’m stating the obvious, but the issue is about more than finding and naming edges and borders. It’s about limitation. It’s about confinement. And it’s about the social agreement concerning where expression ends.

As writers, we work with shamelessly limited tools. Each word is a frame, its meaning contained, finite. So we put them together. Limitation mounts. Once we give a chapter a title, we frame it. Once we give a feeling a name, we frame it. Once we frame something, it’s handily there for posterity, but something fundamental about its nature is missing, cut off, excluded.

No wonder the writer lives in a schizophrenic relationship with the page. Our passion is to create with an instrument that is the very definition of definition. It’s more than trying to birth part of the universe into a frame, which would be frustrating and sad enough. Rather, the act of writing is like trying to give birth through a frame, using a frame, to a frame.

Am I complaining? Yes. Do I have any answers? No. But I do think that part of the point of life might just be to figure out a way through the frame, past the veil. There has to be a way to express without constraint. Endlessly. Tapping in endlessly. Without borders of any kind. Otherwise, aren’t we kinda just locked inside with our neighbors upstairs? If so, I’m not sure there are enough vodka sodas to keep that ride interesting.

the memo

Posted: June 29, 2013 in Uncategorized

So how do you know when your “home office” has become your cave? How do you know when your ability to be comfortable when alone crosses over into drooling shut-in status, minus the drool? (Or plus the drool for that matter. It just tends to be that CSI giveaway that’s hard to ignore.) How long have family and friends avoided asking what you’ve been up to because they just don’t know how to make sense of what you do with your days? And if you happen to offer up that you’ve been working, how long has that puzzled expression crossed their brows, the one that says, doesn’t work pay?

How long has everyone around you been wondering if what you really are is a hobbyist who’s run out of track for her miniature choo-choo?

So how do you know you’ve become a hermit writer?

I mean, it’s not like you grow up whispering such intentions to your best friends, right after everyone confides which one of the Bee Gees they’d do. It’s not like there was a career day that changed your life. Do they ever? For anyone? It’s not like that frumpy sweater was always so comfortable. Perhaps it wasn’t even always frumpy.

Rather, becoming a hermit writer is a subtle evolution. One day you’ve got a normal job like everyone else. The next day you’re fired. (Was that the first hint at your inability to blend?) The next day you’re writing something wretched but determined to stick with it. The next day you’re avoiding the phone. By the next day—and by day, I’m clearly speaking of biblically proportioned epochs in personal history—unless you’re either starving or receiving knocks on your door from social service agents called by concerned neighbors who couldn’t handle the smell any longer, you have no flipping idea how Howard Hughes you’ve gone on your own ass.

Until.

An event large or small forces you out of your hole, and you realize that there was a little item you missed while away, something you didn’t get.

The memo.

What memo, you ask? Exactly.

It’s the memo that would have told you that time had turned its butt to you and gone the other direction, that the smallest atomic matter of your life had changed, that whole children had become adults, that some things had become easier and you were doing it the hard way, that some things had become harder and you were in denial, that one space between sentences had usurped two, that they weren’t called memos anymore.

I suppose the typical reaction to such rude removal from the cave is to return posthaste. But all is not lost. Emerging and returning is the stuff of the tale of life itself. Is it not? So there you go, more fodder for the page.

What is cool? Who is cool? What does it mean to act cool? Be cool? What’s the difference between cool and good? For that matter, what’s the difference between cool and bad? Cool is a value judgment, to be sure, and its relationship to other assessment measures and concepts, such as goodness or badness, is always relative to whom and what are being judged. So it’s subjective. What isn’t? But the invocation of cool is peculiarly powerful. Utterly powerful. And utterly intolerant of another approach. In fact, though fascism isn’t cool, cool is fascist. It is mercurial yet absolute. And it is a commanding socio-economic diagnosis. Cool draws borders between ingroups and outgroups, creating a caste system composed of symbolic social pressure and what it can buy.

Since we live, create, and communicate symbolically—sharing our sense of reality symbolically—those who can roll with the punches of the constantly changing symbology of cool are seen to be the most adept at splitting the fine hairs of reality. They are, in other words, in tune, in the know. They are also—statistically—young. After all, this is the demographic going through the incumbent system of rebellion that requires its own symbology, or code, so as not to appear directly confrontational. The stuff of that code is typically the stuff of cool. Of course, the irony is that the very tool of revolution against authoritarian, uncool parents is fascism itself.

But young people grow older. Cool gets co-opted; it mutates into mainstream. An anthem like “Saved by Zero” that called for material minimalism during the excess of the eighties becomes an advertisement for auto financing options twenty years later. Cool is a cycle, and its lyrics are migraine-inducing in their repetition.

Interestingly, however, the concept of cool, regardless of specific content or subject matter, has itself been co-opted. In other words, cool and its association with hip insurrection have been completely adopted by our capitalist system in order to make a buck. CAKE sings about this in their song “Rock ‘n’ Roll Lifestyle,” with the lyrics, “Excess ain’t rebellion / You’re drinking what they’re selling.” And in Pattern Recognition, William Gibson claims that cool is nothing more than the label given to an observable consumer trend. Certainly all one has to do is walk by the monochromatic uniform offered in any Gap store to see this idea demonstrated in all its lack of creative glory.

The ultimate casualty of fascism will always be creativity. And even the coolest cat would probably agree that’s not cool. So, if Mr. Gibson and CAKE are right, the most effective way to combat cool is to stop throwing money at it. Quite simply: no consumption, no trend. No trend, no cool. Of course, we can’t exactly stop consuming. But it is enticing to aim away from trends and thereby feed less of the fascism of our youth-oriented exchange. Less fascism equals more voices heard. And more voices heard means more of what symbols can do—create uniquely.

From matrimony to hookups, from casual acquaintances to friendships with true bonds, there is a melding intention that exists, like anything, on a continuum. And, if the outcome is any indication, it can be argued that the compulsion to join together with another is ultimately an attempt to define oneself as an individual or, in a writer’s double entendre, to build character.

Much of Freudian psychology is the tale of ego mapping, the topography of established physical borders between child and parent that, once recognized and acted upon, begins one’s journey. Alone. Having been set on this solitary course, much of our ensuing lifetime energy is then directed toward recapturing that sense of connectedness and oneness we once knew with our parents.

Well, what if the very effort to merge with another instead served to encapsulate us further as individuals? And what if that result made for good fiction?

The rippling irony is that the endeavor to become one with the other reconfirms our edges—however, and here’s where we score a break, only as they brush across the other. So, regardless of how deeply one wishes to take the bond, the melding intention works to establish the individual but, fascinatingly, not necessarily standing alone. Rather, vis-à-vis the other.

To see how this plays out in fiction, one need look no further than the “shorthand” sketch of the evolution of a character’s internal development that Portland writer Cynthia Whitcomb has constructed. In it she outlines the idea that, as characters expand their focus from themselves ever outward, the characters themselves expand and become more interesting. In her shorthand, there are five basic levels of a character’s focus, ranging from his or her thinking only about matters of self to then caring for another, usually a romantic partner, to ever-increasing spheres of attention and concern such as family and community to, finally, all of humanity. The fun, she says, is taking the reader or movie viewer along for the ride of these shifts. Moving characters up these levels of focus provides some of the engrossing, pleasurable arcs that keep us reading or watching.

So, to state it another way, there is an ironic, inverse relationship that renders a character more interesting as an individual in direct proportion to his or her decreased focus on self. We empirically know this to be true. And, returning to Freud, we can easily see that the shift of a character’s focus from him or herself to an expanding other would be meaningless if there were no weaning process, no ego shaping, no separation to begin with. The attempt then to remedy the situation by trying to join once again with the other—or, in story terms, a character’s having ever more meaningful contact with others—renders one an ever more sharply defined individual.

Good to know for all of us: the real ones as well as the ones in our head.

(And if you read a certain earlier blog about Schrödinger’s box, it also stands to reason, using the logic of that piece, that the more people one has included in one’s sphere of influence, the more witnesses one acquires. The more eyes on a person, the more he or she at least appears to be larger-than-life and interesting as an individual, hence celebrity.)

We’ve all been right there, right? A fresh seat awaits us on the first day of English class. A syllabus is flung on the desk before us, and we see that, Oh Nellie, the majority of the pending semester will be devoted to the particular delight that is the study of tragedy. Over the coming weeks, we’re given the definitions and groundwork. The terms are defined because, after all, not all brands of human suffering fall under the category of “Tragedy.” We read the heavy hitters: Plato, Aristotle, Shakespeare, Nietzsche, Kierkegaard, Woody Allen. And we pick up the tools of the trade: hamartia, peripeteia, irony. But for all the highfalutin talk about character flaws versus character errors or the nature of moral irresponsibility versus plain ol’ ignorant mistakes, there is a commonsense backdrop to this drama that is left completely offstage—time.

Timing is the singing cowboy hero of all genres, all narrative. Like the cowboy crooner, a tale’s timing might seem to fade into the background or rear its ugly head, but in either case it’s the rhythm of the piece. Any writer knows that, without proper timing, stories are nothing more than independent sets of abstract coordinates. Setting, motivation, character, all would exist in a conceptual though meaningless falling-tree-in-the-forest-with-no-one-there-to-hear-it kind of way, each component pointlessly autonomous. It is not until a feat of timing intersects with these elements that the pieces fit. We can see that, even in a post-Tarantino Hollywood where time is a malleable toy, timing remains the star, the agent of quickening. As writers, as readers, we all know this to be true, at least on some intuitive level. But in the case of tragedy, the issue of timing goes further than the sound construction of story.

Timing defines tragedy.

The tale of human suffering is typically dependent on loss. Similarly, the notion of loss is dependent on a precursor of possession. In other words, one must have something in order to lose it. Of course, the more precious and esteemed the possession, the greater the loss once it’s gone and the more tragic the tale. But the fall from greatness, the great loss that is tragedy, has a further baseline contingency: its very nature changes in relation to one’s age.

Death when one is old may be sad, but there is not enough life left to lose to make the scenario a tragedy. But when one is young, death represents incomprehensible loss. And, since youth comes before the social mergers and acquisitions that can compose a span of life, it is typically the only possession that a young character can lose.

So death is tragic when one is young. Mistakes are tragic when one is old enough to know better. In the latter case, one is robbed of the glory of his or her youthful past and at the same time denied the dignity of a positive remembrance upon death. Conversely, a youthful mistake may be passed off as whimsy or ignorance. But if an error made while young robs one of his or her potential, then the future once faced is gone. This is a type of death, of course, and thus a tragedy. But not much beats the heartache of tragedy to be found in the mistakes made in later life, as they deny the future as well as the past.

Although the term has been horribly misapplied, similar to the mauling of the concept of irony in a certain misguided pop song from a few years back, tragedy is a different animal than sadness or even disaster. Tragedy interacts with time uniquely. Whereas romance is possible for the young as well as the old, and fools, heroes, and villains of any age can play their roles in everything from comedy to mystery to adventure—tragedy is rigid.

The writer weaves the elements of story. The thread is time. Duh. But tragedy must remain age appropriate. And the writer who doesn’t respect this distinction is headed for tragedy. Oh, the irony. Well, not so much on either front.

What’s an inaugural blog without a little self-reflection, calling-a-spade-a-spade, and quantum physics? So I think I’ll initiate a discussion on the nature of blogging itself—or rather, what I see as the bottom-line impetus behind many forms of information chronicling, especially personal, intended for a mass audience. Other blogs on this site might differ, but it has been my experience that the majority of blogging is personal. On the surface, a post might seem to be about the latest application for an iPhone. But the tone remains subjective, and the ultimate purpose of the post appears to be the enabling of a revelation more about the blogger than what is being blogged. In other words, the nature of an ongoing blog, nearly regardless of subject matter, ultimately serves to discuss the blogger: what he or she likes, what he or she is interested in, in effect, what his or her components are. Now this observation may or may not be news to anyone, but the question is why we feel the need to expose ourselves in this public manner. Fascinatingly, I think at least one answer can be found in quantum physics.

Once upon a time, a physicist named Schrödinger devised a thought experiment that amounted to putting a cat in a box with a chemical roommate designed to kill the cat if the right domino effect within the lethal compounds occurred. The catch was that there was no way to know whether the lethal chain reaction had happened without opening the box. The idea was to demonstrate that, unless and until empirically detected, either outcome for the cat (life or death) could be true. Taken to the next logical step, both outcomes are true simultaneously—the cat is both alive and dead at the same time—until the box is opened and one or the other result is observed.

The idea is that nothing truly exists until it is perceived.

Okay, so back to us. We know we live in the paradox that, from earliest childhood, the establishment of the individual is communal. And now, thanks to the Internet, we can extend the paradox by sharing the most intimate details of our individuality with a global community while remaining utterly alone. This last aspect, the solitude, affords us a peculiar amount of anonymity, even as we divulge ourselves. And perhaps more importantly, this private rendering of a public face allows us to craft that face meticulously so we can show only what makes us look worthy, hip, good. Simultaneously, remaining alone while sharing ourselves helps us to navigate around the cultural admonition against grandstanding. After all, it’s impolite to talk about ourselves too much, we’re told. But heck, sitting around on the couch in sweats while clicking away on a keyboard and eating cereal hardly feels like hamming it up. But it is. It’s the closest we can get to showboating without alienating our every last friend and scaring away small animals.

So what about Schrödinger and his damn box? Where does it fit into all of this? The answer is simple really: we actually need to be looked into. We need to share. The mechanisms of quantum physics play out on the psychosocial plane as much as on the physical. So, on a very fundamental level, we feel as if we don’t exist until perceived by another.

Witnessed.

Ultimately, what are Facebook, Twitter, MySpace, and the type of information chronicled in most blogs but interpersonal versions of Schrödinger’s Box? They are big ol’ boxes of us, something to look inside, gain admittance to so that the “truth” will be perceived, measured, observed. Again, witnessed.

Blogging is a forum for our narrative. And our story feels meaningless until read. So welcome to my box.