Wednesday, June 24, 2020

COVID Topologies: Compelled to Be Present

At the suggestion of a colleague, I recently read J.G. Ballard's "The Enormous Space." The short story, about a man who decides that he's never going to leave his house again, has a "Bartleby, the Scrivener" meets Don DeLillo vibe to it, where -- as his self-imposed isolation sets in -- he starts to explore the space of his home more intimately, with predictably hallucinatory results. But his initial explorations resonate with work in New Materialism and Object-Oriented Ontology: particularly as he explores his own relationship with his physical environment.

I believe the story has gotten more attention in the shadow of COVID and its resultant quarantines, which, as of today (June 24th, 2020), people in the United States have seemingly become bored with and "prefer not to" follow. But the ongoing, slow collapse of the United States is something for another entry. I also believe that strict quarantines will be in effect again in some states after death tolls reach a level that registers on even the most fervent pro-life, evangelical conservatives' radar: that is to say, when enough of the right people die for the "all lives matter" crowd to actually notice; and/or when "bathing in the blood of Jesus" is no longer the necessary tonic to mitigate the long, slow, isolated, and painful COVID deaths of loved ones. I have no doubt those deaths will be inevitably and preposterously blamed on Hillary Clinton, Barack Obama, and somehow Colin Kaepernick and the Black Lives Matter movement.

On some level, however, I think that broader politically- and religiously-based science denial is linked to the same emotions that people felt when they were compelled to stay home: an abject fear of seeing things as they are. Now that's a philosophically loaded statement, I know: can we ever see things "as they are"? Let's not get mired in the intricacies of phenomenology here, though. Those who were in quarantine for any length of time were suddenly faced with the reality of their living spaces. Those home environments were no longer just spaces in which we "crashed" after work, or the spaces we meticulously crafted based on home decor magazines. Whether living in a "forever home," a "tiny house," or the only space a budget would allow, people were faced with the "reality" of those spaces -- spaces which became the material manifestation of choices and circumstances. Those spaces were no longer just the places we "had" or "owned" or "rented"; they became the places where people actually lived. We were thrust into an uninvited meditation on the difference between occupying a space and living in one.

Much like Gerald Ballantyne in "The Enormous Space," we found ourselves subject to the spaces which previously remained "simply there." Some, I know, went on J.A.K. Gladney-like purges as they suddenly realized just how useless -- and heavy -- many of the objects around them were; instead of finding themselves surrounded by the fruits of their labor, they were trapped by the artifacts of the past. How many people during quarantine fumbled through their possessions, timidly fondling knickknacks, looking for some kind of Kondo-joy? Others, I'm sure, went the opposite route and ordered MORE things from the internet to serve as an even more claustrophobic cocoon of stuff to block out all the other stuff which they couldn't bring themselves to face -- let alone touch and purge. Still others continued to fail to notice their surroundings at all, yet found themselves suffering random anxiety and panic attacks -- blaming the fear of COVID rather than the fact that their surrounding spaces were becoming increasingly smaller as the detritus of daily life "at home" collected around them.

Those spaces ... the spaces in which we "live" ... which were once relegated to the role of a background to the present, were suddenly thrust into the foreground, reclaiming us and our subjectivity. They didn't just become present, they became the present -- a present in which we were implicated; a present with which we may have grown unfamiliar. And, given the circumstances, can you blame anyone for not being too keen on the present? Whether it's seeing more unrest on the news or on social media, or being compelled to haplessly homeschool your own children, the present isn't always that much fun.

I think, though, that there is at least one positive thing that we can learn from Gerald Ballantyne: that it is possible for us to more consciously occupy the present moment instead of trying to avoid it. While I don't advocate the extremes to which Gerald goes (no spoilers here, but you may never look at your freezer the same way again), I do think that there is something to be said for noticing and engaging the spaces in which we are implicated. The spaces in which we "live" should be the ones with which we engage rather than just treat as some kind of visual or ontological backdrop. Engaging with our spaces is a way of seeing things as they are. It's a way of being aware.


Thursday, June 18, 2020

Choice is a Privilege

Yes, I'm back. I'm not going to give excuses other than:

  1. I'm writing a book -- which tends to sap your ability to write anything NOT the manuscript. 
  2. COVID-19. I thankfully didn't have it, but of course the disruption to the semester was intense. 
  3. The death of George Floyd and the political implications for higher education's stances and statements on race. As the faculty trustee at my institution, it's been my (and our Board's) focus at the moment. 

COVID-19 became a nightmare for educators at every level, and will continue to disrupt in varying degrees for semesters to come. I'm not surprised that masks have become politicized. I'm not surprised that universities are coming to realize the importance of face-to-face instruction as students consider gap years rather than deal with an entire course load online. Concurrently, faculty such as myself are coming to realize the importance of having online components to our classes. Luddism here is a privilege to which none of us can cling any longer. While technology broadly construed is my specialty, I was neither a fan nor an adopter of online or blended models of teaching. The technology wasn't (and still isn't) seamless enough for me to dynamically present material or facilitate discussion. That was my opinion, and it still is. I envy my colleagues who have managed this balance and who create rich and dynamic online learning experiences for students. I have looked -- and will continue to look -- to them to guide me.

The death of George Floyd and the spotlight on race and police brutality have put universities into (rightfully) awkward positions. I'm sure many administrators and faculty alike have been blindsided by the activism of colleagues; or, conversely, by a lack of activism. I won't go into the details of my own university's response, but I will say that what is abundantly clear is that for some, like myself, the death of George Floyd was a gut-check of my personal privilege, and a reassessment of the power that comes with it. Personally -- let me emphasize this -- personally, I found that I was not doing enough. My selective activism was a privilege in and of itself. As a member of what will probably be one of the last generations of faculty to have tenure, I have an invaluable asset by and through which I can actively effect change. The challenge is for me to do so responsibly: not in the context of keeping anyone "comfortable," but in the context of not doing harm to those already being pulled in different directions by the intersectional forces of race, gender, poverty, and myriad other vectors of power.

I think that many of us in academia -- particularly in the humanities and social sciences -- have been talking about these things for so long that we forget that we have internalized and naturalized the discourse of race and gender studies. We simply can't believe that anyone but a college freshman could say "all lives matter" without malicious intent. Yet, when it does happen, we have a moral obligation to educate in the most effective manner possible. As a philosophy professor, I am spoiled by the fact that I face daily resistance to the things I teach in my classes: not necessarily because they are politically volatile, but because they are just ... well ... ridiculous. YOU try explaining Plato's theory of Forms to a student who has never read ANY philosophy in their life. Or try to thread the needle with Nietzsche and Sartre so that students realize that neither was a nihilist (in fact, both are about as anti-nihilist as one can be).

So when a community member, or relative, or administrator proudly -- and without malice -- says "all lives matter"* or declares themselves "colorblind"** in the context of wanting to be an ally or at least do no further harm, my duty is to be the educator that I am. Notice the context, however. There are plenty of people who have declared "all lives matter" or that they're "colorblind" with the same political fervor and intention as an internet comment-section troll who has just learned the term "virtue signalling."

Indeed, I've had some relatively important people declare that they'd love to sit down and have coffee with me because they'd find "real debate" with me to be "entertaining." I can't help but think of showing up in a toga yelling "ARE YOU NOT ENTERTAINED?!" Instead, I politely decline, saying that while I'd love to talk about philosophy any day, I'd rather not take part in what will inevitably be the rhetorical equivalent of professional wrestling. They want me to follow the script of the liberal academic while they follow their own script of whatever they think the opposite of a liberal academic is.

But that's the real issue here, isn't it? I had the privilege of saying "no." I had the privilege of walking away from a so-called "debate" and getting on with my life. Just as I could easily take another Facebook hiatus or "sit out" diversity discussions on campus simply because I was tired or just didn't feel like it.

There are those who simply cannot walk away from these issues. They cannot sign out of Facebook and not have to deal with systemic racism. Every act of theirs becomes a political act simply because of the color of their skin and/or their gender identity. Between this and COVID, I was given a double-dose of privilege-checking. Just like the moment when I sat down to write this entry and told myself that I was specifically NOT going to discuss any of the above because I wanted to focus on something else. Yes, I will focus on other things in future entries, but this entry needed to be written. And I had a responsibility to write it.

Black lives matter.

* saying "all lives matter" as an answer to "Black lives matter" is akin to saying to the owner of a burning house that "all houses matter" as their house burns down; or saying to someone who lost a child that "all children matter." 

** When one is "blind" to race, it also implies that one is blind to the experiences unique to that race -- both the challenges that race has faced as well as the pride and accomplishments therein.



Monday, February 18, 2019

My Battery is Low and it's Getting Dark



"There's a little black spot on the sun today,
that's my soul up there
"
- The Police, "King of Pain."


"My battery is low and it's getting dark."

Of course, these were not the actual last words of the Opportunity rover, whose mission NASA declared over on February 13th. The transmission behind the famous phrase was a routine status report -- not quite as poetic or existentially charged as its anthropomorphic translation. What set it apart was only that it was the last report Opportunity would ever send.

When I wrote Posthuman Suffering, I was thinking of exactly this kind of relationship between human beings and machines. And the momentary poignancy of this phrase flashing virally across the social media landscape shows exactly the dynamic I tried to elucidate: we want our machines -- our technological systems -- to legitimize and validate our own pain: in this instance, the pain of existential dread.

This object -- an only semi-autonomous planetary rover -- was designed to last 90 Martian days (a Martian day is about 40 minutes longer than one on Earth). It dutifully lasted over 5,000, spending its final moments in a valley, enshrouded by the dark of a major planetary dust storm. Its "dedication," coupled with the finality of its message, affects us on a deep emotional level. It "dies" alone. Its last status message is transformed into a final fulfillment of duty -- calling out to Earth, noting the encroaching darkness and its own dwindling power supply. We are often fascinated by these real and fictional moments: whether it is the HAL 9000's halting rendition of "Daisy, Daisy" or Roy Batty's "tears in rain" speech from Blade Runner, we feel a certain empathy as these machines sputter and die.

Where most believed that we were simply projecting ourselves (and our fears) onto our machines, I took it a step further. This wasn't mere projection; it was a characteristic of a deeper, more ontological relationship we have with these machines. Yes, we are sad and lonely because we see our own existential loneliness in the dust-covered rover now sitting, dead, in a distant valley of Mars. But, more importantly, we're satisfied by it. Satisfied not due to any inherent sadism or misanthropy; quite the opposite: we're satisfied because it keeps us company in that solitude.

If you've ever pulled out your smartphone to take a picture in low light and it gave you a low-battery warning, you received pretty much the same message that Opportunity sent back to NASA. Yet, in that moment, you're more likely to be angry with your phone than to want to cradle it in your arms and serenade it with David Bowie or Imogen Heap.

But this -- this was an object some 54.6 million kilometers away.

And it was alone.

And it was dying.

Of course, there are all sorts of reasons why NASA would "translate" Opportunity's final transmission in such a way (a desire to "humanize" science, or perhaps even authentic, heartfelt emotion for an incredibly successful fifteen-year mission coming to an end). Regardless, the reaction on social media, however fleeting it may be (or may have been), falls somewhere between empathy and solidarity.

The object sitting alone on Mars, made by human hands, the product of human ingenuity, partakes in a broader, deeper loneliness that humans share. Yet, there is no way to share such loneliness except metaphorically. And in this case, it's the humans who make the metaphors. If anything is being extended here, it's not humanity; it's metaphor. The mistake many cultural theorists make is to present this dynamic as simple anthropomorphization: we're personifying "Oppy" (interestingly enough, quite often as female: "she's sent her last message"). But that's not exactly what's happening. We're re-creating Opportunity into something else: through metaphor we are making it into a unique, autonomous, metaphorical entity that can and does feel.

In this posthuman suffering, we were extending our autonomy, and all the suffering that goes along with that autonomy. We imagine ourselves sitting alone, reaching out, texting into the dark, hoping for some kind of response; posting on Facebook or Twitter or Instagram because it's not socially acceptable to say "I'm lonely and need someone to speak to and also I know someday I will die and that makes me feel even more lonely and I need some kind of contact."

So we post or text, and wait for authentication and validation.

In many ways, the Opportunity rover is us, alone, in the dark, posting on social media and hoping for some kind of response to tell us we're not alone.

I've often said in my classes that every social media post -- no matter what the content -- is simply a Cartesian expression and can be translated into "I exist."

I say less often in my classes that there's always an existential codicil to these posts:

"I exist and I'm afraid of death."

But now, as I make a turn in my philosophy, I realize that the existentialist in me was too dazzled by the idea of our own, consciousness-based fear of death: a survival instinct complexified by a cerebral cortex which weaves narratives as a means of information processing. And when I thought about this in light of technological artifacts and systems of their use, I was too focused on the relationship between human and object rather than on the human and the objects in and of themselves. In other words, I was being a good cultural theorist, but a middling philosopher.

The Opportunity rover is "up there," alone, amid rocks and dust. On the same planet are the non-functional husks of its predecessors and distant relatives. It was unique; the last of its kind. We imagine it in the desolation. We weave its narrative as one of solitary but dedicated duty, amid rocks and dust. When we think about Opportunity, or any of the other human-made objects sitting on the moon, other planets, asteroids, and now hurtling through interstellar space (alone), the affect that occurs isn't a simple projection of human-like qualities onto an object. In the apprehension of the object, we become a new object, an Opportunity/human aggregate that is also constituted by the layers of sense-data, memories, emotions, experiences, and platforms through which many of those phenomena are brought into awareness. Metaphor isn't a thing we create or project; it is the phenomenon of a distributed awareness.

To paraphrase "King of Pain," the speaker's soul is many things:
A little black spot on the sun today.
A black hat caught in a high tree top.
A flag pole rag and the wind won't stop.
A fossil that's trapped in a high cliff wall.
A dead salmon frozen in a waterfall.
A blue whale beached by a spring tide's ebb.
A butterfly trapped in a spider's web.
A red fox torn by a huntsman's pack.
A black winged gull with a broken back.
And, in the context of the song, there are other objects existing that aren't necessarily in the awareness of the speaker:
There's a king on a throne with his eyes torn out
There's a blind man looking for a shadow of doubt
There's a rich man sleeping on a golden bed
There's a skeleton choking on a crust of bread
The first group of objects (black spot, black hat, rag, etc.) is directly equated with the speaker's soul. But the second is not. They are just objects that frame the broader existence of the speaker, embedding them and all other objects in a broader world of objects, distributing the "pain" via the images invoked. The poignancy of the song comes with the extensive and Apollonian list of things, things that aren't necessarily solitary, sad, or tragic in and of themselves, but come to be so when folded into a broader aggregate that just happens to include a human being who is capable of understanding the above lyrics.

Whereas most would say that it's the reader that is lending the affective qualities to these objects, we need to look at the objects themselves and how -- as solitary objects embedded in a given situation, whether "real," "sensed," "imagined," "called to mind," etc. -- these objects create the "reader."

Getting back to our solitary rover, the pathos we feel for it comes from the images we see, our broader knowledge of Mars, our basic understanding of distance, the objects on the desks around us or the bed we're sitting on, the lack of any messages (or a particular message) on our phone, the dissonance between the expected number of likes, loves, retweets, and comments on our last social media posts and the actual number of those interactions, the memories of when some caregiver may have forgotten to pick us up after karate practice, the dying valentine flower on our nightstand, the dreaming dog at our feet, etc., etc.

We feel for it not as a separate subjectivity witnessing something; we feel for it as an aggregate of the "objects" (loosely defined) which constitute our broader awareness. This is, perhaps, why on some level, for some particular people, at some particular moments, we are more moved by this object on a distant planet than we are by witnessing a stranger's suffering first-hand, or by the larger tragedy of our own dying planet. Certain aspects of this object, plus the objects around us, plus the "objects" of our thoughts, come together in a particular way, creating a particularly emotional response.

It feels like the world is "turning circles, running 'round [our] brain[s]," because our brains are constituted by the "world" itself, even if that world includes a planet that we've only actually seen via pictures on the internet ...

... and a small robot, dying alone in the dark.

Monday, January 21, 2019

The Narratives of Things

Each of us lives according to our own narratives of self. Various traditions of philosophy treat those narratives differently. Some celebrate them, holding that if we think positive thoughts or visualize what we want, it will come to fruition. These traditions put thought front and center as the way to progress, all stemming from a Cartesian way of looking at the world in which we literally are (that is, we exist) because we think. The most watered-down version of this comes in the pop philosophy/psychology often championed by celebrities and talk show hosts. Take "The Secret," which basically says (spoilers) that if you visualize something hard enough and long enough (aka think about it enough), you will achieve it. Adherents will say that it's much more complex than that; but it really isn't.

Other traditions see thought as more of a peripheral aspect of existence. Thinking is a result of the specific structures of our brains, and the stimuli that our embodied brains perceive and process. All thoughts have causes; those causes are materially based. The thoughts we have are determined by those causes. That's not as bleak as it sounds, however, when one thinks of the myriad stimuli to which we are exposed on a daily basis. We have enough of those, in fact, to make us think that we have free will. Our narratives of self have causes, and are not self-generated. In other words, we are not the prime movers of our actions, per se. But just because our thoughts and actions have causes, that doesn't mean we can't and don't make choices. Those choices are determined by causes, but that doesn't negate volition (the ability to choose between the myriad options we have).

For those who have been following my blog and/or my research, you know that I take a philosophical approach that expands the above to include the physical environments in which the embodied mind finds itself, as well as the artifacts we use to negotiate and mediate that environment.

As I've been thinking a lot about, well, thinking, I find that I often fall back on my old training in the field of literature and literary theory. In fact, the first bridge I built from literary theory into philosophy was the idea that we use narratives to understand the world and our place in it. We create narratives not just to explain the unknown, but to integrate ourselves into that world. Even if our narrative is one of a solitary, lupine nature (i.e., the lone wolf), it is still a story of solitude. It becomes a narrative through which we understand our place. And, if we're not careful, these narratives can dictate how we will behave. I find it ironic that people who flinch at the aforementioned implications of determinism are often themselves enmeshed in their own deterministic narratives. They feel themselves "destined" or "cursed" to be [insert emotional/financial/psychological/academic state here].

I think, however, that it is just as philosophically valid to look at "things" -- that is to say, actual physical objects -- with the same weight as, if not more weight than, the thoughts that define our narratives. What stories are we creating and telling ourselves through the things we passively find ourselves surrounded by and the things we actively surround ourselves with?

The temptation is to think about these objects as "traces" of ourselves; as markers of past achievements; mementos to remind us of events or periods in our lives. And yes, that is true, but in emphasizing that view, we don't think of the effect those objects have on us in the present. I do think, however, that we see glimmers of that when -- usually after a trauma of some kind, either the loss of a relationship, the death of a loved one, or the failure of some major project -- we suddenly decide it's time to redecorate our personal or professional spaces in some way. But we get an interesting shift in perspective if we ask ourselves -- in moments of calm (or at least non-trauma/panic) -- "how are these objects defining and supporting my current narrative of self? What story of self is this object, this space, this environment making possible?"

In a more philosophical mode, we can ask "How does my being supervene upon these physical objects?" Or, "How is my being brought about by the objects around me?"

Most would think that was a psychological issue: objects affect emotional responses. Of course they do. But oftentimes when we think that way, we are looking at the self as a static object. An existence rather than an exist-ing.

If we want to dig deeper into the idea of distributed cognition and the object-oriented ontology I'm getting at here, we need to think of the self as a dynamic, ongoing process.

How do these objects constitute, intervene upon, determine, or otherwise affect the process by which my "existence" unfolds or manifests itself?

So, I'm not asking "what stories are we telling through the artifacts we use and the environments in which we use them?" I'm asking: "How do these artifacts and environments constitute the meaning-making process through which these stories are told?"

At least I think I am ... or maybe it's just time to redecorate.


Monday, January 14, 2019

Academic Work and Mental Health

I've always said to my students -- especially those thinking of doing Masters or Ph.D. programs -- that graduate work (and academic work in general) can psychologically take you apart and put you back together again. It will often bring up deeper issues that have been at play in our day-to-day lives for years.

As I was annotating a book the other day, I felt a familiar, dull ache start to radiate from my neck, to my shoulders, shoulder blades, and eventually lower back. I took a moment to think about how I was sitting and oriented in space: I was hunched over -- my shoulders were high up in an incredibly unnatural position close to my ears. I thought about what my current acupuncturist, ortho-bionomist, and past 3 physical therapists would say. I stretched, straightened myself out, and paused to figure out why I hunch the way I do when I write.

It’s like I’m under siege, I thought to myself.

And then I realized there was something to that.

If there’s one refrain from my childhood that still haunts me when I work it’s “You’re lazy.”

My parents had this interesting pretzel logic: the reason I was smart was that I was lazy. I didn't want to spend as much time on homework as the other kids because I just wanted to watch TV and do nothing. So I'd finish my homework fast and get A's so "I didn't have to work."

No, that doesn’t make sense. But it was what I was told repeatedly when I was in grade school. Then in high school, on top of all of the above, I was accused of being lazy because I didn’t have a job at 14, like my father did.

And then in college, despite being on a full academic scholarship, getting 4.0s most semesters, making the dean's list (and eventually graduating summa cum laude), I was perpetually admonished by my parents for not getting a job during the 4-week winter break, or a "temporary job" in the two or three weeks between the last day of classes and the first day of my summer jobs (lab assistant for a couple of years, and then day camp counselor). Again, according to them, it was because I was "lazy." My work-study jobs during the school year as an undergraduate didn't count because they weren't "real jobs."

And even though I was doing schoolwork on evenings and weekends, my parents often maintained that I should be working some part-time job on the weekends.

So doing schoolwork (that is to say, doing the work to maintain my GPA, scholarships, etc.,) wasn’t “real work.” In retrospect, the biggest mistake of my undergrad days was living at home. But I did so because I got a good scholarship at a good undergrad institution close to home. It was how I afforded college without loans.

But just about every weekend, every break, or every moment I was trying to do work, I was at risk of having to field passive-aggressive questions or comments from my mother and father regarding my avoidance of work.

My choice to go to grad school because I wanted to teach was, of course, because I didn’t want a “real job.”

Most confusing, though, was how my parents (my mother in particular) would tout my achievements to family and friends, even telling them "how hard [I] worked." But when relatives or friends were gone, the criticism, passive-aggressive comments, and negativity always came back. It's no wonder I hunch when I do work. I am in siege mode. It also explains why my dissertation took me so long to write, and why that period of my life was the most difficult in terms of my mental health: the more I achieved, the lazier I thought I was actually being.

Even though I have generally come to terms with the complete irrationality of that logic, I do have to take pains (often literally) to be mindful of how I work, and not build a narrative out of the negative thoughts that do arise as I submerge into extended research. I went back into counseling last summer, mainly because I was starting to feel a sense of dread and depression about my sabbatical, which I knew made no sense. I'm so glad I did.

The things we achieve -- whether academic, professional, personal, etc. -- are things of which we should be proud. Sometimes we have to be a little proactive in reminding ourselves of how to accept our own accomplishments.

And maybe every 30 or 60 minutes, stand up and stretch.


Friday, January 4, 2019

Excavations and Turns

"To take embodiment seriously is simply to embrace a more balanced view of our cognitive (indeed, our human) nature.  We are thinking beings whose nature qua thinking beings is not accidentally but profoundly and continuously informed by our existence as physically embodied, and socially and technologically embedded organisms."
 -- Andy Clark, Supersizing the Mind: Embodiment, Action, and Cognitive Extension, (217).  

I've reached a point in my field-related research where I've internalized certain ideas to the extent that they have become the conceptual bedrock of my current project. However, as I dug up my annotations of Andy Clark's Supersizing the Mind, I realized that I have taken certain assumptions for granted ... and had briefly forgotten that I didn't always think the way that I do about phenomenology, materialism, and particularly distributed cognition. Apparently, as little as seven years ago, I wasn't convinced of Clark's hypothesis regarding the ways in which our cognition is functionally and essentially contingent upon our phenomenal environments. Now, of course, I am. But reading my sometimes-snarky comments and my critiques/questions about his work gave me valuable insight into my own intellectual development, and pointed at ways to sharpen my arguments in my current project.

Seven years ago, I was still thinking that language was the mediating factor in the qualia of our experience. In fact, I had written a chapter for an anthology around that time, working under that idea. Now I realize why that chapter was rejected and left to literally collect dust in my office. The rejection really affected me, because it was an anthology in which I very much wanted to be included. I knew that something was off with the chapter. It never felt quite "right."

Then, filed next to those notes, was a different set of notes written around eight months later. Those notes represented a complete 180-degree turn in my thinking. Unsurprisingly, that chapter was accepted into a different anthology ("Thinking Through the Hoard," which appeared in Design, Mediation, and the Posthuman). That piece was really the beginning of my current journey. I suppose Clark's ideas had "sunk in" with the help of other authors who pointed out some of the broader implications of his work (like Jane Bennett and Peter-Paul Verbeek).

There are a couple of takeaways from this anecdote: 1) as academics/researchers, our ideas are always evolving. Several philosophers, including Heidegger, experienced "turns" in their thinking, marked by a letting go of what seemed to be foundational concepts of their work. My own work in posthumanism has made a couple of turns from its original literary theory roots, to an emo existential phase, to its current post-phenomenological flavor. 2) Embrace the turns for what they are. There are reasons why we move on intellectually. Remembering why we moved on is helpful when anticipating critiques of our current thinking.


Monday, December 31, 2018

Sabbatical: The True Meaning of Time

So: no apologies or grand statements about the fact that my quiet academic blog is now alive and awake again. No promises as to what it will become, or how often I will update. I'm going to let this evolve on its own. Like all the best things I do, I have a sketch in my head as to what I'd like this blog to be while I'm on sabbatical -- as I research and write what will hopefully be another book. But things happen and unfold in interesting and unpredictable ways. I have been doing a great deal of research in the past several months, all in preparation for what will be several months of concentrated work.

For those who may not be familiar with what a sabbatical is or how it works, it's basically a paid leave from one's usual responsibilities on campus in order to do intensive research or writing. Most universities grant year-long sabbaticals; but since Western isn't the most cash-flush or research-oriented university, our sabbaticals are one semester long ... we can take a year if we'd like, but at half-pay. Since I can't afford to live on half of my salary, I opted for the semester-long sabbatical. Sabbaticals are something for which faculty have to apply and be approved. It's a multi-step process that requires proposals and, afterward, evidence that one has actually done the research. Once you are on the tenure track, you can apply for a sabbatical once every 7 years.

This is my first sabbatical. So I have no idea what to expect, nor can I wax philosophical about what it's like.

I can say, however, that this will be the first time I'm not on an academic schedule since I first started going to school. And I don't mean grad school. I mean Pre-K. My years have been portioned by the academic calendar since I was 4. Elementary school. High school. College. Grad school. Teaching. There were no breaks. I have been in a classroom, as either a student or an instructor, since I was 4 years old. I am now 46. You do the math. Sure, there are semester breaks, but this was the first time I entered a semester break without having to think about the next semester's classes. It was less disorienting than I thought it would be.

Not many people who aren't teachers understand exactly how much time and energy teaching requires. I normally have a teaching load of 4 classes per semester. I'm physically in the classroom for 3 hours per course per week (spread out over a Monday/Wednesday/Friday or Tuesday/Thursday schedule for each). I am also required to have at least 5 hours per week of "office hours" for students. So that's 17 hours per week of teaching/office hours. That doesn't include class preps, grading, committee work, meetings, and the administrative side of directing the philosophy program. Most days, I arrive on campus by 8:30am and leave after 5pm. Most days before or after that I'm prepping/reading for classes, grading, or doing paperwork. Weekends are the same. When I leave for the day, I bring work with me. 

I get up at 5am on weekdays in order to have a little under 60 minutes to do my own research. Semester breaks are also times when I've been able to do my own research. But a third to a half of those breaks are filled with writing recommendations for students, prepping for the next semester's classes, and dealing with the inevitable committee work that brings me to campus during those breaks.

With a sabbatical, 85%-90% of the above work goes away. 

This is why sabbaticals are precious ... because they give us time.

Time to let the big thoughts develop. Time to sit down and THINK. Time to actually read something that isn't a student paper or a committee report. Time to write through a problem without looking at the clock and thinking about how you're going to make Kant into a remotely interesting class. Time to focus on your own work instead of the at-risk student who has been looking really tired in class and probably isn't eating because they just got dumped by their fiancee or their dog is sick or they flipped their car over for the 3rd time in 2 years. Time to sit in quiet instead of dealing with yet another new directive from administration to fund raise or recruit even though you have zero experience or expertise in doing so. Time to read relevant writing in your field instead of being asked to justify the importance of your field or to report back as to exactly where your students from 7 years ago are working now and how your classes got them that particular job. 

There is time. 

Time to recharge myself so that when I do return, Kant will be an interesting class. Time to become re-invested in my field and feel legitimate as an academic again so that I can pay better attention to my students and reach out when I know they're at risk. Time to research so that when I return I have evidence of exactly how important my field is, and exactly why studying it isn't just important, but imperative to making students marketable to employers. 

There is time for me to focus on me, so that I can eventually focus better on my job and doing it well. 

That's what sabbatical is all about, Charlie Brown.