The Professional Sick Person

This last week, I spent a bit of time keeping up my title as a professional sick person. I achieved this, luckily, without having to be in any serious danger, because the cause of my temporary indisposition was the series of vaccines I received. I tend to be especially prone to the minor side effects of vaccination (the symptoms that make one feel vaguely under the weather without making one feel seriously at risk of death), which isn’t surprising given my immune pathology.

Enduring, at most, a couple of days of foggy-headedness, low-grade fevers and chills, and frustrating but bearable aches is, if still unpleasant, at least better than most any other illness I have dealt with in the last decade.

What struck me was being told, contrary to my own experience and subsequent expectations, that a couple of days is in itself about the average time for a “normal” person to recover fully from an ordinary illness. That, for someone with a healthy, properly functioning immune system, it is entirely possible to get through the whole cycle of infection for a garden-variety cold in a weekend.

This is rather shocking news to me. I had always assumed that when the protagonist of some television show called in sick for a single day, and returned to work/school the next, that this was just one of those idiosyncrasies of the TV universe, the same way characters always wear designer brands and are perfectly made up.

I had always assumed that in reality, of course people who caught a cold would take at least a week to recover, since it usually takes me closer to two, assuming it doesn’t develop into some more severe infection. Of course people who have the flu spend between three and five weeks at home (still optimistic, if you’re asking me), that is, if they can get by without having to be hospitalized.

This probably shouldn’t surprise me. I know, consciously, that I spend more time confined to quarantine by illness than almost anyone I know, and certainly more than anyone I’m friends with for reasons other than a shared medical diagnosis or hospital ward. Still, it’s easy to forget this. It’s extremely easy to assume, as I often find myself doing without thinking, that barring obvious differences, other people are fundamentally not unlike myself, and share most of my perspectives, values, and challenges. Even when I am able to avoid doing this consciously, I find that my unconscious mind often does it for me.

It’s jarring to be suddenly reminded, then, of exactly how much my health truly does, and I don’t use this phrase lightly, screw me over; apparently it does so so often and so thoroughly that I have to a large degree ceased to notice, except when it creates a stark contrast with my peers.

Feeling slightly terrible as a side effect of getting vaccines has, on an intellectual and intuitive level, ceased to be an annoyance in itself. It is only problematic insofar as it prevents me from going about my business otherwise: my mental fog makes writing difficult, my fevers and chills compel me to swaddle my shivering body to offset its failure to maintain temperature, and my omnipresent myalgia gives me a constant nagging reminder of the frailty of my mortal coil, but these are mere physical inconveniences. Of course, this does not negate the direct physical impact of my current indisposition; it merely contextualizes it.

Having long ago grown used to the mental feeling of illness, and without feeling physically unwell enough to harbor any genuine concern of serious danger to my long-term health and survival, the fact that I am sick rather than well is reduced to a mere footnote: a status. In the day to day story that I narrate to myself and others, the symptoms I have described are mere observations of the setting, without any lasting impact on the plot, nor on the essence of the story itself.

I often call myself a professional sick person, a phrase which I learnt from John Green via Hazel Grace Lancaster. The more time I spend thinking about my health, the more I find this metaphor apt. After all, in the past decade of being enrolled in and nominally attending public school, I have spent more time in hospitals than in a classroom. My health occupies a majority of my time, and the consequences for ignoring it are both immediate and dire. I regard my health as a fundamental part of my routine and identity, the way most people do their jobs. Perhaps most compelling: my focus on it, like that of a professional on their trade, has warped my perspective.

We all know the story of the IT expert incapable of explaining things in human terms, or of the engineer so preoccupied with interesting solutions as to be blind to the obvious ones, or of the artist unable to accept a design that is less than perfect. In my case, it is that I have spent so much time dealing with my own medical situation that it is exceedingly difficult to understand the relative simplicity of others’.

The Social Media Embargo

I have previously mentioned that I do not frequently indulge in social media. I thought it might be worthwhile to explore this in a bit more detail.

The Geopolitics of Social Media

Late middle and early high school are a perpetual arms race for popularity and social power. This is a well-known and widely accepted thesis, and my experience during adolescence, in addition to my study of the high schools of past ages, and of other countries and cultures, has led me to treat it as a given. Social media hasn’t changed this. It has amplified this effect, however, in the same manner that improved intercontinental rocketry and the invention of nuclear ballistic missile submarines intensified the threat of the Cold War.

To illustrate: In the late 1940s and into the 1950s, before ICBMs were accurate or widely deployed enough to make a credible threat of annihilation, the minimum amount of warning of impending doom, and the maximum amount of damage that could be inflicted, were limited by the size and capability of each side’s bomber fleet. Accordingly, a war could only be waged, and hence, could only escalate, as quickly as bombers could reach enemy territory. This both served as an inherent limit on the destructive capability of each side, and acted as a safeguard against accidental escalation by providing a time delay in which snap diplomacy could take place.

The invention of long range ballistic missiles, however, changed this fact by massively decreasing the time from launch order to annihilation, and the ballistic missile submarine carried this further by putting both powers perpetually in range for a decapitation strike – a disabling strike that would wipe out enemy command and launch capability.

This new strategic situation has two primary effects, both of which increase the possibility of accident, and the cost to both players. First, both powers must adopt a policy of “Launch on Warning” – that is, moving immediately to full annihilation based only on early warning, or even acting preemptively when one believes that an attack is or may be imminent. Secondly, both powers must accelerate their own armament programs, both to maintain their own decapitation strike ability, and to ensure that they have sufficient capacity that they will still maintain retaliatory ability after an enemy decapitation strike.

It is a prisoner’s dilemma, plain and simple. And indeed, with each technological iteration, the differences in payoffs and punishments become larger and more pronounced. At some point the cost of the continuous arms race becomes overwhelming, but whichever player yields first also forfeits their status as a superpower.

The same is, at least in my experience, true of social media use. Regular checking and posting is generally distracting and appears to have serious mental health costs, but so long as the cycle continues, it also serves as the foremost means of social power projection. And indeed, as Mean Girls teaches us, in adolescence as in nuclear politics, the only way to protect against an adversary is to maintain the means to retaliate at the slightest provocation.

This trend is not new. Mean Girls, which codified much of what we think of as modern adolescent politics and social dynamics, was made in 2004. Technology has not changed the underlying nature of adolescence, though it has accelerated and amplified its effects and costs. Nor is it limited to adolescents: the same kind of power structures and popularity contests that dominated high school recur throughout the world, especially as social media and the internet at large play a greater role in organizing our lives.

This is not inherently a bad thing if one is adept at social media. If you have the energy to post, curate, and respond on a continuous schedule, more power to you. I, however, cannot. I blame most of this on my disability, which limits my ability to handle large amounts of stimuli without becoming both physiologically and psychologically overwhelmed. The other part of this I blame on my perfectionist tendencies, which require that I make my responses complete and precise, and that I see through my interactions until I am sure that I have proven my point. While this is a decent enough mindset for academic debate, it is actively counterproductive on the social internet.

Moreover, continuous exposure to the actions of my peers reminded me of a depressing fact that I tried often to forget: that I was not with them. My disability is not so much a handicap in that it prevents me from doing things when I am with my peers as in that it prevents me from being present with them in the first place. I become sick, which prevents me from attending school, which keeps me out of conversations, which means I’m not included in plans, which means I can’t attend gatherings, and so forth. Social media reminds me of this by showing me all the exciting things that my friends are doing while I am confined to bed rest.

It is difficult to remedy this kind of depression and anxiety. Stray depressive thoughts that have no basis in reality can, at least sometimes, and for me often, be talked apart when it is proven that they are baseless, and it is relatively simple to dismiss them when they pop up later. But these factual reminders that I am objectively left out, that I am the only one of my peers missing from among these smiling faces, that my existence is seemingly sadder and less interesting, are far harder to argue with.

The History of the Embargo

I first got a Facebook account a little less than six years ago, on my fourteenth birthday. This was my first real social media to speak of, and was both the beginning of the end of parental restrictions on my internet consumption, and the beginning of a very specific window of my adolescence that I have since come to particularly loathe.

Facebook wasn’t technically new at this point, but it also wasn’t the immutable giant that it is today. It was still viewed as a game of the young, and it was entirely possible to find someone who wasn’t familiar with the concept of social media without being a total Luddite. Perhaps more relevantly, there was then the first wave of people such as myself, who had grown up with the internet as a lower-case entity, who were now of age to join social media. That is, these people had grown up never knowing a world where it was necessary to go to a library for information, or where information was something that was stored physically, or even where past stories were something held in one’s memory rather than on hard drives.

In this respect, I consider myself lucky that the official line of the New South Wales Department of Education and Training’s computer curriculum was, at the time I went through it, almost technophobic by modern standards; vehemently denouncing the evils of “chatrooms” and regarding the use of this newfangled “email” with the darkest suspicion. It didn’t give me real skills to equip me for the revolution that was coming; that I would live through firsthand, but it did, I think, give me a sense of perspective.

Even if that curriculum was already outdated by the time it got to me, it helped underscore how quickly things had changed in the few years before I had enrolled. This knowledge, even if I didn’t understand it at the time, helped to calibrate a sense of perspective and reasonableness that has been a moderating influence on my technological habits.

During the first two years or so of having a Facebook account, I fell into the rabbit hole of using social media. If I had an announcement, I posted it. If I found a curious photo, I posted it. If I had a funny joke or a stray thought, I posted it. Facebook didn’t take over my life, but it did become a major theatre of it. What was recorded and broadcast there seemed, for a time, to be as important as the actual conversations and interactions I had during school.

This same period, perhaps unsurprisingly, also saw a decline in my mental wellbeing. It’s difficult to tease apart a direct cause, as a number of different things all happened at roughly the same time; my physiological health deteriorated, some of my earlier friends began to grow distant from me, and I started attending the school that would continually throw obstacles in my path and refuse to accommodate my disability. But I do think my use of social media amplified the psychological effects of these events, especially inasmuch as it acted as a focusing lens on all the things that made me different and apart from my peers.

At the behest of those closest to me, I began to take breaks from social media. These helped, but given that they were always circumstantial or limited in time, their effects were accordingly temporary. Moreover, the fact that these breaks were an exception rather than a standing rule meant that I always returned to social media, and when I did, the chaos of catching up often undid whatever progress I might have made in the interim.

After I finally came to the conclusion that my use of social media was causing me more personal harm than good, I eventually decided that the only way I would be able to remove its influence was total prohibition. Others, perhaps, might find that they have the willpower to deal with shades of gray in their personal policies. And indeed, in my better hours, so do I. The problem is that I have found that social media is most likely to have its negative impacts when I am not in one of my better hours, but rather have been worn down by circumstance. It is therefore not enough for me to resolve that I should endeavor to spend less time on social media, or to log off when I feel it is becoming detrimental. I require strict rules that can only be overridden in the most exceedingly extenuating circumstances.

My solution was to write down the rules which I planned to enact. The idea was that those would be the rules, and if I could justify an exception in writing, I could amend them as necessary. Having this as a step helped to decouple the utilitarian action of checking social media from the compulsive cycle of escalation. If I had a genuine reason to use social media, such as using it to provide announcements to far flung relatives during a crisis, I could write a temporary amendment to my rules. If I merely felt compelled to log on for reasons that I could not express coherently in a written amendment, then that was not a good enough reason.

This decision hasn’t been without its drawbacks. I am, without social media, undoubtedly less connected to my peers than I might otherwise have been, and the trend which already existed of my being the last person to know of anything has continued to intensify, but crucially, I am not so acutely aware of this trend that it has a serious impact one way or another on my day to day psyche. Perhaps some months hence I shall, upon further reflection, come to the conclusion that my current regime is beginning to inflict more damage than that which it originally remedied, and once again amend my embargo.

Arguments Against the Embargo

My reflections on my social media embargo have brought me stumbling upon two relevant moral quandaries. The first is whether ignorance can truly be bliss, and whether there is an appreciable distinction between genuine experience and hedonistic simulation. In walling myself off from the world I have achieved a measure of peace and contentment, at the possible cost of disconnecting myself from my peers, and to a lesser degree from the outside world. In philosophical terms, I have alienated myself, both from my fellow man, and from my species-essence. Of course, the question of whether social media is a genuine solution to, or a vehicle of, alienation, is a debate unto itself, particularly given my situation.

It is unlikely, if still possible, that my health would have allowed my participation in any kind of physical activity which I could have been foreseeably invited to as a direct result of increased social media presence. Particularly given my deteriorating mental health at the time, it seems far more reasonable to assume that my presence would have been more of a one-sided affair: I would have sat, and scrolled, and become too self-conscious and anxious about the things that I saw to contribute in a way that would be noticed by others. With these considerations in mind, the question of authenticity of experience appears to be academic at best, and nothing for me to lose sleep over.

The second question regards the duty of expression. It has oft been posited, particularly with the socio-political turmoils of late, that every citizen has a duty to be informed, and to make their voice heard; and that furthermore in declining to take a position, we are, if not tacitly endorsing the greater evil, then at least tacitly declaring that all positions available are morally equivalent in our apathy. Indeed, I myself have made such arguments in the past as they pertain to voting, and to a lesser extent to advocacy in general.

The argument goes that social media is the modern equivalent of the colonial town square, or the classical forum, and that as the default venue for socio-political discussion, our abstract duty to be informed participants is thus transmogrified into a specific duty to participate on social media. This, combined with the vague Templar-esque compulsion to correct wrongs that also drives me to rearrange objects on the table, acknowledge others’ sneezes, and correct spelling, is not lost on me.

In practice, I have found that these discussions are, at best, pyrrhic, and more often entirely fruitless: they cause opposition to become more and more entrenched, poison relationships, and convert no one, all the while creating a blight in what is supposed to be a shared social space. And as Internet shouting matches tend to be decided primarily by who blinks first, they create a situation in which any withdrawal, even for perfectly valid reasons such as, say, having more pressing matters than trading insults over tax policy, is viewed as concession.

While this doesn’t directly address the dilemma posited, it does make its proposal untenable. Taking to my social media to agitate is not particularly more effective than conducting a hunger strike against North Korea, and given my health situation, is not really a workable strategy. Given that ought implies can, I feel acceptably satisfied to dismiss any lingering doubts about my present course.

Song of Myself

Music has always played an important role in my life, and I have always found comfort in it during some of my darkest hours. In particular, I have often listened, during times of crisis, to songs that I feel reflect me as a person, regardless of whether I like them a great deal as songs, as a means to remind myself who I am and what I fight for. This has led me to what I think is an interesting artistic experiment: putting together a playlist that represents, not necessarily my tastes for listening to today, but me as a person through my personal history.

To put it another way: if I were hosting an Olympics, what would the opening ceremony look, and more importantly, sound, like? Or, if I were designing a Voyager probe record to give a person I’ve never met a taste of what “me” means, what would it focus on?

I could easily spend a great deal of time compiling, editing, and rearranging a truly epic playlist that would last several hours. But that misses the point of this exercise. Because, while my interest in listening to my own soundtrack might be effectively infinite, that of other people is not. The goal here is not to compile a soundtrack, but to gather a few selections that convey the zeitgeist of my past.

This is my first attempt at this. I have chosen four songs, each of which represents roughly five years of my life. I have compiled a playlist available for listening here (Yes, I have a YouTube account/channel; I use it to make my own playlists for listening. Nothing special). The songs and my takeaway from them are described below.


1997-2002: Rhapsody in Blue

If I had to pick a single piece to represent my life, it would probably have to be Rhapsody in Blue, by George Gershwin. This piece was my favorite musical piece for a long time, and I remember, during visits with my grandparents, how my grandfather would put on his classical records, and how thrilled I would be when this piece came on.

Rhapsody in Blue is perhaps best known as the United Airlines jingle, which is part of why I loved it so much. It represented flying, travel, adventure, and being treated like a member of high society as we flew in business class. I also reveled in knowing the name of a song that everyone else knew merely as a jingle. The energy and strong melody of the piece still captivate me to this day, and remind me of the feeling of childhood delight with each new adventure and horizon.

2002 – 2007: Pack Up Your Troubles, arr. Mark Northfield

Aside from being one of my favorite arrangements of any song, this particular arrangement captures many of the conflicting feelings I have towards the first part of my schooling. I was indeed happy to be in a learning environment where I could soak up knowledge, but at the same time I often found the classes themselves dreadfully dull. Additionally, while I was initially quite happy with my social group, within a couple of years I had gone from being at the center of all playground affairs to being a frequently bullied pariah.

This song juxtaposes the cheerful, upbeat World War I song with a musical soundscape of a battlefield of the same time period, becoming more chaotic and pessimistic as time goes on. This also reflects my general experience in primary school, as my social life, my overall happiness, and my physical health all deteriorated over this time from a point of relative contentment to a point of absolute crisis. (2007 was the first year in which I genuinely remember nearly dying, and the first time I was confronted with a bona fide disability.)

2007-2012: Time, Forward!

If 2007 was a breaking point in my life, then the years following were a period of picking up the pieces, and learning how to adapt to my new reality. Time, Forward!, by Georgy Sviridov, captures much the same feeling, which makes sense considering it is frequently used to represent the Soviet 1920s, including at the Sochi Games. This period in my life was chaotic and turbulent, and of the things I have come to regret saying, doing, or believing, most of them happened during this period. Yet it was also a formative time, cementing the medical habits that would ensure my survival, and introducing me to several new friends.

During this time was when my family moved back to the United States. With a fresh start in a new hemisphere, and several new disabilities and diagnoses to juggle, I was determined above all not to allow myself to be bullied and victimized the way I had been during primary school. I threw myself into schoolwork, and tried to avoid any display of vulnerability whatsoever. This, I discovered, did not make me any more popular or liked than I had been during primary school, which yielded a great deal of angst and conflict.

2012 – 2017: Dance of the Knights

You’ll notice that this piece is pseudo-classical, in the same vein as Rhapsody in Blue, while still being the work of Prokofiev, a Russian, and later Soviet, composer. In this respect, it is somewhere between the 2007-2012 period and the 1997-2002 period, which I reckon is a reasonably accurate assessment of the past five years. The great highs and lows between late primary and early high school, which often involved grave medical threats to my life, have thankfully (hopefully) given way to a more predictably unpredictable set of obstacles; not only medically, but socially and psychologically, as my friends and I have grown up and learned to handle drama better.

The commonalities between the earlier pieces also reflect the change in priorities that I have worked very hard to (re)cultivate after seeing the distress that my existentialist focus on schoolwork brought me. I have, in the past few years, begun to reprioritize those things that I believe are more likely to bring me happiness over mere success, harkening back to the things I held dear, and found so intriguing in Rhapsody in Blue, in early childhood. At the same time, the piece, partly as a result of its context in Romeo and Juliet, has a distinctly mature, adult air to it; something which I struggle with internally, but which I am nevertheless thrust into regularly as I age.


If anyone else is interested in trying this project/challenge, please, go ahead and let me know. I can imagine that this could make a good group prompt, and I would be very interested to compare others’ playlists with my own.

What Comes Next?

So, as you may remember from a few days ago, I am now officially-unofficially done with classes. This is obviously a relief. Yet it is also dizzyingly anticlimactic. For so long I was solely focused on getting the schoolwork in front of me done that I never once dared to imagine what the world would look like when I was finished. Now I am, and the answer is, to summarize: more or less the same as it looked when I was still working.

There is now an interesting paradox with my schedule. The list of things that I have to do each day is now incredibly short, and consists mostly of those items which are necessary to my day to day survival; I have to make sure I eat, and shower, and get to the doctors’ offices on time. Beyond that I have almost no commitments. I have no local friends with whom I might have plans, nor any career that requires certain hours of me, nor even any concrete future path for my further education (I was, and still am, prevented from making such plans because my school still cannot provide an up to date and accurate transcript, which is a prerequisite to applying).

At the same time, now that I have some semblance of peace in my life, for the first time in memory, there are plenty of things which I could do. I could go for a pleasant walk in the park. I could take to the streets and protest something. I could fritter away countless hours on some video game, or some television series. I could write a blog post, or even several. My options are as boundless as my newfound time. Yet for as many things as I could do, there are few things that I need to do.

Moreover, almost all of these things that I could do require some degree of proactive effort on my part. In order to sink time into a video game, for example, I would first have to find and purchase a game that interests me, which would first require that I find a means to acquire and run said game on my hardware (the bottleneck isn’t actually hardware on my end, but internet speed, which in my household is so criminally slow that it does not meet the bare minimum technical specifications for most online distribution platforms).

As problems go… this isn’t particularly problematic. On the contrary, I find it exhilarating, if also new and utterly terrifying, to think that I now command my own time; indeed, that I have time to command. In the past, the question of time management was decidedly hollow, given that I generally had none. My problem, as I insisted to an unsympathetic study skills teacher, was not that I categorically made poor use of time, but that I only possessed about three productive hours in a day in which to complete twelve hours of schoolwork. The only question involved was which schoolwork I focused on first, a question that was never truly settled, as each teacher would generally insist that their subject ought to be my highest priority, and that all of their class work was absolutely essential and none could be pared down in accordance with my accommodations.

Nevertheless, while my new state of affairs isn’t necessarily problematic, it certainly has the potential to become so if I allow myself to become entranced by the siren song of complacency and cheap hedonism. I am aware that many people, especially people in my demographic, fall prey to various habits and abuses when lacking clear direction in life; I therefore have two primary aims for the time that it will take the school to produce the necessary paperwork for me to move on to higher education.

First, I need to keep busy, at least to an extent that will prevent me from wallowing; for wallowing is not only unproductive, but generally counterproductive, as it increases feelings of depression and helplessness, and is associated with all manner of negative medical outcomes.

Second, I need to keep moving forward. I am well aware that I often feel most hopeless when I cannot see any signs of progress, hence why much of the past five years has been so soul-crushing. In theory, it would be quite easy to occupy my time by playing video games and watching television; by building great structures of Lego and then deconstructing them; or even by writing long tracts, and then destroying them. But this would provide only a physical, and not a mental defense against wallowing. What I require is not merely for my time to be occupied, but an occupation in my time.

I am therefore setting for myself a number of goals. All of these goals are relatively small scale, as I have found that when setting my own goals as opposed to working under the direction of others, I tend to work better with small, tactically minded checklist-style agendas than vague, grand strategies. Most of these goals are relative mundanities, such as shifting around money among accounts, or installing proper antivirus software on a new laptop. All of these goals are intended to keep me busy and nominally productive. A few of them have to do with my writing here.

I generally detest people who post too much of their day to day personal affairs online, particularly those who publish meticulous details of their daily efforts to meet one target or another. However, having my goals publicly known has in past attempts seemed to be a decent motivator of sorts. It forces me to address them in one way or another down the line, even if all I do by addressing them is explain why they haven’t happened yet. If there is a reasonable explanation, I do not feel pressure; if there is not, I feel some compulsion to keep my word to myself and others. So, here are a few of my goals as they regard this blog:

1) I am looking at getting a gallery page set up which will allow me to display in one place the photos that I have taken personally, as well as some of my sketches, which people say I ought to show off. Aside from being nice for people who like to look at pictures, having a gallery, or a portfolio if you will, is something I have wanted since my first high school art class, as part of my quest to be a pretentious, beret-wearing, capital-A Artist, and people have been clamoring to see more of my pictures and sketches of late. My aim is to have this page in working order before Thanksgiving.

2) I am also working on getting that fictional story I keep mentioning polished up for launch. The reason it hasn't gone up yet is no longer that I haven't written the necessary material, but that I am still working on getting the backend set up so that it displays nicely and consistently. I'm also still writing it, but I'm far enough along that I can probably start posting as soon as I get the technical hijinks worked out.

This story was scheduled to start some time at the beginning of last month. However, a major glitch in the plugin I was aiming to use to assist its rollout caused a sitewide crash (you may remember that part), and subsequently I had to go back to the drawing board. Because I am, quite simply, not a computer coding person, the solution here is not going to be technically elegant. What's probably going to happen is that the story is going to be posted under a sub-domain with a separate install of WordPress, in order to keep fiction and nonfiction posts from becoming mixed up. I'm working on making navigating between the two as painless as possible. The timeline on this one should also be before Thanksgiving.

3) I aim to travel more. This isn’t as strictly blog related, but it is something I’m likely to post about. Specifically, I aim to find a method by which I can safely and comfortably travel, with some degree of independence, despite my disability. My goal is to undertake a proof of concept trip before May of next year.

4) I want to write and create more. No surprises there.

This is likely to be the last post of the daily post marathon. That is, unless something strikes my fancy between now and tomorrow. I reckon that this marathon has served its intended purpose of bringing me up to date on my writings quite nicely. I have actually enjoyed getting to write something every day, even if I know that writing, editing, and posting two thousand words a day is not sustainable for me, and I may yet decide to change up my posting routine some more in the future.

The Laptop Manifesto

The following is an open letter to my fellow students of our local public high school, which has just recently announced, without warning, that all students will henceforth be required to buy Google Chromebooks at their own expense.


I consider myself a good citizen. I obey the traffic laws when I walk into town. I vote on every issue. I turn in my assignments promptly. I raise my hand and wait to be called on. When my classmates come to me at the beginning of class with a sob story about how they lost their last pencil, and the teacher won’t loan them another for the big test, I am sympathetic to their plight. With education budgets as tight as they are, I am willing to share what I have.

Yet something about the rollout of our school’s new laptop policy does not sit well with me. That the school should announce mere weeks before school begins that henceforth all students shall be mandated to have a specific, high-end device strikes me as, at best, rude, and, at worst, an undue burden on students for a service that is legally supposed to be provided by the state at no cost.

Ours is, after all, a public school. Part of being a public school is being accessible to the public. That means all members of the public. Contrary to the apparent belief of the school board and high school administration, the entire student population does not consist solely of financially wealthy and economically stable families. Despite the fact that our government at both the local and state level is apparently content to routinely leave the burden of basic classroom necessities to students and individual teachers, it is still, legally, the responsibility of the school, not the student, to see that the student is equipped to learn.

Now, I am not opposed to technology. On the contrary, I think our school is long overdue for such a 1:1 program. Nor am I particularly opposed to the ongoing effort to make more class materials digitally accessible. Nor do I even object to the school offering its own Chromebooks to students at the students' expense. However, there is something profoundly wrong about the school making such costs mandatory.

Public school is supposed to be the default, free option for compulsory education. To enforce compulsory education as our state does (to the point of calling child protective services on parents of students who miss what the administration considers too many days), and then to impose the cost of that education on families, amounts to a kind of double taxation against families that attend public schools. Moreover, this double taxation has a disproportionate impact on those who need public schools the most.

This program as it stands is unfair, unjust, and as far as I can see, indefensible. I therefore call upon my fellow students to resist this unjust and arguably illegal decree by refusing to comply. I call in particular upon those who are otherwise able to afford such luxuries as Chromebooks to resist the pressure to bow to the system, and stand up for your fellow students.

What is a Home?

I know that I’m getting close to where I want to be when the GPS stops naming roads. That’s fine. These roads don’t have names, or even a planned logic to them, so much as they merely exist relative to other things. Out here, the roads are defined by where they go, rather than having places defined by addresses.

After a while I begin to recognize familiar landmarks. Like the roads, these landmarks don't have names, but rather refer to some event in the past. First we drive through the small hamlet where I was strong-armed into my first driving lesson. We pass the spot where my grandmother stopped the golf cart by the side of the road to point out the lavender honeysuckle to far younger versions of myself and my younger brother, and we spent a half hour sampling the taste of the flowers. Next we pass under the tree that my cousin was looking up at nervously when my father grabbed him by the shoulders and screamed that he was under attack by Drop Bears, causing my cousin to quite nearly soil himself.

I have never lived in a single house continuously for more than about eight years. I grew up traveling, an outsider wherever I went, and to me the notion of a single home country, let alone a single house for a home, is as foreign as it is incomprehensible. So is the concept of living within driving distance of most of one’s relatives, for that matter.

To me, home has always been a utilitarian rather than moral designation. Home is where I sleep for free, where my things that don't fit in my suitcase go, and where the bills get forwarded to. Home is the place where I can take as long as I want in the bathroom, and rearrange the furniture to my arbitrary personal preferences, and invite people over without asking, but that is all. Anywhere these criteria are met can be home to me; other factors, such as ownership, geographic location, proximity to relatives, or points of personal history, are irrelevant. I can appreciate the logistical value of all of these things, but attaching much more importance to them seems strange.

Yet even as I write this I find myself challenging my points. Walking around my grandfather’s farmhouse, which is the closest thing I have to a consistent home, I am reminded of images of myself from a different time, especially of myself from a time before I was consciously able to make choices about who I am. It’s difficult to think of myself that long ago in terms of me, and my story, and much easier to think of myself in terms of the other objects that were also present.

My grandparents used to run a preschool from their house, and the front room is still stocked with toys and books from that era. Many of the decorations have remained unchanged from when my grandmother ran the place. The doors and cabinets are all painted in bright pastel colors. In my mind, these toys were as much my own as any that stayed at home while we traveled. Each of these toys has wrapped up in it the plot lines from several hundred different games between myself and whoever else I could rope into playing with me.

Against the wall is a height chart listing my, my brother's, and my cousins' heights since as early as we could stand. For most of my childhood this was the official scale for determining who was tallest in the ever-raging battle for height supremacy, and I remember feeling ready to burst with pride the first time I was verified as tallest. I am tall enough now that I have outgrown the tallest measuring point. I am indisputably the tallest in the family. And yet I still feel some strange compulsion to measure myself there, beyond the mere curiosity that is aroused every time I see a height scale in a doctor's office.

This place isn’t my home, not by a long shot. In many respects, it meets fewer of my utilitarian criteria than a given hotel. It is the closest I have ever felt to understanding the cultural phenomenon of Home, and yet it is still as foreign as anywhere else. If one’s home is tied to one’s childhood, as both my own observations and those of others I have read seem to indicate, then I will probably never have a home. This might be a sad realization, if I knew any different.

I have often been accused of holding a worldview that does not include room for certain “human” elements. This accusation, as far as I can tell, is probably on point, though somewhat misleading. It is not out of malice nor antipathy towards these elements that I do not place value on concepts such as “home”, “patriotism”, or, for that matter “family”. It is because they are foreign, and because from my viewpoint as an outsider, I genuinely cannot see their value.

I can understand and recognize the utilitarian value; I recognize the importance of having a place to which mail can be delivered and oversized objects can be stored; I can understand the preference for ensuring that one's country of residence is secure and prosperous; and I can see the value of a close support network, and how one's close relatives might easily become among one's closest friends. But inasmuch as these things are supposed to have inherent value beyond their utilitarian worth, I cannot see it.

It is probably, I am told, a result of my relatively unusual life trajectory, which has served to isolate me from most cultural touchstones. I never had a home or homeland because we lived abroad and moved around when I was young. I fail to grasp the value of family because I have never lived in close enough proximity to extended relatives for them to become friends, and my illness and disability have further limited me from experiencing most of the cultural touchstones I might share with family.

It might sound like I am lamenting this fact. Perhaps I would be, if I knew what it was that I am allegedly missing. In reality, I only lament the fact that I cannot understand these things which seem to come naturally to others. That I lack a capital-H Home, or some deeper connection to extended family or country, is neither sad nor happy, but merely a fact of my existence.

PSA: Don’t Press My Buttons

Because this message seems to have been forgotten recently, here is a quick public service announcement to reiterate what should be readily apparent.

Messing with someone’s life support is bad. Don’t do that.

I’m aware that there is a certain compulsion to press buttons, especially buttons that one isn’t supposed to press, or isn’t sure what they do. Resist the temptation. The consequences otherwise could be deadly. Yes, I mean that entirely literally. It’s called life support for a reason, after all. Going up to someone and starting to press random buttons on medical devices is often equivalent to wrapping a tight arm around someone’s neck. You probably (hopefully) wouldn’t greet a stranger with a stranglehold. So don’t start fiddling with sensitive medical equipment.

Additionally, if you ignore this advice, you should not be surprised when the person whose life you are endangering reacts in self defense. You are, after all, putting their life at risk, the same as if you put them in a stranglehold. There is a very good chance that they will react from instinct, and you will get hurt. You wouldn’t be the first person I’ve heard of to wind up with a bloody nose, a few broken ribs, a fractured skull, maybe a punctured lung… you get the idea.

Don't assume that because something doesn't look like a medical device, it's fair game to mess with, either. A lot of newer medical devices aimed at patients who want to avoid sticking out are designed to look like ordinary electronic devices. Many newer models have touch screens and sleek modern interfaces. What's more, a lot of life support setups now include smartphones as a receiver and CPU for more complicated functions, making these smartphones medical devices in practice.

Moreover, even where there is no direct life support function, phones are often used as an integral part of one's life support routine. For example, a patient may use their phone to convey medical data to their doctors for making adjustments. Or, a patient may rely on their phone as a means of emergency communication. While these applications do not have the same direct impact on physical safety, they are nevertheless critical to a person's continued daily function, and an attack on such devices can present a disproportionate danger, and cause corresponding psychological distress. Even relatively harmless phone pranks, which may not even impede the ordinary functioning of medical-related operations, are liable to cause such distress.

What is at stake here is not so much the actual impediment, but the threat of impediment when it suddenly matters. For my own part, even the complete destruction of my smartphone is not likely to put me in immediate physiological danger. It may, however, prove fatal if it prevents me from summoning assistance when, some time down the line, something goes awry. Thus, what could have been a relatively uneventful situation, easily handled with my full resources, could become life threatening. As a result, any tampering with my phone, regardless of its actual effect, causes serious anxiety for my future wellbeing.

It is more difficult in such situations to establish the kind of causal chain of events which could morally and legally implicate the offender in the end result. For that matter, it is difficult for the would-be prankster to foresee the disproportionate impact of their simple actions. Indeed, common pranks with electronic devices, such as switching contact information, reorganizing apps, and changing background photos, are so broadly considered normal and benign that it is hard to conceive that they could even be interpreted as a serious threat, let alone result in medical harm. Hence my writing this here.

So, if you have any doubt whatsoever about messing with someone else’s devices, even if they may not look like medical devices, resist the temptation.

The Fly Painting Debate

Often in my travels, I am introduced to interesting people, who ask interesting questions. One such person recently was a lady who was, I am told, raised on a commune as a flower child, and who now works in developing educational materials for schools. Her main work consists of trying to convey philosophical and moral questions to young children in ways that allow them to have meaningful discussions.

One such question, which she related to me, focused on a man she knew tangentially who made pieces of microscopic art. Apparently this man makes paintings roughly the width of a human hair, using tools like insect appendages as paintbrushes. These microscopic paintings are sold to rich collectors to the tune of hundreds of thousands of dollars. Because of their size, they are not viewable without special equipment, and broadly speaking, cannot be put on display.

There is obviously a lot to unpack here. The first question is: Is what this man does art, especially if it cannot be enjoyed? My feeling is yes, for two reasons. First, there is artistic expression taking place on the part of the artist, and more importantly, the artwork itself does have an impact on its consumers, even if the impact comes more from the knowledge of the piece's existence than from any direct observation. Secondly, the pieces are by their very existence intellectually stimulating and challenging, in a way that can provoke further questions and discussion.

Certainly they challenge the limits of size as a constraint of artistic medium. And these kinds of challenges, while often motivated by pride and hubris, do often push the boundaries of human progress as a whole, by generating interest and demand for scientific advancement. This criterion of challenging the status quo is what separates my bathroom toilet from Marcel Duchamp's "Fountain". Admittedly, these are fairly subjective criteria, but going any further inevitably turns into a more general debate on what constitutes art; a question which is almost definitionally paradoxical to answer.

The second, and to me, far more interesting question is: is this man’s job, and the amount he makes justifiable? Although few would argue that he is not within his rights to express himself as he pleases, what of the resulting price tag? Is it moral to spend hundreds of thousands of dollars on such items that are objectively luxuries, that provide no tangible public good? How should we regard the booming business of this man’s trade: as a quirky niche market enabled by a highly specialized economy and generous patrons willing to indulge ambitious projects, or as wasteful decadence that steals scarce resources to feed the hubris of a disconnected elite?

This points at a question that I keep coming back to in my philosophical analyses, specifically in my efforts to help other people. Is it better to focus resources on smaller incremental projects that affect a wider number of people, or larger, more targeted projects that have a disproportionate impact on a small group?

To illustrate, suppose you have five thousand dollars, and want to do the moral utilitarian thing, and use it to improve overall happiness. There are literally countless ways to do this, but let’s suppose that you want to focus on your community specifically. Let’s also suppose that your community, like my community, is located in a developed country with a generally good standard of living. Life may not always be glamorous for everyone, but everyone has a roof over their head and food on the table, if nothing else.

You have two main options for spending your five thousand dollars.

Option 1: You could choose to give five hundred people each ten dollars. All of these people will enjoy their money as a pleasant gift, though it probably isn’t going to turn anyone’s life around.

Option 2: You could choose to give a single person five thousand dollars all at once.

I'm genuinely torn on this question. The first option is the ostensibly fairer answer, but the actual quality of life increase is marginal. More people benefit, but they probably don't take away the same stories and memories as the one person would from the larger payout. The increase in happiness is roughly equivalent either way, making the two options a wash from a utilitarian perspective.

This is amplified by two quirks of human psychology. The first is a propensity to remember large events over small events, which makes some sense as a strategy, but has a tendency to distort trends. This is especially true of good things, which tend to be minimized, while bad things tend to be more easily remembered. This is why, for example, Americans readily believe that crime is getting worse, even though statistically, the exact opposite is true.

The second amplifier is the human tendency to judge things in relative terms. Ten dollars, while certainly not nothing, does not make a huge difference relative to an annual salary of $55,000, while $5,000 is a decent chunk of change. Moreover, people judge relative to each other, meaning that some perceived happiness may well be lost in giving the same amount of money to more people.

This question comes up in charity all the time. Just think about the Make-A-Wish Foundation. For the same amount of money, its resources could easily reach far more people through research and broader quality of life improvements. Yet it chooses to focus on fulfilling individual wishes. Arguably it achieves greater happiness because it focuses its resources on a handful of life-changing projects rather than a broader course of universal improvement.

Now, to be clear, this does not negate the impact of inequality, particularly at the levels faced in the modern world. Indeed, such problems only really appear in stable, developed societies where the value of small gifts is marginal. In reality, while ten dollars may not mean a great deal to myself or my neighbor, it could mean the difference between riches and poverty in a village facing extreme poverty in a developing nation. Also, in reality, we are seldom faced with carefully balanced binary options between two extremes.

The question of the microscopic artist falls into a grey area between the two extremes. As a piece of art, such pieces invariably contribute, even if only incrementally, to the greater corpus of human work, and their creation and existence contributes in meaningful and measurable ways to overall human progress.

There is, of course, the subjective, and probably unanswerable, question of to what degree the wealthy collectors who buy these pieces derive their enjoyment from the artistic piece itself, or from the commodity; that is, whether they own it for art's sake, or for the sake of owning it. This question is relevant, as it has some bearing on the overall utilitarian happiness derived from the work, compared to the utilitarian happiness derived from the same sum of resources spent otherwise. Of course, this is unknowable and unprovable.

What, then, can be made of this question? The answer is probably not much, unless one favors punitively interventionist economic policy, or totalitarian restrictions on artistic expression. For my part, I am as unable to conclusively answer this question as I can answer the question of how best to focus charitable efforts. Yet I do think it is worthwhile to always bear in mind the trade offs which are being made.

Schoolwork Armistice

At 5:09pm EDT, 16th of August of this year, I was sitting hunched over an aging desktop computer working on the project that was claimed to be the main bottleneck between myself and graduation. It was supposed to be a simple project: reverse engineer and improve a simple construction toy. The concept is not a difficult one. The paperwork, that is, the engineering documentation which is supposed to be part of the “design process” which every engineer must invariably complete in precisely the correct manner, was also not terribly difficult, though it was grating, and, in my opinion, completely backwards and unnecessary.

In my experience tinkering around with medical devices, improvising on the fly solutions in life or death situations is less of a concrete process than a sort of spontaneous rabbit-out-of-the-hat wizardry. Any paperwork comes only after the problem has been attempted and solved, and only then to record results. This is only sensible as, if I waited to put my life support systems back together after they broke in the field until after I had filled out the proper forms, charted the problem on a set of blueprints, and submitted it for witness and review, I would be dead. Now, admittedly this probably isn’t what needs to be taught to people who are going to be professional engineers working for a legally liable company. But I still maintain that for an introductory level course that is supposed to focus on achieving proper methods of thinking, my way is more likely to be applicable to a wider range of everyday problems.

Even so, the problem doesn’t lie in paperwork. Paperwork, after all, can be fabricated after the fact if necessary. The difficult part lies in the medium I was expected to use. Rather than simply build my design with actual pieces, I was expected to use a fancy schmancy engineering program. I’m not sure why it is necessary for me to have to work ham-fistedly through another layer of abstraction which only seems to make my task more difficult by removing my ability to maneuver pieces in 3D space with my hands.

It's worth noting that I have never at any point been taught to use this computer program; not by the teacher of the course, nor by my tutor, nor by the program itself. It is not that the program is intuitive to an uninitiated mind; quite the opposite, in fact, as the assumption seems to be that anyone using the program will have had a formal engineering education, and hence be well versed in technical terminology, standards, notation, and jargon. Anything and everything that I have incidentally learned of this program comes either from blunt trial and error, or judicious use of Google searches. Even now I would not say that I actually know how to use the program; merely that I have coincidentally managed to mimic the appearance of competence long enough to be graded favorably.

Now, for the record, I know I'm not the only one to come out of this particular course feeling this way. The course is advertised as being largely "self motivated", and the teacher is known for being distinctly laissez faire provided that students can meet the letter of the course requirements. I knew this much when I signed up. Talking to other students, it was agreed that the course is not so much self motivated as it is, to a large degree, self taught. This was especially true in my case, as, per my normal standard, I missed a great deal of class time, and given the teacher's nature, was largely left on my own to puzzle through how exactly I was supposed to make the thing on my computer look like the fuzzy black and white picture attached to the packet of makeup work.

Although probably not the most frustrating course I have taken, this one is certainly a contender for the top three, especially the parts where I was forced to use the computer program. It got to the point where, at 5:09, I became so completely stuck, and as a direct result so overwhelmingly frustrated, that the only two choices left before me were as follows:

Option A
Make a hasty flight from the computer desk, and go for a long walk with no particular objective, at least until the climax of my immediate frustration has passed, and I am once again able to think of some new approach in my endless trial-and-error session, besides simply slinging increasingly harsh and exotic expletives at the inanimate PC.

Option B
Begin my hard earned and well deserved nervous breakdown in spectacular fashion by flipping over the table with the computer on it, trampling over the shattered remnants of this machine and bastion of my oppression, and igniting my revolution against the sanity that has brought me nothing but misery and sorrow.

It was a tough call, and one which I had to think long and hard about before committing. Eventually, my nominally better nature prevailed. By 7:12pm, I was sitting on my favorite park bench in town, sipping a double chocolate malted milkshake from the local chocolate shop, which I had justified to myself as being good for my doctors’ wishes that I gain weight, and putting the finishing touches on a blog post about Armageddon, feeling, if not contented, then at least one step back from the brink that I had worked myself up to.

I might have called it a day after I walked home, except that I knew that the version of the program that I had on my computer, that all my work files were saved with, and which had been required for the course, was being made obsolete and unusable by the developers five days hence. I was scheduled to depart for my eclipse trip the next morning. So, once again compelled against my desires and even my good sense by forces outside my control, I set back to work.

By 10:37pm, I had a working model on the computer. By 11:23, I had managed to save and print enough documentation that I felt I could tentatively call my work done. At 11:12am August 17th, the following morning, running about two hours behind my family's initial departure plans (which is to say, roughly on schedule for us), I set the envelope with the work I had completed on the counter for my tutor to collect after I departed, so that she might pass it along to the course teacher, who would point out whatever flaws I needed to address, which in all probability would take at least another two weeks of work.

This was the pattern I had learned to expect from my school. They had told me that I was close to being done enough times, only to disappoint when they discovered that they had miscalculated the credit requirements, or overlooked a clause in the relevant policy, or misplaced a crucial form, or whatever other excuse of the week they could conjure, that I simply grew numb to it. I had come to consider myself a student the same way I consider myself disabled: maybe not strictly permanently, but not temporarily in a way that would lead me to ever plan otherwise.

Our drive southwest was broadly uneventful. On the second day we stopped for dinner about an hour short of our destination at Culver’s, where I traditionally get some variation of chocolate malt. At 9:32 EDT August 18th, my mother received the text message from my tutor: she had given the work to the course teacher who had declared that I would receive an A in the course. And that was it. I was done.

Perhaps I should feel more excited than I do. Honestly though I feel more numb than anything else. The message itself doesn’t mean that I’ve graduated; that still needs to come from the school administration and will likely take several more months to be ironed out. This isn’t victory, at least not yet. It won’t be victory until I have my diploma and my fully fixed transcript in hand, and am able to finally, after being forced to wait in limbo for years, begin applying to colleges and moving forward with my life. Even then, it will be at best a Pyrrhic victory, marking the end of a battle that took far too long, and cost far more than it ever should have. And that assumes that I really am done.

This does, however, represent something else. An armistice. Not an end to the war per se, but a pause, possibly an end, to the fighting. The beginning of the end of the end. The peace may or may not hold; that depends entirely on the school. I am not yet prepared to stand down entirely and commence celebrations, as I do not trust the school to keep their word. But I am perhaps ready to begin to imagine a different world, where I am not constantly engaged in the same Sisyphean struggle against a never ending onslaught of schoolwork.

The nature of my constant stream of makeup work has meant that I have not had proper free time in at least half a decade. While I have, at the insistence of my medical team and family, in recent years, taken steps to ensure that my life is not totally dominated solely by schoolwork, including this blog and many of the travels and projects documented on it, the ever looming presence of schoolwork has never ceased to cast a shadow over my life. In addition to causing great anxiety and distress, this has limited my ambitions and my enjoyment of life.

I look forward to a change of pace from this dystopian mental framework, now that it is no longer required. In addition to rediscovering the sweet luxury of boredom, I look forward to being able to write uninterrupted, and to being able to move forward on executing several new and exciting projects.

Eclipse Reactions

People have been asking, ever since I announced that I would be chasing the eclipse, for me to try and summarize my experience here. So, without further delay, here are my thoughts on the subject, muddled and disjointed though they may be.

It’s difficult to describe what seeing an eclipse feels like. A total eclipse, that is. A partial eclipse actually isn’t that noticeable until you get up to about 80% coverage. You might feel slightly cooler than you’d otherwise expect for the middle of the day, and the shade of blue might look just slightly off for a midday sky, but unless you knew to get a pair of viewing glasses and look at the sun, it’d be entirely possible to miss it altogether.

A total eclipse is something else entirely. The thing that struck me the most was how sudden it all was. Basically, try to imagine six hours of sunset and twilight crammed into two minutes. Except, there isn’t a horizon that the sun is disappearing behind. The sun is still in the sky. It’s still daytime, and the sun is still there. It’s just not shining. This isn’t hard conceptually, but seeing it in person still rattles something very primal.

The regular cycle of day and night is more or less hardwired into human brains. It isn’t perfect, not by a long shot, but it is a part of normal healthy human function. We’re used to having long days and nights, with a slow transition. Seeing it happen all at once is disturbing in a primeval way. You wouldn’t even have to be looking at the sun to know that something is wrong. It just is.

[Photo: the beginning of totality.]
[Photo: exactly 30 seconds later.]

I know this wasn’t just me. The rest of the crowd felt it as well. The energy of the crowd in the immediate buildup to totality was like an electric current. It was an energy which could have either come out celebratory and joyous, or descended into riotous pandemonium. It was the kind of energy that one expects from an event of astronomical proportions. Nor was this reaction confined to human beings; the crickets began a frenzied cacophony of chirping more intense than any I have otherwise heard, and the flying insects began to confusedly swarm, unsure of what to make of the sudden and unplanned change of schedule.

It took me a while to put my finger on why this particular demonstration was so touching in a way that garden-variety meteor showers, or even man-made light shows, just aren’t. After all, it’s not like we don’t have the technology to create similarly dazzling displays. I still don’t think I’ve fully nailed it, but here’s my best shot.

All humans are to some degree aware of how precarious our situation is. We know that life, both in general and for each of us in particular, is quite fragile. We know that we rely on others and on nature to supplement our individual shortcomings, and to overcome the challenges of physical reality. An eclipse showcases this vulnerability. We all know that if the sun ever failed to come back out of an eclipse, we would be very doomed.

Moreover, there’s not a whole lot we could do to fix the sun suddenly not working. A handful of humans might be able to survive underground for a while, using nuclear reactors to mimic the sun’s many functions, but that would really just be delaying the inevitable.

With the possible exception of global thermonuclear war, there’s nothing humans could do to each other or to this planet that would be more destructive than an astronomical event like an eclipse (honorable mention to climate change, which is already on track to destroy wide swaths of civilization, but which ultimately falls short because it does so slowly enough that humans can theoretically adapt, if we get our act together fast). Yet this is a completely natural, even regular occurrence. Pulling the rug out from under humanity’s feet is just something that the universe does from time to time.

An eclipse reminds us that our entire world, both literally and figuratively, is contained on a single planet; a single pale blue dot, and that our fate is inextricably linked to the fate of our planet. For as much as we boast about being masters of nature, an eclipse reminds us that there is still a great deal over which we have no control. It reminds us of this in a way that is subtle enough to be lost in translation if one does not experience it firsthand, but one which is nevertheless intuitable even if one is not consciously aware of the reasons.

None of this negates the visual spectacle; and indeed, it is quite a spectacle. Yet while it is a spectacle, it is not a show, and this is an important distinction. It is not a self-contained item of amusement, but rather a sudden, massive, and all-encompassing change in the very environment. It’s not just that something appears in the sky, but that it interferes with the sun, and by extension, the sky itself. It isn’t just that something new has appeared, but that all of the normal rules seem to be being rewritten. It is mind-boggling.

As footage and images have emerged, particularly as videos featuring the reactions of crowds of observers have begun to circulate, there have been many comments to the effect that the people acting excited, to the point of cheering and clapping, are overreacting, and possibly need to be examined. I respectfully disagree. To see in person a tangible display of the size and grandeur of the cosmos that surrounds us is deeply impressive; revelatory, even. On the contrary, I submit that between two people who have borne witness to our place in the universe, the one who fails to react immediately and viscerally is the one who needs to be examined.