For Whom The Bell Tolls

Someone whom I knew from my online activities died recently. To say that I was close to this person is a bit of a stretch. They were a very involved, even popular, figure in a community in which I am but one of many participants. Still, I was struck by their death, not least of all because I was not aware that they were ill in the first place, but also because I’m not entirely sure what to do now.

The scientific consensus (at this point still weak, and open to interpretation) is that while the formation and definition of bonds in online communities may differ from real life, and in certain edge cases this may lead to statistical anomalies, online communities are, for the most part, reflective of normal human social behavior, and social interaction in an online setting is therefore not materially different from that in real-life communities [1][2]. Moreover, emotions garnered through online social experiences are just as real, at least to the recipient, as those from real-life interaction. The reaction to this latter conclusion has been both mixed and charged [4][5], which is fair enough, given the subject matter.

I have been reliably informed by a variety of sources, both professional and amateur, that I do not handle negative emotions well in general, and grief in particular. With a couple of exceptions, I have never felt that my grief over something was justified enough to come forward publicly; I had more important duties from which I could not reasonably justify diverting my attention. Conversely, on the one or two occasions when I felt like I might be justified in grieving publicly, I did not experience the expected confrontation.

When I have experienced grief, it has seldom been a single tidal wave of emotions, causing catastrophic, but at its core, momentary, devastation to all in its path. Rather, it has been a slow, gentle rain, wavering slightly in its intensity, but remarkable above all for its persistence rather than its raw power. Though not as terrifying or awesome as the sudden flood, it inevitably brings the same destructive ends, wiping away the protective topsoil, exposing what lies beneath, and weakening the foundation of everything that has been built on top of it, eventually to its breaking point.

In this metaphor, the difference between the death of a person to whom I am extremely close, and the death of someone whom I know only peripherally, is only a matter of duration and intensity. The rains still come. The damage is still done. And so, when someone with whom I am only tangentially connected, but connected nonetheless, dies, I feel a degree of grief; a degree that some might even call disproportionate, but nevertheless present. The distress is genuine, regardless of logical or social justification.

It is always challenging to justify emotional responses. This is especially true when, as seems to be the case with grief in our culture, the emotional response demands a response of its own. In telling others that we feel grief, we seem to be, at least in a way, soliciting sympathy. And as with asking for support or accommodations on any matter, declaring grief too frequently, or on too shoddy a pretext, can invite backlash. Excessive mourning in public or on Facebook, or, indeed, on a blog post, can seem, at best, trite, and at worst, like sociopathic posturing to affirm one’s social status.

So, what is a particularly sensitive online acquaintance to do? What am I to do now?

On such occasions I am reminded of the words of the poet John Donne in his Devotions Upon Emergent Occasions, and severall steps in my Sickness, specifically, the following excerpt from Meditation 17, which is frequently quoted out of its full context. I do not think there is much that I could add to it, so I will simply end with the relevant sections here.

Perchance, he for whom this bell tolls may be so ill, as that he knows not it tolls for him; and perchance I may think myself so much better than I am, as that they who are about me, and see my state, may have caused it to toll for me, and I know not that. The church is catholic, universal, so are all her actions; all that she does belongs to all. When she baptizes a child, that action concerns me; for that child is thereby connected to that body which is my head too, and ingrafted into that body whereof I am a member. And when she buries a man, that action concerns me: all mankind is of one author, and is one volume; when one man dies, one chapter is not torn out of the book, but translated into a better language; and every chapter must be so translated; God employs several translators; some pieces are translated by age, some by sickness, some by war, some by justice; but God’s hand is in every translation, and his hand shall bind up all our scattered leaves again for that library where every book shall lie open to one another. As therefore the bell that rings to a sermon calls not upon the preacher only, but upon the congregation to come, so this bell calls us all; but how much more me, who am brought so near the door by this sickness.
[…]

The bell doth toll for him that thinks it doth; and though it intermit again, yet from that minute that this occasion wrought upon him, he is united to God. Who casts not up his eye to the sun when it rises? but who takes off his eye from a comet when that breaks out? Who bends not his ear to any bell which upon any occasion rings? but who can remove it from that bell which is passing a piece of himself out of this world?

No man is an island, entire of itself; every man is a piece of the continent, a part of the main. If a clod be washed away by the sea, Europe is the less, as well as if a promontory were, as well as if a manor of thy friend’s or of thine own were: any man’s death diminishes me, because I am involved in mankind, and therefore never send to know for whom the bell tolls; it tolls for thee.

Works Consulted

Zhao, Jichang, et al. “Being Rational or Aggressive? A Revisit to Dunbar’s Number in Online Social Networks.” Neurocomputing 142 (2014): 343-53. Web. 27 May 2017. <https://arxiv.org/pdf/1011.1547.pdf>.

Golder, Scott A., et al. “Rhythms of Social Interaction: Messaging Within a Massive Online Network.” Communities and Technologies 2007 (2007): 41-66. Web. 27 May 2017. <https://arxiv.org/pdf/cs/0611137.pdf>.

Wilmot, Claire. “The Space Between Mourning and Grief.” The Atlantic. Atlantic Media Company, 08 June 2016. Web. 27 May 2017. <https://www.theatlantic.com/entertainment/archive/2016/06/internet-grief/485864/>.

Garber, Megan. “Enter the Grief Police.” The Atlantic. Atlantic Media Company, 20 Jan. 2016. Web. 27 May 2017. <https://www.theatlantic.com/entertainment/archive/2016/01/enter-the-grief-police/424746/>.

History Has Its Eyes on You

In case it isn’t obvious from some of my recent writings, I’ve been thinking a lot about history. This has been mostly the fault of John Green, who decided, in a recent step of his ongoing scavenger hunt, to pitch the age-old question: “[I]s it enough to behold the universe of which we are part, or must we leave a footprint in the moondust for it all to have been worthwhile?” It’s a question that I have personally struggled with a great deal, more so recently as my health and circumstances have made it clear that trying to follow the usual school > college > career > marriage > 2.5 children > retirement (and in that order, thank you very much) life path is a losing proposition.

The current political climate also has me thinking about the larger historical context of the present moment. Most people, regardless of their political affiliation, agree that our present drama is unprecedented, and the manner in which it plays out will certainly be significant to future generations. There seems to be a feeling in the air, a zeitgeist, if you will, that we are living in a critical time.

I recognize that this kind of talk isn’t new. Nearly a millennium ago, the participants of the First Crusade, on both sides, believed they were living in the end times. The fall of Rome was acknowledged by most contemporary European scholars to be the end of history. Both world wars were regarded as the war to end all wars, and for many, including the famed George Orwell, the postwar destruction seemed the insurmountable beginning of the end for human progress and civilization. Every generation has believed that its problems were of such magnitude that they would irreparably change the course of the species.

Yet for every one of these times when a group has mistakenly believed that radical change is imminent, there has been another revolution that has arrived virtually unannounced because people assumed that life would always go on as it always had gone on. Until the 20th century, imperial rule was the way of the world, and European empires were expected to last for hundreds or even thousands of years. In the space of a single century, Marxism-Leninism went from being viewed as a fringe phenomenon, to a global threat expected to last well into the time when mankind was colonizing other worlds, to a discredited historical footnote. Computers could never replace humans in thinking jobs, until they suddenly began to do so in large numbers.

It is easy to look at history with perfect hindsight and be led to believe that this is the way things would always have gone regardless. This is especially true for anyone born in the past twenty-five years, in an age after superpowers, where the biggest threat to the current world order has always been fringe radicals living in caves. I mean, really, am I just supposed to believe that there were two Germanies that both hated each other, and that everyone thought this was perfectly normal and would go on forever? Sure, there are still two Koreas, but no one really takes that division very seriously anymore, except maybe for the Koreans.

I’ve never been quite sure where I personally fit into history, and I’m sure a large part of that is because nothing of real capital-H Historical Importance has happened close to me in my lifetime. With the exception of the September 11th attacks, which happened so early in my life, and while I was living overseas, that they may as well have happened a decade earlier during the Cold War, and the rise of smartphones and social media, which happened only just as I turned old enough never to have known an adolescence without Facebook, the historical setting has, for the most part, remained the same for my whole life.

The old people in my life have told me about watching or hearing about the moon landing, or the fall of the Berlin Wall, and about how it was a special moment because everyone knew that this was history unfolding in front of them. Until quite recently, the closest experiences I had in that vein were New Year’s celebrations, which always carry with them a certain air of historicity, and getting to stay up late (in Australian time) to watch a shuttle launch on television. Lately, though, this has changed, and I feel more and more that the news I am seeing today may well turn out to be a turning point in the historical narrative that I will tell my children and grandchildren.

Moreover, I increasingly feel a sensation that I can only describe as historical pressure; the feeling that this turmoil and chaos may well be the moment that leaves my footprint in the moondust, depending on how I act. The feeling that the world is in crisis, and it is up to me to cast my lot in with one cause or another.

One of my friends encapsulated this feeling with a quote, often attributed to Vladimir Lenin, but which appears quite likely to be from some later scholar or translator:
“There are decades where nothing happens; and there are weeks where decades happen.”
Although I’m not sure I entirely agree with this sentiment (I can’t, to my mind, think of a single decade where absolutely nothing happened), I think it illustrates the point that I am trying to make quite well. We seem to be living in a time where change is moving quickly, in many cases too quickly to properly contextualize and adjust to, and we are being asked to pick a position and hold it. There is no time for rational middle ground because there is no time for rational contemplation.

Or, to put it another way: It is the best of times, it is the worst of times, it is the age of wisdom, it is the age of foolishness, it is the epoch of belief, it is the epoch of incredulity, it is the season of Light, it is the season of Darkness, it is the spring of hope, it is the winter of despair, we have everything before us, we have nothing before us, we are all going direct to Heaven, we are all going direct the other way – in short, the period is so far like the present period, that some of its noisiest authorities insist on its being received, for good or for evil, in the superlative degree of comparison only.

How, then, will this period be remembered? How will my actions, and the actions of my peers, go down in the larger historical story? Perhaps, in future media, the year 2017 will be thought of as “just before that terrible thing happened, when everyone knew something bad was happening but none yet had the courage to face it”, the way we think of the early 1930s. Or perhaps 2017 will be remembered like the 1950s, as the beginning of a brave new era which saw humanity in general, and the West in particular, reach new heights.

It seems to be a recurring theme in these sorts of posts that I finish with something to the effect of “I don’t know, but maybe I’m fine not knowing in this instance”. This remains true, but I also certainly wish to avoid encouraging complacency. Not knowing the answers is okay; it’s human, even. But ceasing to question in the first place is how we wind up with a far worse future.

Something Old, Something New

It seems that I am now well and truly an adult. How do I know? Because I am facing a quintessentially adult problem: people I know, people whom I view as my friends and peers and as being of my own age rather than my parents’, are getting married.

Credit to Chloe Effron of Mental Floss

It started innocently enough. I first became aware during my yearly social media purge, in which I sort through unanswered notifications, update my profile details, and suppress old posts which are no longer in line with the image which I seek to present. While briefly slipping into the rabbit hole that is the modern news feed, I was made aware that one of my acquaintances and classmates from high school was now engaged to be wed. This struck me as somewhat odd, but certainly not worth making a fuss about.

Some months later, it emerged, after a late-night crisis call between my father and uncle, that my cousin had been given a ring by his grandmother in order to propose to his girlfriend. My understanding of the matter, which admittedly is third- or fourth-hand and full of gaps, is that this ring-giving was motivated not by my cousin himself, but by the grandmother’s views on unmarried cohabitation (which existed between my cousin and said girlfriend at the time), as a means to legitimize the present arrangement.

My father, being the person he was, decided, rather than tell me about this development, to make a bet on whether or not my cousin would eventually, at some unknown point in the future, become engaged to his girlfriend. Given what I knew about my cousin’s previous romantic experience (more in depth than breadth), and the statistics from the Census and Bureau of Labor Statistics (see infographic above), I gave my conclusion that I did not expect my cousin to become engaged within the next five years, give or take six months [1]. I was proven wrong within the week.

I brushed this off as another fluke. After all, my cousin, for all his merits, is rather suggestible and averse to interpersonal conflict. Furthermore, he comes from a more rural background, with a stronger emphasis on community values than my godless city-slicker upbringing. And whereas I would be happy to tell my grandmother that I was perfectly content to live in delicious sin with my perfectly marvelous girl in my perfectly beautiful room [2], my cousin might be otherwise more concerned with traditional notions of propriety.

Today, though, came the final confirmation: wedding pictures from a friend of mine I knew from summer camp. The writing is on the wall. Childhood playtime is over, and we’re off to the races. In comes the age of attending wedding ceremonies and watching others live out their happily ever afters (or, as is increasingly common, fail spectacularly in a nuclear fireball of bitter recriminations). Naturally, next on the agenda is figuring out which “most likely to succeed” predictions were accurate with regard to careers, followed shortly by baby photos, school pictures, and so on.

At this point, I may as well hunker down for the day that my hearing and vision start failing. It would do me well, it seems, to hurry up and preorder my cane and get on the waiting list for my preferred retirement home. It’s not as though I didn’t see this coming from a decade away. Though I was, until now, quite sure that by the time marriage became a going concern in my social circle, I would at least be finished with high school.

What confuses me more than anything else is that these most recent developments seem to be in defiance of the statistical trends of the last several decades. Since the end of the postwar population boom, the overall marriage rate has been in steady decline, as has the percentage of households composed primarily of a married couple. At the same time, both the number and percentage of nonfamily households (defined as “those not consisting of persons related by blood, marriage, adoption, or other legal arrangements”) have skyrocketed, and the growth of households has become uncoupled from the number of married couples, though the two were historically strongly correlated [3].

Which is to say that the prevalence of godless cohabitation out of wedlock is increasing. So too has the median age of first marriage increased, from as low as eighteen at the height of the postwar boom, to somewhere around thirty for men in my part of the world today. This raises an interesting question: for how long is this trend sustainable? That is, suppose the current trend of increasingly later marriages continues for the majority of people. At some point, presumably, couples will opt to simply forgo marriage altogether, and indeed, in many cases, already are doing so in historic numbers [3]. At what point, then, does the marriage age snap back to the lower age practiced by those people who, now a minority, are still getting married early?
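The mechanics of that question are easy enough to play with. Below is a minimal toy simulation of it; every number in it (group sizes, typical ages, the chance that a “delayer” ever marries) is an assumption invented purely for illustration, not anything drawn from the census data:

# A toy model of the question above. Every number here (group sizes,
# typical marriage ages, the chance that a "delayer" ever marries) is an
# assumption invented for illustration, not a census figure.
import random
import statistics

random.seed(42)

def median_age_of_first_marriage(p_delayer_marries):
    """Median age among people who actually marry, for a given
    probability that a delayer ever marries at all."""
    ages = []
    for _ in range(100_000):
        if random.random() < 0.3:                  # early-marrying minority
            ages.append(random.gauss(22, 2))       # marries around age 22
        elif random.random() < p_delayer_marries:  # delayers marry only sometimes
            ages.append(random.gauss(32, 4))       # and do so around age 32
    return statistics.median(ages)

# As delayers increasingly forgo marriage altogether, the median "snaps
# back" toward the early-marrying group's typical age.
for p in (0.9, 0.6, 0.3, 0.1):
    print(f"P(delayer marries) = {p:.1f} -> median age {median_age_of_first_marriage(p):.1f}")

In this toy world, the median keeps climbing only while most delayers still eventually marry; once enough of them opt out entirely, the early marriers dominate the statistic and the median falls.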

Looking at the maps a little closer, a few interesting correlations emerge [NB]. First, states with larger populations seem to have both fewer marriages per capita and a higher median age of first marriage. Conversely, there is a weak but visible correlation between a lower median age of first marriage and an increased per capita marriage rate. There are a few conclusions that can be drawn from these two data sets, most of which match up with our existing cultural understanding of marriage in the modern United States.

First, marriage appears to have a geographic bias towards rural and less densely populated areas. This can be explained either by geography (perhaps a large land area with fewer people makes individuals more interested in locking down relationships), or by a regional cultural trend (perhaps more rural communities are more god-fearing than us cityborne heathens, and thus feel more strongly about traditional “family values”).

Second, young marriage is on the decline nationwide, even in the above mentioned rural areas. There are ample potential reasons for this. Historically, things like demographic changes due to immigration or war, and the economic and political outlook have been cited as major factors in causing similar rises in the median age of first marriage.

Fascinatingly, one of the largest such rises seen during the early part of the 20th century was attributed to the influx of mostly male immigrants, which created more romantic competition for eligible bachelorettes, and hence, it is said, caused many to defer the choice to marry [3]. It seems possible, perhaps even likely, that the rise of modern connectivity has brought about a similar deferral (think about how dating sites have made casual dating more accessible). Whether this effect works in tandem with, is caused by, or is a cause of shifting cultural values is difficult to say, but changing cultural norms are certainly also a factor.

Third, it seems that places where marriage is more common per capita have a lower median age of first marriage. Although a little counterintuitive, this makes some sense when examined in context. After all, the more important marriage is to a particular area or group, the higher it will likely be on a given person’s priority list. The higher a priority marriage is, the more likely that person is to want to get married sooner rather than later. Expectations of marriage, it seems, are very much a self-fulfilling prophecy.

NB: Both of these correlations have two major outliers: Nevada and Hawaii, which have far more marriages per capita than any other state, and fairly middle-of-the-road ages of first marriage. It took me an unconscionably long time to figure out why.
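Incidentally, the “weak but visible” correlation is easy to quantify if you tabulate the state figures. Here is a sketch of that check; the numbers in the table are placeholder values standing in for the real figures from the sources below, with Nevada included to show how a single outlier can mask the trend:

# How one might quantify the eyeballed correlation between marriages per
# capita and median age of first marriage. The figures below are
# placeholders, not the actual census numbers.
import statistics

states = {
    # state: (marriages per 1,000 residents, median age at first marriage)
    "Utah":          (8.6, 26.0),
    "Idaho":         (8.2, 26.5),
    "New York":      (6.1, 30.3),
    "Massachusetts": (5.5, 30.5),
    "Nevada":        (36.9, 28.4),  # wedding-tourism outlier
}

def pearson(xs, ys):
    """Pearson correlation coefficient of two equal-length sequences."""
    mx, my = statistics.mean(xs), statistics.mean(ys)
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    sx = sum((x - mx) ** 2 for x in xs) ** 0.5
    sy = sum((y - my) ** 2 for y in ys) ** 0.5
    return cov / (sx * sy)

rates, ages = zip(*states.values())
print("with Nevada:   ", round(pearson(rates, ages), 2))

# Dropping the outlier reveals the negative relationship (more marriages
# per capita, younger first marriages) much more clearly.
rates, ages = zip(*(v for k, v in states.items() if k != "Nevada"))
print("without Nevada:", round(pearson(rates, ages), 2))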

So, if marriage is becoming increasingly less mainstream, are we going to see the median age of first marriage eventually level off and decrease as this particular statistic becomes dominated by those who are already predisposed to marry young regardless of cultural norms?

Reasonable people can take different views here, but I’m going to say no. At least not in the near future, for a few reasons.

Even if marriage is no longer the dominant arrangement for families and cohabitation (which it still is at present), there is still an immense cultural importance placed on marriage. Think of the fairy tales children grow up learning. The ones that always end “happily ever after”. We still associate that kind of “ever after” with marriage. And while young people may not be looking for that now, as increased life expectancies make “til death do us part” seem increasingly far off and irrelevant to the immediate concerns of everyday life, living happily ever after is certainly still on the agenda. People will still get married for as long as wedding days continue to be a major celebration and social function, which remains the case even in completely secular settings today.

And of course, there is the elephant in the room: taxes and legal benefits. Like it or not, marriage is as much a secular institution as a religious one, and as a secular institution, marriage provides some fairly substantial incentives over simply cohabiting. The largest and most obvious of these is the ability to file taxes jointly as a single household. Other benefits, such as the ability to make medical decisions if one partner is incapacitated, to share property without a formal contract, and the like, are also major incentives to formalize arrangements if all else is equal. These benefits are the main reason why denying legal marriage rights to same-sex couples is a constitutional violation, and are the reason why marriage is unlikely to go extinct.

All of this statistical analysis, while not exactly comforting, has certainly helped cushion the blow of the existential crisis which seeing my peers reach major milestones far ahead of me generally brings with it. Aside from providing a fascinating distraction in poring over old reports and analyses, the statistics have proven what I already suspected: that my peers and I simply have different priorities, and this need not be a bad thing. Not having marriage prospects at present is not by any means an indication that I am destined for male spinsterhood. And with regards to feeling old, the statistics are still on my side. At least for the time being.

Works Consulted

Effron, Chloe, and Caitlin Schneider. “At What Ages Do People First Get Married in Each State?” Mental Floss. N.p., 09 July 2015. Web. 14 May 2017. <http://mentalfloss.com/article/66034/what-ages-do-people-first-get-married-each-state>.

Masteroff, Joe, Fred Ebb, John Kander, Jill Haworth, Jack Gilford, Bert Convy, Lotte Lenya, Joel Grey, Hal Hastings, Don Walker, John Van Druten, and Christopher Isherwood. Cabaret: original Broadway cast recording. Sony Music Entertainment, 2008. MP3.

Wetzel, James. American Families: 75 Years of Change. Publication. N.p.: Bureau of Labor Statistics, n.d. Monthly Labor Review. Bureau of Labor Statistics, Mar. 1990. Web. 14 May 2017. <https://www.bls.gov/mlr/1990/03/art1full.pdf>.

Kirk, Chris. “Nevada Has the Most Marriages, but Which State Has the Fewest?” Slate Magazine. N.p., 11 May 2012. Web. 14 May 2017. <http://www.slate.com/articles/life/map_of_the_week/2012/05/marriage_rates_nevada_and_hawaii_have_the_highest_marriage_rates_in_the_u_s_.html>.

TurboTax. “7 Tax Advantages of Getting Married.” Intuit TurboTax. N.p., n.d. Web. 15 May 2017. <https://turbotax.intuit.com/tax-tools/tax-tips/Family/7-Tax-Advantages-of-Getting-Married-/INF17870.html>.

Keep Calm and Carry On

Today, we know that poster as a, well, poster of quintessential Britishness. It is simply another of our twenty-first-century truisms, not unlike checking oneself before wrecking oneself. Yet this phrase has a far darker history.

In 1940, war hysteria in the British Isles was at its zenith. To the surprise of everyone, Nazi forces had overcome the Maginot Line and steamrolled into Paris. British expeditionary forces at Dunkirk had suffered heavy casualties, and been forced to abandon most of their equipment during the hastily organized evacuation. In Great Britain itself, the Home Guard had been activated, and overeager ministers began arming them with pikes and other medieval weapons [10]. For many, a German invasion of the home isles was deemed imminent.

Impelled by public fear and worried politicians, the British government began drawing up contingency plans for its last stand on the British Isles. Few military strategists honestly believed that the German invasion would materialize. Allied intelligence made it clear that the Germans did not possess an invasion fleet, nor the necessary manpower, support aircraft, and logistical capacity to sustain more than a few minor probing raids [5]. Then again, few had expected France to fall so quickly. And given the Nazis’ track record so far, no one was willing to take chances [3].

Signposts were removed across the country to confuse invading forces. Evacuation plans for key government officials and the royal family were drawn up. Potential landing sites for a seaborne invasion were identified, and marked for saturation with every chemical weapon in the British stockpile. Thus far, the threat of mutually assured destruction had prevented the large-scale use of chemical weapons as seen in WWI. However, if an invasion of the home islands began, all bets would be off. Anti-invasion plans called for the massive use of chemical weapons against invading forces, and of both chemical and biological weapons against German cities, intended to depopulate and render much of Europe uninhabitable [4][7][8].

Strategists studying prior German attacks, in particular the combined-arms shock tactics which allowed Nazi forces to overcome superior numbers and fortifications, became convinced that the successful defence of the realm depended on avoiding confusion and stampedes of refugees among the civilian population, as seen in France and the Low Countries. To this end, the Ministry of Information was tasked with suppressing panic and ensuring that civilians were compliant with government and military instructions. Official pamphlets reiterated that citizens must not evacuate unless and until instructed to do so.

IF THE GERMANS COME […] YOU MUST REMAIN WHERE YOU ARE. THE ORDER IS “STAY PUT”. […] BE READY TO HELP THE MILITARY IN ANY WAY. […] THINK BEFORE YOU ACT. BUT THINK ALWAYS OF YOUR COUNTRY BEFORE YOU THINK OF YOURSELF. [9]

Yet some remained worried that this message would get lost in the confusion on invasion day. People would be scared, and perhaps need to be reminded. “[T]he British public were suspicious of lofty sentiment and reasoned argument. […] Of necessity, the wording and design had to be simple, for prompt reproduction and quick absorption.” [1] So plans were made to ensure that the message would be unmistakable and omnipresent. Instead of a long, logical pamphlet, there would be a simple, clear message presented in a visually distinctive manner. That message, a mere five words, captured the entire spirit of the British home front in a single poster.

KEEP CALM AND CARRY ON

The poster was never widely distributed during World War II. The Luftwaffe, believing that it was not making enough progress towards the total air supremacy deemed crucial for any serious invasion, switched its strategy from targeting RAF assets to terror bombing campaigns against British cities. Luckily for the British, who by their own assessment were two or three weeks of losses away from ceding air superiority [5], this strategy, though it inflicted more civilian casualties, eased pressure on the RAF and military infrastructure enough to allow them to recover. Moreover, as the British people began to adapt to “the Blitz”, Allied resolve strengthened rather than shattered.

The German invasion never materialized. And as air raids became more a fact of life, and hence less terrifying and disorienting to civilians, the need for a propaganda offensive to quell panic and confusion subsided. As the RAF recovered, and particularly as German offensive forces began to shift to the new Soviet front, fears of a British collapse faded. Most of the prepared “Keep Calm” posters were gradually recycled as part of the wartime paper salvage effort.

With perfect hindsight, it is easy to recognize that a large-scale German invasion and occupation of the British Isles would have been exceedingly unlikely, and victory against an entrenched and organized British resistance nigh impossible. The British government was on point when it stated that the key to victory against an invasion was level-headedness. Given popular reaction to the rediscovered copies of the “Keep Calm” design, it also seems that they were on the mark there.

The poster and the phrase it immortalized have long since become decoupled from their historical context. Yet not, interestingly, from the essence they sought to convey. It is telling that many of the new appropriations of the phrase, as seen by a targeted image search, have to do with zombies, or other staples of the post-apocalyptic genre. In its original design, the poster adorns places where anxiety is commonplace, such as workplaces and dorm rooms, and has become go-to advice for those in stressful situations.

This last week in particular has been something of a roller coaster for me. I feel characteristically anxious about the future, and yet at the same time lack sufficient information to make a workable action plan to see me through these troubling times. At a doctor’s appointment, I was asked what my plan was for the near future. With no other option, I picked a response which has served both me and my forebears well during dark hours: Keep Calm and Carry On.

Works Consulted

1) “Undergraduate Dissertation – WWII Poster Designs, 1997.” Drbexl.co.uk. N.p., 23 Jan. 2016. Web. 11 May 2017. <http://drbexl.co.uk/1997/07/11/undergraduate-dissertation-1997/>.

2) “Dunkirk rescue is over – Churchill defiant.” BBC News. British Broadcasting Corporation, 04 June 1940. Web. 11 May 2017. <http://news.bbc.co.uk/onthisday/hi/dates/stories/june/4/newsid_3500000/3500865.stm>.

3) Inman, Richard. “Fighting for Britain.” Wolverhampton History – Wolverhampton History. Wolverhampton City Council, 13 Dec. 2005. Web. 11 May 2017. <http://www.wolverhamptonhistory.org.uk/people/at_war/ww2/fighting3>.

4) Bellamy, Christopher. “Sixty secret mustard gas sites uncovered.” The Independent. Independent Digital News and Media, 03 June 1996. Web. 11 May 2017. <http://www.independent.co.uk/news/sixty-secret-mustard-gas-sites-uncovered-1335343.html>.

5) “Invasion Imminent.” Invasion Imminent – Suffolk Anti-invasion defences. N.p., n.d. Web. 11 May 2017. <http://pillboxes-suffolk.webeden.co.uk/invasion-imminent/4553642028>.

6) “Large bomb found at ex-Navy base.” BBC News. British Broadcasting Corporation, 22 Apr. 2006. Web. 11 May 2017. <http://news.bbc.co.uk/2/hi/uk_news/england/hampshire/4934102.stm>.

7) Ministry of Information. CIVIL DEFENCE – BRITAIN’S WARTIME DEFENCES, 1940. Digital image. Imperial War Museums. n.d. Web. 11 May 2017. <http://www.iwm.org.uk/collections/item/object/205019014>.

8) “Living with anthrax island.” BBC News. British Broadcasting Corporation, 08 Nov. 2001. Web. 11 May 2017. <http://news.bbc.co.uk/2/hi/uk_news/1643031.stm>.

9) Ministry of Information. If the Invader Comes. 1940. Print.

10) Ramsey, Syed. Tools of War: History of Weapons in Medieval Times. N.p.: Alpha Editions, n.d. Print.

Kindred Spirits

This weekend I spent my time volunteering at a local barbecue cooking competition with a charity which represents people who suffer from one of the many chronic diseases and disabilities. This came about because one of the competitors’ daughters was recently diagnosed with the same disease as I, and so he wanted to invite someone to advocate and educate. What’s interesting is that his daughter is approximately the same age that I was when I was first diagnosed.

Being diagnosed at that particular age, while not unheard of, is nevertheless uncommon enough that it gave me momentary pause, and in preparing to meet her my mind this week has been on what I ought to tell her, and moreover, what I wish I could tell a younger version of myself when I was diagnosed. She was, as it turned out, not greatly interested in discussing health with me, which I suppose is fair enough. Even so, I have been thinking about this topic enough that it has more or less solidified into the following post:

I could tell you it gets easier, except I would be lying. It doesn’t get easier. People might tell you that it gets easier to manage, which is sort of true inasmuch as practice and experience make the day-to-day stuff less immediately challenging, the same as with anything. And of course, technology makes things better and easier. Not to be the old man yelling at the whippersnappers about how good they have it nowadays, but it is true that in the ten years I’ve had to deal with it, things have gotten both better and easier.

The important thing here is that over the course of years, the actual difficulty level doesn’t really change. This is depressing and frustrating, but it’s also not that bad in the big scheme of things. There are a lot of chronic diseases where things only get worse with time, and that’s not really the case with our disease. We have the sword of Damocles hanging over our heads threatening us if we mess up, but if we stay vigilant, and get nothing wrong, we can postpone that confrontation basically forever.

It means that you can get to a point where you can still do most things that ordinary people can do. It’s more difficult, and you’re never not going to have to be paying attention to your health in the background. That’s never going to change. You’re going to be starting from an unfair disadvantage, and you’re going to have to work harder to catch up. Along the way you will inevitably fail (it’s nothing personal; just a matter of odds), and your failure will be all the more spectacular and set you further back than what’s considered normal. It’s not fair. But you can still do it, despite the setbacks. In fact, for most of the important things in life, it’s not really optional.

Whatever caused this, whatever you think of it, whatever happens next, as of now, you are different. You are special. That’s neither a compliment, nor an insult. That’s a biological, medically-verified, legally-recognized fact. People around you will inevitably try to deny this, telling you that your needs aren’t any different from those around you, or that you shouldn’t act or feel or be different. Some of these people will mean well but be misguided; others will be looking for a way to hurt or distract you.

If you’re like me, and most people, at some point you too will probably try to tell yourself this. It is, I have been told, an essential part of adolescence. Futile though it may be to say this now, and believe me when I say that I mean it in the nicest way possible, I must declare: whoever these sentiments come from, whatever their intentions, they are straight-up wrong. You are different and special. You can choose how to react to that, and you can choose how to portray this, but you cannot change the basic fact. That you are different is not any reflection on you or anything you have done, and accepting this is not any sort of concession or confession; on the contrary, it reflects maturity and understanding.

It follows that your experience and your path may not be the “normal” one. This is neither good nor bad, but simply reflects the special circumstances which exist as a matter of fact. The fact that everything is that much harder may mean that you have to pick and choose your battles, or get extra help on some things, even if those things seem normal and easy for other people. This is to be expected, and is nothing to hide or be ashamed of. People around you may not understand this, and may give you a hard time. Just remember, as I was told when I was in your shoes: The people who matter don’t mind, and the people who mind don’t matter.

Angry in May

I am angry today. I don’t like feeling generally angry, because it’s usually quite draining without being actually fulfilling. Yet I feel rather compelled to be angry. I know several people who feel near or on the brink of desperation because of recent events regarding healthcare in particular and politics in general. I want to help, but there seems to be increasingly little I can do. I myself am somewhat worried about the future. In the wake of all of this I feel that I have the choice between being paralyzed by fear or being motivated by anger. The latter seems like an obvious choice.

The beginning of May is a time of a number of small holidays. April 30th marks the beginning of the real end of World War II in Europe, with the suicides of Hitler and company in Berlin and the transfer of governmental power to Reichspräsident (formerly Grand Admiral) Karl Dönitz, who would authorize the unconditional surrender of Nazi Germany on May 7th. This is commemorated as VE Day in the west on May 8th, and celebrated as Victory Day in the now-former Soviet bloc on May 9th, due to the time difference between London and Moscow (and a few mishaps regarding paperwork and general distrust of the Soviets). Depending on where you live, this is either interesting trivia, or a very big deal.

Victory Day in Russia is one of the really big political occasions, and is celebrated with an accordingly large show of military force. These parades are a chance for Russia to show off all the fancy toys that it would use to annihilate any future such invaders, for ordinary people to honor those they lost during the war, for old people and leftists to pine nostalgically for the halcyon days when the Soviet Union was strong and whippersnappers knew their place, and for western intelligence organizations to update their assessments of Russian military hardware. This last one has caused problems in the past, as miscounts of the number of bombers and missile launchers (the Soviets were cycling them through the parades to inflate their numbers) led to the impression that a bomber gap, and later a missile gap, existed between the Soviets and the US for much of the Cold War.

Speaking of bombastic parades, the First of May is either known as an occasion for maypole dancing, or for massive demonstrations with masses of red flags. Prior to the 1800s, May Day was something of a spring festival, likely originally associated with the Roman festival for the goddess of flowers, Flora, which took place on the first official day of summer. As Roman paganism fell out of fashion, the festival became a more secular celebration of springtime.

In 1904, the Sixth Conference of the Second International declared that the first of May would be a day of protest for labor organizations to demonstrate, in memory of the May 4th, 1886 Haymarket Affair in Chicago. Subsequently, May Day became something of a major event for labor and workers’ rights groups. This was solidified after the formation of the Soviet Union (they seem to be a recurring element here), which, as a self-styled “workers’ state”, made May Day celebrations a big deal within its borders, and used the occasion to further sympathetic causes abroad.

This caused something of a feedback loop, as governments taken in by anti-communist hysteria sought either to suppress (and thus, in many ways, legitimize) May Day demonstrations, or to control such demonstrations by making them official. Thus, in many countries, 1st May is celebrated as Labour Day (generally with the ‘u’). In 1955, Pope Pius XII declared May Day a feast day for Saint Joseph the Worker, in counterpoint to the labor celebrations.

May the Fourth is, of course, celebrated as Star Wars Day, for obvious reasons. Historically it has been the day that I dress up in full character costume for school. Unfortunately, this year I was too sick to actually attend school, in costume or not. I was also recently informed that in Ohio in particular, 4th May is recognized primarily as the anniversary of the Kent State Massacre during the Vietnam War. To quote the friend who explained it to me:

So today is May 4th, affectionately known by most as Star Wars Day. That is what it used to be for me until I went to Kent State. Now May 4th is a day of remembrance. Because today in 1970, the National Guard opened fire on a group of students peacefully protesting the Vietnam War and killed 4. It has become a day for the entire campus to go silent, to walk the memorial, to reflect on how important it is to speak up about what you believe is wrong. Politics is not always elections. Sometimes it is holding a candle at a memorial of people killed by the government. Sometimes it is remembering and refusing to forget. Either way, it is action. That is one of the most important lessons I have learned at Kent State.

The opening days of May have for some time now been a time of year when I typically pause and reflect. Having several small holidays (that is, holidays well known enough that I am reminded of their passing, without necessarily needing to go out of my way to prepare in advance) has helped add to this. Early May is typically long enough after cold and flu season that even if I’m not back in the thick of things, I’m usually on my feet. It’s also after midterms and standardized testing, while not yet being close enough to final exams that I can feel the weight of all my unfinished work bearing down on me in full force. Early May is a lull when I can get my bearings before hunkering down for the last act of the school year and hitting the ground running for summer.

So, where am I? How am I doing? How am I going to come back into school roaring?

I don’t know the answer to any of these questions. There are too many things up in the air in my life, both at the micro and macro level. I feel uncertain and a little scared. And I feel angry.

Inasmuch as I have any real self-confidence and self-worth, I pride myself on my intelligence. I like that I can recall off the top of my head several different holiday occasions in the space of a fortnight, and succinctly explain their historical and cultural context. I enjoy being a know-it-all. I loathe the unknown, and I detest the substitution of squishy feelings for hard facts. I consider these principles integral to my identity and personal value, and find it difficult and troubling to envision any future where I do not possess these traits, or where these merits are not accepted.

The Antibiotic Apocalypse and You

Following up on the theme established inadvertently last week: I’m still sick, though on the whole, I’m probably not feeling worse, and arguably even marginally better. In an effort to avoid the creativity-shattering spiral that happens when I stop writing altogether, this week I will endeavor to present some thoughts on a subject which I have been compelled to think about anyway: antibiotics.

A lot of concerns have been raised, rightfully, over the appearance of drug-resistant pathogens, with some going so far as to dub the growing prevalence of resistant bacteria “the antibiotic apocalypse”. While antibiotic resistance isn’t a new problem per se, the newfound resistance to our more powerful “tiebreaker” drugs is certainly a cause for concern.

In press releases from groups such as the World Health Organization and the Centers for Disease Control and Prevention, much of the advice, while sound, has been directed at government organizations and healthcare providers. And while these people certainly have more responsibility and ability to react, this does not mean that ordinary concerned citizens cannot make a difference. Seeing as I am a person who relies on antibiotics a great deal, I figured I’d share some of the top recommendations for individuals to help in the global effort to ward off antibiotic resistance.

Before going further, I am compelled to restate what should be common sense: I don’t have actual medical qualifications, and thus what follows is pretty much a re-hash of what other experts have given as general, nonspecific information. With this in mind, my ramblings are no substitute for actual, tailored medical advice, and shouldn’t be treated as such.

Before you’re put on antibiotics

1) Stay home when you’re sick

This one is going to be repeated, because it bears repeating. Antibiotic-resistant strains spread like any other illness, and the single best way to avoid spreading illness is to minimize contact with other people. Whether or not you are currently infected with an antibiotic-resistant illness (in fact, whether or not you even have an illness that is treatable by antibiotics), staying at home when you’re sick will help you get better sooner, and is the single most important thing you can do for public health in general.

2) Wash hands, take your vitamins, etcetera.

So obviously the best way to deal with illness is to avoid spreading it in the first place. This means washing your hands frequently (and properly! Sprinkling on some room temperature water like a baptism for your hands isn’t going to kill any germs), preparing food to proper standards, avoiding contact with sick people and the things they come in contact with, eating all of your vegetables, getting your vaccinations, you get the picture. Even if this doesn’t prevent you from getting sick, it will ensure that your immune system is in fighting shape for if you do.

3) Know how antibiotics work, and how resistance spreads

Remember high school biology? This is where all that arcana comes together. Antibiotics aren’t a magical cure-all. They use specific biological and chemical mechanisms to target specific kinds of organisms inside you. Antibiotics don’t work on viruses because viruses aren’t living organisms, and different kinds of antibiotics work against different diseases because of these biological and chemical distinctions.

Understanding the differences involved when making treatment decisions can be the difference between getting effective treatment and walking away unharmed, and spending time in the hospital to treat a resistant strain. Antibiotic resistance is a literally textbook example of evolution, so understanding how evolution works will help you combat it.

Public understanding of antibiotics and antibiotic resistance is such a critical part of combating resistance that it has been named by the World Health Organization as one of the key challenges in preventing a resistant superbug epidemic.

4) Treat anyone who is on antibiotics as if they were sick

If someone is on antibiotics and still doesn’t feel or seem well (and isn’t at home, for some reason), you’re going to want to take that at face value and keep your distance. You can also kindly suggest that they consider going home and resting. If you become sick after contact with such persons, be sure to mention it to your doctor.

If they’re feeling otherwise fine, you want to treat them as if they were immunocompromised. In other words, think of how you would conduct yourself health-wise around a newborn, or an elderly person. Extra hand-washing, making sure to wipe down surfaces, you get the picture. If they’re on antibiotics preventatively for a chronic immunodeficiency, they will appreciate the gesture. If they’re recovering from an acute illness, taking these extra precautions will help ensure that they don’t transmit pathogens and that their immune system has time to finish the job and recover.

5) Never demand antibiotics

I’ll admit, I’m slightly guilty of this one myself. I deal with a lot of doctors, and sometimes when I call in for a sick-day consult, I get paired with a GP who isn’t quite as experienced with my specific medical history, who may not have had time to go through my whole file, and who hasn’t been in close contact with my other dozen specialist doctors. Maybe they don’t recognize which of my symptoms are telltale signs for one diagnosis or another, or don’t know that my immunology team has a policy of escalating straight to a fourteen-day course, or whatever.

I sympathize with the feeling of just wanting to get the doctor to write the stupid prescription like last time so one can get back to the important business of wasting away in bed. However, this is a problem. Not everyone is as familiar with how antibiotics work and with the intricacies of prescribing them, and so too often when patients ask for antibiotics, it ends up being the wrong call. This problem is amplified in countries such as the United States where economics and healthcare policies make it more difficult for doctors to refuse. (This is also a major issue with prescription painkillers in the United States.) So, listen to your doctor, and if they tell you that you don’t need antibiotics, don’t pressure them.

Bear in mind that if a doctor says you don’t need antibiotics, it probably means that taking them won’t help you or make you feel any better, and could cause serious harm. For reference, approximately one in five hospital visits for drug side effects and overdoses is related to antibiotics.

It should go without saying that you should only get antibiotics (or any medication, really) via a prescription from your doctor, but apparently this is a serious enough problem that both the World Health Organization and the Centers for Disease Control and Prevention feel the need to mention this on their patient websites. So, yeah. Only take the drugs your doctor tells you to. Never take antibiotics left over from previous treatment, or from friends. If you have antibiotics left over from previous treatment, find your local government’s instructions for proper disposal.

If you are prescribed antibiotics

1) Take your medication on schedule, preferably with meals

Obviously, specific dosing instructions overrule this, but generally speaking, antibiotics are given a certain number of times per day, spaced a certain number of hours apart, and on a full stomach. Aside from helping to ensure that you will remember to take all of your medication, keeping to a schedule that coincides with mealtimes will help space dosages out and ensure that the antibiotics are working at maximum efficiency.

Skipping doses, or taking doses improperly, vastly increases both the likelihood of developing resistant pathogens and the risk of side effects.

2) Take probiotics between dosages

Antibiotics are fairly indiscriminate in their killing of anything they perceive as foreign. Although this makes them more effective against pathogens, it can also be devastating to the “helpful bacteria” that line your digestive tract. To this end, most gastroenterologists recommend taking a probiotic in between dosages of antibiotic. Aside from helping your body keep up its regular processes and repair collateral damage faster, this also occupies space and resources that would otherwise be ripe for the taking by the microbes making you sick.

3) Keep taking your antibiotics, even if you feel well again

You can feel perfectly fine even while millions of hostile cells linger in your body. Every hostile cell that survives treatment is resistant, and can go on to start the infection all over again, only this time the antibiotic will be powerless to halt it. Only by taking all of your antibiotics on the schedule prescribed can you ensure that the infection is crushed the first time.

Furthermore, even though you may feel fine, your immune system has been dealt a damaging blow, and needs time to rebuild its forces. Continuing to take your antibiotics will help ensure that your weakened immune system does not let potentially deadly secondary infections slip through and wreak havoc.

4) Stay Home and Rest

Is this message getting through yet?

If you are on antibiotics, it means your body is engaged in a struggle, and it needs all of your resources focused on supporting that fight. Even the most effective antibiotics cannot eliminate every hostile cell. Your immune system plays a vital role in hunting down and eliminating the remaining pathogens and preventing these resistant strains from multiplying and taking hold. In the later stages of this fight, you may not even feel sick, as there are too few resistant cells left to cause serious damage. However, unless all of them are exterminated, the fight will continue and escalate.

Ideally, you should stay at home and rest for as long as you are taking antibiotics. However, since antibiotics are often given in courses of fourteen or twenty-one days, this is impossible for most adults. At the barest minimum, you should stay home until you feel completely better, or until you are halfway done with your course of antibiotics, whichever is longer.
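If it helps to see that rule of thumb written out mechanically, here is a minimal sketch; the function name and the example numbers are mine, purely for illustration:

# The stay-home rule above, expressed as a calculation: stay home until you
# feel completely better or until the halfway point of the course,
# whichever is longer. Purely illustrative, not medical advice.
import math

def days_to_stay_home(course_length_days: int, days_until_feeling_better: int) -> int:
    halfway = math.ceil(course_length_days / 2)
    return max(days_until_feeling_better, halfway)

# On a fourteen-day course, feeling fine by day 4 still means staying
# home through day 7; feeling sick until day 10 means staying home 10 days.
print(days_to_stay_home(14, 4))   # -> 7
print(days_to_stay_home(14, 10))  # -> 10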

If you do return to your normal routine while taking antibiotics, keep in mind that you are still effectively sick. You should therefore take all of the normal precautions: extra hand washing, wiping down surfaces, extra nutrition and rest, and the like.

5) If you don’t feel better, contact your doctor immediately

Remember: antibiotics are fairly all-or-nothing, and once an illness has developed resistance to a specific treatment, continuing that line of treatment is unlikely to yield positive results and extremely likely to cause increased resistance to future treatment. Obviously, antibiotics, like any course of treatment, take some time to take effect, and won’t suddenly make you feel completely better overnight. However, if you are more than halfway through your treatment course and see no improvement, or feel markedly worse, this could be a sign that you require stronger medication.

This does not mean that you should stop taking your current medication, nor should you take this opportunity to demand stronger medication (both of these are really, colossally bad ideas). However, you should contact your doctor and let them know what’s going on. Your doctor may prescribe stronger antibiotics to replace your current treatment, or they may suggest additional adjunctive therapy to support your current treatment.

Works Consulted

“Antibiotic resistance.” World Health Organization. World Health Organization, n.d. Web. 28 Apr. 2017. <http://www.who.int/mediacentre/factsheets/antibiotic-resistance/en/>.

Freuman, Tamara Duker. “How (and Why) to Take Probiotics When Using Antibiotics.” U.S. News & World Report. U.S. News & World Report, 29 July 2014. Web. 28 Apr. 2017.

“About Antibiotic Use and Resistance.” Centers for Disease Control and Prevention. Centers for Disease Control and Prevention, 16 Nov. 2016. Web. 28 Apr. 2017. <https://www.cdc.gov/getsmart/community/about/index.html>.

Commissioner, Office Of the. “Consumer Updates – How to Dispose of Unused Medicines.” U S Food and Drug Administration Home Page. Office of the Commissioner, n.d. Web. 28 Apr. 2017. <https://www.fda.gov/forconsumers/consumerupdates/ucm101653.htm>.

NIH-NIAID. “Antimicrobial (Drug) Resistance.” National Institutes of Health. U.S. Department of Health and Human Services, n.d. Web. 28 Apr. 2017. <https://www.niaid.nih.gov/research/antimicrobial-resistance>.

Ode to the Immune System

Context: I’m sick. When I’m sick, I get bored without being able to write properly. Consequently, I tend to write shorter things, like songs. Here’s a song about how certain parts of the immune system work. To the tune of “The Red Army is Strongest”, AKA “Red Army, Black Baron”, AKA “that song from the Comintern faction in Hearts of Iron IV”. Enjoy.

The deadly virus and the harmful germ
Are gathering a great dark storm
But without regard for the malady
The immune cells defend the body

So see the macrophage, begin the war to wage
Take its enemies hand to hand
Then it engulfs them with its deadly rage
As it makes the body’s first stand

Hear the great cry of the brave neutrophils
As they charge forth into great trouble
As without regard for the malady
The immune cells defend the body

So see Dendritic Cells, ring out their warning bells
Awaking the nearest lymph gland
And activating the T and B cells
As it prepares the final stand

Watch the B lymphocytes turning the tide
Making antibodies well supplied
As without regard for the malady
The immune cells defend the body

Now the new mem’ry Cells, in the lymph nodes shall dwell
As others die by their own hand
The body stands down, as now all is well
As it survived the final stand

Once Upon A Time

Once upon a time in a magical kingdom in Florida, a certain tourist hub instituted a policy for guests with disabilities. This policy, known as the Guest Assistance Card, granted those who were familiar with its existence, and who could justify its use, powers unseen by mere mortals. With one of these mystical passes, a disabled guest and their party could avoid the long lines which plagued the kingdom. Although this could not heal the guests’ wounds, and could never make up for the challenges these people faced in everyday life, it offered the promise of an escape. It kept true to the dream of a magical vacation unbound by the stresses and turmoils of everyday life.

Unfortunately, in a storybook example of why we can’t have nice things, there were evil-doers with poison in their hearts who sought to abuse this system and corrupt it for everyone. Shady businessmen would rent out their grandparents in wheelchairs to rich families craving the awesome power to cut lines. Eventually, the kingdom had no choice but to close this loophole. When it did so, it shattered the hearts of many a handicapped child and their family.

Alright, I think you’re all caught up on the backstory here.

Though it disappoints me greatly that it came to this, with the level of abuse being turned up by tabloids and travel blogs, it was inevitable that Disney would have to end this program. As one who has used it myself, I will be the first to admit- it was overpowered. But from the impression I got from the guest services folks, that was part of the point. The point was never to meet the lowest common denominator necessary to comply with federal anti-discrimination laws. The point was to enable these guests to enjoy their vacation. To enable magical moments which, for some of these kids, might never happen again.

There are many reasons why, for a long time, Walt Disney World was the default Make-A-Wish Foundation (and similar) destination, and this approach to disability is one of them. The new program which replaced the GAC is workable- it functions as a sort of on-the-go FastPass, giving you a return time equal to the listed standby wait minus ten minutes, after which you can go through the FastPass line at your leisure. But it is mundane compensation rather than a magical silver lining to living with disability. It is a crutch rather than a tricked-out motorized wheelchair.
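To make the mechanics concrete, that return-time rule reduces to one line of arithmetic. A hypothetical sketch (the names here are mine, not Disney’s):

    from datetime import datetime, timedelta

    def return_time(now: datetime, standby_wait_minutes: int) -> datetime:
        # Return time = current time + listed standby wait - ten minutes.
        return now + timedelta(minutes=standby_wait_minutes - 10)

    # Example: at 1:00 PM with a 60-minute standby wait, you may come back
    # at 1:50 PM and use the FastPass line at your leisure.
    print(return_time(datetime(2017, 4, 28, 13, 0), 60))
    # 2017-04-28 13:50:00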

I don’t blame Disney for this change in policy. I know how some people were using the GAC, and Disney really had no choice. I do blame the ringleaders of these black market operations, and the people who paid them. As far as I am concerned, these people are guilty of perfidy, that is, the war crime of abusing the protections of the rules of war (such as feigning wounds) to gain an advantage. As for Disney, I am disappointed, but understanding.

I wish that this fairytale had a more appropriate ending. I wish that I could say that the evil-doers faced poetic justice and were made to wait in an endless line while listening to the sounds of children crying and complaining about the wait. Unfortunately, this did not happen, and these few bad apples spoiled the bunch.

Ne Obliviscaris

How accurate is it to say that you will never forget something?

Obviously, not terribly accurate. After all, “never” and “always”, being infinite, are not generally applicable on a human timescale. And even if we assume that forgetting is something only a living person can do, the nature of human memory over extended time periods makes “never forgetting” a rather unfulfillable promise.

This week marked a fascinating, if bittersweet, milestone for me. As of this Wednesday, I have been disabled for a majority of my life. The dramatic saga of my diagnosis is one of the things I have committed to “never forgetting”, even though I know that this task is impossible. In some respects, I feel as though I have already failed at it. Promises made to me and to myself about not letting this label define me or limit my grand endeavors have proven impossible to keep.

They tell you, when you’re dealing with a disability or a chronic disease, that you can’t let it define you or limit your options; that meeting a certain medical or legal definition doesn’t make you any different from your peers. While the thought is nice, I have increasingly found that mindset to be idealistic and impractical. Having your options limited is pretty much the definition of disability, and accepting that isn’t pessimism; it’s realism.

Whenever I take an unmodified psychiatric assessment, it always flags me for possible risk of depression and/or anxiety, with a healthy dash of obsessive-compulsive and paranoid symptoms. This is because I answer honestly to statements like “I feel different from my peers” and “I am sick a lot”. The fact of the matter is that I am objectively different from my peers because my body does not function within normal parameters, and I am sick a lot for the same reason. Devoid of context, these answers might indicate a problem. Upon my explaining that, yes, I do experience great everyday stress because I have to cope with artificially supplementing missing organ function, most doctors agree that my apparent pessimism is completely justified and, in fact, represents a mostly-healthy means of coping with my present situation. After all, it’s not paranoia if your statistical chances of dying really are vastly increased.

As for the issue of defining myself, it is my experience that people generally define themselves by the struggles they encounter and how they meet them. For example: if a person’s lifelong struggle is to climb Everest, I do not see why they should not describe themselves as a climber. For my part, my greatest struggle by far is staying alive and keeping my body from annihilating itself. This may seem like a relatively simple life struggle to the perfectly healthy and the uneducated, in the same way that climbing an oversized hill may seem like a simplistic goal to someone unacquainted with proper mountains.

To me at least, having someone tell me I can’t let my illness define me tells me that person has never really had to deal with serious health problems. Because taking proper care of oneself is a defining struggle. I am proud of the fact that I have managed to keep my body alive despite several key systems giving up on me. I am proud that I have managed to keep myself in a state in which I can actually participate in life, even if my participation might look different from others’.

And yes, I understand that what is meant is that I ought not let my issues engulf the entirety of my existence- that I ought to still have non-health goals. But trying to plan goals completely independently of my health is setting myself up for failure. No matter how hard I try, no matter how much I will it to be so, I cannot change my basic physiological requirements. At best, I can try to make my personal and health goals work in harmony, but this does require me to let my disability set the boundaries of what challenges I undertake.

Yes, I can still run a marathon. But I couldn’t step outside and do it today. Not only would I fail, but if I persisted against medical advice, I might even die trying. Dealing with my health means I have to plan and make compromises. I can’t be completely single-minded about these kinds of goals because my health requires constant focus. Lying to myself, or having others lie to me, doesn’t help, and only increases the chance that I’ll feel worse about my situation. Accepting this, in effect, letting my disability define my boundaries and dictate my life, is the only way I will ever be able to move beyond it and start accomplishing other goals.