Notes on Descriptivism

There is an xkcd comic which deals with linguistic prescriptivism. For those not invested in the ongoing culture war surrounding grammar and linguistics, prescriptivism is the idea that there is a singular, ideal, correct version of language to which everyone ought adhere. This is distinct from linguistic descriptivism, which maintains that language is better thought of not as a set of rules, but as a set of norms; and that to try and enforce any kind of order on language is doomed to failure. In short, prescriptivism prescribes idealized rules, while descriptivism describes existing norms.

The comic presents a decidedly descriptivist worldview, tapping into the philosophical question of individual perception to make the point that language is inherently subject to subjective interpretation, and therefore must vary from individual to individual. The comic also pokes fun at a particular type of behavior which has evolved into an Internet troll archetype of sorts: the infamous Grammar Nazi. This is mostly an ad hominem, though it hints at another argument frequently used against prescriptivism: that attempts to enforce a universal language generally cause, or at least often seem to cause, more contention, distress, and alienation than they prevent.

I am sympathetic to both of these arguments. I acknowledge that individual perceptions and biases create significant obstacles to improved communication, and I will agree, albeit with some reluctance and qualifications, that oftentimes, perhaps even in most cases, the subtle errors and differences in grammar (NB: I use the term “grammar” here in the broad, colloquial sense, to include other similar items such as spelling, syntax, and the like) which one is liable to find among native speakers of a similar background do not cause enough confusion or discord to warrant the often contentious process of correction.

Nevertheless, I cannot accept the conclusion that these minor dissensions must necessarily cause us to abandon the idea of universal understanding. For that is my end goal in my prescriptivist tendencies: to see a language which is consistent and stable enough to be maximally accessible, not only to foreigners, but more importantly, to those who struggle in grappling with language to express themselves. This is where my own personal experience comes into the story. For, despite my reputation for sesquipedalian verbosity, I have often struggled with language, in both acute and chronic terms.

In acute terms, I have struggled with even basic speech during times of medical trauma. To this end, ensuring that communication is precise and unambiguous has proven enormously helpful, as a specific and unambiguous question, such as “On a scale of zero to ten, how much pain would you say you are currently experiencing?” is vastly easier to process and respond to than one that requires me to contextualize an answer, such as “How are you?”.

In chronic terms, the need to describe subjective experiences relies on keen use of precise vocabulary, which, for success, requires a strong command of language on the part of all parties involved. For example, the differences between feeling shaky, dizzy, lightheaded, nauseated, vertiginous, and faint are subtle, but carry vastly different implications in a medical context. Shaky is a buzzword for endocrinology; dizzy is a catch-all, but most readily associated with neurology; lightheadedness points more often to respiratory issues; nausea has a close connection with gastroenterology; vertigo refers specifically to balance, which may be an issue for neurology, ophthalmology, or an ENT specialist; and faintness is usually tied to circulatory problems.

In such contexts, these subtleties are not only relevant, but critical, and the casual disregard of these distinctions will cause material problems. The precise word choice used may, to use an example from my own experience, determine whether a patient in the ER is triaged as urgent, which in such situations may mean the difference between life and death. This is an extreme, albeit real, example, but the same dynamic can and will play out in other contexts. In order to prevent and mitigate such issues, there must be an accepted standard common to all for the meaning and use of language.

I should perhaps clarify that this is not a manifesto for hardcore prescriptivism. Such a standard is only useful insofar as it is used and accepted, and insofar as it continues to be common and accessible. Just as laws must from time to time be updated to reflect changes in society, and to address new concerns which were not previously foreseen, so too will new words, usages, and grammar inevitably need to be added, and obsolete forms simplified. But this does not negate the need for a standard. Descriptivism, labeling language as inherently chaotic and abandoning attempts to further understanding through improved communication, is a step backwards.

Heroes and Nurses

Since I published my last post about being categorically excluded from the nursing program of the university I am applying to, I have had many people insist that I ought to hold my ground on this one, even going so far as to file a legal complaint if that’s what it takes. I should say upfront that I appreciate this support. I appreciate having family and friends who are willing to stand by me, and I appreciate having allies who are willing to defend the rights of those with medical issues. It is an immense comfort to have people like this in my corner.

That firmly stated, there are a few reasons why I’m not fighting this right now. The first is pragmatic: I haven’t gotten into this university yet. Obviously, I don’t want the first impression of a school I hope to be admitted into to be a lawsuit. Moreover, there is some question of standing. Sure, I could try to argue that being deterred from applying by their online statements on account of my medical condition constitutes discrimination in and of itself, but without a lot more groundwork to establish my case, it’s not completely open and shut. This could still be worth it if I were terribly passionate about nursing as a life path, which brings me to my second primary reason.

I’m not sure whether nursing would be right for me. Now, to be clear, I stand by my earlier statement that nursing is a career I could definitely see myself in, and which I think represents a distinct opportunity for me. But the same thing is true of several other careers: I think I would also find fulfillment as a researcher, or a policy maker, or an advocate. Nursing is intriguing and promising, but not necessarily uniquely so.

But the more salient point, perhaps, is that the very activities which are dangerous to me specifically, the reasons why I am excluded from the training program, the things which I would have to be very careful to avoid in any career as a nurse for my own safety and that of others, are the very same things that I feel attracted to in nursing.

This requires some unpacking.

Throughout my childhood, my mother has often told me stories of my great-grandfather. The tales, nay, legends, of this man portray him as a larger-than-life figure with values and deeds akin to those of a classical hero of a bygone era. As the story goes, my great-grandfather, when he was young, was taken ill with rheumatic fever. Deathly ill, in fact, to the point where the doctors told his parents that he would not survive, and the best they could do was to make him comfortable in his final days.

So weak was he that each carriage and motorcar that passed on the normally busy street outside wracked him with pain. His parents, who were wealthy and influential enough to do so, had the local government close the street. He languished this way for more than a year. And then, against all odds and expectations, he got better. It wasn’t a full recovery, as he still bore the scars on his heart and lungs from the illness. But he survived.

He was able to return to school, albeit at the point where he had left off, which was by now a year behind his peers. He not only closed this gap, but in the end actually skipped a grade and graduated early (Sidenote: if ever I have held unrealistically high academic expectations for myself, or failed to cut myself enough slack with regard to my own handicaps, this is certainly part of the reason why). After graduating, he went on to study law.

When the Second World War reared its ugly head, my great-grandfather wanted to volunteer. He wanted to, but couldn’t, because of his rheumatic fever. Still, he wanted to serve his country. So he reached out to his contacts, including a certain fellow lawyer by the name of Bill Donovan, who had just been tasked by President Roosevelt with forming the Office of Strategic Services, a wartime intelligence agency meant to bring all the various independent intelligence and codebreaking organizations of the armed services under one roof. General Donovan saw to it that my great-grandfather was given an exemption from the surgeon general in order to be appointed as an officer in the OSS.

I still don’t know exactly what my great grandfather did in the war. He was close enough to Donovan, who played a large enough role in the foundation of the modern CIA, that many of the files are still classified, or at least redacted. I know that he was awarded a variety of medals, including the Legion of Merit, the Order of the British Empire, and the Order of the White Elephant. Family lore contends that the British Secret Service gave him the code number 006 for his work during allied intelligence operations.

I know from public records, among many other fascinating tidbits, that he provided information that was used as evidence at the Nuremberg Trials. I have read declassified letters that show that he maintained a private correspondence with, among other figures, a certain Allen Dulles. And old digitized congressional records show that he was well-respected enough in his field that he was called by the defense in hearings before the House Un-American Activities Committee, where his word as an intelligence officer was able to vindicate former colleagues who were being implicated by the testimony of a female CPUSA organizer and admitted NKVD asset.

The point is, my great-grandfather was a hero. He moved among the giants of the era. He helped to bring down the Nazis (the bad guys), to bring them to justice, and to defend the innocent. Although I have no conclusive evidence that he was ever, strictly speaking, in danger, since public records are few and far between, it stands to reason that receiving that many medals requires some kind of risk. He did all this despite having no business in the military because of his rheumatic fever. Despite being exempt from the draft, he felt compelled to do his bit, and he did so.

This theme has always had an impact on me. The idea of doing my bit has had a profound, even foundational effect on my philosophy, both in my sense of personal direction, and in my larger ideals of how I think society ought work. And this idea has always been a requirement of any career that I might pursue.

To my mind, the image of nursing, the part that I feel drawn to, is that image used by the World Health Organization, the Red Cross, and the various civil defence and military auxiliary organizations, of the selfless heroine who courageously breaks with her station as a prim and proper lady in order to provide aid and comfort to the boys at the front serving valiantly Over There while the flag is raised in the background to a rising crescendo of your patriotic music of choice. Or else, of the humanitarian volunteer working in a far-flung outpost, diligently healing those huddled masses yearning to breathe free as they flee conflict. Or possibly of the brave health workers in the neglected tropical regions, serving as humanity’s first and most critical line of defence against global pandemic.

Now, I recognize, at least consciously, that these images are, at best, outdated, romanticized images that represent only the most photogenic, if the most intense, fractions of the real work being done by nurses; and at worst are crude, harmful stereotypes that only serve to exacerbate the image problem that has contributed to the global nurse shortage. The common denominator in all of these is that they are somehow on the “front lines”; that they present nursing as a means to save the world, if not as an individual hero, then certainly as part of a global united front. They represent the most stereotypically heroic, most dangerous aspects of the profession, and, relevant to my case, the very portions which would be prohibitively dangerous to an immunocompromised person.

This raises some deep personal questions. Obviously, I want and intend to do my bit, whatever that may come to mean in my context. But with regards to nursing, am I drawn to it because it is a means to do my bit, or because it offers the means to fit a kind of stereotypical hero archetype that I cannot otherwise fit, by virtue of my exclusion from the military, astronaut training, etc. (and probably could not as a nurse, for similar reasons)? And the more salient question: if we assume that the more glamorous (for sore lack of a better word) aspects of nursing are out of the question (and given the apparent roadblocks to my even entering the training program, it certainly seems reasonable to assume that such restrictions will be compelled regardless of my personal attitudes towards the risks involved), am I still interested in pursuing the field?

This is a very difficult question for me to answer, and the various ways in which it can be construed and interpreted make this all the more difficult. For example, my answer to the question “Would you still take this job if you knew it wasn’t as glamorous day to day as it’s presented?” would be very different from my answer to the question “Would you still be satisfied knowing that you were not helping people as much as you could be with the training you have, because your disability was holding you back from contributing in the field?” The latter question also spawns more dilemmas, such as “When faced with an obstacle related to a disability, is it preferable to take a stand on principle, or to cut losses and try to work out a minimally painful solution, even if it means letting disability and discrimination slide by?” All big thematic questions. And if they were not so relevant, I might enjoy idly pondering them.

Byronic Major

I’ve tried to write some version of this post three times now, starting from a broad perspective and slowly focusing in on my personal complaint, bringing in different views and sides of the story. Unfortunately, I haven’t managed to finish any of those. It seems the peculiar nature of my grievance on this occasion lends itself more easily to a sort of gloomy malaise liable to cause antipathy and writer’s block than the kind of righteous indignation that propels good essays.

Still, I need to get these points off my chest somehow. So I’m opting for a more direct approach: I’m upset. There are many reasons why I’m upset, but the main ones pertain to trying to apply to college. I get the impression from my friends who have had to go through the same that college applications may just be a naturally upsetting process. In a best case scenario, you wait in suspense for several weeks for a group of strangers to pass judgement on your carefully-laid life plans; indeed, on your moral character.

Or, if you’re me, you’ve had enough curveballs in your life so far that the pretense of knowing what state you’ll be in and what to do a year from now, let alone four years from now and for the rest of your life, seems ridiculous to the verge of lunacy. So you pull your hair and grit your teeth, and flip coins to choose majors because the application is due in two hours and you can’t pick undecided. So you write post-hoc justifications for why you chose that major, hoping that you’re a good enough writer that whoever reads it doesn’t see through your bluff.

Although certainly anxiety-inducing, this isn’t the main reason why I’m upset. I just felt it needed to be included in the context here. While I was researching majors to possibly pick, I came across nursing. This is a field in which I have a fair amount of experience. After all, I spent more time in school in the nurse’s office than in a classroom. I happen to know that there is a global shortage of nurses; more pronounced, indeed, than the shortage of doctors. As a result, not only are there plenty of open jobs with increasing wages and benefits, but there are a growing number of scholarship opportunities and incentive programs for training.

Moreover, I also know that there is an ongoing concerted effort in the nursing field to attempt to correct the staggering gender imbalance, which came about as a result of Florence Nightingale’s characterization of nursing as the stereotypically feminine activity; a characterization which in recent years has become acutely harmful to the field. Not only has this characterization discouraged young men who might be talented in the field, and created harmful stereotypes, but it has also begun to have an effect on women who seek to establish themselves as independent professionals. It seems the “nursing is for good girls” mentality has caused fewer “good girls”, that is, bright, driven, professional women, to apply to the field, exacerbating the global shortage.

In other words, there is a major opportunity for people such as myself to do some serious good. It’s not as competitive or high pressure as med school, and there are plenty of nursing roles that aren’t exposed to contagion, and so wouldn’t be a problem for my disability. The world is in dire need of nurses, and gender is no longer a barrier. Nursing is a field that I could see myself in, and would be willing to explore.

There’s just one problem: I’m not allowed into the program. My local university, or more specifically, the third-party group they contract with to administer the program, has certain health requirements in order to minimize liability. Specifically, they want immune titers (which I’ve had done before, and which have never failed to come back deficient).

I understand the rationale behind these restrictions, even if I disagree with them for personal reasons. It’s not a bad policy. Though it is clichéd to say, I’m not angry so much as disappointed. And even then, I’m not sure precisely with whom it is that I find myself disappointed.

Am I disappointed with the third-party contractor for setting workplace safety standards to protect both patients and students, and to adhere to the law in our litigious society? With the university, for contracting with a third party in the aim of giving its students hands-on experience? With the law, for having such high standards of practice for medical professionals? I find it hard to find fault, even accidental fault, with any of these entities. So what, then? Am I upset with myself for being disabled, and for wanting to help others as I have been helped? Maybe; probably, at least a little bit. With the universe, for being such that bad outcomes happen simply as a result of circumstance? Certainly. But raging at the heavens doesn’t get me anywhere.

I know that I’m justified in being upset. My disability is preventing me from helping others and doing good: that is righteous anger if ever there was a right reason to be angry. A substantial part of me wants to be upset; to refuse to allow anyone or anything to stand in the way of my doing what I think is right, or to dictate the limits of my abilities. I want to be a hero, to overcome the obstacles in my path, to do the right thing no matter the cost. But I’m not sure in this instance the obstacles need to be overcome.

I don’t know where that leaves me. Probably something about a tragic hero.

2018 Resolution #3

2018 Resolution #3: Get back to exercising

Around spring of this past year I began, as a means of giving myself some easily-achievable goals, a loose program of regular exercise, chiefly in the form of regular walks. Although this simple routine did not give me, to borrow a phrase from the magazines I pass at the checkout counter, “a hot summer bod”, it did get me out of the house at a time when I needed it, and helped build up my stamina in order to withstand our summer travel itinerary.

Despite my intentions, I fell out of this new habit after mid-November, and have not managed to get back into it. In my defense, my normal walking route from my house through town lacks sidewalks, and the lawns which I normally walk through are covered in snow. Our house is populated and organized in such a way that even if I possessed proper exercise equipment, there would be no place to put it.

Going to a gym does not strike me as a practical alternative. To put it simply, there is no gym close enough to drop by under casual pretenses. This is problematic for two reasons. First, an intense routine on a set schedule that requires a great deal of preparation and investment is more or less contraindicated by my medical situation, which has a distinct tendency to sabotage any such plans.

Secondly, such a routine would clash with the lies that I tell myself. In executing my more casual routine, I have found that, in motivating myself, it is often necessary, or at least helpful, to have some mental pretext that does not involve exercise directly. If I can pitch getting out of the house to myself instead as a sightseeing expedition, or as a means of participating in town society by means of my presence, it is much easier to motivate myself without feeling anxious.

Accordingly, my resolution for the coming year is to exercise more later in the year when I can. Admittedly this is a weak goal, with a lot of wiggle room to get out of. And I might be more concerned about that, except that this was basically the same thing that I did last year, and at least that time, it worked.

The War on Kale

I have historically been anti-kale. Not that I object to the taste of kale. I eat kale in what I would consider fairly normal amounts, and have done so even while denouncing it. My enmity towards kale is not towards the species Brassica oleracea, cultivar group Acephala. Rather, my hostility is towards that set of notions and ideas for which kale has become a symbol and shorthand in recent years.

In the circles which I frequent, at least insofar as kale is known of at all, it is known as a “superfood”, which, I am to understand, means that it is exceptionally healthy. It is touted, by those who are inclined to tout their choices in vegetables, as being an exemplar of the kinds of foods that one ought to eat constantly. That is to say, it is touted as a staple for diets.

Now, just as I have nothing against kale, I also have nothing against diets in the abstract. I recognize that one’s diet is a major factor in one’s long term health, and I appreciate the value of a carefully tailored, personalized diet plan for certain medical situations as a means to an end.

In point of fact, I am on one such plan. My diet plan reflects my medical situation, which seems to have the effect of keeping me always on the brink of being clinically underweight, and far below the minimum weight which my doctors believe is healthy for me. My medically-mandated diet plan calls for me to eat more wherever possible; more food, more calories, more fats, proteins, and especially carbohydrates. My diet does not restrict me from eating more, but prohibits me from eating less.

Additionally, because my metabolism and gastrointestinal system are so capricious as to prevent me from simply eating more of everything without becoming ill and losing more weight, my diet focuses on having me eat the highest density of calories that I can get away with. A perfect meal, according to my dietician, nutritionist, endocrinologist, and gastroenterologist, would be something along the lines of a massive double burger (well done, per immunologist request), packed with extra cheese, tomatoes, onions, lots of bacon, and a liberal helping of sauce, with a sizable portion of fries, and a thick chocolate malted milkshake. Ideally, I would have this at least three times a day, preferably with a couple more snacks throughout the day.

Here’s the thing: out of all the people who will eventually read this post, only a very small proportion will ever need to be on such a diet. An even smaller proportion will need to stay on this diet outside of a limited timeframe to reach a specific end, such as recovering from an acute medical issue, or bulking up for some manner of physical challenge. This is fine. I wouldn’t expect many other people to be on a diet tailored by a team of medical specialists precisely for me. Despite the overly simplistic terms used in public school health and anatomy classes, every body is subtly (or in my case, not so subtly) different, and has accordingly different needs.

Some people, such as myself, can scarf 10,000 calories a day for a week with no discernible difference in weight from if they had eaten 2,000. Other people can scarcely eat an entire candy bar without having to answer for it at the doctor’s office six months later. Our diets will, and should, be different to reflect this fact. Moreover, neither the composition of our respective diets, nor particularly their effectiveness, is at all indicative of some kind of moral character.

This brings me back to kale. I probably couldn’t have told you what kale was before I had fellow high schoolers getting in my face about how kale was the next great superfood, and how if only I were eating more of it, maybe I wouldn’t have so many health problems. Because obviously turning from the diet plan specifically designed by my team of accredited physicians in favor of the one tweeted out by a celebrity is the cure that centuries of research and billions in funding have failed to unlock.

What? How dare I doubt its efficacy? Well obviously it’s not going to “suppress autoimmune activation”, whatever that means, with my kind of attitude. No, of course you know what I’m talking about. Of course you know my disease better than I do. How dare I question your nonexistent credentials? Why, just last night you watched a five-minute YouTube video with clip-art graphics showing how this diet = good and others = bad. Certainly that trumps my meager experience of a combined several months of direct instruction and training from the best healthcare experts in their respective fields, followed by a decade of firsthand self-management, hundreds of hours of volunteer work, and more participation in clinical research than most graduate students. Clearly I know nothing. Besides, those doctors are in the pockets of big pharma; the ones that make those evil vaccines and mind control nanobots.

I do not begrudge those who seek to improve themselves, nor even those who wish to help others by the same means through which they have achieved success themselves. However, I cannot abide those who take their particular diet as the new gospel, and try to see it implemented as a universal morality. Nor can I stand the insistence of those with no medical qualifications telling me that the things I do to stay alive, including my diet (things in which they have the distinct privilege of choice), are not right for me.

I try to appreciate the honest intentions here where they exist, but frankly I cannot put up with someone who has never walked in my shoes criticizing my life support routine. My medical regimen is not a lifestyle choice any more than breathing is, and I am not going to change either of those things on second-hand advice received in a yoga lesson, or a TED talk, or even a public school health class. I cannot support a movement that calls for the categorical elimination of entire food groups, nor a propaganda campaign against the type of restaurant that helps me stick to my diet, nor the taxation of precisely the kind of foodstuffs which I have been prescribed by my medical team.

With no other option, I can do nothing but vehemently oppose this set of notions pertaining to the new cult of the diet, as I have sometimes referred to it, and its most prominent and recognizable symbol: kale. Indeed, in collages and creative projects in which others have encouraged me to express myself, the phrases “down with kale” and “death to kale”, with accompanying images of scratched-out pictures of kale and other vegetables, have featured prominently. I have one such collage framed and mounted in my bedroom as a reminder of all the wrongs which I seek to right.

This is, I will concede, something of a personal prejudice. Possibly even a stereotype. The kind of people that seem most liable to pierce my bubble and confront me over my diet tend to be the self-assured, zealous sort, and so it seems quite conceivable that I may be experiencing some kind of selection bias that causes me to see only the absolute worst in my interlocutors. It is possible that, in my ideo-intellectual intifada against kale, I have thrown the baby out with the bathwater. In honesty, even if this were true, I probably wouldn’t apologize, on the grounds that what I have had to endure has been so upsetting, with the stakes being my own life and death as they are, that my reaction has been not only justified, but correct.

As a brief aside, there is, I am sure, a great analogy to be drawn here, and an even greater deal of commentary to be drawn on this last train of thought as a reflection of the larger modern socio-political situation: refusing to acknowledge wrongdoing despite being demonstrably in the wrong. Such commentary might even be more interesting and relevant than the post I am currently writing. Nevertheless, such musings are outside the scope of this particular post, though I may return to them in the future.

So my position has not changed. I remain convinced that all of my actions have been completely correct. I have not, and do not plan, to renounce my views until such time as I feel I have been conclusively proven wrong, which I do not feel has happened. What has changed is I have been given a glimpse at a different perspective.

What happened is that someone close to me received a new diagnosis of a disease close in pathology to one that I have, and which I am also at higher risk for, which prevents her from eating gluten. This person, who will remain nameless for the purposes of this post, is as good as a sister to me, and the rest of her immediate family are like my own. We see each other at least as often as I see other friends or relations. Our families have gone on vacation together. We visit and dine together regularly enough that any medical issue that affects their kitchen also affects our own.

Now, I try to be an informed person, and prior to my friend’s diagnosis, I was at least peripherally aware of the condition with which she now has to deal. I could have explained the disease’s pathology, symptoms, and treatment, and I probably could have listed a few items that did and did not contain gluten, although this last one is more a consequence of gazing forlornly at the shorter lines at gluten-free buffets at the conferences I attended than of any genuine intent to learn.

What I had not come to appreciate was how difficult it was to find food that was not only free from gluten in itself, but completely safe from any trace of cross-contamination, which, I have learned, does make a critical difference. Many brands and restaurants offer items that are labeled as gluten free in large print, but then in smaller print immediately below disclaim all responsibility for the results of the actual assembly and preparation of the food, and indeed, for the integrity of the ingredients received from elsewhere. This is, of course, utterly useless.

Where I have found the needed assurances, however, is among those for whom this purity is a point of pride. These are the suppliers that also proudly advertise that they do not stock items containing genetically modified foodstuffs, or any produce that has been exposed to chemicals. These are the people who proclaim the supremacy of organic food and vegan diets. They are scrupulous about making sure their food is free of gluten not just because it is necessary for people with certain medical conditions, but as a matter of moral integrity. To them these matters are not only practical but ethical. In short, these are kale supporters.

This puts me in an awkward position intellectually. On the one hand, the smug superiority with which these kale supporters denounce, on the basis of pseudoscience and the sort of dietary pickiness outlined above, technologies that have great potential to decrease human hardship, is grating at best. On the other hand, they are among the only people who seem to be invested in providing decent quality gluten-free produce which they are willing to stand behind, and though I would trust them on few other things, I am at least willing to trust that they have been thorough in their compulsiveness.

Seeing the results of this attitude I still detest from this new angle has forced me to reconsider my continued denouncements. The presence of a niche gluten-free market, which is undoubtedly a recent development, has, alas, not been driven by increased sensitivity to those with specific medical dietary restrictions, but by the fact that my friend’s medical treatment just so happens to align with a subcategory of fad diet. That this niche market exists is a good thing, and it could not exist without kale supporters. The very pickiness that I malign has paved the way for a better quality of life for my comrades who cannot afford to be otherwise. The evangelical attitude that I rage against has also successfully demanded that the food I am buying for my friend be safe for her to eat.

I do not yet think that I have horribly misjudged kale and its supporters. But regardless, I can appreciate that in this matter, they have a point. And I consider it more likely now that I may have misjudged kale supporters on a wider front, or at least, that my impression of them has been biased by my own experiences. I can appreciate that in demanding a market for their fad diets, that they have also created real value.

I am a stubborn person by nature once I have made up my mind, and so even these minor and measured concessions are rather painful. But fair is fair. Kale has proven that it does have a purpose. And to that end I think it is only fitting that I wind down my war on kale. This is not a total cessation of all military action. There are still plenty of nutritional misconceptions to dispel, and bad policies to refute, and besides that I am far too stubborn to even promise with a straight face that I’m not going to get into arguments about a topic that is necessarily close to my heart. But the stereotype which I drew up several years ago as a common thread between the people who would pester me about fad diets and misconceptions about my health has become outdated and unhelpful. It is, then, perhaps time to rethink it.

On Horror Films

Recently, I was confronted with a poll regarding my favorite horror film. This was only slightly awkward, as, of the films listed as options, I had seen… none.

Broadly speaking, I do not see fit to use my personal time to make myself experience negative emotions. Also, since the majority of horror films tend to focus on narrow, contrived circumstances and be driven by a supernatural, usually vaguely biblical demon, I find it difficult to suspend disbelief and buy into the premise. To me, the far better horror experiences have been disaster films, in particular those like Threads or By Dawn’s Early Light. Also certain alternate history films, in particular the HBO film Fatherland, which did more to get across the real horror of the Holocaust and genocide to thirteen-year-old me than six months of social studies lessons.

As it happens, the only bona fide horror film I’ve seen was something about Satan coming to haunt elevator-goers for their sins. Honestly I thought it was exceedingly mediocre at best. However, I saw this film at a birthday party for a friend of mine, the confidant of a previous crush. I had come to know this girl after she transferred to our public middle school from the local Catholic school. We saw this film at her birthday party, which was, in the manner of things, perceived as the very height of society, in the presence of an overwhelmingly female audience, most of whom my friend had known from St. Mary’s. Apparently to them the film was excellent, as many professed to be quite scared, and it remained the subject of conversation for some months afterward.

I have come to develop three alternative hypotheses for why everyone but myself seemed to enjoy this distinctly mediocre film. The first is that I am simply not a movie person and was oblivious to the apparent artistic merit of this film. This would fit existing data, as I have similarly ambiguous feelings towards many types of media my friends generally seem to laud. This is the simplest explanation, and thus the null hypothesis which I have broadly accepted for the past half-decade or so.

The second possible explanation is that, since the majority of the audience except for myself was Catholic, attended Catholic Church, and had gone to the Catholic primary school in our neighborhood, and because the film made several references to Catholic doctrine and literature, to the point that several times my friend had to lean over and whisper the names and significance of certain prayers or incantations, the film carried extra weight for those besides myself. Perhaps I lacked the necessary background context to understand what the creators were trying to reach for. Perhaps my relatively secular and avowedly skeptical upbringing had desensitized me to this specific subset of supernatural horror, while the far more mundane terrors of war, genocide, and plague fill much the same role in my psyche.

The third alternative was suggested to me by a male compatriot, who was not in attendance but was familiar with all of the attendees, several years after the fact, and subsequently corroborated by testimony from both male and female attendees. The third possibility is that my artistic assessment at the time was not only entirely on point, but was the silent majority opinion, yet that this opinion was suppressed consciously or unconsciously for social reasons. Perhaps, it has been posited to me, the appearance of being scared was for my own benefit? Going deeper, perhaps some or all of the motivation to see a horror film at a party of both sexes was not entirely platonic?

It is worth distinguishing, at this point, the relative numbers and attitudes of the various sexes. At this party, there were a total of about twenty teenagers. Of this number, there were three or four boys (my memory fails me as to exact figures), including myself. I was on the guest list from the beginning as a matter of course; I had been one of the birthday girl’s closest friends since she arrived in public school, and perhaps more importantly, her parents had met and emphatically approved of me. In fact I will go so far as to suggest that the main reason this girl’s staunchly traditionalist, conservative parents permitted their rebellious teenage daughter to invite boys over to a birthday party was because they trusted me, and believed my presence would be a moderating influence.

Also among the males in attendance were the brother of one of the popular socialite attendees, whose love of soap operas and celebrity gossip, and general stylistic flamboyance, had convinced everyone concerned that he was not exactly straight; my closest friend, who was as passive and agreeable a teenager as you will ever have the pleasure to know; and a young man whose politics I staunchly disagreed with and who would later go on to have an eighteen-month, on-and-off relationship with the birthday girl, though he did not know it at the time.

Although I noticed this numerical gender discrepancy effectively immediately, at no point did it occur to me that, were I so motivated, I could probably have leveraged these odds into some manner of romantic affair. This, despite what could probably be reasonably interpreted as numerous hints to the effect of “Oh look how big the house is. Wouldn’t it be so easy for two people to get lost in one of these several secluded bedrooms?”

Although I credit this obliviousness largely to the immense respect I maintained for the host’s parents and the sanctity of their home, I must acknowledge a certain level of personal ignorance owing mainly to a lack of similar socialization, and also to childhood brain damage. This acute awareness of my own past, and in all likelihood, present, obliviousness to social subtleties is part of why I am so readily willing to accept that I might have easily missed whatever aspect of this film made it so worthwhile.

In any case, as the hypothesis goes, this particular film was in fact mediocre, just as I believed at the time. However, unlike myself with my single-minded judgement based solely on the artistic merits and lack thereof of the film, it is possible that my female comrades, while agreeing in the abstract with my assessment, opted instead to be somewhat more holistic in their presentation of opinions. Or to put it another way, they opted to be socially opportunistic in the ability to signal their emotional state. As it was described to me, my reaction would then, at least in theory, be to attempt to comfort and reassure them. I would assume the stereotypical role of male defender, and the implications therewith, which would somehow transmogrify into a similarly-structured relationship.

Despite the emphatic insistence of most involved parties, with no conclusive confession, I remain particularly skeptical of this hypothesis, though admittedly it does correlate with existing psychological and sociological research on terror-induced pair-bonding. I doubt I shall ever truly understand the horror genre. It would be easy to state categorically that there is no merit to trying to induce negative emotions without cause, and that those who wish to use such experiences as a cover for other overtures ought simply get over themselves, but given that, as things go, this is an apparently victimless crime, and seems to bring a great deal of joy to some people, it is more likely that the issue lies in myself rather than in the rest of the world.

To a person who seeks to understand the whole truth in its entirety, the notion that there are some things that I simply do not have the capacity to understand is frustrating. Knowing that there are things which other people can comprehend, yet I cannot, is extremely frustrating. More than frustrating; it is horrifying. To know that there is an entire world of subtext and communication that is lost to me; that my brain is damaged in such a way that I am oblivious to things that are supposed to be obvious, is disconcerting to the point of terrifying.

I will probably never know the answer to these questions, as at this point I am probably the only one who yet bothers to dwell on that one evening many moons ago. It will remain in my memory an unsolved mystery, and a reminder that my perception is faulty in ways imperceptible to me, but obvious to others. It might even be accurate to say that I will remain haunted by this episode.

Happy Halloween.

My Superpowers

So, I don’t know if I mentioned this, but I have a minor superpower. Not the cyborg stuff. That exists, but isn’t really a power so much as a bunch of gadgets I wear to keep me alive. Nor any of the intellectual or creative abilities it has been alleged that I possess, for those are both ordinary in the scope of things, and also subjective. Rather I refer to my slight clairvoyance. I can sense changes in the weather. I have had this ability referred to as “my personal barometer”, but in truth it often functions more like a “personal air-raid siren”; specifically one that can’t be shut up.

Near as I can tell, this is related to pressure changes, and happens because something, somewhere inside me, is wired wrong. I have been told that my sinuses are out of order in such a way as to make me vulnerable to comparatively minor changes in pressure, and strong circumstantial evidence suggests damage somewhere in my nervous system, caused by childhood encephalitis, which creates the microscopic, undetectable vulnerability that manifests in my seizures and migraines, and could plausibly be exploited by other factors.

This has the effect of allowing me to feel major weather changes somewhere between six hours and a week before they arrive where I am, depending on the size and speed of the shift. It starts as a mild bout of light-headedness, the same as the rush of blood flowing away from my head when standing up after not moving for some time. If the disturbance is relatively minor, this may be all that I feel.

It then grows into a more general feeling of flu-like malaise; the same feeling that normally tells one that one is sick, even in the absence of any active symptoms. At this point, my cognitive function begins to seriously degrade. I start to stutter and stumble, and struggle for the words that are on the tip of my tongue. I forget things and lose track of time. I will struggle both to get to sleep, and to wake up.

Depending on the severity and duration, these symptoms may be scarcely visible, or they may have me appearing on death’s door. It is difficult to tell these symptoms apart from those of allergies, migraines, or an infection, especially once I begin to experience chills and aches. This is compounded by my immune system’s proclivity to give false negatives for pathology, due to my immunodeficiency, and false positives, due to my autoimmune responses. Fortunately, the end result is mostly the same: I am advised to stay home, rest, make sure I eat and drink plenty, redouble our protective quarantine procedures, etcetera.

At their worst, these symptoms also induce a cluster migraine, which confines me to bed and limits my ability to process and respond to stimuli to a level only slightly better than comatose. At this point, my symptoms are a storm unto themselves, and, short of a hurricane, I’m probably not going to be much concerned with whatever is happening outside the confines of my room, as I’ve already effectively sealed myself off from the outside world. I will remain so confined for however long it takes until my symptoms pass. This may be a few hours, or a few weeks. During these days, my use of language is limited to a couple hundred words a day, only forty or so of which are unique.

If I am lucky, I will still have the mental faculties to passively watch videos, listen to music with words, and occasionally write a handful of sentences. I generally cannot read long tracts, as reading requires several skills simultaneously – visual focus, language processing, inner narration, and imagination of the plot – a combination that is usually beyond my limits. I can sometimes get by with audiobooks, provided the narration is slow enough and the plot not overly complex. If I am not able to deal with words, then I am limited to passing my waking hours listening to primarily classical music. Fortunately, I also tend to sleep a great deal more in this state.

Once I have entered this state, my superpower (or perhaps it is an unsung quirk of human perception) means that I don’t really consciously recognize time passing in the normal way. Without discrete events, sensations, or thoughts to mark time, the days all kind of meld together. With my shades closed, my light permanently off, and my sleep cycle shattered, days and nights lose their meaning. Every moment is the same as every other moment.

Thus, if it takes two weeks by calendar until I am well enough to return to normal function, I may wake up with only two or three days worth of discrete memories. And so in retrospect, the time that took other people two weeks to pass took me only three days. It therefore emerges that in addition to my limited form of clairvoyance, I also possess a limited form of time travel.

Admittedly, I am not great at controlling these powers. I have virtually no control over them, except some limited ability to treat the worst of the symptoms as they come up. So perhaps it is that they are not so much my powers as they are powers that affect me. They do not control me, as I still exist, albeit diminished, independent and regardless of them. They do affect others, but only through how they affect me.

All of this to say, the storms that are presently approaching the northeastern United States are having a rather large impact on my life at present. If I were of a more superstitious bent, I might suggest that this is meant as a way to sabotage my plans to get organized and generally rain on my parade (cue canned laughter).

There isn’t a great deal that I can do to work around this, any more than a blind man can work around a print book. The best I can hope for is that this is a “two steps forward, one step back” situation, which will also depend on how quickly this storm clears up, and on me being able to hit the ground running afterwards.

There is Power in a Wristband


This post is part of the series: The Debriefing. Click to read all posts in this series.


Quick note: this post contains stuff that deals with issues of law and medical advice. While I always try to get things right, I am neither a doctor nor a lawyer, and my blog posts are not to be taken as such advice.

Among people I know for whom it is a going concern, medical identification is a controversial subject. For those not in the know, medical identification is a simple concept. The idea is to have some sort of preestablished method to convey to first responders and medical personnel the presence of a condition which may either require immediate, specific, treatment (say, a neurological issue that requires the immediate application of a specific rescue medication), or impact normal treatment (say, an allergy to a common drug) in the event that the patient is incapacitated.

The utilitarian benefits are obvious. In an emergency situation, where seconds count, making sure that this information is discovered and conveyed can, and often does, make the difference between life and death, and prevent delays and diversions that are costly in time, money, and future health outcomes. The importance of this element cannot be overstated. There are also purported legal benefits to having pertinent medical information easily visible for law enforcement and security to see. On the other hand, some will tell you that this is a very bad idea, since it gives legal adversaries free evidence about your medical conditions, which is something they’d otherwise have to prove.

The arguments against are equally apparent. There are obvious ethical quandaries in compelling a group of people to identify themselves in public, especially as in this case it pertains to normally confidential information about medical and disability status. And even where the macro-scale political considerations do not enter into it, there are the personal considerations. Being forced to make a certain statement in the way one dresses is never pleasant, and retaining that mode of personal choice and self-expression can make the risk of exacerbated medical problems down the line seem like a fair trade-off.

I can see both sides of the debate here. Personally, I do wear some medical identification at all times – a small bracelet around my left wrist – and have more or less continuously for the last decade. It is not so flamboyantly visible as some people would advise. I have no medical alert tattoos, nor embroidered jacket patches. My disability is not a point of pride. But it is easily discoverable should circumstances require it.

Obviously, I think that what I have done and continue to do is fundamentally correct and right, or at least, is right for me. To do less seems to me foolhardy, and to do more seems not worth the pains required. Those pains are not particularly logistical. Rather, they are the social cost of my disability always being the first impression and first topic of conversation.

It bears repeating that, though I am an introvert in general, I am not particularly bashful about my medical situation. Provided I feel sociable, I am perfectly content to speak at length about all the nitty gritty details of the latest chapter in my medical saga. Yet even I have a point at which I am uncomfortable advertising that I have a disability. While I am not averse to inviting empathy, I do not desire others to see me as a burden, nor for my disability to define every aspect of our interactions any more than the fact that I am left-handed, or brown-eyed, or a writer. I am perfectly content to mention my medical situation when it comes up in conversation. I do not think it appropriate to announce it every time I enter a room.

Since I feel this way, despite being literally a spokesman and disability advocate, it is easy to understand that there are many who do not feel it is appropriate for them to say even as much as I do. Some dislike the spotlight in general. Others are simply uncomfortable talking about a very personal struggle. Still others fear the stigma and backlash associated with any kind of imperfection and vulnerability, let alone one as significant as a bona fide disability. These fears are not unreasonable. The decision to wear medical identification, though undoubtedly beneficial to health and safety, is not without a tradeoff. Some perceive that tradeoff, rightly or wrongly, as not worth the cost.

Even though this position is certainly against standard medical advice, and I would never advocate that people go against medical advice, I cannot bring myself to condemn those who go against this kind of advice with the same definitiveness with which I condemn, say, refusing to vaccinate for non-medical reasons, or insurance companies compelling patients into certain medical decisions for economic reasons. The personal reasons, even though they are personal and not medical, are too close to home. I have trouble finding fault with a child who doesn’t want to wear an itchy wristband, or a teenager who just wants to fit in and make their own decisions about appearance. I cannot fault them for wanting what by all rights should be theirs.

Yet the problem remains. Without proper identification it is impossible for first responders to identify those who have specific, urgent needs. Without having these identifiers be sufficiently obvious and present at all times, the need for security and law enforcement to react appropriately to those with special needs relies solely on their training beforehand, and on them trusting the people they have just detained.

In a perfect world, this problem would be completely moot. Even in a slightly less than perfect world, where all these diseases and conditions still existed, but police and first responder training was perfectly robust and effective, medical identification would not be needed. Likewise, in such a world, the stigma of medical identification would not exist; patients would feel perfectly safe announcing their condition to the world, and there would be no controversy in adhering to the standard medical advice.

In our world, it is a chicken-egg problem, brought on by understandable, if frustrating, human failings at every level. Trying to determine fault and blame ultimately comes down to questioning the nitty gritty of morality, ethics, and human nature, and as such, is more suited to an exercise in navel gazing than an earnest attempt to find solutions to the problems presently faced by modern patients. We can complain, justifiably and with merit, that the system is biased against us. However such complaints, cathartic though they may be, will not accomplish much.

This vicious cycle, however, can be broken. Indeed, it has been broken before, and recently. Historical examples abound of oppressed groups coming to break the stigma of an identifying symbol, and claiming it as a mark of pride. The example that comes most immediately to mind is the recent progress that has been made by LGBT+ groups in eroding the stigma of terms which quite recently were used as slurs, and in appropriating symbols such as the pink triangle as a symbol of pride. In a related vein, the Star of David, once known as a symbol of oppression and exclusion, has come to be used by the Jewish community in general, and Israel in particular, as a symbol of unity and commonality.

In contrast to such groups, the road for those requiring medical identification is comparatively straightforward. The disabled and sick are already widely regarded as sympathetic, if pitiable. Our symbols, though they may be stigmatized, are not generally reviled. When we face insensitivity, it is usually not because the people in front of us are actively conspiring to deny us our needs, but simply because we may well be the first people they have encountered with these specific needs. As noted above, this is a chicken-and-egg problem: the less sensitive the average person is, the more likely a person with an easily hidden disability is to try to fly under the radar.

Imagine, then, if you can, such a world, where a medical identification necklace is as commonplace and unremarkable as a necklace with a religious symbol. Imagine scanning a parking lot and seeing stickers announcing the medical condition of a driver or passenger with the same regularity as advertisements for a political cause or a vacation destination. Try to picture a world where people are as unconcerned by the sight of durable medical equipment as by American flag apparel. It is not difficult to imagine. We are still a ways away from it, but it is within reach.

I know that this world is within reach, partly because I have seen the first inklings of it myself. I have spent time in this world, at conferences and meetings. At several of these conferences, wearing a colored wristband corresponding to one’s medical conditions is a requirement for entry, and there it is seen not as a symbol of stigma, but as one of empowerment. Wristbands are worn in proud declaration, amid short-sleeved shirts for walkathon teams, with medical devices bared for all the world to see.

Indeed, in this world, the medical ID bracelet is a symbol of pride. It is shown off amid pictures of fists clenched high in triumph and empowerment. It is shown off in images of gentle hands held in friendship and solidarity.

It is worth mentioning, with regard to this last point, that the system of wristbands is truly universal. That is to say, even those who have no medical afflictions whatsoever are issued wristbands, albeit in a different color. To those who are not directly afflicted, the wristband is a symbol of solidarity with those who are. Either way, it remains a positive symbol.

The difference between these wristbands, which are positive symbols, and ordinary medical identification, which is at best inconvenient and at worst oppressive, has nothing to do with the physical differences between them, and everything to do with the attitudes attached to them by both internal and external pressure. The wristbands, it will be seen, are a mere symbol, albeit a powerful one, onto which we project society’s collective feelings toward chronic disease and disability.

Medical identification is in itself morally neutral, but in its capacity as a symbol, it acts as a conduit that amplifies our existing feelings and anxieties about our condition. In a world where disabled people are discriminated against, left to go bankrupt buying the medication they need to survive, and even targeted by extremist groups, it is not hard to find legitimate anxieties to amplify in this manner. By contrast, in an environment where the collective attitude toward these issues is one of acceptance and empowerment, the projected feelings can be equally positive.

The Moral Hazard of Hope


This post is part of the series: The Debriefing.


Suppose that five years from today, you will receive an extremely large windfall. The exact number isn’t important, but let’s just say it’s large enough that you’ll never have to budget again. Not technically infinite, because that would break everything, but for the purposes of one person, basically undepletable. Let’s also assume that this money becomes yours in such a way that it can’t be taxed or swindled away from you. This is also an alternate universe where inheritance and estates don’t exist, so there’s no scheming among family, and no point in considering them in your plans. Just roll with it.

No one else knows about it, so you can’t borrow against it, nor is anyone going to treat you differently until you have the money. You still have to be alive in five years to collect and enjoy your fortune. Freak accidents can still happen, and you can still go bankrupt in the interim, or get thrown in prison, or whatever, but as long as you’re around to cash the check five years from today, you’re in the money.

How would this change your behavior in the interim? How would your priorities change from what they are?

Well, first of all, you’re probably not going to keep investing in retirement accounts, or in long-term savings in general. After all, you won’t need to. In fact, further saving would be foolish. You’re not going to need that extra drop in the bucket, which means saving it would be wasting it. You’re legitimately, economically better off living the high life and enjoying yourself as much as possible, short of putting yourself in such severe financial jeopardy that you would increase your chances of being unable to collect your money.

If this seems insane, it’s important to remember that your lifestyle and enjoyment are quantifiable economic factors (the keyword is “utility”) that weigh against the (relative and ultimately arbitrary) value of your money. This is the whole reason people buy things they don’t strictly need to survive, and why rich people spend more money than poor people despite not being physiologically different. Because any money you save is now basically worthless to you, and your happiness still has value, buying happiness, expensive and temporary though it may be, is always the economically rational choice.
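To make that concrete, here is a minimal toy sketch of the reasoning. The numbers are entirely my own assumptions for illustration (the windfall size, the yearly surplus, and the “fun” utility are made up, not drawn from any real analysis), and the logarithmic utility of wealth is just a standard stand-in for diminishing returns:

```python
# Toy model only: the numbers below (windfall size, yearly surplus, "fun"
# utility) are made up for illustration, not taken from any real analysis.
import math

WINDFALL = 50_000_000      # hypothetical "basically undepletable" sum
YEARS_TO_WINDFALL = 5
ANNUAL_SURPLUS = 10_000    # money you could either save or spend each year
FUN_PER_YEAR = 1.0         # assumed utility gained per year of spending the surplus

def lifetime_utility(spend_now: bool) -> float:
    """Utility from five years of choices plus log-utility of final wealth.

    Log utility captures diminishing returns: each extra dollar matters
    less the richer you already are.
    """
    enjoyment = FUN_PER_YEAR * YEARS_TO_WINDFALL if spend_now else 0.0
    savings = 0 if spend_now else ANNUAL_SURPLUS * YEARS_TO_WINDFALL
    return enjoyment + math.log(WINDFALL + savings)

print("spend it all:", round(lifetime_utility(True), 4))   # ~22.73
print("save it all: ", round(lifetime_utility(False), 4))  # ~17.73
# The saver's extra $50,000 barely moves log(50,000,000); the spender's
# five years of enjoyment dominate.
```

The specific numbers don’t matter; the point is that once the final sum is astronomically large, the marginal value of anything you could add to it rounds to zero.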

This is tied to an important economic concept known as moral hazard: a condition where the normal risks and costs involved in a decision fail to apply, encouraging riskier behavior. I’m stretching the idea a little here, since it usually refers to more direct situations. For example, if I have a credit card that my parents pay for, meant “for emergencies”, and I know I’m never going to see the bill, because my parents care more about our family’s credit score than about most anything I would think to buy, then that’s a moral hazard. I have very little incentive to do the “right” thing, and a lot of incentive to do whatever I please.

There are examples in macroeconomics as well. Many say, for instance, that large corporations in the United States are caught in a moral hazard problem, because they know that they are “too big to fail” and will be bailed out by the government if they get into serious trouble. As a result, these companies may be encouraged to make riskier decisions, knowing that any profits will be massive, and any losses will be passed along.

In any case, the idea is there. When the consequences of a risky decision become uncoupled from the reward, it can be no surprise when rational actors make riskier decisions. If you know that in five years you’re going to be basically immune to any hardship, you’re probably not going to prepare for the long term.

Now let’s take a different example. Suppose you’re rushed to the hospital after a heart attack, and diagnosed with a heart condition. The condition is minor for now, but could get worse without treatment, and will get worse as you age regardless.

The bad news is, in order to avoid having more heart attacks, and possible secondary circulatory and organ problems, you’re going to need to follow a very strict regimen, including a draconian diet, a daily exercise routine, and a series of regular injections and blood tests.

The good news, your doctor informs you, is that the scientists, who have been tucked away in their labs and getting millions in yearly funding, are closing in on a cure. In fact, there’s already a new drug that’s worked really well in mice. A researcher giving a talk at a major conference recently showed a slide of a timeline that estimated FDA approval in no more than five years. Once you’re cured, assuming everything works as advertised, you won’t have to go through the laborious process of treatment.

The cure won’t help if you die of a heart attack before then, and it won’t fix any damage to your other organs if your heart gets bad enough that it can’t supply them with blood, but otherwise it will be a complete cure, as though you were never diagnosed in the first place. The nurse discharging you tells you that most organ failure doesn’t appear until patients have been sick for at least a decade, so as long as you can avoid dying for half that long, you’ll be fine.

So, how are you going to treat this new chronic and life-threatening disease? Maybe you will be the diligent, model patient, always deferring to the most conservative and risk-averse advice in the medical literature, certainly hopeful for a cure, but not willing to bet your life on a grad student’s hypothesis. Or maybe, knowing nothing else on the subject, you will trust what your doctor told you, and your first impression of the disease, getting by with as little invasive treatment as you can get away with while still avoiding death, and avoiding being called out by your medical team as “noncompliant” (referred to in chronic illness circles, in hushed tones, as “the n-word”).

If the cure does come in five years, as happens only in stories and fantasies, then either way, you’ll be set. The second version of you might be a bit happier from having more fully sucked the marrow out of life. It’s also possible that the second version of you would have had to endure another (probably non-fatal) heart attack or two, and dealt with more day-to-day symptoms like fatigue, pains, and poor circulation. But you never would have really lost anything for being the n-word.

On the other hand, if by the time five years have elapsed the drug hasn’t gotten approval, or, quite possibly, hasn’t even gotten close after the researchers discovered that curing a disease in mice doesn’t also cure it in humans, then the differences between the two versions of you are going to start to compound. The gap may not even be noticeable after five years. But after ten, twenty, thirty years, the second version of you is going to be worse for wear. You might not be dead. But there’s a much higher chance you’re going to have had several more heart attacks, and possibly other problems as well.
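To see how that compounding works, here is a tiny illustrative calculation. The per-year risk figures are invented for the sake of the example and are not real cardiology statistics; the only point is the arithmetic of repeated exposure:

```python
# Toy arithmetic only: the per-year risk numbers are invented for
# illustration and are not real cardiology statistics.
STRICT_ANNUAL_RISK = 0.01  # assumed yearly chance of a serious event, strict regimen
LAX_ANNUAL_RISK = 0.03     # assumed yearly chance, lax regimen

def chance_of_event(annual_risk: float, years: int) -> float:
    """Probability of at least one event over the given number of years."""
    return 1 - (1 - annual_risk) ** years

for years in (5, 10, 20, 30):
    strict = chance_of_event(STRICT_ANNUAL_RISK, years)
    lax = chance_of_event(LAX_ANNUAL_RISK, years)
    print(f"{years:>2} yrs: strict {strict:5.1%}   lax {lax:5.1%}   gap {lax - strict:5.1%}")
# The gap between the two regimens widens with every additional year.
```

Even a small per-year difference, repeated for decades, turns into a large difference in the odds of ever having had another event, which is exactly the gap between the two versions of you.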

This is a case of moral hazard, plain and simple, and it does appear in the attitudes of patients with chronic conditions that require constant treatment. The fact that, in this case, the perception of a lack of risk and consequences is a complete fantasy is not relevant. All risk analyses depend on the information that is given and available, not on whatever the actual facts may be. We know that the patient’s decision is ultimately misguided because we know the information they are being given is false, or at least, misleading, and because our detached perspective allows us to take a dispassionate view of the situation.

The patient does not have this information or perspective. In all probability, they are starting out scared and confused, and want nothing more than to return to their previous normal life with as few interruptions as possible. The information and advice they were given, from a medical team that they trust, and possibly have no practical way of fact checking, has led them to believe that they do not particularly need to be strict about their new regimen, because there will not be time for long term consequences to catch up.

The medical team may earnestly believe this. It is the same problem one level up; the only difference is that their information comes from pharmaceutical manufacturers, who have a marketing interest in keeping patients and doctors optimistic about upcoming products, and from researchers, who may be unfamiliar with the hurdles of getting a breakthrough from early lab discovery to a consumer-available product, and whose funding depends on drumming up public support through hype.

The patient is also complicit in this system that lies to them. Nobody wants to be told that their condition is incurable, and that they will be chronically sick until they die. No one wants to hear that their new diagnosis means either dying early or living long enough for their organs to fail, because even the most rigid medical plan, with the tools currently available, cannot completely mimic the human body’s natural functions. Indeed, it can be argued that telling a patient they will still suffer long-term complications in ten, twenty, or thirty years, almost regardless of their actions today, will have much the same effect as telling them that they will be healthy regardless.

Given the choice between two extremes, optimism is obviously the better policy. But this policy does have a tradeoff. It creates a moral hazard of hope. Ideally, we would be able to convey an optimistic perspective that also maintains an accurate view of the medical prognosis, and balances the need for bedside manner with incentivizing patients to take the best possible care of themselves. Obviously this is not an easy balance to strike, and the balance will vary from patient to patient. The happy-go-lucky might need to be brought down a peg or two with a reality check, while the nihilistic might need a spoonful of sugar to help the medicine go down. Finding this middle ground is not a task to be accomplished by a practitioner at a single visit, but a process to be achieved over the entire course of treatment, ideally with a diverse and well experienced team including mental health specialists.

In an effort to finish on a positive note, I will point out that this is already happening, or at least, already starting to happen. As interdisciplinary medicine gains traction, as patient mental health becomes more of a focus, and as patients with chronic conditions begin to live longer, more hospitals and practices are working to make a positive and constructive mindset for self-care a priority, alongside educating patients on the actual logistics of self-care. Support is easier to find than ever, especially with organized patient conferences and events. This problem, much like the conditions that cause it, is chronic, but manageable with effort.


The Debriefing

Earlier this month was another disability conference. Another exchange of ideas, predictions, tips, tricks, jokes, and commiseration. Another meticulously apportioned, carb-counted buffet of food for thought, and fodder for posts.

As my comrades working in scientific research tell me, two data points are still just an anecdote. Even so, this is the second time out of two conferences that I’ve come back with a lot to say. Last time, the ideas mostly revolved around a central theme of sorts, enough so that I could structure them into a sequential series. This time there were still plenty of good ideas, but they’re a little more scattershot, and harder to weave into a consistent narrative. So I’m going to try something different, again.

I’m starting a new category of semi-regular posts, called “The Debriefing” (name subject to change), to be denoted with a special title, and possibly fancy graphics. These will focus on topics which were points of discussion or interest at conferences, events, and such, that aren’t part of another series, and which have managed to capture my imagination. Topics which I’m looking forward to (hopefully) exploring include things like:

– The moral hazard of hoping for a cure: how inspiring hope for a cure imminently, or at least in a patient’s lifetime, can have perverse effects on self-care

– Controversy over medical identification: the current advice on the subject, and the legal, political, social, and psychological implications of following it

– Medical disclosure solidarity: suggestions for non-disabled job applicants to help strengthen the practical rights of disabled coworkers

– The stigma of longevity: when and why the chronically ill don’t go to the doctor

– Why I speak: how I learned to stop worrying and love public speaking

At least a couple of these ideas are already in the pipeline, and are coming up in the next few days. The rest I plan to write at some point. I feel reasonably confident listing these topics, despite my mixed record of actually writing the things I say I’m going to write, mostly because they are all interesting topics that keep coming up; and given that I plan to attend several more conferences and events in the near future, even if I don’t get to them soon, I fully expect they will come up again.