Friday, December 11, 2009

On the Virtue of Coffee

Caffeine is a virtuous stimulant. That is to say, using the stimulating properties of coffee, tea, and other caffeine-bearing beverages for personal and professional ends is socially acceptable, and often encouraged, in societies worldwide.

Expounding on the earlier thought concerning the Aristotelian characterization of virtue as moderation, it seems apropos to conclude that consuming caffeine is virtuous because it yields a moderate amount of stimulation. Caffeine stimulates you enough to have a quickly noticeable impact on attention, activity, and productivity. But not too much, and not for too long.

Contrast caffeinated drinks with the more potent stimulants that pervade modern society, including illicit drugs like methamphetamine. Meth does a better job of stimulating your CNS and makes you far more active than caffeine does, but it's too much too fast, ultimately yielding a very unpleasant and unproductive comedown. Its short-acting and addictive nature makes it prone to abuse and withdrawal cycles, leading to socially undesirable addicts. Social pressure begets law, so it is illegal for an individual to independently obtain or use powerful drugs like amphetamines.

Caffeine is addictive and, like other "uppers", has comedown and withdrawal symptoms that lead to unsociable characteristics, but not to the extent that the disadvantages outweigh the advantages. In other words, the gain in productivity during stimulation is widely considered valuable, and any subsequent drop in productivity is not severe enough to warrant imposed restriction.

By virtue of its moderate effect on human physiology, caffeine is culturally pervasive. It is attainable in roughly standardized doses, hence descriptive phrases like "about as much caffeine as in a cup of coffee". As you drink a cup of coffee, the effects of the caffeine come on gradually, not immediately or overpoweringly. The caffeine is diluted in a mostly-water solution, so intake is paced by the rate of drinking, which helps to self-limit unintentional over-consumption. More importantly, the stimulating effects are of manageable duration and intensity. Someone who downs too many shots of espresso might feel restless, hyperactive, or "on edge", but the effects are not wholly debilitating and conveniently fade in about a day's time. It's relatively easy to adjust and find a sustainable number of "cups per day" that provides moderate stimulation, of moderate duration, at appropriate times. For many people in western society that means a cup of coffee or tea each morning before beginning earnest work.

The virtue of coffee emerges from its cultivated moderation. Cultures worldwide have used stimulating plants for thousands of years, creating and settling on balanced drinks like coffee and tea to propagate a moderated norm for optimal societal augmentation.

Monday, December 7, 2009

Inequity is Iniquity; Conflict is Correction

We rose above the apes owing to our aptitude for altruism. Empathy and keen awareness of the gains to be had when working with like-minded others allowed us to hunt larger game in groups, settle into agricultural communities, congregate into cities, and generally form our modern civilization.

What is the nature of societal conflict, then? If we are a necessarily altruistic society and have achieved our vaunted status because of this idea of helping others for mutual gain, why do we fight? That is to say, why does war seem such a persistent characteristic of humanity?

Wednesday, November 25, 2009


The verb "unfriend" is the New Oxford American Dictionary's Word of the Year for 2009. As in, “I decided to unfriend my roommate on Facebook after we had a fight.”

The official lexicographer has an interesting albeit brief quip about the word, relating it to modern technology and noting how its "un-" prefix is un-usual. Why is "unfriend" such a well-known trend this year? The concept of severing a form of communication with another isn't new to the Internet, let alone to social networking sites, which already host millions of people "friending" and "unfriending" each other. I suppose a better question, then: why is "unfriending" now more frequent or more public?

Wednesday, November 18, 2009

The Beginning of an Idea

How can two identical communications be perceived very differently by the same person?

Take the phrase "I love you!" and send it in two separate emails. The data is encoded and represented identically in both communications. A person looking at the raw data won't be able to differentiate between the two emails.

Now consider the phrase "I love you!" sent to you in two separate emails: one from your significant other, and one from an unlikable business acquaintance. Both emails have only the phrase "I love you!" and no other content, but your emotional response changes dramatically once a key piece of information is known: the originator. Your initial emotional response, contextual interpretation, and even the voice you use to read the message in your head change once you can imagine who sent the message. Knowledge of origination is a key component of interpretation.
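The point about identical encoding can be made concrete. As a minimal sketch (the addresses here are invented for illustration), two messages with byte-for-byte identical bodies differ only in a piece of metadata, the sender, and it is that field alone that changes the interpretation:

```python
# Two emails carrying the same phrase, encoded the same way.
body_a = "I love you!".encode("utf-8")
body_b = "I love you!".encode("utf-8")

# The raw data of the two messages is indistinguishable...
assert body_a == body_b

# ...but each message also carries an originator, outside the body itself.
email_a = {"from": "partner@example.com", "body": body_a}
email_b = {"from": "acquaintance@example.com", "body": body_b}

# Identical content, different origination: the key to interpretation
# lies in the metadata, not the bytes of the message.
assert email_a["body"] == email_b["body"]
assert email_a["from"] != email_b["from"]
```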

Tuesday, November 17, 2009


Fur Maria

There exists a girl with a bit of a problem: furfur. She prefers to wear pretty fur coats that hide her furfures, but feels for the animals that the fur's for. "Faux fur for furfur," she infers, "for *fur* fur is for the furry."

Saturday, November 14, 2009

Where's the virtue in all this?

"Virtue, then, is a kind of moderation inasmuch as it aims at the mean or moderate amount." - Aristotle, Nicomachean Ethics

Why am I writing this stuff down on a public blog? Nobody is following yet, and I'm using that as justification to call it posterity and treat this like a philosophical diary for my future reflection. But I could do that by means other than a public web-log. People have done it privately with paper for thousands of years, after all.

I guess I'm still probing for the line between too little and too much. Mostly I am non-communicative with my structured thoughts; they remain internalized and I do not express them as they are formed. The thoughts I do communicate risk sounding vainglorious, as I have afforded these thoughts a higher status and have deemed them important enough to share with others. When I choose to write here on this page, I tacitly deem some of my conscious thought relevant and useful to others.

Is it vain to communicate your ideas? It seems a silly question, but it also seems humility is an essential element to convincing communication. I can't just yell my opinion at you and treat you like an idiot (even if you are super-wrong) because communication isn't particularly constructive that way. A convincing speaker needs to be empathetic, humble, and understanding of opposing viewpoints. More importantly, one must convey these traits when theorizing aloud to garner the best possible reception.

Given that humility is an element of utilitarian communication, how does one justify prioritizing one's own thought over that of another? When one expends effort to communicate to others, one is also not paying full attention to what others have expended effort to communicate. While I write about philosophy I am not reading about philosophy, even given the immense body of knowledge I can peruse with incredible ease. If I devote my time to the passive acquisition of wisdom, it stands to reason that any subsequent activity will be of greater quality and completeness. Yet one clearly cannot remain voiceless and docile forever. Indolence is as much a sin as conceit.

Is it because of an incapacity to internalize so much knowledge that one stops absorbing and starts expressing? Do we express ourselves when we become saturated and overconfident in the sufficiency of our understanding? What determines our saturation point? Questions that seem too multifarious to fully answer.

Regardless of the factors that create the mean, the mean defines the virtue. A wise person is thoughtful yet not silent, self-assured yet not boastful. Measure and balance characterize the virtuous individual. The virtuous mean is known to society collectively, but hard for any single actor to realize precisely.

And so I probe in the dark for the line between too much and too little. I naturally wish to be virtuous because it is, by definition, what society has deemed best for itself and therefore best for me as a member. I want to contribute my thoughts and grow in understanding with those around me, but when I speak I cannot help but fear that I overstep my bounds without recognizing as much.

Thursday, November 12, 2009

Everything You Think You Know About x is Wrong

An attention-grabbing title, no? One can take the phrase, insert one's noun of choice for "x", and instantly imply the ability to impart special knowledge to an audience eager to learn the heretofore unknown errors of their ways.

It seems reasonable to assume that the claim is intentionally outlandish. A presentation to a room of business experts won't be taken seriously if it claims everything they know about business is wrong, and the presenter naturally needs to draw on the audience's existing knowledge to propose a convincing counter-theory.

Most people implicitly accept that not everything they think will actually be overturned when they absorb content stemming from such a jarring headline, but they also want to know what the creator considers so important as to be revolutionary in the field. The absolute nature of the claim sets the bar high for the claimant relative to his compatriots; expectations are high in those who call the bluff and take up the challenge to listen.

The contender puts his reputation on the line when the time comes to justify the contention. The challenge is in striking a balance between the commonly-accepted pool of knowledge and the new approaches to the information being postulated. A talk consisting entirely of novel data, unverified observations, and radical suggestions is naturally difficult to believe and will likely be ridiculed even if the conclusions are relevant and useful. Conversely, a talk consisting largely of well-known discussions and culminating simply in a different interpretative conclusion of the same knowledge set is underwhelming, and the audience may leave disappointed given the initial claim.

The court of public opinion ultimately determines the success of the presenter's ideas, but the final verdict isn't always obvious in the short-term. Einstein didn't posit his work on relativity with the hook "Everything You Know About Time is Wrong", but to an extent such usage could now be considered an appropriate challenge to 20th century physicists.

For illustrative purposes, I will also point out that my own attempt at using the idiom, "Everything You've Ever Thought Is Stupid", was not received well.

One-way Conversation

I'm having a one-way conversation with you. I'm talking to you, then imagining what you might think and say, and then thinking about how I can then talk about that. Iterate to the length of this post and you have our complete conversation, sans your side of the story. Let's play a game.

When I talk to you, I want only the most concise utterances to pass our hypothetical lips. I agonize first over my own words, then over yours as I'm suddenly forced to switch sides. Each sentence that survives the editing process is as carefully chosen as a move in a chess match, albeit one with a single player controlling both sides.

As the player I don't want to simply have one side crush the other, however, as the level of play is what determines the quality of content. When both sides of the conversation take their time and ponder carefully before making a move, the resulting monologue is not an unopposed diatribe, but a carefully reasoned and thought-provoking composition to be appreciated by others.

'Your Writing May Vary' since each game is different and the player is unavoidably biased, of course, but the ideal of the infinitely difficult game plagues the agonizing author, who restarts the game over and over again until the results are good enough. Our conversation branches into a thousand conversations, which eventually converge to form the final piece for display.


H1N1 is going to be this winter's favorite flu, mostly because of its widely publicized and now well-known moniker: "Swine Flu!" I wrote an early script for a play with this title that I would like to share with you here.

Just kidding but that's actually not a bad idea...

The dark cold of winter is a cruel mistress. She drives us indoors where we cloister together around warm fires, enjoy each others' conversation at indoor gatherings, and just generally spread our infectious effluvium into the stagnant air of those around us. We remain cheerfully unaware of the invisible invaders, save a few consciously withheld handshakes with particular persons of 'ill-repute', until about a day later when everything goes downhill fast.

Wednesday, November 11, 2009

Brain Dump

I yearn to have the machine put my thoughts to paper, for this blog to write itself full of my thinking with nary a brain cycle devoted to the task.

Yet implicitly I acknowledge that real thought is the communicative effort itself; formulating a sentence from the mess of brain activity that we call a thought, in such a precise manner as to stimulate a similar mess of brain activity in another, is a hard problem with no singular solution. Using language, we structure our own patterns of thought into a standardized format digestible by minds around us, and in so doing our minds architect themselves in a manner that allows others to likewise trigger thoughts in us. The arrangement is what makes the thought, not the simple sum of the pieces. I wish to circumvent the effort and have a computational middle-man hand me my thoughts gift-wrapped.

One could suppose that, in time, information technology will allow us to interpret the seemingly incomprehensible data of the present. If this is the case, should I perhaps simply seek the best possible means available to me to record my brain activity and write with that instead of words? Using the recording technology of the present, will I reap the benefits of future interpretation technology? Will the utility of that interpretation, however much more accurate it may be deemed, retain the value of the original act? Do the words that I wrote with that brain activity become less relevant than the brain activity itself?

Tuesday, November 10, 2009

Now Hold On

It occurs to me that you might get the idea that this blog is going to wax poetic consistently. Well you can just stop that right now.

It further occurs to me that these posts are likely listed in reverse chronological order and that you might read this before you even read what I'm referring to above. In that case you can just not start instead of stopping. On the other hand that means you'll have also read whatever it is that I will have written between my writing of this and your reading of it now, and those posts could be really simple and stupid for all I know. In that case prepare to be wowed.

Consider this my get-out-of-jail-free card for anything that you deem sub-standard in this blog. I know I probably will.

Entirely original thinking, as far as I know

Creation becomes all the more daunting when one has so much available to absorb.

In the Information Age, how does one write a meaningfully lengthy blog post when one can be certain one hasn't learned all one can easily learn on the subject? What's to say I won't unintentionally duplicate another's work, or that my words even now aren't just a simple mix-and-match of more popular content? Have I created something of value, or is it only by virtue of my beholders' limited perspective that I appear to have generated it?

Readers absorb from readers, then mentally rearrange and piece together their own content for presentation within their own spheres of influence, wherein the content will be perceived as original given that the constituent components are non-obvious to the new readership. The author's value, then, is in repackaging ideas and rephrasing them in their respective cultural microcosms.

With the Internet as medium, I place my words on display on a personal site, this blog. It sits neutrally alongside countless others of immeasurable quality, equally visible to anyone on the network, anywhere, at any time. Though each site's potential is the same, the scope of any one readership cannot encompass the entire spectrum of content, and thus I may achieve the aforementioned value of 'original' authorship in words, at least for the time being.

Perhaps soon, when information technology allows us to pull back and take in the wider perspective of the Internet's expanse of information, we will see otherwise-disparate content from a single viewpoint and recognize the similarities at a glance.

You and I don't know that this post is nearly identical to another one from an introspective farmer somewhere in rural China.

Ungodly quips on atheism, a meaningless leap to nihilism at the end

Atheism is difficult for the human mind to reconcile. Blissful ignorance must be willingly cast aside. It takes perseverance and yields cold reward. Certainly, it is good to no longer fear the irrational. Conversely it is hard to shed the warmth of love, protection, after-life, meaning.

Atheism leaves a void, but it is filled by human perseverance and passion. We delve with intense focus into the means of our existence, but to justify what end?

What drives the nihilist? We edge closer to the threshold of immortality, hoping answers come before we must face this question of purpose.

Monday, November 9, 2009

Probably Pointless Ponderings on a First Post

"First Post!" the Internet commenter cries! Well, a few years in the past more so than the present, but the practice persists despite near pointlessness. So I muse:

First post, an intriguing claim; the semantic and syntactic connotations indicate not an active but a reactive communicative burst. Semantically, the author is both claiming and proclaiming status by virtue of his or her immediacy to a content-rich antecedent, such as foundational media or original commentary. Syntactically, the words "First post!" are ordered such that they may be construed as unintentional mockery of the author's semantic intent, using "first" to adjectivally describe a word that implies 'after'.

Eventually a community reaches its tolerance limit and the number of genuine occurrences drops off, with a brief spike in facetious use some time shortly after the stigma of use is widely recognized. Group policing has a more literal effect in "thumbs up/thumbs down" forums like Digg as "First post"ers quickly fall below the default visibility threshold.


Hello, world! Hello, future self.

"This will be the first Skip thought", thought Skipper.
