2014-10-30

Privacy, Or Something Like It

In For the New Intellectual, Ayn Rand wrote:
Civilization is the progress toward a society of privacy. The savage’s whole existence is public, ruled by the laws of his tribe. Civilization is the process of setting man free from men.
Say what you will about Rand, she had a point. Early societies - and modern "traditional" societies - are defined by the extent to which the community is involved in every person's life. We could speculate that this sort of involvement was an early precursor to modern law and order, but that doesn't really matter. What matters is that, until recently, the social order of the Western world had reached a point where we could go home and essentially not be bothered or "judged" by others. At home, we were mostly free to do as we pleased, without having to involve "the community."

Two things seem to have reversed our course.

One of them is the extent to which large databases facilitate the collection and analysis of data that was previously considered innocuous. Modern data analysis, however, has proven remarkably successful at making accurate inferences about very private matters from data that we did not previously associate with privacy. Knowledge, once acquired, is bound to be used, and this knowledge has mostly been used to advertise to us. While many people bristle at the idea that their most personal information is being collected so that products can be sold to them, I consider it a very good thing. Markets are getting progressively better at serving the consumer, and for the most part the data is either fully anonymized or so vast that no real-world individual could home in on a particular person and invade their privacy. Exceptions will exist, of course, but they will be rare.

But there is another, more problematic, factor undermining our privacy: social media. These are media through which we voluntarily make our private lives public on an international scale. The more paranoid streak of social media skepticism suggests that, having volunteered our private lives, governments can now use that information to monitor and/or oppress us. Consider what rock musician Stuart Hamm recently posted on his Facebook wall: "So...I don't post photos or info of my family here. We are PAYING to have big brother watch us now. Suckers." Of course, as a libertarian, I sympathize with that fear. However, I don't consider it the primary danger of social media.

This morning The Atlantic published an article by Robinson Meyer, the closing paragraphs of which read as follows:
Is living such a public life worth the trouble? Is such a life worth being constantly exposed to vitriol and rage and threats from strangers—especially when the patterns of that abuse seem so random? Is the kind of work that would be required to sustain a “good” public, online social network possible? Is asking people to perform that moderating work something we even want to do? 
We often celebrate the social change and faster communication that public, networked life has brought about. But that kind of life—a new one that we’re all still trying out—requires remarkable sacrifice. We would do well to account for that sacrifice, and, at the very least, thank those who have made it.
So the real cost of living so prominently on social media is not, in my view, corporate intrusion, nor is it government oppression. Instead, social media threatens to invade our personal psychological space. Every status update we post is an opportunity to be judged, misunderstood, threatened, or lashed out at. Now that cameras are all digital and fully integrated with social media, every picture we take seemingly exposes us to other people's opinions about what we're doing.

Here's a picture of my baby - am I a good parent, or a bad one? Here's a picture of my dinner - are you jealous, or is your dinner better, or do you think I'm making myself fat? Here's a picture of my band - is that cool, or am I trying too hard? Here's a picture of me wearing workout clothes - am I sexy enough? Here's a picture of my new girlfriend - how do you rate her?

It's interesting that we take to social media for good times: to gain the approval of the people we care about, maybe even to gain the approval of people we don't care about. In doing so, we must also accept the downside - the people we do and/or don't care about may disapprove of our conduct.

This is just the nature of living life as part of any society. The difference, though, is that in the good old days, we could actually escape society for a little while - go home, decompress, get out of the public eye for a bit. That's still possible in theory, of course. You can turn off all your devices and get away from it all, but today the cost of doing so is higher, because so much more of our lives has gone digital. I, for one, email friends and family many times throughout the day; I "speak" to them on Facebook; I share family snapshots with them. Yet I live far away from them, so social media affords us a level of intimacy that we couldn't experience without it. When I "unplug," I sacrifice all of that. I miss out on things I really do care about.

Sure, find the right balance for yourself. Find a level of connectivity that gives you the most of what you want and the least of what you don't. Go ahead, make the trade-off.

But there's a trade-off to walking down the street, too, and walking down the street is nowhere near as invasive to our psychological sense of privacy as the kind of sharing most of us do on social media. So calling for "balance" or "moderation" is just another easy non-solution articulated to make us feel better. The simple fact is, we've lost a level of privacy that was previously hard-won. To be sure, we've gained something for it, but figuring out how to be authentic without being an attention whore, and how to maintain a sense of privacy without becoming aloof, is not going to be an easy task for any of us any longer.

How will we regain our old-fashioned sense of privacy? Will we ever?

2014-10-16

A Sub-Two-Hour Marathon In 2038?

Everyone's talking about this Runner's World article about the prospects of a sub-two-hour marathon. I first saw the article on Facebook, via my Open Borders compatriot, John Lee. A few days later, Alex Tabarrok at Marginal Revolution reposted it with some mostly uninteresting discussion unfolding in the comments section.

I checked the comments section again this morning, and discovered a link to a rather fascinating (and short) blog post predicting that the sub-two-hour marathon will happen sometime around the year 2038.
Inspired by Patrick Makau Musyoki's new marathon record in Berlin yesterday, I looked for trends in the marathon world records for each decade going back a century. I only included the fastest time in each decade. I expected a plateau like this, but I didn't expect it to be so neatly logarithmic....
Followed by:
A whole crop of articles commented over the last year on statistically improbable sprinter Usain Bolt, who is ahead-of-trend by thirty years. In the same vein, looking at the marathon plot, we shouldn't expect a male human to break two hours in the marathon until 2038. And it's reasonably assumed that the incremental improvements we see in these times is a result of (decreasing marginal) improvements in training, nutrition, and running equipment.
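Out of curiosity, here's a minimal sketch of the kind of extrapolation the post describes: fit a logarithmic curve to decade-best marathon times and solve for the year it crosses 2:00:00. The times below are approximate values from the published record progression, and the 1890 anchor year is my own assumption - the post doesn't say how its curve was parameterized - so treat the output as illustrative rather than a reproduction of the 2038 figure.

```python
# Sketch: fit t = a + b*ln(year - EPOCH) to approximate decade-best
# marathon times (in minutes) and extrapolate to the two-hour mark.
import numpy as np
from scipy.optimize import curve_fit

years = np.array([1908, 1913, 1925, 1935, 1947, 1958,
                  1969, 1981, 1988, 1999, 2008, 2011], dtype=float)
minutes = np.array([175.3, 156.1, 149.0, 146.7, 145.7, 135.3,
                    128.6, 128.3, 126.8, 125.7, 124.0, 123.6])

EPOCH = 1890.0  # assumed anchor year; the fit is sensitive to this choice

def model(year, a, b):
    return a + b * np.log(year - EPOCH)

(a, b), _ = curve_fit(model, years, minutes)

# Invert the model to find where the trend crosses 120 minutes.
crossing = EPOCH + np.exp((120.0 - a) / b)
print(f"fit: t = {a:.1f} {b:+.1f} * ln(year - {EPOCH:.0f})")
print(f"trend crosses 2:00:00 around {crossing:.0f}")
```

With these inputs the crossing lands a fair bit earlier than 2038, which mostly goes to show how sensitive this kind of extrapolation is to the anchor year and to which times you sample. Worth keeping in mind before penciling anything into the calendar.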

2014-10-13

The Heuristic-Heuristic

No Single Standard

This morning, I pulled up the news headlines and saw a story about the absence of a single, standardized protocol for responding to Ebola in the United States.

I can see how not having a set of therapeutic guidelines or recommendations based on science and experience would be a serious problem when it comes to controlling outbreaks of potentially fatal illnesses. But, according to the article, an absence of guidelines is not the problem:
Murphy says some of the issues in Texas stem from a "system problem" in the way public health care is managed in the USA. The Centers for Disease Control provides only guidance for infection prevention and management. "What they do in Texas, what they do in Illinois, it's up to the state," he says. 
"The question is, who's in charge?" Murphy says. "The states can follow all the guidelines and take the advice, which they usually do, but they don't have to. It's not a legal requirement. So there really is no one entity that's controlling things."
Do you see the problem now? It's not that medical science is failing us, it's that there is no central authority manipulating things from above.

Homo Heuristicus

Admit it: When you woke up this morning, you weren't particularly worried about whether or not there was one single, standardized way to deal with Ebola. Even if you were worried about Ebola, chances are, you were worried about catching Ebola, not about Ebola governance. Having read this morning's headlines, though, you are far more susceptible to forming an opinion on how "we as a nation" "should" "respond" to "Ebola." Scare quotes intended.

There is probably a propaganda mechanism at work here. Doctors are always angling for new ways to nudge us into complying with therapeutic standards. If they can find a way to force us into a single disaster-response pattern, they probably will. That's because clinical and health care data is notoriously subject to variation: the more factors that can be controlled, the closer medical scientists can come to understanding a problem and then treating it. Society is not a laboratory, however, and it shouldn't be subjected to stringent controls for the benefit of experts.

Well, that's all little more than philosophical bloviating. Human beings are paradigmatic thinkers. We yearn to rid ourselves of life's problems and inherent complexity by applying rules of thumb. To the extent that some problems actually can be solved by applying heuristics, this makes us a powerful tribe of apes indeed. 

But to the extent that standardized rules obliterate our ability to perceive nuance, undermine our dynamism and innovation, and leave no room for individualized experiences, they are more bane than boon.

The Heuristic-Heuristic

This brings me to the heuristic-heuristic, something that I've begun to perceive as a real threat to anyone who seeks any measure of authenticity whatsoever.

For people actively engaged in solving new problems, heuristics provide an important means of finding viable solutions. And by "heuristics," I mean mostly the scientific method. That kind of disciplined reasoning is a powerful force for good, and we are indebted to the people who solve society's urgent problems.

You and I, on the other hand, do not solve these problems; rather, we solve our own individual problems by consuming the products the problem solvers produce. So, while Jonas Salk invented the polio vaccine, all we do is buy the polio vaccine. While automakers are actively engaged in producing faster, safer, and more fuel-efficient vehicles, all we do is buy one.

The point is that, while problem-solvers deploy a heuristic called "the scientific method" to innovate, we deploy a much cruder and far less useful heuristic called "find the product that solves our problem, and buy it."

In the case of polio vaccines and cars, this heuristic serves us well. But in the case of our daily lives, this is a major source of our existential problems. We complain that the schmoozers get the job promotions, we complain that things just aren't like they were when we were kids, we complain that nobody knows the true meaning of Christmas, we complain that Senior Prom has become too big a deal. We wonder why there isn't a "single standard" Ebola response, but when we get to the hospital, we want doctors to give us personal, individualized attention with good bedside manner. 

The crude heuristic doesn't work for us. We commodify every aspect of our lives and gradually come to wonder why our lives seem to "lack something." 

The Cheaters

Via Facebook, I was pointed to this Slate.com article about why people in happy marriages cheat. Here's an excerpt:
Slate: So what are people looking for?

Perel: What’s changed is, we expect a lot more from our relationships. We expect to be happy. We brought happiness down from the afterlife, first to be an option and then a mandate. So we don’t divorce—or have affairs—because we are unhappy but because we could be happier. And all that is part of the feminist deliberation. I deserve this, I am entitled to this, I can have this! It allows people to finally pursue a desire to feel alive.

Slate: Alive?

Perel: That’s the one word I hear, worldwide—alive! That’s why an affair is such an erotic experience. It’s not about sex, it’s about desire, about attention, about reconnecting with parts of oneself you lost or you never knew existed. It’s about longing and loss. But the American discourse is framed entirely around betrayal and trauma.
Perel makes a lot of points in the interview - some good, some bad. She talks a lot about our expectations of marriage, and about finding something in ourselves that we've lost. It's not that our partners aren't fulfilling us; it's that we ourselves are lacking what we need to be as happy as we might be.

Perel errs, however, in suggesting more open marriage arrangements. This is a mistake because it doesn't solve the core, underlying problem. The question merely goes from "What's wrong with my life and my marriage?" to "What's wrong with my life, my marriage, and my affair?"

The point here is that commodifying marriage has basically ruined it. We expect the cutesy romance, followed by the expensive wedding, followed by childless marital bliss, followed by 2.3 children (it is still 2.3, isn't it?), followed by a commodified set of child-rearing benchmarks (first tooth, first day of school, first etc. etc.). Small wonder this has grown into boredom.

But if it is boredom, then of what benefit is adding one more commodity to the list? {Love, marriage, job, kids, infidelity, death} is not much better than {love, marriage, job, kids, death}. True, there is one more "term" in the "set," but this term would only ever prove valuable if it actually meant something to us. Its value - especially in light of what Perel believes - lies not in the fact that it is part of the list of life experiences to check off, but in the fact that it is not supposed to be there. It is one rare triumph of authenticity in an otherwise commodified set of existence-benchmarks.

Normalizing, i.e. commodifying, the experience of infidelity will surely result in nothing more than rendering the experience itself inauthentic, and therefore no more interesting than anything else on the list. That's the first inevitable conclusion here.

The second one - the more important one, in fact - is that infidelity isn't the important thing; authenticity is. So we'd all be better off if we made our marriages (and our daily lives) more authentic, rather than trying to keep our experiences neatly packaged and then seeking to escape from them by engaging in divergent and self-destructive behavior.

Darn, there's that nuance stuff again!

De-Commodify

It's difficult for everyone, of course. Every moment of our lives is a moment in which we experience some kind of pressure to commodify. We don't want our children to merely meet Santa Claus; we want them to meet him at a shopping mall, and have their pictures taken on his lap, and ask him for a particular Christmas present. And he has to be wearing a red suit with white trim and a black belt, and he has to be fat, and he has to say, "Ho ho ho." If it's not that, if it's not all of that, then we say that our children haven't had the "real" experience.

This itself is preposterous, considering first that Santa Claus isn't real, and second that we can therefore define the experience however we want. It doesn't have to be any particular way! The whole thing is made up! So why not just invent a totally pleasant, authentic experience, and make that your holiday tradition?

This is my whole point.

Rather than seeking out a socially prescribed list of experiences and lifetime milestones, hoping that they will unfold in the way that they have unfolded for countless other people, we should take the time to recognize that whatever list of life experiences we have is ours for the choosing. We can define our lives to be anything we want them to be. Every minute of your life can be authentically yours. It can be as satisfying as you'd like it to be.

To accomplish this, you need to back away from the idea that your experiences should look and behave a certain way. You need to get away from the heuristic-heuristic, the mechanism telling you that X is only accomplished through Y. 

There might not be a product available to satisfy your need. There might not be a standard response to every terrible thing that happens in the world. Creating a new product or a new national standard will not necessarily fix things the way you want them fixed.

2014-10-09

Dichotomous Thinking

Heel-Strikers Versus Forefoot Runners

This morning I read an article in The Guardian about proper running form. Author Sam Murphy sets the stage:
A few weeks back, this blog ran a feature on running form and how to improve it. It included the oft-repeated advice about avoiding overstriding, which “causes the foot to land too far in front of the knee and encourages heel striking – and increases injury risk”. A reader commented that they’d “like to see a blog on whether heel striking really is a bad thing”, which spurred me to investigate.
Murphy then goes on to discuss the influence of the book Born To Run (I reviewed it on the blog here) and its role in promoting "barefoot running." Much of the remainder of the article discusses the evidence of whether "heel-striking" is bad, compared to "forefoot running."

The problem with such an article is that, aside from a small number of people with very extreme running form, almost no one is a pure "heel-striker" or "forefoot runner." Most of us fall somewhere on a continuum, tending more toward one end than the other. For some, the tendency is quite mild; still others land exactly in the middle of the foot.

In short, the problem with the article was dichotomous thinking.

"Cognitive Distortion"

Why is this a problem? Summer Beretsky at PsychCentral.com breaks it down for us:
...[U]sing dichotomous language boosts dichotomous thinking, and the latter is a type of cognitive distortion that can negatively influence the way you feel about yourself. If you’re dealing with anxiety, casual usage of extremely polar words can lead you to magnify thoughts and events through a distorted lens that can ultimately make you more anxious.
So the problem is twofold: First, dichotomous thinking is distorted, and therefore less accurate than having a more nuanced perspective. Second, and perhaps more importantly, dichotomous thinking can make you unhappy.

Murphy herself seems partially aware of this, as she writes that she has recently "begun to feel a little like someone who was converted to a religion by zealots." I can understand this, because when I finished reading Born To Run, I also gave barefoot running a try. Whether I had tried it was consistently the one question everyone asked me when they learned I had read, or was reading, the book.

When one watches racing events, one is typically struck by the same fact that McDougall reports in his book: good runners tend toward a similar running form. This is not altogether surprising, since running is a natural human activity and all human bodies are built more or less the same with respect to musculoskeletal structure.

That so many great runners have similar form is not a cognitive distortion. The insistence that all runners should adopt the same set of practices in order to run well or run comfortably, however, is.

Anti-Vaxxers And Climate Deniers

In an article aimed at promoting the scientific validity of childhood vaccinations, Amy Parker succumbs to dichotomous thinking. While she opens her article with carefully worded sentences laying out the perspective of those who oppose childhood vaccinations, by her final paragraph she is talking about all people as though they belong to one of two camps:
Those of you who have avoided childhood illnesses without vaccines are lucky. You couldn’t do it without us pro-vaxxers. Once the vaccination rates begin dropping, the drop in herd immunity will leave your children unprotected. The more people you convert to your anti-vax stance, the quicker that luck will run out.
Ah, yes. "Anti-vaxxers." Many commenters at Slate.com pointed out that "anti-vaxxers" are quite similar to "climate deniers" because both groups reject the latest scientific research on the subject in question. This claim alone is somewhat dubious, since people who oppose vaccination don't typically feel that the science behind vaccines is bunk, but rather that the risks outweigh the benefits. I disagree, but it is a value judgment based on information they have deemed important to them. As for "climate deniers," few if any actually deny that the climate changes - the question is whether one believes the specific forecast models of those scientists who hold that anthropogenic global warming is a risk to the survival of the human species.

But just look at all those words. Why bother with all that nuance and fairness when we can simply engage in dichotomous thinking, box people into "camps" or "groups" or "sides," and then declare one group wholesale wrong?

Politics

Much has been written and said by many intelligent people about the "state" of political discourse today. We hear a lot about how polarized people have become, and this seems to suggest that dichotomous thinking is a rampant social problem. When was the last time you heard or read a political opinion that you didn't subsequently place into some kind of ideological box? 

In the political sphere, when people try to regain control of all this cognitive distortion, many of them fall into the logical fallacy that "the truth lies somewhere in the middle." The problem with this line of reasoning is that it accepts a dichotomous framing of the issues and merely attempts to reconcile the dichotomy. Dichotomous thinking is dangerous precisely because it doesn't describe reality accurately at all.

For example, most of your day is probably spent indoors, at room temperature - neither hot nor cold. You wouldn't even think to describe this temperature. Temperature only becomes an issue when you find it either too hot or too cold; only then are we confronted by one extreme of the hot-versus-cold dichotomy. Never mind that the vast majority of our time indoors is spent at a temperature we hardly bother to describe at all. The point being this: our reality is neither hot nor cold, it's room temperature. Framing things in terms of hot and cold doesn't adequately describe the majority of our day!

Meanwhile, at Cato Unbound, Kevin Vallier engages in some tetrachotomous thinking, sorting all possible viewpoints about religion in politics into four boxes. The reader may determine for him- or herself whether Vallier's point resonates; my only point here is to remark that perhaps there are a few more possible ways to look at it.

Conclusion

The philosophical concept of "difference" is a powerful one. It is one of the first things we learn as infants, and it forms the basis on which we build the knowledge that guides us for the rest of our lives. To that extent, some elementary form of dichotomous thinking will always be a part of human cognition.

But if we think rationally, then as we apply "difference" to our experiences and observations, we will start to uncover the inadequacy of dichotomy. We start to learn that life consists of more than just conceptual poles. We start to reject dichotomous thinking, and we gain a perspective that is at once more accurate and more curious.

2014-10-06

Workout Of The Day

I was somewhat skeptical of heart rate zone training when I started last week, but to my surprise I finished my workouts feeling a little bit more endorphin-charged than usual. Maybe this was psychological. Either way, I finished the week with a five-mile "long run" consisting of 100% negative splits (I'm not sure I've ever done that during a long run before) and peaking at a robust 6:35-per-mile pace. Best of all, I don't feel any risk of injury or overuse of my muscles, joints, or tendons.

Today, the adventure continues. It's a new week, and I'd like to increase my weekly mileage, so I'm heading out for a four-mile run in HR Zone 2 or 3.

Very Elucidating

I can't say much in favor of Kevin Vallier's recent attack on Hans-Hermann Hoppe, because - although I agree with Vallier's take on Hoppe - the argumentative logic employed is not strong enough to stand up to scrutiny.

However, the comments section produced a wealth of information I had not read before. Pay close attention to commenter "King Snail," in particular.

2014-10-02

Workout of the Day, Surprise! Edition

Due to some laptop issues at work, I ended up doing today's workout in the morning when I had some idle time. Working out is a great way to make good use of small blocks of time you didn't expect to have on your hands.

As per my previous workouts this week, today's workout is/was a five-kilometer run.

I am aiming for 20 total miles this week, and hopefully 22-25 next week. This will be a slow, consistent build-up of mileage until about the 12-weeks-until-Cowtown point, around January 1st. I should be up to over 60 miles per week by then, and if I don't get too anxious to hit the 50-miles-per-week mark early, I'll enjoy a low risk of injury throughout the process.
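To sanity-check that arithmetic, here's a quick sketch assuming the common guideline of roughly 10% weekly mileage increases - the rate and start date are assumptions for illustration, not a schedule I'm committing to:

```python
# Sketch of the build-up, assuming ~10% weekly mileage increases.
from datetime import date, timedelta

miles = 20.0              # this week's target
week = date(2014, 10, 2)  # assumed start of the build
goal = date(2015, 1, 1)   # roughly the 12-weeks-until-Cowtown point

while week < goal:
    print(f"week of {week}: {miles:.0f} miles")
    miles *= 1.10         # ~10% weekly bump
    week += timedelta(weeks=1)

print(f"projected weekly mileage by {goal}: {miles:.0f}")
```

Thirteen weeks of compounding at 10% turns 20 miles into roughly 69, comfortably past the 60-mile mark, so the projection holds together.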

By way of experimentation, I allowed myself to run outside of heart rate Zone 2 today, staying in roughly the Zone 3-4 range, based on current heart rate estimates. I want to see how that impacts the way I feel today and tomorrow. My initial thoughts, now 90 minutes post-run: there seem to be fewer endorphins, but I feel more satisfied with the quality of my workout.
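For anyone wondering what the zones refer to, here's a minimal sketch of one common scheme, which defines zones as percentage bands of an age-predicted maximum heart rate (220 minus age). Zone definitions vary by coach and device, and the age in the example is purely illustrative:

```python
# One common five-zone scheme: percentage bands of an age-predicted
# max heart rate. Cutoffs vary by source; treat these as illustrative.
def hr_zones(age):
    max_hr = 220 - age  # crude age-predicted estimate
    bands = {"Zone 1": (0.50, 0.60),
             "Zone 2": (0.60, 0.70),
             "Zone 3": (0.70, 0.80),
             "Zone 4": (0.80, 0.90),
             "Zone 5": (0.90, 1.00)}
    return {zone: (round(lo * max_hr), round(hi * max_hr))
            for zone, (lo, hi) in bands.items()}

for zone, (lo, hi) in hr_zones(35).items():  # age 35 assumed for the example
    print(f"{zone}: {lo}-{hi} bpm")
```

For a 35-year-old, that puts Zone 2 at roughly 111-130 bpm and the Zone 3-4 range at roughly 130-166 bpm.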

To be continued...