Much Easier

It would be great if guys like me could easily sculpt our muscles to the point that we look like Arnold Schwarzenegger (back in the day) and have the ability to run great distances. In the real world, though, it's a trade-off: The more upper body mass, the harder it is to be a distance runner. No two ways about it.

Still, I sometimes surprise myself by how much easier it is to run when you're not carrying around a lot of upper body mass. After struggling for more than a year at paces far below my comfort zone, and distances far shorter than I'd prefer, a bout of food poisoning resulted in my quickly shedding major muscle mass. And the result is that I've become my old self again! Mile pace times under 6:30? No problem. Long runs? Bring 'em on!

There's no doubt that shedding extraneous muscle mass has been the source of my rediscovered running prowess, but still, it's important not to draw the wrong lessons here. Looking back over the course of the blog, the pattern is fairly consistent: any time I make a major change to my fitness regimen, I make a lot of significant gains in a short period of time before ultimately reaching a point of diminishing returns and stagnating again. This was as true of my body-building experience as it is of my recent running revival.

What should we make of it all?

Well, one important lesson to keep in mind is that if you're starting to feel like your workouts are getting stagnant, it's time to make a change so that you can recapture the good feelings.

A second lesson is that making continued progress is more difficult the more progress you've made. You might want to modify future expectations accordingly. Aim for smaller benchmarks. There are only so many 30-second PRs you can reasonably expect to have.

A third lesson is not to over-estimate early progress. You might just be getting a taste of the lowest-hanging fruit. The real challenges may yet be ahead.

It's Friday. That's all I got.

Open Borders: Deport All Troublemakers

My latest offering at Open Borders concludes as follows:
Many critics of immigration base their case against open borders on the differences between groups of human beings. I have attempted to show why this problem is not unique to immigrants, that we are in fact different from other natives, too. Eliminating differences in a community of peaceful people presents prejudicial, empirical, and practical problems that most would find unsettling. Those critics who point to “differences” as a justification for restricting immigration thus have a steep burden of proof assigned to them. Until they meet it, I remain unconvinced.
Read the whooooole thing


More Or Less

The best psychological advice I ever received was: Do more of the things that make you happy and less of the things that don't. On the surface, the suggestion is simple to the point of being unhelpful. "I know that already - my problem is that so many things are making me feel bad!" Dig a little deeper, though, and I'm certain you'll find it to be as profound as I do.

Exercise And Depression
What if I were to tell you that there was a medical therapy that was proven to elevate your mood for hours at a time by activating the brain's dopamine and serotonin responses, increasing adrenaline levels, lowering blood pressure, increasing the amount of oxygen in the brain, and providing a mega-dose of vitamins D and K? What if I told you that this therapy was available free of charge and came with small risks and almost no negative side effects? You'd be silly not to take that deal, right?

It sounds too good to be true until you know what therapy I'm talking about: exercise. Pessimists are quick to point out that "going for a run or picking up a guitar does not always solve the problem," but it is nearly impossible to maintain the strong form of that argument in light of all the research that has been done on the topic. As Harvard Medical School summarizes,
A review of studies stretching back to 1981 concluded that regular exercise can improve mood in people with mild to moderate depression. It also may play a supporting role in treating severe depression.
The science of the matter, at least, is clear. Exercise won't cure depression, but it almost always alleviates your symptoms.

More Of The Things That Make You Happy
I mention exercise not to propose it as a cure-all, but rather to demonstrate that certain things make you happy no matter what. The clinical evidence is inarguable: Exercise makes you happier than you would be without it. It's true that you can't always control all the negative things that happen to you in life, but you can always control the things that make you happy.

So, the crude example is: No matter what lousy stuff happens today, you can always go home and do some push-ups or sit-ups or some other appropriate form of exercise. Not everyone can run, but everyone can engage in a body-appropriate form of exercise. You know it will make you feel better because it is a clinically proven fact. No matter how bad things might have gotten today, at least you can squeeze in a workout. It won't solve your problems, but the fact that it will make you feel better is, I repeat, a fact.

Exercise isn't the only thing that makes you happy. Music might. Or art. Or cat pictures on the internet. Or meditation. Or a phone call to a friend or relative. None of these things has the magic power to solve all your problems, but if any of them make you feel better, then guess what: You have some control over how you feel. So use it.

Last point here: Some of "the things that make you happy" are shockingly simple. If you have to work with a difficult coworker, for example, you probably can't just pick up and go for a run every time he irritates you. But you can take the time to applaud yourself for being able to keep your composure and interact gracefully with someone whose personality tries your patience. Something that simple can be a great source of happiness when things are going wrong. And if your problem is more severe - a chronic medical condition like diabetes, for example - you don't need to force yourself to accomplish something every day to feel better about yourself. You can set your mind at ease by simply patting yourself on the back for keeping a sense of humor about your predicament.

No need to climb Everest here, just do things that make you happy.

Less Of The Things That Make You Unhappy
The flip side of my advice so far is to do less of the things that make you unhappy.

Some of this is very low-hanging fruit. If you know that binge drinking gives you a hangover and costs you a lot of money, then you know... stop it. If you know that, every time you try to give your teenager advice, she flips out and calls you a Nazi, it's probably to your benefit and hers to just cool it. Don't aggravate an already-bad situation. That's the introductory course.

The intermediate course is: Figure out how you're making your own problems worse, and try to make a change. If you find yourself always swearing at the rush hour traffic you have to drive through every day, you can take concrete steps to alleviate your problem. If you can flex your working hours in such a way that you show up earlier for work, and leave earlier, thus missing the worst of the traffic, that's a no-brainer. But wouldn't it be nice to be able to drive through bad traffic every day and not lose your cool? Singing along with my favorite CDs sure helps me. Some people like audio books. Maybe you can find an alternate route that takes a little longer, but saves you a big headache. Maybe you can start taking the bus. Maybe you can offer to give your friend a ride and put some social pressure on yourself not to flip out. All of these things might help.

The key here is to remember that bad traffic is really irritating, and you're not wrong to think that it is; but doing less yelling is sure to keep you from feeling worse. And if your problems are more serious, like my diabetes example above? Maybe what makes you feel bad is that you just think too much about it, and you need to take a break from talking or thinking about it all the time. Or perhaps what makes you feel bad is the isolation, in which case a support group or family member might be able to keep you from spending too much time in your own head. I can't say for sure, but again, the key is to find what makes you feel bad - isolation, obsession, pining for the ice cream you can no longer eat, whatever - and find a way to do less of it.

Weather, Population, And Meltdowns

According to the New York Daily News, "the South" is still struggling to gain control of things after the "mild" storm that recently paralyzed most of Atlanta. The nation's more populous metropolises are scratching their collective head and wondering how two inches of snow could possibly wreak so much havoc on a modern American city. To me, though, it is not much of a mystery. I'd like to say a few words about it, as a public service to all my Canadian readers, and my readers who are from regions that frequently get much bigger snowstorms, and yet still manage to function as cities and communities.

My first point is the most obvious. However normal two inches of snow might be for you, in places like Atlanta, Georgia, it is rare. While your city might be well equipped with snow plows, tow trucks, snow tires, tire chains, snow blowers, salt or sand on the road, etc., cities like Atlanta and Dallas have no such infrastructure. The reason is that it's not a good investment. I know it seems like a good investment, now that we've seen what havoc can be caused by two inches of snow, but the real trade-off is: tens of millions of dollars of municipal spending on snow maintenance infrastructure on the one hand, and public school spending, the social safety net, civil servant salaries, etc. on the other. When you stop to think that, in Georgia, two inches of snow all at once is something that only happens once every few years, the trade-off is not as obvious as it seems.

A second, related, point pertains to the decisions public officials made to try to manage the situation once the snow started falling. I'll grant you that if the mayor of Atlanta or the governor of Georgia grew up in New Hampshire or Salt Lake City, he should have known better. But if you grew up in a region of the world where major snowfall almost never happens, how good should we expect your judgement about snowstorms to be? Similarly, how well do I expect people in Columbus, Ohio to deal with hurricanes? You get the drift (pun intended).

My third point pertains to population. Ludwig von Mises observed that
...there prevails a tendency toward a distribution of population over the earth's surface in accordance with the physical productivity of the primary natural factors of production and the immobilization of inconvertible factors of production as affected in the past.*
This is a glorified way of saying that people follow the money. Elsewhere, Mises also discussed the Malthusian law of population - you know, the old idea that population will grow to the point where it becomes unsustainable, and then we'll all die. But in Mises' opinion, this is just a caricature of what Malthus wrote. Mises rather saw the Malthusian "law" as being a special case of the Law of Diminishing Marginal Utility - and he was right.**

What this means for Atlanta is this. People flock to Atlanta to avail themselves of its many redeeming qualities. What they find when they get there is: (typically) beautiful weather, a colorful culture, and excellent economic opportunities. I'd sure take that deal over a city that offered cold weather, an uninteresting local culture, and poor economic opportunities, wouldn't you? (Maybe this is why people fled Detroit and flocked to places like Atlanta.) But it's not a costless win. To get to Atlanta, you have to incur moving costs; but you also have to give up any of the things you might have enjoyed about your old city. Maybe you don't mind giving up the cold weather, but you might miss the skiing. Maybe you don't mind giving up the non-existent nightlife of Butte, Montana, but you might miss the local tourist attractions.

Anyway, Atlanta also comes with its own drawbacks, which may or may not be deal-breakers to you. One such drawback, especially if you come from rural Georgia and move to "the big city" in order to "make it," is its large population. True, it's not as populous as many other American cities, but regionally speaking, it's up there.

Well, you'll notice there are no reports of crippling traffic jams and people sleeping in grocery stores from Athens or Macon or Savannah, Georgia, right? Much less are there reports of kids stranded at schools in Lafayette, or Anderson, or Tifton, or Douglas, or Inverness, or Troy... Is this because these cities all have better winter infrastructure? Or superior elected officials? Or better preparation? Not likely. No, the real reason is that smaller, more sparsely populated cities can better handle freak storms than more populous cities (assuming, of course, that the storm does not result in total devastation) because the people in those towns do not have to worry about severe traffic jams and freeway travel. The roads might be equally hazardous, but there are fewer people on them. There might still be car accidents, but commutes are shorter, so they have less of an impact on the victims' ability to get home. And so on.

Consider all of this when you ask yourself why "the South" has so much trouble with two inches of snow. My sister - who, like me, grew up in the snowy Intermountain West - also lives in the South. She laughs when school is cancelled over "a dusting of snow." But she lives in a small southern town, not unlike Rome, Georgia, where people can make do with bad weather conditions. But if you live in Atlanta, and the snow falls, and the heavy traffic grinds to a standstill, and the baby is crying and you need to buy diapers, you'd better believe you'll feel the hurt.

* von Mises, Ludwig, Human Action, p. 627
** Ibid., p. 129, which reads as follows:
The Malthusian law of population and the concepts of absolute overpopulation and under-population and optimum population derived from it are the application of the law of returns to a special problem. They deal with changes in the supply of human labor, other factors being equal. Because people, for political considerations, wanted to reject the Malthusian law, they fought with passion but with faulty arguments against the law of returns--which, incidentally, they knew only as the law of diminishing returns of the use of capital and labor on land. Today we no longer need to pay any attention to these idle remonstrances. The law of returns is not limited to the use of complementary factors of production on land. The endeavors to refute or to demonstrate its validity by historical and experimental investigations of agricultural production are as needless as they are vain.


Are We But Racists?

As an addendum to my previous post, consider the following. Scientists have confirmed with genetic testing that human beings cross-bred with Neanderthals.
In the 1990s, researchers began finding fragments of Neanderthal DNA in fossils. By 2010 they had reconstructed most of the Neanderthal genome. When they compared it with the genomes of five living humans, they found similarities to small portions of the DNA in the Europeans and Asians. 
The researchers concluded that Neanderthals and modern humans must have interbred. Modern humans evolved in Africa and then expanded out into Asia and Europe, where Neanderthals lived. In a 2012 study, the researchers estimated that this interbreeding took place between 37,000 and 85,000 years ago.
In a rare act of mercy, I won't name any names in this post. However, the argument has been advanced in certain sectors (over and over and over again) that human beings are incapable of getting along in large, diverse, cosmopolitan communities. It has been said that history proves that this can only end in violent conflict.

I have the links to where this theory has been advanced, but you guys know who you are, don't you?

Anyway, I just have to laugh. History shows that we hate diversity so much that we are willing to breed with members of a completely different species, so long as they look a little bit like us and we find them sufficiently attractive.

So was this an isolated happenstance? Some ancient pervert decided to get it on with a Neanderthal, staining mankind's tradition of ethnocentrism forever? Or is human history actually a tale of remarkable outreach and the peaceful pursuit of positive relationships?

Remember, we're broadcasting radio waves to space aliens even as I type this, hoping that we get an answer. Well sorry, alien ladies, I'm off the market. But I hear all those guys who took the "Red Pill" are really good at courting your submission.

More On In-Fighting

Apropos of something I must be unaware of, Steve Horwitz recently linked to this old post at Bleeding Heart Libertarians. That link, in turn, discusses the essay found on page 34 of this pamphlet, available at Mises.org.

First, let me quote a few passages from the Rockwell essay. To avoid confusion, I want to begin by making clear that I object to all of the above links, and the following passages are things that I find highly problematic.

To wit, Rockwell writes:
Pornographic photography, "free"-thinking, chaotic painting, atonal music, deconstructionist literature, Bauhaus architecture, and modernist films have nothing in common with the libertarian political agenda - no matter how much individual libertarians may revel in them. In addition to their aesthetic and moral disabilities, these "art forms" are political liabilities outside Berkeley and Greenwich Village.
I wonder if Rockwell would defend this passage today, if pressed to do so. Are we really to believe that if I enjoy listening to one of Schoenberg's serial compositions, I'm against libertarianism? That's a tough pill to swallow. Could Rockwell himself swallow it? Never mind that, actually, I have a better question: Could Rockwell provide a cogent argument for why Schoenberg is a "political liability?"

Later (emphasis mine):
Too many libertarians also join liberals in using the charge of racism to bash non-conformists. It may be scientifically false to believe, for example, that Asians are more intelligent than whites, but can it really be immoral? From a libertarian perspective, the only immorality would be to seek State recognition of this belief, whether correct or incorrect.
This is truly remarkable, and would be laughable, were it not an idea that keeps popping up in the blogosphere. But on a related note, Rockwell further writes:
From a Christian viewpoint, it is certainly wrong to treat someone unjustly or uncharitably as a result of racial beliefs. It is also wrong to treat someone unjustly or uncharitably because he's bald, hairy, skinny, or fat. But can it be immoral to prefer the company of one to the other?
Good question. Is it immoral to prefer hanging out with bald people versus people who are not bald? Is it immoral to choose one's private associations based on superficial physical traits? I wonder what Rockwell would say, if I asked him.

(Of course, the clever criticism of the above passage is that treating someone unjustly or uncharitably as a result of racial beliefs is not merely wrong from a Christian viewpoint. Indeed, it is universally so, regardless of whether one is a Christian or a member of some other religion. Every religion preaches racial equality, every last one of them.)

And finally:
Libertarianism is widely seen as anti-force. But force will always be necessary to defend against wrong-doers and to administer justice. Libertarianism opposes aggression against the innocent, not coercion in general.
Rockwell insists that coercion is necessary and appropriate, so long as the victim is not innocent. That's unsettling for many reasons, but to me the most compelling reason is the possibility that such a policy in practice could very easily get out of hand.

The above passages are, in my opinion, the most offensive in the Rockwell pamphlet.

Having said all that, I must confess that I read Horwitz's post first, and was under the impression that Rockwell's essay would be far more offensive than it turned out to be.

Hearing an old, bearded, white man extol the virtues of conservative Christian, Anglo-Saxon culture is not exactly news. For one thing, you'll find similar expositions at iSteve, Anti-Gnostic, Chateau Heartiste, and so on. To be sure, there are a lot of praiseworthy aspects of Western culture, and there should be nothing wrong in plainly acknowledging them. But Rockwell doesn't seem to want to keep the good and toss the bad (and there is a lot of bad in every culture) when he writes that "Western civilization [is] eminently worthy of preservation and defense" (emphasis mine).

"Preservation and defense" begs the question: how will we "preserve and defend" Western civilization, if not by practicing exclusion? Surely we all agree that the great works of Isaac Newton, Plato, Descartes, et al., ought to be preserved for as long as humans can learn from them, and defended against being destroyed. But even my friends born in Africa or Asia would agree with that. Thus, this is clearly not the kind of "preservation and defense" Rockwell has in mind.

I agree with Horwitz that far, at least. But Horwitz writes:
As Jacob [Levy] says, the attempt to court the right through appeals to the most unsavory sorts of arguments was a conscious part of the “paleolibertarian” strategy that Lew Rockwell and Murray Rothbard cooked up in the late 1980s. What’s happening right now is that the chickens of that effort are coming home to roost with large external costs on all of us as libertarians.
And later:
Even after the paleo strategy was abandoned, Ron was still there walking the line between “mainstream” libertarianism and the winking appeal to the hard right courted by the paleo strategy. Paul’s continued contact with the fringe groups of Truthers, racists, and the paranoid right are well documented. Even in 2008, he refused to return a campaign contribution of $500 from the white supremacist group Stormfront. You can still go to their site and see their love for Ron Paul in this campaign and you can find a picture of Ron with the owner of Stormfront’s website. Even if Ron had never intentionally courted them, isn’t it a huge problem that they think he is a good candidate? Doesn’t that say something really bad about the way Ron Paul is communicating his message?
Thus Horwitz insinuates that Lew Rockwell and Ron Paul might have deliberately courted neo-Nazis because they thought it would be good for the libertarian movement. That suggestion is more than wrong and offensive; it also defies logic. Why in the world would any libertarian think cozying up with neo-Nazis would benefit the movement? The mind reels.

Horwitz might counter: Indeed, that is the whole point - it was a terrible strategy. But first he must convince me that this was indeed the strategy, and despite the highly problematic passages I have quoted above, nothing about the Rockwell essay would lead me to believe that he intended to deliberately attract racists to increase the size of the liberty movement.

Sorting It Out
When I comment on such matters, the reader must understand that I wasn't "there" and I don't really know. Anything I say here should be understood to be speculation. However, it is speculation based on the available evidence. Those who were "there" are the only ones who can say for certain.

But we younger folks don't have the luxury of having been "there," and we'd still like to have a bit more liberty in our lives. What are we to think?

First of all, I think it's far more likely that Rockwell genuinely believes in social isolation. His website has published critiques of free immigration such as this essay by Hans-Hermann Hoppe. To discover that Lew Rockwell is a critic of cultural diversity is - once again - rather dog-bites-man. He writes in favor of "Western civilization," he denigrates the idea that Dizzy Gillespie's music compares to Bach, he finds no moral objection to believing that different races are differently intelligent, and he thinks it is perfectly fine to avoid hanging out with bald people. It is fairly safe to say that Rockwell's views coincide well with what Anti-Gnostic calls "the Dark Enlightenment."

I suppose the whole question - the one that Horwitz would like to force - is whether one can be both libertarian and ethnocentric. This is the question that likely led to my unfortunate bromantic break-up with Sonic Charmer. It seems to be the question at the heart of the "open borders" debate. Can liberty-loving people be ethnocentric? Do those two ethics match up?

Coming at the question from the Enlightenment angle, as Horwitz does, replete as it is with tales of "all men are created equal," and women's suffrage, and civil rights, it is very difficult to conclude that ethnocentrism and libertarianism are compatible. Consider what libertarian godmother Ayn Rand wrote:
Racism is the lowest, most crudely primitive form of collectivism. It is the notion of ascribing moral, social or political significance to a man’s genetic lineage—the notion that a man’s intellectual and characterological traits are produced and transmitted by his internal body chemistry. Which means, in practice, that a man is to be judged, not by his own character and actions, but by the characters and actions of a collective of ancestors.
In the Objectivist tradition, racism is collectivism, and collectivism is both anti-liberty and anti-life. But this argument will obviously have no sway with those libertarians, such as Rockwell, who believe that Rand was the leader of a cult.

And this is the tragic end of the libertarian movement. At the end of the day, after decades of progress, the movement unravels in full view of the public, not because liberty was tried and failed, but because libertarians themselves cannot seem to agree whether or not they are racists.

Let me briefly acknowledge the obvious critique here: I understand that the Rockwells and Steve Sailers of the world don't actually believe that they're racists, but it's hard to conclude otherwise when they insist that there is nothing morally objectionable about preferring the company of whites. Isn't it?

Like Horwitz, I believe this leaves a pockmark on the face of the liberty movement that scares people away whenever liberty starts to get favorable press. The Tea Party rises, and quickly falls, precisely because charges of racism can stick to essays like the Rockwell piece I've discussed here.

But if you think I'm not giving Rockwell a fair shake, consider this passage from the same essay:
The only way to sever libertarianism's link with libertinism is with a cleansing debate. I want to start that debate, and on the proper grounds.
In effect, Lew Rockwell never called for a declaration that his way, and only his way, was the viable form of libertarianism - at least not in the offending article. Instead, he outlined a position he called "paleo-libertarianism," and called for a debate.

To my knowledge, this debate never happened. Horwitz's blog post does not seem to advance the debate very far, either, although I readily concede that the ideas laid out on BleedingHeartLibertarians.com certainly qualify as participating in a debate about what libertarianism consists of.

Rather than lob rhetorical Molotov cocktails at each other, I think the old guard should participate in the very debate Rockwell hoped to initiate. And I think the starting point should be not the Rockwell article or the Ron Paul newsletters, or even Horwitz's blog post.

Instead, I think the debate should start at the same place that so many of these folks came to self-identify as libertarians in the first place: Ayn Rand. Let them have the debate, and let them start by articulating the extent to which they agree with Rand's views on racism, quoted above. From that starting point, let them produce their rationale and convince each other that "true" libertarianism is either anarcho-capitalistic ethnic enclaves, or enlightened, libertine, "bleeding heart" societies.


I think I have found a new hobby horse in complaining about the juvenile, shrill, passive-aggressive, mildly effeminate language deployed by modern "progressives" against their opponents.

The kind of language to which I refer is typically spoken as though it were a bored response to a tired trope, but it always belies a level of irritation that goes far beyond what the literal translation of it would have you believe. That is, the speaker wants to sound unfazed, bored, condescending, almost sleepy, but instead ends up sounding extremely vexed. Before I venture any further here, I should provide some examples.

In a recent comment on Stephen Williamson's blog, Noah Smith writes:
In any case, we should not be dismissive of what Prescott is saying. [Ed: Smith is quoting Williamson here.] 
I think, actually, we kind of should.
"Actually, we kind of should." This is a perfect illustration of what I'm talking about (and Williamson offers the perfect response), but I'm not content to leave it at that. Let's take a look at additional examples.

Writing in response to a passage in this book (I am not endorsing the book, please note), an email correspondent of mine says:
I don't agree with any of this and find that it somewhat feels like it's justifying behavior that is violent, mean, and rapey.
Here the offending word is "rapey." There is absolutely nothing cute about rape, but adding "eee" to the end of it makes the description vaguely insulting and highly childlike. Like "meanie."

A similar phrase - one that you'll often read on Slate or Jezebel is "That is not okay." And typically the last two words are emphasized: "...not...o-KAY!" It's the kind of thing you'd expect a mother to say to a toddler. "We do not hit people. That is not... o-KAY!" Compare this to a far more disciplinary phrase like, "If you continue to hit your playmates, I will take away your toys and make you sit in the corner."

Saying that something is "not okay" is a far more effeminate, passive-aggressive way of disciplining a child. It assumes the role of a parent is to instruct a child as to what is "okay" and what is "not okay." It's Orwellian; we can draw close parallels to "good" and "un-good." As if a child's life is governed by what is understood to be "okay!" Children respond to incentives and consequences - just like the rest of us - not social conformism. But I digress.

Anyway, speaking of Slate, Amanda Marcotte writes, "All jokes aside, Hannity's boo-boo here was the result of a larger lie..." Boo-boo; infantile language; we can contrast it with the words gaffe or mistake. Understand, there is a reason Marcotte says that Sean Hannity made a "boo-boo," and the reason is to infantilize him and dismiss his position. But if what Hannity says is erroneous to the point of being childish (and believe me, I have no reason to believe that Sean Hannity has ever said anything worth defending at Stationary Waves), why would Marcotte write in the same article that "Sean Hannity found himself getting aggressive with a woman who called into his show"? Women accuse men of being aggressive toward women when they want to shame the men in question. And, however much Hannity may deserve to be shamed for the things he says - and whatever it is you may personally believe about any related issue - shaming is not an activity reserved for people about whom we feel blasé.

...unless, of course, our passive-aggressive attitude and language belies a heightened level of shrill indignation.

The Point
Through it all, it is important to keep one thing in mind at all times: Those who rely on this kind of "shamey" language (get it? I'm doing it to them now...) are not actually responding with sound arguments. Consider each of the examples above.

First, Noah Smith argues that "we kind of should" dismiss a claim by Ed Prescott, but offers no substance with his comment. Williamson rightly calls this kind of comment "blather" and advises Smith to write about what he knows. Without argumentative substance, it's all just kind of "shamey," isn't it?

Second, my email correspondent could not credibly accuse the author of a 19th century book on gender relations of the crime of rape, so instead the correspondent simply accused him of being "rapey," i.e. offering an argument that resembles rape in some unspecified way. Rape is, of course, a terrible and inexcusable crime of hatred. What is "rapey"? It is something to which the correspondent wishes to attach a similar level of shame, but without any sort of fact or reasoning to merit the claim. "Rapey" is "shamey."

Third, as discussed above, attaching negative consequences to unacceptable childhood behavior is called discipline. It is concrete, specific, and enforceable. But simply declaring something to be "not okay" and frowning furtively at a child (or an adult, please note) is synonymous with the act of declaring an action to be socially frowned-upon. The point is that those children (or adults) who engage in that behavior should be ashamed of themselves. It's shamey. But there isn't any specific reason why people who say "not okay" are saying what they're saying. The best you'll ever get from them is "we don't do that." It's an act of shaming someone by attaching social unanimity to whatever the speaker has deemed to be "not okay."

(More on this in a forthcoming post about use of the word "we," by the way.)

Finally, Marcotte's shaming of Sean Hannity for being "aggressive" (toward "women") and making a "boo-boo" is one last example of an argument made without facts. We are simply told to accept that Hannity is aggressive; we are simply told that his mistake is a juvenile one (because it is a "boo-boo," not an error).

And what was Hannity's error? It was the suggestion that women who care passionately about access to birth control ought to form a private charity to supply it for the less fortunate, rather than demand a government mandate and pass the cost around to every American taxpayer, regardless of whether he or she philosophically agrees with the mandate. That's not an error, it's an opinion.

Marcotte, by the way, goes on to say that such a private charity is exactly the same thing as both health insurance and ObamaCare, a claim I have endeavored to shame a bit myself (see "Error #8" in this response to David Simon's having said something similar).

Being "shamey" is not the same thing as being correct; but above all, being "shamey" is actually the exact opposite of having the better argument. Each and every case of a person's being "shamey" is an example of his or her having no concrete defense for the position. Of course, not being able to offer a good argument doesn't mean that one is wrong; it just means that one cannot legitimately claim to be more right than those whom one is shaming. "Shameyness" is not merely a bad way to respond to an argument; it is not a rhetorical response at all. It is nothing more than immature, effeminate passive-aggression bereft of facts or reasoning.

So, rather than attempting to shame those who disagree with you, why not respond with facts and reasoning? You'll end up looking a lot less ridiculous.

I plan on following up on this issue when I notice particularly egregious examples. Thus, I've created a new label, "Shamey," by which to track the matter on Stationary Waves. For now, I wanted to introduce the issue, especially since it provides a good set of background information for a forthcoming post about use of the word "we."



Graphic courtesy mathworld.wolfram.com
Reading this recent coverage of a study that found an inverse relationship between coffee consumption and type 2 diabetes incidence (that is, more coffee = less diabetes), I was swept up by a familiar sense of irony.

It is, after all, a remarkable irony that I, of all people, would end up with diabetes. I drink more coffee than anyone. But it's not just that. I have been a life-long distance runner - I ran competitively from the age of 8 to the age of 20, and then got into ultra running and marathons. I've always eaten lots of vegetables and fruits. I've always adhered to a regular schedule and I've always managed to get plenty of sleep. Regular exercise, a healthy diet, a healthy schedule... I'm the last person anyone would expect to wind up with diabetes.

Of course, mine is not type 2 ("lifestyle diabetes"), but rather type 1: pancreatic failure. Through the dumb luck of genetic mutation, I ended up with a set of symptoms that most commonly affects people who live very unhealthy lifestyles. There's your poetic irony: the health nut winds up with the same problems that people get from abusing their bodies for decades.

For this reason, I like to call my condition "Ryabetes." It's not "real diabetes"; it's a byproduct of the fact that if anyone is going to be some kind of statistical outlier, it will be me.

It's tempting to draw a perverse lesson from this: It doesn't matter how healthy you think you are, something bad is going to happen to you, so you might as well just enjoy yourself. I could have lived a lot more recklessly during my youth; perhaps if I had known this was going to happen, I might indeed have done so.

The real lesson to learn, though, is that life is a series of chaotically random events that human beings try to control, but seldom can. But for the genius of insulin injections, I wouldn't even be alive to type this message. People are always searching for meaning where there isn't any meaning. When they do, they miss the more important truth that what is truly meaningful are the things that keep human beings alive and interacting with each other. Life is important. Irony just makes the story more interesting.

Why Doesn't Nickelback Sound Like Soundgarden?

Part One:
Photo courtesy www.nickelback-albums.net
It has been written (so widely, in fact, that it is stated as a fact at Ask.com) that Nickelback's primary influence was Soundgarden.

I first heard about this years ago, back when Nickelback's popularity was really starting to peak. As fate would have it, I was living in Alberta, Canada (Nickelback's geographic origin), at the time. Whether this gives me any extra insight into the truth of things is highly debatable, but I bring it up simply to say that there was a whole lotta Nickelback going on where I was, when I was there. Nickelback on the radio. Nickelback on TV. Nickelback in the clubs. Cover bands would even play Nickelback. It seems odd nowadays, because they are such a widely panned group despite their popularity, but for a while there, Nickelback really was "it."

And this is the question that absolutely fascinates me: How is it that a band whose primary influence is Soundgarden became the Poison of its generation? How does that happen? How does one set out to become Soundgarden and end up becoming Nickelback?

Part Two:
Another fascinating situation is walking into a dirty bar on Wednesday or Thursday night and listening to one of the many terrible local rock bands that play original music in every major and minor city in the United States and Canada. Many people believe that video killed the radio star, that hip hop killed rock, that MTV ruined everything, that the music business destroyed artistic integrity, and so on. I, on the other hand, have argued (see here and here and here, for example) that music isn't very good these days because musicians are bad at making good music.

Seeing the average "local band" perform live is not merely a case in point, it is the entire case. I don't fault any artists for trying and failing. There is no shame in aspiring to be great and coming up a little short. What does strike me as odd, however, is that most local bands are under the impression that they are putting forth a quality product.

Wait, hear me out.

It is obvious to the band that they are not playing high caliber music. They know it, the audience knows it, everyone knows it. The issue is not whether the musicians are playing perfect or legendary music. The issue is: If they're playing such bad music, why don't they write something better? I know, I know: it's easier said than done. But still, if you idolize Soundgarden and yet your music comes out sounding like something of a cross between Chicago and The Sex Pistols, you have to know something went wrong. Don't you?

This isn't simply a matter of not being good enough. This is a question of having ears. Compare the chord progression of "Black Hole Sun" to the chord progression of "How You Remind Me." They are not merely dissimilar. They have virtually nothing in common. It's not just that two different people wrote those songs, it's that one is elaborate, making use of time-signature-, key-, and mode-changes, while the other is a single four-chord progression repeated over the simplest possible rock beat. It is only one step above Andrew W.K.'s claim that he was influenced by Beethoven. (Don't remember Andrew W.K.? Don't worry...)

About the only thing Nickelback and Soundgarden have in common is the fact that both bands sometimes play music that involves distorted electric guitars. That's it.

Part Three:
Nickelback has no obligation to sound like Soundgarden, nor is any local band required to make music that sounds like their favorite bands. That's not the point.

The point is that the moment one becomes aware of the fact that one is falling far short of the stated objective is the exact moment one typically pauses to reflect on what one is doing, and sets about to change course. That is to say, a young basketball player can only miss so many free throws before he stops to ask himself whether he might have better luck if he modified his form. Maybe he needs to put more spin on the ball. Maybe he needs more arm strength. Maybe his shot needs more loft. Maybe a million things. The point is, if he misses enough shots, he comes to one or both of two conclusions: (1) "I need more practice," (2) "I need to modify my technique."

In that respect, music is comparable to sports. If one sits down to write a great, heavy, riffy song like "Hands All Over," and comes out with "Someday," one has to be aware of the fact that the result doesn't match the intention.

"Every time I try to write something like Led Zeppelin, it comes out sounding like Pat Benatar!" Don't you get it? The problem is you. You need to hone your craft. You need to study up on what makes a Soundgarden song a Soundgarden song. You need to figure out why it is that Poison sounds more like Kiss than like Van Halen.

You can choose to love whatever music appeals to you. This is not about taste. It's about knowing what sounds sound like. "Love Gun" does not sound like "Hot For Teacher," even though they both have shuffle beats. There are important differences between the Beastie Boys and Kid Rock. If you plan on sounding like the modern equivalent of one or the other, then you need to know what those differences are.

Part Four:
This is my beef with local bands. They all say the same thing. They all love Led Zeppelin and they all want to be the next Nirvana. But when the lights go up and they take the stage, they all sound like Nickelback (if they're any good!). And through it all, none of them - not one - ever pauses to ask the question why they brought a map for the "Highway to Hell" and somehow ended up in "Margaritaville." 

Well, wouldn't you ask the question, if it happened to you? Why doesn't anyone ask the question?

It's not that the decline of music is any great mystery. Everyone who plays music in public is painfully aware of the fact that none of the bands out there sound anywhere near as good as [insert famous band of yore here]. We musicians - all of us - seem to know that there is no great appeal in what we're doing. The cover bands are still covering Creedence Clearwater Revival and Jimi Hendrix - I know, because I cover those songs myself.

There is, of course, nothing wrong with covering classic rock songs when performing on stage for money. People like those songs, and the job of an entertainer is to entertain. If your objective is to play those songs, and you end up playing those songs, then you started with a map to "Highway to Hell," and you ended up on the "Highway to Hell." Success!

And my point is that people are covering the old songs precisely because the new songs aren't any good. Musicians stop playing original music because they don't enjoy playing original music. Why not? Sure, we can point to the control-freaks who drive people out of bands, but that just means that all the people driven out of bands should eventually end up in less-controlling bands with each other. So it's not that.

The answer, the real answer, the final answer, is that the musicians who are writing original music don't seem to want to hone their craft, at least to the point where they can figure out why their maps all say "Soundgarden," but the road signs all say "Nickelback." They can't seem to figure out why that's a problem. They get offended, and make excuses: "There's nothing wrong with writing a good pop song!" I agree, there's nothing wrong with that, if that's what you set out to do. But if you set out to write a complex, progressive-rock-inspired grunge tune and then ended up with a good pop song, then you failed in your objective.

Part Five:
I'll leave with a parting word to music fans: You're not helping.

When you buy into what Rolling Stone tells you about the history of music, when you subscribe to the idea that it was Elvis ---> Beatles ---> Zeppelin ---> [a bunch of terrible glam bands] ---> Nirvana (SAVIORS!) ---> The Strokes or something ---> Nickelback ---> OutKast ---> [wait, what?] ---> P!nk ---> Lady Gaga ---> uhh... ---> Imagine Dragons, I think ---> Lady Antebellum ---> or did the other one come first? ---> what's on Ryan Seacrest this week? ---> wasn't Prince kind of famous for something, at some point? ---> Green Day had a hit album before American Idiot? ---> OMG I love Sound City!

That sentence isn't coherent, and neither is the Official Rolling Stone history of music. So, my request to you music fans is: stop making up excuses for music. Stop trying to fit it into what you perceive to be music history. The history of music is a massive scatter-plot of individual artists doing individual things. Each artist has his or her own artistic vision, and that artist is successful to the extent that the end result matches the original vision.

Get that? Commercial success is not relevant to artistic success. Being famous is not the same thing as being good. Having a hit record chronologically later than Nevermind does not automatically mean that the hit record was "inspired by" or "influenced by" or "made possible by" Nirvana. And, no matter how much it rattles your view of the musical cosmos, you absolutely must repeat this exercise by substituting "The Beatles" in for "Nirvana" and any Beatles album for "Nevermind". 

In short, you, the music fans, must learn to form your own opinions about music and its evolution. You must fight against the desire to simply buy into whatever prevailing opinion is marketed to you most aggressively. If music is important to you, you must learn to treat it as though it is important. You must learn to listen carefully, critically, and consider all aspects of what you're hearing.

There is no harm in not believing that music is important, and not treating it as such. But if you believe that you can treat music as though it is unimportant, just so that you have enough intellectual wiggle room to declare any pile of crap "genius," then you are lying to yourself and others. That might be good for you, but it's bad for music.

And so my request to you is: Please stop.

Who Is The Modern-Day Steely Dan?

In the 1970s, Steely Dan made a name for themselves by playing jazz-inflected pop rock music that was heavy on the instrumentation, mildly progressive, and above all, sharply witty and critical of popular culture. No matter how much jazz they threw into the mix - the classic "Your Gold Teeth II" features a brilliant, jazzy solo by guitarist Denny Dias, for example - Steely Dan managed to achieve widespread commercial success and critical acclaim without ever receiving too-harsh criticism for their jazz roots. Despite their deep-cut forays into progressive rock territory (take the extended instrumental mid-section of the song "Aja," for example), they were still able to churn out hit after hit, like a regular gold record factory. And no matter how sharp and bitter the lyrical criticisms got ("You've been telling me you're a genius since you were 17 / And all the years I've known you, I still don't know what you mean"), they never reached the point of being thoroughly off-putting.

Put it all together, and what have you got? You've got a band that sits comfortably within the radio-friendly mainstream, while simultaneously keeping close touch with the guardians of intelligent music (jazz, classical, progressive rock, and the like). You've got a band whose sharp wit is cocked, loaded, and aimed directly at pop culture, even while the band is a part of it. You've got a band that, despite not being universally beloved, is widely adored by both casual pop music fans and music snobs alike.

Does anyone in the modern music landscape fit this description? The only name that comes to my mind is Foo Fighters, and they are not quite intelligent enough to warrant the comparison.

What's your opinion?


Fitness: I'm Still Here

I have been running about ten kilometers a day, and working my way up, in preparation for the Cowtown Half-Marathon coming up in late February. Because my objective with this half-marathon is really to re-accustom myself to running longer distances and to shed unnecessary muscle mass, I have been engaged neither in formal, structured training, nor in workouts that amount to anything more than "I'm going to go for a run."

I realize this makes for incredibly boring blogging. There is not much even a very talented writer could say about just kind of going for a run. So, rather than waste your time by trying to keep you updated on a series of rather dull daily runs, I have opted to blog about other topics. Nevertheless, I thought I should say something about how I've been training lately, in order to tend to that aspect of the blog.

I often discuss exercise motivation on my blog. Self-motivation plays such an important role in exercise because "fitness" is not a very specific endpoint. Once you "get there," you still don't feel as though you really are "there." All those pictures you see on social media websites, where the beautiful people are having a gorgeously photogenic exercise experience, depict something that never actually happens in the real world.

No, in the real world, once you've achieved a certain level of fitness, it starts to be more about maintaining what you've got, preventing yourself from losing muscle mass, or strength, or speed, or flexibility... The human body is a very use-it-or-lose-it kind of a machine. If you lose your motivation, it's easy to lose the physical progress you've built up. Moreover, as you age, you naturally start to lose speed and strength, albeit gradually.

So today I'd like to offer a good, simple way to motivate yourself during your daily run: Go find some beautiful scenery to run through. It seems so simple and so obvious. Running can get monotonous, and a change of scenery is always good for that, but chances are you've already exhausted most of your local "running routes" many times over by now. Running around the neighborhood is convenient, and can be quite pleasant if you have a good rapport with the neighbors.

But there is no replacement for a scenic river or lake, a mountain path, a beautiful canyon, or a nice beach. There is no replacement for a quiet country road or a green park, the soul-lifting solitude of an expanse of farm land or a bustling sidewalk through the city's main park lined with a who's who of the local running culture. Whatever does it for you, go out there and find it. Find some place that makes you excited to run, that makes the experience more fun than trotting around the surrounding streets.

The result of my foray into more scenic running? I've managed to significantly increase my weekly miles, and more importantly, my average pace has returned to an ever-more-comfortable sub-7:00-per-mile pace. (Remember, folks, when it comes to running, faster is easier.)

Editing Out The Snark

Every time I quote this guy or tell someone to read his articles, I get a lot of push-back because of the way he phrases things. He has a knack for making the reader uncomfortable. Now, some people respond well to that sort of thing, and others require that bad news be delivered only in such a way as to boost their fragile egos.

So, let's try an experiment. I'll quote excerpts of his latest article in such a way as to soften the blow. If you like what you read (or even if you don't), I encourage you to read the whole thing. I'll post a link below.

[Note for the entire excerpt: For readability's sake, I am eliminating ellipses and brackets. The passage below has been modified from its original form in that important passages have been eliminated for the sake of an intellectual experiment. See below for a link to the original - vastly superior - article.]

When someone tells you about the negatives of being too plugged in, they almost always blame work emails, as if the things that pay for your dinner are what distract you from dinner.

Email is a convenient scapegoat not just because "family time should be protected" but because it gets us out of inquiring what went wrong with our home life that we could ever be tempted by work emails, and the avoidance of this inquiry is highly suspicious, i.e. on purpose.

One of our time's great sociological questions is why we filled downtime back up with work. At some point home life became more stressful than work life, and by the mid-80s the home was no longer a respite from modern society's incessant demands. Home became work, and this parallels precisely the history of homework. Neither is there home cooking at home. Men have long been resigned to this, hence their desire to "get an early start" or eat their lunch in their cars, while little girls were hooked on the potential of a fulfilling work and home life, or at least work or home life, now women are in on the reveal... and it is shaking their very souls. If home is stressful for adults, think about how bad it is for teens, all they want to do is hang out and talk about how phony everything is and instead they're stuck upstairs with Snapchat while trying to ignore the growing emotional distance between their parents.

Part of the reason work and home keep mixing despite our professed desires is that that's how Americans were taught to see an aspirational adult life. In every TV show and movie the protagonist's job and personal life overlap-- doctors in love, CIA agents defending their family, late nights at the office trading zingers or abuse stories. While we no longer think we want the overlap, the shows reinforced the false psychology that a person is something, all the time and everywhere, and the backdrop world "sees" it, accepts it. The structure of these depictions represents the fundamental narcissistic fantasy: a fixed and clear identity-- a character--seen by a potential audience. This is why home is not relaxing: we are working to not let it be all that we are.

The standard criticism of social media and texting is backwards: it doesn't detract from real life relationships, it represents a much desired break from them. Having to be with someone, especially someone you're not having sex with, especially someone you're not having sex with anymore, is very, very hard; having people see you, especially when you're not amidst the symbols that you believe form your "real" identity - say, a hedge fund trader who has to be home with the kids or a pretty girl in a sweats at a supermarket - this is a kind of exposure far more embarrassing than any selfie. What if they confuse that as the real you? You can see a version of this in married couples who talk to each other, joke, eat, raise kids, do couples stuff, but don't make eye contact. Avoiding eye contact is a way of keeping reserved a part of yourself, to yourself. "I'm here," you whisper to yourself, "but I'm not going to let this all overtake me, I'm more than this." This message is strictly internal, after all, you may not be looking at them but they can still see you.

What the couple should have done to avoid this calamity is formed a shared identity, "this is us". But how were they going to do this? Everything conspires to drive them apart. Even a big tent TV show would be a shared hour.

The only shared identity these couples have is "the kids", which is why they can make eye contact easily when they talk about them. But relationship experts have analyzed today's marital difficulties completely backwards: rather than trying to find some common connection amidst the turbulent waters of life, they are actually struggling against the current of the relationship to keep themselves private. They fought so many years to be seen as individuals, "be true to yourself", that a few years past the exploratory segment of the relationship and a shared mental space becomes suffocating. So plugging in gives them some privacy, a micro-break from shared reality, under the rhetorical cover of "connecting with others."

What went missing? Why, after a decade of marriage, should dinner be a regular review of the somewhat boring goings-ons of "the day"? Because that formality is freeing, it allows self-conscious physical bodies to get used to standing next to each other without having to be acting, this includes husbands and wives. When dinner is a controlled process with "manners" and expected topics of shared conversation and start and end times, as boring as it may get, it is boring, not you.
Much more to be found here. Do read the whole thing.


Anti-Gnostic has "the bullet points."
1. Not everything unleashed by the Enlightenment was good, hence the "Dark Enlightenment" as a reaction to same.
2. The Cathedral exists as an institutional alliance of Government, Academia and Business to further politically correct dogma and punish heretics. It is, in a real sense, religious and not just ideological.
3. Reality is not what the Cathedral tells us it is.
4. All men are not created equal; people are different in a variety of ways.
5. Democracy is a disaster. When Classical-era Greeks, Gilbert K. Chesterton and the American state's founders praise "democracy," what they are really praising is rule by property-owning men. The universal franchise is a farce and a slow-motion train wreck.
Regular readers do not have to be told that I am not quoting this as an endorsement, but rather very much the opposite.


A Plain-Language Guide To Health Care System Problems

Much has been written about the problems of the US health care system, and of health care systems in general. Most articles rely either on insider jargon and wonk-language or vague generalities. As a result, most people do not understand the real problems. This became painfully apparent to me after reading the replies under a recent, uncharacteristically political Facebook post from the genius musician Mark Zonder.

What follows is my attempt at providing a plain language, easy-to-understand synopsis of why health care is so expensive, why so few people can afford it, and why all attempts to find a legislative solution only serve to make things worse.

How Costs Increase
Once upon a time, people used to pay cash for medical products and services. In some developing countries, it is still like that. This system had pros - such as the affordability of, and easy access to, medicine - and cons - such as low levels of regulation and consequently high levels of danger and "trust" in poorly motivated, profit-seeking firms.

At some point, people in the developed world decided that they would like to give up some affordability in order to gain some safety regulations. This reduced affordability because producers of medicine now had higher costs, thanks to the fact that they had to clear regulatory hurdles.

Regardless of whether you think this was a good trade-off, the fact remains that it was not fair. Rich consumers gained a great deal at the expense of both poor consumers (who could not afford the new, higher prices) and producers of medicine (who now had higher costs). Reasonable people can disagree about whether this is "worth it," but we must all agree that costs increased as a result. It is a plain fact.

In order to make up for some of the possible "unfairness," legislators offered medical producers patent protection on their medicines. Patents are government-enforced monopolies on the production of a product. As we all know, monopolies create special profit protection for the monopoly holder; the hope is that by offering patent protection, the government can encourage producers to innovate. Reasonable people can disagree about whether this is "worth it," but we must all agree that patent protection increases prices and reduces supply, as every monopoly is bound to do, by economic definition.
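The claim that a monopoly charges more and supplies less than a competitive market can be sketched with a textbook toy model. Everything here is hypothetical and chosen purely for illustration: a linear demand curve (P = a - bQ) and a constant marginal cost (c). A competitive market prices at marginal cost; a monopolist instead produces where marginal revenue equals marginal cost.

```python
# Toy linear-demand model (all numbers hypothetical, for illustration only).
# Demand curve: P = a - b*Q.  Constant marginal cost: c.
a, b, c = 100.0, 1.0, 20.0

# Competitive market: entry drives price down to marginal cost.
p_comp = c                    # price = $20
q_comp = (a - c) / b          # quantity = 80 units

# Monopoly: produce where marginal revenue (a - 2*b*Q) equals marginal cost.
q_mono = (a - c) / (2 * b)    # quantity = 40 units
p_mono = a - b * q_mono       # price = $60

# The monopoly outcome: higher price, lower quantity supplied.
assert p_mono > p_comp and q_mono < q_comp
```

Whatever specific numbers you plug in, the same qualitative result falls out: the monopolist restricts output below the competitive level and charges a higher price, which is the sense in which the text says this follows "by economic definition."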

Note that in the case of patents, producers of medicine gained a great deal at the expense of poor consumers (who could not afford the new, higher prices) and rich consumers (who now had to pay higher prices for the same medicine).

Thus, we see that there is a certain duality involved here. Governments act to regulate products and services, which results in higher costs. Then, to prevent businesses from fleeing the market altogether, governments offer special protections to businesses in order to incentivize their participation in the market. But both the regulations and the special incentives increase costs to consumers. That's important because it means that regulation increases costs and government protection of industry increases costs.

The Poor Are Left In The Lurch
Despite the fact that regulators have poor consumers in mind when they regulate the health care industry, the result of the new regulations is increased costs. Your medicine might be safer, but you might no longer be able to afford it. Similarly, despite the fact that regulators have access to medicine in mind when they offer special incentives to medical producers, the result of the new incentives is increased costs. Your medicine might be readily available on store shelves, but you might no longer be able to afford it.

Or, alternatively, a government mandate that you must buy health insurance or receive subsidized insurance from government might increase your ability to schedule an appointment with a doctor; but you are now on the hook for costs that you previously were not paying. Thanks, ACA.

Pharmaceutical safety regulations and patents are but one example. We can say the same of medical licenses: The regulations exist to ensure that patients receive qualified care, but doctors must now incur steep costs to meet the regulations. Costs increase, and poor patients on the margin fall off the ledge - they can no longer afford that which they were once able to afford. Meanwhile, these same licenses serve as a market protection for doctors by being a barrier to greater market entry; but again, this merely translates into a decrease in the supply of care, and thus higher prices.

That which is intended to help consumers in general only serves to help the wealthy ones. The poor, meanwhile, are left in the lurch.

So we see that there is a certain duality involved here. Governments are good at heaping on regulations and lending special incentives to businesses. Reasonable people might even conclude that these regulations and incentives are "worth it" in the long run. But there can be no denying that the result of these policies is an increase in price and a decrease in supply.

Quite often, leftists see the problems of government incentives to business but fail to see the problems of government regulations. Equally as often, rightists see the problems of government regulations, but fail to see the problems of government incentives to business.

The reality is, though, that you can't have one without the other, and that both types of policies serve to reduce access to medicine, especially for the poor.

You may ultimately conclude that this is the way things have to be. I hope, however, that you will reconsider.


What They Should Have Told You About School

Part One:
When I was in my 3rd year of medical school and we all had to select our tax bracket, the Asian women went into surgery, ophthalmology, or the last two years of a PhD program, you know where the borderline sleeves went? Pediatrics, which I think is technically sublimation but I'm no psychiatrist. The logic was straightforward: they wanted kids, and, unlike surgery, pediatrics offered future doctor-moms a bit of flexibility, while the Asian women apparently didn't worry about working late because their kids would be at violin till 9:30.
-- The Last Psychiatrist, "Don't Hate Her Because She's Successful"

Part Two:
In a former life, back when I considered myself intellectually exceptional and precocious - in other words, back when I was a mere boy - I suffered from many of the common conceits of boys in my position. The first was the belief that I possessed an above-average level of personal career ambition. The second was the belief that I possessed an above-average intellect. There are an uncomfortably high number of seedy details in the story of how and why I believed such things about myself, but they are mostly insignificant because, like a lot of people in my position, exposure to the real world beat a sense of humility into me. (This is absolutely true, even if it is not obvious from reading my blog.)

Perhaps I now suffer from a conceit of a different color: the vanity of believing that my own experience is reflective of a broader truth. If so, I implore the reader to engage in a little suspension of disbelief for the purpose of finding out whether or not it actually is true; and also - hopefully - entertainment. That is to say, allow me the vanity of believing that I'm not so different than other people, after all.

Like a good, intelligent, ambitious young man, I set myself on course for excellence! I declared upon admission to university a double-major: political science and economics. The reason I chose economics is because I found it interesting; the reason I chose political science is because it said "Pre-Law." Thus, I decided I would pursue a career as - don't laugh - no, wait, go ahead and laugh - a criminal defense attorney!

A few weeks into my university career, during the course of a mandatory meeting with my academic advisor in the Economics Department, I was informed that the phrase "Pre-Law" does not actually mean anything, because everything is a "pre-law" degree. Law schools don't require a "pre-law" specialization in the same sense that, say, medical school requires a biology specialization.

Get that? I planned on going to law school, and I actually had no idea that all you needed for law school was a degree in anything else. As you can see, research was not my strong suit back then.

Well, that ended my political science career forever. For a time after that, I decided a dual-major in economics and accounting was the right way to go. After falling asleep in each and every lecture of an entire semester's worth of Managerial Accounting, however, I concluded that what my "dream job" had always really been was neither criminal defense attorney nor captain of industry. Rather, my dream job was that of academic economist! I would grow a beard, lose my hair, wear a plaid shirt, and ride a scooter. For the rest of my life. (Okay, I'm being too hard on myself. I actually wanted to be John Cochrane.)

Sometime during the final months of my bachelor's degree, life happened, and I found my way to other adventures. That's not important. What's important is what everyone had been telling me since the day I scrawled the word "economics" on my college application:
Get a useful degree, Ryan, and get a nice job out of college. What will you do with an economics degree? Major in engineering. Go to med school. Study computers.
And so on.

"Bah!" I thought, and so thinks every pseudo-ambitious, pseudo-intellectual young man during his college years. "I don't need to study something useful! I need to study something I am interested in! I can make a career out of my passion!"

Part Three:
The funny thing is that every adult tells every young person the same thing when it's time to pack up and go to college, and every young person has a million arguments for why it's bad advice. People get jobs no matter what they choose to study. You can't force yourself to graduate at the top of the heap if you can't even keep your eyes open because the subject matter is boring! (Well, that one I do agree with.) So you might as well study what you enjoy. College is about the experience, anyway! Wheee...!

Parental advice never sinks in with kids because the parents are making their point poorly. No young person will ever care about getting a "good job." There's no such thing as a "good job." Jobs suck. The only "good job" out there is the one that pays you millions of dollars a year and asks precious little of you for it - which is to say no job at all.

But young people do understand that we must work for a living. Those who haven't figured it out by the time they reach their college years end up figuring it out in college. It's true, and everyone knows it.

There is one thing, however, that young people - and even most adults - do not properly understand, which is this simple truth: Every job out there is about equally as hard as every other job. Every job demands some overtime; every job demands that you sell yourself and turn yourself into a schmuck; every job demands that you put in some face-time and do some networking; every job (these days) demands that you know how to do some basic computer code-writing; every job involves an overbearing boss, a report that nobody reads, and a bunch of meetings that no one enjoys.

In short, pretty much every job is the same. You might have a unique love for the outdoors, in which case you would rather be a lumberjack than a computer programmer, or a construction worker rather than a graphic designer. Or, you might have a unique love for creative pursuits, in which case you might rather pursue jobs that require more creativity from you and less quantitative analysis. Whatever. Those kinds of details aren't all that important from the standpoint of the big picture. All that really means is that, whatever you decide to do for a living, you should try to work your way into the industry that feels like home for you.

But whatever. Details.

Part Four:
Once you wrap your head around the fact that all jobs are more or less the same, then a student's choice of major in college becomes a much clearer decision. It becomes less important to "study what you want." It becomes less important to "get a good job after school." What remains is a single, crucial choice, one that will follow you for the rest of your life.

There is a reason I lead-in with that quote from The Last Psychiatrist. In it, he very casually mentions the whole purpose of a college education.

He writes, "When I was in my 3rd year of medical school and we all had to select our tax bracket...."

College is your opportunity to select your tax bracket. When you take that to heart, the decision becomes much easier. Lower tax brackets involve much less responsibility, and that is a choice that appeals to many people. In fact, there's nothing wrong at all with preferring a life of modest means and modest achievements, if that's what your choice is.

But if you believe, as I did when I was that age, that financial success will come to you if you make the most of your intellectual passions, then the message I would like to give you is simply this: You are squandering your intellect.

If you really do want to achieve something later in your life, then it is in your best interest to choose a high tax bracket during your college years. That means: Choose a college diploma that gives you very high earnings potential and work your tail off to be at the head of your class. Then, leverage that diploma and those grades toward getting the highest-paying job you can possibly find. It is at that point that all of the pieces finally fit together in your head.

Ten years after graduating college, you will finally know what you want to do for a living. It might not be what you're doing, but you will have the means to pursue it, whatever it is. By "means" I mean both the financial ability to pay for it and the work experience required to earn a place in that position, whatever it might be.

You'll be expertly positioned for a lifetime of success in a field you enjoy. That's what you want, right?



Google News makes me aware of this story, coming out of New York.
Tyler woke up six of his relatives, and they all made it outside. The boy then ran back to the room where his 57-year-old grandfather, Lewis Beach, was sleeping. Beach used a wheelchair and crutches after having a leg amputated because of health problems, the fire chief said. 
Firefighters found Tyler's body a few feet from Beach's, Ebmeyer said. The body of the boy's 54-year-old uncle, Steven Smith, was found in another part of the trailer, which didn't appear to have a working smoke detector, he said.
We never really know in advance how we will approach being in the thick of things. We all solemnly declare that if someone we love were ever in danger, we would nobly come to the rescue. Few of us ever have to face the horror of putting our declarations to the test. Of those who do, a sadly small number of us actually do wind up being heroes.

But when a six-year-old boy proves he has the grit, shows he has what it takes to lay down his life to save his family, I can't help but be awed by the human condition and the inherent strength lurking below the surface of ordinary people, waiting for a chance to come out and do good.

Philip Zimbardo argues that we should raise our children to be prepared to be heroes, so that if they are ever put to the test, they will rise to the occasion, even when the crowd is working against them. He and Tyler Doohan give me hope for humanity.

Doohan, as far as I'm concerned, died a man, not a boy.

Liberty Today

Note: This is another "random thought" post. Forgive me if it is not as well-written or well-defended as my other posts.

Mungowitz has a good post on something Jeffrey Tucker once wrote about, during the good old days of the Ludwig von Mises Institute's old blog. (Funny that I had to resort to a Google cache to find that post. Thank goodness for Art Carden's follow-up.)

I think the democrats and republicans - the political class, really - have done a good job of caricaturing libertarianism and forcing libertarians into the classic dilemma. Whenever a policy issue comes up, libertarians argue for the liberalizing option, whatever it is. If it's crime, we say repeal laws and imprison fewer people; if it's business, we say regulate less and engage in more commerce; if it's war, we say shrink the military and bring the troops home; and so on. There are tens of millions of Americans who agree with any one of these points in isolation. One need not buy into a complete and all-consuming libertarianism in order to believe in a smaller military, the repeal of drug laws, the scaling back of regulations, and so on.

Establishment politicians know this, so rather than address the point, they deflect with the caricature. They say, "Of course you'd say that! If it were up to you, we'd have no government at all!" Before you know it, libertarians are forced to defend a complete absence of government rather than discuss a point of actual policy. And it works every time. Even now, most people have it in their heads that the average libertarian wants to nuke the fire department and leave babies to the wolves.

The worst kind of libertarian is the one who plays right into the hands of the establishment. The worst kind of libertarian is the one who proudly declares himself an anarchist and rages against any and every part of the machine he sees. One drop of statism, and the whole brew is worthless. It's unfortunate that so many people buy into this rhetorical strategy, because it undermines the simple obviousness of libertarianism in general.

We all want more power over our own lives. We all want to feel safe and secure in our own homes as masters of our own domain. We all want to be left alone to raise our children how we personally see fit, without the overbearing intrusions of the state's spying and regulating. As the world continues to globalize under the force of the technological explosion that continues to revolutionize our lives despite the largest economic recession in a century, we are all feeling the love of liberty.

We all want fewer laws and intrusions. This isn't an abstract feeling, it's concrete. We care about access to unbiased information. We care about corrupt political systems. We care about living our lives however we deem appropriate.

Well, folks, that's all libertarianism really is. If you like the fire department, I'm not here to take it away from you. Whatever makes you feel you have less control over your life than you believe is fair - that's what I want to improve on.


My sixth Improv Trance piece is called "Tweaky," and it is unremarkable except for the fact that it is my favorite yet. I hope you like it, too.



Easy Come, Easy Go

As readers of this blog know, I spent the better part of the previous eighteen months building muscle mass and pursuing all-around fitness, as opposed to focusing on being a strong distance runner. I had some success, and managed to significantly change the shape of my body from being predominantly ectomorphic to being... well, more muscular, anyway.

I stopped lifting weights for mass during the summer of 2013. By October, I was doing some body weight exercises and investing the rest of my workout time in distance running. It was a lot more difficult than I expected, and I suspect the main reason was the added muscle mass I'd acquired. Even as late as mid-December, I was still receiving compliments on how muscular I looked. (I'd better be clear about this - we've all seen my YouTube videos, and we all know that I'm no Hercules. However much muscle I'd put on, it was comparatively substantial for me. That's my point.)

A couple of weeks ago, during a business trip, I wound up with a bad case of food poisoning. Instead of making me queasy for a couple of days, it stuck with me for over a week before I decided to get help from the doctor. The doctor predictably gave me a 7-day prescription for some intensive antibiotics, and they managed to work most of their magic during the first day of therapy. Since then, I've been back to my old self again.

Back to my old self again, that is, with the exception of my muscle mass. In the course of two weeks - one of which consisted of my subsisting on a diet of green salad and chicken soup - my body managed to unload all that extra muscle mass I'd been carrying. You can see the result on my body. My arms are smaller, my shoulders have fallen forward a little, my abdominal muscles have disappeared...

In the interest of good health and effective running, I fully intend to build some of these muscles up a bit. I don't mind the small arms, but the bad posture is reflective of underdeveloped muscles in my back and abdomen, and weak stabilizing muscles are a recipe for running-related injury. But I won't be putting on an additional ten pounds of muscle like I did over the past year.

I mention this to underscore to my readers the fact that physical fitness is a fleeting thing. It disappears shortly after you make a decision to stop nurturing it. As anyone who has seen an elderly relative disintegrate can attest, it is very easy to "let it all go" and wind up injured or worse. It's important to maintain good health or risk losing everything.

Anything But Rap And Country

Note: This is yet another of my short throw-aways.

Ask someone what kind of music they like. A common answer you're likely to get is, "Anything, except rap and country."

The problem with this response is that the people who give it almost certainly never listen to Hungarian folk songs or Baroque orchestral music. They almost certainly never listen to bebop or children's songs. It is highly unlikely that they listen to Inuit throat singing. A good exercise is to put it to them as a follow-up question: "Oh, really? So you like bebop? What's your favorite from Coltrane?" See how far you get with them.

From this, we can probably conclude that the phrase "anything but rap and country" is shorthand for "anything that does not sound particularly idiosyncratic." It's not that anyone must enjoy rap or country, it's just that if one doesn't enjoy any rap at all, and doesn't enjoy any country at all, then it is probably not true that they enjoy "anything" else. More probably, what they enjoy is popular radio music: classic rock, "80s, 90s, 00s, and today," and so on. What they really like is unremarkable music that does not require that they pay a lot of attention to what's going on.

Even musical styles like Mexican rock music or Bollywood pop music are unlikely to impress these folks who profess to like "anything," even though Mexican and Indian music are both highly similar to what you find on North American radio, for the most part. A good Indian club song is not all that different from a good American club song. A good Mexican rock song sounds like it could be played by Hinder or Slash or 30 Seconds to Mars, or whoever happens to be the it-band of modern rock these days. But the fact that the singers sing in languages other than English means that an English speaker is marginally more distracted than he or she would otherwise be.

In short, anything that demands more of a person's attention than the run-of-the-mill stuff is what these people actually dislike. If it sounds somehow different - foreign language lyrics, instruments uncommon in modern pop/rock, scales or chord changes that do not fit in with the Rolling Stone paradigm, etc. - it is likely to distract them, and being distracted means they don't like it.

Rap music is obviously idiosyncratic. There might be a little singing, but it doesn't tend to sound like Katy Perry. The beats can be somewhat jarring and aggressive. The poetry is often bellowed instead of crooned or even softly spoken. And so it goes with country. Although country music is less aggressive, it is no less idiosyncratic, with its fiddles and pedal steel guitars and the unique accent/diction of its singers. It is just different enough to be distracting.

When someone tells me that they enjoy "anything but rap and country," I make a mental note that the person is not much of a music fan. Can you imagine someone saying that their favorite type of painting is "anything except cubism and late impressionism?" Can you imagine someone saying that their favorite type of book is "anything except murder mysteries and historical fiction?" Can you imagine someone saying that their favorite kind of dancing is "anything except clogging and polka?"

Indeed, all you really know about people who say such things is that they do not have a large investment in whatever it is they're talking about. And they don't really need to be heavily invested in music, either, but then why do they make the claim that they like "anything?" They don't.

Men In The Workplace

I have a number of random thoughts going through my mind right now, and I would like to blog about every single one of them. If I do, though, they'll be short posts of little consequence to anyone. The points themselves won't end up being defended very well. That said, I can't bring myself to invest enough time in these things to write the kind of blog posts they probably deserve.

Thus, the next few blog posts will be a small series of short posts laying out ideas that are floating through my mind these days. If I don't conclusively prove my points, so be it. At the minimum, I will try to schedule these posts, so that they are a bit spaced-out, in case I want to write something of actual merit.

First up, men in the workforce.

There have been innumerable articles and blog posts out there reporting on the decline of the American man in the workplace. That is, men are falling behind women in terms of career success, education, and workforce participation. Here's a recent post from John Cochrane, for example.

For the most part, all agree that the decline of men is a bad thing. After all, employment is still staggering in the wake of the so-called "Great Recession." If we can get all those lackadaisical men to get serious and get back to work, we can put ourselves back on a good NGDP path, am I right?

Well, I'd like to offer a dissenting point of view here.

80 years ago, before the sexual revolution and the so-called "women's liberation" movement, the workforce was comprised almost entirely of men, and women stayed home. I don't think it's fair to say that women were working themselves to the bone back then. Sure, they had household responsibilities to attend to, and they did it well. But it was nothing like the hard labor and factory work that the average man was doing.

When the US government shipped the majority of its productive male population overseas to fight in the Second World War, women had to make up the difference at home. They discovered a few things: (1) work is hard, (2) work is satisfying, (3) they could do work just as well as men could. It is not surprising that women decided they'd like to continue working, even after the men came home. So they did, and it was to society's great benefit on many different levels.

It didn't happen overnight. Female workforce participation gradually increased over time, and it continues to do so. It is not yet on par with that of men, but it's getting there, and wages are equalizing, too. Whether or not you view these trends as "favorable," they are a fact. Those of us who support the equality of women are happy with this development.

There is just one small detail to account for: now that households receive a greater proportion of their income from female family members, they (by mathematical definition) receive a lesser proportion from males.

Another way to look at this is to note that, given that a household does not necessarily need two full-time workforce participants to operate, we should not be surprised that fewer men wish to work at all. Furthermore, it's not a bad thing. Work sucks. There is a reason economists call it "the disutility of labor." We only work because we have to. If we get a big boost of utility from our income, then we will work more; if work starts to suck, we will work less. We navigate this trade-off until we find a good, personal equilibrium at which we are working enough that going to work doesn't totally suck, but not so much that it becomes unbearable. Then we go home and live the rest of our lives.
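For the curious, that trade-off can be sketched in a few lines of code. This is a toy illustration of my own, not a real labor-supply model: the wage and both utility curves below are invented numbers, chosen only so that income utility is concave (each extra dollar helps less) and the disutility of labor is convex (each extra hour hurts more), which is what produces an interior "personal equilibrium."

```python
# Toy sketch of the labor-leisure trade-off. All parameters are
# hypothetical; the point is the shape of the curves, not the numbers.

WAGE = 25.0  # invented hourly wage


def net_utility(hours: int) -> float:
    """Utility from income minus the disutility of labor."""
    income_utility = (WAGE * hours) ** 0.5  # concave: diminishing returns
    labor_disutility = 0.005 * hours ** 2   # convex: each hour hurts more
    return income_utility - labor_disutility


# Search over possible weekly hours for the utility-maximizing choice.
equilibrium_hours = max(range(0, 81), key=net_utility)
print(equilibrium_hours)
```

Under these made-up parameters the search lands on roughly a 40-hour week, but change either curve and the equilibrium moves - which is the whole point: different households settle at different splits of labor and leisure.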

It's tempting for economists - especially macroeconomists - to view human beings as "productivity machines." Even business managers buy into this concept. But, at the end of the day, we work to live, not live to work. If women decide they want to work more, that's great. If men decide they want to work less, so be it. What will likely happen is that households will find a point where they can "split the difference" and enjoy as much leisure and as little labor as possible.

My opinion - backed by no data whatsoever - is that this is what we're observing. Society is changing. Work isn't the be-all, end-all for everybody. If you could get away with one fewer hour of work if it meant gaining one extra hour at home, wouldn't you take that deal?