2014-03-31

Post-Training Training

I began running competitively at around the age of eight. I spent literal decades honing my approach to training, and it worked very well for me. At this point in my life, however, the combination of age, diabetes, and competing responsibilities is such that I'm unlikely to win another competitive race ever. This isn't a "bad thing," it's just a fact of life, and I'm okay with it.

The question is this: Now, how should I train?

The reason this is a question at all is because if I just go out for a run every day, then I'll never push myself hard enough to reap the benefits of exercise. I'll be moving my legs and getting fresh air and sunlight - both of which are very good things. But unless I'm training for a race, I'm not actively involved in ramping up my endurance base, or doing speed work, or increasing my VO2 max, and so on. In other words, I'm exercising, but not very hard. For that reason, the training habits I've acquired over the years are a liability. Resting on my strengths means slacking off, even if I'm exercising, and slacking off doesn't help the body much more than taking a nap. The body adapts to repetitive demands; that which benefits in the beginning eventually loses its punch. If I want to keep my lungs in great shape, my heart in great shape, and my muscles, bones and ligaments strong, I need to keep challenging myself.

There are many ways to keep challenging oneself, of course, but my point here is that setting my sights on the next big race or the next major marathon is only one option of many. I might be better served to build a little muscle mass or do more calisthenics until my body starts slacking off on those things again. At that point, I could go back to running or choose something else.

The key point is that, without having a specific sport to train for, the possibilities broaden a little. Growing older and exercising involves embracing new possibilities and making the most of them.

2014-03-28

Nature Will Always Be More Powerful Than Mankind

A JetBlue flight made an emergency landing at JFK after the plane slammed into a seagull, which blew a hole in the nose of the aircraft upon takeoff from Westchester Airport Friday morning, authorities said. 
The FAA said the Airbus A320 bound for West Palm Beach, Fla. with 142 passengers declared an emergency at about 9:30 a.m. 
"On departure we hit numerous seagulls, one took a direct hit right on the nose, just below the windshield," the pilot said on air traffic recordings posted by LIVEATC.net
The bird remained stuck in the nose of the plane. 
"The way I'm looking at it right now, I don't think we can carry on to West Palm, because it does look like we have some damage to the nose here," the pilot said, adding that the damage appeared to be "part of the pressurization capsule."
NBC News has the scoop.

Meta-Life

Let's call this "Part Two."

The first came in the form of this post, "Never Stop Living," my attempt to remind the reader (and therefore myself) that life is short, and that the energies we expend ought to be channeled toward the activities that lead us to the greatest levels of satisfaction. We ought to be active participants in life itself. More specifically, choosing not to expend any energy while engaged in the pursuit of happiness is a serious mistake.

My first attempt was denigrated as "YOLO" in a fancy hat. Maybe more people watch TV than climb Mount Everest because TV is genuinely more fun than training to summit a peak in a hostile climate, maybe dying in the process. Maybe the internet really does offer a more satisfying experience than a marathon.

This phenomenon gets all the more complicated when we account for the fact that even those experiences that are active, or outdoors, or that otherwise do not involve passive media consumption are seemingly valueless to people until they become part of the passive media-consumption process. You weren't there unless you "checked in" on social media. It wasn't beautiful unless it was Instagrammed. You can't really do it unless it's on your YouTube channel.

And then there is the greatest horror of all: that you were there, it was beautiful, and you really did it; you checked in, you Instagrammed, you YouTubed, and nobody saw it. The apparent standard, and I do hope I'm wrong, is the quality of the passive experience, not the active one.

Odd, isn't it? We can barely tolerate a brilliant piano concerto recorded on someone's iPhone in a big concert hall and uploaded directly to YouTube, but give us a mediocre performance from a tweed-wearing hipster with "ironic" facial hair, recorded in multi-angle HD, and suddenly it's viral. Every day, millions of people trudge to the gym to turn their lives around. We don't care... unless they managed to take a couple of selfies and post them to social media with a list of all the vegetables they plan to eat for dinner. That's a quality gym experience. Meanwhile, in the darkest corner of the gym is a man who jumps rope for 30 minutes on one leg, then switches to the other leg, then spends another hour lifting free weights.

I understand the perspective of the audience. You can't applaud something that you don't know about, and it doesn't much matter that the twelve-year-old down the street can play the Moonlight Sonata with passion and accuracy if you never see it or hear about it. And so long as you're investing your time on social media, you might as well focus on the HD videos and the best-looking selfies. Why should your consumptive experience have to suffer for the sake of someone who can't even manage a proper mix-down?

What I don't understand is why the quality of the digital representation, the meta-object, has come to serve as the metric for the quality of the experience itself, the object.

I don't have much advice to offer here, except to say that you might be better served enjoying the non-digital realm for what it is: a long line of private experiences that can define your life in a way that is satisfying only to you and the people directly involved. Social media can get you a few "likes," but that's as far as it goes.

On your death bed, would you rather remember the concerto you performed? Would you rather remember the sound of the audience and the reverberation of the notes against the walls of the concert hall, the chalky taste of the air when you walked in and took your initial bow? Would you rather remember the sunlight on the back of your neck as you ran past mile marker number four and the burgeoning thirst in the back of your throat? Would you rather remember the chirping of the birds as you rounded the bend of the mountain pathway and discovered a pond that only perhaps a dozen other people had seen that month?

Or, would you rather remember that you posted a twit-pic that was retweeted a million times?

To what extent are you an active participant in your own life?

2014-03-26

Deafening Silence

The IRS recently decided to treat Bitcoin as property. Thus, every time you trade Bitcoin, you will be liable for reporting your capital gains or losses. This seemingly obliterates the potential to use Bitcoin as a medium of exchange. I am not close to the Bitcoin issue, but this strikes me as being a very significant event in the world of crypto-currency.
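
To make the record-keeping burden concrete, here is a rough sketch in Python of the capital-gains arithmetic the ruling seems to imply for a single purchase. The prices and amounts are entirely hypothetical, and real tax treatment (cost-basis lots, holding periods, and so on) is more involved; this is only meant to show why "spending" Bitcoin becomes a taxable event.

    # Hypothetical numbers only -- an illustration of the bookkeeping, not tax advice.
    cost_basis_per_btc = 450.00   # USD price when the Bitcoin was acquired
    price_at_spend = 600.00       # USD market price at the moment it is spent
    coffee_price_usd = 4.00       # the purchase being made

    btc_spent = coffee_price_usd / price_at_spend
    capital_gain = btc_spent * (price_at_spend - cost_basis_per_btc)

    print(f"BTC spent: {btc_spent:.8f}")
    print(f"Reportable gain on a $4 coffee: ${capital_gain:.2f}")  # $1.00 in this example

Multiply that little calculation by every cup of coffee in a year and the "medium of exchange" use case starts to look like a full-time accounting exercise.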

And yet the world of economics blogs is mum on the issue. In the words of Becky Hargrove, the silence is deafening.

2014-03-25

Paradigms, Part II - Social "Science"

Robin Hanson has a recent post about math types who don't take the social sciences seriously enough and therefore minimize their value.

I have a social science background, and to a lesser extent, a mathematics background. My thoughts on the subject are probably less relevant than the thoughts of those people who are better-studied in one or both of those areas. Still, my vanity compels me to offer an opinion here.

It's Math
In my experience, hard sciences are applied logic, the most formal and generalized of which is pure mathematics. If you can understand pure mathematics, then you can understand any other field or discipline. But the reverse is not true: There are many, many more social scientists who do not understand pure mathematics than there are mathematicians who do not understand social sciences.

This isn't just a relic of the fact that academics tend toward logic, and thus tend toward mathematics. Logic is the underlying basis of all human thought processes. Philosophy in its original form (think ancient Greece) fused mathematics and ethics and astronomy and biology all into one glorious pool of reasoning. But that sort of thing can only go so far. Eventually, "philosophy" as a stand-alone discipline lost its steam under the realization that nearly all of the difficult problems were either better suited to scientific analysis (and thus, the question was one of science, not philosophy at all), or were really just shortcomings of language. (See "the paradox paradox.")

When I say philosophical problems are a shortcoming of language, I mean that the problem is imaginary. For example, the color of uncooked salmon meat has no formal name in the English language. We could say that it's "pink," but it's not really pink. We could say that it's "orange," but it's not really orange. We could say that it's "red," but it's not red, either. Here a philosopher would be tempted to say that either salmon has no color or that color itself is a complex phenomenon (or an emergent one, am I right?). But both of those statements would be wrong, because the color of salmon exists and we call it "salmon-color" even though that's kind of stupid. The problem isn't with the color; the problem isn't even with the idea of color.

The problem is simply that English doesn't have a word for that color. Big deal? Call it "tau" and forget it was ever a problem. It doesn't matter.

Gradually, "philosophy" dissolved into logic, mathematics, and science. As formal logic became more rigorous, people seemed to notice that it was just math, and that's how we understand the field of "Logic" today. So then there were two: mathematics, and science.

Physicists have done the most work in revealing that science is basically just math. All of the important work in physics these days is mathematical conjecture confirmed by physical experimentation. The various other hard sciences out there - chemistry, biology, etc. - do not look much different from physics. At this point, the sub-discipline you're interested in is really just a specialization of the same thing, aka "science," aka mathematics.

Social Science
Finally we have social sciences. Social sciences started out as theories of human behavior, distinguished from one another by how they group their observations. We can look at a person's individual motives, in which case we're talking about psychology. We can look at a person's motives as he interacts with a group, in which case we're talking about sociology. We can ignore motives and focus on the quantifiable result of human behavior, in which case we're talking about economics or possibly political science.

But when we get right down to it, all this social scientific theorizing amounts to crafting paradigms, or ways to view human behavior so that we can better understand it.

I don't deny that there is a psychological component to the world, as there is an economic one, and a sociological one, etc., etc. But when we engage in social science reasoning, we're not describing the profound truths of the universe, we're using paradigms to simply describe the many different ways human beings interact with each other.

That is to say, we might gain additional insight into a situation if we look at it "sociologically," but I don't think we'll ever come across a good predictive model for "quantum sociology." In other words, we can describe human behavior through the paradigm of modern sociology, but we don't derive empirical truth from this. It becomes the academic equivalent of poetry. It sounds nice because it resonates with us, but it won't teach us any secrets of the universe.

Similarly, we can describe human behavior in English or French, and our choice might connote something different in some specific way, but this is more relevant to the inadequacy of language than it is to the validity of our theory.

Conclusion
All that verbiage to say something simple: Social sciences are paradigms, useful for learning, but not otherwise relevant to the universe. Once you've learned what you need to learn from Weber's theories, you should think of them as analogies or talking points, not concrete truths of human behavior. That's the purpose of social science. But if you're looking for concrete truth, you need to learn the underlying logic and then move on to empirical science.

The rest is just prose.

ZMP

I've been notably out of the blogging loop lately. Truth be told, I may have reached "peak blogging" last year, and I've been experiencing diminishing returns ever since. The blog may survive, or may not. I suppose it depends on how interesting my life gets in the future, and how much time I dedicate to waxing philosophical about it. See this post for additional clarification.

But this morning, I went back to my blog feed to catch up on what's being written about, and I discovered this post by Scott Sumner, which references this post by Tyler Cowen. I'm too late to the party to add an impactful comment under either of those blog posts, so I'm stuck with having a go from the comfort of my own blog.

The Argument For ZMP Workers
There is a lot of conjecture about the concept of a "zero marginal product" (ZMP) laborer. The idea here is that there are people out there in the economy who simply do not add value to the economy. For the sake of argument, let us imagine the receptionist at an office building. When you check in at the office, you inform him or her what your business is ("I have a three o'clock appointment with Mr. Reynolds."). He or she then pages Mr. Reynolds, who walks down to meet you, and off you go to your appointment. This job belongs to a real human being, but it's easy to see how the same function could be performed by a touch-screen computer system integrated with the company's email server. The receptionist doesn't seem to add any value. He or she spends all his or her "down time" surfing the internet or watching TV or playing FreeCell or any of the other things you've seen receptionists do when you've checked in somewhere.

If the economy gets tight, this receptionist might be the kind of person who loses his or her job and never becomes employed again. Why? Because this person has a comparative advantage in an economic activity that is no longer scarce. It doesn't "pay" to hire this kind of employee anymore. They don't bring anything to the table. Hence, "ZMP."

My Argument Against ZMP Workers
Of course, in the real world, someone might lose his or her receptionist job and go on to do some other low-skill task that actually does add value. The receptionist may find a promising career as a data front-end analyst (to give one realistic example), cleaning up data so that it can be imported into company databases and used by higher-level employees. Or, the receptionist may go to work at Starbucks or Costco, drawing a similar salary by performing tasks that require little or no training. Or, the receptionist may take the loss of his or her job as an opportunity to focus on some as-yet-unpursued economic contribution (perhaps he or she is an artist or is college educated but does not yet work in his or her chosen field). Perhaps dozens of things.

My point is that, while it is extremely easy to imagine ZMP jobs, it is not so easy to imagine ZMP people. So the existence of "ZMP workers" seems to be a ruse. I do believe that there are many jobs in every economy (from the beginning of time) that add no particular value to anything. This, however, is a characteristic of the job, not one of the worker.

Coda:
I have blogged about ZMP before. In this post from 2011, I seem to be attracted to the idea. Look at my wildly off-base criticism of Bryan Caplan:
Caplan seems to believe that the economic consultants in my story "didn't have ZMP because they found other jobs."
In fact, it was I who was wrong, not he. If they found other jobs, then they must not be ZMP workers - it was their jobs that had ZMP. (As an aside, I still think that scenario fits with Arnold Kling's PSST idea, but clearly my criticism of Caplan's point was wrong, wrong, wrong.)

Somewhere along the line, I successfully corrected my thinking on this. In my initial post about shotgun theories (April 2013), I used the ZMP concept for inspiration. There, I defended Caplan's position.

Later, in August, I was back to criticizing Caplan again, but luckily still had the right idea about ZMP.

2014-03-20

Paradigms, Part I

I mentioned a few posts back that I was going to write a forthcoming post about paradigms. I have unsuccessfully attempted to write this post a few times now, and finally realized that I need to give my thoughts a more thorough treatment. Instead of one long post, you're going to get a few shorter ones.

My intent with these posts is to criticize strict adherence to a paradigm - any paradigm - because such adherence can be misleading. It results in the sort of automatic thinking that can lead even a very careful and brilliant mind to overlook important details, fail to understand certain subtleties, minimize certain others, or explain away details before properly considering them.

Before I can make that case, though, I have to explain what paradigms are good for, and that shall be the topic of today's post.

What Is A Paradigm?
The word "paradigm" entered the public lexicon some time during the 1990s. I mean, it was always there, as long as it's been a word, but it wasn't an important word - it wasn't a buzzword - until the 1990s. "Paradigm" went right along with "synergy" and Palm Pilots and Franklin Day Planners. The general idea at the time was to bill big business as not just a series of steps aimed at producing and selling a good or service, but rather an idea or a mode of thinking.

Only suckers, the argument was, build laptop computers and sell them. Paradigms offered the advantage of viewing laptop production as a concept, which could then be improved and manipulated in the abstract. Paradigms offered business managers the advantage of "revolutionizing the business" without having to change real-world things like the structure of the assembly line, or the way depreciation is handled in the accounts, or the map of the supply chain.

I'm being critical of "paradigm" the buzzword, but paradigms can actually be extremely useful. For example, if you and your roommate decide to cook dinner together, you can look at it as a collaborative creative effort (Paradigm #1), or you can look at it as a food manufacturing process (Paradigm #2). Without passing judgement as to which paradigm will result in the "best meal," we can easily see that the two paradigms imply something different about how you'll do the work.

Paradigm #1 implies that the two of you will discuss, collaborate, and otherwise work on the same things at the same time. Paradigm #2 implies that you'll divvy up the work and only come together at the end of the process. It's certainly possible to look at cooking both ways, and each paradigm offers its own advantages and disadvantages. The main difference is the paradigm.

What Are Paradigms Good For?
The major advantage of paradigms, in my opinion, is that they are very instructive ways to learn about new things. In the cooking example above, if you didn't know how to cook, but had a recipe book and a roommate, you might really like the idea of treating it as a manufacturing process. You'd be able to follow the instructions, divide the labor, and manufacture your dinner. If you do that a few times, you'll quickly learn "how to cook."

Cooking is a relatively easy problem. Suppose you're trying to solve a tough scientific problem. One way to do that is to power through the scientific fundamentals and consider the implications of each fundamental separately, given what you know. That's not merely a lengthy and tiring process - it might also limit your creativity in solving the problem.

The Black-Scholes pricing model famously solved an investment problem by mathematically treating it as though it were a heat-diffusion problem from physics. In other words, Black and Scholes adopted a physics paradigm in order to solve an economics problem. In doing so, they learned (and taught us) a great deal about economics.
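
For anyone curious what the borrowed machinery actually buys you, here is a minimal sketch in Python of the closed-form call price that falls out of their approach, written in standard textbook notation. It's an illustration only, not anything resembling production pricing code.

    from math import log, sqrt, exp, erf

    def norm_cdf(x):
        """Standard normal CDF, via the error function."""
        return 0.5 * (1.0 + erf(x / sqrt(2.0)))

    def black_scholes_call(S, K, T, r, sigma):
        """Black-Scholes price of a European call option.
        S: spot price, K: strike, T: years to expiry,
        r: risk-free rate, sigma: volatility (both annualized)."""
        d1 = (log(S / K) + (r + 0.5 * sigma ** 2) * T) / (sigma * sqrt(T))
        d2 = d1 - sigma * sqrt(T)
        return S * norm_cdf(d1) - K * exp(-r * T) * norm_cdf(d2)

    # A one-year at-the-money call: prints roughly 10.45
    print(round(black_scholes_call(S=100, K=100, T=1.0, r=0.05, sigma=0.2), 2))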

So you can see that paradigms offer us the ability to learn a great deal about whatever it is we happen to be looking at.

Beyond Paradigms
There are certain limitations.

Imagine again the cooking example. What if you don't have a cookbook? What if your roommate is out that evening and cannot help you? What if you're missing some of the ingredients? Your paradigm might instruct you to "download additional instructions," or to "hire more line workers," or to "order a shipment of new raw materials," but obviously none of those things will help you make dinner.

But the point of the paradigm was to teach you how to complete a task. Hopefully, by the time you've fully absorbed the principles behind the paradigm, you'll know how to cook. At that point, you won't need the paradigm anymore.

Similarly, elaborate comparisons between financial instruments and heat flow are weak and silly. The point of the Black-Scholes model was never to make such a comparison. The point was merely to use a mode of thinking to solve a problem.

Getting too caught-up in an elaborate analogy misses the point. In subsequent posts, I intend to argue that paradigms should be discarded as soon as we have absorbed the lessons they were designed to impart. Once we have the knowledge we need, the paradigm becomes a distraction, an urge to draw comparisons that ought not be drawn.

For now, though, I shall leave it at that.

2014-03-18

On Being Convincing

Spend enough time trying to convince people of your point of view, and you'll eventually reach a point of diminishing marginal personal satisfaction with these sorts of conversations. It's easy to just give up, disengage, and go back to playing SimCity or something.

Anecdotally, however, I can attest to the fact that I have been able to bring people over to my point of view from time to time. Having thought carefully about these situations, I can discern no predictable pattern in how it worked.

In some cases, I managed to be persuasive while being rude and highly flippant. From this, we could possibly reason that the embarrassment of another person's holding a wrong opinion was too much to bear, and they were made to reconsider their case. But in other instances - some involving the same people who were convinced on other issues - the tactic failed miserably. Those cases, taken in isolation, would be reason to convince us to remain as open and charitable - as "kind" - as possible. The contradiction doesn't end there, of course. I've managed to convince people by being as kind as possible, and I've also failed accordingly.

Sometimes it pays to appeal to reason, empirical evidence, and logic. Other times, that is a distraction from what the audience deems to be a moral issue. Yet, attempt to speak to them of morals, and they will cite their own countervailing empirical evidence and logic. Once again, the tactic and its opposite both appear to work and to fail, sometimes for the same person or audience, with no discernible pattern emerging.

Zoom way out, though, and there is one thing that does not change over the course of the entire analysis: Consistency.

That is, it seems that the best way to bring others around to your point of view is to remain honest, consistent, and genuine about your personal beliefs. If you're on the right side of the issue, remain unwaveringly true to it, and have confidence that people will come around eventually.

At least, that's my conclusion. What's yours?

2014-03-14

The Omnipotent Ones

Way back when the original passage of "Obamacare" was being debated, the cynics liked to suggest that the legislation was so bad that it was specifically designed to fail. The logic was that if Obamacare fails, the American public would run screaming toward "single payer health care."

It's the kind of nefarious theory that plays well with the conservative and libertarian crowds, but - like the 9/11 and Iraq War conspiracies before it - it assumes that the US government is a highly coordinated, intelligent, and frankly omnipotent entity that actually has the power to pull something like this off. My experience with government, however, is that it simply isn't capable of that degree of coordinated effort.

Simply Not Capable
There is a great deal of counter-evidence for the assumed "awesome power" of government.

For example, think how long it took to capture Osama bin Laden, despite the fact that he was "hiding out" in a house in Pakistan. No matter who or what you might choose to blame for the fact that it took so long to catch him, the fact remains that it did. None of the possibilities you might suggest support the idea that the government is a highly efficient and capable entity.

Or, consider the fact that, despite having waged "wars" on both drugs and poverty, the US government is still a long way away from solving either problem. Consider the fact that the number one gripe of school systems is that they are under-funded, despite the fact that funds to schools have been increasing for decades. Consider the fact that tax revenues, as a share of the economy, barely change no matter how the tax rates happen to be set. Consider the fact that, in order to catch the alleged Boston Marathon bombers, the entire city of Boston had to be placed under house arrest, and even then it took days. Consider the fact that the government cannot even prevent people from smuggling bombs onto airplanes without creating a massive surveillance system that involves x-raying our luggage and taking what amount to naked photos of our bodies, and even then people still manage to bring explosives aboard in their shoes.

The list goes on and on. Add it up, and the picture of the US government is not one of an extremely capable entity like you see in the movies. It's one that bungles its way through our lives, barely managing to meet its lowest expectations.

Libertarians And Statists Alike
But the misguided belief that the government is awesome and omnipotent is the kind of belief that's necessary to both fans and foes of government.

Statists and moderates who favor a significant role of government in our lives simply must believe in government's omnipotence because, were the government a collection of bumblers, none of the logic behind government intervention would stand up to scrutiny. How can you argue for a Medicare expansion, for example, if you already know in advance that the government isn't capable of really solving any of the problems it attempts to solve? Thus, interventionists instead say, "Yes, we can!"

I'd be inclined to stop there and score some points with "the political base" if such things mattered to me. After all, many libertarian criticisms are founded on the notion that the government will never succeed in whatever it attempts. But how do libertarians reconcile that belief against their insistence that, for example, Obamacare was designed to fail in order to bring about single payer health care? Such a strategy would require such a coordinated effort across so many thousands of people for so many years that it very nearly would have been easier to simply get health care right the first time. Once again, even the libertarians assume that the government is too capable.

So this is an illusion embraced by everyone.

Incomplete Other
This is from the Wikipedia entry on Jacques Lacan:
"It is the mother who first occupies the position of the big Other for the child," Dylan Evans explains, "it is she who receives the child's primitive cries and retroactively sanctions them as a particular message".[8] The castration complex is formed when the child discovers that this Other is not complete because there is a "Lack (manque)" in the Other. This means that there is always a signifier missing from the trove of signifiers constituted by the Other. Lacan illustrates this incomplete Other graphically by striking a bar through the symbol A; hence another name for the castrated, incomplete Other is the "barred Other."[42]
Very early in child development, of course, the child does not fully understand that the mother actually is an other. So she merely plays this role for him without his actually knowing it.

Next, the child learns to differentiate himself from his mother, and soon thereafter discovers that there are things outside of mother's control. He can keep small secrets from her, he discovers things that don't directly involve her. Thus, his original "Other" is discovered to be "incomplete." None of us remember going through this stage of development, but it must be quite a revelation for someone who started out literally inside of his original Other.

I assume this is when the "terrible twos" really begin. Now that he knows that his mother isn't god, he begins to test the limits, to find out the extent to which she is not a complete Other. And then the rest of childhood happens, and his concept of the Other morphs into something more grandiose.

The reason I bring this up is not to suggest that Lacan's theories were faultless, but merely to point out that this is a useful way of conceiving of a common problem of human existence. I don't think it's a perfect way to describe things, but we can learn from the paradigm nonetheless. (And remember, paradigms are good for learning, and only good for learning.)

The Incomplete
So what we learn from Lacan about government is that it simply isn't as "complete" - as capable or as powerful - as we are originally taught to believe. After all, "the government" invented the internet and the atomic bomb. "The government" walked on the moon. "The government" teaches us about science and history. To think that it's only barely managing to accomplish these things - or even more accurately, to think that whatever "it" manages to accomplish is actually the work of specific individuals, not the government as a whole - is sure to open up a bit of an existential vacuum for anyone not accustomed to understanding the basic flaws of government activity.

Economics can help some of us learn about the basic shortcomings of failed government initiatives, but there is always a risk that we remain unconvinced. There is always the risk that we will simply acknowledge that past policies failed due to policy imperfections, and that anOther, superior policy or government will be able to rise to the challenges we've imposed.

In short, we have to learn that the government is an incomplete other, and hence not an Other at all.

And once we learn this, we also come to learn that there is no "fixing" government except to scale it back and replace it with other human institutions, however fallible they might be.

Yes, even though they are fallible. Why? Because a fallible thing that is under your control will always be more manageable than a giant, faceless flaw for which every attempt at improvement becomes a pointless set of competing arguments for a flawlessness that does not actually exist. Sort of like arguing over whether your C-student child should seek to be a CEO or a brain surgeon when he grows up, we'd be ignoring the facts of the universe and depriving ourselves of a viable solution to our problems. For heaven's sake, let the kid get a business degree and become a happy middle-manager somewhere.

2014-03-12

Too Soon?

It has now been 36 days since The New Republic published this article about the "comedic sweet spot" for jokes that make light of tragedies.

I consider the New Republic article to be a bit of a tragedy.

Hence, this blog post will perhaps be better-received today than on any other day in time.

I Wrote It Down For A Reason

All that talk about "rhetorical nuke buttons" last month was unnecessary. Years earlier, I had already developed the concept. I know this because it happens to appear on that anachronistic blog feature that sticks out like a sore thumb, the Lexicon.

Before I wrote about nuke buttons, I wrote about contrapositive influence. We could call it the tendency to jump the shark at the exact moment you win the argument. It's bad.

Every so often I get the idea in my head that I will delete the Lexicon. I don't really want to add to it anymore because I have moved away from the idea of the all-encompassing philosophy that I used to joke about with my friends. Paradigms are useful when you first start to learn about something, but after a certain point they become restrictive.

More on that in a future post. For now, what I want to say is that one of the things that forces me to retain the Lexicon is the fact that every time I review it, I rediscover an old name for a recent concept. As the preceding paragraph implies, this is limiting from a paradigmatic perspective, but it is useful self-meta-analysis.

Doesn't it imply something, the fact that I created two names for the same basic idea just one year apart, and didn't even realize it?

That Duke Student

Territory this well-trodden need not suffer my additional commentary. Sometimes the best I can do is point to good thoughts from other corners of the internet.

If you want the soundbite edition, have a look at this clip from her interview with Piers Morgan. (There is an easy joke to be made here pertaining to "relative status" of both the student and of Piers Morgan, but I'll skip it. I will, however, indulge my ego long enough to tell you that my use of the phrase "relative status" is itself a meta-joke. More on that in an intended future post.)

On the off-chance that you live under a bigger rock than I do, the video clip is sufficient for you to fill in the back-story yourself. This is not a complicated plot. But if you do happen to want the in-depth story, you can start here and work your way backwards.

Now, my initial reaction to this issue was that she was deflecting attention away from herself with a lot of red herring stuff about "rape culture" and betrayal. To this day, she faults the boy who "outed" her for betraying her trust, rather than herself for doing something in one kind of public sphere that she did not want known in another public sphere. (The idea that there is only one public sphere escapes her, I suppose.)

What I said was that it was a deflection. Someone else put it this way:
Crowdsourcing the superego means that as long as a few people say, “it’s not easy nowadays, I’d like to see that bitter old codger try to succeed in today’s world!” she gets off scot free. I’ve counted 11 such comments, and I’m not even trying. Guilt and shame evaporate.
That's entirely accurate, except for one minor detail: that quote appears in a criticism of a completely different event that occurred in November, 2012.

But if we want to keep in line with the Stationary Waves terminology (and really, this far invested in it, can we afford to diverge from the paradigm?) we'd call it shared guilt.

The cool thing about the internet is that if you lurk around in it long enough, you're bound to find people who think similarly about your favorite things. That's true of scorned pornography actresses, humiliated daughters, and, yes, even quasi-philosophical bloggers like myself.

...Well, c'mon. What other purpose does it serve to link to a bunch of eloquent people who agree with me, if not to do a little crowdsourcing of my own?

2014-03-11

We're Here For A Good Time

Being the hedonic utility optimizers that we are, we will not generally engage in entertainment activities that require great effort.

This comes in many forms, but to understand it intuitively, consider what proportion of the global population ever attempts, at any point in their lives, to summit Mount Everest. After stripping away all the excuses and half-cocked explanations, we are left to conclude that climbing Everest is extremely difficult. It's so difficult, in fact, that almost no one attempts it. Almost no one even trains for an attempt.

We can say the same for any difficult activity. Learning to play the great works of Chopin on the piano requires years of meticulous practice and study; almost no one manages to do this. Running a marathon is not merely a twenty-six-mile undertaking, but an eighteen-week commitment (at least). Starting and maintaining a club or a charity is quite often a thankless time-suck that no one ever appreciates. And so on...

At this point, you might be thinking that I'm about to say, "Society has become all about instant gratification. We need to reverse course..." That's the easy play, the obvious play, and anyway I'm not completely convinced that society actually has become all about instant gratification. Technology and prosperity have combined to put all kinds of things at our fingertips. One of the reasons live music isn't in high demand these days, for example, is because we have things like Spotify, where we can hear pretty much any song we can think of almost instantly, no matter where we are located, as long as there is cellular phone service. (So, maybe not the summit of Everest, but you get the picture...) So we haven't become monsters of instant gratification, we've just become creatures of convenience.

Although it is difficult to argue against the blessings of modern convenience, it is likewise difficult to ignore the fact that hard work is a good thing that produces a better result than "phoning it in," either figuratively, as in the case of someone who would rather watch a documentary about Everest on Netflix than climb to the top himself, or literally, as in the case of someone who would rather stream "Teenage Dream" to her cell phone than learn some barre chords and play it herself. Take it from someone who plays the guitar from time to time: it is a lot more fun to play "Teenage Dream" than it is to listen to it, and that holds true even if you aren't simply cranking through the first couple of verses to set the stage for an epic, hour-long guitar improvisation.

So it is a bit of a dilemma. On the one hand, it is extremely fun to do this, but on the other hand you'll never get to do it unless you invest years of your life. And not every one of those years is going to be fun. In the beginning, you'll get blisters and your fingertips will peel off. Later on, you'll start to feel so comfortable playing in your blues box that it's going to take serious self discipline to break yourself out of it. Then you have to learn what Steve Howe does note for note, and even then you're not there yet, because then you'll have to write your own song, master it, and play it on stage in front of a captive audience. All of that is, let's face it, incredibly difficult.

When you get right down to it, we are talking about the disutility of labor, and the more labor required, the more disutility we experience. In order to keep yourself on track, you have to know at your very core that the future value of seeing the world from the top of a mountain so tall that it scrapes the edge of outer space far outweighs the present value of years - possibly even decades - of coming home from work early and cracking open a beer. As Neil Peart put it:
Some will sell their dreams for small desires
Or lose the race to rats, get caught in ticking traps
And start to dream of somewhere to relax their restless hearts
Somewhere out of a memory of quiet streets on quiet nights
He disparages the idea of "selling your dreams for small desires," but we're talking about utility here. The proverbial "mad philosopher" will offer you a choice between chopping off your little finger and experiencing a mild headache every waking moment for the rest of your life. How sharp would the expected future pain have to be to convince you to ditch your pinky? The point is that there is no right answer.

Still, at the end of the day, everyone takes on a big project or two. Everyone decides at some point to scale "their Everest" or to run "their marathon" or to learn "their Paganini's caprice." As much as it is human nature to avoid hard work, history also shows us that pursuit of greatness for its own sake also runs deep in the human psyche.

The conclusion? Make things easy on yourself. If you want to run a marathon, take your gym clothes to work with you. If you want to play the fast bits in "For the Love of God" note-for-note, then buy a travel guitar, and use it. And if your dreams require input from others - as scaling Everest requires people stationed at your own personal base camp, or as touring the country in a rock band requires other human beings playing in your rock band - then you can't simply sit back and rely on others to treat your dreams with as much reverence as you do. You have to make it easy for them to just show up, because human beings don't like to work, especially when they're only here to have a good time.

2014-03-10

Compensatory Narcissism

I don't know much about psychology, and I'm certainly not qualified to engage in too much conjecture about it. But I do have a casual interest in it, I do think about it, and so things like this tend to catch my eye:
The general problem with the narcissist is that he can't see the other, he only sees others in relationship to him. It's a movie, or a video game. It's Grand Theft Auto. Sure, the other characters are real characters, but what matters is you. You don't even have to be a good guy, or the best guy-- just the main character. It is impossible to conceive that any of the characters in GTA can have thoughts that aren't about him. "But it's a game, it's not like real life." No, to the narcissist, "real life" isn't real either, it's simulacrum. Every action is about him, positively or negatively.
And later:
Shame, however, means you are caught doing something wrong, and so people get to decide how to see you, and see you as less. This is the narcissistic injury. You can't convince the other person you are more than what they see. "Wait, it's not how it looks! I can explain-- why won't you let me explain?!" That's why narcissists aren't loners: they need the reinforcement of their identity from other people, as a bulwark against reality.
The Last Psychiatrist is writing about malignant narcissism in this case, and that's the kind of narcissism with which most people are familiar. It's the "jerky" narcissism, the kind that produces a megalomaniac or - as in the above case - a killer.

But what about the compensatory narcissist? PTypes.com (whatever that is) gives Millon's definition of compensatory narcissism: "overtly narcissistic behaviors [that] derive from an underlying sense of insecurity and weakness rather than from genuine feelings of self-confidence and high self-esteem." The whole page is worth reading if you're interested in the topic, but I'd like to focus on a few points of interest.

The "five factor" model of psychology shows that compensatory narcissists score as follows:

  1. High neuroticism,
  2. High extraversion,
  3. High openness,
  4. Low agreeableness, and
  5. Low conscientiousness.
This seems to produce a surprising blend of traits. For example, the compensatory narcissist is highly extraverted, and yet shows a great deal of withdrawal, rejectivity, and "pseudospeciation." I presume this is driven by her low levels of agreeableness, which renders itself as "cynicism." Wikipedia reports (however apocryphally) that there may be a connection between narcissism and eating disorders, and that the connection can be expressed in terms of anorexia/bulimia or of over-eating.

Also, there's this: theatophilia.

The picture of the compensatory narcissist is an interesting one. Imagine one who is cynical, skeptical, and quick to argue; yet somehow also highly aloof and withdrawn. Imagine one who is hyper-sensitive to all criticism and who, once having received it, will deflect and project and cut down others in order to make oneself appear more magnificent - or at least more right. Imagine one whose best surrogate for empathy is vicariousness.

It is simultaneously a ghastly picture and a pitiful one. The devil is both incorruptibly evil and yet only evil because he hasn't the self worth to stand on his own two hooves. 

2014-03-07

Links-Post-Facto

I linked to a bunch of stuff on social media today, so I might as well dump it into a "Some Links" post. Here you go.

These amazing full-color photos of pre-Bolshevik Russia were taken using an interesting approach: the photographer took each photograph three times, using a different filter on the lens. The result was full-color photos long before actual color photography became otherwise possible.

Kevin Erdmann links to a story of gold and silver.

Here is an excellent firsthand account of living with type 1 diabetes. Needless to say, she speaks for me, too.

Robert Murphy gives his impression from talking about Bitcoin for a week. He finds it revolutionary even over and above the monetary ramifications.

Win-win or lose-lose. Everything else is "unstable."


From The Mouths Of Babes

Here's another one that's been making the rounds on social media.

An organization that calls itself the American Association of University Women has been promoting an article about twin girls who created "a website devoted to educating other students about Title IX’s history and value." The website is here.
“My friends had never heard of this law before, and the thing I wanted to stress to them is how Title IX was a social revolution and how [it created] all these opportunities,” says Kavya.
Okay, I admit it: Criticizing the politics of an eighth grader is a pretty cheap shot. So that's not what I'm going to do. Instead, I'm going to point out how disturbing it is that anyone would laud young people for creating and spreading state propaganda.

The award-winning website's "Road to Title IX" page cites only one source: The Title IX legal manual published by the US Department of Justice. The website provides only two quotes outlining criticism of the law: one good and one bad. The good one comes from former president of the National Organization for Women, Judy Goldsmith, who said that the law was a "setback." The bad one was a quote from a seemingly random university student. The rest of the website is devoted to promoting the law.

I'm no historian, so I can't say whether Title IX has been the main driver in the push to more equal rights for women. Presumably, a website that purports to teach about the history of the law would tell me something about that. Unfortunately, all I get from the website is goose-stepping.

It's not a good thing to teach our children to sing praises for federal legislation. That is not a practical skill. It is not even a laudable skill. It is something reasonable people can do at the dinner table, nothing more. If the Ramamoorthys had created a website devoted to providing a historical account of women's rights, then I think that is a laudable endeavor. But to simply promote a highly contentious law as a "social revolution" is not worthy of our praise. It is propaganda.

2014-03-06

Credentials

This chart from LinkedIn shows the educational background of 79 people who applied for a job that pays $47,000 per year (median household income in the same area is about $51,000):
Note that the problem is not a lack of STEM-trained applicants.

My guess is that a sizable majority of these applicants could indeed perform the required tasks.

2014-03-05

Subtraction

This picture is making the rounds on Facebook. (Hat tip to Robert Murphy.)
The picture is designed to make the "new way" of teaching subtraction look excessively confusing and irrational. Surely doing one big subtraction problem is easier than doing four addition problems, right?

Well, first let's be honest: Even though the "old" way looks more concise, it is technically two subtraction problems: the first occurs as we subtract 2 from 2 in the "units" column, the second occurs when we subtract 1 from 3 in the "tens" column. But still, isn't four addition problems more work than two subtraction problems?

In this post, I'd like to argue that the so-called "new way" of teaching subtraction is not only exactly the same as using an abacus (which tends to confuse laymen who are largely unfamiliar with what the beads on an abacus are supposed to mean), but it is also totally identical to what you were taught to do as a child, no matter how old you are.

What's Going On In The Picture, Anyway?
Many people have said they don't understand where the numbers are coming from in the "new way." So let me briefly explain.

We all know it's easier to perform arithmetic on multiples of 5s and 10s than on other numbers. This is the whole logic of the metric system. So the purpose of the "new way" is to leverage the ease of using 5s and 10s in order to solve trickier arithmetic. When I was growing up, we were given "extra credit" problems that attempted to teach us that if we wanted to know what 107 + 54 is, we could work it out in our heads by rounding 107 up to 110 and 54 up to 55. That gives you 165; subtract out the 3 and the 1 (which you used for rounding) to get 161.

To be sure, some kids respond better to this kind of logic than others. Some kids see this as a shortcut. I always saw it as being a little excessive. But I concede that rounding works well for a certain kind of thinker, and that's the important part.

So, to return to the problem, here's the basic idea:
  1. Start with the smaller number, which is 12.
  2. Round up to the nearest multiple of 5, which is 15.
  3. Round up to the nearest multiple of 10, which is 20.
  4. Count up by 10s until you get close to the larger number without going over; this brings you to 30.
  5. Finish off the count to the larger number, bringing you to 32.
To go from 12 to 32, you had to "count up" 3 (to get to 15), 5 (to get to 20), 10 (to get to 30), and 2 (to get to 32). So the picture tells us that the answer is 3 + 5 + 10 + 2 = 20.

That's the logic behind it.
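
If it helps to see those five steps spelled out mechanically, here is a short Python sketch of the counting-up procedure. This is my own rendering of the worksheet's logic, not anything taken from the curriculum itself.

    def count_up_subtract(smaller, larger):
        """Subtract by counting up from `smaller` to `larger` and
        tallying the jumps -- the "new way" shown in the picture."""
        total, current = 0, smaller

        # Step up to the next multiple of 5, then the next multiple of 10.
        for base in (5, 10):
            next_mark = -(-current // base) * base   # round up to a multiple of base
            if current < next_mark <= larger:
                total += next_mark - current
                current = next_mark

        # Count up by tens without overshooting the larger number.
        while current + 10 <= larger:
            total += 10
            current += 10

        # Finish off the count to the larger number.
        total += larger - current
        return total

    print(count_up_subtract(12, 32))   # jumps of 3, 5, 10, and 2 -> prints 20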

So What's Different?
If you think about it, this is not at all different from what you were taught when you were learning simple subtraction problems as a child.

If your experience was like mine, when your teacher gave you a problem like "5 - 3," you were probably told to announce to the universe, "Three!" 

Then you'd stick out one finger and say, "Four!" 

Then you'd stick out a second finger and say, "Five!" 

Counting upwards from 3 to 5 on your fingers thus gives you a result of 2, at which point you would say to your teacher, excitedly, "Five minus three is two!" And you'd get a gold star.

Well, if you understand the logic of counting on your fingers, then you understand the logic of the method described by the photo above. The only element added in this case is the fact that doing larger problems will require you to count by 5s and 10s once you run out of fingers.

Really, think about it. To get from 12 to 20, you need 8 fingers. You can use your last two fingers if you want to, but that only takes you up to 22. As an adult, you'll quickly know to add an additional 10; but of course as an adult, you also know that the answer is 20 without having to think about it. Put yourself in the place of a child who does not know how to do multi-digit subtraction problems yet.

So you count up to 20; and as a child you know that 20 and 30 is easy, it's just another 10. And then you just have to count on your fingers again to get from 30 to 32.

So really, all we're doing here is counting up, just like you were taught to do decades ago. The only difference (heh, get it? difference?) is that instead of running out of fingers, you just round up and count by tens, and then count up to the final number.

Thus, the picture above is nothing more than a pictorial representation of counting on your fingers - but without the fingers. It's exactly the same method. It's "counting up." Identical.

The New Way Is The Really Old Way
This, by the way, is the whole logic behind an abacus. Only, instead of fingers, we have nine beads on a string. Once you run out of beads, you move up to the next string, which represents the next set of digits. So, 1-9 are on the first string, then you move one bead over on the next string to get 10. Then you count 1-9 on the bottom string again for numbers 11-19. Then you move a second bead on the second string to get 20. Continue this process until you run past the 9th bead on the second string, and then move up to the third string and move one bead over: That's 100. And so it goes.

This "new way" of teaching subtraction just means you're starting on your abacus at 12. You move 7 beads on the bottom string and one bead on the second string to arrive at 20. Then you move another bead on the second string to arrive at 30. Then you move two beads on the bottom string again to arrive at 32. If you moved your beads correctly, your abacus should show you a value of 20, and you have your answer.

Conclusion
It's been a long time since anyone taught children to properly use an abacus. It is an archaic tool, more so even than a slide rule - which people my age have only heard about and never used. But that doesn't mean the logic behind the abacus is irrelevant, useless, or needlessly complicated. Using an abacus is just a handy-yet-primitive way to perform arithmetic calculations. Its being "handy-yet-primitive" is precisely what makes it such a valuable tool for learning, which is why it's still possible to buy abacuses at toy stores.

One advantage that abacuses offer is that they are visual: you can actually see the digits "move" as you count them, so you can internalize logical concepts with a visual indicator. This "new way" of teaching subtraction, by contrast, utilizes the same technique as the abacus, but without the visual indicator. In that sense, it is inferior to the abacus, but not at all inferior to the method I was taught when I grew up.

But it does look different than what I was taught, and this may be why people on my Facebook feed have had such a hard time understanding what's going on. Initially, I myself rejected the "new way" as useless, but the more I thought about it, the better I liked the intuition behind it.

So count me among the fans of this new way of teaching subtraction.


Phronesis For Nukes

I've recently started following a blog that is very much unlike the rest of what I read/post about. It's called The Lovely Twenty-Somethings. People who find their way to Stationary Waves from places like EconLog or other such websites might find it a little puffy compared to my usual material (and outside my demographic), but the fact of the matter is that there is a lot of really great practical wisdom at this website, especially with regard to happiness and positivity.

Case in point, this recent post reads in part as follows:
Once on a day like any other day I made a comment on a video on Youtube. It was nothing extreme or incredibly offensive, in fact I wrote it in a comedic fashion. It was also a comment that was pretty similar to a few others on the comment section yet within days I found myself getting messages of replies to my comment. Rude, cruel and pretty ridiculous comments. Since I am someone who is as disconnected from negativity as possible I promptly deleted my comment and the messages to avoid getting any others, not because I was embarrassed or "running away" it was simply because the weird stress is gave me by being bombarded by people I did not know (and did not know me) and the words they so easily used towards me that I have never heard directed at me in the "real world".
This dovetails nicely with my concept of intellectual nuke buttons.

When you're confronted with these nukes, there is no possible way to "win" in a satisfying way. The guy with the nuke always wins, even if he has to blow himself up in order to get there. Your choice becomes: (a) stick around and get nuked, or (b) walk away. Blogger "Nicola K" provides us with an innovative solution in that she not only chooses (b) - which is the only appropriate choice, anyway - but she also takes the time to blot out all evidence of the negativity so that it doesn't poison anyone else. Good idea.

There is one minor point she makes with which I must disagree:
Freedom of speech is a powerful thing but insulting someone, putting them down, hurting them and suggesting that they are stupid and only the person commenting is right is not a freedom of speech. It is bullying like any other teenager or child would do except there is a chance it is from a grown adult sitting at their computer with so much negativity in them they use a comment page to release this negativity possibly because they do not have " that power" in their " real" lives.
I agree that we must not consider verbal bullying a "right," at least not in a certain emotional sense of the term "right." However, saying insulting things, however hurtful, is very much within the realm of freedom of speech. Even verbal bullying - indeed, even verbal abuse - is protected free speech in the legal sense, so long as it's not slanderous. This is right and appropriate.

But it is also not really what Nicola K was writing about.

2014-03-04

Children And Happiness

A recent article by Angus Deaton and Arthur Stone is making the rounds. In it, the authors argue that previous attempts to measure the happiness of parents compared to that of childless individuals are flawed. They give several reasons for this, none of which are particularly objectionable.

This claim that parents are, on average, less happy than childless individuals is something that pops up everywhere in the data. It is one of the primary points covered by Bryan Caplan in his book, Selfish Reasons To Have More Kids. Most people simply accept it as a fact. To the extent that Deaton and Stone tackle the legitimacy of these results, I think they've undertaken a worthy cause. There is not much point in comparing two radically different lifestyle decisions on the same benchmarks.

Really, I think even Deaton and Stone miss the point, as do Caplan and anyone else who has ever taken these results seriously. It's not that I question the empirical validity of the data. It probably is valid.

But consider this: every human being is the product of some sort of parental relationship. That means 100% of the childless people studied had parents. Their happiness was made possible only by the fact that they were born and raised. So, even if it's true that those parents are less happy than childless individuals, those parents are responsible for both their own happiness and the happiness of their childless children.

See, having kids isn't really about yourself, it's about your kids. This should be obvious, but for some reason it isn't. You shouldn't have children under the assumption that your children are going to make you happy. Instead, you should have children under the assumption that you're going to make them happy. That's what being a parent is all about: producing healthy, happy offspring who are glad to be alive.

If this costs you many tropical vacations or stress-free evenings, then that is simply the price you pay for the happiness of another generation of human beings. If you don't care about that, then fine - don't have kids. But if you do have kids, you probably don't worry too much about whether they make you happy. Instead, you probably worry about what might make them happier.

I'm not saying being a parent should be a mindless sacrifice; I'm just saying that parents who are, say, 25% less happy about their own lives than a comparable childless individual are (literally) infinitely happier about the lives of their children than the childless individuals are about theirs.

Tips To Keep Your Blood Sugar Down

I'm writing this down mostly for my own benefit, but I thought I'd post it publicly in case someone else finds it useful.

Basically, every diabetic knows what to do to keep their blood sugar down, but there are so many factors that influence blood glucose levels that they aren't always easy to keep track of. So here is a handy cheat-sheet of things that I often forget or sometimes lose track of:

  1. Eating more than 30g of any kind of fat in a single meal. It's hard to keep track of this, especially since so many diabetes-friendly foods are high in fat (avocados, nuts, seeds, etc.). A pat of butter is 15g, a tablespoon of olive oil is 8g. You can see how, in the normal course of cooking - even healthy cooking - exceeding 30g of fat in a meal is easy to do. Note: Your blood sugar will spike 3-4 hours after eating >30g of fat, so it won't always be obvious that fat is the problem.
  2. Eating more than 60g of carbohydrates in a single meal. It's much easier to control your carbohydrate intake than your fat intake (if you're a diabetic), but it's worth noting that even if you take enough insulin to cover a high-carb meal, you still might end up with a glucose spike if you eat in excess of 60g of carbs.
  3. Coming down with a cold. Getting sick pushes your blood sugar up. In my experience, the effect kicks in about two days before I ever even feel myself getting sick. If you've ever experienced truly inexplicable blood sugar increases that persist over the course of a day, you might be getting sick.
  4. Stress. It's crucial that you take some time away from your stressors and focus on happy, soothing, calming things. Take the extra 15 seconds to calm yourself down when you're in a stressful situation, or your blood sugar might be running high the next time you test.
  5. Not drinking enough water. Because we often drink water to bring our blood sugar down from an existing high, we sometimes don't remember that drinking very little water over the course of the day can result in a high reading or two. Carry a bottle of water with you everywhere, and you won't have to worry about this.
  6. Restless nights. Unless you're active on the party circuit or a workaholic, you probably don't have much control over the occasional sleepless night. But when it happens, be prepared for higher BG readings.
  7. Insulin past the 28-day mark. This one tripped me up the other day. If you don't make it all the way through your open insulin container over the course of 28 days, don't be surprised if you find that your readings start to trend upward. In my experience, 28 days is exactly when my insulin starts to lose its potency. Sometimes I forget to set an alarm to remind myself to open a new cartridge, but it's important.
Obviously this is not an exhaustive list of factors that raise blood sugar. This is just a list of things that I, personally, lose track of. Typically I'm so busy focusing on 1 and 2 that I forget about 3-7. Or I'll home in on 5 and then lose count of 1. Or "hidden carbs" like carrots - which usually count as "free" carbs - will add up and push me over the tipping point on 2.
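Since items 1 and 2 are the only ones with hard numbers attached, here's a minimal sketch of the back-of-the-envelope meal check I'm describing - purely illustrative, and obviously not medical advice. The function name is made up; the thresholds simply restate the 30g/60g figures from my own list above:

```python
# Flag a meal against the two numeric limits from the list above:
# more than 30g of fat (item 1) or more than 60g of carbs (item 2).

def check_meal(fat_grams, carb_grams, fat_limit=30, carb_limit=60):
    warnings = []
    if fat_grams > fat_limit:
        # Item 1: a fat-heavy meal tends to show up as a spike 3-4 hours later.
        warnings.append(f"Fat is {fat_grams}g (>{fat_limit}g): expect a delayed spike.")
    if carb_grams > carb_limit:
        # Item 2: even a well-covered high-carb meal can still spike.
        warnings.append(f"Carbs are {carb_grams}g (>{carb_limit}g): a spike is likely even with insulin.")
    return warnings

# Example: a "healthy" meal of avocado, nuts, and olive oil can quietly
# blow past the fat limit even while the carb count looks fine.
for warning in check_meal(fat_grams=34, carb_grams=45):
    print(warning)
```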

Anyway, try to keep this stuff in mind. Or poke me with a stick if you happen to see me slipping up. 

2014-03-03

The Last Psychiatrist May Have Just Killed My Blog

Let's Start With The Punchline
Over the weekend, my favorite blogger (so, I guess that's a thing...) delivered another fine post. Of course I'm talking about The Last Psychiatrist, and of course I'm talking about his latest send-up of pop culture analysis.

It would be tempting to write this blog post as follows: I provide excerpts, each one followed by my own thoughts. Then I pretend that I have added value. But if I'm being honest with myself and with you, the reader, then all I really need to do is link to that post and ask you to read it. As a fan of that blog post, I'd like to ask you to visit the author's blog and consider his thoughts. That's the only real value I can add here. So please do - as they say in the "blogosphere" - read the whole thing.

What Do I Mean, "Killed My Blog?"
One of the recurring themes from that blog is the phrase, "If you read it [or watch it, or listen to it, or download it, etc.] then it's for you." Most recently, he writes this in reference to psychology studies "in the New York Times," but it's an aphorism he's coined for media consumption in general. The idea is that even things that seem confusing, or jarring, or stupid, or infuriating are things that were designed for people like us. So, if you find your forehead vein starting to bulge over something Paul Krugman wrote, then you should be careful how you react. The fact that you read it - even though you hate it - suggests that you're the target demo.

Not coincidentally, I stopped reading Paul Krugman's blog posts and articles a long time ago. Does that make me "better than you?" No. It only means that I'm not Paul Krugman's target demo. On the other hand, I look forward to reading EconLog every single day; coincidence? No. Nor is it any coincidence that I also look forward to reading Marginal Revolution, or Less Wrong, or indeed even The Last Psychiatrist.

At this point, it's tempting to throw it all out with the bathwater, reasoning that I only read these things to allow myself the fantasy of doing good economics, and thinking carefully about current events, and becoming a more rational thinker, but without having to do any of the actual work necessary to credibly stake a claim to those things. That's the ultimate conclusion I'm working toward. But there is also the more obvious element that I simply enjoy reading the thoughts of intelligent people with whom I agree. A large part of it is the fact that they put words to things I haven't been able to articulate for myself. That's the power of a good writer; or a good hero. There's nothing to criticize in that.

Nevertheless, it all raises the question: if you read Stationary Waves, then it's for you; so who reads Stationary Waves?

I have web analytics to back this up, so pay attention: The answer to that question is, "Not very many people at all, really." I do get a surprisingly large number of hits every day, but I have reason to believe that the majority of those hits are spam-bots, computer algorithms that have been developed to visit random blogs on the internet because they know I will look at my web analytics and say, "Holy gee! I got twelve hits from something called credit-info.rsgad.com this month! They must have linked to me!" And at that point, the assumption is that I will click on the link to see my name in print, only to discover that it is an advertisement for something.

Please note: that URL I just gave you is a real one, from my real web analytics back-end, and I have no idea what's on the other side of it. Click at your own risk.

All this is to say that the internet has rigged the game to make me believe that I am getting a lot of hits on my popular blog - but I'm not. They rig the game so that I will keep blogging, so that I will keep clicking, so that eventually I will click on something I like, so that eventually I will click on something that I will buy. But this kind of tactic doesn't work on me, so the robots are out of luck.

As for my readers, I'm not actually trying to sell you anything. I've toyed with the idea in my mind, of course, but it never quite makes sense. What do I have to offer? I give you my music for free, and my thoughts do not have any real value for anyone other than myself.

So I bring a small level of low-cost entertainment into your life, and I hope you appreciate it for what it is: modest, local entertainment. I'm down-scaling from the political themes of the past and focusing on thoughts that, ultimately, mean more to me than they do to the readers of my blog. This is, perhaps, why my hit count has been gradually decreasing.

Even the spam-bots are figuring out that there isn't much value-added here.

Value, As Opposed To The Fetish Of Value
So The Last Psychiatrist writes, "You hold a fetish of value and not actual value." He wasn't writing about me, but the statement resonates with me all the same. Furthermore, I think it is true of a great many people who dream bigger than they live.

I was in Walmart recently and saw a married couple talking. They were both about the same age as I am, but they had obviously lived a lot harder than I have. They had certainly consumed more alcohol and tobacco over the course of their lives than I have - probably much more. Tattoos, low incomes, low intelligence, the whole kit and caboodle.

I was surprised by their behavior as they spoke. They spoke as though they were the two smartest people in the world, and everyone else were idiots. I'm talking about their mannerisms and attitudes; what they were talking about was something totally mundane. I don't talk that way about mundane things - but if we widen the net to catch other areas of life, I sure do! I like to think that if I were running "the show" (whatever it is) differently, things would be a lot better. Elsewhere in Walmart, someone must have been looking at me, thinking to himself, "How curious - he sounds just like me, but he talks about stupid things like work."

That guy is probably the real deal. That guy has real value. He's probably designing a better jet engine or something - something of actual value. But enough about him, I'm talking about me.

At this point, it should be glaringly obvious:

  • I like to read and comment on economics blogs because I like to feel like an economist without having to go through all the work of, you know, earning a PhD in economics and then working my tail off to publish in high-quality journals.
  • I like to read Less Wrong to pretend I am honing my critical thinking skills without having to go through all the work of, you know, doing math problems and logic puzzles that actually will improve my critical thinking skills.
Wait, wait - it gets worse. That's only two bullet points, but think about the others. This runs pretty deep. I like to record and post videos on YouTube because I like to think that my music is reaching a wide audience, even though I've never recorded, printed, and released an actual album. My live performances tend to gather a good crowd, but I haven't delivered one of those in a long time.

In fact, when you think about it, live music as a commercial enterprise hasn't been profitable at the smaller scale since the early 1990s. As soon as clubs figured out that they could simply broadcast pre-recorded music over the sound system and people would still buy drinks, they stopped paying for musicians. Why bother, when a DJ works cheaper? Especially when the DJ is an MP3 player set to "auto-shuffle."

Since the early 90s, then, live music in clubs has been a commercial endeavor primarily aimed at making money from the band and the band's friends, not from the public at large. The club owner says, "Sure, you can play, but you have to sell your own tickets and do your own marketing, and I take a cut of the ticket sales." Translation: You're more than welcome to play if you do all the marketing work required to bring customers to the club. What club owner wouldn't take that deal? Bands are so desperate to be performers that they'll stop at nothing to get on stage and have glamour shots taken of themselves as they play "Brown-Eyed Girl."

While we all like to pretend that we're great local rock stars, we're really just selling free marketing to club owners for the sake of a feeling. We don't bring musical value to the world; we're just willing to pretend that we do in order to have the club owner give us validation.

And, yes, my blog comments generate hits for the economics blogs on which I comment. I feel like I'm participating in a brainy discussion - and maybe I am, to some extent - but the major value I'm adding is to pad the hit counts of the blogs I frequent. I'm not really adding any value there. If I were, my own blog would be huge.

But Fear Not
While this might seem like a pessimistic, self-deprecating blog post, it actually has a huge upside.

I am not as bad as a lot of other people out there. That is, I've never taken my Facebook account too seriously, I've never invested any time in "cultivating an online persona," and I do not live my whole life in pursuit of a value-fetish. I'm as guilty of doing these things as anyone else - on the margin. But on the whole, I'm doing pretty good.

I might not have been able to "run the show" at work, but I have been able to pack away enough savings that I can invest in value outside of work. On some level, value does matter to me, and I pursue it; just not relentlessly. I'm not a workaholic, but I have things on the go, and that gives me a little more security than the hypothetical "Randi Zuckerbergs" out there of whom The Last Psychiatrist writes.

So this is what I'd like to leave you with today: the sooner you figure out your illusions, the sooner you can overcome them. When you free yourself from fetishizing value, then you can get to work adding real value to your life. When you stop pretending that Facebook matters and start acknowledging that the only things that really count in life are your own memories, your own happiness, and your interactions with those you love, then you win the game.

Yes, life is a game, and yes, it's possible to win or lose. You don't win by becoming a millionaire or being interviewed by Katie Couric, though. You win by having a deeply satisfying home life and a long list of treasured memories. To be sure, for some people that might mean spending long hours at the office. If that's what you love, I say go for it. Those memories will count for you on your deathbed.

But if it's not career success that gives you the best memories, then don't sweat it. Focus on the things in life that give you stories to tell your grandchildren about. Add value to those things, and do so relentlessly. Never stop adding value to those things, never give up. Your own happiness is the only fixed point in the universe; head straight for it and never stop. If you only ever take one thing from my blog, take that.