2015-08-31

You Know Your Movement Is In Trouble When...

A few years back, Emily Yoffe, AKA "Dear Prudence," got in trouble with her readers - and the media - for suggesting that college girls who drink too much at parties are putting themselves at increased risk of sexual assault. Later, she wrote:
I wrote a story whose message is obvious: The campus culture of binge drinking is toxic, and many rapists prey on drunk young women. I said that when women lose the capacity to be responsible for their actions, sexual predators target them for attack. As banal as these observations are, I knew this story would result in a torrent of outrage.
Something called Feministing, which I guess is a pun on "fisting," although I'm not sure why on Earth feminists would want to reclaim a word like that, called it a "rape denialism [sic] manifesto." I thought that was preposterous.

If you're not familiar with Yoffe's work, you might not know her ideological bent. It's fairly safe to say she leans left, even if not far left. I mention this because Yoffe is exactly the kind of reasonable left-leaning media pundit that leftist feminists would be happy to call an ally; yet when she suggested sobriety as a potential safeguard against sexual assault, she was criticized.

Today, we learn that true feminist icon Chrissie Hynde has suffered much the same fate for much the same line of commentary. See for yourself:
If I'm walking around and I'm very modestly dressed and I'm keeping to myself and someone attacks me, then I'd say that's his fault. But if I'm being very lairy and putting it about and being provocative, then you are enticing someone who's already unhinged - don't do that. Come on! That's just common sense.
Twitter, which has inexplicably become our moral compass, was quick to tar and feather her as a victim-blaming fool. Someone coming to her defense had this to say:
People criticising Chrissie Hynde for her comments are overlooking that she is a victim and this is self blame. I feel v sorry for her tbh
If someone had told me years ago that feminists would be criticizing Chrissie Hynde for having antiquated views on rape, I think I would have been speechless.

I'm for women's equality, and in that sense, yes, I am a feminist. But I don't want to be put in the same category as people who are criticizing the likes of Yoffe and Hynde for stating something so obvious as to be a banality.

There's something wrong with the way feminism is being applied in today's world. On some level I wonder if what bothers this new kind of feminist is not the prospect of having fault so much as the idea of having responsibility. Of course it's not your fault if someone breaks into your house, but if you deliberately leave the keys in the front door keyhole you have facilitated your own burglary. What makes your body so different?

2015-08-28

The Hidden Assumption

Bryan Caplan wrote an interesting piece on the ethics of the Ashley Madison hack. Scott Sumner replied. Caplan replied to the reply.

At issue is the question of whether Ashley Madison users "deserved" to have their data stolen. Caplan doesn't come right out and say it, but he strongly implies it:
Constructing hypotheticals with blameworthy pseudo-victims is easy enough. Imagine someone attacks you with a chainsaw because you failed to kiss his feet. When he misses your head, he accidentally saws off his own hand. Telling him, "This is your fault" as he clutches his bloody stump is not victim-blaming. Or to take a less egregious case, suppose a worker feigns sickness so he can go to the basketball game. Co-workers spot him on TV in the audience and he gets fired. If he decries his fate, "This is all on you" is the bitter truth.
Sumner disagrees:
Or let's take another case; you decide to violate the law by jaywalking, crossing the street in the middle of the block. You are struck and killed by a car. (This happened to a Bentley student a couple years ago.) Are you going to claim this person was a "victim" when she was clearly violating the law? Um, actually yes, I sort of do view her as a victim.
So Caplan clarifies his position:
None of this means that people who suffer horribly as a result of committing minor offenses aren't victims. I don't think that. I jaywalk, and I don't deserve to die. When people seriously suffer as a result of committing major offenses, however, I call that just deserts.
In essence, Caplan's point is that if negative consequences arise from a clear moral breach, then (as ethicists) we shouldn't take pains to avoid blaming people for their moral shortcomings. If someone severs his own hand while attempting to kill you, tough cookies, he shouldn't have tried to kill you.

In essence, Sumner's point is that a single moral breach isn't enough to justify significant suffering.

I was attracted to Sumner's position until I realized something: Sumner is suggesting that having your credit card data stolen is excessive punishment for cheating on your spouse. At the end of the day, Sumner might value marital fidelity a lot, but not nearly as much as he values the sanctity of his credit card data.

That's surprising.

2015-08-26

Good Explanation, Poor Prediction

Let's say you're overweight, have a bad diet, and never exercise. Let's say you go to the doctor for an annual checkup, and - for years - he tells you, "You've got to lose some weight, change your diet, and increase your activity level, or else you're going to develop type 2 diabetes."

But, for years, you choose not to change your lifestyle and, for years, you don't develop type 2 diabetes. From time to time, some of your friends say, "Doctors like yours don't know anything about diabetes. They've been predicting that you'd become a type 2 diabetic, and it still hasn't happened!"

Then, one day, it happens. You're devastated. In a moment of weakness, your doctor says, "For years, I've been warning you that your poor health habits would result in type 2 diabetes - you should have listened to me."

Suppose you were to respond to your doctor as follows:

"True, for years you warned me and for years it never happened. So how useful was your warning? Not very. Clearly I have now developed type 2 diabetes, but that doesn't mean your theory about weight, diet, and exercise is correct. What, if anything, would cause you to second-guess your beliefs?"

The moral of this story is: Sometimes ideas - even ideas we think are completely uncontroversial - have strong explanatory power but poor predictive power.

The Cost Of Diabetes Supplies

A while back, I discovered the ReliOn brand of diabetes supplies. From what I can tell, this seems to be a brand that is exclusive to Wal-Mart; I've never seen these supplies sold anywhere else. The cost of these supplies is much, much lower than the other brands they compete against. Not only are they drastically lower than the brand name supplies, but they are also drastically lower than the generic supplies sold in other stores.

As for their quality, I can vouch for the glucometer and test strips, the lancing device, and the insulin pen needles.

The glucometer and test strips have accuracy that is near-identical to that of my OneTouch meter and test strips. That is to say, each test can be as much as 10% off or so, and consecutive tests may yield inconsistent results, but the problem is no worse with the ReliOn meter/strips than it is with the (much more expensive) OneTouch meter/strips. I also use the Bayer Contour Next glucometer and strips, the same meter they use in emergency rooms, and it is the most accurate meter I've ever had. ReliOn doesn't beat the quality of the Contour Next, but the Contour is much more expensive.

The lancing device, like all of them, is a spring-loaded pin; there's not a lot that can go wrong there. The first one I bought was spotty, but the next one has lasted. I recently purchased a spare generic lancing device from CVS, and to my surprise, it was identical to the ReliOn device, but for a different logo printed on the outside of the device.

But the best ReliOn product has to be the insulin pen needles, which are vastly superior to BD needles, although not quite as good as my personal favorites, the Novo-fine brand needles. The ReliOn needles cost a fraction as much, never get blocked over a single use, perfectly fit every insulin pen I've ever owned, and are generally thinner-gauge than the BD needles, which means less pain per injection.

I often purchase ReliOn products for cash even though my insurance company reimburses me for prescription supplies, simply because the price of ReliOn products is less than or equal to my prescription co-pay. Again, without any loss in quality.

If you're in search of low-cost diabetes supplies, I highly recommend the ReliOn brand.

The other day, at CVS, I noticed a man shopping for his first glucose meter and test strips. The pharmacist was trying to talk him through the cost of supplies, and which meter would be "best" for his situation. When she stepped away from him for a moment, I told him that he didn't have to pay for a meter, that companies will usually send him a free one. My goal was to give him a good piece of advice, and hopefully let him know about ReliOn, just to save him some money since he was obviously paying cash. Unfortunately, he misunderstood me, and got the impression I was concerned that he didn't have enough money to pay. (Not so - I just wanted him to get the best deal possible.) So I never got the chance.

Fast-forward to today, when I saw an electronic sign outside a big-brand pharmacy stating, "We sell Medicare diabetes supplies!!" It struck me that this seems to suggest that Medicare won't reimburse for ReliOn brand supplies. (I'm conjecturing because I don't actually know. Can one of my readers please confirm?) This means that the government, the insurance companies, and patients are all paying too much for diabetes supplies.

If you're a betting man/woman, you might want to find out whether ReliOn is a publicly traded company. If so, I predict its stock value will increase as more patients discover what I've just written here. (Disclaimer: Please don't misconstrue this as qualified financial advice. Trade at your own risk.)

2015-08-25

I Don't Get It, And It's All Stephen Hawking's Fault!

This is how reporters cover science:
Hawking’s idea isn’t entirely new: In fact, ’t Hooft made a similar proposal nearly two decades ago. It’s still unclear just where the differences lie, but it may be that Hawking has resolved some previous difficulties with the theory.
It's not "still unclear." It's perfectly clear. I bet Hawking and 't Hooft both understand exactly what the difference is between their two theories. It's only "still unclear" to the reporter, who tried and failed to understand the difference between their ideas.

The reporter uses language to magically turn the phrase "I don't know what these guys are talking about," into "Maybe one day science will understand this better." The latter statement is certainly true, but it has nothing to do with the fact that the reporter doesn't get it. That's his problem. He makes it sound like his inability to understand is due to unclear theorizing on the part of the most brilliant physicists in the world.

How can a science reporter be incapable of admitting what he doesn't know, i.e. owning up to the cornerstone of scientific inquiry? Worse, how can he blame Stephen Hawking for the fact that he doesn't understand?

To be clear, there is absolutely no shame in not understanding theoretical physics as well as the leaders in that field. I would expect that most people, myself included, fail to grasp the cutting-edge ideas in physics. What bothers me, though, is that the reporter doesn't simply acknowledge that he can't understand it. Instead, he pretends that the matter is "still unclear," as if the question here is that maybe Hawking and 't Hooft are the ones who aren't sure about their own theories.

(Traffic) Signal Versus Noise

I have a fairly long daily commute, and accordingly, I've spent a lot of time observing traffic patterns and behavior. Every city that I've ever driven in has its own unique "driving personality," i.e. sets of behaviors that are more common within that city than outside of it. I'm sure you've noticed the same thing, and maybe you've also noticed that if you spend some time acquainting yourself with a city's "driving personality," you can often guess what a fellow motorist will do long before they do it - maybe even before they know they'll do it. This isn't a special or unique skill, it's just part of being an attentive driver.

I often get a kick out of motorists on a busy freeway who dart between lanes in an attempt to travel as quickly as possible. Many of them will change lanes as soon as they see a spot in traffic that might enable them to pass a small subset of cars. They change lanes, pass a few cars, and then wait for their next opening to pass the next subset.

This is funny to me because a lot of these drivers must be well accustomed to driving these routes at that time of day on a regular basis. They're responding to the "noise," i.e. the momentary traffic patterns that might enable them to pass a few other cars, but they're ignoring the "signal."

In this case, the "signal" is the long-term trend. I drive more or less the same route every day. I don't need to dart between lanes and pass a few cars at a time because I'm already familiar with which lanes are, on average, faster than the others at which point on the road.

I've occasionally surprised people by how quickly I commute on the route I choose to drive. People assume I'm speeding most of the time, but I'm not. I simply pay attention to which point on the road is usually the best place to change lanes. On average, I easily cruise past the drivers who are constantly changing lanes, because sooner or later they make a wrong move that holds them back. Meanwhile, while I seldom make a "winning move," I never choose wrong.

On any given day, another motorist driving the same route might make it faster to my destination than I do. But, on average, I always save time compared to others, and especially compared to hastier drivers who are always trying to pass the next car ahead of them.

What's your favorite traffic trick?

2015-08-20

Music Is Dead And People Just Don't Get It

Tyler Cowen links to this New York Times article about musicians' rising incomes, called "The Creative Apocalypse That Wasn't." Author Steven Johnson suggests that all the fears of new technology devastating the livelihood of musical artists were unfounded.

Johnson introduces his thesis as follows, but predictably, I already think he's on the wrong track (emphasis mine).
The world of professional creativity, the critics fear, will soon be swallowed by the profusion of amateurs, or the collapse of prices in an age of infinite and instant reproduction will cheapen art so that no one will be able to quit their day jobs to make it — or both. 
The trouble with this argument is that it has been based largely on anecdote, on depressing stories about moderately successful bands that are still sharing an apartment or filmmakers who can’t get their pictures made because they refuse to pander to a teenage sensibility. When we do see hard data about the state of the culture business, it usually tracks broad industry trends or the successes and failures of individual entertainment companies. That data isn’t entirely irrelevant, of course; it’s useful to know whether the music industry is making more or less money than it did before Ulrich delivered his anti-­Napster testimony. But ultimately, those statistics only hint at the most important question. The dystopian scenario, after all, isn’t about the death of the record business or Hollywood; it’s about the death of music or movies. As a society, what we most want to ensure is that the artists can prosper — not the record labels or studios or publishing conglomerates, but the writers, musicians, directors and actors themselves.
Johnson is right about what "the dystopian scenario" is, but I'll get to that in a moment. Right now, I'd like to challenge his assertion that "As a society, what we most want to ensure is that artists can prosper." I'm not so sure that's true.

And I'm not just being cynical about society; I'm suggesting that whether or not artists "prosper" is entirely irrelevant to society's wants or needs. But that's not unique to artists; it's a fact of every other occupation out there. Society doesn't want artists, or doctors, or lawyers, or garbage men to "prosper." People in various professions prosper because they give society what it wants. In the case of doctors, society wants to be healed. Society would be thrilled if medical technology could be delivered robotically at a cost of zero dollars and all doctors had to seek other kinds of employment. That would be huge!

Analogously, society doesn't want artists to prosper, society wants art. Whether or not the artist prospers is beside the point. Many great artists throughout history have died as paupers and society didn't care. What society cares about is the output, the art. That's the end goal.

That is, by the way, why the "dystopian scenario" isn't about the death of business, but rather the death of the artistic media, the output. So Johnson contradicts himself here.

By The Numbers

To bolster his case, Johnson cites income data on performers and artists. He points out the following:
  • The Occupational Employment Statistics show* a >20% growth in the category that includes artists and performers (compared to a >14% growth in US population).
  • "Annual income for [this occupational group] grew by 40 percent, slightly more than the O.E.S. average of 38 percent."
  • A consulting agency used the US Economic Census to conclude* that there was a 40 percent increase in the number of self-employed performers, whose income grew by 60 percent during the ten-year time frame between 2002 and 2012.
Johnson then concludes that, while musicians and artists in aggregate have experienced income growth that merely kept pace with inflation, they haven't seen a decrease in income, and meanwhile the number of people employed in those fields has increased, and the incomes of the self-employed artists are estimated to have increased sharply.

It sure looks like a good case for artists and musicians. More people in those fields are making as much money or more than ever before. 

But remember: Johnson isn't arguing against the claim that nobody makes money in music and art anymore. No, he's arguing against the "dystopian scenario" that movies and music are dying.

Quality, Not Quantity

On the question of quality, Johnson is a lot lazier. I, personally, would have preferred that he tackle things from the angle of aesthetics, but that's a tall order in this philosophy-deprived world. Still, Johnson doesn't even give us a cursory "there's no accounting for taste," but rather dismisses the mere possibility that quality has decreased, at least for one medium (emphasis mine):
What about the economics of quality? Perhaps there are more musicians than ever, and the writers have collectively gotten a raise, but if the market is only rewarding bubble-­gum pop and ‘‘50 Shades Of Grey’’ sequels, there’s a problem. I think we can take it as a given that television is exempt from this concern: Shows like ‘‘Game Of Thrones,’’ ‘‘Orange Is The New Black,’’ ‘‘Breaking Bad’’ and so on confirm that we are living through a golden age of TV narrative. But are the other forms thriving artistically to the same degree?
I guess if you don't think Breaking Bad is one of the greatest artistic narratives in television history, then you're just denying reality. The sad thing here is that these series are, essentially, R-rated soap operas. They're certainly glitzy, but their only real attraction is (a) sex and (b) end-of-episode cliffhangers. The action is slow, and the plots are relatively predictable, right up until the end of each episode, at which point, there is a surprise twist that seems interesting enough to make you want to watch the next episode.

...which, when you think about it, is exactly how a soap opera works. It works for getting viewers, sure, but let's not kid ourselves. This ain't Tolstoy.

As for movies, Johnson evades the quality question entirely, and chooses instead to focus on the total earnings of films that were made under a certain budget level and received a certain Rotten Tomatoes score. Because the earnings number went up, Johnson concludes that cinematic quality is higher than ever. But that is a complete non sequitur. As one point of countervailing evidence, I'll simply remark that one of the movies Johnson lists as artistically challenging is Zero Dark Thirty - remember that piece of propaganda put out to glorify the assassination of Osama bin Laden? Take that, Citizen Kane!

This Is The End, My Only Friend, The End

More telling than that, however, is the fact that Johnson doesn't even attempt to make a case for the artistic integrity of the music business; not even a cursory paragraph, just nothing. So in lieu of dismantling the argument he obviously knew he couldn't make, I'll simply ask you to engage in a thought experiment for a moment.

What would the "dystopian scenario" look like in the music world? What would "the death of music" actually look like? 

Here's what I think it would look like: I think, rather than a complete absence of music, what you'd see is a situation in which it would be impossible to get away from extremely bad music. Such music would be omnipresent, cheap to produce, and virtually limitless in quantity. As such, it would deliberately eschew harmonic and rhythmic complexity, in favor of tried-and-true compositional elements that could be recycled as many times as possible.

So, for example, instead of the 20-minute-long symphonies, with thematic development that spanned the entire 20 minutes, such as those that used to debut a century ago, we'd see simple building blocks that could be copy-pasted as many times as it took to get to 20 minutes. The pieces would be highly repetitive and whatever thematic development that unfolded over the span of the piece would be that which could be supported by a limited number of building blocks. -- That's techno.

In the dystopian scenario, we'd see a situation in which the performance of a piece takes a back seat to precision. The art involved in the composition of this kind of music wouldn't pertain to elements that challenge a listener artistically, because that would be expensive, time-consuming, and commercially unattractive. Instead, the art involved would pertain to those elements that a modern, technology-savvy society could easily grasp: how precise is the rhythm, how many "wrong" notes are there, can it be replicated on my home copy of GarageBand, etc.? Because performance and musical proficiency are difficult to assess, most critical evaluation of music would focus on either its "X-factor," or its production quality. Not, "Does this concerto move me?" but simply "Was this concerto played on an instrument that was well-recorded by a cool new mic?" (A felicitous offshoot of this kind of musical evaluation is that it is well-suited to product placement. Gotta keep you buying gear so that you can soon make a techno symphony of your own.)

But in the dystopian scenario - the true worst-case - it wouldn't be true that all artists were techno artists. Instead, we'd see a situation in which even the serious artists produce this sort of music and are evaluated on the same level as techno artists. Serious music, whatever it sounded like, would just be another genre, another category of Grammy recipient. In the dystopian case, hackneyed 4-chord country artists would be given the same kind of accolades once reserved for, say, Beethoven.

See, in the real dystopia, music consumers would basically be ambivalent toward, oh, how about Nicki Minaj... between Nicki Minaj and Maurice Ravel. Artists like Minaj would be financially successful, or at least they'd be able to make a living as a rapidly growing new category of artistic entrepreneurs. Those more like Ravel, who rely on a combination of time, effort, originality, and artistic integrity to do what they do, would be about as successful as Nicki Minaj, initially; but then they would realize that they can earn the same amount of money with far less effort by uploading video remakes of video game songs and AutoTuned political speeches set to drum machines to their YouTube account and monetizing it.

Ha ha ha, but that would be ridiculous. That's just the dystopian scenario. That would never happen.

___________________
* Note: I did not personally verify Johnson's data, but elected to take it at face value. Interested persons should verify these data for themselves.

2015-08-19

The Ethics Of Public Knowledge Of Your Ethics

From The Awl, on the AshleyMadison.com hack:
Such a scenario would present a number of new questions for many more internet users— questions the nature of which they’ve never really had to deal with. If the names and email addresses are available in a simple Google-like search, for example, will they search for their partners? Friends? Coworkers? Representatives? Family members? If so, why? If not, why not? Will you seek out the raw leak data after reading this post? Will news organizations, presented with user profiles associated with public figures, ask for comment? Treat each as news? Which ones? How? The last time people dealt with similar questions on a large scale was when troves of internal Sony documents, including emails, were leaked. Before that, it was when hundreds of private celebrity photos were stolen and released last year. That act was widely denounced, as were the millions of subsequent acts by the people who viewed the photos. But enough people looked at these photos to set traffic records for sites like Reddit. In any case, an incredible number of ethical questions are posed by this situation!
The first ethical question raised by the hack is easy: Is it ethical to steal data from a private company to make this kind of social statement? No. What The Awl manages to point out, though, is how difficult the remaining questions are.

I don't believe it's fair to say that absolutely every user at AshleyMadison.com was a bad person engaged in unethical activity. Some might be there with the full blessing and consent of their partner(s). Some might be seeking escape from an otherwise inescapable situation. Some victims of the hack may not have used their accounts for years. Some may have reformed, made amends, and moved on with their lives. There might be some other users engaged in neutrally ethical activity. While, "They're cheaters who deserve what they get!" is an attractive knee-jerk, a more patient level of consideration reveals that a lot of these folks - over 30 million people, in fact - are innocent bystanders who don't deserve to be stolen from.

For that matter, even assuming they were guilty of cheating, does that mean they deserve to be victims of identity theft? By what logic would such a conclusion make sense? From what I can tell, the only reasoning that supports that conclusion is, "They did something morally wrong and deserve to suffer for it." But should they suffer anything, just for a moral lapse in one area of their lives?

There are even more interesting ethical questions regarding how you, a casual surfer of the internet, choose to respond to the hack. If you voluntarily search the data dump for incriminating evidence against people you know - even people you love - are you doing something wrong? The answer seems to be, "No, if you are a victim of someone else's cheating; but yes if you're digging for dirt on the people in your life." The problem is, how will you know whether your search is justified until you actually engage in the search? Are you comfortable with the conclusion that you yourself are morally culpable if your search turns up empty, but justified otherwise?

What if we discover that a disproportionately large percentage of the AshleyMadison.com user population holds positions of power? What would you conclude about that sort of situation? Who might you blame for staffing your public service with moral failures? Will you vote against them in the next cycle?

But here's what really gets my motor cranking: 30 million people have been potentially "found out." There is a good chance that among them is someone you know, someone who you believe to be a fundamentally "good person," but about whom you now have evidence to the contrary. Your opinion of that person is bound to change, but this is because that person's morality has been made public. They never had any good reason to think they'd be caught doing what they were doing, and your opinion of them was positive. They were caught, and your opinion changed. So, your opinion is largely founded on whether or not someone is caught in the act.

How many of your actions would survive that level of public scrutiny?

Minimum Purpose Machine

My Facebook feed alerted me to the existence of an interesting, even if misguided, piece of art: The Minimum Wage Machine. Slowrobot.com has a synopsis of the piece:
This machine allows anyone to work for minimum wage for as long as they like. Turning the crank on the side releases one penny every 4.97 seconds, for a total of $7.25 per hour. This corresponds to minimum wage for a person in New York. 
This piece is brilliant on multiple levels, particularly as social commentary. Without a doubt, most people who started operating the machine for fun would quickly grow disheartened and stop when realizing just how little they’re earning by turning this mindless crank. A person would then conceivably realize that this is what nearly two million people in the United States do every day…at much harder jobs than turning a crank. This turns the piece into a simple, yet effective argument for raising the minimum wage.
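The quoted payout rate is easy to verify with a couple of lines of arithmetic (a quick sketch of my own, not part of the synopsis):

```python
# Sanity-check the Minimum Wage Machine's payout rate:
# one penny every 4.97 seconds, claimed to equal $7.25/hour.
SECONDS_PER_PENNY = 4.97

pennies_per_hour = 3600 / SECONDS_PER_PENNY  # ~724.3 pennies
hourly_wage = pennies_per_hour / 100         # in dollars

print(f"${hourly_wage:.2f} per hour")
```

The 4.97-second interval is just $7.25/hour worked backwards: 725 pennies per hour comes out to roughly one penny every 4.97 seconds, so the machine pays New York's minimum wage to the nearest cent.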
Let's summarize the argument more succinctly:

  1. Turning a purposeless crank is easy.
  2. Turning a purposeless crank for minimum wage is unattractive to museum-goers.
  3. The average minimum-wage job is more difficult than turning a purposeless crank.
  4. Museum-goers are unlikely to be attracted to more difficult minimum-wage tasks, considering that they are already unattracted to turning a purposeless crank.
  5. The minimum wage should be increased.
If one wanted to quickly do away with the Machine's argument, one could simply point out that nobody wants to spend their museum time (i.e. leisure time) working for minimum wage. I don't know about you, but I go to museums on my days off. I wouldn't turn a purposeless crank for my current salary, much less for minimum wage, at least not on my day off. That's the whole point of getting a day off.

So there's that.

But leave that aside for a moment. How does the Machine's creator account for the fact that, while people quickly give up on turning the crank on the Machine, people keep their minimum-wage jobs for a long time? I don't know anyone who would describe minimum wage work as being unequivocally pleasant, but it must be at least marginally more pleasant than interacting with the Minimum Wage Machine, because fewer people give up on their jobs than give up on the Machine.

This is especially puzzling in light of the fact that the Machine is designed to be easier to operate than a typical minimum-wage job is to work.

Could it be that minimum-wage work offers more to the worker than a slow-but-steady trickle of pennies? 

2015-08-18

Not Sexy At All Is The New Sexy

Sexual objectification is a spectrum, not a binary. Everyone, everywhere, at some point wants to be sexually objectified.

That might be an inflammatory way to put it, but I'm trying to make a point. Obviously, no one in their right mind wants to be thought of as "good for absolutely nothing, except sex," and that is the typical connotation attached to the phrase "sexual objectification." A great many attractive people - most of them women - struggle to be taken seriously on every other level due to the pervasive sexual objectification they are forced to endure. Clearly, and unequivocally, this is a bad thing.

But sex is a beautiful, normal, natural, positive part of the human experience. If you're anything like most of us mammals, you will at some point want to be viewed sexually by someone.

Like I said, it's not a binary thing. As with any other part of your identity - who you are, at your very core - you cannot simply stop being that person just because you're at work, or whatever. That doesn't mean everyone should interpret your business memos "sexually" (whatever that means), but it does mean that if you have any sexuality at all in your personality, it will sometimes make itself known to other people, whether that's what you intended, or not.

Furthermore, as adults, most of us choose to intend it now and again. This gives us some control over how we interact with the rest of the mammals out there. We don't always make sex a part of what we're doing, of course, but we do so on occasion.

This is all very obvious and uninteresting, but it's important that I start off today's post with that explication.

The Offense

ABC News reports that an Alabama Sorority's recruitment video - posted to YouTube, but evidently since removed (but re-uploaded here) - came under fire for being "unempowering." Actually, it's worse than that. The University of Alabama itself gave a stern criticism of the video, according to ABC:
In a statement, the University of Alabama said the video “is not reflective of UA's expectations for student organizations to be responsible digital citizens.”
Welcome to the new normal, in which a sorority's every action must reflect the university's "expectations" "to be responsible digital citizens." Personally, I struggle to understand what exactly comprises "being a responsible digital citizen."

Some things seem obvious, such as not "cyber-bullying" anyone, not hacking, protecting the sensitive or potentially sensitive personal information of others, protecting minors and sensitive people from potentially objectionable material (don't goatse me, bro), and perhaps even practicing "netiquette."

The sorority, however, appears not to have violated any of those expectations. Instead, they put together a recruitment video in which the members of their sorority were made to look as physically appealing as possible. There was no nudity in the video. Based on the clips in the ABC News video, they don't seem to have violated the university's dress code while on university property. No overtly sexual acts are depicted in the video. They simply dressed up in flirty outfits and frolicked around a bit, in order to portray the image that (1) sorority members are pretty, and (2) sorority members have a lot of fun.

Why else would a young woman want to join a sorority?

In this case, the offense appears to be the mere suggestion of elite sexuality, no different than anything you'd see on daytime TV. "We're pretty, and we have a lot of fun," they seem to say. "Come join us."

That's cause for uproar?

The Unempowered

One "A.L. Bailey, a writer, magazine copy editor, and online editor who lives in Hoover," had this to say about the video:
No, it's not a slick Playboy Playmate or Girls Gone Wild video. It's a sorority recruiting tool gaining on 500,000 views in its first week on YouTube. It's a parade of white girls and blonde hair dye, coordinated clothing, bikinis and daisy dukes, glitter and kisses, bouncing bodies, euphoric hand-holding and hugging, gratuitous booty shots, and matching aviator sunglasses. It's all so racially and aesthetically homogeneous and forced, so hyper-feminine, so reductive and objectifying, so Stepford Wives: College Edition. It's all so ... unempowering.
Unempowering is an interesting word choice here. If one wanted to make the argument that such a video takes power away from women, one would use the word disempowering. But saying that the video is disempowering is a strong claim against a video made independently by women, intended to appeal to women, and posted on a forum that requires women to voluntarily seek it out in order to watch it. One might say, "That message was meant for me, but failed to resonate," but one probably couldn't argue that "The video took away my power as an individual."

Instead, Bailey says the video is unempowering. I had never heard that word, so I looked it up, and what it means is (and I quote) "Not empowering."

I agree - the video is not empowering. Should it be? Bailey - by virtue of the fact that he or she chose to criticize the video for being "not empowering" - seems to think so. But why?

"Yes, sororities are known for being pretty and flirty;" she writes, "they aren't bastions of feminist ideologies. But perhaps they shouldn't completely sabotage them either."

Again, this is a fascinating word choice. To sabotage anyone - feminist or otherwise - would be unambiguously disempowering. But Bailey doesn't accuse the sorority of sabotaging any person, but rather of sabotaging an ideology.

And how does this sabotage occur? By portraying the actual members of an actual sorority as being every bit as sexy, flirty, and fun as they actually are. 

Bailey Opens The Kimono

It seems so strange to me. Why would anyone think the sort of thoughts contained in A.L. Bailey's article? 

Bailey tries to relate the video to current popular examples of misogyny:
Just last week during the GOP debate, Megyn Kelly of Fox News called out Donald Trump for dismissing women with misogynous insults. Mere hours later, he proved her point by taking to Twitter to call her a "bimbo." He also proved the point that women, in 2015, must still work diligently to be taken seriously. The continued fight for equal pay, the prevalence of women not being in charge of their own healthcare issues, and the ever-increasing number of women who are still coming out against Bill Cosby after decades of fearful silence show that we are not yet taken seriously.
None of this has anything to do with the video. But Bailey continues:
Meanwhile, these young women, with all their flouncing and hair-flipping, are making it so terribly difficult for anyone to take them seriously, now or in the future. The video lacks any mention of core ideals or service and philanthropy efforts. It lacks substance but boasts bodies. It's the kind of thing that subconsciously educates young men on how to perceive, and subsequently treat, women in their lives. It's the kind of thing I never want my young daughters to see or emulate.
These two paragraphs appear back-to-back in the article. The implication is that sex-positive videos of women "subconsciously educate" male viewers to insult female journalists or become (alleged) serial rapists.

Think about it: Bailey argues that videos such as these - featuring no nudity or sexual activity whatsoever, in which the most salacious thing that appears to occur is that a young woman blows a handful of glitter into the air - result in young men becoming rapists.

No, really, think about it. That's Bailey's argument. I haven't mischaracterized it.

In what I imagine was intended to be Bailey's emotional climax, he or she presents a series of characterizations of his or her own about the 72 young sorority members in the video. "That's 72 women," he or she writes, "who surely must be worth more than their appearances," "...who will potentially launch careers..." "...who could be a united front for empowerment..."

"And that's 72 women who will want to be taken seriously rather than be called bimbos--"

Oh. Now I get it.

Sex As Emotional Maturity

Camille Paglia, reflecting in a recent interview on the feminist battles of the 1990s, put it this way:

When I burst on the scene in the early 1990s, one of the things that made me notorious was my attack on the date-rape rhetoric of the time.... [M]y statements on the topic, such as my 1991 op-ed in New York Newsday, caused a firestorm. I wasn’t automatically kowtowing to the standard rhetoric that men are at fault for everything and women are utterly blameless. I said that my 1960s generation of women had won the right to sexual freedom–but with rights came personal responsibility. People went crazy! There was this absurd polarization where men were portrayed as demons and women as frail, innocent virgins. It was so Victorian! And there was also a big fight about pornography, which I strongly supported. In the 1990s, pro-sex feminism finally arose and took power. It was an entire wing of feminism that had been suppressed by the Gloria Steinem power structure–by Ms. Magazine and NOW– since the 1970s. It had been forced underground, but it started to emerge in San Francisco with the pro-sex and lipstick lesbians in the mid to late 1980s, but it got no national attention. Then all of a sudden, there was this big wave in the early 1990s. I became one of the outspoken figures of it after “Sexual Personae” was published in 1990. My views had always been suppressed, and I had had a lot of difficulty getting published–“Sexual Personae” had been rejected by seven publishers and five agents. So we fought those fights, but by the late 1990s, the controversies subsided, because my wing of pro-sex feminism had won!
What I remember about the 1990s is less about the state of the feminist power structure, and more about the kind of entertainment that was out there. Madonna and Prince - with their extremely sexy imagery - were hitting their peak popularity. At the movie theaters, so-called "erotic thrillers" were huge hits. (Think about movies such as Basic Instinct, Wild Orchid, Angel Heart, and so on.) Even on the small screen, the early 90s saw the birth of series like Red Shoe Diaries. Dr. Ruth became a household name. Late that decade, MTV's "Loveline" would do it all over again with Dr. Drew.

To put it simply, there was a lot of sex going on in popular culture. Generation X had come of age and by all appearances wasn't a particularly squeamish generation when it came to erotica. No one was scandalized, marginalized, "unempowered," or "triggered" by any of this. Or at least, to a much lesser degree than in previous generations. It was a cultural phenomenon. Society had evolved into something more "pro-sex," as Paglia might put it.

Meanwhile, I was living out my life in the extremely conservative local culture of suburban Utah, where sex seemed particularly verboten. Anything that hinted at a person's sexuality - from shorts deemed "too short," to strapless blouses, to locker room comments - was quickly maligned. The impact of this was that young women did their hair and makeup like old ladies, and young men tried to act as somberly and wholesomely as possible. They'd go on "dates," but without any sort of mere conversational outlet for any aspect of their sexual identity, they would be forced into a bizarrely saccharine cutesiness. For example, laser tag was a common dating activity. Unchaperoned dancing? Not so much.

Perhaps coming of age at a time and place that exposed me to both a minor sexual revolution and severe sexual repression gave me some added insight into this, or perhaps I'm just making too much of it. In any case, what I concluded from my experience is that sex isn't just a part of a person's identity; it's a vitally important way through which we interact with the world.

You can't just sweep it under the rug. You can't just demand that young women strike it from the repertoire of their self-expression in an endless social crusade to gain "a united front for empowerment." You have to embrace what is a vital part of the human experience.

And if you don't? Well, here's what Camille Paglia thinks:
[INTERVIEWER]: I wanted to ask you about that. If Emma Sulkowicz were a student of yours, in an art class you were teaching, how would you grade her work? 
[PAGLIA]: [laughs] I’d give her a D! I call it “mattress feminism.” Perpetually lugging around your bad memories–never evolving or moving on! It’s like a parody of the worst aspects of that kind of grievance-oriented feminism. I called my feminism “Amazon feminism” or “street-smart feminism,” where you remain vigilant, learn how to defend yourself, and take responsibility for the choices you make. If something bad happens, you learn from it. You become stronger and move on. But hauling a mattress around on campus? Columbia, one of the great Ivy League schools with a tremendous history of scholarship, utterly disgraced itself in how it handled that case. It enabled this protracted masochistic exercise where a young woman trapped herself in her own bad memories and publicly labeled herself as a victim, which will now be her identity forever. This isn’t feminism–which should empower women, not cripple them. 
...To go around exhibiting and foregrounding your wounds is a classic neurotic symptom. But people are so lacking now in basic Freudian consciousness–because Freud got thrown out of mainstream feminism by Kate Millett and Gloria Steinem and company. So no one sees the pathology in all this.... I prophesied this in a piece I wrote... called “The Nursery-School Campus”. ...I was arguing that the obsessive focus by American academe with students’ emotional well-being was not what European universities have ever been concerned with. European universities don’t have this consumer-oriented view that they have to make their students enjoy themselves and feel good about themselves, with everything driven by self-esteem. Now we have people emerging with Ivy League degrees who have no idea how little they know about history or literature. Their minds are shockingly untrained. They’ve been treated as fragile emotional beings throughout their schooling. The situation is worsening year by year, as teachers have to watch what they say and give trigger warnings, because God forbid that American students should have to confront the brutal realities of human life. 
Meanwhile, while all of this nursery-school enabling is going on, we have the entire world veering towards ISIS–with barbaric decapitations and gay guys being thrown off roofs and stoned to death. All the harsh realities of human history are erupting, and this young generation is going to be utterly unprepared to deal with it. The nation is eventually going to be endangered by the inability of several generations of young people to make political decisions about a real world that they do not understand. The primitive realities of human life are exploding out there!

Coda

I spent some time trying to Google "A.L. Bailey" in hopes of finding out more about his or her perspective, and to learn more about what he or she had written. I couldn't find any other article penned by an A.L. Bailey. I couldn't find a public profile or LinkedIn page. In fact, I couldn't find anyone named A. Bailey listed in the city of Hoover, Alabama, where his or her op-ed's byline indicates he or she lives.

I don't fault someone for using a professional pseudonym, but when someone makes such strong claims, I expect more than a single, anonymous article launched against a few dozen college girls whose only crime was to embrace a natural, normal part of who they are. 

Someone is repressing young women, and it's not who Bailey thinks it is.

2015-08-17

Different Perspectives

Last week, I noticed that someone on Facebook had written a "happy birthday" message to, of all people, Fidel Castro. This came as a big surprise to me because I am acquainted with this person as part of a broadly "libertarian" circle of friends, and we libertarians seldom say anything nice about communist dictators, as you can well imagine.

Predictably, this birthday message sparked extensive commentary, debate, and follow-up statuses. For the most part, I tried to stay out of it, although I did take the time to point out that, given Castro's record of bad deeds (despite whatever the list of good deeds might be), the original well-wisher shouldn't have been surprised to discover that the Facebook status updates were controversial. Of course they would be!

Unlike many of the folks in my libertarian circle, however, I was less surprised to see the post. The reason is because I've had the opportunity to meet a fair number of Baby Boomers from the developing world.

One of them once explained to me that, whatever the shortcomings of communism, the communist countries offered aid to countries in Africa and South Asia when no one else would. It was on this foreign aid that this man - and many others like him - was able to obtain a scholarship to attend graduate school. (He himself studied in Ukraine, when it was still part of the USSR.) And of course, the PhD he earned in Economics was largely responsible for his career success later in life.

We in the United States are accustomed to seeing a man like Nikita Khrushchev as a horrible dictator, responsible for the suffering of millions of innocent people - and, indeed, he was. But he was also a man responsible for the kind of foreign aid that made a true positive impact on the lives of millions of others.

We Americans don't really have a good feel for the level of poverty and suffering that exists in other parts of the world. People who experience it on a daily basis know that the solution is resources - money, aid, help in whatever form. They just need it. They are not always in a position to say, "No, I will not accept your offer to give me a better life because I read in The New York Times that you do not have a spotless human rights record."

Moreover, considering the number of bombs dropped by the United States government on starving people every year, it simply cannot be said that US foreign aid is more ethically sound to accept than Cuban aid.

So, while I won't be wishing Fidel Castro a happy birthday any time soon, I understand why someone else would.

2015-08-13

FDA Approval

Tyler Cowen has an almost-interesting post on the idea that reliable drugs could be approved "at full Medicare and Medicaid reimbursement rates, if not higher. Drugs with lesser efficacy or higher risk could be approved at lower reimbursement prices."

A Couple Of Problems With Cowen's Post

First, the FDA does not grant approvals for drug reimbursement. Rather, the FDA determines whether a drug can be sold on the market - regardless of who will reimburse anyone for anything. It is then Medicare, Medicaid, and private insurance companies that determine whether they will reimburse anyone for the medication. Unless I'm misunderstanding Cowen, it sounds like he thinks that an FDA approval is the same thing as Medicare reimbursement.

Second, private payers have already implemented this plan. They have a formulary - a list of approved medications for which they will reimburse the patient - and if you choose an alternative therapy, they will often reimburse the patient for the cost that the formulary drug would have incurred. Certain exceptions apply, but the point is that it's not exactly a revolutionary idea. Cowen says "proposals of this kind deserve further attention," but from my vantage point, they are already being practiced.

How Should FDA Approvals Actually Work?

Most of us are patients, not pharmaceutical employees, so we have no vested interest in cleaning up the approval process. In addition, most of us aren't insurers, so we are somewhat insulated from the market price of a medication.

Thus, the only thing we should really care about is whether we have access to medications for which we might some day see a use. The approval should be a one-step process: submit a form, presto, you're approved. This is to ensure that patients have access to every therapy they might want to try.

Stop objecting before you start. :) This means that potentially unsafe medications will gain approval. Yes, but potentially unsafe medications already do gain approval. The only differences between a rubber stamp process and the status quo are (1) it's a lot more efficient to just rubber stamp the darn thing, and (2) patients will not be under the mistaken impression that an FDA approval means that the drug is safe.

Drugs aren't safe. Doctors aren't pharmaceutical experts. The only way to make sure you get the health care you need is to do your own research and critical thinking. Your doctor cannot do it for you. The pharmaceutical companies will not do it for you. The FDA lies about doing it for you.

So, approve everything and leave it to us individuals to do our due diligence. We who are informed already are!

Don't Go To Oklahoma

The ticks will cripple you. The lakes will devour your brain. The tornadoes will level your city. You will go missing during music festivals. Even if you make it out alive, your health will be among the worst in the nation. But let's say you want to be healthy, get outdoors, and play a round of golf; the course will be "flat and boring." Five of the thirteen most boring towns in America are in Oklahoma, including the top two most boring towns.

Is there anything good about Oklahoma? Anything?

2015-08-12

Year One


She didn't want me to leave this morning. Ordinarily, she can't wait to burst from my arms and into her mother's, to play, or for her morning meal, or just because. Today, though, she wrapped her arm tightly around my shoulder and just buried her face in my chest. Reluctantly, I handed her over to her mom so that I could collect my things and head out the door to work. She looked disappointed, and she fidgeted out of mom's arms and back into mine. This happened two or three times before I finally had to leave for work.

The First Year's Goal

As painful as it is to have to leave for work when your baby daughter wants to just be with you, I have to take it as a good sign. 

There is no discipline to be given during a child's first year of life, so parenting consists almost entirely of providing for all of her most basic needs: food, shelter, clean diapers, medical attention when necessary, et cetera. Enough has been written about that kind of thing that I will waste no time writing about it here.

For all the "hard work" and "life changes" people go through when they first become parents, for all the stress and trouble people like to obsess about when they think - and talk - and write - about parenting, there truly is only one important goal to have during the first twelve months, and that goal is to develop the foundation of a strong and trusting bond between parent and child. The closeness and intimacy we forge at the dawn of a child's life is what determines how we will communicate with each other for the rest of our lives. 

This was the goal to which I have dedicated myself for the past 365 days.

Early Fatherhood

I don't quite know how to put this, but I spent so many years believing that I didn't want children, and then ultimately only convinced myself that I wanted children on a purely intellectual level. But now that it's happening, it's real-world, in-the-moment emotional stuff, and what I am learning is surprising - I am really excited. People like to tell horror stories: "Prepare for your life to change," and "Say goodbye to your free time," and "As soon as you have kids, everything revolves around them," and that sort of thing. They make it sound like torture. 
But I just feel like I'm about to get a new best friend who - if I can protect, advise, and encourage - will grow up to be a happy person whose smile makes the world a better place. Am I supposed to pine for the fact that I'll have to find a sitter if I want to spend the night having cocktails at a comedy club? If so, I don't really get it. 
And that tells me that I am probably cut out for children, after all.

That's from a journal I kept of my thoughts and experiences during my wife's pregnancy and my earliest experiences as a new father. It is a strange experience, evolving from a man with no intentions of nurturing a child into a man whose every second thought seems to involve his daughter. 

The passage above was written during the first trimester of the pregnancy, yet already I had begun to develop a theory of parenting: an approach that involved making space in my life for a new participant. As I put it the day after she was born,
Our twosome is a threesome. We multiplied. We didn't invite a third person into our family, we created a family first, and out of that grew a third person. It's all so obvious in hindsight.
My point was that, contrary to those who believe that "everything changes" and "life will never be the same," my theory involves simply inviting this fabulous little girl into the parts of my life I treasure most. While some dads might stop playing music when their children are born, I simply hand her a maraca and invite her to play along. Yes, it's different, but it's an improvement, in exactly the same way that marrying was a big change that made a big positive impact on my life.

I set out into my adulthood as a "lone ranger," and in my wife I met an equal partner. We now have a trusty sidekick. I didn't quite get it right a year ago when I wrote about this in my journal:
I have so far refused to adopt the kind of idiotic world-view that dominates the conversations that adults have with new- or prospective-parents. You know what I'm talking about, "Everything will change!" "You'll never sleep again!" "Get used to [insert bad thing parents blame on their own children here]!" 
Here's what irritates me about that: It sounds resentful. It comes off like a bunch of adults who deeply resent the fact that they can no longer live self-absorbed lives because they must now devote the lion's share of their personal existence to the welfare of other human beings. It should be obvious to anyone who even thinks about the concept of having children that childbirth marks a life transition from the period in which you mattered most to the period in which someone else matters most.
What I should have said was something more along the lines of this: because she matters most to me, making her happy is how I will make myself happy. Early fatherhood isn't the end of your old life as a childless man; it's the transition of your life from providing for one to providing for many. As the many thrive, so does your own sense of happiness.

That is, if you're doing it right.

Happy Birthday

So you'll have to forgive me if I stroke my ego a little bit here. My daughter loves me. She holds me tightly and doesn't want me to leave the room. She loves playing with me, and learning about music from me, and learning new words and concepts from me. She loves it when I take her hiking, and she loves it when we stay inside and play with her toys. She loves it when I sing to her and when we have bath time together. She giggles when I try to make her laugh, and she lets out great, hearty laughter when she wants to make me laugh. If I take this as a sign that my first year as a father has been a success, that's only natural. 

In the very beginning, she wasn't much more than a little ball of need, but as the year progressed she grew into someone whose personality is genuinely charming to me. I love her love of the outdoors. I love her desire to make her parents laugh. I love the way she reaches out to other kids - even kids she doesn't know - in an effort to make them smile. I love that her reaction to her own fear is a timid smile, rather than tears. I love that when she's feeling ill, she valiantly tries to power through it - giving us, her parents, smiles of encouragement - rather than expecting our pity.

Even at a year old, she has not only expressed her personality, but she has won us over as a valued member of the team, that trusty sidekick. Today, I don't know what I'd do without her. And that's just her first 365 days.

I should have known I'd feel this way about her, because the day after she was born, I wrote:
If there's one thing I would like my newborn daughter to know about [the day she was born], it's this: Your mother had the kind of fire in her eyes that day that could set the world aflame. That kind of determination is what summits Everests, builds multi-national companies, wins gold medals, cures disease, and otherwise furthers the path of human progress. This is the kind of person who gave birth to you; these genes are your genes; if you can channel the energy your mother has imparted to you through the miracle of genetics, nothing will ever stop you. You can achieve anything. That day, your birthday, your mother achieved everything....
It took nearly forty-eight solid hours of labor to bring you here. You were conceived in love and born into it. We will never stop loving you. I hope you live to see many moments such as the one your mother had when she conquered her pain and brought you into the world. You may not ultimately want children of your own, but I hope you give birth to many, many personal triumphs. You have your mother's eyes. May they forever sparkle like hers do.
One year old, and already I feel this way. I can only begin to imagine the depth of emotion that lies ahead of me as the years unfold.

Happy birthday, my sweet little nut. I love you.

2015-08-05

More Wrong

The following is a synopsis and expansion of a series of comments I left under a post at SlateStarCodex.com. Toward the end of the debate, a few fellow commentators remarked that I was making the case for frequentist inference, which - because I am not an academic statistician or philosopher - is something I hadn't heard of until yesterday.

Now, I'm not very fond of putting all opinions into categories, and I reiterate that I only heard of frequentist inference yesterday, so I'm not ready to declare to the world, "World, I am a frequentist inferror!" But I did a little reading up, and it does seem to reflect my approach to probability. Moreover, I'm pleased to learn that I wasn't just spouting a bunch of crazy-talk, and that intelligent people had traversed that path before me.

Enough preamble, though, let's get to it.

*          *          *

My core claim: I think LessWrong-ers set themselves into a pattern of thinking that is ill-suited to the majority of the human experience.

*          *          *

Here's a quick example of why I think so, to help initiate the discussion:

Suppose Eliezer Yudkowsky asked me to assign a probability to whether a sentient robot will murder a human being during the next, say, 20 years. Setting aside the fact that technological advances are not a probability (because they are the product of deliberate human action) and focusing solely on the question of the robot itself – assumed to exist – choosing to murder a human being, this is not a question of probability, because volitional acts don’t "just happen." It's not randomness that causes someone to murder someone else, there are deliberate thoughts and actions going on, and these things cannot be assigned "likelihoods" based on anything rational.

Now, it's possible to suggest that all things that human beings do are purely random phenomena, as some eventually claim. They say that human brain functions are subject to quantum mechanics, and there is randomness involved there. But it would be disturbing to use that fact to suggest that, at any given moment, there is an X% chance that you will murder someone (even if the chance is very small). At the risk of sounding harsh, such a belief sounds a lot like a psychotic break to me.

On the other hand, we could indeed observe that, each year, X% of people commit a murder. We can ask, "What is the probability that next year, the number will be Y% instead?" The reason we can ask that is because we're no longer asking about the probability of a particular murder involving specific people. Instead, we're asking about the likelihood that a sample mean will differ from a historical population mean. That, my friends, is indeed a question of probability, and we can do valid statistical analysis on a question like that.
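To make that concrete, here's a minimal sketch of the kind of valid statistical question I mean. All the numbers are invented, and it uses a simple normal approximation to the sampling distribution of a proportion:

```python
import math

# Hypothetical numbers: a historical murder rate of 5 per 100,000,
# observed over a population of 1,000,000 people next year.
p_hist = 5e-5          # historical population proportion
n = 1_000_000          # number of people observed next year

# If nothing has changed, next year's observed rate is approximately
# normally distributed around the historical rate with this standard error.
se = math.sqrt(p_hist * (1 - p_hist) / n)

# Probability of observing a rate of at least 6 per 100,000 next year,
# i.e. P(observed proportion >= p_obs) under the historical rate.
p_obs = 6e-5
z = (p_obs - p_hist) / se
prob = 0.5 * math.erfc(z / math.sqrt(2))  # upper-tail normal probability

print(f"z = {z:.2f}, P(rate >= 6 per 100k) = {prob:.4f}")
```

The point of the sketch is that the question is about an aggregate - how far a sample proportion can wander from a historical one by chance - not about whether any particular person will choose to commit a murder.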

Proponents of Bayesian inference - such as the "Less Wrong community" - like to say things like, "but I know how often murders occur, and I know what the demographics of a murderer are, and I can compare the prevalence of murder among certain demographics to a particular person and arrive at a forecast for how likely I think it is that the person will commit murder..." And I can see how that does become a probability problem, but it only really works for a random observation.

What I mean is, if I put a random person named Joe in front of you along with some demographic data, you can come up with a statistical model that can do a best-possible job of predicting whether that random person is going to become a murderer at some point in the future. But if I task you to predict whether someone you know is going to murder someone else you know, then we're no longer talking about randomness or probability. We're talking about a couple of people that you know, and people do things by choice, not by probability (unless you're having a psychotic break and you've convinced yourself that everything you think and do is the whim of the random forces of molecules colliding inside of you, otherwise called "Because Quantum Mechanics! Nihilism," or BQM Nihilism for short).
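For what it's worth, the calculation being paraphrased above is just Bayes' rule applied to base rates. A minimal sketch, with every number invented for illustration:

```python
def posterior_given_profile(base_rate, p_profile_given_yes, p_profile_given_no):
    """Bayes' rule: P(murderer | fits the demographic profile)."""
    numerator = p_profile_given_yes * base_rate
    denominator = numerator + p_profile_given_no * (1 - base_rate)
    return numerator / denominator

# Invented numbers: a base rate of 1 in 10,000, a profile that 60% of
# murderers fit but only 5% of everyone else does.
p = posterior_given_profile(1e-4, 0.60, 0.05)
print(p)  # still a small number: the base rate dominates
```

Even with a profile that murderers are twelve times as likely to fit, the posterior stays tiny, because the base rate dominates. And that machinery only makes sense for a randomly drawn person, not for a specific person whose character you already know.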

In short, there are two questions here:
  1. Will Joe murder someone?
  2. What is the probability that I can correctly guess whether someone fitting a particular demographic profile is a murderer?
My position: Only question #2 is a question of probability. Only question #2 is appropriate for statistics.

*          *          *

But still, suppose I had to predict whether Joe was a murderer. Then, isn't coming up with a Bayesian prior, and fitting it into a predictive model and conducting some analytics the best I can do, given the information that I have?

Here's where things get more interesting to me...

One of the marks of a truly wise person, in my opinion, is the ability to say (honestly), "Gee, I just don't know." Being comfortable with the fact that there are some things out there that are simply unknowable is part of being a grown-up. It's a sign of emotional maturity. We all wish we knew everything there was to know, but no matter how smart we are, no matter what kind of Bayesian games we play with ourselves, we'll never know everything. We'll never even come close! It's just not possible.

It's admirable to try to expand human knowledge, of course, and it's a wonderful character trait to have a thirst for knowledge. But it's mature to accept your limitations.

Back to Joe: If you know Joe, and you need to predict whether he will become a murderer at some point in the future, then sure, you could assign a bunch of probabilities and update your priors in an ongoing "virtual Markov chain," but fundamentally we're asking about Joe's character, and that's not subject to probability. Either you've got Joe's number or you don't, but you didn't get there by running a Bayesian model against your every interaction with him.

And if you did, then you're not human.

*          *          *

When I’m uncertain about something, I just say, “I don’t really know for sure.” Then I either choose to guess, or choose not to guess. If I choose to guess, I take stock of the available information, but I don’t delude myself into thinking that there is a cardinal number attached to my guess when I’m talking about situations in which cardinal numbers do not apply.

Theists use physics right up until they don’t understand the physics anymore and then say, “The rest is a miracle of god!” 


Over-use of probability is a similar kind of thing. It’s just something LW-ers do to grapple with that whole “Incomplete Other” thing that fascinated Jacques Lacan so much.

I’m not going to say that it’s true in all cases, but hopefully you can see how this kind of thinking is susceptible to producing an obsessional neurosis. Obsessional neurosis occurs when someone engages in some compulsive activity in lieu of gaining real control over his or her life. Developing a giant Bayesian statistical model for life is the ultimate neurosis for a person inclined to formalized logic. Hell, somebody even made a movie about it.


You can imagine some poor schmuck trying to estimate the year of his death using a Markov Chain Monte Carlo simulation and choosing when the best time to sire a child might be…

*          *          *

But wait - there are even more problems with this kind of thinking.

One of them is that, when you're building a predictive model about something, you're engaged in a priori theorizing. You're implicitly saying, "This thing that I have chosen to include in my model is relevant to the question I am trying to answer." Similarly, by not including something, you are implicitly suggesting that it's not very relevant, or not statistically significant, to your question.

So, when we build a model to predict whether Joe is a murderer, we include a certain set of demographic information, but we may exclude other sets of information, and in doing so, we've biased our analysis with our opinions. We've expressed a "Bayesian prior" subconsciously, and that excluded prior is basically this: "There is a zero percent chance that the thing I have excluded from my model is relevant to the question I purport to answer." 

Maybe it is irrelevant. But maybe not. More to the point, if you haven't included it in your model for the first run of the analysis, then you've biased your model unfairly - according to the rules of Bayesian inference itself! And since no one could ever hope to begin with a model that includes everything, then there is no possible way that Bayesian analysis improves on our ability to solve common, everyday problems any more than any other biased method of cognition.
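One way to see what exclusion amounts to: in the odds form of Bayes' rule, leaving a piece of evidence out of the model is the same as assuming its likelihood ratio is exactly 1 - that is, assuming it carries no information at all. A sketch with invented numbers:

```python
def posterior(prior, likelihood_ratios):
    """Combine a prior with independent pieces of evidence via the odds
    form of Bayes' rule: posterior odds = prior odds * product of LRs."""
    odds = prior / (1 - prior)
    for lr in likelihood_ratios:
        odds *= lr
    return odds / (1 + odds)

prior = 0.01
# Two hypothetical pieces of evidence. Dropping the second from the
# model is equivalent to silently asserting its likelihood ratio is 1,
# i.e., that it is completely uninformative.
with_both = posterior(prior, [4.0, 3.0])
excluded = posterior(prior, [4.0])
print(with_both, excluded)
```

The two posteriors differ substantially, and nothing inside the model warns us; the difference was decided by what we chose to leave out before the analysis began.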

Period.

*          *          *

This blog post is already long enough, and I've left out the important criticism that statistical modeling almost always implies some sort of linear relationship between the predictive variables and the variable being predicted. Who among us is prepared to claim that human behavior is always and everywhere a continuous function of physical inputs?

And yet, when we attempt to subject all decision-making to Bayesian inference, that is exactly what we're suggesting.

Now, the funny part there is that I often encounter people - behavioral economists, for example - who are happy to suggest that any type of human preference that doesn't behave according to a modelable continuous function is "irrational." Gleefully, they proclaim that humans are not rational animals because, look here, people smoke cigarettes even though they know cigarettes are unhealthy, and look over there, people take on more debt than they can afford, even when it's clear that the debt is unaffordable.

Something about the Less Wrong crowd makes me think that this is their view, too. I get the impression that they simply feel that they are combating their irrational tendencies with a sublime brand of rationalism.

It's that underlying sense of transcendence, the suggestion that you might be able to achieve some sort of higher state of existence by putting into practice Eliezer Yudkowsky's principles of rationalism - which he himself describes in quasi-religious language, like "The Way" - that gives people like me the heebie-jeebies.

When a community of people offer you a chance at achieving a higher state of being by becoming a little less human, it starts to look more like a religion than a science. Is this a fair criticism of the Less Wrong community? I have no idea, but I do know that I'm not the only one to have made it.

*          *          *

What's the point? Why write all this when I don't really have any skin in the game? What do I care if people follow some Silicon Valley thirty-something with a quasi-religious fervor based on some basically sound and convincing mathematical source material?

Recall that I come from a place in the world where both religions and cults thrive. Recall that my blog has in many ways become a place to engage in self-analysis, self-criticism, and hopefully, the end of the kind of illusions we tell ourselves. Personal growth (not transcendence) requires that we always try to grow and develop.

We'll always be wrong - usually we'll be more wrong, not less. Trying to be less wrong is okay if it means gaining some new knowledge and using it to improve the results of your day-to-day action. But there is only so much knowledge you can have. It's tempting in today's world of "big data" and big processors and big Markov Chains to believe that all we need to do is model the possible outcomes of any scenario and update our priors.

But it's also vain. Growing up means accepting your limitations and working with them. In some cases, that might mean letting go of Bayesian inference and restricting your statistical analyses to problems that can actually be solved with statistics.

Salad Review: Trader Joe's Chicken Caesar

I wouldn't normally "review" a caesar salad on my blog, but in this case, we're talking about a salad that gets rave reviews. In fact, as I was buying my salad yesterday, the cashier who rang me up started telling me that he had eaten the salad for lunch and that it was delicious. He said he was sure I would enjoy it.

So, this is a salad that is popular among economics bloggers and Trader Joe's employees alike. Clearly, this is something worth spending my time writing about.

I'll review this salad in terms of its individual components, and then provide a bottom-line recommendation at the end.

Lettuce

The lettuce was the salad's strongest suit, in my opinion. Most grocery store caesar salads consist of basically a big, chopped-up hunk of romaine lettuce, with special emphasis placed on the romaine hearts - the sweetest, juiciest part of the head. That's understandable, but I'm the kind of person who enjoys the taste of the mature romaine leaves. The flavor is a little more bitter, but a lot more complex, and it just suits me better.

Luckily for me, Trader Joe's gives equal billing to hearts and tips, which means that the salad has more tips than the average caesar salad.

The lettuce was also relatively fresh, although there were a few questionable leaves, but that isn't uncommon for grocery store salads.

Chicken

The chicken pieces themselves were pretty tasty, but there were only two pieces in there. So while the quality was relatively high, the quantity was much lower than, say, a comparable caesar salad purchased from Walmart.

For some, this might not actually be a detractor. Some people like to keep their portions relatively small. In fact, I remember a good friend of mine talking about sandwiches at a restaurant. When I asked her how big the sandwiches were, she "assured" me, "Not too big!" This made me laugh because she thought I was worried that I wouldn't be able to eat the whole thing, when in fact my concern was whether the sandwich was large enough to sate me (which is usually not the case).

So if you like your salads small, the chicken in Trader Joe's chicken caesar salad will deliver as promised. But if you're hungry, by the time you finish your salad, you'll still be hungry.

Dressing, Croutons, Etc.

The salad was weakest on this level. The dressing was insufficiently garlicky, the cheese was sparse and not particularly fresh, and the croutons were flavorless. 

I can overlook the caesar dressing itself. It's a difficult salad dressing to get right, especially considering that classic caesar dressing involves tastes that tend to be a little too much for the American palate (anchovies - not bacon - are required for a truly authentic caesar, and we all know how anchovies are viewed by most people). Overall, I found the dressing to be quite similar to the caesar dressing you'll find at Einstein Bros., which is not great, but... not terrible.

However, I can't overlook the cheese. What was provided was a tiny packet of Parmesan dust. It's not as if Trader Joe's can't afford to put some real cheese in the box. The dust was just insulting.

Cost

At about $4 for perhaps a cup and a half of lettuce and two small chicken strips, this is one of the more expensive ready-made grocery store salads on the market. A similarly priced salad at Walmart, for example, tastes no worse and greatly surpasses the Trader Joe's salad on quantity. Also, Walmart's salads come with a plastic fork. Trader Joe's leaves the eating utensils up to you.

That said, I knew I'd be spending extra money when I arrived at Trader Joe's in the first place. Nobody goes there to save money.

The Bottom Line

While the salad didn't taste all that bad, for $4 it just wasn't a very good value. I've already mentioned Walmart's caesar salads several times, because that is the baseline comparator in my case. Trader Joe's is nearby, and Walmart is the next-nearest grocery store. Consequently, I find myself purchasing a salad and a sandwich from Walmart with some regularity, and I have to say, Walmart has Trader Joe's beat, hands-down.

But even if we set aside my personal favorite grocery store salad and compare Trader Joe's salad to others "in general," I still feel it doesn't offer a good enough value proposition to be an attractive lunch option. You can either get more salad for the same money elsewhere, or a better-tasting salad for a comparable amount of money.

In any case, my recommendation is: if you're looking for a great caesar salad, do not go to Trader Joe's.

2015-08-04

Some Ways My Thinking About Music Is (Possibly) Odd

Years ago, a friend of mine made his band's album available to a group of us for free. The music was great. It was pop/punk, but the lyrics were clever, playful, and self-deprecating. It had exactly the same kind of "we don't take ourselves too seriously, but still, this is good music" kind of vibe that the early Frank Zappa albums had.

Because it reminded me of Frank Zappa, I said so. The larger group - and even my friend himself - erupted into incredulous laughter, and it took me a while to "live the comment down." I guess to their mode of thinking, a pop/punk album has about as much to do with Frank Zappa as a goldfish has to do with the planet Neptune.

That was not my first clue that I think about music differently than other people do, but it was an iconic clue.

So, in the spirit of that old comment about my friend's pop/punk band, here is a list of statements about music that strike me as being obvious, even though I readily concede that others will consider them to be "controversial."

  • I consider The Cult to be a sort of middle-step between Billy Idol and Danzig.
  • Saigon Kick and Enuff Z'Nuff, as far as I'm concerned, are 90s alternative rock bands, not 80s metal bands.
  • Bon Jovi has more in common with John Mellencamp than with Guns N' Roses.
  • Speaking of whom, Jackyl was the band that Guns N' Roses wanted to be.
  • The Beatles weren't songwriting geniuses, they were a boy band.
  • My reaction to the band Nirvana is exactly the same as my reaction to the band Cinderella: The music is okay, but how does anyone take this guy's voice seriously?
  • Arnold Schoenberg had a better sense of melody than Mozart.
  • The Darkness is not an ironic 80s metal throwback group.
  • Ted Nugent plays guitar solos like a rockabilly player.
  • 3-chord songs are, in general, more harmonically complex than 4-chord songs.
  • Simple compositions are not automatically more musical or pleasant to hear than complex compositions.
  • Popularity is not well correlated with quality.
  • If someone doesn't have a well-developed sense of musical aesthetics, then whatever music they hear most often is whatever they will like the most, regardless of any other factors going into that music. 
I'll add more to the list as I think of things to add.

2015-08-03

"Status"-Type Thinking

It's practically a truism that we seldom accuse others of a behavior that we ourselves do not regularly engage in. The compulsive liar is always the first to suspect others of dishonesty. The serial manipulator is always hyper-sensitive to others' ulterior motives. Pretentious people always criticize the supposed pretentiousness of the things they dislike.

So I thought Tyler Cowen's recent blog post entitled "What kind of blog post produces the most comments?" particularly rich, and not in a good way. In fact, I have seldom read a blog post by Cowen that was as nasty as this one.

After providing a hypothetical list of people who deserve to be "raised in status," and another one comprised of people who deserve to be "lowered in status," he writes (emphases added):
You might get a kick out of it the first time, but quickly you would grow tired of the lack of substance and indeed the sheer prejudice of the exercise. 
Yet, ultimately, the topic so appeals to you all. So much of debate, including political and economic debate, is about which groups and individuals deserve higher or lower status. It’s pretty easy — too easy in fact — to dissect most Paul Krugman blog posts along these lines. It’s also why a lot of blog posts about foreign countries don’t generate visceral reactions, unless of course it is the Greeks and the Germans, or some other set of stand-ins for disputes closer to home (or maybe that is your home). Chinese goings on are especially tough to parse into comparable American disputes over the status of one group vs. another. 
I hypothesize that an MR blog post attracts more comments when it a) has implications for who should be raised and lowered in status, and b) has some framework in place which allows you to make analytical points, but points which ultimately translate into a conclusion about a).
Cowen ends his post as follows: "Sometimes I am tempted to simply serve up the list and skip the analytics."

Why would he be tempted to serve up the list? To "produce the most comments," of course. Why would he want his blog posts to produce the most comments? To have a popular blog. And why would he want a popular blog?

...To elevate his own status.