Ten Best Bullshit Nonsense Crazy Errors

So, look—any article headlined These Are The 10 Best Places To Live In Connecticut is going to be stupid. If they claim to have developed a statistical analysis that brings up that list of ten, it’s going to be worthless as a statistical analysis. We all know that. The fact that it’s on the website of a real estate brokerage, rather than (f’r’ex) a regional magazine makes only the tiniest bit of difference, really.

And you may think it’s not worth looking at the stupid claims of statistical nonalysis in any detail, and you would probably be right. So let’s do just that, shall we?

OK, so as with all these things, they come up with a bunch of criteria. In this case, seven of them, including things like the crime rate, average commute time and unemployment rates. Oh, and the weather. And ‘quality of life’, which is itself a mishmash of things that they claim are highly correlated in the census data (and some other places), mostly including economic factors but also student-teacher ratio in (I assume) the public schools. Why mash that in with the economic ones? Because of the high correlation. OK, I guess, but—now what do you do with the seven categories, keeping in mind that some of them contain within them multiple kinds of things that are correlated?

The answer is, ignore all that complicated stuff. Just add ’em up and divide by seven. Give the commute time the same weight as the quality of life, which has the same weight as the unemployment rate, which has the same weight as everything else, because evidently making formulas in spreadsheets is difficult.

Oh, and when I say add ’em up and divide by seven what I mean is add up the ranks of the forty-two towns. That is, in each category, rank the towns one to forty-two, and then average the ranks across the seven categories. That’s just un—you know, I was going to say unbelievably stupid, but it’s actually believable, completely believable, not difficult at all to believe that somebody would set their table up that way. It’s difficult to believe the number in the end has any value, sure, but the stupidity is totally believable.
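For the record, here is what that methodology boils down to in code: a minimal sketch in Python, with invented numbers for three hypothetical towns, not anything from the actual spreadsheet.

```python
# A minimal sketch of the scoring scheme as described: rank the towns
# within each category, then take the unweighted mean of the ranks.
# All town data below is made up for illustration.

def rank_towns(values):
    """Return 1-based ranks (1 = best, i.e. lowest value) for a list of values."""
    order = sorted(range(len(values)), key=lambda i: values[i])
    ranks = [0] * len(values)
    for rank, i in enumerate(order, start=1):
        ranks[i] = rank
    return ranks

# Each row is one category's raw values for three hypothetical towns.
categories = [
    [22.0, 25.0, 31.0],   # average commute time (minutes)
    [3.1, 2.8, 4.0],      # unemployment rate (%)
    [1.2, 1.3, 0.9],      # crime rate (per 1,000 residents)
]

per_category_ranks = [rank_towns(vals) for vals in categories]
# Final score: mean of a town's ranks, every category weighted equally.
scores = [sum(col) / len(categories) for col in zip(*per_category_ranks)]
print(scores)
```

Note that at no point does the raw size of any difference survive into the score; only the ordering does, which is the root of the trouble below.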

Of course, when you do it by ranks like that, the distortion inflates the differences between similar towns while flattening the differences between genuinely different ones. Connecticut’s a tiny state, so my guess is that the forty-two towns have essentially indistinguishable weather patterns and air quality, or rather that there’s a group of twenty that are indistinguishable from each other, and another group of twenty that are likewise indistinguishable from each other, and maybe two outliers. But the distinction between New Milford weather and the Danbury weather ten miles downstream means that New Milford gets an 18 and Danbury a 1. Which, since remember, we’re just averaging seven categories, is responsible for 2.43 of the 5.85 in their final scores. And that will be similar in other categories where minuscule differences in real life (the difference between a student-teacher ratio of 13.1-to-one and 13.6-to-one will not be noticeable to a parent or student) must, by the nature of ranking, lead to substantial differences in score.
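If you want to check that 2.43 figure yourself, the arithmetic is just the rank gap divided by the number of categories. (The ranks come from the article; everything else here is a back-of-the-envelope restatement.)

```python
# Back-of-the-envelope check on the rank-distortion arithmetic.
# New Milford ranked 18th in weather, Danbury 1st, in a 7-category average.
n_categories = 7
rank_gap = 18 - 1                     # 17 places apart on near-identical weather
contribution = rank_gap / n_categories
print(round(contribution, 2))         # prints 2.43
```

Seventeen places of rank, divided across seven equally weighted categories, accounts for 2.43 points of score, regardless of whether the underlying weather differs by a lot or by nothing measurable at all.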

OK, so there are problems with the metric, is what I’m saying. But here’s the fun part: in the category of tax rank, there is a 41-way tie for first. All the towns except East Hampton have a ‘1’ in that category. East Hampton has a ‘42’. It is not, by-the-bye, the actual case that East Hampton has the highest mill rate in the state, and it is not even close to the actual case that all the other towns have identical tax burdens. No, the variation in property taxes from town to town is high enough that people really do choose to live in one town or another based on the current tax rate (and presumably a prediction of stability of some sort, I suppose). If you wanted a real estate agent to advise you on where to live—and you don’t, I’m pretty sure you don’t—one aspect that you might expect the agent to know something about is the difference in tax burden on either side of the town line. No? I suppose not.
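For what it’s worth, here is one plausible way a column ends up looking like that: forty-one towns sharing a single (presumably broken) value, ranked so that tied values all get the minimum rank. The values below are invented to reproduce the pattern, not taken from their data.

```python
# One plausible mechanism for a 41-way tie for first: a broken data column
# where 41 towns share one value, ranked with "ties get the minimum rank".
# The tax figures here are invented for illustration.

def min_ranks(values):
    """1-based ranks where tied values all receive the lowest rank in the tie."""
    sorted_vals = sorted(values)
    return [sorted_vals.index(v) + 1 for v in values]

tax_column = [30.0] * 41 + [45.0]   # 41 identical entries plus one outlier
ranks = min_ranks(tax_column)
print(ranks.count(1), ranks[-1])    # prints: 41 42
```

That is, 41 towns tied at rank 1 and one town stuck at rank 42, which is exactly the goofy column they published.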

And yes, I checked: the big deal score they used to come up with the Top Ten List really is the average of the ranks in the seven categories, including the column for taxes with the forty-one-way tie for first. Accuracy!

So. Why bother making fun of this? When my town hall brags about making this particular Top Ten List, as far as I’m concerned, they’re just embarrassing themselves and me. So there’s that. But also… it’s hard to keep in mind, all the time, that there really is no downside to the real estate company (that’s Movoto, by the way) in doing this sort of thing. They slap this up, send around an email to the towns involved, and get a handful of clicks. Somebody like Your Humble Blogger may gripe about it a bit, but so what? Nobody expects rigor in a Top Ten anything. It’s just for fun, right?

And this is why I keep saying that we need more statistics education and logic education in the high schools, and less trigonometry and geometry. As David S. Bernstein wrote about a different topic altogether: “This lengthy and meandering tale is ultimately about nothing, but along the way might prove instructive for anyone wondering why so many people walk around with their heads filled with a vast, expanding trove of untrue nonsense.” Me too, of course, which is why it’s so irritating.

Tolerabimus quod tolerare debemus,
-Vardibidian.
