Friday, February 25, 2011

Surveys: You get what you ask for?

OK, I give in. I'm writing about the Wisconsin budget repair battle. I came across an interesting poll result that's too good to pass up.

Poll: Wisconsin Gov. Scott Walker Winning Labor, Budget Fight - Peter Roff

Dick Morris, former Clinton pollster, released his findings on public opinion in Wisconsin. Read the article and decide for yourself if the headline is fair. But pay special attention to the last two paragraphs.

When people are asked if collective bargaining for teachers should be restricted to benefits and wages (as the bill would do), 54% say "no" and are opposed to that part of the bill. However, if you re-state the question and say that the collective bargaining limit "gives schools more flexibility, makes it easier to get rid of bad teachers and retain good ones..." then the poll results flip and 58% say "yes" to that part of the bill.

I applaud Morris' poll for asking the question both ways because it's a great example of how easy it is to slant poll results. Regardless of the issue, if your survey frames a "yes" with negative outcomes you'll get fewer "yes" answers. Frame it with positive outcomes and you'll get more "yes" answers.

Well, duh! That should be obvious. Every pollster should avoid loading their questions with positive or negative terms.

Now go read some surveys on hot-button issues. Whether intended or not, you'll find a lot of loaded questions because it can be hard to write a neutral question. Sometimes the best option is to ask the question a couple of different ways, like Morris did, to see how the outcomes change.

Tuesday, February 22, 2011

Social Security and You (Part 1)

Back in the '80s, I was a public high school teacher for a couple of years. Therefore, I find the current events in Madison, WI, interesting. There are a multitude of data issues behind the protests and press conferences, and I've learned things about teacher pensions that I wasn't aware of.

I considered writing about teacher pensions, but decided against it because 1) most who read this post aren't teachers, and 2) things could quickly deteriorate into a non-data political discourse.

Then I saw this comic in today's paper and decided that it was a good time to bring up Social Security. While it has its own political discourse risks, I'm willing to take that chance.

The primary target audience for this entire blog is college students but this post is especially for them.

How much do you know about Social Security taxes? If you have a job, you should look at your last pay stub. There are probably a couple of lines, one for "Social Security" and another for "Medicare" (maybe something's labeled OASDI). Many people discuss Social Security and Medicare as if they are one thing, but they're technically different programs. How much does that matter from a data perspective?

Regardless of whether you group them together, do you know what percent of your income they take for these taxes? Are you aware of how much your employer pays in addition to your payment? When you do your tax returns (or someone does them for you), do you have any idea how income tax deductions and credits do or don't relate to Social Security/Medicare taxes?

I'm not going to answer any of these questions here. If you don't know the answers I urge you to look them up. I also strongly suggest that after you do your taxes you sit down and figure out exactly how much federal income tax, state income tax, and Social Security/Medicare tax you paid as a percent of your income. You may hear a lot about marginal tax rates but you should know what percentages you actually pay. You may be surprised, pleasantly or otherwise, at your actual percentages and how those three categories (fed, state, Soc Sec) compare.
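If you want a template for that exercise, here's a minimal sketch. Every number below is made up for illustration; plug in the actual totals from your own tax forms and pay stubs.

```python
# Compute each tax category as a percent of gross income.
# All dollar amounts here are hypothetical placeholders.

gross_income = 40_000.00        # hypothetical annual gross pay
federal_income_tax = 3_200.00   # hypothetical, from your federal return
state_income_tax = 1_400.00     # hypothetical, from your state return
soc_sec_medicare = 2_800.00     # hypothetical, summed from pay stubs

for label, amount in [
    ("Federal income tax", federal_income_tax),
    ("State income tax", state_income_tax),
    ("Soc Sec/Medicare", soc_sec_medicare),
]:
    print(f"{label}: {100 * amount / gross_income:.1f}% of gross income")
```

The point isn't the arithmetic, which is trivial; it's that you'll finally see your actual percentages side by side instead of the marginal rates people argue about.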

Once you've figured out what percent you pay, do you know what you're supposed to get in return? Do you know what you're actually going to get? Is that really two different questions?

I hope that I've whetted your appetite enough to seek some answers. After you find those answers you can let me know if it was worth the effort.

More to come...

Friday, February 18, 2011

Ordinal scales - #$#%!

I recently had rotator cuff surgery and now I'm in physical therapy. At every session the therapist asks, "How would you rate your pain today on a 10-point scale?" They tell you that 0 is "no pain" and 10 is "the worst pain you can imagine."

How am I supposed to answer this question? It always hurts. Some days only a little, some days a little more, and some days a lot. How am I supposed to tell the difference between a 3 and a 4? I've never actually said anything higher than 5. Does that mean that my 5 is really a 10 because it's the worst pain I have, or is my 5 really a 5 because I can imagine much worse pain (being skinned alive, dental drilling without Novocain, etc.)?

I'm not saying that there is no such thing as ordinal data. Many phenomena of interest are naturally ordinal (good, plus good, double plus good!). The problem occurs when we impose a numeral-based scale on an ordinal phenomenon. People start to think that the numerals represent real numbers. Then they think it makes sense to analyze them as numbers: "The patient's average pain this week was 3.428."

Fortunately, my therapist has never made a statement like that, but course evaluations where I teach are done on numeral-based ordinal scales and every semester the institution computes means and standard deviations for teachers, classes, divisions, etc. It's statistical nonsense.
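Here's a quick way to see why the mean is shaky for ordinal data. Ordinal labels record order, not distance, so the mean depends entirely on an arbitrary coding, while the median survives any order-preserving relabeling. The pain ratings below are hypothetical.

```python
# Mean vs. median under an order-preserving recoding of ordinal labels.
from statistics import mean, median

pain = [3, 3, 4, 2, 5, 3, 4]            # a week of hypothetical ratings
recoded = {2: 2, 3: 3, 4: 6, 5: 9}      # same ordering, different spacing
stretched = [recoded[p] for p in pain]

# The means disagree even though nothing about the *order* changed...
print(mean(pain), mean(stretched))
# ...but the median is 3 under both codings.
print(median(pain), median(stretched))
```

If a summary statistic changes when you merely relabel the categories, it was never measuring the phenomenon; it was measuring your labels.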

Thursday, February 17, 2011

Surveys - where do they get these people?

I've always wondered where Family Feud found people for their surveys.

Now I wonder even more.

Wednesday, February 16, 2011

Is college useless?

Academically Adrift was published last month. The link takes you to the U of Chicago Press but you can Google "academically adrift" and find many articles about it. (Disclaimer: I have not read the book itself and I do not get a commission if you buy it.)

A key component seems to be the authors' finding that roughly 36% of all college students fail to show improvement on a particular learning assessment, the CLA. 

The street-level interpretation? College doesn't work.

As expected, some in the higher education world are upset by these findings and the study has been attacked on several fronts. However, it seems to me that a key statistical understanding is missing.

For the sake of discussion, let's take the findings at face value. Let's assume that the CLA is a good test and let's assume that the data is representative of all college students. 

Is the street-level interpretation fair?  Maybe, maybe not.

We don't know what would have happened to a similar group of non-students. What would the CLA show for a group who entered the workforce right after high school and never attended college? If there were the same 36%/64% split, then college would make no difference in learning.

But what if the non-college split was 90% - 10%? Then the college student results would be evidence that college works well. What if the non-college split went the other way and 90% of non-college students improved? That would be evidence that college actually hurts.

Even without considering an alternate test group, it's also worth pointing out the "half empty/half full" issue. If 36% of college students show no improvement, then 64% evidently do show some improvement.  If college increases scores for nearly 2/3 of students, it might not be fair to say college doesn't work.
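The counterfactual point above can be reduced to a toy comparison. The 64% college improvement rate looks good, neutral, or bad depending entirely on a non-college baseline that the study doesn't give us; the baseline rates below are hypothetical.

```python
# Same college result, three hypothetical non-college baselines,
# three opposite-sounding verdicts.

college_improved = 0.64  # share of college students who improved on the CLA

for baseline in (0.64, 0.10, 0.90):  # hypothetical non-college rates
    diff = college_improved - baseline
    if diff > 0:
        verdict = "evidence college helps"
    elif diff < 0:
        verdict = "evidence college hurts"
    else:
        verdict = "no apparent difference"
    print(f"non-college improvement {baseline:.0%}: {verdict}")
```

One observed number, three stories: without a comparison group, the 36% figure alone can't carry any of them.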

What do you think of this study?  How can the data be interpreted?

Also, is it fair for you to comment on the study based on my blog post or should you read more about Academically Adrift first? Should you read the entire book before you comment? Should I have read the entire book before writing a post? The world is full of summaries about summaries. How far back toward the original data should we have to dig before we're allowed to comment or draw conclusions?

Welcome to Data Matters

Welcome to my new blog. I've taught statistics for nearly 20 years from 100-level general education courses through applied MBA classes. In spite of my love of data and the stories that data can tell, I've learned over the years that most students don't share my devotion and many view my courses as "sadistics" instead of "statistics".

Therefore, I'm starting this blog as an out-of-class tool to engage my students in the world of data. I'll post article links, commentaries, and other data insights for them to read and perhaps comment on. If you aren't a student and you've stumbled across this blog, maybe you'll find something useful too.

There are many scholars (for example, David Moore, Edward Tufte, Ian Ayres) who have done significant work to make statistics more accessible and more useful to a broad audience. I don't expect this small blog to ever equal their contributions, but maybe it will turn a few people on to the power and importance of data.