Please answer the following questions
My attitude toward surveys is:
- I hate them
- I'm allergic to them
- They take too long to complete
- They are a necessary evil
- All of the above
- None of the above
- I'm bored and getting cranky
I've spent an inordinate amount of time this weekend updating my “What Gets Measured Gets Done: A School Library Media and Technology Program Self-study Workbook” in preparation for workshops I'll be doing for the New York SLMS Leadership retreat next month. Last revised in 2001, the booklet needed some updating and this was a good excuse. Anyone is welcome to use it or any part of it if they'd like to do a really swell, bang-up formal library/technology program evaluation.
I also revised the "Tools" document that goes with it - sample surveys and checklists to be used for data-gathering when doing program assessments or creating long-range plans. I've decided to put all the surveys I've created (that I can remember and find) into this now 53-page document. The TOC looks like this:
Surveys
Staff survey for long-range tech planning purposes p. 36
Parent survey questions p. 2
Parent survey response summary form p. 4
Principal survey questions p. 5
Principal survey response summary form p. 7
Student survey questions p. 8
Primary student survey questions p. 10
Student survey response summary form p. 11
Teacher survey questions p. 12
Teacher survey response summary form p. 14
Program evaluation rubrics p. 15
Inventory templates
Budget p. 24
Library resources p. 25
Computer hardware p. 26
Video and voice hardware p. 27
Staffing p. 28
Miscellaneous checklists
Facilities and infrastructure p. 29
Curriculum p. 30
Climate p. 31
13 point library/media program checklist p. 32
Electronic resource checklist p. 35
Media Department Year End Report p. 42
Staff Technology Satisfaction Survey p. 46
Mankato Survey of Professional Technology Use, Ability and Accessibility p. 49
Yes, I should be shot.
In one of the documents I wrote that good surveys have:
- a specific set of questions to be answered
- descriptive indicators of numerical scales
- a rapid means of compiling and reporting data
I would now add that good surveys are:
- Purposeful
- Focused
- Short
- Online
- Statistically defensible
One of the things I realized is that while I often ask respondents to identify themselves by gender and age, I've never taken the time to go back and disaggregate the data to see if gender or age made any difference in the responses. I just wasted people's time having them check the little boxes. Asking irrelevant questions is probably the biggest sin most survey makers commit.
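Disaggregating those demographic check-boxes is mostly a matter of grouping responses and comparing averages. Here's a minimal sketch in Python, using made-up ratings (the data, the groups, and the `mean_rating_by` helper are all hypothetical, just to show the shape of the analysis):

```python
from collections import defaultdict

# Hypothetical responses: (gender, age band, rating on a 1-5 scale).
responses = [
    ("F", "20-35", 4), ("M", "20-35", 3), ("F", "36-50", 5),
    ("M", "36-50", 2), ("F", "51+", 4), ("M", "51+", 3),
]

def mean_rating_by(rows, key_index):
    """Average the ratings within each demographic group."""
    groups = defaultdict(list)
    for row in rows:
        groups[row[key_index]].append(row[2])  # rating is the third field
    return {group: sum(r) / len(r) for group, r in groups.items()}

by_gender = mean_rating_by(responses, 0)  # group on the gender column
by_age = mean_rating_by(responses, 1)     # group on the age-band column
```

If the group averages come out close to identical, the demographic question wasn't earning its keep and can be dropped from the next survey.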
I promise to do better in the future:
- Absolutely
- Maybe
- My intentions are good
- Like I'll remember
Reader Comments (2)
Doug,
I would add that good surveys are created using a form of "backward design." You start with a list of decisions that need to be made or questions that need to be answered, and then design a survey that will help you collect the necessary information. And toward that end, I always design the survey with correlative analysis in mind: I ask the questions in such a way that I can look at how answers to one question correlate or don't correlate with another. For example, I ask respondents to rate their comfort level with technology and then later ask them to identify their preferred environment for professional development. The answers among the "low technology comfort level" group and their more experienced peers are bound to be different, and it's important to make those distinctions.
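That kind of correlative look is just a cross-tabulation of two answers. A minimal sketch in Python, with hypothetical paired responses (the comfort levels and PD formats here are invented for illustration):

```python
from collections import Counter

# Hypothetical paired answers: (tech comfort level, preferred PD format).
answers = [
    ("low", "face-to-face"), ("low", "face-to-face"), ("low", "online"),
    ("high", "online"), ("high", "online"), ("high", "face-to-face"),
]

# Count each (comfort, format) combination.
crosstab = Counter(answers)

# Within each comfort group, what share prefers each format?
totals = Counter(comfort for comfort, _ in answers)
shares = {pair: n / totals[pair[0]] for pair, n in crosstab.items()}
```

Reading the shares side by side shows whether the low-comfort group really does prefer a different professional-development format than the high-comfort group.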
Finally, I think that the sequencing of questions in a survey is a very important but oft-overlooked piece. Answering one question prejudices the respondent for all of the remaining questions on the survey. For example, if you start the survey by asking the respondent to identify all of the problems with your district's technology, that is going to put him/her in a certain mindset that will affect the rest of his/her responses.
Thanks, Mike. These are great suggestions. I think I need a class in this!
Doug