Greetings from Golden, Colorado, where in about an hour I'll be working with educators from the Jefferson County Schools. One of the things I've been asked to visit with school librarians about is how we can demonstrate school library media centers' impact on student learning. I hear this one a lot lately.
In a rather off-handed manner, I dealt with the topic last summer in the blog entry "A Trick Question." But I realize I did not do the topic justice. So for the last couple weeks, I've been trying to distill some thoughts and readings on the subject. Find the results below. I am heavily indebted to Joyce Valenza, David Loertscher, Ross Todd, and others whom I cite in the little article below.
Normally, I would ask this at the end of the blog entry, but I am going to do it now: Please, if you have ideas about how YOU demonstrate your influence on your school, please let me and other readers of this blog know. This is not a done deal, but something we will need to continue to work on for the remainder of our careers. Thanks!
Demonstrating Our Impact: Putting Numbers in Context
One of my favorite quotes comes from George Bernard Shaw: "We should all be obliged to appear before a board every five years and justify our existence...on pain of liquidation." While Shaw was commenting on one's social worth, his words today could come from any number of administrators, school boards, and legislatures and be aimed directly at school library media specialists. The question "How do we demonstrate our impact on student achievement?" is one every library media specialist in the country increasingly needs to answer persuasively.
I have long been frustrated with this question, especially when those asking want empirical rather than anecdotal evidence to support claims of effectiveness. To me, genuine empirical evidence is the result of a controlled study and no school has the ability or will to do a controlled study on library effectiveness. Would your school:
- Be willing to have a significant portion of its students (and teachers) go without library services and resources as part of a control group?
- Be willing to wait three to four years for reliable longitudinal data?
- Be willing to change nothing else in the school to eliminate all other factors that might influence test scores?
- Be willing to find ways to factor out demographic data that may influence test results?
- Be able to analyze a large enough sample to be considered statistically significant?
- Be willing to provide the statistical and research expertise and manpower needed to make the study valid?
I know mine wouldn’t participate in such a study, no matter how clear-cut the evidence produced. So how do we demonstrate our impact using “numbers?” Let’s look at a number of ways, none perfect, but powerful when used in combination.
1. Standards and checklists. A common means of assessing a school library media program (and by inference assessing its impact on student learning) is by comparing an individual library media program to a state or national set of program standards. AASL’s Planning Guide for Information Power: Building Partnerships for Learning with School Library Media Program Assessment Rubric for the 21st Century (ALA, 1999) is one example of a tool that can be used to do such a comparison. Many states also have standards that can be used to evaluate library media programs. Minnesota’s, for example, can be found at <www.memoweb.org/htmlfiles/linkseffectiveslmp.html>.
Both AASL and MEMO use rubrics that quickly allow media specialists to evaluate their programs. For example, MEMO’s “Standard One” under the Learning and Teaching section reads: “Is the program fully integrated?” and gives these levels:
- 25-50% of classes use the media program’s materials and services the equivalent of at least once each semester.
- 50-100% of classes use the media program’s materials and services the equivalent of at least once each semester. The media specialist is a regular member of curriculum teams. All media skills are taught through content-based projects.
- 50-100% of classes use the media program’s materials and services the equivalent of at least twice each semester. Information literacy skills are an articulated component of a majority of content area curricula.
While standards can and should be used to help evaluate a program, they cannot establish a direct link between meeting such standards and local student achievement. Though backed by research, best practices, and the experience of the standards writers, who are usually experts in the field, these tools can only suggest what may make a local program more effective, not demonstrate that the current program is having an impact. While important, standards are guides, not evidence.
2. Research studies. The Colorado studies are a good example of using statistical regression analysis to look for correlations between variables. In the case of statewide library studies, the relationship of effective library programs and standardized test scores is examined. School Libraries Work (Scholastic, 2006) <www.scholastic.com/librarians/printables/downloads/slw_2006.pdf> is an excellent summary of this type of research. These findings can and should be shared with principals. Some statisticians do not approve of regression analyses because they show correlation, not causation, and because it is very difficult to factor out other variables that may have influenced the correlation.
Other formal individual research studies and meta-studies are also worth sharing with administrators. Stephen Krashen’s Power of Reading, 2nd edition, persuasively stacks up a large number of individual research reports to demonstrate that voluntary free reading can improve student reading ability. And he concludes that when students have access to a wide range of reading resources (in libraries, of course), they do more independent reading.
Unfortunately, just as all politics are local, so are all assessments local. While decision-makers are usually quite willing to read and acknowledge studies done “elsewhere,” most still want to know the direct impact their local program is having.
3. Counting things. Year-end reports that include circulation statistics, library usage, and collection size data are a common way for building library programs to demonstrate the degree to which they are being used, and by inference, having an impact on the educational program in the school.
Ontario Library Association’s Teacher Librarian Toolkit for Evidence Based Practice <accessola.com/osla/toolkit/home.html> contains a number of forms that can be used to track circulation and incidences of collaboration. Jacquie Henry provides a tool for tracking media center usage in the January 2006 issue of Library Media Connection.
Our district’s “Year End Report” asks library media specialists to enumerate the following:
Number of print materials circulated
AV materials circulated
In-library circulation of print
In-library circulation of AV materials
AV equipment circulated
Use of space:
Drop-in users
Number of books acquired and deleted
Number of AV materials acquired and deleted
Number of software programs acquired and deleted
Leadership team activities: (List any building/district committees on which you have served and your role on them.)
For primary, please list for each grade level library units taught that support classroom units and major skills taught.
For secondary, please list all units taught collaboratively and skills for which you had major responsibility for teaching.
Special programs or activities: (in-services, reading promotions, authors, events)
Please share a minimum of three instructional highlights for the past year. This is very helpful when concrete examples of media/tech services are needed.
Communications: (Please list how you have communicated with parents, staff and students this year.)
There is a movement away from counting things (materials, circulation, online resource uses, website hits, individual student visits, whole class visits, and special activities such as tech fairs and reading promotions) and toward enumerating instructional activities: booktalks given, skill lessons taught, teacher in-services provided, pathfinders/bibliographies created, and collaborative units conducted. Administrators are less concerned about how many materials are available and more concerned about how they are being used.
Information and technology literacy skill attainment, if assessed and reported, is another means of “counting” one’s impact. Our elementary library media specialists have primary responsibility for teaching these skills and complete sections of student progress reports similar to those done in math and reading. At the building level, it is possible for the library media specialist to make a statement like: “89% of 6th grade students have demonstrated mastery of the district’s information literacy benchmarked skills.”
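The mastery statement above is simple arithmetic once the skill data is recorded. Here is a minimal sketch, using invented student records and a hypothetical skill list, of how one might tally a mastery percentage from benchmark results:

```python
# Hypothetical records: student -> whether each benchmarked information
# literacy skill was mastered. All names and data here are invented
# for illustration.
records = {
    "student_01": {"cites_sources": True, "evaluates_websites": True},
    "student_02": {"cites_sources": True, "evaluates_websites": False},
    "student_03": {"cites_sources": True, "evaluates_websites": True},
}

def mastery_rate(records):
    """Percent of students who mastered every benchmarked skill."""
    mastered = sum(1 for skills in records.values() if all(skills.values()))
    return round(100 * mastered / len(records))

print(f"{mastery_rate(records)}% of students demonstrated mastery "
      "of the district's benchmarked skills")
```

With the sample data above, two of three students mastered every skill, so the report would read "67% of students demonstrated mastery."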
4. Asking people. Surveys and focus groups of library users are frequently used to collect information about the impact of library media programs.
Some sources of surveys:
• Johnson, What Gets Measured Gets Done (Tools): <www.doug-johnson.com/wgm/wgm.html>
• McGriff, Preddy, and Harvey, Program Perception <www.nobl.k12.in.us/media/NorthMedia/lms/data/percept/percept.htm>
• Valenza, Power Tools Recharged (ALA, 2004)
Surveys of both students and teachers can be done either at the completion of a project or on an annual basis. Joyce Valenza conducts video “exit interviews” of graduating seniors at her high school.
Survey-based data gathering was a powerful tool used by Todd and Kuhlthau to conduct Student Learning through Ohio School Libraries: The Ohio Research Study <www.oelma.org/studentlearning> in 2003. This type of study would be relatively easy to recreate at the building level.
5. Anecdotal data. Is there value to anecdotal evidence and stories? Despite my favorite statistics teacher’s dictum that the plural of anecdote is not data, I believe empirical evidence without stories is ineffective. One skill all great salespeople have is the ability to tell compelling personal tales that illustrate the points they wish to make. It’s one thing for the guy down at the Ford dealership to show a potential buyer a Consumer Reports study. But the real closer tells the story of how Ms. Jones buys this exact model every other year and swears each one is the best car she has ever owned. When selling (advocating for) our programs, our visions, and ourselves to those we wish to influence, we need to tell our stories. See “Once Upon a Time,” Library Media Connection, February 2002. <www.doug-johnson.com/dougwri/storytelling.html>.
Don’t discount how powerful “digital storytelling” can be as well. A short video or even photographs of students using the library media center for a variety of activities can be persuasive. How many times have you said, “If only the parents could see this, they would support the library 100%”? Through digital photography and a presentation to the PTA or Kiwanis organization, they can see your program.
Context and Focus
Numbers alone, of course, mean little. They need to be interpreted and placed in some type of meaningful context. Context can be achieved by setting and meeting goals and by looking at numbers over time. Look, for example, at how each statement gets more powerful:
- 28 teachers participated in collaborative units (Is this good or bad?)
- 78% of teachers in the building participated in collaborative units (This tells me more.)
- 78% of teachers, up from 62% of teachers last year, participated in collaborative teaching units. (This shows a program that is getting stronger.)
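The progression above is just a matter of dividing and comparing, but it is worth making habitual. Here is a minimal sketch, using invented counts (28 of 36 teachers this year, 62% last year), of turning a raw tally into the stronger contextualized statements:

```python
# Sketch of building the three increasingly powerful statements from a
# raw participation count. The counts are invented examples.
def participation_report(participating, total, last_year_pct=None):
    pct = round(100 * participating / total)
    lines = [
        f"{participating} teachers participated in collaborative units",
        f"{pct}% of teachers in the building participated in collaborative units",
    ]
    if last_year_pct is not None:
        trend = "up" if pct > last_year_pct else "down"
        lines.append(f"{pct}% of teachers, {trend} from {last_year_pct}% "
                     "last year, participated in collaborative units")
    return lines

for line in participation_report(28, 36, last_year_pct=62):
    print("-", line)
```

The same pattern works for circulation, class visits, or any other count: report the raw number, the percentage of the possible whole, and the change from the prior year.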
It’s clear that a wide variety of means exist to assess a wide variety of library program activities. How does one choose the “what” and “how” of program evaluation?
David Loertscher’s Project Achievement <www.davidvl.org/achieve.html> suggests that data collection should be done at three levels in order to triangulate evidence: at the Learner Level; at the Teaching Unit Level; and at the Organization Level and provides tools to do just that. He also suggests evaluating the impact of the library program on four areas: Reading, Collaborative Planning, Information Literacy and Technology.
My suggestion is to pay careful attention to your building and district goals and annual objectives. If reading is a focus, then look at reading activities, promotions, collection development, and circulation. If there is a focus on a particular demographic within your school (ESL students, for example), check to see if your circulation system will allow you to sort by that identifier. Your own goals, and your progress toward them, can also provide an effective means of assessment.
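If your circulation system can only export a flat list of checkout records, the sorting can be done outside the system. Here is a minimal sketch, with hypothetical field names and invented records, of tallying checkouts by a student-group identifier:

```python
# Sketch of grouping exported circulation records by a student-group
# identifier. The "group" field and all records are hypothetical; your
# circulation system's export format will differ.
from collections import Counter

checkouts = [
    {"student": "s1", "group": "ESL"},
    {"student": "s2", "group": "ESL"},
    {"student": "s1", "group": "ESL"},
    {"student": "s3", "group": "general"},
]

by_group = Counter(rec["group"] for rec in checkouts)
for group, count in sorted(by_group.items()):
    print(f"{group}: {count} checkouts")
```

A tally like this lets you say, for instance, how circulation among a targeted group compares to the building as a whole, or whether it grew after a promotion aimed at that group.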
We can no longer afford to complete a program evaluation once every five years and have the results thrown in a drawer and never used. Our assessments need to help us improve our practice, to serve as indicators for our planning efforts, and to be an integral part of our communication efforts with our teachers, administrators, parents, and communities. Assessment, of course, takes time. But less time than finding another job.
How are you “demonstrating your impact on student achievement?”