Thursday, October 5, 2006

How do we demonstrate our impact on student achievement?

Greetings from Golden, Colorado, where in about an hour I'll be working with educators from the Jefferson County Schools. One of the things I've been asked to visit with school librarians about is how we can demonstrate school library media centers' impact on student learning. I hear this one a lot lately.

In a rather off-handed manner, I dealt with the topic last summer in the blog entry "A Trick Question." But I realize I did not do the topic justice. So for the last couple of weeks, I've been trying to distill some thoughts and readings on the subject. Find the results below. I am heavily indebted to Joyce Valenza, David Loertscher, Ross Todd and others whom I cite in the little article below.

Normally, I would ask this at the end of the blog entry, but I am going to do it now: if you have ideas about how YOU demonstrate your influence in your school, please let me and other readers of this blog know. This is not a done deal, but something we will need to continue to work on for the remainder of our careers. Thanks!


Demonstrating Our Impact: Putting Numbers in Context

One of my favorite quotes comes from George Bernard Shaw: "We should all be obliged to appear before a board every five years and justify our existence...on pain of liquidation." While Shaw was commenting on one's social worth, his words today could come from any number of administrators, school boards and legislatures and be aimed directly at school library media specialists. "How do we demonstrate our impact on student achievement?" is a question every library media specialist in the country increasingly needs to be able to answer persuasively.

I have long been frustrated with this question, especially when those asking want empirical rather than anecdotal evidence to support claims of effectiveness. To me, genuine empirical evidence is the result of a controlled study and no school has the ability or will to do a controlled study on library effectiveness. Would your school:

  • Be willing to have a significant portion of its students (and teachers) go without library services and resources as part of a control group?
  • Be willing to wait three to four years for reliable longitudinal data?
  • Be willing to change nothing else in the school to eliminate all other factors that might influence test scores?
  • Be willing to find ways to factor out demographic data that may influence test results?
  • Be able to analyze a large enough sample to be considered statistically significant?
  • Be willing to provide the statistical and research expertise and manpower needed to make the study valid?

I know mine wouldn’t participate in such a study, no matter how clear-cut the evidence produced. So how do we demonstrate our impact using “numbers?”  Let’s look at a number of ways, none perfect, but powerful when used in combination.

1. Standards and checklists. A common means of assessing a school library media program (and by inference assessing its impact on student learning) is by comparing an individual library media program to a state or national set of program standards. AASL’s Planning Guide for Information Power: Building Partnerships for Learning with School Library Media Program Assessment Rubric for the 21st Century (ALA, 1999) is one example of a tool that can be used to do such a comparison. Many states also have standards that can be used to evaluate library media programs. Minnesota’s, for example, can be found at <www.memoweb.org/htmlfiles/linkseffectiveslmp.html>.

Both AASL and MEMO use rubrics that quickly allow media specialists to evaluate their programs. For example, MEMO’s “Standard One” under the Learning and Teaching section asks, “Is the program fully integrated?” and gives these levels:

Minimum: 25-50% of classes use the media program’s materials and services the equivalent of at least once each semester.
Standard: 50-100% of classes use the media program’s materials and services the equivalent of at least once each semester. The media specialist is a regular member of curriculum teams. All media skills are taught through content-based projects.
Exemplary: 50-100% of classes use the media program’s materials and services the equivalent of at least twice each semester. Information literacy skills are an articulated component of a majority of content area curricula.

While standards can and should be used to help evaluate a program, they do not establish a direct link between meeting the standards and local student achievement. Though backed by research, best practices, and the experience of the standards writers, who are usually experts in the field, these tools can only suggest what may make a local program more effective, not demonstrate that the current program is having an impact. Standards, while important, are guides, not evidence.

2. Research studies. The Colorado studies are a good example of using statistical regression analysis to look for correlations between variables. In the case of statewide library studies, the relationship between effective library programs and standardized test scores is examined. School Libraries Work (Scholastic, 2006) <www.scholastic.com/librarians/printables/downloads/slw_2006.pdf> is an excellent summary of this type of research. These studies can and should be shared with principals. Some statisticians are skeptical of regression analyses because they show correlation, not causation, and because it is very difficult to factor out other variables that may have affected the correlation.

Other formal individual research studies and meta-studies are also worth sharing with administrators. Stephen Krashen’s The Power of Reading, 2nd edition, persuasively stacks up a large number of individual research reports to demonstrate that voluntary free reading can improve student reading ability. And he concludes that when students have access to a wide range of reading resources (in libraries, of course), they do more independent reading.

Unfortunately, just as all politics are local, so are all assessments local. While decision-makers are usually quite willing to read and acknowledge studies done “elsewhere,” most still want to know the direct impact their local program is having.

3. Counting things. Year-end reports that include circulation statistics, library usage, and collection size data are a common way for building library programs to demonstrate the degree to which they are being used, and by inference, having an impact on the educational program in the school.

The Ontario Library Association’s Teacher Librarian Toolkit for Evidence Based Practice <accessola.com/osla/toolkit/home.html> contains a number of forms that can be used to track circulation and instances of collaboration. Jacquie Henry provides a tool for tracking media center usage in the January 2006 issue of Library Media Connection.

Our district’s “Year End Report” asks library media specialists to enumerate the following:

Circulation statistics:
Number of print materials circulated
AV materials circulated
In-library circulation of print
In-library circulation of AV materials
AV equipment circulated

Use of space:
Classes held/hosted
Drop in users
Computer lab
After hours
Other uses

Collections:
Number of books acquired and deleted
Number of AV materials acquired and deleted
Number of software programs acquired and deleted

Leadership team activities: (List any building/district committees on which you have served and your role on them.)

Instructional activities:
For primary, please list for each grade level library units taught that support classroom units and major skills taught.
For secondary, please list all units taught collaboratively and skills for which you had major responsibility for teaching.

Special programs or activities: (in-services, reading promotions, authors, events)
Please share a minimum of three instructional highlights for the past year. This is very helpful when concrete examples of media/tech services are needed.

Communications: (Please list how you have communicated with parents, staff and students this year.)

There is a movement away from counting things (materials, circulation, online resource uses, website hits, individual student visits, whole-class visits, and special activities such as tech fairs and reading promotions) toward enumerating how many instructional activities were accomplished: booktalks given, skill lessons taught, teacher in-services provided, pathfinders/bibliographies created, and collaborative units conducted. Administrators are less concerned about how many materials are available and more concerned about how they are being used.

Information and technology literacy skill attainment, if assessed and reported, is another means of “counting” one’s impact. Our elementary library media specialists have primary responsibility for teaching these skills and complete sections of student progress reports similar to those done in math and reading. At the building level, it is possible for the library media specialist to make a statement like: “89% of 6th grade students have demonstrated mastery of the district’s information literacy benchmarked skills.”
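If your progress-report data lives in a simple spreadsheet export, even a few lines of scripting can produce that kind of statement. Below is a minimal sketch, not a recommendation of any particular system: the file name, the "benchmark_status" column, and the "M" mastery code are all assumptions you would swap for whatever your own records actually contain.

```python
# Rough sketch: compute the share of students who have demonstrated mastery
# of the district's information literacy benchmarks.
# The file name, column name, and "M" (mastery) code are hypothetical;
# adjust them to match your own progress-report export.
import csv

with open("grade6_info_lit_progress.csv", newline="") as f:
    rows = list(csv.DictReader(f))  # one row per student

mastered = sum(1 for row in rows if row["benchmark_status"] == "M")
total = len(rows)

if total:
    print(f"{mastered / total:.0%} of 6th grade students have demonstrated "
          "mastery of the district's information literacy benchmarks.")
```

The point is less the tool than the habit: if the skills are assessed and recorded somewhere, the percentage statement is a few minutes' work at the end of the year.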

4. Asking people. Asking library users to complete surveys and participate in focus groups is a frequently used way to collect information about the impact of library media programs.

Some sources of surveys:
•    Johnson, What Gets Measured Gets Done (Tools): <www.doug-johnson.com/wgm/wgm.html>
•    McGriff, Preddy, and Harvey, Program Perception <www.nobl.k12.in.us/media/NorthMedia/lms/data/percept/percept.htm>
•    Valenza, Power Tools Recharged (ALA, 2004)

Surveys of both students and teachers can be done either at the project level at completion, or on an annual basis. Joyce Valenza conducts video “exit interviews” of graduating seniors at her high school.

Survey-based data gathering was a powerful tool used by Todd and Kuhlthau to conduct Student Learning through Ohio School Libraries: The Ohio Research Study <www.oelma.org/studentlearning> in 2003. This type of study would be relatively easy to recreate at the building level.

5. Anecdotal data. Is there value to anecdotal evidence and stories? Despite my favorite statistics teacher’s dictum that the plural of anecdote is not data, I believe empirical evidence without stories is ineffective. One skill all great salespeople have is the ability to tell compelling personal tales that illustrate the points they wish to make. It’s one thing for the guy down at the Ford dealership to show a potential buyer a Consumer Reports study. But the real closer tells the story of how Ms. Jones buys this exact model every other year and swears each one is the best car she has ever owned. When selling (advocating for) our programs, our visions, and ourselves to those we wish to influence, we need to tell our stories. See “Once Upon a Time,” Library Media Connection, February 2002. <www.doug-johnson.com/dougwri/storytelling.html>.

Don’t discount how powerful “digital storytelling” can be as well. A short video or even photographs of students using the library media center for a variety of activities can be persuasive. How many times have you said, “If only the parents could see this, they would support the library 100%”? Through digital photography and a presentation to the PTA or Kiwanis organization, they can see your program.

Context and Focus

Numbers alone, of course, mean little. They need to be interpreted and placed in some type of meaningful context. Context can be achieved by setting and meeting goals and by looking at numbers in a historical context. Look, for example, at how each statement gets more powerful (a short script after the list shows one way to compute such figures):

  • 28 teachers participated in collaborative units (Is this good or bad?)
  • 78% of teachers in the building participated in collaborative units (This tells me more.)
  • 78% of teachers, up from 62% of teachers last year, participated in collaborative teaching units. (This shows a program that is getting stronger.)
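The arithmetic behind those statements is simple enough to automate once the raw counts are in hand. Here is a minimal sketch; the counts and the function name are invented for illustration, not pulled from any real report.

```python
# Minimal sketch: put a raw count into context as a percentage and a
# year-over-year trend. All numbers below are invented for illustration.
def participation_rate(participating: int, total_staff: int) -> float:
    return participating / total_staff

this_year = participation_rate(28, 36)   # e.g., 28 of 36 teachers this year
last_year = participation_rate(23, 37)   # e.g., 23 of 37 teachers last year

print(f"{this_year:.0%} of teachers, up from {last_year:.0%} last year, "
      "participated in collaborative teaching units.")
```

Keeping last year's counts alongside this year's is what turns a bare number into a trend a principal can act on.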

It’s clear that a wide variety of means exist to assess a wide variety of library program activities. How does one choose the “what” and “how” of program evaluation?

David Loertscher’s Project Achievement <www.davidvl.org/achieve.html> suggests that data collection should be done at three levels in order to triangulate evidence: the Learner Level, the Teaching Unit Level, and the Organization Level. It provides tools to do just that. He also suggests evaluating the impact of the library program in four areas: Reading, Collaborative Planning, Information Literacy and Technology.

My suggestion is to pay careful attention to your building and district goals and annual objectives. If reading is a focus, then look at reading activities, promotions, collection development and circulation. If there is a focus on a particular demographic within your school (ESL students, for example), check to see if your circulation system will allow you to sort by that identifier; a small script, sketched below, can also do this from an exported file if the system itself cannot. Your own goals, and the accomplishment of them, can also provide an effective means of assessment.
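For those whose circulation system can export checkout records with some kind of student-group field, here is one rough sketch of that sorting. The export file name, the "student_group" field, and the CSV format are assumptions, not features of any particular circulation product; match them to whatever your system actually produces.

```python
# Sketch: tally circulation by a student-group identifier (e.g., ESL) from a
# hypothetical CSV export of checkout records, one row per checkout.
# Field names are assumptions; adjust them to your own system's export.
import csv
from collections import Counter

checkouts_by_group = Counter()
with open("circulation_export.csv", newline="") as f:
    for record in csv.DictReader(f):
        group = record.get("student_group") or "unspecified"
        checkouts_by_group[group] += 1

for group, count in checkouts_by_group.most_common():
    print(f"{group}: {count} checkouts")
```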

We can no longer afford to complete a program evaluation once every five years and have the results thrown in a drawer and never used. Our assessments need to help us improve our practice, to serve as indicators for our planning efforts, and to be an integral part of our communication with teachers, administrators, parents and communities. Assessment, of course, takes time. But less time than finding another job.

How are you “demonstrating your impact on student achievement?”


Reader Comments (2)

Thanks for the informative summary of how to demonstrate school library impact on student achievement. The up-to-date overview is quite useful.
I just completed a study of an Israeli high school using the Todd and Kuhlthau Student Learning through Ohio School Libraries: The Ohio Research Study instrument, adapted to an Israeli environment where the information literacy instruction is led by IT instructors working with school librarians and curricular teachers. The program for 9th and 10th grade students focuses on inquiry-based learning in various curricular subjects integrated with info-lit and IT instruction.
The study created a detailed map of the students' perceptions of where the IL program and the school library and IT infrastructure aided them and a measure of the degree of help which they received in each learning dimension. The reported data for each question, as well as for each block of questions, can serve as an important tool for the school administrators and educational staff to examine the outcomes of the IL program, library and IT infrastructure from the students' viewpoints.
A number of recommendations for the school administrators and program staff resulted from the study.
October 7, 2006 | Unregistered CommenterReuven Werber
BLOG REVIEW FOR NONPROFIT MANAGEMENT COURSE 4800-702

In this blog, Doug Johnson is asking how we can demonstrate a school library media center’s impact on student learning. I am sure this is of great importance with the shortage of funding in our schools. Mr. Johnson gives us several examples of the tools we can use to measure the impact, but what he fails to do is tell us the outcome goal of the library program. He does provide us with standards that are often used to measure the success of the program, but these are standards the program must meet as a minimum requirement. Standards would fall under the category of an activity goal rather than an outcome goal. We know the program must make an impact on students, but what type of impact?
The information provided in this blog is quite useful but we must understand the goal of a program before a true evaluation can be completed.
November 12, 2006 | Unregistered CommenterMelissa Hunt
