Demonstrating Our Impact: Putting Numbers in Context - Part 2
Media Matters column, Leading and Learning, 2006-07, #3
The last Media Matters column began describing ways library media specialists are answering the vital question: “How do we demonstrate our impact on student achievement?” The use of standards and checklists, of research studies, and of year-end reports that “count things” were listed as methods many media specialists currently use, though not always with the desired results.
This column concludes the series by suggesting that user surveys can generate useful empirical data, that good anecdotal evidence is still vital, that context may be as important as the data, and that data gathering and reporting need to be an ongoing effort.
4. Asking people. Asking library users to complete surveys and participate in focus groups is an effective means of collecting information about the impact of the library media program.
Here are some simple, common questions that might be included on a student survey. Questions one through twelve can be answered using a 1-4 Likert scale; questions thirteen through fifteen are open-ended. (The results of surveys that use a Likert scale or other numerical responses are simple to tally and graph, and such graphs can have a dramatic impact when communicating findings. A brief sketch of how the tallying and graphing might be done follows the question list below.)
Student Survey Questions
1. I feel I can help decide what activities, rules and materials are a part of the library media center.
2. The media specialist lets me know when there are new materials or things to do in the media center.
3. There are enough books and other materials in the media center that I can get what I need.
4. I can find the books, computer software and other materials in the media center I need. I can understand them easily.
5. The materials in the media center are easy to find, are in good condition, and are up-to-date.
6. I think the skills I learn in the media center are important. I use them in class or at home as well as in the media center.
7. I can use the media center whenever I need to.
8. The media specialist helps me with my questions.
9. The media specialist is always there when I need help.
10. I feel welcome and comfortable in the media center.
11. I can get my work done in the media center.
12. I use technology in the media center to help me answer my questions.
13. The thing I like best about the library media center is:
14. One thing that could be changed about the library media center is:
15. Other comments or observations:
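Tallying the Likert-scale responses and charting the averages takes only a few lines of code. Here is a minimal sketch in Python, assuming the responses have been exported to a CSV file with one column per question; the file name and column layout are hypothetical, and the matplotlib library is used for the chart.

```python
# A minimal sketch of tallying 1-4 Likert responses and graphing the averages.
# Assumes a CSV named "survey_responses.csv" (hypothetical) with one row per
# student and one column per survey question, each cell holding a value 1-4.

import csv
from collections import defaultdict

import matplotlib.pyplot as plt

totals = defaultdict(int)   # sum of responses per question
counts = defaultdict(int)   # number of responses per question

with open("survey_responses.csv", newline="") as f:
    for row in csv.DictReader(f):
        for question, answer in row.items():
            if answer.strip():            # skip blank (unanswered) cells
                totals[question] += int(answer)
                counts[question] += 1

questions = list(totals)                  # keep the CSV column order
averages = [totals[q] / counts[q] for q in questions]

# A simple bar chart makes the pattern across questions visible at a glance.
plt.bar(questions, averages)
plt.ylim(1, 4)                            # the scale runs from 1 to 4
plt.ylabel("Average response (1-4)")
plt.title("Student survey results")
plt.tight_layout()
plt.savefig("survey_results.png")
```

One bar per question on the 1-4 scale is exactly the kind of graphic that communicates survey results quickly to a principal or a parent group.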
Surveys can also be conducted with teachers, administrators and parents, each yielding good information. Some sources of surveys include:
• Johnson, What Gets Measured Gets Done (Tools): <www.doug-johnson.com/wgm/wgm.html> (from which the questions above were taken.)
• McGriff, Preddy, and Harvey, Program Perception <www.nobl.k12.in.us/media/NorthMedia/lms/data/percept/percept.htm>
• Valenza, Power Tools Recharged (ALA, 2004)
Surveys of both students and teachers can be done either at the project level or on an annual, program level. Joyce Valenza conducts video “exit interviews” of graduating seniors at her high school that help her determine the effectiveness of the media program over the academic career of her students. (See her exemplary "End of Year" report at <http://mciu.org/~spjvweb/annualreport06.pdf>.)
Survey-based data gathering was a powerful tool used by Todd and Kuhlthau to conduct Student Learning through Ohio School Libraries: The Ohio Research Study <www.oelma.org/studentlearning> in 2003. This type of study, in which students are asked about the impact the media center has on their learning, would be relatively easy to recreate at the building level.
5. Anecdotal data. Is there value to anecdotal evidence and stories? Despite my favorite statistics teacher’s dictum that the plural of anecdote is not data, I believe empirical evidence without stories is ineffective. One skill all great salespeople have is the ability to tell compelling personal tales that illustrate the points they wish to make. It’s one thing for the guy down at the car dealership to show a potential buyer a Consumer Reports study. But the real closer is his story of how Ms. Jones buys this exact model every other year and swears each one is the best car she has ever owned. When “selling” our programs, our visions, and ourselves to those we wish to influence, we need to tell our stories. See “Once Upon a Time,” Library Media Connection, February 2002. <www.doug-johnson.com/dougwri/storytelling.html>.
Don’t discount how powerful “digital storytelling” can be as well. A short video or even photographs of students using the library media center for a variety of activities can be persuasive. How many times have you said, “If only the parents could see this, they would support the library 100%”? Through digital photography and a presentation to the PTA or Kiwanis club, they can see your program.
Context and Focus
Numbers alone, of course, mean little. They need to be interpreted and placed in some type of meaningful context. Context can be achieved by setting and meeting goals and by looking at numbers historically. Look, for example, at how each statement gets more powerful:
• 28 teachers participated in collaborative units (Is this good or bad?)
• 78% of teachers in the building participated in collaborative units (This tells me more.)
• 78% of teachers, up from 62% of teachers last year, participated in collaborative teaching units. (This shows a program that is getting stronger.)
In light of NCLB’s focus on the achievement of subgroups within a school, data that relate specifically to target populations may be more powerful than data that apply to the entire school population. While numbers showing that book circulation has grown by x% this year are good to report, numbers showing that book checkout by the building’s ELL (English Language Learner) students has increased by x% are probably of more interest to your administration.
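As an illustration of putting circulation numbers in that kind of context, here is a minimal sketch, again in Python, that computes year-over-year checkout growth both overall and for a target subgroup such as ELL students. The file names, the “group” column, and the group labels are assumptions made for the example, not a description of how any particular circulation system exports its data.

```python
# A minimal sketch of putting circulation counts in context: year-over-year
# growth overall and for a target subgroup such as ELL students. The file
# names, column names, and group labels are illustrative assumptions.

import csv
from collections import Counter

def checkouts_by_group(path):
    """Count checkouts per student group from a CSV with a 'group' column."""
    counts = Counter()
    with open(path, newline="") as f:
        for row in csv.DictReader(f):
            counts[row["group"]] += 1
            counts["all students"] += 1
    return counts

last_year = checkouts_by_group("circulation_2005_06.csv")
this_year = checkouts_by_group("circulation_2006_07.csv")

for group in ("all students", "ELL"):
    change = 100 * (this_year[group] - last_year[group]) / last_year[group]
    print(f"{group}: {this_year[group]} checkouts, "
          f"{change:+.0f}% compared with last year")
```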
David Loertscher’s Project Achievement <www.davidvl.org/achieve.html> suggests that data collection should be done at three levels in order to triangulate evidence: the Learner Level, the Teaching Unit Level, and the Organization Level, and he provides tools to do just that. He also suggests evaluating the impact of the library program in four areas: Reading, Collaborative Planning, Information Literacy, and Technology.
My suggestion is to pay careful attention to your building and district goals and annual objectives. If reading is a focus, then look at reading activities, promotions, collection development and circulation. If there is a focus on a particular demographic within your school, focus on it. Your own goals, and the accomplishment of them, can also provide an effective means of assessment.
Traditionally, school curriculum areas and programs have been formally evaluated on a five- to seven-year cycle. This approach has been replaced in many schools by some form of continuous improvement model. In other words, evaluation, and the actions those evaluations prompt, must be ongoing.
For the school library media program, some form of assessment should be conducted, analyzed, and reported several times during the school year. A simple survey, a compilation and analysis of usage numbers for a particular resource, or a report of units planned and taught can become an integral part of regular communication with staff and parents, and can then easily be aggregated into a final year-end report.
We can no longer afford to complete a program evaluation once every five years and have the results thrown in a drawer and never looked at until the next formal assessment. Our assessments need to help us improve our practice, to serve as indicators for our planning efforts, and to be an integral part of our communication efforts with our teachers, administrators, parents and communities. Assessment, of course, takes time, but less time than finding another job.
How are you “demonstrating your media program’s impact on student achievement”?