
Demonstrating Our Impact 1

Demonstrating Our Impact: Putting Numbers in Context Part 1
Media Matters column, Leading and Learning, 2006-07, #2

One of my favorite quotes comes from George Bernard Shaw: “We should all be obliged to appear before a board every five years and justify our existence...on pain of liquidation.” While Shaw was commenting on one’s social worth, his words today could come from any number of administrators, school boards and legislatures and be aimed directly at school library media specialists. Finding a persuasive answer to the question “How do we demonstrate our impact on student achievement?” is increasingly important for every library media specialist in the country.

The next two Media Matters columns will examine how media specialists are trying to answer this question. Technology readers, pay attention. If you are not already being asked how technology is having an impact, I can guarantee you will be!

***

I have long been frustrated in my search for a convincing means of answering the accountability question, especially when those asking want empirical rather than anecdotal evidence to support claims of program effectiveness. To me, genuine empirical evidence is the result of a controlled study, and no school has the ability or the will to do a controlled study on library effectiveness. Would your school:
•    Be willing to have a significant portion of its students (and teachers) go without library services and resources as part of a control group?
•    Be willing to wait three to four years for reliable longitudinal data?
•    Be willing to change nothing else in the school to eliminate all other factors that might influence test scores?
•    Be willing to find ways to factor out demographic data that may influence test results?
•    Be able to analyze a large enough sample to be considered statistically significant?
•    Be willing to provide the statistical and research expertise and manpower needed to make the study valid?

I know mine wouldn’t participate in such a study, no matter how clear-cut the evidence produced. So how do we demonstrate our impact using “numbers”? Let’s look at several approaches; none is perfect, but used in combination they can be powerful.

1. Standards and checklists. A common means of assessing a school library media program (and by inference assessing its impact on student learning) is by comparing an individual library media program to a state or national set of program standards. AASL’s Planning Guide for Information Power: Building Partnerships for Learning with School Library Media Program Assessment Rubric for the 21st Century (ALA, 1999) is one example of a tool that can be used to do such a comparison. Many states also have standards that can be used to evaluate library media programs. Minnesota’s, for example, can be found at <www.memoweb.org/htmlfiles/linkseffectiveslmp.html>.

Both AASL and MEMO use rubrics that allow media specialists to evaluate their programs quickly. For example, MEMO’s “Standard One” under the Learning and Teaching section asks “Is the program fully integrated?” and gives these levels (see the sketch after the rubric):

Minimum: 25-50% of classes use the media program’s materials and services the equivalent of at least once each semester.
Standard: 50-100% of classes use the media program’s materials and services the equivalent of at least once each semester. The media specialist is a regular member of curriculum teams. All media skills are taught through content-based projects.
Exemplary: 50-100% of classes use the media program’s materials and services the equivalent of at least twice each semester. Information literacy skills are an articulated component of a majority of content area curricula.
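As a worked example, the thresholds in that rubric are simple enough to express as a small decision rule. The sketch below (in Python) is my paraphrase of MEMO's Standard One, not an official tool; the field names and simplifications are mine, and the real rubric weighs more than these two or three numbers.

```python
# Rough sketch of MEMO's Standard One thresholds as a decision rule.
# Field names and cut-offs are a simplification; consult the actual rubric.

def integration_level(pct_classes_using: float, uses_per_semester: float,
                      info_literacy_in_curricula: bool) -> str:
    """Return a rubric level for how fully the media program is integrated."""
    if pct_classes_using >= 50 and uses_per_semester >= 2 and info_literacy_in_curricula:
        return "Exemplary"
    if pct_classes_using >= 50 and uses_per_semester >= 1:
        return "Standard"
    if pct_classes_using >= 25 and uses_per_semester >= 1:
        return "Minimum"
    return "Below minimum"

# Example: 60% of classes use the program about once a semester.
print(integration_level(60, 1, info_literacy_in_curricula=False))  # Standard
```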

Standards can and should be used to help evaluate a program, but the direct link between meeting such standards and local student achievement is missing. Though backed by research, best practices, and the experience of the standards writers (who are usually experts in the field), these tools can only suggest what may make a local program more effective; they cannot demonstrate that the current program is having an impact. Standards, while important, are guides, not evidence.

2. Research studies. The Colorado studies are a good example of using statistical regression analysis to look for correlations between variables; in the case of statewide library studies, the relationship between effective library programs and standardized test scores is examined. School Libraries Work (Scholastic, 2006) <www.scholastic.com/librarians/printables/downloads/slw_2006.pdf> is an excellent summary of this type of research. These studies can and should be discussed with principals, not just placed in their mailboxes. Some statisticians are skeptical of regression analyses because they show correlation, not causation, and because it is very difficult to factor out other variables that may have influenced the relationship.
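To make the correlation-not-causation point concrete, here is a minimal sketch of the kind of calculation that sits behind these studies. The numbers are entirely made up for illustration; the point is that the statistic only measures how strongly two school-level variables move together.

```python
# Minimal sketch: Pearson correlation between two hypothetical school-level
# variables. The figures below are invented, not from any actual study.
from statistics import correlation  # available in Python 3.10+

lms_hours_per_week = [10, 15, 20, 25, 30, 35, 40]   # library media staffing
avg_test_scores    = [61, 64, 63, 70, 72, 71, 78]   # standardized test scores

r = correlation(lms_hours_per_week, avg_test_scores)  # Pearson's r, -1 to 1
print(f"Pearson r = {r:.2f}")

# A high r says only that the two variables rise together; it cannot rule out
# other explanations (funding, demographics) -- correlation, not causation.
```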

Other formal individual research studies and meta-analyses are also worth sharing with administrators. Stephen Krashen’s The Power of Reading (2nd edition) persuasively stacks up a large number of individual research reports to demonstrate that voluntary free reading can improve student reading ability, and he concludes that when students have access to a wide range of reading resources (in libraries, of course), they do more independent reading.

Unfortunately, just as all politics is local, so is all assessment. While decision-makers are usually quite willing to read and acknowledge studies done “elsewhere,” most still want to know the direct impact of their own local program.


3. Counting things. Year-end reports that include circulation statistics, library usage, and collection size data are a common way for building-level library programs to demonstrate the degree to which they are being used and, by inference, the impact they are having on the school’s educational program.

The Ontario Library Association’s Teacher Librarian Toolkit for Evidence Based Practice <accessola.com/osla/toolkit/home.html> contains a number of forms that can be used to track circulation and instances of collaboration. Jacquie Henry provides a tool for tracking media center usage in the January 2006 issue of Library Media Connection.

Our district’s “Year End Report” asks library media specialists to enumerate the following (a rough sketch of how some of these counts might be tallied follows the list):

Circulation statistics:
Number of print materials circulated
AV materials circulated
In-library circulation of print
In-library circulation of AV materials
AV equipment circulated

Use of space:
Classes held/hosted
Drop in users
Computer lab
After hours
Other uses

Collections:
Number of books acquired and deleted
Number of AV materials acquired and deleted
Number of software programs acquired and deleted

Leadership team activities: (List any building/district committees on which you have served and your role on them.)

Instructional activities:
For primary, please list for each grade level library units taught that support classroom units and major skills taught.
For secondary, please list all units taught collaboratively and skills for which you had major responsibility for teaching.

Special programs or activities: (in-services, reading promotions, authors, events)
Please share a minimum of three instructional highlights for the past year. This is very helpful when concrete examples of media/tech services are needed.

Communications: (Please list how you have communicated with parents, staff and students this year.)
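
Most of these figures can be pulled straight from the circulation system rather than tallied by hand. Here is a minimal sketch, assuming the automation system can export a CSV of transactions; the column names ("item_type", "location") and categories are hypothetical and would need to match whatever your system actually produces.

```python
# Sketch: tally year-end circulation counts from a circulation-system export.
# The CSV layout (columns: item_type, location) is hypothetical; adapt it to
# the fields your automation system actually exports.
import csv
from collections import Counter

def circulation_summary(csv_path: str) -> Counter:
    counts = Counter()
    with open(csv_path, newline="") as f:
        for row in csv.DictReader(f):
            if row["item_type"] == "print":
                key = "in-library print" if row["location"] == "in-library" else "print circulated"
            elif row["item_type"] == "av":
                key = "in-library AV" if row["location"] == "in-library" else "AV circulated"
            else:  # assume anything else is equipment
                key = "AV equipment circulated"
            counts[key] += 1
    return counts

# Example usage:
# for category, n in circulation_summary("circ_2006-07.csv").items():
#     print(f"{category}: {n}")
```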

There is a movement away from counting things (materials, circulation, online resource uses, website hits, individual student visits, whole-class visits, and special activities such as tech fairs and reading promotions) and toward enumerating instructional activities: booktalks given, skill lessons taught, teacher in-services provided, pathfinders/bibliographies created, and collaborative units conducted. Administrators are less concerned about how many materials are available and more concerned about how they are being used.

Information and technology literacy skill attainment, if assessed and reported, is another means of “counting” one’s impact. Our elementary library media specialists have primary responsibility for teaching these skills and complete sections of student progress reports similar to those done in math and reading. At the building level, it is possible for the library media specialist to make a statement like: “89% of 6th grade students have demonstrated mastery of the district’s information literacy benchmarked skills.”
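
The arithmetic behind a statement like that is simple. Here is a minimal sketch, assuming a (hypothetical) record of which benchmarked skills each student has demonstrated; in practice the data would come from the progress-report or gradebook system, and the skill names below are placeholders.

```python
# Sketch: percentage of students who have demonstrated all benchmarked skills.
# Skill names and student records are hypothetical placeholders.

BENCHMARKS = {"define task", "locate sources", "evaluate sources", "cite sources"}

students = {
    "student_001": {"define task", "locate sources", "evaluate sources", "cite sources"},
    "student_002": {"define task", "locate sources"},
    "student_003": {"define task", "locate sources", "evaluate sources", "cite sources"},
}

mastered = sum(1 for skills in students.values() if BENCHMARKS <= skills)
pct = 100 * mastered / len(students)
print(f"{pct:.0f}% of students have demonstrated mastery "
      f"of the district's information literacy benchmarks.")  # 67%
```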

***

In the next Media Matters column, we’ll look at the use of surveys, why we should still gather anecdotal evidence, how to put numbers in context, and where we should focus our accountability efforts. Stay tuned!
 

Posted on Wednesday, June 13, 2007 at 02:50PM by Doug Johnson | 1 Comment



Reader Comments (1)

Don’t we run the risk of leaving talented kids behind when we put them, i.e. students, on alert at a certain time to justify their existence? I’m not a believer that testing is an indication of any one person’s level of learning. For instance, I think I’m brilliant, but put a test in front of me and my mind shuts down. Technology is amazing and has impacted our culture and allowed the human mind to broaden its horizons of learning.

April 23, 2008 | Unregistered CommenterHolly
