There’s Little National Data About School Librarians. What Happened?

By Keith Curry Lance on March 16, 2018

From a national perspective, school libraries are largely data-less at the moment—a state of affairs that would be utterly unthinkable to public or academic library leaders.

Here’s why: The National Center for Education Statistics (NCES) Common Core of Data staffing counts used in my SLJ article “School Librarian, Where Art Thou?” are the only data about school libraries that the federal government collects from all schools and districts. That’s the sole source we’ve had since 2012, the final year of two other much more detailed national efforts to collect information on school libraries and librarians.

School library advocates need solid data. Changing this status quo should be a priority.

Of those two surveys, the more substantial, longer-running effort was the Survey of School Library Media Centers (SSLMC) conducted as part of the Schools and Staffing Surveys (SASS) program of the NCES. Beginning in 1993–94, the SSLMC was one of SASS’s sample surveys of districts, schools, and educators, with subsequent surveys in 1999–2000, 2003–04, 2007–08, and 2011–12. These five comprehensive surveys included questions on facilities and policies, staffing, technology and information literacy, and school library collections and expenditures.

The American Association of School Librarians (AASL) conducted six annual surveys with voluntary participants from 2007 to 2012. These addressed similar topics: staff activities, hours and staffing, collection size, technology, visits, and expenditures. The final survey included a supplementary report about filtering.

Why did they end? For similar reasons. While both organizations provided free reports summarizing overall survey results, neither offered online tools to facilitate free, timely access to individual library data—the kind of comparative information that might have informed school librarians and decision-makers.

The reasons why access wasn’t available differed, but the results were the same. People participate in surveys on a “what’s in it for me” basis. Once participants figure out that there isn’t enough in it for them from their viewpoint, they tend to bow out and invest their time and energy elsewhere.

The SASS system also contained a fundamental flaw: Most of those surveys combined institutional questions about the library with personal questions directed to librarians themselves, and so the entire data set was subject to NCES’s stringent privacy policies.

Also, anyone who wished to use the data files had to purchase a license from NCES and meet strict requirements for maintaining data security and privacy.

I was part of one effort by school library folks to try to explain to NCES why this situation was untenable. We asked whether it would be possible to separate the institutional data from the personal and offer a public-use file. The answer was no.

As with so many large-scale surveys, the timeliness of the SASS data was another obstacle to its use. The SASS surveys were mostly conducted at four-year intervals, and it usually took as many years to see reporting on the results. In short, by the time even summary data came out, it was considered too old to be very useful. Thus, potential data users were discouraged by at least two factors: the inaccessibility of case-by-case data and the untimely releases of summary data.

The reasons that the AASL School Libraries Count Survey ended are less clear. Certainly, case-by-case data access was also an issue, although summary data reports were released on a timelier basis. However, beginning in 2008, the survey’s second year, the number of responses dropped steadily each year. Like many associations that collect data from their members and others, AASL regarded this data as proprietary.

As problematic as these school library surveys were, they were at least something.

The bottom line is that unlike public and academic libraries, school libraries have access to neither a methodologically solid federal census or sample survey nor an association-based survey that collects voluntary data.

The SASS system’s replacement—the greatly simplified National Teacher and Principal Survey (NTPS)—includes staffing questions that mostly duplicate the CCD counts of librarians and support staff. Like SASS, NTPS is only being conducted as a sample survey, not a national census of schools, and asks for full- and part-time head counts rather than full-time equivalents. Currently, there is neither a public-use data file nor a report on the first NTPS, for the 2015–16 school year. The practice of releasing only restricted-use files continues, despite its obvious hindrance of data access and use.

Ideally, NCES should establish a new, more viable survey conducted regularly and reported in a timely way. Intuitive tools facilitating access to school-by-school data would improve the odds of long-term success. Establishing a new mandate for NCES will require considerable political effort and, very likely, new funding. Our work is cut out for us.


Keith Curry Lance studies the evolving profession of school librarianship.