
LIBRARY ASSESSMENT CONFERENCE

BUILDING EFFECTIVE, SUSTAINABLE, PRACTICAL ASSESSMENT

SEPTEMBER 25-27, 2006  CHARLOTTESVILLE, VIRGINIA

ASSOCIATION OF RESEARCH LIBRARIES


Library Assessment Conference Building Effective, Sustainable, Practical Assessment

Charlottesville, Virginia September 25-27, 2006

Welcome
September 15, 2006

Dear colleagues,

Welcome to the Library Assessment Conference: Building Effective, Sustainable, Practical Assessment. We are delighted to hold our first conference in historic Charlottesville. When Jefferson donated his books to the Library of Congress, he wrote, "I ask of your friendship, therefore, to make for me the tender of it to the Library Committee of Congress, not knowing myself of whom the Committee consists." In a similar fashion, we ask for your friendship in making sure that this effort succeeds.

We are thrilled by the overwhelming response to the conference, strong evidence of the blossoming of community awareness regarding library assessment. You will join more than 200 people participating in a rich three-day program of workshops, tours, engaging plenary speakers, useful concurrent and poster sessions, and many opportunities for informal discussion. The conference focuses on practical assessment that can be used to improve library service and includes sessions on customer surveys, focus groups, learning outcomes, organizational climate surveys, performance metrics, evaluating electronic services and resources, and related marketing and management issues.

Your commitment to library assessment is critical to demonstrating the impact and connection of the library to the research, teaching, and learning process. A library assessment program can only be as strong as our own ability to learn and adapt to the new challenges confronting our organizations. We hope that this conference is a catalyst that helps all of us address these challenges.

We want you to be forthcoming and candid with your assessment of our Library Assessment Conference; after all, it is an old saying that "the wisest of the wise may err" (Aeschylus). A learning community of practitioners interested in library assessment may aspire to be the wisest of the wise. Please join us as we learn, build community, and also make library assessment fun -- wise people know how important that is!

Steve Hiller, University of Washington, Co-Chair
Martha Kyrillidou, Association of Research Libraries, Co-Chair
Jim Self, University of Virginia, Co-Chair

And the rest of the Conference Planning Committee:
Francine DeFranco, University of Connecticut
Brinley Franklin, University of Connecticut
Richard Groves, Association of Research Libraries
Lisa Janicke Hinchliffe, University of Illinois at Urbana-Champaign
Joan Stein, Carnegie Mellon University
Lynda White, University of Virginia
Co-sponsored by Association of Research Libraries, University of Virginia Library and the University of Washington Libraries


Thank You to Our Reception Sponsors


Omni Floor Map


Program At A Glance

MONDAY

Workshops, 9:00-12:00
  Salon A: Joe Zucca, Data Analysis and Presentation
  Salon B: Neal Kaske, Introduction to Survey Analysis
  James Monroe: Colleen Cook, Introduction to Focus Groups and Other Qualitative Methods

Welcome & Opening, 1:00-2:00
  Steve Hiller, Martha Kyrillidou, Jim Self
  Speaker: Duane Webster, Executive Director, Association of Research Libraries

Plenary I, 2:00-3:00
  John Lombardi, Chancellor, University of Massachusetts, Amherst: Library Performance Measures That Matter

Break, 3:00-3:30

Parallel 1, 3:30-5:00
  Service Quality Assessment (Salon A): Heath (panel): LibQUAL+, ProSeBiCA (Development of New Library Services by Means of Conjoint Analysis), and CAPM (Comprehensive Access to Printed Materials); Thompson: How You Can Evaluate the Integrity of Your Library Service Quality Assessment Data: Intercontinental LibQUAL+ Analyses Used in Concrete Heuristic Examples
  Qualitative Approaches I (Salon B): Tatarka: Wayfinding in the Library: Usability Testing of Physical Spaces; Jantti: Assessing the Service Needs and Expectations of Customers; Holtze: Approaches to Analyzing Qualitative Research
  Building Assessment in our Libraries I (Ashlawn & Highlands): Hinchliffe: Using Surveys to Begin an Assessment Initiative; Becher: How a Symbiotic Relationship Between Assessment and Marketing Moves the Library Forward; Forrest: Assessment in the Emory University Libraries: Lurching toward Sustainability

Parallel 2, 5:00-6:00
  LibQUAL+ Follow-up (Salon A): Clark: Practical Assessment at Texas A&M: Using LibQUAL+ Comments to Enhance Reference Services; Duffy: Getting Our Priorities in Order: Are Our Service Values in Line with the Communities We Serve?
  Qualitative Approaches II (Salon B): Dimmock (panel): Meliora: The Culture of Assessment at University of Rochester's River Campus Libraries
  Building Assessment in our Libraries II (Ashlawn & Highlands): Ackermann: Using Effect Size Meta-Analysis to Get the Most Out of the Library-Related Survey Data; Myers (panel): Developing an Integrated Approach to Library and Information Technology Assessment

Posters/Drinks, 6:00-7:30: Posters in Preston; Open Bar in Atrium (6:15-7:15)

Dinner, 7:30-9:00: Speaker: Brinley Franklin, Vice Provost for University Libraries, University of Connecticut

Après-Dinner, 9:00-11:00: Drinks and Conversation with Library Luminaries


TUESDAY

Breakfast, 7:30-9:00: Continental Breakfast; Final Poster Viewing in Preston

Parallel 3, 9:00-10:30
  Information Literacy I (Salon A): Radcliff: Using the SAILS Test to Assess Information Literacy; Smith (panel): Scenario-Based ICT Literacy Assessment: A New Tool for Evaluating the Effectiveness of Library Instructional Programs
  Moving Assessment Forward (Salon B): Beck: Data Policy Action: The Continuous Improvement Cycle Cases from ARL and Carnegie MA I Libraries; Lakos: Evidence Based Library Management: A View to the Future; Self: Keys to Effective, Sustainable, and Practical Assessment

Break, 10:30-11:00

Parallel 4, 11:00-12:00
  Information Literacy II (Salon A): Fluk: The Fourth "R": Information Literacy in Institutional Assessment; Oakleaf: The Right Assessment Tool for the Job: Seeking a Match Between Method and Need
  Evaluation and Assessment Methods (Salon B): Kaske: Choosing the Best Tools for Evaluating Your Library; McClure: Developing Best Fit Library Evaluation Strategies
  Strategic Planning (Ashlawn & Highlands): O'Mahony: Accountability to Key Stakeholders; Saunders: Drilling the LibQUAL+ Data for Strategic Planning

Lunch, 12:00-1:15

Plenary II, 1:30-2:30
  Cathy De Rosa, Vice President, Marketing and Library Services, OCLC: Changing User Needs and Perceptions

Break, 2:30-3:00

Parallel 5, 3:00-5:00
  Library as Place (Salon A): Lippincott: Assessing Learning Spaces: A Framework; Lewellen: Combining Qualitative and Quantitative Assessment of an Information Common; Sweetman: The Role of Assessment in Renovation to Meet User Need; Shrimplin: Net Generation College Students and the Library as Place
  Balanced Scorecard (Salon B): Matthews: Balanced Scorecards in Public Libraries: A Project Summary; Pathak: The People Side of Planning & Implementing a Large Scale Balanced Scorecard Initiative; Tolson (panel): Staff Involvement in the Balanced Scorecard
  Assessing Organizational Climate (Ashlawn & Highlands): Baughman (panel): From Organizational Assessment to Organizational Change: The University of Maryland Experience; Lillard: Diversity and Organizational Culture Survey: Useful Methodological Tool or Pandora's Box; Slight-Gibney: Looking In and Looking Out: Assessing Our Readiness to Embrace the Future

Reception, 6:00-8:00: Held at the UVa Harrison Institute & Small Special Collections Library; Speaker: Karin Wittenborg, University Librarian, University of Virginia

Post-Reception, 8:30-10:30: Drinks and Conversation with Library Luminaries


WEDNESDAY

Breakfast, 7:30-9:00: Continental Breakfast

Plenary III, 9:00-10:00
  Paul Hanges, Professor, Industrial and Organizational Psychology, University of Maryland: Organizational Diversity and Climate Assessment

Break, 10:00-10:30

Parallel 6, 10:30-12:00
  Organizational Culture/Learning (Salon A): Currie: Assessing Organizational Culture: Moving Towards Organizational Change and Renewal; Belanger: Tools for Creating a Culture of Assessment: The CIPP Model and Utilization-Focused Evaluation; Dillon: The Use of Outcome Based Evaluation (OBE) to Assess Staff Learning Activities
  Digital Library (Salon B): Jeng: Usability Assessment of Academic Digital Libraries; Poe: Listening to Users: Creating More Useful Digital Library Tools and Services by Understanding the Needs of User Communities; Manoff: Finding Useful and Practical Ways to Combine Electronic Resource Usage Data from Multiple Sources
  Value and Impact (James Monroe): Aerni: Contingent Valuation of Libraries: Examples from Academic, Public and Special Libraries; Snead: Demonstrating Library Value Through Web-Based Evaluation Instructional Systems; Town: Value and Impact Measurement: A UK Perspective and Progress Report on a National Programme (VAMP)

Lunch & Closing, 12:30-1:30: Speaker: Betsy Wilson, Dean of University Libraries, University of Washington

Workshops, 2:00-5:00
  Salon A: Joe Zucca, Data Analysis and Presentation
  Salon B: Neal Kaske, Introduction to Survey Analysis
  James Monroe: Colleen Cook, Introduction to Focus Groups and Other Qualitative Methods

Library Assessment Conference Building Effective, Sustainable, Practical Assessment

Charlottesville, Virginia September 25-27, 2006

Sunday, September 24
3:00 p.m. - 8:00 p.m.  Early Conference Registration, Hotel Lobby
  University of Virginia Library staff will be available to answer questions about Charlottesville and to provide dinner recommendations.
3:50 p.m.  Monticello/Jefferson Library Tour meets in Atrium
4:00 p.m.  Buses leave for Monticello
7:30 p.m.  Buses leave Monticello for Omni
8:00 p.m.  Dinner on your own in the Downtown Mall


Monday, September 25
8:00 a.m. - 6:00 p.m.  Conference Registration Open, Salon A/B Prefunction Area
9:00 a.m. - 12:00 noon  Preconference Workshops
  Data Analysis and Presentation with Joe Zucca (Salon A)
  Introduction to Survey Analysis with Neal Kaske (Salon B)
  Introduction to Focus Groups and Other Qualitative Methods with Colleen Cook (James Monroe)

12:00 noon - 1:00 p.m.  Lunch on your Own
1:00 p.m. - 2:00 p.m.  Welcome & Opening, Salon A/B
  Conference Co-Chairs: Steve Hiller, Jim Self, and Martha Kyrillidou
  Speaker: Duane Webster, Executive Director, Association of Research Libraries

2:00 p.m. - 3:00 p.m.  Plenary I, Salon A/B
  John Lombardi, Chancellor, University of Massachusetts-Amherst
  Library Performance Measures That Matter
3:00 p.m. - 3:30 p.m.  Break
3:30 p.m. - 5:00 p.m.  Parallel Session 1
  Service Quality Assessment (Salon A)
    LibQUAL+, ProSeBiCA (Development of New Library Services by Means of Conjoint Analysis), and CAPM (Comprehensive Access to Printed Materials)
    Panel: Fred Heath, Colleen Cook, Martha Kyrillidou, Bettina Koeper, Reinhold Decker, and Sayeed Choudhury
    How You Can Evaluate the Integrity of Your Library Service Quality Assessment Data: Intercontinental LibQUAL+ Analyses Used in Concrete Heuristic Examples
    Bruce Thompson, Martha Kyrillidou, and Colleen Cook
  Qualitative Approaches I (Salon B)
    Wayfinding in the Library: Usability Testing of Physical Spaces
    Nancy J. Kress, David K. Larsen, Tod A. Olsen, and Agnes M. Tatarka


Monday, September 25
3:30 p.m. - 5:00 p.m.  Parallel Session 1 continued
  Qualitative Approaches I continued (Salon B)
    Assessing the Service Needs and Expectations of Customers: No Longer a Mystery
    Margie Jantti
    Frequently Noted: Approaches to Analyzing Qualitative Research
    Elizabeth Smigielski, Judy Wulff, and Terri Holtze
  Building Assessment in our Libraries I (Ashlawn & Highlands)
    Getting Started with Library Assessment: Using Surveys to Begin an Assessment Initiative
    Lisa Janicke Hinchliffe and Tina E. Chrzastowski
    A Leap in the Right Direction: How a Symbiotic Relationship Between Assessment and Marketing Moves the Library Forward
    Melissa Becher and Mary Mintz
    Assessment in the Emory Libraries: Lurching toward Sustainability
    Panel: Charles Forrest and Susan Bailey
5:00 p.m. - 6:00 p.m.  Parallel Session 2
  LibQUAL+ Follow-up (Salon A)
    Practical Assessment at Texas A&M: Using LibQUAL+ Comments to Enhance Reference Services
    Dennis T. Clark
    Getting Our Priorities in Order: Are Our Service Values in Line with the Communities We Serve?
    Jocelyn Duffy, Damon Jaggars, and Shanna Smith
  Qualitative Approaches II (Salon B)
    Meliora: The Culture of Assessment at University of Rochester's River Campus Libraries
    Panel: Nora Dimmock, Judi Briden, and Helen Anderson


Monday, September 25
5:00 p.m. - 6:00 p.m.  Parallel Session 2 continued
  Building Assessment in our Libraries II (Ashlawn & Highlands)
    Library Assessment on a Budget: Using Effect Size Meta-Analysis to Get the Most out of the Library-Related Survey Data Available across Campus
    Eric Ackermann
    Developing an Integrated Approach to Library and Information Technology Assessment
    Panel: Jill Glaser, Bill Myers, Ryan P. Papesh, John M. Stratton

6:00 p.m. - 7:30 p.m.  Poster Session & Drinks
  Posters in Preston; Bar in Atrium open 6:15-7:15
    Usage and Outcomes Evaluation of an Information Commons: A Multi-Method Pilot Study
      Rachel Applegate
    Issues in Establishing a Culture of Assessment in a Complex Academic Health Sciences Library
      Sally Bowler-Hill and Janis Teal
    Use of RFID Applications in Libraries
      Navjit Brar
    Statistics & Assessment: The Positive Effects at the Harold B. Lee Library of Brigham Young University
      Julene Butler and Brian Roberts
    Improving Library Services Using a User Activity Survey
      Alicia Estes
    Introducing the READ Scale: Qualitative Statistics for Academic Reference Services
      Bella Karr Gerlich and G. Lynn Berard
    Are the Needs and Wants the Same? Comparing Results from Graduate Student, Undergraduate Student, and Faculty Surveys
      Lisa Janicke Hinchliffe and Tina E. Chrzastowski
    Challenges Inherent in Assessing Faculty Productivity: A Meta-Analysis Perspective
      Sheila Curl Hoover
    Assessing the Research Impact of Electronic Journals at the University of Notre Dame
      Carol Branch, Carole Pilkinton, and Sherri Jones


Monday, September 25
6:00 p.m. - 7:30 p.m.  Poster Session & Drinks continued
  Posters in Preston; Bar in Atrium open 6:15-7:15
    Information Seeking Behavior of International Graduate Students vs. American Graduate Students: A User Study at Virginia Tech 2005
      Mary Finn
    Improving Annual Data Collecting: An Interactive Poster Session
      Linda Miller
    Using Collection Assessment to Determine Optimal Library Support for Doctoral Granting Academic Programs
      Tom Moothart
    A Time-Budget Study of the George Mason University Libraries Liaison Program
      James E. Nalen
    Methodological Diversity and Assessment Sustainability: Growing the Culture of Assessment at the University of Washington Libraries
      Maureen Nolan, Jennifer Ward, and Stephanie Wright
    Collecting the *Right* Data for Decision-Making
      Kimberly Burke Sweetman and Marybeth McCartin
    Creating On-Going, Integrated Assessment Efforts in Community College Libraries
      Mark Thompson
    Bribes Can Work: Ensuring Your Assessment Tool Is Not Ignored
      Luke Vilelle
    Assessing Library Instruction with Experimental Designs
      Scott White and Remi Castonguay
    Developing Personas for Evaluating Library Service Needs
      Dan Wilson
    Perceiving Perception: A Case Study of Undergraduates' Perception of Academic Libraries
      Steven Yates
7:30 p.m. - 9:00 p.m.  Dinner, Salon A/B
  Wine will be available for purchase by the bottle
  Speaker: Brinley Franklin, Vice Provost for University Libraries, University of Connecticut
9:00 p.m. - 11:00 p.m.  Drinks and Conversation with Library Luminaries, Downtown Mall Bars


Tuesday, September 26
7:30 a.m. - 9:00 a.m.  Continental Breakfast, Salon A/B
7:30 a.m. - 9:00 a.m.  Final Poster Viewing, Preston
8:00 a.m. - 12:00 noon  Registration Open, Salon A/B Prefunction Area
9:00 a.m. - 10:30 a.m.  Parallel Session 3
  Information Literacy I (Salon A)
    Using the SAILS Test to Assess Information Literacy
    Carolyn Radcliff and Joe Salem
    Scenario-Based ICT Literacy Assessment: A New Tool for Evaluating the Effectiveness of Library Instructional Programs
    Panel: Dr. Gordon Smith, Alexis Smith Macklin, and Dr. Mary M. Somerville
  Moving Assessment Forward (Salon B)
    Data Policy Action: The Continuous Improvement Cycle Cases from ARL and Carnegie MA I Libraries
    Susan J. Beck and Wanda V. Dole
    Evidence Based Library Management: A View to the Future
    Amos Lakos
    Keys to Effective, Sustainable, and Practical Assessment
    Steve Hiller, Martha Kyrillidou, and Jim Self

10:30 a.m. - 11:00 a.m.  Break
11:00 a.m. - 12:00 noon  Parallel Session 4
  Information Literacy II (Salon A)
    The Fourth "R": Information Literacy in Institutional Assessment
    Louis Fluk
    The Right Assessment Tool for the Job: Seeking a Match Between Method and Need
    Megan Oakleaf


Tuesday, September 26
11:00 a.m. - 12:00 noon  Parallel Session 4 continued
  Evaluation and Assessment Methods (Salon B)
    Choosing the Best Tools for Evaluating Your Library
    Neal Kaske
    Developing Best Fit Library Evaluation Strategies
    Charles R. McClure, John Carlo Bertot, Paul T. Jaeger, and John T. Snead
  Strategic Planning (Ashlawn & Highlands)
    Accountability to Key Stakeholders
    Raynna Bowlby and Daniel O'Mahony
    Drilling the LibQUAL+ Data for Strategic Planning
    Stewart Saunders

12:00 noon - 1:15 p.m.  Luncheon (No Speaker), Atrium
1:30 p.m. - 2:30 p.m.  Plenary II, Salon A/B
  Cathy De Rosa, Vice President, Marketing & Library Services, OCLC
  Changing User Needs and Perceptions
2:30 p.m. - 3:00 p.m.  Break
3:00 p.m. - 5:00 p.m.  Parallel Session 5
  Library As Place (Salon A)
    Assessing Learning Spaces: A Framework
    Joan K. Lippincott
    Combining Quantitative and Qualitative Assessment of an Information Common
    Gordon Fretwell and Rachel Lewellen
    Listening to Users: The Role of Assessment in Renovation to Meet User Need
    Kimberly Burke Sweetman and Lucinda Covert-Vale
    Net Generation College Students and the Library as Place
    Aaron K. Shrimplin and Matthew Magnuson


Tuesday, September 26
3:00 p.m. - 5:00 p.m.  Parallel Session 5 continued
  Balanced Scorecard (Salon B)
    Balanced Scorecards in Public Libraries: A Project Summary
    Joe Matthews
    The People Side of Planning & Implementing a Large Scale Balanced Scorecard Initiative
    Susanna Pathak
    Yours, Mine, and Ours: Staff Involvement in the Balanced Scorecard
    Panel: Leland Deeds, Tabzeera Dosu, Laura Miller, Paul Rittelmeyer, Annette Stalnaker, Donna Tolson, and Carol Hunter
  Assessing Organizational Climate (Ashlawn & Highlands)
    From Organizational Assessment to Organizational Change: The University of Maryland Library Experience
    Panel: Sue Baughman, Johnnie Love, Charles B. Lowry, and Maggie Sopanaro
    Diversity and Organizational Culture Survey: Useful Methodological Tool or Pandora's Box
    Laura Lillard
    Looking In and Looking Out: Assessing Our Readiness to Embrace the Future
    Nancy Slight-Gibney

5:00 p.m. - 6:30 p.m.  Buses Depart Omni for University of Virginia; Meet in Atrium
  Please gather for the buses at the Mall entrance of the Atrium. For those who would like to visit the UVa Lawn or view exhibits at the Harrison Institute/Small Special Collections Library, the buses will leave at 5:00 p.m. Doors to the reception will not open until 6:00 p.m.
6:00 p.m. - 8:00 p.m.  Reception at Harrison Institute/Small Special Collections Library
  Speaker: Karin Wittenborg, University Librarian, University of Virginia
7:45 p.m. - 8:20 p.m.  Buses Depart University of Virginia for Omni
  The bus pick-up location will be the same as the drop-off, in front of the Harrison Institute/Small Special Collections Library. Look for the buses with Library Assessment Conference signs returning to the Omni.
8:30 p.m. - 10:30 p.m.  Drinks and Conversation with Library Luminaries, Downtown Mall Bars


Wednesday, September 27
7:30 a.m. - 9:00 a.m.  Registration Open, Salon A/B Prefunction Area
7:30 a.m. - 9:00 a.m.  Continental Breakfast, Salon A/B
9:00 a.m. - 10:00 a.m.  Plenary III, Salon A/B
  Paul Hanges, Professor, Industrial and Organizational Psychology, University of Maryland
  Organizational Diversity and Climate Assessment
10:00 a.m. - 10:30 a.m.  Break
10:30 a.m. - 12:00 noon  Parallel Session 6
  Organizational Culture/Learning (Salon A)
    Assessing Organizational Culture: Moving towards Organizational Change and Renewal
    Lyn Currie and Carol Shepstone
    Tools for Creating a Culture of Assessment: The CIPP Model and Utilization-Focused Evaluation
    Yvonne Belanger
    The Use of Outcome Based Evaluation (OBE) to Assess Staff Learning Activities at University of Maryland Libraries
    Irma F. Dillon and Maggie Sopanaro
  Digital Library (Salon B)
    Usability Assessment of Academic Digital Libraries
    Judy Jeng
    Listening to Users: Creating More Useful Digital Library Tools and Services by Understanding the Needs of User Communities
    Felicia Poe
    All That Data: Finding Useful and Practical Ways to Combine Electronic Resource Usage Data from Multiple Sources
    Maribeth Manoff and Eleanor Read


Wednesday, September 27
10:30 a.m. - 12:00 noon  Parallel Session 6 continued
  Value and Impact (James Monroe)
    Contingent Valuation of Libraries: Examples from Academic, Public and Special Libraries
    Sarah Aerni and Donald W. King
    Demonstrating Library Value Through Web-Based Evaluation Instructional Systems
    Charles R. McClure, John Carlo Bertot, Paul T. Jaeger, and John T. Snead
    Value and Impact Measurement: A UK Perspective and Progress Report on a National Program (VAMP)
    Stephen Town

12:00 noon - 12:30 p.m.  Break on your Own
12:30 p.m. - 1:30 p.m.  Luncheon, Salon A/B
  Speaker: Betsy Wilson, Dean of University Libraries, University of Washington
  Closing: Steve Hiller
1:30 p.m. - 2:00 p.m.  Break
2:00 p.m. - 5:00 p.m.  Postconference Workshops
  Data Analysis and Presentation with Joe Zucca (Salon A)
  Introduction to Survey Analysis with Neal Kaske (Salon B)
  Introduction to Focus Groups and Other Qualitative Methods with Colleen Cook (James Monroe)


Monday, September 25, 2:00 p.m. - 3:00 p.m.

Plenary I

Salon A/B

Library Performance Measures That Matter

John Lombardi
Chancellor, University of Massachusetts-Amherst
Co-editor, The Center


John Lombardi will speak on what a university administrator needs to know from the library. He has been the major force behind The Center, which measures the performance of American research universities.


Monday, September 25 3:30-5:00

Parallel 1 #1
Service Quality Assessment

Salon A

LibQUAL+, ProSeBiCA (Development of New Library Services by Means of Conjoint Analysis), and CAPM (Comprehensive Access to Printed Materials) Panel: Fred Heath, Colleen Cook, Martha Kyrillidou, Bettina Koeper, Reinhold Decker and Sayeed Choudhury
Introduction by Fred Heath, Colleen Cook, and Martha Kyrillidou regarding LibQUAL+ and its relation to the methodologies described by the other panelists.

Bettina Koeper, ProSeBiCA (Development of New Library Services by Means of Conjoint Analysis)

In the context of changing educational environments, current discussions about the strategic development of academic libraries clearly show the need for a basic change in their self-conception: turning from mere academic institutions into service providers who actively design and offer services that fit users' needs and preferences. Increasingly, customer acceptance of library services is becoming a new quality standard that will have significant effects on libraries' status within their universities. But how can this acceptance be achieved? How can a library get a clear picture of user preferences in order to shape future services? The ProSeBiCA project, funded by the German Research Foundation (DFG) and carried out by the Chair of Marketing at Bielefeld University and Bielefeld University Library, tries to answer that question by adapting conjoint analysis, a marketing research method, to the library context. Unlike surveys that evaluate existing services, conjoint analysis is a preference-measurement tool that yields detailed knowledge of users' requirements for services already available as well as potential ones. ProSeBiCA thus aims at a comprehensive analysis and simulation framework for academic libraries that enables well-founded strategic planning of future service design. Its portability will be tested in cooperation with the University of Cottbus. An intensive exchange of information with the Sheridan Libraries of Johns Hopkins University completes the project and has led to further cooperation involving CAPM and LibQUAL+.

Sayeed Choudhury, CAPM (Comprehensive Access to Printed Materials)

As part of JHU's CAPM Project (http://ldp.library.jhu.edu/projects/capm), Choudhury led the development of an evaluation methodology using multi-attribute, stated-preference techniques. Multi-attribute, stated-preference methods feature choice experiments to gather data for modeling user preferences. In the choice experiments, often expressed as surveys, subjects state which alternatives (services or features) they most prefer; the alternatives are distinguished by their attributes. This approach was used to consider trade-offs in varying attributes for a specific service: access to materials in an off-site shelving facility. Patrons were asked to choose among varying delivery times, access to images, and the ability to search full text, along with differing (hypothetical) fees for each of the attributes. During the 2002 JISC/CNI Conference, Choudhury raised the possibility of integrating the LibQUAL+ and CAPM assessment methodologies. LibQUAL+ helps identify gaps in a wide range of library services, but the question of priorities among the gaps is not immediately addressed. The CAPM methodology explicitly explores patrons' preferences or choices for implementing a particular library service. Given the different but complementary areas of emphasis and the different theoretical underpinnings, there seemed to be potential for an integrated, more comprehensive approach.

Fundamentally, the 'outputs' of a LibQUAL+ analysis can provide the 'inputs' for a multi-attribute, stated-preference analysis, which acknowledges the need for trade-offs when making decisions about resource allocation. Even with this promising idea, there was arguably too large a difference in the levels of granularity between the two methodologies. Bielefeld's ProSeBiCA provides the appropriate bridge.
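The abstract describes the choice experiments only at a high level. As a rough, hypothetical illustration of how stated-preference choice data of this kind can be analyzed, the Python sketch below simulates choice sets over invented attributes (delivery time, page images, full-text search, and a hypothetical fee) and fits a simple logit to recover the attribute weights. It is not the CAPM project's actual model; a full analysis would use a conditional or multinomial logit over the choice sets.

import numpy as np
import pandas as pd
import statsmodels.api as sm

# Hypothetical stated-preference data: each row is one alternative shown in a
# choice set; "chosen" marks the alternative the respondent said they preferred.
# Attribute names, levels, and the "true" weights below are all invented.
rng = np.random.default_rng(0)
rows = []
for _ in range(200):  # 200 simulated choice sets of three alternatives each
    alts = pd.DataFrame({
        "delivery_days": rng.integers(1, 8, size=3),      # 1-7 day delivery
        "page_images": rng.integers(0, 2, size=3),        # scanned images offered?
        "fulltext_search": rng.integers(0, 2, size=3),    # full-text search offered?
        "fee": rng.choice([0.0, 1.0, 3.0, 5.0], size=3),  # hypothetical fee in dollars
    })
    # Simulated preferences: faster, richer, cheaper alternatives tend to win.
    utility = (-0.5 * alts["delivery_days"] + 1.0 * alts["page_images"]
               + 1.5 * alts["fulltext_search"] - 0.4 * alts["fee"]
               + rng.gumbel(size=3))
    alts["chosen"] = (utility == utility.max()).astype(int)
    rows.append(alts)
data = pd.concat(rows, ignore_index=True)

# A plain logit on the chosen indicator roughly recovers the attribute weights;
# a full analysis would fit a conditional/multinomial logit over the choice sets.
X = sm.add_constant(data[["delivery_days", "page_images", "fulltext_search", "fee"]])
print(sm.Logit(data["chosen"], X).fit(disp=0).params.round(2))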

Fred Heath, Vice Provost and Director of Libraries, University of Texas
Colleen Cook, Dean and Director of the Texas A&M University Libraries
Martha Kyrillidou, Director of Statistics and Service Quality Programs
Bettina Koeper, ProSeBiCA (Development of New Library Services by Means of Conjoint Analysis)
Reinhold Decker, Professor, Department of Business Administration and Economics, Bielefeld University, Germany
Sayeed Choudhury, CAPM (Comprehensive Access to Printed Materials)


Monday, September 25 3:30-5:00

Parallel 1 #1
Service Quality Assessment

Salon A

How You Can Evaluate the Integrity of Your Library Service Quality Assessment Data: Intercontinental LibQUAL+ Analyses Used in Concrete Heuristic Examples Bruce Thompson, Martha Kyrillidou, and Colleen Cook

This user-friendly, conversational presentation explains how you can evaluate the integrity, or trustworthiness, of library service quality assessment data. Using the metaphor of a bathroom scale, the ideas underlying (a) score reliability and (b) score validity are presented in an accessible manner. The use of SPSS software to compute the related statistics is illustrated. LibQUAL+ data are used in heuristic examples to make the discussion concrete, but the illustrations apply to new as well as other measures of library service quality.

Martha Kyrillidou, Director of Statistics and Service Quality Programs
Bruce Thompson, Distinguished Professor of Educational Psychology and CEHD Distinguished Research Fellow, and Distinguished Professor of Library Science, Texas A&M University, and Adjunct Professor of Family and Community Medicine, Baylor College of Medicine (Houston)
Colleen Cook, Dean and Director of the Texas A&M University Libraries
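The presentation itself demonstrates these computations in SPSS. Purely as a hedged illustration of one of the statistics involved, the short Python sketch below estimates score reliability (Cronbach's alpha) from a small, invented matrix of item ratings; the data, scale, and function name are hypothetical and are not drawn from LibQUAL+.

import numpy as np

def cronbachs_alpha(scores):
    """Internal-consistency reliability for a respondents-by-items score matrix."""
    scores = np.asarray(scores, dtype=float)
    n_items = scores.shape[1]
    item_variances = scores.var(axis=0, ddof=1)       # variance of each item
    total_variance = scores.sum(axis=1).var(ddof=1)   # variance of respondents' totals
    return (n_items / (n_items - 1)) * (1 - item_variances.sum() / total_variance)

# Invented perceived-service ratings (1-9 scale): 5 respondents, 4 items.
ratings = [
    [7, 8, 7, 6],
    [5, 6, 5, 5],
    [8, 9, 8, 8],
    [4, 5, 4, 5],
    [6, 7, 6, 6],
]
print(f"Cronbach's alpha = {cronbachs_alpha(ratings):.2f}")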


Monday, September 25 3:30-5:00

Parallel 1 #3
Qualitative Approaches I

Salon B

Wayfinding in the Library: Usability Testing of Physical Spaces

Nancy J. Kress, David K. Larsen, Tod A. Olsen, and Agnes M. Tatarka

It's common for libraries to evaluate the usability of online information systems by asking subjects to think out loud while they perform such tasks as finding a book title or journal article. This same technique can be adapted to determine how well individuals are able to find and retrieve information in the physical environment. This paper discusses the benefits and utility of even small-scale wayfinding studies as a tool for highlighting barriers to the use of library collections. A primary benefit is that wayfinding studies allow librarians and other staff to see their library space through new eyes and better understand how difficult it can be for novice users to find library materials. This knowledge can inform efforts to reconfigure libraries and underscore areas for instruction. Wayfinding studies can also evaluate the effectiveness of library orientation programs and directional aids.

An example illustrating the utility of wayfinding studies is a recent assessment at the University of Chicago Library. When the library participated in the spring 2004 LibQUAL+ survey, it received many comments from users that books were frequently not on the shelf. However, a follow-up study showed that over a fifth of the books patrons reported being unable to find were in fact in place on the shelf. So, in the spring of 2005, a team at the University of Chicago Library undertook a study to help identify the reasons users were having trouble finding books in the Regenstein Library, a large building housing over 4 million volumes in the humanities and social sciences in open bookstacks.

To discover failure points in the book retrieval process, users were directly observed throughout the process, from finding the catalog record and recording the necessary information to using online and printed guides and maps to navigate their way to the correct shelf. Subjects were recruited from first-year students who were asked to complete a brief online questionnaire on their library use. Subjects were selected who had little history of checking out material from the library in order to best approximate the new-user experience. Subjects were given full bibliographic citations to three books and asked to follow the "think-out-loud" protocol while conducting the catalog search, interpreting the record, and locating the books in the collections. As subjects attempted to complete their tasks, they were monitored by a facilitator and a note taker from the study team.

The study revealed problems with the terms used to designate locations, the arrangement of physical collections, the lack of an effective map and signage system, and the failure to distinguish between reference and circulating collections, and it highlighted difficulties in reading call numbers. This compelling information is being used to reconfigure spaces, improve directional aids, and inform our orientation and instruction programs, changes which will be assessed for effectiveness through additional wayfinding studies.

Co-sponsored by Association of Research Libraries, University of Virginia Library and the University of Washington Libraries

25

Library Assessment Conference Building Effective, Sustainable, Practical Assessment

Charlottesville, Virginia September 25-27, 2006

Nancy J. Kress is the Head of Bookstacks at the University of Chicago's Regenstein Library. She is pursuing an MLIS degree from the University of Illinois and expects to graduate in December of 2006. David K. Larsen has been Head of Access Services at the University of Chicago Library since 2000. In addition to his master's in library science from the University of Illinois, David holds a Ph.D. in American religious history from the University of Chicago Divinity School. Tod A. Olson is a Programmer/Analyst at the University of Chicago Library. He has over a decade of library computing experience, including work in usability and in digital libraries. Tod has an MSLIS from the University of Illinois at Urbana-Champaign. Agnes M. Tatarka is Reference Librarian/Technology Specialist at the University of Chicago Library's Reference and Business Information Center. Agnes has an extensive background in information technology and holds an MLIS from the University of Texas at Austin.


Monday, September 25 3:30-5:00

Parallel 1 #3
Qualitative Approaches I

Salon B

Assessing the Service Needs and Expectations of Customers: No Longer a Mystery

Margie Jantti

The University of Wollongong Library (UWL) adopted the Australian Business Excellence Framework (ABEF) in 1994, primarily as a change management model. The Framework describes the essential characteristics and approaches of organizational systems that deliver sustainable and excellent performance, with emphasis on determining the needs and expectations of customers and evaluating their perceptions of excellent service. The customer category of the ABEF mandates that organizations assess their customer relationship management systems and customers' perception of value.

Over the past decade, UWL has made extensive use of customer surveys and customer feedback systems as a means of evaluating satisfaction with services and resources. These approaches have provided critical data and information on customers' perceptions of the importance and performance of various service and resource elements. While they have been an important mechanism for planned change and an improvement agenda, surveys and feedback systems are limited in their capacity to provide insight into the perceived value gained by engaging with the library, or into the total customer experience of a service transaction. UWL regularly evaluates performance against the ABEF to identify areas for potential improvement. One area addressed less rigorously by UWL was customers' perception of value: how customers perceived UWL's competency in meeting customer value goals, that is, whether customers believed they received fair value for the investment or cost of engaging with a service.

The adoption of a Mystery Shopper-style evaluation of service delivery, first introduced in 2004, offered a new dimension for assessing the quality and perceived value of services delivered by library staff. Selected to complement and expand on existing surveys and other feedback systems, mystery shopping had the potential to provide insight into the total customer experience, in particular the influence of staff attitudes, attributes, and behaviors on overall customer satisfaction and sense of value. Repeated in 2005, the Mystery Shopper assessment methodology was modified to target areas identified as requiring improvement from the previous year, and to ensure that mutually beneficial outcomes were likely to be achieved by both customers and library staff. Measuring service quality through this approach has the capacity to underpin broad satisfaction ratings with genuine understanding of what library customers value. Findings from the University of Wollongong experience revealed the importance and value placed on how staff acknowledge, respond to, and interact with customers; the knowledge, experience, and skill utilized; and the personalization and customization of services to meet the individual and unique needs of a diverse range of customers.

Margie Jantti has extensive experience in the interpretation and integration of globally recognized business excellence frameworks and quality standards within the library and information sector. The University of Wollongong Library is the only library organization to have been recognized in the Australian Business Excellence Awards. Her contribution to the national agenda for quality assurance and best practice has been recognized through a Council of Australian University Librarians Achievement Award.


Monday, September 25 3:30-5:00

Parallel 1 #3
Qualitative Approaches I

Salon B

Frequently Noted: Approaches to Analyzing Qualitative Research

Elizabeth Smigielski, Judy Wulff, and Terri Holtze

Much of library assessment generates qualitative data, which is inherently difficult to use but provides substantive, persuasive information. The primary pitfall of the approach lies in the large amount of disparate data collected, which is unstructured and difficult to organize, interpret, and present. Further problems arise in using the data to persuade and elicit change within the organization. Librarians are good at creating studies and gathering data; where we fail is in the analysis and management of the data. Good data is lost in an overwhelming jumble of paper, written notes, and quotes from test participants. Project leaders lose enthusiasm, projects lose momentum, and administrators lose faith. Project failure creates an organization in which assessment is seen as ineffective and irrelevant to daily work. Using web usability and focus groups as models, this presentation will illustrate the step-by-step process of creating a schema for organizing and interpreting the data, recognizing trends, consolidating the information, and reformatting it for presentation. In addition, techniques will be discussed for maintaining momentum among the assessment group and the administration. Finally, effective strategies for using qualitative data to promote action will be presented.

All of the presenters are members of the University of Louisville Libraries Assessment and Resource Planning Team. Elizabeth Smigielski is Coordinator of Library Marketing. Judy Wulff is head of Electronic Services at the Kornhauser Health Sciences Library. Terri Holtze is the facilitator of the Web Management Team.


Monday, September 25 3:30-5:00

Parallel 1 #2
Building Assessment in our Libraries I

Ashlawn & Highlands

Getting Started with Library Assessment: Using Surveys to Begin an Assessment Initiative Lisa Janicke Hinchliffe and Tina E. Chrzastowski

Developing a library assessment program is often a challenging task. Librarians and staff may question the allocation of resources to assessment activities and feel threatened by potential results. This paper presents a case study of using library user surveys as the foundation for an evolving assessment program and related organizational development activities. The University Library at the University of Illinois at Urbana-Champaign (UIUC) undertook a three-year program of patron surveys to determine attitudes towards the library's services, facilities, and collections and to begin a library assessment program. This initial foray into library-wide assessment was administered by the Library's Services Advisory Committee. The first group surveyed (spring 2004) consisted of graduate and professional students, followed by undergraduate students (spring 2005) and faculty (spring 2006). This series of surveys marks the beginning of formal library assessment at UIUC. Although the UIUC Library participated in LibQUAL+ surveys in the past and individual UIUC librarians have been actively conducting library assessment at the unit level for many years, these patron surveys represent a new commitment to library-wide assessment at UIUC.

Further opportunity for assessment education was made possible through participation in the Association of Research Libraries (ARL) program "Making Library Assessment Work: Practical Approaches for Developing and Sustaining Effective Assessment." Steve Hiller and Jim Self, ARL Visiting Program Officers for library assessment, visited the UIUC Library on May 23-24, 2005. This visit, sponsored by the Services Advisory Committee, prompted discussion about library assessment and provided an outsider's view of our current assessment plans and projects.

This paper will present the results from the three user surveys and will also focus on the UIUC Library's "getting started" experiences with library assessment. Strategies employed by the Services Advisory Committee to promote assessment and begin to create a culture of assessment will be presented, as well as current plans, successes and failures, and our assessment directions for the future.

Lisa Janicke Hinchliffe is Coordinator for Information Literacy Services and Instruction and Associate Professor of Library Administration at the University of Illinois at Urbana-Champaign. Her research focuses on library use and topics related to teaching and learning. Tina E. Chrzastowski is Chemistry Librarian and Professor of Library Administration at the University of Illinois at Urbana-Champaign. Her research focuses on library use and users.


Monday, September 25 3:30-5:00

Parallel 1 #2
Building Assessment in our Libraries I

Ashlawn & Highlands

A Leap in the Right Direction: How a Symbiotic Relationship Between Assessment and Marketing Moves the Library Forward Melissa Becher and Mary Mintz

Assessment activities are never ends in themselves. The resulting data can be put to work throughout the library, but particularly in marketing efforts. Assessment identifies user populations that would benefit from targeted marketing and also documents their awareness of library services. Marketing campaigns to these populations address gaps in understanding and forge meaningful relationships between users and the library. Users' growing satisfaction can then be measured by further assessment, providing evidence of the marketing campaign's effectiveness. In this presentation, the authors show how a relationship between assessment and marketing developed at American University Library, how it led to an award-winning marketing campaign, and how it continues to inform joint assessment and marketing efforts that move the library forward.

The AU Library Assessment Team participated in LibQUAL+ in 2001. The results from that survey were surprisingly negative for undergraduate students, prompting the team to conduct special focus groups in fall 2002. Further participation in LibQUAL+ 2003 and analysis of the results of university-conducted surveys confirmed the team's understanding of undergraduate perceptions of the library. All data indicated that undergraduates had a lower level of familiarity with the library and a lack of awareness of library resources and services. The Library Marketing Team used Assessment Team data and analyses to identify undergraduate students as a prime group for targeted marketing. The Marketing Team saw that the library could make significant gains by increasing undergraduate awareness of the resources and services available to them. Team members initiated a formal series of meetings with Assessment Team members as part of the planning process for a fall 2004 campaign. The relationship between the two teams ensured that the campaign aligned with Assessment Team findings about the undergraduate population. This marketing campaign won the 2005 Best Practices in Marketing Academic and Research Libraries @ Your Library Award from ACRL.

Initial results from LibQUAL+ 2005 show that American University undergraduates' perceived level of library service has moved closer to their desired level of service. While it is hard to say that the marketing campaign was the only source of the increase, the results are promising enough to explore further. The Marketing and Assessment Teams plan more assessment to track long-range changes in the perceptions of students who first matriculated in 2004 and will graduate as the class of 2008. These activities will determine whether the fall 2004 campaign has effectively reached a goal of increasing undergraduate student satisfaction with and use of library services by at least twenty percent over four years. The relationship between assessment and marketing at American University will continue to be essential to attaining this goal.

Melissa Becher has an M.S.L.I.S. from the University of Illinois at Urbana-Champaign. A Reference/Instruction Librarian, she leads the American University Library Assessment Team. Mary Mintz has an M.S.L.S. from the University of North Carolina at Chapel Hill. She is Senior Reference Librarian and a founding member of the Library's Marketing Team.

Monday, September 25 3:30-5:00

Parallel 1 #2
Building Assessment in our Libraries I

Ashlawn & Highlands

Assessment in the Emory Libraries: Lurching toward Sustainability

Panel: Charles Forrest and Susan Bailey

In 1996, in order to improve internal communication and decision making, the General Libraries at Emory University began a comprehensive organizational review and redesign process. In the resulting restructuring in 2001, the library introduced the Office of Program Assessment and Coordination, charged with library assessment and process improvement and with the lateral coordination and alignment of processes across the library through the newly created Market Councils. This presentation will trace the development of the Office and the assessment program through significant staff turnover, interim administrative arrangements, a campus-wide strategic planning initiative leading to the revitalization of assessment activities, the creation of the position of Library Assessment Coordinator, and the evolution of its organizational placement, which led to the creation of the Office of Library Planning and Assessment.

Charles Forrest has over twenty-five years of experience in academic and research libraries. Since his arrival at Emory in 1988 he has held a variety of administrative positions in the main library, and is currently Director of the newly established Office of Library Planning and Assessment. Susan Bailey has been the Library Assessment Coordinator in the General Libraries at Emory University since the position was created in September 2005.


Monday, September 25 5:00-6:00

Parallel 2 #1
LibQUAL+ Follow-up

Salon A

Practical Assessment at Texas A&M: Using LibQUAL+ Comments to Enhance Reference Services Dennis T. Clark

In the summer of 2005, Reference Services in the Evans Library at Texas A&M underwent a significant reorganization. The goal was to integrate the formerly separate Humanities/Social Sciences Reference Unit and Sciences/Engineering Reference Unit under a single service philosophy. Additionally, the library changed drastically to a model of offering reference services in a tiered environment, in which tenured or tenure-track librarians have no scheduled reference desk hours at any time but are responsible for on-call duties and for a chat-based reference service known locally as AsKnow. A new Head of Reference Services was hired with a mandate to continue the tradition of high-quality and proactive service for all patrons. The first priority under the new service model was to examine the readily available feedback in an effort to understand perceptions of reference service quality and to initiate immediate changes when and where feasible. As Texas A&M University has been an active leader in LibQUAL+ since its inception, it was decided that our survey data would be the best place to begin.

Although the Associate Dean for Advanced Studies (primarily public services) had gathered a group of librarians to begin analyzing the composite LibQUAL+ data from recent years, the Reference Unit thought it would be advantageous for a smaller, internal working group to look at the survey comments as they pertain to reference services. A group of volunteers from Reference Services, led by a newly hired Undergraduate Specialist Librarian, was charged with reviewing all comments in the 2002-2005 LibQUAL+ surveys, spotting trends that might affect the perception of reference services, referring other systemic issues to the Head of Reference, and recommending changes to reference organization and operations based on the comments. The charge specified that the recommendations should be low-cost (in time and talent as well as financial encumbrance) but highly visible. Reference Services had no preconceived notions about its ability to correct systemic or complex problems related to systems, personnel, or other areas outside the reference unit; dealing with low-hanging fruit requires quick fixes. We agreed, however, to analyze the comments and refer any significant issues outside of our purview to the Associate Dean for Advanced Studies for her review.

The team made several recommendations that fell into two thematic areas: patron accessibility to and knowledge of reference services, and better training for staff, including senior librarians. Very soon after the report was distributed, a small implementation team was created to streamline the process of putting the majority of the recommendations in place. It is our intent to review the LibQUAL+ comments annually in order to respond quickly to issues that are departmental in nature and cannot be extrapolated to the libraries as a whole. Of course, this departmental review should fit within a framework of overall library assessment, but it gives the local manager immediate and useful feedback on the department's perceived performance.


Dennis T. Clark is Head of Reference Services and Assistant Professor at Texas A&M University Libraries. Formerly the director of the music library at Vanderbilt University, he is interested in building a culture of responsiveness in research librarianship by incorporating cutting-edge trends and traditional models of service.


Monday, September 25 5:00-6:00

Parallel 2 #1
LibQUAL+ Follow-up

Salon A

Getting Our Priorities in Order: Are Our Service Values in Line with the Communities We Serve? Jocelyn Duffy, Damon Jaggars, and Shanna Smith

LibQUAL+ is used by academic libraries at more than 500 institutions, including colleges, universities, and community colleges, as a method of measuring users' perceptions of service quality. The instrument allows users to rate their minimum, perceived, and desired levels of service for 22 items in three dimensions: information control, library as place, and service affect. Using the results from the 2005 survey at the University of Texas at Austin, we examine how well the service priorities of library staff are aligned with the priorities of undergraduates, graduate students, and faculty. To define the priorities for a given individual, we re-scaled the desired score for each item to reflect the degree to which the item is above or below the average desired level across all items for that individual. The re-scaled scores (termed priority scores) for the 22 items were then compared across the four groups using a multivariate analysis of variance (MANOVA). Preliminary results indicate that service priorities for library staff align more closely with those of undergraduates than with those of graduate students and faculty.

This analysis is a first step in identifying service priority gaps between library staff and the users they serve. Our intention is to promote discussion among library staff about users' needs and how closely staff service priorities align with those needs. In addition, our findings may prove useful as management information, allowing us to analyze our users' service priorities and to integrate the results of this analysis into organizational decision-making and planning processes. This paper will focus on the University of Texas Libraries, but the question addressed and the method of analysis will be useful to any library with a similar data set. We believe this is a unique use of LibQUAL+ data, as we have not found a similar study within the existing LibQUAL+ literature.

Jocelyn Duffy is the Assessment Coordinator for the University of Texas Libraries at the University of Texas at Austin. She has made presentations on LibQUAL+ and service issues and participated in information fairs at local and national meetings and library conferences. Damon Jaggars is the Associate Director for Student & Branch Services for the University of Texas Libraries. He has presented on service quality assessment in libraries at several regional and national conferences. Shanna Smith is the Manager of the Research Consulting group at the University of Texas at Austin, where she consults with clients from a variety of disciplines on data collection and analysis issues. She has presented papers with methodological and statistical content at several regional and national conferences and one international conference.
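As a rough sketch of the re-scaling step described above, and not the authors' actual code, the Python fragment below computes priority scores (an item's desired rating minus that respondent's mean desired rating) for a small invented data set and compares the resulting profiles across groups with a MANOVA via statsmodels. The group labels, item names, and ratings are hypothetical; the real analysis covers all 22 LibQUAL+ items and includes library staff as a group.

import pandas as pd
from statsmodels.multivariate.manova import MANOVA

# Invented desired-service ratings (1-9) on three LibQUAL+-style items; the real
# survey has 22 items, far more respondents, and a library-staff group as well.
df = pd.DataFrame({
    "group": ["undergrad", "undergrad", "grad", "grad", "faculty", "faculty"],
    "item1": [8, 7, 9, 8, 9, 9],
    "item2": [6, 6, 8, 7, 9, 8],
    "item3": [9, 8, 7, 7, 6, 7],
})

items = ["item1", "item2", "item3"]
# Priority score: an item's desired rating minus that respondent's mean desired
# rating across all items, so the score reflects relative priority, not overall level.
priority = df[items].sub(df[items].mean(axis=1), axis=0).add_suffix("_priority")
data = pd.concat([df[["group"]], priority], axis=1)

# Priority scores sum to zero within each respondent, so one item is dropped from
# the dependent-variable list to keep the MANOVA error matrix non-singular.
fit = MANOVA.from_formula("item1_priority + item2_priority ~ group", data=data)
print(fit.mv_test())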


Monday, September 25 5:00-6:00

Parallel 2 #3
Qualitative Approaches II

Salon B

Meliora: The Culture of Assessment at University of Rochester's River Campus Libraries

Nora Dimmock, Judi Briden, and Helen Anderson

The River Campus Libraries at the University of Rochester have created an effective and sustainable user needs assessment program by establishing a website Usability Group, undertaking an undergraduate research study, and putting statistics and a desktop report program on librarians' desktops. The information gathered by these programs has helped us to develop a quantitative as well as qualitative picture of our users, allowing us to connect with them through our website, reference services, and our collections. Nora Dimmock will give an overview of the Usability Group's contribution to the culture of assessment and demonstrate how it has become an integral part of the website design process at the River Campus Libraries. Group members are assigned to a specific project under development. They conduct user studies throughout the design cycle. Collaboration with content and design groups gives the advantage of a much larger website development program using an iterative design process that is scalable, sustainable, and successful. Judi Briden will describe the Undergraduate Research Project, a two-year study conducted by a team of librarians and staff, including an anthropologist. The study applied ethnographic methodologies to gain new perspectives on how undergraduates working on papers and other research-based assignments interacted with the libraries' resources. Methods included recorded interviews in and out of the library, photo surveys, mapping, and dorm visits. The resultant recordings, drawings, and photographs were co-viewed and discussed by reference librarians and team members. This shared process generated new insights for improving reference outreach, library facilities, and web pages for undergraduates at the University of Rochester. Helen Anderson will discuss the Libraries' subject liaison and collection development program. Subject librarians use skills and techniques developed through participation in groups such as the Undergraduate Research Project, the Usability Group, and content groups to develop relationships with students and faculty and to learn about how those groups use library services and collections. They are encouraged to think about collections and access in broad terms. Tools such as our Bibliographers' Desktop empower staff to create their own collection-related reports. All of these groups contribute to the culture of assessment that has evolved at the River Campus Libraries over the last five to ten years.

Nora Dimmock is Head of the Multimedia Center, Film Studies Librarian, and a member of the Usability Group. Judi Briden is Digital Librarian for Public Services, subject librarian for Brain and Cognitive Sciences, and a member of the Undergraduate Research Team. Helen Anderson is Head of Collection Development for the River Campus Libraries, Russian Studies Librarian, and a member of the Undergraduate Research Team.


Monday, September 25 5:00-6:00

Parallel 2 #2
Building Assessment in our Libraries II

Ashlawn & Highlands

Library Assessment on a Budget: Using Effect Size Meta-Analysis to Get the Most out of the Library-Related Survey Data Available across Campus Eric Ackermann

Data related to library service quality can exist in the results of surveys conducted across campus for non-library reasons. These surveys can range from the nationally administered Higher Education Research Institute (HERI) Faculty Survey to locally generated and administered freshman orientation course satisfaction surveys. At many colleges and universities these surveys are conducted regularly and provide space for local questions, which can include several library-related items. For libraries with limited assessment budgets, getting several library-related questions on these surveys can be an inexpensive source of additional information about user needs and perceptions of library service quality. It does, however, leave one with the problem of making sense of data from many different survey instruments, often using a bewildering array of sampling strategies, scales, data analyses, and outcomes reporting. One solution is to use meta-analysis. Meta-analysis is a quantitative method of research synthesis developed in the social sciences to handle data comparisons across disparate studies in a statistically valid manner. This study explores the use of meta-analysis as a library assessment tool, in particular one type of meta-analysis, effect size. It is used to compare the results from six analogous, library-related survey items from two different survey instruments administered by Radford University to its undergraduates in 2005: LibQUAL+ and the Radford University Undergraduate Exit Survey. The six items examined are hours of operation, access to information, staff quality, collection quality, users' ability to find information, and users' ability to analyze information. The process and results of effect size meta-analysis are examined for strengths and limits as a library assessment technique in light of its practicality, sustainability, and effectiveness.

Eric Ackermann (M.S. in Information Sciences, University of Tennessee-Knoxville, 2001) is currently the Reference/Instruction and Assessment Librarian at Radford University's McConnell Library. He managed the LibQUAL+ survey of students in 2005 and of faculty and staff in 2006. He is a member of the Virginia Assessment Group and VLACRL's Assessment SIG.
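As a hypothetical illustration of the effect-size idea (not the author's actual computation), one common metric, Cohen's d, converts item results reported on different scales into a unit-free standardized mean difference; the summary numbers below are invented.

import math

def cohens_d(m1, sd1, n1, m2, sd2, n2):
    """Standardized mean difference using a pooled standard deviation."""
    pooled_sd = math.sqrt(((n1 - 1) * sd1**2 + (n2 - 1) * sd2**2) / (n1 + n2 - 2))
    return (m1 - m2) / pooled_sd

# A hypothetical "staff quality" item measured on two instruments with different scales,
# each comparing the same two respondent groups; because d is unit-free, the two
# results can be placed side by side even though the raw scales differ.
print(round(cohens_d(7.4, 1.3, 210, 6.9, 1.5, 180), 2))   # instrument with a 1-9 scale
print(round(cohens_d(4.1, 0.8, 350, 3.8, 0.9, 320), 2))   # instrument with a 1-5 scale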


Monday, September 25 5:00-6:00

Parallel 2 #2
Building Assessment in our Libraries II

Ashlawn & Highlands

Developing an Integrated Approach to Library and Information Technology Assessment

Jill Glaser, Bill Myers, Ryan P. Papesh, John M. Stratton

This presentation will provide an overview of the activities of the Information Services Assessment Council within the changing organizational climate at the University of Kansas. Ten years ago, three formerly separate divisions were integrated to form a larger organizational entity known as Information Services (IS). These divisions are the KU Libraries, Information Technology, and Networking and Telecommunications Services. Though each division maintains a separate identity and is overseen by separate administrative hierarchies, each division reports to the Vice Provost for Information Services. In recognition of the common needs and challenges our users face in the technologically advanced and interconnected scholarly information environment, each division is working more collaboratively than ever before. In this milieu, one constant need remains: the need to recognize and predict, as far as possible, the changes in users' needs and behaviors and to measure overall IS effectiveness in meeting them. Embedded within the IS management structure are several groups charged with both leading initiatives and advising leadership of organization-wide activities designed to meet the larger University mission. The Information Services Assessment Council has as its charge the responsibility to oversee the assessment activities of the three divisions within Information Services, and strives to enable evidence-based decision-making by fostering and supporting a culture of assessment: an ongoing process in which services, resources, and performance are measured against the expectations of users, and improvements are made to meet users' needs effectively and efficiently. To accomplish this, the Assessment Council collaborates with and advises IS leadership and staff to identify priority assessment activities; develops and coordinates assessment-related staff development and educational programs; assists staff in developing and conducting assessment activities; and reports to the IS-wide community on assessment activities and initiatives. This presentation will identify the challenges and opportunities of conducting user-centered assessment in a fully integrated library/IT organization. It will describe the similarities and differences of these cultures, the steps that have been taken to use assessment as a unifying theme for approaching user services, and review several assessment activities, staff development activities, and planning initiatives that have resulted.

Jill Glaser has earned a Bachelor of Arts in Music, Piano Emphasis, and a Master of Business Administration, Information Technology Emphasis, both from the University of Kansas. Jill has worked primarily as a Web developer for the last 8 years, previously for IBM and Sprint. She currently serves as Web Services Coordinator for the University of Kansas Libraries, in addition to other Web development responsibilities in the Information Technology department. Bill Myers is director of assessment for information services at the University of Kansas, a new position created to facilitate an integrated libraries/IT assessment program at KU. Bill was formerly director of library development and assessment coordinator for the KU Libraries. He received the B.A. and M.A. in English from Fort Hays State University (Kansas).

Ryan R. Papesh earned a Bachelor of Science in Engineering from Purdue University and a Master of Science in Management from North Carolina State University. Ryan has worked extensively in the marketing of technology, in several industries, coast to coast. His technology of choice for the last 15 years has been telecommunications, and he currently serves as the Customer Service Manager in Kansas University's Department of Networking and Telecommunications Services. John M. Stratton has served as Head of Outreach Services for the University of Kansas Libraries since April 2005. Prior to that, he served as Head of the Regents Center Library at the KU Edwards Campus in Overland Park, Kansas, and as Co-coordinator of Reference Services for KU Libraries. John received his B.A. in History from KU and a Master of Science in Library and Information Science from the University of Illinois, Urbana-Champaign.


Monday, September 25 6:00-7:30

Poster

Preston

Usage and Outcomes Evaluation of an Information Commons: A Multi-Method Pilot Study Rachel Applegate

In the past fifteen years many academic libraries have created learning environments called Information Commons: technology-equipped areas where information and computer assistance are provided along with other services. This presentation reports on a multi-method study of a group-work-oriented Academic Commons area in the library of an urban, commuter university (Indiana University Purdue University Indianapolis). The study has had two phases: a usage-design-implementation evaluation and an outcomes evaluation. Previous reports of Information Commons evaluations have used methods such as interviews, question logs and tallies, surveys, and focus groups. Many of these have been primarily descriptive, oriented around processes or outputs. The evaluation of the IUPUI University Library Academic Commons started with an examination of the pilot implementation of approximately 50 seats (fall 2005), in order to determine usage patterns to guide the development of additional space. This implementation or process evaluation used surveys, observations, and interviews; the data strongly influenced modifications of the pilot space and selections for future space development (summer 2006). An outcomes evaluation, funded by a grant program focused on instructional innovation, was designed for spring of 2006, involving courses which had information- and technology-intensive group work projects. The goal of the Commons space was to facilitate group work; this evaluation attempted to glean at least some information about whether this goal had been achieved. Faculty and library liaisons in social work, history, business, museum studies, and library science were involved in a collaborative assessment effort. Students in these classes proceeded to do group work and complete normal class assignments. At the end of their projects, they filled out surveys on their usage of the Commons and other physical facilities to do their group work. The analysis (to be completed in summer 2006) will determine if there is any observable correspondence between project characteristics, in terms of information usage, technology usage, or group work effectiveness, and the reported use of library or other group work facilities. This approach is inductive and exploratory. It is innovative in the library and academic literature in that it seeks to observe not only outputs (user and usage counts) but outcomes: the real reason that the space was developed. This presentation will not focus on Information Commons design, but on the strengths, weaknesses, and administrative issues of the various evaluative methodologies employed. The presenter will be able to report cautions and advice for others contemplating assessment of libraries, particularly library spaces. A physical Information Commons represents a re-emphasis upon the library as place, and it would be to the benefit of libraries if they were able to demonstrate to a non-library audience the role of libraries as physical environments for learning.

Applegate is currently an assistant professor at the Indianapolis campus of the Indiana University School of Library and Information Science. She has eighteen years of experience as a reference librarian and then library director at a small private college in Minnesota, and teaches evaluation, research, and academic librarianship.

Monday, September 25 6:00-7:30

Poster

Preston

Issues in Establishing a Culture of Assessment in a Complex Academic Health Sciences Library Sally Bowler-Hill and Janis Teal

Objective: To report on the University of New Mexico (UNM) Health Sciences Library and Informatics Center's (HSLIC) experience in creating a culture of assessment through the regular administration of customer satisfaction surveys for its library and technology support services.

Setting: HSLIC's organizational structure includes four major divisions, reflecting its responsibilities to deliver the following services to the Health Sciences campus: library services; technology support (workstation support, network, email, file storage, web administration, and application development); educational development (consultation on educational content for online and in-person learning experiences); and biomedical informatics training and consultation.

Goal: To create a unified culture of assessment in which services are assessed regularly, contributing to a picture of the overall effectiveness of HSLIC.

Methods: The management team committed to administration of annual surveys, initially of library services and then of technology support. Library services surveys began in 2002 with a survey developed in-house and continued in 2003 and 2005 using the LibQUAL+ survey. A technology support survey was developed in-house and administered in 2004 and 2006 because environmental scans, literature searches, and a survey of the Association of Academic Health Sciences Libraries (AAHSL) directors did not reveal any national standardized surveys such as LibQUAL+ for technology support services. The process of conducting an environmental scan and adopting or developing assessment measures for other services such as educational development and biomedical informatics is being planned.

Results: A difficulty arose in comparing the gap analysis scores from LibQUAL+ with the Likert scale results from the technology support survey. This difficulty illustrates that having separate surveys using different methodologies limits HSLIC's ability to integrate survey data and assess overall strengths and weaknesses. It also impedes the development of a unified culture of assessment by compartmentalizing service units. Further, the technology support survey does not afford the opportunity to benchmark against similar institutions. For libraries like HSLIC, whose responsibilities extend beyond traditional library services, the use of different survey tools to assess different types of services presents problems in consistency of data interpretation, benchmarking, and strategic planning.

Conclusions: As HSLIC further develops its technology support survey and begins to evaluate assessment measures for other services, the cost-benefit of creating in-house surveys that better align with LibQUAL+ versus accepting inherent discrepancies derived from using different methodologies will be evaluated. The result will be a unified body of assessment which contributes to a picture of overall effectiveness of HSLIC services, creating a unified culture of assessment in the organization.
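To make the comparison problem in the Results concrete: LibQUAL+-style questions yield gap scores, while a single Likert item yields only a mean, so the two summaries are not on the same footing. A minimal illustrative sketch follows, with invented ratings and hypothetical column names.

import pandas as pd

# LibQUAL+-style item: each respondent gives minimum, desired, and perceived ratings
libqual = pd.DataFrame({
    "minimum":   [6, 5, 7, 6],
    "desired":   [8, 8, 9, 8],
    "perceived": [7, 6, 8, 7],
})
libqual["adequacy_gap"] = libqual["perceived"] - libqual["minimum"]
libqual["superiority_gap"] = libqual["perceived"] - libqual["desired"]

# Technology-support item: a single satisfaction rating on a 1-5 Likert scale
tech = pd.Series([4, 3, 5, 4], name="satisfaction")

print(libqual[["adequacy_gap", "superiority_gap"]].mean())  # gap summaries
print(tech.mean())  # a plain mean, not a gap, so the two are not directly comparable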


Sally Bowler-Hill, MA, coordinates strategic planning at the University of New Mexico Health Sciences Library and Informatics Center (HSLIC). She assists in administering HSLIC's customer satisfaction surveys. Janis Teal, MLA, MAT, AHIP, is the Deputy Director for Library Services at HSLIC. She coordinates the administration of the LibQUAL+ survey for HSLIC.


Monday, September 25 6:00-7:30

Poster

Preston

Use of RFID Applications in Libraries

Navjit Brar

The adoption of Radio Frequency Identification (RFID) technology by libraries promises a solution that could make inventorying of items in their collections possible in days instead of months and allow patrons to check out and return library property automatically at any time of day. With an estimated 35 million library items tagged worldwide in over 300 libraries, this technology is generating ever increasing interest. Besides speeding up checkouts, keeping collections in better order, and alleviating repetitive strain injuries among librarians, RFID promises to provide better control of theft, non-returns, and misfiling of a library's assets. The Industrial Technology Department and the Robert E. Kennedy Library at Cal Poly State University, San Luis Obispo, collaborated in a research project to test and assess the effectiveness of RFID technologies in the library setting. From October to November 2004, we surveyed participating libraries, RFID listserv, and LITA-L listserv subscribers to collect information regarding the implementation of RFID systems in libraries. As a result of the positive response from the library world, vendors' interest in lending us this technology for testing, and our own students' interest in experience that would prepare them for a better job market, we decided to conduct further research by actually testing the system. Libramation provided us with the equipment for testing during spring 2005. With an RFID simulation of 250 items, this project addressed common issues of contention for any library. The hypotheses tested were that the system would simplify the check-in and checkout process for staff, increase efficiency, and minimize repetitive motion; provide a self-check component; be able to secure magnetic media such as videos and cassettes, and be able to handle the discharge function in a manner similar to books; provide a link between security and the bibliographic record; provide an efficient way to inventory the library's collection without having to physically handle each item; provide a flexible system that could be used easily with new and future technology, such as an automated materials handling system; and that combining RFID technology at the circulation desk, self-check machines, and eventually an automated materials sorting system would free circulation staff to perform direct patron information services. This poster session will inform attendees of our survey findings; workflows; test results, particularly using Libramation and Innovative Interfaces; pre- and post-implementation costs; the effect of RFID on Library as a Place; and the conclusion.

Navjit Brar earned her BA in Sociology & Psychology and MA in Sociology from Panjab University, and her MLIS from San Jose State University. She began her career as a library assistant at USC and CSU Fullerton, and has held professional positions at Michigan State and in New Jersey. She is currently an Assistant Dean, ABS, at the Cal Poly Kennedy Library.


Monday, September 25 6:00-7:30

Poster

Preston

Statistics & Assessment: The Positive Effects at the Harold B. Lee Library of Brigham Young University Julene Butler and Brian Roberts

Acting on their assumption that assessment is central to successful library service, administrators at Brigham Young University's Harold B. Lee Library began to establish a culture of assessment as part of their long-term strategic plan. Commencing in early 2001 when the Library repurposed a position to hire an assessment officer, the Library enhanced its assessment activities through participation in a variety of national and international studies. Four times between 2001 and 2006 the Library participated in LibQUAL+, twice with other libraries in the Consortium of Church Libraries & Archives. During the spring of 2005, the Library evaluated reference services through involvement in WOREP and also participated in SAILS to assess the effectiveness of its information literacy programs. Several in-house studies have been conducted, including an assessment of the role of subject specialists (2001), improvement of specific internal workflows and processes (2001), uniform collection of reference statistics (2003), usability of the library web site (2005), and analysis of future collection space needs (2005 & 2006). Findings from each of these studies have resulted in improvements to facilities and services, including establishment of an expedited acquisition system for urgently needed materials, creation of an Information Commons, and allocation of funding for adding journals to library collections. This paper describes the major assessment studies conducted by the Harold B. Lee Library since 2001, explains specific changes that have resulted from those studies, and discusses the impact assessment activities have had on library resources and organizational structure.

Julene Butler: Associate University Librarian, Lee Library, BYU. Ph.D., Communication, Information, Library Studies, Rutgers University, 1996. MLS: BYU, 1971. Brian Roberts: Process Improvement Specialist, Lee Library, BYU. MS, Statistics, BYU, 1983. BS, Business Statistics, BYU, 1980.


Monday, September 25 6:00-7:30

Poster

Preston

Improving Library Services Using a User Activity Survey

Alicia Estes

Bobst Library is the flagship library for the New York University community. It is visited by more than 6,500 users per day and circulates almost one million books annually. Because of the University's urban setting, the library serves not only as a place for research but also as a social meeting center for students. The Bobst Library Code of Conduct "affirms a commitment to protecting an environment conducive to intellectual pursuits. Such an environment is characterized by respect for the rights of others, respect for the Library's resources, and respect for appropriate conduct in a public forum. The Code ensures that users of Bobst Library find themselves in an environment which will enable them to achieve their educational objectives." In 2003 Bobst Library had a food, drink, cell phone, laptop, and quiet policy that was dated and at times ineffective or unenforceable. Library users repeatedly requested improvements to these policies that would enable them to enjoy a quality library work/study life and also empower them to request that fellow students respect the policies. Additionally, the planned renovation of the lower levels of Bobst Library, introduction of the Study Lounge on A-level, introduction of wireless computing on many floors, renovation of the 8th floor reading room, and the new Current Periodicals Room offered the opportunity to "start fresh" with users in regard to the quality of their library environment. In 2003 the Director of Public Services conducted a user activity survey that was used to make decisions related to the renovation. A Quality of Life Committee (QLC) was formed in 2004 and was charged to review the various policies currently in effect in Bobst Library and create new policies, signage, plans for enforcement of the new policies, and a program that would be implemented and introduced to users in fall 2004. The QLC took over responsibility for the User Activity Survey; however, the survey was repurposed (and slightly redesigned) to serve as a measure of quality of life issues within the library. This poster session will provide examples of the user activity surveys, show the comparative results of the user activity survey using a PowerPoint presentation and charts, and provide information on library services and changes implemented as a result of the survey findings. Some of these changes were the creation of a library rovers program using upperclassmen and/or graduate students, the development of a new cell phone awareness campaign, and the creation of a new beverage and food policy.

Alicia Estes is Head of the Business & Documents Center at Bobst Library of New York University and liaison and selector for US Business/Economics and the Hospitality & Tourism Department. She is active in the PRMS division of LAMA and was Chair of the John Cotton Dana Award Committee in 2005.


Monday, September 25 6:00-7:30

Poster

Preston

Introducing the READ Scale: Qualitative Statistics for Academic Reference Services

Bella Karr Gerlich and G. Lynn Berard

With the commercialization of online resources and the technical know-how of users, reference librarians today are being sought out for their expertise in knowledge management and subject specialization vis-à-vis the reference transaction. Quantitative statistical measurements currently in use do not adequately reflect the effort, skills, and knowledge associated with this work. A 2002 survey of Association of Research Libraries member libraries gives supporting evidence that many academic institutions are not completely satisfied with the usefulness of the reference statistics gathered, noting that the migration of reference activity to areas beyond the traditional reference desk (e-mail, chat, office consultations) has further motivated many libraries to re-examine and modify current practices (ARL SPEC Kit 268, Reference Services & Assessment, 2002). The survey was intended to determine the state of statistical reporting in academic libraries, in the hope that the results would reveal current best practices; instead, they revealed a situation in flux: The study reveals a general lack of confidence in current data collection techniques. Some of the dissatisfaction may be due to the fact that 77% of the responding libraries report that the number of reference transactions has decreased in the past three years. With many librarians feeling as busy as ever, some have concluded that the reference service data being collected does not accurately reflect their own level of activity. (ARL SPEC Kit 268, Reference Services & Assessment, 2002) There is a feeling of pressure, of not performing, when the professional literature speaks of declining reference numbers and gives little or no credit to reference/research assistance where it is due. It was with this sentiment in mind that the READ Scale was introduced and developed at Carnegie Mellon University. The READ Scale, whose acronym stands for Reference Effort Assistance Data, is a six-point scale tool for recording vital supplemental qualitative statistics gathered when reference librarians assist users with their inquiries and research, placing an emphasis on recording the skills, knowledge, techniques, tools, and teaching utilized by the librarian during a reference transaction. The READ Scale has been in use at two reference service points at Carnegie Mellon concurrently since 2002, with 2002-2003 as the pilot phase. Data gathered from its use has been reported at two American Library Association conference poster sessions, in 2003 and 2005. Reference librarians who use the scale support and endorse its adoption as an important supplemental form of data gathering. The READ Scale emphasizes effort, recognizing the time dedicated and the knowledge and skills used by the librarian at the time the reference transaction occurs. This makes it especially appealing in a profession where the currently accepted statistical standard for recording such reference transactions is a hash mark that records and rewards quantity as opposed to quality. The purpose of this presentation will be to: 1) present data gathered using the READ Scale to date at Carnegie Mellon and invite interested academic institutions to participate in or partner on the research; 2) collect data, experiences, and feedback on using the scale from participating institutions; and 3) present findings at future conferences, write articles, and develop a professional support system or expand research participation, if desirable.
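As a purely illustrative sketch (not the authors' software), recording each transaction's READ rating from 1 (least effort) to 6 (most) and tallying the distribution takes only a few lines; the transaction list below is invented.

from collections import Counter

# Each reference transaction is logged with a READ Scale level, 1 (least effort/
# knowledge required) through 6 (most); these example ratings are invented.
transactions = [1, 2, 2, 3, 1, 4, 2, 5, 3, 2, 6, 3, 1, 2, 4]

counts = Counter(transactions)
total = len(transactions)
for level in range(1, 7):
    n = counts.get(level, 0)
    print(f"READ {level}: {n:2d} transactions ({n / total:5.1%})")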


Bella Karr Gerlich is currently Associate University Librarian at GCSU; she was formerly Head of Arts & Special Collections and developer/lead researcher of the READ Scale project at Carnegie Mellon. Ms. Gerlich has two decades of experience working in academic libraries with an emphasis on personnel and process management. G. Lynn Berard is Head of Science Libraries at Carnegie Mellon. Lynn teaches graduate LIS courses for Clarion University; is editor of FOCUS, the Carnegie Mellon faculty newspaper; and is author of the Engineering and Technology section of Magazines for Libraries. Lynn has served on the SLA Board and has spoken at numerous conferences.


Monday, September 25 6:00-7:30

Poster

Preston

Are the Needs and Wants the Same? Comparing Results from Graduate Student, Undergraduate Student, and Faculty Surveys Lisa Janicke Hinchliffe and Tina E. Chrzastowski

The University of Illinois began its formal library-wide assessment program with a cycle of user surveys administered by the Library's Services Advisory Committee. The first group surveyed consisted of graduate and professional students (spring 2004), followed by undergraduate students (spring 2005) and faculty and academic professionals (spring 2006). The poster will highlight commonalities and differences among the groups in what they report they need and want, as well as how the Library has responded to the data. Lisa Janicke Hinchliffe is Coordinator for Information Literacy Services and Instruction and Associate Professor of Library Administration at the University of Illinois at Urbana-Champaign. Tina E. Chrzastowski is Chemistry Librarian and Professor of Library Administration at the University of Illinois at Urbana-Champaign. Together they have provided leadership for the Library's user survey initiative.


Monday, September 25 6:00-7:30

Poster

Preston

Challenges Inherent in Assessing Faculty Productivity: A Meta-Analysis Perspective

Sheila Curl Hoover

Academic librarians attempt to refine monograph and serials selections based on microassessment methodologies. Faculty research interests and curriculum needs are routinely tracked and cataloged for mapping to a collection development or approval plan profile. Colleges and universities sometimes assist the process by naming areas or centers of excellence, which can be used by librarians to justify increased funding for those areas. Integrated library management systems allow for detailed reports of what circulates, and ILL records show what users need that the library does not own. But these microassessment techniques do not adequately capture the work of the university or its information needs. In an environment of budget constraints and with the growing importance of outcomes assessment, a university library undertook a project to determine the library quotient of academic departments by tracking the various outputs of the departments. How much pressure does a program put on the library? How much demand on collections and services can be attributed to individual programs? On-campus data warehouses and commercial databases were mined to collect data on credit hours taught, enrollment, degrees granted, external funding for research, and publishing activity of the academic departments. After three years this environmental scan resulted in a series of spreadsheets that not only tracked the products of the various colleges and departments but also began to show trends. This paper will discuss the methods used to gather the information, problems encountered, some preliminary results, and how we shared some of our findings with the faculty, deans, and university administration.
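A minimal sketch of the kind of trend table such a project might build from mined data follows; the departments, measure, and figures are invented, not the Texas Tech data.

import pandas as pd

# Invented departmental output data: one row per department per year
data = pd.DataFrame({
    "department":   ["History", "History", "History", "Chemistry", "Chemistry", "Chemistry"],
    "year":         [2003, 2004, 2005, 2003, 2004, 2005],
    "credit_hours": [10200, 10850, 11400, 8600, 8900, 9500],
})

# Pivot to one row per department and compute year-over-year percentage change
by_year = data.pivot_table(index="department", columns="year", values="credit_hours")
by_year["% change 03-04"] = ((by_year[2004] - by_year[2003]) / by_year[2003] * 100).round(1)
by_year["% change 04-05"] = ((by_year[2005] - by_year[2004]) / by_year[2004] * 100).round(1)
print(by_year)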

Sheila Curl Hoover is Associate Dean of Libraries for Outreach and Information Services at Texas Tech University. A graduate of the Columbia University School of Library Science, she has spent her career as an engineering librarian at Kansas State University, Arizona State University, the University of Notre Dame, and Purdue University. She participated in the 2005 Service Quality Evaluation Assessment Academy.


Monday, September 25 6:00-7:30

Poster

Preston

Assessing the Research Impact of Electronic Journals at the University of Notre Dame

Carol Brach, Carole Pilkinton, and Sherri Jones

Many intriguing questions have arisen as journals continue to migrate from print to electronic format. Is the added convenience contributing to more productive research? Are researchers reading a greater number of articles, and is a wider selection of journal titles being used simply as a result of electronic access? Has their definition of core journals for their work changed? The University Libraries provide about 16,000 electronic journal titles to Notre Dame faculty and students. About 5,000 of these titles are direct publisher titles or package subscriptions, and aggregator collections such as EBSCO's Academic Search Premier make up the rest. When a true publisher title is available in electronic form, the print version of the title has been canceled in almost all cases. Because the budgetary implications of journal literature are so important for libraries, we want to understand whether the increased numbers of titles available in package deals provide value to our users. We also need to evaluate the repercussions of canceling subscriptions and obtaining articles as needed through document delivery in an atmosphere where immediate access may be accelerating research activity to higher levels of productivity and may be raising library users' expectations for document delivery of articles. During spring 2005, the University Libraries at Notre Dame conducted a Web-based survey to measure the impact of electronic journals on research and teaching at the University of Notre Dame. We hoped to determine what effect electronic access to a great percentage of the journal collection is having on teaching and research. The survey, which was sent to all faculty and graduate students via email, consisted of 20 questions and took approximately 15 minutes to complete. In addition to answering the survey questions, respondents were asked for their status (faculty or graduate student), primary college affiliation, number of grants applied for or received, and number of articles accepted or published in the past three years. A total of 590 valid surveys were returned. The survey data were analyzed in order to reveal distinctions between faculty and graduate students, between colleges, and by status and publication level. The poster session will present the results of the survey, focusing on differences between faculty and graduate student information-seeking behavior and differences between disciplines.

Carol Brach is the Engineering Librarian, University of Notre Dame. Carole Pilkinton is the Electronic Resources Librarian, University of Notre Dame. Sherri Jones is the Life Sciences Librarian, University of Notre Dame.


Monday, September 25 6:00-7:30

Poster

Preston

Information Seeking Behavior of International Graduate Students vs. American Graduate Students: A User Study at Virginia Tech 2005 Mary Finn

This is a comparative study on the information needs and information seeking behavior of international graduate students and American graduate students. This user study is based on empirical data collected from an online survey conducted between April 7 and May 28, 2005, at Virginia Tech. The goal of this comparative study is to investigate how graduate students from diverse ethnic groups discover, select, and use various information sources, and to obtain insights into international graduate students' information seeking behavior, especially its similarities and differences compared with the information locating patterns used by their American peers. The international student population in United States colleges and universities is continuously increasing. Since 1984 the United States has ranked first worldwide in the number of international students. In 2000, of the 1.2 million students pursuing postsecondary education outside their home countries, more than one third chose to study in the United States. Even after the events of September 11, 2001, the United States is still the first choice for study abroad for many international students, especially at the graduate level. At Virginia Tech, international graduate students accounted for 25% (1,465) of the total graduate program enrollment in Spring Semester 2005. These students form a unique multicultural user group for the University Libraries. Majoring in a variety of disciplines, many international graduate students are working as teaching or research assistants in different departments. Understanding and meeting their affective as well as cognitive needs will not only help them achieve a higher level of academic success but also enhance universities' teaching and research capabilities. Despite the rapidly growing number of international graduate students in the past decade, the literature review revealed a gap in studies of this group; many of the earlier studies were done in the mid-1990s or even earlier. The authors found current research on international students' library experience and needs lacking in the literature. Further, this study is unique in that previous research has not studied the similarities and differences between international students' information seeking behavior and that of American students. This research employed a quantitative method. About 6.3% (362) of graduate students responded to the Web-based anonymous survey; 315 of these returns were valid. The authors used statistical hypothesis testing techniques to study the following three areas: 1) the information needs and information seeking behavior of international graduate students compared with American graduate students; 2) the relationship between the English language proficiency of international graduate students and their information seeking behavior; and 3) the relationship between international graduate students' length of stay in the United States and their information seeking behavior.


This comparative study offers a current view of information needs and information seeking experiences of the growing international library user base. In addition, this research should help academic librarians understand more about the domestic patron group, and find more effective and cost-efficient systems to serve both groups. (This paper has been accepted by College and Research Libraries, and will be published in November 2006 or January 2007.)
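One hedged illustration of the hypothesis-testing approach described above (not the study's actual data or test choice): a chi-square test of whether a categorical survey response is independent of student group; the counts below are invented.

from scipy.stats import chi2_contingency

# Invented counts for "first place you look for information," cross-tabulated by group
#                 catalog  databases  web search
counts = [[40,  85, 60],    # international graduate students
          [55, 120, 95]]    # American graduate students

chi2, p, dof, expected = chi2_contingency(counts)
print(f"chi-square = {chi2:.2f}, df = {dof}, p = {p:.3f}")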

Yan Liao (Clara) received her library science degree from the University of Missouri, Columbia (UMC). She is a cataloging librarian at Virginia Tech; email: liaocy@vt.edu. Mary Finn is a cataloging librarian at Virginia Tech. She previously worked at the University of Michigan and the University of Detroit Mercy. Her library science degree is from Wayne State University. Email: maryfinn@vt.edu


Monday, September 25 6:00-7:30

Poster

Preston

Improving Annual Data Collecting: An Interactive Poster Session

Linda Miller

In 2005, IRIS Research & Assessment Services (RAS) took on responsibility for Cornell University Library's (CUL's) annual data collection. With minimal resources, RAS worked to make the 2004/05 data collection easier and more transparent for data providers and library managers. It developed a table of the measures included in the most recent annual statistical report and recurring national surveys (ordered by functional area) to help familiarize staff with all measures and ensure that all core data was collected at one time; developed an expanded definitions file to promote consistency and to support data coordinators; created Excel files to facilitate data input and management, and to allow for percentage change comparisons with 2003/2004 data; encouraged and made it easier to include more notes; collected a large part of the centrally provided data before the call to individual units so they had more time to review figures provided for their libraries; expanded instructions and provided training sessions; made it more explicit to whom the data was being reported; and involved the reporting units in data verification and analysis. In 2004/05, RAS also started to think about how to update tables to mainstream e-resource statistics and make the presentation of data more useful to a wider variety of audiences. Finally, RAS requested feedback from library managers in various forums. In 2006, RAS is building upon this earlier work. To ensure that current and future data collection efforts are as meaningful as possible, RAS asked each functional area's executive committee to take "ownership" of tables representing their areas, including setting and defining measures to be collected and assisting in data review and analysis. We envision that this ongoing, cyclical process, involving staff throughout the library, will allow us to create a solid (and more easily gathered and shared) set of repurposable data to support a full assessment program, one that will incorporate both quantitative and qualitative metrics into future strategic planning efforts. In this poster session, Linda Miller will outline the CUL annual data collection and related processes and will welcome discussion of other conference attendees' data gathering efforts. She will share insights gained on ARL-ASSESS.
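The percentage-change comparison built into the Excel files can be reproduced in a few lines; this is an illustrative sketch only, with invented measure names and figures rather than CUL's statistics.

import pandas as pd

# Invented annual figures for three measures across two reporting years
annual = pd.DataFrame(
    {"2003/04": [512340, 18450, 961],
     "2004/05": [498760, 22310, 1054]},
    index=["Circulation", "Reference transactions", "Presentations to groups"],
)
annual["% change"] = ((annual["2004/05"] - annual["2003/04"]) / annual["2003/04"] * 100).round(1)
print(annual)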

Linda joined Cornell's IRIS Research & Assessment Services in 2003. RAS's responsibilities include coordinating and supporting the annual data collection and the completion of external central surveys, and taking on, as assigned, ad hoc data collection, manipulation, and presentation projects, as well as quick environmental scans/surveys of peers on library services and organizational issues.


Monday, September 25 6:00-7:30

Poster

Preston

Using Collection Assessment to Determine Optimal Library Support for Doctoral Granting Academic Programs. Tom Moothart

Or: I Can't Afford an Outstanding Library Collection. How Much Will a Great One Cost Me? The presentation will describe a methodology for assessing a subset of an academic library collection based on an analysis of peer institutions and user input. In order to determine the appropriate library support for its doctoral granting departments, the Colorado State University Libraries compared its journal and book collections to those of peer institutions. Other resources were used to support the peer comparison, such as citation analysis and interlibrary loan reports. The CSU Libraries also administered a Web-based survey to faculty and graduate students to determine satisfaction with the current research collection and areas of the collection that require more extensive acquisition. In addition to the survey, University faculty assisted with the assessment by identifying peer institutions and reviewing the validity of the final assessment. By establishing a baseline for adequate support and involving academic faculty and University administrators in the process, the resulting objective collection analysis will enable the Libraries to justify additional funding for inadequately supported programs.

Assoc. Prof. Tom Moothart is the Veterinary Sciences Librarian at Colorado State University Libraries. He has also worked at the University of Missouri-Kansas City as a Science Reference Librarian.
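As a hypothetical sketch of the peer-comparison mechanics (institution names, subject, and counts invented; not the CSU analysis), comparing local holdings with a peer median might look like this:

import pandas as pd

# Invented holdings counts for one doctoral program, local library vs. three peers
holdings = pd.DataFrame({
    "CSU":    [310, 150],
    "Peer A": [420, 210],
    "Peer B": [390, 160],
    "Peer C": [450, 230],
}, index=["Chemistry journal titles", "Chemistry monographs added/year"])

peer_median = holdings.drop(columns="CSU").median(axis=1)
report = pd.DataFrame({
    "CSU": holdings["CSU"],
    "peer median": peer_median,
    "gap vs. median": holdings["CSU"] - peer_median,
})
print(report)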


Monday, September 25 6:00-7:30

Poster

Preston

A Time-Budget Study of the George Mason University Libraries Liaison Program

James E. Nalen

Time-budget studies of programs and services can form a useful part of a library's assessment toolkit. Analysis of time-budgets may reveal how staff are responding to changes in the institutional environment through their allocation of time among competing sets of necessary activities. In fiscal year 2006, the George Mason University Libraries employed a time-budget study to analyze the workload of librarians within the Libraries' liaison program. The liaison program is situated in an environment of rapid growth in graduate education programs and sponsored research. Twenty librarians were required to report activities occupying each half hour of each day during five weeks that had been selected through systematic sampling. Activities were coded using a category system that had been deductively constructed, with considerable input from the librarians themselves. The time budget instrument consisted of an Excel worksheet. Excel functions within the worksheet automatically calculated the percentage of time spent on any given activity. Frequency tables, histograms and other descriptive statistics were generated from the aggregated data. The data lent themselves to comparisons between individuals and between sub-groups (e.g. liaisons at a particular library site) and the population as a whole. These statistics and comparisons helped to provide a better understanding of the complex nature of liaison work, while also challenging some assumptions about the Libraries' liaison program. The time-budget methodology was found to be constrained by the seasonal nature of liaison librarians' work, as well as by a certain level of demand characteristic bias. While the time budget survey revealed differences in how librarians allocate time to different aspects of the liaison program, the methodology did not help the Libraries to fashion an equitable redistribution of workload among librarians.
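The per-activity percentages the worksheet calculated can be reproduced outside Excel as well; the sketch below is illustrative, with invented activity codes and log entries rather than George Mason's category system.

import pandas as pd

# Each row is one reported half-hour block; librarian IDs and activity codes are invented
log = pd.DataFrame({
    "librarian": ["A", "A", "A", "B", "B", "B", "B", "A"],
    "activity":  ["instruction", "collection development", "reference",
                  "instruction", "liaison outreach", "reference",
                  "reference", "liaison outreach"],
})

summary = log.groupby("activity").size().rename("blocks").to_frame()
summary["hours"] = summary["blocks"] * 0.5                     # half-hour blocks -> hours
summary["% of time"] = (summary["blocks"] / summary["blocks"].sum() * 100).round(1)
print(summary.sort_values("% of time", ascending=False))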

James E. Nalen is the Planning, Assessment & Organizational Development Coordinator at the George Mason University Libraries. He received his MSPA in 1999 from the University of Massachusetts Boston and MSLIS in 1996 from Simmons College.


Monday, September 25 6:00-7:30

Poster

Preston

Methodological Diversity and Assessment Sustainability: Growing the Culture of Assessment at the University of Washington Libraries Maureen Nolan, Jennifer Ward, and Stephanie Wright

The University of Washington Libraries has been active in library assessment since 1991 and is frequently mentioned as one of the few academic libraries with a thriving culture of assessment. During the past fifteen years the assessment program has grown steadily, moving from a one-time large-scale survey to an ongoing distributed program that utilizes a variety of methodological tools. Organizationally, assessment efforts have moved from an ad hoc survey committee to a broadly representative assessment committee, and recently to a central assessment and planning office with 1.5 FTE librarians and assessment efforts conducted throughout the Libraries. The assessment focus has broadened from user needs and satisfaction to evaluation of library services and resources and the value they add to the entire University community. This poster highlights different methods, such as surveys, usability testing, and data mining, that are used to gain input and evaluate services. We also show how specific assessment information has been used to improve services and add value for our customers.

All three presenters have long been active in UW Libraries Assessment and are members of the UW Libraries Assessment Group. Maureen Nolan, Natural Sciences & Resources / Friday Harbor Librarian, UW Libraries; Jennifer Ward, Head, Web Services, UW Libraries; and Stephanie Wright, Management Information Librarian, UW Libraries.


Monday, September 25 6:00-7:30

Poster

Preston

Collecting the *Right* Data for Decision-Making

Kimberly Burke Sweetman and Marybeth McCartin

User comments indicated dissatisfaction with circulation desk hours. By examining circulation data specifically (by day of the week and hour of the day) rather than in its aggregate form, we were able to identify some simple adjustments to hours that would better meet user needs.
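A minimal sketch of that breakdown, assuming a list of checkout timestamps; the timestamps below are invented, not NYU circulation data.

import pandas as pd

# Invented checkout timestamps; real data would come from the ILS transaction log
checkouts = pd.Series(pd.to_datetime([
    "2004-02-02 09:15", "2004-02-02 22:40", "2004-02-03 23:05",
    "2004-02-06 08:50", "2004-02-07 23:30", "2004-02-07 23:55",
]), name="checked_out_at")

# Counts by day of week and hour of day, instead of a single aggregate total
by_day_hour = pd.crosstab(checkouts.dt.day_name(), checkouts.dt.hour)
print(by_day_hour)   # late-evening counts, for example, can argue for adjusting desk hours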

Kimberly Burke Sweetman is Head of Access Services at New York University and the author of Managing Student Assistants, published this year in Neal-Schuman's popular How-To-Do-It series. Marybeth McCartin is Head of Instructional and Undergraduate Services at New York University. Ms. McCartin is winner of the 2005 MERLOT Information Technology (Classics) award.


Monday, September 25 6:00-7:30

Poster

Preston

Creating On-going, Integrated Assessment Efforts in Community College Libraries

Mark Thompson

What will be covered: how to scale assessment efforts at a two-year college so they are successful; how to identify and decide on issues to explore; using assessment so it leads to results; and success factors uncovered in four recent assessment projects. A concerted effort was made to infuse assessment into many aspects of managing the sole library at a large (15,000 students) community college in northern New Jersey. Given the busy library, with high demand at all service desks, there had been limited staff time to spend on large-scale, formal assessment efforts. Fortunately, however, in the fall of 2004, cost-effective participation in the national LibQUAL+ survey became available through the local consortium. This allowed the library, for the first time, to benchmark key issues related to services, resources, and staffing. The insights from this study were then used to create an overall assessment plan. Priorities were established and then ad hoc, short-term efforts were applied to various aspects of library services. Locally designed and executed measurement tools were used to gather input on four specific and important issues: noise levels in library areas; library website ease-of-use; usage levels of specific electronic resources; and which topics to cover in library instruction sessions. The approach taken was a practical one. The assistant director planned the efforts and ran the research internally in consultation with the institutional research director and the librarians. Each effort was designed to be of a reasonable scale and short-term. Each effort's cycle (plan, research, findings, ideas, and changes) took place in a few months, allowing for results to be realized near-term. This engendered a positive feedback loop and increased participation in the efforts. Customer satisfaction and usage levels were measured on each of these topics. The targeted efforts netted results for further assessment and, where necessary, changes were made in procedures and approaches. The participation of the staff in using and applying assessment data resulted in significant new levels of understanding of the issues, and in some cases, changes were made in providing library service. Assessment continues in these areas. Factors in successful community college library assessment: start with an overall view, then set priorities; select targeted and scalable efforts; benchmark, scoping out the problems and solutions found at other libraries; create short feedback loops and then take action; and involve key staff in their areas.


Mark S. Thompson has 25 years of experience as a librarian spanning many arenas: corporate (Bell System; Dow Jones & Co.), academic (Fairleigh Dickinson Univ.), and public. He also founded his own information broker firm (Knowledge Resources). He is currently Assistant Director of the Library at Bergen Community College in Paramus, NJ.


Monday, September 25 6:00-7:30

Poster

Preston

Bribes Can Work: Ensuring Your Assessment Tool Is Not Ignored

Luke Vilelle

You've got the questions. They've got the answers. It seems so simple. Yet no matter how finely tuned your survey is, how penetrating your focus group questions are, or how enlightening your usability study might be, you still have to get people to participate. Breaking through today's cluttered world to reach your target population can prove difficult. This practical, experience-based poster session will focus on the art of drawing students into your web of assessment. The presenter, who leads the marketing team at the University Libraries at Virginia Tech, will share his discoveries from marketing multiple surveys and usability studies over the past year and a half. Assessment tools to be highlighted include an online iPod contest that drew over 1,100 entrants into the new library catalog, a services survey conducted in the library lobby, and a usability study. Based on the complexity of the assessment tool and the time required for completion, consider whether a giveaway is needed to entice students to participate. What giveaways attract the most attention? Is it better to offer a guaranteed cheap giveaway to everybody, or only a chance to win a more valuable prize? The poster session will discuss the effectiveness of various prizes, from a free soda to a chance to win an iPod, that have been used at Virginia Tech. Picking the lure is only the first step. The presenter will also discuss methods of marketing the assessment tool and any associated giveaway. If nobody knows you are awarding iPods, then your carefully targeted giveaway will be for naught. Sandwich boards, web site publicity, and old-fashioned personal interactions are a few of the methods to be discussed in this poster session. The poster will display its points through graphics and pictures whenever possible, and will use as large a text size as possible. The poster will be accompanied by a handout that summarizes its key points and includes a short bibliography. The presenter hopes attendees will leave with concrete ideas that can be used to increase participation rates in their next assessment effort. Luke Vilelle has been an Outreach Librarian at Virginia Tech since August 2004. He has presented poster sessions on marketing (at ACRL 2005) and on assessing virtual reference (at an ALA 2005 preconference), and is part of a panel session on assessment at the Virginia ACRL chapter's 2006 spring program.


Monday, September 25 6:00-7:30

Poster

Preston

Assessing Library Instruction with Experimental Designs

Scott White and Remi Castonguay

The amount of information now available to students has created greater expectations on the part of faculty concerning student use of information resources. However, it is becoming evident that students are being overwhelmed by assignments, the amount of information available, and the increasing number of sources in which to look. At LaGuardia, we offer one-shot library instruction sessions, credit-bearing courses, and one-on-one consultations to teach students how to conduct meaningful research and prepare well-constructed assignments. We are now planning and conducting research to see whether the instruction programs are affecting students in a variety of ways. Our assessment will focus on two modes of instruction: a three-credit course offered in a cluster with an ESL course and an introductory social science course; and one-hour instruction sessions, mandated as part of an English research class and voluntary for any faculty who wish to bring a class to the library for a session. In Fall 2005, we conducted over 150 one-hour classes. A student usability study currently under way at LaGuardia Community College's Library Media Resources Center is measuring how students navigate the Library's website for information. This qualitative study, using questionnaires and in-depth student interviews, will present a picture of how students use the Library and its virtual resources. It is becoming clear that students who attend library instruction sessions perform differently on the tests than those who don't. Other research designs are currently being developed to help us answer why. The Library's instruction cluster will also be evaluated using a quasi-experimental design. At this point, 50 students have been taught in the cluster over the last two years. We are planning to match those students with students who did not take the cluster to see if there are differences in student performance, retention, transfer, and graduation rates. Faculty believe that the course helps students to perform better in classes where research is required and gives them a better understanding of how the Library can help them successfully complete assignments in support of their class work. Anecdotal evidence suggests that the credit course, called LRC 102, Information Strategies, has a positive effect on students in terms of performance, retention, transfer, and graduation. Using student data and conducting interviews with students who took the class and those in the control group, we will test the hypothesis that students who complete the LRC course will perform better in the above categories. Using various statistical analyses, we will control for demographics, national origins, and student backgrounds.
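
A hedged sketch of the kind of matched comparison such a quasi-experimental design can use (not the authors' actual design or data): pair each cluster student with a non-cluster student on observable characteristics, then compare an outcome such as GPA between the matched groups. The file, column, and variable names below are hypothetical.

    # Sketch: naive nearest-neighbor matching on covariates, then an outcome comparison.
    # "students.csv" and its columns (took_cluster, age, gpa_entry, gpa_final) are hypothetical.
    import pandas as pd
    from scipy import stats

    df = pd.read_csv("students.csv")
    treated = df[df["took_cluster"] == 1]
    pool = df[df["took_cluster"] == 0].copy()

    matches = []
    for _, s in treated.iterrows():
        # Match on entering GPA and age; simple absolute-distance nearest neighbor.
        dist = (pool["gpa_entry"] - s["gpa_entry"]).abs() + (pool["age"] - s["age"]).abs()
        best = dist.idxmin()
        matches.append(pool.loc[best])
        pool = pool.drop(best)  # match without replacement

    control = pd.DataFrame(matches)
    t, p = stats.ttest_ind(treated["gpa_final"], control["gpa_final"])
    print(f"mean final GPA (cluster) = {treated['gpa_final'].mean():.2f}, "
          f"(matched comparison) = {control['gpa_final'].mean():.2f}, p = {p:.3f}")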

Scott White is Head/Access Services and Systems Librarian at LaGuardia Community College (LAGCC), CUNY. He is also an adjunct lecturer at John Jay College. He teaches the Library's three-credit class in a three-course cluster at LAGCC. Remi Castonguay is the Coordinator of Instructional Resources Development at LAGCC, developing digital instructional materials for use in library instruction. He has completed a full web usability study of the LaGuardia Library website and is currently evaluating collected data.

Monday, September 25 6:00-7:30

Poster

Preston

Developing Personas for Evaluating Library Service Needs

Dan Wilson

This poster will explain a project being undertaken this year by the Claude Moore Health Sciences Library in conjunction with UVa's Public Health Sciences (PHS) department. The goals of the project are to develop archetypal user "personas," based on primary user group members, to assist in prioritizing changes to library services, and to evaluate existing library services to determine the extent to which they are meeting the needs of users. Working collaboratively, PHS will provide the structure and methodology of the project, while Library faculty will assist with content. For our purposes, personas are defined as archetypal users whose characteristics are derived from quantitative and qualitative data gained primarily from interviews and surveys. Each persona will represent a library user group whose members share similar information needs. The framework of the project includes a web survey and interviews with representatives of our major user groups. The web survey was approved by the University of Virginia's Institutional Review Board for Health Sciences Research (IRB). The web survey was released to targeted user groups in the middle of April and ran until the first of May. Interviewing is currently taking place and should be finished by the end of September. Each interview is being recorded, with the interviewee compensated for his/her time with a free lunch at the University Hospital cafeteria. Following each interview, PHS staff are reviewing, coding, and analyzing content. In the fall, PHS staff will begin developing personas based on the interviews and the results of the survey. A draft of the personas will then be presented to Library faculty for their review. The Library plans to use the personas to evaluate the Library's web page as well as to assess open-enrollment courses offered in the Library. In particular, we are hoping to ascertain the effectiveness of our current line-up of courses and delivery methods. Once the personas are established, they will be used to assess and evaluate other user services.

Dan Wilson has been the Assistant Director for Collection Management & Access Services at the University of Virginia Health Sciences Library since 1991. He manages the LibQual survey at the library and is active in developing local tools for assessing library services.


Monday, September 25 6:00-7:30

Poster

Preston

Perceiving Perception: A Case Study of Undergraduates' Perception of Academic Libraries

Steven Yates

As part of work funded by an IMLS grant to educate and train 21st-century academic librarians, the author developed and administered an online survey to undergraduate students completing requirements in a medium-sized university's Honors College. The survey aimed to gauge the students' perceptions of the university's current library resources. Students were also asked to project what they would like to see from the library system. These results were then compared to the OCLC report Perceptions of Libraries and Information Resources to determine how a case study of perception in a small student population reflects the trends found in the report.

Steven Yates is an IMLS fellow with University of Alabama Libraries and the University of Alabama School of Library and Information Studies. Elizabeth Aversa is Director and Professor at the School of Library and Information Studies at the University of Alabama.


Tuesday, September 26 9:00-10:30

Parallel 3 #1
Information Literacy I

Salon A

Using the SAILS Test to Assess Information Literacy

Carolyn Radcliff and Joe Salem

The Standardized Assessment of Information Literacy Skills (SAILS) is a new tool for programmatic-level assessment of information literacy levels of cohorts of students. The SAILS test is based directly on the ACRL Information Competencies for Higher Education and the ACRL Objectives for Information Literacy Instruction. Focusing on both fundamental and advanced information literacy skills and concepts, the test asks students questions about research strategies; selecting sources; understanding and using finding tools; developing and revising search strategies; evaluating results; retrieving materials; documenting sources; and a host of legal and social issues related to ethical and effective use of information. Test questions were developed through a rigorous process of creation, review, testing, and revision. The test bank consists of more than 200 questions in multiple-choice format. The project team created a Web-based system of test administration, which facilitates ease of administration, data collection, and data analysis. Data are analyzed using item response theory as the measurement model. Results of the test are presented within skill sets and by the demographic variables of class standing and major. Comparisons with groups of institutions are also included. The goals of Project SAILS were to create an information literacy assessment tool that is valid and reliable; is easily administered; contains items not specific to a particular institution or library; and provides for both external and internal benchmarking. Through three years of development and invaluable feedback from participating institutions, those goals were met. The SAILS test is now in production with significant improvements in ease of test administration and clarity of reported results. Attendees will develop an understanding of the SAILS test and whether it may be useful at their own institutions. They will learn that SAILS can be administered to students at different stages in their college careers, allowing for comparison of freshmen to seniors. It is also possible to measure the effect of different teaching strategies and interventions. In these ways, librarians and their collaborators can know if students are being prepared to be successful information searchers and users. Project SAILS is located at Kent State University in Kent, Ohio and has received support from the Institute of Museum and Library Services, the Association of Research Libraries, the Ohio Board of Regents, and more than 80 institutions in Canada and the U.S. that participated in the development phase of the project. The presenters are members of the Project SAILS team.
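
The abstract notes that SAILS data are analyzed using item response theory. The sketch below is only a generic illustration of that idea, a one-parameter (Rasch) model scored by maximum likelihood; it is not the actual SAILS measurement model, and the item difficulties and responses shown are invented.

    # Generic one-parameter (Rasch) IRT sketch: estimate a student's ability (theta)
    # from right/wrong answers given known item difficulties. All values are invented.
    import numpy as np
    from scipy.optimize import minimize_scalar

    difficulties = np.array([-1.2, -0.4, 0.0, 0.7, 1.5])   # hypothetical item difficulties (logits)
    responses    = np.array([1, 1, 1, 0, 0])                # 1 = correct, 0 = incorrect

    def neg_log_likelihood(theta):
        p = 1.0 / (1.0 + np.exp(-(theta - difficulties)))   # P(correct | theta, difficulty)
        return -np.sum(responses * np.log(p) + (1 - responses) * np.log(1 - p))

    result = minimize_scalar(neg_log_likelihood, bounds=(-4, 4), method="bounded")
    print(f"estimated ability: {result.x:.2f} logits")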

Carolyn Radcliff is associate professor and reference and instruction librarian at Kent State University. She is a founding member of Project SAILS and is co-administrator for the Wisconsin-Ohio Reference Evaluation Program (WOREP). She has published and presented in the areas of information literacy assessment, reference service, and reference assessment. Joe Salem is assistant professor for Libraries and Media Services at Kent State University. He assumed his current position as Head of Reference and Government Information Services in 2005. His research and professional interests include the assessment of library services in general and information literacy instruction in particular. He is a core member of the Project SAILS team.

Tuesday, September 26 9:00-10:30

Parallel 3 #1
Information Literacy I

Salon A

Scenario-based ICT Literacy Assessment: A New Tool for Evaluating the Effectiveness of Library Instructional Programs

Panel: Dr. Gordon Smith, Alexius Smith Macklin, and Dr. Mary M. Somerville

Panelists from Purdue University, California Polytechnic State University, San Luis Obispo, and the California State University system-wide Office of the Chancellor will describe a newly developed assessment of information and communication technology (ICT) literacy and discuss its potential as a powerful means of assessing the effectiveness of library instructional programs. Results from the first large-scale administration of the assessment will be presented as well as preliminary findings from its use in pre- and post- studies, and plans for ongoing longitudinal studies will be described. The ICT Literacy Assessment is the product of a partnership between leading higher education institutions and the Educational Testing Service. This group has developed the first realistic and interactive problem-based, scenario-based, web-based assessment tool that crosses disciplines and class levels to assess ICT proficiencies (cognitive and technical skills along with the ethical and legal use of information) especially developed for the higher education environment. Information about the ICT Literacy Assessment is available at http://www.ets.org/ictliteracy.

Dr. Gordon Smith is Director of Systemwide Library Programs in the CSU Office of the Chancellor. His responsibilities include management of systemwide library technology initiatives and development and implementation of systemwide library plans and policies. He is a member of the ETS National Advisory Committee for ICT Literacy Assessment. Alexius Smith Macklin, Associate Professor of Library and Information Science at Purdue University, specializes in distance learning, digital libraries, and general informatics. Macklin is currently pursuing a PhD in Curriculum and Instruction, with a focus on integrating information and communication technology into the higher education curriculum. Dr. Mary M. Somerville is Assistant Dean of Information and Instructional Services at California Polytechnic State University and adjunct faculty in San José State University's School of Library and Information Science. Recent publications appear in The Electronic Library, New Library World, Internet Reference Services Quarterly, and Australian Academic and Research Libraries.


Tuesday, September 26 9:00-10:30

Parallel 3 #2
Moving Assessment Forward

Salon B

Data, Policy, Action: The Continuous Improvement Cycle. Cases from ARL and Carnegie MA I Libraries

Susan J. Beck and Wanda V. Dole

This paper jointly reports two related research projects which focus on the impact of assessment on library management decision making in ARL and Carnegie MA I libraries. In 2002, Beck examined the impact of assessment on decision making in nine public ARL libraries in the United States and Canada. In 2004 and 2005, Dole, Hurych, and Liebst replicated Beck's research methodology in nine Carnegie MA I libraries. This paper will compare the results of these two research projects. University library directors from each institution were interviewed concerning the impact of assessment on decision making in their organizations. Cabinet-level administrators were interviewed as a group regarding the impact of assessment on decision making within their purviews. The directors and administrators were asked the same set of questions. At the conclusion of each session, two brief surveys were administered to subjects. The first survey, based on Amos Lakos, Betsy Wilson, and Shelley Phipps' "Do You Have a Culture of Assessment?," measured each respondent's beliefs regarding his/her institution's development of a culture of assessment. The second survey, Beck's "Factors in Decision Making," asked the respondents to rank the importance of various factors, such as cost, technology, and staff buy-in, on decision making. The paper will compare the results of the interviews and surveys in the two different types of libraries.

Susan J. Beck, "Making Informed Decisions: The Implications of Assessment," in Learning to Make a Difference: Proceedings of the Eleventh National Conference of the Association of College and Research Libraries, April 10-13, 2003, Charlotte, North Carolina (Chicago: American Library Association, 2003), 52-59. Beck, "The Extent and Variety of Data-Informed Decisions at Academic Libraries: An Analysis of Nine Libraries of the Association of Research Libraries in the U.S.A. & Canada," 5th Northumbria International Conference on Performance Measurement in Libraries, Collingwood College, Durham, England, UK, 29 July 2003. Beck, "Data-Informed Decision Making," ARL Bimonthly Report no. 230/231 (October/December 2003): 30. Wanda Dole, Jitka M. Hurych, and Anne Liebst, "Assessment: A Core Competency for Library Leaders," Library Administration & Management 19, no. 3 (Summer 2005): 125-32. Dole, Hurych, and Liebst, "Core Competencies for Library Leaders: Is Assessment a Core Competency?," in Papers from the International Conference held in Sofia, Bulgaria, 3-5 November 2004 (Sofia: University of Sofia, 2005), 236-249.


Susan J. Beck is Head of Public Services at the Paul Robeson Library of Rutgers, The State University of New Jersey, in Camden, New Jersey. She is currently the Chair of the ALA RUSA Reference Services Section. Her research interests include management of reference, virtual reference services, and assessment. Wanda V. Dole is Dean of Libraries at the Ottenheimer Library, University of Arkansas at Little Rock. She is currently the chair of the ALA ALCTS International Relations Committee and a member of the ALA Committee on Research and Statistics. Her research interests include assessment, strategic planning, and professional values.


Tuesday, September 26 9:00-10:30

Parallel 3 #2
Moving Assessment Forward

Salon B

Evidence-Based Library Management: A View to the Future

Amos Lakos

In today's rapidly changing information and financial arenas, libraries must continually demonstrate that their services have relevance, value, and impact for institutional stakeholders and customers. If libraries are to succeed in this new environment, decisions and decision making must be based on effective use of data and management information. This paper is an extension of my earlier work on developing management information services in libraries and on the culture of assessment. I will focus my observations on the new opportunities for data analysis, assessment delivery, and decision making in libraries. My work is informed by an earlier study done by Susan Beck (Rutgers) and the recent and ongoing assessment work carried out by Steve Hiller (University of Washington) and James Self (University of Virginia). In an earlier paper, "Creating a Culture of Assessment: A Catalyst for Organizational Change" (portal: Libraries and the Academy 4, no. 3, July 2004), Shelley Phipps (University of Arizona) and I discussed the need for libraries to build a "culture of assessment" into their larger organizational cultures. We described a new paradigm that encompasses organizational culture change, utilizing the learning organization and a systems thinking approach. We made the case for transforming library organizational cultures and librarians' professional culture in such a way as to focus on achieving quality and measurable outcomes for library customers and institutional stakeholders. We then defined the components of a culture of assessment and the elements required to implement it in libraries. Additionally, we identified the need for a clearly articulated purpose and strong leadership, the internalization of systems thinking, organizational openness and trust, ongoing open communication, and an actively encouraged climate of risk taking. In short, we promulgated a future-oriented, transformative culture that values and uses assessment to succeed. We know that cultural change in organizations develops slowly and is a learned process. We also acknowledge the need for organizational learning in order to strengthen the foundations supporting transformational change. However, the information environment continues to evolve and change rapidly, at a pace that libraries have difficulty anticipating and responding to. My presentation will examine how developments in the information technology (IT) arena, especially the increased dominance of very large networked infrastructures and associated services, large-scale digitization projects, collaborative frameworks, and economic and market trends, impact and will continue to impact our library environment, and how those developments introduce a variety of opportunities to move libraries and librarians to an evidence-based framework. Libraries will need to seize these opportunities or face the likelihood of becoming relic or legacy organizations. In addition to drawing upon my earlier work and that of others, I will be conducting a series of direct discussions with researchers and library leaders, focusing on leaders' decision-making activities and the degree to which they use data in their decision making. My conclusions will be future oriented and possibly speculative. They will describe a number of possible scenarios for incorporating data and analysis services to support the need to make evidence-based librarianship the norm.


Amos Lakos is Librarian at the Rosenfeld Management Library, UCLA, with over 29 years of experience in academic library work. He developed and maintained a management information environment in the University of Waterloo Library. He led the development of the concept of a culture of assessment in libraries and co-developed, with Shelley Phipps (University of Arizona), the ARL "Creating a Culture of Assessment" workshop. As Head of the LITA Internet Portals Interest Group, he organized and presented a full-day symposium on portals in libraries at the Annual ALA Conference in Orlando in June 2004. His current research focuses on portal implementations, the impact of large-scale digitization on libraries, making assessment count, and workflow redesign. Further information: http://personal.anderson.ucla.edu/amos.lakos/index.html Email: aalakos@library.uwaterloo.ca


Tuesday, September 26 9:00-10:30

Parallel 3 #2
Moving Assessment Forward

Salon B

Keys to Effective, Sustainable, and Practical Assessment

Steve Hiller, Martha Kyrillidou, and Jim Self

The Association of Research Libraries-sponsored program "Making Library Assessment Work" is a two-year effort to evaluate assessment efforts in ARL libraries, led by ARL Visiting Program Officers Steve Hiller (University of Washington Libraries) and Jim Self (University of Virginia Library) under the aegis of Martha Kyrillidou, the Director of ARL Statistics and Service Quality Programs. The goals of this program are "to assess the state of assessment efforts in individual research libraries, identify barriers and facilitators of assessment, and devise pragmatic approaches to assessment that can flourish in different local environments." Seven libraries participated in Phase I (February to June 2005) and another 18 libraries in Phase II (September 2005 - December 2006). As of September 2006, assessment efforts have been evaluated at 22 ARL libraries. Each participating library prepared a "self study" of its assessment efforts and needs, which is followed by a 1.5-day site visit and a report containing recommendations and suggestions for establishing an effective, sustainable, and practical assessment program. Assessment activities were evaluated within the context of the library, the user community, and the parent organization. This paper represents a preliminary report on our findings and the implications for assessment in research libraries. Prior to the start of the program, we identified the following factors as important for good library assessment: library leadership; organizational culture; identifying responsibility for assessment; library priorities; sufficiency of resources; data infrastructure; assessment skills and expertise; sustainability; analyzing and presenting results; and using results to improve libraries.

While local conditions and organizational cultures play important roles in the different approaches each library has taken to assessment, several of the factors listed above have emerged as especially critical for successful assessment. These appear to be the keys to developing effective, sustainable, and practical assessment in research libraries.

Steve Hiller, Director, Assessment and Planning, University of Washington Libraries; Martha Kyrillidou, Director of Statistics and Service Quality Programs, Association of Research Libraries; Jim Self, Director, Management Information Services, University of Virginia Library


Tuesday, September 26 11:00-12:00

Parallel 4 #1
Information Literacy II

Salon A

The Fourth "R": Information Literacy in Institutional Assessment

Louise Fluk

Does your school succeed in producing information-literate graduates? How do you know? This presentation will challenge participants to confront the practical difficulties of assessing student information literacy competencies as part of institutional assessment. It will feature a rubric developed to test information literacy competencies, range-finders developed to train raters of student work to use the rubric, and plans for professional development of faculty interested in infusing information literacy skills development into their courses. LaGuardia Community College, in Long Island City, NY, has an enrollment of some 13,000 matriculated students (FTEs), an active Adult and Continuing Education program serving about 20,000 students, and three affiliated high schools. It forms part of the publicly funded City University of New York, a consortium of 19 community colleges, four-year colleges, and graduate and professional schools. In 2002, LaGuardia developed an outcomes assessment plan to meet a mandate set by its accrediting agency, the Middle States Commission on Higher Education. Having identified seven core competencies that all LaGuardia graduates should possess, the College has been systematically developing tools, procedures, and professional development opportunities to make implementation of the outcomes assessment plan a reality. Rubrics have already been developed for critical literacy (the core competencies of reading, writing, and critical thinking) and for oral communication. The rubrics are designed to evaluate student work across the disciplines and across students' careers at the College, using e-Portfolios as a vehicle. Every entering student now receives instruction in how to create a personal web-based portfolio, a kind of electronic curriculum vitae. A section of each student's e-Portfolio contains academic work deposited there for the purpose of subsequent assessment. Beginning in 2004, a group of library and discipline faculty members has been collaborating on the development of a rubric to test student performance in the core competency of Research and Information Literacy. The original draft was based on the Information Literacy Competency Standards for Higher Education published in 2000 by the Association of College and Research Libraries. It followed the format of LaGuardia's Critical Literacy Rubric in that it stated learning outcomes, featured six levels of competency, and provided criteria and adjectives for distinguishing each level. The committee did extensive testing of successive versions of the rubric using sample student research papers, annotated bibliographies, and narratives of research. Then the committee selected range-finders, that is, samples of student work that represent a perfect 6, a perfect 5, and so on. The range-finders will serve to train faculty to apply the rubric to student work deposited in e-Portfolios. A year-long professional development seminar for faculty on building information literacy in the disciplines is being planned; the rubric and range-finders will be important tools for seminar participants. This presentation will introduce the process, the rubric, and the challenges of implementing meaningful assessment. Materials will be available for hands-on norming of range-finders, and participants will engage in practical discussion of assessment issues, among them how to assure validity and consistency of results and how to organize professional development for faculty raters.
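
The norming challenge mentioned above, assuring that different raters apply the rubric consistently, is often checked with a simple agreement statistic. The sketch below is a generic illustration, not LaGuardia's procedure; the two raters' scores on a six-level rubric are invented.

    # Generic sketch: check inter-rater consistency after a rubric norming session.
    # Two raters score the same ten student papers on a 1-6 rubric; scores are invented.
    import numpy as np

    rater_a = np.array([6, 5, 5, 4, 3, 2, 4, 5, 3, 6])
    rater_b = np.array([6, 5, 4, 4, 3, 2, 5, 5, 3, 5])

    exact_agreement = np.mean(rater_a == rater_b)

    # Unweighted Cohen's kappa: agreement corrected for chance.
    levels = np.arange(1, 7)
    p_a = np.array([np.mean(rater_a == c) for c in levels])
    p_b = np.array([np.mean(rater_b == c) for c in levels])
    p_chance = np.sum(p_a * p_b)
    kappa = (exact_agreement - p_chance) / (1 - p_chance)

    print(f"exact agreement: {exact_agreement:.2f}, Cohen's kappa: {kappa:.2f}")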


Professor Louise Fluk has been Coordinator of Instruction at LaGuardia Community College for the past eight years. She was instrumental in developing the three-credit course, Information Strategies, which introduces students to the research process. She chairs the Information Literacy Assessment Rubric Committee and is currently supervising the norming of the College's Information Literacy and Research Assessment Rubric.


Tuesday, September 26 11:00-12:00

Parallel 4 #1
Information Literacy II

Salon A

The Right Assessment Tool for the Job: Seeking a Match Between Method and Need

Megan Oakleaf

In the twenty-first century, all institutions of higher education face calls for accountability. Until recently, the demands faced by other academic units on campus have passed over college and university libraries. Now, academic librarians are increasingly pressured to prove that the resources and services they provide result in improvements in student learning and development. In answer to calls for accountability, academic librarians must demonstrate their contribution to the teaching and learning outcomes of their institutions. One way librarians can achieve this goal is by assessing information literacy instruction on campus. When assessing information literacy skills, academic librarians have a variety of outcomes-based assessment tools from which to choose. The selection of an assessment tool should be based on the purposes of the assessment and the fit between the needs of the assessment situation and the strengths and weaknesses of individual assessment approaches. In this paper presentation, participants will learn about the major purposes of assessment and how they impact the criteria used to select an assessment tool. Among the purposes of assessment that will be discussed are the needs to respond to calls for accountability, to participate in accreditation processes, to make program improvements, and to enrich the student learning experience. Participants will also learn about several criteria useful in selecting an assessment tool, including utility, relevance, stakeholder needs, measurability, and cost. The presentation will also describe the benefits and limitations of several outcomes-based approaches to the assessment of student learning outcomes. Specifically, participants will learn about assessment tools that evaluate student learning outcomes taught via information literacy instruction. The theoretical support, benefits, limitations, and relevant research related to the use of surveys, tests, performance assessments, and rubrics will be outlined, and examples will be given of each assessment tool. Armed with background knowledge and lessons learned from colleagues throughout higher education, academic librarians can embrace the effective and efficient assessment of information literacy instruction efforts on their campuses. This presentation will give participants a jump start toward becoming proficient assessors ready to answer calls for accountability.

Megan Oakleaf is an Assistant Professor in the School of Information Studies at Syracuse University. Prior to this position, Oakleaf served as Librarian for Instruction and Undergraduate Research at North Carolina State University. In this role, she trained fellow librarians in instructional theory and methods. She also provided library instruction for the First-Year Writing Program, First-Year College, and Department of Communication. Oakleaf completed her dissertation, entitled "Assessing Information Literacy Skills: A Rubric Approach," at the School of Information and Library Science at the University of North Carolina at Chapel Hill. Prior to a career in librarianship, Oakleaf taught advanced composition in public secondary schools. Her research interests focus on outcomes-based assessment, user education, information services, and digital librarianship.


Tuesday, September 26 11:00-12:00

Parallel 4 #2
Evaluation and Assessment Methods

Salon B

Choosing the Best Tools for Evaluating Your Library

Neal Kaske

We have numerous library assessment tools available to us today. Most have proven validity and high reliability, but that alone does not ensure that we have selected the best tool or tools to build our assessment case given our unique circumstances. This paper offers a series of questions which, when answered, provide direction as to the form of assessment and the appropriate measurement tools to employ. The questions are: 1) Why are we measuring? 2) Who will use the results? 3) Do we have baseline data, or is this effort to establish a benchmark? 4) What will this tool or tools tell us, and what is the precision of its measurement? 5) What new key information will we have from this effort? 6) What are the initial and continuing costs for using this tool? 7) What are the staffing requirements, and what does the staff take away from the effort? 8) Will the assessment resonate with and help support the goals of the library's parent organization? 9) How will the findings be utilized by the library's parent organization? 10) How might the findings from the assessment be used against the library? Methods for answering these questions are provided, accompanied by graphic illustrations of the different paths one can take in choosing the best library assessment tool or tools for your given circumstances. Neal Kaske, Director of Statistics and Surveys, U.S. National Commission on Libraries and Information Science, has been an active researcher in library evaluation for many years. His experience includes academic library administration, teaching, research, research management, and grant management. Neal's doctorate is in industrial engineering (library systems management), his master's is in librarianship, and his baccalaureate is in sociology.


Tuesday, September 26 11:00-12:00

Parallel 4 #2
Evaluation and Assessment Methods

Salon B

Developing Best Fit Library Evaluation Strategies

Charles R. McClure, John Carlo Bertot, Paul T. Jaeger and John T. Snead

This paper examines and describes best-fit evaluation strategies based on libraries' local situational contexts. Drawing upon lessons and findings from a range of library evaluation projects, it examines key issues and factors that appear to be associated with developing best-fit evaluation efforts for specific situational contexts. The paper then presents future directions for the development and implementation of best-fit evaluation strategies for libraries, along with specific recommendations for implementing a best-fit evaluation strategy in library settings. Library decision makers need to know the impacts, benefits, and value of the library services and resources provided to the communities they serve. To identify these, library practitioners and researchers need best-fit evaluation strategies designed to provide data that describe and report impacts, benefits, and value in terms of the unique needs within the specific situational contexts of individual libraries. They need to be able to select the best evaluation strategy given the: specific program, service, resource use, or other item being evaluated; situational factors unique to that library and its setting; evaluation goals to be accomplished; motivation for the evaluation; availability of various data sources; availability of staff and other resources for the evaluation; diverse populations represented within the communities served by the library; governance factors; extent and availability of library resources to support the strategy; and intended audience of the evaluation. To understand the impacts, benefits, and value of library services and resources, library decision makers must select evaluation strategies appropriate to targeted data needs within specific situational contexts. Library decision makers are often faced with difficulties in matching their data needs with the appropriate evaluation approaches. There are many different kinds of evaluation data that a library may need and evaluation approaches that a library might employ. As a result, many libraries struggle with the problem of choosing the best evaluation approaches to effectively and efficiently demonstrate the value they provide. The development of best-fit evaluation strategies would significantly help to address such issues. Issues and factors researchers and practitioners should consider in the development and implementation of a best-fit evaluation strategy include: 1. Success with which libraries are currently employing a number of different evaluation approaches; 2. Problematic evaluation efforts in libraries (i.e., historical but outdated efforts, mismatched evaluation efforts to data needs); 3. How library situational factors (organizational, community, other) affect the successful use (or unsuccessful use) of leading evaluation approaches; and


4. Types of evaluations available for use; potential data types produced by each evaluation; strengths and weaknesses of each evaluation approach; and potential applications of each evaluation approach within varying library situational settings and contexts. The issues and factors examined in this paper will facilitate the use of the most appropriate evaluation strategy available to meet data needs and to demonstrate library community impact and value.

Charles R. McClure is Francis Eppes Professor and Director of the Information Use Management and Policy Institute, College of Information, Florida State University; John Carlo Bertot is Professor and Associate Director; Paul T. Jaeger is Manager for Research Development; and John T. Snead is Research Coordinator at the Information Use Management and Policy Institute and is also a doctoral student in the College of Information.


Tuesday, September 26 11:00-12:00

Parallel 4 #3
Strategic Planning

Ashlawn & Highlands

Accountability to Key Stakeholders

Raynna Bowlby and Daniel O'Mahony

Brown University's ambitious Plan for Academic Enrichment sets forth guidelines for Brown's priorities and direction over the next fifteen to twenty years. A fundamental component of the plan is a set of measures designed to assess the University's progress toward achieving the plan's strategic goals. In consultation with the University's Office of Institutional Research, the Library developed key indicators to measure its contributions to the success of the strategic initiatives outlined in the plan, utilizing data from the annual ARL survey, periodic LibQUAL+ user assessments, and targeted outcome measures from University surveys. Results are submitted annually for inclusion in the Provost's report to University administrators and the Corporation (trustees). These measures have been used to monitor the impact of the University's and the Library's efforts in strengthening the academic enterprise of the University; thus, they directly and indirectly inform decisions by University administrators and Library decision-makers concerning priorities, services, and resource allocation. These measures also serve as a framework for further assessment within the Library to identify ways to align the Library's efforts with the plan in order to make meaningful contributions to academic enrichment on campus.

Raynna Bowlby has served at Brown University Library for 19 years, and as the Library's Organizational & Staff Development Officer was responsible for the administration of the 2005 LibQUAL+ survey on campus. She has an MLS from Simmons College, an MBA from the University of Rhode Island, and a BA from Bates College. Daniel O'Mahony has served at Brown University Library for 14 years, and as team leader of the Library User Needs Team oversaw the administration of the 2002 LibQUAL+ survey on campus. He has an MSLS from Florida State University and a BA from the University of Florida.


Tuesday, September 26 11:00-12:00

Parallel 4 #3
Strategic Planning

Ashlawn & Highlands

Drilling the LibQUAL+ Data for Strategic Planning

Stewart Saunders

When the Purdue University Libraries decided in 2004 to work on a strategic plan, it was recognized that realistic planning required reliable assessment data. The assessment data needed to be reliable not only at the university level but also at the college level. The colleges reflect discipline areas. We wanted to be able to drill down into the data, i.e., to break it out into subgroups, to see differences and variation by college and library, or, for that matter, by any subgroup that we might choose to investigate. To support this, the sampling of students was based on a stratum for each of the 10 colleges. All faculty were included in the survey design. 1. Drilling down. Radar charts were created for undergraduates, graduate students, and faculty in each of the 10 colleges. It became apparent that certain themes were consistent across colleges and user groups, but the charts also revealed basic differences among colleges and other user groups. Use patterns, satisfaction patterns, and outcome patterns were also broken out by user groups. 2. The strategic plan. The LibQUAL+ data, and more specifically its detailed analysis, had an impact on the strategic planning process. This conclusion is supported by a questionnaire filled out by the 30 participants in the Strategic Planning Group. This survey asked the participants to assess the specific areas in which the LibQUAL+ data affected the planning process. The LibQUAL+ data that most influenced them were the data that surprised them. These data were not visible at the general level; they surfaced as we drilled down to lower levels. Others believed that LibQUAL+ data had a negative effect on strategic planning. I will discuss 1) how this happened, 2) how it led to the scrapping of the first draft of the plan, and 3) the birth of a second draft, which folded in what was of value from the LibQUAL+ assessment.
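
As a generic illustration of this kind of drilling down (not Purdue's actual analysis), the sketch below groups hypothetical LibQUAL+ item scores by college and user group and draws a simple radar chart for one subgroup. The data file, column names, dimension labels, and the "Engineering"/"undergraduate" cell are all assumptions.

    # Sketch: break LibQUAL+ dimension scores out by college and user group,
    # then draw a radar chart for one subgroup. File, columns, and cell are hypothetical.
    import numpy as np
    import pandas as pd
    import matplotlib.pyplot as plt

    df = pd.read_csv("libqual_scores.csv")   # one row per respondent
    dims = ["affect_of_service", "information_control", "library_as_place"]

    # Drill down: mean perceived score per dimension for each college / user-group cell.
    cells = df.groupby(["college", "user_group"])[dims].mean()
    print(cells)

    # Radar chart for a single (hypothetical) cell.
    values = cells.loc[("Engineering", "undergraduate")].tolist()
    angles = np.linspace(0, 2 * np.pi, len(dims), endpoint=False).tolist()
    values += values[:1]
    angles += angles[:1]                      # close the polygon

    ax = plt.subplot(polar=True)
    ax.plot(angles, values)
    ax.set_xticks(angles[:-1])
    ax.set_xticklabels(dims)
    plt.show()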

Stewart Saunders is the Collections Librarian for the Humanities, Social Sciences, and Education at Purdue University. He was the administrator for the LibQUAL+ survey. He authored, along with Judy Pask, "Differentiating Information Skills and Computer Skills: A Factor Analytic Approach," portal: Libraries and the Academy 4 (January 2004): 61-73.


Tuesday, September 26 2:00 p.m. - 3:00

Plenary II

Salon A/B

Changing User Needs and Perceptions

Cathy De Rosa, Vice President, Marketing & Library Services, Online Computer Library Center, Inc. (OCLC)
Cathy De Rosa has been responsible for putting together the OCLC environmental scans on libraries and the future. She will speak in the general area of the future(s) of research libraries and the role of assessment and market research in helping us shape that future.


Tuesday, September 26 3:00-5:00

Parallel 5 #1
Library As Place

Salon A

Assessing Learning Spaces: A Framework

Joan K. Lippincott

How do you know when a new building project or renovation is successful? Many libraries proudly discuss the tremendous increase in gate counts when they open information commons or similar facilities. However, is the gate count a useful or sufficient measure of the success of a facility? This session will present a framework for developing an assessment strategy for new or renovated technology-enabled learning spaces. Institutions are spending large sums on renovation, additions, and new construction of learning spaces in order to incorporate technology and address new teaching and learning strategies. These spaces include classrooms, libraries, information commons, multi-media centers, and computer labs. The presentation will describe a process whereby work on assessment strategies begins at the inception of the project, as a means of sharpening the goals for the new learning space. A wide variety of goals for these spaces will be explored, including goals related to learning, development of social community, use of technology in teaching and learning, and student engagement. Then, various methods of assessment to collect data to address the goals will be described. Examples will be provided that are in use in a variety of institutions. Finally, some suggestions on assessment implementation strategies will be described.

Joan K. Lippincott is the Associate Executive Director of the Coalition for Networked Information (CNI), a joint project of the Association of Research Libraries (ARL) and EDUCAUSE. Joan previously held positions in the libraries of Cornell, Georgetown, George Washington University, and SUNY at Brockport. She has written articles and made presentations on such topics as Net Gen students, networked information, learning spaces, collaboration among professional groups, assessment, and teaching and learning in the networked environment.


Tuesday, September 26 3:00-5:00

Parallel 5 #1
Library As Place

Salon A

Combining Quantitative and Qualitative Assessment of an Information Commons

Gordon Fretwell and Rachel Lewellen

The presentation reviews how a quantitative (observational study) project and a qualitative (focus group) project were conducted in tandem to enrich data analysis and improve insight into student use of the Learning Commons. The presentation addresses how the observational study data informed the focus group questions and how the results of the focus group sessions subsequently informed the need for future assessment. We discuss how the conclusions drawn from the combined data are more robust than conclusions drawn from either study in isolation. The presentation also briefly reviews how each study was designed, conducted, and analyzed.

Gordon Fretwell, Statistics Consultant, University of Massachusetts Amherst; Rachel Lewellen, Assessment Librarian, University of Massachusetts Amherst


Tuesday, September 26 3:00-5:00

Parallel 5 #1
Library As Place

Salon A

Listening to Users: The Role of Assessment in Renovation to Meet User Need

Kimberly Burke Sweetman and Lucinda Covert-Vail

In 2004, New York University's Bobst Library renovated four floors of user space. The project was successful in part due to the results of several library assessments conducted before, during, and after the renovation. This paper will present the various assessment steps taken as part of a larger renovation planning effort, including assessment design and implementation; user outreach, participation, and buy-in; outcomes; repurposing; and the current assessment culture. The NYU Division of Libraries participated in LibQUAL+ in 2002, with poor results. The poor LibQUAL+ scores, however, helped the Libraries gain University support for a much needed renovation. The renovation process marked a significant point of departure from the way change planning had been approached previously. Rather than presume to know what library users wanted, an assessment component was central to renovation planning. A variety of qualitative and quantitative assessment tools were used, including:
- User activity study: a user behavior observation study that captured data on preferred study locations and furniture, laptop use, food and beverage consumption, use of library materials, and use of personal materials
- Web-based user preferences survey: designed in conjunction with an architectural firm, the survey focused on structural elements of the architecture and user preferences
- Renovation focus groups: conducted to probe the results of earlier surveys and studies
- Naming focus groups: undergraduate, graduate, and staff focus groups to determine signage and naming of library service points in the renovated areas
- Web-based follow-up study to measure user satisfaction with the new spaces
- Quantitative follow-up analysis of renovated space usage

Our success with various assessment techniques, most notably the user activity studies and web-based surveys, enabled the creation of user spaces that met the needs of our user community. Perhaps more importantly, gathering and analyzing data as part of a discrete project improved staff confidence in our assessment abilities. By using both qualitative and quantitative data, we were able to understand user needs and to know that we met them. As a result, assessment has been incorporated into an increasing number of projects, and the concept of assessment is better understood by library staff at all levels.

Kimberly Burke Sweetman is Head of Access Services at NYU's Division of Libraries. In addition to teaching numerous workshops on library management, she has taught management at the LIU Palmer School of Library and Information Science. Ms. Sweetman holds an MA in Film Studies from Emory University and an MSLS from the Catholic University of America. She is the author of Managing Student Assistants, forthcoming in Neal-Schuman's popular How-To-Do-It series.

Lucinda Covert-Vail is Director of Public Services for NYU. Ms. Covert-Vail coordinates the activities of the Division, including program planning and implementation, assessment, development of user and electronic services, and human resources. Ms. Covert-Vail holds an AB in English from UCLA, an MS in Library Science from the University of Southern California, and an MS in Management from the Wagner Graduate School of Public Service at NYU.

Notes


Tuesday, September 26 3:00-5:00

Parallel 5 #1
Library As Place

Salon A

Net Generation College Students and the Library as Place

Aaron K. Shrimplin and Matthew Magnuson

This research project is grounded in the assumption that the needs and expectations of library users have fundamentally changed, and that librarians must comprehend this change and adapt their practices to this new user culture. In investigating Net Generation college students and their attitudes and views toward the library as place, the goal of this study is to better understand and incorporate students' needs and expectations into the planning and implementation of new roles, services, and uses of the library. This grant-funded research project uses Q methodology to discover different opinion types among college students. To the extent that librarians better understand what students think about the library, from the students' own points of view, and how they learn and use the library, they are in a better position to ask the right questions when rethinking roles, spaces, and services. Academic libraries have done a reasonably good job of collecting use and satisfaction data from their users, and in recent years they have been charting service quality with LibQUAL+(TM). While LibQUAL+ is a valuable tool for measuring service quality, it does not expose the variety of subjective viewpoints that students may hold toward the library and its services. To the extent that we are able to understand what students think about the library, from their own points of view, we can reach out to these students and design better services and better facilities. Our study builds on the growing body of research about library as place (see Campbell, 2006; Council on Library and Information Resources, 2005; and Ranseen, 2002) and introduces a methodology that is not well known in academic librarianship. Unlike other studies, we use Q methodology to examine the clusters of opinion held by Net Generation students with respect to the growing importance of the library as place. Q will be used to identify opinions, shared among students, on the issues they consider important about the library as place. Applying a hybrid of qualitative and quantitative statistical techniques, Q provides a method for the scientific study of human subjectivity; subjectivity in this context means nothing more than the communication of a person's own point of view about some topic. As such, it is well suited to developing an exploratory understanding of college students' attitudes and beliefs about the library. Q methodology is an intensive form of analysis and involves small numbers of subjects, making it an excellent tool for ongoing assessment of user culture on a tight budget.
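As a rough illustration of the by-person factor analysis at the core of Q methodology, the sketch below correlates participants' Q-sorts with one another and extracts principal components, so that participants who load on the same component represent a shared opinion type. It is a minimal sketch with hypothetical data (the number of participants, number of statements, and scoring range are assumptions, not values from this study).

```python
import numpy as np

# Hypothetical Q-sort data: each row is one participant's ranking of
# 30 statements about "library as place" on a -4..+4 scale. (A real
# Q-sort follows a forced quasi-normal distribution; random integers
# are used here only to keep the example short.)
rng = np.random.default_rng(0)
q_sorts = rng.integers(-4, 5, size=(12, 30))   # 12 participants x 30 statements

# Q methodology factors *people*, not items: correlate participants with
# each other, then extract components from that person-by-person matrix.
person_corr = np.corrcoef(q_sorts)              # 12 x 12 correlation matrix

eigenvalues, eigenvectors = np.linalg.eigh(person_corr)
order = np.argsort(eigenvalues)[::-1]           # strongest components first
eigenvalues, eigenvectors = eigenvalues[order], eigenvectors[:, order]

# Keep components with eigenvalue > 1 (a common, though not the only, rule).
keep = eigenvalues > 1.0
loadings = eigenvectors[:, keep] * np.sqrt(eigenvalues[keep])

# Participants with high loadings on the same component share an "opinion type".
for person, row in enumerate(loadings):
    print(f"participant {person}: loadings {np.round(row, 2)}")
```

In practice a rotation step (for example, varimax) and interpretation of the distinguishing statements would follow; this sketch shows only the mechanical core of identifying shared viewpoints.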

Aaron K. Shrimplin is an Associate Librarian at Miami University specializing in electronic information services and numeric data services. Aaron has co-authored articles published in C&RL News, Performance Measurement & Metrics, Library Collections, Acquisitions, & Technical Services, IASSIST Quarterly, and E-JASL. He has also presented at numerous national and regional conferences, including Computers in Libraries and HighEdWebDev.

Notes


Tuesday, September 26 3:00-5:00

Parallel 5 #2
Balanced Scorecard

Salon B

Balanced Scorecards in Public Libraries: A Project Summary

Joe Matthews

The goal of this project was to assess the utility of a Library Scorecard that could be used to communicate the value of the public library using a variety of performance measures. A workbook, Scorecards for Results: A Guide for Developing a Library Balanced Scorecard, was developed as part of a Federal grant from the Institute of Museum and Library Services (IMLS). The workbook was tested by more than thirty public libraries across the U.S. A survey of stakeholders was used to assess the utility of this management assessment tool.

Joe Matthews is a well-known consultant and the project manager for the IMLS-funded project to develop a Library Balanced Scorecard. He has written a number of books and published numerous articles. He received an MBA from the University of California, Irvine.

Notes


Tuesday, September 26 3:00-5:00

Parallel 5 #2
Balanced Scorecard

Salon B

The People Side of Planning & Implementing a Large Scale Balanced Scorecard Initiative

Susanna Pathak

What organizational infrastructure is needed to plan and implement the balanced scorecard? What leadership, training, staff support, and work processes can we use to create and sustain a balanced scorecard that is large scale and designed to serve as the organization's strategic plan? Drawing on VCU Libraries' experience in planning and implementing the balanced scorecard as its strategic plan, the presenter will describe the "people side" of the process at VCU, where the goal was to use a streamlined, efficient process. VCU's BSC (a two-year plan) is available at http://www.library.vcu.edu/strategicplan/index.html

Susanna Pathak is the Planning & Assessment Librarian at Virginia Commonwealth University Libraries (since 2000). She is co-chair of a LAMA program scheduled for the ALA Annual Conference in 2007 called "Balanced Scorecard: The Results Please!"

Notes


Tuesday, September 26 3:00-5:00

Parallel 5 #2
Balanced Scorecard

Salon B

Yours, Mine, and Ours: Staff Involvement in the Balanced Scorecard

Panel: Leland Deeds, Tabzeera Dosu, Laura Miller, Paul Rittelmeyer, Annette Stalnaker, Donna Tolson, and Carol Hunter
Most libraries have just a few specialized staff members who are heavily involved in assessment efforts. These employees may staff a management information services (MIS) unit, or there may be a single assessment coordinator who conducts surveys and tabulates reports. Other library employees often have little chance to be involved in assessment activities other than at the point of data collection, such as recording reference transactions or creating budget reports. At the University of Virginia, our primary assessment tool is the Balanced Scorecard (BSC), and we have an MIS unit to provide the technical support necessary for this tool. Many employees also participate in data collection activities. However, all library employees also have the opportunity to contribute to assessment efforts in another way: through membership on the Balanced Scorecard Committee. The BSC Committee played an instrumental part in the implementation of the Scorecard, and it continues to actively advise on and monitor its content and annual process. Part presentation by the current chair and a past committee head, and part panel discussion with current committee members, this session will tell one Library's story of how staff members have taken a very active part in the assessment process.

Leland Deeds, Coordinator, Public Services, Clemons Library
Tabzeera Dosu, Director of Financial Services
Laura Miller, Administrative Assistant, Charles L. Brown Science & Engineering Library
Paul Rittelmeyer, Head, Acquisitions and Preservation
Annette Stalnaker, Copy Cataloging Coordinator
Donna Tolson, Head, Scholars' Lab, Digital Research & Instructional Services
Carol Hunter, Director, Science, Engineering, and Education Services

Notes


Tuesday, September 26 3:00-5:00

Parallel 5 #3
Assessing Organizational Climate

Ashlawn & Highlands

From Organizational Assessment to Organizational Change: The University of Maryland Library Experience

Panel: Sue Baughman, Johnnie Love, Charles B. Lowry, and Maggie Saponaro

The issues of diversity and organizational climate are of major concern to many academic institutions. These factors are measures of organizational health, and healthy organizations are more effective. The University of Maryland (UM) Libraries, as a team-based learning organization, is committed to addressing the issues of diversity and organizational climate through its assessment activities. Key to this is the Libraries' commitment to service excellence at all levels to build a culture of shared vision, values, and leadership. The UM Libraries' change process began in 1996 and has affected every member of the organization. These changes include the development of a new service philosophy, the formation of teams, data-driven decision making, and the creation of a comprehensive learning program. Since 1996, several assessment processes have been conducted, resulting in additional organizational changes. Two such assessment activities are the Individual-Team-Organization (ITO) survey and the Organizational Culture and Diversity Assessment (OCDA). The ITO survey is a commercially available instrument that looks at three components of an organization: the individual members of the organization, the teams that make up the organization, and the organization itself. First administered in 1998, the survey was repeated in 2000 and 2003, ensuring the Libraries' ability to gauge the transition to an effective team environment over time. Areas to improve have been noted and changes made following the analysis of data for each survey period. Gauging the extent of continuous improvement is a critical element of this assessment tool. The OCDA was developed specifically for the UM Libraries in partnership with the UM Department of Industrial and Organizational Psychology. The OCDA addresses the climate for diversity, teamwork, learning, and fairness, key elements of a successful institution. Results of the 2000 OCDA not only offered insight into the work and diversity climate and culture of the UM Libraries, but also provided a baseline against which the effectiveness of its interventions could be determined. In 2004, a revised OCDA survey was administered. The new instrument included measures of climates for teamwork and continual learning, current managerial practices, and individuals' attitudes and beliefs, and it provided an updated snapshot of the diversity and organizational climate of the Libraries. Transforming the culture of a library organization is not an easy task. It involves personal development and mastery as well as continually clarifying and maintaining the larger vision for change. Immediate identification of "enablers" of and "barriers" to assessment is critical in determining success strategies, as is knowing what could be considered "pitfalls" to avoid in reaching desired goals. The ITO and OCDA surveys have given the UM Libraries the opportunity to identify several such enablers and barriers. This panel presentation will provide an overview of how the UM Libraries established its infrastructure for organizational development, while reviewing the processes and interventions associated with the ITO and OCDA surveys. Additionally, enablers of and barriers to assessment will be reviewed. The goal of this program is for attendees to discover means to use or adapt organizational assessments and act on the results.


Sue Baughman, Assistant Dean for Organizational Development, administered the Individual-Team-Organization survey and worked with the coordinating team on the 2004 OCDA assessment. She supports individual, team, and organization development through the design of change processes, facilitation, and coaching. She holds an MLS from UM's College of Library and Information Studies.

Johnnie Love, Coordinator of Personnel Programs, worked with the coordinating team for the 2004 OCDA assessment. Her area of research is diversity assessment. She has developed programs and services for library staff as a result of the OCDA assessment. She holds an MLS from the University of Denver.

Charles B. Lowry holds an MA in history (University of Alabama), a Ph.D. in history (University of Florida), and an MSLS (University of North Carolina at Chapel Hill). He has been Dean of Libraries and a professor in the College of Information Studies at UM since 1996. He teaches in the MLS program and is the editor of portal: Libraries and the Academy (Johns Hopkins University Press).

Maggie Saponaro, Manager of Staff Learning and Development, worked with the coordinators of the 2004 OCDA assessment. Her office provides educational and developmental programs for library staff under the umbrella of the Learning Curriculum, an outgrowth of the 2000 OCDA assessment. She holds an MLS from UCLA.

Notes


Tuesday, September 26 3:00-5:00

Parallel 5 #3
Assessing Organizational Climate

Ashlawn & Highlands

Diversity and Organizational Culture Survey: Useful Methodological Tool or Pandora's Box

Laura Lillard

During the past ten years a number of academic libraries have conducted surveys dealing with organizational culture (or climate) and diversity. Survey goals include gaining a better understanding of organizational culture, communication within the organization, workplace attitudes, and the climate for diversity. While these surveys have provided valuable information, they have also tended to heighten staff expectations that the results will be used to improve the internal organizational environment: if libraries ask the question, they must be prepared to act upon the results. This paper describes the experience of the University of Washington Libraries, which conducted its first diversity and organizational culture survey in 2004. The survey was designed and conducted by the Diversity and Organizational Culture Task Force to help inform the development of a library diversity plan and to provide an opportunity for library staff to evaluate a range of issues related to the work environment and communication. The Task Force reviewed relevant University of Washington documents, including the Diversity Compact and mission statement, as well as notes from a diversity forum held in 2002. The group also looked at survey instruments created by other academic libraries. Ultimately we decided to create a unique survey designed to encourage staff to provide substantial qualitative input, rather than respond to a long list of potential concerns. The survey of 33 questions (including 7 open-ended ones) was divided into three major sections: 1) demographics, 2) diversity, and 3) organizational culture. In August 2004, an email invitation was sent to 390 librarians and staff, who were given two weeks to complete and submit the online survey. The overall response rate was 59%, but it varied from less than 50% of classified staff to more than 80% of untenured librarians. Indeed, there were often significant differences in responses between the four library staff groups: classified staff, professional staff, librarians with tenure, and untenured librarians. While the scaled responses yielded excellent information, the 60+ pages of comments provided invaluable insights into how the Libraries' personnel feel about diversity and organizational climate. Communication was the area with the lowest satisfaction, especially lateral and vertical communication. Many of the comments dealt with perceived inadequacies of communication within the organization and the lack of transparency in the decision-making process. After analyzing all results, a decision was made to transfer responsibility for the organizational culture piece (including communication) to Libraries administration for their attention. A communications consultant was hired to delve more deeply into staff issues and concerns. She conducted extensive interviews and focus groups that were the basis of a report on improving communication within the Libraries. The emphasis on communication heightened staff expectations, and staff are expecting real changes in this area. Survey results also showed that diversity is important to our staff, especially in supporting our diverse user communities. However, staff also perceived that the Libraries could do a better job of addressing diversity issues, especially in recruiting and retaining a diverse workforce. The information gained from this survey was used in formulating the Libraries' Diversity Plan. However, survey results also showed that there was a need to better inform staff about diversity issues.


Libraries that are considering the application of organizational culture and diversity surveys should keep in mind that once staff are asked, they will expect follow-through on the results. When the box is opened, library administrators must be able to address staff concerns.

Laura Lillard has been an academic librarian for almost eight years. Her first position was as Education Librarian at Texas A&M University. In 2001 she became Education Librarian at the University of Washington, Seattle. In response to the goals of the 2005 Diversity Plan, Laura was appointed Diversity Officer by the Dean of Libraries.

Notes


Tuesday, September 26 3:00-5:00

Parallel 5 #3
Assessing Organizational Climate

Ashlawn & Highlands

Looking In and Looking Out: Assessing Our Readiness to Embrace the Future

Nancy Slight-Gibney

The University of Oregon Libraries Assessment Team will conduct a survey in May 2006 of all permanent library employees. In constructing the survey, our team's intention is to identify aspects of our organizational culture that enhance or detract from our ability to transform nimbly and effectively as our environment changes. Our intent is to measure overall job satisfaction; to get feedback on the quality of relationships and the effectiveness of communication throughout the organization; and to gather staff perceptions regarding diversity issues, engagement in planning and decision making, comfort level with change, and managerial support for continuous learning. The major challenge in constructing the survey was developing a set of questions that would focus respondents' thinking in two directions, both inward (individual work life) and outward (organizational effectiveness). This poster session will report 1) the process of constructing the survey, 2) the results, and 3) what we learned from administering the survey. I would also be willing to participate in a panel discussion of work-life surveys if that is more desirable to the conference planners.

Nancy Slight-Gibney is currently the leader of the Libraries' Assessment Team in addition to managing the budget. Most of her career was spent as an acquisitions librarian before she moved into her current position three years ago. Her ongoing interests center on data-driven decision making.

Notes


Wednesday, September 27 10:30-12:00

Parallel 6 #1
Organizational Culture/Learning

Salon A

Assessing Organizational Culture: Moving towards Organizational Change and Renewal

Lyn Currie and Carol Shepstone

This paper presents a method for assessing a library's organizational culture. Culture plays a critical role in creating a work environment where employees are committed and contribute to the success of the organization. A research project was conducted at the University of Saskatchewan Library to examine the ways in which the library's culture influences the work of library staff and the effectiveness of the library. Phase I of the study focused on librarians and explored how the organizational culture of the library gives identity, provides collective commitment, builds social system stability, and allows people to make sense of the organization (Sannwald, 2000). It also considered the acculturation process, exploring how librarians (established and new) assimilate and/or influence the culture, values, and perspectives of the library (Black and Leysen, 2002). The research study described the current cultural environment of the library; identified subcultures that exist; examined the congruence between the subculture(s) and the overall organizational culture; discussed those aspects of the culture(s) that impede or facilitate the work performance of librarians; and described the extent to which librarians, both new and established, are able to participate in, influence, and effect change. Phase II extended the study to include the perceptions of organizational culture, both existing and desired, of all other library staff to provide a complete picture of the library's organizational culture. The research study was conducted using the Competing Values Framework (CVF) developed by Cameron and Quinn (1999, 2006), which has been used extensively in research studies to examine the impact of culture on organizational issues. Their Organizational Culture Assessment Instrument (OCAI) assesses key dimensions of organizational culture and provides a picture of the fundamental assumptions on which an organization operates and the values that characterize it. It identifies an organization's current culture, predicts organizational performance, and identifies the culture that organization members think should be developed to match the future demands of the environment and the opportunities that present themselves. In the U of S study the CVF was used to:
- identify the various cultures that exist;
- assess the impact of the organizational culture, and sub-cultures, on the work environment and on the progress and success of librarians;
- examine the impact of culture on organizational issues such as attracting, developing, and retaining librarians; and
- examine the organizational culture from the perspective of all library staff, again identifying sub-cultures, congruencies, disconnections, and similarities among a variety of formal and informal groupings.
Finally, the study contributed to the development of recommendations for transforming the current culture into a desired new culture, introducing changes to the organizational structure, leadership and management initiatives, and new support mechanisms that facilitate a positive, creative, and rewarding working environment for library staff. The CVF provides a useful method for deriving an organizational culture profile that reflects underlying attributes, including the management style, strategic plans, climate, reward system, means of bonding, leadership, and basic values of the organization. It also provides a means of involving the entire organization in developing a broad-based culture assessment and creating a strategy for cultural change.
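As a rough sketch of how OCAI-style responses are commonly turned into a culture profile (an assumption about the instrument's usual ipsative scoring, not a description of the U of S analysis; the respondent data and scores below are hypothetical), each respondent divides 100 points among four culture-type statements for each dimension, and the profile is the mean score per culture type:

```python
import numpy as np

CULTURE_TYPES = ["Clan", "Adhocracy", "Market", "Hierarchy"]

# Hypothetical OCAI-style responses: for each of 6 dimensions a respondent
# divides 100 points across the four culture types (each inner row sums to 100).
# Shape: respondents x dimensions x culture types.
responses = np.array([
    [[40, 20, 15, 25], [35, 25, 15, 25], [45, 15, 15, 25],
     [30, 20, 20, 30], [40, 20, 10, 30], [35, 25, 15, 25]],
    [[20, 10, 40, 30], [25, 15, 35, 25], [20, 10, 45, 25],
     [15, 15, 40, 30], [25, 10, 35, 30], [20, 15, 40, 25]],
])

# Profile = average points per culture type, across dimensions and respondents.
# Higher numbers indicate a stronger pull toward that culture type.
profile = responses.mean(axis=(0, 1))

for name, score in zip(CULTURE_TYPES, profile):
    print(f"{name:10s} {score:5.1f}")
```

Comparing a "current" profile with a "preferred" profile computed the same way is what typically drives the discussion of desired cultural change.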


Lyn Currie (BA, Grad.Dip.Lib, MA (Hons)) is Head of the Education Library and Coordinator of Library Instruction at the University of Saskatchewan. She came to the University of Saskatchewan in 1996 with previous academic library experience at university libraries in Australia and Canada. Lyn is active in the University and library communities, serving on the Instructional Development Committee of University Council, the Steering Committee for the New Learning Centre, and the Library Services Assessment Committee. She is co-convenor of the national Evidence-Based Librarianship Interest Group of the Canadian Library Association. Lyn's principal areas of research and scholarship include adult and lifelong learning, information literacy programming, evaluation of student learning outcomes, peer review of teaching, evidence-based decision making, and staff training and development.

Carol Shepstone (BA, MLIS) is Head of the Access Services Division at the University of Saskatchewan. She has worked in a range of information sectors including academic, public, and special libraries, as well as museums and archives. Carol has an undergraduate degree in cultural anthropology with a specialization in museum studies. She is active in many professional bodies, including the Saskatchewan Library Association (Vice-President); the Multitype Library Board (Chair), a government advisory board which promotes library co-operation in Saskatchewan; the Canadian Library Association's Copyright Working Group; and The Partnership, a grassroots network of provincial library associations. She is an alumna of the Northern Exposure to Leadership Institute. She sits on the Library's Management Committee and the Library Services Assessment Committee, and she was instrumental in forming the Pre-Tenured Librarians Group at the University of Saskatchewan. Her research interests include academic library responses to interdisciplinary research and teaching, library co-operation and collaboration, organizational culture, the changing nature of the library as place, and Canadian copyright issues.

Notes


Wednesday, September 27 9:00 a.m. - 10:00 a.m.

Plenary III

Salon A/B

Organizational Diversity and Climate Assessment

Paul Hanges, Professor, Industrial and Organizational Psychology, University of Maryland


Paul is co-principal investigator of the Global Leadership and Organizational Behavior Effectiveness (GLOBE) research project. He has been instrumental in developing and refining the Organizational Culture and Diversity Assessment (OCDA) instrument used at the University of Maryland. Paul will speak on the general area of leadership, cross-cultural issues, and organizational culture and diversity assessment.

Notes


Wednesday, September 27 10:30-12:00

Parallel 6 #1
Organizational Culture/Learning

Salon A

Tools for Creating a Culture of Assessment: The CIPP Model and Utilization-Focused Evaluation

Yvonne Belanger

Many libraries struggle with the process of creating a culture of assessment and building the internal evaluation capacity of the library as an organization. This presentation will outline how two formal evaluation approaches frequently used in evaluating educational and social programs can be applied to academic library assessment projects. The two approaches are the CIPP model (Context, Input, Process, Product) and utilization-focused evaluation. Both models support an organizational approach to evaluation that emphasizes the use of results for decision making and program improvement. Daniel Stufflebeam's CIPP model, sometimes referred to as decision/accountability-oriented evaluation, is a systems-based approach to evaluating programs that provides a useful structure for creating and sustaining a unified, organization-wide evaluation strategy. The utilization-focused approach, articulated by Michael Quinn Patton, incorporates participatory evaluation techniques and details how evaluation can be structured to maximize the use of results. Utilization-focused evaluation assumes that evaluation is conducted openly and engages many stakeholders in the process of designing and conducting the evaluation. These two complementary evaluation models, used either separately or in conjunction, fit well into academic library culture. Both approaches are useful for assessment and evaluation of library programs because they can effectively engage librarians and administrators together in instituting a culture of assessment in which evaluation of library programs is viewed as a valuable tool rather than an added burden or threat. These approaches are also easy for libraries to adopt when the organizational capacity for supporting evaluation and assessment is low, since both leverage existing decision-making and project support structures rather than relying on external evaluation resources and expertise. The presentation will include descriptions of both models, sample evaluation plans illustrating the application of these models in an academic library context, and examples of how these and other models can be used to support program improvement and accountability.

Yvonne Belanger is the Manager of Evaluation and Assessment for Duke University Library's Academic Technology and Instructional Services group. She coordinates evaluations of instructional technology initiatives and library programs at Duke. She holds an M.S. in Instructional Design with a concentration in Evaluation from Syracuse University and is a graduate of St. John's College.

Notes


Wednesday, September 27 10:30-12:00

Parallel 6 #1
Organizational Culture/Learning

Salon A

The Use of Outcome Based Evaluation (OBE) to Assess Staff Learning Activities at University of Maryland Libraries

Irma F. Dillon and Maggie Saponaro

The University of Maryland (UM) Libraries introduced the Learning Curriculum in 2001 as part of its commitment to become a team-based learning organization. The Learning Curriculum program seeks to provide educational and developmental opportunities to all library staff members in order to develop the skills needed to support the Libraries' goals. Needs assessment and evaluation both play a critical role in the development and implementation of Learning Curriculum activities. Outcomes Based Evaluation (OBE) looks at the impacts, benefits, and changes to participants that result from a program's efforts, during and/or after their participation, and it examines these changes in the short term, the intermediate term, and the long term. First piloted in 2004, OBE began to be used in earnest by the Libraries in 2005 to assess the impact of participation in Learning Curriculum activities. This paper will describe the evolution of OBE as a tool for assessing the UM Learning Curriculum as a whole and its individual modules. Examples and results of the evaluation process will be presented.

Irma F. Dillon is the Manager of MIS at the University of Maryland Libraries, responsible for guiding the Libraries' assessment and evaluation program. She developed Outcomes Based Evaluation for the Learning Curriculum program with Maggie Saponaro. She holds an MSLS from Atlanta University and an MBA from the University of Baltimore.

Maggie Saponaro is Manager of Staff Learning and Development at UMD. Her office provides educational and developmental programs for library staff under the umbrella of the Learning Curriculum. She works with Irma Dillon to develop assessment tools for Learning Curriculum activities, including OBE assessments. She holds an MLS from UCLA.

Notes


Wednesday, September 27 10:30-12:00

Parallel 6 #2
Digital Library

Salon B

Usability Assessment of Academic Digital Libraries

Judy Jeng

Digital library development has been explosive since its inception in the 1990s. However, evaluation has not kept pace, and empirical research in usability is lean. We need further work on methods for analyzing usability, including an understanding of how to balance rigor, appropriateness of techniques, and practical limitations. The main research goal of this study is to develop a model and a suite of instruments to evaluate the usability of academic digital libraries. Usability is an elusive concept and may be viewed from many perspectives. The study examines usability from the perspectives of effectiveness, efficiency, satisfaction, and learnability. Satisfaction is further examined in the areas of ease of use, organization of information, terminology and labeling, visual appearance, content, and mistake recovery. The evaluation includes both quantifying elements and an affect measure. The outcome of the study provides benchmarks for comparison with similar sites. In addition, the study found interlocking relationships among effectiveness, efficiency, and satisfaction; however, each criterion has its own emphasis and should be measured separately, since one cannot replace the other. The study is a user-centered evaluation. It reports users' criteria for ease of use, organization of information, terminology, visual attractiveness, and mistake recovery. The causes of user lostness and navigation disorientation were identified, and the issue of click cost was studied: a majority of participants report that they expect each click to lead them closer to the answer. The study found that different ethnic groups may differ in how they rate satisfaction, although these differences did not have a statistically significant impact on performance. The study includes 41 subjects and was conducted twice to confirm findings. It is a cross-institutional study including two academic library sites, the Rutgers University Libraries and the Queens College Library. The model developed in this study is applicable to other academic digital libraries or information systems, and the methods and instruments may be tailored to fit specific systems.
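As a generic illustration of how the effectiveness, efficiency, and satisfaction dimensions named above are commonly operationalized in usability testing (a sketch with hypothetical task-log data, not the study's own instruments or results), one might compute task success rate, mean time on task, and mean satisfaction rating per task:

```python
from statistics import mean

# Hypothetical task logs: (participant, task, completed, seconds, satisfaction 1-7)
logs = [
    ("p01", "find_journal", True, 95, 6),
    ("p02", "find_journal", False, 240, 3),
    ("p03", "find_journal", True, 120, 5),
    ("p01", "locate_hours", True, 40, 7),
    ("p02", "locate_hours", True, 55, 6),
    ("p03", "locate_hours", True, 38, 7),
]

tasks = sorted({task for _, task, *_ in logs})
for task in tasks:
    rows = [r for r in logs if r[1] == task]
    effectiveness = sum(r[2] for r in rows) / len(rows)   # task completion rate
    efficiency = mean(r[3] for r in rows)                  # mean time on task (seconds)
    satisfaction = mean(r[4] for r in rows)                # mean rating on a 1-7 scale
    print(f"{task}: success {effectiveness:.0%}, "
          f"time {efficiency:.0f}s, satisfaction {satisfaction:.1f}/7")
```

Learnability would typically be assessed by comparing these same measures across repeated sessions; the point of the sketch is only that each criterion is computed separately, as the abstract emphasizes.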

Judy Jeng holds a Ph.D. from Rutgers University. Her dissertation, "Usability of the Digital Library: An Evaluation Model," received several awards and recognitions, including from ACRL, LITA, and the State of New Jersey. Judy is currently Head of Collection Services at New Jersey City University.

Notes


Wednesday, September 27 10:30-12:00

Parallel 6 #2
Digital Library

Salon B

Listening to Users: Creating More Useful Digital Library Tools and Services by Understanding the Needs of User Communities

Felicia Poe

In order to create useful digital library tools and services, we must first understand the needs of our user communities. Discovering what our users value and the tasks they are trying to accomplish is a vital component of digital library service design and assessment. In this presentation, Felicia Poe will share what the California Digital Library (CDL) has learned from carrying out an array of assessment activities with current and potential users, highlighting a growing insight into digital library user communities, including students, faculty, K-12 teachers, librarians, archivists and others. She will also explore CDL efforts to uncover both the articulated and unarticulated needs of users, and importantly, how the findings of those efforts are incorporated into the service design and development process.

Felicia Poe joined the University of California, California Digital Library (CDL) in 2001, and assumed leadership of the newly established CDL Assessment Program in 2005. She is responsible for integrating assessment and evaluation activities into all stages of the digital library project and service lifecycle, ranging from early stage focus groups, to mid-stage user interface usability testing, to post-launch service reviews.

Notes


Wednesday, September 27 10:30-12:00

Parallel 6 #2
Digital Library

Salon B

All That Data: Finding Useful and Practical Ways to Combine Electronic Resource Usage Data from Multiple Sources

Maribeth Manoff and Eleanor Read

"Maximizing Library Investments in Digital Collections through Better Data Collection and Analysis (MaxData)" is a three-year research project sponsored by the Institute for Museum and Library Services (IMLS). The primary goal of the project is to provide cost and benefit information that will help librarians determine how best to capture, analyze, and interpret usage data for their electronic resources. The research team from the University of Tennessee Libraries is reviewing usage data from a number of sources, including vendors (COUNTER-compliant and non-compliant) and local sources such as link resolvers, federated search engines, proxy servers, and database logs. This presentation will discuss several methods of combining vendor-supplied usage data with link resolver and metasearch statistical reports and data from a local system that measures database use. To facilitate analyses and comparisons of usage data, it is helpful to combine different types of usage data into a single file or database. Data sources come in different formats (e.g., Microsoft Excel, HTML, XML, delimited files, plain text) and are not always consistent in the presentation of various data elements (e.g., ISSNs with and without hyphens). To reconcile these differences and achieve consistency and integration of the data, significant manipulations of the individual files may be necessary. These manipulations may be done manually or with computer programming. Manual approaches and programming solutions for extracting and manipulating these data will be compared in terms of time, effort, special skills, software, and/or cost that each would require. Initial analysis to discover comparable data elements among the different sources will also be presented. Maribeth Manoff is Assistant Professor and Coordinator for Networked Service Integration at the University of Tennessee Libraries. She is responsible for many of the web based services provided by the Libraries including link resolver and federated searching software. Her research interests are focused on emerging technologies, specifically the use of technology to improve access to electronic resources. Eleanor Read is Assistant Professor and Social Science Data Services Librarian at the University of Tennessee. She provides reference services for sociology, psychology, and numeric research data. She previously worked as a statistician on environmental clean-up and health effects research projects for the Department of Energy and Environmental Protection Agency. Gayle Baker is Professor and Electronic Services Coordinator at the University of Tennessee Libraries. She has served as adjunct faculty at the University of Tennessee School of Information Sciences and the University of Alabama School of Library and Information Studies. Prior to entering the library profession, Baker was a computer programmer for several years.

Notes


Wednesday, September 27 10:30-12:00

Parallel 6 #3
Value and Impact

Monroe

Contingent Valuation of Libraries: Examples from Academic, Public and Special Libraries

Sarah Aerni and Donald W. King

Contingent valuation is an economic method used to estimate the benefits of a non-priced good or service by examining the implications of not having the product or service. We have used contingent valuation analysis in studies of the usefulness and value of academic, public, and special libraries. This presentation will summarize the methods we have used and the results we have found through this analysis. Contingent valuation was derived from individual library users' responses: each respondent provided an estimate of the cost in time and dollars to get the information if a library were not available, and we asked for further details about transactions for which respondents indicated they knew of a specific alternative (contingent) source. Our studies showed that in the absence of libraries and library collections, it would require significantly more time and money to accomplish the same amount of output. The data we will discuss come from many different studies. We conducted surveys for the academic libraries of the University of Pittsburgh and the University of Tennessee (with Carol Tenopir), in which scholars responded to a readership survey that included contingent value questions; in these two studies, we focused on the print and electronic journal collections. Information on public libraries is based on two return-on-investment (ROI) studies, one for the state of Florida and the other for the Commonwealth of Pennsylvania. A comprehensive study was done for each state, including a statewide telephone household survey, in-library surveys conducted in several libraries, a survey of other organizations that use libraries, analysis of a decade of public library statistics, and a follow-up census survey of the public libraries. The totals as well as comparisons between the two states will be presented. Contingent value questions were also asked in surveys of 84 special libraries, and we will summarize the relevant results. We will clearly explain our methodology and have survey instruments available for anyone interested. A summary of these contingent value analyses, all done as part of larger studies, will give the audience a sense of what contingent value analysis can show about how libraries are valued in an age when many people declare that libraries are old fashioned and that the answer to everything lies on the Internet. A simple numerical illustration of the underlying calculation follows the speaker biographies below.

Sarah Aerni received her MLS from the University of Pittsburgh in 2002 and has been working as a research assistant for the University of North Carolina since fall 2005. Donald W. King is a statistician who has been doing research on many types of libraries, library users, and the publishing industry for over 45 years.
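The sketch below is a generic illustration of the contingent valuation logic described in the abstract: for each reported library transaction, the cost of the stated alternative source (respondent time priced at a wage rate, plus out-of-pocket cost) is compared with the cost of using the library, and the difference is summed as an estimate of net benefit. All figures, the assumed wage rate, and field names are hypothetical, not results or parameters from the studies above.

```python
# Hypothetical survey responses: for each library transaction, the time and
# money the respondent says the alternative (contingent) source would require,
# alongside the time actually spent using the library.
responses = [
    # (library_minutes, alt_minutes, alt_dollars)
    (20, 90, 15.00),
    (10, 45, 0.00),
    (35, 180, 40.00),
]

HOURLY_VALUE_OF_TIME = 30.00  # assumed rate used to price respondent time

def transaction_benefit(library_minutes, alt_minutes, alt_dollars):
    """Net benefit = cost of the alternative source minus cost of using the library."""
    alt_cost = alt_minutes / 60 * HOURLY_VALUE_OF_TIME + alt_dollars
    library_cost = library_minutes / 60 * HOURLY_VALUE_OF_TIME
    return alt_cost - library_cost

benefits = [transaction_benefit(*r) for r in responses]
print("Estimated net benefit per transaction:", [round(b, 2) for b in benefits])
print(f"Total estimated net benefit: ${sum(benefits):,.2f}")
```

Scaled up by the number of transactions a library handles, totals of this kind are what feed the return-on-investment comparisons the abstract mentions.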

Notes


Wednesday, September 27 10:30-12:00

Parallel 6 #3
Value and Impact

Monroe

Demonstrating Library Value Through Web-Based Evaluation Instructional Systems

Charles R. McClure, John Carlo Bertot, Paul T. Jaeger and John T. Snead

A key issue facing libraries in the networked environment is estimating the value of services and resources provided to the community, demonstrating effectiveness, and advocating based on those valuations. While public libraries are arguably a public good based on their roles within their communities, governments and other agencies seek to quantify the value provided by public libraries. One way in which libraries can meet such challenges is through the development and use of web-based evaluation instruction systems that guide libraries in conducting evaluations and using those evaluations to demonstrate value and advocate for library resources. This paper details a long-term evolving effort to develop web-based evaluation instructional systems as a means for the selection of best-fit evaluation approaches for valuation and advocacy for public libraries in the networked environment. More specifically, this paper examines a content design process that focuses on user-centered data-appropriate evaluation methodologies where the content is organized and presented in a comprehensible way for use by library researchers and practitioners. Further, this paper presents insights into the development of a current web-based evaluation instructional system, drawing upon lessons from the design and implementation of prior web-based evaluation instructional modules. As an analysis of an ongoing iterative learning process, this paper examines concepts, best practices, and recommendations related to web-based evaluation instruction as an essential method for evaluation and program planning for libraries. Web-based evaluation instructional systems can help libraries in the networked environment to answer stakeholder concerns and meet user needs; make decisions about library resources and services; demonstrate value of the library to the community; help the library have a voice in the political environment; and support the role of the library as a public good. Ultimately, this paper explores the opportunities that web-based evaluation instructional systems provide for libraries in the networked environment.

Charles R. McClure is Francis Eppes Professor and Director of the Information Use Management and Policy Institute, College of Information, Florida State University; John Carlo Bertot is Professor and Associate Director; Paul T. Jaeger is Manager for Research Development; and John T. Snead is Research Coordinator at the Information Use Management and Policy Institute and a doctoral student in the College of Information.

Notes


Wednesday, September 27 10:30-12:00

Parallel 6 #3
Value and Impact

Monroe

Value and Impact Measurement: A UK Perspective and Progress Report on a National Program (VAMP)

Stephen Town

This paper describes the background, initiation, and progress of the UK Society of College, National and University Libraries (SCONUL) Value and Impact Measurement Program (VAMP). SCONUL had identified a growing need from university librarians and directors for data or methods to prove the value and worth of their library services to senior institutional stakeholders. The SCONUL Working Group on Performance Improvement (WGPI) was asked to develop a proposal to answer this requirement. The resulting VAMP project was approved by the Executive Board in February 2006 and is managed through a subgroup of the WGPI; the author has been designated Project Manager for the program. The anticipated product of the program is a web-based framework or toolkit that will guide library managers in the use of a range of products and services that can be used to demonstrate the value and impact of academic libraries. A range of performance and improvement tools already exists to assist this purpose, so the project will begin with an audit and critical review of these tools and a member requirements survey. Together these will provide a gap analysis identifying where new instruments may be needed. The succeeding phase of the program will therefore consist of commissioning specific work packages to develop new instruments and tools. These may include standard methods for assessing impact, value for money, staff performance, and process costing. The overall framework or toolkit will also assist those who choose, or are required, to work within Balanced Scorecard, Critical Success Factors, or Key Performance Indicators regimes. By the time of the conference the review and survey phase will be complete, as will a synthesis of findings and members' workshops designed to gain feedback and support for specific product developments. Product definitions should also be complete and work packages commissioned. The paper will therefore be an interim report describing these phases and looking forward to the final range of products.

Stephen Town is Director of Information Services for Cranfield University at the Defence Academy of the United Kingdom. Stephen has been a long-serving member of SCONUL's Working Group on Performance Improvement and has led assessment projects on their behalf in the fields of benchmarking and information literacy, as well as the UK and Ireland implementation of LibQUAL+. Stephen has published widely in the field of academic library performance and assessment.

Notes


Library Assessment Conference Building Effective, Sustainable, Practical Assessment September 25-27, 2006 Charlottesville, Virginia

www.arl.org/stats/laconf
