Assessing Institutional Outcomes: Mapping Results from Four Pragmatic Assessment Instruments


Assessing learning in higher education includes various course-specific indicators, such as instructor-designed tests, tasks, projects, and instructor perceptions. Higher education research, however, also focuses on additional measures, such as retention (Hacket and Carrigan, 2001), as success indicators, along with other evidence of learning beyond the classroom. Additionally, mass media rankings of the best colleges serve the public by emphasizing other criteria, such as alumni endowments. Outcomes and standards, whether locally designed or administered through national or state agencies and regional accrediting bodies, argue for indicators that assess institutional wellbeing, especially as it relates to information literacy, problem solving in small groups, critical thinking, written and spoken communication, and broadly conceived notions of what it means to be liberally educated. Accordingly, college administrators agree to ongoing assessments so they can profile students and understand the changing needs of the institution. Teaching in a culture of on-demand information necessitates such ongoing review. Many institutions regularly survey students using instruments such as the Cooperative Institutional Research Program (CIRP) and the National Survey of Student Engagement (NSSE). The NSSE assesses five benchmarks of effective educational practice linked to academic success: level of academic challenge, active and collaborative learning, student-faculty interactions, enriching educational experiences, and supportive campus environment (NSSE, 2005). Mark and Boruff-Jones (2003) argue that the NSSE is underused as an assessment instrument for correlating higher education learning goals to student engagement. The survey instrument, both valid and reliable, provides information for comparing different student groups. 
Through an analysis of an institution’s NSSE results compared to responses by students at similar colleges, instructors gain insight into students’ perceptions of their learning. These perceptions, when compared to students’ responses over time and examined in relation to other information about learning, provide a more comprehensive view of institutional learning outcomes. The authors, as part of an institutional task force for establishing Information Literacy (IL), discovered that NSSE data were useful for assessing IL Standards in undergraduate education. The authors first mapped the IL Standards on a Ladder of Abstraction aligned with Bloom’s Taxonomy of Cognitive Development (Weiss, Corso, and Kelly, 2005). After examining syllabi from across the curriculum, they identified courses that address the IL Standards and plotted each on the Ladder of Abstraction. The authors then gathered information from the 2003, 2004, and 2005 NSSE surveys. Annually, 45 first-year students and 54 seniors at the college are surveyed through the NSSE. Results are compared to those of 42,060 first-year and senior students at general baccalaureate universities. Using an analysis of variance, the authors identified 19 NSSE statements that correlate to the IL Standards. They plotted these statements on the Ladder of Abstraction to identify statistically significant differences (p<0.05) between first-year students and seniors at other similar institutions, and between first-year students and seniors at their college, to measure growth or areas for development within the institution. Preliminary results indicate that students at this institution show growth (p<0.05) in several areas of cognitive development. However, responses by seniors, compared to those of seniors at other institutions, indicate a need for greater emphasis in one area. 
Results from the analysis of the ACRL Standards, Bloom’s Taxonomy, the Ladder of Abstraction, and the NSSE surveys from three years portray sustainable learning at the institution and begin to indicate a sharper focus for institutional improvement that, when shared with faculty members, can become the impetus for curricular development and faculty growth.

Keywords: NSSE, Information Literacy, Assessment, Bloom's Taxonomy, Institutional Assessment, Ladder of Abstraction
Stream: Curriculum and Pedagogy; Student Learning, Learner Experiences, Learner Diversity
Presentation Type: Paper Presentation in English
Paper: Assessing Institutional Outcomes

Dr. Gail Shanley Corso

Associate Professor, Division of Arts and Sciences, Neumann College
Aston, PA, USA

Dr. Corso has been teaching writing at Neumann College since 1992. Before that, she was a coordinator of writing across the curriculum in two units at Syracuse University: the Newhouse School of Communication and the School of Visual and Performing Arts.

In recent years, she has served as coordinator of writing at Neumann. She coordinated the self-study process (2001) for the Communication Arts program, initiating major curricular changes: creating the Arts Production and Performance major, the redesigned Communication and Media Arts major, and the Writing and Journalism minors.

Dr. Corso has authored numerous curricular reports for Neumann College, has co-authored several articles, has presented on many occasions in national, regional, and international forums on writing and learning, and has contributed a chapter to a book on technology and writing.

Dr. Corso is an Associate Professor of English and Communication and Media Arts.

Dr. Sandra M. Weiss

Associate Professor, Division of Arts and Sciences, Neumann College
Aston, PA, USA

Sandra M. Weiss, Ed.D., is a Professor of Biology and Clinical Laboratory Science at Neumann College, United States of America. She has taught in higher education for over thirty years. Her interests are in the areas of hematology, immunology and immunohematology. Additionally, Dr. Weiss has worked in the highly technological field of clinical laboratory medicine and continues to consult in that area. She is actively researching comparative methods in coagulation and has additional expertise in educational technology.

Ref: L08P0118