Introducing the Updated NSSE Survey!
Launched in 2013, the updated NSSE survey is built on years of evidence-based testing, institutional feedback, and recent advances in educational and survey research. While the changes range from minimal adjustments to entirely new content, the updated instrument maintains NSSE’s signature focus on diagnostic and actionable information related to effective educational practice.
The survey was updated with four goals in mind:
- Develop new measures related to effective teaching and learning;
- Refine existing measures and scales;
- Improve the clarity and applicability of survey language; and
- Update terminology to reflect current educational contexts.
The Institutional Report was also thoroughly redesigned to accompany the updated survey. With extensive use of color and graphics, the new reports are designed to be shared widely.
Sets of new, continuing, and updated items have been rigorously tested and are grouped within several Engagement Indicators. These indicators fit within four engagement themes adapted from the Benchmarks of Effective Educational Practice:
- Academic Challenge – Including Higher-Order Learning, Reflective & Integrative Learning, Quantitative Reasoning, and Learning Strategies
- Learning with Peers – Including Collaborative Learning and Discussions with Diverse Others
- Experiences with Faculty – Including Student-Faculty Interaction and Effective Teaching Practices
- Campus Environment – Including Quality of Interactions and Supportive Environment
High-Impact Practices are special undergraduate opportunities with substantial positive effects on student learning and retention, such as service-learning, study abroad, research with faculty, and internships.
To learn more about the transition from NSSE’s five Benchmarks to the ten new Engagement Indicators and High-Impact Practices, see this document.
How Has the Survey Changed?
Here is an item-by-item comparison showing how the survey was updated from 2012, indicating which items were unchanged, slightly modified, significantly altered, or deleted. New items are listed at the end.
Compared to NSSE 2012, about a quarter of the questions are new, and nearly the same proportion remain unchanged. Of the roughly half that changed, about equal numbers were modified in minor and major ways. Some items were deleted to keep the overall length of the survey about the same.
Modules: Additional Question Sets
New customization options include topical modules, or short sets of questions, on focused areas such as academic advising, civic engagement, experiences with diversity, writing, and technology. Additional modules will be developed over time.
How Do the Changes Affect Comparisons with Prior-Year Results?
Even the best surveys must be periodically revised and updated, affecting multi-year analyses such as trend studies or pre-post designs. Although many items remain unchanged, others have been modified and a few have been dropped, limiting longitudinal comparability of individual questions and historical benchmarks.
While some new results are not directly comparable to past results, institutions can still address longitudinal questions. For instance, if previous comparison-group results indicated above-average performance in a particular area, an institution can still gauge whether it outperforms the same or a similar comparison group.
We are confident that these updates enhance NSSE's value to institutions. NSSE will continue to provide useful resources and work with participating institutions to ensure maximum benefit from survey participation.
Why Update NSSE?
After a decade in the field, we know more about what matters to student success, institutional improvement efforts, and properties of the NSSE survey itself. Moreover, as higher education faces increasing demands for assessment data, NSSE must stay relevant to current issues and concerns.
Long-time NSSE participants may recall that NSSE was updated regularly in its early years. Starting in 2005, however, we kept the survey largely unchanged to facilitate year-to-year comparisons for institutions. Our intention has been to roll out major updates at longer intervals, as with the 2013 version. This approach balances institutions’ need for year-to-year comparisons with NSSE’s need to respond periodically to changes in the higher education landscape, informed by a methodical research and development process.
Higher education is constantly changing, and it is important for NSSE to stay relevant to the most salient issues and priorities of institutional assessment and research.
NSSE 2013 Development
A multi-year process of planning and testing is now complete. This process has been scholarly, rigorous, and collaborative, and involved:
- Consulting with campus users and a variety of field experts
- Gathering ideas and feedback from other interested partners
- Examining NSSE’s psychometric properties, including six years of experimental items
- Conducting cognitive interviews and focus groups with students
- Pilot testing in 2011 and 2012
- Unveiling the updated instrument at the AIR Forum on June 4, 2012
FSSE and BCSSE
FSSE has been updated to complement NSSE’s changes, launching a new format that combines course-based and typical-student questions and also introduces topical modules.
BCSSE has also been updated with new and modified items to increase alignment with NSSE for a more comprehensive analysis of the first-year experience.
To learn more, view the archived webinar “NSSE 2.0 – What to Expect in 2013,” recorded in March 2013.
Share Your Ideas and Feedback
We are particularly interested in suggestions for enhanced reporting. Share your ideas, feedback, and questions about the update and the redesigned reports by email at firstname.lastname@example.org or by phone at 812-856-5824. We look forward to your ideas and suggestions.