Construction of the 2005 NSSE Benchmarks
The item groups that make up the benchmarks were created with a blend of theory and empirical analysis. We first conducted principal components analyses with oblique rotations, then used theory to refine the resulting item groupings. As in the past, only randomly sampled cases were included in the calculation of institutional benchmarks.
The process for calculating benchmark scores was revised in 2004 to make the scores easier to understand and to allow institutions to calculate their own scores and run intra-institutional comparisons.
The construction of the 2005 NSSE Benchmarks has four steps. First, all items that contribute to a benchmark are converted to a 0-100 point scale. For the "enriching" items (question 7 on the survey), students who indicated that they had already "done" the activity receive a score of 100, while students who "plan to do," "do not plan to do," or "have not decided" to do the activity receive a 0. Other items are converted as would be expected. For example, items with four response options (e.g., never, sometimes, often, very often) are recoded with values of 0, 33.33, 66.67, or 100.
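The recoding in step one can be sketched as follows. This is an illustrative sketch, not NSSE's actual SPSS syntax, and the response labels are assumptions based on the description above:

```python
# Step 1 sketch: rescale survey responses to a 0-100 point scale.
# Response labels are illustrative, not NSSE's actual variable codes.

def recode_enriching(response: str) -> int:
    """Question 7 items: only activities already done count."""
    return 100 if response == "done" else 0

def recode_frequency(response: str) -> float:
    """Four-option frequency items, spread evenly over 0-100."""
    scale = {"never": 0.0, "sometimes": 33.33, "often": 66.67, "very often": 100.0}
    return scale[response]
```

Items with a different number of response options would be spread evenly over 0-100 in the same way (e.g., 0, 50, 100 for three options).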
Second, part-time students' scores are adjusted on four Level of Academic Challenge items (READASGN, WRITEMID, WRITESML, ACADPR01). For each item, a ratio is calculated by dividing the national average for full-time students by the national average for part-time students. Each part-time student's score on an item is multiplied by the corresponding ratio to get the adjusted score. Adjusted scores are capped so as not to exceed 100.
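The part-time adjustment amounts to a simple ratio multiplication with a cap. A minimal sketch, in which the national averages passed in are hypothetical values rather than actual NSSE statistics:

```python
# Step 2 sketch: adjust a part-time student's item score using the
# full-time/part-time national average ratio, capped at 100.
# The averages supplied by the caller are illustrative, not real NSSE values.

def adjust_part_time(score: float, ft_national_mean: float, pt_national_mean: float) -> float:
    ratio = ft_national_mean / pt_national_mean
    return min(score * ratio, 100.0)
```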
Third, student-level scale scores are created for each group of items by taking the mean of each student's item scores. A mean is calculated for a student only if he or she answered at least three-fifths of the items in the group.
Finally, institutional benchmarks are created by calculating weighted averages of the student-level scale scores for each class (first-year students and seniors).
Using the base random sample from the 2005 NSSE survey administration, we examined the internal consistency of each NSSE benchmark using Cronbach's alpha. The results are shown in the table below:
Internal Consistency of NSSE Benchmarks (Cronbach's Alpha) 
Academic Challenge  0.74  0.76  0.75 
Active and Collaborative Learning  0.64  0.65  0.67 
StudentFaculty Interaction  0.72  0.75  0.75 
Enriching Educational Experiences  0.54  0.64  0.66 
Supportive Campus Environment  0.78  0.78  0.77 
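For reference, Cronbach's alpha for a benchmark's item group can be computed from the item-score columns as follows. This is a generic, pure-Python sketch of the standard formula; the data layout (one list of scores per item, same students in each) is an assumption, not NSSE's actual analysis code:

```python
# Cronbach's alpha: (k / (k - 1)) * (1 - sum of item variances / variance of totals),
# where k is the number of items. Uses sample (n - 1) variance.

def cronbach_alpha(items: list[list[float]]) -> float:
    k = len(items)           # number of items in the benchmark
    n = len(items[0])        # number of students

    def sample_var(xs: list[float]) -> float:
        m = sum(xs) / len(xs)
        return sum((x - m) ** 2 for x in xs) / (len(xs) - 1)

    sum_item_vars = sum(sample_var(col) for col in items)
    totals = [sum(col[i] for col in items) for i in range(n)]
    return (k / (k - 1)) * (1 - sum_item_vars / sample_var(totals))
```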
Additional resources are available on the NSSE website: more information about the weights, the SPSS syntax used to construct the benchmarks, and student responses to each benchmark item.
