Five Innovative Technologies for Student Affairs Assessment

Nathan K. Lindsay
University of Missouri-Kansas City
Jesse A. Riggs
Calvary Bible College

Conducting assessment in student affairs can be a challenging and inefficient process, making it imperative for student affairs professionals to use appropriate methods and technologies. When student affairs professionals rely on an overabundance of paper-and-pencil surveys, survey fatigue among respondents and data entry errors can cause significant problems. To combat these challenges, a number of innovative technological options are available for collecting data from students, parents, faculty, and staff. These include clicker technology, personal digital assistants (PDAs), digital recorders, optical mark recognition (OMR) scanners, and online surveys. Each technology has its own specific purposes and strengths, and those in student affairs can be more effective by combining these tools in more strategic and intentional ways. As demonstrated by several examples from the University of North Carolina Wilmington (UNCW), traditional paper-based survey instruments can be supplemented and enriched by each of these other data collection methods.

As outlined in Higher Education in the Digital Age by Duderstadt, Atkins, and Van Houweling (2002), it is hard to overestimate the impact that technology has had on colleges and universities. In response to demands to create more diverse learning experiences, including online and distance learning opportunities, more faculty and staff are using technologies to connect to students and to one another in innovative ways. Likewise, as higher education experiences a shift toward technology-based learning, “the use of technology will be increasingly common in the development of assessment projects in the future” (Schuh, 2009, p. 244). Methods for assessment need to be planned and communicated clearly as technology integration shapes the road ahead. As Bresciani, Zelna, and Anderson (2004) point out, “Carefully considering your options by reflecting on what it is that you want to measure and how you can gather the best evidence is vital to the success of your work” (p. 25).

With the integration of technology that allows for multiple forms of data gathering across departments and divisions, the overall picture that is painted can be expanded; what may begin as wallet-size snapshots can become a full-scale, 360-degree view of the overall campus climate.

The need to increase the clarity of higher education’s vision as it pertains to assessment is, according to Keeling, Wall, Underhile, and Dungy (2008), “driven by two forces: external demands for accountability and internal commitments to improvement” (p. 1). These driving forces demand a more holistic and efficient approach to assessment that can be facilitated by the creative use of technology. By using several types of technology to gather data, assessment initiatives can reach audiences and events that were otherwise inaccessible. Such technologies also help student affairs professionals get out of the rut of always doing the same types of assessments.

The first innovative technology is the clicker, which is becoming popular on many campuses across the country. The immediate feedback possible with clickers enhances the learning experience in workshops and classrooms by fostering more interaction, humor, and student inquiries, and the data and responses generated can be saved at the end of the session for later analysis. Several companies provide clicker technology and services, although Turning Technologies appears to be gaining market share because its TurningPoint software integrates seamlessly with PowerPoint. Clickers have been effective in several student-based programs at the University of North Carolina Wilmington (UNCW), including new student orientation and student staff training. They have also been used in various division-wide meetings to familiarize staff with the ease of use of this technology.

There is a fairly large initial cost for acquiring the clickers (approximately $30-40 per clicker) and the receivers (approximately $100 per receiver), with clickers obviously being the larger expense because one is needed for each attendee. Other drawbacks are that clicker use can require more time than paper-and-pencil surveys, and individual clickers can disappear over time through theft or carelessness. Some colleges and universities have begun to require that students purchase their own clickers, which can then be used in all of their classes and workshops at the institution.
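These figures make the up-front investment easy to estimate. A back-of-envelope sketch in Python, using the midpoint prices above; the number of clickers one receiver can serve is a hypothetical figure for illustration, so check the vendor’s specifications:

```python
import math

def clicker_budget(attendees, clicker_cost=35.0, receiver_cost=100.0,
                   clickers_per_receiver=100):
    """Estimate the up-front hardware cost for a clicker-based session.

    Uses the rough midpoints cited above ($30-40 per clicker,
    ~$100 per receiver). One receiver is assumed to serve up to
    `clickers_per_receiver` clickers (a hypothetical capacity).
    """
    receivers = math.ceil(attendees / clickers_per_receiver)
    return attendees * clicker_cost + receivers * receiver_cost

# A 150-person orientation session: 150 clickers plus 2 receivers
print(clicker_budget(150))  # 150 * 35 + 2 * 100 = 5450.0
```

For a one-time event, comparing this figure against the staff hours saved on data entry helps decide whether purchasing or borrowing equipment makes more sense.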

As a second option, personal digital assistants (PDAs), such as iPods, iPhones, or other smartphones, are convenient tools for gathering student and attendee feedback immediately following events. PDAs essentially make an electronic survey mobile, allowing it to be administered at any location, from the lobby outside an event to a busy sidewalk or intersection on campus. At UNCW, we have used iPods to conduct “60-second surveys” for the Career Center and to solicit feedback regarding numerous campus workshops and events. Students like the iPods and sometimes even assume that we are giving them away, but once they learn that we are just conducting surveys, they are still willing to participate.

Depending on the software platform, data can be accessed immediately after the PDA is synced with a computer. PDAs are relatively inexpensive (approximately $70-$300), easily portable, and widely compatible with computer survey software. Some computer expertise may be needed to ensure optimum performance, and the qualitative data that can be acquired are limited, though this limitation can be offset through the use of digital recorders. Other challenges are short battery life and small screen size, making it important to purchase PDAs that are accessible for students with disabilities.

As a third versatile and mobile option for collecting data, digital voice recorders make it possible to capture open-ended responses in focus groups around campus. They allow one to concentrate on interviews or focus groups without worrying about writing down every comment, and they preserve direct quotations for reports, presentations, and articles. These recorders have been very useful at UNCW as we have listened to student veterans on campus explain their needs, challenges, and ideas for improvement. The recorders typically cost $50-$200, and the recordings can be uploaded directly to a computer. Other features include adjustable playback speeds and index marks that can be placed in the recordings. The challenge with recordings is that transcribing the full text can require extensive time, and poor voice quality or clarity sometimes results in missed data.

In situations where extensive quantitative data are desired, the optical mark recognition (OMR) scanner is a good approach to collecting data from bubble sheets. Particularly useful when asking multiple-choice questions of participants, the scanners are fast and accurate, eliminating errors and time spent on manual data entry. Disadvantages include the high costs for OMR hardware, software, and forms, as well as limited customization options; it is also difficult to collect qualitative data on such surveys. Scanner prices vary according to specific features, such as whether they read one- or two-sided forms and whether they can recognize both pencil and pen. At UNCW, the OMR scanner has been very useful in collecting bubble sheet data that examined issues regarding our students’ substance abuse.
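Once the forms are scanned, an OMR system’s output is essentially a grid of bubbled answers, one row per sheet, and tallying responses per question is straightforward. A minimal sketch in Python (the answer strings are invented sample data, not actual UNCW results):

```python
from collections import Counter

# Hypothetical scanner output: one string of bubbled options per sheet,
# where position i corresponds to question i + 1
sheets = ["ABDCA", "ABDBA", "CBDCA"]

def tally_by_question(sheets):
    """Count how many respondents chose each option on each question.

    zip(*sheets) transposes the rows of answers into per-question columns.
    """
    return [Counter(question) for question in zip(*sheets)]

tallies = tally_by_question(sheets)
print(tallies[0])  # Question 1: Counter({'A': 2, 'C': 1})
```

This is the kind of aggregation the scanner software performs automatically; the sketch simply illustrates why multiple-choice data are so fast to process compared with hand-entered responses.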

The fifth technological option, the online survey, is perhaps the most common technological method for assessment; surveys can be sent via home-grown systems, SurveyMonkey, or other purchased platforms. At UNCW, web surveys conducted through Campus Labs (formerly StudentVoice) software have enabled the widespread collection of student participation trends, satisfaction, benchmarking, and learning outcomes data across campus. With Campus Labs, links to web surveys are easy to send to large numbers of recipients at once via email, and the data are typically available in real time once a survey is submitted. The software also keeps track of who has responded to the survey, reduces number crunching, and allows for easy cross-tab analyses and cut-and-paste tables and graphs. The cost for online platforms such as Campus Labs varies by institutional size. Though easy to use, web surveys can suffer from low response rates and self-selection biases. These can be offset by offering incentives for completing surveys, such as gift cards or drawings for prizes. Efforts to show how data are being used to make improvements on campus (e.g., the “We’ve Heard Your Voice” initiative) can enhance student motivation as well.
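Platforms such as Campus Labs generate these cross-tabs automatically, but the underlying operation is simple enough to sketch with the Python standard library. The responses below are invented sample data, and the field names are hypothetical:

```python
from collections import Counter

# Hypothetical exported responses: (class year, satisfaction rating)
responses = [
    ("First-year", "Satisfied"), ("First-year", "Neutral"),
    ("Senior", "Satisfied"), ("Senior", "Satisfied"),
    ("First-year", "Satisfied"),
]

def cross_tab(pairs):
    """Build a row-by-column table of response counts."""
    counts = Counter(pairs)
    rows = sorted({r for r, _ in pairs})
    cols = sorted({c for _, c in pairs})
    return {r: {c: counts[(r, c)] for c in cols} for r in rows}

table = cross_tab(responses)
print(table["First-year"])  # {'Neutral': 1, 'Satisfied': 2}
```

Seeing the operation spelled out this way clarifies what the platform’s one-click cross-tab reports are doing: counting responses for every combination of two survey fields.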

The benefits of using these five technologies in tandem with one another are extensive. By using a combination of these tools, both quantitative and qualitative data can be solicited from students in a variety of locations and methods. By capturing rich student data that are reliable and valid, staff and administrators can be more attuned to the opinions and needs of the student population. Also, time spent on manual data entry can be reduced, removing both entry errors and staff fatigue.

Using these technologies has been crucial in the development of a “culture of learning” and a “culture of assessment” at UNCW. To facilitate greater technology use, staff training has been conducted both individually and collectively. Through widespread staff participation, technologies have been used to gather significant data at UNCW on such areas as community standards, family events, summer programming, and student veterans, in particular. By using a combination of online quantitative assessment and digitally recorded qualitative assessment, services and programs offered to military students at UNCW have been significantly enhanced. Such assessments are being used to provide a seamless transition for active duty and student veterans, allowing their time at UNCW to be one of academic achievement and immersion into the campus culture.

In summary, assessment is an ongoing process vital to the growth and improvement of any university. Through various technologies, the satisfaction, needs and learning outcomes of students can be assessed easily and accurately, allowing administrators to make the most of tight budgets by delivering targeted responses to university concerns and goals. Although these tools can often be expensive, funding for technology can be solicited from across the division/university and through available grants. Given their many potential positive outcomes, we have found that they are well worth the investment.


References

Bresciani, M. J., Zelna, C. L., & Anderson, J. A. (2004). Assessing student learning and development: A handbook for practitioners. Washington, D.C.: National Association of Student Personnel Administrators.

Duderstadt, J., Atkins, D., & Van Houweling, D. (2002). Higher education in the digital age. Westport, CT: Praeger Publishers.

Keeling, R., Wall, A., Underhile, R., & Dungy, G. (2008). Assessment reconsidered: Institutional effectiveness for student success. Washington, D.C.: International Center for Student Success and Institutional Accountability.

Schuh, J. (2009). Assessment methods for student affairs. San Francisco: Jossey-Bass.

Discussion Questions

  • How does the use of technology enhance assessment initiatives in student affairs?
  • In what situations would each of the five technologies be most appropriate?
  • What are some of the pros and cons associated with the use of each of these technologies?

About the Authors

Nathan K. Lindsay, PhD, is the Assistant Vice Provost for Assessment at the University of Missouri-Kansas City. He is also the former Director of Student Life Assessment at the University of North Carolina Wilmington.

Please e-mail inquiries to Nathan K. Lindsay.

Jesse A. Riggs is the Director of Institutional Research at Calvary Bible College.

Please e-mail inquiries to Jesse A. Riggs.


The ideas expressed in this article are not necessarily those of the Developments editorial board or those of ACPA members or the ACPA Governing Board, Leadership, or International Office Staff.
