How Social Identities Affect Students with Autism for Transition to College



Edlyn Vallejo Peña
Jodie Kocur
California Lutheran University

In 2009, the United States Government Accountability Office reported that students with disabilities now comprise one in 10 college students. A more recent survey of four-year colleges and universities reported that nearly 15% of enrolled first-year students reported a disability (HERI, 2011). The enrollment of college students with autism spectrum disorder (ASD) in particular is projected to increase with growing diagnostic rates and more robust educational supports in the K-12 system. ASD is a developmental disability that can cause college students to experience challenges in communication, socialization, sensory processing, and restrictive and repetitive behaviors (Peña & Kocur, 2013). Today, 30% of students with ASD who complete high school attend college (Roux, Shattuck, Rast, Rava, & Anderson, 2015), rightfully making their way into postsecondary institutions. In 2008-2009, approximately 78% of four-year public institutions enrolled students with ASD (Raue & Lewis, 2011), though that percentage is presumed to be higher today. Additionally, because many students do not disclose their disability once in college, these findings likely under-report the presence of students with ASD (Newman, Wagner, Cameto, & Knokey, 2009).

While access to college is improving for students with ASD, they are less likely to transition to college, compared to students without disabilities and even students with other kinds of disabilities (Roux et al., 2015). Further, the data disaggregated by other demographic factors suggest inequitable educational opportunities among students with ASD across different social identities, such as race/ethnicity and educational background. White students with ASD enter college at greater rates than their racial and ethnic minority peers; 41% of White students with ASD attend college compared to 23% of Black and 29% of Latino students with ASD. Furthermore, over 75% of students with ASD who enrolled in college had at least one parent with a college education. Students with ASD whose parents went to college were three times more likely to transition to college. This is likely because students with ASD must rely more heavily on parent knowledge, support and guidance to prepare for, transition into, and succeed in college than their peers without disabilities (Peña & Kocur, 2013).

Scholars are just beginning to understand the ways in which demographics and social identities shape the experiences of college and college-bound students with disabilities that produce cumulative disadvantages (Peña, Stapleton, & Schaffer, 2016). In one of the few research studies that reports first-hand experiences of college students with ASD, MacLeod, Lewis, and Robertson (2013) report in their findings that, “it is likely that some or all [participants] came from relatively privileged backgrounds. In gaining entry to higher education, they are a minority within the autistic community” (p. 46). What is not yet understood is how the identities of families of students with ASD enable the students to prepare for and transition into college. Because students with ASD typically require greater parental support during these life events (Peña & Kocur, 2013), we qualitatively examined the experiences of 29 parents and caregivers of students with ASD who prepared for and/or transitioned into college life. The research question that guided this particular analysis is: What role do family social identities play in supporting students with ASD to prepare for and transition into college?  

Methods

We engaged in a secondary analysis of interview transcripts from a larger study exploring the experiences of parents of college-bound and college students with ASD. The larger study employed a case study approach, which concentrates in-depth analysis on an entity or bounded system (Patton, 2014). Case studies are useful for studying temporal processes, tracing experiences, events, and changes over time. This approach allowed us to capture and analyze rich stories and experiences of a little-understood phenomenon—the ways in which families, as critical support systems, play a role in supporting students with ASD to prepare for and transition into college. We employed purposeful sampling (Merriam & Tisdell, 2014), identifying 29 parents in California whose students with ASD were either engaged in transition planning (as high school juniors or seniors) or attending a 2- or 4-year postsecondary institution. To recruit participants, we emailed college disability support offices, clinicians who work with clients with ASD, and autism support group listservs, and posted on social networking sites.

Parent participants completed a demographic questionnaire and interview, lasting approximately one hour. The semi-structured nature of the interviews allowed us both the structure and flexibility to follow the parents’ lead when their recollections were rich and relevant. All 29 interviews were audio-recorded and transcribed. In our data analysis of the interview transcripts, we first conducted a within-case content analysis of individual participant transcripts in which we engaged in coding of text. We identified significant statements and core meanings about the role social identities played in supporting the postsecondary transition and development of each student with ASD. We then engaged in a cross-case analysis, enabling us to review and revise the codes across participants, grouping them into over-arching themes that answered the research question.

Findings

This section begins with a description of demographic information and privileged identities of the families who participated in the study. We then present themes that represent the experiences which contributed to supporting the transition of students with ASD to attend college.

Privileged Identities

Important demographic trends emerged in terms of race, parental education, and household income. Of the 29 participants, 24 identified as White, three as Latina/o, and two as multiracial. With respect to parental education, only one of the 29 students with ASD represented in the study lived in a household in which neither parent had ever enrolled in postsecondary education. Two participants indicated that at least one parent in the household had completed some college or postsecondary education. The overwhelming majority of parents had either graduated from a four-year college (n=11) or earned a graduate degree (n=15).

The majority of the parents in this study came from middle- to high-income households. Of the 28 participants who answered the question about household income, 23 indicated that the household income was over $90,000 per year. This is at least $30,000 more than the median income for families in the state of California (where the study was conducted), and about $26,000 more than the median income for families in the United States at the time of the study (U.S. Census Bureau, 2013). Noting the household income of the participants is important because it signals access to critical therapies and supports that insurance companies and school districts may not cover or offer. At least 22 of the 29 parents reported spending hundreds to thousands of dollars annually on one or more of the following: speech therapy, social skills sessions, an attorney or advocate, tutoring, occupational therapy, an educational aide, a college preparation program, and a psychiatrist or psychologist. One parent noted, “he went through the Fast Forward [reading] program, which totally is the best $3000 worth I spent.” This level of collective wealth, educational status, and racial privilege among participants likely provided advantageous opportunities for transition into college; this is supported by the existing literature on cultural capital, social capital, and student success (Bourdieu, 1986), as explored in the sections that follow.

High Parental Aspirations

Given that nearly all of the students with ASD lived in a household that had at least one parent with some college education, the majority of students grew up with parents who communicated high aspirations for their children to attend college. When asked about the time at which parents began thinking about college as a possibility for their child’s future, 18 of the 29 parents responded with “always,” “it was never a question,” or “it was never not considered.” In contrast, only five participants first considered college a possibility when their child was in high school, and the rest began thinking about it somewhere in between “always” and high school. Even when high schools communicated low expectations during the transition process, parents presumed competence in their students and held high aspirations for their educational futures. One parent said that transition planning “was discussed in IEP meetings, but [the school] really didn’t have or provide direction,” echoing the experiences of other students with ASD who encountered low expectations from teachers and administrators. In spite of such structural barriers to transitioning into postsecondary education, parents advocated for and coached their students through the transition, consistent with prior research (Peña & Kocur, 2013).

Exercising Cultural Capital

Parents exercised cultural capital to assist their students with ASD in navigating the transition into higher education. Cultural capital refers to accumulated cultural knowledge that brings about social mobility, status, and power (Bourdieu, 1986). Individuals who come from privileged identities and experiences tend to accumulate cultural capital to navigate complicated processes, structures, and systems like the transition into and persistence in higher education. The overwhelming majority of the parents in the study employed cultural knowledge and tools to guide students with ASD in three ways. They used their cultural capital to research postsecondary options, navigate policies for transition and admission, and advocate for access to resources that support college success and retention. One proactive parent explained:

You’ve got to get online. You’ve got to look at books. I think you have to connect with a professional who has the clinical experience to be able to evaluate if your kid can make it academically. And then I think it’s a matter of going [to the campus] and researching.

Another parent explained that she “had always talked about [college] because I went to college.” She demanded transition planning from her son’s high school and took her son to her college campus to familiarize him with college life. Other parents guided students in selecting academic majors and degrees—from a “math major and a screenwriting minor” to an associate degree in veterinary technology—in order to maximize their future career opportunities and mobility.

Employing Social Capital

Social capital involves the development of networks and relationships with others in order to gain access to important resources for social mobility (Bourdieu, 1986). Social networks tend to benefit people in privileged positions by enabling them to maintain their power through acquiring critical resources and opportunities. Parents in the study generated and tapped into extensive social networks within and outside of the schools and colleges that the students attended. Parents generated social capital through relationships with educational advocates, psychologists, and educators to access opportunities, information, and resources to prepare students with ASD for access and transition into postsecondary education. Parents then advocated fiercely to make sure students received appropriate supports as they transitioned into college. This involved generating relationships with key institutional agents at the students’ colleges—from disability coordinators to academic advisors. While parents encouraged the students to develop these relationships, the parents themselves often stepped in. One parent told her son to “just go to [the disability services] office,” but she worried that her son was not yet equipped to exercise his self-advocacy skills. The next day, the parent took it upon herself to email the disability services coordinator to request assistance for her son. By tapping into this institutional agent, the student gained access to accommodations through the disability services office that were critical to college persistence.

Discussion

The findings of this study document the ways in which parents of privileged social identities—mostly White, college educated, and upper-middle class—mobilized to navigate and support their children with ASD through the transition process. By cultivating and employing high aspirations, cultural capital, and social capital, parents were advantageously equipped with knowledge, social networks, and the ability to tap into resources necessary for preparing students with ASD for college.

The results of the study suggest a number of implications for preparing, recruiting, and enrolling college students with ASD. The activities and practices in which families of privileged backgrounds engaged to support their children’s access and transition into college can be instructive to other families who desire similar outcomes for their children. First, parents should develop high aspirations for their children to pursue higher education. While students with disabilities typically experience additional educational challenges compared to students without disabilities, they have great potential to access postsecondary settings when high expectations and appropriate supports are in place (Cawthon, Garberoglio, Caemmerer, Bond, & Wendel, 2015). Second, families can make efforts to develop cultural and social capital to access resources important to the transition to college. Toward this end, families can cultivate relationships with individuals who have college knowledge, visit and read about institutions of higher education, and participate in programs or services that provide access to transition resources.

Prior research has identified inequitable access to postsecondary education across race/ethnicity and parental education backgrounds for students with ASD (Roux et al., 2015). High schools and colleges must reconsider the ways in which they reach out to students with ASD and their parents, especially those from disadvantaged backgrounds, to prepare them for the transition to college. Educators must work with parents and their students with ASD from marginalized backgrounds to develop college aspirations, advocacy skills, and social networks that will enable students to access and succeed in postsecondary environments. Federally funded TRiO programs, for example, support first-generation, low-income students and, in certain programs, students with disabilities specifically. High schools and postsecondary institutions can work with structured programs like these to reach historically underrepresented students with ASD earlier in the education pipeline. The findings of this study add another layer to our understanding of the broad backgrounds of students with ASD and provide contextual information about the experiences that lead to increased access and transition for these students.

Limitations and Future Research

Two obvious limitations to our findings center on the study’s sample of participants. First, the participants lacked diversity in terms of race/ethnicity, educational background, and family income status. Thus, the participants are not necessarily representative of families in the United States who successfully support their children with ASD to transition to college, though these kinds of national statistics are not yet available. Second, we did not interview college students with ASD themselves. Without their voices, the body of knowledge about college opportunity, access, and choice remains incomplete. Adding the voices of students with ASD to future research will enrich our conceptions of transition experiences to college. In addition, future studies should consider studying the experiences of students with ASD from an intersectionality framework. Intersectionality provides an appropriate lens from which to examine the ways multiple social identities—race/ethnicity, first-generation status, socioeconomic status—intersect along a continuum of (dis)advantage and (dis)empowerment for people with disabilities of all backgrounds (Peña, Stapleton, & Schaffer, 2016). Lastly, future studies should also include an exploration of institutional practices and cultures in supporting students with ASD to transition to college. Identifying patterns of systemic behaviors and policies will uncover enabling and disabling structures for the growing number of students with ASD entering our colleges and universities.

Discussion Questions

  1. Describe ways in which postsecondary institutions, particularly programs focused on outreach and recruitment of students, can reach out to historically underserved students with ASD and other disabilities.
  2. In what ways can institutions of higher education work with the K-12 system to develop college aspirations, advocacy skills, and social networks among students with ASD to enable them to access and succeed in postsecondary environments?

References

Bourdieu, P. (1986). The forms of capital. In J. G. Richardson (Ed.), Handbook of theory and research for the sociology of education (pp. 241-258). New York, NY: Greenwood Press.

Cawthon, S. W., Garberoglio, C. L., Caemmerer, J. M., Bond, M., & Wendel, E. (2015). Effect of parent involvement and parent expectations on postsecondary outcomes for individuals who are d/Deaf or hard of hearing. Exceptionality, 23(2), 73-99.

Higher Education Research Institute (HERI) (2011). College students with “hidden” disabilities: The freshman survey fall 2010. Retrieved from

MacLeod, A., Lewis, A. & Robertson, C. (2013). “Why should I be like bloody Rain Man?!” Navigating the autistic identity. British Journal of Special Education, 40(1), 41-49.

Merriam, S. B., & Tisdell, E. J. (2014). Qualitative research: A guide to design and implementation (4th ed.). San Francisco, CA: Jossey-Bass.

Newman, L., Wagner, M., Cameto, R., & Knokey, A. M. (2009). The post-high school outcomes of youth with disabilities up to 4 years after high school. A report from the National Longitudinal Transition Study 2 (NLTS-2), Prepared for the U.S. Department of Education (NCSER2009-3017). Retrieved from

Patton, M. Q. (2014). Qualitative research & evaluation methods: Integrating theory and practice (4th ed.). Thousand Oaks, CA: Sage Publications.

Peña, E. V., & Kocur, J. (2013). Parenting experiences in supporting the transition of students with autism spectrum disorders into community college. Journal of Applied Research in Community Colleges, 20(2), 5-12.

Peña, E. V., Stapleton, L. D., & Schaffer, L. M. (2016). Diverse and critical perspectives on disability identity. In E. S. Abes (Ed.), Critical perspectives on student development theory (pp. 85-96). San Francisco, CA: Jossey-Bass.

Raue, K., & Lewis, L. (2011). Students with disabilities at degree-granting postsecondary institutions (NCES 2011-018). U.S. Department of Education, National Center for Education Statistics. Washington, D.C.: U.S. Government Printing Office.

Roux, A. M., Shattuck, P. T., Rast, J. E., Rava, J. A., & Anderson, K. A. (2015). National autism indicators report: Transition into young adulthood. Philadelphia, PA: Drexel University.

U.S. Census Bureau (2013). Median income in the past 12 months (in 2013 inflation-adjusted dollars) by veteran status by sex for the civilian population 18 years and over with income. Retrieved from

About the Authors

Edlyn Peña is an associate professor and director of Doctoral Studies in Higher Education Leadership at California Lutheran University. She is an award-winning researcher who studies social justice issues for students with disabilities, particularly autism, in the preschool through higher education pipeline. As the Co-Director of the Autism and Communication Center and a member of the federal Interagency Autism Coordinating Committee, Peña is best known for her service to the autism community at the state and national level.

Jodie Kocur is an associate professor in the psychology department at California Lutheran University.  Dr. Kocur’s research interests include the transition to postsecondary education for students with Autism Spectrum Disorders (ASD) and the developmental origins of the experience and expression of anger in intimate relationships.  

Please e-mail inquiries to Edlyn Peña.


The ideas expressed in this article are not necessarily those of the Developments editorial board or those of ACPA members or the ACPA Governing Board, Leadership, or International Office Staff.

No Adult Left Behind: Student Affairs Practices Offering Social Support to Adult Undergraduates



Rebecca L. Brower
Bradley E. Cox
Amber E. Hampton
Florida State University


For each of the last six years at Florida State University, students pursuing a master’s degree in student affairs have taken a class entitled “The American College Student.” At the beginning of the first class, the students are asked to spend 10 minutes describing American college students.

The results are fairly consistent. A self-confident student begins the discussion by describing students who look/sound/think/act much like he or she did just a few years earlier. Next, another student, typically one who looks/sounds/thinks/acts differently than the first, tells the class that not all college students are the same and then describes how he or she differed from the first student’s description. Eventually, a student will argue that there are many ways to describe college students and that trying to define the American college student is impossible. The class then transitions to exploring the question, “How do we paint a single portrait of such a diverse group of students?” Students list a series of characteristics that might be used to differentiate undergraduate students, including demographics, institution type, and academic status. Rarely is age mentioned in the list of student characteristics.

In the second week of class, we transition from brainstorming about who we think college students are to established facts, sharing data from the National Center for Education Statistics (NCES). While this information substantiates the arguments made during the activity in the first class session, one statistic often catches students by surprise. According to NCES data, the average age of undergraduate students in 2000 was 25 years old. In fact, although students aged 25 or older accounted for only 27% of undergraduates in 1970, they make up roughly 42% of current undergraduates. These statistics surprise students because adult undergraduates are rarely mentioned during the initial class discussion about the different types of American college students.

This classroom experience led us to wonder to what extent student affairs professionals are genuinely aware of adult undergraduates. The question is particularly salient because, as Wyatt (2011) pointed out, adult students are “one of the largest and fastest growing populations of students” (p. 13). The perception that adult students are self-sufficient and do not need or want student affairs services may lead campus personnel to believe that adult undergraduates need less assistance than their traditional-age peers. However, as Hardin (2008) emphasizes, “the misperception still exists that adult learners are self-supporting and do not need the same level of support as eighteen- to twenty-three-year-old students. In reality, adult learners need at least as much assistance as traditional-aged students, and sometimes more” (p. 53).

The purpose of the current study was to examine the extent to which student affairs divisions are adopting practices that offer social support to adult undergraduates. We write this article not only as a call to action supported by our research findings, but also as an invitation to take note of a population on our campuses in need of greater social support. In this article we present new data suggesting that, by failing to adopt adequate practices supportive of adult students, divisions of student affairs at four-year colleges and universities may be losing an opportunity to improve outcomes for these students. Our study therefore poses the following research question: To what extent are student affairs divisions adopting practices that the literature suggests provide social support to adult undergraduates (aged 25 and older)?

Literature Review

As a group, adult students share a number of characteristics: they are more likely than traditional-age students to be first generation, female, ethnically diverse, and financially independent (Giancola, Grawitch, & Borchert, 2009). Adult students are also more likely to study part-time, have children for whom they are financially responsible, and work in addition to studying (Giancola et al., 2009; Hardin, 2008; Kasworm, 2003). Adult students’ reasons for returning to college include career concerns, family needs, and self-improvement (Bauman, Wang, DeLeon, Kafentzis, Zavala-Lopez, & Lindsey, 2004; Chao & Good, 2004; Kasworm, 2003). Kasworm (2003) argues that adult students are motivated to attend college by “personal transitions and changes” as well as the desire for “proactive life planning” (p. 6). These transitions may occur as the result of positive or negative life events such as a promotion at work, reevaluating life goals, divorce, or losing a job (Chao & Good, 2004; Hardin, 2008).

Female students, who constitute the majority of adult undergraduates, have special concerns when they return to college (Carney-Crompton & Tan, 2002). Women, in particular, are more likely to experience role conflict between home and school (Carney-Crompton & Tan, 2002). The demands of college, added to employment and other family roles, can strain women juggling multiple responsibilities. Conflicts such as these are often mitigated by the calculated choices some adults make about the timing of enrollment. Women often return to school to support their families when they divorce or when their children enter school (Hardin, 2008). Because psychological stability increases with age in women, female adult students may be better equipped to manage the stressors and role conflicts in their lives (Carney-Crompton & Tan, 2002; Hardin, 2008). Nonetheless, family responsibilities are the most frequently cited reason for female adult students to leave college. One important factor in this equation is the age of women’s children, because women caring for young children experience the greatest role conflict and face the most serious academic challenges (Carney-Crompton & Tan, 2002; Hardin, 2008). Regardless of students’ gender, family/school conflict can be a major source of stress for many adult students (Giancola, Grawitch, & Borchert, 2009).

Social support is an important issue for adult undergraduates, particularly for female students, because students with stronger social support systems perform better academically, while students with less social support are more likely to require campus services (Bauman et al., 2004). Sources of support for adult students tend to be family, partners, friends, coworkers, and professors on campus, though off-campus sources of support are often more influential in their lives (Bauman et al., 2004; Carney-Crompton & Tan, 2002; Chao & Good, 2004; Donaldson et al., 2000; Graham & Gisi, 2000). Depending on a student’s life situation, family, partners, friends, and coworkers off campus can be either a major source of social support or a major source of stress in the case of family/school and work/school conflict (Donaldson et al., 2000; Giancola, Grawitch, & Borchert, 2009).

The literature on adult undergraduates suggests a number of student affairs practices that may offer adult students social support, thereby helping them to succeed in college. First, student affairs offices would benefit from an infusion of ideas from colleagues who specialize in adult undergraduates (Bauman et al., 2004; Carney-Crompton & Tan, 2002; Giancola, Grawitch, & Borchert, 2009; Graham & Gisi, 2000; Hadfield, 2003; Hardin, 2008; Wyatt, 2011). After training or hiring staff who are cognizant of adult student needs and establishing an office for adult students, student affairs personnel can poll adult students through surveys or focus groups about their service needs and the greatest barriers to their success (Hadfield, 2003). These data can be used to establish services such as child care, orientation programs for adults, and adult student organizations (Bauman et al., 2004; Carney-Crompton & Tan, 2002; Giancola, Grawitch, & Borchert, 2009; Graham & Gisi, 2000; Hadfield, 2003; Hardin, 2009).

Programming is another area in which student affairs staff can offer support and help adult undergraduates succeed. Programming that nontraditional students find particularly useful includes workshops on stress and time management as well as study skills (Bauman et al., 2004; Giancola, Grawitch, & Borchert, 2009). In addition, programming that welcomes families, partners, and friends can help adult students feel included in campus life (Hadfield, 2003; Hardin, 2009). Other existing services that can be tailored to the needs of adult students include career counseling, personal counseling and support groups, academic advising, and financial aid advising (Bauman et al., 2004; Chao & Good, 2004; Donaldson et al., 2000; Giancola, Grawitch, & Borchert, 2009). These services, apart from childcare, are particularly useful for adult students who may not be interested in the traditional collegiate social experience but benefit from resources connected to employment and course completion.

Theoretical Framework

The theoretical framework for this study is grounded in psychological literature suggesting that adults require four types of support from their social systems: emotional, appraisal, informational, and instrumental support (House, 1981). As described by House (1981), emotional support is felt when others are trustworthy and show concern; appraisal support provides positive encouragement and feedback; informational support is the sharing of knowledge and instruction; and instrumental support is tangible aid, such as modifying a physical setting or investing funds. These categories broaden our understanding of social support and informed our research question, research design, and interpretation of findings. Specifically, we used this framework to categorize five student affairs practices as offering instrumental, informational, or appraisal support. Thus, childcare services offer instrumental support; new student orientations specifically for adult undergraduates provide informational support; adult student organizations and programming for students of diverse ages offer appraisal support; and hiring student affairs staff with expertise in adult undergraduates provides all three types of social support.

Methods and Data Sources

In order to determine the extent to which student affairs divisions are adopting practices that the literature suggests provide social support to adult undergraduates, we used data from the Survey of Student Affairs Policies, Programs, and Practices, which was distributed to the Chief Student Affairs Officer (typically the Vice President for Student Affairs or Dean of Students) at 57 institutions in five states (California, Florida, Iowa, Pennsylvania, and Texas). This quantitative survey included 34 categories of questions covering topics such as advising, orientation, and assessment data usage. The survey was distributed by project staff in both hard copy and electronic formats. The 57 participating institutions included 22 bachelor’s degree-granting institutions, 29 master’s degree-granting institutions, two doctoral degree-granting institutions, and four specialty institutions (i.e., one Bible college, one health professions school, and two schools of art and design). Of the participating colleges and universities, 13 were public not-for-profit, 42 were private not-for-profit, and two were private for-profit institutions. The sample included a wide range of institutional types, sizes, levels of selectivity, and sources of control/funding. To review full results from the project, visit:

Our initial reports included information on the prevalence of student services in areas that the research literature suggests are especially beneficial for adult students: hiring student affairs staff with expertise in adult students, childcare services, orientation programs for adult students, student organizations for adult students, and events for students of different ages. As shown in Table 1, specific questions targeted student populations and the adoption of policies. Possible answers to survey questions were dichotomous yes/no responses.

Survey Topic | Survey Question
Orientation | Does the institution provide an orientation for specific student populations?
Events | Does the institution’s student affairs division hold scheduled events and programming for specific student populations?
Student Organizations | Does the institution have formally recognized student organizations for specific student populations?
Staff Expertise | Does the institution purposefully recruit staff members or counselors with expertise in specific student populations?
Childcare Services | Does the institution have childcare services available for students?

Table 1.  Student Population Survey Questions (LIPSS)

We then compared the adoption rates of these practices (with the exception of childcare services) with those for international students and students of color. We targeted these populations for comparison because services for traditionally underrepresented groups often increase the likelihood of student success (Grant-Vallone, Reid, Umali, & Pohlert, 2003). We did not include survey questions about childcare services specifically for international students and students of color because these services are typically extended to all students; therefore, childcare is not included in subsequent statistical analyses. Finally, we performed four logistic regressions to determine whether higher percentages of adult undergraduates at an institution increased the likelihood that the student affairs practices mentioned above (excluding childcare) would be enacted at that institution.


Findings

Our survey asked the extent to which student affairs divisions were adopting practices that the literature suggests provide social support to adult undergraduates. The answer to this question was an unexpected finding that warrants closer attention.

Table 2. Percentage of Institutions Adopting Student Affairs Practices for Specific Populations

Practice | International Students | Students of Color | Adult Students
Orientation | 77% | 21% | 21%
Events | 76% | 74% | 37%
Student Organizations | 79% | 60%* | 26%
Staff Expertise | 53% | 51% | 9%


*Average for African American, Hispanic, Asian American, and Native American student organizations.

As we reviewed the results (see Table 2), we encountered the same surprise as the students in our American College Student class. When we compared the adoption rates of practices for adult undergraduates to rates for international students and students of color, we found that, with the exception of orientation programs for students of color, adoption rates were higher for the other student populations. It was disheartening to find that student affairs services for adult undergraduates lag behind services for other groups on campus. For instance, when we asked whether student affairs divisions purposely recruited staff members or counselors with expertise in specific student populations, 53% of institutions reported staff expertise for international students and 51% for students of color, but only 9% reported staff expertise for adult students. Logistic regressions uncovered no evidence that student affairs practices for adult undergraduates were related to the percentage of adult undergraduates attending an institution. In none of the four logistic regressions was the size of the adult-student population a statistically significant predictor of adoption rates for these practices.

The regression results showed that higher percentages of adult undergraduates did not reliably distinguish between institutions with student orientations for adult students and those without such orientations (chi square = 1.706, df = 2, p = .189), nor between those with events for adult students and those without (chi square = .901, df = 2, p = .357), nor between those with student organizations for adult students and those without (chi square = .05, df = 2, p = .822), nor between those with student affairs staff expertise in adult students and those without (chi square = .033, df = 2, p = .855). In all of these cases, there was little relationship between the variables (Nagelkerke’s R2 of .046 for orientation, .021 for events, .001 for organizations, and .001 for staff expertise). Thus, the greater presence of adult undergraduates in the student population does not appear to influence the practices adopted for these students, and the differing rates at which institutions adopt services supporting adult students cannot be dismissed as simply a function of the composition of the student body.


Any type of support, whether from our communities, families, or staff, decreases the impact of stressors during the college years (Carney-Crompton & Tan, 2002; Giancola, Grawitch, & Borchert, 2009; Johnson, Schwartz, & Bower, 2010), and yet we found that many campuses were not instituting the changes needed for adults. This may be due to a lack of financial resources to develop new programming or to the perception that the adult population on campus is small. However, our results illustrate that student affairs services specifically for adult students lag markedly behind services for other student groups. We affirm the crucial importance of student affairs practices for students of color and international students, and argue that these practices could be extended to adult undergraduates as well. Furthermore, the greater presence of adult undergraduates in the student population does not seem to influence the rates of adoption of student affairs practices for these students. In light of these findings and the observation that adults are the fastest growing segment of the undergraduate population (Wyatt, 2011), we suggest that student affairs divisions would be well served by a reexamination of their practices related to nontraditional students.

What Institutions Can Do

From our literature review and survey results, we identified five student affairs practices that can offer support for adult students: hiring staff specializing in adult undergraduate experiences and issues; providing childcare; and tailoring orientation programs, programming, and student organizations to adult undergraduates. It is our hope that services and supports developed or modified for adult students will increase their social support and success, both on and off campus. Each of the five areas described below is supported by literature suggesting that these practices are especially beneficial for adult undergraduates.

Specialized Staff

Our survey revealed that relatively few institutions hire student affairs staff with expertise in adult undergraduates. Student affairs offices can benefit from the insights of colleagues who have attended college as adult undergraduates or who specialize in the needs of adult undergraduates (Bauman et al., 2004; Carney-Crompton & Tan, 2002; Giancola, Grawitch, & Borchert, 2009; Graham & Gisi, 2000; Hadfield, 2003; Hardin, 2008; Wyatt, 2011).  Thus, we argue that a crucial first step in addressing the needs of these students is to hire or train staff with expertise in the lifestyles of nontraditional students.

Having advocates for adult students on staff might also lead to the establishment of a central office through which information and services could be disseminated to adult students. Giancola, Grawitch, and Borchert (2009) identify a common thread for adults: conflicts among work, school, and family commitments are prevalent among adult undergraduates. A dedicated space, such as a student affairs office staffed by specialists in adult undergraduates, can help adults navigate these conflicts (Hadfield, 2003). We cannot overstate the importance of advocacy that offers adult students resources, skills, support, and peace of mind. Even hiring one staff member can make a substantial difference for adults.


Childcare

If we frame the needs of adult students in terms of Maslow’s hierarchy of needs, one basic need is the financial means to attend college. Another, just as basic for many adult students, is the need to care for children, particularly for adult women, who make up the majority of adult students (Johnson, Schwartz, & Bower, 2010). Miller, Gault, and Thorman (2011) report that in 2008, 23% of undergraduate students were parents. Many of these parents face greater challenges in college and have a lower rate of degree completion than students who do not have children (Miller, Gault, & Thorman, 2011). These facts bring childcare to the forefront as a way to increase access for adult undergraduates.


Orientation

Many more schools have begun to institute transfer student orientation, an improvement in sharing institutional resources with incoming students from other colleges. Because many transfer students come from community colleges (Erisman & Steele, 2015), it may be advantageous for institutions to consider adding specific adult student components. New student orientations designed for adult undergraduates could provide both informational and social support: a tailored orientation can assist adult students in adjusting to college and connecting with resources on campus, and, because social support is an important consideration for this population, it would also help nontraditional students network with one another. An approach like this is likely to help all nontraditional students, and specifically to assist adult students in building their peer networks and in adjusting to the academic and social demands of college life.


Programming

Programming is another area in which student affairs staff can help adult undergraduates succeed. Bauman et al. (2004) and Giancola, Grawitch, and Borchert (2009) found that the programming nontraditional students find particularly useful includes workshops on stress management, time management, and study skills. In addition to skills-related programming, social programs that are open to families, partners, and children can widen avenues of involvement and feelings of belonging for adults.

Student Organizations

Adult student organizations would likewise provide nontraditional students with a sense of belonging, along with validation for the stressors in their lives (Bauman et al., 2004; Carney-Crompton & Tan, 2002; Giancola, Grawitch, & Borchert, 2009; Graham & Gisi, 2000; Hadfield, 2003; Hardin, 2008). Our study revealed that fewer events are geared to students of different ages than to international students and students of color. This deficit could be addressed through programs that go beyond a single identity and instead cater to the multiple identities students hold. For example, an existing program geared toward African American students could add the words “children and families welcome” to shift the way the program is perceived by adult students.


Conclusion

Our study suggests that student affairs services offering social support to adult undergraduates lag behind services for other groups on campus. We also find that the percentage of adult undergraduates in the student population has little relationship to the availability of services for these students. Thus, we argue that student affairs divisions can do much more to facilitate the success of adult undergraduates in four-year colleges and universities. Specifically, we recommend that student affairs divisions hire staff with expertise in adult undergraduates, who could then establish offices with services tailored to the needs of these students.

Discussion Questions

  1. How might your institution adapt existing events and services to encourage adult undergraduate participation?
  2. Adult undergraduates often struggle to balance responsibilities such as work and family with academics. How might the advice and resources you provide adult undergraduates differ from the advice and resources you provide traditional-aged students?
  3. It is often assumed that adult undergraduates should become assimilated into college life. We propose that it is equally important for colleges to become better integrated in the lives of students.  How might colleges better integrate the college culture with adult students’ lives?


References

Bauman, S. S. M., Wang, N., DeLeon, C. W., Kafentzis, J., Zavala-Lopez, M., & Lindsey, M. S. (2004). Nontraditional students’ service needs and social support resources: A pilot study. Journal of College Counseling, 7(1), 13-17.

Carney-Crompton, S., & Tan, J. (2002).  Support systems, psychological functioning, and academic performance of nontraditional female students. Adult Education Quarterly, 52(2), 140-154.

Chao, R., & Good, G. E. (2004).  Nontraditional students’ perspectives on college education: A qualitative study.  Journal of College Counseling, 7(1), 5-12.

Donaldson, J. F., Graham, S. W., Martindill, W., & Bradley, S. (2000). Adult undergraduate students: How do they define their experiences and their success? The Journal of Continuing Higher Education, 48(2), 2-11.

Erisman, W., & Steele, P. (2015). Adult college completion in the 21st century: What we know and what we don’t. Washington, DC: HigherEd Insight.

Giancola, J. K., Grawitch, M. J., & Borchert, D. (2009). Dealing with the stress of college: A model for adult students. Adult Education Quarterly, 59(3), 246-263.

Graham, S. W., & Gisi, S. L. (2000).  Adult undergraduate students: What role does college involvement play?  NASPA Journal, 38(1), 99-121.

Grant-Vallone, E., Reid, K., Umali, C., & Pohlert, E. (2003). An analysis of the effects of self-esteem, social support, and participation in student support services on students’ adjustment and commitment to college. Journal of College Student Retention: Research, Theory, and Practice, 5(3), 255-274.

Hadfield, J. (2003). Recruiting and retaining adult students. New Directions for Student Services, 2003(102), 17-26.

Hardin, C. J. (2008).  Adult students in higher education: A portrait of transitions.  New Directions for Higher Education, 2008(144), 49-57.

House, J.S. (1981). Work stress and social support. Reading, MA: Addison Wesley.

Johnson, L. G., Schwartz, R. A., & Bower, B. L. (2010).  Managing stress among adult women students in community colleges.  Community College Journal of Research and Practice, 24, 289-300.

Kasworm, C. E. (2003). Setting the stage: Adults in higher education. New Directions for Student Services, 102, 3-10.

Miller, K., Gault, B., & Thorman, A. (2011). Improving child care access to promote postsecondary success among low-income parents. Washington, DC: Institute for Women’s Policy Research.

Wyatt, L. G. (2011).  Nontraditional student engagement: Increasing adult student success and retention. The Journal of Continuing Higher Education, 59(1), 10-20.


About the Authors

Rebecca L. Brower is a Research Associate at the Center for Postsecondary Success at Florida State University. Her research focuses on institutional policies in higher education, particularly diversity policies that facilitate student encounters with difference.


Bradley E. Cox is an Associate Professor of higher education in the Department of Educational Leadership and Policy Studies at Florida State University. His research explores how institutional policies shape student experiences in college, with a particular emphasis on the systemic, institutional, and personal conditions that shape college experiences and outcomes for students on the autism spectrum.


Amber E. Hampton is an Associate Director at the Center for Leadership & Social Change at Florida State University and current doctoral student in the Higher Education program focusing on public policy. Hampton’s work as an Associate Director involves increasing dialogue as a form of communication across campus through programs with faculty, staff, and students. Her research as a student focuses on college access and underrepresented populations in higher education.


Please e-mail inquiries to Rebecca L. Brower.


The ideas expressed in this article are not necessarily those of the Developments editorial board or those of ACPA members or the ACPA Governing Board, Leadership, or International Office Staff.

After 35 Years of Publishing Standards, Do CAS Standards Make a Difference?


Wendy Neifeld Wheeler
Kelcie Timlin
The College of Saint Rose
Tristan Rios, Hamilton College


The Council for the Advancement of Standards (CAS) released the 2015 edition of the blue book in August 2015. This 9th edition of CAS includes thirteen revised functional areas and guidelines. As higher education professionals know, calls for better quality accountability measures can be heard from stakeholders across the nation, including the federal government, funding agencies, state legislators, accrediting associations, elected officials, parents, and students. Educators within student affairs have been impacted by the increasing demands to provide evidence of outcomes assessment. Upcraft and Schuh (1996) asserted that as “questions of accountability, cost, quality, access, equity and accreditation combine to make assessment a necessity in higher education, they also make assessment a fundamental necessity in student affairs as well” (p. 7). The inherent goal of continuous improvement in higher education is an equally compelling factor influencing the necessity of assessment (Keeling, Wall, Underhile, & Dungy, 2008). According to Keeling et al. (2008), “the use of assessment more importantly emerges from the desire of faculty members, student affairs professionals, parents, students and institutional administrators to know, and improve, the quality and effectiveness of higher education” (p. 1).

CAS was launched as a consortium of 11 charter members with the express charge “to advance standards for quality programs and services for students that promote student learning and development and promote the critical role of self-assessment in professional practice” (CAS, 2012, p. v). It has influenced assessment practice in student affairs since its inception and continues to provide significant revisions and updates, as with the newest edition’s release in August 2015. A primary goal of CAS is to fulfill the foundational philosophy that “all professional practice should be designed to achieve specific outcomes for students” (CAS, 2012, p. v). According to Komives and Smedick (2012), “utilizing standards to guide program design along with related learning outcomes widely endorsed by professional associations and consortiums can help provide credibility and validity to campus specific programs” (p. 78).

Today, CAS has grown to include forty-one member organizations with representatives who have developed these resources through a collective approach that integrates numerous perspectives across student affairs.

The purpose of this study was to replicate the original research of Arminio and Gochenauer (2004). Their investigation was designed to “assess the impact of CAS on professionals in CAS member associations…the researchers sought to explore who uses CAS standards, how and why they are used and whether CAS standards are associated with enhanced student learning” (Arminio & Gochenauer, 2004, p. 52). To that end, the processes of sampling and data collection were mirrored as much as possible. This article addresses the question “What changes, if any, are there between the results of the 2004 publication by Arminio and Gochenauer and the current study?”


In the spring of 2012, the investigators contacted original author Jan Arminio to request permission to duplicate the 2004 quantitative study. The researchers also consulted with Laura Dean, then president of CAS, for approval to move forward with the research. Dean, in consultation with her CAS colleagues, agreed that CAS would support the research project. IRB approval was sought and granted from the home institution. Once all approvals were completed, Dean, on behalf of the investigators, emailed an invitation to participate in the study to the designated CAS liaisons (representatives of each member association whose role is to facilitate communication between CAS and the association). Ten of the 41 professional associations agreed to participate, resulting in an initial sample size of 2,127. The response rate for each of these associations is provided in Table 1.

Name of Organization | Response Percentage | Responses

ACPA (American College Personnel Association) | 13% | 41
ACHA (American College Health Association) | 1% | 5
ACUHO-I (Association of College and University Housing Officers - International) | 4% | 12
ACUI (Association of College Unions International) | 3% | 8
AHEAD (Association on Higher Education and Disability) | 1% | 3
NACA (National Association for Campus Activities) | 1% | 3
NACADA (National Academic Advising Association) | 37% | 114
NACE (National Association of Colleges and Employers) | 1% | 4
NASAP (National Association of Student Affairs Professionals) | 4% | 13
NODA (National Orientation Directors Association) | 2% | 7
Other | 33% | 99

Table 1. Professional Organization Response Rates

The survey consisted of all quantitative multiple-choice questions from the original study (Arminio & Gochenauer, 2004), plus several new multiple-choice questions that expanded the question set to 18 items. Twelve of the 18 questions allowed participants to add comments beyond selecting one of the available answer choices. The instrument had three primary purposes: to determine the extent to which respondents were familiar with and/or utilized the CAS standards; to learn of other assessment tools or methods being used; and to investigate any existing relationship between assessment practices and student learning outcomes. The instrument was uploaded to SurveyMonkey, a free online survey platform, and the researchers analyzed the data using the software included in the SurveyMonkey platform. Frequencies and summaries of data were included in the statistical analysis.


A total of 15% (n=309) of the initial sample was included in the data analysis. Of the 309 respondents, 36% (n=109) indicated they were employed at public four-year institutions and 24% (n=75) at private four-year institutions. Two-year community colleges were represented by 12% (n=36) of the sample. A total of 6% (n=18) of individuals indicated job titles of vice president, associate vice president, or assistant vice president, while 9% (n=29) indicated dean, associate dean, or assistant dean. Directors made up the largest group of participants at 23% (n=72), followed by coordinators at 15% (n=45). New professionals made up 9% (n=28) of the sample, and faculty represented 6% (n=18). The remaining 32% (n=99) of participants indicated other as their job title, as described in Table 2.

Table 2. Percentage of Respondents in Each Job Category


Knowledge and Use of CAS

The current instrument included five yes/no questions, the first asking participants whether they had previously heard of CAS. A total of 82% (n=254) of participants indicated that they had heard of CAS; by comparison, Arminio and Gochenauer (2004) reported that 61% (n=890) of participants in their study had heard of CAS. For participants in the current study who indicated they had heard of CAS, follow-up questions investigated the extent of their usage of CAS. Of the 254 respondents who had heard of CAS, 65% (n=158) indicated they were using it.

CAS Resources and How They Are Used

Respondents who indicated they were using CAS (n=158) were asked to identify how they were using the current or past editions of the blue book, the CAS CD, or a particular CAS Self-Assessment Guide (SAG). Of those who use CAS, 81% (n=128) indicated that the blue book was their primary source of CAS-related information, and 72% (n=114) indicated that SAGs for particular functional areas were their secondary source of information.

Participants were then asked to identify how they use each of the materials. The multiple-choice options included: read it, to conduct self-assessment, for evaluation, as a general reference, as a resource guide for my work, for staff development, to increase institutional support, and for accreditation. The foremost reasons respondents were using the CAS resources were to conduct self-assessment (41%; n=65) and as a resource guide (35%; n=55). Only 11% (n=17) of respondents selected “to increase institutional support” as a way they used CAS materials.

Arminio and Gochenauer (2004) reported that in their study “more respondents used CAS materials to guide their programs than for self-assessment” (p. 57), which is the reverse of this study’s findings.  Arminio and Gochenauer (2004) did not report on whether respondents indicated using CAS to increase institutional support, but they did include the general statement “several respondents noted using CAS standards to document support for increased resources” (p. 59).

The instrument contained a multiple-choice question asking respondents whether CAS had influenced their programs and services. The question included eight items for participants to select, along with the option to write in additional comments. The most frequently identified item was assessing current program, reported by 70% (n=111) of participants, followed by mission statement and goals at 44% (n=70). This is highlighted by respondents who shared, “CAS Standards were critical in helping me to develop program initiatives, mission statements, and assessment plans” and “CAS provides an essential guide of how to best measure, compose and evaluate one’s departmental programs and services.” A total of 9% (n=14) of respondents reported that CAS influenced budget requests. Based on respondents’ comments, CAS standards were also credited with influencing “emergency procedures and statements of ethics” and with “serving as a guideline for organizational change.”

Of the respondents who indicated they had heard of CAS, 35% (n=80) were not using it. The most common reasons given for using an alternative assessment tool were that the tool was more specific to the program/service (59%; n=47), easier to adapt to the program/service (33%; n=26), or less complex than CAS (9%; n=7). As with the previous question, the instrument allowed participants to specify other reasons for using an alternative assessment tool. Respondents indicated via comments that the alternative tools they were using had been “selected by the division of student affairs” or were “based on the school’s strategic plan.”

Participants who indicated they had not heard of CAS were asked about their knowledge of other available assessment tools. Of the 55 such participants, 32% (n=18) indicated they had heard of alternative assessment tools, while 68% (n=37) had not. Of those who had heard of other assessment tools, 71% (n=13) indicated they were using an alternative. These participants listed the following instruments as examples: Noel-Levitz, Collegiate Learning Assessment, World-Class Instructional Design and Assessment, and the HESI Admission Assessment Exam. Comments also indicated that some of the alternative assessment instruments being used had been developed specifically for the department by internal staff.

Influence of CAS on Learning

A total of 85% (n=134) of respondents stated there was a connection between learning outcomes and CAS. Arminio and Gochenauer (2004) reported that 24% of their respondents stated they measured learning outcomes generally, and of those, 41% indicated a connection between the measured learning outcomes and CAS standards. Several respondents provided comments on learning outcomes as they relate to CAS. One participant stated, “I think the learning outcomes are brilliant and will guide our programs at my university,” while another shared, “CAS provides an essential list of tools and items that must be included in order to meet minimal to substantial learning outcomes.” The current climate, with its distinct focus on learning outcomes, may have influenced the increase from the 2004 study to the present.

Of those in the current study who indicated there was a connection between learning outcomes and CAS standards, 64% (n=86) described the connection as strong. The connection was described as vague by 36% (n=48) of respondents; Arminio and Gochenauer (2004) reported 28% of respondents describing a vague connection between learning outcomes and CAS standards. The primary reason given for learning outcomes not being connected to assessment was that student satisfaction is measured more often than learning outcomes.

Positive Comments and Constructive Criticism of CAS

Respondents were asked to reflect generally on CAS and share their thoughts.  The most common theme was focused on the collaborative nature of the CAS standards.  This is emphasized by the thoughts of one respondent:

CAS is an essential resource to the student affairs profession.  It is the ONLY available set of objective standards for a standard of practice in each area of student affairs.  The process by which CAS standards are written and vetted is excellent,

and “knowing functional area groups from across campus had to come together to agree on these standards provides even more weight as we work to make change.” The most commonly cited challenge of the CAS standards was summarized by another respondent: “The sections [of CAS] are somewhat repetitive across functional areas…and the learning outcomes could be more specific to each functional area rather than just discussing broadly.”


Conclusion

The data collected in this study not only support and enrich the research of Arminio and Gochenauer (2004), but also provide an indicator of sustained knowledge and use of CAS since its introduction in the 1970s. The current study also indicates that CAS is used primarily to conduct self-assessment and that these assessment activities are directly related to established learning outcomes. Recognizing the potential role CAS can play in enhancing the assessment practices of student affairs educators, professional organizations may want to consider additional means of providing training on CAS standards for their members.

Department leaders are encouraged to continue intentional discussions about the role of assessment in the day-to-day work of student affairs. To ensure continued commitment to assessment activities in the future, considerable thought and resources need to be part of a department’s strategic planning.  If one role of student affairs educators is to create the most effective learning opportunities for students, it is imperative that assessment undertakings hold a place of priority.

Limitations and Future Research

This study has a number of limitations.  The overall response rate was low.  This may have been impacted by several confounding factors. Specifically, the original study used a paper and pencil survey that was mailed to prospective participants.  The current study mirrored the questions from the original study, but used an electronic platform for administration. Shih and Fan (2008) have found “web survey modes generally have lower response rates (about 10% lower on the average) than mail survey modes” (p. 264).

The investigators experienced some unforeseen minor technological complications in the use of SurveyMonkey, which required three slight revisions during the distribution of the study. As a result, initial invitees were asked to disregard the first link (to the first survey) and use the subsequent link. Future investigators who choose to replicate the original research may want to revert to a paper and pencil version of the instrument.

It is also possible that those who chose not to participate opted out of the survey because they were unfamiliar with CAS and did not feel their responses would be valued. It may benefit investigators to more overtly express that invitees do not need to be versed in CAS to participate in the research. Although the sample size was sufficient, others may want to implement additional strategies to increase the overall sample size.

The participants of this study were exclusively members of a “member association” of CAS, thus potentially skewing the results of the study in the direction of CAS knowledge and use.  Relatedly, selection bias may also be a limitation; those who responded may have been more invested in the subject of CAS or have had a strong orientation toward supporting CAS. This restricts the generalizability of the findings to the wider range of diverse student affairs professionals who may not belong to these member associations and limits the contextual range of the data.  Future investigations may want to consider a comparison group by including professional organizations that are not member associations of CAS.


Colleges and universities continue to work towards improving assessment and accountability practices. Student affairs professionals seeking to advance their programs and services may want to reflect on whether CAS has served as a valuable resource for peers who, in this study, indicated positive experiences with the CAS instrument.  CAS provides a vetted tool that can serve as a resource in creating new programs, improving current practices and generally providing an instrument with which to judge our work in an intentional way. It is likely that CAS usage will continue to grow in member organizations and that new functional areas will be added.

In summary, two participants articulated the overall value of CAS in these ways: “I find the CAS standards to be very meaningful and an important framework from which to maintain clear focus about what programs are/are not doing and how to communicate to others what national standards and norms are” and “I think CAS standards are valuable to give our work credibility and as they provide guidance for us as we develop our programs.”

Discussion Questions

  1. What additional avenues can be utilized to broaden and enhance the use of CAS across divisions of Student Affairs?
  2. How can a CAS self-assessment study provide additional credibility and validity to the work of student affairs professionals?
  3. As campuses continue to explore and examine their own cultures of assessment, where does the CAS instrument fit into this picture?


References

Arminio, J., & Gochenaur, P. (2004). After 16 years of publishing standards, do CAS standards make a difference? College Student Affairs Journal, 24(1), 51-65.

Council for the Advancement of Standards in Higher Education. (2012). CAS professional standards for higher education (8th ed.). Washington, DC: Author.

Keeling, R. P., Wall, A. F., Underhile, R., & Dungy, G. J. (2008). Assessment reconsidered: Institutional effectiveness for student success. Washington, DC: National Association of Student Personnel Administrators.

Komives, S. R., & Smedick, W. (2012). Using standards to develop student learning outcomes. In K. L. Guthrie & L. Osteen (Eds.), Developing students' leadership capacity (New Directions for Student Services, no. 140, pp. 77-88). San Francisco, CA: Jossey-Bass.

Shih, T. H., & Fan, X. (2008). Comparing response rates from Web and mail surveys: A meta-analysis. Field Methods, 20, 249-271.

Upcraft, M., & Schuh, J. (1996). Assessment in student affairs: A guide for practitioners. San Francisco, CA: Jossey-Bass.

About the Authors

Wendy Neifeld Wheeler, Ph.D. is the Dean of Students/Title IX Coordinator at the Albany College of Pharmacy and Health Sciences.  She also teaches as an adjunct instructor in the College Student Services Administration program at The College of Saint Rose.

Kelcie Timlin, MS.Ed., is an Assistant Registrar at The College of Saint Rose.  Her current interests include Academic Advising and whole student development.

Tristan Rios, MS.Ed., is a Resident Director at Hamilton College.  He is interested in pursuing more advanced positions in Residence Life and aspires to be a Director.

Please e-mail inquiries to Wendy Neifeld Wheeler.

The ideas expressed in this article are not necessarily those of the Developments editorial board or those of ACPA members or the ACPA Governing Board, Leadership, or International Office Staff.


Strengths as a “Career Compass”: Helping Undergraduate Students Navigate their Career Development through Strengths Awareness and Development

Krista M. Soria, Brooke Arnold, & Katy Hinz, University of Minnesota
Jeremy Williams, University of St. Thomas

Career development professionals in higher education institutions are increasingly implementing strengths-based approaches in their daily practice with undergraduate students (Janke, Sorenson, & Traynor, 2010; Reese & Miller, 2009; Soria & Stubblefield, 2014, in press-a, in press-b; Stebleton, 2010; Stebleton, Soria, & Albecker, 2012). One of the most well-known tools to help college students discover their strengths is the Clifton StrengthsFinder, an online assessment that identifies areas where an individual’s greatest potential for building strengths exists (Asplund, Lopez, Hodges, & Harter, 2009).  These identified areas, referred to as talent themes, are naturally recurring patterns of thoughts, feelings, and behaviors that, when refined with knowledge and skill, can be developed into strengths (Hodges & Harter, 2005). The StrengthsFinder assessment helps individuals to identify their five most salient talent themes out of 34 natural talent themes, often known colloquially as an individual’s “top five strengths.”

One of the fundamental principles underlying strengths-based perspectives in higher education is that college students who capitalize upon their best qualities will experience greater success in a variety of outcomes than if they spend time remediating their weaknesses (Clifton & Harter, 2003; Lopez & Louis, 2009). Scholars have contended that strengths-based interventions in higher education promote college student engagement and retention because students who identify and apply their strengths in their lives will be more focused on their academic and career goals (Soria & Stubblefield, 2014; Stebleton, Soria, & Albecker, 2012). Yet, even as over one million college students in the United States have discovered their top five strengths and strengths-based approaches continue to gain steady momentum in colleges and universities, little research exists that empirically describes the benefits of strengths-based practices. In particular, little is known about the potential benefits of strengths-based approaches as a tool to elevate college students’ career exploration and career planning.  Therefore, the purpose of this study was to examine college students’ own perspectives on the utility of strengths-based approaches and strengths awareness in their career development.


In fall 2011, the Office of Student Engagement at a large, public research university located in the Midwest offered the StrengthsFinder assessment to all incoming first-year students at no charge. Before they arrived on campus for matriculation, 5,122 first-year students (amounting to 95.4% of the first-year class) took the online assessment and received their top five talent themes. All first-year students attended a strengths seminar during a weeklong concentration of programming prior to fall classes. Many first-year students also encountered strengths-related discussions in first-year seminar classes, housing and residential life offices, and in many other areas across campus. At the end of their first semester, all first-year students were invited to participate in an online survey measuring their strengths-related engagement across campus. To encourage participation, a lottery incentive was offered to participants, in the form of a chance to win one of four $25 university bookstore gift certificates. In the survey, students were asked to provide insights into how they had utilized their strengths within their first semester of study at the University.  These data were then used in the present study.


The student response rate for the survey was 27.8% (n = 1,493). White and female students were slightly overrepresented in this sample when compared with the first-year student population (which was 52.2% female and 75.4% White).  The sample was 61% female, 3% Black, 13% Asian, 1% Native American, 3% Hispanic, 4% international, and 76% White.

Data Analysis

In the survey, students responded to several essay questions, one of which asked students to “provide specific examples of where strengths has benefited your first-year experience at the University.” We used NVivo 10 software (QSR International, 2012) to categorize and code students’ responses to the survey item. Data were analyzed using in vivo, open, axial, and selective coding procedures (Creswell, 2007). In the process of integrating the data and refining the categories, central themes emerged that explained relationships among the data. Both codes and themes were sorted and reviewed for similarities and differences until the point of saturation: the point at which additional analysis does not offer any additional insight (Creswell, 2007). To enhance the credibility of our qualitative data analyses, we used direct quotes to authenticate the findings (Merriam, 2009). Codes and themes were verified by the authors, a step which enhanced the validity of the analyses (Creswell, 2007).
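As a rough illustration of the open-coding step described above: the authors used NVivo, but the underlying idea of tallying how often each code appears across responses can be sketched in a few lines of standalone Python. The responses and the keyword-to-code mapping below are invented for illustration only; they are not data or codes from the study.

```python
from collections import Counter

# Hypothetical survey responses (invented for illustration; not study data).
responses = [
    "My strengths helped me pick a career path.",
    "I mentioned my strengths in a job interview.",
    "Knowing my strengths made me more confident in my major choice.",
]

# Hypothetical open codes: a keyword mapped to a code label.
codebook = {
    "career": "career exploration",
    "interview": "employment",
    "major": "career decision-making",
}

# Tally how often each open code appears across the responses.
tally = Counter()
for response in responses:
    text = response.lower()
    for keyword, code in codebook.items():
        if keyword in text:
            tally[code] += 1

print(tally.most_common())
```

In real qualitative analysis the codes emerge from repeated reading rather than fixed keywords; this sketch only shows the bookkeeping that software like NVivo automates.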


In analyzing the qualitative data, two key themes emerged that conveyed how some students perceived applications between strengths and their career exploration and development. The first theme described below conveys how strengths benefitted students’ career development by enhancing their self-awareness and increasing their career decision-making abilities. The second theme describes how students used their strengths self-awareness to obtain employment or engage in experiential opportunities. Collectively, these two broad themes suggest that strengths awareness enhances students’ self-awareness in ways that benefit areas related to short-term career opportunities and long-term career development pathways.

Using Strengths as a Compass in Career Development Decisions

In responding to the prompt asking students to cite specific examples of where strengths benefited their first-year experience, several students noted the applicability of strengths with regards to their career exploration.  For example, one student noted that he used strengths in “discovering what career path would fit me,” while another student reflected that strengths were useful with regards to “looking into my future career path.” A third student focused on the holistic benefits of strengths in career decision-making in stating:

Knowing my strengths gives me a good idea where I stand. I understood more about myself after the survey, and with some research, what kind of career would be good for me. I used them as the compass to discover the way I should be heading during my career.

In addition to discovering their top five strengths, students learned more about the types of work environments that would suit them best.  For example, one student wrote:

I realized that I don’t want to go into a job where I am doing basically the same thing every day. My top strength was “learner.” I knew that, but I never realized it or thought about it. My strengths helped me see that I really want/need to be in an occupation in which I am always learning and discovering new things.

Discovering these top five strengths improved the student’s ability to be more selective when pursuing jobs and their associated work environments.

While many students discussed how they were going to apply their strengths while making decisions along their career paths, several students also felt affirmed that they were already making appropriate career decisions after learning about their strengths.  For instance, one student noted that her strengths awareness “helped me to reassure myself in my chosen career/major path. My strengths fit my choice.”  Similarly, another student wrote, “It was helpful for affirmation that I’m looking at the right career paths.” These affirmations point toward an increase in students’ career decision-making abilities, as students became more confident that they were making the most appropriate career development decisions for themselves. In particular, one student’s reflection conveyed a deep understanding and application of strengths in consideration of a career path:

I would say that strengths has increased my self-awareness and has also reinforced some of my ideas about potential career paths. For example, my strengths: learner, intellection, input, restorative, and achiever fit my goal of becoming a doctor because it is necessary to be a lifelong learner, to be able to set and accomplish tasks, and to be able to solve problems.

Strengths gave these students the skills to help discover their career path, further their identity development as it relates to their chosen career, learn the environments where they work best, and reaffirm their career field choices.

Using Strengths to Obtain Jobs and Experiential Opportunities

Several students reported that they used their strengths in obtaining employment during their first year of study.  For example, one student noted, “My strengths were crucial in getting me the on campus job I wanted. My employer and I had a lengthy conversation about my strengths and how they could be applied in a job setting.” Likewise, another student wrote, “During my first interview, I told my manager about my strengths and how they are related to the job that I was applying to. She was very impressed.” One student effectively leveraged his awareness of strengths in a job interview.  He commented, “The question asked was, ‘if asked, what would your peers and professors say are your top traits or strengths?’ Because I knew what my top five strengths were, I answered the question with little difficulty.” These students interacted positively with employers through having knowledge of their strengths.

One student declared that he did not obtain the employment position to which he had applied.  However, the student’s positive attitude about his new-found strengths vocabulary helped him envision how he could reference his strengths in a future employment interview.  He stated:

It gave me a way to talk about my strengths and skills in a job interview on campus. I didn’t get the job, but they told me I was a strong candidate and actually recommended me to someone else searching for student workers. I think that having the vocabulary to talk about it helped me explain it better than I could on my own, which probably helped me make a good impression.

As in this student’s case, knowing how to verbalize strengths has the potential to open new career opportunities.

Beyond obtaining employment, students also related that they utilized their strengths in volunteer positions.  For example, one student noted she “was able to put some of my strengths on my application for the volunteer position of [Mascot] greeter, and I think the way I talked about how I could use those during my interview really helped get me the volunteer position.” Another student stated, “Strengths helped me during my job and volunteer position interviews.  I was able to discuss the strengths I would bring.”  Both of these students illustrated the benefits that knowing one’s strengths can bring to civic engagement-related positions.

Students frequently expressed learning specific ways in which they could use strengths in future job positions and in their future, long-term careers.  For example, several students noted that they listed their top five strengths on their resumes.  One student wrote, “I can point out strengths when employers ask, ‘What attributes will you bring to the table?’” Another student discussed future application within volunteering or employment: “It made me aware of what I might be better at doing, i.e. in a job or volunteering experience.” A third student projected her knowledge of how companies are utilizing strengths in their workplace as she envisioned being able to apply her strengths in those future contexts: “Knowing my strengths will help when I start applying to jobs because a lot of companies use strengths.”  Comparably, these students demonstrated the utility of using strengths both immediately and throughout their careers.

Discussion and Recommendations

The results of our qualitative data analyses suggest that many first-year college students saw great applicability of strengths awareness in their current employment searches, potential for post-college employment searches, and in serving as a compass to lead them on a career path that takes advantage of their natural talents. Overall, the use of strengths-related programming on this campus helped many students to enhance their self-awareness and career decision skills, in turn positively impacting their career development. The following paragraphs provide several recommendations that career development practitioners can utilize in their implementation of strengths-based approaches with college students.

First, we recommend that practitioners help students to gain an awareness of their strengths by encouraging them to take assessments to discover their strengths (e.g., the Clifton StrengthsFinder). Strengths-related conversations can begin by asking students to describe their strengths in their own words and think of examples of how they have utilized their strengths in the past.  Student affairs practitioners are also encouraged to discover their own strengths by taking the StrengthsFinder to facilitate connections with students and demonstrate how their own strengths are used in their professional practice (Soria & Stubblefield, in press-a, in press-b).

Second, practitioners can help students to strategically use their strengths in a job search. When creating elevator pitches, resumes, and cover letters, students engage in powerful analysis when using their own words to describe in-depth examples of their strengths being utilized. For example, students could simply list their top five strengths on their resume or, to reflect upon their strengths at a deeper level, a career counselor could help students create bulleted action statements that convey their top five strengths without stating the StrengthsFinder themes.

Third, to take the strengths application a step further, career counselors can help students examine a job description and analyze how their strengths could be utilized in that role. As previously mentioned in student examples, students can seek job descriptions reaffirming their career path decisions and jobs with work environments more conducive to their strengths.  It is essential for career counselors to help students understand that their top five strengths do not necessarily equate to one specific job or career.  Instead, many strengths can be used for any job or career.  What matters most is how an individual maximizes his or her strengths to be successful in a specific role.

Last, strengths can be used for interview preparation.  Interview skills can be enhanced by career counselors asking students to create a chart with a list of experiences on the horizontal axis (e.g., work, volunteer, leadership in an organization, etc.) and their top five strengths on the vertical axis. In each box, students can identify examples of times they used their strengths in those experiences. By practicing those statements aloud, students will be more prepared for their interviews.
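The interview-preparation chart described above (experiences on one axis, top five strengths on the other) can be sketched as a simple grid. The experience categories, strength names, and example entry below are illustrative placeholders, not material from the study.

```python
# Build an experiences-by-strengths preparation grid, as described above:
# rows are experience types, columns are a student's top five strengths.
# All labels here are illustrative examples.
experiences = ["work", "volunteer", "organization leadership"]
strengths = ["learner", "achiever", "input", "restorative", "intellection"]

# Each cell starts empty; the student fills in an example of a time they
# used that strength within that experience.
chart = {exp: {s: "" for s in strengths} for exp in experiences}

# Example entry: a time the student used "learner" at a campus job.
chart["work"]["learner"] = "Taught myself the scheduling software in my campus job."

# One filled cell per strength per experience yields 15 practice statements.
print(len(chart) * len(strengths), "cells to fill")
```

Practicing each filled-in cell aloud gives the student a ready bank of concrete, strengths-based interview answers.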

The StrengthsFinder assessment can be very useful for students as they plan their college experience and careers, particularly if they make their own meaning of the words and apply their strengths. The data from the survey show that many students who took the assessment found the results helpful for choosing a career field, applying for jobs, interviewing for jobs, and engaging in other experiential opportunities.  These findings demonstrate the important role strengths can play in a student’s career development. Strengths can serve as a career compass to direct students on their path, and career counselors can facilitate this process by helping students make these connections.


In conclusion, the results of this brief qualitative report suggest that students see the potential for the applicability of strengths in their career development. In particular, students identified that knowing their strengths enhanced their self-awareness, contributed to their career decision-making abilities, and aided them in obtaining employment and experiential opportunities, thereby positively impacting their career development. We recommend that future researchers continue to examine the potential benefits of strengths-related approaches in higher education and that practitioners continue to develop new approaches to help students utilize their strengths in their career development.

Reflection Questions

  1. What do you think are some additional ways in which strengths awareness and strengths-based approaches can facilitate students’ development in higher education?
  2. How can you serve as a strengths-based practitioner in your daily work with undergraduate students?
  3. What are some alternative ways in which undergraduates can utilize strengths in their career development journeys?

About the Authors

Krista Soria is an analyst with the Office of Institutional Research at the University of Minnesota. Her research interests focus on understanding the experiences of underrepresented students on college campuses, developing high-impact practices to support students’ success, and leveraging opportunities to facilitate students’ leadership development. Krista is also an adjunct faculty with the leadership minor at the University of Minnesota, for the English department at Hamline University, for the educational leadership program at St. Mary’s University, and for the higher education administration program at St. Cloud State University.

Brooke Arnold works in the Carlson School of Management Undergraduate Program office at the University of Minnesota-Twin Cities.  As an academic adviser and career coach, she has the opportunity to help students discover, develop, and maximize their strengths during their collegiate experience and beyond.

Katy Hinz works in the Office for Student Engagement at the University of Minnesota-Twin Cities. Prior to that she worked in career services and continues to be passionate about how to help students use their strengths in the career exploration and career planning process. 

Jeremy Williams has worked in multiple career fields over the last decade with the primary goal of helping people.  Currently, he is a second year graduate student in the Leadership in Student Affairs program at the University of St. Thomas in St. Paul, Minnesota. 


The ideas expressed in this article are not necessarily those of the Developments editorial board or those of ACPA members or the ACPA Governing Board, Leadership, or International Office Staff.


References

Asplund, J., Lopez, S. J., Hodges, T., & Harter, J. (2009). The Clifton StrengthsFinder® 2.0 technical report: Development and validation. Lincoln, NE: Gallup.

Clifton, D. O., & Harter, J. K. (2003). Investing in strengths. In K. S. Cameron, J. E. Dutton, & R. E. Quinn (Eds.), Positive organizational scholarship: Foundations of a new discipline (pp. 111-121). San Francisco, CA: Berrett-Koehler Publishers, Inc.

Creswell, J. W. (2007). Qualitative inquiry and research design: Choosing among five approaches (2nd ed.). Thousand Oaks, CA: Sage Publications.

Hodges, T. D., & Harter, J. K. (2005). A review of the theory and research underlying the StrengthsQuest program for students. Educational Horizons, 83(3), 190-201.

Janke, K. K., Sorenson, T. D., & Traynor, A. P. (2010). Defining levels of learning for strengths development programs in pharmacy. Innovations in Pharmacy, 1(2), 1-10.

Lopez, S. J., & Louis, M. C. (2009). The principles of strengths-based education. Journal of College and Character, 10(4), 2-8.

Merriam, S. B. (2009). Qualitative research: Guide to design and implementation. San Francisco, CA: Jossey-Bass.

QSR International. (2012). NVivo qualitative data analysis software (Version 10) [Computer software]. Melbourne, Australia: Author.

Reese, R. J., & Miller, C. D. (2009). Using outcome to improve a career development course: Closing the scientist-practitioner gap. Journal of Career Assessment, 18(2), 207-219.

Soria, K. M., & Stubblefield, R. (in press-a). Building a strengths-based campus to support student retention. Journal of College Student Development.

Soria, K. M., & Stubblefield, R. (in press-b). Knowing me, knowing you: Building strengths awareness and belonging in higher education. Journal of College Student Retention: Research, Theory, and Practice.

Soria, K. M., & Stubblefield, R. (2014). First-year college students’ strengths awareness: Building a foundation for student engagement and academic excellence. Journal of the First-Year Experience and Students in Transition, 26(2), 69-88.

Stebleton, M. J. (2010). Infusing career assessment into a first-year experience course. Career Convergence Magazine. Retrieved from

Stebleton, M. J., Soria, K. M., & Albecker, A. (2012). Integrating strength-based education into a first-year experience curriculum. Journal of College and Character, 13(2), 1-8.

Elevating Native American College Students’ Sense of Belonging in Higher Education

Native American students are an underrepresented minority group in higher education, representing less than 1% of all college-going students in the United States (Ginder & Kelly-Reid, 2013).  Although they represent a small proportion of the college student population in the United States, it is important to research Native American students’ experiences in higher education.  For decades, scholars have documented the persistent challenges encountered by Native American college students, which can include lack of role models, feelings of isolation, racial discrimination, and a cultural mismatch in higher education (Garrod & Larimore, 1997; Larimore & McClellan, 2005).  These barriers are coupled with the challenges of being a non-traditional student, with many studies showing that the majority of Native American students are the first in their families to attend college, are employed while in college, have dependents, and live in poverty (American Indian College Fund Data, 2011).  The confluence of these factors contributes to higher dropout rates among Native American students: only 39% of Native American first-time, full-time students who started college in 2005 graduated within four years, compared to 60% of White students (Knapp, Kelly-Reid, & Ginder, 2012).

There is a significant lack of research about Native American students in higher education.  The majority of studies exploring factors associated with Native American students’ success in higher education feature qualitative designs, have small sample sizes, or are derived from single-institution samples (Jackson, Smith, & Hill, 2003; Larimore & McClellan, 2005; Okagaki, Helling, & Bingham, 2009).  Jackson et al. (2003) discovered family support, structured social support, the warmth of faculty and staff, and reliance upon spiritual resources contributed to Native American undergraduates’ retention.  The purpose of the present research study was to expand upon prior research by examining factors associated with Native American college students’ sense of belonging in higher education.  To expand upon prior research, we utilized a large sample of Native American students within a quantitative, multi-institutional analysis.

This research is unique in that the primary dependent variable in this analysis was students’ sense of belonging, a concept that Hausmann, Schofield, and Woods (2006) connected to students’ retention.  Yet, we approach sense of belonging cautiously when considering the unique experiences of Native American college students.  Native American students experience a great degree of stress in higher education because many feel forced to choose between assimilating into the dominant culture as a means of achieving academic success and maintaining ties to their traditional culture by resisting assimilation (Larimore & McClellan, 2005).  For many Native American students, these choices can mean breaking away from family and home communities or dropping out of higher education.  Some researchers have suggested Native American students who are able to connect with their cultural identity and also adapt to the demands of campus life are more likely to succeed in meeting their educational goals (Huffman, 2001).


The Student Experience in the Research University (SERU) survey is administered annually within a consortium of large, public research universities that are members of the Association of American Universities.  All sets of items used in the present study were derived from the SERU survey or provided by the institutional research offices at participating campuses.  The SERU survey contains over 600 items, and the purpose of the instrument is to gather data on students’ satisfaction, academic engagement, use of time, perceptions of campus climate, research experiences, and civic/community engagement, among other areas (Douglass, Thomson, & Zhao, 2012; Soria & Thomas-Card, 2014).  Researchers have provided evidence for the internal consistency of students’ responses over several administrations of the survey (Chatman, 2011).

In spring 2013, the SERU survey was administered to eligible undergraduate students enrolled at 13 institutions.  Institutional representatives sent emails to 356,699 enrolled undergraduates asking them to respond to the web-based questionnaire.  The overall completion rate for the SERU survey was 35.50% (n = 126,622).  We utilized survey responses from Native American undergraduate students enrolled in 13 large, public research-intensive universities (n = 863).  The majority of Native American students identified as female (60.7%), non-transfer (76.0%), and non-first-generation (59.9%).  The average age of participants was 21.68 years (SD = 4.93).
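The completion rate reported above is straightforward to verify from the figures given in the text:

```python
# Figures taken directly from the text above.
invited = 356_699    # enrolled undergraduates emailed
responded = 126_622  # undergraduates who completed the SERU survey

rate = responded / invited
print(f"{rate:.2%}")  # prints 35.50%, matching the rate reported above
```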


Dependent Variable

Four survey items were utilized to measure students’ sense of belonging.  Two items asked students to indicate their level of satisfaction with the social and academic aspects of their educational experiences; these were scaled 1 (very dissatisfied) to 6 (very satisfied).  Two additional items asked students to rate their sense of belonging on campus and to indicate whether they would choose to reenroll on campus; these were scaled 1 (strongly disagree) to 6 (strongly agree).
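A common way to combine Likert items like these into a single scale score is a simple mean of the items. The article does not specify the authors' exact scoring procedure, so the sketch below is only a generic illustration, with invented ratings:

```python
# Four sense-of-belonging items, each scaled 1 (low) to 6 (high).
# Ratings below are invented for illustration; not study data.
item_scores = {
    "satisfaction_social": 5,
    "satisfaction_academic": 4,
    "sense_of_belonging": 6,
    "would_reenroll": 5,
}

# A simple composite: the mean of the four items. This is an assumption;
# the study does not report how the items were combined.
composite = sum(item_scores.values()) / len(item_scores)
print(composite)  # 5.0
```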

Independent Variables

Several measures used in the analysis were either provided by students on the SERU survey or supplied by institutional research offices at the respective institutions.  Institutions provided students’ sex, transfer status, and academic level (as defined by the number of credits earned).  Students provided information regarding their parents’ highest level of education achieved, from which we derived their status as first-generation students (defined as parents not earning a bachelor’s degree or higher).  Students also answered questions regarding their current residence and social class.  Prior researchers provided evidence for the validity of students’ self-reported social class (Soria & Barratt, 2012).

The SERU was administered at 13 different universities; therefore, to get a sense of whether the location of the institution had any bearing on student outcomes—and to preserve anonymity of participating institutions—we coded institutions into three categories based on their general geographic region in the United States, with the remaining two schools (generally located on the West Coast) serving as the referent group.  The focal categories included four schools located in Southern regions, five schools located in the Midwest region, and two schools located in the upper-Eastern region of the United States.

Variables were used to assess students’ perceptions of campus climate for diversity and socioeconomic class, level of academic engagement, frequency of faculty interactions, and frequency of classmate interactions, which prior research has found to be associated with students’ sense of belonging and retention (Soria & Stebleton, 2012, 2013).  We also utilized items that asked students to indicate the frequency with which they engaged in a variety of activities per week, including paid employment, community service, recreational activities, spiritual or religious activities, socializing with friends, and spending time with family.  These items were scaled from 1 hour to more than 30 hours.

Data Analyses

All data analyses were conducted using SPSS 21.0.  We first conducted a factor analysis for the purpose of data reduction, to explain a larger set of measured variables with a smaller set of latent constructs.  To develop the dependent and independent measures used in this study, a factor analysis was conducted on 27 items with oblique rotation, and Velicer’s (1976) minimum average partial (MAP) method was used to estimate the number of factors (Courtney, 2013).  We followed the procedures outlined by Courtney (2013) to analyze the data using SPSS R-Menu v2.0 (Basto & Pereira, 2012); Velicer’s MAP values reached a distinct minimum at the fifth step, suggesting five factors.  Accordingly, five factors emerged: campus climate, academic engagement, sense of belonging, faculty interactions, and classmate interactions.  We computed the factor scores using the regression method and saved them as standardized scores with a mean of zero and a standard deviation of one.  Each of these factors had good reliability: campus climate (α = .868), academic engagement (α = .891), sense of belonging (α = .857), faculty interactions (α = .804), and classmate interactions (α = .823).
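For readers unfamiliar with the MAP procedure, it can be sketched outside of SPSS.  The following Python snippet is an illustrative implementation on an invented toy correlation matrix (it is not the authors’ SPSS R-Menu workflow, and the item clusters are assumptions for demonstration only): Velicer’s test partials successive principal components out of the correlation matrix and retains the number of components at which the average squared partial correlation reaches its minimum.

```python
import numpy as np

def velicer_map(R):
    """Velicer's (1976) minimum average partial (MAP) test.

    Given a correlation matrix R, returns the number of components at
    which the average squared partial correlation among the variables
    reaches its minimum."""
    p = R.shape[0]
    eigvals, eigvecs = np.linalg.eigh(R)
    order = np.argsort(eigvals)[::-1]              # sort eigenpairs descending
    eigvals, eigvecs = eigvals[order], eigvecs[:, order]
    off_diag = ~np.eye(p, dtype=bool)

    map_values = []
    for m in range(p - 1):
        if m == 0:
            partial = R                            # no components removed yet
        else:
            # Partial out the first m principal components.
            loadings = eigvecs[:, :m] * np.sqrt(eigvals[:m])
            C = R - loadings @ loadings.T          # residual matrix
            d = np.diag(C)
            if d.min() < 1e-10:                    # residual variance exhausted
                break
            partial = C / np.outer(np.sqrt(d), np.sqrt(d))
        map_values.append(np.mean(partial[off_diag] ** 2))
    return int(np.argmin(map_values))

# Toy correlation matrix: three independent clusters of three items,
# with within-cluster correlations of .6, .7, and .8 (all invented).
R = np.zeros((9, 9))
for k, rho in enumerate([0.6, 0.7, 0.8]):
    R[3 * k:3 * k + 3, 3 * k:3 * k + 3] = rho
np.fill_diagonal(R, 1.0)

print(velicer_map(R))  # prints 3: the MAP minimum falls at three factors
```

Because the toy matrix contains three distinct item clusters, the average squared partial correlation declines as each cluster’s component is removed and then rises sharply once meaningful variance is exhausted, which is the behavior the fifth-step minimum reflects in the study’s actual 27-item data.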

After conducting the factor analysis, we conducted hierarchical least squares regression analyses, regressing students’ sense of belonging on the independent and control variables.  The model was guided by predominant theoretical frameworks suggesting students’ demographic characteristics and institutional contexts might covary with collegiate experiences, thereby potentially confounding the effects of those collegiate experiences (Astin, 1993; Pascarella & Terenzini, 2005).  To that end, we entered the data in three blocks to assess the variance the collegiate experience items explained above and beyond the variance accounted for by control measures (Petrocelli, 2003): 1) precollege characteristics; 2) institutional region; and 3) collegiate experiences.
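The blocked entry strategy can likewise be sketched in code.  The snippet below is a hypothetical illustration with simulated data (the block contents, coefficients, and the `r_squared` helper are invented for demonstration, not the study’s actual variables): it fits nested ordinary least squares models and reports the R-squared change as each block enters, mirroring how the analysis isolates the unique variance attributable to collegiate experiences.

```python
import numpy as np

rng = np.random.default_rng(0)
n = 500

# Hypothetical stand-ins for the three blocks of predictors:
# block 1 = precollege characteristics, block 2 = institutional region,
# block 3 = collegiate experiences (e.g., campus climate, engagement).
block1 = rng.normal(size=(n, 2))
block2 = rng.normal(size=(n, 2))
block3 = rng.normal(size=(n, 3))

# Simulate an outcome driven mostly by block 3 (coefficients invented).
y = (0.2 * block1[:, 0] + 0.1 * block2[:, 1]
     + 0.6 * block3[:, 0] + 0.4 * block3[:, 1]
     + rng.normal(size=n))

def r_squared(X, y):
    """R^2 from an OLS fit with an intercept column."""
    X = np.column_stack([np.ones(len(y)), X])
    beta, *_ = np.linalg.lstsq(X, y, rcond=None)
    resid = y - X @ beta
    return 1 - (resid @ resid) / ((y - y.mean()) @ (y - y.mean()))

# Enter blocks cumulatively and track R^2 at each step.
r2 = []
X = np.empty((n, 0))
for block in (block1, block2, block3):
    X = np.column_stack([X, block])
    r2.append(r_squared(X, y))

# R^2 change: variance each block explains beyond the prior blocks.
delta_r2 = [r2[0]] + [b - a for a, b in zip(r2, r2[1:])]
for step, (r, d) in enumerate(zip(r2, delta_r2), start=1):
    print(f"Block {step}: R^2 = {r:.3f}, R^2 change = {d:.3f}")
```

In this simulation, as in the study’s results, the final block produces a large R-squared change because the outcome depends most strongly on the last-entered predictors, which is precisely the evidence the hierarchical design is built to surface.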


The results of the hierarchical linear regression analysis suggest Native American students’ collegiate experiences explained a significant amount of unique variance in students’ sense of belonging above and beyond the variance accounted for by previously entered variables (R = .545, R2 = .297, F(14, 849) = 16.886, p < .001; R2 Change = .251, p < .001).  In other words, students’ collegiate experiences are significantly associated with their sense of belonging and help to predict their sense of belonging above precollege characteristics and institutional region.

Native American students’ perceptions of the campus climate for race and class, in addition to the frequency of their interactions with classmates, were significantly and positively associated with their sense of belonging (Table 1).  The frequency with which students participated in student clubs or organizations, engaged in recreational or creative interests, and socialized with friends was also positively associated with their sense of belonging.  The frequency with which students spent time with family was significantly and negatively associated with their sense of belonging, meaning that Native American students who spent more time with their families were less likely to feel a sense of belonging on campus (β = -.081).  None of the other collegiate variables were significant in this model.  We also found that students attending colleges in the Eastern region of the U.S. had a significantly lower sense of belonging (β = -.084) compared to students who attended colleges in other regions.

Table 1


The results of this study suggest there are elements of Native American students’ experiences on campus that can positively support their sense of belonging, in addition to factors that may detract from students’ sense of belonging.  In particular, we found that students’ engagement with their peers in academic and social contexts was particularly influential in promoting their sense of belonging, a finding congruent with prior scholarship (Larimore & McClellan, 2005).  Prior research suggested the importance of student-faculty interactions and family in Native American students’ belongingness (Jackson & Smith, 2001; Larimore & McClellan, 2005); however, in our study, we only measured the length of time students spent with faculty and family, not the quality of these relationships.  The time students spent with family may be attributed to living off campus with family, a factor that may compromise students’ ability to interact with peers on campus.  Based on these findings, it is recommended that researchers continue to explore the many ways in which students’ interactions with faculty and family can influence their collegiate experiences and identify ways in which these interactions may be crafted to support Native American students’ success.

Concomitant with the results of this study, there are several recommendations for student affairs practitioners to support Native American college students’ sense of belonging in higher education.  Given the connections between campus climate and sense of belonging, practitioners are encouraged to develop a warm and welcoming campus climate for students of color and students from lower social class backgrounds (Soria, 2012).  This study suggests that Native American students’ interactions with classmates in academic settings are positively associated with their sense of belonging, so practitioners should provide adequate study spaces at hours convenient to students’ busy schedules.  Given the positive associations between Native American students’ time spent in student clubs and organizations, socializing with friends, and students’ sense of belonging, it is recommended that practitioners seek to integrate the curricular and co-curricular domains for students; for example, a Native American cultural group could have a space reserved for study time with peers in which hospitality is provided.  Opportunities for Native American students to explore recreational or creative interests alongside their peers may further support students’ integration in the university, while helping them to remain connected to their cultural traditions or develop new connections with them.

Discussion Questions

  1. How can Native American student services on your campus support students’ academic interactions with classmates, recreational or creative interests, and time spent socializing with friends?
  2. What steps has your campus taken to facilitate a welcoming campus climate for Native American students in particular?
  3. What spaces do Native American students occupy on your campus?  How can these spaces be expanded to support Native American students’ sense of belonging and success?


American Indian College Fund. (2011). Facts about American Indian education. Denver, CO: Author. Retrieved from

Astin, A. W. (1993). What matters in college: Four critical years revisited. San Francisco, CA: Jossey-Bass.

Basto, M., & Pereira, J. M. (2012). An SPSS R-Menu for ordinal factor analysis. Journal of Statistical Software, 46(4), 1-29.

Chatman, S. (2011). Factor structure and reliability of the 2011 SERU/UCUES questionnaire core: SERU project technical report. Berkeley, CA: Center for Studies of Higher Education, University of California. Retrieved from…

Courtney, M. G. R. (2013). Determining the number of factors to retain in EFA: Using the SPSS R-menu v2.0 to make more judicious estimates. Practical Assessment, Research, & Evaluation, 18(8), 1-14.

Douglass, J. A., Thomson, G., & Zhao, C-M. (2012). The learning outcomes race: The value of self-reported gains in large research universities. Higher Education, 64(1), 317-355.

Garrod, A., & Larimore, C. (1997). First person, First peoples: Native American college graduates tell their life stories. Ithaca, NY: Cornell University Press.

Ginder, S. A., & Kelly-Reid, J. E. (2013). Postsecondary institutions and cost of attendance in 2012-2013; Degrees and other awards conferred, 2011-12 and 12-month enrollment, 2011-2012. Washington, DC: U.S. Department of Education.

Hausmann, L. R. M., Schofield, J. W., & Woods, R. L. (2007). Sense of belonging as a predictor of intentions to persist among African American and White first-year college students. Research in Higher Education, 48(1), 803-839.

Huffman, T. E. (2001). Resistance theory and the transculturation hypothesis as explanations of college attrition and persistence among culturally traditional American Indian students. Journal of American Indian Education, 40(3), 1-23.

Jackson, A. P., & Smith, S. A. (2001). Postsecondary transitions among Navajo students. Journal of American Indian Education, 40(2), 28-47.

Jackson, A. P., Smith, S. A., & Hill, C. L. (2003). Academic persistence among Native American college students. Journal of College Student Development, 44(4), 548-565.

Knapp, L. G., Kelly-Reid, J. E., & Ginder, S. A. (2012). Enrollment in postsecondary institutions, fall 2011; Financial statistics, fiscal year 2011; and graduation rates, selected cohorts, 2003-2008. Washington, DC: U.S. Department of Education.

Larimore, J. A., & McClellan, G. S. (2005). Native American student retention in U.S. postsecondary education. New Directions for Student Services (no. 109), 17-32.

Okagaki, L., Helling, M. K., & Bingham, G. E. (2009). American Indian college students’ ethnic identity and beliefs about education. Journal of College Student Development, 50(2), 157-176.

Pascarella, E. T., & Terenzini, P. T. (2005). How college affects students: A third decade of research (Vol. 2). San Francisco, CA: Jossey-Bass.

Petrocelli, J. V. (2003). Hierarchical multiple regression in counseling research: Common problems and possible remedies. Measurement and Evaluation in Counseling and Development, 36(1), 9-22.

Soria, K. M. (2012). Creating a successful transition for working-class first-year students. The Journal of College Orientation and Transition, 20(1), 44-55.

Soria, K. M., & Barratt, W. (2012, June). Examining class in the classroom: Utilizing social class data in institutional and academic research. Association for Institutional Research Forum, New Orleans, LA.

Soria, K. M., & Stebleton, M. J. (2012). First-generation students’ academic engagement and retention. Teaching in Higher Education, 17(6), 1-13.

Soria, K. M., & Stebleton, M. J. (2013). Social capital, academic engagement, and sense of belonging among working-class college students. College Student Affairs Journal, 31(2), 139-153.

Soria, K. M., & Thomas-Card, T. (2014). Relationships between motivations for community service participation and desire to continue service following college. Michigan Journal of Community Service Learning, 20(2), 53-64.

Tovar, E., Simon, M. A., & Lee, H. B. (2009). Development and validation of the college mattering inventory with diverse urban college students. Measurement & Evaluation in Counseling & Development, 42(1), 154-178.

Velicer, W. F. (1976). Determining the number of components from the matrix of partial correlations. Psychometrika, 41(1), 321-327.

About the Authors

Krista Soria is an analyst with the Office of Institutional Research at the University of Minnesota.  Her research interests focus on understanding the experiences of underrepresented students on college campuses, developing high-impact practices to support students’ success, and leveraging opportunities to facilitate students’ leadership development.  Krista is also an adjunct faculty with the leadership minor at the University of Minnesota.

Please e-mail inquiries to Krista Soria.

Brandon Alkire is an undergraduate student at the University of Minnesota.  He is majoring in Sociology and Law/Crime/Deviance and minoring in Political Science.  He is a Dakota citizen of the Standing Rock Sioux Tribe, which straddles the North/South Dakota border.  He is avidly involved in many activities at the University of Minnesota, including serving as a general board member of the American Indian Student Cultural Center and participating in the Native Student Awareness Committee, Student Parent Help Center, Circle of Indigenous Nations, and American Indian Studies Work Shop.

Please e-mail inquiries to Brandon Alkire.


The ideas expressed in this article are not necessarily those of the Developments editorial board or those of ACPA members or the ACPA Governing Board, Leadership, or International Office Staff.

Establishing an Inclusive Environment for Students with Autism


Students entering college today have diverse abilities and learning styles.  By implementing universal design within higher education settings, professionals can enhance educational opportunities for all students.  In this paper, we show how to implement Higbee’s (2008) universal design principles in student development programs in order to support college students who have autism.

Students with Autism in Higher Education

Autism is the fastest growing developmental disorder in the nation.  In 2012, the United States Centers for Disease Control and Prevention estimated that one in every 88 individuals had autism or a related disorder.  An increasing number of students with autism are also entering higher education (Adreon & Durocher, 2007).  Government legislation has supported access to higher education for students with autism, as well as for students with other disabilities.  For example, the 1973 Rehabilitation Act (Section 504) prohibited discrimination against individuals with disabilities in programs and activities that received federal assistance (Evans, 2008).

Section 504 mandated that colleges receiving federal funding provide equal access for individuals with disabilities (Hall & Belch, 2000).  Higher education institutions have removed some barriers to education for students with disabilities.  College admission processes can no longer inquire whether an individual has a disability.  Section 504 also mandated that institutions address architectural barriers that restrict mobility for individuals with disabilities.  Accordingly, many colleges removed physical barriers that impede mobility on campuses, such as by adding ramps and automatically opening doors.  Though colleges have removed some of the challenges involved in gaining admission to college and navigating campuses, students with disabilities are still less likely to engage in the college experience and earn a degree (Hall & Belch, 2000).  Attention is needed to how institutions can create inclusive learning environments for all students, including students with disabilities.

Understanding Neurodiversity and Autism


People with autism have differences in cognitive processes; this neurological variation is often referred to as neurodiversity (Blume, 1998).  In the 1990s, people with autism developed the term neurodiversity to assert that those with atypical brain wiring deserve respect.  Advocates stressed that anyone can be placed on a variety of spectrums (Pollak, 2009).  The neurodiversity framework emphasizes learning differences, rather than difficulties.  It is intended as a positive statement of differentiation: though individuals have differences, they are not dysfunctional (Grant, 2009).  People with differences do not need to be cured; rather, they need help and accommodations (Robison, 2013).

People with neurodiversity conditions may experience challenges completing everyday tasks that depend on neurocognitive processing of information, such as social interaction, sustaining attention, and time management (Grant, 2009).  In addition to autism, neurodiversity conditions include Attention Deficit Hyperactivity Disorder (ADHD), dyslexia, and Tourette syndrome (National Symposium on Neurodiversity at Syracuse University, n.d.).


Autism is a neurodevelopmental disorder that affects growth in areas of social interaction and behavior (Adreon & Durocher, 2007).  Autism Speaks (2014) defines autism spectrum disorder (ASD) and autism as, “characterized, in varying degrees, by difficulties in social interaction, verbal and nonverbal communication and repetitive behaviors… ASD can be associated with intellectual disability, difficulties in motor coordination and attention and physical health issues” (para. 1-3).

What Challenges Do College Students with Autism Typically Encounter?

Social Challenges

Students with autism may have difficulty forming relationships due to misinterpretations of social cues or conventions (Adreon & Durocher, 2007).  They may interpret information in an overly literal way, causing them to misunderstand others’ attempts at humor (Adreon & Durocher, 2007).  Consequently, they may become isolated or exploited because of their perceived naiveté (Welkowitz & Baker, 2005).  Students with autism may experience difficulty establishing trusting relationships in a new environment, such as the college campus.

Adaptation Challenges

One of the challenges that students with autism encounter when entering college is that they transition from a centralized support system into an environment where they must advocate for themselves (Higbee & Kalivoda, 2008).  Their centralized support system includes their families, who understand and embrace their differences.  Students with autism may struggle to advocate for themselves and to clearly communicate their challenges.  Acclimating to college life is a process that often involves navigating a range of college offices and personnel.

Learning Differences

Students with autism may have differences in how they learn.  When information is presented too quickly, they may not fully grasp all of it, which can leave them feeling overwhelmed and anxious.  In addition, individuals with autism may use unusual mannerisms, such as rocking, as a means of self-soothing.

How Universal Design Can Support Students with Autism

The theory of universal design is inclusive of all populations in all environments.  According to the Center for Universal Design (1997), universal design promotes products and environments that are usable by all people, without the need for adaptation.  Universal design stemmed from accessible design, which focused on design usable by individuals with disabilities (Universal Design, n.d.).  Universal design aims to serve the widest range of people to the greatest extent possible.  It assumes that humans have diverse abilities, making spaces and products easier to use for everyone.

It is no longer the sole responsibility of disability services to create inclusive environments on college campuses.  All professionals must foster a community where everyone has an equal opportunity to learn.  Using universal design throughout all parts of college campuses, as well as during instruction, enables higher education professionals to support students with diverse abilities.  The universal design framework is influential in helping professionals create environments where all students can thrive.

Supporting and embracing diversity has long been foundational to the student affairs profession (Nuss, 1996).  Just as professionals have led in promoting diversity of religion, race, and sexuality in higher education, it is also vital that student development professionals promote acceptance of neurodiversity.  By implementing universal design principles, student affairs professionals can nurture students’ intellectual and social development.

Universal Design Principles for Student Development Programs and Services

It is vital for student development professionals to implement universal design principles into their daily practices in order to support the success of students with autism.  Higbee (2008) presented nine principles for universal instruction design in student development programs:

  • Create welcoming spaces;
  • Develop, implement, and evaluate pathways for communication among students, staff, and faculty;
  • Promote interaction among students and between staff and students;
  • Ensure that each student and staff member has an equal opportunity to learn and grow;
  • Communicate clear expectations to students, supervisees, and other professional colleagues utilizing multiple formats and taking into consideration diverse learning communication styles;
  • Use methods and strategies that consider diverse learning styles, abilities, ways of knowing and previous experience and background knowledge, while recognizing each student’s and staff member’s unique identity and contributions;
  • Provide natural supports for learning and working to enhance opportunities for all students and staff;
  • Ensure confidentiality; and
  • Define service quality, establish benchmarks for best practices, and collaborate to evaluate services regularly (pp. 196-200).

Here we share examples of how professionals can incorporate universal design into campus programs and services to better support students with autism.

Create Welcoming Spaces

Higbee’s (2008) first principle is to create welcoming spaces (p. 196).  Students with autism may experience difficulty understanding others’ perspectives, and this challenge can lead to feelings of isolation.  When student development professionals create warm atmospheres in their offices and student meeting places, they help students feel valued.  Welcoming environments include staff and student workers greeting guests, offering genuine support, and fostering a sense of community.

Professionals must also use inclusive language that is welcoming to all.  They can train student workers and student leaders to use supportive, person-first language.  Person-first language shows that workers appreciate diversity and honor individual identity.  For example, professionals should use the term “students with autism” instead of “autistic students.”  According to Hall and Belch (2000), person-first language emphasizes the person over the disability.

Support Pathways for Communication

The next principle is to develop, implement, and evaluate pathways for communication among students, staff, and faculty (Higbee, 2008, p. 196).  Student development professionals must be cognizant of their communication practices and share directions in a clear and straightforward manner.  Sometimes, students with autism struggle to follow directions with multiple steps (Adreon & Durocher, 2007).  When introducing activities with several steps, such as during icebreakers, campus activities professionals should clearly state the rules and repeat them.  At large-scale events, such as orientations, professionals should also provide information in multiple methods, such as oral and written forms of communication.  In addition, professionals can communicate both in large group and small group formats.  Providing communication in multiple methods supports diverse learning styles and enhances educational experiences for all individuals involved.

Promote Interaction

Higbee’s (2008) third principle is to promote interaction among students and between staff and students (p. 197).  Student development professionals can serve as point persons for students with autism.  For some students with autism, it can be helpful to identify a point person to visit when they feel anxious (Myles & Adreon, 2001).  This person can be key in assisting the student in problem solving (Jekel & Loo, 2002).  For example, at Keene State College, a program exists where peer mentors are trained to offer support to students with autism (Welkowitz & Baker, 2005).

Offer Equal Opportunities for Learning and Growth

The next universal design principle for student development professionals is to ensure that each student and staff member has an equal opportunity to learn and grow (Higbee, 2008, p. 197).  Student affairs departments must develop services that improve opportunities for all students, but specifically reflect on the accessibility of resources to marginalized groups.  Student activities offices can offer leadership retreats that consider the diverse needs and abilities of all student attendees.  They can develop activities that are supportive of an array of unique learners.

Communicate Clear Expectations and Consider Diverse Communication Styles

Higbee’s (2008) fifth principle is to communicate clear expectations to students, supervisees, and other professional colleagues utilizing multiple formats and taking into consideration diverse learning communication styles (p. 198).  Students with autism tend to desire predictability and clear expectations; however, at times, this inclination may result in inflexible behavior (Adreon & Durocher, 2007).

Students with autism may become anxious when others do not adhere to rules, such as when quiet hours are violated in a residence hall.  It is important that student development professionals clearly explain living options so that students may make the optimal choice.  If students decide to live with roommates, they must make efforts to understand the ins and outs of sharing communal spaces.  If conflicts occur, professionals should help students negotiate them, while maintaining appropriate boundaries and preventing dependency.

Consider Diverse Backgrounds and Recognize Students’ Strengths

The next principle is to use methods and strategies that consider diverse learning styles, abilities, ways of knowing and previous experience and background knowledge, while recognizing each student’s and staff member’s unique identity and contributions (Higbee, 2008, p. 198).  Professionals must take into consideration each student’s multiple intelligences.  Students with autism have various strengths, including their tendency to be reliable, as well as their tendency to pay great attention to detail (Adreon & Durocher, 2007).  Professionals can assist students by guiding them in further developing these strengths.  For example, professionals can help the student determine how their interests align with organizations, learning communities, or employment opportunities.

Provide Natural Learning Supports

Another universal design principle is to provide natural supports for learning and working to enhance opportunities for all students and staff (Higbee, 2008, p. 198).  Students with autism often have difficulty with organizational skills and managing academic demands.  Student development professionals can aid students in managing these challenges by providing natural learning supports; for example, written supports include meeting minutes and handouts.  Professionals can also scaffold concepts during instruction.  Most importantly, professionals must reinforce that mistakes are opportunities for learning.

Ensure Confidentiality

A very important principle of universal design within student development programs is to ensure confidentiality (Higbee, 2008, p. 199).  Students with autism have a right to confidentiality.  However, when services are not universally designed, confidentiality can be breached, because such environments may distinguish the student as different (Higbee, 2008).  Professionals must honor students’ trust by allowing each student to decide whether and how to disclose.  Professionals must recognize that students with autism may encounter negative attitudes from others concerning their abilities (Kroeger & Schuck, 1993), which may make students reluctant to disclose their disability to staff and peers.

Identify Service Quality and Evaluate Services

Higbee’s (2008) final principle is to define service quality, establish benchmarks for best practices, and collaborate to evaluate services regularly (p. 200).  It is essential that student development professionals seek out ongoing professional development on how to be a resource for students with autism.  By providing training, supervisors can hold staff accountable for promoting an inclusive environment.  If properly implemented, these trainings will result in a culture that values differences.  Furthermore, training should not be restricted to employees, but should be provided to students as well.  For example, offices can educate student leaders, such as club presidents, on how to incorporate universal design into their activities.  Not only is training essential, but evaluation is also important.  Evaluation allows professionals to learn how they can improve and better serve all students.


Fostering the feeling of community continues to be a challenge as colleges diversify (Hall & Belch, 2000).  Through implementing universal design principles, student affairs professionals can create a sense of community for all.  Use of universal design principles can enable colleges and universities to create inclusive environments that are able to appropriately support students with autism.

Discussion Questions

  1. How do student affairs professionals at your institution promote acceptance of neurodiversity, and specifically of students with autism?
  2. How can your campus better incorporate universal design throughout the various functional areas of student affairs (campus involvement, residence life, orientation, etc.)?
  3. How can you help your students learn the importance of creating an inclusive environment and acceptance of neurodiversity?


Adreon, D., & Durocher, J. S. (2007). Evaluating the college transition needs of individuals with high-functioning autism spectrum disorders. Intervention in School & Clinic, 42(5), 271-279.

Autism Speaks. (2014). What is autism? Retrieved from

Blume, H. (1998, Sept. 30). Neurodiversity. The Atlantic. Retrieved from

Center for Universal Design. (1997). What is Universal Design? Retrieved from…

Evans, N. (2008). Theoretical foundations of universal instructional design. In J. L. Higbee & E. Goff (Eds.), Pedagogy and student services for institutional transformation: Implementing universal design in higher education (pp. 11-24). Minneapolis, MN: Regents of the University of Minnesota.

Grant, D. (2009). The psychological assessment of neurodiversity. In D. Pollak (Ed.), Neurodiversity in higher education: Positive responses to specific learning differences (pp. 33-61). West Sussex, UK: Wiley & Sons.

Hall, L. M., & Belch, H. A. (2000). Setting the context: Reconsidering the principles of full participation and meaningful access for students with disabilities. New Directions for Student Services, 91, 5-17.

Higbee, J. L. (2008). Universal design principles for student development programs and services. In J. L. Higbee & E. Goff (Eds.), Pedagogy and student services for institutional transformation: Implementing universal design in higher education (pp. 195-203). Minneapolis, MN: University of Minnesota, Center for Research on Developmental Education and Urban Literacy.

Higbee, J. L., & Kalivoda, K. S. (2008). The first-year experience. In J. L. Higbee & E. Goff (Eds.), Pedagogy and student services for institutional transformation: Implementing universal design in higher education (pp. 245-253). Minneapolis, MN: University of Minnesota, Center for Research on Developmental Education and Urban Literacy.

Jekel, D., & Loo, S. (2002). So you want to go to college: Recommendations, helpful tips, and suggestions for success at college. Watertown, MA: Asperger’s Association of New England.

Kroeger, S., & Schuck, J. (1993). Moving ahead: Issues, recommendations, and conclusions. New Directions for Student Services, 64, 103-110.

Myles, B. S., & Adreon, D. (2001). Asperger syndrome and adolescence: Practical solutions for school success. Shawnee Mission, KS: Autism Asperger Publishing.

National Symposium on Neurodiversity at Syracuse University. (n.d.). What is neurodiversity? Retrieved from

Nuss, E. (1996). The development of student affairs. In S. R. Komives & D. B. Woodard, (Eds.), Student services: A handbook for the profession (pp. 22-42). West Sussex, UK: Wiley & Sons.

Pollak, D. (2009). Introduction. In D. Pollak (Ed.), Neurodiversity in higher education: Positive responses to specific learning differences (pp. 9-11). West Sussex, UK: Wiley & Sons.

Robison, J. (2013). My life with Asperger’s: How to live a high functioning life with Asperger’s. Psychology Today. Retrieved from…

Universal Design: The Resource for Universal Design News. (n.d.). What is universal design? Retrieved from

Centers for Disease Control and Prevention. (2012). Prevalence of autism spectrum disorders — Autism and Developmental Disabilities Monitoring Network, 14 sites, United States, 2008. Morbidity and Mortality Weekly Report Surveillance Summaries, 61(3), 1-19. Atlanta, GA: Author.

Welkowitz, L., & Baker, L. (2005). Supporting college students with Asperger Syndrome. In J. L. Baker, & L. A. Welkowitz (Eds.), Asperger Syndrome: Intervening in schools, clinics, and communities. Mahwah, NJ: Lawrence Erlbaum Associates.

About the Authors

Dale O’Neill, M.A., serves as the Coordinator of Leadership and Community Service Programs and the Interim Greek Life Advisor at the University of New Orleans.  She is currently pursuing a doctorate in Education Administration from the University of New Orleans (LA).  She is an active member in ACPA – College Student Educators International, having served as the Newsletter Editor for two years for the Standing Committee for Graduate Students & New Professionals as well as the Convention Program Chair and Newsletter Chair for the Standing Committee on Disability. 

Rory O’Neill Schmitt, Ph.D., is an educational researcher and has earned her doctorate in Curriculum and Instruction Studies.  Currently, she serves as a Faculty Associate in the University College of Arizona State University in Tempe, AZ.  She is a peer reviewer for the Current Issues in Education journal.  In addition, she volunteers on the board of the Arizona Art Therapy Association as its president.

Please email inquiries to Dale O’Neill.


The ideas expressed in this article are not necessarily those of the Developments editorial board or those of ACPA members or the ACPA Governing Board, Leadership, or International Office Staff.

Inclusion in Association Data Collection

ACPA Demographic Standard Question Committee

ACPA – College Student Educators International is the comprehensive higher education and student affairs/services Association which lives out its long-held Core Values to support college student success. Two of these Core Values speak directly to our commitment to issues of social justice, equity, and inclusion:

  • Diversity, multicultural competence and human dignity; and
  • Inclusiveness in and access to Association-wide involvement and decision-making.

It is the Association’s attentiveness to issues of inclusion, opportunities to increase understanding and competence, and recognition that our own identities matter in the work that we do that bring thousands of members back to ACPA annually.  As an Association, ACPA strives to provide its members with meaningful and intentional professional development programs; knowledge grounded in best practices and research; and a nurturing environment for networking and learning opportunities.  In the 2012 Membership Survey, 99.5% of responding members affirmed that “ACPA fulfills/supports/lives the Association values.”

In various formats (typically membership forms and assessments), ACPA asks members to self-describe a number of different professional, social and/or personal identity demographics. The Association uses this data to monitor trends over time through multiple instruments, analyze responses, opinions and satisfaction by identity area with data from a single instrument, and distribute targeted information about ACPA programs, events and services. Members can choose to provide information on their personal/social identities for ACPA to use for educational planning and event promotional purposes.

The quest for inclusiveness is a journey rather than a destination, as nomenclature, definitions, and attributions change with time and context.  In recent years, ACPA received feedback via the 2012 Membership Survey, the ACPA 2013 Convention Evaluation, and the ACPA Equity & Inclusion Advisory Board that some members have experienced these surveys or forms as marginalizing.  Shortly after the ACPA 2013 Convention in Las Vegas, ACPA leaders from the Multiracial Network (MRN) and the Standing Committee for Multicultural Affairs (CMA) wrote “An Open Letter to the ACPA Community” to highlight the importance of question and response option wording and to offer an educational moment for ACPA members who may encounter demographic questions in their work.  Active participation such as that demonstrated by the Multiracial Network and the Standing Committee for Multicultural Affairs brings about positive, forward change in ACPA, and we are grateful for their advocacy and involvement in the pursuit of new expectations.

For the past 18 months, a small group of ACPA Governing Board and International Office staff members have consulted widely across the Association to develop new standards for demographic questions in surveys and on membership or event registration forms. In our work, we consulted with current and former directorate leaders/members of Standing Committees and Commissions, educational researchers, assessment experts, Association professionals, convention volunteers, and social justice advocates to determine the most appropriate ways to develop questions that were sound psychometrically, yet did not create unintentional micro-aggressions for members. The end result was the creation of a proposal, approved by the ACPA Governing Board in December 2013, which documents standards for collecting and analyzing personal and/or social demographic information from members and/or event participants. The approved proposal also delineates questions most appropriate to ask via the individual membership form and professional development event registration form from those most appropriate to ask via supplemental assessment instruments to eliminate these concerns about exposure and privacy.

Going forward, it is the expectation that ACPA leaders and members who distribute surveys or create event registration forms sponsored by the Association will follow these new standards explicitly. Several of the required demographic questions are intentionally listed as free response options to allow members to self-describe these aspects of their identities so that they may inform future iterations of these questions and response options. A coding guide is available to ACPA leaders as a means of pre-determining coding expectations and to increase the ease of coding for future volunteers. We recognize, however, that demographic questions should be reviewed and updated annually by the Equity & Inclusion Advisory Board and the Governing Board Director of Membership Development with respect to evolving terminology, language and definitions.

In cases where an exception to the standards is requested, the ACPA International Office will review the requested changes and consult with the appropriate Governing Board member(s) to ensure that the requested exception does not contain unintended micro-aggressions.  Exception requests might include, but are not limited to: research questions that use different language or terminology, data analyses that do not rely on all questions in the standards, or campus Institutional Review Board (IRB) approval.  IRB approval supersedes ACPA policies regarding demographic questions and response options in research cases, but according to the current ACPA Research Request Policy, “Research requests must fit with the mission and purpose of ACPA, be culturally appropriate, and comply with ACPA’s statement on non-discrimination and ethical principles.”

A copy of the new ACPA Demographic Questions Standards is now available for your review and consideration for use on your campus or in your work.

As previously stated, these standards will be reviewed annually by the Equity & Inclusion Advisory Board and the Governing Board Director of Membership Development to ensure alignment with ACPA’s values, member feedback and ever-evolving nomenclature and definitions. During the drafting of this proposal the authors experienced several challenges/questions that future ACPA leaders should monitor and further evaluate. We believe those issues are worth noting as they more fully describe our journey, and not just this initial set of standards:

  • ACPA’s aspirations are global, yet its membership is overwhelmingly from the United States.  Many of the demographic questions proposed contain cultural references or terminology rooted in United States culture and history. As ACPA’s membership continues to grow outside of the United States, the individual membership form, event registration form and assessment demographic questions may need to be revised to reflect a more global set of questions and response alternatives.
  • ACPA does not currently have the capability to gather information about the relationship between a member’s country of origin, citizenship and/or country of residency. This information would be valuable in more fully understanding the global nature of student affairs/services work and the Association’s reach. Asking for this information via the membership or event registration form places members in the position of having to disclose their immigration status, which is unrelated to the goal of the information gathering. Should ACPA consider adopting demographic questions regarding citizenship in the future, it may be important to explore any possible legal implications for collecting and storing this data as well as to consider that immigration status is fluid and complex.
  • There was great discussion about whether asking members about languages spoken would build expectations that programs and services be offered in multiple languages to best meet member needs. Although not currently a component of ACPA’s principles, language accommodations are commonly cited as a component of Universal Design. Questions about the primary languages used by members have been added to the current standards, and they may be considered in the future as additions to the individual membership form and/or event registration form if ACPA expands its Universal Design principles to include languages used.

We would like to, once again, state our gratitude to the Multiracial Network and the Standing Committee for Multicultural Affairs for their advocacy and involvement in this significant ACPA advancement. We are also grateful to the many ACPA leaders, members and educational partners who supported this important work and we anticipate that current and future ACPA members will continue to experience the Association’s Core Values lived out in all arenas, including membership forms and experience or satisfaction surveys. If you feel that they do not, we want to hear from you to continue making positive and necessary edits to these standards. While we have reached this initial destination in the form of established standards, we continue on the journey of inclusion, individually and as an Association. That is what makes ACPA and its members so special.

ACPA Demographic Question Standard Committee

Chris Moody (American University-DC), ACPA Past-Director of Membership Development

Kathy Obear (Alliance for Change Consulting), ACPA Director of Equity and Inclusion

Heather Gasser (Michigan State University), ACPA Director of Membership Development

Tricia Fechter, ACPA-College Student Educators International

Stanton Cheah (University of Maryland at College Park)

Special acknowledgements:

ACPA Multiracial Network (MRN)

Standing Committee for Multicultural Affairs (CMA)

ACPA Equity & Inclusion Advisory Board

Commission for Spirituality, Faith, Meaning and Religion

Kathleen G. Kerr, University of Delaware

John Dugan, Loyola University Chicago

Jennifer Keup, The National Resource Center for The First-Year Experience and Students in Transition

Employee Evaluation Using Professional Competencies

Vicki L. Wise
Portland State University
Lisa J. Hatfield
Portland State University

At many universities, the office of human resources offers generic employee evaluation forms for various classifications of non-academic employees that broadly measure their performance. Student Affairs employees are often evaluated against a set of standards that do not directly relate to their work. Our institution’s Student Affairs professionals have shared their difficulty in using our university’s generic evaluation tool in a meaningful way: the measured areas do not align with professional standards in Student Affairs, and the scale is poorly defined and difficult to understand. Finally, many employees do not consider the scale easily applicable to goal setting and professional development.

To remedy these difficulties, the Director of Student Affairs Assessment and the Director of the Learning Center developed a supplemental employee self-evaluation tool aligned with the ACPA – College Student Educators International and NASPA – Student Affairs Administrators in Higher Education Professional Competency Areas for Student Affairs Practitioners (2010). In addition, our division of Student Affairs has used the Council for the Advancement of Standards (CAS) in Higher Education (2012) to inform strategic planning, program development, and assessment. The new evaluation tool also aligns to the CAS Standards. It was our desire to create an evaluation tool that would inform staff regarding areas of strength and areas where they might need further training and professional development.

We developed this scale following the recommendations of DeVellis (1991; 2012), who describes an eight-step process for producing scales that accurately and reliably measure the constructs of interest:

  • Defining the construct(s) of interest to measure.
  • Creating a set of draft questions that will become the item pool.
  • Determining the format for both the items and the response scale.
  • Seeking expert opinion for item and response scale review.
  • Adding items to reduce socially desirable responding.
  • Pilot-testing items with a sample of the target population.
  • Analyzing the results of the pilot test to determine item and scale quality.
  • Determining which items to keep for the final scale.


Step 1: Decide What to Measure

In beginning to develop our instrument, we first needed to identify the construct(s) and aspects of employee performance to measure. We accomplished this by reviewing the Professional Competency Areas for Student Affairs Practitioners (ACPA & NASPA, 2010) and the Human Resource guidelines specified on the employee evaluation tool, and we used these collectively to develop our scale. ACPA and NASPA address 10 competency areas, each with outcomes at the basic, intermediate, and advanced skill levels: (1) advising and helping; (2) assessment, evaluation, and research; (3) equity, diversity, and inclusion; (4) ethical professional practice; (5) history, philosophy, and values; (6) human and organizational resources; (7) law, policy, and governance; (8) leadership; (9) personal foundations; and (10) student learning and development. In all, 335 outcomes are addressed across these competency areas.

To begin distilling the 335 outcomes into a set of items that would represent the work of Student Affairs staff, we created an Excel workbook with 10 sheets, one for each competency area and its corresponding outcomes. Three rating columns allowed the two researchers and a graduate student to independently code each of the 335 outcomes with a keyword that reflected the essence of that skill. The two researchers then met and reviewed all three ratings. For each item, we asked whether the keyword(s) reflected what we thought the item meant. Where we disagreed in our interpretation of an item, we discussed it until we reached agreement on meaning. In this process, we created a fourth column for the agreed-upon theme.

We then combined the 335 outcomes with the agreed upon themes into one spreadsheet. We sorted themes alphabetically and combined themes that were alike. The seven themes generated from this review were communication, cultural competence, core foundations, leadership, law and ethics, management, and professionalism.
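The sort-and-combine step can be sketched with a small pandas example. The column names and outcome text below are illustrative stand-ins, not the authors' actual spreadsheet:

```python
import pandas as pd

# Illustrative stand-in for the combined spreadsheet of outcomes and
# agreed-upon themes (column names and rows are hypothetical).
outcomes = pd.DataFrame({
    "outcome": ["Facilitate reflection", "Write clear reports", "Chair a committee"],
    "theme":   ["communication", "communication", "leadership"],
})

# Sort themes alphabetically and gather like outcomes under each theme,
# mirroring the alphabetical sort and combining described above.
by_theme = outcomes.sort_values("theme").groupby("theme")["outcome"].apply(list)
print(by_theme.to_dict())
# → {'communication': ['Facilitate reflection', 'Write clear reports'], 'leadership': ['Chair a committee']}
```

In practice the same grouping can be done directly in Excel with a sort and a pivot table; the point is simply that identical theme labels collapse the 335 rows into a handful of reviewable clusters.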

Step 2: Generate Item Pool

We still had 335 outcomes related to our seven themes and knew we had to reduce them to a manageable list. We reviewed all 335 items to determine which best reflected each of the seven themes. For the first draft of the scale, we extracted 34 outcomes (items) from the competency areas aligned with our seven themes. We then revisited the seven themes and their corresponding items and determined that the law and ethics theme should be removed, with one of its items moved to the cultural competence theme: “act in accordance with federal and state/province laws and institutional policies regarding nondiscrimination.” Finally, we decided on a measurement scale that reflected levels of performance that would be informative to employees’ self-evaluation.

Step 3: Determine Format for Measurement

Our initial scale was Proficient and Not Proficient, as we reasoned that an employee either met a standard or did not. However, based on feedback from department leaders in Student Affairs, we concluded that Not Proficient did not support a developmental model. Therefore, we used the following response scale: Proficient (exemplifies practices most or all of the time); Developing (exemplifies practices on occasion and has room for growth); and Not Applicable (does not apply to job description and expectations). We then created a draft rubric and prepared for reviews.

Step 4: Have Item Pool Reviewed by Experts

We solicited feedback from our university’s Student Affairs Assessment Council (SAAC). The SAAC comprises representatives from departments across Student Affairs who are responsible for conducting assessment in their areas. The goal of the Council is to create a systemic and systematic culture of assessment in which we use data, in all its forms, to inform our educational practices and to ensure student success. Based on the Council’s feedback, several items were rewritten because they were double-barreled, confusing, or contained errors, and three items were eliminated. For example, under the Management theme, we reworded the item “Model the principles of the profession and communicate the expectation of the same from colleagues and supervisees” to “Model the standards of your professional organization (e.g., NASPA, NACADA, etc.).” Our plan was then to submit the rubric to all Student Affairs staff for more feedback. The Council recommended removing the proficiency scale for this first review and instead having staff rate each item as “applies to my work” or “does not apply to my work,” reasoning that the first review should establish only whether the items were relevant to staff members’ work and that including the proficiency scale would complicate judgments of relevance.

Step 5: Consider Validation Items

DeVellis (1991; 2012) recommends including validation items to reduce response bias, which occurs when individuals are motivated to present themselves in the most positive light, known as social desirability. The higher the stakes for the employee, the more likely responses are to be biased. Because this self-evaluation tool is not linked to promotion or pay, staff have little incentive to respond in a socially desirable way, so we did not include validation items.

Step 6: Administer Items to a Developmental Sample

We pilot-tested the items with our target population of Student Affairs personnel, including their supervisors. Pre-testing allowed us to learn whether the items were applicable to the work in which unclassified staff engage. We administered the items online using the survey tool Qualtrics. Respondents were asked to review each item in light of their current position and note whether the item was applicable to their work and whether it was understandable. If they reported that an item was not understandable, a follow-up question asked: Please tell us why the item is not understandable. All 153 staff employed in Student Affairs were given the opportunity to evaluate the items, and 53 (35%) responded. The feedback was overwhelmingly positive, with 98% of respondents reporting that the items were understandable.

We expected that, regardless of their area within Student Affairs, almost all staff would report that the competencies were applicable to their positions. That was generally the case, although there were a few exceptions (results shown in Table 1). In terms of Communication, all but one item applied across the division: “Assist students in ethical decision-making, and may include making referrals to more experienced professionals when appropriate” was applicable less often, which makes sense given that not all positions engage students on a regular basis. In terms of Cultural Competence, we expected that 100% of positions would apply these practices. While the ratings were quite high, we realized that the item “Ensure assessment practices are culturally inclusive” was not well understood; the Director for Assessment and the Director for Diversity and Multicultural Services will address cultural inclusivity in future employee trainings. In the area of Core Foundations, ratings were quite high. In terms of Leadership, ratings were also high, although two items were rated somewhat lower: “Give appropriate feedback to individuals pertaining to professional growth and development” and “Create or participate in meaningful mentoring.” Many of the Student Affairs staff at our institution are not in roles that require giving feedback to other staff, so this item has limited applicability; it remains important, however, for those who do have that responsibility. Because fewer staff view themselves as responsible for mentoring, this finding points to an opportunity for professional development. Ratings were lowest in the area of Management, unsurprisingly, given the range of unclassified position responsibilities; again, these items remain important for those who do carry such responsibilities. Finally, ratings were high in the area of Professionalism.

The second pilot test included our proficiency scale: Proficient– exemplifies practices most or all of the time; Developing–exemplifies practices on occasion and has room for growth; Not Applicable–does not apply to job description and expectations. The items and scale were administered online. Twenty-five staff members reviewed the scale and determined that it was understandable.

Step 7: Evaluation of Items

In this step, DeVellis (1991; 2012) recommends that once the pilot data have been collected, the items be evaluated to determine whether they function well, by examining item-score correlations, item variances, and means. DeVellis also recommends examining the internal reliability (consistency) of a scale. Here we veered from the protocol: because items were assessed on a dichotomous scale, there was little variability in responses. We measured six themes with 31 items, so we knew we had a multidimensional scale. We examined internal consistency using Kuder-Richardson 20 (KR-20), which is recommended for scales with dichotomous measures and is comparable to Cronbach’s α for non-dichotomous measures (DeVellis, 1991; 2012). A general rule of thumb is that internal consistency should be greater than .70 (Nunnally & Bernstein, 1994); our internal consistency reliability coefficient was .85.
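As an illustration of how KR-20 is computed, the following sketch applies the standard formula to a respondents-by-items matrix of 0/1 scores. The response data here are made up for demonstration and are not the authors' survey results:

```python
import numpy as np

def kr20(responses):
    """Kuder-Richardson 20 for dichotomous (0/1) item responses.

    responses: 2-D array-like, rows = respondents, columns = items.
    KR-20 = (k / (k - 1)) * (1 - sum(p_j * q_j) / var(total scores)),
    where p_j is the proportion answering item j positively and q_j = 1 - p_j.
    """
    responses = np.asarray(responses, dtype=float)
    k = responses.shape[1]                    # number of items
    p = responses.mean(axis=0)                # proportion of 1s per item
    q = 1.0 - p
    total_var = responses.sum(axis=1).var()   # population variance of total scores
    return (k / (k - 1)) * (1.0 - (p * q).sum() / total_var)

# Four illustrative respondents on three items (a perfect Guttman pattern):
data = [[1, 1, 1],
        [1, 1, 0],
        [1, 0, 0],
        [0, 0, 0]]
print(kr20(data))  # → 0.75
```

With more items and more consistent response patterns the coefficient rises toward 1; a value of .85 on a 31-item scale, as reported above, comfortably exceeds the .70 rule of thumb.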

Step 8: Optimize scale length

Typically, the final step in scale development is factor analysis, which determines the number of factors and whether the scale is unidimensional or multidimensional. Factor analysis was unwarranted here given that the data were dichotomous and the sample size was small. Moreover, because the maximum number of staff using this form is 153, and because the form is a self-evaluation tool whose results we will not have access to, this procedure offered little benefit.


This paper presents a scale development process used in Student Affairs. The resulting scale is a Student Affairs staff self-evaluation tool that can be used for personal performance review and professional goal setting; it can also be used by supervisors to set professional development agendas. As previously mentioned, this instrument will inform our professional development efforts in Student Affairs. Our next annual employee self-evaluation period is spring 2014, when all Student Affairs staff will complete this self-assessment. The Director of Assessment will work with supervisors through a series of trainings on how to use the tool effectively to support staff in their professional development. Our university has a very active employee-development agenda: the university training schedule offers workshops to support staff skill development in management/supervision, technology, and communication, among other areas. In addition, Student Affairs has an Employee Learning Group responsible for monthly learn-at-lunch sessions related to areas represented in the CAS Standards and directly related to this evaluation tool.

While our original intention was to develop a tool for Student Affairs staff at our institution, we recognize that this scale could be used across Student Affairs in all job positions, as the competencies addressed on this scale should have applicability to all. Moreover, it can be used to expand and improve job descriptions to include these competencies.

Discussion Questions

  1. As this scale is used for employee self-reflection, growth and development, what types of professional development might you offer that is aligned with the competencies measured?
  2. Being able to measure employee growth over time is essential. Who at your institution might help develop an instrument like this for the Student Affairs staff?
  3. How might a supervisor use this scale to develop job descriptions for new employees?


ACPA & NASPA. (2010). Professional Competency Areas for Student Affairs Practitioners. Retrieved from:

Council for the Advancement of Standards in Higher Education. (2012). CAS professional standards for higher education (8th ed.). Washington, DC: Council for the Advancement of Standards.

DeVellis, R. F. (1991). Scale development: Theory and application. Applied Social Research Methods Series. Newbury Park, CA: Sage.

DeVellis, R. F. (2012). Scale development: Theory and application. Applied Social Research Methods Series (3rd ed.). Thousand Oaks, CA: Sage.

Nunnally, J. C., & Bernstein, I. H. (1994). Psychometric theory (3rd ed.). New York, NY: McGraw-Hill.

About the Authors

Vicki L. Wise, PhD, serves as Director of Assessment & Research at Portland State University (PSU) where she oversees assessment, planning, and reporting for the Division of Enrollment Management & Student Affairs.  Prior to PSU, she was at James Madison University for 10 years and held the positions of Director of Assessment and Evaluation for the College of Education, Assistant Director for Institutional Research, and Assistant Professor/Research Administrator in the Center for Assessment and Research Studies. Vicki earned her PhD and MA degrees at the University of Nebraska in Psychological and Cultural Studies and Educational Psychology, respectively.

Please e-mail inquiries to Vicki L. Wise.

Lisa Hatfield is the Director of Portland State University’s Learning Center. Lisa is a member of our institution’s Student Affairs Assessment Council and has had a great deal of experience in classroom assessment (both student and instructor). Having taught in the K-12 system for several years, Lisa also has statewide experience with assessment, especially with evaluating students’ writing. She holds an MA and an MAT, and is a doctoral student in Curriculum and Instruction.

Please e-mail inquiries to Lisa Hatfield.


The ideas expressed in this article are not necessarily those of the Developments editorial board or those of ACPA members or the ACPA Governing Board, Leadership, or International Office Staff.

Student Data Analytics: What’s the FERPA Position?

Jeffrey C. Sun
University of Louisville


May colleges and universities use student data to enhance educational programming and student achievement?  Generally speaking, the answer is yes.

Let us start with a basic refresher on the legal definition of the Family Educational Rights and Privacy Act (FERPA).  FERPA is a federal privacy law protecting student education records.  The law requires postsecondary institutions to provide college students access to their education records and mandates privacy protections for those records.  In particular, the education records of concern are those containing personally identifiable information, such as a student’s social security number, biometric information (e.g., fingerprints, voice prints, retina and iris patterns), and other identifying information.  The law, while aimed at protecting student privacy, is not as restrictive as some might assert (Sun, 2014).

FERPA has recently undergone some regulatory changes.  Modifications to the regulations in 2008 and 2011 permitted several uses of student data without student consent.  These permissible uses are intended to enhance educational programming and student achievement.  In this article, I present two instructive examples of these uses: adaptive learning technologies and state longitudinal data systems.

Adaptive Learning Technologies

FERPA allows colleges and universities to use personally identifiable student information without student consent when the institution or state system office uses education records for certain educational programming and operational purposes, such as adaptive learning, predictive testing, student aid programs, or instructional improvement.[1]  As more postsecondary institutions build learning technologies for instruction and educational support, privacy questions become more pressing.

Here’s a quick overview of adaptive learning technology.  Adaptive learning technology is typically a software-based tool in which a student works through a series of learning modules that adapt to the student’s responses.  For instance, if adaptive learning technology is used for math, the program runs a diagnostic through each lesson and identifies the questions or concepts the student struggles to comprehend.  Based on that information, the program presents reinforcement modules or new instructional presentations to address the challenging area.  The technology is a form of artificial intelligence: the program adapts to the individual, mediating learning through a somewhat personalized set of educational modules.
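
The loop just described might be sketched as follows.  This is a minimal illustration only; the concept names, scores, and mastery threshold are invented, and real products such as Knewton’s use far more sophisticated models.

```python
MASTERY_THRESHOLD = 0.8  # assumed cutoff for "mastered"; real systems tune this

def next_module(diagnostic_scores):
    """Given per-concept diagnostic scores in [0, 1], choose the next lesson.

    Concepts below the mastery threshold trigger a reinforcement module;
    once every concept is mastered, the learner advances to new material.
    """
    weak = {c: s for c, s in diagnostic_scores.items() if s < MASTERY_THRESHOLD}
    if not weak:
        return "advance: new instructional module"
    # Reinforce the weakest concept first.
    weakest = min(weak, key=weak.get)
    return f"reinforce: {weakest}"

# Hypothetical diagnostic results for one student.
scores = {"fractions": 0.9, "linear equations": 0.55, "exponents": 0.7}
print(next_module(scores))  # → reinforce: linear equations
```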

In higher education, adaptive learning technology is a growing learning and intervention tool.  It has been applied in a variety of ways, including remedial education, supplemental education, and traditional learning settings.  For instance, at Arizona State University, a student who enrolls in an adaptive learning class must master a set of concepts, earning badges along the way.  Accumulating an established number of badges qualifies the student to sit for the final exam and demonstrate course proficiency.  As Selingo et al. (2013) report, Arizona State University plans to integrate both an adaptive learning feature and an active learning classroom approach into general education courses.  Much of the traditional lecture portion can be delivered through adaptive learning technology along with reinforcement activities, while the active learning classroom adds problem-solving activities.  The legal issue is that the adaptive learning technology rests on a partnership with two for-profit companies, Pearson and Knewton.  These companies serve as third-party vendors of Arizona State and use personally identifiable information from education records.

Based on the modifications in 2008 and 2011, FERPA allows this use.  The law, however, is clear that colleges and universities (and any of their approved contractors) must comply with certain requirements on how the data will be used, protected, and eventually destroyed.  Practically speaking, it requires Arizona State University and its vendor to have a clearly written agreement addressing these terms.

State Longitudinal Data Systems (SLDS)

FERPA also permits colleges and universities to use personally identifiable student information without student consent when the university or state system office uses education records to establish its State Longitudinal Data System (SLDS).  Most states have moved forward on building an SLDS, as these data systems give policymakers and educators a statewide source for linking information from a P20W perspective.  The “P20” refers to education from early childhood through graduate school, and the “W” adds the workforce.  Thus, states are moving to link data about their residents from cradle to career.

For some, SLDS presents a serious privacy concern.  Lawsuits and other challenges from groups such as the Electronic Privacy Information Center (EPIC) have questioned the permissibility of these large datasets, particularly their compliance with FERPA (Rotenberg & Barnes, 2013).  Yet FERPA does allow the data usage for SLDS because the law permits authorized governmental representatives to access education records without student consent when such use is for an audit or evaluation of a federal or state program and for purposes of federal compliance.[2]  Nonetheless, in many states the SLDS is administered by a state agency independent of the higher education and public education systems.  That arrangement presents an interesting problem for some institutions.  For instance, Maryland requires all institutions of higher education operating in Maryland to report personally identifiable information from education records to the Maryland Longitudinal Data System Center, an agency independent of the state educational institutions and systems.  The University of Massachusetts, which enrolls Maryland students through online education, was uncertain whether it should disclose those students’ education records.  The U.S. Department of Education’s Family Policy Compliance Office explained that the University of Massachusetts may disclose those education records so long as Maryland has mechanisms in place allowing this independent agency to receive personally identifiable information from education records.  These mechanisms rest largely on an agreement between the Maryland Higher Education Commission and the Maryland Longitudinal Data System Center, very similar (though not identical) to the provisions discussed above for giving education records to third-party vendors for adaptive learning technologies (e.g., how the data will be used, protected, and eventually destroyed).


In conclusion, FERPA is not necessarily the stifling, archaic, and unworkable compliance regime it is sometimes made out to be.  It accommodates emerging uses of education records, such as the growing use of student data for predictive modeling, adaptive learning technologies, and other system-wide analyses.  For more information about FERPA, please consult the Family Policy Compliance Office.

Finally, I encourage you to read a new book on big data, Building a Smarter University: Big Data, Innovation, and Ingenuity (Lane, 2014).  The book describes and analyzes the transformative use of big data in higher education.

Discussion Questions

  1. How does your institution use education records to enhance educational programming and student achievement?  Does the institution use data from the learning management system (e.g., Blackboard or Desire2Learn)?  In what ways is your institution using that data to track patterns and academically productive behaviors?  Have the data been employed for student learning assessment?
  2. How does your institution ensure privacy of education records?  While FERPA permits uses of education records without the expressed consent of students, what steps or protocols are in place to ensure anonymity?  For instance, what are your institution’s disclosure avoidance techniques?  Does your institution discuss efforts of data anonymization, pseudonymization, or data sharing?  How can your unit engage in these discussions with your Information Technology Division?
  3. What steps or professional growth opportunities has your institution, particularly the Division of Student Affairs, engaged in to envision how programming data (which is also an education record) and uses of technology may support the institution’s mission and comply with the law?


1.  20 U.S.C. § 1232g(b)(1)(F) (2014); 34 C.F.R. § 99.31(a)(6)(i) (2014).

2.  20 U.S.C. § 1232g(b)(1)(C), 20 U.S.C. § 1232g (b)(3) (2014); 34 C.F.R. § 99.31(a)(3), 34 C.F.R. § 99.35 (2014).


References

King, D. (2013, Nov. 22). [Letter to Dawna McIntyre]. Retrieved from

Lane, J. E. (Ed.). (2014). Building a smarter university: Big data, innovation, and ingenuity. Albany, NY: SUNY Press.

Rotenberg, M., & Barnes, K. (2013). Amassing student data and dissipating privacy rights. Educause Review, 48(1). Retrieved from

Selingo, J., Carey, K., Pennington, H., Fishman, R., & Palmer, I. (2013). The next generation university. Retrieved from

Sun, J. C. (2014).  Legal issues associated with big data in higher education: Ethical considerations and cautionary tales.  In J. E. Lane (Ed.), Building a smarter university: Big data, innovation, and ingenuity. Albany, NY: SUNY Press.

About the Author

Jeffrey C. Sun, J.D., Ph.D. is a Professor of Higher Education at the University of Louisville.  He teaches and writes about legal issues pertaining to higher education. 

Please email inquiries to Jeffrey C. Sun.



Student Affairs Staff Support, Resistance, or Indifference to Assessment

Matthew B. Fuller
Sam Houston State University

Much has been said about the importance of a culture of assessment in institutional communities. Although less has been researched and written, an emerging scholarship now makes it possible for student affairs practitioners to discuss their role in developing, maintaining, or augmenting a culture of assessment on their campus. Assessment culture is often assumed to be a positive force because its purported benefits to student learning are highly desirable. A strong culture of assessment, however, can also benefit accreditation, the financing of institutional efforts, and the overall effectiveness of programs and the institution (Maki, 2010). Assessment that serves only the aim of improving student learning is often not tapped for its importance to institutional processes such as program review, accreditation, or planning. Conversely, assessment crafted only to respond to accreditation, accountability, or financial concerns often neglects, or is completely disconnected from, student learning. What is needed instead is a healthy balance of assessment cultures, a tool capable of exploring and measuring this balance, and opportunities for cross-institutional dialogue about perceptions of assessment.

Often, higher education professionals do not recognize or seek out the expertise of other professionals on their campus, preferring instead to adhere to tradition or other forms of collective, professional wisdom (Kezar, 2005; Ward, 2000). In the case of assessment, professional or disciplinary boundaries may prevent collaborations that would otherwise benefit autonomous units, the institution, and students. Traditional narratives falsely cast student affairs practitioners as merely the coordinators of non-classroom activities and purveyors of occasional, co-curricular learning (Kezar & Elrod, 2012). Recent research led by Kezar (2005), Kuh, Kinzie, Schuh, and Whitt (2005), Schuh and Gansemer-Topf (2010), and many others has firmly established the role of student affairs professionals in contributing to deep, long-lasting, meaningful learning that spans disciplines and social boundaries.

The time has come to also recognize and solidify the vital role student affairs practitioners play in developing, maintaining, or changing an institution’s culture of assessment. In the fall 2011 semester, the new Survey of Assessment Culture© was administered to a nationwide, stratified random sample of U.S. directors of institutional research and assessment. This research effort explored factors influencing the manner in which higher education institutions develop, maintain, or augment their institutional culture of assessment. Developed around Maki’s (2010) Principles of an Inclusive Commitment to Assessment, the Survey of Assessment Culture contributes to initial empirical explorations of assessment cultures. This article briefly introduces the methodology of the Survey of Assessment Culture before offering findings from one section of the Survey: assessment directors’ rankings of campus leaders’ support, resistance, or indifference to assessment, particularly among student affairs staff. The article addresses the research question: “What are institutional research and assessment directors’ perceptions of student affairs practitioners’ support, resistance, or indifference to assessment?” After the methodological overview and findings, the article outlines new possibilities for cross-institutional practice and directions for future research.

Brief Methodological Overview

Prior research exploring assessment practices has either relied on convenience samples (Ndoye & Parker, 2010) or has not studied institutional research or assessment directors (Kuh & Ikenberry, 2009). The present study used publicly available resources to construct a random, stratified sample of U.S. directors of institutional research and assessment. A listing of undergraduate, degree-granting, regionally-accredited institutions was downloaded from the Carnegie Classification of Institutions of Higher Education website and stratified by institutional full-time enrollment size, accreditation region, and Carnegie Basic Classification. This stratified listing was placed in a sampling matrix according to the type of degrees awarded (primarily associate’s vs. primarily bachelor’s), regional accreditation region, and size of full-time enrollment. The result was a listing of 2,617 institutions, a population similar to that surveyed by Kuh and Ikenberry (2009). Institutions were sampled at the most refined level of stratification and were over-sampled by a factor of three to ensure the best possible dispersion of a representative number of respondents at and across each level of stratification. A total of 1,026 institutions were randomly sampled for invitation to participate in the survey.
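
The sampling design described above can be sketched roughly as follows. The strata and institution counts here are invented for illustration; the actual study used the full Carnegie listing and many more strata.

```python
import random

random.seed(0)  # fixed seed so the illustration is reproducible

# Hypothetical strata keyed by (degree level, accreditation region,
# enrollment size); institution IDs are made up.
strata = {
    ("bachelors", "SACS", "large"): [f"inst_{i}" for i in range(40)],
    ("associates", "HLC", "small"): [f"inst_{i}" for i in range(40, 70)],
}

TARGET_PER_STRATUM = 5   # desired respondents per cell (assumed)
OVERSAMPLE_FACTOR = 3    # over-sample by a factor of three, per the design

invitations = []
for cell, institutions in strata.items():
    # Draw without replacement within each stratum, capped by cell size.
    n = min(len(institutions), TARGET_PER_STRATUM * OVERSAMPLE_FACTOR)
    invitations.extend(random.sample(institutions, n))

print(len(invitations))  # → 30 (15 invitations drawn from each stratum)
```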

After institutions were randomly selected, the Higher Education Directory was used to identify contact information for directors of institutional research and assessment at sampled institutions. Although the Higher Education Directory is a voluntary listing of contact information, it yielded email addresses for 792 contacts (77.2%). For the remaining institutions, status checks using institutional websites and public search engines1 identified Chief Assessment Officers, defined as the individuals for whom assessment is the primary responsibility; 170 Chief Assessment Officers were identified this way. The remaining 64 institutions had no Chief Assessment Officer entry in the Higher Education Directory, and web searches did not yield contact information. In these cases, the institution’s Provost was invited to participate in the survey, with contact information gathered from the Higher Education Directory.2

Of the 917 invited participants, 316 responded to the electronic survey and completed at least three-quarters of it, a 34.5% response rate. Even the least-answered question obtained 224 responses, and the average number of responses per question was 302. This response suggests the potential for cautious generalization to the national level, which could be strengthened by greater response in future administrations.
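
The response-rate arithmetic, drawing on the invitation counts reported in note 2, works out as follows:

```python
sampled = 1026    # institutions randomly sampled for invitation
bounced = 109     # invitation emails returned as inaccurate or inactive
responded = 316   # usable responses (at least 75% of the survey completed)

invited = sampled - bounced
rate = responded / invited
print(invited, f"{rate:.1%}")  # → 917 34.5%
```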


The current study is limited in that little is known about respondents’ mental schemas when ranking various colleagues as supportive, resistant, or indifferent to assessment. Stated differently, additional analyses are needed to ascertain why an institutional research or assessment director viewed a particular group of colleagues as supportive of or resistant to assessment. The constructs under examination in this study (support, resistance, and indifference) likely differ substantially from participant to participant. For this reason, the current study presents its findings as a depiction of institutional research and assessment directors’ perceptions rather than a “hard and fast” ranking of colleagues’ support of assessment. A more comprehensive discussion of the sampling method, conceptual frameworks, and limitations can be found at the Sam Houston State University website. Findings from the section of the survey on institutional research and assessment directors’ rankings of campus leaders’ support, resistance, or indifference to assessment are offered in the following section.


Participating institutional research and assessment directors were asked to rank ten different campus leaders or leadership groups regarding their supportiveness, resistance, or indifference/unawareness to assessment. Campus leaders or leadership groups included (a) Board of Trustee members; (b) President; (c) Provost; (d) Faculty; (e) Student Affairs staff; (f) Faculty Senate members; (g) Development/ Fundraising officers; (h) Alumni services; (i) Academic Advisors; (j) Student Government Leaders. A seven-point Likert-type scale was developed ranging from “Indifferent/Unaware of assessment” (1); “Highly Resistant” (2); “Moderately Resistant” (3); “Only Slightly Resistant” (4); “Only Slightly Supportive” (5); “Moderately Supportive” (6); to “Highly Supportive” (7).

Regarding student affairs staff, institutional research and assessment directors ranked only 5.8% of student affairs staff as “Indifferent/Unaware of assessment.” Rankings of “Highly Resistant” accounted for 1.3%, “Moderately Resistant” for 2.7%, and “Only Slightly Resistant” for 1.8%. Participants ranked student affairs staff as “Only Slightly Supportive” in 12.4% of their rankings, while the “Moderately Supportive” and “Highly Supportive” categories each accounted for 38.1%. In aggregate, institutional research and assessment directors ranked 88.5% of student affairs staff as supportive to some extent, 5.8% as resistant to some extent, and 5.8% as indifferent/unaware.
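
The aggregation is simple to reproduce from the category percentages above (note that summing the rounded category figures gives 88.6 rather than the 88.5 reported from the unrounded data):

```python
rankings = {  # category percentages reported above
    "Indifferent/Unaware": 5.8,
    "Highly Resistant": 1.3,
    "Moderately Resistant": 2.7,
    "Only Slightly Resistant": 1.8,
    "Only Slightly Supportive": 12.4,
    "Moderately Supportive": 38.1,
    "Highly Supportive": 38.1,
}

supportive = sum(v for k, v in rankings.items() if "Supportive" in k)
resistant = sum(v for k, v in rankings.items() if "Resistant" in k)
indifferent = rankings["Indifferent/Unaware"]

print(round(supportive, 1), round(resistant, 1), indifferent)  # → 88.6 5.8 5.8
```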

In comparison, student affairs practitioners were ranked as only slightly less supportive of assessment than Presidents (91.6%) and Provosts (90.6%), making them the third most supportive group. Faculty Senate members (78.9%) and Faculty (75.8%) were viewed as the next most supportive leaders, followed by Academic Advisors (73.2%), Board of Trustee members (69.5%), Development/Fundraising Officers (53.4%), Student Government leaders (49.3%), and Alumni groups (29.1%). An equal portion of Student Government leaders (49.3%) were ranked as indifferent to assessment, and notable percentages of indifference are seen among Alumni groups (69.5%), Development and fundraising officers (42.9%), and Board of Trustee members (30.5%). For a more detailed comparison and treatment of scale items as interval/ratio data, see Fuller (2011).

Discussion and Call for Future Research

The large percentage of supportive responses indicates U.S. student affairs practitioners are perceived as being supportive of assessment and their colleagues in institutional research and assessment have taken note of this support. Coupled with presidents and provosts, student affairs practitioners are perceived as among the most supportive members of an institutional community. Empowering student affairs staff to demonstrate their support for assessment may prove beneficial to advancing assessment practices across campus. Student affairs staff may be supportive allies in advancing the benefits of assessment to faculty, administrators, or students. In particular, student affairs practitioners can be critical in reaching out to students or student organizations and instilling fundamental commitments to self-exploration and inquiry inherent in assessment.

Findings from the current study reveal a positive depiction of assessment administrators’ belief in student affairs practitioners. However, the fact that 5.8% of student affairs staff were ranked as indifferent/unaware and 5.8% were perceived as resistant to some extent offers an opportunity for student affairs practitioners, and all groups in the present study, to consider how they can be perceived as more supportive of assessment. Campus leaders in these groups may be astonished to learn that they are perceived as more or less supportive of assessment than they view themselves to be. Student affairs staff have daily contact with students and are vital collaborators in an effective culture of assessment focused on improving student learning (Maki, 2010; Upcraft & Schuh, 1996). Further debunking myths that student affairs staff are resistant to assessment could create advantageous conditions for making the improvement of student learning an institutional way of life.

Similarly, student affairs practitioners may see unique opportunities to translate their support for assessment into generative, meaningful action. Student affairs staff may see avenues for collaboration with institutional research staff or other colleagues perceived as more resistant to assessment. Student affairs staff must connect with colleagues inside and outside of student affairs and the institution; they are masters of seeking innovative partnerships, respectfully spanning boundaries, and leveraging colleagues for synergy (Kezar, 2005). Identifying avenues of mutual benefit between student affairs staff and other campus colleagues concerned with advancing the benefits of assessment may initiate and sustain long-term cultural change in institutions. These findings may have been drastically different in years or decades prior (Astin & Antonio, 2012; Ewell, 2002; Upcraft & Schuh, 1996), and are points worth celebrating on individual campuses and in assessment scholarship.

Data on support, resistance, or indifference to assessment may be most meaningful as a model for additional, institution-level dialogue. Assessment practitioners may not have considered which campus constituents are most supportive, resistant, or indifferent to assessment on their own campus. This finding may offer individual practitioners an avenue to initiate conversations about assessment within their units and across campus. Moreover, student affairs or assessment practitioners may not fully recognize their role, and the power they possess, in forming a campus culture that supports assessment for the purpose of learning. Individual reflections on assessment cultures and on supportive partnerships for assessment across campus communities may be the most useful and meaningful reflections in which educators can engage. These findings suggest student affairs colleagues support assessment to an exemplary degree and should be recognized for their commitment to this reform agenda.

Further research and advocacy are necessary to organize student affairs practitioners’ highly supportive approach to assessment into a force for change in higher education. Doing so will allow institutional research and assessment directors to connect supportive practitioners with more resistant or indifferent colleagues. Through these connections, a scholarship of collegiality (i.e., the study of how colleagues come together and change one another despite differing perspectives on a subject) can emerge and be leveraged to support change in cultures of assessment. This will be an evolving process in which scholars and practitioners must engage to fully understand how traditionally disparate units learn from each other. Nonetheless, these initial descriptive results offer promising perspectives on the role student affairs practitioners can play as role models in transforming or maintaining an institution’s culture of assessment. For now, the overwhelming support student affairs practitioners have shown toward assessment is a point worth celebrating. These data support the notion that the majority of student affairs practitioners on the campuses studied are supportive of assessment, and positioning them as exemplars of support for assessment may advance institutional improvements in cultures of assessment.

Discussion Questions

  • What forms of power and authority do student affairs staff members and leaders possess to change their institution’s culture of assessment?
  • How do different campus leaders and leadership groups define a culture of assessment or assessment in general?
  • What partnerships between student affairs staff and other campus leaders or leadership groups may be beneficial in advancing a culture of assessment?
  • What are the necessary components of an effective relationship between student affairs staff and campus leaders and leadership groups?


  1. Search terms: assessment; institutional research, evaluation, institutional effectiveness.
  2. Of the 1,026 institutions invited to participate in the Survey, a total of 109 emails were returned as either inaccurate or no longer active. It can therefore be assumed that 917 participants were adequately invited to participate in the survey.


References

Astin, A. W., & Antonio, A. L. (2012). Assessment for excellence: The philosophy and practice of assessment and evaluation in higher education (2nd ed.). Lanham, MD: Rowman & Littlefield Publishing Group, Inc.

Ewell, P. T. (2002). An emerging scholarship: A brief history of assessment. In T. Banta & Associates (Eds.). Building a scholarship of assessment (pp. 3–25). San Francisco, CA: Jossey-Bass.

Fuller, M. B. (2011). Preliminary results of the Survey of Assessment Culture. Retrieved from… AssessmentCultureResults.pdf

Kezar, A. (2005). Moving from I to we: Reorganizing for collaboration in higher education. Change: The Magazine of Higher Learning, 37(6), 50-56.

Kezar, A., & Elrod, S. (2012). Facilitating interdisciplinary learning: Lessons from Project Kaleidoscope. Change: The Magazine of Higher Learning, 44(1), 16-25.

Kuh, G., & Ikenberry, S. (2009). More than you think, less than we need: Learning outcomes assessment in American higher education. Urbana, IL: University of Illinois and Indiana University, National Institute for Learning Outcomes Assessment.

Kuh, G. D., Kinzie, J., Schuh, J., & Whitt, E. (2005). Student success in college: Creating conditions that matter. San Francisco: Jossey-Bass.

Maki, P. (2010). Assessing for learning: Building a sustainable commitment across the institution (2nd ed.). Sterling, VA: Stylus.

Ndoye, A., & Parker, M. A. (2010). Creating and sustaining a culture of assessment. Planning For Higher Education, 38(2), 28-39.

Schuh, J.H., & Gansemer-Topf, A.M. (2010, December). The role of student affairs in student learning assessment. Urbana, IL: University of Illinois and Indiana University, National Institute for Learning Outcomes Assessment.

Upcraft, M. L., & Schuh, J. H. (1996). Assessment in student affairs: A guide for practitioners. San Francisco: Jossey-Bass.

Ward, D. (2000, January/February). Catching the wave of change in American higher education. Educause Review, 35(1), 22-30.

About the Author

Matthew Fuller, Ph.D., is Assistant Professor and Coordinator of Higher Education Administration at Sam Houston State University. Dr. Fuller serves as the Principal Investigator for the Survey of Assessment Culture, a nation-wide annual survey of factors influencing institutional assessment approaches. He has held administrative positions in assessment at Texas A&M University and Illinois State University as well as positions in Residence Life at Texas A&M University and the University of Alaska – Southeast. Dr. Fuller earned a Bachelor of Arts in Biology, a Master of Science in Educational Administration and Human Resource Development (Emphasis in Student Affairs Administration in Higher Education), and a certificate in College Teaching from Texas A&M University and a Ph.D. in Educational Administration and Foundations from Illinois State University. He is a 2008 recipient of the Association of Institutional Research’s Julia Duckwall Fellowship and a 2012 fellowship with the National Center for Educational Statistics’ National Data Institute. Dr. Fuller’s research agenda focuses on the foundations of assessment, assessment cultures in higher education, and the history of higher education.

Please e-mail inquiries to Matthew B. Fuller.



Paradigms for Assessing Success in Career Services

Jessica M. Turos
Bowling Green State University
Patrick L. Roberts
East Carolina University

Assessment is vital to the success of student services in higher education and has tremendous potential to inform and improve our practice. Furthermore, as institutions face increasing calls for transparency and accountability, career services staff members can play a critical role in demonstrating student success through a variety of internal assessments. The focus on assessment is timely because career services staff are feeling the pressure of accountability from both internal influences (student affairs, academic affairs, faculty, and students) and external stakeholders (parents, alumni, employers, and government agencies) (Ratcliffe, 2008a). Successful assessment is multifaceted and begins with practitioners agreeing on goals and objectives for learning. Using both direct and indirect methods of measurement, assessment can provide immediate feedback and generate information for long-term decision making.1 Additionally, continuous examination of the process can yield key insights along the way (Palomba & Banta, 1999; Suskie, 2004). In this article, the authors discuss assessment challenges for career services practitioners; opportunities for career services staff members to undertake assessment initiatives; promising career services assessment practices, including career courses, workshops, advisory boards, and benchmarks; career services staff members as practitioners and scholars; and conclusions and discussion questions.

Assessment Challenges in Career Services

Practicing outcomes-based assessment in career services can be particularly challenging when data requests from various campus constituents have focused historically on the following: (a) demographics, (b) satisfaction, and (c) needs (Greenberg & Harris, 2006). While demographic data allow career services practitioners to identify participation levels for various programs and services by groups, these data provide limited information about why students participated in the programs and services or what they learned from the experience. Gathering satisfaction data is another common assessment method for career services. However, it still provides an incomplete view. Why was the career program helpful? What did students learn from their interactions with career services staff members? These questions often are left unanswered.

Career services practitioners also use needs assessment to gain an understanding of student interest. While needs assessment can provide helpful information from a student perspective, Greenberg and Harris (2006) note this type of assessment “is not necessarily an indication that they would use the resource or attend the program if offered” (p. 20).

Another challenge career services professionals face is documenting student success. One of the greatest misconceptions in higher education is that job placement is the responsibility of career services offices. Because career services is an integral component of the educational experience, we believe that teaching job searching skills is far more valuable to students than merely placing them in jobs, and unlike placement rates, job searching skills can be measured in terms of knowledge acquisition. Moreover, “determining the employment status of students upon graduation is an area that is both difficult and controversial” (Greenberg & Harris, 2006, p. 21). Part of this debate is the question of how career services offices are measured in terms of effectiveness. Are decisions based on the number of students who get jobs or the number of students who are served through individual and group appointments? Is it fair to expect career services staff members to deliver the educational component to students, but be evaluated by a different standard? Further, placement rates cannot account for situations beyond employment. For example, what happens if a student obtains a job but is underemployed? What transpires if someone gets a job but is laid off shortly after starting and does not know how to search for a new one? What ensues when a student works with staff from the career center but faces barriers to obtaining employment, such as a felony conviction, lack of work experience, or speech challenges? Perhaps a more informative measure is students’ job seeking skills rather than placement rates.

Assessment Opportunities in Career Services

According to Ratcliffe (2008a), “A major challenge for career services practitioners is how to document excellence in our contributions to student learning; how to show the value of our programs and services; and how to be accountable to our diverse stakeholders” (p. 43). By simply shifting the way in which we gather information about student learning, we can move away from indirect measures of assessment (questionnaires, evaluations, surveys, etc.) to more robust methods of assessment, such as portfolios and document analysis.

At Bowling Green State University, Career Center staff members help students evaluate their resumes by providing a resume rubric. With this rubric, students can examine various components of their resumes and identify strategies to enhance their job search documents. In a way, students are conducting their own document analysis, using the resume rubric to assess what they learned from their conversations with career advisors and from job search preparation materials. Ideally, staff members would conduct a more formal document analysis. For example, career services practitioners could examine to what extent students learned to communicate accomplishments from a work experience concisely, using action words, by comparing students’ original resumes to the new resumes created after their consulting appointments. Given time and resource constraints, however, variations on direct assessment, such as the resume rubric example, can still provide powerful data. The key is for career services practitioners to be intentional about the data they are seeking.

Promising Practices

A growing trend for career services offices is to focus on outcome measures that align with their offices’ goals (Ratcliffe, 2008b). This trend toward assessing student learning responds to the push for accountability and supports a focus on continuous improvement. Assessing student learning in career services can be done in a variety of ways (Greenberg & Harris, 2006), including measuring outcomes in career courses, workshops, and advisory boards.

Career Courses. Career courses are powerful learning tools. Jessie Lombardo, Senior Career Counselor at Buffalo State College, teaches a career planning course designed to educate students about the career development process. Lombardo (personal communication, September 13, 2011) noted:

Students reported a significant increase in self-knowledge—namely, their interests, values, skills, and personality traits and how they relate to choosing a career. Also, they reported being better prepared to set goals and make decisions as a result of taking this course.

For career courses, students can be tested on the material covered using a direct assessment approach. Evaluative assignments, such as job search materials, and performance assessments, such as mock interviews, also are direct assessment methods.

Workshops. Career services professionals also can assess student learning from workshops. For example, after an interview workshop, students can be asked to identify key concepts, such as what the STAR (Situation, Task, Action, and Result) model stands for and how it can be used during the interview process. Additionally, career services staff members can assess learning from a career consultation about interviewing by conducting a mock interview and observing improvements in how students communicate information about their experience and accomplishments. While these are time-intensive approaches, using a sampling technique, such as availability sampling in high-traffic areas of an institution (quad, union, dining halls), is sometimes more feasible. In fact, some assessment software can be used on handheld devices and through applications for cell phones or tablets. This approach can get students curious, excited, and engaged in providing valuable feedback.

Advisory Boards. The potential impact of career services on an institution’s stakeholders has created a demand for advisory boards to assess and evaluate employer needs and the services currently offered. These boards often consist of a mixture of constituents, such as employers, students, alumni, parents, faculty, staff, or targeted groups. The most important aspect of any advisory board is that members are users of, and invested in, the services offered by a career center. Schuh, Upcraft, and Associates (2001) suggest these qualities allow advisory board members to “interact with and relate to their peer consumers frequently and [they] are thus in a position to represent views beyond their own about the quality of, appropriateness of, and satisfaction with career center services” (p. 374). Advisory boards can be useful in a variety of ways, especially as a means of representing the needs of the many different stakeholders invested in the success of career services. They also can serve as an immediate, in-house pilot group to test programming, marketing materials, and other ideas before a concept is fully released to the campus community.


Benchmarking. Another promising practice for career services is the use of benchmarks. Benchmarking provides a point of reference for how an institution is doing in comparison to its peers. According to Greenberg and Harris (2006), “acquiring assessment data on client needs and satisfaction, employment outcomes, student learning, program review, and so forth is important and helpful; however, information in a vacuum has limited use” (p. 23). Fortunately for career services, the National Association of Colleges and Employers (NACE) created professional standards that career services offices can use (see NACE, 2009). Additionally, the Council for the Advancement of Standards in Higher Education (CAS) has identified standards related to career services (see CAS, 2012). NACE also gathers data from a variety of surveys of employers and career services professionals that can be used for benchmarking purposes.

Career Services Practitioners as Scholars and Researchers

Assessment informs our practice and can contribute to the knowledge base of our field. Research in career services is achievable, and it does not have to add significantly to our workload. Career services practitioners simply need to identify research questions that guide them in examining issues they see in their everyday work. For example, to examine students’ perceptions of their learning through on-campus jobs, Turos (2009) created and disseminated an online survey in which student employees self-reported their learning on a variety of outcomes. While it can feel overwhelming at first, identifying the right questions when assessing and evaluating career services will not only inform our departments, institutions, and field but also produce valuable information for the future.

Too often career services practitioners conduct great assessments but do not take the time to share the results on a broader scale. Since career services practitioners already gather demographic data, this may be one area to begin asking questions. According to Schuh, Upcraft, and Associates (2001), the “careful examination and analysis of these summary data can lead to helpful and even startling conclusions with significant implications for service delivery” (p. 367). For example, an annual assessment report of a mid-sized state university based on demographic questions (Lombardo, 2011) revealed several key implications for future service delivery and areas of improvement:

  • Although the ratio of women to men enrolled at the university was approximately 60% to 40%, women sought career counseling services at a ratio of 73% to 27%.
  • Approximately 40% of counseling sessions were conducted with business or education majors, even though these academic programs accounted for less than 25% of total enrollment.

These examples show that even though demographic data are sometimes overlooked, such data can serve as valuable resources for longitudinal studies of specific subgroups and can be used to reach populations that might not traditionally take advantage of career services.


Conclusion

In today’s challenging economic times, career services must demonstrate effectiveness and accountability. Placement rates are only one small piece of a much larger assessment picture in career services. Although placement rates, like retention, will always be targeted in higher education, the contributions of career services are more complex than placement numbers. To remain relevant, career services must be seen as both a service and an educational component of a student’s collegiate experience. Assessment of career services must incorporate goals that identify effective programming that encourages student development, complemented by services that improve employability. Ultimately, career professionals do not accompany students when they submit applications, interview, or accept positions. It follows that the guiding question of assessment for career services should shift from “What are our placement rates?” to “How do we encourage a student’s continued career success?”

Discussion Questions

  • What assessment projects are you working on that could be turned into an educational piece reaching a larger audience (e.g., conference presentation, journal article)?
  • How has assessment impacted your institution, your office, and your position?
  • What assessment trends have you seen at your institution?
  • How can assessment data be used to support career services programs, services, and learning outcomes?
  • What challenges does your institution, department, or office face when collecting assessment data?


Notes

1. “Direct measures of learning require students to display their knowledge and skills as they respond to the instrument itself…. Indirect methods such as surveys and interviews ask students to reflect on their learning rather than to demonstrate it” (Palomba & Banta, 1999, pp. 11-12).


References

Council for the Advancement of Standards in Higher Education. (2012). The role of career services: CAS standards and guidelines. In CAS professional standards for higher education (8th ed., pp. 139-155). Washington, DC: Author.

Greenberg, R., & Harris, M. B. (2006). Measuring up: Assessment in career services. National Association of Colleges and Employers Journal, 18-24.

Lombardo, J. (2011). Annual assessment report. Buffalo, NY: State University of New York College at Buffalo.

National Association of Colleges and Employers (NACE). (2009). Professional standards for college and university career services. In NACE principles for professional conduct. Bethlehem, PA: Author.

Palomba, C. A., & Banta, T. W. (1999). Assessment essentials: Planning, implementing, and improving assessment in higher education. San Francisco, CA: Jossey-Bass.

Ratcliffe, S. (2008a). Demonstrating career services success: Rethinking how we tell the story. National Association of Colleges and Employers Journal, 40-44.

Ratcliffe, S. (2008b). Developing the career services story: An overview of assessment strategy. National Association of Colleges and Employers Journal, 41-47.

Schuh, J. H., Upcraft, M. L., & Associates (2001). Assessment practice in student affairs: An applications manual. San Francisco, CA: Jossey-Bass.

Suskie, L. (2004). Assessing student learning: A common sense guide. Bolton, MA: Anker.

Turos, J. M. (2009). Learning while earning: Assessing student employee learning. National Student Employment Association Journal, X(1), 11-20.

About the Authors

Jessica M. Turos is the Interim Director of the Career Center at Bowling Green State University. She is a directorate member of the Commission for Assessment and Evaluation.

Please e-mail inquiries to Jessica M. Turos.

Patrick L. Roberts is a career counselor at East Carolina University. He received his master’s degree in student personnel administration from Buffalo State College.

Please e-mail inquiries to Patrick L. Roberts.


The ideas expressed in this article are not necessarily those of the Developments editorial board or those of ACPA members or the ACPA Governing Board, Leadership, or International Office Staff.