Paradigms for Assessing Success in Career Services

Jessica M. Turos
Bowling Green State University
Patrick L. Roberts
East Carolina University

Assessment is vital to the success of student services in higher education and has tremendous potential to improve and inform our practice. Furthermore, as institutions face increasing calls for transparency and accountability, career services staff members can play a critical role in demonstrating student success through a variety of internal assessments. The focus on assessment is timely because career services staff are feeling the pressure of accountability from both internal influences (student affairs, academic affairs, faculty, and students) and external stakeholders (parents, alumni, employers, and government agencies) (Ratcliffe, 2008a). Successful assessment is multifaceted and begins with agreement among practitioners on goals and objectives for learning. Utilizing both direct and indirect methods of measurement, assessment can provide immediate feedback and generate information for long-term decision making (see Note 1). Additionally, a continuous examination of the process can lead to key insights along the way (Palomba & Banta, 1999; Suskie, 2004).

Through this article, the authors will discuss assessment challenges for career services practitioners; opportunities for career services staff members to perform assessment initiatives; promising career services assessment practices, including career courses, workshops, advisory boards, and benchmarks; career services staff members as practitioners and scholars; and conclusions and discussion questions.

Assessment Challenges in Career Services

Practicing outcomes-based assessment in career services can be particularly challenging when data requests from various campus constituents have focused historically on the following: (a) demographics, (b) satisfaction, and (c) needs (Greenberg & Harris, 2006). While demographic data allow career services practitioners to identify participation levels for various programs and services by groups, these data provide limited information about why students participated in the programs and services or what they learned from the experience. Gathering satisfaction data is another common assessment method for career services. However, it still provides an incomplete view. Why was the career program helpful? What did students learn from their interactions with career services staff members? These questions often are left unanswered.

Career services practitioners also use needs assessment to gain an understanding of student interest. While needs assessment can provide helpful information from a student perspective, Greenberg and Harris (2006) note this type of assessment “is not necessarily an indication that they would use the resource or attend the program if offered” (p. 20).

Another challenge career services professionals face is documenting student success. One of the greatest misconceptions in higher education is that job placement is the responsibility of career services offices. We believe that teaching job search skills, as an integral component of the educational experience, is far more valuable to students than merely placing them in jobs. Unlike placement rates, job search skills can be measured in terms of knowledge acquisition. Moreover, “determining the employment status of students upon graduation is an area that is both difficult and controversial” (Greenberg & Harris, 2006, p. 21). Part of this debate is the question of how career services offices are measured in terms of effectiveness. Are decisions based on the number of students who get jobs or the number of students who are served through individual and group appointments? Is it fair to expect career services staff members to deliver the educational component to students but be evaluated by a different standard? Further, placement rates cannot account for situations beyond initial employment. For example, what happens if a student obtains a job but is underemployed? What transpires if someone gets a job but then is laid off shortly after starting and does not know how to search for a new job? What ensues when a student works with staff from the career center but has barriers to obtaining employment, such as a felony conviction, lack of work experience, or speech challenges? Perhaps a more informative measure is to evaluate students’ job search skills rather than placement rates.

Assessment Opportunities in Career Services

According to Ratcliffe (2008a), “A major challenge for career services practitioners is how to document excellence in our contributions to student learning; how to show the value of our programs and services; and how to be accountable to our diverse stakeholders” (p. 43). By shifting the way in which we gather information about student learning, we can move from indirect measures of assessment (questionnaires, evaluations, surveys) to more direct and robust methods, such as portfolios and document analysis.

At Bowling Green State University, Career Center staff members help students evaluate their resumes by providing a resume rubric. With this rubric, students are able to examine various components of their resumes and identify strategies to enhance their job search documents. In a way, students are conducting their own document analysis, using the resume rubric to assess what they learned from their conversations with career advisors and their reading of job search preparation materials. A more formal document analysis conducted by staff members would be even better. For example, career services practitioners could examine to what extent students learned to communicate accomplishments from a work experience concisely, using action words, in their resumes. Practitioners could do this by comparing students’ original resumes to the new resumes created after their consulting appointments. However, when time and resources are constrained, variations on direct assessment, such as the resume rubric, can still provide powerful data. The key is for career services practitioners to be intentional about the data they are seeking.
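For offices that record rubric ratings electronically, a simple pre/post comparison can turn those ratings into direct evidence of learning. The following is a minimal sketch in Python, offered only as an illustration; the rubric criteria, the 1-4 scale, and the scores are hypothetical and are not drawn from the Bowling Green State University rubric.

    # Hypothetical pre/post rubric ratings (1-4 scale) for three students.
    # Each list holds one score per student on the same criterion, rated
    # before and after a resume consulting appointment.
    pre_scores = {
        "action_words":    [2, 1, 3],
        "accomplishments": [1, 2, 2],
        "conciseness":     [2, 2, 1],
    }
    post_scores = {
        "action_words":    [4, 3, 4],
        "accomplishments": [3, 3, 4],
        "conciseness":     [3, 4, 3],
    }

    # Average gain per criterion suggests where appointments helped most.
    for criterion in pre_scores:
        gains = [post - pre for pre, post in
                 zip(pre_scores[criterion], post_scores[criterion])]
        print(f"{criterion}: mean gain = {sum(gains) / len(gains):.2f}")

Even a small summary like this, aggregated across a semester of appointments, moves an office from satisfaction data toward direct evidence of what students learned.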

Promising Practices

A growing trend for career services offices is to focus on outcome measures for assessment that align with their offices’ goals (Ratcliffe, 2008b). This trend aligns with the press for accountability and with the focus of assessment on continuous improvement. Assessing student learning in career services can be done in a variety of ways (Greenberg & Harris, 2006), including measuring outcomes in career courses and workshops and gathering input through advisory boards.

Career Courses. Career courses are powerful learning tools. Jessie Lombardo, Senior Career Counselor at Buffalo State College, teaches a career planning course designed to educate students about the career development process. Lombardo (personal communication, September 13, 2011) noted:

Students reported a significant increase in self-knowledge—namely, their interests, values, skills, and personality traits and how they relate to choosing a career. Also, they reported being better prepared to set goals and make decisions as a result of taking this course.

In career courses, students can be tested directly on the material covered. Additionally, instructors can use evaluative assignments, such as job search materials, and performance assessments, such as mock interviews, all of which are direct assessment methods.

Workshops. Career services professionals also can assess student learning from workshops. For example, after an interview workshop, students can be asked to identify key concepts, such as what the STAR (Situation, Task, Action, and Result) model stands for and how it can be used during the interview process. Additionally, career services staff members can assess learning that occurs from a career consultation about interviewing by conducting a mock interview and observing any improvements in communicating information about experience and accomplishments. While these are time-intensive approaches, using a sampling technique such as availability sampling in high-traffic areas of an institution (quad, union, dining halls) is sometimes more feasible. In fact, some assessment software allows responses to be collected on handheld devices or through applications for cell phones or tablets. This approach can get students curious, excited, and engaged in providing valuable feedback.

Advisory Boards. The potential impact of career services on an institution’s stakeholders has created a demand for advisory boards to assess and evaluate employer needs and the services currently offered. These boards often consist of a mixture of constituents, such as employers, students, alumni, parents, faculty, staff, or targeted groups. The most important aspect of any advisory board is that its members use, and are invested in, the services offered by a career center. Schuh, Upcraft, and Associates (2001) suggest these qualities allow advisory board members to “interact with and relate to their peer consumers frequently and [they] are thus in a position to represent views beyond their own about the quality of, appropriateness of, and satisfaction with career center services” (p. 374). Advisory boards can be useful in a variety of ways, especially as an option to represent the needs of the many different types of stakeholders invested in the success of career services. Another benefit is that they can serve as an immediate, in-house pilot group to test programming, marketing materials, and other ideas before fully releasing a concept to the campus community.

Benchmarks. Another promising practice for career services is the use of benchmarks. Benchmarking provides a point of reference for how an institution is doing in comparison to its peers. According to Greenberg and Harris (2006), “acquiring assessment data on client needs and satisfaction, employment outcomes, student learning, program review, and so forth is important and helpful; however, information in a vacuum has limited use” (p. 23). Fortunately for career services, the National Association of Colleges and Employers (NACE) created professional standards that career services offices can use (see NACE, 2009). Additionally, there are standards identified by the Council for the Advancement of Standards in Higher Education (CAS) related to career services (see CAS, 2012). NACE also gathers data from a variety of surveys of employers and career services professionals that can be used for benchmarking purposes.

Career Services Practitioners as Scholars and Researchers

Assessment informs our practice and can contribute to the knowledge base of our field. Research in career services is achievable, and it does not have to add significantly to our workload. Career services practitioners simply need to identify research questions to guide them in examining issues they see in their everyday work. For example, to examine students’ perceptions of their learning through on-campus jobs, Turos (2009) created and disseminated an online survey in which student employees self-reported their learning on a variety of outcomes. While it can feel overwhelming at first, conducting assessment research guided by the right questions will not only inform our departments, institutions, and field, but also produce valuable information for the future.

Too often career services practitioners conduct great assessments, but they do not take the time to share the results on a broader scale. Since career services practitioners already gather demographic data, this may be one area to begin asking questions. According to Schuh, Upcraft, and Associates (2001), the “careful examination and analysis of these summary data can lead to helpful and even startling conclusions with significant implications for service delivery” (p. 367). For example, an annual assessment report of a mid-sized state university based on demographic questions (Lombardo, 2011) revealed several key implications for future service delivery and areas for improvement:

  • Although the ratio of women to men enrolled at the university was approximately 60% to 40%, the ratio among students seeking career counseling services was 73% to 27%.
  • Approximately 40% of counseling sessions were conducted with business or education majors, even though these academic programs accounted for less than 25% of total enrollment.

These examples show that even though demographic data are sometimes overlooked, such data can serve as valuable resources for longitudinal studies of specific subgroups and can help target populations that might not traditionally take advantage of career services.
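When enrollment figures are available for comparison, a simple goodness-of-fit check can indicate whether such differences in service usage are larger than chance alone would suggest. The following is a minimal sketch in Python using SciPy; the client counts are hypothetical and are not taken from the Lombardo (2011) report.

    # Minimal sketch (hypothetical counts): compare the gender mix of career
    # counseling clients against the gender mix of overall enrollment.
    from scipy.stats import chisquare

    observed = [730, 270]  # hypothetical counts of women and men counseling clients
    total_clients = sum(observed)

    # Expected counts if clients mirrored enrollment (60% women, 40% men).
    expected = [0.60 * total_clients, 0.40 * total_clients]

    stat, p_value = chisquare(f_obs=observed, f_exp=expected)
    print(f"chi-square = {stat:.1f}, p = {p_value:.4f}")
    # A small p-value suggests the client mix differs from enrollment by more
    # than chance, supporting a closer look at outreach to underrepresented groups.

Even without formal testing, simply reporting usage percentages alongside enrollment percentages, as in the bullets above, gives stakeholders a meaningful point of comparison.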

Conclusion

In today’s challenging economic times, career services must show effectiveness and accountability. Placement rates are only one small piece of a much larger assessment picture in career services. Although placement rates, like retention, will always be targeted in higher education, the contributions of career services are more complex than placement numbers. In order to remain relevant, career services must be seen as both a service and an educational component of a student’s collegiate experience. Assessment of career services must incorporate goals that identify effective programming encouraging student development, complemented by services that improve employability. Ultimately, career professionals do not accompany students when they submit their applications, interview, or accept a position. It follows that the assessment question for career services should shift from “What are our placement rates?” to “How do we encourage a student’s continued career success?”

Discussion Questions

  • What assessment projects are you working on that could be turned into an educational piece reaching a larger audience (e.g., conference presentation, journal article)?
  • How has assessment impacted your institution, your office, and your position?
  • What assessment trends have you seen at your institution?
  • How can assessment data be used to support career services programs, services, and learning outcomes?
  • What challenges does your institution, department, or office face when collecting assessment data?

Notes

1. “Direct measures of learning require students to display their knowledge and skills as they respond to the instrument itself….Indirect methods such as surveys and interviews ask students to reflect on their learning rather than to demonstrate it” (Palomba & Banta, 1999, pp. 11-12).

References

Council for the Advancement of Standards in Higher Education. (2012). The role of career services and CAS standards and guidelines. In CAS professional standards for higher education (8th ed., pp. 139-155). Washington, DC: Author.

Greenberg, R., & Harris, M. B. (2006). Measuring up: Assessment in career services. National Association of Colleges and Employers Journal, 18-24.

Lombardo, J. (2011). Annual assessment report. Buffalo, NY: State University of New York College at Buffalo.

National Association of Colleges and Employers (NACE). (2009). Professional standards for college and university career services. In NACE principles for professional conduct. Bethlehem, PA: Author. Retrieved from http://www.naceweb.org/principles/#careerservices

Palomba, C. A., & Banta, T. W. (1999). Assessment essentials: Planning, implementing, and improving assessment in higher education. San Francisco, CA: Jossey-Bass.

Ratcliffe, S. (2008a). Demonstrating career services success: Rethinking how we tell the story. National Association of Colleges and Employers Journal, 40-44.

Ratcliffe, S. (2008b). Developing the career services story: An overview of assessment strategy. National Association of Colleges and Employers Journal, 41-47.

Schuh, J. H., Upcraft, M. L., & Associates (2001). Assessment practice in student affairs: An applications manual. San Francisco, CA: Jossey-Bass.

Suskie, L. (2004). Assessing student learning: A common sense guide. Bolton, MA: Anker.

Turos, J. M. (2009). Learning while earning: Assessing student employee learning. National Student Employment Association Journal, X(1), 11-20.

About the Authors

Jessica M. Turos is the Interim Director of the Career Center at Bowling Green State University. She is a directorate member of the Commission for Assessment and Evaluation.

Please e-mail inquiries to Jessica M. Turos.

Patrick L. Roberts is a career counselor at East Carolina University. He received his master’s degree in student personnel administration from Buffalo State College.

Please e-mail inquiries to Patrick L. Roberts.

Disclaimer

The ideas expressed in this article are not necessarily those of the Developments editorial board or those of ACPA members or the ACPA Governing Board, Leadership, or International Office Staff.
