Implementation Fidelity in First-Year Experience Programs | Braught

Abstract

Payton’s first-year experience (FYE) seminar is getting mixed reviews. Their supervisor and the Director of Assessment for the Division of Student Affairs are quite excited about some promising retention results, but since starting the job, Payton has heard from student staff that the peer mentors may not be putting in the effort that the professional staff think they are. Payton worries that the process and the results aren’t aligned, yet they have just been invited to a meeting about expanding the seminar program to reach more students.

Key Words: Assessment, Implementation Fidelity, First-Year Experience

Primary Characters

Payton (they/them) is the primary staff member responsible for coordinating the first-year experience (FYE) seminar in the Office of the First Year Experience. They have been in the role for one full year and are starting their second fall in the job.

Casey (he/him) is Payton’s direct supervisor. Casey has been at Blue River College for 10 years. Before Payton arrived, Casey was directly in charge of the program.

Rachel (she/her) is the Director of Assessment, responsible for student affairs assessment and program evaluation. When Rachel was hired, her supervisor tasked her with analyzing retention for all signature programs in the Division of Student Affairs to help determine which programs should be maintained. Rachel has learned that the division does not have a history of demonstrating impact using quantitative measures like retention or GPA.

Context And Case

Blue River College (BRC) is a mid-sized public university in a metropolitan area. Only 10% of students live on campus; most commute from elsewhere in the city or nearby suburbs. Many students choose Blue River College because of its focus on real-world, practical learning that connects the campus with the city.

Like other colleges and universities, Blue River College is experiencing increased pressure to maintain enrollment and increase retention of students from their first year into their second year. Last year, BRC retained 76% of first-year students into their second year. In the last few years, the Division of Student Affairs has had a tight budget, with little wiggle room to introduce new programs or hire additional staff to expand the capacity of existing programs and services. Some staff refer to Blue River College as a “commuter campus,” and many perceive it to be a struggle to engage students in out-of-classroom programming.

The FYE seminar, a one-credit course students can take during their first fall semester at BRC, is 10 years old. Around 35% of students participate in an FYE seminar. The program employs 20 peer mentors who interact with students during welcome week and in weekly class sessions throughout their first term. Student feedback on the program is typically positive, but the information the program has collected has been largely anecdotal.

Payton Gets Started

Payton is starting their second year as coordinator of the FYE seminar at Blue River College. In this role, Payton is responsible for supervising the 20 student staff members who serve as peer mentors for FYE seminars. Their first year on the job was like drinking from a firehose: Payton didn’t feel they could innovate much and mostly relied on what Joey, the previous coordinator of the program, had developed. Payton felt that their master’s program had given them a strong foundation in designing effective FYE programs, especially in support of students’ sense of belonging. They are eager to put some of these innovative ideas into action in their second academic year.

To prepare, Payton spent last spring gathering data from the program’s Canvas course site about how students engaged with different modules and assignments. They reviewed assignments across multiple sections of the course using a common rubric, and they reviewed peer mentor feedback from student participants. Using this information, they made plans to shift a few assignments for the next round of FYE courses and made changes to mentor training. When Payton requested data from previous years to include in their review, Casey replied that he hadn’t analyzed much about the program recently because no one had asked him for the results.

Payton Makes A Worrisome Discovery

Payton is quite popular with the peer mentors in the FYE seminars. The returning staff members connected more with Payton than with their previous supervisor, Casey, perhaps because Payton is closer in age to the students. During fall training this year, returning student staff members shared that Casey did not really monitor their facilitation of the seminar, which put Payton in an uncomfortable position given that Casey is now Payton’s supervisor, too.

Since then, Payton has discovered considerable variation between sections of the FYE seminar. First, a few of the returning peer mentors felt empowered to drop activities and lessons they didn’t like, so some first-year students were never exposed to all components of the curriculum. Second, a few returning mentors regularly let their classes out early. Third, Payton learned that some peer mentors were not tracking attendance even though attendance counted toward students’ grades. Finally, some mentors were offering guidance counter to the program’s intended goals. For example, students in one section were told that they shouldn’t sign up for the student involvement platform because “no one ever uses it,” and students in another section were told that the late-night events hosted by the Division of Student Affairs were “not worth going to.”

Payton finds this information frustrating, and is equally frustrated that the Canvas data they reviewed gave no hint of these discrepancies; they surfaced only when the student staff brought them up. Payton tried to raise some of these variations with Casey, but Casey brushed them off. Casey thinks there’s just not much that can be done to monitor peer mentors without “micro-managing,” and he prides himself on not being a “helicopter” supervisor.

Positive Results For Payton And Casey’s Program

One day, Rachel, the staff member responsible for programmatic assessment and evaluation in the Division of Student Affairs, emails Casey and Payton a report of some notable retention results for the FYE program. Rachel’s report compares all students who participated in the FYE seminar with all students at BRC who did not. While campus retention was around 76% last academic year, 90% of the students who participated in the FYE seminar were retained. Payton is pleased with this information, but also surprised to learn that Rachel had been analyzing retention for the program Payton is primarily responsible for without Payton’s knowledge.

Later in the week, Casey excitedly tells Payton that divisional leadership is impressed by the positive impact of the FYE program on retention. Casey shares that he is so glad that his 10 years of hard work have paid off and that the Vice President for Student Affairs (VPSA) is considering expanding the program. In fact, the VPSA is considering adding a goal to the division’s strategic plan to make the seminar available to every incoming student.

Payton is concerned. First, while the retention report highlights positive results for the program overall, Payton knows that the peer mentors implemented program expectations inconsistently across sections of the course; what actually happened in the seminars may not reflect the program’s intended design. Second, Rachel’s analysis is an aggregate comparison of all students, and Payton worries that many of the students who sign up for the FYE program already arrive equipped with the skills necessary to persist into their sophomore year. Deep down, Payton wonders whether the program’s impact on retention rates is muddier than Rachel and Casey seem to think.
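
Payton’s self-selection worry can be made concrete with a minimal simulation. The sketch below uses entirely hypothetical numbers (not BRC data) and assumes the seminar has no effect on retention at all; well-prepared students are simply more likely both to enroll and to persist:

    # Minimal sketch of self-selection bias (hypothetical numbers, not BRC data).
    # The seminar has ZERO effect on retention in this simulation; preparedness
    # drives both who enrolls and who is retained.
    import random

    random.seed(1)

    N = 5000  # hypothetical first-year cohort
    students = []
    for _ in range(N):
        prepared = random.random() < 0.50            # half arrive well prepared
        # Prepared students are assumed more likely to opt into the seminar...
        enrolls = random.random() < (0.55 if prepared else 0.15)
        # ...and more likely to be retained, independent of participation.
        retained = random.random() < (0.92 if prepared else 0.65)
        students.append((enrolls, retained))

    def retention_rate(group):
        """Share of a group retained into the second year."""
        return sum(retained for _, retained in group) / len(group)

    fye = [s for s in students if s[0]]
    non_fye = [s for s in students if not s[0]]
    print(f"FYE participants retained:  {retention_rate(fye):.0%}")    # ~86%
    print(f"Non-participants retained:  {retention_rate(non_fye):.0%}")  # ~74%

With these assumed parameters, roughly 35% of simulated students enroll and the participant/non-participant gap resembles the one in Rachel’s report, even though the program does nothing. This is why an aggregate comparison alone cannot separate program impact from self-selection.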

Payton has just received an invitation to attend a meeting with Rachel, Casey, and the VPSA to discuss the future of the program.

Discussion Questions

  1. What factors are influencing the culture around assessment in Payton’s division?
  2. How should Payton approach sharing their concerns about variation across the seminars? About assessment methods? What politics and ethics should they consider when determining how they will approach their meeting with Rachel and Casey?
  3. What questions should Payton ask Rachel about the aggregate retention analysis? How can Rachel improve how she analyzes, reports on, and communicates about retention of divisional programs?
  4. What additional assessment methods or analysis strategies are necessary in order for the VPSA to make a decision about expanding the FYE seminar program?
  5. How can Payton shift how they monitor the FYE seminars to improve implementation fidelity without damaging their positive relationships with staff members in their second year on the job?

Author Bio

Emily Braught (she/her) is a doctoral candidate at Indiana University and serves as the Director of Assessment and Planning in the Division of Student Affairs at Indiana University Indianapolis. As a scholar, she is interested in how institutional assessment influences decision-making. As a practitioner, she is passionate about increasing staff capacity for inquiry.