Series: Views of Assessment (Part II)


The Commission for Assessment and Evaluation (CAE) is pleased to sponsor this “Views of Assessment” series. Focusing on the experiences of student affairs educators working with assessment, the series highlights reflections from practitioners at different levels in their careers: graduate student, new professional, mid-level, and senior student affairs officer (SSAO). Each article offers rich narratives, personal experiences, and professional examples, as well as instructive wisdom and advice related to assessment practices and implementation.

Everyday Assessment for New Professionals

Maureen Flint
The University of Alabama

John Tilley
Clemson University

Introduction

For new professionals, assessment is a buzzword often heard but rarely understood. National Survey of Student Engagement (NSSE) data, institutional reports, and CAS standards are all useful and important assessment tools, but they can be difficult to use on a day-to-day basis for entry-level, graduate, and even mid-management professionals. We are tasked with creating, implementing, and applying the results of assessment within our functional areas, often with little idea of where to begin.

What does assessment look like on a day-to-day basis for the new professional? Today, assessment is no longer relegated to the comprehensive annual report or campus-wide survey, but is embedded in our culture, shaping what we do and why we do it. ACPA and NASPA (2015) describe the Assessment, Evaluation, and Research (AER) competency area for student affairs professionals as “the ability to design, conduct, critique, and use various AER methodologies and the results obtained from them, to utilize AER processes and their results to inform practice” (p. 12). This competency area reflects the increasingly data-driven decisions and outcomes-based processes used to measure and validate our work as student affairs professionals. Bresciani (2011) reflects that “assessment begins with simply wondering whether what you do all day is contributing to what you hope your efforts can accomplish” (p. 1). How we approach assessment as new professionals can be creative, reflexive, and authentic, staying true to the purpose of assessment as a tool for improvement, growth, and development.

In this article, we discuss simple yet effective strategies new professionals can use to implement assessment tools and apply results to be more intentional in their everyday practice. Drawing on our experiences as a hall director and as a coordinator for training and development in residence life, we share our stories of incorporating assessment into our everyday work.


Strategies for Everyday Assessment

The first foundational outcome of the AER competency emphasizes the importance of the student affairs professional’s ability to “differentiate among assessment, program review, evaluation, planning, and research as well as the methods appropriate to each” (ACPA & NASPA, 2015, p. 20). We agree that it is important to recognize the difference between these methods, particularly the distinction between assessment, or the process of gathering, analyzing, and interpreting evidence, and evaluation, or the process of applying assessment evidence to increase effectiveness (Upcraft & Schuh, 1996). Yet, as Bresciani, Gardner, and Hickmott (2009) note, many professionals neglect the full picture, focusing on data collection and analysis and failing to engage in evaluation or strategic planning by communicating and applying the results of their assessment. Considering this, we move forward with a holistic definition of assessment as a way of thinking: an interconnected and iterative process of asking questions, seeking evidence, and applying results.

We examine four overarching questions for new professionals to ask when starting their assessment process. First, what do you want to know, and why does it matter? How does your assessment connect to the bigger picture, within the mission and vision of your department, division, or institution? Second, what information do you already have at your disposal that could help answer your questions? Third, who do you ask, and how do you ask it? Finally, how do you share what you have learned?


What Do You Want To Know, and Why Does It Matter?

At first glance, asking, “What do you want to know?” seems like an obvious question; it forms the foundation for each subsequent step in the assessment process. Asking what you want to know helps guide how you ask questions, who you ask, and how you share what you have learned. For a new professional incorporating assessment into everyday work, deconstructing what you want to know can move you toward an authentic space where you can engage with assumptions and relationships of power and privilege in your practice, moving past a static “question and answer” toward a cyclical and iterative process of assessment. Pillow (2003) describes this process of looking back as reflexivity, “an on-going self-awareness…which aids in making visible the practice and construction of knowledge within research in order to produce more accurate analysis” (p. 178).

So, what do you want to know? One way to begin pulling this apart is to ask why and how. For example, if you are interested in knowing whether a training workshop was successful, you could ask, “How would I know if it was successful?” or “What would it look like if this was successful?” As a coordinator for training and development within a large residence life program employing over 250 student staff, Maureen found that asking these kinds of questions helped her understand that, in her department, a successful training was not only about the knowledge gained, but also about student staff members’ self-efficacy, or belief in their own abilities, and their feeling of connectedness to the department as a whole. Understanding this helped Maureen see that how student staff experienced training had larger implications for the department and the culture being created in the residence halls.

When asking what you want to know, it is also important to consider the stakeholders involved. This may come from an angle of social justice and inclusion, such as including the voices of underrepresented students or populations, or from a political perspective. Culp and Dungy (2012) noted that “the culture of evidence in student affairs must be tied to the institution’s culture of evidence” (p. 6). Likewise, Edwards and Gardner (2010) discussed the importance of nesting a residential curriculum assessment within the context of an institutional, divisional, and departmental mission and vision. Considering how what you want to know fits within your institution’s, division’s, or department’s mission or vision can not only clarify how you move forward with your assessment but also open an avenue for sharing what you learn with a broader audience.

Understanding why what you are asking matters often includes taking a social justice lens to how you think about what you want to know. Questions to consider include, “Whose needs are not being met by this training or program?” and “What invisible populations am I overlooking?” In Maureen’s first year as a coordinator of training, she and her colleagues received strong feedback from a segment of the student staff population who felt underserved by the department’s trainings. Freshman Advisors (FAs) are student staff who focus on programming and relationships but do not serve on call or perform administrative tasks. In their training feedback, many of the FAs expressed feelings of exclusion from a training that was largely focused on the resident advisor (RA) staff. Even the training hashtag the department had created, one FA pointed out, referenced the RA staff, leaving out the FAs. As professional staff looked to future trainings, asking, “How is this training including or excluding student staff roles?” helped increase FA staff members’ feelings of inclusion in subsequent years.


What Information Do You Already Have?

As entry-level professionals, we have a wealth of information at our disposal. Students coming to college are recorded and categorized in a number of ways: applications for admission, housing or meal plan contracts, and learning management systems, among others. Although some of this data may be restricted depending on your institution, there are other options to consider. Partners in other offices, such as student involvement, first-year experience, or housing, may capture important information through their respective programs or surveys that could be shared across functional areas. In addition, many universities participate in multi-institutional surveys through EBI-MapWorks/Skyfactor, which provide data in relation to peer institutions. Even something as simple as a welcome survey from a resident assistant can provide useful foundational information and help guide future assessment.

In addition to the captured data that exist in various locations across the university, there is also information that may be less tangible. A university application for an incoming student cannot capture an overarching trend present in their residence hall environment or a deficiency in their student organization, and an official survey can shed only so much light on students’ daily lived experiences. However, this information often exists within our reach in the form of personal messages and candid comments on social media. For example, as a current employee at Clemson University, John knows from recent campus climate surveys that students of color, especially Black students, often feel excluded on campus. By following the hashtag #BeingBlackAtClemson, John can gain a better understanding of the real-life struggles and successes of these students, beyond the climate survey. By following key accounts, staying up to date on popular hashtags, and just checking to see what’s out there, new professionals can use Facebook, Twitter, Instagram, and Snapchat to get a glimpse of students’ narratives, language, and experiences, filling in some of the gaps left by official survey instruments.
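
As a minimal sketch of what this can look like in practice (written in Python, assuming a hypothetical CSV export of public posts from a social listening tool rather than any particular platform’s API; the file and column names are invented for illustration), even a simple filter can surface student voices around a hashtag:

```python
import pandas as pd

# Hypothetical export of public posts from a social listening tool,
# with columns: date, author, text. File and column names are invented.
posts = pd.read_csv("campus_posts.csv", parse_dates=["date"])

# Keep only posts that mention the hashtag, ignoring case.
tagged = posts[posts["text"].str.contains("#BeingBlackAtClemson", case=False, na=False)]

# Skim the ten most recent posts to hear students' own language.
for text in tagged.sort_values("date", ascending=False)["text"].head(10):
    print(text)
    print("---")
```

Reading even a handful of posts this way pairs the quantitative climate data with students’ own words.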

Knowing what information you already have can also simplify what questions you ask, while adding context to your results. For example, when we hire student staff, we ask a series of demographic questions including gender and birth date, as well as role-specific information such as how many years they have worked for us, what building they work in, and their major. Since we have an inventory of this information categorized by student ID number, when conducting future assessments or gathering feedback about trainings or programs, we can leave these descriptive questions off and add the details back in during analysis, after the survey has been administered.
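
As a minimal sketch of how this works (in Python with pandas; the file and column names below are hypothetical), descriptive details from a staff inventory can be joined to survey responses by student ID during analysis:

```python
import pandas as pd

# Hypothetical files: a staff inventory built at hiring time and responses
# from a short post-training survey that asked only for a student ID number.
inventory = pd.read_csv("staff_inventory.csv")  # student_id, gender, role, building, years_on_staff, major
responses = pd.read_csv("training_survey.csv")  # student_id, q1_learning, q2_connectedness

# Join the descriptive details back in by ID, so the survey itself never
# has to re-ask questions the department already has answers to.
merged = responses.merge(inventory, on="student_id", how="left")

# Example: compare self-reported connectedness across staff roles.
print(merged.groupby("role")["q2_connectedness"].mean())
```

Because the merge reattaches identities to responses, treat the merged file with the same care as any other student record.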


Who Do You Ask, and How Do You Ask the Question?

Knowing who to ask and how to ask is often the point where new professionals feel out of their element in conducting assessment. The best advice we can give for practicing everyday assessment is to keep it simple and short. The new professional interested in developing more sophisticated methods can find entire texts dedicated to the creation of assessment questions, survey techniques, and methodologies (Alreck & Settle, 2004; Bresciani, Gardner, & Hickmott, 2009; Culp & Dungy, 2012; DeVellis, 2012; Schuh et al., 2009). We synthesize these texts into three suggestions that can serve as a foundation for more sophisticated methods.

First, only ask what you need to know. Earlier we discussed taking inventory of the information you already have; refer to that inventory to see what you already know about the group you are surveying. An additional piece of this is thinking reflexively about why you are asking for information. Is it necessary to ask about gender, race, or sexual orientation in an assessment of a recent hall program? The impulse to ask demographic questions is embedded in our assumptions about research and, often, assessment. Consider your purpose in asking these questions and how asking (or not asking) about personal or social identities can be marginalizing.

A second tip builds on the first: ask a question first, and make it an easy one. An example is asking, “What did you learn from the program?” as students leave an event. Alreck and Settle (2004) suggest sandwiching sensitive or difficult questions in the middle of a survey and leaving descriptive information until the end. This ordering serves a threefold purpose: building rapport before asking difficult questions, investing students in the assessment by asking for a response, and front-loading your assessment with the information most useful to you (Alreck & Settle, 2004).

A final consideration is not only to keep it simple, but to keep it short, both in the way you ask your questions and in the assessment itself. Schuh and associates (2009) note that “shorter questions maintain respondents’ attention and are less likely to create confusion” (p. 117), and this holds true whether you are conducting a multi-part program survey or checking in about a recent program at the start of a staff meeting. If you plan to implement a larger assessment, consider pre-testing or piloting your questions. You could have RAs or student workers take the assessment, or workshop possible questions with them, before administering it to a larger group.

As an entry-level hall director, John frequently seeks ways to address the professional and developmental needs of the resident assistant staff with whom he works. One example of a simple assessment he used is the “sticky note” activity. Following an afternoon of conference-style, “choose your own adventure” sessions during resident assistant training, John led his staff in an activity where each staff member was given a sticky note, asked to briefly reflect on what they took away from one of the sessions, and then asked to stick the note to the wall. Afterwards, John led a group discussion to shed more light on the responses and provide an opportunity to elaborate on individual answers. This not only produced detailed feedback on the preceding sessions but also provided a jumping-off point for developing staff-specific training and development activities for the semester. Compared to the larger narrative of assessment in higher education, such a simple activity might not seem like “assessment” in the buzzword sense, but the data gathered here led to a more effective resident assistant staff, which in turn advances the mission of the department and the university itself.


How Do You Share What You’ve Learned?

As we have discussed, collecting information is only part of the assessment process. The next step is applying and sharing the evidence you have gathered to improve processes and programs. How you share your findings, and with whom, can influence how they are perceived and ultimately implemented (Schuh et al., 2009). Once you have collected your data, it is often helpful to return to the first step, where you asked, “What do I want to know, and why does it matter?”

At this stage it is tempting to become self-congratulatory, focusing on the great things you learned, or to stagnate, producing a list of descriptive statistics (Erwin, 1991). When creating a report or communicating what you learned, consider your audience. When sharing information with institutional stakeholders, such as your division or department head, consider how what you learned fits within the mission of the organization, and focus on action steps to stimulate conversation and change (Schuh et al., 2009). This is another place to lean into reflexivity, considering who benefits from the results of the assessment and how what you learned can start conversations about justice, or injustice, in your programs and services.

One group that is often forgotten at this step is the students you surveyed. Consider how to share the results of your assessment with them. Perhaps more so than numbers themselves, visual representations of data tend to be powerful tools for presenting evidence and illustrating concepts. This could look like creating and sharing a word cloud of responses to a minute-reflection or Poll Everywhere question, keeping a butcher-paper brainstorm on the wall in a common area or office space, or sending out an infographic with survey results in a newsletter.
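
For instance, a word cloud of open-ended responses takes only a few lines (a sketch in Python, assuming the third-party wordcloud and matplotlib packages are installed; the sample responses are invented):

```python
import matplotlib.pyplot as plt
from wordcloud import WordCloud

# Invented sample of one-minute reflection responses from a program.
responses = [
    "I learned how to start a roommate conversation",
    "steps for mediating a roommate conflict",
    "how to refer a resident to the counseling center",
]

# Combine the responses into one text blob and render the cloud.
cloud = WordCloud(width=800, height=400, background_color="white").generate(" ".join(responses))

plt.imshow(cloud, interpolation="bilinear")
plt.axis("off")
plt.show()
```

Word clouds weight frequent terms, so they work best with short, free-text responses rather than long narratives.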

As a training coordinator, Maureen shared what she learned from training assessments by conducting informal focus groups with community staffs across campus. After each monthly in-service, she would bring donuts to the weekly staff meeting of the team with the highest response rate on the survey. She used the time with student staff to ask broader questions about training as a whole, such as, “What are we doing well in our trainings?” and “What can we improve on?” These informal focus groups closed the feedback loop by creating time to communicate what professional staff had learned from assessment while giving student staff a voice and a space to offer feedback in a more personal way. It is especially important at this step to treat what you share not as a finished product, but as the start of a new inquiry. What new questions did your assessment raise that you want to pursue? What areas of improvement did you identify that you can assess in the future?


Conclusion

Although we are living in an age of assessment, where data-driven decisions and accountability are increasingly important, entry-level professionals often fall into the trap of thinking that assessment does not apply to their world. We close with five key ideas to keep in mind when conducting your own assessment:

  1. Connect what you are doing to the big picture. Whether it is nested in your institution’s mission or vision, connected to a broader “why” in the narrative of student affairs, or answering a local question about your program, building, staff, or area, know why what you are asking matters.
  2. Only ask what you need to know. Whether it is information you already have or information that is not relevant to your question, respect your students and their experiences by asking only the questions to which you need answers.
  3. Involve students in the process of assessment. Incorporate focus groups into your staff meeting time, run possible questions by your student employees, or ask them for qualitative feedback to guide what you want to know.
  4. Keep it simple, and have realistic expectations of what you can accomplish with assessment. It takes more than one committed and passionate entry-level professional to change a culture.
  5. Close the feedback loop. Share your results with key stakeholders, including students as well as campus partners and divisional leaders, and set action steps to guide your next assessment.

Successfully incorporating assessment into your everyday work is not about knowing all the methodologies or theories or being an SPSS whiz. The core of assessment is curiosity and a desire to better understand the spaces, experiences, and perspectives around you. When you begin to practice everyday assessment, you start a process of inquiry that can help you better understand your context and the students you serve.

Discussion Questions

  1. As a new professional, what questions do you want to answer about your functional area?
  2. How can you demonstrate reflexivity in the way you practice assessment?
  3. What information do you already have that you can use to inform your assessment?


References

ACPA – College Student Educators International & NASPA – Student Affairs Administrators in Higher Education. (2015). Professional competency areas for student affairs educators. Washington, DC: Authors.

Alreck, P., & Settle, R. (2004). The survey research handbook (3rd ed.). New York, NY: McGraw-Hill/Irwin.

Bresciani, M. (2011). Making assessment meaningful: What new student affairs professionals and those new to assessment need to know. National Institute for Learning Outcomes Assessment. Retrieved from http://learningoutcomesassessment.org/assessmentbriefs.htm

Bresciani, M., Gardner, M., & Hickmott, J. (2009). Demonstrating student success: A practical guide to outcomes-based assessment of learning and development in student affairs. Sterling, VA: Stylus.

Culp, M., & Dungy, G. (2012). Creating a culture of evidence in student affairs: A guide for leaders and practitioners. Washington, DC: NASPA – Student Affairs Administrators in Higher Education.

DeVellis, R. (2012). Scale development: Theory and applications (3rd ed.). Los Angeles, CA: Sage.

Edwards, K. E., & Gardner, K. (2010, October 28). What is a residential curriculum? [PowerPoint slides]. Plenary session presented at the 2010 Residential Curriculum Institute, St. Paul, MN.

Erwin, T. D. (1991). Assessing student learning and development: A guide to the principles, goals, and methods of determining college outcomes. San Francisco, CA: Jossey-Bass.

Pillow, W. (2003). Confession, catharsis, or cure? Rethinking the uses of reflexivity as methodological power in qualitative research. International Journal of Qualitative Studies in Education, 16(2), 175–196.

Schuh, J. H., & Associates. (2009). Assessment methods for student affairs. San Francisco, CA: Jossey-Bass.

Upcraft, M. L., & Schuh, J. H. (1996). Assessment in student affairs: A guide for practitioners. San Francisco, CA: Jossey-Bass.


About the Authors

Maureen Flint is a PhD candidate in Educational Research with a focus in Qualitative Methodologies at The University of Alabama. She has worked in a variety of professional capacities in student life, including housing, intercultural engagement, and student unions. In her role as the Coordinator for Training and Professional Development in Housing and Residential Communities at The University of Alabama, she oversaw the ongoing training and professional development of 250+ undergraduate student staff members and 26 graduate assistants. Maureen holds an M.A. in Higher Education Administration from The University of Alabama, where she worked as a Graduate Community Director.

John Tilley serves as a Community Director for University Housing & Dining at Clemson University. He oversees a residential community of approximately 690 students and supervises a staff of 21 Resident Assistants (RAs) and two Graduate Community Directors. John earned his M.A. in Higher Education Administration from The University of Alabama, where he worked as a Graduate Community Director.

Please e-mail inquiries to Maureen Flint or John Tilley.


Disclaimer

The ideas expressed in this article are not necessarily those of the Developments editorial board or those of ACPA members or the ACPA Governing Board, Leadership, or International Office Staff.
