Introduction and Discussion Questions to Part II

Matthew Fifolt
Commission for Assessment and Evaluation
Kimberly Kline
Commission for Assessment and Evaluation

In the second half of our two-part series on assessment in student affairs, authors once again provide best-practice and evidence-based strategies for assessing student learning outcomes in functional units. In summer 2012, Kim Yousey-Elsener and Stella Antic offered promising practices for assessing student learning in student activities, and Amanda Knerr and Jennifer Wright discussed ways in which residence life can support and enhance the formal academic curriculum through intentional co-curricular learning activities. For this issue, we explore assessment in the areas of career services and student conduct.

In the first article of Part II of this series, Jessica Turos and Patrick Roberts juxtapose the concept of outcomes-based assessment in career services with reports that are historically requested by this unit, namely demographic, satisfaction, and needs data. The authors highlight practical strategies that demonstrate both direct and indirect student learning and promote students’ continued career success.

In the second article, Kyle Tschepikow and Jeremy Inabinet explore opportunities related to assessing learning outcomes in student conduct programs. The authors describe competencies that promote student learning and development throughout the conduct process and identify strategies, resources, and tools that support professionals assessing conduct offices and their programs.

As assessment professionals and scholars, we hope these essays will provide readers with new ideas and starting points for conversations about assessment needs. We believe these promising practices are components of comprehensive, participatory assessment plans. Backed by professional literature, we are confident that building a culture of assessment in student affairs requires individuals to envision a system that transcends unit-specific boundaries.

Discussion Questions

As you read these two articles, we encourage you to consider the following questions specific to these two functional areas of student affairs:

  • What types of evidence would support the finding that learning occurred through a student’s involvement with career services or student conduct?
  • In what ways might community expectations be expressed in learning outcomes for student conduct?
  • How can a shift towards outcome measures alleviate some of the pressure that career services experiences for placement data?

Conclusion

A learning-centered approach to the assessment of student learning outcomes requires leadership and a vision for bridging the gap between curricular and co-curricular activities. It calls student affairs and assessment professionals to deliberately plan and assess programs and services so that our outcomes both resonate with academia and support the educational mission of the institution. Finally, a learning-centered approach to student affairs challenges us to redefine our roles from administrators to educators in order to remain relevant on our campuses and competitive in a world of expanding educational options.

Disclaimer

The ideas expressed in this article are not necessarily those of the Developments editorial board or those of ACPA members or the ACPA Governing Board, Leadership, or International Office Staff.

Assessment in Student Conduct Programs: Strategies, Resources, and Tools

Kyle Tschepikow
University of Georgia
Jeremy W. Inabinet
University of Georgia

Most student affairs professionals today would agree that the principal aim of conduct administration is to educate students (Tschepikow, Cooper, & Dean, 2010; Waryold & Lancaster, 2008; Zacker, 1996). In fact, in their exposition on the professional philosophy of this functional unit, Waryold and Lancaster (2008) summarized, “the fundamental purpose of student conduct work is to promote growth and development in students while protecting the interests of the larger campus community” (p. 8). While there may be little debate about the educational purpose of student conduct, the extent to which this purpose is fulfilled on some college campuses may be less clear. A high-quality assessment program, built around clear and measurable educational outcomes, can be helpful in addressing this issue. The purpose of this article is to describe strategies, resources, and tools to support the development or enhancement of an assessment program tailored to the educational purpose of student conduct administration.

Strategies to Assess Student Learning and Development

Student conduct administration provides a variety of opportunities to educate students (Waryold & Lancaster, 2008). It is important for practitioners to identify those opportunities within their conduct programs and to define and publicize, in specific terms, the learning and development expected from them. A great place to start is with the process of resolving alleged violations, a fundamental component in most student conduct programs. This process can involve students as learners in different ways. For example, students who serve as hearing officers experience the process differently from those alleged to have violated the code of conduct. In both instances, however, learning and development outcomes can be articulated and assessed to understand the extent to which the process of resolving allegations facilitates the educational mission of the office.

With alleged students, there are opportunities to promote learning and development through intentionally structured sanctions. If the conduct office has stated educational outcomes, sanctions should always be constructed with these in mind. Community service, for example, is a common sanction assigned by conduct offices (Dannells, 1997). In many instances, students found responsible for violations are required to complete a certain number of community service hours and write a reflection on their service experience. This activity is often designed to promote development of social responsibility and civic engagement. The site of the service, the number of hours assigned, and the guidelines concerning the reflective essay should be informed by the office’s learning and development framework.

A student’s essay about his/her service experience can provide rich qualitative data. Professionals can use these data to determine whether the outcomes for the sanction have been achieved. A rubric can provide consistency and structure to this part of the assessment process. If given to the student in advance of the service experience, the rubric can clearly communicate expectations around learning and development, resulting in a more enriching learning experience for the student (Stevens & Levi, 2005). In addition, during conduct meetings administrators can collect information regarding students’ affective development and integration into campus life. With this information in hand, the hearing officer and the student can co-construct sanctions that meet the student’s developmental needs and align with the office’s learning and development framework. Tailoring sanctions in this way creates a learning environment in which students are empowered to construct knowledge for themselves and apply it to complex problems, a key principle in transformative education as defined in Learning Reconsidered (American College Personnel Association & National Association of Student Personnel Administrators, 2004).
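
As a concrete illustration, the sketch below shows how such a rubric might be represented and applied consistently across raters. It is a minimal Python example; the dimensions, levels, and descriptors are invented for illustration, not drawn from any office’s actual framework.

```python
# A minimal sketch of a sanction-reflection rubric as a data structure.
# The dimensions, levels, and descriptors are hypothetical examples,
# not an official framework.

RUBRIC = {
    "social responsibility": {
        1: "Describes the service activity without connecting it to the community.",
        2: "Identifies how the service addressed a community need.",
        3: "Analyzes personal responsibility toward the community and names next steps.",
    },
    "civic engagement": {
        1: "Makes no reference to future involvement.",
        2: "Expresses general interest in future civic involvement.",
        3: "Commits to a specific, realistic form of continued engagement.",
    },
}


def score_reflection(ratings):
    """Validate a rater's per-dimension scores against the rubric and
    summarize them for the office's assessment records."""
    for dimension, level in ratings.items():
        if dimension not in RUBRIC:
            raise ValueError(f"Unknown rubric dimension: {dimension}")
        if level not in RUBRIC[dimension]:
            raise ValueError(f"Invalid level {level} for {dimension}")
    total = sum(ratings.values())
    out_of = sum(max(levels) for levels in RUBRIC.values())
    return {"ratings": ratings, "total": total, "out_of": out_of}


# Example: one hearing officer's ratings for a student's essay.
print(score_reflection({"social responsibility": 3, "civic engagement": 2}))
```

Scoring against an explicit structure like this keeps ratings comparable across hearing officers and across semesters.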

Training for students who serve on hearing boards is another ideal place to implement assessment strategies that promote the educational mission of student conduct programs. Like the sanctioning process, this aspect of conduct administration must begin with the articulation of learning and development outcomes. Lewis and Inabinet (2011) suggested conduct administrators design outcomes centered on the following competencies: the philosophy and history of student conduct, the student conduct process, critical thinking skills, preparing for a hearing, hearing decorum, questioning skills, weighing information, standards of proof, and issues of violence against women. Conduct administrators may decide to include other competency areas that reflect their unique mission, culture, and programming structure. Student board members may play a role in this process as well. All the same, it is paramount for practitioners responsible for this area to shape training curricula, exercises, and resources around intended outcomes. It is also important for administrators to articulate these outcomes to hearing officers in advance of training opportunities. With clearly defined outcomes and competencies, administrators will be able to more easily align training opportunities for student board members with the office’s educational mission, assess the effectiveness of those opportunities in facilitating their learning and development, and elucidate areas of focus for future training sessions.

Resources and Tools

A variety of resources and tools exist to support the implementation of the assessment strategies discussed above. One of the most established resources is the Council for the Advancement of Standards in Higher Education (CAS) Professional Standards (CAS, 2009). The latest version of this text, often referred to as the CAS Blue Book, contains standards and guidelines regarding the articulation and implementation of learning and development outcomes for student conduct programs. As noted earlier, learning and development outcomes play an important role in realizing the educational purpose of conduct administration.

Practitioners interested in defining or refining outcomes for student conduct programs may find benefit from the list of learning and development domains and corresponding dimensions provided in the CAS Blue Book (2009). The list of domains includes intrapersonal competence, humanitarianism, civic engagement, and practical competence—among others typically associated with student conduct programs. More specific dimensions are provided for each domain to assist practitioners in the development of outcomes aligned with the unique structure, mission, and programs of a particular conduct office. Dimensions under intrapersonal competence, for example, include self-understanding, self-respect, and commitment to ethics and integrity. These dimensions of learning and development can form the basis of clear and measurable learning and development outcomes for student training activities, campus outreach programs, sanctions for students, and other interventions designed to fulfill the educational mission of the conduct office.
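
As a small illustration of how dimensions can anchor outcome statements, the sketch below organizes the domain-dimension-outcome relationship as a simple data structure. The domain and dimension names follow the CAS examples above; the outcome wording is hypothetical.

```python
# Sketch: a CAS-style domain -> dimension -> outcome mapping that an office
# might maintain as the basis of its assessment plan. Domain and dimension
# names follow the examples cited above; the outcome wording is hypothetical.
FRAMEWORK = {
    "intrapersonal competence": {
        "self-understanding": (
            "After the conduct meeting, students will identify two personal "
            "values that influenced their decision-making in the incident."
        ),
        "commitment to ethics and integrity": (
            "Students will explain how the code of conduct reflects the "
            "ethical standards of the campus community."
        ),
    },
}

for domain, dimensions in FRAMEWORK.items():
    for dimension, outcome in dimensions.items():
        print(f"{domain} / {dimension}:\n  {outcome}")
```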

Another established resource for conduct administrators that may be helpful in assessing the effectiveness of a program over time is the book Assessment Practice in Student Affairs (Schuh, Upcraft, & Associates, 2001). The value of this text is in its practical orientation and concentration on student affairs programs and services. Conduct administrators looking for a starting place with assessment will find comfort in the broadly applicable step-by-step approach developed by the authors. Schuh, Upcraft, and Associates (2001) organize the assessment process into 11 applied steps that systematically lead the practitioner from defining the purpose of an assessment, to collecting and analyzing data, to reporting and using results to improve practice; these 11 steps can be viewed as a checklist of essential elements to consider in any well-designed plan. Conduct administrators with more developed assessment programs may also find this resource beneficial; professionals can use it as a framework for identifying improvement opportunities in current assessment practices and processes.

Well-developed data collection tools, such as rubrics, questionnaires, and interview protocols, are integral to any assessment program. After all, the information used to determine the effectiveness of any program will only be as good as the instrument used to gather it. As part of the VALUE project, the Association of American Colleges and Universities (AAC&U) enlisted teams of faculty and other academic and student affairs professionals to develop institutional-level rubrics for 15 learning outcomes, many of which are germane to student conduct programs (AAC&U, n.d.). Each rubric includes a definition of the learning area, a glossary of terms used in the rubric, core dimensions of the learning area, and a scale used to measure student performance. Outcomes relevant to student conduct include civic knowledge and engagement, intercultural knowledge and competence, and ethical reasoning. Practitioners working with established assessment programs may find these rubrics to be helpful tools in designing local instruments to measure campus-specific outcomes. Professionals developing new assessment programs may find benefit from these rubrics as roadmaps toward the creation of learning and development outcomes. The rubrics might also help in the construction of data collection instruments.

In addition to AAC&U, other professional organizations can provide assessment support to student conduct administrators. For example, each year ACPA sponsors the Student Affairs Assessment Institute. Participants at this institute can expect exposure to a diverse group of assessment experts who provide instruction on a range of skills and knowledge areas, including outcome development and measurement, focus group facilitation, questionnaire design, quantitative and qualitative data analysis, and benchmarking. The Association for Student Conduct Administration (ASCA) also provides resources to student affairs professionals working in conduct, such as the Donald D. Gehring Academy for Student Conduct Administration. The Gehring Academy is an intensive week-long institute serving the educational needs of conduct administrators at different levels in the field.

Conclusion

A primary function of student conduct programs is to foster learning and development among students. Many conduct offices have affirmed this educational purpose but still have not determined the extent to which it is being fulfilled. A comprehensive assessment plan based on clear and measurable learning and development outcomes is one step toward addressing this issue. An outcomes-based approach to assessment can provide conduct offices with much needed evidence regarding student learning and development. Additionally, this approach can enhance the educational experiences of students who interact with the office by promoting a greater degree of intentionality in program design and administration. Finally, professionals must consider the unique mission, culture, and programming structure of the conduct office for the assessment to be successful.

Discussion Questions

  • What areas of student learning and development are most important to conduct administration on your campus? How are these areas targeted through current programs and services?
  • To what extent are students involved in the design and implementation of your office’s assessment program? What opportunities exist to increase meaningful student involvement?
  • Beyond the conduct office, who is available on campus to support the implementation of strategies to assess student learning and development? What resources and tools are already present on campus?

References

American College Personnel Association & National Association of Student Personnel Administrators. (2004). Learning reconsidered: A campus-wide focus on the student experience. Washington, DC: Authors. Retrieved from http://www.myacpa.org

Association of American Colleges and Universities. (n.d.). Project description [Web site]. Retrieved from http://www.aacu.org/value/project_description.cfm

Council for the Advancement of Standards in Higher Education. (2009). CAS professional standards for higher education (7th ed.). Washington, DC: Author.

Dannells, M. (1997). From discipline to development: Rethinking student conduct in higher education. Washington, DC: Association for the Study of Higher Education.

Lewis, W. S., & Inabinet, J. W. (2011, July). Training student conduct boards: Selection, marketing, and competency-based training. Lecture presented at the Donald D. Gehring Academy for Student Conduct Administration, Louisville, KY.

Schuh, J. H., Upcraft, M. L., & Associates (2001). Assessment practice in student affairs: An applications manual. San Francisco, CA: Jossey-Bass.

Stevens, D. D., & Levi, A. J. (2005). Introduction to rubrics: An assessment tool to save grading time, convey effective feedback, and promote student learning. Sterling, VA: Stylus.

Tschepikow, K., Cooper, D. L., & Dean, L. A. (2010). Effects of CAS standards on assessment outcomes in student conduct programs. Journal of Student Conduct Administration, 3(1), 6-24.

Waryold, D. M., & Lancaster, J. M. (2008). The professional philosophy of student conduct administration. In J. M. Lancaster, D. M. Waryold, & L. Timm (Eds.), Student conduct practice: The complete guide for student affairs professionals (pp. 6-13). Sterling, VA: Stylus.

Zacker, J. (1996). Evaluation in judicial affairs. In W. Mercer (Ed.), Critical issues in judicial affairs. New Directions for Student Services, no. 73 (pp. 99-106). San Francisco, CA: Jossey-Bass.

About the Authors

Kyle Tschepikow is currently the director of student affairs assessment and staff development at the University of Georgia. He previously served as the director of residence life and chief judicial affairs officer at the University of Charleston in West Virginia. He holds a BA and MA in English literature and a PhD in higher education from the University of Georgia.

Please e-mail inquiries to Kyle Tschepikow.

Jeremy Inabinet is currently pursuing a PhD in College Student Affairs Administration at the University of Georgia. He also serves as a doctoral intern in the department of Student Affairs Assessment at Georgia. Previously, Jeremy served as the assistant dean of students and chief student conduct administrator at Loyola University Chicago. He holds a bachelor’s degree in mass communications and theater and a master’s degree in education.

Disclaimer

The ideas expressed in this article are not necessarily those of the Developments editorial board or those of ACPA members or the ACPA Governing Board, Leadership, or International Office Staff.

Introduction and Discussion Questions to Part I: Student Activities and Residence Life

Matthew Fifolt
Commission for Assessment and Evaluation
Kimberly Kline
Commission for Assessment and Evaluation

For decades, scholars and practitioners in the field of higher education have repeatedly communicated the value and importance of student learning outcomes (Bresciani, Moore Gardner, & Hickmott, 2009; Erwin, 1991; Schuh & Upcraft, 2001), yet our experiences tell us that many student affairs professionals continue to report program outcomes (e.g., student satisfaction, headcounts) as primary evidence of success. Program outcomes, while important, are not sufficient (Getty, Young, & Whitaker-Lea, 2008; Westerberg & Roberts, 2011). To remain vital in today’s tough economic times, student affairs professionals must demonstrate intentional programming that is consistent with institutional goals for undergraduate learning and development (Green, Jones, & Aloi, 2008; Pike, Kuh, McCormick, Ethington, & Smart, 2011).

Why is student affairs so slow to respond? Many colleagues tell us they lack the practical tools for implementing a new assessment strategy. Others have expressed difficulty in translating assessment techniques across departments and units. The goal of this series is to provide road-tested and proven strategies for the assessment of student learning outcomes in functional areas of student affairs, specifically (a) student activities, (b) residence life, (c) career services, and (d) student conduct. Part I of this series will focus on student activities and residence life. Part II, scheduled to be published in the next issue of Developments, will feature career services and student conduct.

In the first article, Kim Yousey-Elsener and Stella Antic offer promising practices for assessing student learning in student activities. The authors provide a compelling rationale for developing an assessment plan and outline specific steps for completing an assessment cycle in student activities.

In the second article, Amanda Knerr and Jennifer Wright discuss the ways in which residence life can support and enhance the formal academic curriculum through intentional co-curricular learning activities. The authors demonstrate how the practical application of classroom assessment techniques can enhance residence life programming for students and improve real-time data collection by residence life staff members.

As assessment professionals and scholars, we hope that these essays will provide you with new ideas and starting points for conversation about assessment needs. We feel compelled to note, however, that these promising practices are components of comprehensive, participatory assessment plans. Backed by the professional literature, we strongly believe that building a culture of assessment in student affairs requires individuals to envision a system that transcends unit-specific boundaries.

Discussion Questions

As you read these two articles, we would encourage you to consider the following questions specific to these two functional areas of student affairs:

  • How might the learning outcomes of student activities and residence life reinforce one another?
  • Are there situations in which the learning outcomes of these two areas might be in conflict with one another?
  • What types of evidence would support the finding that learning occurred through student participation in programs sponsored by student activities or residence life?

Big Picture

While beyond the scope of this series, there are a number of excellent resources that can help student affairs professionals build a comprehensive assessment plan. Chief among them are:

  • Learning reconsidered: A campus-wide focus on the student experience (National Association of Student Personnel Administrators & American College Personnel Association, 2004)
  • Frameworks for assessing learning and development outcomes (Council for the Advancement of Standards in Higher Education, 2006)
  • Demonstrating student success: A practical guide to outcomes-based assessment of learning and development in student affairs (Bresciani, Gardner, & Hickmott, 2009)
  • Assessment Skills and Knowledge (ASK) content standards for student affairs practitioners and scholars (American College Personnel Association, 2007)

For individuals interested in learning more about assessment and the role that student affairs can play in ensuring institutional accountability, we would also recommend the following report:

  • The data-driven student affairs enterprise: Strategies and best practices for instilling a culture of accountability (Education Advisory Board, 2009)

Conclusion

A learning-centered approach to the assessment of student learning and development outcomes requires leadership and a vision for bridging the gap between curricular and co-curricular activities. It calls us to deliberately plan and assess programs and services so that our outcomes both resonate with academia and support the educational mission of the institution. Finally, a learning-centered approach to student affairs challenges us to redefine our roles, from administrators to educators, in order to remain relevant on our campuses and competitive in an ever-expanding world of educational options.

References

Bresciani, M. J., Gardner, M. M., & Hickmott, J. (2009). Demonstrating student success: A practical guide to outcomes-based assessment of learning and development in student affairs. Sterling, VA: Stylus Publishing.

Education Advisory Board (2009). The data-driven student affairs enterprise: Strategies and best practices for instilling a culture of accountability. Washington, DC: The Advisory Board Company.

Erwin, T. D. (1991). Assessing student learning and development. San Francisco: Jossey-Bass.

Getty, L. J., Young, D. Y., & Whitaker-Lea, L. D. (2008, May/June). Casting the assessment net wide: Capturing all student learning. About Campus, 10-16. doi:10.1002/abc.247

Green, A. S., Jones, E., & Aloi, S. (2008). An exploration of high-quality student affairs learning outcomes assessment practices. NASPA Journal, 45(1), 133-157.

Keeling, R. P. (Ed.). (2004). Learning reconsidered: A campus-wide focus on the student experience. Washington, DC: National Association of Student Personnel Administrators & American College Personnel Association.

Pike, G. R., Kuh, G. D., McCormick, A. C., Ethington, C. A., & Smart, J. C. (2011). If and when money matters: The relationships among educational expenditures, student engagement and students’ learning outcomes. Research in Higher Education, 52(1), 81-106.

Schuh, J. H., & Upcraft, M. L. (2001). Assessment practice in student affairs: An applications manual. San Francisco, CA: Jossey-Bass.

Strayhorn, T., Creamer, D. G., Miller, T., & Arminio, J. (2006). Frameworks for assessing learning and development outcomes. Washington, DC: Council for the Advancement of Standards in Higher Education.

Westerberg, S., & Roberts, N. (2010-2011). Soaring or snoring: Energizing colleagues in student affairs about learning outcomes and assessments (Parts I-III). NetResults: Critical Issues for Student Affairs Practitioners.

Disclaimer

The ideas expressed in this article are not necessarily those of the Developments editorial board or those of ACPA members or the ACPA Governing Board, Leadership, or International Office Staff.

Making Assessment Meaningful: Practical Assessment Techniques for Residential Environments

Amanda R. Knerr
Pennsylvania State University
Jennifer Lenfant Wright
Mount St. Mary’s University

In the last decade, there has been an increased understanding that student affairs units have a shared responsibility with academic affairs units for student learning (Greater Expectations National Panel, 2002; Keeling, 2006), including developing opportunities for substantial out-of-classroom or co-curricular learning that enhances the formal academic curriculum. Residential life, as students’ home away from home, provides the optimal environment in which to engage in co-curricular learning opportunities. It is within residential living environments that students learn skills such as resolving conflict, effectively managing time, understanding how one’s choices affect others in the community, and identifying one’s own values and beliefs and describing how they differ from others’ (Blimling, 2010; Schroeder, Mable, & Associates, 1994). Residential life also creates a unique setting in which to extend the classroom experience because it offers a variety of opportunities for faculty, staff, and students. It affords faculty the ability to offer guest presentations or workshops, students the opportunity to create informal study and discussion groups, and staff the ability to cluster students with similar academic pursuits to enhance the classroom learning experience (Inkelas & Weisman, 2003). The residential learning environment thus provides a rich and complex setting in which to develop intentional learning strategies through a planned residential curriculum model and then to assess those learning outcomes (Kerr & Tweedy, 2006). But how does a residential life program move toward creating intentional learning opportunities, and how does one assess learning in this environment? This article will explain the process of assessing student learning in the residential setting, provide examples of assessment tools and plans currently utilized by universities, and present the foundation for residential student learning assessment.

Mapping the Learning Environment

The first step in assessing student learning in the residential environment is to determine what, specifically, students are expected to learn during their time living on campus. Where will the program focus its attention? What will be the fundamental educational priority for the residential life program at the institution? For some, the priority may focus on issues of social justice or citizenship. For others, the decision may be to focus on issues of respect and responsibility.

The second step is then to create intentional learning outcomes that will guide the actions and activities of the environment. For example, as a result of participating in a specific event, initiative, or activity on your campus, what do you hope students will take away? These outcomes will determine the programs that need to be put into place and provide a definitive framework within which to assess learning. One resource that can assist in the development of learning outcomes and assessment strategies is Frameworks for Assessing Learning and Development Outcomes (Strayhorn, 2006).

Planning the Assessment

Once the educational priority, learning outcomes, and initiatives have been developed for a residential program, the third step is to develop assessment strategies that provide information about the degree to which students living in the residential community have met the intended outcomes. Traditionally, residential programs have used satisfaction assessments to gauge how satisfied students are with their living environment. However, students’ perceptions of what they have learned may not be as accurate as directly observing whether students have met the desired learning outcomes. To directly assess student learning in the residential environment, a residential life professional can utilize traditional classroom assessment techniques, which have long relied on direct assessments to guide understanding of student learning and achievement. Direct assessment techniques require students to demonstrate their knowledge or skills through objective tests and/or performance opportunities. Indirect assessments ask students to reflect on what they perceive to have learned rather than demonstrate their learning (Palomba & Banta, 1999). Residential life staff should move to a more balanced approach between direct and indirect assessment techniques to have a more complete picture of what students have learned from participating in residential life programs and services.

Selecting Appropriate Assessment Tools

Angelo and Cross’s (1993) book on classroom assessment techniques may be particularly useful in creating a more balanced approach to assessment. Their classroom assessment techniques have been found to adequately assess students’ ongoing learning in the classroom. This text provides specific techniques to assess students’ knowledge acquisition, ways of synthesizing information, and skills in applying new knowledge to novel situations. Each assessment technique takes minimal time to prepare, and most assessments can be completed by students in just a few minutes, providing rich assessment information on what students are learning relative to specific learning outcomes.

The one-minute paper is one such classroom assessment technique that is now widely used (Angelo & Cross, 1993). In this assessment, students respond reflectively to a question posed by the educator. Questions could address something new they learned, areas where they still have questions, information that particularly interested them, or how they would respond to a given situation. For example, students may participate in an environmental sustainability program held in the residence hall. After the presentation, the facilitator may ask participants to spend one minute describing on a piece of paper at least two different ways that the participant can reduce his/her carbon footprint while living in the residence halls. The participants can write a short reflection piece that outlines how they might take these steps themselves based on the information presented in the program or initiative. Within five minutes, residential staff members can have concrete examples of what students have learned related to the sustainability outcome developed for the program.

Polleverywhere.com and trivia clickers are two additional tools that can be utilized in assessing learning. These resources allow educators to ask questions throughout a presentation, floor or house meeting, or other event to gauge what students are learning in the session. With polleverywhere.com, questions can be embedded into Microsoft PowerPoint or other presentations. Students can then text in “live” responses to the questions and can comment on each other’s responses. This Web site allows a certain number of participants to answer questions for free; additional responses can be purchased as necessary. Students particularly enjoy the ability to utilize current technology such as social media and texting in learning environments, and the data provide a real-time assessment of whether students are meeting the established learning goals. The downside of using polleverywhere.com is that it limits participation to those who use texting and cell phones. This may eliminate the voice of students who come from lower socio-economic backgrounds or who do not engage in the use of cell phones or texting services.
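
As a sketch of how such live responses might be turned into assessment evidence, suppose a session’s responses have been exported to a CSV file with one row per response. The column names and answer key below are illustrative assumptions, not any vendor’s actual export format.

```python
# Tally exported live-poll responses against an answer key to estimate,
# per question, whether participants are meeting the session's learning
# goals. The CSV columns ("question", "response") and the answer key are
# illustrative assumptions, not a specific vendor's export format.
import csv
from collections import Counter

ANSWER_KEY = {  # hypothetical questions from a sustainability program
    "Which bin takes glass?": "Blue",
    "Does standby power waste energy?": "Yes",
}

def attainment_by_question(csv_path):
    correct, totals = Counter(), Counter()
    with open(csv_path, newline="") as f:
        for row in csv.DictReader(f):
            question = row["question"]
            if question in ANSWER_KEY:
                totals[question] += 1
                correct[question] += row["response"].strip() == ANSWER_KEY[question]
    return {q: correct[q] / totals[q] for q in totals}

# Example usage: print(attainment_by_question("poll_export.csv"))
```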

Trivia clickers are very similar to polleverywhere.com in that they capture real-time responses from participants. Commercial clickers are similar to the types of hand-held devices used in restaurants for trivia games. Educators create a series of questions, and students respond to them in real time using the clickers. The downside to this approach is that there is an initial cost associated with the technology. However, feedback we have received from students thus far indicates that they find clicker technology very engaging, and the real-time data on participants’ understanding of program content are very helpful for assessing progress toward the learning outcomes of the event, program, or initiative.

Another way to assess student learning is through pre- and post-assessment surveys for residential life professional and/or para-professional staff training. One can track the staff’s learning progress as well as the effectiveness of the trainings offered. Assessment surveys can be directly linked to learning outcomes within training sessions throughout the academic year, and comprehension of the assessed learning outcomes reveals areas for improvement. It can be an eye-opening experience to discover a particular training session did not yield the intended knowledge. However, “discovering that programs are not functioning as they intended is not necessarily evidence that the program is ‘bad’; it merely indicates that the learning strategy present in the program is not well-suited to enhancing students’ acquisition of knowledge, values, or abilities” (Keeling, Wall, Underhile & Dungy, 2008, p. 73). This type of assessment has become extremely helpful in planning future training sessions for the para-professional residential life staff and solidifies the co-curricular learning and skills the staff engages in throughout the academic year.
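
A minimal sketch of the comparison step, assuming each training outcome is rated on the same numeric scale before and after a session (the outcome names and ratings are invented):

```python
# Compare mean pre- and post-training ratings per learning outcome to see
# which sessions moved staff knowledge. Outcome names and the 1-5 ratings
# are invented for illustration.
from statistics import mean

pre = {
    "crisis response procedures": [2, 3, 2, 3, 2],
    "community standards": [3, 3, 4, 2, 3],
}
post = {
    "crisis response procedures": [4, 4, 3, 4, 4],
    "community standards": [3, 4, 4, 3, 3],
}

for outcome in pre:
    gain = mean(post[outcome]) - mean(pre[outcome])
    print(f"{outcome}: pre={mean(pre[outcome]):.2f}, "
          f"post={mean(post[outcome]):.2f}, gain={gain:+.2f}")
```

A flat or negative gain flags a session whose learning strategy may need redesign, which, as Keeling et al. (2008) note, does not by itself mean the program is “bad.”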

National assessments such as the Association of College and University Housing Officers-International (ACUHO-I)/Educational Benchmarking, Inc. (EBI) Resident Assessment, the ACUHO-I/EBI Student Staff Assessment, and the ACUHO-I/EBI Apartment Assessment obtain student satisfaction and benchmarking data. The Resident Assessment is based on ACUHO-I and Council for the Advancement of Standards (CAS) professional standards. These assessments can be utilized to show the areas in which a residential life program is excelling and areas for improvement when compared with other institutions. Another national assessment is the NASPA Consortium Benchmarking Assessment, which has both a Residence Life Assessment and a Profile of the College Student Assessment that provide data covering student perceptions and satisfaction among national benchmarks of peer institutions and various other classification categories. These data indicate where the individual institution falls among the benchmarks. While both of these national assessments provide resident satisfaction information, they also provide information on student attainment of co-curricular learning outcomes. By comparing this institutional data with that of similar institutions, nationally and longitudinally, the results may inform a residential life program’s practices and provide stakeholders, such as senior student affairs officers (SSAOs), boards of trustees, and presidents, with valuable information for allocating resources to the program in the future.
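
Once benchmark reports are in hand, the core comparison is straightforward. The sketch below shows one way a program might flag factors sitting above or below its peer group; all factor names and numbers are invented.

```python
# Flag survey factors where the institution sits above or below its peer
# benchmark. Factor names and means are invented for illustration.
factors = {
    # factor: (institutional mean, peer-group mean)
    "Sense of community": (5.4, 5.1),
    "Interactions with staff": (4.8, 5.2),
    "Self-management skills": (5.0, 5.0),
}

for name, (ours, peers) in factors.items():
    diff = ours - peers
    status = "above" if diff > 0 else "below" if diff < 0 else "at"
    print(f"{name}: {ours:.1f} vs. peer {peers:.1f} ({status} benchmark, {diff:+.1f})")
```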

Summary

Overall, residential life programs must engage in an “archeological dig” in which the unit digs deep into its institution’s mission, values, goals, and beliefs to determine how the residential program can support and enhance the institutional mission (Keeling, 2006; Keeling, Wall, Underhile, & Dungy, 2008). Many types of assessment methods exist and can provide various data sets to inform the future goals, practices, and learning outcomes of a residential program. Quantitative methods described above, such as national assessments or polleverywhere.com, are a common practice in higher education and provide four different categories of data: institutional indicators, test and grading data, large survey data, and local survey data (Keeling, Wall, Underhile, & Dungy, 2008). Qualitative methods, such as the one-minute paper or residence life staff training assessments, provide rich and deep information that can inform the development of programs and services. Mixed methods can bring a residential program’s survey data to life by allowing the student stories behind the quantitative data to be discovered. Though the best set of data-gathering approaches will take time to assemble, a mixed methodological approach utilizing direct and indirect methods, and at times novel approaches, is best when advising and supporting co-curricular learning outcomes in residential programs.

Discussion Questions

  1. Why is assessment important to a residential life program? What opportunities or challenges exist and how can assessment help us with these opportunities and challenges?
  2. What areas of student learning are important in the residential setting? How do we know that these learning outcomes are being met?
  3. What assessment tools will assist us in assessing student learning and why?
  4. What is one small assessment project that could assess student learning in our residential environment? What do we need to do to plan this assessment? When do we want to see it completed?

References

Angelo, T. A., & Cross, K. P. (1993). Classroom assessment techniques: A handbook for college teachers (2nd ed.). San Francisco, CA: Jossey-Bass.

Blimling, G. (2010). The resident assistant: Applications and strategies for working with college students in residence halls. Dubuque, IA: Kendall/Hunt Publishing Company.

Greater Expectations National Panel. (2002). Greater expectations: A new vision for learning as a nation goes to college. Washington, DC: Association of American Colleges and Universities. Retrieved from www.aacu.org

Inkelas, K. K., & Weisman, J. L. (2003). Different by design: An examination of student outcomes among participants in three types of living-learning programs. Journal of College Student Development, 44(3), 335-368. doi:10.1353/csd.2003.0027

Keeling, R. P. (Ed.). (2006). Learning reconsidered 2: A practical guide to implementing a campus-wide focus on the student experience. ACPA, ACUHO-I, ACUI, NACA, NACADA, NASPA, NIRSA.

Keeling, R. P., Wall, A. F., Underhile, R., & Dungy, G. J. (2008). Assessment reconsidered: Institutional effectiveness for student success. ICSSIA, NASPA, Keeling and Associates LLC.

Kerr, K. G., & Tweedy, J. (2006). Beyond seat time and student satisfaction: A curricular approach to residential education. About Campus, 11, 9–15. doi:10.1002/abc.181

Knerr, A. R. (2011). Practical approaches to residence life assessment. Webinar, Academic Impressions, Colorado Springs, CO.

Palomba, C. A., & Banta, T. W. (1999). Assessment essentials: Planning, implementing, and improving assessment in higher education. San Francisco, CA: Jossey-Bass.

Schroeder, C. C., Mable, P., & Associates. (1994). Realizing the educational potential of residence halls. San Francisco, CA: Jossey-Bass.

Strayhorn, T. L. (2006). Frameworks for assessing learning and development outcomes. Washington, DC: Council for the Advancement of Standards in Higher Education.

Upcraft, M. L., & Schuh, J. H. (2001). Assessment in student affairs: An applications manual. San Francisco, CA: Jossey-Bass.

Other Resources

Free Assessment Webinars provided by CampusLabs:
https://www.studentvoice.com/app/Training/WebinarsFall11.aspx

Commission for Assessment and Evaluation’s (CAE) Wiki site:
http://acpacommissiononassessment.pbworks.com/w/page/26996026/FrontPage

ACPA’s ASK Standards booklet:
http://www2.myacpa.org/publications/internal-publications

Author Information

Amanda R. Knerr is the senior associate director of residence life at The Pennsylvania State University, where she oversees the unit’s assessment program. Amanda has been involved in ACPA in a number of ways, including serving on planning committees for the Assessment Institute, the Residential Curriculum Institute, and the Institute on Sustainability, and she has been a member of the ACPA Commission for Assessment and Evaluation Directorate for the last five years.

Please e-mail inquiries to Amanda R. Knerr.

Jennifer Lenfant Wright is the associate director for housing operations and student affairs assessment at Mount St. Mary’s University, where she oversees the housing operations for the campus and the student affairs assessment program. Jennifer is completing her three year term on the ACPA Standing Committee for Women Directorate, continuing her first year on the ACPA Commission for Assessment and Evaluation Directorate and is excited to be part of the 2013 ACPA Convention Team.

Please e-mail inquiries to Jennifer Lenfant Wright.

Disclaimer

The ideas expressed in this article are not necessarily those of the Developments editorial board or those of ACPA members or the ACPA Governing Board, Leadership, or International Office Staff.

Promising Practices in Assessing Learning in Student Activities

Kim Yousey-Elsener
Campus Labs
Stella Mulberry Antic
University of North Texas

While publications such as Learning Reconsidered 2 (Keeling, 2006) and the CAS Learning Domains (Council for the Advancement of Standards, 2011) have set the standards for assessing student learning in campus activities, actually assessing student learning in co-curricular experiences remains a time-consuming and elusive effort for many campus activities professionals. Bresciani (2011) and Schuh and Upcraft (2001) assert that purposeful, ongoing assessment of learning and administrative outcomes is critical to program improvement and success. Assessing student learning is difficult for many reasons: learning is “messy” and not linear for our students, so finding an appropriate way to capture that learning for assessment purposes can be a challenge. In addition, student activities programs and services are often created for purposes beyond student learning, such as community building, connecting students to the institution, and educating students regarding campus traditions and ethos. While these are all important purposes, they do not solely serve the objective of student learning. This notion leaves professionals asking: how and why do I need to assess learning?

While new technologies, such as ID card swiping, help us track exposure to events and programs, it is important for us to go beyond these perfunctory ways of understanding our students outside the classroom. Assessing student learning in campus activities tells us what students are “taking away” from interactions with specific programs and services. Perhaps the most important reason for assessing learning is for our own learning as practitioners. Assessing learning ensures that all of our planning, facilitating, and advising contribute to our students taking away tangible skills, knowledge, and behaviors. It also provides us the opportunity to improve our programs to better meet our intended goals. In addition, assessing student learning helps us engage in a larger conversation on campus, where well-articulated student learning outcomes and assessment results help to bridge the conversation of student learning with faculty at our institutions. These cross-institution conversations also serve the institution in articulating and demonstrating what students are learning overall, an effort encouraged across the nation by an increased focus on student learning in the accreditation process.

For all of these reasons, it is important for professionals in campus activities to gather information about every aspect of their programs, including budget and operations, student needs, student learning outcomes, and overall program effectiveness. While this can be a challenge, one suggestion is to not try to do everything at once; rather, pick a few programs, goals, or assessment needs each year. Rotating through programs and services each year allows professionals the time and resources to gather useful information, and utilizing a pre-established framework may help as well. Below are examples of two institutions implementing promising practices using the CAS Standards and the NACA student learning frameworks.

The University of North Texas and CAS Standards

In 1979, the Council for the Advancement of Standards in Higher Education (CAS) was founded as an organization dedicated to advocating for the implementation of standards and guidelines in student affairs practice (CAS, 2011; Nuss, 2000). CAS has 41 member organizations that represent many of the major professional organizations in the field of student affairs, including those related to general practice, graduate preparation, counseling, health, housing and residence life, judicial affairs, facilities, Greek life, and campus activities, among others (CAS, 2011). CAS Standards and Guidelines are used frequently in student affairs in order to help departments assess their programs and services (McNeil, 2009).

At the University of North Texas (UNT), the Division of Student Affairs recommends departments undergo a full CAS review every five years. This process is facilitated by staff in the office of Research, Assessment, and Planning (RAP), which assists with at least two departmental CAS reviews each year. UNT’s Student Activities Center is currently engaged in this review using the CAS Standards and Guidelines for Campus Activities Programs within the following framework.

Norming. The first component of UNT’s CAS review process is the self-study, in which departmental staff work independently to score themselves across the various dimensions of the CAS Self-Assessment Guide (for Campus Activities Programs, there are 14 separate sections). Before the self-study, the staff participates in an initial norming session, facilitated by RAP staff, to learn about the rating system and to discuss the criteria needed to evaluate the standards and guidelines. Staff members then receive copies of the instrument and rating sheets and are given ample time to complete the self-study and submit their scores to both the RAP office and a staff member in the department tasked with monitoring progress. Staff members are also asked to provide narratives for open-ended questions. This process takes between three and four weeks to complete.

Consensus Meeting. Once all self-studies are complete, the staff reconvenes for a consensus-building meeting, facilitated by RAP staff. The norming information is reiterated, and each element of the CAS Self-Assessment Guide is reviewed. Staff members are asked to share their ratings by holding up scorecards for all to see and to give their rationale for each rating. If there is unanimous agreement, that score is assigned; if not, the range of scores is recorded and the discrepancies are discussed until the group reaches consensus. The consensus score is then recorded. This process can take up to one full day, but it can occur over the course of several days for ease of scheduling if needed.
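
A short script can support this meeting by flagging, ahead of time, which elements drew a range of independent ratings and therefore need discussion. The element labels and scores below are placeholders.

```python
# Given each staff member's independent self-study ratings, identify the
# CAS elements where raters disagree so the consensus meeting can focus
# there. Element labels and scores are placeholders.
ratings = {
    "Mission": [4, 4, 4],
    "Program": [2, 4, 3],
    "Leadership": [3, 3, 3],
}

for element, scores in ratings.items():
    if len(set(scores)) == 1:
        print(f"{element}: unanimous at {scores[0]}")
    else:
        print(f"{element}: discuss (scores range {min(scores)}-{max(scores)})")
```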

Dashboard. The RAP office developed a spreadsheet-based template that, when completed, automatically graphs a department’s scores across the various components of the CAS instrument. UNT developed this template in collaboration with Eastern Washington University to ensure the most up-to-date congruence with the CAS Standards and Guidelines. It was designed to provide senior leadership with a quick snapshot of the entire self-study process in one spreadsheet, saving time and making the process more meaningful from an administrative enhancement perspective. The dashboard templates developed by RAP are available for a limited number of CAS areas, corresponding to the functional areas present on campus. After the consensus meeting, RAP staff input the final scores and narrative into the template. A report is then generated and sent to the department’s director, who determines the next steps for gathering evidence.
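
UNT’s dashboard is a spreadsheet template, but the same at-a-glance snapshot can be sketched in a few lines of plotting code. The section names, scores, and rating scale below are placeholders.

```python
# Sketch of the dashboard idea: a one-glance chart of consensus scores
# across CAS sections. Section names, scores, and the rating scale are
# placeholders; UNT's actual dashboard is a spreadsheet template.
import matplotlib.pyplot as plt

sections = ["Mission", "Program", "Leadership", "Ethics", "Assessment"]
scores = [4, 3, 3, 4, 2]

fig, ax = plt.subplots(figsize=(7, 3))
ax.barh(sections, scores)
ax.set_xlim(0, 5)
ax.set_xlabel("Consensus rating (placeholder 0-5 scale)")
ax.set_title("CAS self-study snapshot")
fig.tight_layout()
fig.savefig("cas_dashboard.png")
```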

Evidence Gathering. The next step in the process is to gather evidence to support the scores that were determined in the consensus meeting. Each rating should be accompanied by appropriate evidence, such as policy documents, organizational charts, departmental meeting minutes, and assessment results. Department staff work together to gather copies of the evidence needed, either electronically or in hard copy. The evidence is then put into a compendium along with the dashboard report. This process takes between four and six weeks to complete.

Internal-External Review. After the evidence has been gathered in a central location, RAP staff review the compendium through the lens of an external reviewer from another institution. In this “devil’s advocate” role, RAP ensures that enough evidence has been compiled to support the ratings and asks staff to gather more if needed. This is done to prepare the department for the official external review and to ensure it goes as smoothly as possible. This review takes roughly two to three weeks to complete.

External Review. Department directors are asked to identify colleagues outside the institution who are authorities in the specific functional area to come to campus as external reviewers. Once the compendium of evidence is complete, these external reviewers are invited to review the documentation and to hold interviews and focus groups with key stakeholders, such as department staff, senior leadership, and students. The feedback gathered from these meetings, combined with the reviewers’ expertise, informs the reviewers’ interpretation of the ratings and final recommendations. While the actual review takes one to three days on campus, invitations to reviewers should be extended at least eight to 10 weeks in advance to allow for scheduling considerations.

Closing the Loop. Once the external review is complete, it is up to the department leadership to develop a summative evaluation of the entire process. This evaluation includes an executive summary to be shared with senior division leadership as well as action steps that will evolve from the external reviewers’ recommendations. If the action steps have any bearing on the department’s strategic plan or assessment plan, those documents should be updated during the annual review period and tracked accordingly.

The Student Activities Center has completed the above process through the internal-external review. The department is working through the external review phase as of Fall 2011, and a final copy of the summative evaluation will be posted on the UNT Division of Student Affairs Web site once completed.

Linfield College and NACA Framework

Created in 2007, the NACA Competency Guide for Student Leaders is available in several publications, including guides for students and campus administrators. The Facilitator’s Guide provides practitioners with a description, learning outcomes, suggested initiatives, key questions, additional resources, and assessment questions related to 10 core competencies (leadership development, assessment and evaluation, event management, meaningful interpersonal relationships, collaboration, social responsibility, effective communication, multicultural competency, intellectual growth, and clarified values) and seven additional competencies (enhanced self-esteem, realistic self-appraisal, healthy behavior and satisfying lifestyles, interdependence, spiritual awareness, personal and educational goals, and career choices). More information about the guide is available at www.naca.org.

In an effort to enhance student learning outside the classroom, the Student Affairs Division of Linfield College examined how it defines and assesses learning. To start this process, the entire staff within the division read Learning Reconsidered 2 and participated in a day-long training on student learning outcomes and assessment. Following the training, the staff grappled with how to put the theory of learning and assessment into practice. Staff members were introduced to the new NACA Student Leader Competency Guide at the NACA National Conference and a NASAP regional conference. The guide provided student learning outcomes and included an assessment tool and facilitator’s guide, making it a good starting point for the development of learning outcomes and assessment.

Residence Life and Student Activities staff met to narrow the 17 NACA competencies down to five core competencies that fit the student culture and desired outcomes: leadership development, meaningful interpersonal relationships, collaboration, social responsibility, and effective communication. The staff used the assessment tool in pre- and post-self-evaluations to determine the impact that programs were having on the five competencies. The first year of data revealed that students self-scored lower on the post-evaluation, even after extensive leadership training and a yearlong leadership experience. This scoring drop was credited to a greater self-awareness and understanding of their own leadership competencies. After much consideration, it was decided to drop the pre-evaluation as part of the assessment process. The second year, at a full division retreat, the student affairs staff developed four core competencies for all student leaders: leadership development, social responsibility, effective communication, and multicultural competency. A partnership was also formed with CampusLabs to develop a post-evaluation for the student leaders with an expanded scale. Finally, multiple assessment methods were developed and incorporated, utilizing reflection through journaling and one-to-one meetings.

For Linfield, this was an intensive and rewarding project from which valuable lessons were learned. First, a pre- and post-evaluation is not always the ideal methodology. Second, for small divisions with limited budgets, it is important to seek out existing resources. These may include guiding documents from other institutions, templates and tools through NACA, and consultations with CampusLabs. Finally, it is important to be upfront and direct about learning outcomes with students. Showing students that their leadership positions are learning laboratories was an important part of the assessment process. To that end, learning outcomes were incorporated into each part of a student’s leadership experience, from the marketing of the position to the hiring process, training, program goals, and one-on-one meetings. Linfield found that once the language of learning and assessment was used, their students followed suit and incorporated it into their experience.

Conclusion

Whether staff are searching for a way to complete a full program review or a small campus is looking for a place to start, assessing student learning in campus activities begins with determining what framework or process works best. There are myriad ways to assess student learning in the co-curricular realm; regardless of which method one chooses, a focus on intentionally gathering relevant data to improve the student experience is paramount. Assessing student learning is a challenging and rewarding experience, one that can benefit students and staff alike.

Discussion Questions

  • What are the benefits of assessing student life programs? How can assessment help maximize opportunities or mitigate challenges in a student life context?
  • What do students learn through participation in student life programs? Are there differences in learning depending on breadth of experience vs. depth of experience?
  • How can student learning outcomes truly be measured in the context of student life?
  • What steps can one take today to plan an assessment of a student life program? What steps can be planned for this month? This semester?

References

Bresciani, M. J. (2011, August). Making assessment meaningful: What new student affairs professionals and those new to assessment need to know (NILOA Assessment Brief: Student Affairs). Urbana, IL: University of Illinois and Indiana University, National Institute for Learning Outcomes Assessment.

Council for the Advancement of Standards in Higher Education. (2011). CAS handout. Retrieved from the CAS Web site: http://www.cas.edu/wp-content/uploads/2011/04/CAS_Handout04-11.pdf

Keeling, R. P., American College Personnel Association, & National Association of Student Personnel Administrators (U.S.). (2006). Learning reconsidered 2: Implementing a campus-wide focus on the student experience. Washington, D.C.: ACPA.

McNeil, M. (2009). Using standards to support peer education. Retrieved from the Alice! Health Promotion Program, Columbia University Web site: http://health.columbia.edu/files/healthservices/alice_downloads_using_st…

Nuss, E. M. (2000). The role of professional associations. In M. J. Barr, M. K. Desler, & Associates (Eds.), The handbook of student affairs administration (2nd ed.). San Francisco, CA: Jossey-Bass.

Schuh, J. H., & Upcraft, M. L. (2001). Assessment practice in student affairs: An applications manual. San Francisco, CA: Jossey-Bass.

White, E. R. (2006). Using CAS standards for self-assessment and improvement. Retrieved from the NACADA Clearinghouse of Academic Advising Resources Web site: http://www.nacada.ksu.edu/Clearinghouse/AdvisingIssues/CAS.htm

About the Authors

Kim Yousey-Elsener, Ph.D., is an Associate Director of Assessment Programs at Campus Labs and serves as Chair of ACPA's Commission for Assessment and Evaluation. In addition to her assessment work with over 100 campuses nationwide, she serves as adjunct faculty at West Virginia University.

Please e-mail inquiries to Kim Yousey-Elsener.

Stella Mulberry Antic, Ph.D., is the Assistant Director of Research, Assessment, and Planning for Student Affairs at the University of North Texas, and serves on the directorate for the ACPA Commission for Assessment and Evaluation. At UNT, Dr. Antic conducts research related to student populations and works on developing a statistical model of student retention using direct evidence of program and service usage patterns.

Please e-mail inquiries to Stella Mulberry Antic.

Disclaimer

The ideas expressed in this article are not necessarily those of the Developments editorial board or those of ACPA members or the ACPA Governing Board, Leadership, or International Office Staff.

Administrators Engaging in the Research Process

The purpose of this Developments series is to explore different perspectives on what it means to be a scholar practitioner, the various ways in which one can be a scholar practitioner, and the impact doing so has on one's personal and professional life. The contributing authors of this series address how they have approached being a scholar practitioner, the challenges and opportunities that accompany their approach, and recommendations for others who also want to pursue a career where scholarship and practice are purposefully interwoven.

In earlier Developments articles on entering the professoriate, authors focused on the various ways in which an administrator would need to understand the rules, norms, and expectations of the profession and to engage in scholarship as a prelude to making the transition from student affairs administrator to professor. For example, in her article “Student Affairs Pathways to the Professoriate: Perspectives on Teaching,” Julie Owen cautions student affairs professionals that the complexities of faculty socialization affect one's ability to bring experience as a student development administrator into the traditional classroom setting. In “Writing for Publication,” Dilley and Hart suggest that conference presentations, scholarly journal articles, and articles in professional publications should be aimed at multiple audiences and developed into a research agenda for which the individual might become known in the field.

This article will turn the question around. What about the student affairs professional who wants to engage in scholarship while remaining an administrator? Is it possible to be a practitioner who is also a scholar rather than one who uses scholarship to transition into a faculty position?

What is a Scholar Practitioner?

In his seminal work Scholarship Reconsidered: Priorities of the Professoriate, Boyer (1997) argued for a broader definition of scholarship for the professoriate. Boyer was concerned not only with what “counts” as scholarship in the tenure process, but also with what matters about it. He argued, for example, that the scholarship of teaching – the study of how knowledge can best be transmitted to others and best learned – is a valid subject of inquiry for faculty members.

A scholar practitioner, on the other hand, is not concerned about tenure. Instead, the scholar practitioner engages in research and scholarly endeavors while continuing in the role of an administrator with no thought of transitioning to the professoriate. Scholarship in this sense is an end rather than a means to an end. The scholar practitioner engages in research to improve his or her own practice or to develop best practices in his or her administrative discipline. Scholar practitioners read the research reports of others and use them to improve their own effectiveness and that of their staff or peers. The scholar practitioner generates new knowledge not to convince a tenure committee that she or he has the right stuff, but simply to contribute to the advance of knowledge or practice in a chosen field.

What’s the Payoff for Student Development Professionals?

While tenure is not at stake and therefore is not a motivator for scholarly engagement by student development practitioners or other college administrators, there are still a number of rewards for the practitioner who decides to pursue a research agenda. First, ascendancy to the presidency in higher education has historically been through the academic department chair, dean, and provost pathway. Few financial affairs professionals (like myself) – and perhaps even fewer student affairs professionals – have found an easy passage to the presidency. Building a resume that includes conference presentations, book and article reviews, published journal articles, book chapters, and monographs can substantially improve the odds that a senior administrator in student affairs will be considered a serious candidate for the leadership of a higher education institution.

Second, there is the joy of working at the nexus of research and professional practice. It is not clear in all cases which drives which: is it only true that research findings should influence the work we do and the way we do it, or should our work also influence the scholarship we pursue and the ways we conduct it?

Third, there is the question of legitimacy. Student affairs professionals constantly struggle for legitimacy in the eyes of their faculty colleagues. Who has not heard the adage that “the faculty are the university” and that teaching and learning (meaning what goes on inside the classroom) are the core of the enterprise? Even granting that this is true, student development professionals have made a very strong case for the value of the learning that occurs outside the classroom, in student leadership opportunities, student clubs and activities, community volunteer opportunities and the like. Yet the student affairs function gets little respect from the faculty on some campuses. The student development professional who doubles as a scholar practitioner meets the faculty on their own terms – as an equal who engages in scholarly pursuits, subjects his or her ideas to peer review and contributes to the advancement of knowledge. Becoming a scholar practitioner can improve the credibility of the student affairs professional when she or he brings ideas about teaching and learning to discussions of assessment and accountability, service learning or the value of co-curricular education efforts on campus.

Finally, there are the professional motivations and personal rewards associated with being a scholar practitioner. While it will not necessarily lead to promotion or a higher salary, and most certainly will not result in a tenured position, contributing to the advancement of knowledge and the improvement of practice are rewards in themselves. As a creative endeavor, the research process can provide both professional development and increased job satisfaction.

A Brief Case Study of One Scholar Practitioner

My own path to becoming a scholar practitioner has evolved over more than twenty years as a higher education administrator. I currently serve as the vice president for financial affairs and treasurer of a comprehensive, master's-level private institution and have served at the senior management level since the 1990s in both public and private higher education.

My doctoral work included completion of a historical dissertation on the development of the research mission at two Midwestern urban state universities. During my doctoral program and since its completion, I have made research and scholarly presentations on various aspects of the history of higher education at conferences of the American Educational Research Association (AERA), the Association for the Study of Higher Education (ASHE), and the History of Education Society. My research focus, which grew directly out of my dissertation, has been the historical development of the urban state university sector of American higher education.

What, you might ask, is the connection between serving as the chief financial officer of a university and engaging in scholarship on the history of higher education? Most of the rewards, of course, have been either intangible or personal or both. The biggest payoff, however, has been a greater understanding of the world in which my faculty colleagues operate on a daily basis – the world of publish or perish. By putting myself in their shoes, I have gained a great deal of respect for the demands placed on them and the work involved in this aspect of their professional lives. I hope, in return, that they have gained some respect for my work and my efforts to engage in the knowledge generation process.

Getting Started as a Scholar Practitioner

Having been convinced by the articles in this series of the value of engaging in scholarly pursuits, yet not wishing to transition to the professoriate, where should the student affairs administrator who wants to become a scholar practitioner begin? Two immediate issues are finding time in an already hectic professional calendar to engage in research and finding support, in terms of both time and resources, from one's supervisor or others in the institution.

Most colleges and universities (albeit perhaps less so in these difficult financial times) provide support to their administrators for conference attendance. Rather than merely attending the next gathering of your favorite professional organization, why not submit a proposal to make a presentation? If you ground your presentation in original research that you have conducted, rather than relying solely on the work of others, you will have taken a first step along the scholar practitioner road. If your proposal is accepted, actively seek the advice of your audience once you have completed the presentation, e.g., chat with them after the session or offer to send them the paper in return for feedback. Having been successful at one or more professional conferences, next submit a conference proposal to an appropriate scholarly organization such as ASHE or AERA. Many colleges and universities will support the attendance of their administrators at scholarly conferences if they have an accepted peer-reviewed paper. Finally, take your experience at conferences and move to the next level: submit a paper to an appropriate journal, as Dilley and Hart outline in their article, “Writing for Publication.”

Finding – or making – the time for scholarly endeavors is a much more personal issue. In my case, all of my vacation time for two years was spent doing the research for my dissertation. As a morning person, I find the hour between 5:00 and 6:00 a.m., along with weekends, to be my most productive working time. Each budding scholar practitioner will, of course, need to find his or her own solution to the time question.

Conclusion

The student development professional does not need to limit his or her interest in the research process to a career change strategy. One can engage in the pursuit of scholarship for the sheer joy of learning, for contributing to the advancement of knowledge, or for the development of best practices in a specific field. Both motivations for conducting research – career advancement and scholarship for its own sake – are valid, and together they offer student affairs administrators everything from the potential to climb the administrative ladder in higher education to the satisfaction of doing something for the sheer pleasure of accomplishment.


References

Boyer, E. L. (1997). Scholarship reconsidered: Priorities of the professoriate. San Francisco: Jossey-Bass.

Dilley, P., & Hart, J. (2009, Winter). Writing for publication. ACPA Developments.

Owen, J. (2009, Summer). Student affairs pathways to the professoriate: Perspectives on teaching. ACPA Developments.

Please send inquiries and feedback to Ralph Kidder at [email protected].


Cultivate and Support Good Research!

Susan R. Jones
Director, Core Council for the Generation and Dissemination of Knowledge

The generation and dissemination of knowledge is central to the mission, values, and activities of ACPA. Whether through our highly regarded publications, such as the Journal of College Student Development, About Campus, and the books published through ACPA's Books and Media, or through the scholarly presentations at national and regional conferences, ACPA leads the way in promoting and supporting cutting-edge scholarship in the field.

Strong publications and presentations come from good research, and good research requires willing respondents and participants. As you know, ACPA works very hard not to bombard our members with email messages. However, as we have moved into the age of web-based surveys and almost total reliance on the internet for communication, nearly all of the requests that come to the national office for access to ACPA members for research involve internet-dependent strategies of communication. A task force appointed by then-President Jeanne Steffes worked hard to develop policies and procedures for those making research requests of the membership. These have been approved and are now available on the ACPA web site, under the research link. We hoped to create policies and procedures that were clear, fair, and accessible so as to support research efforts, but also to put some safeguards in place. We have received a number of requests for access to members for research purposes, each of which meets our criteria. We have been reluctant, however, to send each request out via email or in the ACPA E-lert, which often results in significant delays for researchers, who are frequently working on carefully planned timetables.

To facilitate the research efforts of our members, we will send e-mail announcements containing only research invitations or, depending on the timing, include a research announcement in the monthly E-lert. These will occur only when we have ACPA-approved research requests that cannot be incorporated into the ACPA Master Calendar. Please note that the current research policy allows for only three ACPA-approved research studies per semester; this helps ensure ACPA members are not unreasonably burdened with research participation requests. We know that an announcement containing only research invitations will mean more email from ACPA, but we hope you will think kindly of requests to participate in research, as this is how good scholarship is produced. Thanks in advance for your support and cooperation.

Research Opportunity

A research study designed to examine attrition from the student affairs profession is underway. Current student affairs professionals are being asked to help identify potential participants by providing the research team with the names and contact information of former student affairs professionals who left the field within the past 10 years, no longer work for a college or university, or have left student affairs but work in another division of a university or college. These participants will be asked to complete a brief online survey. Contact information will be used only for the purposes of this study.

If you or someone you know meets the criteria for this study, please provide the study team with the person's name, current e-mail address, phone number, and mailing address. This information should be forwarded to:

Ute Lowery
South University

[email protected]

(The study is sponsored by a subcommittee of NASPA's Center for Women.)