Assessing Learning Outcomes for Student Programming Board Leaders
Little empirical research exists on programming boards, even though they are one of the main student involvement opportunities that unions offer. A recent study sought to add to the literature on this topic by investigating current practices unions use to assess learning outcomes. Through document review and phone interviews with advisors at select ACUI member institutions, themes emerged about what learning outcomes are being assessed, how assessment activities are conducted, and who participates in those assessments.
About the Study
To focus the research on a logical starting point, this study was guided by the following six questions:
- Do student activities and union professionals have assessment plans for student leaders and volunteer members serving on a programming board?
- What, if any, learning outcomes are assessed and how were the learning outcomes developed?
- How are student activities and union professionals administering assessments for learning outcomes and events?
- Who is involved in the assessment planning process?
- How are student activities and union professionals trained on assessment?
- How are student activities and union professionals using what is learned from outcomes assessment in the training and development of student programming board leaders, and how is the event assessment associated with programming board events used?
Student unions at public institutions that are members of the Association of American Universities (AAU) were an appropriate starting point for this study. All were classified by the Carnegie Foundation as doctoral universities with the highest research activity, had student populations of more than 20,000, and were members of ACUI. In addition, one institution, Kansas State University, was included in the sample that was not an AAU institution but otherwise met the selection criteria. A pilot version of the study was conducted with a convenience sample from Kansas State and the University of Kansas before the instrument was applied more broadly. In total, study participants were 21 student activities and union professionals, representing 21 unique institutions, each of whom served as the lead advisor for the campus programming board.
Assessment and Programming Boards
Unfortunately, research is limited on whether student activities and union professionals, specifically those who advise a programming board in a union context, have assessment plans in place and are assessing the learning outcomes of student leader experiences. The importance of assessment stems chiefly from the push for greater accountability, which requires colleges and universities to invest resources in identifying and measuring student learning outcomes both within and outside the classroom. Pressure for greater accountability comes from external organizations (such as accreditation agencies and state legislatures) or is applied internally by university administrators. A recent example of this increased demand was a state legislative mandate for course-level continuous improvement reporting at Iowa's three public universities. By Iowa state law, faculty who teach courses of 300 or more students must create and use "formative and summative assessments" and submit a plan for using those assessments to improve their teaching, Inside Higher Ed reported in its coverage of the law. While this example involves classroom learning, and such plans are not legally required of student affairs professionals, it does suggest an external interest in accountability.
Student learning can happen in a variety of classroom or cocurricular contexts throughout a student’s university experience.
Programming boards within student unions provide activities for students that their peers plan, coordinate, and execute. In so doing, they spend thousands of dollars and their student leaders spend countless hours to offer a variety of event options for students to experience. Such programs provide leadership opportunities for students, build campus community, and offer educational experiences in a different format than traditional classroom learning. Without research to support the endeavors of student union programming boards, there is an absence of data to demonstrate the effect these activities have on student learning. In addition, the lack of research on the effectiveness of student programming boards makes it difficult to defend their importance to administrators at varying levels, student fee granting committees, and parents who may not understand their value and question the level of funding and other resources student programming boards receive. To assess learning for those students involved in activities planning, it is important to understand the types of assessment activities that student activities and union professionals are undertaking. Additionally, assessment activities for program audiences serve as a first step in determining if these activities affect student learning.
The descriptive survey employed in this study sought to explore current assessment practices of learning outcomes, resources used, and outcomes of assessment activities for student programming board leaders. The data were collected through document review and phone interviews with each lead programming board advisor responsible for the board’s or office’s assessment efforts.
Among this group of programming board professionals, all 21 student programming boards were performing some type of event assessment, and a majority of professionals had implemented learning outcomes assessment for their student leaders. The one metric all institutions tracked was attendance at events; attendance totals were used for budget presentations, annual reports, and post-event evaluations. In regard to learning outcomes, 16 of the offices (76%) were actively assessing student learning outcomes for student leaders. Four offices were in the process of creating learning outcomes, and one office had no plans to implement learning outcomes assessment. Thirteen interviewees indicated the decision to implement assessment was motivated by pressure from division administrators or by campus-wide initiatives external to the student union. The four professionals who cited more internal sources of pressure indicated the impetus came from union management.
Learning outcomes used
Through document review and interviews, the top learning outcomes fell within nine themes: 1) communication and collaboration, 2) leadership development, 3) event management, 4) multiculturalism and civic engagement, 5) critical thinking and creativity, 6) intrapersonal development, 7) resilience and personal wellness, 8) traditions and institutional connections, and 9) customer service. Professionals' approaches to developing learning outcomes fell into four categories: using university or division-wide outcomes; developing outcomes based on national standards (usually from the Council for the Advancement of Standards in Higher Education or professional association core competencies); benchmarking against other programs or departments; and developing the outcomes internally through their own staff. Some institutions used multiple methods.
The level of effort that went into developing learning outcomes for student leaders was clear from the interviews. The professionals drew on assessment directors, technology tools, and other resources to create the outcomes assessments used with their individual student programming boards. A key finding was the extent to which event assessment was used compared to learning outcomes assessment: event assessment tended to provide information about program worth, while learning outcomes assessment was used mostly for student training and development.
When examining the event assessment process, it became clear that institutions were collecting multiple types of data. These data included: the number of events held and the attendance at each event, event satisfaction survey responses, marketing or needs survey data submitted by event participants, demographic information, and post-event surveys completed by student leaders.
A common approach among the professionals using learning outcomes was to administer a self-assessment at the beginning of the students' term in office to gauge the students' understanding of the identified learning outcomes. Typically, learning outcomes assessments were developed internally, based on the learning outcomes established by the programming board advisors. The pre- and post-self-assessments consisted of questions incorporating each learning outcome, with responses collected on a Likert scale in which students indicated their level of competence with the outcome.
The rest of the professionals interviewed took a more individualized approach to administering learning outcomes assessment with the student programming board leaders. A common example of the individualized approach was for the student leader to decide which learning outcomes to work on for the year and then set goals with their advisor for accomplishing them.
Assessment staff person and training
During the discussion about the professionals' level of assessment training, participants referenced formal instruction received as part of an educational degree as well as assessment consultation offered to staff across the division or union. Most student programming boards' assessment efforts were coordinated and led by a student activities and union professional who worked with the programming board leaders as an advisor. The Michigan State University programming board was the only group in which graduate assistants led assessment efforts; the theory behind this approach was that, because the graduate assistants were immersed in a graduate training program, leading assessment efforts would give them a hands-on way to implement what they learn in their coursework. Among the other models represented in the sample, nine boards had a committee responsible for coordinating assessment efforts, four had a dedicated union professional responsible for assessment, and most others had a dedicated assessment office that supported their efforts.
Implications for Practice
Although valuable and needed on campus, assessment of learning outcomes was not extensive. The programming professionals interviewed as part of the study utilized student learning outcomes assessment only with their student leaders. Their approaches fell into two categories, both relying on student leaders' self-assessment: student leaders were given pre- and post-self-assessments of the learning outcomes, and professionals either focused on all of the outcomes with all of the leaders or took a more individual approach in which each student picked the outcomes to work on during their year as a leader. Even though considerable time and thought went into developing the learning outcomes, the delivery and process did not translate into comprehensive learning outcomes assessment for the student programming board and its events. However, the process of developing the learning outcomes provided a means for the professionals to speak in a language that is valued on campus. Having the framework of the learning outcomes also provided a structure for training processes and for individual advisor and student leader meetings.
The professional from the University of Wisconsin supported this notion: “Certainly we just live in this era of assessment. You always have to prove what you’re doing and your impact, and so it just became really clear that we had to start demonstrating how we’re having an impact. … Almost weekly I hear from our alumni about how valuable this experience was. I think for us to be able to tell that story in language that resonates with the rest of campus was really important.”
This quote reinforces that programming board professionals feel the pressure to implement learning outcomes today, and that pressure is supported in the literature. In their publication The Role of Student Affairs in Student Learning Assessment, John Schuh and Ann Gansemer-Topf discussed the importance of linking assessment to institutional mission and purpose, asserting that it is imperative for student affairs professionals to develop services and experiences that contribute to student learning in ways that are valued at their institutions.
Programs staff are limited in the time and resources needed to further assessment efforts. Throughout the interviews, a recurring theme emerged: student programming boards had implemented learning outcomes only for top student leaders. The professionals explained that with the hundreds of events planned each year and the large funding provided for the boards (budgets ranged from $200,000 to $2 million), it was a struggle to develop outcomes that extended past top student leaders to general committee members, student volunteers, or student event attendees. Additionally, they had not developed outcomes for the events themselves. Most professionals said the time and effort required to implement learning outcomes for the student leaders is enormous, and time was a major factor in extending implementation to other student populations. The professional from the University of Colorado discussed the time limitation: "I think it's just really challenging. Trying to find a system that works best for everyone and someone who's got the time to put it all together. ... Everyone works a lot and has a lot of hours and putting one more thing on a list—like what comes off the list? And I think that's probably the biggest challenge. And then implementation, whether it's for assessment or for student learning outcomes assessment."
However, The Student Learning Imperative: Implications for Student Affairs and Techniques for Assessing Student Learning and Development in Academic and Student Support Services emphasize that the importance of assessment comes chiefly from the push for greater accountability and that such mandates are here to stay. Calls for accountability require colleges and universities to invest resources in identifying and measuring student learning outcomes both within and outside the classroom. In the past, student affairs professionals based assessments on benchmarks and student satisfaction, attempting to determine how many students participated in programs and to what degree those students reported satisfaction. This method of assessment did not measure students' understanding and learning, nor did it provide guidance on how to enhance a particular outcome.
Resources, chiefly time and training, for assessment are vital. Though resources for assessing learning outcomes seemed available, whether through a dedicated student affairs assessment professional or assessment technology such as Campus Labs, resources remain the area with the most room for improvement. Support from supervisors, combined with an understanding of assessment, can help new professionals design effective assessment plans.
Administrators need to develop sustainable assessment activities. During the interviews, some of the professionals shared that their division had a new top administrator and it was the administrator’s goal to implement assessment. A challenge is presented when the administrator leaves the position and assessment is not the next leader’s priority. New leaders often set new goals and assign resources and reports accordingly. Assessment needs to be part of an annual cycle, providing ongoing data collection for reports that outline assessment activities, their results, and the changes based on the results for dissemination on a yearly basis. This means that, regardless of whether a division is experiencing an administrator transition, assessment needs to be maintained as a priority.
Specific goals for assessment need to be set for the funding sources available. The research indicates that accountability in higher education can result in budget reductions if an institution cannot demonstrate its students' learning. However, the professionals interviewed shared that event assessment was more important to their local budget process than implementing learning outcomes for student leaders: they still needed to count event attendance, determine demographics, and use satisfaction surveys for budget and future event proposal purposes. Campus funding sources were less interested in the learning of a few student leaders and more interested in the overall impact or number of programs created. Yet learning outcomes assessment mattered more to the programming board professionals, who wanted to participate in the learning outcomes and assessment conversations happening on campus with their academic counterparts. This contradiction further strains resources and creates competing goals.
In summary, student union programming staff are spending a great deal of time and effort on assessment and evaluation. While they hear the call for greater accountability, they are not necessarily being provided with the skills, time, or resources needed to engage in effective assessment practices.
Further, student activities and union professionals confronted with competing goals from funding sources and campus administrators face difficulties in identifying what outcomes they should measure. As a result, most of their assessment practices focus on the outputs associated with their boards' activities, counting how many people attend events and the number of events produced and, occasionally, collecting satisfaction data from event attendees. Meanwhile, they are limited to assessing learning outcomes for a select number of top student leaders and not the larger student population at their institutions. Universities and student unions need to address these resource constraints and goal contradictions to facilitate effective assessment of the learning experiences afforded to students by student programming boards.