By Dr. Stephen Berkwitz, Department Head, Religious Studies
We had used the same assessment tools for the major for at least 15 years. We had something called a Form D, a one-page form for faculty to fill out about the majors in their courses. At the end of every semester, faculty would receive a substantial stack of these forms with the names of majors written on them. We were supposed to fill out the forms, recommend pieces of each student's work, and give a general overview of what each student did well and what they needed to work on. That, together with an exit interview, was the standard assessment protocol for over a decade.
Developing Student Learning Outcomes and Aligning With Assessments
Three to four years ago, we were told to come up with student learning outcomes (SLOs) and match them to our assessment plan for the major. That's when the department came up with a sheet and a procedure: we collected papers from students, read them, and filled out a form explaining how the students met the SLOs we expected of them.
Our fairly barebones assessment procedure grew into a much more time-intensive method, and faculty were not enthusiastic about it. There were too many majors and too many sheets; it was too time-consuming, and it didn't seem worthwhile. We were supposed to show advisees all of the forms during the exit interview, and faculty felt burdened by reading through three to four research papers ahead of each interview to show how students had met the SLOs. The assessment plan had become unworkable because it was so time-consuming. I don't think we saw results that justified the time we were spending; the data we were supposed to generate wasn't being generated.
Keri (in Assessment) consulted with us and provided a few recommendations on how to streamline the process. One main recommendation was to have students do the work of filling out the form and explaining how they met the student learning outcomes. In the past, the faculty were working harder than the students.
The faculty committee liked some of the recommendations and came up with a new plan designed to generate useful data without requiring a heavy investment of faculty time at the end of each semester. Following Keri's advice, we split our assessment tool into three parts.
A Revised Three-Part Assessment Tool
Part I
A pre-exit-interview form is sent to students when they make an appointment. Students give us a response, in their own words, describing how they see their work in the program with respect to our SLOs. We've found this to be a useful exercise because students reflect on their time in the program and take stock of what they have accomplished and learned. They don't seem to mind filling out this 2½-page form, and the process makes things easier on faculty because we don't have to fill out the form ahead of time in their absence.
Part II
The second part is a nine-question SurveyMonkey survey that we send to students prior to the exit interview. This allows us to generate quantitative information that could be useful for evaluation and assessment purposes.
Part III
The third part is a list of three open-ended questions we ask during the interview. We discuss the form students filled out ahead of time, along with these questions, and record their answers while talking with them. That gives us two pieces of data from the exit interview plus additional information from the survey. Our initial finding is that the new assessment plan gives us data we can act on: it's informative and far less time-consuming. Faculty feel much more positive about the assessment plan and are willing to carry it out, compared to when the burden was on the faculty members themselves to collect all of the data.
Two people sit in on the exit interviews with graduating majors: the advisor and one member of the curriculum and assessment committee, although the department head can fill in for one of these faculty members.
The other change we made was to abandon the "Form D" sheets that were filled out for every major in every Religious Studies course. We eventually acknowledged that faculty weren't filling them out or going over them with their advisees. The idea was sound but impractical, so we stopped using the sheets and replaced them with the three tools, which give us sufficient data to make changes to our program.
First Review of the Assessment Data
Following our first review of the new assessment data, we concluded that students had plenty of writing assignments but fewer opportunities to give oral presentations. We decided as a department to give more attention to oral presentations, and this led us to reflect collectively on how we can give students the opportunity to develop those skills. There was a sense that we need to do a little bit more to help students develop oral communication skills.
We subsequently made a few slight changes to the forms themselves because we thought students may have misunderstood some of the questions on our self-report survey. We had asked approximately how many papers and presentations they completed, and the responses varied dramatically, likely because some students thought we were asking about their entire college career. We have changed the question to specify papers and presentations done for courses in the major.
Assessment Results
Our assessment results have generally been positive. One area where we feel we have underperformed concerns the degree to which our program prepares students for a future career. Student responses there were respectable but lower than for the other questions. We discussed whether to reword the question or to do more in that area, and the discussion came down to weighing the value of receiving a 100% result against the value of asking questions that give us something to act on.
Closing the Loop
The other part of our recent assessment retreat was a decision to do more in terms of professional development with our majors. We are considering working with the Career Center and Graduate College to hold workshops on preparing resumes and meeting with potential employers, and to give students more guidance in that area. This is something we hope to follow through on based on some of the results in our assessment data.
Our standing plan is to have a retreat every fall before the school year begins and to spend part of the time going over the assessment data from the past year. The idea behind having an assessment committee member sit in on exit interviews is that the committee can then give the department a general sense of what it is hearing from student interviews.
For the students, we want to convey that their opinions matter and that what they tell us helps us make changes to our program. One of our questions is, "What do you suggest that we do differently?" We want them to feel that their opinion counts, and I think we have written the form in a way that helps them feel good about filling it out. If there were a grade attached to it, I don't think they would want to do it; it would be like any other homework assignment they get. We have found that students feel liberated by the opportunity to report on our department in a way that doesn't count for a grade, and sometimes we've gotten really detailed and thoughtful responses from them.
Faculty like the process we’ve developed. Our undergraduate students seem to like it. We are getting good feedback, and we continually seek to tweak it in order to generate useful data that we can report and act upon.