Missouri State University
Assessment in Action
Understanding Student Learning

Conversations with Dr. Jillian Kinzie

by Angela Young

Dr. Jillian Kinzie of Indiana University is the Associate Director of the Center for Postsecondary Research & National Survey of Student Engagement (NSSE) Institute. She was invited to campus to discuss her expertise gained from years of involvement with academic surveys that seek to reach the heart of what students are truly experiencing.


Dr. Kinzie met with over 50 academic leaders from across campus during her visit to Missouri State in February and provided great insight for conversations about student retention, persistence, and support.

These conversations were a great opportunity for campus leaders to gather in a common space and have critical discussions. Faculty members’ desire for high levels of student learning and success was the basis of discussions throughout the day.

Some topics raised by your colleagues included:

  • Long-term learning
  • Incoming skill set
  • Student ownership of education
  • Work/leisure balance
  • Needs of transfer students

So what are the next steps to ensure that these conversations continue?

Data from BCSSE* and NSSE**, paired with the topics of discussion mentioned above, can provide critical indicators and a starting point for further conversations. These conversations can help unify our ideas about what is important in student learning and get us thinking about the next steps.

The Office of Assessment is willing to help bring these conversations to your college or department.

*BCSSE (Beginning College Survey of Student Engagement) provides important information about students’ thinking and preparation before they enter as first-year students.

**NSSE (National Survey of Student Engagement) provides direct feedback from students at the end of their first year and from seniors.






Posted in Assessment in Action, NSSE/BCSSE | Comments Off on Conversations with Dr. Jillian Kinzie

The Story of Assessment of Student Learning in Religious Studies: Less Time-Consuming and More Meaningful

By Dr. Stephen Berkwitz, Religious Studies, Department Head

We had used the same assessment tools and steps for the major for at least 15 years. We had something called a Form D, a one-page form for faculty to fill out about majors in their courses. At the end of every semester, faculty would receive a substantial stack of papers with the names of majors written on them. We were supposed to fill out the forms, recommend pieces from the students’ work, and give a general overview of what they did well and what they needed to work on. That, together with the exit interview, was the standard assessment protocol for over a decade.

Developing Student Learning Outcomes and Aligning With Assessments

Three to four years ago, we were told to come up with student learning outcomes and match the SLOs to our assessment plan for the major. That’s when the department came up with a sheet and a procedure: we collected papers from students, read them, and filled out a form explaining how students had met the SLOs we expect of them.

Our fairly barebones assessment procedure grew into a much more time-intensive method. Faculty were not enthusiastic about it. There were too many majors and too many sheets, it was too time-consuming, and it didn’t seem worthwhile. We were supposed to show advisees all of the forms during the exit interview, and faculty felt burdened by reading through three to four research papers ahead of each interview to show how students had met the SLOs. The assessment plan had become unworkable because it was so time-consuming; I don’t think we saw results that justified the time we were spending, and the data we were supposed to generate wasn’t being generated.

Keri (in Assessment) consulted with us and provided a few recommendations on how to streamline the process. One main recommendation was to have students do the work of filling out the form and explaining how they met the student learning outcomes. In the past, the faculty were working harder than the students.

The faculty committee liked some of the recommendations and came up with a new plan designed to generate useful data without requiring a heavy investment of faculty time at the end of semesters. Following Keri’s advice, we split our assessment tool into three parts.

A Revised Three-Part Assessment Tool

Part I

A pre-exit-interview form is sent to students when they make an appointment. Students provide a response, in their own words, describing how they see their work in the program with respect to our SLOs. We’ve found this to be a useful exercise because students reflect on their time with us and take stock of what they have accomplished and learned. They don’t seem to mind filling out this 2½-page form, and the process is easier on faculty because we no longer have to fill out the form ahead of time in their absence.

Part II

The second part is a nine-question SurveyMonkey survey that we send to students prior to the exit interview. This allows us to generate quantitative information that is useful for evaluation and assessment purposes.

Part III

The third part is a list of three open-ended questions we ask during the interview. We discuss the form students filled out ahead of time, as well as these three questions, recording responses while talking with them. Together, that gives us two pieces of data from the exit interview plus additional information from the survey. Our initial finding is that the new assessment plan gives us data we can act on. It’s informative and far less time-consuming. Faculty feel much more positive about the assessment plan and are willing to carry it out, compared to when the burden was on faculty members themselves to collect all of the data.

Two people sit in on the exit interviews with graduating majors—the advisor and one member of the curriculum and assessment committee, although the department head can fill in for one of these faculty members.

The other change we made was to abandon the “Form D” sheets that were filled out for every major in every Religious Studies course. We eventually acknowledged that faculty weren’t filling them out or going over them with their advisees. The idea was sound but impractical. We decided to stop that and replace it with the three tools that give us sufficient data to make changes to our program.

First Review of the Assessment Data

Following our first review of the new assessment data, we found that students had plenty of writing assignments but fewer opportunities to give oral presentations. So we decided as a department to give more attention to oral presentations, and we reflected collectively on how we can give students the opportunity to develop those skills. There was a sense that we need to do a bit more to help students develop oral communication skills.

We subsequently made a few slight changes to the forms themselves. We thought students may have misunderstood some of the questions on our self-report survey. We asked approximately how many papers and presentations they had completed, and the responses varied dramatically, perhaps because some students thought we were asking about their entire college career. We have changed the question to indicate papers and presentations done for courses in the major specifically.

Assessment Results

Our assessment results have generally been pretty positive. One area where we feel we have underperformed concerns the degree to which our program prepares students for a future career. Student responses were respectable but lower compared to the other questions. We discussed rewording the question or doing more in those areas. The discussion came down to weighing the value of receiving a 100% result against the value of asking questions that give us something to act on.

Closing the Loop

The other part of our recent assessment retreat was to decide to do more in terms of professional development with our majors. We are considering working with the Career Center and Graduate College to hold workshops on preparing resumes, meeting with potential employers, and giving them more guidance in that area. This is something we will hopefully follow through with based on some of the results we received on our assessment data.

Our standing plan is to have a retreat every fall before the school year begins and spend part of the time going over the assessment data from the year. The idea of having an assessment committee member sit in on exit interviews is that the committee will then be able to give us a general sense of what they are hearing from student interviews and report this information to the department.

For the students, we want to convey that their opinions matter and that what they tell us helps us make changes to our program. One of our questions is, “What do you suggest that we do differently?” We want to make them feel like their opinion counts. I think we have written the form in a way that helps them feel good about filling it out. If there were a grade attached to it, I don’t think they would want to do it; it would be like any other homework assignment they get. We have found that students are liberated by the opportunity to report on our department in a way that doesn’t count for a grade, and we’ve sometimes gotten really detailed and thoughtful responses.

Faculty like the process we’ve developed. Our undergraduate students seem to like it. We are getting good feedback, and we continually seek to tweak it in order to generate useful data that we can report and act upon.


Posted in Assessment in Action | Comments Off on The Story of Assessment of Student Learning in Religious Studies: Less Time-Consuming and More Meaningful

Assessment January 2015 Update

The Office of Assessment is currently working on many different projects. In recent weeks we have been working on the following…


  • The Office of Assessment will be hosting Dr. Jillian Kinzie of Indiana University’s National Survey of Student Engagement (NSSE) Institute on campus February 26. Kinzie will discuss what Missouri State can learn from its NSSE results and how that information can impact retention and student success.

Student Success Committee and Retention Dashboard

  • We are working with the Student Success Committee and Computer Services to pull together a variety of data points based on UT-Austin’s attempts to use data to understand retention. Survey data (BCSSE, NSSE), parent income, GPA, and other variables will be part of what we hope will be a searchable database and predictive tool in the future.
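To make the idea concrete, here is a minimal, hypothetical sketch of how such variables might be combined into a toy risk score. Every field name, weight, and threshold below is invented for illustration; none of it reflects the actual database design or any fitted predictive model.

```python
from dataclasses import dataclass

@dataclass
class StudentRecord:
    """One row of the hoped-for searchable retention database (illustrative fields only)."""
    student_id: str
    gpa: float               # cumulative GPA on a 4.0 scale
    bcsse_engagement: float  # hypothetical 0-100 engagement score derived from BCSSE
    parent_income: float     # annual dollars (carried in the record, unused by this toy score)

def retention_risk(rec: StudentRecord) -> float:
    """Toy risk score in [0, 1]; higher means greater predicted attrition risk.
    The weights are placeholders, not fitted model coefficients."""
    gpa_term = (4.0 - rec.gpa) / 4.0                  # low GPA raises risk
    engage_term = (100 - rec.bcsse_engagement) / 100  # low engagement raises risk
    return round(0.6 * gpa_term + 0.4 * engage_term, 3)

def flag_at_risk(records, threshold=0.5):
    """Return IDs of students whose toy risk score exceeds the threshold."""
    return [r.student_id for r in records if retention_risk(r) > threshold]

students = [
    StudentRecord("A1", gpa=3.6, bcsse_engagement=80.0, parent_income=55000.0),
    StudentRecord("A2", gpa=1.8, bcsse_engagement=30.0, parent_income=42000.0),
]
print(flag_at_risk(students))  # ['A2'] — the low-GPA, low-engagement student is flagged
```

A real tool would replace these placeholder weights with coefficients estimated from historical retention data, but the record-plus-score structure would look broadly similar.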

 Assessment Grants

  • The second public affairs assessment grant meeting took place in December; the next meeting is January 23. Several faculty are developing public affairs assignments, and several departments, including Athletic Training and Occupational Therapy, are developing ways to integrate public affairs at the department level and in the interview/application process. Grant proposals will be posted on the Assessment blog, and assignments developed through the grants will be included on the Public Affairs Toolkit website, a place where ideas for teaching public affairs are collected that grew out of the QIP workshops in summer 2013 and 2014. If you have any assignments or ideas that we can include on this page, please share!

National Survey of Student Engagement (NSSE) & Faculty Survey of Student Engagement (FSSE)

  • Marketing efforts to raise student awareness of the survey are underway.
  • The first NSSE invitation will reach students February 10, 2015.
  • We will also begin administering the Faculty Survey of Student Engagement in April, along with the Faculty Survey of Student Engagement for Graduate Student Instructors. The value of this survey lies in comparing student perceptions with faculty perceptions.

Graduate Exit Survey

  • We have worked with the Graduate College on their graduate student exit survey. “Last Chance” reminders will be sent out today. The response rate from December graduates is 29%.  We are currently working with a GA in the Graduate College (Ben France) to make sure staff in both offices have access to the data and know how the survey functions.

Assessment Council

Assessment Council discussed and provided feedback on the Public Affairs open-ended response item that will be part of the University Exit Exam beginning this semester. Assessment Council will meet again January 13. Members have been asked to look over Criteria 3 and 4 and provide feedback for the upcoming HLC visit.

Upcoming Meetings:

  • Tuesday, January 13 from 2:00-3:00
  • Tuesday, February 10 from 2:00-3:00

University Exit Exam

  • The University Exit Exam has been redesigned and will be administered under the new streamlined process utilizing Blackboard starting January 2015.
  • To close the loop and also analyze what we collect from students, open-ended Public Affairs responses on the University Exit Exam will be reviewed by faculty and staff reviewers during the May Public Affairs Assessment Workshop.

Public Affairs Curriculum Mapping Meeting

  • A group of faculty and staff met in December to discuss where public affairs experiences are offered to students throughout a four-year program, in both curricular and co-curricular settings. Attendees included Rachelle Darabi, Denise Baumann, Angela Young, Kurt Heinlein, Mike Wood, Joe Morris, Kelly Wood, and Keri Franklin.
    • After brainstorming curricular and co-curricular public affairs opportunities, the group discussed the following points:
    • Taking a more deliberate and strategic approach to reaching students on campus with public affairs across the four-year experience
    • Creating a master calendar of Public Affairs-related events happening on campus and in the community that faculty can access for integration into class curriculum, with the ability to tag events as public affairs events
    • Tracking attendance, perhaps requiring students to attend a certain number of public affairs events per year while enrolled at Missouri State, using a scanner to record check-ins
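If the scanner idea were pursued, checking the requirement could be as simple as a counter keyed by student ID. The sketch below is purely hypothetical; the student IDs and the three-events-per-year figure are invented for illustration, not an actual policy.

```python
from collections import Counter

REQUIRED_EVENTS_PER_YEAR = 3  # hypothetical requirement, not a university policy

def scan(log: Counter, student_id: str) -> None:
    """Record one scanned check-in at a public affairs event."""
    log[student_id] += 1

def met_requirement(log: Counter, student_id: str) -> bool:
    """True if the student has attended the required number of events this year."""
    return log[student_id] >= REQUIRED_EVENTS_PER_YEAR

attendance = Counter()
for sid in ["M123", "M123", "M456", "M123"]:  # scans collected at the door
    scan(attendance, sid)

print(met_requirement(attendance, "M123"))  # True  (3 scans)
print(met_requirement(attendance, "M456"))  # False (1 scan)
```

A Counter is a natural fit here because a student who never scanned in simply reads as zero attendances rather than raising an error.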


Posted in Assessment in Action | Comments Off on Assessment January 2015 Update

Addressing Public Affairs Through Film in Communication Sciences and Disorders

Deborah Cron, clinical associate professor, applied for and received a public affairs assessment grant to promote and assess evidence of student learning in her department of Communication Sciences and Disorders. Her project, “Using the Power of Social Influence and Impact of Entertainment to Provide a Broader Cultural Awareness and Increase Empathy in Students of Communication Sciences and Disorders,” is described by Dr. Cron below. For more information about her reflective assignment, visit the Public Affairs Toolkit.


The Undergraduate Assessment Objective for CSD 495, Observation Clinical Practicum, states that “Students will demonstrate cultural competence and ethical leadership by reflecting on and discussing the needs of individuals served in the Communication Sciences and Disorders clinical setting and how clinicians/teachers strive to meet these needs in a culturally competent, individualized way, through least biased, best clinical practices.”

The assessment assignment for this undergraduate course requires a 1½-page observation of an SLP therapy session observed during the semester, describing actions the student, as a therapist, could take to deliver speech-language therapy to that client in a culturally competent, individualized way, through least biased, best clinical practice.

Two class periods of this two-credit-hour course were devoted to lecture, discussion, and a video presentation of “Cultural Humility: People, Principles and Practices,” a 30-minute documentary by Vivian Chavez, a San Francisco State professor. The film uses music, interviews, archival footage, and images of community, nature, and dance to explain why we need cultural humility, not simply cultural competence.

Class responses to this video, collected as written comments, were most enthusiastic. They showed a deep interest in learning more about other cultures and definite surprise at the expanded definition of diversity from the Higher Learning Commission of the North Central Association of Colleges and Schools, which states, “diversity is represented in many forms, such as differences in ideas, viewpoints, perspectives, values, religious beliefs, backgrounds, race, gender, age, sexual orientation, human capacity, and ethnicity….” Students learned that not only do African Americans, Hispanics, and Asians have distinctive cultures, but so do youth gangs, divorcees, senior citizens, and college students. In addition to expressions of surprise and interest, a number of students seemed genuinely at a loss as to how to gain more information about other cultures: “Where do you look for further information to practice cultural humility/competence?” My typical response was, “Use the internet, talk to colleagues, read books, see films.”

My course, like all courses, has limited seated time to expand further on this topic as it is necessary for students to also learn a sufficient amount about clinical methods to make their observations meaningful. But I wanted to find another way for students to expand their experiences. I have lost count of the healthcare professionals I know who have shared a story about a film that finally made them commit to their career choice. The power of the performing arts to enlighten as well as entertain is undeniable.


I propose to use the social influence and impact of entertainment to provide a broader cultural knowledge base to students by establishing a small library of theatrically released films relevant to professionals in the fields of healthcare and communication disorders, and by providing extra-credit opportunities for students in my class who are willing to write reflection papers on these films. If possible, I would also like to schedule at least one “movie night” followed by a talk-back with faculty and community leaders knowledgeable about the themes presented in the selected film.

Films would be selected with input from CSD department faculty to ensure relevance to all three disciplines within the department: Education of the Deaf and Hard of Hearing, Audiology, and Speech-Language Pathology. Selections would therefore be available to students throughout the department, not only those who aim to become speech-language pathologists. The checkout system already in place for therapy materials could accommodate the checkout of items in this film library as well. The “movie night” idea could eventually be expanded to include other interested departments within the College of Health and Human Services. Funds from this grant would be used to purchase films.

The reflections would provide a written product to be submitted as samples of student work, contributing to a body of student work that will provide qualitative and quantitative data for research on the effectiveness of the practice. I am available to fulfill the eligibility selection criteria by attending meetings throughout the semester to participate in the desired “community of practice dedicated to achieving greater clarity with regard to teaching and learning the public affairs mission” at Missouri State University.

Examples of films that would be included:

My Left Foot: A 1989 Irish film starring Daniel Day-Lewis. It tells the true story of Christy Brown, an Irishman born with cerebral palsy who could control only his left foot. Brown grew up in a poor, working-class family and became a writer and artist.

The Miracle Worker: The story of Helen Keller and her teacher Annie Sullivan

Children of a Lesser God: An adaptation of the Tony Award winning stage play about a hearing speech teacher and deaf custodian who have conflicting ideologies on speech and deafness.

I Am Sam: The story of a father with a developmental disability and his 7-year-old daughter

The Diving Bell and the Butterfly: The true story of the Elle editor who suffered a stroke and had to live with an almost totally paralyzed body

Taare Zameen Par (Like Stars on Earth): A 2007 award-winning Indian drama, reissued by Disney, about an eight-year-old who excels at art and whose teacher suspects he is dyslexic and helps him overcome his disability.

Temple Grandin: The 2010 biopic about an autistic woman who revolutionized practices for humane handling of livestock on cattle ranches.

Young At Heart: A British documentary about a chorus of twenty-two senior citizens with an average age of eighty.

Posted in Action Reports, Assessment in Action, Public Affairs Toolkit | Comments Off on Addressing Public Affairs Through Film in Communication Science Disorders

Collaborative Assessment of Student Learning in Comprehensive Religious Studies Graduate Programs

Closing the Loop in Assessment of Graduate Religious Studies Programs

In 2009, Dr. Steve Berkwitz from Religious Studies received nearly $20,000 in grant funding from the Wabash Center for Teaching and Learning in Theology and Religion to host a three-day conference with colleagues from 12 universities with stand-alone graduate programs in Religious Studies. According to Dr. Berkwitz, “This workshop provided the context for the assessment of our program and spurred us to make substantial changes to the MA degree in Religious Studies.” Conversations with colleagues sharing examples of student learning led to the following structural changes to the program:

  • Based on a review of retention and completion data and the needs of students in the program, the department changed the timeline for student completion of the comprehensive exam.
  • Based on comparable programs and the needs of students, the program offered the option of writing a thesis or compiling a research portfolio with an introductory essay and intellectual biography.

“We’ve found the terminal MA attracts a broader constituency.”

Dr. Berkwitz explains: “We learned the way we structured our program was not helpful, so we dropped some requirements, streamlined the seminar system, and changed our comprehensive exam procedure. We found that students would take longer to finish their master’s program because they had to complete comp exams and a thesis in the same semester. We discovered that, for us, this wasn’t smart. Now, we have comp exams after year one. Students then focus on writing a thesis or doing a research portfolio, another thing we learned from the other programs.”

He goes on to say, “The thesis, while useful, is not for everyone; some students aren’t going on to a Ph.D. program. We’ve found the terminal MA attracts a broader constituency. Instead of revising two papers, they assemble a portfolio of their research work, adding an introductory essay that describes how their projects fit together, along with an intellectual biography of their work.”

Reaching Out to “Non-Completers”

Through the conversations about comprehensive exams, degree papers, and thesis options, the Religious Studies faculty made changes to the program. “We are finding success in having students finish the program more quickly. We have been reaching out to students who have disappeared and haven’t completed the thesis and told them about our portfolio option, and said, ‘Here’s a chance to finish the degree.’ Two or three who dropped off the map will now complete the degree. In our recent meeting at the society conference, I heard a colleague at Wake Forest say they were able to get some of their students back when given this option and they were happy. It makes everyone look good.”

The following faculty attended:

  1. Florida International University- Erik Larson
  2. Georgia State University – Kathryn McClymond
  3. Miami University of Ohio – Lisa Poirier
  4. Missouri State University – Mark Given, Martha Finch, Steve Berkwitz
  5. University of Colorado at Boulder – Holly Gayley
  6. University of Georgia – Carolyn Medine
  7. University of Kansas – William Lindsey
  8. University of Missouri-Columbia – Signe Cohen
  9. University of North Carolina-Charlotte – John Reeves
  10. University of South Carolina – Kevin Lewis
  11. Wake Forest University – Jarrod Whitaker
  12. Western Michigan University — Brian Wilson


Religious Studies Department

The Department of Religious Studies has:

  • 12 tenure-track faculty
  • Approximately 60 majors
  • Approximately 35 graduate students

Keys to Success

  • Using external funding to bring together faculty from similar programs from across the United States
  • Identifying and understanding individual student needs (e.g., will they continue to a Ph.D. program?)
  • Talking through learning, and looking for patterns based on faculty observations and student feedback
  • Following up with students to share a new option for degree completion
  • Changing the comprehensive exam system based on a review of retention and completion data

Methods to Collect Evidence

  • Faculty observations
  • Collaboration with the faculty of similar programs
  • Student feedback
  • Review of program requirements
  • Review of alumni


Posted in Assessment in Action | Comments Off on Collaborative Assessment of Student Learning in Comprehensive Religious Studies Graduate Programs

Assessment By Any Other Name…

Assessment sounds like grunt work, demanding serious brainpower, organization, weeks of planning, and a large committee. It’s like those household chores you hate but do anyway because you enjoy hosting. Or, in this case, because you enjoy your program or course.

But you probably already have assessment activities in place. You just may not realize it… Yet.

A few weeks ago, I was perusing the Dietetics website and discovered what I thought was a really engaging assessment report. It was actually the Dietetics Program Newsletter, but it could’ve served as an assessment report.



So what makes their newsletter such a perfect example of assessment?

It showcases 10 examples from Suskie’s Examples of Evidence of Student Learning:

  1. Admission rates into graduate programs and/or graduation rates from those programs.
  2. Placement rates of graduates into appropriate career positions.
  3. Pass rates on appropriate licensure/certification exams.
  4. Alumni perceptions of their career responsibilities and satisfaction.
  5. Student participation rates in faculty research, publication, and conference presentations.
  6. Recognition of student and alumni achievement: Dietetics devotes several pages to congratulating current students and alumni on awards and honors.
  7. Counts of courses with collaborative learning.
  8. Counts of courses taught using culturally responsive teaching.
  9. Counts of student majors participating in relevant co-curricular activities.
  10. Voluntary student attendance at disciplinary seminars, conferences, and other intellectual/cultural events relevant to a course/program.

What great information does your program/course already have that’s perfect for assessment? Check out Suskie’s one-page resource listing great ways to showcase how your students are learning.

How did this blog post change or support your thinking? Is Assessment looking less labor intensive? Does assessment by any other name really smell as sweet? Let us know in the comments section!

We’re here to support you in your assessment activities. If this post stirred up questions or generated ideas, invite us to your next assessment meeting by emailing Assessment@MissouriState.edu! We look forward to hearing from you.

Posted in Action Reports, Assessment in Action | Comments Off on Assessment By Any Other Name…

An Undergraduate Experience In Assessing Learning in Public Affairs

My Perspective


Before my participation in the Quality Initiative Project (QIP), an assessment of student learning related to public affairs, I didn’t give much thought to the public affairs mission. I never even considered why it was so important to the university. But now, I understand its purpose and hope to make it a part of my life outside of the university setting. The principles of the public affairs mission will help our generation create a better world. Equally important, I now know that my instructors care about my learning. The faculty that I met through the QIP workshop showed me that they want students to succeed, not just in the classroom, but in life as well. My input was taken seriously and showed me that even as a student, I am an important actor in determining the direction of this university. I am so grateful for the opportunity and hope that I will be able to attend other QIP workshops in the future.

My Message to Students

Students, if your attitudes are like mine prior to the workshop, please take heart in my experience. You have an important voice and you are lucky to be at a university where the faculty take student experiences to heart. And just as important, we are part of a community determined to make the world a better place by means of the public affairs mission.

My Message to Faculty

Professors, faculty, and staff, don’t be afraid of the public affairs mission. It may seem like a challenge to add new curriculum to your courses, but your contribution will make a difference in the lives of your students. Remember, teaching the Public Affairs Mission is not something that needs to be made into an assignment or a new unit; it can be found in how you carry yourself and in your teaching. As professionals at this university, we should be able to embody this mission in both our personal and professional lives. If we can do that, teaching the PA mission will become the norm. If we work together, we can help make a brighter future for everyone!

As a senior nearing the end of my program, I was not at the height of my enthusiasm for school. In spring 2014, I just wanted to graduate. When I attended the Quality Initiative Project workshop in May 2014, I was coming off an incredibly difficult semester. I had hit a roadblock and, like many other students, had no idea how to take my next steps. Though I did not expect it at the time, the workshop was an eye-opening experience for me, one that helped restore my faith in education.

Throughout the week, I had the opportunity to witness how much teachers actually care about their students’ learning. Sometimes, as a student, it’s easy to assume that instructors are teaching their classes as a means to a paycheck. The QIP workshop proved the opposite: the participating faculty sincerely cared about how deeply students were understanding the principles of public affairs. For them, it was not enough that students could repeat the names of the public affairs pillars. They wanted to know if students learned enough about the university’s mission to be an example of public affairs in their daily lives.

“The QIP workshop challenged us to analyze ourselves on a personal as well as academic level.”

I don’t think I was the only QIP participant who got more than I bargained for from the workshop. Though the workshop’s purpose was to assess student work, it became much, much more. The QIP workshop challenged us to analyze ourselves on a personal level as well as an academic level.

– Louise Love

Posted in Perspectives, Quality Initiative Project | Comments Off on An Undergraduate Experience In Assessing Learning in Public Affairs

Using Feedback from Student Internships to Make Changes In Agriculture Courses

Feedback From Internships Led to Research

After students provided feedback on internship experiences in Colorado and Kansas, agriculture faculty realized they needed to broaden students’ understanding of animal agriculture. In the feedback, faculty found students were amazed by how different agriculture is in other regions, especially once they had experienced new regions firsthand. Dr. Anson Elliott, department head, explained, “Students had some knowledge going out there, but what they learned here [in Missouri] did not apply as well in Colorado or Kansas because of differences in climate and size. Agriculture students at Missouri State are used to 200-acre cattle operations in Missouri. In Colorado and Kansas, it’s 200 ‘sections.’”

Agricultural methods and terminology vary across the country. Sue Webb, senior instructor, explained, “Students who did internships with big companies were caught by surprise. There’s an unexpected variety across the country. For example, there is less water in the Great Plains. These regional variations prompted us to think about a collaborative study between the University of Central Missouri and Northwest Missouri.”

Student Exit Survey and Changes to the Program

Agriculture developed an Exit Survey for seniors. Results from the survey led the School of Agriculture to make the following changes:

  • Used the Journagan Ranch as a resource to develop more hands-on experiences, including short internships, for students
  • Increased access to labs in Basic Animal Science classes
  • Increased the number of short-term internships at agriculture facilities and provided more opportunities to work directly with animals
  • Increased emphasis on animal welfare

Triangulating Student Feedback and Employer Feedback to Make Changes to the Program

  • Hired a natural resource and forestry faculty member. Students repeatedly reported needing more forestry knowledge, and this feedback was substantiated by the Conservation Department and the end users: employers.
  • Employers said students needed more experience with social media, and we needed to modify our program accordingly. Based on employer feedback and student feedback from internships, we hired a full-time person for Agricultural Communications. This person leads efforts to help students learn more about social media in the agriculture industry.


School of Agriculture

  • 18 faculty
  • 650 majors
  • 9 undergraduate degree programs
  • 40 graduate students
  • 2 graduate programs

Keys to Success

  • Developed an exit survey for seniors.
  • Utilized information from employers and students to improve the program.

Methods to Collect Evidence

  • Collected evidence through an exit survey.
  • Collected evidence from students during the internship.

Posted in Assessment in Action, College | Comments Off on Using Feedback from Student Internships to Make Changes In Agriculture Courses