Missouri State University
Assessment in Action
Understanding Student Learning

MSU “to be commended” for Quality Initiative Project

As part of the university’s accreditation renewal process, the Higher Learning Commission (HLC) determined that “Missouri State is to be commended for its work on this three-year project which has the potential to impact other institutions through the sharing of resources as outlined in the report.”


The ongoing Quality Initiative Project, a function of the Office of Assessment, is intended to align “institutional outcomes with general education, public affairs, student affairs and professional education outcomes” through a collaborative review of student work related to the Public Affairs mission. This in turn leads to institutional changes as the project’s findings are reintegrated into the classroom and into co-curricular programs.

The HLC determination is based on four criteria that it found the institution demonstrated:

  • Seriousness of the project
  • Institutional impact
  • Institutional commitment and engagement
  • Resource provision

The Higher Learning Commission reviews the university’s accreditation once per decade; the 2015 review will culminate in an on-site visit in October.

Posted in Assessment in Action

Annual Report, 2015

As part of the university’s Student Development and Public Affairs unit, the Office of Assessment is responsible for submitting an annual report detailing progress toward its goals over the past year and outlining the coming year’s primary objectives. You can see Assessment’s newly issued report right here: Assessment – Annual Report 2014-2015.

Some successes of the past year have included:

  • Re-engineering the University Exit Exam to simplify state reporting requirements and make the Exam itself easier to use.
  • Increasing response rates on student engagement surveys by nearly 15% to help us better serve the student community both in the classroom and beyond.
  • Assisting campus-wide preparations for the on-site visit of the Higher Learning Commission.

Posted in Assessment in Action

This Week in Assessment: June 8-12

  • On Tuesday, June 9, the University Exit Exam became available to students enrolled in GEN 499 for the summer semester. The summer administration will include some improvements over the spring pilot, but the use of Blackboard has been a big success.
  • The Career Center and the Office of Assessment are working together to enhance the senior survey section of the University Exit Exam. This will simplify state reporting and replace a paper-and-pencil survey given at the graduation ceremony.
  • The office consulted with several academic and student life programs as they think through their assessment plans for the coming year.
  • Administration of the Beginning College Survey of Student Engagement (BCSSE) has begun in partnership with SOAR. The BCSSE is used as an advising tool for incoming students and helps the university understand what students expect and need from their campus experience.
  • The Assessment team has worked to help secondary teacher education programs align state and national standards to their rubrics in preparation for Taskstream.
Posted in Assessment in Action

This Week in Assessment: May 3 – 8

Another week in the Office of Assessment is in the books. We laughed. We cried. We consumed coffee by the gallon. It’s an adventure. Keeping with long-standing tradition (as of two weeks ago), here are this week’s notable Assessment happenings:

  • The assessment team began working with the Graduate College to revise the graduate exit survey in order to fulfill state reporting requirements.
  • Assessment Staff and GAs assisted in hosting diversity training sessions in Plaster Student Union this week.
  • The penultimate batch of Public Affairs essays collected from the Exit Exam has been coded and sorted. These essays will be reviewed as part of the comprehensive public affairs assessment plan and shared with departments and colleges. After all, sharing is caring.
  • After an extensive search through a fantastic group of highly qualified applicants, two brand new GAs were offered and accepted positions for the upcoming academic year! One will be focused on all things Assessment. The other will play a role in both Assessment and the Ozarks Writing Project.
  • The short-windowed HLC Student Opinion Survey closed May 1. About 10% of the surveyed population (around 1,700 students) responded.
  • There’s less than one full month of NSSE action to go. So far, responses hover at 30.2%. Prizes are wrapping up. Dust is settling. Data distribution talks are happening. Survey life is good.
  • As of this writing, 33 participants across 8 Missouri State University colleges are planning to participate in the Comprehensive Public Affairs Assessment Workshop this summer. We’re excited to work with these wonderful thinking partners.
  • We had a great meeting with the Creative Writing program to discuss general education assessment plans. Thanks for inviting us!
Posted in Assessment in Action, Perspectives

Shining Moments in the Office of Assessment

Recently, we published a list-style blog post on important happenings in the office. Well, we like bulleted lists so much we wanted to follow it up with another one. This one notes some items everyone in the office finds worthy of celebration. So relax, find your zen, and read through the list below for positive assessment energies.

  • Assessment staff met with faculty from the English department to collaborate on a 2-year gen ed assessment plan for Creative Writing courses. Forward-thinking for the win!
  • Anyone get an email about the Higher Learning Commission with a Student Opinion Survey link embedded inside? Yeah, we coordinated that. The survey, sent in advance of the accreditation visit coming this fall, polled approximately 18,000 undergraduate and graduate students, asking general questions about their MSU experience.
  • It’s official: MSU’s NSSE response rate has hit the 30% milestone. Not impressed yet? Consider this: we quickly surpassed the average response rate of universities our size.
  • Currently, the FSSE (Faculty Survey of Student Engagement) is at a 22.5% response rate, while the FSSE-G (graduate instructor) is at 19.1%. These are solid rates for a survey that was sent to people who take surveys all the time.
  • 30 faculty and students have signed up to take part in the Public Affairs Comprehensive Assessment Workshop, a spin-off of the Quality Initiative Project that researches student learning and depth of knowledge in the three pillars of public affairs.
  • So far, 2,000 pages of student essays have been collected from the new University Exit Exam for use during the aforementioned workshop this May. 2,000. Think about that for a second; MSU students have something to say and we plan to take it to heart.
  • The assessment team has met several times to brainstorm and establish a set of collective commitments and a vision statement for assessment at MSU that will guide the office into the future.
  • Assessment Research Consultant Mark Woolsey achieved Black Belt status in his Blackboard training this week. The office is prepared to seek his sensei enlightenment any time Blackboard’s mysteries cloud our collective judgment. We forecast an abundance of digital karate-chops.
  • Assessment graduate assistants Katie Palmietto and Charlie Whitaker were given the fantastic opportunity to conduct interviews for a new fall GA opening in the Office of Assessment. The process is now in final interviews, and we are thrilled with the number of excellent, talented applicants who came to speak with us.
  • The new University Exit Exam process that kicked off at the beginning of this year has been reported as running smoothly. Efforts continue to be made to streamline the process, such as adding FAQ items to address specific issues.
Posted in Assessment in Action

This Week in Assessment: April 19-24

It has been a busy week in the Office of Assessment at Missouri State. Since lists are a great way to digest what has been accomplished, I thought a quick, 10-item post about the latest Assessment happenings was just the thing to share.

1. I’ve spoken with Joe Morris, Director of Student Orientation, about administering the Beginning College Survey of Student Engagement during SOAR. SOAR administers its summer survey via Campus Labs. Joe has asked us to talk through developing an assessment plan for SOAR. Really looking forward to it.

2. As a result of participation in the BCSSE, each student who took the survey has an advising report (click here for more info). Last year was the first year the Advising Report was available. In collaboration with Kathy Davis, Assessment developed four questions to send to 400 advisors to see if the reports were being used. The survey had a 25% response rate (100/400). Around 50% of respondents were familiar with the report, and 50% of those had used it. Of those who had not used it, 40% were willing and interested in trying it. This is already great news, but we will work on marketing the report for even greater effect. If people are interested in knowing more, we have these links. Otherwise, we would be more than happy to meet with you for a discussion.

iPad winner, Danielle Kothe, group-hugging the wonderful NSSE ambassadors.

3. Awards are on a roll! We’ve awarded the graduation seating prize for promptly completing the NSSE to Elizabeth Rudell from the College of Arts and Letters. And let’s not forget Agriculture student Danielle Kothe, who won an iPad for her quick NSSE response.

4. Speaking of the NSSE, MSU’s response rate is at 29.8% as of today. This is excellent: 10% higher than in 2012 and exceptional for an institution of our size. Thanks go to former graduate assistant Amy Bowen and Assessment Research Consultant Angela Young; their planning and collaboration with student groups made this possible. Thank you!

5. The Faculty Survey is at 13.9%, and the Faculty Survey for Graduate Instructors is at 11.3%. We intend to publicize both more and get those responses in. The more there are, the more we know.

6. This spring, students reflected on public affairs in the Exit Exam. Responses collected by April 2 were sorted by department, college, and pillar and then sent to all deans. One dean copied hers for some light homework on a trip. Another dean is holding a department heads’ meeting this week to review them. Yet another dean sent hers to department heads on April 23 for review. One college is using the information in its accreditation evidence file. We are so glad people are finding this information useful, and we are excited to ask people to share how they have used it and how it may have impacted their thinking.

7. I met with Recreation regarding their accreditation visit this week. I also met Dr. Kieth Ernce for the first time. Dr. Ernce is a reviewer for the accreditor and demonstrated fantastic knowledge of the assessment process; over the past 10 years, he has gone on more than ten review visits. Dr. Tina Liang in Recreation is a reviewer as well and went on her first visit this year. They are a couple of wonderful resources and thinking partners on the assessment of student learning.

8. At Assessment Council in May, we will be reviewing the Exit Exam questions in GEN 499 and discussing changes to be made for the fall Exit Exam administration. Administration through Blackboard was new in the spring, so we look forward to tweaking and streamlining the process for the better.

9. The student learning outcomes for all of the colleges are now updated on the Assessment page. Finally. The goal was to start by organizing them on our website. Web and New Media is expected to add these outcomes to department web pages by the end of the semester, which will offer greater transparency and more convenient access for students and faculty alike.

10. For many years, the Graduate College has administered a student survey. Since Mark Woolsey joined us, he has been collaborating with them. As a result, response rates are up, and we hope the Graduate College is satisfied with the collaboration. We are happy to help anyone interested think through methods of collecting evidence of student learning.

by Dr. Keri Franklin

Posted in Assessment in Action, Perspectives

Behind the Scenes in the Office of Assessment

Ever wonder what really goes on in that fourth floor corner of Carrington? Take a gander at one of the Office of Assessment’s most recent planning and brainstorming meetings by reading through the staff notes below:

Upcoming Goals

  • Following a successful meeting with the Diversity Visioning group (a committee within a larger task force developed to design a new vision for the university – ed.), we decided to make greater efforts to bring survey data, like the NSSE and BCSSE, to those interested across campus. A primary office goal is to provide Missouri State faculty and staff with compelling, accessible data and to follow up consistently on specific questions.
  • Beginning in January, the office collaborated with experts, including Missouri State’s Blackboard and Computer Services teams and Blackboard “Blackbelt” Sue McCrory, to administer the Exit Exam through Blackboard. Work is ongoing to make this new method of data collection and distribution more efficient, which, in turn, will benefit departments seeking information about what their students are saying as they graduate.

More on the University Exit Exam

As part of the University Exit Exam and due to this change in administration, exam administrators are asking students to choose among several prompts and write an open-ended response about ethical leadership, cultural competence, and community engagement. Our goal this week is to pull the responses that have been coded and de-identified and send them to departments. A department would then see the work of students graduating in May 2015 and December 2015. This evidence of student learning could be used in department, program, and accreditation assessment plans.

Each college will also receive a set of student work. We will contact each department and dean to supply the student work, and we are working with Associate Provost Chris Craig to share relevant information with department heads. During the April Assessment Council meeting, the council will review student work collected across the institution. There will also be a review session this May, which is open to anyone interested in reviewing student work related to public affairs.

Assessment Now and Tomorrow

  • We are working with the Student Experience Visioning Panel to find 10 first-year students to meet with that group and offer valuable feedback that will influence the future of the university.
  • Our big job right now is preparing for the Quality Initiative Project’s third annual workshop. Since this is the third year, the name must change to reflect the current nature of the scoring and analysis event.
Posted in Assessment in Action, NSSE/BCSSE, Quality Initiative Project

Conversations with Dr. Jillian Kinzie

by Angela Young

Dr. Jillian Kinzie of Indiana University is the Associate Director of the Center for Postsecondary Research and the National Survey of Student Engagement (NSSE) Institute. She was invited to campus to share the expertise she has gained from years of involvement with academic surveys that seek to reach the heart of what students are truly experiencing.

Dr. Kinzie

Dr. Kinzie met with over 50 academic leaders from across campus during her February visit to Missouri State and provided great insight for conversations about student retention, perseverance, and support.

These conversations were a great opportunity for campus leaders to gather in a common area and have critical discussions. Faculty members’ desire for high levels of student learning and success was the basis of discussions throughout the day.

Some topics raised by your colleagues included:

  • Long-term learning
  • Incoming skill set
  • Student ownership of education
  • Work/leisure balance
  • Needs of transfer students

So what are the next steps to ensure that these conversations continue?

Data from BCSSE* and NSSE** paired with the topics of discussion mentioned above can provide critical indicators and form a starting point for critical conversations. These conversations can help unify our ideas about what is important in student learning, and get us thinking about the next steps.

The Office of Assessment is willing to help bring these conversations to your college or department.

*BCSSE (Beginning College Survey of Student Engagement) provides important information about students’ thinking and preparation before they enter as first-year students.

**NSSE (National Survey of Student Engagement) provides direct feedback from students at the end of their first year and from seniors.

Posted in Assessment in Action, NSSE/BCSSE

The Story of Assessment of Student Learning in Religious Studies: Less Time-Consuming and More Meaningful

By Dr. Stephen Berkwitz, Department Head, Religious Studies

We used the same assessment tools and steps for the major for at least 15 years. We had something called a Form D, a one-page form for faculty to fill out about the majors in their courses. At the end of every semester, faculty would get a substantial stack of forms with the names of majors written on them. We were supposed to fill out the forms, recommend pieces of the students’ work, and give a general overview of what they did well and what they needed to work on. That, together with the exit interview, was the standard assessment protocol for over a decade.

Developing Student Learning Outcomes and Aligning With Assessments

Three or four years ago, we were told to come up with student learning outcomes and match SLOs to our assessment plan for the major. That’s when the department came up with a sheet and a procedure in which we collected papers from students, read them, and filled out a form explaining how the students met the SLOs we expect of them.

Our fairly barebones assessment procedure grew into a much more time-intensive method. Faculty were not enthusiastic about it. There were too many majors and too many sheets; it was too time-consuming, and it didn’t seem worthwhile. We were supposed to show advisees all of the forms during the exit interview. Faculty felt burdened by reading through three to four research papers ahead of the exit interview to show how students had met the SLOs. The assessment plan had become unworkable because it was so time-consuming. I don’t think we saw results that justified the time we were spending, and the data we were supposed to generate wasn’t being generated.

Keri (in Assessment) consulted with us and provided a few recommendations on how to streamline the process. One main recommendation was to have students do the work of filling out the form and explaining how they met the student learning outcomes. In the past, the faculty were working harder than the students.

The faculty committee liked some of the recommendations and came up with a new plan designed to generate useful data without requiring a heavy investment of time from faculty at the end of each semester. Following Keri’s advice, we split our assessment tool into three parts.

A Revised Three-Part Assessment Tool

Part I

A pre-exit-interview form is sent to students when they make an appointment. Students describe, in their own words, how they see their work in the program with respect to our SLOs. We’ve found this to be a useful exercise because students reflect on their career with us and get to take stock of what they have accomplished and learned. They don’t seem to mind filling out this 2½-page form, and the process makes things easier on faculty because we don’t have to fill out the form ahead of time in their absence.

Part II

The second part is a nine-question survey on SurveyMonkey that we send to students prior to the exit interview. This allows us to generate some quantitative information that is useful for evaluation and assessment purposes.

Part III

The third part is a list of three open-ended questions we ask during the interview. We discuss the form students filled out ahead of time as well as these questions, which we fill out while talking with them. This gives us two pieces of data from the exit interview and additional information from the survey. Our initial finding is that our new assessment plan gives us data we can act on. It’s informative and a lot less time-consuming. Faculty feel much more positive about the assessment plan and are willing to carry it out, compared to when the burden was on the faculty members themselves to collect all of the data. Sample Exit Interview

Two people sit in on the exit interviews with graduating majors: the advisor and one member of the curriculum and assessment committee, although the department head can fill in for either of these faculty members.

The other change we made was to abandon the “Form D” sheets that were filled out for every major in every Religious Studies course. We eventually acknowledged that faculty weren’t filling them out or going over them with their advisees. The idea was sound but impractical. We decided to stop that and replace it with the three tools that give us sufficient data to make changes to our program.

First Review of the Assessment Data

Following our first review of the new assessment data, we decided that students had plenty of writing assignments but fewer opportunities to give oral presentations. So we decided as a department to give more attention to oral presentations, and we reflected collectively on how we can give students the opportunity to develop those skills. There was a sense that we need to do a little more to help students develop oral communication skills.

We subsequently made a few slight changes to the forms themselves. We thought students may have misunderstood some of the questions on our self-report survey. We had asked approximately how many papers and presentations they completed, and the responses varied dramatically, perhaps because some students thought we were asking about their entire college career. We have changed the question to specify papers and presentations done for courses in the major.

Assessment Results

Our assessment results have generally been pretty positive. One area where we feel we have under-performed concerns the degree to which our program prepares students for a future career. Student responses here were respectable but lower than for the other questions. We had a discussion about rewording the question or doing more in those areas. That discussion came down to weighing how useful it is to receive a 100% result against how useful it could be to ask questions that give us something to act on.

Closing the Loop

The other part of our recent assessment retreat was deciding to do more in terms of professional development with our majors. We are considering working with the Career Center and Graduate College to hold workshops on preparing resumes and meeting with potential employers, and to give students more guidance in that area. This is something we hope to follow through on based on the results in our assessment data.

Our standing plan is to have a retreat every fall before the school year begins and spend part of the time going over the assessment data from the past year. The idea behind having an assessment committee member sit in on exit interviews is that the committee can then give us a general sense of what it is hearing from students and report that information to the department.

For the students, we want to convey that their opinions matter and that what they tell us helps us make changes to our program. One of our questions asks what they suggest we do differently. We want them to feel that their opinion counts. I think we have the form written in a way that helps them feel good about filling it out. If a grade were attached to it, I don’t think they would want to do it; it would be like any other homework assignment. We have found that students are liberated by the opportunity to report on our department in a way that doesn’t count for a grade. Sometimes we’ve gotten really detailed and thoughtful responses from students.

Faculty like the process we’ve developed. Our undergraduate students seem to like it. We are getting good feedback, and we continually seek to tweak it in order to generate useful data that we can report and act upon.

Posted in Assessment in Action