Working Group 2017-2018

Who

In the late spring of 2017, Provost Kaivola charged a working group of faculty and staff to:

“review the current course evaluation system and propose a revised process that better serves the institution and its members”.

A group was formed whose members bring expertise and motivation for the task from their respective roles. The working group consists of:

  • Kristen Chamberlain, Associate Professor of Communication Studies and Director of Assessment
  • Crystal Comer, Registrar (retired from the group, January 2018)
  • Ben Denkinger, Assistant Professor of Psychology
  • Scott Krajewski, Director of IT and CIO
  • Terrance Kwame-Ross, Associate Professor of Education
  • Kelsey Richardson-Blackwell, Assistant Director of Academic Advising, Technology Solutions
  • Diane Pike, Professor of Sociology

Why

The current form and process were developed in 2004 and reflect the best thinking of that time for Augsburg College. The time is right to evaluate the form and process and determine the best approach for Augsburg University 14 years later.

The Work in Fall 2017

The working group began meeting in earnest in the fall of 2017, building on an initial review of the literature by some members over the summer. The work of the fall included the following:

  • A review and discussion of the current literature and research issues.  This 46-minute literature review by Rice University provides a succinct overview for anyone interested.
  • Extensive discussions about Augsburg’s distinctiveness and history in relation to this topic.
  • Initial meetings and conversations with numerous campus groups including Chairs, Faculty Senate, Student Government, Grad Council, Deans, Gage Center departments, and others.
  • Open sessions in November and December to discuss the topic with interested individuals.
  • A pilot on changing response rates.
  • Initial work on defining the purpose of the instrument.

Pilot

One important theme in the literature and our conversations during fall 2017 was the issue of response rates.  To address this and learn more, a pilot project was launched in December.  The pilot tested using in-class time to complete the surveys.

  • Piloteers dedicated class time for students to complete the surveys using their mobile devices, laptops, or the classroom PC.
  • This strategy has been very effective in the PA Program.
  • Each piloteer received an email with a set of instructions.
  • Response rates for pilot courses will be compared against all fall courses, all courses from last fall, and the piloteers’ own courses from last fall.

University Course Survey

Another important theme is purpose.  What is the purpose of the instrument at the university-wide level?  The research points to having clarity on this issue in order to meet institutional goals and to ensure that results are interpreted appropriately.  The working group is discussing renaming the post-term institution-wide instrument the University Course Survey to reflect a redefined purpose.

  • The purpose of the survey will be to provide reliable and valid data on students’ perceptions of course experiences.
  • Therefore, the working group sees the form as one part of meeting the University’s need to measure our promise to students, with a focus on what we should know about the experiences our students are having in our courses.

The Work for Spring 2018

The working group is planning the following for the spring:

  • An update to the faculty, including data on the response rate pilot project.
  • Drafting a framework for feedback.
  • Drafting a revised instrument based on the redefined purpose.
  • Developing a way to get feedback on and test a revised instrument.
  • Preparing a proposal for the faculty to vote on in the fall of 2018.

A Framework for Feedback

In defining the purpose of the University Course Survey, it became clear through discussions within the working group and with various groups across campus that there are multiple needs.  A single instrument cannot be all things.  To address that challenge, the working group is drafting a framework for feedback for faculty.  That framework includes:

  • A University Course Survey that is an end-of-term instrument that provides reliable and valid data on students’ perceptions of course experiences for faculty, chairs, and administration.
  • Support for two within-course instruments administered by the faculty member for the faculty member’s own practice. CTL and the Assessment Committee would provide examples and models of these more qualitative forms, ready to use or to adapt to specific courses, whether labs, graduate-level courses, courses with multiple delivery methods, etc.
    1. Best practices recommend a mid-course correction activity in which the instructor gets feedback on what is working while the course is still under way. This benefits the students in the current course, not only those who will take it next time.
    2. Faculty should collect course-specific data toward the end of the term, on paper in class or using Moodle, to gather information on particular assignments, texts, use of class or lab time, and any other aspects of the course about which the instructor would benefit from feedback.