Assessment Committee

The College of Arts and Sciences Assessment Committee (CASAC) began working in October 2012 toward the goal of enhancing student learning opportunities in the college’s many programs. The faculty-led committee continues to encourage and assist faculty and administrators who seek to use contemporary assessment techniques to improve student learning opportunities. CASAC collaborates with the University Assessment Committee, the CAS Dean's Office, and the College Advisory Council to achieve this end.

  • Assessment Committee Members

    Faculty

    John Dunn, English [email protected]

    You Li, Communication, Media & Theatre Arts [email protected]

    Amanda Maher, History and Philosophy [email protected], co-chair

    Tricia McTague, Sociology, Anthropology, and Criminology [email protected]

    Laura Pawuk, Music and Dance [email protected]

    Greg Plagens, Political Science [email protected], co-chair

    Meriah Sage, Communication, Media & Theatre Arts [email protected]

    Angela Staples, Psychology [email protected]

    Department Heads

    Natalie Dove, Psychology [email protected]

    Mehmet Yaya, Economics [email protected]

    Associate Dean

    James Egge [email protected]

  • Recommended Approaches to Ongoing Program Improvement for the Current Academic Year

    Contemporary assessment techniques determine what students are learning so that faculty can use that information to make programmatic improvements.

    This year CASAC is strongly encouraging faculty to assess their programs using one of four approaches. Once each program has determined an assessment plan, it should be submitted to CASAC by October 31.

    1. Direct Assessment of Student Learning

    Direct assessment occurs when we evaluate actual student work (tests, papers, projects and tasks) in courses, usually comparing it to some desired program outcome.

    Continue with the program’s current assessment process and practices: Submit a report, using the provided template, on how student learning was assessed in the previous year and describe plans for the upcoming year. 

    2. Indirect Assessment of Student Learning: Focus on Students’ Experiences

    Indirect assessment occurs when we look at any information other than actual student work and attempt to draw conclusions about what might improve student learning opportunities. Common indirect assessment methods include surveys, exit interviews, focus groups, and department-level conversations about teaching methods.

    Meet with your colleagues and discuss how students have responded, or are responding, to various formats of instruction offered this academic year, particularly in terms of how well students appear to be meeting the program’s learning outcomes or goals. 

    • Since the institution continues to experience significant changes in platforms or program offerings, what opportunities and challenges are students experiencing?
    • How have students adjusted? How well are students meeting the program’s learning outcomes or goals? 
    • Provide aggregate data, composite sketches, or specific examples.

    3. Indirect Assessment of Student Learning: Focus on Instruction 

    Meet with your colleagues and discuss instructional approaches and strategies that appear to have enhanced how well students are meeting program learning outcomes or goals.

    • Since the institution continues to experience significant changes in platforms or program offerings, what opportunities and challenges have faculty experienced, or are experiencing?
    • How have faculty adjusted? What innovations or improvements have you observed, particularly in terms of students’ achievements toward meeting the program’s learning outcomes or goals?
    • Provide aggregate data, composite sketches, or specific examples.

    4. Program’s Choice of a Different Approach

    If none of the above fits with your program’s plan to reflect on assessing student learning, describe an approach you plan to use for this academic year, and describe how the plan will encourage faculty to reflect on how well students are meeting program learning outcomes or goals. 

  • Best Practices: The Ongoing Program Improvement Loop

    Student learning is a campus-wide responsibility: while our efforts may start in the classroom, improving it is a multidimensional, continuous, and collective effort. Many years of assessment at EMU and other institutions have led to a variety of techniques and practices that increase opportunities to improve student learning in individual courses and programs. CASAC recommends that faculty and administrators think about ongoing program improvement as a continuous loop that includes four stages: plan, do, check, and act.

    Planning and Goal Setting

    This stage is an appropriate place to start for a new course, the implementation of revised outcomes, or other new program initiatives. It invites important but sometimes under-considered questions:

    • At the course level, what do I want students to learn in my course? It can be helpful to capture in writing the learning outcomes the course is supposed to address: both course outcomes and, if relevant, general education outcomes.
    • At the program level, what do we want students to learn in our program or initiative? Related questions logically follow: 
      • Which program outcomes do we want to evaluate? 
      • Do we want to assess the effectiveness of a program delivery platform? 
      • Do we want to focus on the learning outcomes of a specific type of student? 
      • How can we observe or assess the identified outcomes? 

    Doing

    While assessment requires attention to outcomes, equally important are the conditions and experiences that lead to student engagement and success. 

    • In the classroom, this includes lesson planning, teaching strategies, pacing, text selection, and/or the cultivation of classroom climate. 
    • At the program level, this may include a change to prerequisites, a new instructional platform, a new mentorship program, or assessment of newly revised learning outcomes.   

    Checking and Identifying 

    After we set goals and make an assessment plan, we must check our data and compile our observations to evaluate whether students are meeting the outcomes. In this phase we also identify trends and patterns and illuminate successes and areas for further improvement. This phase may include:

    • At the course level, reviewing student papers, projects, surveys, or exam results. 
    • At the program level, disaggregating assessment results, reviewing professional exam results, capstone projects, and student surveys, and collegially sharing and recording observations.

    Informed Action

    When engaging in continual improvement, this informed-action phase is sometimes referred to as ‘closing the loop.’ Here, a program takes what it has learned, the issues that emerged, and the areas for growth, and proposes changes and additions to the program. Actions may include:

    • At the course level, changes in texts and materials, changes in pedagogy, revisions to assessments, and sourcing professional development. 
    • At the program level, curricular changes, instructional support, student support, resources, and revisions to the assessment plan.
  • Closing the Loop: Examples of Program Improvement in the College of Arts & Sciences

    There is value in all stages of the assessment process, as explained in the Best Practices: The Ongoing Program Improvement Loop section. The point where the process is most likely to break down is the final stage, acting. Using what has been learned is critical, and doing so is often referred to as 'closing the loop.' Below are common types of change, examples of each (adapted from California State University-Fullerton), and EMU resources that can provide support.

    Types of Change, Examples, and Places of Discussion & Possible Resources

    Curriculum

    • Change prerequisites
    • Add required courses
    • Replace existing courses with new ones
    • Change course sequence
    • Add internships, labs, and other hands-on learning opportunities

    Places of discussion & possible resources: department meetings; departments' instructional committees

    Faculty Support

    • Provide targeted professional development opportunities
    • Increase the number of TAs or peer mentors
    • Add specialized support for faculty (Library, Academic Technology, etc.)
    • Increase support to promote dialogue and community among faculty

    Pedagogy

    • Change course assignments
    • Add more active-learning components to course design
    • Change textbooks
    • Increase opportunities for formative feedback and peer-assisted learning

    Student Support-Academic

    • Increase tutors
    • Add more online resources
    • Improve advising to make sure students take the right courses
    • Provide resources and mentorship to encourage community building among students and between students and faculty

    Student Support-Personal

    • Provide resources that support mental health, food insecurity, and financial assistance

    Resources

    • Improve or expand lab spaces
    • Provide resources to support student independent research

    Assessment Plan

    • Refine outcome statements
    • Change methods and/or measures
    • Change where (e.g., in which courses) the data are collected
    • Collect additional data
    • Improve data reporting and dissemination mechanisms

  • Submit a Plan or Report

    Faculty and administrators engaged in the assessment process this year can follow the link on this page to submit a plan or report. All plans and reports are received and stored by the Office of Institutional Research and Information Management. Members of CASAC review the plans and provide feedback on reports.

    If you are using the submission site for the first time, ask Associate Dean Jim Egge to request access for you. After uploading a file, be sure to click "Save."

    Submit a Plan or Report

    Template for Beginning Programs

    Template for Accredited and Experienced Programs

    Curriculum Map Template

 

 
