Collect Data on Student Learning
Course grades are sufficient to inform an individual student about their progress in a class, but they are not granular enough to serve as assessment data. When assessing a class, therefore, it is important to select appropriate methods and develop appropriate assignments. When determining which methods you will use to assess student learning in your course, make sure that each assessment tool aligns with the course goals and learning outcomes. To check that they are in alignment, compare your answers to the following two questions:
- What do I want my students to be able to do with what they are learning in my class?
- What am I requiring my students to do to demonstrate what they have learned?
If the answers to these questions differ, the assignments and methods you use to assess student learning may not capture whether students are sufficiently meeting the learning outcomes. Assessment tools are methods for collecting data on student learning, and they fall into two types of measures.
- Direct measures are assessment tools that measure student learning by having students create or perform a task directly based on their learning.
- Indirect measures infer whether learning has taken place by asking for the perception of learning, typically from students, but also from those with whom they have worked.
Direct assessment evaluates aggregate student achievement on specific learning outcomes (e.g., “as a whole, students have learned X at this level”); in other words, it uses student performances, coursework, projects, and similar artifacts to demonstrate students’ learning. These methods need to aggregate achievement across students while also disaggregating the elements of an assignment so that each learning outcome can be measured separately. A variety of tools can be used to directly assess student learning, such as:
- Tools that are embedded in regular course assignments:
  - standardized exams (nationally normed, proficiency, licensing, etc.)
  - specific embedded test questions (aligned to specific learning goals)
  - multiple choice questions
  - short answer questions
  - essay questions
  - portfolios (graded with a rubric*)
  - writing assignments (graded with a rubric*); a single assignment may test multiple learning outcomes, but each outcome should be assessed independently
  - lab reports (graded with a rubric*)
  - checklists of requisite skills
  - minute papers or muddiest point exercises (or other graded or non-graded classroom assessment techniques**)
  - pre- and/or post-testing – asking specific test questions at the beginning and end of the term (or before and after you teach a specific topic)
- Authentic assessment of real tasks:
  - oral presentations (graded with a rubric*)
  - group projects (graded with a rubric*)
  - performances (musical, theater, etc.)
  - capstone experiences
  - oral defenses or exams
  - video recordings of student skills performances
* Rubrics allow instructors to share their criteria easily with colleagues and students, while also allowing multiple graders to rate work on comparable, standardized scales. Rubrics are often a helpful tool for identifying and assessing learning outcomes separately; however, other tools can also be used for this process.
** Classroom assessment techniques are simple exercises (graded or non-graded) that allow for more direct feedback about the learning/teaching process for the instructors and students in a course.
Indirect assessment includes tools from which you can infer student achievement rather than measure it directly; frequently, this information comes from students self-reporting their perceptions of their learning. Because it is based on perceptions of learning, it relies on indicators that serve as proxies for learning.
Examples of indirect assessment include:
- surveys (of current students, alumni, etc.), including Student Evaluations of Instruction (SEIs), self-evaluations of learning, recall of the learning experience after some time has passed, etc.
- exit interviews
- focus groups
- journaling (reflective, or other types)
- alumni database
- library usage
- CarmenCanvas usage data