She presented an "evidence" and "task" approach to assessing Student Learning Outcomes (SLOs), defined as the "accumulated knowledge, skills, and attitudes that students develop during a course of study". They can assessed at the class, programme, or institutional level (although for the purposes of this presentation, I felt that Dr. Kazin kept her examples to the programme level). SLOs reflect a shift of focus from, " What am I teaching?" to "What are students learning?" I found this learning-centred approach very familiar and relevant to me as it allowed me to participate in the rest of the presentation based on how we approach learning-centredness at NSCC.
SLOs need to be measurable to be meaningful, and in order to prove that they have been achieved there must be evidence. This evidence is gathered through the careful design of tasks, questions, and tests that will show whether or not a learner has met an SLO. The two main "tasks" presented in this workshop were:
- Multiple-Choice Questions
- Constructed Responses
The next section of this workshop focussed on Multiple-Choice Questions (MCQs) - what they are good at, how best to construct them, and when to use them. MCQs are good at:
- Breadth
- Facts
- Some higher-order thinking skills
- Easy-to-understand results
- Increased reliability
- Objective scoring
- Efficient scoring
Some time was spent analyzing the anatomy of an MCQ (sketched in code after this list):
- Stem - the question or problem statement
- Options - the possible answers
- Key - the correct option
- Distracter(s) - the incorrect options
- Focus on the stem
  - avoid undirected stems - always direct test-takers to the matter at hand
  - after reading the stem, test-takers should be able to answer the question without looking at the options
  - avoid too much info in the stem - strive for concise stems
- Focus on the options
  - avoid overlap
  - make the options parallel
  - put options in the most logical order possible (an interesting point - I've always tended to mix up the order of the options. Hmmm...)
  - avoid "All of the Above" and "None of the Above" options (Yes!!)
  - capture common misconceptions in distracters - make sure they are clearly wrong and avoid trickiness
- Avoid inadvertent clues
  - clues to the correct answer within an MCQ or elsewhere in the test
  - grammatical clues
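To make that anatomy concrete for myself, here's a rough Python sketch (my own, not from the workshop - the item and its wording are invented) showing how the parts relate, and why MCQ scoring is objective and efficient:

```python
# My own sketch of MCQ anatomy (invented example, not from the workshop).
# An item has a stem, a list of options, and a key; every option
# that isn't the key is, by definition, a distracter.

mcq = {
    "stem": "Which part of an MCQ states the problem to be solved?",
    "options": ["The key", "The stem", "A distracter", "The rubric"],
    "key": "The stem",
}

def distracters(item):
    """The incorrect options: everything that isn't the key."""
    return [opt for opt in item["options"] if opt != item["key"]]

def score(item, answer):
    """Objective scoring: an answer matches the key or it doesn't."""
    return 1 if answer == item["key"] else 0

print(distracters(mcq))        # ['The key', 'A distracter', 'The rubric']
print(score(mcq, "The stem"))  # 1
```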
Constructed Response (CR) questions are good at:
- Depth
- Higher-order thinking skills
- Assessment of performance
- Capturing the thought process
- Often less time to construct a test (but more time to score it)
The CR formats covered were:
- Short answer
- Essay
- Performance
A quick CR checklist:
- Define the task completely and specifically
- Give explicit directions regarding length, grading guidelines, and time to complete
- Develop and use an appropriate scoring guide (rubric)
The next section of the workshop dealt with the creation and use of rubrics (something I have used for several years and really believe in). The two styles of rubric looked at were holistic and analytic: a holistic rubric assigns a single overall rating to the whole piece of work, while an analytic rubric scores each success factor separately. There are pros and cons to both - personally, I prefer the analytic style (success factors, levels of assessment, tangible scoring). Holistic rubrics, in my opinion, tend to be more qualitative in nature and therefore more open to interpretation and discussion by learners and others (and we all know that a learner morphs into Clarence Darrow when discussing grades with faculty...).
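To show what I mean by "tangible scoring", here's a toy analytic rubric sketched in Python (the criteria, weights, and level names are my own invention, purely for illustration): each success factor is rated at a level, weighted, and summed, so the final grade falls out arithmetically rather than by gut feel.

```python
# A toy analytic rubric (criteria, weights, and levels invented for
# illustration). Each criterion is rated on a 1-4 scale, weighted,
# and summed - tangible, defensible scoring. A holistic rubric would
# instead assign one overall level to the whole piece of work.

rubric = {                     # criterion -> weight (weights sum to 1)
    "content accuracy": 0.4,
    "organization":     0.3,
    "mechanics":        0.3,
}

levels = {"beginning": 1, "developing": 2, "proficient": 3, "exemplary": 4}

def score(ratings):
    """Weighted sum of per-criterion levels, scaled to a percentage."""
    raw = sum(rubric[c] * levels[r] for c, r in ratings.items())
    return round(raw / max(levels.values()) * 100)

print(score({"content accuracy": "proficient",
             "organization":     "exemplary",
             "mechanics":        "developing"}))  # 75
```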
The last section of the workshop dealt with pulling things together - what to do with each SLO both before and after it is put into a syllabus or on a website:
- Be sure it can be assessed
- Give learners opportunities to learn and practice it
- Commit to fewer and do them well
- Talk to learners about its importance
- Make it explicit when you explain assignments
- Assess it well in exams
- Share results with colleagues - take a research approach (see the sketch after this list)
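On that last point, one small way to take a research approach is classical item analysis. The difficulty index of an item, for example, is just the proportion of test-takers who answered it correctly; unusually high or low values flag items worth a second look. A quick sketch with made-up response data:

```python
# Item-analysis sketch with made-up data. The difficulty index of an
# item is the proportion of test-takers who answered it correctly
# (higher = easier); extreme values flag items worth revisiting.

responses = {          # student -> their answers, one per item
    "s1": ["B", "C", "A"],
    "s2": ["B", "A", "A"],
    "s3": ["C", "B", "A"],
    "s4": ["B", "C", "D"],
}
key = ["B", "C", "A"]  # the correct option for each item

def difficulty(i):
    """Proportion of students whose answer to item i matches the key."""
    correct = sum(1 for ans in responses.values() if ans[i] == key[i])
    return correct / len(responses)

for i in range(len(key)):
    print(f"item {i + 1}: difficulty = {difficulty(i):.2f}")
# item 1: 0.75, item 2: 0.50, item 3: 0.75
```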
This was a very useful workshop - it showed me that MCQs do have a place in my assessment toolkit, and it reinforced my opinion that CR-style questions, tasks, and assignments are still my preferred assessment tools - for me, assessment is about what you can do, not just what you know...