Monday, May 26, 2008

NISOD 2008 - Fundamentals of Good Assessment - Student Learning Outcomes

Today I attended two NISOD 2008 pre-conference workshops. The first one, Fundamentals of Good Assessment - Student Learning Outcomes, was presented by Dr. Cathrael (Kate) Kazin of ETS (Educational Testing Service).

She presented an "evidence" and "task" approach to assessing Student Learning Outcomes (SLOs), defined as the "accumulated knowledge, skills, and attitudes that students develop during a course of study". They can be assessed at the class, programme, or institutional level (although for the purposes of this presentation, I felt that Dr. Kazin kept her examples at the programme level). SLOs reflect a shift of focus from "What am I teaching?" to "What are students learning?" I found this learning-centred approach very familiar and relevant, as it allowed me to engage with the rest of the presentation in terms of how we approach learning-centredness at NSCC.

SLOs need to be measurable to be meaningful, and in order for there to be proof that they have been achieved there must be evidence. This evidence is gathered through the careful design of tasks, questions, and tests that will show whether or not a learner has met an SLO. The two main "tasks" presented in this workshop were:
  • Multiple Choice Questions
  • Constructed Responses
Dr. Kazin uses the term "constructed response" to identify any question or task that has learners construct their own response rather than select one. The question should be clear enough that learners do not have to guess at what a suitable answer would be.

The next section of this workshop focussed on Multiple-Choice Questions (MCQs) - what they are good at, how best to construct them, and when to use them. MCQs are good for:
  • Breadth
  • Facts
  • Some higher-order thinking skills
  • Easy-to-understand results
  • Increased reliability
  • Objective scoring
  • Efficient scoring
One of the other good uses of MCQs, brought up by several members of the audience, was preparing learners to write certification or licensing exams, many of which are based exclusively on MCQs - a whole different debate - do you want a mechanic who gets 100% on the MCQ test or one who can actually fix your brakes? Hmmm...

Some time was spent on analyzing the anatomy of an MCQ (I've sketched it in code after the list):
  • Stem - the question or problem statement
  • Distracter(s) - the incorrect options
  • Options - the full set of answer choices presented to the test-taker
  • Key - the correct option
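To keep those terms straight in my own head, here is a quick sketch of that anatomy in Python - my own doodle, not anything from the workshop, and the sample item and field names are invented for illustration:

    # My own sketch (not ETS's) of the anatomy of an MCQ; the sample item
    # and field names are invented for illustration.
    class MCQ:
        def __init__(self, stem, options, key):
            self.stem = stem        # the question or problem statement
            self.options = options  # all answer choices shown to the test-taker
            self.key = key          # index of the correct option
            # distracters: every option that is not the key
            self.distracters = [o for i, o in enumerate(options) if i != key]

        def score(self, response):
            """Objective, efficient scoring: 1 if the keyed option was chosen."""
            return 1 if response == self.key else 0

    item = MCQ(
        stem="At sea level, at what temperature does water boil?",
        options=["90°C", "100°C", "110°C", "212°C"],  # 212 captures a °F misconception
        key=1,
    )
    print(item.score(1))     # 1 - the key was chosen
    print(item.distracters)  # ['90°C', '110°C', '212°C']
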
I will admit to a personal bias against MCQs and MCQ tests. I really believe that the true measure of learning is in its application - show me what you can do - how you have assimilated the learning, made it your own, and then applied it in completing one or more related tasks or activities - it's about competencies. Having said that, this workshop did point out to me that, properly developed, there is a place for the (by me) maligned MCQ. That is a valuable takeaway for me. The key is to ensure that MCQs are properly developed and used - several tricks of the MCQ trade were revealed (a worked example follows the list):
  • Focus on the stem
    • avoid undirected stems
    • always direct test-takers to the matter at hand
    • after reading the stem, test-takers should be able to answer the question without looking at the options
    • avoid too much info in the stem
    • strive for concise stems
  • Focus on the options
    • avoid overlap
    • make the options parallel
    • put options in the most logical order possible (an interesting point - I've always tended to mix up the order of the options. Hmmm...)
    • avoid "All of the Above" and "None of the Above" options (Yes!!)
    • capture common misconceptions in distracters - make sure they are wrong and avoid trickiness
  • Avoid inadvertent clues
    • clues to the correct answer within an MCQ or elsewhere in the test
    • grammatical clues
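Pulling those tips together, here is an invented item - mine, not Dr. Kazin's - with comments noting which rule each piece is trying to follow:

    # An invented item that tries to follow the tips above - my example,
    # not one from the workshop.
    item = {
        "stem": (
            "While held at a stop, a car's brake pedal slowly sinks to the "
            "floor. Which component is the most likely cause?"
        ),  # directed and concise; answerable before reading the options
        "options": [  # parallel phrasing, alphabetical order (no inherent order here)
            "Brake pads",           # distracter built on a common misconception
            "Master cylinder",      # key: an internal leak lets the pedal sink
            "Parking brake cable",  # plausible-sounding, clearly wrong, not tricky
            "Tire pressure",        # plausible-sounding, clearly wrong, not tricky
        ],
        "key": 1,  # index of "Master cylinder"
        # No "All of the Above"/"None of the Above", and every option fits the
        # stem grammatically, so there are no inadvertent clues.
    }
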
While I am not sure that I am a total convert to MCQs, I did get a lot out of this workshop - MCQs do have a place in learning as an assessment tool, but like any tool they must be properly designed, developed, and deployed...

Constructed Response (CR) questions are good for:
  • Depth
  • Higher-order thinking skills
  • Assessment of performance
  • Capturing the thought process
  • Often less time to construct a test (but more time to score it)
Some examples of constructed response questions:
  • Short answer
  • Essay
  • Performance
These are the types of questions that I tend to place more value on personally - I feel that they are a truer test of learning in that they require some sort of synthesis on the part of the learner.

A quick CR checklist (a sample task spec is sketched below):
  • Define the task completely and specifically
  • Give explicit directions regarding length, grading guidelines, and time to complete
  • Develop and use an appropriate scoring guide (rubric)
I don't like the idea of explicitly giving a length for a CR question - I find that many learners write their answers to the length, not the content. My answer when I am asked "How long does it have to be?" has always been "Long enough to answer the question". I personally believe that once learners get used to the idea that length does not matter, they focus more on actually answering the question.
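To make the checklist concrete for myself, here is how I might capture a CR task as a simple spec in Python - a made-up sketch (the field names and sample task are mine, not the workshop's), and true to my own bias I have left the length directive out:

    # A made-up sketch of a CR task spec following the checklist above.
    # Field names and the sample task are invented, not from the workshop.
    cr_task = {
        "task": (  # defined completely and specifically
            "Explain, step by step, how you would diagnose a brake pedal that "
            "slowly sinks to the floor, and justify each step."
        ),
        "directions": {
            "grading": "Scored with the analytic rubric handed out with the task.",
            "time_to_complete": "30 minutes",
            # No length directive - long enough to answer the question.
        },
        "scoring_guide": "analytic rubric",  # developed before the task is assigned
    }
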

The next section of the workshop dealt with the creation and use of rubrics (something I have used for several years and really believe in). The two styles of rubrics looked at were holistic and analytic rubrics. There are pros and cons to both - personally I prefer the analytic style of rubric (success factors, levels of assessment, tangible scoring). Holistic rubrics, in my opinion, tend to be more qualitative in nature and therefore more open to interpretation and discussion by learners and others (and we all know that a learner morphs into Clarence Darrow when discussing grades with faculty...).
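To show what I mean by tangible scoring, here is a quick Python sketch of an analytic rubric the way I tend to build them - success factors with weights, defined levels, and a computed total. The factors, levels, and weights are invented for illustration:

    # A sketch of an analytic rubric: success factors rated against defined
    # levels, then combined into a tangible total. Factors, levels, and
    # weights are invented for illustration.
    LEVELS = {1: "Beginning", 2: "Developing", 3: "Proficient", 4: "Exemplary"}

    RUBRIC = {  # success factor -> weight (weights sum to 1.0)
        "Accuracy of diagnosis": 0.4,
        "Logical sequence of steps": 0.4,
        "Clarity of explanation": 0.2,
    }

    def score(ratings):
        """Weighted total on the 4-point scale; ratings maps factor -> level."""
        return sum(RUBRIC[factor] * level for factor, level in ratings.items())

    total = score({
        "Accuracy of diagnosis": 4,
        "Logical sequence of steps": 3,
        "Clarity of explanation": 3,
    })
    print(f"{total:.1f} / 4.0")  # 3.4 / 4.0 - a tangible score to point to
    # A holistic rubric would instead collapse all of this into one overall
    # judgment - quicker, but more open to interpretation (and litigation...).
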

The last section of the workshop dealt with pulling things together - what to do both before and after an SLO is put into a syllabus or on a Web site:
  • Be sure it can be assessed
  • Give learners opportunities to learn and practice it
  • Commit to fewer and do them well
  • Talk to learners about its importance
  • Make it explicit when you explain assignments
  • Assess it well in exams
  • Share results with colleagues - take a research approach
ETS has developed a report "A Culture of Evidence" that describes an evidence-centred approach to assessing SLOs.

This was a very useful workshop - it showed me that MCQs do have a place in my assessment toolkit, and it reinforced my opinion that CR-style questions, tasks, and assignments are still my preferred assessment tools - for me, assessment is about what you can do, not just what you know...
