In assessing your outcomes, whenever possible use direct evidence of learning rather than indirect evidence, which is very weak.

Examples of indirect evidence:

  • Asking students whether they learned
  • Asking students how much they learned
  • Asking students what they learned (e.g., “Did you learn about _____?”)
  • Simply reporting attendance at an event

Responses to all of the above suggest that students might have learned, but they are not evidence that they did.

It’s much better to use direct evidence of student learning – evidence that shows they actually learned. Examples of direct evidence include:

  • Pre- and post-tests of knowledge (or a post-test alone, though using both measures learning more directly – the post-test measures knowledge, while the change from pre to post measures the increase in knowledge, i.e., learning)
  • Pre- and post-surveys of opinions or beliefs (or a post-survey alone, though using both is generally better)
  • Student product (e.g., portfolio, journal, paper, project) rated by your professional staff using a rubric (or without a rubric, though a rubric yields stronger evidence of learning)
  • Student presentation or other performance rated by your professional staff using a rubric (or without a rubric, though a rubric yields stronger evidence of learning)
  • Focus group or informal group discussion led by professional staff (this could also be indirect evidence, depending on the questions asked)
  • One-on-one interview with professional staff (this too could be indirect evidence, depending on the questions asked)
  • Checklist of what students accomplished

Be sure to save all your assessment data (completed surveys, pre/post tests, scored rubrics) for at least four years, because WASC may ask to see it.

A rubric describing what SA is looking for in outcomes, benchmarks, and assessment methods is here.