Saturday, September 24, 2011

Sprinting with SPEN

This week I spent a day in Atlanta 'sprinting' to create practice problems with the Signal Processing Education Network (SPEN). SPEN is an NSF-funded project bringing together five universities to create a broad teaching network and a body of open educational resources (OER) for teaching and learning signal processing at the undergraduate level.

Sprint goals: creating practice problems

The SPEN network is enhancing existing open materials and creating new open textbook teaching materials integrated with interactive simulations and a rich body of homework, test, and practice problems. This sprint brought together 33 people (faculty and graduate students) to create practice problems and upload them to two different question-and-answer databases; each question ended up in both banks. The first bank, Quadbase, is an open question bank that anyone can add questions to and anyone can take questions from. It can serve interactive tutoring systems that pull targeted practice questions to help students learn and retain knowledge, teachers building homework sets, and learners looking for practice problems. The second bank is Georgia Tech's Intelligent Tutoring System, which is both a question bank and a tutoring system that Georgia Tech faculty and students use in their undergraduate signal processing classes.

This sprint was the first time SPEN got together for a day of group content creation. The sprint ran incredibly smoothly and the result was 160+ new questions for the databases. 

The procedure:
  • Prep: Before the sprint, organizers asked participants to bring any homework and problem sets they already used.
  • Instructions: Sprint organizers created a set of instructions with question topic prompts and sample questions for several different types of questions (multiple choice, matching, free response).
  • Groups: The organizers put people in groups of 3 to 5 at round tables and gave each group a set of topics and a shared Google Doc to work in together. The Google Doc contained sample questions that could be copied and pasted to create new questions. Math was entered in TeX notation, which most participants already knew and used fluently; TeX is dense but converts to attractive typeset equations (see the sample question sketched after this list).
  • Creation: Groups worked for 2 hours creating questions.
  • Signaling completion: When a question was finished, someone would highlight it in green.
  • Uploading to the question banks: A separate group of four people spent the whole day watching for the green highlighting and then copying the questions into the two different question banks. It was labor-intensive, but meant that every question could end up in both banks despite each bank having slightly different formats. It also meant that participants didn't have to learn new tools.
  • Review: Groups then reviewed the questions for 1 hour. Two groups merged for the review (making review groups of 6 to 10) and edited the questions to improve them.
  • Repeat: The whole process was repeated in the afternoon.
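To make the drafting format concrete, here is a hypothetical sketch of the kind of multiple-choice question a group might have written in the shared doc, with the math in TeX notation. The topic, wording, and answer choices are illustrative only, not taken from the sprint materials:

```latex
% Hypothetical sample question in the sprint's drafting style.
% Topic, wording, and choices are illustrative, not from the sprint docs.
\documentclass{article}
\begin{document}

\textbf{Topic:} Continuous-time Fourier transform

\textbf{Question:} What is the Fourier transform of
$x(t) = e^{-at}u(t)$, where $a > 0$ and $u(t)$ is the unit step?

\begin{enumerate}
  \item $X(j\omega) = \frac{1}{a + j\omega}$    % correct answer
  \item $X(j\omega) = \frac{1}{a - j\omega}$    % sign error
  \item $X(j\omega) = \frac{a}{a^2 + \omega^2}$ % real part only
  \item $X(j\omega) = \frac{1}{j\omega}$        % drops the decay term
\end{enumerate}

\textbf{Answer:} (1), since
\[
X(j\omega) = \int_{0}^{\infty} e^{-at} e^{-j\omega t}\, dt
           = \frac{1}{a + j\omega},
\]
which converges because $a > 0$.

\end{document}
```

A finished question like this would be highlighted green in the doc and then transcribed by the uploading team into each bank's format.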
My observations
  • Pre-workshop prep was not done: Very few participants brought existing materials, so it appears important that a sprint's success not depend on advance preparation.
  • Paper collaboration: Groups collaborated on creating questions by drawing on paper and talking about them. They did not collaborate directly in the Google Docs. Individuals would write up entire questions in the doc and then highlight them green to signal the uploaders.
  • Review by reading and solving: As described above, two groups merged for review, so groups 1 and 2 would review together. Group 2 would read group 1's questions, perhaps ask about the intent behind each one, and then actually solve all of them; group 1, of course, did the same with group 2's questions.
  • Review resulted in considerable change: The review process led to substantial revision of the questions, for clarity, for correctness (caught by solving), and to fix transcription errors introduced as the questions were copied into the question banks.
  • Pedagogy benefits: In addition to producing a large body of questions that are now globally available and shareable, the process itself was valuable: participants discussed, reviewed, and improved the questions, so the time they donated to sharing also paid off pedagogically.
