Gamifying Blackboard Training

Technology trainings can be… well… just a bit dull. Being guided through which buttons to click in a piece of software has never been that exciting, and the steps can feel disconnected from your real work. Participants are often at different skill and knowledge levels, but must move through the same content at the same pace. Plus, the information is rarely retained beyond the workshop. In most cases, attendees have difficulty remembering the steps or applying them on their own and need to reach out for help.

There must be a better way. Right? The eLIS Support team decided to find out. We threw the outline for our standard Intro to myLesley (Blackboard) workshop into the trash and started over from scratch.

The Goal(s)

Our goals were simple in scope, but not so simple to achieve. The primary goal for the new training was to increase retention of the content so that faculty would be better able to apply their new knowledge when they returned to their desks. Toward that end, we used several techniques:

  • Storytelling would give attendees a narrative to attach the steps of the process to, aiding memory creation.
  • Narrative and game-based design would create a fun experience, engaging affective (emotion/feeling) learning and aiding memory creation.
  • Faculty would teach themselves rather than watching a demo and then trying to repeat the exact same steps. The need to figure things out and struggle a little would improve retention of the steps.
  • Faculty would be able to move mostly at their own pace, allowing more tech-savvy users to speed through the content and novices to take their time.
  • Faculty would learn to use available resources, including support tutorials and the colleagues attending the workshop with them.

Blackboard Clue (Version 1.0)

Who Killed Mr. Blackboard?
We created Blackboard Clue. Faculty had to discover the identity of Mr. Blackboard’s murderer, and to do so they would need to complete various tasks in Blackboard.

First, they met the suspects in the Study. Then they reviewed the Detective’s Notebook (a blog) and commented on the clues. Next, they interrogated the suspects (two members of the support team in the other room) on the Discussion Board. Finally, they created a Wanted Poster (a content item) for the suspect they believed did the deed. The true murderer was revealed at the end of the game, and everyone got to see if they had guessed correctly.

How It Went
It was ok. Everyone seemed to have a good time, but they were a little distracted by the game itself. They were so busy trying to solve the murder that it overshadowed the learning, and it was too hard to form connections to the content. The game needed more structure and probably a different narrative.

Soooo…. Back to the drawing board.

Agent L vs. GITS

Enter Agent L, a secret agent battling the Gremlins in the System (GITS). GITS agents NeoLuddite, Pandora, and Clippy are trying to disrupt Blackboard courses. Ben Friday (our Miss Moneypenny) hands out the missions, and Quinn (our Q) provides tech support.

Mission 1: Alert Your Fellow Agents! – Use the Announcement tool.
Mission 2: Fix the Broken Content – Use the text editor.
Mission 3: Interrogate the Captured GITS Agents – First, review the intercepted GITS transmissions on the Discussion Board. Then, interview Clippy (a support team member in the back of the room) in a discussion forum to discover where the GITS server is.
Mission 4: Create an Assignment – for Agent M, to shut down the GITS server.

How It Went
The new game narrative had more structure, allowing faculty to move through the missions more easily, complete the tasks, and not get lost. They used support tutorials to teach themselves all of the steps while the support team moved through the room to assist, but not to answer their tech questions for them. Collaboration was encouraged, allowing the more advanced attendees to happily help the newbies figure things out.

The game format had two primary issues, however. One, faculty become their students and don’t read the instructions. (Shocking, I know.) The game master would point out their error and direct them back to the mission profile. Two, some faculty just don’t want to play. They prefer the old format, where they sit back and are fed the content without too much expectation. That attitude can be hard to overcome, but the other attendees simply play on without them.

Overall, there was a lot of laughter, a lot of noise and chaos, and everyone was successful. We’ve offered the game a few times now, and feedback from faculty has been positive. We can’t say for certain that it’s more effective than the traditional training, but at least we are all having fun. That must count for something. Plus, the eLIS team has been able to model a new way of teaching and give faculty a chance to see us as fellow educators, not just the “tech folks.”

Improving Peer Feedback with Peergrade

Lisa Spitz is an assistant professor at Lesley’s College of Art and Design and the program director for the online BS program in User Experience. Lisa worked with eLIS this fall to pilot a peer feedback tool called Peergrade in her Sketching for Interactive Design course. Below, she shares her and her students’ experiences using Peergrade.

In this course, students use sketching to document research insights, tell a story, and visualize mobile interface concepts and interactions. Each individual assignment includes a period of sketching and revising, where students provide peer feedback and then revise their own sketches for overall clarity. My initial experience teaching this course in Blackboard raised a number of challenges with the peer review process: not all peer feedback was of the same caliber, and not all students received the same amount of feedback. This meant I was compensating for poor and/or incomplete feedback. I was also manually tracking the quality and quantity of the feedback each student provided to their peers for grading purposes. Further, due to inconsistencies in the types of feedback received, students reported finding it difficult to revise their work.

Over the summer, John McCormick in eLIS introduced me to an online peer review platform called Peergrade. The overall format and structure of Peergrade was a good match for my particular assignment structure and I was interested in seeing how it might better support our students in the peer review process.

The tool itself was fairly easy to learn. As an instructor, I was able to set up my “classroom” in Peergrade and create each of my assignments. For students, the experience entailed posting their sketches in Peergrade and then evaluating their peers’ sketches based on a custom rubric (which I set up before the course ran). The biggest challenge I faced was tailoring the rubrics to each individual assignment. Students evaluated their peers’ work based on the quantity and diversity of sketches as well as requirements unique to each assignment. The rubrics I created provided students with both quantitative and qualitative feedback on their sketches, and the system guaranteed that each student received feedback from three other students.

Students responded favorably to the use of Peergrade and were fairly self-sufficient in using the platform. It required very little technical support from my end; the most I had to do was allow late assignment submissions and permit students to re-upload their work. Some unprompted quotes from journal entries and the course evaluation include:

  • “I really like Peergrade, I only wish the rest of my courses used this site. It is so much easier to give the feedback and receive the feedback that you want without upsetting another peer about your opinion. Since it is anonymous it is easier to be truthful if you have suggestions on changes.”
  • “That program allowed me to finally get honest feedback from my peers on how they truly felt about my work.”
  • “By reviewing other students, I often could improve my own work through just that process alone.”
  • “The peer commenting system was a great way to discuss among other students each other’s work before turning in the final assignment each week. Critiquing helped me understand my own skills better.”
  • “Peergrade was life changing, love it.”

In an ideal world, I’d spend more time user-testing assignment rubrics before launching these assignments with a live class. However, design is an iterative process, and course design is no different; I anticipate refining each assignment rubric with each iteration of the course.

If you’re just not sure how this would work within your classroom context, I’d say start small. Choose one project in which you’d like students to give and receive quality peer feedback. Decide what a “good” assignment submission looks like and set up your rubric to probe specifically on those areas. Then, see what the experience looks like from both the Instructor and Student views. Having an initial experience with Peergrade will help you to determine just how and when it might be an asset in your courses. 

Peergrade is currently being used in a small number of online and on-campus courses at Lesley. If you would like more information about using Peergrade or peer feedback in your courses, contact elis@lesley.edu.

A Student’s Take on Peer Review

In this summative assignment in a freshman Honors English Composition class, students were asked to review their papers and assignments from the course and identify one to three specific areas of growth or improvement, along with the specific classroom activities, assignments, etc. that contributed to that improvement. Students were then asked to demonstrate this growth in a creative work in any format or mode and present the project to the class. A goal of the project was to reflect on one’s learning in a creative style that reflects both the learning itself and the personality or talents of the student.

Emily Tran created this awesome reflective video on her experiences with peer review.

Lesley Instructors Publish in Journal “Literacy Research and Instruction”

Leah Van Vaerenewyck, a Lesley doctoral student, and Valerie Shinas and Barbara Steckel, two literacy instructors from Lesley’s Graduate School of Education, have co-authored the article “Sarah’s Story: One Teacher’s Enactment of TPACK+ in a History Classroom” in the journal Literacy Research and Instruction.

The article focuses on a case study of one secondary history teacher and her approach to using technology to develop and support a socially situated community of learners. The authors cite research suggesting that teachers do not integrate technology into literacy or disciplinary curricula at high levels (to support higher-level cognitive skills, for example). They argue that to prepare students for higher education and employment, students must learn to think like scholars in the disciplines they study. For example, history students should be able to analyze primary documents, conduct research, and synthesize information across sources to draw conclusions. Strategic and principled use of technology, the authors argue, can support the development and maintenance of a community of learners focused on higher-level skill acquisition.

[Image: TPACK framework visualization]

TPACK, or Technology, Pedagogy, and Content Knowledge, is a framework built on Lee Shulman’s PCK (Pedagogical Content Knowledge). TPACK suggests that incorporating technology with pedagogical content knowledge can produce more effective teaching. The authors propose expanding the TPACK framework to include a sociocultural component and use this case study as empirical evidence supporting an updated model (TPACK+). They set out to “examine how sociocultural-oriented teacher knowledge, skills and beliefs intersect with TPACK in ways that leverage digital tools to create and sustain vibrant learning communities” (Van Vaerenewyck et al., 2017). Their observations showed strong evidence supporting this updated conceptualization of TPACK. The instructor’s use of learning technologies enabled the students to engage in authentic disciplinary discourse within socially situated learning experiences. The instructor was able to create a community of learners both within and beyond the boundaries of the physical classroom. Students engaged collaboratively in sophisticated ways, demonstrating that learning can be enhanced when embedded in socially situated experiences.

The authors call for further research examining in-service teachers’ skills and knowledge in relation to technology-integrated instruction to provide additional empirical support for their claim that the TPACK framework must be expanded.

Van Vaerenewyck, L. M., Shinas, V. H., & Steckel, B. (2017). Sarah’s Story: One Teacher’s Enactment of TPACK+ in a History Classroom. Literacy Research and Instruction, 56(2), 158-175.

The Hidden Element in Teaching: Modeling Expert Thinking

Instructors often give students strong support and guidance for producing assignments, such as examples of past work, a set of criteria or a rubric, and detailed instructions or guiding questions. Less common is guidance for how students should think as they approach a task. Every discipline has a specific approach to thinking within the field [1]. For example, historians use evidence differently than other disciplines do: they must weigh evidence that leads to different interpretations of historical events, and they need to learn how to identify, select, and use evidence in arguments [2]. In the study of literature, there are particular ways of analyzing literary texts. Novices (students), however, approach academic tasks differently than experts (instructors). Without specific guidance, they tend to use ways of thinking from earlier educational experiences, work, or other life experiences. These approaches tend to be ill-suited to the discipline-specific, higher-level approaches required. The key challenge in teaching this type of thinking may be making thinking visible.

Below are two examples from online courses at Lesley that use voice-over videos to model how the instructor approaches a task, focusing on the thinking that guides them. In the first video, instructor Wendy Hasenkamp shows students how to review a scientific article. This is from the course “Meditation and the Brain: Intro to Contemplative Neuroscience”. In the second video, instructor Lisa Spitz gives a detailed example of how one might work through a design challenge in the course “Typography I”, part of the new online “Design for User Experience” program.

Example 1

 

Example 2

This type of expert modeling is not the only way to support more expert-like thinking. Another approach is the “process worksheet” [3]: a simple, text-based scaffold that provides learners with the steps they need to take to solve a problem or approach a learning task. It might show a series of phases with key rules of thumb or advice for how to approach each one.

One reason that modeling expert thinking is a less common form of student support is that we often forget how we came to be experts; as a result, it can be difficult to tease apart the details of how we approach our disciplines. This is sometimes called “the blindness of expertise.” Once we review how we approach a task, we can begin to see details that might help guide students.

For more information about modeling expertise in teaching, please contact John McCormick: jmccormi@lesley.edu.

Citations:

[1] Meyer, J., & Land, R. (2003). Threshold concepts and troublesome knowledge: Linkages to ways of thinking and practising within the disciplines (pp. 412-424). Edinburgh: University of Edinburgh.

[2] Grim, V., Pace, D., & Shopkow, L. (2004). Learning to use evidence in the study of history. New directions for teaching and learning, 2004(98), 57-65.

[3] Nadolski, R. J., Kirschner, P. A., & Merriënboer, J. J. (2005). Optimizing the number of steps in learning tasks for complex skills. British Journal of Educational Psychology, 75(2), 223-237.