A Student’s Take on Peer Review

In this summative assignment for a freshman Honors English Composition class, students were asked to review their papers and assignments from the course and identify one to three specific areas of growth or improvement, along with the specific classroom activities, assignments, etc. that contributed to that improvement. Students were then asked to demonstrate this in a creative work in any format or mode and present the project to the class. A goal of the project was to reflect on one’s learning in a creative style that expresses both the learning itself and the personality or talents of the student.

Emily Tran created this awesome reflective video on her experiences with peer review.

Grading Offline This Winter Break

During the upcoming winter break myLesley will transition to a SaaS environment and upgrade to the latest version of Blackboard (Q4 2017). The upgrade and migration to the new servers will take place from Saturday, December 30, 2017 to Wednesday, January 3, 2018. You will not have access to myLesley during this time. All other Lesley services will be available.

We highly recommend completing your grading by Friday, December 29th. However, we understand that many faculty use the winter break to do their grading. With that in mind, we have provided some resources for downloading your students’ gradable items so that you can grade offline while the servers are down. The instructions cover downloading student-submitted assignments, collecting and printing discussions, saving blog, journal, and wiki content to PDF, and downloading a copy of your Grade Center.

Please keep in mind that you must complete these steps by December 29th. You will not have access to myLesley from December 30 – January 3.

If you have any questions or need assistance, please email elis@lesley.edu.

Lessons learned from running our first online Design for User Experience course

Today’s post is by Lisa Spitz, Lesley Assistant Professor and consultant for the College of Art and Design’s bachelor’s program in design for user experience.


In Fall 2, 2016 we ran our first course in the Design for User Experience program, Typography 1. Ten students signed up for the course. Excitement ensued. Then I started looking into the class roster. Of the 10 students, just one was a Design for UX student. The remaining students represented a mix of Business, Counseling, and Psychology programs. As a new program in an entirely new category for Lesley, I realize that it takes time to market to and enroll new students. Nonetheless, I was a bit disappointed by the turnout. I didn’t question the applicability of the content to individuals “outside the field”; the principles of good typography are something anyone can benefit from. But I was worried about the complexity of the learning activities I’d planned and the Adobe software that was required to complete them.

What I learned over the subsequent 8 weeks is the importance of being flexible and the benefit of testing a course with individuals outside your domain. Let’s start with the latter point. For those familiar with Universal Design for Learning or Inclusive Design, it’s a bit like that. If you can make your course “work” for individuals outside your program, chances are it will work better for those inside your program as well. I’m not talking about “dumbing down” content or removing requirements. I’m talking about adding instructional supports to make the course content and expectations clearer. Here are a few ways I made that happen while the course was still in flight:

Providing better prompts
Because this was a typography course, students were expected to create several designs and critique the work of their peers. However, journal entries revealed that students lacked the confidence to do so, and some even felt hypocritical critiquing their peers’ work. The original critique questions I’d provided assumed they could judge which design was best (or worst) and give concrete recommendations on what to do next. But students were not sure how to assess the work of their peers. How would they know which was best? They could certainly tell which one they liked, but could not articulate why it was better. So I went back to the drawing board and made the questions more personal: “What words would you use to describe this?”; “What is being emphasized?”; “What interests you about the design?” These questions were easier to answer. They required students to respond based on what they saw and how they felt, not on what they deemed to be “good” or “bad”.

Original critique language: [image]

Revised critique language: [image]


Creating more explicit directions

As a visual learner, I found that one of the biggest challenges in creating my own online course was getting around the “wall of text”. Explaining an activity requires quite a bit of documentation, and short of using all video or images, there’s almost no way around it. When confusion arises, the tendency is to double down with more explanation. Instead, I took a step back, added images, cut text, and used more headings and bulleted lists – detailing process, specifications, and steps for completion.

Original assignment description: [image]

Revised assignment description: [image]

Personalizing the feedback process
As students submitted their design work each week, I used the Assignment Tool to provide feedback. Originally, I defaulted to the WYSIWYG editor and wrote out what I thought worked, didn’t work, and needed improvement. However, it felt as if some of my feedback was getting lost in translation. Again, the wall of text. Midway through the course I switched to video. Instead of writing a single piece of feedback, I recorded my screen as I looked at each of their design options and spoke about their use of typography in great detail. If I’d typed that feedback out, it would have been a novel, but recording it took just a few minutes. Students appreciated the new format and commented on how incredibly helpful it was.

All of these changes required a great deal of flexibility on my part. I ended up rewriting each week’s content before it went live; I added images to show, not tell; I created videos that demonstrated how to do the assignments; I offered 30-minute 1:1 time slots to address individual challenges; and I gave feedback that was personal and specific. In the end, students commented on their new appreciation for typography and design. But more importantly, I witnessed their transformation. When week 1 started, students proclaimed themselves unable to be creative. When week 8 finished, they professed the ways in which they were using their new knowledge of good typography in their professional and academic lives. As for myself, I still have some work to do on the course curriculum – but I am confident that the results will be even better the next time around.

The Emergence of Learning Analytics: Evidence-based Decision Making

Learning Analytics is a fast-growing field in education focused on the use of data to improve teaching and learning. Learning management systems are starting to include dashboard tools with visual data displays, products like ALEKS use adaptive learning technologies in concert with analytics tools to provide students with personalized learning experiences, and Columbia University has recently established a Master’s degree in Learning Analytics.

While definitions vary, the focus of Learning Analytics is usually data that instructors and students can use, particularly during instruction, to positively impact learning. Below is an example of a dashboard in the learning management system Desire2Learn showing course data for one student:

[Image: Desire2Learn dashboard]

A different example of data used in teaching is shown below. This table is from Kaltura, which is integrated with myLesley (Blackboard). It shows data related to views of a video in an online professional development seminar facilitated in May 2016. This information lets instructors see which students are viewing the media and how much of it they are viewing:

[Image: Kaltura video analytics data]

A final example, from the open source LMS Sakai, shows the nature of student interaction in online discussions through a social network diagram. This data can be used early in a course to find out which students are less involved, which could become group leaders, and how much collaboration is happening in the discussions. As a course is running, an instructor might use this data to refine or redirect discussion activities and enhance the course’s interactivity. This kind of useful information is much harder to discern using the typical discussion tools in learning management systems.

[Image: Sakai social network diagram of discussion interactions]

We have no doubt that you will continue to hear more about Learning Analytics as the technology you use to support your teaching integrates data that is more visually accessible and actionable. Making use of this information in the right way can only enhance the learning experience you deliver – making it more targeted and responsive.

To find out more about the Learning Analytics currently available in myLesley, contact elis@lesley.edu.

Bb Grader App for iPad

Blackboard’s Bb Grader app allows faculty to grade Blackboard Assignments while on the go from their iPad. Just take a look at a few of the features:

  • View all assignments submitted to the Assignment tool from all of your courses.
  • Grade and annotate PDF, Word, PowerPoint, JPG, PNG, and HTML files directly in the app.
  • Use assessment rubrics.
  • Provide text, audio, and video feedback to students.
  • Return graded assignments to students as you complete each one or all at once.
  • Track student progress using the Retention Center.

It may just become your preferred way to grade. Bb Grader is iPad only – sorry, Android and iPhone users. Check out Bb Grader in action in the video below.