Easier Paper Grading with Google Classroom
Assessment, Collaboration, Google, Google Apps

Easier Paper Grading with Google Classroom

Hurricane Matthew forced TLT to cancel our session on “Easier Paper Grading with Google Classroom.”  We had several people ask if we could reschedule, so to meet the needs of more faculty we decided to do a recorded version of the class.  Check out the playlist to view the entire session, or click on the three lines in the upper right corner to view specific videos in the series.

 

Incorporating frequent quizzing encourages students to practice memory retrieval, which results in deeper, long-term learning.
Assessment, Best Practices, Small Teaching Tip, Teaching Advice

Small Teaching Tip #6: The Benefits of Frequent Quizzing

In a previous post, I discussed the important role memory retrieval plays in learning.  To briefly review: each time we recall a piece of information, we strengthen the neural pathways that move the information from our long-term memories to our working memories.  So the more times we retrieve the information, the more deeply we learn it.  This is known as the “testing effect.”

There are numerous ways to encourage students to practice memory retrieval, but one of the best strategies is frequent quizzing.

Tips for Frequent Quizzing

While quizzing is an effective method to practice memory retrieval, not all quizzes are created equal.  There are a few empirically-tested stipulations that must be considered:

  • First, make the quizzes count towards the course grade.  While we would love our students to complete quizzes simply for the joy of learning, most require extra incentive.  That being said, the quizzes should be relatively low-stakes.  The purpose of these quizzes is to practice retrieval, not to have an anxiety attack each week.
  • Second, avoid the pop quiz.  Pop quizzes are only effective at intimidating students into coming to class.  For most students, they do not encourage actual learning.  But quizzes that students know about in advance do.  Rest assured, these assessments do not need to be lengthy or require labor-intensive grading (there are countless instructional technologies that can help facilitate this process).
  • Third, design quizzes to be at least partially cumulative.  This requires students to reach back to concepts covered earlier in the term, developing deeper understanding and more complex mental models.  Remember: greater retrieval efforts equal greater learning (note the emphasis on the word effort).
  • Fourth, include question types that will be similar to what students can expect on exams.  This allows students to familiarize themselves with those formats so the exam is a test of knowledge instead of exam-taking ability.
  • Finally, occasionally assign quizzes that students complete before they learn new material.  This may seem strange, but a pre-quiz encourages students to consult their previous knowledge to help them grapple with new ideas.

If you don’t have enough class time to devote to frequent quizzes, consider using online quizzes through OAKS.  Most textbook publishers offer gigantic test banks with more than enough questions to create multiple quizzes throughout the semester.  These banks are designed to be quickly imported into OAKS, and quizzes can be automatically graded, making quiz creation and administration simple.  But to ensure students are practicing retrieval, restrict the time limit so they don’t have the leeway to look up every answer in their notes or book (20-50 seconds per multiple choice question is advisable).
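Turning that per-question guideline into a quiz time limit is simple arithmetic. Here is an illustrative sketch; the function name and the 35-second midpoint are assumptions for the example, not part of the original advice:

```python
# Sketch: converting the per-question guideline into an OAKS quiz time limit.
# The 20-50 seconds/question range comes from the post; the 35-second default
# and the function itself are illustrative assumptions.

def quiz_time_limit(num_questions, seconds_per_question=35):
    """Return a suggested time limit in whole minutes."""
    total_seconds = num_questions * seconds_per_question
    # Round up to the next whole minute so no question is cut short.
    return -(-total_seconds // 60)

# A 20-question multiple-choice quiz at ~35 seconds per question:
print(quiz_time_limit(20))  # 12 (minutes)
```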

Providing frequent opportunities for retrieval will not only help your students remember important information, it will also open the door to higher levels of cognition.  I’ve shared one simple but powerful way to help your students learn that does not require an overwhelming amount of grading or extra preparation. Want more ideas?  Check out the rest of our Small Teaching Tips series!

References

Roediger, H. L., Agarwal, P. K., McDaniel, M. A., & McDermott, K. (2011). Test-enhanced learning in the classroom: Long-term improvements from quizzing. Journal of Experimental Psychology: Applied, 17, 382-395.

Roediger, H. L., & Karpicke, J. D. (2006). The power of testing memory: Basic research and implications for educational practice. Perspectives on Psychological Science, 1, 181-210.

Leeming, F. C. (2002). The exam-a-day procedure improves performance in psychology classes. Teaching of Psychology, 29, 210-212.

Lyle, K. B., & Crawford, N. A. (2011). Retrieving essential material at the end of lectures improves performance on statistics exams. Teaching of Psychology, 38, 94-97.

Richland, L. E., Kornell, N., & Kao, L. S. (2009). The pretesting effect: Do unsuccessful retrieval attempts enhance learning? Journal of Experimental Psychology: Applied, 15, 243-257.


This post is part of a series which presents low risk, high reward teaching ideas, inspired by James Lang’s book Small Teaching: Everyday Lessons from the Science of Learning.

Did you know?
Assessment

DID YOU KNOW…YOU CAN BLIND-GRADE QUIZ ESSAYS IN OAKS?


I just learned that the OAKS Quizzes tool allows you to grade short answer, essay, or long answer questions without knowing which student wrote them.  This blind grading feature has evidently been around for a while, but it is so well hidden that most users wouldn’t even know to look for it.  It’s a great way to help remove grading bias.

  1. To blind grade a quiz, begin by clicking Grade > Quizzes from the upper navigation.
  2. Next to the appropriate quiz, click Grade from the dropdown arrow.
  3. Click the Questions tab.
  4. Click Blind Marking.
  5. Click on the first question to grade.
  6. Read the response and type in a grade and feedback.
  7. Navigate between student responses using the arrows at the top.
  8. When finished, click Save, then repeat the process for any other questions.

Images of the above directions

 

Dear TLT
Assessment, Dear TLT, TLT

Can I grade one OAKS Dropbox Assignment using two rubrics?

Dear TLT,

I have an assignment that has two components but produces one grade.  I’d like to use a different OAKS rubric to evaluate each component.  Is it possible to do this within one OAKS Dropbox so that there is only one grade in the grade book?  I really want to keep them as one assignment.

Sincerely,

Karen HB
Health and Human Performance


 

Dear Karen,

The answer is yes and no.  The OAKS Dropbox allows the instructor to attach multiple OAKS rubrics to one assignment and use both of them to interactively grade the work.  However, it will only automatically load the calculated score from the first rubric.  As you can see from the screenshot below, only Rubric 1’s score has been entered into the Score area for the assignment.

Two graded rubrics with the score from the first one transferred
Two rubrics added and graded. Notice only the first rubric score transferred automatically to the assignment score.

 

 

You will just need to manually enter the appropriate grade into the Score area, based on the outcome of the two rubrics.

The other option to consider would be to create only one rubric in OAKS that has two Criteria Groups.  Group 1 is for the first component of the assignment and Group 2 is for the second.  The benefit of using groups is that you can use different scoring levels per section.  Note: this may not produce the same outcome as the two rubrics, so be sure to test it before applying it to a live assignment.

Multi-group rubric

Sincerely,

TLT

headshot of Michelle McLeod
Assessment

Faculty Guest Post: Using Technology to Optimize Student Feedback

This month’s faculty blogger is Michelle McLeod, PhD, ATC, PES, who is an Assistant Professor in the Department of Health & Human Performance.


This blog post is inspired by the lessons and skills learned during the May 2015 Faculty Technology Institute, which focused on planning an interactive lecture. I feel that my lectures are most effective when the classroom is engaged and interactive rather than me talking at students and merely hoping that they are paying attention. Interactive lectures provide an opportunity for real-time assessment and feedback, ensuring that students are not only receiving information but have a fundamental understanding of it. They also provide me with feedback about my effectiveness in content delivery. I spent much of the 2015-2016 academic year incorporating interactive lectures and technology into my strategy for making the classroom truly more engaged. Here are some of the successes and failures that I encountered.

I will center this blog post on a research proposal assignment given in EXSC 433: Research Methods and Design in Health and Exercise Science. An area where I know that I can continue to improve is providing timely feedback to students. Rather than focusing on the research proposal itself, I will focus on the evolution of the project from the standpoint of how I could more efficiently assess student work and provide helpful feedback through the use of technology.

For me, one of the most painstaking parts of evaluating student work is simply accessing the work. I really, really dislike accessing work submitted via OAKS. It is so limited. I am also striving to go paperless with most work. In the Fall of 2015, I thought I had found the perfect solution: Kaizena. The tag-line of this Google app is “Fast, personal feedback on student work.” Dream come true, right? Not so fast. I attended a TLT workshop hosted by Jessica, and Kaizena seemed awesome. Kaizena is a Google application, so all students have access with their g.cofc.edu accounts. Students search for their professors on Kaizena and join “groups” (e.g. EXSC 433). The attraction of Kaizena is that it keeps a running record of conversations between students and professors regarding their work. Students upload their work in a document that you can view directly in Kaizena. Professors can create quick links for commonly used feedback in the form of text, hyperlinks, videos, and voice. I thought for sure that this would cut down on the time it took me to provide student feedback and give students ample time to make corrections.

I couldn’t have been more wrong. In hindsight, it was probably not the best idea to try to incorporate this technology without playing around with it more first. I had 24 students in this course, so I gave them the option to work together on these research proposals. The first drawback I encountered was that I could not create groups within Kaizena; students had to search for each other first and add me to a conversation. Not such a big deal. However, when it came time for students to submit the first portion of the research proposal, consisting of hyperlinks to articles and written summaries of those articles, I felt an impending sense of doom. You are not able to edit directly within the uploaded documents. You may highlight a portion of the document and provide commentary. If the student had submitted their work as a Google Doc and provided permission to edit the document, then you could open the document in Google Docs to do this. This seemed to negate the need for Kaizena (spoiler alert: this was ultimately my conclusion). Perhaps the biggest headache was that you couldn’t click hyperlinks in the uploaded documents. As I mentioned previously, part of this portion of the assignment was for students to provide hyperlinks so that I could confirm their references. I was asking students to resubmit their work, and on many occasions students claimed to have submitted work that I could not find when I opened their conversations.

The end result was that it took me longer than I had anticipated to provide valuable feedback. More of my time was spent requesting changes in formatting so that I could even access the needed content. I therefore felt the need to be much more lenient in my assessments of student work. However, professors still have learning experiences on the regular, right? This spring semester, I kept this assignment as a part of the course.  Instead of Kaizena, I kept it simple and required students to submit their work via Google Drive as a Google Doc. I still use OAKS to upload lecture content and grades for student accessibility. However, I almost exclusively provide links to a Google Drive folder for students to submit their work. I can provide real-time feedback, review changes that have been made to student work, and see when those changes were made. Because the students can also see when I have provided feedback, this helps keep both parties accountable.

It’s still not a perfect system. I am still revising rubric content and still run into challenges with students accessing folders. (Tip: if a student says they do not have access, I find that it is usually because they are trying to sign in with an account other than their g.cofc.edu account. Instruct them to first try signing out and signing back in!) Other lessons learned throughout this process of trying something new and fun:

1) Have a rubric! Developing a good rubric can be challenging and does take some time on the front end, but it has made my life easier as far as grading. Students also have a clearer picture of what is expected of them.

2) I love this assignment because it is an opportunity for students in our department to express their interests and creativity. That being said, make sure there are reasonable expectations for what you want to see in their work. I went from having very loose directions for student work to being pretty specific, down to the font size, margins, and maximum page length of proposal sections. In Google Docs, I provided an example that the students could copy and fill in with their own work. You might be thinking: getting a little nit-picky here, Dr. M? Maybe; but part of research proposal writing is being able to follow directions! Simple, yet still overlooked.

3) Being able to provide feedback more efficiently and effectively has helped improve student engagement and interaction. Not always in a direct and personal manner, but communication improved. I felt that students were more inclined to ask questions or request clarification, and I could provide better suggestions or solutions. This was reflected in my course evaluations this spring. Although I’m not yet lightning fast with my feedback, my timeliness has drastically improved, and I’m optimistic that it will continue to do so.

 

Teacher and students engaged in discussion
Assessment, Best Practices, Pedagogy, Teaching Advice

The Essential Role of Memory Retrieval in Student Learning

Too often, at professional development workshops or on education blogs, there’s an emphasis on designing courses that encourage students to reach the summit of Bloom’s pyramid.  There’s absolutely nothing inadvisable about helping students analyze, evaluate, and explore.  But in our race to the top, we often overlook the importance of remembering, understanding, and even applying (especially in our upper-level courses).  According to cognitive psychologists, this is a mistake that can have damaging effects on student learning.  Without foundational knowledge, it is difficult, if not impossible, for students to demonstrate higher order levels of thinking.  According to cognitive psychologist Daniel Willingham:

“Thinking well requires knowing facts, and that’s true not simply because you need something to think about. The very processes that teachers care most about–critical thinking processes such as reasoning and problem-solving–are intimately intertwined with factual knowledge that is stored in long-term memory (not just found in the environment).” (quoted in Lang, 2016, p. 16)

Without a solid understanding of basic concepts, theories, and processes, a student cannot think creatively or critically about a discipline’s body of knowledge.  This academic groundwork allows students to integrate new knowledge in deeper ways and make more sophisticated connections.

Unfortunately, students often make poor choices when they attempt to learn new information.  Have you ever asked your students (maybe after the class did terribly on an exam) how they studied?  Often, students will say things like “I re-read my notes” or “I made flash cards and read them over and over again.”  Research has demonstrated that these are some of the least effective strategies for committing information to long-term memory.  Thus, if we care about our students’ learning, then we must design our courses in ways that actually help students learn, not simply cram and forget.  

Exams are considered by many to be the gold standard of measuring student learning.  However, most instructors are not familiar with the cognitive science literature and, therefore, do not design exams that actually result in student learning.  Better understanding the retrieval effect (sometimes called the testing effect) will help us to create more effective assessments.

How many times have you claimed your “brain is full” or “you can only remember so much”?  Our long-term memories are actually capable of holding quite a lot of information.  Cognitive psychologist Michelle Miller argues “the limiting factor is not storage capacity, but rather the ability to find what you need when you need it.  Long-term memory is rather like having a vast amount of closet space–it is easy to store many items, but it is difficult to retrieve the needed item in a timely fashion” (quoted in Lang, 2016, p. 28).  She explains that each time we recall a piece of information, we strengthen the neural pathways that move the information from our long-term memories to our working memories.  This is key.  The more times we retrieve the information, the better.

Book Small Teaching by James Lang

To encourage your students to practice retrieval, try these strategies from James Lang’s book Small Teaching:

The Retrieval Syllabus.  Most of us distribute our syllabi on the first day of class and never mention them again, until a student violates a policy or makes a complaint.  Instead of thinking of your syllabus as a contract, envision it as a resource that is continuously referred to throughout the semester.  Fill out the course schedule with details that will help students see how the course will progress, how topics connect to one another, and how knowledge is organized in your discipline.  Then, during class, ask students to look at the document to orient themselves as well as remind them of what has been discussed thus far.

Warm-up Review.  In the first few minutes of class, ask students to write down on a scrap sheet of paper the topics that were covered the class period before or the main themes from the reading.  Ask students to share their “take aways”: What do they think was the most important point?  What struck them?  What piqued their interest?

I’ve done something similar with my students, but I simply asked the class to provide a review orally.  Typically, the same few students are the only ones who reply.  Thus, not everyone is encouraged to practice retrieval, so this method is less effective than asking all students to write down their recap.  This simple exercise has the added benefit of an intellectual “warm-up” — prepping students for learning and participating during class.

Exit Tickets.  Similarly, at the end of class, have students complete an exit ticket.  For example, you could ask students to write down two things they learned and one question they still have.  This requires retrieval and provides valuable information about what students identify as important and what they are struggling with.  This can serve as a great jumping off point for the next class period.

What is absolutely essential for both warm-ups and exit tickets is that students are told not to consult their notes or textbook when responding.  If students look up the answers, they are not practicing retrieval.  It’s also important to explain to students the purpose of these exercises.  You’re not trying to test them or give them busy-work; you’re trying to help them learn more effectively.

Frequent Quizzing.  Frequent, low-stakes quizzes are one of the best ways for students to strengthen their retrieval muscles.  Remember that the more we recall information, the stronger the neural pathways between long-term and working memory.  When creating quizzes, it’s essential that they are not weighted heavily.  The point is to encourage retrieval, not stress students out.  It’s also important to include question types that will be similar to what students can expect on exams.  This allows students to familiarize themselves with those formats so the exam is a test of knowledge instead of exam-taking ability.

If you don’t have enough class time to devote to frequent quizzes, consider using online quizzes, such as through your Learning Management System (LMS).  Most textbook publishers offer gigantic test banks with more than enough questions to create multiple quizzes throughout the semester.  These banks are designed to be quickly imported into your LMS, and quizzes can be automatically graded, making quiz creation and administration simple.  To ensure students are practicing retrieval, restrict the time limit so they don’t have the leeway to look up every answer in their notes or book (30-60 seconds per multiple choice question is advisable).

Space Out Due Dates.  Students should complete multiple smaller assessments throughout the semester (as opposed to only one midterm and one final exam).  Intersperse lower stakes assessments (e.g. weekly quizzes, practice problems, minute papers) with higher stakes assessments (e.g. exams, research papers, lab reports).  According to James Lang, “the more frequently that your students have to check in and offer some demonstration of their learning, the more often you are giving them retrieval practice” (2016, p. 36).

Providing frequent opportunities for retrieval will not only help your students remember important information, it will also open the door to higher levels of cognition.  I’ve shared simple but powerful ways to help your students learn that do not require extra preparation, overwhelming amounts of grading, or even that much class time.  Want more ideas?  Check out James Lang’s fantastic book Small Teaching and then ask yourself, “what small changes can I make to help my students learn?”

Evolution of a Group Research Project
1-1-1, Assessment, Best Practices, Research, TLT

Faculty Guest Post: Evolution of a Group Research Project

Today’s Faculty Guest Post is from Chris Mothorpe, Assistant Professor of Economics.  Chris attended TLT’s 2015 Faculty Technology Institute.  In this post, he reflects on the process of revising and improving a group research project in two of his courses: Urban Economics and Economics of Geography and Transportation.  This is an excerpt from Chris’ own blog.  To read the entire post, please visit: https://sites.google.com/site/chrismothorpe/home/group-research-projects


I am writing this blog post based on my experience conducting research projects in my upper-level economics classes over the past three semesters. This post will not discuss the research project in its entirety; instead, it will provide a general overview of the project and then focus on specific challenges I have faced each semester and the different strategies I have employed (or am planning to employ) to overcome them.  There are two main challenges I will discuss: 1) group formation; and 2) peer evaluations.

Project Overview

I decided to require a group research project after reading several magazine and newspaper articles discussing what companies are looking for in college graduates.  Atop many of the surveys were not the hard technical skills taught in classrooms, but the soft skills developed in non-academic, extracurricular settings.  These soft skills include: 1) leadership; 2) ability to work in a team; 3) written communication skills; 4) problem solving skills; 5) work ethic; 6) verbal communication skills; 7) initiative; 8) interpersonal skills; 9) creativity; and 10) organizational ability.  Conducting a group-based research project provides students the opportunity to practice many of these skills — practice they would otherwise not receive if the class is taught in a more traditional manner.  A second motivating factor is to allow students the opportunity to apply economic models to real world problems.


The stated objectives for the research project are:
  1. Analyze a contemporary economic issue or social issue using economic theory and models
  2. Demonstrate versatile and competent written, oral and digital communication skills
  3. Evaluate communication situations and audiences to make choices about the most effective ways to deliver messages
  4. Appraise written communication skills through self and peer evaluations
  5. Manage diverse teams successfully

The project is set up as a paper submission to the (fictional) Charleston Journal of Economics, which I preside over as Editor.  At the beginning of the semester, I pass out the Fall/Spring 20XX Charleston Journal of Economics (CJE) Request for Papers (RFP), which contains the objectives of the journal, the strategic areas, scoring criteria, formatting requirements, and examples of correctly formatted submissions. Throughout the semester, groups are required to submit portions of their project to the Editor and receive feedback (in the form of a letter from the editor). I have required the research project in the Spring of 2015, the Fall of 2015, and the Spring of 2016.  These three iterations have proven valuable as I continually update the project to improve its effectiveness and efficiency in delivery.

Group Formation

In the first iteration (Spring 2015) of the research project, I allowed each student to write his/her own paper and choose any topic as long as it was somehow related to Urban Economics.  While allowing each student the opportunity to write their own research paper provides the best learning opportunity for the student (since he/she receives individualized feedback), it is much harder and more time-consuming for me. I realized that there were three main consequences of allowing students to complete their own projects:
  1. Grading fatigue
  2. Increased time until work is returned to students
  3. Grading research projects detracts from other activities, such as my own research

In the second iteration (Fall 2015), I switched from individual research projects to group-based projects.  I allowed the groups to form endogenously — students selected their own groups.  Each research group was required to have 3-4 individuals.  The main problem that arose from students selecting their own groups was that the groups were not interdisciplinary in nature.  For example, Group A consisted of three Transportation and Logistics majors.  One of the comments Group A received on one of their drafts was that their paper lacked a sufficient economic model.  The feedback I received from Group A was that there was not an economics major (or minor) in the group, and as a result no one was familiar with economic models.

In the second iteration, I also began restricting topic selection by requiring each group’s research question to fall within at least one of the strategic areas of the Charleston Journal of Economics.  The strategic areas are:
  1. Transportation Infrastructure
  2. The Port of Charleston Expansion
  3. Coastal Community Resilience and the Impacts of Sea Level Rise/Climate Change
  4. The Long Savannah Development

In the third iteration (Spring 2016), I attempted to correct for the lack of interdisciplinarity within research groups by assigning the groups myself.  To aid in the assignment of research groups, each student completed an OAKS quiz that asked the following questions:

  1. List the strategic areas in order from greatest interest to least interest
  2. For your top ranked strategic area, list keywords of interest
  3. For your second ranked strategic area, list keywords of interest
  4. List your major(s)
  5. List your minor(s)
  6. List individuals you would like to work with

Students submitted their responses via an OAKS quiz, and I then used their responses to assign groups.  Matches were made based on strategic areas and keywords; however, not all students received their top-ranked strategic area (most did), as I also sought to ensure that each group contained at least one economics major or minor.  This mechanism worked well in solving the interdisciplinary problem previously encountered; however, a new problem arose: group members wanted a greater say about who was in their group, as the “free-riding” problem surfaced in several groups.  The free-riding problem occurs when not all members contribute equally to the project, yet all group members receive the same grade.  Of the 8 research groups in the Spring of 2016, at least 4 registered complaints about one of their group members not contributing.
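The matching described above can be sketched in code. The data layout, the greedy fill strategy, and the “seed one economics major per group” rule are my assumptions based on this post, not the author’s actual procedure:

```python
# Hypothetical sketch of the group-assignment step: seed each group with an
# economics major/minor, then fill remaining seats, preferring students whose
# top-ranked strategic area matches the group's seeded member.

def assign_groups(students, group_size=4):
    """students: list of dicts with 'name', 'ranked_areas', 'is_econ'."""
    econ = [s for s in students if s["is_econ"]]
    others = [s for s in students if not s["is_econ"]]
    num_groups = max(1, len(students) // group_size)
    groups = [[] for _ in range(num_groups)]
    # Seed one economics major or minor into each group first.
    for i, s in enumerate(econ):
        groups[i % num_groups].append(s)
    # Fill remaining seats, matching on the top-ranked strategic area when
    # possible, otherwise falling back to the smallest group.
    for s in others:
        candidates = sorted(groups, key=len)  # smallest groups first
        best = next(
            (g for g in candidates
             if len(g) < group_size and g
             and g[0]["ranked_areas"][0] == s["ranked_areas"][0]),
            candidates[0],
        )
        best.append(s)
    return groups
```

A real assignment would also honor the “individuals you would like to work with” responses; this sketch only illustrates the interdisciplinarity constraint.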

The Free-Rider Problem

I am planning on implementing two strategies to attempt to mitigate the free-riding problem.  First, I plan on introducing a mechanism that will allow students to reveal information about themselves (e.g. work ethic) to other members of the class.  This mechanism is a series of group-based homework problem sets in the first few weeks of class, before the assignment of groups.  Groups for these problem sets will be randomly assigned, which will ensure that students are meeting and learning about other members of the class.  After the problem sets, students will again be asked to complete an OAKS quiz, but this quiz will include additional questions aimed at revealing their preferences for whom they do and do not want to work with.

 

The second strategy is to have students submit peer evaluations of their group members each time assignments are due.  A portion of the peer evaluation is a Grade Multiplier.  Each member of the group assigns every other member a multiplier, which gives each group member some control over every other group member’s grade.  The purpose of the multiplier is to give group members an incentive to work hard towards the completion of the project.  In the Spring of 2016, I required the students to submit peer evaluations at the end of the semester; however, this did not provide strong incentives, since by the time of submission final class grades were almost known.  A student recommended that I conduct the peer evaluations more frequently.
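As a concrete illustration of how a Grade Multiplier might work (the averaging rule and the rounding here are my assumptions; the post does not specify the exact formula):

```python
# Illustrative sketch: each member's score is the shared group grade scaled
# by the mean of the multipliers assigned by his or her teammates.

def adjusted_grade(group_grade, peer_multipliers):
    """peer_multipliers: the multipliers this student received from teammates."""
    mean_multiplier = sum(peer_multipliers) / len(peer_multipliers)
    return round(group_grade * mean_multiplier, 1)

# A student whose three teammates assign 1.0, 0.9, and 0.8 on a 90-point project:
print(adjusted_grade(90, [1.0, 0.9, 0.8]))  # 81.0
```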

 

Peer evaluations are a useful tool that provides students with information on their performance over the course of the research project.  Since the goal of the project is to aid students in developing soft skills, the peer evaluations are particularly effective because they address each student individually.  Herein lies the main problem: each time I require a peer evaluation, I cannot write 20-40 individual letters commenting on performance.  The remainder of this blog post discusses the tools I have developed to create individualized letters based on peer reviews in a (semi-)automatic fashion.  Creating letters in this manner allows me to provide individualized feedback to students without spending hours drafting letters.

 

The letter-creation process requires the following programs/files:
  1. The Form Letter – Microsoft Word Template
  2. Oaks Quiz and Excel File of Modified Data
  3. Microsoft Word Template File
  4. Microsoft Excel Template File
  5. Microsoft Excel Add-in ExcelToWord

The procedure behind the automated process is to have students complete their peer evaluations through an Oaks quiz, text-mine their responses, and populate a form letter with those responses.  Note that this process relies on the students' responses to the peer evaluation but still leaves open the possibility of directly editing the individualized letters.
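
The actual workflow above uses Word, Excel, and the ExcelToWord add-in, but the populate-a-form-letter idea can be illustrated in a few lines of Python with the standard library's string.Template.  The field names, letter wording, and sample rows below are hypothetical, not taken from the real Oaks export.

```python
# Illustrative mail-merge sketch: fill a form letter from peer-
# evaluation responses represented as a list of records.  The
# placeholders ($name, $rating, $comment) are hypothetical.
from string import Template

LETTER = Template(
    "Dear $name,\n\n"
    "Your group members rated your contribution $rating out of 5 "
    "and commented: $comment\n\n"
    "Please keep this feedback in mind for the next project milestone.\n"
)

def make_letters(responses):
    """Return one personalized letter per student record."""
    return [LETTER.substitute(r) for r in responses]

rows = [
    {"name": "Jordan", "rating": "4", "comment": "reliable and well prepared."},
    {"name": "Casey", "rating": "3", "comment": "strong ideas, missed one meeting."},
]
letters = make_letters(rows)
```

In practice the rows would come from the exported quiz responses (e.g., read with csv.DictReader), and each generated letter can still be hand-edited before it is returned to the student.
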

[TLT Note: On his own blog, Chris provides instructions for using OAKS, Microsoft Word, and Microsoft Excel to facilitate the peer evaluations described above.  He also provides templates and examples. To access this information, please visit  https://sites.google.com/site/chrismothorpe/home/group-research-projects]

In this post, I have discussed the research project I conduct in my upper-level economics classes, two of the challenges that have arisen, and the strategies I have employed or will employ to overcome them.  To address group formation problems, I am using an Oaks quiz and group-based homework assignments to give students the opportunity to reveal information about themselves to other students in the class as well as to me.  To address the free-riding problem, I plan to employ a series of peer evaluations, which gives all members of the group some control over the grades of the other group members.

One key to conducting peer evaluations is returning individualized feedback to each student based on their performance.  I have also discussed a set of tools that will enable me to create individualized letters in a timely manner.  Providing timely, individualized feedback also enhances the learning outcomes of the research project, since the project is geared toward students practicing their soft skills.  Receiving individualized feedback allows students to learn from the experience and develop a stronger set of skills that they can employ in the future.

Create Polls and Quizzes with Riddle
Assessment, Innovative Instruction, Web 2.0

Create Fun Polls and Quizzes with Riddle

Riddle is a FREE web-based tool that allows users to create opinion polls, lists, quizzes, and personality tests.  If you’re familiar with Buzzfeed (your students will be), Riddle allows you to create similar quizzes.  It’s a fun and simple formative assessment tool to engage students, gather their opinions, and gauge their understanding.

Cool features of Riddle:

  • Templates to help you create quickly
  • Embed YouTube videos and trim them to only the parts you want
  • Mobile-friendly, so students can use their smartphones
  • Available in 30 languages
  • Option to reveal responses immediately or hide them until you're ready
  • Share via hyperlink or social media, or embed into a website

Ideas for using Riddle:

  • Create a syllabus quiz or a “getting to know you” survey at the beginning of the semester.
  • Have students create lists, such as “Top 10 contributors to global warming,” to help them synthesize content or review for exams.
  • Incorporate a poll during class to gauge students’ comprehension of the material so you can adapt your lecture.
  • Ask students to create polls or quizzes to engage their classmates during presentations or discussion facilitation.
  • Use a quiz at the end of class as an "exit ticket."

Example Riddles:

  • Tufts University created a great quiz called “What Major Are You?”
  • This University of Texas professor created a top ten list of things students should know about her and her section of the university’s freshmen book club.
  • This quiz is about the “Space Race” between the US and the USSR during the Cold War.
Assessment, TLT

Interested in Formative Assessment Tools?

The following article from Edutopia lists five formative assessment tools that you may want to explore: http://goo.gl/ZoCSzo

Two of the tools mentioned, Socrative and Kahoot, are tools that TLT has conducted sessions on in the past; we have also created step-by-step tutorials for them, which can be found at https://blogs.charleston.edu/tlttutorials/

Another tool mentioned, Plickers, is one that TLT has featured at a past FTI and will feature again at our upcoming Teaching, Learning and Technology Conference.  You will have a chance to explore Plickers and hear from an instructor about their experience using it in the classroom by attending the Faculty Discovery Lab and Lunch on 3/9.  You must register for the conference AND for this Lab and Lunch.  Register at: https://goo.gl/4IoQJc

Assessment, instructional technology, Pedagogy, TLT, Web 2.0

Quick Audience Feedback with GoSoapBox

If you’ve ever asked your students “are there any questions?” you’ve likely received blank stares and shrugs.  Assuming this to mean everyone understands the content, you move on.  But what if students aren’t sure how to articulate what they don’t understand?  Or what if they’re too shy or embarrassed to admit they are confused?  Or maybe you’re simply looking for a way to get greater feedback from and interaction with students during lecture?  GoSoapBox could be just what you’re looking for!

GoSoapBox is an audience response system (a “clicker” tool) that works on any device that connects to the Internet.  This tool allows you to survey the class for understanding, quiz them on content, and encourage discussion.  The beauty of GoSoapBox is the simplicity of the user interface despite offering numerous features.  For example, “Social Q&A” allows students to contribute ideas and up-vote the ones they like.  This could be great for an exam review: students submit questions and vote for the ones they really want to spend class time discussing.

GoSoapBox Social Q&A

The “Confusion Barometer” is a super simple way to gauge just how well students really understand the material.  Instructors can see a live graphical display of how many students are confused by the material being covered and can then adjust their teaching strategy as necessary. GoSoapBox also offers quizzing, polling, and discussion board features, which allow for short answer/open-ended responses.

GoSoapBox Confusion Barometer

Students can respond with their names or anonymously, so GoSoapBox can be used to monitor students as well as allow sensitive opinions to be freely shared.  Instructors can even export reports in spreadsheet form to track student performance.

So why explore GoSoapBox instead of Poll Everywhere?  If your class has 30 students or fewer, GoSoapBox is free and provides features that Poll Everywhere doesn't, such as the quick and easy Confusion Barometer and threaded discussions.  Its interface is also simpler and quite user-friendly, while Poll Everywhere can sometimes feel clunky.  However, for courses of more than 30 students, I would not pay for GoSoapBox; our site license for Poll Everywhere provides a great audience response system for free.

Application: http://www.gosoapbox.com/

Platform: Web

Cost: Free for courses of 30 students or fewer

Tutorials: http://help.gosoapbox.com/

GoSoapBox Blog (for updates and tips): http://gosoapbox.com/blog/