Need to regrade a finished quiz?

If you realize after the fact that a quiz/exam question had an error, simply fixing the question only affects future users who take the exam. You should still correct it there, but to regrade attempts that have already been submitted, we need to do something else.

Go to the Exams/Quizzes area.  Find the assessment that had the problem.  Click on the dropdown menu to the right of the name of the assessment.  Click on Grade.

Now click on Questions.

Check the radio button for Update all attempts.  Find the Question that had an issue.

Clicking on the question will give you a breakdown of how it was answered and show what was graded as correct.

Here you can choose whether to give everyone points, or to give points only to the people who answered in the desired manner.

Click Save.

Call for participation with Brightspace

Participate in User Experience Research!

As part of our product development process, we consult with users (like you!) through our user experience (UX) research. In UX research, we take a systematic approach to listening to the perspectives of our end users. This allows us to gain a better understanding of our users and their needs so we can create products that effectively support teaching and learning.

Some of our studies focus on better understanding your challenges and needs to help us plan our projects. Other studies focus on testing user interfaces that we’re designing or have built to help us make our platform intuitive and accessible for all users. We also run surveys to hear from a larger sample of Brightspace users.

We keep a database of Brightspace users that we contact periodically to invite to participate in our UX research sessions. It’s a great opportunity to share your feedback on specific topics or areas of Brightspace. We’d love for you to be a part of it.

Who can participate?

We’re looking for people who use an LMS:

  • Instructors, teachers, faculty
  • Instructional designers and online learning specialists
  • Students in higher education
  • Learning and development specialists in corporate organizations
  • Individual contributors and managers or people leaders
  • Academic administrators, principals, and advisors
  • LMS administrators and instructional technologists

We especially encourage people with disabilities to sign up.

Please note: We typically conduct our UX research sessions in English. If there is another language you would feel more comfortable in during our session together, you can request that language when you sign up for a session and we will do our best to accommodate wherever possible.

What does participation look like?

When you sign up for our Participant Database, we’ll contact you periodically to see if you’d like to participate in a study. Each invite will have a description of the study and either a link to a calendar where you can find a timeslot that works for you, or a link to a survey. Typically, in a UX research session, you will join us via Zoom for an hour. We will ask you some questions about your work and about the research topic, and we may ask you to try out a few tasks in a prototype and provide your feedback. We offer a reward for your participation as a token of our appreciation.

How do I sign up?

We ask for your contact information and a few details about your work to match you with appropriate studies. It only takes a couple of minutes to complete the sign-up form.

Roadmap for VoiceThread. Some nice new features on the horizon:

[Image: a folded map with the VoiceThread logo superimposed]

The arc of change in the teaching and learning universe may be long, but it bends towards a more accessible, equitable, inclusive, and human-centric world. We plan on shortening that arc just a little bit in the coming year by introducing the biggest updates to VoiceThread in more than a decade.

A “New VoiceThread” will be available in the coming months, and we’ll have lots of details to share, but what we really want to share now is our redesign philosophy:

  • Be significantly more accessible

  • Add a number of pro features while being simpler to use

  • Be designed based on research and user feedback

  • Make it easier for people to have authentic, high-quality, human-to-human interaction

  • Give everyone time to try it out before fully transitioning to the new version

In addition to the new VoiceThread, we have lots of new goodies on the way:

  • New integrations for Zoom, Google, and Microsoft

  • New assignment type allowing students to comment on classmates’ submissions for a formal grade

  • More powerful closed caption preferences

  • Lots of new mobile updates

Thank you for all your feedback, collaboration, and enthusiasm throughout 2022. None of this work would be possible without you.

– The VoiceThread Team

“I can tell it is GPT because my students don’t write that well”

Farm animols is so fun to visit! They live on farms with lots of other animols like cawos, pigs, and chikens. Farmers takes care of them and give them food and water every day.

Cawos are really big and have spotts or are all one color. They make milk that we drink and cheese and ice cream too! Pigs are pink and have curly tails. They likes to roll around in the mud and eat corn. Chikens are really noisy and lay eggs that we eat for breakfast. They also likes to peck at the ground to find worms.

Farm animols are important because they give us food to eat and things like wool for are clothes. They also likes to play and run around, just like we do! We can learn a lot from farm animols and it’s fun to go see them and pet them.

Turns out you can ask it to write something at a particular grade level and with age-appropriate spelling and grammatical mistakes.


Using PeerMark in Brightspace

From the Turnitin website:

“PeerMark is a peer review assignment tool. Instructors can create and manage PeerMark assignments that allow students to read, review, and evaluate one or many papers submitted by their classmates. With the advanced options in PeerMark instructors can choose whether the reviews are anonymous or attributed.

The basic stages of the peer review process:

  • Instructor creates a Turnitin paper assignment.
  • Instructor creates a PeerMark assignment and sets the number of papers students will be required to review, and creates free response and scale questions for students to respond to while reviewing papers.
  • Student papers are submitted to the Turnitin assignment.
  • On the PeerMark assignment start date, students begin writing peer reviews.
  • For each assigned paper, students write reviews by responding to the free response and scale questions.
  • Students receive reviews when other students complete them.
  • Once the PeerMark assignment due date passes, no more reviews can be written, completed, or edited by the writer”


To use PeerMark in the SBU instance of Brightspace:

Go to the Content area where you want your PeerMark assignment to be located.  For example, a module labeled “Week 4” or “PeerMark Assignments”.

Go to the dropdown menu, “Existing Activities” -> PeerMark.

This pops up an Add Activity window where you can give your assignment a title, instructions, max grade, start date/time, due date/time, release date/time, and, the pretty important part, check Enable PeerMark. You will also want to expand the Optional Settings area to set specific Turnitin settings. (PeerMark is owned by Turnitin, and they are bonded together in the LMS.)

Hit Submit.

Click on the assignment link you just made.

There is a PeerMark dropdown menu.  Use that to get to PeerMark Setup.

Now you get to another setup area that looks too much like the last assignment setup area we were just in.  I’m not sure why these aren’t all presented at once, but anyway…

You will see the title and instructions just as you wrote them, but the Maximum points available is reset to empty. This is because the earlier grade is how many points the instructor will award for grading the assignment itself, while this max points is what the peers will be given for completing the review.

The dates look like the ones you already set – but they are not. These are Students can review from date/time, Students can review until date/time, and Peer feedback available date/time.

There is an additional options area here as well, which has things like Award maximum points on review (if you have three questions for them to answer during the review and 100 points total, they will get 1/3 of 100 for each question they answer while reviewing), plus settings regarding anonymity, who can review, how many papers each student will review, and whether they can review their own works.
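The "Award maximum points on review" arithmetic above can be sketched in a few lines. This is just an illustration of the math described, not PeerMark's actual code; the function name is made up.

```python
# Sketch of the "Award maximum points on review" arithmetic: each answered
# review question earns an equal share of the maximum points.
def review_points(max_points, num_questions, questions_answered):
    """Points a reviewer earns for answering some of the review questions."""
    per_question = max_points / num_questions
    return per_question * questions_answered

# Three questions worth 100 points total: each answered question earns 1/3 of 100.
print(review_points(100, 3, 1))
print(review_points(100, 3, 3))
```

So a student who answers only one of three questions on a 100-point review earns roughly 33 points.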

Save and Continue.

Now click on the PeerMark Questions tab.

You will set up the specific questions that you want the peer reviewers to work from. You can make up your own questions, add questions from a library of premade samples, make your own library, and delete your own libraries.

When you make your own question, you can choose from a free response type or a scale type.

When creating a Free Response type, you also indicate the minimum answer length.

The Scale type has a size (2-5), and the Lowest and Highest fields are the text prompts for what the minimum and maximum values represent – for example, Did Not Meet Expectations / Perfect!
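The two question types can be sketched as simple data shapes. The class and field names below are illustrative assumptions, not PeerMark's actual data model:

```python
# Illustrative sketch of the two PeerMark question types described above.
from dataclasses import dataclass

@dataclass
class FreeResponseQuestion:
    prompt: str
    min_answer_length: int  # minimum answer length the reviewer must write

@dataclass
class ScaleQuestion:
    prompt: str
    size: int      # number of scale points, 2-5
    lowest: str    # text prompt for the minimum value
    highest: str   # text prompt for the maximum value

    def __post_init__(self):
        # PeerMark scales run from 2 to 5 points.
        if not 2 <= self.size <= 5:
            raise ValueError("scale size must be between 2 and 5")

q = ScaleQuestion("How well does the paper meet the assignment goals?",
                  size=5,
                  lowest="Did Not Meet Expectations",
                  highest="Perfect!")
print(q.size)
```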

This will leave you with a list of the questions you are using, and you can change the order they are in by clicking Reorder Questions, dragging them around and then clicking Save Order.

Features (?):

Assignments created in this manner will not show up under the Assignments area of Brightspace.

The Turnitin/Peermark assignments will show up in the Calendar.

Grades for finishing the main assignment will transfer to Grades.

The PeerMark grades (what students get for completing a review) will not automatically show up in Grades. Instructors will need to add those manually via copy and paste.


Here is a PDF with another school’s directions (PeerMark_manual) on PeerMark, which includes at the end some information about how the students actually do the PeerMark review. I think it is pretty straightforward, but in case you need it, it is there.


Cactus Chairs…

Me: where did you come up with cactus chairs?

GPT: I apologize, that answer was a mistake on my part. “Cactus chairs” is not a commonly used type of alternative seating for individuals with Autism Spectrum Disorder (ASD). Thank you for bringing this to my attention.

Me: I just want to know where cactus chairs came from. Did you make that up or read that somewhere?

GPT: I apologize, the answer “cactus chairs” was not based on any research or real-world information. It was a mistake and not a valid option for alternative seating in a classroom. As a language model, I generate responses based on patterns I’ve seen in the data I was trained on, but I can still make errors or mistakes. Thank you for bringing this to my attention.

Regarding citations and ChatGPT

The guidance from several academic journal editors has been to make a statement of one’s use of any AI chat/dialogue tools (like ChatGPT) in the acknowledgment section. By virtue of being nothing more than a large language model, it can’t assume responsibility the way a human author would.

My recommendation would be to utilize an extension like ChatGPT Phantom, which writes various output with citations, though not always academic citations.

You can also use Longshot.ai to create writing output that will include about 10-15 citations from mainstream media articles and some open source journals.

Students would do best to copy and paste subjects and phrases from AI writing output into a tool like Elicit.org to gain access to various open-source academic articles related to the topic. This way, they can validate the output against research studies and cite their sources accordingly.

— Jim Brauer, GPTZero Educators (group on Facebook)

Shhh! (instructors can use ChatGPT too – don’t tell anyone – top secret)

Me: write a multiple choice question about glaciation

GPT:  What is the process of large ice sheets moving across the land and shaping the Earth’s surface through erosion and deposition? A) Volcanism B) Tectonic activity C) Glaciation D) Weathering and erosion.

Me: How could we use Bloom’s taxonomy to improve that question?

GPT: To use Bloom’s Taxonomy to improve the question, it could be rephrased to encourage higher order thinking skills. For example:

“How does the process of glaciation shape the Earth’s surface, and what is the evidence for this process? Analyze and evaluate the impact of glaciation on various landforms.”

This revised question now requires the examination and evaluation of the concept of glaciation, rather than simply recalling information.
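For instructors comfortable with a little scripting, prompts like these can be built and sent programmatically. The sketch below only constructs the prompt; the helper name is made up, and the commented-out API call is an assumption about one possible setup (it requires the openai package and an API key):

```python
# Build a chat-style prompt asking for a multiple-choice question,
# optionally targeting a level of Bloom's taxonomy.
def build_quiz_prompt(topic, blooms_level=None):
    request = f"Write a multiple choice question about {topic}."
    if blooms_level:
        request += f" Target the '{blooms_level}' level of Bloom's taxonomy."
    return [{"role": "user", "content": request}]

messages = build_quiz_prompt("glaciation", blooms_level="analyze")
print(messages[0]["content"])

# To actually send this (one possible setup, not the only one):
# from openai import OpenAI
# client = OpenAI()
# reply = client.chat.completions.create(model="gpt-4o-mini", messages=messages)
```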

ChatGPT: more styles

Not All Who Wander are Lost.
  1. Formal: It is not accurate to state that every individual who deviates from a set path is without direction.
  2. Informal: Just because someone’s not on a straight path, doesn’t mean they’re lost.
  3. Sarcastic: Oh sure, because clearly every single person aimlessly wandering has a definite destination in mind.
  4. Inspirational: Those who dare to stray from the beaten path often discover the most remarkable journeys.
  5. Playful: Wandering may seem lost to some, but to adventurers, it’s just another exciting detour.
  6. Legal: “It shall be noted that the assertion that every individual who deviates from a predetermined course is devoid of direction is not necessarily accurate.”