Something to help your students (and that uncle) dissect dodgy online information.

I found this article on Geckoboard about Data Fallacies, courtesy of a LinkedIn post from Jonathan Boymal at RMIT.

When it comes to false and incorrect information online, you can’t judge a book by its cover. Even if students track a YouTube video or Facebook post back to its source, they can find themselves on a credible-looking site with links to multiple seemingly valid research reports.

It’s only when you step outside the provided information funnel and look at other reputable sources that you see the problem. This is becoming more and more important as businesses and individuals realise there is money to be made from outrage and misinformation. We can all think of events over the last couple of years that wouldn’t have happened if the people involved had better analytical skills.

For example, publication bias may be contributing to some of the ivermectin misinformation.

“For every study that shows statistically significant results, there may have been many similar tests that were inconclusive. However, significant results are more interesting to read about and are therefore more likely to get published. Not knowing how many ‘boring’ studies were filed away impacts our ability to judge the validity of the results we read about. When a company claims a certain activity had a major positive impact on growth, other companies may have tried the same thing without success, so they don’t talk about it”.
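
To make that mechanism concrete, here’s a minimal simulation sketch (mine, not from the article): we run 1,000 studies of a treatment that genuinely does nothing, then “publish” only the impressively positive ones. The threshold and sample sizes are arbitrary illustrative choices.

```python
import random

def run_study(n=50, effect=0.0):
    """Simulate one study comparing a treatment group to a control group.

    `effect` is the true lift in success probability; 0.0 means the
    treatment genuinely does nothing.
    Returns the observed difference in success rates.
    """
    control = sum(random.random() < 0.5 for _ in range(n)) / n
    treatment = sum(random.random() < 0.5 + effect for _ in range(n)) / n
    return treatment - control

random.seed(42)
results = [run_study() for _ in range(1000)]

# Crude "publication filter": only impressively positive results get
# written up. (Real journals filter on p-values; the effect is similar.)
published = [r for r in results if r > 0.15]

print(f"All {len(results)} studies, mean observed effect: {sum(results)/len(results):+.3f}")
print(f"The {len(published)} 'published' studies, mean:   {sum(published)/len(published):+.3f}")
```

The full set of studies averages out near zero, but the “published” subset reports a healthy positive effect, which is exactly the trap the quote describes: you only ever see the filtered list.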

In case you haven’t been keeping up, here’s why Facebook is now called Meta.

Facebook is not a platform that I recommend school teachers use as part of their teaching. Its track record around privacy in particular has been consistently poor. For commercial teachers of adults who use Facebook groups it’s probably OK, as long as you include information on privacy settings.

But why Meta? According to Zuckerberg, the rebrand is about repositioning Facebook around the metaverse (think Second Life, VR, virtual 3D spaces), which he sees as the future of the internet.

It’s also fairly convenient timing given the fall-out from the Facebook Papers, which (according to the Washington Post) are a trove of internal Facebook documents revealing that “the social media giant has privately and meticulously tracked real-world harms exacerbated by its platforms, ignored warnings from its employees about the risks of their design decisions and exposed vulnerable communities around the world to a cocktail of dangerous content.”

If you need more, here’s another article:

  • Facebook whistleblower reveals identity, says company ‘chooses profits over safety’ – “Facebook has publicized its work to combat misinformation and violent extremism relating to the 2020 election and insurrection,” she wrote in a cover letter on the subject. “In reality, Facebook knew its algorithms and platforms promoted this type of harmful content, and it failed to deploy internally recommended or lasting countermeasures.”

So, yes, it’s about transitioning to a Facebook version of Second Life, but it’s clearly also about distancing itself from its own toxic brand.

This should have been at the Climate change conference – an oil rig amusement park

According to CNN Travel, Saudi Arabia is planning to build the “world’s first tourism destination inspired by offshore oil platforms.”

What does that mean? Well, three hotels and 11 restaurants, interconnected platforms, roller coaster rides, bungee jumping and skydiving.

OK, this hasn’t been built yet, but if it works we could have one of these in Bass Strait as a midway destination on the way to Tasmania.

More personalised assessment during Covid?

This article, Where There’s a Willing Educator, There’s a Way on EdSurge, is framed as a response to the pandemic, but really it’s just about effective personalised assessment (as opposed to efficient volume processing).

“Most have found that traditional modes of monitoring progress and improving academic performance do not suffice in this unique crisis. Students have been experiencing challenges with access to instructional content like never before. As such, the current educational climate requires unique considerations for assessing progress.”

I don’t think the experiences of the three leading teachers in the article reflect most teachers’ experiences, which were more about making do and struggling with the imposed limitations. It also doesn’t comment on the major limiting factors associated with these types of assessment: logistics and workload.

Having said that, principles like ownership, personal relevance and self-reflection are elements we should all be considering when designing assessment.

Multiple choice questions – is 3 the right number of options?

According to this article from Ben Butina on the eLearning Industry site, “How Many Options Should A Multiple-Choice Question Have?”, the answer is three.

Coming from a site targeting industry trainers, it’s tempting to write this off as a staff time-saving measure for mandatory online courses, but a quick check of the attached links shows it’s pretty legit.

I would like to know if the approach to the questions changes the result. The standard format usually starts with one obviously wrong option and one obviously right one. But what about the other options: are they also clearly wrong, or do they tap into common misconceptions and mistakes? Does that have an impact?
I also wonder if it applies to one of my favourite MCQ approaches: every answer is wrong and the student has to pick the least damaging option (although, to be fair, I mainly use that in branching scenarios rather than simple MCQs).

But based on this, I’ll probably shift to using three options most of the time. If I’m dealing with a subject that has numerous areas of common misunderstanding, I’ll use more.

Here’s a quote straight from the article:

“Three-option questions are just as reliable and valid as four- or five-option questions. (In fact, some evidence suggests that three-option questions are more reliable and valid than four- or five-option questions.)

Three-option questions do not reduce the difficulty of a test compared to four- or five-option questions.

Scores for tests made up of three-option questions do not differ significantly from tests made up of four- or five-option questions.

Students can complete three-option questions more quickly than four- or five-option questions.”
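
One back-of-the-envelope check worth doing yourself (this is mine, not the article’s): fewer options does raise the expected payoff from blind guessing, so the “difficulty doesn’t change” finding presumably rests on real students rarely guessing completely at random.

```python
# Expected score on a 20-question test if a student guesses every
# question completely at random, for different option counts.
QUESTIONS = 20

for options in (3, 4, 5):
    expected = QUESTIONS / options  # each guess has a 1-in-`options` chance
    print(f"{options} options: {expected:.1f}/{QUESTIONS} correct "
          f"({100 / options:.0f}%) from blind guessing")
```

Blind guessing pays 33% with three options versus 25% with four, so if test scores don’t differ in practice, it’s likely because students eliminate the implausible distractors first, making a weak fourth option effectively dead weight.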

Weekend Funny – Absurd historical trends

This video, Absurd historical trends, is an amusing distraction.

On a technical note, the creator has included a blue jay with a top hat and pipe as a kind of narrator, which (to my eye) is a bit distracting and doesn’t really add to the effectiveness of the video. What do you think: a bit of fun and movement, or a bit distracting?
As an exercise, look at the structure and narration and make a note of where you think it could be trimmed.

Most Learners Have Found They Like Online Learning? But what does that mean?

“more than half of learners (51 percent) said they now hold more positive views of online learning too. And if they had to do it all over again, a whopping 79 percent said they “definitely” or “probably” would take part in online learning.”
https://campustechnology.com/articles/2021/10/07/report-most-learners-have-found-they-like-online-learning.aspx

This article is worth a read. It’s important because it shows a shift in attitude and an acceptance of remote courses with a high level of digital involvement, but there are a few things to keep in mind.

First – This was conducted by Wiley, a provider of online learning, so they have skin in the game.
Second – It’s higher education and, importantly, two-thirds of respondents were graduate students (who we know from our own experience are generally much more motivated).
Third – What are we talking about when we say “online learning”? An email-only course is, by definition, an online course.

  • How skilled were the teachers?
  • How involved were the teachers?
  • What digital tools and modalities were used? How were they used?
  • How much was synchronous and how much asynchronous?
  • How well were the students supported (e.g. hardware, software, help desk, study skills, social support)?
  • How much of their curriculum content was provided digitally and in what forms?
  • How affluent were most of the students?
  • And many more questions.

Fourth – Liking isn’t learning. How were they assessed? Was it comparable? What were the results?
