Learning strategies

Student-Curated Video Collection: An Activity

AEG/Telefunken television from 1937. This was newfangled back when I started screening videos for this course. Eckhard Etzold, CC BY-SA 2.0

I’ve been working on revisions to a distance-education physical geology course, and attempting to make it more interactive by offering videos. Have you ever tried to source relevant and accurate videos for multiple topics across multiple course modules? It involves going through hours and hours of videos, and rarely finding one that is directly on point or without problematic inaccuracies. My search technique has evolved to skipping anything longer than 5 minutes that doesn’t come with a transcript or clear description, and then screening the video at 1.5x speed.

So what to do about getting reliable videos without spending most of your adult life in the attempt… well, one school of thought would say let the students do it. I experimented with this kind of activity a few years back, but didn’t have an opportunity to deploy it full-scale. Here are the instructions I provided, with annotations. If you try it, let me know how it goes!

Curating Videos for Historical Geology

In this assignment you will assemble a collection of videos and complementary resources for historical geology students. You will work from the TED Ed* Lessons Worth Sharing video collection, Awesome Nature. This collection can be found at http://ed.ted.com/series/awesome-nature.

*I chose TED Ed because the videos are short. The student who did this moved on to TED Talks, which are longer. I’d advise limiting video length if you don’t want to spend hours watching in order to grade the results. If I were doing this today, I’d also recommend the fabulous video collection at MinuteEarth.

Your work will form the basis of a collection of resources to be made available to future students in Geology 109. If you wish, you will be acknowledged as the curator of the resources when they are posted, although I reserve the right to make any modifications that might be necessary to optimize the effectiveness of the collection.

Rationale

In the Independent Studies version of Geology 109, students do not have access to video lectures. Sometimes the textbook is unclear or written in too technical a fashion for students new to the topic to immediately understand what is being said. Videos designed by someone with a different perspective on the topic can be very helpful for reinforcing concepts, or clarifying points of confusion.

The problem is that not all videos are created equal. Some have factual errors, or even seek to mislead viewers. Some could benefit from clarifications. The task of looking for and vetting videos requires an understanding of the objectives a video should satisfy, and an assessment of how well the video accomplishes those goals. It also requires that viewers understand why they are watching the video and what they should get out of it. When an instructor looks for videos, he or she has an idea of what students find difficult, but it is really the students themselves who can most accurately identify where they need help, and what helps the most.

Your task

  1. Identify a video that satisfies one or more of the learning objectives for Geology 109. Provide the name of the video, and the link.
  2. Write an overview of the video. This should not simply restate the title of the video, but should summarize its contents in three or four sentences.
  3. List the learning objectives from the Geology 109 Course Guide that the video covers, and indicate which chapter they are from.
  4. Identify three key questions that the video answers. The questions should not be a restatement of the learning objectives, and should make it clear to other students why they would find the video useful. The questions will take the following form:
    1. Have you ever wondered …?
    2. Would you like to know how [something works or happens/ happened]?
    3. Have you ever been confused by …?
  5. Identify five terms that are technical in nature, and that are key to understanding the topic of the video. Define those terms in simple language, using your own words.
  6. Identify three “loose ends,” and explain the loose ends so that others watching the video will not be confused by them. The “loose ends” could be:
    1. Points that could be expanded upon
    2. Points that might leave some confusion in the minds of students watching the video
    3. Factual errors (hopefully there won’t be any of those)
    4. Points that are inconsistent with something in the course materials (e.g., competing hypotheses, more recent information, etc.)
  7. Write ten multiple choice questions so students can test their knowledge after watching the video. Supply the correct answers. The questions should cover key points. A good set of multiple choice questions will have the following characteristics:
    1. Four answer options (a through d)
    2. Little to no use of answer options like “all of the above” or “none of the above.”
    3. It should not be obvious to someone with no prior knowledge of the topic which is the correct answer. (Over-simplified questions are not helpful when trying to understand a topic.)
    4. Questions should be relevant to the topic of the video and to the learning objectives.
    5. After doing the questions, it should be clear to students what key points they have not understood.

Deliverables

You will write up each video following the layout supplied at the end of this document. This layout is designed to be compatible with the Blackboard system. The specific software you use to create the write-up is not important, nor is the font. (Blackboard has some formatting limitations, and formatting must be done within the Blackboard text editor, so this is something I will have to do afterward.)

Grading

Each write-up is worth up to 10 points. Those points will be calculated as follows:

  • Is the video relevant to Geology 109, and is the relevance clearly explained? (2.5 points)
  • Are all of the elements in points 1 through 7 above provided (e.g., the learning objectives, multiple choice questions, etc., are present)? (2.5 points)
  • Is the write-up scientifically accurate (e.g., definitions are correct, multiple choice answers are correct, etc.)? (5 points)

You may curate as many videos as you like*; however, the maximum possible score for the assignment portion of the class will be 100%.

*This assignment was designed for a specific student. You may wish to rethink the “as many as you like” policy, or turn it into a group project to reduce the workload.

Format for submission

Square brackets indicate text that you will insert. Text in italics consists of my notes, and doesn’t need to be included in your write-up.

[Video title]

[url]

 

Summary

[Three to four sentence summary of the video topic]

 

Why watch this video?

  • Have you ever wondered […]?
  • Would you like to know how [something works or happens/ happened]?
  • Have you ever been confused by […]?

 

This video addresses the following learning objectives for Geology 109:

  • [Learning objective], Chapter [chapter number]
  • [Learning objective], Chapter [chapter number]
  • [as many additional points as necessary]

 

Some key terms used in this video are:

[term 1]: [definition]

[term 2]: [definition]

[term 3]: [definition]

[term 4]: [definition]

[term 5]: [definition]

 

Special notes

  • [Loose end 1, explanation]
  • [Loose end 2, explanation]
  • [Loose end 3, explanation]

 

Note: these could take the form of, “In the video, [topic] is mentioned, but [concept] isn’t explained. Here is what it means,” or “The video says [this] about [topic], but in the textbook it says [that]. The difference is [reason].”

 

Self-test

[Questions 1 through 10]

 

[Solutions (e.g., 1a, 2b, 3d, …)]

 

Deadline

All write-ups must be submitted on or before Monday, March 30th 2015.

 


A Guide to Arguing Against Man-Made Climate Change

If you must, then at least do it properly…

The debate about climate change ranges from people arguing that it isn’t happening at all, to those who argue that it is happening, but is entirely natural. The debate can become quite nasty, and part of the reason for this is not that people disagree, but that they disagree without following the rules of scientific discourse. I’m guessing in many cases this is accidental- a kind of cultural unawareness. It’s like making an otherwise innocuous hand gesture while on vacation in a foreign country, only to learn later that it was the rudest possible thing you could have done.

I’ve been annoyed by poor-quality discourse on this topic for some time, and written a few draft blog posts about it, but I’ll defer to the INTJ Teacher for a summary of the key issue (and the main reason I no longer read comment sections after news stories about climate change).

[Image: critical thinking]

So now that you know the problem in general terms, let’s talk specifics.

Dismissing the data

First of all, if you’re going to make claims that the data about climate change are problematic in some way, then you should know that there is no one data set. There are thousands of data sets worked on by thousands of people.

Some people seem to think that the whole matter rests on the “hockey stick” diagram of Michael Mann, Raymond Bradley, and Malcolm Hughes published in 1999. (You can download the paper as a pdf here.)

[Figure: annotated hockey-stick diagram]

Briefly, this was an exercise in solving two kinds of problems: (1) taking temperature information from a variety of sources (e.g., tree rings) and turning it into something that could reasonably be plotted on the same diagram, and (2) figuring out how to take temperature measurements from all over the world and combine them into something representative of climate as a whole. The main reason it became controversial was that it showed a clear increase in temperature since 1850, and that result was not optimal for a certain subset of individuals with a disproportionate amount of political clout. There is a nice description of the debate about the diagram here, including arguments and counter-arguments, along with the relevant citations.

Those arguments are moot at this point, because the PAGES 2k consortium has compiled an enormous amount of data and done the whole project over again, getting essentially the same result (the green line in the figure above).  I can’t help but think that this was an in-your-face moment for Mann et al. (“In your face, Senator Inhofe!  In your face, Rep. Barton!  How d’ya like them proxies?!”)

Despite these results, if you still want to argue that the data are bad, you will need to do the following:

  • Specify which data set you are referring to. Usually this takes the form of a citation to the journal article where the data were first published.
  • Specify what is wrong with it. Was the equipment malfunctioning? Was the wrong thing being measured? Was there something in particular wrong with the analysis?
  • Assuming you are correct about that particular data set, explain why problems with that one data set can be used to dismiss conclusions from all of the other data sets. This will mean familiarizing yourself with the other data and the relevant arguments (although if you are arguing against them you would presumably have done this already).

Things that are not acceptable:

  • Attacks against the researchers. It is irrelevant whether the researchers are jerks, or whether you think they’ve been paid off. What matters are the data. If you can’t supply the necessary information, you have only conjecture.
  • Backing up your argument with someone else’s expert opinion (usually in the form of a url) if that opinion does not cover the points in the first list. It is discourteous to expect the person you are arguing with to hunt down the data backing someone else’s opinion in order to piece together your argument.
  • Arguing from the assumption that man-made climate change isn’t happening. If that’s your starting point, your arguments will tend to involve dismissing data not because there are concrete reasons to do so, but because based on your assumption, they can’t be true. This may be personally satisfying, and ring true to you, but it lacks intellectual integrity. If your argument is any good, that assumption won’t be necessary.

Climate models and uncertainty

It is a common misconception that uncertainty in the context of climate models means “we just don’t know.” Uncertainty is an actual number or envelope of values that everyone is expected to report. It describes the range of possibilities around a particular most likely outcome, and it can be very large or very small.

If you plan to dismiss model results on the basis of uncertainty, you will need to demonstrate that the uncertainty is too large to make the model useful. In cases where the envelope of uncertainty is greater than short-term variations, it may still be the case that long-term changes are much larger than the uncertainty.

Another misconception is that climate models are designed to show climate change in the same way that a baking soda and vinegar volcano is designed to demonstrate what a volcano is. Climate models take what we know of the physics and chemistry of the atmosphere, and add in information like how the winds blow and how the sun heats the Earth. Then we dump in a bunch of CO2 (mathematically speaking) and see what happens. In other words, models specify mechanisms, not outcomes. They are actually the reverse of the baking soda and vinegar volcano.

The mathematical equations in a model must often be solved by approximation techniques (which are not at all ad hoc, despite how that sounds), and simplified in some ways so computers can actually complete the calculations in a reasonable timeframe. However, I would argue that they are the most transparent way possible to discuss how the climate might change. They involve putting all our cards on the table and showing our best possible understanding of what’s going on, because it’s got to be in writing (i.e., computer code).
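
To make the “mechanisms, not outcomes” point concrete, here is a toy zero-dimensional energy-balance model in Python. To be clear, this is my own back-of-the-envelope illustration and bears no resemblance to the models used in climate research; the parameter values are rough, commonly quoted numbers, and the CO2 term uses the standard simplified 5.35 ln(C/C0) expression.

```python
import math

SOLAR = 1361.0         # incoming sunlight at the top of the atmosphere, W/m^2
ALBEDO = 0.30          # fraction of sunlight reflected back to space
SIGMA = 5.67e-8        # Stefan-Boltzmann constant, W/(m^2 K^4)
EMISSIVITY = 0.61      # effective emissivity; crudely stands in for the natural greenhouse effect
HEAT_CAPACITY = 4.0e8  # rough effective heat capacity of the surface, J/(m^2 K)

def co2_forcing(co2_ppm, baseline_ppm=280.0):
    """Extra heating from CO2 above a pre-industrial baseline, in W/m^2,
    using the standard simplified expression 5.35 * ln(C / C0)."""
    return 5.35 * math.log(co2_ppm / baseline_ppm)

def surface_temperature(co2_ppm, years=200, dt=86400.0):
    """Step the global-average surface temperature forward in time (kelvin)."""
    temperature = 288.0  # starting guess
    for _ in range(int(years * 365 * 86400 / dt)):
        energy_in = SOLAR * (1 - ALBEDO) / 4.0 + co2_forcing(co2_ppm)
        energy_out = EMISSIVITY * SIGMA * temperature ** 4
        temperature += (energy_in - energy_out) * dt / HEAT_CAPACITY
    return temperature

# The mechanism stays the same between runs; only the CO2 input changes.
print(surface_temperature(280.0))  # roughly pre-industrial CO2
print(surface_temperature(560.0))  # doubled CO2: the model tells us the outcome
```

Change the CO2 input and the physics tells you what the temperature does; nothing in the code says what the answer has to be.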

The models aren’t top secret. If you really want to know what’s in them, someone will be able to point you to the code. If the someone is very accommodating (and they often are if you’re not being belligerent or simply trying to waste their time) they might explain some of it to you. But whether or not they do that effectively is irrelevant, because if you’re going to make claims about the models, it’s your obligation to make sure you know what you’re talking about.

If climate changes naturally, then none of the present change is man-made

This is a false dichotomy. No-one is arguing that nature isn’t involved in the usual ways. What they are saying is that the usual ways don’t do all of what we’re seeing now.

A simple way to think about it is as a shape-matching exercise. We would expect that if some trigger in nature is causing the climate to change, then a graph of the temperature change should resemble that of the triggering mechanism. The IPCC has done a nice job of making this comparison easy. In the image below I’ve marked up one of their figures from the Fifth Assessment Report in the way I usually do when I’m researching something. Panel a shows the temperature record (in black), and the panels below it show the changes in temperature attributable to different causes. In the upper right I’ve taken panels b through e and squashed them until they are on the same scale as panel a.

[Figure: marked-up comparison from the IPCC Fifth Assessment Report]

A common argument against man-made climate change is to say the sunspot cycles are to blame. You can see the temperature variations that result from these cycles in panel b, and again at the top right. While there are small scale fluctuations in a, it is quite evident that the shape of the effects of sunspot cycles cannot account for the shape of the temperature record, either in terms of having an upward trend, or in terms of the timescale of the temperature change in a. Even if you added in volcanoes (panel c), and the El Niño/ La Niña cycles (panel d), you couldn’t make the trend that appears in a.

The only graph with a similar shape is the one that shows the temperature variations we would expect from adding CO2 and aerosols at the rate humans have been doing it (panel e). The red line in panel a is what you get if you add together b through e. It doesn’t have as much variation as the black line, meaning there are still other things at play, but it does capture the over-all trends.

You needn’t rely on someone else’s complex mathematical analysis to do this. This is something you can do with your own eyeballs and commonsense-o-meter. You may still be inclined to argue that all of these graphs are made up out of thin air, but if you have a look at the many different studies involved (you can do this by reading the chapter in the IPCC report and looking at the citations), you should realize that it’s a pretty lame argument to dismiss all of them out of hand.

But if you are undeterred by said lameness, at that point anyone interested in a serious conversation is going to decide that it isn’t worth their time debating with you, because you’ve already decided that any evidence contrary to your point of view must be wrong. Nothing they can tell you or show you will make a difference, ergo the conversation is pointless. You will appear to be impervious to reason which, incidentally, will be assumed to be the case for your opinions on other matters as well, whether that impression is deserved or not. (“It’s not worth arguing with Jim… if he has an idea in his head, he won’t change his mind no matter what you tell him. He would stand under a blue sky and tell you it’s pink.”)

Scientists are paid off to say climate change is man-made

This argument is quite irrelevant given that the data are what matter, but I think part of this argument might be related to another misconception, so I’m going to address it anyway. It is true that there are millions of dollars spent on climate research grants, but this isn’t pocket money for scientists. To get a grant researchers must justify the amount of money they are asking for in terms of things like lab expenses, necessary travel, and the like. Often their salaries don’t even come into the picture because they are paid by employers, not grants. It is more likely they will be paying grad students and post docs than themselves. When they do apply for funding that will cover their own salaries, that salary must be justifiable in the context of what others in similar positions get paid. In many cases this is a matter of public record, so you can go look up the numbers for yourself.

Most research being done on climate change is funded by government grants. A very few scientists have funding from private donors (though there isn’t nearly as much money as for petroleum-related research), but there is a big check on what influence those donors can have. Research must still go through review to be published. Even if biased research did make it through review, scientists on grants are highly incentivized to pick it apart because that can be an argument for additional grants to further their own research. Getting a grant is a matter of professional survival, so competition for research grants is intense.

In conclusion

There is only one way to make arguments against man-made climate change, and that is to address data and conclusions honestly and appropriately. It may feel good to add your two cents, but if your comments amount to ad hominem attacks or generalizations so broad as to be silly, you shouldn’t expect a good response. You’ve just made the equivalent of a very rude hand gesture to people who value thoughtful and well-informed discourse.

This all seems obvious to me, and I’ve struggled to understand people who argue in a way that I can only describe as dishonest.  But maybe psychology is a factor.  The climate-change deniers need only suggest that scientists are making things up. People don’t want to feel that they’ve been fooled, and most don’t have the background to easily check such claims, so it feels much safer to settle into uninformed skepticism.


Help for students, part 4: Exam panic

Exam panic is a tricky problem, because once you experience it, it can make you worry about panicking in the future.  Once you are anxious about panicking, that makes it all the more likely. Fortunately there is a way to fix this. The solution is, go ahead and panic… sort of.

[Cartoon: the problem is that you see an exam, but your brain sees certain death.]

Your brain is an amazing bit of biology that has evolved over millions of years to serve the needs of our ancestors. Unfortunately, somewhere during that evolutionary process it became a toddler-like entity which, regardless of your good intentions, is willful, easily bored, and prone to inconvenient emotional outbursts. It learned a few good tricks that were suitable for helping our ancestors to escape from predators and each other, but since then it has stubbornly refused to acknowledge that those same tricks can be counterproductive when dealing with anxiety over situations that are not likely to kill you.

[Cartoon: brains do not react well to certain death.]

When you see an exam and feel anxious, your brain sees something else entirely. As far as it’s concerned, that exam is actually a large carnivore about to eat you for lunch. Your brain will try its best to persuade you that you are about to die, and that you should run for your life. Your brain is wrong, but it is also convincing.

Expect some exam anxiety or even outright panic, but realize that you don’t have to accept what your brain is telling you about the situation. Sit back and let it have a fit, like you’re waiting out a child’s temper tantrum. Without your complicity, your brain will not maintain its high panic state, and will settle down again in a few minutes. If you happen to imagine it as an obnoxious pinkish-grey wrinkly thing running back and forth, waving its arms in the air, and screaming at the top of its lungs, that might speed things along.

[Cartoon: sometimes you just have to wait them out.]

Exam panic is only a disaster if you think it is. If you begin to panic, and mistakenly believe that the panic is the result of an accurate assessment of your situation, then more panic follows. Even worse, when you panic, your cognitive functioning can diminish- amongst other things, you can forget what you’ve studied. So now you’re suddenly unable to remember anything you studied, and becoming convinced that you are facing catastrophe. This leads to the all too common experience of blanking on an exam only to suddenly remember all of the answers 30 minutes later, once you’ve begun to relax.

[Cartoon: unfortunately, you can’t leave it at home.]

Fortunately, this can be managed by expecting that your brain will do stupid things in response to stress, realizing that you might have to let it freak out for a while, and then just waiting until it has regained its composure.


Help for students, part 3: Reasons for miscalculating course expectations

Bert and Sally are two students who ran into difficulty on their final exams, and complained that there was an unreasonable amount of material to memorize for the course. So why did Bert and Sally not try to understand the course material rather than just memorize it? Maybe they thought memorizing would be easier and faster, or maybe they weren’t far enough along to transition toward understanding. But I suspect there is another reason. I suspect that they underestimated how much they would be expected to know, and how well they would have to know it. As a result, they prepared too superficially.

1. Approach to the assignments

In the course that Sally and Bert took, students do assignments which are often accomplished by paraphrasing the textbook in a way that is only slightly better than copying it outright. Because of this approach, they can get the right answers (and therefore good grades) without actually understanding key parts of what they’ve written. There seems to be a chain of reasoning that runs: “I didn’t really understand the assignments, but I still did well on them. Therefore, I will be able to do well on the final exam with a similar level of understanding.”

In fact, it is never safe to take one’s performance on assignments at face value unless one can be confident that the conditions of the exam will match the conditions of the assignment. For example, if you refer to your textbook while solving physics problems, this is not the same as having to solve a physics problem on an exam without your textbook and while under pressure. A good grade on that physics assignment would tell you very little about how you will do on the exam. In the case of the course that Sally and Bert took, a look at students’ performances over many offerings of the course shows that there is effectively no correlation between assignment grades and exam results.

Another problem with paraphrasing the textbook very closely is that while I suspect that students who do so are not clear on what they are writing, I have no way of knowing for sure what they do and don’t understand. That means I may give a student full points on a question even though that student might have misunderstood the text that he or she paraphrased. In that case, getting full points might convince a student that his or her understanding is correct, when in fact it isn’t. That student has eliminated any chance for me to find the error… until I grade the exam, that is. Then I hear from Berts and Sallys.

2. Reasonability assumption

When filtering out what is and isn’t necessary to study, a starting point might be the assumption that an instructor will not be unreasonable and will avoid demanding complex details, or asking questions about extremely difficult topics. One problem with this assumption, however, is that someone who is new to a field of study will not have the same perception of what is difficult or complex as someone who has worked in the field for a while. An idea that might seem complicated to the uninitiated could be a very basic principle in that field. A second problem is that sometimes a complicated or difficult topic can be very important for a particular area of study, and therefore necessary to learn even though learning it might seem nearly impossible at the time.

You may think that another problem with the reasonability assumption is that some instructors are unreasonable and use exams to punish students. I can’t say that’s never the case, but I will point out that a “reasonable” exam is not an exam that any student can pass- it is an exam that a student can pass if he or she has done a reasonable job of covering the stated course objectives.

In the end, if you’re not sure about whether something is important or not, and you can’t determine that from the learning objectives or course objectives, just ask your instructor.

3. Perceived importance of the course

I sensed that Sally was unhappy about taking the course. It was the last one she needed to get her degree, and she was anxious to move on with her life. She seemed to feel that the course was a pointless hoop to jump through, and just wanted to get it over with. Understanding the course material was not a priority for her, and maybe her feelings about how much work she should have to do for the course coloured her perspective on how much work would actually be required.

Sometimes students in Sally’s position assume that the instructor understands that the course is not important to them. They assume that the instructor knows better than to make the course too demanding and get in the way of a student graduating. However, even if a student’s reasons for taking the course colour his or her expectations about what the exam will be like or should be like, it does not affect the reality. The requirements will be the same regardless of why a student is taking the course, and students should expect that there could be the same kinds of demands as in courses that they view as more serious, or more important for achieving their goals.  Put another way, no-one should expect to get credit for a course without fulfilling its requirements.  I would also recommend against telling your instructor that he or she should pass you because the course doesn’t matter.

4. Learning is what someone else does to your head

Every now and again I run into students who prefer to be passive participants in their own learning. These are students who think that I should put more effort into helping them than they are willing to put into helping themselves. Frank was a classic case. Before assignments came due, he would email to ask what pages the answers were on in the textbook. An email exchange with Frank would look like this:

Frank: “I can’t find the answers to questions 1b, 1c, 2a, 2b, 3, 5, and 7. Can you tell me where I should read in the textbook?”

Me: “For question 1b asking what igneous rocks are, you can find the answer in the section titled “What Igneous Rocks Are” starting on page 53.”

Sometimes it is hard to find a specific topic amid other details, so I explained to Frank how he could look up the page numbers in the index of the textbook. Frank disregarded my explanation, and continued to ask similar questions.

I want to be careful to distinguish between students who take a passive approach, and those who ask a lot of questions about different topics, those who ask for help with the same topic repeatedly, or those who need assistance deciphering their textbooks. By definition, these students are not taking a passive approach because their questions have arisen out of an effort to understand the course material. In fact, I would prefer that more students contacted me with those kinds of questions. But this is very different from asking me to look up pages for you in the index.

Students who are passive about their learning will inevitably underestimate the amount of understanding that is required because they believe on some level that learning and understanding are things that they are given. That’s just not the way learning works.


Help for students, part 2: Memorizing vs. understanding

Sally and Bert are two students who fell prey to unknown unknowns on their final exam. They both sent me emails complaining that the exam was unreasonably difficult- that they were required to supply more information than a student could reasonably be expected to memorize.

Sally said:

“I found the exam very difficult obviously and would like to see the format change as the amount of content that needs to be memorized is something I feel uncapable [sic] of. If it were a multiple choice exam, I believe the outcome would have been different.”

The exam did require more knowledge than a student might reasonably be expected to memorize, and that was because the students were expected to understand the course material rather than just memorize it. Memorization is a very inefficient way to attempt to store information. Understanding is much better. It’s the difference between learning the lyrics to your favourite song by trying to remember the words in random order, or learning them as they fit into lines and verses and tell a story.  One is next to impossible, and the other you can do after listening to the song a few times.

That’s not to say there is no memorizing involved, but ideally the situation would look something like Plan A in the diagram below. The diagram is a sketch of what part of your knowledge would depend on memorizing, and what part would depend on understanding as you progress through the course. Initially, there is quite a lot of memorizing required as you encounter new terms for the first time, but at the same time your understanding is increasing. Eventually, you are able to add new knowledge by building on your understanding, and less memorizing is required.

[Diagram: memorizing versus understanding (Plans A, B, and C)]

If a substantial amount of learning gets left to the last minute, then to be as prepared as in Plan A, learning must happen a lot faster. In that case, we’re looking at Plan B where memorizing and understanding are condensed into a small amount of time. Unfortunately, your brain can only learn so much before it needs a break, so what actually happens is C. There is insufficient time to prepare, and that time is taken up mostly by memorizing because you don’t yet have enough of the individual puzzle pieces to start to build the big picture.


Help for students, part 1: Breaking the curse of the unknown unknowns

Students often ask whether I can offer any tips on preparing for and writing exams. Sometimes they are new students who haven’t developed study strategies yet, and sometimes they have just become frustrated with strategies that don’t seem to be working for them. Sometimes they are panicked and desperate, and end their emails with “HELP” followed by several exclamation points. (Never a good sign.) So I thought it might be time to jot these things down in one place, rather than writing them over and over again in emails to unhappy students who waited to ask for help until it was too late.

If there is one thing that causes more problems for students preparing for exams than any other, it would be the unknown unknowns:

“…as we know, there are known knowns; there are things we know we know. We also know there are known unknowns; that is to say we know there are some things we do not know. But there are also unknown unknowns — the ones we don’t know we don’t know.”  Donald Rumsfeld, US Secretary of Defense, 12 Feb 2002

When studying, known knowns are the topics you are confident about, and which you are right to be confident about. Known unknowns are the deficits in your knowledge that you are aware of, and which you therefore have a chance to fix. Where you get in trouble, however, are the unknown unknowns- the deficits in your knowledge that you don’t realize exist. You can’t fix those because you don’t know they’re there. At least, you don’t know they’re there until you hit an exam question you didn’t realize you were unprepared for. Then they become known unknowns, but it’s too late to do anything about them.

Here are two examples of what a run-in with unknown unknowns can sound like. Unfortunately, I receive emails like this on a regular basis:

Sally:

“I realize I am not going to pass this course even with the 20+ hours I studied over the last week. I have trouble putting the definitions on paper, I remember reading them and seeing them but can’t find the definition…”

 Bert:

“I felt as though I at least I completed the test and did not leave it blank, and felt confident that half my responses where right, but must have gotten confused…”

Note: “Bert” and “Sally” are not the real names of these students, and may or may not reflect their gender(s).

Sally’s unknown unknowns turned into known unknowns during the exam. In contrast, Bert emailed me because he was shocked that his exam grade was so low- Bert’s unknown unknowns were so sneaky that he got right through the exam without even noticing them.

Both Sally and Bert blamed the exam format for their problems. Their exam was short answer, and they felt that if they had clues in the form of multiple choice questions, then things would have gone better. As Bert put it,

“… there is no way someone first year can be capable to do this, let alone without instruction, or scientific key terms without getting terms mixed up, since there is no multiple questions [for] deductive logical reasoning…”

I think that part of Sally’s and Bert’s problem was that they underestimated how much understanding they would need to be successful on the exam. Ultimately, though, exam format should not be an issue.  If you know the answers, it shouldn’t matter whether the question format is short answer, multiple choice, essay, or interpretive dance. If you know it, you know it, and if you don’t, it makes just as much sense to blame your pencil.

The main problem that Bert and Sally had is that brains can be deceiving. In Sally’s case, after more than 20 hours of studying, everything looked familiar to her brain, and she believed it when it told her that she was ready for the exam. Unfortunately for Sally, the appearance of the page was what was familiar, not the information on it.

For both Sally and Bert it would have been a simple matter to set a trap for the unknown unknowns: if Sally and Bert had put their notes away every few minutes and tried to explain verbally or in writing what they had just read, they would have found very quickly that they couldn’t do it. Then they could have fixed the problem. Unfortunately, this is very hard work and should not be done for more than 45 minutes or so without taking a break. In Sally’s case, after many sustained hours of studying, her brain would have been too tired to manage it. She probably continued reading and not absorbing partly because she was too tired to do anything else.

Some of the sneakiest unknown unknowns hide so well that you might need someone else’s help to find them. Those are the kind where you remember information, but don’t realize that you have some part of it incorrect. The best way to trap these is to work with someone who might be able to pick up errors in your understanding as you explain the course material to them. This could be someone else in the class, or just a friend who asks you questions by referring to the textbook. Here are a few strategies that I’ve found helpful for turning unknown unknowns into known unknowns:

  • Scare them out into the open: Imagine that your instructor were to call you out of the blue to ask you questions about the course. What would you not want him or her to ask you about? Along the same lines, what would you not want to be asked about on the exam?
  • Treat learning objectives as questions and attempt to answer them without looking at your notes.
  • Reorganize information into diagrams and tables. For example, if you made a table to compare and contrast Neanderthals and Cro-Magnons, you might find that you can say something about Neanderthal body size, but you don’t remember how that compares to Cro-Magnon body size.   Diagrams and tables have the added benefit of being much easier to remember than lists of facts.
  • Study by explaining topics out loud to yourself or a friend. There is a difference between reading facts and trying to mentally organize them so you can say them out loud, and that difference can be enough to throw you off balance and expose unknown unknowns.

Plagiarism-proof assignments: The Up-Goer Five Challenge

Ok, so there’s probably no such thing as a plagiarism-proof assignment, but I think I’ve got a reasonable approximation thereof.

It originated with my frustration with the perpetual struggle to have students in my distance education classes answer questions in their own words. My students are using their textbooks to answer questions, and many seem to feel that a textbook is the exception to the rule when it comes to plagiarism. Some simply don’t understand that they’re doing anything wrong. From experience, I can tell you that many people who are not my students also see it that way, and complaining about it is a great way to be branded as unreasonable. The problem, as I’ve documented before, is that students who copy from their textbook also tend to fail the class. After last term, I’ve decided that it’s in my best interest to consume alcohol before grading assignments. I’m not allowed to ignore plagiarism, but what I don’t see…

Absent blissful ignorance, the only way to deal with plagiarism (without causing myself a variety of problems) is to change the assignments so that plagiarism isn’t possible. Now, if you’ve attempted to do this, you know it isn’t easy. A search online will give you tips like having students put themselves in the position of a person experiencing a historical event, and explaining their perspective on the matter. That’s something students (most likely) can’t copy from the internet. But suggestions like that are not especially helpful when the topic is how volcanoes work. (Although now that I think about it, “Imagine you are an olivine crystal in a magma chamber…”)

The solution came from my online source of comfort, xkcd. Randall Munroe, the creator of the webcomic, set himself the challenge of labeling a diagram of NASA’s Saturn 5 rocket (Up Goer Five) with only the 1000 most commonly used words in the English language. Soon after, members of the geoscience community took up the challenge of explaining their fields of research in the 1000 most commonly used words. Here are two examples from a blog post by hydrogeologist Anne Jefferson. Anne writes:

” So I decided to see if I could explain urban hydrology and why I study it using only the words in the list. Here’s what I came up with:

I study how water moves in cities and other places. Water is under the ground and on top of it, and when we build things we change where it can go and how fast it gets there. This can lead to problems like wet and broken roads and houses. Our roads, houses, and animals, can also add bad things to the water. My job is to figure out what we have done to the water and how to help make it better. I also help people learn how to care about water and land. This might seem like a sad job, because often the water is very bad and we are not going to make things perfect, but I like knowing that I’m helping make things better.

Science, teach, observe, measure, buildings, and any synonym for waste/feces were among the words I had to write my way around. If I hadn’t had access to “water”, I might have given up in despair.

But my challenge was nothing compared to that faced by Chris, as he explained paleomagnetism without the word magnet:

I study what rocks tell us about how the ground moves and changes over many, many (more than a hundred times a hundred times a hundred) years. I can do this because little bits hidden inside a rock can remember where they were when they formed, and can give us their memories if we ask them in the right way. From these memories we can tell how far and how fast the rocks have moved, and if they have been turned around, in the time since they were made. It is important to know the stories of the past that rocks tell, because it is only by understanding that story that we really understand the place where we live, how to find the things that we need to live there, and how it might change in the years to come. We also need to know these things so we can find the places where the ground can move or shake very fast, which can be very bad for us and our homes.”

Is that brilliant, or what?! To make it even better, Theo Sanderson developed a text editor to check whether only those words have been used. This is what happened when I typed part of the introduction to the chapter on volcanoes:

[Screenshot: the Up-Goer Five text editor. Yes, fortunately it has the word “rock.”]
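
If you are curious how a checker like this might work under the hood, here is a minimal sketch in Python: flag any word that is not on an allowed list. This is my own illustration, not Theo Sanderson’s actual code; the word-list file name is a placeholder, and the real editor also copes with plurals, suffixes, and the quote-mark work-arounds mentioned below.

```python
import re

def load_allowed_words(path="ten_hundred_words.txt"):
    """Read the allowed words (one per line) into a set."""
    with open(path) as handle:
        return {line.strip().lower() for line in handle if line.strip()}

def disallowed_words(text, allowed):
    """Return the words in `text` that are not on the allowed list."""
    words = re.findall(r"[a-z']+", text.lower())
    return sorted({word for word in words if word not in allowed})

# Example use while writing or grading an answer:
# allowed = load_allowed_words()
# print(disallowed_words("Hot melted rock pushes up and out of the ground", allowed))
```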

I decided to test-drive this with my class. I gave them the option of answering their assignment questions in this way. It’s difficult, so they got bonus points for doing it. A handful attempted it, and that was probably the most fun I’ve ever had grading assignments. If you’d like to give this kind of assignment a shot, there are a few things to keep in mind:

  • Students (and colleagues) may be skeptical. Explain that the exercise requires a solid knowledge of the subject matter (in contrast to paraphrasing the textbook) and is a very effective way for students to diagnose whether they know what they think they know. In my books, that gives it a high score in the learning per unit time category.
  • The text editor has some work-arounds, like putting single quotes around a word, or adding “Mr” or “Mrs” in front of a word (e.g., Mr Magma). Head those off at the pass, or you’ll get “But you didn’t say we couldn’t!”
  • You may wish to allow certain words for the assignment or for specific questions, depending on your goals. For example, if I were less diabolical, I might consider allowing the use of “lava.” The other reason for not allowing “lava” is that I want to be sure they know what it means. In contrast, I probably wouldn’t make them struggle with “North America.”
  • Make it clear that simple language does not mean simple answers. I found that students tended to give imprecise answers that didn’t address important details. I don’t think they were trying to cut corners- they just didn’t think it was necessary. If I were to do this again I would give them a rubric with examples of what is and isn’t adequate.
  • Recommend that they write out the key points of their answers in normal language first, and in a separate document, and then attempt to translate it.
  • Suggest that they use analogies or comparisons if they are stuck. For example, Randall Munroe refers to hydrogen as “the kind of air that once burned a big sky bag.”
  • Make the assignment shorter than you might otherwise, and focus on key objectives. Doing an assignment this way is a lot of work, and time consuming.
  • And finally, (as with all assignments) try it yourself first.

In that spirit:

I like to make stories with numbers to learn what happens when things go into the air that make air hot. Very old rocks from deep under water say things that help make number stories. The number stories are not perfect but they still tell us important ideas about how our home works. Some day the number stories about how old air got hot might come true again, but maybe if people know the old number stories, they will stop hurting the air. If they don’t stop hurting the air, it will be sad for us because our home will change in bad ways.


Time: The final frontier

[Logo: a winged hourglass made of ammonites. A logo begging for a t-shirt.]

Here it is: the final incarnation of my design project for Design and Development of Educational Technology, the Timefleet Academy. It’s a tool to assist undergraduate students of historical geology with remembering events in Earth history, and how those events fit into the Geological Time Scale. Much of their work consists of memorizing a long list of complicated happenings. While memorizing is not exactly at the top of Bloom’s Taxonomy (it’s exactly at the bottom, in fact), it is necessary. One could approach this task by reading the textbook over and over, and hoping something will stick, but I think there’s a better way.

I envision a tool with three key features:

  • A timeline that incorporates the Geological Time Scale, and “zooms” to show events that occur over widely varying timescales
  • The ability to add events from a pre-existing library onto a custom timeline
  • Assessments to help students focus their efforts effectively

Here’s an introduction to the problem, and a sketch of my solution. If your sensors start to detect something familiar about this enterprise then you’re as much of a nerd as I am.

Timefleet Academy is based on the constructionist idea that building is good for learning. Making a representation of something (in this case, Earth history) is a way of distilling its essential features. That means analyzing what those features are, how they are related, and expressing them explicitly. Ultimately this translates to the intuitive notion that it is best to approach a complex topic by breaking it into small digestible pieces.

[Figure: the Geological Time Scale. This is what you get to memorize.]

As challenging as the Geological Time Scale is to memorize, it does lend itself to “chunking” because the Time Scale comes already subdivided. Even better, those subdivisions are designed to reflect meaningful stages (and therefore meaningful groupings of events) in Earth history.

There is an official convention regarding the colours in the Geological Time Scale (so no, it wasn’t my choice to put red, fuchsia, and salmon next to each other), and I’ve used it on the interface for two reasons. One is that it’s employed on diagrams and geological maps, so students might as well become familiar with it. The other is that students can take advantage of colour association as a memory tool.

Assessments

Assessments are a key difference between Timefleet Academy and other “zoomable” timelines that already exist. The assessments would come in two forms.

1. Self assessment checklists

These allow users to document their progress through the list of resources attached to individual events. This might seem like a trivial housekeeping matter, but mentally constructing a map of what resources have been used costs cognitive capital. Answering the question “Have I been here already?” has a non-zero cognitive load, and one that doesn’t move the user toward the goal of learning historical geology.

2. Drag-and-drop drills

The second kind of assessment involves drill-type exercises where users drag and drop objects representing events, geological time periods, and dates, to place them in the right order. The algorithm governing how drills are set would take into account the following (a rough sketch follows the list):

  • The user’s previous errors: It would allow for more practice in those areas.
  • Changes in the user’s skill level: It would adjust by making tasks more or less challenging. For example, the difficulty level could be increased by going from arranging events in chronological order to arranging them chronologically and situating them in the correct spots on the Geological Time Scale. Difficulty could also be increased by placing time limits on the exercise, requiring that the user apply acquired knowledge rather than looking up the information.
  • The context of events: If drills tend to focus on the same group of events, the result could be overly contextualized knowledge. In other words, if the student were repeatedly drilled on the order of events A, B, and C separately from the order of events D, E, and F, and were then asked to put A, B, and E in the right order, there could be a problem.
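
Here is a rough sketch, in Python, of what that selection logic might look like. It is purely illustrative: the function names, the half-review/half-random split, and the accuracy thresholds are my assumptions, not part of any existing tool.

```python
import random

def pick_drill_events(events, error_counts, n=6):
    """Choose events for the next drill. Half the drill revisits the events the
    user has missed most often; the other half is drawn at random so the same
    events are not always seen in the same company."""
    by_errors = sorted(events, key=lambda e: error_counts.get(e, 0), reverse=True)
    review = by_errors[: n // 2]
    others = [e for e in events if e not in review]
    return review + random.sample(others, min(n - len(review), len(others)))

def next_task(recent_accuracy):
    """Raise or lower the difficulty as the user's recent accuracy changes."""
    if recent_accuracy < 0.5:
        return "arrange events in chronological order"
    if recent_accuracy < 0.8:
        return "place events on the Geological Time Scale"
    return "place events and their dates on the Time Scale, against the clock"

# Example: a user who keeps confusing two extinction events would see them often.
# pick_drill_events(["end-Permian extinction", "end-Triassic extinction",
#                    "first birds", "first land plants", "Grenville orogeny",
#                    "Great Oxidation Event"],
#                   {"end-Permian extinction": 3, "end-Triassic extinction": 2})
```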

The feedback from drills would consist of correct answers and errors being indicated at the end of each exercise, and a marker placed on the timeline to indicate where (when) errors have occurred. Students would earn points toward a promotion within Timefleet Academy for completing drills, and for correct answers.

Who wouldn’t want a cool new uniform?

How do you know if it works?

1. Did learning outcomes improve?

This could be tested by comparing the performance of a group of students who used the tool to that of a control group who didn’t. Performance measures could be results from a multiple choice exam. They could also be scores derived from an interview with each student, where he or she is asked questions to gauge not only how well events are recalled, but also whether he or she can explain the larger context of an event, including causal relationships. It would be interesting to compare exam and interview scores for students within each group to see how closely the results of a recall test track the results of a test focused on understanding.

For the group of students who have access to the tool, it would be important to have a measure of how they used it, and how often. For example, did they use it once and lose interest? Did they use it for organizing events but not do drills? Or did they work at it regularly, adding events and testing themselves throughout? Without this information, it would be difficult to know how to interpret differences (or a lack of differences) in performance between the two groups.

 2. Do they want to use it?

This is an important indicator of whether students perceive that the tool is helpful, but also of their experience interacting with it. Students could be surveyed about which parts of the tool were useful and which weren’t, and asked for feedback about what changes would make it better. (The option to print out parts of the timeline, maybe?) They could be asked specific questions about aspects of the interface, such as whether their drill results were displayed effectively, whether the controls were easy to use, etc. It might be useful to ask them if they would use the tool again, either in its current form, or if it were redesigned to take into account their feedback.

Timefleet in the bigger picture

[Image: writing a test. All set to pass the test of time.]

Timefleet Academy is ostensibly a tool to aid in memorizing the details of Earth history, but it actually does something more than that. It introduces students to a systematic way of learning- by identifying key features within an ocean of details, organizing those features, and then testing their knowledge.

The point system rewards students for testing their knowledge regardless of whether they get all of the answers right. The message is twofold: testing one’s knowledge is valuable because it provides information about what to do next; and testing one’s knowledge counts as progress toward a goal even if you don’t get the right answers every time. Maybe it’s threefold: if you do enough tests, eventually you get a cape, and a shirt with stars on it.


Building assessments into a timeline tool for historical geology

In my last post I wrote about the challenges faced by undergraduate students in introductory historical geology. They are required to know an overwhelming breadth and depth of information about the history of the Earth, from 4.5 billion years ago to present. They must learn not only what events occurred, but also the name of the interval of the Geological Time Scale in which they occurred. This is a very difficult task! The Geological Time Scale itself is a challenge to memorize, and the events that fit on it often involve processes, locations, and organisms that students have never heard of. If you want to see a case of cognitive overload, just talk to a historical geology student.

My proposed solution was a scalable timeline. A regular old timeline is helpful for organizing events in chronological order, and it could be modified to include the divisions of the Geological Time Scale. However, a regular old timeline is simply not up to the task of displaying the relevant timescales of geological events, which vary over at least six orders of magnitude. It is also not up to the job of displaying the sheer number of events that students must know about. A scalable timeline would solve those problems by allowing students to zoom in and out to view different timescales, and by changing which events are shown depending on the scale. It would work just like Google Maps, where the type and amount of geographic information that is displayed depends on the map scale.
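
To illustrate the Google Maps analogy, here is a minimal Python sketch of how a timeline might decide which events to draw at a given zoom level. The importance ranks and the zoom thresholds are made-up assumptions for the example; this is my own illustration, not the inner workings of any existing tool.

```python
def visible_events(events, start_ma, end_ma):
    """Return the events to draw for the current view, Google-Maps style.
    `events` is a list of (name, age_ma, rank) tuples, where rank runs from
    1 (major, always shown) to 5 (minor, only shown when zoomed right in)."""
    span = abs(start_ma - end_ma)
    if span > 1000:      # whole-of-Earth-history view: only the biggest events
        max_rank = 1
    elif span > 100:     # era/period scale
        max_rank = 2
    elif span > 10:      # epoch scale
        max_rank = 3
    else:                # zoomed right in
        max_rank = 5
    low, high = min(start_ma, end_ma), max(start_ma, end_ma)
    return [(name, age) for name, age, rank in events
            if rank <= max_rank and low <= age <= high]

# Zoomed out to all of Earth history, only rank-1 events appear:
print(visible_events([("end-Permian extinction", 252, 1),
                      ("first flowering plants", 130, 2)], 4600, 0))
```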

Doesn’t that exist already?

My first round of Google searches didn’t turn anything up, but more recently round two hit paydirt… sort of. Timeglider is a tool for making “zoomable” timelines, and allows the user to embed media. It also has the catch phrase “It’s like Google Maps but for time,” which made me wonder if my last post was re-inventing the wheel.

ChronoZoom was designed with Big History in mind, which is consistent with the range of timescales that I would need. I experimented with this tool a little, and discovered that users can build timelines by adding exhibits, which appear as nodes on the timeline. Users can zoom in on an exhibit and access images, videos, etc.

If I had to choose, I’d use ChronoZoom because it’s free, and because students could create their own timelines and incorporate timelines or exhibits that I’ve made. Both Timeglider and ChronoZoom would help students organize information, and ChronoZoom already has a Geological Time Scale, but there are still features missing. One of those features is adaptive formative assessments that are responsive to students’ choices about what is important to learn.

Learning goals

There is a larger narrative in geological history, involving intricate feedbacks and cause-and-effect relationships, but very little of that richness is apparent until students have done a lot of memorization. My timeline tool would assist students in the following learning goals:

  • Memorize the Geological Time Scale and the dates of key event boundaries.
  • Memorize key events in Earth history.
  • Place individual geological events in the larger context of Earth history.

These learning goals fit right at the bottom of Bloom’s Taxonomy, but that doesn’t mean they aren’t important to accomplish. Students can’t move on to understanding why things happened without first having a good feel for the events that took place. It’s like taking a photo with the lens cap on: you just don’t get the picture.

And why assessments?

This tool is intended to help students organize and visualize the information they must remember, but they still have to practice remembering it in order for it to stick. Formative assessments would give students that practice, and students could use the feedback from those assessments to gauge their knowledge and direct their study to the greatest advantage.

How it would work

The assessments would address events on a timeline that the students construct for themselves (My Timeline) by selecting from many hundreds of events on a Master Timeline. The figure below is a mock-up of what My Timeline would look like when the scale is limited to a relatively narrow 140 million year window. When students select events, related resources (videos, images, etc.) would also become accessible through My Timeline.

[Figure: A mock-up of My Timeline. A and B are pop-up windows designed to show students which resources they have used. C is access to practice exercises, and D is how the tool would show students where they need more work.]
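As a rough illustration of how My Timeline could sit on top of the Master Timeline, the student's selections might simply be a set of event ids drawn from the master list, with linked resources exposed only for the events they have chosen. The event ids, labels, and resource names below are invented for the example.

```python
# A shared Master Timeline of events (a tiny, made-up excerpt)
MASTER_TIMELINE = {
    "kpg_extinction": {"label": "Cretaceous-Paleogene extinction", "age_ma": 66.0},
    "great_oxidation": {"label": "Great Oxidation Event", "age_ma": 2400.0},
}

# Resources linked to each event (videos, images, etc.)
RESOURCES = {
    "kpg_extinction": ["video: impact evidence", "image: crater gravity map"],
    "great_oxidation": ["video: banded iron formations"],
}

class MyTimeline:
    def __init__(self):
        self.selected = set()

    def add_event(self, event_id):
        # Students can only pick events that exist on the Master Timeline
        if event_id not in MASTER_TIMELINE:
            raise KeyError(f"{event_id} is not on the Master Timeline")
        self.selected.add(event_id)

    def available_resources(self):
        # Resources become accessible once the related event is selected
        return {eid: RESOURCES.get(eid, []) for eid in self.selected}

mine = MyTimeline()
mine.add_event("kpg_extinction")
print(mine.available_resources())
```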

Students would benefit from two kinds of assessments:

Completion checklists and charts

The problem with having abundant resources is keeping track of which ones you’ve already looked at. Checklists and charts would show students which resources they have used. A mouse-over of a particular event would pop up a small window (A in the image above) with the date (or range of dates) of the event and a pie chart with sections representing the number of resources that are available for that event. A mouse-over on the pie chart would pop up a hyperlinked list of those resources (B). Students would choose whether to check off a particular resource once they are satisfied that they have what they need from it, or perhaps flag it if they find it especially helpful. If a resource is relevant for more than one event, and shows up on multiple checklists, then checks and flags would appear for all instances.
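Here is a sketch of the bookkeeping that could sit behind the checklists. The key assumption is that checks and flags are stored per resource rather than per event, which is what makes a shared resource show the same status on every checklist it appears in. All of the ids are hypothetical.

```python
# Which resources belong to which events (one resource can serve several events)
EVENT_RESOURCES = {
    "snowball_earth": ["video_glaciation", "article_cap_carbonates"],
    "great_oxidation": ["video_glaciation", "video_banded_iron"],
}

checked = set()  # resources the student has marked as done
flagged = set()  # resources the student found especially helpful

def toggle_check(resource_id):
    # Check or un-check a resource; the status follows it everywhere
    checked.symmetric_difference_update({resource_id})

def checklist_for(event_id):
    """Build the pop-up checklist (window B) for one event."""
    return [
        {"resource": r, "done": r in checked, "flagged": r in flagged}
        for r in EVENT_RESOURCES[event_id]
    ]

def completion_fraction(event_id):
    """Drive the pie chart in the mouse-over window (A)."""
    resources = EVENT_RESOURCES[event_id]
    return sum(r in checked for r in resources) / len(resources)

toggle_check("video_glaciation")
print(checklist_for("snowball_earth"))   # video_glaciation shows as done here...
print(checklist_for("great_oxidation"))  # ...and here, because it is shared
print(completion_fraction("snowball_earth"))  # 0.5
```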

Drag-and-drop exercises

Some of my students construct elaborate sets of flashcards so they can arrange events or geological time intervals spatially. Why not save them the trouble of making flashcards?

Students could opt to practice remembering by visiting the Timefleet Academy (C). They would do exercises such as:

  • Dragging coloured blocks labeled with Geological Time Scale divisions to put them in the right order
  • Dragging events to either put them in the correct chronological order (lower difficulty) or to position them in the correct location on the timeline (higher difficulty)
  • Dragging dates from a bank of options onto the Geological Time Scale or onto specific events (very difficult)

Upon completion of each drag-and-drop exercise, students would see which parts of their responses were correct. Problem areas (for example, a geological time period in the wrong order) would be marked on My Timeline with a white outline (D) so students could review those events in the appropriate context. White outlines could be cleared directly by the student, or else by successfully completing Timefleet Academy exercises involving those components.
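For an ordering exercise, the grading and the white-outline bookkeeping might look something like the following. The period list, function names, and the set that drives the outlines are all mine, not a specification.

```python
# Correct order for one small exercise (early Paleozoic periods)
CORRECT_ORDER = ["Cambrian", "Ordovician", "Silurian", "Devonian"]

needs_review = set()  # items currently outlined in white (D) on My Timeline

def grade_ordering(submitted):
    """Return per-item feedback and update the set of problem areas."""
    feedback = {}
    for position, item in enumerate(submitted):
        correct = position < len(CORRECT_ORDER) and CORRECT_ORDER[position] == item
        feedback[item] = correct
        if correct:
            needs_review.discard(item)  # success clears the outline
        else:
            needs_review.add(item)      # flag it for review in context
    return feedback

def clear_outline(item):
    """Students can also dismiss an outline directly."""
    needs_review.discard(item)

print(grade_ordering(["Cambrian", "Silurian", "Ordovician", "Devonian"]))
print(needs_review)  # Silurian and Ordovician get white outlines
```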

Drag-and-drop exercises would include some randomly selected content, as well as items that the student has had difficulty with in the past. The difficulty of the exercises could be scaled to respond to increasing skill, either by varying the type of drag-and-drop task, or by placing time limits on the exercise. Because a student could become very familiar with one stretch of geologic time without knowing others very well, the tool would have to track skill separately for different parts of the timeline and respond accordingly.
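One simple way to implement that responsiveness is weighted sampling, where each stretch of time keeps its own error count, so mastery of one eon does not hide gaps in another. This is a sketch only; the eon-level granularity and the weighting scheme are placeholders for whatever the tool would actually track.

```python
import random

# Per-stretch error counts (eons used here just for illustration)
error_counts = {"Hadean": 0, "Archean": 3, "Proterozoic": 1, "Phanerozoic": 0}

def pick_exercise_items(n=3):
    """Choose items for the next exercise, biased toward past trouble spots."""
    items = list(error_counts)
    # A base weight keeps everything in rotation; errors raise the weight
    weights = [1 + error_counts[i] for i in items]
    # Sampling is with replacement in this sketch, so repeats are possible
    return random.choices(items, weights=weights, k=n)

def update_skill(item, was_correct):
    # Success gradually erases the extra weight; mistakes add to it
    if was_correct:
        error_counts[item] = max(0, error_counts[item] - 1)
    else:
        error_counts[item] += 1

print(pick_exercise_items())
```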

A bit of motivation

Students would earn points for doing Timefleet Academy exercises. To reward persistence, they would earn points for completing the exercises, in addition to points for correct responses. Points would accumulate toward a progression through Timefleet Academy ranks, beginning with Time Cadet, and culminating in Time Overlord (and who wouldn’t want to be a Time Overlord?). Progressive ranks could be illustrated with an avatar that changes appearance, or a badging system. As much as I’d like to show you some avatars and badges, I am flat out of creativity, so I will leave it to your imagination for now.
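The scoring itself could be as simple as a completion bonus plus per-item credit, with ranks awarded at point thresholds. Apart from Time Cadet and Time Overlord, the rank names, point values, and thresholds below are invented for illustration.

```python
# Rank thresholds (cumulative points needed); intermediate names are made up
RANKS = [
    (0, "Time Cadet"),
    (100, "Time Navigator"),
    (300, "Time Captain"),
    (600, "Time Overlord"),
]

def score_attempt(n_correct, completion_bonus=5):
    # Persistence pays: finishing an exercise earns points even with errors
    return completion_bonus + 2 * n_correct

def rank_for(points):
    current = RANKS[0][1]
    for threshold, name in RANKS:
        if points >= threshold:
            current = name
    return current

print(score_attempt(n_correct=6))  # 17 points for an imperfect attempt
print(rank_for(350))               # Time Captain
```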

Categories: Assessment, Learning strategies, Learning technologies

Wreck it good

[Image: a nasty equation]

This week I graded an exceptionally well-written exam.  The student used exam-writing and study strategies that I’ve found to be effective in my own experience.  This got me reflecting on my time as a student, and I remembered the one thing that helped me more than any other skill or strategy that I developed: wreck it good.

Proclaimed in the same spirit as “git’er done!” (and therefore exempt from the usual grammatical rules), “wreck it good” was my license to fail.  Not only that, it made failure an imperative, which turned out to be a very good thing.

Here’s the scenario: I was taking a course in numerical modeling as part of my doctorate at Penn State.  The course included assignments that required writing computer code to simulate a variety of natural systems and processes.  I’d had some experience with programming, but the programming environment was new to me, and the application was also new.  Coding can be frustrating and challenging.  From time to time, my code produced such bizarre results that I had to remind myself that the computer was doing what I told it to, and not manifesting malicious intent.

As stressful as the course was (my husband claims it took five years off my lifespan), I look back on it with only positive feelings as a result of having given myself permission to fail.  It started out as a matter of pride.  I didn’t want to ask for help with my code only to hear “Did you try [insert obvious course of action that didn’t occur to me]?”  I resolved instead that I would try everything I could think of and make a complete and utter disaster of my code if necessary—I would wreck it good.  Then I could ask for help with the confidence that either the computer was broken, or the task was impossible when viewed from every conceivable angle by a normal human being.  I was nothing if not thorough.

A strange thing happened on the way to wrecking it good… I actually began to have fun with troubleshooting my code.  There was no risk involved in failure, because I could ask for help at any time.  That meant troubleshooting was more about exploring possibilities than fixing problems… and oddly enough, despite my best efforts, I never did wreck it good enough to need my instructor’s help.  It was very empowering to find out over and over again that however ugly and impossible the problem looked initially, I could handle it.  Bring it on, partial differential equation… cause I’m going to wreck you good!

Wrecking it good isn’t just for computer programming.  It works great for doing battle with math problems, or for posing challenging study questions to diagnose knowledge gaps.  It makes sense to try to fail—it is a full frontal attack on your learning challenges, and they won’t stand a chance.

Categories: Learning strategies
