Accessibility, Information literacy, instructional tools, LibGuides, Research Guides and Accessibility, STEM and Accessibility

Chapter 10: Happy Blog-iversary

Gentle reader, we’re already at our tenth blog post!

[Image: the number 10. Photo by Adrian Curiel on Unsplash]

LibGuides for All

Help me celebrate by checking out a resource I created on a theme that’s been running throughout my blogging career: Accessibility.

Take a look at “LibGuides for All” — it’s a site I created to help folks get started in creating accessible LibGuides.

I think that changing how you do things in your professional practice, as an individual but especially as a group, can be a little overwhelming and stressful. So I collected some introductory material on digital accessibility, especially as it pertains to larger social questions in academia and STEM, and I also created some how-to videos. I’m hoping this website can be used in conjunction with an in-house workshop on creating accessible LibGuides.

I asked colleagues for some feedback, and they were all for changing our practice to include digital accessibility. However, there was some reluctance to spend time on it without a directive from on high. I think I need to emphasize that these are small habits, not huge, mind-altering changes.
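To make those small habits concrete, here’s a hypothetical snippet of the kind of markup a LibGuides content box might contain once the habits are applied: descriptive link text, meaningful alt text, and a proper heading level. The file names, URL, and text are all made up for illustration:

```html
<!-- A hypothetical LibGuides content box with small accessibility habits applied -->
<h2>Finding Peer-Reviewed Articles</h2>

<!-- Descriptive link text instead of "click here" -->
<p>Start with the <a href="https://example.edu/databases">library's database list</a>.</p>

<!-- Alt text describes what the image conveys; purely decorative images get alt="" -->
<img src="search-results.png"
     alt="Database search results page showing the peer-reviewed filter checkbox">
```

None of these changes take more than a few seconds per box, which is the point: habits, not overhauls.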

Big Questions

What has your experience been with professionals and their willingness to incorporate small changes such as those found in LibGuides for All?

Please let me know what you think of LibGuides for All. You know me by now — I love your feedback.

 

Accessibility, Evaluating resources, Information literacy

Chapter 9: Evaluating Resources Module

A Module for your Delight

 

Evaluating Resources

All of this thinking about evaluating resources has inspired me to create a module, Evaluating Resources. It serves as an introduction for first-year college students to the types of resources and how to evaluate them.

I love teaching information literacy to first-year college students. They are so passionate, and their topics show great concern: the safety of self-driving cars, LGBTQ sex ed, anti-vaxxers, the environment. They are often focused on WHY something is going on. It’s my job to help them focus their topic into a paper that makes an argument, backed by supporting evidence, about how to do things better in this world. Hence my module on evaluating sources.

Loom vs. Kaltura

A few posts ago, I asked your thoughts on Loom and Animoto. Now I’d like to know if anyone out there has used Kaltura. I used Loom for this module but am extremely disappointed that the “freemium” version doesn’t provide captions, so I added a transcript to my Evaluating Resources module instead. But I’d prefer to provide captions! Kaltura provides captions, but I don’t like that you can’t easily control the “picture in picture” function. Then there’s the fact that Adobe Express, which I used to create the module, doesn’t make it easy to embed a video unless it’s an Adobe-created thing.
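One possible workaround, assuming whatever player you land on accepts sidecar caption files (most HTML5-based players do): write the captions yourself as a WebVTT file and upload it alongside the video. A minimal sketch, with made-up timings and text:

```vtt
WEBVTT

00:00:00.000 --> 00:00:04.000
Welcome to the Evaluating Resources module.

00:00:04.000 --> 00:00:09.500
In this video we'll look at three common types of sources.
```

Since I already have the transcript, chopping it into timed cues like these is tedious but doable by hand, and it keeps the captions under my control no matter which platform wins the Loom-vs-Kaltura contest.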

Enough with the whining.

What should I do about making my screen cast accessible with captions? Talk to me.

Feedback

Please give me some feedback on Evaluating Resources — I’m all about the improvement-through-feedback thing.

 

Evaluating resources, Information literacy

Chapter 8: Evaluate

Gentle reader…of late, some thoughts have weighed heavily on this librarian’s soul. I’ve always had a hard time wrapping my head around how to teach first-year students to evaluate resources.

I don’t want to feel like I’m giving heavy-handed advice, and I don’t want to feel like I’m providing some binary philosophical system on

WHO TO TRUST

2016 election

I’ve become quite curious about how the 2016 election, and its relationship with fake news and social media, has affected how librarians teach information literacy. Read more about it in the preprint Post-Facts: Information Literacy and Authority after the 2016 Election and in Why Libraries Can’t Fight Fake News.

Container collapse

[Image: Container Collapse definition]

Also, I’ve been slowly digesting the findings from the big longitudinal study Researching Students’ Information Choices (RSIC). This research coined the term “container collapse”: what happens when you “decant” information from its original container (book, journal, newspaper) and turn it into a document in a database that looks like all the others? You get container collapse. Can students recognize these different types of resources? Do these different types of documents matter anymore? The research suggests that heuristics (cues used to interpret a source) don’t help all that much and that deep engagement is needed to evaluate sources.

Many students have been taught to evaluate resources using normative heuristics like “is it a .org or a .com?” I wonder if too great a focus on these simplistic shortcuts has led to shallower engagement with evaluating resources.

CRAAP not — SIFT to the rescue

The CRAAP test has long been used as a way to evaluate sources. However, it can be a bit overwhelming for some students, so they just don’t bother with it despite its very fun scatological name. SIFT takes a more realistic approach and has students take a gander at the source of the document as a primary way to evaluate its worthiness. SIFT is the new CRAAP.

STOP
INVESTIGATE THE SOURCE
FIND BETTER COVERAGE
TRACE CLAIMS TO THE ORIGINAL CONTEXT

Curious about SIFT? Learn more from this SMSU LibGuide on the SIFT Process.

What do you think?

For the youngsters out there: Did the 2016 election have an effect on how you were taught to evaluate sources?

For the more “seasoned”: Did the 2016 election have an effect on your teaching of this skill? Or did it have an effect on how your librarian friends talked about it?

Gentle reader, I’d like to know.