BY WILLIAM GUTH ON NOVEMBER 11, 2016
As an Instructional Technologist, receiving a daily barrage of emails from education tech companies is the norm. Each email offers a set of digital tools promising to “improve” the way our faculty deliver their content and “boost” our students’ learning outcomes to new heights. With each new day, there’s an opportunity to discover new tools and figure out whether these companies are really trying, or totally lying.
One type of instructional technology that is particularly important to the School of Professional Studies is online presentation tools, and our biggest challenge comes when we try to introduce new tools into the workflow of course design and content delivery. As an instructor you may think to yourself, “You mean I have to learn a new tool? I just became familiar with Canvas. Now you say I have to learn _______?” Believe me, we hear you.
In an effort to mitigate this challenge, I participated in an Online Learning Consortium Institute workshop called “Introduction to Online Presentation Tools.” Naturally the workshop exposed me to a spectrum of online presentation tools I had never heard of before, which I will share with you in a future blog post. But more importantly the workshop exposed me to a method of evaluating online learning tools that the School of Professional Studies will implement.
So, whether we hear about learning tools through ed tech vendors or directly from our faculty, there are a few questions we will ponder to determine whether or not we should even consider testing and evaluating new tools.
1. Will use of this tool support stated learning objectives?
Identifying course objectives is one of the most important aspects of the design process. In talking with your learning designer about objectives, you may brainstorm ways to achieve them, a discussion that often leads to identifying tools that can help. For example, if you want students to be able to diagram relationships, then a tool like Popplet might be useful. [Popplet: Description | Demo Video ]
2. How user-friendly is the tool?
If a tool or technology does support stated learning objectives, we then evaluate its user-friendliness against the “Web 2.0 Selection Criteria Checklist” developed by Bethany Bovard, Instructional Designer and eLearning Developer.
The checklist evaluates the tool across five areas of user-friendliness: Access, Usability, Privacy & Intellectual Property, Workload & Time Management, and Fun Factor.
Access, for instance, asks us to consider whether the tool or technology:
- Works across different operating systems (e.g. Windows, Mac, Android, iOS)
- Works using different browsers (e.g. Firefox, Chrome, Internet Explorer), and
- Offers features compliant with the Americans with Disabilities Act (ADA), such as closed captioning and keyboard navigation.
Each area for evaluation consists of three major criteria to check for, along with a list of questions you can ask yourself to determine whether the tool meets them. For the full evaluation checklist visit Save Time Choosing an Appropriate Tool.
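For teams tracking several candidate tools at once, the checklist results lend themselves to a simple structured record. The sketch below is purely illustrative: the five area names come from the checklist above, but the yes-count scoring scheme and the sample ratings are my own assumptions, not part of Bovard's checklist.

```python
# Illustrative sketch: recording checklist results for a candidate tool.
# The five areas come from the "Web 2.0 Selection Criteria Checklist";
# the simple yes-count scoring is an assumption for demonstration only.

AREAS = [
    "Access",
    "Usability",
    "Privacy & Intellectual Property",
    "Workload & Time Management",
    "Fun Factor",
]

def summarize(ratings):
    """Report how many checklist areas a tool satisfied.

    ratings: dict mapping each area name to True (meets criteria) or False.
    """
    met = [area for area in AREAS if ratings.get(area)]
    return f"{len(met)}/{len(AREAS)} areas met: {', '.join(met) or 'none'}"

# Hypothetical ratings for an unnamed candidate tool (invented values).
example_ratings = {
    "Access": True,
    "Usability": True,
    "Privacy & Intellectual Property": False,
    "Workload & Time Management": True,
    "Fun Factor": True,
}
print(summarize(example_ratings))
```

A record like this makes it easy to compare tools side by side before moving on to the quality-of-experience questions below, though the real checklist asks nuanced questions within each area rather than a single yes/no.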
After rigorously testing and narrowing our list of new and user-friendly tools, we move to evaluating how these tools will support the quality online experience our students expect and deserve.
3. Will use of this tool support quality online education criteria?
There are several models to choose from for evaluating online experience and educational quality, but for the bulk of our evaluations we will rely on the Seven Principles of Good Practice. The Seven Principles include: Student-Faculty Contact, Cooperation, Active Learning, Prompt Feedback, Diverse Ways of Learning, High Expectations and Time on Task.
In general, any combination of these makes for a high-quality online experience in which students are collaborative and social, teaching and learning from each other, learning by doing, interacting productively with faculty, and learning in ways they find most effective for broadening their repertoires.
Beyond these three steps, the Distance Learning team will evaluate cost, the tool’s reach beyond individual courses and departments, the strength of the company that developed it, and its support structure.
*For more information about commonly used frameworks for evaluating quality education criteria visit the following links:
- OLC’s Five Pillars of Quality,
- Implementing the Seven Principles: Technology as Lever, and the
- Community of Inquiry framework.