A facebook conversation from yesterday encouraged me to share one of the assignments that I developed for my instructional design course. The goal of the class is for the students to understand, experience, and apply instructional design in a variety of educational contexts.
One of the assignments I developed asked students to enroll in a Massive Open Online Course (MOOC) and analyze the instructional materials within the course using one of the rubrics provided by Dick and Carey (the instructional design book we use in class). It was a lot of fun, and the students appreciated the exercise. Given the lack of presence and voice of instructional designers in MOOC happenings, the lack of valid, reliable, and serious research on the topic (though Rita Kop’s work on cMOOCs is admirable), and my desire to engage students in contemporary events, I came up with this assignment to embed MOOC analysis in my course. The assignment is available for download at https://dl.dropbox.com/u/2533962/instr-materials-veletsianos.doc and posted below for those who just want to skim it without downloading it. Enjoy and feel free to use it:
Instructional Material analysis assignment
Individually, you will examine and report on the instructional materials of one popular digital learning initiative. An analysis matrix will be provided to you, and you will use that matrix to evaluate these initiatives.
Length: Minimum 500 words.
| Criteria | Levels of Attainment | Points |
|---|---|---|
| Written analysis (evaluation) | | |
This task requires a few hours of research before you can actually complete it. Even though this is an individual task, if you would like to discuss the assignment with any of your colleagues, please feel free to do so.
First read the chapter and the rest of the materials for this week. Without reading those, I can assure you that your understanding of the issues presented will be superficial.
Second, examine the rubric provided by Dick & Carey for evaluating instructional materials (pp. 250-251 – see below for the rubric). You will be completing this rubric for a digital environment, and it’s a good idea to understand what it encompasses before you proceed.
Third, select one course provided on one of the following platforms to examine:
- A course on Coursera (select a course that is occurring right now or has been completed. DO NOT select a course that has not started yet): https://www.coursera.org/courses
- A course on EdX (select a course that is occurring right now. DO NOT select a course that has not started yet): https://www.edx.org/courses
- A free course on Udemy (select a course that includes at least 5 “lessons/lectures”): http://www.udemy.com/courses
You can also choose to examine DS106: http://ds106.us/ I am including DS106 on its own because it is a course, as opposed to Coursera, EdX, and Udemy, which are platforms. If you pick any of these three platforms, then you should also pick a course within it (e.g., within Coursera a possible course is https://www.coursera.org/course/friendsmoneybytes).
Once you have made your selection, it’s time to research your course. Spend time looking around, examining and evaluating the instructional materials provided. You will use the rubric to keep track of the criteria that need to be assessed, and then using this rubric you will write a report assessing the instructional material for the course.
You should start your report by stating the course and its provider. A link would also be helpful. For example, using the example above, I would start my report by stating the following:
“I am examining the course entitled Networks: Friends, Money and Bytes (https://www.coursera.org/course/friendsmoneybytes). This course is offered through Coursera and is taught by Mung Chiang, who is a Professor of Electrical Engineering at Princeton University. The course is an introduction to the topic of X and its objectives are XYZ.”
Your report should be specific and detailed in its evaluation of instructional material, and should be guided by the five criteria families discussed by Dick and Carey: goal-centered, learner-centered, learning-centered, context-centered, and technical criteria. I would like to see that you understand each criterion and that you are capable of applying it to evaluating your course. For example, at the very least, I would expect to see statements such as the following:
Instructional designers use five criteria families to evaluate instructional materials. Learner-centered criteria focus on XYZ and refer to X. The instructional materials for this course appear to be adequate for this criterion because <provide list of reasons here>. The course could be improved in this domain by <list of additions/revisions here>. However, because item X was not disclosed in the course, I am not able to evaluate Y.
Let me reiterate that to complete this assignment you will need to do background research on the course and the platform. For example, your background research on Coursera will reveal that some of these courses have more than 80,000 students from around the world. This fact alone will impact your evaluation!
Instructional Material Evaluation Rubric
Rubric is copyright of: Dick, W., Carey, L., & Carey, J. (2008). The Systematic Design of Instruction (7th ed.). Upper Saddle River, NJ: Pearson.
A. Goal-centered Criteria:
Are the instructional materials:
1. Congruent with the terminal and performance objectives?
2. Adequate in content coverage and completeness?
6. Objective in presentations (lack of content bias)?
B. Learner-centered Criteria:
Are the instructional materials appropriate for learners’:
1. Vocabulary and language?
2. Development level?
3. Background, experience, environment?
4. Experiences with testing formats and equipment?
5. Motivation and interest?
6. Cultural, racial, gender needs (lack of bias)?
C. Learning-centered Criteria:
Do the materials include:
1. Pre-instructional material?
2. Appropriate content sequencing?
3. Presentations that are complete, current, and tailored for learners?
4. Practice exercises that are congruent with the goal?
5. Adequate and supportive feedback?
6. Appropriate assessment?
7. Appropriate sequence and chunk size?
D. Context-centered Criteria:
Are/do the instructional materials:
1. Authentic for the learning and performance sites?
2. Feasible for the learning and performance sites?
3. Require additional equipment/tools?
4. Have congruent technical qualities for planned site (facilities/delivery system)?
5. Have adequate resources (time, budget, personal availability and skills)?
E. Technical Criteria:
Do the instructional materials have appropriate:
1. Delivery system and media for the nature of objectives?
3. Graphic design and typography?
6. Audio and video quality?
7. Interface design?
I’m working through my thoughts with this blog entry, as I’ve been trying to use this space to think out loud about my work and what I see happening in online education and higher ed.
A lot has been written about MOOCs and accreditation, and a lot more will be forthcoming. For example, see Terry Anderson’s post on this.
Today, I ran across this quote in an article in Time Magazine:
…if Liu passes the graduate-level Harvard course she is taking for free through edX — one of the leading providers of massive open online courses, or MOOCs — she will be granted 7.5 credit hours, which her school district has agreed to accept as a form of professional development that can help her earn a higher salary. Liu might be among the first students nationwide to turn free online coursework into tangible college credit, but that number may soon grow exponentially.
What is the value of a critique?
The value of critique is to help us see a phenomenon through a different lens, to help us make sense of something in a different way, and to spark a conversation. This is the purpose, and value, of a paper we recently published in IRRODL on the topic of open scholarship.
The paper identifies the assumptions and challenges of openness and open scholarship and attempts to put forward suggestions for addressing them. A summary of our paper appears below:
Many scholars hope and anticipate that open practices will broaden access to education and knowledge, reduce costs, enhance the impact and reach of scholarship and education, and foster the development of more equitable, effective, efficient, and transparent scholarly and educational processes. Wiley and Green (2012, p. 88) note that “only time will tell” whether practices of open scholarship will transform education or whether the movement “will go down in the history books as just another fad that couldn’t live up to its press.” Given the emerging nature of such practices, educators are finding themselves in a position in which they can shape and/or be shaped by openness (Veletsianos, 2010). The intention of this paper is (a) to identify the assumptions of the open scholarship movement and (b) to highlight challenges associated with the movement’s aspirations of broadening access to education and knowledge. The goal of this paper is not to frame open scholarship as a problematic alternative to the status quo. Instead, as we see individuals, institutions, and organizations embrace openness, we have observed a parallel lack of critique of open educational practices. We find that such critiques are largely absent from the educational technology field, as members of the field tend to focus on the promises of educational technologies, rarely pausing to critique their assumptions. Selwyn (2011b, p. 713) even charges that our field’s inherent positivity “limits the validity and credibility of the field as a site of serious academic endeavour.” Our intention is to spark a conversation with the hopes of creating a more equitable and effective future for digital education and scholarship. To this end, this paper is divided into three major sections. First, we review related literature to introduce the reader to the notion of open scholarship. Next, we discuss the assumptions of openness and open scholarship. We then identify the challenges of open scholarship and discuss how these may limit or problematize its outcomes.
Common assumptions and challenges are summarized as follows:
| Common themes and assumptions | Challenges |
|---|---|
| Open scholarship has a strong ideological basis rooted in an ethical pursuit for democratization, fundamental human rights, equality, and justice. | Are these ideals essential components of the open scholarship movement, or are they merely incidental to those who are pioneering the field? |
| Open scholarship emphasizes the importance of digital participation for enhanced scholarly outcomes. | Scholars need to develop an understanding of participatory cultures and social/digital literacies in order to take full advantage of open scholarship. University curricula need to be redesigned to prepare future scholars for the changing nature of scholarship. |
| Open scholarship is treated as an emergent scholarly phenomenon that is co-evolutionary with technological advancements in the larger culture. | Technology both shapes and is shaped by practice. Technology is not neutral, and its embedded values may advance tensions and compromises (e.g., flat relationships, homophily, filter bubbles). |
| Open scholarship is seen as a practical and effective means for achieving scholarly aims that are socially valuable. | Open scholarship introduces new dilemmas and needs (e.g., personal information management challenges; social stratification and exclusion). |
Given the topic, the best home for this paper was the International Review Of Research In Open And Distance Learning, through which you can download the paper for free in an open access manner:
I am in Cyprus to meet with a number of colleagues and give an invited talk at ICEM 2012.
Talk title: What does the future of design for online learning look like? Emerging technologies, Openness, MOOCs, and Digital Scholarship
Abstract: What will we observe if we take a long pause and examine the practice of online education today? What do emerging technologies, openness, Massive Open Online Courses, and digital scholarship tell us about the future that we are creating for learners, faculty members, and learning institutions? And what does entrepreneurial activity worldwide surrounding online education mean for the future of education and design? In this talk, I will discuss a number of emerging practices relating to online learning and online participation in a rapidly changing world and explain their implications for design practice. Emerging practices (e.g., open courses, researchers who blog, students who use social media to self-organize) can shape our teaching/learning practice, and teaching/learning practice can shape these innovations. By examining, critiquing, and understanding these practices we will be able to understand potential futures for online learning and be better informed on how we can design effective and engaging online learning experiences. This talk will draw from my experiences and research on online learning, openness, and digital scholarship, and will present recent evidence detailing how researchers, learners, and educators are creating, sharing, and negotiating knowledge and education online.
This is part of my ongoing reflection on MOOCs. See the rest of the entries here.
I have signed up for a number of MOOCs as a student (and led one of the #change11 weeks), and a couple of weeks ago I spoke in general terms about how education research can help improve the type of education offered through a MOOC. In this post, I will give specific suggestions, focusing on the University of Pennsylvania MOOC Listening to World Music, offered through Coursera. I am signed up for this course, which started on June 23, and I just submitted the first assignment. I decided to post these thoughts early for two reasons. First, the beginning of any course is an important moment in its success, and I find that it takes a lot of planning and reflection. Second, MOOCs are discussed as experiments in online education. The Atlantic even calls them “The single most important experiment in higher education.” I agree that they are experimental initiatives, and as such they would benefit from ongoing evaluation and advice. Where I disagree is with the notion that they are a departure from what we know about online (and face-to-face) education. This post is intended to highlight just a couple of items that the Coursera instructional designers and learning technologists could have planned for in order to develop a more positive learning experience.
1. Length of the video lectures.
The syllabus lists the length of the video lectures (e.g., video 1 is 10:01 long and video 2 is 10:45 long). However, this length is not provided on the page that students visit to watch the videos, which is where they need that information. I’ve annotated this below.
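Surfacing the running time where the videos live is a small engineering task. As a minimal sketch (the helper name is my own, not Coursera's; the two durations are the ones listed in the syllabus), the watch page could sum the listed “mm:ss” lengths to show students the total time commitment:

```python
def total_minutes(durations):
    """Sum a list of 'mm:ss' video lengths and return the total in whole minutes."""
    total_seconds = sum(
        int(m) * 60 + int(s) for m, s in (d.split(":") for d in durations)
    )
    return round(total_seconds / 60)

# The two lecture lengths listed in the syllabus:
print(total_minutes(["10:01", "10:45"]))  # → 21
```

Even this tiny bit of information lets a learner decide whether they have time to watch now or should come back later.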
2. Opportunities for interaction.
The platform provides forums for students to interact with each other. Learners are, of course, resourceful and will figure out alternative, more efficient, and more effective ways to communicate with each other if they need to. For instance, in a number of other MOOCs students set up Facebook groups, and I anticipate that this will happen here as well. What Coursera could do to support learners in working with each other is to integrate social media plugins within each course. I am surprised that this isn’t prominent within the course, because you can see from the images below that Coursera uses social media plugins to allow students to announce participation in the course:
For instance, it appears that the course uses the #worldmusic hashtag, though it’s not integrated within the main page of the course, nor does it seem to be a unique hashtag associated with the course.
3. How do you encourage students to watch the videos?
Let’s say that we added the length of each video next to its title. Now, the learner knows that they need about an hour to watch the videos. Some learners (e.g., those who are intrinsically motivated by the topic) will watch them without much scaffolding. But how do you provide encouragement for others to do so? Here’s where some social psychology insights might be helpful. By providing learners with simple descriptions of how the majority of their colleagues are behaving (i.e., appealing to social norms), one might be able to encourage individuals to watch the videos. For example, the videos might include a dynamic subtitle that informs learners that “8 out of 10 of your peers have watched this video” or that “70% of your peers have completed the first assignment,” and so on. This is the same strategy that hotels use to encourage guests to reuse towels and the same strategy that Nike uses when it compares your running patterns to those of other runners, as shown in the image below:
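A sketch of how such a subtitle could be generated, assuming the platform can query how many enrolled students have watched a given video (the function name and the figures are illustrative assumptions, not Coursera's actual analytics):

```python
def social_norm_caption(watched, enrolled):
    """Build a peer-comparison caption from simple viewing analytics."""
    if enrolled == 0:
        return ""  # avoid division by zero before anyone enrolls
    out_of_ten = round(watched / enrolled * 10)
    return f"{out_of_ten} out of 10 of your peers have watched this video"

print(social_norm_caption(8250, 10000))  # → 8 out of 10 of your peers have watched this video
```

The design point is that the message is computed from data the platform already has; no extra instructor effort is needed once the caption is wired into the video page.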
4. Peer-grading expectations.
This course is different from others that I’ve participated in because it includes an element of peer-grading. This is exciting to me because I’m a firm believer in social learning. One minor concern, however, is the following: I don’t know how many peers I am supposed to evaluate. I thought I was supposed to evaluate just one, but each time I finish my evaluation, I am presented with the option to “evaluate next student.” Do I keep evaluating? How many do I need to evaluate before I can move to the next step? I don’t know. In other words, it’s always helpful to inform the learner of what s/he has to do. For instance, in my case, I just stopped after having evaluated 4 peers because I don’t know how many I am expected to do. Perhaps there’s no minimum… and this information would be helpful to me as a learner.
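A progress message as simple as the sketch below would resolve the ambiguity; the three-evaluation requirement here is a made-up example, not the course's actual rule:

```python
def evaluation_status(completed, required):
    """Tell the learner where they stand in the peer-evaluation process."""
    remaining = max(required - completed, 0)
    if remaining == 0:
        return f"All set: you have completed the required {required} evaluations."
    return f"Please evaluate {remaining} more peer(s) ({completed} of {required} done)."

print(evaluation_status(1, 3))  # → Please evaluate 2 more peer(s) (1 of 3 done).
print(evaluation_status(4, 3))  # → All set: you have completed the required 3 evaluations.
```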
Overall, my experience with this course is positive, though there is a lot of room for improvement, which is to be expected. For example, I haven’t touched much on the pedagogy of the course, but there are a few more weeks left… so stay tuned!
Nike photo credit. Thanks to my colleague Chuck Hodges for directing my attention to the Nike example.
This entry is part of a reflective series of posts/questions relating to online learning, MOOCs, and openness. See the first one here.
Coursera announced today that it is adding a dozen or so universities as partners. In an article in the New York Times, Sebastian Thrun notes that MOOC courses are still experimental and argues: “I think we are rushing this a little bit,” he said. “I haven’t seen a single study showing that online learning is as good as other learning.”
This perception of online education as “better than” or “as good as” other forms of education (I imagine that Sebastian Thrun is referring to face-to-face education here) is rampant. I believe it is rampant because our field has not done a good job disseminating what we know and what we don’t know about online education. At the same time, individuals do not tend to go back to the foundations of the field to investigate what others have discovered.
The result: a lack of understanding that there’s a whole field out there (here?) that has developed important insights on how we can design online education effectively. The references listed at the end of this post are merely a few of the resources one can use to get started on what we know and what we don’t know about comparison studies (i.e., studies that compare learning between delivery modes).
The point of this entry is to argue that there’s no point in reinventing the wheel. There’s no point in making the same mistakes. And above all, past research has shown that there’s no point in studying whether online education is as good as (or as bad as) other forms of education, because what one will discover is that:
- There are no significant differences in learning outcomes between face-to-face education and online education.
- When differences are found between the two, they can be attributed to (a) pedagogy or (b) a lack of controls in the experimental design.
It is important to point out that the effectiveness of an educational approach is greatly influenced by other variables, such as instructor support or pedagogical approach. Therefore, it is very difficult (if not impossible) to compare face-to-face and online education: when one is not a replication of the other, they are vastly different, are based on different learner-instructor interactions, and offer different affordances. While researchers have tried to minimize differences and compare face-to-face learning and online learning in experimental ways, the interventions end up being meaningless for the types of powerful online/face-to-face teaching we might envision. Comparing delivery mechanisms, therefore, blinds us to the important variables that truly impact how people learn.
The important and informative questions to ask are not comparative. Rather they focus squarely on understanding online education:
- How can we design effective and engaging online education (e.g., MOOCs)?
- What is the nature of learning in a MOOC?
- What is the learning experience in a MOOC like?
These questions are difficult. They won’t be answered by comparing survey responses, and they won’t generate one simple answer. They will, however, generate answers that differ depending on context. And that’s what’s exciting about doing research on online education.
- Means, B., Toyama, Y., Murphy, R., Bakia, M., & Jones, K. (2009). Evaluation of Evidence-Based Practices in Online Learning: A Meta-Analysis and Review of Online Learning Studies. U. S. Department of Education. http://www.ed.gov/rschstat/eval/tech/evidence-based-practices/finalreport.pdf
- Shachar, M., & Neumann, Y. (2010). Twenty Years of Research on the Academic Performance Differences Between Traditional and Distance Learning: Summative Meta-Analysis and Trend Examination. MERLOT Journal of Online Learning and Teaching, 6(2). http://jolt.merlot.org/vol6no2/shachar_0610.htm
- Clark, R. E. (1983). Reconsidering research on learning from media. Review of Educational Research, 53(4), 445-459.
- Clark, R. (1985). Evidence for confounding in computer-based instruction studies: Analyzing the meta-analyses. Educational Communications and Technology Journal, 33(4), 249-262.
- Kozma, R. (1994). Will media influence learning? Reframing the debate. Educational Technology Research and Development, 42(2), 7-19.
- Tennyson, R. D. (1994). The big wrench vs. integrated approaches: The great media debate. Educational Technology Research & Development, 42(3), 15-28.
- Veletsianos, G. & Navarrete, C. (2012). Online Social Networks as Formal Learning Environments: Learner Experiences and Activities. The International Review Of Research In Open And Distance Learning, 13(1), 144-166
- Veletsianos, G. (2011). Designing Opportunities for Transformation with Emerging Technologies. Educational Technology, 51(2), 41-46.
- Veletsianos, G. (2010). A Definition of Emerging Technologies for Education. In G. Veletsianos (Ed.), Emerging Technologies in Distance Education (pp. 3-22). Athabasca University Press.
This entry is part of a reflective series of posts/questions relating to online learning, MOOCs, and openness.
MOOCs are everywhere nowadays. Depending on the lens one uses to examine them, Coursera, Udacity, EdX, the connectivist MOOCs (e.g., #ds106, Change11), and the rest are generating hope, excitement, uneasiness, and frustration. An important question that one needs to ask is: What is the purpose of a MOOC?
MOOCs have different purposes. For example, some MOOCs are built on the idea of democratizing education and enhancing societal well-being. See Curt Bonk’s MOOC types, targets, and intents for additional MOOC purposes.
Other MOOCs are built on the idea of improving a specific skill set. Today’s EdSurge newsletter included the following note:
GOOGLE’S FIRST MOOC comes in the form of a “Power Searching with Google” course consisting of six 50-minute classes on how to search “beyond the ten blue links.” Classes just started and at last count, over 100,000 people have already registered. Google promises to go way beyond the 101 stuff and dive into advanced features. We’re ready: we’ve been a little stumped at finding a query “to search exclusively in the Harvard University website to find pages that mention clowns.”
Let’s unpack this a bit. What is the purpose of this MOOC? This MOOC will help users make better use of Google’s search capabilities. It will also help Google experiment with offering MOOC-type courses and reinforce consumer loyalty.
How does the Google MOOC fare with regard to enhancing societal well-being? Rather than offering courses to teach users how to search better, I would rather have seen Google develop online courses specifically aimed at reducing societal inequalities and enhancing well-being. I would rather have seen a course on “using our tools for speaking out against oppressive regimes” or “using our tools to facilitate the development of community in your neighborhood” or “using our tools to design and develop your own online class.”
I hope that this course is not the last that we see from Google, and that rather than focusing on teaching users a specific skill set, future courses focus on supporting the development of societal well-being.