Category: moocs

HarvardX and MITx open course reports

HarvardX and MITx released a number of reports describing their open courses. The overarching paper describing these initiatives, entitled HarvardX and MITx: The first year of open online courses, is very helpful for gaining a holistic understanding of what HarvardX and MITx learned about their initiatives.

My (preliminary) thoughts:

  • It’s exciting to see more data
  • It’s exciting to see education researchers involved in the analysis of the data
  • The researchers should be congratulated for making these reports available in an expeditious manner via SSRN
  • We need more interpretive/qualitative research to understand participants’ and practitioners’ experiences on the ground
  • I am wondering whether the community would benefit from access to the data that HarvardX and MITx have, as other individuals/groups could run additional analyses. Granted, I imagine this might require quite a lot of effort, not least in the development of procedures for data sharing.

The course reports appear below; they can help the community understand the particulars of each course:

Using an instructional design perspective to analyze MOOC materials

A Facebook conversation from yesterday encouraged me to share one of the assignments that I developed for my instructional design course. The goal of the class is for students to understand, experience, and apply instructional design in a variety of educational contexts.

One of the assignments I developed asked students to enroll in a Massive Open Online Course (MOOC) and analyze its instructional materials using one of the rubrics provided by Dick and Carey (the instructional design book we use in class). It was a lot of fun, and the students appreciated the exercise. Given the lack of presence and voice of instructional designers in MOOC happenings, the lack of valid, reliable, and serious research on the topic (though Rita Kop’s work on cMOOCs is admirable), and my desire to engage students in contemporary events, I came up with this assignment to embed MOOC analysis in my course. The assignment is available for download at https://dl.dropbox.com/u/2533962/instr-materials-veletsianos.doc and is posted below for those who just want to skim it without downloading. Enjoy and feel free to use it:

Instructional Material analysis assignment

Individually, you will examine and report on the instructional materials of one popular digital learning initiative. An analysis matrix will be provided to you, and you will use that matrix to evaluate your selected initiative.

Length: Minimum 500 words.

Criteria, levels of attainment, and points:

Written analysis (evaluation)
  • Evaluation adheres to the matrix, is thoughtful, and presents evidence of original thought (8 points)
  • Evaluation does not adhere to the matrix or is superficial on various levels (7-0 points)

Rubric completion
  • Learner completes and submits the rubric for evaluating instructional materials (pp. 250-251) for his/her selected initiative (2 points)

This task requires a few hours of research before you can actually complete it. Even though this is an individual task, if you would like to discuss the assignment with any of your colleagues, please feel free to do so.

Mechanics

First, read the chapter and the rest of the materials for this week. Without reading those, I can assure you that your understanding of the issues presented will be superficial.

Second, examine the rubric provided by Dick & Carey for evaluating instructional materials (pp. 250-251; see below for the rubric). You will be completing this rubric for a digital environment, and it’s a good idea to understand what it encompasses before you proceed.

Third, select one course provided on one of the following platforms to examine:

  • A course on Coursera (select a course that is occurring right now or has been completed. DO NOT select a course that has not started yet): https://www.coursera.org/courses
  • A course on EdX (select a course that is occurring right now. DO NOT select a course that has not started yet): https://www.edx.org/courses

You can also choose to examine DS106: http://ds106.us/ I am including DS106 on its own because it is a course, as opposed to the above, which are platforms. If you pick a platform rather than DS106, then you should also pick a specific course within it (e.g., within Coursera, a possible course is https://www.coursera.org/course/friendsmoneybytes).

Assignment

Once you have made your selection, it’s time to research your course. Spend time looking around, examining and evaluating the instructional materials provided. Use the rubric to keep track of the criteria that need to be assessed; then, drawing on the completed rubric, write a report assessing the instructional materials for the course.

You should start your report by stating the course and its provider. A link would also be helpful. For example, using the example above, I would start my report by stating the following:

“I am examining the course entitled Networks: Friends, Money and Bytes (https://www.coursera.org/course/friendsmoneybytes). This course is offered through Coursera and is taught by Mung Chiang, who is a Professor of Electrical Engineering at Princeton University. The course is an introduction to the topic of X and its objectives are XYZ.”

Your report should be specific and detailed in its evaluation of the instructional materials, and should be guided by the five criteria families discussed by Dick and Carey: goal-centered, learner-centered, learning-centered, context-centered, and technical criteria. I would like to see that you understand each criterion and that you are capable of applying it to the evaluation of your course. For example, at the very least, I would expect to see statements such as the following:

Instructional designers use five criteria families to evaluate instructional materials. Learner-centered criteria focus on XYZ and refer to X. The instructional materials for this course appear to be adequate for this criterion because <provide list of reasons here>. The course could be improved in this domain by <list of additions/revisions here>. However, because item X was not disclosed in the course, I am not able to evaluate Y.

Let me reiterate that to complete this assignment you will need to do background research on the course and the platform. For example, your background research on Coursera will reveal that some of these courses have more than 80,000 students from around the world. This fact alone will impact your evaluation!

Instructional Material Evaluation Rubric

Rubric is copyright of: Dick, W., Carey, L., & Carey, J. (2008). The Systematic Design of Instruction (7th ed.). Upper Saddle River, NJ: Pearson.

A. Goal-centered Criteria:
Are the instructional materials:

(Rate each item: Yes / No / Some)
1. Congruent with the terminal and performance objectives?
2. Adequate in content coverage and completeness?
3. Authoritative?
4. Accurate?
5. Current?
6. Objective in presentations (lack of content bias)?

B. Learner-centered Criteria:
Are the instructional materials appropriate for learners’:
(Rate each item: Yes / No / Some)
1. Vocabulary and language?
2. Development level?
3. Background, experience, environment?
4. Experiences with testing formats and equipment?
5. Motivation and interest?
6. Cultural, racial, gender needs (lack of bias)?

C. Learning-centered Criteria:
Do the materials include:
(Rate each item: Yes / No / Some)
1. Pre-instructional material?
2. Appropriate content sequencing?
3. Presentations that are complete, current and tailored for learners?
4. Practice exercises that are congruent with the goal?
5. Adequate and supportive feedback?
6. Appropriate assessment?
7. Appropriate sequence and chunk size?

D. Context-centered Criteria:
Are/do the instructional materials:
(Rate each item: Yes / No / Some)
1. Authentic for the learning and performance sites?
2. Feasible for the learning and performance sites?
3. Require additional equipment/tools?
4. Have congruent technical qualities for planned site (facilities/delivery system)?
5. Have adequate resources (time, budget, personnel availability and skills)?

E. Technical Criteria:
Do the instructional materials have appropriate:
(Rate each item: Yes / No / Some)
1. Delivery system and media for the nature of objectives?
2. Packaging?
3. Graphic design and typography?
4. Durability?
5. Legibility?
6. Audio and video quality?
7. Interface design?
8. Navigation?
9. Functionality?
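For anyone who wants to tally these ratings across several courses, here is a rough sketch of how the rubric could be represented in code. This is a minimal sketch of my own; only the criteria families come from Dick and Carey, and all names below are hypothetical:

```typescript
// A three-level rating for each rubric item.
type Rating = "yes" | "no" | "some";

// One criteria family (e.g., goal-centered) and its question prompts.
interface RubricSection {
  name: string;
  items: string[];
}

// Abbreviated: the five families from the Dick & Carey rubric above.
const rubric: RubricSection[] = [
  { name: "Goal-centered", items: ["Congruent with objectives?", "Adequate content coverage?"] },
  { name: "Learner-centered", items: ["Appropriate vocabulary and language?", "Appropriate developmental level?"] },
  // ...remaining sections and items elided for brevity
];

// Count how many items received each rating in one section.
function tally(ratings: Rating[]): Record<Rating, number> {
  const counts: Record<Rating, number> = { yes: 0, no: 0, some: 0 };
  for (const r of ratings) counts[r] += 1;
  return counts;
}

console.log(tally(["yes", "some", "yes", "no"]));
// -> { yes: 2, no: 1, some: 1 }
```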

MOOCs, credit, accreditation, and narratives

I’m working through my thoughts with this blog entry, as I’ve been trying to use this space to think out loud about my work and what I see happening in online education and higher ed.

A lot has been written about MOOCs and accreditation, and a lot more will be forthcoming. For example, see Terry Anderson’s post on this.

Today, I ran across this quote in an article in Time Magazine:

…if Liu passes the graduate-level Harvard course she is taking for free through edX — one of the leading providers of massive open online courses, or MOOCs — she will be granted 7.5 credit hours, which her school district has agreed to accept as a form of professional development that can help her earn a higher salary. Liu might be among the first students nationwide to turn free online coursework into tangible college credit, but that number may soon grow exponentially.
Critical educators have done a good job of exposing the systems of oppression and the unequal distribution of power that impoverish learning experiences. I believe that such a lens is increasingly important in the work of any researcher and educator thinking about the future of education. To illustrate, the description above is not just a narrative of the success of open education. It is also a narrative of MOOCs “carving new markets” rather than innovating the way higher education functions for the masses of people who could not have attained a degree in the first place. I think that we need to keep an open mind with regard to the potential, as well as the aims and pitfalls, of such initiatives. To explore a different perspective, I suggest that you read Richard Hall’s analysis of how the profit motive is threatening higher education.

Contrast this with the TechCrunch perspective that “the school system, as we know it, is on the verge of extinction” as “it’s inevitable that online courses will in one way or another replace schools.” The question to ask here is not whether this prophecy will come true. We know that it won’t, because universities are valued social institutions that are embedded in the culture of their times; even though they may change, they won’t disappear. An analysis of educational technology predictions of the past also shows that hype is rarely realized (pdf). The important question to ask, however, is this: Who benefits from the narrative of “extinct schools”? Is it the student? The edtech startups? The investors?

Assumptions and Challenges of Open Scholarship

What is the value of a critique?

The value of critique is that it helps us see a phenomenon through a different lens, make sense of something in a different way, and spark a conversation. This is the purpose, and value, of a paper on open scholarship that we recently published in IRRODL.

The paper identifies the assumptions and challenges of openness and open scholarship and puts forward suggestions for addressing them. A summary of our paper appears below:

Many scholars hope and anticipate that open practices will broaden access to education and knowledge, reduce costs, enhance the impact and reach of scholarship and education, and foster the development of more equitable, effective, efficient, and transparent scholarly and educational processes. Wiley and Green (2012, p. 88) note that “only time will tell” whether practices of open scholarship will transform education or whether the movement “will go down in the history books as just another fad that couldn’t live up to its press.” Given the emerging nature of such practices, educators are finding themselves in a position in which they can shape and/or be shaped by openness (Veletsianos, 2010). The intention of this paper is (a) to identify the assumptions of the open scholarship movement and (b) to highlight challenges associated with the movement’s aspirations of broadening access to education and knowledge. The goal of this paper is not to frame open scholarship as a problematic alternative to the status quo. Instead, as we see individuals, institutions, and organizations embrace openness, we have observed a parallel lack of critique of open educational practices. We find that such critiques are largely absent from the educational technology field, as members of the field tend to focus on the promises of educational technologies, rarely pausing to critique their assumptions. Selwyn (2011b, p. 713) even charges that our field’s inherent positivity “limits the validity and credibility of the field as a site of serious academic endeavour.” Our intention is to spark a conversation with the hopes of creating a more equitable and effective future for digital education and scholarship. To this end, this paper is divided into three major sections. First, we review related literature to introduce the reader to the notion of open scholarship. Next, we discuss the assumptions of openness and open scholarship. We then identify the challenges of open scholarship and discuss how these may limit or problematize its outcomes.

Common assumptions and challenges are summarized as follows:

  • Assumption: Open scholarship has a strong ideological basis rooted in an ethical pursuit for democratization, fundamental human rights, equality, and justice. Challenge: Are these ideals essential components of the open scholarship movement, or are they merely incidental to those who are pioneering the field?
  • Assumption: Open scholarship emphasizes the importance of digital participation for enhanced scholarly outcomes. Challenge: Scholars need to develop an understanding of participatory cultures and social/digital literacies in order to take full advantage of open scholarship, and university curricula need to be redesigned to prepare future scholars for the changing nature of scholarship.
  • Assumption: Open scholarship is treated as an emergent scholarly phenomenon that is co-evolutionary with technological advancements in the larger culture. Challenge: Technology both shapes and is shaped by practice. Technology is not neutral, and its embedded values may advance tensions and compromises (e.g., flat relationships, homophily, filter bubbles).
  • Assumption: Open scholarship is seen as a practical and effective means for achieving scholarly aims that are socially valuable. Challenge: Open scholarship introduces new dilemmas and needs (e.g., personal information management challenges; social stratification and exclusion).

Given the topic, the best home for this paper was the International Review of Research in Open and Distance Learning, where you can download the paper for free under open access:

Veletsianos, G., & Kimmons, R. (2012). Assumptions and Challenges of Open Scholarship. The International Review of Research in Open and Distance Learning, 13(4), 166-189. [HTML access or PDF access]

Invited talk at ICEM 2012

I am in Cyprus to meet with a number of colleagues and give an invited talk at ICEM 2012.

Talk title: What does the future of design for online learning look like? Emerging technologies, Openness, MOOCs, and Digital Scholarship

Abstract: What will we observe if we take a long pause and examine the practice of online education today? What do emerging technologies, openness, Massive Open Online Courses, and digital scholarship tell us about the future that we are creating for learners, faculty members, and learning institutions? And what does entrepreneurial activity worldwide surrounding online education mean for the future of education and design? In this talk, I will discuss a number of emerging practices relating to online learning and online participation in a rapidly changing world and explain their implications for design practice. Emerging practices (e.g., open courses, researchers who blog, students who use social media to self-organize) can shape our teaching/learning practice, and teaching/learning practice can shape these innovations. By examining, critiquing, and understanding these practices, we will be able to understand potential futures for online learning and be better informed about how we can design effective and engaging online learning experiences. This talk will draw from my experiences and research on online learning, openness, and digital scholarship, and will present recent evidence detailing how researchers, learners, and educators are creating, sharing, and negotiating knowledge and education online.

What education research and instructional design practice can offer to Coursera MOOCs

This is part of my ongoing reflection on moocs. See the rest of the entries here.

I have signed up for a number of MOOCs as a student (and led one of the #change11 weeks), and a couple of weeks ago I spoke in general terms about how education research can help improve the type of education offered through a MOOC. In this post, I will give specific suggestions, focusing on the University of Pennsylvania MOOC Listening to World Music, offered through Coursera. I am signed up for this course, which started on June 23, and I just submitted the first assignment. I decided to post these thoughts early for two reasons. First, the beginning of any course is an important moment in its success, and I find that it takes a lot of planning and reflection. Second, MOOCs are discussed as being experiments in online education. The Atlantic even calls them “the single most important experiment in higher education.” I agree that they are experimental initiatives, and as such they would benefit from ongoing evaluation and advice. Where I disagree is with the notion that they are a departure from what we know about online (and face-to-face) education. This post is intended to highlight just a couple of items that the Coursera instructional designers and learning technologists could have planned for in order to develop a more positive learning experience.

1. Length of the video lectures.

The syllabus lists the length of each video lecture (e.g., video 1 is 10:01 long and video 2 is 10:45 long). However, this information is not provided on the page that students visit to watch the videos, which is where they need it. I’ve annotated this below.

[Annotated screenshot of the video lecture page, showing where video lengths could be displayed]

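If the durations from the syllabus were exposed to the lecture page as data, displaying them would be straightforward. Here is a minimal sketch, assuming a hypothetical Video shape and rendering function (neither is part of Coursera’s actual platform):

```typescript
// Hypothetical data shape: each lecture video with its duration.
interface Video {
  title: string;
  durationSeconds: number; // e.g., 601 for a 10:01 video
}

// Format seconds as m:ss so learners can budget their time.
function formatDuration(seconds: number): string {
  const m = Math.floor(seconds / 60);
  const s = seconds % 60;
  return `${m}:${s.toString().padStart(2, "0")}`;
}

// Show each title with its length on the watch page itself,
// not just in the syllabus.
function renderVideoList(videos: Video[]): string {
  return videos
    .map((v) => `${v.title} (${formatDuration(v.durationSeconds)})`)
    .join("\n");
}

console.log(renderVideoList([
  { title: "Video 1", durationSeconds: 601 },
  { title: "Video 2", durationSeconds: 645 },
]));
// Video 1 (10:01)
// Video 2 (10:45)
```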
2. Opportunities for interaction.

The platform provides forums for students to interact with each other. Learners are, of course, resourceful and will figure out alternative, more efficient and effective ways to communicate with each other if they need to. For instance, in a number of other MOOCs students set up Facebook groups, and I anticipate that this will happen here as well. What Coursera could do to support learners in working with each other is to integrate social media plugins within each course. I am surprised that this isn’t prominent within the course, because you can see from the images below that Coursera already uses social media plugins to allow students to announce their participation in the course:

For instance, it appears that the course uses the #worldmusic hashtag, though it is not integrated within the main page of the course, nor does it seem to be a hashtag uniquely associated with the course.
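As a rough sketch of the kind of lightweight integration I have in mind, a course page could generate share links pre-filled with the course hashtag. The function below is illustrative only; it relies on Twitter’s standard web-intent URL and is not part of Coursera’s platform:

```typescript
// Build a Twitter web-intent URL pre-filled with a course hashtag,
// so posts made from the course page cluster under one tag.
function buildShareUrl(message: string, hashtag: string): string {
  const params = new URLSearchParams({
    text: message,
    hashtags: hashtag, // comma-separated tags, without the leading '#'
  });
  return `https://twitter.com/intent/tweet?${params.toString()}`;
}

// Example: a "share your progress" link for this course.
console.log(
  buildShareUrl("I am taking Listening to World Music on Coursera!", "worldmusic")
);
```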

3. How do you encourage students to watch the videos?

Let’s say that we added the length of each video next to its title. Now the learner knows that they need about an hour to watch the week’s videos. Some learners (e.g., those who are intrinsically motivated by the topic) will watch them without much scaffolding. But how do you provide encouragement for others to do so? Here’s where some social psychology insights might be helpful. By providing learners with simple descriptions of how the majority of their colleagues are behaving (i.e., appealing to social norms), one might be able to encourage individuals to watch the videos. For example, the videos might include a dynamic subtitle that informs learners that “8 out of 10 of your peers have watched this video” or that “70% of your peers have completed the first assignment,” and so on. This is the same strategy that hotels use to encourage guests to reuse towels, and the same strategy that Nike uses when it compares your running patterns to those of other runners, as shown in the image below:
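To make the idea concrete, here is a minimal sketch of such a dynamic subtitle, assuming the platform can report how many enrolled learners have watched a given video (the data shape and names are hypothetical):

```typescript
// Hypothetical per-video statistics the platform would supply.
interface VideoStats {
  watched: number;  // learners who have watched this video
  enrolled: number; // learners enrolled in the course
}

// Turn raw counts into a social-norms message.
function socialNormSubtitle(stats: VideoStats): string {
  if (stats.enrolled === 0) return "";
  const outOfTen = Math.round((stats.watched / stats.enrolled) * 10);
  return `${outOfTen} out of 10 of your peers have watched this video`;
}

console.log(socialNormSubtitle({ watched: 64000, enrolled: 80000 }));
// "8 out of 10 of your peers have watched this video"
```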

4. Peer-grading expectations.

This course is different from others I’ve participated in because it includes an element of peer grading. This is exciting to me because I’m a firm believer in social learning. One minor concern, however, is the following: I don’t know how many peers I am supposed to evaluate. I thought I was supposed to evaluate just one, but each time I finish an evaluation, I am presented with the option to “evaluate next student.” Do I keep evaluating? How many do I need to evaluate before I can move to the next step? I don’t know. In other words, it’s always helpful to inform learners of what they have to do. In my case, I simply stopped after evaluating four peers because I didn’t know how many I was expected to do. Perhaps there’s no minimum, and even that information would be helpful to me as a learner.
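A small sketch of the kind of feedback I mean, assuming the platform tracks a required minimum number of evaluations (all names below are hypothetical):

```typescript
// Hypothetical peer-grading state for one learner.
interface PeerGradingState {
  completed: number; // evaluations submitted so far
  required: number;  // minimum the course expects
}

// Tell learners exactly where they stand instead of offering an
// open-ended "evaluate next student" button.
function gradingProgressMessage(state: PeerGradingState): string {
  const remaining = state.required - state.completed;
  return remaining > 0
    ? `You have evaluated ${state.completed} of ${state.required} required peers; ${remaining} to go.`
    : `You have met the requirement of ${state.required} evaluations; further reviews are optional.`;
}

console.log(gradingProgressMessage({ completed: 1, required: 3 }));
// "You have evaluated 1 of 3 required peers; 2 to go."
```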

Overall, my experience with this course is positive, though there is a lot of room for improvement, which is to be expected. For example, I haven’t touched much on the pedagogy of the course, but there are a few more weeks left… so stay tuned!

Notes:

Nike photo credit. Thanks to my colleague Chuck Hodges for directing my attention to the Nike example.

MOOCs can, and should, learn from past research in education

This entry is part of a reflective series of posts/questions relating to online learning, MOOCs, and openness. See the first one here.

Coursera announced today that it is adding a dozen or so universities as partners. In an article in the New York Times, Sebastian Thrun notes that MOOC courses are still experimental: “I think we are rushing this a little bit,” he said. “I haven’t seen a single study showing that online learning is as good as other learning.”

This perception of online education as “better than” or “as good as” other forms of education (I imagine that Sebastian Thrun is referring to face-to-face education here) is rampant. I believe it is rampant because our field has not done a good job of disseminating what we know and what we don’t know about online education. At the same time, individuals do not tend to go back to the foundations of the field to investigate what others have discovered.

The result: a lack of understanding that there’s a whole field out there (here?) that has developed important insights into how we can design online education effectively. The references at the end of this post are merely a few of the resources one can use to get started on what we know and what we don’t know about comparison studies (i.e., studies that compare learning between delivery modes).

The point of this entry is to argue that there’s no point in reinventing the wheel. There’s no point in making the same mistakes. And above all, past research has shown that there’s no point in studying whether online education is as good as (or as bad as) other forms of education, because what one will discover is that:

  1. There are no significant differences in learning outcomes between face-to-face education and online education.
  2. When differences are found between the two, they can be attributed to (a) pedagogy or (b) a lack of controls in the experimental design.

It is important to point out that the effectiveness of an educational approach is greatly influenced by other variables, such as instructor support or pedagogical approach. Therefore, it is very difficult (if not impossible) to compare face-to-face and online education: when one is not a replication of the other, they are vastly different, are based on different learner-instructor interactions, and offer different affordances. While researchers have tried to minimize differences and compare face-to-face and online learning in experimental ways, the interventions end up being meaningless for the types of powerful online/face-to-face teaching we might envision. Comparing delivery mechanisms, therefore, blinds us to the important variables that truly impact how people learn.

The important and informative questions to ask are not comparative. Rather they focus squarely on understanding online education:

  1. How can we design effective and engaging online education (e.g., MOOCs)?
  2. What is the nature of learning in a MOOC?
  3. What is the learning experience in a MOOC like?

These questions are difficult. They won’t be answered by comparing survey responses, and they won’t generate one simple answer. They will, however, generate answers that differ depending on context. And that’s what’s exciting about doing research on online education.

References:

