While designing my open course focusing on networked scholars, I’ll be posting updates here pertaining to pedagogical and design decisions that I’m making. [Aug 20, 2014 update: Course registration is open]
The course is intended to help doctoral students, academics, and other knowledge workers understand how social media and networked technologies may support/extend/question their scholarship. The course will also be “wrapped” by a colleague in real-time, and colleagues who teach research methods courses will be sharing it with their students. In short, the audience is diverse, their background knowledge varies, and their needs/desires will vary. So, the question becomes: how do you support all learners in achieving what they aspire to achieve?
I’ve been thinking a lot recently about success in open courses. I’m intrigued by discussions of multiple pathways (or dual-layer) through open courses, and I’ve been reflecting on how to support the different groups of people that might visit (and use) my course. In the GoNorth projects, we had thousands of teachers annually use our digital learning environment and curriculum. To accommodate their needs, the curriculum consisted of three levels (experience, explore, expand). This design encompassed varying levels of difficulty and involvement and allowed teachers to adjust the curriculum to local needs. In the edX course Data, Analytics, and Learning that George, Carolyn, Dragan, and Ryan are teaching in the Fall, the learner is given more of that control. The instructors write: “This course will experiment with multiple learning pathways. It has been structured to allow learners to take various pathways through learning content – either in the existing edX format or in a social competency-based and self-directed format. Learners will have access to pathways that support both beginners, and more advanced students, with pointers to additional advanced resources. In addition to interactions within the edX platform, learners will be encouraged to engage in distributed conversations on social media such as blogs and Twitter.” I like this because of the recognition that learners come to courses with varying needs/wants, and that recognition influenced the design of the course.
In thinking about the different needs that students in my course will have, a group of instructional designers at Royal Roads and I have created a scaffold to help individuals define what they want to achieve in the course. This tool will be helpful for self-directed learners and those with enough background knowledge on the topic, but, depending on how it is implemented, it can help novices as well. The scaffold is a Personal Learning Plan (.rtf). I think this might be helpful to others, so I’m tagging it with an open license so that others can use it as they see fit in their own courses. Here’s how it works:
I assume that individuals will enrol in this course to pursue a personal need/ambition (e.g., “I want to learn how education researchers use social media for research and I am at a loss as to where to start”). To support learners in this, I will be asking them to develop a personal learning plan (PLP) as a way to define, verbalize, and be mindful about their goals. A PLP will allow learners to define what they want to achieve by enrolling in the course and reflect on their successes and accomplishments.
Once participants create a PLP they can either keep it private, share it with the instructor, or share it on a discussion board. Sharing it on a discussion board might allow them to be more accountable to the goals they have set and to connect with colleagues who have similar goals. There is one problem here: Let’s assume that the course will be of interest to a couple of hundred people and a hundred of them post their PLPs on a discussion board. That will quickly become overwhelming for everyone. How do we reduce the information available to help learners find each other based on common interests? If learners could tag their posts, and the tags became available at the top of the discussion thread, that could help, but alas, that’s not an option available on the platform that I am using. If any of you have any ideas, I’d love to hear them!
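One workaround would be to handle the tagging outside the platform: ask learners to include a few self-declared tags in their PLP posts, collect those posts manually, and generate a tag index that can be pasted at the top of the thread. Here is a minimal sketch in Python; the learner names and tags are entirely hypothetical, and the collection step is assumed to happen by hand or via whatever export the platform allows:

```python
from collections import defaultdict

# Hypothetical PLP posts collected from the discussion board.
# Each learner self-declares a few interest tags in their post.
plp_posts = [
    {"author": "learner_a", "tags": ["blogging", "phd"]},
    {"author": "learner_b", "tags": ["twitter", "research-impact"]},
    {"author": "learner_c", "tags": ["blogging", "research-impact"]},
]

def build_tag_index(posts):
    """Map each tag to the list of learners who used it."""
    index = defaultdict(list)
    for post in posts:
        for tag in post["tags"]:
            index[tag].append(post["author"])
    return index

# Render a summary that could be pasted at the top of the thread.
for tag, authors in sorted(build_tag_index(plp_posts).items()):
    print(f"{tag}: {', '.join(authors)}")
```

The printed index lets a learner scan for a tag they care about and immediately see which peers share that goal, without reading a hundred full posts.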
Below are two fictitious learning plans as examples. These only have one row each, but learners could include as many rows as they need.
The first one is relevant to PhD students.
| Goal | Action(s) to achieve goal | Measure of success (i.e., how will I know that I was successful?) | How much time do I anticipate spending to achieve this goal? |
| --- | --- | --- | --- |
| Decide whether or not to start blogging about my dissertation | Read assigned material; participate in discussions | Make a decision by the end of the course | 2 hours per week for the next 4 weeks |
The second one applies to an early-career academic (e.g., a lecturer, a professor, or a researcher).
| Goal | Action(s) to achieve goal | Measure of success (i.e., how will I know that I was successful?) | How much time do I anticipate spending to achieve this goal? |
| --- | --- | --- | --- |
| My social media activity is gaining a global following. I want to understand the tensions that I might face. | Read everything associated with week 2; participate in as many relevant discussions as possible in week 2; join the live panel discussion during week 2 | I will write a 200-word journal entry describing potential tensions and challenges that I might face. | 7 hours during week 2 |
Of course, it is entirely possible, and research has shown, that learners don’t know what they don’t know. A personal learning plan isn’t a panacea, which is why every course needs to include a diverse range of scaffolds and supports. But this is turning out to be a long post, so I’ll save those thoughts for a future update.
As always, I’d love to hear your thoughts. How does this sound? What might be some problems with it? How could it be improved?
MITx and HarvardX deserve huge congratulations for making data associated with a number of their MOOCs publicly available. Four months ago, I wrote that the “community would benefit from access to the data that HarvardX and MITx have, as other individuals/groups could run additional analyses. Granted, I imagine this might require quite a lot of effort, not least in the development of procedures for data sharing.” It seems that the researchers at MITx and HarvardX have tackled the issues involved to make the data available, and have developed thoughtful procedures to ensure de-identification. While some of the steps taken may limit analyses (e.g., the de-identification process document notes that “rows with 60 or more forum posts were deleted,” thus eliminating highly active users), this is a big step in the right direction and it should be celebrated.
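To see why that particular de-identification step limits analyses, consider a sketch of the rule as described: a simple threshold filter. The records and field names below are hypothetical illustrations, not the actual dataset schema:

```python
# Hypothetical per-learner records. The de-identification document
# describes deleting rows with 60 or more forum posts.
records = [
    {"user": "u1", "forum_posts": 3},
    {"user": "u2", "forum_posts": 61},  # a highly active learner
    {"user": "u3", "forum_posts": 59},  # just under the threshold
]

# Applying the stated rule removes exactly the most engaged participants,
# so analyses of the released data under-represent heavy forum users.
released = [r for r in records if r["forum_posts"] < 60]
print([r["user"] for r in released])
```

The filter protects highly identifiable individuals, but it also means that any study of the released data systematically misses the course's most active discussants.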
Now… can we have some qualitative data? If any institutions are interested in making those available, I’d love to talk to you, give you input, and work with you toward that goal.
I was at the Educause Learning Initiative conference last week (#ELI2014), where I had some interesting conversations and discussions around online learning, MOOCs, research methods, and the future of higher education.
Amy Collier and I presented early results from our qualitative studies looking at learners’ MOOC experiences (if you have not yet responded to our call to share your lived experiences with us, please consider this invitation). Our talk was entitled “Messy Realities: Investigating Learners’ Experiences in MOOCs.” Our thinking is guided by the notion that even though surveys and big data yield insights into general behavioral patterns, these methods are detached and can distance us rather than help us understand the human condition. As a result, the phenomenon of “learning in a MOOC” remains understudied and poorly understood. During the session, we shared what we have been finding in our studies, highlighting the messiness of learning and teaching in the open.
Karen Vignare and Amy Collier were also very kind to extend an invitation to a number of us to share our work with individuals participating in the leadership seminar they organized. It was fantastic to hear Katie Vale (Harvard), Matt Meyer (The Pennsylvania State University), Rebecca Petersen (edX, MIT), and D. Christopher Brooks (EDUCAUSE) discuss their work, and once again, I felt grateful that we are having these conversations more openly, more frequently, and with greater intent.
Below are my rough notes from my 5-7 minute presentation. I appreciate parsimony (who doesn’t?), and in the words of D. Christopher Brooks, this is the litany of things I think:
I am a designer and researcher of education and learning. I study emerging technologies and emerging learning environments. I’m also a faculty member, and I have been teaching in higher education settings, both face-to-face and online, since 2005.
To contextualize my comments on MOOCs, first I want to describe my experiences with them:
- I facilitated one week of the #change11 MOOC, which was organized by George Siemens and Stephen Downes in 2011. This MOOC had a distinctively connectivist flavor, with each week being facilitated by a different person.
- I have enrolled in a number of MOOCs, and have even completed a small number of them.
- I have repurposed MOOCs in my own courses. For example, I have asked students to enroll in MOOCs and write about them.
- I have published an e-book with my students, sharing stories of student experiences with MOOCs.
- Finally, I am actively involved in studying learners’ experiences in MOOCs in order to understand the human element in these emerging learning environments.
I have recently come to the realization that I have an ambivalent relationship with MOOCs. My relationship with MOOCs is one of the most ambivalent relationships I have had with anyone or anything. This relationship is more ambivalent than the love-ignore-hate relationship that my cat has with me!
On the one hand, I appreciate the opportunities for open learning that MOOCs provide. I also appreciate how MOOCs have brought us together to discuss issues around technology, teaching, and learning. At the same time, I cringe at the narratives around big data, I cringe at the hype, at the ignorance around what education is and should be about.
I want to talk about two topics today: MOOC research and the MOOC phenomenon.
On MOOC Research
- We don’t know much about MOOCs
- The things that we know about MOOCs are mostly the result of surveys, learning analytics, and big data research
- The existing research and the existing methods that we use are informative, BUT they simply paint an incomplete picture of MOOCs. We should be asking more in-depth questions about learner and instructor experiences in MOOCs
- Qualitative and interpretive research methods can and will help us better understand MOOCs, open learning, and open scholarship
- Descriptions of learner behaviors are helpful, but these descriptions only provide a glimpse and superficial summary of what students experience and what they do in digital learning environments. To give you an example, emerging research suggests that students may be “sampling” courses; a behavior that we don’t frequently see in traditional online courses or traditional face-to-face courses. Nonetheless, “sampling” is not how participants would describe their experiences or the ways they participate in MOOCs. To illustrate, consider family-style Mediterranean meals that consist of numerous dishes, where participants sample a wide array of food. If you ask a person to describe this meal, to explain it to someone else, or to simply tell you about the meal, they will likely describe the meal as a feast, they might describe the tahini as lemony, the variety of flavors as intriguing, the whole meal as satisfying. Different people will also describe the meal differently: Tourists might describe the meal as fulfilling, heavy, or even extravagant; locals might describe the same meal as appropriate, or better or worse than meals that they have had at other restaurants. “Sampling” may be an appropriate descriptor of the act of eating a family-style meal, or exploring a MOOC, but the descriptor does not fully capture the experience of sampling.
On the MOOC as a Phenomenon
MOOCs. The acronym stands for massive, open, online courses. That is not what MOOCs are though. MOOCs are a phenomenon. They represent something larger than a course and should be seen in conjunction with the rebirth and revival of educational technology. They represent symptoms of, and responses to, failures facing Higher Education. For instance, MOOCs are a response to the increasing costs of Higher Education; represent the belief that the purpose of education is to prepare students for the workforce; represent the belief that technology is the solution to the problems that education is facing; are indicative of scholarly failures; seem to represent the belief that education is a product that can be packaged, automated, and delivered; and, are a response to failures by researchers, designers, administrators, and institutions to develop effective and inspiring solutions to the problems of education (alternatively, they might also represent the failure of existing systems to support creative individuals in enacting change)*.
The MOOC is an acronym that elicits strong feelings: excitement, fear, defiance, uncertainty, hope, contempt…. To address these feelings we have to address the failures of higher education and the underlying causes that have given rise to MOOCs. For this reason, instead of talking about MOOCs at my own institution, I discuss innovations and approaches that I value, including networked scholarship, openness, flexibility, social learning, and the design and development of new technologies.
* NOTE: Rolin Moe and I are working on a paper refining and delineating these. If you have thoughts, concerns, or input on any of these issues, we’d love to hear from you!
HarvardX and MITx released a number of reports describing their open courses. The overarching paper describing these initiatives, entitled HarvardX and MITx: The first year of open online courses, is really helpful in gaining a holistic understanding of what HarvardX and MITx learned about their initiatives.
My (preliminary) thoughts:
- It’s exciting to see more data
- It’s exciting to see education researchers involved in the analysis of the data
- The researchers should be congratulated for making these reports available in an expeditious manner via SSRN
- We need more interpretive/qualitative research to understand participants’ and practitioners’ experiences on the ground
- I am wondering whether the community would benefit from access to the data that HarvardX and MITx have, as other individuals/groups could run additional analyses. Granted, I imagine this might require quite a lot of effort, not least in the development of procedures for data sharing.
The course reports appear below; these are quite helpful in enabling the community to understand the particulars of each course:
- 3.091x Introduction to Solid-State Chemistry – Fall 2012 MITx Course Report
- 6.00x Introduction to Computer Science and Programming – Fall 2012 MITx Course Report
- 6.002x: Circuits and Electronics – Fall 2012 MITx Course Report
- 2.01x Elements of Structures – Spring 2013 MITx Course Report
- 3.091x Introduction to Solid-State Chemistry – Spring 2013 MITx course report
- 6.00x Introduction to Computer Science and Programming – Spring 2013 MITx Course Report
- 6.002x: Circuits and Electronics – Spring 2013 MITx Course Report
- 7.00x Introduction to Biology: The Secret of Life – Spring 2013 MITx Course Report
- 8.02x Electricity and Magnetism – Spring 2013 MITx Course Report
- 14.73x: The Challenges of Global Poverty – Spring 2013 MITx Course Report
- 8.MReV: Mechanics ReView – Summer 2013 MITx Course Report
A facebook conversation from yesterday encouraged me to share one of the assignments that I developed for my instructional design course. The goal of the class is for the students to understand, experience, and apply instructional design in a variety of educational contexts.
One of the assignments I developed asked students to enroll in a Massive Open Online Course (MOOC) and analyze the instructional materials within the course using one of the rubrics provided by Dick and Carey (the instructional design book we use in class). It was a lot of fun, and the students appreciated the exercise. Given the lack of presence and voice of instructional designers in MOOC happenings, the lack of valid, reliable, and serious research that exists on the topic (though Rita Kop’s work on cMOOCs is admirable), and my desire to engage students in contemporary events, I came up with this assignment to embed MOOC analysis in my course. The assignment is available for download at https://dl.dropbox.com/u/2533962/instr-materials-veletsianos.doc and posted below for those who just want to skim it without downloading it. Enjoy and feel free to use it:
Instructional Material analysis assignment
Individually, you will examine and report on the instructional materials of one popular digital learning initiative. An analysis matrix will be provided to you, and you will use that matrix to evaluate these initiatives.
Length: Minimum 500 words.
| Criteria | Levels of Attainment | Points |
| --- | --- | --- |
| Written analysis (evaluation) | | |
This task requires a few hours of research before you can actually complete it. Even though this is an individual task, if you would like to discuss the assignment with any of your colleagues, please feel free to do so.
First, read the chapter and the rest of the materials for this week. Without reading those, I can assure you that your understanding of the issues presented will be superficial.
Second, examine the rubric provided by Dick & Carey for evaluating instructional materials (pp. 250-251 – see below for the rubric). You will be completing this rubric for a digital environment, and it’s a good idea to understand what it encompasses before you proceed.
Third, select one course provided on one of the following platforms to examine:
- A course on Coursera (select a course that is occurring right now or has been completed. DO NOT select a course that has not started yet): https://www.coursera.org/courses
- A course on EdX (select a course that is occurring right now. DO NOT select a course that has not started yet): https://www.edx.org/courses
- A free course on Udemy (select a course that includes at least 5 “lessons/lectures”): http://www.udemy.com/courses
You can also choose to examine DS106: http://ds106.us/. I am including DS106 on its own because it is a course, as opposed to the above (Coursera, EdX, and Udemy), which are platforms. If you pick any of these three (Coursera, EdX, or Udemy), then you should also pick a course within it (e.g., within Coursera a possible course is https://www.coursera.org/course/friendsmoneybytes).
Once you have made your selection, it’s time to research your course. Spend time looking around, examining and evaluating the instructional materials provided. You will use the rubric to keep track of the criteria that need to be assessed, and then draw on it to write a report assessing the instructional material for the course.
You should start your report by stating the course and its provider. A link would also be helpful. For example, using the example above, I would start my report by stating the following:
“I am examining the course entitled Networks: Friends, Money and Bytes (https://www.coursera.org/course/friendsmoneybytes). This course is offered through Coursera and is taught by Mung Chiang, who is a Professor of Electrical Engineering at Princeton University. The course is an introduction to the topic of X and its objectives are XYZ.”
Your report should be specific and detailed in its evaluation of instructional material, and should be guided by the five criteria families discussed by DC: Goal-centered, learner-centered, learning-centered, context-centered, technical criteria. I would like to see that you understand each criterion and that you are capable of applying it to evaluating your course. For example, at the very least, I would expect to see statements such as the following:
Instructional designers use five criteria families to evaluate instructional materials. Learner-centered criteria focus on XYZ and refer to X. The instructional materials for this course appear to be adequate for this criterion because <provide list of reasons here>. The course could be improved in this domain by <list of additions/revisions here>. However, because item X was not disclosed in the course, I am not able to evaluate Y.
Let me reiterate that to complete this assignment you will need to do background research on the course and the platform. For example, your background research on Coursera will reveal that some of these courses have more than 80,000 students from around the world. This fact alone will impact your evaluation!
Instructional Material Evaluation Rubric
Rubric is copyright of: Dick, W., Carey, L. & Carey, J. (2008). Systematic Design of Instruction, (7th ed.) Upper Saddle River, NJ: Pearson.
A. Goal-centered Criteria:
Are the instructional materials:
|1. Congruent with the terminal and performance objectives?|
|2. Adequate in content coverage and completeness?|
|6. Objective in presentations (lack of content bias)?|
B. Learner-centered Criteria:
Are the instructional materials appropriate for learners’:
|1. Vocabulary and language?|
|2. Development level?|
|3. Background, experience, environment?|
|4. Experiences with testing formats and equipment?|
|5. Motivation and interest?|
|6. Cultural, racial, gender needs (lack bias)?|
C. Learning-centered Criteria:
Do the materials include:
|1. Pre-instructional material?|
|2. Appropriate content sequencing?|
|3. Presentations that are complete, current and tailored for learners?|
|4. Practice exercises that are congruent with the goal?|
|5. Adequate and supportive feedback?|
|6. Appropriate assessment?|
|7. Appropriate sequence and chunk size?|
D. Context-centered Criteria:
Are/do the instructional materials:
|1. Authentic for the learning and performance sites?|
|2. Feasible for the learning and performance sites?|
|3. Require additional equipment/tools?|
|4. Have congruent technical qualities for planned site (facilities/delivery system)?|
|5. Have adequate resources (time, budget, personnel availability and skills)?|
E. Technical Criteria:
Do the instructional materials have appropriate:
|1. Delivery system and media for the nature of objectives?|
|3. Graphic design and typography?|
|6. Audio and video quality?|
|7. Interface design?|
I’m working through my thoughts with this blog entry, as I’ve been trying to use this space to think out loud about my work and what I see happening in online education and higher ed.
A lot has been written about MOOCs and accreditation, and a lot more will be forthcoming. For example, see Terry Anderson’s post on this.
Today, I ran across this quote in an article in Time Magazine:
…if Liu passes the graduate-level Harvard course she is taking for free through edX — one of the leading providers of massive open online courses, or MOOCs — she will be granted 7.5 credit hours, which her school district has agreed to accept as a form of professional development that can help her earn a higher salary. Liu might be among the first students nationwide to turn free online coursework into tangible college credit, but that number may soon grow exponentially.
What is the value of a critique?
The value of critique is to help us see a phenomenon through a different lens, to help us make sense of something in a different way, and to spark a conversation. This is the purpose, and value, of a paper we recently published with IRRODL on the topic of open scholarship.
The paper identifies the assumptions and challenges of openness and open scholarship and attempts to put forward suggestions for addressing them. A summary of our paper appears below:
Many scholars hope and anticipate that open practices will broaden access to education and knowledge, reduce costs, enhance the impact and reach of scholarship and education, and foster the development of more equitable, effective, efficient, and transparent scholarly and educational processes. Wiley and Green (2012, p. 88) note that “only time will tell” whether practices of open scholarship will transform education or whether the movement “will go down in the history books as just another fad that couldn’t live up to its press.” Given the emerging nature of such practices, educators are finding themselves in a position in which they can shape and/or be shaped by openness (Veletsianos, 2010). The intention of this paper is (a) to identify the assumptions of the open scholarship movement and (b) to highlight challenges associated with the movement’s aspirations of broadening access to education and knowledge. The goal of this paper is not to frame open scholarship as a problematic alternative to the status quo. Instead, as we see individuals, institutions, and organizations embrace openness, we have observed a parallel lack of critique of open educational practices. We find that such critiques are largely absent from the educational technology field, as members of the field tend to focus on the promises of educational technologies, rarely pausing to critique their assumptions. Selwyn (2011b, p. 713) even charges that our field’s inherent positivity “limits the validity and credibility of the field as a site of serious academic endeavour.” Our intention is to spark a conversation with the hopes of creating a more equitable and effective future for digital education and scholarship. To this end, this paper is divided into three major sections. First, we review related literature to introduce the reader to the notion of open scholarship. Next, we discuss the assumptions of openness and open scholarship. We then identify the challenges of open scholarship and discuss how these may limit or problematize its outcomes.
Common assumptions and challenges are summarized as follows:
| Common themes and assumptions | Challenges |
| --- | --- |
| Open scholarship has a strong ideological basis rooted in an ethical pursuit for democratization, fundamental human rights, equality, and justice. | Are these ideals essential components of the open scholarship movement, or are they merely incidental to those who are pioneering the field? |
| Open scholarship emphasizes the importance of digital participation for enhanced scholarly outcomes. | Scholars need to develop an understanding of participatory cultures and social/digital literacies in order to take full advantage of open scholarship. University curricula need to be redesigned to prepare future scholars for the changing nature of scholarship. |
| Open scholarship is treated as an emergent scholarly phenomenon that is co-evolutionary with technological advancements in the larger culture. | Technology both shapes and is shaped by practice. Technology is not neutral, and its embedded values may advance tensions and compromises (e.g., flat relationships, homophily, filter bubbles). |
| Open scholarship is seen as a practical and effective means for achieving scholarly aims that are socially valuable. | Open scholarship introduces new dilemmas and needs (e.g., personal information management challenges; social stratification and exclusion). |
Given the topic, the best home for this paper was the International Review Of Research In Open And Distance Learning, through which you can download the paper for free in an open access manner:
I am in Cyprus to meet with a number of colleagues and give an invited talk at ICEM 2012.
Talk title: What does the future of design for online learning look like? Emerging technologies, Openness, MOOCs, and Digital Scholarship
Abstract: What will we observe if we take a long pause and examine the practice of online education today? What do emerging technologies, openness, Massive Open Online Courses, and digital scholarship tell us about the future that we are creating for learners, faculty members, and learning institutions? And what does entrepreneurial activity worldwide surrounding online education mean for the future of education and design? In this talk, I will discuss a number of emerging practices relating to online learning and online participation in a rapidly changing world and explain their implications for design practice. Emerging practices (e.g., open courses, researchers who blog, students who use social media to self-organize) can shape our teaching/learning practice, and teaching/learning practice can shape these innovations. By examining, critiquing, and understanding these practices we will be able to understand potential futures for online learning and be better informed on how we can design effective and engaging online learning experiences. This talk will draw from my experiences and research on online learning, openness, and digital scholarship, and will present recent evidence detailing how researchers, learners, and educators are creating, sharing, and negotiating knowledge and education online.