Category: online learning
At the School of Education and Technology at Royal Roads University, we are very excited to be redesigning our MA in Learning and Technology. We will share more about the program in the near future, but for now we'd love any input that you may have on one of the courses my colleague Elizabeth Childs and I are designing. The course is called Digital Learning Environments, Networks, and Communities. The link sends you to a Google Doc that hosts a very rough first draft of the course. We would love to hear your thoughts, critiques, ideas, gaps, etc. on the Google Doc. Are we missing important details/readings? Are there additional activities that we should consider? What questions do you have? How can this course be better?
Some background information on the program follows.
Context: This is the first course in a two year MA degree in Learning and Technology (33 credits). The degree is offered in two modes: fully online and blended. The online group of students and the blended group of students come together in the third course. Thereafter, they continue together and complete the rest of the degree fully online.
The program is founded upon principles of networked learning, open pedagogy, personalization, relevance, and digital mindsets. Students collaborate and contribute meaningfully to digital learning networks and communities in the field. Graduates will be able to create and evaluate digital learning environments. Students will apply theoretical and practical knowledge to critically analyze learning innovations and assess their impact on organizations and society.
The program responds to the demand for qualified professionals in the field of technology-mediated learning and education. It addresses the need for individuals who have the knowledge, skills and ability to assume the leadership roles that are required to plan, design, develop, implement and evaluate contemporary learning initiatives. Following several foundational courses, students transition into the inquiry-focused portion of the program. Next, they create digital learning resources based on personalized learning plans and facilitate a student-designed and student-led seminar experience that requires them to draw upon the networks and community(ies) they have been contributing to and cultivating over the duration of the program.
My colleague Ash Shaw and I are working on a book. The book aims to highlight student voices in online learning. The main aims are to surface the experiences of online learners in an evocative and accessible manner, synthesize literature on the topic, and present our original work. Below is our draft table of contents. If you have a couple of minutes, could you take a look at it and let us know if there are any topics/debates/issues that might be of interest to the average faculty member and student that we are missing?
| # | Topic | Summary and questions answered |
|---|-------|--------------------------------|
| 2 | Demographics | Examines who today's online learners are and how online learners' demographics have changed over time. Who are today's online learners? How many students enroll in online courses nationally and globally? How have demographics changed over time? |
| 3 | Who succeeds? (or, The online paradox) | Investigates why students who take online courses have higher degree-completion rates even though online courses are characterized by higher attrition rates. |
| 4 | Motivations | Investigates the reasons that individuals take online courses. Shows that students take online courses for a variety of reasons, and reveals that reasons differ depending on the type of online course (e.g., some learners take MOOCs for different reasons than other online courses). |
| 5 | Digital Literacies | Examines the skills required to participate productively in online courses and the need to develop those skills. |
| 6 | Note-taking | Uses note-taking to illustrate that online learning research focused on tracking student activity on platforms alone is insufficient to understand the human condition and hence improve learning outcomes. |
| 7 | Self-directed learning | Investigates self-directed learning as a process necessary for contemporary learners to develop and apply. |
| 8 | Openness | Investigates the meaning of the term openness in the context of online learning. |
| 9 | Personalized learning | Examines efforts to develop adaptive learning software and automate instruction (system control), and juxtaposes those efforts with designs that allow learners to personalize their own learning (learner control). Explores instructor strategies and designs to personalize learning. |
| 10 | Flexibility | Examines the ways that online courses can be designed to accommodate learners' lives and allow flexible participation. Investigates issues of modality and (a)synchronicity. |
| 11 | Social Media | Investigates how social media are used in online courses and shows how intentional integration of such tools can lead to positive outcomes. |
| 12 | Loneliness or "The student who watched videos alone" | Examines how online learning can be a lonely and isolating experience and proposes strategies for enhancing presence and immediacy. |
| 13 | Emotions | Shows that learning online is an emotional experience, calling for a more caring pedagogy and critiquing calls to employ online learning simply to make educational offerings more efficient. |
| 14 | Lurking or "The student who learned as much by just watching videos" | Investigates the topic of lurking. Highlights the visible and invisible practices that online learners engage in. Demonstrates… |
| 15 | Time or "The student who stole time from his family to study" | Explores the topic of time management in online students' lives, and investigates how courses can be designed to fit with the complexity of learners' day-to-day realities (e.g., work and family requirements). |
| 16 | Dropout, Attrition, and Persistence | Explores the topic of attrition, as online courses often face higher attrition rates than alternatives. |
| 17 | Instructor | Examines the role of the instructor in online learning environments. Investigates instructor presence and support, and explores how instructors can contribute to meaningful and effective learning experiences. |
| 18 | Online vs. face-to-face learning | Investigates whether face-to-face learning is better than online learning. Presents the empirical research on the question and highlights (a) how different forms of education serve different needs, and (b) how learning design is a more significant factor in determining learning outcomes than modality. |
| 19 | MOOCs or "The student who completed 200 courses: And other, less profound, online learning experiences" | Explores the topic of MOOCs and summarizes the empirical research that exists on the topic. Explains the origins of the term, the different designs, and how the concept has evolved over time, with particular emphasis on students' experiences in MOOCs. |
| 20 | The Learning Management System and Next-Generation Digital Learning Environments | Investigates the idea that Learning Management Systems contribute little to student learning. Proposes that courses are "nodes in a network" as opposed to hermetic containers of knowledge. Shows how course design differs between these two ideas. |
| 21 | Challenges and remediation strategies | Investigates the challenges that online learners face and the strategies that they and others employ to remediate them. |
Athabasca University Press has just published Emergence and Innovation in Digital Learning, a book I edited that owes its existence to the insightful authors who contributed their chapters on the topic. Like other titles published by AU Press, the book is open access.
Emerging technologies (e.g., social media, serious games, adaptive software) and emerging practices (e.g., openness, user modeling) in particular, have been heralded as providing opportunities to transform education, learning, and teaching. In such conversations it is often suggested that new ideas – whether technologies or practices – will address educational problems (e.g., open textbooks may make college more affordable) or provide opportunities to rethink the ways that education is organized and enacted (e.g., the collection and analysis of big data may enable designers to develop algorithms that provide early and critical feedback to at-risk students). Yet, our understanding of emerging technologies and emerging practices is elusive. In this book, we amalgamate work associated with emergence in digital education to conceptualize, design, critique, enhance, and better understand education.
If you’ve been following the conversations in the last two years, there will be some themes that you’ll recognize here. To mention a few: defining emerging technologies; not-yetness; data mining; technology integration models; open and social learning; and sociocultural aspects of MOOCs.
In the days that follow, I will summarize each chapter here.
Over the last year or so, we’ve interviewed more than 200 individuals who have participated in a number of open courses. We are working on a project in which we are using learner narratives and vignettes from these interviews to help administrators, faculty, researchers, and learning designers understand learners and improve their learning experience. Though there are many ways to understand learners (e.g., dashboards), we believe that in-depth vignettes of typical experiences may allow for greater sensitivity to learners’ lifeworlds and realities. We will be using these stories to problematize various aspects of digital learning. Each story will be followed by a longer analysis of the issues raised in the story. For now, below is one such (DRAFT!) story. What do you think? Is there anything else that you’d like to see in this narrative? Is it interesting? If you are an administrator, faculty member, researcher, or learning designer, does this story add anything valuable?
Title: Why not?
Theme: Open learning opportunities are oftentimes costless and relatively risk-free.
Mary and her demanding Pomeranian, Kylie, live deep in the heart of Texas. “I have a passion for the law!” the thirty-year-old exclaimed when we called her on her landline. She had seriously considered going to law school and had even passed her LSAT, the law school entrance exam used by US universities. But having just finished four intense years of a bachelor’s degree, she decided to wait a bit. “Law school just didn’t seem like a good choice at the time,” she reflected. Five years later, Mary has settled into her work as a business consultant. Her interest in the law is still keen, and she’s never completely given up the dream of law school, but it’s been tempered with a bit of realism. “I don’t know if I can afford to spend another three years in the classroom,” she confided to us, “I don’t know if I still have the same passion for the legal industry as I did five years ago.”
During an afternoon enjoying frozen mango margaritas with a friend, trying to cope with the scorching sun, Mary learned about MOOCs. Shortly thereafter, she signed up for a number of courses, dabbling in some and promptly forgetting about others. One day, ContractsX, a course on contract law taught by a Harvard professor, popped up on her screen and she decided to “give it a shot”. What had she got to lose? “It’s a free class, taught at one of the more well-respected institutions. Why not?!” she laughed.
The course was flexible and fit into her busy life. On Saturday mornings she would sit in her office, with Kylie by her side and a warm cup of dark roast coffee in her hand, and use her trusted iPad to watch Harvard Law lectures. These weren’t just any lectures. Professor Fried was a masterful storyteller, a king of his trade. It was through these short, interesting, and memorable stories that Professor Fried taught concepts relating to contract law. “I can’t believe that I’m sitting here, learning this material from Harvard Law!” The fast pace and densely packed content made the course challenging, Mary acknowledged, and she didn’t always do as well as she would have liked on the course tests. But, as she was able to go back to review the answers and re-watch the videos, this didn’t stress her too much, and she ended up passing the course with flying colours. Proud of her certificate of accomplishment, Mary enthused, “It makes me want to keep coming back for more!”
Even though it was a personal interest in the law that led her to sign up for this course, Mary has found what she learned in ContractsX helpful when she has to deal with contracts in her own job. She has enthusiastically recommended the course to co-workers and friends. She’s currently taking a number of other open courses and is anxiously awaiting the second version of the Contracts course. While Mary’s dream of attending law school may not have changed, her confidence in herself has: “I never thought of applying to Harvard. There was no way I would be getting in. But then, five years later, I’m taking a course from Harvard. I wouldn’t say that I’m a Harvard law student, but at least now I could sit across from a Harvard law student and have a clear conversation with them. It’s very rewarding to know that.”
A number of literature reviews have been published on MOOCs. None has focused exclusively on the empirical literature. In a recent paper, we analyzed the empirical literature published on MOOCs in 2013-2015 to make greater sense of who studies what and how. We found that:
- more than 80% of this literature is published by individuals whose home institutions are in North America and Europe,
- a select few papers are widely cited while nearly half of the papers are cited zero times,
- researchers have favored a quantitative if not positivist approach to the conduct of MOOC research,
- researchers have preferred to collect data via surveys and automated methods,
- some interpretive research was conducted on MOOCs in this time period, but it was often basic, and only a minority of studies employed methods traditionally associated with qualitative research (e.g., interviews, observations, and focus groups),
- there is limited research reported on instructor-related topics, and
- even though researchers have attempted to identify and classify learners into various groupings, very little research examines the experiences of learner subpopulations (e.g., those who succeed vs those who don’t; men vs women).
We believe that the implications arising from this study are important for research on educational technology in general and not just MOOC research. For instance, given the interest in big data and automated collection/analysis of the data trails that learners leave behind on digital learning environments, a broader methodological toolkit is imperative in the study of emerging digital learning environments.
Here’s a copy of the paper:
Veletsianos, G., & Shepherdson, P. (2016). A systematic analysis and synthesis of the empirical MOOC literature published in 2013–2015. The International Review of Research in Open and Distributed Learning, 17(2).
The attention that education and educational technology are receiving is significant, and I feel fortunate to be able to participate in efforts to improve education. Nonetheless, for the past couple of years I have been bombarded with announcement after announcement of what the latest and greatest technology can do for education. These announcements are almost always filled with claims about their potential impact. Here’s a clipping of a recent email I received:
Let’s pause. Research from ISTE “found that the use of education technology (EdTech) resulted in 35 percent of students showing higher scores on class assessments and 32 percent increased engagement”? The link pointed to this post, written by the amazing Wendy Drexel. Let’s home in on what Wendy actually wrote:
“Nationwide, we are seeing powerful results from the effective use of technology in classrooms. For example, results of research by ISTE and the Verizon Foundation earlier this year into the use of education technology had teachers reporting that 35 percent of their students showed higher scores on classroom assessments; 32 percent showed increased engagement; and 62 percent demonstrated increased proficiency with mobile devices. In fact, 60 percent of participating teachers also reported that by using their mobile devices, they provided more one-on-one help to students, and 47 percent said they spent less time on lectures to the entire class.”
These are interesting and worthwhile results. But they are mischaracterized in the email above. From the summary of the research posted on the ISTE site and a more detailed report of the research (pdf), we can begin to see how some edtech companies purport that there is evidence of impact and use that to further their cause. Here’s a summary of two issues ignored by the ad/email:
- The email claims that the use of edtech resulted in 35% of students scoring higher on classroom assessments and 32% showing increased engagement. What the research actually reported about this particular area is the following: teachers reported that edtech use led to increased scores/engagement. In plainer language, the teachers said that their students did better. We don’t know whether they actually did.
- The email paints a direct and unequivocal relationship between edtech use and outcomes: “use of educational technology resulted in.” That’s not actually the case. Why?
- The actual research showed that even though there were differences in math and science outcomes between schools that participated and schools that did not, the results were not statistically significant. In other words: similar results could be expected without the use of this particular technology.
- How is the technology used? The research report notes: “During site visits, observers noted that edtech-using teachers used technology to efficiently facilitate drill and practice test preparation activities.” In other words: edtech helps with teaching to the test, and that seems to work. Put differently: we have powerful technologies that empower people to be creative and allow global collaboration, but have created systems that put teachers in situations in which they have to use these tools in simplistic ways.
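To make the “not statistically significant” point concrete, here is a minimal permutation test in Python. The scores and group sizes below are invented for illustration; they are not taken from the ISTE/Verizon study.

```python
import random

# Hypothetical assessment scores for edtech-using vs. comparison schools.
# These numbers are made up for illustration only.
edtech = [72, 75, 78, 74, 79, 77, 73, 76]
control = [71, 74, 76, 73, 77, 75, 72, 74]

def mean(xs):
    return sum(xs) / len(xs)

observed = mean(edtech) - mean(control)  # raw difference between group means

# Permutation test: repeatedly shuffle the group labels and count how often
# a difference at least as large as the observed one arises by chance alone.
random.seed(0)
pooled = edtech + control
n = len(edtech)
trials = 10_000
extreme = 0
for _ in range(trials):
    random.shuffle(pooled)
    if mean(pooled[:n]) - mean(pooled[n:]) >= observed:
        extreme += 1

p_value = extreme / trials
print(f"observed difference: {observed:.2f}, p ~ {p_value:.3f}")
```

If the resulting p-value is large (conventionally above 0.05), the observed gap is the sort that label-shuffling alone produces regularly, which is what “not statistically significant” means: the raw difference between groups is real, but it is not evidence that the technology caused it.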
The British Journal of Educational Technology and BERA approached us to create an infographic for the article we (Amy Collier, Emily Schneider, and I) published last year: Digging Deeper into Learners’ Experiences in MOOCs: Participation in social networks outside of MOOCs, Notetaking, and contexts surrounding content consumption
Below is the outcome (and a pdf version is here):
“Personalized learning” is that one area of research and practice that brings to the forefront many of the debates and issues that the field is engaging with right now. If one wanted to walk people through the field, and wanted to do so through *one* specific topic, that topic would be personalized learning.
Personalized cans? (CC-licensed image from Flickr)
Here are some of the questions that personalized learning raises:
- We have a problem with labels and meaning in this field. Heck, we have a problem with what to call ourselves: Learning Technologies or Educational Technology? Or perhaps instructional design? Learning Design? Learning, Design, and Technology? Or is it Learning Science? Reiser asks: What field did you say you were in? The same is true for personalized learning. Audrey Watters and Mike Caulfield ask what does “personalized learning” mean and what is the term’s history? Does it mean different pathways for each learner, one pathway with varied pacing for each learner, or something else?
- The Chan-Zuckerberg initiative and the Bill and Melinda Gates Foundation endorse personalized learning. What is the role of philanthropy in education in general and educational technology in particular? Should educators and researchers “beware of big donors” or should they enthusiastically welcome the support in the current climate of declining public monies?
- Where is the locus of control? Is personalization controlled by the learner? Is the control left to the software? What of shared control? Obsolete views of personalization and adaptive learning focus on how the system can control both the content and the learning process, ignoring, for the most part, the learner, even though learner control appears to be an important determinant of success in e-learning (see Singhanayok & Hooper, 1998). The important question in my mind is the following: How do we balance system and learner control? Such shared control should empower students and enable technology to support and enhance the process. Downes distinguishes between personalized learning and personal learning. I think that locus of control is the distinguishing aspect, and that the role of shared control remains an open conceptual and empirical question. Debates about xMOOCs vs. cMOOCs fall in here, as does the debate regarding the value of guided vs. discovery learning.
- How do big data and learning analytics improve learning and participation? What are the limitations of depending on trace data? Personalized learning often appears to depend on the creation of learner profiles. For example, if you fit a particular profile you might receive a particular worked-out example or semi-completed problem, and problems might vary as one progresses through a pathway. Or, you might get an email from Coursera about “recommended courses” (see my point above regarding definitions and meanings). Either way, the role of large datasets, analytics, and educational data science – as well as the limitations and assumptions of these approaches, as we show in our research – is central to personalization and new approaches to education.
- What assumptions do authors of personalized learning algorithms make? We can’t answer this question unless we look at the algorithms. Such algorithms are rarely transparent. They often come in “black box” form, which means that we have no insight into the processes by which inputs are transformed into outputs. We don’t know the inner workings of the algorithms that Facebook, Twitter, and Google Scholar use, and we likely won’t know how the algorithms that EdTechCompany uses work to deliver particular content to particular groups of students. If independent researchers can’t evaluate the inner workings of personalized learning software, how can we be sure that such algorithms do what they are supposed to do without being prejudicial? Perhaps the authors of education technology algorithms need a code of conduct, and a course on social justice?
- Knewton touts its personalization engine. Does it actually work? Connecting this to broader conversations in the field: What evidence do we have about the claims made by the EdTech industry? Is there empirical evidence to support these claims? See, for example, this analysis by Phil Hill on the relationship between LMS use and retention/performance and this paper by Royce Kimmons on the impact of LMS adoption on outcomes. If you’ve been in the position of making a technology purchase in K-12/HigherEd, you have likely experienced the unending claims regarding the positive impact of technology on outcomes and retention.
- And speaking of data and outcomes, what of student privacy in this context? How long should software companies keep student data? Who has access to the data? Should the data follow students from one system (e.g., K-12) to another (e.g., Higher Ed)? Is there uniformity in place (e.g., consistent learner profiles) for this to happen? How does local legislation relate to educational technology companies’ use of student data? For example, see this analysis by BCCampus describing how British Columbia’s Freedom of Information and Protection of Privacy Act (FIPPA) impacts the use of US-based cloud services. The more one looks into personalization and its dependence on student data, the more one has to explore questions pertaining to privacy, surveillance and ethics.
- Finally, what is the role of openness in personalized learning? Advocates for open frequently argue that openness and open practices enable democratization, transparency, and empowerment. For instance, open textbooks allow instructors to revise them. But what happens when the product that publishing companies sell isn’t content? What happens when the product is personalized learning software that uses OER? Are the goals of the open movement met when publishers bundle OER with personalized learning software that restricts the freedoms associated with OER? What becomes of the open agenda to empower instructors, students, and institutions?
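The profile-driven content selection and the system-vs.-learner control questions raised above can be sketched in a few lines of Python. Every rule, threshold, and name below is invented for illustration and does not describe any real personalization engine.

```python
from dataclasses import dataclass
from typing import Optional

@dataclass
class LearnerProfile:
    prior_score: float      # 0..1, performance on the previous unit (hypothetical)
    prefers_examples: bool  # self-reported preference (hypothetical)

def next_activity(profile: LearnerProfile,
                  learner_choice: Optional[str] = None) -> str:
    """Pick the next activity. An explicit learner choice overrides the
    system's rule, modeling shared (learner + system) control."""
    if learner_choice is not None:
        return learner_choice              # learner control wins
    if profile.prior_score < 0.5:
        return "worked-out example"        # system-chosen remediation
    if profile.prefers_examples:
        return "semi-completed problem"
    return "full problem set"

print(next_activity(LearnerProfile(0.4, True)))                    # worked-out example
print(next_activity(LearnerProfile(0.9, False)))                   # full problem set
print(next_activity(LearnerProfile(0.9, False), "video lecture"))  # video lecture
```

Even a toy like this makes the black-box critique tangible: unless the thresholds and rules are inspectable, neither learners nor independent researchers can tell why a given activity was served, or whether the rules are prejudicial.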
There’s lots to contemplate here, but the point is this: Personalized learning is ground zero for the field and its debates.