In July of 2010, I published Emerging Technologies in Distance Education with Athabasca University Press. The book was published in print (for purchase) and e-book (open access) formats. In the spirit of openness, I shared the book’s download statistics one year after publication. It’s time for an update. Here it goes…
Writing an academic book is not about royalties. I’m elated when people read my work, and the value of that is immeasurable. So, thank you to all of you who downloaded and read this book – and above all, thank you, once again, to all the authors who contributed to this volume.
The most recent download statistics, 3 years after publication, show that:
- The full book was downloaded approximately 12,000 times.
- The full book and individual chapters were downloaded about 32,000 times.
- The three most downloaded chapters were:
  - Developing Personal Learning Networks for Open and Social Learning (pdf)
  - Personal Learning Environments (pdf), by Trey Martindale & Michael Dowdy
  - A Definition of Emerging Technologies for Education (pdf)
The download statistics, broken down by month, are as follows:
The book, or chapters of it, have been used in the following courses:
- EDTECH 597: Social Network Learning, Boise State University (Fall 2010)
- EDU 7271: Information and Communication: Social and Conventional Networks, Northeastern University (Spring 2011, Fall 2011)
- EDU 6407: Essentials of Multimedia for Distance Learning, Northeastern University (Spring 2011)
- PLENK 2010: Personal Learning Environments Networks and Knowledge, Athabasca University and the University of Prince Edward Island (Fall 2010)
- OLIT 538: E-learning Course Design, University of New Mexico (Fall 2010)
- EDUC60602: Teaching and Learning with Emerging Technologies, University of Manchester, UK (Spring 2011)
- EDEE 203: Technology in Education, The Open University of the Philippines
- EDTC 6432: Computer Authoring, Seattle Pacific University
- EDLD 871 Special Topics in Instructional Leadership: Focus on K-12 Virtual Schools, University of Louisiana at Lafayette
- Emerging Technologies to Improve Teaching and Learning in Higher Education, Cape Higher Education Consortium (University of Cape Town, University of Stellenbosch, University of the Western Cape, Cape Peninsula University of Technology)
- Exploring Personal Learning Networks: Practical issues for organizations (Fall 2013), Northwestern University
- EDU-681100, Learning with Emerging Technologies: Theory and Practice, (Fall 2013), State University of New York, Empire State College
[If you are using the book or chapters of it in courses that are not listed above, I'd love to hear about it!]
I am sitting at a coffee shop in Vancouver, BC reflecting on my time at COHERE 2013. This was my first Canadian conference since moving to Victoria, and it was a great opportunity to meet and spend time with colleagues (many of them Canadians) including Tony Bates, Rory McGreal, Martha Cleveland-Innes, David Porter, Diane Janes, Diane Salter, Jenni Hayman, Richard Pinet, Robert Clougherty, and Cindy Ives. It was also great to see Ron Owston, Frank Bulk, and Kathleen Matheos again – and my colleagues Vivian Forssman and BJ Eib were there too! The conference was relatively small and the sessions were 40 minutes long, allowing ample time and space for conversations, networking, and debates. I really appreciated the intimate atmosphere it afforded us for spending time with each other. The organizers (Kathleen Matheos and Stacey Woods) did a fantastic job!
Cable Green from Creative Commons delivered the first keynote and David Porter from BC Campus delivered the second. I sat on a respondent panel for Cable’s keynote and argued three points: (a) we need to build on and go beyond open educational resources and think about open practices, (b) each of us needs to take action in supporting openness (e.g., by teaching sharing as a value and literacy), and (c) we need to recognize that “open” is under threat of being subverted. It was fascinating to sit on a panel with four others and see how our responses to the keynote differed, but how they all coalesced around similar messages as well.
I also gave a presentation discussing early findings from my research into learners’ experiences in MOOCs, open courses, and other open learning environments; you might be interested in Tony Bates’ take on this research.
These findings are not yet fully refined and analyzed. However, in thinking about these results, reading the literature and claims around MOOCs, and considering recent developments in educational technology, I am beginning to see MOOCs more and more as a symptom of chronic failures of the educational system to tackle significant issues. On the one hand, I and others have argued that MOOC creators have ignored research into how people learn and how people learn with technology. Tony Bates in particular (see the last link) is very clear when he asks, “Why is MIT ignoring 25 years of research into online learning and 100 years research into how students learn in its design of online courses?”
On the other hand, however, the rise of MOOCs seems to be a symptom of a series of failures and pressures. I like the argument that George Siemens makes in relation to inadequate university approaches to educational needs: “Universities have failed to recognize the pent-up demand for learning as the economy has diversified and society has become more complex and interconnected. As a consequence, the internet has contributed by creating a shadow education system where learners learn on their own and through social networks. MOOCs reflect society’s transition to a knowledge economy and reveal the inadequacy of existing university models to meet learner’s needs.” I’d like to take this argument further. As a field, we could do more to have greater impact on the design and development of educational technology solutions, including MOOCs. Steps toward that would include sharing our research more broadly and in different ways (e.g., publishing in open access venues and putting theory into practice), engaging in what Tom Reeves calls socially-responsible research that solves real problems, working across disciplines, reconsidering the ways that we understand, evaluate, and reward impact at our institutions, and so on. More on these issues soon!
I attended Educause 2013 this year, largely at the invitation of Tanya Joosten and Amy Collier to participate on a panel exploring what makes technology pilots successful. The panel was entitled Prepare for Lift-Off: Becoming a Successful IT Pilot Site. Laura Pasquini took some notes on a Google Doc and Tanya posted the slide deck here. The session was described as follows:
“Your campus is an innovator in many ways, and you’ve been approached to be a pilot site for a new campus IT product. You’d like to say yes to the idea, but you’re not sure you have the infrastructure to make it work. Join a panel of your university colleagues to learn the ropes and discover what it takes to successfully deliver and host technology pilots on your campus. The panelists will offer a dynamic conversation on the importance of stakeholder involvement, faculty engagement and selection, faculty development and support, technical infrastructure, student support, research and evaluation, and critical steps your institution needs to take to ensure your pilot not only flies but soars.”
Photo by Jason Jones
This was my first time at the conference. My goal throughout was to explore this group’s horizon: what this community currently sees as promising initiatives for higher education. In summary, the focus was on competency-based learning, learning analytics, and MOOCs. Openness was relatively absent. Research was largely absent. Vendor-driven solutions were pervasive, and I left yearning to know more about innovations created and implemented by learning designers and/or by institutions themselves.
There were two innovations that I have been thinking about since the event:
- I had a lovely chat with Rob Farrow, who shared with me the work that the Open University is doing with the OER Research Hub. The project aims to collect evidence in relation to the claims surrounding openness, and more specifically to answer the question ‘What is the impact of OER on learning and teaching practices?’ Given my beliefs about the immense value that research brings to educational technology, you can see why I was excited about the topic.
- The second innovation that I learned about was Class Mob, a prototype developed through the Breakthrough Models Academy. There are some interesting projects in that link, but I thought that Class Mob represented a truly novel idea centered on the development of an alternative educational system that supported learners, accounted for what we know about teaching/learning, encouraged corporations to extend traditional higher education, and empowered individuals to have a say in their education.
In mid-August, I posted a note asking for your vote on a panel I proposed for SXSWedu consisting of Tanya Joosten, Amy Collier, Audrey Watters, and myself. The topic was: Startups Should Talk with Researchers and Educators.
I’m uber excited to report that our proposal has been accepted. We will be headed to Austin in March to discuss how researchers and educators can contribute to the design, development, refinement, and ultimately effectiveness of learning technologies and educational technology.
Tanya described why she is interested in this topic on her blog. I thought I would do the same, especially as Tanya, Amy, and I were at Educause this week discussing issues that educational institutions need to consider when piloting technologies and innovations.
A lot of you may not know this, but I have a degree in computer science, and way back when, for my undergraduate thesis, I developed software enabling real-time interactions between students and instructors that emulated classroom processes by allowing students to “raise their hand” to ask questions, make comments, etc. This was nothing spectacular, unique, or groundbreaking. Yet, it was my first attempt at developing educational technology to solve a (perceived) problem. Since then, I have concurrently done design/development work and research, and I see myself as both a researcher and a designer. Some of the projects I have worked on are AvenueASL (a language learning and e-assessment platform), Project Engage (a dual credit course and online learning environment introducing students to the Big Ideas relating to Computer Science), Geothentic (an online environment immersing students in Geography through situated, real-world problem-solving), and AL through Water and MOSS (an online learning environment supporting science learning via outdoor exploration). I don’t only write about learning technologies. I also build them.
How does one reconcile design and development (D&D) work with research? My perspective is that it’s not enough to study what happens with educational technology. Studying, analyzing, critiquing, and questioning educational technology is very important. It’s imperative. But we need to take the additional step of using the research to (a) design and develop educational applications, and (b) inform others about what the research says so that they can develop effective technology-based solutions based on what we know about teaching and learning. Hence the need for this panel.
I was also motivated to put together this panel after participating in SXSWedu 2013. One of the sessions I attended last year focused on business models for educational technology. One of the panelists noted that their commitment to their investors is profit, not learning outcomes. I’m not naive. Entrepreneurship is important and we should support and reward it in various ways. However, putting profits before learning outcomes is corrosive and dangerous. The biggest losers in such a setup will be learners, the idea of the university, and the idea of education. Our panel at SXSWedu is an attempt to add some sense to the conversation, to ‘add the “edu” to “sxswedu” ‘ (I think that’s a Laura Pasquini quote, but I might be mistaken). It is also an attempt to explain to startups and vendors how they can have their cake and eat it too, how they can make meaningful, and much needed, change in education without necessarily sacrificing other goals that they have.
Whether you are an educator, a startup company, a researcher, a reporter, or an administrator, please join us – we’d love to have you!
For your information, here is our panel’s description: Education is facing numerous challenges. Educational technology startups promise solutions. However, entrepreneurs seem to disregard the knowledge that educators and researchers have amassed that can help startups address these challenges, or, at least, help them avoid repeating the mistakes of the past. At the same time, we were astounded by how few educators and researchers were sharing their knowledge at last year’s SXSWedu conference. The event felt more like a vendor gathering than what the SXSWedu website describes as “meaningful conversation and collaboration around promising practices and tools for improved learning.” If we want meaningful and transformational change in how we do education, it is imperative for entrepreneurs and educators/researchers to converse. In this interactive panel, we will discuss how educators/researchers can help startups improve their products and answer questions pertaining to education research, how people learn, and classroom practice.
You are invited to attend the first Professional Development webinar sponsored by the AECT Research & Theory Division!
Dr. David Merrill
Instructional Effectiveness Consultant & Professor Emeritus at Utah State University
October 17, 2013 at 1:30 P.M. (EDT)
My Hopes for the Future of Instructional Technology
This short paper presents reasons for three hopes for the future. First, it is time to move the training of instructional designers to the undergraduate level. Second, I hope that graduate programs in instructional technology will emphasize both the science of instruction — including theory development and research — and the technology of instruction, including using principles, models and theories derived from research as a foundation for designing instructional design tools that can be used to design instruction that is more effective, efficient and engaging. Third, it is time to restructure master’s programs to prepare students to manage designers-by-assignment (DBA) and to prepare them in designing instructional design tools that would enable DBA to produce more effective, efficient and engaging instructional materials.
Enilda Romero-Hall, Ph.D.
Min Kyu Kim, Ph.D.
Research & Theory Division Professional Development Facilitators
I’m excited to announce the publication of an open access e-book on learners’ experiences with open learning and MOOCs. The book consists of ten chapters by student authors and one introductory chapter by me. Part pedagogical experiment, part exploratory investigation into learners’ experiences with emerging forms of learning, the book aims to capture and share student stories of open online learning.
This publication is necessary for a number of reasons.
First, from a pedagogical perspective, whenever possible we should be asking students to do a discipline, not just read about it. On this occasion, students were asked to do open online learning and reflect/write about their experience, instead of just reading about the field and the experiences of others.
Second, in the frenzy surrounding the rise of “edtech” and MOOCs, it seems that student voices and experiences are rarely considered. This e-book is an attempt to remind designers and developers that the learning experience should be a central tenet of attempts to reform education. Let’s all remind ourselves that what we should be designing is learning experiences – not products for efficient consumption.
Third, the examination of learning experiences with open learning and MOOCs in the literature is scant. Further, recent literature tends to gravitate towards big data and analytics, and while those research endeavors are worthwhile, they tend to generate abstract descriptions of learner behaviors. A holistic understanding of learner experiences should include both investigations of patterns of how learners behave as well as in-depth qualitative descriptions of what learning in open environments is like. To illustrate, learning analytics research suggests that there are a number of ways learners typically engage with a course (e.g., completing, auditing, disengaging, sampling). Complementary to this, our book generates nuanced descriptions of some of these categories. For example, even though one of the authors would be considered to have completed a MOOC, he “was left with a partial sense of accomplishment and feelings of hollowness and incompleteness.”
This book makes two scholarly contributions. They can be summarized as follows, but for in-depth descriptions, please read my full chapter, which is simultaneously published on Hybrid Pedagogy:
- The realities of open online learning are different from the hopes of open online learning.
- We only have small pieces of an incomplete mosaic of students’ learning experiences with open online learning.
As with the emerging technologies in distance education book that I edited in 2010 (also available as open access), please don’t hesitate to send me an email to let me know what you think about this book. I’d love your thoughts! If you are teaching a class on emerging learning environments, open education, online learning, and other related topics, and you find this book helpful as reading material, I’d love to hear about how you are using it!
P.S. The book is published on GitHub, which means that you can effortlessly improve and expand on this work. If you want to learn more about this, Kris Shaffer, who was instrumental in making our GitHub project happen, wrote an excellent article on GitHub and publishing.
The 4th edition of the Handbook of Research on Educational Communications and Technology recently arrived at my office.
My graduate student and I have a chapter in this 1000-page volume (!) that attempts to summarize the 2005-2011 literature focusing on pedagogical agents and virtual characters. We believe that this chapter can help orient individuals to the field and its research literature. The approach that we followed was simple: We first analyzed the claims made about pedagogical agents in the literature. Then, we presented the empirical evidence surrounding those claims.
The abstract summarizes our findings:
In this chapter we synthesize the pedagogical agent literature published during 2005–2011. During these years, researchers have claimed that pedagogical agents serve a variety of educational purposes such as being adaptable and versatile; engendering realistic simulations; addressing learners’ sociocultural needs; fostering engagement, motivation, and responsibility; and improving learning and performance. Empirical results supporting these claims are mixed, and results are often contradictory. Our investigation of prior literature also reveals that current research focuses on the examination of cognitive issues through the use of experimental and quasi-experimental methods. Nevertheless, sociocultural investigations are becoming increasingly popular, while mixed methods approaches, and to a lesser extent interpretive research, are garnering some attention in the literature. Suggestions for future research include the deployment of agents in naturalistic contexts and open-ended environments, and investigation of agent outcomes and implications in long-term interventions.
As always, a pdf of the paper is available below.
Veletsianos, G., & Russell, G. (2014). Pedagogical agents. In Spector, M., Merrill, D., Elen, J., & Bishop, M. J. (Eds.), Handbook of Research on Educational Communications and Technology (4th ed., pp. 759-769). Springer Academic.
Even though the concept of the Massive Open Online Course has become wildly popular during the last year, empirical research on these initiatives is largely absent.
On the one hand, this is not surprising. The fact that the research that exists in the literature falls under the case study approach is not surprising either. Historically, the research that characterizes emerging practices has been formative and focused on specific case studies (Dede, 1996). Research on connectivist MOOCs is available (e.g., see Fournier’s and Kop’s work), but research on other types of open courses is just slowly starting to emerge (e.g., see the work of the Lytics Lab and the research pertaining to P2PU). I hope and expect that a forthcoming special issue from JOLT focusing on MOOCs will add much needed insight.
The important questions that I believe we should be asking at this point are: What education-specific research will be beneficial to the field? What do we need to know? And how should we go about investigating what we need to know about? Systematic empirical research can (a) generate a deeper understanding of this phenomenon, (b) provide evidence to support or refute the claims surrounding MOOCs, and (c) help universities and MOOC providers enhance course offerings.
What follows is a set of research questions that, if answered, will generate insights into learner/instructor experiences, outcomes, practices, and interaction in massive open online courses:
- What are the learning outcomes of MOOCs?
- Who successfully completes MOOCs? What are the shared characteristics of the individuals who successfully complete MOOCs? For instance, past research shows that there’s a strong positive relationship between prior knowledge and learning (Dochy, Segers, & Buehl, 1999). It would not be a stretch to expect this to transfer to MOOCs.
- Why do learners sign-up for MOOCs? Note that this is an empirical question. We can surmise why they do, but asking them may yield different answers… or may bolster what we already think we are seeing.
- What factors cause learners to persist or cease participation in MOOCs? The concepts of “dropping out” and “retention” are not new (e.g., The Chronicle of Higher Education ran a story in 2000 entitled “As Distance Education Comes of Age, the Challenge Is Keeping the Students“), have already been examined in the broader online learning literature (e.g., Park & Choi, 2009), and a number of models exist to explain dropout (e.g., Bean & Metzner, 1985). Recent evidence highlights that academic locus of control and self-regulation are factors that mediate persistence in online learning (Lee, Choi, & Kim, 2013). However, the concept of “drop out” has historically been associated with for-credit endeavors. With large numbers of individuals seemingly enrolling in MOOCs out of sheer interest and curiosity, and perhaps merely exploring their options, what new knowledge can we gain about this issue? Koller, Ng, Do, and Chen (2013) add nuance to this discussion with the idea that “student intent” is important, which I think is worthwhile. However, even with this variable in mind, we should still ask: what factors cause learners to persist or cease participation? Intent can be defined ex post facto by looking at the Coursera data, but intent changes over time. For example, one may sign up for a course intending to complete it, but for various reasons (e.g., unrealistic expectations, lack of time, bad course design) may cease participation. Conversely, one may sign up for a course to simply explore a topic but may stay (e.g., a supportive community encourages ongoing participation).
- What is the learning experience like in a MOOC? How does this experience differ across designs and pedagogical models?
- How do learning communities and groups develop, grow, and dissipate in MOOCs, in both online spaces (e.g., Facebook groups) and face-to-face spaces (e.g., mediated by Meetups)?
- What factors are critical in sustaining learner interest, motivation, and participation in a MOOC?
A number of initiatives are in place at present to examine MOOCs. For example, HarvardX has established a research committee headed by Andrew Ho, a professor of education, to conduct research on EdX; Justin Reich is joining the HarvardX team as a Research Fellow; George Siemens, Valerie Irvine, and Jillianne Code are editing a special issue focused on MOOCs for the Journal of Online Learning and Teaching (pdf); and the Journal of Universal Computer Science is also hosting a special issue focused on Interaction in MOOCs (pdf). Such initiatives will go a long way in providing much needed empirical results on the topic.
Dede, C. (1996). Emerging technologies and distributed learning. American Journal of Distance Education, 10(2), 4-36.
Dochy, F., Segers, M., & Buehl, M. (1999). The relation between assessment practices and outcomes of studies: The case of research on prior knowledge. Review of Educational Research, 69(2), 145-186.