Readers of this blog may be interested in a short encyclopedia entry, summarizing the adventure learning approach to education that colleagues and I are using in a few of our projects (e.g., YoTeach and AL Water expeditions). The paper also summarizes productive avenues for future research:
Veletsianos, G. (2012). Adventure Learning. In Seel, N. (Ed.), Encyclopedia of the Sciences of Learning (pp. 157-160). Springer Academic.
This is part of my ongoing reflection on MOOCs. See the rest of the entries here.
I have signed up for a number of MOOCs as a student (and led one of the #change11 weeks), and a couple of weeks ago I spoke in general terms about how education research can help improve the type of education offered through a MOOC. In this post, I will give specific suggestions, focusing on the University of Pennsylvania MOOC Listening to World Music, offered through Coursera. I am enrolled in this course, which started on June 23, and I just submitted the first assignment. I decided to post these thoughts early for two reasons. First, the beginning of any course is an important moment in its success, and I find that it takes a lot of planning and reflection. Second, MOOCs are discussed as experiments in online education. The Atlantic even calls them “The single most important experiment in higher education.” I agree that they are experimental initiatives, and as such they would benefit from ongoing evaluation and advice. Where I disagree is with the notion that they are a departure from what we know about online (and face-to-face) education. This post is intended to highlight a couple of items that the Coursera instructional designers and learning technologists could have planned for in order to develop a more positive learning experience.
1. Length of the video lectures.
The syllabus lists the length of the video lectures (e.g., video 1 is 10:01 minutes long and video 2 is 10:45 minutes long). However, this length is not provided on the page that students visit to watch the videos, which is where they need that information. I’ve annotated this below.
2. Opportunities for interaction.
The platform provides forums for students to interact with each other. Learners are, of course, resourceful and will figure out alternative, more efficient and effective ways to communicate with each other if they need to. For instance, in a number of other MOOCs students set up Facebook groups, and I anticipate that this will happen here as well. What Coursera could do to support learners in working with each other is integrate social media plugins within each course. I am surprised that this isn’t prominent within the course, because you can see from the images below that Coursera uses social media plugins to allow students to announce their participation in the course:
For instance, it appears that the course uses the #worldmusic hashtag, though it is not integrated within the main page of the course, nor does it seem to be a unique hashtag associated with the course.
3. How do you encourage students to watch the videos?
Let’s say that we added the length of each video next to its title. Now the learner knows that they need about an hour to watch the videos. Some learners (e.g., those who are intrinsically motivated by the topic) will watch them without much scaffolding. But how do you provide encouragement for others to do so? Here’s where some social psychology insights might be helpful. By providing learners with simple descriptions of how the majority of their colleagues are behaving (i.e., appealing to social norms), one might be able to encourage individuals to watch the videos. For example, the videos might include a dynamic subtitle that informs learners that “8 out of 10 of your peers have watched this video” or that “70% of your peers have completed the first assignment,” and so on. This is the same strategy that hotels use to encourage guests to reuse towels, and the same strategy that Nike uses when it compares your running patterns to those of other runners, as shown in the image below:
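To make the idea concrete, here is a minimal sketch of how a platform might generate such a social-norm subtitle. Everything here is hypothetical (the function name, the view counts, and the decision to suppress the message when the majority is not yet behaving the desired way are my assumptions, not anything Coursera actually does):

```python
def social_norm_message(viewers: int, enrolled: int,
                        action: str = "watched this video") -> str:
    """Format a social-norm nudge such as
    '8 out of 10 of your peers have watched this video'."""
    if enrolled <= 0:
        return ""
    # Round to the nearest "x out of 10" to keep the message simple and familiar.
    out_of_ten = round(10 * viewers / enrolled)
    # Social-norm appeals work best when the majority is already behaving the
    # desired way; advertising "2 out of 10" could backfire, so stay silent.
    if out_of_ten < 6:
        return ""
    return f"{out_of_ten} out of 10 of your peers have {action}"

# Hypothetical counts: 8,350 of 10,400 enrolled students watched the video.
print(social_norm_message(8350, 10400))
# -> 8 out of 10 of your peers have watched this video
```

The same helper could be reused for the assignment-completion message by passing a different `action` string.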
4. Peer-grading expectations.
This course is different from others that I’ve participated in because it includes an element of peer-grading. This is exciting to me because I’m a firm believer in social learning. One minor concern, however, is the following: I don’t know how many peers I am supposed to evaluate. I thought I was supposed to evaluate just one, but each time I finish an evaluation, I am presented with the option to “evaluate next student.” Do I keep evaluating? How many do I need to evaluate before I can move to the next step? I don’t know. In other words, it is always helpful to inform learners of what they have to do. In my case, I simply stopped after evaluating four peers because I did not know how many I was expected to evaluate. Perhaps there’s no minimum… and this information would be helpful to me as a learner.
Overall, my experience with this course is positive, though there is a lot of room for improvement here, which is to be expected. For example, I haven’t touched much on the pedagogy of the course, but there are a few more weeks left… so stay tuned!
Nike photo credit. Thanks to my colleague Chuck Hodges for directing my attention to the Nike example.
This entry is part of a reflective series of posts/questions relating to online learning, MOOCs, and openness. See the first one here.
Coursera announced today that it is adding a dozen or so universities as partners. In an article in the New York Times, Sebastian Thrun notes that MOOC courses are still experimental and argues: “I think we are rushing this a little bit,” he said. “I haven’t seen a single study showing that online learning is as good as other learning.”
This perception of online education as “better than” or “as good as” other forms of education (I imagine that Sebastian Thrun is referring to face-to-face education here) is rampant. I believe it is rampant because our field has not done a good job disseminating what we know and what we don’t know about online education. At the same time, individuals do not tend to go back to the foundations of the field to investigate what others have discovered.
The result: a lack of understanding that there’s a whole field out there (here?) that has developed important insights on how we can design online education effectively. The references at the end of this post are merely a few of the resources one can use to get started on what we know and what we don’t know about comparison studies (i.e., studies that compare learning between delivery modes).
The point of this entry is to argue that there is no point in reinventing the wheel. There is no point in making the same mistakes. And above all, past research has shown that there is no point in studying whether online education is as good as (or as bad as) other forms of education, because what one will discover is that:
- There are no significant differences in learning outcomes between face-to-face education and online education.
- When differences are found between the two, they can be attributed to (a) pedagogy or (b) a lack of controls in the experimental design.
It is important to point out that the effectiveness of an educational approach is influenced greatly by other variables, such as instructor support or pedagogical approach. Therefore, it is very difficult (if not impossible) to compare face-to-face and online education: when one is not a replication of the other, they are vastly different, are based on different learner-instructor interactions, and offer different affordances. While researchers have tried to minimize differences and compare face-to-face learning and online learning in experimental ways, the interventions end up being meaningless for the types of powerful online/face-to-face teaching we might envision. Comparing delivery mechanisms, therefore, blinds us to the important variables that truly impact how people learn.
The important and informative questions to ask are not comparative. Rather they focus squarely on understanding online education:
- How can we design effective and engaging online education (e.g., MOOCs)?
- What is the nature of learning in a MOOC?
- What is the learning experience in a MOOC like?
These questions are difficult. They won’t be answered by comparing survey responses and they won’t generate one simple answer. They will however generate answers that will be different depending on context. And that’s what’s exciting about doing research on online education.
- Means, B., Toyama, Y., Murphy, R., Bakia, M., & Jones, K. (2009). Evaluation of Evidence-Based Practices in Online Learning: A Meta-Analysis and Review of Online Learning Studies. U. S. Department of Education. http://www.ed.gov/rschstat/eval/tech/evidence-based-practices/finalreport.pdf
- Shachar, M., & Neumann, Y. (2010). Twenty Years of Research on the Academic Performance Differences Between Traditional and Distance Learning: Summative Meta-Analysis and Trend Examination. Journal of Online Learning and Teaching, 6(2). http://jolt.merlot.org/vol6no2/shachar_0610.htm
- Clark, R. E. (1983). Reconsidering research on learning from media. Review of Educational Research, 53(4), 445-459.
- Clark, R. E. (1985). Evidence for confounding in computer-based instruction studies: Analyzing the meta-analyses. Educational Communication and Technology Journal, 33(4), 249-262.
- Kozma, R. B. (1994). Will media influence learning? Reframing the debate. Educational Technology Research and Development, 42(2), 7-19.
- Tennyson, R. D. (1994). The big wrench vs. integrated approaches: The great media debate. Educational Technology Research & Development, 42(3), 15-28.
- Veletsianos, G., & Navarrete, C. (2012). Online Social Networks as Formal Learning Environments: Learner Experiences and Activities. The International Review of Research in Open and Distance Learning, 13(1), 144-166.
- Veletsianos, G. (2011). Designing Opportunities for Transformation with Emerging Technologies. Educational Technology, 51(2), 41-46.
- Veletsianos, G. (2010). A Definition of Emerging Technologies for Education. In G. Veletsianos (Ed.), Emerging Technologies in Distance Education (pp. 3-22). Athabasca University Press.
At the end of last year’s AERA meeting, we passed on the leadership of the Computer & Internet Applications in Education SIG to a new group of individuals. I was honored to serve the SIG as Chair and Program Chair over the last two years, and wish the new officers great success in the years to come.
You can support them by considering the SIG for your AERA proposals and by joining the SIG at its annual meeting. The new Chair’s welcome message follows:
Dear Computer and Internet Applications in Education (CIAE) SIG members and colleagues,
We would like to use this opportunity to introduce the new SIG officers and to invite all of you to join us at the 2013 AERA Annual Meeting, which will be held in San Francisco on April 27-May 1, 2013. We would like to thank the previous members of the SIG executive committee (Dr. George Veletsianos, Dr. Charles Miller, and Dr. Cassie Scharber) for their valuable efforts in advancing the SIG and organizing the sessions and activities at AERA 2012. The new SIG officers elected to serve are Dr. Evrim Baran (chair), Dr. Amy Pittenger (program chair), and Dr. Zeni Colorado (treasurer). We are now working to organize the SIG sessions and activities for the AERA 2013 conference. We would like to thank all of the reviewers who volunteered to help us during the review process.
The purpose of the CIAE SIG is to promote research, teaching, and service on the design, evaluation, and critical use of computer and Internet applications in education. We strive to be a dynamic group, given the dynamic and ever-changing landscape of educational environments that involve computer and Internet applications. We are excited about the potential of computer and Internet applications to help us reconsider our current educational practices and to design innovative and critical solutions for learners, teachers, and practitioners in educational settings. Our SIG’s scope, vision, and membership profiles reflect interest and scholarship in the following themes:
o Evolving contexts in educational technology: Design, integration, and evaluation of educational technology
o The future of hybrid and online education (e.g., extreme, adventure, scenario, and game-based learning)
o Affordances of emerging technologies and approaches for the design and evaluation of learning spaces (e.g., information visualization tools, online collaborative learning technologies, mobile platforms, learning analytics, cloud computing, usability tools)
o Technology leadership for successful technology integration in education: In-depth studies throughout the world
o Contemporary issues in computers and Internet applications in education (e.g., digital literacy, media literacy, privacy, security)
Our members also have expertise in a wide range of research methodologies, such as design-based research, case studies, experimental design, mixed methods, action research, ethnography, surveys, and content analysis, to name a few. We hope to advance the research and scholarly conversation in CIAE with your contributions and presence in our SIG. Please consider submitting a proposal to the SIG and joining us at our Facebook group to converse with SIG members or to learn how to participate actively in SIG activities. Please feel free to distribute this information to anyone who would be interested in joining the SIG.
One last reminder concerns the AERA 2013 Annual Meeting submissions. Please remember that this year proposals should be submitted by July 23, 2012, at 11:59 PM Pacific Time. More information on submissions can be found on the AERA website.
Feel free to contact me, or any member of the SIG executive committee, if you have questions.
I look forward to welcoming you to the AERA community. Thank you for your support of the AERA CIAE Special Interest Group and of education research.
Have a great summer!
Evrim Baran, Ph.D.
Chair, AERA Computer and Internet Applications in Education SIG
Assistant Professor of Educational Sciences
Middle East Technical University, Turkey
This entry is part of a reflective series of posts/questions relating to online learning, MOOCs, and openness.
MOOCs are everywhere nowadays. Depending on the lens one uses to examine them, Coursera, Udacity, EdX, the connectivist MOOCs (e.g., #ds106, Change11), and the rest are generating hope, excitement, uneasiness, and frustration. An important question that one needs to ask is: What is the purpose of a MOOC?
MOOCs have different purposes. For example, some MOOCs are built on the idea of democratizing education and enhancing societal well-being. See Curt Bonk’s MOOC types, targets, and intents for additional MOOC purposes.
Other MOOCs are built on the idea of improving specific skills. Today’s EdSurge newsletter included the following note:
GOOGLE’S FIRST MOOC comes in the form of a “Power Searching with Google” course consisting of six 50-minute classes on how to search “beyond the ten blue links.” Classes just started and at last count, over 100,000 people have already registered. Google promises to go way beyond the 101 stuff and dive into advanced features. We’re ready: we’ve been a little stumped at finding a query “to search exclusively in the Harvard University website to find pages that mention clowns.”
Let’s unpack this a bit. What is the purpose of this MOOC? It will help users make better use of Google’s search capabilities. It will also help Google experiment with offering MOOC-type courses and reinforce consumer loyalty.
How does the Google MOOC fare with regards to enhancing societal well-being? Rather than offering courses to teach users how to search better, I would have rather seen Google develop online courses specifically aimed at reducing societal inequalities and enhancing well-being. I would have rather seen a course on “using our tools for speaking out against oppressive regimes” or “using our tools to facilitate the development of community in your neighborhood” or “using our tools to design and develop your own online class.”
I hope that this course is not the last that we see from Google, and that rather than focusing on teaching users a specific skill set, future courses focus on supporting the development of societal well-being.
During the Fall semesters, I teach a course on the foundations of Instructional Design for our MA and PhD students. Two years ago, I shared my syllabus. Last year, I shared one of my favorite activities, in which I ask students to create a digital story comparing two instructional design models. This activity is part of the AECT open content portal, where you can find additional learning objects for educational technology courses.
I am now in the process of redesigning my course to be taught online in the Fall, and I thought I’d share three of the videos that I will be using in case others find them of interest.
The first video describes IDEO’s design process and is intended to introduce students to design thinking:
I use this second video as a discussion prompt for cognitive theories of learning.
And this third one is an instructional video from 1927 that I use to initiate a conversation about efficiency and effectiveness in designing learning materials (and, as a bonus, one from 1937).