One of the main arguments that we made in our recent paper on MOOCs, which is also the argument that I continue in this op-ed piece published in Inside Higher Ed, is that the field needs to embrace diverse research methods to understand and improve digital learning. The following passage is from our paper, and given that the paper is quite long, I thought that posting it here might be helpful:
By capturing and analyzing digital data, the field of learning analytics promises great value in understanding and improving learning and teaching. The focus on big data, log file analyses, and clickstream analytics in MOOCs reflects a broader societal trend toward big data analytics (Eynon, 2013; Selwyn, 2014) and toward greater accountability and measurement of student learning in higher education (Leahy, 2013; Moe, 2014). As technology becomes integrated in all aspects of education, the use of digital data and computational analysis techniques in education research will increase. However, an over-reliance on log file analyses and clickstream data to understand learning leaves many learner activities and experiences invisible to researchers.
While computational analyses are a powerful strategy for making a complex phenomenon tractable to human observation and interpretation, an overwhelming focus on any one methodology will fail to generate a complete understanding of individuals’ experiences, practices, and learning. The apparent over-reliance on MOOC platform clickstream data in the current literature poses a significant problem for understanding learning in and with MOOCs. Critics of big data in particular question what is missing from large data sets and what is privileged in the analyses of big data (e.g., boyd & Crawford, 2012). For instance, contextual factors such as economic forces, historical events, and politics are often excluded from clickstream data and analyses (Carr, 2014; Selwyn, 2014). As a result, MOOC research frequently examines learning as an episodic and temporary event that is divorced from the context that surrounds it. While the observation of actions in digital learning environments allows researchers to report activities and behaviors, such reporting also needs an explanation as to why learners participate in MOOCs in the ways that they do. For example, in this research, participants reported that their participation in MOOCs varies according to the daily realities of their lives and the context of the course. Learners’ descriptions of how these courses fit into their lives are a powerful reminder of the agency of each individual.
To gain a deeper and more diverse understanding of the MOOC phenomenon, researchers need to use multiple research methods. While clickstream data generate insights on observable behaviors, interpretive research approaches (e.g., ethnography, phenomenology, discourse analysis) add context to them. For example, Guo, Kim, and Rubin (2014) analyzed a large data set of MOOC video-watching behaviors, found that the median length of time spent watching a video is six minutes, and recommended that “instructors should segment videos into short chunks, ideally less than 6 minutes.” While dividing content into chunks aligns with psychological theories of learning (Miller, 1956), this finding does not explain why learners watch videos for a median of six minutes. Qualitative data and approaches can equip researchers to investigate the reasons why learners engage in video-watching behaviors in the ways that they do. For example, the six-minute median might be associated with learner attention spans. On the other hand, multiple participants in this study noted that they were fitting the videos in between other activities, and thus shorter videos might be desirable for practical reasons: they fit into individuals’ busy lives. Different reasons might be uncovered that explain why learners seem to engage with videos for six minutes, leading to different design inspirations and directions. Because the MOOC phenomenon, and its associated practices, are still at a nascent stage, interpretive approaches are valuable because they allow researchers to generate a refined understanding of the meaning and scope of MOOCs. At the same time, it is important to remember that a wholly interpretive approach to understanding learning in MOOCs would be equally deficient.
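Findings like Guo et al.’s six-minute median are typically derived by aggregating per-view watch durations out of platform event logs. As a rough illustration of what that computation involves (the event schema below is hypothetical, not any real platform’s log format):

```python
from statistics import median

# Hypothetical clickstream events: (user_id, video_id, event, timestamp_seconds).
# This schema is illustrative only -- real MOOC platform logs differ.
events = [
    ("u1", "v1", "play", 0),  ("u1", "v1", "pause", 360),
    ("u2", "v1", "play", 10), ("u2", "v1", "pause", 250),
    ("u3", "v1", "play", 5),  ("u3", "v1", "pause", 425),
]

def watch_durations(events):
    """Pair each play event with the next pause for the same (user, video)."""
    open_plays = {}
    durations = []
    for user, video, event, ts in events:
        key = (user, video)
        if event == "play":
            open_plays[key] = ts
        elif event == "pause" and key in open_plays:
            durations.append(ts - open_plays.pop(key))
    return durations

# Median watch time in minutes across all views.
print(median(watch_durations(events)) / 60)  # → 6.0
```

Note what this sketch cannot tell us: whether the learner paused to take notes, was interrupted by family, or simply lost interest, which is exactly the gap interpretive methods fill.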
Combining methods and pursuing an understanding of the MOOC phenomenon from multiple angles, while keeping in mind the strengths and weaknesses of each method, is the most productive avenue for future research.
A computational analysis and data science discourse is increasingly evident in educational technology research. This discourse posits that it is possible to tell a detailed and robust story about learning and teaching by relying on the depth and breadth of clickstream data. However, the findings in our research reveal meaningful learner activities and practices that evade data-capturing platforms and clickstream-based research. Off-platform experiences as described above (e.g., notetaking) call into question claims about learning that are limited to the activities observable on the MOOC platform. Further, the reasons why course content is consumed in particular ways exemplify the opportunity to bring together multiple methodological approaches to researching online learning and participation.
What do learning experiences in MOOCs look like? Amy Collier, Emily Schneider and I have just published a paper that provides some in-depth answers to this question. Here is a copy of the paper in pdf. The paper is part of a special issue published by the British Journal of Educational Technology which can be found here (there are many excellent pieces in that issue, so be sure to read them).
In addition to trying to understand learner experiences, in the paper we explain that we did this study because “ease of access to large data sets from xMOOCs offered through an increasing number of centralized platforms has shifted the focus of MOOC research primarily to data science and computational methodologies, giving rise to a discourse suggesting that teaching and learning can be fully analyzed, understood and designed for by examining clickstream data.”
Our abstract reads:
Researchers describe with increasing confidence what they observe participants doing in massive open online courses (MOOCs). However, our understanding of learner activities in open courses is limited by researchers’ extensive dependence on log file analyses and clickstream data to make inferences about learner behaviors. Further, the field lacks an empirical understanding of how people experience MOOCs and why they engage in particular activities in the ways that they do. In this paper, we report three findings derived by interviewing 13 individuals about their experiences in MOOCs. We report on learner interactions in social networks outside of MOOC platforms, notetaking, and the contexts that surround content consumption. The examination and analysis of these practices contribute to a greater understanding of the MOOC phenomenon and to the limitations of clickstream-based research methods. Based on these findings, we conclude by making pragmatic suggestions for pedagogical and technological refinements to enhance open teaching and learning.
We reported three main findings:
1. Interactions in social networks outside of the MOOC platform
A number of learners alluded to interactions they have had with individuals who are part of their social networks. These include digital connections with other participants in a MOOC, face-to-face interactions with friends and family, and face-to-face interactions with new connections in a MOOC.
2. Notetaking

Although none of the popular MOOC platforms supported integrated notetaking at the time of writing the paper, nearly all interviewees reported taking notes while watching lecture videos. Only one interviewee never took notes. However, the tools used to take notes and the subsequent use of those notes varied substantially by learner.
3. Consuming content
All individuals participating in this study discussed factors that shaped the ways they consumed MOOC content, shedding light on the context surrounding their participation. Scholars in the learning sciences have long highlighted the critical role of the environment, arguing that learning must be understood as a sociocultural phenomenon situated in context and culture (Brown, Collins & Duguid, 1989). Patterns of MOOC content consumption can be examined by clickstream data, but these contextual factors help explain why learners exhibit particular patterns of participation.
Veletsianos, G., Collier, A., & Schneider, E. (2015). Digging deeper into learners’ experiences in MOOCs: Participation in social networks outside of MOOCs, notetaking, and contexts surrounding content consumption. British Journal of Educational Technology, 46(3), 570-587.
At AERA this week, Amy Collier, Emily Schneider, and I will be presenting a paper that makes a series of arguments regarding learner activities and experiences in MOOCs in relation to clickstream-based MOOC research. One of the implications of our work is the following: learners’ participation and experiences in these courses resist binary and monolithic interpretations as they appear to be mediated by a digital-analog continuum as well as a social-individual continuum. In other words, learning and participation in MOOCs are both distributed and individually-socially negotiated. The following visual (which provides some hints on our results) makes this point clearer:
* And since the work of peer reviewers often goes unrecognized, let it be known that this insight was prompted by a comment from one anonymous reviewer. So, whoever you are, thank you for your input.
I joined Audrey Watters, Philipp Schmidt, Stephen Downes, and Jeremy Friedberg in Toronto last week, to give a talk at Digital Learning Reimagined, an event hosted and organized by Ryerson University’s Chang School. I presented some of our latest research, and tried to highlight research findings and big ideas in 15 minutes. Below are my slides and a draft of my talk.
Welcome everyone! It’s a pleasure and an honor to be here. Even though I’m the person giving this talk, I’d like to acknowledge my collaborators. A lot of the work that I am going to present is collaborative and it wouldn’t have been possible without such amazing colleagues. These are: Royce Kimmons from the University of Idaho, Amy Collier and Emily Schneider from Stanford University, and Peter Shepherdson from the University of Zurich. The Canada Research Chairs program, the National Science Foundation and Royal Roads University have funded this work.
I want to start my talk by telling a story.
This castle that you see here is one of the most recognizable parts of Royal Roads University (RRU). But don’t let the castle fool you. RRU was created in 1995. Its purpose was to serve the needs of a changing society by serving working professionals through graduate digital education and multidisciplinary degrees. It has grown since 1995. It has matured, developed a social learning model that is now infused in all courses, developed new areas of focus, forged global partnerships, and continues to explore how to improve what it does through pedagogical and technological approaches.
Why am I sharing this short story about RRU?
Because this story, minus the specific details, is a common story. It’s also a Ryerson story, a story that is played out at the University of Southern New Hampshire, a story that Open Universities around the world have gone through. It is a story that repeats itself over and over, for years and years.
What is the essence of the story?
It is often assumed that universities have been static, unchanging since the dawn of time. The short story I shared illustrates that universities are, and have always been, part of the society that houses them, and as societies change, universities change to reflect those societies. The economic, sociocultural, and technological pressures that universities are facing are sizable, and for better or for worse, usually for both, there is a continuous re-imagination of education throughout time. Universities have always been changing.
As universities are changing and exploring different ways to offer education, faculty, researchers, and administrators engage in a number of practices that I like to describe as emerging. Emerging practices & emerging technologies are those that are not necessarily new, not yet fully researched, but appear promising.
Online learning and openness are examples of emerging practices.
Online learning has a long history. But it also has a new history, with the development of multimedia platforms, media that can be embedded across platforms, syndication technologies that enable learners to use their own platforms for learning, and so on. So, even though some of the problems that online learners face in contemporary settings are not new (e.g., dropout), learners’ ability to congregate in online communities is expanded through newer technologies, and that poses different sorts of challenges and opportunities.
Another emerging practice is openness. Openness refers to liberal policies for the use, re-use, adaptation, and redistribution of content. Openness is also a value: it refers to adopting an ethos of transparency with regards to access to information. This ethos ranges from academics publishing their work in open formats, to teaching open courses, to creating open textbooks. And it doesn’t stop at individual academics or institutions. In 2014 the Premiers of Alberta, British Columbia, and Saskatchewan signed a Memorandum of Understanding to facilitate the creation, sharing, and use of Open Educational Resources. In the same year, SSHRC, NSERC, and CIHR drafted a tri-agency open access policy to improve access to and dissemination of research results (NSERC, 2014).
There is a growing interest in and exploration of online learning and openness, practices which are still emerging. Next, I will share four recent results from our research into these practices that I believe are interesting to consider because they reveal the tensions that exist when dealing with emerging topics.
First, research into online learning is becoming more interdisciplinary.
Interdisciplinary research into online learning means that individuals from a diverse range of disciplines, not just education, are interested in making sense of online learning. It is hoped that more research into online learning, particularly from multidisciplinary groups, will help us learn more about online learning and about learning in general.
We have evidence to show that research into online learning is becoming more interdisciplinary. I won’t bore you with the statistics, but we measured diversity in published research using a nifty measure and found that the period 2013-2014 can be described as more interdisciplinary than the period 2008-2012.
This is a positive trend, but before I explain its significance, let me explain to you how I view technology.
My perspective on online learning centers around the idea that technology is socially shaped. That means that technology always embeds its developers’ worldviews, beliefs, and assumptions into its design and the activities it supports and encourages.
What does this mean for interdisciplinarity? This means that we have both an opportunity and a challenge.
Our opportunity: to use our respective expertise to improve education.
Our challenge: to actually do interdisciplinary thinking, and to go into the study and design of future educational systems with an open mind and the realization that our own personal experiences of education may not be generalizable. A lot of educational technology is produced by people of privilege, and to develop educational technology that matters and makes a societal difference, we need diversity in thinking and experience.
Our second finding refers to the increasing desire to collect, mine, and analyze data trails to make inferences about human behavior and learning. This practice is often referred to as learning analytics and educational data mining. This practice is a reflection of a larger societal trend toward big data analytics. The idea is that by looking at what people do online one can understand how to improve education.
Consider data trails, for example. Nearly everything that learners do online is tracked. Can we understand learners and improve learning by analyzing their data trails?
While these approaches can help us explain what people do, they often don’t tell us why they do the things they do, nor how they actually experience online education.
My colleagues and I are interviewing MOOC students to learn about their experiences in MOOCs.
I am now going to tell you about our third result. We find that learners schedule their learning, use of resources, and participation to fit their daily lives. This is in stark contrast to the idea of undergraduate education situated at a university and happening during particular time periods.
One retired individual in Panama that we interviewed works on his class early in the morning every day. Why does he do that? He does that because at that time his daughter is asleep. She is homeschooled, and once she wakes up she needs access to the one computer in the household to do her own schoolwork. In this case, a lack of resources necessitates this scheduling.
One individual that we interviewed moved from the UK to the USA to be with her partner. She is currently waiting for her work permit, driver’s license, and so on, and she was enrolled in multiple MOOCs at the same time. She would work on her courses on Monday because she just “wanted them out of the way,” and so she would work on these courses straight throughout the day.
The fourth and final finding that I have for you today is that MOOC platforms to date have not offered learners the ability to keep notes. That particular activity, by virtue of being unsupported by the platform, goes undetected when researchers only look at data trails.
Unsurprisingly, learners keep notes anyway. A number of students we talked to described keeping notes on paper, frequently dedicating a notebook to a particular course and returning to it during exams or whenever they needed it. Learners also keep notes in digital format, usually in word-processing documents. Again, documents are typically dedicated to particular courses, but sometimes to particular topics across courses.
To give you an example of how this activity could be supported in the future, and how we believe innovations can contribute to learning: we recommend that designers support this practice through pedagogical innovations, such as scaffolding notetaking, and through technological innovations, such as developing online systems for notetaking. What is important here is that such systems should support learning by being interoperable, allowing learners full and unrestricted access to their notes and supporting them in importing and exporting their notes between platforms. Such a design is in line with emerging ideas in the field which call for learners to own their data.
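As a sketch of what interoperability could mean in practice, a minimal export/import round trip over a portable notes format might look like the following. The format name and field names here are hypothetical, invented for illustration, not drawn from any real platform:

```python
import json

# Hypothetical portable notes: each note carries enough context
# (course, video, timestamp) to be meaningful after re-import elsewhere.
notes = [
    {"course": "intro-stats", "video": "week1-lecture2",
     "video_time_s": 345, "text": "Law of large numbers: sample mean converges."},
]

def export_notes(notes):
    """Serialize notes to a portable JSON string learners can take with them."""
    return json.dumps({"format": "notes-export-v1", "notes": notes}, indent=2)

def import_notes(payload):
    """Any platform that reads this format can restore the learner's notes."""
    return json.loads(payload)["notes"]

# Round trip: export from one platform, import into another, nothing lost.
assert import_notes(export_notes(notes)) == notes
```

The design point is the round trip itself: notes survive leaving the platform, which is what learner data ownership requires.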
Thank you for being a great audience. I am really excited to hear the speakers that follow me, as I am sure you are!
A visualization of my talk, created by Giulia Forsythe
1. Congratulations to the following seven individuals who completed a doctoral degree in 2014.
2. It’s always interesting to explore literature outside of peer-reviewed journals to see how early-career colleagues are thinking about a topic.
3. The doctoral dissertations that follow were all published in 2014 and they focus on various aspects of MOOCs. Undoubtedly, some of the findings reported below will make it into the peer-reviewed literature. As far as I can tell, findings from Kassabian, Kellogg, and Moe have already been published.
4. I believe it would have been more valuable if these had been published as a series of shorter articles, instead of as volumes that then need additional effort to be revised and refined for submission to professional journals, a practice that is both frequent and encouraged. My dissertation in 2008 was a three-paper series. There are ways to do this, and really good reasons to do so. I’ve discussed this option with three or four doctoral students recently who are exploring it, but institutions need policies and frameworks in place to support such efforts.
5. I digress. Below you can find the citations and abstracts for these seven dissertations. Enjoy!
Gerber, J. (2014). MOOCs: Innovation, Disruption and Instructional Leadership in Higher Education. ProQuest, UMI Dissertations Publishing.
In the beginning rush of attention surrounding MOOCs (Massive Open Online Courses), there was considerable speculation regarding the ideal use and potential impact of this new innovation on teaching, learning, and traditional higher educational structures. Yet universities and colleges were rushing to implement MOOCs despite neither data nor clear understanding regarding their potential disruptive force on the educational landscape. To examine the MOOC phenomenon more closely, I conducted qualitative research that examined MOOCs integration at higher education institutions identified to be at the forefront of the MOOC movement. Framed using Everett Rogers’ model of innovation diffusion (Rogers, 1962), MOOC early adopters were defined as faculty members from US institutions who offered MOOCs between April 2012 and December 2013. This study researched initial MOOC implementation efforts in order to better determine motivations, implications and future impact on higher education, which will provide greater context to this rapidly shifting innovation. My findings indicate that the primary institutional motivation to sponsor MOOCs was to raise and/or enhance their institutional brand. The findings also indicated that faculty that self-selected to participate in MOOCs at the early stage was open to experimentation as well as to the inherent risks associated with the trial of a new educational innovation. This study uncovered important implications on the main pedagogical mission of the university and its professors as a result of instructor and institutional involvement with MOOCs. More specifically, this study revealed that MOOCs have pushed pedagogical issues to the forefront, and faculty early adopters have shifted their classroom teaching in ways believed to improve the classroom experience and create more interactive learning opportunities for students as a result of MOOCs.
Kassabian, D. (2014). Massive Open Online Courses (MOOCs) at elite, early-adopter universities: Goals, progress, and value proposition. ProQuest, UMI Dissertations Publishing.
Massive Open Online Courses (MOOCs) have become a hot topic in higher education and have undergone rapid growth. More than 800 MOOCs have been offered to the public from more than 200 of the most well known universities in the world, with millions of learners taking them. While many elite universities have developed MOOCs, their motivations have not been entirely clear. This qualitative case study research explores what three early adopter universities, Columbia University, Duke University, and Harvard University, hope to achieve by becoming involved and investing in MOOCs, how they are assessing progress toward goals, and what value proposition they seek as a return on their investment. The findings of this research suggest that the studied universities have several goals in common and a few that differ, and importantly, that several of their goals do not directly align with the public narrative around MOOCs in the press. In particular, while the goals of the studied universities do include expanded access to education, their goals may have even more to do with promoting teaching innovation and providing benefits for their residential education. None of the studied universities were focused on improvements to higher education completion challenges through pursuit of MOOC credit, or the use of MOOCs as a way to control higher education costs–both of which are major elements of the public dialogue on MOOCs. Other goals of the early adopters studied included providing more visibility for some of their educational programs and their faculty, and enabling more evidence-based education research. This study concludes that the value proposition for early adopter universities is the ability to simultaneously pursue the goal of improving on-campus teaching and learning while also promoting the university and its faculty and connecting through educational outreach with the public–all while showing leadership in an emerging higher education learning technology.
Kellogg, S. (2014). Patterns of Peer Interaction and Mechanisms Governing Social Network Structure in Three Massively Open Online Courses for Educators. North Carolina State University.
MOOCs, or Massively Open Online Courses, have gained extensive media attention for their vast enrollment numbers and the alliance of prestigious universities collectively offering free courses to learners worldwide. For many, MOOCs are filling the role of continuous education and ongoing professional development, serving to satisfy personal intellectual curiosity or enhance the workplace skills of post-graduates. A recent development in the MOOC space has been courses tailored to educators serving in K-12 settings. MOOCs, particularly as a form of educator professional development, face a number of challenges. Academics as well as pundits from traditional and new media have raised a number of concerns about MOOCs, including the lack of instructional and social supports. It is an assumption of this study that many of the challenges facing MOOCs can be addressed by leveraging the massive number of learners to develop robust online learning communities. Despite the potential benefits for educators, however, building and sustaining online learning communities has generally proved problematic. This study attempts to address critical gaps in the literature and address issues of community engagement in MOOCs by examining factors that influence peer interaction among educators. Specifically, this quantitative case study is framed by the social network perspective and utilizes recent advancements in Social Network Analysis to describe the peer discussion networks that develop and model the mechanisms that govern their structure.
Moe, R. (2014). The evolution and impact of the massive open online course. ProQuest, UMI Dissertations Publishing.
An online learning phenomenon emanated 2½ years ago from three courses taught at Stanford University, promising an opportunity for high-quality instruction from elite institutions and professors for no cost to the student. This phenomenon, which came to be known as the MOOC, catalyzed sweeping changes in both higher education’s relationship with distance education, as well as the discussion of higher education in society, in a remarkably short period of time. While people have questioned the effectiveness of MOOC learning and the potential negative consequences of adopting MOOC systems either in support of or to replace existing educational infrastructure, the MOOC movement has continued to grow at a rapid pace. This research study sought to define the characteristics of the MOOC on the terms of learning theory, pedagogy, history, society and policy through the use of an expert-based Delphi study, where participants engaged in a phenomenological dialogue about what constitutes a MOOC in practice, the present state of higher education in the wake of the MOOC movement, the effect the phenomenon has had on education both structurally as well as socially, and visions of the future of the institution of higher education as affected by the MOOC. In summary, panelists focused their agreement on cognitive and pragmatic aspects of the MOOC debate, such as a hope for learning analytics to offer solutions to educational problems as well as the opportunity for the MOOC system to offer tier-based education services to consumers. The Delphi discussion showcased the importance of cognitive theory in MOOC design as well as the relationship between MOOCs and economics, and highlighted the difficulty education experts have in agreeing on how to define educational terminology.
Outland, J. C. (2014). Examining the Market Positioning of Massive Open Online Courses to Maximize Employer Acceptance. ProQuest, UMI Dissertations Publishing.
Massive Open Online Courses (MOOCs) are a new instructional method utilizing many delivery methods common to online academic courses that are being offered in greater frequency as learner interest has increased. Learners may be attending these courses due to a lack of cost and perception that completion of this training may offer some benefit to them as they seek employment. Unfortunately, due to both the relatively recent development of MOOCs and the corresponding variety in delivery and documentation methods, little research had been completed on the acceptability of this instructional method by potential employers. Without this information, learners would be completing training that has little applicable benefit to them as they seek positions or advancement. Additionally, institutions would be offering courses in formats that do not fully benefit students, resulting in a sub-optimal use of institutional resources. The purpose of this qualitative study was to examine the perceptions of U.S.-based employers on instruction using variations of the MOOC model, and to identify traits in this delivery method that make completion of this training advantageous to potential applicants. Human resources and hiring managers were interviewed to determine their preferences using a focus group model. The data collected indicates MOOCs are positively perceived by employers, but not optimally positioned due to a lack of understanding and documentation. Employer perceptions of MOOCs can be enhanced by the consideration and inclusion of industry required skillsets to ensure that learners are focused on employer desired abilities that allow them to meet minimum and preferred job requirements. Additionally, by providing accurate and detailed documentation of the contents of a MOOC, institutions can ensure that a course is readily measurable by employers.
This documentation can take many forms, but credentials that detail the topics covered, time spent, and completion evaluation method are preferred. By adopting these identified key requirements of employers, institutions may be able to better position their MOOC offerings into categories that are more easily understood and evaluated during the hiring process. These changes would then enhance the perceived benefits of these classes, and generate additional advantages for job seekers who have completed these courses.
Schulze, A. S. (2014). Massive open online courses (MOOCs) and completion rates: are self-directed adult learners the most successful at MOOCs?. ProQuest, UMI Dissertations Publishing.
Millions of adults have registered for massive open online courses, known as MOOCs, yet little research exists on how effective MOOCs are at meeting the needs of these learners. Critics of MOOCs highlight that their completion rates can average fewer than 5% of those registered. Such low completion rates raise questions about the effectiveness of MOOCs and whether adults enrolling in them have the skills and abilities needed for success. MOOCs have the potential to be powerful change agents for universities and students, but it has previously been unknown whether these online courses serve more than just the most persistent, self-directed learners. This study explored the relationship between self-directed learning readiness and MOOC completion percents among adults taking a single Coursera MOOC. By examining self-directed learning – the ability to take responsibility for one’s own educational experiences – and MOOC completion rates, this research may assist in improving the quality of MOOCs. A statistically significant relationship was found between self-directed learning and MOOC completion percentages. Those stronger in self-directed learning tended to complete a greater percent of the MOOC examined. In addition, English speaking ability demonstrated a mediating effect between self-directed learning and MOOC completion. Learners indicating a strong ability in speaking English were more likely to be ready for self-directed learning and completed a higher percentage of the MOOC. Compared with those that did not complete MOOCs, however, few additional differences in demographics of adult learners that completed MOOCs were found. To better understand the skills and experiences needed to be successful in a MOOC, additional research on factors that influence MOOC completion is warranted. 
If only a minority of strongly self-directed learners can successfully complete MOOCs, then more resources should be invested in the design and development of MOOCs to meet the needs of a wider range of learners. If this does not occur, then MOOC completion rates could continue to suffer, and higher-quality open education solutions may emerge, making MOOCs a short-lived phenomenon.
Stefanic, N. M. (2014). Creativity-Based Music Learning: Modeling the Process and Learning Outcomes in a Massive Open Online Course. ProQuest, UMI Dissertations Publishing.
While developing creativity is an important goal of many educational endeavors, creating music, from a music education perspective, is a powerful pedagogical tool. Beyond comparing the relative creativity of individuals’ musical creative products (e.g., melodies, songs, lyrics, beats, etc.), research in musical creativity must consider how engaging in the creative process can be an effective teaching tool, what I have termed creativity-based music learning. If music teachers are to develop students’ abilities “to experience music as meaningful, informed by sensitive discernments and broad understandings, in each particular musical role engagement in which one becomes involved” (Reimer, 2003, p. 214), then we must gain a better understanding of how different aspects of the person and context interact during the creative process. Based on the available literature, Webster (1987a, 2002) conceived the Model of Creative Thinking in Music as a conceptual model for understanding the importance of various components that are at work in the musical creative process. Since, generally speaking, learning results from thinking of some sort, Webster’s model represents a reasonable starting point from which to examine how musical creative thinking leads to musical learning. There is much research in music education and the general creativity literature that has investigated how these various component parts (e.g., music aptitude, personality, motivation, previous experience, context) relate to creativity, but there has yet to be any substantive attempt to understand how all of these various elements simultaneously interrelate during a given musical creative process.
More importantly, there is limited research on how creativity-based music learning contributes to important learning outcomes such as students’ perceptions of learning from the process, students’ self-evaluations of creative products (e.g., songs they have written), the development of conceptual understandings, and the development of musical creative self-efficacy. The initial primary purpose of this study was to develop and identify a statistical model that best represents the nature of the various interrelationships of components of the musical creative process, as identified in Webster’s (2002) model, and as they relate to learning outcomes. Understanding how all of these components relate and ultimately impact various learning outcomes has important implications for how we educate our music students. Data were collected from students taking a Massive Open Online Course entitled “What is Music?: Finding Your Song,” which was designed, developed, and taught by the researcher, and offered in January 2014 through the Canvas Network. In the course, the question “what is music?” was approached from several perspectives, including Music as Human Activity, Music as Emotion, Music as Physics, and Music as Form. While learning about each perspective, students were encouraged to engage with and complete various musical creative projects (e.g., creating a representative playlist, writing lyrics, writing a melody, writing a song). Such an educational context in which creativity is used as a pedagogical tool provided an opportunity for studying the educational outcomes of such an approach. Embedded within the course were measures of several predictors of learning (based on Webster’s model), including past experience in music, personality, music aptitude, contextual support, musical creative self-efficacy, motivation, and situational engagement. 
Initial analysis plans included the use of structural equation modeling to (1) compare and contrast the statistical fit of competing models; and (2) examine how each of these constructs not only relate to each other, but also how they each contribute (uniquely and in combination) to various learning outcomes, including perceptions of learning, self-evaluations of creative products, and musical creative self-efficacy. However, too few students engaged in and completed the creative projects, and too few completed all of the research items, to permit examination of the full structural model. When it became apparent that sufficient data would not be available, the study was re-envisioned to examine questions about why students chose to participate or not participate in the creative music-making projects. Data were collected from 281 students, and although missing data were extensive for variables measured late in the course (e.g., motivation), large amounts of data were available regarding students’ past experience in music, their expectations regarding participation as MOOC learners, and demographic information (e.g., age, gender, education, language, geographic region). The available data were used in an exploratory manner to derive a model for predicting creative project participation in the course. The sole important predictor of project participation was whether students identified themselves as an “active participant” at the beginning of the course, although this variable explained only a small amount of variability in project participation.
Follow-up analyses for group differences in Active Participant (individuals who identified themselves as “active participants” versus all other Types of Learners) found that “active participants” had significantly higher levels of Musical Creative Self-Efficacy, greater perceptions of the learning context as challenge-supportive, and higher scores on the Openness personality factor. Notably, students’ Past Experience in Music appeared to be unrelated to both whether they intended to participate in the creative music-making projects and whether they actually participated in the projects. In addition to the primary MOOC study, the development and initial validation procedures and results for two new research instruments utilized in the MOOC study, the Past Experience in Music Inventory (PEMI) and the Musical Creative Self-Efficacy Scale (MCSES), are described in detail. The latent class measurement model utilized for measuring Past Experience in Music is a unique and potentially valuable approach for measuring this important variable in music research of all kinds. Finally, an exploratory analysis of all zero-order rank-order intercorrelations of all non-nominal variables indicated some initial support for the General Specified Model of Creativity-Based Learning. It was not possible to take the next step with the model: to prune it, alter it, or reject it altogether. However, when viewed as a very large-scale pilot study, this study did provide enough evidence to warrant investing the considerable resources necessary to take that next step. Implications for creativity-based music learning and the significance of MOOCs and MOOC research are discussed. In particular, music MOOCs represent an opportunity to fill a much-needed space for lifelong learning. However, if we are to promote lifelong musical engagement, then the pedagogy within a MOOC should also promote engagement.
As such, questions and further research regarding such engagement, especially within a creativity-based learning framework, are central to better understanding how to promote and facilitate lifelong musical engagement and musical learning.
As part of our ongoing investigation into learning experiences and practices with openness and open courses, we are gathering institutional reports describing MOOC initiatives and outcomes. So far, we have been able to locate the reports listed below. Do you know of any we are missing? If so, please share the links with us by posting a comment below.
Firmin, R., Schiorring, E., Whitmer, J., & Sujitparapitaya, S. (2013). Preliminary Summary: SJSU +Augmented online learning environment pilot project. Retrieved from http://www.sjsu.edu/chemistry/People/Faculty/Collins_Research_Page/AOLE Report -September 10 2013 final.pdf
Harrison, L. (2013). Open UToronto MOOC Initiative: Report on First Year of Activity. Retrieved from http://www.ocw.utoronto.ca/open-utoronto-mooc-initiative/
Ho, A. D., Reich, J., Nesterko, S., Seaton, D. T., Mullaney, T., Waldo, J., & Chuang, I. (2013). HarvardX and MITx: The First Year of Open Online Courses.
University of London. (2013). Massive Open Online Course (MOOC) Report 2013. Retrieved from http://www.londoninternational.ac.uk/sites/default/files/documents/mooc_report-2013.pdf
University of Illinois at Urbana-Champaign. (2013). MOOC Strategy Advisory Committee Fall 2013 Interim Report. Retrieved from http://mooc.illinois.edu/docs/MSAC-Interim-Report-2013-11-11.pdf
Ithaka S+R. (2013, October). Interim Report: A Collaborative Effort to Test MOOCs and Other Online Learning Platforms on Campuses of the University System of Maryland. Retrieved from http://www.sr.ithaka.org/sites/default/files/reports/S-R_Moocs_InterimReport_20131024.pdf
University of Edinburgh. (2013). MOOCs @ Edinburgh 2013 – Report #1 (p. 42). Edinburgh. Retrieved from http://www.era.lib.ed.ac.uk/bitstream/1842/6683/1/Edinburgh MOOCs Report 2013 #1.pdf
My colleague Charalambos Vrasidas and I are editing a special issue of Educational Media International focusing on learner experiences in massive open online courses. We are interested in empirical and theoretical manuscripts as well as systematic reviews/analyses/syntheses of the literature. Preliminary abstracts are due by December 19th. We plan for the review process to be prompt and aim for the issue to be published within eight months or so.
As part of the special issue, and prompted by a note by Al Filreis, we have decided to include a section that enables individual learners to tell their own stories about their experiences with MOOCs. If you have taken an open course and would like to write a short piece about an aspect of your experience, this section of the special issue would be relevant to you. Like all other submissions, these will be peer-reviewed as well.
Individuals interested in this route can submit a 200-word abstract summarizing their intended submission and a 200-word bio by the 19th of December to firstname.lastname@example.org.
Invitations to submit full papers will be sent on or before January 9, 2015. Manuscripts should be formatted using APA style and should be 1,200 words long, including references. The process to be followed thereafter is as follows:
- March 1, 2015: Full-length papers due via email at email@example.com
- May 1, 2015: Notification of acceptance/rejections
- June 30, 2015: Final papers with revisions due
- 2015: Special issue is published
On December 5th, Coursera sent an email inviting individuals to participate in a survey intended to investigate whether participation on Coursera “has had any career, educational, or social impact in [their] life.” The email also stated: “Your survey response will be used as part of a research study conducted by the University of Pennsylvania, the University of Washington, and Coursera, examining the impact of MOOCs.”
Research studies that examine the impact of MOOCs beyond individual courses, and that use methods other than clickstream data, are worthwhile and needed. I applaud Coursera and its partners for this effort to address the research gap.
However, the lack of information pertaining to the research is concerning.
Clicking on the email invitation takes potential participants to a page that describes the research study as follows:
Both the University of Pennsylvania and the University of Washington have offices in place to support researchers in conducting research in ways that protect human participants (the UPenn IRB is here and the U of Washington site here). Importantly, these offices are not just regulatory: they provide help and support. The UPenn site for example states that the mission of the IRB includes providing “professional guidance and support to the research community.”
At the very minimum, potential participants should be informed about the study and should agree to participate in it. This process is called informed consent. The University of Washington IRB website describes it as follows:
Researchers are required to obtain the informed consent of all participants in human subjects research prior to enrolling those individuals in a study. The individual’s consent must be voluntary and based upon adequate knowledge of the purpose, risks, and potential benefits of a research study. All potential participants should also be informed of their right to abstain from participation or to withdraw consent to participate at any time without reprisal. After ensuring that a person has understood the information, the researcher should then obtain the person’s consent, preferably in writing. [more details here]
This information should be written in language that a layperson can understand, and it should have been included on the page shown in the screenshot above. In online surveys, consent is usually obtained by asking participants to click a button indicating that they agree to participate in the study.
This is all missing from the Coursera survey.
Granted, Coursera is a business entity, and it is not bound by the requirements imposed on academic researchers when it conducts a survey. Businesses conduct surveys and market research all the time, and none of these rules apply. But this isn’t just market research. The email and survey introduction state the intent clearly: the data will be used for research. Even if Coursera were unaware of these ethical guidelines, Facebook’s emotional contagion study and other news stories about research ethics (e.g., Harvard’s hidden-camera efforts) should have provided a moment to pause and ask: Are we doing all we can to ensure that we are treating each other, and our research participants especially, in an ethical and caring way?