It has been suggested that the use of social technologies (e.g., social media, social networking sites) in higher education may be a worthwhile endeavor. Nevertheless, empirical literature examining user experiences, and more specifically instructor experiences, with these tools is limited. My colleagues and I conducted a study recently to address this gap in the literature. Our goal was to identify, describe, and make sense of initial instructor experiences with a social networking platform (Elgg) used in higher education courses. This follows a prior study in which we examined learner experiences with Elgg.
This study does not purport to describe the experiences of all instructors. Rather, it provides an in-depth examination and rich description of the experiences of five instructors who used a social networking platform in their courses. Readers should examine the context in which this study occurred and decide whether these findings may apply in their own situations.
We found that instructors:
- had expectations of Elgg that stemmed from numerous sources
- used Elgg in heterogeneous ways and for varied purposes
- compartmentalized Elgg and used it in familiar ways, and
- faced frustrations stemming from numerous sources.
Importantly, the ways that Elgg came to be used “on the ground” were contested, and they contrasted starkly with the narrative of how social software might benefit educational practice. Furthermore, we found that learning management systems (e.g., Blackboard, Moodle, Canvas, Desire2Learn) may frame the ways in which other tools, such as social media and Elgg, are understood, used, and experienced: instructors in our study continuously discussed their experiences with Elgg in comparison to an LMS, even though Elgg is not a traditional LMS.
Veletsianos, G., Kimmons, R., & French, K. (2013). Instructor experiences with a social networking site in a higher education setting: Expectations, frustrations, appropriation, and compartmentalization (pdf). Educational Technology Research and Development, 61(2), 255-278.
You can download a pdf of the paper from the link above, or visit dx.doi.org/10.1007/s11423-012-9284-z for the published version.
A Facebook conversation from yesterday encouraged me to share one of the assignments that I developed for my instructional design course. The goal of the class is for students to understand, experience, and apply instructional design in a variety of educational contexts.
One of the assignments I developed asked students to enroll in a Massive Open Online Course (MOOC) and analyze the instructional materials within the course using one of the rubrics provided by Dick and Carey (the instructional design textbook we use in class). It was a lot of fun, and the students appreciated the exercise. Given the lack of presence and voice of instructional designers in MOOC happenings, the lack of valid, reliable, and serious research on the topic (though Rita Kop’s work on cMOOCs is admirable), and my desire to engage students in contemporary events, I came up with this assignment to embed MOOC analysis in my course. The assignment is available for download at https://dl.dropbox.com/u/2533962/instr-materials-veletsianos.doc and is posted below for those who just want to skim it without downloading. Enjoy and feel free to use it:
Instructional Material analysis assignment
Individually, you will examine and report on the instructional materials of one popular digital learning initiative. An analysis matrix will be provided to you, and you will use that matrix to evaluate these initiatives.
Length: Minimum 500 words.
| Criteria | Levels of Attainment | Points |
| --- | --- | --- |
| Written analysis (evaluation) | | |
This task requires a few hours of research before you can actually complete it. Even though this is an individual task, if you would like to discuss the assignment with any of your colleagues, please feel free to do so.
First, read the chapter and the rest of the materials for this week. Without reading those, I can assure you that your understanding of the issues presented will be superficial.
Second, examine the rubric provided by Dick & Carey for evaluating instructional materials (p. 250-251 – see below for the rubric). You will be completing this rubric for a digital environment, and it’s a good idea to understand what it encompasses before you proceed.
Third, select one course provided on one of the following platforms to examine:
- A course on Coursera (select a course that is occurring right now or has been completed. DO NOT select a course that has not started yet): https://www.coursera.org/courses
- A course on EdX (select a course that is occurring right now. DO NOT select a course that has not started yet): https://www.edx.org/courses
- A free course on Udemy (select a course that includes at least 5 “lessons/lectures”): http://www.udemy.com/courses
You can also choose to examine DS106: http://ds106.us/. I am including DS106 on its own because it is a course, whereas Coursera, EdX, and Udemy are platforms. If you pick any of those three platforms, you should also pick a specific course within it (e.g., within Coursera, a possible course is https://www.coursera.org/course/friendsmoneybytes).
Once you have made your selection, it’s time to research your course. Spend time looking around, examining and evaluating the instructional materials provided. You will use the rubric to keep track of the criteria that need to be assessed, and then using this rubric you will write a report assessing the instructional material for the course.
You should start your report by stating the course and its provider. A link would also be helpful. For example, using the example above, I would start my report by stating the following:
“I am examining the course entitled Networks: Friends, Money and Bytes (https://www.coursera.org/course/friendsmoneybytes). This course is offered through Coursera and is taught by Mung Chiang, who is a Professor of Electrical Engineering at Princeton University. The course is an introduction to the topic of X and its objectives are XYZ.”
Your report should be specific and detailed in its evaluation of the instructional materials, and should be guided by the five criteria families discussed by Dick and Carey: goal-centered, learner-centered, learning-centered, context-centered, and technical criteria. I would like to see that you understand each criterion and that you are capable of applying it to evaluating your course. For example, at the very least, I would expect to see statements such as the following:
Instructional designers use five criteria families to evaluate instructional materials. Learner-centered criteria focus on XYZ and refer to X. The instructional materials for this course appear to be adequate for this criterion because <provide list of reasons here>. The course could be improved in this domain by <list of additions/revisions here>. However, because item X was not disclosed in the course, I am not able to evaluate Y.
Let me reiterate that to complete this assignment you will need to do background research on the course and the platform. For example, your background research on Coursera will reveal that some of these courses have more than 80,000 students from around the world. This fact alone will impact your evaluation!
Instructional Material Evaluation Rubric
Rubric is copyright of: Dick, W., Carey, L. & Carey, J. (2008). Systematic Design of Instruction, (7th ed.) Upper Saddle River, NJ: Pearson.
A. Goal-centered Criteria:
Are the instructional materials:
1. Congruent with the terminal and performance objectives?
2. Adequate in content coverage and completeness?
6. Objective in presentations (lack of content bias)?
B. Learner-centered Criteria:
Are the instructional materials appropriate for learners’:
1. Vocabulary and language?
2. Development level?
3. Background, experience, environment?
4. Experiences with testing formats and equipment?
5. Motivation and interest?
6. Cultural, racial, gender needs (lack of bias)?
C. Learning-centered Criteria:
Do the materials include:
1. Pre-instructional material?
2. Appropriate content sequencing?
3. Presentations that are complete, current, and tailored for learners?
4. Practice exercises that are congruent with the goal?
5. Adequate and supportive feedback?
6. Appropriate assessment?
7. Appropriate sequence and chunk size?
D. Context-centered Criteria:
Are/do the instructional materials:
1. Authentic for the learning and performance sites?
2. Feasible for the learning and performance sites?
3. Require additional equipment/tools?
4. Have congruent technical qualities for the planned site (facilities/delivery system)?
5. Have adequate resources (time, budget, personnel availability and skills)?
E. Technical Criteria:
Do the instructional materials have appropriate:
1. Delivery system and media for the nature of objectives?
3. Graphic design and typography?
6. Audio and video quality?
7. Interface design?
What is the value of a critique?
The value of critique is to help us see a phenomenon through a different lens, to help us make sense of something in a different way, and to spark a conversation. This is the purpose, and value, of a paper we recently published with IRRODL on the topic of open scholarship.
The paper identifies the assumptions and challenges of openness and open scholarship and puts forward suggestions for addressing them. A summary of our paper appears below:
Many scholars hope and anticipate that open practices will broaden access to education and knowledge, reduce costs, enhance the impact and reach of scholarship and education, and foster the development of more equitable, effective, efficient, and transparent scholarly and educational processes. Wiley and Green (2012, p. 88) note that “only time will tell” whether practices of open scholarship will transform education or whether the movement “will go down in the history books as just another fad that couldn’t live up to its press.” Given the emerging nature of such practices, educators are finding themselves in a position in which they can shape and/or be shaped by openness (Veletsianos, 2010).

The intention of this paper is (a) to identify the assumptions of the open scholarship movement and (b) to highlight challenges associated with the movement’s aspirations of broadening access to education and knowledge. The goal of this paper is not to frame open scholarship as a problematic alternative to the status quo. Instead, as we see individuals, institutions, and organizations embrace openness, we have observed a parallel lack of critique of open educational practices. Such critiques are largely absent from the educational technology field, as members of the field tend to focus on the promises of educational technologies, rarely pausing to critique their assumptions. Selwyn (2011b, p. 713) even charges that our field’s inherent positivity “limits the validity and credibility of the field as a site of serious academic endeavour.”

Our intention is to spark a conversation with the hope of creating a more equitable and effective future for digital education and scholarship. To this end, this paper is divided into three major sections. First, we review related literature to introduce the reader to the notion of open scholarship. Next, we discuss the assumptions of openness and open scholarship.
We then identify the challenges of open scholarship and discuss how these may limit or problematize its outcomes.
Common assumptions and challenges are summarized as follows:
| Common themes and assumptions | Challenges |
| --- | --- |
| Open scholarship has a strong ideological basis rooted in an ethical pursuit for democratization, fundamental human rights, equality, and justice. | Are these ideals essential components of the open scholarship movement, or are they merely incidental to those who are pioneering the field? |
| Open scholarship emphasizes the importance of digital participation for enhanced scholarly outcomes. | Scholars need to develop an understanding of participatory cultures and social/digital literacies in order to take full advantage of open scholarship. University curricula need to be redesigned to prepare future scholars for the changing nature of scholarship. |
| Open scholarship is treated as an emergent scholarly phenomenon that is co-evolutionary with technological advancements in the larger culture. | Technology both shapes and is shaped by practice. Technology is not neutral, and its embedded values may advance tensions and compromises (e.g., flat relationships, homophily, filter bubbles). |
| Open scholarship is seen as a practical and effective means for achieving scholarly aims that are socially valuable. | Open scholarship introduces new dilemmas and needs (e.g., personal information management challenges; social stratification and exclusion). |
Given the topic, the best home for this paper was the International Review of Research in Open and Distance Learning, through which you can download the paper for free as open access:
I am in Cyprus to meet with a number of colleagues and give an invited talk at ICEM 2012.
Talk title: What does the future of design for online learning look like? Emerging technologies, Openness, MOOCs, and Digital Scholarship
Abstract: What will we observe if we take a long pause and examine the practice of online education today? What do emerging technologies, openness, Massive Open Online Courses, and digital scholarship tell us about the future that we are creating for learners, faculty members, and learning institutions? And what does entrepreneurial activity worldwide surrounding online education mean for the future of education and design? In this talk, I will discuss a number of emerging practices relating to online learning and online participation in a rapidly changing world and explain their implications for design practice. Emerging practices (e.g., open courses, researchers who blog, students who use social media to self-organize) can shape our teaching/learning practice and teaching/learning practice can shape these innovations. By examining, critiquing, and understanding these practices we will be able to understand potential futures for online learning and be better informed on how we can design effective and engaging online learning experiences. This talk will draw from my experiences and research on online learning, openness, and digital scholarship, and will present recent evidence detailing how researchers, learners, educators are creating, sharing, and negotiating knowledge and education online.
I have finished compiling my syllabus for an undergraduate seminar I am teaching, and I thought I would share it. This is a syllabus for a course in which we investigate major trends influencing education and seek to understand how education and learning institutions are (and are not) changing with the emergence of new technologies, social behaviors, and cultural expectations. The syllabus is embedded below, but you can also download it from Scribd through this link
In their paper “Intentional Web Presence: 10 SEO Strategies Every Academic Needs to Know” Patrick Lowenthal and Joanna Dunlap offer excellent advice to academics mindful of their web presence and cognizant of the potential impact that the Internet may have on their scholarship. I’ve come to use most of these strategies over the years, but I am excited to see these collected at one location.
I’ll add an 11th strategy: use an RSS aggregator (e.g., Google Reader) to gather resources of interest effortlessly and consistently. For example, I receive alerts of the latest journal issues in my aggregator (you can also have these emailed to you). I also follow a number of colleagues’ blogs there, so I don’t have to visit individual sites. My RSS aggregator also serves as an archiving mechanism.
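For those curious what an aggregator does under the hood, here is a minimal sketch of extracting items from an RSS feed using only Python’s standard library. The feed content below is invented for illustration; a real aggregator would fetch the XML over HTTP on a schedule and remember which items it has already shown you.

```python
# Minimal sketch of the core of an RSS aggregator: parse a feed and list
# its items. The sample feed is hypothetical journal-alert data.
import xml.etree.ElementTree as ET

SAMPLE_FEED = """<rss version="2.0">
  <channel>
    <title>Journal of Example Studies - Latest Issue</title>
    <item>
      <title>Article One</title>
      <link>http://example.com/article-one</link>
    </item>
    <item>
      <title>Article Two</title>
      <link>http://example.com/article-two</link>
    </item>
  </channel>
</rss>"""

def list_items(feed_xml):
    """Return (title, link) pairs for each item in an RSS 2.0 feed."""
    root = ET.fromstring(feed_xml)
    return [(item.findtext("title"), item.findtext("link"))
            for item in root.iter("item")]

for title, link in list_items(SAMPLE_FEED):
    print(f"{title}: {link}")
```

A real setup would also deduplicate items by their `guid` or link, which is what lets the aggregator double as an archive.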
As academics and scholars engage in the emerging practice of using “participatory technologies and online social networks to share, reflect upon, critique, improve, validate, and further their scholarship” (which is an argument that we made in this paper), these strategies are important to keep in mind.
What other strategies do you use?
Readers of this blog may be interested in a short encyclopedia entry, summarizing the adventure learning approach to education that colleagues and I are using in a few of our projects (e.g., YoTeach and AL Water expeditions). The paper also summarizes productive avenues for future research:
Veletsianos, G. (2012). Adventure Learning. In Seel, N. (Ed.), Encyclopedia of the Sciences of Learning (pp. 157-160). Springer Academic.
This is part of my ongoing reflection on moocs. See the rest of the entries here.
I have signed up for a number of MOOCs as a student (and led one of the #change11 weeks), and a couple of weeks ago I spoke in general terms about how education research can help improve the type of education offered through a MOOC. In this post, I will give specific suggestions, focusing on the University of Pennsylvania MOOC Listening to World Music, offered through Coursera. I am signed up for this course, which started on June 23, and I just submitted the first assignment. I decided to post these thoughts early for two reasons. First, the beginning of any course is an important moment in its success, and I find that it takes a lot of planning and reflection. Second, MOOCs are discussed as experiments in online education. The Atlantic even calls them “The single most important experiment in higher education.” I agree that they are experimental initiatives, and as such they would benefit from ongoing evaluation and advice. Where I disagree is with the notion that they are a departure from what we know about online (and face-to-face) education. This post is intended to highlight just a couple of items that the Coursera instructional designers and learning technologists could have planned for in order to develop a more positive learning experience.
1. Length of the video lectures.
The syllabus lists the length of the video lectures (e.g., video 1 is 10:01 minutes long and video 2 is 10:45 minutes long). However, this length is not provided on the page that students visit to watch the videos, which is where they need that information. I’ve annotated this below.
2. Opportunities for interaction.
The platform provides forums for students to interact with each other. Learners are of course resourceful and will figure out alternative, more efficient and effective ways to communicate with each other if they need to. For instance, in a number of other MOOCs students set up Facebook groups, and I anticipate that this will happen here as well. What Coursera could do to support learners in working with each other is to integrate social media plugins within each course. I am surprised that this isn’t prominent within the course, because you can see from the images below that Coursera uses social media plugins to allow students to announce participation in the course:
For instance, it appears that the course uses the #worldmusic hashtag, though it is not integrated within the main page of the course, nor does it seem to be a hashtag unique to the course.
3. How do you encourage students to watch the videos?
Let’s say that we added the length of each video next to its title. Now the learner knows that they need about an hour to watch the week’s videos. Some learners (e.g., those who are intrinsically motivated by the topic) will watch them without much scaffolding. But how do you provide encouragement for others to do so? Here’s where some social psychology insights might be helpful. By providing learners with simple descriptions of how the majority of their colleagues are behaving (i.e., appealing to social norms), one might be able to encourage individuals to watch the videos. For example, the videos might include a dynamic subtitle that informs learners that “8 out of 10 of your peers have watched this video” or that “70% of your peers have completed the first assignment,” and so on. This is the same strategy that hotels use to encourage guests to reuse towels, and the same strategy that Nike uses when it compares your running patterns to those of other runners, as shown in the image below:
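To make the social-norm subtitle concrete, here is a small Python sketch that turns raw view counts into such a message. The function name, wording, and rounding rule are my own assumptions for illustration, not anything Coursera actually implements.

```python
# Hypothetical sketch of a social-norm nudge: convert view counts into a
# peer-behavior subtitle such as "8 out of 10 of your peers have watched
# this video". Wording and rounding are illustrative assumptions.
def norm_message(watched: int, enrolled: int) -> str:
    """Return a peer-norm subtitle, or an empty string if no one is enrolled."""
    if enrolled == 0:
        return ""
    out_of_ten = round(10 * watched / enrolled)  # scale the fraction to "x out of 10"
    return f"{out_of_ten} out of 10 of your peers have watched this video"

# e.g., 56,000 viewers out of 70,000 enrolled learners
print(norm_message(56000, 70000))  # -> "8 out of 10 of your peers have watched this video"
```

The same counts could just as easily be rendered as a percentage (“80% of your peers…”); the point is that the platform already has the data to generate the nudge dynamically.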
4. Peer-grading expectations.
This course is different from others that I’ve participated in because it includes an element of peer-grading. This is exciting to me because I’m a firm believer in social learning. One minor concern, however, is the following: I don’t know how many peers I am supposed to evaluate. I thought I was supposed to evaluate just one, but each time I finish an evaluation, I am presented with the option to “evaluate next student.” Do I keep evaluating? How many do I need to evaluate before I can move to the next step? I don’t know. In other words, it’s always helpful to inform the learner of what s/he has to do. In my case, I simply stopped after evaluating four peers because I didn’t know how many I was expected to complete. Perhaps there’s no minimum… and this information would be helpful to me as a learner.
Overall, my experience with this course has been positive, though there is a lot of room for improvement, which is to be expected. For example, I haven’t touched much on the pedagogy of the course, but there are a few more weeks left… so stay tuned!
Nike photo credit. Thanks to my colleague Chuck Hodges for directing my attention to the Nike example.