The attention that education and educational technology are receiving is significant, and I feel fortunate to be able to participate in efforts to improve education. Nonetheless, for the past couple of years I have been bombarded with announcement after announcement of what the latest and greatest technology can do for education. These announcements are almost always filled with claims about their potential impact. Here’s a clipping of a recent email I received:
Let’s pause. Research from ISTE “found that the use of education technology (EdTech) resulted in 35 percent of students showing higher scores on class assessments and 32 percent increased engagement?” The link pointed to this post, written by the amazing Wendy Drexel. Let’s home in on what Wendy actually wrote:
“Nationwide, we are seeing powerful results from the effective use of technology in classrooms. For example, results of research by ISTE and the Verizon Foundation earlier this year into the use of education technology had teachers reporting that 35 percent of their students showed higher scores on classroom assessments; 32 percent showed increased engagement; and 62 percent demonstrated increased proficiency with mobile devices. In fact, 60 percent of participating teachers also reported that by using their mobile devices, they provided more one-on-one help to students, and 47 percent said they spent less time on lectures to the entire class.”
These are interesting and worthwhile results. But they are mischaracterized in the email above. From the summary of the research posted on the ISTE site and a more detailed report of the research (pdf), we can begin to see how some edtech companies purport that there is evidence of impact and use that to further their cause. Here’s a summary of two issues ignored by the ad/email:
- The email claims that edtech use resulted in 35% of students scoring higher on classroom assessments and 32% showing increased engagement. What the research actually reported about this particular area is the following: teachers reported that edtech use led to increased scores/engagement. In plainer language, the teachers said that their students did better. We don’t know whether they actually did.
- The email paints a direct and unequivocal relationship between edtech use and outcomes: “use of educational technology resulted in.” That’s not actually the case. Why?
- The actual research showed that even though there were differences in math and science outcomes between schools that participated and schools that did not, the results were not statistically significant. In other words: similar results could be expected without the use of this particular technology.
- How is the technology used? The research report notes: “During site visits, observers noted that edtech-using teachers used technology to efficiently facilitate drill and practice test preparation activities.” In other words: edtech helps with teaching to the test, and that seems to work. Put differently: we have powerful technologies that empower people to be creative and enable global collaboration, but we have created systems that put teachers in situations in which they have to use these tools in simplistic ways.
Royce Kimmons and I have been exploring the use of large-scale data in a number of recent studies. We just published a paper that tries to make sense of students’ and professors’ social media participation on a large scale. We are continuing our qualitative investigations to understand “why, in what ways, and how” scholars (students & professors) are using social media, but this is our first data mining study making use of Twitter data. It’s also the first study using large-scale Twitter data to make sense of how professors and students of education are using Twitter.
Here’s a high-level summary of three of our findings:
- There is significant variation in how scholars participate on Twitter. The platform may not be the democratizing tool it is often purported to be: the most popular 1% of scholars have an average follower base nearly 100 times that of scholars in the lower 99%, and 700 times that of scholars in the bottom 50%.
- Civil rights and advocacy seem to be an important activity of social media participation – this is rarely captured in research to date, which most often focuses on how social media are used in teaching & research. Scholars’ participation on Twitter extends well beyond traditional notions of scholarship.
- We found that scholars who follow more users, have tweeted more, signal themselves as professors, and have been on Twitter longer tend to have more followers. This model explains 83% of the variation in follower counts. This finding raises questions as to the meaning of follower counts and their use as a metric in conversations pertaining to scholarly quality/reach.
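To make the “explains 83% of the variation” claim concrete, here is a minimal sketch of the kind of analysis involved: an ordinary least squares regression predicting (log) follower counts from several account characteristics, with R² as the share of variance explained. The data below are simulated and the predictors, coefficients, and resulting R² are illustrative assumptions, not the study’s actual data or model.

```python
import random

random.seed(0)
n = 500

# Simulated predictors loosely analogous to the study's:
# log(accounts followed), log(tweet count), professor indicator, years on Twitter.
rows, y = [], []
for _ in range(n):
    following = random.gauss(6, 1)
    tweets = random.gauss(7, 1.5)
    prof = random.randint(0, 1)
    years = random.gauss(4, 1)
    # Simulated outcome: log follower count (coefficients are made up)
    log_followers = (0.5 * following + 0.3 * tweets + 0.4 * prof
                     + 0.2 * years + random.gauss(0, 0.4))
    rows.append([1.0, following, tweets, prof, years])  # leading 1.0 = intercept
    y.append(log_followers)

# Ordinary least squares via the normal equations (X'X)b = X'y,
# solved here with Gaussian elimination to avoid external libraries.
k = len(rows[0])
A = [[sum(r[i] * r[j] for r in rows) for j in range(k)] for i in range(k)]
b = [sum(r[i] * yi for r, yi in zip(rows, y)) for i in range(k)]
for col in range(k):
    pivot = max(range(col, k), key=lambda r: abs(A[r][col]))
    A[col], A[pivot] = A[pivot], A[col]
    b[col], b[pivot] = b[pivot], b[col]
    for r in range(col + 1, k):
        f = A[r][col] / A[col][col]
        for c in range(col, k):
            A[r][c] -= f * A[col][c]
        b[r] -= f * b[col]
beta = [0.0] * k
for i in reversed(range(k)):
    beta[i] = (b[i] - sum(A[i][j] * beta[j] for j in range(i + 1, k))) / A[i][i]

# R^2: the share of variance in the outcome that the model accounts for
pred = [sum(bi * xi for bi, xi in zip(beta, r)) for r in rows]
mean_y = sum(y) / n
ss_res = sum((yi - pi) ** 2 for yi, pi in zip(y, pred))
ss_tot = sum((yi - mean_y) ** 2 for yi in y)
r2 = 1 - ss_res / ss_tot
print(f"R^2 = {r2:.2f}")
```

The point of the sketch: a high R² means follower counts are largely predictable from mundane account characteristics, which is exactly why a raw follower count is a questionable proxy for scholarly quality or reach.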
Veletsianos, G., & Kimmons, R. (2016). Scholars in an Increasingly Digital and Open World: How do Education Professors and Students use Twitter? The Internet and Higher Education, 30, 1-10.
When my friends Jon Becker and Alec Couros were applying for tenure, they did something open, innovative, and thoughtful: they asked the community for feedback on their work and scholarship. This feedback often gets missed in tenure applications because the impact and reach of scholarship tend to be evaluated in basic ways: How many times was a publication cited? In how many high-impact-factor journals did one publish? Alec’s and Jon’s requests served to add another dimension to the evaluation of their work.
I find myself in a similar position. Would you please help me provide more diverse evidence for my application? If my work has impacted you in any way, could you please add a note below? Perhaps my research helped you get started on your MA/PhD thesis/dissertation. Or perhaps you used my work to provide professional development for teachers/faculty. Or you assigned my work as reading. Or you reused one of the teaching activities I shared on my blog. Or you learned something from me at some point. Many of these “indicators of impact” are invisible, so, in essence, what I am asking is that you help me make them visible.
If you have a few moments to spare, I’d appreciate your feedback in the form below, which has the same format as the one created by Jon (Thanks, Jon!). My plan is to include these data with my application in raw and summary form.
The British Journal of Educational Technology and BERA approached us to create an infographic for the article we (Amy Collier, Emily Schneider, and I) published last year: Digging Deeper into Learners’ Experiences in MOOCs: Participation in social networks outside of MOOCs, Notetaking, and contexts surrounding content consumption.
Below is the outcome (and a pdf version is here):