Professor & Canada Research Chair in Innovative Learning and Technology at Royal Roads University

Category: emerging technologies

What do you do *in anticipation of* social media privacy concerns and scandals?

Posted on March 27th, by George Veletsianos in emerging technologies.

Responses to the news relating to the Cambridge Analytica + Facebook scandal have been swift, with many vowing to #DeleteFacebook. An extensive collection of resources relating to this scandal is available here. Lee Skallerup Bessette calls the fiasco the latest iteration of “guess how safe and secure your data is and how it might be used for nefarious purposes but it’s actually worse than that.”

An Angus Reid poll in Canada shows that 3 out of 4 respondents plan to change the ways they use the platform. How many will actually change their practices and behavior? Regardless, I wonder what people do when their habits center around mistrusting contemporary digital platforms and their opaque use of our data. In other words, what do you do on an ongoing basis when you anticipate that benevolence isn’t the distinguishing characteristic of social media platforms?

For example, like others, I:

  • purge my historical tweets (because bad actors can easily take them out of context)
  • use authenticator apps
  • use browser extensions that block ads and trackers
  • delete unused online accounts and profiles (unless of course you really still need your ICQ account)
  • rarely connect distinct apps (e.g., Google with Dropbox)

I’m quite sure I take a number of other, likely unconscious, steps that I’ve picked up over time for privacy’s sake. For instance, after thinking about this for a couple of minutes I remembered that I installed an app on my website that aims to limit brute-force login attempts. And it strikes me that many of the conscious (and unconscious) steps that I take are rarely enabled by the platforms that are thirsty for data: There’s no bulk delete button on Twitter; there’s no “unfollow all the pages I currently follow” on Facebook; and so on.
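For those who want to purge their tweet history, the absence of a bulk delete button means scripting it yourself. Below is a minimal sketch of that idea in Python, assuming the tweepy library (v4+) and a Twitter developer account with read/write credentials; the credential placeholders are hypothetical and would need to be replaced with your own.

```python
# Minimal sketch: bulk-delete your own recent tweets via the Twitter API.
# Assumes tweepy (v4+) and developer credentials with write access;
# the placeholder strings are hypothetical and must be replaced.
import tweepy

client = tweepy.Client(
    consumer_key="CONSUMER_KEY",
    consumer_secret="CONSUMER_SECRET",
    access_token="ACCESS_TOKEN",
    access_token_secret="ACCESS_TOKEN_SECRET",
)

me = client.get_me()

# Page through the authenticated account's recent tweets and delete each one.
for tweet in tweepy.Paginator(
    client.get_users_tweets, me.data.id, max_results=100, user_auth=True
).flatten():
    client.delete_tweet(tweet.id)
    print(f"Deleted {tweet.id}")
```

Note that the user timeline endpoint only exposes an account’s most recent tweets (roughly the last 3,200), so older material may require requesting and processing a full archive download instead.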

What steps do you take to minimize the likelihood of your social media data being used in unanticipated ways?

Tri-council guidance on using online public data in research

Posted on March 9th, by George Veletsianos in emerging technologies, NPS, open, scholarship.

I am often asked whether there are Canadian ethics guidelines on the use of online public data in research. The relevant section from the Tri-Council Policy Statement: Ethical Conduct for Research Involving Humans is provided below. I believe that researchers should take further steps to protect privacy and confidentiality pertaining to public data, but with regard to accessing and using public online data, this is a start.

A sample project to which these guidelines may apply is the following: The researcher will collect and analyze Twitter profiles and postings of higher education stakeholders (e.g., faculty, researchers, administrators) and institutional offices (e.g., institutional Twitter accounts). This research will use exclusively publicly available information. Private Twitter accounts (i.e., those that are not public and involve an expectation of privacy) will be excluded from the research. The purpose of the research is to gain a better understanding of Twitter metrics, practices, and use/participation.
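As a practical aside, the restriction to public accounts can be enforced at the data-collection step itself. Here is an illustrative sketch in Python, assuming the tweepy library and a Twitter API bearer token; the account handles are hypothetical placeholders. It checks each profile’s protected flag and skips any account that is not fully public.

```python
# Illustrative sketch: collect postings only from public (non-protected) accounts.
# Assumes tweepy and a bearer token; the handles below are hypothetical.
import tweepy

client = tweepy.Client(bearer_token="BEARER_TOKEN")

handles = ["example_faculty_member", "example_institution"]  # hypothetical accounts

for handle in handles:
    user = client.get_user(username=handle, user_fields=["protected"])
    if user.data is None or user.data.protected:
        # Protected accounts involve an expectation of privacy -- exclude them.
        continue
    tweets = client.get_users_tweets(user.data.id, max_results=100)
    for tweet in tweets.data or []:
        print(handle, tweet.id, tweet.text)
```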

 

=== Begin relevant Tri-Council guidance ===

Retrieved on December 12, 2014 from http://www.pre.ethics.gc.ca/eng/policy-politique/initiatives/tcps2-eptc2/chapter2-chapitre2/

REB review is also not required where research uses exclusively publicly available information that may contain identifiable information, and for which there is no reasonable expectation of privacy. For example, identifiable information may be disseminated in the public domain through print or electronic publications; film, audio or digital recordings; press accounts; official publications of private or public institutions; artistic installations, exhibitions or literary events freely open to the public; or publications accessible in public libraries. Research that is non-intrusive, and does not involve direct interaction between the researcher and individuals through the Internet, also does not require REB review. Cyber-material such as documents, records, performances, online archival materials or published third party interviews to which the public is given uncontrolled access on the Internet for which there is no expectation of privacy is considered to be publicly available information.

Exemption from REB review is based on the information being accessible in the public domain, and that the individuals to whom the information refers have no reasonable expectation of privacy. Information contained in publicly accessible material may, however, be subject to copyright and/or intellectual property rights protections or dissemination restrictions imposed by the legal entity controlling the information.

However, there are situations where REB review is required.

There are publicly accessible digital sites where there is a reasonable expectation of privacy. When accessing identifiable information in publicly accessible digital sites, such as Internet chat rooms, and self-help groups with restricted membership, the privacy expectation of contributors of these sites is much higher. Researchers shall submit their proposal for REB review (see Article 10.3).

Where data linkage of different sources of publicly available information is involved, it could give rise to new forms of identifiable information that would raise issues of privacy and confidentiality when used in research, and would therefore require REB review (see Article 5.7).

When in doubt about the applicability of this article to their research, researchers should consult their REBs.

=== End relevant Tri-Council guidance ===

On Teacherbot rights

Posted on January 11th, by George Veletsianos in E-learning, emerging technologies, online learning, pedagogical agents, scholarship, work.

Pause for a few more minutes and imagine that future in which technologies teach humans. Call them robots, bots, chatbots, algorithms, teaching machines, tutoring software, agents, or something else. Regardless, consider them technologies that teach. Now consider their rights.

Assuming that teaching bots can exhibit (algorithmic) intelligence, can behave with some sort of (algorithmic) morality, can learn, can plan their interactions with students and make choices about them, and overall behave somewhat independently… what rights do they have, or should they have, as non-human entities, as teachers?

Consider this scenario: A teaching bot teaches independently in an online course. It (S/he?) develops a novel pedagogical approach wherein student test scores are maximized for some, but not all, students. University administrators, in collaboration with an edtech company, learn of this and would like to intervene to ensure that every student is served in an equitable manner. They are considering refining the underlying code that runs the bot. If unsuccessful, they are considering replacing the bot with a new one.

What are the bot’s rights? Does it have the right to protest this change? Does it have the right to its life? Does it have the rights that all other workers have?

 

Follow-up: Some background reading on ethical principles for robots.

Web browser extension that filters offensive content

Posted on January 5th, by George Veletsianos in emerging technologies, networked scholars.

“Nikola Draca, a third-year statistics student, and his colleague, Angus McLean, 23, an engineering student at McGill University, put their heads together to develop an extension called Soothe for the Google Chrome web browser that blurs out homophobic, racist, sexist, transphobic and other hateful language while browsing the web.” Source

I thought this was interesting, because:

  • It’s yet another example of how student work can contribute meaningfully to society
  • It attempts to take back (some) control from platforms, and enable individuals to refine the experiences they have online
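The core mechanic behind such an extension, matching page text against a blocklist and masking whatever matches, is easy to illustrate. The sketch below shows the idea in Python, stripped of any browser plumbing; it is not Soothe’s actual code, and the blocklist entries are hypothetical stand-ins for a real list of offensive terms.

```python
# Conceptual sketch of blocklist-based masking (not Soothe's actual code).
import re

BLOCKLIST = ["slur1", "slur2"]  # hypothetical stand-ins for offensive terms

# One case-insensitive pattern that matches blocklisted terms as whole words.
pattern = re.compile(
    r"\b(" + "|".join(re.escape(term) for term in BLOCKLIST) + r")\b",
    flags=re.IGNORECASE,
)

def blur(text: str) -> str:
    """Replace each blocklisted word with a same-length run of block characters."""
    return pattern.sub(lambda m: "█" * len(m.group(0)), text)

print(blur("An example sentence containing slur1."))
```

A browser extension would do much the same with a content script that walks the page’s text nodes, applying a blur style to matches rather than rewriting the text.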

Related initiatives include the following:

 

Imagine a future in which technologies teach humans

Posted on October 17th, by George Veletsianos in emerging technologies, my research, online learning, papers, scholarship.

Pause for a few minutes and imagine a future in which technologies teach humans. Call them robots, bots, chatbots, algorithms, teaching machines, tutoring software, agents, or something else. Regardless, consider them technologies that teach.

[Image: robo_teacher. Vector created by Freepik.]

How far into the future is that time?

What do these technologies look like? Are they anthropomorphous? Are they human-like? In what ways are they human-like? Do they have voice capabilities, and if so, do they understand natural language? Are they men or women? Do they have a representation in the way that one would imagine a teacher – such as a pedagogical agent – or do they function behind the scenes in ways that seem rather innocuous – such as the Mechanical MOOC?

Do these technologies teach humans of all ages? Do they teach independently, support human teachers, or do human teachers assist them? Are they featured in articles in the New York Times, The Guardian, and The Economist as innovations in education? Or, are they as common as desks and chairs, and therefore of less interest to the likes of the New York Times? Are they common in all learning contexts? Who benefits from technologies that teach? Is being taught by these technologies better or worse than being taught by a human teacher? In what ways is it better or worse? Are they integrated in affluent universities and K-12 schools? Or, are they solely used in educational institutions serving students of low socioeconomic status? Who has access to the human teachers and who gets the machines? Are they mostly used in public or private schools?

How do learners feel about them? Do they like them? Do they trust them? How do learners think that these technologies feel about them? Do they feel cared for and respected? How do learners interact with them? How do human teachers feel about them? Would parents want their children to be taught by these technologies? Which parents have a choice and which parents don’t? How do politicians feel about them? How do educational technology and data mining companies view them?

Do teaching technologies treat everyone the same based on some predetermined algorithm? Or, are their actions and responses based on machine learning algorithms that are so complex that even the designers of these technologies cannot predict their behaviour with exact precision? Do they subscribe to pre-determined pedagogical models? Or, do they “learn” what works over time for certain people, in certain settings, for certain content areas, for certain times of the day? Do they work independently in their own classroom? Or, do colonies of robo-teachers gather, share, and analyze the minutiae of student life, with each robo-teacher carefully orchestrating his or her next evidence-based pedagogical move supported by petabytes of data?

Final question for this complicated future, I promise: What aspects of this future are necessary and desirable, and why?

MOOCs and Open Education in the Developing World symposium

Posted on August 24th, by George Veletsianos in emerging technologies, moocs.

A thought-provoking pre-conference symposium is being organized and facilitated on October 17th by Curt Bonk, Sheila Jagannathan, Tom Reeves, and Tom Reynolds at this year’s E-learn conference. It’s focused on a variety of innovations pertaining to online learning in the context of the developing world. While some research demonstrates that socioeconomic divides persist in the context of MOOCs used by US learners, the symposium organizers note that “minimal attention has been placed on how developing countries and regions of the world are taking advantage of these new forms of technology-enabled learning” and “this is exactly where exciting and impactful innovations are currently occurring.”

Beyond the impressive list of presenters, I appreciate:

  • the diverse organizations represented here, which include universities, polytechnics, non-profits, NGOs, and financial institutions
  • the main question behind the symposium: How do innovations work in different contexts, for whom, why, and what can we learn from one another?

Plus: It’s in Vancouver, off the coast of Canada’s paradise, Victoria ;)

AERA statement and #edtech research

Posted on August 18th, by George Veletsianos in E-learning, emerging technologies, scholarship.

What appears below is a copy of the AERA Statement on the Hateful Acts in Charlottesville. I am posting it here because there’s a tendency in our field to focus on instruction and learning that is effective, efficient, and engaging without considering that we need to evaluate instruction/learning in the context of larger societal needs. What’s the value of an effective programming course if it leaves behind traditionally disenfranchised groups? This reminds me of Tom Reeves and his efforts to encourage us all to engage in socially responsible research that addresses the urgent problems of our time.

AERA statement

The American Educational Research Association condemns racism in all its forms and joins others throughout our nation in the fight to eradicate hate, injustice, and racial violence. The recent events in Charlottesville not only make visible how White supremacy, racism, antisemitism, religious persecution, homophobia, and xenophobia continue to permeate our society, but also remind us of the critical importance of studying, analyzing, and broadly communicating about these patterns and structures. Our social responsibility as a community of education researchers is to engage in producing knowledge and to share that knowledge with clarity and integrity.

 

A wide range of scholarship can and must be used to inform and engage current and future generations in the multiple stories of our pasts, the realities of our presents, and the critical demands of our futures. We need to uncover and analyze how our educational system is connected to our past and present legacies of racism in all of its forms—how our institutions and practices persistently reproduce inequities. We must also develop the knowledge and evidence that can lead to practices and policies that address hate, support understanding and respect of others, and disrupt the divisive patterns of disparity and denigration. Researchers, together with educators across all levels of education, must confront the racism, xenophobia, power and privilege, and injustice that permeate the ordinary life of our nation and world and interrogate and teach the histories of our past. No one should leave our educational institutions thinking that the expressions of hate that were on display in Charlottesville are just legitimate “points of view” or acceptable acts of “free speech.” No one should leave our classrooms or campuses believing that the symbols of oppression and killing are mere logos.

 

Education is fundamentally about our futures as a nation and a world, for education can empower the next generation of human beings who can promote and protect human rights, build institutions, make laws, create knowledge and art, and imagine and make possible a just world. AERA is committed to providing the knowledge base and working with other scientific organizations to support educators and others in our communities to be able to confront hate and to teach all people to know the histories of slavery, racism, genocide, inhumanity, oppression, colonialism, and White supremacy, as well as to know and learn from the stories of those who have fought and devoted their lives to justice. We strive to make known and foster the use of research on institutional and individual factors that engender prejudice and acts of violence against groups. As researchers, we must be prepared to support educators with tools, knowledge, and expertise to notice, name, deal with, and confront these issues as they arise in our contemporary world, our communities, and in our institutions and classrooms.

 

Now is the time, as new school and academic years commence, to ensure that we do not ignore or forget the realities that underlie what we have just experienced nor resume a normalcy that belies the scholarship that we have. AERA is committed to continuing this conversation as we go from city to city. It is our priority in planning for the 2018 Annual Meeting in April in New York and speaks to the very heart of this year’s theme—“The Dreams, Possibilities, and Necessity of Public Education.”


Deborah Loewenberg Ball, AERA President

Felice J. Levine, AERA Executive Director

Research Dissemination, Research Mobilization, and Reaching Broader Audiences

Posted on June 21st, by George Veletsianos in emerging technologies, engagement, my research, open, research shorts.

I gave an ignite talk at the Canadian Society for the Study of Higher Education in early June, sharing some of the lessons learned in creating whiteboard animation videos for mobilizing research and reaching broader audiences. We’ve now turned that talk into a whiteboard animation video. It’s all very meta. Here it is below: