Author: George Veletsianos

#oer23 presentation: open access to research

Enilda, Josh, and I are working on a project examining the degree to which access to education research is available to the public, bringing together research interests that all three of us have had for a long time now. Enilda presented some of our early findings this week at OER23 in Scotland and shared her reflections here. Our slides are available at tiny.utk.edu/OER23

Part of the fun in this work is figuring out how to bring together a set of APIs that allow us to programmatically retrieve data about published journal papers from different services (e.g., see Josh’s post).
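The post doesn’t spell out which services we’re stitching together, but as a rough sketch of what this kind of programmatic retrieval can look like, here is a minimal Python example that pairs Crossref (bibliographic metadata by DOI) with Unpaywall (open access status by DOI). The endpoints are real, but the particular pairing, the fields pulled out, the contact email, and the example DOI are illustrative assumptions rather than a description of our actual pipeline.

```python
import requests

CROSSREF_API = "https://api.crossref.org/works/"   # DOI -> bibliographic metadata
UNPAYWALL_API = "https://api.unpaywall.org/v2/"    # DOI -> open access status
CONTACT_EMAIL = "you@example.edu"                  # Unpaywall asks for a contact email

def fetch_metadata(doi: str) -> dict:
    """Retrieve basic bibliographic metadata for a DOI from Crossref."""
    resp = requests.get(CROSSREF_API + doi, timeout=30)
    resp.raise_for_status()
    msg = resp.json()["message"]
    return {
        "title": (msg.get("title") or [""])[0],
        "journal": (msg.get("container-title") or [""])[0],
        "year": msg.get("issued", {}).get("date-parts", [[None]])[0][0],
    }

def fetch_oa_status(doi: str) -> dict:
    """Retrieve open-access status for a DOI from Unpaywall."""
    resp = requests.get(UNPAYWALL_API + doi, params={"email": CONTACT_EMAIL}, timeout=30)
    resp.raise_for_status()
    data = resp.json()
    return {"is_oa": data.get("is_oa"), "oa_status": data.get("oa_status")}

if __name__ == "__main__":
    doi = "10.1080/00461520.2014.1002924"  # example DOI; substitute any DOI of interest
    record = {**fetch_metadata(doi), **fetch_oa_status(doi)}
    print(record)
```

Other services (e.g., OpenAlex) expose similar DOI-keyed endpoints, so the same pattern extends to whatever combination of sources a project ends up using.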

 

opening up research through self-archiving practices

Issues that hybrid, online, and blended modes of teaching and learning introduce to collective agreements and bargaining

A few weeks ago, I was invited to offer input to a committee at a Canadian university examining issues that hybrid, online, and blended modes of teaching and learning introduce to collective agreements and bargaining. I appreciated that the committee identified experts to speak with in order to gain an evidence-informed understanding of the issues they were facing, rather than allowing their deliberations to be guided by assumptions and beliefs (which, to be honest, many of the conversations around modality default to!).

I thought the questions I was asked were relevant to many, and so I am sharing them below. The gist of my responses follows each question.

  • What is your sense of the future of online, hybrid, and blended course delivery in Canadian universities?
    • Necessary, valuable, and growing. Ignore them at your own peril.
  • How do you see the work, the workload, the rights, and the responsibilities of faculty changing within this shifting terrain?
    • Rising workloads at first, but shifting over time (similar to how workload is higher when assigned a new course; opportunity to learn & explore relationship between online/hybrid and pedagogy, which may transfer to other settings). Responsibilities around quality similar, if not higher (which is unfortunate given that conversations around quality are different in relation to in-person courses). Rights: an opportunity for expanding the conversation to encompass in-person practices: reflect on ownership and where the real value of faculty lies – it’s not content.
  • What would you suggest are the biggest advantages to these delivery modes, and what would you flag as the biggest challenges that institutions face in moving towards these modes?
    • advantages: rethinking pedagogy, flexibility, supporting justice and EDI, reaching and supporting different kinds of learners; challenges: institutional infrastructure to support online/hybrid learning quality at the same level as supporting in-person.
  • What kinds of supports – technological, training, in-class, infrastructural, workload-based, or other – do you see as necessary for faculty to successfully deliver courses through online modes?
    • This is the right question to ask. It’s not just about individual skills, competencies, and perceptions – it’s about how the institutions will support these learning modalities at the system level. In addition to the ones mentioned in the question, my answer highlighted that online/hybrid learning is a team sport and noted the need for instructional design support.  
  • As part of our own deliberations, we are concerned with the process through which mode of delivery for particular courses is determined. Do you have any advice on how this best happens? Are there any lessons from experiences at other universities about this?
    • This is a difficult one, especially at a time of many circulating viruses. I emphasized the need for flexibility and a decision-making process that is based on mutual trust and cooperation, and that is informed by student input. Ideally one where decisions aren’t top-down and aren’t solely guided by individual preferences. Also: the proportion of courses that are online need not be uniform across departments.

BC’s guidelines for digital learning strategy: inadvertent effects?

As you may or may not be aware, BC has developed a digital learning strategy. Here’s an earlier draft, and some earlier thoughts. With its release coming soon, I thought I would post a final set of thoughts that apply not only to this policy, but to other policies as well. I am only posting this because I saw that the University of California recently closed a loophole that allowed learners to fully complete their degree online. That decision reminds me once again that decisions which are laser-focused on modality miss the bigger picture. Which then reminded me of the BC digital learning policy.

My reading and analysis of the guidelines coming to BC is that they raise quality standards for online and hybrid learning. This is a good thing. But, they are silent on the quality standards for in-person learning, and might therefore have inadvertent effects.

Because of the focus on a specific modality, the strategy creates a de facto level of standard for digital learning courses/programs/efforts that is higher than that for in-person courses/programs/efforts. While the document focuses on guidelines for “technology-enhanced learning,” it’s not explicit that these guidelines ought to apply to ALL courses.

In other words, the policy presumes that guidelines are unnecessary for in-person courses, or at the very least outside the purview of the policy. As one example, note how the following important guideline specifically focuses on the digital but not the in-person context:

“Digital PSE in BC must achieve true, meaningful, and lasting reconciliation with Indigenous Peoples. It should advance and implement decolonial practices, promote Indigenization, and recognize Indigenous knowledge, pedagogies, and learning. To achieve these goals, technology-enhanced learning should…”

What I’d rather see is this:

Digital PSE in BC must achieve true, meaningful, and lasting reconciliation with Indigenous Peoples. It should advance and implement decolonial practices, promote Indigenization, and recognize Indigenous knowledge, pedagogies, and learning. To achieve these goals, technology-enhanced learning should…

Or this:

Digital PSE in BC must achieve true, meaningful, and lasting reconciliation with Indigenous Peoples. It should advance and implement decolonial practices, promote Indigenization, and recognize Indigenous knowledge, pedagogies, and learning. Technology-enhanced learning provides risks and opportunities towards these goals, and in this context, institutions should… To achieve these goals, technology-enhanced learning should…

Setting a higher standard for digital learning compared to in-person learning is a problem for two reasons.

First, different levels of standards produce the very real possibility of stifling innovation in digital learning and prioritizing in-person learning. Institutions that are considering digital learning will need to contend with these guidelines, especially if they need to highlight how they are meeting them in QA and new degree approval processes. Yet it’s unclear whether in-person offerings need to account for them. By raising the bar for one kind of approach, we might inadvertently be guiding institutions toward the alternative modality.

Second, a different level of standard will impact the sector unevenly, and will disproportionately impact institutions and disciplines that are predominantly digital/online. The impacts that the strategy will have on in-person trades programs are limited compared to the impacts that it will have on education programs, which are typically blended. The impacts that it will have on smaller institutions that are exploring expanding their digital learning offerings are greater than the impacts it will have on predominantly in-person institutions.

What is a possible solution?

This is a difficult one. One approach might be to clarify and be explicit that these guidelines apply to all courses/offerings regardless of modality. Designing education with ethics, equity, and decolonization in mind ought not be limited by whether the course takes place in-person, online, or in a blended fashion. Further, any change in QA and course approval policies at the Ministry level will need to apply to all programs – not just “digital” ones.

Playing with prompt engineering in response to Hickey & Luo prompt engineering analysis

In a worthwhile and expansive post, Daniel Hickey and Qianxu Morgan Luo explore knowledge-rich prompts and the ways in which ChatGPT and Bard differ in the results they return. I thought that

  1. their exploration was interesting,
  2. that their findings were interesting (e.g., “this initial analysis suggests that ChatGPT was much more capable of interpreting and applying the article using an engineered prompt that included the abstract. But ChatGPT did not add much knowledge about cyber ranges. In contrast, Bard appeared to take away almost nothing from the abstract but knew more about cyber ranges”)
  3. but, I was convinced that the prompts could have been engineered in more nuanced ways to improve the outcome.

I also recently learned that thinking of prompts in an incremental fashion may provide more nuanced results, and so I went about recreating Hickey and Luo’s prompt in ChatGPT to see if I could generate a better outcome.

What did Hickey and Luo do?

Their knowledge-rich prompt was the following:

Use the following article by Sinatra, Heddy, & Lombardi (2015) to motivate engagement by cybersecurity learners who are using a digital range and use APA citations: Engagement is one of the hottest research topics in the field of educational psychology. Research shows that multifarious benefits occur when students are engaged in their own learning, including increased motivation and achievement. However, there is little agreement on a concrete definition and effective measurement of engagement. This special issue serves to discuss and work toward addressing conceptual and instrumentation issues related to engagement, with particular interest in engagement in the domain of science learning. We start by describing the dimensional perspective of engagement (behavioral, cognitive, emotional, agentic) and suggest a complementary approach that places engagement instrumentation on a continuum. Specifically, we recommend that instrumentation be considered on a “grain-size” continuum that ranges from a person-centered to a context-centered orientation to clarify measurement issues. We then provide a synopsis of the articles included in this special issue and conclude with suggestions for future research.

ChatGPT’s output was:

Research in educational psychology has consistently shown that engagement is a key factor in successful learning outcomes (Sinatra, Heddy, & Lombardi, 2015). For cybersecurity learners who are using a digital range, engagement is particularly important as it can enhance their motivation and achievement in this field. As Sinatra et al. (2015) note, engagement can be viewed from a dimensional perspective, encompassing behavioral, cognitive, emotional, and agentic dimensions. These dimensions can be used to design effective learning experiences that tap into students’ interests and motivations, such as through gamification or interactive simulations.

However, there is still a need for a concrete definition and effective measurement of engagement, especially in the context of science learning (Sinatra et al., 2015). To address this issue, educators in the field of cybersecurity can consider using a complementary approach that places engagement instrumentation on a continuum, ranging from a person-centered to a context-centered orientation (Sinatra et al., 2015). This approach can help clarify measurement issues and enable educators to tailor their teaching strategies to the individual needs and interests of their students.

In conclusion, engagement is a crucial factor in the success of cybersecurity learners using a digital range. By taking a dimensional perspective and using a continuum-based approach to measure engagement, educators can design effective learning experiences that promote student motivation and achievement in this field.

Reference:

Sinatra, G. M., Heddy, B. C., & Lombardi, D. (2015). The challenges and opportunities of engaging learners in the scientific practices of explanation and argumentation. Journal of Educational Psychology, 107(2), 380–397. https://doi.org/10.1037/edu0000004

What did I do?

I divided the prompt into chunks, incrementally moving from the field (cybersecurity), to the topic within the field (cyber ranges), to the application of motivation to the topic (the Sinatra et al. citation), to probing for specifics. I also used the term “cyber range” rather than “digital range,” which Hickey and Luo used. This approach generated specific and actionable recommendations, especially compared to the generic reply above. My prompts and ChatGPT’s responses are below.

Prompt: I’d like to learn more about cybersecurity. Could you help me with this topic?

Sure! I’d be happy to help you learn more about cybersecurity. What specific areas of cybersecurity are you interested in? Some possible topics within cybersecurity include:

  1. Network security
  2. Information security
  3. Application security
  4. Cloud security
  5. Identity and access management
  6. Incident response
  7. Risk management
  8. Cybersecurity laws and regulations

Let me know what you’re interested in, and I can give you some pointers on where to start.

Prompt: What do people mean when they mention “cyber range” in the context of cybersecurity?

A “cyber range” is a simulated environment used for training and testing cybersecurity skills and technologies. It is essentially a virtual training ground where individuals and teams can practice identifying, preventing, and responding to cyber threats.

Cyber ranges can be set up to simulate a variety of scenarios, such as network intrusions, phishing attacks, malware infections, and denial-of-service attacks. They may also incorporate realistic components such as simulated user behavior, fake websites, and decoy systems.

By providing a safe and controlled environment for cybersecurity training and testing, cyber ranges can help organizations and individuals develop their skills and test their defenses without risking their actual systems or data. They can also be used to evaluate and compare different cybersecurity tools and techniques.

Some cyber ranges are open to the public, while others are only available to specific organizations or groups. Some are physical facilities that require specialized equipment and infrastructure, while others are entirely virtual and can be accessed from anywhere with an internet connection.

Overall, cyber ranges are an increasingly important tool for improving cybersecurity readiness and resilience in the face of an ever-evolving threat landscape.

Prompt: I would like to motivate engagement by cybersecurity learners who are using a cyber range. Using the ideas from Sinatra, Heddy, & Lombardi (2015) that are in quotes, write how to motivate engagement and use APA citations: “Engagement is one of the hottest research topics in the field of educational psychology. Research shows that multifarious benefits occur when students are engaged in their own learning, including increased motivation and achievement. However, there is little agreement on a concrete definition and effective measurement of engagement. This special issue serves to discuss and work toward addressing conceptual and instrumentation issues related to engagement, with particular interest in engagement in the domain of science learning. We start by describing the dimensional perspective of engagement (behavioral, cognitive, emotional, agentic) and suggest a complementary approach that places engagement instrumentation on a continuum. Specifically, we recommend that instrumentation be considered on a ‘grain-size’ continuum that ranges from a person-centered to a context-centered orientation to clarify measurement issues. We then provide a synopsis of the articles included in this special issue and conclude with suggestions for future research.”

To motivate engagement in cybersecurity learners who are using a cyber range, it is important to create a learning environment that is both challenging and supportive. According to Sinatra, Heddy, & Lombardi (2015), engagement is influenced by a variety of factors, including behavioral, cognitive, emotional, and agentic dimensions. To address these dimensions and promote engagement in the cyber range, the following strategies can be implemented:

  1. Provide hands-on learning experiences: learners should be given the opportunity to actively engage with the cyber range and practice their skills in a realistic and challenging environment. This can help to increase their sense of control and agency, which is important for promoting engagement (Sinatra, Heddy, & Lombardi, 2015).
  2. Create a supportive learning environment: learners should feel supported and encouraged throughout the learning process. Providing feedback, encouragement, and support can help to foster positive emotional and cognitive engagement (Sinatra, Heddy, & Lombardi, 2015).
  3. Use a variety of instructional strategies: learners have different learning styles and preferences, and it is important to use a variety of instructional strategies to cater to these differences. For example, using videos, interactive simulations, and quizzes can help to increase cognitive engagement (Sinatra, Heddy, & Lombardi, 2015). [Sidenote from George: This is worthy of a post around AI dispensing bad advice and perpetuating myths]
  4. Set clear learning goals: providing learners with clear learning goals and objectives can help to increase their motivation and focus. Goals should be challenging but achievable, and learners should be given regular feedback on their progress (Sinatra, Heddy, & Lombardi, 2015).

Overall, creating a supportive and challenging learning environment that promotes engagement can help to increase motivation and achievement in cybersecurity learners who are using a cyber range.

Reference: Sinatra, G. M., Heddy, B. C., & Lombardi, D. (2015). The challenges of defining and measuring student engagement in science. Educational Psychologist, 50(1), 1-13. doi: 10.1080/00461520.2014.1002924

Prompt: I want to know more about #2. How do I create a supportive learning environment in a cyber range?

Creating a supportive learning environment in a cyber range involves providing learners with feedback, encouragement, and support throughout their learning process. Here are some specific strategies you can use to promote a supportive learning environment:

  1. Provide timely feedback: learners need feedback on their progress to understand what they are doing well and what they need to improve. Provide feedback as soon as possible after a learning activity and be specific about what the learner did well and what they need to work on.
  2. Encourage learners: it is important to encourage learners and recognize their efforts. Encouragement can come in many forms, such as praise, positive reinforcement, and recognition of achievements. This can help learners build confidence and feel more motivated to learn.
  3. Build a sense of community: building a sense of community among learners can help to create a supportive learning environment. Encourage learners to work together, share their experiences, and collaborate on learning activities. This can create a sense of belonging and increase engagement.
  4. Provide support resources: learners may need additional support to be successful in a cyber range. Provide resources such as online help, tutorials, and support forums to help learners overcome challenges and get the assistance they need.
  5. Use effective communication: effective communication is key to creating a supportive learning environment. Communicate clearly and frequently with learners, provide clear instructions, and be available to answer questions and provide support.

By using these strategies, you can create a supportive learning environment that promotes engagement and success in the cyber range.

Prompt: Can you give me an example of an encouraging message I can send to learners who are in a cyber range?

Sure! Here is an example of an encouraging message you can send to learners who are in a cyber range:

“Great work on completing that cyber range activity! Your hard work and dedication to learning in this field are truly commendable. Remember, every challenge you overcome is an opportunity to grow and become a better cybersecurity professional. Keep up the good work, and don’t hesitate to reach out if you need any support or assistance. You’ve got this!”

This message acknowledges the learner’s efforts, encourages them to continue their learning journey, and offers support if needed.
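The exchange above took place in the ChatGPT web interface, but the same incremental, chunked approach can be scripted. Below is a minimal sketch using OpenAI’s Python client, assuming the openai package (v1+) and an API key in the environment; the model name, the helper function, and the trimmed prompts are illustrative assumptions rather than part of the original workflow.

```python
from openai import OpenAI

client = OpenAI()  # assumes OPENAI_API_KEY is set in the environment

def ask(messages, prompt, model="gpt-3.5-turbo"):
    """Append a user turn, get the assistant's reply, and keep it in the running history."""
    messages.append({"role": "user", "content": prompt})
    reply = client.chat.completions.create(model=model, messages=messages)
    answer = reply.choices[0].message.content
    messages.append({"role": "assistant", "content": answer})
    return answer

history = []  # a shared history is what lets each prompt build on the previous ones

# 1. Establish the field.
ask(history, "I'd like to learn more about cybersecurity. Could you help me with this topic?")

# 2. Narrow to the topic within the field.
ask(history, 'What do people mean when they mention "cyber range" in the context of cybersecurity?')

# 3. Apply the motivation literature to the topic (paste the quoted abstract in place of the ellipsis).
ask(history, "I would like to motivate engagement by cybersecurity learners who are using a cyber range. "
             "Using the ideas from Sinatra, Heddy, & Lombardi (2015) that are in quotes, "
             "write how to motivate engagement and use APA citations: \"...\"")

# 4. Probe for specifics.
print(ask(history, "Can you give me an example of an encouraging message I can send to learners in a cyber range?"))
```

Keeping a single running messages list is what makes the chunked approach work: each new prompt is interpreted in light of everything the model has already been told.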

Special Issue on Trauma-Informed Instructional Design Practices in JAID

JAID recently published a special issue on trauma-informed instructional design practices, which you can read here. The abstract for the introduction to the special issue summarizes the effort well:

This special issue of JAID begins to bridge the gap between the theories of social emotional learning/trauma informed learning with instructional design offering specific cases of design and development projects that illustrate the confluence of these two broad areas. We share these articles with our ID community in the hopes of creating principles for “compassionate instructional design” (Thomas et al., 2019) through a collection of practitioner cases and research articles on applied instructional design practices that are responsive to trauma-affected learners, and which highlight the complexities of the learning context of the learners being served.

Faculty curiosities about AI tools and ChatGPT

I led an online workshop/conversation on AI for ~200 faculty at three colleges/universities who came together today to learn about the topic. It centered on the following questions. I am sharing them here for the benefit of others, but also to ask: Are there other curiosities that you are seeing locally? (Yes, I know that the most recent EDUCAUSE poll highlights cheating as a top concern, though I’m not certain it ought to be)

  • How can (should??) I use AI for the benefit of my students’ learning?
  • Is ChatGPT really the disruptor it seems to be?
  • ChatGPT (AI) and authentic assessment – can these co-exist?
  • Neither I nor my students are as tech-savvy as it is often assumed we are. How do we keep up with innovations like ChatGPT, whether they be ‘good’ or ‘bad’, and how do we learn when to embrace them or ignore them?
  • Is ChatGPT (or other AI) a blessing or a curse for higher education?

ChatGPT is the tree, not the forest.

“Not seeing the forest for the trees” is a North American idiom used to warn that focusing on the details might lead one to miss the larger issue or problem. ChatGPT is the tree. Perhaps it’s the tallest or the leafiest tree, or the one that blossomed rapidly right in front of your eyes… sort of like a Japanese flowering cherry. What does this mean for you?

If you’re exploring ChatGPT – as a student, instructor, administrator, perhaps as a community – don’t focus solely on ChatGPT. Certainly, this particular tool can serve as one illustration of the possibilities, pitfalls, and challenges of Generative AI, but making decisions about Generative AI by focusing solely on ChatGPT may lead you to decisions that are grounded in the idiosyncrasies of this particular technology at this particular point in time.

What does this mean in practice? Your syllabus policies should be broader than ChatGPT. Your taskforce and working groups should look beyond this particular tool. Your classroom conversations should highlight additional technologies.

I was asked recently to lead a taskforce to explore implications and put forward recommendations for our teaching and learning community. ChatGPT was the impetus. But our focus is Generative AI. It needs to be. And there’s a long AIED history here, which includes some of my earlier work on pedagogical agents.

 
