
Statewide AI Training Webinars

March 10, 2026: AI, Energy, and Water

Bert Baumgaertner (UI) and Joel Gladd (CWI) presented on AI’s energy and water footprint. We’d promised jobs too, but the environmental research alone filled the session; jobs and the workforce will get their own webinar.

Bert opened with interpretive framing around individual vs. collective obligations and descriptive vs. normative claims. He also provided guidance on what works or doesn’t when using arguments by analogy. This framing resurfaced throughout the presentation.

From there, we looked at what data centers actually do, how efficiency is changing as legacy systems transition to newer closed-loop designs, and why the energy source matters as much as the cooling technology. Then we got into the cost of prompting. “AI uses x energy” is a meaningless statement without specifying the type of prompt: the compute required for a simple text query, a reasoning model, and video generation varies by orders of magnitude. We also saw how, for many faculty, Zoom meetings dominate their energy and water footprint. We compared a variety of Gen Z vs. faculty scenarios to see how daily digital habits translate into water and energy calculations.

To help with in-class workshops and discussions, Joel created a companion website (see below) where students and faculty can inventory their daily digital habits and see their water and energy totals. The idea is to navigate these conversations with students by understanding how AI use fits into our entire digital lifestyle.

Your Digital Life Website
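The habit-to-footprint arithmetic behind these scenarios can be sketched as a simple calculator. Every per-activity figure below is a placeholder assumption for classroom discussion, not a measured value (real footprints vary by orders of magnitude with model, data center, and grid mix), and the function and scenario names are hypothetical:

```python
# Illustrative back-of-envelope calculator for a day's digital energy/water use.
# All per-activity figures are ASSUMED placeholder values for discussion only.

# (Wh of electricity, liters of water) per unit of activity -- assumed values
FOOTPRINTS = {
    "text_query": (0.3, 0.01),        # one simple chatbot prompt
    "reasoning_query": (3.0, 0.10),   # one long reasoning-model prompt
    "video_generation": (100.0, 3.0), # one short AI video clip
    "zoom_hour": (150.0, 4.0),        # one hour of video conferencing
}

def daily_footprint(activities: dict) -> tuple:
    """Sum energy (Wh) and water (L) for a day's activity counts."""
    wh = sum(FOOTPRINTS[k][0] * n for k, n in activities.items())
    liters = sum(FOOTPRINTS[k][1] * n for k, n in activities.items())
    return wh, liters

# Hypothetical faculty day: a few prompts plus three hours of Zoom
faculty_day = {"text_query": 10, "reasoning_query": 2, "zoom_hour": 3}
wh, liters = daily_footprint(faculty_day)
```

Even with made-up numbers, the structure makes the webinar’s point visible: under these assumptions the Zoom hours, not the chatbot prompts, dominate the day’s totals.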

 

Webinar 7 Slides

February 24, 2026: Redesigning for AI

In this third AI Catalyst Show-and-Tell, Samra Culum (Education, CSI) and Nick Lambertson (Accounting, CSI) walked us through how they redesigned large portions of their courses for AI. Both provided extremely useful insight into their thinking process around AI implementation, where they previously ran into problems, and how their most current assignments solve those issues. I also very much appreciated the discussion around how much time these efforts took them. Some updates were simple, others more labor-intensive and iterative. Samra explained how she updated a range of materials: her course syllabus, major assignments (including tags clarifying AI/no-AI use), and rubrics. Nick showed how he built a simulated business environment for accounting students using ChatGPT and Claude to generate realistic transactions and source documents (including Monopoly money and 3D-printed supplies).

Webinar 6: Course Redesign Slides

February 10, 2026: Agentic AI

Kevin Rank, Liza Long, and Joel Gladd presented Webinar 5 in the Idaho AI Catalyst Initiative series on agentic AI. The webinar defined “agentic AI,” offered a brief intro to how it differs from pre-2025 genAI interactions, then demonstrated how we’re using agents in our own workflows (especially for meeting the April ADA deadline!).

To define “agent,” we used Anthropic’s definition because it’s precise. An agent is a system where the LLM directs its own processes and tool usage. A chatbot waits for your prompt, responds, and stops. An agent takes a high-level goal and figures out the steps on its own. It acts, evaluates, revises, and loops until the task is done. Kevin walked through two agentic browsers, Perplexity Comet and Chrome with Gemini. Both can read pages, fill out forms, summarize across tabs, and complete tasks autonomously. Kevin found that agents could even complete a McGraw Hill SimNet Excel simulation. Liza demonstrated using the Claude browser extension to remediate Pressbooks content for ADA compliance ahead of the April 24th Title II deadline. Her workflow fixes heading styles, adds alt text, cleans up Creative Commons attributions, and corrects non-descriptive links, while emphasizing the need to “always be the human in the loop.” I showed how agentic CLIs like Codex and Claude Code use “skills” (think custom GPTs but at the terminal level) to pull Canvas and Pressbooks content via API, evaluate it for accessibility, and push fixes back.
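The “evaluate it for accessibility” step in a workflow like this can be sketched with Python’s standard-library HTML parser. This is a minimal illustration, not the actual Catalyst tooling: the two checks (missing alt text, non-descriptive link text) mirror fixes mentioned above, and all names and thresholds are hypothetical:

```python
# Minimal sketch of an accessibility check an agent might run on page HTML
# pulled from Canvas or Pressbooks. Illustrative only -- not a full WCAG audit.
from html.parser import HTMLParser

VAGUE_LINK_TEXT = {"click here", "here", "read more", "link"}

class A11yChecker(HTMLParser):
    def __init__(self):
        super().__init__()
        self.issues = []
        self._in_link = False
        self._link_text = ""

    def handle_starttag(self, tag, attrs):
        attrs = dict(attrs)
        # Flag images with no alt attribute (or an empty one)
        if tag == "img" and not attrs.get("alt"):
            self.issues.append(f"img missing alt text: {attrs.get('src', '?')}")
        if tag == "a":
            self._in_link = True
            self._link_text = ""

    def handle_data(self, data):
        if self._in_link:
            self._link_text += data

    def handle_endtag(self, tag):
        # Flag links whose visible text doesn't describe the destination
        if tag == "a":
            self._in_link = False
            if self._link_text.strip().lower() in VAGUE_LINK_TEXT:
                self.issues.append(
                    f"non-descriptive link text: {self._link_text.strip()!r}")

def check_html(html: str) -> list:
    checker = A11yChecker()
    checker.feed(html)
    return checker.issues

issues = check_html('<img src="chart.png"><a href="/syllabus">click here</a>')
```

A real agent would wrap checks like these in authenticated GET/PUT calls against the Canvas or Pressbooks REST APIs, and, as Liza stressed, keep a human reviewing each fix before it gets pushed back.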

We also talked about the risks. Agents continuously take screenshots, which creates FERPA concerns. Prompt injection is an ongoing problem. And existing AI policies weren’t written for tools that can complete a student’s Canvas assignments without the student ever reading them. The slides provide a few resources for thinking about “AI Governance,” including a recent whitepaper and Anna Mills’s substack on agentic browsers.

Webinar 5: Agentic AI Slides

January 27, 2026: NotebookLM and Deep Research

This show-and-tell session focused on NotebookLM and Deep Research. Kevin Rank (BSU), Abraham Romney (ISU), and Bert Baumgaertner (UI) demonstrated how they use NotebookLM and Deep Research as part of their course prep and teaching. Kevin presented a detailed guide for getting the most out of NotebookLM. Abraham then offered an overview of deep research as a tool that employs AI agents and retrieval augmented generation to quickly expand initial queries and help identify gaps in literature. Bert demonstrated a reflective workflow, using NotebookLM to iterate on his ideas, create custom visualizations like decision trees from course textbooks, and use Gemini to generate complex LaTeX code for publication-level graphics. The Catalysts also addressed common challenges, such as the organization of sources within NotebookLM and the current limitations of sharing notebook spaces with students using institutional accounts.

Webinar 4: NotebookLM and Deep Research Slides

November 17, 2025: AI + Academic Integrity

Here is the presentation slide deck.

This third installment in the Catalyst AI Foundations for Faculty series is a supplement to the previous webinar on AI + Assessment. As we improve our course design to become more AI-informed, we’re still faced with a major challenge: how do we handle academic integrity? Teaching has always required some level of enforcement and rigor; we’ve always had to hold students accountable for their work. But the rapid spread of ChatGPT and agentic browsers is eroding our pre-2022 approach to academic integrity. This is particularly true of online and hybrid courses, but everyone feels it.

In Webinar 3, the Idaho AI Catalyst series will cover all things academic integrity and related ethical questions, such as whether faculty should even require students to use AI (or ban it?), how to select policies that work best for your situation, and whether process-tracking tools such as Grammarly Authorship or Google Docs History should be part of your grading toolkit. We’ll also provide tips for communicating your course expectations clearly and how to hold students accountable without compromising your own integrity as an educator. The webinar will build on the work of former Idaho AI Fellows, Liza Long and Jason Blomquist, updated for Fall 2025 developments, such as the rise of agentic browsers.

Joel Gladd (College of Western Idaho) will provide an overview of recent AI developments and how they impact academic integrity, as well as the different ethical models and tactics that relate to these discussions. Catalyst members Taylor Waring (North Idaho College), Heidi Tighe (College of Southern Idaho), and Abraham Romney (Idaho State University) will present examples and insights from their own experience.

Who this is for

Higher-ed faculty, staff, and administrators who engage with students and learning environments. Useful for instructors, program coordinators, student conduct leads, instructional designers, and department chairs.

What you’ll learn

  • How recent AI technologies are impacting academic integrity discussions
  • How to communicate clear expectations for when and how AI use is permitted in your course
  • AI-use acknowledgement statements that support accountability
  • Conversation strategies for responding when student work appears AI-generated

October 20, 2025: AI + Assessment: Designing for Learning in the Age of GenAI

Here is the presentation slide deck.

Since 2022, faculty have increasingly faced an onslaught of student work that’s clearly AI-generated. As a professor on the subreddit r/Professors recently put it, “You can smell the cheating a mile away.” What can faculty do? Or what can instructional designers do for faculty? On the one hand, we keep hearing there are ways to design courses and assessments that lean into AI-related durable skills; on the other hand, none of us wants to provide yet more opportunities for students to “offload” their thinking to ChatGPT. But the time and support to figure this out feel thin. This Idaho Catalyst session offers a clear, highly targeted set of moves you can use now or over the next year to update your courses. We will introduce the AI Assessment Scale, from “no AI” to “fully AI,” and show how to use it to reconfigure workshops, assignments, entire courses, or even rethink program outcomes. You will see research‑based practices, live demos, and ready‑to‑use checklists sourced from Idaho faculty who have already begun redesigning for AI.

Who this is for

Higher‑ed faculty, staff, and admins who design or evaluate learning experiences. Useful for instructors, coordinators, assessment leads, instructional designers, and department chairs.

What you’ll learn

  • The AI Assessment Scale: how to place your assignments and courses on a spectrum from no‑AI to fully‑AI, and when to use each
  • Research‑based tips for using genAI, custom chatbots, and other AI tools in ways that promote rather than hinder learning
  • Cognitive offloading: where it typically occurs, how to detect it, and how to reassign the thinking back to students
  • Practical redesigns: updating workshops, assignments, projects, and whole courses without starting from scratch
  • Faculty spotlights: demonstrations from Idaho instructors who responded to student behaviors while increasing durable skills
  • Community connections: how to connect with Idaho higher‑ed colleagues doing this work

September 22, 2025: GenAI Starter Pack for Higher-Ed Faculty

Research shows there’s a “J-curve” effect to AI adoption: faculty who adopt generative AI initially spend more time on daily tasks, but once they’ve reached a certain threshold of familiarity, their workflows become more efficient, freeing them for higher-level thinking and interactions. The problem is that, for many, the initial adoption curve feels encumbered with needless jargon and layers of complexity. This first Idaho AI Catalyst Webinar is for higher-ed faculty, staff, and admins who are new to generative AI or at an intermediate level. Come if you want to see how power users fold AI into real faculty workflows and move past the early learning curve with confidence. We will demystify key terms, show clear steps you can use today, and share best practices from colleagues across Idaho. You will leave with a toolkit and a simple plan to stay current as the field moves quickly, tailored to Idaho campuses. If you are already experienced, you will pick up ways to train others and sharpen your own practice. Everyone will be provided with resources to become part of a larger community of practice in Idaho.

What you’ll learn:

  • AI Literacy Basics
  • How Generative AI Works
  • Prompt Engineering Basics
  • Advanced Techniques
  • Common Tools for Faculty Workflows
  • Resources for becoming part of Idaho’s AI community of practice

March 13, 2025: AI for Writing and Assessment

Discover how to harness genAI tools to transform writing instruction and assessment in higher education. This interactive workshop explores practical strategies and ethical considerations for using AI to enhance student learning and writing, featuring OER resources designed to empower both educators and students. It is designed for instructors who give writing assessments of any kind. Come prepared to work on an assignment prompt that integrates AI in practical, productive, and ethical ways.

 

February 6, 2025: AI Citation, Attribution, and Detection

Transparency in the use of AI tools in academia is a necessary and ongoing debate. This session will share ideas on suggested formats and strategies for citation and attribution of AI use. Additionally, we will discuss current guidance on the use of AI detectors, their strengths and weaknesses, and the potential impacts on faculty-student relationships. Participants will learn how to:

  • Evaluate appropriate citation formats and strategies for acknowledging AI tool usage in academic work.
  • Analyze the capabilities and limitations of current AI detection tools to make informed decisions about their application in academic settings.
  • Develop practical approaches for maintaining productive faculty-student relationships while addressing AI use in academic work.

The slide deck can be viewed here.

November 4, 2024: GenAI Enhancing Student Engagement and Learning Workshop

Join us for a discussion and workshop on incorporating GenAI tools to enhance student engagement and learning. In this workshop, we will:

  • share examples of AI teaching projects from across Higher Ed
  • discuss opportunities and challenges in using AI with students
  • share ideas and experiences from Idaho faculty using AI in their classrooms.

Take this opportunity to explore with colleagues from across the state as we consider how to thoughtfully use GenAI to support student learning. Featuring Roger Plothow (CEI), Joel Gladd (CWI), and Sarah Llewelyn (BSU).

There is no slide deck for this webinar.

 

October 4, 2024: Ethical Considerations in AI Integration Workshop

In this workshop, we will examine the ethical implications of using AI in higher education and develop strategies to address potential concerns. Through the materials and activities, you will:

  • understand ethical issues related to AI in education including academic integrity
  • identify best practices for ethical AI use in the classroom
  • develop strategies to mitigate potential ethical challenges.

The slide deck can be viewed here.

 

September 5, 2024: Introduction to Using Generative AI Tools

This presentation provides background and hands-on experience for Idaho educators who want to learn more about generative AI tools.

The slide deck can be viewed here.

License

Icon for the Creative Commons Attribution-NonCommercial 4.0 International License

A Guide to Teaching and Learning with Artificial Intelligence Copyright © by Jason Blomquist; Liza Long; and Joel Gladd is licensed under a Creative Commons Attribution-NonCommercial 4.0 International License, except where otherwise noted.