AI and Our Next Conversations in Higher Education

A Q&A with Instructure's Ryan Lufkin

In the last few years, technology industry press coverage has focused mostly on the new and remarkable capabilities AI offers. It seems our dream functionalities have been delivered, with more yet to be imagined. And the drama of the tech giants on the world stage has been both entertaining and a little frightening. This might seem like everything you could want in a major technological shift, but is it?

Fortunately, in the education sector, we have a different perspective. We still hear the voices of leaders asking us to consider the best uses and adoption of the technology, just as they always have when it comes to any cutting-edge technology applied in education. One such voice is Ryan Lufkin, vice president of global strategy for Instructure, makers of the market-leading Canvas learning platform. Here, CT asks Lufkin how the focus of AI topics in education will shift in the coming months, from the latest cool features and functions to the rigorous evaluation of implementations meant to support the enduring values of our higher education institutions.

[Image: stylized illustration of people conversing on headsets (https://campustechnology.com/-/media/EDU/CampusTechnology/2026/01/20260112highereducationconversation.jpg)]

When transformative technologies finally become established and familiar to us, our discussions focus less on the technologies themselves and more on the best strategies to solve problems with them. (Image by AI: Microsoft Image Creator by Designer.)

Mary Grush: In higher education, how will our conversations about AI change in the coming months?

Ryan Lufkin: In 2026, the AI conversation in education will move from experimentation to accountability, and that's a good thing.

Grush: That sounds like a really good idea! What are some areas where that will likely show up?

Lufkin: Institutions will need to focus on governance, including transparency, vendor selection and management, ethics, and academic integrity, while also demonstrating what has actually improved.

Grush: That's such a comprehensive range of things to consider. Overall, what's the single most crucial factor as the AI conversation in education shifts, as you say, from experimentation to accountability?

Lufkin: Without a doubt, it's our absolute requirement for student data privacy in training AI tools.

That is a hard rule. And if you aren't a vendor experienced in the higher education space, you may think that rule is negotiable, and it's absolutely not. So, at Instructure we spend a great deal of time working with our partners and our institutions to say, look, as you're selecting vendors, or as you're building this AI infrastructure, you need to make data security, data privacy, and data accessibility the non-negotiable requirements for any of those processes.
