I recently had the privilege of judging a statewide business competition for high school students. Each team tackled role-play scenarios, written exams and oral presentations. These students had advanced through local and regional rounds, dedicating weeks to preparation. The winners? They get to celebrate at Disney World!
The teams in my judging zone were challenged to modernize the creation of a cookbook. Despite their marketing strategies and tech-savvy delivery models, not one of them used AI. Not even once.
Surprised, I spoke with their mentors. It turns out the students were explicitly told not to use AI. They're not alone: a 2023 Education Week survey found that more than 50% of U.S. teachers discouraged the use of AI tools like ChatGPT in the classroom. While concerns about plagiarism and academic integrity are valid, are we missing a bigger opportunity? Instead of banning AI, why aren't we teaching responsible, creative and ethical use of it?
This tension isn’t unique to the classroom. It's happening in our conference rooms, too. Do we prohibit AI to avoid risk, or do we embrace it with guardrails to unlock innovation?
Ed Catmull's book Creativity, Inc. offers a valuable lens through which to view this dilemma. While the book isn’t about AI, it explores the principles that built Pixar — a company rooted in creativity, trust and innovation. The lessons are just as relevant for today’s AI conversation as they were for animation pioneers.
Here are some Pixar-inspired takeaways we can apply to AI in the workplace:
- Creativity thrives in safe, trusting environments. Pixar didn’t replace artists with technology—they empowered them. Generative AI can do the same. For those who struggle with drawing, writing or coding, AI opens new creative doors.
- People over process. Pixar valued the human side of technology. Likewise, AI is created and refined by people. It reflects our values, biases and aspirations. We must teach that understanding AI is part of understanding ourselves.
- Fail forward. Pixar embraced iteration and learning from mistakes. AI models do, too. We should encourage experimentation, learning and growth with AI. Too many of us fear AI, and teaching that fear only embeds it more deeply.
- Transparency builds trust. Pixar’s leadership promoted open dialogue and honest feedback. In AI, this means asking: Who built the model? What data was used? When and how was it trained? Ethics should be part of every AI conversation.
- Art and tech can coexist. AI isn't just about code. It’s about creativity, empathy and thoughtful problem-solving. It should be taught as both a technical tool and a human amplifier.
But let's bring it back to our world. We’re IT professionals, educators and leaders navigating an AI-enabled world. For those who are skeptical or unsure where to start, here’s how to begin:
- Use ITIL principles. Start where you are: choose one scenario and ask how AI could help.
- Collaborate, don’t dictate. People resist tools they feel are being forced on them. Make AI adoption a team effort. No one likes projects that are “done to” them rather than “done with” them.
- Start with the familiar. Grammarly, Microsoft Copilot, and Zoom AI are already part of many toolkits. Look at your own ITSM tools and call center software to see where AI is being introduced. Let these pave the way for deeper exploration into tools like ChatGPT, DALL-E or Synthesia.
- Build trust. Share results. Audit outcomes. Invite feedback. People trust what they understand.
- Be an AI Champion. If you’ve gained trust, use it. Demonstrate how AI makes you faster, smarter, more secure, and more creative. Show that it's not replacing you — it's enhancing you.
Pixar's Toy Story taught us to go "to infinity and beyond!" Let's equip the next generation — and ourselves — with the tools and mindset to do just that.
Susan Smith is a Program Manager at GTS Technology and a 2025 HDI Featured Contributor. Connect with Susan here on LinkedIn.