Webinar

Hubris, Hallucinations, and the Humbling Experience of AI Engineering

With

Nathan Peck

13 May 2025

This session is a real-life story about implementing a challenging feature using AI engineering. I attempt to add a realistic physics engine to my game, while knowing nothing about how to code physics properly. Along the way there is "vibe coding", bug fixes and refactors, challenging performance problems, complete rewrites, suffering, and finally success. Follow along to learn from my journey, or laugh at my mistakes. Either is okay!

Key Takeaways

- AI engineering still requires technically deep humans
- To maximize value, your prompts must balance specifics against freedom
- It's not all about "vibe coding"; don't forget to "vibe refactor"

Embracing AI with a Critical Eye

In the rapidly evolving field of AI, the promise of automation and efficiency is both exciting and daunting. Nathan Peck shares his firsthand experience with Amazon Q, emphasizing the importance of human oversight. "These tools are still in need of a lot of human assistance to figure out what's going on," he explains, advocating for a balanced approach where AI acts as a capable assistant rather than a replacement for human developers.

The Power of Context in AI Engineering

Peck underscores the critical role of context when working with AI agents. By setting the right starting conditions, such as organizing code logically and providing relevant files, developers can significantly improve AI output. "The more that I can get the AI agent to the right place to start working earlier, the more successful that will be," Peck notes, highlighting how smaller, decoupled modules reduce the amount of code an agent must hold in context and so improve its results.
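
To make the idea concrete, here is a minimal sketch of the kind of small, self-contained module Peck describes. The file and names are hypothetical, not taken from his game: the point is that the entire contract is visible in one file, so an agent can be pointed at it without dragging the rest of the codebase into context.

// collision.ts: a small, self-contained module an AI agent can reason
// about in isolation. (Hypothetical sketch; names are illustrative.)

export interface AABB {
  x: number;      // left edge
  y: number;      // top edge
  width: number;
  height: number;
}

// A pure function with no hidden dependencies: everything needed to
// understand or modify it is right here, which keeps the agent's
// working context small.
export function intersects(a: AABB, b: AABB): boolean {
  return (
    a.x < b.x + b.width &&
    a.x + a.width > b.x &&
    a.y < b.y + b.height &&
    a.y + a.height > b.y
  );
}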

Navigating AI Hallucinations

The talk dives into the common pitfall known as "AI hallucination," where the AI confidently produces plausible but incorrect code or assumptions. Peck recounts an instance where a single word choice in his API led to a game character walking through walls, illustrating the importance of precision in API design. "Because I chose the word 'active,' the AI has assumed that this thing should be completely taken out of the physics simulation," he explains, showcasing the need for detailed human intervention to correct these errors.
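
The pitfall is easy to reproduce. The sketch below is a hypothetical reconstruction of that naming problem (the real API from the talk may differ): an ambiguous "active" flag invites the agent to guess what inactive means, while narrower flags leave far less room for a wrong assumption.

// Ambiguous: does "inactive" mean asleep, invisible, or gone
// entirely? An agent reading this could reasonably remove the body
// from the simulation, so nothing collides with it anymore.
interface BodyV1 {
  active: boolean;
}

// More precise: each flag names exactly one behavior, for the agent
// and for future human readers alike.
interface BodyV2 {
  sleeping: boolean;         // skip integration until woken
  collisionEnabled: boolean; // still blocks other bodies
  visible: boolean;          // affects rendering only
}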

Debugging and Collaborating with AI

Through live demonstrations, Peck reveals the iterative process of debugging and refining AI-generated code. He discusses classic physics bugs, such as objects that bounce perpetually and characters that clip through walls, using these challenges to reinforce the need for careful attention to detail and collaboration with AI. "Vibe coding just isn't enough," he asserts, urging developers to leverage their expertise alongside AI tools to achieve robust solutions.
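
As one illustration of the first bug class, here is a hedged sketch of a common fix (the constants and function are illustrative, not Peck's actual code): with restitution alone, each bounce keeps a fraction of the previous speed and never quite reaches zero, so physics engines typically snap small rebound speeds to rest.

// Fraction of speed retained on each bounce.
const RESTITUTION = 0.6;
// Rebound speeds below this threshold are treated as "at rest".
const REST_THRESHOLD = 0.05;

function resolveFloorBounce(velocityY: number): number {
  const bounced = -velocityY * RESTITUTION;
  // Without this check the body bounces forever at ever-smaller
  // heights, jittering on the floor.
  return Math.abs(bounced) < REST_THRESHOLD ? 0 : bounced;
}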

Reflections on AI's Role in Development

Peck concludes with a broader reflection on the narrative surrounding AI in development. He cautions against the notion that AI will immediately replace developers, emphasizing the tools' current limitations and the indispensable role of human insight. By encouraging developers to experiment with AI, Peck frames the journey as an opportunity for growth and learning, advocating for a collaborative approach that celebrates both technological and human contributions.


About The Speaker

Nathan Peck

Senior Developer Advocate, Generative AI, Amazon Web Services (AWS)

Nathan is a software engineer with 20 years of expertise in JavaScript, as well as a decade of experience in cloud infrastructure and container orchestration. He now works at Amazon Web Services, helping build and improve the next generation of coding assistants for AI engineers, both inside Amazon and beyond.
