AI Agents are Dumb Robots, Calling LLMs
Episode 1514

AI agents are set to transform software development, but software itself isn’t going anywhere—despite the dramatic predictions. On this episode of The New Stack Makers, Mark Hinkle, CEO and Founder of Peripety Labs, discusses how AI agents relate to serverless technologies, infrastructure-as-code (IaC), and configuration management.

The New Stack Podcast · Mark Hinkle, Peripety Labs, The New Stack, Alex Williams

March 20, 2025 · 28m 31s


Show Notes

Hinkle envisions AI agents as “dumb robots” that handle tasks like querying APIs and exchanging data, while the real intelligence stays in large language models (LLMs). These agents, likely implemented as serverless functions in Python or JavaScript, will automate software development processes dynamically. Because LLMs are trained on vast amounts of open-source code, they can let AI agents generate bespoke, task-specific tools on the fly, in contrast to traditional cloud tools from HashiCorp or configuration management tools like Chef and Puppet.
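To make the “dumb robot” framing concrete, here is a minimal, purely illustrative sketch in Python. The agent holds no intelligence of its own: it forwards a task to an LLM (stubbed out here, since the episode does not name a specific model or API) and mechanically dispatches whatever tool call the model returns. All names in this sketch (`query_llm`, `http_get`, `TOOLS`) are hypothetical, not from the episode.

```python
def query_llm(task: str) -> dict:
    """Stand-in for a real LLM API call; a real agent would POST the task
    to a model endpoint and parse the tool invocation out of its reply."""
    return {"tool": "http_get", "args": {"url": f"https://api.example.com/{task}"}}

def http_get(url: str) -> str:
    """Stand-in for an HTTP fetch the agent performs on the model's behalf."""
    return f"fetched:{url}"

# The agent's entire "skill set" is a lookup table of callable tools.
TOOLS = {"http_get": http_get}

def agent(task: str) -> str:
    """The agent itself is just plumbing: ask the LLM which tool to run
    with which arguments, then run it and return the result."""
    call = query_llm(task)
    return TOOLS[call["tool"]](**call["args"])

print(agent("status"))  # the agent only relayed data; the "thinking" stayed in the LLM
```

In a serverless deployment, `agent` would be the function handler, spun up per request, which matches Hinkle's point that the agents themselves can stay small and stateless.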

As AI-generated tooling becomes more prevalent, managing and optimizing these agents will require strong observability and evaluation practices. According to Hinkle, this shift marks the future of software, where AI agents dynamically create, call, and manage tools for CI/CD, monitoring, and beyond. Check out the full episode for more insights. 

Learn more from The New Stack about emerging trends in AI agents: 

Lessons From Kubernetes and the Cloud Should Steer the AI Revolution

AI Agents: Why Workflows Are the LLM Use Case to Watch 


Topics

peripety labs, mark hinkle, software developer, software engineering, tech podcast, the new stack, tech, developer podcast, the new stack makers, serverless, observability