
Exploring AI with Ansible-RoleBuilder and GPT-Engineer

David Cannan

Self-Taught Developer's Journey

Nearly a year ago, I set out to teach myself software development, a path that has been both challenging and rewarding. I was captivated by the promise of technology and its potential to solve complex problems. That curiosity led me to explore a range of tools and technologies, and in this post I'd like to share my experience of integrating artificial intelligence into a project I've been working on: Ansible-RoleBuilder.

Ansible-RoleBuilder: The Genesis

Ansible-RoleBuilder is a tool I've developed to automate the creation of Ansible roles. Born out of my interest in DevOps, it scaffolds the standard role layout, creating directories and files such as `tasks/main.yml`, `vars/main.yml`, and `meta/main.yml`, based on definitions read from a `tools.json` file. The project has been an incredible learning opportunity, letting me apply the concepts I've been studying in a practical setting.
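To make the idea concrete, here is a minimal sketch of what that scaffolding step looks like. This is not the actual Ansible-RoleBuilder code, and the `tools.json` schema shown (a simple list of role names under a `roles` key) is an assumption for illustration only:

```python
import json
from pathlib import Path

# Illustrative assumption: tools.json holds a list of role names,
# e.g. {"roles": ["docker", "nginx"]}. The real Ansible-RoleBuilder
# configuration may look different.
SUBDIRS = ["tasks", "handlers", "vars", "defaults", "meta"]

def build_roles(config_path: str, output_dir: str = "roles") -> None:
    config = json.loads(Path(config_path).read_text())
    for role in config.get("roles", []):
        role_dir = Path(output_dir) / role
        for subdir in SUBDIRS:
            # Each standard subdirectory gets an empty main.yml to start from.
            target = role_dir / subdir
            target.mkdir(parents=True, exist_ok=True)
            (target / "main.yml").touch()

if __name__ == "__main__":
    build_roles("tools.json")
```

Pointed at a two-role `tools.json`, a sketch like this produces the familiar `roles/<name>/tasks/main.yml` layout, similar in spirit to the skeleton that `ansible-galaxy init` generates.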

Discovering GPT-Engineer: AI Meets DevOps

In my exploration of new technologies, I stumbled upon Anton Osika's `gpt-engineer` project on GitHub. This AI-driven tool uses GPT, OpenAI's family of large language models, to generate a codebase from a natural-language prompt. I was intrigued by the prospect of leveraging that capability to enhance my Ansible-RoleBuilder project.

The Experiment

I set out to answer a straightforward question: could `gpt-engineer` improve my Ansible-RoleBuilder project by generating an output of equal or better quality based on my markdown documentation?

To test this, I used my iPhone to remotely execute `gpt-engineer` on a Linux server, connecting through the ShellFish iOS app. I fed the AI a `main_prompt` file that explained Ansible-RoleBuilder in terms a junior developer could understand. The model returned a detailed explanation and even suggested improvements such as better error handling and coverage of edge cases.
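For readers unfamiliar with `gpt-engineer`'s project layout: the tool is pointed at a folder containing the prompt, and in the version I was using that file was named `main_prompt`. The setup looked roughly like this (the folder name is illustrative):

```
projects/ansible-rolebuilder/
└── main_prompt   # plain-English description of Ansible-RoleBuilder,
                  # its tools.json input, and the role layout it generates
```

From the ShellFish session, `gpt-engineer` was then run against that folder and left to generate its output.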

The initial run, though promising, didn't yield the results I had expected. It dawned on me that the model had likely been working from an incomplete picture of my project, since it was given only the `main_prompt` file rather than the entire working directory.

Lessons Learned

This experiment has been incredibly enlightening. I learned the importance of providing AI with a comprehensive context for the best results. My next step is to consolidate all relevant documentation and examples into the `main_prompt` file to give the AI a more complete understanding of the project.

What I found most impressive was the AI's ability not just to replicate but also to enhance existing processes. AI tools like `gpt-engineer` can propose code improvements and generate comprehensive narratives, showcasing their immense value in software development.

The Road Ahead

My adventure with Ansible-RoleBuilder and `gpt-engineer` is only beginning. As I refine my approach and delve deeper into AI, I look forward to the potential improvements and insights that lie ahead.

My journey as a self-taught developer has been enriched by the work of individuals like Anton Osika and the team behind `gpt-engineer`. Their groundbreaking work inspires newcomers like me to push beyond our limits in the realm of software engineering.

This experiment is just one step on a long road of exploration into the potential of AI in software development. As I continue this journey, I'm excited about the endless possibilities that await. Stay tuned for more updates!

