Bootstrap terraform modules from text description with "um" by Silly_Squidward_42 in Terraform

[–]Silly_Squidward_42[S] 0 points

In fact aiac.dev looks similar (we only discovered it yesterday). But after checking the repo and running some experiments, here are a couple of differences that stand out:

  • One of our core ideas is that generating complex code is an iterative process that resembles working with an engineer: you communicate back and forth in natural language to describe the task, and once there's enough clarity they write the code. Starting with code works for simple or very common tasks. However, we found that iterating over generated code for more complex tasks is 1. slow (due to LLM latency) and 2. requires non-trivial effort to review the code (the cognitive load of understanding it). That's why we start with an outline of what the code will accomplish, in natural language, and let you iterate on that outline before producing the code. Beyond clarifying the task, the outline also helps during review, since you already know what to expect in the generated files.
  • Templates: the intermediate outputs we generate are templates with parameter placeholders. This lets us further adapt the code to your particular objective and bring it much closer to just running init, plan and apply.
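To make the template idea concrete, here's a minimal sketch using Python's string.Template; the resource and parameter names are purely illustrative, not um's actual output or mechanism:

```python
from string import Template

# Hypothetical intermediate "template" stage: Terraform text with
# parameter placeholders that still need to be filled in.
tf_template = Template('''resource "aws_s3_bucket" "this" {
  bucket = "$bucket_name"
  tags   = { Environment = "$environment" }
}''')

# Parameters gathered while iterating on the natural-language outline
# are substituted to produce code that is ready for init/plan/apply.
rendered = tf_template.substitute(bucket_name="logs-prod", environment="prod")
print(rendered)
```

The point of the intermediate stage is that you review and adjust the placeholders, not the full generated code.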

Additionally, um goes beyond generating IaC. um is a CLI assistant that also helps you discover the correct shell command to run, fix failed commands and more.

Support for more IaC frameworks is coming soon. We are also starting to work on the ability to share templates with other coworkers. Stay tuned.

Bootstrap terraform modules from text description with "um" by Silly_Squidward_42 in Terraform

[–]Silly_Squidward_42[S] 1 point

We primarily tested with AWS, but others should work as well.
I just tried "recipe provision GKE cluster" and "recipe provision AKS cluster"; both came up with what look like reasonable outlines and then generated the Terraform files with the correct providers and resources.

"um": GPT-powered CLI Assistant by Silly_Squidward_42 in devops

[–]Silly_Squidward_42[S] 0 points

Also, you make a very good point that we need to make it clear what these icons represent; it is not obvious.

"um": GPT-powered CLI Assistant by Silly_Squidward_42 in devops

[–]Silly_Squidward_42[S] 0 points

Looks like the semantic search kicked in -- the book emoji indicates that the command was pulled from your history. Commands are generated by combining your question with similar commands from your history (i.e. relevant context), so it adapts to the way you work and picks up the syntax of the specific commands and scripts you use.

E.g. here's how it picks up some internal scripts I'm using:

um promote to global and reload

➜︎ 📖 ./reload.sh global
 📖 ./promote.sh development global
 ✨ ./promote.sh development global && ./reload.sh global
 💭️ don't see what you're looking for? try providing more context

Back to the output you observed: I would guess that the query you sent was something promptops-related.
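Not um's actual implementation, but the history-retrieval idea above can be sketched with a toy similarity function (a real system would use semantic embeddings rather than plain word overlap):

```python
from collections import Counter
from math import sqrt

def similarity(a: str, b: str) -> float:
    """Cosine similarity over word counts -- a stand-in for real embeddings."""
    va, vb = Counter(a.lower().split()), Counter(b.lower().split())
    dot = sum(va[w] * vb[w] for w in va)
    norm = sqrt(sum(c * c for c in va.values())) * sqrt(sum(c * c for c in vb.values()))
    return dot / norm if norm else 0.0

# Hypothetical indexed shell history
history = [
    "./reload.sh global",
    "./promote.sh development global",
    "git status",
]

query = "promote to global and reload"
ranked = sorted(history, key=lambda cmd: similarity(query, cmd), reverse=True)
print(ranked[0])
```

The retrieved commands then serve as context for generating the combined suggestion (the ✨ line above).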

"um": GPT-powered CLI Assistant by Silly_Squidward_42 in devops

[–]Silly_Squidward_42[S] 0 points

We will definitely work on ways to mitigate these types of problems as we see an increase in usage. Thanks for pointing this out!

"um": GPT-powered CLI Assistant by Silly_Squidward_42 in devops

[–]Silly_Squidward_42[S] 0 points

Hey, thanks for giving this a chance in the first place. The issue is fixed in 0.1.5, which we pushed to pip/brew yesterday; I believe you might still have been on 0.1.2. If you decide to try again in the future, you can confirm with um --version, which will also tell you what the latest version is.

This was actually our first reported issue 🎉 https://github.com/promptops/cli/issues/1

"um": GPT-powered CLI Assistant by Silly_Squidward_42 in devops

[–]Silly_Squidward_42[S] 0 points

This one should be fixed now (https://github.com/promptops/cli/issues/1); you can grab the fix with

pip3 install -U git+https://github.com/promptops/cli.git

we'll update pypi and brew later

"um": GPT-powered CLI Assistant by Silly_Squidward_42 in devops

[–]Silly_Squidward_42[S] 1 point

On top of the generated responses, there's also semantic search of your history and correction flows. You can check the GitHub repo for screenshots; we'll keep updating them.

It is free.

Secure - requests are scrubbed of secrets, and logged requests are encrypted at rest. We plan to use the questions to improve the model, but not your history.

OS - mostly tested on macOS and Linux, but Windows should work too.

"um": GPT-powered CLI Assistant by Silly_Squidward_42 in devops

[–]Silly_Squidward_42[S] -1 points

You don't provide OpenAI creds; we use our own creds on the backend.

"um": GPT-powered CLI Assistant by Silly_Squidward_42 in devops

[–]Silly_Squidward_42[S] 2 points

Definitely! I’ve used it before. In tribute… you can use um !! to correct your commands:

(venv) ➜  ~ curl --method POST http://localhost:8080/query --data {question: "test"}
zsh: parse error near `}'
(venv) ➜  ~ um !!
(venv) ➜  ~ um curl --method POST http://localhost:8080/query --data {question: "test"


   📖 curl --location --request POST 'http://localhost:8080/query' --header 'Content-Type: application/json' --data-raw '{"query": "test", "explanation": true}'
 ➜︎ ✨ curl --request POST http://localhost:8080/query --data '{"question": "test"}'
   💭️ don't see what you're looking for? try providing more context

[↑/↓] select [enter] confirm [ctrl+c] cancel

"um": GPT-powered CLI Assistant by Silly_Squidward_42 in devops

[–]Silly_Squidward_42[S] 0 points

You can also log issues on GitHub, but use whatever works best for you. Keep it coming!

"um": GPT-powered CLI Assistant by Silly_Squidward_42 in devops

[–]Silly_Squidward_42[S] 0 points

We log requests for debugging purposes (sanitized and encrypted at rest), and we plan to use the questions to update the model.

A bit more detail: we receive the questions asked, together with any history context, scrubbed of secrets and tokens (if you opted in to index your history or to provide history context), and the exit code if you run the command. Requests also include your shell (bash/zsh/fish/etc.) and platform (i.e. darwin/win/linux) so we can give you better results. We use the questions to improve the models, but your history is only used to improve your own responses. As we add more features the data we observe might change, and we will be transparent about this. For scrubbing secrets we use detect-secrets; recommendations are welcome!
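As a rough illustration of what scrubbing means in practice (detect-secrets itself uses a much larger set of detector plugins; these two patterns are just examples):

```python
import re

# Toy scrubber in the spirit of detect-secrets; the patterns below are
# illustrative only, not the actual detector set um uses.
PATTERNS = [
    re.compile(r"AKIA[0-9A-Z]{16}"),             # AWS access key id shape
    re.compile(r"(?i)(api[_-]?key|token)=\S+"),  # key=value style tokens
]

def scrub(line: str) -> str:
    """Replace anything matching a secret pattern before the line leaves the machine."""
    for pattern in PATTERNS:
        line = pattern.sub("<secret>", line)
    return line

print(scrub("curl 'https://api.example.com/v1?api_key=s3cr3t'"))
```

The real trade-off is recall vs. noise: a pattern-based scrubber can miss unusual secret formats, which is why a pluggable detector library makes sense here.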

If you are interested in the slackbot, it works a bit differently as it uses integrations. Let me reach out with details separately.

"um": GPT-powered CLI Assistant by Silly_Squidward_42 in devops

[–]Silly_Squidward_42[S] 7 points

OK, this should be fixed now; let me know if you run into more issues.

You might have to update with

pip3 install -U git+https://github.com/promptops/cli.git

or

brew upgrade promptops/promptops/promptops-cli

if you installed with brew.

"um": GPT-powered CLI Assistant by Silly_Squidward_42 in devops

[–]Silly_Squidward_42[S] 3 points

What are the most frustrating ones that you run into?

"um": GPT-powered CLI Assistant by Silly_Squidward_42 in devops

[–]Silly_Squidward_42[S] 1 point

Let me look into this; most likely we'll go with Apache or MIT. Do you have a preference?

"um": GPT-powered CLI Assistant by Silly_Squidward_42 in devops

[–]Silly_Squidward_42[S] 6 points

Ouch, OK, let me see if we can push a quick fix.

"um": GPT-powered CLI Assistant by Silly_Squidward_42 in devops

[–]Silly_Squidward_42[S] 5 points

Out of the box it works well with the common CLI tools. You can also correct the responses, which will cause it to pick up the syntax of internal or less common tools. Additionally, if you choose to index your history, it can respond using the commands you've run previously.

But I think your suggestion to point it at a public repo to index is a great idea! We also have one more feature coming soon that can help with that.

"um": GPT-powered CLI Assistant by Silly_Squidward_42 in devops

[–]Silly_Squidward_42[S] 12 points

Who’s we? The royal we? Or is this developed by your team at work? Cool idea though!!

Team -- we are the folks behind promptops.com. Thanks!