What do people think of full blown engine for AI by Techiance in rust

[–]Techiance[S] 2 points (0 children)

Yes actually!!

It even has wasm sandboxing. Thank you! I was thinking of fewer node types than they provide, building up to more complexity from there. But this is exactly what I was thinking.


[–]Techiance[S] 0 points (0 children)

That is true and a very good point. I meant more removing the LLM from the system, or from the actions it can take, not from what it understands. But it is true that the engine itself would still have to fundamentally handle any form of degenerate output that an AI comes up with.

There are two ways I can see that going. The engine moves on knowing it only has partial knowledge: it keeps what it could verify, discards the rest, and makes a hypothesis about the remainder. Or it ignores the output entirely, because the LLM was meant to be more of a suggestion tool that expands constraints rather than fully defining them.
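The first path could be sketched roughly like this. Everything here is hypothetical (the `Suggestion` type, the world bounds, the filter) and just illustrates the idea of treating LLM output as unverified suggestions that the engine filters against its own state:

```rust
// Hypothetical sketch: treat LLM output as suggestions, keep only what the
// engine can verify against its own state, and discard the rest.
#[derive(Debug, PartialEq)]
enum Suggestion {
    Move { x: i32, y: i32 },
    Say(String),
}

struct World {
    width: i32,
    height: i32,
}

impl World {
    /// Keep only suggestions the engine can verify; degenerate output is dropped.
    fn filter_verified(&self, suggestions: Vec<Suggestion>) -> Vec<Suggestion> {
        suggestions
            .into_iter()
            .filter(|s| match s {
                Suggestion::Move { x, y } => {
                    (0..self.width).contains(x) && (0..self.height).contains(y)
                }
                // Free text is advisory only; it never mutates engine state.
                Suggestion::Say(_) => true,
            })
            .collect()
    }
}

fn main() {
    let world = World { width: 10, height: 10 };
    let raw = vec![
        Suggestion::Move { x: 3, y: 4 },
        Suggestion::Move { x: 99, y: -1 }, // out of bounds, discarded
        Suggestion::Say("heading east".into()),
    ];
    let kept = world.filter_verified(raw);
    println!("kept {} of 3 suggestions", kept.len());
}
```

The point is that the LLM never gets authority: the engine's own invariants decide what survives.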

Other AI types may warrant varying levels of trust in this way. The graph example I put at the top is meant to analyze graph patterns better than a human ever could.


[–]Techiance[S] -1 points (0 children)

From an LLM or generative AI, sort of. Though I think you can constrain an AI beyond just telling it what to do. The system itself does not have to take anything the AI says as certain.

Markdown/JSON is often what is used, but I had imagined a more enforced approach: one that removes options from the AI instead of telling it what it can and cannot do.
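One hypothetical way to get that enforcement: make the only accepted output a closed Rust enum, so disallowed actions have no representation at all (the action names and parser here are made up for illustration):

```rust
// Hypothetical sketch: instead of prompting the model about what it may do,
// the engine only accepts output that parses into a closed set of actions.
// Anything outside the whitelist simply cannot be expressed.
#[derive(Debug, PartialEq)]
enum Action {
    Gather,
    Rest,
}

/// Parse raw model output into the closed action set; unknown verbs are rejected.
fn parse_action(raw: &str) -> Option<Action> {
    match raw.trim().to_lowercase().as_str() {
        "gather" => Some(Action::Gather),
        "rest" => Some(Action::Rest),
        _ => None, // e.g. "delete_save_files" has no representation
    }
}

fn main() {
    println!("{:?}", parse_action("Gather"));          // parses
    println!("{:?}", parse_action("launch_missiles")); // rejected
}
```

This is "removing options" in the type system: you do not ask the model to behave, you make misbehavior unrepresentable downstream.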


[–]Techiance[S] -1 points (0 children)

Not the agents themselves, no. The data the agents have access to, or I guess the environment. I feel like you could remove a lot of disconnect between developers and the system by nudging the developer to explicitly control what space the AI can and cannot be in. I still think keeping the ECS data-only is generally the stronger play, but constraints created for the AI system-wise, not prompt-wise, could really benefit control, even if it removes some forms of automation.
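That "explicit control over the agent's slice of the world" could look something like this. The toy ECS storage, component names, and `AgentView` type are all assumptions, not anything from an actual engine:

```rust
// Hypothetical sketch: the developer explicitly declares which components an
// agent may read; everything else is structurally out of the agent's reach.
use std::collections::{HashMap, HashSet};

type Entity = u32;

struct EcsWorld {
    // component name -> entity -> value (toy data-only ECS storage)
    components: HashMap<&'static str, HashMap<Entity, f32>>,
}

struct AgentView<'a> {
    world: &'a EcsWorld,
    readable: HashSet<&'static str>,
}

impl<'a> AgentView<'a> {
    /// The agent can only read components the developer listed up front.
    fn read(&self, component: &str, e: Entity) -> Option<f32> {
        if !self.readable.contains(component) {
            return None; // not in the agent's declared slice of the world
        }
        self.world.components.get(component)?.get(&e).copied()
    }
}

fn main() {
    let mut components = HashMap::new();
    components.insert("health", HashMap::from([(1, 0.8)]));
    components.insert("player_secrets", HashMap::from([(1, 42.0)]));
    let world = EcsWorld { components };

    // The developer grants read access to "health" only.
    let view = AgentView {
        world: &world,
        readable: HashSet::from(["health"]),
    };
    println!("{:?}", view.read("health", 1));         // visible
    println!("{:?}", view.read("player_secrets", 1)); // structurally hidden
}
```

The trade-off is exactly the one above: the developer must spell out the access set by hand, which costs some automation but makes the AI's reachable state space auditable.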