What differentiates a normal prompt engineer from a super one? A few things I can think of:
- Cross-LLM experience
- Understanding how to reach acceptable accuracy faster
- Experience with LLM tools
It is already starting, but most of these predictions may take 2-10 years to become widely applicable.
It seems like all of these things will be feasible to some degree within a couple of years. It's hard to predict how long it will take for them to be robust and widely deployed.
Sugarcane AI provides an Open Source Microservices Framework for cross-LLM workflow/plugin development, allowing developers to prioritize business logic over LLM selection, cost, and performance.
The framework comprises:
- LLM as a Service for Data Scientists, empowering data labelling and fine-tuning
- Prompt as a Service for Prompt developers, streamlining prompt management
- Workflow as a Service for plugin developers to construct workflow plugins, facilitating the distribution of LLMs, prompts, and plugins via APIs.
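To make the "Prompt as a Service" idea concrete, here is a minimal sketch of a versioned prompt-package registry that renders prompts independently of any particular LLM. All names (`PromptRegistry`, `publish`, `render`) are hypothetical illustrations, not Sugarcane AI's actual API.

```python
from string import Formatter


class PromptRegistry:
    """Hypothetical in-memory prompt-package registry (illustration only)."""

    def __init__(self):
        # name -> {version -> template string}
        self._packages = {}

    def publish(self, name, version, template):
        """Store a versioned prompt template under a package name."""
        self._packages.setdefault(name, {})[version] = template

    def required_variables(self, name, version):
        """List the placeholder names a caller must supply."""
        template = self._packages[name][version]
        return [field for _, field, _, _ in Formatter().parse(template) if field]

    def render(self, name, version, **variables):
        """Fill a template with variables; the result can go to any LLM."""
        template = self._packages[name][version]
        return template.format(**variables)


# Usage: publish once, then render the same package against any backend.
registry = PromptRegistry()
registry.publish(
    "summarize", "1.0.0",
    "Summarize the following text in {style} style:\n{text}",
)
prompt = registry.render("summarize", "1.0.0",
                         style="bullet-point", text="LLMs are...")
```

Versioning the templates is what makes them shareable and reusable across teams, in the same spirit as the prompt packages described above.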
The open-source framework encourages collaborative dataset development and enhances the reusability of prompt packages and fine-tuned LLMs, facilitating sharing and monetization on an open marketplace.