Description of the idea
Today, code-generation LLMs produce ObjectScript, BPL, DTL, PEX, SQL, MDX, CSP, and XML for Productions/BI, as well as Embedded Python, only in a limited or inaccurate way.
InterSystems could fine-tune an already trained open-source code-generation model (CodeLlama, StarCoder, etc.) to add support for InterSystems IRIS languages and technologies.
If possible, the result could be published as an open Docker image for the community. That would make life easier, especially for new customers who are not yet familiar with IRIS, and so help increase customer retention.
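A minimal sketch of what such a specialization could look like, assuming a text file of ObjectScript/BPL/DTL/Embedded Python samples is available; it uses Hugging Face Transformers with LoRA adapters (PEFT). The base model choice, dataset path, and hyperparameters are illustrative assumptions, not a committed InterSystems workflow.

```python
# Sketch only: LoRA fine-tuning of an open code model on IRIS-language samples.
# Dataset path, base model and hyperparameters are illustrative assumptions.
from datasets import load_dataset
from peft import LoraConfig, get_peft_model
from transformers import (AutoModelForCausalLM, AutoTokenizer,
                          DataCollatorForLanguageModeling, Trainer,
                          TrainingArguments)

BASE_MODEL = "codellama/CodeLlama-7b-hf"      # any open code model could be used
tokenizer = AutoTokenizer.from_pretrained(BASE_MODEL)
tokenizer.pad_token = tokenizer.eos_token

model = AutoModelForCausalLM.from_pretrained(BASE_MODEL)
# Train small LoRA adapters instead of all weights to keep cost manageable.
model = get_peft_model(model, LoraConfig(
    r=16, lora_alpha=32, lora_dropout=0.05, task_type="CAUSAL_LM"))

# Hypothetical corpus: one ObjectScript/BPL/DTL/Embedded Python snippet per line.
dataset = load_dataset("text", data_files={"train": "iris_code_samples.txt"})

def tokenize(batch):
    return tokenizer(batch["text"], truncation=True, max_length=1024)

tokenized = dataset["train"].map(tokenize, batched=True, remove_columns=["text"])

trainer = Trainer(
    model=model,
    args=TrainingArguments(output_dir="iris-codellama-lora",
                           per_device_train_batch_size=2,
                           num_train_epochs=1,
                           logging_steps=50),
    train_dataset=tokenized,
    data_collator=DataCollatorForLanguageModeling(tokenizer, mlm=False),
)
trainer.train()
# The resulting adapters could then be packaged and served from the open Docker image.
model.save_pretrained("iris-codellama-lora")
```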
Who is the target audience?
InterSystems IRIS Developers
Potential customers evaluating and trying out IRIS
What problem does it solve?
Easy use of LLMs within interoperability flows
Use LLM processing without writing code
A more low-code/no-code approach when an interoperability flow needs LLM processing
Reduce development complexity
Create smart interoperability flows
Staying competitive with offerings such as https://docs.digibee.com/documentation/connectors-and-triggers/connectors/ai-tools/llm
How does this impact the efficiency, stability, reliability, etc., of the product?
Improve product UX
Easier use of AI for interoperability
The product becomes more low-code/no-code when integrating with LLMs
More secure, stable, reliable, and efficient source code when using the LLM
Provide a specific use case or scenario that illustrates how this idea could be used in practice.
A customer service flow needs to integrate with the customer's registration data, transaction history, and logistics system, and then answer in natural language when a given purchase will arrive. With all of that information in hand, the LLM can analyze the delivery history, the customer's address, and the available delivery slots, and reply to the customer through a chatbot.
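A minimal sketch of that scenario, written outside any specific interoperability framework; in IRIS it could live, for example, in an Embedded Python method called from a business process. The data sources, the LLM service URL, and the response format are hypothetical placeholders.

```python
# Sketch of the chatbot scenario. The LLM endpoint and the data passed in are
# hypothetical; a real flow would pull them from registration, history and
# logistics systems via interoperability operations.
import json
import urllib.request

LLM_ENDPOINT = "http://llm-service:8080/generate"   # hypothetical local LLM service

def answer_delivery_question(order_id: str, question: str,
                             customer: dict, history: list, slots: list) -> str:
    """Combine registration, transaction history and logistics data into a
    prompt and ask the LLM for a natural-language answer."""
    prompt = (
        "You are a delivery assistant. Using the data below, tell the customer "
        "when their purchase should arrive.\n"
        f"Customer: {json.dumps(customer)}\n"
        f"Delivery history: {json.dumps(history)}\n"
        f"Available delivery slots: {json.dumps(slots)}\n"
        f"Question about order {order_id}: {question}"
    )
    request = urllib.request.Request(
        LLM_ENDPOINT,
        data=json.dumps({"prompt": prompt}).encode("utf-8"),
        headers={"Content-Type": "application/json"},
    )
    with urllib.request.urlopen(request) as response:
        return json.loads(response.read())["text"]

# Example call with mocked data standing in for the upstream systems.
print(answer_delivery_question(
    order_id="12345",
    question="When will my order arrive?",
    customer={"name": "Ana", "address": "Rua A, 100, Sao Paulo"},
    history=[{"order": "12001", "delivered_in_days": 3}],
    slots=["2024-07-02 09:00-12:00", "2024-07-03 14:00-18:00"],
))
```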
Thank you for submitting the idea. The status has been changed to "Planned or In Progress".
This is not a commitment; plans are subject to change. Stay tuned!
@Benjamin De Boe - should this be assigned to you?