Idea Summary
The Generative AI Services support OpenAI, Cohere, and OCI Generative AI in the first release (APEX 24.1). Allowing other providers to be used would enable developers to use open-source models, as well as other commercially supported ones, in their apps. Such flexibility will benefit everyone using the new AI services: it removes the barrier to trying out the features and will make adoption a lot easier.
Use Case
People who use services like Anthropic (Claude 3), Mistral, or other models (even locally hosted LLMs) access them through different API URLs, and these APIs can differ in both their URLs and their parameters. If developers could enter the full URL of their model's REST API, together with any custom body parameters, it would be possible to switch to any provider/model.
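To illustrate how much these APIs can differ, here is a minimal Python sketch (not APEX code) comparing an OpenAI-style call with Anthropic's Messages API; the model names and keys are only placeholders:

```python
# Minimal sketch: the same "send one user message" request looks different
# for an OpenAI-style API and for Anthropic's Messages API.
import requests

prompt = "Hello"

# OpenAI-style: POST <base>/chat/completions with a Bearer token.
openai_resp = requests.post(
    "https://api.openai.com/v1/chat/completions",
    headers={"Authorization": "Bearer sk-..."},            # API key
    json={"model": "gpt-4o-mini",                          # example model name
          "messages": [{"role": "user", "content": prompt}]},
)

# Anthropic: different path (/v1/messages), different auth header,
# and an extra required body parameter (max_tokens).
anthropic_resp = requests.post(
    "https://api.anthropic.com/v1/messages",
    headers={"x-api-key": "sk-ant-...",                    # API key
             "anthropic-version": "2023-06-01"},
    json={"model": "claude-3-opus-20240229",               # example model name
          "max_tokens": 1024,
          "messages": [{"role": "user", "content": prompt}]},
)
```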
Preferred Solution (Optional)
My suggestions are based on the fact that APEX automatically appends /chat/completions to the Base URL when making the REST calls for the AI Assistant component. Giving developers the ability to define the full URL would allow LLM providers with different URL structures to be used. So the idea is the following:
- Add a new option, called Custom, to the AI Provider dropdown menu.
- If this option is selected, another two options will appear in Settings, just below Base URL:
  - Chat URL: URL of the LLM service's Chat feature
  - Completions URL: URL of the LLM service's Completions feature
* Sometimes the LLM service might require different parameters in the REST call, so these should be taken from the Additional Attributes JSON when the Custom AI Provider option is selected.
Having saved the settings, the APEX procedure that calls the LLM provider's REST API will either use the developer-provided full URLs (when Custom is selected as the AI Provider) or will use the current logic of appending /chat/completions to the Base URL if one of the existing AI Providers is selected. A rough sketch of this branching is shown below.
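The following Python sketch only illustrates the proposed behavior; the function name, setting keys (chat_url, base_url, additional_attributes), and the CUSTOM value are hypothetical and are not existing APEX APIs:

```python
# Hypothetical sketch of the proposed URL selection and body construction.
import json

def build_chat_request(provider: str, settings: dict, messages: list) -> tuple[str, dict]:
    """Return the (url, body) pair to use for a chat REST call."""
    if provider == "CUSTOM":
        # Custom provider: use the full Chat URL exactly as entered by the developer.
        url = settings["chat_url"]
    else:
        # Existing providers: keep the current logic of appending /chat/completions.
        url = settings["base_url"].rstrip("/") + "/chat/completions"

    body = {"messages": messages}
    # Merge any provider-specific parameters from the Additional Attributes JSON,
    # e.g. a required max_tokens for an Anthropic-style API.
    body.update(json.loads(settings.get("additional_attributes", "{}")))
    return url, body

# Example: a Custom provider pointing at Anthropic's Messages API.
url, body = build_chat_request(
    "CUSTOM",
    {"chat_url": "https://api.anthropic.com/v1/messages",
     "additional_attributes": '{"model": "claude-3-opus-20240229", "max_tokens": 1024}'},
    [{"role": "user", "content": "Hello"}],
)
```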
PS: Sometimes developers might use a local LLM for POCs or in a test environment, which does not require API keys or other credentials. That's why the Credential option might be made optional for Custom AI Providers.
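For example, a locally hosted model served by Ollama can be called with no credentials at all; the endpoint and model name below are just typical defaults, used here as an assumption:

```python
# Minimal sketch: calling a locally hosted LLM (Ollama) with no API key.
import requests

resp = requests.post(
    "http://localhost:11434/api/chat",                  # default local Ollama endpoint
    json={"model": "llama3",                             # example locally pulled model
          "messages": [{"role": "user", "content": "Hello"}],
          "stream": False},                              # ask for a single JSON response
)
print(resp.json()["message"]["content"])
```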
