Generic OpenAI Compatible API support
Jarek Ceborski
It's added in Kerlig 1.7.0, see: https://www.kerlig.com/releases
Alex T
Just a ping to see if this will be done. I have proxied AWS Bedrock for applications that support the OpenAI-compatible API, so adding this feature would also help integrate a myriad of other services.
Jarek Ceborski
Alex T I'm currently working on this. You will be able to add multiple instances of AI integrations (providers), including a new "OpenAI compatible" integration.
Pramod J
Hi Jarek, Is there any update for this feature request? I see there are other requests on the board that are duplicates of this one.
Regis David Souza Mesquita
I agree. I was trying to integrate with Perplexity, which is OpenAI compatible, but was unable to. I also have another OpenAI-compatible provider that serves open-source models, with the same result. I know you don't want users to enter the model name, but for advanced users that wouldn't be a problem at all. Being able to have all my providers in one place would be a killer feature.
Jarek Ceborski
Thanks Pramod, I like this idea. I will need to check if these APIs can provide the list of available models. I don't want users to manually type model names like 'Meta-Llama-3.1-8B-Instruct'. Otherwise, this is a great idea.
Pramod J
Jarek,
Glad to hear you like the idea. With the OpenAI API, we can easily fetch models. Both Cerebras and Groq have compatible APIs, as do Together and others.
You're familiar with these APIs from your experience integrating OpenRouter and Groq. Allowing users to choose offers specific benefits, such as LM Studio's "MLX Runtime" on M series Macs, which is faster than Ollama for local inference.
The "models" endpoint in the API can help automatically fetch models, similar to how it works with OpenRouter (attached screenshot from LM Studio for reference). This approach is better than integrating individual OpenAI-compatible API services, in my humble opinion.
Cheers!
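As an aside, the model discovery described above can be sketched with a standard OpenAI-compatible GET /v1/models call. This is a minimal illustration, not Kerlig's actual implementation: the base URL and API key are placeholders, and the sample payload only assumes the common `{"object": "list", "data": [{"id": ...}]}` response shape that servers like LM Studio, Groq, and OpenRouter return.

```python
import json
import urllib.request


def parse_model_ids(payload: dict) -> list:
    """Extract model IDs from an OpenAI-style model list payload."""
    # OpenAI-compatible servers return {"object": "list", "data": [{"id": ...}, ...]}
    return sorted(item["id"] for item in payload.get("data", []))


def list_model_ids(base_url: str, api_key: str) -> list:
    """Fetch model IDs from an OpenAI-compatible /models endpoint.

    base_url is the provider's API root, e.g. "http://localhost:1234/v1"
    for a local LM Studio server (placeholder, adjust per provider).
    """
    req = urllib.request.Request(
        f"{base_url.rstrip('/')}/models",
        headers={"Authorization": f"Bearer {api_key}"},
    )
    with urllib.request.urlopen(req) as resp:
        return parse_model_ids(json.load(resp))


# Offline example using the standard response shape (illustrative IDs):
sample = {"object": "list", "data": [{"id": "llama-3.1-70b"}, {"id": "llama-3.1-8b"}]}
print(parse_model_ids(sample))  # → ['llama-3.1-70b', 'llama-3.1-8b']
```

Because nearly every compatible provider exposes this same endpoint, a client can populate its model picker generically instead of hard-coding per-service lists.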
Alex T
Jarek Ceborski this is what the AWS Bedrock proxy gives out. It works with Msty, MindMac, etc.
Jarek Ceborski
Alex T thanks!
Pramod J
For example, Cerebras is currently the fastest inference provider, and SambaNova is another fast one; both are faster than Groq. Other such services may show up, so it's useful to have OpenAI compatibility in general. Hope you will consider it, Jarek :)
Regis David Souza Mesquita
Pramod J especially if we can add more than one. OpenAI is kind of the de facto format for most providers; if Kerlig supported multiple OpenAI-compatible providers, I would finally be able to have a single app instead of 4 constantly open.
Pramod J
Regis David Souza Mesquita Of course, we should be able to add as many such OpenAI-compatible API providers as we need! Jarek, this would be a super feature; it would make us use only your app for most use cases!