Include Groq in the list of inference providers.
Alexander Fedotov
Even though Groq fully supports the OpenAI API format, you chose not to include it in the list of inference providers, despite the fact that it is widely popular among the more skilled practitioners working with language models...
Moreover! The attempt to set it up as a custom inference provider fails: your interface shows a red pop-up insisting that the 'model' field contain OpenAI, Anthropic, or one of the other also-rans, which is just patently wrong.
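For reference, here is a minimal sketch of what "OpenAI-compatible" means in practice, assuming the official `openai` Python client and Groq's documented base URL; the model name below is only an example of an open model Groq hosts:

```python
# Minimal sketch: calling Groq through the standard OpenAI Python client.
# Assumes the `openai` package (v1+) and Groq's OpenAI-compatible endpoint;
# the model name is just one example and may change over time.
from openai import OpenAI

client = OpenAI(
    api_key="YOUR_GROQ_API_KEY",                # a Groq key, not an OpenAI key
    base_url="https://api.groq.com/openai/v1",  # the only provider-specific part
)

response = client.chat.completions.create(
    model="llama-3.3-70b-versatile",  # an arbitrary string, not a dropdown value
    messages=[{"role": "user", "content": "Hello from a custom provider."}],
)
print(response.choices[0].message.content)
```

The only provider-specific pieces are the base URL, the API key, and a free-form model string, which is exactly what a working 'custom provider' form needs to accept.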
Just to let you know, on this ninth day of January 2025, the more experienced people working with language models are using open-source models: LLAMA-3.3 with modifications, Mis(x)tral, DeepSeek 3, and so on, all of which are promptly made available on Groq's infrastructure. The pointless race between OpenAI and Anthropic toward AGI has priced them out of the real market, and I'm not even mentioning MS and Google with their obsolete 'clouds'.
I would advise you to:
- Include Groq as the FIRST item on your list of providers, because that is what a professional looks for when they first open this list;
- Make the 'custom provider' settings actually _work_ (!!!) and let people enter arbitrary model names instead of picking from a dropdown of the f...ing 'competitors for AGI' (see the sketch below this list).
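To make the second point concrete, here is a purely hypothetical sketch of what a custom-provider settings entry could look like; none of these field names come from your actual settings schema, they only illustrate that 'model' should be validated as a free-form string rather than matched against a hard-coded provider list:

```python
# Hypothetical custom-provider settings entry (illustrative only; not your schema).
from dataclasses import dataclass

@dataclass
class CustomProvider:
    name: str      # e.g. "Groq"
    base_url: str  # e.g. "https://api.groq.com/openai/v1"
    api_key: str   # user-supplied secret
    model: str     # arbitrary model identifier chosen by the user

    def validate(self) -> None:
        # Check the shape of the values, not membership in a fixed provider list.
        if not self.base_url.startswith(("http://", "https://")):
            raise ValueError("base_url must be an HTTP(S) URL")
        if not self.model.strip():
            raise ValueError("model must be a non-empty string")

provider = CustomProvider(
    name="Groq",
    base_url="https://api.groq.com/openai/v1",
    api_key="YOUR_GROQ_API_KEY",
    model="mixtral-8x7b-32768",  # any string the backend accepts
)
provider.validate()
```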
Thank you for taking the time to read this message.
- Alex