Add Groq model support to LLMClient (#1977) #2165
Conversation
davidberenstein1957
left a comment
Looks good already. Some minor remarks :)
… redundant seed assignment:
- Added `_supports_json_format(model: str)` to check for JSON support based on the model name.
- The user can now specify JSON format via the `format` parameter instead of it being hardcoded.
- Removed the duplicate assignment of `extra_params['seed'] = seed` in `GroqClient.complete()`.
davidberenstein1957
left a comment
Hi, I left some more comments.
kunjanshah0811
left a comment
Fixed the previous issues and added one comment that needs your attention in `groq_client.py`:
Here https://github.com/Giskard-AI/giskard/pull/2165/files/10a4b68e3841dedb23605c1f5d3f333621942d0f#r2120776154
LGTM, one final thing is to mention it in our docs: https://docs.giskard.ai/en/stable/open_source/setting_up/index.html
I will update it by today and let you know. :)
kunjanshah0811
left a comment
I have added Groq Client in Documentation.
@kunjanshah0811 now, we need to make sure all tests pass.
Solved the linting error.
@kunjanshah0811 there are some remaining errors. Before pushing, could you make sure they pass locally?
I ran both. Let me know if anything else is needed from my end. @davidberenstein1957 😀 Note: some tests require HuggingFace authentication, so that part is done locally via …
Awesome @kunjanshah0811. Thanks a lot for the help.
…e-multilingual-MiniLM-L12-v2
Fixed the minor errors. @kunjanshah0811 here: @davidberenstein1957 thanks a lot for the review. :)
davidberenstein1957
left a comment
Some minor tweaks for the dependencies and we are good to go. A lot of thanks for the help.
pyproject.toml (outdated)

```toml
"mistralai>=1",
"boto3>=1.34.88",
"scikit-learn==1.4.2",
"bert-score>=0.3.13",
```
Why did you duplicate these dependencies here? Is there a specific reason? I'm not sure we rely on them directly, and they might already be pulled in by other libraries, like langchain.
I added those dependencies because I ran into import errors during local testing, and adding them directly resolved the issue at the time. But you're right, other libraries like langchain might already pull in some of them, and I may have added them redundantly without double-checking afterwards. I'm happy to remove any that aren't needed directly.
I don't think we should set groq as default LLM client.
Could we revert the modifications here?
Thank you for the feedback. Before I make any changes, I want to ensure I understand your request correctly:
- I'll remove Groq from being automatically selected as the default LLM client by modifying the `get_default_llm_api()` function in `__init__.py`.
- I'll keep the following in place so Groq can still be used when explicitly selected:
  - The Groq client implementation (`groq_client.py`)
  - The Groq dependency in `pyproject.toml`
  - The test case for the Groq client
  - `"groq"` as a valid option in the API list (users can still explicitly set `GSK_LLM_API="groq"`)

Is this understanding correct? If you'd prefer I completely remove all Groq-related changes, please let me know.
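The reverted selection logic could look roughly like this. This is a sketch of the described behavior, not Giskard's exact code: the valid-backend set and the `"openai"` fallback are assumptions for illustration.

```python
import os

# Illustrative subset of valid backends; the real list lives in
# giskard's llm/client/__init__.py and may differ.
_VALID_LLM_APIS = {"litellm", "openai", "groq"}


def get_default_llm_api() -> str:
    """Honor an explicit GSK_LLM_API setting, but never auto-select Groq.

    Falls back to "openai" (assumed default) when the variable is unset.
    """
    api = os.getenv("GSK_LLM_API", "openai").lower()
    if api not in _VALID_LLM_APIS:
        raise ValueError(
            f"Unsupported GSK_LLM_API value {api!r}; expected one of {sorted(_VALID_LLM_APIS)}"
        )
    return api
```

With this shape, `GSK_LLM_API="groq"` still routes to the Groq client, but a fresh environment never picks it up implicitly.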
Hey @kunjanshah0811, yes exactly!
Just reverting llm/client/__init__.py should be fine, you can keep the groq_client.py as it is.
@henchaves I have made the suggested changes with the recent commit. Please have a look and let me know. Thanks a lot 🚀.
docs/open_source/setting_up/index.md (outdated)

```
# Note: Groq does not currently support embedding models
# Use another provider for embeddings if needed
```
Perfect! Thanks.
Could you also mention https://docs.litellm.ai/docs/embedding/supported_embedding so readers can see the full list of supported embedding providers?
- I'll add the documentation reference you suggested in `docs/open_source/setting_up/index.md`, linking to LiteLLM's supported embedding providers.
Thanks a lot @kunjanshah0811! The PR is ready to be merged 🙂
Thank you so much @davidberenstein1957 and @henchaves for your constant support! 🙏✨
Hi @kunjanshah0811, thanks for merging this PR last year. Giskard v3 is coming! We are closing some active issues and PRs related to v2 and shifting our focus to nailing v3. At the moment we are still actively developing v3, so we are not looking for external contributions yet, but we would love to hear your early feedback and expectations on our roadmap and discussion: https://github.com/orgs/Giskard-AI/discussions/2250.
Hi @davidberenstein1957 , it was a good learning experience for me working on this open-source project. I am excited to take a look at V3 and what's changing this time :)
@davidberenstein1957 @henchaves @alexcombessie
Description
This PR adds support for using Groq as an additional LLM client in Giskard, without replacing the default LiteLLM client. It enables developers to integrate Groq models like
`llama3-8b-8192` by configuring and initializing `GroqClient` explicitly.

Related Issue

Closes "Set any groq model as default LLMClient for giskard" #1977
Type of Change
Checklist
- I've read the `CODE_OF_CONDUCT.md` document.
- I've read the `CONTRIBUTING.md` guide.
- I've updated `pdm.lock` running `pdm update-lock` (only applicable when `pyproject.toml` has been modified)
Usage Example
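A hypothetical sketch of the explicit setup described above. The import path, the `GroqClient` constructor arguments, and the `set_default_client` registration call are assumptions based on this PR's discussion and may differ from the merged implementation:

```python
import os

import giskard
from giskard.llm.client.groq_client import GroqClient

# The Groq SDK conventionally reads its key from GROQ_API_KEY.
assert os.getenv("GROQ_API_KEY"), "Set GROQ_API_KEY before running"

# Instantiate the client for a specific Groq-hosted model...
client = GroqClient(model="llama3-8b-8192")

# ...and register it explicitly. Groq is *not* auto-selected as the default;
# the default client remains LiteLLM unless you opt in like this.
giskard.llm.set_default_client(client)
```

Note that Groq does not currently serve embedding models, so embeddings would still need to come from another provider (see LiteLLM's supported-embedding list linked in the docs discussion above).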