Commit 0dd238d

[Penify]: Documentation for commit - 6093f2b (#105)
Co-authored-by: penify-dev[bot] <146478655+penify-dev[bot]@users.noreply.github.com>
1 parent 6093f2b commit 0dd238d

1 file changed

Lines changed: 9 additions & 5 deletions

File tree

penify_hook/llm_client.py
@@ -44,21 +44,25 @@ def generate_commit_summary(self, diff: str, message: str, generate_description:
         This function constructs a prompt for an LLM to produce a commit title and, if
         requested, a detailed description. The summary adheres to Semantic Commit
         Messages guidelines. If JIRA context is provided, it enriches the prompt with
-        relevant issue information.
+        relevant issue information. The function also handles token limits by
+        truncating large diffs and includes additional parameters for flexibility.
 
         Args:
             diff (str): Git diff of changes.
             message (str): User-provided commit message or instructions.
             generate_description (bool): Flag indicating whether to include a detailed description in the summary.
             repo_details (Dict): Details about the repository.
-            jira_context (Dict?): Optional JIRA issue context to enhance the summary.
+            jira_context (Dict?): Optional JIRA issue context to enhance the summary. Defaults to None.
+            additonal_param (str?): An additional parameter for further customization. Defaults to "".
+            additonal_param_2 (str?): Another additional parameter for further customization. Defaults to "".
 
         Returns:
-            Dict: A dictionary containing the title and description for the commit. If
-            `generate_description` is False, the 'description' key may be absent.
+            Dict: A dictionary containing 'title' and optionally 'description'.
 
         Raises:
-            ValueError: If the LLM model is not configured.
+            ValueError: If the JSON structure from the LLM response is invalid.
+            Exception: Any other errors during the process, which will exit the script with an error
+            message.
         """
         if not self.model:
             raise ValueError("LLM model not configured. Please provide a model when initializing LLMClient.")
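The contract this docstring describes can be sketched with a minimal stand-in class. Everything below is illustrative: the real `LLMClient` in `penify_hook/llm_client.py` prompts an actual LLM, while `StubLLMClient` and its canned output are assumptions that only mirror the documented behavior (fail fast without a model, return a dict with 'title' and optionally 'description'):

```python
from typing import Dict, Optional


class StubLLMClient:
    """Illustrative stand-in mirroring the documented LLMClient contract."""

    def __init__(self, model: Optional[str] = None):
        self.model = model

    def generate_commit_summary(self, diff: str, message: str,
                                generate_description: bool,
                                repo_details: Dict,
                                jira_context: Optional[Dict] = None) -> Dict:
        # Documented behavior: raise immediately when no model is configured.
        if not self.model:
            raise ValueError("LLM model not configured. Please provide a "
                             "model when initializing LLMClient.")
        # A real client would build a prompt from diff/message/jira_context
        # and call the LLM; here we return a canned Semantic Commit summary.
        summary: Dict = {"title": f"feat: {message}"}
        if generate_description:
            summary["description"] = "Details derived from the diff."
        return summary
```

Callers can rely on the 'description' key being present only when `generate_description` is true, which matches the updated Returns section above.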
