[WIP] feat: Add class and prompts for issue label generator
Enhance Issue Analysis and Improve Output Formatting
Overview
This pull request introduces a new module for issue analysis, enhancing the functionality for generating issue labels and descriptions based on provided prompts. It also includes minor formatting improvements across existing code for better readability.
Changes
Key Changes:
- Added a new file `issue_analysis.py` that contains the `IssueAnalysisGenerator` class for generating issue labels and descriptions using a language model.
- Introduced two new data classes, `IssueLabelOutput` and `IssueDescOutput`, to encapsulate the outputs of the label and description generation processes.
- Implemented methods `generate_issue_labels` and `generate_issue_desc` to handle the respective functionalities, including error handling for missing inputs (a rough sketch of these pieces follows this list).
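For reviewers who want a mental model before opening the diff, here is a minimal sketch of how these pieces could fit together. The field names, the `provider` argument, and its `chat_completion` method are assumptions made for illustration; the real module may differ.

```python
from dataclasses import dataclass
from typing import List, Optional


@dataclass
class IssueLabelOutput:
    # Hypothetical fields chosen for illustration.
    labels: List[str]
    usage: Optional[dict] = None


@dataclass
class IssueDescOutput:
    # Hypothetical fields chosen for illustration.
    desc: str
    usage: Optional[dict] = None


class IssueAnalysisGenerator:
    def __init__(self, provider):
        # `provider` stands in for whatever LLM wrapper the project uses;
        # its `chat_completion` method is assumed, not taken from the PR.
        self.provider = provider

    def generate_issue_labels(self, issue_title: str, issue_description: str) -> IssueLabelOutput:
        # Fail fast on missing input, as described in the summary above.
        if not issue_title or not issue_description:
            raise ValueError("Both an issue title and a description are required")
        response = self.provider.chat_completion(
            f"Suggest labels for this issue.\nTitle: {issue_title}\nDescription: {issue_description}"
        )
        labels = [label.strip() for label in response.split(",") if label.strip()]
        return IssueLabelOutput(labels=labels)

    def generate_issue_desc(self, issue_title: str, issue_description: str) -> IssueDescOutput:
        if not issue_title:
            raise ValueError("An issue title is required")
        response = self.provider.chat_completion(
            f"Rewrite this issue as a structured description.\nTitle: {issue_title}\nDescription: {issue_description}"
        )
        return IssueDescOutput(desc=response)
```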
New Features:
- The `IssueAnalysisGenerator` class provides functionality to generate labels and descriptions for GitHub issues, utilizing prompts defined in `issue_analysis_prompts.py`.
- New prompts for issue analysis have been added, allowing for structured feedback generation based on issue titles and descriptions (an illustrative prompt sketch follows this list).
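The prompt text itself is not quoted in this summary; purely as an illustration of "structured feedback based on issue titles and descriptions", templates in `issue_analysis_prompts.py` could look roughly like the following (constant names and placeholders are guesses, not the actual prompts):

```python
# Hypothetical templates; the real prompts in issue_analysis_prompts.py may differ.
ISSUE_LABEL_PROMPT = """
You are triaging a GitHub issue. Based on the title and description below,
suggest a short list of labels (for example: bug, enhancement, documentation).
Return the labels as a comma-separated list.

Title: {ISSUE_TITLE}
Description: {ISSUE_DESCRIPTION}
"""

ISSUE_DESC_PROMPT = """
Rewrite the GitHub issue below as a clear, structured description with
sections for context, expected behaviour, and steps to reproduce.

Title: {ISSUE_TITLE}
Description: {ISSUE_DESCRIPTION}
"""
```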
Refactoring:
- Improved string formatting in existing print statements across multiple files for consistency and readability.
- Added a placeholder function `create_issue_description` in `output.py` for future implementation, ensuring the structure is in place for generating issue descriptions (see the sketch after this list).
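A placeholder like the one described typically just reserves the interface; the signature below is a guess based on this summary, not the code in the diff:

```python
def create_issue_description(issue_title: str, issue_description: str) -> str:
    # Placeholder only: reserves the interface until issue description
    # generation is wired up in a follow-up change.
    raise NotImplementedError("Issue description generation is not implemented yet")
```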
Additional Notes
- Error Handling: The new methods include checks for missing issue labels and descriptions, raising exceptions as necessary. This is good practice, ensuring the methods fail gracefully when given invalid input.
- Logging: The use of logging within the processing methods enhances traceability and debugging capabilities, which is beneficial for maintaining the code.
- Maintainability: The introduction of data classes improves the organization of output data, making it easier to manage and extend in the future.
- Performance: The methods for processing issues leverage the language model efficiently, but consider implementing a caching mechanism if the same issues are processed frequently, to reduce API calls.
- Testing: Ensure that unit tests are created for the new functionality in `issue_analysis.py` to validate the behavior and output of the new methods (a test sketch appears after this list).
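As a starting point for those tests, here is a hedged pytest sketch that stubs out the language model so no API calls are made. It assumes the illustrative `IssueAnalysisGenerator` sketch above (a constructor taking a provider, `ValueError` on missing input); adjust the import and assertions to the real API.

```python
import pytest

# Assumes the sketch above is what lives in kaizen/generator/issue_analysis.py;
# adjust the import path and API details to match the real module.
from kaizen.generator.issue_analysis import IssueAnalysisGenerator


class FakeProvider:
    # Returns a canned completion so the tests never call a real LLM.
    def chat_completion(self, prompt: str) -> str:
        return "bug, enhancement"


def test_generate_issue_labels_returns_parsed_labels():
    generator = IssueAnalysisGenerator(FakeProvider())
    result = generator.generate_issue_labels("App crashes on login", "Steps to reproduce: ...")
    assert result.labels == ["bug", "enhancement"]


def test_generate_issue_labels_rejects_missing_input():
    generator = IssueAnalysisGenerator(FakeProvider())
    with pytest.raises(ValueError):
        generator.generate_issue_labels("", "")
```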
✨ Generated with love by Kaizen ❤️
Original Description
Is this task ready for review or are we waiting on anything specific?
The issue analysis generation is done. I'm still working on adding it to the GitHub app.
Maybe we should merge this and create a separate task for the GitHub app?
That sounds good. I'll put this PR up for review.
@sauravpanda Do I need to make any more changes to this PR? I'm still working on the GitHub app integration for issue analysis and will make a PR soon.
Code Review Summary
✅ All Clear: This commit looks good!
Stats
- Total Feedbacks: 0
- Suggested Refinements: 0
- Files Affected: 0
Code Quality
[██████████████████░░] 90% (Excellent)
🧪 Test Cases
Test Cases need updates: Run !unittest to generate
Tests Not Found
The following files are missing corresponding test files:
- kaizen/generator/issue_analysis.py
- kaizen/generator/unit_test.py
- kaizen/reviewer/code_scan.py
- kaizen/generator/e2e_tests.py
- kaizen/llms/prompts/issue_analysis_prompts.py
- kaizen/helpers/output.py
Tests Found But May Need Update
The following test files may need to be updated to reflect recent changes:
Generate Unit Tests
To generate unit test cases for the code, please type !unittest in a comment. This will create a new pull request with suggested unit tests.
✨ Generated with love by Kaizen ❤️
Useful Commands
- Feedback: Reply with !feedback [your message]
- Ask PR: Reply with !ask-pr [your question]
- Review: Reply with !review
- Explain: Reply with !explain [issue number] for more details on a specific issue
- Ignore: Reply with !ignore [issue number] to mark an issue as false positive
- Update Tests: Reply with !unittest to create a PR with test changes