Breast Cancer Diagnosis Prediction Using LIME Model #396
Pull Request for PyVerse 💡
Requesting to submit a pull request to the PyVerse repository.
Issue Title
Please enter the title of the issue related to your pull request.
Explainable AI on Cancer Dataset
Info about the Related Issue
What's the goal of the project?
The main objective of this project is to predict whether a tumor is benign or malignant using machine learning models and explain the model’s decisions using LIME (Local Interpretable Model-Agnostic Explanations). The dataset used in this project is the Breast Cancer Dataset available from Kaggle.
Name
Please mention your name.
Janvi
GitHub ID
Please mention your GitHub ID.
inkerton
Email ID
Please mention your email ID for further communication.
janvichoudhary116@gmail.com
Identify Yourself
Mention in which program you are contributing (e.g., WoB, GSSOC, SSOC, SWOC).
GSSOC
Closes
Enter the issue number that will be closed through this PR.
Closes: #302
Describe the Add-ons or Changes You've Made
Give a clear description of what you have added or modified.
Decision Tree Classifier: A tree-based model used for classification of tumors based on their features.
LIME (Local Interpretable Model-Agnostic Explanations): Used to interpret the model's predictions by generating local, feature-level explanations for individual instances.
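A minimal sketch of the classifier described above. The PR uses the Kaggle breast cancer CSV; scikit-learn's bundled copy of the same dataset is substituted here so the snippet is self-contained, and hyperparameters like `max_depth=4` are illustrative, not the PR's actual settings:

```python
from sklearn.datasets import load_breast_cancer
from sklearn.model_selection import train_test_split
from sklearn.tree import DecisionTreeClassifier

# Load the dataset (stand-in for the Kaggle breast cancer CSV).
data = load_breast_cancer()
X_train, X_test, y_train, y_test = train_test_split(
    data.data, data.target, test_size=0.2, random_state=42
)

# Tree-based model used to classify tumors as benign or malignant.
clf = DecisionTreeClassifier(max_depth=4, random_state=42)
clf.fit(X_train, y_train)

print(f"Test accuracy: {clf.score(X_test, y_test):.3f}")
```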
I have described my changes.
Type of Change
Select the type of change:
How Has This Been Tested?
Describe how your changes have been tested.
Unit Testing: Each individual function and module has been tested to ensure it performs as expected. For example, data preprocessing steps and model predictions were verified against known inputs.
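As an illustration of the kind of unit check described (the concrete inputs here are made up for the sketch, not taken from the PR's test suite), a preprocessing step such as standard scaling can be verified against its known mathematical properties:

```python
import numpy as np
from sklearn.preprocessing import StandardScaler

# Toy input with known values: after standard scaling, each feature
# should have (approximately) zero mean and unit variance.
X = np.array([[1.0, 200.0], [2.0, 300.0], [3.0, 400.0]])
X_scaled = StandardScaler().fit_transform(X)

assert np.allclose(X_scaled.mean(axis=0), 0.0, atol=1e-8)
assert np.allclose(X_scaled.std(axis=0), 1.0, atol=1e-8)
print("preprocessing checks passed")
```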
Model Evaluation:
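A sketch of how the model evaluation can be run, again assuming scikit-learn's bundled dataset in place of the Kaggle CSV and an illustrative train/test split:

```python
from sklearn.datasets import load_breast_cancer
from sklearn.model_selection import train_test_split
from sklearn.tree import DecisionTreeClassifier
from sklearn.metrics import accuracy_score, confusion_matrix

data = load_breast_cancer()
X_train, X_test, y_train, y_test = train_test_split(
    data.data, data.target, test_size=0.2, random_state=42
)
clf = DecisionTreeClassifier(max_depth=4, random_state=42).fit(X_train, y_train)

# Held-out accuracy plus a confusion matrix to see how benign and
# malignant cases are each classified.
y_pred = clf.predict(X_test)
acc = accuracy_score(y_test, y_pred)
cm = confusion_matrix(y_test, y_pred)
print(f"accuracy: {acc:.3f}")
print("confusion matrix:\n", cm)
```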
LIME Interpretability:
Checklist
Please confirm the following: