
Explainable AI: Using Local Interpretable Model-agnostic Explanations (LIME) & SHapley Additive exPlanations (SHAP) #1109

Merged
3 commits merged into UTSAVS26:main on Nov 7, 2024

Conversation

inkerton
Contributor

@inkerton inkerton commented Nov 5, 2024

Pull Request for PyVerse 💡

This pull request is submitted to the PyVerse repository.


Issue Title

Please enter the title of the issue related to your pull request.
Explainable AI: Using Local Interpretable Model-agnostic Explanations (LIME) & SHapley Additive exPlanations (SHAP)

  • I have provided the issue title.

Info about the Related Issue

What's the goal of the project?
Project Description
Explainable AI: Using LIME and SHAP

In the realm of machine learning, models often operate as "black boxes," making it difficult to understand how they arrive at their decisions. Explainable AI (XAI) seeks to demystify these models, providing insights into their inner workings. Two powerful techniques for achieving this are Local Interpretable Model-Agnostic Explanations (LIME) and SHapley Additive exPlanations (SHAP).

LIME (Local Interpretable Model-Agnostic Explanations)

LIME focuses on explaining individual predictions rather than the entire model. It works by perturbing the input data and observing how the model's predictions change. LIME then fits a simple, interpretable model (like a linear model) to these perturbed instances and their corresponding predictions. This local model can be easily understood and provides insights into the factors that influenced the original model's prediction.
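The perturb-then-fit idea above can be sketched without any external explainability library. The snippet below is a minimal, illustrative sketch only (the real `lime` package does much more, e.g. feature discretization and exponential kernels over binary masks); the `black_box` function is a hypothetical stand-in for a trained model.

```python
import numpy as np

# Hypothetical "black box": a nonlinear function standing in for a trained model.
def black_box(X):
    logit = 3.0 * X[:, 0] - 2.0 * X[:, 1] + 0.5 * X[:, 0] * X[:, 1]
    return 1.0 / (1.0 + np.exp(-logit))

def lime_explain(predict_fn, x, n_samples=5000, kernel_width=0.75, seed=0):
    """Fit a locally weighted linear surrogate around instance x."""
    rng = np.random.default_rng(seed)
    # 1. Perturb the instance with Gaussian noise.
    Z = x + rng.normal(scale=0.5, size=(n_samples, x.size))
    y = predict_fn(Z)
    # 2. Weight perturbed samples by proximity to x (exponential kernel).
    d = np.linalg.norm(Z - x, axis=1)
    w = np.exp(-(d ** 2) / (kernel_width ** 2))
    # 3. Weighted least squares: the coefficients are local feature importances.
    A = np.hstack([Z, np.ones((n_samples, 1))])   # add intercept column
    sw = np.sqrt(w)
    coef, *_ = np.linalg.lstsq(A * sw[:, None], y * sw, rcond=None)
    return coef[:-1]                              # drop the intercept

x0 = np.array([0.2, -0.1])
weights = lime_explain(black_box, x0)
print(weights)  # signs should match the true local effects (+ for x0, - for x1)
```

Near `x0` the black box increases with the first feature and decreases with the second, and the surrogate's coefficients recover exactly that, which is the kind of per-prediction insight LIME provides.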

SHAP (SHapley Additive exPlanations)

SHAP, on the other hand, leverages game theory to assign importance to each feature in a model's prediction. It calculates Shapley values, which represent the average marginal contribution of a feature to the model's output across all possible feature combinations. By examining these Shapley values, we can understand how much each feature contributed to the final prediction.
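For a handful of features, the Shapley values described above can be computed exactly by enumerating every coalition; this brute-force sketch is only feasible for tiny inputs (the `shap` library uses efficient approximations such as Kernel SHAP and Tree SHAP). The linear `model` and zero `baseline` below are hypothetical choices for illustration.

```python
from itertools import combinations
from math import factorial

# Hypothetical model; "missing" features are replaced by a baseline value.
def model(x):
    return 2.0 * x[0] + 1.0 * x[1] - 3.0 * x[2]

def shapley_values(f, x, baseline):
    """Exact Shapley values: average marginal contribution over all coalitions."""
    n = len(x)
    phi = [0.0] * n
    for i in range(n):
        others = [j for j in range(n) if j != i]
        for k in range(n):
            for S in combinations(others, k):
                # Evaluate f with coalition S, with and without feature i.
                with_i = [x[j] if (j in S or j == i) else baseline[j] for j in range(n)]
                without_i = [x[j] if j in S else baseline[j] for j in range(n)]
                weight = factorial(k) * factorial(n - k - 1) / factorial(n)
                phi[i] += weight * (f(with_i) - f(without_i))
    return phi

x = [1.0, 2.0, 0.5]
base = [0.0, 0.0, 0.0]
phi = shapley_values(model, x, base)
print(phi)  # [2.0, 2.0, -1.5] for this additive model
# Efficiency property: contributions sum to f(x) - f(baseline).
assert abs(sum(phi) - (model(x) - model(base))) < 1e-9
```

For a linear model each Shapley value is simply the coefficient times the feature's deviation from the baseline, and the efficiency check at the end illustrates the "additive" part of SHapley Additive exPlanations.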

Key Differences Between LIME and SHAP:

Feature | LIME | SHAP
-- | -- | --
Focus | Local explanations for individual predictions | Global explanations for the entire model
Model | Fits a simple, interpretable model locally | Uses game theory to calculate feature importance
Visualization | Often uses bar charts or heatmaps to show feature importance | Uses force plots or decision plots to visualize feature contributions

When to Use LIME or SHAP:

LIME:
  • Ideal for understanding the reasons behind specific predictions.
  • Useful for models that are difficult to interpret directly.
  • Can be applied to a wide range of models, including deep neural networks.

SHAP:
  • Provides a global understanding of feature importance across the entire dataset.
  • Can be used to identify the most influential features for a given model.
  • Offers a more rigorous and mathematically sound approach to feature attribution.
Real-World Applications:

  • Healthcare: Understanding why a model predicts a certain disease diagnosis.
  • Finance: Explaining credit decisions or stock price predictions.
  • Autonomous Vehicles: Interpreting the reasons behind a self-driving car's actions.
  • Criminal Justice: Assessing the fairness of algorithmic decision-making.

By using LIME and SHAP, we can enhance the transparency, accountability, and trustworthiness of AI systems. These techniques empower us to make informed decisions, identify biases, and improve the overall performance of machine learning models.

  • I have described the aim of the project.

Name

Please mention your name.
inkerton

  • I have provided my name.

GitHub ID

Please mention your GitHub ID.
inkerton

  • I have provided my GitHub ID.

Email ID

Please mention your email ID for further communication.
janvichoudhary116@gmail.com

  • I have provided my email ID.

Identify Yourself

Mention in which program you are contributing (e.g., WoB, GSSOC, SSOC, SWOC).
GSSOC

  • I have mentioned my participant role.

Closes

Enter the issue number that will be closed through this PR.
Closes: #1085

  • I have provided the issue number.

Type of Change

Select the type of change:

  • Bug fix (non-breaking change which fixes an issue)
  • New feature (non-breaking change which adds functionality)
  • Code style update (formatting, local variables)
  • Breaking change (fix or feature that would cause existing functionality to not work as expected)
  • This change requires a documentation update

Checklist

Please confirm the following:

  • My code follows the guidelines of this project.
  • I have performed a self-review of my own code.
  • I have commented my code, particularly wherever it was hard to understand.
  • I have made corresponding changes to the documentation.
  • My changes generate no new warnings.
  • I have added tests that prove my fix is effective or that my feature works.
  • Any dependent changes have been merged and published in downstream modules.


github-actions bot commented Nov 5, 2024

👋 Thank you for opening this pull request! We're excited to review your contribution. Please give us a moment, and we'll get back to you shortly!

Feel free to join our community on Discord to discuss more!

@UTSAVS26 UTSAVS26 added Contributor Denotes issues or PRs submitted by contributors to acknowledge their participation. Status: Review Ongoing 🔄 PR is currently under review and awaiting feedback from reviewers. level1 gssoc-ext labels Nov 6, 2024
@github-actions github-actions bot force-pushed the main branch 2 times, most recently from 609b090 to c1dc75e on November 7, 2024 02:42

@ruhi47 ruhi47 left a comment


Great work!

@ruhi47 ruhi47 added Status: Approved ✔️ PRs that have passed review and are approved for merging. and removed Status: Review Ongoing 🔄 PR is currently under review and awaiting feedback from reviewers. labels Nov 7, 2024
@UTSAVS26 UTSAVS26 merged commit 9dcfeed into UTSAVS26:main Nov 7, 2024
1 of 2 checks passed