A Python package which helps to debug machine learning classifiers and explain their predictions

ELI5 is a Python package which helps to debug machine learning classifiers and explain their predictions. It provides support for the following machine learning frameworks and packages: scikit-learn. Currently ELI5 allows you to explain weights and predictions of scikit-learn linear classifiers and regressors, print decision trees as text or as SVG, show feature importances, and explain predictions of decision trees and tree-based ensembles. ELI5 understands text processing utilities from scikit-learn and can highlight text data accordingly. Pipeline and FeatureUnion are supported. […]
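To make this concrete, here is a minimal sketch (not taken from the ELI5 documentation) of inspecting a scikit-learn text classifier; the toy data and model choice are illustrative assumptions:

import eli5
from sklearn.feature_extraction.text import CountVectorizer
from sklearn.linear_model import LogisticRegression

# toy data, purely illustrative
texts = ["good movie", "bad movie", "great film", "awful film"]
labels = [1, 0, 1, 0]

vec = CountVectorizer()
X = vec.fit_transform(texts)
clf = LogisticRegression().fit(X, labels)

# inspect the learned weights of the linear classifier
print(eli5.format_as_text(eli5.explain_weights(clf, vec=vec)))

# explain a single prediction (eli5.show_prediction renders highlighted HTML in notebooks)
print(eli5.format_as_text(eli5.explain_prediction(clf, "great movie", vec=vec)))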

Read more

A game theoretic approach to explain the output of any machine learning model

SHAP (SHapley Additive exPlanations) is a game theoretic approach to explain the output of any machine learning model. It connects optimal credit allocation with local explanations using the classic Shapley values from game theory and their related extensions (see papers for details and citations). Installation: SHAP can be installed from either PyPI or conda-forge: pip install shap or conda install -c conda-forge shap. Tree ensemble example (XGBoost/LightGBM/CatBoost/scikit-learn/pyspark models): While SHAP can explain the output of any machine learning model, we […]
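As an illustration, here is a minimal sketch of the TreeExplainer workflow, using a scikit-learn random forest in place of the README's XGBoost example; the dataset and model choice are assumptions for illustration:

import shap
from sklearn.datasets import load_diabetes
from sklearn.ensemble import RandomForestRegressor

# fit a small tree ensemble on a bundled dataset
X, y = load_diabetes(return_X_y=True, as_frame=True)
model = RandomForestRegressor(n_estimators=50, random_state=0).fit(X, y)

# TreeExplainer computes SHAP values efficiently for tree-based models
explainer = shap.TreeExplainer(model)
shap_values = explainer.shap_values(X)

# visualize how each feature pushes predictions up or down across the dataset
shap.summary_plot(shap_values, X)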

Read more

Scanning your Conda environment for security vulnerabilities

You don’t want to deploy an application that has security vulnerabilities. That means your own code, but also third-party dependencies: it doesn’t matter how secure your code is if it’s exposing a TLS socket with a version of OpenSSL that has a remote code execution vulnerability. For pip-based Python applications, you’d usually run vulnerability scanners on Python dependencies like Django, and on system packages like OpenSSL. With Conda, however, the situation is a little different: Conda combines both types of […]
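As a starting point for such scanning, here is a small sketch (not from the article) that inventories the active Conda environment with conda list --json, which covers both Python libraries and system packages like OpenSSL:

import json
import subprocess

# "conda list --json" emits the full package inventory of the active environment,
# including system-level packages such as openssl as well as Python libraries
output = subprocess.run(
    ["conda", "list", "--json"], capture_output=True, text=True, check=True
).stdout

for pkg in json.loads(output):
    # name, version and channel are what a scanner would match against advisories
    print(f"{pkg['name']}=={pkg['version']} (channel: {pkg.get('channel', '?')})")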

Read more

Explainability Requires Interactivity In Python

This repository contains the code to train all custom models used in the paper Explainability Requires Interactivity, as well as to create all static explanations (heat maps and generative). For our interactive framework, see the sister repository. Precomputed generative explanations are located at static_generative_explanations. Requirements: Install the conda environment via conda env create -f env.yml (depending on your system you might need to change some versions, e.g. for pytorch, cudatoolkit and pytorch-lightning). For some parts you will need the FairFace […]

Read more

The VeriNet toolkit for verification of neural networks

The VeriNet toolkit is a state-of-the-art sound and complete symbolic interval propagation based toolkit for verification of neural networks. VeriNet won second place overall and was the best-performing toolkit among those not using GPUs in the 2nd International Verification of Neural Networks Competition. VeriNet is developed at the Verification of Autonomous Systems (VAS) group, Imperial College London. Relevant publications: VeriNet is developed as part of the following publications: Efficient Neural Network Verification via Adaptive Refinement and Adversarial Search; DEEPSPLIT: An […]

Read more

A Python-based package that offers a way of creating a parametric OpenMC plasma source from plasma parameters

This Python-based package offers a way of creating a parametric OpenMC plasma source from plasma parameters. The OpenMC sources are ring sources, which reduces the computational cost and the settings.xml file size. The equations implemented here are taken from this paper. Installation: To install openmc-plasma-source, simply run: pip install openmc-plasma-source. Basic usage: from openmc_plasma_source import Plasma # create a plasma source […]
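For context, a rough sketch of constructing such a source is shown below; the Plasma class name comes from the excerpt, but the keyword arguments are hypothetical plasma parameters, not the package's documented signature:

from openmc_plasma_source import Plasma  # class name taken from the excerpt above

# Hypothetical plasma parameters; check the package documentation for the
# actual keyword names, required arguments and units.
my_plasma = Plasma(
    major_radius=620,    # assumed name and units (cm)
    minor_radius=210,    # assumed name and units (cm)
    elongation=1.7,      # assumed name
    triangularity=0.33,  # assumed name
)

# The resulting ring sources are then attached to the OpenMC settings as
# described in the package documentation.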

Read more

Finding the LCM using Python from scratch

Here, I write a practice program to find the LCM, i.e. the Lowest Common Multiple, of two numbers using Python without any libraries. Requirements: No special requirements. Contribution: I strongly believe in open source contribution and I promote it, so I invite you all to add new features to this project and to simplify the code where you can. Come here and show your talent. Authors and acknowledgment: Author Name: Sachin Vinayak Dabhade. License: GNU – Open for […]
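For reference, here is a minimal sketch of one way to do this (not the repository's actual code): compute the GCD with Euclid's algorithm, then use lcm(a, b) = |a * b| / gcd(a, b).

def gcd(a, b):
    # Euclid's algorithm: repeatedly replace (a, b) with (b, a mod b)
    while b:
        a, b = b, a % b
    return a

def lcm(a, b):
    # the LCM is the absolute product divided by the GCD
    return abs(a * b) // gcd(a, b) if a and b else 0

if __name__ == "__main__":
    x = int(input("Enter first number: "))
    y = int(input("Enter second number: "))
    print(f"LCM of {x} and {y} is {lcm(x, y)}")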

Read more

Know Your Customer pipeline in Apache Airflow

For a successful pipeline run, take these steps: run your Airflow server; go to Admin -> Connections -> Create; trigger the input_dag; before triggering the File_process DAG, move one of the JSON files into the tmp folder (for example request_1411.json), since the program must be given the right name of the JSON file to load; then trigger the File_process DAG. Input: For our data, when I read the assignment and what I understood […]
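The repository's DAG code is not shown in the excerpt; as a rough sketch under assumed task logic, a File_process-style DAG that loads one JSON file from the tmp folder might look like this:

import json
from datetime import datetime
from pathlib import Path

from airflow import DAG
from airflow.operators.python import PythonOperator

# path and file name are assumptions matching the steps described above
TMP_FILE = Path("/tmp/request_1411.json")

def load_json_file():
    # read the request file that was moved into the tmp folder
    with open(TMP_FILE) as f:
        payload = json.load(f)
    print(f"Loaded {TMP_FILE} with {len(payload)} top-level keys")

with DAG(
    dag_id="File_process",           # DAG id taken from the excerpt
    start_date=datetime(2021, 1, 1),
    schedule_interval=None,          # triggered manually, as described
    catchup=False,
) as dag:
    load_task = PythonOperator(
        task_id="load_json_file",
        python_callable=load_json_file,
    )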

Read more

A sample application that demonstrates integrating Firmalyzer’s IoTVAS API

This repository hosts a sample application that demonstrates integrating Firmalyzer’s IoTVAS API with the Rapid7 InsightVM platform. This integration enables InsightVM users to: accurately identify IoT/connected devices and their vulnerabilities at the firmware code level, and track and manage discontinued, outdated and vulnerable devices from within the InsightVM platform. Clone the repository content to a local folder and issue the following commands: python3 -m venv env, source env/bin/activate, pip install -r requirements.txt. Note: This application is based on the InsightVM API client (located […]

Read more