Getting Started¶
Backward Compatibility ML library requirements¶
The requirements for installing and running the Backward Compatibility ML library are:
- Windows 10 / Linux OS (tested on Ubuntu 18.04 LTS)
- Python 3.6
Installing the Backward Compatibility ML library¶
Follow these steps to install the Backward Compatibility ML library on your computer. For convenience, you may want to use Anaconda (or another virtual environment manager):
1. (optional) Prepare a conda virtual environment:
```
conda create -n bcml python=3.6
conda activate bcml
```

2. (optional) Ensure you have the latest pip:

```
python -m pip install --upgrade pip
```

3. Install the Backward Compatibility ML library:

- On Linux:

```
pip install backwardcompatibilityml
```

- On Windows:

```
pip install backwardcompatibilityml -f https://download.pytorch.org/whl/torch_stable.html
```

4. Import the `backwardcompatibilityml` package in your code. For example:

```python
import backwardcompatibilityml.loss as bcloss
import backwardcompatibilityml.scores as scores
```
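To get a feel for what the `scores` module measures, here is a minimal pure-Python sketch of backward trust compatibility (BTC): of the examples the existing model h1 classified correctly, the fraction that the newly trained model h2 also gets right. This is a conceptual illustration only; the function name and signature below are our own, not the library's API.

```python
def trust_compatibility(h1_preds, h2_preds, labels):
    """Conceptual BTC sketch (not the library API): of the examples
    the old model h1 got right, the fraction h2 also gets right."""
    h1_correct = [i for i, (p, y) in enumerate(zip(h1_preds, labels)) if p == y]
    if not h1_correct:
        return 1.0  # vacuously compatible: h1 was never right
    return sum(h2_preds[i] == labels[i] for i in h1_correct) / len(h1_correct)

# h1 is right on indices 0, 1, 3; h2 agrees on 0 and 3 -> BTC = 2/3
btc = trust_compatibility([0, 1, 1, 0], [0, 0, 1, 0], [0, 1, 0, 0])
```

A BTC near 1.0 means an update rarely breaks predictions users already trusted; the library's loss functions trade this off against the new model's accuracy.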
Running the Backward Compatibility ML library examples¶
Note
The Backward Compatibility ML library examples are Jupyter notebooks, so Jupyter must be installed. The steps below also assume that you have git installed on your system.
The Backward Compatibility ML library includes several examples so you can quickly get an idea of its benefits and learn how to integrate it into your existing ML training workflow.
To download and run the examples, follow these steps:
1. Clone the BackwardCompatibilityML repository:
```
git clone https://github.com/microsoft/BackwardCompatibilityML.git
```

2. Install the requirements for the examples:

```
cd BackwardCompatibilityML
```

- On Linux:

```
pip install -r example-requirements.txt
```

- On Windows:

```
pip install -r example-requirements.txt -f https://download.pytorch.org/whl/torch_stable.html
```

3. Start your Jupyter Notebook server and load an example notebook from the `examples` folder:

```
cd examples
jupyter notebook
```
Backward Compatibility ML library examples included¶
Notebook name | Framework | Dataset | Network | Optimizer | Backward Compatibility Dissonance Function | Backward Compatibility Loss Function | Uses CompatibilityAnalysis widget | Uses CompatibilityModel class | Uses ModelComparison widget |
--- | --- | --- | --- | --- | --- | --- | --- | --- | --- |
bcbinary_cross_entropy | PyTorch | UCI Adult Data Set | LogisticRegression | SGD | New Error | Binary Cross-entropy Loss | N | N/A | N |
bckldivergence | PyTorch | MNIST | Custom | SGD | New Error | Kullback–Leibler Divergence Loss | N | N/A | N |
bcnllloss | PyTorch | MNIST | Custom | SGD | New Error | Negative Log Likelihood Loss | N | N/A | N |
compatibility-analysis | PyTorch | MNIST | Custom | SGD | New Error & Strict Imitation | Cross-entropy Loss | Y | N/A | N |
compatibility-analysis-adult | PyTorch | UCI Adult Data Set | LogisticRegression | SGD | New Error & Strict Imitation | Cross-entropy Loss | Y | N/A | N |
compatibility-analysis-adult-kldiv | PyTorch | UCI Adult Data Set | LogisticRegression | SGD | New Error & Strict Imitation | Kullback–Leibler Divergence Loss | Y | N/A | N |
compatibility-analysis-cifar10-resnet18 | PyTorch | CIFAR10 | Custom & ResNet-18 | SGD | New Error & Strict Imitation | Cross-entropy Loss | Y | N/A | N |
compatibility-analysis-cifar10-resnet18-pretrained | PyTorch | CIFAR10 | Custom & ResNet-18 (pretrained) | SGD | New Error & Strict Imitation | Cross-entropy Loss | Y | N/A | N |
compatibility-analysis-from-saved-data | PyTorch | MNIST | Custom | SGD | New Error & Strict Imitation | Cross-entropy Loss | Y | N/A | N |
compatibility-analysis-kldiv | PyTorch | MNIST | Custom | SGD | New Error & Strict Imitation | Kullback–Leibler Divergence Loss | Y | N/A | N |
model-comparison-MNIST | PyTorch | MNIST | Custom | SGD | N/A | N/A | N/A | N/A | Y |
si_cross_entropy_loss | PyTorch | MNIST | Custom | SGD | Strict Imitation | Cross-entropy Loss | N | N/A | N |
si_nllloss | PyTorch | MNIST | Custom | SGD | Strict Imitation | Negative Log Likelihood Loss | N | N/A | N |
tensorflow-MNIST-generalized | TensorFlow | MNIST | Custom | Adam | New Error | Cross-entropy Loss | N/A | N | N/A |
tensorflow-MNIST | TensorFlow | MNIST | Custom | Adam | New Error | Cross-entropy Loss | N/A | Y | N/A |
tensorflow-new-error-binary-cross-entropy-loss | TensorFlow | MNIST | Custom | Adam | New Error | Binary Cross-entropy Loss | N/A | N | N/A |
tensorflow-new-error-cross-entropy-loss | TensorFlow | MNIST | Custom | Adam | New Error | Cross-entropy Loss | N/A | N | N/A |
tensorflow-new-error-kldiv-loss | TensorFlow | MNIST | Custom | Adam | New Error | Kullback–Leibler Divergence Loss | N/A | N | N/A |
tensorflow-new-error-nll-loss | TensorFlow | MNIST | Custom | Adam | New Error | Negative Log Likelihood Loss | N/A | N | N/A |
tensorflow-strict-imitation-binary-cross-entropy-loss | TensorFlow | MNIST | Custom | Adam | Strict Imitation | Binary Cross-entropy Loss | N/A | N | N/A |
tensorflow-strict-imitation-cross-entropy-loss | TensorFlow | MNIST | Custom | Adam | Strict Imitation | Cross-entropy Loss | N/A | N | N/A |
tensorflow-strict-imitation-kldiv-loss | TensorFlow | MNIST | Custom | Adam | Strict Imitation | Kullback–Leibler Divergence Loss | N/A | N | N/A |
tensorflow-strict-imitation-nll-loss | TensorFlow | MNIST | Custom | Adam | Strict Imitation | Negative Log Likelihood Loss | N/A | N | N/A |
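The "New Error" and "Strict Imitation" columns above name the two dissonance functions the library supports. As a rough conceptual sketch (toy pure-Python code with illustrative names and a made-up `lambda_c` weight, not the library's implementation): New Error adds an extra penalty for the new model h2 only on examples the old model h1 classified correctly, while Strict Imitation instead pushes h2's output distribution toward h1's on those examples.

```python
import math

def cross_entropy(probs, label):
    # negative log-likelihood of the true label under predicted probabilities
    return -math.log(probs[label])

def new_error_dissonance(h1_pred, h2_probs, label):
    # New Error: extra penalty only where the old model h1 was already correct
    return cross_entropy(h2_probs, label) if h1_pred == label else 0.0

def strict_imitation_dissonance(h1_pred, h1_probs, h2_probs, label):
    # Strict Imitation: where h1 was correct, pull h2's distribution toward
    # h1's (here measured with a KL divergence between the two)
    if h1_pred != label:
        return 0.0
    return sum(p * math.log(p / q) for p, q in zip(h1_probs, h2_probs) if p > 0)

def bc_loss(h1_pred, h2_probs, label, lambda_c=0.5):
    # combined backward-compatibility loss: base loss + lambda_c * dissonance
    return cross_entropy(h2_probs, label) + lambda_c * new_error_dissonance(h1_pred, h2_probs, label)
```

Raising `lambda_c` makes the new model pay more for regressing on examples the old model handled correctly, at some cost to overall accuracy; the loss classes in `backwardcompatibilityml.loss` apply the same trade-off at training time.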
Next steps¶
Do you want to learn how to integrate the Backward Compatibility ML loss functions into your new or existing ML training workflows? Follow this tutorial.
If you want to ask us a question, suggest a feature or report a bug, please contact the team by filing an issue in our repository on GitHub. We look forward to hearing from you!