This repository contains a Keras implementation of deep neural networks for embedding learning using Siamese and Triplet approaches with different negative sample mining strategies.
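The triplet approach builds on the triplet loss of Schroff et al. [1], which pulls an anchor and a positive sample of the same class together while pushing a negative sample of another class at least a margin away. The snippet below is only a minimal sketch of that loss in TensorFlow/Keras for illustration; it is not the exact loss or mining code used in this repository, and the margin value is an arbitrary assumption.

```python
import tensorflow as tf

def triplet_loss(anchor, positive, negative, margin=0.5):
    # Squared Euclidean distances between embedding vectors
    pos_dist = tf.reduce_sum(tf.square(anchor - positive), axis=-1)
    neg_dist = tf.reduce_sum(tf.square(anchor - negative), axis=-1)
    # Hinge: penalize triplets where the negative is not at least
    # `margin` farther from the anchor than the positive
    return tf.reduce_mean(tf.maximum(pos_dist - neg_dist + margin, 0.0))
```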
$ git clone git@github.com:RocketFlash/EmbeddingNet.git
- tensorflow=2.2.0
- scikit-learn
- opencv
- matplotlib
- plotly - for interactive t-SNE plot visualization
- albumentations - for online augmentation during training
- image-classifiers - for different backbone models
- efficientnet - for efficientnet backbones
- keras-rectified-adam - for the RAdam (Rectified Adam) optimizer
Requirements can be installed using the following command:
$ pip install -r requirements.txt
To use wandb (Weights & Biases) logging, install it with:
$ pip install wandb
There are two options for organizing the training data: annotation files and data folders.
The simplest way to start training is to generate annotation CSV files. Each file must contain two columns: file path and class name. For example, generate train.csv and val.csv, each containing the columns "image_id" and "label".
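For illustration, such files could be generated with a short pandas script like the sketch below (the image paths and class names here are hypothetical; only the "image_id" and "label" column names come from the description above):

```python
import pandas as pd

# Hypothetical paths and class names, for illustration only
train_rows = [
    {"image_id": "dataset/train/class_1/image1.jpg", "label": "class_1"},
    {"image_id": "dataset/train/class_2/image1.jpg", "label": "class_2"},
]
val_rows = [
    {"image_id": "dataset/val/class_1/image2.jpg", "label": "class_1"},
]

pd.DataFrame(train_rows).to_csv("train.csv", index=False)
pd.DataFrame(val_rows).to_csv("val.csv", index=False)
```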
The other option is to split images into folders. Data for training and validation should be in separate folders, each containing one subfolder with images per class. The dataset should have the following structure:
Dataset
└───train
│ └───class_1
│ │ image1.jpg
│ │ image2.jpg
│ │ ...
│ └───class_2
│ │ image1.jpg
│ │ image2.jpg
│ │ ...
│ └───class_N
│ │ ...
│
└───val
│ └───class_1
│ │ image1.jpg
│ │ image2.jpg
│ │ ...
│ └───class_2
│ │ image1.jpg
│ │ image2.jpg
│ │ ...
│ └───class_N
│ │ ...
For training, you need to create a configuration file specifying all network and training parameters. Examples of configuration files can be found in the configs folder.
Once the configuration file is created, you can run train.py to start training:
$ python3 train.py [config (path to configuration_file)]
[--resume_from (the checkpoint file to resume from)]
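For example, assuming a configuration file named configs/my_config.yml and a previously saved checkpoint my_checkpoint.h5 (both file names are hypothetical), a run could look like:

$ python3 train.py configs/my_config.yml --resume_from my_checkpoint.h5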
The trained model can be tested using the following command:
$ python3 test.py [config (path to configuration_file)]
[--weights (path to trained model weights file)]
[--encodings (path to trained model encodings file)]
[--image (path to image file)]
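For example, with hypothetical paths for the configuration, weights, encodings and query image:

$ python3 test.py configs/my_config.yml --weights best_model.h5 --encodings encodings.pkl --image query.jpg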
It is also possible to use the test_network.ipynb notebook to test the trained network and visualize the input data as well as the output encodings.
The resulting encodings can be visualized interactively using the plot_tsne_interactive function in utils.py.
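A possible usage sketch is shown below; the exact signature of plot_tsne_interactive and the format of the saved encodings may differ, so check utils.py and test.py for the actual interface (the encodings file path and the pickle format are assumptions here):

```python
import pickle
from utils import plot_tsne_interactive

# Assumption: encodings were saved as a pickle file; adjust to the
# actual format produced by training/testing in this repository
with open("encodings.pkl", "rb") as f:
    encodings = pickle.load(f)

plot_tsne_interactive(encodings)
```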
t-SNE plot of Russian traffic sign image embeddings (107 classes)
[1] Florian Schroff, Dmitry Kalenichenko, and James Philbin. FaceNet: A Unified Embedding for Face Recognition and Clustering. CVPR 2015
[2] Alexander Hermans, Lucas Beyer, and Bastian Leibe. In Defense of the Triplet Loss for Person Re-Identification, 2017
[3] Adam Bielski. Siamese and triplet networks with online pair/triplet mining in PyTorch
[4] Olivier Moindrot. Triplet Loss and Online Triplet Mining in TensorFlow