Released September 2021

Documentation: https://fast-image-retrieval.readthedocs.io/en/latest/

Introduction

Fast Image Retrieval (FIRe) is an open source image retrieval project released by the Center of Image and Signal Processing Lab (CISiP Lab), Universiti Malaya. The framework implements most of the major binary hashing methods, together with different popular backbone networks and public datasets.

Major features

  • One for All

    We unify (i) various binary hashing methods, (ii) different backbones, and (iii) multiple datasets under a single framework to ease research and benchmarking in this domain. It supports popular binary hashing methods, e.g. HashNet, GreedyHash, DPN, OrthoHash, etc.

  • Modularity

    We break the framework into parts so that one can easily implement a new method by combining the components; a hypothetical sketch of such a component follows this list.
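
For example, adding a new hashing objective typically amounts to writing a single loss module and pointing a config template at it. The snippet below is only a sketch of what such a component might look like; the class, its arguments, and the way it would be registered with the framework are hypothetical, not the framework's actual API (see the Tutorials docs for the real extension points).

```python
import torch
import torch.nn as nn
import torch.nn.functional as F


class MyHashingLoss(nn.Module):
    """Hypothetical custom objective: a generic pairwise similarity loss
    plus a quantization penalty that pushes codes toward {-1, +1}."""

    def __init__(self, quantization_weight: float = 0.1):
        super().__init__()
        self.quantization_weight = quantization_weight

    def forward(self, codes: torch.Tensor, labels: torch.Tensor) -> torch.Tensor:
        # codes:  (B, nbit) continuous codes from the hash head (e.g. tanh outputs)
        # labels: (B, C) one-hot or multi-hot class labels
        labels = labels.float()

        # Pairwise similarity target: 1 if two samples share any label, else 0.
        sim_target = (labels @ labels.t() > 0).float()
        # Predicted similarity from the scaled inner product of the codes.
        sim_pred = torch.sigmoid(codes @ codes.t() / codes.size(1))
        similarity_loss = F.binary_cross_entropy(sim_pred, sim_target)

        # Quantization penalty: each code dimension should saturate near -1 or +1.
        quantization_loss = (codes.abs() - 1).pow(2).mean()

        return similarity_loss + self.quantization_weight * quantization_loss
```

Judging from the tables below, each method is exposed through a loss name referenced by its config template, so a new objective would presumably be wired in the same way.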

License

This project is released under the BSD 3-Clause License.

Changelog

Please refer to Changelog for more detail.

Implemented methods, backbones, and datasets

Backbone

  1. AlexNet
  2. VGG{16}
  3. ResNet{18,34,50,101,152}
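
These backbones act as feature extractors and are typically followed by a small hash head that maps the features to an nbit code. The snippet below is a generic illustration of that pattern using a torchvision ResNet-50; it is not the framework's actual model definition.

```python
import torch
import torch.nn as nn
from torchvision import models


class HashModel(nn.Module):
    """Generic deep-hashing model: backbone features -> nbit continuous codes,
    binarized with sign() when building the retrieval index."""

    def __init__(self, nbit: int = 64):
        super().__init__()
        backbone = models.resnet50()               # any of the listed backbones works the same way
        feat_dim = backbone.fc.in_features         # 2048 for ResNet-50
        backbone.fc = nn.Identity()                # drop the classification head
        self.backbone = backbone
        self.hash_head = nn.Linear(feat_dim, nbit)

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        features = self.backbone(x)
        return torch.tanh(self.hash_head(features))   # continuous codes in (-1, 1) during training

    @torch.no_grad()
    def encode(self, x: torch.Tensor) -> torch.Tensor:
        return torch.sign(self.forward(x))             # binary codes in {-1, +1} for retrieval
```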

Loss (Method)

Supervised

| Method | Config Template | Loss Name | mAP@1K (64-bit, ImageNet, AlexNet) |
|---|---|---|---|
| ADSH | adsh.yaml | adsh | 0.645 |
| BiHalf | bihalf-supervised.yaml | bihalf-supervised | 0.684 |
| Cross Entropy | ce.yaml | ce | 0.434 |
| CSQ | csq.yaml | csq | 0.686 |
| DFH | dfh.yaml | dfh | 0.689 |
| DPN | dpn.yaml | dpn | 0.692 |
| DPSH | dpsh.yaml | dpsh | 0.599 |
| DTSH | dtsh.yaml | dtsh | 0.608 |
| GreedyHash | greedyhash.yaml | greedyhash | 0.667 |
| HashNet | hashnet.yaml | hashnet | 0.588 |
| JMLH | jmlh.yaml | jmlh | 0.664 |
| OrthoCos (OrthoHash) | orthocos.yaml | orthocos | 0.701 |
| OrthoArc (OrthoHash) | orthoarc.yaml | orthoarc | 0.698 |
| SDH-C | sdhc.yaml | sdhc | 0.639 |

Unsupervised

| Method | Config Template | Loss Name | mAP@1K (64-bit, ImageNet, AlexNet) |
|---|---|---|---|
| BiHalf | bihalf.yaml | bihalf | 0.403 |
| CIBHash | cibhash.yaml | cibhash | 0.322 |
| GreedyHash | greedyhash-unsupervised.yaml | greedyhash-unsupervised | 0.407 |
| SSDH | ssdh.yaml | ssdh | 0.146 |
| TBH | tbh.yaml | tbh | 0.324 |

Shallow (Non-Deep learning methods)

| Method | Config Template | Loss Name | mAP@1K (64-bit, ImageNet, AlexNet) |
|---|---|---|---|
| IMH | imh.yaml | imh | 0.467 |
| ITQ | itq.yaml | itq | 0.402 |
| LSH | lsh.yaml | lsh | 0.206 |
| PCAHash | pca.yaml | pca | 0.405 |
| SH | sh.yaml | sh | 0.350 |

Note: shallow methods only work with descriptor datasets; the descriptor datasets will be uploaded later.
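
The mAP@1K figures above follow the usual deep-hashing evaluation protocol: each query ranks the database by Hamming distance between binary codes, average precision is computed over the top 1,000 retrieved items, and the result is averaged across queries. Below is a generic NumPy sketch of that computation, given for illustration only; it is not the framework's evaluation code.

```python
import numpy as np


def mean_average_precision(query_codes, db_codes, query_labels, db_labels, topk=1000):
    """mAP@topk for binary codes in {-1, +1}; labels are 0/1 arrays of shape (N, C)."""
    nbit = query_codes.shape[1]
    aps = []
    for q_code, q_label in zip(query_codes, query_labels):
        # Hamming distance via inner product: d = (nbit - <q, x>) / 2 for +/-1 codes.
        hamming = (nbit - db_codes @ q_code) / 2
        ranking = np.argsort(hamming)[:topk]              # closest topk database items
        relevant = (db_labels[ranking] @ q_label) > 0     # shares at least one label with the query
        if relevant.sum() == 0:
            aps.append(0.0)
            continue
        precision_at_i = np.cumsum(relevant) / np.arange(1, len(relevant) + 1)
        aps.append((precision_at_i * relevant).sum() / relevant.sum())
    return float(np.mean(aps))
```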

Datasets

| Dataset | Name in framework |
|---|---|
| ImageNet100 | imagenet100 |
| NUS-WIDE | nuswide |
| MS-COCO | coco |
| MIRFLICKR/Flickr25k | mirflickr |
| Stanford Online Products | sop |
| Cars dataset | cars |
| CIFAR10 | cifar10 |

Installation

Please head over to the Get Started docs for guidance on setting up the conda environment and installing the framework.

Tutorials

Please head over to the Tutorials docs for guidance.

Reference

If you find this framework useful in your research, please consider citing this project.

@inproceedings{dpn2020,
  title = {Deep Polarized Network for Supervised Learning of Accurate Binary Hashing Codes.},
  author = {Fan, Lixin and Ng, Kam Woh and Ju, Ce and Zhang, Tianyu and Chan, Chee Seng},
  booktitle = {IJCAI},
  pages = {825--831},
  year = {2020}
}

@inproceedings{orthohash2021,
  title = {One Loss for All: Deep Hashing with a Single Cosine Similarity based Learning Objective},
  author = {Hoe, Jiun Tian and Ng, Kam Woh and Zhang, Tianyu and Chan, Chee Seng and Song, Yi-Zhe and Xiang, Tao},
  booktitle = {Advances in Neural Information Processing Systems (NeurIPS)},
  year = {2021}
}

Contributing

We welcome contributions to improve this project. Please file suggestions or bug reports by creating a new issue, or send us a pull request with your changes, improvements, features, or fixes.
