Project author: crlz182

Project description:
This is a Keras implementation of the NetVLAD network for visual place recognition.
Primary language: Python
Project address: git://github.com/crlz182/Netvlad-Keras.git
Created: 2019-09-26T07:30:03Z
Project community: https://github.com/crlz182/Netvlad-Keras

License:


Netvlad-Keras

This is a Keras implementation of the NetVLAD architecture for visual place recognition. The main purpose of this project is to provide an easy-to-use (and trainable) Keras model.

The original NetVLAD model was written in MATLAB (GitHub page) and later ported to TensorFlow (GitHub page).

Overview

The model can be found in netvlad_keras.py, along with the custom layer in netvladlayer.py. The latter is adapted from the TensorFlow version. The file savemodel.py copies the weights from the TensorFlow checkpoint into the Keras model and stores them. Since the main work was getting these weights from MATLAB to TensorFlow, I encourage everyone using this repository to also cite the authors' paper:

T. Cieslewski, S. Choudhary, D. Scaramuzza: Data-Efficient Decentralized Visual SLAM. IEEE International Conference on Robotics and Automation (ICRA), 2018.
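The core of the custom layer is NetVLAD aggregation: each local feature is soft-assigned to K learned cluster centers, the residuals to each center are summed per cluster, and the result is intra-normalized and then globally L2-normalized. The following is a minimal NumPy sketch of that computation only, not the repository's actual layer code; the cluster centers and assignment parameters here are random stand-ins for trained weights.

```python
import numpy as np

def netvlad_aggregate(features, centers, assign_w, assign_b):
    """NetVLAD aggregation for one image (NumPy sketch).

    features: (N, D) local descriptors, centers: (K, D) cluster centers,
    assign_w: (D, K) soft-assignment weights, assign_b: (K,) biases.
    Returns a (K*D,) L2-normalized VLAD descriptor.
    """
    # softmax soft-assignment of each feature to the K clusters
    logits = features @ assign_w + assign_b           # (N, K)
    logits -= logits.max(axis=1, keepdims=True)       # numerical stability
    soft = np.exp(logits)
    soft /= soft.sum(axis=1, keepdims=True)

    # assignment-weighted residuals, summed over all features
    residuals = features[:, None, :] - centers[None, :, :]   # (N, K, D)
    vlad = (soft[:, :, None] * residuals).sum(axis=0)        # (K, D)

    # intra-normalization per cluster, then global L2 normalization
    vlad /= np.linalg.norm(vlad, axis=1, keepdims=True) + 1e-12
    vlad = vlad.flatten()
    return vlad / (np.linalg.norm(vlad) + 1e-12)

rng = np.random.default_rng(0)
N, D, K = 100, 32, 8
desc = netvlad_aggregate(rng.normal(size=(N, D)),
                         rng.normal(size=(K, D)),
                         rng.normal(size=(D, K)),
                         rng.normal(size=(K,)))
print(desc.shape)  # (256,) = K * D
```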

In addition, compare.py checks whether the TensorFlow and Keras models produce consistent outputs for a given dataset.
check_layers.py is a low-level debugging script that lets you compare the two models' outputs layer by layer.
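The layer-by-layer comparison boils down to: run both models on the same input, collect paired intermediate outputs, and report the largest absolute difference per layer. A framework-agnostic sketch of that logic (the arrays below are synthetic stand-ins for real layer activations, and the function name is illustrative, not taken from check_layers.py):

```python
import numpy as np

def compare_layer_outputs(outputs_a, outputs_b, names, tol=1e-5):
    """Return {layer name: (max abs diff, within tolerance?)}."""
    report = {}
    for name, a, b in zip(names, outputs_a, outputs_b):
        diff = float(np.max(np.abs(a - b)))
        report[name] = (diff, diff <= tol)
    return report

# synthetic activations: "conv1" matches exactly, "vlad" is slightly off
x = np.linspace(0.0, 1.0, 12).reshape(3, 4)
report = compare_layer_outputs([x, x * 2], [x, x * 2 + 1e-3],
                               names=["conv1", "vlad"])
for name, (diff, ok) in report.items():
    print(f"{name}: max diff {diff:.2e} {'OK' if ok else 'MISMATCH'}")
```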

How to use

  1. Download the weights from here or create them on your own using savemodel.py.

  2. Create the model and load the weights

    from netvlad_keras import NetVLADModel
    model = NetVLADModel()
    model.load_weights('netvlad_weights.h5')
    model.build()
  3. Get the output for a batch of images
    output = model.predict(batch)

A working example can be found in example.py.

Some results

A few results showing the best match in a dataset given the query image:
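Since NetVLAD descriptors are L2-normalized, the best match for a query is simply the database image whose descriptor has the highest dot product (cosine similarity) with the query descriptor. A sketch of that retrieval step, with random descriptors standing in for real model outputs:

```python
import numpy as np

def best_match(query, database):
    """Index of the database descriptor closest to the query.

    Descriptors are assumed L2-normalized, so the dot product is the
    cosine similarity and ranks identically to Euclidean distance.
    """
    sims = database @ query          # (M,) similarity to each database image
    return int(np.argmax(sims))

rng = np.random.default_rng(1)
db = rng.normal(size=(50, 128))
db /= np.linalg.norm(db, axis=1, keepdims=True)
query = db[17] + 0.01 * rng.normal(size=128)   # slightly perturbed copy
query /= np.linalg.norm(query)
print(best_match(query, db))  # -> 17
```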

TODO

  • Currently, the model cannot be loaded from a file because of the custom layers.
  • The loss function still has to be implemented (the model is not trainable yet).
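For reference, the original NetVLAD paper trains with a weakly supervised triplet ranking loss: the squared distance from the query to its closest positive should be smaller than the distance to every negative by a margin. A NumPy sketch of that loss (the margin and descriptors are illustrative; this is not the missing Keras implementation):

```python
import numpy as np

def triplet_ranking_loss(q, positives, negatives, margin=0.1):
    """Weakly supervised ranking loss over squared Euclidean distances.

    q: (D,) query descriptor, positives: (P, D), negatives: (M, D).
    Uses the closest positive, as in the NetVLAD paper.
    """
    d_pos = np.min(np.sum((positives - q) ** 2, axis=1))
    d_neg = np.sum((negatives - q) ** 2, axis=1)
    # hinge per negative, summed
    return float(np.sum(np.maximum(0.0, margin + d_pos - d_neg)))

q = np.array([1.0, 0.0])
pos = np.array([[1.0, 0.1]])   # close to the query
neg = np.array([[0.0, 1.0]])   # far from the query -> hinge is inactive
print(triplet_ranking_loss(q, pos, neg))  # -> 0.0
```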