# Attention Convolutional Binary Neural Tree for Fine-Grained Visual Categorization

This branch is developed for fine-grained recognition. The related paper is:

**Attention Convolutional Binary Neural Tree for Fine-Grained Visual Categorization**
Ruyi Ji, Longyin Wen, Libo Zhang, Dawei Du, Yanjun Wu, Chen Zhao, Xianglong Liu, Feiyue Huang

**Note:** This is our initial version. For the stability of the model and to reduce the number of parameters, we have made appropriate modifications.

### Files

- Original Caffe library
- Sum Pooling layer
  * src/caffe/proto/caffe.proto
  * include/caffe/layers/sum_pooling_layer.hpp
  * src/caffe/layers/sum_pooling_layer.cpp
  * src/caffe/layers/sum_pooling_layer.cu
- L2 Normalize layer
  * include/caffe/layers/l2_normalize_layer.hpp
  * src/caffe/layers/l2_normalize_layer.cpp
  * src/caffe/layers/l2_normalize_layer.cu
- Signed Sqrt layer
  * include/caffe/layers/signed_sqrt_layer.hpp
  * src/caffe/layers/signed_sqrt_layer.cpp
  * src/caffe/layers/signed_sqrt_layer.cu
- Example
  * example/acnet/datasets
  * example/acnet/logs
  * example/acnet/model-zoo
  * example/acnet/prototxts
  * example/acnet/snapshots

A sketch of how the three custom layers can be chained is given at the end of this README.

### Train model

1. The installation is exactly the same as for [Caffe](http://caffe.berkeleyvision.org/). Please follow the [installation instructions](http://caffe.berkeleyvision.org/installation.html) and make sure Caffe is correctly installed before using our code.
2. Download the [CUB dataset](http://www.vision.caltech.edu/visipedia/CUB-200-2011.html).
3. Preprocess the CUB dataset, create image lists for the training and validation sets, and place them in `example/acnet/datasets/` (see the list-format example at the end of this README).
4. Initialize the parameters of the model with the [init model](https://drive.google.com/file/d/16vdy_VybR-OUhb4KZN7p4hEo1T1cSOBf/view?usp=sharing).
5. Train the newly added layers: `bash ./run_first_stage.sh #num_gpu`, where `#num_gpu` is the number of GPUs to use.
6. Fine-tune the whole network: `bash ./run_second_stage.sh #num_gpu`.

### Citation

If the code helps your research, please consider citing our work:

```
@article{ji2019attention,
  title   = {Attention Convolutional Binary Neural Tree for Fine-Grained Visual Categorization},
  author  = {Ruyi Ji and Longyin Wen and Libo Zhang and Dawei Du and Yanjun Wu and Chen Zhao and Xianglong Liu and Feiyue Huang},
  journal = {arXiv preprint arXiv:1909.11378},
  year    = {2019}
}
```
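### Example: chaining the custom layers

The three custom layers listed above are commonly used together to turn convolutional features into a normalized descriptor: sum-pool over spatial locations, apply the signed square root `y = sign(x) * sqrt(|x|)`, then scale each sample to unit L2 norm. The snippet below is a minimal prototxt sketch of such a chain; the layer type strings (`SumPooling`, `SignedSqrt`, `L2Normalize`) and blob names are assumptions inferred from the file names above, so the prototxts in `example/acnet/prototxts` remain the authoritative reference.

```protobuf
# Minimal sketch with assumed type strings; check example/acnet/prototxts
# for the layer definitions this repo actually uses.
layer {
  name: "sum_pool"
  type: "SumPooling"    # sum_pooling_layer.*: sums features over spatial locations
  bottom: "conv_features"
  top: "pooled"
}
layer {
  name: "signed_sqrt"
  type: "SignedSqrt"    # signed_sqrt_layer.*: y = sign(x) * sqrt(|x|)
  bottom: "pooled"
  top: "pooled_sqrt"
}
layer {
  name: "l2_normalize"
  type: "L2Normalize"   # l2_normalize_layer.*: scales each sample to unit L2 norm
  bottom: "pooled_sqrt"
  top: "descriptor"
}
```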
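### Example: image list format

For step 3 of training, a common Caffe convention (used by `ImageData`-style input layers) is one `relative/image/path label` pair per line, with zero-indexed integer labels. The excerpt below is a hypothetical illustration of a CUB-200-2011 training list; the exact file names and paths expected by the prototxts in `example/acnet/prototxts` should be checked there.

```
001.Black_footed_Albatross/Black_Footed_Albatross_0001_796111.jpg 0
001.Black_footed_Albatross/Black_Footed_Albatross_0002_55.jpg 0
002.Laysan_Albatross/Laysan_Albatross_0001_545.jpg 1
```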