Commit b5861fbc authored by 蔡院强

Update README.md

parent 8d38b7c1
To fairly compare algorithms on the *Locount* task, we design a new evaluation protocol that penalizes algorithms for duplicate detections of one instance, for false positive detections, and for incorrectly counted numbers of instances.
Inspired by the MS COCO protocol, we design new metrics *AP^{lc}*, *AP^{lc}_{0.5}*, *AP^{lc}_{0.75}*, and *AR^{lc}_{max=150}* to evaluate the performance of methods, which take both localization and counting accuracy into account. For more detailed definitions, please refer to the [paper](http://arxiv.org/abs/2003.08230).
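The idea of a joint localization-and-counting criterion can be sketched as follows. This is a minimal illustration, assuming a detection is a true positive only when both its IoU with a ground-truth box and a count-accuracy measure (here taken as `min/max` of the two counts) pass their thresholds; the function names and the exact thresholds are illustrative assumptions, not the protocol's definition — see the paper for the precise metric.

```python
def iou(a, b):
    """Intersection over union of two boxes given as (x1, y1, x2, y2)."""
    ix1, iy1 = max(a[0], b[0]), max(a[1], b[1])
    ix2, iy2 = min(a[2], b[2]), min(a[3], b[3])
    inter = max(0.0, ix2 - ix1) * max(0.0, iy2 - iy1)
    area_a = (a[2] - a[0]) * (a[3] - a[1])
    area_b = (b[2] - b[0]) * (b[3] - b[1])
    return inter / (area_a + area_b - inter) if inter > 0 else 0.0

def is_true_positive(pred_box, pred_count, gt_box, gt_count,
                     iou_thr=0.5, count_thr=0.5):
    """Illustrative match rule: a predicted (box, count) pair matches a
    ground-truth group only if BOTH localization and counting are good
    enough.  Thresholds and the count-accuracy formula are assumptions."""
    count_acc = min(pred_count, gt_count) / max(pred_count, gt_count)
    return iou(pred_box, gt_box) >= iou_thr and count_acc >= count_thr
```

Under such a rule, a detector that localizes a group perfectly but reports 1 instance instead of 5 is still scored as a miss, which is exactly what separates the *Locount* metrics from plain detection AP.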

## Baseline method


We design a cascaded localization and counting network (CLCNet) to solve the *Locount* task. It gradually classifies and regresses the bounding boxes of objects and estimates the number of instances enclosed in each predicted bounding box, with increasing IoU and count thresholds in the training phase.
The architecture of the proposed CLCNet is shown in Fig. 3. The entire image is first fed into the backbone network to extract features.
A proposal sub-network (denoted as *S_0*) then produces preliminary object proposals. After that, given the detection proposals from the previous stage,
multiple localization-and-counting stages, i.e., *S_1*, ..., *S_N*, are cascaded to generate the final object bounding boxes with classification scores and the number of
instances enclosed in each box, where *N* is the total number of stages. For more detailed definitions, please refer to the [paper](http://arxiv.org/abs/2003.08230).
The counting accuracy threshold for positive/negative sample generation is determined by the architecture design of CLCNet, which is described as follows.
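The cascade's training-time sample assignment can be sketched as below. The concrete threshold values and the three-stage schedule are assumptions for illustration (the paper defines the actual settings); the point is that each later stage demands both better localization and better counting from its positives.

```python
# Illustrative schedule for a 3-stage cascade (N = 3): each stage S_i
# tightens both thresholds.  The numbers are assumptions, not the
# paper's exact configuration.
STAGES = [
    {"iou_thr": 0.5, "count_acc_thr": 0.5},  # S_1
    {"iou_thr": 0.6, "count_acc_thr": 0.6},  # S_2
    {"iou_thr": 0.7, "count_acc_thr": 0.7},  # S_3
]

def is_positive_sample(iou_val, count_acc, stage_index):
    """During training, a proposal is a positive sample for stage
    `stage_index` only if it clears that stage's (increasing) IoU and
    count-accuracy thresholds; otherwise it is treated as a negative."""
    cfg = STAGES[stage_index]
    return iou_val >= cfg["iou_thr"] and count_acc >= cfg["count_acc_thr"]
```

A proposal with IoU 0.55 is thus a positive for *S_1* but a negative for *S_3*, which is how the cascade progressively refines both boxes and counts.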

We use the same architecture and configuration as CrCNN for the box-regression and box-classification layers. For the instance counting layer, a direct strategy is to use an FC layer to regress a floating-point number indicating the number of instances, called the *count-regression strategy*. However, the numbers of instances enclosed in the bounding boxes are integers, which makes accurate regression difficult. For example, if the ground-truth numbers of instances are 4 and 5 for two bounding boxes and both predictions are 4.5, it is hard for the network to choose the right update direction in the training phase. To that end, we design a classification strategy to handle this issue, called the *count-classification strategy*. Specifically, we assume the maximal number of instances is α and construct α bins to indicate the number of instances. The counting task is thus formulated as a multi-class classification task, which uses an FC layer to determine the bin index for the instance number.
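The count-classification strategy amounts to a simple mapping between integer counts and class bins, which can be sketched as follows. The bin layout (bin *i* stands for count *i* + 1) and the default α = 150 are assumptions chosen to match *AR^{lc}_{max=150}*; in the network, `logits` would be the output of the counting FC layer.

```python
def count_to_bin(count, alpha=150):
    """Map an integer instance count in [1, alpha] to a class index, so
    counting becomes an alpha-way classification target instead of a
    float regression target (alpha = assumed maximum count)."""
    assert 1 <= count <= alpha, "count outside the assumed bin range"
    return count - 1

def bin_to_count(bin_index):
    """Inverse mapping: class index back to an integer instance count."""
    return bin_index + 1

def predict_count(logits):
    """Pick the most probable bin; in CLCNet the logits would come from
    the counting FC layer.  argmax over bins yields an integer count."""
    best_bin = max(range(len(logits)), key=lambda i: logits[i])
    return bin_to_count(best_bin)
```

Because the prediction is an argmax over bins, the ambiguous "4.5 between counts 4 and 5" case of the regression strategy cannot occur: the network must commit to one integer, and the cross-entropy loss pushes it toward the correct bin.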


## Citation
If you find this dataset useful for your research, please cite
}
```


## Dataset
The *Locount* dataset consists of 50,394 JPEG images with a resolution of 1920 × 1080 pixels.

The *training* subset includes 34,022 images.

The *testing* subset includes 16,372 images.



## Feedback
Suggestions and comments on this dataset are welcome. Please contact the authors by sending an email to libo@iscas.ac.cn.