
Learning Attraction Field Representation for Robust Line Segment Detection (accepted by CVPR 2019)

This is the official implementation of our CVPR 2019 paper.

Introduction

We reformulate the problem of line segment detection (LSD) as a coupled region coloring problem. Based on this new formulation, we can address the problem of LSD with convolutional neural networks.
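To make the formulation concrete: the attraction field map assigns to every pixel a 2D vector pointing to the closest point on any ground-truth line segment, and the network learns to regress this map. Below is a minimal NumPy sketch of that construction (our illustration of the idea, not the repository's actual implementation):

```python
import numpy as np

def attraction_field(h, w, segments):
    """For each pixel, the 2D vector to the closest point on any line
    segment -- the attraction field representation described above.
    `segments` is an (N, 4) array of endpoints (x1, y1, x2, y2)."""
    ys, xs = np.mgrid[0:h, 0:w]
    pixels = np.stack([xs, ys], axis=-1).astype(np.float64)   # (h, w, 2)
    best_d2 = np.full((h, w), np.inf)
    field = np.zeros((h, w, 2))
    for x1, y1, x2, y2 in segments:
        p, d = np.array([x1, y1]), np.array([x2 - x1, y2 - y1])
        # Project every pixel onto the segment, clamped to the endpoints.
        t = np.clip(((pixels - p) @ d) / max(d @ d, 1e-12), 0.0, 1.0)
        vec = p + t[..., None] * d - pixels     # attraction vector
        d2 = (vec ** 2).sum(-1)
        closer = d2 < best_d2                   # keep the nearest segment only
        best_d2[closer] = d2[closer]
        field[closer] = vec[closer]
    return field

afm = attraction_field(4, 4, np.array([[0.0, 0.0, 3.0, 3.0]]))  # (4, 4, 2)
```

At test time, a squeeze module reverses this mapping to recover line segments from the predicted field.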

Results

F-Measure and FPS

| Methods | Wireframe Dataset | YorkUrban Dataset | FPS |
|:-------:|:-----------------:|:-----------------:|:---:|
| LSD | 0.647 | 0.591 | 19.6 |
| MCMLSD | 0.566 | 0.564 | 0.2 |
| Linelet | 0.644 | 0.585 | 0.14 |
| Wireframe Parser | 0.728 | 0.627 | 2.24 |
| Ours (U-Net) | 0.752 | 0.639 | 10.3 |
| Ours (a-trous) | 0.773 | 0.646 | 6.6 |

Precision and Recall Curves

Installation

Check INSTALL.md for installation instructions.

1. Data preparation

1.1 Downloading data

  • Wireframe Dataset: https://github.com/huangkuns/wireframe
  • YorkUrban Dataset: http://www.elderlab.yorku.ca/resources/york-urban-line-segment-database-information/

Please follow the links above to download the Wireframe and YorkUrban datasets. For the Wireframe dataset, we only need the file named pointlines.zip, which contains the images and line segment annotations for training and testing.

Once the files are downloaded, please unzip them into <AFM_root>/data/wireframe_raw and <AFM_root>/data/york_raw respectively. The structures of the wireframe_raw and york_raw folders are as follows (a quick way to inspect one annotation file is sketched after the listing):

```
wireframe_raw/
  - pointlines/*.pkl
  - train.txt
  - test.txt

york_raw/
  - filename0_rgb.png
  - filename0.mat
  ...
  - filename{N}_rgb.png
  - filename{N}.mat
```
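To sanity-check the download, you can peek at one annotation file; the pickle's exact fields are defined by the Wireframe repo, so the snippet below only inspects the keys rather than assuming any names:

```python
import glob
import pickle

# Load an arbitrary annotation file and report its structure.
path = glob.glob('data/wireframe_raw/pointlines/*.pkl')[0]
with open(path, 'rb') as f:
    ann = pickle.load(f)
print(path, type(ann))
if isinstance(ann, dict):
    print(sorted(ann.keys()))
```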

1.2 Data pre-processing

Please run the following commands:

```
cd <AFM_root>/data/
python preparation_wireframe.py
python preparation_york.py
```
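If either script fails, the cause is usually a path problem; here is a quick check of the layout described in section 1.1 (run from <AFM_root>):

```python
from pathlib import Path

# Verify the raw-data layout from section 1.1 before preprocessing.
for required in ('data/wireframe_raw/pointlines',
                 'data/wireframe_raw/train.txt',
                 'data/wireframe_raw/test.txt',
                 'data/york_raw'):
    print('ok     ' if Path(required).exists() else 'MISSING', required)
```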

2. Hyper-parameter configurations

We use YACS to manage the hyper-parameters. Our configuration files for the U-Net (afm_unet.yaml) and the a-trous Residual U-Net (afm_atrous.yaml) are saved in the <AFM_root>/experiments folder.

In each yaml file, SAVE_DIR is used to store the network weights and experimental results: the weights are saved in SAVE_DIR/weight and the results are saved in SAVE_DIR/results/DATASET_name.

The TEST configuration controls how results are output during the testing phase (e.g. saved to disk or displayed). We currently provide two output modes, "display" and "save"; you can add custom output methods in modeling/output/output.py.
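For reference, the usual YACS pattern looks like the sketch below; the node names mirror SAVE_DIR and TEST from above, but the repository's real schema lives in its config module, so treat the keys as illustrative:

```python
from yacs.config import CfgNode as CN

# Illustrative YACS usage: defaults first, then overrides on top.
cfg = CN()
cfg.SAVE_DIR = 'experiments/atrous'
cfg.TEST = CN()
cfg.TEST.MODE = 'save'            # "save" or "display", per the text above

# In the repo, the experiment yaml is merged over the defaults, e.g.:
# cfg.merge_from_file('experiments/afm_atrous.yaml')
cfg.merge_from_list(['TEST.MODE', 'display'])  # CLI-style key/value override
cfg.freeze()                      # make the config immutable from here on
print(cfg.SAVE_DIR, cfg.TEST.MODE)
```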

3. Inference with pretrained models

The pretrained models for the U-Net and the a-trous Residual U-Net can be downloaded from this link. Please place the weights into <AFM_root>/experiments/unet/weight and <AFM_root>/experiments/atrous/weight respectively.

  • For testing, please run the following command:

```
python test.py --config-file experiments/afm_atrous.yaml --gpu 0
```

4. Training

Please run the following command to train a network:

```
python train.py --config-file experiments/afm_atrous.yaml --gpu 0
```

To speed up training, our code saves the generated attraction field maps into <AFM_root>/data/wireframe/.cache the first time you run the training code (a sketch of this compute-once pattern is shown below).
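The cache amounts to a plain compute-once pattern; here is a minimal sketch (the file naming is an assumption, not the repository's exact scheme):

```python
import os
import numpy as np

def cached(path, compute):
    """Load the array at `path` if it exists; otherwise compute it,
    save it for the next run, and return it."""
    if os.path.exists(path):
        return np.load(path)
    arr = compute()
    os.makedirs(os.path.dirname(path), exist_ok=True)
    np.save(path, arr)
    return arr

# Hypothetical usage with a stand-in computation:
afm = cached('data/wireframe/.cache/example.npy',
             lambda: np.zeros((320, 320, 2)))
```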

5. Citations

If you find our work useful in your research, please consider citing:

```
@inproceedings{AFM,
  title     = {Learning Attraction Field Representation for Robust Line Segment Detection},
  author    = {Nan Xue and Song Bai and Fudong Wang and Gui-Song Xia and Tianfu Wu and Liangpei Zhang},
  booktitle = {IEEE Conference on Computer Vision and Pattern Recognition (CVPR)},
  year      = {2019}
}
```
