Xinggang Wang


Paper Publications

Boundary-preserving Mask R-CNN
Release time: 2021-06-10

Indexed by: Conference paper
First Author: Cheng, Tianheng
Corresponding Author: Wang, Xinggang
Co-authors: Liu, Wenyu; Huang, Lichao
Journal: 2020 European Conference on Computer Vision (ECCV)
Date of Publication: 2020-08-23
Abstract: Tremendous efforts have been made to improve mask localization accuracy in instance segmentation. Modern instance segmentation methods relying on fully convolutional networks perform pixel-wise classification, which ignores object boundaries and shapes, leading to coarse and indistinct mask predictions and imprecise localization. To remedy these problems, we propose a conceptually simple yet effective Boundary-preserving Mask R-CNN (BMask R-CNN) that leverages object boundary information to improve mask localization accuracy. BMask R-CNN contains a boundary-preserving mask head in which object boundary and mask are mutually learned via feature fusion blocks. As a result, the predicted masks are better aligned with object boundaries. Without bells and whistles, BMask R-CNN outperforms Mask R-CNN by a considerable margin on the COCO dataset; on the Cityscapes dataset, where more accurate boundary ground truths are available, BMask R-CNN obtains remarkable improvements over Mask R-CNN. Moreover, it is not surprising that BMask R-CNN yields larger gains when the evaluation criterion requires better localization (e.g., AP75), as shown in Fig. 1. Code and models are available at https://github.com/hustvl/BMaskR-CNN.
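
For readers who want a concrete picture of the boundary-preserving mask head described in the abstract, below is a minimal, hypothetical PyTorch sketch. The class and block names (BoundaryPreservingMaskHead, FusionBlock), channel sizes, and layer layout are illustrative assumptions, not the paper's exact architecture; the official implementation is at https://github.com/hustvl/BMaskR-CNN.

# Minimal, illustrative sketch of a boundary-preserving mask head.
# All names and layer sizes here are assumptions for illustration only.
import torch
import torch.nn as nn


class FusionBlock(nn.Module):
    """Fuses features from the other branch into the current branch."""

    def __init__(self, channels: int = 256):
        super().__init__()
        self.conv = nn.Sequential(
            nn.Conv2d(channels * 2, channels, kernel_size=3, padding=1),
            nn.ReLU(inplace=True),
        )

    def forward(self, own_feat: torch.Tensor, other_feat: torch.Tensor) -> torch.Tensor:
        # Concatenate the two branches along the channel dimension and mix them.
        return self.conv(torch.cat([own_feat, other_feat], dim=1))


class BoundaryPreservingMaskHead(nn.Module):
    """Predicts an instance mask and its boundary from RoI features,
    letting the two branches exchange information via fusion blocks."""

    def __init__(self, in_channels: int = 256, num_classes: int = 80):
        super().__init__()
        self.mask_convs = nn.Sequential(
            nn.Conv2d(in_channels, 256, 3, padding=1), nn.ReLU(inplace=True),
            nn.Conv2d(256, 256, 3, padding=1), nn.ReLU(inplace=True),
        )
        self.boundary_convs = nn.Sequential(
            nn.Conv2d(in_channels, 256, 3, padding=1), nn.ReLU(inplace=True),
            nn.Conv2d(256, 256, 3, padding=1), nn.ReLU(inplace=True),
        )
        # Mutual feature fusion between the mask branch and the boundary branch.
        self.mask_fusion = FusionBlock(256)
        self.boundary_fusion = FusionBlock(256)
        # Per-class mask logits and class-agnostic boundary logits.
        self.mask_pred = nn.Conv2d(256, num_classes, kernel_size=1)
        self.boundary_pred = nn.Conv2d(256, 1, kernel_size=1)

    def forward(self, roi_feats: torch.Tensor):
        m = self.mask_convs(roi_feats)
        b = self.boundary_convs(roi_feats)
        # Each branch is refined with features from the other branch.
        m_fused = self.mask_fusion(m, b)
        b_fused = self.boundary_fusion(b, m)
        return self.mask_pred(m_fused), self.boundary_pred(b_fused)


if __name__ == "__main__":
    head = BoundaryPreservingMaskHead()
    rois = torch.randn(4, 256, 14, 14)  # 4 RoIs of pooled features
    mask_logits, boundary_logits = head(rois)
    print(mask_logits.shape, boundary_logits.shape)  # (4, 80, 14, 14), (4, 1, 14, 14)

In this sketch, the boundary branch would be supervised with boundary ground truths derived from the instance masks, so that the fused features push the mask predictions toward sharper object boundaries.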