| Paper ID | 522 |
|---|---|
| Title | Correlated Uncertainty for Learning Dense Correspondences from Noisy Labels |

This submission initially received three diverging scores: 5, 5, 7. All reviewers pointed out a lack of comparison to related baselines, and the authors provided new results in the rebuttal that satisfied the reviewers. One reviewer increased their score, so before the AC discussion the scores are 5, 6, 7. The paper presents a technique for a problem of real-world relevance. The main concerns during the review and discussion phase were:

1. Missing comparison to related work. This is addressed in the author rebuttal; all reviewers acknowledge the new results and find them sufficient. Please include these results in the final paper.

2. The choice of model components, e.g. the use of a single Gaussian and the probabilistic U-Net. I find this a reasonable choice. Statement 13 in the rebuttal claims a good fit of the error distribution, but without an empirical result. This could be strengthened by an effort to understand the error distribution better, but it is sufficient for this submission and does not undermine the claimed contributions. The extension of the probabilistic U-Net to this problem may itself already be a contribution; the experiments presented here are much more targeted than those in the U-Net paper.

3. Clarity of the manuscript. This could be improved, and I hope the reviews are useful for revising the manuscript.

In summary, the paper is sufficiently interesting and addresses a real problem for the task of estimating dense poses of humans. The fact that the uncertainty estimates do not improve the method's results should not hold the paper back; it is in the nature of the data. The models could be refined through a better understanding of the error distributions, and the presented technique describes one possible way to include them. Overall, the positive aspects outweigh the negative ones. We hope the final version will be revised and improved, including the results from the rebuttal.

Typo: "Type" in Table 2 should probably read "negative log-likelihood".