NeurIPS 2019
Sun Dec 8th through Sat Dec 14th, 2019, at the Vancouver Convention Center
An interesting theoretical bound on the error of the posterior mean of GPR, obtained by bounding Lipschitz constants. Unlike bounds in previous work, this one is easier to evaluate and therefore more practical, as demonstrated by its application to safe RL. Given the importance of GPs to sequential decision making under uncertainty, the paper will be of interest to many practitioners.

Please note that this submission caused a great deal of discussion around conference policy issues regarding slicing contributions: "Note that slicing contributions too thinly may result in submissions being deemed dual submissions. Specifically, a case of slicing too thinly may correspond to two submissions by the same authors that are so similar that publishing one would render the other too incremental to be accepted." The authors need to be aware of this when submitting to NeurIPS or to any other future ML conference with similar policies.