The paper attacks the classical problem of linear regression with missing values. It derives the Bayes predictor in several missing-data settings and then uses a Neumann-series expansion to approximate it. This approximation in turn motivates the design of neural networks with ReLU activations. The propositions on self-masking missingness, which appears to be a novel setting, are interesting, though somewhat restrictive because of the linear Gaussian assumptions. Nonetheless, both the results and the methods should be of interest to the NeurIPS 2020 community.
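To make the Neumann-series idea concrete, here is a minimal sketch (my own illustration, not the authors' code): computing the Bayes predictor on the observed coordinates requires solving a linear system with a covariance submatrix, and a truncated Neumann series replaces the matrix inverse with a fixed number of iterations, which is what maps naturally onto network layers. The matrix and vector below are hypothetical, and convergence assumes the spectral radius of (I - Sigma) is below 1.

```python
def neumann_solve(sigma, b, depth):
    """Approximate sigma^{-1} b with `depth` Neumann iterations:
    x_{k+1} = b + (I - sigma) x_k, starting from x_0 = b.
    Converges when the spectral radius of (I - sigma) is < 1."""
    n = len(b)
    x = list(b)
    for _ in range(depth):
        # y = (I - sigma) @ x, written out with plain lists
        y = [x[i] - sum(sigma[i][j] * x[j] for j in range(n))
             for i in range(n)]
        x = [b[i] + y[i] for i in range(n)]
    return x

# Hypothetical well-conditioned 2x2 covariance-like matrix:
# eigenvalues of sigma are 0.7 and 1.3, so I - sigma has
# spectral radius 0.3 and the series converges quickly.
sigma = [[1.0, 0.3],
         [0.3, 1.0]]
b = [1.0, 2.0]
x = neumann_solve(sigma, b, depth=50)

# Check the residual sigma @ x - b is near zero.
residual = [sum(sigma[i][j] * x[j] for j in range(2)) - b[i]
            for i in range(2)]
print(residual)
```

Each iteration is an affine map of the current iterate, so unrolling a fixed depth of iterations gives exactly the kind of layered architecture the paper builds on.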