The authors present a theoretical analysis of the asymptotic behavior of high-order neural tangent kernel terms, which holds for regular ReLU MLPs as well as ReLU hypernets (following the terminology adopted after the author response period). The results settle a previous conjecture on the asymptotic rates of various correlation functions arising in neural network dynamics. Numerical illustrations are provided in the paper. The reviewers appreciated the novelty of the proposed approach based on hypernets. They also noted that the topic is timely and relevant for the community, and that the results answer important questions in the field. One reviewer commented that for hypernets 'the authors derive the convergence conditions for linear approximation of HNs, and the relation to GP and NTK thoroughly'. The reviewers, however, expressed concerns about 'nomenclature, analyzed model, and prior work'. The authors submitted a response to the reviewers' comments, as well as confidential comments to the area chair. After reading the response, updating the reviews, and discussing, the reviewers feel 'the paper would be heavily improved by adopting the correct nomenclature, discussing papers that use a similar model, and being upfront about the limitations of the theoretical results, for example not being able to capture the settings where hypernetworks are typically used'. The reviewers provided valuable guidance on directions for improvement. We strongly recommend taking the reviewers' suggestions into account while preparing the camera-ready version of the paper. The paper makes timely and relevant contributions to the field, proving rigorous mathematical convergence results for the linear approximation of hypernets in relation to GPs and NTKs. This paper will likely become a reference in this area. Accept.