Authors: Ismailov, V.E.; Savaş, Ekrem
Date deposited: 2020-11-21
Date available: 2020-11-21
Date issued: 2017
ISSN: 0163-0563
DOI: 10.1080/01630563.2016.1254654
DOI URL: https://doi.org/10.1080/01630563.2016.1254654
Handle: https://hdl.handle.net/11467/3559
Title: Measure Theoretic Results for Approximation by Neural Networks with Limited Weights
Abstract: In this article, we study approximation properties of single hidden layer neural networks with weights varying in finitely many directions and with thresholds from an open interval. We obtain a necessary and simultaneously sufficient measure theoretic condition for density of such networks in the space of continuous functions. Further, we prove a density result for neural networks with a specifically constructed activation function and a fixed number of neurons. © 2017 Taylor & Francis.
Language: en
Access rights: info:eu-repo/semantics/closedAccess
Keywords: Activation function; Borel measure; density; lightning bolt; neural network; orbit; orthogonal measure; weak convergence
Type: Article
Volume: 38; Issue: 7; Pages: 819-830
Journal quartiles: Q3; Q2
WOS ID: WOS:000402005500001
Scopus ID: 2-s2.0-85017240763
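The networks described in the abstract can be illustrated with a minimal sketch. This is not the paper's construction, only an assumed generic form: a single hidden layer network sum_i c_i * sigma(s_i * (a_i . x) - theta_i), where every weight is a scalar multiple of one of finitely many fixed directions a_1, ..., a_k, and every threshold theta_i lies in an open interval (here (0, 1)); the sigmoid activation is likewise an illustrative choice.

```python
import numpy as np

def sigmoid(t):
    # Standard logistic activation; a stand-in for the paper's activation function.
    return 1.0 / (1.0 + np.exp(-t))

def shallow_net(x, directions, scales, thetas, coeffs, sigma=sigmoid):
    """Evaluate sum_i coeffs[i] * sigma(scales[i] * (directions[i] . x) - thetas[i]).

    Each weight vector is scales[i] * directions[i], i.e. weights vary only
    along the finitely many fixed directions supplied in `directions`.
    """
    x = np.asarray(x, dtype=float)
    out = 0.0
    for a, s, th, c in zip(directions, scales, thetas, coeffs):
        out += c * sigma(s * np.dot(a, x) - th)
    return out

# Two fixed directions in R^2; both thresholds drawn from the open interval (0, 1).
directions = [np.array([1.0, 0.0]), np.array([0.0, 1.0])]
scales = [2.0, -1.5]
thetas = [0.25, 0.75]
coeffs = [1.0, -0.5]
print(shallow_net([0.3, 0.6], directions, scales, thetas, coeffs))
```

The paper's density question asks which such families can approximate every continuous function on a compact set; the sketch only shows how a single member of the family is evaluated.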