ReLU Network with Bounded Width Is a Universal Approximator in View of an Approximate Identity

Moon, Sunghwan (2021) ReLU Network with Bounded Width Is a Universal Approximator in View of an Approximate Identity. Applied Sciences, 11 (1). p. 427. ISSN 2076-3417

Text: applsci-11-00427-v3.pdf - Published Version (465 kB)

Abstract

Deep neural networks have shown very successful performance on a wide range of tasks, but the theory of why they work so well is still in its early stages. Recently, the expressive power of neural networks, which is important for understanding deep learning, has received considerable attention. Classic results, due to Cybenko, Barron, and others, state that a network with a single hidden layer and a suitable activation function is a universal approximator. A few years ago, researchers began to study how width affects the expressiveness of neural networks, i.e., universal approximation theorems for deep neural networks with the Rectified Linear Unit (ReLU) activation function and bounded width. Here, we show how any continuous function on a compact set of ℝ^d can be approximated by a deep ReLU network of bounded width, by exploiting the properties of an approximate identity.
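The approximate-identity viewpoint named in the title can be illustrated in one dimension: a second difference of shifted ReLUs yields a unit-mass piecewise-linear "hat" kernel, and a Riemann sum of such kernels scaled by samples of a continuous function f converges to f as the kernel width shrinks. The NumPy sketch below is a minimal illustration of that idea only; the helper name relu_bump and all parameters are assumptions for this example, and it does not reproduce the paper's bounded-width construction in higher dimensions.

import numpy as np

def relu(x):
    return np.maximum(x, 0.0)

def relu_bump(x, center, h):
    # Unit-mass piecewise-linear "hat" kernel of half-width h,
    # built from three ReLU units; it peaks at height 1/h at the
    # center, so it behaves like an approximate identity as h -> 0.
    return (relu(x - center + h) - 2.0 * relu(x - center)
            + relu(x - center - h)) / h**2

def f(x):
    # Target continuous function on the compact set [0, 1].
    return np.sin(2.0 * np.pi * x) + 0.3 * x

h = 0.01  # kernel half-width; the approximation sharpens as h -> 0
centers = np.arange(0.0, 1.0 + h / 2, h)  # grid of kernel centers
x = np.linspace(0.0, 1.0, 2001)

# Riemann-sum convolution of f with the approximate identity:
# sum_k h * f(c_k) * relu_bump(x, c_k, h) -> f(x) uniformly as h -> 0.
approx = sum(h * f(c) * relu_bump(x, c, h) for c in centers)

print("max abs error:", np.abs(approx - f(x)).max())

With h = 0.01 the maximum error is on the order of 1e-4, since the sum of hats reduces to piecewise-linear interpolation of f at the grid points; shrinking h drives the error to zero, mirroring the approximate-identity argument.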

Item Type: Article
Subjects: SCI Archives > Engineering
Depositing User: Managing Editor
Date Deposited: 28 Jan 2023 06:13
Last Modified: 27 Sep 2024 05:10
URI: http://science.classicopenlibrary.com/id/eprint/431
