A Constructive Approach for One-Shot Training of Neural Networks Using Hypercube-Based Topological Coverings

(Submitted on 9 Jan 2019)

Abstract: In this paper we present a novel constructive approach for training deep
neural networks using geometric methods. We show that a topological covering
can be used to define a class of distributed linear matrix inequalities, which
in turn directly specify the shape and depth of a neural network architecture.
The key insight is a fundamental relationship between linear matrix
inequalities and their ability to bound the shape of data, and the rectified
linear unit (ReLU) activation function employed in modern neural networks. We
show that unit cover geometry and cover porosity are two design variables in
cover-constructive learning that play a critical role in defining the
complexity of the model and generalizability of the resulting neural network
classifier. In the context of cover-constructive learning, these findings
underscore the age-old trade-off between model complexity and overfitting (as
quantified by the number of elements in the data cover) and generalizability on
test data. Finally, we benchmark our algorithm on the Iris, MNIST, and Wine
datasets and show that the constructive algorithm is able to train a deep neural
network classifier in one shot, achieving equal or superior levels of training
and test classification accuracy with reduced training time.
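The cover-complexity trade-off described above can be illustrated with a minimal sketch. This is not the paper's algorithm; it only shows, under the assumption of an axis-aligned unit-hypercube cover, how the cover's side length controls the number of cover elements (a crude proxy for model complexity):

```python
# Hypothetical illustration (not the paper's method): cover a point cloud
# with axis-aligned hypercubes of side length `s` by binning each point
# into its grid cell. Fewer occupied cells = a coarser, simpler cover.
def hypercube_cover(points, s):
    """Return the set of grid cells (hypercubes) occupied by the points."""
    cells = set()
    for p in points:
        # Floor-divide each coordinate by the side length to get a cell index.
        cells.add(tuple(int(x // s) for x in p))
    return cells

points = [(0.1, 0.2), (0.15, 0.25), (0.9, 0.9), (2.3, 0.1)]
coarse = hypercube_cover(points, 1.0)    # coarse cover: few elements
fine = hypercube_cover(points, 0.25)     # fine cover: more elements, more complex model
print(len(coarse), len(fine))            # → 2 4
```

Shrinking the cubes grows the number of cover elements, mirroring the abstract's point that cover geometry and porosity jointly set model complexity and the risk of overfitting.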

Submission history

From: Enoch Yeung Ph.D.

Wed, 9 Jan 2019 18:59:10 UTC (2,774 KB)
