Many problems arising in science and engineering aim to find the function for which a specified functional attains its optimal value. Examples include optimal control, inverse analysis and optimal shape design. Such problems, known as variational problems, can rarely be solved analytically, and the only general technique is to approximate the solution using direct methods. Unfortunately, variational problems are often very difficult to solve, and it becomes necessary to innovate in the field of numerical methods in order to overcome these difficulties.
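For concreteness, a prototypical variational problem of this kind is to find, among all admissible functions satisfying given boundary values, the one that minimizes a functional of integral type,

```latex
F[y] = \int_{a}^{b} f\big(x,\, y(x),\, y'(x)\big)\, dx,
\qquad y(a) = y_a, \quad y(b) = y_b .
```

An analytical treatment would solve the associated Euler–Lagrange equation, $\partial f/\partial y - \frac{d}{dx}\,\partial f/\partial y' = 0$; the direct methods mentioned above instead approximate the minimizer within a finite-dimensional family of trial functions.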
The objective of this PhD thesis is to develop a conceptual theory of neural networks from the perspective of functional analysis and variational calculus. Within this formulation, learning means solving a variational problem by minimizing an objective functional associated with the neural network. The choice of objective functional depends on the particular application. Its evaluation, in turn, may require the integration of functions, ordinary differential equations or partial differential equations.
As will be shown, neural networks are able to deal with a wide range of applications in mathematics and physics. More specifically, a variational formulation for the multilayer perceptron provides a direct method for solving variational problems. This includes typical applications such as function regression, pattern recognition and time series prediction, but also new ones such as optimal control, inverse problems and optimal shape design.
This broader range of applications means that a standard neural network cannot deal with some of these problems, and it needs to be augmented. In this work an extended class of multilayer perceptron is developed which, besides the traditional neuron models and network architectures, includes independent parameters, boundary conditions and lower and upper bounds.
The computational performance of this numerical method is investigated here through the solution of several validation problems with known analytical solutions. Moreover, the variational formulation for the extended class of multilayer perceptron is applied to a number of engineering cases in optimal control, inverse problems and optimal shape design. Finally, this work is accompanied by Flood, an open source neural networks C++ library implemented following the theories of functional analysis and the calculus of variations.