Abstract of Companion Losses for Deep Neural Networks

David Díaz Vico, Ángela Fernández Pascual, José Ramón Dorronsoro Ibero

  • Modern Deep Neural Network backends allow great flexibility in defining network architectures. In particular, a network can have multiple outputs, each with its own loss, which can make it better suited to particular goals. In this work we explore this possibility for classification networks that combine the categorical cross-entropy loss, typical of softmax probabilistic outputs, the categorical hinge loss, which extends the hinge loss standard in SVMs, and a novel Fisher loss, which seeks to concentrate class members near their centroids while keeping the centroids apart.
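
The abstract describes a classifier with several output heads, each trained with its own loss. The following is a minimal sketch of such a setup, assuming a TensorFlow/Keras backend; the head names (`ce`, `hinge`, `fisher`), the `fisher_like_loss` function, and the loss weights are illustrative assumptions, not the paper's exact formulation.

```python
import tensorflow as tf

def fisher_like_loss(y_true, y_pred):
    """Illustrative Fisher-style companion loss (an assumption, not the paper's
    exact formulation): pull samples towards their per-batch class centroid
    while pushing class centroids away from the global centroid."""
    y_true = tf.cast(y_true, y_pred.dtype)                   # one-hot labels, shape (batch, n_classes)
    counts = tf.reduce_sum(y_true, axis=0) + 1e-8            # samples per class in the batch
    centroids = tf.matmul(y_true, y_pred, transpose_a=True)  # per-class sums, shape (n_classes, units)
    centroids = centroids / tf.expand_dims(counts, 1)        # per-class centroids
    assigned = tf.matmul(y_true, centroids)                  # centroid of each sample's own class
    within = tf.reduce_mean(tf.reduce_sum(tf.square(y_pred - assigned), axis=1))
    global_centroid = tf.reduce_mean(y_pred, axis=0, keepdims=True)
    between = tf.reduce_mean(tf.reduce_sum(tf.square(centroids - global_centroid), axis=1))
    return within - 0.1 * between                            # 0.1 is an arbitrary trade-off factor

def build_companion_model(input_dim=784, n_classes=10):
    inputs = tf.keras.Input(shape=(input_dim,))
    x = tf.keras.layers.Dense(256, activation="relu")(inputs)
    x = tf.keras.layers.Dense(128, activation="relu")(x)

    # One shared trunk, three heads, one loss per head.
    ce = tf.keras.layers.Dense(n_classes, activation="softmax", name="ce")(x)
    hinge = tf.keras.layers.Dense(n_classes, activation="linear", name="hinge")(x)
    fisher = tf.keras.layers.Dense(n_classes, activation="linear", name="fisher")(x)

    model = tf.keras.Model(inputs, [ce, hinge, fisher])
    model.compile(
        optimizer="adam",
        loss={
            "ce": tf.keras.losses.CategoricalCrossentropy(),
            "hinge": tf.keras.losses.CategoricalHinge(),
            "fisher": fisher_like_loss,
        },
        loss_weights={"ce": 1.0, "hinge": 0.5, "fisher": 0.5},  # illustrative weights
        metrics={"ce": "accuracy"},
    )
    return model

# Usage: every head receives the same one-hot targets.
# model = build_companion_model()
# model.fit(x_train, {"ce": y_onehot, "hinge": y_onehot, "fisher": y_onehot}, epochs=5)
```

In a setup like this the companion heads only shape the shared representation during training; at inference time the softmax (`ce`) head would typically provide the class predictions.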

