Evolving Deep Neural Networks (2019)
Risto Miikkulainen, Jason Liang, Elliot Meyerson, Aditya Rawal, Dan Fink, Olivier Francon, Bala Raju, Hormoz Shahrzad, Arshak Navruzyan, Nigel Duffy, and Babak Hodjat
The success of deep learning depends on finding an architecture to fit the task. As deep learning has scaled up to more challenging tasks, the architectures have become difficult to design by hand. This paper proposes an automated method, CoDeepNEAT, for optimizing deep learning architectures through evolution. By extending existing neuroevolution methods to topology, components, and hyperparameters, this method achieves results comparable to the best human designs on standard benchmarks in object recognition and language modeling. It also supports building a real-world application of automated image captioning on a magazine website. Given the anticipated increases in available computing power, evolution of deep networks is a promising approach to constructing deep learning applications in the future.
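The abstract describes evolving network topology and hyperparameters rather than designing them by hand. As a rough illustration of that idea (not the paper's actual CoDeepNEAT algorithm, which coevolves blueprints and modules), the sketch below evolves a list of layer widths against a toy fitness function standing in for validation accuracy; all names and the objective are illustrative assumptions.

```python
import random

random.seed(0)

WIDTHS = [16, 32, 64, 128]

def random_genome():
    # A genome is a list of layer widths (a stand-in for a network topology).
    return [random.choice(WIDTHS) for _ in range(random.randint(2, 5))]

def mutate(genome):
    # Structural mutations: add a layer, remove a layer, or change a width.
    g = list(genome)
    op = random.random()
    if op < 0.3 and len(g) < 8:
        g.insert(random.randrange(len(g) + 1), random.choice(WIDTHS))
    elif op < 0.5 and len(g) > 2:
        g.pop(random.randrange(len(g)))
    else:
        g[random.randrange(len(g))] = random.choice(WIDTHS)
    return g

def fitness(genome):
    # Toy objective standing in for validation accuracy:
    # prefer about four layers with an average width near 64.
    return -abs(len(genome) - 4) - abs(sum(genome) / len(genome) - 64) / 64

def evolve(generations=30, pop_size=20):
    # Simple elitist evolution: keep the top half, refill with mutants.
    pop = [random_genome() for _ in range(pop_size)]
    for _ in range(generations):
        pop.sort(key=fitness, reverse=True)
        survivors = pop[: pop_size // 2]
        pop = survivors + [mutate(random.choice(survivors)) for _ in survivors]
    return max(pop, key=fitness)

best = evolve()
print(best, fitness(best))
```

In a real architecture search the fitness call would train and evaluate the candidate network, which is why the paper emphasizes the anticipated increases in available computing power.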
Citation:
In Artificial Intelligence in the Age of Neural Networks and Brain Computing, Robert Kozma, Cesare Alippi, Yoonsuck Choe, and Francesco Carlo Morabito (Eds.), pp. 293-312, 2019. Amsterdam: Elsevier.
Olivier Francon Collaborator olivier francon [at] cognizant com
Babak Hodjat Collaborator babak [at] cognizant com
Jason Zhi Liang Ph.D. Alumni jasonzliang [at] utexas edu
Elliot Meyerson Ph.D. Alumni ekm [at] cs utexas edu
Risto Miikkulainen Faculty risto [at] cs utexas edu
Aditya Rawal Ph.D. Alumni aditya [at] cs utexas edu
Hormoz Shahrzad Masters Alumni hormoz [at] cognizant com