Deep learning draws increasing attention from both academia and industry, owing to its extraordinarily deep architectures that learn meaningful representations of input data and thereby significantly improve the performance of associated machine learning tasks. One of the key issues of existing deep learning approaches is that meaningful representations can be learned only when the hyper-parameter settings are properly specified beforehand and the general parameters are precisely learned during the training process. Because deep learning is an emerging topic, little research has been dedicated to automatically setting the hyper-parameters and accurately finding the globally optimal general parameters.
The problem in this regard can be formulated as optimization problems, including discrete optimization, constrained optimization, large-scale global optimization and multi-objective optimization, where evolutionary computation methods can play a crucial role. For example, in deep neural network-based deep learning algorithms, the types of building blocks are enumerated, the coefficients of the regularization terms are limited to a particular range, the number of weights is large, and the weight values are required to be globally optimal. In addition, a higher-performance deep learning algorithm typically requires more computational resources, so the pursuit of performance and the conservation of computational resources are conflicting objectives.
Evolutionary computation approaches, particularly genetic algorithms, particle swarm optimization and genetic programming, have shown superiority in addressing real-world discrete, constrained, large-scale and multi-objective optimization problems, largely due to their powerful abilities to search for global optima, to handle non-convex and non-differentiable problems, to find a set of non-dominated solutions in a single run, and to operate without rich domain knowledge. Evolutionary deep learning aims at solving the optimization problems involved in deep learning algorithms.
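To make the idea concrete, the following is a minimal sketch of a genetic algorithm searching a deep learning hyper-parameter space. The search space (log learning rate and layer count) and the surrogate fitness function are purely illustrative assumptions; a real evolutionary deep learning system would train and evaluate a network to obtain each individual's fitness.

```python
import random

random.seed(0)

# Assumed toy search space: log10 learning rate in [-5, -1],
# number of layers in [1, 8].
def random_individual():
    return [random.uniform(-5, -1), random.randint(1, 8)]

# Surrogate fitness standing in for validation accuracy
# (peaks at log_lr = -3, layers = 4); hypothetical for illustration.
def fitness(ind):
    log_lr, layers = ind
    return -((log_lr + 3) ** 2) - 0.1 * ((layers - 4) ** 2)

def mutate(ind):
    log_lr, layers = ind
    if random.random() < 0.5:
        # Perturb the continuous gene, clamped to its range.
        log_lr = min(-1.0, max(-5.0, log_lr + random.gauss(0, 0.3)))
    else:
        # Perturb the discrete gene, clamped to its range.
        layers = min(8, max(1, layers + random.choice([-1, 1])))
    return [log_lr, layers]

def crossover(a, b):
    # One-point crossover over the two genes.
    return [a[0], b[1]]

def evolve(pop_size=20, generations=30):
    pop = [random_individual() for _ in range(pop_size)]
    for _ in range(generations):
        pop.sort(key=fitness, reverse=True)
        elite = pop[: pop_size // 2]  # truncation selection
        children = []
        while len(elite) + len(children) < pop_size:
            a, b = random.sample(elite, 2)
            children.append(mutate(crossover(a, b)))
        pop = elite + children
    return max(pop, key=fitness)

best = evolve()
```

Because the search uses only fitness comparisons, never gradients, the same loop applies unchanged to non-differentiable choices such as the type of building block or the depth of the network mentioned above.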
In this talk, some recent advances in evolutionary deep learning algorithms are introduced, covering automatic architecture design, weight parameter optimization and multi-objective optimization in deep learning using evolutionary computation approaches.
Dr. Sun received his B.Eng. and Ph.D. degrees from Southwest Petroleum University and Sichuan University in Chengdu, China, in 2008 and 2017, respectively. He was a joint Ph.D. student at Oklahoma State University, USA, from August 2015 to February 2017. He is currently a research fellow with the School of Engineering and Computer Science, Victoria University of Wellington, New Zealand. His areas of expertise include evolutionary deep learning, many-objective optimization and unsupervised deep learning.
Dr. Sun has published nine first-author papers on deep neural networks and evolutionary algorithms in fully refereed international journals and conferences, including four papers in the top journals IEEE Transactions on Evolutionary Computation and Knowledge-Based Systems. He won the best paper award of the IEEE CIS Chengdu Section in 2016. Although an early-career researcher, he has served as a reviewer for more than 20 international journals/conferences, including IEEE Transactions on Evolutionary Computation, IEEE Transactions on Cybernetics and IEEE Transactions on Emerging Topics in Computational Intelligence, and as a program committee member for international conferences including AAAI 2018. Further, the paper "Evolving Unsupervised Deep Neural Networks for Learning Meaningful Representations", of which Dr. Sun is the first author, is the first paper on the topic of evolutionary deep learning published in IEEE Transactions on Evolutionary Computation. Dr. Sun is the organizer of the Special Session "Evolutionary Deep Learning and Applications" at CEC 2019 and of "The 1st Workshop on Evolutionary Deep Learning", and the founding chair of the IEEE CIS Task Force on "Evolutionary Deep Learning and Applications".