News Bulletin: ( 2019-12-17 )
Talk Title: A Swiss-Army Knife for the Theory of Deep Learning and Beyond
Speaker: Mr. Greg Yang (Microsoft Research)
Abstract: The resurgence of neural networks has revolutionized artificial intelligence since 2010. Luckily for mathematicians and statistical physicists, the study of large random network scaling limits, which can be thought of as *nonlinear* random matrix theory, is both practically important and mathematically interesting. We describe several problems in this setting and develop a new comprehensive framework, called "tensor programs", for solving them. This framework can be thought of as an automatic tool for deriving the behavior of computation graphs with large matrices, as used in neural network computation. It is very general, and from it we also obtain new proofs of the semicircle and Marchenko-Pastur laws. Many insights on neural networks follow from this framework, such as the convergence of wide neural networks to Gaussian processes. This talk presents the works arXiv:1902.04760 and arXiv:1910.12478.
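The semicircle law mentioned in the abstract is easy to observe numerically. The sketch below (an illustrative check, not part of the tensor-programs framework itself) samples a large symmetric random matrix with entries of variance 1/n and plots nothing, simply confirming that its eigenvalues concentrate on the interval [-2, 2] as Wigner's law predicts; the matrix size `n = 1000` is an arbitrary choice.

```python
import numpy as np

# Build a symmetric "Wigner" matrix: off-diagonal entries ~ N(0, 1/n).
# The semicircle law says its eigenvalue distribution converges, as
# n grows, to the semicircle density supported on [-2, 2].
rng = np.random.default_rng(0)
n = 1000
A = rng.standard_normal((n, n))
W = (A + A.T) / np.sqrt(2 * n)

eigs = np.linalg.eigvalsh(W)  # eigenvalues of a symmetric matrix

# Empirically, the spectrum stays close to the predicted support [-2, 2].
print(f"min eigenvalue: {eigs.min():.3f}, max eigenvalue: {eigs.max():.3f}")
```

Running this shows the extreme eigenvalues sitting just around ±2, with the bulk filling out the semicircle shape; the talk's framework recovers this classical limit, and the Marchenko-Pastur law, as special cases of a much more general calculus for computation graphs with large random matrices.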