Computing gradients in analog circuits

Speaker Prof. Thomas Chaffey

School of Electrical and Computer Engineering, University of Sydney, Australia

Date/Time Feb 10 (Tuesday), 2026, 14:00

Zoom https://snu-ac-kr.zoom.us/my/jingyu.lee

Abstract

With the rising costs of machine learning, analog computing has seen a new wave of interest. Implementing neural networks in analog circuits could improve energy efficiency and inference speed by many orders of magnitude. However, such devices have proven difficult to scale to useful sizes because of the manufacturing variance of analog components. A possible solution is to train analog hardware directly, adjusting device parameters in response to circuit measurements, using novel devices such as memristors and other nonvolatile memories. This talk will describe ongoing research on learning algorithms for analog electronic networks, which exploit the circuit structure and device properties to give fast methods for performing gradient descent from hardware measurements, with guaranteed convergence.
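To make the idea of "gradient descent from hardware measurements" concrete, here is a minimal sketch of one generic measurement-based training scheme: estimating gradients by perturbing each device parameter and measuring the resulting change in loss. The "circuit" below is a hypothetical stand-in (a quadratic loss), not the speaker's method; the talk's algorithms exploit circuit structure and device properties, which this toy example does not.

```python
import numpy as np

rng = np.random.default_rng(0)
target = np.array([0.5, -1.2, 0.3])  # hypothetical "ideal" device parameters

def measure_loss(params):
    """Stand-in for a hardware measurement of the training loss."""
    return float(np.sum((params - target) ** 2))

def finite_difference_grad(params, eps=1e-4):
    """Estimate the gradient from paired perturbation measurements.

    Each parameter is nudged up and down by eps, and the loss is
    "measured" after each perturbation; no analytic model is needed.
    """
    grad = np.zeros_like(params)
    for i in range(len(params)):
        e = np.zeros_like(params)
        e[i] = eps
        grad[i] = (measure_loss(params + e) - measure_loss(params - e)) / (2 * eps)
    return grad

# Gradient descent driven entirely by measurements.
params = rng.normal(size=3)
for _ in range(200):
    params -= 0.1 * finite_difference_grad(params)

print(measure_loss(params))  # loss driven close to zero
```

Note that this naive scheme needs two measurements per parameter per step; a key attraction of structure-aware methods like those in the talk is avoiding that cost.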

Biography

Thomas Chaffey is a lecturer in the School of Electrical and Computer Engineering at the University of Sydney, Australia. He received the B.Sc. (Adv. Math.) degree in mathematics and computer science and the M.P.E. degree in mechanical engineering from the University of Sydney in 2015 and 2018, respectively, and the Ph.D. degree from the University of Cambridge, U.K., in 2022. From 2022 to 2025 he held the Maudslay-Butler Research Fellowship in Engineering at Pembroke College, University of Cambridge. His research interests are in nonlinear control and its intersection with optimization, circuit theory and learning.