Ziv Goldfeld joined the School of Electrical and Computer Engineering at Cornell University as an assistant professor in July 2019.
From 2017 to 2019, he was a postdoctoral research fellow at the Laboratory for Information and Decision Systems (LIDS) at MIT. Before that, Goldfeld received his B.Sc., M.Sc. and Ph.D. degrees (all summa cum laude) in Electrical and Computer Engineering from Ben-Gurion University of the Negev, Israel, in 2012, 2012 and 2017, respectively.
Goldfeld's research interests include optimal transport theory, statistical machine learning, information theory, high-dimensional and nonparametric statistics, applied probability and interacting particle systems. In his work, he seeks to understand and design engineering systems by formulating and solving mathematical models. A main objective is to cultivate a principled approach to machine learning rooted in theoretical analysis.
Recently, Goldfeld developed a novel optimal transport (OT) paradigm based on smoothing probability measures with Gaussian kernels. This framework inherits the beneficial properties of classic OT while alleviating the so-called "curse of dimensionality" in empirical approximation from samples. Questions of interest revolve around (Wasserstein) metric and topological structure, the sample complexity of empirical approximation, and limit distributions. A main application of smooth OT is generative modeling, with an emphasis on generative adversarial networks (GANs) that achieve state-of-the-art performance while consuming significantly fewer resources (data, computation, etc.).
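The idea can be sketched in one dimension: smoothing a probability measure with a Gaussian kernel amounts to adding independent Gaussian noise to its samples, after which the ordinary Wasserstein distance is computed between the smoothed empirical measures. The sketch below is illustrative only, not the estimator from the papers; the function name `smooth_w1` and the choice of `sigma` are assumptions for this toy example, and `scipy.stats.wasserstein_distance` handles the 1-D case.

```python
import numpy as np
from scipy.stats import wasserstein_distance

def smooth_w1(x, y, sigma, rng):
    """Toy 1-D Gaussian-smoothed Wasserstein-1 distance.

    Convolving each empirical measure with N(0, sigma^2) is simulated
    by adding independent Gaussian noise to the samples; the smoothed
    measures are then compared via the ordinary W1 distance.
    """
    xs = x + sigma * rng.normal(size=x.shape)
    ys = y + sigma * rng.normal(size=y.shape)
    return wasserstein_distance(xs, ys)

rng = np.random.default_rng(0)
x = rng.normal(0.0, 1.0, size=1000)  # samples from P = N(0, 1)
y = rng.normal(0.5, 1.0, size=1000)  # samples from Q = N(0.5, 1)
print(smooth_w1(x, y, sigma=1.0, rng=rng))
```

For these two Gaussians the population W1 distance equals the mean shift (0.5), and the smoothed estimate concentrates near that value; the point of the smoothing is that this concentration survives in high dimensions, where the unsmoothed empirical W1 converges much more slowly.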
Another major interest is the information-theoretic analysis of deep neural networks (DNNs). By developing tools for measuring the information flow in DNNs, Goldfeld aims to understand how these systems progressively build representations, from crude and over-redundant ones in shallow layers to highly clustered and interpretable ones in deeper layers, and to give the designer more control over that process. Combining these tools with new instance-dependent generalization bounds, the goal is to turn the uncertain trial-and-error process of DNN design into a deterministic engineering practice.
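To make "measuring information flow" concrete, the crude baseline in this line of work is a plug-in mutual information estimate between an input and a layer activation obtained by discretizing both. The sketch below is only that baseline, not Goldfeld's estimator (his work specifically addresses the failure modes of naive binning); the function name and the toy "activation" are assumptions for illustration.

```python
import numpy as np

def binned_mutual_information(x, t, bins=30):
    """Naive plug-in estimate of I(X; T) in nats.

    Discretizes both scalar variables into a joint histogram and
    computes the MI of the resulting discrete distribution. Known to
    be unreliable for high-dimensional activations, which motivates
    more careful estimators.
    """
    joint, _, _ = np.histogram2d(x, t, bins=bins)
    p_xt = joint / joint.sum()                  # joint pmf over bins
    p_x = p_xt.sum(axis=1, keepdims=True)       # marginal of X
    p_t = p_xt.sum(axis=0, keepdims=True)       # marginal of T
    mask = p_xt > 0                             # avoid log(0)
    return float((p_xt[mask] * np.log(p_xt[mask] / (p_x @ p_t)[mask])).sum())

rng = np.random.default_rng(1)
x = rng.normal(size=5000)                       # toy "input"
t = np.tanh(x + 0.1 * rng.normal(size=5000))    # toy "layer activation"
print(binned_mutual_information(x, t))
```

Since the activation here is a nearly deterministic function of the input, the binned estimate comes out well above zero; tracking how such estimates change across layers and training epochs is what "information flow" refers to in this context.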
Other research trajectories include the construction of physical-layer security codes via adversarial training of DNNs, mutual information estimation based on recurrent neural networks, and data storage in stochastic Ising models.
Teaching
ECE 6970 - Statistical Distances for Modern Machine Learning (Fall 2019)
Selected Publications
- Z. Goldfeld, K. Greenewald, Y. Polyanskiy and J. Weed, “Convergence of smoothed empirical measures with applications to entropy estimation.”
- Z. Goldfeld, E. van den Berg, K. Greenewald, I. Melnyk, N. Nguyen, B. Kingsbury and Y. Polyanskiy, “Estimating information flow in deep neural networks.”
- Z. Goldfeld, G. Bresler and Y. Polyanskiy, “Information storage in the stochastic Ising model.”
- A. Bunin, Z. Goldfeld, H. Permuter, S. Shamai, P. Cuff and P. Piantanida, “Key and message semantic-security over state-dependent channels.”
- Z. Goldfeld, P. Cuff and H. H. Permuter, “Semantic-security capacity for wiretap channels of type II.”
Selected Awards and Honors
- Rothschild Postdoctoral Fellowship, 2017
- Ben-Gurion University Postdoctoral Fellowship, 2017
- Feder Award in the National Contest for Outstanding Student Research, 2016
- Best Tutor Award (Electrical and Computer Engineering Department, Ben-Gurion University), 2016
- Best Student Paper Award at the International Conference on the Science of Electrical Engineering (ICSEE), 2014
Education
B.Sc. Electrical and Computer Engineering, Ben-Gurion University of the Negev, Israel, 2012
M.Sc. Electrical and Computer Engineering, Ben-Gurion University of the Negev, Israel, 2012
Ph.D. Electrical and Computer Engineering, Ben-Gurion University of the Negev, Israel, 2017