
Ziv Goldfeld

Assistant Professor
Electrical and Computer Engineering
Frank H. T. Rhodes Hall, Room 322

Biography

Ziv Goldfeld joined the School of Electrical and Computer Engineering at Cornell University as an assistant professor in July 2019. He is a graduate field member in Computer Science, Data Science, and the Center for Applied Mathematics at Cornell University, and a member of the Foundations of Information, Networks, and Decision Systems (FIND) group. From 2017 to 2019, he was a postdoctoral research fellow at the Laboratory for Information and Decision Systems (LIDS) in the Department of Electrical Engineering and Computer Science at MIT. Before that, Goldfeld earned his B.Sc., M.Sc., and Ph.D. (all summa cum laude) in Electrical and Computer Engineering from Ben Gurion University of the Negev, Israel, in 2012, 2012, and 2017, respectively.

Research Interests

Goldfeld's research interests include optimal transport theory, information theory, mathematical statistics, and applied probability. He develops mathematical tools for the design and analysis of modern inference and learning systems, with the goal of devising algorithms that are accurate, scalable, robust, private, and fair. 

Many learning tasks, from generative modeling to style transfer, can be distilled into a question of comparing, or deriving transformations between, complex high-dimensional probability distributions. The key mathematical objects that quantify such comparisons are statistical divergences, which are discrepancy measures between probability distributions. Popular classes of divergences include Wasserstein distances (rooted in optimal transport theory), f-divergences, integral probability metrics, and more. Despite their potency for modeling, analyzing, and designing learning algorithms, such divergences typically suffer from computational and statistical hardness issues, especially in high-dimensional settings. To overcome this impasse, Goldfeld's group explores new regularization paradigms for statistical divergences that preserve their meaningful structure and compatibility for inference while enabling statistical and computational scalability. Research questions include: (i) structural, topological, and geometric properties of regularized divergences (e.g., via smoothing, slicing, entropic penalties, etc.); (ii) high-dimensional statistical questions, such as empirical convergence rates, neural estimation techniques, robust estimation, and limit distribution theory; and (iii) learning-theoretic applications to generative modeling, barycenter computation/estimation, testing, and anomaly detection.
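
To make the slicing idea concrete, here is a minimal sketch, assuming only NumPy, of the sliced Wasserstein distance: both samples are projected onto random one-dimensional directions, where the Wasserstein distance reduces to comparing order statistics, and the results are averaged. The function name and parameters are illustrative, not taken from any particular library or publication.

    import numpy as np

    def sliced_wasserstein(x, y, num_projections=128, p=2, seed=None):
        # Monte Carlo estimate of the sliced p-Wasserstein distance between
        # the empirical measures of samples x and y (both of shape (n, d)).
        rng = np.random.default_rng(seed)
        d = x.shape[1]
        # Draw random directions uniformly on the unit sphere.
        theta = rng.standard_normal((num_projections, d))
        theta /= np.linalg.norm(theta, axis=1, keepdims=True)
        # Project both samples onto each direction (shape (n, num_projections)).
        x_proj, y_proj = x @ theta.T, y @ theta.T
        # In 1D, W_p^p between equal-size empirical measures is the average
        # p-th power gap between order statistics, so sorting suffices.
        gaps = np.abs(np.sort(x_proj, axis=0) - np.sort(y_proj, axis=0))
        return np.mean(gaps ** p) ** (1.0 / p)

    # Example: samples from two Gaussians in dimension 10.
    rng = np.random.default_rng(0)
    x = rng.standard_normal((500, 10))
    y = rng.standard_normal((500, 10)) + 1.0
    print(sliced_wasserstein(x, y))

Because each projected problem is one-dimensional, the estimate sidesteps much of the statistical and computational burden of the unregularized high-dimensional distance, which is the kind of scalability the regularization paradigms above aim for.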

Another focus is developing information-theoretic tools for measuring the flow of information through deep neural networks. The goal here is to explain the process by which deep nets progressively build representations of the data, from crude and overly redundant representations in shallow layers to highly clustered and interpretable ones in deeper layers, and to give the designer more control over that process. To that end, the project develops efficient estimators of information measures over the network (e.g., via built-in dimensionality reduction techniques). Such estimators also lead to new visualization, optimization, and pruning methods for deep neural networks. New instance-dependent generalization bounds based on information measures are also of interest.
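
As a rough illustration of the estimation problem, the sketch below, assuming only NumPy, estimates the mutual information I(X; T) across a layer with additive Gaussian noise, T = f(X) + Z with Z ~ N(0, sigma^2 I). Since the layer map is deterministic, I(X; T) = h(f(X) + Z) - h(Z), and h(f(X) + Z) is the differential entropy of a Gaussian mixture centered at the observed activations, which can be estimated by Monte Carlo. The function names, the noise level, and the tanh activations are hypothetical stand-ins, not a published implementation.

    import numpy as np

    def _logsumexp(a, axis):
        # Numerically stable log-sum-exp along the given axis.
        m = a.max(axis=axis, keepdims=True)
        return np.squeeze(m, axis=axis) + np.log(np.exp(a - m).sum(axis=axis))

    def mixture_entropy(centers, sigma, num_mc=2000, seed=None):
        # Monte Carlo estimate of h(S + Z), where S is uniform over the rows
        # of `centers` and Z ~ N(0, sigma^2 I): draw T from the Gaussian
        # mixture, then average -log of its density (in nats).
        rng = np.random.default_rng(seed)
        n, d = centers.shape
        t = centers[rng.integers(n, size=num_mc)]
        t = t + sigma * rng.standard_normal((num_mc, d))
        sq = ((t[:, None, :] - centers[None, :, :]) ** 2).sum(axis=-1)
        log_comp = -sq / (2 * sigma**2) - 0.5 * d * np.log(2 * np.pi * sigma**2)
        return -np.mean(_logsumexp(log_comp, axis=1) - np.log(n))

    def layer_mutual_information(activations, sigma, seed=None):
        # I(X; T) for T = activations + N(0, sigma^2 I); with a deterministic
        # layer map, I(X; T) = h(f(X) + Z) - h(Z).
        d = activations.shape[1]
        h_noise = 0.5 * d * np.log(2 * np.pi * np.e * sigma**2)
        return mixture_entropy(activations, sigma, seed=seed) - h_noise

    # Example with stand-in activations of a hypothetical tanh layer.
    rng = np.random.default_rng(1)
    activations = np.tanh(rng.standard_normal((300, 5)))
    print(layer_mutual_information(activations, sigma=0.2))

Injecting the Gaussian noise is what makes the mutual information finite and estimable in the first place; for a noiseless deterministic layer the quantity degenerates, which is one reason smoothed empirical measures enter the analysis.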

Additional research trajectories include causal inference and its relation to the directed information functional, differential privacy, physical-layer security, high-dimensional nonparametric estimation, and interacting particle systems.

Teaching Interests

ECE 6970 - Statistical Distances for Modern Machine Learning (Fall 2019)
ECE 5630 - Information Theory for Data Transmission, Security and Machine Learning (Spring 2020)
ECE 4110 - Random Signals in Communications and Signal Processing

Selected Publications

  • Z. Goldfeld, K. Greenewald and K. Kato, “Asymptotic guarantees for generative modeling based on the smooth Wasserstein distance.”
  • Z. Aharoni, D. Tzur, Z. Goldfeld and H. H. Permuter, “Directed information: neural estimation and applications to learning from sequential data.”
  • Z. Goldfeld and K. Greenewald, “Gaussian-smoothed optimal transport: metric structure and statistical efficiency.”
  • Z. Goldfeld, K. Greenewald, Y. Polyanskiy and J. Weed, “Convergence of smoothed empirical measures with applications to entropy estimation.”
  • Z. Goldfeld, E. van den Berg, K. Greenewald, I. Melnyk, N. Nguyen, B. Kingsbury and Y. Polyanskiy, “Estimating information flow in deep neural networks.”

Selected Awards and Honors

  • NSF CAREER Award, 2021
  • NSF CRII Award, 2020
  • IBM University Award, 2020
  • The Rothschild Postdoctoral Fellowship, 2017
  • The Ben Gurion University Postdoctoral Fellowship, 2017
  • Feder Award in the National Contest for Outstanding Student Research, 2016
  • Best Tutor Award (Electrical and Computer Engineering Department, Ben Gurion University), 2016
  • Best Student Paper Award at the International Conference on the Science of Electrical Engineering (ICSEE), 2014

Education

B.Sc. Electrical and Computer Engineering, Ben Gurion University of the Negev, Israel, 2012
M.Sc. Electrical and Computer Engineering, Ben Gurion University of the Negev, Israel, 2012
Ph.D. Electrical and Computer Engineering, Ben Gurion University of the Negev, Israel, 2018
