A recap of ICML 2021
I attended ICML virtually this year. A few papers caught my attention, mostly because they tackle issues related to what I am currently thinking about: generative models.
Summarizing a paper is a great exercise: putting thoughts into words requires reaching a substantial level of understanding of the paper. Some amount of literature review is needed to understand the problem setting as well as previous work, which helps form a high-level mental image of the field. Moreover, unlike with your own papers, you can be as vocal as you want about potential weaknesses of the contributions, which helps put things in perspective.
For that reason, I summarized two papers presented at ICML 2021: *Conjugate Energy Based Models* and *Generative Particle Variational Inference via Estimation of Functional Gradients*. I focused on explaining the initial motivation and the high-level structure of each algorithm, and purposely skipped many details and training tricks described at length in the papers.
Finally, I also included a list of other papers that I enjoyed but did not summarize.