Kyurae Kim


I am a second-year PhD student at the University of Pennsylvania, advised by Professor Jacob R. Gardner, working on Bayesian inference, stochastic optimization, Markov chain Monte Carlo sampling, and Bayesian optimization. I also work closely with Professors Yi-An Ma and Alain Oliviero Durmus.

I received my Bachelor of Engineering degree from Sogang University, South Korea, where I did undergraduate research under Professors Hongseok Kim, Tai-kyong Song, Sungyong Park, and Youngjae Kim. During this time, I also worked at Samsung Medical Center, South Korea, as an undergraduate researcher, at Kangbuk Samsung Hospital, South Korea, as a visiting researcher, and at Hansono, South Korea, as a part-time embedded software engineer. After graduating, I was a research associate at the University of Liverpool under Professors Simon Maskell and Jason F. Ralph. I hold memberships in the ACM, ISBA, and the IEEE (which implies that I’m a good tipper…).

Previously, I worked on medical imaging, computer systems, high-performance computing, and array signal processing. I am also broadly interested in topics such as computational statistics, programming languages, and optimization. Here is a list of papers that I have found interesting over my career.

Software

I am active within Julia’s computational statistics community as part of the Turing language team.

productivity tools that I use

(Last updated on 14 April 2024)
I make heavy use of cross-platform, open-source software tools.

  • Emacs (heavily customized) for writing code
    • Magit for accessing Git within Emacs
  • Zotero for managing citations and exporting BibTeX
  • Inkscape for drawing vector diagrams (but also TikZ if not in a rush)
  • Veusz for quick publication-quality plots (but also PGFPlots and Makie.jl for fancier stuff)
  • Evince for viewing PDF files, and Foxit Reader for editing and annotating them
  • Nomacs for viewing a lot of image files quickly (on Windows, FastStone is hard to beat)
  • Flameshot for taking screenshots

news

Oct 26, 2024 I will be back in South Korea from December 6 to December 25.
Sep 25, 2024 One paper on BayesOpt has been accepted to NeurIPS’24 as a spotlight.
Aug 26, 2024 I will be in San Francisco from August to November.
Jul 24, 2024 I received a best reviewer award from ICML’24.
May 7, 2024 I will do an internship with the Prescient Design team at Genentech this Fall.

selected publications

  1. Approximation-Aware Bayesian Optimization
    Natalie Maus, Kyurae Kim, Geoff Pleiss, David Eriksson, John P. Cunningham, and Jacob R. Gardner.
    In Advances in Neural Information Processing Systems, Dec 2024
  2. Demystifying SGD with Doubly Stochastic Gradients
    Kyurae Kim, Joohwan Ko, Yi-An Ma, and Jacob R. Gardner.
    In Proceedings of the International Conference on Machine Learning (ICML), Jul 2024
  3. Provably Scalable Black-Box Variational Inference with Structured Variational Families
    Joohwan Ko, Kyurae Kim, Woo Chang Kim, and Jacob R. Gardner.
    In Proceedings of the International Conference on Machine Learning (ICML), Jul 2024
  4. Stochastic Approximation with Biased MCMC for Expectation-Maximization
    Samuel Gruffaz, Kyurae Kim, Alain Durmus, and Jacob R. Gardner.
    In Proceedings of the International Conference on Artificial Intelligence and Statistics (AISTATS), May 2024
  5. Linear Convergence of Black-Box Variational Inference: Should We Stick the Landing?
    Kyurae Kim, Yi-An Ma, and Jacob R. Gardner.
    In Proceedings of the International Conference on Artificial Intelligence and Statistics (AISTATS), May 2024
  6. On the Convergence of Black-Box Variational Inference
    Kyurae Kim, Jisu Oh, Kaiwen Wu, Yi-An Ma, and Jacob R. Gardner.
    In Advances in Neural Information Processing Systems, Dec 2023
  7. The Behavior and Convergence of Local Bayesian Optimization
    Kaiwen Wu, Kyurae Kim, Roman Garnett, and Jacob R. Gardner.
    In Advances in Neural Information Processing Systems, Dec 2023