The Lindahl Letter

Deep generative models

Perhaps you were looking for a bit more of a deep dive into deep generative models than this relatively short missive will contain. You could go check out Stanford University’s CS236 course from Fall 2021 [1]. That would help you begin to understand how unsupervised learning can be used to model the data distribution well enough to generate new data elements (a quick code sketch of that idea appears after the reference below). The content is broken up into 5 sections, and you could contribute to the class GitHub if you wanted to provide feedback or improvement suggestions. Alternatively, you could learn more about this topic from Prakash Pandey over at Towards Data Science from back in 2018 [2]. It's a faster read and pretty easy to digest compared to taking on a college-level course. You could also go the academic paper route for an introduction and dig into a work from Ruthotto & Haber from 2021:

Ruthotto, L., & Haber, E. (2021). An introduction to deep generative modeling. GAMM-Mitteilungen, 44(2), e202100008. https://arxiv.org/pdf/2103.05180.pdf
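Before moving on, here is that quick sketch of the core learn-then-sample pattern. It is a minimal illustration in Python, and it deliberately uses a shallow model (scikit-learn's GaussianMixture) as a stand-in: deep generative models swap the mixture for neural networks, but the workflow of fitting a distribution to unlabeled data and then sampling new points from it is the same. The data and parameter choices are made up for illustration and come from none of the sources above.

```python
# A minimal sketch of the learn-then-sample pattern behind generative
# modeling, using a shallow Gaussian mixture as a stand-in for a deep model.
# All data and parameter choices here are illustrative assumptions.
import numpy as np
from sklearn.mixture import GaussianMixture

rng = np.random.default_rng(0)

# Unlabeled "training data": two clusters of points in 2D.
data = np.vstack([
    rng.normal(loc=[-2.0, 0.0], scale=0.5, size=(500, 2)),
    rng.normal(loc=[2.0, 1.0], scale=0.7, size=(500, 2)),
])

# Unsupervised step: estimate the data distribution p(x) from the data alone.
model = GaussianMixture(n_components=2, random_state=0).fit(data)

# Generative step: draw brand-new data elements from the learned distribution.
new_points, _ = model.sample(50)
print(new_points[:5])
```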

My initial run at digging into deep generative models opened the door to a bunch of different topics within the space. Right at the start I ran into semi-supervised learning, graphs, urban mobility, and molecular science. You can imagine the urban mobility one made me a little curious about what people were doing to model that with deep generative models. Apparently, Google Scholar had enough data to offer up three paths: migration, mobility, and morphology. To get started, I dug in with a quick search combining "deep generative models" with urban mobility [3]. None of the articles within this search space has attracted many citations yet, so it might be a relatively small area of academic inquiry at the moment. You could read most of the relevant academic content on deep generative models for urban mobility in an afternoon. Actually applying them to some type of use case will take a bit more effort in terms of setup, technology, and selection of that use case.

Here are 3 papers that showed up in the urban mobility search:

Eigenschink, P., Vamosi, S., Vamosi, R., Sun, C., Reutterer, T., & Kalcher, K. (2021). Deep Generative Models for Synthetic Data. ACM Computing Surveys. https://epub.wu.ac.at/8394/1/Deep_Generative_Models_for_Sequential_Data__WU_ePub_.pdf

Anda, C., & Ordonez Medina, S. A. (2019). Privacy-by-design generative models of urban mobility. Arbeitsberichte Verkehrs- und Raumplanung, 1454. https://www.research-collection.ethz.ch/bitstream/handle/20.500.11850/357034/3/ab1454.pdf

Johnsen, M., Brandt, O., Garrido, S., & Pereira, F. (2022). Population synthesis for urban resident modeling using deep generative models. Neural Computing and Applications, 34(6), 4677-4692. https://arxiv.org/ftp/arxiv/papers/2011/2011.06851.pdf

Here are 3 decently cited academic papers:

Salakhutdinov, R. (2015). Learning deep generative models. Annual Review of Statistics and Its Application, 2, 361-385. https://www.utstat.toronto.edu/~rsalakhu/papers/Russ_thesis.pdf

Kingma, D. P., Mohamed, S., Jimenez Rezende, D., & Welling, M. (2014). Semi-supervised learning with deep generative models. Advances in Neural Information Processing Systems, 27. https://proceedings.neurips.cc/paper/2014/file/d523773c6b194f37b938d340d5d02232-Paper.pdf

Maaløe, L., Sønderby, C. K., Sønderby, S. K., & Winther, O. (2016, June). Auxiliary deep generative models. In International Conference on Machine Learning (pp. 1445-1453). PMLR. http://proceedings.mlr.press/v48/maaloe16.pdf
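The Kingma et al. and Maaløe et al. papers both work in the variational autoencoder (VAE) family, so a generic VAE is a useful mental model for this list. Here is a minimal, illustrative VAE sketch in PyTorch; it is not the specific model from any of these papers, and the layer sizes are arbitrary assumptions on my part.

```python
# A minimal, generic VAE sketch in PyTorch (illustrative only; not the exact
# model from any paper cited above). Sized for flattened 28x28 images with
# values scaled to [0, 1]; all dimensions are arbitrary assumptions.
import torch
import torch.nn as nn
import torch.nn.functional as F

class VAE(nn.Module):
    def __init__(self, data_dim=784, hidden_dim=400, latent_dim=20):
        super().__init__()
        self.enc = nn.Linear(data_dim, hidden_dim)
        self.mu = nn.Linear(hidden_dim, latent_dim)      # mean of q(z|x)
        self.logvar = nn.Linear(hidden_dim, latent_dim)  # log-variance of q(z|x)
        self.dec1 = nn.Linear(latent_dim, hidden_dim)
        self.dec2 = nn.Linear(hidden_dim, data_dim)

    def encode(self, x):
        h = F.relu(self.enc(x))
        return self.mu(h), self.logvar(h)

    def reparameterize(self, mu, logvar):
        # Sample z = mu + sigma * eps so gradients flow through mu and sigma.
        std = torch.exp(0.5 * logvar)
        return mu + std * torch.randn_like(std)

    def decode(self, z):
        return torch.sigmoid(self.dec2(F.relu(self.dec1(z))))

    def forward(self, x):
        mu, logvar = self.encode(x)
        return self.decode(self.reparameterize(mu, logvar)), mu, logvar

def elbo_loss(recon, x, mu, logvar):
    # Negative ELBO: reconstruction error plus KL divergence to the N(0, I) prior.
    bce = F.binary_cross_entropy(recon, x, reduction="sum")
    kld = -0.5 * torch.sum(1 + logvar - mu.pow(2) - logvar.exp())
    return bce + kld

# After training, generating new data is just decoding draws from the prior:
# samples = model.decode(torch.randn(64, 20))
```

Once a model like this is trained, that last comment is the whole generative payoff: sample latent vectors from the prior and decode them into new data points.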

Links and thoughts:

“Teaching MLOps at scale with GitHub - Universe 2022”

“#83 Dr. ANDREW LAMPINEN (Deepmind) - Natural Language, Symbols and Grounding [NEURIPS2022 UNPLUGGED]”

Top 5 Tweets of the week:

Footnotes:

[1] https://deepgenerativemodels.github.io/ 

[2] https://towardsdatascience.com/deep-generative-models-25ab2821afd3 

[3] https://scholar.google.com/scholar?hl=en&as_sdt=0%2C6&q=%22Deep+generative+models%22+urban+mobility&btnG= 

What’s next for The Lindahl Letter?

  • Week 100: Overcrowding and ML

  • Week 101: Back to the ROI for ML 

  • Week 102: ML pracademics

  • Week 103: Rethinking the future of ML

  • Week 104: That 2nd year of posting recap

I’ll try to keep the what’s next list forward-looking with at least five weeks of posts in planning or review. If you enjoyed this content, then please take a moment and share it with a friend. If you are new to The Lindahl Letter, then please consider subscribing. New editions arrive every Friday. Thank you and enjoy the week ahead.
