Flow-based generative model

A flow-based generative model is a generative model used in machine learning that explicitly models a probability distribution by leveraging a normalizing flow, a statistical method that uses the change-of-variables rule of probability to transform a simple distribution into a complex one. Modeling the likelihood directly provides several advantages. For example, the negative log-likelihood can be computed exactly and minimized as the loss function, and novel samples can be generated by sampling from the base distribution and applying the flow transformation. In contrast, many alternative generative modeling methods, such as the variational autoencoder (VAE) and the generative adversarial network (GAN), do not explicitly represent the likelihood function. (Wikipedia).
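The change-of-variables idea above can be sketched in a few lines. The following is a minimal illustrative example, not any particular published flow: a one-dimensional invertible affine map applied to a standard-normal base distribution, with the parameter values chosen arbitrarily for demonstration. Real normalizing flows stack many learned invertible layers, but the exact log-likelihood and the sample-then-transform generation work the same way.

```python
import numpy as np

rng = np.random.default_rng(0)

# Invertible affine "flow": x = a*z + b, with z drawn from a standard normal.
# (a and b are illustrative constants; in practice they would be learned.)
a, b = 2.0, 1.0

def forward(z):
    """Push base samples z through the flow to get data-space samples x."""
    return a * z + b

def inverse(x):
    """Invert the flow: recover z from x."""
    return (x - b) / a

def log_prob(x):
    """Change of variables: log p_x(x) = log p_z(f^-1(x)) + log |d f^-1/dx|."""
    z = inverse(x)
    log_pz = -0.5 * (z ** 2 + np.log(2 * np.pi))  # standard-normal log-density
    log_det = -np.log(abs(a))                     # log |dz/dx| = -log|a|
    return log_pz + log_det

# Generation: sample the base distribution, then apply the flow.
samples = forward(rng.standard_normal(100_000))

# Training would minimize the negative log-likelihood of observed data.
data = np.array([0.5, 1.0, 1.5])
nll = -log_prob(data).mean()
```

Here `samples` follows a Gaussian with mean 1 and standard deviation 2, and `log_prob` matches that density exactly, which is the tractability the paragraph above refers to.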

generative model vs discriminative model

Understanding the difference between a generative model and a discriminative model, with a simple example. All of my machine learning YouTube videos: https://www.youtube.com/playlist?list=PLVNY1HnUlO26x597OgAN8TCgGTiE-38D6

From playlist Machine Learning

(ML 13.5) Generative process specification

A compact way to specify a model is by its "generative process", using a convenient convention involving the graphical model.

From playlist Machine Learning

Generative Modeling by Estimating Gradients of the Data Distribution - Stefano Ermon

Seminar on Theoretical Machine Learning. Topic: Generative Modeling by Estimating Gradients of the Data Distribution. Speaker: Stefano Ermon. Affiliation: Stanford University. Date: May 12, 2020. For more videos please visit http://video.ias.edu

From playlist Mathematics

Set Distribution Networks: a Generative Model for Sets of Images (Paper Explained)

We've become very good at making generative models for images and classes of images, but not yet for sets of images, especially when the number of sets is unknown and can include sets never encountered during training. This paper builds a probabilistic framework and a practical…

From playlist Papers Explained

Machine Learning for Computational Fluid Dynamics

Machine learning is rapidly becoming a core technology for scientific computing, with numerous opportunities to advance the field of computational fluid dynamics. This paper highlights some of the areas of highest potential impact, including accelerating direct numerical simulations…

From playlist Data Driven Fluid Dynamics

Enhancing Computational Fluid Dynamics with Machine Learning

Research abstract by Ricardo Vinuesa (KTH). Twitter: @ricardovinuesa. In this video we discuss the recent article published in Nature Computational Science by Ricardo Vinuesa and Steve Brunton on the potential of machine learning (ML) to improve numerical simulations…

From playlist Research Abstracts from Brunton Lab

A Style-Based Generator Architecture for Generative Adversarial Networks

5-min ML Paper Challenge. Presenter: https://www.linkedin.com/in/sherminehgh/ Paper: A Style-Based Generator Architecture for Generative Adversarial Networks, https://arxiv.org/abs/1812.04948. Abstract: We propose an alternative generator architecture for generative adversarial networks, borrowing from style transfer literature…

From playlist AISC - 5-min Papers

Giada Basile: A gradient flow approach to kinetic equations

The lecture was held within the framework of the Hausdorff Trimester Program: Kinetic Theory Abstract: I will present some results obtained together with D. Benedetto and L. Bertini on a gradient flow formulation of linear kinetic equations, in terms of an entropy dissipation inequality.

From playlist Workshop: Probabilistic and variational methods in kinetic theory

Stochastic RNNs without Teacher-Forcing

We present a stochastic non-autoregressive RNN that does not require teacher-forcing for training. The content is based on our 2018 NeurIPS paper: Deep State Space Models for Unconditional Word Generation https://arxiv.org/abs/1806.04550

From playlist Deep Learning Architectures

DDPS | Data-driven methods for fluid simulations in computer graphics

Fluid phenomena are ubiquitous in our everyday experience: winds swooshing through trembling leaves, turbulent water streams running down a river, and cellular patterns generated by wrinkled flames are a few examples. These complex phenomena capture our attention and awe due to the beauty…

From playlist Data-driven Physical Simulations (DDPS) Seminar Series

DDPS | Physics-Guided Deep Learning for Dynamics Forecasting

In this talk from July 9, 2021, University of California, San Diego computer science Ph.D. student Rui Wang discusses physics-based modeling with deep learning. Description: Modeling complex physical dynamics is a fundamental task in science and engineering…

From playlist Data-driven Physical Simulations (DDPS) Seminar Series

A New Balancing Method for Solving Parametric Max Flow

March 14, 2007 lecture by Bin Zhang for the Stanford University Computer Systems Colloquium (EE 380). A new, simple, and fast algorithm finds a sequence of nested minimum cuts of a bipartite parametric flow network. Instead of working with the original parametric flow network, the new method…

From playlist Course | Computer Systems Laboratory Colloquium (2006-2007)

DDPS | Modeling and controlling turbulent flows through deep learning

Description: The advent of new powerful deep neural networks (DNNs) has fostered their application in a wide range of research areas, including more recently in fluid mechanics. In this presentation, we will cover some of the fundamentals of deep learning applied to computational fluid dynamics…

From playlist Data-driven Physical Simulations (DDPS) Seminar Series

Towards Analyzing Normalizing Flows by Navin Goyal

Program: Advances in Applied Probability II (online). Organizers: Vivek S Borkar (IIT Bombay, India), Sandeep Juneja (TIFR Mumbai, India), Kavita Ramanan (Brown University, Rhode Island), Devavrat Shah (MIT, US) and Piyush Srivastava (TIFR Mumbai, India). Date: 04 to 08 January 2021…

From playlist Advances in Applied Probability II (Online)

R. Bamler - Compactness and partial regularity theory of Ricci flows in higher dimensions (vt)

We present a new compactness theory of Ricci flows. This theory states that any sequence of Ricci flows that is pointed in an appropriate sense subsequentially converges to a synthetic flow. Under a natural non-collapsing condition, this limiting flow is smooth on the complement of a singular set…

From playlist Ecole d'été 2021 - Curvature Constraints and Spaces of Metrics

DDPS | Turbulent disperse two-phase flows: simulations and data-driven modeling

In this DDPS talk from Aug. 26, 2021, University of Michigan Assistant Professor in Mechanical Engineering and Aerospace Engineering Jesse Capecelatro discusses a data-driven framework for model closure of the multiphase Reynolds Averaged Navier–Stokes (RANS) equations. Description: Turbulent…

From playlist Data-driven Physical Simulations (DDPS) Seminar Series

R. Bamler - Compactness and partial regularity theory of Ricci flows in higher dimensions

We present a new compactness theory of Ricci flows. This theory states that any sequence of Ricci flows that is pointed in an appropriate sense subsequentially converges to a synthetic flow. Under a natural non-collapsing condition, this limiting flow is smooth on the complement of a singular set…

From playlist Ecole d'été 2021 - Curvature Constraints and Spaces of Metrics

Machine Learning for Weather Forecast - Deep Random Talks - Episode 15

Notes and resources: https://ai.science/l/2a91c2ab-0ffc-464c-95bb-71f37cd2bcc0 - Join our ML Slack community: https://join.slack.com/t/aisc-to/shared_invite/zt-f5zq5l35-PSIJTFk4v60FML177PgsPg - Visit our website: https://ai.science - Book a 20-min AMA with Amir: https://calendly.com/amir

From playlist Deep Random Talks - Season 1

Related pages

Loss function | Inverse function | Deep learning | Jacobian matrix and determinant | Universal approximation theorem | Homeomorphism | Probability density function | Generative adversarial network | Generative model | Inverse function theorem | Rademacher distribution | Ambient isotopy | Determinant | Sphere eversion | Maximum likelihood estimation | Probability distribution | Mathematical induction | Autoencoder | Random variable | Invertible matrix | Whitney embedding theorem