About this Course
14,418 recent views

100% online

Start instantly and learn at your own schedule.

Flexible deadlines

Reset deadlines in accordance with your schedule.

Advanced Level

Approx. 23 hours to complete

English

Subtitles: English

Skills you will gain

Inference, Gibbs Sampling, Markov Chain Monte Carlo (MCMC), Belief Propagation

Syllabus - What you will learn from this course

Week 1
25 minutes to complete

Inference Overview

2 videos (Total 25 min)
1 hour to complete

Variable Elimination

4 videos (Total 56 min), 1 quiz
1 practice exercise
Variable Elimination (18m)
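
For readers who want a concrete feel for this material before enrolling, here is a minimal sketch of variable elimination over discrete factors. It is not part of the course; the factors, variable names, and restriction to binary variables are illustrative assumptions.

    # Minimal variable elimination over discrete factors represented as
    # (variable list, table) pairs; binary variables only, for brevity.
    from itertools import product

    def multiply(f1, f2):
        """Pointwise product of two factors."""
        vars1, t1 = f1
        vars2, t2 = f2
        out_vars = vars1 + [v for v in vars2 if v not in vars1]
        table = {}
        for assignment in product([0, 1], repeat=len(out_vars)):
            a = dict(zip(out_vars, assignment))
            table[assignment] = (t1[tuple(a[v] for v in vars1)] *
                                 t2[tuple(a[v] for v in vars2)])
        return out_vars, table

    def marginalize(factor, var):
        """Sum one variable out of a factor."""
        fvars, table = factor
        idx = fvars.index(var)
        out_vars = fvars[:idx] + fvars[idx + 1:]
        out = {}
        for assignment, val in table.items():
            key = assignment[:idx] + assignment[idx + 1:]
            out[key] = out.get(key, 0.0) + val
        return out_vars, out

    def eliminate(factors, order):
        """Eliminate variables in the given order; return the final factor."""
        for var in order:
            involved = [f for f in factors if var in f[0]]
            rest = [f for f in factors if var not in f[0]]
            prod = involved[0]
            for f in involved[1:]:
                prod = multiply(prod, f)
            factors = rest + [marginalize(prod, var)]
        result = factors[0]
        for f in factors[1:]:
            result = multiply(result, f)
        return result

    # Tiny example: P(A) and P(B|A); eliminating A yields the marginal over B.
    pA = (['A'], {(0,): 0.6, (1,): 0.4})
    pB_given_A = (['A', 'B'], {(0, 0): 0.9, (0, 1): 0.1, (1, 0): 0.2, (1, 1): 0.8})
    print(eliminate([pA, pB_given_A], ['A']))   # roughly {(0,): 0.62, (1,): 0.38}
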
Week 2
18 hours to complete

Belief Propagation Algorithms

9 videos (Total 150 min), 3 quizzes
9 videos
Clique Tree Algorithm - Correctness (18m)
Clique Tree Algorithm - Computation (16m)
Clique Trees and Independence (15m)
Clique Trees and VE (16m)
BP In Practice (15m)
Loopy BP and Message Decoding (21m)
2 practice exercises
Message Passing in Cluster Graphs (10m)
Clique Tree Algorithm (10m)
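
Purely as an illustration (not course code), the sketch below runs exact sum-product message passing on a chain of binary variables, the simplest tree-structured case of the belief propagation algorithms covered this week; the NumPy representation and the potentials are assumed for the example.

    # Sum-product message passing on a pairwise chain of binary variables.
    import numpy as np

    def chain_marginals(unaries, pairwise):
        """unaries: list of length-2 arrays; pairwise[i]: 2x2 array coupling
        variables i and i+1. Returns the exact per-variable marginals."""
        n = len(unaries)
        # fwd[i]: message arriving at variable i from the left.
        fwd = [np.ones(2) for _ in range(n)]
        for i in range(1, n):
            fwd[i] = (unaries[i - 1] * fwd[i - 1]) @ pairwise[i - 1]
        # bwd[i]: message arriving at variable i from the right.
        bwd = [np.ones(2) for _ in range(n)]
        for i in range(n - 2, -1, -1):
            bwd[i] = pairwise[i] @ (unaries[i + 1] * bwd[i + 1])
        marginals = []
        for i in range(n):
            belief = unaries[i] * fwd[i] * bwd[i]
            marginals.append(belief / belief.sum())
        return marginals

    # Three binary variables with an attractive pairwise potential.
    unaries = [np.array([0.7, 0.3]), np.array([0.5, 0.5]), np.array([0.4, 0.6])]
    pairwise = [np.array([[2.0, 1.0], [1.0, 2.0]])] * 2
    print(chain_marginals(unaries, pairwise))
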
Week 3
1 hour to complete

MAP Algorithms

5 videos (Total 74 min), 1 quiz
5 videos
Dual Decomposition - Intuition (17m)
Dual Decomposition - Algorithm (16m)
1 practice exercise
MAP Message Passing (4m)
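
As another illustration, the following sketch computes a MAP assignment on the same kind of binary chain by max-product (Viterbi-style) message passing; this is exact MAP inference on a chain, not the dual decomposition method presented in the videos, and the potentials are again made up.

    # Max-product (MAP) message passing on a pairwise chain of binary variables.
    import numpy as np

    def chain_map(unaries, pairwise):
        """Return the MAP assignment for a pairwise chain model."""
        n = len(unaries)
        msg = [unaries[0]]   # msg[i][x]: best score of a prefix ending in state x
        back = []            # back[i][x]: best predecessor of state x at step i+1
        for i in range(1, n):
            scores = msg[i - 1][:, None] * pairwise[i - 1]   # (prev state, next state)
            back.append(scores.argmax(axis=0))
            msg.append(unaries[i] * scores.max(axis=0))
        # Backtrack from the best final state.
        assignment = [int(np.argmax(msg[-1]))]
        for i in range(n - 2, -1, -1):
            assignment.append(int(back[i][assignment[-1]]))
        return assignment[::-1]

    unaries = [np.array([0.7, 0.3]), np.array([0.5, 0.5]), np.array([0.4, 0.6])]
    pairwise = [np.array([[2.0, 1.0], [1.0, 2.0]])] * 2
    print(chain_map(unaries, pairwise))   # [0, 0, 0] for these potentials
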
Week 4
14 hours to complete

Sampling Methods

5 videos (Total 100 min), 3 quizzes
5 videos
Gibbs Sampling (19m)
Metropolis Hastings Algorithm (27m)
2 practice exercises
Sampling Methods (14m)
Sampling Methods PA Quiz (8m)
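
To give a flavor of the sampling material, here is a hedged sketch of Gibbs sampling on the same pairwise binary chain, estimating marginals from samples; the potentials, sample counts, and burn-in length are illustrative choices, not course defaults.

    # Gibbs sampling for a pairwise chain of binary variables.
    import numpy as np

    def gibbs_marginals(unaries, pairwise, n_samples=20000, burn_in=1000, seed=0):
        rng = np.random.default_rng(seed)
        n = len(unaries)
        x = rng.integers(0, 2, size=n)          # random initial assignment
        counts = np.zeros((n, 2))
        for t in range(burn_in + n_samples):
            for i in range(n):
                # Unnormalized conditional P(x_i | rest): the unary potential
                # times the pairwise potentials to the left and right neighbors.
                p = unaries[i].copy()
                if i > 0:
                    p = p * pairwise[i - 1][x[i - 1], :]
                if i < n - 1:
                    p = p * pairwise[i][:, x[i + 1]]
                x[i] = rng.choice(2, p=p / p.sum())
            if t >= burn_in:
                counts[np.arange(n), x] += 1
        return counts / n_samples

    unaries = [np.array([0.7, 0.3]), np.array([0.5, 0.5]), np.array([0.4, 0.6])]
    pairwise = [np.array([[2.0, 1.0], [1.0, 2.0]])] * 2
    print(gibbs_marginals(unaries, pairwise))   # approaches the exact marginals
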
26 minutes to complete

Inference in Temporal Models

1 video (Total 20 min), 1 quiz
1 practice exercise
Inference in Temporal Models (6m)
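
As a final illustration, the sketch below performs filtering in a simple temporal model, an HMM forward pass with per-step normalization; the transition and emission matrices are invented for the example and do not come from the course.

    # Forward (filtering) pass for a discrete hidden Markov model.
    import numpy as np

    def forward_filter(init, trans, emit, observations):
        """init: (K,) prior; trans: (K, K) with rows P(next | current);
        emit: (K, M) with rows P(observation | state).
        Returns P(state_t | obs_1..t) for every time step t."""
        belief = init * emit[:, observations[0]]
        belief = belief / belief.sum()
        beliefs = [belief]
        for obs in observations[1:]:
            belief = (belief @ trans) * emit[:, obs]
            belief = belief / belief.sum()
            beliefs.append(belief)
        return np.array(beliefs)

    init = np.array([0.5, 0.5])
    trans = np.array([[0.9, 0.1], [0.2, 0.8]])
    emit = np.array([[0.8, 0.2], [0.3, 0.7]])
    print(forward_filter(init, trans, emit, [0, 0, 1]))
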
4.6
54 Reviews

50%

started a new career after completing these courses

20%

got a tangible career benefit from this course

20%

got a pay increase or promotion

Top reviews from Probabilistic Graphical Models 2: Inference

By LL, Mar 12th 2017

Thanks a lot to Professor D.K. for this great course on the inference part of PGMs. It is a very good starting point for PGM modeling and good preparation for the learning part.

By YP, May 29th 2017

I learned a great deal from this course. It answered my questions from the representation course and deepened my understanding of PGMs.

Instructor

Daphne Koller

Professor
School of Engineering

About Stanford University

The Leland Stanford Junior University, commonly referred to as Stanford University or Stanford, is an American private research university located in Stanford, California, on an 8,180-acre (3,310 ha) campus near Palo Alto....

About the Probabilistic Graphical Models Specialization

Probabilistic graphical models (PGMs) are a rich framework for encoding probability distributions over complex domains: joint (multivariate) distributions over large numbers of random variables that interact with each other. These representations sit at the intersection of statistics and computer science, relying on concepts from probability theory, graph algorithms, machine learning, and more. They are the basis for the state-of-the-art methods in a wide variety of applications, such as medical diagnosis, image understanding, speech recognition, natural language processing, and many, many more. They are also a foundational tool in formulating many machine learning problems....
Probabilistic Graphical Models

Frequently Asked Questions

  • Once you enroll for a Certificate, you’ll have access to all videos, quizzes, and programming assignments (if applicable). Peer review assignments can only be submitted and reviewed once your session has begun. If you choose to explore the course without purchasing, you may not be able to access certain assignments.

  • When you enroll in the course, you get access to all of the courses in the Specialization, and you earn a certificate when you complete the work. Your electronic Certificate will be added to your Accomplishments page - from there, you can print your Certificate or add it to your LinkedIn profile. If you only want to read and view the course content, you can audit the course for free.

  • Execute the basic steps of a variable elimination or message passing algorithm

    Understand how properties of the graph structure influence the complexity of exact inference, and thereby estimate whether exact inference is likely to be feasible

    Go through the basic steps of an MCMC algorithm, both Gibbs sampling and Metropolis Hastings

    Understand how properties of the PGM influence the efficacy of sampling methods, and thereby estimate whether MCMC algorithms are likely to be effective

    Design Metropolis Hastings proposal distributions that are more likely to give good results

    Compute a MAP assignment by exact inference

    Honors track learners will be able to implement message passing algorithms and MCMC algorithms, and apply them to a real world problem

More questions? Visit the Learner Help Center.