About this Course
4.7
877 ratings
206 reviews
Probabilistic graphical models (PGMs) are a rich framework for encoding probability distributions over complex domains: joint (multivariate) distributions over large numbers of random variables that interact with each other. These representations sit at the intersection of statistics and computer science, relying on concepts from probability theory, graph algorithms, machine learning, and more. They are the basis for the state-of-the-art methods in a wide variety of applications, such as medical diagnosis, image understanding, speech recognition, natural language processing, and many, many more. They are also a foundational tool in formulating many machine learning problems. This course is the first in a sequence of three. It describes the two basic PGM representations: Bayesian Networks, which rely on a directed graph; and Markov networks, which use an undirected graph. The course discusses both the theoretical properties of these representations as well as their use in practice. The (highly recommended) honors track contains several hands-on assignments on how to represent some real-world problems. The course also presents some important extensions beyond the basic PGM representation, which allow more complex models to be encoded compactly....
100% online courses

Start instantly and learn at your own schedule.
Flexible deadlines

Reset deadlines in accordance with your schedule.
Advanced Level

Approx. 29 hours to complete

Suggested: 7 hours/week...
English

Subtitles: English...

Skills you will gain

Bayesian Network, Graphical Model, Markov Random Field

Syllabus - What you will learn from this course

Week 1

1 hour to complete

Introduction and Overview

This module provides an overall introduction to probabilistic graphical models, and defines a few of the key concepts that will be used later in the course....
4 videos (Total 35 min), 1 quiz
4 videos
Overview and Motivation (19m)
Distributions (4m)
Factors (6m)
1 practice exercise
Basic Definitions (8m)
10 hours to complete

Bayesian Network (Directed Models)

In this module, we define the Bayesian network representation and its semantics. We also analyze the relationship between the graph structure and the independence properties of a distribution represented over that graph. Finally, we give some practical tips on how to model a real-world situation as a Bayesian network....
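
As a taste of the semantics this module covers, here is a minimal Python sketch (not from the course materials; the two-variable network and its probabilities are invented for illustration) of how a Bayesian network's joint distribution factorizes into a product of CPDs and how a query is answered by summing out a variable.

    # Minimal sketch, not course code: the joint distribution of a Bayesian
    # network factorizes as a product of CPDs, one per variable given its
    # parents. Tiny illustrative network: Rain -> WetGrass, made-up numbers.

    P_rain = {True: 0.2, False: 0.8}                      # P(Rain)
    P_wet_given_rain = {True:  {True: 0.9, False: 0.1},   # P(WetGrass | Rain)
                        False: {True: 0.2, False: 0.8}}

    def joint(rain, wet):
        # P(Rain = rain, WetGrass = wet) = P(Rain) * P(WetGrass | Rain)
        return P_rain[rain] * P_wet_given_rain[rain][wet]

    # Marginalize out Rain: P(WetGrass = True) = 0.2*0.9 + 0.8*0.2 = 0.34
    print(sum(joint(r, True) for r in (True, False)))
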
15 videos (Total 190 min), 6 readings, 4 quizzes
15 videos
Reasoning Patterns (9m)
Flow of Probabilistic Influence (14m)
Conditional Independence (12m)
Independencies in Bayesian Networks (18m)
Naive Bayes (9m)
Application - Medical Diagnosis (9m)
Knowledge Engineering Example - SAMIAM (14m)
Basic Operations (13m)
Moving Data Around (16m)
Computing On Data (13m)
Plotting Data (9m)
Control Statements: for, while, if statements (12m)
Vectorization (13m)
Working on and Submitting Programming Exercises (3m)
6 readings
Setting Up Your Programming Assignment Environment (10m)
Installing Octave/MATLAB on Windows (10m)
Installing Octave/MATLAB on Mac OS X (10.10 Yosemite and 10.9 Mavericks) (10m)
Installing Octave/MATLAB on Mac OS X (10.8 Mountain Lion and Earlier) (10m)
Installing Octave/MATLAB on GNU/Linux (10m)
More Octave/MATLAB resources (10m)
3 practice exercises
Bayesian Network Fundamentals (6m)
Bayesian Network Independencies (10m)
Octave/Matlab installation (2m)
Week 2

1 hour to complete

Template Models for Bayesian Networks

In many cases, we need to model distributions that have a recurring structure. In this module, we describe representations for two such situations. One is temporal scenarios, where we want to model a probabilistic structure that holds constant over time; here, we use Hidden Markov Models, or, more generally, Dynamic Bayesian Networks. The other is aimed at scenarios that involve multiple similar entities, each of whose properties is governed by a similar model; here, we use Plate Models....
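
To make the idea of a temporal template concrete, here is a small Python sketch (not course material; the 2-state HMM parameters are invented) that unrolls an HMM over a short observation sequence and computes its likelihood with the forward recursion.

    # Minimal sketch, not course code: an HMM is a template model that is
    # unrolled over time. The forward pass below computes P(o_1, ..., o_T)
    # for a 2-state HMM with illustrative (made-up) parameters.

    states = (0, 1)
    init  = [0.6, 0.4]                      # P(X_1)
    trans = [[0.7, 0.3], [0.4, 0.6]]        # P(X_t | X_{t-1})
    emit  = [[0.9, 0.1], [0.2, 0.8]]        # P(O_t | X_t), observations in {0, 1}

    def forward(obs):
        # alpha[s] = P(o_1, ..., o_t, X_t = s), updated one time step at a time
        alpha = [init[s] * emit[s][obs[0]] for s in states]
        for o in obs[1:]:
            alpha = [sum(alpha[sp] * trans[sp][s] for sp in states) * emit[s][o]
                     for s in states]
        return sum(alpha)

    print(forward([0, 1, 0]))   # likelihood of the observation sequence 0, 1, 0
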
4 videos (Total 66 min), 1 quiz
4 videos
Temporal Models - DBNs (23m)
Temporal Models - HMMs (12m)
Plate Models (20m)
1 practice exercise
Template Models (20m)
11 hours to complete

Structured CPDs for Bayesian Networks

A table-based representation of a CPD in a Bayesian network has a size that grows exponentially in the number of parents. There are a variety of other forms of CPDs that exploit some type of structure in the dependency model to allow for a much more compact representation. Here we describe a number of the ones most commonly used in practice....
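
As a rough illustration of that compactness argument, the Python sketch below (not course code; the parameter values are invented) contrasts a noisy-OR CPD, one common independence-of-causal-influence model, with the exponentially sized table it replaces.

    # Minimal sketch, not course code: a full table CPD over n binary parents
    # needs 2**n parameter rows, whereas a noisy-OR CPD (one form of
    # "independence of causal influence") needs only one parameter per parent
    # plus a leak term. All numbers below are illustrative.

    def noisy_or_p_child_on(parent_values, fail_probs, p_off_when_no_parents=0.99):
        # Each active parent independently fails to turn the child on with
        # probability fail_probs[i]; the child stays off only if every active
        # parent fails (and the leak does not fire).
        p_off = p_off_when_no_parents
        for active, fail in zip(parent_values, fail_probs):
            if active:
                p_off *= fail
        return 1.0 - p_off

    fail_probs = [0.3, 0.2, 0.5]                       # 3 parameters for 3 parents
    print(noisy_or_p_child_on([1, 0, 1], fail_probs))  # P(child = 1 | parents)
    print(2 ** len(fail_probs))                        # a full table: 2**3 = 8 rows
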
4 videos (Total 49 min), 3 quizzes
4 videos
Tree-Structured CPDs (14m)
Independence of Causal Influence (13m)
Continuous Variables (13m)
2 practice exercises
Structured CPDs (8m)
BNs for Genetic Inheritance PA Quiz (22m)
Week 3

17 hours to complete

Markov Networks (Undirected Models)

In this module, we describe Markov networks (also called Markov random fields): probabilistic graphical models based on an undirected graph representation. We discuss the representation of these models and their semantics. We also analyze the independence properties of distributions encoded by these graphs, and their relationship to the graph structure. We compare these independencies to those encoded by a Bayesian network, giving us some insight on which type of model is more suitable for which scenarios....
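
For a concrete picture of the Gibbs distribution discussed in this module, the following Python sketch (not course code; the factor values are invented and simply favor agreeing neighbors) builds an unnormalized measure from two pairwise factors and normalizes it with the partition function.

    # Minimal sketch, not course code: a Gibbs distribution over X1, X2, X3 as a
    # product of pairwise factors, normalized by the partition function Z.

    from itertools import product

    phi_12 = {(a, b): 10.0 if a == b else 1.0 for a in (0, 1) for b in (0, 1)}
    phi_23 = {(b, c):  5.0 if b == c else 1.0 for b in (0, 1) for c in (0, 1)}

    def unnormalized(a, b, c):
        return phi_12[(a, b)] * phi_23[(b, c)]

    Z = sum(unnormalized(a, b, c) for a, b, c in product((0, 1), repeat=3))

    def prob(a, b, c):
        # P(a, b, c) = phi_12(a, b) * phi_23(b, c) / Z
        return unnormalized(a, b, c) / Z

    print(prob(1, 1, 1))
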
7 videos (Total 106 min), 3 quizzes
7 videos
General Gibbs Distribution (15m)
Conditional Random Fields (22m)
Independencies in Markov Networks (4m)
I-maps and perfect maps (20m)
Log-Linear Models (22m)
Shared Features in Log-Linear Models (8m)
2 practice exercises
Markov Networks (8m)
Independencies Revisited (6m)
Week 4

21 hours to complete

Decision Making

In this module, we discuss the task of decision making under uncertainty. We describe the framework of decision theory, including some aspects of utility functions. We then talk about how decision making scenarios can be encoded as a graphical model called an Influence Diagram, and how such models provide insight both into decision making and the value of information gathering....
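
To illustrate the kind of computation an influence diagram supports, here is a small Python sketch (not course code; the decision scenario, probabilities, and utilities are invented) that picks the action maximizing expected utility and computes the value of perfect information about the uncertain variable.

    # Minimal sketch, not course code: maximum expected utility for a single
    # decision, and the value of perfect information about the chance node.

    p_good = 0.4                                   # P(Market = good)
    utility = {('launch', True): 100, ('launch', False): -40,
               ('hold',   True):   0, ('hold',   False):   0}
    actions = ('launch', 'hold')

    def expected_utility(action):
        return p_good * utility[(action, True)] + (1 - p_good) * utility[(action, False)]

    meu = max(expected_utility(a) for a in actions)            # decide before observing
    meu_with_info = (p_good       * max(utility[(a, True)]  for a in actions) +
                     (1 - p_good) * max(utility[(a, False)] for a in actions))

    print(meu, meu_with_info, meu_with_info - meu)  # 16, 40, VPI = 24
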
3 videos (Total 61 min), 3 quizzes
3 videos
Utility Functions (18m)
Value of Perfect Information (17m)
2 practice exercises
Decision Theory (8m)
Decision Making PA Quiz (18m)
17% started a new career after completing these courses

83% got a tangible career benefit from this course

15% got a pay increase or promotion

Top Reviews

By ST, Jul 13th 2017

Prof. Koller did a great job communicating difficult material in an accessible manner. Thanks to her for starting Coursera and offering this advanced course so that we can all learn...Kudos!!

By CM, Oct 23rd 2017

The course was deep, and well-taught. This is not a spoon-feeding course like some others. The only downside were some "mechanical" problems (e.g. code submission didn't work for me).

Instructor

Daphne Koller

Professor
School of Engineering

About Stanford University

The Leland Stanford Junior University, commonly referred to as Stanford University or Stanford, is an American private research university located in Stanford, California, on an 8,180-acre (3,310 ha) campus near Palo Alto....

About the Probabilistic Graphical Models Specialization

Probabilistic graphical models (PGMs) are a rich framework for encoding probability distributions over complex domains: joint (multivariate) distributions over large numbers of random variables that interact with each other. These representations sit at the intersection of statistics and computer science, relying on concepts from probability theory, graph algorithms, machine learning, and more. They are the basis for the state-of-the-art methods in a wide variety of applications, such as medical diagnosis, image understanding, speech recognition, natural language processing, and many, many more. They are also a foundational tool in formulating many machine learning problems....
Probabilistic Graphical Models

Frequently Asked Questions

  • Once you enroll for a Certificate, you’ll have access to all videos, quizzes, and programming assignments (if applicable). Peer review assignments can only be submitted and reviewed once your session has begun. If you choose to explore the course without purchasing, you may not be able to access certain assignments.

  • When you enroll in the course, you get access to all of the courses in the Specialization, and you earn a certificate when you complete the work. Your electronic Certificate will be added to your Accomplishments page - from there, you can print your Certificate or add it to your LinkedIn profile. If you only want to read and view the course content, you can audit the course for free.

  • Apply the basic process of representing a scenario as a Bayesian network or a Markov network

    Analyze the independence properties implied by a PGM, and determine whether they are a good match for your distribution

    Decide which family of PGMs is more appropriate for your task

    Utilize extra structure in the local distribution for a Bayesian network to allow for a more compact representation, including tree-structured CPDs, logistic CPDs, and linear Gaussian CPDs

    Represent a Markov network in terms of features, via a log-linear model

    Encode temporal models as a Hidden Markov Model (HMM) or as a Dynamic Bayesian Network (DBN)

    Encode domains with repeating structure via a plate model

    Represent a decision making problem as an influence diagram, and be able to use that model to compute optimal decision strategies and information gathering strategies

    Honors track learners will be able to apply these ideas to complex, real-world problems

More questions? Visit the Learner Help Center.