Mastering Probabilistic Graphical Models with Python - Ankur Ankan - E-Book

Description

Probabilistic graphical models are a machine learning technique that uses concepts from graph theory to compactly represent probability distributions and make optimal predictions from data. In real-world problems, it is often difficult to select both an appropriate graphical model and an appropriate inference algorithm, choices that can make a huge difference in computation time and accuracy. It is therefore crucial to understand how these algorithms work.

This book starts with the basics of probability theory and graph theory, then goes on to discuss various models and inference algorithms. Each type of model is covered with code examples showing how to create and modify it, and how to run different inference algorithms on it. A complete chapter is devoted to the most widely used networks, the Naive Bayes model and Hidden Markov models (HMMs), both discussed thoroughly using real-world examples.

You can read this e-book in Legimi apps or in any app that supports the following formats:

EPUB
MOBI

Page count: 319

Year of publication: 2015




Table of Contents

Mastering Probabilistic Graphical Models Using Python
Credits
About the Authors
About the Reviewers
www.PacktPub.com
Support files, eBooks, discount offers, and more
Why subscribe?
Free access for Packt account holders
Preface
What this book covers
What you need for this book
Who this book is for
Conventions
Reader feedback
Customer support
Downloading the example code
Downloading the color images of this book
Errata
Piracy
Questions
1. Bayesian Network Fundamentals
Probability theory
Random variable
Independence and conditional independence
Installing tools
IPython
pgmpy
Representing independencies using pgmpy
Representing joint probability distributions using pgmpy
Conditional probability distribution
Representing CPDs using pgmpy
Graph theory
Nodes and edges
Walk, paths, and trails
Bayesian models
Representation
Factorization of a distribution over a network
Implementing Bayesian networks using pgmpy
Bayesian model representation
Reasoning pattern in Bayesian networks
D-separation
Direct connection
Indirect connection
Relating graphs and distributions
IMAP
IMAP to factorization
CPD representations
Deterministic CPDs
Context-specific CPDs
Tree CPD
Rule CPD
Summary
2. Markov Network Fundamentals
Introducing the Markov network
Parameterizing a Markov network – factor
Factor operations
Gibbs distributions and Markov networks
The factor graph
Independencies in Markov networks
Constructing graphs from distributions
Bayesian and Markov networks
Converting Bayesian models into Markov models
Converting Markov models into Bayesian models
Chordal graphs
Summary
3. Inference – Asking Questions to Models
Inference
Complexity of inference
Variable elimination
Analysis of variable elimination
Finding elimination ordering
Using the chordal graph property of induced graphs
Minimum fill/size/weight/search
Belief propagation
Clique tree
Constructing a clique tree
Message passing
Clique tree calibration
Message passing with division
Factor division
Querying variables that are not in the same cluster
MAP inference
MAP using variable elimination
Factor maximization
MAP using belief propagation
Finding the most probable assignment
Predictions from the model using pgmpy
A comparison of variable elimination and belief propagation
Summary
4. Approximate Inference
The optimization problem
The energy function
Exact inference as an optimization
The propagation-based approximation algorithm
Cluster graph belief propagation
Constructing cluster graphs
Pairwise Markov networks
Bethe cluster graph
Propagation with approximate messages
Message creation
Inference with approximate messages
Sum-product expectation propagation
Belief update propagation
MAP inference
Sampling-based approximate methods
Forward sampling
Conditional probability distribution
Likelihood weighting and importance sampling
Importance sampling
Importance sampling in Bayesian networks
Computing marginal probabilities
Ratio likelihood weighting
Normalized likelihood weighting
Markov chain Monte Carlo methods
Gibbs sampling
Markov chains
The multiple transitioning model
Using a Markov chain
Collapsed particles
Collapsed importance sampling
Summary
5. Model Learning – Parameter Estimation in Bayesian Networks
General ideas in learning
The goals of learning
Density estimation
Predicting the specific probability values
Knowledge discovery
Learning as an optimization
Empirical risk and overfitting
Discriminative versus generative training
Learning task
Model constraints
Data observability
Parameter learning
Maximum likelihood estimation
Maximum likelihood principle
The maximum likelihood estimate for Bayesian networks
Bayesian parameter estimation
Priors
Bayesian parameter estimation for Bayesian networks
Structure learning in Bayesian networks
Methods for the learning structure
Constraint-based structure learning
Structure score learning
The likelihood score
The Bayesian score
The Bayesian score for Bayesian networks
Summary
6. Model Learning – Parameter Estimation in Markov Networks
Maximum likelihood parameter estimation
Likelihood function
Log-linear model
Gradient ascent
Learning with approximate inference
Belief propagation and pseudo-moment matching
Structure learning
Constraint-based structure learning
Score-based structure learning
The likelihood score
Bayesian score
Summary
7. Specialized Models
The Naive Bayes model
Why does it even work?
Types of Naive Bayes models
Multivariate Bernoulli Naive Bayes model
Multinomial Naive Bayes model
Choosing the right model
Dynamic Bayesian networks
Assumptions
Discrete timeline assumption
The Markov assumption
Model representation
The Hidden Markov model
Generating an observation sequence
Computing the probability of an observation
The forward-backward algorithm
Computing the state sequence
Applications
The acoustic model
The language model
Summary
Index

Mastering Probabilistic Graphical Models Using Python

Copyright © 2015 Packt Publishing

All rights reserved. No part of this book may be reproduced, stored in a retrieval system, or transmitted in any form or by any means, without the prior written permission of the publisher, except in the case of brief quotations embedded in critical articles or reviews.

Every effort has been made in the preparation of this book to ensure the accuracy of the information presented. However, the information contained in this book is sold without warranty, either express or implied. Neither the authors, nor Packt Publishing, and its dealers and distributors will be held liable for any damages caused or alleged to be caused directly or indirectly by this book.

Packt Publishing has endeavored to provide trademark information about all of the companies and products mentioned in this book by the appropriate use of capitals. However, Packt Publishing cannot guarantee the accuracy of this information.

First published: July 2015

Production reference: 1280715

Published by Packt Publishing Ltd.

Livery Place

35 Livery Street

Birmingham B3 2PB, UK.

ISBN 978-1-78439-468-4

www.packtpub.com

Credits

Authors

Ankur Ankan

Abinash Panda

Reviewers

Matthieu Brucher

Dave (Jing) Tian

Xiao Xiao

Commissioning Editor

Kartikey Pandey

Acquisition Editors

Vivek Anantharaman

Sam Wood

Content Development Editor

Gaurav Sharma

Technical Editors

Ankita Thakur

Chinmay S. Puranik

Copy Editors

Shambhavi Pai

Swati Priya

Project Coordinator

Bijal Patel

Proofreader

Safis Editing

Indexer

Mariammal Chettiyar

Graphics

Disha Haria

Production Coordinator

Nilesh R. Mohite

Cover Work

Nilesh R. Mohite

About the Authors

Ankur Ankan is a BTech graduate from IIT (BHU), Varanasi. He is currently working in the field of data science. He is an open source enthusiast and his major work includes starting pgmpy with four other members. In his free time, he likes to participate in Kaggle competitions.

I would like to thank all the pgmpy contributors who have helped me in bringing it to its current stable state. Also, I would like to thank my parents for their relentless support in my endeavors.

Abinash Panda is an undergraduate from IIT (BHU), Varanasi, and is currently working as a data scientist. He has been a contributor to open source libraries such as the Shogun machine learning toolbox and pgmpy, which he started writing along with four other members. He spends most of his free time on improving pgmpy and helping new contributors.

I would like to thank all the pgmpy contributors. Also, I would like to thank my parents for their support. I am also grateful to all my batchmates of electronics engineering, the class of 2014, for motivating me.

About the Reviewers

Matthieu Brucher holds a master's degree from Ecole Supérieure d'Electricité (information, signals, measures), a master of computer science degree from the University of Paris XI, and a PhD in unsupervised manifold learning from the Université de Strasbourg, France. He is currently an HPC software developer at an oil company and works on next-generation reservoir simulation.

Dave (Jing) Tian is a graduate research fellow and a PhD student in the computer and information science and engineering (CISE) department at the University of Florida. He is a founding member of the Sensei center. His research involves system security, embedded systems security, trusted computing, and compilers. He is interested in Linux kernel hacking, compiler hacking, and machine learning. He also spent a year on AI and machine learning and taught Python and operating systems at the University of Oregon. Before that, he worked as a software developer in the Linux Control Platform (LCP) group at the Alcatel-Lucent (formerly, Lucent Technologies) R&D department for around 4 years. He received his bachelor's and master's degrees in electrical engineering in China. He can be reached via his blog at http://davejingtian.org and can be e-mailed at <[email protected]>.

Thanks to the authors of this book for doing a good job. I would also like to thank the editors of this book for making it perfect and giving me the opportunity to review such a nice book.

Xiao Xiao got her master's degree from the University of Oregon in 2014. Her research interest lies in probabilistic graphical models. Her previous project was to use probabilistic graphical models to predict human behavior to help people lose weight. Now, Xiao is working as a full-stack software engineer at Poshmark. She was also the reviewer of Building Probabilistic Graphical Models with Python, Packt Publishing.

www.PacktPub.com

Support files, eBooks, discount offers, and more

For support files and downloads related to your book, please visit www.PacktPub.com.

Did you know that Packt offers eBook versions of every book published, with PDF and ePub files available? You can upgrade to the eBook version at www.PacktPub.com and as a print book customer, you are entitled to a discount on the eBook copy. Get in touch with us at <[email protected]> for more details.

At www.PacktPub.com, you can also read a collection of free technical articles, sign up for a range of free newsletters and receive exclusive discounts and offers on Packt books and eBooks.

https://www2.packtpub.com/books/subscription/packtlib

Do you need instant solutions to your IT questions? PacktLib is Packt's online digital book library. Here, you can search, access, and read Packt's entire library of books.

Why subscribe?

Fully searchable across every book published by Packt
Copy and paste, print, and bookmark content
On demand and accessible via a web browser

Free access for Packt account holders

If you have an account with Packt at www.PacktPub.com, you can use this to access PacktLib today and view 9 entirely free books. Simply use your login credentials for immediate access.

Preface

This book focuses on the theoretical as well as practical uses of probabilistic graphical models, commonly known as PGMs. This is a technique in machine learning in which we use the probability distribution over different variables to learn the model. In this book, we have discussed the different types of networks that can be constructed and the various algorithms for doing inference or making predictions over these models. We have added examples wherever possible to make the concepts easier to understand. We have also included code examples to help you understand the concepts more effectively and apply them to real-life problems.

What this book covers

Chapter 1, Bayesian Network Fundamentals, discusses Bayesian networks (a type of graphical model), its representation, and the independence conditions that this type of network implies.

Chapter 2, Markov Network Fundamentals, discusses the other type of graphical model, known as the Markov network, its representation, and the independence conditions implied by it.

Chapter 3, Inference – Asking Questions to Models, discusses the various exact inference techniques used in graphical models to make predictions on new data points.

Chapter 4, Approximate Inference, discusses the various methods for doing approximate inference in graphical models. As doing exact inference in the case of many real-life problems is computationally very expensive, approximate methods give us a faster way to do inference in such problems.

Chapter 5, Model Learning – Parameter Estimation in Bayesian Networks, discusses the various methods to learn a Bayesian network using data points that we have observed. This chapter also discusses the various methods of learning the network structure with observed data.

Chapter 6, Model Learning – Parameter Estimation in Markov Networks, discusses various methods for learning parameters and network structure in the case of Markov networks.

Chapter 7, Specialized Models, discusses some special cases in Bayesian and Markov models that are very widely used in real-life problems, such as Naive Bayes, Hidden Markov models, and others.

What you need for this book

In this book, we have used IPython to run all the code examples. Using IPython is not mandatory, but we recommend it. Most of the code examples use pgmpy and scikit-learn. We have also used NumPy in places to generate random data.

Who this book is for

This book will be useful for researchers, machine learning enthusiasts, and people who are working in the data science field and have a basic idea of machine learning or graphical models. This book will help readers to understand the details of graphical models and use them in their day-to-day data science problems.

Reader feedback

Feedback from our readers is always welcome. Let us know what you think about this book—what you liked or disliked. Reader feedback is important for us as it helps us develop titles that you will really get the most out of.

To send us general feedback, simply e-mail <[email protected]>, and mention the book's title in the subject of your message.

If there is a topic that you have expertise in and you are interested in either writing or contributing to a book, see our author guide at www.packtpub.com/authors.

Customer support

Now that you are the proud owner of a Packt book, we have a number of things to help you to get the most from your purchase.

Downloading the example code

You can download the example code files from your account at http://www.packtpub.com for all the Packt Publishing books you have purchased. If you purchased this book elsewhere, you can visit http://www.packtpub.com/support and register to have the files e-mailed directly to you.

Downloading the color images of this book

We also provide you with a PDF file that has color images of the screenshots/diagrams used in this book. The color images will help you better understand the changes in the output. You can download this file from http://www.packtpub.com/sites/default/files/downloads/4684OS_ColorImages.pdf.

Errata

Although we have taken every care to ensure the accuracy of our content, mistakes do happen. If you find a mistake in one of our books—maybe a mistake in the text or the code—we would be grateful if you could report this to us. By doing so, you can save other readers from frustration and help us improve subsequent versions of this book. If you find any errata, please report them by visiting http://www.packtpub.com/submit-errata, selecting your book, clicking on the Errata Submission Form link, and entering the details of your errata. Once your errata are verified, your submission will be accepted and the errata will be uploaded to our website or added to any list of existing errata under the Errata section of that title.

To view the previously submitted errata, go to https://www.packtpub.com/books/content/support and enter the name of the book in the search field. The required information will appear under the Errata section.

Piracy

Piracy of copyrighted material on the Internet is an ongoing problem across all media. At Packt, we take the protection of our copyright and licenses very seriously. If you come across any illegal copies of our works in any form on the Internet, please provide us with the location address or website name immediately so that we can pursue a remedy.

Please contact us at <[email protected]> with a link to the suspected pirated material.

We appreciate your help in protecting our authors and our ability to bring you valuable content.

Questions

If you have a problem with any aspect of this book, you can contact us at <[email protected]>, and we will do our best to address the problem.

Installing tools

Let's now see some coding examples that use pgmpy to represent joint distributions and independencies. Here, we will mostly work with IPython and pgmpy (and a few other libraries) for coding examples. So, before moving ahead, let's get a basic introduction to these.

IPython

IPython is a command shell for interactive computing in multiple programming languages, originally developed for the Python programming language, which offers enhanced introspection, rich media, additional shell syntax, tab completion, and a rich history. IPython provides the following features:

Powerful interactive shells (terminal and Qt-based)
A browser-based notebook with support for code, text, mathematical expressions, inline plots, and other rich media
Support for interactive data visualization and use of GUI toolkits
Flexible and embeddable interpreters to load into one's own projects
Easy-to-use and high-performance tools for parallel computing

You can install IPython using the following command:

>>> pip3 install ipython

To start the IPython command shell, you can simply type ipython3 in the terminal. For more installation instructions, you can visit http://ipython.org/install.html.

pgmpy

pgmpy is a Python library for working with probabilistic graphical models. As it's currently not on PyPI, we will need to build it manually. You can get the source code from the Git repository using the following command:

>>> git clone https://github.com/pgmpy/pgmpy

Now, cd into the cloned directory, switch to the branch for the version used in this book, and build it with the following commands:

>>> cd pgmpy
>>> git checkout book/v0.1
>>> sudo python3 setup.py install

For more installation instructions, you can visit http://pgmpy.org/install.html.

With both IPython and pgmpy installed, you should now be able to run the examples in the book.
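To get a feel for what pgmpy will automate in the coming chapters, here is a minimal, library-free sketch of the core idea: building a joint distribution from the conditional probability distributions of a tiny two-node Bayesian network and answering queries by enumeration. The network (Rain → WetGrass) and all the numbers are illustrative assumptions, not examples from the book.

```python
# P(Rain): prior probability that it rained
p_rain = {True: 0.2, False: 0.8}

# P(WetGrass | Rain): one distribution over WetGrass per value of Rain
p_wet_given_rain = {
    True:  {True: 0.9, False: 0.1},
    False: {True: 0.1, False: 0.9},
}

# Joint distribution via the chain rule: P(R, W) = P(R) * P(W | R)
joint = {
    (r, w): p_rain[r] * p_wet_given_rain[r][w]
    for r in (True, False)
    for w in (True, False)
}

# Marginal P(WetGrass = True), obtained by summing out Rain
p_wet = sum(p for (r, w), p in joint.items() if w)

# Posterior P(Rain = True | WetGrass = True), by Bayes' rule
p_rain_given_wet = joint[(True, True)] / p_wet

print(round(p_wet, 4))             # 0.26
print(round(p_rain_given_wet, 4))  # 0.6923
```

Enumerating the joint like this works only for a handful of variables; the inference algorithms discussed in Chapter 3 exist precisely because the joint grows exponentially with the number of variables.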