Mutual conditional independence and its applications to model selection in Markov networks

Article Type

Research Article

Publication Title

Annals of Mathematics and Artificial Intelligence

Abstract

The fundamental concepts underlying Markov networks are conditional independence and the set of rules, called Markov properties, that translate conditional independence constraints into graphs. We introduce the concept of mutual conditional independence in an independent set of a Markov network, and we prove its equivalence to the Markov properties under certain regularity conditions. This extends the analogy between separation in graphs and conditional independence in probability to an analogy between mutual separation in graphs and mutual conditional independence in probability. Model selection in graphical models remains a challenging task due to the large search space. We show that the mutual conditional independence property can be exploited to reduce the search space. We present a new forward model selection algorithm for graphical log-linear models using mutual conditional independence. We illustrate our algorithm with a real-data example. We show that for sparse models the size of the search space can be reduced from O(n³) to O(n²) using our proposed forward selection method rather than the classical forward selection method. We also envision that this property can be leveraged for model selection and inference in other types of graphical models.
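To illustrate the kind of search-space reduction described in the abstract, below is a minimal, hypothetical sketch of greedy forward edge selection for an undirected graphical model in which candidate edges whose endpoints test as conditionally independent are pruned before scoring. This is not the authors' algorithm; the callables score_edge and is_cond_independent are illustrative placeholders that a user would supply.

```python
# Illustrative sketch only (not the paper's method): generic forward edge
# selection over a Markov network, with pruning of candidate edges whose
# endpoints are judged conditionally independent under the current model.
from itertools import combinations


def forward_select(nodes, score_edge, is_cond_independent, max_edges=None):
    """Greedy forward selection of edges for an undirected graphical model.

    score_edge(u, v, edges)          -> float, gain from adding edge (u, v)
    is_cond_independent(u, v, edges) -> bool, True if u and v are judged
                                        conditionally independent given the
                                        current model (placeholder test)
    """
    edges = set()
    candidates = set(combinations(sorted(nodes), 2))
    max_edges = max_edges if max_edges is not None else len(candidates)

    while candidates and len(edges) < max_edges:
        # Pruning step: drop candidates ruled out by an independence check.
        # This is where an independence-based criterion shrinks the set of
        # edges that must be scored at each iteration.
        candidates = {e for e in candidates
                      if not is_cond_independent(e[0], e[1], edges)}
        if not candidates:
            break
        best = max(candidates, key=lambda e: score_edge(e[0], e[1], edges))
        if score_edge(best[0], best[1], edges) <= 0:
            break  # no edge improves the model; stop
        edges.add(best)
        candidates.discard(best)
    return edges
```

In the classical scheme every remaining pair is rescored at every step; the pruning line sketches how an independence property could remove most pairs early, which is the intuition behind a reduction of the kind reported in the abstract.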

First Page

951

Last Page

972

DOI

10.1007/s10472-020-09690-7

Publication Date

September 1, 2020

Comments

Open Access, Hybrid Gold
