On the Information Theoretic Limits of Learning Ising Models

Rashish Tandon, Karthikeyan Shanmugam, Alexandros Dimakis, Pradeep Ravikumar

Abstract:   We provide a general framework for computing lower bounds on the sample complexity of recovering the underlying graphs of Ising models, given i.i.d. samples. While there have been recent results for specific graph classes, these involve fairly extensive technical arguments that are specialized to each graph class. In contrast, we isolate two key graph-structural ingredients that can then be used to specify sample complexity lower bounds; the presence of these structural properties is what makes a graph class hard to learn. We derive corollaries of our main result that not only recover existing results, but also provide lower bounds for novel graph classes not considered previously. We also extend our framework to the random graph setting and derive corollaries for Erdos-Renyi graphs in a certain dense setting.
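As a rough illustration of what such sample complexity lower bounds look like (this is the standard Fano-style argument, not the paper's specific theorem), suppose we restrict attention to an ensemble $\mathcal{G}$ of candidate graphs and draw $n$ i.i.d. samples from the Ising model $\mathbb{P}_G$ with graph $G \in \mathcal{G}$. Fano's inequality gives, for any estimator $\hat{G}$,

\[
  \max_{G \in \mathcal{G}} \Pr\bigl[\hat{G} \neq G\bigr] \;\ge\; 1 \;-\; \frac{n \,\displaystyle\max_{G, G' \in \mathcal{G}} D\bigl(\mathbb{P}_G \,\|\, \mathbb{P}_{G'}\bigr) \;+\; \log 2}{\log |\mathcal{G}|},
\]

so that driving the error below a constant $\delta$ requires

\[
  n \;\ge\; \frac{(1-\delta)\,\log|\mathcal{G}| \;-\; \log 2}{\displaystyle\max_{G, G' \in \mathcal{G}} D\bigl(\mathbb{P}_G \,\|\, \mathbb{P}_{G'}\bigr)}.
\]

Lower bounds of the type described in the abstract are obtained by exhibiting structured ensembles $\mathcal{G}$ that are simultaneously large (large $\log|\mathcal{G}|$) and hard to distinguish (small pairwise KL divergence); the specific constructions and constants are in the paper.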

Download: arXiv version

Citation

  • On the Information Theoretic Limits of Learning Ising Models (arXiv, software)
    R. Tandon, K. Shanmugam, A. Dimakis, P. Ravikumar.
    To appear in Neural Information Processing Systems (NIPS), December 2014.

    Bibtex: