• Peter Dayan
    Department of Computer Science, University of Toronto, 6 King's College Road, Toronto, Ontario M5S 1A4, Canada
  • Geoffrey E. Hinton
    Department of Computer Science, University of Toronto, 6 King's College Road, Toronto, Ontario M5S 1A4, Canada
  • Radford M. Neal
    Department of Computer Science, University of Toronto, 6 King's College Road, Toronto, Ontario M5S 1A4, Canada
  • Richard S. Zemel
    CNL, The Salk Institute, PO Box 85800, San Diego, CA 92186-5800 USA

Bibliographic Information

Publication Date
1995-09
DOI
  • 10.1162/neco.1995.7.5.889
Publisher
MIT Press - Journals

Description

Discovering the structure inherent in a set of patterns is a fundamental aim of statistical inference or learning. One fruitful approach is to build a parameterized stochastic generative model, independent draws from which are likely to produce the patterns. For all but the simplest generative models, each pattern can be generated in exponentially many ways. It is thus intractable to adjust the parameters to maximize the probability of the observed patterns. We describe a way of finessing this combinatorial explosion by maximizing an easily computed lower bound on the probability of the observations. Our method can be viewed as a form of hierarchical self-supervised learning that may relate to the function of bottom-up and top-down cortical processing pathways.
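The "easily computed lower bound" in the abstract is the variational free-energy bound: for a recognition distribution q(h | d) over hidden causes h of a pattern d, E_q[log p(h, d) − log q(h | d)] ≤ log p(d). A minimal NumPy sketch of estimating this bound for a one-hidden-layer binary model, where all sizes, weights, and the pattern are illustrative placeholders (this is only the bound computation, not the paper's training procedure):

```python
import numpy as np

rng = np.random.default_rng(0)

# Illustrative sizes: 3 binary hidden units, 5 binary visible units.
n_hidden, n_visible = 3, 5

# Generative model p(h) p(d | h): random small weights, for illustration only.
b_h = rng.normal(0, 0.1, n_hidden)               # prior biases on hidden units
W_g = rng.normal(0, 0.1, (n_visible, n_hidden))  # top-down generative weights
b_v = rng.normal(0, 0.1, n_visible)              # visible biases

# Recognition model q(h | d): bottom-up weights, also illustrative.
W_r = rng.normal(0, 0.1, (n_hidden, n_visible))
b_r = rng.normal(0, 0.1, n_hidden)

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

def log_bernoulli(bits, p):
    # log probability of a binary vector under independent Bernoulli units
    return np.sum(bits * np.log(p) + (1 - bits) * np.log(1 - p))

def lower_bound(d, n_samples=2000):
    """Monte Carlo estimate of E_q[log p(h, d) - log q(h | d)] <= log p(d)."""
    q = sigmoid(W_r @ d + b_r)  # recognition probabilities for each hidden unit
    total = 0.0
    for _ in range(n_samples):
        h = (rng.random(n_hidden) < q).astype(float)  # sample h ~ q(h | d)
        log_joint = (log_bernoulli(h, sigmoid(b_h))
                     + log_bernoulli(d, sigmoid(W_g @ h + b_v)))
        total += log_joint - log_bernoulli(h, q)
    return total / n_samples

d = np.array([1.0, 0.0, 1.0, 1.0, 0.0])  # an illustrative pattern
print(lower_bound(d))  # prints a Monte Carlo estimate of the bound
```

The point of the bound is that it sidesteps the exponential sum over the 2^n_hidden ways each pattern can be generated: the expectation is estimated by sampling from the tractable recognition model, and the gap between the bound and log p(d) is the KL divergence from q(h | d) to the true posterior.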


Cited By (34)
