ČSKI Newsletter - January 2004 [ pdf ]


date: 21 January 2004, 10:00
title: Parameter Reusing in Learning Latent Class Models
speaker: Gytis Karciauskas
venue: UTIA (Institute of Information Theory and Automation)

abstract: We address the problem of learning latent class (LC) models from data. An LC model consists of a hidden class variable and observed manifest variables. Each state of the class variable corresponds to a different component (class), and the manifest variables are assumed to be conditionally independent given the class variable. The parameters of an LC model are usually learned by running the EM algorithm from a random starting point. The problem with this approach is that EM often finds parameters that correspond to a local rather than the global maximum. A possible solution is to run EM from many random starting points, but this makes learning computationally very expensive. We propose instead to learn the model parameters by reusing the parameters of a model that contains fewer components: EM is initialized with parameters obtained by splitting one component of that model. We provide a theoretical justification for this approach and present experimental results showing that it performs better than the standard approach. We further propose to improve the parameter-reusing approach by merging components. Finally, we discuss possible extensions of our approach to learning more general models with hidden variables.
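
To make the idea concrete, the following is a minimal sketch in Python, assuming binary manifest variables; the function names and the particular way the split is perturbed are illustrative choices, not the speaker's implementation. It shows plain EM for an LC model and an initialization that splits one component of a smaller model so that the larger model starts from reused, rather than random, parameters.

import numpy as np

def em_lc(data, pi, theta, n_iter=100):
    """EM for a latent class model with binary manifest variables.
    data:  (N, M) array of 0/1 observations
    pi:    (K,) initial class priors
    theta: (K, M) initial P(X_m = 1 | class k)
    """
    N, M = data.shape
    for _ in range(n_iter):
        # E-step: posterior responsibility of each class for each case
        log_post = (data @ np.log(theta).T
                    + (1 - data) @ np.log(1 - theta).T
                    + np.log(pi))                       # (N, K)
        log_post -= log_post.max(axis=1, keepdims=True)
        resp = np.exp(log_post)
        resp /= resp.sum(axis=1, keepdims=True)
        # M-step: re-estimate priors and conditional probabilities (lightly smoothed)
        nk = resp.sum(axis=0)
        pi = nk / N
        theta = (resp.T @ data + 1e-3) / (nk[:, None] + 2e-3)
    return pi, theta

def split_component(pi, theta, k, eps=0.05, rng=None):
    """Initialize a (K+1)-component model by splitting component k:
    its weight is halved and its parameters are perturbed in two directions."""
    rng = np.random.default_rng(rng)
    noise = eps * rng.standard_normal(theta.shape[1])
    new_pi = np.concatenate([pi, [pi[k] / 2]])
    new_pi[k] /= 2
    new_theta = np.vstack([theta, np.clip(theta[k] + noise, 1e-3, 1 - 1e-3)])
    new_theta[k] = np.clip(theta[k] - noise, 1e-3, 1 - 1e-3)
    return new_pi, new_theta

# Example (hypothetical data): grow from 1 to 3 components by repeated splitting.
# data = np.random.binomial(1, 0.5, size=(500, 6))
# pi = np.array([1.0])
# theta = np.clip(data.mean(axis=0, keepdims=True), 0.05, 0.95)
# for _ in range(2):
#     pi, theta = split_component(pi, theta, k=int(np.argmax(pi)))
#     pi, theta = em_lc(data, pi, theta)

In this sketch one would alternately split a component and rerun EM until the desired number of classes is reached, and then compare the resulting likelihood with that obtained from many random EM restarts, which is the comparison the abstract describes.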