LEADER |
01731 am a22002173u 4500 |
001 |
63118 |
042 |
|
|
|a dc
|
100 |
1 |
0 |
|a Sontag, David Alexander
|e author
|
710 |
2 |
 |
|a Massachusetts Institute of Technology. Department of Electrical Engineering and Computer Science
|e contributor
|
700 |
1 |
0 |
|a Jaakkola, Tommi S.
|e contributor
|
700 |
1 |
0 |
|a Sontag, David Alexander
|e contributor
|
700 |
1 |
0 |
|a Jaakkola, Tommi S.
|e author
|
245 |
0 |
0 |
|a Tree block coordinate descent for MAP in graphical models
|
260 |
|
|
|b Journal of Machine Learning Research,
|c 2011-05-25T19:13:22Z.
|
856 |
|
|
|z Get fulltext
|u http://hdl.handle.net/1721.1/63118
|
520 |
|
|
|a abstract URL: http://jmlr.csail.mit.edu/proceedings/papers/v5/sontag09a.html
|
520 |
|
|
|a A number of linear programming relaxations have been proposed for finding most likely settings of the variables (MAP) in large probabilistic models. The relaxations are often succinctly expressed in the dual and reduce to different types of reparameterizations of the original model. The dual objectives are typically solved by performing local block coordinate descent steps. In this work, we show how to perform block coordinate descent on spanning trees of the graphical model. We also show how all of the earlier dual algorithms are related to each other, giving transformations from one type of reparameterization to another while maintaining monotonicity relative to a common objective function. Finally, we quantify when the MAP solution can and cannot be decoded directly from the dual LP relaxation.
|
546 |
|
|
|a en_US
|
655 |
7 |
|
|a Article
|
773 |
|
|
|t Proceedings of the 12th International Conference on Artificial Intelligence and Statistics (AISTATS) 2009
|