rsm is a modified Python implementation of the Replicated Softmax Model
of Salakhutdinov and Hinton (2009)
[PDF],
a simple single-layer "Deep Net" for documents.
This code is a modification of a Python implementation
by Joerg Landthaler,
http://www.fylance.de/rsm/,
in several aspects:
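The core of the model is a single contrastive-divergence (CD-1) update: visible word counts drive the hidden units, the hidden units redraw the document's words through a softmax shared across all positions, and the weights move toward the data statistics. The sketch below is an illustration of that update under the standard RSM formulation (hidden biases scaled by document length D); the function name and variable layout are my own, not taken from rsm.py.

```python
import numpy as np

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

def cd1_update(v, W, a, b, rate=0.001, rng=None):
    """One CD-1 update of a Replicated Softmax RBM on one document.

    v : length-K vector of word counts (K = lexicon size)
    W : K x H weight matrix
    a : K visible biases, b : H hidden biases
    Hidden biases are scaled by the document length D -- the
    "replicated" part of the model (names here are illustrative)."""
    if rng is None:
        rng = np.random.default_rng(0)
    D = v.sum()
    # positive phase: hidden probabilities given the observed counts
    h_pos = sigmoid(v @ W + D * b)
    h_samp = (rng.random(h_pos.shape) < h_pos).astype(float)
    # negative phase: redraw D words from the shared softmax visibles
    logits = W @ h_samp + a
    p = np.exp(logits - logits.max())
    p /= p.sum()
    v_neg = rng.multinomial(int(D), p).astype(float)
    h_neg = sigmoid(v_neg @ W + D * b)
    # approximate gradient ascent on the log-likelihood
    W += rate * (np.outer(v, h_pos) - np.outer(v_neg, h_neg))
    a += rate * (v - v_neg)
    b += rate * D * (h_pos - h_neg)
    return W, a, b
```

In practice rsm.py accumulates such gradients over a minibatch (the -b option) before applying them.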
% rsm.py -H 10 -N 20 -b 10 train model
loading data.. done.
number of documents        = 100
number of lexicon          = 1324
number of hidden variables = 10
number of learning epochs  = 20
number of CD iterations    = 1
minibatch size             = 10
learning rate              = 0.001
updates per epoch: 10 | total updates: 200
Epoch[ 0] : PPL = 994.07 [iter=1]
Epoch[ 1] : PPL = 508.56 [iter=1]
Epoch[ 2] : PPL = 400.47 [iter=1]
Epoch[ 3] : PPL = 353.96 [iter=1]
Epoch[ 4] : PPL = 334.94 [iter=1]
 :

For detailed usage of rsm.py, type
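The PPL figures in the training log are per-word perplexities. One common way to monitor this during RSM training is to take a single mean-field reconstruction of each document and exponentiate the average negative log-probability of its words; the sketch below assumes this proxy (I have not verified it is exactly what rsm.py prints), and the function name is mine.

```python
import numpy as np

def reconstruction_ppl(docs, W, a, b):
    """Per-word perplexity from one mean-field reconstruction.
    docs: list of length-K count vectors; W, a, b as in the RSM.
    This is a common training-time proxy, assumed (not verified)
    to match the PPL reported by rsm.py."""
    total_ll, total_words = 0.0, 0.0
    for v in docs:
        D = v.sum()
        # mean-field hidden activations, then the softmax over words
        h = 1.0 / (1.0 + np.exp(-(v @ W + D * b)))
        logits = W @ h + a
        shift = logits - logits.max()
        logp = shift - np.log(np.exp(shift).sum())  # log softmax
        total_ll += v @ logp
        total_words += D
    return np.exp(-total_ll / total_words)
```

For an untrained model the softmax is near uniform, so the perplexity starts out near the lexicon size (1324 here) and falls as training proceeds, matching the shape of the log above.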
% rsm.py -h
rsm.py, modified python implementation of Replicated Softmax Model.
$Id: rsm.py,v 1.7 2013/06/28 10:23:26 daichi Exp $
usage : rsm.py [options] train model
options:
 -H hiddens   number of hidden variables (default = 50)
 -N epochs    number of learning epochs (default = 1)
 -n iter      iterations of contrastive divergence (default = 1)
 -b batch     number of batch size (default = 1)
 -r rate      learning rate (default = 0.001)

or just execute rsm.py.
% rsmhidden.py model.nips test.dat
.*..*....*..*..*......*...*........***....*.......
.*..*.*.....+.........*........*..................
.**.*.-...............*.-.*......-................
.*..*..............+..*...+.........-.............
.*-.*.*...............*.+.*.......................
.*..*..............-..*........+-....+............
..-.-........-........*.-......*.........-..+.....
...............-........*.-....*.........*........
....*.................*...........................
.*..*.-.....*..-......*.+........-..-.............
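The grid above plausibly shows one row per test document and one character per hidden unit, with the symbols encoding activation strength. The rendering below is a hypothetical sketch of that idea: the band thresholds and the '.', '-', '+', '*' assignment are my guesses chosen only to mimic the look of the output, not taken from rsmhidden.py.

```python
def render_hidden(h_prob):
    """Render one document's hidden activation probabilities as a
    row of symbols, one character per hidden unit. The thresholds
    are hypothetical, picked to imitate rsmhidden.py's display."""
    def sym(p):
        if p > 0.75:
            return '*'   # strongly active unit
        if p > 0.6:
            return '+'
        if p > 0.4:
            return '-'
        return '.'       # inactive unit
    return ''.join(sym(p) for p in h_prob)
```

For example, render_hidden([0.9, 0.1, 0.5, 0.65]) yields "*.-+", one symbol per hidden unit.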