https://gist.github.com/dineshj1/839e08576d441944fd6f36ca6896453b
Tuesday, August 30, 2016
Sunday, August 28, 2016
Hyperparameter optimization on a cluster - job submission script
Job submission scripts can be written so as to make hyperparameter optimization easy.
I find it useful to loop over a set of hyperparameter configurations within a Matlab/Python script, and to simultaneously dump those parameters into a spreadsheet. I then copy the cells from that spreadsheet into a Google Sheets spreadsheet, one run per row. Later, I record the results in that same Google Sheets spreadsheet.
https://gist.github.com/dineshj1/7e38e6a68d6f7d81cc771ed77ce3d656
kitti227_submitjobs.m and kitti227_submitjobs.py are Matlab and Python job submission scripts that do similar things. The Python version is a little more elegant: it uses the Caffe Python interface to generate network prototxts automatically (see layer_stack.py).
job_submission_script_example.m is just another job submission Matlab script, similar to kitti227_submitjobs.m.
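The loop-and-dump pattern described above can be sketched as follows. The hyperparameter names and values are placeholders, and the submit_job call is a hypothetical stand-in for whatever cluster submission command you use; the CSV output is what gets pasted into Google Sheets, one run per row.

```python
import csv
import itertools

# Hypothetical hyperparameter grid (names and values are placeholders).
grid = {
    "base_lr": [0.01, 0.001],
    "weight_decay": [5e-4, 1e-4],
    "batch_size": [32, 64],
}

def sweep(grid):
    """Yield one dict per hyperparameter configuration."""
    keys = sorted(grid)
    for values in itertools.product(*(grid[k] for k in keys)):
        yield dict(zip(keys, values))

def dump_runs(configs, path):
    """Write one row per run; paste these cells into Google Sheets."""
    configs = list(configs)
    keys = sorted(configs[0])
    with open(path, "w", newline="") as f:
        writer = csv.writer(f)
        writer.writerow(["run_id"] + keys)
        for i, cfg in enumerate(configs):
            writer.writerow([i] + [cfg[k] for k in keys])
    return configs

if __name__ == "__main__":
    runs = dump_runs(sweep(grid), "runs.csv")
    for i, cfg in enumerate(runs):
        # submit_job(i, cfg)  # hypothetical cluster submission call
        print(i, cfg)
```

Recording the run_id in the sheet is what later lets you match results back to the exact configuration that produced them.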
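layer_stack.py itself is not reproduced here, but the idea of generating prototxts from a script can be sketched with plain string templating. This is a simplified stand-in for the Caffe NetSpec interface, not the actual layer_stack.py; the layer fields follow standard Caffe prototxt syntax.

```python
def ip_layer(name, bottom, num_output):
    """One InnerProduct layer in Caffe prototxt syntax."""
    return (
        "layer {\n"
        f'  name: "{name}"\n'
        '  type: "InnerProduct"\n'
        f'  bottom: "{bottom}"\n'
        f'  top: "{name}"\n'
        f"  inner_product_param {{ num_output: {num_output} }}\n"
        "}\n"
    )

def layer_stack(widths, data_name="data"):
    """Chain fully connected layers of the given widths into one prototxt string."""
    bottom, parts = data_name, []
    for i, w in enumerate(widths, 1):
        name = f"fc{i}"
        parts.append(ip_layer(name, bottom, w))
        bottom = name
    return "".join(parts)

if __name__ == "__main__":
    print(layer_stack([256, 128, 10]))
```

Generating the prototxt inside the sweep loop means each hyperparameter configuration (e.g., layer widths) gets its own network definition file without hand-editing.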
Torch-hdf5
Don't use luarocks install hdf5; it might install a different library. What you want is torch-hdf5:
https://github.com/deepmind/torch-hdf5/blob/master/doc/usage.md
To install it, clone the repository and type:
luarocks make hdf5-0-0.rockspec
Test installation with:
th -e "require 'hdf5'"
If it works, you're set. If it throws errors like "Error: unable to locate HDF5 header file at hdf5.h",
then edit HDF5_INCLUDE_PATH in $TORCHPATH/share/lua/5.1/hdf5/config.lua to point at your HDF5 headers, e.g.:
HDF5_INCLUDE_PATH = "/vision/vision_users/dineshj/local_installs/include/",
Friday, August 19, 2016
Caffe+dependencies installation with cudnn on vision/eldar machines (both with and without cuda)
https://gist.github.com/dineshj1/6f0371ba5615d7e9d50d870fd53c8d01
For this installation, install_caffe_new.sh was run entirely on vision machines. I am not sure whether installing the dependencies would work just as well on eldar; OpenCV in particular caused problems when installing from eldar-1.
PS: The cudnn version installs, and make runtest passes all the important tests, but it throws a segmentation fault during test teardown, when printing the date:
[----------] Global test environment tear-down
[==========] 2081 tests from 277 test cases ran. (812540 ms total)
[ PASSED ] 2081 tests.
*** Aborted at 1471619205 (unix time) try "date -d @1471619205" if you are using GNU date ***
PC: @ 0x2b8879ed77db (unknown)
*** SIGSEGV (@0x2b886bac1188) received by PID 29519 (TID 0x2b887324c480) from PID 1806438792; stack trace: ***
@ 0x2b8879e8fcb0 (unknown)
@ 0x2b8879ed77db (unknown)
@ 0x2b8879ed8ce8 (unknown)
@ 0x2b8879edc1dc __libc_calloc
@ 0x2b8885d91279 (unknown)
make: *** [runtest] Segmentation fault
Monday, August 15, 2016
Download full directories with wget
See: http://stackoverflow.com/a/273776/2009491
In particular, what you probably need is this:
wget -r --no-parent --reject "index.html*" http://example.com/configs/.vim/

Install osmesa on vision machines (in progress)
wget https://mesa.freedesktop.org/archive/current/mesa-11.0.7.tar.gz
dtrx mesa-11.0.7.tar.gz
cd mesa-11.0.7/
./configure --prefix=/vision/vision_users/dineshj/local_installs/ --disable-dri3
--> fails with an error saying libudev-dev or sysfs is required
I was trying to do this to view OFF files, but found ways to do that within Matlab instead.
Friday, August 5, 2016
Reduce the file size of an eps image in Linux (e.g., during arXiv submission)
It's not clear how to do this within Inkscape (even if you created the eps in Inkscape).
So open the file in Gimp, choosing the default resolution of 100.
Then export to eps from Gimp. This should already produce a much smaller file.
You can trade off compactness against quality by adjusting the resolution at the time of loading the eps into Gimp.