* nn refactor
* nn refactor
* fix code style issue
* change back the layers
* code refactor
* change tests to automated tests
* add more tests for model save
* remove some useless unit tests
* add more save tests
* add more writer tests
* rnn test case automation
* refine save test
* refine save unit test
* remove NHWC
* meet code review
* meet code review
* use MulConst in MulTF
* add a flatten node for tf 1.1
* fix code style and failed unit test
* move tf model layers to another package
* add a python example to show how to define a model in TensorFlow and run it in BigDL
* move tf example python code to example folder
* Add backward test in LeNet and AlexNet (#19)
* add unit test of lenet backward
* add some print
* add backward test in lenet and alexnet
* separate testModel into forward and backward methods
* using set instead of copy for output and gradInput
* define private method
* revise set method
* add condition before set method
* grammar revision
* set cellAppendStartIdx to cells.length
* GRU, last layer CAddTable(false)
* share memory in CAddTable for LSTM, GRU, SimpleRNN; share memory for Reverse layer
* revise javadoc
* revert GRU
* set default inPlace value to be false in Reverse layer
* make linear layerwise regularization work
* add regularizer option to layers with weights
* fix a bug
* new version
* include accRegularization in accGradParameters
* meet code review
* add layerwise regularization performance tests
* fix to meet code review
* make Regularizer.isRegularized private and optimize L1 regularization implementation
* fix failed Jenkins tests
* change l1 implementation
* improve layerwise regularization performance test
* improve layerwise regularization performance test
* delete layerwise regularization performance test
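The closing run of commits adds layerwise regularization whose gradient contribution is accumulated during the accGradParameters pass. A minimal sketch of that idea in plain Python, with illustrative names only (not BigDL's actual API): each layer adds its own L1 gradient (`l1 * sign(w)`) and L2 gradient (`l2 * w`) into the weight gradient it has already computed.

```python
def accumulate_regularization(weights, grad, l1=0.0, l2=0.0):
    """Add a layer's regularization gradient into its accumulated
    weight gradient, in the style of an accGradParameters pass.

    weights, grad: flat lists of floats for one layer
    l1, l2: per-layer regularization strengths (layerwise, so each
    layer can use different values, or 0.0 to skip the term)
    """
    for i, w in enumerate(weights):
        if l1:
            # Subgradient of l1 * |w| is l1 * sign(w); use 0 at w == 0
            grad[i] += l1 * (1.0 if w > 0 else -1.0 if w < 0 else 0.0)
        if l2:
            # Gradient of (l2 / 2) * w^2 is l2 * w
            grad[i] += l2 * w
    return grad


# Example: a layer with existing gradient [0.1, 0.1, 0.1]
g = accumulate_regularization([2.0, -3.0, 0.0], [0.1, 0.1, 0.1], l1=0.5)
```

Folding the regularization term into the same loop that accumulates the data gradient (rather than a separate pass over all parameters) is what makes the per-layer configuration cheap, and skipping the L1 branch when `l1 == 0.0` matches the performance concern in the later commits.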