diff --git a/docs/readthedocs/source/conf.py b/docs/readthedocs/source/conf.py
index 0991e694..a5ae5850 100644
--- a/docs/readthedocs/source/conf.py
+++ b/docs/readthedocs/source/conf.py
@@ -18,7 +18,7 @@
 import glob
 import shutil
 import urllib
 
-autodoc_mock_imports = ["openvino", "pytorch_lightning"]
+autodoc_mock_imports = ["openvino", "pytorch_lightning", "keras"]
 # documentation root, use os.path.abspath to make it absolute, like shown here.
 #sys.path.insert(0, '.')
diff --git a/docs/readthedocs/source/doc/PythonAPI/Chronos/forecasters.rst b/docs/readthedocs/source/doc/PythonAPI/Chronos/forecasters.rst
index cc16b46b..b28aacca 100644
--- a/docs/readthedocs/source/doc/PythonAPI/Chronos/forecasters.rst
+++ b/docs/readthedocs/source/doc/PythonAPI/Chronos/forecasters.rst
@@ -4,43 +4,76 @@ Forecasters
 LSTMForecaster
 ----------------------------------------
 
-:strong:`Please refer to` `BasePytorchForecaster `__ :strong:`for other methods other than initialization`.
-
 Long short-term memory(LSTM) is a special type of recurrent neural network(RNN).
 We implement the basic version of LSTM - VanillaLSTM for this forecaster for time-series forecasting task. It has two LSTM layers, two dropout layer and a dense layer.
 For the detailed algorithm description, please refer to `here `__.
 
+`version:pytorch`
+
+:strong:`Please refer to` `BasePytorchForecaster `__ :strong:`for other methods other than initialization`.
+
 .. automodule:: bigdl.chronos.forecaster.lstm_forecaster
     :members:
     :undoc-members:
     :show-inheritance:
 
+`version:tensorflow`
+
+:strong:`Please refer to` `BaseTF2Forecaster `__ :strong:`for other methods other than initialization`.
+
+.. automodule:: bigdl.chronos.forecaster.tf.lstm_forecaster
+    :members:
+    :undoc-members:
+    :show-inheritance:
+
 Seq2SeqForecaster
 -------------------------------------------
 
-:strong:`Please refer to` `BasePytorchForecaster `__ :strong:`for other methods other than initialization`.
-
 Seq2SeqForecaster wraps a sequence to sequence model based on LSTM,
 and is suitable for multivariant & multistep time series forecasting.
 
+`version:pytorch`
+
+:strong:`Please refer to` `BasePytorchForecaster `__ :strong:`for other methods other than initialization`.
+
 .. automodule:: bigdl.chronos.forecaster.seq2seq_forecaster
     :members:
     :undoc-members:
     :show-inheritance:
 
+`version:tensorflow`
+
+:strong:`Please refer to` `BaseTF2Forecaster `__ :strong:`for other methods other than initialization`.
+
+.. automodule:: bigdl.chronos.forecaster.tf.seq2seq_forecaster
+    :members:
+    :undoc-members:
+    :show-inheritance:
+
 TCNForecaster
 ----------------------------------------
 
-:strong:`Please refer to` `BasePytorchForecaster `__ :strong:`for other methods other than initialization`.
-
 Temporal Convolutional Networks (TCN) is a neural network that use convolutional architecture rather than recurrent networks.
 It supports multi-step and multi-variant cases.
 Causal Convolutions enables large scale parallel computing which makes TCN has less inference time than RNN based model such as LSTM.
 
+`version:pytorch`
+
+:strong:`Please refer to` `BasePytorchForecaster `__ :strong:`for other methods other than initialization`.
+
 .. automodule:: bigdl.chronos.forecaster.tcn_forecaster
     :members:
     :undoc-members:
     :show-inheritance:
 
+`version:tensorflow`
+
+:strong:`Please refer to` `BaseTF2Forecaster `__ :strong:`for other methods other than initialization`.
+
+.. automodule:: bigdl.chronos.forecaster.tf.tcn_forecaster
+    :members:
+    :undoc-members:
+    :show-inheritance:
+
 NBeatsForecaster
 ----------------------------------------
@@ -75,7 +108,6 @@ TCMFForecaster supports distributed training and inference. It is based on Orca
     :undoc-members:
     :show-inheritance:
 
-
 MTNetForecaster
 ----------------------------------------
 
@@ -113,19 +145,20 @@ For the detailed algorithm description, please refer to `here
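For context on the conf.py hunk: Sphinx's `autodoc_mock_imports` swaps the listed packages for mock modules at docs-build time, so autodoc can import modules such as `bigdl.chronos.forecaster.tf.lstm_forecaster` (which pull in keras/tensorflow) on a build machine that does not have those heavy dependencies installed. A minimal sketch of the idea using only the standard library — the `sys.modules`/`MagicMock` approach illustrates the mechanism, it is not Sphinx's actual implementation (Sphinx's mock additionally supports things like subclassing):

```python
import sys
from unittest.mock import MagicMock

# Emulate the effect of autodoc_mock_imports: register stand-in modules
# so that "import keras" succeeds even when the real package is absent.
for mod_name in ["openvino", "pytorch_lightning", "keras"]:
    sys.modules[mod_name] = MagicMock()

import keras  # resolved from sys.modules; no real keras installation needed

# Attribute access and calls on the mock succeed, so most module-level
# statements in the documented source do not crash at import time.
layer = keras.layers.Dense(10)
print(isinstance(layer, MagicMock))  # prints True
```

This is why "keras" had to be added to the list: without it, importing the new `bigdl.chronos.forecaster.tf.*` modules would fail on the Read the Docs builder.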
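The TCNForecaster description in the .rst above leans on causal convolutions. As a side note (this is not BigDL code), a causal 1-D convolution pads only on the left, so output `t` depends only on inputs at times `<= t`; every output position can then be computed independently, which is the parallelism the paragraph contrasts with step-by-step RNN inference. A pure-Python sketch with made-up kernel and series values:

```python
def causal_conv1d(series, kernel):
    """Causal 1-D convolution: out[t] depends only on inputs at times <= t."""
    k = len(kernel)
    # Left-pad with zeros so the output has the same length as the input
    # and never reads values from the future.
    padded = [0.0] * (k - 1) + list(series)
    return [
        sum(kernel[j] * padded[t + j] for j in range(k))
        for t in range(len(series))
    ]

# Each output is an average of the current and previous input value.
out = causal_conv1d([1.0, 2.0, 3.0, 4.0], [0.5, 0.5])
print(out)  # [0.5, 1.5, 2.5, 3.5] -- out[0] sees only the first input
```

Note that every element of `out` could be computed in parallel, whereas an LSTM must process the series one step at a time.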