From 27302a4e5f674bc289e5f0f6e096c583aec2d709 Mon Sep 17 00:00:00 2001
From: liangs6212 <80952198+liangs6212@users.noreply.github.com>
Date: Tue, 16 Nov 2021 11:24:07 +0800
Subject: [PATCH] Chronos: Forecaster in doc shows more friendly (#3441)

* add hyperlinks
* fix style error
* remove space
* remove typo "mtnet"
---
 .../source/doc/PythonAPI/Chronos/forecasters.rst | 6 +++---
 1 file changed, 3 insertions(+), 3 deletions(-)

diff --git a/docs/readthedocs/source/doc/PythonAPI/Chronos/forecasters.rst b/docs/readthedocs/source/doc/PythonAPI/Chronos/forecasters.rst
index 93951d43..41858d1c 100644
--- a/docs/readthedocs/source/doc/PythonAPI/Chronos/forecasters.rst
+++ b/docs/readthedocs/source/doc/PythonAPI/Chronos/forecasters.rst
@@ -4,7 +4,7 @@ Forecasters
 LSTMForecaster
 ----------------------------------------
 
-Please refer to BasePytorchForecaster for other methods other than initialization.
+:strong:`Please refer to` `BasePytorchForecaster `__ :strong:`for other methods other than initialization`.
 
 Long short-term memory(LSTM) is a special type of recurrent neural network(RNN). We implement the basic version of LSTM - VanillaLSTM for this forecaster for time-series forecasting task. It has two LSTM layers, two dropout layer and a dense layer.
@@ -19,7 +19,7 @@ For the detailed algorithm description, please refer to `here `__
 Seq2SeqForecaster
 ----------------------------------------
 
-Please refer to BasePytorchForecaster for other methods other than initialization.
+:strong:`Please refer to` `BasePytorchForecaster `__ :strong:`for other methods other than initialization`.
 
 Seq2SeqForecaster wraps a sequence to sequence model based on LSTM, and is suitable for multivariant & multistep time series forecasting.
@@ -32,7 +32,7 @@ Seq2SeqForecaster wraps a sequence to sequence model based on LSTM, and is suita
 TCNForecaster
 ----------------------------------------
 
-Please refer to BasePytorchForecaster for other methods other than initialization.
+:strong:`Please refer to` `BasePytorchForecaster `__ :strong:`for other methods other than initialization`.
 
 Temporal Convolutional Network (TCN) is a neural network that uses a convolutional architecture rather than recurrent layers. It supports multi-step and multi-variant cases. Causal convolutions enable large-scale parallel computation, which gives TCN lower inference time than RNN-based models such as LSTM.
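The "causal convolution" idea behind the TCN paragraph above can be sketched with a toy example. This is not Chronos or BigDL code; `causal_conv1d` and its arguments are hypothetical names used only to illustrate why each output step depends solely on past inputs and can therefore be computed in parallel:

```python
def causal_conv1d(x, weights, dilation=1):
    # Causal 1-D convolution: output at step t mixes inputs at steps
    # <= t only, enforced by left-padding with zeros.  Each output
    # step is independent of the others, so the whole sequence can be
    # computed in parallel, unlike an RNN's sequential recurrence.
    k = len(weights)
    pad = (k - 1) * dilation
    xp = [0.0] * pad + list(x)
    return [
        sum(w * xp[pad + t - j * dilation] for j, w in enumerate(weights))
        for t in range(len(x))
    ]

series = [1.0, 2.0, 3.0, 4.0, 5.0]
print(causal_conv1d(series, [1.0, 1.0]))              # -> [1.0, 3.0, 5.0, 7.0, 9.0]
print(causal_conv1d(series, [1.0, 1.0], dilation=2))  # -> [1.0, 2.0, 4.0, 6.0, 8.0]
```

Stacking such layers with growing dilation is what lets a TCN cover a long history without recurrence.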