diff --git a/docs/readthedocs/source/doc/Chronos/Overview/deep_dive.rst b/docs/readthedocs/source/doc/Chronos/Overview/deep_dive.rst new file mode 100644 index 00000000..11fa4673 --- /dev/null +++ b/docs/readthedocs/source/doc/Chronos/Overview/deep_dive.rst @@ -0,0 +1,18 @@ +Chronos Deep Dive +================= + +* `Time Series Processing and Feature Engineering <./data_processing_feature_engineering.html>`__ introduces how to load a built-in/customized dataset and carry out transformation and feature engineering on it. +* `Time Series Forecasting <./forecasting.html>`__ introduces how to build a time series forecasting application. +* `Time Series Anomaly Detection <./anomaly_detection.html>`__ introduces how to build an anomaly detection application. +* `Generate Synthetic Sequential Data <./simulation.html>`__ introduces how to build a synthetic data generation application. +* `Useful Functionalities <./useful_functionalities.html>`__ introduces some functionalities provided by Chronos that can help you improve accuracy/performance or scale the application to larger data. + +.. toctree:: + :maxdepth: 1 + :hidden: + + data_processing_feature_engineering.md + forecasting.md + anomaly_detection.md + simulation.md + useful_functionalities.md diff --git a/docs/readthedocs/source/doc/Chronos/QuickStart/chronos-anomaly-detector.md b/docs/readthedocs/source/doc/Chronos/QuickStart/chronos-anomaly-detector.md index d28c39a4..0a0b0190 100644 --- a/docs/readthedocs/source/doc/Chronos/QuickStart/chronos-anomaly-detector.md +++ b/docs/readthedocs/source/doc/Chronos/QuickStart/chronos-anomaly-detector.md @@ -1,4 +1,4 @@ -# Anomaly Detector Quickstart +# Detect Anomaly Point in Real Time Traffic Data --- diff --git a/docs/readthedocs/source/doc/Chronos/QuickStart/chronos-autotsest-quickstart.md b/docs/readthedocs/source/doc/Chronos/QuickStart/chronos-autotsest-quickstart.md index a98acab3..0486bf22 100644 --- a/docs/readthedocs/source/doc/Chronos/QuickStart/chronos-autotsest-quickstart.md +++ b/docs/readthedocs/source/doc/Chronos/QuickStart/chronos-autotsest-quickstart.md @@ -1,4 +1,4 @@ -# AutoTSEstimator Quickstart +# Tune a Forecasting Task Automatically --- diff --git a/docs/readthedocs/source/doc/Chronos/QuickStart/chronos-tsdataset-forecaster-quickstart.md b/docs/readthedocs/source/doc/Chronos/QuickStart/chronos-tsdataset-forecaster-quickstart.md index f0a59d88..de2d6c79 100644 --- a/docs/readthedocs/source/doc/Chronos/QuickStart/chronos-tsdataset-forecaster-quickstart.md +++ b/docs/readthedocs/source/doc/Chronos/QuickStart/chronos-tsdataset-forecaster-quickstart.md @@ -1,4 +1,4 @@ -# TSDataset and Forecaster Quickstarts +# Predict Number of Taxi Passengers with Chronos Forecaster --- diff --git a/docs/readthedocs/source/doc/Chronos/QuickStart/index.md b/docs/readthedocs/source/doc/Chronos/QuickStart/index.md new file mode 100644 index 00000000..b3e719ea --- /dev/null +++ b/docs/readthedocs/source/doc/Chronos/QuickStart/index.md @@ -0,0 +1,98 @@ +# Chronos Tutorial + +- [**Predict Number of Taxi Passengers with Chronos Forecaster**](./chronos-tsdataset-forecaster-quickstart.html) + + > ![](../../../../image/colab_logo_32px.png)[Run in Google Colab](https://colab.research.google.com/github/intel-analytics/BigDL/blob/branch-2.0/python/chronos/colab-notebook/chronos_nyc_taxi_tsdataset_forecaster.ipynb)  ![](../../../../image/GitHub-Mark-32px.png)[View source on GitHub](https://github.com/intel-analytics/BigDL/blob/branch-2.0/python/chronos/colab-notebook/chronos_nyc_taxi_tsdataset_forecaster.ipynb) + + In this guide we will demonstrate how to use _Chronos TSDataset_ and _Chronos Forecaster_ for time series processing and predict the number of taxi passengers.
+ +--------------------------- + +- [**Tune a Forecasting Task Automatically**](./chronos-autotsest-quickstart.html) + + > ![](../../../../image/colab_logo_32px.png)[Run in Google Colab](https://colab.research.google.com/github/intel-analytics/BigDL/blob/branch-2.0/python/chronos/colab-notebook/chronos_autots_nyc_taxi.ipynb)  ![](../../../../image/GitHub-Mark-32px.png)[View source on GitHub](https://github.com/intel-analytics/BigDL/blob/branch-2.0/python/chronos/colab-notebook/chronos_autots_nyc_taxi.ipynb) + + In this guide we will demonstrate how to use _Chronos AutoTSEstimator_ and _Chronos TSPipeline_ to auto tune a time series forecasting task and handle the whole model development process easily. + +--------------------------- + +- [**Detect Anomaly Point in Real Time Traffic Data**](./chronos-anomaly-detector.html) + + > ![](../../../../image/colab_logo_32px.png)[Run in Google Colab](https://colab.research.google.com/github/intel-analytics/BigDL/blob/branch-2.0/python/chronos/colab-notebook/chronos_minn_traffic_anomaly_detector.ipynb)  ![](../../../../image/GitHub-Mark-32px.png)[View source on GitHub](https://github.com/intel-analytics/BigDL/blob/branch-2.0/python/chronos/colab-notebook/chronos_minn_traffic_anomaly_detector.ipynb) + + In this guide we will demonstrate how to use _Chronos Anomaly Detector_ to detect anomalies in real-time traffic data from the Twin Cities Metro area in Minnesota. + +--------------------------- + +- [**Tune a Customized Time Series Forecasting Model with AutoTSEstimator**](https://github.com/intel-analytics/BigDL/blob/branch-2.0/python/chronos/use-case/network_traffic/network_traffic_autots_customized_model.ipynb) + + > ![](../../../../image/GitHub-Mark-32px.png)[View source on GitHub](https://github.com/intel-analytics/BigDL/blob/branch-2.0/python/chronos/use-case/network_traffic/network_traffic_autots_customized_model.ipynb) + + In this notebook, we demonstrate a reference use case where we use the network traffic KPI(s) in the past to predict traffic KPI(s) in the future. We demonstrate how to use _AutoTSEstimator_ to adjust the parameters of a customized model. + +--------------------------- + +- [**Auto Tune the Prediction of Network Traffic at the Transit Link of WIDE**](https://github.com/intel-analytics/BigDL/blob/branch-2.0/python/chronos/use-case/network_traffic/network_traffic_autots_forecasting.ipynb) + + > ![](../../../../image/GitHub-Mark-32px.png)[View source on GitHub](https://github.com/intel-analytics/BigDL/blob/branch-2.0/python/chronos/use-case/network_traffic/network_traffic_autots_forecasting.ipynb) + + In this notebook, we demonstrate a reference use case where we use the network traffic KPI(s) in the past to predict traffic KPI(s) in the future. We demonstrate how to use _AutoTS_ in project [Chronos][4] to do time series forecasting in an automated and distributed way. + +--------------------------- + +- [**Multivariate Forecasting of Network Traffic at the Transit Link of WIDE**](https://github.com/intel-analytics/BigDL/blob/branch-2.0/python/chronos/use-case/network_traffic/network_traffic_model_forecasting.ipynb) + + > ![](../../../../image/GitHub-Mark-32px.png)[View source on GitHub](https://github.com/intel-analytics/BigDL/blob/branch-2.0/python/chronos/use-case/network_traffic/network_traffic_model_forecasting.ipynb) + + In this notebook, we demonstrate a reference use case where we use the network traffic KPI(s) in the past to predict traffic KPI(s) in the future.
We demonstrate how to do univariate forecasting (predict only 1 series) and multivariate forecasting (predict more than 1 series at the same time) using Project [Chronos][4]. + +--------------------------- + +- [**Multistep Forecasting of Network Traffic at the Transit Link of WIDE**](https://github.com/intel-analytics/BigDL/blob/branch-2.0/python/chronos/use-case/network_traffic/network_traffic_multivariate_multistep_tcnforecaster.ipynb) + + > ![](../../../../image/GitHub-Mark-32px.png)[View source on GitHub](https://github.com/intel-analytics/BigDL/blob/branch-2.0/python/chronos/use-case/network_traffic/network_traffic_multivariate_multistep_tcnforecaster.ipynb) + + In this notebook, we demonstrate a reference use case where we use the network traffic KPI(s) in the past to predict traffic KPI(s) in the future. We demonstrate how to do multivariate multistep forecasting using Project [Chronos][4]. + +--------------------------- + +- [**Stock Price Prediction with LSTMForecaster**](https://github.com/intel-analytics/BigDL/blob/branch-2.0/python/chronos/use-case/fsi/stock_prediction.ipynb) + + > ![](../../../../image/GitHub-Mark-32px.png)[View source on GitHub](https://github.com/intel-analytics/BigDL/blob/branch-2.0/python/chronos/use-case/fsi/stock_prediction.ipynb) + + In this notebook, we demonstrate a reference use case where we use historical stock price data to predict the future price. The dataset we use is the daily stock price of S&P500 stocks during 2013-2018 ([data source](https://www.kaggle.com/camnugent/sandp500/)). We demonstrate how to do univariate forecasting using the past 80% of the total days' MMM price to predict the future 20% days' daily price. + + Reference: ** + +--------------------------- + +- [**Stock Price Prediction with ProphetForecaster and AutoProphet**](https://github.com/intel-analytics/BigDL/blob/branch-2.0/python/chronos/use-case/fsi/stock_prediction_prophet.ipynb) + + > ![](../../../../image/GitHub-Mark-32px.png)[View source on GitHub](https://github.com/intel-analytics/BigDL/blob/branch-2.0/python/chronos/use-case/fsi/stock_prediction_prophet.ipynb) + + In this notebook, we demonstrate a reference use case where we use historical stock price data to predict the future price using the ProphetForecaster and AutoProphet. The dataset we use is the daily stock price of S&P500 stocks during 2013-2018 [data source](https://www.kaggle.com/camnugent/sandp500/). + + Reference: **, ** + +--------------------------- + +- [**Unsupervised Anomaly Detection for CPU Usage**](https://github.com/intel-analytics/BigDL/blob/branch-2.0/python/chronos/use-case/AIOps/AIOps_anomaly_detect_unsupervised.ipynb) + + > ![](../../../../image/GitHub-Mark-32px.png)[View source on GitHub](https://github.com/intel-analytics/BigDL/blob/branch-2.0/python/chronos/use-case/AIOps/AIOps_anomaly_detect_unsupervised.ipynb) + + We demonstrate how to perform anomaly detection based on Chronos's built-in [DBScanDetector][DBScan], [AEDetector][AE] and [ThresholdDetector][Threshold]. + +--------------------------- + +- [**Anomaly Detection for CPU Usage Based on Forecasters**](https://github.com/intel-analytics/BigDL/blob/branch-2.0/python/chronos/use-case/AIOps/AIOps_anomaly_detect_unsupervised_forecast_based.ipynb) + + > ![](../../../../image/GitHub-Mark-32px.png)[View source on GitHub](https://github.com/intel-analytics/BigDL/blob/branch-2.0/python/chronos/use-case/AIOps/AIOps_anomaly_detect_unsupervised_forecast_based.ipynb) + + We demonstrate how to leverage Chronos's built-in models, i.e. MTNet, to do time series forecasting.
Then perform anomaly detection on predicted value with [ThresholdDetector][Threshold]. + + +[DBScan]: +[AE]: +[Threshold]: +[4]: + diff --git a/docs/readthedocs/source/doc/DLlib/Overview/keras-api.md b/docs/readthedocs/source/doc/DLlib/Overview/keras-api.md index 3584e85a..30184ef4 100644 --- a/docs/readthedocs/source/doc/DLlib/Overview/keras-api.md +++ b/docs/readthedocs/source/doc/DLlib/Overview/keras-api.md @@ -6,9 +6,9 @@ To define a model in Scala using the Keras-like API, one just needs to import the following packages: ```scala -import com.intel.analytics.zoo.pipeline.api.keras.layers._ -import com.intel.analytics.zoo.pipeline.api.keras.models._ -import com.intel.analytics.bigdl.utils.Shape +import com.intel.analytics.bigdl.dllib.keras.layers._ +import com.intel.analytics.bigdl.dllib.keras.models._ +import com.intel.analytics.bigdl.dllib.utils.Shape ``` One of the highlighted features with regard to the new API is __shape inference__. Users only need to specify the input shape (a `Shape` object __excluding__ batch dimension, for example, `inputShape=Shape(3, 4)` for 3D input) for the first layer of a model and for the remaining layers, the input dimension will be automatically inferred. @@ -19,9 +19,9 @@ Here we use the Keras-like API to define a LeNet CNN model and train it on the M ```scala import com.intel.analytics.bigdl.numeric.NumericFloat -import com.intel.analytics.zoo.pipeline.api.keras.layers._ -import com.intel.analytics.zoo.pipeline.api.keras.models._ -import com.intel.analytics.bigdl.utils.Shape +import com.intel.analytics.bigdl.dllib.keras.layers._ +import com.intel.analytics.bigdl.dllib.keras.models._ +import com.intel.analytics.bigdl.dllib.utils.Shape val model = Sequential() model.add(Reshape(Array(1, 28, 28), inputShape = Shape(28, 28, 1))) @@ -81,9 +81,9 @@ Sequential() Example code to create a sequential model: ```scala -import com.intel.analytics.zoo.pipeline.api.keras.layers.{Dense, Activation} -import com.intel.analytics.zoo.pipeline.api.keras.models.Sequential -import com.intel.analytics.bigdl.utils.Shape +import com.intel.analytics.bigdl.dllib.keras.layers.{Dense, Activation} +import com.intel.analytics.bigdl.dllib.keras.models.Sequential +import com.intel.analytics.bigdl.dllib.utils.Shape val model = Sequential[Float]() model.add(Dense[Float](32, inputShape = Shape(128))) @@ -114,7 +114,7 @@ Parameters: To merge a list of input __nodes__ (__NOT__ layers), following some merge mode in the Functional API: ```scala -import com.intel.analytics.zoo.pipeline.api.keras.layers.Merge.merge +import com.intel.analytics.bigdl.dllib.keras.layers.Merge.merge merge(inputs, mode = "sum", concatAxis = -1) // This will return an output NODE. 
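// For illustration only: a minimal sketch of how `merge` is typically applied to
// input nodes (the graph model example below shows the complete pattern):
//   val input1 = Input[Float](inputShape = Shape(8))
//   val input2 = Input[Float](inputShape = Shape(8))
//   val merged = merge(inputs = List(input1, input2), mode = "sum")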
``` @@ -127,10 +127,10 @@ Parameters: Example code to create a graph model: ```scala -import com.intel.analytics.zoo.pipeline.api.keras.layers.{Dense, Input} -import com.intel.analytics.zoo.pipeline.api.keras.layers.Merge.merge -import com.intel.analytics.zoo.pipeline.api.keras.models.Model -import com.intel.analytics.bigdl.utils.Shape +import com.intel.analytics.bigdl.dllib.keras.layers.{Dense, Input} +import com.intel.analytics.bigdl.dllib.keras.layers.Merge.merge +import com.intel.analytics.bigdl.dllib.keras.models.Model +import com.intel.analytics.bigdl.dllib.utils.Shape // instantiate input nodes val input1 = Input[Float](inputShape = Shape(8)) @@ -169,9 +169,9 @@ Masking(mask_value=0.0, input_shape=None, name=None) **Scala example:** ```scala -import com.intel.analytics.zoo.pipeline.api.keras.models.Sequential -import com.intel.analytics.zoo.pipeline.api.keras.layers.Masking -import com.intel.analytics.bigdl.utils.Shape +import com.intel.analytics.bigdl.dllib.keras.models.Sequential +import com.intel.analytics.bigdl.dllib.keras.layers.Masking +import com.intel.analytics.bigdl.dllib.utils.Shape import com.intel.analytics.bigdl.tensor.Tensor val model = Sequential[Float]() @@ -252,8 +252,8 @@ SparseDense(output_dim, init="glorot_uniform", activation=None, W_regularizer=No **Scala example:** ```scala -import com.intel.analytics.zoo.pipeline.api.keras.layers.SparseDense -import com.intel.analytics.bigdl.utils.Shape +import com.intel.analytics.bigdl.dllib.keras.layers.SparseDense +import com.intel.analytics.bigdl.dllib.utils.Shape import com.intel.analytics.bigdl.tensor.Tensor val layer = SparseDense[Float](outputDim = 5, inputShape = Shape(2, 4)) @@ -340,9 +340,9 @@ SoftShrink(value = 0.5, input_shape=None, name=None) **Scala example:** ```scala -import com.intel.analytics.zoo.pipeline.api.keras.layers.SoftShrink -import com.intel.analytics.zoo.pipeline.api.keras.models.Sequential -import com.intel.analytics.bigdl.utils.Shape +import com.intel.analytics.bigdl.dllib.keras.layers.SoftShrink +import com.intel.analytics.bigdl.dllib.keras.models.Sequential +import com.intel.analytics.bigdl.dllib.utils.Shape import com.intel.analytics.bigdl.tensor.Tensor val model = Sequential[Float]() @@ -474,9 +474,9 @@ Reshape(target_shape, input_shape=None, name=None) **Scala example:** ```scala -import com.intel.analytics.zoo.pipeline.api.keras.layers.Reshape -import com.intel.analytics.zoo.pipeline.api.keras.models.Sequential -import com.intel.analytics.bigdl.utils.Shape +import com.intel.analytics.bigdl.dllib.keras.layers.Reshape +import com.intel.analytics.bigdl.dllib.keras.models.Sequential +import com.intel.analytics.bigdl.dllib.utils.Shape import com.intel.analytics.bigdl.tensor.Tensor val model = Sequential[Float]() @@ -587,9 +587,9 @@ Merge(layers=None, mode="sum", concat_axis=-1, input_shape=None, name=None) **Scala example:** ```scala -import com.intel.analytics.zoo.pipeline.api.keras.layers.InputLayer -import com.intel.analytics.zoo.pipeline.api.keras.layers.Merge -import com.intel.analytics.zoo.pipeline.api.keras.models.Sequential +import com.intel.analytics.bigdl.dllib.keras.layers.InputLayer +import com.intel.analytics.bigdl.dllib.keras.layers.Merge +import com.intel.analytics.bigdl.dllib.keras.models.Sequential import com.intel.analytics.bigdl.utils.{Shape, T} import com.intel.analytics.bigdl.tensor.Tensor @@ -711,9 +711,9 @@ MaxoutDense(output_dim, nb_feature=4, W_regularizer=None, b_regularizer=None, bi **Scala example:** ```scala -import 
com.intel.analytics.zoo.pipeline.api.keras.layers.MaxoutDense -import com.intel.analytics.zoo.pipeline.api.keras.models.Sequential -import com.intel.analytics.bigdl.utils.Shape +import com.intel.analytics.bigdl.dllib.keras.layers.MaxoutDense +import com.intel.analytics.bigdl.dllib.keras.models.Sequential +import com.intel.analytics.bigdl.dllib.utils.Shape import com.intel.analytics.bigdl.tensor.Tensor val model = Sequential[Float]() @@ -784,9 +784,9 @@ Squeeze(dim=None, input_shape=None, name=None) **Scala example:** ```scala -import com.intel.analytics.zoo.pipeline.api.keras.layers.Squeeze -import com.intel.analytics.zoo.pipeline.api.keras.models.Sequential -import com.intel.analytics.bigdl.utils.Shape +import com.intel.analytics.bigdl.dllib.keras.layers.Squeeze +import com.intel.analytics.bigdl.dllib.keras.models.Sequential +import com.intel.analytics.bigdl.dllib.utils.Shape import com.intel.analytics.bigdl.tensor.Tensor val model = Sequential[Float]() @@ -865,9 +865,9 @@ BinaryThreshold(value=1e-6, input_shape=None, name=None) **Scala example:** ```scala -import com.intel.analytics.zoo.pipeline.api.keras.layers.BinaryThreshold -import com.intel.analytics.zoo.pipeline.api.keras.models.Sequential -import com.intel.analytics.bigdl.utils.Shape +import com.intel.analytics.bigdl.dllib.keras.layers.BinaryThreshold +import com.intel.analytics.bigdl.dllib.keras.models.Sequential +import com.intel.analytics.bigdl.dllib.utils.Shape import com.intel.analytics.bigdl.tensor.Tensor val model = Sequential[Float]() @@ -995,9 +995,9 @@ Sqrt(input_shape=None, name=None) **Scala example:** ```scala -import com.intel.analytics.zoo.pipeline.api.keras.layers.Sqrt -import com.intel.analytics.zoo.pipeline.api.keras.models.Sequential -import com.intel.analytics.bigdl.utils.Shape +import com.intel.analytics.bigdl.dllib.keras.layers.Sqrt +import com.intel.analytics.bigdl.dllib.keras.models.Sequential +import com.intel.analytics.bigdl.dllib.utils.Shape import com.intel.analytics.bigdl.tensor.Tensor val model = Sequential[Float]() @@ -1063,9 +1063,9 @@ Mul(input_shape=None, name=None) **Scala example:** ```scala -import com.intel.analytics.zoo.pipeline.api.keras.layers.Mul -import com.intel.analytics.zoo.pipeline.api.keras.models.Sequential -import com.intel.analytics.bigdl.utils.Shape +import com.intel.analytics.bigdl.dllib.keras.layers.Mul +import com.intel.analytics.bigdl.dllib.keras.models.Sequential +import com.intel.analytics.bigdl.dllib.utils.Shape import com.intel.analytics.bigdl.tensor.Tensor val model = Sequential[Float]() @@ -1156,9 +1156,9 @@ MulConstant(constant, input_shape=None, name=None) **Scala example:** ```scala -import com.intel.analytics.zoo.pipeline.api.keras.layers.MulConstant -import com.intel.analytics.zoo.pipeline.api.keras.models.Sequential -import com.intel.analytics.bigdl.utils.Shape +import com.intel.analytics.bigdl.dllib.keras.layers.MulConstant +import com.intel.analytics.bigdl.dllib.keras.models.Sequential +import com.intel.analytics.bigdl.dllib.utils.Shape import com.intel.analytics.bigdl.tensor.Tensor val model = Sequential[Float]() @@ -1253,9 +1253,9 @@ Scale(size, input_shape=None, name=None) **Scala example:** ```scala -import com.intel.analytics.zoo.pipeline.api.keras.layers.Scale -import com.intel.analytics.zoo.pipeline.api.keras.models.Sequential -import com.intel.analytics.bigdl.utils.Shape +import com.intel.analytics.bigdl.dllib.keras.layers.Scale +import com.intel.analytics.bigdl.dllib.keras.models.Sequential +import com.intel.analytics.bigdl.dllib.utils.Shape import 
com.intel.analytics.bigdl.tensor.Tensor val model = Sequential[Float]() @@ -1320,9 +1320,9 @@ Log(input_shape=None, name=None) **Scala example:** ```scala -import com.intel.analytics.zoo.pipeline.api.keras.layers.Log -import com.intel.analytics.zoo.pipeline.api.keras.models.Sequential -import com.intel.analytics.bigdl.utils.Shape +import com.intel.analytics.bigdl.dllib.keras.layers.Log +import com.intel.analytics.bigdl.dllib.keras.models.Sequential +import com.intel.analytics.bigdl.dllib.utils.Shape import com.intel.analytics.bigdl.tensor.Tensor val model = Sequential[Float]() @@ -1422,9 +1422,9 @@ Identity(input_shape=None, name=None) **Scala example:** ```scala -import com.intel.analytics.zoo.pipeline.api.keras.layers.Identity -import com.intel.analytics.zoo.pipeline.api.keras.models.Sequential -import com.intel.analytics.bigdl.utils.Shape +import com.intel.analytics.bigdl.dllib.keras.layers.Identity +import com.intel.analytics.bigdl.dllib.keras.models.Sequential +import com.intel.analytics.bigdl.dllib.utils.Shape import com.intel.analytics.bigdl.tensor.Tensor val model = Sequential[Float]() @@ -1557,8 +1557,8 @@ Select(dim, index, input_shape=None, name=None) **Scala example:** ```scala -import com.intel.analytics.zoo.pipeline.api.keras.models.Sequential -import com.intel.analytics.zoo.pipeline.api.keras.layers.Select +import com.intel.analytics.bigdl.dllib.keras.models.Sequential +import com.intel.analytics.bigdl.dllib.keras.layers.Select import com.intel.analytics.bigdl.tensor.Tensor val model = Sequential[Float]() @@ -1634,9 +1634,9 @@ Dense(output_dim, init="glorot_uniform", activation=None, W_regularizer=None, b_ **Scala example:** ```scala -import com.intel.analytics.zoo.pipeline.api.keras.models.Sequential -import com.intel.analytics.zoo.pipeline.api.keras.layers.Dense -import com.intel.analytics.bigdl.utils.Shape +import com.intel.analytics.bigdl.dllib.keras.models.Sequential +import com.intel.analytics.bigdl.dllib.keras.layers.Dense +import com.intel.analytics.bigdl.dllib.utils.Shape import com.intel.analytics.bigdl.tensor.Tensor val model = Sequential[Float]() @@ -1701,9 +1701,9 @@ Negative(input_shape=None, name=None) **Scala example:** ```scala -import com.intel.analytics.zoo.pipeline.api.keras.models.Sequential -import com.intel.analytics.zoo.pipeline.api.keras.layers.Negative -import com.intel.analytics.bigdl.utils.Shape +import com.intel.analytics.bigdl.dllib.keras.models.Sequential +import com.intel.analytics.bigdl.dllib.keras.layers.Negative +import com.intel.analytics.bigdl.dllib.utils.Shape import com.intel.analytics.bigdl.tensor.Tensor val model = Sequential[Float]() @@ -1789,9 +1789,9 @@ CAdd(size, b_regularizer=None, input_shape=None, name=None) **Scala example:** ```scala -import com.intel.analytics.zoo.pipeline.api.keras.models.Sequential -import com.intel.analytics.zoo.pipeline.api.keras.layers.CAdd -import com.intel.analytics.bigdl.utils.Shape +import com.intel.analytics.bigdl.dllib.keras.models.Sequential +import com.intel.analytics.bigdl.dllib.keras.layers.CAdd +import com.intel.analytics.bigdl.dllib.utils.Shape import com.intel.analytics.bigdl.tensor.Tensor val model = Sequential[Float]() @@ -1872,9 +1872,9 @@ RepeatVector(n, input_shape=None, name=None) **Scala example:** ```scala -import com.intel.analytics.zoo.pipeline.api.keras.models.Sequential -import com.intel.analytics.zoo.pipeline.api.keras.layers.RepeatVector -import com.intel.analytics.bigdl.utils.Shape +import com.intel.analytics.bigdl.dllib.keras.models.Sequential +import 
com.intel.analytics.bigdl.dllib.keras.layers.RepeatVector +import com.intel.analytics.bigdl.dllib.utils.Shape import com.intel.analytics.bigdl.tensor.Tensor val model = Sequential[Float]() @@ -1954,8 +1954,8 @@ GaussianSampler(input_shape=None, name=None) **Scala example:** ```scala -import com.intel.analytics.zoo.pipeline.api.keras.models.Sequential -import com.intel.analytics.zoo.pipeline.api.keras.layers.GaussianSampler +import com.intel.analytics.bigdl.dllib.keras.models.Sequential +import com.intel.analytics.bigdl.dllib.keras.layers.GaussianSampler import com.intel.analytics.bigdl.utils.{Shape, MultiShape, T} import com.intel.analytics.bigdl.tensor.Tensor @@ -2054,9 +2054,9 @@ Exp(input_shape=None, name=None) **Scala example:** ```scala -import com.intel.analytics.zoo.pipeline.api.keras.models.Sequential -import com.intel.analytics.zoo.pipeline.api.keras.layers.Exp -import com.intel.analytics.bigdl.utils.Shape +import com.intel.analytics.bigdl.dllib.keras.models.Sequential +import com.intel.analytics.bigdl.dllib.keras.layers.Exp +import com.intel.analytics.bigdl.dllib.utils.Shape import com.intel.analytics.bigdl.tensor.Tensor val model = Sequential[Float]() @@ -2186,9 +2186,9 @@ Square(input_shape=None, name=None) **Scala example:** ```scala -import com.intel.analytics.zoo.pipeline.api.keras.models.Sequential -import com.intel.analytics.zoo.pipeline.api.keras.layers.Square -import com.intel.analytics.bigdl.utils.Shape +import com.intel.analytics.bigdl.dllib.keras.models.Sequential +import com.intel.analytics.bigdl.dllib.keras.layers.Square +import com.intel.analytics.bigdl.dllib.utils.Shape import com.intel.analytics.bigdl.tensor.Tensor val model = Sequential[Float]() @@ -2320,9 +2320,9 @@ Power(power, scale=1, shift=0, input_shape=None, name=None) **Scala example:** ```scala -import com.intel.analytics.zoo.pipeline.api.keras.models.Sequential -import com.intel.analytics.zoo.pipeline.api.keras.layers.Power -import com.intel.analytics.bigdl.utils.Shape +import com.intel.analytics.bigdl.dllib.keras.models.Sequential +import com.intel.analytics.bigdl.dllib.keras.layers.Power +import com.intel.analytics.bigdl.dllib.utils.Shape import com.intel.analytics.bigdl.tensor.Tensor val model = Sequential[Float]() @@ -2404,9 +2404,9 @@ AddConstant(constant, input_shape=None, name=None) **Scala example:** ```scala -import com.intel.analytics.zoo.pipeline.api.keras.models.Sequential -import com.intel.analytics.zoo.pipeline.api.keras.layers.AddConstant -import com.intel.analytics.bigdl.utils.Shape +import com.intel.analytics.bigdl.dllib.keras.models.Sequential +import com.intel.analytics.bigdl.dllib.keras.layers.AddConstant +import com.intel.analytics.bigdl.dllib.utils.Shape import com.intel.analytics.bigdl.tensor.Tensor val model = Sequential[Float]() @@ -2509,9 +2509,9 @@ Narrow(dim, offset, length=1, input_shape=None, name=None) **Scala example:** ```scala -import com.intel.analytics.zoo.pipeline.api.keras.models.Sequential -import com.intel.analytics.zoo.pipeline.api.keras.layers.Narrow -import com.intel.analytics.bigdl.utils.Shape +import com.intel.analytics.bigdl.dllib.keras.models.Sequential +import com.intel.analytics.bigdl.dllib.keras.layers.Narrow +import com.intel.analytics.bigdl.dllib.utils.Shape import com.intel.analytics.bigdl.tensor.Tensor val model = Sequential[Float]() @@ -2622,9 +2622,9 @@ Permute(dims, input_shape=None, name=None) **Scala example:** ```scala -import com.intel.analytics.zoo.pipeline.api.keras.models.Sequential -import 
com.intel.analytics.zoo.pipeline.api.keras.layers.Permute -import com.intel.analytics.bigdl.utils.Shape +import com.intel.analytics.bigdl.dllib.keras.models.Sequential +import com.intel.analytics.bigdl.dllib.keras.layers.Permute +import com.intel.analytics.bigdl.dllib.utils.Shape import com.intel.analytics.bigdl.tensor.Tensor val model = Sequential[Float]() @@ -2729,9 +2729,9 @@ ResizeBilinear(output_height, output_width, align_corner=False, dim_ordering="th **Scala example:** ```scala -import com.intel.analytics.zoo.pipeline.api.keras.models.Sequential -import com.intel.analytics.zoo.pipeline.api.keras.layers.ResizeBilinear -import com.intel.analytics.bigdl.utils.Shape +import com.intel.analytics.bigdl.dllib.keras.models.Sequential +import com.intel.analytics.bigdl.dllib.keras.layers.ResizeBilinear +import com.intel.analytics.bigdl.dllib.utils.Shape import com.intel.analytics.bigdl.tensor.Tensor val model = Sequential() @@ -2831,3 +2831,45 @@ array([[[[0.43790358, 0.61913717, 0.2543214 ], [[0.12074634, 0.6571231 , 0.752728 ], [0.86969995, 0.6700518 , 0.36353552]]]], dtype=float32) ``` +--- +## 8. Persistence +This section describes how to save and load models defined with the Keras-like API. + +### 8.1 save +To save a Keras model, you call the method `saveModel(path)`. + +**Scala:** +```scala +import com.intel.analytics.bigdl.dllib.keras.layers.{Dense, Activation} +import com.intel.analytics.bigdl.dllib.keras.models.Sequential +import com.intel.analytics.bigdl.dllib.utils.Shape + +val model = Sequential[Float]() +model.add(Dense[Float](32, inputShape = Shape(128))) +model.add(Activation[Float]("relu")) +model.saveModel("/tmp/seq.model") +``` +**Python:** +```python +from bigdl.dllib.keras.models import Sequential +from bigdl.dllib.keras.layers import Dense + +model = Sequential() +model.add(Dense(32, input_shape=(128, ))) +model.saveModel("/tmp/seq.model") +``` + +### 8.2 load +To load a saved Keras model, use `Models.loadModel(path)` in Scala or `load_model(path)` in Python. + +**Scala:** +```scala +import com.intel.analytics.bigdl.dllib.keras.Models + +val model = Models.loadModel[Float]("/tmp/seq.model") +``` + +**Python:** +```python +from bigdl.dllib.keras.models import load_model + +model = load_model("/tmp/seq.model") +``` diff --git a/docs/readthedocs/source/doc/Serving/Example/example.md b/docs/readthedocs/source/doc/Serving/Example/example.md index 743a7b1e..709949af 100644 --- a/docs/readthedocs/source/doc/Serving/Example/example.md +++ b/docs/readthedocs/source/doc/Serving/Example/example.md @@ -1,13 +1,18 @@ # Cluster Serving Example There are some examples provided for new user or existing Tensorflow user. +## Quick Start Example +The following is the recommended quick start example for transferring a local Keras application to Cluster Serving.
+ +[keras-to-cluster-serving-example](https://github.com/intel-analytics/BigDL/blob/branch-2.0/docs/readthedocs/source/doc/Serving/Example/keras-to-cluster-serving-example.ipynb) + ## End-to-end Example ### TFDataSet: -[l08c08_forecasting_with_lstm.py](https://github.com/intel-analytics/bigdl/tree/master/docs/docs/ClusterServingGuide/OtherFrameworkUsers/l08c08_forecasting_with_lstm.py) +[l08c08_forecasting_with_lstm.py](https://github.com/intel-analytics/bigdl/blob/branch-2.0/docs/docs/ClusterServingGuide/OtherFrameworkUsers/l08c08_forecasting_with_lstm.py) ### Tokenizer: -[l10c03_nlp_constructing_text_generation_model.py](https://github.com/intel-analytics/bigdl/tree/master/docs/docs/ClusterServingGuide/OtherFrameworkUsers/l10c03_nlp_constructing_text_generation_model.py) +[l10c03_nlp_constructing_text_generation_model.py](https://github.com/intel-analytics/bigdl/blob/branch-2.0/docs/docs/ClusterServingGuide/OtherFrameworkUsers/l10c03_nlp_constructing_text_generation_model.py) ### ImageDataGenerator: -[transfer_learning.py](https://github.com/intel-analytics/bigdl/tree/master/docs/docs/ClusterServingGuide/OtherFrameworkUsers/transfer_learning.py) +[transfer_learning.py](https://github.com/intel-analytics/bigdl/blob/branch-2.0/docs/docs/ClusterServingGuide/OtherFrameworkUsers/transfer_learning.py) ## Model/Data Convert Guide This guide is for users who: diff --git a/docs/readthedocs/source/index.rst b/docs/readthedocs/source/index.rst index a69e8230..1e4549f4 100644 --- a/docs/readthedocs/source/index.rst +++ b/docs/readthedocs/source/index.rst @@ -65,14 +65,8 @@ BigDL Documentation :caption: Chronos Overview doc/Chronos/Overview/chronos.md - doc/Chronos/Overview/data_processing_feature_engineering.md - doc/Chronos/Overview/forecasting.md - doc/Chronos/Overview/anomaly_detection.md - doc/Chronos/Overview/simulation.md - doc/Chronos/Overview/useful_functionalities.md - doc/Chronos/QuickStart/chronos-autotsest-quickstart.md - doc/Chronos/QuickStart/chronos-tsdataset-forecaster-quickstart.md - doc/Chronos/QuickStart/chronos-anomaly-detector.md + doc/Chronos/Overview/deep_dive.rst + doc/Chronos/QuickStart/index.md .. toctree:: :maxdepth: 1