diff --git a/docs/readthedocs/source/doc/Orca/Overview/install.md b/docs/readthedocs/source/doc/Orca/Overview/install.md
index 2cee6dd4..0a9ac2ec 100644
--- a/docs/readthedocs/source/doc/Orca/Overview/install.md
+++ b/docs/readthedocs/source/doc/Orca/Overview/install.md
@@ -46,7 +46,7 @@ conda activate py37
 
 This section demonstrates how to install BigDL Orca via `pip`, which is the most recommended way.
 
-__Note:__
+__Notes:__
 * Installing BigDL Orca from pip will automatically install `pyspark`. To avoid possible conflicts, you are highly recommended to **unset the environment variable `SPARK_HOME`** if it exists in your environment.
 * If you are using a custom URL of Python Package Index to install the latest version, you may need to check whether the latest packages have been sync'ed with pypi. Or you can add the option `-i https://pypi.python.org/simple` when pip install to use pypi as the index-url.
 
diff --git a/docs/readthedocs/source/doc/Orca/Tutorial/yarn.md b/docs/readthedocs/source/doc/Orca/Tutorial/yarn.md
index e84c1499..748c7938 100644
--- a/docs/readthedocs/source/doc/Orca/Tutorial/yarn.md
+++ b/docs/readthedocs/source/doc/Orca/Tutorial/yarn.md
@@ -296,24 +296,25 @@ sc = init_orca_context(cluster_mode="spark-submit")
 ```
 
 Before submitting the application on the __Client Node__, you need to:
-
-1. Prepare the conda environment on a __Development Node__ where conda is available and pack the conda environment to an archive:
+- First, prepare the conda environment on a __Development Node__ where conda is available and pack the conda environment to an archive:
 ```bash
 conda pack -o environment.tar.gz
 ```
-2. Send the Conda archive to the __Client Node__;
+
+- Then send the conda archive to the __Client Node__;
 ```bash
 scp /path/to/environment.tar.gz username@client_ip:/path/to/
 ```
 
-On the __Client Node__:
-1. Download and extract [Spark](https://archive.apache.org/dist/spark/). Then setup the environment variables `${SPARK_HOME}` and `${SPARK_VERSION}`.
+On the __Client Node__:
+- Download and extract [Spark](https://archive.apache.org/dist/spark/). Then setup the environment variables `${SPARK_HOME}` and `${SPARK_VERSION}`.
 ```bash
 export SPARK_HOME=/path/to/spark # the folder path where you extract the Spark package
 export SPARK_VERSION="downloaded spark version"
 ```
-2. Refer to [here](../Overview/install.html#download-bigdl-orca) to download and unzip a BigDL assembly package. Make sure the Spark version of your downloaded BigDL matches your downloaded Spark. Then setup the environment variables `${BIGDL_HOME}` and `${BIGDL_VERSION}`.
+
+- Refer to [here](../Overview/install.html#download-bigdl-orca) to download and unzip a BigDL assembly package. Make sure the Spark version of your downloaded BigDL matches your downloaded Spark. Then setup the environment variables `${BIGDL_HOME}` and `${BIGDL_VERSION}`.
 ```bash
 export BIGDL_HOME=/path/to/unzipped_BigDL
 export BIGDL_VERSION="downloaded BigDL version"