[PPML] Refine readthedoc lines and space (#6509)

* Remove empty lines
* Use same space for indent
* Add empty line at the end
Qiyuan Gong 2022-11-09 13:41:21 +08:00 committed by GitHub
parent 4f3c27bde0
commit 17fb75f8d7
6 changed files with 59 additions and 65 deletions


@@ -77,13 +77,13 @@ In your BigDL PPML container, you can run `/ppml/trusted-big-data-ml/azure/creat
Note: Please use the same VNet information of your client to create AKS. And use DC-Series VM size(i.e.Standard_DC8ds_v3) to create AKS.
```bash
/ppml/trusted-big-data-ml/azure/create-aks.sh \
    --resource-group myResourceGroup \
    --vnet-resource-group myVnetResourceGroup \
    --vnet-name myVnetName \
    --subnet-name mySubnetName \
    --cluster-name myAKSName \
    --vm-size myAKSNodeVMSize \
    --node-count myAKSInitNodeCount
```
You can check the information by running:
@@ -343,7 +343,6 @@ bash bigdl-ppml-submit.sh \
    --verbose \
    $SPARK_EXTRA_JAR_PATH \
    $ARGS
```
## 4. Run TPC-H example
@@ -375,7 +374,8 @@ will generate roughly 10GB of input data.
Generate primary key and data key, then save to file system.
The example code for generating the primary key and data key is like below:
```bash
BIGDL_VERSION=2.1.0
java -cp '/ppml/trusted-big-data-ml/work/bigdl-$BIGDL_VERSION/jars/*:/ppml/trusted-big-data-ml/work/spark-3.1.2/conf/:/ppml/trusted-big-data-ml/work/spark-3.1.2/jars/* \
-Xmx10g \
@@ -390,7 +390,8 @@ java -cp '/ppml/trusted-big-data-ml/work/bigdl-$BIGDL_VERSION/jars/*:/ppml/trust
Encrypt data with specified BigDL `AzureKeyManagementService`
The example code of encrypting data is like below:
```bash
BIGDL_VERSION=2.1.0
java -cp '/ppml/trusted-big-data-ml/work/bigdl-$BIGDL_VERSION/jars/*:/ppml/trusted-big-data-ml/work/spark-3.1.2/conf/:/ppml/trusted-big-data-ml/work/spark-3.1.2/jars/* \
-Xmx10g \
@@ -417,7 +418,7 @@ location of the input data and where the output should be saved.
The example script to run a query is like:
```bash
export RUNTIME_DRIVER_MEMORY=8g
export RUNTIME_DRIVER_PORT=54321
@@ -472,11 +473,3 @@ bash bigdl-ppml-submit.sh \
INPUT_DIR is the TPC-H's data dir.
OUTPUT_DIR is the dir to write the query result.
The optional parameter [QUERY] is the number of the query to run e.g 1, 2, ..., 22
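As a usage sketch (not part of this commit's diff): since [QUERY] is optional and ranges over 1..22, a wrapper that runs every query in sequence might look like the following. The directory defaults and the commented-out `bigdl-ppml-submit.sh` argument order are assumptions for illustration, not taken verbatim from the document.

```shell
# Hypothetical wrapper: iterate all 22 TPC-H queries.
# INPUT_DIR/OUTPUT_DIR defaults below are placeholders, not from this diff.
INPUT_DIR=${INPUT_DIR:-/ppml/tpch/data}
OUTPUT_DIR=${OUTPUT_DIR:-/ppml/tpch/output}
for QUERY in $(seq 1 22); do
  echo "query $QUERY: input=$INPUT_DIR output=$OUTPUT_DIR"
  # Real submission would go here, with the flags shown earlier, e.g.:
  # bash bigdl-ppml-submit.sh ... "$INPUT_DIR" "$OUTPUT_DIR" "$QUERY"
done
```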


@@ -23,6 +23,7 @@ You can find more details in [Intel SGX Developer Guide](https://download.01.org
```eval_rst
.. mermaid::

    graph LR
        subgraph SGX enclave
            MRENCLAVE(fa:fa-file-signature MRENCLAVE)