docs(orca-context.md): RayContext -> OrcaRayContext (#7227)

parent 06cb1971c7
commit cc6f9b4dd2

1 changed file with 2 additions and 2 deletions
````diff
@@ -18,7 +18,7 @@ init_orca_context(...)
 In `init_orca_context`, the user may specify necessary runtime configurations for the Orca program, including:
 
 - *Cluster mode*: Users can specify the computing environment for the program (a local machine, K8s cluster, Hadoop/YARN cluster, etc.).
-- *Runtime*: Users can specify the backend for the program (spark and ray, etc.) to create SparkContext and/or RayContext, the cluster mode would work based on the specified runtime backend.
+- *Runtime*: Users can specify the backend for the program (spark and ray, etc.) to create SparkContext and/or OrcaRayContext, the cluster mode would work based on the specified runtime backend.
 - *Physical resources*: Users can specify the amount of physical resources to be allocated for the program on the underlying cluster, including the number of nodes in the cluster, the cores and memory allocated for each node, etc.
 
 The Orca program simply runs `init_orca_context` on the local machine, which will automatically provision the runtime Python environment and distributed execution engine on the underlying computing environment (such as a single laptop, a large K8s or Hadoop cluster, etc.).
````
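For context, a minimal sketch of how these runtime configurations are typically passed to `init_orca_context` is shown below; the specific parameter names (`cluster_mode`, `cores`, `memory`, `num_nodes`) and values are illustrative assumptions and are not part of this change:

```python
from bigdl.orca import init_orca_context, stop_orca_context

# Run locally, allocating 4 cores and 4 GB of memory on the current machine
# (assumed parameter names; adjust to the installed Orca version).
sc = init_orca_context(cluster_mode="local", cores=4, memory="4g")

# For a Hadoop/YARN cluster, the same call would instead specify the cluster
# mode and the per-node physical resources, e.g.:
# sc = init_orca_context(cluster_mode="yarn-client", num_nodes=2,
#                        cores=4, memory="10g")

# ... Orca data processing / training / inference goes here ...

stop_orca_context()  # release the provisioned Spark/Ray resources
```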
````diff
@@ -45,7 +45,7 @@ View the user guide for [K8s](../../UserGuide/k8s.md) and [Hadoop/YARN](../../Us
 
 Under the hood, `OrcaContext` will automatically provision Apache Spark and/or Ray as the underlying execution engine for the distributed data processing and model training/inference.
 
-Users can easily retrieve `SparkContext` and `RayContext`, the main entry point for Spark and Ray respectively, via `OrcaContext`:
+Users can easily retrieve `SparkContext` and `OrcaRayContext`, the main entry point for Spark and Ray respectively, via `OrcaContext`:
 
 ```python
 from bigdl.orca import OrcaContext
````
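The Python block shown in this hunk ends at the import because the diff context stops there; a sketch of how the two contexts might then be retrieved follows, assuming `OrcaContext.get_spark_context()`, `OrcaContext.get_ray_context()`, and an `init_ray_on_spark` flag, none of which appear in this diff:

```python
from bigdl.orca import OrcaContext, init_orca_context, stop_orca_context

# Initialize Orca with the Ray backend enabled so that both contexts exist
# (assumed flag name; adjust to the installed Orca version).
init_orca_context(cluster_mode="local", cores=4, init_ray_on_spark=True)

sc = OrcaContext.get_spark_context()     # SparkContext: entry point for Spark
ray_ctx = OrcaContext.get_ray_context()  # OrcaRayContext: entry point for Ray

stop_orca_context()
```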