LLM: Fix speculative llama3 long input error (#10934)
This commit is contained in:
parent 49ab5a2b0e
commit 1de878bee1
1 changed file with 2 additions and 1 deletion
				
			
@@ -18,7 +18,8 @@ We suggest using conda to manage environment:
 conda create -n llm python=3.11
 conda activate llm
 pip install --pre --upgrade ipex-llm[all]
-pip install intel_extension_for_pytorch==2.1.0
+# transformers>=4.33.0 is required for Llama3 with IPEX-LLM optimizations
+pip install transformers==4.37.0
 ```

 ### 2. Configures high-performing processor environment variables
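The added comment in the diff states that `transformers>=4.33.0` is required for Llama3 with IPEX-LLM optimizations, and the commit pins `4.37.0`. A minimal sketch of the version check that requirement implies — the helper name and the stdlib-only tuple comparison are illustrative, not part of IPEX-LLM:

```python
def meets_llama3_requirement(installed: str, minimum: str = "4.33.0") -> bool:
    """Return True if the installed transformers version satisfies the
    minimum noted in the diff comment (transformers>=4.33.0 for Llama3)."""
    def parse(version: str) -> tuple:
        # Compare numeric dotted versions component by component.
        return tuple(int(part) for part in version.split("."))
    return parse(installed) >= parse(minimum)

# The version pinned by this commit satisfies the requirement;
# releases older than 4.33.0 do not.
print(meets_llama3_requirement("4.37.0"))  # True
print(meets_llama3_requirement("4.31.0"))  # False
```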