codeshell: fix wrong links (#9417)

Yining Wang 2023-11-12 19:22:33 +08:00 committed by GitHub
parent 2cfef5ef1e
commit d7334513e1
2 changed files with 3 additions and 3 deletions

@@ -162,7 +162,7 @@ Over 20 models have been optimized/verified on `bigdl-llm`, including *LLaMA/LLa
 | Skywork | [link](python/llm/example/CPU/HF-Transformers-AutoModels/Model/skywork) | |
 | InternLM-XComposer | [link](python/llm/example/CPU/HF-Transformers-AutoModels/Model/internlm-xcomposer) | |
 | WizardCoder-Python | [link](python/llm/example/CPU/HF-Transformers-AutoModels/Model/wizardcoder-python) | |
-| CodeShell | [link](python/llm/example/CPU/HF-Transformers-AutoModels/Model/CodeShell) | |
+| CodeShell | [link](python/llm/example/CPU/HF-Transformers-AutoModels/Model/codeshell) | |
 | Fuyu | [link](python/llm/example/CPU/HF-Transformers-AutoModels/Model/fuyu) | |
 | Distil-Whisper | [link](python/llm/example/CPU/HF-Transformers-AutoModels/Model/distil-whisper) | [link](python/llm/example/GPU/HF-Transformers-AutoModels/Model/distil-whisper) |

@@ -55,7 +55,7 @@ Over 20 models have been optimized/verified on `bigdl-llm`, including *LLaMA/LLa
 | Baichuan | [link](example/CPU/HF-Transformers-AutoModels/Model/baichuan) | [link](example/CPU/HF-Transformers-AutoModels/Model/baichuan) |
 | Baichuan2 | [link](example/CPU/HF-Transformers-AutoModels/Model/baichuan2) | [link](example/GPU/HF-Transformers-AutoModels/Model/baichuan2) |
 | InternLM | [link](example/CPU/HF-Transformers-AutoModels/Model/internlm) | [link](example/GPU/HF-Transformers-AutoModels/Model/internlm) |
-| Qwen | [link](example/CPU/HF-Transformers-AutoModels/Model/qwen) | [link](example/GPU/HF-Transformers-AutoModels/Model/qwen) |
+| Qwen | [link](example/CPU/HF-Transformers-AutoModels/Model/qwen) | [link](example/GPU/HF-Transformers-AutoModels/Model/qwen) |
 | Qwen-VL | [link](example/CPU/HF-Transformers-AutoModels/Model/qwen-vl) | [link](example/GPU/HF-Transformers-AutoModels/Model/qwen-vl) |
 | Aquila | [link](example/CPU/HF-Transformers-AutoModels/Model/aquila) | [link](example/GPU/HF-Transformers-AutoModels/Model/aquila) |
 | Aquila2 | [link](example/CPU/HF-Transformers-AutoModels/Model/aquila2) | [link](example/GPU/HF-Transformers-AutoModels/Model/aquila2) |
@@ -69,7 +69,7 @@ Over 20 models have been optimized/verified on `bigdl-llm`, including *LLaMA/LLa
 | Skywork | [link](example/CPU/HF-Transformers-AutoModels/Model/skywork) | |
 | InternLM-XComposer | [link](example/CPU/HF-Transformers-AutoModels/Model/internlm-xcomposer) | |
 | WizardCoder-Python | [link](example/CPU/HF-Transformers-AutoModels/Model/wizardcoder-python) | |
-| CodeShell | [link](example/CPU/HF-Transformers-AutoModels/Model/CodeShell) | |
+| CodeShell | [link](example/CPU/HF-Transformers-AutoModels/Model/codeshell) | |
 | Fuyu | [link](example/CPU/HF-Transformers-AutoModels/Model/fuyu) | |
 | Distil-Whisper | [link](example/CPU/HF-Transformers-AutoModels/Model/distil-whisper) | [link](example/GPU/HF-Transformers-AutoModels/Model/distil-whisper) |
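The bug fixed above is a case mismatch: the README linked `Model/CodeShell` while the directory on disk is `Model/codeshell`. Such links often appear to work on case-insensitive filesystems (macOS, Windows) and only break on GitHub or Linux. A minimal, hypothetical link checker (not part of this repo) can catch the mismatch by testing each path component against the directory listing, which is always case-exact:

```python
import os
import re

# Matches markdown links like [text](path); captures the target, stopping at ')' or '#'.
LINK_RE = re.compile(r"\[[^\]]*\]\(([^)#]+)\)")

def case_exact_exists(root, rel_path):
    """Return True if rel_path exists under root with exactly this casing."""
    current = root
    for part in rel_path.split("/"):
        if part in ("", "."):
            continue
        # os.listdir is case-exact even on case-insensitive filesystems.
        if part not in os.listdir(current):
            return False
        current = os.path.join(current, part)
    return True

def broken_relative_links(readme_path, root):
    """Yield (line_no, target) for relative links whose target is missing or mis-cased."""
    with open(readme_path, encoding="utf-8") as f:
        for line_no, line in enumerate(f, 1):
            for target in LINK_RE.findall(line):
                if "://" in target:  # skip absolute URLs
                    continue
                if not case_exact_exists(root, target):
                    yield line_no, target
```

Running a checker like this in CI against the repository root would have flagged `Model/CodeShell` before merge while accepting the corrected `Model/codeshell`.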