# BigDL LLM

## llm-cli

`llm-cli` is a command-line tool for running `llama`/`gptneox`/`bloom` models and generating text from a given prompt.

### Usage

```bash
llm-cli -x <llama/gptneox/bloom> [-h] [args]
```

`args` are the arguments passed to the specified model program. You can use `-x MODEL_FAMILY -h` to retrieve the parameter list for a specific `MODEL_FAMILY`, for example:

```bash
llm-cli.sh -x llama -h

# Output:
# usage: main-llama [options]
#
# options:
#   -h, --help            show this help message and exit
#   -i, --interactive     run in interactive mode
#   --interactive-first   run in interactive mode and wait for input right away
#   ...
```

### Examples

Here are some examples of how to use the `llm-cli` tool:

Completion:

```bash
llm-cli.sh -t 16 -x llama -m ./llm-llama-model.bin -p 'Once upon a time,'
```
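Because the generated text goes to standard output, completions can also be captured from a shell script like any other command. A minimal sketch, assuming `llm-cli.sh` is on your `PATH` and the model path above is valid:

```bash
# Hypothetical wrapper: run a one-shot completion and save the result.
# Assumes llm-cli.sh writes the generated text to stdout.
PROMPT='Once upon a time,'
llm-cli.sh -t 16 -x llama -m ./llm-llama-model.bin -p "$PROMPT" > completion.txt
cat completion.txt
```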

Chatting:

```bash
llm-cli.sh -t 16 -x llama -m ./llm-llama-model.bin -i --color
```
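The other model families are invoked the same way by changing `-x`. A minimal sketch, assuming converted gptneox and bloom model files at the (hypothetical) paths below:

```bash
# Same flags, different model family; the model file names are placeholders
# for your own converted checkpoints.
llm-cli.sh -t 16 -x gptneox -m ./llm-gptneox-model.bin -p 'Once upon a time,'
llm-cli.sh -t 16 -x bloom   -m ./llm-bloom-model.bin   -p 'Once upon a time,'
```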

Feel free to explore different options and experiment with the llama/gptneox/bloom models using `llm-cli`!