
Add verbose information (config/system/performance) #31

Merged
li-plus merged 1 commit into main from verbose on Jul 6, 2023
Conversation

li-plus (Owner) commented on Jul 6, 2023

Support displaying verbose output, including config, system, and performance information. Enable it by passing -v or --verbose to the main program. This should resolve #8.
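For reference, here is a minimal sketch of how such a switch could be wired into a CLI entry point. The struct and function names below are illustrative placeholders, not the actual chatglm.cpp API:

#include <cstdio>
#include <string>
#include <vector>

struct Args {
    std::string model_path = "chatglm2-ggml.bin";
    std::string prompt = "你好";
    int num_threads = 0;
    bool verbose = false;
};

static Args parse_args(const std::vector<std::string> &argv) {
    Args args;
    for (size_t i = 1; i < argv.size(); i++) {
        const std::string &arg = argv[i];
        if (arg == "-m" || arg == "--model") {
            args.model_path = argv[++i];
        } else if (arg == "-p" || arg == "--prompt") {
            args.prompt = argv[++i];
        } else if (arg == "-t" || arg == "--threads") {
            args.num_threads = std::stoi(argv[++i]);
        } else if (arg == "-v" || arg == "--verbose") {
            args.verbose = true;   // turn on config/system/performance logging
        }
    }
    return args;
}

int main(int argc, char **argv) {
    Args args = parse_args(std::vector<std::string>(argv, argv + argc));
    if (args.verbose) {
        // in the real program this is where config and system info would be printed
        printf("inference config: | num_threads = %d |\n", args.num_threads);
    }
    return 0;
}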

Sample output:

$ ./build/bin/main -m chatglm2-ggml.bin -p 你好 -v -t 16
system info: | AVX = 1 | AVX2 = 1 | AVX512 = 0 | AVX512_VBMI = 0 | AVX512_VNNI = 0 | FMA = 1 | NEON = 0 | ARM_FMA = 0 | F16C = 1 | FP16_VA = 0 | WASM_SIMD = 0 | BLAS = 0 | SSE3 = 1 | VSX = 0 |
inference config: | max_length = 2048 | max_context_length = 512 | top_k = 0 | top_p = 0.7 | temperature = 0.95 | num_threads = 16 |
loaded ChatGLM2 model from chatglm2-ggml.bin within: 25.119 ms

你好👋!我是人工智能助手 ChatGLM2-6B,很高兴见到你,欢迎问我任何问题。

prompt time: 633.128 ms / 17 tokens (37.242 ms/token)
output time: 2202.78 ms / 28 tokens (78.67 ms/token)
total time: 2835.91 ms
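The ms/token figures above are simply the elapsed wall-clock time of each phase divided by its token count, and the total is the sum of the prompt and output times (633.128 ms + 2202.78 ms ≈ 2835.91 ms). A self-contained sketch of that bookkeeping, using std::chrono and hard-coded placeholder token counts (the real code may use ggml's timers and actual model evaluation instead):

#include <chrono>
#include <cstdio>

int main() {
    using clock = std::chrono::steady_clock;
    auto ms = [](clock::time_point a, clock::time_point b) {
        return std::chrono::duration<double, std::milli>(b - a).count();
    };

    auto t0 = clock::now();
    int prompt_tokens = 17;    // placeholder: tokens in the encoded prompt
    // ... evaluate the prompt here ...
    auto t1 = clock::now();
    int output_tokens = 28;    // placeholder: tokens generated
    // ... generate output tokens here ...
    auto t2 = clock::now();

    double prompt_ms = ms(t0, t1);
    double output_ms = ms(t1, t2);
    printf("prompt time: %.3f ms / %d tokens (%.3f ms/token)\n",
           prompt_ms, prompt_tokens, prompt_ms / prompt_tokens);
    printf("output time: %.3f ms / %d tokens (%.3f ms/token)\n",
           output_ms, output_tokens, output_ms / output_tokens);
    printf("total time: %.3f ms\n", prompt_ms + output_ms);
    return 0;
}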

li-plus changed the title from "Add verbose information (config/system/timing)" to "Add verbose information (config/system/performance)" on Jul 6, 2023
li-plus merged commit 7223505 into main on Jul 6, 2023
6 checks passed
li-plus deleted the verbose branch on July 6, 2023 at 04:46
Successfully merging this pull request may close these issues:

Adding profiling info for chatglm.cpp