langwhat
1.0.0
Answer "What is it?" on the command line with the power of large language models (LLMs).
An LLM-powered take on pyWhat, leveraging the OpenAI API and Sydney (Bing AI Chat).
langwhat 'f7316ffccd4d2d555a7522328cf792dd73bfbcd9'
langwhat 'f7316ffccd4d2d555a7522328cf792dd73bfbcd9' --zh
Sydney fixed my typo "marry" automatically.
langwhat 'marry ball washington' -s
langwhat 'marry ball washington' -s -z
Responses are much faster on cache hits, and token usage drops to 0.
Note that Sydney doesn't support counting token usage and always shows 0.
lw teddy --show-token-usage
lw teddy --show-token-usage
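Conceptually, the query cache works like this: hash the query, and serve the stored answer if one exists instead of calling the API again. A minimal Python sketch of the idea (illustrative only, not langwhat's actual implementation; `fake_llm` stands in for the real API call):

```python
import hashlib
import json
import os
import tempfile

# Fresh cache directory for this demo run
CACHE_DIR = tempfile.mkdtemp(prefix="lw-cache-demo-")

def cached_answer(prompt, ask_llm):
    """Return a cached answer if one exists; otherwise call the LLM and cache it."""
    key = hashlib.sha256(prompt.encode()).hexdigest()
    path = os.path.join(CACHE_DIR, key + ".json")
    if os.path.exists(path):                 # cache hit: zero tokens used
        with open(path) as f:
            return json.load(f)["answer"]
    answer = ask_llm(prompt)                 # cache miss: real API call
    with open(path, "w") as f:
        json.dump({"answer": answer}, f)
    return answer

calls = []
def fake_llm(prompt):
    calls.append(prompt)
    return f"answer to {prompt!r}"

print(cached_answer("teddy", fake_llm))  # miss: calls the model
print(cached_answer("teddy", fake_llm))  # hit: served from disk
print(len(calls))                        # 1
```

Because the key is derived only from the prompt, repeating the same query never costs tokens, which matches the behavior shown above.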
Put your OpenAI API key in ~/.config/langwhat/api_key.txt.
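To create the key file from a shell (the key shown is a placeholder; substitute your real one):

```shell
# Create the config directory and store the API key
mkdir -p ~/.config/langwhat
printf '%s\n' 'sk-your-key-here' > ~/.config/langwhat/api_key.txt  # placeholder key
chmod 600 ~/.config/langwhat/api_key.txt  # keep the key readable only by you
```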
To use Sydney (Bing AI Chat) instead, pass the -s flag.

This is the recommended installation method:
$ pipx install langwhat
# Python 3.11 or higher is required. If your pipx defaults to a lower
# Python version, install langwhat with Python 3.11 explicitly:
# pipx install --python "$(which python3.11)" langwhat
Or install with pip:
$ pip install langwhat
$ langwhat --help
usage: lw [-h] [-z] [-s] [-C] [--show-token-usage] [-V] what

positional arguments:
  what                what is it

options:
  -h, --help          show this help message and exit
  -z, --zh            Use Mandarin to prompt and answer
  -s, --sydney        Use Sydney (Bing AI) instead of OpenAI
  -C, --no-cache      Disable cache
  --show-token-usage  Show token usage
  -V, --version       show program's version number and exit
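The interface above maps onto a standard argparse parser. A minimal sketch reconstructed from the help text (not langwhat's actual source; the version string is assumed):

```python
import argparse

def build_parser() -> argparse.ArgumentParser:
    # Mirrors the --help output shown above
    p = argparse.ArgumentParser(prog="lw")
    p.add_argument("what", help="what is it")
    p.add_argument("-z", "--zh", action="store_true",
                   help="Use Mandarin to prompt and answer")
    p.add_argument("-s", "--sydney", action="store_true",
                   help="Use Sydney (Bing AI) instead of OpenAI")
    p.add_argument("-C", "--no-cache", action="store_true",
                   help="Disable cache")
    p.add_argument("--show-token-usage", action="store_true",
                   help="Show token usage")
    p.add_argument("-V", "--version", action="version",
                   version="%(prog)s 1.0.0")
    return p

args = build_parser().parse_args(["teddy", "-s", "--show-token-usage"])
print(args.what, args.sydney, args.no_cache)  # teddy True False
```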
$ git clone https://github.com/tddschn/langwhat.git
$ cd langwhat
$ poetry install