A desktop app for local, private, secured AI experimentation. Included out of the box are:
It's made to be used alongside https://github.com/alexanderatallah/window.ai/ as a simple way to have a local inference server up and running in no time. window.ai + local.ai enable every web app to utilize AI without incurring any cost to either the developer or the user!
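As a rough sketch of how a web app might tap into this setup, the snippet below assumes the early window.ai extension API (`window.ai.getCompletion`); the method names and shapes may differ in newer releases, so check the window.ai docs before relying on it. With local.ai running and selected as the provider in the window.ai extension, the completion is served by the user's local model at no cost.

```ts
// Illustrative only: assumes the early window.ai API surface
// (window.ai.getCompletion). Verify against the current window.ai docs.
declare global {
  interface Window {
    ai: {
      getCompletion(input: {
        messages: { role: string; content: string }[]
      }): Promise<{ message: { role: string; content: string } }>
    }
  }
}

async function askLocalModel(prompt: string): Promise<string> {
  // The user's window.ai extension decides which provider answers;
  // when it points at local.ai, inference happens on their machine.
  const response = await window.ai.getCompletion({
    messages: [{ role: "user", content: prompt }]
  })
  return response.message.content
}

askLocalModel("Summarize this page in one sentence.").then(console.log)

export {}
```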
Right now, local.ai uses the https://github.com/rustformers/llm Rust crate at its core. Check them out, they are super cool!
Go to the site at https://www.localai.app/ and click the button for your machine's architecture. You can also find the builds on the GitHub releases page.
Windows and macOS binaries are signed under Plasmo Corp., a company owned by the author of this project (@louisgv).
You may also build from source!
Here's how to run the project locally:
```
git submodule update --init --recursive
pnpm i
pnpm dev
```
> "Ties into the bring your own model concept" -- Alex from window.ai
Anything AI-related, including its derivatives, should be open source for all to inspect. GPLv3 enforces this chain of open-source licensing.
Please note that any contribution toward this repo shall be relicensed under GPLv3. There are many ways to contribute, such as: