---
license: other
license_name: deepseek
license_link: https://huggingface.co/deepseek-ai/deepseek-coder-33b-instruct/blob/main/LICENSE
---

This is a llamafile for deepseek-coder-33b-instruct.

The quantized GGUF was downloaded straight from TheBloke, then zipped into a llamafile using Mozilla's llamafile project.
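
If you're curious how that packaging step works, here's a rough sketch based on the llamafile project's docs. The filenames, the quant level, and the exact `.args` contents are assumptions; check the llamafile repo for the current workflow.

```sh
# Sketch only: start from the stock llamafile binary and rename it for this model
cp llamafile deepseek-coder-33b-instruct.llamafile

# Default arguments baked into the executable (one per line; the trailing "..."
# is the llamafile convention for "append any extra CLI args here")
cat > .args <<'EOF'
-m
deepseek-coder-33b-instruct.Q4_K_M.gguf
...
EOF

# Embed the GGUF weights and the .args file into the executable as an uncompressed zip
./zipalign -j0 deepseek-coder-33b-instruct.llamafile \
    deepseek-coder-33b-instruct.Q4_K_M.gguf .args
```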

It's over 4 GB, so if you want to use it on Windows you'll have to run it from WSL (Windows can't execute files larger than 4 GB).

WSL note: If you get the error about APE and the recommended command

```sh
sudo sh -c 'echo -1 > /proc/sys/fs/binfmt_misc/WSLInterop'
```

doesn't work, the file might be named something else. I had success with

```sh
sudo sh -c 'echo -1 > /proc/sys/fs/binfmt_misc/WSLInterop-late'
```

If that fails too, just look in /proc/sys/fs/binfmt_misc, see which file looks like WSLInterop, and echo a -1 to whatever it's called by changing that part of the recommended command.
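
Here's a small sketch that automates that last step, assuming the entries live under /proc/sys/fs/binfmt_misc as described above:

```sh
# Disable whichever WSLInterop binfmt entry is registered, whatever it's called
for f in /proc/sys/fs/binfmt_misc/WSLInterop*; do
  [ -e "$f" ] || continue            # no matching entry, nothing to do
  echo "disabling $f"
  sudo sh -c "echo -1 > $f"          # writing -1 unregisters the entry
done
```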

A llamafile is a standalone executable that runs an LLM server locally on a variety of operating systems. You just run it, open the chat interface in a browser, and interact. Options can be passed in to expose the API, etc.
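
As a rough example (the filename here is a guess, and the flags are the ones llamafile passes through to the underlying llama.cpp server, so they may vary by release):

```sh
chmod +x deepseek-coder-33b-instruct.llamafile    # make it executable (Linux/macOS/WSL)
./deepseek-coder-33b-instruct.llamafile           # starts the local server and chat UI

# Expose the server beyond localhost, e.g. to hit the API from another machine
./deepseek-coder-33b-instruct.llamafile --host 0.0.0.0 --port 8080
```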