gobean committed (verified)
Commit aefa5fb · 1 Parent(s): e381311

Update README.md

Files changed (1)
  1. README.md +19 -5
README.md CHANGED
@@ -10,19 +10,33 @@ This is a llamafile for [deepseek-coder-33b-instruct](https://huggingface.co/dee
  The quantized gguf was downloaded straight from [TheBloke](https://huggingface.co/TheBloke/deepseek-coder-33B-instruct-GGUF),
  and then zipped into a llamafile using [Mozilla's awesome project](https://github.com/Mozilla-Ocho/llamafile).

- It's over 4gb so if you want to use it on Windows you'll have to run it from WSL.

  WSL note: If you get the error about APE, and the recommended command

  `sudo sh -c 'echo -1 > /proc/sys/fs/binfmt_misc/WSLInterop'`

- doesn't work, the file might be named something else so I had success with

  `sudo sh -c 'echo -1 > /proc/sys/fs/binfmt_misc/WSLInterop-late'`

  If that fails too, just navigate to `/proc/sys/fs/binfmt_msc` and see what files look like `WSLInterop` and echo a -1 to whatever it's called by changing that part of the recommended command.


- Llamafiles are a standalone executable that run an LLM server locally on a variety of operating systems.
- You just run it, open the chat interface in a browser, and interact.
- Options can be passed in to expose the api etc.
 
  The quantized gguf was downloaded straight from [TheBloke](https://huggingface.co/TheBloke/deepseek-coder-33B-instruct-GGUF),
  and then zipped into a llamafile using [Mozilla's awesome project](https://github.com/Mozilla-Ocho/llamafile).

+
+
+ -= Llamafile =-
+
+ Llamafiles are standalone executables that run an LLM server locally on a variety of operating systems, including FreeBSD, Windows, Windows via WSL, Linux, and Mac.
+ The same file works everywhere; I've tested several of these on FreeBSD, Windows, Windows via WSL, and Linux.
+ You just download the .llamafile (chmod +x or rename to .exe as needed), run it, open the chat interface in a browser, and interact.
+ Options can be passed in to expose the API, etc. See their [docs](https://github.com/Mozilla-Ocho/llamafile) for details.
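A minimal sketch of those steps on Linux/Mac/BSD (the filename below is a placeholder for this repo's actual .llamafile, and the flags and default address shown are the ones described in the llamafile docs, so double-check them there):

```sh
# Placeholder name -- substitute the .llamafile you actually downloaded from this repo
chmod +x deepseek-coder-33b-instruct.llamafile    # on Windows you'd rename it to .exe instead
./deepseek-coder-33b-instruct.llamafile           # starts the local server and opens the chat UI (default http://127.0.0.1:8080)

# To expose the API on your network, pass server options through (check the llamafile docs for the exact flags):
./deepseek-coder-33b-instruct.llamafile --host 0.0.0.0 --port 8080
```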
+
+ [Mozilla Blog Announcement for Llamafile](https://hacks.mozilla.org/2023/11/introducing-llamafile/)
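If you do expose the API, a quick sanity check is an OpenAI-style chat completion request; this assumes the default port 8080 and the OpenAI-compatible endpoint that llamafile's built-in server advertises (the model field is just a label here):

```sh
curl http://localhost:8080/v1/chat/completions \
  -H "Content-Type: application/json" \
  -d '{
        "model": "deepseek-coder-33b-instruct",
        "messages": [{"role": "user", "content": "Write hello world in Go."}]
      }'
```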
+
+
+ Windows note:
+
+ If it's over 4 GB and you want to use it on Windows, you'll have to run it from WSL.

  WSL note: If you get the error about APE, and the recommended command

  `sudo sh -c 'echo -1 > /proc/sys/fs/binfmt_misc/WSLInterop'`

+ doesn't work, the WSLInterop file might be named something else. I had success with

  `sudo sh -c 'echo -1 > /proc/sys/fs/binfmt_misc/WSLInterop-late'`

  If that fails too, navigate to `/proc/sys/fs/binfmt_misc`, see which files look like `WSLInterop`, and echo a -1 to whichever one is there by changing that part of the recommended command.
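In other words, list the directory and disable whichever interop entry is actually there; a minimal sketch of that fallback (the entry name on your system may differ):

```sh
ls /proc/sys/fs/binfmt_misc/              # look for an entry named like WSLInterop or WSLInterop-late
sudo sh -c 'echo -1 > /proc/sys/fs/binfmt_misc/WSLInterop-late'    # swap in whatever name you actually see
```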


+ FreeBSD note:
+
+ Yes, it actually works on a fresh install of FreeBSD.