davideuler committed · Commit a040d47 · Parent(s): 6990e21
readme update
README.md
CHANGED
@@ -4,8 +4,8 @@

There are tons of PDF readers/translators with AI support. However, none of them meets my needs. I hope one could run totally locally with local LLMs.

-I hope
-Also I don't like to translate a 1000 pages long PDF file all at
+I hope to read both the original PDF and the translated pages side by side.
+Also, I don't like to translate a 1000-page-long PDF file all at once: it costs lots of time and tokens. And most of the time, I never read through all the contents of a long paper.

## Features in PDF Translator for Human
You can read both the original PDF file and the translated content side by side.

@@ -16,7 +16,7 @@ The local/remote translation API is invoked on a per-page basis as needed, trigg



-## Supported translators and
+## Supported translators and LLMs:
* Google Translator (NO api-key needed, it is totally free)
* Locally deployed LLMs (ollama, llama.cpp, mlx_lm, etc.)
* ChatGPT

@@ -36,7 +36,7 @@ streamlit run pdf_translator_web.py

```

-## Notes on
+## Notes on deployment and starting a local LLM inference service

### Option 1. Start a local LLM with mlx_lm (works on Apple Silicon)

@@ -73,7 +73,11 @@ cmake --build build --config Release -j 12

```

+### Option 3. Local inference service by ollama/vLLM and other applications such as LMStudio
+
+Please read the official guide for your LLM inferencing tool.
+
+### Option 4. Note on using an OpenAI-compatible LLM service provider

For example, run the following command before starting the streamlit application to enable translation by deepseek:

@@ -92,12 +96,11 @@ export OPENAI_API_KEY=sk-xxxx

```

-### Options 3. Local inference service by ollama/vLLM and other application such as LMStudio
-
-Please read the official guide for you LLM inferencing tool.

## Acknowledgement

https://github.com/nidhaloff/deep-translator

-The project is based on the awesome deep-translator.
+The project is based on the awesome deep-translator. Thanks to the excellent work in the original project, I can integrate it into the PDF translator tool.
+
+Pull Requests are welcome.
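A note on Option 3 of the updated README: the tool presumably talks to a local inference server through the same OpenAI-compatible environment variables it uses for remote providers. A minimal sketch for ollama follows; the variable names and the URL are assumptions based on ollama's documented defaults, not anything shown in this diff:

```shell
# ollama serves an OpenAI-compatible API at this address by default
# (OPENAI_API_BASE is an assumption -- it is not shown in this commit)
export OPENAI_API_BASE=http://localhost:11434/v1
# Local servers ignore the key, but OpenAI-style clients usually require it to be non-empty
export OPENAI_API_KEY=ollama
```

With these set, `streamlit run pdf_translator_web.py` should pick up the local endpoint, assuming the app reads `OPENAI_API_BASE`.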
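For Option 4, the diff only shows `export OPENAI_API_KEY=sk-xxxx` in a hunk header, so here is a hedged sketch of the deepseek setup it likely describes; `OPENAI_API_BASE` and the endpoint URL are assumptions, and `sk-xxxx` stands in for a real key:

```shell
# Point the OpenAI-style client at deepseek's OpenAI-compatible endpoint
# (OPENAI_API_BASE and the URL are assumptions; only OPENAI_API_KEY appears in this diff)
export OPENAI_API_BASE=https://api.deepseek.com/v1
export OPENAI_API_KEY=sk-xxxx
```

Run these before `streamlit run pdf_translator_web.py`, as the README instructs.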