Changed default requirements to the CPU version of llama.cpp. Added Gemini 2.0 Flash to the model list. Output files now contain only the final files. b0e08c8 seanpedrickcase committed on Mar 3
Added support for using local models (specifically Gemma 2B) for topic extraction and summarisation. Generally improved output format safeguards. b7f4700 seanpedrickcase committed on Dec 10, 2024
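A minimal sketch of how a local Gemma 2B model might be driven for summarisation through llama-cpp-python (the CPU build referenced in the Mar 3 commit). The model file name, prompt, and helper function are illustrative assumptions, not the repository's actual code.

```python
# Sketch: load a locally downloaded, quantised Gemma 2B GGUF model with
# llama-cpp-python (CPU build) and ask it for a short topic summary.
# MODEL_PATH and the prompt wording are hypothetical, for illustration only.
from llama_cpp import Llama

MODEL_PATH = "models/gemma-2b-it.Q4_K_M.gguf"  # hypothetical local file name

llm = Llama(
    model_path=MODEL_PATH,
    n_ctx=4096,      # context window for the text being summarised
    n_threads=4,     # CPU threads; tune to the host machine
    verbose=False,
)

def summarise_topics(text: str) -> str:
    """Return a short, bulleted summary of the topics found in `text`."""
    prompt = (
        "Summarise the main topics in the following text as a short bullet list.\n\n"
        f"{text}\n\nSummary:"
    )
    result = llm(prompt, max_tokens=256, temperature=0.1)
    return result["choices"][0]["text"].strip()

if __name__ == "__main__":
    print(summarise_topics("Example consultation response text goes here."))
```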
Added more guidance to the README. Variables are now wiped on clicking to create or summarise topics. f8f34c2 seanpedrickcase committed on Dec 4, 2024