oopere committed on
Commit
93b8166
·
verified ·
1 Parent(s): ceb2073

Update README.md

Files changed (1)
  1. README.md +2 -1
README.md CHANGED
@@ -48,7 +48,8 @@ This model is not intended to be used directly, but rather to be fine-tuned for
 ### Implementation Details
 - **Pruning Notebook:** [Detailed implementation and methodology](https://github.com/peremartra/Large-Language-Model-Notebooks-Course/blob/main/6-PRUNING/6_3_pruning_structured_llama3.2-1b_OK.ipynb)
 - **GitHub Repository:** [LLM Course](https://github.com/peremartra/Large-Language-Model-Notebooks-Course)
-
+- **Article explaining pruning methodology:** [How to Prune LLaMA 3.2 and Similar Large Language Models](https://towardsdatascience.com/how-to-prune-llama-3-2-and-similar-large-language-models-cf18e9a2afb6?sk=af4c5e40e967437325050f019b3ae606)
+-
 ### Pruning Method
 - **Technique:** Structured pruning targeting MLP layers
 - **Pruning Ratio:** 40% of neurons removed from MLP layers
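The "Pruning Method" section of the README describes structured pruning that removes 40% of the neurons in MLP layers. The notebook linked above contains the actual methodology; as a rough illustration only, the idea can be sketched in PyTorch as below. The function name `prune_mlp` and the choice of L1 weight norm as the neuron-importance score are assumptions for this sketch, not necessarily what the notebook uses.

```python
import torch
import torch.nn as nn

def prune_mlp(up: nn.Linear, down: nn.Linear, ratio: float = 0.4):
    """Structured-pruning sketch: remove `ratio` of the hidden neurons
    shared by an up-projection and a down-projection.

    Neurons are ranked by the L1 norm of their up-projection rows
    (an assumed importance metric, for illustration only)."""
    hidden = up.out_features
    keep = hidden - int(hidden * ratio)
    # One importance score per hidden neuron (row of the up-projection).
    scores = up.weight.abs().sum(dim=1)
    # Indices of the surviving neurons, kept in their original order.
    idx = torch.topk(scores, keep).indices.sort().values
    new_up = nn.Linear(up.in_features, keep, bias=up.bias is not None)
    new_down = nn.Linear(keep, down.out_features, bias=down.bias is not None)
    with torch.no_grad():
        # Copy the surviving rows/columns into the smaller layers.
        new_up.weight.copy_(up.weight[idx])
        if up.bias is not None:
            new_up.bias.copy_(up.bias[idx])
        new_down.weight.copy_(down.weight[:, idx])
        if down.bias is not None:
            new_down.bias.copy_(down.bias)
    return new_up, new_down

# Toy example: a 100-neuron hidden layer pruned at 40% keeps 60 neurons.
up, down = nn.Linear(16, 100), nn.Linear(100, 16)
p_up, p_down = prune_mlp(up, down, ratio=0.4)
```

Pruning whole neurons (rather than individual weights) keeps the resulting layers dense, so the smaller model runs without sparse kernels; a real LLaMA MLP has gate, up, and down projections, all of which must be pruned with the same neuron indices.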