---
model_size: 494034560
required_memory: 1.84
metrics:
- GLUE_MRPC
license: apache-2.0
datasets:
- jtatman/python-code-dataset-500k
- Vezora/Tested-143k-Python-Alpaca
language:
- en
- es
base_model: meta-llama/Meta-Llama-3.1-8B-Instruct
library_name: adapter-transformers
tags:
- code
- python
- tiny
- open
- mini
- minitron
- tinytron
---

# Uploaded model

[<img src="https://github.githubassets.com/assets/GitHub-Mark-ea2971cee799.png" width="100"/><img src="https://github.githubassets.com/assets/GitHub-Logo-ee398b662d42.png" width="100"/>](https://github.com/Agnuxo1)
- **Developed by:** [Agnuxo](https://github.com/Agnuxo1)
- **License:** apache-2.0
- **Finetuned from model:** Agnuxo/Tinytron-Meta-Llama-3.1-8B-Instruct

This model was fine-tuned using [Unsloth](https://github.com/unslothai/unsloth) and Hugging Face's TRL library.

[<img src="https://raw.githubusercontent.com/unslothai/unsloth/main/images/unsloth%20made%20with%20love.png" width="200"/>](https://github.com/unslothai/unsloth)
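
As a quick-start, the sketch below shows one way to load and run the adapter with Hugging Face `transformers` and `peft`. It is a minimal sketch, assuming this repository hosts a PEFT/LoRA adapter over the Llama 3.1 8B Instruct base and that a tokenizer is saved alongside it; the repository id in the snippet is a placeholder, not a confirmed id.

```python
# Minimal usage sketch (assumption: this repo hosts a PEFT/LoRA adapter for the
# Llama 3.1 8B Instruct base model; replace the placeholder repo id below with
# this repository's actual id on the Hugging Face Hub).
from transformers import AutoTokenizer
from peft import AutoPeftModelForCausalLM

adapter_repo = "Agnuxo/<this-adapter-repo>"  # placeholder, hypothetical id

# AutoPeftModelForCausalLM reads the base model recorded in the adapter config,
# loads it, and applies the adapter weights on top.
model = AutoPeftModelForCausalLM.from_pretrained(adapter_repo, device_map="auto")
tokenizer = AutoTokenizer.from_pretrained(adapter_repo)

prompt = "Write a Python function that reverses a string."
inputs = tokenizer(prompt, return_tensors="pt").to(model.device)
outputs = model.generate(**inputs, max_new_tokens=128)
print(tokenizer.decode(outputs[0], skip_special_tokens=True))
```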