---
library_name: transformers
license: mit
datasets:
- MattCoddity/dockerNLcommands
language:
- en
old_version: Bruce1489/Llama3.2-docker-command-v2
---

# Model Card for Llama3.2-docker-command-v3

This version improves accuracy over V2.

The model was fine-tuned from `meta-llama/Llama-3.2-1B-Instruct` using the QLoRA technique.
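
For reference, a QLoRA setup along the following lines could be used for this kind of fine-tuning. The hyperparameters below (rank, alpha, target modules, etc.) are illustrative assumptions, as the exact training configuration is not published in this card.

```python
import torch
from transformers import AutoModelForCausalLM, BitsAndBytesConfig
from peft import LoraConfig, get_peft_model, prepare_model_for_kbit_training

# 4-bit NF4 quantization of the frozen base model (the "Q" in QLoRA).
bnb_config = BitsAndBytesConfig(
    load_in_4bit=True,
    bnb_4bit_quant_type="nf4",
    bnb_4bit_compute_dtype=torch.bfloat16,
)

base = AutoModelForCausalLM.from_pretrained(
    "meta-llama/Llama-3.2-1B-Instruct",
    quantization_config=bnb_config,
    device_map="auto",
)
base = prepare_model_for_kbit_training(base)

# LoRA adapters on the attention projections; rank/alpha values are hypothetical.
lora_config = LoraConfig(
    r=16,
    lora_alpha=32,
    lora_dropout=0.05,
    target_modules=["q_proj", "k_proj", "v_proj", "o_proj"],
    task_type="CAUSAL_LM",
)
model = get_peft_model(base, lora_config)
model.print_trainable_parameters()  # only the adapter weights are trainable
```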

The resulting model generates Docker commands from natural-language requests.

- **Developed by:** Bruce1489 (정성훈, Bruce Jung)
- **License:** MIT
- **Finetuned from model:** meta-llama/Llama-3.2-1B-Instruct

### Direct Use

```python
import torch
from transformers import pipeline

# Load the fine-tuned model as a chat-style text-generation pipeline.
pipe = pipeline(
    "text-generation",
    model="Bruce1489/Llama3.2-docker-command-v3",
    torch_dtype=torch.bfloat16,
    device_map="auto",
)

messages = [
    {"role": "system", "content": "translate this sentence in docker command"},
    {"role": "user", "content": "Please show me the Docker containers that have exited and are related to the mongo image."},
]

outputs = pipe(
    messages,
    max_new_tokens=256,
    temperature=0.5,
    top_p=0.9,
    top_k=10,
    do_sample=True,
    repetition_penalty=1.2,
)

# The pipeline returns the full chat; the last message is the generated Docker command.
print(outputs[0]["generated_text"][-1]["content"])
```
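
If you prefer not to use `pipeline`, the same generation can be done with the tokenizer's chat template. This is a minimal sketch assuming the same repo id; the user request and sampling values are illustrative, not recommended settings.

```python
import torch
from transformers import AutoModelForCausalLM, AutoTokenizer

model_id = "Bruce1489/Llama3.2-docker-command-v3"
tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForCausalLM.from_pretrained(
    model_id, torch_dtype=torch.bfloat16, device_map="auto"
)

messages = [
    {"role": "system", "content": "translate this sentence in docker command"},
    {"role": "user", "content": "List all running containers."},  # illustrative request
]

# Build the chat prompt and generate a completion.
input_ids = tokenizer.apply_chat_template(
    messages, add_generation_prompt=True, return_tensors="pt"
).to(model.device)

with torch.no_grad():
    output_ids = model.generate(
        input_ids, max_new_tokens=256, do_sample=True, temperature=0.5, top_p=0.9
    )

# Decode only the newly generated tokens, i.e. the Docker command.
print(tokenizer.decode(output_ids[0, input_ids.shape[-1]:], skip_special_tokens=True))
```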