
This model generates solutions to LeetCode problems.

Training data: codellama/CodeLlama-7b-Instruct-hf

Base model: codellama/CodeLlama-7b-Instruct-hf

More information: https://github.com/khaimt/coding_challenge_solver

The prompt template is:

````python
prompt_str = (
    "[INST] Write code to solve the following coding problem that obeys "
    "the constraints and passes the example test cases. "
    f"Please wrap your code answer using ```:\n{input}\n[/INST]```python\n"
)
````

Here `input` is the LeetCode problem statement; an example is: https://github.com/khaimt/coding_challenge_solver/blob/main/test_cases/problem1.txt
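
As a minimal illustration (using a toy two-sum statement standing in for a real LeetCode problem), filling the template produces a prompt like this:

````python
# Toy problem statement standing in for a real LeetCode problem.
input = (
    "Given an array of integers nums and an integer target, "
    "return indices of the two numbers that add up to target."
)

prompt_str = (
    "[INST] Write code to solve the following coding problem that obeys "
    "the constraints and passes the example test cases. "
    f"Please wrap your code answer using ```:\n{input}\n[/INST]```python\n"
)

print(prompt_str)
````

The prompt deliberately ends with an open ` ```python ` fence, so the model continues directly with Python code.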

Example for inference:

````python
import torch
from transformers import AutoModelForCausalLM, AutoTokenizer

model_path = "..."  # local path or Hub ID of this model
input = "..."       # the LeetCode problem statement, e.g. the contents of test_cases/problem1.txt

prompt_str = (
    "[INST] Write code to solve the following coding problem that obeys "
    "the constraints and passes the example test cases. "
    f"Please wrap your code answer using ```:\n{input}\n[/INST]```python\n"
)

tokenizer = AutoTokenizer.from_pretrained(model_path)
model = AutoModelForCausalLM.from_pretrained(
    model_path, device_map="auto", torch_dtype=torch.bfloat16
)

token_ids = tokenizer([prompt_str], return_tensors="pt")["input_ids"]
token_ids = token_ids.to(model.device)
# do_sample=True with temperature=0.0001 is effectively greedy decoding
outputs = model.generate(
    input_ids=token_ids, max_new_tokens=1024, do_sample=True, temperature=0.0001
)
all_token_ids = outputs[0].tolist()
output_token_ids = all_token_ids[token_ids.shape[-1]:]
output = tokenizer.decode(output_token_ids, skip_special_tokens=True)
print("-------------Solution generated from Model---------")
print(output)
````
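
Because the prompt opens a ` ```python ` fence and asks the model to wrap its answer in ` ``` `, the decoded output typically ends with a closing fence, possibly followed by commentary. A small post-processing sketch (my own addition, not part of the original card) can cut the output at that fence to recover just the code:

```python
def extract_code(generated: str) -> str:
    """Return the text before the first closing ``` fence, if any."""
    fence = generated.find("```")
    return (generated[:fence] if fence != -1 else generated).strip()

# Example on a mock completion shaped like the model's output:
sample = "def two_sum(nums, target):\n    ...\n```\nSome trailing explanation"
print(extract_code(sample))
```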