---
license: apache-2.0
---

# FINGU-AI/QL-Specilize-78b

## Overview

`FINGU-AI/QL-Specilize-78b` is a powerful causal language model designed for a variety of natural language processing (NLP) tasks, including machine translation, text generation, and chat-based applications. The model is particularly useful for translating between languages, and it supports other custom NLP tasks through flexible, prompt-based input.

## Example Usage

### Installation

Make sure to install the required packages:

```bash
pip install torch transformers
```

### Loading the Model

```python
from transformers import AutoTokenizer, AutoModelForCausalLM
import torch

# Model and tokenizer
model_id = 'FINGU-AI/QL-Specilize-78b'
model = AutoModelForCausalLM.from_pretrained(
    model_id,
    attn_implementation="sdpa",
    torch_dtype=torch.float16,
    device_map='auto',  # places the model across available devices; no manual .to('cuda') needed
)
tokenizer = AutoTokenizer.from_pretrained(model_id)

# Input messages in chat format
messages = [
    {"role": "user", "content": "what is Machine learning?"}
]

# Apply the chat template, tokenize, and generate a response
input_ids = tokenizer.apply_chat_template(
    messages,
    add_generation_prompt=True,
    return_tensors="pt"
).to(model.device)

outputs = model.generate(
    input_ids,
    max_new_tokens=500,
    do_sample=True,
)

# Decode and print only the newly generated tokens
response = outputs[0][input_ids.shape[-1]:]
print(tokenizer.decode(response, skip_special_tokens=True))
```
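The final slicing step above works because `generate` returns the prompt tokens followed by the newly generated tokens in a single sequence, so the prompt must be stripped before decoding. A minimal sketch with made-up dummy token IDs (not real model output) illustrates the indexing:

```python
import torch

# Hypothetical tensors standing in for real tokenizer/generate output:
# a 3-token prompt, and a generate() result that echoes the prompt
# followed by 3 newly generated tokens.
input_ids = torch.tensor([[101, 102, 103]])
outputs = torch.tensor([[101, 102, 103, 7, 8, 9]])

# Keep only the tokens produced after the prompt.
response = outputs[0][input_ids.shape[-1]:]
print(response.tolist())  # [7, 8, 9]
```

Passing `response` (rather than the full `outputs[0]`) to `tokenizer.decode` is what keeps the user's question out of the printed answer.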