NanoChat D34 SFT - HuggingFace Format

This is pankajmathur/nanochat-d34-finetuned converted to the HuggingFace Transformers format.

Model Description

  • Model type: Causal Language Model
  • Model size: 2B parameters (BF16 safetensors)
  • Language: English
  • License: Apache 2.0
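
As a rough rule of thumb, the memory needed just to load the weights is the parameter count times the bytes per parameter. A back-of-the-envelope sketch for this 2B-parameter BF16 checkpoint (2B is the rounded figure above, not an exact count):

```python
# Rough weight-memory estimate: parameters x bytes per parameter.
params = 2_000_000_000        # rounded parameter count from the model card
bytes_per_param = 2           # bfloat16 = 16 bits = 2 bytes
weight_bytes = params * bytes_per_param
print(f"~{weight_bytes / 1024**3:.1f} GiB of weights")  # ~3.7 GiB
```

Actual memory use at inference time is higher once activations and the KV cache are included.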

Usage

Install the Transformers library from GitHub (a recent build that includes NanoChat support):

!pip install -q git+https://github.com/huggingface/transformers.git

Use the dedicated NanoChatForCausalLM model class and the PreTrainedTokenizerFast tokenizer from the Transformers library:

import torch
from transformers import NanoChatForCausalLM, PreTrainedTokenizerFast

# Load the converted model and tokenizer
tokenizer = PreTrainedTokenizerFast.from_pretrained("pankajmathur/nanochat-d34-sft-hf")
model = NanoChatForCausalLM.from_pretrained(
    "pankajmathur/nanochat-d34-sft-hf",
    torch_dtype=torch.bfloat16,
    device_map="auto"
)

# Generate text
prompt = "Hello, who are you?"

inputs = tokenizer(prompt, return_tensors="pt")
input_ids = inputs["input_ids"].to(model.device)

with torch.no_grad():
    outputs = model.generate(
        input_ids,
        max_new_tokens=100,
        do_sample=True,
        temperature=0.7,
        top_p=0.9,
        pad_token_id=tokenizer.eos_token_id
    )

response = tokenizer.decode(outputs[0], skip_special_tokens=True)
print(f"🤖 Response:\n{response}")
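
Note that for decoder-only models, model.generate returns the prompt tokens followed by the continuation, so decoding outputs[0] repeats the prompt in the printed response. To keep only the model's reply, slice off the prompt before decoding. A minimal sketch with made-up token ids standing in for real tokenizer output:

```python
# Hypothetical token ids: the first four are the prompt, the rest were generated.
prompt_ids = [101, 7592, 1010, 2040]
output_ids = [101, 7592, 1010, 2040, 1045, 2572, 1037, 2944]

# generate() echoes the prompt, so drop the first len(prompt_ids) tokens.
new_ids = output_ids[len(prompt_ids):]
print(new_ids)  # → [1045, 2572, 1037, 2944]
```

In the usage example above, the tensor equivalent is decoding outputs[0][input_ids.shape[1]:] instead of outputs[0].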

Citation

If you use this model, please cite accordingly.
