leafspark/SFR-Iterative-DPO-LLaMA-3-8B-R-lora

Tags: Transformers · Safetensors · PEFT · mergekit
Files and versions (176 MB · 1 contributor · 3 commits)
Latest commit: a7545cf (verified), "model: upload adapter" by leafspark, over 1 year ago
  • .gitattributes (1.52 kB) · initial commit · over 1 year ago
  • README.md (557 Bytes) · readme: add model card · over 1 year ago
  • adapter_config.json (712 Bytes) · model: upload adapter · over 1 year ago
  • adapter_model.safetensors (176 MB) · model: upload adapter · over 1 year ago
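
The repository contains a standard PEFT LoRA adapter (adapter_config.json plus adapter_model.safetensors), so it is loaded on top of a base model rather than on its own. Below is a minimal sketch using Transformers and PEFT; the base model id is an assumption (the listing does not name the base the adapter was extracted against), so substitute the correct repository id for your setup.

    import torch
    from transformers import AutoModelForCausalLM, AutoTokenizer
    from peft import PeftModel

    # Assumption: the LoRA was extracted against a Llama-3-8B base; adjust as needed.
    base_id = "meta-llama/Meta-Llama-3-8B-Instruct"
    adapter_id = "leafspark/SFR-Iterative-DPO-LLaMA-3-8B-R-lora"

    tokenizer = AutoTokenizer.from_pretrained(base_id)
    base_model = AutoModelForCausalLM.from_pretrained(
        base_id, torch_dtype=torch.bfloat16, device_map="auto"
    )

    # Attach the adapter weights (adapter_config.json + adapter_model.safetensors).
    model = PeftModel.from_pretrained(base_model, adapter_id)

    inputs = tokenizer("Hello!", return_tensors="pt").to(model.device)
    outputs = model.generate(**inputs, max_new_tokens=64)
    print(tokenizer.decode(outputs[0], skip_special_tokens=True))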