AhmedCodes64/Merged_Model_DPO_v2
Tags: PEFT · Safetensors · arxiv:1910.09700
Branch: main
Repository size: 462 MB · 1 contributor · 2 commits
Latest commit: 402bb48 (verified), "Upload 15 files" by AhmedCodes64, 10 months ago
All files below were added in the "Upload 15 files" commit, 10 months ago. Files marked (xet) are stored via Xet.

| File | Size |
|---|---|
| .gitattributes | 1.57 kB |
| README.md | 5.13 kB |
| adapter_config.json | 828 Bytes |
| adapter_model.safetensors | 295 MB (xet) |
| added_tokens.json | 605 Bytes |
| merges.txt | 1.67 MB |
| optimizer.pt | 150 MB (xet) |
| rng_state.pth | 14.2 kB (xet) |
| scaler.pt | 988 Bytes (xet) |
| scheduler.pt | 1.06 kB (xet) |
| special_tokens_map.json | 614 Bytes |
| tokenizer.json | 11.4 MB (xet) |
| tokenizer_config.json | 7.36 kB |
| trainer_state.json | 85.7 kB |
| training_args.bin | 6.26 kB (xet) |
| vocab.json | 2.78 MB |