# roberta-large-ner-ghtk-cs-add-label-new-data-3090-5Sep-1

This model was trained from scratch on an unspecified dataset (listed as `None` in the training config). It achieves the following results on the evaluation set:
- Loss: 0.2936
| Label | Precision | Recall | F1 | Support |
|---|---|---|---|---|
| Tk | 0.8476 | 0.7672 | 0.8054 | 116 |
| Gày | 0.7436 | 0.8529 | 0.7945 | 34 |
| Gày trừu tượng | 0.8998 | 0.9201 | 0.9098 | 488 |
| Iờ | 0.6512 | 0.7368 | 0.6914 | 38 |
| Ã đơn | 0.8719 | 0.8719 | 0.8719 | 203 |
| Đt | 0.9341 | 0.9852 | 0.9590 | 878 |
| Đt trừu tượng | 0.8246 | 0.8069 | 0.8156 | 233 |
- Overall Precision: 0.8933
- Overall Recall: 0.9171
- Overall F1: 0.9050
- Overall Accuracy: 0.9643
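The per-label numbers are standard entity-level metrics: precision = TP / (TP + FP), recall = TP / (TP + FN), F1 their harmonic mean, and the support is the count of gold spans. A minimal sketch of that computation; the Tk counts below (89 correct of 105 predicted and 116 gold spans) are back-solved from the reported scores for illustration, not taken from the eval logs:

```python
def prf(tp: int, predicted: int, gold: int):
    """Entity-level metrics: predicted = TP + FP, gold = TP + FN (the support)."""
    precision = tp / predicted
    recall = tp / gold
    f1 = 2 * precision * recall / (precision + recall)
    return precision, recall, f1

# Tk on the final evaluation set (counts back-solved, illustrative):
p, r, f1 = prf(tp=89, predicted=105, gold=116)
# p ≈ 0.8476, r ≈ 0.7672, f1 ≈ 0.8054 — matching the Tk row above
```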
## Model description

More information needed

## Intended uses & limitations

More information needed

## Training and evaluation data

More information needed
## Training procedure

### Training hyperparameters

The following hyperparameters were used during training:
- learning_rate: 2.5e-05
- train_batch_size: 8
- eval_batch_size: 8
- seed: 42
- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
- lr_scheduler_type: linear
- num_epochs: 10
### Training results

Per-label cells show precision / recall / F1 (supports: Tk 116, Gày 34, Gày trừu tượng 488, Iờ 38, Ã đơn 203, Đt 878, Đt trừu tượng 233).

| Training Loss | Epoch | Step | Validation Loss | Tk | Gày | Gày trừu tượng | Iờ | Ã đơn | Đt | Đt trừu tượng | Overall Precision | Overall Recall | Overall F1 | Overall Accuracy |
|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|
| 0.0816 | 1.0 | 531 | 0.1918 | 0.750 / 0.672 / 0.709 | 0.788 / 0.765 / 0.776 | 0.865 / 0.934 / 0.899 | 0.509 / 0.763 / 0.611 | 0.835 / 0.823 / 0.829 | 0.913 / 0.990 / 0.950 | 0.793 / 0.790 / 0.791 | 0.8594 | 0.9090 | 0.8835 | 0.9574 |
| 0.049 | 2.0 | 1062 | 0.2130 | 0.885 / 0.664 / 0.759 | 0.737 / 0.824 / 0.778 | 0.908 / 0.910 / 0.909 | 0.526 / 0.789 / 0.632 | 0.789 / 0.882 / 0.833 | 0.926 / 0.994 / 0.959 | 0.809 / 0.854 / 0.831 | 0.8769 | 0.9196 | 0.8977 | 0.9509 |
| 0.0372 | 3.0 | 1593 | 0.2337 | 0.806 / 0.716 / 0.758 | 0.744 / 0.853 / 0.795 | 0.879 / 0.934 / 0.906 | 0.508 / 0.868 / 0.641 | 0.872 / 0.842 / 0.857 | 0.919 / 0.994 / 0.955 | 0.659 / 0.897 / 0.760 | 0.8470 | 0.9317 | 0.8873 | 0.9602 |
| 0.0228 | 4.0 | 2124 | 0.2282 | 0.791 / 0.621 / 0.696 | 0.684 / 0.765 / 0.722 | 0.902 / 0.926 / 0.914 | 0.577 / 0.789 / 0.667 | 0.868 / 0.808 / 0.837 | 0.918 / 0.990 / 0.952 | 0.730 / 0.871 / 0.795 | 0.8664 | 0.9126 | 0.8889 | 0.9610 |
| 0.0178 | 5.0 | 2655 | 0.2266 | 0.862 / 0.810 / 0.836 | 0.725 / 0.853 / 0.784 | 0.877 / 0.920 / 0.898 | 0.684 / 0.342 / 0.456 | 0.845 / 0.862 / 0.854 | 0.943 / 0.990 / 0.966 | 0.820 / 0.803 / 0.811 | 0.8915 | 0.9126 | 0.9019 | 0.9625 |
| 0.015 | 6.0 | 3186 | 0.2717 | 0.845 / 0.845 / 0.845 | 0.694 / 0.735 / 0.714 | 0.909 / 0.926 / 0.918 | 0.690 / 0.526 / 0.597 | 0.881 / 0.837 / 0.859 | 0.941 / 0.985 / 0.963 | 0.836 / 0.742 / 0.786 | 0.9029 | 0.9060 | 0.9044 | 0.9637 |
| 0.0094 | 7.0 | 3717 | 0.2548 | 0.800 / 0.862 / 0.830 | 0.744 / 0.853 / 0.795 | 0.915 / 0.922 / 0.918 | 0.571 / 0.737 / 0.644 | 0.869 / 0.847 / 0.858 | 0.949 / 0.990 / 0.969 | 0.804 / 0.811 / 0.808 | 0.8944 | 0.9231 | 0.9085 | 0.9642 |
| 0.0061 | 8.0 | 4248 | 0.2922 | 0.862 / 0.810 / 0.836 | 0.763 / 0.853 / 0.806 | 0.892 / 0.932 / 0.912 | 0.630 / 0.763 / 0.690 | 0.830 / 0.892 / 0.860 | 0.935 / 0.990 / 0.962 | 0.777 / 0.854 / 0.814 | 0.8813 | 0.9327 | 0.9062 | 0.9635 |
| 0.0038 | 9.0 | 4779 | 0.2873 | 0.856 / 0.767 / 0.809 | 0.744 / 0.853 / 0.795 | 0.900 / 0.922 / 0.911 | 0.636 / 0.737 / 0.683 | 0.884 / 0.867 / 0.876 | 0.935 / 0.990 / 0.962 | 0.784 / 0.858 / 0.820 | 0.8894 | 0.9251 | 0.9069 | 0.9650 |
| 0.0025 | 10.0 | 5310 | 0.2936 | 0.848 / 0.767 / 0.805 | 0.744 / 0.853 / 0.795 | 0.900 / 0.920 / 0.910 | 0.651 / 0.737 / 0.691 | 0.872 / 0.872 / 0.872 | 0.934 / 0.985 / 0.959 | 0.825 / 0.807 / 0.816 | 0.8933 | 0.9171 | 0.9050 | 0.9643 |
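The overall scores are micro-averages over all entity spans (sum the counts first, then divide), not a simple mean of the per-label scores. A sketch recovering the epoch-10 overall precision, recall, and F1; the per-label counts are back-solved from the reported scores and supports, for illustration only:

```python
# (label, true positives, predicted spans, gold spans) — back-solved from the epoch-10 row
counts = [
    ("Tk", 89, 105, 116),
    ("Gày", 29, 39, 34),
    ("Gày trừu tượng", 449, 499, 488),
    ("Iờ", 28, 43, 38),
    ("Ã đơn", 177, 203, 203),
    ("Đt", 865, 926, 878),
    ("Đt trừu tượng", 188, 228, 233),
]

# Micro-averaging: pool the counts across labels before dividing.
tp = sum(c[1] for c in counts)
predicted = sum(c[2] for c in counts)
gold = sum(c[3] for c in counts)

micro_p = tp / predicted            # ≈ 0.8933 (Overall Precision)
micro_r = tp / gold                 # ≈ 0.9171 (Overall Recall)
micro_f1 = 2 * tp / (predicted + gold)  # ≈ 0.9050 (Overall F1)
```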
### Framework versions
- Transformers 4.44.0
- Pytorch 2.3.1+cu121
- Datasets 2.19.1
- Tokenizers 0.19.1