End of training
README.md
CHANGED
@@ -1,11 +1,11 @@
 ---
-base_model: microsoft/git-large-r-coco
-datasets:
-- imagefolder
 library_name: transformers
 license: mit
+base_model: microsoft/git-large-r-coco
 tags:
 - generated_from_trainer
+datasets:
+- imagefolder
 model-index:
 - name: git-large-r-coco-IDB_ADv1_COCOv6-r
   results: []
@@ -18,8 +18,8 @@ should probably proofread and complete it, then remove this comment. -->
 
 This model is a fine-tuned version of [microsoft/git-large-r-coco](https://huggingface.co/microsoft/git-large-r-coco) on the imagefolder dataset.
 It achieves the following results on the evaluation set:
-- Loss: 0.
-- Meteor Score: {'meteor': 0.
+- Loss: 0.0759
+- Meteor Score: {'meteor': 0.5120753072373921}
 
 ## Model description
 
@@ -47,39 +47,43 @@ The following hyperparameters were used during training:
 - optimizer: Use adamw_torch with betas=(0.9,0.999) and epsilon=1e-08 and optimizer_args=No additional optimizer arguments
 - lr_scheduler_type: cosine
 - lr_scheduler_warmup_steps: 5
-- num_epochs:
+- num_epochs: 150
 - mixed_precision_training: Native AMP
 
 ### Training results
 
-| Training Loss | Epoch | Step | Validation Loss | Meteor Score
-| … |
+| Training Loss | Epoch | Step | Validation Loss | Meteor Score |
+|:-------------:|:-------:|:----:|:---------------:|:------------------------------:|
+| 0.2014 | 3.0 | 5 | 0.0722 | {'meteor': 0.5065157314467645} |
+| 0.2196 | 6.6667 | 10 | 0.0627 | {'meteor': 0.5305128358337737} |
+| 0.1292 | 10.0 | 15 | 0.0594 | {'meteor': 0.5263979985260087} |
+| 0.0872 | 13.3333 | 20 | 0.0605 | {'meteor': 0.5108520041090149} |
+| 0.0818 | 17.0 | 25 | 0.0582 | {'meteor': 0.5113838555978679} |
+| 0.0531 | 20.0 | 30 | 0.0621 | {'meteor': 0.5103098091443673} |
+| 0.0443 | 23.0 | 35 | 0.0632 | {'meteor': 0.5018489405290812} |
+| 0.0489 | 26.6667 | 40 | 0.0639 | {'meteor': 0.5119666957218931} |
+| 0.0315 | 30.0 | 45 | 0.0648 | {'meteor': 0.5128484629162279} |
+| 0.0245 | 33.3333 | 50 | 0.0674 | {'meteor': 0.5114053511060893} |
+| 0.0213 | 37.0 | 55 | 0.0689 | {'meteor': 0.5103811878007981} |
+| 0.0108 | 40.0 | 60 | 0.0704 | {'meteor': 0.5080035696805529} |
+| 0.0089 | 43.0 | 65 | 0.0712 | {'meteor': 0.5235500043256491} |
+| 0.0088 | 46.6667 | 70 | 0.0730 | {'meteor': 0.5162015377767581} |
+| 0.0079 | 50.0 | 75 | 0.0704 | {'meteor': 0.508295723546233} |
+| 0.0055 | 53.3333 | 80 | 0.0727 | {'meteor': 0.5093121829271419} |
+| 0.0049 | 57.0 | 85 | 0.0739 | {'meteor': 0.5119514909960822} |
+| 0.0033 | 60.0 | 90 | 0.0749 | {'meteor': 0.5106012588241947} |
+| 0.0033 | 63.0 | 95 | 0.0751 | {'meteor': 0.5154173493739934} |
+| 0.0039 | 66.6667 | 100 | 0.0753 | {'meteor': 0.5147237489602302} |
+| 0.003 | 70.0 | 105 | 0.0758 | {'meteor': 0.5141283140032965} |
+| 0.0031 | 73.3333 | 110 | 0.0759 | {'meteor': 0.5147414640891067} |
+| 0.0033 | 77.0 | 115 | 0.0759 | {'meteor': 0.5143698813094443} |
+| 0.0026 | 80.0 | 120 | 0.0758 | {'meteor': 0.5133582948399001} |
+| 0.0025 | 83.0 | 125 | 0.0758 | {'meteor': 0.5121935317618276} |
+| 0.003 | 86.6667 | 130 | 0.0759 | {'meteor': 0.5121665250079868} |
+| 0.0026 | 90.0 | 135 | 0.0759 | {'meteor': 0.5116741831961337} |
+| 0.0026 | 93.3333 | 140 | 0.0759 | {'meteor': 0.512031945155839} |
+| 0.003 | 97.0 | 145 | 0.0759 | {'meteor': 0.5120926382524592} |
+| 0.0024 | 100.0 | 150 | 0.0759 | {'meteor': 0.5120753072373921} |
 
 
 ### Framework versions
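
The updated card reports evaluation numbers but no inference snippet. Below is a minimal captioning sketch with `transformers`, assuming the checkpoint is published under the model-index name `git-large-r-coco-IDB_ADv1_COCOv6-r` in the uploader's namespace (the namespace is not shown in this diff) and using a placeholder image path:

```python
# Minimal captioning sketch for this GIT-style checkpoint (repo id and image path are placeholders).
from PIL import Image
from transformers import AutoModelForCausalLM, AutoProcessor

repo_id = "<namespace>/git-large-r-coco-IDB_ADv1_COCOv6-r"  # assumption: uploader's namespace
processor = AutoProcessor.from_pretrained(repo_id)
model = AutoModelForCausalLM.from_pretrained(repo_id)

image = Image.open("example.jpg").convert("RGB")  # placeholder input image
pixel_values = processor(images=image, return_tensors="pt").pixel_values

generated_ids = model.generate(pixel_values=pixel_values, max_length=50)
caption = processor.batch_decode(generated_ids, skip_special_tokens=True)[0]
print(caption)
```

GIT checkpoints are decoder-only captioners, so generation here is conditioned only on `pixel_values`; no text prompt is required.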
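The Meteor Score entries above are dictionaries of the form `{'meteor': ...}`, which is what the 🤗 `evaluate` package returns. A sketch of how such a score is typically computed for generated captions (the strings below are placeholders, not outputs of this run):

```python
# Sketch: METEOR as reported above ({'meteor': ...}) via the `evaluate` package.
import evaluate

meteor = evaluate.load("meteor")  # requires nltk data; downloaded on first use

predictions = ["a dog runs across the field"]        # placeholder generated caption
references = [["a dog is running through a field"]]  # placeholder reference caption(s)

result = meteor.compute(predictions=predictions, references=references)
print(result)  # same dict shape as in the results table, e.g. {'meteor': 0.7...}
```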
model.safetensors
CHANGED
@@ -1,3 +1,3 @@
 version https://git-lfs.github.com/spec/v1
-oid sha256:
+oid sha256:1c7a479924302d23342c70961a85921db90ff2525c3d19a5d1f5069a2fae5ddd
 size 1576851440
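The `model.safetensors` entry is a Git LFS pointer (`version`, `oid sha256:<hash>`, `size` in bytes). A small sketch for checking that a downloaded copy of the weights matches the new pointer's hash:

```python
# Verify a downloaded model.safetensors against the LFS pointer's sha256 oid.
import hashlib

expected = "1c7a479924302d23342c70961a85921db90ff2525c3d19a5d1f5069a2fae5ddd"

sha = hashlib.sha256()
with open("model.safetensors", "rb") as f:            # assumes the file is in the working directory
    for chunk in iter(lambda: f.read(1 << 20), b""):  # hash in 1 MiB chunks to bound memory use
        sha.update(chunk)

print(sha.hexdigest() == expected)  # True if the download matches the pointer
```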
runs/Jul15_12-45-47_OZPC/events.out.tfevents.1752605150.OZPC.19756.2
CHANGED
@@ -1,3 +1,3 @@
 version https://git-lfs.github.com/spec/v1
-oid sha256:
-size
+oid sha256:b476661e27ff86709510980057584e4b0e22b2811bac1f37fabe2b3abd5b26fb
+size 19961
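The updated run file is a TensorBoard event log written during training. A sketch for inspecting its scalars with TensorBoard's `EventAccumulator`; the available tag names depend on what the Trainer logged, so the code simply lists whatever the file contains:

```python
# Sketch: inspect scalar summaries in the updated tfevents file.
from tensorboard.backend.event_processing.event_accumulator import EventAccumulator

# Path copied from the diff above; EventAccumulator also accepts the run directory.
ea = EventAccumulator("runs/Jul15_12-45-47_OZPC/events.out.tfevents.1752605150.OZPC.19756.2")
ea.Reload()

scalar_tags = ea.Tags()["scalars"]
print(scalar_tags)  # tag names as logged during training, e.g. losses and metrics

for tag in scalar_tags:
    for event in ea.Scalars(tag):
        print(tag, event.step, event.value)
```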