Upload logs/training_log_step11000.log with huggingface_hub
logs/training_log_step11000.log
ADDED (+453 -0)
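The commit message says the file was uploaded "with huggingface_hub". As an illustrative sketch only (not the author's actual script), this is roughly what that upload looks like, assuming the repo id from the log below and a token already configured via `huggingface-cli login` or `HF_TOKEN`; the helper names are hypothetical:

```python
# Illustrative sketch (not the original script): uploading a training log
# with huggingface_hub. Assumes authentication is already configured.

def log_path_in_repo(local_path: str) -> str:
    """Map a local log path to its destination under logs/ in the repo."""
    return "logs/" + local_path.replace("\\", "/").rsplit("/", 1)[-1]

def upload_training_log(local_path: str, repo_id: str = "ranjan56cse/t5-base-xsum-lora"):
    """Create the repo if needed, then commit the log file (network call)."""
    # Imported lazily so log_path_in_repo stays usable without the package.
    from huggingface_hub import HfApi
    api = HfApi()
    api.create_repo(repo_id, exist_ok=True)  # idempotent, mirrors the log's "Creating repository" step
    return api.upload_file(
        path_or_fileobj=local_path,
        path_in_repo=log_path_in_repo(local_path),
        repo_id=repo_id,
        commit_message=f"Upload {log_path_in_repo(local_path)} with huggingface_hub",
    )
```

Each checkpoint upload seen later in the log would be a similar commit against the same repo.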
| 1 |
+
2025-11-09 19:05:41,604 - INFO -
|
| 2 |
+
ββββββββββββββββββββββββββββββββββββββββββββββββββββββββββββ
|
| 3 |
+
β T5 TRAINING CONFIGURATION β
|
| 4 |
+
ββββββββββββββββββββββββββββββββββββββββββββββββββββββββββββ
|
| 5 |
+
Mode: FULL
|
| 6 |
+
Platform: vast
|
| 7 |
+
Repository: ranjan56cse/t5-base-xsum-lora
|
| 8 |
+
Epochs: 3
|
| 9 |
+
Samples: ALL (204k)
|
| 10 |
+
Batch size: 16
|
| 11 |
+
Gradient accum: 2
|
| 12 |
+
Effective batch: 32
|
| 13 |
+
Save every: 1000 steps
|
| 14 |
+
Expected time: ~8-10 hours
|
| 15 |
+
|
| 16 |
+
2025-11-09 19:05:41,604 - INFO - Creating repository: ranjan56cse/t5-base-xsum-lora
|
| 17 |
+
2025-11-09 19:05:41,807 - INFO - β
Repo: https://huggingface.co/ranjan56cse/t5-base-xsum-lora
|
| 18 |
+
2025-11-09 19:05:41,807 - INFO - Loading google-t5/t5-base...
|
| 19 |
+
2025-11-09 19:05:52,938 - INFO - β
Gradient checkpointing enabled
|
| 20 |
+
2025-11-09 19:05:52,938 - INFO - Applying LoRA...
|
| 21 |
+
2025-11-09 19:05:52,976 - INFO - Loading XSum dataset...
|
| 22 |
+
2025-11-09 19:05:56,588 - INFO - β
Dataset: 204045 train, 11332 val
|
| 23 |
+
2025-11-09 19:05:56,588 - INFO - Tokenizing...
|
| 24 |
+
2025-11-09 19:08:02,802 - INFO - β
Tokenization complete
|
| 25 |
+
2025-11-09 19:08:03,857 - INFO - ============================================================
|
| 26 |
+
2025-11-09 19:08:03,858 - INFO - π STARTING TRAINING (~8-10 hours)
|
| 27 |
+
2025-11-09 19:08:03,859 - INFO - Effective batch size: 32
|
| 28 |
+
2025-11-09 19:08:03,859 - INFO - GPU: 0.84GB allocated, 0.92GB reserved
|
| 29 |
+
2025-11-09 19:08:03,859 - INFO - System: 4.1% used (17.4GB / 503.7GB)
|
| 30 |
+
2025-11-09 19:08:03,859 - INFO - ============================================================
|
| 31 |
+
2025-11-09 19:08:03,990 - INFO - ============================================================
|
| 32 |
+
2025-11-09 19:08:03,990 - INFO - π Training started
|
| 33 |
+
2025-11-09 19:08:03,990 - INFO - Total steps: 19128
|
| 34 |
+
2025-11-09 19:08:03,990 - INFO - GPU: NVIDIA GeForce RTX 3090
|
| 35 |
+
2025-11-09 19:08:03,990 - INFO - GPU Memory: 0.84GB allocated, 0.92GB reserved
|
| 36 |
+
2025-11-09 19:08:03,990 - INFO - System Memory: 4.1% used (17.4GB / 503.7GB)
|
| 37 |
+
2025-11-09 19:08:03,991 - INFO - ============================================================
|
| 38 |
+
2025-11-09 19:08:51,266 - INFO - Step 50/19128 | Loss: 12.5022 | LR: 2.88e-05 | GPU: 0.87GB
|
| 39 |
+
2025-11-09 19:09:38,077 - INFO - Step 100/19128 | Loss: 10.3469 | LR: 5.82e-05 | GPU: 0.87GB
|
| 40 |
+
2025-11-09 19:10:24,938 - INFO - Step 150/19128 | Loss: 4.0200 | LR: 8.82e-05 | GPU: 0.87GB
|
| 41 |
+
2025-11-09 19:11:11,674 - INFO - Step 200/19128 | Loss: 0.9201 | LR: 1.18e-04 | GPU: 0.87GB
|
| 42 |
+
2025-11-09 19:11:58,405 - INFO - Step 250/19128 | Loss: 0.7357 | LR: 1.48e-04 | GPU: 0.87GB
|
| 43 |
+
2025-11-09 19:12:45,152 - INFO - Step 300/19128 | Loss: 0.6602 | LR: 1.77e-04 | GPU: 0.87GB
|
| 44 |
+
2025-11-09 19:13:31,815 - INFO - Step 350/19128 | Loss: 0.6121 | LR: 2.07e-04 | GPU: 0.87GB
|
| 45 |
+
2025-11-09 19:14:18,499 - INFO - Step 400/19128 | Loss: 0.5817 | LR: 2.37e-04 | GPU: 0.87GB
|
| 46 |
+
2025-11-09 19:15:05,185 - INFO - Step 450/19128 | Loss: 0.5916 | LR: 2.67e-04 | GPU: 0.87GB
|
| 47 |
+
2025-11-09 19:15:51,879 - INFO - Step 500/19128 | Loss: 0.5675 | LR: 2.97e-04 | GPU: 0.87GB
|
| 48 |
+
2025-11-09 19:16:38,691 - INFO - Step 550/19128 | Loss: 0.5700 | LR: 2.99e-04 | GPU: 0.87GB
|
| 49 |
+
2025-11-09 19:17:25,546 - INFO - Step 600/19128 | Loss: 0.5610 | LR: 2.98e-04 | GPU: 0.87GB
|
| 50 |
+
2025-11-09 19:18:12,459 - INFO - Step 650/19128 | Loss: 0.5669 | LR: 2.98e-04 | GPU: 0.87GB
|
| 51 |
+
2025-11-09 19:18:59,163 - INFO - Step 700/19128 | Loss: 0.5659 | LR: 2.97e-04 | GPU: 0.87GB
|
| 52 |
+
2025-11-09 19:19:45,942 - INFO - Step 750/19128 | Loss: 0.5673 | LR: 2.96e-04 | GPU: 0.87GB
|
| 53 |
+
2025-11-09 19:20:32,786 - INFO - Step 800/19128 | Loss: 0.5619 | LR: 2.95e-04 | GPU: 0.87GB
|
| 54 |
+
2025-11-09 19:21:19,739 - INFO - Step 850/19128 | Loss: 0.5719 | LR: 2.94e-04 | GPU: 0.87GB
|
| 55 |
+
2025-11-09 19:22:06,708 - INFO - Step 900/19128 | Loss: 0.5576 | LR: 2.94e-04 | GPU: 0.87GB
|
| 56 |
+
2025-11-09 19:22:53,641 - INFO - Step 950/19128 | Loss: 0.5567 | LR: 2.93e-04 | GPU: 0.87GB
|
| 57 |
+
2025-11-09 19:23:40,445 - INFO - Step 1000/19128 | Loss: 0.5597 | LR: 2.92e-04 | GPU: 0.87GB
|
| 58 |
+
2025-11-09 19:25:15,772 - INFO - Step 1000/19128 | Loss: 0.0000 | LR: 0.00e+00 | GPU: 0.87GB
|
| 59 |
+
2025-11-09 19:25:15,772 - INFO - ============================================================
|
| 60 |
+
2025-11-09 19:25:15,772 - INFO - π EVALUATION at step 1000
|
| 61 |
+
2025-11-09 19:25:15,772 - INFO - eval_loss: 0.5003
|
| 62 |
+
2025-11-09 19:25:15,772 - INFO - eval_runtime: 95.3235
|
| 63 |
+
2025-11-09 19:25:15,772 - INFO - eval_samples_per_second: 118.8790
|
| 64 |
+
2025-11-09 19:25:15,772 - INFO - eval_steps_per_second: 7.4380
|
| 65 |
+
2025-11-09 19:25:15,772 - INFO - epoch: 0.1600
|
| 66 |
+
2025-11-09 19:25:15,773 - INFO - gpu_memory_gb: 0.8662
|
| 67 |
+
2025-11-09 19:25:15,773 - INFO - system_memory_percent: 6.9000
|
| 68 |
+
2025-11-09 19:25:15,773 - INFO - ============================================================
|
| 69 |
+
2025-11-09 19:25:15,773 - INFO - π New best eval loss: 0.5003
|
| 70 |
+
2025-11-09 19:25:16,038 - INFO - ============================================================
|
| 71 |
+
2025-11-09 19:25:16,038 - INFO - πΎ Checkpoint 1: step 1000
|
| 72 |
+
2025-11-09 19:25:16,038 - INFO - GPU: 0.87GB allocated, 1.15GB reserved
|
| 73 |
+
2025-11-09 19:25:16,038 - INFO - π€ Uploading checkpoint-1000 to Hub...
|
| 74 |
+
2025-11-09 19:25:20,110 - INFO - β
Checkpoint 1000 uploaded!
|
| 75 |
+
2025-11-09 19:25:20,110 - INFO - π https://huggingface.co/ranjan56cse/t5-base-xsum-lora
|
| 76 |
+
2025-11-09 19:25:20,110 - INFO - ============================================================
|
| 77 |
+
2025-11-09 19:26:07,015 - INFO - Step 1050/19128 | Loss: 0.5565 | LR: 2.91e-04 | GPU: 0.87GB
|
| 78 |
+
2025-11-09 19:26:53,807 - INFO - Step 1100/19128 | Loss: 0.5767 | LR: 2.91e-04 | GPU: 0.87GB
|
| 79 |
+
2025-11-09 19:27:40,531 - INFO - Step 1150/19128 | Loss: 0.5620 | LR: 2.90e-04 | GPU: 0.87GB
|
| 80 |
+
2025-11-09 19:28:27,359 - INFO - Step 1200/19128 | Loss: 0.5864 | LR: 2.89e-04 | GPU: 0.87GB
|
| 81 |
+
2025-11-09 19:29:14,182 - INFO - Step 1250/19128 | Loss: 0.6260 | LR: 2.88e-04 | GPU: 0.87GB
|
| 82 |
+
2025-11-09 19:30:01,074 - INFO - Step 1300/19128 | Loss: 0.7742 | LR: 2.87e-04 | GPU: 0.87GB
|
| 83 |
+
2025-11-09 19:30:48,073 - INFO - Step 1350/19128 | Loss: 1.1101 | LR: 2.87e-04 | GPU: 0.87GB
|
| 84 |
+
2025-11-09 19:31:34,986 - INFO - Step 1400/19128 | Loss: 1.3211 | LR: 2.86e-04 | GPU: 0.87GB
|
| 85 |
+
2025-11-09 19:32:21,930 - INFO - Step 1450/19128 | Loss: 1.4130 | LR: 2.85e-04 | GPU: 0.87GB
|
| 86 |
+
2025-11-09 19:33:08,830 - INFO - Step 1500/19128 | Loss: 1.4265 | LR: 2.84e-04 | GPU: 0.87GB
|
| 87 |
+
2025-11-09 19:33:55,803 - INFO - Step 1550/19128 | Loss: 1.4700 | LR: 2.83e-04 | GPU: 0.87GB
|
| 88 |
+
2025-11-09 19:34:42,910 - INFO - Step 1600/19128 | Loss: 1.4561 | LR: 2.83e-04 | GPU: 0.87GB
|
| 89 |
+
2025-11-09 19:35:29,939 - INFO - Step 1650/19128 | Loss: 1.4693 | LR: 2.82e-04 | GPU: 0.87GB
|
| 90 |
+
2025-11-09 19:36:16,685 - INFO - Step 1700/19128 | Loss: 1.4729 | LR: 2.81e-04 | GPU: 0.87GB
|
| 91 |
+
2025-11-09 19:37:03,396 - INFO - Step 1750/19128 | Loss: 1.4599 | LR: 2.80e-04 | GPU: 0.87GB
|
| 92 |
+
2025-11-09 19:37:50,039 - INFO - Step 1800/19128 | Loss: 1.4725 | LR: 2.79e-04 | GPU: 0.87GB
|
| 93 |
+
2025-11-09 19:38:36,721 - INFO - Step 1850/19128 | Loss: 1.4503 | LR: 2.79e-04 | GPU: 0.87GB
|
| 94 |
+
2025-11-09 19:39:23,367 - INFO - Step 1900/19128 | Loss: 1.4812 | LR: 2.78e-04 | GPU: 0.87GB
|
| 95 |
+
2025-11-09 19:40:10,030 - INFO - Step 1950/19128 | Loss: 1.4761 | LR: 2.77e-04 | GPU: 0.87GB
|
| 96 |
+
2025-11-09 19:40:56,713 - INFO - Step 2000/19128 | Loss: 1.4960 | LR: 2.76e-04 | GPU: 0.87GB
|
| 97 |
+
2025-11-09 19:42:31,551 - INFO - Step 2000/19128 | Loss: 0.0000 | LR: 0.00e+00 | GPU: 0.87GB
|
| 98 |
+
2025-11-09 19:42:31,551 - INFO - ============================================================
|
| 99 |
+
2025-11-09 19:42:31,551 - INFO - π EVALUATION at step 2000
|
| 100 |
+
2025-11-09 19:42:31,551 - INFO - eval_loss: 1.2512
|
| 101 |
+
2025-11-09 19:42:31,551 - INFO - eval_runtime: 94.8348
|
| 102 |
+
2025-11-09 19:42:31,551 - INFO - eval_samples_per_second: 119.4920
|
| 103 |
+
2025-11-09 19:42:31,551 - INFO - eval_steps_per_second: 7.4760
|
| 104 |
+
2025-11-09 19:42:31,551 - INFO - epoch: 0.3100
|
| 105 |
+
2025-11-09 19:42:31,551 - INFO - gpu_memory_gb: 0.8662
|
| 106 |
+
2025-11-09 19:42:31,551 - INFO - system_memory_percent: 13.2000
|
| 107 |
+
2025-11-09 19:42:31,551 - INFO - ============================================================
|
| 108 |
+
2025-11-09 19:42:31,768 - INFO - ============================================================
|
| 109 |
+
2025-11-09 19:42:31,768 - INFO - πΎ Checkpoint 2: step 2000
|
| 110 |
+
2025-11-09 19:42:31,769 - INFO - GPU: 0.87GB allocated, 1.15GB reserved
|
| 111 |
+
2025-11-09 19:42:31,769 - INFO - π€ Uploading checkpoint-2000 to Hub...
|
| 112 |
+
2025-11-09 19:42:36,341 - INFO - β
Checkpoint 2000 uploaded!
|
| 113 |
+
2025-11-09 19:42:36,342 - INFO - π https://huggingface.co/ranjan56cse/t5-base-xsum-lora
|
| 114 |
+
2025-11-09 19:42:36,342 - INFO - ============================================================
|
| 115 |
+
2025-11-09 19:43:23,118 - INFO - Step 2050/19128 | Loss: 1.4488 | LR: 2.75e-04 | GPU: 0.87GB
|
| 116 |
+
2025-11-09 19:44:09,811 - INFO - Step 2100/19128 | Loss: 1.4550 | LR: 2.75e-04 | GPU: 0.87GB
|
| 117 |
+
2025-11-09 19:44:56,495 - INFO - Step 2150/19128 | Loss: 1.4353 | LR: 2.74e-04 | GPU: 0.87GB
|
| 118 |
+
2025-11-09 19:45:43,252 - INFO - Step 2200/19128 | Loss: 1.4524 | LR: 2.73e-04 | GPU: 0.87GB
|
| 119 |
+
2025-11-09 19:46:30,038 - INFO - Step 2250/19128 | Loss: 1.4701 | LR: 2.72e-04 | GPU: 0.87GB
|
| 120 |
+
2025-11-09 19:47:16,729 - INFO - Step 2300/19128 | Loss: 1.4734 | LR: 2.71e-04 | GPU: 0.87GB
|
| 121 |
+
2025-11-09 19:48:03,415 - INFO - Step 2350/19128 | Loss: 1.5035 | LR: 2.71e-04 | GPU: 0.87GB
|
| 122 |
+
2025-11-09 19:48:50,056 - INFO - Step 2400/19128 | Loss: 1.4513 | LR: 2.70e-04 | GPU: 0.87GB
|
| 123 |
+
2025-11-09 19:49:36,603 - INFO - Step 2450/19128 | Loss: 1.4641 | LR: 2.69e-04 | GPU: 0.87GB
|
| 124 |
+
2025-11-09 19:50:23,155 - INFO - Step 2500/19128 | Loss: 1.4585 | LR: 2.68e-04 | GPU: 0.87GB
|
| 125 |
+
2025-11-09 19:51:09,800 - INFO - Step 2550/19128 | Loss: 1.4673 | LR: 2.67e-04 | GPU: 0.87GB
|
| 126 |
+
2025-11-09 19:51:56,482 - INFO - Step 2600/19128 | Loss: 1.4671 | LR: 2.67e-04 | GPU: 0.87GB
|
| 127 |
+
2025-11-09 19:52:43,089 - INFO - Step 2650/19128 | Loss: 1.4702 | LR: 2.66e-04 | GPU: 0.87GB
|
| 128 |
+
2025-11-09 19:53:29,716 - INFO - Step 2700/19128 | Loss: 1.4612 | LR: 2.65e-04 | GPU: 0.87GB
|
| 129 |
+
2025-11-09 19:54:16,277 - INFO - Step 2750/19128 | Loss: 1.4713 | LR: 2.64e-04 | GPU: 0.87GB
|
| 130 |
+
2025-11-09 19:55:02,907 - INFO - Step 2800/19128 | Loss: 1.4573 | LR: 2.64e-04 | GPU: 0.87GB
|
| 131 |
+
2025-11-09 19:55:49,565 - INFO - Step 2850/19128 | Loss: 1.4586 | LR: 2.63e-04 | GPU: 0.87GB
|
| 132 |
+
2025-11-09 19:56:36,226 - INFO - Step 2900/19128 | Loss: 1.4674 | LR: 2.62e-04 | GPU: 0.87GB
|
| 133 |
+
2025-11-09 19:57:22,928 - INFO - Step 2950/19128 | Loss: 1.4466 | LR: 2.61e-04 | GPU: 0.87GB
|
| 134 |
+
2025-11-09 19:58:09,596 - INFO - Step 3000/19128 | Loss: 1.4897 | LR: 2.60e-04 | GPU: 0.87GB
|
| 135 |
+
2025-11-09 19:59:44,409 - INFO - Step 3000/19128 | Loss: 0.0000 | LR: 0.00e+00 | GPU: 0.87GB
|
| 136 |
+
2025-11-09 19:59:44,409 - INFO - ============================================================
|
| 137 |
+
2025-11-09 19:59:44,409 - INFO - π EVALUATION at step 3000
|
| 138 |
+
2025-11-09 19:59:44,410 - INFO - eval_loss: 1.2418
|
| 139 |
+
2025-11-09 19:59:44,410 - INFO - eval_runtime: 94.8105
|
| 140 |
+
2025-11-09 19:59:44,410 - INFO - eval_samples_per_second: 119.5230
|
| 141 |
+
2025-11-09 19:59:44,410 - INFO - eval_steps_per_second: 7.4780
|
| 142 |
+
2025-11-09 19:59:44,410 - INFO - epoch: 0.4700
|
| 143 |
+
2025-11-09 19:59:44,410 - INFO - gpu_memory_gb: 0.8662
|
| 144 |
+
2025-11-09 19:59:44,410 - INFO - system_memory_percent: 6.7000
|
| 145 |
+
2025-11-09 19:59:44,410 - INFO - ============================================================
|
| 146 |
+
2025-11-09 19:59:44,634 - INFO - ============================================================
|
| 147 |
+
2025-11-09 19:59:44,634 - INFO - πΎ Checkpoint 3: step 3000
|
| 148 |
+
2025-11-09 19:59:44,635 - INFO - GPU: 0.87GB allocated, 1.15GB reserved
|
| 149 |
+
2025-11-09 19:59:44,635 - INFO - π€ Uploading checkpoint-3000 to Hub...
|
| 150 |
+
2025-11-09 19:59:48,888 - INFO - β
Checkpoint 3000 uploaded!
|
| 151 |
+
2025-11-09 19:59:48,888 - INFO - π https://huggingface.co/ranjan56cse/t5-base-xsum-lora
|
| 152 |
+
2025-11-09 19:59:48,888 - INFO - ============================================================
|
| 153 |
+
2025-11-09 20:00:35,640 - INFO - Step 3050/19128 | Loss: 1.4621 | LR: 2.60e-04 | GPU: 0.87GB
|
| 154 |
+
2025-11-09 20:01:22,207 - INFO - Step 3100/19128 | Loss: 1.4443 | LR: 2.59e-04 | GPU: 0.87GB
|
| 155 |
+
2025-11-09 20:02:08,745 - INFO - Step 3150/19128 | Loss: 1.4314 | LR: 2.58e-04 | GPU: 0.87GB
|
| 156 |
+
2025-11-09 20:02:55,306 - INFO - Step 3200/19128 | Loss: 1.4172 | LR: 2.57e-04 | GPU: 0.87GB
|
| 157 |
+
2025-11-09 20:03:41,847 - INFO - Step 3250/19128 | Loss: 1.4878 | LR: 2.56e-04 | GPU: 0.87GB
|
| 158 |
+
2025-11-09 20:04:28,392 - INFO - Step 3300/19128 | Loss: 1.4344 | LR: 2.56e-04 | GPU: 0.87GB
|
| 159 |
+
2025-11-09 20:05:14,921 - INFO - Step 3350/19128 | Loss: 1.4634 | LR: 2.55e-04 | GPU: 0.87GB
|
| 160 |
+
2025-11-09 20:06:01,450 - INFO - Step 3400/19128 | Loss: 1.4679 | LR: 2.54e-04 | GPU: 0.87GB
|
| 161 |
+
2025-11-09 20:06:48,065 - INFO - Step 3450/19128 | Loss: 1.4641 | LR: 2.53e-04 | GPU: 0.87GB
|
| 162 |
+
2025-11-09 20:07:34,593 - INFO - Step 3500/19128 | Loss: 1.4396 | LR: 2.52e-04 | GPU: 0.87GB
|
| 163 |
+
2025-11-09 20:08:21,159 - INFO - Step 3550/19128 | Loss: 1.4850 | LR: 2.52e-04 | GPU: 0.87GB
|
| 164 |
+
2025-11-09 20:09:07,759 - INFO - Step 3600/19128 | Loss: 1.4355 | LR: 2.51e-04 | GPU: 0.87GB
|
| 165 |
+
2025-11-09 20:09:54,480 - INFO - Step 3650/19128 | Loss: 1.4419 | LR: 2.50e-04 | GPU: 0.87GB
|
| 166 |
+
2025-11-09 20:10:41,194 - INFO - Step 3700/19128 | Loss: 1.4224 | LR: 2.49e-04 | GPU: 0.87GB
|
| 167 |
+
2025-11-09 20:11:27,870 - INFO - Step 3750/19128 | Loss: 1.4473 | LR: 2.48e-04 | GPU: 0.87GB
|
| 168 |
+
2025-11-09 20:12:14,633 - INFO - Step 3800/19128 | Loss: 1.4341 | LR: 2.48e-04 | GPU: 0.87GB
|
| 169 |
+
2025-11-09 20:13:01,358 - INFO - Step 3850/19128 | Loss: 1.4463 | LR: 2.47e-04 | GPU: 0.87GB
|
| 170 |
+
2025-11-09 20:13:47,961 - INFO - Step 3900/19128 | Loss: 1.4348 | LR: 2.46e-04 | GPU: 0.87GB
|
| 171 |
+
2025-11-09 20:14:34,584 - INFO - Step 3950/19128 | Loss: 1.4326 | LR: 2.45e-04 | GPU: 0.87GB
|
| 172 |
+
2025-11-09 20:15:21,213 - INFO - Step 4000/19128 | Loss: 1.4586 | LR: 2.44e-04 | GPU: 0.87GB
|
| 173 |
+
2025-11-09 20:16:56,031 - INFO - Step 4000/19128 | Loss: 0.0000 | LR: 0.00e+00 | GPU: 0.87GB
|
| 174 |
+
2025-11-09 20:16:56,032 - INFO - ============================================================
|
| 175 |
+
2025-11-09 20:16:56,032 - INFO - π EVALUATION at step 4000
|
| 176 |
+
2025-11-09 20:16:56,032 - INFO - eval_loss: 1.2330
|
| 177 |
+
2025-11-09 20:16:56,032 - INFO - eval_runtime: 94.8153
|
| 178 |
+
2025-11-09 20:16:56,032 - INFO - eval_samples_per_second: 119.5170
|
| 179 |
+
2025-11-09 20:16:56,032 - INFO - eval_steps_per_second: 7.4780
|
| 180 |
+
2025-11-09 20:16:56,032 - INFO - epoch: 0.6300
|
| 181 |
+
2025-11-09 20:16:56,032 - INFO - gpu_memory_gb: 0.8662
|
| 182 |
+
2025-11-09 20:16:56,032 - INFO - system_memory_percent: 6.9000
|
| 183 |
+
2025-11-09 20:16:56,032 - INFO - ============================================================
|
| 184 |
+
2025-11-09 20:16:56,240 - INFO - ============================================================
|
| 185 |
+
2025-11-09 20:16:56,241 - INFO - πΎ Checkpoint 4: step 4000
|
| 186 |
+
2025-11-09 20:16:56,241 - INFO - GPU: 0.87GB allocated, 1.15GB reserved
|
| 187 |
+
2025-11-09 20:16:56,241 - INFO - π€ Uploading checkpoint-4000 to Hub...
|
| 188 |
+
2025-11-09 20:17:00,190 - INFO - β
Checkpoint 4000 uploaded!
|
| 189 |
+
2025-11-09 20:17:00,190 - INFO - π https://huggingface.co/ranjan56cse/t5-base-xsum-lora
|
| 190 |
+
2025-11-09 20:17:00,190 - INFO - ============================================================
|
| 191 |
+
2025-11-09 20:17:47,036 - INFO - Step 4050/19128 | Loss: 1.4624 | LR: 2.44e-04 | GPU: 0.87GB
|
| 192 |
+
2025-11-09 20:18:33,726 - INFO - Step 4100/19128 | Loss: 1.4550 | LR: 2.43e-04 | GPU: 0.87GB
|
| 193 |
+
2025-11-09 20:19:20,355 - INFO - Step 4150/19128 | Loss: 1.4294 | LR: 2.42e-04 | GPU: 0.87GB
|
| 194 |
+
2025-11-09 20:20:06,989 - INFO - Step 4200/19128 | Loss: 1.4675 | LR: 2.41e-04 | GPU: 0.87GB
|
| 195 |
+
2025-11-09 20:20:53,597 - INFO - Step 4250/19128 | Loss: 1.4320 | LR: 2.40e-04 | GPU: 0.87GB
|
| 196 |
+
2025-11-09 20:21:40,182 - INFO - Step 4300/19128 | Loss: 1.4357 | LR: 2.40e-04 | GPU: 0.87GB
|
| 197 |
+
2025-11-09 20:22:26,684 - INFO - Step 4350/19128 | Loss: 1.4419 | LR: 2.39e-04 | GPU: 0.87GB
|
| 198 |
+
2025-11-09 20:23:13,218 - INFO - Step 4400/19128 | Loss: 1.4272 | LR: 2.38e-04 | GPU: 0.87GB
|
| 199 |
+
2025-11-09 20:23:59,888 - INFO - Step 4450/19128 | Loss: 1.4133 | LR: 2.37e-04 | GPU: 0.87GB
|
| 200 |
+
2025-11-09 20:24:46,653 - INFO - Step 4500/19128 | Loss: 1.4340 | LR: 2.36e-04 | GPU: 0.87GB
|
| 201 |
+
2025-11-09 20:25:33,287 - INFO - Step 4550/19128 | Loss: 1.4218 | LR: 2.36e-04 | GPU: 0.87GB
|
| 202 |
+
2025-11-09 20:26:19,993 - INFO - Step 4600/19128 | Loss: 1.4682 | LR: 2.35e-04 | GPU: 0.87GB
|
| 203 |
+
2025-11-09 20:27:06,680 - INFO - Step 4650/19128 | Loss: 1.4333 | LR: 2.34e-04 | GPU: 0.87GB
|
| 204 |
+
2025-11-09 20:27:53,348 - INFO - Step 4700/19128 | Loss: 1.4359 | LR: 2.33e-04 | GPU: 0.87GB
|
| 205 |
+
2025-11-09 20:28:39,968 - INFO - Step 4750/19128 | Loss: 1.4054 | LR: 2.32e-04 | GPU: 0.87GB
|
| 206 |
+
2025-11-09 20:29:26,496 - INFO - Step 4800/19128 | Loss: 1.4215 | LR: 2.32e-04 | GPU: 0.87GB
|
| 207 |
+
2025-11-09 20:30:13,206 - INFO - Step 4850/19128 | Loss: 1.4471 | LR: 2.31e-04 | GPU: 0.87GB
|
| 208 |
+
2025-11-09 20:30:59,857 - INFO - Step 4900/19128 | Loss: 1.4238 | LR: 2.30e-04 | GPU: 0.87GB
|
| 209 |
+
2025-11-09 20:31:46,547 - INFO - Step 4950/19128 | Loss: 1.4218 | LR: 2.29e-04 | GPU: 0.87GB
|
| 210 |
+
2025-11-09 20:32:33,138 - INFO - Step 5000/19128 | Loss: 1.4419 | LR: 2.28e-04 | GPU: 0.87GB
|
| 211 |
+
2025-11-09 20:34:08,183 - INFO - Step 5000/19128 | Loss: 0.0000 | LR: 0.00e+00 | GPU: 0.87GB
|
| 212 |
+
2025-11-09 20:34:08,183 - INFO - ============================================================
|
| 213 |
+
2025-11-09 20:34:08,183 - INFO - π EVALUATION at step 5000
|
| 214 |
+
2025-11-09 20:34:08,183 - INFO - eval_loss: 1.2248
|
| 215 |
+
2025-11-09 20:34:08,183 - INFO - eval_runtime: 95.0420
|
| 216 |
+
2025-11-09 20:34:08,183 - INFO - eval_samples_per_second: 119.2310
|
| 217 |
+
2025-11-09 20:34:08,183 - INFO - eval_steps_per_second: 7.4600
|
| 218 |
+
2025-11-09 20:34:08,183 - INFO - epoch: 0.7800
|
| 219 |
+
2025-11-09 20:34:08,183 - INFO - gpu_memory_gb: 0.8662
|
| 220 |
+
2025-11-09 20:34:08,183 - INFO - system_memory_percent: 6.8000
|
| 221 |
+
2025-11-09 20:34:08,183 - INFO - ============================================================
|
| 222 |
+
2025-11-09 20:34:08,403 - INFO - ============================================================
|
| 223 |
+
2025-11-09 20:34:08,403 - INFO - πΎ Checkpoint 5: step 5000
|
| 224 |
+
2025-11-09 20:34:08,403 - INFO - GPU: 0.87GB allocated, 1.15GB reserved
|
| 225 |
+
2025-11-09 20:34:08,403 - INFO - π€ Uploading checkpoint-5000 to Hub...
|
| 226 |
+
2025-11-09 20:34:12,224 - INFO - β
Checkpoint 5000 uploaded!
|
| 227 |
+
2025-11-09 20:34:12,224 - INFO - π https://huggingface.co/ranjan56cse/t5-base-xsum-lora
|
| 228 |
+
2025-11-09 20:34:12,224 - INFO - ============================================================
|
| 229 |
+
2025-11-09 20:34:58,972 - INFO - Step 5050/19128 | Loss: 1.4405 | LR: 2.28e-04 | GPU: 0.87GB
|
| 230 |
+
2025-11-09 20:35:45,646 - INFO - Step 5100/19128 | Loss: 1.4490 | LR: 2.27e-04 | GPU: 0.87GB
|
| 231 |
+
2025-11-09 20:36:32,304 - INFO - Step 5150/19128 | Loss: 1.4233 | LR: 2.26e-04 | GPU: 0.87GB
|
| 232 |
+
2025-11-09 20:37:19,019 - INFO - Step 5200/19128 | Loss: 1.4230 | LR: 2.25e-04 | GPU: 0.87GB
|
| 233 |
+
2025-11-09 20:38:05,553 - INFO - Step 5250/19128 | Loss: 1.4315 | LR: 2.24e-04 | GPU: 0.87GB
|
| 234 |
+
2025-11-09 20:38:52,072 - INFO - Step 5300/19128 | Loss: 1.4180 | LR: 2.24e-04 | GPU: 0.87GB
|
| 235 |
+
2025-11-09 20:39:38,580 - INFO - Step 5350/19128 | Loss: 1.4056 | LR: 2.23e-04 | GPU: 0.87GB
|
| 236 |
+
2025-11-09 20:40:25,413 - INFO - Step 5400/19128 | Loss: 1.4351 | LR: 2.22e-04 | GPU: 0.87GB
|
| 237 |
+
2025-11-09 20:41:12,273 - INFO - Step 5450/19128 | Loss: 1.4377 | LR: 2.21e-04 | GPU: 0.87GB
|
| 238 |
+
2025-11-09 20:41:59,185 - INFO - Step 5500/19128 | Loss: 1.4065 | LR: 2.20e-04 | GPU: 0.87GB
|
| 239 |
+
2025-11-09 20:42:46,056 - INFO - Step 5550/19128 | Loss: 1.4246 | LR: 2.20e-04 | GPU: 0.87GB
|
| 240 |
+
2025-11-09 20:43:32,914 - INFO - Step 5600/19128 | Loss: 1.4607 | LR: 2.19e-04 | GPU: 0.87GB
|
| 241 |
+
2025-11-09 20:44:19,762 - INFO - Step 5650/19128 | Loss: 1.4211 | LR: 2.18e-04 | GPU: 0.87GB
|
| 242 |
+
2025-11-09 20:45:06,645 - INFO - Step 5700/19128 | Loss: 1.4475 | LR: 2.17e-04 | GPU: 0.87GB
|
| 243 |
+
2025-11-09 20:45:53,490 - INFO - Step 5750/19128 | Loss: 1.3977 | LR: 2.16e-04 | GPU: 0.87GB
|
| 244 |
+
2025-11-09 20:46:40,332 - INFO - Step 5800/19128 | Loss: 1.4034 | LR: 2.16e-04 | GPU: 0.87GB
|
| 245 |
+
2025-11-09 20:47:27,248 - INFO - Step 5850/19128 | Loss: 1.4237 | LR: 2.15e-04 | GPU: 0.87GB
|
| 246 |
+
2025-11-09 20:48:14,096 - INFO - Step 5900/19128 | Loss: 1.4371 | LR: 2.14e-04 | GPU: 0.87GB
|
| 247 |
+
2025-11-09 20:49:00,911 - INFO - Step 5950/19128 | Loss: 1.4416 | LR: 2.13e-04 | GPU: 0.87GB
|
| 248 |
+
2025-11-09 20:49:47,743 - INFO - Step 6000/19128 | Loss: 1.4164 | LR: 2.12e-04 | GPU: 0.87GB
|
| 249 |
+
2025-11-09 20:51:22,757 - INFO - Step 6000/19128 | Loss: 0.0000 | LR: 0.00e+00 | GPU: 0.87GB
|
| 250 |
+
2025-11-09 20:51:22,757 - INFO - ============================================================
|
| 251 |
+
2025-11-09 20:51:22,757 - INFO - π EVALUATION at step 6000
|
| 252 |
+
2025-11-09 20:51:22,757 - INFO - eval_loss: 1.2173
|
| 253 |
+
2025-11-09 20:51:22,757 - INFO - eval_runtime: 95.0111
|
| 254 |
+
2025-11-09 20:51:22,757 - INFO - eval_samples_per_second: 119.2700
|
| 255 |
+
2025-11-09 20:51:22,757 - INFO - eval_steps_per_second: 7.4620
|
| 256 |
+
2025-11-09 20:51:22,757 - INFO - epoch: 0.9400
|
| 257 |
+
2025-11-09 20:51:22,757 - INFO - gpu_memory_gb: 0.8662
|
| 258 |
+
2025-11-09 20:51:22,757 - INFO - system_memory_percent: 6.9000
|
| 259 |
+
2025-11-09 20:51:22,757 - INFO - ============================================================
|
| 260 |
+
2025-11-09 20:51:22,967 - INFO - ============================================================
|
| 261 |
+
2025-11-09 20:51:22,967 - INFO - πΎ Checkpoint 6: step 6000
|
| 262 |
+
2025-11-09 20:51:22,967 - INFO - GPU: 0.87GB allocated, 1.15GB reserved
|
| 263 |
+
2025-11-09 20:51:22,967 - INFO - π€ Uploading checkpoint-6000 to Hub...
|
| 264 |
+
2025-11-09 20:51:26,803 - INFO - β
Checkpoint 6000 uploaded!
|
| 265 |
+
2025-11-09 20:51:26,804 - INFO - π https://huggingface.co/ranjan56cse/t5-base-xsum-lora
|
| 266 |
+
2025-11-09 20:51:26,804 - INFO - ============================================================
|
| 267 |
+
2025-11-09 20:52:13,740 - INFO - Step 6050/19128 | Loss: 1.3970 | LR: 2.12e-04 | GPU: 0.87GB
|
| 268 |
+
2025-11-09 20:53:00,601 - INFO - Step 6100/19128 | Loss: 1.4268 | LR: 2.11e-04 | GPU: 0.87GB
|
| 269 |
+
2025-11-09 20:53:47,421 - INFO - Step 6150/19128 | Loss: 1.4388 | LR: 2.10e-04 | GPU: 0.87GB
|
| 270 |
+
2025-11-09 20:54:34,236 - INFO - Step 6200/19128 | Loss: 1.4208 | LR: 2.09e-04 | GPU: 0.87GB
|
| 271 |
+
2025-11-09 20:55:21,070 - INFO - Step 6250/19128 | Loss: 1.4352 | LR: 2.09e-04 | GPU: 0.87GB
|
| 272 |
+
2025-11-09 20:56:07,893 - INFO - Step 6300/19128 | Loss: 1.4053 | LR: 2.08e-04 | GPU: 0.87GB
|
| 273 |
+
2025-11-09 20:56:54,729 - INFO - Step 6350/19128 | Loss: 1.4242 | LR: 2.07e-04 | GPU: 0.87GB
|
| 274 |
+
2025-11-09 20:57:42,159 - INFO - Step 6400/19128 | Loss: 1.4197 | LR: 2.06e-04 | GPU: 0.87GB
|
| 275 |
+
2025-11-09 20:58:28,931 - INFO - Step 6450/19128 | Loss: 1.4225 | LR: 2.05e-04 | GPU: 0.87GB
|
| 276 |
+
2025-11-09 20:59:15,830 - INFO - Step 6500/19128 | Loss: 1.4188 | LR: 2.05e-04 | GPU: 0.87GB
|
| 277 |
+
2025-11-09 21:00:02,658 - INFO - Step 6550/19128 | Loss: 1.4347 | LR: 2.04e-04 | GPU: 0.87GB
|
| 278 |
+
2025-11-09 21:00:49,521 - INFO - Step 6600/19128 | Loss: 1.4371 | LR: 2.03e-04 | GPU: 0.87GB
|
| 279 |
+
2025-11-09 21:01:36,328 - INFO - Step 6650/19128 | Loss: 1.4228 | LR: 2.02e-04 | GPU: 0.87GB
|
| 280 |
+
2025-11-09 21:02:23,188 - INFO - Step 6700/19128 | Loss: 1.4289 | LR: 2.01e-04 | GPU: 0.87GB
|
| 281 |
+
2025-11-09 21:03:10,045 - INFO - Step 6750/19128 | Loss: 1.4224 | LR: 2.01e-04 | GPU: 0.87GB
|
| 282 |
+
2025-11-09 21:03:56,874 - INFO - Step 6800/19128 | Loss: 1.4783 | LR: 2.00e-04 | GPU: 0.87GB
|
| 283 |
+
2025-11-09 21:04:43,802 - INFO - Step 6850/19128 | Loss: 1.4469 | LR: 1.99e-04 | GPU: 0.87GB
|
| 284 |
+
2025-11-09 21:05:30,629 - INFO - Step 6900/19128 | Loss: 1.4335 | LR: 1.98e-04 | GPU: 0.87GB
|
| 285 |
+
2025-11-09 21:06:17,469 - INFO - Step 6950/19128 | Loss: 1.3973 | LR: 1.97e-04 | GPU: 0.87GB
|
| 286 |
+
2025-11-09 21:07:04,306 - INFO - Step 7000/19128 | Loss: 1.4493 | LR: 1.97e-04 | GPU: 0.87GB
|
| 287 |
+
2025-11-09 21:08:39,314 - INFO - Step 7000/19128 | Loss: 0.0000 | LR: 0.00e+00 | GPU: 0.87GB
|
| 288 |
+
2025-11-09 21:08:39,314 - INFO - ============================================================
|
| 289 |
+
2025-11-09 21:08:39,314 - INFO - π EVALUATION at step 7000
|
| 290 |
+
2025-11-09 21:08:39,314 - INFO - eval_loss: 1.2105
|
| 291 |
+
2025-11-09 21:08:39,314 - INFO - eval_runtime: 95.0044
|
| 292 |
+
2025-11-09 21:08:39,314 - INFO - eval_samples_per_second: 119.2790
|
| 293 |
+
2025-11-09 21:08:39,314 - INFO - eval_steps_per_second: 7.4630
|
| 294 |
+
2025-11-09 21:08:39,314 - INFO - epoch: 1.1000
|
| 295 |
+
2025-11-09 21:08:39,314 - INFO - gpu_memory_gb: 0.8662
|
| 296 |
+
2025-11-09 21:08:39,314 - INFO - system_memory_percent: 6.9000
|
| 297 |
+
2025-11-09 21:08:39,314 - INFO - ============================================================
|
| 298 |
+
2025-11-09 21:08:39,534 - INFO - ============================================================
|
| 299 |
+
2025-11-09 21:08:39,534 - INFO - πΎ Checkpoint 7: step 7000
|
| 300 |
+
2025-11-09 21:08:39,535 - INFO - GPU: 0.87GB allocated, 1.15GB reserved
|
| 301 |
+
2025-11-09 21:08:39,535 - INFO - π€ Uploading checkpoint-7000 to Hub...
|
| 302 |
+
2025-11-09 21:08:43,730 - INFO - β
Checkpoint 7000 uploaded!
|
| 303 |
+
2025-11-09 21:08:43,730 - INFO - π https://huggingface.co/ranjan56cse/t5-base-xsum-lora
|
| 304 |
+
2025-11-09 21:08:43,730 - INFO - ============================================================
|
| 305 |
+
2025-11-09 21:09:30,675 - INFO - Step 7050/19128 | Loss: 1.4059 | LR: 1.96e-04 | GPU: 0.87GB
2025-11-09 21:10:17,507 - INFO - Step 7100/19128 | Loss: 1.3989 | LR: 1.95e-04 | GPU: 0.87GB
2025-11-09 21:11:04,358 - INFO - Step 7150/19128 | Loss: 1.3932 | LR: 1.94e-04 | GPU: 0.87GB
2025-11-09 21:11:51,196 - INFO - Step 7200/19128 | Loss: 1.4181 | LR: 1.93e-04 | GPU: 0.87GB
2025-11-09 21:12:37,995 - INFO - Step 7250/19128 | Loss: 1.4567 | LR: 1.93e-04 | GPU: 0.87GB
2025-11-09 21:13:24,824 - INFO - Step 7300/19128 | Loss: 1.4207 | LR: 1.92e-04 | GPU: 0.87GB
2025-11-09 21:14:11,635 - INFO - Step 7350/19128 | Loss: 1.4020 | LR: 1.91e-04 | GPU: 0.87GB
2025-11-09 21:14:58,467 - INFO - Step 7400/19128 | Loss: 1.4161 | LR: 1.90e-04 | GPU: 0.87GB
2025-11-09 21:15:45,268 - INFO - Step 7450/19128 | Loss: 1.4214 | LR: 1.89e-04 | GPU: 0.87GB
2025-11-09 21:16:32,091 - INFO - Step 7500/19128 | Loss: 1.4018 | LR: 1.89e-04 | GPU: 0.87GB
2025-11-09 21:17:18,897 - INFO - Step 7550/19128 | Loss: 1.3888 | LR: 1.88e-04 | GPU: 0.87GB
2025-11-09 21:18:05,711 - INFO - Step 7600/19128 | Loss: 1.4376 | LR: 1.87e-04 | GPU: 0.87GB
2025-11-09 21:18:52,582 - INFO - Step 7650/19128 | Loss: 1.4172 | LR: 1.86e-04 | GPU: 0.87GB
2025-11-09 21:19:39,428 - INFO - Step 7700/19128 | Loss: 1.4116 | LR: 1.85e-04 | GPU: 0.87GB
2025-11-09 21:20:26,289 - INFO - Step 7750/19128 | Loss: 1.4148 | LR: 1.85e-04 | GPU: 0.87GB
2025-11-09 21:21:13,120 - INFO - Step 7800/19128 | Loss: 1.4197 | LR: 1.84e-04 | GPU: 0.87GB
2025-11-09 21:22:00,036 - INFO - Step 7850/19128 | Loss: 1.4202 | LR: 1.83e-04 | GPU: 0.87GB
2025-11-09 21:22:46,828 - INFO - Step 7900/19128 | Loss: 1.4046 | LR: 1.82e-04 | GPU: 0.87GB
2025-11-09 21:23:33,694 - INFO - Step 7950/19128 | Loss: 1.3885 | LR: 1.82e-04 | GPU: 0.87GB
2025-11-09 21:24:20,616 - INFO - Step 8000/19128 | Loss: 1.4116 | LR: 1.81e-04 | GPU: 0.87GB
2025-11-09 21:25:55,818 - INFO - Step 8000/19128 | Loss: 0.0000 | LR: 0.00e+00 | GPU: 0.87GB
2025-11-09 21:25:55,819 - INFO - ============================================================
2025-11-09 21:25:55,819 - INFO - 📊 EVALUATION at step 8000
2025-11-09 21:25:55,819 - INFO - eval_loss: 1.2042
2025-11-09 21:25:55,819 - INFO - eval_runtime: 95.1992
2025-11-09 21:25:55,819 - INFO - eval_samples_per_second: 119.0350
2025-11-09 21:25:55,819 - INFO - eval_steps_per_second: 7.4480
2025-11-09 21:25:55,819 - INFO - epoch: 1.2500
2025-11-09 21:25:55,819 - INFO - gpu_memory_gb: 0.8662
2025-11-09 21:25:55,819 - INFO - system_memory_percent: 7.0000
2025-11-09 21:25:55,819 - INFO - ============================================================
2025-11-09 21:25:56,041 - INFO - ============================================================
2025-11-09 21:25:56,041 - INFO - 💾 Checkpoint 8: step 8000
2025-11-09 21:25:56,041 - INFO - GPU: 0.87GB allocated, 1.15GB reserved
2025-11-09 21:25:56,041 - INFO - 🤗 Uploading checkpoint-8000 to Hub...
2025-11-09 21:26:00,738 - INFO - ✅ Checkpoint 8000 uploaded!
2025-11-09 21:26:00,738 - INFO - 🔗 https://huggingface.co/ranjan56cse/t5-base-xsum-lora
2025-11-09 21:26:00,738 - INFO - ============================================================
2025-11-09 21:26:47,780 - INFO - Step 8050/19128 | Loss: 1.4231 | LR: 1.80e-04 | GPU: 0.87GB
2025-11-09 21:27:34,641 - INFO - Step 8100/19128 | Loss: 1.3973 | LR: 1.79e-04 | GPU: 0.87GB
2025-11-09 21:28:21,484 - INFO - Step 8150/19128 | Loss: 1.4082 | LR: 1.78e-04 | GPU: 0.87GB
2025-11-09 21:29:08,354 - INFO - Step 8200/19128 | Loss: 1.4026 | LR: 1.78e-04 | GPU: 0.87GB
2025-11-09 21:29:55,181 - INFO - Step 8250/19128 | Loss: 1.4261 | LR: 1.77e-04 | GPU: 0.87GB
2025-11-09 21:30:41,985 - INFO - Step 8300/19128 | Loss: 1.4162 | LR: 1.76e-04 | GPU: 0.87GB
2025-11-09 21:31:28,809 - INFO - Step 8350/19128 | Loss: 1.4007 | LR: 1.75e-04 | GPU: 0.87GB
2025-11-09 21:32:15,629 - INFO - Step 8400/19128 | Loss: 1.4075 | LR: 1.74e-04 | GPU: 0.87GB
2025-11-09 21:33:02,411 - INFO - Step 8450/19128 | Loss: 1.3911 | LR: 1.74e-04 | GPU: 0.87GB
2025-11-09 21:33:49,222 - INFO - Step 8500/19128 | Loss: 1.3977 | LR: 1.73e-04 | GPU: 0.87GB
2025-11-09 21:34:36,010 - INFO - Step 8550/19128 | Loss: 1.4008 | LR: 1.72e-04 | GPU: 0.87GB
2025-11-09 21:35:22,826 - INFO - Step 8600/19128 | Loss: 1.3947 | LR: 1.71e-04 | GPU: 0.87GB
2025-11-09 21:36:09,612 - INFO - Step 8650/19128 | Loss: 1.3825 | LR: 1.70e-04 | GPU: 0.87GB
2025-11-09 21:36:56,293 - INFO - Step 8700/19128 | Loss: 1.4022 | LR: 1.70e-04 | GPU: 0.87GB
2025-11-09 21:37:42,785 - INFO - Step 8750/19128 | Loss: 1.3865 | LR: 1.69e-04 | GPU: 0.87GB
2025-11-09 21:38:29,318 - INFO - Step 8800/19128 | Loss: 1.4350 | LR: 1.68e-04 | GPU: 0.87GB
2025-11-09 21:39:16,016 - INFO - Step 8850/19128 | Loss: 1.4100 | LR: 1.67e-04 | GPU: 0.87GB
2025-11-09 21:40:02,580 - INFO - Step 8900/19128 | Loss: 1.3888 | LR: 1.66e-04 | GPU: 0.87GB
2025-11-09 21:40:49,082 - INFO - Step 8950/19128 | Loss: 1.4151 | LR: 1.66e-04 | GPU: 0.87GB
2025-11-09 21:41:35,633 - INFO - Step 9000/19128 | Loss: 1.3786 | LR: 1.65e-04 | GPU: 0.87GB
2025-11-09 21:43:10,410 - INFO - Step 9000/19128 | Loss: 0.0000 | LR: 0.00e+00 | GPU: 0.87GB
2025-11-09 21:43:10,410 - INFO - ============================================================
2025-11-09 21:43:10,410 - INFO - 📊 EVALUATION at step 9000
2025-11-09 21:43:10,410 - INFO - eval_loss: 1.1985
2025-11-09 21:43:10,410 - INFO - eval_runtime: 94.7740
2025-11-09 21:43:10,410 - INFO - eval_samples_per_second: 119.5690
2025-11-09 21:43:10,410 - INFO - eval_steps_per_second: 7.4810
2025-11-09 21:43:10,410 - INFO - epoch: 1.4100
2025-11-09 21:43:10,410 - INFO - gpu_memory_gb: 0.8662
2025-11-09 21:43:10,410 - INFO - system_memory_percent: 7.0000
2025-11-09 21:43:10,410 - INFO - ============================================================
2025-11-09 21:43:10,628 - INFO - ============================================================
2025-11-09 21:43:10,628 - INFO - 💾 Checkpoint 9: step 9000
2025-11-09 21:43:10,629 - INFO - GPU: 0.87GB allocated, 1.15GB reserved
2025-11-09 21:43:10,629 - INFO - 🤗 Uploading checkpoint-9000 to Hub...
2025-11-09 21:43:14,790 - INFO - ✅ Checkpoint 9000 uploaded!
2025-11-09 21:43:14,790 - INFO - 🔗 https://huggingface.co/ranjan56cse/t5-base-xsum-lora
2025-11-09 21:43:14,790 - INFO - ============================================================
2025-11-09 21:44:01,513 - INFO - Step 9050/19128 | Loss: 1.3709 | LR: 1.64e-04 | GPU: 0.87GB
2025-11-09 21:44:48,323 - INFO - Step 9100/19128 | Loss: 1.4021 | LR: 1.63e-04 | GPU: 0.87GB
2025-11-09 21:45:35,034 - INFO - Step 9150/19128 | Loss: 1.4054 | LR: 1.62e-04 | GPU: 0.87GB
2025-11-09 21:46:21,722 - INFO - Step 9200/19128 | Loss: 1.3909 | LR: 1.62e-04 | GPU: 0.87GB
2025-11-09 21:47:08,425 - INFO - Step 9250/19128 | Loss: 1.3949 | LR: 1.61e-04 | GPU: 0.87GB
2025-11-09 21:47:55,127 - INFO - Step 9300/19128 | Loss: 1.3927 | LR: 1.60e-04 | GPU: 0.87GB
2025-11-09 21:48:41,911 - INFO - Step 9350/19128 | Loss: 1.4053 | LR: 1.59e-04 | GPU: 0.87GB
2025-11-09 21:49:28,598 - INFO - Step 9400/19128 | Loss: 1.4217 | LR: 1.58e-04 | GPU: 0.87GB
2025-11-09 21:50:15,238 - INFO - Step 9450/19128 | Loss: 1.4174 | LR: 1.58e-04 | GPU: 0.87GB
2025-11-09 21:51:01,924 - INFO - Step 9500/19128 | Loss: 1.4396 | LR: 1.57e-04 | GPU: 0.87GB
2025-11-09 21:51:48,569 - INFO - Step 9550/19128 | Loss: 1.3969 | LR: 1.56e-04 | GPU: 0.87GB
2025-11-09 21:52:35,302 - INFO - Step 9600/19128 | Loss: 1.3905 | LR: 1.55e-04 | GPU: 0.87GB
2025-11-09 21:53:22,061 - INFO - Step 9650/19128 | Loss: 1.3712 | LR: 1.54e-04 | GPU: 0.87GB
2025-11-09 21:54:08,755 - INFO - Step 9700/19128 | Loss: 1.3787 | LR: 1.54e-04 | GPU: 0.87GB
2025-11-09 21:54:55,348 - INFO - Step 9750/19128 | Loss: 1.3795 | LR: 1.53e-04 | GPU: 0.87GB
2025-11-09 21:55:41,952 - INFO - Step 9800/19128 | Loss: 1.4115 | LR: 1.52e-04 | GPU: 0.87GB
2025-11-09 21:56:28,677 - INFO - Step 9850/19128 | Loss: 1.3968 | LR: 1.51e-04 | GPU: 0.87GB
2025-11-09 21:57:15,360 - INFO - Step 9900/19128 | Loss: 1.4296 | LR: 1.51e-04 | GPU: 0.87GB
2025-11-09 21:58:02,048 - INFO - Step 9950/19128 | Loss: 1.3983 | LR: 1.50e-04 | GPU: 0.87GB
2025-11-09 21:58:48,785 - INFO - Step 10000/19128 | Loss: 1.3813 | LR: 1.49e-04 | GPU: 0.87GB
2025-11-09 22:00:23,984 - INFO - Step 10000/19128 | Loss: 0.0000 | LR: 0.00e+00 | GPU: 0.87GB
2025-11-09 22:00:23,984 - INFO - ============================================================
2025-11-09 22:00:23,984 - INFO - 📊 EVALUATION at step 10000
2025-11-09 22:00:23,984 - INFO - eval_loss: 1.1934
2025-11-09 22:00:23,985 - INFO - eval_runtime: 95.1956
2025-11-09 22:00:23,985 - INFO - eval_samples_per_second: 119.0390
2025-11-09 22:00:23,985 - INFO - eval_steps_per_second: 7.4480
2025-11-09 22:00:23,985 - INFO - epoch: 1.5700
2025-11-09 22:00:23,985 - INFO - gpu_memory_gb: 0.8662
2025-11-09 22:00:23,985 - INFO - system_memory_percent: 6.9000
2025-11-09 22:00:23,985 - INFO - ============================================================
2025-11-09 22:00:24,209 - INFO - ============================================================
2025-11-09 22:00:24,210 - INFO - 💾 Checkpoint 10: step 10000
2025-11-09 22:00:24,210 - INFO - GPU: 0.87GB allocated, 1.15GB reserved
2025-11-09 22:00:24,210 - INFO - 🤗 Uploading checkpoint-10000 to Hub...
2025-11-09 22:00:28,612 - INFO - ✅ Checkpoint 10000 uploaded!
2025-11-09 22:00:28,612 - INFO - 🔗 https://huggingface.co/ranjan56cse/t5-base-xsum-lora
2025-11-09 22:00:28,612 - INFO - ============================================================
2025-11-09 22:01:15,428 - INFO - Step 10050/19128 | Loss: 1.3868 | LR: 1.48e-04 | GPU: 0.87GB
2025-11-09 22:02:02,217 - INFO - Step 10100/19128 | Loss: 1.4145 | LR: 1.47e-04 | GPU: 0.87GB
2025-11-09 22:02:49,097 - INFO - Step 10150/19128 | Loss: 1.3916 | LR: 1.47e-04 | GPU: 0.87GB
2025-11-09 22:03:35,906 - INFO - Step 10200/19128 | Loss: 1.3796 | LR: 1.46e-04 | GPU: 0.87GB
2025-11-09 22:04:22,598 - INFO - Step 10250/19128 | Loss: 1.4049 | LR: 1.45e-04 | GPU: 0.87GB
2025-11-09 22:05:09,296 - INFO - Step 10300/19128 | Loss: 1.3931 | LR: 1.44e-04 | GPU: 0.87GB
2025-11-09 22:05:55,990 - INFO - Step 10350/19128 | Loss: 1.3685 | LR: 1.43e-04 | GPU: 0.87GB
2025-11-09 22:06:42,775 - INFO - Step 10400/19128 | Loss: 1.3856 | LR: 1.43e-04 | GPU: 0.87GB
2025-11-09 22:07:29,629 - INFO - Step 10450/19128 | Loss: 1.3871 | LR: 1.42e-04 | GPU: 0.87GB
2025-11-09 22:08:16,457 - INFO - Step 10500/19128 | Loss: 1.3822 | LR: 1.41e-04 | GPU: 0.87GB
2025-11-09 22:09:03,144 - INFO - Step 10550/19128 | Loss: 1.3909 | LR: 1.40e-04 | GPU: 0.87GB
2025-11-09 22:09:49,938 - INFO - Step 10600/19128 | Loss: 1.3876 | LR: 1.39e-04 | GPU: 0.87GB
2025-11-09 22:10:36,673 - INFO - Step 10650/19128 | Loss: 1.3611 | LR: 1.39e-04 | GPU: 0.87GB
2025-11-09 22:11:23,326 - INFO - Step 10700/19128 | Loss: 1.3871 | LR: 1.38e-04 | GPU: 0.87GB
2025-11-09 22:12:09,970 - INFO - Step 10750/19128 | Loss: 1.3808 | LR: 1.37e-04 | GPU: 0.87GB
2025-11-09 22:12:56,625 - INFO - Step 10800/19128 | Loss: 1.3733 | LR: 1.36e-04 | GPU: 0.87GB
2025-11-09 22:13:43,366 - INFO - Step 10850/19128 | Loss: 1.3835 | LR: 1.35e-04 | GPU: 0.87GB
2025-11-09 22:14:30,081 - INFO - Step 10900/19128 | Loss: 1.3768 | LR: 1.35e-04 | GPU: 0.87GB
2025-11-09 22:15:16,779 - INFO - Step 10950/19128 | Loss: 1.3826 | LR: 1.34e-04 | GPU: 0.87GB
2025-11-09 22:16:03,476 - INFO - Step 11000/19128 | Loss: 1.3699 | LR: 1.33e-04 | GPU: 0.87GB
2025-11-09 22:17:38,447 - INFO - Step 11000/19128 | Loss: 0.0000 | LR: 0.00e+00 | GPU: 0.87GB
2025-11-09 22:17:38,447 - INFO - ============================================================
2025-11-09 22:17:38,447 - INFO - 📊 EVALUATION at step 11000
2025-11-09 22:17:38,447 - INFO - eval_loss: 1.1889
2025-11-09 22:17:38,447 - INFO - eval_runtime: 94.9678
2025-11-09 22:17:38,447 - INFO - eval_samples_per_second: 119.3250
2025-11-09 22:17:38,447 - INFO - eval_steps_per_second: 7.4660
2025-11-09 22:17:38,447 - INFO - epoch: 1.7300
2025-11-09 22:17:38,447 - INFO - gpu_memory_gb: 0.8662
2025-11-09 22:17:38,447 - INFO - system_memory_percent: 6.9000
2025-11-09 22:17:38,447 - INFO - ============================================================
2025-11-09 22:17:38,673 - INFO - ============================================================
2025-11-09 22:17:38,673 - INFO - 💾 Checkpoint 11: step 11000
2025-11-09 22:17:38,673 - INFO - GPU: 0.87GB allocated, 1.15GB reserved
2025-11-09 22:17:38,673 - INFO - 🤗 Uploading checkpoint-11000 to Hub...