Dataset metadata: Modalities: Image, Text · Formats: parquet · Libraries: Datasets, pandas
AI4Industry committed · Commit 07c484f · verified · 1 Parent(s): 43679fb

add Qwen3-VL-32B-Think & Qwen3-VL-8B-Think

Files changed (1):
  1. README.md +4 -0
README.md CHANGED
@@ -128,6 +128,7 @@ We evaluated several of the latest popular MLLMs, including both closed-source a
 | Gemini-2.5-Pro | √ | Proprietary | 20250617 | 0.9095 | **0.9423** | 0.9259 |
 | GPT-5(medium) | √ | Proprietary | 20250807 | 0.9207 | 0.9226 | 0.9217 |
 | Qwen3-VL-235BA22B-Think | √ | Open | - | 0.9220 | 0.9134 | 0.9177 |
+| Qwen3-VL-32B-Think | √ | Open | - | 0.9128 | 0.9161 | 0.9144 |
 | GPT-5.1(medium) | √ | Proprietary | 20251113 | 0.9108 | 0.9141 | 0.9125 |
 | GPT-5-mini | √ | Proprietary | 20250807 | 0.9108 | 0.9128 | 0.9118 |
 | Seed1.5-VL-Think | √ | Proprietary | 20250428 | 0.9056 | 0.9161 | 0.9109 |
@@ -137,6 +138,7 @@ We evaluated several of the latest popular MLLMs, including both closed-source a
 | Intern-S1 | √ | Open | - | 0.8938 | 0.8944 | 0.8941 |
 | Qwen3-VL-30BA3B-Think | √ | Open | - | 0.8689 | 0.8590 | 0.8689 |
 | Qwen3-VL-Plus | × | Proprietary | 20250923 | 0.8551 | 0.8656 | 0.8604 |
+| Qwen3-VL-8B-Think | √ | Open | - | 0.8636 | 0.8564 | 0.8600 |
 | Seed1.5-VL | × | Proprietary | 20250328 | 0.8518 | 0.8669 | 0.8594 |
 | Qwen3-VL-235BA22B-Instruct | × | Open | - | 0.8492 | 0.8675 | 0.8584 |
 | InternVL3-78b | × | Open | - | 0.8531 | 0.8308 | 0.8420 |
@@ -167,6 +169,7 @@ We also conducted separate evaluations for different task types (in RxnBench-en)
 | Gemini-2.5-Pro | √ | Proprietary | 20250617 | 0.9331 | 0.9246 | 0.9459 | **0.9491** | 0.9322 | 0.6343 |
 | GPT-5(medium) | √ | Proprietary | 20250807 | **0.9349** | 0.9325 | 0.9493 | 0.9167 | 0.9492 | 0.7761 |
 | Qwen3-VL-235BA22B-Think | √ | Open | - | 0.9190 | 0.9405 | 0.9459 | 0.9213 | 0.9322 | **0.8433** |
+| Qwen3-VL-32B-Think | √ | Open | - | 0.9296 | 0.9405 | 0.9426 | 0.9259 | 0.9153 | 0.7015 |
 | GPT-5.1(medium) | √ | Proprietary | 20251113 | 0.9243 | 0.9365 | 0.9426 | 0.9167 | 0.9492 | 0.7090 |
 | GPT-5-mini | √ | Proprietary | 20250807 | 0.9225 | 0.9325 | 0.9257 | 0.9259 | 0.9831 | 0.7388 |
 | Seed1.5-VL-Think | √ | Proprietary | 20250428 | 0.8996 | 0.9365 | 0.9358 | 0.9074 | 0.9153 | 0.8060 |
@@ -176,6 +179,7 @@ We also conducted separate evaluations for different task types (in RxnBench-en)
 | Intern-S1 | √ | Open | - | 0.9014 | 0.9127 | 0.9223 | 0.9028 | 0.8814 | 0.7463 |
 | Qwen3-VL-30BA3B-Think | √ | Open | - | 0.8732 | 0.8810 | 0.9054 | 0.8843 | 0.9322 | 0.6940 |
 | Qwen3-VL-Plus | × | Proprietary | 20250923 | 0.8275 | 0.8968 | 0.8986 | 0.8565 | 0.9153 | 0.7687 |
+| Qwen3-VL-8B-Think | √ | Open | - | 0.8768 | 0.8730 | 0.8885 | 0.9028 | 0.8983 | 0.6567 |
 | Seed1.5-VL | × | Proprietary | 20250328 | 0.9327 | 0.9127 | 0.9122 | 0.8472 | 0.8305 | 0.7015 |
 | Qwen3-VL-235BA22B-Instruct | × | Open | - | 0.8204 | 0.8929 | 0.8986 | 0.8426 | 0.8814 | 0.7761 |
 | InternVL3-78b | × | Open | - | 0.8556 | 0.8730 | 0.8885 | 0.8981 | 0.9153 | 0.6194 |
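The commit's edit is mechanical: each new row is spliced into the table at the position that keeps the final score column in descending order (0.9144 between 0.9177 and 0.9125; 0.8600 between 0.8604 and 0.8594). A minimal sketch of how such an insertion could be scripted, assuming the last pipe-delimited cell is the sort key; the helper name is illustrative, not part of the repo:

```python
def insert_row_sorted(table_lines, new_row, key_col=-1):
    """Insert a markdown table row so the key column stays in
    descending order. table_lines: list of '| ... |' row strings."""
    def score(line):
        # Take the pipe-delimited cell, dropping any **bold** markers.
        cell = line.strip().strip("|").split("|")[key_col]
        return float(cell.replace("*", "").strip())

    new_score = score(new_row)
    for i, line in enumerate(table_lines):
        if score(line) < new_score:
            return table_lines[:i] + [new_row] + table_lines[i:]
    return table_lines + [new_row]

# Rows from the first table around the insertion point of this commit.
rows = [
    "| Qwen3-VL-235BA22B-Think | √ | Open | - | 0.9220 | 0.9134 | 0.9177 |",
    "| GPT-5.1(medium) | √ | Proprietary | 20251113 | 0.9108 | 0.9141 | 0.9125 |",
]
new = "| Qwen3-VL-32B-Think | √ | Open | - | 0.9128 | 0.9161 | 0.9144 |"
print(insert_row_sorted(rows, new)[1])  # lands between the two rows
```

This reproduces the placement seen in the diff: the 32B row slots in after Qwen3-VL-235BA22B-Think and before GPT-5.1(medium).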