Modalities: Image, Text
Formats: parquet
Libraries: Datasets, pandas
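Since the card lists parquet as the storage format and Datasets / pandas as supported libraries, here is a minimal loading sketch. The repository id and parquet file name below are placeholders, not values taken from this page; substitute the actual ones shown on the dataset card.

```python
from datasets import load_dataset
import pandas as pd

# Via the Hugging Face Datasets library (repo id is hypothetical):
ds = load_dataset("AI4Industry/RxnBench", split="train")
print(ds[0])

# Or read a parquet shard directly with pandas (file name is hypothetical):
df = pd.read_parquet("data/train-00000-of-00001.parquet")
print(df.head())
```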
Commit 3c0c43f (verified), committed by AI4Industry · 1 Parent(s): a4af6bb

Add Qwen3-VL-30BA3B-Thinking and Qwen3-VL-30BA3B-Instruct

Files changed (1):
  1. README.md +5 -1
README.md CHANGED
@@ -134,6 +134,7 @@ We evaluated several of the latest popular MLLMs, including both closed-source a
 | GPT o4 mini | √ | Proprietary | 20250416 | 0.9062 | 0.9075 | 0.9069 |
 | InternVL3.5-241B-A28B | √ | Open | - | 0.9003 | 0.9062 | 0.9033 |
 | Intern-S1 | √ | Open | - | 0.8938 | 0.8944 | 0.8941 |
+| Qwen3-VL-30BA3B-Think | √ | Open | - | 0.8689 | 0.8590 | 0.8689 |
 | Qwen3-VL-Plus | × | Proprietary | 20250923 | 0.8551 | 0.8656 | 0.8604 |
 | Seed1.5-VL | × | Proprietary | 20250328 | 0.8518 | 0.8669 | 0.8594 |
 | Qwen3-VL-235BA22B-Instruct | × | Open | - | 0.8492 | 0.8675 | 0.8584 |
@@ -145,6 +146,7 @@ We evaluated several of the latest popular MLLMs, including both closed-source a
 | GPT-5-nano | √ | Proprietary | 20250807 | 0.7980 | 0.7941 | 0.7961 |
 | Qwen2.5-VL-32B | × | Open | - | 0.7980 | 0.7908 | 0.7944 |
 | Gemini-2.5-Flash | √ | Proprietary | 20250617 | 0.6925 | 0.8557 | 0.7741 |
+| Qwen3-VL-30BA3B-Instruct | × | Open | - | 0.7456 | 0.7436 | 0.7456 |
 | GPT-4o | × | Proprietary | 20240806 | 0.7462 | 0.7436 | 0.7449 |
 | Qwen2.5-VL-7b | × | Open | - | 0.7082 | 0.7233 | 0.7158 |
 | Qwen2.5-VL-3b | × | Open | - | 0.6748 | 0.6643 | 0.6696 |
@@ -154,7 +156,7 @@ We evaluated several of the latest popular MLLMs, including both closed-source a
 | *Random* | - | - | 0.2500 | 0.2500 | 0.2500 |


-We also conducted separate evaluations for different task types.
+We also conducted separate evaluations for different task types (in RxnBench-en).

 | Model | think | Weight | UpdateTime | Type0 | Type1 | Type2 | Type3 | Type4 | Type5 |
 | ---- |:----:|:----:|:----:|:----:|:----:|:----:|:----:|:----:|:----:|
@@ -168,6 +170,7 @@ We also conducted separate evaluations for different task types.
 | GPT o4 mini | √ | Proprietary | 20250416 | 0.6391 | 0.7302 | 0.7500 | 0.6667 | 0.6271 | 0.4627 |
 | InternVL3.5-241B-A28B | √ | Open | - | 0.8944 | 0.9127 | 0.9291 | 0.9167 | 0.9153 | 0.8134 |
 | Intern-S1 | √ | Open | - | 0.9014 | 0.9127 | 0.9223 | 0.9028 | 0.8814 | 0.7463 |
+| Qwen3-VL-30BA3B-Think | √ | Open | - | 0.8732 | 0.8810 | 0.9054 | 0.8843 | 0.9322 | 0.6940 |
 | Qwen3-VL-Plus | × | Proprietary | 20250923 | 0.8275 | 0.8968 | 0.8986 | 0.8565 | 0.9153 | 0.7687 |
 | Seed1.5-VL | × | Proprietary | 20250328 | 0.9327 | 0.9127 | 0.9122 | 0.8472 | 0.8305 | 0.7015 |
 | Qwen3-VL-235BA22B-Instruct | × | Open | - | 0.8204 | 0.8929 | 0.8986 | 0.8426 | 0.8814 | 0.7761 |
@@ -179,6 +182,7 @@ We also conducted separate evaluations for different task types.
 | GPT-5-nano | √ | Proprietary | 20250807 | 0.8063 | 0.8452 | 0.8311 | 0.8241 | 0.7797 | 0.5672 |
 | Qwen2.5-VL-32B | × | Open | - | 0.7729 | 0.8413 | 0.8750 | 0.8009 | 0.8305 | 0.6418 |
 | Gemini-2.5-Flash | √ | Proprietary | 20250617 | 0.7799 | 0.6111 | 0.6757 | 0.6620 | 0.7627 | 0.5373 |
+| Qwen3-VL-30BA3B-Instruct | × | Open | - | 0.7042 | 0.7937 | 0.8311 | 0.7824 | 0.7119 | 0.5970 |
 | GPT-4o | × | Proprietary | 20240806 | 0.7359 | 0.8175 | 0.7973 | 0.7500 | 0.7627 | 0.5224 |
 | Qwen2.5-VL-7b | × | Open | - | 0.6678 | 0.7659 | 0.8041 | 0.7130 | 0.6441 | 0.5373 |
 | Qwen2.5-VL-3b | × | Open | - | 0.6426 | 0.7381 | 0.7635 | 0.6898 | 0.6610 | 0.4776 |
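The Type0–Type5 columns in the diffed table report per-task-type scores. Purely as an illustration of that kind of breakdown, here is a hedged pandas sketch; the column names `task_type`, `prediction`, and `answer` are assumptions for this example, not the dataset's documented schema.

```python
import pandas as pd

# Toy rows standing in for graded model outputs; the schema is assumed,
# not taken from the dataset card.
results = pd.DataFrame({
    "task_type":  [0, 0, 1, 1, 2, 2],
    "prediction": ["A", "B", "C", "D", "A", "B"],
    "answer":     ["A", "C", "C", "D", "B", "B"],
})

# Accuracy per task type, analogous to the Type0-Type5 columns above.
per_type = (
    results.assign(correct=results["prediction"] == results["answer"])
           .groupby("task_type")["correct"]
           .mean()
)
print(per_type)
```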