| timer name (stringclasses, 12 values) | value (float64, 0–26.5k) |
|---|---|
| megatron.core.transformer.attention.forward.qkv | 0.477408 |
| megatron.core.transformer.attention.forward.adjust_key_value | 0.00288 |
| megatron.core.transformer.attention.forward.rotary_pos_emb | 0.002976 |
| megatron.core.transformer.attention.forward.core_attention | 39.983906 |
| megatron.core.transformer.attention.forward.linear_proj | 1.315168 |
| megatron.core.transformer.transformer_layer._forward_attention.self_attention | 41.799934 |
| megatron.core.transformer.transformer_layer._forward_attention.self_attn_bda | 0.452736 |
| megatron.core.transformer.mlp.forward.linear_fc1 | 0.883328 |
| megatron.core.transformer.mlp.forward.activation | 0.088128 |
| megatron.core.transformer.mlp.forward.linear_fc2 | 1.840448 |
| megatron.core.transformer.transformer_layer._forward_mlp.mlp | 2.82336 |
| megatron.core.transformer.transformer_layer._forward_mlp.mlp_bda | 0.452224 |
| megatron.core.transformer.attention.forward.qkv | 0.246976 |
| megatron.core.transformer.attention.forward.adjust_key_value | 0.00304 |
| megatron.core.transformer.attention.forward.rotary_pos_emb | 0.00304 |
| megatron.core.transformer.attention.forward.core_attention | 812.249878 |
| megatron.core.transformer.attention.forward.linear_proj | 0.882112 |
| megatron.core.transformer.transformer_layer._forward_attention.self_attention | 813.403503 |
| megatron.core.transformer.transformer_layer._forward_attention.self_attn_bda | 0.233792 |
| megatron.core.transformer.mlp.forward.linear_fc1 | 0.4408 |
| megatron.core.transformer.mlp.forward.activation | 0.048192 |
| megatron.core.transformer.mlp.forward.linear_fc2 | 0.945248 |
| megatron.core.transformer.transformer_layer._forward_mlp.mlp | 1.44656 |
| megatron.core.transformer.transformer_layer._forward_mlp.mlp_bda | 0.232512 |
| megatron.core.transformer.attention.forward.qkv | 0.243232 |
| megatron.core.transformer.attention.forward.adjust_key_value | 0.00304 |
| megatron.core.transformer.attention.forward.rotary_pos_emb | 0.003008 |
| megatron.core.transformer.attention.forward.core_attention | 20.78784 |
| megatron.core.transformer.attention.forward.linear_proj | 0.731776 |
| megatron.core.transformer.transformer_layer._forward_attention.self_attention | 21.786976 |
| megatron.core.transformer.transformer_layer._forward_attention.self_attn_bda | 0.233248 |
| megatron.core.transformer.mlp.forward.linear_fc1 | 0.444768 |
| megatron.core.transformer.mlp.forward.activation | 0.04896 |
| megatron.core.transformer.mlp.forward.linear_fc2 | 0.943616 |
| megatron.core.transformer.transformer_layer._forward_mlp.mlp | 1.44896 |
| megatron.core.transformer.transformer_layer._forward_mlp.mlp_bda | 0.232096 |
| megatron.core.transformer.attention.forward.qkv | 0.133664 |
| megatron.core.transformer.attention.forward.adjust_key_value | 0.002976 |
| megatron.core.transformer.attention.forward.rotary_pos_emb | 0.002944 |
| megatron.core.transformer.attention.forward.core_attention | 10.543872 |
| megatron.core.transformer.attention.forward.linear_proj | 0.438432 |
| megatron.core.transformer.transformer_layer._forward_attention.self_attention | 11.13968 |
| megatron.core.transformer.transformer_layer._forward_attention.self_attn_bda | 0.121376 |
| megatron.core.transformer.mlp.forward.linear_fc1 | 0.223616 |
| megatron.core.transformer.mlp.forward.activation | 0.027424 |
| megatron.core.transformer.mlp.forward.linear_fc2 | 0.507872 |
| megatron.core.transformer.transformer_layer._forward_mlp.mlp | 0.770848 |
| megatron.core.transformer.transformer_layer._forward_mlp.mlp_bda | 0.121216 |
| megatron.core.transformer.attention.forward.qkv | 0.131552 |
| megatron.core.transformer.attention.forward.adjust_key_value | 0.002912 |
| megatron.core.transformer.attention.forward.rotary_pos_emb | 0.002912 |
| megatron.core.transformer.attention.forward.core_attention | 10.54944 |
| megatron.core.transformer.attention.forward.linear_proj | 0.421888 |
| megatron.core.transformer.transformer_layer._forward_attention.self_attention | 11.1256 |
| megatron.core.transformer.transformer_layer._forward_attention.self_attn_bda | 0.121824 |
| megatron.core.transformer.mlp.forward.linear_fc1 | 0.224992 |
| megatron.core.transformer.mlp.forward.activation | 0.027808 |
| megatron.core.transformer.mlp.forward.linear_fc2 | 0.50688 |
| megatron.core.transformer.transformer_layer._forward_mlp.mlp | 0.772 |
| megatron.core.transformer.transformer_layer._forward_mlp.mlp_bda | 0.121344 |
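If the table is to be reused as raw data, a minimal aggregation sketch follows. It assumes the rows above are exported to a headerless two-column CSV named `megatron_forward_timers.csv` (a hypothetical filename, not part of the dataset) and that the float column holds per-call wall-clock times; the unit is not stated in the table.

```python
# Minimal sketch: summarize the repeated Megatron-core timer rows above.
# Assumptions (not in the original data): the table was saved as a
# headerless CSV "megatron_forward_timers.csv" with columns name, value,
# and the values are per-call timings (unit unspecified in the source).
import pandas as pd

df = pd.read_csv("megatron_forward_timers.csv", names=["timer", "value"])

# Mean and max per timer across the repeated measurement groups.
summary = df.groupby("timer")["value"].agg(["count", "mean", "max"])

# Share of total mean time spent in each timer, largest first.
summary["share"] = summary["mean"] / summary["mean"].sum()
print(summary.sort_values("mean", ascending=False))
```

Sorting by mean makes the dominant cost visible at a glance; in this dump, `core_attention` accounts for most of the per-layer forward time in every group.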