(`name`: string, 12 distinct values; `values`: float64; ~2.17k rows total)

| name | values |
|---|---|
| megatron.core.transformer.attention.forward.linear_proj | 1.560192 |
| megatron.core.transformer.transformer_layer._forward_attention.self_attention | 11.4 |
| megatron.core.transformer.transformer_layer._forward_attention.self_attn_bda | 0.552608 |
| megatron.core.transformer.mlp.forward.linear_fc1 | 6.092576 |
| megatron.core.transformer.mlp.forward.activation | 0.796544 |
| megatron.core.transformer.mlp.forward.linear_fc2 | 5.97536 |
| megatron.core.transformer.transformer_layer._forward_mlp.mlp | 12.876096 |
| megatron.core.transformer.transformer_layer._forward_mlp.mlp_bda | 0.55264 |
| megatron.core.transformer.attention.forward.qkv | 2.778368 |
| megatron.core.transformer.attention.forward.adjust_key_value | 0.003008 |
| megatron.core.transformer.attention.forward.rotary_pos_emb | 0.003008 |
| megatron.core.transformer.attention.forward.core_attention | 7.0504 |
| megatron.core.transformer.attention.forward.linear_proj | 1.569824 |
| megatron.core.transformer.transformer_layer._forward_attention.self_attention | 11.422784 |
| megatron.core.transformer.transformer_layer._forward_attention.self_attn_bda | 0.552992 |
| megatron.core.transformer.mlp.forward.linear_fc1 | 6.129696 |
| megatron.core.transformer.mlp.forward.activation | 0.802624 |
| megatron.core.transformer.mlp.forward.linear_fc2 | 6.0224 |
| megatron.core.transformer.transformer_layer._forward_mlp.mlp | 12.967104 |
| megatron.core.transformer.transformer_layer._forward_mlp.mlp_bda | 0.557344 |
| megatron.core.transformer.attention.forward.qkv | 2.807136 |
| megatron.core.transformer.attention.forward.adjust_key_value | 0.003072 |
| megatron.core.transformer.attention.forward.rotary_pos_emb | 0.003008 |
| megatron.core.transformer.attention.forward.core_attention | 6.992032 |
| megatron.core.transformer.attention.forward.linear_proj | 1.535072 |
| megatron.core.transformer.transformer_layer._forward_attention.self_attention | 11.358592 |
| megatron.core.transformer.transformer_layer._forward_attention.self_attn_bda | 0.538624 |
| megatron.core.transformer.mlp.forward.linear_fc1 | 5.858688 |
| megatron.core.transformer.mlp.forward.activation | 0.755584 |
| megatron.core.transformer.mlp.forward.linear_fc2 | 5.723296 |
| megatron.core.transformer.transformer_layer._forward_mlp.mlp | 12.349312 |
| megatron.core.transformer.transformer_layer._forward_mlp.mlp_bda | 0.518688 |
| megatron.core.transformer.attention.forward.qkv | 2.658688 |
| megatron.core.transformer.attention.forward.adjust_key_value | 0.003008 |
| megatron.core.transformer.attention.forward.rotary_pos_emb | 0.00304 |
| megatron.core.transformer.attention.forward.core_attention | 6.82112 |
| megatron.core.transformer.attention.forward.linear_proj | 1.599264 |
| megatron.core.transformer.transformer_layer._forward_attention.self_attention | 11.103264 |
| megatron.core.transformer.transformer_layer._forward_attention.self_attn_bda | 0.563744 |
| megatron.core.transformer.mlp.forward.linear_fc1 | 6.256736 |
| megatron.core.transformer.mlp.forward.activation | 0.810624 |
| megatron.core.transformer.mlp.forward.linear_fc2 | 6.124512 |
| megatron.core.transformer.transformer_layer._forward_mlp.mlp | 13.20336 |
| megatron.core.transformer.transformer_layer._forward_mlp.mlp_bda | 0.56288 |
| megatron.core.transformer.attention.forward.qkv | 174.677307 |
| megatron.core.transformer.attention.forward.adjust_key_value | 0.002976 |
| megatron.core.transformer.attention.forward.rotary_pos_emb | 0.002912 |
| megatron.core.transformer.attention.forward.core_attention | 886.057251 |
| megatron.core.transformer.attention.forward.linear_proj | 2.675072 |
| megatron.core.transformer.transformer_layer._forward_attention.self_attention | 1064.229736 |
| megatron.core.transformer.transformer_layer._forward_attention.self_attn_bda | 1141.586548 |
| megatron.core.transformer.mlp.forward.linear_fc1 | 5.865376 |
| megatron.core.transformer.mlp.forward.activation | 461.927063 |
| megatron.core.transformer.mlp.forward.linear_fc2 | 4.667776 |
| megatron.core.transformer.transformer_layer._forward_mlp.mlp | 473.381622 |
| megatron.core.transformer.transformer_layer._forward_mlp.mlp_bda | 0.453376 |
| megatron.core.transformer.attention.forward.qkv | 1.4032 |
| megatron.core.transformer.attention.forward.adjust_key_value | 0.002944 |
| megatron.core.transformer.attention.forward.rotary_pos_emb | 0.002912 |
| megatron.core.transformer.attention.forward.core_attention | 3.070304 |
| megatron.core.transformer.attention.forward.linear_proj | 1.651584 |
| megatron.core.transformer.transformer_layer._forward_attention.self_attention | 6.150016 |
| megatron.core.transformer.transformer_layer._forward_attention.self_attn_bda | 0.453888 |
| megatron.core.transformer.mlp.forward.linear_fc1 | 3.065792 |
| megatron.core.transformer.mlp.forward.activation | 0.3336 |
| megatron.core.transformer.mlp.forward.linear_fc2 | 3.851328 |
| megatron.core.transformer.transformer_layer._forward_mlp.mlp | 7.263136 |
| megatron.core.transformer.transformer_layer._forward_mlp.mlp_bda | 0.452768 |
| megatron.core.transformer.attention.forward.qkv | 1.405376 |
| megatron.core.transformer.attention.forward.adjust_key_value | 0.003008 |
| megatron.core.transformer.attention.forward.rotary_pos_emb | 0.002944 |
| megatron.core.transformer.attention.forward.core_attention | 3.070336 |
| megatron.core.transformer.attention.forward.linear_proj | 1.66768 |
| megatron.core.transformer.transformer_layer._forward_attention.self_attention | 6.168352 |
| megatron.core.transformer.transformer_layer._forward_attention.self_attn_bda | 0.452832 |
| megatron.core.transformer.mlp.forward.linear_fc1 | 3.068704 |
| megatron.core.transformer.mlp.forward.activation | 0.334592 |
| megatron.core.transformer.mlp.forward.linear_fc2 | 3.889152 |
| megatron.core.transformer.transformer_layer._forward_mlp.mlp | 7.305792 |
| megatron.core.transformer.transformer_layer._forward_mlp.mlp_bda | 0.452928 |
| megatron.core.transformer.attention.forward.qkv | 1.403552 |
| megatron.core.transformer.attention.forward.adjust_key_value | 0.002848 |
| megatron.core.transformer.attention.forward.rotary_pos_emb | 0.002912 |
| megatron.core.transformer.attention.forward.core_attention | 3.071712 |
| megatron.core.transformer.attention.forward.linear_proj | 1.898272 |
| megatron.core.transformer.transformer_layer._forward_attention.self_attention | 6.397824 |
| megatron.core.transformer.transformer_layer._forward_attention.self_attn_bda | 0.453376 |
| megatron.core.transformer.mlp.forward.linear_fc1 | 3.06768 |
| megatron.core.transformer.mlp.forward.activation | 0.335104 |
| megatron.core.transformer.mlp.forward.linear_fc2 | 3.925888 |
| megatron.core.transformer.transformer_layer._forward_mlp.mlp | 7.343264 |
| megatron.core.transformer.transformer_layer._forward_mlp.mlp_bda | 0.4536 |
| megatron.core.transformer.attention.forward.qkv | 1.404608 |
| megatron.core.transformer.attention.forward.adjust_key_value | 0.00288 |
| megatron.core.transformer.attention.forward.rotary_pos_emb | 0.00288 |
| megatron.core.transformer.attention.forward.core_attention | 3.07392 |
| megatron.core.transformer.attention.forward.linear_proj | 1.90128 |
| megatron.core.transformer.transformer_layer._forward_attention.self_attention | 6.404992 |
| megatron.core.transformer.transformer_layer._forward_attention.self_attn_bda | 0.45312 |
| megatron.core.transformer.mlp.forward.linear_fc1 | 3.055552 |
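The rows above repeat one block of the same 12 operation names, suggesting per-layer (or per-step) timing samples. A minimal sketch of how such `(name, values)` pairs could be aggregated per operation; the few rows here are copied from the table, and the assumption that each value is a per-call timing is ours, not stated by the dataset:

```python
from collections import defaultdict
from statistics import mean

# A handful of (name, value) rows copied from the table above; the full
# dataset has ~2.17k rows spread over 12 distinct operation names.
rows = [
    ("megatron.core.transformer.attention.forward.qkv", 2.778368),
    ("megatron.core.transformer.attention.forward.core_attention", 7.0504),
    ("megatron.core.transformer.attention.forward.qkv", 2.807136),
    ("megatron.core.transformer.attention.forward.core_attention", 6.992032),
]

# Group the timing samples by operation name.
by_op = defaultdict(list)
for name, value in rows:
    by_op[name].append(value)

# Summarize each operation: sample count and mean timing.
for name, values in sorted(by_op.items()):
    short = name.rsplit(".", 1)[-1]  # e.g. "qkv", "core_attention"
    print(f"{short}: n={len(values)}, mean={mean(values):.4f}")
```

Loading the full dataset instead of the hand-copied rows would make the outliers visible (e.g. the `core_attention` sample at 886.057251 against a typical ~3-7), which usually indicates a first-iteration warm-up or compilation step being captured alongside steady-state timings.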