Press the ' key to hide/show the right "Model" window.
The graph model from
'https://storage.googleapis.com/tfhub-tfjs-modules/google/imagenet/mobilenet_v1_100_224/classification/1/model.json'
is used.
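As a minimal loading sketch (assuming the @tensorflow/tfjs package; tf.loadGraphModel is the standard TF.js entry point for such models, not necessarily what this demo itself calls):

    import * as tf from '@tensorflow/tfjs';

    const MODEL_URL =
        'https://storage.googleapis.com/tfhub-tfjs-modules/google/imagenet/mobilenet_v1_100_224/classification/1/model.json';

    async function run(): Promise<void> {
      // Load the TF.js graph model; the weight shards are fetched
      // relative to model.json.
      const model = await tf.loadGraphModel(MODEL_URL);

      // MobileNet v1 expects a batched 224x224 RGB image.
      const input = tf.zeros([1, 224, 224, 3]);
      const logits = model.predict(input) as tf.Tensor;
      logits.print();
    }

    run();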
You can use, for example, the "JSON Viewer Awesome" Chrome extension to
explore the model.json file.
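For a quick programmatic look you can also count the op types in the graph. This is a hypothetical sketch (countOps is not part of any library); it assumes model.json follows the usual TF.js graph-model layout, with the GraphDef stored as JSON under modelTopology and one entry per node:

    // Tally how many times each op type appears in the graph.
    async function countOps(modelUrl: string): Promise<Record<string, number>> {
      const res = await fetch(modelUrl);
      const modelJson = await res.json();
      const counts: Record<string, number> = {};
      for (const node of modelJson.modelTopology.node) {
        counts[node.op] = (counts[node.op] ?? 0) + 1;
      }
      return counts;
    }

    countOps(MODEL_URL).then(counts => console.log(counts));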
"The non-fused batch norm does computations using several individual Ops.
Fused batch norm combines the individual operations into a single kernel,
which runs faster"
The BatchNorm constants (beta, gamma, the moving mean E[x], and the
variance Var[x]) are folded into the Scale and Offset constants.
ReLU is fused with the BatchNorm operation; see the scalar sketch below.
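A scalar sketch of that folding and fusion (the real op works per channel over whole tensors, and the eps value here is an assumption):

    // BatchNorm:  y = gamma * (x - mean) / sqrt(variance + eps) + beta
    // rewrites to:        y = scale * x + offset
    // fusing ReLU gives:  out = max(0, scale * x + offset)
    function foldedBatchNormRelu(
        x: number, gamma: number, beta: number, mean: number,
        variance: number, eps = 1e-3): number {
      const scale = gamma / Math.sqrt(variance + eps);  // "Scale" constant
      const offset = beta - mean * scale;               // "Offset" constant
      return Math.max(0, scale * x + offset);           // fused ReLU
    }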
So the BatchNorm layers are not folded into the convolution operations here
(although they could be folded into the convolution weights before
inference)?
See also the fold_batch_norms transform in the TensorFlow Graph Transform
Tool; a sketch of that folding follows.
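Because scale and offset are per output channel, the affine BatchNorm can be absorbed into the preceding convolution's weights and bias. This is a hypothetical helper, not the Graph Transform Tool's actual code; weights are assumed to be indexed as [outChannel][flattened kernel elements]:

    // conv'(x)[c] = scale[c] * conv(x)[c] + offset[c] is absorbed by
    // rescaling each output channel's weights and adjusting the bias.
    function foldBatchNormIntoConv(
        weights: number[][],  // [outChannel][flattened kernel elements]
        bias: number[], scale: number[], offset: number[]) {
      const foldedWeights =
          weights.map((kernel, c) => kernel.map(w => w * scale[c]));
      const foldedBias = bias.map((b, c) => b * scale[c] + offset[c]);
      return {foldedWeights, foldedBias};
    }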