Classifying Flowers with CNNs and Transfer Learning

Open a new browser window (or tab) and copy/paste (Ctrl+V) images from image.google.com, unsplash.com, WebGL procedural flowers... You can also copy images from your PC, e.g. via Photos or Paint. Drag the mouse (from the top-left down to the right) to mark a new region for inference. Use the A,S,D,W keys to translate the bounding box and J,K,L,I to resize it (hold the Shift key to move 10x faster). This region of the source image (see its position, width and height in the console) is scaled to a 224x224 image and shown on the right.
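The crop-and-scale step can be sketched in Python (an illustrative sketch only: the demo itself does this on an HTML canvas in JS, and the nearest-neighbour resampling here is my assumption, not the page's actual resize code):

```python
import numpy as np

def crop_and_scale(image, x, y, w, h, size=224):
    """Cut the (x, y, w, h) region out of `image` and rescale it to
    size x size with nearest-neighbour sampling (illustration only)."""
    region = image[y:y + h, x:x + w]
    # source-pixel index for every target pixel
    rows = np.arange(size) * h // size
    cols = np.arange(size) * w // size
    return region[rows][:, cols]

# a toy 300x400 RGB "photo"
img = np.random.randint(0, 256, (300, 400, 3), dtype=np.uint8)
patch = crop_and_scale(img, x=50, y=30, w=200, h=150)
print(patch.shape)   # (224, 224, 3)
```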

Comments

This is the TFjs port of Roshan Adusumilli's work Classifying Flowers with CNNs and Transfer Learning (Colab). The tf_flowers dataset (218 MB) with 5 classes 'dandelion', 'daisy', 'tulips', 'sunflowers', 'roses' is used. You can train the model on Colab and use it in the browser.

More procedural flower classifiers: Daisy, Tulip, Rose.

Surprisingly, after training for 6 epochs I get val_accuracy: 0.8759. That is much better than in the Recognize Flowers using Transfer Learning TensorFlow example (val_accuracy: 0.7592 after 10 epochs and val_accuracy: 0.8112 after fine-tuning the baseModel). What is wrong?

Since we don't fine-tune the base MobileNet model, we could simply collect the extracted features (for N = 2500 images we get only N x 1280 x 4 B = 12.8 MB) and then train the top model in the browser. We could also augment the data (e.g. by image flipping).
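The size arithmetic above, and the flip augmentation, can be checked in a few lines of NumPy (a sketch; the 1280-d vectors here are zero stand-ins for real MobileNet features):

```python
import numpy as np

N = 2500
features = np.zeros((N, 1280), dtype=np.float32)  # stand-in feature cache
print(features.nbytes)  # 2500 * 1280 * 4 B = 12,800,000 B = 12.8 MB

# augmentation happens on the images, before feature extraction:
image = np.random.randint(0, 256, (224, 224, 3), dtype=np.uint8)
flipped = image[:, ::-1]   # horizontal flip doubles the dataset
assert (flipped[:, ::-1] == image).all()  # flipping twice restores the image
```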

Have you seen the Colab SSD/YOLO Transfer Learning demos? (with better "rose" detection :)

A few corrections to Roshan's Colab model

1. The TFHub feature extractor "https://tfhub.dev/google/imagenet/mobilenet_v2_100_224/feature_vector/3" is used as the baseModel (it has the latest TFjs module).

2. On top of the baseModel a dense1 layer is defined:

dense1 = layers.Dense(num_classes, activation='softmax')  # trainable top layer
model = tf.keras.Sequential([
    feature_extractor,   # frozen MobileNet v2 (outputs 1280-d feature vectors)
    dense1
])
model.summary()

model.compile(optimizer='adam',
              loss='sparse_categorical_crossentropy',
              metrics=['accuracy'])
EPOCHS = 6
history = model.fit(train_batches,
                    epochs=EPOCHS,
                    validation_data=validation_batches)

Importing Colab model

The model is trained for 6 epochs, and then the dense1 layer weights w0 and biases w1 are printed as plain text:
w = dense1.get_weights()
tf.print(tf.reshape(w[0],[1280*5]), summarize=-1)
tf.print(w[1])
Then these two JS arrays are used in the TFjs model as:
  const dense1 = tf.layers.dense({
    inputShape: [1280],  units: 5,  activation: 'softmax',  useBias: true,
    weights: [tf.tensor(w0, [1280,5]), tf.tensor(w1)]
  })
  model = tf.sequential({ layers: [ dense1 ] });
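For intuition, the whole top model is just x·W + b followed by a softmax over the 5 flower classes. A NumPy sketch of the reshape and the forward pass (with random stand-in numbers in place of the real printed weights):

```python
import numpy as np

flat = np.random.randn(1280 * 5).astype(np.float32)  # as printed by tf.print
b = np.random.randn(5).astype(np.float32)            # biases w1

W = flat.reshape(1280, 5)     # same layout as tf.tensor(w0, [1280, 5])
x = np.random.randn(1280).astype(np.float32)  # a MobileNet feature vector

logits = x @ W + b                      # the dense layer
probs = np.exp(logits - logits.max())
probs /= probs.sum()                    # softmax -> class probabilities
print(probs.shape, round(float(probs.sum()), 6))  # (5,) 1.0
```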
Finally, the top model is saved/loaded in TFjs:
  const saveResult = await model.save('downloads://model');
  ... ...
  model = await tf.loadLayersModel('webgl/flowers/model.json')
Shanqing Cai (Google) suggests:
"try using tensorflowjs_converter's Python API instead of the command line, that is the `tfjs.converters.save_keras_model` Python function. When this function runs in the same Python program as the one that defines the original tf.keras model with the custom layer class, it'll have access to the definition of that class."
But I'd like to use the "feature_extraction" JS module from TFHub...

Model summary

Total Number of Classes: 5
Total Number of Training Images: 2590
Total Number of Validation Images: 1080 

Total params: 2,264,389
Trainable params: 6,405
Non-trainable params: 2,257,984
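These counts are easy to check by hand: only dense1 is trainable, so the trainable params are its 1280 x 5 weights plus 5 biases, and the rest belongs to the frozen feature extractor:

```python
# dense1: 1280 inputs x 5 classes, plus 5 biases
trainable = 1280 * 5 + 5
non_trainable = 2_257_984        # frozen MobileNet v2 feature extractor
total = trainable + non_trainable
print(trainable, total)          # 6405 2264389
```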

Epoch 6/6
81/81 - 11s 137ms/step - loss: 0.1767 - accuracy: 0.9533 - val_loss: 0.3230 - val_accuracy: 0.8759

TFjs notes     updated 14 Jan 2020