More procedural flower classifiers: Daisy, Tulip, Rose.
Surprisingly, after training for only 6 epochs I get val_accuracy: 0.8759. That is much better than in the "Recognize Flowers using Transfer Learning" TensorFlow example (val_accuracy: 0.7592 after 10 epochs, and val_accuracy: 0.8112 after fine-tuning the baseModel). What is wrong?
Since we don't fine-tune the base MobileNet model, we could simply collect the extracted features once (for N = 2500 images that is only N x 1280 x 4 B = 12.8 MB) and then train the top model in the browser. We could also augment the data (e.g. by flipping images).
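A quick back-of-envelope check of that storage estimate (my own sketch; it assumes float32 features, i.e. 4 bytes per value, and the 1280-dim MobileNet v2 pooled feature vector):

```python
# Sanity check of the feature-storage estimate above
# (assumption: float32 features, 1280 values per image, 4 bytes each).
n_images = 2500
feature_dim = 1280
bytes_per_value = 4
total_bytes = n_images * feature_dim * bytes_per_value
print(total_bytes / 1e6, "MB")  # 12.8 MB
```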
Have you seen the Colab SSD/YOLO Transfer Learning demos? ("rose" detection works even better there :)
2. On top of the baseModel, a dense1 layer is defined:
dense1 = layers.Dense(num_classes, activation='softmax')
model = tf.keras.Sequential([
    feature_extractor,
    dense1
])
model.summary()
model.compile(optimizer='adam',
              loss='sparse_categorical_crossentropy',
              metrics=['accuracy'])
EPOCHS = 6
history = model.fit(train_batches, epochs=EPOCHS,
                    validation_data=validation_batches)
w = dense1.get_weights()
tf.print(tf.reshape(w[0], [1280*5]), summarize=-1)
tf.print(w[1])

Then these two JS arrays are used in the TFjs model as
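Instead of copy-pasting the printed tensors into the browser script, the two arrays could also be dumped to JSON. A minimal sketch (the export_dense_weights helper and the file name are my own, not from the original post):

```python
import json

def export_dense_weights(kernel, bias, path="dense1_weights.json"):
    """Flatten a Dense layer's kernel (1280 x 5 nested lists) and bias (5,)
    into the flat arrays w0 and w1 consumed by the browser-side code."""
    payload = {
        "w0": [v for row in kernel for v in row],  # 1280*5 = 6400 values
        "w1": list(bias),                          # 5 values
    }
    with open(path, "w") as f:
        json.dump(payload, f)
    return payload

# Toy call with zero weights standing in for w = dense1.get_weights():
demo = export_dense_weights([[0.0] * 5 for _ in range(1280)], [0.0] * 5)
print(len(demo["w0"]), len(demo["w1"]))  # 6400 5
```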
const dense1 = tf.layers.dense({
  inputShape: [1280],
  units: 5,
  activation: 'softmax',
  useBias: true,
  weights: [tf.tensor(w0, [1280, 5]), tf.tensor(w1)]
});
model = tf.sequential({ layers: [dense1] });

Finally, the top model is saved/loaded in TFjs:
const saveResult = await model.save('downloads://model');
... ...
model = await tf.loadLayersModel('webgl/flowers/model.json');

Shanqing Cai (Google) suggests:
Total Number of Classes: 5
Total Number of Training Images: 2590
Total Number of Validation Images: 1080
Total params: 2,264,389
Trainable params: 6,405
Non-trainable params: 2,257,984
Epoch 6/6
81/81 - 11s 137ms/step - loss: 0.1767 - accuracy: 0.9533 - val_loss: 0.3230 - val_accuracy: 0.8759