Learning Deep Learning Week 3

Week 14 of documenting my AI/ML learning journey (Jan 12 - Jan 18)

What was discussed last week…

  • The “ImageDataGenerator” class from Keras can load, preprocess, and augment images for training a model; “datagens” are object instances of “ImageDataGenerator”.

  • Training, Input, Target, Validation, and Test Data each have specific roles, even though their names are sometimes used interchangeably in practice.
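As a refresher on the first point, here’s a minimal sketch of creating a “datagen”, assuming the tensorflow.keras “ImageDataGenerator” API (the specific augmentation settings are just illustrative):

```python
from tensorflow.keras.preprocessing.image import ImageDataGenerator

# A "datagen" instance that normalizes pixels and applies light augmentation
datagen = ImageDataGenerator(
    rescale=1.0 / 255,     # scale pixel values from [0, 255] to [0, 1]
    rotation_range=15,     # random rotations of up to 15 degrees
    horizontal_flip=True,  # random left-right flips
)
```

You’d then call something like datagen.flow_from_directory(...) to stream augmented batches into model.fit.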

Sorry guys…I was really busy this week with robotics meetings (FRC Team 1771, btw) and an upcoming saxophone audition, so this week’s post will be more of an addition to last week’s post.

Saturday, January 18th

Today I took a quick quiz, where I found out that I didn’t quite know what the “include_top” parameter did when loading a pre-trained model in Keras…but now I know what it does.

When include_top=True, the fully-connected (top) classifier layers at the end of the pre-trained model are kept, which preserves its original form but makes it unsuitable for feature extraction. When include_top=False, those layers are removed, leaving only the convolutional base so the model can be used for feature extraction.

Removing the top fully-connected layers of a model is like rerouting a team of delivery drivers: instead of delivering anywhere within a 10-mile radius, they specialize in specific neighborhoods and streets (feature extraction), and retraining them on those focused routes (task-specific retraining) makes their deliveries more efficient.

from tensorflow import keras

model = keras.applications.VGG16(
    include_top=True, # This line
    weights="imagenet",
    input_tensor=None,
    input_shape=None,
    pooling=None,
    classes=1000,
    classifier_activation="softmax",
    name="vgg16",
)
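For contrast, here’s a small sketch (again assuming tensorflow.keras) of loading VGG16 with include_top=False, so it outputs convolutional feature maps instead of 1000-way class scores; weights=None is used here just to skip the ImageNet download:

```python
from tensorflow.keras import applications

# Drop the fully-connected "top" to keep only the convolutional base
base = applications.VGG16(
    include_top=False,        # remove the classifier layers
    weights=None,             # random weights, just to sketch the shapes
    input_shape=(224, 224, 3),
)

# Without the top, the output is a 7x7x512 feature map, not a class vector
print(base.output_shape)  # (None, 7, 7, 512)
```

Those feature maps are what you’d feed into your own classifier head when doing transfer learning.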

Today I also learned about transpose convolution, which is basically the opposite of regular convolution: the output tensor has larger spatial dimensions than the input tensor. Thus, transpose convolution is especially useful when an image’s resolution needs to be increased, when an image needs to be generated from a vector, or when semantic segmentation requires pixel-wise classification maps. (I’ll talk about this more in my next post!)
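A quick sketch of the size-increasing behavior, using Keras’s Conv2DTranspose layer (the filter count and input size here are arbitrary): with strides=2 and padding="same", the layer doubles the spatial dimensions.

```python
import numpy as np
from tensorflow.keras import layers

# A stride-2 transpose convolution doubles the spatial dimensions
upsample = layers.Conv2DTranspose(
    filters=16, kernel_size=3, strides=2, padding="same"
)

x = np.zeros((1, 8, 8, 4), dtype="float32")  # one 8x8 feature map, 4 channels
y = upsample(x)
print(y.shape)  # (1, 16, 16, 16) -- 8x8 upsampled to 16x16
```

A regular Conv2D with the same strides=2 would do the reverse, shrinking 16x16 down to 8x8.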

Lessons Learned

  • I learned what the “include_top” parameter does when loading a pre-trained model in Keras

  • Also, the basics of what transpose convolution is: the opposite of regular convolution!

Resources

Course I followed: