Back to Basics 03


In universities and colleges around the world, Google has encouraged the creation of "Google Developer Student Clubs". To round out the year, Sam Witteveen and I decided to offer our three-part "Back to Basics" series to clubs in the South-East Asia region, to introduce them to TensorFlow and Deep Learning.

This session was the third in the three-part series, and focused on Deep Learning for Natural Language Processing using Transformers (an unusual choice for an introductory course, but more 'current' than the standard 'word embeddings + RNNs' approach).

Our presentation was in two parts:

  • My initial talk, which:

    • Gave a brief outline of the use cases for Transformers in NLP tasks
    • Introduced two new layers: Tokenisation and Transformer
    • Outlined the BERT task, and how we could then apply Transfer Learning
  • Sam's Code-Along using Google Colab, where he led the audience through:

    • Loading and preparing an App review dataset
    • Building a TensorFlow (Hugging Face) BERT Classifier
    • Fine-tuning the model on our dataset, and looking at the outputs
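To give a flavour of the Tokenisation layer mentioned above: BERT-style models first split raw text into subword pieces using a greedy, longest-match-first lookup against a fixed vocabulary (the 'WordPiece' idea). The sketch below illustrates the principle only; the tiny vocabulary is hypothetical, whereas a real BERT tokeniser ships with a vocabulary of roughly 30,000 entries.

```python
# Minimal sketch of greedy (longest-match-first) subword tokenisation,
# the idea behind BERT's WordPiece tokeniser.
# NOTE: this toy vocabulary is hypothetical, purely for illustration.
VOCAB = {"[UNK]", "play", "##ing", "##ed", "the", "app", "re", "##view"}

def wordpiece(word, vocab=VOCAB):
    """Split one lowercase word into subword tokens, longest match first."""
    tokens, start = [], 0
    while start < len(word):
        end, piece = len(word), None
        while start < end:
            candidate = word[start:end]
            if start > 0:
                candidate = "##" + candidate  # '##' marks a continuation piece
            if candidate in vocab:
                piece = candidate
                break
            end -= 1  # no match yet: try a shorter prefix
        if piece is None:
            return ["[UNK]"]  # nothing matched: whole word is unknown
        tokens.append(piece)
        start = end
    return tokens

print(wordpiece("playing"))  # ['play', '##ing']
print(wordpiece("review"))   # ['re', '##view']
```

The Transformer layer then operates on the integer IDs of these subword tokens, which is what lets a fixed-size vocabulary cover arbitrary input text.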

The slides for my talk are here:

Presentation Screenshot

If there are any questions about the presentation please ask below, or contact me using the details given on the slides themselves.

Presentation Content Example