
Data Packing Process for MLPERF BERT

NLP datasets often have large variations in sample length. A common approach on GPUs and CPUs is to set a maximum sequence length (max_seq_len) and pad shorter sequences with zeros. This is inefficient because it spends many operations on padding (multiplications by zero). The potential speedup from avoiding padding is the … Read more
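The gain comes from concatenating several short samples into a single max_seq_len row instead of padding each one separately. Below is a minimal sketch of that idea as a greedy first-fit packer over sample lengths; it is an illustration, not the MLPerf reference implementation.

```python
# Greedy first-fit packing sketch: group short sequences into "packs" of at
# most max_seq_len tokens so fewer positions are wasted on zero padding.
def pack_sequences(lengths, max_seq_len=512):
    packs = []       # each pack is a list of sample indices
    remaining = []   # free space left in each pack
    for idx, length in sorted(enumerate(lengths), key=lambda x: -x[1]):
        for p, space in enumerate(remaining):
            if length <= space:
                packs[p].append(idx)
                remaining[p] -= length
                break
        else:
            packs.append([idx])
            remaining.append(max_seq_len - length)
    return packs

# Example: six samples fit into three packs instead of six padded rows.
print(pack_sequences([3, 5, 2, 8, 4, 1], max_seq_len=8))
```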

Intro to Autoencoders

An adaptation of the Intro to Autoencoders tutorial using Habana Gaudi AI processors. This tutorial introduces autoencoders with three examples: the basics, image denoising, and anomaly detection. An autoencoder is a special type of neural network that is trained to copy its input to its output. For example, given an image of a handwritten digit, an autoencoder … Read more
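For a feel of what the tutorial builds, here is a minimal sketch of the basic example, assuming TensorFlow/Keras is available; the latent size and single training epoch are illustrative choices rather than the tutorial's exact configuration.

```python
import tensorflow as tf

# The encoder compresses a 28x28 digit to a 64-dimensional code;
# the decoder reconstructs the image from that code.
latent_dim = 64
encoder = tf.keras.Sequential([
    tf.keras.layers.Flatten(),
    tf.keras.layers.Dense(latent_dim, activation="relu"),
])
decoder = tf.keras.Sequential([
    tf.keras.layers.Dense(784, activation="sigmoid"),
    tf.keras.layers.Reshape((28, 28)),
])
autoencoder = tf.keras.Sequential([encoder, decoder])
autoencoder.compile(optimizer="adam", loss="mse")

(x_train, _), (x_test, _) = tf.keras.datasets.mnist.load_data()
x_train, x_test = x_train / 255.0, x_test / 255.0
# The model is trained to reproduce its own input.
autoencoder.fit(x_train, x_train, epochs=1, validation_data=(x_test, x_test))
```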

Convolutional Neural Network (CNN)

An adaptation of the Convolutional Neural Network (CNN) tutorial using Habana Gaudi AI processors. This tutorial demonstrates training a simple Convolutional Neural Network (CNN) to classify CIFAR images. Because this tutorial uses the Keras Sequential API, creating and training your model will take just a few lines of code. After importing TensorFlow, the tutorial enables a single Gaudi device by loading … Read more
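A condensed sketch of that flow is shown below. The habana_frameworks.tensorflow import and load_habana_module() call reflect Habana's TensorFlow integration and should be treated as an assumption here; without them the same model simply runs on CPU/GPU.

```python
import tensorflow as tf
from tensorflow.keras import layers, models

# Enable the Gaudi (HPU) device if the Habana TensorFlow package is present.
try:
    from habana_frameworks.tensorflow import load_habana_module
    load_habana_module()
except ImportError:
    pass  # not running on a Gaudi system

(train_images, train_labels), (test_images, test_labels) = \
    tf.keras.datasets.cifar10.load_data()
train_images, test_images = train_images / 255.0, test_images / 255.0

# A small CNN in the style of the Keras CIFAR-10 example.
model = models.Sequential([
    layers.Conv2D(32, (3, 3), activation="relu", input_shape=(32, 32, 3)),
    layers.MaxPooling2D((2, 2)),
    layers.Conv2D(64, (3, 3), activation="relu"),
    layers.MaxPooling2D((2, 2)),
    layers.Conv2D(64, (3, 3), activation="relu"),
    layers.Flatten(),
    layers.Dense(64, activation="relu"),
    layers.Dense(10),
])
model.compile(
    optimizer="adam",
    loss=tf.keras.losses.SparseCategoricalCrossentropy(from_logits=True),
    metrics=["accuracy"],
)
model.fit(train_images, train_labels, epochs=1,
          validation_data=(test_images, test_labels))
```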

Mixed precision

An adaptation of the TensorFlow Mixed precision tutorial using Habana Gaudi AI processors. This tutorial demonstrates enabling mixed-precision training for Keras models. You can find the full guide to TensorFlow Mixed Precision Training on Gaudi here. Mixed precision is the use of both 16-bit and 32-bit floating-point types in a model during training to make it … Read more
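As a rough sketch of the Keras side of this, the snippet below sets a global mixed-precision policy. The "mixed_bfloat16" policy is used here on the assumption that Gaudi's 16-bit type is bfloat16 (GPUs typically use "mixed_float16"); the linked Gaudi guide may prescribe additional or different steps.

```python
import tensorflow as tf
from tensorflow.keras import layers, mixed_precision

# Compute in bfloat16 while keeping variables in float32.
mixed_precision.set_global_policy("mixed_bfloat16")

model = tf.keras.Sequential([
    layers.Dense(128, activation="relu", input_shape=(784,)),
    # Keep the final softmax in float32 for numerical stability.
    layers.Dense(10, activation="softmax", dtype="float32"),
])
print(model.layers[0].compute_dtype)   # bfloat16
print(model.layers[0].variable_dtype)  # float32
```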

Image classification

An adaptation of the Image classification tutorial using Habana Gaudi AI processors. This tutorial shows how to classify images of flowers. It creates an image classifier using a keras.Sequential model and loads data using preprocessing.image_dataset_from_directory. You will gain practical experience with the following concepts: efficiently loading a dataset off disk, and identifying overfitting and applying techniques to mitigate it, including … Read more
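A minimal sketch of the data-loading step is below; the flower_photos/ path, image size, and split are placeholders, and the only assumption about the data is the usual one-sub-folder-per-class layout.

```python
import tensorflow as tf

# Build training and validation datasets from an on-disk image folder.
train_ds = tf.keras.preprocessing.image_dataset_from_directory(
    "flower_photos",
    validation_split=0.2,
    subset="training",
    seed=123,
    image_size=(180, 180),
    batch_size=32,
)
val_ds = tf.keras.preprocessing.image_dataset_from_directory(
    "flower_photos",
    validation_split=0.2,
    subset="validation",
    seed=123,
    image_size=(180, 180),
    batch_size=32,
)
print(train_ds.class_names)  # class names are inferred from sub-folder names
```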
