3 Impressive Ways to Use the Vertex AI Platform

2026-04-28

Key Takeaways

  • Vertex AI consolidates dataset creation, AutoML training, endpoint deployment, and prediction serving into one Google Cloud product.
  • Tabular classification trains a model that assigns labels to rows of structured data and serves predictions through an online endpoint.
  • Tabular regression predicts continuous numerical values and supports both online and batch prediction modes.
  • Time-series forecasting on tabular data produces batch predictions for sequential, time-stamped inputs.
  • Image classification extends the same workflow to picture data, with online prediction through a deployed endpoint.
  • Tutorials run as console step-throughs or as Python SDK notebooks in Colab, Colab Enterprise, GitHub, or Vertex AI Workbench.
An AI logo, generated using artificial intelligence tools. Image credit: Alius Noreika / AI

Google’s Vertex AI brings model training, deployment, and prediction into one managed environment, and three workflows show off what the platform actually does day to day: building a classification model from tabular data, running regression on numerical datasets, and forecasting time-series values without writing model code from scratch. A fourth pattern, image classification, rounds out the picture for teams working with visual data.

Each workflow follows the same arc inside Vertex AI: load a dataset, train an AutoML model, deploy it to an endpoint, and request predictions either online or in batch. The walkthroughs are available as Google Cloud console guides and as Python notebooks that open in Colab, Colab Enterprise, GitHub, or Vertex AI Workbench.

What Vertex AI Is Built To Do

Vertex AI is Google Cloud’s unified machine learning platform. It pulls together the pieces that engineers and analysts used to assemble themselves — dataset management, training, hyperparameter handling, model registry, deployment, monitoring, and prediction serving — under one product surface. The platform supports both AutoML, where Google trains the model from labelled examples, and custom training jobs written in frameworks like TensorFlow or PyTorch.

The three tutorial paths described below all use AutoML, which is the fastest route from raw data to a working endpoint. They were chosen because they map to the most common data shapes a team will encounter: rows in a spreadsheet, sequences over time, and images.

  1. Train a Classification Model on Tabular Data

The tabular classification workflow is the entry point most teams take. The task is straightforward: given a table where each row describes an item and one column holds a category label, train a model that predicts the label for new rows.

The steps inside Vertex AI follow a predictable path. First, create a Vertex AI dataset and import the tabular source — typically a CSV in Cloud Storage or a BigQuery table. Then, train a model with AutoML, which handles feature encoding, model selection, and hyperparameter tuning automatically. Once training finishes, deploy the model to an endpoint. The endpoint is the live serving address that accepts prediction requests.

With the endpoint running, the workflow ends at online prediction: send a single row of features over an HTTP request, get a label back in milliseconds. This pattern fits use cases like fraud scoring, lead qualification, churn flagging, and content moderation on structured signals.

Google offers this tutorial as a step-by-step Google Cloud console guide and as a Python SDK notebook that opens in Colab, Colab Enterprise, GitHub, or Vertex AI Workbench.

  2. Train a Regression Model for Numerical Predictions

Regression on tabular data follows the same scaffolding as classification, but the target column holds a continuous number rather than a category. The model learns to estimate values like price, demand, lifetime value, or sensor readings.

The workflow again starts with a Vertex AI dataset built from tabular input, then trains an AutoML regression model on top of it. The deployment step branches into two prediction modes, and the choice matters for how the model gets used downstream.

Prediction modes for regression:

  • Online prediction sends one or a few rows to a deployed endpoint and returns a value in real time. Best fit: live applications such as pricing widgets, instant scoring, or in-product recommendations.
  • Batch prediction submits a large file of inputs, runs the prediction asynchronously, and writes results back to storage. Best fit: periodic scoring jobs, analytics pipelines, and overnight reports.

Vertex AI provides separate notebooks for each prediction mode so users can pick the version that matches their serving pattern. Both notebooks open in Colab, Colab Enterprise, GitHub, or Vertex AI Workbench.

  3. Train a Forecasting Model on Time-Series Data

Forecasting is a specialised cousin of regression. The data still lives in a table, but rows carry timestamps and the model learns from sequence — yesterday’s value influences today’s prediction, last week’s pattern shapes next week’s.

The Vertex AI tabular forecasting workflow creates a dataset, trains an AutoML forecasting model, and produces predictions in batch format only. Batch is the practical default here because forecasting jobs typically run on a schedule — every hour, every night, every week — over many series at once. Inventory planning, energy load estimation, traffic projections, and revenue forecasts all fit this pattern.

This tutorial is published as a notebook with the same four launch options: Colab, Colab Enterprise, GitHub, or Vertex AI Workbench.

Bonus Workflow: Image Classification

Beyond the three tabular paths, Vertex AI supports image classification with the same dataset-train-deploy-predict structure. Users upload images with category labels, train an AutoML image classification model, deploy to an endpoint, and request online predictions on new pictures. The tutorial is offered as a step-by-step guide on the Google Cloud console.

Comparing the Workflows at a Glance

| Workflow | Data type | Output | Prediction modes |
| --- | --- | --- | --- |
| Tabular classification | Rows of structured features | Category label | Online |
| Tabular regression | Rows of structured features | Continuous numerical value | Online and batch |
| Tabular forecasting | Time-stamped rows | Future values per series | Batch |
| Image classification | Labelled images | Category label | Online |

Opening a Tutorial Notebook in Vertex AI Workbench

Notebook tutorials are the path most engineers take, because the Python SDK exposes more control than the console wizard. To run one inside a managed Workbench instance, click the Vertex AI Workbench link in the notebook list. The link opens the Vertex AI Workbench console, where the next screen — Deploy to notebook — asks for an instance name. Enter one and click Create.

When the instance is ready, a dialog appears titled Ready to open notebook. Click Open. The next page, Confirm deployment to notebook server, asks for confirmation; select Confirm. Before running any cells, open the Kernel menu and choose Restart Kernel and Clear all Outputs. This clears any state left from a previous session and gives the notebook a clean execution context.

Choosing the Right Tutorial for Your Data

The tutorials are meant as templates rather than finished products. Pick the one that matches the shape of your data and the kind of answer you need: a label, a number, a sequence, or a category for an image. Once a workflow is familiar, the same scaffolding extends to custom training, custom prediction containers, pipelines, and the rest of the Vertex AI surface — but AutoML is where most teams should start.

Documentation, code, and the full notebook library are available through Google Cloud’s Vertex AI documentation and through Google’s GitHub repositories.

Sources: Vertex AI on Google Cloud

Written by Alius Noreika
