Deploying Machine Learning Models on Google Cloud Platform (GCP)

May 4, 2020 · 9 min read

Train on Kaggle; deploy on Google Cloud

The deployment of a machine learning (ML) model to production starts with actually building the model, which can be done in several ways and with many tools.

The approach and tools used at the development stage are critical to ensuring smooth integration of the basic units that make up the machine learning pipeline. If these are not considered before starting a project, there’s a good chance you’ll end up with an ML system that has low efficiency and high latency.

For instance, using a function that has been deprecated might still work, but it tends to raise warnings and, as such, increases the response time of the system.

The first thing to do in order to ensure this good integration of all system units is to have a system architecture (blueprint) that shows the end-to-end integration of each logical part in the system. Below is the designed system architecture for this mini-project.

Model Development

When we discuss model development, we’re talking about an iterative process where hypotheses are tested and models are derived, trained, tested, and built until a model with desired results is achieved.

This is the fun part for data scientist teams, where they can use their machine learning skills for tasks such as exploratory data analysis, feature engineering, model training, and evaluations on the given data.

The model used in this project was built and serialized in this Kaggle kernel using the Titanic dataset. Note that I only used existing modules from standard packages such as Pandas, NumPy, and scikit-learn, so as not to end up building custom modules. You can take a look at my previous post [1], “Deployment of Machine Learning Model Demystified (Part 2)”, to learn more about custom pipelines.

The performance of the model could be greatly improved with feature transformation, but most of the transformers that work best on this data are not available in scikit-learn without combining Pandas, NumPy, and other libraries, which would mean building additional modules during deployment. To keep things as simple as possible, I’ll refrain from exploring these topics in depth.
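As a minimal sketch of this stage (using a synthetic stand-in for the Titanic data, since the actual training happens in the Kaggle kernel), a scikit-learn pipeline can be trained and serialized with joblib:

```python
import numpy as np
import joblib
from sklearn.pipeline import Pipeline
from sklearn.preprocessing import StandardScaler
from sklearn.linear_model import LogisticRegression

# Toy stand-in for Titanic features (e.g. age, fare, pclass) and survival labels.
rng = np.random.default_rng(42)
X = rng.normal(size=(100, 3))
y = (X[:, 0] + X[:, 1] > 0).astype(int)

# Keeping all preprocessing inside the pipeline means a single artifact
# can be serialized and later loaded by the prediction service.
model = Pipeline([
    ("scaler", StandardScaler()),
    ("clf", LogisticRegression()),
])
model.fit(X, y)

# Serialize the fitted pipeline; this file would be uploaded to the bucket.
joblib.dump(model, "model.joblib")
```

Serializing the whole pipeline (rather than the bare classifier) keeps the transformation logic and the model in one artifact, which is what the main script will later deserialize.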

Kaggle is the largest online community of data scientists and machine learning practitioners. Kaggle allows users to find and publish datasets, explore and build models in a web-based data-science environment, work with other data scientists and machine learning engineers, and enter competitions to solve data science challenges.

ML Deployment on Google Cloud Platform

Google Cloud Platform (GCP) is one of the primary options for cloud-based deployment of ML models, along with others such as AWS, Microsoft Azure, etc.

With GCP, depending on how you choose to deploy your model, there are three main options [2]:

  • Google AI Platform: An AI platform that makes it easy for machine learning developers, data scientists, and data engineers to take their ML projects from ideation to production and deployment, quickly and cost-effectively. From data engineering to “no lock-in” flexibility, Google’s AI Platform has an integrated toolchain that helps in building and running your own machine learning applications. As such, end-to-end ML model development and deployment is possible on Google’s AI Platform without the need for external tools. The advantage is that you don’t need to worry about choosing the best tool for each job, or how well each unit integrates with the larger system. Alternatives include Amazon SageMaker, Microsoft Azure ML, etc.
  • Google Cloud Functions: Cloud Functions is the simplest way to run code in the cloud. It is an event-driven, serverless compute platform: your function is executed when needed, with no servers to provision at setup time and no other compute resources to manage [3]. The advantages of using Cloud Functions include automatic scaling based on load; simplified development of complex applications across different languages; no servers to provision, manage, or upgrade; integrated monitoring, logging, and distributed tracing; built-in security at the role and per-function level, based on the principle of least privilege; and key networking capabilities for hybrid and multi-cloud scenarios. Alternatives include AWS Lambda, Azure Functions, etc.
  • Google App Engine: Google’s App Engine is a platform-as-a-service offering that’s mostly used for developing and hosting web applications. App Engine’s auto-scaling feature automatically allocates more resources to the web application to handle additional demand. This is the option I experimented with for this project.
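To illustrate the Cloud Functions option from the list above (a sketch with illustrative names, not this project’s code): an HTTP-triggered function in Python is just a callable that receives a Flask request object and returns a response.

```python
import json


def predict(request):
    """Handle an HTTP request; on GCP, `request` is a flask.Request object."""
    # Fall back to an empty payload so the sketch can also run locally.
    payload = request.get_json(silent=True) if request is not None else None
    features = (payload or {}).get("features", [])
    # A real function would run the deserialized model on `features` here.
    return json.dumps({"n_rows": len(features)})
```

On GCP this handler would live in `main.py` and be deployed with `gcloud functions deploy predict --runtime python37 --trigger-http`; the platform provisions and scales the underlying compute on demand.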

Why the use of the App Engine for this project?

App Engine is a comprehensive cloud-based platform that combines infrastructure as a service (IaaS), platform as a service (PaaS), and software as a service (SaaS). Its runtimes and languages are up to date, with great documentation. Features in the preview (beta) stage are made available to a large number of users, which keeps us informed of possible future developments.

To deploy this model on the App Engine using a terminal, there are four major things needed, which are:

  • The serialized model and model artifacts: The saved trained model and the other standard objects used during data transformation. Upon deployment, these are stored in a Google Cloud Storage bucket so the main script can access them for test-data preparation and prediction.
  • The main script (main.py): The script in which the prediction function is written, and where all the libraries listed in the requirements file, needed for end-to-end data preparation and prediction, are imported. I’d recommend commenting each line of code so that it’s easier to read.
  • requirements.txt: A simple text file that lists the model’s dependencies with the exact versions used during training. To avoid running into trouble, it’s better to check which versions of the libraries and packages you’ll be using are available on the cloud before developing the model.
scikit-learn==0.22
numpy==1.18.0
pandas==0.25.3
flask==1.1.1
  • The app.yaml file: The file used to configure your App Engine app’s settings. It specifies how URL paths correspond to request handlers and static files, and it also contains information about your app’s code, such as the runtime and the latest version identifier. Any configuration you omit from this file is set to its default. For this simple app, I only need to set the runtime to python37 so that App Engine knows which Docker image will run the app.
runtime: python37

There’s a lot more to writing app.yaml files, which can be found in Google’s official documentation [4].
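Putting the pieces above together, the main script might look like the following sketch. The model loading is stubbed out with a placeholder predictor; in the real app this would be the pipeline deserialized (e.g. with joblib) from the Cloud Storage bucket, and the route matches this project’s /prediction_endpoint.

```python
from flask import Flask, jsonify, request

app = Flask(__name__)


def load_model():
    """Placeholder loader; the deployed app would fetch and joblib.load()
    the serialized pipeline from the Cloud Storage bucket instead."""
    class _Stub:
        def predict(self, rows):
            return [0 for _ in rows]  # dummy "did not survive" predictions
    return _Stub()


model = load_model()


@app.route("/prediction_endpoint", methods=["POST"])
def prediction_endpoint():
    # Expect a JSON body like {"instances": [[age, fare, pclass], ...]}.
    body = request.get_json(force=True)
    rows = body.get("instances", [])
    preds = model.predict(rows)
    return jsonify({"predictions": list(preds)})


if __name__ == "__main__":
    # App Engine routes traffic to port 8080 by default.
    app.run(host="0.0.0.0", port=8080)
```

The JSON payload shape (`instances`) is an assumption for illustration; the actual field names depend on how the project’s script reads the batch data.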

For easy deployment of this app, we need to have the project on a version control platform such as Bitbucket, GitHub, etc. You can find the project repository on GitHub.

Steps to deploying the model on Google’s App Engine

  • Create a project on Google Cloud Platform
  • Select the project and create an app using App Engine

Set up the application by choosing the permanent region where you want Google to manage your app. After this step, select the programming language used to write the app.

  • Either download the Cloud SDK to deploy from your local machine, or activate Cloud Shell in the console. For this demo, I’m using Cloud Shell. Once the shell is activated, ensure that the Cloud Platform project is set to the intended project ID.
  • Clone your GitHub project repo on the engine by running git clone <link to clone your repository>
  • Change to the project directory containing the files to be uploaded to App Engine by running cd <cloned project folder>. You can list directories by running ls.
  • Initialize gcloud in the project directory by running gcloud init. This will trigger some questions on the configuration of the Google Cloud SDK, which are pretty straightforward and can be easily answered.
  • The last step is to deploy the app by running the command gcloud app deploy. It will take some time to upload files, install app dependencies, and deploy the app.
  • Once the upload is done, you can run gcloud app browse to open the app in the browser, or copy the app URL manually if the browser isn’t detected.

Note: You need to add the API endpoint to the URL if you’re using a custom prediction routine. For this project, Flask was the choice of web framework and the endpoint was declared as /prediction_endpoint.

Test app with Postman

Since we didn’t build a web interface for the project, we need an HTTP client to test the app; we’ll use Postman because we’re predicting in batches, based on how the dataset is read on the backend. Below is the response from the app after sending an HTTP request to get predictions for the uploaded test data.
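The same batch request Postman sends can be reproduced from Python. This is a sketch: the app URL is a placeholder for your deployed App Engine address, and the `instances` payload shape is an assumed JSON format, not necessarily the one the project’s backend expects.

```python
import json
from urllib import request as urlrequest

# Placeholder: replace with your deployed App Engine URL plus the endpoint.
APP_URL = "https://YOUR_PROJECT_ID.appspot.com/prediction_endpoint"


def build_batch_payload(rows):
    """Serialize a batch of feature rows into a JSON request body."""
    return json.dumps({"instances": rows}).encode("utf-8")


def send_batch(rows):
    """POST a batch of rows to the deployed app and return its JSON reply."""
    req = urlrequest.Request(
        APP_URL,
        data=build_batch_payload(rows),
        headers={"Content-Type": "application/json"},
    )
    with urlrequest.urlopen(req) as resp:  # network call; needs a live app
        return json.loads(resp.read())


if __name__ == "__main__":
    print(send_batch([[22, 7.25, 3], [38, 71.28, 1]]))
```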


Wrapping up

The deployment of ML models can take different forms depending on the dataset, target platform for deployment, how end-users will utilize it, and many other factors.

In order to have a smooth ride when deploying models, do a thorough review of all units that make up the ML system architecture you’re trying to implement.

Connect with me on Twitter and LinkedIn

Check out the project’s GitHub repository and remember to star it.

Cheers

References

Future Work

  • Deployment of custom machine learning pipeline on GCP via App engine and Docker

Editor’s Note: Heartbeat is a contributor-driven online publication and community dedicated to providing premier educational resources for data science, machine learning, and deep learning practitioners. We’re committed to supporting and inspiring developers and engineers from all walks of life.

Editorially independent, Heartbeat is sponsored and published by Comet, an MLOps platform that enables data scientists & ML teams to track, compare, explain, & optimize their experiments. We pay our contributors, and we don’t sell ads.

If you’d like to contribute, head on over to our call for contributors. You can also sign up to receive our weekly newsletters (Deep Learning Weekly and the Comet Newsletter), join us on Slack, and follow Comet on Twitter and LinkedIn for resources, events, and much more that will help you build better ML models, faster.
