Host and schedule a Python script that calls a saved TensorFlow model on Google Cloud

I have a Python script that I want to run on a schedule within Google Cloud. This script loads a custom TensorFlow model, feeds it new data (collected from an API), produces predictions, does some analysis on the predictions, and saves the results to Firestore. For this to run I have a list of dependencies, a model file, two pickle files, a custom loss function, and other custom functions.

I have a Python application that is hosted on Firebase through Cloud Run. This application reads the data that the aforementioned script saves to Firestore and displays it to the user.

I want to schedule this Python script to run every day within Google Cloud and save its predictions to my Firestore.
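For the "every day" part, I was looking at Cloud Scheduler, which can hit an HTTP endpoint on a cron schedule. Something like the following is what I had in mind (the job name, region, and service URL are placeholders):

```shell
# Hypothetical Cloud Scheduler job: POST to a service endpoint at 06:00 daily.
gcloud scheduler jobs create http daily-predictions \
  --location=us-central1 \
  --schedule="0 6 * * *" \
  --uri="https://my-service-placeholder-uc.a.run.app/run" \
  --http-method=POST
```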

I have looked into creating Cloud Functions, but that does not allow for the upload of my model file or either pickle file.

I have also looked into using Compute Engine and creating an instance that runs the script on a cron job, but I run into an issue when installing TensorFlow on the virtual machine.

Because I am not (at this point) training my model with Google Cloud's AI services, I would like to find a way to do this without using them.

Is it possible to do this through Cloud Run with a Docker image, similar to how I set up the previously described Python application?
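What I picture is a minimal HTTP entrypoint in the container that Cloud Scheduler could call, which then runs the prediction job. A standard-library-only sketch (the `run_daily_job` body is a placeholder for the real script):

```python
# Minimal Cloud Run-style service: listen on $PORT and run the job on POST.
import os
from http.server import BaseHTTPRequestHandler, HTTPServer


def run_daily_job():
    # Placeholder: load the model, fetch new API data, predict,
    # analyze, and write the results to Firestore.
    return "ok"


class Handler(BaseHTTPRequestHandler):
    def do_POST(self):
        # Cloud Scheduler sends an HTTP POST on the cron schedule.
        body = run_daily_job().encode()
        self.send_response(200)
        self.send_header("Content-Length", str(len(body)))
        self.end_headers()
        self.wfile.write(body)


def main():
    # Cloud Run tells the container which port to listen on via PORT.
    port = int(os.environ.get("PORT", "8080"))
    HTTPServer(("0.0.0.0", port), Handler).serve_forever()


if __name__ == "__main__":
    main()
```

The Dockerfile would then just install the dependencies, copy in the model and pickle files, and run this module, the same way the existing application is containerized.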

Any suggestions about how to accomplish this would be much appreciated!
