Testing Google Cloud Functions Locally

Google Cloud Functions is a serverless compute platform that allows you to run code in response to events without provisioning or managing servers.
In this first part of a two-part tutorial, we’ll explore how to test Google Cloud Functions locally. In the second part, we’ll cover how to deploy them from GitHub.
Prerequisites
To follow along, you'll need:
- A Google Cloud account
- A Service Account with permissions to deploy Google Cloud Functions
- Some code you want to run on Cloud Functions
For this example, I'll use the following sample code:
```python
import requests
import pandas_gbq
import pandas as pd
from google.oauth2 import service_account


def get_currency():
    """
    Get currency exchange rates from the API and call the save_to_bigquery
    function to save the data to BigQuery.
    Returns a status message.
    """
    url = "https://open.er-api.com/v6/latest/USD"
    response = requests.get(url=url)
    data = response.json()
    df = pd.DataFrame(list(data['rates'].items()), columns=["currency", "exchange_rate"])
    if not df.empty:
        save_to_bigquery(df)
        return "Process completed!"
    else:
        return "No data found."


def save_to_bigquery(dataframe):
    """
    Save the data to BigQuery.
    Required: service_account.json file with the credentials.
    """
    project_id = "erthal-blog-de-projects"
    dataset_table = "financial.currency"
    credentials = service_account.Credentials.from_service_account_file("service_account.json")
    pandas_gbq.to_gbq(dataframe, dataset_table, project_id, credentials=credentials, if_exists='replace')
```
You can use any code you like—this is just a simple example. The script above fetches currency exchange rates (USD to other currencies) from a free API and stores the data in a BigQuery dataset.
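If you'd like to sanity-check the script on its own before wiring it into the Functions Framework, one option is a small guard at the bottom of the file. This is just a sketch; it assumes service_account.json sits in the working directory:

```python
# Quick standalone check: run the pipeline once and print the status message.
# Assumes service_account.json is available in the working directory.
if __name__ == "__main__":
    print(get_currency())
```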
Functions Framework
To test the function locally, we can use Google's Functions Framework. This library lets us run a local development server and trigger the function via HTTP requests.
Install it via pip:

```bash
pip install functions-framework
```
Setting up the HTTP Server
To run the function locally, we need to:
- Import the library
- Define an entry point for the function
Importing the library
```python
import functions_framework
```

Note that although the package is installed as functions-framework, the module is imported with an underscore.
Defining the entry point
The `@functions_framework.http` decorator marks the function as the HTTP entry point, i.e., the place where the framework starts handling incoming requests.
```python
@functions_framework.http
def get_currency(request):
    """
    Get currency exchange rates from the API and call the save_to_bigquery
    function to save the data to BigQuery.
    Returns a status message.
    """
    url = "https://open.er-api.com/v6/latest/USD"
    response = requests.get(url=url)
    data = response.json()
    df = pd.DataFrame(list(data['rates'].items()), columns=["currency", "exchange_rate"])
    if not df.empty:
        save_to_bigquery(df)
        return "Process completed!"
    else:
        return "No data found."
```
Note: I added the request parameter in `get_currency(request)`, because HTTP-triggered Cloud Functions receive the incoming request as an argument.
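Under the hood, the Functions Framework hands your function a standard flask.Request, so the usual Flask accessors work on it. As a rough sketch of what that enables, here's a hypothetical variant that reads an optional base query parameter (base is not part of the original function):

```python
import functions_framework
import requests


@functions_framework.http
def get_currency(request):
    # request is a flask.Request; "base" is a hypothetical query parameter
    # added for illustration, e.g. curl "localhost:8080?base=EUR"
    base = request.args.get("base", "USD")
    response = requests.get(f"https://open.er-api.com/v6/latest/{base}")
    rates = response.json().get("rates", {})
    return {"base": base, "rates_fetched": len(rates)}
```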
Running the Function Locally
Start the local server by running:
```bash
functions-framework --target=get_currency --port=YOUR_PORT
```

This command starts an HTTP server on the specified port (default: 8080). The `--target` flag must match the name of your entry-point function, here `get_currency`. If the command runs without errors, the server is up and waiting for requests.
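Two other flags can be handy here (a sketch, assuming a recent version of the framework): --source points the server at a specific file instead of the default main.py, and --debug enables Flask's debug mode with auto-reload. The filename currency.py below is hypothetical:

```bash
# Serve get_currency from currency.py and reload automatically on changes.
functions-framework --source=currency.py --target=get_currency --debug
```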
Stopping the Function
If you need to stop the server (for example, to restart it after changing your code), press Ctrl+C in its terminal, or free the port with the following command on Ubuntu:

```bash
fuser -k YOUR_PORT/tcp
```
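If fuser isn't available (on macOS, for example), looking the process up by port achieves the same thing. A rough equivalent:

```bash
# Find the PID listening on the port and kill it; replace 8080 with your port.
lsof -t -i :8080 | xargs kill
```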
Testing the Function
To send a request to your local function, open another terminal (or use Postman) and run:
```bash
curl localhost:YOUR_PORT -X POST -H "Content-Type: application/json"
```
If everything is set up correctly, the function should return a successful response (HTTP 200) and process the request. You can monitor the logs in the terminal where the function is running.
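If you prefer to drive the test from Python rather than curl, a minimal sketch using the requests library (assuming the server is on port 8080):

```python
import requests

# Send the same request as the curl example above; adjust the port if needed.
resp = requests.post(
    "http://localhost:8080",
    headers={"Content-Type": "application/json"},
)
print(resp.status_code)  # expect 200
print(resp.text)         # "Process completed!" on success
```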
I hope this guide helps you test Google Cloud Functions locally!