Google Cloud Dataflow Template Pipelines

These Dataflow templates are an effort to solve simple, but large, in-Cloud data tasks, including data import/export/backup/restore and bulk API operations, without a development environment. The technology under the hood which makes these operations possible is the Google Cloud Dataflow service combined with a set of Apache Beam SDK templated pipelines.

Google provides this collection of pre-implemented Dataflow templates as a reference and as a starting point for developers who want to customize or extend their functionality.


Template Pipelines

Some templates support user-defined functions (UDFs); see Using UDFs below.

For documentation on each template's usage and parameters, please see the official docs.

Getting Started

Requirements

  • Java 8
  • Maven 3

Building the Project

Build the entire project using the maven compile command.

mvn clean compile

Creating a Template File

Dataflow templates can be created using a Maven command which builds the project and stages the template file on Google Cloud Storage. Any parameters passed at template build time cannot be overridden at execution time.

mvn compile exec:java \
-Dexec.mainClass=com.google.cloud.teleport.templates.<template-class> \
-Dexec.cleanupDaemonThreads=false \
-Dexec.args=" \
--project=<project-id> \
--stagingLocation=gs://<bucket-name>/staging \
--tempLocation=gs://<bucket-name>/temp \
--templateLocation=gs://<bucket-name>/templates/<template-name>.json \
--runner=DataflowRunner"
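
For example, assuming a project named my-project, a bucket named my-bucket, and the PubSubToBigQuery template class (all names here are illustrative), staging the template might look like:

mvn compile exec:java \
-Dexec.mainClass=com.google.cloud.teleport.templates.PubSubToBigQuery \
-Dexec.cleanupDaemonThreads=false \
-Dexec.args=" \
--project=my-project \
--stagingLocation=gs://my-bucket/staging \
--tempLocation=gs://my-bucket/temp \
--templateLocation=gs://my-bucket/templates/PubSubToBigQuery.json \
--runner=DataflowRunner"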

Executing a Template File

Once the template is staged on Google Cloud Storage, it can be executed using the gcloud CLI tool. The runtime parameters required by the template are passed in the parameters field as a comma-separated list of paramName=value pairs.

gcloud dataflow jobs run <job-name> \
--gcs-location=<template-location> \
--zone=<zone> \
--parameters <parameters>
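
As a minimal sketch, assuming the PubSubToBigQuery template staged above and hypothetical resource names, a run might look like the following (the required parameter names differ per template; see the official docs):

gcloud dataflow jobs run my-pubsub-to-bq-job \
--gcs-location=gs://my-bucket/templates/PubSubToBigQuery.json \
--zone=us-central1-f \
--parameters inputTopic=projects/my-project/topics/my-topic,outputTableSpec=my-project:my_dataset.my_table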

Using UDFs

User-defined functions (UDFs) allow you to customize a template's functionality by providing a short JavaScript function, without having to maintain the entire codebase. This is useful in situations where you'd like to rename fields, filter values, or even transform data formats before output to the destination. All UDFs are executed by providing the payload of the element as a string to the JavaScript function. You can then use JavaScript's built-in JSON parser or other system functions to transform the data prior to the pipeline's output. The return statement of a UDF specifies the payload to pass forward in the pipeline, and it should always return a string value. If no value is returned or the function returns undefined, the incoming record will be filtered out of the output.

UDF Function Specification

| Template              | UDF Input Type | Input Description                                | UDF Output Type | Output Description                                                                                      |
|-----------------------|----------------|--------------------------------------------------|-----------------|---------------------------------------------------------------------------------------------------------|
| Datastore Bulk Delete | String         | A JSON string of the entity                      | String          | A JSON string of the entity to delete; filter entities by returning undefined                            |
| Datastore to Pub/Sub  | String         | A JSON string of the entity                      | String          | The payload to publish to Pub/Sub                                                                        |
| Datastore to GCS Text | String         | A JSON string of the entity                      | String          | A single-line within the output file                                                                     |
| GCS Text to BigQuery  | String         | A single-line within the input file              | String          | A JSON string which matches the destination table's schema                                               |
| Pub/Sub to BigQuery   | String         | A string representation of the incoming payload  | String          | A JSON string which matches the destination table's schema                                               |
| Pub/Sub to Datastore  | String         | A string representation of the incoming payload  | String          | A JSON string of the entity to write to Datastore                                                        |
| Pub/Sub to Splunk     | String         | A string representation of the incoming payload  | String          | The event data to be sent to the Splunk HEC events endpoint. Must be a string or a stringified JSON object |

UDF Examples

Adding fields

/**
 * A transform which adds a field to the incoming data.
 * @param {string} inJson
 * @return {string} outJson
 */
function transform(inJson) {
  var obj = JSON.parse(inJson);
  obj.dataFeed = "Real-time Transactions";
  obj.dataSource = "POS";
  return JSON.stringify(obj);
}

Filtering records

/**
 * A transform function which only accepts 42 as the answer to life.
 * @param {string} inJson
 * @return {string} outJson
 */
function transform(inJson) {
  var obj = JSON.parse(inJson);
  // only output objects which have an answer to life of 42.
  if (obj.hasOwnProperty('answerToLife') && obj.answerToLife === 42) {
    return JSON.stringify(obj);
  }
}
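
Transforming data formats

The examples above add and filter fields; a UDF can also reshape the payload entirely. The following is a minimal sketch for the GCS Text to BigQuery case, assuming a hypothetical three-column CSV input whose fields map to name, age, and city columns in the destination table:

/**
 * A transform which converts a single CSV line into a JSON string.
 * The column order and field names below are hypothetical and must
 * match the destination table's schema.
 * @param {string} inCsv a single line within the input file
 * @return {string} outJson
 */
function transform(inCsv) {
  var values = inCsv.split(',');
  var obj = {
    name: values[0],
    age: values[1],
    city: values[2]
  };
  return JSON.stringify(obj);
}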
