
Google Cloud BigQuery Component

BigQuery is Google Cloud's fully managed, petabyte-scale, and cost-effective analytics data warehouse that lets you run analytics over vast amounts of data in near real time.

Component key: google-cloud-bigquery

Description

Google Cloud BigQuery is Google Cloud's fully managed, petabyte-scale, cost-effective analytics data warehouse that lets you run analytics over vast amounts of data in near real time.

The Push Notifications service lets you receive notifications that an order has been created. This is called "push" because Google pushes notifications to you about events, such as orders, that happen on the Google side.

  • The Content API's pubsubnotificationsettings.update receives the request and sends you back a cloudTopicName.

  • To configure additional topics:

    • In the Google Cloud console, open the navigation menu and go to the Pub/Sub page (Navigation Menu > More Products > Analytics > Pub/Sub).
    • On the Topics page, click Create Topic.
      • In the window that opens, enter MyTopic in the Topic ID field.
      • Leave the default values for the remaining options, and then click Create.
      • You will see the success message: A new topic and a new subscription have been successfully created.
      • You have now created a topic called MyTopic and an associated default subscription, MyTopic-sub.
  • You create a subscription for the topic and register the URL push endpoint with Cloud Pub/Sub.

  • To configure a subscription, go to Pub/Sub > Subscriptions.

    • In the Subscriptions page, click Create subscription.

    • Enter MySub in the Subscription ID field.

    • For Select a Cloud Pub/Sub topic, select the MyTopic topic from the drop-down menu.

    • Leave the default values for the remaining options.

    • Click Create.

      • You see the success message: Subscription successfully added.
    • Go back to the Topics page and click MyTopic.

      • The MySub subscription is now attached to the topic MyTopic. Pub/Sub delivers all messages sent to MyTopic to both the MySub and MyTopic-sub subscriptions.

  • Cloud Pub/Sub accepts your subscription and associates that cloudTopicName with your URL. When messages are published to that cloudTopicName (for example, your order notifications), they will be sent to your URL push endpoint.
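The console steps above correspond to two Pub/Sub REST calls. A minimal sketch of those calls (the project ID and push endpoint here are placeholders for your own values):

```python
# Sketch: the Pub/Sub REST calls behind the console steps above.
# PROJECT_ID and PUSH_ENDPOINT are placeholders for your own values.
PROJECT_ID = "my-project"
PUSH_ENDPOINT = "https://example.com/push-handler"

topic = f"projects/{PROJECT_ID}/topics/MyTopic"
subscription = f"projects/{PROJECT_ID}/subscriptions/MySub"

# PUT https://pubsub.googleapis.com/v1/{topic} creates the topic (empty body).
create_topic_url = f"https://pubsub.googleapis.com/v1/{topic}"

# PUT https://pubsub.googleapis.com/v1/{subscription} creates the push
# subscription; the body names the topic and the push endpoint URL.
create_sub_url = f"https://pubsub.googleapis.com/v1/{subscription}"
create_sub_body = {
    "topic": topic,
    "pushConfig": {"pushEndpoint": PUSH_ENDPOINT},
}
```

Once the push subscription exists, Pub/Sub delivers each message published to MyTopic to your endpoint as an HTTP POST.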

Request

PUT https://shoppingcontent.googleapis.com/content/v2.1/merchantId/pubsubnotificationsettings
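A sketch of what that PUT request looks like when assembled. The merchant ID is a placeholder, and ORDER_STATE_CHANGE is the Content API event type used for order notifications:

```python
# Sketch of the pubsubnotificationsettings update call shown above.
# MERCHANT_ID is a placeholder for your Merchant Center account ID.
MERCHANT_ID = "123456"

url = (
    "https://shoppingcontent.googleapis.com/content/v2.1/"
    f"{MERCHANT_ID}/pubsubnotificationsettings"
)
body = {
    "kind": "content#pubsubNotificationSettings",
    # Subscribe to order-related events.
    "registeredEvents": ["ORDER_STATE_CHANGE"],
}
# Send with e.g.: requests.put(url, json=body, headers={"Authorization": ...})
# The response includes the cloudTopicName that Google assigned to you.
```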

Connections

Google Cloud BigQuery Private Key

OAuth2

The Google BigQuery component authenticates requests through the Google Cloud Platform (GCP) OAuth 2.0 service. You'll need to create a GCP OAuth 2.0 app so your integration can authenticate and perform Google BigQuery tasks on your customers' behalf.

To create a GCP OAuth 2.0 app, first make sure you have a Google Developer account - you can sign up at https://console.cloud.google.com/. Then:

  1. Open up the Google API Console
  2. Click CREATE PROJECT if you would like to create a new GCP project, or select an existing project.
  3. You will be prompted to enable Google BigQuery for your project. Click ENABLE.
  4. On the sidebar, select Credentials.
  5. An OAuth 2.0 app includes a "Consent Screen" (the page that asks "Do you want to allow (Your Company) to access Google BigQuery on your behalf?"). Click CONFIGURE CONSENT SCREEN.
    1. Your app will be externally available to your customers, so choose a User Type of External.
    2. Fill out the OAuth consent screen with an app name (your company or product's name), support email, app logo, domain, etc.
    3. You can ignore domains for now.
    4. On the next page, ignore scopes - this component knows what scopes it needs to run and will request the right scopes for you.
    5. Enter some test users for your testing purposes. Your app will only work for those test users until it is "verified" by Google. When you are ready for verification (they verify your privacy policy statement, etc.), click PUBLISH APP on the OAuth consent screen. That allows your customers to authorize your integration to access their Google BigQuery data.
  6. Once your "Consent Screen" is configured open the Credentials page from the sidebar again.
  7. Click +CREATE CREDENTIALS and select OAuth client ID.
    1. Under Application type select Web application.
    2. Under Authorized redirect URIs enter Prismatic's OAuth 2.0 callback URL: https://oauth2.prismatic.io/callback
    3. Click CREATE.
  8. Take note of the Client ID and Client Secret that are generated.

INFO Make sure to publish your OAuth 2.0 app after you've tested it so users outside of your test users can authorize your integration to interact with Google BigQuery on their behalf.

Now that you have a Client ID and Client Secret, add a Google BigQuery step to your integration in Prismatic. Open the Configuration Wizard in the Designer by clicking Configuration Wizard, select your Google BigQuery Connection, and enter your client ID and secret. You will probably want to keep the default Google BigQuery scopes:

  • https://www.googleapis.com/auth/bigquery: View and manage your data in Google BigQuery and see the email address for your Google Account
  • https://www.googleapis.com/auth/bigquery.insertdata: Insert data into Google BigQuery
  • https://www.googleapis.com/auth/cloud-platform: See, edit, configure, and delete your Google Cloud data and see the email address for your Google Account
  • https://www.googleapis.com/auth/cloud-platform.read-only: View your data across Google Cloud services and see the email address of your Google Account
  • https://www.googleapis.com/auth/devstorage.full_control: Manage your data and permissions in Cloud Storage and see the email address for your Google Account
  • https://www.googleapis.com/auth/devstorage.read_only: View your data in Google Cloud Storage
  • https://www.googleapis.com/auth/devstorage.read_write: Manage your data in Cloud Storage and see the email address of your Google Account
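For reference, a sketch of the authorization URL your users are sent to during the OAuth 2.0 flow, using the default BigQuery scope. The client ID below is a placeholder for the value generated in the steps above:

```python
from urllib.parse import urlencode

# Sketch: building the Google OAuth 2.0 authorization URL.
# CLIENT_ID is a placeholder for your generated client ID.
CLIENT_ID = "1234.apps.googleusercontent.com"
REDIRECT_URI = "https://oauth2.prismatic.io/callback"

params = {
    "client_id": CLIENT_ID,
    "redirect_uri": REDIRECT_URI,
    "response_type": "code",
    "scope": "https://www.googleapis.com/auth/bigquery",
    "access_type": "offline",  # request a refresh token as well
}
auth_url = "https://accounts.google.com/o/oauth2/v2/auth?" + urlencode(params)
```

In practice Prismatic assembles this URL for you from the Connection's client ID, secret, and scopes; the sketch only shows what is being requested.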

Triggers

PubSub Notification

PubSub Notification Trigger Settings | key: myTrigger


Data Sources

Fetch Projects Names

Fetch an array of project names | key: projectsNames | type: picklist

Data Source Payload

{
  "result": [
    {
      "label": "John Locke",
      "key": "650"
    },
    {
      "label": "John Doe",
      "key": "47012"
    }
  ]
}

Fetch Tables Names

Fetch an array of table names | key: tablesNames | type: picklist

Data Source Payload

{
  "result": [
    {
      "label": "John Locke",
      "key": "650"
    },
    {
      "label": "John Doe",
      "key": "47012"
    }
  ]
}

Actions

Cancel Job

Requests that a job be cancelled. | key: cancelJob


Create Dataset

Creates a new empty dataset. | key: createDataset


Create Job

Starts a new asynchronous job. | key: createJob


Create Routine

Creates a new routine in the dataset. | key: createRoutine


Create Table

Creates a new, empty table in the dataset. | key: createTable


Delete Dataset

Deletes the dataset specified by the datasetId value. Before you can delete a dataset, you must delete all its tables, either manually or by specifying deleteContents. Immediately after deletion, you can create another dataset with the same name. | key: deleteDataset
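The deleteContents option maps to a query parameter on the underlying REST call. A sketch, with placeholder project and dataset IDs:

```python
# Sketch: the REST form of Delete Dataset. Setting deleteContents=true
# removes the dataset even if it still contains tables.
PROJECT_ID = "my-project"
DATASET_ID = "my_dataset"

delete_url = (
    "https://bigquery.googleapis.com/bigquery/v2/"
    f"projects/{PROJECT_ID}/datasets/{DATASET_ID}?deleteContents=true"
)
# Send with e.g.: requests.delete(delete_url, headers={"Authorization": ...})
```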


Delete Job

Requests the deletion of the metadata of a job. | key: deleteJob


Delete Model

Deletes the model specified by model ID from the dataset. | key: deleteModel


Delete Routine

Deletes the routine specified by routine ID from the dataset. | key: deleteRoutine


Delete Table

Deletes the table specified by table ID from the dataset. | key: deleteTable


Get Dataset

Returns the dataset specified by datasetID. | key: getDataset


Get Job

Returns information about a specific job. | key: getJob


Get Model

Gets the specified model resource by model ID. | key: getModel


Get Policy

Gets the access control policy for a resource. | key: getPolicy


Get Query Job Results

Retrieves the results of a query job. | key: getQueryJobResult


Get Routine

Gets the specified routine resource by routine ID. | key: getRoutine


Get Service Account

Returns the service account for a project, which is used for interactions with Google Cloud KMS. | key: getServiceAccount


Get Table

Gets the specified table resource by table ID. | key: getTable


List Datasets

Lists all datasets in the specified project to which the user has been granted the READER dataset role. | key: listDatasets


List Jobs

Lists all jobs that you started in the specified project. | key: listJobs


List Models

Lists all models in the specified dataset. Requires the READER dataset role. After retrieving the list of models, you can get information about a particular model by calling the models.get method. | key: listModels


List Projects

Lists projects to which the user has been granted any project role. | key: listProjects


List Routines

Lists all routines in the specified dataset. | key: listRoutines


List Table Data

Lists the content of a table in rows. | key: listTableData


List Tables

Lists all tables in the specified dataset. | key: listTables


Patch Table

Patches information in an existing table. | key: patchTable


Query Job

Runs a BigQuery SQL query synchronously and returns query results if the query completes within a specified timeout. | key: queryJob
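A sketch of the jobs.query REST request that a synchronous query corresponds to. The project, table name, and timeout value are placeholders:

```python
# Sketch: the jobs.query REST request behind a synchronous query.
# PROJECT_ID and the table reference are placeholders.
PROJECT_ID = "my-project"

query_url = (
    "https://bigquery.googleapis.com/bigquery/v2/"
    f"projects/{PROJECT_ID}/queries"
)
query_body = {
    "query": "SELECT name FROM `my-project.my_dataset.my_table` LIMIT 10",
    "useLegacySql": False,  # use GoogleSQL rather than legacy SQL
    "timeoutMs": 10000,     # return within 10 s, or poll the job afterwards
}
# POST query_body to query_url; if the query does not finish within
# timeoutMs, the response's jobComplete field is false and you can fetch
# the results later (see Get Query Job Results).
```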


Raw Request

Sends a raw HTTP request to Google Cloud BigQuery. | key: rawRequest


Set Policy

Sets the access control policy on the specified resource. | key: setPolicy


Table Data Insert All

Streams data into BigQuery one record at a time without needing to run a load job. | key: tableDataInsertAll
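A sketch of the tabledata.insertAll request body used for streaming inserts. The row contents are placeholders; insertId is optional and lets BigQuery de-duplicate rows that are retried:

```python
# Sketch: a tabledata.insertAll request body for streaming inserts.
# Each row carries its JSON payload under "json"; "insertId" is an
# optional de-duplication key for retried sends.
insert_all_body = {
    "kind": "bigquery#tableDataInsertAllRequest",
    "rows": [
        {"insertId": "row-1", "json": {"name": "John Locke", "id": 650}},
        {"insertId": "row-2", "json": {"name": "John Doe", "id": 47012}},
    ],
}
```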


Update Dataset

Updates information in an existing dataset. The update method replaces the entire dataset resource, whereas the patch method only replaces fields that are provided in the submitted dataset resource. | key: updateDataset


Update Model

Patches specific fields in the specified model. | key: updateModel


Update Routine

Updates information in an existing routine. | key: updateRoutine


Update Table

Updates information in an existing table. | key: updateTable