
Pipelines

A pipeline defines a set of runnable actions, composed from datacomponents, that complete a set of tasks - for example, ELT. Pipelines are run as jobs, either manually or on a predetermined schedule. Only a single pipeline can run at any given time.
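The schedule format used below is a six-field cron expression (seconds first). As a stdlib-only sketch of how the documented example `0 0 9-17 * * MON-FRI` gates job launches - the `matches_schedule` helper is illustrative, not part of the Matatika API:

```python
from datetime import datetime

def matches_schedule(dt: datetime) -> bool:
    """True when dt satisfies "0 0 9-17 * * MON-FRI":
    on the hour, between 09:00 and 17:00, Monday to Friday."""
    return (
        dt.second == 0
        and dt.minute == 0
        and 9 <= dt.hour <= 17
        and dt.weekday() < 5  # Monday=0 ... Friday=4
    )

print(matches_schedule(datetime(2024, 7, 15, 10, 0)))  # a Monday at 10:00
print(matches_schedule(datetime(2024, 7, 13, 10, 0)))  # a Saturday at 10:00
```

A real scheduler would evaluate the full cron grammar; this only hard-codes the one expression to show what "on the hour, nine-to-five, weekdays" means in practice.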


Objects

Pipeline

| Path | Type | Format | Description |
| --- | --- | --- | --- |
| `id` | String | Version 4 UUID | The pipeline ID |
| `status` | String | Pipeline Status | The pipeline status |
| `name` | String | | The pipeline name |
| `schedule` | String | Cron | The interval at which to launch a new job - e.g. `0 0 9-17 * * MON-FRI` launches a job on the hour, nine-to-five, on weekdays |
| `timeout` | Integer | Unsigned | The number of seconds after which the job will terminate - if set to `0`, an implicit default of 300 seconds is used |
| `maxRetries` | Integer | Unsigned | The maximum number of retries to attempt for a job ending with `ERROR` |
| `script` | String | Bash script | A custom script to execute during a job |
| `created` | String | ISO 8601 timestamp | When the pipeline was created |
| `lastModified` | String | ISO 8601 timestamp | When the pipeline was last modified |
| `properties` | Properties | | The pipeline properties, defined by the dataplugin settings of each datacomponent |
| `dataComponents` | Array of String | Datacomponent names | The pipeline datacomponent names - create or update with the dataplugin `fullyQualifiedName` |
| `actions` | Array of String | Datacomponent names or commands | The pipeline actions to run during a job |
| `triggeredBy` | Array of String | Pipeline names or workspace task identifiers | Pipelines or workspace tasks that will trigger the pipeline on successful completion - workspace task values are case-insensitive |
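As a sketch of how these fields fit together in a create or update request body, the hypothetical helper below assembles a payload - the field names come from the table above, while the helper itself and the sample values are illustrative:

```python
import json

def build_pipeline_payload(name, schedule=None, timeout=0, max_retries=0,
                           data_components=(), properties=None):
    """Assemble a pipeline request body from the documented fields."""
    payload = {
        "name": name,
        "timeout": timeout,          # 0 means the implicit 300-second default applies
        "maxRetries": max_retries,   # retries attempted for jobs ending with ERROR
        "dataComponents": list(data_components),
        "properties": properties or {},
    }
    if schedule:
        payload["schedule"] = schedule
    return payload

payload = build_pipeline_payload(
    name="nightly-elt",
    schedule="0 0 9-17 * * MON-FRI",
    data_components=["tap-google-analytics", "Warehouse", "dbt"],
    properties={"tap-google-analytics.view_id": "1234567890"},
)
print(json.dumps(payload, indent=2))
```

Note how `properties` keys are namespaced by datacomponent name (`tap-google-analytics.view_id`), matching the `properties` object in the example response below.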
{
  "id" : "37da82fd-da13-4df1-acd1-eb47fb8cc074",
  "status" : "READY",
  "name" : "SIT-generated pipeline [2024-07-15T10:10:49.295607] (updated)",
  "timeout" : 0,
  "maxRetries" : 0,
  "created" : "2024-07-15T09:10:49.554622",
  "lastModified" : "2024-07-15T09:10:49.554622",
  "properties" : {
    "tap-google-analytics.oauth_credentials.client_secret" : "client_secret",
    "tap-google-analytics.view_id" : "1234567890",
    "tap-google-analytics.reports" : "reports",
    "tap-google-analytics.oauth_credentials.refresh_token" : "refresh_token",
    "tap-google-analytics.start_date" : "2024-06-15T10:09:16.254923+01:00",
    "tap-google-analytics.end_date" : "2024-07-15T10:09:16.255042+01:00",
    "tap-google-analytics.oauth_credentials.access_token" : "access_token",
    "tap-google-analytics.oauth_credentials.client_id" : "client_id"
  },
  "dataComponents" : [ "tap-google-analytics", "Warehouse", "dbt" ],
  "actions" : [ ],
  "triggeredBy" : [ ],
  "_embedded" : {
    "dataComponents" : [ {
      "id" : "985215c6-87c5-4633-bd45-7cc236a92f87",
      "created" : "2024-07-15T09:09:16.453622",
      "lastModified" : "2024-07-15T09:09:16.453623",
      "name" : "tap-google-analytics",
      "properties" : { },
      "commands" : { },
      "dataPlugin" : "extractors/tap-google-analytics--matatika",
      "_embedded" : {
        "dataplugin" : {
          "id" : "51bb57f1-9ccc-40ac-b14d-de574c855a63",
          "repositoryPath" : "plugins/extractors/tap-google-analytics--matatika.lock",
          "pluginType" : "EXTRACTOR",
          "name" : "tap-google-analytics",
          "namespace" : "tap_google_analytics",
          "variant" : "matatika",
          "label" : "Google Analytics",
          "description" : "Google Analytics is a web analytics service that provides insights into website traffic and user behavior.\n\nGoogle Analytics allows website owners to track and analyze various metrics related to their website's performance, such as the number of visitors, pageviews, bounce rate, and average session duration. It also provides information on the demographics and interests of website visitors, as well as the sources of traffic, including organic search, paid search, social media, and referrals. This data can be used to optimize website content and marketing strategies, as well as to measure the effectiveness of advertising campaigns. Additionally, Google Analytics offers advanced features such as goal tracking, e-commerce tracking, and custom reporting, making it a powerful tool for businesses of all sizes.\n### Prerequisites\nTo obtain the OAuth identity provider authorization endpoint used to create and refresh tokens, you need to create a project in the Google API Console and enable the Google Analytics API. Then, you can create OAuth 2.0 credentials and configure the authorized redirect URIs. The authorization endpoint will be provided in the credentials.\n\nThe OAuth scopes you need to request access to depend on the specific data you want to access in Google Analytics. For example, if you want to read data from a specific view, you will need to request the \"https://www.googleapis.com/auth/analytics.readonly\" scope. You can find a list of available scopes in the Google Analytics API documentation.\n\nTo obtain the Access Token and OAuth Refresh Token, you need to authenticate the user and obtain their consent to access their Google Analytics data. This can be done using the Google Sign-In API or the OAuth 2.0 authorization flow. 
Once the user has granted access, you will receive an Access Token and a Refresh Token that you can use to make API requests.\n\nTo obtain the View ID, you need to log in to your Google Analytics account and navigate to the Admin section. From there, you can select the account, property, and view that you want to access and find the View ID in the View Settings.",
          "logoUrl" : "https://app.matatika.com/assets/images/datasource/tap-google-analytics.svg",
          "hidden" : false,
          "docs" : "https://www.matatika.com/docs/instant-insights/tap-google-analytics/",
          "pipUrl" : "git+https://github.com/Matatika/[email protected]",
          "repo" : "https://github.com/Matatika/tap-google-analytics",
          "capabilities" : [ "CATALOG", "DISCOVER", "STATE" ],
          "select" : [ ],
          "update" : { },
          "vars" : { },
          "settings" : [ {
            "name" : "oauth_credentials.authorization_url",
            "aliases" : [ ],
            "label" : "OAuth identity provider authorization endpoint used create and refresh tokens",
            "value" : "https://oauth2.googleapis.com/token",
            "kind" : "HIDDEN",
            "description" : "The endpoint used to create and refresh OAuth tokens.",
            "required" : "true",
            "protected" : false
          }, {
            "name" : "oauth_credentials.scope",
            "aliases" : [ ],
            "label" : "OAuth scopes we need to request access to",
            "value" : "profile email https://www.googleapis.com/auth/analytics.readonly",
            "kind" : "HIDDEN",
            "description" : "The specific scopes we need to request access to in order to connect to Google Analytics.",
            "required" : "true",
            "protected" : false
          }, {
            "name" : "oauth_credentials.access_token",
            "aliases" : [ ],
            "label" : "Access Token",
            "kind" : "HIDDEN",
            "description" : "The token used to authenticate and authorize API requests.",
            "required" : "true",
            "protected" : false
          }, {
            "name" : "oauth_credentials.refresh_token",
            "aliases" : [ ],
            "label" : "OAuth Refresh Token",
            "kind" : "HIDDEN",
            "description" : "The token used to refresh the access token when it expires.",
            "required" : "true",
            "protected" : false
          }, {
            "name" : "oauth_credentials.refresh_proxy_url",
            "aliases" : [ ],
            "label" : "Optional - will be called with 'oauth_credentials.refresh_token' to refresh the access token",
            "kind" : "HIDDEN",
            "description" : "An optional function that will be called to refresh the access token using the refresh token.",
            "protected" : false
          }, {
            "name" : "oauth_credentials.refresh_proxy_url_auth",
            "aliases" : [ ],
            "label" : "Optional - Sets Authorization header on 'oauth_credentials.refresh_url' request",
            "kind" : "HIDDEN",
            "description" : "An optional setting that sets the Authorization header on the refresh URL request.",
            "protected" : false
          }, {
            "name" : "oauth_credentials.client_id",
            "aliases" : [ ],
            "label" : "Optional - OAuth Client ID used if refresh_proxy_url not supplied",
            "kind" : "HIDDEN",
            "description" : "An optional OAuth Client ID used if the refresh proxy URL is not supplied.",
            "protected" : false
          }, {
            "name" : "oauth_credentials.client_secret",
            "aliases" : [ ],
            "label" : "Optional - OAuth Client Secret used if refresh_proxy_url not supplied",
            "kind" : "HIDDEN",
            "description" : "An optional OAuth Client Secret used if the refresh proxy URL is not supplied.",
            "protected" : false
          }, {
            "name" : "view_id",
            "aliases" : [ ],
            "label" : "View ID",
            "placeholder" : "Ex. 198343027",
            "kind" : "STRING",
            "description" : "The ID of the Google Analytics view to retrieve data from.",
            "required" : "true",
            "protected" : false
          }, {
            "name" : "reports",
            "aliases" : [ ],
            "label" : "Reports",
            "placeholder" : "Ex. my_report_definition.json",
            "kind" : "STRING",
            "description" : "The specific reports to retrieve data from in the Google Analytics view.",
            "protected" : false
          }, {
            "name" : "start_date",
            "aliases" : [ ],
            "label" : "Start date",
            "kind" : "DATE_ISO8601",
            "description" : "The start date for the date range of data to retrieve.",
            "protected" : false
          }, {
            "name" : "end_date",
            "aliases" : [ ],
            "label" : "End date",
            "kind" : "DATE_ISO8601",
            "description" : "The end date for the date range of data to retrieve.",
            "protected" : false
          } ],
          "variants" : [ ],
          "commands" : { },
          "matatikaHidden" : false,
          "requires" : [ ],
          "fullDescription" : "Google Analytics is a web analytics service that provides insights into website traffic and user behavior.\n\nGoogle Analytics allows website owners to track and analyze various metrics related to their website's performance, such as the number of visitors, pageviews, bounce rate, and average session duration. It also provides information on the demographics and interests of website visitors, as well as the sources of traffic, including organic search, paid search, social media, and referrals. This data can be used to optimize website content and marketing strategies, as well as to measure the effectiveness of advertising campaigns. Additionally, Google Analytics offers advanced features such as goal tracking, e-commerce tracking, and custom reporting, making it a powerful tool for businesses of all sizes.\n### Prerequisites\nTo obtain the OAuth identity provider authorization endpoint used to create and refresh tokens, you need to create a project in the Google API Console and enable the Google Analytics API. Then, you can create OAuth 2.0 credentials and configure the authorized redirect URIs. The authorization endpoint will be provided in the credentials.\n\nThe OAuth scopes you need to request access to depend on the specific data you want to access in Google Analytics. For example, if you want to read data from a specific view, you will need to request the \"https://www.googleapis.com/auth/analytics.readonly\" scope. You can find a list of available scopes in the Google Analytics API documentation.\n\nTo obtain the Access Token and OAuth Refresh Token, you need to authenticate the user and obtain their consent to access their Google Analytics data. This can be done using the Google Sign-In API or the OAuth 2.0 authorization flow. 
Once the user has granted access, you will receive an Access Token and a Refresh Token that you can use to make API requests.\n\nTo obtain the View ID, you need to log in to your Google Analytics account and navigate to the Admin section. From there, you can select the account, property, and view that you want to access and find the View ID in the View Settings.\n\n## Settings\n\n\n### View ID\n\nThe ID of the Google Analytics view to retrieve data from.\n\n### Reports\n\nThe specific reports to retrieve data from in the Google Analytics view.\n\n### Start date\n\nThe start date for the date range of data to retrieve.\n\n### End date\n\nThe end date for the date range of data to retrieve.",
          "_links" : {
            "self" : {
              "href" : "https://app.matatika.com/api/dataplugins/51bb57f1-9ccc-40ac-b14d-de574c855a63"
            },
            "update dataplugin" : {
              "href" : "https://app.matatika.com/api/workspaces/d9fda316-5510-46ee-8293-a24ba4230457/dataplugins/51bb57f1-9ccc-40ac-b14d-de574c855a63",
              "type" : "PUT"
            }
          }
        }
      },
      "draft" : true,
      "managed" : false,
      "_links" : {
        "self" : {
          "href" : "https://app.matatika.com/api/datacomponents/985215c6-87c5-4633-bd45-7cc236a92f87"
        },
        "update datacomponent" : {
          "href" : "https://app.matatika.com/api/datacomponents/985215c6-87c5-4633-bd45-7cc236a92f87"
        },
        "delete datacomponent" : {
          "href" : "https://app.matatika.com/api/datacomponents/985215c6-87c5-4633-bd45-7cc236a92f87"
        }
      }
    }, {
      "id" : "c8b03642-3482-45f9-8bbd-9d0c53d468ed",
      "created" : "2024-07-15T09:08:49.599991",
      "lastModified" : "2024-07-15T09:11:20.958587",
      "name" : "Warehouse",
      "properties" : {
        "password" : "4SllWL17_E43GjJq5_GhDX041h",
        "default_target_schema" : "analytics",
        "dbname" : "fruzutu",
        "port" : "5432",
        "host" : "sharp-banana.postgres.database.azure.com",
        "user" : "fruzutu"
      },
      "commands" : { },
      "dataPlugin" : "loaders/target-postgres--matatika",
      "_embedded" : {
        "dataplugin" : {
          "id" : "9da27ab6-0f9a-478a-bc82-d119a43d6777",
          "repositoryPath" : "plugins/loaders/target-postgres--matatika.lock",
          "pluginType" : "LOADER",
          "name" : "target-postgres",
          "namespace" : "postgres_transferwise",
          "variant" : "matatika",
          "label" : "Postgres Warehouse",
          "description" : "Postgres Warehouse is a data warehousing solution built on top of the Postgres database management system.\n\nPostgres Warehouse is designed to handle large volumes of data and complex queries, making it an ideal solution for businesses that need to store and analyze large amounts of data. It provides a number of features that are specifically tailored to data warehousing, such as columnar storage, parallel processing, and support for advanced analytics. Additionally, Postgres Warehouse is highly scalable, allowing businesses to easily add more resources as their data needs grow. Overall, Postgres Warehouse is a powerful and flexible data warehousing solution that can help businesses make better decisions by providing them with the insights they need to succeed.\n### Prerequisites\nThe process of obtaining the required settings for connecting to a Postgres Warehouse may vary depending on the specific setup and configuration of the database. However, here are some general ways to obtain each of the required settings:\n\n- User: The user is typically created when the database is set up. You can ask the database administrator or check the database documentation to find out the username.\n- Password: The password is also typically created when the database is set up. You can ask the database administrator or check the database documentation to find out the password.\n- Host: The host is the server where the database is located. You can ask the database administrator or check the database documentation to find out the host name or IP address.\n- Port: The port is the number that the database listens on for incoming connections. The default port for Postgres is 5432, but it may be different depending on the configuration. You can ask the database administrator or check the database documentation to find out the port number.\n- Database Name: The database name is the name of the specific database you want to connect to. 
You can ask the database administrator or check the database documentation to find out the database name.\n- Default Target Schema: The default target schema is the schema that you want to use as the default when connecting to the database. This may be set up by the database administrator or you may need to create it yourself. You can ask the database administrator or check the database documentation to find out the default target schema.",
          "logoUrl" : "https://app.matatika.com/assets/logos/loaders/postgres.png",
          "hidden" : false,
          "docs" : "https://www.matatika.com/data-details/target-postgres/",
          "pipUrl" : "git+https://github.com/Matatika/[email protected]",
          "repo" : "git+https://github.com/Matatika/[email protected]",
          "capabilities" : [ ],
          "select" : [ ],
          "update" : { },
          "vars" : { },
          "settings" : [ {
            "name" : "user",
            "aliases" : [ "username" ],
            "label" : "User",
            "kind" : "STRING",
            "description" : "The username used to connect to the Postgres Warehouse.",
            "required" : "true",
            "protected" : false
          }, {
            "name" : "password",
            "aliases" : [ ],
            "label" : "Password",
            "kind" : "PASSWORD",
            "description" : "The password used to authenticate the user.",
            "required" : "true",
            "protected" : false
          }, {
            "name" : "host",
            "aliases" : [ "address" ],
            "label" : "Host",
            "kind" : "STRING",
            "description" : "The hostname or IP address of the Postgres Warehouse server.",
            "required" : "true",
            "protected" : false
          }, {
            "name" : "port",
            "aliases" : [ ],
            "label" : "Port",
            "value" : "5432",
            "kind" : "INTEGER",
            "description" : "The port number used to connect to the Postgres Warehouse server.",
            "required" : "true",
            "protected" : false
          }, {
            "name" : "dbname",
            "aliases" : [ "database" ],
            "label" : "Database Name",
            "kind" : "STRING",
            "description" : "The name of the database to connect to.",
            "required" : "true",
            "protected" : false
          }, {
            "name" : "default_target_schema",
            "aliases" : [ ],
            "label" : "Default Target Schema",
            "value" : "analytics",
            "kind" : "STRING",
            "description" : "The default schema to use when writing data to the Postgres Warehouse.",
            "required" : "true",
            "protected" : false
          }, {
            "name" : "ssl",
            "aliases" : [ ],
            "label" : "SSL",
            "value" : "false",
            "kind" : "HIDDEN",
            "description" : "Whether or not to use SSL encryption when connecting to the Postgres Warehouse.",
            "protected" : false,
            "value_post_processor" : "STRINGIFY"
          }, {
            "name" : "batch_size_rows",
            "aliases" : [ ],
            "label" : "Batch Size Rows",
            "value" : "100000",
            "kind" : "INTEGER",
            "description" : "The number of rows to write to the Postgres Warehouse in each batch.",
            "protected" : false
          }, {
            "name" : "underscore_camel_case_fields",
            "aliases" : [ ],
            "label" : "Underscore Camel Case Fields",
            "value" : "true",
            "kind" : "HIDDEN",
            "description" : "Whether or not to convert field names from camel case to underscore-separated format.",
            "protected" : false
          }, {
            "name" : "flush_all_streams",
            "aliases" : [ ],
            "label" : "Flush All Streams",
            "value" : "false",
            "kind" : "HIDDEN",
            "description" : "Whether or not to flush all streams to the Postgres Warehouse before closing the connection.",
            "protected" : false
          }, {
            "name" : "parallelism",
            "aliases" : [ ],
            "label" : "Parallelism",
            "value" : "0",
            "kind" : "HIDDEN",
            "description" : "The number of threads to use when writing data to the Postgres Warehouse.",
            "protected" : false
          }, {
            "name" : "parallelism_max",
            "aliases" : [ ],
            "label" : "Max Parallelism",
            "value" : "16",
            "kind" : "HIDDEN",
            "description" : "The maximum number of threads to use when writing data to the Postgres Warehouse.",
            "protected" : false
          }, {
            "name" : "default_target_schema_select_permission",
            "aliases" : [ ],
            "label" : "Default Target Schema Select Permission",
            "kind" : "HIDDEN",
            "description" : "The permission level required to select data from the default target schema.",
            "protected" : false
          }, {
            "name" : "schema_mapping",
            "aliases" : [ ],
            "label" : "Schema Mapping",
            "kind" : "HIDDEN",
            "description" : "A mapping of source schema names to target schema names.",
            "protected" : false
          }, {
            "name" : "add_metadata_columns",
            "aliases" : [ ],
            "label" : "Add Metadata Columns",
            "value" : "true",
            "kind" : "HIDDEN",
            "description" : "Whether or not to add metadata columns to the target table.",
            "protected" : false
          }, {
            "name" : "hard_delete",
            "aliases" : [ ],
            "label" : "Hard Delete",
            "value" : "false",
            "kind" : "HIDDEN",
            "description" : "Whether or not to perform hard deletes when deleting data from the Postgres Warehouse.",
            "protected" : false
          }, {
            "name" : "data_flattening_max_level",
            "aliases" : [ ],
            "label" : "Data Flattening Max Level",
            "value" : "10",
            "kind" : "HIDDEN",
            "description" : "The maximum level of nested data structures to flatten when writing data to the Postgres Warehouse.",
            "protected" : false
          }, {
            "name" : "primary_key_required",
            "aliases" : [ ],
            "label" : "Primary Key Required",
            "value" : "false",
            "kind" : "BOOLEAN",
            "description" : "Whether or not a primary key is required for the target table.",
            "protected" : false
          }, {
            "name" : "validate_records",
            "aliases" : [ ],
            "label" : "Validate Records",
            "value" : "false",
            "kind" : "BOOLEAN",
            "description" : "Whether or not to validate records before writing them to the Postgres Warehouse.",
            "protected" : false
          }, {
            "name" : "temp_dir",
            "aliases" : [ ],
            "label" : "Temporary Directory",
            "kind" : "HIDDEN",
            "description" : "The directory to use for temporary files when writing data to the Postgres Warehouse.",
            "protected" : false
          } ],
          "variants" : [ ],
          "commands" : { },
          "matatikaHidden" : false,
          "requires" : [ ],
          "fullDescription" : "Postgres Warehouse is a data warehousing solution built on top of the Postgres database management system.\n\nPostgres Warehouse is designed to handle large volumes of data and complex queries, making it an ideal solution for businesses that need to store and analyze large amounts of data. It provides a number of features that are specifically tailored to data warehousing, such as columnar storage, parallel processing, and support for advanced analytics. Additionally, Postgres Warehouse is highly scalable, allowing businesses to easily add more resources as their data needs grow. Overall, Postgres Warehouse is a powerful and flexible data warehousing solution that can help businesses make better decisions by providing them with the insights they need to succeed.\n### Prerequisites\nThe process of obtaining the required settings for connecting to a Postgres Warehouse may vary depending on the specific setup and configuration of the database. However, here are some general ways to obtain each of the required settings:\n\n- User: The user is typically created when the database is set up. You can ask the database administrator or check the database documentation to find out the username.\n- Password: The password is also typically created when the database is set up. You can ask the database administrator or check the database documentation to find out the password.\n- Host: The host is the server where the database is located. You can ask the database administrator or check the database documentation to find out the host name or IP address.\n- Port: The port is the number that the database listens on for incoming connections. The default port for Postgres is 5432, but it may be different depending on the configuration. You can ask the database administrator or check the database documentation to find out the port number.\n- Database Name: The database name is the name of the specific database you want to connect to. 
You can ask the database administrator or check the database documentation to find out the database name.\n- Default Target Schema: The default target schema is the schema that you want to use as the default when connecting to the database. This may be set up by the database administrator or you may need to create it yourself. You can ask the database administrator or check the database documentation to find out the default target schema.\n\n## Settings\n\n\n### User\n\nThe username used to connect to the Postgres Warehouse.\n\n### Password\n\nThe password used to authenticate the user.\n\n### Host\n\nThe hostname or IP address of the Postgres Warehouse server.\n\n### Port\n\nThe port number used to connect to the Postgres Warehouse server.\n\n### Database Name\n\nThe name of the database to connect to.\n\n### Default Target Schema\n\nThe default schema to use when writing data to the Postgres Warehouse.\n\n### Batch Size Rows\n\nThe number of rows to write to the Postgres Warehouse in each batch.\n\n### Primary Key Required\n\nWhether or not a primary key is required for the target table.\n\n### Validate Records\n\nWhether or not to validate records before writing them to the Postgres Warehouse.",
          "_links" : {
            "self" : {
              "href" : "https://app.matatika.com/api/dataplugins/9da27ab6-0f9a-478a-bc82-d119a43d6777"
            },
            "update dataplugin" : {
              "href" : "https://app.matatika.com/api/workspaces/d9fda316-5510-46ee-8293-a24ba4230457/dataplugins/9da27ab6-0f9a-478a-bc82-d119a43d6777",
              "type" : "PUT"
            }
          }
        }
      },
      "draft" : false,
      "managed" : true,
      "_links" : {
        "self" : {
          "href" : "https://app.matatika.com/api/datacomponents/c8b03642-3482-45f9-8bbd-9d0c53d468ed"
        },
        "update datacomponent" : {
          "href" : "https://app.matatika.com/api/datacomponents/c8b03642-3482-45f9-8bbd-9d0c53d468ed"
        },
        "delete datacomponent" : {
          "href" : "https://app.matatika.com/api/datacomponents/c8b03642-3482-45f9-8bbd-9d0c53d468ed"
        }
      }
    }, {
      "id" : "3bf944f2-ee7e-4326-a7ce-35df580478f9",
      "created" : "2024-07-15T09:10:47.73595",
      "lastModified" : "2024-07-15T09:10:47.735961",
      "name" : "dbt",
      "properties" : { },
      "commands" : {
        "compile" : {
          "args" : "compile",
          "description" : "Generates executable SQL from source model, test, and analysis files. Compiled SQL files are written to the target/ directory."
        },
        "seed" : {
          "args" : "seed",
          "description" : "Load data from csv files into your data warehouse."
        },
        "test" : {
          "args" : "test",
          "description" : "Runs tests on data in deployed models."
        },
        "docs-generate" : {
          "args" : "docs generate",
          "description" : "Generate documentation artifacts for your project."
        },
        "deps" : {
          "args" : "deps",
          "description" : "Pull the most recent version of the dependencies listed in packages.yml"
        },
        "run" : {
          "args" : "run",
          "description" : "Compile SQL and execute against the current target database."
        },
        "clean" : {
          "args" : "clean",
          "description" : "Delete all folders in the clean-targets list (usually the dbt_modules and target directories.)"
        },
        "snapshot" : {
          "args" : "snapshot",
          "description" : "Execute snapshots defined in your project."
        }
      },
      "dataPlugin" : "transformers/dbt--dbt-labs",
      "_embedded" : {
        "dataplugin" : {
          "id" : "422fa2e4-b419-4c24-af96-d49a7c3a2f35",
          "repositoryPath" : "plugins/transformers/dbt--dbt-labs.lock",
          "pluginType" : "TRANSFORMER",
          "name" : "dbt",
          "namespace" : "dbt",
          "variant" : "dbt-labs",
          "label" : "dbt",
          "description" : " Power your project transformations with dbt™, a SQL-first transformation tool that enables analytics engineers to develop transformations with code.\n\n***Version Control and CI/CD***\n\nUse Matatika to deploy and promote changes between dev, UAT, and production environments.\n\n***Test and Document***\n\nUse Matatika to develop and test every model prior to production release, and share dynamically generated documentation with all stakeholders.\n\n***Develop***\n\nWrite modular data transformations in .sql – Matatika together with dbt handles the chore of dependency management. ",
          "logoUrl" : "https://app.matatika.com/assets/images/transformer/dbt.png",
          "hidden" : false,
          "docs" : "https://www.matatika.com/data-details/dbt/",
          "pipUrl" : "dbt-core~=1.3.0 dbt-postgres~=1.3.0 dbt-snowflake~=1.3.0\n",
          "repo" : "https://github.com/dbt-labs/dbt-core",
          "capabilities" : [ ],
          "select" : [ ],
          "update" : { },
          "vars" : { },
          "settings" : [ {
            "name" : "project_dir",
            "aliases" : [ ],
            "value" : "$MELTANO_PROJECT_ROOT/transform",
            "kind" : "STRING",
            "protected" : false
          }, {
            "name" : "profiles_dir",
            "aliases" : [ ],
            "value" : "$MELTANO_PROJECT_ROOT/transform/profile",
            "kind" : "STRING",
            "env" : "DBT_PROFILES_DIR",
            "protected" : false
          }, {
            "name" : "target",
            "aliases" : [ ],
            "value" : "$MELTANO_LOAD__DIALECT",
            "kind" : "STRING",
            "protected" : false
          }, {
            "name" : "source_schema",
            "aliases" : [ ],
            "value" : "$MELTANO_LOAD__TARGET_SCHEMA",
            "kind" : "STRING",
            "protected" : false
          }, {
            "name" : "target_schema",
            "aliases" : [ ],
            "value" : "analytics",
            "kind" : "STRING",
            "protected" : false
          }, {
            "name" : "models",
            "aliases" : [ ],
            "value" : "$MELTANO_TRANSFORM__PACKAGE_NAME $MELTANO_EXTRACTOR_NAMESPACE my_meltano_project",
            "kind" : "STRING",
            "protected" : false
          } ],
          "variants" : [ ],
          "commands" : {
            "compile" : {
              "args" : "compile",
              "description" : "Generates executable SQL from source model, test, and analysis files. Compiled SQL files are written to the target/ directory."
            },
            "seed" : {
              "args" : "seed",
              "description" : "Load data from csv files into your data warehouse."
            },
            "test" : {
              "args" : "test",
              "description" : "Runs tests on data in deployed models."
            },
            "docs-generate" : {
              "args" : "docs generate",
              "description" : "Generate documentation artifacts for your project."
            },
            "deps" : {
              "args" : "deps",
              "description" : "Pull the most recent version of the dependencies listed in packages.yml"
            },
            "run" : {
              "args" : "run",
              "description" : "Compile SQL and execute against the current target database."
            },
            "clean" : {
              "args" : "clean",
              "description" : "Delete all folders in the clean-targets list (usually the dbt_modules and target directories.)"
            },
            "snapshot" : {
              "args" : "snapshot",
              "description" : "Execute snapshots defined in your project."
            }
          },
          "matatikaHidden" : false,
          "requires" : [ {
            "id" : "732af1ba-ad83-4fd0-ba19-4b07c02e1bba",
            "repositoryPath" : "plugins/files/files-dbt--matatika.lock",
            "pluginType" : "FILE",
            "name" : "files-dbt",
            "namespace" : "dbt",
            "variant" : "matatika",
            "label" : "files-dbt",
            "description" : " Files dbt is a file bundle that automatically configures your project to run transforms with dbt.\nThe bundle includes template project configuration:\n\n- transform/models (directory)\n- transform/profile/profiles.yml\n- transform/dbt_project.yml\n- transform/.gitignore\n",
            "hidden" : false,
            "pipUrl" : "git+https://github.com/Matatika/[email protected]",
            "repo" : "https://github.com/Matatika/files-dbt",
            "capabilities" : [ ],
            "select" : [ ],
            "update" : {
              "transform/profile/profiles.yml" : "true"
            },
            "vars" : { },
            "settings" : [ ],
            "variants" : [ ],
            "commands" : { },
            "matatikaHidden" : false,
            "requires" : [ ],
            "fullDescription" : " Files dbt is a file bundle that automatically configures your project to run transforms with dbt.\nThe bundle includes template project configuration:\n\n- transform/models (directory)\n- transform/profile/profiles.yml\n- transform/dbt_project.yml\n- transform/.gitignore\n"
          } ],
          "fullDescription" : " Power your project transformations with dbt™, a SQL-first transformation tool that enables analytics engineers to develop transformations with code.\n\n***Version Control and CI/CD***\n\nUse Matatika to deploy and promote changes between dev, UAT, and production environments.\n\n***Test and Document***\n\nUse Matatika to develop and test every model prior to production release, and share dynamically generated documentation with all stakeholders.\n\n***Develop***\n\nWrite modular data transformations in .sql – Matatika together with dbt handles the chore of dependency management. ",
          "_links" : {
            "self" : {
              "href" : "https://app.matatika.com/api/dataplugins/422fa2e4-b419-4c24-af96-d49a7c3a2f35"
            },
            "update dataplugin" : {
              "href" : "https://app.matatika.com/api/workspaces/d9fda316-5510-46ee-8293-a24ba4230457/dataplugins/422fa2e4-b419-4c24-af96-d49a7c3a2f35",
              "type" : "PUT"
            }
          }
        }
      },
      "draft" : false,
      "managed" : false,
      "_links" : {
        "self" : {
          "href" : "https://app.matatika.com/api/datacomponents/3bf944f2-ee7e-4326-a7ce-35df580478f9"
        },
        "update datacomponent" : {
          "href" : "https://app.matatika.com/api/datacomponents/3bf944f2-ee7e-4326-a7ce-35df580478f9"
        },
        "delete datacomponent" : {
          "href" : "https://app.matatika.com/api/datacomponents/3bf944f2-ee7e-4326-a7ce-35df580478f9"
        }
      }
    } ],
    "latest job" : {
      "id" : "e6f969cb-8197-4d33-a588-b97ba78d904f",
      "created" : "2024-07-15T09:10:50.289063",
      "type" : "WORKSPACE_CONFIG",
      "maxAttempts" : 0,
      "attempt" : 0,
      "commitId" : "2afd2a5d29ec4ada81ebd49a24ce003acab803f2",
      "exitCode" : 0,
      "status" : "COMPLETE",
      "startTime" : "2024-07-15T09:11:02.609",
      "endTime" : "2024-07-15T09:11:21.687",
      "_embedded" : {
        "pipeline" : {
          "id" : "37da82fd-da13-4df1-acd1-eb47fb8cc074",
          "status" : "READY",
          "name" : "SIT-generated pipeline [2024-07-15T10:10:49.295607] (updated)",
          "timeout" : 0,
          "maxRetries" : 0,
          "created" : "2024-07-15T09:10:49.554622",
          "lastModified" : "2024-07-15T09:10:49.554622",
          "properties" : {
            "tap-google-analytics.oauth_credentials.client_secret" : "client_secret",
            "tap-google-analytics.view_id" : "1234567890",
            "tap-google-analytics.reports" : "reports",
            "tap-google-analytics.oauth_credentials.refresh_token" : "refresh_token",
            "tap-google-analytics.start_date" : "2024-06-15T10:09:16.254923+01:00",
            "tap-google-analytics.end_date" : "2024-07-15T10:09:16.255042+01:00",
            "tap-google-analytics.oauth_credentials.access_token" : "access_token",
            "tap-google-analytics.oauth_credentials.client_id" : "client_id"
          },
          "dataComponents" : [ "tap-google-analytics", "Warehouse", "dbt" ],
          "actions" : [ ],
          "triggeredBy" : [ ]
        },
        "profile" : {
          "id" : "auth0|5eb0327cbfd7490bff55feeb",
          "name" : "[email protected]",
          "handle" : "@sit+prod",
          "email" : "[email protected]"
        }
      },
      "_links" : {
        "self" : {
          "href" : "https://app.matatika.com/api/jobs/e6f969cb-8197-4d33-a588-b97ba78d904f"
        },
        "delete job" : {
          "href" : "https://app.matatika.com/api/jobs/e6f969cb-8197-4d33-a588-b97ba78d904f",
          "type" : "DELETE"
        },
        "logs" : {
          "href" : "https://app.matatika.com/api/jobs/e6f969cb-8197-4d33-a588-b97ba78d904f/logs?sequence=0",
          "type" : "GET"
        }
      }
    }
  },
  "_links" : {
    "update pipeline" : {
      "href" : "https://app.matatika.com/api/workspaces/d9fda316-5510-46ee-8293-a24ba4230457/pipelines/37da82fd-da13-4df1-acd1-eb47fb8cc074",
      "type" : "PUT"
    },
    "delete pipeline" : {
      "href" : "https://app.matatika.com/api/pipelines/37da82fd-da13-4df1-acd1-eb47fb8cc074",
      "type" : "DELETE"
    },
    "draft pipeline" : {
      "href" : "https://app.matatika.com/api/workspaces/d9fda316-5510-46ee-8293-a24ba4230457/pipelines/37da82fd-da13-4df1-acd1-eb47fb8cc074/draft",
      "type" : "PUT"
    },
    "self" : {
      "href" : "https://app.matatika.com/api/pipelines/37da82fd-da13-4df1-acd1-eb47fb8cc074"
    },
    "environment" : {
      "href" : "https://app.matatika.com/api/pipelines/37da82fd-da13-4df1-acd1-eb47fb8cc074/environment"
    },
    "jobs" : {
      "href" : "https://app.matatika.com/api/pipelines/37da82fd-da13-4df1-acd1-eb47fb8cc074/jobs",
      "type" : "GET"
    },
    "metrics" : {
      "href" : "https://app.matatika.com/api/pipelines/37da82fd-da13-4df1-acd1-eb47fb8cc074/metrics"
    },
    "add subscription" : {
      "href" : "https://app.matatika.com/api/pipelines/37da82fd-da13-4df1-acd1-eb47fb8cc074/subscriptions"
    },
    "verify pipeline" : {
      "href" : "https://app.matatika.com/api/pipelines/37da82fd-da13-4df1-acd1-eb47fb8cc074/verification",
      "type" : "POST"
    },
    "create job" : {
      "href" : "https://app.matatika.com/api/pipelines/37da82fd-da13-4df1-acd1-eb47fb8cc074/jobs",
      "type" : "POST"
    },
    "latest job" : {
      "href" : "https://app.matatika.com/api/jobs/e6f969cb-8197-4d33-a588-b97ba78d904f"
    }
  }
}

Properties

For each setting s in the dataplugin settings of each datacomponent:

Path Type Description
s.name s.kind Refer to s.description
  • Any required setting not satisfied by a datacomponent property must be provided as a pipeline property
  • Any setting already satisfied by a datacomponent property can be overridden by a pipeline property
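The precedence rules above can be sketched as a simple map merge, where pipeline properties win over datacomponent properties. This is an illustrative sketch only; the keys follow the `tap-google-analytics.*` naming shown in the example payloads, and the values are hypothetical.

```python
# Illustrative only: how pipeline properties satisfy and override
# datacomponent properties (values below are hypothetical).

datacomponent_properties = {
    # already satisfied by the datacomponent
    "tap-google-analytics.view_id": "1234567890",
}

pipeline_properties = {
    # overrides the datacomponent value
    "tap-google-analytics.view_id": "0987654321",
    # satisfies a required setting the datacomponent did not provide
    "tap-google-analytics.start_date": "2024-06-15T00:00:00Z",
}

# Pipeline properties take precedence over datacomponent properties
effective = {**datacomponent_properties, **pipeline_properties}

print(effective["tap-google-analytics.view_id"])  # 0987654321
```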

Formats

Pipeline Status

Value Description
READY The pipeline completed processing resource changes
PROVISIONING The pipeline is processing resource changes
FAILED The pipeline failed to process resource changes

Requests

View all pipelines in a workspace

GET

/api/workspaces/{workspace-id}/pipelines

Returns all configured pipelines in the workspace {workspace-id}.

Prerequisites

  • Workspace {workspace-id} must exist

Request

Example Snippets

cURL

curl -H "Authorization: Bearer $ACCESS_TOKEN" 'https://app.matatika.com/api/workspaces/d9fda316-5510-46ee-8293-a24ba4230457/pipelines' -i -X GET \
    -H 'Accept: application/json, application/javascript, text/javascript, text/json' \
    -H 'Content-Type: application/json'

Python (requests)

import requests

url = "https://app.matatika.com/api/workspaces/d9fda316-5510-46ee-8293-a24ba4230457/pipelines"

headers = {
  'Authorization': f"Bearer {ACCESS_TOKEN}"
}

response = requests.get(url, headers=headers)

print(response.text)
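On success, the response body embeds the pipelines under `_embedded.pipelines` (see the 200 OK example below). A minimal parsing sketch, using a trimmed stand-in for `response.json()` rather than a live request:

```python
# Sketch: extract pipeline names and statuses from the HAL collection.
# `body` is a trimmed stand-in for response.json().
body = {
    "_embedded": {
        "pipelines": [
            {"name": "Model lineage", "status": "READY"},
        ]
    }
}

# .get() with defaults guards against an empty workspace,
# where _embedded may be absent entirely.
pipelines = body.get("_embedded", {}).get("pipelines", [])
names = [(p["name"], p["status"]) for p in pipelines]

print(names)  # [('Model lineage', 'READY')]
```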

Response

200 OK

Pipeline collection with HAL links.

{
  "_embedded" : {
    "pipelines" : [ {
      "id" : "1c8022b3-3e71-4ba4-b897-4e187d6660cf",
      "status" : "READY",
      "name" : "Model lineage",
      "timeout" : 0,
      "maxRetries" : 0,
      "created" : "2024-07-15T09:10:48.217238",
      "lastModified" : "2024-07-15T09:10:48.217254",
      "properties" : { },
      "dataComponents" : [ "dbt", "dbt-artifacts", "matatika" ],
      "actions" : [ "dbt:deps", "dbt:docs-generate", "dbt-artifacts:convert-matatika", "matatika:publish" ],
      "triggeredBy" : [ ],
      "repositoryPath" : "pipelines/Model lineage.yml",
      "_embedded" : {
        "dataComponents" : [ {
          "id" : "3bf944f2-ee7e-4326-a7ce-35df580478f9",
          "created" : "2024-07-15T09:10:47.73595",
          "lastModified" : "2024-07-15T09:10:47.735961",
          "name" : "dbt",
          "properties" : { },
          "commands" : {
            "compile" : {
              "args" : "compile",
              "description" : "Generates executable SQL from source model, test, and analysis files. Compiled SQL files are written to the target/ directory."
            },
            "seed" : {
              "args" : "seed",
              "description" : "Load data from csv files into your data warehouse."
            },
            "test" : {
              "args" : "test",
              "description" : "Runs tests on data in deployed models."
            },
            "docs-generate" : {
              "args" : "docs generate",
              "description" : "Generate documentation artifacts for your project."
            },
            "deps" : {
              "args" : "deps",
              "description" : "Pull the most recent version of the dependencies listed in packages.yml"
            },
            "run" : {
              "args" : "run",
              "description" : "Compile SQL and execute against the current target database."
            },
            "clean" : {
              "args" : "clean",
              "description" : "Delete all folders in the clean-targets list (usually the dbt_modules and target directories.)"
            },
            "snapshot" : {
              "args" : "snapshot",
              "description" : "Execute snapshots defined in your project."
            }
          },
          "dataPlugin" : "transformers/dbt--dbt-labs",
          "_embedded" : {
            "dataplugin" : {
              "id" : "422fa2e4-b419-4c24-af96-d49a7c3a2f35",
              "repositoryPath" : "plugins/transformers/dbt--dbt-labs.lock",
              "pluginType" : "TRANSFORMER",
              "name" : "dbt",
              "namespace" : "dbt",
              "variant" : "dbt-labs",
              "label" : "dbt",
              "description" : " Power your project transformations with dbt™, a SQL-first transformation tool that enables analytics engineers to develop transformations with code.\n\n***Version Control and CI/CD***\n\nUse Matatika to deploy and promote changes between dev, UAT, and production environments.\n\n***Test and Document***\n\nUse Matatika to develop and test every model prior to production release, and share dynamically generated documentation with all stakeholders.\n\n***Develop***\n\nWrite modular data transformations in .sql – Matatika together with dbt handles the chore of dependency management. ",
              "logoUrl" : "https://app.matatika.com/assets/images/transformer/dbt.png",
              "hidden" : false,
              "docs" : "https://www.matatika.com/data-details/dbt/",
              "pipUrl" : "dbt-core~=1.3.0 dbt-postgres~=1.3.0 dbt-snowflake~=1.3.0\n",
              "repo" : "https://github.com/dbt-labs/dbt-core",
              "capabilities" : [ ],
              "select" : [ ],
              "update" : { },
              "vars" : { },
              "settings" : [ {
                "name" : "project_dir",
                "aliases" : [ ],
                "value" : "$MELTANO_PROJECT_ROOT/transform",
                "kind" : "STRING",
                "protected" : false
              }, {
                "name" : "profiles_dir",
                "aliases" : [ ],
                "value" : "$MELTANO_PROJECT_ROOT/transform/profile",
                "kind" : "STRING",
                "env" : "DBT_PROFILES_DIR",
                "protected" : false
              }, {
                "name" : "target",
                "aliases" : [ ],
                "value" : "$MELTANO_LOAD__DIALECT",
                "kind" : "STRING",
                "protected" : false
              }, {
                "name" : "source_schema",
                "aliases" : [ ],
                "value" : "$MELTANO_LOAD__TARGET_SCHEMA",
                "kind" : "STRING",
                "protected" : false
              }, {
                "name" : "target_schema",
                "aliases" : [ ],
                "value" : "analytics",
                "kind" : "STRING",
                "protected" : false
              }, {
                "name" : "models",
                "aliases" : [ ],
                "value" : "$MELTANO_TRANSFORM__PACKAGE_NAME $MELTANO_EXTRACTOR_NAMESPACE my_meltano_project",
                "kind" : "STRING",
                "protected" : false
              } ],
              "variants" : [ ],
              "commands" : {
                "compile" : {
                  "args" : "compile",
                  "description" : "Generates executable SQL from source model, test, and analysis files. Compiled SQL files are written to the target/ directory."
                },
                "seed" : {
                  "args" : "seed",
                  "description" : "Load data from csv files into your data warehouse."
                },
                "test" : {
                  "args" : "test",
                  "description" : "Runs tests on data in deployed models."
                },
                "docs-generate" : {
                  "args" : "docs generate",
                  "description" : "Generate documentation artifacts for your project."
                },
                "deps" : {
                  "args" : "deps",
                  "description" : "Pull the most recent version of the dependencies listed in packages.yml"
                },
                "run" : {
                  "args" : "run",
                  "description" : "Compile SQL and execute against the current target database."
                },
                "clean" : {
                  "args" : "clean",
                  "description" : "Delete all folders in the clean-targets list (usually the dbt_modules and target directories.)"
                },
                "snapshot" : {
                  "args" : "snapshot",
                  "description" : "Execute snapshots defined in your project."
                }
              },
              "matatikaHidden" : false,
              "requires" : [ {
                "id" : "732af1ba-ad83-4fd0-ba19-4b07c02e1bba",
                "repositoryPath" : "plugins/files/files-dbt--matatika.lock",
                "pluginType" : "FILE",
                "name" : "files-dbt",
                "namespace" : "dbt",
                "variant" : "matatika",
                "label" : "files-dbt",
                "description" : " Files dbt is a file bundle that automatically configures your project to run transforms with dbt.\nThe bundle includes template project configuration:\n\n- transform/models (directory)\n- transform/profile/profiles.yml\n- transform/dbt_project.yml\n- transform/.gitignore\n",
                "hidden" : false,
                "pipUrl" : "git+https://github.com/Matatika/[email protected]",
                "repo" : "https://github.com/Matatika/files-dbt",
                "capabilities" : [ ],
                "select" : [ ],
                "update" : {
                  "transform/profile/profiles.yml" : "true"
                },
                "vars" : { },
                "settings" : [ ],
                "variants" : [ ],
                "commands" : { },
                "matatikaHidden" : false,
                "requires" : [ ],
                "fullDescription" : " Files dbt is a file bundle that automatically configures your project to run transforms with dbt.\nThe bundle includes template project configuration:\n\n- transform/models (directory)\n- transform/profile/profiles.yml\n- transform/dbt_project.yml\n- transform/.gitignore\n"
              } ],
              "fullDescription" : " Power your project transformations with dbt™, a SQL-first transformation tool that enables analytics engineers to develop transformations with code.\n\n***Version Control and CI/CD***\n\nUse Matatika to deploy and promote changes between dev, UAT, and production environments.\n\n***Test and Document***\n\nUse Matatika to develop and test every model prior to production release, and share dynamically generated documentation with all stakeholders.\n\n***Develop***\n\nWrite modular data transformations in .sql – Matatika together with dbt handles the chore of dependency management. ",
              "_links" : {
                "self" : {
                  "href" : "https://app.matatika.com/api/dataplugins/422fa2e4-b419-4c24-af96-d49a7c3a2f35"
                },
                "update dataplugin" : {
                  "href" : "https://app.matatika.com/api/workspaces/d9fda316-5510-46ee-8293-a24ba4230457/dataplugins/422fa2e4-b419-4c24-af96-d49a7c3a2f35",
                  "type" : "PUT"
                }
              }
            }
          },
          "draft" : false,
          "managed" : false,
          "_links" : {
            "self" : {
              "href" : "https://app.matatika.com/api/datacomponents/3bf944f2-ee7e-4326-a7ce-35df580478f9"
            },
            "update datacomponent" : {
              "href" : "https://app.matatika.com/api/datacomponents/3bf944f2-ee7e-4326-a7ce-35df580478f9"
            },
            "delete datacomponent" : {
              "href" : "https://app.matatika.com/api/datacomponents/3bf944f2-ee7e-4326-a7ce-35df580478f9"
            }
          }
        }, {
          "id" : "9240a991-4c92-4fbd-b3fc-a9b494c35ef4",
          "created" : "2024-07-15T09:10:47.68811",
          "lastModified" : "2024-07-15T09:10:47.688121",
          "name" : "dbt-artifacts",
          "properties" : { },
          "commands" : {
            "convert-matatika" : {
              "args" : "convert --format matatika",
              "description" : "Convert artifacts to Matatika datasets."
            },
            "convert-mermaid" : {
              "args" : "convert --format mermaid",
              "description" : "Convert artifacts to [Mermaid entity relationship diagrams](https://mermaid.js.org/syntax/entityRelationshipDiagram.html)."
            }
          },
          "dataPlugin" : "utilities/dbt-artifacts--matatika",
          "_embedded" : {
            "dataplugin" : {
              "id" : "f830c410-8ac9-4d9e-b38c-ab3d70af776d",
              "repositoryPath" : "plugins/utilities/dbt-artifacts--matatika.lock",
              "pluginType" : "UTILITY",
              "name" : "dbt-artifacts",
              "namespace" : "dbt_artifacts",
              "variant" : "matatika",
              "label" : "dbt Artifacts",
              "description" : "A tool for processing [dbt artifacts](https://docs.getdbt.com/reference/artifacts/dbt-artifacts).",
              "logoUrl" : "https://app.matatika.com/assets/logos/utilities/dbt.png",
              "hidden" : false,
              "docs" : "https://www.matatika.com/data-details/dbt-artifacts/",
              "pipUrl" : "git+https://github.com/Matatika/dbt-artifacts-ext.git",
              "repo" : "https://github.com/Matatika/dbt-artifacts-ext",
              "executable" : "dbt-artifacts_extension",
              "capabilities" : [ ],
              "select" : [ ],
              "update" : { },
              "vars" : { },
              "settings" : [ {
                "name" : "source_dir",
                "aliases" : [ ],
                "label" : "Source directory",
                "value" : ".meltano/transformers/dbt/target",
                "kind" : "STRING",
                "description" : "Source directory for the dbt manifest.json and catalog.json to get your lineage from.",
                "protected" : false
              }, {
                "name" : "output_dir",
                "aliases" : [ ],
                "label" : "Output directory",
                "value" : "output",
                "kind" : "STRING",
                "description" : "Target directory output.",
                "protected" : false
              }, {
                "name" : "resource_types",
                "aliases" : [ ],
                "label" : "Resource types",
                "value" : "[\"source\",\"model\",\"snapshot\"]",
                "kind" : "ARRAY",
                "description" : "Array of which dbt artifact to process. 'source', 'model' and 'snapshot' are frequently used, and you can also use 'all' to get every artifact found.",
                "protected" : false
              }, {
                "name" : "exclude_packages",
                "aliases" : [ ],
                "label" : "Packages to exclude",
                "value" : "[\"elementary\"]",
                "kind" : "ARRAY",
                "description" : "Array of which packages to exclude from model lineage.",
                "protected" : false
              } ],
              "variants" : [ ],
              "commands" : {
                "convert-matatika" : {
                  "args" : "convert --format matatika",
                  "description" : "Convert artifacts to Matatika datasets."
                },
                "convert-mermaid" : {
                  "args" : "convert --format mermaid",
                  "description" : "Convert artifacts to [Mermaid entity relationship diagrams](https://mermaid.js.org/syntax/entityRelationshipDiagram.html)."
                }
              },
              "matatikaHidden" : false,
              "requires" : [ ],
              "fullDescription" : "A tool for processing [dbt artifacts](https://docs.getdbt.com/reference/artifacts/dbt-artifacts).\n\n## Settings\n\n\n### Source directory\n\nSource directory for the dbt manifest.json and catalog.json to get your lineage from.\n\n### Output directory\n\nTarget directory output.\n\n### Resource types\n\nArray of which dbt artifact to process. 'source', 'model' and 'snapshot' are frequently used, and you can also use 'all' to get every artifact found.\n\n### Packages to exclude\n\nArray of which packages to exclude from model lineage.",
              "_links" : {
                "self" : {
                  "href" : "https://app.matatika.com/api/dataplugins/f830c410-8ac9-4d9e-b38c-ab3d70af776d"
                },
                "update dataplugin" : {
                  "href" : "https://app.matatika.com/api/workspaces/d9fda316-5510-46ee-8293-a24ba4230457/dataplugins/f830c410-8ac9-4d9e-b38c-ab3d70af776d",
                  "type" : "PUT"
                }
              }
            }
          },
          "draft" : false,
          "managed" : false,
          "_links" : {
            "self" : {
              "href" : "https://app.matatika.com/api/datacomponents/9240a991-4c92-4fbd-b3fc-a9b494c35ef4"
            },
            "update datacomponent" : {
              "href" : "https://app.matatika.com/api/datacomponents/9240a991-4c92-4fbd-b3fc-a9b494c35ef4"
            },
            "delete datacomponent" : {
              "href" : "https://app.matatika.com/api/datacomponents/9240a991-4c92-4fbd-b3fc-a9b494c35ef4"
            }
          }
        }, {
          "id" : "48946dbe-0d36-4be9-89af-c08d5d269179",
          "created" : "2024-07-15T09:10:47.750044",
          "lastModified" : "2024-07-15T09:10:47.750056",
          "name" : "matatika",
          "properties" : { },
          "commands" : {
            "publish" : {
              "args" : "publish $MATATIKA_DATASET_PATH",
              "description" : "Publish a matatika dataset."
            },
            "schedules" : {
              "args" : "schedules",
              "description" : "Convert Meltano jobs and schedules into Matatika pipeline yamls."
            }
          },
          "dataPlugin" : "utilities/matatika--matatika",
          "_embedded" : {
            "dataplugin" : {
              "id" : "00cb1f0b-2419-409a-bdaa-da2b94815144",
              "repositoryPath" : "plugins/utilities/matatika--matatika.lock",
              "pluginType" : "UTILITY",
              "name" : "matatika",
              "namespace" : "utility_matatika",
              "variant" : "matatika",
              "label" : "Matatika CLI",
              "description" : "Matatika CLI is a command-line interface tool for managing data science projects.\n\nMatatika CLI is a powerful tool that allows data scientists to manage their projects from the command line. With Matatika CLI, users can create and manage projects, upload and download data, and run experiments and analyses. The tool also provides version control and collaboration features, making it easy for teams to work together on projects. Matatika CLI is designed to be flexible and customizable, allowing users to tailor it to their specific needs. Overall, Matatika CLI is a valuable tool for any data scientist looking to streamline their workflow and improve their productivity.\n### Prerequisites\nThe dataset path is the location of the dataset you want to work with in Matatika CLI. To obtain the dataset path, you need to know where the dataset is stored in your Matatika account. You can find the dataset path by navigating to the dataset in the Matatika web interface and copying the path from the address bar of your web browser. Alternatively, you can use the `matatika datasets list` command in the Matatika CLI to list all the datasets in your account and their paths.",
              "logoUrl" : "https://app.matatika.com/assets/images/utility/matatika.png",
              "hidden" : false,
              "docs" : "https://www.matatika.com/data-details/matatika/",
              "pipUrl" : "matatika~=1.16.0",
              "capabilities" : [ ],
              "select" : [ ],
              "update" : { },
              "vars" : { },
              "settings" : [ {
                "name" : "dataset_path",
                "aliases" : [ ],
                "label" : "Dataset Path",
                "value" : "output",
                "kind" : "STRING",
                "description" : "The path to a dataset or directory of datasets to publish to your workspace. This need to be the a Matatika dataset that is present in your workspace's repository.",
                "protected" : false
              }, {
                "name" : "auth_token",
                "aliases" : [ ],
                "label" : "Auth Token",
                "value" : "$AUTH_TOKEN",
                "kind" : "STRING",
                "description" : "A unique token used to authenticate the user and grant access to the Matatika workspace.",
                "env" : "AUTH_TOKEN",
                "protected" : false
              }, {
                "name" : "workspace_id",
                "aliases" : [ ],
                "label" : "Workspace ID",
                "value" : "$WORKSPACE_ID",
                "kind" : "STRING",
                "description" : "The unique identifier for the Matatika workspace.",
                "env" : "WORKSPACE_ID",
                "protected" : false
              }, {
                "name" : "endpoint_url",
                "aliases" : [ ],
                "label" : "Endpoint URL",
                "value" : "$ENDPOINT_URL",
                "kind" : "STRING",
                "description" : "The URL used to connect to the Matatika API.",
                "env" : "ENDPOINT_URL",
                "protected" : false
              } ],
              "variants" : [ ],
              "commands" : {
                "publish" : {
                  "args" : "publish $MATATIKA_DATASET_PATH",
                  "description" : "Publish a matatika dataset."
                },
                "schedules" : {
                  "args" : "schedules",
                  "description" : "Convert Meltano jobs and schedules into Matatika pipeline yamls."
                }
              },
              "matatikaHidden" : false,
              "requires" : [ ],
              "fullDescription" : "Matatika CLI is a command-line interface tool for managing data science projects.\n\nMatatika CLI is a powerful tool that allows data scientists to manage their projects from the command line. With Matatika CLI, users can create and manage projects, upload and download data, and run experiments and analyses. The tool also provides version control and collaboration features, making it easy for teams to work together on projects. Matatika CLI is designed to be flexible and customizable, allowing users to tailor it to their specific needs. Overall, Matatika CLI is a valuable tool for any data scientist looking to streamline their workflow and improve their productivity.\n### Prerequisites\nThe dataset path is the location of the dataset you want to work with in Matatika CLI. To obtain the dataset path, you need to know where the dataset is stored in your Matatika account. You can find the dataset path by navigating to the dataset in the Matatika web interface and copying the path from the address bar of your web browser. Alternatively, you can use the `matatika datasets list` command in the Matatika CLI to list all the datasets in your account and their paths.\n\n## Settings\n\n\n### Dataset Path\n\nThe path to a dataset or directory of datasets to publish to your workspace. This need to be the a Matatika dataset that is present in your workspace's repository.\n\n### Auth Token\n\nA unique token used to authenticate the user and grant access to the Matatika workspace.\n\n### Workspace ID\n\nThe unique identifier for the Matatika workspace.\n\n### Endpoint URL\n\nThe URL used to connect to the Matatika API.",
              "_links" : {
                "self" : {
                  "href" : "https://app.matatika.com/api/dataplugins/00cb1f0b-2419-409a-bdaa-da2b94815144"
                },
                "update dataplugin" : {
                  "href" : "https://app.matatika.com/api/workspaces/d9fda316-5510-46ee-8293-a24ba4230457/dataplugins/00cb1f0b-2419-409a-bdaa-da2b94815144",
                  "type" : "PUT"
                }
              }
            }
          },
          "draft" : false,
          "managed" : false,
          "_links" : {
            "self" : {
              "href" : "https://app.matatika.com/api/datacomponents/48946dbe-0d36-4be9-89af-c08d5d269179"
            },
            "update datacomponent" : {
              "href" : "https://app.matatika.com/api/datacomponents/48946dbe-0d36-4be9-89af-c08d5d269179"
            },
            "delete datacomponent" : {
              "href" : "https://app.matatika.com/api/datacomponents/48946dbe-0d36-4be9-89af-c08d5d269179"
            }
          }
        } ]
      },
      "_links" : {
        "update pipeline" : {
          "href" : "https://app.matatika.com/api/workspaces/d9fda316-5510-46ee-8293-a24ba4230457/pipelines/1c8022b3-3e71-4ba4-b897-4e187d6660cf",
          "type" : "PUT"
        },
        "delete pipeline" : {
          "href" : "https://app.matatika.com/api/pipelines/1c8022b3-3e71-4ba4-b897-4e187d6660cf",
          "type" : "DELETE"
        },
        "draft pipeline" : {
          "href" : "https://app.matatika.com/api/workspaces/d9fda316-5510-46ee-8293-a24ba4230457/pipelines/1c8022b3-3e71-4ba4-b897-4e187d6660cf/draft",
          "type" : "PUT"
        },
        "self" : {
          "href" : "https://app.matatika.com/api/pipelines/1c8022b3-3e71-4ba4-b897-4e187d6660cf"
        },
        "environment" : {
          "href" : "https://app.matatika.com/api/pipelines/1c8022b3-3e71-4ba4-b897-4e187d6660cf/environment"
        },
        "jobs" : {
          "href" : "https://app.matatika.com/api/pipelines/1c8022b3-3e71-4ba4-b897-4e187d6660cf/jobs",
          "type" : "GET"
        },
        "add subscription" : {
          "href" : "https://app.matatika.com/api/pipelines/1c8022b3-3e71-4ba4-b897-4e187d6660cf/subscriptions"
        },
        "verify pipeline" : {
          "href" : "https://app.matatika.com/api/pipelines/1c8022b3-3e71-4ba4-b897-4e187d6660cf/verification",
          "type" : "POST"
        },
        "create job" : {
          "href" : "https://app.matatika.com/api/pipelines/1c8022b3-3e71-4ba4-b897-4e187d6660cf/jobs",
          "type" : "POST"
        }
      }
    }, {
      "id" : "3e11d680-bed2-4900-a5eb-27ec3fd6844e",
      "status" : "DRAFT",
      "name" : "SIT-generated pipeline [2024-07-15T10:09:16.246704]",
      "schedule" : "0 0 0 25 12 ?",
      "timeout" : 0,
      "maxRetries" : 0,
      "created" : "2024-07-15T09:09:16.617741",
      "lastModified" : "2024-07-15T09:09:16.617741",
      "properties" : { },
      "dataComponents" : [ "tap-google-analytics", "Warehouse", "dbt" ],
      "actions" : [ ],
      "triggeredBy" : [ ],
      "_embedded" : {
        "dataComponents" : [ {
          "id" : "985215c6-87c5-4633-bd45-7cc236a92f87",
          "created" : "2024-07-15T09:09:16.453622",
          "lastModified" : "2024-07-15T09:09:16.453623",
          "name" : "tap-google-analytics",
          "properties" : { },
          "commands" : { },
          "dataPlugin" : "extractors/tap-google-analytics--matatika",
          "_embedded" : {
            "dataplugin" : {
              "id" : "51bb57f1-9ccc-40ac-b14d-de574c855a63",
              "repositoryPath" : "plugins/extractors/tap-google-analytics--matatika.lock",
              "pluginType" : "EXTRACTOR",
              "name" : "tap-google-analytics",
              "namespace" : "tap_google_analytics",
              "variant" : "matatika",
              "label" : "Google Analytics",
              "description" : "Google Analytics is a web analytics service that provides insights into website traffic and user behavior.\n\nGoogle Analytics allows website owners to track and analyze various metrics related to their website's performance, such as the number of visitors, pageviews, bounce rate, and average session duration. It also provides information on the demographics and interests of website visitors, as well as the sources of traffic, including organic search, paid search, social media, and referrals. This data can be used to optimize website content and marketing strategies, as well as to measure the effectiveness of advertising campaigns. Additionally, Google Analytics offers advanced features such as goal tracking, e-commerce tracking, and custom reporting, making it a powerful tool for businesses of all sizes.\n### Prerequisites\nTo obtain the OAuth identity provider authorization endpoint used to create and refresh tokens, you need to create a project in the Google API Console and enable the Google Analytics API. Then, you can create OAuth 2.0 credentials and configure the authorized redirect URIs. The authorization endpoint will be provided in the credentials.\n\nThe OAuth scopes you need to request access to depend on the specific data you want to access in Google Analytics. For example, if you want to read data from a specific view, you will need to request the \"https://www.googleapis.com/auth/analytics.readonly\" scope. You can find a list of available scopes in the Google Analytics API documentation.\n\nTo obtain the Access Token and OAuth Refresh Token, you need to authenticate the user and obtain their consent to access their Google Analytics data. This can be done using the Google Sign-In API or the OAuth 2.0 authorization flow. 
Once the user has granted access, you will receive an Access Token and a Refresh Token that you can use to make API requests.\n\nTo obtain the View ID, you need to log in to your Google Analytics account and navigate to the Admin section. From there, you can select the account, property, and view that you want to access and find the View ID in the View Settings.",
              "logoUrl" : "https://app.matatika.com/assets/images/datasource/tap-google-analytics.svg",
              "hidden" : false,
              "docs" : "https://www.matatika.com/docs/instant-insights/tap-google-analytics/",
              "pipUrl" : "git+https://github.com/Matatika/[email protected]",
              "repo" : "https://github.com/Matatika/tap-google-analytics",
              "capabilities" : [ "CATALOG", "DISCOVER", "STATE" ],
              "select" : [ ],
              "update" : { },
              "vars" : { },
              "settings" : [ {
                "name" : "oauth_credentials.authorization_url",
                "aliases" : [ ],
                "label" : "OAuth identity provider authorization endpoint used create and refresh tokens",
                "value" : "https://oauth2.googleapis.com/token",
                "kind" : "HIDDEN",
                "description" : "The endpoint used to create and refresh OAuth tokens.",
                "required" : "true",
                "protected" : false
              }, {
                "name" : "oauth_credentials.scope",
                "aliases" : [ ],
                "label" : "OAuth scopes we need to request access to",
                "value" : "profile email https://www.googleapis.com/auth/analytics.readonly",
                "kind" : "HIDDEN",
                "description" : "The specific scopes we need to request access to in order to connect to Google Analytics.",
                "required" : "true",
                "protected" : false
              }, {
                "name" : "oauth_credentials.access_token",
                "aliases" : [ ],
                "label" : "Access Token",
                "kind" : "HIDDEN",
                "description" : "The token used to authenticate and authorize API requests.",
                "required" : "true",
                "protected" : false
              }, {
                "name" : "oauth_credentials.refresh_token",
                "aliases" : [ ],
                "label" : "OAuth Refresh Token",
                "kind" : "HIDDEN",
                "description" : "The token used to refresh the access token when it expires.",
                "required" : "true",
                "protected" : false
              }, {
                "name" : "oauth_credentials.refresh_proxy_url",
                "aliases" : [ ],
                "label" : "Optional - will be called with 'oauth_credentials.refresh_token' to refresh the access token",
                "kind" : "HIDDEN",
                "description" : "An optional function that will be called to refresh the access token using the refresh token.",
                "protected" : false
              }, {
                "name" : "oauth_credentials.refresh_proxy_url_auth",
                "aliases" : [ ],
                "label" : "Optional - Sets Authorization header on 'oauth_credentials.refresh_url' request",
                "kind" : "HIDDEN",
                "description" : "An optional setting that sets the Authorization header on the refresh URL request.",
                "protected" : false
              }, {
                "name" : "oauth_credentials.client_id",
                "aliases" : [ ],
                "label" : "Optional - OAuth Client ID used if refresh_proxy_url not supplied",
                "kind" : "HIDDEN",
                "description" : "An optional OAuth Client ID used if the refresh proxy URL is not supplied.",
                "protected" : false
              }, {
                "name" : "oauth_credentials.client_secret",
                "aliases" : [ ],
                "label" : "Optional - OAuth Client Secret used if refresh_proxy_url not supplied",
                "kind" : "HIDDEN",
                "description" : "An optional OAuth Client Secret used if the refresh proxy URL is not supplied.",
                "protected" : false
              }, {
                "name" : "view_id",
                "aliases" : [ ],
                "label" : "View ID",
                "placeholder" : "Ex. 198343027",
                "kind" : "STRING",
                "description" : "The ID of the Google Analytics view to retrieve data from.",
                "required" : "true",
                "protected" : false
              }, {
                "name" : "reports",
                "aliases" : [ ],
                "label" : "Reports",
                "placeholder" : "Ex. my_report_definition.json",
                "kind" : "STRING",
                "description" : "The specific reports to retrieve data from in the Google Analytics view.",
                "protected" : false
              }, {
                "name" : "start_date",
                "aliases" : [ ],
                "label" : "Start date",
                "kind" : "DATE_ISO8601",
                "description" : "The start date for the date range of data to retrieve.",
                "protected" : false
              }, {
                "name" : "end_date",
                "aliases" : [ ],
                "label" : "End date",
                "kind" : "DATE_ISO8601",
                "description" : "The end date for the date range of data to retrieve.",
                "protected" : false
              } ],
              "variants" : [ ],
              "commands" : { },
              "matatikaHidden" : false,
              "requires" : [ ],
              "fullDescription" : "Google Analytics is a web analytics service that provides insights into website traffic and user behavior.\n\nGoogle Analytics allows website owners to track and analyze various metrics related to their website's performance, such as the number of visitors, pageviews, bounce rate, and average session duration. It also provides information on the demographics and interests of website visitors, as well as the sources of traffic, including organic search, paid search, social media, and referrals. This data can be used to optimize website content and marketing strategies, as well as to measure the effectiveness of advertising campaigns. Additionally, Google Analytics offers advanced features such as goal tracking, e-commerce tracking, and custom reporting, making it a powerful tool for businesses of all sizes.\n### Prerequisites\nTo obtain the OAuth identity provider authorization endpoint used to create and refresh tokens, you need to create a project in the Google API Console and enable the Google Analytics API. Then, you can create OAuth 2.0 credentials and configure the authorized redirect URIs. The authorization endpoint will be provided in the credentials.\n\nThe OAuth scopes you need to request access to depend on the specific data you want to access in Google Analytics. For example, if you want to read data from a specific view, you will need to request the \"https://www.googleapis.com/auth/analytics.readonly\" scope. You can find a list of available scopes in the Google Analytics API documentation.\n\nTo obtain the Access Token and OAuth Refresh Token, you need to authenticate the user and obtain their consent to access their Google Analytics data. This can be done using the Google Sign-In API or the OAuth 2.0 authorization flow. 
Once the user has granted access, you will receive an Access Token and a Refresh Token that you can use to make API requests.\n\nTo obtain the View ID, you need to log in to your Google Analytics account and navigate to the Admin section. From there, you can select the account, property, and view that you want to access and find the View ID in the View Settings.\n\n## Settings\n\n\n### View ID\n\nThe ID of the Google Analytics view to retrieve data from.\n\n### Reports\n\nThe specific reports to retrieve data from in the Google Analytics view.\n\n### Start date\n\nThe start date for the date range of data to retrieve.\n\n### End date\n\nThe end date for the date range of data to retrieve.",
              "_links" : {
                "self" : {
                  "href" : "https://app.matatika.com/api/dataplugins/51bb57f1-9ccc-40ac-b14d-de574c855a63"
                },
                "update dataplugin" : {
                  "href" : "https://app.matatika.com/api/workspaces/d9fda316-5510-46ee-8293-a24ba4230457/dataplugins/51bb57f1-9ccc-40ac-b14d-de574c855a63",
                  "type" : "PUT"
                }
              }
            }
          },
          "draft" : true,
          "managed" : false,
          "_links" : {
            "self" : {
              "href" : "https://app.matatika.com/api/datacomponents/985215c6-87c5-4633-bd45-7cc236a92f87"
            },
            "update datacomponent" : {
              "href" : "https://app.matatika.com/api/datacomponents/985215c6-87c5-4633-bd45-7cc236a92f87"
            },
            "delete datacomponent" : {
              "href" : "https://app.matatika.com/api/datacomponents/985215c6-87c5-4633-bd45-7cc236a92f87"
            }
          }
        }, {
          "id" : "c8b03642-3482-45f9-8bbd-9d0c53d468ed",
          "created" : "2024-07-15T09:08:49.599991",
          "lastModified" : "2024-07-15T09:11:20.958587",
          "name" : "Warehouse",
          "properties" : {
            "password" : "4SllWL17_E43GjJq5_GhDX041h",
            "default_target_schema" : "analytics",
            "dbname" : "fruzutu",
            "port" : "5432",
            "host" : "sharp-banana.postgres.database.azure.com",
            "user" : "fruzutu"
          },
          "commands" : { },
          "dataPlugin" : "loaders/target-postgres--matatika",
          "_embedded" : {
            "dataplugin" : {
              "id" : "9da27ab6-0f9a-478a-bc82-d119a43d6777",
              "repositoryPath" : "plugins/loaders/target-postgres--matatika.lock",
              "pluginType" : "LOADER",
              "name" : "target-postgres",
              "namespace" : "postgres_transferwise",
              "variant" : "matatika",
              "label" : "Postgres Warehouse",
              "description" : "Postgres Warehouse is a data warehousing solution built on top of the Postgres database management system.\n\nPostgres Warehouse is designed to handle large volumes of data and complex queries, making it an ideal solution for businesses that need to store and analyze large amounts of data. It provides a number of features that are specifically tailored to data warehousing, such as columnar storage, parallel processing, and support for advanced analytics. Additionally, Postgres Warehouse is highly scalable, allowing businesses to easily add more resources as their data needs grow. Overall, Postgres Warehouse is a powerful and flexible data warehousing solution that can help businesses make better decisions by providing them with the insights they need to succeed.\n### Prerequisites\nThe process of obtaining the required settings for connecting to a Postgres Warehouse may vary depending on the specific setup and configuration of the database. However, here are some general ways to obtain each of the required settings:\n\n- User: The user is typically created when the database is set up. You can ask the database administrator or check the database documentation to find out the username.\n- Password: The password is also typically created when the database is set up. You can ask the database administrator or check the database documentation to find out the password.\n- Host: The host is the server where the database is located. You can ask the database administrator or check the database documentation to find out the host name or IP address.\n- Port: The port is the number that the database listens on for incoming connections. The default port for Postgres is 5432, but it may be different depending on the configuration. You can ask the database administrator or check the database documentation to find out the port number.\n- Database Name: The database name is the name of the specific database you want to connect to. 
You can ask the database administrator or check the database documentation to find out the database name.\n- Default Target Schema: The default target schema is the schema that you want to use as the default when connecting to the database. This may be set up by the database administrator or you may need to create it yourself. You can ask the database administrator or check the database documentation to find out the default target schema.",
              "logoUrl" : "https://app.matatika.com/assets/logos/loaders/postgres.png",
              "hidden" : false,
              "docs" : "https://www.matatika.com/data-details/target-postgres/",
              "pipUrl" : "git+https://github.com/Matatika/[email protected]",
              "repo" : "git+https://github.com/Matatika/[email protected]",
              "capabilities" : [ ],
              "select" : [ ],
              "update" : { },
              "vars" : { },
              "settings" : [ {
                "name" : "user",
                "aliases" : [ "username" ],
                "label" : "User",
                "kind" : "STRING",
                "description" : "The username used to connect to the Postgres Warehouse.",
                "required" : "true",
                "protected" : false
              }, {
                "name" : "password",
                "aliases" : [ ],
                "label" : "Password",
                "kind" : "PASSWORD",
                "description" : "The password used to authenticate the user.",
                "required" : "true",
                "protected" : false
              }, {
                "name" : "host",
                "aliases" : [ "address" ],
                "label" : "Host",
                "kind" : "STRING",
                "description" : "The hostname or IP address of the Postgres Warehouse server.",
                "required" : "true",
                "protected" : false
              }, {
                "name" : "port",
                "aliases" : [ ],
                "label" : "Port",
                "value" : "5432",
                "kind" : "INTEGER",
                "description" : "The port number used to connect to the Postgres Warehouse server.",
                "required" : "true",
                "protected" : false
              }, {
                "name" : "dbname",
                "aliases" : [ "database" ],
                "label" : "Database Name",
                "kind" : "STRING",
                "description" : "The name of the database to connect to.",
                "required" : "true",
                "protected" : false
              }, {
                "name" : "default_target_schema",
                "aliases" : [ ],
                "label" : "Default Target Schema",
                "value" : "analytics",
                "kind" : "STRING",
                "description" : "The default schema to use when writing data to the Postgres Warehouse.",
                "required" : "true",
                "protected" : false
              }, {
                "name" : "ssl",
                "aliases" : [ ],
                "label" : "SSL",
                "value" : "false",
                "kind" : "HIDDEN",
                "description" : "Whether or not to use SSL encryption when connecting to the Postgres Warehouse.",
                "protected" : false,
                "value_post_processor" : "STRINGIFY"
              }, {
                "name" : "batch_size_rows",
                "aliases" : [ ],
                "label" : "Batch Size Rows",
                "value" : "100000",
                "kind" : "INTEGER",
                "description" : "The number of rows to write to the Postgres Warehouse in each batch.",
                "protected" : false
              }, {
                "name" : "underscore_camel_case_fields",
                "aliases" : [ ],
                "label" : "Underscore Camel Case Fields",
                "value" : "true",
                "kind" : "HIDDEN",
                "description" : "Whether or not to convert field names from camel case to underscore-separated format.",
                "protected" : false
              }, {
                "name" : "flush_all_streams",
                "aliases" : [ ],
                "label" : "Flush All Streams",
                "value" : "false",
                "kind" : "HIDDEN",
                "description" : "Whether or not to flush all streams to the Postgres Warehouse before closing the connection.",
                "protected" : false
              }, {
                "name" : "parallelism",
                "aliases" : [ ],
                "label" : "Parallelism",
                "value" : "0",
                "kind" : "HIDDEN",
                "description" : "The number of threads to use when writing data to the Postgres Warehouse.",
                "protected" : false
              }, {
                "name" : "parallelism_max",
                "aliases" : [ ],
                "label" : "Max Parallelism",
                "value" : "16",
                "kind" : "HIDDEN",
                "description" : "The maximum number of threads to use when writing data to the Postgres Warehouse.",
                "protected" : false
              }, {
                "name" : "default_target_schema_select_permission",
                "aliases" : [ ],
                "label" : "Default Target Schema Select Permission",
                "kind" : "HIDDEN",
                "description" : "The permission level required to select data from the default target schema.",
                "protected" : false
              }, {
                "name" : "schema_mapping",
                "aliases" : [ ],
                "label" : "Schema Mapping",
                "kind" : "HIDDEN",
                "description" : "A mapping of source schema names to target schema names.",
                "protected" : false
              }, {
                "name" : "add_metadata_columns",
                "aliases" : [ ],
                "label" : "Add Metadata Columns",
                "value" : "true",
                "kind" : "HIDDEN",
                "description" : "Whether or not to add metadata columns to the target table.",
                "protected" : false
              }, {
                "name" : "hard_delete",
                "aliases" : [ ],
                "label" : "Hard Delete",
                "value" : "false",
                "kind" : "HIDDEN",
                "description" : "Whether or not to perform hard deletes when deleting data from the Postgres Warehouse.",
                "protected" : false
              }, {
                "name" : "data_flattening_max_level",
                "aliases" : [ ],
                "label" : "Data Flattening Max Level",
                "value" : "10",
                "kind" : "HIDDEN",
                "description" : "The maximum level of nested data structures to flatten when writing data to the Postgres Warehouse.",
                "protected" : false
              }, {
                "name" : "primary_key_required",
                "aliases" : [ ],
                "label" : "Primary Key Required",
                "value" : "false",
                "kind" : "BOOLEAN",
                "description" : "Whether or not a primary key is required for the target table.",
                "protected" : false
              }, {
                "name" : "validate_records",
                "aliases" : [ ],
                "label" : "Validate Records",
                "value" : "false",
                "kind" : "BOOLEAN",
                "description" : "Whether or not to validate records before writing them to the Postgres Warehouse.",
                "protected" : false
              }, {
                "name" : "temp_dir",
                "aliases" : [ ],
                "label" : "Temporary Directory",
                "kind" : "HIDDEN",
                "description" : "The directory to use for temporary files when writing data to the Postgres Warehouse.",
                "protected" : false
              } ],
              "variants" : [ ],
              "commands" : { },
              "matatikaHidden" : false,
              "requires" : [ ],
              "fullDescription" : "Postgres Warehouse is a data warehousing solution built on top of the Postgres database management system.\n\nPostgres Warehouse is designed to handle large volumes of data and complex queries, making it an ideal solution for businesses that need to store and analyze large amounts of data. It provides a number of features that are specifically tailored to data warehousing, such as columnar storage, parallel processing, and support for advanced analytics. Additionally, Postgres Warehouse is highly scalable, allowing businesses to easily add more resources as their data needs grow. Overall, Postgres Warehouse is a powerful and flexible data warehousing solution that can help businesses make better decisions by providing them with the insights they need to succeed.\n### Prerequisites\nThe process of obtaining the required settings for connecting to a Postgres Warehouse may vary depending on the specific setup and configuration of the database. However, here are some general ways to obtain each of the required settings:\n\n- User: The user is typically created when the database is set up. You can ask the database administrator or check the database documentation to find out the username.\n- Password: The password is also typically created when the database is set up. You can ask the database administrator or check the database documentation to find out the password.\n- Host: The host is the server where the database is located. You can ask the database administrator or check the database documentation to find out the host name or IP address.\n- Port: The port is the number that the database listens on for incoming connections. The default port for Postgres is 5432, but it may be different depending on the configuration. You can ask the database administrator or check the database documentation to find out the port number.\n- Database Name: The database name is the name of the specific database you want to connect to. 
You can ask the database administrator or check the database documentation to find out the database name.\n- Default Target Schema: The default target schema is the schema that you want to use as the default when connecting to the database. This may be set up by the database administrator or you may need to create it yourself. You can ask the database administrator or check the database documentation to find out the default target schema.\n\n## Settings\n\n\n### User\n\nThe username used to connect to the Postgres Warehouse.\n\n### Password\n\nThe password used to authenticate the user.\n\n### Host\n\nThe hostname or IP address of the Postgres Warehouse server.\n\n### Port\n\nThe port number used to connect to the Postgres Warehouse server.\n\n### Database Name\n\nThe name of the database to connect to.\n\n### Default Target Schema\n\nThe default schema to use when writing data to the Postgres Warehouse.\n\n### Batch Size Rows\n\nThe number of rows to write to the Postgres Warehouse in each batch.\n\n### Primary Key Required\n\nWhether or not a primary key is required for the target table.\n\n### Validate Records\n\nWhether or not to validate records before writing them to the Postgres Warehouse.",
              "_links" : {
                "self" : {
                  "href" : "https://app.matatika.com/api/dataplugins/9da27ab6-0f9a-478a-bc82-d119a43d6777"
                },
                "update dataplugin" : {
                  "href" : "https://app.matatika.com/api/workspaces/d9fda316-5510-46ee-8293-a24ba4230457/dataplugins/9da27ab6-0f9a-478a-bc82-d119a43d6777",
                  "type" : "PUT"
                }
              }
            }
          },
          "draft" : false,
          "managed" : true,
          "_links" : {
            "self" : {
              "href" : "https://app.matatika.com/api/datacomponents/c8b03642-3482-45f9-8bbd-9d0c53d468ed"
            },
            "update datacomponent" : {
              "href" : "https://app.matatika.com/api/datacomponents/c8b03642-3482-45f9-8bbd-9d0c53d468ed"
            },
            "delete datacomponent" : {
              "href" : "https://app.matatika.com/api/datacomponents/c8b03642-3482-45f9-8bbd-9d0c53d468ed"
            }
          }
        }, {
          "id" : "3bf944f2-ee7e-4326-a7ce-35df580478f9",
          "created" : "2024-07-15T09:10:47.73595",
          "lastModified" : "2024-07-15T09:10:47.735961",
          "name" : "dbt",
          "properties" : { },
          "commands" : {
            "compile" : {
              "args" : "compile",
              "description" : "Generates executable SQL from source model, test, and analysis files. Compiled SQL files are written to the target/ directory."
            },
            "seed" : {
              "args" : "seed",
              "description" : "Load data from csv files into your data warehouse."
            },
            "test" : {
              "args" : "test",
              "description" : "Runs tests on data in deployed models."
            },
            "docs-generate" : {
              "args" : "docs generate",
              "description" : "Generate documentation artifacts for your project."
            },
            "deps" : {
              "args" : "deps",
              "description" : "Pull the most recent version of the dependencies listed in packages.yml"
            },
            "run" : {
              "args" : "run",
              "description" : "Compile SQL and execute against the current target database."
            },
            "clean" : {
              "args" : "clean",
              "description" : "Delete all folders in the clean-targets list (usually the dbt_modules and target directories.)"
            },
            "snapshot" : {
              "args" : "snapshot",
              "description" : "Execute snapshots defined in your project."
            }
          },
          "dataPlugin" : "transformers/dbt--dbt-labs",
          "_embedded" : {
            "dataplugin" : {
              "id" : "422fa2e4-b419-4c24-af96-d49a7c3a2f35",
              "repositoryPath" : "plugins/transformers/dbt--dbt-labs.lock",
              "pluginType" : "TRANSFORMER",
              "name" : "dbt",
              "namespace" : "dbt",
              "variant" : "dbt-labs",
              "label" : "dbt",
              "description" : " Power your project transformations with dbt™, a SQL-first transformation tool that enables analytics engineers to develop transformations with code.\n\n***Version Control and CI/CD***\n\nUse Matatika to deploy and promote changes between dev, UAT, and production environments.\n\n***Test and Document***\n\nUse Matatika to develop and test every model prior to production release, and share dynamically generated documentation with all stakeholders.\n\n***Develop***\n\nWrite modular data transformations in .sql – Matatika together with dbt handles the chore of dependency management. ",
              "logoUrl" : "https://app.matatika.com/assets/images/transformer/dbt.png",
              "hidden" : false,
              "docs" : "https://www.matatika.com/data-details/dbt/",
              "pipUrl" : "dbt-core~=1.3.0 dbt-postgres~=1.3.0 dbt-snowflake~=1.3.0\n",
              "repo" : "https://github.com/dbt-labs/dbt-core",
              "capabilities" : [ ],
              "select" : [ ],
              "update" : { },
              "vars" : { },
              "settings" : [ {
                "name" : "project_dir",
                "aliases" : [ ],
                "value" : "$MELTANO_PROJECT_ROOT/transform",
                "kind" : "STRING",
                "protected" : false
              }, {
                "name" : "profiles_dir",
                "aliases" : [ ],
                "value" : "$MELTANO_PROJECT_ROOT/transform/profile",
                "kind" : "STRING",
                "env" : "DBT_PROFILES_DIR",
                "protected" : false
              }, {
                "name" : "target",
                "aliases" : [ ],
                "value" : "$MELTANO_LOAD__DIALECT",
                "kind" : "STRING",
                "protected" : false
              }, {
                "name" : "source_schema",
                "aliases" : [ ],
                "value" : "$MELTANO_LOAD__TARGET_SCHEMA",
                "kind" : "STRING",
                "protected" : false
              }, {
                "name" : "target_schema",
                "aliases" : [ ],
                "value" : "analytics",
                "kind" : "STRING",
                "protected" : false
              }, {
                "name" : "models",
                "aliases" : [ ],
                "value" : "$MELTANO_TRANSFORM__PACKAGE_NAME $MELTANO_EXTRACTOR_NAMESPACE my_meltano_project",
                "kind" : "STRING",
                "protected" : false
              } ],
              "variants" : [ ],
              "commands" : {
                "compile" : {
                  "args" : "compile",
                  "description" : "Generates executable SQL from source model, test, and analysis files. Compiled SQL files are written to the target/ directory."
                },
                "seed" : {
                  "args" : "seed",
                  "description" : "Load data from csv files into your data warehouse."
                },
                "test" : {
                  "args" : "test",
                  "description" : "Runs tests on data in deployed models."
                },
                "docs-generate" : {
                  "args" : "docs generate",
                  "description" : "Generate documentation artifacts for your project."
                },
                "deps" : {
                  "args" : "deps",
                  "description" : "Pull the most recent version of the dependencies listed in packages.yml"
                },
                "run" : {
                  "args" : "run",
                  "description" : "Compile SQL and execute against the current target database."
                },
                "clean" : {
                  "args" : "clean",
                  "description" : "Delete all folders in the clean-targets list (usually the dbt_modules and target directories.)"
                },
                "snapshot" : {
                  "args" : "snapshot",
                  "description" : "Execute snapshots defined in your project."
                }
              },
              "matatikaHidden" : false,
              "requires" : [ {
                "id" : "732af1ba-ad83-4fd0-ba19-4b07c02e1bba",
                "repositoryPath" : "plugins/files/files-dbt--matatika.lock",
                "pluginType" : "FILE",
                "name" : "files-dbt",
                "namespace" : "dbt",
                "variant" : "matatika",
                "label" : "files-dbt",
                "description" : " Files dbt is a file bundle that automatically configures your project to run transforms with dbt.\nThe bundle includes template project configuration:\n\n- transform/models (directory)\n- transform/profile/profiles.yml\n- transform/dbt_project.yml\n- transform/.gitignore\n",
                "hidden" : false,
                "pipUrl" : "git+https://github.com/Matatika/[email protected]",
                "repo" : "https://github.com/Matatika/files-dbt",
                "capabilities" : [ ],
                "select" : [ ],
                "update" : {
                  "transform/profile/profiles.yml" : "true"
                },
                "vars" : { },
                "settings" : [ ],
                "variants" : [ ],
                "commands" : { },
                "matatikaHidden" : false,
                "requires" : [ ],
                "fullDescription" : " Files dbt is a file bundle that automatically configures your project to run transforms with dbt.\nThe bundle includes template project configuration:\n\n- transform/models (directory)\n- transform/profile/profiles.yml\n- transform/dbt_project.yml\n- transform/.gitignore\n"
              } ],
              "fullDescription" : " Power your project transformations with dbt™, a SQL-first transformation tool that enables analytics engineers to develop transformations with code.\n\n***Version Control and CI/CD***\n\nUse Matatika to deploy and promote changes between dev, UAT, and production environments.\n\n***Test and Document***\n\nUse Matatika to develop and test every model prior to production release, and share dynamically generated documentation with all stakeholders.\n\n***Develop***\n\nWrite modular data transformations in .sql – Matatika together with dbt handles the chore of dependency management. ",
              "_links" : {
                "self" : {
                  "href" : "https://app.matatika.com/api/dataplugins/422fa2e4-b419-4c24-af96-d49a7c3a2f35"
                },
                "update dataplugin" : {
                  "href" : "https://app.matatika.com/api/workspaces/d9fda316-5510-46ee-8293-a24ba4230457/dataplugins/422fa2e4-b419-4c24-af96-d49a7c3a2f35",
                  "type" : "PUT"
                }
              }
            }
          },
          "draft" : false,
          "managed" : false,
          "_links" : {
            "self" : {
              "href" : "https://app.matatika.com/api/datacomponents/3bf944f2-ee7e-4326-a7ce-35df580478f9"
            },
            "update datacomponent" : {
              "href" : "https://app.matatika.com/api/datacomponents/3bf944f2-ee7e-4326-a7ce-35df580478f9"
            },
            "delete datacomponent" : {
              "href" : "https://app.matatika.com/api/datacomponents/3bf944f2-ee7e-4326-a7ce-35df580478f9"
            }
          }
        } ],
        "latest job" : {
          "id" : "02391ef9-ea14-461a-bfc3-de6bd76592c6",
          "created" : "2024-07-15T09:09:17.231363",
          "type" : "WORKSPACE_CONFIG",
          "maxAttempts" : 0,
          "attempt" : 0,
          "commitId" : "a6d55828ce4fb47b2a4ca9e0b0e668565e12cc4c",
          "exitCode" : 0,
          "status" : "COMPLETE",
          "startTime" : "2024-07-15T09:09:30.119",
          "endTime" : "2024-07-15T09:10:48.619"
        }
      },
      "_links" : {
        "update pipeline" : {
          "href" : "https://app.matatika.com/api/workspaces/d9fda316-5510-46ee-8293-a24ba4230457/pipelines/3e11d680-bed2-4900-a5eb-27ec3fd6844e",
          "type" : "PUT"
        },
        "delete pipeline" : {
          "href" : "https://app.matatika.com/api/pipelines/3e11d680-bed2-4900-a5eb-27ec3fd6844e",
          "type" : "DELETE"
        },
        "draft pipeline" : {
          "href" : "https://app.matatika.com/api/workspaces/d9fda316-5510-46ee-8293-a24ba4230457/pipelines/3e11d680-bed2-4900-a5eb-27ec3fd6844e/draft",
          "type" : "PUT"
        },
        "self" : {
          "href" : "https://app.matatika.com/api/pipelines/3e11d680-bed2-4900-a5eb-27ec3fd6844e"
        },
        "environment" : {
          "href" : "https://app.matatika.com/api/pipelines/3e11d680-bed2-4900-a5eb-27ec3fd6844e/environment"
        },
        "jobs" : {
          "href" : "https://app.matatika.com/api/pipelines/3e11d680-bed2-4900-a5eb-27ec3fd6844e/jobs",
          "type" : "GET"
        },
        "metrics" : {
          "href" : "https://app.matatika.com/api/pipelines/3e11d680-bed2-4900-a5eb-27ec3fd6844e/metrics"
        },
        "add subscription" : {
          "href" : "https://app.matatika.com/api/pipelines/3e11d680-bed2-4900-a5eb-27ec3fd6844e/subscriptions"
        },
        "verify pipeline" : {
          "href" : "https://app.matatika.com/api/pipelines/3e11d680-bed2-4900-a5eb-27ec3fd6844e/verification",
          "type" : "POST"
        },
        "latest job" : {
          "href" : "https://app.matatika.com/api/jobs/02391ef9-ea14-461a-bfc3-de6bd76592c6"
        }
      }
    }, {
      "id" : "37da82fd-da13-4df1-acd1-eb47fb8cc074",
      "status" : "READY",
      "name" : "SIT-generated pipeline [2024-07-15T10:10:49.295607] (updated)",
      "timeout" : 0,
      "maxRetries" : 0,
      "created" : "2024-07-15T09:10:49.554622",
      "lastModified" : "2024-07-15T09:10:49.554622",
      "properties" : { },
      "dataComponents" : [ "tap-google-analytics", "Warehouse", "dbt" ],
      "actions" : [ ],
      "triggeredBy" : [ ],
      "_embedded" : {
        "dataComponents" : [ {
          "id" : "985215c6-87c5-4633-bd45-7cc236a92f87",
          "created" : "2024-07-15T09:09:16.453622",
          "lastModified" : "2024-07-15T09:09:16.453623",
          "name" : "tap-google-analytics",
          "properties" : { },
          "commands" : { },
          "dataPlugin" : "extractors/tap-google-analytics--matatika",
          "_embedded" : {
            "dataplugin" : {
              "id" : "51bb57f1-9ccc-40ac-b14d-de574c855a63",
              "repositoryPath" : "plugins/extractors/tap-google-analytics--matatika.lock",
              "pluginType" : "EXTRACTOR",
              "name" : "tap-google-analytics",
              "namespace" : "tap_google_analytics",
              "variant" : "matatika",
              "label" : "Google Analytics",
              "description" : "Google Analytics is a web analytics service that provides insights into website traffic and user behavior.\n\nGoogle Analytics allows website owners to track and analyze various metrics related to their website's performance, such as the number of visitors, pageviews, bounce rate, and average session duration. It also provides information on the demographics and interests of website visitors, as well as the sources of traffic, including organic search, paid search, social media, and referrals. This data can be used to optimize website content and marketing strategies, as well as to measure the effectiveness of advertising campaigns. Additionally, Google Analytics offers advanced features such as goal tracking, e-commerce tracking, and custom reporting, making it a powerful tool for businesses of all sizes.\n### Prerequisites\nTo obtain the OAuth identity provider authorization endpoint used to create and refresh tokens, you need to create a project in the Google API Console and enable the Google Analytics API. Then, you can create OAuth 2.0 credentials and configure the authorized redirect URIs. The authorization endpoint will be provided in the credentials.\n\nThe OAuth scopes you need to request access to depend on the specific data you want to access in Google Analytics. For example, if you want to read data from a specific view, you will need to request the \"https://www.googleapis.com/auth/analytics.readonly\" scope. You can find a list of available scopes in the Google Analytics API documentation.\n\nTo obtain the Access Token and OAuth Refresh Token, you need to authenticate the user and obtain their consent to access their Google Analytics data. This can be done using the Google Sign-In API or the OAuth 2.0 authorization flow. 
Once the user has granted access, you will receive an Access Token and a Refresh Token that you can use to make API requests.\n\nTo obtain the View ID, you need to log in to your Google Analytics account and navigate to the Admin section. From there, you can select the account, property, and view that you want to access and find the View ID in the View Settings.",
              "logoUrl" : "https://app.matatika.com/assets/images/datasource/tap-google-analytics.svg",
              "hidden" : false,
              "docs" : "https://www.matatika.com/docs/instant-insights/tap-google-analytics/",
              "pipUrl" : "git+https://github.com/Matatika/[email protected]",
              "repo" : "https://github.com/Matatika/tap-google-analytics",
              "capabilities" : [ "CATALOG", "DISCOVER", "STATE" ],
              "select" : [ ],
              "update" : { },
              "vars" : { },
              "settings" : [ {
                "name" : "oauth_credentials.authorization_url",
                "aliases" : [ ],
                "label" : "OAuth identity provider authorization endpoint used create and refresh tokens",
                "value" : "https://oauth2.googleapis.com/token",
                "kind" : "HIDDEN",
                "description" : "The endpoint used to create and refresh OAuth tokens.",
                "required" : "true",
                "protected" : false
              }, {
                "name" : "oauth_credentials.scope",
                "aliases" : [ ],
                "label" : "OAuth scopes we need to request access to",
                "value" : "profile email https://www.googleapis.com/auth/analytics.readonly",
                "kind" : "HIDDEN",
                "description" : "The specific scopes we need to request access to in order to connect to Google Analytics.",
                "required" : "true",
                "protected" : false
              }, {
                "name" : "oauth_credentials.access_token",
                "aliases" : [ ],
                "label" : "Access Token",
                "kind" : "HIDDEN",
                "description" : "The token used to authenticate and authorize API requests.",
                "required" : "true",
                "protected" : false
              }, {
                "name" : "oauth_credentials.refresh_token",
                "aliases" : [ ],
                "label" : "OAuth Refresh Token",
                "kind" : "HIDDEN",
                "description" : "The token used to refresh the access token when it expires.",
                "required" : "true",
                "protected" : false
              }, {
                "name" : "oauth_credentials.refresh_proxy_url",
                "aliases" : [ ],
                "label" : "Optional - will be called with 'oauth_credentials.refresh_token' to refresh the access token",
                "kind" : "HIDDEN",
                "description" : "An optional function that will be called to refresh the access token using the refresh token.",
                "protected" : false
              }, {
                "name" : "oauth_credentials.refresh_proxy_url_auth",
                "aliases" : [ ],
                "label" : "Optional - Sets Authorization header on 'oauth_credentials.refresh_url' request",
                "kind" : "HIDDEN",
                "description" : "An optional setting that sets the Authorization header on the refresh URL request.",
                "protected" : false
              }, {
                "name" : "oauth_credentials.client_id",
                "aliases" : [ ],
                "label" : "Optional - OAuth Client ID used if refresh_proxy_url not supplied",
                "kind" : "HIDDEN",
                "description" : "An optional OAuth Client ID used if the refresh proxy URL is not supplied.",
                "protected" : false
              }, {
                "name" : "oauth_credentials.client_secret",
                "aliases" : [ ],
                "label" : "Optional - OAuth Client Secret used if refresh_proxy_url not supplied",
                "kind" : "HIDDEN",
                "description" : "An optional OAuth Client Secret used if the refresh proxy URL is not supplied.",
                "protected" : false
              }, {
                "name" : "view_id",
                "aliases" : [ ],
                "label" : "View ID",
                "placeholder" : "Ex. 198343027",
                "kind" : "STRING",
                "description" : "The ID of the Google Analytics view to retrieve data from.",
                "required" : "true",
                "protected" : false
              }, {
                "name" : "reports",
                "aliases" : [ ],
                "label" : "Reports",
                "placeholder" : "Ex. my_report_definition.json",
                "kind" : "STRING",
                "description" : "The specific reports to retrieve data from in the Google Analytics view.",
                "protected" : false
              }, {
                "name" : "start_date",
                "aliases" : [ ],
                "label" : "Start date",
                "kind" : "DATE_ISO8601",
                "description" : "The start date for the date range of data to retrieve.",
                "protected" : false
              }, {
                "name" : "end_date",
                "aliases" : [ ],
                "label" : "End date",
                "kind" : "DATE_ISO8601",
                "description" : "The end date for the date range of data to retrieve.",
                "protected" : false
              } ],
              "variants" : [ ],
              "commands" : { },
              "matatikaHidden" : false,
              "requires" : [ ],
              "fullDescription" : "Google Analytics is a web analytics service that provides insights into website traffic and user behavior.\n\nGoogle Analytics allows website owners to track and analyze various metrics related to their website's performance, such as the number of visitors, pageviews, bounce rate, and average session duration. It also provides information on the demographics and interests of website visitors, as well as the sources of traffic, including organic search, paid search, social media, and referrals. This data can be used to optimize website content and marketing strategies, as well as to measure the effectiveness of advertising campaigns. Additionally, Google Analytics offers advanced features such as goal tracking, e-commerce tracking, and custom reporting, making it a powerful tool for businesses of all sizes.\n### Prerequisites\nTo obtain the OAuth identity provider authorization endpoint used to create and refresh tokens, you need to create a project in the Google API Console and enable the Google Analytics API. Then, you can create OAuth 2.0 credentials and configure the authorized redirect URIs. The authorization endpoint will be provided in the credentials.\n\nThe OAuth scopes you need to request access to depend on the specific data you want to access in Google Analytics. For example, if you want to read data from a specific view, you will need to request the \"https://www.googleapis.com/auth/analytics.readonly\" scope. You can find a list of available scopes in the Google Analytics API documentation.\n\nTo obtain the Access Token and OAuth Refresh Token, you need to authenticate the user and obtain their consent to access their Google Analytics data. This can be done using the Google Sign-In API or the OAuth 2.0 authorization flow. 
Once the user has granted access, you will receive an Access Token and a Refresh Token that you can use to make API requests.\n\nTo obtain the View ID, you need to log in to your Google Analytics account and navigate to the Admin section. From there, you can select the account, property, and view that you want to access and find the View ID in the View Settings.\n\n## Settings\n\n\n### View ID\n\nThe ID of the Google Analytics view to retrieve data from.\n\n### Reports\n\nThe specific reports to retrieve data from in the Google Analytics view.\n\n### Start date\n\nThe start date for the date range of data to retrieve.\n\n### End date\n\nThe end date for the date range of data to retrieve.",
              "_links" : {
                "self" : {
                  "href" : "https://app.matatika.com/api/dataplugins/51bb57f1-9ccc-40ac-b14d-de574c855a63"
                },
                "update dataplugin" : {
                  "href" : "https://app.matatika.com/api/workspaces/d9fda316-5510-46ee-8293-a24ba4230457/dataplugins/51bb57f1-9ccc-40ac-b14d-de574c855a63",
                  "type" : "PUT"
                }
              }
            }
          },
          "draft" : true,
          "managed" : false,
          "_links" : {
            "self" : {
              "href" : "https://app.matatika.com/api/datacomponents/985215c6-87c5-4633-bd45-7cc236a92f87"
            },
            "update datacomponent" : {
              "href" : "https://app.matatika.com/api/datacomponents/985215c6-87c5-4633-bd45-7cc236a92f87"
            },
            "delete datacomponent" : {
              "href" : "https://app.matatika.com/api/datacomponents/985215c6-87c5-4633-bd45-7cc236a92f87"
            }
          }
        }, {
          "id" : "c8b03642-3482-45f9-8bbd-9d0c53d468ed",
          "created" : "2024-07-15T09:08:49.599991",
          "lastModified" : "2024-07-15T09:11:20.958587",
          "name" : "Warehouse",
          "properties" : {
            "password" : "4SllWL17_E43GjJq5_GhDX041h",
            "default_target_schema" : "analytics",
            "dbname" : "fruzutu",
            "port" : "5432",
            "host" : "sharp-banana.postgres.database.azure.com",
            "user" : "fruzutu"
          },
          "commands" : { },
          "dataPlugin" : "loaders/target-postgres--matatika",
          "_embedded" : {
            "dataplugin" : {
              "id" : "9da27ab6-0f9a-478a-bc82-d119a43d6777",
              "repositoryPath" : "plugins/loaders/target-postgres--matatika.lock",
              "pluginType" : "LOADER",
              "name" : "target-postgres",
              "namespace" : "postgres_transferwise",
              "variant" : "matatika",
              "label" : "Postgres Warehouse",
              "description" : "Postgres Warehouse is a data warehousing solution built on top of the Postgres database management system.\n\nPostgres Warehouse is designed to handle large volumes of data and complex queries, making it an ideal solution for businesses that need to store and analyze large amounts of data. It provides a number of features that are specifically tailored to data warehousing, such as columnar storage, parallel processing, and support for advanced analytics. Additionally, Postgres Warehouse is highly scalable, allowing businesses to easily add more resources as their data needs grow. Overall, Postgres Warehouse is a powerful and flexible data warehousing solution that can help businesses make better decisions by providing them with the insights they need to succeed.\n### Prerequisites\nThe process of obtaining the required settings for connecting to a Postgres Warehouse may vary depending on the specific setup and configuration of the database. However, here are some general ways to obtain each of the required settings:\n\n- User: The user is typically created when the database is set up. You can ask the database administrator or check the database documentation to find out the username.\n- Password: The password is also typically created when the database is set up. You can ask the database administrator or check the database documentation to find out the password.\n- Host: The host is the server where the database is located. You can ask the database administrator or check the database documentation to find out the host name or IP address.\n- Port: The port is the number that the database listens on for incoming connections. The default port for Postgres is 5432, but it may be different depending on the configuration. You can ask the database administrator or check the database documentation to find out the port number.\n- Database Name: The database name is the name of the specific database you want to connect to. 
You can ask the database administrator or check the database documentation to find out the database name.\n- Default Target Schema: The default target schema is the schema that you want to use as the default when connecting to the database. This may be set up by the database administrator or you may need to create it yourself. You can ask the database administrator or check the database documentation to find out the default target schema.",
              "logoUrl" : "https://app.matatika.com/assets/logos/loaders/postgres.png",
              "hidden" : false,
              "docs" : "https://www.matatika.com/data-details/target-postgres/",
              "pipUrl" : "git+https://github.com/Matatika/[email protected]",
              "repo" : "git+https://github.com/Matatika/[email protected]",
              "capabilities" : [ ],
              "select" : [ ],
              "update" : { },
              "vars" : { },
              "settings" : [ {
                "name" : "user",
                "aliases" : [ "username" ],
                "label" : "User",
                "kind" : "STRING",
                "description" : "The username used to connect to the Postgres Warehouse.",
                "required" : "true",
                "protected" : false
              }, {
                "name" : "password",
                "aliases" : [ ],
                "label" : "Password",
                "kind" : "PASSWORD",
                "description" : "The password used to authenticate the user.",
                "required" : "true",
                "protected" : false
              }, {
                "name" : "host",
                "aliases" : [ "address" ],
                "label" : "Host",
                "kind" : "STRING",
                "description" : "The hostname or IP address of the Postgres Warehouse server.",
                "required" : "true",
                "protected" : false
              }, {
                "name" : "port",
                "aliases" : [ ],
                "label" : "Port",
                "value" : "5432",
                "kind" : "INTEGER",
                "description" : "The port number used to connect to the Postgres Warehouse server.",
                "required" : "true",
                "protected" : false
              }, {
                "name" : "dbname",
                "aliases" : [ "database" ],
                "label" : "Database Name",
                "kind" : "STRING",
                "description" : "The name of the database to connect to.",
                "required" : "true",
                "protected" : false
              }, {
                "name" : "default_target_schema",
                "aliases" : [ ],
                "label" : "Default Target Schema",
                "value" : "analytics",
                "kind" : "STRING",
                "description" : "The default schema to use when writing data to the Postgres Warehouse.",
                "required" : "true",
                "protected" : false
              }, {
                "name" : "ssl",
                "aliases" : [ ],
                "label" : "SSL",
                "value" : "false",
                "kind" : "HIDDEN",
                "description" : "Whether or not to use SSL encryption when connecting to the Postgres Warehouse.",
                "protected" : false,
                "value_post_processor" : "STRINGIFY"
              }, {
                "name" : "batch_size_rows",
                "aliases" : [ ],
                "label" : "Batch Size Rows",
                "value" : "100000",
                "kind" : "INTEGER",
                "description" : "The number of rows to write to the Postgres Warehouse in each batch.",
                "protected" : false
              }, {
                "name" : "underscore_camel_case_fields",
                "aliases" : [ ],
                "label" : "Underscore Camel Case Fields",
                "value" : "true",
                "kind" : "HIDDEN",
                "description" : "Whether or not to convert field names from camel case to underscore-separated format.",
                "protected" : false
              }, {
                "name" : "flush_all_streams",
                "aliases" : [ ],
                "label" : "Flush All Streams",
                "value" : "false",
                "kind" : "HIDDEN",
                "description" : "Whether or not to flush all streams to the Postgres Warehouse before closing the connection.",
                "protected" : false
              }, {
                "name" : "parallelism",
                "aliases" : [ ],
                "label" : "Parallelism",
                "value" : "0",
                "kind" : "HIDDEN",
                "description" : "The number of threads to use when writing data to the Postgres Warehouse.",
                "protected" : false
              }, {
                "name" : "parallelism_max",
                "aliases" : [ ],
                "label" : "Max Parallelism",
                "value" : "16",
                "kind" : "HIDDEN",
                "description" : "The maximum number of threads to use when writing data to the Postgres Warehouse.",
                "protected" : false
              }, {
                "name" : "default_target_schema_select_permission",
                "aliases" : [ ],
                "label" : "Default Target Schema Select Permission",
                "kind" : "HIDDEN",
                "description" : "The permission level required to select data from the default target schema.",
                "protected" : false
              }, {
                "name" : "schema_mapping",
                "aliases" : [ ],
                "label" : "Schema Mapping",
                "kind" : "HIDDEN",
                "description" : "A mapping of source schema names to target schema names.",
                "protected" : false
              }, {
                "name" : "add_metadata_columns",
                "aliases" : [ ],
                "label" : "Add Metadata Columns",
                "value" : "true",
                "kind" : "HIDDEN",
                "description" : "Whether or not to add metadata columns to the target table.",
                "protected" : false
              }, {
                "name" : "hard_delete",
                "aliases" : [ ],
                "label" : "Hard Delete",
                "value" : "false",
                "kind" : "HIDDEN",
                "description" : "Whether or not to perform hard deletes when deleting data from the Postgres Warehouse.",
                "protected" : false
              }, {
                "name" : "data_flattening_max_level",
                "aliases" : [ ],
                "label" : "Data Flattening Max Level",
                "value" : "10",
                "kind" : "HIDDEN",
                "description" : "The maximum level of nested data structures to flatten when writing data to the Postgres Warehouse.",
                "protected" : false
              }, {
                "name" : "primary_key_required",
                "aliases" : [ ],
                "label" : "Primary Key Required",
                "value" : "false",
                "kind" : "BOOLEAN",
                "description" : "Whether or not a primary key is required for the target table.",
                "protected" : false
              }, {
                "name" : "validate_records",
                "aliases" : [ ],
                "label" : "Validate Records",
                "value" : "false",
                "kind" : "BOOLEAN",
                "description" : "Whether or not to validate records before writing them to the Postgres Warehouse.",
                "protected" : false
              }, {
                "name" : "temp_dir",
                "aliases" : [ ],
                "label" : "Temporary Directory",
                "kind" : "HIDDEN",
                "description" : "The directory to use for temporary files when writing data to the Postgres Warehouse.",
                "protected" : false
              } ],
              "variants" : [ ],
              "commands" : { },
              "matatikaHidden" : false,
              "requires" : [ ],
              "fullDescription" : "Postgres Warehouse is a data warehousing solution built on top of the Postgres database management system.\n\nPostgres Warehouse is designed to handle large volumes of data and complex queries, making it an ideal solution for businesses that need to store and analyze large amounts of data. It provides a number of features that are specifically tailored to data warehousing, such as columnar storage, parallel processing, and support for advanced analytics. Additionally, Postgres Warehouse is highly scalable, allowing businesses to easily add more resources as their data needs grow. Overall, Postgres Warehouse is a powerful and flexible data warehousing solution that can help businesses make better decisions by providing them with the insights they need to succeed.\n### Prerequisites\nThe process of obtaining the required settings for connecting to a Postgres Warehouse may vary depending on the specific setup and configuration of the database. However, here are some general ways to obtain each of the required settings:\n\n- User: The user is typically created when the database is set up. You can ask the database administrator or check the database documentation to find out the username.\n- Password: The password is also typically created when the database is set up. You can ask the database administrator or check the database documentation to find out the password.\n- Host: The host is the server where the database is located. You can ask the database administrator or check the database documentation to find out the host name or IP address.\n- Port: The port is the number that the database listens on for incoming connections. The default port for Postgres is 5432, but it may be different depending on the configuration. You can ask the database administrator or check the database documentation to find out the port number.\n- Database Name: The database name is the name of the specific database you want to connect to. 
You can ask the database administrator or check the database documentation to find out the database name.\n- Default Target Schema: The default target schema is the schema that you want to use as the default when connecting to the database. This may be set up by the database administrator or you may need to create it yourself. You can ask the database administrator or check the database documentation to find out the default target schema.\n\n## Settings\n\n\n### User\n\nThe username used to connect to the Postgres Warehouse.\n\n### Password\n\nThe password used to authenticate the user.\n\n### Host\n\nThe hostname or IP address of the Postgres Warehouse server.\n\n### Port\n\nThe port number used to connect to the Postgres Warehouse server.\n\n### Database Name\n\nThe name of the database to connect to.\n\n### Default Target Schema\n\nThe default schema to use when writing data to the Postgres Warehouse.\n\n### Batch Size Rows\n\nThe number of rows to write to the Postgres Warehouse in each batch.\n\n### Primary Key Required\n\nWhether or not a primary key is required for the target table.\n\n### Validate Records\n\nWhether or not to validate records before writing them to the Postgres Warehouse.",
              "_links" : {
                "self" : {
                  "href" : "https://app.matatika.com/api/dataplugins/9da27ab6-0f9a-478a-bc82-d119a43d6777"
                },
                "update dataplugin" : {
                  "href" : "https://app.matatika.com/api/workspaces/d9fda316-5510-46ee-8293-a24ba4230457/dataplugins/9da27ab6-0f9a-478a-bc82-d119a43d6777",
                  "type" : "PUT"
                }
              }
            }
          },
          "draft" : false,
          "managed" : true,
          "_links" : {
            "self" : {
              "href" : "https://app.matatika.com/api/datacomponents/c8b03642-3482-45f9-8bbd-9d0c53d468ed"
            },
            "update datacomponent" : {
              "href" : "https://app.matatika.com/api/datacomponents/c8b03642-3482-45f9-8bbd-9d0c53d468ed"
            },
            "delete datacomponent" : {
              "href" : "https://app.matatika.com/api/datacomponents/c8b03642-3482-45f9-8bbd-9d0c53d468ed"
            }
          }
        }, {
          "id" : "3bf944f2-ee7e-4326-a7ce-35df580478f9",
          "created" : "2024-07-15T09:10:47.73595",
          "lastModified" : "2024-07-15T09:10:47.735961",
          "name" : "dbt",
          "properties" : { },
          "commands" : {
            "compile" : {
              "args" : "compile",
              "description" : "Generates executable SQL from source model, test, and analysis files. Compiled SQL files are written to the target/ directory."
            },
            "seed" : {
              "args" : "seed",
              "description" : "Load data from csv files into your data warehouse."
            },
            "test" : {
              "args" : "test",
              "description" : "Runs tests on data in deployed models."
            },
            "docs-generate" : {
              "args" : "docs generate",
              "description" : "Generate documentation artifacts for your project."
            },
            "deps" : {
              "args" : "deps",
              "description" : "Pull the most recent version of the dependencies listed in packages.yml"
            },
            "run" : {
              "args" : "run",
              "description" : "Compile SQL and execute against the current target database."
            },
            "clean" : {
              "args" : "clean",
              "description" : "Delete all folders in the clean-targets list (usually the dbt_modules and target directories.)"
            },
            "snapshot" : {
              "args" : "snapshot",
              "description" : "Execute snapshots defined in your project."
            }
          },
          "dataPlugin" : "transformers/dbt--dbt-labs",
          "_embedded" : {
            "dataplugin" : {
              "id" : "422fa2e4-b419-4c24-af96-d49a7c3a2f35",
              "repositoryPath" : "plugins/transformers/dbt--dbt-labs.lock",
              "pluginType" : "TRANSFORMER",
              "name" : "dbt",
              "namespace" : "dbt",
              "variant" : "dbt-labs",
              "label" : "dbt",
              "description" : " Power your project transformations with dbt™, a SQL-first transformation tool that enables analytics engineers to develop transformations with code.\n\n***Version Control and CI/CD***\n\nUse Matatika to deploy and promote changes between dev, UAT, and production environments.\n\n***Test and Document***\n\nUse Matatika to develop and test every model prior to production release, and share dynamically generated documentation with all stakeholders.\n\n***Develop***\n\nWrite modular data transformations in .sql – Matatika together with dbt handles the chore of dependency management. ",
              "logoUrl" : "https://app.matatika.com/assets/images/transformer/dbt.png",
              "hidden" : false,
              "docs" : "https://www.matatika.com/data-details/dbt/",
              "pipUrl" : "dbt-core~=1.3.0 dbt-postgres~=1.3.0 dbt-snowflake~=1.3.0\n",
              "repo" : "https://github.com/dbt-labs/dbt-core",
              "capabilities" : [ ],
              "select" : [ ],
              "update" : { },
              "vars" : { },
              "settings" : [ {
                "name" : "project_dir",
                "aliases" : [ ],
                "value" : "$MELTANO_PROJECT_ROOT/transform",
                "kind" : "STRING",
                "protected" : false
              }, {
                "name" : "profiles_dir",
                "aliases" : [ ],
                "value" : "$MELTANO_PROJECT_ROOT/transform/profile",
                "kind" : "STRING",
                "env" : "DBT_PROFILES_DIR",
                "protected" : false
              }, {
                "name" : "target",
                "aliases" : [ ],
                "value" : "$MELTANO_LOAD__DIALECT",
                "kind" : "STRING",
                "protected" : false
              }, {
                "name" : "source_schema",
                "aliases" : [ ],
                "value" : "$MELTANO_LOAD__TARGET_SCHEMA",
                "kind" : "STRING",
                "protected" : false
              }, {
                "name" : "target_schema",
                "aliases" : [ ],
                "value" : "analytics",
                "kind" : "STRING",
                "protected" : false
              }, {
                "name" : "models",
                "aliases" : [ ],
                "value" : "$MELTANO_TRANSFORM__PACKAGE_NAME $MELTANO_EXTRACTOR_NAMESPACE my_meltano_project",
                "kind" : "STRING",
                "protected" : false
              } ],
              "variants" : [ ],
              "commands" : {
                "compile" : {
                  "args" : "compile",
                  "description" : "Generates executable SQL from source model, test, and analysis files. Compiled SQL files are written to the target/ directory."
                },
                "seed" : {
                  "args" : "seed",
                  "description" : "Load data from csv files into your data warehouse."
                },
                "test" : {
                  "args" : "test",
                  "description" : "Runs tests on data in deployed models."
                },
                "docs-generate" : {
                  "args" : "docs generate",
                  "description" : "Generate documentation artifacts for your project."
                },
                "deps" : {
                  "args" : "deps",
                  "description" : "Pull the most recent version of the dependencies listed in packages.yml"
                },
                "run" : {
                  "args" : "run",
                  "description" : "Compile SQL and execute against the current target database."
                },
                "clean" : {
                  "args" : "clean",
                  "description" : "Delete all folders in the clean-targets list (usually the dbt_modules and target directories.)"
                },
                "snapshot" : {
                  "args" : "snapshot",
                  "description" : "Execute snapshots defined in your project."
                }
              },
              "matatikaHidden" : false,
              "requires" : [ {
                "id" : "732af1ba-ad83-4fd0-ba19-4b07c02e1bba",
                "repositoryPath" : "plugins/files/files-dbt--matatika.lock",
                "pluginType" : "FILE",
                "name" : "files-dbt",
                "namespace" : "dbt",
                "variant" : "matatika",
                "label" : "files-dbt",
                "description" : " Files dbt is a file bundle that automatically configures your project to run transforms with dbt.\nThe bundle includes template project configuration:\n\n- transform/models (directory)\n- transform/profile/profiles.yml\n- transform/dbt_project.yml\n- transform/.gitignore\n",
                "hidden" : false,
                "pipUrl" : "git+https://github.com/Matatika/[email protected]",
                "repo" : "https://github.com/Matatika/files-dbt",
                "capabilities" : [ ],
                "select" : [ ],
                "update" : {
                  "transform/profile/profiles.yml" : "true"
                },
                "vars" : { },
                "settings" : [ ],
                "variants" : [ ],
                "commands" : { },
                "matatikaHidden" : false,
                "requires" : [ ],
                "fullDescription" : " Files dbt is a file bundle that automatically configures your project to run transforms with dbt.\nThe bundle includes template project configuration:\n\n- transform/models (directory)\n- transform/profile/profiles.yml\n- transform/dbt_project.yml\n- transform/.gitignore\n"
              } ],
              "fullDescription" : " Power your project transformations with dbt™, a SQL-first transformation tool that enables analytics engineers to develop transformations with code.\n\n***Version Control and CI/CD***\n\nUse Matatika to deploy and promote changes between dev, UAT, and production environments.\n\n***Test and Document***\n\nUse Matatika to develop and test every model prior to production release, and share dynamically generated documentation with all stakeholders.\n\n***Develop***\n\nWrite modular data transformations in .sql – Matatika together with dbt handles the chore of dependency management. ",
              "_links" : {
                "self" : {
                  "href" : "https://app.matatika.com/api/dataplugins/422fa2e4-b419-4c24-af96-d49a7c3a2f35"
                },
                "update dataplugin" : {
                  "href" : "https://app.matatika.com/api/workspaces/d9fda316-5510-46ee-8293-a24ba4230457/dataplugins/422fa2e4-b419-4c24-af96-d49a7c3a2f35",
                  "type" : "PUT"
                }
              }
            }
          },
          "draft" : false,
          "managed" : false,
          "_links" : {
            "self" : {
              "href" : "https://app.matatika.com/api/datacomponents/3bf944f2-ee7e-4326-a7ce-35df580478f9"
            },
            "update datacomponent" : {
              "href" : "https://app.matatika.com/api/datacomponents/3bf944f2-ee7e-4326-a7ce-35df580478f9"
            },
            "delete datacomponent" : {
              "href" : "https://app.matatika.com/api/datacomponents/3bf944f2-ee7e-4326-a7ce-35df580478f9"
            }
          }
        } ],
        "latest job" : {
          "id" : "e6f969cb-8197-4d33-a588-b97ba78d904f",
          "created" : "2024-07-15T09:10:50.289063",
          "type" : "WORKSPACE_CONFIG",
          "maxAttempts" : 0,
          "attempt" : 0,
          "commitId" : "2afd2a5d29ec4ada81ebd49a24ce003acab803f2",
          "exitCode" : 0,
          "status" : "COMPLETE",
          "startTime" : "2024-07-15T09:11:02.609",
          "endTime" : "2024-07-15T09:11:21.687"
        }
      },
      "_links" : {
        "update pipeline" : {
          "href" : "https://app.matatika.com/api/workspaces/d9fda316-5510-46ee-8293-a24ba4230457/pipelines/37da82fd-da13-4df1-acd1-eb47fb8cc074",
          "type" : "PUT"
        },
        "delete pipeline" : {
          "href" : "https://app.matatika.com/api/pipelines/37da82fd-da13-4df1-acd1-eb47fb8cc074",
          "type" : "DELETE"
        },
        "draft pipeline" : {
          "href" : "https://app.matatika.com/api/workspaces/d9fda316-5510-46ee-8293-a24ba4230457/pipelines/37da82fd-da13-4df1-acd1-eb47fb8cc074/draft",
          "type" : "PUT"
        },
        "self" : {
          "href" : "https://app.matatika.com/api/pipelines/37da82fd-da13-4df1-acd1-eb47fb8cc074"
        },
        "environment" : {
          "href" : "https://app.matatika.com/api/pipelines/37da82fd-da13-4df1-acd1-eb47fb8cc074/environment"
        },
        "jobs" : {
          "href" : "https://app.matatika.com/api/pipelines/37da82fd-da13-4df1-acd1-eb47fb8cc074/jobs",
          "type" : "GET"
        },
        "metrics" : {
          "href" : "https://app.matatika.com/api/pipelines/37da82fd-da13-4df1-acd1-eb47fb8cc074/metrics"
        },
        "add subscription" : {
          "href" : "https://app.matatika.com/api/pipelines/37da82fd-da13-4df1-acd1-eb47fb8cc074/subscriptions"
        },
        "verify pipeline" : {
          "href" : "https://app.matatika.com/api/pipelines/37da82fd-da13-4df1-acd1-eb47fb8cc074/verification",
          "type" : "POST"
        },
        "create job" : {
          "href" : "https://app.matatika.com/api/pipelines/37da82fd-da13-4df1-acd1-eb47fb8cc074/jobs",
          "type" : "POST"
        },
        "latest job" : {
          "href" : "https://app.matatika.com/api/jobs/e6f969cb-8197-4d33-a588-b97ba78d904f"
        }
      }
    } ]
  },
  "_links" : {
    "self" : {
      "href" : "https://app.matatika.com/api/workspaces/d9fda316-5510-46ee-8293-a24ba4230457/pipelines?page=0&size=20&sort=name,asc"
    }
  },
  "page" : {
    "size" : 20,
    "totalElements" : 3,
    "totalPages" : 1,
    "number" : 0
  }
}
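The `page` object at the end of the collection response describes pagination: `size` and `number` mirror the request parameters visible in the `self` link, while `totalPages` bounds iteration. A minimal sketch of computing the next page to request from that metadata (the `next_page_number` helper is illustrative, not part of the API):

```python
def next_page_number(page):
    """Given the 'page' metadata from a collection response, return the
    next zero-based page number, or None when already on the last page."""
    if page["number"] + 1 < page["totalPages"]:
        return page["number"] + 1
    return None

# With the example metadata above (3 elements in 1 page of size 20),
# there is no further page to fetch.
meta = {"size": 20, "totalElements": 3, "totalPages": 1, "number": 0}
print(next_page_number(meta))
```

A client would loop, appending `?page={n}&size=20` to the collection URL until this returns `None`.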

View a pipeline

GET

/api/pipelines/{pipeline-id}

Returns the pipeline {pipeline-id}.

Prerequisites

  • Pipeline {pipeline-id} must exist

Request

Example Snippets

cURL

curl -H "Authorization: Bearer $ACCESS_TOKEN" 'https://app.matatika.com/api/pipelines/37da82fd-da13-4df1-acd1-eb47fb8cc074' -i -X GET \
    -H 'Accept: application/json, application/javascript, text/javascript, text/json' \
    -H 'Content-Type: application/json'

Python (requests)

import requests

url = "https://app.matatika.com/api/pipelines/37da82fd-da13-4df1-acd1-eb47fb8cc074"

headers = {
  'Authorization': f"Bearer {ACCESS_TOKEN}"
}

response = requests.request("GET", url, headers=headers)

print(response.text)
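Responses are HAL documents, so follow-up requests can be driven by the `_links` object rather than hand-built URLs. A minimal sketch of resolving a link relation from a response body (the `hal_href` helper name is illustrative, not part of the API):

```python
import json

def hal_href(body, rel):
    """Return the href for a named link relation in a HAL document,
    or None if the relation is absent."""
    link = json.loads(body).get("_links", {}).get(rel)
    return link["href"] if link else None

# e.g. resolve the "jobs" link from a pipeline response body
body = '{"_links": {"jobs": {"href": "https://app.matatika.com/api/pipelines/37da82fd-da13-4df1-acd1-eb47fb8cc074/jobs", "type": "GET"}}}'
print(hal_href(body, "jobs"))
```

The same lookup works for any relation in the response below, such as `create job` or `latest job`.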

Response

200 OK

Pipeline with HAL links.

{
  "id" : "37da82fd-da13-4df1-acd1-eb47fb8cc074",
  "status" : "READY",
  "name" : "SIT-generated pipeline [2024-07-15T10:10:49.295607] (updated)",
  "timeout" : 0,
  "maxRetries" : 0,
  "created" : "2024-07-15T09:10:49.554622",
  "lastModified" : "2024-07-15T09:10:49.554622",
  "properties" : {
    "tap-google-analytics.oauth_credentials.client_secret" : "client_secret",
    "tap-google-analytics.view_id" : "1234567890",
    "tap-google-analytics.reports" : "reports",
    "tap-google-analytics.oauth_credentials.refresh_token" : "refresh_token",
    "tap-google-analytics.start_date" : "2024-06-15T10:09:16.254923+01:00",
    "tap-google-analytics.end_date" : "2024-07-15T10:09:16.255042+01:00",
    "tap-google-analytics.oauth_credentials.access_token" : "access_token",
    "tap-google-analytics.oauth_credentials.client_id" : "client_id"
  },
  "dataComponents" : [ "tap-google-analytics", "Warehouse", "dbt" ],
  "actions" : [ ],
  "triggeredBy" : [ ],
  "_embedded" : {
    "dataComponents" : [ {
      "id" : "985215c6-87c5-4633-bd45-7cc236a92f87",
      "created" : "2024-07-15T09:09:16.453622",
      "lastModified" : "2024-07-15T09:09:16.453623",
      "name" : "tap-google-analytics",
      "properties" : { },
      "commands" : { },
      "dataPlugin" : "extractors/tap-google-analytics--matatika",
      "_embedded" : {
        "dataplugin" : {
          "id" : "51bb57f1-9ccc-40ac-b14d-de574c855a63",
          "repositoryPath" : "plugins/extractors/tap-google-analytics--matatika.lock",
          "pluginType" : "EXTRACTOR",
          "name" : "tap-google-analytics",
          "namespace" : "tap_google_analytics",
          "variant" : "matatika",
          "label" : "Google Analytics",
          "description" : "Google Analytics is a web analytics service that provides insights into website traffic and user behavior.\n\nGoogle Analytics allows website owners to track and analyze various metrics related to their website's performance, such as the number of visitors, pageviews, bounce rate, and average session duration. It also provides information on the demographics and interests of website visitors, as well as the sources of traffic, including organic search, paid search, social media, and referrals. This data can be used to optimize website content and marketing strategies, as well as to measure the effectiveness of advertising campaigns. Additionally, Google Analytics offers advanced features such as goal tracking, e-commerce tracking, and custom reporting, making it a powerful tool for businesses of all sizes.\n### Prerequisites\nTo obtain the OAuth identity provider authorization endpoint used to create and refresh tokens, you need to create a project in the Google API Console and enable the Google Analytics API. Then, you can create OAuth 2.0 credentials and configure the authorized redirect URIs. The authorization endpoint will be provided in the credentials.\n\nThe OAuth scopes you need to request access to depend on the specific data you want to access in Google Analytics. For example, if you want to read data from a specific view, you will need to request the \"https://www.googleapis.com/auth/analytics.readonly\" scope. You can find a list of available scopes in the Google Analytics API documentation.\n\nTo obtain the Access Token and OAuth Refresh Token, you need to authenticate the user and obtain their consent to access their Google Analytics data. This can be done using the Google Sign-In API or the OAuth 2.0 authorization flow. 
Once the user has granted access, you will receive an Access Token and a Refresh Token that you can use to make API requests.\n\nTo obtain the View ID, you need to log in to your Google Analytics account and navigate to the Admin section. From there, you can select the account, property, and view that you want to access and find the View ID in the View Settings.",
          "logoUrl" : "https://app.matatika.com/assets/images/datasource/tap-google-analytics.svg",
          "hidden" : false,
          "docs" : "https://www.matatika.com/docs/instant-insights/tap-google-analytics/",
          "pipUrl" : "git+https://github.com/Matatika/[email protected]",
          "repo" : "https://github.com/Matatika/tap-google-analytics",
          "capabilities" : [ "CATALOG", "DISCOVER", "STATE" ],
          "select" : [ ],
          "update" : { },
          "vars" : { },
          "settings" : [ {
            "name" : "oauth_credentials.authorization_url",
            "aliases" : [ ],
            "label" : "OAuth identity provider authorization endpoint used create and refresh tokens",
            "value" : "https://oauth2.googleapis.com/token",
            "kind" : "HIDDEN",
            "description" : "The endpoint used to create and refresh OAuth tokens.",
            "required" : "true",
            "protected" : false
          }, {
            "name" : "oauth_credentials.scope",
            "aliases" : [ ],
            "label" : "OAuth scopes we need to request access to",
            "value" : "profile email https://www.googleapis.com/auth/analytics.readonly",
            "kind" : "HIDDEN",
            "description" : "The specific scopes we need to request access to in order to connect to Google Analytics.",
            "required" : "true",
            "protected" : false
          }, {
            "name" : "oauth_credentials.access_token",
            "aliases" : [ ],
            "label" : "Access Token",
            "kind" : "HIDDEN",
            "description" : "The token used to authenticate and authorize API requests.",
            "required" : "true",
            "protected" : false
          }, {
            "name" : "oauth_credentials.refresh_token",
            "aliases" : [ ],
            "label" : "OAuth Refresh Token",
            "kind" : "HIDDEN",
            "description" : "The token used to refresh the access token when it expires.",
            "required" : "true",
            "protected" : false
          }, {
            "name" : "oauth_credentials.refresh_proxy_url",
            "aliases" : [ ],
            "label" : "Optional - will be called with 'oauth_credentials.refresh_token' to refresh the access token",
            "kind" : "HIDDEN",
            "description" : "An optional function that will be called to refresh the access token using the refresh token.",
            "protected" : false
          }, {
            "name" : "oauth_credentials.refresh_proxy_url_auth",
            "aliases" : [ ],
            "label" : "Optional - Sets Authorization header on 'oauth_credentials.refresh_url' request",
            "kind" : "HIDDEN",
            "description" : "An optional setting that sets the Authorization header on the refresh URL request.",
            "protected" : false
          }, {
            "name" : "oauth_credentials.client_id",
            "aliases" : [ ],
            "label" : "Optional - OAuth Client ID used if refresh_proxy_url not supplied",
            "kind" : "HIDDEN",
            "description" : "An optional OAuth Client ID used if the refresh proxy URL is not supplied.",
            "protected" : false
          }, {
            "name" : "oauth_credentials.client_secret",
            "aliases" : [ ],
            "label" : "Optional - OAuth Client Secret used if refresh_proxy_url not supplied",
            "kind" : "HIDDEN",
            "description" : "An optional OAuth Client Secret used if the refresh proxy URL is not supplied.",
            "protected" : false
          }, {
            "name" : "view_id",
            "aliases" : [ ],
            "label" : "View ID",
            "placeholder" : "Ex. 198343027",
            "kind" : "STRING",
            "description" : "The ID of the Google Analytics view to retrieve data from.",
            "required" : "true",
            "protected" : false
          }, {
            "name" : "reports",
            "aliases" : [ ],
            "label" : "Reports",
            "placeholder" : "Ex. my_report_definition.json",
            "kind" : "STRING",
            "description" : "The specific reports to retrieve data from in the Google Analytics view.",
            "protected" : false
          }, {
            "name" : "start_date",
            "aliases" : [ ],
            "label" : "Start date",
            "kind" : "DATE_ISO8601",
            "description" : "The start date for the date range of data to retrieve.",
            "protected" : false
          }, {
            "name" : "end_date",
            "aliases" : [ ],
            "label" : "End date",
            "kind" : "DATE_ISO8601",
            "description" : "The end date for the date range of data to retrieve.",
            "protected" : false
          } ],
          "variants" : [ ],
          "commands" : { },
          "matatikaHidden" : false,
          "requires" : [ ],
          "fullDescription" : "Google Analytics is a web analytics service that provides insights into website traffic and user behavior.\n\nGoogle Analytics allows website owners to track and analyze various metrics related to their website's performance, such as the number of visitors, pageviews, bounce rate, and average session duration. It also provides information on the demographics and interests of website visitors, as well as the sources of traffic, including organic search, paid search, social media, and referrals. This data can be used to optimize website content and marketing strategies, as well as to measure the effectiveness of advertising campaigns. Additionally, Google Analytics offers advanced features such as goal tracking, e-commerce tracking, and custom reporting, making it a powerful tool for businesses of all sizes.\n### Prerequisites\nTo obtain the OAuth identity provider authorization endpoint used to create and refresh tokens, you need to create a project in the Google API Console and enable the Google Analytics API. Then, you can create OAuth 2.0 credentials and configure the authorized redirect URIs. The authorization endpoint will be provided in the credentials.\n\nThe OAuth scopes you need to request access to depend on the specific data you want to access in Google Analytics. For example, if you want to read data from a specific view, you will need to request the \"https://www.googleapis.com/auth/analytics.readonly\" scope. You can find a list of available scopes in the Google Analytics API documentation.\n\nTo obtain the Access Token and OAuth Refresh Token, you need to authenticate the user and obtain their consent to access their Google Analytics data. This can be done using the Google Sign-In API or the OAuth 2.0 authorization flow. 
Once the user has granted access, you will receive an Access Token and a Refresh Token that you can use to make API requests.\n\nTo obtain the View ID, you need to log in to your Google Analytics account and navigate to the Admin section. From there, you can select the account, property, and view that you want to access and find the View ID in the View Settings.\n\n## Settings\n\n\n### View ID\n\nThe ID of the Google Analytics view to retrieve data from.\n\n### Reports\n\nThe specific reports to retrieve data from in the Google Analytics view.\n\n### Start date\n\nThe start date for the date range of data to retrieve.\n\n### End date\n\nThe end date for the date range of data to retrieve.",
          "_links" : {
            "self" : {
              "href" : "https://app.matatika.com/api/dataplugins/51bb57f1-9ccc-40ac-b14d-de574c855a63"
            },
            "update dataplugin" : {
              "href" : "https://app.matatika.com/api/workspaces/d9fda316-5510-46ee-8293-a24ba4230457/dataplugins/51bb57f1-9ccc-40ac-b14d-de574c855a63",
              "type" : "PUT"
            }
          }
        }
      },
      "draft" : true,
      "managed" : false,
      "_links" : {
        "self" : {
          "href" : "https://app.matatika.com/api/datacomponents/985215c6-87c5-4633-bd45-7cc236a92f87"
        },
        "update datacomponent" : {
          "href" : "https://app.matatika.com/api/datacomponents/985215c6-87c5-4633-bd45-7cc236a92f87"
        },
        "delete datacomponent" : {
          "href" : "https://app.matatika.com/api/datacomponents/985215c6-87c5-4633-bd45-7cc236a92f87"
        }
      }
    }, {
      "id" : "c8b03642-3482-45f9-8bbd-9d0c53d468ed",
      "created" : "2024-07-15T09:08:49.599991",
      "lastModified" : "2024-07-15T09:11:20.958587",
      "name" : "Warehouse",
      "properties" : {
        "password" : "4SllWL17_E43GjJq5_GhDX041h",
        "default_target_schema" : "analytics",
        "dbname" : "fruzutu",
        "port" : "5432",
        "host" : "sharp-banana.postgres.database.azure.com",
        "user" : "fruzutu"
      },
      "commands" : { },
      "dataPlugin" : "loaders/target-postgres--matatika",
      "_embedded" : {
        "dataplugin" : {
          "id" : "9da27ab6-0f9a-478a-bc82-d119a43d6777",
          "repositoryPath" : "plugins/loaders/target-postgres--matatika.lock",
          "pluginType" : "LOADER",
          "name" : "target-postgres",
          "namespace" : "postgres_transferwise",
          "variant" : "matatika",
          "label" : "Postgres Warehouse",
          "description" : "Postgres Warehouse is a data warehousing solution built on top of the Postgres database management system.\n\nPostgres Warehouse is designed to handle large volumes of data and complex queries, making it an ideal solution for businesses that need to store and analyze large amounts of data. It provides a number of features that are specifically tailored to data warehousing, such as columnar storage, parallel processing, and support for advanced analytics. Additionally, Postgres Warehouse is highly scalable, allowing businesses to easily add more resources as their data needs grow. Overall, Postgres Warehouse is a powerful and flexible data warehousing solution that can help businesses make better decisions by providing them with the insights they need to succeed.\n### Prerequisites\nThe process of obtaining the required settings for connecting to a Postgres Warehouse may vary depending on the specific setup and configuration of the database. However, here are some general ways to obtain each of the required settings:\n\n- User: The user is typically created when the database is set up. You can ask the database administrator or check the database documentation to find out the username.\n- Password: The password is also typically created when the database is set up. You can ask the database administrator or check the database documentation to find out the password.\n- Host: The host is the server where the database is located. You can ask the database administrator or check the database documentation to find out the host name or IP address.\n- Port: The port is the number that the database listens on for incoming connections. The default port for Postgres is 5432, but it may be different depending on the configuration. You can ask the database administrator or check the database documentation to find out the port number.\n- Database Name: The database name is the name of the specific database you want to connect to. 
You can ask the database administrator or check the database documentation to find out the database name.\n- Default Target Schema: The default target schema is the schema that you want to use as the default when connecting to the database. This may be set up by the database administrator or you may need to create it yourself. You can ask the database administrator or check the database documentation to find out the default target schema.",
          "logoUrl" : "https://app.matatika.com/assets/logos/loaders/postgres.png",
          "hidden" : false,
          "docs" : "https://www.matatika.com/data-details/target-postgres/",
          "pipUrl" : "git+https://github.com/Matatika/[email protected]",
          "repo" : "git+https://github.com/Matatika/[email protected]",
          "capabilities" : [ ],
          "select" : [ ],
          "update" : { },
          "vars" : { },
          "settings" : [ {
            "name" : "user",
            "aliases" : [ "username" ],
            "label" : "User",
            "kind" : "STRING",
            "description" : "The username used to connect to the Postgres Warehouse.",
            "required" : "true",
            "protected" : false
          }, {
            "name" : "password",
            "aliases" : [ ],
            "label" : "Password",
            "kind" : "PASSWORD",
            "description" : "The password used to authenticate the user.",
            "required" : "true",
            "protected" : false
          }, {
            "name" : "host",
            "aliases" : [ "address" ],
            "label" : "Host",
            "kind" : "STRING",
            "description" : "The hostname or IP address of the Postgres Warehouse server.",
            "required" : "true",
            "protected" : false
          }, {
            "name" : "port",
            "aliases" : [ ],
            "label" : "Port",
            "value" : "5432",
            "kind" : "INTEGER",
            "description" : "The port number used to connect to the Postgres Warehouse server.",
            "required" : "true",
            "protected" : false
          }, {
            "name" : "dbname",
            "aliases" : [ "database" ],
            "label" : "Database Name",
            "kind" : "STRING",
            "description" : "The name of the database to connect to.",
            "required" : "true",
            "protected" : false
          }, {
            "name" : "default_target_schema",
            "aliases" : [ ],
            "label" : "Default Target Schema",
            "value" : "analytics",
            "kind" : "STRING",
            "description" : "The default schema to use when writing data to the Postgres Warehouse.",
            "required" : "true",
            "protected" : false
          }, {
            "name" : "ssl",
            "aliases" : [ ],
            "label" : "SSL",
            "value" : "false",
            "kind" : "HIDDEN",
            "description" : "Whether or not to use SSL encryption when connecting to the Postgres Warehouse.",
            "protected" : false,
            "value_post_processor" : "STRINGIFY"
          }, {
            "name" : "batch_size_rows",
            "aliases" : [ ],
            "label" : "Batch Size Rows",
            "value" : "100000",
            "kind" : "INTEGER",
            "description" : "The number of rows to write to the Postgres Warehouse in each batch.",
            "protected" : false
          }, {
            "name" : "underscore_camel_case_fields",
            "aliases" : [ ],
            "label" : "Underscore Camel Case Fields",
            "value" : "true",
            "kind" : "HIDDEN",
            "description" : "Whether or not to convert field names from camel case to underscore-separated format.",
            "protected" : false
          }, {
            "name" : "flush_all_streams",
            "aliases" : [ ],
            "label" : "Flush All Streams",
            "value" : "false",
            "kind" : "HIDDEN",
            "description" : "Whether or not to flush all streams to the Postgres Warehouse before closing the connection.",
            "protected" : false
          }, {
            "name" : "parallelism",
            "aliases" : [ ],
            "label" : "Parallelism",
            "value" : "0",
            "kind" : "HIDDEN",
            "description" : "The number of threads to use when writing data to the Postgres Warehouse.",
            "protected" : false
          }, {
            "name" : "parallelism_max",
            "aliases" : [ ],
            "label" : "Max Parallelism",
            "value" : "16",
            "kind" : "HIDDEN",
            "description" : "The maximum number of threads to use when writing data to the Postgres Warehouse.",
            "protected" : false
          }, {
            "name" : "default_target_schema_select_permission",
            "aliases" : [ ],
            "label" : "Default Target Schema Select Permission",
            "kind" : "HIDDEN",
            "description" : "The permission level required to select data from the default target schema.",
            "protected" : false
          }, {
            "name" : "schema_mapping",
            "aliases" : [ ],
            "label" : "Schema Mapping",
            "kind" : "HIDDEN",
            "description" : "A mapping of source schema names to target schema names.",
            "protected" : false
          }, {
            "name" : "add_metadata_columns",
            "aliases" : [ ],
            "label" : "Add Metadata Columns",
            "value" : "true",
            "kind" : "HIDDEN",
            "description" : "Whether or not to add metadata columns to the target table.",
            "protected" : false
          }, {
            "name" : "hard_delete",
            "aliases" : [ ],
            "label" : "Hard Delete",
            "value" : "false",
            "kind" : "HIDDEN",
            "description" : "Whether or not to perform hard deletes when deleting data from the Postgres Warehouse.",
            "protected" : false
          }, {
            "name" : "data_flattening_max_level",
            "aliases" : [ ],
            "label" : "Data Flattening Max Level",
            "value" : "10",
            "kind" : "HIDDEN",
            "description" : "The maximum level of nested data structures to flatten when writing data to the Postgres Warehouse.",
            "protected" : false
          }, {
            "name" : "primary_key_required",
            "aliases" : [ ],
            "label" : "Primary Key Required",
            "value" : "false",
            "kind" : "BOOLEAN",
            "description" : "Whether or not a primary key is required for the target table.",
            "protected" : false
          }, {
            "name" : "validate_records",
            "aliases" : [ ],
            "label" : "Validate Records",
            "value" : "false",
            "kind" : "BOOLEAN",
            "description" : "Whether or not to validate records before writing them to the Postgres Warehouse.",
            "protected" : false
          }, {
            "name" : "temp_dir",
            "aliases" : [ ],
            "label" : "Temporary Directory",
            "kind" : "HIDDEN",
            "description" : "The directory to use for temporary files when writing data to the Postgres Warehouse.",
            "protected" : false
          } ],
          "variants" : [ ],
          "commands" : { },
          "matatikaHidden" : false,
          "requires" : [ ],
          "fullDescription" : "Postgres Warehouse is a data warehousing solution built on top of the Postgres database management system.\n\nPostgres Warehouse is designed to handle large volumes of data and complex queries, making it an ideal solution for businesses that need to store and analyze large amounts of data. It provides a number of features that are specifically tailored to data warehousing, such as columnar storage, parallel processing, and support for advanced analytics. Additionally, Postgres Warehouse is highly scalable, allowing businesses to easily add more resources as their data needs grow. Overall, Postgres Warehouse is a powerful and flexible data warehousing solution that can help businesses make better decisions by providing them with the insights they need to succeed.\n### Prerequisites\nThe process of obtaining the required settings for connecting to a Postgres Warehouse may vary depending on the specific setup and configuration of the database. However, here are some general ways to obtain each of the required settings:\n\n- User: The user is typically created when the database is set up. You can ask the database administrator or check the database documentation to find out the username.\n- Password: The password is also typically created when the database is set up. You can ask the database administrator or check the database documentation to find out the password.\n- Host: The host is the server where the database is located. You can ask the database administrator or check the database documentation to find out the host name or IP address.\n- Port: The port is the number that the database listens on for incoming connections. The default port for Postgres is 5432, but it may be different depending on the configuration. You can ask the database administrator or check the database documentation to find out the port number.\n- Database Name: The database name is the name of the specific database you want to connect to. 
You can ask the database administrator or check the database documentation to find out the database name.\n- Default Target Schema: The default target schema is the schema that you want to use as the default when connecting to the database. This may be set up by the database administrator or you may need to create it yourself. You can ask the database administrator or check the database documentation to find out the default target schema.\n\n## Settings\n\n\n### User\n\nThe username used to connect to the Postgres Warehouse.\n\n### Password\n\nThe password used to authenticate the user.\n\n### Host\n\nThe hostname or IP address of the Postgres Warehouse server.\n\n### Port\n\nThe port number used to connect to the Postgres Warehouse server.\n\n### Database Name\n\nThe name of the database to connect to.\n\n### Default Target Schema\n\nThe default schema to use when writing data to the Postgres Warehouse.\n\n### Batch Size Rows\n\nThe number of rows to write to the Postgres Warehouse in each batch.\n\n### Primary Key Required\n\nWhether or not a primary key is required for the target table.\n\n### Validate Records\n\nWhether or not to validate records before writing them to the Postgres Warehouse.",
          "_links" : {
            "self" : {
              "href" : "https://app.matatika.com/api/dataplugins/9da27ab6-0f9a-478a-bc82-d119a43d6777"
            },
            "update dataplugin" : {
              "href" : "https://app.matatika.com/api/workspaces/d9fda316-5510-46ee-8293-a24ba4230457/dataplugins/9da27ab6-0f9a-478a-bc82-d119a43d6777",
              "type" : "PUT"
            }
          }
        }
      },
      "draft" : false,
      "managed" : true,
      "_links" : {
        "self" : {
          "href" : "https://app.matatika.com/api/datacomponents/c8b03642-3482-45f9-8bbd-9d0c53d468ed"
        },
        "update datacomponent" : {
          "href" : "https://app.matatika.com/api/datacomponents/c8b03642-3482-45f9-8bbd-9d0c53d468ed"
        },
        "delete datacomponent" : {
          "href" : "https://app.matatika.com/api/datacomponents/c8b03642-3482-45f9-8bbd-9d0c53d468ed"
        }
      }
    }, {
      "id" : "3bf944f2-ee7e-4326-a7ce-35df580478f9",
      "created" : "2024-07-15T09:10:47.73595",
      "lastModified" : "2024-07-15T09:10:47.735961",
      "name" : "dbt",
      "properties" : { },
      "commands" : {
        "compile" : {
          "args" : "compile",
          "description" : "Generates executable SQL from source model, test, and analysis files. Compiled SQL files are written to the target/ directory."
        },
        "seed" : {
          "args" : "seed",
          "description" : "Load data from csv files into your data warehouse."
        },
        "test" : {
          "args" : "test",
          "description" : "Runs tests on data in deployed models."
        },
        "docs-generate" : {
          "args" : "docs generate",
          "description" : "Generate documentation artifacts for your project."
        },
        "deps" : {
          "args" : "deps",
          "description" : "Pull the most recent version of the dependencies listed in packages.yml"
        },
        "run" : {
          "args" : "run",
          "description" : "Compile SQL and execute against the current target database."
        },
        "clean" : {
          "args" : "clean",
          "description" : "Delete all folders in the clean-targets list (usually the dbt_modules and target directories.)"
        },
        "snapshot" : {
          "args" : "snapshot",
          "description" : "Execute snapshots defined in your project."
        }
      },
      "dataPlugin" : "transformers/dbt--dbt-labs",
      "_embedded" : {
        "dataplugin" : {
          "id" : "422fa2e4-b419-4c24-af96-d49a7c3a2f35",
          "repositoryPath" : "plugins/transformers/dbt--dbt-labs.lock",
          "pluginType" : "TRANSFORMER",
          "name" : "dbt",
          "namespace" : "dbt",
          "variant" : "dbt-labs",
          "label" : "dbt",
          "description" : " Power your project transformations with dbt™, a SQL-first transformation tool that enables analytics engineers to develop transformations with code.\n\n***Version Control and CI/CD***\n\nUse Matatika to deploy and promote changes between dev, UAT, and production environments.\n\n***Test and Document***\n\nUse Matatika to develop and test every model prior to production release, and share dynamically generated documentation with all stakeholders.\n\n***Develop***\n\nWrite modular data transformations in .sql – Matatika together with dbt handles the chore of dependency management. ",
          "logoUrl" : "https://app.matatika.com/assets/images/transformer/dbt.png",
          "hidden" : false,
          "docs" : "https://www.matatika.com/data-details/dbt/",
          "pipUrl" : "dbt-core~=1.3.0 dbt-postgres~=1.3.0 dbt-snowflake~=1.3.0\n",
          "repo" : "https://github.com/dbt-labs/dbt-core",
          "capabilities" : [ ],
          "select" : [ ],
          "update" : { },
          "vars" : { },
          "settings" : [ {
            "name" : "project_dir",
            "aliases" : [ ],
            "value" : "$MELTANO_PROJECT_ROOT/transform",
            "kind" : "STRING",
            "protected" : false
          }, {
            "name" : "profiles_dir",
            "aliases" : [ ],
            "value" : "$MELTANO_PROJECT_ROOT/transform/profile",
            "kind" : "STRING",
            "env" : "DBT_PROFILES_DIR",
            "protected" : false
          }, {
            "name" : "target",
            "aliases" : [ ],
            "value" : "$MELTANO_LOAD__DIALECT",
            "kind" : "STRING",
            "protected" : false
          }, {
            "name" : "source_schema",
            "aliases" : [ ],
            "value" : "$MELTANO_LOAD__TARGET_SCHEMA",
            "kind" : "STRING",
            "protected" : false
          }, {
            "name" : "target_schema",
            "aliases" : [ ],
            "value" : "analytics",
            "kind" : "STRING",
            "protected" : false
          }, {
            "name" : "models",
            "aliases" : [ ],
            "value" : "$MELTANO_TRANSFORM__PACKAGE_NAME $MELTANO_EXTRACTOR_NAMESPACE my_meltano_project",
            "kind" : "STRING",
            "protected" : false
          } ],
          "variants" : [ ],
          "commands" : {
            "compile" : {
              "args" : "compile",
              "description" : "Generates executable SQL from source model, test, and analysis files. Compiled SQL files are written to the target/ directory."
            },
            "seed" : {
              "args" : "seed",
              "description" : "Load data from csv files into your data warehouse."
            },
            "test" : {
              "args" : "test",
              "description" : "Runs tests on data in deployed models."
            },
            "docs-generate" : {
              "args" : "docs generate",
              "description" : "Generate documentation artifacts for your project."
            },
            "deps" : {
              "args" : "deps",
              "description" : "Pull the most recent version of the dependencies listed in packages.yml"
            },
            "run" : {
              "args" : "run",
              "description" : "Compile SQL and execute against the current target database."
            },
            "clean" : {
              "args" : "clean",
              "description" : "Delete all folders in the clean-targets list (usually the dbt_modules and target directories.)"
            },
            "snapshot" : {
              "args" : "snapshot",
              "description" : "Execute snapshots defined in your project."
            }
          },
          "matatikaHidden" : false,
          "requires" : [ {
            "id" : "732af1ba-ad83-4fd0-ba19-4b07c02e1bba",
            "repositoryPath" : "plugins/files/files-dbt--matatika.lock",
            "pluginType" : "FILE",
            "name" : "files-dbt",
            "namespace" : "dbt",
            "variant" : "matatika",
            "label" : "files-dbt",
            "description" : " Files dbt is a file bundle that automatically configures your project to run transforms with dbt.\nThe bundle includes template project configuration:\n\n- transform/models (directory)\n- transform/profile/profiles.yml\n- transform/dbt_project.yml\n- transform/.gitignore\n",
            "hidden" : false,
            "pipUrl" : "git+https://github.com/Matatika/[email protected]",
            "repo" : "https://github.com/Matatika/files-dbt",
            "capabilities" : [ ],
            "select" : [ ],
            "update" : {
              "transform/profile/profiles.yml" : "true"
            },
            "vars" : { },
            "settings" : [ ],
            "variants" : [ ],
            "commands" : { },
            "matatikaHidden" : false,
            "requires" : [ ],
            "fullDescription" : " Files dbt is a file bundle that automatically configures your project to run transforms with dbt.\nThe bundle includes template project configuration:\n\n- transform/models (directory)\n- transform/profile/profiles.yml\n- transform/dbt_project.yml\n- transform/.gitignore\n"
          } ],
          "fullDescription" : " Power your project transformations with dbt™, a SQL-first transformation tool that enables analytics engineers to develop transformations with code.\n\n***Version Control and CI/CD***\n\nUse Matatika to deploy and promote changes between dev, UAT, and production environments.\n\n***Test and Document***\n\nUse Matatika to develop and test every model prior to production release, and share dynamically generated documentation with all stakeholders.\n\n***Develop***\n\nWrite modular data transformations in .sql – Matatika together with dbt handles the chore of dependency management. ",
          "_links" : {
            "self" : {
              "href" : "https://app.matatika.com/api/dataplugins/422fa2e4-b419-4c24-af96-d49a7c3a2f35"
            },
            "update dataplugin" : {
              "href" : "https://app.matatika.com/api/workspaces/d9fda316-5510-46ee-8293-a24ba4230457/dataplugins/422fa2e4-b419-4c24-af96-d49a7c3a2f35",
              "type" : "PUT"
            }
          }
        }
      },
      "draft" : false,
      "managed" : false,
      "_links" : {
        "self" : {
          "href" : "https://app.matatika.com/api/datacomponents/3bf944f2-ee7e-4326-a7ce-35df580478f9"
        },
        "update datacomponent" : {
          "href" : "https://app.matatika.com/api/datacomponents/3bf944f2-ee7e-4326-a7ce-35df580478f9"
        },
        "delete datacomponent" : {
          "href" : "https://app.matatika.com/api/datacomponents/3bf944f2-ee7e-4326-a7ce-35df580478f9"
        }
      }
    } ],
    "latest job" : {
      "id" : "e6f969cb-8197-4d33-a588-b97ba78d904f",
      "created" : "2024-07-15T09:10:50.289063",
      "type" : "WORKSPACE_CONFIG",
      "maxAttempts" : 0,
      "attempt" : 0,
      "commitId" : "2afd2a5d29ec4ada81ebd49a24ce003acab803f2",
      "exitCode" : 0,
      "status" : "COMPLETE",
      "startTime" : "2024-07-15T09:11:02.609",
      "endTime" : "2024-07-15T09:11:21.687",
      "_embedded" : {
        "pipeline" : {
          "id" : "37da82fd-da13-4df1-acd1-eb47fb8cc074",
          "status" : "READY",
          "name" : "SIT-generated pipeline [2024-07-15T10:10:49.295607] (updated)",
          "timeout" : 0,
          "maxRetries" : 0,
          "created" : "2024-07-15T09:10:49.554622",
          "lastModified" : "2024-07-15T09:10:49.554622",
          "properties" : {
            "tap-google-analytics.oauth_credentials.client_secret" : "client_secret",
            "tap-google-analytics.view_id" : "1234567890",
            "tap-google-analytics.reports" : "reports",
            "tap-google-analytics.oauth_credentials.refresh_token" : "refresh_token",
            "tap-google-analytics.start_date" : "2024-06-15T10:09:16.254923+01:00",
            "tap-google-analytics.end_date" : "2024-07-15T10:09:16.255042+01:00",
            "tap-google-analytics.oauth_credentials.access_token" : "access_token",
            "tap-google-analytics.oauth_credentials.client_id" : "client_id"
          },
          "dataComponents" : [ "tap-google-analytics", "Warehouse", "dbt" ],
          "actions" : [ ],
          "triggeredBy" : [ ]
        },
        "profile" : {
          "id" : "auth0|5eb0327cbfd7490bff55feeb",
          "name" : "[email protected]",
          "handle" : "@sit+prod",
          "email" : "[email protected]"
        }
      },
      "_links" : {
        "self" : {
          "href" : "https://app.matatika.com/api/jobs/e6f969cb-8197-4d33-a588-b97ba78d904f"
        },
        "delete job" : {
          "href" : "https://app.matatika.com/api/jobs/e6f969cb-8197-4d33-a588-b97ba78d904f",
          "type" : "DELETE"
        },
        "logs" : {
          "href" : "https://app.matatika.com/api/jobs/e6f969cb-8197-4d33-a588-b97ba78d904f/logs?sequence=0",
          "type" : "GET"
        }
      }
    }
  },
  "_links" : {
    "update pipeline" : {
      "href" : "https://app.matatika.com/api/workspaces/d9fda316-5510-46ee-8293-a24ba4230457/pipelines/37da82fd-da13-4df1-acd1-eb47fb8cc074",
      "type" : "PUT"
    },
    "delete pipeline" : {
      "href" : "https://app.matatika.com/api/pipelines/37da82fd-da13-4df1-acd1-eb47fb8cc074",
      "type" : "DELETE"
    },
    "draft pipeline" : {
      "href" : "https://app.matatika.com/api/workspaces/d9fda316-5510-46ee-8293-a24ba4230457/pipelines/37da82fd-da13-4df1-acd1-eb47fb8cc074/draft",
      "type" : "PUT"
    },
    "self" : {
      "href" : "https://app.matatika.com/api/pipelines/37da82fd-da13-4df1-acd1-eb47fb8cc074"
    },
    "environment" : {
      "href" : "https://app.matatika.com/api/pipelines/37da82fd-da13-4df1-acd1-eb47fb8cc074/environment"
    },
    "jobs" : {
      "href" : "https://app.matatika.com/api/pipelines/37da82fd-da13-4df1-acd1-eb47fb8cc074/jobs",
      "type" : "GET"
    },
    "metrics" : {
      "href" : "https://app.matatika.com/api/pipelines/37da82fd-da13-4df1-acd1-eb47fb8cc074/metrics"
    },
    "add subscription" : {
      "href" : "https://app.matatika.com/api/pipelines/37da82fd-da13-4df1-acd1-eb47fb8cc074/subscriptions"
    },
    "verify pipeline" : {
      "href" : "https://app.matatika.com/api/pipelines/37da82fd-da13-4df1-acd1-eb47fb8cc074/verification",
      "type" : "POST"
    },
    "create job" : {
      "href" : "https://app.matatika.com/api/pipelines/37da82fd-da13-4df1-acd1-eb47fb8cc074/jobs",
      "type" : "POST"
    },
    "latest job" : {
      "href" : "https://app.matatika.com/api/jobs/e6f969cb-8197-4d33-a588-b97ba78d904f"
    }
  }
}
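The pipeline's HAL links above expose the job lifecycle: `create job` (`POST`) launches a run, and `latest job` returns its current state. A client typically polls until the job reaches a terminal status. A minimal sketch of that loop, with `fetch_job` stubbed in place of the real HTTP call to the `latest job` href (the status sequence here is illustrative; `COMPLETE` and `ERROR` are the terminal statuses shown in this document):

```python
import time

# Stub standing in for GET /api/jobs/{job-id}; a real client would call the
# "latest job" href with an Authorization header. Statuses are illustrative.
_statuses = iter(["RUNNING", "RUNNING", "COMPLETE"])

def fetch_job(job_id):
    return {"id": job_id, "status": next(_statuses)}

def wait_for_job(job_id, poll_seconds=0, terminal=("COMPLETE", "ERROR")):
    """Poll a job until it reaches a terminal status and return it."""
    while True:
        job = fetch_job(job_id)
        if job["status"] in terminal:
            return job
        time.sleep(poll_seconds)

job = wait_for_job("e6f969cb-8197-4d33-a588-b97ba78d904f")
```

In a real client, `poll_seconds` would be a few seconds or more, and a cap on total attempts would guard against jobs that never terminate.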

Initialise a pipeline in a workspace

POST

/api/workspaces/{workspace-id}/pipelines

Initialises a new pipeline in the workspace {workspace-id}.

Prerequisites

  • Workspace {workspace-id} must exist

Request

Example Snippets

cURL

curl -H "Authorization: Bearer $ACCESS_TOKEN" 'https://app.matatika.com/api/workspaces/d9fda316-5510-46ee-8293-a24ba4230457/pipelines' -i -X POST \
    -H 'Accept: application/json, application/javascript, text/javascript, text/json' \
    -H 'Content-Type: application/json'

Python (requests)

import requests

url = "https://app.matatika.com/api/workspaces/d9fda316-5510-46ee-8293-a24ba4230457/pipelines"

headers = {
  'Authorization': f"Bearer {ACCESS_TOKEN}"
}

response = requests.request("POST", url, headers=headers)

print(response.text.encode('utf8'))

Response

200 OK

Pipeline with HAL links.

{
  "id" : "3e11d680-bed2-4900-a5eb-27ec3fd6844e",
  "status" : "PROVISIONING",
  "timeout" : 0,
  "maxRetries" : 0,
  "created" : "2024-07-15T09:09:16.158108902",
  "lastModified" : "2024-07-15T09:09:16.158109202",
  "properties" : { },
  "dataComponents" : [ ],
  "actions" : [ ],
  "triggeredBy" : [ ],
  "_links" : {
    "create pipeline" : {
      "href" : "https://app.matatika.com/api/workspaces/d9fda316-5510-46ee-8293-a24ba4230457/pipelines/3e11d680-bed2-4900-a5eb-27ec3fd6844e",
      "type" : "PUT"
    },
    "draft pipeline" : {
      "href" : "https://app.matatika.com/api/workspaces/d9fda316-5510-46ee-8293-a24ba4230457/pipelines/3e11d680-bed2-4900-a5eb-27ec3fd6844e/draft",
      "type" : "PUT"
    },
    "validate pipeline" : {
      "href" : "https://app.matatika.com/api/workspaces/d9fda316-5510-46ee-8293-a24ba4230457/pipelines/validation",
      "type" : "POST"
    }
  }
}
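The response's `_links` object drives the next step: the client reads the `create pipeline` link and issues a `PUT` to its `href`. A minimal sketch of that link-following pattern (the HAL document below is abridged from the response above):

```python
# HAL document abridged from the initialise-pipeline response above.
hal = {
    "id": "3e11d680-bed2-4900-a5eb-27ec3fd6844e",
    "status": "PROVISIONING",
    "_links": {
        "create pipeline": {
            "href": "https://app.matatika.com/api/workspaces/d9fda316-5510-46ee-8293-a24ba4230457/pipelines/3e11d680-bed2-4900-a5eb-27ec3fd6844e",
            "type": "PUT",
        },
    },
}

def resolve_link(document, rel):
    """Return (href, method) for a HAL link relation, defaulting to GET."""
    link = document["_links"][rel]
    return link["href"], link.get("type", "GET")

href, method = resolve_link(hal, "create pipeline")
# A client would now send the pipeline resource to href with the given method.
```

Resolving links at runtime, rather than hard-coding URL templates, keeps clients working if resource paths change.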

Create or update a pipeline in a workspace

PUT

/api/workspaces/{workspace-id}/pipelines/{pipeline-id}

Creates or updates the pipeline {pipeline-id} in the workspace {workspace-id}.

Prerequisites

  • Workspace {workspace-id} must exist

Request

Body

Pipeline resource.

{
  "name" : "SIT-generated pipeline [2024-07-15T10:09:16.246704]",
  "dataComponents" : [ "extractors/tap-google-analytics", "Warehouse", "dbt" ],
  "schedule" : "0 0 0 25 12 ?",
  "properties" : {
    "tap-google-analytics.view_id" : "1234567890",
    "tap-google-analytics.oauth_credentials.client_secret" : "client_secret",
    "tap-google-analytics.reports" : "reports",
    "tap-google-analytics.oauth_credentials.refresh_token" : "refresh_token",
    "tap-google-analytics.start_date" : "2024-06-15T10:09:16.254923+01:00",
    "tap-google-analytics.end_date" : "2024-07-15T10:09:16.255042+01:00",
    "tap-google-analytics.oauth_credentials.client_id" : "client_id",
    "tap-google-analytics.oauth_credentials.access_token" : "access_token"
  }
}
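Note how the `properties` keys are namespaced by datacomponent name, with further dots for nested settings such as `oauth_credentials`. A small illustrative helper (not part of the API) that expands these flat keys into nested configuration:

```python
def nest_properties(flat):
    """Expand dotted property keys (e.g. 'tap-google-analytics.view_id')
    into a nested dict, grouping settings per datacomponent."""
    nested = {}
    for key, value in flat.items():
        parts = key.split(".")
        node = nested
        for part in parts[:-1]:
            node = node.setdefault(part, {})
        node[parts[-1]] = value
    return nested

config = nest_properties({
    "tap-google-analytics.view_id": "1234567890",
    "tap-google-analytics.oauth_credentials.client_id": "client_id",
})
# config["tap-google-analytics"]["oauth_credentials"]["client_id"] == "client_id"
```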

Example Snippets

cURL

curl -H "Authorization: Bearer $ACCESS_TOKEN" 'https://app.matatika.com/api/workspaces/d9fda316-5510-46ee-8293-a24ba4230457/pipelines/3e11d680-bed2-4900-a5eb-27ec3fd6844e' -i -X PUT \
    -H 'Accept: application/json, application/javascript, text/javascript, text/json' \
    -H 'Content-Type: application/json' \
    -d '{
  "name" : "SIT-generated pipeline [2024-07-15T10:09:16.246704]",
  "dataComponents" : [ "extractors/tap-google-analytics", "Warehouse", "dbt" ],
  "schedule" : "0 0 0 25 12 ?",
  "properties" : {
    "tap-google-analytics.view_id" : "1234567890",
    "tap-google-analytics.oauth_credentials.client_secret" : "client_secret",
    "tap-google-analytics.reports" : "reports",
    "tap-google-analytics.oauth_credentials.refresh_token" : "refresh_token",
    "tap-google-analytics.start_date" : "2024-06-15T10:09:16.254923+01:00",
    "tap-google-analytics.end_date" : "2024-07-15T10:09:16.255042+01:00",
    "tap-google-analytics.oauth_credentials.client_id" : "client_id",
    "tap-google-analytics.oauth_credentials.access_token" : "access_token"
  }
}'

Python (requests)

import requests

url = "https://app.matatika.com/api/workspaces/d9fda316-5510-46ee-8293-a24ba4230457/pipelines/3e11d680-bed2-4900-a5eb-27ec3fd6844e"

data = {
  "name" : "SIT-generated pipeline [2024-07-15T10:09:16.246704]",
  "dataComponents" : [ "extractors/tap-google-analytics", "Warehouse", "dbt" ],
  "schedule" : "0 0 0 25 12 ?",
  "properties" : {
    "tap-google-analytics.view_id" : "1234567890",
    "tap-google-analytics.oauth_credentials.client_secret" : "client_secret",
    "tap-google-analytics.reports" : "reports",
    "tap-google-analytics.oauth_credentials.refresh_token" : "refresh_token",
    "tap-google-analytics.start_date" : "2024-06-15T10:09:16.254923+01:00",
    "tap-google-analytics.end_date" : "2024-07-15T10:09:16.255042+01:00",
    "tap-google-analytics.oauth_credentials.client_id" : "client_id",
    "tap-google-analytics.oauth_credentials.access_token" : "access_token"
  }
}
headers = {
  'Authorization': f"Bearer {ACCESS_TOKEN}"
}

response = requests.request("PUT", url, headers=headers, json=data)

print(response.text.encode('utf8'))

Response

200 OK / 201 Created

Pipeline with HAL links.

{
  "id" : "3e11d680-bed2-4900-a5eb-27ec3fd6844e",
  "status" : "PROVISIONING",
  "name" : "SIT-generated pipeline [2024-07-15T10:09:16.246704]",
  "schedule" : "0 0 0 25 12 ?",
  "timeout" : 0,
  "maxRetries" : 0,
  "created" : "2024-07-15T09:09:16.617741",
  "lastModified" : "2024-07-15T09:09:16.617741",
  "properties" : {
    "tap-google-analytics.view_id" : "1234567890",
    "tap-google-analytics.oauth_credentials.client_secret" : "client_secret",
    "tap-google-analytics.reports" : "reports",
    "tap-google-analytics.oauth_credentials.refresh_token" : "refresh_token",
    "tap-google-analytics.start_date" : "2024-06-15T10:09:16.254923+01:00",
    "tap-google-analytics.end_date" : "2024-07-15T10:09:16.255042+01:00",
    "tap-google-analytics.oauth_credentials.client_id" : "client_id",
    "tap-google-analytics.oauth_credentials.access_token" : "access_token"
  },
  "dataComponents" : [ "tap-google-analytics", "Warehouse", "dbt" ],
  "actions" : [ ],
  "triggeredBy" : [ ],
  "_embedded" : {
    "dataComponents" : [ {
      "id" : "985215c6-87c5-4633-bd45-7cc236a92f87",
      "created" : "2024-07-15T09:09:16.453622",
      "lastModified" : "2024-07-15T09:09:16.453623",
      "name" : "tap-google-analytics",
      "properties" : { },
      "commands" : { },
      "dataPlugin" : "extractors/tap-google-analytics--matatika",
      "_embedded" : {
        "dataplugin" : {
          "id" : "51bb57f1-9ccc-40ac-b14d-de574c855a63",
          "pluginType" : "EXTRACTOR",
          "name" : "tap-google-analytics",
          "namespace" : "tap_google_analytics",
          "variant" : "matatika",
          "label" : "Google Analytics",
          "description" : "Google Analytics is a web analytics service that provides insights into website traffic and user behavior.\n\nGoogle Analytics allows website owners to track and analyze various metrics related to their website's performance, such as the number of visitors, pageviews, bounce rate, and average session duration. It also provides information on the demographics and interests of website visitors, as well as the sources of traffic, including organic search, paid search, social media, and referrals. This data can be used to optimize website content and marketing strategies, as well as to measure the effectiveness of advertising campaigns. Additionally, Google Analytics offers advanced features such as goal tracking, e-commerce tracking, and custom reporting, making it a powerful tool for businesses of all sizes.\n### Prerequisites\nTo obtain the OAuth identity provider authorization endpoint used to create and refresh tokens, you need to create a project in the Google API Console and enable the Google Analytics API. Then, you can create OAuth 2.0 credentials and configure the authorized redirect URIs. The authorization endpoint will be provided in the credentials.\n\nThe OAuth scopes you need to request access to depend on the specific data you want to access in Google Analytics. For example, if you want to read data from a specific view, you will need to request the \"https://www.googleapis.com/auth/analytics.readonly\" scope. You can find a list of available scopes in the Google Analytics API documentation.\n\nTo obtain the Access Token and OAuth Refresh Token, you need to authenticate the user and obtain their consent to access their Google Analytics data. This can be done using the Google Sign-In API or the OAuth 2.0 authorization flow. 
Once the user has granted access, you will receive an Access Token and a Refresh Token that you can use to make API requests.\n\nTo obtain the View ID, you need to log in to your Google Analytics account and navigate to the Admin section. From there, you can select the account, property, and view that you want to access and find the View ID in the View Settings.",
          "logoUrl" : "/assets/images/datasource/tap-google-analytics.svg",
          "hidden" : false,
          "docs" : "https://www.matatika.com/docs/instant-insights/tap-google-analytics/",
          "pipUrl" : "git+https://github.com/Matatika/[email protected]",
          "repo" : "https://github.com/Matatika/tap-google-analytics",
          "capabilities" : [ "CATALOG", "DISCOVER", "STATE" ],
          "select" : [ ],
          "update" : { },
          "vars" : { },
          "settings" : [ {
            "name" : "oauth_credentials.authorization_url",
            "aliases" : [ ],
            "label" : "OAuth identity provider authorization endpoint used create and refresh tokens",
            "value" : "https://oauth2.googleapis.com/token",
            "kind" : "HIDDEN",
            "description" : "The endpoint used to create and refresh OAuth tokens.",
            "required" : "true",
            "protected" : false
          }, {
            "name" : "oauth_credentials.scope",
            "aliases" : [ ],
            "label" : "OAuth scopes we need to request access to",
            "value" : "profile email https://www.googleapis.com/auth/analytics.readonly",
            "kind" : "HIDDEN",
            "description" : "The specific scopes we need to request access to in order to connect to Google Analytics.",
            "required" : "true",
            "protected" : false
          }, {
            "name" : "oauth_credentials.access_token",
            "aliases" : [ ],
            "label" : "Access Token",
            "kind" : "HIDDEN",
            "description" : "The token used to authenticate and authorize API requests.",
            "required" : "true",
            "protected" : false
          }, {
            "name" : "oauth_credentials.refresh_token",
            "aliases" : [ ],
            "label" : "OAuth Refresh Token",
            "kind" : "HIDDEN",
            "description" : "The token used to refresh the access token when it expires.",
            "required" : "true",
            "protected" : false
          }, {
            "name" : "oauth_credentials.refresh_proxy_url",
            "aliases" : [ ],
            "label" : "Optional - will be called with 'oauth_credentials.refresh_token' to refresh the access token",
            "kind" : "HIDDEN",
            "description" : "An optional function that will be called to refresh the access token using the refresh token.",
            "protected" : false
          }, {
            "name" : "oauth_credentials.refresh_proxy_url_auth",
            "aliases" : [ ],
            "label" : "Optional - Sets Authorization header on 'oauth_credentials.refresh_url' request",
            "kind" : "HIDDEN",
            "description" : "An optional setting that sets the Authorization header on the refresh URL request.",
            "protected" : false
          }, {
            "name" : "oauth_credentials.client_id",
            "aliases" : [ ],
            "label" : "Optional - OAuth Client ID used if refresh_proxy_url not supplied",
            "kind" : "HIDDEN",
            "description" : "An optional OAuth Client ID used if the refresh proxy URL is not supplied.",
            "protected" : false
          }, {
            "name" : "oauth_credentials.client_secret",
            "aliases" : [ ],
            "label" : "Optional - OAuth Client Secret used if refresh_proxy_url not supplied",
            "kind" : "HIDDEN",
            "description" : "An optional OAuth Client Secret used if the refresh proxy URL is not supplied.",
            "protected" : false
          }, {
            "name" : "view_id",
            "aliases" : [ ],
            "label" : "View ID",
            "placeholder" : "Ex. 198343027",
            "kind" : "STRING",
            "description" : "The ID of the Google Analytics view to retrieve data from.",
            "required" : "true",
            "protected" : false
          }, {
            "name" : "reports",
            "aliases" : [ ],
            "label" : "Reports",
            "placeholder" : "Ex. my_report_definition.json",
            "kind" : "STRING",
            "description" : "The specific reports to retrieve data from in the Google Analytics view.",
            "protected" : false
          }, {
            "name" : "start_date",
            "aliases" : [ ],
            "label" : "Start date",
            "kind" : "DATE_ISO8601",
            "description" : "The start date for the date range of data to retrieve.",
            "protected" : false
          }, {
            "name" : "end_date",
            "aliases" : [ ],
            "label" : "End date",
            "kind" : "DATE_ISO8601",
            "description" : "The end date for the date range of data to retrieve.",
            "protected" : false
          } ],
          "variants" : [ ],
          "commands" : { },
          "matatikaHidden" : false,
          "requires" : [ ],
          "fullDescription" : "Google Analytics is a web analytics service that provides insights into website traffic and user behavior.\n\nGoogle Analytics allows website owners to track and analyze various metrics related to their website's performance, such as the number of visitors, pageviews, bounce rate, and average session duration. It also provides information on the demographics and interests of website visitors, as well as the sources of traffic, including organic search, paid search, social media, and referrals. This data can be used to optimize website content and marketing strategies, as well as to measure the effectiveness of advertising campaigns. Additionally, Google Analytics offers advanced features such as goal tracking, e-commerce tracking, and custom reporting, making it a powerful tool for businesses of all sizes.\n### Prerequisites\nTo obtain the OAuth identity provider authorization endpoint used to create and refresh tokens, you need to create a project in the Google API Console and enable the Google Analytics API. Then, you can create OAuth 2.0 credentials and configure the authorized redirect URIs. The authorization endpoint will be provided in the credentials.\n\nThe OAuth scopes you need to request access to depend on the specific data you want to access in Google Analytics. For example, if you want to read data from a specific view, you will need to request the \"https://www.googleapis.com/auth/analytics.readonly\" scope. You can find a list of available scopes in the Google Analytics API documentation.\n\nTo obtain the Access Token and OAuth Refresh Token, you need to authenticate the user and obtain their consent to access their Google Analytics data. This can be done using the Google Sign-In API or the OAuth 2.0 authorization flow. 
Once the user has granted access, you will receive an Access Token and a Refresh Token that you can use to make API requests.\n\nTo obtain the View ID, you need to log in to your Google Analytics account and navigate to the Admin section. From there, you can select the account, property, and view that you want to access and find the View ID in the View Settings.\n\n## Settings\n\n\n### View ID\n\nThe ID of the Google Analytics view to retrieve data from.\n\n### Reports\n\nThe specific reports to retrieve data from in the Google Analytics view.\n\n### Start date\n\nThe start date for the date range of data to retrieve.\n\n### End date\n\nThe end date for the date range of data to retrieve.",
          "_links" : {
            "self" : {
              "href" : "https://app.matatika.com/api/dataplugins/51bb57f1-9ccc-40ac-b14d-de574c855a63"
            },
            "update dataplugin" : {
              "href" : "https://app.matatika.com/api/workspaces/d9fda316-5510-46ee-8293-a24ba4230457/dataplugins/51bb57f1-9ccc-40ac-b14d-de574c855a63",
              "type" : "PUT"
            }
          }
        }
      },
      "draft" : true,
      "managed" : false,
      "_links" : {
        "self" : {
          "href" : "https://app.matatika.com/api/datacomponents/985215c6-87c5-4633-bd45-7cc236a92f87"
        },
        "update datacomponent" : {
          "href" : "https://app.matatika.com/api/datacomponents/985215c6-87c5-4633-bd45-7cc236a92f87"
        },
        "delete datacomponent" : {
          "href" : "https://app.matatika.com/api/datacomponents/985215c6-87c5-4633-bd45-7cc236a92f87"
        }
      }
    }, {
      "id" : "c8b03642-3482-45f9-8bbd-9d0c53d468ed",
      "created" : "2024-07-15T09:08:49.599991",
      "lastModified" : "2024-07-15T09:09:12.924665",
      "name" : "Warehouse",
      "properties" : {
        "password" : "4SllWL17_E43GjJq5_GhDX041h",
        "default_target_schema" : "analytics",
        "dbname" : "fruzutu",
        "port" : "5432",
        "host" : "sharp-banana.postgres.database.azure.com",
        "user" : "fruzutu"
      },
      "commands" : { },
      "dataPlugin" : "loaders/target-postgres--matatika",
      "_embedded" : {
        "dataplugin" : {
          "id" : "9da27ab6-0f9a-478a-bc82-d119a43d6777",
          "pluginType" : "LOADER",
          "name" : "target-postgres",
          "namespace" : "postgres_transferwise",
          "variant" : "matatika",
          "label" : "Postgres Warehouse",
          "description" : "Postgres Warehouse is a data warehousing solution built on top of the Postgres database management system.\n\nPostgres Warehouse is designed to handle large volumes of data and complex queries, making it an ideal solution for businesses that need to store and analyze large amounts of data. It provides a number of features that are specifically tailored to data warehousing, such as columnar storage, parallel processing, and support for advanced analytics. Additionally, Postgres Warehouse is highly scalable, allowing businesses to easily add more resources as their data needs grow. Overall, Postgres Warehouse is a powerful and flexible data warehousing solution that can help businesses make better decisions by providing them with the insights they need to succeed.\n### Prerequisites\nThe process of obtaining the required settings for connecting to a Postgres Warehouse may vary depending on the specific setup and configuration of the database. However, here are some general ways to obtain each of the required settings:\n\n- User: The user is typically created when the database is set up. You can ask the database administrator or check the database documentation to find out the username.\n- Password: The password is also typically created when the database is set up. You can ask the database administrator or check the database documentation to find out the password.\n- Host: The host is the server where the database is located. You can ask the database administrator or check the database documentation to find out the host name or IP address.\n- Port: The port is the number that the database listens on for incoming connections. The default port for Postgres is 5432, but it may be different depending on the configuration. You can ask the database administrator or check the database documentation to find out the port number.\n- Database Name: The database name is the name of the specific database you want to connect to. 
You can ask the database administrator or check the database documentation to find out the database name.\n- Default Target Schema: The default target schema is the schema that you want to use as the default when connecting to the database. This may be set up by the database administrator or you may need to create it yourself. You can ask the database administrator or check the database documentation to find out the default target schema.",
          "logoUrl" : "/assets/logos/loaders/postgres.png",
          "hidden" : false,
          "docs" : "https://www.matatika.com/data-details/target-postgres/",
          "pipUrl" : "git+https://github.com/Matatika/[email protected]",
          "repo" : "git+https://github.com/Matatika/[email protected]",
          "capabilities" : [ ],
          "select" : [ ],
          "update" : { },
          "vars" : { },
          "settings" : [ {
            "name" : "user",
            "aliases" : [ "username" ],
            "label" : "User",
            "kind" : "STRING",
            "description" : "The username used to connect to the Postgres Warehouse.",
            "required" : "true",
            "protected" : false
          }, {
            "name" : "password",
            "aliases" : [ ],
            "label" : "Password",
            "kind" : "PASSWORD",
            "description" : "The password used to authenticate the user.",
            "required" : "true",
            "protected" : false
          }, {
            "name" : "host",
            "aliases" : [ "address" ],
            "label" : "Host",
            "kind" : "STRING",
            "description" : "The hostname or IP address of the Postgres Warehouse server.",
            "required" : "true",
            "protected" : false
          }, {
            "name" : "port",
            "aliases" : [ ],
            "label" : "Port",
            "value" : "5432",
            "kind" : "INTEGER",
            "description" : "The port number used to connect to the Postgres Warehouse server.",
            "required" : "true",
            "protected" : false
          }, {
            "name" : "dbname",
            "aliases" : [ "database" ],
            "label" : "Database Name",
            "kind" : "STRING",
            "description" : "The name of the database to connect to.",
            "required" : "true",
            "protected" : false
          }, {
            "name" : "default_target_schema",
            "aliases" : [ ],
            "label" : "Default Target Schema",
            "value" : "analytics",
            "kind" : "STRING",
            "description" : "The default schema to use when writing data to the Postgres Warehouse.",
            "required" : "true",
            "protected" : false
          }, {
            "name" : "ssl",
            "aliases" : [ ],
            "label" : "SSL",
            "value" : "false",
            "kind" : "HIDDEN",
            "description" : "Whether or not to use SSL encryption when connecting to the Postgres Warehouse.",
            "protected" : false,
            "value_post_processor" : "STRINGIFY"
          }, {
            "name" : "batch_size_rows",
            "aliases" : [ ],
            "label" : "Batch Size Rows",
            "value" : "100000",
            "kind" : "INTEGER",
            "description" : "The number of rows to write to the Postgres Warehouse in each batch.",
            "protected" : false
          }, {
            "name" : "underscore_camel_case_fields",
            "aliases" : [ ],
            "label" : "Underscore Camel Case Fields",
            "value" : "true",
            "kind" : "HIDDEN",
            "description" : "Whether or not to convert field names from camel case to underscore-separated format.",
            "protected" : false
          }, {
            "name" : "flush_all_streams",
            "aliases" : [ ],
            "label" : "Flush All Streams",
            "value" : "false",
            "kind" : "HIDDEN",
            "description" : "Whether or not to flush all streams to the Postgres Warehouse before closing the connection.",
            "protected" : false
          }, {
            "name" : "parallelism",
            "aliases" : [ ],
            "label" : "Parallelism",
            "value" : "0",
            "kind" : "HIDDEN",
            "description" : "The number of threads to use when writing data to the Postgres Warehouse.",
            "protected" : false
          }, {
            "name" : "parallelism_max",
            "aliases" : [ ],
            "label" : "Max Parallelism",
            "value" : "16",
            "kind" : "HIDDEN",
            "description" : "The maximum number of threads to use when writing data to the Postgres Warehouse.",
            "protected" : false
          }, {
            "name" : "default_target_schema_select_permission",
            "aliases" : [ ],
            "label" : "Default Target Schema Select Permission",
            "kind" : "HIDDEN",
            "description" : "The permission level required to select data from the default target schema.",
            "protected" : false
          }, {
            "name" : "schema_mapping",
            "aliases" : [ ],
            "label" : "Schema Mapping",
            "kind" : "HIDDEN",
            "description" : "A mapping of source schema names to target schema names.",
            "protected" : false
          }, {
            "name" : "add_metadata_columns",
            "aliases" : [ ],
            "label" : "Add Metadata Columns",
            "value" : "true",
            "kind" : "HIDDEN",
            "description" : "Whether or not to add metadata columns to the target table.",
            "protected" : false
          }, {
            "name" : "hard_delete",
            "aliases" : [ ],
            "label" : "Hard Delete",
            "value" : "false",
            "kind" : "HIDDEN",
            "description" : "Whether or not to perform hard deletes when deleting data from the Postgres Warehouse.",
            "protected" : false
          }, {
            "name" : "data_flattening_max_level",
            "aliases" : [ ],
            "label" : "Data Flattening Max Level",
            "value" : "10",
            "kind" : "HIDDEN",
            "description" : "The maximum level of nested data structures to flatten when writing data to the Postgres Warehouse.",
            "protected" : false
          }, {
            "name" : "primary_key_required",
            "aliases" : [ ],
            "label" : "Primary Key Required",
            "value" : "false",
            "kind" : "BOOLEAN",
            "description" : "Whether or not a primary key is required for the target table.",
            "protected" : false
          }, {
            "name" : "validate_records",
            "aliases" : [ ],
            "label" : "Validate Records",
            "value" : "false",
            "kind" : "BOOLEAN",
            "description" : "Whether or not to validate records before writing them to the Postgres Warehouse.",
            "protected" : false
          }, {
            "name" : "temp_dir",
            "aliases" : [ ],
            "label" : "Temporary Directory",
            "kind" : "HIDDEN",
            "description" : "The directory to use for temporary files when writing data to the Postgres Warehouse.",
            "protected" : false
          } ],
          "variants" : [ ],
          "commands" : { },
          "matatikaHidden" : false,
          "requires" : [ ],
          "fullDescription" : "Postgres Warehouse is a data warehousing solution built on top of the Postgres database management system.\n\nPostgres Warehouse is designed to handle large volumes of data and complex queries, making it an ideal solution for businesses that need to store and analyze large amounts of data. It provides a number of features that are specifically tailored to data warehousing, such as columnar storage, parallel processing, and support for advanced analytics. Additionally, Postgres Warehouse is highly scalable, allowing businesses to easily add more resources as their data needs grow. Overall, Postgres Warehouse is a powerful and flexible data warehousing solution that can help businesses make better decisions by providing them with the insights they need to succeed.\n### Prerequisites\nThe process of obtaining the required settings for connecting to a Postgres Warehouse may vary depending on the specific setup and configuration of the database. However, here are some general ways to obtain each of the required settings:\n\n- User: The user is typically created when the database is set up. You can ask the database administrator or check the database documentation to find out the username.\n- Password: The password is also typically created when the database is set up. You can ask the database administrator or check the database documentation to find out the password.\n- Host: The host is the server where the database is located. You can ask the database administrator or check the database documentation to find out the host name or IP address.\n- Port: The port is the number that the database listens on for incoming connections. The default port for Postgres is 5432, but it may be different depending on the configuration. You can ask the database administrator or check the database documentation to find out the port number.\n- Database Name: The database name is the name of the specific database you want to connect to. 
You can ask the database administrator or check the database documentation to find out the database name.\n- Default Target Schema: The default target schema is the schema that you want to use as the default when connecting to the database. This may be set up by the database administrator or you may need to create it yourself. You can ask the database administrator or check the database documentation to find out the default target schema.\n\n## Settings\n\n\n### User\n\nThe username used to connect to the Postgres Warehouse.\n\n### Password\n\nThe password used to authenticate the user.\n\n### Host\n\nThe hostname or IP address of the Postgres Warehouse server.\n\n### Port\n\nThe port number used to connect to the Postgres Warehouse server.\n\n### Database Name\n\nThe name of the database to connect to.\n\n### Default Target Schema\n\nThe default schema to use when writing data to the Postgres Warehouse.\n\n### Batch Size Rows\n\nThe number of rows to write to the Postgres Warehouse in each batch.\n\n### Primary Key Required\n\nWhether or not a primary key is required for the target table.\n\n### Validate Records\n\nWhether or not to validate records before writing them to the Postgres Warehouse.",
          "_links" : {
            "self" : {
              "href" : "https://app.matatika.com/api/dataplugins/9da27ab6-0f9a-478a-bc82-d119a43d6777"
            },
            "update dataplugin" : {
              "href" : "https://app.matatika.com/api/workspaces/d9fda316-5510-46ee-8293-a24ba4230457/dataplugins/9da27ab6-0f9a-478a-bc82-d119a43d6777",
              "type" : "PUT"
            }
          }
        }
      },
      "draft" : false,
      "managed" : true,
      "_links" : {
        "self" : {
          "href" : "https://app.matatika.com/api/datacomponents/c8b03642-3482-45f9-8bbd-9d0c53d468ed"
        },
        "update datacomponent" : {
          "href" : "https://app.matatika.com/api/datacomponents/c8b03642-3482-45f9-8bbd-9d0c53d468ed"
        },
        "delete datacomponent" : {
          "href" : "https://app.matatika.com/api/datacomponents/c8b03642-3482-45f9-8bbd-9d0c53d468ed"
        }
      }
    }, {
      "id" : "da763b7f-14c6-4f37-988b-1968ad6b6e44",
      "created" : "2024-07-15T09:08:49.679742",
      "lastModified" : "2024-07-15T09:08:49.679743",
      "name" : "dbt",
      "properties" : { },
      "commands" : {
        "compile" : {
          "args" : "compile",
          "description" : "Generates executable SQL from source model, test, and analysis files. Compiled SQL files are written to the target/ directory."
        },
        "seed" : {
          "args" : "seed",
          "description" : "Load data from csv files into your data warehouse."
        },
        "test" : {
          "args" : "test",
          "description" : "Runs tests on data in deployed models."
        },
        "docs-generate" : {
          "args" : "docs generate",
          "description" : "Generate documentation artifacts for your project."
        },
        "deps" : {
          "args" : "deps",
          "description" : "Pull the most recent version of the dependencies listed in packages.yml"
        },
        "run" : {
          "args" : "run",
          "description" : "Compile SQL and execute against the current target database."
        },
        "clean" : {
          "args" : "clean",
          "description" : "Delete all folders in the clean-targets list (usually the dbt_modules and target directories.)"
        },
        "snapshot" : {
          "args" : "snapshot",
          "description" : "Execute snapshots defined in your project."
        }
      },
      "dataPlugin" : "transformers/dbt--dbt-labs",
      "_embedded" : {
        "dataplugin" : {
          "id" : "422fa2e4-b419-4c24-af96-d49a7c3a2f35",
          "pluginType" : "TRANSFORMER",
          "name" : "dbt",
          "namespace" : "dbt",
          "variant" : "dbt-labs",
          "label" : "dbt",
          "description" : " Power your project transformations with dbt™, a SQL-first transformation tool that enables analytics engineers to develop transformations with code.\n\n***Version Control and CI/CD***\n\nUse Matatika to deploy and promote changes between dev, UAT, and production environments.\n\n***Test and Document***\n\nUse Matatika to develop and test every model prior to production release, and share dynamically generated documentation with all stakeholders.\n\n***Develop***\n\nWrite modular data transformations in .sql – Matatika together with dbt handles the chore of dependency management. ",
          "logoUrl" : "/assets/images/transformer/dbt.png",
          "hidden" : false,
          "docs" : "https://www.matatika.com/data-details/dbt/",
          "pipUrl" : "dbt-core~=1.3.0 dbt-postgres~=1.3.0 dbt-snowflake~=1.3.0\n",
          "repo" : "https://github.com/dbt-labs/dbt-core",
          "capabilities" : [ ],
          "select" : [ ],
          "update" : { },
          "vars" : { },
          "settings" : [ {
            "name" : "project_dir",
            "aliases" : [ ],
            "value" : "$MELTANO_PROJECT_ROOT/transform",
            "kind" : "STRING",
            "protected" : false
          }, {
            "name" : "profiles_dir",
            "aliases" : [ ],
            "value" : "$MELTANO_PROJECT_ROOT/transform/profile",
            "kind" : "STRING",
            "env" : "DBT_PROFILES_DIR",
            "protected" : false
          }, {
            "name" : "target",
            "aliases" : [ ],
            "value" : "$MELTANO_LOAD__DIALECT",
            "kind" : "STRING",
            "protected" : false
          }, {
            "name" : "source_schema",
            "aliases" : [ ],
            "value" : "$MELTANO_LOAD__TARGET_SCHEMA",
            "kind" : "STRING",
            "protected" : false
          }, {
            "name" : "target_schema",
            "aliases" : [ ],
            "value" : "analytics",
            "kind" : "STRING",
            "protected" : false
          }, {
            "name" : "models",
            "aliases" : [ ],
            "value" : "$MELTANO_TRANSFORM__PACKAGE_NAME $MELTANO_EXTRACTOR_NAMESPACE my_meltano_project",
            "kind" : "STRING",
            "protected" : false
          } ],
          "variants" : [ ],
          "commands" : {
            "compile" : {
              "args" : "compile",
              "description" : "Generates executable SQL from source model, test, and analysis files. Compiled SQL files are written to the target/ directory."
            },
            "seed" : {
              "args" : "seed",
              "description" : "Load data from csv files into your data warehouse."
            },
            "test" : {
              "args" : "test",
              "description" : "Runs tests on data in deployed models."
            },
            "docs-generate" : {
              "args" : "docs generate",
              "description" : "Generate documentation artifacts for your project."
            },
            "deps" : {
              "args" : "deps",
              "description" : "Pull the most recent version of the dependencies listed in packages.yml"
            },
            "run" : {
              "args" : "run",
              "description" : "Compile SQL and execute against the current target database."
            },
            "clean" : {
              "args" : "clean",
              "description" : "Delete all folders in the clean-targets list (usually the dbt_modules and target directories.)"
            },
            "snapshot" : {
              "args" : "snapshot",
              "description" : "Execute snapshots defined in your project."
            }
          },
          "matatikaHidden" : false,
          "requires" : [ {
            "id" : "e6c1ad3d-ebf5-4c4a-b129-f68156b47555",
            "pluginType" : "FILE",
            "name" : "files-dbt",
            "namespace" : "dbt",
            "variant" : "matatika",
            "description" : " Files dbt is a file bundle that automatically configures your project to run transforms with dbt.\nThe bundle includes template project configuration:\n\n- transform/models (directory)\n- transform/profile/profiles.yml\n- transform/dbt_project.yml\n- transform/.gitignore\n",
            "hidden" : false,
            "pipUrl" : "git+https://github.com/Matatika/[email protected]",
            "repo" : "https://github.com/Matatika/files-dbt",
            "capabilities" : [ ],
            "select" : [ ],
            "update" : {
              "transform/profile/profiles.yml" : "true"
            },
            "vars" : { },
            "settings" : [ ],
            "variants" : [ ],
            "commands" : { },
            "matatikaHidden" : false,
            "requires" : [ ],
            "fullDescription" : " Files dbt is a file bundle that automatically configures your project to run transforms with dbt.\nThe bundle includes template project configuration:\n\n- transform/models (directory)\n- transform/profile/profiles.yml\n- transform/dbt_project.yml\n- transform/.gitignore\n"
          } ],
          "fullDescription" : " Power your project transformations with dbt™, a SQL-first transformation tool that enables analytics engineers to develop transformations with code.\n\n***Version Control and CI/CD***\n\nUse Matatika to deploy and promote changes between dev, UAT, and production environments.\n\n***Test and Document***\n\nUse Matatika to develop and test every model prior to production release, and share dynamically generated documentation with all stakeholders.\n\n***Develop***\n\nWrite modular data transformations in .sql – Matatika together with dbt handles the chore of dependency management. ",
          "_links" : {
            "self" : {
              "href" : "https://app.matatika.com/api/dataplugins/422fa2e4-b419-4c24-af96-d49a7c3a2f35"
            },
            "update dataplugin" : {
              "href" : "https://app.matatika.com/api/workspaces/d9fda316-5510-46ee-8293-a24ba4230457/dataplugins/422fa2e4-b419-4c24-af96-d49a7c3a2f35",
              "type" : "PUT"
            }
          }
        }
      },
      "draft" : false,
      "managed" : true,
      "_links" : {
        "self" : {
          "href" : "https://app.matatika.com/api/datacomponents/da763b7f-14c6-4f37-988b-1968ad6b6e44"
        },
        "update datacomponent" : {
          "href" : "https://app.matatika.com/api/datacomponents/da763b7f-14c6-4f37-988b-1968ad6b6e44"
        },
        "delete datacomponent" : {
          "href" : "https://app.matatika.com/api/datacomponents/da763b7f-14c6-4f37-988b-1968ad6b6e44"
        }
      }
    } ],
    "latest job" : {
      "id" : "02391ef9-ea14-461a-bfc3-de6bd76592c6",
      "created" : "2024-07-15T09:09:17.231363",
      "type" : "WORKSPACE_CONFIG",
      "maxAttempts" : 0,
      "attempt" : 0,
      "status" : "QUEUED",
      "_embedded" : {
        "pipeline" : {
          "id" : "3e11d680-bed2-4900-a5eb-27ec3fd6844e",
          "status" : "PROVISIONING",
          "name" : "SIT-generated pipeline [2024-07-15T10:09:16.246704]",
          "schedule" : "0 0 0 25 12 ?",
          "timeout" : 0,
          "maxRetries" : 0,
          "created" : "2024-07-15T09:09:16.617741",
          "lastModified" : "2024-07-15T09:09:16.617741",
          "properties" : {
            "tap-google-analytics.view_id" : "1234567890",
            "tap-google-analytics.oauth_credentials.client_secret" : "client_secret",
            "tap-google-analytics.reports" : "reports",
            "tap-google-analytics.oauth_credentials.refresh_token" : "refresh_token",
            "tap-google-analytics.start_date" : "2024-06-15T10:09:16.254923+01:00",
            "tap-google-analytics.end_date" : "2024-07-15T10:09:16.255042+01:00",
            "tap-google-analytics.oauth_credentials.client_id" : "client_id",
            "tap-google-analytics.oauth_credentials.access_token" : "access_token"
          },
          "dataComponents" : [ "tap-google-analytics", "Warehouse", "dbt" ],
          "actions" : [ ],
          "triggeredBy" : [ ]
        },
        "profile" : {
          "id" : "auth0|5eb0327cbfd7490bff55feeb",
          "name" : "[email protected]",
          "handle" : "@sit+prod",
          "email" : "[email protected]"
        }
      },
      "_links" : {
        "self" : {
          "href" : "https://app.matatika.com/api/jobs/02391ef9-ea14-461a-bfc3-de6bd76592c6"
        },
        "delete job" : {
          "href" : "https://app.matatika.com/api/jobs/02391ef9-ea14-461a-bfc3-de6bd76592c6",
          "type" : "DELETE"
        },
        "logs" : {
          "href" : "https://app.matatika.com/api/jobs/02391ef9-ea14-461a-bfc3-de6bd76592c6/logs?sequence=0",
          "type" : "GET"
        },
        "withdraw job" : {
          "href" : "https://app.matatika.com/api/jobs/02391ef9-ea14-461a-bfc3-de6bd76592c6/stopped",
          "type" : "PUT"
        }
      }
    }
  },
  "_links" : {
    "update pipeline" : {
      "href" : "https://app.matatika.com/api/workspaces/d9fda316-5510-46ee-8293-a24ba4230457/pipelines/3e11d680-bed2-4900-a5eb-27ec3fd6844e",
      "type" : "PUT"
    },
    "delete pipeline" : {
      "href" : "https://app.matatika.com/api/pipelines/3e11d680-bed2-4900-a5eb-27ec3fd6844e",
      "type" : "DELETE"
    },
    "draft pipeline" : {
      "href" : "https://app.matatika.com/api/workspaces/d9fda316-5510-46ee-8293-a24ba4230457/pipelines/3e11d680-bed2-4900-a5eb-27ec3fd6844e/draft",
      "type" : "PUT"
    },
    "self" : {
      "href" : "https://app.matatika.com/api/pipelines/3e11d680-bed2-4900-a5eb-27ec3fd6844e"
    },
    "environment" : {
      "href" : "https://app.matatika.com/api/pipelines/3e11d680-bed2-4900-a5eb-27ec3fd6844e/environment"
    },
    "jobs" : {
      "href" : "https://app.matatika.com/api/pipelines/3e11d680-bed2-4900-a5eb-27ec3fd6844e/jobs",
      "type" : "GET"
    },
    "metrics" : {
      "href" : "https://app.matatika.com/api/pipelines/3e11d680-bed2-4900-a5eb-27ec3fd6844e/metrics"
    },
    "add subscription" : {
      "href" : "https://app.matatika.com/api/pipelines/3e11d680-bed2-4900-a5eb-27ec3fd6844e/subscriptions"
    },
    "withdraw job" : {
      "href" : "https://app.matatika.com/api/jobs/02391ef9-ea14-461a-bfc3-de6bd76592c6/stopped",
      "type" : "PUT"
    },
    "latest job" : {
      "href" : "https://app.matatika.com/api/jobs/02391ef9-ea14-461a-bfc3-de6bd76592c6"
    }
  }
}

Create or update a pipeline as a draft

PUT

/api/workspaces/{workspace-id}/pipelines/{pipeline-id}/draft

Creates or updates the pipeline {pipeline-id} in the workspace {workspace-id} as a draft.

Prerequisites

  • Workspace {workspace-id} must exist

Request

Body

Pipeline resource.

{
  "name" : "SIT-generated pipeline [2024-07-15T10:10:49.295607]",
  "dataComponents" : [ "tap-google-analytics", "Warehouse", "dbt" ]
}
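Only `name` and `dataComponents` are required here; other Pipeline fields from the object table (for example `schedule`, `timeout`, and `maxRetries`) may be supplied in the same body. A minimal sketch of constructing such a payload — the name and cron expression are illustrative values, not part of the example above:

```python
import json

# Build a draft pipeline payload; "schedule", "timeout" and "maxRetries" are
# optional Pipeline fields documented in the object table (values here are
# illustrative assumptions).
payload = {
    "name": "Nightly ELT",
    "dataComponents": ["tap-google-analytics", "Warehouse", "dbt"],
    "schedule": "0 0 2 * * ?",  # launch a job at 02:00 every day
    "timeout": 0,               # 0 falls back to the implicit 300-second default
    "maxRetries": 1,            # retry once if the job ends with ERROR
}

print(json.dumps(payload, indent=2))
```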

Example Snippets

cURL

curl -H "Authorization: Bearer $ACCESS_TOKEN" 'https://app.matatika.com/api/workspaces/d9fda316-5510-46ee-8293-a24ba4230457/pipelines/37da82fd-da13-4df1-acd1-eb47fb8cc074/draft' -i -X PUT \
    -H 'Accept: application/json, application/javascript, text/javascript, text/json' \
    -H 'Content-Type: application/json' \
    -d '{
  "name" : "SIT-generated pipeline [2024-07-15T10:10:49.295607]",
  "dataComponents" : [ "tap-google-analytics", "Warehouse", "dbt" ]
}'

Python (requests)

import requests

url = "https://app.matatika.com/api/workspaces/d9fda316-5510-46ee-8293-a24ba4230457/pipelines/37da82fd-da13-4df1-acd1-eb47fb8cc074/draft"

data = {
  "name" : "SIT-generated pipeline [2024-07-15T10:10:49.295607]",
  "dataComponents" : [ "tap-google-analytics", "Warehouse", "dbt" ]
}
headers = {
  'Authorization': f'Bearer {ACCESS_TOKEN}',
  'Content-Type': 'application/json'
}

# json= serialises the body as JSON (data= would send it form-encoded)
response = requests.request("PUT", url, headers=headers, json=data)

print(response.json())
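The returned pipeline is a HAL resource, so follow-up requests (fetch, update, delete, list jobs) can be driven from its `_links` rather than from hand-built URLs. A small helper, sketched here assuming the `_links` layout shown in the responses on this page:

```python
# Resolve a HAL link relation to its href, assuming the
# "_links" -> relation -> {"href": ...} layout used in these responses.
def hal_href(resource, rel):
    links = resource.get("_links", {})
    if rel not in links:
        raise KeyError(f"no link relation {rel!r} on resource")
    return links[rel]["href"]

# Trimmed-down pipeline resource, shaped like the response body below
pipeline = {
    "id": "37da82fd-da13-4df1-acd1-eb47fb8cc074",
    "_links": {
        "self": {"href": "https://app.matatika.com/api/pipelines/37da82fd-da13-4df1-acd1-eb47fb8cc074"},
        "jobs": {"href": "https://app.matatika.com/api/pipelines/37da82fd-da13-4df1-acd1-eb47fb8cc074/jobs", "type": "GET"},
    },
}

print(hal_href(pipeline, "jobs"))
# e.g. requests.get(hal_href(pipeline, "jobs"), headers=headers) would list the pipeline's jobs
```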

Response

200 OK (pipeline updated) / 201 Created (pipeline created)

Pipeline with HAL links.

{
  "id" : "37da82fd-da13-4df1-acd1-eb47fb8cc074",
  "status" : "PROVISIONING",
  "name" : "SIT-generated pipeline [2024-07-15T10:10:49.295607]",
  "timeout" : 0,
  "maxRetries" : 0,
  "created" : "2024-07-15T09:10:49.554622",
  "lastModified" : "2024-07-15T09:10:49.554622",
  "properties" : { },
  "dataComponents" : [ "tap-google-analytics", "Warehouse", "dbt" ],
  "actions" : [ ],
  "triggeredBy" : [ ],
  "_embedded" : {
    "dataComponents" : [ {
      "id" : "985215c6-87c5-4633-bd45-7cc236a92f87",
      "created" : "2024-07-15T09:09:16.453622",
      "lastModified" : "2024-07-15T09:09:16.453623",
      "name" : "tap-google-analytics",
      "properties" : { },
      "commands" : { },
      "dataPlugin" : "extractors/tap-google-analytics--matatika",
      "_embedded" : {
        "dataplugin" : {
          "id" : "51bb57f1-9ccc-40ac-b14d-de574c855a63",
          "repositoryPath" : "plugins/extractors/tap-google-analytics--matatika.lock",
          "pluginType" : "EXTRACTOR",
          "name" : "tap-google-analytics",
          "namespace" : "tap_google_analytics",
          "variant" : "matatika",
          "label" : "Google Analytics",
          "description" : "Google Analytics is a web analytics service that provides insights into website traffic and user behavior.\n\nGoogle Analytics allows website owners to track and analyze various metrics related to their website's performance, such as the number of visitors, pageviews, bounce rate, and average session duration. It also provides information on the demographics and interests of website visitors, as well as the sources of traffic, including organic search, paid search, social media, and referrals. This data can be used to optimize website content and marketing strategies, as well as to measure the effectiveness of advertising campaigns. Additionally, Google Analytics offers advanced features such as goal tracking, e-commerce tracking, and custom reporting, making it a powerful tool for businesses of all sizes.\n### Prerequisites\nTo obtain the OAuth identity provider authorization endpoint used to create and refresh tokens, you need to create a project in the Google API Console and enable the Google Analytics API. Then, you can create OAuth 2.0 credentials and configure the authorized redirect URIs. The authorization endpoint will be provided in the credentials.\n\nThe OAuth scopes you need to request access to depend on the specific data you want to access in Google Analytics. For example, if you want to read data from a specific view, you will need to request the \"https://www.googleapis.com/auth/analytics.readonly\" scope. You can find a list of available scopes in the Google Analytics API documentation.\n\nTo obtain the Access Token and OAuth Refresh Token, you need to authenticate the user and obtain their consent to access their Google Analytics data. This can be done using the Google Sign-In API or the OAuth 2.0 authorization flow. 
Once the user has granted access, you will receive an Access Token and a Refresh Token that you can use to make API requests.\n\nTo obtain the View ID, you need to log in to your Google Analytics account and navigate to the Admin section. From there, you can select the account, property, and view that you want to access and find the View ID in the View Settings.",
          "logoUrl" : "https://app.matatika.com/assets/images/datasource/tap-google-analytics.svg",
          "hidden" : false,
          "docs" : "https://www.matatika.com/docs/instant-insights/tap-google-analytics/",
          "pipUrl" : "git+https://github.com/Matatika/[email protected]",
          "repo" : "https://github.com/Matatika/tap-google-analytics",
          "capabilities" : [ "CATALOG", "DISCOVER", "STATE" ],
          "select" : [ ],
          "update" : { },
          "vars" : { },
          "settings" : [ {
            "name" : "oauth_credentials.authorization_url",
            "aliases" : [ ],
            "label" : "OAuth identity provider authorization endpoint used create and refresh tokens",
            "value" : "https://oauth2.googleapis.com/token",
            "kind" : "HIDDEN",
            "description" : "The endpoint used to create and refresh OAuth tokens.",
            "required" : "true",
            "protected" : false
          }, {
            "name" : "oauth_credentials.scope",
            "aliases" : [ ],
            "label" : "OAuth scopes we need to request access to",
            "value" : "profile email https://www.googleapis.com/auth/analytics.readonly",
            "kind" : "HIDDEN",
            "description" : "The specific scopes we need to request access to in order to connect to Google Analytics.",
            "required" : "true",
            "protected" : false
          }, {
            "name" : "oauth_credentials.access_token",
            "aliases" : [ ],
            "label" : "Access Token",
            "kind" : "HIDDEN",
            "description" : "The token used to authenticate and authorize API requests.",
            "required" : "true",
            "protected" : false
          }, {
            "name" : "oauth_credentials.refresh_token",
            "aliases" : [ ],
            "label" : "OAuth Refresh Token",
            "kind" : "HIDDEN",
            "description" : "The token used to refresh the access token when it expires.",
            "required" : "true",
            "protected" : false
          }, {
            "name" : "oauth_credentials.refresh_proxy_url",
            "aliases" : [ ],
            "label" : "Optional - will be called with 'oauth_credentials.refresh_token' to refresh the access token",
            "kind" : "HIDDEN",
            "description" : "An optional function that will be called to refresh the access token using the refresh token.",
            "protected" : false
          }, {
            "name" : "oauth_credentials.refresh_proxy_url_auth",
            "aliases" : [ ],
            "label" : "Optional - Sets Authorization header on 'oauth_credentials.refresh_url' request",
            "kind" : "HIDDEN",
            "description" : "An optional setting that sets the Authorization header on the refresh URL request.",
            "protected" : false
          }, {
            "name" : "oauth_credentials.client_id",
            "aliases" : [ ],
            "label" : "Optional - OAuth Client ID used if refresh_proxy_url not supplied",
            "kind" : "HIDDEN",
            "description" : "An optional OAuth Client ID used if the refresh proxy URL is not supplied.",
            "protected" : false
          }, {
            "name" : "oauth_credentials.client_secret",
            "aliases" : [ ],
            "label" : "Optional - OAuth Client Secret used if refresh_proxy_url not supplied",
            "kind" : "HIDDEN",
            "description" : "An optional OAuth Client Secret used if the refresh proxy URL is not supplied.",
            "protected" : false
          }, {
            "name" : "view_id",
            "aliases" : [ ],
            "label" : "View ID",
            "placeholder" : "Ex. 198343027",
            "kind" : "STRING",
            "description" : "The ID of the Google Analytics view to retrieve data from.",
            "required" : "true",
            "protected" : false
          }, {
            "name" : "reports",
            "aliases" : [ ],
            "label" : "Reports",
            "placeholder" : "Ex. my_report_definition.json",
            "kind" : "STRING",
            "description" : "The specific reports to retrieve data from in the Google Analytics view.",
            "protected" : false
          }, {
            "name" : "start_date",
            "aliases" : [ ],
            "label" : "Start date",
            "kind" : "DATE_ISO8601",
            "description" : "The start date for the date range of data to retrieve.",
            "protected" : false
          }, {
            "name" : "end_date",
            "aliases" : [ ],
            "label" : "End date",
            "kind" : "DATE_ISO8601",
            "description" : "The end date for the date range of data to retrieve.",
            "protected" : false
          } ],
          "variants" : [ ],
          "commands" : { },
          "matatikaHidden" : false,
          "requires" : [ ],
          "fullDescription" : "Google Analytics is a web analytics service that provides insights into website traffic and user behavior.\n\nGoogle Analytics allows website owners to track and analyze various metrics related to their website's performance, such as the number of visitors, pageviews, bounce rate, and average session duration. It also provides information on the demographics and interests of website visitors, as well as the sources of traffic, including organic search, paid search, social media, and referrals. This data can be used to optimize website content and marketing strategies, as well as to measure the effectiveness of advertising campaigns. Additionally, Google Analytics offers advanced features such as goal tracking, e-commerce tracking, and custom reporting, making it a powerful tool for businesses of all sizes.\n### Prerequisites\nTo obtain the OAuth identity provider authorization endpoint used to create and refresh tokens, you need to create a project in the Google API Console and enable the Google Analytics API. Then, you can create OAuth 2.0 credentials and configure the authorized redirect URIs. The authorization endpoint will be provided in the credentials.\n\nThe OAuth scopes you need to request access to depend on the specific data you want to access in Google Analytics. For example, if you want to read data from a specific view, you will need to request the \"https://www.googleapis.com/auth/analytics.readonly\" scope. You can find a list of available scopes in the Google Analytics API documentation.\n\nTo obtain the Access Token and OAuth Refresh Token, you need to authenticate the user and obtain their consent to access their Google Analytics data. This can be done using the Google Sign-In API or the OAuth 2.0 authorization flow. 
Once the user has granted access, you will receive an Access Token and a Refresh Token that you can use to make API requests.\n\nTo obtain the View ID, you need to log in to your Google Analytics account and navigate to the Admin section. From there, you can select the account, property, and view that you want to access and find the View ID in the View Settings.\n\n## Settings\n\n\n### View ID\n\nThe ID of the Google Analytics view to retrieve data from.\n\n### Reports\n\nThe specific reports to retrieve data from in the Google Analytics view.\n\n### Start date\n\nThe start date for the date range of data to retrieve.\n\n### End date\n\nThe end date for the date range of data to retrieve.",
          "_links" : {
            "self" : {
              "href" : "https://app.matatika.com/api/dataplugins/51bb57f1-9ccc-40ac-b14d-de574c855a63"
            },
            "update dataplugin" : {
              "href" : "https://app.matatika.com/api/workspaces/d9fda316-5510-46ee-8293-a24ba4230457/dataplugins/51bb57f1-9ccc-40ac-b14d-de574c855a63",
              "type" : "PUT"
            }
          }
        }
      },
      "draft" : true,
      "managed" : false,
      "_links" : {
        "self" : {
          "href" : "https://app.matatika.com/api/datacomponents/985215c6-87c5-4633-bd45-7cc236a92f87"
        },
        "update datacomponent" : {
          "href" : "https://app.matatika.com/api/datacomponents/985215c6-87c5-4633-bd45-7cc236a92f87"
        },
        "delete datacomponent" : {
          "href" : "https://app.matatika.com/api/datacomponents/985215c6-87c5-4633-bd45-7cc236a92f87"
        }
      }
    }, {
      "id" : "c8b03642-3482-45f9-8bbd-9d0c53d468ed",
      "created" : "2024-07-15T09:08:49.599991",
      "lastModified" : "2024-07-15T09:10:47.972756",
      "name" : "Warehouse",
      "properties" : {
        "password" : "4SllWL17_E43GjJq5_GhDX041h",
        "default_target_schema" : "analytics",
        "dbname" : "fruzutu",
        "port" : "5432",
        "host" : "sharp-banana.postgres.database.azure.com",
        "user" : "fruzutu"
      },
      "commands" : { },
      "dataPlugin" : "loaders/target-postgres--matatika",
      "_embedded" : {
        "dataplugin" : {
          "id" : "9da27ab6-0f9a-478a-bc82-d119a43d6777",
          "repositoryPath" : "plugins/loaders/target-postgres--matatika.lock",
          "pluginType" : "LOADER",
          "name" : "target-postgres",
          "namespace" : "postgres_transferwise",
          "variant" : "matatika",
          "label" : "Postgres Warehouse",
          "description" : "Postgres Warehouse is a data warehousing solution built on top of the Postgres database management system.\n\nPostgres Warehouse is designed to handle large volumes of data and complex queries, making it an ideal solution for businesses that need to store and analyze large amounts of data. It provides a number of features that are specifically tailored to data warehousing, such as columnar storage, parallel processing, and support for advanced analytics. Additionally, Postgres Warehouse is highly scalable, allowing businesses to easily add more resources as their data needs grow. Overall, Postgres Warehouse is a powerful and flexible data warehousing solution that can help businesses make better decisions by providing them with the insights they need to succeed.\n### Prerequisites\nThe process of obtaining the required settings for connecting to a Postgres Warehouse may vary depending on the specific setup and configuration of the database. However, here are some general ways to obtain each of the required settings:\n\n- User: The user is typically created when the database is set up. You can ask the database administrator or check the database documentation to find out the username.\n- Password: The password is also typically created when the database is set up. You can ask the database administrator or check the database documentation to find out the password.\n- Host: The host is the server where the database is located. You can ask the database administrator or check the database documentation to find out the host name or IP address.\n- Port: The port is the number that the database listens on for incoming connections. The default port for Postgres is 5432, but it may be different depending on the configuration. You can ask the database administrator or check the database documentation to find out the port number.\n- Database Name: The database name is the name of the specific database you want to connect to. 
You can ask the database administrator or check the database documentation to find out the database name.\n- Default Target Schema: The default target schema is the schema that you want to use as the default when connecting to the database. This may be set up by the database administrator or you may need to create it yourself. You can ask the database administrator or check the database documentation to find out the default target schema.",
          "logoUrl" : "https://app.matatika.com/assets/logos/loaders/postgres.png",
          "hidden" : false,
          "docs" : "https://www.matatika.com/data-details/target-postgres/",
          "pipUrl" : "git+https://github.com/Matatika/[email protected]",
          "repo" : "git+https://github.com/Matatika/[email protected]",
          "capabilities" : [ ],
          "select" : [ ],
          "update" : { },
          "vars" : { },
          "settings" : [ {
            "name" : "user",
            "aliases" : [ "username" ],
            "label" : "User",
            "kind" : "STRING",
            "description" : "The username used to connect to the Postgres Warehouse.",
            "required" : "true",
            "protected" : false
          }, {
            "name" : "password",
            "aliases" : [ ],
            "label" : "Password",
            "kind" : "PASSWORD",
            "description" : "The password used to authenticate the user.",
            "required" : "true",
            "protected" : false
          }, {
            "name" : "host",
            "aliases" : [ "address" ],
            "label" : "Host",
            "kind" : "STRING",
            "description" : "The hostname or IP address of the Postgres Warehouse server.",
            "required" : "true",
            "protected" : false
          }, {
            "name" : "port",
            "aliases" : [ ],
            "label" : "Port",
            "value" : "5432",
            "kind" : "INTEGER",
            "description" : "The port number used to connect to the Postgres Warehouse server.",
            "required" : "true",
            "protected" : false
          }, {
            "name" : "dbname",
            "aliases" : [ "database" ],
            "label" : "Database Name",
            "kind" : "STRING",
            "description" : "The name of the database to connect to.",
            "required" : "true",
            "protected" : false
          }, {
            "name" : "default_target_schema",
            "aliases" : [ ],
            "label" : "Default Target Schema",
            "value" : "analytics",
            "kind" : "STRING",
            "description" : "The default schema to use when writing data to the Postgres Warehouse.",
            "required" : "true",
            "protected" : false
          }, {
            "name" : "ssl",
            "aliases" : [ ],
            "label" : "SSL",
            "value" : "false",
            "kind" : "HIDDEN",
            "description" : "Whether or not to use SSL encryption when connecting to the Postgres Warehouse.",
            "protected" : false,
            "value_post_processor" : "STRINGIFY"
          }, {
            "name" : "batch_size_rows",
            "aliases" : [ ],
            "label" : "Batch Size Rows",
            "value" : "100000",
            "kind" : "INTEGER",
            "description" : "The number of rows to write to the Postgres Warehouse in each batch.",
            "protected" : false
          }, {
            "name" : "underscore_camel_case_fields",
            "aliases" : [ ],
            "label" : "Underscore Camel Case Fields",
            "value" : "true",
            "kind" : "HIDDEN",
            "description" : "Whether or not to convert field names from camel case to underscore-separated format.",
            "protected" : false
          }, {
            "name" : "flush_all_streams",
            "aliases" : [ ],
            "label" : "Flush All Streams",
            "value" : "false",
            "kind" : "HIDDEN",
            "description" : "Whether or not to flush all streams to the Postgres Warehouse before closing the connection.",
            "protected" : false
          }, {
            "name" : "parallelism",
            "aliases" : [ ],
            "label" : "Parallelism",
            "value" : "0",
            "kind" : "HIDDEN",
            "description" : "The number of threads to use when writing data to the Postgres Warehouse.",
            "protected" : false
          }, {
            "name" : "parallelism_max",
            "aliases" : [ ],
            "label" : "Max Parallelism",
            "value" : "16",
            "kind" : "HIDDEN",
            "description" : "The maximum number of threads to use when writing data to the Postgres Warehouse.",
            "protected" : false
          }, {
            "name" : "default_target_schema_select_permission",
            "aliases" : [ ],
            "label" : "Default Target Schema Select Permission",
            "kind" : "HIDDEN",
            "description" : "The permission level required to select data from the default target schema.",
            "protected" : false
          }, {
            "name" : "schema_mapping",
            "aliases" : [ ],
            "label" : "Schema Mapping",
            "kind" : "HIDDEN",
            "description" : "A mapping of source schema names to target schema names.",
            "protected" : false
          }, {
            "name" : "add_metadata_columns",
            "aliases" : [ ],
            "label" : "Add Metadata Columns",
            "value" : "true",
            "kind" : "HIDDEN",
            "description" : "Whether or not to add metadata columns to the target table.",
            "protected" : false
          }, {
            "name" : "hard_delete",
            "aliases" : [ ],
            "label" : "Hard Delete",
            "value" : "false",
            "kind" : "HIDDEN",
            "description" : "Whether or not to perform hard deletes when deleting data from the Postgres Warehouse.",
            "protected" : false
          }, {
            "name" : "data_flattening_max_level",
            "aliases" : [ ],
            "label" : "Data Flattening Max Level",
            "value" : "10",
            "kind" : "HIDDEN",
            "description" : "The maximum level of nested data structures to flatten when writing data to the Postgres Warehouse.",
            "protected" : false
          }, {
            "name" : "primary_key_required",
            "aliases" : [ ],
            "label" : "Primary Key Required",
            "value" : "false",
            "kind" : "BOOLEAN",
            "description" : "Whether or not a primary key is required for the target table.",
            "protected" : false
          }, {
            "name" : "validate_records",
            "aliases" : [ ],
            "label" : "Validate Records",
            "value" : "false",
            "kind" : "BOOLEAN",
            "description" : "Whether or not to validate records before writing them to the Postgres Warehouse.",
            "protected" : false
          }, {
            "name" : "temp_dir",
            "aliases" : [ ],
            "label" : "Temporary Directory",
            "kind" : "HIDDEN",
            "description" : "The directory to use for temporary files when writing data to the Postgres Warehouse.",
            "protected" : false
          } ],
          "variants" : [ ],
          "commands" : { },
          "matatikaHidden" : false,
          "requires" : [ ],
          "fullDescription" : "Postgres Warehouse is a data warehousing solution built on top of the Postgres database management system.\n\nPostgres Warehouse is designed to handle large volumes of data and complex queries, making it an ideal solution for businesses that need to store and analyze large amounts of data. It provides a number of features that are specifically tailored to data warehousing, such as columnar storage, parallel processing, and support for advanced analytics. Additionally, Postgres Warehouse is highly scalable, allowing businesses to easily add more resources as their data needs grow. Overall, Postgres Warehouse is a powerful and flexible data warehousing solution that can help businesses make better decisions by providing them with the insights they need to succeed.\n### Prerequisites\nThe process of obtaining the required settings for connecting to a Postgres Warehouse may vary depending on the specific setup and configuration of the database. However, here are some general ways to obtain each of the required settings:\n\n- User: The user is typically created when the database is set up. You can ask the database administrator or check the database documentation to find out the username.\n- Password: The password is also typically created when the database is set up. You can ask the database administrator or check the database documentation to find out the password.\n- Host: The host is the server where the database is located. You can ask the database administrator or check the database documentation to find out the host name or IP address.\n- Port: The port is the number that the database listens on for incoming connections. The default port for Postgres is 5432, but it may be different depending on the configuration. You can ask the database administrator or check the database documentation to find out the port number.\n- Database Name: The database name is the name of the specific database you want to connect to. 
You can ask the database administrator or check the database documentation to find out the database name.\n- Default Target Schema: The default target schema is the schema that you want to use as the default when connecting to the database. This may be set up by the database administrator or you may need to create it yourself. You can ask the database administrator or check the database documentation to find out the default target schema.\n\n## Settings\n\n\n### User\n\nThe username used to connect to the Postgres Warehouse.\n\n### Password\n\nThe password used to authenticate the user.\n\n### Host\n\nThe hostname or IP address of the Postgres Warehouse server.\n\n### Port\n\nThe port number used to connect to the Postgres Warehouse server.\n\n### Database Name\n\nThe name of the database to connect to.\n\n### Default Target Schema\n\nThe default schema to use when writing data to the Postgres Warehouse.\n\n### Batch Size Rows\n\nThe number of rows to write to the Postgres Warehouse in each batch.\n\n### Primary Key Required\n\nWhether or not a primary key is required for the target table.\n\n### Validate Records\n\nWhether or not to validate records before writing them to the Postgres Warehouse.",
          "_links" : {
            "self" : {
              "href" : "https://app.matatika.com/api/dataplugins/9da27ab6-0f9a-478a-bc82-d119a43d6777"
            },
            "update dataplugin" : {
              "href" : "https://app.matatika.com/api/workspaces/d9fda316-5510-46ee-8293-a24ba4230457/dataplugins/9da27ab6-0f9a-478a-bc82-d119a43d6777",
              "type" : "PUT"
            }
          }
        }
      },
      "draft" : false,
      "managed" : true,
      "_links" : {
        "self" : {
          "href" : "https://app.matatika.com/api/datacomponents/c8b03642-3482-45f9-8bbd-9d0c53d468ed"
        },
        "update datacomponent" : {
          "href" : "https://app.matatika.com/api/datacomponents/c8b03642-3482-45f9-8bbd-9d0c53d468ed"
        },
        "delete datacomponent" : {
          "href" : "https://app.matatika.com/api/datacomponents/c8b03642-3482-45f9-8bbd-9d0c53d468ed"
        }
      }
    }, {
      "id" : "3bf944f2-ee7e-4326-a7ce-35df580478f9",
      "created" : "2024-07-15T09:10:47.73595",
      "lastModified" : "2024-07-15T09:10:47.735961",
      "name" : "dbt",
      "properties" : { },
      "commands" : {
        "compile" : {
          "args" : "compile",
          "description" : "Generates executable SQL from source model, test, and analysis files. Compiled SQL files are written to the target/ directory."
        },
        "seed" : {
          "args" : "seed",
          "description" : "Load data from csv files into your data warehouse."
        },
        "test" : {
          "args" : "test",
          "description" : "Runs tests on data in deployed models."
        },
        "docs-generate" : {
          "args" : "docs generate",
          "description" : "Generate documentation artifacts for your project."
        },
        "deps" : {
          "args" : "deps",
          "description" : "Pull the most recent version of the dependencies listed in packages.yml"
        },
        "run" : {
          "args" : "run",
          "description" : "Compile SQL and execute against the current target database."
        },
        "clean" : {
          "args" : "clean",
          "description" : "Delete all folders in the clean-targets list (usually the dbt_modules and target directories.)"
        },
        "snapshot" : {
          "args" : "snapshot",
          "description" : "Execute snapshots defined in your project."
        }
      },
      "dataPlugin" : "transformers/dbt--dbt-labs",
      "_embedded" : {
        "dataplugin" : {
          "id" : "422fa2e4-b419-4c24-af96-d49a7c3a2f35",
          "repositoryPath" : "plugins/transformers/dbt--dbt-labs.lock",
          "pluginType" : "TRANSFORMER",
          "name" : "dbt",
          "namespace" : "dbt",
          "variant" : "dbt-labs",
          "label" : "dbt",
          "description" : " Power your project transformations with dbt™, a SQL-first transformation tool that enables analytics engineers to develop transformations with code.\n\n***Version Control and CI/CD***\n\nUse Matatika to deploy and promote changes between dev, UAT, and production environments.\n\n***Test and Document***\n\nUse Matatika to develop and test every model prior to production release, and share dynamically generated documentation with all stakeholders.\n\n***Develop***\n\nWrite modular data transformations in .sql – Matatika together with dbt handles the chore of dependency management. ",
          "logoUrl" : "https://app.matatika.com/assets/images/transformer/dbt.png",
          "hidden" : false,
          "docs" : "https://www.matatika.com/data-details/dbt/",
          "pipUrl" : "dbt-core~=1.3.0 dbt-postgres~=1.3.0 dbt-snowflake~=1.3.0\n",
          "repo" : "https://github.com/dbt-labs/dbt-core",
          "capabilities" : [ ],
          "select" : [ ],
          "update" : { },
          "vars" : { },
          "settings" : [ {
            "name" : "project_dir",
            "aliases" : [ ],
            "value" : "$MELTANO_PROJECT_ROOT/transform",
            "kind" : "STRING",
            "protected" : false
          }, {
            "name" : "profiles_dir",
            "aliases" : [ ],
            "value" : "$MELTANO_PROJECT_ROOT/transform/profile",
            "kind" : "STRING",
            "env" : "DBT_PROFILES_DIR",
            "protected" : false
          }, {
            "name" : "target",
            "aliases" : [ ],
            "value" : "$MELTANO_LOAD__DIALECT",
            "kind" : "STRING",
            "protected" : false
          }, {
            "name" : "source_schema",
            "aliases" : [ ],
            "value" : "$MELTANO_LOAD__TARGET_SCHEMA",
            "kind" : "STRING",
            "protected" : false
          }, {
            "name" : "target_schema",
            "aliases" : [ ],
            "value" : "analytics",
            "kind" : "STRING",
            "protected" : false
          }, {
            "name" : "models",
            "aliases" : [ ],
            "value" : "$MELTANO_TRANSFORM__PACKAGE_NAME $MELTANO_EXTRACTOR_NAMESPACE my_meltano_project",
            "kind" : "STRING",
            "protected" : false
          } ],
          "variants" : [ ],
          "commands" : {
            "compile" : {
              "args" : "compile",
              "description" : "Generates executable SQL from source model, test, and analysis files. Compiled SQL files are written to the target/ directory."
            },
            "seed" : {
              "args" : "seed",
              "description" : "Load data from csv files into your data warehouse."
            },
            "test" : {
              "args" : "test",
              "description" : "Runs tests on data in deployed models."
            },
            "docs-generate" : {
              "args" : "docs generate",
              "description" : "Generate documentation artifacts for your project."
            },
            "deps" : {
              "args" : "deps",
              "description" : "Pull the most recent version of the dependencies listed in packages.yml"
            },
            "run" : {
              "args" : "run",
              "description" : "Compile SQL and execute against the current target database."
            },
            "clean" : {
              "args" : "clean",
              "description" : "Delete all folders in the clean-targets list (usually the dbt_modules and target directories.)"
            },
            "snapshot" : {
              "args" : "snapshot",
              "description" : "Execute snapshots defined in your project."
            }
          },
          "matatikaHidden" : false,
          "requires" : [ {
            "id" : "732af1ba-ad83-4fd0-ba19-4b07c02e1bba",
            "repositoryPath" : "plugins/files/files-dbt--matatika.lock",
            "pluginType" : "FILE",
            "name" : "files-dbt",
            "namespace" : "dbt",
            "variant" : "matatika",
            "label" : "files-dbt",
            "description" : " Files dbt is a file bundle that automatically configures your project to run transforms with dbt.\nThe bundle includes template project configuration:\n\n- transform/models (directory)\n- transform/profile/profiles.yml\n- transform/dbt_project.yml\n- transform/.gitignore\n",
            "hidden" : false,
            "pipUrl" : "git+https://github.com/Matatika/[email protected]",
            "repo" : "https://github.com/Matatika/files-dbt",
            "capabilities" : [ ],
            "select" : [ ],
            "update" : {
              "transform/profile/profiles.yml" : "true"
            },
            "vars" : { },
            "settings" : [ ],
            "variants" : [ ],
            "commands" : { },
            "matatikaHidden" : false,
            "requires" : [ ],
            "fullDescription" : " Files dbt is a file bundle that automatically configures your project to run transforms with dbt.\nThe bundle includes template project configuration:\n\n- transform/models (directory)\n- transform/profile/profiles.yml\n- transform/dbt_project.yml\n- transform/.gitignore\n"
          } ],
          "fullDescription" : " Power your project transformations with dbt™, a SQL-first transformation tool that enables analytics engineers to develop transformations with code.\n\n***Version Control and CI/CD***\n\nUse Matatika to deploy and promote changes between dev, UAT, and production environments.\n\n***Test and Document***\n\nUse Matatika to develop and test every model prior to production release, and share dynamically generated documentation with all stakeholders.\n\n***Develop***\n\nWrite modular data transformations in .sql – Matatika together with dbt handles the chore of dependency management. ",
          "_links" : {
            "self" : {
              "href" : "https://app.matatika.com/api/dataplugins/422fa2e4-b419-4c24-af96-d49a7c3a2f35"
            },
            "update dataplugin" : {
              "href" : "https://app.matatika.com/api/workspaces/d9fda316-5510-46ee-8293-a24ba4230457/dataplugins/422fa2e4-b419-4c24-af96-d49a7c3a2f35",
              "type" : "PUT"
            }
          }
        }
      },
      "draft" : false,
      "managed" : false,
      "_links" : {
        "self" : {
          "href" : "https://app.matatika.com/api/datacomponents/3bf944f2-ee7e-4326-a7ce-35df580478f9"
        },
        "update datacomponent" : {
          "href" : "https://app.matatika.com/api/datacomponents/3bf944f2-ee7e-4326-a7ce-35df580478f9"
        },
        "delete datacomponent" : {
          "href" : "https://app.matatika.com/api/datacomponents/3bf944f2-ee7e-4326-a7ce-35df580478f9"
        }
      }
    } ],
    "latest job" : {
      "id" : "e6f969cb-8197-4d33-a588-b97ba78d904f",
      "created" : "2024-07-15T09:10:50.289063",
      "type" : "WORKSPACE_CONFIG",
      "maxAttempts" : 0,
      "attempt" : 0,
      "status" : "QUEUED",
      "_embedded" : {
        "pipeline" : {
          "id" : "37da82fd-da13-4df1-acd1-eb47fb8cc074",
          "status" : "PROVISIONING",
          "name" : "SIT-generated pipeline [2024-07-15T10:10:49.295607]",
          "timeout" : 0,
          "maxRetries" : 0,
          "created" : "2024-07-15T09:10:49.554622",
          "lastModified" : "2024-07-15T09:10:49.554622",
          "properties" : { },
          "dataComponents" : [ "tap-google-analytics", "Warehouse", "dbt" ],
          "actions" : [ ],
          "triggeredBy" : [ ]
        },
        "profile" : {
          "id" : "auth0|5eb0327cbfd7490bff55feeb",
          "name" : "[email protected]",
          "handle" : "@sit+prod",
          "email" : "[email protected]"
        }
      },
      "_links" : {
        "self" : {
          "href" : "https://app.matatika.com/api/jobs/e6f969cb-8197-4d33-a588-b97ba78d904f"
        },
        "delete job" : {
          "href" : "https://app.matatika.com/api/jobs/e6f969cb-8197-4d33-a588-b97ba78d904f",
          "type" : "DELETE"
        },
        "logs" : {
          "href" : "https://app.matatika.com/api/jobs/e6f969cb-8197-4d33-a588-b97ba78d904f/logs?sequence=0",
          "type" : "GET"
        },
        "withdraw job" : {
          "href" : "https://app.matatika.com/api/jobs/e6f969cb-8197-4d33-a588-b97ba78d904f/stopped",
          "type" : "PUT"
        }
      }
    }
  },
  "_links" : {
    "update pipeline" : {
      "href" : "https://app.matatika.com/api/workspaces/d9fda316-5510-46ee-8293-a24ba4230457/pipelines/37da82fd-da13-4df1-acd1-eb47fb8cc074",
      "type" : "PUT"
    },
    "delete pipeline" : {
      "href" : "https://app.matatika.com/api/pipelines/37da82fd-da13-4df1-acd1-eb47fb8cc074",
      "type" : "DELETE"
    },
    "draft pipeline" : {
      "href" : "https://app.matatika.com/api/workspaces/d9fda316-5510-46ee-8293-a24ba4230457/pipelines/37da82fd-da13-4df1-acd1-eb47fb8cc074/draft",
      "type" : "PUT"
    },
    "self" : {
      "href" : "https://app.matatika.com/api/pipelines/37da82fd-da13-4df1-acd1-eb47fb8cc074"
    },
    "environment" : {
      "href" : "https://app.matatika.com/api/pipelines/37da82fd-da13-4df1-acd1-eb47fb8cc074/environment"
    },
    "jobs" : {
      "href" : "https://app.matatika.com/api/pipelines/37da82fd-da13-4df1-acd1-eb47fb8cc074/jobs",
      "type" : "GET"
    },
    "metrics" : {
      "href" : "https://app.matatika.com/api/pipelines/37da82fd-da13-4df1-acd1-eb47fb8cc074/metrics"
    },
    "add subscription" : {
      "href" : "https://app.matatika.com/api/pipelines/37da82fd-da13-4df1-acd1-eb47fb8cc074/subscriptions"
    },
    "withdraw job" : {
      "href" : "https://app.matatika.com/api/jobs/e6f969cb-8197-4d33-a588-b97ba78d904f/stopped",
      "type" : "PUT"
    },
    "latest job" : {
      "href" : "https://app.matatika.com/api/jobs/e6f969cb-8197-4d33-a588-b97ba78d904f"
    }
  }
}

Validate a pipeline configuration in a workspace

POST

/api/workspaces/{workspace-id}/pipelines/validation

Validates a pipeline configuration in the workspace {workspace-id}.

Prerequisites

  • Workspace {workspace-id} must exist

Request

Body

Pipeline resource.

{
  "name" : "SIT-generated pipeline [2024-07-15T10:10:49.295607] (updated)",
  "dataComponents" : [ "tap-google-analytics", "Warehouse", "dbt" ]
}

Example Snippets

cURL

curl -H "Authorization: Bearer $ACCESS_TOKEN" 'https://app.matatika.com/api/workspaces/d9fda316-5510-46ee-8293-a24ba4230457/pipelines/validation' -i -X POST \
    -H 'Accept: application/json, application/javascript, text/javascript, text/json' \
    -H 'Content-Type: application/json' \
    -d '{
  "name" : "SIT-generated pipeline [2024-07-15T10:10:49.295607] (updated)",
  "dataComponents" : [ "tap-google-analytics", "Warehouse", "dbt" ]
}'

Python (requests)

import os

import requests

url = "https://app.matatika.com/api/workspaces/d9fda316-5510-46ee-8293-a24ba4230457/pipelines/validation"

data = {
  "name" : "SIT-generated pipeline [2024-07-15T10:10:49.295607] (updated)",
  "dataComponents" : [ "tap-google-analytics", "Warehouse", "dbt" ]
}
headers = {
  'Authorization': f'Bearer {os.environ["ACCESS_TOKEN"]}'
}

# json= serialises the body as JSON and sets the Content-Type header,
# matching the cURL example above
response = requests.post(url, headers=headers, json=data)

print(response.text)

Response

200 OK

No response body provided.

400 Bad Request

Pipeline property validation errors.

{
  "timestamp" : "2024-07-15T09:11:27.980181019",
  "status" : 400,
  "error" : "Bad Request",
  "message" : "3 validation errors from 'resource'",
  "errors" : [ {
    "codes" : [ "NotBlank.oauth_credentials.access_token", "NotBlank" ],
    "defaultMessage" : "No value given for setting",
    "objectName" : "resource",
    "field" : "properties.tap-google-analytics.oauth_credentials.access_token",
    "bindingFailure" : true,
    "code" : "NotBlank"
  }, {
    "codes" : [ "NotBlank.oauth_credentials.refresh_token", "NotBlank" ],
    "defaultMessage" : "No value given for setting",
    "objectName" : "resource",
    "field" : "properties.tap-google-analytics.oauth_credentials.refresh_token",
    "bindingFailure" : true,
    "code" : "NotBlank"
  }, {
    "codes" : [ "NotBlank.view_id", "NotBlank" ],
    "defaultMessage" : "No value given for setting",
    "objectName" : "resource",
    "field" : "properties.tap-google-analytics.view_id",
    "bindingFailure" : true,
    "code" : "NotBlank"
  } ],
  "path" : "/api/workspaces/d9fda316-5510-46ee-8293-a24ba4230457/pipelines/validation"
}
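A 400 response can be turned into actionable feedback by walking the `errors` array shown above. The sketch below assumes only the fields present in the example (`field`, `defaultMessage`); the helper name is illustrative.

```python
# Sketch: summarise pipeline validation errors from a 400 response body.
# Uses only the "errors", "field" and "defaultMessage" keys shown in the
# example response above.

def summarise_validation_errors(body):
    """Return one '<field>: <message>' line per validation error."""
    return [
        f"{err['field']}: {err['defaultMessage']}"
        for err in body.get("errors", [])
    ]
```

For the example response above, this yields three lines, one per missing `tap-google-analytics` setting.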

Verify a pipeline

POST

/api/pipelines/{pipeline-id}/verification

Verifies the configuration of the pipeline {pipeline-id}.

Prerequisites

  • Pipeline {pipeline-id} must exist

Request

Example Snippets

cURL

curl -H "Authorization: Bearer $ACCESS_TOKEN" 'https://app.matatika.com/api/pipelines/131238f5-7fb7-4113-b7c2-426826032416/verification' -i -X POST \
    -H 'Accept: application/json, application/javascript, text/javascript, text/json' \
    -H 'Content-Type: application/json'

Python (requests)

import os

import requests

url = "https://app.matatika.com/api/pipelines/131238f5-7fb7-4113-b7c2-426826032416/verification"

headers = {
  'Authorization': f'Bearer {os.environ["ACCESS_TOKEN"]}'
}

response = requests.post(url, headers=headers)

print(response.text)

Response

200 OK

Job with HAL links.

{
  "id" : "aafee7ee-e9dc-4cd6-aae6-31e0f8405887",
  "created" : "2024-07-15T09:16:32.151047",
  "type" : "PIPELINE_VERIFY",
  "maxAttempts" : 0,
  "attempt" : 0,
  "status" : "QUEUED",
  "_embedded" : {
    "pipeline" : {
      "id" : "131238f5-7fb7-4113-b7c2-426826032416",
      "status" : "READY",
      "name" : "SIT-Generated Pipeline [2024-07-15T10:14:36.626619]",
      "timeout" : 0,
      "maxRetries" : 0,
      "created" : "2024-07-15T09:14:36.938991",
      "lastModified" : "2024-07-15T09:14:36.938992",
      "properties" : { },
      "dataComponents" : [ "tap-matatika-sit", "Warehouse", "dbt" ],
      "actions" : [ ],
      "triggeredBy" : [ ],
      "repositoryPath" : "pipelines/SIT-Generated Pipeline [2024-07-15T10:14:36.626619].yml"
    },
    "profile" : {
      "id" : "auth0|5eb0327cbfd7490bff55feeb",
      "name" : "[email protected]",
      "handle" : "@sit+prod",
      "email" : "[email protected]"
    }
  },
  "_links" : {
    "self" : {
      "href" : "https://app.matatika.com/api/jobs/aafee7ee-e9dc-4cd6-aae6-31e0f8405887"
    },
    "delete job" : {
      "href" : "https://app.matatika.com/api/jobs/aafee7ee-e9dc-4cd6-aae6-31e0f8405887",
      "type" : "DELETE"
    },
    "logs" : {
      "href" : "https://app.matatika.com/api/jobs/aafee7ee-e9dc-4cd6-aae6-31e0f8405887/logs?sequence=0",
      "type" : "GET"
    },
    "withdraw job" : {
      "href" : "https://app.matatika.com/api/jobs/aafee7ee-e9dc-4cd6-aae6-31e0f8405887/stopped",
      "type" : "PUT"
    }
  }
}
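Verification returns a job in status `QUEUED`, with a `self` HAL link for retrieving it later. A caller will typically poll that link until the status changes. Only `QUEUED` appears in the example above, so the set of terminal status values is not assumed here; this sketch (with an injectable fetch function, illustrative name) simply waits for any change from the initial status.

```python
import time

# Sketch: poll a job resource until its status leaves the initial value.
# fetch_job() should return the job as a dict, e.g. by GETting the job's
# "self" HAL link with requests. Terminal status names are not assumed.

def wait_for_status_change(fetch_job, initial="QUEUED", interval=5, attempts=10):
    job = {"status": initial}
    for _ in range(attempts):
        job = fetch_job()
        if job.get("status") != initial:
            return job          # status changed; caller inspects it
        time.sleep(interval)
    return job                  # still unchanged after all attempts
```

In practice `fetch_job` would be `lambda: requests.get(job_url, headers=headers).json()`, using the `self` link from the 200 response above.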

Delete a pipeline

DELETE

/api/pipelines/{pipeline-id}

Deletes the pipeline {pipeline-id}.

Prerequisites

  • Pipeline {pipeline-id} must exist

Request

Example Snippets

cURL

curl -H "Authorization: Bearer $ACCESS_TOKEN" 'https://app.matatika.com/api/pipelines/37da82fd-da13-4df1-acd1-eb47fb8cc074' -i -X DELETE \
    -H 'Accept: application/json, application/javascript, text/javascript, text/json' \
    -H 'Content-Type: application/json'

Python (requests)

import os

import requests

url = "https://app.matatika.com/api/pipelines/37da82fd-da13-4df1-acd1-eb47fb8cc074"

headers = {
  'Authorization': f'Bearer {os.environ["ACCESS_TOKEN"]}'
}

response = requests.delete(url, headers=headers)

print(response.status_code)

Response

204 No Content

No response body provided.


View pipeline metrics

GET

/api/pipelines/{pipeline-id}/metrics

Returns the pipeline metrics for each job of {pipeline-id}.

Prerequisites

  • Pipeline {pipeline-id} must exist

Request

Example Snippets

cURL

curl -H "Authorization: Bearer $ACCESS_TOKEN" 'https://app.matatika.com/api/pipelines/131238f5-7fb7-4113-b7c2-426826032416/metrics' -i -X GET \
    -H 'Accept: application/json, application/javascript, text/javascript, text/json' \
    -H 'Content-Type: application/json'

Python (requests)

import os

import requests

url = "https://app.matatika.com/api/pipelines/131238f5-7fb7-4113-b7c2-426826032416/metrics"

headers = {
  'Authorization': f'Bearer {os.environ["ACCESS_TOKEN"]}'
}

response = requests.get(url, headers=headers)

print(response.text)

Response

  • 200: The metrics data, one entry per job (defaults to JSON format).
[ {
  "metrics.job-created" : "2024-07-15 09:16:32",
  "metrics.value" : 0.0
}, {
  "metrics.job-created" : "2024-07-15 09:17:51",
  "metrics.value" : 6.0
} ]
  • 204: No response body; metrics are not enabled for the pipeline.
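Each entry in the 200 response pairs a `metrics.job-created` timestamp with a `metrics.value`. The timestamps are formatted `YYYY-MM-DD HH:MM:SS`, so lexicographic order matches chronological order, which the hedged sketch below (helper name illustrative) relies on to pick the most recent value.

```python
# Sketch: extract the most recent metric value from the metrics response
# shown above. Assumes the "metrics.job-created" / "metrics.value" keys
# from the example; timestamps sort chronologically as strings.

def latest_metric_value(rows):
    if not rows:
        return None             # e.g. a 204 response with no body
    latest = max(rows, key=lambda r: r["metrics.job-created"])
    return latest["metrics.value"]
```

Applied to the example response above, this selects the `2024-07-15 09:17:51` entry and returns `6.0`.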