
Pipelines

A pipeline defines a set of runnable actions, composed from datacomponents, that complete a set of tasks - for example, ELT. Pipelines run as jobs, either manually or on a predetermined schedule. Only a single pipeline can run at any given time.
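A job can also be created manually via the pipeline's create job link (shown in the example object below). The snippet that follows is a minimal sketch, not a confirmed request shape: the pipeline ID is taken from the example on this page, ACCESS_TOKEN is assumed to be an environment variable holding a bearer token, and it is assumed no request body is required.

import os

import requests

ACCESS_TOKEN = os.environ["ACCESS_TOKEN"]  # assumed bearer token
pipeline_id = "e25572cb-daa2-44d7-9f12-7b3127a4cc55"  # example pipeline ID from this page

# Trigger a manual run via the 'create job' link relation (POST .../pipelines/{id}/jobs)
url = f"https://catalog.matatika.com/api/pipelines/{pipeline_id}/jobs"
headers = {"Authorization": f"Bearer {ACCESS_TOKEN}"}

response = requests.post(url, headers=headers)  # assumes no request body is needed
response.raise_for_status()

print(response.json())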


Objects

Pipeline

| Path | Type | Format | Description |
| --- | --- | --- | --- |
| id | String | Version 4 UUID | The pipeline ID |
| status | String | Pipeline Status | The pipeline status |
| name | String | | The pipeline name |
| schedule | String | Cron | The interval at which to launch a new job - e.g. 0 0 9-17 * * MON-FRI launches a job on the hour, nine-to-five, on weekdays |
| timeout | Integer | Unsigned | The number of seconds after which the job will terminate - if set to 0, an implicit default of 300 seconds is used |
| script | String | Bash script | Custom script to execute during a job |
| created | String | ISO 8601 timestamp | When the pipeline was created |
| lastModified | String | ISO 8601 timestamp | When the pipeline was last modified |
| properties | Properties | | The pipeline properties, defined by the dataplugin settings of each datacomponent |
| dataComponents | Array of String | Array of Datacomponent names | The pipeline datacomponent names; on create/update, a dataplugin fullyQualifiedName may also be used |
| actions | Array of String | Array of Datacomponent names or commands | The pipeline actions to run during a job |
{
  "id" : "e25572cb-daa2-44d7-9f12-7b3127a4cc55",
  "status" : "READY",
  "name" : "SIT-generated pipeline [2024-04-10T15:11:37.765649] (updated)",
  "timeout" : 0,
  "maxRetries" : 0,
  "created" : "2024-04-10T14:11:38.148468",
  "lastModified" : "2024-04-10T14:11:38.148469",
  "properties" : {
    "tap-google-analytics.view_id" : "1234567890",
    "tap-google-analytics.oauth_credentials.client_secret" : "client_secret",
    "tap-google-analytics.reports" : "reports",
    "tap-google-analytics.oauth_credentials.refresh_token" : "refresh_token",
    "tap-google-analytics.start_date" : "2024-03-10T15:09:43.843155+01:00",
    "tap-google-analytics.end_date" : "2024-04-10T15:09:43.843204+01:00",
    "tap-google-analytics.oauth_credentials.client_id" : "client_id",
    "tap-google-analytics.oauth_credentials.access_token" : "access_token"
  },
  "dataComponents" : [ "tap-google-analytics", "Warehouse", "dbt" ],
  "actions" : [ ],
  "_embedded" : {
    "dataComponents" : [ {
      "id" : "0e155953-ad3b-4412-b1d3-70fdf2ae551e",
      "created" : "2024-04-10T14:09:44.004401",
      "lastModified" : "2024-04-10T14:09:44.004402",
      "name" : "tap-google-analytics",
      "properties" : { },
      "commands" : { },
      "dataPlugin" : "extractors/tap-google-analytics--matatika",
      "_embedded" : {
        "dataplugin" : {
          "id" : "15daaa43-8a54-471f-8534-84c0c9b10c2f",
          "repositoryPath" : "plugins/extractors/tap-google-analytics--matatika.lock",
          "pluginType" : "EXTRACTOR",
          "name" : "tap-google-analytics",
          "namespace" : "tap_google_analytics",
          "variant" : "matatika",
          "label" : "Google Analytics",
          "description" : "Google Analytics is a web analytics service that provides insights into website traffic and user behavior.\n\nGoogle Analytics allows website owners to track and analyze various metrics related to their website's performance, such as the number of visitors, pageviews, bounce rate, and average session duration. It also provides information on the demographics and interests of website visitors, as well as the sources of traffic, including organic search, paid search, social media, and referrals. This data can be used to optimize website content and marketing strategies, as well as to measure the effectiveness of advertising campaigns. Additionally, Google Analytics offers advanced features such as goal tracking, e-commerce tracking, and custom reporting, making it a powerful tool for businesses of all sizes.\n### Prerequisites\nTo obtain the OAuth identity provider authorization endpoint used to create and refresh tokens, you need to create a project in the Google API Console and enable the Google Analytics API. Then, you can create OAuth 2.0 credentials and configure the authorized redirect URIs. The authorization endpoint will be provided in the credentials.\n\nThe OAuth scopes you need to request access to depend on the specific data you want to access in Google Analytics. For example, if you want to read data from a specific view, you will need to request the \"https://www.googleapis.com/auth/analytics.readonly\" scope. You can find a list of available scopes in the Google Analytics API documentation.\n\nTo obtain the Access Token and OAuth Refresh Token, you need to authenticate the user and obtain their consent to access their Google Analytics data. This can be done using the Google Sign-In API or the OAuth 2.0 authorization flow. Once the user has granted access, you will receive an Access Token and a Refresh Token that you can use to make API requests.\n\nTo obtain the View ID, you need to log in to your Google Analytics account and navigate to the Admin section. From there, you can select the account, property, and view that you want to access and find the View ID in the View Settings.",
          "logoUrl" : "https://app.matatika.com/assets/images/datasource/tap-google-analytics.svg",
          "hidden" : false,
          "docs" : "https://www.matatika.com/docs/instant-insights/tap-google-analytics/",
          "pipUrl" : "git+https://github.com/Matatika/[email protected]",
          "repo" : "https://github.com/Matatika/tap-google-analytics",
          "capabilities" : [ "DISCOVER", "STATE", "CATALOG" ],
          "select" : [ ],
          "update" : { },
          "vars" : { },
          "settings" : [ {
            "name" : "oauth_credentials.authorization_url",
            "aliases" : [ ],
            "label" : "OAuth identity provider authorization endpoint used create and refresh tokens",
            "value" : "https://oauth2.googleapis.com/token",
            "kind" : "HIDDEN",
            "description" : "The endpoint used to create and refresh OAuth tokens.",
            "required" : "true",
            "protected" : false
          }, {
            "name" : "oauth_credentials.scope",
            "aliases" : [ ],
            "label" : "OAuth scopes we need to request access to",
            "value" : "profile email https://www.googleapis.com/auth/analytics.readonly",
            "kind" : "HIDDEN",
            "description" : "The specific scopes we need to request access to in order to connect to Google Analytics.",
            "required" : "true",
            "protected" : false
          }, {
            "name" : "oauth_credentials.access_token",
            "aliases" : [ ],
            "label" : "Access Token",
            "kind" : "HIDDEN",
            "description" : "The token used to authenticate and authorize API requests.",
            "required" : "true",
            "protected" : false
          }, {
            "name" : "oauth_credentials.refresh_token",
            "aliases" : [ ],
            "label" : "OAuth Refresh Token",
            "kind" : "HIDDEN",
            "description" : "The token used to refresh the access token when it expires.",
            "required" : "true",
            "protected" : false
          }, {
            "name" : "oauth_credentials.refresh_proxy_url",
            "aliases" : [ ],
            "label" : "Optional - will be called with 'oauth_credentials.refresh_token' to refresh the access token",
            "kind" : "HIDDEN",
            "description" : "An optional function that will be called to refresh the access token using the refresh token.",
            "protected" : false
          }, {
            "name" : "oauth_credentials.refresh_proxy_url_auth",
            "aliases" : [ ],
            "label" : "Optional - Sets Authorization header on 'oauth_credentials.refresh_url' request",
            "kind" : "HIDDEN",
            "description" : "An optional setting that sets the Authorization header on the refresh URL request.",
            "protected" : false
          }, {
            "name" : "oauth_credentials.client_id",
            "aliases" : [ ],
            "label" : "Optional - OAuth Client ID used if refresh_proxy_url not supplied",
            "kind" : "HIDDEN",
            "description" : "An optional OAuth Client ID used if the refresh proxy URL is not supplied.",
            "protected" : false
          }, {
            "name" : "oauth_credentials.client_secret",
            "aliases" : [ ],
            "label" : "Optional - OAuth Client Secret used if refresh_proxy_url not supplied",
            "kind" : "HIDDEN",
            "description" : "An optional OAuth Client Secret used if the refresh proxy URL is not supplied.",
            "protected" : false
          }, {
            "name" : "view_id",
            "aliases" : [ ],
            "label" : "View ID",
            "placeholder" : "Ex. 198343027",
            "kind" : "STRING",
            "description" : "The ID of the Google Analytics view to retrieve data from.",
            "required" : "true",
            "protected" : false
          }, {
            "name" : "reports",
            "aliases" : [ ],
            "label" : "Reports",
            "placeholder" : "Ex. my_report_definition.json",
            "kind" : "STRING",
            "description" : "The specific reports to retrieve data from in the Google Analytics view.",
            "protected" : false
          }, {
            "name" : "start_date",
            "aliases" : [ ],
            "label" : "Start date",
            "kind" : "DATE_ISO8601",
            "description" : "The start date for the date range of data to retrieve.",
            "protected" : false
          }, {
            "name" : "end_date",
            "aliases" : [ ],
            "label" : "End date",
            "kind" : "DATE_ISO8601",
            "description" : "The end date for the date range of data to retrieve.",
            "protected" : false
          } ],
          "variants" : [ ],
          "commands" : { },
          "matatikaHidden" : false,
          "requires" : [ ],
          "fullDescription" : "Google Analytics is a web analytics service that provides insights into website traffic and user behavior.\n\nGoogle Analytics allows website owners to track and analyze various metrics related to their website's performance, such as the number of visitors, pageviews, bounce rate, and average session duration. It also provides information on the demographics and interests of website visitors, as well as the sources of traffic, including organic search, paid search, social media, and referrals. This data can be used to optimize website content and marketing strategies, as well as to measure the effectiveness of advertising campaigns. Additionally, Google Analytics offers advanced features such as goal tracking, e-commerce tracking, and custom reporting, making it a powerful tool for businesses of all sizes.\n### Prerequisites\nTo obtain the OAuth identity provider authorization endpoint used to create and refresh tokens, you need to create a project in the Google API Console and enable the Google Analytics API. Then, you can create OAuth 2.0 credentials and configure the authorized redirect URIs. The authorization endpoint will be provided in the credentials.\n\nThe OAuth scopes you need to request access to depend on the specific data you want to access in Google Analytics. For example, if you want to read data from a specific view, you will need to request the \"https://www.googleapis.com/auth/analytics.readonly\" scope. You can find a list of available scopes in the Google Analytics API documentation.\n\nTo obtain the Access Token and OAuth Refresh Token, you need to authenticate the user and obtain their consent to access their Google Analytics data. This can be done using the Google Sign-In API or the OAuth 2.0 authorization flow. Once the user has granted access, you will receive an Access Token and a Refresh Token that you can use to make API requests.\n\nTo obtain the View ID, you need to log in to your Google Analytics account and navigate to the Admin section. From there, you can select the account, property, and view that you want to access and find the View ID in the View Settings.\n\n## Settings\n\n\n### View ID\n\nThe ID of the Google Analytics view to retrieve data from.\n\n### Reports\n\nThe specific reports to retrieve data from in the Google Analytics view.\n\n### Start date\n\nThe start date for the date range of data to retrieve.\n\n### End date\n\nThe end date for the date range of data to retrieve.",
          "_links" : {
            "self" : {
              "href" : "https://catalog.matatika.com/api/dataplugins/15daaa43-8a54-471f-8534-84c0c9b10c2f"
            },
            "update dataplugin" : {
              "href" : "https://catalog.matatika.com/api/workspaces/36863ec8-15ad-4503-98b3-bc5b4cb277a0/dataplugins/15daaa43-8a54-471f-8534-84c0c9b10c2f",
              "type" : "PUT"
            }
          }
        }
      },
      "draft" : true,
      "managed" : false,
      "_links" : {
        "self" : {
          "href" : "https://catalog.matatika.com/api/datacomponents/0e155953-ad3b-4412-b1d3-70fdf2ae551e"
        },
        "update datacomponent" : {
          "href" : "https://catalog.matatika.com/api/datacomponents/0e155953-ad3b-4412-b1d3-70fdf2ae551e"
        },
        "delete datacomponent" : {
          "href" : "https://catalog.matatika.com/api/datacomponents/0e155953-ad3b-4412-b1d3-70fdf2ae551e"
        }
      }
    }, {
      "id" : "a3154ba7-8410-4e22-ae97-52a5625a37ff",
      "created" : "2024-04-10T14:09:17.265704",
      "lastModified" : "2024-04-10T14:12:19.12056",
      "name" : "Warehouse",
      "properties" : {
        "password" : "Cx44A8mk_EqD01Her_AVuH1j8Y",
        "dbname" : "xcshyhw",
        "default_target_schema" : "analytics",
        "port" : "5432",
        "host" : "sharp-banana.postgres.database.azure.com",
        "user" : "xcshyhw"
      },
      "commands" : { },
      "dataPlugin" : "loaders/target-postgres--matatika",
      "_embedded" : {
        "dataplugin" : {
          "id" : "8eb6b56d-000e-4bcb-b909-fcd0f99c03e3",
          "repositoryPath" : "plugins/loaders/target-postgres--matatika.lock",
          "pluginType" : "LOADER",
          "name" : "target-postgres",
          "namespace" : "postgres_transferwise",
          "variant" : "matatika",
          "label" : "Postgres Warehouse",
          "description" : "Postgres Warehouse is a data warehousing solution built on top of the Postgres database management system.\n\nPostgres Warehouse is designed to handle large volumes of data and complex queries, making it an ideal solution for businesses that need to store and analyze large amounts of data. It provides a number of features that are specifically tailored to data warehousing, such as columnar storage, parallel processing, and support for advanced analytics. Additionally, Postgres Warehouse is highly scalable, allowing businesses to easily add more resources as their data needs grow. Overall, Postgres Warehouse is a powerful and flexible data warehousing solution that can help businesses make better decisions by providing them with the insights they need to succeed.\n### Prerequisites\nThe process of obtaining the required settings for connecting to a Postgres Warehouse may vary depending on the specific setup and configuration of the database. However, here are some general ways to obtain each of the required settings:\n\n- User: The user is typically created when the database is set up. You can ask the database administrator or check the database documentation to find out the username.\n- Password: The password is also typically created when the database is set up. You can ask the database administrator or check the database documentation to find out the password.\n- Host: The host is the server where the database is located. You can ask the database administrator or check the database documentation to find out the host name or IP address.\n- Port: The port is the number that the database listens on for incoming connections. The default port for Postgres is 5432, but it may be different depending on the configuration. You can ask the database administrator or check the database documentation to find out the port number.\n- Database Name: The database name is the name of the specific database you want to connect to. You can ask the database administrator or check the database documentation to find out the database name.\n- Default Target Schema: The default target schema is the schema that you want to use as the default when connecting to the database. This may be set up by the database administrator or you may need to create it yourself. You can ask the database administrator or check the database documentation to find out the default target schema.",
          "logoUrl" : "https://app.matatika.com/assets/logos/loaders/postgres.png",
          "hidden" : false,
          "docs" : "https://www.matatika.com/data-details/target-postgres/",
          "pipUrl" : "git+https://github.com/Matatika/[email protected]",
          "repo" : "git+https://github.com/Matatika/[email protected]",
          "capabilities" : [ ],
          "select" : [ ],
          "update" : { },
          "vars" : { },
          "settings" : [ {
            "name" : "user",
            "aliases" : [ "username" ],
            "label" : "User",
            "kind" : "STRING",
            "description" : "The username used to connect to the Postgres Warehouse.",
            "required" : "true",
            "protected" : false
          }, {
            "name" : "password",
            "aliases" : [ ],
            "label" : "Password",
            "kind" : "PASSWORD",
            "description" : "The password used to authenticate the user.",
            "required" : "true",
            "protected" : false
          }, {
            "name" : "host",
            "aliases" : [ "address" ],
            "label" : "Host",
            "kind" : "STRING",
            "description" : "The hostname or IP address of the Postgres Warehouse server.",
            "required" : "true",
            "protected" : false
          }, {
            "name" : "port",
            "aliases" : [ ],
            "label" : "Port",
            "value" : "5432",
            "kind" : "INTEGER",
            "description" : "The port number used to connect to the Postgres Warehouse server.",
            "required" : "true",
            "protected" : false
          }, {
            "name" : "dbname",
            "aliases" : [ "database" ],
            "label" : "Database Name",
            "kind" : "STRING",
            "description" : "The name of the database to connect to.",
            "required" : "true",
            "protected" : false
          }, {
            "name" : "default_target_schema",
            "aliases" : [ ],
            "label" : "Default Target Schema",
            "value" : "analytics",
            "kind" : "STRING",
            "description" : "The default schema to use when writing data to the Postgres Warehouse.",
            "required" : "true",
            "protected" : false
          }, {
            "name" : "ssl",
            "aliases" : [ ],
            "label" : "SSL",
            "value" : "false",
            "kind" : "HIDDEN",
            "description" : "Whether or not to use SSL encryption when connecting to the Postgres Warehouse.",
            "protected" : false,
            "value_post_processor" : "STRINGIFY"
          }, {
            "name" : "batch_size_rows",
            "aliases" : [ ],
            "label" : "Batch Size Rows",
            "value" : "100000",
            "kind" : "INTEGER",
            "description" : "The number of rows to write to the Postgres Warehouse in each batch.",
            "protected" : false
          }, {
            "name" : "underscore_camel_case_fields",
            "aliases" : [ ],
            "label" : "Underscore Camel Case Fields",
            "value" : "true",
            "kind" : "HIDDEN",
            "description" : "Whether or not to convert field names from camel case to underscore-separated format.",
            "protected" : false
          }, {
            "name" : "flush_all_streams",
            "aliases" : [ ],
            "label" : "Flush All Streams",
            "value" : "false",
            "kind" : "HIDDEN",
            "description" : "Whether or not to flush all streams to the Postgres Warehouse before closing the connection.",
            "protected" : false
          }, {
            "name" : "parallelism",
            "aliases" : [ ],
            "label" : "Parallelism",
            "value" : "0",
            "kind" : "HIDDEN",
            "description" : "The number of threads to use when writing data to the Postgres Warehouse.",
            "protected" : false
          }, {
            "name" : "parallelism_max",
            "aliases" : [ ],
            "label" : "Max Parallelism",
            "value" : "16",
            "kind" : "HIDDEN",
            "description" : "The maximum number of threads to use when writing data to the Postgres Warehouse.",
            "protected" : false
          }, {
            "name" : "default_target_schema_select_permission",
            "aliases" : [ ],
            "label" : "Default Target Schema Select Permission",
            "kind" : "HIDDEN",
            "description" : "The permission level required to select data from the default target schema.",
            "protected" : false
          }, {
            "name" : "schema_mapping",
            "aliases" : [ ],
            "label" : "Schema Mapping",
            "kind" : "HIDDEN",
            "description" : "A mapping of source schema names to target schema names.",
            "protected" : false
          }, {
            "name" : "add_metadata_columns",
            "aliases" : [ ],
            "label" : "Add Metadata Columns",
            "value" : "true",
            "kind" : "HIDDEN",
            "description" : "Whether or not to add metadata columns to the target table.",
            "protected" : false
          }, {
            "name" : "hard_delete",
            "aliases" : [ ],
            "label" : "Hard Delete",
            "value" : "false",
            "kind" : "HIDDEN",
            "description" : "Whether or not to perform hard deletes when deleting data from the Postgres Warehouse.",
            "protected" : false
          }, {
            "name" : "data_flattening_max_level",
            "aliases" : [ ],
            "label" : "Data Flattening Max Level",
            "value" : "10",
            "kind" : "HIDDEN",
            "description" : "The maximum level of nested data structures to flatten when writing data to the Postgres Warehouse.",
            "protected" : false
          }, {
            "name" : "primary_key_required",
            "aliases" : [ ],
            "label" : "Primary Key Required",
            "value" : "false",
            "kind" : "BOOLEAN",
            "description" : "Whether or not a primary key is required for the target table.",
            "protected" : false
          }, {
            "name" : "validate_records",
            "aliases" : [ ],
            "label" : "Validate Records",
            "value" : "false",
            "kind" : "BOOLEAN",
            "description" : "Whether or not to validate records before writing them to the Postgres Warehouse.",
            "protected" : false
          }, {
            "name" : "temp_dir",
            "aliases" : [ ],
            "label" : "Temporary Directory",
            "kind" : "HIDDEN",
            "description" : "The directory to use for temporary files when writing data to the Postgres Warehouse.",
            "protected" : false
          } ],
          "variants" : [ ],
          "commands" : { },
          "matatikaHidden" : false,
          "requires" : [ ],
          "fullDescription" : "Postgres Warehouse is a data warehousing solution built on top of the Postgres database management system.\n\nPostgres Warehouse is designed to handle large volumes of data and complex queries, making it an ideal solution for businesses that need to store and analyze large amounts of data. It provides a number of features that are specifically tailored to data warehousing, such as columnar storage, parallel processing, and support for advanced analytics. Additionally, Postgres Warehouse is highly scalable, allowing businesses to easily add more resources as their data needs grow. Overall, Postgres Warehouse is a powerful and flexible data warehousing solution that can help businesses make better decisions by providing them with the insights they need to succeed.\n### Prerequisites\nThe process of obtaining the required settings for connecting to a Postgres Warehouse may vary depending on the specific setup and configuration of the database. However, here are some general ways to obtain each of the required settings:\n\n- User: The user is typically created when the database is set up. You can ask the database administrator or check the database documentation to find out the username.\n- Password: The password is also typically created when the database is set up. You can ask the database administrator or check the database documentation to find out the password.\n- Host: The host is the server where the database is located. You can ask the database administrator or check the database documentation to find out the host name or IP address.\n- Port: The port is the number that the database listens on for incoming connections. The default port for Postgres is 5432, but it may be different depending on the configuration. You can ask the database administrator or check the database documentation to find out the port number.\n- Database Name: The database name is the name of the specific database you want to connect to. You can ask the database administrator or check the database documentation to find out the database name.\n- Default Target Schema: The default target schema is the schema that you want to use as the default when connecting to the database. This may be set up by the database administrator or you may need to create it yourself. You can ask the database administrator or check the database documentation to find out the default target schema.\n\n## Settings\n\n\n### User\n\nThe username used to connect to the Postgres Warehouse.\n\n### Password\n\nThe password used to authenticate the user.\n\n### Host\n\nThe hostname or IP address of the Postgres Warehouse server.\n\n### Port\n\nThe port number used to connect to the Postgres Warehouse server.\n\n### Database Name\n\nThe name of the database to connect to.\n\n### Default Target Schema\n\nThe default schema to use when writing data to the Postgres Warehouse.\n\n### Batch Size Rows\n\nThe number of rows to write to the Postgres Warehouse in each batch.\n\n### Primary Key Required\n\nWhether or not a primary key is required for the target table.\n\n### Validate Records\n\nWhether or not to validate records before writing them to the Postgres Warehouse.",
          "_links" : {
            "self" : {
              "href" : "https://catalog.matatika.com/api/dataplugins/8eb6b56d-000e-4bcb-b909-fcd0f99c03e3"
            },
            "update dataplugin" : {
              "href" : "https://catalog.matatika.com/api/workspaces/36863ec8-15ad-4503-98b3-bc5b4cb277a0/dataplugins/8eb6b56d-000e-4bcb-b909-fcd0f99c03e3",
              "type" : "PUT"
            }
          }
        }
      },
      "draft" : false,
      "managed" : true,
      "_links" : {
        "self" : {
          "href" : "https://catalog.matatika.com/api/datacomponents/a3154ba7-8410-4e22-ae97-52a5625a37ff"
        },
        "update datacomponent" : {
          "href" : "https://catalog.matatika.com/api/datacomponents/a3154ba7-8410-4e22-ae97-52a5625a37ff"
        },
        "delete datacomponent" : {
          "href" : "https://catalog.matatika.com/api/datacomponents/a3154ba7-8410-4e22-ae97-52a5625a37ff"
        }
      }
    }, {
      "id" : "a204013b-57c6-4b00-9765-7f06fe1878f8",
      "created" : "2024-04-10T14:11:34.838299",
      "lastModified" : "2024-04-10T14:11:34.838319",
      "name" : "dbt",
      "properties" : { },
      "commands" : {
        "compile" : {
          "args" : "compile",
          "description" : "Generates executable SQL from source model, test, and analysis files. Compiled SQL files are written to the target/ directory."
        },
        "seed" : {
          "args" : "seed",
          "description" : "Load data from csv files into your data warehouse."
        },
        "test" : {
          "args" : "test",
          "description" : "Runs tests on data in deployed models."
        },
        "docs-generate" : {
          "args" : "docs generate",
          "description" : "Generate documentation artifacts for your project."
        },
        "deps" : {
          "args" : "deps",
          "description" : "Pull the most recent version of the dependencies listed in packages.yml"
        },
        "run" : {
          "args" : "run",
          "description" : "Compile SQL and execute against the current target database."
        },
        "clean" : {
          "args" : "clean",
          "description" : "Delete all folders in the clean-targets list (usually the dbt_modules and target directories.)"
        },
        "snapshot" : {
          "args" : "snapshot",
          "description" : "Execute snapshots defined in your project."
        }
      },
      "dataPlugin" : "transformers/dbt--dbt-labs",
      "_embedded" : {
        "dataplugin" : {
          "id" : "e1e1cf84-a771-451b-972d-829c74e18a48",
          "repositoryPath" : "plugins/transformers/dbt--dbt-labs.lock",
          "pluginType" : "TRANSFORMER",
          "name" : "dbt",
          "namespace" : "dbt",
          "variant" : "dbt-labs",
          "label" : "dbt",
          "logoUrl" : "https://app.matatika.com/assets/images/transformer/dbt.png",
          "hidden" : false,
          "docs" : "https://www.matatika.com/data-details/dbt/",
          "pipUrl" : "dbt-core~=1.3.0 dbt-postgres~=1.3.0 dbt-snowflake~=1.3.0\n",
          "repo" : "https://github.com/dbt-labs/dbt-core",
          "capabilities" : [ ],
          "select" : [ ],
          "update" : { },
          "vars" : { },
          "settings" : [ {
            "name" : "project_dir",
            "aliases" : [ ],
            "value" : "$MELTANO_PROJECT_ROOT/transform",
            "kind" : "STRING",
            "protected" : false
          }, {
            "name" : "profiles_dir",
            "aliases" : [ ],
            "value" : "$MELTANO_PROJECT_ROOT/transform/profile",
            "kind" : "STRING",
            "env" : "DBT_PROFILES_DIR",
            "protected" : false
          }, {
            "name" : "target",
            "aliases" : [ ],
            "value" : "$MELTANO_LOAD__DIALECT",
            "kind" : "STRING",
            "protected" : false
          }, {
            "name" : "source_schema",
            "aliases" : [ ],
            "value" : "$MELTANO_LOAD__TARGET_SCHEMA",
            "kind" : "STRING",
            "protected" : false
          }, {
            "name" : "target_schema",
            "aliases" : [ ],
            "value" : "analytics",
            "kind" : "STRING",
            "protected" : false
          }, {
            "name" : "models",
            "aliases" : [ ],
            "value" : "$MELTANO_TRANSFORM__PACKAGE_NAME $MELTANO_EXTRACTOR_NAMESPACE my_meltano_project",
            "kind" : "STRING",
            "protected" : false
          } ],
          "variants" : [ ],
          "commands" : {
            "compile" : {
              "args" : "compile",
              "description" : "Generates executable SQL from source model, test, and analysis files. Compiled SQL files are written to the target/ directory."
            },
            "seed" : {
              "args" : "seed",
              "description" : "Load data from csv files into your data warehouse."
            },
            "test" : {
              "args" : "test",
              "description" : "Runs tests on data in deployed models."
            },
            "docs-generate" : {
              "args" : "docs generate",
              "description" : "Generate documentation artifacts for your project."
            },
            "deps" : {
              "args" : "deps",
              "description" : "Pull the most recent version of the dependencies listed in packages.yml"
            },
            "run" : {
              "args" : "run",
              "description" : "Compile SQL and execute against the current target database."
            },
            "clean" : {
              "args" : "clean",
              "description" : "Delete all folders in the clean-targets list (usually the dbt_modules and target directories.)"
            },
            "snapshot" : {
              "args" : "snapshot",
              "description" : "Execute snapshots defined in your project."
            }
          },
          "matatikaHidden" : false,
          "requires" : [ {
            "id" : "4b54a59d-5399-4877-9571-a6ab5c77e7d9",
            "repositoryPath" : "plugins/files/files-dbt--matatika.lock",
            "pluginType" : "FILE",
            "name" : "files-dbt",
            "namespace" : "dbt",
            "variant" : "matatika",
            "label" : "files-dbt",
            "hidden" : false,
            "pipUrl" : "git+https://github.com/Matatika/[email protected]",
            "repo" : "https://github.com/Matatika/files-dbt",
            "capabilities" : [ ],
            "select" : [ ],
            "update" : {
              "transform/profile/profiles.yml" : "true"
            },
            "vars" : { },
            "settings" : [ ],
            "variants" : [ ],
            "commands" : { },
            "matatikaHidden" : false,
            "requires" : [ ],
            "fullDescription" : ""
          } ],
          "fullDescription" : "",
          "_links" : {
            "self" : {
              "href" : "https://catalog.matatika.com/api/dataplugins/e1e1cf84-a771-451b-972d-829c74e18a48"
            },
            "update dataplugin" : {
              "href" : "https://catalog.matatika.com/api/workspaces/36863ec8-15ad-4503-98b3-bc5b4cb277a0/dataplugins/e1e1cf84-a771-451b-972d-829c74e18a48",
              "type" : "PUT"
            }
          }
        }
      },
      "draft" : false,
      "managed" : false,
      "_links" : {
        "self" : {
          "href" : "https://catalog.matatika.com/api/datacomponents/a204013b-57c6-4b00-9765-7f06fe1878f8"
        },
        "update datacomponent" : {
          "href" : "https://catalog.matatika.com/api/datacomponents/a204013b-57c6-4b00-9765-7f06fe1878f8"
        },
        "delete datacomponent" : {
          "href" : "https://catalog.matatika.com/api/datacomponents/a204013b-57c6-4b00-9765-7f06fe1878f8"
        }
      }
    } ],
    "latest job" : {
      "id" : "834fe1aa-1371-4dd0-886e-95ccecfa8698",
      "created" : "2024-04-10T14:11:38.662251",
      "type" : "WORKSPACE_CONFIG",
      "maxAttempts" : 0,
      "attempt" : 0,
      "commitId" : "d099ac12444cb1e4ce094ce4cdd9cef34f784d15",
      "exitCode" : 0,
      "status" : "COMPLETE",
      "startTime" : "2024-04-10T14:11:57.313",
      "endTime" : "2024-04-10T14:12:20.199",
      "_embedded" : {
        "pipeline" : {
          "id" : "e25572cb-daa2-44d7-9f12-7b3127a4cc55",
          "status" : "READY",
          "name" : "SIT-generated pipeline [2024-04-10T15:11:37.765649] (updated)",
          "timeout" : 0,
          "maxRetries" : 0,
          "created" : "2024-04-10T14:11:38.148468",
          "lastModified" : "2024-04-10T14:11:38.148469",
          "properties" : {
            "tap-google-analytics.view_id" : "1234567890",
            "tap-google-analytics.oauth_credentials.client_secret" : "client_secret",
            "tap-google-analytics.reports" : "reports",
            "tap-google-analytics.oauth_credentials.refresh_token" : "refresh_token",
            "tap-google-analytics.start_date" : "2024-03-10T15:09:43.843155+01:00",
            "tap-google-analytics.end_date" : "2024-04-10T15:09:43.843204+01:00",
            "tap-google-analytics.oauth_credentials.client_id" : "client_id",
            "tap-google-analytics.oauth_credentials.access_token" : "access_token"
          },
          "dataComponents" : [ "tap-google-analytics", "Warehouse", "dbt" ],
          "actions" : [ ]
        },
        "profile" : {
          "id" : "auth0|5eb0327cbfd7490bff55feeb",
          "name" : "[email protected]",
          "handle" : "@sit+prod",
          "email" : "[email protected]"
        }
      },
      "_links" : {
        "self" : {
          "href" : "https://catalog.matatika.com/api/jobs/834fe1aa-1371-4dd0-886e-95ccecfa8698"
        },
        "delete job" : {
          "href" : "https://catalog.matatika.com/api/jobs/834fe1aa-1371-4dd0-886e-95ccecfa8698",
          "type" : "DELETE"
        },
        "logs" : {
          "href" : "https://catalog.matatika.com/api/jobs/834fe1aa-1371-4dd0-886e-95ccecfa8698/logs?sequence=0",
          "type" : "GET"
        }
      }
    }
  },
  "_links" : {
    "update pipeline" : {
      "href" : "https://catalog.matatika.com/api/workspaces/36863ec8-15ad-4503-98b3-bc5b4cb277a0/pipelines/e25572cb-daa2-44d7-9f12-7b3127a4cc55",
      "type" : "PUT"
    },
    "delete pipeline" : {
      "href" : "https://catalog.matatika.com/api/pipelines/e25572cb-daa2-44d7-9f12-7b3127a4cc55",
      "type" : "DELETE"
    },
    "draft pipeline" : {
      "href" : "https://catalog.matatika.com/api/workspaces/36863ec8-15ad-4503-98b3-bc5b4cb277a0/pipelines/e25572cb-daa2-44d7-9f12-7b3127a4cc55/draft",
      "type" : "PUT"
    },
    "self" : {
      "href" : "https://catalog.matatika.com/api/pipelines/e25572cb-daa2-44d7-9f12-7b3127a4cc55"
    },
    "environment" : {
      "href" : "https://catalog.matatika.com/api/pipelines/e25572cb-daa2-44d7-9f12-7b3127a4cc55/environment"
    },
    "jobs" : {
      "href" : "https://catalog.matatika.com/api/pipelines/e25572cb-daa2-44d7-9f12-7b3127a4cc55/jobs",
      "type" : "GET"
    },
    "metrics" : {
      "href" : "https://catalog.matatika.com/api/pipelines/e25572cb-daa2-44d7-9f12-7b3127a4cc55/metrics"
    },
    "add subscription" : {
      "href" : "https://catalog.matatika.com/api/pipelines/e25572cb-daa2-44d7-9f12-7b3127a4cc55/subscriptions"
    },
    "verify pipeline" : {
      "href" : "https://catalog.matatika.com/api/pipelines/e25572cb-daa2-44d7-9f12-7b3127a4cc55/verification",
      "type" : "POST"
    },
    "create job" : {
      "href" : "https://catalog.matatika.com/api/pipelines/e25572cb-daa2-44d7-9f12-7b3127a4cc55/jobs",
      "type" : "POST"
    },
    "latest job" : {
      "href" : "https://catalog.matatika.com/api/jobs/834fe1aa-1371-4dd0-886e-95ccecfa8698"
    }
  }
}

Properties

For each setting s in the dataplugin settings of each pipeline datacomponent:

| Path | Type | Description |
| --- | --- | --- |
| s.name | s.kind | Refer to s.description |
  • Any required settings not satisfied by a datacomponent property must be provided as a pipeline property
  • Any settings that are already satisfied by a datacomponent property can be overridden by a pipeline property (see the sketch below)
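A hedged sketch of how these rules play out, using the property key pattern <datacomponent name>.<setting name> seen in the example object above; the values are placeholders and the Warehouse key is an assumed illustration of an override, not taken from the example.

# Settings already satisfied by the datacomponent itself
warehouse_datacomponent_properties = {
    "host": "sharp-banana.postgres.database.azure.com",
    "dbname": "xcshyhw",
    "default_target_schema": "analytics",
}

# Pipeline properties supply required settings the datacomponents do not,
# and may override ones they do, keyed as "<datacomponent name>.<setting name>"
pipeline_properties = {
    "tap-google-analytics.view_id": "1234567890",              # required extractor setting
    "tap-google-analytics.start_date": "2024-03-10T15:09:43+01:00",
    "Warehouse.default_target_schema": "reporting",            # assumed override of the datacomponent value
}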

Formats

Pipeline Status

| Value | Description |
| --- | --- |
| READY | The pipeline completed processing resource changes |
| PROVISIONING | The pipeline is processing resource changes |
| FAILED | The pipeline failed to process resource changes |
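A client can poll the pipeline resource until provisioning settles into READY or FAILED. A minimal sketch, assuming the pipeline self link format shown on this page and an ACCESS_TOKEN environment variable:

import os
import time

import requests

ACCESS_TOKEN = os.environ["ACCESS_TOKEN"]  # assumed bearer token
pipeline_id = "e25572cb-daa2-44d7-9f12-7b3127a4cc55"  # example pipeline ID from this page

url = f"https://catalog.matatika.com/api/pipelines/{pipeline_id}"
headers = {"Authorization": f"Bearer {ACCESS_TOKEN}"}

# Poll until the pipeline has finished processing resource changes
while True:
    status = requests.get(url, headers=headers).json()["status"]
    if status in ("READY", "FAILED"):
        break
    time.sleep(5)

print(f"Pipeline status: {status}")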

Requests



View all pipelines in a workspace

GET

/api/workspaces/{workspace-id}/pipelines

Returns all configured pipelines in the workspace {workspace-id}.

Prerequisites

  • Workspace {workspace-id} must exist

Request

Example Snippets

cURL

curl -H "Authorization: Bearer $ACCESS_TOKEN" 'https://catalog.matatika.com:443/api/workspaces/36863ec8-15ad-4503-98b3-bc5b4cb277a0/pipelines' -i -X GET \
    -H 'Accept: application/json, application/javascript, text/javascript, text/json' \
    -H 'Content-Type: application/json'

Python (requests)

import os

import requests

url = "https://catalog.matatika.com:443/api/workspaces/36863ec8-15ad-4503-98b3-bc5b4cb277a0/pipelines"

# assumes the access token is available in the environment, as in the cURL example
ACCESS_TOKEN = os.environ["ACCESS_TOKEN"]

headers = {
  'Authorization': f"Bearer {ACCESS_TOKEN}"
}

response = requests.request("GET", url, headers=headers)

print(response.text)
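As a follow-up to the snippet above (not part of the original example), each pipeline in the collection is embedded under _embedded.pipelines in the response shown below, so names and statuses can be listed like so:

# Iterate the HAL collection returned above
for pipeline in response.json()["_embedded"]["pipelines"]:
    print(pipeline["name"], pipeline["status"])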

Response

200 OK

Pipeline collection with HAL links.

{
  "_embedded" : {
    "pipelines" : [ {
      "id" : "38dcf355-5687-4841-af44-a25595545a6f",
      "status" : "READY",
      "name" : "Model lineage",
      "timeout" : 0,
      "maxRetries" : 0,
      "created" : "2024-04-10T14:11:35.618461",
      "lastModified" : "2024-04-10T14:11:35.618485",
      "properties" : { },
      "dataComponents" : [ "dbt", "dbt-artifacts", "matatika" ],
      "actions" : [ "dbt:deps", "dbt:docs-generate", "dbt-artifacts:convert-matatika", "matatika:publish" ],
      "repositoryPath" : "pipelines/Model lineage.yml",
      "_embedded" : {
        "dataComponents" : [ {
          "id" : "a204013b-57c6-4b00-9765-7f06fe1878f8",
          "created" : "2024-04-10T14:11:34.838299",
          "lastModified" : "2024-04-10T14:11:34.838319",
          "name" : "dbt",
          "properties" : { },
          "commands" : {
            "compile" : {
              "args" : "compile",
              "description" : "Generates executable SQL from source model, test, and analysis files. Compiled SQL files are written to the target/ directory."
            },
            "seed" : {
              "args" : "seed",
              "description" : "Load data from csv files into your data warehouse."
            },
            "test" : {
              "args" : "test",
              "description" : "Runs tests on data in deployed models."
            },
            "docs-generate" : {
              "args" : "docs generate",
              "description" : "Generate documentation artifacts for your project."
            },
            "deps" : {
              "args" : "deps",
              "description" : "Pull the most recent version of the dependencies listed in packages.yml"
            },
            "run" : {
              "args" : "run",
              "description" : "Compile SQL and execute against the current target database."
            },
            "clean" : {
              "args" : "clean",
              "description" : "Delete all folders in the clean-targets list (usually the dbt_modules and target directories.)"
            },
            "snapshot" : {
              "args" : "snapshot",
              "description" : "Execute snapshots defined in your project."
            }
          },
          "dataPlugin" : "transformers/dbt--dbt-labs",
          "_embedded" : {
            "dataplugin" : {
              "id" : "e1e1cf84-a771-451b-972d-829c74e18a48",
              "repositoryPath" : "plugins/transformers/dbt--dbt-labs.lock",
              "pluginType" : "TRANSFORMER",
              "name" : "dbt",
              "namespace" : "dbt",
              "variant" : "dbt-labs",
              "label" : "dbt",
              "logoUrl" : "https://app.matatika.com/assets/images/transformer/dbt.png",
              "hidden" : false,
              "docs" : "https://www.matatika.com/data-details/dbt/",
              "pipUrl" : "dbt-core~=1.3.0 dbt-postgres~=1.3.0 dbt-snowflake~=1.3.0\n",
              "repo" : "https://github.com/dbt-labs/dbt-core",
              "capabilities" : [ ],
              "select" : [ ],
              "update" : { },
              "vars" : { },
              "settings" : [ {
                "name" : "project_dir",
                "aliases" : [ ],
                "value" : "$MELTANO_PROJECT_ROOT/transform",
                "kind" : "STRING",
                "protected" : false
              }, {
                "name" : "profiles_dir",
                "aliases" : [ ],
                "value" : "$MELTANO_PROJECT_ROOT/transform/profile",
                "kind" : "STRING",
                "env" : "DBT_PROFILES_DIR",
                "protected" : false
              }, {
                "name" : "target",
                "aliases" : [ ],
                "value" : "$MELTANO_LOAD__DIALECT",
                "kind" : "STRING",
                "protected" : false
              }, {
                "name" : "source_schema",
                "aliases" : [ ],
                "value" : "$MELTANO_LOAD__TARGET_SCHEMA",
                "kind" : "STRING",
                "protected" : false
              }, {
                "name" : "target_schema",
                "aliases" : [ ],
                "value" : "analytics",
                "kind" : "STRING",
                "protected" : false
              }, {
                "name" : "models",
                "aliases" : [ ],
                "value" : "$MELTANO_TRANSFORM__PACKAGE_NAME $MELTANO_EXTRACTOR_NAMESPACE my_meltano_project",
                "kind" : "STRING",
                "protected" : false
              } ],
              "variants" : [ ],
              "commands" : {
                "compile" : {
                  "args" : "compile",
                  "description" : "Generates executable SQL from source model, test, and analysis files. Compiled SQL files are written to the target/ directory."
                },
                "seed" : {
                  "args" : "seed",
                  "description" : "Load data from csv files into your data warehouse."
                },
                "test" : {
                  "args" : "test",
                  "description" : "Runs tests on data in deployed models."
                },
                "docs-generate" : {
                  "args" : "docs generate",
                  "description" : "Generate documentation artifacts for your project."
                },
                "deps" : {
                  "args" : "deps",
                  "description" : "Pull the most recent version of the dependencies listed in packages.yml"
                },
                "run" : {
                  "args" : "run",
                  "description" : "Compile SQL and execute against the current target database."
                },
                "clean" : {
                  "args" : "clean",
                  "description" : "Delete all folders in the clean-targets list (usually the dbt_modules and target directories.)"
                },
                "snapshot" : {
                  "args" : "snapshot",
                  "description" : "Execute snapshots defined in your project."
                }
              },
              "matatikaHidden" : false,
              "requires" : [ {
                "id" : "4b54a59d-5399-4877-9571-a6ab5c77e7d9",
                "repositoryPath" : "plugins/files/files-dbt--matatika.lock",
                "pluginType" : "FILE",
                "name" : "files-dbt",
                "namespace" : "dbt",
                "variant" : "matatika",
                "label" : "files-dbt",
                "hidden" : false,
                "pipUrl" : "git+https://github.com/Matatika/[email protected]",
                "repo" : "https://github.com/Matatika/files-dbt",
                "capabilities" : [ ],
                "select" : [ ],
                "update" : {
                  "transform/profile/profiles.yml" : "true"
                },
                "vars" : { },
                "settings" : [ ],
                "variants" : [ ],
                "commands" : { },
                "matatikaHidden" : false,
                "requires" : [ ],
                "fullDescription" : ""
              } ],
              "fullDescription" : "",
              "_links" : {
                "self" : {
                  "href" : "https://catalog.matatika.com/api/dataplugins/e1e1cf84-a771-451b-972d-829c74e18a48"
                },
                "update dataplugin" : {
                  "href" : "https://catalog.matatika.com/api/workspaces/36863ec8-15ad-4503-98b3-bc5b4cb277a0/dataplugins/e1e1cf84-a771-451b-972d-829c74e18a48",
                  "type" : "PUT"
                }
              }
            }
          },
          "draft" : false,
          "managed" : false,
          "_links" : {
            "self" : {
              "href" : "https://catalog.matatika.com/api/datacomponents/a204013b-57c6-4b00-9765-7f06fe1878f8"
            },
            "update datacomponent" : {
              "href" : "https://catalog.matatika.com/api/datacomponents/a204013b-57c6-4b00-9765-7f06fe1878f8"
            },
            "delete datacomponent" : {
              "href" : "https://catalog.matatika.com/api/datacomponents/a204013b-57c6-4b00-9765-7f06fe1878f8"
            }
          }
        }, {
          "id" : "06f99e66-ccdc-48fc-ade7-7ca0b46299d0",
          "created" : "2024-04-10T14:11:34.772473",
          "lastModified" : "2024-04-10T14:11:34.77281",
          "name" : "dbt-artifacts",
          "properties" : { },
          "commands" : {
            "convert-matatika" : {
              "args" : "convert --format matatika",
              "description" : "Convert artifacts to Matatika datasets."
            },
            "convert-mermaid" : {
              "args" : "convert --format mermaid",
              "description" : "Convert artifacts to [Mermaid entity relationship diagrams](https://mermaid.js.org/syntax/entityRelationshipDiagram.html)."
            }
          },
          "dataPlugin" : "utilities/dbt-artifacts--matatika",
          "_embedded" : {
            "dataplugin" : {
              "id" : "e86cbb05-7b50-4512-a2a8-ac191b66e99a",
              "repositoryPath" : "plugins/utilities/dbt-artifacts--matatika.lock",
              "pluginType" : "UTILITY",
              "name" : "dbt-artifacts",
              "namespace" : "dbt_artifacts",
              "variant" : "matatika",
              "label" : "dbt Artifacts",
              "description" : "A tool for processing [dbt artifacts](https://docs.getdbt.com/reference/artifacts/dbt-artifacts).",
              "logoUrl" : "https://app.matatika.com/assets/logos/utilities/dbt.png",
              "hidden" : false,
              "docs" : "https://www.matatika.com/data-details/dbt-artifacts/",
              "pipUrl" : "git+https://github.com/Matatika/dbt-artifacts-ext.git",
              "repo" : "https://github.com/Matatika/dbt-artifacts-ext",
              "executable" : "dbt-artifacts_extension",
              "capabilities" : [ ],
              "select" : [ ],
              "update" : { },
              "vars" : { },
              "settings" : [ {
                "name" : "source_dir",
                "aliases" : [ ],
                "label" : "Source directory",
                "value" : ".meltano/transformers/dbt/target",
                "kind" : "STRING",
                "description" : "Source directory for the dbt manifest.json and catalog.json to get your lineage from.",
                "protected" : false
              }, {
                "name" : "output_dir",
                "aliases" : [ ],
                "label" : "Output directory",
                "value" : "output",
                "kind" : "STRING",
                "description" : "Target directory output.",
                "protected" : false
              }, {
                "name" : "resource_types",
                "aliases" : [ ],
                "label" : "Resource types",
                "value" : "[\"source\",\"model\",\"snapshot\"]",
                "kind" : "ARRAY",
                "description" : "Array of which dbt artifact to process. 'source', 'model' and 'snapshot' are frequently used, and you can also use 'all' to get every artifact found.",
                "protected" : false
              }, {
                "name" : "exclude_packages",
                "aliases" : [ ],
                "label" : "Packages to exclude",
                "value" : "[\"elementary\"]",
                "kind" : "ARRAY",
                "description" : "Array of which packages to exclude from model lineage.",
                "protected" : false
              } ],
              "variants" : [ ],
              "commands" : {
                "convert-matatika" : {
                  "args" : "convert --format matatika",
                  "description" : "Convert artifacts to Matatika datasets."
                },
                "convert-mermaid" : {
                  "args" : "convert --format mermaid",
                  "description" : "Convert artifacts to [Mermaid entity relationship diagrams](https://mermaid.js.org/syntax/entityRelationshipDiagram.html)."
                }
              },
              "matatikaHidden" : false,
              "requires" : [ ],
              "fullDescription" : "A tool for processing [dbt artifacts](https://docs.getdbt.com/reference/artifacts/dbt-artifacts).\n\n## Settings\n\n\n### Source directory\n\nSource directory for the dbt manifest.json and catalog.json to get your lineage from.\n\n### Output directory\n\nTarget directory output.\n\n### Resource types\n\nArray of which dbt artifact to process. 'source', 'model' and 'snapshot' are frequently used, and you can also use 'all' to get every artifact found.\n\n### Packages to exclude\n\nArray of which packages to exclude from model lineage.",
              "_links" : {
                "self" : {
                  "href" : "https://catalog.matatika.com/api/dataplugins/e86cbb05-7b50-4512-a2a8-ac191b66e99a"
                },
                "update dataplugin" : {
                  "href" : "https://catalog.matatika.com/api/workspaces/36863ec8-15ad-4503-98b3-bc5b4cb277a0/dataplugins/e86cbb05-7b50-4512-a2a8-ac191b66e99a",
                  "type" : "PUT"
                }
              }
            }
          },
          "draft" : false,
          "managed" : false,
          "_links" : {
            "self" : {
              "href" : "https://catalog.matatika.com/api/datacomponents/06f99e66-ccdc-48fc-ade7-7ca0b46299d0"
            },
            "update datacomponent" : {
              "href" : "https://catalog.matatika.com/api/datacomponents/06f99e66-ccdc-48fc-ade7-7ca0b46299d0"
            },
            "delete datacomponent" : {
              "href" : "https://catalog.matatika.com/api/datacomponents/06f99e66-ccdc-48fc-ade7-7ca0b46299d0"
            }
          }
        }, {
          "id" : "b8e52182-592c-491e-b653-b5ea7b9c7fed",
          "created" : "2024-04-10T14:11:34.853607",
          "lastModified" : "2024-04-10T14:11:34.853628",
          "name" : "matatika",
          "properties" : { },
          "commands" : {
            "publish" : {
              "args" : "publish $MATATIKA_DATASET_PATH",
              "description" : "Publish a matatika dataset."
            },
            "schedules" : {
              "args" : "schedules",
              "description" : "Convert Meltano jobs and schedules into Matatika pipeline yamls."
            }
          },
          "dataPlugin" : "utilities/matatika--matatika",
          "_embedded" : {
            "dataplugin" : {
              "id" : "31fee1fd-3cae-4a26-8e27-0f84db5d261a",
              "repositoryPath" : "plugins/utilities/matatika--matatika.lock",
              "pluginType" : "UTILITY",
              "name" : "matatika",
              "namespace" : "utility_matatika",
              "variant" : "matatika",
              "label" : "Matatika CLI",
              "description" : "Matatika CLI is a command-line interface tool for managing data science projects.\n\nMatatika CLI is a powerful tool that allows data scientists to manage their projects from the command line. With Matatika CLI, users can create and manage projects, upload and download data, and run experiments and analyses. The tool also provides version control and collaboration features, making it easy for teams to work together on projects. Matatika CLI is designed to be flexible and customizable, allowing users to tailor it to their specific needs. Overall, Matatika CLI is a valuable tool for any data scientist looking to streamline their workflow and improve their productivity.\n### Prerequisites\nThe dataset path is the location of the dataset you want to work with in Matatika CLI. To obtain the dataset path, you need to know where the dataset is stored in your Matatika account. You can find the dataset path by navigating to the dataset in the Matatika web interface and copying the path from the address bar of your web browser. Alternatively, you can use the `matatika datasets list` command in the Matatika CLI to list all the datasets in your account and their paths.",
              "logoUrl" : "https://app.matatika.com/assets/images/utility/matatika.png",
              "hidden" : false,
              "docs" : "https://www.matatika.com/data-details/matatika/",
              "pipUrl" : "matatika~=1.16.0",
              "capabilities" : [ ],
              "select" : [ ],
              "update" : { },
              "vars" : { },
              "settings" : [ {
                "name" : "dataset_path",
                "aliases" : [ ],
                "label" : "Dataset Path",
                "value" : "output",
                "kind" : "STRING",
                "description" : "The path to a dataset or directory of datasets to publish to your workspace. This need to be the a Matatika dataset that is present in your workspace's repository.",
                "protected" : false
              }, {
                "name" : "auth_token",
                "aliases" : [ ],
                "label" : "Auth Token",
                "value" : "$AUTH_TOKEN",
                "kind" : "STRING",
                "description" : "A unique token used to authenticate the user and grant access to the Matatika workspace.",
                "env" : "AUTH_TOKEN",
                "protected" : false
              }, {
                "name" : "workspace_id",
                "aliases" : [ ],
                "label" : "Workspace ID",
                "value" : "$WORKSPACE_ID",
                "kind" : "STRING",
                "description" : "The unique identifier for the Matatika workspace.",
                "env" : "WORKSPACE_ID",
                "protected" : false
              }, {
                "name" : "endpoint_url",
                "aliases" : [ ],
                "label" : "Endpoint URL",
                "value" : "$ENDPOINT_URL",
                "kind" : "STRING",
                "description" : "The URL used to connect to the Matatika API.",
                "env" : "ENDPOINT_URL",
                "protected" : false
              } ],
              "variants" : [ ],
              "commands" : {
                "publish" : {
                  "args" : "publish $MATATIKA_DATASET_PATH",
                  "description" : "Publish a matatika dataset."
                },
                "schedules" : {
                  "args" : "schedules",
                  "description" : "Convert Meltano jobs and schedules into Matatika pipeline yamls."
                }
              },
              "matatikaHidden" : false,
              "requires" : [ ],
              "fullDescription" : "Matatika CLI is a command-line interface tool for managing data science projects.\n\nMatatika CLI is a powerful tool that allows data scientists to manage their projects from the command line. With Matatika CLI, users can create and manage projects, upload and download data, and run experiments and analyses. The tool also provides version control and collaboration features, making it easy for teams to work together on projects. Matatika CLI is designed to be flexible and customizable, allowing users to tailor it to their specific needs. Overall, Matatika CLI is a valuable tool for any data scientist looking to streamline their workflow and improve their productivity.\n### Prerequisites\nThe dataset path is the location of the dataset you want to work with in Matatika CLI. To obtain the dataset path, you need to know where the dataset is stored in your Matatika account. You can find the dataset path by navigating to the dataset in the Matatika web interface and copying the path from the address bar of your web browser. Alternatively, you can use the `matatika datasets list` command in the Matatika CLI to list all the datasets in your account and their paths.\n\n## Settings\n\n\n### Dataset Path\n\nThe path to a dataset or directory of datasets to publish to your workspace. This need to be the a Matatika dataset that is present in your workspace's repository.\n\n### Auth Token\n\nA unique token used to authenticate the user and grant access to the Matatika workspace.\n\n### Workspace ID\n\nThe unique identifier for the Matatika workspace.\n\n### Endpoint URL\n\nThe URL used to connect to the Matatika API.",
              "_links" : {
                "self" : {
                  "href" : "https://catalog.matatika.com/api/dataplugins/31fee1fd-3cae-4a26-8e27-0f84db5d261a"
                },
                "update dataplugin" : {
                  "href" : "https://catalog.matatika.com/api/workspaces/36863ec8-15ad-4503-98b3-bc5b4cb277a0/dataplugins/31fee1fd-3cae-4a26-8e27-0f84db5d261a",
                  "type" : "PUT"
                }
              }
            }
          },
          "draft" : false,
          "managed" : false,
          "_links" : {
            "self" : {
              "href" : "https://catalog.matatika.com/api/datacomponents/b8e52182-592c-491e-b653-b5ea7b9c7fed"
            },
            "update datacomponent" : {
              "href" : "https://catalog.matatika.com/api/datacomponents/b8e52182-592c-491e-b653-b5ea7b9c7fed"
            },
            "delete datacomponent" : {
              "href" : "https://catalog.matatika.com/api/datacomponents/b8e52182-592c-491e-b653-b5ea7b9c7fed"
            }
          }
        } ]
      },
      "_links" : {
        "update pipeline" : {
          "href" : "https://catalog.matatika.com/api/workspaces/36863ec8-15ad-4503-98b3-bc5b4cb277a0/pipelines/38dcf355-5687-4841-af44-a25595545a6f",
          "type" : "PUT"
        },
        "delete pipeline" : {
          "href" : "https://catalog.matatika.com/api/pipelines/38dcf355-5687-4841-af44-a25595545a6f",
          "type" : "DELETE"
        },
        "draft pipeline" : {
          "href" : "https://catalog.matatika.com/api/workspaces/36863ec8-15ad-4503-98b3-bc5b4cb277a0/pipelines/38dcf355-5687-4841-af44-a25595545a6f/draft",
          "type" : "PUT"
        },
        "self" : {
          "href" : "https://catalog.matatika.com/api/pipelines/38dcf355-5687-4841-af44-a25595545a6f"
        },
        "environment" : {
          "href" : "https://catalog.matatika.com/api/pipelines/38dcf355-5687-4841-af44-a25595545a6f/environment"
        },
        "jobs" : {
          "href" : "https://catalog.matatika.com/api/pipelines/38dcf355-5687-4841-af44-a25595545a6f/jobs",
          "type" : "GET"
        },
        "add subscription" : {
          "href" : "https://catalog.matatika.com/api/pipelines/38dcf355-5687-4841-af44-a25595545a6f/subscriptions"
        },
        "verify pipeline" : {
          "href" : "https://catalog.matatika.com/api/pipelines/38dcf355-5687-4841-af44-a25595545a6f/verification",
          "type" : "POST"
        },
        "create job" : {
          "href" : "https://catalog.matatika.com/api/pipelines/38dcf355-5687-4841-af44-a25595545a6f/jobs",
          "type" : "POST"
        }
      }
    }, {
      "id" : "2283e04e-928e-4ac4-8634-25a642d9c66e",
      "status" : "DRAFT",
      "name" : "SIT-generated pipeline [2024-04-10T15:09:43.837414]",
      "schedule" : "0 0 0 25 12 ?",
      "timeout" : 0,
      "maxRetries" : 0,
      "created" : "2024-04-10T14:09:44.144403",
      "lastModified" : "2024-04-10T14:09:44.144404",
      "properties" : { },
      "dataComponents" : [ "tap-google-analytics", "Warehouse", "dbt" ],
      "actions" : [ ],
      "_embedded" : {
        "dataComponents" : [ {
          "id" : "0e155953-ad3b-4412-b1d3-70fdf2ae551e",
          "created" : "2024-04-10T14:09:44.004401",
          "lastModified" : "2024-04-10T14:09:44.004402",
          "name" : "tap-google-analytics",
          "properties" : { },
          "commands" : { },
          "dataPlugin" : "extractors/tap-google-analytics--matatika",
          "_embedded" : {
            "dataplugin" : {
              "id" : "15daaa43-8a54-471f-8534-84c0c9b10c2f",
              "repositoryPath" : "plugins/extractors/tap-google-analytics--matatika.lock",
              "pluginType" : "EXTRACTOR",
              "name" : "tap-google-analytics",
              "namespace" : "tap_google_analytics",
              "variant" : "matatika",
              "label" : "Google Analytics",
              "description" : "Google Analytics is a web analytics service that provides insights into website traffic and user behavior.\n\nGoogle Analytics allows website owners to track and analyze various metrics related to their website's performance, such as the number of visitors, pageviews, bounce rate, and average session duration. It also provides information on the demographics and interests of website visitors, as well as the sources of traffic, including organic search, paid search, social media, and referrals. This data can be used to optimize website content and marketing strategies, as well as to measure the effectiveness of advertising campaigns. Additionally, Google Analytics offers advanced features such as goal tracking, e-commerce tracking, and custom reporting, making it a powerful tool for businesses of all sizes.\n### Prerequisites\nTo obtain the OAuth identity provider authorization endpoint used to create and refresh tokens, you need to create a project in the Google API Console and enable the Google Analytics API. Then, you can create OAuth 2.0 credentials and configure the authorized redirect URIs. The authorization endpoint will be provided in the credentials.\n\nThe OAuth scopes you need to request access to depend on the specific data you want to access in Google Analytics. For example, if you want to read data from a specific view, you will need to request the \"https://www.googleapis.com/auth/analytics.readonly\" scope. You can find a list of available scopes in the Google Analytics API documentation.\n\nTo obtain the Access Token and OAuth Refresh Token, you need to authenticate the user and obtain their consent to access their Google Analytics data. This can be done using the Google Sign-In API or the OAuth 2.0 authorization flow. Once the user has granted access, you will receive an Access Token and a Refresh Token that you can use to make API requests.\n\nTo obtain the View ID, you need to log in to your Google Analytics account and navigate to the Admin section. From there, you can select the account, property, and view that you want to access and find the View ID in the View Settings.",
              "logoUrl" : "https://app.matatika.com/assets/images/datasource/tap-google-analytics.svg",
              "hidden" : false,
              "docs" : "https://www.matatika.com/docs/instant-insights/tap-google-analytics/",
              "pipUrl" : "git+https://github.com/Matatika/[email protected]",
              "repo" : "https://github.com/Matatika/tap-google-analytics",
              "capabilities" : [ "DISCOVER", "STATE", "CATALOG" ],
              "select" : [ ],
              "update" : { },
              "vars" : { },
              "settings" : [ {
                "name" : "oauth_credentials.authorization_url",
                "aliases" : [ ],
                "label" : "OAuth identity provider authorization endpoint used create and refresh tokens",
                "value" : "https://oauth2.googleapis.com/token",
                "kind" : "HIDDEN",
                "description" : "The endpoint used to create and refresh OAuth tokens.",
                "required" : "true",
                "protected" : false
              }, {
                "name" : "oauth_credentials.scope",
                "aliases" : [ ],
                "label" : "OAuth scopes we need to request access to",
                "value" : "profile email https://www.googleapis.com/auth/analytics.readonly",
                "kind" : "HIDDEN",
                "description" : "The specific scopes we need to request access to in order to connect to Google Analytics.",
                "required" : "true",
                "protected" : false
              }, {
                "name" : "oauth_credentials.access_token",
                "aliases" : [ ],
                "label" : "Access Token",
                "kind" : "HIDDEN",
                "description" : "The token used to authenticate and authorize API requests.",
                "required" : "true",
                "protected" : false
              }, {
                "name" : "oauth_credentials.refresh_token",
                "aliases" : [ ],
                "label" : "OAuth Refresh Token",
                "kind" : "HIDDEN",
                "description" : "The token used to refresh the access token when it expires.",
                "required" : "true",
                "protected" : false
              }, {
                "name" : "oauth_credentials.refresh_proxy_url",
                "aliases" : [ ],
                "label" : "Optional - will be called with 'oauth_credentials.refresh_token' to refresh the access token",
                "kind" : "HIDDEN",
                "description" : "An optional function that will be called to refresh the access token using the refresh token.",
                "protected" : false
              }, {
                "name" : "oauth_credentials.refresh_proxy_url_auth",
                "aliases" : [ ],
                "label" : "Optional - Sets Authorization header on 'oauth_credentials.refresh_url' request",
                "kind" : "HIDDEN",
                "description" : "An optional setting that sets the Authorization header on the refresh URL request.",
                "protected" : false
              }, {
                "name" : "oauth_credentials.client_id",
                "aliases" : [ ],
                "label" : "Optional - OAuth Client ID used if refresh_proxy_url not supplied",
                "kind" : "HIDDEN",
                "description" : "An optional OAuth Client ID used if the refresh proxy URL is not supplied.",
                "protected" : false
              }, {
                "name" : "oauth_credentials.client_secret",
                "aliases" : [ ],
                "label" : "Optional - OAuth Client Secret used if refresh_proxy_url not supplied",
                "kind" : "HIDDEN",
                "description" : "An optional OAuth Client Secret used if the refresh proxy URL is not supplied.",
                "protected" : false
              }, {
                "name" : "view_id",
                "aliases" : [ ],
                "label" : "View ID",
                "placeholder" : "Ex. 198343027",
                "kind" : "STRING",
                "description" : "The ID of the Google Analytics view to retrieve data from.",
                "required" : "true",
                "protected" : false
              }, {
                "name" : "reports",
                "aliases" : [ ],
                "label" : "Reports",
                "placeholder" : "Ex. my_report_definition.json",
                "kind" : "STRING",
                "description" : "The specific reports to retrieve data from in the Google Analytics view.",
                "protected" : false
              }, {
                "name" : "start_date",
                "aliases" : [ ],
                "label" : "Start date",
                "kind" : "DATE_ISO8601",
                "description" : "The start date for the date range of data to retrieve.",
                "protected" : false
              }, {
                "name" : "end_date",
                "aliases" : [ ],
                "label" : "End date",
                "kind" : "DATE_ISO8601",
                "description" : "The end date for the date range of data to retrieve.",
                "protected" : false
              } ],
              "variants" : [ ],
              "commands" : { },
              "matatikaHidden" : false,
              "requires" : [ ],
              "fullDescription" : "Google Analytics is a web analytics service that provides insights into website traffic and user behavior.\n\nGoogle Analytics allows website owners to track and analyze various metrics related to their website's performance, such as the number of visitors, pageviews, bounce rate, and average session duration. It also provides information on the demographics and interests of website visitors, as well as the sources of traffic, including organic search, paid search, social media, and referrals. This data can be used to optimize website content and marketing strategies, as well as to measure the effectiveness of advertising campaigns. Additionally, Google Analytics offers advanced features such as goal tracking, e-commerce tracking, and custom reporting, making it a powerful tool for businesses of all sizes.\n### Prerequisites\nTo obtain the OAuth identity provider authorization endpoint used to create and refresh tokens, you need to create a project in the Google API Console and enable the Google Analytics API. Then, you can create OAuth 2.0 credentials and configure the authorized redirect URIs. The authorization endpoint will be provided in the credentials.\n\nThe OAuth scopes you need to request access to depend on the specific data you want to access in Google Analytics. For example, if you want to read data from a specific view, you will need to request the \"https://www.googleapis.com/auth/analytics.readonly\" scope. You can find a list of available scopes in the Google Analytics API documentation.\n\nTo obtain the Access Token and OAuth Refresh Token, you need to authenticate the user and obtain their consent to access their Google Analytics data. This can be done using the Google Sign-In API or the OAuth 2.0 authorization flow. Once the user has granted access, you will receive an Access Token and a Refresh Token that you can use to make API requests.\n\nTo obtain the View ID, you need to log in to your Google Analytics account and navigate to the Admin section. From there, you can select the account, property, and view that you want to access and find the View ID in the View Settings.\n\n## Settings\n\n\n### View ID\n\nThe ID of the Google Analytics view to retrieve data from.\n\n### Reports\n\nThe specific reports to retrieve data from in the Google Analytics view.\n\n### Start date\n\nThe start date for the date range of data to retrieve.\n\n### End date\n\nThe end date for the date range of data to retrieve.",
              "_links" : {
                "self" : {
                  "href" : "https://catalog.matatika.com/api/dataplugins/15daaa43-8a54-471f-8534-84c0c9b10c2f"
                },
                "update dataplugin" : {
                  "href" : "https://catalog.matatika.com/api/workspaces/36863ec8-15ad-4503-98b3-bc5b4cb277a0/dataplugins/15daaa43-8a54-471f-8534-84c0c9b10c2f",
                  "type" : "PUT"
                }
              }
            }
          },
          "draft" : true,
          "managed" : false,
          "_links" : {
            "self" : {
              "href" : "https://catalog.matatika.com/api/datacomponents/0e155953-ad3b-4412-b1d3-70fdf2ae551e"
            },
            "update datacomponent" : {
              "href" : "https://catalog.matatika.com/api/datacomponents/0e155953-ad3b-4412-b1d3-70fdf2ae551e"
            },
            "delete datacomponent" : {
              "href" : "https://catalog.matatika.com/api/datacomponents/0e155953-ad3b-4412-b1d3-70fdf2ae551e"
            }
          }
        }, {
          "id" : "a3154ba7-8410-4e22-ae97-52a5625a37ff",
          "created" : "2024-04-10T14:09:17.265704",
          "lastModified" : "2024-04-10T14:12:19.12056",
          "name" : "Warehouse",
          "properties" : {
            "password" : "Cx44A8mk_EqD01Her_AVuH1j8Y",
            "dbname" : "xcshyhw",
            "default_target_schema" : "analytics",
            "port" : "5432",
            "host" : "sharp-banana.postgres.database.azure.com",
            "user" : "xcshyhw"
          },
          "commands" : { },
          "dataPlugin" : "loaders/target-postgres--matatika",
          "_embedded" : {
            "dataplugin" : {
              "id" : "8eb6b56d-000e-4bcb-b909-fcd0f99c03e3",
              "repositoryPath" : "plugins/loaders/target-postgres--matatika.lock",
              "pluginType" : "LOADER",
              "name" : "target-postgres",
              "namespace" : "postgres_transferwise",
              "variant" : "matatika",
              "label" : "Postgres Warehouse",
              "description" : "Postgres Warehouse is a data warehousing solution built on top of the Postgres database management system.\n\nPostgres Warehouse is designed to handle large volumes of data and complex queries, making it an ideal solution for businesses that need to store and analyze large amounts of data. It provides a number of features that are specifically tailored to data warehousing, such as columnar storage, parallel processing, and support for advanced analytics. Additionally, Postgres Warehouse is highly scalable, allowing businesses to easily add more resources as their data needs grow. Overall, Postgres Warehouse is a powerful and flexible data warehousing solution that can help businesses make better decisions by providing them with the insights they need to succeed.\n### Prerequisites\nThe process of obtaining the required settings for connecting to a Postgres Warehouse may vary depending on the specific setup and configuration of the database. However, here are some general ways to obtain each of the required settings:\n\n- User: The user is typically created when the database is set up. You can ask the database administrator or check the database documentation to find out the username.\n- Password: The password is also typically created when the database is set up. You can ask the database administrator or check the database documentation to find out the password.\n- Host: The host is the server where the database is located. You can ask the database administrator or check the database documentation to find out the host name or IP address.\n- Port: The port is the number that the database listens on for incoming connections. The default port for Postgres is 5432, but it may be different depending on the configuration. You can ask the database administrator or check the database documentation to find out the port number.\n- Database Name: The database name is the name of the specific database you want to connect to. You can ask the database administrator or check the database documentation to find out the database name.\n- Default Target Schema: The default target schema is the schema that you want to use as the default when connecting to the database. This may be set up by the database administrator or you may need to create it yourself. You can ask the database administrator or check the database documentation to find out the default target schema.",
              "logoUrl" : "https://app.matatika.com/assets/logos/loaders/postgres.png",
              "hidden" : false,
              "docs" : "https://www.matatika.com/data-details/target-postgres/",
              "pipUrl" : "git+https://github.com/Matatika/[email protected]",
              "repo" : "git+https://github.com/Matatika/[email protected]",
              "capabilities" : [ ],
              "select" : [ ],
              "update" : { },
              "vars" : { },
              "settings" : [ {
                "name" : "user",
                "aliases" : [ "username" ],
                "label" : "User",
                "kind" : "STRING",
                "description" : "The username used to connect to the Postgres Warehouse.",
                "required" : "true",
                "protected" : false
              }, {
                "name" : "password",
                "aliases" : [ ],
                "label" : "Password",
                "kind" : "PASSWORD",
                "description" : "The password used to authenticate the user.",
                "required" : "true",
                "protected" : false
              }, {
                "name" : "host",
                "aliases" : [ "address" ],
                "label" : "Host",
                "kind" : "STRING",
                "description" : "The hostname or IP address of the Postgres Warehouse server.",
                "required" : "true",
                "protected" : false
              }, {
                "name" : "port",
                "aliases" : [ ],
                "label" : "Port",
                "value" : "5432",
                "kind" : "INTEGER",
                "description" : "The port number used to connect to the Postgres Warehouse server.",
                "required" : "true",
                "protected" : false
              }, {
                "name" : "dbname",
                "aliases" : [ "database" ],
                "label" : "Database Name",
                "kind" : "STRING",
                "description" : "The name of the database to connect to.",
                "required" : "true",
                "protected" : false
              }, {
                "name" : "default_target_schema",
                "aliases" : [ ],
                "label" : "Default Target Schema",
                "value" : "analytics",
                "kind" : "STRING",
                "description" : "The default schema to use when writing data to the Postgres Warehouse.",
                "required" : "true",
                "protected" : false
              }, {
                "name" : "ssl",
                "aliases" : [ ],
                "label" : "SSL",
                "value" : "false",
                "kind" : "HIDDEN",
                "description" : "Whether or not to use SSL encryption when connecting to the Postgres Warehouse.",
                "protected" : false,
                "value_post_processor" : "STRINGIFY"
              }, {
                "name" : "batch_size_rows",
                "aliases" : [ ],
                "label" : "Batch Size Rows",
                "value" : "100000",
                "kind" : "INTEGER",
                "description" : "The number of rows to write to the Postgres Warehouse in each batch.",
                "protected" : false
              }, {
                "name" : "underscore_camel_case_fields",
                "aliases" : [ ],
                "label" : "Underscore Camel Case Fields",
                "value" : "true",
                "kind" : "HIDDEN",
                "description" : "Whether or not to convert field names from camel case to underscore-separated format.",
                "protected" : false
              }, {
                "name" : "flush_all_streams",
                "aliases" : [ ],
                "label" : "Flush All Streams",
                "value" : "false",
                "kind" : "HIDDEN",
                "description" : "Whether or not to flush all streams to the Postgres Warehouse before closing the connection.",
                "protected" : false
              }, {
                "name" : "parallelism",
                "aliases" : [ ],
                "label" : "Parallelism",
                "value" : "0",
                "kind" : "HIDDEN",
                "description" : "The number of threads to use when writing data to the Postgres Warehouse.",
                "protected" : false
              }, {
                "name" : "parallelism_max",
                "aliases" : [ ],
                "label" : "Max Parallelism",
                "value" : "16",
                "kind" : "HIDDEN",
                "description" : "The maximum number of threads to use when writing data to the Postgres Warehouse.",
                "protected" : false
              }, {
                "name" : "default_target_schema_select_permission",
                "aliases" : [ ],
                "label" : "Default Target Schema Select Permission",
                "kind" : "HIDDEN",
                "description" : "The permission level required to select data from the default target schema.",
                "protected" : false
              }, {
                "name" : "schema_mapping",
                "aliases" : [ ],
                "label" : "Schema Mapping",
                "kind" : "HIDDEN",
                "description" : "A mapping of source schema names to target schema names.",
                "protected" : false
              }, {
                "name" : "add_metadata_columns",
                "aliases" : [ ],
                "label" : "Add Metadata Columns",
                "value" : "true",
                "kind" : "HIDDEN",
                "description" : "Whether or not to add metadata columns to the target table.",
                "protected" : false
              }, {
                "name" : "hard_delete",
                "aliases" : [ ],
                "label" : "Hard Delete",
                "value" : "false",
                "kind" : "HIDDEN",
                "description" : "Whether or not to perform hard deletes when deleting data from the Postgres Warehouse.",
                "protected" : false
              }, {
                "name" : "data_flattening_max_level",
                "aliases" : [ ],
                "label" : "Data Flattening Max Level",
                "value" : "10",
                "kind" : "HIDDEN",
                "description" : "The maximum level of nested data structures to flatten when writing data to the Postgres Warehouse.",
                "protected" : false
              }, {
                "name" : "primary_key_required",
                "aliases" : [ ],
                "label" : "Primary Key Required",
                "value" : "false",
                "kind" : "BOOLEAN",
                "description" : "Whether or not a primary key is required for the target table.",
                "protected" : false
              }, {
                "name" : "validate_records",
                "aliases" : [ ],
                "label" : "Validate Records",
                "value" : "false",
                "kind" : "BOOLEAN",
                "description" : "Whether or not to validate records before writing them to the Postgres Warehouse.",
                "protected" : false
              }, {
                "name" : "temp_dir",
                "aliases" : [ ],
                "label" : "Temporary Directory",
                "kind" : "HIDDEN",
                "description" : "The directory to use for temporary files when writing data to the Postgres Warehouse.",
                "protected" : false
              } ],
              "variants" : [ ],
              "commands" : { },
              "matatikaHidden" : false,
              "requires" : [ ],
              "fullDescription" : "Postgres Warehouse is a data warehousing solution built on top of the Postgres database management system.\n\nPostgres Warehouse is designed to handle large volumes of data and complex queries, making it an ideal solution for businesses that need to store and analyze large amounts of data. It provides a number of features that are specifically tailored to data warehousing, such as columnar storage, parallel processing, and support for advanced analytics. Additionally, Postgres Warehouse is highly scalable, allowing businesses to easily add more resources as their data needs grow. Overall, Postgres Warehouse is a powerful and flexible data warehousing solution that can help businesses make better decisions by providing them with the insights they need to succeed.\n### Prerequisites\nThe process of obtaining the required settings for connecting to a Postgres Warehouse may vary depending on the specific setup and configuration of the database. However, here are some general ways to obtain each of the required settings:\n\n- User: The user is typically created when the database is set up. You can ask the database administrator or check the database documentation to find out the username.\n- Password: The password is also typically created when the database is set up. You can ask the database administrator or check the database documentation to find out the password.\n- Host: The host is the server where the database is located. You can ask the database administrator or check the database documentation to find out the host name or IP address.\n- Port: The port is the number that the database listens on for incoming connections. The default port for Postgres is 5432, but it may be different depending on the configuration. You can ask the database administrator or check the database documentation to find out the port number.\n- Database Name: The database name is the name of the specific database you want to connect to. You can ask the database administrator or check the database documentation to find out the database name.\n- Default Target Schema: The default target schema is the schema that you want to use as the default when connecting to the database. This may be set up by the database administrator or you may need to create it yourself. You can ask the database administrator or check the database documentation to find out the default target schema.\n\n## Settings\n\n\n### User\n\nThe username used to connect to the Postgres Warehouse.\n\n### Password\n\nThe password used to authenticate the user.\n\n### Host\n\nThe hostname or IP address of the Postgres Warehouse server.\n\n### Port\n\nThe port number used to connect to the Postgres Warehouse server.\n\n### Database Name\n\nThe name of the database to connect to.\n\n### Default Target Schema\n\nThe default schema to use when writing data to the Postgres Warehouse.\n\n### Batch Size Rows\n\nThe number of rows to write to the Postgres Warehouse in each batch.\n\n### Primary Key Required\n\nWhether or not a primary key is required for the target table.\n\n### Validate Records\n\nWhether or not to validate records before writing them to the Postgres Warehouse.",
              "_links" : {
                "self" : {
                  "href" : "https://catalog.matatika.com/api/dataplugins/8eb6b56d-000e-4bcb-b909-fcd0f99c03e3"
                },
                "update dataplugin" : {
                  "href" : "https://catalog.matatika.com/api/workspaces/36863ec8-15ad-4503-98b3-bc5b4cb277a0/dataplugins/8eb6b56d-000e-4bcb-b909-fcd0f99c03e3",
                  "type" : "PUT"
                }
              }
            }
          },
          "draft" : false,
          "managed" : true,
          "_links" : {
            "self" : {
              "href" : "https://catalog.matatika.com/api/datacomponents/a3154ba7-8410-4e22-ae97-52a5625a37ff"
            },
            "update datacomponent" : {
              "href" : "https://catalog.matatika.com/api/datacomponents/a3154ba7-8410-4e22-ae97-52a5625a37ff"
            },
            "delete datacomponent" : {
              "href" : "https://catalog.matatika.com/api/datacomponents/a3154ba7-8410-4e22-ae97-52a5625a37ff"
            }
          }
        }, {
          "id" : "a204013b-57c6-4b00-9765-7f06fe1878f8",
          "created" : "2024-04-10T14:11:34.838299",
          "lastModified" : "2024-04-10T14:11:34.838319",
          "name" : "dbt",
          "properties" : { },
          "commands" : {
            "compile" : {
              "args" : "compile",
              "description" : "Generates executable SQL from source model, test, and analysis files. Compiled SQL files are written to the target/ directory."
            },
            "seed" : {
              "args" : "seed",
              "description" : "Load data from csv files into your data warehouse."
            },
            "test" : {
              "args" : "test",
              "description" : "Runs tests on data in deployed models."
            },
            "docs-generate" : {
              "args" : "docs generate",
              "description" : "Generate documentation artifacts for your project."
            },
            "deps" : {
              "args" : "deps",
              "description" : "Pull the most recent version of the dependencies listed in packages.yml"
            },
            "run" : {
              "args" : "run",
              "description" : "Compile SQL and execute against the current target database."
            },
            "clean" : {
              "args" : "clean",
              "description" : "Delete all folders in the clean-targets list (usually the dbt_modules and target directories.)"
            },
            "snapshot" : {
              "args" : "snapshot",
              "description" : "Execute snapshots defined in your project."
            }
          },
          "dataPlugin" : "transformers/dbt--dbt-labs",
          "_embedded" : {
            "dataplugin" : {
              "id" : "e1e1cf84-a771-451b-972d-829c74e18a48",
              "repositoryPath" : "plugins/transformers/dbt--dbt-labs.lock",
              "pluginType" : "TRANSFORMER",
              "name" : "dbt",
              "namespace" : "dbt",
              "variant" : "dbt-labs",
              "label" : "dbt",
              "logoUrl" : "https://app.matatika.com/assets/images/transformer/dbt.png",
              "hidden" : false,
              "docs" : "https://www.matatika.com/data-details/dbt/",
              "pipUrl" : "dbt-core~=1.3.0 dbt-postgres~=1.3.0 dbt-snowflake~=1.3.0\n",
              "repo" : "https://github.com/dbt-labs/dbt-core",
              "capabilities" : [ ],
              "select" : [ ],
              "update" : { },
              "vars" : { },
              "settings" : [ {
                "name" : "project_dir",
                "aliases" : [ ],
                "value" : "$MELTANO_PROJECT_ROOT/transform",
                "kind" : "STRING",
                "protected" : false
              }, {
                "name" : "profiles_dir",
                "aliases" : [ ],
                "value" : "$MELTANO_PROJECT_ROOT/transform/profile",
                "kind" : "STRING",
                "env" : "DBT_PROFILES_DIR",
                "protected" : false
              }, {
                "name" : "target",
                "aliases" : [ ],
                "value" : "$MELTANO_LOAD__DIALECT",
                "kind" : "STRING",
                "protected" : false
              }, {
                "name" : "source_schema",
                "aliases" : [ ],
                "value" : "$MELTANO_LOAD__TARGET_SCHEMA",
                "kind" : "STRING",
                "protected" : false
              }, {
                "name" : "target_schema",
                "aliases" : [ ],
                "value" : "analytics",
                "kind" : "STRING",
                "protected" : false
              }, {
                "name" : "models",
                "aliases" : [ ],
                "value" : "$MELTANO_TRANSFORM__PACKAGE_NAME $MELTANO_EXTRACTOR_NAMESPACE my_meltano_project",
                "kind" : "STRING",
                "protected" : false
              } ],
              "variants" : [ ],
              "commands" : {
                "compile" : {
                  "args" : "compile",
                  "description" : "Generates executable SQL from source model, test, and analysis files. Compiled SQL files are written to the target/ directory."
                },
                "seed" : {
                  "args" : "seed",
                  "description" : "Load data from csv files into your data warehouse."
                },
                "test" : {
                  "args" : "test",
                  "description" : "Runs tests on data in deployed models."
                },
                "docs-generate" : {
                  "args" : "docs generate",
                  "description" : "Generate documentation artifacts for your project."
                },
                "deps" : {
                  "args" : "deps",
                  "description" : "Pull the most recent version of the dependencies listed in packages.yml"
                },
                "run" : {
                  "args" : "run",
                  "description" : "Compile SQL and execute against the current target database."
                },
                "clean" : {
                  "args" : "clean",
                  "description" : "Delete all folders in the clean-targets list (usually the dbt_modules and target directories.)"
                },
                "snapshot" : {
                  "args" : "snapshot",
                  "description" : "Execute snapshots defined in your project."
                }
              },
              "matatikaHidden" : false,
              "requires" : [ {
                "id" : "4b54a59d-5399-4877-9571-a6ab5c77e7d9",
                "repositoryPath" : "plugins/files/files-dbt--matatika.lock",
                "pluginType" : "FILE",
                "name" : "files-dbt",
                "namespace" : "dbt",
                "variant" : "matatika",
                "label" : "files-dbt",
                "hidden" : false,
                "pipUrl" : "git+https://github.com/Matatika/[email protected]",
                "repo" : "https://github.com/Matatika/files-dbt",
                "capabilities" : [ ],
                "select" : [ ],
                "update" : {
                  "transform/profile/profiles.yml" : "true"
                },
                "vars" : { },
                "settings" : [ ],
                "variants" : [ ],
                "commands" : { },
                "matatikaHidden" : false,
                "requires" : [ ],
                "fullDescription" : ""
              } ],
              "fullDescription" : "",
              "_links" : {
                "self" : {
                  "href" : "https://catalog.matatika.com/api/dataplugins/e1e1cf84-a771-451b-972d-829c74e18a48"
                },
                "update dataplugin" : {
                  "href" : "https://catalog.matatika.com/api/workspaces/36863ec8-15ad-4503-98b3-bc5b4cb277a0/dataplugins/e1e1cf84-a771-451b-972d-829c74e18a48",
                  "type" : "PUT"
                }
              }
            }
          },
          "draft" : false,
          "managed" : false,
          "_links" : {
            "self" : {
              "href" : "https://catalog.matatika.com/api/datacomponents/a204013b-57c6-4b00-9765-7f06fe1878f8"
            },
            "update datacomponent" : {
              "href" : "https://catalog.matatika.com/api/datacomponents/a204013b-57c6-4b00-9765-7f06fe1878f8"
            },
            "delete datacomponent" : {
              "href" : "https://catalog.matatika.com/api/datacomponents/a204013b-57c6-4b00-9765-7f06fe1878f8"
            }
          }
        } ],
        "latest job" : {
          "id" : "4a8b18ad-c552-4658-8cb8-b3916b087237",
          "created" : "2024-04-10T14:09:44.459264",
          "type" : "WORKSPACE_CONFIG",
          "maxAttempts" : 0,
          "attempt" : 0,
          "commitId" : "efc58bdb073770debf64a2fb7c8cfb4fba0e2496",
          "exitCode" : 0,
          "status" : "COMPLETE",
          "startTime" : "2024-04-10T14:09:58.907",
          "endTime" : "2024-04-10T14:11:36.209"
        }
      },
      "_links" : {
        "update pipeline" : {
          "href" : "https://catalog.matatika.com/api/workspaces/36863ec8-15ad-4503-98b3-bc5b4cb277a0/pipelines/2283e04e-928e-4ac4-8634-25a642d9c66e",
          "type" : "PUT"
        },
        "delete pipeline" : {
          "href" : "https://catalog.matatika.com/api/pipelines/2283e04e-928e-4ac4-8634-25a642d9c66e",
          "type" : "DELETE"
        },
        "draft pipeline" : {
          "href" : "https://catalog.matatika.com/api/workspaces/36863ec8-15ad-4503-98b3-bc5b4cb277a0/pipelines/2283e04e-928e-4ac4-8634-25a642d9c66e/draft",
          "type" : "PUT"
        },
        "self" : {
          "href" : "https://catalog.matatika.com/api/pipelines/2283e04e-928e-4ac4-8634-25a642d9c66e"
        },
        "environment" : {
          "href" : "https://catalog.matatika.com/api/pipelines/2283e04e-928e-4ac4-8634-25a642d9c66e/environment"
        },
        "jobs" : {
          "href" : "https://catalog.matatika.com/api/pipelines/2283e04e-928e-4ac4-8634-25a642d9c66e/jobs",
          "type" : "GET"
        },
        "metrics" : {
          "href" : "https://catalog.matatika.com/api/pipelines/2283e04e-928e-4ac4-8634-25a642d9c66e/metrics"
        },
        "add subscription" : {
          "href" : "https://catalog.matatika.com/api/pipelines/2283e04e-928e-4ac4-8634-25a642d9c66e/subscriptions"
        },
        "verify pipeline" : {
          "href" : "https://catalog.matatika.com/api/pipelines/2283e04e-928e-4ac4-8634-25a642d9c66e/verification",
          "type" : "POST"
        },
        "latest job" : {
          "href" : "https://catalog.matatika.com/api/jobs/4a8b18ad-c552-4658-8cb8-b3916b087237"
        }
      }
    }, {
      "id" : "e25572cb-daa2-44d7-9f12-7b3127a4cc55",
      "status" : "READY",
      "name" : "SIT-generated pipeline [2024-04-10T15:11:37.765649] (updated)",
      "timeout" : 0,
      "maxRetries" : 0,
      "created" : "2024-04-10T14:11:38.148468",
      "lastModified" : "2024-04-10T14:11:38.148469",
      "properties" : { },
      "dataComponents" : [ "tap-google-analytics", "Warehouse", "dbt" ],
      "actions" : [ ],
      "_embedded" : {
        "dataComponents" : [ {
          "id" : "0e155953-ad3b-4412-b1d3-70fdf2ae551e",
          "created" : "2024-04-10T14:09:44.004401",
          "lastModified" : "2024-04-10T14:09:44.004402",
          "name" : "tap-google-analytics",
          "properties" : { },
          "commands" : { },
          "dataPlugin" : "extractors/tap-google-analytics--matatika",
          "_embedded" : {
            "dataplugin" : {
              "id" : "15daaa43-8a54-471f-8534-84c0c9b10c2f",
              "repositoryPath" : "plugins/extractors/tap-google-analytics--matatika.lock",
              "pluginType" : "EXTRACTOR",
              "name" : "tap-google-analytics",
              "namespace" : "tap_google_analytics",
              "variant" : "matatika",
              "label" : "Google Analytics",
              "description" : "Google Analytics is a web analytics service that provides insights into website traffic and user behavior.\n\nGoogle Analytics allows website owners to track and analyze various metrics related to their website's performance, such as the number of visitors, pageviews, bounce rate, and average session duration. It also provides information on the demographics and interests of website visitors, as well as the sources of traffic, including organic search, paid search, social media, and referrals. This data can be used to optimize website content and marketing strategies, as well as to measure the effectiveness of advertising campaigns. Additionally, Google Analytics offers advanced features such as goal tracking, e-commerce tracking, and custom reporting, making it a powerful tool for businesses of all sizes.\n### Prerequisites\nTo obtain the OAuth identity provider authorization endpoint used to create and refresh tokens, you need to create a project in the Google API Console and enable the Google Analytics API. Then, you can create OAuth 2.0 credentials and configure the authorized redirect URIs. The authorization endpoint will be provided in the credentials.\n\nThe OAuth scopes you need to request access to depend on the specific data you want to access in Google Analytics. For example, if you want to read data from a specific view, you will need to request the \"https://www.googleapis.com/auth/analytics.readonly\" scope. You can find a list of available scopes in the Google Analytics API documentation.\n\nTo obtain the Access Token and OAuth Refresh Token, you need to authenticate the user and obtain their consent to access their Google Analytics data. This can be done using the Google Sign-In API or the OAuth 2.0 authorization flow. Once the user has granted access, you will receive an Access Token and a Refresh Token that you can use to make API requests.\n\nTo obtain the View ID, you need to log in to your Google Analytics account and navigate to the Admin section. From there, you can select the account, property, and view that you want to access and find the View ID in the View Settings.",
              "logoUrl" : "https://app.matatika.com/assets/images/datasource/tap-google-analytics.svg",
              "hidden" : false,
              "docs" : "https://www.matatika.com/docs/instant-insights/tap-google-analytics/",
              "pipUrl" : "git+https://github.com/Matatika/[email protected]",
              "repo" : "https://github.com/Matatika/tap-google-analytics",
              "capabilities" : [ "DISCOVER", "STATE", "CATALOG" ],
              "select" : [ ],
              "update" : { },
              "vars" : { },
              "settings" : [ {
                "name" : "oauth_credentials.authorization_url",
                "aliases" : [ ],
                "label" : "OAuth identity provider authorization endpoint used create and refresh tokens",
                "value" : "https://oauth2.googleapis.com/token",
                "kind" : "HIDDEN",
                "description" : "The endpoint used to create and refresh OAuth tokens.",
                "required" : "true",
                "protected" : false
              }, {
                "name" : "oauth_credentials.scope",
                "aliases" : [ ],
                "label" : "OAuth scopes we need to request access to",
                "value" : "profile email https://www.googleapis.com/auth/analytics.readonly",
                "kind" : "HIDDEN",
                "description" : "The specific scopes we need to request access to in order to connect to Google Analytics.",
                "required" : "true",
                "protected" : false
              }, {
                "name" : "oauth_credentials.access_token",
                "aliases" : [ ],
                "label" : "Access Token",
                "kind" : "HIDDEN",
                "description" : "The token used to authenticate and authorize API requests.",
                "required" : "true",
                "protected" : false
              }, {
                "name" : "oauth_credentials.refresh_token",
                "aliases" : [ ],
                "label" : "OAuth Refresh Token",
                "kind" : "HIDDEN",
                "description" : "The token used to refresh the access token when it expires.",
                "required" : "true",
                "protected" : false
              }, {
                "name" : "oauth_credentials.refresh_proxy_url",
                "aliases" : [ ],
                "label" : "Optional - will be called with 'oauth_credentials.refresh_token' to refresh the access token",
                "kind" : "HIDDEN",
                "description" : "An optional function that will be called to refresh the access token using the refresh token.",
                "protected" : false
              }, {
                "name" : "oauth_credentials.refresh_proxy_url_auth",
                "aliases" : [ ],
                "label" : "Optional - Sets Authorization header on 'oauth_credentials.refresh_url' request",
                "kind" : "HIDDEN",
                "description" : "An optional setting that sets the Authorization header on the refresh URL request.",
                "protected" : false
              }, {
                "name" : "oauth_credentials.client_id",
                "aliases" : [ ],
                "label" : "Optional - OAuth Client ID used if refresh_proxy_url not supplied",
                "kind" : "HIDDEN",
                "description" : "An optional OAuth Client ID used if the refresh proxy URL is not supplied.",
                "protected" : false
              }, {
                "name" : "oauth_credentials.client_secret",
                "aliases" : [ ],
                "label" : "Optional - OAuth Client Secret used if refresh_proxy_url not supplied",
                "kind" : "HIDDEN",
                "description" : "An optional OAuth Client Secret used if the refresh proxy URL is not supplied.",
                "protected" : false
              }, {
                "name" : "view_id",
                "aliases" : [ ],
                "label" : "View ID",
                "placeholder" : "Ex. 198343027",
                "kind" : "STRING",
                "description" : "The ID of the Google Analytics view to retrieve data from.",
                "required" : "true",
                "protected" : false
              }, {
                "name" : "reports",
                "aliases" : [ ],
                "label" : "Reports",
                "placeholder" : "Ex. my_report_definition.json",
                "kind" : "STRING",
                "description" : "The specific reports to retrieve data from in the Google Analytics view.",
                "protected" : false
              }, {
                "name" : "start_date",
                "aliases" : [ ],
                "label" : "Start date",
                "kind" : "DATE_ISO8601",
                "description" : "The start date for the date range of data to retrieve.",
                "protected" : false
              }, {
                "name" : "end_date",
                "aliases" : [ ],
                "label" : "End date",
                "kind" : "DATE_ISO8601",
                "description" : "The end date for the date range of data to retrieve.",
                "protected" : false
              } ],
              "variants" : [ ],
              "commands" : { },
              "matatikaHidden" : false,
              "requires" : [ ],
              "fullDescription" : "Google Analytics is a web analytics service that provides insights into website traffic and user behavior.\n\nGoogle Analytics allows website owners to track and analyze various metrics related to their website's performance, such as the number of visitors, pageviews, bounce rate, and average session duration. It also provides information on the demographics and interests of website visitors, as well as the sources of traffic, including organic search, paid search, social media, and referrals. This data can be used to optimize website content and marketing strategies, as well as to measure the effectiveness of advertising campaigns. Additionally, Google Analytics offers advanced features such as goal tracking, e-commerce tracking, and custom reporting, making it a powerful tool for businesses of all sizes.\n### Prerequisites\nTo obtain the OAuth identity provider authorization endpoint used to create and refresh tokens, you need to create a project in the Google API Console and enable the Google Analytics API. Then, you can create OAuth 2.0 credentials and configure the authorized redirect URIs. The authorization endpoint will be provided in the credentials.\n\nThe OAuth scopes you need to request access to depend on the specific data you want to access in Google Analytics. For example, if you want to read data from a specific view, you will need to request the \"https://www.googleapis.com/auth/analytics.readonly\" scope. You can find a list of available scopes in the Google Analytics API documentation.\n\nTo obtain the Access Token and OAuth Refresh Token, you need to authenticate the user and obtain their consent to access their Google Analytics data. This can be done using the Google Sign-In API or the OAuth 2.0 authorization flow. Once the user has granted access, you will receive an Access Token and a Refresh Token that you can use to make API requests.\n\nTo obtain the View ID, you need to log in to your Google Analytics account and navigate to the Admin section. From there, you can select the account, property, and view that you want to access and find the View ID in the View Settings.\n\n## Settings\n\n\n### View ID\n\nThe ID of the Google Analytics view to retrieve data from.\n\n### Reports\n\nThe specific reports to retrieve data from in the Google Analytics view.\n\n### Start date\n\nThe start date for the date range of data to retrieve.\n\n### End date\n\nThe end date for the date range of data to retrieve.",
              "_links" : {
                "self" : {
                  "href" : "https://catalog.matatika.com/api/dataplugins/15daaa43-8a54-471f-8534-84c0c9b10c2f"
                },
                "update dataplugin" : {
                  "href" : "https://catalog.matatika.com/api/workspaces/36863ec8-15ad-4503-98b3-bc5b4cb277a0/dataplugins/15daaa43-8a54-471f-8534-84c0c9b10c2f",
                  "type" : "PUT"
                }
              }
            }
          },
          "draft" : true,
          "managed" : false,
          "_links" : {
            "self" : {
              "href" : "https://catalog.matatika.com/api/datacomponents/0e155953-ad3b-4412-b1d3-70fdf2ae551e"
            },
            "update datacomponent" : {
              "href" : "https://catalog.matatika.com/api/datacomponents/0e155953-ad3b-4412-b1d3-70fdf2ae551e"
            },
            "delete datacomponent" : {
              "href" : "https://catalog.matatika.com/api/datacomponents/0e155953-ad3b-4412-b1d3-70fdf2ae551e"
            }
          }
        }, {
          "id" : "a3154ba7-8410-4e22-ae97-52a5625a37ff",
          "created" : "2024-04-10T14:09:17.265704",
          "lastModified" : "2024-04-10T14:12:19.12056",
          "name" : "Warehouse",
          "properties" : {
            "password" : "Cx44A8mk_EqD01Her_AVuH1j8Y",
            "dbname" : "xcshyhw",
            "default_target_schema" : "analytics",
            "port" : "5432",
            "host" : "sharp-banana.postgres.database.azure.com",
            "user" : "xcshyhw"
          },
          "commands" : { },
          "dataPlugin" : "loaders/target-postgres--matatika",
          "_embedded" : {
            "dataplugin" : {
              "id" : "8eb6b56d-000e-4bcb-b909-fcd0f99c03e3",
              "repositoryPath" : "plugins/loaders/target-postgres--matatika.lock",
              "pluginType" : "LOADER",
              "name" : "target-postgres",
              "namespace" : "postgres_transferwise",
              "variant" : "matatika",
              "label" : "Postgres Warehouse",
              "description" : "Postgres Warehouse is a data warehousing solution built on top of the Postgres database management system.\n\nPostgres Warehouse is designed to handle large volumes of data and complex queries, making it an ideal solution for businesses that need to store and analyze large amounts of data. It provides a number of features that are specifically tailored to data warehousing, such as columnar storage, parallel processing, and support for advanced analytics. Additionally, Postgres Warehouse is highly scalable, allowing businesses to easily add more resources as their data needs grow. Overall, Postgres Warehouse is a powerful and flexible data warehousing solution that can help businesses make better decisions by providing them with the insights they need to succeed.\n### Prerequisites\nThe process of obtaining the required settings for connecting to a Postgres Warehouse may vary depending on the specific setup and configuration of the database. However, here are some general ways to obtain each of the required settings:\n\n- User: The user is typically created when the database is set up. You can ask the database administrator or check the database documentation to find out the username.\n- Password: The password is also typically created when the database is set up. You can ask the database administrator or check the database documentation to find out the password.\n- Host: The host is the server where the database is located. You can ask the database administrator or check the database documentation to find out the host name or IP address.\n- Port: The port is the number that the database listens on for incoming connections. The default port for Postgres is 5432, but it may be different depending on the configuration. You can ask the database administrator or check the database documentation to find out the port number.\n- Database Name: The database name is the name of the specific database you want to connect to. You can ask the database administrator or check the database documentation to find out the database name.\n- Default Target Schema: The default target schema is the schema that you want to use as the default when connecting to the database. This may be set up by the database administrator or you may need to create it yourself. You can ask the database administrator or check the database documentation to find out the default target schema.",
              "logoUrl" : "https://app.matatika.com/assets/logos/loaders/postgres.png",
              "hidden" : false,
              "docs" : "https://www.matatika.com/data-details/target-postgres/",
              "pipUrl" : "git+https://github.com/Matatika/[email protected]",
              "repo" : "git+https://github.com/Matatika/[email protected]",
              "capabilities" : [ ],
              "select" : [ ],
              "update" : { },
              "vars" : { },
              "settings" : [ {
                "name" : "user",
                "aliases" : [ "username" ],
                "label" : "User",
                "kind" : "STRING",
                "description" : "The username used to connect to the Postgres Warehouse.",
                "required" : "true",
                "protected" : false
              }, {
                "name" : "password",
                "aliases" : [ ],
                "label" : "Password",
                "kind" : "PASSWORD",
                "description" : "The password used to authenticate the user.",
                "required" : "true",
                "protected" : false
              }, {
                "name" : "host",
                "aliases" : [ "address" ],
                "label" : "Host",
                "kind" : "STRING",
                "description" : "The hostname or IP address of the Postgres Warehouse server.",
                "required" : "true",
                "protected" : false
              }, {
                "name" : "port",
                "aliases" : [ ],
                "label" : "Port",
                "value" : "5432",
                "kind" : "INTEGER",
                "description" : "The port number used to connect to the Postgres Warehouse server.",
                "required" : "true",
                "protected" : false
              }, {
                "name" : "dbname",
                "aliases" : [ "database" ],
                "label" : "Database Name",
                "kind" : "STRING",
                "description" : "The name of the database to connect to.",
                "required" : "true",
                "protected" : false
              }, {
                "name" : "default_target_schema",
                "aliases" : [ ],
                "label" : "Default Target Schema",
                "value" : "analytics",
                "kind" : "STRING",
                "description" : "The default schema to use when writing data to the Postgres Warehouse.",
                "required" : "true",
                "protected" : false
              }, {
                "name" : "ssl",
                "aliases" : [ ],
                "label" : "SSL",
                "value" : "false",
                "kind" : "HIDDEN",
                "description" : "Whether or not to use SSL encryption when connecting to the Postgres Warehouse.",
                "protected" : false,
                "value_post_processor" : "STRINGIFY"
              }, {
                "name" : "batch_size_rows",
                "aliases" : [ ],
                "label" : "Batch Size Rows",
                "value" : "100000",
                "kind" : "INTEGER",
                "description" : "The number of rows to write to the Postgres Warehouse in each batch.",
                "protected" : false
              }, {
                "name" : "underscore_camel_case_fields",
                "aliases" : [ ],
                "label" : "Underscore Camel Case Fields",
                "value" : "true",
                "kind" : "HIDDEN",
                "description" : "Whether or not to convert field names from camel case to underscore-separated format.",
                "protected" : false
              }, {
                "name" : "flush_all_streams",
                "aliases" : [ ],
                "label" : "Flush All Streams",
                "value" : "false",
                "kind" : "HIDDEN",
                "description" : "Whether or not to flush all streams to the Postgres Warehouse before closing the connection.",
                "protected" : false
              }, {
                "name" : "parallelism",
                "aliases" : [ ],
                "label" : "Parallelism",
                "value" : "0",
                "kind" : "HIDDEN",
                "description" : "The number of threads to use when writing data to the Postgres Warehouse.",
                "protected" : false
              }, {
                "name" : "parallelism_max",
                "aliases" : [ ],
                "label" : "Max Parallelism",
                "value" : "16",
                "kind" : "HIDDEN",
                "description" : "The maximum number of threads to use when writing data to the Postgres Warehouse.",
                "protected" : false
              }, {
                "name" : "default_target_schema_select_permission",
                "aliases" : [ ],
                "label" : "Default Target Schema Select Permission",
                "kind" : "HIDDEN",
                "description" : "The permission level required to select data from the default target schema.",
                "protected" : false
              }, {
                "name" : "schema_mapping",
                "aliases" : [ ],
                "label" : "Schema Mapping",
                "kind" : "HIDDEN",
                "description" : "A mapping of source schema names to target schema names.",
                "protected" : false
              }, {
                "name" : "add_metadata_columns",
                "aliases" : [ ],
                "label" : "Add Metadata Columns",
                "value" : "true",
                "kind" : "HIDDEN",
                "description" : "Whether or not to add metadata columns to the target table.",
                "protected" : false
              }, {
                "name" : "hard_delete",
                "aliases" : [ ],
                "label" : "Hard Delete",
                "value" : "false",
                "kind" : "HIDDEN",
                "description" : "Whether or not to perform hard deletes when deleting data from the Postgres Warehouse.",
                "protected" : false
              }, {
                "name" : "data_flattening_max_level",
                "aliases" : [ ],
                "label" : "Data Flattening Max Level",
                "value" : "10",
                "kind" : "HIDDEN",
                "description" : "The maximum level of nested data structures to flatten when writing data to the Postgres Warehouse.",
                "protected" : false
              }, {
                "name" : "primary_key_required",
                "aliases" : [ ],
                "label" : "Primary Key Required",
                "value" : "false",
                "kind" : "BOOLEAN",
                "description" : "Whether or not a primary key is required for the target table.",
                "protected" : false
              }, {
                "name" : "validate_records",
                "aliases" : [ ],
                "label" : "Validate Records",
                "value" : "false",
                "kind" : "BOOLEAN",
                "description" : "Whether or not to validate records before writing them to the Postgres Warehouse.",
                "protected" : false
              }, {
                "name" : "temp_dir",
                "aliases" : [ ],
                "label" : "Temporary Directory",
                "kind" : "HIDDEN",
                "description" : "The directory to use for temporary files when writing data to the Postgres Warehouse.",
                "protected" : false
              } ],
              "variants" : [ ],
              "commands" : { },
              "matatikaHidden" : false,
              "requires" : [ ],
              "fullDescription" : "Postgres Warehouse is a data warehousing solution built on top of the Postgres database management system.\n\nPostgres Warehouse is designed to handle large volumes of data and complex queries, making it an ideal solution for businesses that need to store and analyze large amounts of data. It provides a number of features that are specifically tailored to data warehousing, such as columnar storage, parallel processing, and support for advanced analytics. Additionally, Postgres Warehouse is highly scalable, allowing businesses to easily add more resources as their data needs grow. Overall, Postgres Warehouse is a powerful and flexible data warehousing solution that can help businesses make better decisions by providing them with the insights they need to succeed.\n### Prerequisites\nThe process of obtaining the required settings for connecting to a Postgres Warehouse may vary depending on the specific setup and configuration of the database. However, here are some general ways to obtain each of the required settings:\n\n- User: The user is typically created when the database is set up. You can ask the database administrator or check the database documentation to find out the username.\n- Password: The password is also typically created when the database is set up. You can ask the database administrator or check the database documentation to find out the password.\n- Host: The host is the server where the database is located. You can ask the database administrator or check the database documentation to find out the host name or IP address.\n- Port: The port is the number that the database listens on for incoming connections. The default port for Postgres is 5432, but it may be different depending on the configuration. You can ask the database administrator or check the database documentation to find out the port number.\n- Database Name: The database name is the name of the specific database you want to connect to. You can ask the database administrator or check the database documentation to find out the database name.\n- Default Target Schema: The default target schema is the schema that you want to use as the default when connecting to the database. This may be set up by the database administrator or you may need to create it yourself. You can ask the database administrator or check the database documentation to find out the default target schema.\n\n## Settings\n\n\n### User\n\nThe username used to connect to the Postgres Warehouse.\n\n### Password\n\nThe password used to authenticate the user.\n\n### Host\n\nThe hostname or IP address of the Postgres Warehouse server.\n\n### Port\n\nThe port number used to connect to the Postgres Warehouse server.\n\n### Database Name\n\nThe name of the database to connect to.\n\n### Default Target Schema\n\nThe default schema to use when writing data to the Postgres Warehouse.\n\n### Batch Size Rows\n\nThe number of rows to write to the Postgres Warehouse in each batch.\n\n### Primary Key Required\n\nWhether or not a primary key is required for the target table.\n\n### Validate Records\n\nWhether or not to validate records before writing them to the Postgres Warehouse.",
              "_links" : {
                "self" : {
                  "href" : "https://catalog.matatika.com/api/dataplugins/8eb6b56d-000e-4bcb-b909-fcd0f99c03e3"
                },
                "update dataplugin" : {
                  "href" : "https://catalog.matatika.com/api/workspaces/36863ec8-15ad-4503-98b3-bc5b4cb277a0/dataplugins/8eb6b56d-000e-4bcb-b909-fcd0f99c03e3",
                  "type" : "PUT"
                }
              }
            }
          },
          "draft" : false,
          "managed" : true,
          "_links" : {
            "self" : {
              "href" : "https://catalog.matatika.com/api/datacomponents/a3154ba7-8410-4e22-ae97-52a5625a37ff"
            },
            "update datacomponent" : {
              "href" : "https://catalog.matatika.com/api/datacomponents/a3154ba7-8410-4e22-ae97-52a5625a37ff"
            },
            "delete datacomponent" : {
              "href" : "https://catalog.matatika.com/api/datacomponents/a3154ba7-8410-4e22-ae97-52a5625a37ff"
            }
          }
        }, {
          "id" : "a204013b-57c6-4b00-9765-7f06fe1878f8",
          "created" : "2024-04-10T14:11:34.838299",
          "lastModified" : "2024-04-10T14:11:34.838319",
          "name" : "dbt",
          "properties" : { },
          "commands" : {
            "compile" : {
              "args" : "compile",
              "description" : "Generates executable SQL from source model, test, and analysis files. Compiled SQL files are written to the target/ directory."
            },
            "seed" : {
              "args" : "seed",
              "description" : "Load data from csv files into your data warehouse."
            },
            "test" : {
              "args" : "test",
              "description" : "Runs tests on data in deployed models."
            },
            "docs-generate" : {
              "args" : "docs generate",
              "description" : "Generate documentation artifacts for your project."
            },
            "deps" : {
              "args" : "deps",
              "description" : "Pull the most recent version of the dependencies listed in packages.yml"
            },
            "run" : {
              "args" : "run",
              "description" : "Compile SQL and execute against the current target database."
            },
            "clean" : {
              "args" : "clean",
              "description" : "Delete all folders in the clean-targets list (usually the dbt_modules and target directories.)"
            },
            "snapshot" : {
              "args" : "snapshot",
              "description" : "Execute snapshots defined in your project."
            }
          },
          "dataPlugin" : "transformers/dbt--dbt-labs",
          "_embedded" : {
            "dataplugin" : {
              "id" : "e1e1cf84-a771-451b-972d-829c74e18a48",
              "repositoryPath" : "plugins/transformers/dbt--dbt-labs.lock",
              "pluginType" : "TRANSFORMER",
              "name" : "dbt",
              "namespace" : "dbt",
              "variant" : "dbt-labs",
              "label" : "dbt",
              "logoUrl" : "https://app.matatika.com/assets/images/transformer/dbt.png",
              "hidden" : false,
              "docs" : "https://www.matatika.com/data-details/dbt/",
              "pipUrl" : "dbt-core~=1.3.0 dbt-postgres~=1.3.0 dbt-snowflake~=1.3.0\n",
              "repo" : "https://github.com/dbt-labs/dbt-core",
              "capabilities" : [ ],
              "select" : [ ],
              "update" : { },
              "vars" : { },
              "settings" : [ {
                "name" : "project_dir",
                "aliases" : [ ],
                "value" : "$MELTANO_PROJECT_ROOT/transform",
                "kind" : "STRING",
                "protected" : false
              }, {
                "name" : "profiles_dir",
                "aliases" : [ ],
                "value" : "$MELTANO_PROJECT_ROOT/transform/profile",
                "kind" : "STRING",
                "env" : "DBT_PROFILES_DIR",
                "protected" : false
              }, {
                "name" : "target",
                "aliases" : [ ],
                "value" : "$MELTANO_LOAD__DIALECT",
                "kind" : "STRING",
                "protected" : false
              }, {
                "name" : "source_schema",
                "aliases" : [ ],
                "value" : "$MELTANO_LOAD__TARGET_SCHEMA",
                "kind" : "STRING",
                "protected" : false
              }, {
                "name" : "target_schema",
                "aliases" : [ ],
                "value" : "analytics",
                "kind" : "STRING",
                "protected" : false
              }, {
                "name" : "models",
                "aliases" : [ ],
                "value" : "$MELTANO_TRANSFORM__PACKAGE_NAME $MELTANO_EXTRACTOR_NAMESPACE my_meltano_project",
                "kind" : "STRING",
                "protected" : false
              } ],
              "variants" : [ ],
              "commands" : {
                "compile" : {
                  "args" : "compile",
                  "description" : "Generates executable SQL from source model, test, and analysis files. Compiled SQL files are written to the target/ directory."
                },
                "seed" : {
                  "args" : "seed",
                  "description" : "Load data from csv files into your data warehouse."
                },
                "test" : {
                  "args" : "test",
                  "description" : "Runs tests on data in deployed models."
                },
                "docs-generate" : {
                  "args" : "docs generate",
                  "description" : "Generate documentation artifacts for your project."
                },
                "deps" : {
                  "args" : "deps",
                  "description" : "Pull the most recent version of the dependencies listed in packages.yml"
                },
                "run" : {
                  "args" : "run",
                  "description" : "Compile SQL and execute against the current target database."
                },
                "clean" : {
                  "args" : "clean",
                  "description" : "Delete all folders in the clean-targets list (usually the dbt_modules and target directories.)"
                },
                "snapshot" : {
                  "args" : "snapshot",
                  "description" : "Execute snapshots defined in your project."
                }
              },
              "matatikaHidden" : false,
              "requires" : [ {
                "id" : "4b54a59d-5399-4877-9571-a6ab5c77e7d9",
                "repositoryPath" : "plugins/files/files-dbt--matatika.lock",
                "pluginType" : "FILE",
                "name" : "files-dbt",
                "namespace" : "dbt",
                "variant" : "matatika",
                "label" : "files-dbt",
                "hidden" : false,
                "pipUrl" : "git+https://github.com/Matatika/[email protected]",
                "repo" : "https://github.com/Matatika/files-dbt",
                "capabilities" : [ ],
                "select" : [ ],
                "update" : {
                  "transform/profile/profiles.yml" : "true"
                },
                "vars" : { },
                "settings" : [ ],
                "variants" : [ ],
                "commands" : { },
                "matatikaHidden" : false,
                "requires" : [ ],
                "fullDescription" : ""
              } ],
              "fullDescription" : "",
              "_links" : {
                "self" : {
                  "href" : "https://catalog.matatika.com/api/dataplugins/e1e1cf84-a771-451b-972d-829c74e18a48"
                },
                "update dataplugin" : {
                  "href" : "https://catalog.matatika.com/api/workspaces/36863ec8-15ad-4503-98b3-bc5b4cb277a0/dataplugins/e1e1cf84-a771-451b-972d-829c74e18a48",
                  "type" : "PUT"
                }
              }
            }
          },
          "draft" : false,
          "managed" : false,
          "_links" : {
            "self" : {
              "href" : "https://catalog.matatika.com/api/datacomponents/a204013b-57c6-4b00-9765-7f06fe1878f8"
            },
            "update datacomponent" : {
              "href" : "https://catalog.matatika.com/api/datacomponents/a204013b-57c6-4b00-9765-7f06fe1878f8"
            },
            "delete datacomponent" : {
              "href" : "https://catalog.matatika.com/api/datacomponents/a204013b-57c6-4b00-9765-7f06fe1878f8"
            }
          }
        } ],
        "latest job" : {
          "id" : "834fe1aa-1371-4dd0-886e-95ccecfa8698",
          "created" : "2024-04-10T14:11:38.662251",
          "type" : "WORKSPACE_CONFIG",
          "maxAttempts" : 0,
          "attempt" : 0,
          "commitId" : "d099ac12444cb1e4ce094ce4cdd9cef34f784d15",
          "exitCode" : 0,
          "status" : "COMPLETE",
          "startTime" : "2024-04-10T14:11:57.313",
          "endTime" : "2024-04-10T14:12:20.199"
        }
      },
      "_links" : {
        "update pipeline" : {
          "href" : "https://catalog.matatika.com/api/workspaces/36863ec8-15ad-4503-98b3-bc5b4cb277a0/pipelines/e25572cb-daa2-44d7-9f12-7b3127a4cc55",
          "type" : "PUT"
        },
        "delete pipeline" : {
          "href" : "https://catalog.matatika.com/api/pipelines/e25572cb-daa2-44d7-9f12-7b3127a4cc55",
          "type" : "DELETE"
        },
        "draft pipeline" : {
          "href" : "https://catalog.matatika.com/api/workspaces/36863ec8-15ad-4503-98b3-bc5b4cb277a0/pipelines/e25572cb-daa2-44d7-9f12-7b3127a4cc55/draft",
          "type" : "PUT"
        },
        "self" : {
          "href" : "https://catalog.matatika.com/api/pipelines/e25572cb-daa2-44d7-9f12-7b3127a4cc55"
        },
        "environment" : {
          "href" : "https://catalog.matatika.com/api/pipelines/e25572cb-daa2-44d7-9f12-7b3127a4cc55/environment"
        },
        "jobs" : {
          "href" : "https://catalog.matatika.com/api/pipelines/e25572cb-daa2-44d7-9f12-7b3127a4cc55/jobs",
          "type" : "GET"
        },
        "metrics" : {
          "href" : "https://catalog.matatika.com/api/pipelines/e25572cb-daa2-44d7-9f12-7b3127a4cc55/metrics"
        },
        "add subscription" : {
          "href" : "https://catalog.matatika.com/api/pipelines/e25572cb-daa2-44d7-9f12-7b3127a4cc55/subscriptions"
        },
        "verify pipeline" : {
          "href" : "https://catalog.matatika.com/api/pipelines/e25572cb-daa2-44d7-9f12-7b3127a4cc55/verification",
          "type" : "POST"
        },
        "create job" : {
          "href" : "https://catalog.matatika.com/api/pipelines/e25572cb-daa2-44d7-9f12-7b3127a4cc55/jobs",
          "type" : "POST"
        },
        "latest job" : {
          "href" : "https://catalog.matatika.com/api/jobs/834fe1aa-1371-4dd0-886e-95ccecfa8698"
        }
      }
    } ]
  },
  "_links" : {
    "self" : {
      "href" : "https://catalog.matatika.com/api/workspaces/36863ec8-15ad-4503-98b3-bc5b4cb277a0/pipelines?page=0&size=20&sort=name,asc"
    }
  },
  "page" : {
    "size" : 20,
    "totalElements" : 3,
    "totalPages" : 1,
    "number" : 0
  }
}
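
The collection response above is paged: the page object reports the current page number, page size, and total pages, and the self link carries the page, size and sort query parameters. Below is a minimal sketch of walking every page with requests, assuming an access token in the ACCESS_TOKEN environment variable and the workspace ID from the example above.

import os

import requests

ACCESS_TOKEN = os.environ['ACCESS_TOKEN']
WORKSPACE_ID = '36863ec8-15ad-4503-98b3-bc5b4cb277a0'

url = f"https://catalog.matatika.com/api/workspaces/{WORKSPACE_ID}/pipelines"
headers = {'Authorization': f"Bearer {ACCESS_TOKEN}"}

page = 0
while True:
    # mirror the query parameters shown in the 'self' link of the response
    params = {'page': page, 'size': 20, 'sort': 'name,asc'}
    response = requests.get(url, headers=headers, params=params)
    response.raise_for_status()
    body = response.json()

    # the paged resources are nested under '_embedded'
    for resources in body.get('_embedded', {}).values():
        for resource in resources:
            print(resource['id'], resource['name'])

    # the 'page' object reports the current page number and total pages
    if page >= body['page']['totalPages'] - 1:
        break
    page += 1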

View a pipeline

GET

/api/pipelines/{pipeline-id}

Returns the pipeline {pipeline-id}.

Prerequisites

  • Pipeline {pipeline-id} must exist

Request

Example Snippets

cURL

curl -H "Authorization: Bearer $ACCESS_TOKEN" 'https://catalog.matatika.com:443/api/pipelines/e25572cb-daa2-44d7-9f12-7b3127a4cc55' -i -X GET \
    -H 'Accept: application/json, application/javascript, text/javascript, text/json' \
    -H 'Content-Type: application/json'

Python (requests)

import os

import requests

url = "https://catalog.matatika.com:443/api/pipelines/e25572cb-daa2-44d7-9f12-7b3127a4cc55"

# read the bearer token from the environment, as in the cURL example above
ACCESS_TOKEN = os.environ['ACCESS_TOKEN']

headers = {
  'Authorization': f"Bearer {ACCESS_TOKEN}"
}

response = requests.request("GET", url, headers=headers)

print(response.text.encode('utf8'))
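
Because the response is HAL-formatted, the returned _links can be followed directly - for example, the latest job link included in the payload below. A short follow-on to the snippet above (the link is only present once the pipeline has produced a job):

pipeline = response.json()

# follow the HAL 'latest job' link to fetch the most recent job for this pipeline
latest_job_href = pipeline['_links']['latest job']['href']
job = requests.request("GET", latest_job_href, headers=headers).json()

print(job['status'], job['startTime'], job['endTime'])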

Response

200 OK

Pipeline with HAL links.

{
  "id" : "e25572cb-daa2-44d7-9f12-7b3127a4cc55",
  "status" : "READY",
  "name" : "SIT-generated pipeline [2024-04-10T15:11:37.765649] (updated)",
  "timeout" : 0,
  "maxRetries" : 0,
  "created" : "2024-04-10T14:11:38.148468",
  "lastModified" : "2024-04-10T14:11:38.148469",
  "properties" : {
    "tap-google-analytics.view_id" : "1234567890",
    "tap-google-analytics.oauth_credentials.client_secret" : "client_secret",
    "tap-google-analytics.reports" : "reports",
    "tap-google-analytics.oauth_credentials.refresh_token" : "refresh_token",
    "tap-google-analytics.start_date" : "2024-03-10T15:09:43.843155+01:00",
    "tap-google-analytics.end_date" : "2024-04-10T15:09:43.843204+01:00",
    "tap-google-analytics.oauth_credentials.client_id" : "client_id",
    "tap-google-analytics.oauth_credentials.access_token" : "access_token"
  },
  "dataComponents" : [ "tap-google-analytics", "Warehouse", "dbt" ],
  "actions" : [ ],
  "_embedded" : {
    "dataComponents" : [ {
      "id" : "0e155953-ad3b-4412-b1d3-70fdf2ae551e",
      "created" : "2024-04-10T14:09:44.004401",
      "lastModified" : "2024-04-10T14:09:44.004402",
      "name" : "tap-google-analytics",
      "properties" : { },
      "commands" : { },
      "dataPlugin" : "extractors/tap-google-analytics--matatika",
      "_embedded" : {
        "dataplugin" : {
          "id" : "15daaa43-8a54-471f-8534-84c0c9b10c2f",
          "repositoryPath" : "plugins/extractors/tap-google-analytics--matatika.lock",
          "pluginType" : "EXTRACTOR",
          "name" : "tap-google-analytics",
          "namespace" : "tap_google_analytics",
          "variant" : "matatika",
          "label" : "Google Analytics",
          "description" : "Google Analytics is a web analytics service that provides insights into website traffic and user behavior.\n\nGoogle Analytics allows website owners to track and analyze various metrics related to their website's performance, such as the number of visitors, pageviews, bounce rate, and average session duration. It also provides information on the demographics and interests of website visitors, as well as the sources of traffic, including organic search, paid search, social media, and referrals. This data can be used to optimize website content and marketing strategies, as well as to measure the effectiveness of advertising campaigns. Additionally, Google Analytics offers advanced features such as goal tracking, e-commerce tracking, and custom reporting, making it a powerful tool for businesses of all sizes.\n### Prerequisites\nTo obtain the OAuth identity provider authorization endpoint used to create and refresh tokens, you need to create a project in the Google API Console and enable the Google Analytics API. Then, you can create OAuth 2.0 credentials and configure the authorized redirect URIs. The authorization endpoint will be provided in the credentials.\n\nThe OAuth scopes you need to request access to depend on the specific data you want to access in Google Analytics. For example, if you want to read data from a specific view, you will need to request the \"https://www.googleapis.com/auth/analytics.readonly\" scope. You can find a list of available scopes in the Google Analytics API documentation.\n\nTo obtain the Access Token and OAuth Refresh Token, you need to authenticate the user and obtain their consent to access their Google Analytics data. This can be done using the Google Sign-In API or the OAuth 2.0 authorization flow. Once the user has granted access, you will receive an Access Token and a Refresh Token that you can use to make API requests.\n\nTo obtain the View ID, you need to log in to your Google Analytics account and navigate to the Admin section. From there, you can select the account, property, and view that you want to access and find the View ID in the View Settings.",
          "logoUrl" : "https://app.matatika.com/assets/images/datasource/tap-google-analytics.svg",
          "hidden" : false,
          "docs" : "https://www.matatika.com/docs/instant-insights/tap-google-analytics/",
          "pipUrl" : "git+https://github.com/Matatika/[email protected]",
          "repo" : "https://github.com/Matatika/tap-google-analytics",
          "capabilities" : [ "DISCOVER", "STATE", "CATALOG" ],
          "select" : [ ],
          "update" : { },
          "vars" : { },
          "settings" : [ {
            "name" : "oauth_credentials.authorization_url",
            "aliases" : [ ],
            "label" : "OAuth identity provider authorization endpoint used create and refresh tokens",
            "value" : "https://oauth2.googleapis.com/token",
            "kind" : "HIDDEN",
            "description" : "The endpoint used to create and refresh OAuth tokens.",
            "required" : "true",
            "protected" : false
          }, {
            "name" : "oauth_credentials.scope",
            "aliases" : [ ],
            "label" : "OAuth scopes we need to request access to",
            "value" : "profile email https://www.googleapis.com/auth/analytics.readonly",
            "kind" : "HIDDEN",
            "description" : "The specific scopes we need to request access to in order to connect to Google Analytics.",
            "required" : "true",
            "protected" : false
          }, {
            "name" : "oauth_credentials.access_token",
            "aliases" : [ ],
            "label" : "Access Token",
            "kind" : "HIDDEN",
            "description" : "The token used to authenticate and authorize API requests.",
            "required" : "true",
            "protected" : false
          }, {
            "name" : "oauth_credentials.refresh_token",
            "aliases" : [ ],
            "label" : "OAuth Refresh Token",
            "kind" : "HIDDEN",
            "description" : "The token used to refresh the access token when it expires.",
            "required" : "true",
            "protected" : false
          }, {
            "name" : "oauth_credentials.refresh_proxy_url",
            "aliases" : [ ],
            "label" : "Optional - will be called with 'oauth_credentials.refresh_token' to refresh the access token",
            "kind" : "HIDDEN",
            "description" : "An optional function that will be called to refresh the access token using the refresh token.",
            "protected" : false
          }, {
            "name" : "oauth_credentials.refresh_proxy_url_auth",
            "aliases" : [ ],
            "label" : "Optional - Sets Authorization header on 'oauth_credentials.refresh_url' request",
            "kind" : "HIDDEN",
            "description" : "An optional setting that sets the Authorization header on the refresh URL request.",
            "protected" : false
          }, {
            "name" : "oauth_credentials.client_id",
            "aliases" : [ ],
            "label" : "Optional - OAuth Client ID used if refresh_proxy_url not supplied",
            "kind" : "HIDDEN",
            "description" : "An optional OAuth Client ID used if the refresh proxy URL is not supplied.",
            "protected" : false
          }, {
            "name" : "oauth_credentials.client_secret",
            "aliases" : [ ],
            "label" : "Optional - OAuth Client Secret used if refresh_proxy_url not supplied",
            "kind" : "HIDDEN",
            "description" : "An optional OAuth Client Secret used if the refresh proxy URL is not supplied.",
            "protected" : false
          }, {
            "name" : "view_id",
            "aliases" : [ ],
            "label" : "View ID",
            "placeholder" : "Ex. 198343027",
            "kind" : "STRING",
            "description" : "The ID of the Google Analytics view to retrieve data from.",
            "required" : "true",
            "protected" : false
          }, {
            "name" : "reports",
            "aliases" : [ ],
            "label" : "Reports",
            "placeholder" : "Ex. my_report_definition.json",
            "kind" : "STRING",
            "description" : "The specific reports to retrieve data from in the Google Analytics view.",
            "protected" : false
          }, {
            "name" : "start_date",
            "aliases" : [ ],
            "label" : "Start date",
            "kind" : "DATE_ISO8601",
            "description" : "The start date for the date range of data to retrieve.",
            "protected" : false
          }, {
            "name" : "end_date",
            "aliases" : [ ],
            "label" : "End date",
            "kind" : "DATE_ISO8601",
            "description" : "The end date for the date range of data to retrieve.",
            "protected" : false
          } ],
          "variants" : [ ],
          "commands" : { },
          "matatikaHidden" : false,
          "requires" : [ ],
          "fullDescription" : "Google Analytics is a web analytics service that provides insights into website traffic and user behavior.\n\nGoogle Analytics allows website owners to track and analyze various metrics related to their website's performance, such as the number of visitors, pageviews, bounce rate, and average session duration. It also provides information on the demographics and interests of website visitors, as well as the sources of traffic, including organic search, paid search, social media, and referrals. This data can be used to optimize website content and marketing strategies, as well as to measure the effectiveness of advertising campaigns. Additionally, Google Analytics offers advanced features such as goal tracking, e-commerce tracking, and custom reporting, making it a powerful tool for businesses of all sizes.\n### Prerequisites\nTo obtain the OAuth identity provider authorization endpoint used to create and refresh tokens, you need to create a project in the Google API Console and enable the Google Analytics API. Then, you can create OAuth 2.0 credentials and configure the authorized redirect URIs. The authorization endpoint will be provided in the credentials.\n\nThe OAuth scopes you need to request access to depend on the specific data you want to access in Google Analytics. For example, if you want to read data from a specific view, you will need to request the \"https://www.googleapis.com/auth/analytics.readonly\" scope. You can find a list of available scopes in the Google Analytics API documentation.\n\nTo obtain the Access Token and OAuth Refresh Token, you need to authenticate the user and obtain their consent to access their Google Analytics data. This can be done using the Google Sign-In API or the OAuth 2.0 authorization flow. Once the user has granted access, you will receive an Access Token and a Refresh Token that you can use to make API requests.\n\nTo obtain the View ID, you need to log in to your Google Analytics account and navigate to the Admin section. From there, you can select the account, property, and view that you want to access and find the View ID in the View Settings.\n\n## Settings\n\n\n### View ID\n\nThe ID of the Google Analytics view to retrieve data from.\n\n### Reports\n\nThe specific reports to retrieve data from in the Google Analytics view.\n\n### Start date\n\nThe start date for the date range of data to retrieve.\n\n### End date\n\nThe end date for the date range of data to retrieve.",
          "_links" : {
            "self" : {
              "href" : "https://catalog.matatika.com/api/dataplugins/15daaa43-8a54-471f-8534-84c0c9b10c2f"
            },
            "update dataplugin" : {
              "href" : "https://catalog.matatika.com/api/workspaces/36863ec8-15ad-4503-98b3-bc5b4cb277a0/dataplugins/15daaa43-8a54-471f-8534-84c0c9b10c2f",
              "type" : "PUT"
            }
          }
        }
      },
      "draft" : true,
      "managed" : false,
      "_links" : {
        "self" : {
          "href" : "https://catalog.matatika.com/api/datacomponents/0e155953-ad3b-4412-b1d3-70fdf2ae551e"
        },
        "update datacomponent" : {
          "href" : "https://catalog.matatika.com/api/datacomponents/0e155953-ad3b-4412-b1d3-70fdf2ae551e"
        },
        "delete datacomponent" : {
          "href" : "https://catalog.matatika.com/api/datacomponents/0e155953-ad3b-4412-b1d3-70fdf2ae551e"
        }
      }
    }, {
      "id" : "a3154ba7-8410-4e22-ae97-52a5625a37ff",
      "created" : "2024-04-10T14:09:17.265704",
      "lastModified" : "2024-04-10T14:12:19.12056",
      "name" : "Warehouse",
      "properties" : {
        "password" : "Cx44A8mk_EqD01Her_AVuH1j8Y",
        "dbname" : "xcshyhw",
        "default_target_schema" : "analytics",
        "port" : "5432",
        "host" : "sharp-banana.postgres.database.azure.com",
        "user" : "xcshyhw"
      },
      "commands" : { },
      "dataPlugin" : "loaders/target-postgres--matatika",
      "_embedded" : {
        "dataplugin" : {
          "id" : "8eb6b56d-000e-4bcb-b909-fcd0f99c03e3",
          "repositoryPath" : "plugins/loaders/target-postgres--matatika.lock",
          "pluginType" : "LOADER",
          "name" : "target-postgres",
          "namespace" : "postgres_transferwise",
          "variant" : "matatika",
          "label" : "Postgres Warehouse",
          "description" : "Postgres Warehouse is a data warehousing solution built on top of the Postgres database management system.\n\nPostgres Warehouse is designed to handle large volumes of data and complex queries, making it an ideal solution for businesses that need to store and analyze large amounts of data. It provides a number of features that are specifically tailored to data warehousing, such as columnar storage, parallel processing, and support for advanced analytics. Additionally, Postgres Warehouse is highly scalable, allowing businesses to easily add more resources as their data needs grow. Overall, Postgres Warehouse is a powerful and flexible data warehousing solution that can help businesses make better decisions by providing them with the insights they need to succeed.\n### Prerequisites\nThe process of obtaining the required settings for connecting to a Postgres Warehouse may vary depending on the specific setup and configuration of the database. However, here are some general ways to obtain each of the required settings:\n\n- User: The user is typically created when the database is set up. You can ask the database administrator or check the database documentation to find out the username.\n- Password: The password is also typically created when the database is set up. You can ask the database administrator or check the database documentation to find out the password.\n- Host: The host is the server where the database is located. You can ask the database administrator or check the database documentation to find out the host name or IP address.\n- Port: The port is the number that the database listens on for incoming connections. The default port for Postgres is 5432, but it may be different depending on the configuration. You can ask the database administrator or check the database documentation to find out the port number.\n- Database Name: The database name is the name of the specific database you want to connect to. You can ask the database administrator or check the database documentation to find out the database name.\n- Default Target Schema: The default target schema is the schema that you want to use as the default when connecting to the database. This may be set up by the database administrator or you may need to create it yourself. You can ask the database administrator or check the database documentation to find out the default target schema.",
          "logoUrl" : "https://app.matatika.com/assets/logos/loaders/postgres.png",
          "hidden" : false,
          "docs" : "https://www.matatika.com/data-details/target-postgres/",
          "pipUrl" : "git+https://github.com/Matatika/[email protected]",
          "repo" : "git+https://github.com/Matatika/[email protected]",
          "capabilities" : [ ],
          "select" : [ ],
          "update" : { },
          "vars" : { },
          "settings" : [ {
            "name" : "user",
            "aliases" : [ "username" ],
            "label" : "User",
            "kind" : "STRING",
            "description" : "The username used to connect to the Postgres Warehouse.",
            "required" : "true",
            "protected" : false
          }, {
            "name" : "password",
            "aliases" : [ ],
            "label" : "Password",
            "kind" : "PASSWORD",
            "description" : "The password used to authenticate the user.",
            "required" : "true",
            "protected" : false
          }, {
            "name" : "host",
            "aliases" : [ "address" ],
            "label" : "Host",
            "kind" : "STRING",
            "description" : "The hostname or IP address of the Postgres Warehouse server.",
            "required" : "true",
            "protected" : false
          }, {
            "name" : "port",
            "aliases" : [ ],
            "label" : "Port",
            "value" : "5432",
            "kind" : "INTEGER",
            "description" : "The port number used to connect to the Postgres Warehouse server.",
            "required" : "true",
            "protected" : false
          }, {
            "name" : "dbname",
            "aliases" : [ "database" ],
            "label" : "Database Name",
            "kind" : "STRING",
            "description" : "The name of the database to connect to.",
            "required" : "true",
            "protected" : false
          }, {
            "name" : "default_target_schema",
            "aliases" : [ ],
            "label" : "Default Target Schema",
            "value" : "analytics",
            "kind" : "STRING",
            "description" : "The default schema to use when writing data to the Postgres Warehouse.",
            "required" : "true",
            "protected" : false
          }, {
            "name" : "ssl",
            "aliases" : [ ],
            "label" : "SSL",
            "value" : "false",
            "kind" : "HIDDEN",
            "description" : "Whether or not to use SSL encryption when connecting to the Postgres Warehouse.",
            "protected" : false,
            "value_post_processor" : "STRINGIFY"
          }, {
            "name" : "batch_size_rows",
            "aliases" : [ ],
            "label" : "Batch Size Rows",
            "value" : "100000",
            "kind" : "INTEGER",
            "description" : "The number of rows to write to the Postgres Warehouse in each batch.",
            "protected" : false
          }, {
            "name" : "underscore_camel_case_fields",
            "aliases" : [ ],
            "label" : "Underscore Camel Case Fields",
            "value" : "true",
            "kind" : "HIDDEN",
            "description" : "Whether or not to convert field names from camel case to underscore-separated format.",
            "protected" : false
          }, {
            "name" : "flush_all_streams",
            "aliases" : [ ],
            "label" : "Flush All Streams",
            "value" : "false",
            "kind" : "HIDDEN",
            "description" : "Whether or not to flush all streams to the Postgres Warehouse before closing the connection.",
            "protected" : false
          }, {
            "name" : "parallelism",
            "aliases" : [ ],
            "label" : "Parallelism",
            "value" : "0",
            "kind" : "HIDDEN",
            "description" : "The number of threads to use when writing data to the Postgres Warehouse.",
            "protected" : false
          }, {
            "name" : "parallelism_max",
            "aliases" : [ ],
            "label" : "Max Parallelism",
            "value" : "16",
            "kind" : "HIDDEN",
            "description" : "The maximum number of threads to use when writing data to the Postgres Warehouse.",
            "protected" : false
          }, {
            "name" : "default_target_schema_select_permission",
            "aliases" : [ ],
            "label" : "Default Target Schema Select Permission",
            "kind" : "HIDDEN",
            "description" : "The permission level required to select data from the default target schema.",
            "protected" : false
          }, {
            "name" : "schema_mapping",
            "aliases" : [ ],
            "label" : "Schema Mapping",
            "kind" : "HIDDEN",
            "description" : "A mapping of source schema names to target schema names.",
            "protected" : false
          }, {
            "name" : "add_metadata_columns",
            "aliases" : [ ],
            "label" : "Add Metadata Columns",
            "value" : "true",
            "kind" : "HIDDEN",
            "description" : "Whether or not to add metadata columns to the target table.",
            "protected" : false
          }, {
            "name" : "hard_delete",
            "aliases" : [ ],
            "label" : "Hard Delete",
            "value" : "false",
            "kind" : "HIDDEN",
            "description" : "Whether or not to perform hard deletes when deleting data from the Postgres Warehouse.",
            "protected" : false
          }, {
            "name" : "data_flattening_max_level",
            "aliases" : [ ],
            "label" : "Data Flattening Max Level",
            "value" : "10",
            "kind" : "HIDDEN",
            "description" : "The maximum level of nested data structures to flatten when writing data to the Postgres Warehouse.",
            "protected" : false
          }, {
            "name" : "primary_key_required",
            "aliases" : [ ],
            "label" : "Primary Key Required",
            "value" : "false",
            "kind" : "BOOLEAN",
            "description" : "Whether or not a primary key is required for the target table.",
            "protected" : false
          }, {
            "name" : "validate_records",
            "aliases" : [ ],
            "label" : "Validate Records",
            "value" : "false",
            "kind" : "BOOLEAN",
            "description" : "Whether or not to validate records before writing them to the Postgres Warehouse.",
            "protected" : false
          }, {
            "name" : "temp_dir",
            "aliases" : [ ],
            "label" : "Temporary Directory",
            "kind" : "HIDDEN",
            "description" : "The directory to use for temporary files when writing data to the Postgres Warehouse.",
            "protected" : false
          } ],
          "variants" : [ ],
          "commands" : { },
          "matatikaHidden" : false,
          "requires" : [ ],
          "fullDescription" : "Postgres Warehouse is a data warehousing solution built on top of the Postgres database management system.\n\nPostgres Warehouse is designed to handle large volumes of data and complex queries, making it an ideal solution for businesses that need to store and analyze large amounts of data. It provides a number of features that are specifically tailored to data warehousing, such as columnar storage, parallel processing, and support for advanced analytics. Additionally, Postgres Warehouse is highly scalable, allowing businesses to easily add more resources as their data needs grow. Overall, Postgres Warehouse is a powerful and flexible data warehousing solution that can help businesses make better decisions by providing them with the insights they need to succeed.\n### Prerequisites\nThe process of obtaining the required settings for connecting to a Postgres Warehouse may vary depending on the specific setup and configuration of the database. However, here are some general ways to obtain each of the required settings:\n\n- User: The user is typically created when the database is set up. You can ask the database administrator or check the database documentation to find out the username.\n- Password: The password is also typically created when the database is set up. You can ask the database administrator or check the database documentation to find out the password.\n- Host: The host is the server where the database is located. You can ask the database administrator or check the database documentation to find out the host name or IP address.\n- Port: The port is the number that the database listens on for incoming connections. The default port for Postgres is 5432, but it may be different depending on the configuration. You can ask the database administrator or check the database documentation to find out the port number.\n- Database Name: The database name is the name of the specific database you want to connect to. You can ask the database administrator or check the database documentation to find out the database name.\n- Default Target Schema: The default target schema is the schema that you want to use as the default when connecting to the database. This may be set up by the database administrator or you may need to create it yourself. You can ask the database administrator or check the database documentation to find out the default target schema.\n\n## Settings\n\n\n### User\n\nThe username used to connect to the Postgres Warehouse.\n\n### Password\n\nThe password used to authenticate the user.\n\n### Host\n\nThe hostname or IP address of the Postgres Warehouse server.\n\n### Port\n\nThe port number used to connect to the Postgres Warehouse server.\n\n### Database Name\n\nThe name of the database to connect to.\n\n### Default Target Schema\n\nThe default schema to use when writing data to the Postgres Warehouse.\n\n### Batch Size Rows\n\nThe number of rows to write to the Postgres Warehouse in each batch.\n\n### Primary Key Required\n\nWhether or not a primary key is required for the target table.\n\n### Validate Records\n\nWhether or not to validate records before writing them to the Postgres Warehouse.",
          "_links" : {
            "self" : {
              "href" : "https://catalog.matatika.com/api/dataplugins/8eb6b56d-000e-4bcb-b909-fcd0f99c03e3"
            },
            "update dataplugin" : {
              "href" : "https://catalog.matatika.com/api/workspaces/36863ec8-15ad-4503-98b3-bc5b4cb277a0/dataplugins/8eb6b56d-000e-4bcb-b909-fcd0f99c03e3",
              "type" : "PUT"
            }
          }
        }
      },
      "draft" : false,
      "managed" : true,
      "_links" : {
        "self" : {
          "href" : "https://catalog.matatika.com/api/datacomponents/a3154ba7-8410-4e22-ae97-52a5625a37ff"
        },
        "update datacomponent" : {
          "href" : "https://catalog.matatika.com/api/datacomponents/a3154ba7-8410-4e22-ae97-52a5625a37ff"
        },
        "delete datacomponent" : {
          "href" : "https://catalog.matatika.com/api/datacomponents/a3154ba7-8410-4e22-ae97-52a5625a37ff"
        }
      }
    }, {
      "id" : "a204013b-57c6-4b00-9765-7f06fe1878f8",
      "created" : "2024-04-10T14:11:34.838299",
      "lastModified" : "2024-04-10T14:11:34.838319",
      "name" : "dbt",
      "properties" : { },
      "commands" : {
        "compile" : {
          "args" : "compile",
          "description" : "Generates executable SQL from source model, test, and analysis files. Compiled SQL files are written to the target/ directory."
        },
        "seed" : {
          "args" : "seed",
          "description" : "Load data from csv files into your data warehouse."
        },
        "test" : {
          "args" : "test",
          "description" : "Runs tests on data in deployed models."
        },
        "docs-generate" : {
          "args" : "docs generate",
          "description" : "Generate documentation artifacts for your project."
        },
        "deps" : {
          "args" : "deps",
          "description" : "Pull the most recent version of the dependencies listed in packages.yml"
        },
        "run" : {
          "args" : "run",
          "description" : "Compile SQL and execute against the current target database."
        },
        "clean" : {
          "args" : "clean",
          "description" : "Delete all folders in the clean-targets list (usually the dbt_modules and target directories.)"
        },
        "snapshot" : {
          "args" : "snapshot",
          "description" : "Execute snapshots defined in your project."
        }
      },
      "dataPlugin" : "transformers/dbt--dbt-labs",
      "_embedded" : {
        "dataplugin" : {
          "id" : "e1e1cf84-a771-451b-972d-829c74e18a48",
          "repositoryPath" : "plugins/transformers/dbt--dbt-labs.lock",
          "pluginType" : "TRANSFORMER",
          "name" : "dbt",
          "namespace" : "dbt",
          "variant" : "dbt-labs",
          "label" : "dbt",
          "logoUrl" : "https://app.matatika.com/assets/images/transformer/dbt.png",
          "hidden" : false,
          "docs" : "https://www.matatika.com/data-details/dbt/",
          "pipUrl" : "dbt-core~=1.3.0 dbt-postgres~=1.3.0 dbt-snowflake~=1.3.0\n",
          "repo" : "https://github.com/dbt-labs/dbt-core",
          "capabilities" : [ ],
          "select" : [ ],
          "update" : { },
          "vars" : { },
          "settings" : [ {
            "name" : "project_dir",
            "aliases" : [ ],
            "value" : "$MELTANO_PROJECT_ROOT/transform",
            "kind" : "STRING",
            "protected" : false
          }, {
            "name" : "profiles_dir",
            "aliases" : [ ],
            "value" : "$MELTANO_PROJECT_ROOT/transform/profile",
            "kind" : "STRING",
            "env" : "DBT_PROFILES_DIR",
            "protected" : false
          }, {
            "name" : "target",
            "aliases" : [ ],
            "value" : "$MELTANO_LOAD__DIALECT",
            "kind" : "STRING",
            "protected" : false
          }, {
            "name" : "source_schema",
            "aliases" : [ ],
            "value" : "$MELTANO_LOAD__TARGET_SCHEMA",
            "kind" : "STRING",
            "protected" : false
          }, {
            "name" : "target_schema",
            "aliases" : [ ],
            "value" : "analytics",
            "kind" : "STRING",
            "protected" : false
          }, {
            "name" : "models",
            "aliases" : [ ],
            "value" : "$MELTANO_TRANSFORM__PACKAGE_NAME $MELTANO_EXTRACTOR_NAMESPACE my_meltano_project",
            "kind" : "STRING",
            "protected" : false
          } ],
          "variants" : [ ],
          "commands" : {
            "compile" : {
              "args" : "compile",
              "description" : "Generates executable SQL from source model, test, and analysis files. Compiled SQL files are written to the target/ directory."
            },
            "seed" : {
              "args" : "seed",
              "description" : "Load data from csv files into your data warehouse."
            },
            "test" : {
              "args" : "test",
              "description" : "Runs tests on data in deployed models."
            },
            "docs-generate" : {
              "args" : "docs generate",
              "description" : "Generate documentation artifacts for your project."
            },
            "deps" : {
              "args" : "deps",
              "description" : "Pull the most recent version of the dependencies listed in packages.yml"
            },
            "run" : {
              "args" : "run",
              "description" : "Compile SQL and execute against the current target database."
            },
            "clean" : {
              "args" : "clean",
              "description" : "Delete all folders in the clean-targets list (usually the dbt_modules and target directories.)"
            },
            "snapshot" : {
              "args" : "snapshot",
              "description" : "Execute snapshots defined in your project."
            }
          },
          "matatikaHidden" : false,
          "requires" : [ {
            "id" : "4b54a59d-5399-4877-9571-a6ab5c77e7d9",
            "repositoryPath" : "plugins/files/files-dbt--matatika.lock",
            "pluginType" : "FILE",
            "name" : "files-dbt",
            "namespace" : "dbt",
            "variant" : "matatika",
            "label" : "files-dbt",
            "hidden" : false,
            "pipUrl" : "git+https://github.com/Matatika/[email protected]",
            "repo" : "https://github.com/Matatika/files-dbt",
            "capabilities" : [ ],
            "select" : [ ],
            "update" : {
              "transform/profile/profiles.yml" : "true"
            },
            "vars" : { },
            "settings" : [ ],
            "variants" : [ ],
            "commands" : { },
            "matatikaHidden" : false,
            "requires" : [ ],
            "fullDescription" : ""
          } ],
          "fullDescription" : "",
          "_links" : {
            "self" : {
              "href" : "https://catalog.matatika.com/api/dataplugins/e1e1cf84-a771-451b-972d-829c74e18a48"
            },
            "update dataplugin" : {
              "href" : "https://catalog.matatika.com/api/workspaces/36863ec8-15ad-4503-98b3-bc5b4cb277a0/dataplugins/e1e1cf84-a771-451b-972d-829c74e18a48",
              "type" : "PUT"
            }
          }
        }
      },
      "draft" : false,
      "managed" : false,
      "_links" : {
        "self" : {
          "href" : "https://catalog.matatika.com/api/datacomponents/a204013b-57c6-4b00-9765-7f06fe1878f8"
        },
        "update datacomponent" : {
          "href" : "https://catalog.matatika.com/api/datacomponents/a204013b-57c6-4b00-9765-7f06fe1878f8"
        },
        "delete datacomponent" : {
          "href" : "https://catalog.matatika.com/api/datacomponents/a204013b-57c6-4b00-9765-7f06fe1878f8"
        }
      }
    } ],
    "latest job" : {
      "id" : "834fe1aa-1371-4dd0-886e-95ccecfa8698",
      "created" : "2024-04-10T14:11:38.662251",
      "type" : "WORKSPACE_CONFIG",
      "maxAttempts" : 0,
      "attempt" : 0,
      "commitId" : "d099ac12444cb1e4ce094ce4cdd9cef34f784d15",
      "exitCode" : 0,
      "status" : "COMPLETE",
      "startTime" : "2024-04-10T14:11:57.313",
      "endTime" : "2024-04-10T14:12:20.199",
      "_embedded" : {
        "pipeline" : {
          "id" : "e25572cb-daa2-44d7-9f12-7b3127a4cc55",
          "status" : "READY",
          "name" : "SIT-generated pipeline [2024-04-10T15:11:37.765649] (updated)",
          "timeout" : 0,
          "maxRetries" : 0,
          "created" : "2024-04-10T14:11:38.148468",
          "lastModified" : "2024-04-10T14:11:38.148469",
          "properties" : {
            "tap-google-analytics.view_id" : "1234567890",
            "tap-google-analytics.oauth_credentials.client_secret" : "client_secret",
            "tap-google-analytics.reports" : "reports",
            "tap-google-analytics.oauth_credentials.refresh_token" : "refresh_token",
            "tap-google-analytics.start_date" : "2024-03-10T15:09:43.843155+01:00",
            "tap-google-analytics.end_date" : "2024-04-10T15:09:43.843204+01:00",
            "tap-google-analytics.oauth_credentials.client_id" : "client_id",
            "tap-google-analytics.oauth_credentials.access_token" : "access_token"
          },
          "dataComponents" : [ "tap-google-analytics", "Warehouse", "dbt" ],
          "actions" : [ ]
        },
        "profile" : {
          "id" : "auth0|5eb0327cbfd7490bff55feeb",
          "name" : "[email protected]",
          "handle" : "@sit+prod",
          "email" : "[email protected]"
        }
      },
      "_links" : {
        "self" : {
          "href" : "https://catalog.matatika.com/api/jobs/834fe1aa-1371-4dd0-886e-95ccecfa8698"
        },
        "delete job" : {
          "href" : "https://catalog.matatika.com/api/jobs/834fe1aa-1371-4dd0-886e-95ccecfa8698",
          "type" : "DELETE"
        },
        "logs" : {
          "href" : "https://catalog.matatika.com/api/jobs/834fe1aa-1371-4dd0-886e-95ccecfa8698/logs?sequence=0",
          "type" : "GET"
        }
      }
    }
  },
  "_links" : {
    "update pipeline" : {
      "href" : "https://catalog.matatika.com/api/workspaces/36863ec8-15ad-4503-98b3-bc5b4cb277a0/pipelines/e25572cb-daa2-44d7-9f12-7b3127a4cc55",
      "type" : "PUT"
    },
    "delete pipeline" : {
      "href" : "https://catalog.matatika.com/api/pipelines/e25572cb-daa2-44d7-9f12-7b3127a4cc55",
      "type" : "DELETE"
    },
    "draft pipeline" : {
      "href" : "https://catalog.matatika.com/api/workspaces/36863ec8-15ad-4503-98b3-bc5b4cb277a0/pipelines/e25572cb-daa2-44d7-9f12-7b3127a4cc55/draft",
      "type" : "PUT"
    },
    "self" : {
      "href" : "https://catalog.matatika.com/api/pipelines/e25572cb-daa2-44d7-9f12-7b3127a4cc55"
    },
    "environment" : {
      "href" : "https://catalog.matatika.com/api/pipelines/e25572cb-daa2-44d7-9f12-7b3127a4cc55/environment"
    },
    "jobs" : {
      "href" : "https://catalog.matatika.com/api/pipelines/e25572cb-daa2-44d7-9f12-7b3127a4cc55/jobs",
      "type" : "GET"
    },
    "metrics" : {
      "href" : "https://catalog.matatika.com/api/pipelines/e25572cb-daa2-44d7-9f12-7b3127a4cc55/metrics"
    },
    "add subscription" : {
      "href" : "https://catalog.matatika.com/api/pipelines/e25572cb-daa2-44d7-9f12-7b3127a4cc55/subscriptions"
    },
    "verify pipeline" : {
      "href" : "https://catalog.matatika.com/api/pipelines/e25572cb-daa2-44d7-9f12-7b3127a4cc55/verification",
      "type" : "POST"
    },
    "create job" : {
      "href" : "https://catalog.matatika.com/api/pipelines/e25572cb-daa2-44d7-9f12-7b3127a4cc55/jobs",
      "type" : "POST"
    },
    "latest job" : {
      "href" : "https://catalog.matatika.com/api/jobs/834fe1aa-1371-4dd0-886e-95ccecfa8698"
    }
  }
}
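
The pipeline links above include a "create job" action (POST) and a "latest job" link. Below is a minimal sketch of following those HAL links with Python (requests), assuming the same ACCESS_TOKEN bearer token as the other snippets in this document and assuming the "create job" POST returns the created job resource as JSON:

import requests

headers = {
  'Authorization': f"Bearer {ACCESS_TOKEN}"
}

# "self" link of the pipeline shown above
pipeline_url = "https://catalog.matatika.com/api/pipelines/e25572cb-daa2-44d7-9f12-7b3127a4cc55"

# follow the pipeline's "create job" link to trigger a run
pipeline = requests.get(pipeline_url, headers=headers).json()
create_job_url = pipeline["_links"]["create job"]["href"]

# assumption: the POST responds with the new job resource
job = requests.post(create_job_url, headers=headers).json()

print(job.get("id"), job.get("status"))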

Initialise a pipeline in a workspace

POST

/api/workspaces/{workspace-id}/pipelines

Initialises a new pipeline in the workspace {workspace-id}.

Prerequisites

  • Workspace {workspace-id} must exist

Request

Example Snippets

cURL

curl -H "Authorization: Bearer $ACCESS_TOKEN" 'https://catalog.matatika.com:443/api/workspaces/36863ec8-15ad-4503-98b3-bc5b4cb277a0/pipelines' -i -X POST \
    -H 'Accept: application/json, application/javascript, text/javascript, text/json' \
    -H 'Content-Type: application/json'

Python (requests)

import requests

url = "https://catalog.matatika.com:443/api/workspaces/36863ec8-15ad-4503-98b3-bc5b4cb277a0/pipelines"

# ACCESS_TOKEN is the bearer token used in the cURL example
headers = {
  'Authorization': f"Bearer {ACCESS_TOKEN}"
}

response = requests.request("POST", url, headers=headers)

print(response.text)

Response

200 OK

Pipeline with HAL links.

{
  "id" : "2283e04e-928e-4ac4-8634-25a642d9c66e",
  "status" : "PROVISIONING",
  "timeout" : 0,
  "maxRetries" : 0,
  "created" : "2024-04-10T14:09:43.76881885",
  "lastModified" : "2024-04-10T14:09:43.76881965",
  "properties" : { },
  "dataComponents" : [ ],
  "actions" : [ ],
  "_links" : {
    "create pipeline" : {
      "href" : "https://catalog.matatika.com/api/workspaces/36863ec8-15ad-4503-98b3-bc5b4cb277a0/pipelines/2283e04e-928e-4ac4-8634-25a642d9c66e",
      "type" : "PUT"
    },
    "draft pipeline" : {
      "href" : "https://catalog.matatika.com/api/workspaces/36863ec8-15ad-4503-98b3-bc5b4cb277a0/pipelines/2283e04e-928e-4ac4-8634-25a642d9c66e/draft",
      "type" : "PUT"
    },
    "validate pipeline" : {
      "href" : "https://catalog.matatika.com/api/workspaces/36863ec8-15ad-4503-98b3-bc5b4cb277a0/pipelines/validation",
      "type" : "POST"
    }
  }
}
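
The "create pipeline" link in the response above points at the PUT endpoint documented in the next section. Below is a minimal sketch of the two-step flow (initialise, then create) with Python (requests), assuming the workspace URL and ACCESS_TOKEN bearer token from the snippets in this document; the pipeline name is a placeholder, and the data component names are taken from the examples below:

import requests

headers = {
  'Authorization': f"Bearer {ACCESS_TOKEN}"
}

# step 1: initialise a pipeline in the workspace (returns a PROVISIONING pipeline with HAL links)
workspace_pipelines_url = "https://catalog.matatika.com:443/api/workspaces/36863ec8-15ad-4503-98b3-bc5b4cb277a0/pipelines"
pipeline = requests.post(workspace_pipelines_url, headers=headers).json()

# step 2: follow the "create pipeline" link with the full pipeline definition
create_url = pipeline["_links"]["create pipeline"]["href"]
body = {
  "name" : "My pipeline",  # placeholder name
  "dataComponents" : [ "tap-google-analytics", "Warehouse", "dbt" ]
}
pipeline = requests.put(create_url, headers=headers, json=body).json()

print(pipeline["id"], pipeline["status"])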

Create or update a pipeline in a workspace

PUT

/api/workspaces/{workspace-id}/pipelines/{pipeline-id}

Creates or updates the pipeline {pipeline-id} in the workspace {workspace-id}.

Prerequisites

  • Workspace {workspace-id} must exist

Request

Body

Pipeline resource.

{
  "name" : "SIT-generated pipeline [2024-04-10T15:09:43.837414]",
  "dataComponents" : [ "extractors/tap-google-analytics", "Warehouse", "dbt" ],
  "schedule" : "0 0 0 25 12 ?",
  "properties" : {
    "tap-google-analytics.view_id" : "1234567890",
    "tap-google-analytics.oauth_credentials.client_secret" : "client_secret",
    "tap-google-analytics.reports" : "reports",
    "tap-google-analytics.oauth_credentials.refresh_token" : "refresh_token",
    "tap-google-analytics.start_date" : "2024-03-10T15:09:43.843155+01:00",
    "tap-google-analytics.end_date" : "2024-04-10T15:09:43.843204+01:00",
    "tap-google-analytics.oauth_credentials.client_id" : "client_id",
    "tap-google-analytics.oauth_credentials.access_token" : "access_token"
  }
}

Example Snippets

cURL

curl -H "Authorization: Bearer $ACCESS_TOKEN" 'https://catalog.matatika.com:443/api/workspaces/36863ec8-15ad-4503-98b3-bc5b4cb277a0/pipelines/2283e04e-928e-4ac4-8634-25a642d9c66e' -i -X PUT \
    -H 'Accept: application/json, application/javascript, text/javascript, text/json' \
    -H 'Content-Type: application/json' \
    -d '{
  "name" : "SIT-generated pipeline [2024-04-10T15:09:43.837414]",
  "dataComponents" : [ "extractors/tap-google-analytics", "Warehouse", "dbt" ],
  "schedule" : "0 0 0 25 12 ?",
  "properties" : {
    "tap-google-analytics.view_id" : "1234567890",
    "tap-google-analytics.oauth_credentials.client_secret" : "client_secret",
    "tap-google-analytics.reports" : "reports",
    "tap-google-analytics.oauth_credentials.refresh_token" : "refresh_token",
    "tap-google-analytics.start_date" : "2024-03-10T15:09:43.843155+01:00",
    "tap-google-analytics.end_date" : "2024-04-10T15:09:43.843204+01:00",
    "tap-google-analytics.oauth_credentials.client_id" : "client_id",
    "tap-google-analytics.oauth_credentials.access_token" : "access_token"
  }
}'

Python (requests)

import requests

url = "https://catalog.matatika.com:443/api/workspaces/36863ec8-15ad-4503-98b3-bc5b4cb277a0/pipelines/2283e04e-928e-4ac4-8634-25a642d9c66e"

data = {
  "name" : "SIT-generated pipeline [2024-04-10T15:09:43.837414]",
  "dataComponents" : [ "extractors/tap-google-analytics", "Warehouse", "dbt" ],
  "schedule" : "0 0 0 25 12 ?",
  "properties" : {
    "tap-google-analytics.view_id" : "1234567890",
    "tap-google-analytics.oauth_credentials.client_secret" : "client_secret",
    "tap-google-analytics.reports" : "reports",
    "tap-google-analytics.oauth_credentials.refresh_token" : "refresh_token",
    "tap-google-analytics.start_date" : "2024-03-10T15:09:43.843155+01:00",
    "tap-google-analytics.end_date" : "2024-04-10T15:09:43.843204+01:00",
    "tap-google-analytics.oauth_credentials.client_id" : "client_id",
    "tap-google-analytics.oauth_credentials.access_token" : "access_token"
  }
}
# ACCESS_TOKEN is the bearer token used in the cURL example
headers = {
  'Authorization': f"Bearer {ACCESS_TOKEN}"
}

# json= serialises the body and sets the Content-Type: application/json header
response = requests.request("PUT", url, headers=headers, json=data)

print(response.text)

Response

200 OK / 201 Created

Pipeline with HAL links.

{
  "id" : "2283e04e-928e-4ac4-8634-25a642d9c66e",
  "status" : "PROVISIONING",
  "name" : "SIT-generated pipeline [2024-04-10T15:09:43.837414]",
  "schedule" : "0 0 0 25 12 ?",
  "timeout" : 0,
  "maxRetries" : 0,
  "created" : "2024-04-10T14:09:44.144403",
  "lastModified" : "2024-04-10T14:09:44.144404",
  "properties" : {
    "tap-google-analytics.view_id" : "1234567890",
    "tap-google-analytics.oauth_credentials.client_secret" : "client_secret",
    "tap-google-analytics.reports" : "reports",
    "tap-google-analytics.oauth_credentials.refresh_token" : "refresh_token",
    "tap-google-analytics.start_date" : "2024-03-10T15:09:43.843155+01:00",
    "tap-google-analytics.end_date" : "2024-04-10T15:09:43.843204+01:00",
    "tap-google-analytics.oauth_credentials.client_id" : "client_id",
    "tap-google-analytics.oauth_credentials.access_token" : "access_token"
  },
  "dataComponents" : [ "tap-google-analytics", "Warehouse", "dbt" ],
  "actions" : [ ],
  "_embedded" : {
    "dataComponents" : [ {
      "id" : "0e155953-ad3b-4412-b1d3-70fdf2ae551e",
      "created" : "2024-04-10T14:09:44.004401",
      "lastModified" : "2024-04-10T14:09:44.004402",
      "name" : "tap-google-analytics",
      "properties" : { },
      "commands" : { },
      "dataPlugin" : "extractors/tap-google-analytics--matatika",
      "_embedded" : {
        "dataplugin" : {
          "id" : "15daaa43-8a54-471f-8534-84c0c9b10c2f",
          "pluginType" : "EXTRACTOR",
          "name" : "tap-google-analytics",
          "namespace" : "tap_google_analytics",
          "variant" : "matatika",
          "label" : "Google Analytics",
          "description" : "Google Analytics is a web analytics service that provides insights into website traffic and user behavior.\n\nGoogle Analytics allows website owners to track and analyze various metrics related to their website's performance, such as the number of visitors, pageviews, bounce rate, and average session duration. It also provides information on the demographics and interests of website visitors, as well as the sources of traffic, including organic search, paid search, social media, and referrals. This data can be used to optimize website content and marketing strategies, as well as to measure the effectiveness of advertising campaigns. Additionally, Google Analytics offers advanced features such as goal tracking, e-commerce tracking, and custom reporting, making it a powerful tool for businesses of all sizes.\n### Prerequisites\nTo obtain the OAuth identity provider authorization endpoint used to create and refresh tokens, you need to create a project in the Google API Console and enable the Google Analytics API. Then, you can create OAuth 2.0 credentials and configure the authorized redirect URIs. The authorization endpoint will be provided in the credentials.\n\nThe OAuth scopes you need to request access to depend on the specific data you want to access in Google Analytics. For example, if you want to read data from a specific view, you will need to request the \"https://www.googleapis.com/auth/analytics.readonly\" scope. You can find a list of available scopes in the Google Analytics API documentation.\n\nTo obtain the Access Token and OAuth Refresh Token, you need to authenticate the user and obtain their consent to access their Google Analytics data. This can be done using the Google Sign-In API or the OAuth 2.0 authorization flow. Once the user has granted access, you will receive an Access Token and a Refresh Token that you can use to make API requests.\n\nTo obtain the View ID, you need to log in to your Google Analytics account and navigate to the Admin section. From there, you can select the account, property, and view that you want to access and find the View ID in the View Settings.",
          "logoUrl" : "/assets/images/datasource/tap-google-analytics.svg",
          "hidden" : false,
          "docs" : "https://www.matatika.com/docs/instant-insights/tap-google-analytics/",
          "pipUrl" : "git+https://github.com/Matatika/[email protected]",
          "repo" : "https://github.com/Matatika/tap-google-analytics",
          "capabilities" : [ "DISCOVER", "STATE", "CATALOG" ],
          "select" : [ ],
          "update" : { },
          "vars" : { },
          "settings" : [ {
            "name" : "oauth_credentials.authorization_url",
            "aliases" : [ ],
            "label" : "OAuth identity provider authorization endpoint used create and refresh tokens",
            "value" : "https://oauth2.googleapis.com/token",
            "kind" : "HIDDEN",
            "description" : "The endpoint used to create and refresh OAuth tokens.",
            "required" : "true",
            "protected" : false
          }, {
            "name" : "oauth_credentials.scope",
            "aliases" : [ ],
            "label" : "OAuth scopes we need to request access to",
            "value" : "profile email https://www.googleapis.com/auth/analytics.readonly",
            "kind" : "HIDDEN",
            "description" : "The specific scopes we need to request access to in order to connect to Google Analytics.",
            "required" : "true",
            "protected" : false
          }, {
            "name" : "oauth_credentials.access_token",
            "aliases" : [ ],
            "label" : "Access Token",
            "kind" : "HIDDEN",
            "description" : "The token used to authenticate and authorize API requests.",
            "required" : "true",
            "protected" : false
          }, {
            "name" : "oauth_credentials.refresh_token",
            "aliases" : [ ],
            "label" : "OAuth Refresh Token",
            "kind" : "HIDDEN",
            "description" : "The token used to refresh the access token when it expires.",
            "required" : "true",
            "protected" : false
          }, {
            "name" : "oauth_credentials.refresh_proxy_url",
            "aliases" : [ ],
            "label" : "Optional - will be called with 'oauth_credentials.refresh_token' to refresh the access token",
            "kind" : "HIDDEN",
            "description" : "An optional function that will be called to refresh the access token using the refresh token.",
            "protected" : false
          }, {
            "name" : "oauth_credentials.refresh_proxy_url_auth",
            "aliases" : [ ],
            "label" : "Optional - Sets Authorization header on 'oauth_credentials.refresh_url' request",
            "kind" : "HIDDEN",
            "description" : "An optional setting that sets the Authorization header on the refresh URL request.",
            "protected" : false
          }, {
            "name" : "oauth_credentials.client_id",
            "aliases" : [ ],
            "label" : "Optional - OAuth Client ID used if refresh_proxy_url not supplied",
            "kind" : "HIDDEN",
            "description" : "An optional OAuth Client ID used if the refresh proxy URL is not supplied.",
            "protected" : false
          }, {
            "name" : "oauth_credentials.client_secret",
            "aliases" : [ ],
            "label" : "Optional - OAuth Client Secret used if refresh_proxy_url not supplied",
            "kind" : "HIDDEN",
            "description" : "An optional OAuth Client Secret used if the refresh proxy URL is not supplied.",
            "protected" : false
          }, {
            "name" : "view_id",
            "aliases" : [ ],
            "label" : "View ID",
            "placeholder" : "Ex. 198343027",
            "kind" : "STRING",
            "description" : "The ID of the Google Analytics view to retrieve data from.",
            "required" : "true",
            "protected" : false
          }, {
            "name" : "reports",
            "aliases" : [ ],
            "label" : "Reports",
            "placeholder" : "Ex. my_report_definition.json",
            "kind" : "STRING",
            "description" : "The specific reports to retrieve data from in the Google Analytics view.",
            "protected" : false
          }, {
            "name" : "start_date",
            "aliases" : [ ],
            "label" : "Start date",
            "kind" : "DATE_ISO8601",
            "description" : "The start date for the date range of data to retrieve.",
            "protected" : false
          }, {
            "name" : "end_date",
            "aliases" : [ ],
            "label" : "End date",
            "kind" : "DATE_ISO8601",
            "description" : "The end date for the date range of data to retrieve.",
            "protected" : false
          } ],
          "variants" : [ ],
          "commands" : { },
          "matatikaHidden" : false,
          "requires" : [ ],
          "fullDescription" : "Google Analytics is a web analytics service that provides insights into website traffic and user behavior.\n\nGoogle Analytics allows website owners to track and analyze various metrics related to their website's performance, such as the number of visitors, pageviews, bounce rate, and average session duration. It also provides information on the demographics and interests of website visitors, as well as the sources of traffic, including organic search, paid search, social media, and referrals. This data can be used to optimize website content and marketing strategies, as well as to measure the effectiveness of advertising campaigns. Additionally, Google Analytics offers advanced features such as goal tracking, e-commerce tracking, and custom reporting, making it a powerful tool for businesses of all sizes.\n### Prerequisites\nTo obtain the OAuth identity provider authorization endpoint used to create and refresh tokens, you need to create a project in the Google API Console and enable the Google Analytics API. Then, you can create OAuth 2.0 credentials and configure the authorized redirect URIs. The authorization endpoint will be provided in the credentials.\n\nThe OAuth scopes you need to request access to depend on the specific data you want to access in Google Analytics. For example, if you want to read data from a specific view, you will need to request the \"https://www.googleapis.com/auth/analytics.readonly\" scope. You can find a list of available scopes in the Google Analytics API documentation.\n\nTo obtain the Access Token and OAuth Refresh Token, you need to authenticate the user and obtain their consent to access their Google Analytics data. This can be done using the Google Sign-In API or the OAuth 2.0 authorization flow. Once the user has granted access, you will receive an Access Token and a Refresh Token that you can use to make API requests.\n\nTo obtain the View ID, you need to log in to your Google Analytics account and navigate to the Admin section. From there, you can select the account, property, and view that you want to access and find the View ID in the View Settings.\n\n## Settings\n\n\n### View ID\n\nThe ID of the Google Analytics view to retrieve data from.\n\n### Reports\n\nThe specific reports to retrieve data from in the Google Analytics view.\n\n### Start date\n\nThe start date for the date range of data to retrieve.\n\n### End date\n\nThe end date for the date range of data to retrieve.",
          "_links" : {
            "self" : {
              "href" : "https://catalog.matatika.com/api/dataplugins/15daaa43-8a54-471f-8534-84c0c9b10c2f"
            },
            "update dataplugin" : {
              "href" : "https://catalog.matatika.com/api/workspaces/36863ec8-15ad-4503-98b3-bc5b4cb277a0/dataplugins/15daaa43-8a54-471f-8534-84c0c9b10c2f",
              "type" : "PUT"
            }
          }
        }
      },
      "draft" : true,
      "managed" : false,
      "_links" : {
        "self" : {
          "href" : "https://catalog.matatika.com/api/datacomponents/0e155953-ad3b-4412-b1d3-70fdf2ae551e"
        },
        "update datacomponent" : {
          "href" : "https://catalog.matatika.com/api/datacomponents/0e155953-ad3b-4412-b1d3-70fdf2ae551e"
        },
        "delete datacomponent" : {
          "href" : "https://catalog.matatika.com/api/datacomponents/0e155953-ad3b-4412-b1d3-70fdf2ae551e"
        }
      }
    }, {
      "id" : "a3154ba7-8410-4e22-ae97-52a5625a37ff",
      "created" : "2024-04-10T14:09:17.265704",
      "lastModified" : "2024-04-10T14:09:37.424353",
      "name" : "Warehouse",
      "properties" : {
        "password" : "Cx44A8mk_EqD01Her_AVuH1j8Y",
        "dbname" : "xcshyhw",
        "default_target_schema" : "analytics",
        "port" : "5432",
        "host" : "sharp-banana.postgres.database.azure.com",
        "user" : "xcshyhw"
      },
      "commands" : { },
      "dataPlugin" : "loaders/target-postgres--matatika",
      "_embedded" : {
        "dataplugin" : {
          "id" : "8eb6b56d-000e-4bcb-b909-fcd0f99c03e3",
          "pluginType" : "LOADER",
          "name" : "target-postgres",
          "namespace" : "postgres_transferwise",
          "variant" : "matatika",
          "label" : "Postgres Warehouse",
          "description" : "Postgres Warehouse is a data warehousing solution built on top of the Postgres database management system.\n\nPostgres Warehouse is designed to handle large volumes of data and complex queries, making it an ideal solution for businesses that need to store and analyze large amounts of data. It provides a number of features that are specifically tailored to data warehousing, such as columnar storage, parallel processing, and support for advanced analytics. Additionally, Postgres Warehouse is highly scalable, allowing businesses to easily add more resources as their data needs grow. Overall, Postgres Warehouse is a powerful and flexible data warehousing solution that can help businesses make better decisions by providing them with the insights they need to succeed.\n### Prerequisites\nThe process of obtaining the required settings for connecting to a Postgres Warehouse may vary depending on the specific setup and configuration of the database. However, here are some general ways to obtain each of the required settings:\n\n- User: The user is typically created when the database is set up. You can ask the database administrator or check the database documentation to find out the username.\n- Password: The password is also typically created when the database is set up. You can ask the database administrator or check the database documentation to find out the password.\n- Host: The host is the server where the database is located. You can ask the database administrator or check the database documentation to find out the host name or IP address.\n- Port: The port is the number that the database listens on for incoming connections. The default port for Postgres is 5432, but it may be different depending on the configuration. You can ask the database administrator or check the database documentation to find out the port number.\n- Database Name: The database name is the name of the specific database you want to connect to. You can ask the database administrator or check the database documentation to find out the database name.\n- Default Target Schema: The default target schema is the schema that you want to use as the default when connecting to the database. This may be set up by the database administrator or you may need to create it yourself. You can ask the database administrator or check the database documentation to find out the default target schema.",
          "logoUrl" : "/assets/logos/loaders/postgres.png",
          "hidden" : false,
          "docs" : "https://www.matatika.com/data-details/target-postgres/",
          "pipUrl" : "git+https://github.com/Matatika/[email protected]",
          "repo" : "git+https://github.com/Matatika/[email protected]",
          "capabilities" : [ ],
          "select" : [ ],
          "update" : { },
          "vars" : { },
          "settings" : [ {
            "name" : "user",
            "aliases" : [ "username" ],
            "label" : "User",
            "kind" : "STRING",
            "description" : "The username used to connect to the Postgres Warehouse.",
            "required" : "true",
            "protected" : false
          }, {
            "name" : "password",
            "aliases" : [ ],
            "label" : "Password",
            "kind" : "PASSWORD",
            "description" : "The password used to authenticate the user.",
            "required" : "true",
            "protected" : false
          }, {
            "name" : "host",
            "aliases" : [ "address" ],
            "label" : "Host",
            "kind" : "STRING",
            "description" : "The hostname or IP address of the Postgres Warehouse server.",
            "required" : "true",
            "protected" : false
          }, {
            "name" : "port",
            "aliases" : [ ],
            "label" : "Port",
            "value" : "5432",
            "kind" : "INTEGER",
            "description" : "The port number used to connect to the Postgres Warehouse server.",
            "required" : "true",
            "protected" : false
          }, {
            "name" : "dbname",
            "aliases" : [ "database" ],
            "label" : "Database Name",
            "kind" : "STRING",
            "description" : "The name of the database to connect to.",
            "required" : "true",
            "protected" : false
          }, {
            "name" : "default_target_schema",
            "aliases" : [ ],
            "label" : "Default Target Schema",
            "value" : "analytics",
            "kind" : "STRING",
            "description" : "The default schema to use when writing data to the Postgres Warehouse.",
            "required" : "true",
            "protected" : false
          }, {
            "name" : "ssl",
            "aliases" : [ ],
            "label" : "SSL",
            "value" : "false",
            "kind" : "HIDDEN",
            "description" : "Whether or not to use SSL encryption when connecting to the Postgres Warehouse.",
            "protected" : false,
            "value_post_processor" : "STRINGIFY"
          }, {
            "name" : "batch_size_rows",
            "aliases" : [ ],
            "label" : "Batch Size Rows",
            "value" : "100000",
            "kind" : "INTEGER",
            "description" : "The number of rows to write to the Postgres Warehouse in each batch.",
            "protected" : false
          }, {
            "name" : "underscore_camel_case_fields",
            "aliases" : [ ],
            "label" : "Underscore Camel Case Fields",
            "value" : "true",
            "kind" : "HIDDEN",
            "description" : "Whether or not to convert field names from camel case to underscore-separated format.",
            "protected" : false
          }, {
            "name" : "flush_all_streams",
            "aliases" : [ ],
            "label" : "Flush All Streams",
            "value" : "false",
            "kind" : "HIDDEN",
            "description" : "Whether or not to flush all streams to the Postgres Warehouse before closing the connection.",
            "protected" : false
          }, {
            "name" : "parallelism",
            "aliases" : [ ],
            "label" : "Parallelism",
            "value" : "0",
            "kind" : "HIDDEN",
            "description" : "The number of threads to use when writing data to the Postgres Warehouse.",
            "protected" : false
          }, {
            "name" : "parallelism_max",
            "aliases" : [ ],
            "label" : "Max Parallelism",
            "value" : "16",
            "kind" : "HIDDEN",
            "description" : "The maximum number of threads to use when writing data to the Postgres Warehouse.",
            "protected" : false
          }, {
            "name" : "default_target_schema_select_permission",
            "aliases" : [ ],
            "label" : "Default Target Schema Select Permission",
            "kind" : "HIDDEN",
            "description" : "The permission level required to select data from the default target schema.",
            "protected" : false
          }, {
            "name" : "schema_mapping",
            "aliases" : [ ],
            "label" : "Schema Mapping",
            "kind" : "HIDDEN",
            "description" : "A mapping of source schema names to target schema names.",
            "protected" : false
          }, {
            "name" : "add_metadata_columns",
            "aliases" : [ ],
            "label" : "Add Metadata Columns",
            "value" : "true",
            "kind" : "HIDDEN",
            "description" : "Whether or not to add metadata columns to the target table.",
            "protected" : false
          }, {
            "name" : "hard_delete",
            "aliases" : [ ],
            "label" : "Hard Delete",
            "value" : "false",
            "kind" : "HIDDEN",
            "description" : "Whether or not to perform hard deletes when deleting data from the Postgres Warehouse.",
            "protected" : false
          }, {
            "name" : "data_flattening_max_level",
            "aliases" : [ ],
            "label" : "Data Flattening Max Level",
            "value" : "10",
            "kind" : "HIDDEN",
            "description" : "The maximum level of nested data structures to flatten when writing data to the Postgres Warehouse.",
            "protected" : false
          }, {
            "name" : "primary_key_required",
            "aliases" : [ ],
            "label" : "Primary Key Required",
            "value" : "false",
            "kind" : "BOOLEAN",
            "description" : "Whether or not a primary key is required for the target table.",
            "protected" : false
          }, {
            "name" : "validate_records",
            "aliases" : [ ],
            "label" : "Validate Records",
            "value" : "false",
            "kind" : "BOOLEAN",
            "description" : "Whether or not to validate records before writing them to the Postgres Warehouse.",
            "protected" : false
          }, {
            "name" : "temp_dir",
            "aliases" : [ ],
            "label" : "Temporary Directory",
            "kind" : "HIDDEN",
            "description" : "The directory to use for temporary files when writing data to the Postgres Warehouse.",
            "protected" : false
          } ],
          "variants" : [ ],
          "commands" : { },
          "matatikaHidden" : false,
          "requires" : [ ],
          "fullDescription" : "Postgres Warehouse is a data warehousing solution built on top of the Postgres database management system.\n\nPostgres Warehouse is designed to handle large volumes of data and complex queries, making it an ideal solution for businesses that need to store and analyze large amounts of data. It provides a number of features that are specifically tailored to data warehousing, such as columnar storage, parallel processing, and support for advanced analytics. Additionally, Postgres Warehouse is highly scalable, allowing businesses to easily add more resources as their data needs grow. Overall, Postgres Warehouse is a powerful and flexible data warehousing solution that can help businesses make better decisions by providing them with the insights they need to succeed.\n### Prerequisites\nThe process of obtaining the required settings for connecting to a Postgres Warehouse may vary depending on the specific setup and configuration of the database. However, here are some general ways to obtain each of the required settings:\n\n- User: The user is typically created when the database is set up. You can ask the database administrator or check the database documentation to find out the username.\n- Password: The password is also typically created when the database is set up. You can ask the database administrator or check the database documentation to find out the password.\n- Host: The host is the server where the database is located. You can ask the database administrator or check the database documentation to find out the host name or IP address.\n- Port: The port is the number that the database listens on for incoming connections. The default port for Postgres is 5432, but it may be different depending on the configuration. You can ask the database administrator or check the database documentation to find out the port number.\n- Database Name: The database name is the name of the specific database you want to connect to. You can ask the database administrator or check the database documentation to find out the database name.\n- Default Target Schema: The default target schema is the schema that you want to use as the default when connecting to the database. This may be set up by the database administrator or you may need to create it yourself. You can ask the database administrator or check the database documentation to find out the default target schema.\n\n## Settings\n\n\n### User\n\nThe username used to connect to the Postgres Warehouse.\n\n### Password\n\nThe password used to authenticate the user.\n\n### Host\n\nThe hostname or IP address of the Postgres Warehouse server.\n\n### Port\n\nThe port number used to connect to the Postgres Warehouse server.\n\n### Database Name\n\nThe name of the database to connect to.\n\n### Default Target Schema\n\nThe default schema to use when writing data to the Postgres Warehouse.\n\n### Batch Size Rows\n\nThe number of rows to write to the Postgres Warehouse in each batch.\n\n### Primary Key Required\n\nWhether or not a primary key is required for the target table.\n\n### Validate Records\n\nWhether or not to validate records before writing them to the Postgres Warehouse.",
          "_links" : {
            "self" : {
              "href" : "https://catalog.matatika.com/api/dataplugins/8eb6b56d-000e-4bcb-b909-fcd0f99c03e3"
            },
            "update dataplugin" : {
              "href" : "https://catalog.matatika.com/api/workspaces/36863ec8-15ad-4503-98b3-bc5b4cb277a0/dataplugins/8eb6b56d-000e-4bcb-b909-fcd0f99c03e3",
              "type" : "PUT"
            }
          }
        }
      },
      "draft" : false,
      "managed" : true,
      "_links" : {
        "self" : {
          "href" : "https://catalog.matatika.com/api/datacomponents/a3154ba7-8410-4e22-ae97-52a5625a37ff"
        },
        "update datacomponent" : {
          "href" : "https://catalog.matatika.com/api/datacomponents/a3154ba7-8410-4e22-ae97-52a5625a37ff"
        },
        "delete datacomponent" : {
          "href" : "https://catalog.matatika.com/api/datacomponents/a3154ba7-8410-4e22-ae97-52a5625a37ff"
        }
      }
    }, {
      "id" : "01af5e45-e1b4-46fb-92b5-852fad27e3bd",
      "created" : "2024-04-10T14:09:17.343911",
      "lastModified" : "2024-04-10T14:09:17.343912",
      "name" : "dbt",
      "properties" : { },
      "commands" : {
        "compile" : {
          "args" : "compile",
          "description" : "Generates executable SQL from source model, test, and analysis files. Compiled SQL files are written to the target/ directory."
        },
        "seed" : {
          "args" : "seed",
          "description" : "Load data from csv files into your data warehouse."
        },
        "test" : {
          "args" : "test",
          "description" : "Runs tests on data in deployed models."
        },
        "docs-generate" : {
          "args" : "docs generate",
          "description" : "Generate documentation artifacts for your project."
        },
        "deps" : {
          "args" : "deps",
          "description" : "Pull the most recent version of the dependencies listed in packages.yml"
        },
        "run" : {
          "args" : "run",
          "description" : "Compile SQL and execute against the current target database."
        },
        "clean" : {
          "args" : "clean",
          "description" : "Delete all folders in the clean-targets list (usually the dbt_modules and target directories.)"
        },
        "snapshot" : {
          "args" : "snapshot",
          "description" : "Execute snapshots defined in your project."
        }
      },
      "dataPlugin" : "transformers/dbt--dbt-labs",
      "_embedded" : {
        "dataplugin" : {
          "id" : "e1e1cf84-a771-451b-972d-829c74e18a48",
          "pluginType" : "TRANSFORMER",
          "name" : "dbt",
          "namespace" : "dbt",
          "variant" : "dbt-labs",
          "label" : "dbt",
          "logoUrl" : "/assets/images/transformer/dbt.png",
          "hidden" : false,
          "docs" : "https://www.matatika.com/data-details/dbt/",
          "pipUrl" : "dbt-core~=1.3.0 dbt-postgres~=1.3.0 dbt-snowflake~=1.3.0\n",
          "repo" : "https://github.com/dbt-labs/dbt-core",
          "capabilities" : [ ],
          "select" : [ ],
          "update" : { },
          "vars" : { },
          "settings" : [ {
            "name" : "project_dir",
            "aliases" : [ ],
            "value" : "$MELTANO_PROJECT_ROOT/transform",
            "kind" : "STRING",
            "protected" : false
          }, {
            "name" : "profiles_dir",
            "aliases" : [ ],
            "value" : "$MELTANO_PROJECT_ROOT/transform/profile",
            "kind" : "STRING",
            "env" : "DBT_PROFILES_DIR",
            "protected" : false
          }, {
            "name" : "target",
            "aliases" : [ ],
            "value" : "$MELTANO_LOAD__DIALECT",
            "kind" : "STRING",
            "protected" : false
          }, {
            "name" : "source_schema",
            "aliases" : [ ],
            "value" : "$MELTANO_LOAD__TARGET_SCHEMA",
            "kind" : "STRING",
            "protected" : false
          }, {
            "name" : "target_schema",
            "aliases" : [ ],
            "value" : "analytics",
            "kind" : "STRING",
            "protected" : false
          }, {
            "name" : "models",
            "aliases" : [ ],
            "value" : "$MELTANO_TRANSFORM__PACKAGE_NAME $MELTANO_EXTRACTOR_NAMESPACE my_meltano_project",
            "kind" : "STRING",
            "protected" : false
          } ],
          "variants" : [ ],
          "commands" : {
            "compile" : {
              "args" : "compile",
              "description" : "Generates executable SQL from source model, test, and analysis files. Compiled SQL files are written to the target/ directory."
            },
            "seed" : {
              "args" : "seed",
              "description" : "Load data from csv files into your data warehouse."
            },
            "test" : {
              "args" : "test",
              "description" : "Runs tests on data in deployed models."
            },
            "docs-generate" : {
              "args" : "docs generate",
              "description" : "Generate documentation artifacts for your project."
            },
            "deps" : {
              "args" : "deps",
              "description" : "Pull the most recent version of the dependencies listed in packages.yml"
            },
            "run" : {
              "args" : "run",
              "description" : "Compile SQL and execute against the current target database."
            },
            "clean" : {
              "args" : "clean",
              "description" : "Delete all folders in the clean-targets list (usually the dbt_modules and target directories.)"
            },
            "snapshot" : {
              "args" : "snapshot",
              "description" : "Execute snapshots defined in your project."
            }
          },
          "matatikaHidden" : false,
          "requires" : [ {
            "id" : "e6c1ad3d-ebf5-4c4a-b129-f68156b47555",
            "pluginType" : "FILE",
            "name" : "files-dbt",
            "namespace" : "dbt",
            "variant" : "matatika",
            "hidden" : false,
            "pipUrl" : "git+https://github.com/Matatika/[email protected]",
            "repo" : "https://github.com/Matatika/files-dbt",
            "capabilities" : [ ],
            "select" : [ ],
            "update" : {
              "transform/profile/profiles.yml" : "true"
            },
            "vars" : { },
            "settings" : [ ],
            "variants" : [ ],
            "commands" : { },
            "matatikaHidden" : false,
            "requires" : [ ],
            "fullDescription" : ""
          } ],
          "fullDescription" : "",
          "_links" : {
            "self" : {
              "href" : "https://catalog.matatika.com/api/dataplugins/e1e1cf84-a771-451b-972d-829c74e18a48"
            },
            "update dataplugin" : {
              "href" : "https://catalog.matatika.com/api/workspaces/36863ec8-15ad-4503-98b3-bc5b4cb277a0/dataplugins/e1e1cf84-a771-451b-972d-829c74e18a48",
              "type" : "PUT"
            }
          }
        }
      },
      "draft" : false,
      "managed" : true,
      "_links" : {
        "self" : {
          "href" : "https://catalog.matatika.com/api/datacomponents/01af5e45-e1b4-46fb-92b5-852fad27e3bd"
        },
        "update datacomponent" : {
          "href" : "https://catalog.matatika.com/api/datacomponents/01af5e45-e1b4-46fb-92b5-852fad27e3bd"
        },
        "delete datacomponent" : {
          "href" : "https://catalog.matatika.com/api/datacomponents/01af5e45-e1b4-46fb-92b5-852fad27e3bd"
        }
      }
    } ],
    "latest job" : {
      "id" : "4a8b18ad-c552-4658-8cb8-b3916b087237",
      "created" : "2024-04-10T14:09:44.459264",
      "type" : "WORKSPACE_CONFIG",
      "maxAttempts" : 0,
      "attempt" : 0,
      "status" : "QUEUED",
      "_embedded" : {
        "pipeline" : {
          "id" : "2283e04e-928e-4ac4-8634-25a642d9c66e",
          "status" : "PROVISIONING",
          "name" : "SIT-generated pipeline [2024-04-10T15:09:43.837414]",
          "schedule" : "0 0 0 25 12 ?",
          "timeout" : 0,
          "maxRetries" : 0,
          "created" : "2024-04-10T14:09:44.144403",
          "lastModified" : "2024-04-10T14:09:44.144404",
          "properties" : {
            "tap-google-analytics.view_id" : "1234567890",
            "tap-google-analytics.oauth_credentials.client_secret" : "client_secret",
            "tap-google-analytics.reports" : "reports",
            "tap-google-analytics.oauth_credentials.refresh_token" : "refresh_token",
            "tap-google-analytics.start_date" : "2024-03-10T15:09:43.843155+01:00",
            "tap-google-analytics.end_date" : "2024-04-10T15:09:43.843204+01:00",
            "tap-google-analytics.oauth_credentials.client_id" : "client_id",
            "tap-google-analytics.oauth_credentials.access_token" : "access_token"
          },
          "dataComponents" : [ "tap-google-analytics", "Warehouse", "dbt" ],
          "actions" : [ ]
        },
        "profile" : {
          "id" : "auth0|5eb0327cbfd7490bff55feeb",
          "name" : "[email protected]",
          "handle" : "@sit+prod",
          "email" : "[email protected]"
        }
      },
      "_links" : {
        "self" : {
          "href" : "https://catalog.matatika.com/api/jobs/4a8b18ad-c552-4658-8cb8-b3916b087237"
        },
        "delete job" : {
          "href" : "https://catalog.matatika.com/api/jobs/4a8b18ad-c552-4658-8cb8-b3916b087237",
          "type" : "DELETE"
        },
        "logs" : {
          "href" : "https://catalog.matatika.com/api/jobs/4a8b18ad-c552-4658-8cb8-b3916b087237/logs?sequence=0",
          "type" : "GET"
        },
        "withdraw job" : {
          "href" : "https://catalog.matatika.com/api/jobs/4a8b18ad-c552-4658-8cb8-b3916b087237/stopped",
          "type" : "PUT"
        }
      }
    }
  },
  "_links" : {
    "update pipeline" : {
      "href" : "https://catalog.matatika.com/api/workspaces/36863ec8-15ad-4503-98b3-bc5b4cb277a0/pipelines/2283e04e-928e-4ac4-8634-25a642d9c66e",
      "type" : "PUT"
    },
    "delete pipeline" : {
      "href" : "https://catalog.matatika.com/api/pipelines/2283e04e-928e-4ac4-8634-25a642d9c66e",
      "type" : "DELETE"
    },
    "draft pipeline" : {
      "href" : "https://catalog.matatika.com/api/workspaces/36863ec8-15ad-4503-98b3-bc5b4cb277a0/pipelines/2283e04e-928e-4ac4-8634-25a642d9c66e/draft",
      "type" : "PUT"
    },
    "self" : {
      "href" : "https://catalog.matatika.com/api/pipelines/2283e04e-928e-4ac4-8634-25a642d9c66e"
    },
    "environment" : {
      "href" : "https://catalog.matatika.com/api/pipelines/2283e04e-928e-4ac4-8634-25a642d9c66e/environment"
    },
    "jobs" : {
      "href" : "https://catalog.matatika.com/api/pipelines/2283e04e-928e-4ac4-8634-25a642d9c66e/jobs",
      "type" : "GET"
    },
    "metrics" : {
      "href" : "https://catalog.matatika.com/api/pipelines/2283e04e-928e-4ac4-8634-25a642d9c66e/metrics"
    },
    "add subscription" : {
      "href" : "https://catalog.matatika.com/api/pipelines/2283e04e-928e-4ac4-8634-25a642d9c66e/subscriptions"
    },
    "withdraw job" : {
      "href" : "https://catalog.matatika.com/api/jobs/4a8b18ad-c552-4658-8cb8-b3916b087237/stopped",
      "type" : "PUT"
    },
    "latest job" : {
      "href" : "https://catalog.matatika.com/api/jobs/4a8b18ad-c552-4658-8cb8-b3916b087237"
    }
  }
}
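
The job embedded under "latest job" above carries its own "self", "logs" and "withdraw job" links. Below is a minimal polling sketch with Python (requests), assuming the ACCESS_TOKEN bearer token from the other snippets and assuming the job status eventually moves on from QUEUED (the examples in this document show QUEUED and COMPLETE; other statuses may exist):

import time

import requests

headers = {
  'Authorization': f"Bearer {ACCESS_TOKEN}"
}

# "latest job" link from the response above
latest_job_url = "https://catalog.matatika.com/api/jobs/4a8b18ad-c552-4658-8cb8-b3916b087237"

# poll until the job is no longer queued, then fetch its logs
while True:
    job = requests.get(latest_job_url, headers=headers).json()
    if job["status"] != "QUEUED":
        break
    time.sleep(10)  # arbitrary polling interval

print(job["status"])

logs_url = job["_links"]["logs"]["href"]  # includes ?sequence=0 in the example above
print(requests.get(logs_url, headers=headers).text)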

Create or update a pipeline as a draft

PUT

/api/workspaces/{workspace-id}/pipelines/{pipeline-id}/draft

Creates or updates the pipeline {pipeline-id} in the workspace {workspace-id} as a draft.

Prerequisites

  • Workspace {workspace-id} must exist

Request

Body

Pipeline resource.

{
  "name" : "SIT-generated pipeline [2024-04-10T15:11:37.765649]",
  "dataComponents" : [ "tap-google-analytics", "Warehouse", "dbt" ]
}

Example Snippets

cURL

curl -H "Authorization: Bearer $ACCESS_TOKEN" 'https://catalog.matatika.com:443/api/workspaces/36863ec8-15ad-4503-98b3-bc5b4cb277a0/pipelines/e25572cb-daa2-44d7-9f12-7b3127a4cc55/draft' -i -X PUT \
    -H 'Accept: application/json, application/javascript, text/javascript, text/json' \
    -H 'Content-Type: application/json' \
    -d '{
  "name" : "SIT-generated pipeline [2024-04-10T15:11:37.765649]",
  "dataComponents" : [ "tap-google-analytics", "Warehouse", "dbt" ]
}'

Python (requests)

import requests

url = "https://catalog.matatika.com:443/api/workspaces/36863ec8-15ad-4503-98b3-bc5b4cb277a0/pipelines/e25572cb-daa2-44d7-9f12-7b3127a4cc55/draft"

data = {
  "name" : "SIT-generated pipeline [2024-04-10T15:11:37.765649]",
  "dataComponents" : [ "tap-google-analytics", "Warehouse", "dbt" ]
}
# ACCESS_TOKEN is the bearer token used in the cURL example
headers = {
  'Authorization': f"Bearer {ACCESS_TOKEN}"
}

# json= serialises the body and sets the Content-Type: application/json header
response = requests.request("PUT", url, headers=headers, json=data)

print(response.text)

Response

200 OK / 201 Created

Pipeline with HAL links.

{
  "id" : "e25572cb-daa2-44d7-9f12-7b3127a4cc55",
  "status" : "PROVISIONING",
  "name" : "SIT-generated pipeline [2024-04-10T15:11:37.765649]",
  "timeout" : 0,
  "maxRetries" : 0,
  "created" : "2024-04-10T14:11:38.148468",
  "lastModified" : "2024-04-10T14:11:38.148469",
  "properties" : { },
  "dataComponents" : [ "tap-google-analytics", "Warehouse", "dbt" ],
  "actions" : [ ],
  "_embedded" : {
    "dataComponents" : [ {
      "id" : "0e155953-ad3b-4412-b1d3-70fdf2ae551e",
      "created" : "2024-04-10T14:09:44.004401",
      "lastModified" : "2024-04-10T14:09:44.004402",
      "name" : "tap-google-analytics",
      "properties" : { },
      "commands" : { },
      "dataPlugin" : "extractors/tap-google-analytics--matatika",
      "_embedded" : {
        "dataplugin" : {
          "id" : "15daaa43-8a54-471f-8534-84c0c9b10c2f",
          "repositoryPath" : "plugins/extractors/tap-google-analytics--matatika.lock",
          "pluginType" : "EXTRACTOR",
          "name" : "tap-google-analytics",
          "namespace" : "tap_google_analytics",
          "variant" : "matatika",
          "label" : "Google Analytics",
          "description" : "Google Analytics is a web analytics service that provides insights into website traffic and user behavior.\n\nGoogle Analytics allows website owners to track and analyze various metrics related to their website's performance, such as the number of visitors, pageviews, bounce rate, and average session duration. It also provides information on the demographics and interests of website visitors, as well as the sources of traffic, including organic search, paid search, social media, and referrals. This data can be used to optimize website content and marketing strategies, as well as to measure the effectiveness of advertising campaigns. Additionally, Google Analytics offers advanced features such as goal tracking, e-commerce tracking, and custom reporting, making it a powerful tool for businesses of all sizes.\n### Prerequisites\nTo obtain the OAuth identity provider authorization endpoint used to create and refresh tokens, you need to create a project in the Google API Console and enable the Google Analytics API. Then, you can create OAuth 2.0 credentials and configure the authorized redirect URIs. The authorization endpoint will be provided in the credentials.\n\nThe OAuth scopes you need to request access to depend on the specific data you want to access in Google Analytics. For example, if you want to read data from a specific view, you will need to request the \"https://www.googleapis.com/auth/analytics.readonly\" scope. You can find a list of available scopes in the Google Analytics API documentation.\n\nTo obtain the Access Token and OAuth Refresh Token, you need to authenticate the user and obtain their consent to access their Google Analytics data. This can be done using the Google Sign-In API or the OAuth 2.0 authorization flow. Once the user has granted access, you will receive an Access Token and a Refresh Token that you can use to make API requests.\n\nTo obtain the View ID, you need to log in to your Google Analytics account and navigate to the Admin section. From there, you can select the account, property, and view that you want to access and find the View ID in the View Settings.",
          "logoUrl" : "https://app.matatika.com/assets/images/datasource/tap-google-analytics.svg",
          "hidden" : false,
          "docs" : "https://www.matatika.com/docs/instant-insights/tap-google-analytics/",
          "pipUrl" : "git+https://github.com/Matatika/[email protected]",
          "repo" : "https://github.com/Matatika/tap-google-analytics",
          "capabilities" : [ "DISCOVER", "STATE", "CATALOG" ],
          "select" : [ ],
          "update" : { },
          "vars" : { },
          "settings" : [ {
            "name" : "oauth_credentials.authorization_url",
            "aliases" : [ ],
            "label" : "OAuth identity provider authorization endpoint used create and refresh tokens",
            "value" : "https://oauth2.googleapis.com/token",
            "kind" : "HIDDEN",
            "description" : "The endpoint used to create and refresh OAuth tokens.",
            "required" : "true",
            "protected" : false
          }, {
            "name" : "oauth_credentials.scope",
            "aliases" : [ ],
            "label" : "OAuth scopes we need to request access to",
            "value" : "profile email https://www.googleapis.com/auth/analytics.readonly",
            "kind" : "HIDDEN",
            "description" : "The specific scopes we need to request access to in order to connect to Google Analytics.",
            "required" : "true",
            "protected" : false
          }, {
            "name" : "oauth_credentials.access_token",
            "aliases" : [ ],
            "label" : "Access Token",
            "kind" : "HIDDEN",
            "description" : "The token used to authenticate and authorize API requests.",
            "required" : "true",
            "protected" : false
          }, {
            "name" : "oauth_credentials.refresh_token",
            "aliases" : [ ],
            "label" : "OAuth Refresh Token",
            "kind" : "HIDDEN",
            "description" : "The token used to refresh the access token when it expires.",
            "required" : "true",
            "protected" : false
          }, {
            "name" : "oauth_credentials.refresh_proxy_url",
            "aliases" : [ ],
            "label" : "Optional - will be called with 'oauth_credentials.refresh_token' to refresh the access token",
            "kind" : "HIDDEN",
            "description" : "An optional function that will be called to refresh the access token using the refresh token.",
            "protected" : false
          }, {
            "name" : "oauth_credentials.refresh_proxy_url_auth",
            "aliases" : [ ],
            "label" : "Optional - Sets Authorization header on 'oauth_credentials.refresh_url' request",
            "kind" : "HIDDEN",
            "description" : "An optional setting that sets the Authorization header on the refresh URL request.",
            "protected" : false
          }, {
            "name" : "oauth_credentials.client_id",
            "aliases" : [ ],
            "label" : "Optional - OAuth Client ID used if refresh_proxy_url not supplied",
            "kind" : "HIDDEN",
            "description" : "An optional OAuth Client ID used if the refresh proxy URL is not supplied.",
            "protected" : false
          }, {
            "name" : "oauth_credentials.client_secret",
            "aliases" : [ ],
            "label" : "Optional - OAuth Client Secret used if refresh_proxy_url not supplied",
            "kind" : "HIDDEN",
            "description" : "An optional OAuth Client Secret used if the refresh proxy URL is not supplied.",
            "protected" : false
          }, {
            "name" : "view_id",
            "aliases" : [ ],
            "label" : "View ID",
            "placeholder" : "Ex. 198343027",
            "kind" : "STRING",
            "description" : "The ID of the Google Analytics view to retrieve data from.",
            "required" : "true",
            "protected" : false
          }, {
            "name" : "reports",
            "aliases" : [ ],
            "label" : "Reports",
            "placeholder" : "Ex. my_report_definition.json",
            "kind" : "STRING",
            "description" : "The specific reports to retrieve data from in the Google Analytics view.",
            "protected" : false
          }, {
            "name" : "start_date",
            "aliases" : [ ],
            "label" : "Start date",
            "kind" : "DATE_ISO8601",
            "description" : "The start date for the date range of data to retrieve.",
            "protected" : false
          }, {
            "name" : "end_date",
            "aliases" : [ ],
            "label" : "End date",
            "kind" : "DATE_ISO8601",
            "description" : "The end date for the date range of data to retrieve.",
            "protected" : false
          } ],
          "variants" : [ ],
          "commands" : { },
          "matatikaHidden" : false,
          "requires" : [ ],
          "fullDescription" : "Google Analytics is a web analytics service that provides insights into website traffic and user behavior.\n\nGoogle Analytics allows website owners to track and analyze various metrics related to their website's performance, such as the number of visitors, pageviews, bounce rate, and average session duration. It also provides information on the demographics and interests of website visitors, as well as the sources of traffic, including organic search, paid search, social media, and referrals. This data can be used to optimize website content and marketing strategies, as well as to measure the effectiveness of advertising campaigns. Additionally, Google Analytics offers advanced features such as goal tracking, e-commerce tracking, and custom reporting, making it a powerful tool for businesses of all sizes.\n### Prerequisites\nTo obtain the OAuth identity provider authorization endpoint used to create and refresh tokens, you need to create a project in the Google API Console and enable the Google Analytics API. Then, you can create OAuth 2.0 credentials and configure the authorized redirect URIs. The authorization endpoint will be provided in the credentials.\n\nThe OAuth scopes you need to request access to depend on the specific data you want to access in Google Analytics. For example, if you want to read data from a specific view, you will need to request the \"https://www.googleapis.com/auth/analytics.readonly\" scope. You can find a list of available scopes in the Google Analytics API documentation.\n\nTo obtain the Access Token and OAuth Refresh Token, you need to authenticate the user and obtain their consent to access their Google Analytics data. This can be done using the Google Sign-In API or the OAuth 2.0 authorization flow. Once the user has granted access, you will receive an Access Token and a Refresh Token that you can use to make API requests.\n\nTo obtain the View ID, you need to log in to your Google Analytics account and navigate to the Admin section. From there, you can select the account, property, and view that you want to access and find the View ID in the View Settings.\n\n## Settings\n\n\n### View ID\n\nThe ID of the Google Analytics view to retrieve data from.\n\n### Reports\n\nThe specific reports to retrieve data from in the Google Analytics view.\n\n### Start date\n\nThe start date for the date range of data to retrieve.\n\n### End date\n\nThe end date for the date range of data to retrieve.",
          "_links" : {
            "self" : {
              "href" : "https://catalog.matatika.com/api/dataplugins/15daaa43-8a54-471f-8534-84c0c9b10c2f"
            },
            "update dataplugin" : {
              "href" : "https://catalog.matatika.com/api/workspaces/36863ec8-15ad-4503-98b3-bc5b4cb277a0/dataplugins/15daaa43-8a54-471f-8534-84c0c9b10c2f",
              "type" : "PUT"
            }
          }
        }
      },
      "draft" : true,
      "managed" : false,
      "_links" : {
        "self" : {
          "href" : "https://catalog.matatika.com/api/datacomponents/0e155953-ad3b-4412-b1d3-70fdf2ae551e"
        },
        "update datacomponent" : {
          "href" : "https://catalog.matatika.com/api/datacomponents/0e155953-ad3b-4412-b1d3-70fdf2ae551e"
        },
        "delete datacomponent" : {
          "href" : "https://catalog.matatika.com/api/datacomponents/0e155953-ad3b-4412-b1d3-70fdf2ae551e"
        }
      }
    }, {
      "id" : "a3154ba7-8410-4e22-ae97-52a5625a37ff",
      "created" : "2024-04-10T14:09:17.265704",
      "lastModified" : "2024-04-10T14:11:35.243442",
      "name" : "Warehouse",
      "properties" : {
        "password" : "Cx44A8mk_EqD01Her_AVuH1j8Y",
        "dbname" : "xcshyhw",
        "default_target_schema" : "analytics",
        "port" : "5432",
        "host" : "sharp-banana.postgres.database.azure.com",
        "user" : "xcshyhw"
      },
      "commands" : { },
      "dataPlugin" : "loaders/target-postgres--matatika",
      "_embedded" : {
        "dataplugin" : {
          "id" : "8eb6b56d-000e-4bcb-b909-fcd0f99c03e3",
          "repositoryPath" : "plugins/loaders/target-postgres--matatika.lock",
          "pluginType" : "LOADER",
          "name" : "target-postgres",
          "namespace" : "postgres_transferwise",
          "variant" : "matatika",
          "label" : "Postgres Warehouse",
          "description" : "Postgres Warehouse is a data warehousing solution built on top of the Postgres database management system.\n\nPostgres Warehouse is designed to handle large volumes of data and complex queries, making it an ideal solution for businesses that need to store and analyze large amounts of data. It provides a number of features that are specifically tailored to data warehousing, such as columnar storage, parallel processing, and support for advanced analytics. Additionally, Postgres Warehouse is highly scalable, allowing businesses to easily add more resources as their data needs grow. Overall, Postgres Warehouse is a powerful and flexible data warehousing solution that can help businesses make better decisions by providing them with the insights they need to succeed.\n### Prerequisites\nThe process of obtaining the required settings for connecting to a Postgres Warehouse may vary depending on the specific setup and configuration of the database. However, here are some general ways to obtain each of the required settings:\n\n- User: The user is typically created when the database is set up. You can ask the database administrator or check the database documentation to find out the username.\n- Password: The password is also typically created when the database is set up. You can ask the database administrator or check the database documentation to find out the password.\n- Host: The host is the server where the database is located. You can ask the database administrator or check the database documentation to find out the host name or IP address.\n- Port: The port is the number that the database listens on for incoming connections. The default port for Postgres is 5432, but it may be different depending on the configuration. You can ask the database administrator or check the database documentation to find out the port number.\n- Database Name: The database name is the name of the specific database you want to connect to. You can ask the database administrator or check the database documentation to find out the database name.\n- Default Target Schema: The default target schema is the schema that you want to use as the default when connecting to the database. This may be set up by the database administrator or you may need to create it yourself. You can ask the database administrator or check the database documentation to find out the default target schema.",
          "logoUrl" : "https://app.matatika.com/assets/logos/loaders/postgres.png",
          "hidden" : false,
          "docs" : "https://www.matatika.com/data-details/target-postgres/",
          "pipUrl" : "git+https://github.com/Matatika/[email protected]",
          "repo" : "git+https://github.com/Matatika/[email protected]",
          "capabilities" : [ ],
          "select" : [ ],
          "update" : { },
          "vars" : { },
          "settings" : [ {
            "name" : "user",
            "aliases" : [ "username" ],
            "label" : "User",
            "kind" : "STRING",
            "description" : "The username used to connect to the Postgres Warehouse.",
            "required" : "true",
            "protected" : false
          }, {
            "name" : "password",
            "aliases" : [ ],
            "label" : "Password",
            "kind" : "PASSWORD",
            "description" : "The password used to authenticate the user.",
            "required" : "true",
            "protected" : false
          }, {
            "name" : "host",
            "aliases" : [ "address" ],
            "label" : "Host",
            "kind" : "STRING",
            "description" : "The hostname or IP address of the Postgres Warehouse server.",
            "required" : "true",
            "protected" : false
          }, {
            "name" : "port",
            "aliases" : [ ],
            "label" : "Port",
            "value" : "5432",
            "kind" : "INTEGER",
            "description" : "The port number used to connect to the Postgres Warehouse server.",
            "required" : "true",
            "protected" : false
          }, {
            "name" : "dbname",
            "aliases" : [ "database" ],
            "label" : "Database Name",
            "kind" : "STRING",
            "description" : "The name of the database to connect to.",
            "required" : "true",
            "protected" : false
          }, {
            "name" : "default_target_schema",
            "aliases" : [ ],
            "label" : "Default Target Schema",
            "value" : "analytics",
            "kind" : "STRING",
            "description" : "The default schema to use when writing data to the Postgres Warehouse.",
            "required" : "true",
            "protected" : false
          }, {
            "name" : "ssl",
            "aliases" : [ ],
            "label" : "SSL",
            "value" : "false",
            "kind" : "HIDDEN",
            "description" : "Whether or not to use SSL encryption when connecting to the Postgres Warehouse.",
            "protected" : false,
            "value_post_processor" : "STRINGIFY"
          }, {
            "name" : "batch_size_rows",
            "aliases" : [ ],
            "label" : "Batch Size Rows",
            "value" : "100000",
            "kind" : "INTEGER",
            "description" : "The number of rows to write to the Postgres Warehouse in each batch.",
            "protected" : false
          }, {
            "name" : "underscore_camel_case_fields",
            "aliases" : [ ],
            "label" : "Underscore Camel Case Fields",
            "value" : "true",
            "kind" : "HIDDEN",
            "description" : "Whether or not to convert field names from camel case to underscore-separated format.",
            "protected" : false
          }, {
            "name" : "flush_all_streams",
            "aliases" : [ ],
            "label" : "Flush All Streams",
            "value" : "false",
            "kind" : "HIDDEN",
            "description" : "Whether or not to flush all streams to the Postgres Warehouse before closing the connection.",
            "protected" : false
          }, {
            "name" : "parallelism",
            "aliases" : [ ],
            "label" : "Parallelism",
            "value" : "0",
            "kind" : "HIDDEN",
            "description" : "The number of threads to use when writing data to the Postgres Warehouse.",
            "protected" : false
          }, {
            "name" : "parallelism_max",
            "aliases" : [ ],
            "label" : "Max Parallelism",
            "value" : "16",
            "kind" : "HIDDEN",
            "description" : "The maximum number of threads to use when writing data to the Postgres Warehouse.",
            "protected" : false
          }, {
            "name" : "default_target_schema_select_permission",
            "aliases" : [ ],
            "label" : "Default Target Schema Select Permission",
            "kind" : "HIDDEN",
            "description" : "The permission level required to select data from the default target schema.",
            "protected" : false
          }, {
            "name" : "schema_mapping",
            "aliases" : [ ],
            "label" : "Schema Mapping",
            "kind" : "HIDDEN",
            "description" : "A mapping of source schema names to target schema names.",
            "protected" : false
          }, {
            "name" : "add_metadata_columns",
            "aliases" : [ ],
            "label" : "Add Metadata Columns",
            "value" : "true",
            "kind" : "HIDDEN",
            "description" : "Whether or not to add metadata columns to the target table.",
            "protected" : false
          }, {
            "name" : "hard_delete",
            "aliases" : [ ],
            "label" : "Hard Delete",
            "value" : "false",
            "kind" : "HIDDEN",
            "description" : "Whether or not to perform hard deletes when deleting data from the Postgres Warehouse.",
            "protected" : false
          }, {
            "name" : "data_flattening_max_level",
            "aliases" : [ ],
            "label" : "Data Flattening Max Level",
            "value" : "10",
            "kind" : "HIDDEN",
            "description" : "The maximum level of nested data structures to flatten when writing data to the Postgres Warehouse.",
            "protected" : false
          }, {
            "name" : "primary_key_required",
            "aliases" : [ ],
            "label" : "Primary Key Required",
            "value" : "false",
            "kind" : "BOOLEAN",
            "description" : "Whether or not a primary key is required for the target table.",
            "protected" : false
          }, {
            "name" : "validate_records",
            "aliases" : [ ],
            "label" : "Validate Records",
            "value" : "false",
            "kind" : "BOOLEAN",
            "description" : "Whether or not to validate records before writing them to the Postgres Warehouse.",
            "protected" : false
          }, {
            "name" : "temp_dir",
            "aliases" : [ ],
            "label" : "Temporary Directory",
            "kind" : "HIDDEN",
            "description" : "The directory to use for temporary files when writing data to the Postgres Warehouse.",
            "protected" : false
          } ],
          "variants" : [ ],
          "commands" : { },
          "matatikaHidden" : false,
          "requires" : [ ],
          "fullDescription" : "Postgres Warehouse is a data warehousing solution built on top of the Postgres database management system.\n\nPostgres Warehouse is designed to handle large volumes of data and complex queries, making it an ideal solution for businesses that need to store and analyze large amounts of data. It provides a number of features that are specifically tailored to data warehousing, such as columnar storage, parallel processing, and support for advanced analytics. Additionally, Postgres Warehouse is highly scalable, allowing businesses to easily add more resources as their data needs grow. Overall, Postgres Warehouse is a powerful and flexible data warehousing solution that can help businesses make better decisions by providing them with the insights they need to succeed.\n### Prerequisites\nThe process of obtaining the required settings for connecting to a Postgres Warehouse may vary depending on the specific setup and configuration of the database. However, here are some general ways to obtain each of the required settings:\n\n- User: The user is typically created when the database is set up. You can ask the database administrator or check the database documentation to find out the username.\n- Password: The password is also typically created when the database is set up. You can ask the database administrator or check the database documentation to find out the password.\n- Host: The host is the server where the database is located. You can ask the database administrator or check the database documentation to find out the host name or IP address.\n- Port: The port is the number that the database listens on for incoming connections. The default port for Postgres is 5432, but it may be different depending on the configuration. You can ask the database administrator or check the database documentation to find out the port number.\n- Database Name: The database name is the name of the specific database you want to connect to. You can ask the database administrator or check the database documentation to find out the database name.\n- Default Target Schema: The default target schema is the schema that you want to use as the default when connecting to the database. This may be set up by the database administrator or you may need to create it yourself. You can ask the database administrator or check the database documentation to find out the default target schema.\n\n## Settings\n\n\n### User\n\nThe username used to connect to the Postgres Warehouse.\n\n### Password\n\nThe password used to authenticate the user.\n\n### Host\n\nThe hostname or IP address of the Postgres Warehouse server.\n\n### Port\n\nThe port number used to connect to the Postgres Warehouse server.\n\n### Database Name\n\nThe name of the database to connect to.\n\n### Default Target Schema\n\nThe default schema to use when writing data to the Postgres Warehouse.\n\n### Batch Size Rows\n\nThe number of rows to write to the Postgres Warehouse in each batch.\n\n### Primary Key Required\n\nWhether or not a primary key is required for the target table.\n\n### Validate Records\n\nWhether or not to validate records before writing them to the Postgres Warehouse.",
          "_links" : {
            "self" : {
              "href" : "https://catalog.matatika.com/api/dataplugins/8eb6b56d-000e-4bcb-b909-fcd0f99c03e3"
            },
            "update dataplugin" : {
              "href" : "https://catalog.matatika.com/api/workspaces/36863ec8-15ad-4503-98b3-bc5b4cb277a0/dataplugins/8eb6b56d-000e-4bcb-b909-fcd0f99c03e3",
              "type" : "PUT"
            }
          }
        }
      },
      "draft" : false,
      "managed" : true,
      "_links" : {
        "self" : {
          "href" : "https://catalog.matatika.com/api/datacomponents/a3154ba7-8410-4e22-ae97-52a5625a37ff"
        },
        "update datacomponent" : {
          "href" : "https://catalog.matatika.com/api/datacomponents/a3154ba7-8410-4e22-ae97-52a5625a37ff"
        },
        "delete datacomponent" : {
          "href" : "https://catalog.matatika.com/api/datacomponents/a3154ba7-8410-4e22-ae97-52a5625a37ff"
        }
      }
    }, {
      "id" : "a204013b-57c6-4b00-9765-7f06fe1878f8",
      "created" : "2024-04-10T14:11:34.838299",
      "lastModified" : "2024-04-10T14:11:34.838319",
      "name" : "dbt",
      "properties" : { },
      "commands" : {
        "compile" : {
          "args" : "compile",
          "description" : "Generates executable SQL from source model, test, and analysis files. Compiled SQL files are written to the target/ directory."
        },
        "seed" : {
          "args" : "seed",
          "description" : "Load data from csv files into your data warehouse."
        },
        "test" : {
          "args" : "test",
          "description" : "Runs tests on data in deployed models."
        },
        "docs-generate" : {
          "args" : "docs generate",
          "description" : "Generate documentation artifacts for your project."
        },
        "deps" : {
          "args" : "deps",
          "description" : "Pull the most recent version of the dependencies listed in packages.yml"
        },
        "run" : {
          "args" : "run",
          "description" : "Compile SQL and execute against the current target database."
        },
        "clean" : {
          "args" : "clean",
          "description" : "Delete all folders in the clean-targets list (usually the dbt_modules and target directories.)"
        },
        "snapshot" : {
          "args" : "snapshot",
          "description" : "Execute snapshots defined in your project."
        }
      },
      "dataPlugin" : "transformers/dbt--dbt-labs",
      "_embedded" : {
        "dataplugin" : {
          "id" : "e1e1cf84-a771-451b-972d-829c74e18a48",
          "repositoryPath" : "plugins/transformers/dbt--dbt-labs.lock",
          "pluginType" : "TRANSFORMER",
          "name" : "dbt",
          "namespace" : "dbt",
          "variant" : "dbt-labs",
          "label" : "dbt",
          "logoUrl" : "https://app.matatika.com/assets/images/transformer/dbt.png",
          "hidden" : false,
          "docs" : "https://www.matatika.com/data-details/dbt/",
          "pipUrl" : "dbt-core~=1.3.0 dbt-postgres~=1.3.0 dbt-snowflake~=1.3.0\n",
          "repo" : "https://github.com/dbt-labs/dbt-core",
          "capabilities" : [ ],
          "select" : [ ],
          "update" : { },
          "vars" : { },
          "settings" : [ {
            "name" : "project_dir",
            "aliases" : [ ],
            "value" : "$MELTANO_PROJECT_ROOT/transform",
            "kind" : "STRING",
            "protected" : false
          }, {
            "name" : "profiles_dir",
            "aliases" : [ ],
            "value" : "$MELTANO_PROJECT_ROOT/transform/profile",
            "kind" : "STRING",
            "env" : "DBT_PROFILES_DIR",
            "protected" : false
          }, {
            "name" : "target",
            "aliases" : [ ],
            "value" : "$MELTANO_LOAD__DIALECT",
            "kind" : "STRING",
            "protected" : false
          }, {
            "name" : "source_schema",
            "aliases" : [ ],
            "value" : "$MELTANO_LOAD__TARGET_SCHEMA",
            "kind" : "STRING",
            "protected" : false
          }, {
            "name" : "target_schema",
            "aliases" : [ ],
            "value" : "analytics",
            "kind" : "STRING",
            "protected" : false
          }, {
            "name" : "models",
            "aliases" : [ ],
            "value" : "$MELTANO_TRANSFORM__PACKAGE_NAME $MELTANO_EXTRACTOR_NAMESPACE my_meltano_project",
            "kind" : "STRING",
            "protected" : false
          } ],
          "variants" : [ ],
          "commands" : {
            "compile" : {
              "args" : "compile",
              "description" : "Generates executable SQL from source model, test, and analysis files. Compiled SQL files are written to the target/ directory."
            },
            "seed" : {
              "args" : "seed",
              "description" : "Load data from csv files into your data warehouse."
            },
            "test" : {
              "args" : "test",
              "description" : "Runs tests on data in deployed models."
            },
            "docs-generate" : {
              "args" : "docs generate",
              "description" : "Generate documentation artifacts for your project."
            },
            "deps" : {
              "args" : "deps",
              "description" : "Pull the most recent version of the dependencies listed in packages.yml"
            },
            "run" : {
              "args" : "run",
              "description" : "Compile SQL and execute against the current target database."
            },
            "clean" : {
              "args" : "clean",
              "description" : "Delete all folders in the clean-targets list (usually the dbt_modules and target directories.)"
            },
            "snapshot" : {
              "args" : "snapshot",
              "description" : "Execute snapshots defined in your project."
            }
          },
          "matatikaHidden" : false,
          "requires" : [ {
            "id" : "4b54a59d-5399-4877-9571-a6ab5c77e7d9",
            "repositoryPath" : "plugins/files/files-dbt--matatika.lock",
            "pluginType" : "FILE",
            "name" : "files-dbt",
            "namespace" : "dbt",
            "variant" : "matatika",
            "label" : "files-dbt",
            "hidden" : false,
            "pipUrl" : "git+https://github.com/Matatika/[email protected]",
            "repo" : "https://github.com/Matatika/files-dbt",
            "capabilities" : [ ],
            "select" : [ ],
            "update" : {
              "transform/profile/profiles.yml" : "true"
            },
            "vars" : { },
            "settings" : [ ],
            "variants" : [ ],
            "commands" : { },
            "matatikaHidden" : false,
            "requires" : [ ],
            "fullDescription" : ""
          } ],
          "fullDescription" : "",
          "_links" : {
            "self" : {
              "href" : "https://catalog.matatika.com/api/dataplugins/e1e1cf84-a771-451b-972d-829c74e18a48"
            },
            "update dataplugin" : {
              "href" : "https://catalog.matatika.com/api/workspaces/36863ec8-15ad-4503-98b3-bc5b4cb277a0/dataplugins/e1e1cf84-a771-451b-972d-829c74e18a48",
              "type" : "PUT"
            }
          }
        }
      },
      "draft" : false,
      "managed" : false,
      "_links" : {
        "self" : {
          "href" : "https://catalog.matatika.com/api/datacomponents/a204013b-57c6-4b00-9765-7f06fe1878f8"
        },
        "update datacomponent" : {
          "href" : "https://catalog.matatika.com/api/datacomponents/a204013b-57c6-4b00-9765-7f06fe1878f8"
        },
        "delete datacomponent" : {
          "href" : "https://catalog.matatika.com/api/datacomponents/a204013b-57c6-4b00-9765-7f06fe1878f8"
        }
      }
    } ],
    "latest job" : {
      "id" : "834fe1aa-1371-4dd0-886e-95ccecfa8698",
      "created" : "2024-04-10T14:11:38.662251",
      "type" : "WORKSPACE_CONFIG",
      "maxAttempts" : 0,
      "attempt" : 0,
      "status" : "QUEUED",
      "_embedded" : {
        "pipeline" : {
          "id" : "e25572cb-daa2-44d7-9f12-7b3127a4cc55",
          "status" : "PROVISIONING",
          "name" : "SIT-generated pipeline [2024-04-10T15:11:37.765649]",
          "timeout" : 0,
          "maxRetries" : 0,
          "created" : "2024-04-10T14:11:38.148468",
          "lastModified" : "2024-04-10T14:11:38.148469",
          "properties" : { },
          "dataComponents" : [ "tap-google-analytics", "Warehouse", "dbt" ],
          "actions" : [ ]
        },
        "profile" : {
          "id" : "auth0|5eb0327cbfd7490bff55feeb",
          "name" : "[email protected]",
          "handle" : "@sit+prod",
          "email" : "[email protected]"
        }
      },
      "_links" : {
        "self" : {
          "href" : "https://catalog.matatika.com/api/jobs/834fe1aa-1371-4dd0-886e-95ccecfa8698"
        },
        "delete job" : {
          "href" : "https://catalog.matatika.com/api/jobs/834fe1aa-1371-4dd0-886e-95ccecfa8698",
          "type" : "DELETE"
        },
        "logs" : {
          "href" : "https://catalog.matatika.com/api/jobs/834fe1aa-1371-4dd0-886e-95ccecfa8698/logs?sequence=0",
          "type" : "GET"
        },
        "withdraw job" : {
          "href" : "https://catalog.matatika.com/api/jobs/834fe1aa-1371-4dd0-886e-95ccecfa8698/stopped",
          "type" : "PUT"
        }
      }
    }
  },
  "_links" : {
    "update pipeline" : {
      "href" : "https://catalog.matatika.com/api/workspaces/36863ec8-15ad-4503-98b3-bc5b4cb277a0/pipelines/e25572cb-daa2-44d7-9f12-7b3127a4cc55",
      "type" : "PUT"
    },
    "delete pipeline" : {
      "href" : "https://catalog.matatika.com/api/pipelines/e25572cb-daa2-44d7-9f12-7b3127a4cc55",
      "type" : "DELETE"
    },
    "draft pipeline" : {
      "href" : "https://catalog.matatika.com/api/workspaces/36863ec8-15ad-4503-98b3-bc5b4cb277a0/pipelines/e25572cb-daa2-44d7-9f12-7b3127a4cc55/draft",
      "type" : "PUT"
    },
    "self" : {
      "href" : "https://catalog.matatika.com/api/pipelines/e25572cb-daa2-44d7-9f12-7b3127a4cc55"
    },
    "environment" : {
      "href" : "https://catalog.matatika.com/api/pipelines/e25572cb-daa2-44d7-9f12-7b3127a4cc55/environment"
    },
    "jobs" : {
      "href" : "https://catalog.matatika.com/api/pipelines/e25572cb-daa2-44d7-9f12-7b3127a4cc55/jobs",
      "type" : "GET"
    },
    "metrics" : {
      "href" : "https://catalog.matatika.com/api/pipelines/e25572cb-daa2-44d7-9f12-7b3127a4cc55/metrics"
    },
    "add subscription" : {
      "href" : "https://catalog.matatika.com/api/pipelines/e25572cb-daa2-44d7-9f12-7b3127a4cc55/subscriptions"
    },
    "withdraw job" : {
      "href" : "https://catalog.matatika.com/api/jobs/834fe1aa-1371-4dd0-886e-95ccecfa8698/stopped",
      "type" : "PUT"
    },
    "latest job" : {
      "href" : "https://catalog.matatika.com/api/jobs/834fe1aa-1371-4dd0-886e-95ccecfa8698"
    }
  }
}
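
As a rough illustration of working with the HAL links in this response, the sketch below retrieves the pipeline, follows its "latest job" link, and prints that job's logs. It is a minimal sketch only, assuming ACCESS_TOKEN is available as an environment variable; the URLs come from the _links shown above.

import os

import requests

# Minimal sketch: follow the HAL links from the pipeline response above.
# Assumes ACCESS_TOKEN is supplied via the environment.
ACCESS_TOKEN = os.environ["ACCESS_TOKEN"]
headers = {"Authorization": f"Bearer {ACCESS_TOKEN}"}

pipeline = requests.get(
    "https://catalog.matatika.com/api/pipelines/e25572cb-daa2-44d7-9f12-7b3127a4cc55",
    headers=headers,
).json()

# Follow the "latest job" link rather than constructing the job URL by hand
job = requests.get(pipeline["_links"]["latest job"]["href"], headers=headers).json()
print(job["status"])  # e.g. QUEUED

# Fetch the job logs from its "logs" link (the href starts at sequence=0)
logs = requests.get(job["_links"]["logs"]["href"], headers=headers)
print(logs.text)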

Validate a pipeline configuration in a workspace

POST

/api/workspaces/{workspace-id}/pipelines/validation

Validates a pipeline configuration in the workspace {workspace-id}.

Prerequisites

  • Workspace {workspace-id} must exist

Request

Body

Pipeline resource.

{
  "name" : "SIT-generated pipeline [2024-04-10T15:11:37.765649] (updated)",
  "dataComponents" : [ "tap-google-analytics", "Warehouse", "dbt" ]
}

Example Snippets

cURL

curl -H "Authorization: Bearer $ACCESS_TOKEN" 'https://catalog.matatika.com:443/api/workspaces/36863ec8-15ad-4503-98b3-bc5b4cb277a0/pipelines/validation' -i -X POST \
    -H 'Accept: application/json, application/javascript, text/javascript, text/json' \
    -H 'Content-Type: application/json' \
    -d '{
  "name" : "SIT-generated pipeline [2024-04-10T15:11:37.765649] (updated)",
  "dataComponents" : [ "tap-google-analytics", "Warehouse", "dbt" ]
}'

Python (requests)

import requests

url = "https://catalog.matatika.com:443/api/workspaces/36863ec8-15ad-4503-98b3-bc5b4cb277a0/pipelines/validation"

data = {
  "name" : "SIT-generated pipeline [2024-04-10T15:11:37.765649] (updated)",
  "dataComponents" : [ "tap-google-analytics", "Warehouse", "dbt" ]
}
headers = {
  'Authorization': f"Bearer {ACCESS_TOKEN}"  # ACCESS_TOKEN holds your API access token
}

# json= serialises the payload and sets the Content-Type: application/json header
response = requests.request("POST", url, headers=headers, json=data)

print(response.text)

Response

200 OK

No response body provided.

400 Bad Request

Pipeline property validation errors.

{
  "timestamp" : "2024-04-10T14:12:26.453295713",
  "status" : 400,
  "error" : "Bad Request",
  "message" : "3 validation errors from 'resource'",
  "errors" : [ {
    "codes" : [ "NotBlank.oauth_credentials.access_token", "NotBlank" ],
    "defaultMessage" : "No value given for setting",
    "objectName" : "resource",
    "field" : "properties.tap-google-analytics.oauth_credentials.access_token",
    "bindingFailure" : true,
    "code" : "NotBlank"
  }, {
    "codes" : [ "NotBlank.oauth_credentials.refresh_token", "NotBlank" ],
    "defaultMessage" : "No value given for setting",
    "objectName" : "resource",
    "field" : "properties.tap-google-analytics.oauth_credentials.refresh_token",
    "bindingFailure" : true,
    "code" : "NotBlank"
  }, {
    "codes" : [ "NotBlank.view_id", "NotBlank" ],
    "defaultMessage" : "No value given for setting",
    "objectName" : "resource",
    "field" : "properties.tap-google-analytics.view_id",
    "bindingFailure" : true,
    "code" : "NotBlank"
  } ],
  "path" : "/api/workspaces/36863ec8-15ad-4503-98b3-bc5b4cb277a0/pipelines/validation"
}
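
A minimal sketch of acting on this response: each entry in errors names the offending setting in field and gives a reason in defaultMessage, so a client can report them directly. It assumes ACCESS_TOKEN is available as an environment variable and reuses the example payload above.

import os

import requests

# Minimal sketch of handling a validation response, assuming ACCESS_TOKEN is
# supplied via the environment; the payload mirrors the example request above.
ACCESS_TOKEN = os.environ["ACCESS_TOKEN"]
url = (
    "https://catalog.matatika.com/api/workspaces/"
    "36863ec8-15ad-4503-98b3-bc5b4cb277a0/pipelines/validation"
)
payload = {
    "name": "SIT-generated pipeline [2024-04-10T15:11:37.765649] (updated)",
    "dataComponents": ["tap-google-analytics", "Warehouse", "dbt"],
}

response = requests.post(
    url,
    headers={"Authorization": f"Bearer {ACCESS_TOKEN}"},
    json=payload,
)

if response.status_code == 200:
    print("Pipeline configuration is valid")
elif response.status_code == 400:
    # Each error names the offending setting in "field" and a reason in "defaultMessage"
    for error in response.json()["errors"]:
        print(f"{error['field']}: {error['defaultMessage']}")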

Verify a pipeline

POST

/api/pipelines/{pipeline-id}/verification

Verifies the configuration of the pipeline {pipeline-id}.

Prerequisites

  • Pipeline {pipeline-id} must exist

Request

Example Snippets

cURL

curl -H "Authorization: Bearer $ACCESS_TOKEN" 'https://catalog.matatika.com:443/api/pipelines/da46d020-47c8-4e63-987a-3d0d1532af04/verification' -i -X POST \
    -H 'Accept: application/json, application/javascript, text/javascript, text/json' \
    -H 'Content-Type: application/json'

Python (requests)

import requests

url = "https://catalog.matatika.com:443/api/pipelines/da46d020-47c8-4e63-987a-3d0d1532af04/verification"

headers = {
  'Authorization': f"Bearer {ACCESS_TOKEN}"  # ACCESS_TOKEN holds your API access token
}

response = requests.request("POST", url, headers=headers)

print(response.text)

Response

200 OK

Job with HAL links.

{
  "id" : "65146245-78a8-49e6-80da-207110cea5fa",
  "created" : "2024-04-10T14:14:58.981039",
  "type" : "PIPELINE_VERIFY",
  "maxAttempts" : 0,
  "attempt" : 0,
  "status" : "QUEUED",
  "_embedded" : {
    "pipeline" : {
      "id" : "da46d020-47c8-4e63-987a-3d0d1532af04",
      "status" : "READY",
      "name" : "SIT-Generated Pipeline [2024-04-10T15:13:10.386034]",
      "timeout" : 0,
      "maxRetries" : 0,
      "created" : "2024-04-10T14:13:10.796155",
      "lastModified" : "2024-04-10T14:13:10.796156",
      "properties" : { },
      "dataComponents" : [ "tap-matatika-sit", "Warehouse", "dbt" ],
      "actions" : [ ],
      "repositoryPath" : "pipelines/SIT-Generated Pipeline [2024-04-10T15:13:10.386034].yml"
    },
    "profile" : {
      "id" : "auth0|5eb0327cbfd7490bff55feeb",
      "name" : "[email protected]",
      "handle" : "@sit+prod",
      "email" : "[email protected]"
    }
  },
  "_links" : {
    "self" : {
      "href" : "https://catalog.matatika.com/api/jobs/65146245-78a8-49e6-80da-207110cea5fa"
    },
    "delete job" : {
      "href" : "https://catalog.matatika.com/api/jobs/65146245-78a8-49e6-80da-207110cea5fa",
      "type" : "DELETE"
    },
    "logs" : {
      "href" : "https://catalog.matatika.com/api/jobs/65146245-78a8-49e6-80da-207110cea5fa/logs?sequence=0",
      "type" : "GET"
    },
    "withdraw job" : {
      "href" : "https://catalog.matatika.com/api/jobs/65146245-78a8-49e6-80da-207110cea5fa/stopped",
      "type" : "PUT"
    }
  }
}
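
The returned job starts in the QUEUED status, so a client will typically re-fetch it via its self link until the status changes. The sketch below is one way to do that, assuming ACCESS_TOKEN is available as an environment variable; the terminal job statuses are not documented here, so it only waits for the job to leave QUEUED.

import os
import time

import requests

# Rough polling sketch, assuming ACCESS_TOKEN is supplied via the environment.
ACCESS_TOKEN = os.environ["ACCESS_TOKEN"]
headers = {"Authorization": f"Bearer {ACCESS_TOKEN}"}

job = requests.post(
    "https://catalog.matatika.com/api/pipelines/da46d020-47c8-4e63-987a-3d0d1532af04/verification",
    headers=headers,
).json()

while job["status"] == "QUEUED":
    time.sleep(5)  # re-check every few seconds via the job's self link
    job = requests.get(job["_links"]["self"]["href"], headers=headers).json()

print(f"Verification job is now {job['status']}")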

Delete a pipeline

DELETE

/api/pipelines/{pipeline-id}

Deletes the pipeline {pipeline-id}.

Prerequisites

  • Pipeline {pipeline-id} must exist

Request

Example Snippets

cURL

curl -H "Authorization: Bearer $ACCESS_TOKEN" 'https://catalog.matatika.com:443/api/pipelines/e25572cb-daa2-44d7-9f12-7b3127a4cc55' -i -X DELETE \
    -H 'Accept: application/json, application/javascript, text/javascript, text/json' \
    -H 'Content-Type: application/json'

Python (requests)

import requests

url = "https://catalog.matatika.com:443/api/pipelines/e25572cb-daa2-44d7-9f12-7b3127a4cc55"

headers = {
  'Authorization': f"Bearer {ACCESS_TOKEN}"  # ACCESS_TOKEN holds your API access token
}

response = requests.request("DELETE", url, headers=headers)

print(response.text)

Response

204 No Content

No response body provided.
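
Since a successful delete returns 204 with no body, checking the status code is sufficient. A minimal sketch, assuming ACCESS_TOKEN is available as an environment variable:

import os

import requests

# Minimal sketch: a successful delete returns 204 with no body, so only the
# status code matters. Assumes ACCESS_TOKEN is supplied via the environment.
ACCESS_TOKEN = os.environ["ACCESS_TOKEN"]

response = requests.delete(
    "https://catalog.matatika.com/api/pipelines/e25572cb-daa2-44d7-9f12-7b3127a4cc55",
    headers={"Authorization": f"Bearer {ACCESS_TOKEN}"},
)

if response.status_code == 204:
    print("Pipeline deleted")
else:
    print(f"Unexpected response: {response.status_code}")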


View pipeline metrics

GET

/api/pipelines/{pipeline-id}/metrics

Returns the metrics for each job of the pipeline {pipeline-id}.

Prerequisites

  • Pipeline {pipeline-id} must exist

Request

Example Snippets

cURL

curl -H "Authorization: Bearer $ACCESS_TOKEN" 'https://catalog.matatika.com:443/api/pipelines/da46d020-47c8-4e63-987a-3d0d1532af04/metrics' -i -X GET \
    -H 'Accept: application/json, application/javascript, text/javascript, text/json' \
    -H 'Content-Type: application/json'

Python (requests)

import requests

url = "https://catalog.matatika.com:443/api/pipelines/da46d020-47c8-4e63-987a-3d0d1532af04/metrics"

headers = {
  'Authorization': f"Bearer {ACCESS_TOKEN}"  # ACCESS_TOKEN holds your API access token
}

response = requests.request("GET", url, headers=headers)

print(response.text)

Response

200 OK

The dataset data (defaults to JSON format).

[ {
  "metrics.job-created" : "2024-04-10 14:13:11",
  "metrics.value" : 0.0
}, {
  "metrics.job-created" : "2024-04-10 14:14:58",
  "metrics.value" : 0.0
}, {
  "metrics.job-created" : "2024-04-10 14:16:10",
  "metrics.value" : 6.0
} ]

204 No Content

No response body provided when metrics are not enabled.
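
The 200 response is a flat list of metrics.job-created / metrics.value pairs, which can be reshaped into timestamped points for reporting or charting. A minimal sketch, assuming ACCESS_TOKEN is available as an environment variable:

import os
from datetime import datetime

import requests

# Minimal sketch reshaping the metrics payload into timestamped points, assuming
# ACCESS_TOKEN is supplied via the environment; a 204 means metrics are not enabled.
ACCESS_TOKEN = os.environ["ACCESS_TOKEN"]

response = requests.get(
    "https://catalog.matatika.com/api/pipelines/da46d020-47c8-4e63-987a-3d0d1532af04/metrics",
    headers={"Authorization": f"Bearer {ACCESS_TOKEN}"},
)

if response.status_code == 204:
    print("Metrics are not enabled for this pipeline")
else:
    for row in response.json():
        created = datetime.strptime(row["metrics.job-created"], "%Y-%m-%d %H:%M:%S")
        print(f"{created.isoformat()}: {row['metrics.value']}")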