GET /api/entities/v1

ListEntities
curl --request GET \
  --url https://developer.synq.io/api/entities/v1 \
  --header 'Authorization: Bearer <token>'
{
  "pageInfo": {
    "totalCount": 123,
    "count": 123,
    "lastId": "<string>"
  },
  "entityIds": [
    {
      "airflowDag": {
        "integrationId": "<string>",
        "dagId": "<string>"
      }
    }
  ]
}
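The same request can be sketched in Python with the standard library, mirroring the curl example above (the endpoint URL and bearer-token scheme are taken from that example; sending the request requires a valid token and network access):

```python
import json
import urllib.request

# Endpoint and auth scheme as shown in the curl example above.
BASE_URL = "https://developer.synq.io/api/entities/v1"

def build_request(token: str) -> urllib.request.Request:
    """Construct (but do not send) the ListEntities GET request."""
    return urllib.request.Request(
        BASE_URL,
        headers={"Authorization": f"Bearer {token}"},
        method="GET",
    )

req = build_request("<token>")
# Actually sending the request (requires network and a real token):
# with urllib.request.urlopen(req) as resp:
#     body = json.loads(resp.read())
#     print(body["pageInfo"]["totalCount"])
```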

Authorizations

Authorization
string
header
required

Bearer authentication header of the form Bearer <token>, where <token> is your auth token.

Query Parameters

query.parts.identifierList.identifiers.dbtCoreNode.integrationId
string<uuid>
required

SYNQ integration_id that identifies the dbt Core project

query.parts.identifierList.identifiers.dbtCoreNode.nodeId
string
required

dbt node_id that identifies one of the dbt DAG nodes (model, test, etc.)

query.parts.identifierList.identifiers.dbtCloudNode.projectId
string
required

Your dbt Cloud project id

query.parts.identifierList.identifiers.dbtCloudNode.accountId
string

Your dbt Cloud account id

query.parts.identifierList.identifiers.dbtCloudNode.nodeId
string
required

dbt node_id that identifies one of the dbt DAG nodes (model, test, etc.)

query.parts.identifierList.identifiers.bigqueryTable.project
string
required

BigQuery project

query.parts.identifierList.identifiers.bigqueryTable.dataset
string
required

BigQuery dataset id

query.parts.identifierList.identifiers.bigqueryTable.table
string
required

BigQuery table name

query.parts.identifierList.identifiers.snowflakeTable.account
string
required

Snowflake account

query.parts.identifierList.identifiers.snowflakeTable.database
string
required

Snowflake database

query.parts.identifierList.identifiers.snowflakeTable.schema
string
required

Snowflake schema

query.parts.identifierList.identifiers.snowflakeTable.table
string
required

Snowflake table

query.parts.identifierList.identifiers.redshiftTable.cluster
string

Redshift cluster

query.parts.identifierList.identifiers.redshiftTable.database
string
required

Redshift database

query.parts.identifierList.identifiers.redshiftTable.schema
string
required

Redshift schema

query.parts.identifierList.identifiers.redshiftTable.table
string
required

Redshift table

query.parts.identifierList.identifiers.postgresTable.host
string
required

Postgres hostname without port

query.parts.identifierList.identifiers.postgresTable.database
string
required

Postgres database

query.parts.identifierList.identifiers.postgresTable.schema
string
required

Postgres schema

query.parts.identifierList.identifiers.postgresTable.table
string
required

Postgres table

query.parts.identifierList.identifiers.mysqlTable.host
string
required

MySQL hostname without port

query.parts.identifierList.identifiers.mysqlTable.schema
string
required

MySQL database

query.parts.identifierList.identifiers.mysqlTable.table
string
required

MySQL table

query.parts.identifierList.identifiers.clickhouseTable.host
string
required

ClickHouse hostname without port

query.parts.identifierList.identifiers.clickhouseTable.schema
string
required

ClickHouse database

query.parts.identifierList.identifiers.clickhouseTable.table
string
required

ClickHouse table

query.parts.identifierList.identifiers.airflowDag.integrationId
string
required

SYNQ integration_id that identifies the Airflow instance

query.parts.identifierList.identifiers.airflowDag.dagId
string
required

Airflow dag_id that identifies the DAG

query.parts.identifierList.identifiers.airflowTask.integrationId
string
required

SYNQ integration_id that identifies the Airflow instance

query.parts.identifierList.identifiers.airflowTask.dagId
string
required

Airflow dag_id that identifies the DAG

query.parts.identifierList.identifiers.airflowTask.taskId
string
required

Airflow task_id that identifies the task within the DAG

query.parts.identifierList.identifiers.custom.id
string
required

ID that identifies the custom entity. The ID should be unique within the custom entity identifier.

query.parts.identifierList.identifiers.dataproduct.id
string<uuid>
required

Dataproduct id that identifies the Dataproduct

query.parts.identifierList.identifiers.synqPath.path
string
required

SYNQ path that identifies the SYNQ entity; must be one of the supported paths

query.parts.identifierList.identifiers.databricksTable.workspace
string
required

URL of Databricks workspace

query.parts.identifierList.identifiers.databricksTable.catalog
string
required

Databricks catalog

query.parts.identifierList.identifiers.databricksTable.schema
string
required

Databricks schema

query.parts.identifierList.identifiers.databricksTable.table
string
required

Databricks table or view

query.parts.identifierList.identifiers.trinoTable.host
string
required

Hostname of the Trino instance

query.parts.identifierList.identifiers.trinoTable.catalog
string
required

Trino catalog

query.parts.identifierList.identifiers.trinoTable.schema
string
required

Trino schema

query.parts.identifierList.identifiers.trinoTable.table
string
required

Trino table or view

query.parts.identifierList.identifiers.sqlMeshModel.integrationId
string<uuid>
required

SYNQ integration_id that identifies the SQLMesh project

query.parts.identifierList.identifiers.sqlMeshModel.fqn
string
required

SQLMesh model fully qualified name

query.parts.identifierList.identifiers.sqlMeshAudit.integrationId
string<uuid>
required

SYNQ integration_id that identifies the SQLMesh project

query.parts.identifierList.identifiers.sqlMeshAudit.fqn
string
required

SQLMesh model fully qualified name

query.parts.identifierList.identifiers.sqlMeshAudit.auditId
string
required

Identifier of the audit

query.parts.identifierList.identifiers.monitor.monitorId
string
required

Identifier of the monitor

query.parts.identifierList.identifiers.monitor.segment
string

Optional monitor segmentation identifier

query.parts.identifierList.identifiers.monitor.integrationId
string
deprecated

SYNQ integration_id of the monitored identifier
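The dotted parameter names above describe a nested structure flattened into individual query keys. A minimal sketch of how such a structure could be flattened (the `flatten` helper and the exact wire encoding are illustrative assumptions, not part of the documented API):

```python
from urllib.parse import urlencode

def flatten(prefix: str, obj: dict) -> dict:
    """Flatten a nested dict into dotted query keys.

    Illustrative only: the helper and the exact encoding are
    assumptions, not the documented wire format.
    """
    out = {}
    for key, value in obj.items():
        path = f"{prefix}.{key}" if prefix else key
        if isinstance(value, dict):
            out.update(flatten(path, value))
        else:
            out[path] = value
    return out

# Example: a bigqueryTable identifier flattened into the dotted
# parameter paths documented above.
params = flatten(
    "query.parts.identifierList.identifiers",
    {"bigqueryTable": {"project": "my-project",
                       "dataset": "analytics",
                       "table": "orders"}},
)
query_string = urlencode(params)
```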

query.parts.withNameSearch.searchQuery
string
query.parts.withType.type.default
enum<string>

Default SYNQ types

Available options:
ENTITY_TYPE_UNSPECIFIED,
ENTITY_TYPE_BQ_TABLE,
ENTITY_TYPE_BQ_VIEW,
ENTITY_TYPE_LOOKER_LOOK,
ENTITY_TYPE_LOOKER_EXPLORE,
ENTITY_TYPE_LOOKER_VIEW,
ENTITY_TYPE_LOOKER_DASHBOARD,
ENTITY_TYPE_DBT_MODEL,
ENTITY_TYPE_DBT_TEST,
ENTITY_TYPE_DBT_SOURCE,
ENTITY_TYPE_DBT_PROJECT,
ENTITY_TYPE_DBT_METRIC,
ENTITY_TYPE_DBT_SNAPSHOT,
ENTITY_TYPE_DBT_SEED,
ENTITY_TYPE_DBT_ANALYSIS,
ENTITY_TYPE_DBT_EXPOSURE,
ENTITY_TYPE_DBT_GROUP,
ENTITY_TYPE_DBT_CLOUD_PROJECT,
ENTITY_TYPE_DBT_CLOUD_JOB,
ENTITY_TYPE_SNOWFLAKE_TABLE,
ENTITY_TYPE_SNOWFLAKE_VIEW,
ENTITY_TYPE_SNOWFLAKE_STREAM,
ENTITY_TYPE_SNOWFLAKE_DYNAMIC_TABLE,
ENTITY_TYPE_SNOWFLAKE_TASK,
ENTITY_TYPE_REDSHIFT_TABLE,
ENTITY_TYPE_REDSHIFT_VIEW,
ENTITY_TYPE_TABLEAU_EMBEDDED,
ENTITY_TYPE_TABLEAU_PUBLISHED,
ENTITY_TYPE_TABLEAU_CUSTOM_SQL,
ENTITY_TYPE_TABLEAU_TABLE,
ENTITY_TYPE_TABLEAU_SHEET,
ENTITY_TYPE_TABLEAU_DASHBOARD,
ENTITY_TYPE_AIRFLOW_DAG,
ENTITY_TYPE_AIRFLOW_TASK,
ENTITY_TYPE_CLICKHOUSE_TABLE,
ENTITY_TYPE_CLICKHOUSE_VIEW,
ENTITY_TYPE_ANOMALY_MONITOR,
ENTITY_TYPE_ANOMALY_MONITOR_SEGMENT,
ENTITY_TYPE_SQLTEST_TEST,
ENTITY_TYPE_POSTGRES_TABLE,
ENTITY_TYPE_POSTGRES_VIEW,
ENTITY_TYPE_MYSQL_TABLE,
ENTITY_TYPE_MYSQL_VIEW,
ENTITY_TYPE_DATABRICKS_WAREHOUSE,
ENTITY_TYPE_DATABRICKS_TABLE,
ENTITY_TYPE_DATABRICKS_VIEW,
ENTITY_TYPE_DATABRICKS_JOB,
ENTITY_TYPE_DATABRICKS_NOTEBOOK,
ENTITY_TYPE_DATABRICKS_QUERY,
ENTITY_TYPE_DATABRICKS_DASHBOARD,
ENTITY_TYPE_SQLMESH_PROJECT,
ENTITY_TYPE_SQLMESH_SQL_MODEL,
ENTITY_TYPE_SQLMESH_PYTHON_MODEL,
ENTITY_TYPE_SQLMESH_EXTERNAL,
ENTITY_TYPE_SQLMESH_SEED,
ENTITY_TYPE_SQLMESH_AUDIT,
ENTITY_TYPE_SQLMESH_UNIT_TEST,
ENTITY_TYPE_SQLMESH_ENVIRONMENT,
ENTITY_TYPE_SQLMESH_SNAPSHOT,
ENTITY_TYPE_DUCKDB_TABLE,
ENTITY_TYPE_DUCKDB_VIEW,
ENTITY_TYPE_TRINO_TABLE,
ENTITY_TYPE_TRINO_VIEW,
ENTITY_TYPE_ATLAN_ASSET,
ENTITY_TYPE_ATLAN_INTEGRATION,
ENTITY_TYPE_CASTORDOC_TABLE,
ENTITY_TYPE_CASTORDOC_DASHBOARD,
ENTITY_TYPE_CASTORDOC_VIEW,
ENTITY_TYPE_CUSTOM_ENTITY_GENERIC,
ENTITY_TYPE_CUSTOM_ENTITY_CUSTOM_TYPE_MIN,
ENTITY_TYPE_CUSTOM_ENTITY_CUSTOM_TYPE_MAX
query.parts.withType.type.custom
integer<int32>

Custom types as defined through synq.entities.custom.v1.TypesService

query.parts.withAnnotation.name
string
query.parts.withAnnotation.acceptedValue
string
deprecated
query.parts.withAnnotation.acceptedValues
string[]
query.parts.inDataPlatform.identifier.bigquery.project
string
required

BigQuery project

query.parts.inDataPlatform.identifier.clickhouse.host
string
required

ClickHouse host inclusive of port

query.parts.inDataPlatform.identifier.clickhouse.schema
string
required

ClickHouse database

query.parts.inDataPlatform.identifier.snowflake.account
string
required

Snowflake account

query.parts.inDataPlatform.identifier.snowflake.database
string
required

Snowflake database

query.parts.inDataPlatform.identifier.redshift.cluster
string
required

Redshift cluster

query.parts.inDataPlatform.identifier.redshift.database
string
required

Redshift database

query.parts.inDataPlatform.identifier.postgres.host
string
required

Postgres host inclusive of port

query.parts.inDataPlatform.identifier.postgres.database
string
required

Postgres database

query.parts.inDataPlatform.identifier.mysql.host
string
required

MySQL host inclusive of port

query.parts.inDataPlatform.identifier.databricks.workspace
string
required

URL of the Databricks workspace

query.parts.inDataPlatform.identifier.dbtCloud.apiEndpoint
string
required

API endpoint for dbt Cloud

query.parts.inDataPlatform.identifier.dbtCloud.accountId
string
required

Account ID

query.parts.inDataPlatform.identifier.dbtCloud.projectId
string
required

Project ID

query.parts.inDataPlatform.identifier.sqlMesh.defaultDatabaseInstance
string
required

Default database instance for SQLMesh

query.parts.inDataPlatform.identifier.duckdb.motherduckAccount
string
query.parts.inDataPlatform.identifier.trino.coordinator
string
query.parts.inDataPlatform.identifier.synqIntegrationId
string

SYNQ integration ID maps to the created integration on the SYNQ platform.

query.parts.withDataPlatformType.types
enum<string>[]
Available options:
DATA_PLATFORM_TYPE_UNSPECIFIED,
DATA_PLATFORM_TYPE_BIGQUERY,
DATA_PLATFORM_TYPE_LOOKER,
DATA_PLATFORM_TYPE_DBT,
DATA_PLATFORM_TYPE_DBT_CLOUD,
DATA_PLATFORM_TYPE_DBT_SELF_HOSTED,
DATA_PLATFORM_TYPE_SNOWFLAKE,
DATA_PLATFORM_TYPE_GCP,
DATA_PLATFORM_TYPE_GIT,
DATA_PLATFORM_TYPE_REDSHIFT,
DATA_PLATFORM_TYPE_TABLEAU,
DATA_PLATFORM_TYPE_AIRFLOW,
DATA_PLATFORM_TYPE_CLICKHOUSE,
DATA_PLATFORM_TYPE_POSTGRES,
DATA_PLATFORM_TYPE_MYSQL,
DATA_PLATFORM_TYPE_DATABRICKS,
DATA_PLATFORM_TYPE_SQLMESH,
DATA_PLATFORM_TYPE_DUCKDB,
DATA_PLATFORM_TYPE_TRINO,
DATA_PLATFORM_TYPE_ATLAN,
DATA_PLATFORM_TYPE_COALESCE,
DATA_PLATFORM_TYPE_CASTORDOC,
DATA_PLATFORM_TYPE_SYNQ
query.parts.inFolder.path
string[]
query.parts.inDomain.domainId
string<uuid>
query.parts.query.operand
enum<string>

Defaults to AND if not specified.

Available options:
QUERY_OPERAND_UNSPECIFIED,
QUERY_OPERAND_AND,
QUERY_OPERAND_OR,
QUERY_OPERAND_EXCEPT,
QUERY_OPERAND_UPSTREAM,
QUERY_OPERAND_DOWNSTREAM
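For instance, a query whose parts should be OR-ed together might be assembled like this (a hypothetical sketch of the request structure, with field names mirroring the dotted parameter paths above, not a verbatim API payload):

```python
# Hypothetical query structure: two parts combined with OR.
# Field names mirror the dotted parameter paths documented above;
# the exact payload shape is an assumption.
query = {
    "parts": [
        {"withNameSearch": {"searchQuery": "orders"}},
        {"withType": {"type": {"default": "ENTITY_TYPE_DBT_MODEL"}}},
    ],
    # Omitting "operand" would default to AND, per the description above.
    "operand": "QUERY_OPERAND_OR",
}
```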
query.parts.unsupported.queryJson
string
pagination.cursor
string

Cursor for the next page of results. If not provided, returns the first page.

pagination.pageSize
integer<int32>

Maximum number of items to return in a single page. If not provided, defaults vary per API.

Required range: x >= 0
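Cursor pagination typically loops until no further cursor is returned. A minimal sketch against a stubbed fetch function (`fetch_page` is a stand-in for the real HTTP call, and treating an empty `pageInfo.lastId` as the end of results is an assumption about this API):

```python
def list_all_entities(fetch_page, page_size=100):
    """Collect entity IDs across pages using cursor pagination.

    fetch_page(cursor, page_size) must return a dict shaped like the
    response example above: {"pageInfo": {...}, "entityIds": [...]}.
    Assumption: pageInfo.lastId feeds the next request's cursor, and
    an empty lastId signals the final page.
    """
    entity_ids, cursor = [], None
    while True:
        page = fetch_page(cursor, page_size)
        entity_ids.extend(page.get("entityIds", []))
        cursor = page.get("pageInfo", {}).get("lastId") or None
        if cursor is None:
            break
    return entity_ids
```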

Response

200 - application/json

Success

pageInfo
PageInfo · object
entityIds
(airflow_dag · object | airflow_task · object | bigquery_table · object | clickhouse_table · object | custom · object | databricks_table · object | dataproduct · object | dbt_cloud_node · object | dbt_core_node · object | monitor · object | mysql_table · object | postgres_table · object | redshift_table · object | snowflake_table · object | sql_mesh_audit · object | sql_mesh_model · object | synq_path · object | trino_table · object)[]

Identifier is a unique reference to an entity in the SYNQ system. Entity identifiers are designed to closely mimic the identifiers used by data platforms and tools. To construct an identifier, you need to know the kind of entity and the IDs you would normally use to identify it in the data platform or tool. For example, to identify a table in BigQuery, you need the project, dataset, and table names.

  • airflow_dag
  • airflow_task
  • bigquery_table
  • clickhouse_table
  • custom
  • databricks_table
  • dataproduct
  • dbt_cloud_node
  • dbt_core_node
  • monitor
  • mysql_table
  • postgres_table
  • redshift_table
  • snowflake_table
  • sql_mesh_audit
  • sql_mesh_model
  • synq_path
  • trino_table
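Following the BigQuery example in the paragraph above, identifiers can be sketched as plain objects keyed by kind (a minimal sketch; the field names mirror those documented in the query parameters, and the values are placeholders):

```python
# Each identifier is keyed by its kind and carries the IDs you would
# use in the underlying platform. Values are placeholders.
bigquery_id = {
    "bigqueryTable": {
        "project": "my-project",
        "dataset": "analytics",
        "table": "orders",
    }
}

airflow_id = {
    "airflowDag": {
        # SYNQ integration_id for the Airflow instance (placeholder UUID)
        "integrationId": "00000000-0000-0000-0000-000000000000",
        "dagId": "daily_load",
    }
}
```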