Insert data from your workflows into Google BigQuery tables for large-scale data warehousing and analytics.
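Conceptually, this step performs a streaming insert into the configured BigQuery table. The sketch below is a minimal, hypothetical equivalent using the `google-cloud-bigquery` Python client, not the integration's internal code; the project, dataset, table, and column names are placeholders.

```python
# Minimal sketch of what this step does conceptually: stream JSON rows into a
# BigQuery table. Assumes the google-cloud-bigquery package is installed and
# Application Default Credentials are configured; all IDs are placeholders.
from google.cloud import bigquery

client = bigquery.Client(project="my-project")       # hypothetical project ID
table_id = "my-project.my_dataset.my_table"          # Project.Dataset.Table, as selected in the fields below

rows = [
    {"name": "Ada", "signup_date": "2024-01-15"},    # one dict per row, keys match column names
    {"name": "Grace", "signup_date": "2024-02-03"},
]

errors = client.insert_rows_json(table_id, rows)      # streaming insert
if errors:
    print("Some rows failed:", errors)
```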
Field | Type | Required | Description |
---|---|---|---|
Project | Select | Yes | Your Google BigQuery project |
Dataset | Select | Yes | Dataset containing your target table |
Table | Select | Yes | Target table to insert data into |
Data Mapping | Select | Yes | Choose “Use All Data” or “Map Specific Columns” |
Data | Data | Yes | Data source from previous workflow steps |
Column Mapping | Mapper | Yes* | Map data fields to table columns (*required only when “Map Specific Columns” is selected) |
Skip Invalid Rows | Switch | Yes | Skip rows that fail validation instead of failing the entire insert (default: enabled) |
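The integration's internal handling of the two mapping options isn't shown here, but they roughly correspond to passing incoming records through unchanged ("Use All Data") versus picking and renaming specific fields ("Map Specific Columns"). The sketch below illustrates that idea together with BigQuery's skip-invalid-rows option; the `map_columns` helper, the field names, and the table ID are assumptions for illustration only.

```python
from google.cloud import bigquery

def map_columns(record: dict, mapping: dict) -> dict:
    """Hypothetical 'Map Specific Columns' behavior: pick incoming fields
    and rename them to the target table's column names."""
    return {column: record[field] for field, column in mapping.items() if field in record}

incoming = [{"full_name": "Ada Lovelace", "mail": "ada@example.com", "extra": "ignored"}]

# "Use All Data": send the incoming records as-is.
use_all_data = incoming

# "Map Specific Columns": apply an explicit field -> column mapping.
mapping = {"full_name": "name", "mail": "email"}
mapped = [map_columns(r, mapping) for r in incoming]

client = bigquery.Client()
errors = client.insert_rows_json(
    "my-project.my_dataset.my_table",   # placeholder table ID
    mapped,
    skip_invalid_rows=True,             # analogous to the "Skip Invalid Rows" switch
    ignore_unknown_values=True,         # drop fields that don't match a table column
)
```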
Output | Description |
---|---|
Insert Results | Details of the insert operation, including the success count |
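The exact shape of the Insert Results output isn't specified beyond the success count, but with the Python client a comparable summary can be derived from the per-row errors returned by the streaming insert. The sketch below is an assumption about how such a summary could be built, not the node's actual output format.

```python
from google.cloud import bigquery

client = bigquery.Client()
rows = [{"name": "Ada"}, {"name": None}]   # second row may fail validation if the column is REQUIRED
errors = client.insert_rows_json("my-project.my_dataset.my_table", rows)  # placeholder table ID

# insert_rows_json returns one entry per failed row, each with the row's
# index and its error details; an empty list means every row was inserted.
failed_indexes = {e["index"] for e in errors}
summary = {
    "attempted": len(rows),
    "inserted": len(rows) - len(failed_indexes),
    "failed_row_indexes": sorted(failed_indexes),
}
print(summary)
```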
- What happens if my data doesn't match the table schema?
- Can I insert data into multiple tables at once?
- How do I handle different data structures from various sources?
- What's the difference between the two data mapping options?
- Can I append to existing data or does it overwrite?
- How do I handle large datasets efficiently?
- What if I need to transform data before insertion?