The Run a Query node executes SQL queries against your BigQuery datasets to retrieve, analyze, and transform data for your marketing workflows.


When to Use It

  • Execute custom SQL queries to retrieve specific data
  • Generate reports and analytics for marketing campaigns
  • Use it as an AI tool that lets AI Agents run dynamic SQL queries

Inputs

Field    Type       Required  Description
Project  Select     Yes       Google BigQuery project to execute the query in
Query    Text Area  Yes       SQL query to execute against BigQuery tables

Outputs

Output         Description
Query Results  Array of rows returned by your query, with execution metadata

Credit Cost

1 credit per operation.


Real-World Examples

BigQuery AI Agent:

AI Agent → List Datasets → List Tables → Get Table Schema → Run Query
"Allow AI to fully explore BigQuery and run intelligent SQL queries"

Monthly Campaign Analysis:

Run Query → AI Analyze Data → Send Email
"Execute custom SQL to analyze campaign performance and email insights"

Understanding Query Results

The returned data includes:

Query Output:

  • Array of rows with your query results
  • Column names matching your SELECT statement
  • Data types preserved from BigQuery tables
  • Null values handled appropriately
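
As a concrete illustration, result rows mirror the query's SELECT list; the table, columns, and values below are hypothetical:

SELECT campaign_name, spend
FROM marketing_campaigns
LIMIT 2

-- Possible result rows (illustrative):
-- campaign_name | spend
-- Spring Sale   | 1250.50
-- Brand Launch  | NULL      (nulls pass through unchanged)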

SQL Query Examples

Campaign Performance Analysis:

SELECT 
  campaign_name,
  SUM(spend) as total_spend,
  SUM(conversions) as total_conversions,
  ROUND(SAFE_DIVIDE(SUM(spend), SUM(conversions)), 2) as cost_per_conversion
FROM marketing_campaigns 
WHERE DATE(created_at) >= DATE_SUB(CURRENT_DATE(), INTERVAL 30 DAY)
GROUP BY campaign_name
ORDER BY total_conversions DESC
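
Daily Spend Trend:

A second illustrative pattern (table and column names are assumptions) rolls spend up by day:

SELECT 
  DATE(created_at) as day,
  SUM(spend) as daily_spend
FROM marketing_campaigns 
WHERE DATE(created_at) >= DATE_SUB(CURRENT_DATE(), INTERVAL 30 DAY)
GROUP BY day
ORDER BY day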

Best Practices

Query Optimization:

  • Use WHERE clauses to limit data scanning and reduce costs
  • Select only the columns you need rather than using SELECT *
  • Filter by date ranges to avoid processing unnecessary historical data
  • Use LIMIT for testing queries before running on full datasets
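
A minimal sketch that combines these practices (table and column names are assumptions):

SELECT campaign_name, spend, conversions                             -- only the columns you need
FROM marketing_campaigns
WHERE DATE(created_at) >= DATE_SUB(CURRENT_DATE(), INTERVAL 7 DAY)   -- limit the scan window
LIMIT 100                                                            -- cap output while testing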

Cost Management:

  • Preview query costs in BigQuery console before execution
  • Use partitioned tables and filter by partition keys
  • Monitor bytes processed to manage BigQuery costs
  • Consider creating views for frequently used complex queries
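
For example, on an ingestion-time partitioned table (table name assumed), filtering on the _PARTITIONDATE pseudo-column prunes partitions and cuts bytes billed; for column-partitioned tables, filter on the partitioning column instead:

SELECT campaign_name, spend
FROM marketing_events
WHERE _PARTITIONDATE BETWEEN DATE '2024-01-01' AND DATE '2024-01-31'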

Performance:

  • Use proper data types and avoid unnecessary conversions
  • Leverage clustering and partitioning for large tables
  • Test complex queries with LIMIT before full execution
  • Monitor execution times and optimize as needed
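
As a sketch of the table-definition side (dataset and table names are illustrative), partitioning and clustering are declared when the table is created, so later queries that filter on created_at and campaign_name scan less data:

CREATE TABLE mydataset.marketing_campaigns
PARTITION BY DATE(created_at)
CLUSTER BY campaign_name AS
SELECT * FROM mydataset.raw_campaigns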

Tips

SQL Development:

  • Test your queries in BigQuery console first
  • Use query validation to check syntax before execution
  • Comment your SQL for team documentation
  • Save frequently used queries as templates
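
BigQuery SQL accepts both -- line comments and /* block */ comments, so documentation can live alongside the query itself; the query below is illustrative:

/* Monthly spend rollup for the marketing team.
   Scans only the last 30 days to control cost. */
SELECT campaign_name, SUM(spend) as total_spend  -- spend is in account currency
FROM marketing_campaigns
WHERE DATE(created_at) >= DATE_SUB(CURRENT_DATE(), INTERVAL 30 DAY)
GROUP BY campaign_name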

Dynamic Queries:

  • Use parameterized queries for flexible filtering
  • Combine with other workflow nodes for dynamic table/column names
  • Consider using variables from previous workflow steps
  • Plan for different data types and null value handling
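
If your setup passes BigQuery query parameters through (an assumption to verify for this node), standard named parameters (@name) plus COALESCE keep filters flexible and null-safe:

SELECT campaign_name, COALESCE(spend, 0) as spend  -- default nulls to 0
FROM marketing_campaigns
WHERE campaign_name = @campaign_name               -- value supplied at run time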

Integration:

  • Results integrate seamlessly with other Markifact nodes
  • Use with AI Analyze Data for insights on query results
  • Combine with Rename Fields to standardize output column names
  • Perfect for feeding data into Google Sheets or other destinations

FAQ