When to Use It
- Analyze pixel performance and data quality
- Monitor event tracking and conversion patterns
- Identify technical issues with pixel implementation
- Assess data collection across different devices and browsers
- Review custom data field usage and effectiveness
- Optimize pixel configuration based on usage patterns
- Generate reports for stakeholders and compliance
- Debug tracking issues and data gaps
Inputs
| Field | Type | Required | Description |
|---|---|---|---|
| Account | Select | Yes | Select the Meta Ads account containing the pixel |
| Pixel | Select | Yes | Select the specific pixel to get statistics for |
| Aggregation | Select | No | How to group the statistics (default: event_total_counts) |
| Date Range | Date Range | No | Time period for statistics (default: last 28 days) |
| Time Grouping | Select | No | Group by timestamp or date (default: time) |
| Event Source Filter | Select | No | Filter by web or server events (default: all) |
Aggregation Options
| Option | Description | Use Case |
|---|---|---|
| event_total_counts | Total event counts by type | Overall pixel performance overview |
| event | Detailed event data with filters | Specific event analysis |
| event_source | Events grouped by source (web/server) | Compare web vs CAPI performance |
| pixel_fire | Pixel firing statistics | Technical monitoring |
| host | Events grouped by website domain | Multi-site tracking analysis |
| url | Events grouped by specific URLs | Page-level performance |
| browser_type | Events by browser type | Browser compatibility analysis |
| device_os | Events by operating system | Device targeting insights |
| device_type | Events by device category | Mobile vs. desktop analysis |
| match_keys | Data matching quality stats | Attribution accuracy assessment |
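The node handles the API call for you, but for reference, here is a minimal sketch of how the underlying query might look against the Meta Graph API's pixel stats endpoint. The access token, pixel ID, and API version are placeholders; the aggregation values mirror the table above.

```python
import requests

ACCESS_TOKEN = "YOUR_META_ACCESS_TOKEN"  # placeholder -- use a real system-user token
PIXEL_ID = "1234567890"                  # placeholder pixel ID

# Query the pixel stats edge with one of the aggregation values above.
# Verify the API version against your app's settings before relying on this.
response = requests.get(
    f"https://graph.facebook.com/v19.0/{PIXEL_ID}/stats",
    params={
        "aggregation": "event_total_counts",
        "access_token": ACCESS_TOKEN,
    },
    timeout=30,
)
response.raise_for_status()
print(response.json())
```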
Output
Returns statistics based on your selected aggregation method.
Event Total Counts Example:
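The exact response shape depends on the node version; as a purely illustrative placeholder (field names and numbers are hypothetical, not the actual API contract), an event_total_counts result might look like:

```json
[
  { "event": "PageView", "count": 48210 },
  { "event": "ViewContent", "count": 12030 },
  { "event": "AddToCart", "count": 3390 },
  { "event": "Purchase", "count": 845 }
]
```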
Event Source Example:
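Again as an illustrative placeholder, an event_source result might split counts by origin:

```json
[
  { "event_source": "web", "count": 38500 },
  { "event_source": "server", "count": 21700 }
]
```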
Device Type Example:
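And a hypothetical device_type result:

```json
[
  { "device_type": "mobile", "count": 41200 },
  { "device_type": "desktop", "count": 17300 },
  { "device_type": "tablet", "count": 1700 }
]
```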
Credit Cost
- Cost per run: 1 credit
FAQs
Which aggregation method should I use for different analysis needs?
For Overall Performance Monitoring:
- event_total_counts: Best starting point - shows all event volumes
- event: Detailed breakdown with filtering options
For Technical Troubleshooting:
- event_source: Compare web pixel vs Conversions API performance
- pixel_fire: Monitor pixel installation and firing issues
- browser_type/device_os: Identify compatibility issues
For Business Insights:
- host: Analyze performance across different websites
- url: Identify high- and low-performing pages
- device_type: Understand user behavior patterns
For Data Quality Assessment:
- match_keys: Review data matching and attribution quality
- custom_data_field: Analyze custom parameter usage
How do I interpret event source data (web vs server)?
Event Source Types:
Web Events (Browser Pixel):
- Tracked directly from the user's browser
- Real-time user interactions
- Subject to ad blockers and privacy settings
- May miss some conversions due to technical issues
Server Events (Conversions API):
- Sent from your server to Meta
- More reliable and comprehensive
- Not affected by ad blockers
- Better for sensitive data and offline events
Healthy Distribution Patterns (the sketch after this answer shows how to compute the split):
- 50/50 split: Excellent - both sources working well
- 70% web / 30% server: Good - strong browser tracking
- 30% web / 70% server: Good - strong server integration
- 90%+ from one source: Check the other source's setup
Best Practices:
- Keep both sources active for redundancy
- Server events improve data quality and match rates
- Use server events for sensitive data (purchases, leads)
- Web events work well for engagement tracking
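As referenced above, a small sketch of how you might compute the web/server split from an event_source result; the counts dict is a hypothetical placeholder standing in for the node's parsed output.

```python
# Hypothetical parsed counts from an event_source aggregation.
counts = {"web": 38500, "server": 21700}

total = sum(counts.values())
for source, n in counts.items():
    share = 100 * n / total
    print(f"{source}: {share:.1f}%")
    # Flag a lopsided split so the weaker source's setup gets reviewed.
    if share >= 90:
        print(f"  -> {share:.0f}% from '{source}' alone; check the other source's setup")
```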
What insights can I get from device and browser data?
Device Type Analysis:
- Mobile dominance: Optimize for mobile experience and ads
- Desktop preference: Focus on desktop-optimized content
- Balanced usage: Ensure responsive design and cross-device tracking
Browser Analysis:
- Chrome/Safari/Firefox distribution: Check compatibility
- Unusual patterns: May indicate bot traffic or technical issues
- Privacy-focused browsers: Expect lower tracking rates
Operating System Insights:
- iOS vs Android: Mobile app and campaign optimization
- Windows/Mac distribution: Desktop experience optimization
- Version information: Compatibility and feature support
How to Apply These Insights:
- Ad creative optimization: Design for dominant platforms
- User experience improvements: Focus development efforts
- Targeting strategy: Adjust campaigns based on user preferences
- Technical troubleshooting: Identify platform-specific issues
Warning Signs:
- Sudden shifts in device/browser distribution
- Extremely low counts from major browsers/devices
- Inconsistent patterns compared to industry benchmarks
How can I use URL and host data for optimization?
URL-Level Analysis:
- High-converting pages: Identify your best-performing content
- Drop-off points: Find where users leave your funnel
- Event distribution: See which pages drive specific actions
- A/B testing insights: Compare performance across page variants
Host-Level Analysis:
- Multi-domain tracking: Compare performance across properties
- Subdomain optimization: Identify strong and weak areas
- Campaign landing pages: Analyze dedicated campaign sites
- Partner integrations: Track third-party domain performance
Optimization Strategies (a page-ranking sketch follows this answer):
- Improve low-performing pages: Focus UX improvements
- Replicate success: Apply high-performing page elements elsewhere
- Adjust traffic allocation: Send more traffic to converting pages
- Fix technical issues: Address pages with tracking problems
Common Patterns to Look For:
- Homepage dominance: May indicate navigation issues
- Checkout abandonment: Focus on conversion optimization
- Blog engagement: Content marketing effectiveness
- Product page variations: Compare product performance
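For example, a sketch ranking pages by event volume from a url aggregation; the rows list is a hypothetical placeholder for the node's output.

```python
# Hypothetical rows from a "url" aggregation.
rows = [
    {"url": "https://example.com/", "count": 20500},
    {"url": "https://example.com/pricing", "count": 6200},
    {"url": "https://example.com/checkout", "count": 900},
]

# Sort descending by count to surface top- and bottom-performing pages.
ranked = sorted(rows, key=lambda r: r["count"], reverse=True)
for r in ranked[:10]:
    print(f"{r['count']:>8}  {r['url']}")
```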
What do match keys statistics tell me about data quality?
Match Keys Explained:
Match keys are data points used to connect pixel events with Meta user profiles:
- Email addresses
- Phone numbers
- External IDs
- Facebook user IDs
Key Quality Metrics:
- Match rate: Percentage of events with successful user matching
- Coverage: How many events include each type of match key
- Quality scores: Accuracy and reliability of matches
Quality Indicators:
- High email match rates: Good customer data collection
- Strong phone number coverage: Comprehensive contact information
- External ID usage: Effective CRM integration
- Low match rates: Data quality or collection issues
Improving Match Quality:
- Collect better data: Improve form fields and data capture
- Implement hashing: Normalize and SHA-256 hash customer data before sending (see the sketch after this answer)
- Use Conversions API: Send server-side data for better matching
- Enable automatic matching: Let Meta optimize connections
Benefits of Strong Matching:
- Better attribution accuracy
- Improved ad targeting effectiveness
- Higher conversion tracking reliability
- Enhanced audience building capabilities
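As noted in the improvement list, Meta expects customer data fields to be normalized and SHA-256 hashed before they are sent for matching. A minimal sketch of that normalization step, covering the common email and phone cases (consult Meta's documentation for the exact normalization rules per field):

```python
import hashlib

def normalize_and_hash(value: str) -> str:
    """Lowercase, trim, and SHA-256 hash a customer data field."""
    normalized = value.strip().lower()
    return hashlib.sha256(normalized.encode("utf-8")).hexdigest()

# Emails are lowercased and trimmed; phone numbers should be reduced to
# digits with country code (no spaces or symbols) before hashing.
print(normalize_and_hash("Jane.Doe@Example.com"))
print(normalize_and_hash("15551234567"))
```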
How should I set up date ranges for meaningful analysis?
Recommended Date Ranges:
Daily Monitoring (1-7 days):
- Use case: Real-time issue detection
- Time grouping: By hour or timestamp
- Focus: Technical problems, campaign launches
Weekly Analysis:
- Use case: Campaign performance review
- Time grouping: By date
- Focus: Trend identification, optimization opportunities
Monthly Review:
- Use case: Strategic analysis and reporting
- Time grouping: By date or week
- Focus: Long-term patterns, seasonal effects
Quarterly Assessment:
- Use case: Comprehensive pixel health assessment
- Time grouping: By week or month
- Focus: Infrastructure changes, business growth impact
Useful Comparison Periods:
- Year-over-year: Same period previous year
- Month-over-month: Compare recent months
- Before/after: Major website or campaign changes
Best Practices (a date-window sketch follows this answer):
- Use consistent date ranges for trend analysis
- Account for seasonality in your business
- Include sufficient data for statistical significance
- Consider external factors (holidays, market events)
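A small sketch of how you might derive consistent windows for the Date Range input, assuming it accepts ISO dates; the 28-day default mirrors the Inputs table.

```python
from datetime import date, timedelta

def last_n_days(n: int = 28) -> tuple[str, str]:
    """Return (since, until) ISO dates covering the last n days."""
    until = date.today()
    since = until - timedelta(days=n)
    return since.isoformat(), until.isoformat()

# Consistent windows make week-over-week and month-over-month
# comparisons meaningful.
print(last_n_days(7))    # daily monitoring window
print(last_n_days(28))   # default reporting window
```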
What should I do if I see concerning patterns in my pixel stats?
Common Issues and Solutions:
Sudden Drop in Events:
- Check: Website changes, pixel code modifications
- Action: Verify pixel installation, test with the Meta Pixel Helper extension
- Timeline: Address immediately
Low Match Rates:
- Check: Customer data quality, CAPI implementation
- Action: Improve data collection, implement server events
- Timeline: Plan for gradual improvement
Unusual Traffic Patterns:
- Check: Bot traffic, technical restrictions
- Action: Implement bot filtering, check compatibility
- Timeline: Monitor and adjust over time
Missing Server Events:
- Check: CAPI setup, server event configuration
- Action: Implement or fix the Conversions API
- Timeline: Technical implementation project
Gaps on Specific Pages:
- Check: Tracking code placement, page load issues
- Action: Fix technical implementation, optimize pages
- Timeline: Development sprint planning
Proactive Monitoring:
- Set up automated alerts for significant changes (a threshold sketch follows this answer)
- Run regular weekly reviews of key metrics
- Document known issues and planned fixes
- Compare against industry benchmarks when available
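For the automated-alert idea above, a sketch of a simple threshold check comparing the current period's event count to the previous one; both counts are placeholders you would feed from two runs of this node.

```python
def check_for_drop(current: int, previous: int, threshold: float = 0.30) -> None:
    """Warn when event volume falls more than `threshold` period-over-period."""
    if previous == 0:
        print("No baseline data; skip comparison.")
        return
    change = (current - previous) / previous
    if change <= -threshold:
        print(f"ALERT: events down {abs(change):.0%} vs previous period")
    else:
        print(f"OK: change of {change:+.0%} is within tolerance")

# Hypothetical counts from two consecutive 7-day windows.
check_for_drop(current=31000, previous=48000)
```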
Can I export this data for further analysis?
Data Export Options:
Direct Use:
- Copy JSON output for spreadsheet analysis
- Use data in subsequent workflow nodes
- Generate reports within Markifact
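For spreadsheet handoff, a sketch converting the node's JSON output to CSV; the stats list is a placeholder shaped like the illustrative examples in the Output section.

```python
import csv

# Hypothetical parsed output from an event_total_counts run.
stats = [
    {"event": "PageView", "count": 48210},
    {"event": "Purchase", "count": 845},
]

with open("pixel_stats.csv", "w", newline="") as f:
    writer = csv.DictWriter(f, fieldnames=["event", "count"])
    writer.writeheader()
    writer.writerows(stats)  # one row per event type
```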
Integration Options:
- Google Sheets: Connect output to a spreadsheet for team access
- BI Tools: Feed data into business intelligence platforms
- Dashboards: Create automated reporting workflows
- APIs: Use data in custom applications
Analysis Approaches:
- Trend Analysis: Track metrics over time
- Comparative Studies: Compare across pixels or time periods
- Correlation Analysis: Relate pixel performance to business metrics
- Forecasting: Predict future performance based on trends
Reporting Best Practices:
- Stakeholder-specific views: Customize reports for different audiences
- Regular cadence: Set up automated reporting schedules
- Context inclusion: Add business context to raw data
- Action items: Include recommendations with data insights