Overview

Export Consuelo data for analysis in external tools like BI platforms, data warehouses, or custom analytics systems.

Common Export Scenarios

Destination         Use Case             Method
Data warehouse      Long-term storage    Scheduled API export
BI tool             Dashboards           Direct connection
Spreadsheet         Ad-hoc analysis      One-time export
Machine learning    Model training       Feature extraction
Backup              Disaster recovery    Full export

Export Script Template

import { GraphQLClient, gql } from 'graphql-request';
import fs from 'fs';
import dotenv from 'dotenv';

dotenv.config();

const client = new GraphQLClient(process.env.CONSUELO_GRAPHQL_URL, {
  headers: {
    authorization: `Bearer ${process.env.CONSUELO_API_KEY}`,
  },
});

const EXPORT_PEOPLE = gql`
  query ExportPeople($first: Int!, $after: String) {
    people(first: $first, after: $after) {
      edges {
        node {
          id
          email
          firstName
          lastName
          company {
            id
            name
            industry
          }
          createdAt
          updatedAt
        }
      }
      pageInfo {
        hasNextPage
        endCursor
      }
    }
  }
`;

async function exportAllPeople() {
  const records = [];
  let hasNextPage = true;
  let after = null;

  while (hasNextPage) {
    const data = await client.request(EXPORT_PEOPLE, {
      first: 60,
      after,
    });

    records.push(...data.people.edges.map(e => e.node));
    hasNextPage = data.people.pageInfo.hasNextPage;
    after = data.people.pageInfo.endCursor;

    console.log(`Exported ${records.length} records...`);

    // Rate limit protection
    await new Promise(resolve => setTimeout(resolve, 100));
  }

  return records;
}

async function exportToCSV(records, filename) {
  const headers = Object.keys(records[0]);
  const escape = (v) => {
    if (v === null || v === undefined) return '';
    const s = typeof v === 'object' ? JSON.stringify(v) : String(v);
    // Quote fields containing commas, quotes, or newlines; double embedded quotes
    return /[",\n]/.test(s) ? `"${s.replace(/"/g, '""')}"` : s;
  };
  const rows = records.map(r => headers.map(h => escape(r[h])).join(','));

  const csv = [headers.join(','), ...rows].join('\n');
  fs.writeFileSync(filename, csv);
  console.log(`Exported to ${filename}`);
}

async function main() {
  const records = await exportAllPeople();
  await exportToCSV(records, 'people-export.csv');
}

main().catch(console.error);
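If a long export is interrupted, re-running the script above starts over from the first page. Persisting the pagination cursor between pages lets a re-run resume instead. A minimal sketch; the cursor file name is a local convention, not part of the Consuelo API:

```javascript
import fs from 'fs';

// Persist the last `endCursor` so an interrupted export can resume.
// The file name is a local convention, not a Consuelo API feature.
const CURSOR_FILE = '.export-cursor';

function saveCursor(cursor) {
  fs.writeFileSync(CURSOR_FILE, cursor ?? '');
}

function loadCursor() {
  if (!fs.existsSync(CURSOR_FILE)) return null;
  const saved = fs.readFileSync(CURSOR_FILE, 'utf8');
  return saved.length > 0 ? saved : null;
}
```

To wire this into exportAllPeople, initialize `after` from loadCursor() and call saveCursor(after) after each page.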

Export Formats

CSV Export

Best for spreadsheets:
function toCSV(records) {
  const headers = Object.keys(records[0]);
  const escape = (val) => {
    if (val === null || val === undefined) return '';
    const s = typeof val === 'object' ? JSON.stringify(val) : String(val);
    // Quote values containing commas, quotes, or newlines,
    // and double any embedded quotes (RFC 4180)
    return /[",\n]/.test(s) ? `"${s.replace(/"/g, '""')}"` : s;
  };
  return [
    headers.join(','),
    ...records.map(r => headers.map(h => escape(r[h])).join(',')),
  ].join('\n');
}

JSON Export

Best for programmatic use:
function toJSON(records) {
  return JSON.stringify(records, null, 2);
}
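For large exports, newline-delimited JSON (one record per line) is often easier to load than a single large array, since warehouse loaders can process it line by line. A minimal serializer:

```javascript
// NDJSON: one JSON object per line. Most warehouse bulk loaders accept
// this format and can stream it without parsing the whole file at once.
function toNDJSON(records) {
  return records.map(r => JSON.stringify(r)).join('\n');
}
```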

Parquet Export

Best for data warehouses:
// Using parquetjs-lite
import parquet from 'parquetjs-lite';

async function toParquet(records, filename) {
  const schema = new parquet.ParquetSchema({
    id: { type: 'UTF8' },
    email: { type: 'UTF8' },
    createdAt: { type: 'TIMESTAMP_MILLIS' },
  });

  const writer = await parquet.ParquetWriter.openFile(schema, filename);
  
  for (const record of records) {
    await writer.appendRow(record);
  }

  await writer.close();
}
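The Parquet schema above is flat, but the people export earlier on this page nests a company object inside each record. One approach is to flatten records before writing them; a sketch, with illustrative column names:

```javascript
// Flatten the nested `company` object into top-level columns so each row
// matches a flat Parquet schema. Column names here are illustrative.
function flattenRecord(record) {
  const { company, ...rest } = record;
  return {
    ...rest,
    companyId: company ? company.id : null,
    companyName: company ? company.name : null,
  };
}
```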

Aggregation Queries

Export calculated metrics:
const AGGREGATE_DEALS = gql`
  query AggregateDeals($groupBy: DealGroupByInput!) {
    deals(groupBy: $groupBy) {
      edges {
        node {
          stage
          _aggregations {
            amount {
              sum
              avg
              count
            }
          }
        }
      }
    }
  }
`;

async function exportDealSummary() {
  const data = await client.request(AGGREGATE_DEALS, {
    groupBy: { stage: true }
  });

  const summary = data.deals.edges.map(e => ({
    stage: e.node.stage,
    totalValue: e.node._aggregations.amount.sum,
    averageValue: e.node._aggregations.amount.avg,
    count: e.node._aggregations.amount.count,
  }));

  return summary;
}
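The groupBy and _aggregations shapes above depend on your schema version. If those fields are not available, the same summary can be computed client-side from raw deal records:

```javascript
// Group deals by stage and compute sum, average, and count in memory.
// Assumes each deal has `stage` and a numeric `amount`.
function summarizeDeals(deals) {
  const byStage = new Map();
  for (const deal of deals) {
    const entry = byStage.get(deal.stage) ?? { stage: deal.stage, totalValue: 0, count: 0 };
    entry.totalValue += deal.amount;
    entry.count += 1;
    byStage.set(deal.stage, entry);
  }
  return [...byStage.values()].map(e => ({
    ...e,
    averageValue: e.totalValue / e.count,
  }));
}
```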

Scheduled Exports

Run exports on a schedule:
// Using node-cron
import cron from 'node-cron';

cron.schedule('0 2 * * *', async () => {
  console.log('Starting daily export...');
  const records = await exportAllPeople();
  await exportToCSV(records, `exports/daily-${Date.now()}.csv`);
  console.log('Daily export complete');
});
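A scheduled job that hits a transient API error fails silently until someone notices the missing file. A small retry wrapper with exponential backoff helps; this is a generic sketch, not part of the Consuelo SDK:

```javascript
// Retry an async operation with exponential backoff: 1s, 2s, 4s, ...
// Rethrows the last error once all attempts are exhausted.
async function withRetry(fn, attempts = 3, baseDelayMs = 1000) {
  let lastError;
  for (let i = 0; i < attempts; i++) {
    try {
      return await fn();
    } catch (err) {
      lastError = err;
      if (i < attempts - 1) {
        await new Promise(r => setTimeout(r, baseDelayMs * 2 ** i));
      }
    }
  }
  throw lastError;
}
```

In the cron job above, wrap the export call: `withRetry(() => exportAllPeople())`.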

Best Practices

Performance

  • Use pagination (first/after)
  • Add delays between requests
  • Export during off-peak hours
  • Filter to reduce data size
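Filtering applies to columns as well as rows: projecting records down to just the fields a report needs shrinks files considerably. A small helper, as a sketch:

```javascript
// Keep only the named fields on each record, in the given order.
// Fields absent from a record come through as undefined.
function project(records, fields) {
  return records.map(r =>
    Object.fromEntries(fields.map(f => [f, r[f]]))
  );
}
```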

Data Quality

  • Validate exports after completion
  • Check for missing records
  • Verify date formats
  • Handle null values
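The checks above can be automated as a post-export step. A minimal validator; the field names assume the people export earlier on this page:

```javascript
// Return a list of data-quality issues found in an exported record set.
// Checks record count, missing ids, and unparseable createdAt timestamps.
function validateExport(records, expectedCount) {
  const issues = [];
  if (records.length !== expectedCount) {
    issues.push(`expected ${expectedCount} records, got ${records.length}`);
  }
  records.forEach((r, i) => {
    if (!r.id) issues.push(`row ${i}: missing id`);
    if (r.createdAt && Number.isNaN(Date.parse(r.createdAt))) {
      issues.push(`row ${i}: unparseable createdAt "${r.createdAt}"`);
    }
  });
  return issues;
}
```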

Security

  • Store exports securely
  • Encrypt sensitive files
  • Limit access to exports
  • Delete old exports
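For the encryption point above, Node's built-in crypto module is sufficient. A sketch using AES-256-GCM with a passphrase-derived key:

```javascript
import { createCipheriv, createDecipheriv, randomBytes, scryptSync } from 'crypto';

// Encrypt an export with AES-256-GCM.
// Output layout: 16-byte salt | 12-byte IV | 16-byte auth tag | ciphertext.
function encryptExport(plaintext, passphrase) {
  const salt = randomBytes(16);
  const key = scryptSync(passphrase, salt, 32);
  const iv = randomBytes(12);
  const cipher = createCipheriv('aes-256-gcm', key, iv);
  const ciphertext = Buffer.concat([cipher.update(plaintext, 'utf8'), cipher.final()]);
  return Buffer.concat([salt, iv, cipher.getAuthTag(), ciphertext]);
}

function decryptExport(blob, passphrase) {
  const salt = blob.subarray(0, 16);
  const iv = blob.subarray(16, 28);
  const tag = blob.subarray(28, 44);
  const ciphertext = blob.subarray(44);
  const key = scryptSync(passphrase, salt, 32);
  const decipher = createDecipheriv('aes-256-gcm', key, iv);
  decipher.setAuthTag(tag);
  return Buffer.concat([decipher.update(ciphertext), decipher.final()]).toString('utf8');
}
```

GCM authenticates as well as encrypts, so a wrong passphrase or a tampered file fails loudly at decrypt time instead of yielding garbage.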