Overview

Automate importing data into Consuelo from external systems such as other CRMs, spreadsheets, and data warehouses.

Common Import Scenarios

Source           Method             Frequency
Another CRM      API integration    One-time or scheduled
Spreadsheet      CSV + API          Weekly/monthly
Data warehouse   SQL export + API   Daily
Web form         Webhook + API      Real-time
Email leads      Parsing + API      Real-time
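For spreadsheet sources, rows first need to be parsed into plain objects before they can be mapped to the API. A minimal sketch (the function name is illustrative; it does not handle quoted fields or embedded commas, so use a real CSV library such as csv-parse for production data):

```javascript
// Minimal CSV parser: the header row becomes the object keys.
// Assumes no quoted fields or embedded commas.
function parseCsv(text) {
  const [headerLine, ...lines] = text.trim().split('\n');
  const headers = headerLine.split(',').map(h => h.trim());
  return lines.map(line => {
    const values = line.split(',').map(v => v.trim());
    return Object.fromEntries(headers.map((h, i) => [h, values[i]]));
  });
}
```

The resulting objects can then be fed straight into the import script below.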

Import Script Template

import { GraphQLClient, gql } from 'graphql-request';
import dotenv from 'dotenv';

dotenv.config();

const client = new GraphQLClient(process.env.CONSUELO_GRAPHQL_URL, {
  headers: {
    authorization: `Bearer ${process.env.CONSUELO_API_KEY}`,
  },
});
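If either environment variable is unset, the client fails later with a confusing authentication or URL error. A fail-fast guard at startup makes the problem obvious (a hypothetical helper, not part of the Consuelo SDK):

```javascript
// Throws immediately when a required env var is absent,
// instead of failing mid-import with an opaque error.
function requireEnv(name) {
  const value = process.env[name];
  if (!value) {
    throw new Error(`Missing required env var: ${name}`);
  }
  return value;
}

// Usage sketch:
// const url = requireEnv('CONSUELO_GRAPHQL_URL');
// const key = requireEnv('CONSUELO_API_KEY');
```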

const CREATE_PERSON = gql`
  mutation CreatePerson($data: PersonCreateInput!) {
    createPerson(data: $data) {
      id
      email
    }
  }
`;

const UPSERT_PERSON = gql`
  mutation UpsertPerson($data: PersonUpsertInput!) {
    upsertPerson(data: $data) {
      id
      email
    }
  }
`;

async function importRecords(records) {
  const results = { upserted: 0, failed: 0, errors: [] };

  for (const record of records) {
    try {
      // Map external data to Consuelo format
      const personData = {
        email: record.email,
        firstName: record.first_name,
        lastName: record.last_name,
        companyId: record.company_id,
      };

      // Use upsert to avoid duplicates
      await client.request(UPSERT_PERSON, { data: personData });
      results.upserted++;
    } catch (error) {
      results.failed++;
      results.errors.push({ record, error: error.message });
    }

    // Rate limit protection
    await new Promise(resolve => setTimeout(resolve, 100));
  }

  return results;
}

export { importRecords };
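Before calling importRecords, it can help to validate records up front so obviously bad rows never reach the API. A sketch, assuming email is the only required field (adjust the checks to your schema):

```javascript
// Splits records into importable and rejected sets.
// Assumption: email is required and must contain '@'.
function validateRecords(records) {
  const valid = [];
  const rejected = [];
  for (const record of records) {
    if (typeof record.email === 'string' && record.email.includes('@')) {
      valid.push(record);
    } else {
      rejected.push({ record, reason: 'missing or invalid email' });
    }
  }
  return { valid, rejected };
}
```

Pass only the valid set to importRecords and report the rejected set alongside the API errors.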

Import Strategies

Strategy 1: Upsert

Create if new, update if exists:
// Uses email as unique identifier
const result = await client.request(UPSERT_PERSON, {
  data: {
    email: "contact@example.com",
    firstName: "John",
    lastName: "Doe",
  }
});
Benefits:
  • No duplicates
  • Idempotent (safe to run multiple times)
  • Handles partial updates

Strategy 2: Check Before Create

Search first, then create. Note that this check-then-create sequence is not atomic: two concurrent imports can both miss the search and create duplicates, so prefer upsert where available:
const SEARCH_PERSON = gql`
  query SearchPerson($filter: PersonFilterInput!) {
    people(filter: $filter, first: 1) {
      edges {
        node {
          id
        }
      }
    }
  }
`;

async function createIfNotExists(email, data) {
  const search = await client.request(SEARCH_PERSON, {
    filter: { email: { eq: email } }
  });

  if (search.people.edges.length === 0) {
    return await client.request(CREATE_PERSON, { data });
  }

  return search.people.edges[0].node;
}

Strategy 3: Batch Import

Import multiple records at once:
const BATCH_CREATE_PEOPLE = gql`
  mutation CreatePeople($data: [PersonCreateInput!]!) {
    createPeople(data: $data) {
      id
      email
    }
  }
`;

async function batchImport(records) {
  const batch = records.map(r => ({
    email: r.email,
    firstName: r.firstName,
    lastName: r.lastName,
  }));

  return await client.request(BATCH_CREATE_PEOPLE, { data: batch });
}
Limits:
  • Max 60 records per batch
  • Must be same object type
  • Atomic operation (all succeed or all fail)
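Because batches are capped at 60 records, larger imports need to be split into chunks first. A sketch of a generic chunking helper that each batch can then be fed through batchImport:

```javascript
// Splits an array into chunks of at most `size` items.
function chunk(items, size) {
  const chunks = [];
  for (let i = 0; i < items.length; i += size) {
    chunks.push(items.slice(i, i + size));
  }
  return chunks;
}

// Usage sketch:
// for (const group of chunk(records, 60)) {
//   await batchImport(group);
// }
```

Since each batch is atomic, a failure only rolls back the chunk it occurred in, so log which chunk failed for retries.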

Error Handling

Partial Failures

When some records fail:
import fs from 'node:fs';

async function importWithErrors(records, client) {
  const log = [];

  for (const record of records) {
    try {
      await client.request(UPSERT_PERSON, {
        data: {
          email: record.email,
          firstName: record.first_name,
          lastName: record.last_name,
          companyId: record.company_id,
        },
      });
      log.push({ record, status: 'success' });
    } catch (error) {
      log.push({ 
        record, 
        status: 'failed', 
        error: error.message 
      });
    }
  }

  // Save log
  fs.writeFileSync('import-log.json', JSON.stringify(log, null, 2));

  return log;
}
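The saved log makes retries straightforward: filter out the failed entries and feed just those records back into the importer. A sketch (the helper name is illustrative):

```javascript
// Returns the original records for every failed log entry,
// ready to be passed back to importWithErrors after fixing.
function failedRecords(log) {
  return log
    .filter(entry => entry.status === 'failed')
    .map(entry => entry.record);
}

// Usage sketch:
// const log = JSON.parse(fs.readFileSync('import-log.json', 'utf8'));
// await importWithErrors(failedRecords(log), client);
```

Because upsert is idempotent, re-running successful records alongside the retries would also be safe, just slower.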

Validation Errors

Handle validation failures:
const VALIDATION_ERRORS = {
  'email already exists': 'duplicate',
  'required field missing': 'validation',
  'invalid format': 'format',
};

function categorizeError(error) {
  for (const [pattern, category] of Object.entries(VALIDATION_ERRORS)) {
    if (error.message.includes(pattern)) {
      return category;
    }
  }
  return 'unknown';
}
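categorizeError is most useful when summarizing a whole run, for example counting how many failures fall into each bucket. A sketch (categorizeError is repeated here so the snippet is self-contained; summarizeErrors is a hypothetical helper):

```javascript
const VALIDATION_ERRORS = {
  'email already exists': 'duplicate',
  'required field missing': 'validation',
  'invalid format': 'format',
};

function categorizeError(error) {
  for (const [pattern, category] of Object.entries(VALIDATION_ERRORS)) {
    if (error.message.includes(pattern)) {
      return category;
    }
  }
  return 'unknown';
}

// Counts failures per category across a run's error list.
function summarizeErrors(errors) {
  const counts = {};
  for (const error of errors) {
    const category = categorizeError(error);
    counts[category] = (counts[category] || 0) + 1;
  }
  return counts;
}
```

A summary like { duplicate: 40, unknown: 2 } tells you at a glance whether failures are expected re-imports or something that needs investigation.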

Best Practices

Before Importing

  1. Export existing data for backup
  2. Test on small subset (10-50 records)
  3. Map fields carefully between systems
  4. Validate data format (dates, numbers)

During Import

  1. Use upsert to avoid duplicates
  2. Add delays between requests (100ms)
  3. Log everything for audit trail
  4. Monitor progress with console output
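The progress-monitoring step can be as simple as a throttled status line inside the import loop. A sketch (progressLine is a hypothetical helper that returns the message, or null when nothing should be printed):

```javascript
// Builds a progress message every `every` records (and at the end),
// so long imports stay observable without flooding the console.
function progressLine(done, total, every = 50) {
  if (done % every !== 0 && done !== total) {
    return null;
  }
  return `Imported ${done}/${total} (${Math.round((done / total) * 100)}%)`;
}

// Usage sketch, inside the import loop:
// const line = progressLine(i + 1, records.length);
// if (line) console.log(line);
```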

After Import

  1. Review errors in log file
  2. Fix and retry failed records
  3. Validate counts match expectations
  4. Check relationships are intact
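The count check in step 3 can be automated by reconciling the source total against the results object returned by importRecords. A minimal sketch (field names match the results shape used above):

```javascript
// Verifies every source record is accounted for:
// upserted + failed should equal the number of inputs.
function countsMatch(sourceCount, results) {
  return results.upserted + results.failed === sourceCount;
}
```

If the counts do not match, records were likely skipped before the API call (for example by pre-import validation) and should be traced in the log.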