Every sales team has the same Monday morning ritual. Someone exports a CSV from the CRM, opens it in a spreadsheet, manually calculates totals, applies some conditional formatting, copies the numbers into a slide deck, and emails it around. By the time it lands in inboxes, it is already out of date.

This workflow should be automated. Not because it is technically difficult, but because every minute spent formatting a report is a minute not spent on selling. This guide shows you how to replace the entire manual process with a few lines of code and the ReportForge sales-summary template.

Before and After: What Automation Changes

Manual Process

  • Export CSV from CRM
  • Open in spreadsheet software
  • Calculate totals manually
  • Apply conditional formatting
  • Copy into slide deck or doc
  • Email to stakeholders
  • 30-60 minutes every week

With ReportForge API

  • Query database or export CSV
  • POST to /api/csv-to-report
  • Receive formatted HTML report
  • Email or save automatically
  • Schedule with cron or CI
  • Zero manual formatting
  • Under 5 seconds end-to-end

Understanding the Sales-Summary Template

The sales-summary template is designed for tabular sales data. It requires two columns at minimum — item and amount — and optionally accepts quantity and category columns for richer summaries.

What the template calculates automatically:

  • Total revenue across all line items
  • Average sale amount
  • Largest single line item

Here is what your CSV should look like to get the most out of the template:

Sample Input CSV
item,amount,quantity,category
Widget Pro,1250.00,50,Hardware
Gadget Plus,890.50,30,Electronics
Service Plan,2400.00,12,Services
Cable Kit,156.75,100,Accessories
Monitor Stand,445.00,25,Hardware
USB Hub Pro,312.00,40,Electronics

Send this CSV with the sales-summary template and you receive a complete HTML report with a header showing total revenue, average sale, and largest line item, followed by the full data table with all columns rendered cleanly.
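That request is a single POST. Here is a sketch of how the payload can be assembled; the endpoint and headers come from the full pipeline later in this guide, while `buildReportRequest` is a helper name introduced here for illustration:

```javascript
// Build fetch options for the /api/csv-to-report endpoint.
// Assumes the API key is stored in an environment variable.
function buildReportRequest(csv, title) {
  return {
    method: 'POST',
    headers: {
      'Content-Type': 'application/json',
      'X-API-Key': process.env.REPORTFORGE_API_KEY,
    },
    body: JSON.stringify({ csv, template: 'sales-summary', title }),
  };
}

// Usage:
// const res = await fetch(
//   'https://reportforge-api.vercel.app/api/csv-to-report',
//   buildReportRequest(csv, 'March Sales')
// );
```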

Integrating with Common CRM Exports

Most CRMs export sales data with column names that differ from what the template expects. The API matches columns case-insensitively, but the names themselves still need to be item, amount, quantity, and category. Here is a quick normalization step you can add before calling the API.

Salesforce Export Normalization

JavaScript — normalize-salesforce.js
import { parse } from 'csv-parse/sync';

/**
 * Salesforce exports "Opportunity Name" and "Amount" (with capital A).
 * This normalizes to the columns the sales-summary template expects.
 */
function normalizeSalesforceExport(rawCsv) {
  const records = parse(rawCsv, { columns: true, skip_empty_lines: true });

  const normalized = records.map(row => ({
    item: row['Opportunity Name'] || row['Name'],
    amount: parseFloat(row['Amount'] || '0'),
    quantity: parseInt(row['Quantity'] || '1', 10),
    category: row['Type'] || row['Stage'],
  }));

  // Convert back to CSV, quoting text fields that may contain commas
  return [
    'item,amount,quantity,category',
    ...normalized.map(r => `"${r.item}",${r.amount},${r.quantity},"${r.category}"`),
  ].join('\n');
}
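Whichever CRM you export from, a quick header check before posting catches mapping mistakes early. A minimal sketch (`hasRequiredColumns` is a name introduced here, not part of the API):

```javascript
// Check that a CSV header row contains the columns the
// sales-summary template requires (item and amount),
// matching case-insensitively as the API does.
function hasRequiredColumns(csvString) {
  const header = csvString.split('\n')[0] ?? '';
  const columns = header.split(',').map(c => c.trim().toLowerCase());
  return ['item', 'amount'].every(col => columns.includes(col));
}
```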

HubSpot Export Normalization

JavaScript — normalize-hubspot.js
// HubSpot exports "Deal Name", "Deal Amount", "Deal Stage"
function normalizeHubspotExport(rawCsv) {
  const records = parse(rawCsv, { columns: true, skip_empty_lines: true });

  const normalized = records.map(row => ({
    item: row['Deal Name'],
    amount: parseFloat((row['Deal Amount'] || '0').replace(/[$,]/g, '')),
    category: row['Deal Stage'],
  }));

  return [
    'item,amount,category',
    ...normalized.map(r => `"${r.item}",${r.amount},"${r.category}"`),
  ].join('\n');
}
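Both normalizers serialize rows by hand, which breaks if a field itself contains a quote character. A small escaping helper avoids malformed CSV; this is a sketch following RFC 4180 quoting rules, and the `csvField`/`toCsv` names are introduced here:

```javascript
// Quote a single CSV field: wrap in double quotes when it contains
// a comma, quote, or newline, and double any embedded quotes.
function csvField(value) {
  const s = String(value ?? '');
  return /[",\n]/.test(s) ? `"${s.replace(/"/g, '""')}"` : s;
}

// Serialize an array of objects to a CSV string with a header row.
function toCsv(rows, columns) {
  const header = columns.join(',');
  const lines = rows.map(r => columns.map(c => csvField(r[c])).join(','));
  return [header, ...lines].join('\n');
}
```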

Full Pipeline: Weekly Sales Report Automation

Here is a complete script that you can schedule to run every Monday morning. It queries your database, generates the report, and emails it to your team — entirely without human intervention.

JavaScript — weekly-sales-report.js
import { createPool } from '@vercel/postgres';
import nodemailer from 'nodemailer';

const pool = createPool({ connectionString: process.env.DATABASE_URL });

async function weeklySalesReport() {
  const now = new Date();
  const weekAgo = new Date(now.getTime() - 7 * 24 * 60 * 60 * 1000);

  // 1. Query this week's closed deals
  const { rows } = await pool.query(
    `SELECT name AS item, amount, quantity, category
     FROM deals
     WHERE closed_at >= $1 AND status = 'won'
     ORDER BY amount DESC`,
    [weekAgo.toISOString()]
  );

  if (rows.length === 0) {
    console.log('No closed deals this week.');
    return;
  }

  // 2. Serialize to CSV
  const csv = [
    'item,amount,quantity,category',
    ...rows.map(r => `"${r.item}",${r.amount},${r.quantity},"${r.category}"`),
  ].join('\n');

  // 3. Generate the HTML report
  const weekLabel = now.toLocaleDateString('en-US', { month: 'short', day: 'numeric' });
  const res = await fetch('https://reportforge-api.vercel.app/api/csv-to-report', {
    method: 'POST',
    headers: {
      'Content-Type': 'application/json',
      'X-API-Key': process.env.REPORTFORGE_API_KEY,
    },
    body: JSON.stringify({
      csv,
      template: 'sales-summary',
      title: `Weekly Sales — Week of ${weekLabel}`,
    }),
  });

  if (!res.ok) {
    throw new Error(`Report generation failed: ${res.status} ${await res.text()}`);
  }
  const { html, meta } = await res.json();
  console.log(`Report generated: ${meta.rowCount} deals`);

  // 4. Send via email
  const transporter = nodemailer.createTransport({
    host: 'smtp.gmail.com',
    port: 587,
    auth: { user: process.env.SMTP_USER, pass: process.env.SMTP_PASS },
  });

  await transporter.sendMail({
    from: '"Sales Bot" <reports@company.com>',
    to: 'sales-team@company.com',
    subject: `Weekly Sales Report — ${weekLabel}`,
    html,
  });

  console.log('Weekly sales report sent.');
}

weeklySalesReport().catch(console.error);
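The script above assumes the API call succeeds on the first try. For an unattended Monday-morning job, a small retry wrapper makes transient network failures non-fatal. This is a sketch, and `withRetry` is a name introduced here:

```javascript
// Retry an async operation with exponential backoff.
// attempts: total tries; baseMs: first delay, doubled each retry.
async function withRetry(fn, attempts = 3, baseMs = 1000) {
  let lastErr;
  for (let i = 0; i < attempts; i++) {
    try {
      return await fn();
    } catch (err) {
      lastErr = err;
      if (i < attempts - 1) {
        await new Promise(resolve => setTimeout(resolve, baseMs * 2 ** i));
      }
    }
  }
  throw lastErr;
}
```

Note that fetch resolves even on HTTP error statuses, so throw on a non-ok response inside the wrapped function if you want those retried too.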

Python Version: Connecting to PostgreSQL

If your backend is Python-based, here is the equivalent pipeline using psycopg2 and requests. This pattern works equally well with MySQL, SQLite, or any data source you can query into a list of dictionaries.

Python — weekly_sales_report.py
import psycopg2
import requests
import smtplib
import csv
import io
import os
from datetime import datetime, timedelta
from email.mime.multipart import MIMEMultipart
from email.mime.text import MIMEText

def generate_weekly_report():
    week_ago = datetime.now() - timedelta(days=7)

    # 1. Query the database
    conn = psycopg2.connect(os.environ["DATABASE_URL"])
    cur = conn.cursor()
    cur.execute(
        """SELECT name AS item, amount, quantity, category
           FROM deals
           WHERE closed_at >= %s AND status = 'won'
           ORDER BY amount DESC""",
        (week_ago,)
    )
    rows = [dict(zip([d[0] for d in cur.description], row)) for row in cur.fetchall()]
    cur.close()
    conn.close()

    if not rows:
        print("No closed deals this week.")
        return

    # 2. Convert to CSV
    buf = io.StringIO()
    writer = csv.DictWriter(buf, fieldnames=["item", "amount", "quantity", "category"])
    writer.writeheader()
    writer.writerows(rows)
    csv_string = buf.getvalue()

    # 3. Call ReportForge API
    week_label = datetime.now().strftime("%b %d")
    resp = requests.post(
        "https://reportforge-api.vercel.app/api/csv-to-report",
        json={
            "csv": csv_string,
            "template": "sales-summary",
            "title": f"Weekly Sales — Week of {week_label}",
        },
        headers={"X-API-Key": os.environ["REPORTFORGE_API_KEY"]},
        timeout=10,
    )
    resp.raise_for_status()
    html = resp.json()["html"]
    print(f"Generated report for {len(rows)} deals")

    # 4. Email the report
    msg = MIMEMultipart("alternative")
    msg["Subject"] = f"Weekly Sales Report — {week_label}"
    msg["From"] = "reports@company.com"
    msg["To"] = "sales-team@company.com"
    msg.attach(MIMEText(html, "html"))

    with smtplib.SMTP("smtp.gmail.com", 587) as server:
        server.starttls()
        server.login(os.environ["SMTP_USER"], os.environ["SMTP_PASS"])
        server.send_message(msg)

    print("Weekly sales report sent.")

if __name__ == "__main__":
    generate_weekly_report()

Scheduling the Report

Once your script works locally, scheduling it is straightforward. Here are the most common approaches:

Linux/macOS Cron

Crontab — every Monday at 7am
# Run: crontab -e
0 7 * * 1 cd /app && node weekly-sales-report.js >> /var/log/sales-report.log 2>&1

GitHub Actions Scheduled Workflow

YAML — .github/workflows/weekly-report.yml
name: Weekly Sales Report
on:
  schedule:
    - cron: '0 7 * * 1'  # Every Monday at 7am UTC
jobs:
  report:
    runs-on: ubuntu-latest
    steps:
      - uses: actions/checkout@v4
      - uses: actions/setup-node@v4
        with: { node-version: '20' }
      - run: npm ci
      - run: node weekly-sales-report.js
        env:
          DATABASE_URL: ${{ secrets.DATABASE_URL }}
          REPORTFORGE_API_KEY: ${{ secrets.REPORTFORGE_API_KEY }}
          SMTP_USER: ${{ secrets.SMTP_USER }}
          SMTP_PASS: ${{ secrets.SMTP_PASS }}
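systemd Timer

If your server runs systemd, a timer unit is an alternative to cron with built-in logging and catch-up runs. The paths and unit names below are illustrative:

INI — /etc/systemd/system/sales-report.service and .timer
# /etc/systemd/system/sales-report.service
[Unit]
Description=Weekly sales report

[Service]
Type=oneshot
WorkingDirectory=/app
ExecStart=/usr/bin/node weekly-sales-report.js
EnvironmentFile=/app/.env

# /etc/systemd/system/sales-report.timer
[Unit]
Description=Run weekly sales report every Monday at 7am

[Timer]
OnCalendar=Mon *-*-* 07:00:00
Persistent=true

[Install]
WantedBy=timers.target

Enable it with systemctl enable --now sales-report.timer.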

Common Pitfalls to Avoid

  • Unnormalized column headers: the template needs item and amount, so map CRM-specific names like "Deal Amount" before posting.
  • Currency formatting in amounts: strip dollar signs and thousands separators so values parse as numbers.
  • Unquoted fields: item names that contain commas will shift columns unless wrapped in double quotes.
  • Hardcoded credentials: keep your API key and SMTP password in environment variables or CI secrets.

Summary

Automating sales reports with the ReportForge API reduces a 30-60 minute weekly task to a script that runs in under 5 seconds. The sales-summary template handles all the layout and math — you just need to provide clean, normalized CSV data.

Start with the free tier to build and test your integration. When your pipeline is working, upgrade to the Starter plan for the higher rate limits you need in production.

Generate Your First Sales Report Free

Try the sales-summary template right now — no API key, no sign-up required.

Try It Live →