Need to capture website screenshots programmatically in Node.js? Whether you're building a link preview generator, a monitoring tool, or a content archive, this tutorial shows you the quickest path from URL to PNG.
We'll cover two approaches: the quick way using the URLSnap API (no browser setup), and the manual way using Puppeteer if you prefer self-hosting.
The fastest approach: no Chromium download, no headless browser, no dependency to babysit. Just an HTTP request:
```js
// Install: npm install node-fetch
import fetch from 'node-fetch';
import fs from 'fs';

const API_KEY = 'your_api_key_here'; // get a free key at urlsnap.dev

async function screenshotUrl(url, outputPath) {
  const params = new URLSearchParams({
    url,
    width: '1280',
    height: '800',
    format: 'png',
    full_page: 'true',
  });

  const response = await fetch(
    `https://urlsnap.dev/api/screenshot?${params}`,
    { headers: { 'x-api-key': API_KEY } }
  );

  if (!response.ok) {
    const err = await response.json();
    throw new Error(err.error);
  }

  // node-fetch v3 deprecates response.buffer(); build a Buffer from arrayBuffer()
  const buffer = Buffer.from(await response.arrayBuffer());
  fs.writeFileSync(outputPath, buffer);
  console.log(`Screenshot saved to ${outputPath}`);
}

screenshotUrl('https://example.com', 'screenshot.png');
```
The screenshot endpoint supports these query parameters:
```
GET https://urlsnap.dev/api/screenshot
```

| Parameter | Required | Default | Description |
|-----------|----------|---------|-------------|
| `url` | yes | | Target URL to screenshot |
| `width` | no | `1280` | Viewport width in px |
| `height` | no | `800` | Viewport height in px |
| `format` | no | `png` | `'png'` or `'jpeg'` |
| `full_page` | no | `false` | Capture the full scrollable page (`'true'`/`'false'`) |
| `delay` | no | `0` | Wait in ms after page load, max `5000` |
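Those parameters all travel as strings in the query, so it's worth centralizing their assembly. A small sketch of a query builder (`buildQuery` is a name introduced here for illustration; the defaults mirror the table above):

```javascript
// Build the screenshot query string from the documented parameters.
// URLSearchParams handles the URL encoding; all values must be strings.
function buildQuery({ url, width = 1280, height = 800, format = 'png', fullPage = false, delay = 0 }) {
  return new URLSearchParams({
    url,
    width: String(width),
    height: String(height),
    format,
    full_page: String(fullPage),
    delay: String(delay),
  }).toString();
}

const qs = buildQuery({ url: 'https://example.com', fullPage: true });
console.log(qs);
// url=https%3A%2F%2Fexample.com&width=1280&height=800&format=png&full_page=true&delay=0
```

Centralizing this also means a typo like `fullpage` fails loudly in one place instead of silently dropping the flag.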
First, grab an API key by registering your email:

```bash
curl -X POST https://urlsnap.dev/api/register \
  -H "Content-Type: application/json" \
  -d '{"email":"you@example.com"}'

# Returns: {"key":"us_abc123...","message":"Free tier: 20 requests/day"}
```
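The same registration can be done from Node. This sketch follows the curl call above; `registerKey` is a name introduced here, and it assumes the documented response shape (`key` and `message` on success, an `error` field otherwise). It takes an injectable fetch function so it can be exercised without touching the network:

```javascript
// Register for an API key from Node (Node 18+ has a global fetch).
// fetchFn is injectable so the call can be stubbed in tests.
async function registerKey(email, fetchFn = fetch) {
  const res = await fetchFn('https://urlsnap.dev/api/register', {
    method: 'POST',
    headers: { 'Content-Type': 'application/json' },
    body: JSON.stringify({ email }),
  });
  const data = await res.json();
  if (!res.ok || !data.key) throw new Error(data.error ?? 'registration failed');
  return data.key;
}

// e.g. const key = await registerKey('you@example.com');
```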
If you're in TypeScript, a typed wrapper keeps the options honest:

```ts
import fetch, { Response } from 'node-fetch';

interface ScreenshotOptions {
  url: string;
  width?: number;
  height?: number;
  format?: 'png' | 'jpeg';
  fullPage?: boolean;
  delay?: number;
}

async function screenshot(
  options: ScreenshotOptions,
  apiKey: string
): Promise<Buffer> {
  const params = new URLSearchParams({
    url: options.url,
    width: String(options.width ?? 1280),
    height: String(options.height ?? 800),
    format: options.format ?? 'png',
    full_page: String(options.fullPage ?? false),
    delay: String(options.delay ?? 0),
  });

  const res: Response = await fetch(
    `https://urlsnap.dev/api/screenshot?${params}`,
    { headers: { 'x-api-key': apiKey } }
  );

  if (!res.ok) {
    const body = (await res.json()) as { error?: string };
    throw new Error(body.error ?? `HTTP ${res.status}`);
  }
  return Buffer.from(await res.arrayBuffer());
}
```
You can check your usage at any time via the `/api/me` endpoint:

```js
const res = await fetch('https://urlsnap.dev/api/me', {
  headers: { 'x-api-key': API_KEY },
});
const info = await res.json();
// { plan: 'free', requests_today: 3, daily_limit: 20, requests_total: 3 }
console.log(`Used ${info.requests_today}/${info.daily_limit} today`);
```
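If you're batching screenshots, that response is enough to avoid burning requests that would only come back as quota errors. A minimal guard (`underQuota` is an illustrative helper, built on the `/api/me` fields shown above):

```javascript
// Decide whether another request fits in today's quota,
// given the /api/me response shape shown above.
function underQuota(info) {
  return info.requests_today < info.daily_limit;
}

const info = { plan: 'free', requests_today: 3, daily_limit: 20, requests_total: 3 };
console.log(underQuota(info)); // true
```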
If you prefer running your own browser (useful for on-prem or air-gapped environments):

```bash
# Install Puppeteer with bundled Chromium
npm install puppeteer
```
```ts
import puppeteer from 'puppeteer';

async function screenshotUrl(url: string, path: string) {
  const browser = await puppeteer.launch({
    headless: true,
    args: ['--no-sandbox', '--disable-setuid-sandbox'],
  });
  try {
    const page = await browser.newPage();
    await page.setViewport({ width: 1280, height: 800 });
    await page.goto(url, { waitUntil: 'networkidle2' });
    await page.screenshot({ path, fullPage: true });
  } finally {
    // always release the browser, even if navigation or capture fails
    await browser.close();
  }
}
```
Use URLSnap when: you want zero infrastructure, fast integration, and don't want to manage browser crashes, memory leaks, or Chromium updates.
Self-host when: you have strict data residency requirements, need to screenshot internal/private URLs not reachable from the internet, or have very high volumes that justify the ops cost.
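Whichever route you take, transient failures (slow pages, rate limits, browser hiccups) are worth retrying rather than surfacing immediately. A minimal backoff sketch; `withRetry` is an illustrative helper, not part of the URLSnap API, and the attempt count and delays are arbitrary defaults:

```javascript
// Retry an async operation with exponential backoff.
// attempts and baseDelayMs are illustrative defaults, not URLSnap settings.
async function withRetry(fn, attempts = 3, baseDelayMs = 500) {
  for (let i = 0; i < attempts; i++) {
    try {
      return await fn();
    } catch (err) {
      if (i === attempts - 1) throw err; // out of retries: surface the error
      await new Promise((r) => setTimeout(r, baseDelayMs * 2 ** i));
    }
  }
}

// e.g. await withRetry(() => screenshotUrl('https://example.com', 'shot.png'));
```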
Free tier: 20 screenshots/day. No credit card required.
Get your free API key →