How to Automate Website Monitoring with Screenshots
Set up automated website monitoring using screenshot captures. Detect visual regressions, downtime, and content changes by scheduling periodic screenshots with the Site-Shot API.
Apr 2, 2026
Visual monitoring catches problems that traditional uptime checks miss. A website can return HTTP 200 while displaying a broken layout, missing images, or an error message in the page body. Automated screenshot monitoring captures what users actually see.
Why Monitor with Screenshots?
- Visual regression detection — catch CSS breaks, missing images, and layout shifts that don't trigger HTTP errors.
- Content verification — confirm that pricing, product listings, or legal pages display the correct information.
- Competitor tracking — monitor competitor websites for design changes, new features, or pricing updates.
- Compliance records — maintain timestamped visual records of web pages for legal or regulatory purposes.
- Downtime evidence — screenshot captures provide visual proof of outages beyond simple ping checks.
Basic Monitoring Script (Python)
Here's a Python script that captures a screenshot and saves it with a timestamp:
```python
import os
from datetime import datetime

import requests

API_URL = "https://api.site-shot.com/"
API_KEY = "YOUR_API_KEY"
OUTPUT_DIR = "screenshots"

def capture(url, label="site"):
    os.makedirs(OUTPUT_DIR, exist_ok=True)
    timestamp = datetime.now().strftime("%Y%m%d_%H%M%S")
    filename = f"{label}_{timestamp}.png"
    filepath = os.path.join(OUTPUT_DIR, filename)
    response = requests.get(API_URL, params={
        "url": url,
        "userkey": API_KEY,
        "width": 1280,
        "height": 1024,
        "format": "png",
        "no_ads": 1,
        "no_cookie_popup": 1,
    }, timeout=70)
    if response.status_code == 200:
        with open(filepath, "wb") as f:
            f.write(response.content)
        print(f"Captured: {filepath}")
    else:
        print(f"Error: HTTP {response.status_code}")

if __name__ == "__main__":
    capture("https://your-website.com", label="homepage")
    capture("https://your-website.com/pricing", label="pricing")
```
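A transient network error (timeout, DNS hiccup) will raise an exception and abort the whole run. A minimal retry sketch, where `capture_with_retry` is a hypothetical wrapper around the `capture` function above, not part of the Site-Shot API:

```python
import time

import requests

def capture_with_retry(capture_fn, url, label, retries=3, backoff=5):
    """Call capture_fn(url, label=label), retrying on network errors.

    Waits backoff * attempt seconds between attempts (linear backoff).
    Returns True on success, False if all attempts failed.
    """
    for attempt in range(1, retries + 1):
        try:
            capture_fn(url, label=label)
            return True
        except requests.RequestException as exc:
            print(f"Attempt {attempt} failed: {exc}")
            if attempt < retries:
                time.sleep(backoff * attempt)
    return False
```

In a scheduled job, a `False` return is a good trigger for an alert, since repeated capture failures can themselves indicate downtime.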
Scheduling Captures
With cron (Linux/macOS)
Run the script every hour:
```
0 * * * * /usr/bin/python3 /path/to/monitor.py
```
With Task Scheduler (Windows)
Create a scheduled task that runs `python monitor.py` at your desired interval.
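As a sketch, the equivalent `schtasks` command (task name, script path, and schedule are illustrative):

```
schtasks /Create /TN "ScreenshotMonitor" /TR "python C:\scripts\monitor.py" /SC HOURLY
```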
With GitHub Actions
```yaml
name: Screenshot Monitor

on:
  schedule:
    - cron: '0 */6 * * *'  # Every 6 hours

jobs:
  capture:
    runs-on: ubuntu-latest
    steps:
      - uses: actions/checkout@v4
      - uses: actions/setup-python@v5
        with:
          python-version: '3.12'
      - run: pip install requests
      - run: python monitor.py
      - uses: actions/upload-artifact@v4
        with:
          name: screenshots-${{ github.run_number }}
          path: screenshots/
```
Monitoring Multiple Pages
```python
PAGES = [
    ("https://your-site.com/", "homepage"),
    ("https://your-site.com/pricing", "pricing"),
    ("https://your-site.com/docs", "docs"),
    ("https://competitor.com/", "competitor"),
]

for url, label in PAGES:
    capture(url, label=label)
```
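If your page list grows, firing requests back-to-back may hit API rate limits. A sketch that spaces out the calls (`capture_all` is a hypothetical helper; the 2-second pause is a guess, check your plan's limits):

```python
import time

def capture_all(pages, capture_fn, pause_seconds=2.0):
    """Capture each (url, label) pair, pausing between requests.

    pause_seconds spaces out API calls to stay within rate limits.
    """
    for i, (url, label) in enumerate(pages):
        if i > 0:
            time.sleep(pause_seconds)  # space out consecutive API calls
        capture_fn(url, label=label)
```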
Detecting Visual Changes
Compare consecutive screenshots using image diffing. A simple pixel-comparison approach:
```python
from PIL import Image
import numpy as np

def images_differ(path_a, path_b, threshold=0.01):
    """Return True if more than `threshold` fraction of pixels differ."""
    img_a = np.array(Image.open(path_a))
    img_b = np.array(Image.open(path_b))
    if img_a.shape != img_b.shape:
        return True
    diff_pixels = np.sum(img_a != img_b) / img_a.size
    return diff_pixels > threshold
```
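To feed `images_differ`, you need the two most recent captures for a page. Because the filenames embed a `YYYYMMDD_HHMMSS` timestamp, lexicographic order matches chronological order; a sketch (`latest_two` is a hypothetical helper built on that naming scheme):

```python
import glob
import os

def latest_two(label, directory="screenshots"):
    """Return paths of the two most recent captures for a label, oldest first.

    Relies on the label_YYYYMMDD_HHMMSS.png naming scheme, where sorting
    filenames lexicographically also sorts them chronologically.
    """
    paths = sorted(glob.glob(os.path.join(directory, f"{label}_*.png")))
    return paths[-2:] if len(paths) >= 2 else None
```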
When a change is detected, send an alert via email, Slack, or your preferred notification channel.
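For example, a minimal Slack alert via an incoming webhook (the helper names and message format are illustrative, not a prescribed integration):

```python
import requests

def build_alert_payload(page_url, diff_fraction):
    """Build a Slack-compatible message payload describing the change."""
    return {"text": f"Visual change detected on {page_url}: "
                    f"{diff_fraction:.1%} of pixels differ"}

def send_slack_alert(webhook_url, page_url, diff_fraction):
    """POST the alert to a Slack incoming webhook URL."""
    payload = build_alert_payload(page_url, diff_fraction)
    response = requests.post(webhook_url, json=payload, timeout=10)
    response.raise_for_status()
```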
Tips for Reliable Monitoring
- Use `no_ads=1` and `no_cookie_popup=1` to remove dynamic elements that change between captures and create false positives.
- Set a consistent viewport (e.g., 1280×1024) so screenshots are always comparable.
- Use `delay_time=3000` for pages with heavy JavaScript rendering to ensure all content is loaded.
- Capture from a fixed country using the `country` parameter to avoid geo-based content variations.
- Store screenshots with timestamps for audit trails and historical comparison.
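Combined, the tips above suggest a shared parameter set for all monitoring captures. A sketch (parameter names come from the tips; the values, including the `country` code, are illustrative):

```python
# Shared request parameters for stable, comparable captures.
STABLE_PARAMS = {
    "width": 1280,          # fixed viewport so captures stay comparable
    "height": 1024,
    "format": "png",
    "no_ads": 1,            # strip ads that change between captures
    "no_cookie_popup": 1,   # suppress cookie consent banners
    "delay_time": 3000,     # wait 3s for JavaScript-heavy pages to settle
    "country": "US",        # pin capture location to avoid geo variations
}
```

Merging this dict into each request (e.g., `params={**STABLE_PARAMS, "url": url, "userkey": API_KEY}`) keeps all captures consistent.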
Full Page Monitoring
For pages where content below the fold matters (e.g., long pricing pages), use full page capture:
```python
response = requests.get(API_URL, params={
    "url": url,
    "userkey": API_KEY,
    "full_size": 1,
    "max_height": 10000,
    "format": "jpeg",  # JPEG for smaller files in archives
    "no_ads": 1,
    "no_cookie_popup": 1,
}, timeout=70)
```
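Full-page captures accumulate quickly, so long-running monitors usually need a retention policy. A cleanup sketch (`prune_old_screenshots` is a hypothetical helper; it trusts file modification times, and the 30-day default is arbitrary):

```python
import os
import time

def prune_old_screenshots(directory="screenshots", max_age_days=30):
    """Delete screenshots older than max_age_days, judged by file mtime.

    Returns the number of files removed.
    """
    cutoff = time.time() - max_age_days * 86400
    removed = 0
    for name in os.listdir(directory):
        path = os.path.join(directory, name)
        if os.path.isfile(path) and os.path.getmtime(path) < cutoff:
            os.remove(path)
            removed += 1
    return removed
```

If you keep screenshots for compliance records, archive them elsewhere before pruning rather than deleting outright.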
Next Steps
- Set up the Site-Shot API with your API key
- Create a monitoring script for your key pages
- Schedule it with cron, GitHub Actions, or your CI/CD pipeline
- Add image diffing and alerting for automated change detection