How to Batch Compress Multiple Images at Once
Updated March 2026 · 7 min read
By CompressLocal Team
Compressing images one at a time is tedious. Whether you're preparing photos for a website, resizing images for a presentation, or shrinking attachments for email, batch compression saves you the repetitive work.
What is batch compression?
Batch compression means processing multiple images at the same time with the same settings. Instead of opening each image individually, adjusting quality, and exporting — you drop all your files in at once and get them all back compressed.
Comparing batch compression approaches
There are three main approaches, each with distinct trade-offs in speed, flexibility, and accessibility.
Photoshop batch actions
Photoshop's Image Processor (File → Scripts → Image Processor) lets you batch-resize and re-save an entire folder of images. You can also record custom Actions that apply sharpening, watermarking, or format conversion in the same pass. The output quality is excellent — Photoshop's JPEG encoder is one of the best available, and you get fine-grained control over color profiles and metadata stripping.
The downsides are real, though. Photoshop costs $23/month minimum (Photography plan), uses 2-4 GB of RAM while running, and the batch dialog is buried three menus deep. Processing 200 images can take 5-10 minutes because Photoshop opens each file fully into memory, applies the action, and saves sequentially. There's no concurrent processing. For someone who just needs smaller files, it's like driving a semi-truck to the grocery store.
ImageMagick and CLI tools
Command-line tools like ImageMagick, jpegoptim, and cwebp are the power-user choice. A single command like mogrify -quality 80 -resize '1920x1920>' *.jpg (the geometry is quoted so the shell doesn't treat > as redirection) can process thousands of files. You can pipe them through GNU Parallel to compress 8 images simultaneously on an 8-core machine, cutting total time dramatically. They're free, scriptable, and can be integrated into CI/CD pipelines or cron jobs.
The barrier is knowledge. You need to know your shell, understand flag syntax, and be comfortable with destructive commands — mogrify overwrites originals by default unless you specify an output directory. One wrong wildcard and you've crushed your source files. ImageMagick also has a steep memory curve: converting a 50 MP RAW file can spike to 1-2 GB of RAM per process, so running 8 in parallel on a 16 GB machine will swap hard.
Browser-based tools
The fastest option for most people. No installation, works on any device with a modern browser. The best browser-based tools process images locally (in your browser) rather than uploading them to a server — which is both faster and more private. You drag files in, pick a target size, and download a ZIP. The entire round-trip for 20 photos typically takes under 10 seconds.
The trade-off is capacity. Browsers cap memory per tab at roughly 1-4 GB depending on the OS, so processing 500 images in one go isn't practical. But for batches of 5-50 images — which covers the vast majority of real-world tasks — browser-based tools hit the sweet spot of speed, simplicity, and privacy.
Who needs batch compression?
Photographers delivering client galleries
After a wedding or portrait session, photographers typically cull down to 300-800 edited images. Clients don't need 12 MB full-resolution files for sharing on social media or viewing on a tablet. Batch compressing the delivery set to 500-800 KB each cuts a 4 GB gallery down to under 400 MB, making downloads faster and cloud storage cheaper. Many photographers do this as a separate "web export" pass after their Lightroom edits.
E-commerce sellers preparing product catalogs
Platforms like Shopify, Etsy, and Amazon recommend product images under 1 MB. A seller listing 50 new products with 5 photos each needs to compress 250 images. Doing this one at a time would take over an hour. Batch compression with a consistent target — say 300 KB per image at 1200px wide — ensures uniform quality across the catalog and keeps page load times under the 3-second threshold that directly affects conversion rates.
Bloggers optimizing post images
A typical blog post uses 3-8 images. Over 50 posts, that's 150-400 images that affect your Core Web Vitals score. Google's Largest Contentful Paint (LCP) metric penalizes pages where the hero image takes more than 2.5 seconds to render. Batch compressing all images in a post to under 200 KB before publishing is one of the highest-impact SEO optimizations you can make — and it takes about 5 seconds with the right tool.
Students submitting assignments
Many university portals cap uploads at 5-10 MB total. A student scanning 15 pages of handwritten notes with a phone camera easily generates 3-5 MB per image — 45-75 MB total, well over the limit. Batch compressing those scans to 200-400 KB each brings the total under 6 MB while keeping text perfectly legible. This is especially common in engineering and medical programs where hand-drawn diagrams are part of the submission.
How batch compression works in the browser
Modern browsers have powerful image processing capabilities built in:
- Canvas API — re-encodes images at different quality levels
- Web Workers — processes images in background threads so the page stays responsive
- Concurrency — multiple images can compress simultaneously
This means a browser-based tool can compress 20 images in a few seconds, right on your device, without sending a single byte over the network.
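A compression pass built on these APIs might look like the sketch below, assuming a browser context for compressImage (createImageBitmap and OffscreenCanvas are browser APIs); fitWithin is a plain helper that computes the resized dimensions:

```javascript
// Compute dimensions that fit inside maxDim while preserving
// aspect ratio; never upscale.
function fitWithin(width, height, maxDim) {
  const scale = Math.min(1, maxDim / Math.max(width, height));
  return { width: Math.round(width * scale), height: Math.round(height * scale) };
}

// Browser-only sketch: decode, draw to a canvas, re-encode as JPEG.
// quality is 0-1; resolves to a Blob holding the compressed image.
async function compressImage(file, maxDim = 1920, quality = 0.8) {
  const bitmap = await createImageBitmap(file);
  const { width, height } = fitWithin(bitmap.width, bitmap.height, maxDim);
  const canvas = new OffscreenCanvas(width, height);
  canvas.getContext('2d').drawImage(bitmap, 0, 0, width, height);
  bitmap.close(); // release the decoded bitmap promptly
  return canvas.convertToBlob({ type: 'image/jpeg', quality });
}
```

Because OffscreenCanvas also works inside a Web Worker, the same function can run off the main thread unchanged.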
Performance considerations
Concurrent processing is the single biggest factor in batch compression speed. A tool that compresses one image at a time can take roughly 4x longer than one running 4 workers in parallel. On a modern laptop with 8 logical cores, 3-4 concurrent workers is the practical sweet spot — enough to saturate the CPU without starving the browser's main thread of resources for UI updates.
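The worker-pool pattern can be sketched without any browser APIs: a small promise pool that keeps at most `limit` tasks in flight (the name runPool is our own):

```javascript
// Run async tasks with at most `limit` executing concurrently.
// `tasks` is an array of zero-argument functions returning promises.
async function runPool(tasks, limit = 4) {
  const results = new Array(tasks.length);
  let next = 0;
  async function worker() {
    while (next < tasks.length) {
      const i = next++;              // claim the next task index
      results[i] = await tasks[i](); // run it; other workers proceed meanwhile
    }
  }
  // Start `limit` workers that drain the shared queue.
  await Promise.all(Array.from({ length: Math.min(limit, tasks.length) }, worker));
  return results;
}
```

Because JavaScript is single-threaded between awaits, the `next++` claim needs no locking; in a real tool each task would post an image to a Web Worker and await its reply.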
Memory usage scales with image dimensions, not file size. A 2 MB JPEG that's 4000x3000 pixels requires roughly 48 MB of uncompressed bitmap data in memory (4000 × 3000 × 4 bytes per RGBA pixel). Processing 10 such images concurrently means ~480 MB of bitmap data alone, plus overhead for the Canvas operations. This is why browser-based tools typically limit batch sizes to 20-50 files — it's a memory guardrail, not an arbitrary restriction.
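The arithmetic above is simple enough to sketch directly (the helper names are our own):

```javascript
// Uncompressed RGBA bitmap size in bytes: 4 bytes per pixel.
function bitmapBytes(width, height) {
  return width * height * 4;
}

// Worst-case peak bitmap memory for a batch: the `concurrency`
// largest images decoded at the same time.
function batchBitmapBytes(images, concurrency) {
  return images
    .map(({ width, height }) => bitmapBytes(width, height))
    .sort((a, b) => b - a)
    .slice(0, concurrency)
    .reduce((sum, bytes) => sum + bytes, 0);
}
```

A 4000x3000 image comes out to 48,000,000 bytes, matching the figure above; ten of them processed concurrently is 480 MB before any Canvas overhead.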
Format matters too. JPEGs compress and decompress quickly because the codec is simple and heavily optimized. PNGs are slower to encode, especially at high compression levels, because the deflate algorithm does more work to find repeating patterns. WebP encoding sits between the two in speed but produces smaller files than JPEG at equivalent visual quality. If you're batch-processing a mix of formats, expect PNGs to be the bottleneck.
Common mistakes to avoid
Re-compressing already-compressed JPEGs
JPEG is a lossy format. Every time you re-encode a JPEG, you lose more detail — even at quality 95. Compressing a JPEG that was already saved at quality 80 down to quality 70 doesn't just remove 10% more data; it introduces new compression artifacts around edges and gradients that compound with the existing ones. The result looks noticeably worse than compressing the original once at quality 70. Always start from the highest-quality source you have. If you only have pre-compressed JPEGs, reduce quality as little as possible, or reduce dimensions instead: downscaling lets you keep a high quality setting, and averaging pixels during the resize tends to soften existing artifacts rather than compound them.
Mixing target sizes in a batch
Applying a single 200 KB target to a batch that includes both 6000x4000 photos and 400x300 thumbnails produces bad results. The large photos will look acceptable at 200 KB, but the small thumbnails are already under 200 KB — the tool either skips them (confusing) or re-encodes them unnecessarily (quality loss for no size gain). Sort your images by type or dimensions first, and run separate batches with appropriate targets for each group.
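Sorting a mixed batch before compressing can be as simple as bucketing by pixel count. A sketch; the function name and the 1-megapixel cutoff are arbitrary examples:

```javascript
// Split a batch into large photos and small thumbnails so each
// group can get its own target size. Cutoff is in total pixels.
function splitByPixels(images, cutoff = 1_000_000) {
  const photos = [];
  const thumbnails = [];
  for (const img of images) {
    (img.width * img.height >= cutoff ? photos : thumbnails).push(img);
  }
  return { photos, thumbnails };
}
```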
Not keeping originals
This sounds obvious, but it happens constantly. Someone batch-compresses a folder, deletes the originals to save disk space, then realizes a month later they need a high-resolution version for print. Compressed images cannot be uncompressed. Always keep your originals in a separate folder or archive. Storage is cheap — a 1 TB external drive costs under $50. Losing source files is permanent.
Compressing screenshots as JPEG instead of PNG
Screenshots, UI mockups, diagrams, and text-heavy images contain sharp edges and flat color regions. JPEG's lossy compression creates visible smudging around text and UI elements — the classic "JPEG artifact" look. PNG handles these images far better because its lossless compression preserves sharp edges perfectly, and flat-color regions compress extremely well in PNG (a 1920x1080 screenshot of a text editor might be only 150 KB as PNG but look terrible as a 150 KB JPEG). When batch-processing mixed content, separate your screenshots from your photographs and use the right format for each.
Tips for batch compression
- Set one target size for all images. 200-500 KB works for most use cases.
- Download as a ZIP when compressing many files — easier than downloading individually.
- Don't compress twice. Re-compressing an already compressed JPEG degrades quality further. Start from originals when possible.
- Mix formats freely. Most batch tools handle JPG, PNG, and WebP together.
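Hitting a single byte target usually means searching over the encoder's quality setting, since tools can't predict output size exactly. A sketch of that search, assuming a `sizeAt(quality)` callback that returns the encoded size in bytes (in a browser it might wrap the Canvas encoder; here it is left abstract):

```javascript
// Binary-search the highest quality (0-1) whose encoded output
// fits under targetBytes. sizeAt(quality) must return the encoded
// size in bytes and should grow roughly monotonically with quality.
async function findQuality(sizeAt, targetBytes, steps = 7) {
  let lo = 0;
  let hi = 1;
  let best = lo;
  for (let i = 0; i < steps; i++) {
    const mid = (lo + hi) / 2;
    if (await sizeAt(mid) <= targetBytes) {
      best = mid;  // fits: remember it and try higher quality
      lo = mid;
    } else {
      hi = mid;    // too big: lower quality
    }
  }
  return best;
}
```

Seven steps narrow the quality to within about 1% of the best achievable value, at the cost of seven trial encodes per image.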
Batch compress up to 20 images — free and private
CompressLocal processes 3 images concurrently using Web Workers. Drop your files, set a target size, and download as a ZIP. Everything runs in your browser.
Batch compress now