Performance with 50k+ SKUs and YMM filtering — what to watch?

Three load points break first if you don’t architect them correctly:

  • YMM dropdown response time — Year → Make → Model → Trim cascading dropdown. Each click triggers a query for valid options. Bad architecture: full Magento controller request per click (200–500ms each, customer abandons by Trim). Good: pre-built JSON tree of vehicle data shipped to client at first PDP load (gzipped, ~80–200KB), Alpine.js handles cascading state client-side. Each click is <5ms. The only server roundtrip is the final “show parts” click.
  • Layered-nav with a vehicle filter on category pages — the customer applies a vehicle filter and Magento has to AND-filter 50k SKUs by vehicle compatibility. The native EAV indexer can't do this efficiently; the fix is a custom Elasticsearch filter on a multi-value integer product attribute, fits_vehicle_ids. ES handles the AND-filter in 30–80ms.
  • Sitemap + crawl budget — if you generate a URL per (category × vehicle) combination, you’re looking at 500k+ URLs. Google won’t crawl that. Pattern: only emit per-vehicle URLs for the top 10,000 vehicle × category combos by search volume. The rest are accessible via the on-site filter but not in the sitemap. Or use canonical tags pointing back to the category page for the long tail.
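The pre-built vehicle tree from the first point can be sketched roughly like this. This is a minimal illustration, not the actual payload shape: the tree type, the sample data, and the helper names (`makes`, `models`, `trims`) are all assumptions; the point is that every cascading step is a pure in-memory lookup.

```typescript
// Hypothetical shape of the pre-built JSON tree: year → make → model → trims.
type VehicleTree = Record<string, Record<string, Record<string, string[]>>>;

// Tiny sample; the real gzipped payload would cover the full vehicle list.
const tree: VehicleTree = {
  "2020": {
    "Honda":  { "Civic":   ["LX", "EX", "Touring"] },
    "Toyota": { "Corolla": ["L", "LE"] },
  },
};

// Each dropdown click resolves client-side with no server roundtrip.
const makes  = (year: string): string[] =>
  Object.keys(tree[year] ?? {});
const models = (year: string, make: string): string[] =>
  Object.keys(tree[year]?.[make] ?? {});
const trims  = (year: string, make: string, model: string): string[] =>
  tree[year]?.[make]?.[model] ?? [];
```

Alpine.js would hold the selected year/make/model in component state and call lookups like these on each change; only the final "show parts" click hits the server.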
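The vehicle AND-filter on the category page boils down to one extra filter clause in the Elasticsearch request. A sketch of the request body, where the category id and vehicle id are made-up example values and only fits_vehicle_ids comes from the setup above (a `term` filter on a multi-value field matches if any element matches):

```typescript
// Illustrative ES request body: the vehicle filter is ANDed onto the
// normal layered-nav filters inside a bool.filter (non-scoring, cacheable).
const vehicleId = 48213; // hypothetical saved-vehicle id

const esQuery = {
  query: {
    bool: {
      filter: [
        { term: { category_id: 42 } },             // example category clause
        { term: { fits_vehicle_ids: vehicleId } }, // matches any array element
      ],
    },
  },
  size: 24, // one category-page worth of products
};
```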
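The sitemap pruning is just a sort-and-slice over the (category × vehicle) combos. A minimal sketch, assuming a hypothetical `Combo` record with a search-volume score and an example URL pattern:

```typescript
// Emit per-vehicle landing URLs only for the top-N combos by search volume;
// the long tail stays reachable through the on-site filter but out of the sitemap.
interface Combo {
  categorySlug: string;
  vehicleSlug: string;
  searchVolume: number;
}

function sitemapUrls(combos: Combo[], topN = 10_000): string[] {
  return [...combos]
    .sort((a, b) => b.searchVolume - a.searchVolume)
    .slice(0, topN)
    .map(c => `https://example.com/${c.categorySlug}/${c.vehicleSlug}`);
}
```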

Cache strategy: the vehicle-data JSON gets a 24h TTL plus a manual purge after each TecDoc / SEMA Data Coop refresh. The Hyvä category cache keys the layered-nav HTML on a URL hash that includes the filter params. Full-page cache (Varnish) keys on a custom X-Magento-Vary header that includes the customer's saved "My Garage" vehicle; otherwise you serve cached pages for the wrong vehicle.
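For illustration only: Magento actually computes X-Magento-Vary server-side in PHP from its HTTP context, but the idea of folding the saved vehicle into the vary hash looks like this (the context keys here are made up):

```typescript
import { createHash } from "node:crypto";

// Sketch: serialize the vary context deterministically and hash it, so two
// customers with different "My Garage" vehicles get different Varnish keys.
function varyHash(context: Record<string, string>): string {
  const serialized = Object.keys(context)
    .sort()
    .map(k => `${k}=${context[k]}`)
    .join("|");
  return createHash("sha1").update(serialized).digest("hex");
}
```

Two requests that differ only in the garage vehicle now hash to different values, so Varnish stores and serves separate page variants.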

Real benchmarks from a recent build (78k SKUs, 95k vehicles, ~1.5M compatibility rows): YMM dropdown click p95 = 4ms client, layered-nav response p95 = 180ms, PDP TTFB = 240ms, Lighthouse mobile = 94.