PSAM On-Page Audit
palmspringsairmuseum.org — 2026-03-31
Executive Summary
Over the past few weeks we ran a deep technical audit of palmspringsairmuseum.org — a full crawl of every page on the main site and the shop subdomain, plus separate analyses of how the site loads for real visitors on desktop and mobile. The pages that follow walk through what was found, what each finding means in plain English, and what the web team can do to address it.
The headline: the mobile experience is severely compromised. On a typical smartphone, the homepage takes about 24 seconds to finish loading — long enough that most visitors will give up and leave before the page is ready. Desktop loads much faster, but it has its own issue: the page sits frozen for nearly a full second after appearing, while heavy scripts finish running. These aren't subjective complaints — they're measurements from Google's own performance tooling, and Google increasingly factors them into where the museum shows up in search results.
On top of the speed issue, the site's WordPress installation has accumulated a fair amount of technical debt over the years: a sitemap file that doesn't actually deliver a sitemap, hundreds of pages with no preview text in Google search results, dozens of auto-generated tag pages that compete with real content, and a handful of broken navigation links the museum's own menu still points at. None of this is unusual for a WordPress site that's been live and growing for a long time, and none of it is catastrophic on its own — but together they're holding back the museum's reach. The good news is that almost every issue is fixable with focused effort, and the report below identifies which fixes are the highest leverage.
The detailed numbers below come from Google's PageSpeed Insights tool. Each metric measures a different part of the page's loading experience — how fast something appears, how fast the page is usable, how stable the layout is. We show desktop and mobile side by side because they're measured under very different network and CPU conditions, and the gap between them is usually huge.
Desktop
| Metric | Value | Rating |
|---|---|---|
| **First Contentful Paint**: time for the first piece of content (text, image, or button) to appear on screen after a visitor clicks the link. The first "sign of life" the visitor sees. Under 1.8 s feels snappy; over 3 s feels broken. | 1.6 s | Good |
| **Largest Contentful Paint**: time until the biggest visible element on the page (usually the hero image or main heading) finishes loading. This is what Google uses as the "page is ready" signal. Under 2.5 s is good; over 4 s feels broken. | 2.4 s | Good |
| **Total Blocking Time**: how long the page is frozen. The visitor can see content but tapping or scrolling does nothing because JavaScript is busy. Almost always caused by heavy ad scripts, tracking pixels, or chat widgets. Under 200 ms is good; over 600 ms is brutal on mobile. | 930 ms | Poor |
| **Cumulative Layout Shift**: how much content jumps around after the page first appears (e.g., an ad loads and pushes the article down just as you go to tap a link). A score, not a time. Under 0.1 is good: barely perceptible. | 0.006 | Good |
| **Speed Index**: an overall "how fast does the page paint" score. It watches the page render and asks: at what time was the screen mostly filled with content? Under 3.4 s is good; over 5.8 s feels slow. | 2.3 s | Good |
| **Time to Interactive**: time until the page is fully responsive. Every button works, every menu opens. Often dramatically slower than Largest Contentful Paint because JavaScript keeps running after pixels appear on screen. Under 3.8 s is good; over 7.3 s is poor. | 4.5 s | Needs Work |
| **Server Response Time**: how long the web server takes to send the first byte of the page after the request arrives. Purely server-side; excludes rendering and JavaScript. Under 200 ms is good; Cloudflare's CDN typically gets this under 50 ms. | 10 ms (root document) | Good |
Mobile
Metric definitions are the same as in the desktop table above.

| Metric | Value | Rating |
|---|---|---|
| **First Contentful Paint** | 14.1 s | Poor |
| **Largest Contentful Paint** | 24.0 s | Poor |
| **Total Blocking Time** | 380 ms | Needs Work |
| **Cumulative Layout Shift** | 0 | Good |
| **Speed Index** | 14.1 s | Poor |
| **Time to Interactive** | 24.5 s | Poor |
| **Server Response Time** | 10 ms (root document) | Good |
Crawl Context
Issue Detail
What it is: An XML sitemap is the file that tells Google "here is every page on my website." The file at /wp-sitemap.xml exists, but it's returning the homepage HTML instead of an XML document. Google treats that as a soft-404 — meaning it doesn't recognize the file as a sitemap, so the museum's deeper pages (event listings, Warbird Wednesday profiles, gala honorees) often go un-indexed for months.
How to fix: Either re-enable WordPress's built-in sitemap (something on the site is intercepting that URL with the page template) or install an SEO plugin (Yoast SEO, Rank Math) that emits proper application/xml, then submit the working URL in Google Search Console.
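The check behind this finding can be sketched in a few lines of Python. This is an illustrative validator, not WordPress code; the function name and sample strings are invented for the example. A real XML sitemap should ship an XML content type and a root element of `urlset` or `sitemapindex`:

```python
# Minimal sketch of the sitemap check: an XML content type plus a
# <urlset> or <sitemapindex> root element. Illustrative only.
import xml.etree.ElementTree as ET

def is_valid_sitemap(content_type: str, body: str) -> bool:
    """Return True if the response looks like a real XML sitemap."""
    if "xml" not in content_type.lower():
        return False  # e.g. text/html means the page template intercepted the URL
    try:
        root = ET.fromstring(body)
    except ET.ParseError:
        return False
    tag = root.tag.rsplit("}", 1)[-1]  # strip the XML namespace, keep the tag
    return tag in ("urlset", "sitemapindex")

# What the site currently serves: HTML with a 200 status, so not a sitemap
print(is_valid_sitemap("text/html; charset=UTF-8", "<!doctype html><html></html>"))

# What a working /wp-sitemap.xml should serve
good = ('<?xml version="1.0"?><sitemapindex '
        'xmlns="http://www.sitemaps.org/schemas/sitemap/0.9"></sitemapindex>')
print(is_valid_sitemap("application/xml", good))
```

Running the same two-part check against the live URL after the fix is a quick way to confirm it before resubmitting in Search Console.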
What it is: A meta description is the 1–2 sentence preview that shows up under the page title in Google search results. Only 85 of 956 pages on this site have one — meaning 91% of the museum's content is showing up in Google with whatever random snippet Google decided to grab (often the navigation menu or a footer disclaimer instead of a real description).
How to fix: Install or activate an SEO plugin (Yoast SEO, Rank Math, or All in One SEO). Each adds a Meta Description field beneath the post editor — write a 120–160 character description for each page. For 871 pages, doing it manually is brutal; use the AI-write button most plugins now ship to generate descriptions from the post body, then review and tweak.
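When reviewing AI-generated descriptions in bulk, a simple length gate catches the worst offenders before they ship. The sketch below is illustrative (the function names and sample text are invented); it derives a candidate description from the start of a post body and flags anything outside the 120–160 character window:

```python
# Illustrative helper, not a plugin API: build a candidate meta description
# from the first words of a post body and flag out-of-range lengths.
import textwrap

def candidate_description(body_text: str, limit: int = 160) -> str:
    """First chunk of the body, cut at a word boundary, at most `limit` chars."""
    flat = " ".join(body_text.split())  # collapse whitespace
    return textwrap.shorten(flat, width=limit, placeholder="")

def needs_review(description: str) -> bool:
    """Descriptions shorter than 120 or longer than 160 chars need a human pass."""
    return not (120 <= len(description) <= 160)

body = ("The Palm Springs Air Museum houses one of the world's largest "
        "collections of flyable WWII aircraft, with daily docent tours, "
        "flight demonstrations, and hands-on exhibits for all ages.")
desc = candidate_description(body)
print(len(desc), needs_review(desc))
```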
What it is: Search engines truncate page titles at about 60 characters in their results — anything past that turns into a "…" cutoff. On this site, 535 pages have titles longer than 60 characters, mostly because the WordPress theme is auto-appending " – Palm Springs Air Museum" to every title (which alone eats roughly 30 characters of every title's budget).
How to fix: In the SEO plugin's title-template setting, drop the trailing " – Palm Springs Air Museum" suffix from internal pages — Google adds the brand name automatically. Then audit individual long titles and trim them to 50–60 characters, leading with the unique part (the aircraft, the event date, the honoree).
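The suffix-stripping step can be sanity-checked with a short script. This is a sketch, assuming the theme appends exactly the string shown (verify against the live `<title>` tags before running anything in bulk):

```python
# Sketch of the title cleanup described above. SUFFIX is an assumption:
# check it against the live <title> tags before bulk edits.
SUFFIX = " – Palm Springs Air Museum"

def cleaned_title(title: str) -> str:
    """Drop the brand suffix; Google re-adds the brand name on its own."""
    if title.endswith(SUFFIX):
        title = title[: -len(SUFFIX)]
    return title

title = "B-17 Flying Fortress 'Miss Angela' Walkthrough Tour – Palm Springs Air Museum"
short = cleaned_title(title)
print(short, len(short) <= 60)
```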
What it is: 190 pages on this site share a title with at least one other page. Open Cockpit Saturday events are the largest offender — 19 different event pages all titled the same thing. When pages share titles, Google can't tell which is the "main" page for that topic, so it picks one and buries the others (a phenomenon called cannibalization).
How to fix: Edit each duplicate post in WP Admin → Posts and add a unique differentiator to the title — usually the date or sequence number. "F-104 Starfighter — Open Cockpit Saturday, June 7 2025" beats a generic "Open Cockpit Saturdays — F-104 Starfighter." For bulk renames, a WP-CLI loop or a bulk-edit plugin can append each event's date to all 19 affected titles in one pass (back up the database first).
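The de-duplication pass itself is simple logic. The post data below is invented for illustration (the real titles live in the WordPress database); the sketch finds titles shared by more than one post and appends each post's date:

```python
# Sketch of the de-duplication pass: append dates to titles that are
# shared by more than one post. Sample data is invented for illustration.
from collections import Counter

posts = [
    {"title": "Open Cockpit Saturdays — F-104 Starfighter", "date": "2025-06-07"},
    {"title": "Open Cockpit Saturdays — F-104 Starfighter", "date": "2025-07-05"},
    {"title": "Warbird Wednesday — P-51 Mustang", "date": "2025-06-11"},
]

counts = Counter(p["title"] for p in posts)
for p in posts:
    if counts[p["title"]] > 1:          # only touch actual duplicates
        p["title"] = f'{p["title"]} ({p["date"]})'

for p in posts:
    print(p["title"])
```

Unique titles pass through untouched, so the pass is safe to run over the whole post table.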
What it is: 66 pages on the site failed to load during the audit — 8 are broken-link 404s (the page truly doesn't exist) and 58 are 403 "blocked by site security" (the security service refused to load them). Visitors and Google's crawler hit the same walls when they try to access those URLs.
How to fix: For the 8 broken links, see the Crawl Health page — each one is listed with the source page that's still linking to it. Either restore the missing page, redirect to the most relevant existing page, or remove the broken inbound link. For the 58 blocked pages, see the next item.
What it is: The site's Web Application Firewall (Cloudflare's bot-detection or Wordfence — whichever is active) blocked 58 pages from being audited. The pages exist, but the security service treated our crawler as a suspicious visitor and turned us away with a 403 Forbidden response. The same thing can happen to Googlebot and to real visitors on VPNs, mobile carriers, or older browsers.
How to fix: In Cloudflare → Security → Bots, enable "Verified Bots" so Googlebot, Bingbot, and other legitimate crawlers are allow-listed. If Wordfence or another WordPress security plugin is active, add the same crawlers to its allow list. Reduce bot-detection sensitivity on standard browsing paths (/programs/, /blog/) and keep aggressive blocking only on /wp-login.php and /wp-admin/.
What it is: 15 URLs on this site redirect to a different URL when visited. Most are routine (http→https, www→non-www, slug-cleanup) and harmless — but a few are problematic: the main "Warbird Rides" navigation link redirects to a page that returns 404, and several internal menus still point at old URLs that bounce through redirects on every visit.
How to fix: See the Redirect Chains section on the Crawl Health page for the full breakdown — each redirect is categorized (BROKEN, slug-cleanup, typo handler, protocol normalization) with concrete WordPress steps for each kind, plus a list of which internal links are still using the old URLs.
Domain-Level Checks
An XML sitemap is the file that tells Google "here is every page on my website." Without one, Google has to discover pages by following links — slower and less reliable. On this site: the file at /wp-sitemap.xml returns the homepage HTML instead of an XML document, so search engines won't recognize it as a sitemap. Needs fixing.
- /wp-sitemap.xml returns HTTP 200 OK but with Content-Type: text/html — it's serving the homepage HTML instead of an XML document. Google treats this as a soft-404 and won't recognize it as a sitemap. To fix: either re-enable WordPress's built-in sitemap (something on the site is intercepting that URL with the page template) or install an SEO plugin (Yoast or Rank Math) that emits proper application/xml, then submit the working URL in Google Search Console.
A small text file at /robots.txt tells search engines which parts of the site they may or may not crawl. On this site: the file exists, but its contents have several issues — see the list below.
- No Sitemap: declaration. The robots.txt file should end with a line like `Sitemap: https://palmspringsairmuseum.org/wp-sitemap.xml` so search engines know where to find the sitemap. Currently missing entirely.
- Crawl-delay: 3 is set. Google ignores crawl-delay, but Bing, Yandex and other engines honor it — meaning a full 12,000-URL re-crawl would take Bing ~10 hours of forced waits. Either remove it or lower to 0.5s.
- Two separate `User-agent: *` blocks (one in the Cloudflare-managed section at the top, one in the default block at the bottom). When a crawler hits two wildcard blocks, behavior is undefined — some only read the first, some merge, some ignore the file. Consolidate into one block.
- `Disallow: /*?` blocks every URL containing a question mark. This kills indexable URLs that use query parameters legitimately (search result pages, calendar views, share links). Replace with narrower rules targeting the specific noise patterns.
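Putting the four fixes together, a cleaned-up robots.txt might look like the following. This is a sketch, not a drop-in file: the `utm_` pattern is an assumption standing in for whatever query-string noise actually shows up in the crawl logs, and any Cloudflare-managed rules at the top of the real file should be left in place.

```
User-agent: *
Disallow: /wp-admin/
Allow: /wp-admin/admin-ajax.php
# Block only the specific noisy parameter patterns, not every "?" URL
Disallow: /*?utm_

Sitemap: https://palmspringsairmuseum.org/wp-sitemap.xml
```

One wildcard block, no Crawl-delay, and a Sitemap declaration at the end covers all four findings above.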
Some sites accidentally include a "don't index this page" tag in their homepage HTML. If the homepage carried that tag, the entire site would vanish from Google. On this site: no such tag was found — the homepage is fully indexable.
A valid SSL certificate gives your site the padlock icon and the https:// prefix. Browsers warn visitors away from sites without one, and Google ranks them lower. On this site: the certificate is valid and current.
HTTP/2 is the modern version of the protocol browsers use to talk to your server. It loads pages substantially faster than the older HTTP/1.1 by sending multiple files at once. On this site: HTTP/2 is enabled — pages load with the modern, faster protocol.
Every page should declare one "official" URL via a <link rel="canonical"> tag, so Google knows which version is the official one when several URLs serve similar content (with/without trailing slash, with tracking parameters, etc.). On this site: the canonical-URL setup failed our check — the homepage doesn't carry a clean canonical tag. Fix in the SEO plugin or theme template.
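For reference, the tag the homepage's `<head>` should end up carrying is a single line; most SEO plugins emit it automatically once enabled, so hand-editing the theme is only needed as a fallback:

```html
<link rel="canonical" href="https://palmspringsairmuseum.org/" />
```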
Your site can be reached at both www.palmspringsairmuseum.org and palmspringsairmuseum.org. To search engines, those look like two different websites unless one automatically forwards to the other. On this site: the forwarding is set up correctly — visitors who type either version end up at the same place.
Default web server installations broadcast their software name and version (e.g. "Apache 2.4.41 Ubuntu") in HTTP headers. Hiding this makes the site a smaller target for automated attacks. On this site: the signature is hidden.
When a visitor hits a URL that doesn't exist, the server should return a real 404 (broken link) response with a custom error page ("sorry, that page is missing — try the home page"). Misconfigured sites return a 200 OK with the homepage instead, which confuses Google. On this site: broken-link handling is working correctly.
If someone visits a folder URL like /wp-content/uploads/, the server should refuse to list its contents. When directory browsing is left on, anyone can scrape your file structure. On this site: directory browsing is disabled — folder contents are not exposed.
If a visitor types http:// (not https://), the server should automatically forward them to the secure version. Otherwise users land on the insecure URL and see a browser warning. On this site: the http→https redirect is working — visitors who type the insecure URL land on the secure one.