We host some page resources, such as fonts, on a different subdomain (e.g. web pages are at www.daniweb.com and resources are at cdn.daniweb.com).

An HTTP header of access-control-allow-origin: * is being sent with all responses from cdn.daniweb.com.

When inspecting indexed pages with Google Search Console's URL Inspection tool, I click "View Crawled Page" and then the "More Info" tab. It shows that the majority of page resources (including fonts from cdn.daniweb.com) couldn't be loaded because of "Other error". However, if I click the "Test Live URL" button, all of the resources load successfully.

Is this a CORS issue?

We do have hotlink protection enabled in some places (e.g. inline image attachments in posts), which at first I thought was causing the issue. However, we don't have any hotlink protection on any files at cdn.daniweb.com.

Here are my general insights:

  1. Use a CORS testing tool, or a simple scripted request, to help diagnose the issue (see the sketch after this list).
  2. Consult your server-side technology documentation for specific guidance on implementing CORS.
  3. Consider using a third-party CORS library if you're having trouble with your custom implementation.
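
On point 1, a quick scripted request can stand in for a dedicated CORS testing tool. Below is a minimal Python sketch (the font path is hypothetical; substitute a real file on cdn.daniweb.com) that fetches a resource with an Origin header set to the main site and prints the CORS-related response headers:

```python
# Minimal CORS check using only the Python standard library.
# The font URL is a placeholder; replace it with a real file on cdn.daniweb.com.
import urllib.request

FONT_URL = "https://cdn.daniweb.com/fonts/example.woff2"  # hypothetical path
ORIGIN = "https://www.daniweb.com"

req = urllib.request.Request(FONT_URL, headers={"Origin": ORIGIN})
with urllib.request.urlopen(req) as resp:
    print("Status:", resp.status)
    # Print only the headers relevant to cross-origin font loading.
    for name in ("Access-Control-Allow-Origin", "Vary", "Content-Type"):
        print(f"{name}: {resp.headers.get(name)}")
```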

This could indeed be a CORS (Cross-Origin Resource Sharing) issue. While Access-Control-Allow-Origin: * should allow requests from any origin, Googlebot may not always handle it as expected. Consider specifying your main origin (e.g., https://www.daniweb.com) in the CORS policy instead of using the wildcard. Also, ensure that preflight requests (OPTIONS) are correctly handled by your CDN and server.
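
To check how preflight requests are being answered, a similar sketch can send an OPTIONS request (again with a hypothetical font path; this only illustrates the shape of a preflight, not how your CDN must be configured):

```python
# Send a CORS preflight (OPTIONS) request and inspect the response headers.
# The URL is a placeholder; point it at a real resource on cdn.daniweb.com.
import urllib.error
import urllib.request

FONT_URL = "https://cdn.daniweb.com/fonts/example.woff2"  # hypothetical path

req = urllib.request.Request(
    FONT_URL,
    method="OPTIONS",
    headers={
        "Origin": "https://www.daniweb.com",
        "Access-Control-Request-Method": "GET",
    },
)
try:
    with urllib.request.urlopen(req) as resp:
        print("Status:", resp.status)
        print("Allow-Origin:", resp.headers.get("Access-Control-Allow-Origin"))
        print("Allow-Methods:", resp.headers.get("Access-Control-Allow-Methods"))
except urllib.error.HTTPError as err:
    # A 4xx/5xx here suggests preflights are being rejected outright.
    print("Preflight rejected with status", err.code)
```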

Double-check if Googlebot’s IP range is inadvertently being blocked by hotlink protection or security settings. Additionally, clearing the CDN cache might help resolve any inconsistencies between live testing and crawled results.
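
On the Googlebot question: Google's documented way to verify a crawler is a reverse DNS lookup on the requesting IP, followed by a forward lookup to confirm it. A rough sketch, assuming you have a suspect IP address pulled from your server logs:

```python
# Check whether an IP address really belongs to Googlebot:
# reverse-resolve it, inspect the hostname, then forward-resolve to confirm.
import socket

def is_googlebot(ip_address: str) -> bool:
    try:
        hostname, _, _ = socket.gethostbyaddr(ip_address)
    except socket.herror:
        return False
    # Genuine Googlebot hosts end in googlebot.com or google.com.
    if not hostname.endswith((".googlebot.com", ".google.com")):
        return False
    # Forward-confirm that the hostname resolves back to the same IP.
    try:
        forward_ips = {info[4][0] for info in socket.getaddrinfo(hostname, None)}
    except socket.gaierror:
        return False
    return ip_address in forward_ips

# Placeholder IP; substitute one taken from your own access logs.
print(is_googlebot("192.0.2.1"))
```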

It sounds like you might have a CORS issue. Even though you're sending the Access-Control-Allow-Origin: * header from cdn.daniweb.com, ensure it’s applied to all resources, including fonts. Since you have hotlink protection on some parts of your site, double-check that it’s not affecting requests from Googlebot. There could also be rules blocking specific user agents. The difference between the URL Inspection tool and the Test Live URL might be due to caching, so consider checking server logs for any errors when Google tries to access those resources. If everything seems fine, you could try disabling hotlink protection temporarily to see if that resolves the issue.
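
For the server-log check mentioned above, something as simple as the following can surface failed Googlebot requests. It assumes a combined-format access log at a hypothetical path; adjust both to match your setup:

```python
# Scan an access log for Googlebot requests that did not return 200,
# to see whether Google itself is being served errors for these resources.
import re

LOG_PATH = "/var/log/nginx/cdn.access.log"  # hypothetical path

# Combined log format: ... "METHOD /path HTTP/x.x" STATUS SIZE "referer" "user-agent"
LINE_RE = re.compile(r'"[A-Z]+ (?P<path>\S+) HTTP/[\d.]+" (?P<status>\d{3}) .*"(?P<ua>[^"]*)"$')

with open(LOG_PATH) as log:
    for line in log:
        match = LINE_RE.search(line)
        if match and "Googlebot" in match.group("ua") and match.group("status") != "200":
            print(match.group("status"), match.group("path"))
```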

I've confirmed the access-control-allow-origin: * header is being sent with all resources. I've additionally confirmed that hotlink protection is not enabled for any of the fonts. I've also confirmed that cdn.daniweb.com does not have a robots.txt file in place that could potentially be blocking Googlebot's access to the fonts.

I don't believe it's a caching issue because these settings have been in place for many years. I am checking pages that were last crawled and indexed by Googlebot days or weeks ago. Very few of our pages go longer than that without being recrawled.

Also, there are a few resources that I do want hotlink protection on (not fonts, though). Those resources load correctly for Googlebot when I press the Test Live URL button.
