How Google indexes JavaScript sites

SEO optimization, JavaScript

1. Don’t cloak to Googlebot

  • Use „feature detection“ and „progressive enhancement“ techniques to make your content available to all users.

  • Avoid redirecting to an „unsupported browser“ page. Consider using a polyfill or other safe fallback where needed.

  • The features Googlebot currently doesn’t support include Service Workers, the Fetch API, Promises, and requestAnimationFrame.
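A minimal feature-detection sketch of the first point (`supportsModernFeatures` is a hypothetical helper, not a Google API; which features you probe for depends on what your site actually uses):

```javascript
// Detect the APIs Googlebot has historically lacked before relying on them.
// `scope` is the global object (in a browser you would pass `window`).
function supportsModernFeatures(scope) {
  return typeof scope.fetch === 'function' &&
         typeof scope.Promise === 'function' &&
         typeof scope.requestAnimationFrame === 'function';
}

// Progressive enhancement: enhance when the features exist, fall back (or
// polyfill) when they don't — never redirect to an "unsupported browser" page.
// if (supportsModernFeatures(window)) { /* use fetch-based loading */ }
// else { /* use an XMLHttpRequest fallback or load a polyfill */ }
```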

2. Use the rel=canonical attribute: Use rel=canonical when you need to serve the same content from multiple URLs. Further information is available in Google's documentation on canonical URLs.
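A small sketch of what the tag looks like (`canonicalTag` is a hypothetical helper and the URL is a placeholder; the tag itself is standard HTML):

```javascript
// Build a rel=canonical tag pointing at the preferred URL of a page that
// is reachable from several URLs (tracking parameters, session IDs, etc.).
function canonicalTag(url) {
  return `<link rel="canonical" href="${url}">`;
}

// canonicalTag('https://example.com/shoes')
//   → '<link rel="canonical" href="https://example.com/shoes">'
```

Ideally this tag is emitted in the server-rendered `<head>`; injecting it client-side relies on Googlebot successfully rendering your JavaScript.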

3. Avoid the AJAX-Crawling scheme on new sites, and consider migrating old sites that still use it. Remember to remove „meta fragment“ tags when migrating. Don’t use a „meta fragment“ tag if the „escaped fragment“ URL doesn’t serve fully rendered content.
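For reference, the „meta fragment“ tag used by the deprecated scheme looks like this; it is the tag to remove when migrating:

```
<meta name="fragment" content="!">
```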

4. Avoid using „#“ in URLs (outside of „#!“): Googlebot rarely indexes URLs with „#“ in them. Use „normal“ URLs with path/filename/query-parameters instead, and consider using the History API for navigation.
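A sketch of moving from „#!“ URLs to normal paths (`hashBangToPath` is a hypothetical helper, and the URLs are placeholders):

```javascript
// Map a legacy "#!" URL onto a normal, indexable path.
function hashBangToPath(url) {
  const i = url.indexOf('#!');
  if (i === -1) return url;
  return url.slice(0, i).replace(/\/$/, '') + url.slice(i + 2);
}

// hashBangToPath('https://example.com/#!/products/42')
//   → 'https://example.com/products/42'

// In the browser, navigate with the History API instead of changing the hash:
// history.pushState({ path }, '', path);  // URL bar shows /products/42
// window.addEventListener('popstate', () => renderView(location.pathname));
// (renderView is a hypothetical function that draws the content for a path.)
```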

5. Check your web pages: Use Search Console’s Fetch and Render tool to test how Googlebot sees your pages. Note that this tool doesn’t support „#!“ or „#“ URLs.

6. Check your robots.txt file: Ensure that all required resources (including JavaScript files and frameworks, server responses, third-party APIs, etc.) aren’t blocked by robots.txt. The Fetch and Render tool lists any blocked resources it discovers. If resources are blocked outside of your control (e.g., third-party APIs) or are otherwise temporarily unavailable, ensure that your client-side code fails gracefully.
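As a toy illustration, this is roughly the check behind a „blocked resource“ report (it is not a full robots.txt parser — real matching also handles Allow rules, wildcards, and rule precedence; the rules and paths below are placeholders):

```javascript
// Return true if `path` falls under any Disallow rule.
// Simplified sketch: only plain prefix rules are handled here.
function isBlocked(disallowRules, path) {
  return disallowRules.some((rule) => rule !== '' && path.startsWith(rule));
}

// isBlocked(['/assets/js/'], '/assets/js/app.js') → true
// A blocked script means Googlebot renders the page without it, so the page
// must still show its core content in that case.
```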

7. Do not use too many embedded resources: Limit the number of embedded resources, in particular the number of JavaScript files and server responses required to render your page. A high number of required URLs can result in timeouts, and the page may be rendered without some of these resources being available (e.g., some JavaScript files might not be loaded). Use reasonable HTTP caching directives.
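For example, a long-lived, versioned JavaScript bundle is often served with a caching directive along these lines (the value is illustrative, not a recommendation from the source):

```
Cache-Control: public, max-age=31536000, immutable
```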

8. Google supports JavaScript to some extent: Google supports the use of JavaScript to provide titles, descriptions, robots meta tags, structured data, and other metadata. When using AMP, the AMP HTML page must be static as required by the spec, but the associated web page can be built using JS/PWA techniques. Remember to use a sitemap file with correct „lastmod“ dates to signal changes on your website.
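A sketch of JavaScript-provided title and description (`applyMeta` is a hypothetical helper; the document object is passed in as a parameter to keep the logic testable — in the browser you would pass `document`):

```javascript
// Set the title and meta description from client-side code. Google can
// pick these up when it renders the page, but server-side rendering
// remains the safer default for critical metadata.
function applyMeta(doc, { title, description }) {
  doc.title = title;
  let meta = doc.querySelector('meta[name="description"]');
  if (!meta) {
    meta = doc.createElement('meta');
    meta.setAttribute('name', 'description');
    doc.head.appendChild(meta);
  }
  meta.setAttribute('content', description);
}

// Usage in the browser (values are placeholders):
// applyMeta(document, { title: 'Red shoes', description: 'Hand-made red shoes.' });
```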

9. Other search engines might not support JavaScript at all: Finally, keep in mind that other search engines and web services accessing your content might not support JavaScript at all, or might support only a different subset.

In general, critical web page content should not be hidden behind JavaScript. Google might be able to index JavaScript-generated content to some extent, but you will still have difficulty with other search engines.

Source