6 Things You Need to Know


Knowing the basics of JavaScript has become a vital skill for the modern SEO professional, although until recently the relationship between these two disciplines has been a highly debated subject.

The crucial question that hovers at the interface of SEO and JavaScript is whether search engine crawlers can perceive a website’s content properly, and realistically evaluate user experience.

While plain HTML (along with CSS and server-side code such as PHP) can be read by a crawler right off the bat, a JavaScript-based website cannot: Googlebot first has to render the page, executing the JavaScript and building the DOM, before it can see the final content.

Basic Definitions

Before we delve into best practices for optimizing JavaScript, let’s take a quick look at some basic terminology:

  • JavaScript is a programming language used to make webpages dynamic and interactive. You can place JavaScript into an HTML document, or make a link or reference to it.
  • HTML stands for Hypertext Markup Language. In simple words, it is a content organizer: HTML provides a website’s structure (bullet lists, headlines, subheadlines, paragraphs, etc.) and defines static content.
  • AJAX stands for Asynchronous JavaScript and XML. Basically, it updates content without refreshing the whole page: AJAX lets web applications and servers communicate without interfering with the current page.

However, you should note that starting in Q2 2018, Google no longer uses the old AJAX crawling scheme; Googlebot renders JavaScript-based pages directly instead of requesting a separate pre-rendered version.
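To illustrate the AJAX idea from the list above, here is a minimal sketch; the endpoint URL and element ID are hypothetical:

  // Fetch fresh data and update one part of the page without a full reload.
  // '/api/articles' and 'article-list' are made-up examples.
  fetch('/api/articles')
    .then(function (response) { return response.json(); })
    .then(function (articles) {
      var list = document.getElementById('article-list');
      list.innerHTML = articles
        .map(function (a) { return '<li>' + a.title + '</li>'; })
        .join('');
    });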

A modern SEO professional should also have a basic understanding of DOM (Document Object Model). You can think of DOM as a tool used by Google to explore and analyze web pages.

First, Google receives the HTML document and identifies its JavaScript elements. Then the rendering engine builds the DOM from that document, enabling the search engine to render the page and see the content that JavaScript adds.
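As a simple, hypothetical illustration of why rendering matters: in the raw HTML below a crawler sees only an empty div, and the paragraph exists only after the script has run and modified the DOM.

  <!-- Before rendering, the HTML contains no visible text. -->
  <div id="content"></div>
  <script>
    // Once this script runs, the paragraph exists in the DOM.
    document.getElementById('content').innerHTML =
      '<p>This text only appears after JavaScript has been executed.</p>';
  </script>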


1. Let Search Engines See Your JavaScript

Robots.txt is what gives search engine crawlers their crawling permissions. If you block them from accessing your JavaScript files, the page will appear differently to web crawlers than it does to users.

This means that search engines won’t get the full user experience, and Google may interpret such actions as cloaking.

The best approach is to provide web crawlers with all the resources they need to see webpages in the exact same manner as users.

Consider arranging a meeting with your developers and deciding together which files should be hidden from search engines and which should be made accessible.
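For example, a robots.txt along these lines keeps scripts and stylesheets crawlable; the directory names are hypothetical and should match your own site structure:

  User-agent: *
  # Keep rendering resources accessible (example paths only).
  Allow: /assets/js/
  Allow: /assets/css/
  # Avoid broad rules such as "Disallow: /js/" that hide those resources.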

2. Internal Linking

Internal linking is a strong SEO tool used to show search engines your website architecture and point to the most important webpages.

The most essential advice here: use internal linking, and do not even try to replace it with JavaScript on-click events.

Yes, end URLs are likely to be found and crawled with on-click events, but web crawlers won’t associate them with the global navigation of your site.

Therefore, you are better off implementing internal linking with regular anchor tags within the HTML or the DOM, which also gives users a better experience.
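A minimal sketch of the difference, with a placeholder URL:

  <!-- Crawlers may discover this URL, but won't treat it as part of your navigation: -->
  <span onclick="window.location.href='/category/shoes'">Shoes</span>

  <!-- Crawlers treat this as a normal internal link: -->
  <a href="/category/shoes">Shoes</a>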

3. URL Structure

JavaScript-based websites used to include fragment identifiers within URLs, but lone hashes (#) and hashbangs (#!) are not recommended by Google.

A highly recommended alternative is the History API's pushState method. It updates the URL in the address bar and allows JavaScript websites to use clean URLs.

A clean URL is also called a search engine-friendly URL: plain text that is easily understood by non-expert users.

Consider using pushState for infinite scroll, so the URL updates each time the user hits a new part of the page. In a perfect scenario, the user can refresh the page and still remain at the exact same spot.
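Here is a minimal sketch of that idea, assuming a hypothetical infinite-scroll handler and URL pattern:

  // Called by a hypothetical infinite-scroll handler whenever a new
  // "page" of results has been appended to the document.
  function onNewPageLoaded(pageNumber) {
    // Update the address bar to a clean, crawlable URL without reloading.
    history.pushState({ page: pageNumber }, '', '/products/page/' + pageNumber);
  }

  // Keep the position consistent when the user navigates back or forward.
  window.addEventListener('popstate', function (event) {
    if (event.state && event.state.page) {
      showPage(event.state.page); // hypothetical helper that re-renders that page
    }
  });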

Also, explore URL best practices for SEO and start using them to improve the user experience.

4. Test Your Website

Google is able to crawl and understand many forms of JavaScript, although some of them may be more challenging than others.

Here is an experiment by Bartosz Góralewicz that shows how Googlebot interacts with JavaScript on different frameworks.

This study helps us to understand when it is time to worry and act proactively.

However, it’s always better to predict possible mistakes and problems and avoid them, so why not conduct some testing?

Follow these two basic steps to detect possible breakages:

  • Check whether the content on your webpages appears in the DOM (a quick console check is sketched after this list).
  • Test a couple of pages to make sure that Google is able to index your content.
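For the first check, one quick and admittedly rough approach is to open the browser's developer console on a loaded page and confirm that a known piece of text actually exists in the rendered DOM; the sample phrase below is a placeholder:

  // Run in the browser console on a fully loaded page.
  document.body.innerText.includes('your important product description');
  // true  -> the text made it into the rendered DOM
  // false -> the text may not be visible after rendering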

It is crucial to find out whether Google can see your content and your JavaScript (that is, nothing important is blocked in robots.txt) and analyze it properly. Therefore, consider manually checking pieces of your content and fetching the pages with Fetch as Google to see whether the content appears.

Test your website by following this short guide from Google.

Done all your testing and the results look promising? Great!

But what if something isn’t working?

If there is any indication that Google can’t see your content properly, call your development team for help.

Meanwhile, an HTML snapshot may salvage the situation.

5. HTML Snapshots

Google introduced HTML snapshots in 2009 and deprecated them in 2015. That is a long story and an ongoing topic.

The thing you should know is that Google still supports HTML snapshots, although it has labeled them something to “avoid.”

HTML snapshots may be necessary for a couple of situations, so you should at least be familiar with them.

For example, if search engines cannot grasp the JavaScript on your website, you can provide them with an HTML snapshot, which is better than not having your content indexed and understood at all.

In a perfect world, a website would use some kind of user-agent detection on the server side and show the HTML snapshot to bots and users.

Note that Google strives to see the exact same experience as a visitor. Therefore, it is better to return HTML snapshots to search engine crawlers only when something is currently wrong with your JavaScript and you cannot reach your development team.
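As a rough sketch only, assuming a Node.js/Express server, a hypothetical list of bots, and hypothetical file paths; treat this as a temporary stopgap rather than a long-term setup:

  const express = require('express');
  const path = require('path');
  const app = express();

  // Very simplified bot detection; real lists are longer and change over time.
  const BOT_PATTERN = /googlebot|bingbot|baiduspider/i;

  app.get('*', function (req, res) {
    const userAgent = req.get('User-Agent') || '';
    if (BOT_PATTERN.test(userAgent)) {
      // Serve the pre-rendered HTML snapshot to crawlers.
      res.sendFile(path.join(__dirname, 'snapshots', req.path + '.html'));
    } else {
      // Serve the normal JavaScript-driven page to users.
      res.sendFile(path.join(__dirname, 'app', 'index.html'));
    }
  });

  app.listen(3000);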

6. Site Latency

When a browser builds the DOM from a received HTML document, it loads most resources in the order they appear in that document.

If a massive file exists at the top of an HTML document, a browser will load this huge file first, and all other information will appear only afterward, with a significant delay.

The key idea of Google’s critical rendering path is to load the information that is crucial for users first; in other words, to place the most essential content above the fold.

If your JavaScript files or other unnecessary resources clog up the page load, you probably have render-blocking JavaScript, which creates what is called perceived latency.

This means that your pages have the potential to appear faster, but JavaScript code is slowing them down.

Check how long it takes to load a page with PageSpeed Insights or other similar tools. Analyze the results to see whether there is any render-blocking JavaScript.

Here are a few common ways to resolve it:

  • Inline small, critical JavaScript directly in the HTML.
  • Add the async attribute to script tags so your JavaScript loads asynchronously (see the sketch after this list).
  • Reduce the amount of JavaScript within the HTML document.
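For the async suggestion above, here is a minimal sketch with placeholder file names:

  <!-- Render-blocking: parsing stops until this file is downloaded and executed. -->
  <script src="/js/analytics.js"></script>

  <!-- Non-blocking: downloads in parallel and runs as soon as it is ready. -->
  <script async src="/js/analytics.js"></script>

  <!-- Non-blocking and order-preserving: runs after the document has been parsed. -->
  <script defer src="/js/app.js"></script>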

When trying to improve the situation, keep in mind the basic rules of JavaScript.

For instance, scripts must execute in a certain order (order of precedence): if a script depends on other files, it can run only after those files have loaded.

You should constantly stay in touch with your development team to make sure that any alterations do not interrupt user experience.

Conclusion

Search engines are constantly evolving, so they will no doubt interpret your JavaScript better and faster in the future.

For now, make sure your existing content is crawlable and accessible, with reasonable site latency. Hopefully this article will help you to optimize your website.
