Client and Server Side Rendering Static Site Generators


In the last few years the popularity of Single Page Applications has increased significantly. Before the SPA revolution, the majority of the application logic lived on the back end, with some AJAX additions, and JavaScript was used mainly for user interface improvements. Because only a small part of the content was added asynchronously, almost the whole page was rendered server-side and remained available even when browsing without JavaScript.

When SPA applications appeared, in the era of Backbone.js, Ember and Angular, the situation with search engine crawlers was clear: none of them supported JavaScript rendering. Crawlers simply received the content rendered by the server and indexed that, which resulted in the terrible SEO so common when relying too heavily on JavaScript. Technology evolves quickly, so the question arises whether the SEO problem is still relevant for SPA applications in 2018/2019.

What is SSR? Do we need it?

SSR stands for Server-Side Rendering. It is a way to pre-render parts of the application on the server, much like a classical back-end language would. The browser makes a request, the server runs the application to pre-render it on the fly and responds with the generated HTML. On the browser side the JavaScript logic is then attached to the rendered state and the application can be used as usual.
At the opposite end we have Client-Side Rendering, the standard method for all SPA frameworks. Such applications generate the whole content in the browser runtime. Typically the client-side application receives from the server a basic HTML structure with an empty placeholder “div” into which all application components are rendered.
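
To make the difference concrete, here is a minimal sketch of the browser entry point of a React application (file and component names are made up for illustration): with SSR the markup already exists and only needs to be hydrated, while with CSR everything is rendered into the empty placeholder.

```javascript
// client-entry.js - hypothetical browser entry point of a React application
import React from 'react';
import ReactDOM from 'react-dom';
import App from './App'; // hypothetical root component

const root = document.getElementById('app');

if (root.hasChildNodes()) {
  // Server-side rendered: the HTML already arrived from the server,
  // so we only attach event handlers and build the virtual DOM on top of it.
  ReactDOM.hydrate(<App />, root);
} else {
  // Client-side rendered: the placeholder <div id="app"></div> is empty
  // and the whole content is generated in the browser runtime.
  ReactDOM.render(<App />, root);
}
```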

There is also a third method, similar to Server-Side Rendering, called Static Site Generation. Although it may be confused with SSR, the idea is a bit different. With this method, the HTML files are generated during the build phase and then simply served by the server. Such files can be very effective when rendering hundreds of components would take a long time on the client side. The application is pre-rendered once, in a specific state, and sent to the end user as plain HTML files in the fastest way possible, using simple HTTP hosting.
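
A minimal sketch of the idea, assuming a React application and a Node.js build script (the page list, paths and component files are invented for illustration):

```javascript
// build.js - hypothetical build-time script that produces static HTML files
const fs = require('fs');
const React = require('react');
const ReactDOMServer = require('react-dom/server');
const HomePage = require('./pages/HomePage');   // hypothetical components,
const AboutPage = require('./pages/AboutPage'); // exported with module.exports

const pages = [
  { file: 'index.html', component: HomePage },
  { file: 'about.html', component: AboutPage },
];

// Every page is rendered exactly once, at build time, and saved as plain HTML
// that any static HTTP hosting can serve as-is.
pages.forEach(({ file, component }) => {
  const markup = ReactDOMServer.renderToString(React.createElement(component));
  const html =
    '<!DOCTYPE html><html><body>' +
    `<div id="app">${markup}</div>` +
    '<script src="bundle.js"></script>' +
    '</body></html>';
  fs.writeFileSync(`dist/${file}`, html);
});
```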

We have three options to choose from. So, which one should I use?
Each option fits specific cases and fulfils different needs. Hopefully the following paragraphs will help you choose the best one.

Which solution should I use?

[Infographic: client-side rendering vs. server-side rendering vs. static site generators]

The infographic shows the general idea behind each approach. The final effect is the same: a fully functional application. The difference is in the way it is achieved.

At first glance all three diagrams look almost the same and have the same number of steps. The key difference is the timing of each action. With client-side rendering the application stays in the “loading” state for a long time while it makes AJAX requests for the necessary data, compiles the views and injects everything into the DOM. The server load stays small even with a considerable number of requests, but the browser is loaded with work and may run noticeably slower on weaker computers or phones.

With server-side rendering it is the server that does most of the job: it executes the application code and generates the HTML that is sent back as the response.
After that the browser only needs to attach all the events to the existing HTML, create the virtual DOM and so on. The catch is that the server is now doing the heavy lifting, so if it is not powerful enough the response time may increase noticeably. With server-side rendering we need to think about how much traffic the application will generate and whether the server is able to handle it fast enough.
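
As a rough sketch of what such a server could look like (assuming Express and React; the route handling and component names are simplified assumptions, not a complete setup):

```javascript
// server.js - hypothetical Express server rendering the application per request
const express = require('express');
const React = require('react');
const ReactDOMServer = require('react-dom/server');
const App = require('./App'); // hypothetical root component

const server = express();

server.get('*', (req, res) => {
  // The application code is executed on the server for every request...
  const markup = ReactDOMServer.renderToString(
    React.createElement(App, { url: req.url })
  );

  // ...and the generated HTML is sent back; the browser then only attaches
  // its JavaScript logic (hydration) to this markup.
  res.send(
    '<!DOCTYPE html><html><body>' +
    `<div id="app">${markup}</div>` +
    '<script src="/bundle.js"></script>' +
    '</body></html>'
  );
});

server.listen(3000);
```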

The third option, a Static Site Generator, is the fastest one. The server just sends previously prepared HTML. On the browser side the framework in use attaches itself to the existing structure and the application is ready to use. Neither the server nor the browser is excessively loaded.

Let’s make a small comparison table for all three solutions:

Problem                            | Client-Side Rendering   | Server-Side Rendering | Static Site Generators
Social media sharing               | No                      | Yes                   | Yes
SEO, search engines                | ❓ it’s complicated 🙂   | Yes                   | Yes
High traffic                       | Yes                     | No                    | Yes
Frequently changed dynamic content | Yes                     | Yes                   | ❓ it depends
Easy to implement                  | Yes                     | No                    | ❓ it depends

Quick notes about the above points:
Social media crawlers, like the ones used by Facebook or Twitter, don’t execute JavaScript. With a client-side generated application there is always the same set of OG tags and meta properties, so there is no way for a specific page/route of our application/website to be shared with its own preview. If social sharing is a must, it is better to choose SSR or Static Site Generators.
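
As an illustration (the routes and values below are made up), per-route Open Graph data like this only helps if a server or a static generator injects it into the HTML response, because the social crawlers never run the JavaScript that would otherwise swap it in:

```javascript
// og-tags.js - hypothetical per-route Open Graph data injected into <head>
// on the server (or at build time); all values are placeholders.
const ogByRoute = {
  '/': {
    title: 'Home page',
    image: 'https://example.com/img/home.png',
  },
  '/article/ssr-vs-csr': {
    title: 'SSR vs CSR vs Static Site Generators',
    image: 'https://example.com/img/rendering.png',
  },
};

function renderOgTags(route) {
  const og = ogByRoute[route] || ogByRoute['/'];
  // These tags must already be present in the server response,
  // because Facebook and Twitter crawlers do not execute JavaScript.
  return (
    `<meta property="og:title" content="${og.title}" />\n` +
    `<meta property="og:image" content="${og.image}" />`
  );
}

module.exports = renderOgTags;
```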

SEO remains a big mystery. A few years ago the situation was clear, because none of the search engines understood JavaScript, so no content rendered client-side was indexed. The situation changed once Google announced that their crawler would finally render JavaScript (the following paragraphs contain small tests to check this). With SSR and Static Site Generators we are certain that the content will be indexed, whereas with CSR it is a bit more complicated 🙂

For high-traffic websites it is usually better to push as much work as possible to the client, so CSR seems to be a reasonable option.

Static Site Generators don’t work well with dynamic content, because the HTML files would have to be regenerated far too often. Naturally, there can be parts which don’t change, fully rendered at build time, combined with dynamic AJAX content loaded on the client side; in such a case Static Site Generators will do the job.
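
A rough sketch of that mix, assuming React (the endpoint and component names are invented): the static copy is pre-rendered at build time, while the frequently changing part is fetched in the browser after the page loads.

```javascript
// PricingPage.js - hypothetical page combining a static shell with dynamic data
import React from 'react';

class PricingPage extends React.Component {
  constructor(props) {
    super(props);
    this.state = { prices: null };
  }

  componentDidMount() {
    // Runs only in the browser, so the pre-rendered HTML stays fully static.
    fetch('/api/current-prices') // hypothetical endpoint
      .then((response) => response.json())
      .then((prices) => this.setState({ prices }));
  }

  render() {
    return (
      <div>
        {/* This heading is part of the static HTML generated at build time. */}
        <h1>Our plans</h1>
        {/* The dynamic part appears only after the client-side request resolves. */}
        {this.state.prices
          ? <pre>{JSON.stringify(this.state.prices, null, 2)}</pre>
          : <p>Loading current prices...</p>}
      </div>
    );
  }
}

export default PricingPage;
```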

The easiest option to implement is a standard SPA application rendered client-side, without any additions. With server-side rendering we also have to maintain the server part, which may be tricky depending on the chosen solution. With Static Site Generators, depending on the chosen framework, the implementation takes roughly as much effort as the client-side version, with some additions.

How do crawlers work?

Crawlers are automated scripts which browse the web to collect information about pages and the connections between them. They pass this information on to search engines, and based on the knowledge collected by the crawlers/spiders, the search engines decide on the order in which search results appear.

Historically, crawlers did not execute JavaScript, so a client-side rendered application was not indexed at all. The situation changed a little when Google announced that their crawlers were going to execute JavaScript. That was great news, but let’s remember there is a world outside Google: there are more search engines on the market, such as Bing, Yahoo, Ask.com, Baidu, Yandex, DuckDuckGo and others.

According to market share calculations, Google powered 76% of all searches. That means 24% of searches around the web are done with different engines! How can we know how they will see our pages? Here you can find fantastic tests with different frameworks and the most popular search engines. The results of this comparison are clear: only Google and Ask.com understand JavaScript and are able to index a client-side rendered application correctly.

While creating an application, keep asking yourself whether it is fine to be indexed only by Google and Ask.com, or whether it would be a good idea to consider the other search engines on the market as well.

How will Google crawler see my page?

A while ago Google provided a great tool for developers to check how the Google bots/crawlers see our web pages. With just a few simple steps we can run a diagnostic of our page and see whether all the information posted there is correctly rendered by the bot script. Bots may reject some of the included external scripts, or skip rendering something if the execution takes too long. Unfortunately, there are no similar tools for other search engines, and developers learn about their behaviour only from experiments like the one linked in the previous chapter.

Let’s test a few simple examples using the Fetch as Google tool. In this experiment we will check how Google renders a client-side rendered React application in different configurations.
Google needs to verify you as the owner of a webpage before the bot will render it. There are multiple methods to achieve this; the simplest ones require uploading a file to the server or adding a special meta tag to the webpage.

Test 1 – a simple React application with nested components


The example above is a simple application with two nested components, which render an image from an external link and a simple text. Everything is correctly rendered by the Google bot, which proves that Google can handle such a simple JavaScript-rendered app correctly.
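
For context, the tested application looked roughly like the sketch below (a reconstruction for illustration only; the real component names and the image URL were different):

```javascript
// App.js - sketch of the kind of application used in Test 1:
// two nested components rendering an external image and a plain text paragraph.
import React from 'react';
import ReactDOM from 'react-dom';

const Picture = () => (
  <img src="https://example.com/external-image.jpg" alt="Image from an external link" />
);

const Message = () => (
  <p>Hello, this text is rendered entirely by JavaScript.</p>
);

const App = () => (
  <div>
    <Picture />
    <Message />
  </div>
);

ReactDOM.render(<App />, document.getElementById('app'));
```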

Test 2 – a React application with a content loaded by slow AJAX response


The example above extends the application from the previous test with a third component, which requests its content from the server with an AJAX request. The example uses the NASA API and adds a small delay of 5-6 seconds before the content becomes available to the application. When the content is ready, both the image URL and the simple text paragraph are rendered. It looks like the bot has handled it correctly again.
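
A sketch of what such a delayed component can look like (assuming the NASA APOD endpoint; the exact delay, field usage and component name are illustrative, not the original test code):

```javascript
// DelayedContent.js - sketch of a component whose content becomes available
// only after an artificial delay of a few seconds.
import React from 'react';

class DelayedContent extends React.Component {
  constructor(props) {
    super(props);
    this.state = { data: null };
  }

  componentDidMount() {
    // Artificial ~5 second delay before the AJAX request is even started.
    setTimeout(() => {
      fetch('https://api.nasa.gov/planetary/apod?api_key=DEMO_KEY')
        .then((response) => response.json())
        .then((data) => this.setState({ data }));
    }, 5000);
  }

  render() {
    if (!this.state.data) {
      return <p>Loading...</p>;
    }
    return (
      <div>
        <img src={this.state.data.url} alt={this.state.data.title} />
        <p>{this.state.data.explanation}</p>
      </div>
    );
  }
}

export default DelayedContent;
```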

Test 3 – a React application with a very slow AJAX request


In the third example the first problem occurs: 10 seconds have been added to the loading time of the AJAX response. The component is not rendered at all and is skipped by the Google bot. The example used a single AJAX request with an artificially long loading time, which would probably never happen in reality. But what about a very complex application built from hundreds of components and relying on multiple external data endpoints? If some of them require all Promises to be resolved before rendering, it is possible to end up in the same situation as above: some parts of the content will not be available to the Google crawler and will not be indexed. In such a situation the only solution is to pre-render with SSR or Static Site Generators.
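
To make the “all Promises resolved” case concrete, here is a hedged sketch (the endpoints and data shapes are invented): when rendering waits on Promise.all, a single slow endpoint keeps the whole subtree empty, which is exactly what a crawler that gives up early will index.

```javascript
// Dashboard.js - sketch: nothing is rendered until *every* request resolves,
// so one slow endpoint hides the whole subtree from an impatient crawler.
import React from 'react';

class Dashboard extends React.Component {
  constructor(props) {
    super(props);
    this.state = { ready: false, articles: [], stats: null };
  }

  componentDidMount() {
    Promise.all([
      fetch('/api/articles').then((r) => r.json()),        // hypothetical endpoint
      fetch('/api/very-slow-stats').then((r) => r.json()), // e.g. 10+ seconds
    ]).then(([articles, stats]) => {
      this.setState({ ready: true, articles, stats });
    });
  }

  render() {
    if (!this.state.ready) {
      // A crawler that stops waiting sees only this empty output.
      return null;
    }
    return (
      <div>
        <ul>
          {this.state.articles.map((article) => (
            <li key={article.id}>{article.title}</li>
          ))}
        </ul>
        <p>Visits today: {this.state.stats.visits}</p>
      </div>
    );
  }
}

export default Dashboard;
```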

Summary

While developing a new Single Page Application we should carefully think over our needs. If we are implementing an internal application, such as an admin panel where the user has to log in, we don’t need to worry about SEO, indexing or social media; a client-side rendered application will do the job.
When social media sharing matters, using specific Open Graph tags for particular pages, with different titles and descriptions for each page, requires SSR or Static Site Generators. Social media tools don’t execute JavaScript, so replacing the meta information for specific routes is otherwise not possible.

When we want to develop a website which should be indexed by search engines, we need to be careful. If we only care about Google, we can write a client-side rendered application and, to make sure everything is correctly indexed, check it frequently with the Fetch as Google tool. Too long loading/rendering times may result in part of the content not being indexed at all. If we need to support other search engines, the only solution is to go for SSR or Static Site Generators, because the other engines are not prepared to handle JavaScript applications.

While choosing between SSR and Static Site Generators we should consider how often our data will change. If we are implementing a mostly static website, we can pick one of the existing Static Site Generator solutions and develop the website in the technology we love to work with.

There are multiple libraries providing SSR and Static Site Generator solutions for the top SPA frameworks, with options for React, Vue and Angular. What are the differences and how do you use them? I hope my next article will help you choose the proper tool for your next application.

See also: