Categories
Software Technology

Client and Server Side Rendering Static Site Generators

In the last few years the popularity of Single Page Applications has increased dramatically. Before the SPA revolution, the majority of application logic ran on the back end, with some AJAX additions, and user interface improvements were handled by JavaScript. Because only a small piece of content was added asynchronously, almost the whole page was rendered server-side and available even when browsing without JavaScript.

When SPA applications appeared, in the era of Backbone JS, Ember and Angular, the situation with search engine crawlers was clear: none of them supported JavaScript rendering. Crawlers simply received the content rendered by servers and indexed it, which ended up with the terrible SEO so common when relying on JavaScript too much. Technology evolves very fast, so a question arises whether the SEO problem is still valid for SPA applications in 2018/2019.

What is SSR? Do we need it?

SSR stands for Server-Side Rendering. It is a way to prerender parts of the application on the server, as in a classical back-end language. The browser makes a request to a server, the server runs the application code on the fly and responds with the generated HTML. On the browser side the JS logic is attached to the rendered markup and the application can be used as usual.
On the opposite end, we have Client-Side Rendering, which is the standard method for all SPA frameworks. Such applications generate the whole content in the browser at runtime. Typically a client-side application receives from the server a basic HTML structure with an empty placeholder “div” where all application components will be rendered.
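The typical CSR entry document looks roughly like this (a hypothetical sketch; the div id and bundle name vary by framework):

```html
<!DOCTYPE html>
<html>
  <head><title>My SPA</title></head>
  <body>
    <!-- empty placeholder: everything is rendered here at runtime -->
    <div id="root"></div>
    <!-- the bundle fetches data and builds the DOM in the browser -->
    <script src="/bundle.js"></script>
  </body>
</html>
```

Until the script runs, this page is effectively empty, which is exactly what a non-JavaScript crawler will see.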

There is also a third method, similar to Server-Side Rendering, called Static Site Generation. Although it may be confused with SSR, the idea is a bit different. With this method, HTML files are generated during the build phase and later served from the server. Such HTML files can efficiently render hundreds of components which would take time if rendered client-side. The application can be pre-rendered once in a specific state and sent to the end user as plain HTML files in the fastest way possible, using simple HTTP hosting.

We have 3 options to choose from. So, which one should I use?
Each option is used in specific cases and fulfils different needs. Let’s hope the following paragraphs will help you choose the best one.

Which solution should I use?

Client side rendering, server side rendering, static site generators

The infographic below shows the general idea behind each approach. The final effect is the same: a fully functional application. The difference is in the way it is achieved.

At first glance, all three graphics look almost the same, with the same number of steps. The key difference is the timing of each action. In client-side rendering, the application stays in the “loading” state for a long time while it makes AJAX requests for the necessary data, compiles views and injects everything into the DOM. The server load will be small even for a considerable number of requests, but the browser side is overloaded with work, which may run longer and slower on weak computers or phones.

With server-side rendering, most of the work moves to the server. The server executes the application code and generates HTML based on it in order to create the response.
After that, the browser needs to attach all the events to the existing HTML, create the virtual DOM, etc. Since the server does the heavy lifting here, if it is not powerful enough the response time may increase noticeably. With server-side rendering we need to think about how much traffic the application will generate and whether the server is able to handle it fast.

The third option is to use a Static Site Generator, which is the fastest one. The server just sends previously prepared HTML. On the browser side the framework in use attaches itself to the existing structure, and as a result the application is ready to use. Neither the server nor the browser is excessively overloaded.

Let’s make a small comparison table for all the 3 solutions:

Problem                            | Client-Side Rendering  | Server-Side Rendering | Static Site Generators
Social media sharing               | ❌                     | ✅                    | ✅
SEO, search engines                | ❓ it’s complicated 🙂 | ✅                    | ✅
High traffic                       | ✅                     | ❌                    | ✅
Frequently changed dynamic content | ✅                     | ✅                    | ❓ it depends
Easy to implement                  | ✅                     | ❌                    | ❓ it depends

Quick notes about the above points:
In the case of social media sharing, crawlers like Facebook’s or Twitter’s don’t execute JavaScript. With a client-side generated application there is always the same set of OG tags and meta properties, so there is no possibility to share a specific page/route of our application/website. If social sharing is a must, then it is better to choose SSR or Static Site Generators.
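With SSR or a static generator, the server can emit different Open Graph tags per route, which a single CSR shell cannot do. A hypothetical sketch (the route table and `ogTags` helper are invented):

```javascript
// Sketch: generating per-route Open Graph meta tags on the server,
// so each shared URL gets its own title and description.
const routes = {
  '/': { title: 'Home', description: 'Our landing page' },
  '/blog/ssr': { title: 'What is SSR?', description: 'Server-side rendering explained' },
};

function ogTags(path) {
  const meta = routes[path] || { title: 'Not found', description: '' };
  return [
    `<meta property="og:title" content="${meta.title}">`,
    `<meta property="og:description" content="${meta.description}">`,
  ].join('\n');
}

module.exports = { ogTags };
```

The server would inject the result of `ogTags(request.path)` into the head of each rendered page.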

SEO remains a big mystery. A few years ago the situation was clear, because none of the search engines understood JavaScript, so no content rendered client-side was indexed. The situation changed once Google announced that their crawler would finally render JavaScript. (The following paragraphs contain small tests to check this.) With SSR and Static Site Generators we are certain that the content will be indexed, whereas in the case of CSR it’s a bit more complicated 🙂

For high traffic websites it is always better to put as much work as possible on the client, so CSR seems to be a reasonable option.

Static Site Generators don’t work well with dynamic content, because all the HTML files would have to be regenerated far too often. Naturally, there can be parts which don’t change: those can be fully rendered at build time while the dynamic content is fetched with AJAX on the client side. In such a case Static Site Generators will do the job.

The easiest option to implement is a standard SPA app done client-side without any additions. With server-side rendering we have to maintain the server part, which may be tricky depending on the chosen solution. With Static Site Generators, depending on the chosen framework, the implementation will take about as long as for a client-side app, with some additions.

How do crawlers work?

Crawlers are automated scripts which browse the web to collect information about web pages and the connections between them. They are connected with search engines and pass information on to them. Based on the knowledge web crawlers/spiders collect, search engines decide on the order in which search results appear.

Historically, no crawlers executed JavaScript, so a client-side rendered application was not indexed at all. The situation changed a little when Google announced that their crawlers were going to execute JavaScript. It was great news, but let’s remember there is a world outside Google. There are more search engines on the market, such as Bing, Yahoo, Ask.com, Baidu, Yandex, DuckDuckGo and others.

According to market share calculations, Google powered 76% of all searches. It means that 24% of searches around the web are done with different engines! How do we know how they will see our pages? Here you can find fantastic tests with different frameworks and the most popular search engines. The results of this comparison are clear: only Google and Ask.com understand JavaScript and are able to index a client-side rendered application correctly.

While creating an application, keep asking yourself whether it is fine to be indexed only by Google and Ask.com, or whether it would be a good idea to consider the other search engines on the market as well.

How will Google crawler see my page?

A while ago Google provided a great tool for developers to check how Google bots/crawlers see our web pages. With just a few simple steps we can run a diagnostic of our page and see whether all the information posted is correctly rendered by the bot. Bots may reject some included external scripts, or skip rendering something if the execution time is too long. Unfortunately, there are no similar tools for other search engines, and developers learn about their behaviour only from experiments like the one linked in the previous chapter.

Let’s test a few simple examples using the Fetch as Google tool. In the experiment we will check how Google renders a client-side rendered React application in different configurations.
Google needs to verify you as the owner of a webpage before it will render it with the bot. There are multiple methods to achieve this. The simplest ones require uploading a file to the server or adding a special meta tag to the webpage.
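The meta-tag method looks like this (the content value below is a placeholder; the real token is issued per-site by Google Search Console):

```html
<head>
  <!-- verification token issued by Google Search Console -->
  <meta name="google-site-verification" content="YOUR_TOKEN_HERE" />
</head>
```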

Test 1 – a simple React application with nested components


The above example is a simple application with 2 nested components, rendering an image from an external link and a simple text. Everything is correctly rendered by the Google bot, which proves that Google can correctly handle such a simple JavaScript-rendered app.

Test 2 – a React application with a content loaded by slow AJAX response


In the above example the application from the previous test is extended with a third component, which requests content from the server using an AJAX request. The example uses the NASA API and adds a small delay of 5-6 seconds before the content is available to the application. When the content is ready, both the image URL and the simple text paragraph are rendered. It looks like the bot has handled it correctly again.

Test 3 – a React application with a very slow AJAX request


In the third example the first problem occurs: 10 seconds have been added to the loading time of the AJAX response. The component is not rendered at all; it was skipped by the Google bot. The example used a single AJAX request with a long loading time, which would probably never happen in reality. But what about a very complex application built from hundreds of components and relying on multiple external data endpoints? If some of them require all Promises to be resolved, it is possible to end up in the same situation as above: some parts of the content will not be available to the Google crawler and will not be indexed. In such a situation the only solution is to pre-render with SSR or Static Site Generators.
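One defensive pattern on the client is to cap how long the initial render waits for any single endpoint, falling back to a placeholder instead of blocking everything. A sketch (the `withTimeout` helper and the delays are invented; real fallbacks would be rendered components):

```javascript
// Sketch: don't let one slow endpoint block the whole initial render.
// Race each data promise against a timeout and fall back to a placeholder.
function withTimeout(promise, ms, fallback) {
  const timeout = new Promise((resolve) => setTimeout(() => resolve(fallback), ms));
  return Promise.race([promise, timeout]);
}

// Usage: fast data renders normally, a slow endpoint falls back.
const fast = Promise.resolve('NASA photo of the day');
const slow = new Promise((resolve) => setTimeout(() => resolve('late data'), 300));

Promise.all([
  withTimeout(fast, 50, 'placeholder'),
  withTimeout(slow, 50, 'placeholder'),
]).then(([a, b]) => {
  console.log(a); // the fast content arrives in time
  console.log(b); // 'placeholder': the slow request didn't make the cut
});
```

The render completes quickly either way, so the crawler at least sees the fast content plus placeholders instead of nothing.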

Summary

While developing a new Single Page Application we should carefully think over our needs. If we are implementing an internal application, e.g. an admin panel where the user needs to log in, we don’t need to worry about SEO, indexing or social media; a client-side rendered application will do.
Taking social media sharing into consideration (specific Open Graph tags for particular pages, different titles and descriptions for each page, etc.) requires SSR or Static Site Generators. Social media tools don’t execute JavaScript, so replacing meta information for specific routes will not be possible otherwise.

When we want to develop a website which should be indexed by search engines, we need to be careful. If we only care about Google, we can write a client-side rendered application, and in order to make sure that everything is correctly indexed we should frequently check with the Fetch as Google tool. Too long loading/rendering times may result in part of the content not being indexed at all. If we need to support other search engines, the only solution is to go for SSR or Static Site Generators, because those engines are not prepared to handle JavaScript applications.

While choosing between SSR and Static Site Generators, we should consider how often our data will change. If we implement a highly static website, we can choose one of the existing Static Site Generator solutions and develop our website in the technology we love to work with.

There are multiple libraries providing SSR and Static Site Generator solutions for the top SPA frameworks; there are solutions for React, Vue or Angular. What are the differences and how do we use them? I hope my next article will help you choose the proper tool for your next application.


TOP 5 Analytics tools to Measure App Success

Measuring your app’s success helps you keep track of its performance and value. There are different metrics you can use to measure an app’s success. However, you might also consider using analytics tools, which provide more credible results by eliminating human error.

Before deciding which analytics tool is best for you, it is a good idea to learn about the different types of analytics tools and how they operate; this knowledge will help you make an informed choice.

Basically, there are three major types of analytics tools: marketing analytics, in-app analytics and performance analytics.

1. App marketing analytics

This type of analytics helps you discover ways to monetize your app. How do users discover and learn about your app? Do they find it while browsing an application store or maybe somewhere else, e.g. on other websites?

2. In-app analytics

In-app analytics focuses on user behavior within the application. What do users do once they have opened your app? Do they, for example, click on the ads? The data you collect and analyze concerning your users will be a huge help in the post-publishing development phase of your app. It can give you an insight into your most frequent and most valuable users.

3. App performance analytics

App performance analytics deals with the mechanical part of the application. The tool identifies factors and malfunctions which cause your application to crash, shows which devices register slow performance, and so forth.

This kind of analysis plays an important role because without it the app would be doomed.

How to go about selecting app analytics tools? There are many factors to consider, some are basic such as:

  • Key features – Some features are universally provided by most platforms. What makes this particular tool stand out? Does it offer A/B testing, for example?
  • App needs – What kind of analytics do you need? If your app is very original then you might want to consider customizable metrics instead of the standardised ones.
  • Level of support – Will the tool be helpful and reliable when things go south? Do they have an active customer care team?
  • Size of SDK – How easy is the implementation process? Remember to choose one that is simple to implement and not too complicated to master. Some SDKs can slow your app down and, as a result, reduce its performance.
  • Cost – You can find platforms in various price tiers. Choose the one that is within your budget but also has all the features that you need.

Bearing all these elements in mind, below you will find a list of the top 5 app analytics tools.

1. Google Analytics for mobile

Google Analytics Logotype

Digital marketers may know Google Analytics as a website analytics tool but what they may not know is that it can be used as an app analytics tool. Some aspects such as user engagement can be tracked and measured with Google Analytics providing a very deep and valuable insight into the app visitors’ behaviour.

Google offers in-app analytics for both Android and iOS.

Pros of Google Analytics

  • Helps to understand the app user characteristics, traffic sources and volume
  • Shows what actions your users are taking
  • Measures in-app payments and revenue
  • Customizes reports specific to your business
  • Visualizes user navigation paths
  • Chunks and structures your data to understand different user groups’ behavior and enables you to isolate the user behavior thanks to the User Explorer Report

One major disadvantage of Google Analytics is that it’s not the most user friendly tool to use. It lacks support service in case something goes wrong, so you’ll have to go through tutorials and search for answers on your own.

2. Flurry

Flurry logotype

Flurry.com has been acquired by Yahoo and is now part of the Yahoo Developer Network; it is one of the oldest mobile app analytics tools. It comes with many advantages and is absolutely free for iOS, Android as well as Blackberry. Flurry offers the basic features necessary to manage and monitor the performance of your app.
The tool enables you to view the user’s experience in real time and, as a result, gain an insight into the user’s behavior.

Pros of  Flurry:

  • Provides user acquisition analytics
  • Displays data on the number of app sessions
  • Gives an accurate volume of active users, the demographics and frequency of use
  • Monitors ad performance
  • Offers funnel management
  • Measures app retention

3. Mixpanel

Mixpanel logotype

Mixpanel has gained the trust of AirBnB, WordPress as well as match.com users. One of the strongest points of the tool is the installation speed, which is just under ten minutes. In addition, it is filled with analytical tools. Mixpanel supports both iOS and Android.
This particular analytical tool has a free plan as well as plans going up to $999 per month.

Pros of Mixpanel

  • Offers A/B testing
  • Gives app retention performance reports
  • Provides activity feeds and enables you to track revenue
  • Provides people analytics
  • Funnel  management
  • Targeted push messaging is enabled
  • Measures retention rate

Although it is freemium, it is worth noting that the free version is limited to 25,000 data points per month.

4. Appsee

Appsee logotype

Appsee offers developers qualitative mobile app analytics. The insights are visual and may include video recordings of user behavior during sessions. Every action and activity carried out by the user is automatically captured and tagged by Appsee’s technology. It has been recognized by Gartner as the leader of qualitative analytics.
The touch heat maps allow developers to see in-app user behavior comprehensively, including small details such as swipes, which are displayed color-coded. The tool offers a 14-day trial, after which it rolls into the premium version.

Pros of Appsee

  • Seamless integration (5 min)
  • Touch heat maps
  • Records user sessions
  • Provides crash reports
  • Offers retention analytics
  • Action cohorts

Cons of Appsee

  • There is no freemium, just a free trial

5. Localytics

Localytics logotypes

If you are looking for smart targeting of your audience as well as marketing automation, then Localytics is for you. It offers a diversity of features, both traditional and modern ones. Apart from the app analytics tools, Localytics is known for its personalization capabilities, which are achieved by targeted in-app messaging and push messages.
It also helps you re-engage with those who have uninstalled your app in order to convince them to come back and reinstall your app.
It comes with both free and paid subscription plans tailored to your needs and budget.

Pros of Localytics:

  • Uninstall Tracking
  • Targeted in-app messaging
  • Push messaging
  • Funnel management
  • Life-time value tracking
  • Retention analytics
  • Enables A/B testing

It is worth noting that there are no funnels in the free version.

Conclusion

Technology evolves every day, and new app analytics tools are introduced just as often. If you consider the criteria that the app has to meet, then the question of which one to choose in this diverse and large market will be an easy one. Choose the one that gives you the features you need and is within your budget. On top of everything, choose one that is easy to navigate. The above top 5 mobile app analytics tools have various features – so the choice is yours.


Categories
Blockchain Financial Services Technology

Blockchain testing MakerDao's stablecoin platform

As the cryptocurrency bear market continues, investors are looking for much-needed predictability. Stablecoins have risen in popularity as the market tumbles. Several have emerged, but few are as transparent as MakerDao. Maker had Espeo Blockchain testers check to make sure everything worked properly before launch. Since the technology is so new, blockchain testing is vitally important for any project that wants to keep the trust of this fickle market.

Cryptocurrency is having a bit of an identity crisis lately. Some extoll utopian notions of freeing consumers from banks, while others see a way to make money. It was this unregulated asset speculation that spurred investors to flock to cryptocurrency late last year. Unsurprisingly, speculation in such a constrained asset class caused wild volatility in the market.

Market volatility, of course, is not good if you’d like to actually use cryptocurrency as a currency. One solution to this is the stablecoin. Stablecoins are crypto tokens with fiat currency or other more stable assets backing them. Similar to pegging a weak currency to a stronger one, stablecoins ideally lend confidence to those who hedge with them.

The Stablecoin

Centralization and dubious fiat backing are some current criticisms of stablecoins on the market. Users have to trust a central authority that the company issuing the tokens actually has the funds. But just like pegged fiat currency, stablecoins introduce much-needed confidence into the token economy.

Espeo Blockchain helped MakerDao with blockchain testing before launch. Unlike other stablecoins, Maker issues its Dai tokens using smart contracts in exchange for Ethereum. Head of business development Gregory DiPrisco wrote in a company blog post earlier this year that Maker operates in a similar way to a bank, just without the middlemen. The platform aims to introduce stability into the cryptocurrency market with its Dai token: Ether collateral backs each token. MakerDao reduces volatility and allows users to maintain their purchasing power with crypto assets.

“Pretend you are at the bank asking for a home equity loan. You put up your house as collateral and they give you cash as a loan in return… just replace your house with ether, the bank with a smart contract, and the loan with Dai.”

Of course, just as the bank might take your house if you can’t repay the loan, Maker automatically resells the ethereum collateral if it drops below the value of the Dai loan. This mechanism maintains the integrity of the system and keeps the price stable. As prices drop, though, you might think that the system would unravel. Mike Porcaro, head of communications at Maker, remains positive, however. In an email interview, he said:

“Maker is unlocking the power of the blockchain for everyone by creating an inclusive platform for economic empowerment — allowing equal access to the global financial marketplace… The currency lives completely on the blockchain; its stability is unmediated by any locality, and its solvency does not rely on any trusted counterparties.”

Collateral debt position

MakerDao’s CDP portal holds a surplus of collateral in publicly auditable Ethereum smart contracts. Users can use Dai tokens in the same way as any other crypto. Users can send Dai to others, pay for goods and services, or save them long-term. Creating a Collateral Debt Position (CDP) allows users to protect their ETH assets with Dai stablecoins. Porcaro went on to say:

“People or organizations create Dai by locking-up ETH in a CDP. As long as people can open CDPs there will be Dai in circulation. [Maker is] seeing increased volumes of Dai in circulation. In fact, about 1.5% of total ETH is locked up in Dai smart contracts.”

The hope is that the system will encourage more people to use cryptocurrency as a medium of exchange. Stability is essential to achieve this.

Blockchain testing

Espeo Blockchain helped MakerDao with blockchain testing before the platform went live. Checking the platform’s usability and the security of the smart contract code is critical in a blockchain project. This process ensured that the system functioned properly before people started putting their money in. Head of product Soren Nielsen hired Espeo when his team needed assistance with testing the MakerDao platform. Nielsen explained in an email interview:

“We needed immediate assistance with testing a product, [so] we decided to “test the waters” with Espeo… A challenge we currently have in general when engaging new suppliers is that there is a steep learning curve unless you’re already a user of our system… I felt that we had a professional client-supplier relationship. It certainly was good to have a dedicated project manager following this from Espeo.”

Finding bugs to fix

Project manager Natasza Stanicka and testers Bartosz Kuczyński and Patryk Jaruga got to work trying out every aspect of the platform looking for bugs to fix. They had to methodically sift through every feature and behave just as a regular user would. Kuczyński recalled:

“First, we clicked through every clickable item and went through every user story and tried to find edge cases. I guess the challenge with that was the fact that it is a blockchain application and that has certain consequences in itself. We needed to make sure that it actually cooperates with the blockchain correctly and the results were accurate.”

Jaruga remembers finding a bug which interrupted Ethereum transactions between hardware wallets. He said that every device behaved a bit differently in transactions and that they tested the Maker platform with a range of hot and cold wallets. The Maker team fixed the bugs as soon as they knew about them, said Bartosz Kuczyński. He added:

“Maker’s development team was really interested in bug reports, and they dealt with them immediately. In that respect, I believe this was really very ‘well-oiled.’ The bugs were corrected immediately so we would be able to move on to the next thing. The overall attitude of the company was certainly very positive.”

Testing such a complex application ensured a successful launch. Since MakerDao relies on consumer trust, getting the calculations right is vitally important. Blockchain testing involves a lot of things. Tokenomics, UI/UX, and hardware compatibility are some of the aspects our testers analyzed. In order to maintain the integrity of the token system and preserve the public’s trust, making sure the app works as promised was essential.

Conclusion

Dai stablecoins and the MakerDao infrastructure go a long way toward stabilizing the cryptocurrency ecosystem. For more people to adopt crypto and start using it as a medium of exchange, price volatility and uncertainty have to ease. Dai’s stability helps users hedge their Ethereum assets and protect their investments from wild swings in the market. Blockchain testing ensured the platform worked properly before launch. More than anything else, confidence in blockchain technology will encourage wider adoption. Knowing that you won’t lose everything overnight will spur more people to start using cryptocurrency.


Front-end history, 2018 trends and Espeo choices

Front-end development has changed significantly over the last 10 to 15 years. JavaScript has evolved the most among the existing programming languages. It turned away from writing simple logic on websites using ugly, unstructured code and plugins, and evolved into building completely functional Single Page Applications.

Let’s go on a small journey back in time to see how it used to look in the past and what changes have happened over the last couple of years.

The good old days…

We are in 2006/2007. It’s the time when most 30+ developers started their careers. JavaScript was weak and was not taken seriously. Most developers chose back-end languages as their main technology to master. Web applications reloaded every time a user interacted with the UI. AJAX was still something new and was used only for updating small elements on pages. The key player was jQuery: it simplified DOM manipulation and added a nice wrapper for asynchronous code. Thousands of jQuery plugins were created, responsible for specific UI interactions such as sliders, carousels, tabs and many more.

The term front-end developer didn’t exist then. People involved in JavaScript were called web developers. Their responsibility was to turn designs into fully working web pages. They worked closely with designers, styled using pure CSS without any compilers and added small portions of JS code to make websites more interactive. Various browsers with different capabilities posed a great challenge, as they implemented different standards. This is the era of many browser tricks and hours spent on Internet Explorer 7 support. Any automation, like concatenation and script minification, was done by back-end technologies, and in most cases no one cared about it, ending with a long list of scripts and linked styles in the HTML head section.

JavaScript revolution

A few years later, between 2009-2010, a lot of things happened in the front-end world. Many new key players appeared on the scene. It’s worth mentioning a few frameworks which made a lot of noise at that time:

  • Backbone JS – the first attempt to build a 100% JavaScript application without any annoying reloads. The idea of Models and Collections behind Backbone made web application development more structured, with well-documented and clear boundaries understandable for many developers.
  • RequireJS – the first solution for module dependency management in JS. It allowed splitting code into multiple files which could be downloaded on demand in the browser, which was a great step into the future.
  • Sass and Less – the compilers became mature enough, which meant no more pure CSS. Since NodeJS was still a toddler, the compilers were built on top of other languages such as Ruby, Python or .Net.
  • NodeJS – it enabled JavaScript code to run outside the browser and revolutionised the front-end development workflow once and for all.
  • AngularJS – the first Angular version; it gained a lot of popularity, and many developers turned to the Angular world quickly.
  • PhoneGap – it allowed writing HTML applications which ran inside an iOS/Android WebView. It was not perfect and frequently ended with performance issues because of WebView limitations, but it enabled developers to create their own mobile applications knowing only HTML and JS.

Naturally, there is much, much more. Javascript stopped being just an additional technology. Companies started to hire “JS Developers.”

It’s a little too fast…

Frontend history


Evolution sped up after 2010. In the following years, thousands of new libraries appeared. Finally, JS devs had package managers for dependencies. Most started with Bower and slowly moved to NPM packages as NodeJS became more mature. No more keeping all dependencies in project repositories.

In the “good old days” any attempts to make some automation like concatenating files, minification or images optimization needed to be done outside of the JS world. But change was coming…

New task runners started to appear. Most developers began to use Grunt or Gulp for simple automation. Many tasks appeared for both frameworks, making it possible to check JS code quality, concatenate files, or even watch changes in source files and refresh the browser or inject changes asynchronously without a refresh. Later, both tools started to lose popularity for the benefit of npm scripts: small pieces of NodeJS code which could do everything NodeJS allowed.

The biggest changes appeared in javascript SPA applications. Developers were inspired by solutions put forward by Backbone and Angular teams and started to recreate and improve on similar solutions.

It’s when Ember, Knockout and React were gaining popularity. The last one has made the biggest impact on the front-end world in a long time.

Another aspect of development which changed significantly during this time was the modular approach. In “the good old days”, modules in JavaScript did not exist. The only way to preserve privacy was to use design patterns such as the Module Pattern, built from an anonymous, immediately executed JavaScript function. It was the only way not to pollute the global scope with everything we implemented. RequireJS, mentioned in the previous chapter, made some progress here with the AMD approach. Another approach was Browserify, which used CommonJS modules like in NodeJS but inside a browser, compiling them into a single file to be injected into the HTML. Finally, Webpack appeared, which became the standard for bundling and dealing with modules in client-side applications.
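The Module Pattern mentioned above looked roughly like this: an immediately invoked function whose closure keeps its variables private and exposes only a public API (the counter example is invented for illustration).

```javascript
// Module Pattern: an immediately invoked function expression (IIFE)
// creates a private scope; only the returned object is public.
var counter = (function () {
  var count = 0; // private, invisible outside the IIFE

  return {
    increment: function () { count += 1; return count; },
    current: function () { return count; },
  };
})();

counter.increment();
counter.increment();
console.log(counter.current()); // 2
console.log(typeof counter.count); // 'undefined', the variable stays private
```

Module systems like AMD, CommonJS and later ES modules made this scoping automatic, so the pattern largely disappeared.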

UI frameworks also had a big impact on the way we develop: Bootstrap, Foundation and, later on, Material-UI. Each includes a complete set of components, fonts, icons and utilities such as the frequently-used grid system. They speed up development and help achieve a consistent UI without additional styling.

Fast and furious….

A few years ago it was easy to follow all the news in the front-end world, but the popularity of JavaScript kept increasing and more and more developers started contributing to open source projects.
There are now millions of libraries, frameworks and scripts in JavaScript. Let’s run a simple test:

  • Think of the name of any item not connected with programming, e.g. something home-related such as: table, home, knife, doors, bed…
  • Try to google it with the “js” suffix.
  • Almost every name turns out to be a library, npm package or GitHub repository; you should get a 100% match for the first five words that popped into your head.
Vue, React, Angular

Almost every aspect of front-end development has changed. In the case of styling, Sass was for several years the unquestioned leader, and we adopted solid architectural conventions for it such as BEM or SMACSS. Then came the idea that styles could be written directly in JS, tied to components with proper scoping. Currently, there is a large number of similar CSS-in-JS solutions.
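The core CSS-in-JS idea can be shown without any library: styles live next to the component and receive a generated, scoped class name. This is a toy illustration only; real solutions such as styled-components or Emotion do far more (vendor prefixing, theming, actual style injection).

```javascript
// Toy CSS-in-JS: each call produces a unique class name, so styles
// written for one component can never collide with another's.
let uid = 0;
function css(declarations) {
  const className = `sc-${uid++}`; // pseudo-unique scoped name
  // A real library would inject this rule into a <style> tag here;
  // we simply return the generated rule for inspection.
  return { className, rule: `.${className} { ${declarations} }` };
}

const button = css("color: white; background: tomato;");
console.log(button.rule); // .sc-0 { color: white; background: tomato; }
```

The generated `className` is what the component renders into its markup, which is how scoping replaces naming conventions like BEM.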
In the case of SPA frameworks the situation is clearer; there are three competitors: React, Angular 2+ and a newcomer, Vue. A detailed description and comparison of each is a topic for another blog post. Most developers choose one of them. Some people will agree with the image below:

The number of mobile application frameworks for JavaScript to choose from is also huge: React Native, Ionic, NativeScript, Cordova. React Native is gaining popularity and wider support. We need to choose the mobile technology for a project carefully; maybe for simpler ones a PWA would be enough?

The first Backbone JS applications were huge, and each month they grew bigger. When a bug appeared, it was hard to tell what state the application was actually in; with Redux or MobX, debugging would not have been so painful and time-consuming.
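The idea that makes Redux-style debugging easier can be sketched in a few lines of plain JS (no Redux library used here): one state object, changed only by a pure reducer in response to dispatched actions, so the current state is always a single inspectable value.

```javascript
// A minimal Redux-style reducer: given the previous state and an
// action, it returns a brand-new state object and never mutates.
function reducer(state = { todos: [] }, action) {
  switch (action.type) {
    case "ADD_TODO":
      return { ...state, todos: [...state.todos, action.text] };
    default:
      return state;
  }
}

// Every state transition is an explicit, replayable function call.
let state = reducer(undefined, { type: "@@INIT" });
state = reducer(state, { type: "ADD_TODO", text: "write blog post" });
console.log(state.todos); // [ 'write blog post' ]
```

Because each transition is a pure function of `(state, action)`, a bug can be reproduced by replaying the action log — something a sprawling Backbone app with scattered mutable models could not offer.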

The world of JS is a huge one, and I have only mentioned a few core libraries. If we dig into a specific area such as SVG graphics, WebGL, Web Components or server-side rendering, more good, well-known libraries appear, and each deserves its own article. Server-side rendering is a “hot” concept nowadays, as we have entered the SPA era, where all content is rendered client-side, yet there are still crawlers and social tools which do not understand JavaScript.
For all the non-believers: jQuery is not dead! It is still one of the most popular libraries on the internet. If you think jQuery is always “evil”, please read this.

State of Javascript in 2018

So, what is the current state of JavaScript? Its widespread use says it all.

You can check out what the current trends are. Take heed of the conclusions in each section when choosing the technology for a new project. The newest “hot” technology may not always be the best choice because of a lack of documentation and solved issues. Mature frameworks have a lot of contributors: most errors have already been solved, and if new ones appear, there is always someone who will quickly address the problem and solve it.

How to be a good front-end developer in 2018? Naturally, we all need to follow the news frequently. So much happens every day, and it is always helpful to visit news aggregators such as DailyJS. You will not become a technology specialist simply by reading articles; it just gives you an idea of what to learn next 🙂

A few years ago front-end developers were able to do everything their employer needed. It was not too difficult to be knowledgeable about CSS/Sass, vanilla JS and the existing SPA frameworks.
Nowadays it is more complicated: it is hard to be a specialist in everything, and it is not possible to reach a master level in everything related to the front-end. We need to specialize and choose a path to become a React, Angular 2+, Ember, Ionic, NativeScript, styling, etc. specialist.

Espeo Choices

At Espeo we follow current trends. For our projects we use fresh, well-known and well-documented technologies.

When creating new SPA projects, React with Redux and Angular 2+ are the way to go, with React increasingly dominant. Developers use React Native or native applications for mobile solutions; however, React Native can’t solve all mobile problems. As for styling, Sass and styled-components are used extensively in our projects. Webpack and npm scripts have become a standard for all the projects we develop.

We are open to new technologies and keep exploring new possibilities such as Vue, TypeScript, Reason… and many more. Possibly, we’ll consider a few of them for our future projects.