In web programming, the term “front-end” refers to the elements that enable interaction between your website and the visitor’s browser.
Front-End Optimization (FEO), also known as content optimization, covers the techniques applied during your website’s development to make it load faster and more pleasant to navigate for the user.
Display speed is an SEO criterion of growing importance. Google has also released a new performance-testing tool that distinguishes between display on desktop on the one hand and on mobile on the other; with the arrival of the mobile-first index, this concern takes on its full meaning.
In broad strokes, Front End Optimization focuses on reducing file sizes and minimizing the number of requests needed to load a given page. All this in the service of the display speed factor, an SEO parameter on which Google communicates assiduously.
During FEO, web designers and possibly SEO experts distinguish between the estimated and actual download time of a page. Estimated download time matters because of its impact on the overall user experience (UX), while actual download time is often used as a benchmark performance metric.
Content Delivery Networks (CDNs) play an important role in the FEO process because they are commonly used to simplify the most time-consuming optimization tasks. For example, a typical CDN has automatic file compression and minification features, freeing you from having to manually tinker with individual website resources.
Server Access Speed: Time to First Byte (TTFB)
Server access speed, often used to measure a website’s response time, is one of the most important performance metrics – as well as one of the most misunderstood.
- From the point of view of the actual download time, TTFB is the time it takes for the first byte of data to travel from a server to the calling browser.
- From the point of view of the estimated download time, the TTFB is the time it takes for the browser to parse the first byte after downloading the HTML file.
Only the estimated TTFB impacts the user experience, making it the more important of the two metrics.
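To make the actual TTFB concrete, here is a minimal Python sketch that times the gap between sending a GET request and reading the first byte of the response body. The local demo server exists only so the example does not depend on the network; against a real site you would pass its hostname and port instead.

```python
import http.client
import threading
import time
from http.server import BaseHTTPRequestHandler, HTTPServer

def measure_ttfb(host: str, port: int, path: str = "/") -> float:
    """Seconds from sending a GET request to reading the first response byte."""
    conn = http.client.HTTPConnection(host, port, timeout=10)
    start = time.perf_counter()
    conn.request("GET", path)
    resp = conn.getresponse()  # status line and headers received here
    resp.read(1)               # first byte of the body
    ttfb = time.perf_counter() - start
    conn.close()
    return ttfb

# Minimal local server so the demo is self-contained (illustrative only).
class _Handler(BaseHTTPRequestHandler):
    def do_GET(self):
        self.send_response(200)
        self.send_header("Content-Length", "5")
        self.end_headers()
        self.wfile.write(b"hello")

    def log_message(self, *args):  # silence per-request logging
        pass

server = HTTPServer(("127.0.0.1", 0), _Handler)  # port 0 = pick a free port
threading.Thread(target=server.serve_forever, daemon=True).start()

ttfb = measure_ttfb("127.0.0.1", server.server_address[1])
print(f"TTFB: {ttfb * 1000:.1f} ms")
server.shutdown()
```

Note that this measures the actual (network-side) TTFB; the estimated TTFB discussed above can only be observed in the browser itself, for instance via the Navigation Timing API.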
Front-end delays account for up to 80% of your website’s response time.
There are several ways to improve the responsiveness of your website – and gain positions toward the first page on Google:
- Reducing HTTP requests
- File compression
- Cache optimization
- Code minification
- Image optimization
Reducing HTTP requests
When loading a web page, a browser must open a separate TCP connection for each HTTP request it issues – that is, one connection per element to be downloaded on the page.
The problem is that there is a limit to the number of concurrent connections a browser can open to a single host. This limit exists so that a server is not overloaded with a large number of HTTP requests. However, it also serves as a potential bottleneck, often forcing the browser to queue connection requests.
Since the maximum connection threshold is quickly reached, various Front-End Optimization techniques are used to reduce the number of elements on the page as much as possible. One of the most common is resource consolidation – a practice that involves grouping several small files into a single “package”.
Example…
Let’s say your website consists of one HTML file, two CSS files, three scripts, and 16 images – including your logo and various menu backgrounds. In total, a browser will need to make 22 HTTP requests to load a page on your site.
A user browsing Google Chrome can only open six simultaneous TCP connections to your server, so the browser queues the other 16 requests.
By consolidating the 16 images into a single sprite, and bundling the CSS files and the scripts into one file each, you’ll reduce the number of requests from 22 to just 4.
Not only does this allow the Chrome browser to fetch the whole page in a single “session”, but it also reduces the number of round trips required for the page to load.
Content Delivery Networks pool connections
CDNs can further reduce server response time by aggregating connections and ensuring that they remain open throughout a session.
While a CDN does not reduce the number of requests per se, connection pooling improves performance by eliminating the delays caused by closing and reopening TCP connections.
HTTP/2 Multiplexing
Still early in its adoption phase, the HTTP/2 protocol introduces multiplexing – a connection method that allows multiple requests and responses to be sent over a single TCP connection.
In the near future, this will reduce the benefit of resource packaging described earlier.
File compression
Each of your web pages is built from a collection of HTML, JavaScript, CSS, and (possibly) other source code. The more complex the page, the larger the files and the longer the loading time will be.
With file compression, these files can be reduced to a fraction of their original size to improve site responsiveness. Preferred for its fast encoding/decoding times and high compression ratios, Gzip is the most popular file compression choice. It can reduce source code files by 60 or even 80 percent.
Note: Gzip is not effective in reducing the size of image files because they are already compressed.
Gzip itself compresses a single data stream; to group several files into one compressed .tar archive (a.k.a. tarball), it is combined with the tar utility. The downside of a tarball is that individual files cannot be extracted without decompressing the whole archive. This is not a problem for web content, which must be decompressed anyway for the page to load correctly – and in practice, web servers compress each response individually.
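The size savings are easy to verify with Python’s standard gzip module. The stylesheet below is a made-up example; real-world savings depend on how repetitive the source is, but text-based formats routinely shrink by well over half.

```python
import gzip

# A realistic-looking stylesheet: repeated rules compress extremely well.
rule = ".button { color: #336699; margin: 0 auto; padding: 4px 8px; }\n"
css = ("/* stylesheet with many similar rules */\n" + rule * 200).encode("utf-8")

compressed = gzip.compress(css, compresslevel=6)

print(f"original: {len(css)} bytes, gzipped: {len(compressed)} bytes")
print(f"saved {1 - len(compressed) / len(css):.0%}")
```

In production this compression is handled transparently by the web server or the CDN (e.g., via the `Content-Encoding: gzip` response header), not by application code.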
The CDN factor
Almost all CDNs provide automated file compression, seamlessly converting all compressible source code to Gzip (e.g., CSS and JS files) before returning them to website visitors.
Cache optimization
HTTP cache headers play an important role in how browsers handle a website’s content, because they determine which pieces of content are cached and for how long.
Caching stores your static files – which tend to be the largest – outside of your origin server: on visitors’ local disks or on a nearby CDN PoP. This can significantly improve the website’s loading speed.
The downside is that manually managing cache headers can quickly become a tedious and inefficient task. In addition, caching mechanisms often run into problems with dynamically generated content created on the fly as a page loads (e.g., AJAX objects and even dynamically generated HTML files).
CDN control options
Many CDNs offer cache control options, through an easy-to-configure “dashboard” interface.
This console allows you to:
- define site-wide policies,
- manage caching rules for individual items, and
- define policies for entire file groups based on criteria such as file type and location (e.g., always cache all images in the “/blog/” directory for 60 days).
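A per-filegroup policy like the 60-day “/blog/” images rule above can be sketched as a simple lookup. The paths, file types, and max-age values here are illustrative assumptions, not any real CDN’s rule syntax:

```python
def cache_control(path: str) -> str:
    """Pick a Cache-Control header by file type and location,
    mimicking per-filegroup caching rules (values are illustrative)."""
    if path.startswith("/blog/") and path.endswith((".png", ".jpg", ".webp")):
        return "public, max-age=5184000"  # 60 days, in seconds
    if path.endswith((".css", ".js")):
        # Long-lived, assuming filenames are fingerprinted on each release.
        return "public, max-age=31536000, immutable"
    if path.endswith(".html"):
        return "no-cache"  # always revalidate HTML with the server
    return "public, max-age=3600"  # default: one hour

print(cache_control("/blog/header.png"))  # → public, max-age=5184000
```

A CDN dashboard effectively builds the same kind of decision table for you, then applies it at the edge.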
For some time now, CDNs have been incorporating machine learning techniques. By following content usage patterns, they automatically optimize caching options, extending caching even to dynamic content typically deemed impossible to cache.
This relieves the server manager of many of the caching management tasks.
Code Minification
Minification (reducing the size of code) is an FEO process that exploits the difference between how developers write code and how machines read it.
The idea is that – while developers write code to be easy to read, with spaces, line breaks, and comments – machines can read the code without any of these elements and still get straight to the point.
Minification strips the code down to its bare essentials, reducing it by a good third or even half before compression.
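As a rough illustration, a few regular expressions are enough to strip comments and whitespace from a CSS snippet. This naive sketch would mangle strings or URLs containing the stripped characters – production minifiers (and the ones CDNs run) are far more careful:

```python
import re

def minify_css(css: str) -> str:
    """Naive CSS minifier for illustration only."""
    css = re.sub(r"/\*.*?\*/", "", css, flags=re.S)  # strip /* comments */
    css = re.sub(r"\s+", " ", css)                   # collapse whitespace runs
    css = re.sub(r"\s*([{}:;,])\s*", r"\1", css)     # drop space around punctuation
    return css.strip()

source = """
/* main button */
.button {
    color: #336699;
    margin: 0 auto;
}
"""
print(minify_css(source))  # → .button{color:#336699;margin:0 auto;}
```

The meaning of the stylesheet is unchanged; only the characters a machine never needed are gone.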
The CDN factor in this context
CDNs can completely automate code minification, making it very easy for them to minify JavaScript, HTML, and CSS files on the fly before sending them to visitors’ browsers.
Gzip AND Minification:
While converting code to Gzip and minifying it at the same time may seem redundant, combining the two methods offers the best results.
In practice, minifying the files before Gzip compression shrinks the compressed file by a further 5–10%.
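You can measure the combined effect yourself. The stylesheet below is synthetic and the whitespace-only “minification” is deliberately crude, so the exact percentage will vary – the point is that the two steps are complementary, not redundant:

```python
import gzip
import re

# 300 distinct rules so the compression is non-trivial (synthetic example).
css = "\n".join(
    f".item-{i} {{ margin: {i}px; padding: {i % 7}px {i % 5}px; }}"
    for i in range(300)
)

# Crude whitespace-only minification, purely to measure the size effect.
minified = re.sub(r"\s*([{}:;,])\s*", r"\1", re.sub(r"\s+", " ", css))

plain_gz = gzip.compress(css.encode("utf-8"))
min_gz = gzip.compress(minified.encode("utf-8"))

print(f"gzip alone: {len(plain_gz)} bytes")
print(f"minify + gzip: {len(min_gz)} bytes")
```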
Image optimization
Caching and compression are the two most common methods of optimizing images, with caching being the more effective of the two. This is because, unlike source code files, common image formats are already compressed.
To further reduce the size of an image file, you need to “tamper” with the image data, either by removing some information from the header or by reducing the quality of the original image. The latter is known as lossy compression.
This is not a good idea, except possibly for some high-resolution images, because the human eye cannot naturally perceive all the visual information contained in such images.
For example, lossy compression can remove color gradations and reduce pixel complexity without affecting our perception of the image too much.
Beyond web compression, the best way to optimize images is to reduce their dimensions and file size BEFORE putting them in the media library.
The CDN factor for images
CDNs also help automate the image compression process, allowing you to choose the page load speed and image quality settings.
The most advanced CDNs also offer a progressive rendering option that builds on the concept of lossy compression. With progressive rendering, the CDN quickly loads a pixelated version of the image, then gradually replaces it with a series of better and better variants until the actual image is ready to display.
Progressive rendering is useful for its ability to decrease estimated load times without sacrificing image quality.
Progressive rendering in progress
Vector images and raster images
Another image optimization technique is to replace some of your regular (raster) images with their vector counterparts.
This technique applies to images composed of simple geometric shapes: lines, curves, polygons, etc. A typical vector image is an icon or diagram.
You should use vector images when you can because:
- They are very small in size, since they only need to store data for a set of coordinates – not for each individual pixel.
- Because they are resolution-independent, they can be scaled up and down indefinitely without any impact on quality. This makes them perfect for responsive design.
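The size difference is easy to quantify. This back-of-the-envelope sketch compares raw RGB pixel data for a 200×200 image (before any PNG/JPEG encoding, so the raster figure is an upper bound) with the same shape described as an SVG document:

```python
# Raster: raw RGB pixel data for a 200x200 image.
raster_bytes = 200 * 200 * 3  # three one-byte channels per pixel

# Vector: the same red circle described as an SVG document -
# just coordinates and attributes, regardless of display size.
svg = (
    '<svg xmlns="http://www.w3.org/2000/svg" width="200" height="200">'
    '<circle cx="100" cy="100" r="90" fill="red"/>'
    "</svg>"
)
vector_bytes = len(svg.encode("utf-8"))

print(f"raster: {raster_bytes} bytes, vector: {vector_bytes} bytes")
```

A real PNG of the circle would be far smaller than the raw pixel count, but the SVG stays just as tiny at any resolution – which is exactly why icons and diagrams belong in vector form.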
All of these Front End optimizations
… are not always easy to implement; they require skills, especially when it comes to servers.
Moreover, depending on how your website is hosted, you will have no control over certain parameters. The most committed will leave their shared hosting and opt for a dedicated server.
This naturally has a cost which, from one site to another and from one theme to another, will be offset by the gains in rankings, the traffic generated and, we hope, the resulting revenue.