Magento 2 Google PageSpeed Insights Performance Optimisation Using Magepack

Magento 2 is a legacy platform with really bad performance, and fixing every issue is not possible because of the poor architecture of the system. It has problems in both the backend and the frontend. Adobe officially discontinued the legacy frontend in favor of ReactJS, and the backend should be replaced with Node.js / Java.

Magento 2 frontend performance really suffers because there are far too many lines of JS code handled by a homemade architecture: RequireJS with minification, bundling and static generation via PHP. However, the best practice for Magento 2 is not to follow the stupid Magento "best practices", and that's why some brave guys built a modern optimizer to replace the Magento core optimizer. It is far more flexible, faster, produces smaller bundles, and doesn't break on missing files. Magepack uses Babel and Node.js instead of the Magento PHP-based "best practices". The next step will be a rewrite of the M2 beast into Node.js microservices and ReactJS.

Magepack: Magento 2 Advanced Bundling

The goal of JavaScript bundling is to reduce the number and size of requested assets for each page loaded in the browser. To do that, Magepack builds the bundles so that each page in the store only needs to download a common bundle plus one page-specific bundle.

One way to achieve this is to define your bundles by page types. You can categorize Magento’s pages into several page types, including Category, Product, CMS, Customer, Cart, and Checkout. Each page categorized into one of these page types has a different set of RequireJS module dependencies. When you bundle your RequireJS modules by page type, you will end up with only a handful of bundles that cover the dependencies of any page in your store.

For example, you might end up with a bundle for the dependencies common to all pages, a bundle for CMS-only pages, a bundle for Catalog-only pages, another bundle for Search-only pages, and a bundle for Checkout pages.

A clean Magento installation can reach good enough performance by splitting bundles by page type, but some customizations may require deeper analysis and a different asset distribution.

Depending on your version, Magento 2 might need some patches installed:

  • For Magento 2.3.3 and earlier, 7 patches are required.
  • For Magento 2.3.4 and 2.3.5, 1 patch is required.
  • For Magento 2.4.0, no patches are required.

To install Magepack you simply need to run the npm command below; no composer require is needed for the bundler itself.

Install with npm:
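A minimal sketch of the installation, assuming a global install of the magepack package from npm (you can also install it locally in the project and run it via npx):

    npm install -g magepack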

You may run into an issue with Puppeteer, which Magepack uses under the hood to crawl the shop.

Puppeteer is a Node library which provides a high-level API to control Chrome or Chromium over the DevTools Protocol. Puppeteer runs headless by default, but can be configured to run full (non-headless) Chrome or Chromium.

To install Puppeteer in your project, run:
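For example (the default puppeteer package downloads a bundled Chromium during installation):

    npm install puppeteer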

After that, Chromium may still fail to launch on a minimal CentOS 7 / Amazon Linux 2 image because of missing system libraries.

Fix for CentOS 7 / Amazon Linux 2:
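A sketch of the usual fix, assuming the error is about missing Chromium shared libraries; the package list below follows Puppeteer's troubleshooting guide for CentOS, so adjust it to whatever libraries your error message actually names:

    sudo yum install -y alsa-lib atk cups-libs gtk3 ipa-gothic-fonts \
        libXcomposite libXcursor libXdamage libXext libXi libXrandr \
        libXScrnSaver libXtst pango xorg-x11-fonts-100dpi \
        xorg-x11-fonts-75dpi xorg-x11-fonts-cyrillic xorg-x11-fonts-misc \
        xorg-x11-fonts-Type1 xorg-x11-utils
    sudo yum update -y nss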

After that, no issues at all.

How to use it?

First: generating the bundler configuration

The first step is to run the generation against the existing, working shop. You can do it on any machine with access to the target shop, even your own computer. The goal here is to collect all of the RequireJS dependencies needed for a certain type of page layout. Currently, the following bundles are prepared:

  • cms, containing modules needed by CMS pages.
  • category, containing modules needed by category pages.
  • product, containing modules needed by product pages.
  • checkout, containing modules needed by cart and checkout pages.

In addition, there is the common bundle, created by extracting all modules needed by each of the above, which is loaded on every page.

Running the generator

Go to the Magento root directory and run:
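For example (the URLs are placeholders for pages of your own shop; see the Magepack README for additional options such as HTTP authentication):

    magepack generate \
        --cms-url="https://example.com/" \
        --category-url="https://example.com/some-category.html" \
        --product-url="https://example.com/some-product.html"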

There are 3 required options you need to pass:

- --cms-url: URL to one of the CMS pages (e.g. the homepage).

- --category-url: URL to one of the category pages.

- --product-url: URL to one of the product pages.

Note: Magepack will use the given product page, add its product to the cart and visit both cart and checkout pages to collect dependencies.

Running the above command will generate a magepack.config.js file, where you can find each of the prepared bundles with the list of modules that will be included in them.

You will see something like this:

[Image: Magepack bundler generation result]

Bundling

Once you have generated the bundler configuration, the next step is to trigger the actual bundling after the static content deploy stage has finished, by running the following in the shop root directory:
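For example, from the Magento root, once bin/magento setup:static-content:deploy has completed:

    magepack bundle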

This command will iterate over each deployed locale (excluding Magento/blank) and prepare bundles for each of them.

[Image: Magento 2 Magepack bundling command result]

Enabling Bundling

First, make sure the Magepack Magento module is installed; a Composer sketch follows below.
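This assumes the Composer package name published for the Magepack Magento module at the time of writing; verify it against the module's README:

    composer require creativestyle/magesuite-magepack
    bin/magento setup:upgrade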

What is left is to enable it via the admin panel under Stores -> Configuration -> Advanced -> Developer, or via the CLI:
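A sketch of the CLI route; the exact configuration path is an assumption and may differ between module versions:

    bin/magento config:set dev/js/enable_magepack_js_bundling 1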

and then clear the cache:
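For example:

    bin/magento cache:flush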

Now the shop should be way faster than before.

Here is how the page content breaks down before and after optimization on a live production Magento 2 site hosted on AWS.

Before, with the native trashy JS bundling disabled:

[Image: content breakdown before optimization]

After:

[Image: content breakdown after optimization]

The downloadable JS content was reduced from 6,093,573 to 639,630, which is almost a tenfold reduction.

Web Vitals improvements on desktop:

Before Mage Pack:

[Image: desktop Web Vitals before Magepack]

After Mage Pack:

[Image: desktop Web Vitals after Magepack]

We can see improvements, although they are not that big in terms of Web Vitals.

What are Web Vitals?

Optimizing for quality of user experience is key to the long-term success of the Magento 2 website. Whether you’re a business owner, agency, or developer, Web Vitals can help you quantify the experience of your site and identify opportunities to improve.

Web Vitals is an initiative by Google to provide unified guidance for quality signals that are essential to delivering a great user experience on the web.

Google has provided a number of tools over the years to measure and report on performance. Some developers are experts at using these tools, while others have found the abundance of both tools and metrics challenging to keep up with.

Site owners should not have to be performance gurus in order to understand the quality of experience they are delivering to their users. The Web Vitals initiative aims to simplify the landscape, and help sites focus on the metrics that matter most, the Core Web Vitals.

Core Web Vitals

Core Web Vitals are the subset of Web Vitals that apply to all web pages, should be measured by all site owners, and will be surfaced across all Google tools. Each of the Core Web Vitals represents a distinct facet of the user experience, is measurable in the field, and reflects the real-world experience of a critical user-centric outcome.

The metrics that make up Core Web Vitals will evolve over time. The current set for 2020 focuses on three aspects of the user experience — loading, interactivity, and visual stability — and includes the following metrics (and their respective thresholds):

  • Largest Contentful Paint (LCP): measures loading performance. To provide a good user experience, LCP should occur within 2.5 seconds of when the page first starts loading.
  • First Input Delay (FID): measures interactivity. To provide a good user experience, pages should have a FID of less than 100 milliseconds.
  • Cumulative Layout Shift (CLS): measures visual stability. To provide a good user experience, pages should maintain a CLS of less than 0.1.

For each of the above metrics, to ensure you’re hitting the recommended target for most of your users, a good threshold to measure is the 75th percentile of page loads, segmented across mobile and desktop devices.

Largest Contentful Paint (LCP) is an important, user-centric metric for measuring perceived Magento load speed because it marks the point in the page load timeline when the page’s main content has likely loaded — a fast LCP helps reassure the user that the page is useful.

Historically, it’s been a challenge for web developers to measure how quickly the main content of a web page loads and is visible to users.

Older metrics like load or DOMContentLoaded are not good because they don’t necessarily correspond to what the user sees on their screen. And newer, user-centric performance metrics like First Contentful Paint (FCP) only capture the very beginning of the loading experience. If a page shows a splash screen or displays a loading indicator, this moment is not very relevant to the user.

In the past, Google recommended performance metrics like First Meaningful Paint (FMP) and Speed Index (SI) (both available in Lighthouse) to help capture more of the loading experience after the initial paint, but these metrics are complex, hard to explain, and often wrong, meaning they still do not identify when the main content of the page has loaded.

What is LCP?

The Largest Contentful Paint (LCP) metric reports the render time of the largest image or text block visible within the viewport. In a Magento store this is usually the main banner on the home page or the main product image on a product page.

What is TBT?

Total Blocking Time (TBT) is an important lab metric for measuring load responsiveness because it helps quantify the severity of how non-interactive a page is prior to it becoming reliably interactive — a low TBT helps ensure that the page is usable.

The Total Blocking Time (TBT) metric measures the total amount of time between First Contentful Paint (FCP) and Time to Interactive (TTI) where the main thread was blocked for long enough to prevent input responsiveness.

The main thread is considered “blocked” any time there’s a Long Task — a task that runs on the main thread for more than 50 milliseconds (ms). We say the main thread is “blocked” because the browser cannot interrupt a task that’s in progress. So in the event that a user does interact with the page in the middle of a long task, the browser must wait for the task to finish before it can respond.

If the task is long enough (e.g. anything above 50 ms), it’s likely that the user will notice the delay and perceive the page as sluggish or janky.

The blocking time of a given long task is its duration in excess of 50 ms, and the total blocking time for a page is the sum of the blocking time of each long task that occurs between FCP and TTI. For example, three tasks of 80 ms, 120 ms and 40 ms running between FCP and TTI contribute 30 + 70 + 0 = 100 ms of TBT.

Google PageSpeed Insights results for Magento on mobile.

Before:

[Image: mobile PageSpeed Insights score before optimization]

After:

[Image: mobile PageSpeed Insights score after optimization]

As you can see, no extension can fix Magento 2's bad performance and reach even 50 points on mobile; you can only improve it slightly.

Description of the metrics:

First Contentful Paint:

First Contentful Paint marks the time at which the first text or image is painted.

Speed Index:

Speed Index shows how quickly the contents of a page are visibly populated.

Largest Contentful Paint:

Largest Contentful Paint marks the time at which the largest text or image is painted.

Time to Interactive:

Time to interactive is the amount of time it takes for the page to become fully interactive.

Total Blocking Time:

Sum of all time periods between FCP and Time to Interactive, when task length exceeded 50ms, expressed in milliseconds.

Cumulative Layout Shift:

Cumulative Layout Shift measures the movement of visible elements within the viewport.
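All of the lab metrics above can be collected locally with the Lighthouse CLI. A minimal sketch, with example.com standing in for your own store URL:

    npx lighthouse https://example.com/ \
        --only-categories=performance \
        --preset=desktop \
        --output=json --output-path=./lighthouse-report.json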

Magento bundling reduces the number of connections per page, which is not a problem for HTTP/2 and HTTP/3 anyway. But for each page request it loads a lot of JS code that the browser has to evaluate, even when the requested page may only execute a few lines of that code. Performance only improves after the browser has cached the bundles.

Following Magento documentation blindly would have negatively impacted our site’s performance. Since the different components of our sites — like the swatches, fotorama image gallery, mini cart, checkout — are all rendered on the client-side, letting RequireJS load our main JavaScript file programmatically would have left our users staring at a blank layout for a longer time.

So the number of server requests is not the main issue. Merely reducing the number of server requests and minifying will not save your eCommerce business from an expensive failure.

The main issue is the amount of JS code and its evaluation and execution time. When you bundle JS you obfuscate the per-script CPU usage statistics. Here is where Magento really spends its CPU time:

  • requirejs/require.js
  • jquery.js
  • mage/requirejs/text.js
  • fotorama/fotorama.js

As we can see, the main performance cost comes from the RequireJS script itself.

RequireJS has an optimization tool, r.js, which can help you minify and concatenate your modules. It has a lot of options and can be difficult to use, but it gets easier with a build tool like Grunt or (especially) Yeoman, which uses Grunt to build.
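Outside of Magento, a bare-bones r.js run looks something like this; build.js is a hypothetical build profile that lists the modules to optimize:

    npm install -g requirejs
    r.js -o build.js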

In both, you can use the requirejs task (which optimizes modules), but again Yeoman is a bit easier since it has generators that configure it all for you.

In your index.html you just use a comment block to specify which JS files should be minified/concatenated into which output file.

The modules listed in that comment block will be concatenated into ONE file, for example scripts/main.js.

This is done by executing grunt build from the command line. It starts a lot of useful tasks and builds the project into the dist folder, but again this is highly adaptable.

The resulting dist folder contains (if you want) only one JavaScript file.

Use Yeoman to make life easier (at least for handling minification/concatenation).

Another website test:

Magento 2 default junky JS bundling:

[Image: test results with default Magento 2 JS bundling]

Magento 2 bundling disabled, with asynchronous RequireJS AMD includes:

[Image: test results with Magento 2 bundling disabled]

We can see less JS being downloaded, but really bad Web Vitals.

Magepack Magento Bundling

[Image: test results with Magepack bundling]

As we can see, it doesn't change the size of the JS much. The sheer quantity of JS that Magento ships is the critical issue. To me, only rewriting all of that functionality in ReactJS, or even better Preact, will fix it.

React 16 is actually smaller compared to 15.6.1!

  • react is 5.3 kb (2.2 kb gzipped), down from 20.7 kb (6.9 kb gzipped).
  • react-dom is 103.7 kb (32.6 kb gzipped), down from 141 kb (42.9 kb gzipped).
  • react + react-dom is 109 kb (34.8 kb gzipped), down from 161.7 kb (49.8 kb gzipped).

Most UI frameworks are large enough to be the majority of an app’s JavaScript size. Preact is different: it’s small enough that your code is the largest part of your application.

That means less JavaScript to download, parse and execute — leaving more time for your code, so you can build an experience you define without fighting to keep a framework under control. That's why we need Preact.

preact/compat is Preact's compatibility layer that allows you to leverage the many libraries of the React ecosystem and use them with Preact. It adds somewhere around 2 kb to your bundle size, but has the advantage of supporting the vast majority of existing React modules you might find on npm.
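A minimal sketch of pulling Preact into a project; the actual aliasing of react and react-dom to preact/compat happens in your bundler configuration (webpack, Rollup, etc.) and is not shown here:

    npm install preact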

