Web Optimization Training Camp: Make Your Web Page 50 Times Faster

Preface

Using a complete example, we will optimize loading, rendering, and other aspects of the user experience step by step.

Getting started

First, let's look at the file composition of the project.

It consists of a basic web page skeleton, JS (a React app), CSS, and images.

The related resources can be found at https://github.com/joesonw/web-accelerate-example

Let's first look at how the whole page is served.

server.js

'use strict';

const fs = require('fs');
const path = require('path');
const koa = require('koa');
const app = koa();

app.use(function* (next) {
  // Map the request path to a file under ./dist, falling back to index.html.
  const file = this.path.slice(1) || 'index.html';
  try {
    const content = yield cb => fs.readFile(path.resolve('./dist', file), cb);
    this.body = content;
    this.type = path.extname(file).slice(1);
    this.status = 200;
  } catch (e) {
    this.status = 404;
  }
  yield next;
});

app.listen(process.env.PORT || 3000);

This code simply serves the files in the dist directory.

When you open the web page, you can see the resources being loaded.

As we can see, app.js alone is 277KB; on a simulated 3G network (blue box), the whole load takes 999ms, of which the download accounts for 911ms (red box).

Next we will optimize step by step and compare the results after each change.

Optimization (1): 304 Not Modified

The most common technique in web loading optimization is 304 Not Modified. The mechanism is as follows: the browser sends a request whose headers include If-Modified-Since (the header is absent if nothing is cached yet); the server compares that time with the last-modified time of the file on disk (or in memory). If the file has not been modified since that time (its last-modified time is less than or equal to the time in the header), the server returns 304; otherwise it returns 200 and adds a Last-Modified header, telling the client that on the next request it can ask whether its cached copy is still valid.

The specific code is as follows:

app.use(function* () {
  const file = path.resolve(__dirname, path.resolve('dist', this.path.slice(1) || 'index.html'));
  let ifLastModified = this.headers['if-modified-since'];
  if (ifLastModified) {
    ifLastModified = new Date(ifLastModified);
  }
  try {
    const stat = yield cb => fs.stat(file, cb);
    // index.html is always served fresh; everything else may be answered with 304.
    if (ifLastModified &&
      file !== path.resolve(__dirname, path.resolve('dist/index.html'))) {
      if (ifLastModified >= stat.mtime) {
        this.status = 304;
        return;
      }
    }
    const content = yield cb => fs.readFile(file, cb);
    this.body = content;
    this.type = path.extname(file).slice(1);
    this.status = 200;
    this.set('Last-Modified', stat.mtime);
  } catch (e) {
    this.status = 404;
  }
});

(To mimic the real world, where the home page is generated dynamically with ads, tracking, or personalized data, index.html itself is not cached.)

Final effect:

We can see that the download now takes 2ms, which is practically negligible (only the HTTP headers are transferred), and the total load time is only 120ms, 869ms less than before.

But are we satisfied yet?

Optimization (2): separate bundles

Notice that everything is currently packed into a single JS file. The dependencies (in this case only react and react-dom) rarely change, yet every modification to our own code forces the whole JS file to be re-requested. So we want to extract the libraries (and even the common code modules within the project) into separate bundles.

First we need to create a webpack.vendors.config.js to build these libraries, or vendors.

const path = require('path');
const WebpackCleanupPlugin = require('webpack-cleanup-plugin');
const HtmlWebpackPlugin = require('html-webpack-plugin');
const webpack = require('webpack');
const ExtractTextPlugin = require('extract-text-webpack-plugin');

module.exports = {
  plugins: [
    new webpack.DefinePlugin({
      'process.env': {
        NODE_ENV: '"production"',
      },
    }),
    new webpack.optimize.OccurenceOrderPlugin(),
    new webpack.optimize.UglifyJsPlugin({
      compress: {
        warnings: false,
        screw_ie8: true,
        drop_console: true,
        drop_debugger: true,
      },
    }),
    // DllPlugin emits a manifest for each vendor bundle so that the main
    // build can reference it instead of re-bundling the libraries.
    new webpack.DllPlugin({
      path: path.resolve(__dirname, 'dist/vendor/[name]-manifest.json'),
      name: '[name]',
      context: '.',
    }),
  ],
  devtool: 'hidden-source-map',
  entry: {
    'react': ['react', 'react-dom'],
  },
  output: {
    path: path.resolve(__dirname, 'dist/vendor'),
    filename: '[name].js',
    library: '[name]',
  },
};

Note that

entry: {
  'react': ['react', 'react-dom'],
},

means that packages of the same kind are bundled together into a single JS file (here, react and react-dom end up in react.js).
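For example, if the project also depended on a few utility libraries (lodash and moment here are hypothetical, purely for illustration), they could be grouped into a second vendor bundle by adding another entry key:

const path = require('path');

module.exports = {
  // ...plugins, devtool and output exactly as shown above...
  entry: {
    'react': ['react', 'react-dom'],
    'utils': ['lodash', 'moment'], // hypothetical example group
  },
};

Each entry key produces its own dist/vendor/<name>.js plus a <name>-manifest.json, and the scan in webpack.production.js below picks them all up automatically.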

Of course, we also need to make some modifications to webpack.production.js.

// In webpack.production.js (fs, path and webpack are assumed to be required
// at the top of the file as usual).
const dlls = fs.readdirSync(path.resolve(__dirname, 'dist/vendor/'))
  .filter(file => path.extname(file) === '.js')
  .map(file => path.basename(file, '.js'));

const dllReferencePlugins = dlls
  .map(dll =>
    new webpack.DllReferencePlugin({
      context: '.',
      manifest: require(`./dist/vendor/${dll}-manifest.json`),
    })
  );

module.exports = {
  plugins: dllReferencePlugins.concat([
    // ...the existing plugins...
  ]),
};

Here we automatically scan the files in the vendor directory and reference every vendor bundle automatically.

With that, we have implemented split-bundle loading (a few other details were also revised, including index.html; see the step-2 branch on GitHub).

The result is quite good: app.js alone now loads in a little over 400ms, cutting the load time by at least half.

For most websites this optimization already achieves very good results, but for large websites there is still a lot we can do.

Optimization (3): forced caching

Notice that even after optimization (1), a 304 request still takes more than 100 milliseconds; for a large site with many resources, that is not a negligible cost. Can we save it as well? The answer is yes.

Browser caching offers a special header, Expires, which specifies an expiration time for a file; until that moment the browser will not issue the request again, but reads the file directly from its local cache.
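Setting such an expiry in the Koa server might look roughly like this (a minimal sketch, not code from the example repository; the one-year lifetime and the index.html check are illustrative assumptions):

// Register this before the file-serving middleware shown earlier.
const ONE_YEAR = 365 * 24 * 60 * 60; // seconds

app.use(function* (next) {
  yield next; // let the file-serving middleware run first
  if (this.status === 200 && this.path !== '/' && this.path !== '/index.html') {
    this.set('Cache-Control', `max-age=${ONE_YEAR}`);
    this.set('Expires', new Date(Date.now() + ONE_YEAR * 1000).toUTCString());
  }
});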

However, once the cache expires, the resource still has to be requested again. So what should we do? The answer: set an extremely long cache time, ten years for example. But then we could never update anything. How can we take advantage of this behavior and still update easily?

We can add a content hash to the file name, so a file is re-downloaded only when its content actually changes. This also suits distributed CDNs and non-overwriting releases: because old and new versions of a resource coexist under different names, servers that have already been updated serve pages referencing the new resources, while servers that have not yet been updated keep referencing the old ones, and nothing breaks during the rollout. Releases no longer require staying up late.
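With webpack, one way to get a content hash into the file name is the [chunkhash] placeholder in the output filename (a sketch of the idea only; the actual changes live in the step-3 branch):

const path = require('path');

// Sketch: emit content-hashed bundle names (e.g. app.<hash>.js),
// so a file's URL changes only when its content changes.
module.exports = {
  // ...entry, module and plugins as in webpack.production.js...
  output: {
    path: path.resolve(__dirname, 'dist'),
    filename: '[name].[chunkhash].js',
  },
};

Since index.html is generated dynamically and never force-cached (see optimization 1), it can always reference the latest hashed file names, for instance via html-webpack-plugin.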

The details of the changes can be seen in the step-3 branch.

The result: as the blue box shows, the cache has taken effect, and the overall load time is only 20 milliseconds.

From the original 1000 milliseconds down to 20 milliseconds: three simple steps can make your web page load 50 times faster.

Extended reading

1. In actual production, we usually see assets loaded from a separate CDN domain. Why?

Because on a large website every request carries a lot of cookies, sometimes close to 1KB each; with 100 images to load, that is a full 100KB of header overhead. By serving assets from a third-party domain (different from the current one), we avoid sending all of those unnecessary request headers and cookies, which also helps speed things up.

2. Another situation is that resources are spread across different servers.

This is because browsers limit the number of concurrent downloads of resources under the same domain name.

Using several resource domains avoids this restriction and increases download concurrency. However, this raises the usual cache hit rate problem: the same resource cached under one domain will be downloaded again if it is requested from another, so resources need to be assigned to domains consistently.
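One concrete way to point all built asset URLs at a separate, cookie-free static domain is webpack's output.publicPath (the domain cdn.example.com below is purely hypothetical):

const path = require('path');

module.exports = {
  // ...entry, module and plugins as before...
  output: {
    path: path.resolve(__dirname, 'dist'),
    filename: '[name].[chunkhash].js',
    // Every generated asset URL will be prefixed with this domain.
    publicPath: 'https://cdn.example.com/assets/',
  },
};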

3. Other methods

With the rapid development of technology, there are still many technologies that can enhance the experience of end-users.

BigPipe + server-side rendering to speed up first-page loading (see the sketch after this list)

Google AMP

HTTP/2
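As a taste of server-side rendering, here is a minimal, purely illustrative sketch (it is not part of the example repository, and assumes an App component is exported from a hypothetical CommonJS build at ./dist/app):

const React = require('react');
const ReactDOMServer = require('react-dom/server');
const App = require('./dist/app'); // hypothetical CommonJS build of the React app

app.use(function* (next) {
  if (this.path === '/' || this.path === '/index.html') {
    // Render the initial HTML on the server so the user sees content
    // before app.js has finished downloading and executing.
    const markup = ReactDOMServer.renderToString(React.createElement(App));
    this.body = '<!DOCTYPE html><html><body>' +
      `<div id="root">${markup}</div>` +
      '<script src="/app.js"></script>' +
      '</body></html>';
    this.type = 'html';
    return;
  }
  yield next;
});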
