Abstract: Whether at work or in an interview, optimizing web front-end performance is important. So where should we start optimizing? You can follow Yahoo's 34 rules for front-end optimization, though the list has since grown to 35, so it could now be called Yahoo's 35 rules for front-end optimization. The rules are grouped into categories, which is helpful: it gives us a clearer direction for optimization.
Content part

1. Minimize the number of HTTP requests

80% of end-user response time is spent on the front end, and most of that time goes to downloading the components on the page: images, stylesheets, scripts, Flash, and so on. Reducing the number of components naturally reduces the number of HTTP requests the page makes. This is the key to making your page faster.
One way to reduce the number of page components is to simplify the page design. But is there a way to build complex pages while speeding up response times? Well, there is indeed a way to have your cake and eat it too.
Merging files reduces the number of requests by putting all scripts into one file; you can likewise merge all CSS into one file. Merging can be a cumbersome task when every page uses different scripts and styles, but doing it as part of the site's release process genuinely improves response times.

CSS Sprites are the preferred way to reduce the number of image requests: combine all background images into a single image, then use the CSS background-image and background-position properties to display just the part you need.

Image maps combine multiple images into a single image of the same total size while reducing the number of requests, which speeds up page loading. Image maps are only useful when the images are contiguous on the page, such as in a navigation bar. Defining the coordinates of an image map is tedious and error-prone, and using image maps for navigation is not easy, so this approach is not recommended.

Inline images (Base64 encoded) use the data: URL scheme to embed images directly in the page. This increases the size of the HTML document. Putting inline images in a (cached) stylesheet is a good idea and avoids making the pages themselves heavier. However, current mainstream browsers do not support inline images well.

Reducing the number of HTTP requests a page makes is where you start. It is the most important guideline for improving the speed of a first visit to your site.
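As a rough illustration of the CSS sprite technique described above, a sketch might look like the following (the class names, file name, sizes, and offsets are invented for this example):

/* Sketch: one combined background image, positioned per icon. */
.icon {
    background-image: url('sprites.png');
    background-repeat: no-repeat;
    width: 16px;
    height: 16px;
}
.icon-home   { background-position: 0 0; }      /* first 16x16 tile */
.icon-search { background-position: -16px 0; }  /* next tile to the right */

Each element then shows only its own slice of the single downloaded image, so many icons cost one HTTP request.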
2. Reduce DNS lookups

The Domain Name System maps hostnames to IP addresses, much as a phone book maps names to phone numbers. When you type www.yahoo.com into the browser, the browser contacts a DNS resolver, which returns the server's IP address. DNS has a cost: it typically takes 20 to 120 milliseconds to look up the IP address for a given hostname. The browser cannot download anything from that hostname until the DNS lookup completes.
DNS lookups are cached for better performance, either on a special caching server operated by the user's ISP (Internet Service Provider) or local network, or on the individual user's computer. DNS information is kept in the operating system's DNS cache (the DNS Client service on Microsoft Windows). Most browsers also have their own cache, independent of the operating system's. As long as the browser holds a DNS record in its own cache, it will not ask the operating system to resolve it again.
By default, IE caches DNS lookups for 30 minutes, controlled by the DnsCacheTimeout registry setting. Firefox caches them for 1 minute, controlled by the network.dnsCacheExpiration configuration item. (Fasterfox, a speed-up plug-in for Firefox, raises this cache time to 1 hour.)
If the client's DNS cache is empty (both the browser's and the operating system's), the number of DNS lookups equals the number of unique hostnames on the page, counting the hostnames used by the page URL, images, script files, stylesheets, Flash objects, and other components. Reducing the number of unique hostnames reduces the number of DNS lookups.
Reducing the number of different hostnames also reduces the number of components that a page can download in parallel. Avoiding DNS lookups reduces response time, while reducing the number of parallel downloads increases response time. My rule is to spread the components across 2 to 4 hostnames, which is a compromise between reducing DNS lookups and allowing for high concurrent downloads.
3. Avoid redirects

Redirects use the 301 and 302 status codes. Here is an HTTP header with a 301 status code:
HTTP/1.1 301 Moved Permanently
Location: http://example.com/newuri
Content-Type: text/html
The browser automatically takes the user to the URL given in the Location field. All the information needed for the redirect is in the headers, and the response body is usually empty. In practice, neither a 301 nor a 302 response is cached unless additional headers such as Expires or Cache-Control indicate that it should be. There are other ways to redirect, such as the meta refresh tag and JavaScript, but if you must redirect, it's best to use a standard 3xx HTTP status code, mainly so that the back button keeps working properly.
Keep in mind that redirects can slow down the user experience. Inserting a redirect between the user and the HTML document will delay everything on the page. The page won't render and components won't start downloading until the HTML document is served to the browser.
There is one common redirect that is extremely wasteful, and web developers are generally unaware of it: the URL that is missing its trailing slash. For example, requesting http://astrology.yahoo.com/astrology returns a 301 response that redirects to http://astrology.yahoo.com/astrology/ (note the trailing slash). In Apache, you can use the Alias, mod_rewrite, or DirectorySlash directives to eliminate this unnecessary redirect.
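Purely as a sketch of the DirectorySlash option (the directory path below is illustrative, and turning the redirect off has side effects for relative URLs inside that directory, so test carefully):

# Sketch only: stop mod_dir from issuing the trailing-slash 301
# for a specific directory.
<Directory "/var/www/html/astrology">
    DirectorySlash Off
</Directory>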
The most common use of redirects is to connect an old site to a new one. Redirects are also used to connect different parts of the same site and to route the user based on certain conditions (browser type, account type, and so on). Using a redirect to connect two websites is simple and requires only a little extra code. Although redirects reduce development complexity in these situations, they degrade the user experience. An alternative is to use Alias and mod_rewrite, provided both code paths live on the same server. If redirects are being used because a domain name changed, you can instead create a CNAME (a DNS record that makes one domain name an alias for another) combined with the Alias or mod_rewrite directive.
4. Make Ajax cacheable

One of the benefits of Ajax is that it gives the user immediate feedback, because it requests information from the backend server asynchronously. However, using Ajax is no guarantee that the user won't be left waiting for those asynchronous JavaScript and XML responses to come back. In many applications, how long the user is willing to wait depends on how Ajax is used. For example, in a web-based email client the user will sit and wait for the result of an Ajax request that finds the messages matching their search criteria. It's important to remember that asynchronous does not mean instantaneous.
To improve performance, it is crucial to optimize these Ajax responses. The most important way to speed up Ajax is to make the responses cacheable, as discussed under Add Expires or Cache-Control HTTP headers. Several of the other rules also apply to Ajax: gzip components, reduce DNS lookups, minify JavaScript, avoid redirects, and configure ETags.
Let's look at an example of a Web 2.0 email client using Ajax to download the user's address book for autocomplete functionality. If the user has not modified her address book since the last use, and the Ajax response is cacheable and has an Expires or Cache-Control HTTP header that has not expired, then the previous address book can be read from the cache. The browser must be informed whether it should continue to use the previously cached address book response or request a new one. This can be achieved by adding a timestamp to the Ajax URL of the address book indicating the last modification time of the user's address book, for example &t=1190241612. If the address book has not been modified since the last download and the timestamp remains unchanged, the address book will be read directly from the browser cache, thus avoiding an additional HTTP round trip. If the user has modified the address book, the timestamp also ensures that the new URL will not match the cached response and the browser will request the new address book entry.
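A minimal JavaScript sketch of that idea (the endpoint, parameter names, and timestamp value are invented for illustration):

// Sketch: include the address book's last-modified timestamp in the URL,
// so an unchanged book is served from cache and a changed one is re-fetched.
var lastModified = 1190241612; // illustrative value, supplied by the server
var xhr = new XMLHttpRequest();
xhr.open('GET', '/addressbook?user=me&t=' + lastModified, true);
xhr.onreadystatechange = function () {
    if (xhr.readyState === 4 && xhr.status === 200) {
        // use xhr.responseText (the address book) for autocomplete
    }
};
xhr.send();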
Even though Ajax responses are created dynamically and may only be available to a single user, they can be cached, which will make your Web 2.0 application faster.
5. Lazy-load components

Take a close look at the page and ask yourself: what is absolutely required to render the page initially? Everything else can wait.
JavaScript is an ideal candidate for splitting into what loads before and after the onload event. For example, if you have JavaScript code and libraries that implement drag-and-drop and animations, those can wait, because dragging an element only happens after the page has initially rendered. Other candidates for lazy loading include hidden content (content that only appears after a user interaction) and images below the fold.
Tools can reduce the workload: YUI Image Loader lets you lazy-load images below the fold, and the YUI Get utility is an easy way to pull in JS and CSS on the fly. The Yahoo! homepage is an example; open Firebug's Net panel and take a closer look.
It's best to align performance goals with other web development best practices, such as progressive enhancement. If the client supports JavaScript, the user experience can be improved, but you must ensure that the page works properly when JavaScript is not supported. So, once you’re sure your page is working properly, you can enhance it with some lazy-loading scripts to support some fancy effects like drag-and-drop and animations.
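A minimal sketch of post-onload loading (the file name is made up; YUI Get or another loader could do the same job):

// Sketch: fetch a non-critical script only after the page has rendered.
window.onload = function () {
    var script = document.createElement('script');
    script.src = '/js/dragdrop-effects.js'; // illustrative path
    document.getElementsByTagName('head')[0].appendChild(script);
};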
6. Preload components
Preloading may seem like the opposite of lazy loading, but it actually has different goals. By preloading components, you can make full use of the browser's idle time to request components (images, styles, and scripts) that will be used in the future. When the user accesses the next page, most of the components are already in the cache, so the page will load faster from the user's perspective.
In practice, there are several types of preloading: unconditional preload (as soon as onload fires, go fetch additional components), conditional preload (based on a user action, guess where the user is headed next and preload accordingly), and anticipated preload (preload the components of a redesigned page before the redesign launches).
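For example, unconditional preloading might look roughly like this sketch (the image URL is invented for illustration):

// Sketch: once the current page has loaded, quietly warm the cache with
// a component the next page is expected to need.
window.onload = function () {
    var img = new Image();
    img.src = '/images/next-page-sprite.png'; // illustrative path
};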
7. Reduce the number of DOM elements

A complex page means more bytes to download, and it also means slower DOM access from JavaScript. For example, when you want to add an event handler, there is a difference between looping through 500 DOM elements and looping through 5,000.
A large number of DOM elements is a sign that there is some irrelevant markup on the page that needs to be cleaned up. Are you using nested tables for layout? Or did you add a bunch of <div>s just to fix the layout problem? Perhaps better semantic markup should be used.
YUI CSS utilities are very helpful for layout: grids.css is for the overall layout, and fonts.css and reset.css can be used to remove the browser's default formatting. This is a good opportunity to start cleaning up and thinking about your markup, such as only using <div> when it makes sense semantically, not because it renders a newline.
The number of DOM elements is easy to test, just type in the Firebug console:
document.getElementsByTagName('*').length

So how many DOM elements are too many? Check other similar, well-marked-up pages for comparison. The Yahoo! homepage, for example, is a fairly busy page, yet it has fewer than 700 elements (HTML tags).
8. Split components across domains

Splitting components lets you maximize parallel downloads, but make sure you use no more than 2-4 domains because of the DNS lookup penalty. For example, you can host the HTML and dynamic content at www.example.org and split static components between static1.example.org and static2.example.org.
9. Use iframes sparingly

An iframe lets you insert one HTML document inside a parent document. It is important to understand how iframes work so you can use them effectively.
Advantages of <iframe>:
- Helps with slow third-party content such as badges and ads
- Acts as a security sandbox
- Lets you download scripts in parallel
Disadvantages of <iframe>:
- Costly even if blank
- Blocks the page's onload event
- Non-semantic
10. Avoid 404s

HTTP requests are expensive, so there is no point in making an HTTP request only to get a useless response (such as 404 Not Found). It slows the user experience down without any benefit.
Some sites have helpful 404 pages ("Did you mean xxx?"), which is good for the user experience but still wastes server resources (database queries and so on). Particularly bad is when a link to an external JavaScript file is wrong and the result is a 404: first, the download blocks parallel downloads; second, the browser will try to parse the body of the 404 response as JavaScript, looking for something usable in it.
CSS part

11. Avoid CSS expressions

Using CSS expressions to set CSS properties dynamically is powerful but dangerous. They were supported from IE5 and deprecated from IE8. The danger is that expressions are re-evaluated far more often than most people expect, including on scroll, resize, and even mouse movement. For example, you can use a CSS expression to alternate the background color by the hour:
background-color: expression( (new Date()).getHours()%2 ? "#B8D4FF" : "#F08A00" );

12. Choose <link> over @import
A best practice was mentioned earlier: in order to achieve progressive rendering, CSS should be placed at the top.
In IE, @import behaves the same as putting <link> at the bottom of the page, so it's best not to use it.
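In other words, prefer the first form below to the second (file name illustrative):

<!-- Preferred -->
<link rel="stylesheet" type="text/css" href="styles.css">

<!-- Avoid -->
<style>
  @import url("styles.css");
</style>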
13. Avoid filters

IE's proprietary AlphaImageLoader filter can be used to fix the rendering of semi-transparent PNGs in versions before IE7. While an image is loading, this filter blocks rendering and freezes the browser, it increases memory consumption, and it is applied per element rather than per image, so the problems multiply.
The best approach is simply not to use AlphaImageLoader at all and to degrade gracefully to PNG8 images, which IE supports well. If you must use AlphaImageLoader, use the underscore hack (_filter) so that users of IE7 and later are not affected.
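A sketch of the underscore hack (the selector and file name are illustrative); only IE6 and earlier apply properties prefixed with an underscore, so IE7 and later ignore the filter:

/* Sketch: serve the PNG normally, and hand the AlphaImageLoader filter
   only to IE6 and below via the underscore prefix. */
.logo {
    background: url(logo.png) no-repeat;  /* modern browsers, IE7+ */
    _background: none;                    /* IE6: drop the broken PNG background */
    _filter: progid:DXImageTransform.Microsoft.AlphaImageLoader(src='logo.png', sizingMethod='crop');
}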
14. Put stylesheets at the top

While researching performance at Yahoo!, we discovered that putting stylesheets in the HEAD section of the document makes pages appear to load faster. This is because putting stylesheets in the head allows the page to render progressively.
Front-end engineers who are concerned about performance want the page to render incrementally. In other words, we want the browser to display existing content as quickly as possible, which is particularly important when there is a lot of content on the page or when the user's Internet connection is very slow. The importance of showing feedback to users (such as progress indicators) has been widely studied and documented. In our case, the HTML page is the progress indicator! When the browser gradually loads the page header, navigation bar, top logo, etc., these are used as feedback by users who are waiting for the page to load, which can improve the overall user experience.
JS part

15. Remove duplicate scripts

Including the same script file twice on a page hurts performance more than you might think. A review of the ten top US websites found that two of them contained a duplicated script. Two main factors increase the odds of a script being duplicated on a single page: team size and the number of scripts. Duplicate scripts create unnecessary HTTP requests, waste time executing the same JavaScript again, and hurt page performance.
IE generates unnecessary HTTP requests, but Firefox does not. In IE, if a non-cacheable external script is introduced twice by the page, it will generate two HTTP requests when the page loads. Even if the script is cacheable, it will generate additional HTTP requests when the user reloads the page.
In addition to generating meaningless HTTP requests, evaluating the script multiple times wastes time. Because regardless of whether the script is cacheable or not, redundant JavaScript code will be executed in Firefox and IE.
One way to avoid accidentally importing the same script twice is to implement a script management module in the template system. A typical way to introduce scripts is to use SCRIPT tags in HTML pages:
<script type="text/javascript" src="menu_1.0.17.js"></script>
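The article's suggestion is to handle deduplication in the templating system; purely as an illustration of the idea, a client-side sketch might look like this (the function and variable names are invented):

// Sketch: remember which external scripts have been inserted and skip repeats.
var insertedScripts = {};
function insertScriptOnce(src) {
    if (insertedScripts[src]) {
        return; // already on the page: no extra request, no re-evaluation
    }
    insertedScripts[src] = true;
    var script = document.createElement('script');
    script.type = 'text/javascript';
    script.src = src;
    document.getElementsByTagName('head')[0].appendChild(script);
}
insertScriptOnce('menu_1.0.17.js');
insertScriptOnce('menu_1.0.17.js'); // ignored the second time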
16. Minimize DOM access

Accessing DOM elements with JavaScript is slow, so to make pages more responsive you should: cache references to elements you have already accessed, update nodes "offline" and then add them back to the tree, and avoid fixing layout problems with JavaScript. The first two points are sketched below.
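A small sketch of the cached-reference and "offline update" ideas (the element id and item count are illustrative):

// Sketch: build nodes in a detached fragment, then touch the live DOM once.
var list = document.getElementById('results');   // cache the reference
var fragment = document.createDocumentFragment();
for (var i = 0; i < 100; i++) {
    var item = document.createElement('li');
    item.appendChild(document.createTextNode('Item ' + i));
    fragment.appendChild(item);
}
list.appendChild(fragment);                       // a single DOM insertion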
17. Develop smart event handlers

Sometimes a page feels unresponsive because too many event handlers, executed too frequently, have been attached to different elements of the DOM tree. That is why event delegation is recommended: if there are 10 buttons inside a div, attach one event handler to the div container instead of one handler per button. Events bubble up, so you can catch the event at the container and determine which button it originated from.
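A minimal sketch of event delegation (the container id is made up for this example):

// Sketch: one handler on the container instead of one per button.
document.getElementById('toolbar').onclick = function (event) {
    event = event || window.event;                 // old-IE compatibility
    var target = event.target || event.srcElement; // the element that was clicked
    if (target.tagName === 'BUTTON') {
        // handle the click for this particular button, e.g. based on target.id
    }
};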
18. Put scripts at the bottom

Scripts block parallel downloads. The HTTP/1.1 specification suggests that browsers download no more than two components in parallel per hostname, so if a page's images are spread across multiple hostnames it can get more than two parallel downloads. While a script is downloading, however, the browser won't start any other downloads, even from a different hostname.
Sometimes, it's not easy to move the script to the bottom. For example, if the script is inserted into the page content using document.write, there is no way to move it further down. There may also be scoping issues, which in most cases can be resolved.
A common suggestion is to use deferred scripts. The DEFER attribute indicates that the script does not use document.write, and it is a hint to the browser that it can keep rendering. Unfortunately, Firefox does not support the DEFER attribute. In IE the script may be deferred, but not necessarily as much as you'd like. If a script can be deferred, though, it can also be moved to the bottom of the page, and that will make the page load faster.
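For browsers that do honor it, a deferred script is declared like this (file name reused from the earlier example):

<!-- Sketch: the script is fetched without blocking rendering and must not
     call document.write -->
<script type="text/javascript" src="menu_1.0.17.js" defer></script>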
JavaScript and CSS part

19. Keep JavaScript and CSS external

Many of these performance rules deal with how external components are managed. Before those considerations even arise, however, you should ask a more basic question: should JavaScript and CSS live in external files or be written directly into the page?
In fact, using external files can make the page faster because the JavaScript and CSS files will be cached in the browser. Inline JavaScript and CSS in an HTML document are re-downloaded each time the HTML document is requested. Doing so reduces the number of HTTP requests required but increases the size of the HTML document. On the other hand, if the JavaScript and CSS are in external files and have been cached by the browser, then we have successfully made the HTML document smaller without increasing the number of HTTP requests.
20. Minify JavaScript and CSS

Minification removes unnecessary characters from code to reduce its size and thus improve load time: all comments and unneeded whitespace characters (spaces, newlines, and tabs) are stripped out. For JavaScript this improves response time because the file to be downloaded is smaller. The two most popular JavaScript minification tools are JSMin and YUI Compressor; YUI Compressor can also minify CSS.
Obfuscation is an alternative source-code optimization that goes further than minification, and because it is more complex, the obfuscation process is more likely to introduce bugs. In a survey of the ten top US websites, minification reduced size by 21% and obfuscation by 25%. Although obfuscation shrinks code more, it is riskier than minification.
Besides minifying external scripts and styles, inline <script> and <style> blocks can and should be minified too. Even if you serve content gzipped, minifying first still reduces size by 5% or more. JavaScript and CSS are only growing in size and importance, so minifying your code pays off.
Image part

21. Optimize images

Try converting GIFs to PNG and see whether that saves space. Run pngcrush (or another PNG optimization tool) on all your PNG images.
22. Optimize CSS SpriteDon’t use larger images than you need just because you can set width and height in HTML. if needed
<img width=100 height=100 src=mycat.jpg Host: us.yimg.com If-Modified-Since: Tue, 12 Dec 2006 03:03:59 GMT If-None-Match: 10c24bc-4ab-457e1c1f HTTP/ 1.1 304 Not Modified32. Use GET request for Ajax
The Yahoo! Mail team found that with XMLHttpRequest, the browser implements a POST as a two-step process: the headers are sent first, then the data. So it is better to use GET, which sends everything in a single TCP packet (unless you have a lot of cookies). IE limits URLs to 2K, so if you have more than 2K of data to send you cannot use GET.
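A small sketch of the difference (the endpoint and parameter are invented for illustration):

// Sketch: prefer GET for Ajax retrieval; the whole request fits in one packet
// in most cases.
var xhr = new XMLHttpRequest();
xhr.open('GET', '/search?q=performance', true);
xhr.send();                                        // no request body needed

// A POST, by contrast, sends the headers first and then the body:
// xhr.open('POST', '/search', true);
// xhr.setRequestHeader('Content-Type', 'application/x-www-form-urlencoded');
// xhr.send('q=performance');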
An interesting side effect is that a POST which doesn't actually send any data behaves just like a GET. Per the HTTP specification, GET is intended for retrieving information, so it makes semantic sense to use GET when you are only requesting data and to reserve POST for sending data that needs to be stored on the server.
33. Flush the buffer early

When a user requests a page, the server may take 200 to 500 milliseconds to assemble the HTML, during which time the browser sits idle waiting for data to arrive. PHP has a flush() function that lets you send the already-prepared portion of the HTML response to the browser, so the browser can start fetching components while the backend is busy producing the rest of the page. The benefit is greatest when the backend is slow or the frontend is light, in other words, when most of the response time is spent on the backend.
A good place to flush the buffer is right after the HEAD, because the HEAD portion of the HTML is usually easy to generate and it references the CSS and JavaScript files, so the browser can start fetching those components in parallel while the backend continues processing.
For example:
... <!-- css, js -->
</head>
<?php flush(); ?>
<body>
... <!-- content -->

34. Use a CDN (Content Delivery Network)
The user's physical distance from the server also has an impact on response time. Deploying content on multiple geographically dispersed servers allows users to load pages faster. But how to do it?
The first rule of geographically distributing your content is: don't try to redesign your web application to work in a distributed architecture. Depending on the application, that could involve daunting tasks such as synchronizing session state and replicating database transactions across server locations. Attempts to shorten the distance between users and content can be delayed, or never happen at all, because of this hurdle.
Remember that 80% to 90% of an end user's response time is spent downloading page components: images, styles, scripts, Flash, etc. This is a golden rule of performance. It's better to scatter static content first rather than redesign the application structure from the beginning. This not only greatly reduces response time, but also makes it easier to demonstrate the CDN's contribution.
A content delivery network (CDN) is a group of web servers scattered in different geographical locations to deliver content to users more efficiently. Typically, the server chosen to deliver content is based on a measure of network distance. For example: choose the server with the smallest number of hops or the fastest response time.
35. Add Expires or Cache-Control HTTP headers

There are two aspects to this rule:
- For static components: implement a "never expires" policy by setting a far-future Expires header.
- For dynamic components: use an appropriate Cache-Control header to help the browser make conditional requests.
Web designs are getting richer, which means more scripts, stylesheets, images, and Flash on the page. A first-time visitor to your site still has to make those HTTP requests, but by using a far-future expiration date you make the components cacheable, which avoids unnecessary HTTP requests on subsequent page views. Expires headers are most often used with images, but they should be used on all components, including scripts, stylesheets, and Flash.
Browsers (and proxies) use a cache to reduce the number and size of HTTP requests so that pages load faster. A web server uses the Expires HTTP response header to tell the client how long a component may be cached. A date far in the future tells the browser that this response won't change until then; for example, the header below says the component does not expire until April 15, 2010:
Expires: Thu, 15 Apr 2010 20:00:00 GMT
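Cache-Control can express the same policy as a relative lifetime in seconds; for instance, a ten-year max-age (the value below is 10 x 365 x 24 x 3600 seconds, shown as an illustration):

Cache-Control: max-age=315360000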
If you are using the Apache server, use the ExpiresDefault directive to set the expiration date relative to the current date. The following example sets a validity period of 10 years from the request time:
ExpiresDefault "access plus 10 years"