Maximizing Website Performance

Why maximize website performance?

To meet these goals:

  • User satisfaction and business expectations
  • Improved search ranking
  • Support for mobile users (slow and limited Internet connections)
  • Reduced costs (less bandwidth + lower network latency + lower resource usage (servers, RAM …))

Note:

Network latency is how long a packet takes to travel between the server and the client.

Why are websites slow?

More than 80% of slowness issues come from the front end, and the rest come from the back end. The client side here means HTML, CSS, JavaScript, and images.

Note:

Performance measurement should be repeated periodically, because the data changes over time.

What are the tools used to analyze website performance?

Use these tools to analyze website performance before and after applying optimizations, so you know where you stand (a benchmark):

Note:

Before running web performance tests, make sure to disable browser extensions that could affect the results, like AdBlock, FlashBlock …

Hints for using these tools:

Hints With Fiddler

  • Statistics tab (to see the number of requests, their sizes, timing, and response contents)

  • Chart


  • Timeline

The first request takes less than 1 second (that part is the server side); the rest of the timeline represents client-side loading.

Hints with YSlow

Check the grade and the recommendations.

Hints with Google PageSpeed

Shows test results for mobile and desktop, along with suggestions for improvement.

Hints with Web developer tools

  • Requests loading


  • Hover context


  • Statistics


  • Filters (for example, filter to show only images and then check the statistics)

  • Disable Cache


Hints with a web load-testing tool

  • Record the loading intervals and save them as a video

  • You can select browser versions, device types, and a test location (to test network latency). You can also run the test up to 9 times (to see different behavior based on the Internet connection, and the cached view on the second run)

  • Performance Score


  • Show First and Repeat View

First Byte = the time until the first byte of the response is received by the browser.

  • Compare (with different websites)


  • Emulate mobile version


  • Waterfall View


  • Tabular View


  • Content breakdown


  • Performance review


  • Single point of failure (SPOF)

What if your website uses the Twitter API? This test shows how your website will behave when Twitter is down.

  • Block files


  • Disable JavaScript


  • Use Authentication


Hints with tracert


Hints with Google Search engine (Top links)


General rules for good website performance:

  • Reduce HTTP requests
  • Send as little as possible over the network
  • Send it as infrequently as possible (using Cache)

Hints for some rules:

  • HTTP Compression in IIS

Note:

Don’t include images in compression, because they are already compressed.
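To see why, here is a quick illustration (a standalone Python sketch, not tied to IIS): gzip shrinks repetitive text such as HTML dramatically, but bytes that are already compressed (which is what JPEG/PNG data looks like) gain nothing and even pick up a little overhead.

```python
import gzip
import os

# Repetitive text, like typical HTML/CSS/JS, compresses very well.
html = b"<div class='row'><span>hello</span></div>" * 500
compressed_html = gzip.compress(html)

# An already-compressed file (JPEG, PNG) looks statistically random,
# so a second compression pass cannot shrink it; random bytes simulate this.
image_like = os.urandom(20_000)
recompressed = gzip.compress(image_like)

print(len(html), "->", len(compressed_html))     # large reduction
print(len(image_like), "->", len(recompressed))  # no reduction, slight overhead
```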

  • Content Expiration
    • Cache static files (such as images and JavaScript) on the client side
    • The client sends an “If-Modified-Since” header; if the content has changed, the server sends the new content with HTTP 200, and if not, the server responds with HTTP 304
    • The expiration time is stored in the Cache-Control header
    • Caching common resources helps improve performance on all pages
    • Sometimes a proxy server causes caching issues by telling the client there are no updates when in reality there is an update
    • In IIS, this can be enabled from the site’s HTTP Response Headers feature
    • To check where the browser stores these files (for example, in Chrome), go to C:\Users\<user-name>\AppData\Local\Google\Chrome\User Data\Default\Cache
    • For more information, see https://www.keycdn.com/blog/a-guide-to-http-cache-headers/
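The If-Modified-Since exchange can be written down as a tiny server-side decision function (a Python sketch of the protocol logic, illustrative only and not tied to any web server):

```python
from datetime import datetime, timezone
from email.utils import format_datetime, parsedate_to_datetime

def conditional_get(if_modified_since, last_modified):
    """Decide between 304 (browser reuses its cache) and 200 (send content).

    if_modified_since: the header value the client sent, or None.
    last_modified: when the resource last changed on the server.
    """
    if if_modified_since is not None:
        client_copy_time = parsedate_to_datetime(if_modified_since)
        if last_modified <= client_copy_time:
            return 304, None              # Not Modified: no body travels
    return 200, b"<fresh content>"        # full response

last_mod = datetime(2016, 5, 1, tzinfo=timezone.utc)

# Client cache is current -> 304 with an empty body.
status_cached, body_cached = conditional_get(format_datetime(last_mod), last_mod)

# First visit (no header) -> 200 with the content.
status_first, body_first = conditional_get(None, last_mod)
```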
  • Content Delivery Network (CDN)
    • Locates files geographically closer to users
    • Offloads work from your servers
    • Also, if multiple sites use the same file, the browser can serve it from its cache
    • Typically used for static files (CSS, images, and JavaScript)
    • Common example: the jQuery CDN, https://code.jquery.com/
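A common companion pattern (a sketch; the jQuery version and local path are placeholders): reference the CDN copy, and fall back to a local copy if the CDN is unreachable.

```html
<!-- Try the CDN first; if it fails, load the local copy. -->
<script src="https://code.jquery.com/jquery-1.12.4.min.js"></script>
<script>
  window.jQuery || document.write(
    '<script src="/Scripts/jquery-1.12.4.min.js"><\/script>');
</script>
```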
  • ETags
  • Remove unused HTTP headers
    • This also improves site security (less information, fewer attack hints)
    • Reduces response size
    • Headers to remove:
      • X-AspNet-Version
      • Server
      • X-Powered-By
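The first and last of these can be suppressed through configuration (a web.config sketch; removing the Server header usually takes an extra step, such as a rewrite rule or a custom HTTP module):

```xml
<configuration>
  <system.web>
    <!-- Drops the X-AspNet-Version header -->
    <httpRuntime enableVersionHeader="false" />
  </system.web>
  <system.webServer>
    <httpProtocol>
      <customHeaders>
        <!-- Drops the X-Powered-By header -->
        <remove name="X-Powered-By" />
      </customHeaders>
    </httpProtocol>
  </system.webServer>
</configuration>
```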
  • Multiple domains for static resources
    • Browsers open a series of background connections to retrieve objects from web servers
    • Each browser has a limit on simultaneous connections per domain
    • Be balanced: more domains mean more DNS lookups, which cause slowness
    • http://www.browserscope.org/?category=network



Combining Client Scripts into a Composite Script could kill website performance

It’s a good practice to reduce the number of files by combining them into one file (bundling) to increase website performance.

You can use <CompositeScript>, a control in ASP.NET AJAX, and place all of your JavaScript file references inside this tag.
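The markup looks roughly like this (a sketch; the script paths are placeholders):

```aspx
<asp:ScriptManager ID="ScriptManager1" runat="server">
  <CompositeScript>
    <Scripts>
      <asp:ScriptReference Path="~/Scripts/script1.js" />
      <asp:ScriptReference Path="~/Scripts/script2.js" />
    </Scripts>
  </CompositeScript>
</asp:ScriptManager>
```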

But the issue with this approach is that every time the page loads, bundling occurs at run time to combine these files into one, and this can hurt performance when your website is under heavy load or traffic.

I ran a performance test against a SharePoint website (a 10,000-client load test within 1 minute) using loader.io, and the result was scary.

Result #1: with <CompositeScript>

The test didn’t finish, because the number of failed requests was more than 50% of the successful requests.

Result #2: without <CompositeScript>

The number of successful requests was higher than in the first test.

Result #3: bundling the files manually into one file, or using the Web Essentials tool with Visual Studio

The result: there is no comparison between this result and the previous ones.

So it’s good practice to bundle the files globally, once at application start: in code, manually, or by using a tool like Web Essentials.

(This extension is for Visual Studio 2012) https://visualstudiogallery.msdn.microsoft.com/07d54d12-7133-4e15-becb-6f451ea3bea6

(This extension is for Visual Studio 2015)
https://visualstudiogallery.msdn.microsoft.com/ee6e6d8c-c837-41fb-886a-6b50ae2d06a2
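The “bundle once at startup” idea, in the abstract (a Python sketch of the technique, not ASP.NET code; file names are made up): concatenate the files a single time and publish the result under a content-hashed name, so the bundle can be cached aggressively and a changed bundle automatically gets a new URL.

```python
import hashlib
import tempfile
from pathlib import Path

def bundle_scripts(script_paths, out_dir):
    """Concatenate script files once (e.g. at application start).

    The bundle name embeds a content hash, so clients can cache it
    indefinitely: new content produces a new file name.
    """
    combined = b"\n".join(Path(p).read_bytes() for p in script_paths)
    digest = hashlib.md5(combined).hexdigest()[:8]
    out = Path(out_dir) / f"bundle.{digest}.js"
    out.write_bytes(combined)
    return out

# Demo with throwaway files:
tmp = Path(tempfile.mkdtemp())
(tmp / "a.js").write_text("var a = 1;")
(tmp / "b.js").write_text("var b = 2;")
bundle = bundle_scripts([tmp / "a.js", tmp / "b.js"], tmp)
```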

 

My experience with loader.io

https://loader.io is a simple, beginner-friendly, cloud-based stress-testing tool (hosted in Amazon’s US-East data center) by SendGrid Labs. You can use it for both performance and load tests, to measure how your application (or API) performs under pressure and to find out whether your application needs scaling (up or out), more servers, caching, etc.

Some of the features of this online tool:

  • Tests websites and APIs
  • Supports automation
  • Has a free plan for 10,000 clients
  • Has an add-on in the Azure portal
  • Lets you share results and statistics with others
  • Lets you watch the test in real time
  • Requires verification for legal use, because other online services have been abused for DDoS attacks

This tool supports three types of tests:

  1. Clients per test

For example: 200 clients over 100 seconds splits into 2 client connections per second (ideal for a performance test, where your main concern is the response time).

  2. Clients per second

For example: a 20-second test with 1,000 clients per second is the same as a 20-second test with 20,000 clients per test (ideal for a performance test).

  3. Maintain client load

For example: if you specify 0 and 10,000, the test will start with 0 clients and increase up to 10,000 simultaneous clients by the end of the test (ideal for a load test).

It’s better to run different test types, so you learn how your website behaves under different circumstances.

For more details, you can check this link: http://support.loader.io/article/16-test-types
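The arithmetic behind the first two test types can be made explicit (a trivial Python sketch, using the numbers from the examples above):

```python
def rate_for_clients_per_test(total_clients, duration_seconds):
    """'Clients per test' spreads the total evenly over the duration."""
    return total_clients / duration_seconds

def total_for_clients_per_second(rate, duration_seconds):
    """'Clients per second' implies this many clients over the whole test."""
    return rate * duration_seconds

rate = rate_for_clients_per_test(200, 100)      # 2 client connections per second
total = total_for_clients_per_second(1000, 20)  # 20000 clients per test
```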

They also offer two plans:

Sign up for the free plan; note that you need to verify your account (this is the first verification).

Add your website host:


Again, you need to verify that you own this host, to prevent malicious attacks such as DDoS.

Just download the verification file and copy it to the root of your website.

For example: http://[Your Domain]/loaderio-082ba4dd0e3b57dee160f21e7ecc6f63.txt

Then you can start a new test.

Define the URLs to run the test against. You can define GET/POST requests, custom headers (mostly used with APIs), and other parameters.

Choose the test type and define the number of users and the duration:

You can also schedule it.

Note

It only supports Basic authentication, which is insecure and rarely used these days.

Run the test to see the test results:

Here we can see the number of successful responses (those that return HTTP code 200), as well as the number of timed-out or failed responses.

This graph shows the average response times (blue) and the number of active clients (green).

The details graph shows details about:

  • Total requests made
  • Total responses received
  • Success responses (response codes under 400) – ideally, all responses fall into this category
  • 400-level errors (response codes 400-499) – often these indicate authentication problems, something missing, or a number of other issues
  • 500-level errors (response codes 500-599) – generally application errors; check your logs for error details
  • Timeouts – no data received for the timeout period (10 seconds by default)
  • Network errors – DNS resolution or TCP connection problems
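The bucketing above can be written down as a small classifier (a Python sketch of the categories, not loader.io’s actual code; the parameters are illustrative):

```python
def classify(status=None, seconds_waited=0.0, timeout=10.0):
    """Map one response to a bucket from the details graph.

    status is None when no HTTP response arrived at all: either the wait
    exceeded the timeout period (10 seconds by default) or a DNS/TCP
    error occurred.
    """
    if status is None:
        return "timeout" if seconds_waited >= timeout else "network error"
    if status < 400:
        return "success"
    if status < 500:
        return "400-level error"
    return "500-level error"
```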


Here you can see the amount of bandwidth sent and received by loader.io.

For more details about the test results, see http://support.loader.io/article/19-test-results