
I am writing a speed test to determine browser-server connection speed (js+php, but I don't think that matters). I send a number of data packages of different sizes and measure the time they need to reach the server (upload) or the client (download), respectively.

While still on my development machine - server and client on the same box - I measure much larger deviations in the upload speeds than in the download speeds. Have a look at these measurements:

Uploaded kb |    1 |    2 |    4 |    8 |   16 |   32 |   64 |  128 |  256 |  512 | 1024 | 2048 | 4096 | Average byte/sec
-------------------------------------------------------------------------------------------------------------------------
ms series 1 |    3 |    9 |    7 |    3 |   11 |   10 |   11 |    5 |   13 |   19 |   37 |   47 |  130 | 14.9344274575002
ms series 2 |    2 |    7 |    2 |    2 |    7 |    3 |    5 |    7 |   11 |   16 |   28 |   46 |   86 | 18.5357872423498
ms series 3 |    6 |    3 |    4 |    2 |    3 |    3 |    4 |    7 |    8 |   14 |   37 |   50 |  113 | 18.0964908223754
ms series 4 |    9 |    6 |    7 |    6 |    2 |    3 |   12 |   16 |   10 |   23 |   25 |   51 |   85 | 16.6696396851004
ms series 5 |    8 |    2 |    3 |   88 |    7 |    4 |    4 |    6 |   10 |   19 |   38 |   46 |   99 | 16.989223651609

Downl. kb   |    1 |    2 |    4 |    8 |   16 |   32 |   64 |  128 |  256 |  512 | 1024 | 2048 | 4096 | Average byte/sec
-------------------------------------------------------------------------------------------------------------------------
ms series 1 |   67 |   14 |   54 |   33 |   44 |   35 |   29 |   59 |   90 |  105 |  220 |  394 |  780 | 2.28178300694085
ms series 2 |   34 |   88 |   52 |   85 |   22 |   26 |   38 |   43 |   61 |  108 |  180 |  368 |  721 | 2.57923191272149
ms series 3 |   26 |   16 |  126 |   73 |   16 |   40 |   44 |   38 |   74 |  120 |  186 |  381 |  691 | 2.48128561711253
ms series 4 |   23 |    7 |   12 |   14 |   27 |   21 |   18 |   44 |   74 |   97 |  182 |  413 |  919 | 2.6496804364072
ms series 5 |   19 |   13 |   11 |   74 |   67 |   25 |   72 |   34 |   61 |   93 |  181 |  347 |  677 | 2.69509313778383
ms series 6 |   40 |   60 |   33 |   14 |   62 |   48 |   64 |   29 |   51 |   94 |  171 |  330 |  700 | 2.80628837904148
ms series 7 |   25 |    8 |   32 |   61 |   26 |   18 |   19 |   29 |   52 |  100 |  169 |  357 |  676 | 3.04519001690114

Upload speeds are around ten times faster than download speeds, which I cannot explain. Since everything happens on the same machine and no actual network (and thus no provider) is involved in any way, I would have expected both to be more or less the same. Also, as you can see in the very last column, the average byte/sec varies much more in the upload tests (from 14.9 to 18.5, a range of 3.6) than in the download tests (from 2.2 to 3.0, a range of 0.8). The exact values differ from run to run, but the fact that the average upload speed varies much more than the average download speed is completely reproducible (on my local machine; I haven't tried a real server/client scenario yet).

Any idea why the upload speed is so much faster and varies so much more than the download speed?

Addendum: How do I measure?

Upload:

Before the call, do const startingTime = new Date().getTime();, then send a package of x random base64 characters to the server with an XHR. The server returns its arrival time (PHP's $_SERVER['REQUEST_TIME_FLOAT']), and in xhr.load I calculate:

testResult.duration = arrivalTime - startingTime;
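For reference, here is a sketch of how I understand that upload measurement; the makePayload helper and the /upload.php endpoint name are my assumptions, not the actual code:

```javascript
// Build `kb` kilobytes of random base64-alphabet characters as the payload.
function makePayload(kb) {
  const chars =
    'ABCDEFGHIJKLMNOPQRSTUVWXYZabcdefghijklmnopqrstuvwxyz0123456789+/';
  let s = '';
  for (let i = 0; i < kb * 1024; i++) {
    s += chars[Math.floor(Math.random() * chars.length)];
  }
  return s;
}

function uploadTest(kb, done) {
  const payload = makePayload(kb);
  const startingTime = new Date().getTime();      // browser clock, milliseconds
  const xhr = new XMLHttpRequest();
  xhr.open('POST', '/upload.php');                // assumed endpoint
  xhr.addEventListener('load', () => {
    // Assumes the server echoes $_SERVER['REQUEST_TIME_FLOAT'] (seconds) back.
    const arrivalTime = parseFloat(xhr.responseText) * 1000; // server clock, ms
    done({ duration: arrivalTime - startingTime, bytes: payload.length });
  });
  xhr.send(payload);
}
```

Note that startingTime comes from the browser clock and arrivalTime from the server clock, so the two timestamps are not taken from the same timer.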

Download:

Send a request with the desired size to the server. The server returns a package of x base64 characters containing the arrival time of the call (again PHP's $_SERVER['REQUEST_TIME_FLOAT']), and in xhr.load I calculate:

const currentTime = new Date().getTime();
testResult.duration = currentTime - arrivalTime;
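And a corresponding sketch of the download side under the same assumptions (the /download.php endpoint and the JSON response shape are mine); note the seconds-to-milliseconds conversion, which is easy to get wrong:

```javascript
// REQUEST_TIME_FLOAT is in seconds; Date.getTime() is in milliseconds.
function durationMs(arrivalTimeSec, currentTimeMs) {
  return currentTimeMs - arrivalTimeSec * 1000;
}

function downloadTest(kb, done) {
  const xhr = new XMLHttpRequest();
  xhr.open('GET', '/download.php?kb=' + kb);      // assumed endpoint
  xhr.addEventListener('load', () => {
    const currentTime = new Date().getTime();     // browser clock, milliseconds
    // Assumed response shape: { arrivalTime: <seconds>, data: <base64 string> }
    const res = JSON.parse(xhr.responseText);
    done({
      duration: durationMs(res.arrivalTime, currentTime),
      bytes: res.data.length,
    });
  });
  xhr.send();
}
```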

1 Answer


I believe the problem is that you're comparing timestamps from two different clocks: the start time of PHP execution measured with $_SERVER['REQUEST_TIME_FLOAT'] (microsecond precision) against JavaScript's Date.getTime() (millisecond precision). At the millisecond scale of your fastest uploads, the coarser granularity plus any offset between the two clocks dominates the measurement.

What you need is a high-resolution timer, which is hard to get in a browser: performance.now() exists, but its resolution is deliberately coarsened, and it still cannot be compared directly against a server-side clock.

Your best bet would probably be to use PHP's $_SERVER['REQUEST_TIME_FLOAT'] for both the start and the stop time in all tests; it is one of the more precise native timers available. Alternatively, use a desktop language such as Python for the client-side app, which offers better high-resolution timers than JavaScript in the browser.
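As an aside (my suggestion, not part of the answer above): if a round-trip time is acceptable instead of a one-way time, both timestamps can stay on the browser's monotonic clock via performance.now(), which sidesteps the clock-comparison problem entirely. The URL is a placeholder:

```javascript
// Times the full round trip with a single monotonic browser clock,
// so no server timestamp is involved at all.
function roundTripTest(url, payload, done) {
  const t0 = performance.now();                   // monotonic, sub-millisecond
  const xhr = new XMLHttpRequest();
  xhr.open('POST', url);
  xhr.addEventListener('load', () => {
    done(performance.now() - t0);                 // same clock for both ends
  });
  xhr.send(payload);
}
```

The trade-off is that upload and download are folded into one number, so the two directions can no longer be separated.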

  • Hmm, I see. Quite a problem, since the client has to be a browser (it has to work in a web app later).
    – Paflow
    Commented Sep 27, 2019 at 7:13
