Firefox 57 delays requests to tracking domains

Firefox Quantum – version 57 – introduced a number of changes to the network request scheduler.  One of them uses data from the Tracking Protection database to delay the load of scripts from tracking domains, when possible, while a page is actively loading and rendering – I call it tailing.

This has a positive effect on page load performance: we save network bandwidth, I/O, and CPU for loading and processing the images and scripts the site itself needs, so the web page is complete and ready sooner.

Tracking scripts are not disabled; we only delay their load for a few seconds when we can.  Requests are kept on hold only while the site's sub-resources are still loading, and only for up to about 6 seconds.  The delay is engaged only for scripts added dynamically or marked async.  Tracking images and XHRs are always delayed, as is any request made by a tracking script.  This is legal according to all HTML specifications, and it's assumed that well-built sites will not be affected functionally.
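To make the rules above concrete, the decision can be sketched roughly like this.  This is a hypothetical model for illustration only, not Firefox's actual code; the request shapes and the `maybeTail` name are mine:

```javascript
// Hypothetical sketch of the tailing decision described above – NOT
// Firefox's actual implementation.  A request is a candidate for tailing
// only if its domain is on the tracking list, and only for the resource
// kinds the post names (async/dynamic scripts, images, XHRs).
function maybeTail(request, trackingDomains) {
  if (!trackingDomains.has(request.domain)) {
    return false; // requests to non-tracking domains are never delayed
  }
  switch (request.kind) {
    case 'sync-script':
      return false; // parser-blocking scripts are not delayed
    case 'async-script':
    case 'dynamic-script':
    case 'image':
    case 'xhr':
      return true;  // held back while the page loads, up to ~6 seconds
    default:
      return false;
  }
}
```

A request made *by* a tracking script would also be tailed regardless of kind; that attribution step is omitted from this sketch.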

To make it clearer what exactly we do for site and tracking requests, this is roughly how scheduling looks with tailing engaged:

[Figure: request scheduling waterfall with tailing engaged]

And here with tailing turned off:

[Figure: request scheduling waterfall with tailing turned off]

This is, of course, not without problems.  Sites that are either not well built, or whose rendering is influenced by scripts from tracking domains, may see a visible or even functional regression.  Simply put, some sites need to be fixed to adopt this change in scheduling.

One example is Google’s page-hiding snippet, which may leave a web page blank for a whole 4 seconds after navigation start.  What happens?  Google’s A/B testing initially hides the whole web page with opacity: 0.  The test script first has to do its job of preparing the page for the experiment, and only then does it unhide the page content.  The test script is dynamically loaded by the analytics.js script.  Both analytics.js and the test script are loaded from www.google-analytics.com, a tracking domain, for which we engage the tailing delay.  As a result, the page stays blank until one of the following wins: the 4-second timeout elapses, or we load and execute both scripts.  To a common user this appears as a performance drawback, not a win.
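The hide/unhide logic can be sketched roughly as follows.  This is a simplified model, not the actual snippet: the function and variable names are mine, and only the 4-second failsafe mirrors the snippet's documented default.

```javascript
// Simplified sketch of page-hiding logic in the style of Google's snippet;
// names are hypothetical, only the 4000 ms failsafe reflects the default.
function pageHider(timeoutMs) {
  let hidden = true;                          // page starts with opacity: 0
  const show = () => { hidden = false; };     // drop the hiding style
  const timer = setTimeout(show, timeoutMs);  // failsafe: unhide anyway
  return {
    isHidden: () => hidden,
    // the A/B test script calls this once the experiment is set up
    done: () => { clearTimeout(timer); show(); },
  };
}
```

With tailing, the call to `done()` is itself delayed, because the test script comes from a tracking domain – so the failsafe timeout often wins, and the user stares at a blank page for the full 4 seconds.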

Another example is a web page referring to an API of an async tracking script from a sync script, which is obviously a race condition, since there is no guarantee that an async script loads before a sync script.  There is a real-life example of such a not-well-built site using a Twitter API – window.twttr.  The twttr object is simply not there when the site’s script calls it.  An exception is thrown and the rest of the site’s script is not executed, breaking some of the page’s functionality.  The affected web page worked before tailing only because Twitter’s servers were fast to respond, so their script executed before the site script used the window.twttr object.  Hence, it worked only by lucky accident.  Note that sites with such race condition issues are also 100% broken when opened in Private Browsing windows or when Tracking Protection with just the default list is turned on.
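The defensive fix for such a site is to use the queueing stub that Twitter's own loader snippet sets up, rather than calling window.twttr directly from a sync script.  A minimal sketch of that pattern, under the assumption that the real widgets.js later drains the `_e` queue (the `w` indirection is only so the sketch also runs outside a browser):

```javascript
// In a browser `w` is `window`; the fallback lets this sketch run in Node.
const w = typeof window !== 'undefined' ? window : globalThis;

// Queueing stub in the style of Twitter's widgets.js loader snippet:
// callbacks pile up in _e until the real script arrives and drains them.
w.twttr = w.twttr || (function () {
  const t = { _e: [] };
  t.ready = function (f) { t._e.push(f); };
  return t;
})();

// Site code: register a callback instead of touching the API directly,
// so it does not matter whether the async script has loaded yet.
w.twttr.ready(function (twttr) {
  // Safe: runs only after widgets.js has loaded and processed the queue.
});
```

With this pattern the site works no matter how long the tracking script is delayed – by tailing, Tracking Protection, or a slow network.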

To conclude on how useful the tailing feature is – unfortunately, at the moment I don’t have enough data to share (it’s on its way, though.)  So far, testing has been done mostly locally and on our internal Web Page Test infrastructure.  The effect was unfortunately hidden in the overall noise, so more scientific and broader testing needs to be done.


EDIT: Interesting reactions on www.bleepingcomputer.com and Hacker News.


(Note: a few somewhat off-topic comments have been trashed, in case you wonder why they don’t appear here; I will only accept comments that bring a benefit to the discussion of this feature and its issues. Thanks for understanding.)

30 thoughts on “Firefox 57 delays requests to tracking domains”

    1. This can be disabled via a preference in about:config: switch network.http.tailing.enabled to false.

  1. Can we see in the dev tools which scripts are getting delayed and by how much? If not, are there any tools that will tell us? Also, this post doesn’t seem to apply to XHR requests but the linked feature bugzilla tickets seem to mention XHR. Does it apply to XHR trackers?

    1. All requests are visible in dev-tools as before, but there is no information that a request has been tailed, no. Actually, this is a good point and definitely worth filing a bug to include this information in dev-tools!

      There is currently no other convenient way to recognize tailing, except somewhat cumbersome offline logging and then analyzing nsHttpChannel entries with the tail-block-time property present.

      Another good point, about XHR. Yes, tailing is applied to XHRs to tracking domains as well (I will update the post.) And we also tail any kind of request made from a tracking script.

      And for completeness, favicon requests are tailed as well.

    1. It can only be disabled in `about:config`, with the `network.http.tailing.enabled` pref. There is no knob exposed for that in devtools, sorry. Feel free to file a bug.
