The web is fast, and we struggle to make it faster every day. Users surf from shiny and pricey devices, and they do not want to wait for pages to load: they want them now, or they just run away.
Remember the '90s, the Geocities era, when we authored our beloved content in FrontPage and polluted the world with a plethora of <font>, <center> and ubiquitous <br/> tags? Eventually we shrank our HTML documents by separating presentation from content, and we learnt to exploit the browser's pipelined loading and to properly cache documents and resources for a faster experience.
Finally we got AJAX and stable client-side frameworks that reduced data exchange to small JSON responses; meanwhile, even cable connection speeds improved dramatically, so the speed debacle seemed to be over.
But now we feel the urge to share with our friends high-resolution pictures of our meals straight from the venue table, and we are back to square one with the document size problem: images of real life are still big. We cannot do much about the single image, because we have already pushed against the [entropic!] boundaries of image compression.
Yet we don’t need to load all the images at once: maybe the user is not interested in reading the whole article and will leave right after finding the single piece of information they were looking for; why should we burden them with loading the whole gallery of delicious food, saturating their monthly data allowance?
The boring tech stuff
A smart technique is to defer the loading of each image until it is actually about to appear in the viewport: this way, all the media that lie deep down in the page are not fetched until we scroll to them.
Such an idea is called lazy loading, and it is based on the following:
- don’t provide the <img>s with any src attribute [otherwise they will be prefetched by all modern browsers];
- inject the actual src URL value into a custom data-* attribute [e.g. data-src];
- when the image reaches the viewport, load the associated resource and inject it into the original element [see the sketch right after this list].
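A minimal, framework-free sketch of the idea [the helper names and the viewport test are our illustration, not any particular plugin's code]:

// Collect all deferred images, i.e. the ones carrying only data-src
var lazyImages = [].slice.call(document.querySelectorAll('img[data-src]'));

function inViewport(el) {
  var rect = el.getBoundingClientRect();
  return rect.top < window.innerHeight && rect.bottom > 0;
}

function loadVisibleImages() {
  lazyImages = lazyImages.filter(function (img) {
    if (!inViewport(img)) return true;       // still waiting
    img.src = img.getAttribute('data-src');  // inject the real URL
    return false;                            // done with this one
  });
}

window.addEventListener('scroll', loadVisibleImages);
window.addEventListener('resize', loadVisibleImages);
loadVisibleImages(); // load whatever is already visible on page load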
Among the various implementations, we recommend the well-known jQuery plugin and the more recent and concise Unveil.
Bonus: deferring the loading until after the document is ready leaves room for resolution detection, which is handy for delivering retina-resolution images:
<img data-src="path/to/someImage.jpg" data-retina-src="path/to/some/hiResImage.jpg">
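A possible way to pick the right URL at load time [a sketch: the 1.5 device-pixel-ratio threshold is an arbitrary choice, and resolveSrc is just an illustrative helper name]:

// Choose the high-resolution URL on retina-class screens
function resolveSrc(img) {
  var isRetina = window.devicePixelRatio && window.devicePixelRatio > 1.5;
  var retinaSrc = img.getAttribute('data-retina-src');
  return (isRetina && retinaSrc) ? retinaSrc : img.getAttribute('data-src');
}

// e.g. inside loadVisibleImages: img.src = resolveSrc(img);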
A slightly different use case: Die Welt
Lazy loading is fine, but sometimes it may become too lazy: the Die Welt online newspaper had a precise request, namely a slow background loading of all the images on top of the load-when-in-viewport feature, so that even after a long idle time no images would be left waiting to be loaded.
The request was reasonable and meaningful, and yet another challenge to face!
We added a background loading queue mechanism that triggers the loading of all the images in the page, with a configurable frequency.
This happens in the background in the sense that the load requests are issued at a low frequency [e.g. a wise configuration would be 4 images loaded every 2 seconds], without much interference with user interaction: when the user scrolls to a new region of the page, the images entering the viewport start loading right away, as in the pure lazy loading approach.
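Building on the earlier sketch, a rough approximation of such a queue could look like this [the constants, variable names and the setInterval approach are our illustration, not the plugin's actual code]:

// Background queue: every TIMEOUT_INTERVAL ms, load QUEUE_SIZE more images
// regardless of scrolling [values are illustrative]
var QUEUE_SIZE = 4;
var TIMEOUT_INTERVAL = 2000; // ms

var backgroundLoader = setInterval(function () {
  // take the next batch of still-deferred images, in document order
  lazyImages.splice(0, QUEUE_SIZE).forEach(function (img) {
    img.src = img.getAttribute('data-src');
  });
  if (lazyImages.length === 0) clearInterval(backgroundLoader); // all done
}, TIMEOUT_INTERVAL);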
We extended the original jQuery plugin with 3 parameters:
- idle_threshold - the interval after which a scroll event is considered finished [default value: 200 ms];
- queue_size - the number of images loaded per queue step [default value: 5 images];
- timeout_interval - the period between queue steps [default value: 1000 ms].
If queue_size is set to 0, then no background operations are performed: we have pure lazy loading.
If queue_size is set to the total image count, we get the default browser behavior [because the loading of all images is triggered at once]. Any intermediate amount sits between the two extremes, approaching the browser's own loading behavior as it grows: to be mobile-device friendly, choosing a number between 2 and 5 is reasonable.
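For illustration, assuming the extended plugin is invoked like the original one, with an options object [the selector and invocation details are an assumption; only the three parameter names come from above], the scenarios just described could be configured like this:

// Pure lazy loading: no background queue at all
$('img[data-src]').lazyload({ queue_size: 0 });

// Mobile-friendly background loading: 4 images every 2 seconds,
// scroll considered finished after 250 ms of inactivity
$('img[data-src]').lazyload({
  queue_size: 4,
  timeout_interval: 2000,
  idle_threshold: 250
});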
idle_threshold is the time interval after which a scroll event is considered finished. We need this because, when the user scrolls down very fast, we don’t want to load the images that merely fly by in the meantime, but only the ones in the viewport when the user stops scrolling. The lower the value, the lower the tolerance; it should stay in the range of 100 to 500 ms at most, thus remaining considerably lower than the timeout_interval value, in order not to interfere with the background operations.
When scrolling down fast, every image along the way briefly appears in the viewport:
When the user finally stops scrolling, only the images visible at that moment are loaded:
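One way to implement that tolerance is a classic debounce on the scroll event [a sketch reusing loadVisibleImages from the first snippet; it replaces the plain scroll listener shown there]:

// Debounced scroll handler: only react once scrolling has paused
// for IDLE_THRESHOLD ms [value is illustrative]
var IDLE_THRESHOLD = 200; // ms
var idleTimer = null;

window.addEventListener('scroll', function () {
  clearTimeout(idleTimer);
  idleTimer = setTimeout(loadVisibleImages, IDLE_THRESHOLD);
});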
timeout_interval is the period of the dequeue operations: every timeout_interval milliseconds, queue_size new images are loaded among those still waiting. The higher the value, the longer it takes to load the whole set.
Bottom line
Lazy loading is just a technique, with its pros and cons, and it should not become a de facto standard: a wise developer should simply be aware of its existence and consider using it when optimization becomes a must rather than an option.