A note regarding memory usage in Chromium, and uBO specifically.
I have observed that even though the memory usage reported by the developer tools under "JavaScript VM instance" stays rather stable for uBO[1], even after a lot of memory-churning operations[2], the figure reported in Chromium's Task Manager keeps climbing after each of these operations, and forcing garbage collection does not bring the reported usage back down in the Task Manager.
There is something happening in Chromium's extension process (outside the JavaScript VM in which the extension runs) which can cause wildly varying memory figures for even the same extension, depending on how many memory-churning operations have occurred -- I wish Chromium devs would provide technical insight into this.
* * *
[1] Around 8 MB when using default settings/lists.
[2] For instance, repeatedly purging all filter lists and forcing an update.
The main takeaways seem to be that 1) extensions in general wind up massively increasing CPU time (by 5-20x) when loading "example.com", and 2) ad blockers wind up massively reducing CPU time (by 4-20x) when loading a "WCPO news article".
Which makes me happy that I use uBlock (edit: Origin, thanks below), and sad that I have to use LastPass.
HOWEVER -- I feel like both these metrics are potentially highly misleading, because CPU time isn't something the user directly observes -- it might be the limiting factor, or it might not affect the user experience at all (because the CPU usage is happening while waiting for further network resources that are even slower, or the CPU usage is happening after the page has visually finished loading all relevant content).
I'd be much more interested to see how extensions like Evernote or LastPass increase the time it takes for a real webpage (e.g. "nytimes.com") to finish painting the viewport not including ads, and similarly whether adblocking actually decreases the same -- or if all the advertising stuff really only happens afterwards. (Because sites are architected differently, you'd need to compute an aggregate score across a range of common sites.)
The Largest Contentful Paint (LCP) chart suggests an interesting phenomenon. Extensions like Dashlane and Evernote appear to slightly delay when that paint happens. These extensions load a large amount of code, but don't block the initial render.
Grammarly does not seem to push up LCP much. Its JS code runs before the page starts rendering, so maybe it is less likely to compete with other code later on.
Normally running code before the page renders would be bad, but if the page is network-constrained at that point it might soften the impact.
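For anyone wanting to poke at this locally, LCP can be observed in-page with the standard PerformanceObserver API (a minimal sketch, not the report's actual measurement setup):

    // Minimal sketch: log Largest Contentful Paint candidates in page context.
    const lcpObserver = new PerformanceObserver((entryList) => {
      for (const entry of entryList.getEntries()) {
        // startTime is when the current LCP candidate element was rendered.
        console.log('LCP candidate at', entry.startTime, 'ms:', entry.element);
      }
    });
    lcpObserver.observe({ type: 'largest-contentful-paint', buffered: true });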
Thanks. Very impressed not only that you took the effort to run those, but also at how quickly you responded and were able to run, collect and publish results! Major kudos.
I recently switched from LastPass to 1Password because of the added latency from Lastpass. Lastpass adds about 70ms to first contentful paint on example.com. 1Password, on the other hand, runs after the painting is done so it doesn't block rendering. I polished up a blog draft I had lying around about switching to 1Password: https://joe.schafer.dev/passing-lastpass/
> I'd be much more interested to see how extensions like Evernote or LastPass increase the time it takes for a real webpage (e.g. "nytimes.com") to finish painting the viewport not including ads.
I reinstalled Lastpass to test on nytimes.com. It takes 58ms to evaluate onloadwff.js (the Lastpass entry point) before any content is rendered.
I have LastPass, but I keep it in “only activate extension when I click on the toolbar button”.
The only annoying thing is that LastPass requires the whole page to reload first - I don’t know why Chrome can’t load an extension into an already-loaded page.
I was surprised by the choice of WCPO, which to me is a local news station that owns WCPO.com. Does it mean something else I'm not aware of, or are they famous for being slow?
Local news sites tend to be extremely slow and ad-heavy in my experience. So I thought of a medium-sized US city, searched for "Cincinnati news" and picked the top result.
> CPU usage is happening after the page has visually finished loading all relevant content
This can be very noticeable though, leading to sluggishness, stuttering scrolling, slowdowns in other tasks, and increased fan noise on laptops. In my anecdotal experience, uBO may make an even bigger positive impact in this respect than it does at the initial render.
I've been in power/battery-constrained situations and it's crazy how much Chrome drains my battery compared to Safari. I still use it when I'm sure I'll be able to charge soon, but I've gone as far as converting some of my home-made extensions to Safari for situations where I need to conserve what I can.
Besides the energy usage reported by the activity monitor, there is also the matter of how long a user can stay on battery power. For laptop users, energy usage can be a huge deal for their overall satisfaction, even if they aren't quite sure where they're gaining or losing anything.
The reason: Safari can delegate to linked-in code that is allowed to take advantage of power-saving hooks in the OS, whereas Chrome does the following:
1. Contact Google
2. Start to load page
3. Contact Google
4. Continue to load page
5. Send more stuff to the big G
6. Maybe finish loading page
7. Why not send more stuff to Google
Most ad blockers work by blocking certain network requests that are initiated by the page. DDG Privacy Essentials reduces the number of network requests by 95% and the download weight by 80%.
DuckDuckGo Privacy Essentials reduces the CPU time of the article page from 31s to just 1.6s. All other tested extensions also bring CPU time down to at most 10s.
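For illustration (this is not any particular extension's code), the classic Manifest V2 way of doing that blocking looks roughly like the sketch below; BLOCKED_HOSTS stands in for a real filter list, and Manifest V3 replaces this model with declarativeNetRequest rules:

    // background.js -- sketch of network-request blocking in a Manifest V2 extension.
    // BLOCKED_HOSTS is a stand-in for a real filter list with many thousands of entries.
    const BLOCKED_HOSTS = new Set(['tracker.example', 'ads.example']);

    chrome.webRequest.onBeforeRequest.addListener(
      (details) => {
        const host = new URL(details.url).hostname;
        // Cancelling the request here is what removes the download weight entirely.
        return { cancel: BLOCKED_HOSTS.has(host) };
      },
      { urls: ['<all_urls>'] },
      ['blocking'] // needs the "webRequest" and "webRequestBlocking" permissions
    );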
It seems the single most important thing regarding Real World Performance is good content-blocking functionality (not to mention other boons). Why don't browsers come with one by default?
It blocks some tracking, but is it a good content blocker? Last time I checked I still needed to install uBlock or an equivalent for acceptable performance. How hard would it be to incorporate something an extension already provides?
Not hard at all, they could include uBlock add-on by default (as they did with Pocket) except allow people to disable it this time.
They do rely on Google's advertising revenue for hundreds of millions of dollars in funding, so they presumably don't want to be too successful in democratising ad blocking.
Eich used to be a controlling influence on FF (as Mozilla CEO); since he was ousted he has headed the Brave browser, which includes content blocking by default (and a content revenue model).
In short perhaps FF/Mozilla are curtailed by financial considerations.
Not unless they buy uBlock first, which is what they did with Pocket. And unlike Pocket, which was already a company looking to get bought, uBlock is very much not that.
Aggressive content blockers also tend to break websites. Not big sites like Washington Post for very long, but all sorts of small sites and in confusing ways.
For example, we had to rearchitect part of our site that talks about marketing and advertising, because various ad blockers decide any URL pattern containing '/advertising/' shouldn't be loaded.
Because a web browser should be fundamentally neutral by default. All content should get rendered the same way, regardless of which server or domain it comes from.
That doesn't preclude the browser from making choices to protect privacy and security, but all sites should be treated equally (as Safari does with its tracking protection, for instance).
I don't think I agree with that. Do you want spam treated the same as your regular email for example? It is up to the implementation to decide what is {spy|mal|track|junk}ware.
The reverse is also problematic: sites treat browsers (or even user agents) very differently. Why should browsers not do the same?
For email, I agree some sort of spam filtering is necessary—but I wouldn't want an email client to alter the contents of messages depending on who the sender is. (By default at least—customization through plugins is great!)
> Sites treat browsers (or even user agents) very differently. Why should browsers not do the same?
But isn't that exactly why browsers are now phasing out user agents? I'm all for that—I shouldn't have to fake my user agent in order for Slack to work in a mobile browser.
If there's a certain browser feature that a website needs, and the website detects this feature isn't present and changes its behavior accordingly, that's quite different!
When you fire up the Brave browser, it pings Google's servers and starts collecting your data. Google doesn't give away multimillion-dollar software applications without the ability to capture data with them.
Great research! I'm a user of the Evernote Web Clipper, and seeing that it adds 3 MB to every page I visit is extremely discouraging. My browse-to-clip ratio is around 1000:1, and I'm probably removing the extension after this.
Just wondering, can browser extensions codesplit their bundles? If it's possible, then it is really disappointing to see these large companies loading huge bundles on initial load.
Did you try to set the extension to "run on click"? Right-click extension icon, "This can read and change site data" -> "When you click the extension". I didn't try it myself (it might require reloading the page), but might be an option for 1000:1 browse to clip ratio.
I just tried it and it doesn't seem to work in Version 83.0.4103.106 (Official Build) (64-bit).
When I clicked on EditThisCookie it said there were no cookies when it was set to "when you click this extension", but when I changed it to "all sites" it showed me cookies.
Nothing is stopping them from doing code splitting. Most extensions just modify the DOM to insert script tag(s), so the performance strategies are the same as for any other frontend app.
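A hypothetical sketch of what that splitting could look like: keep the injected stub tiny and only pull in the big bundle when the feature is actually used ("heavy.js" is a made-up file name that would need to be listed under web_accessible_resources):

    // content-stub.js -- tiny script injected on every page (sketch only).
    // "heavy.js" is a hypothetical large bundle shipped with the extension.
    function loadHeavyFeature() {
      const s = document.createElement('script');
      s.src = chrome.runtime.getURL('heavy.js'); // fetched and parsed only on demand
      document.head.appendChild(s);
    }

    // Defer the big bundle until the feature is actually invoked, instead of
    // paying its parse/execute cost on every page load. The trigger here is a
    // placeholder; a real clipper would load on toolbar click instead.
    document.addEventListener('my-extension-activate', loadHeavyFeature, { once: true });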
I ran tests on example.com, apple.com, and a WCPO news article (for the ad blocker tests).
The biggest performance issue is that extensions just dump large scripts into pages indiscriminately. Most of the time extensions don't do anything differently based on the content of the page.
I also tested a WCPO news article to see the impact of ad blockers. I picked a local news website specifically because they contain a lot of ads. If a page doesn't have any ads the performance impact of an ad blocker will be slightly negative.
What my tests don't pick up is extensions that only run on certain domains. For example, if you run Honey on a shop they support, I expect the CPU consumption to increase a lot more.
Not at all surprising that the worst two extensions by a long way are both from Avira.
Yesterday I opened Chrome and received a warning that the Avira extension had been installed.
I certainly did not install it willingly. I'm pretty sure I didn't install any other software that sneakily bundled it recently, either - I mean, I'm 99.9% sure that I haven't installed _any_ software in the past week. So why did it suddenly show up? I reported it to Chrome from the web extensions store. Highly unlikely that they'll do anything about it though.
I ran every kind of virus test I could find about a month ago since I was getting weird display/jank issues. Couldn't find anything, and in the end I tracked the issues down to a windows display scaling error.
Any idea how I would go about testing for a botnet?
Resetting chrome isn't much of a solution if the underlying problem is botnet affiliation (trojan, rootkit, etc). Should be reasonably easy to shut down all applications and services, and inspect the traffic going through the router for suspicious domains. If you only have a windows machine connected there should only be Microsoft traffic and maybe the router manufacturer. Anything else, and it's probably malicious.
It's entertaining that there is yet another area where "running large numbers of regular expressions, known in advance and not changing all that often" over lots of input is performance critical.
This was a key driver behind writing Hyperscan (https://github.com/intel/hyperscan) which I used to work on when it was primarily used for network security (thousands or tens of thousands of regular expression rules for firewalls), but doing this for ad blocking seems pretty sensible too.
This is indeed a performance critical problem, but it is already pretty much solved at this point. If you look at the performance of the most popular content blockers, their decision time is already below fractions of milliseconds. So it does not seem like performance is really an issue anymore.
Yeah, I don't doubt that it's solvable by other means (notably hashing). It's just amusing that something we started building in 2006 - and open sourced in 2015 - largely solves a problem directly (i.e. you don't specifically have to rewrite your regexes).
To be fair, blocklists are not really lists of regexps. They do contain some regexps, but the syntax is mostly custom, and matching relies both on patterns found in URLs (this part could partially be expressed as regexps) and on a multitude of other things: constraints on the domain of the frame, the domain of the request, the type of network request, etc.
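Very roughly, and purely as an illustration (this is not any real blocker's data model), a single network filter carries more than a URL pattern:

    // Simplified sketch of what one network filter might carry beyond the URL
    // pattern itself (domain names here are invented).
    const filter = {
      pattern: /\/ads\/banner/,           // matched against the request URL
      requestTypes: ['script', 'image'],  // only applies to these request types
      initiatorDomains: ['news.example'], // only when the page is on these domains
      excludedDomains: ['shop.example'],  // never on these domains
    };

    function matches(filter, request) {
      return filter.pattern.test(request.url)
        && filter.requestTypes.includes(request.type)
        && filter.initiatorDomains.includes(request.initiatorDomain)
        && !filter.excludedDomains.includes(request.initiatorDomain);
    }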
Hyperscan is still dynamic, and C only.
Google or Mozilla could use perfect hashes and ship it. (Google won't).
You need a native extension for a fast, low-memory blocker.
All the tests were done on an n2-standard-2 GCP instance... That doesn't have a GPU. Rendering a webpage without a GPU uses a lot of different code paths and isn't awfully representative of performance with a GPU.
But there might also be a performance hit with a GPU... There are various antipatterns like drawing on a canvas and then reading back the pixels repeatedly that perform atrociously with a GPU (due to the latency between the CPU and GPU). Some sites do a lot of that for things like custom font rendering, fingerprinting user hardware, custom data decompression algorithms that use canvas, etc.
Lab analysis is always going to have compromises. What's important is the relative change (without and with the extensions), and that the extra work isn't especially slowed down because of lack of GPU.
Cool to see our extension tested (an RSS reader). What's great with this is that it gives us a metric to work towards improving. I've always been under the assumption that "cpu is cheap", but it does have real effects.
> The Avira Browser Safety extension contains a website allowlist with 30k+ regular expressions. When the user navigates to a new page Avira checks if the page URL is in that allowlist
I wonder who thought that would be a good idea...
Sounds like something that could be significantly improved by compiling all patterns into a single state machine.
That's what Safari does for its ad blocker plugin API. It also allows for greater privacy, since the browser can apply a static list and the plugin never knows what sites you browsed to. Google is trying to move to enforce the same model, but it's limiting in some ways and generated massive uproar.
Yeah, back when I was doing similar matching on a big bunch of regexes, it was vastly faster to match on all of them lumped together in a group with the ‘or’ operator. And I learned of this more than fifteen years ago from a server-side lib that deduced the exact browser from the user-agent: it had a huge regex with various placeholders and a long list of ‘if/elseif’ checks to extract the values.
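As a toy illustration of the idea (sketch only; real blockers use more sophisticated indexes):

    // Sketch: try each pattern in a loop vs. match one combined alternation.
    const patterns = ['doubleclick\\.net', 'adservice\\.', '\\/ads\\/banner'];

    // Naive: one RegExp per pattern, every test() scans the input again.
    const regexes = patterns.map((p) => new RegExp(p));
    const matchesNaive = (url) => regexes.some((re) => re.test(url));

    // Combined: a single compiled pattern checked in one pass over the input.
    const combined = new RegExp(patterns.join('|'));
    const matchesCombined = (url) => combined.test(url);

    console.log(matchesNaive('https://adservice.example/x'));    // true
    console.log(matchesCombined('https://adservice.example/x')); // true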
FYI: content blockers like uBlock Origin or Ghostery use block lists of tens of thousands of rules as well (can be up to 100k depending on lists enabled) and there are ways to make this matching very efficient[1].
I recently switched to Enpass after LastPass started having major issues in both Firefox and Chrome. I was having to remove and reinstall it at least once per week.
Enpass has been great. The setup with Dropbox sync isn't as quick, but that's just once per device. It's very speedy and hasn't gotten in the way at all.
Isn't that weird - not a programmer, but... surely the only code they need to run on every page load is an event handler on forms to check whether a login field has been focused (at which point their add-on runs)? What else are they doing?
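Roughly, that idea would look like the sketch below (not how LastPass actually works; initAutofill is a hypothetical stand-in for loading the real autofill code on demand):

    // content.js -- sketch of a minimal per-page footprint for a password manager.
    // initAutofill is hypothetical: it would load and run the real autofill code,
    // which stays out of the page until a credential field is actually focused.
    document.addEventListener('focusin', (event) => {
      const el = event.target;
      if (el instanceof HTMLInputElement &&
          (el.type === 'password' || el.autocomplete === 'username')) {
        initAutofill(el);
      }
    }, true);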
It looks like a lot of the slowness from the slow extensions is from parsing and executing JavaScript. Are there opportunities from the browser side to make extensions faster? It seems like re-parsing the same JS on every pageload is an opportunity for gains from caching, but I'm also wondering about different delivery mechanisms like wasm. Are there security considerations here?
Alternatively, heavy code could be kept in the background script while page-injected scripts only do interaction. This would even employ the JIT properly.
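A rough sketch of that split, with invented names (expensiveClassify is a placeholder for whatever heavy work the extension does): the content script stays tiny, while the heavy code is parsed once in the background page and stays warm across page loads.

    // background.js -- heavy logic lives here, parsed once per browser session.
    chrome.runtime.onMessage.addListener((msg, sender, sendResponse) => {
      if (msg.type === 'classify-url') {
        // expensiveClassify is a placeholder for the extension's heavy work.
        sendResponse({ result: expensiveClassify(msg.url) });
      }
    });

    // content.js -- injected into every page; it only asks and applies the answer.
    chrome.runtime.sendMessage({ type: 'classify-url', url: location.href }, (reply) => {
      if (reply && reply.result === 'blocked') document.documentElement.hidden = true;
    });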
Content blockers in the same class as uBlock Origin ("uBO")[1] have the added burden of having to enforce generic filters which are independent of the request domains: they have to find a specific pattern in the request URL, including with support for wildcards.
Even so, despite this added burden, I will point out that uBO is almost as performant as DDG Privacy Essentials as per this report.
Furthermore, uBO contains WASM modules, but they are not used in the Chromium version of the extension, since this would require adding a `wasm-eval` directive to uBO's extension manifest, something I prefer to avoid for the time being, as I fear it would cause a more lengthy validation of the extension by the Chrome Web Store.
It is a deep dive into how modern content blockers work and what kind of rules they have to enforce (they are usually much more complex than simple domains, though).
As a side note, I recently ran a small experiment and found that around 90% of blocking from EasyList/EasyPrivacy/uBO lists can be done based on domain information only. But of course this leaves a lot of corner cases where sites might break or ads might still show, and ultimately more granularity is needed to address those cases.
When you have an array, it's way faster to check the existence of array[key], which is O(1), than to loop over the full array to find if one of the values match the one you're searching for, which is O(n).
Object properties in JS are mostly the same thing.
I guess other extensions can't do the same because they don't have a simple value to look up in a whitelist, but rather a list of regexes which must all be executed.
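Concretely, as a toy comparison (not actual extension code):

    // Toy comparison of lookup strategies over a large blocklist.
    const hostSet = new Set(['tracker.example', 'ads.example']); // imagine ~100k entries
    const hostArray = ['tracker.example', 'ads.example'];
    const patterns = [/tracker\./, /ads\./];

    const host = 'ads.example';
    hostSet.has(host);                    // O(1): one hash lookup
    hostArray.includes(host);             // O(n): scans the whole array
    patterns.some((re) => re.test(host)); // worst case: runs every regex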
Some filter addons perform complex logic trees based on properties other than the hostname(s) being loaded by the browser, and therefore take up more CPU and RAM for more wall-clock seconds to meet the needs of their increased complexity. They end up able to block more content than others, but that comes at a cost — which, as this page notes, can be much higher than anyone expected.
Has a browser team ever considered the possibility of creating allowlists for extensions only on certain websites? A native implementation of something like uMatrix that also worked on extensions could help end users at least remove slowdowns on sites they need to be performant.
In my experience, PB pretty much does nothing. uBO already blocks trackers and social media buttons, and if it doesn't, you can turn on those filters. PB doesn't even block trackers until it "learns" that they are trackers; it just avoids sending cookies (which doesn't stop Facebook from fingerprinting you). The best method would probably be turning off 3rd-party scripts and iframes in uBO, but that does mean you have to un-break a lot of sites. An extension like Cookie AutoDelete would also do much more than PB, since, if you have no cookies, they can't be sent.
Edit: I realized that I sound like I’m bashing PB too much. PB is definitely better than nothing, and doesn’t break any sites, but there are better things you can do which make PB obsolete.
Why do you want to fight over a word if you're able to understand everything and good people are trying to reduce bias in human life?
Every word that is commonly or uncommonly used has a psychological impact on perception that is not directly measurable and causes biases. In this case, whitelist = white = allow = positive; blacklist = black = block = negative. Do you want people's brains to subconsciously associate words like that?
Instead if we use words like 'allowlist' and 'blocklist', it reduces the association with people's skin color/ethnicity and hence the hidden biases that we do not know are present. This stuff takes generations to fix.
> Do you want people's brains to subconsciously associate words like that?
A well-dressed man doesn't want an ink stain on his crisp white shirt, even if he's black.
When people get cavities, those are black spots on white enamel regardless of race.
Sorry, the white-substrate-black-stain metaphor isn't going to die any time soon.
Also, the absence of light looks black. Daylight will continue to be white and nighttime black; people will continue to stumble in the dark and be afraid, regardless of their skin color. Violent crime will continue to prefer the cloak of darkness.
It seems likely that kids and some adults will always have an irrational fear of the dark, since thousands of years ago nocturnal tigers used to eat our ancestors. The light/dark association is probably baked into our DNA due to this selective pressure; it's an unfortunate coincidence that similar words can be used to describe skin tone.
Cool, but it doesn't matter that no-extension is much faster, because we can't browse without ad blocking. Even if it slowed pages down, it would still be preferable. On top of that, ad blocking uses less bandwidth and memory.
I don't think you read the graph right. Without ad-blocking extensions, the browser is of course much slower, about 15 times slower according to their tests.
The chosen site for that test is basically a worst-case. It uses a local TV news site. These are all cesspits. They have banners, auto-playing videos, popovers, clickjacking, the works.
I am doubtful of some of the numbers here... I mean I know browsing the web without an ad blocker is bad, but it isn't 'takes 35 seconds to load an article on gigabit ethernet' bad....
> DuckDuckGo Privacy Essentials reduces the CPU time of the article page from 31s to just 1.6s. All other tested extensions also bring CPU time down to at most 10s.
It doesn't even slow down the pages, the opposite is true here.