And this is why Whonix is critical - because even when you pop the browser, you still have another layer of protection - the gateway VM.
Tails browser on [almost anything] is one browser exploit away from beaconing out directly from your IP, and has done so rather frequently over the years.
Whonix stuffs the whole browser and such into a workstation VM, which is only connected to the gateway VM - which "torifies" everything coming in that port. So even if you pop the workstation and have root, you still can't beacon out directly without going through the gateway - you'd have to find an exploit in that bit as well, with only network access. Not impossible, but a lot harder.
And then package all that into Qubes and use it that way, because a disposable Whonix VM set is probably the safest way to browse the web...
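If it helps make the "everything leaves via the gateway" point concrete, here's a rough Python sketch of the idea (not how Whonix is actually wired up - it uses transparent proxying and firewall rules on the gateway - and the 10.152.152.10:9050 address is just the usual Whonix-Gateway default, so treat both as assumptions):

    # The workstation has no direct route to the internet; the only way out is
    # the SOCKS port the gateway exposes, and Tor on the gateway does the rest.
    # Requires: pip install requests[socks]
    import requests

    GATEWAY_SOCKS = "socks5h://10.152.152.10:9050"  # assumed Whonix-Gateway default
    proxies = {"http": GATEWAY_SOCKS, "https": GATEWAY_SOCKS}

    # socks5h means the hostname is resolved by Tor on the gateway, so not even
    # a DNS lookup leaves the workstation directly.
    r = requests.get("https://check.torproject.org/api/ip", proxies=proxies, timeout=60)
    print(r.json())  # expect something like {"IsTor": true, "IP": "<exit IP>"}

    # A request that skips the proxy should simply fail on the workstation,
    # because there is no route that bypasses the gateway:
    try:
        requests.get("https://example.com", timeout=10)
    except requests.exceptions.RequestException:
        print("direct connection blocked, as expected")

An attacker with root on the workstation is in the same position as that second request: no route out except Tor on the gateway.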
I almost find it suspicious how heavily Tails is promoted over Whonix. Tails focuses on largely imaginary scenarios that only happen to people named Bob or Alice, while Whonix fixes the actual attacks that come up in subpoenas.
Apples and oranges; Tails is designed for storing sensitive files among many other features, whereas Whonix is a live CD that doesn't offer storage and is focused only on secure browsing.
I think you've got that backwards. Tails is the LiveCD with a browser (that can beacon straight out). Whonix is the VM-based system. I think it's capable of more than just browsing, but I use it as the "secure browser" in Qubes as a disposable VM, because it just automatically does the right stuff with the gateway VM and such.
It is a complex idea, but in theory one could produce a live image that spins up the Whonix 'gateway' and 'workstation' virtual machines into RAM. Boom, probably better than Tails.
The most obvious concerns are the RAM usage (because of tmpfs, plus each VM having RAM allocated on top of that) and whether disk usage between the gateway and workstation images could be de-duplicated to save space in the live image.
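For a rough sense of the budget (every number below is a guess for illustration, not a measurement):

    # Back-of-envelope RAM budget for a hypothetical Whonix-in-RAM live image.
    # All figures are illustrative assumptions, not measurements.
    gateway_ram_mb     = 768    # RAM allocated to the gateway VM
    workstation_ram_mb = 2048   # RAM allocated to the workstation VM
    images_in_tmpfs_mb = 1500   # both VM disk images held in tmpfs
    host_overhead_mb   = 1024   # live host system, hypervisor, caches

    total_mb = gateway_ram_mb + workstation_ram_mb + images_in_tmpfs_mb + host_overhead_mb
    print(f"rough minimum: {total_mb} MB (~{total_mb / 1024:.1f} GB) of RAM")

So something like 8 GB of host RAM is probably the practical floor, and de-duplicating the shared Debian base between the two images is presumably where most of the tmpfs savings would come from.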
Modern browsers should really be treated like operating systems because they have so many capabilities and are so complex. I try to run all of mine in separate virtual machines on Debian Linux using virt-manager. Additionally, they're sandboxed with firejail (looking at moving to bubblewrap) and apparmor. I'm less concerned with my IP address and more with a website being able to access random files on my computer.
> Tails browser on [almost anything] is one browser exploit away from beaconing out directly from your IP
As far as I am aware, Tails uses iptables to force all network connections through Tor. You would require an escape from the browser and then a privilege escalation to get around this.
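You can see that policy from inside a running Tails session with a couple of lines of Python (assuming the stock setup where tor's SOCKS port is on 127.0.0.1:9050 - the port number is an assumption on my part):

    # Rough check of the "only tor may talk to the network" firewall policy.
    import socket

    def can_connect(host, port, timeout=5):
        try:
            socket.create_connection((host, port), timeout=timeout).close()
            return True
        except OSError:
            return False

    # Direct connection to an outside host: the iptables rules should drop or
    # reject this, since only the tor process is allowed out.
    print("direct to 1.1.1.1:443 ->", can_connect("1.1.1.1", 443))            # expect False

    # The local Tor SOCKS port is reachable and is the only sanctioned way out.
    print("local SOCKS 127.0.0.1:9050 ->", can_connect("127.0.0.1", 9050))    # expect True

Which is why the browser exploit alone isn't enough - the payload also has to escalate far enough to rewrite or bypass those rules.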
"Whonix alone" is probably fine against browser exploits in the Tor browser (of which I generally assume there are many, because it's a browser of Very Much Interest to plenty of agencies). However, if you assume a "dirty host," with various bits of nastiness on it, if you're just using Virtualbox or something, it would be easy enough for a compromised Whonix workstation VM to chatter away with the host and have the host beacon out, or have the host modify the disk images for Whonix to add badness, or something of the sort. It's not a high risk, but if you're going to be doing something with Tor where failure of opsec puts you in prison for life (see DPR), it's something to consider.
Qubes adds a few more layers of isolation and security, because you now have a Type 1 hypervisor under everything (currently Xen), with your other isolation VMs separated out. Badness in another VM can't directly impact the Whonix VMs, unless it's compromised Dom0, at which point you've lost with Qubes anyway.
Both are at risk from a hypervisor escape as well, but I generally consider Xen to be a somewhat better inspected and harder to escape from target than Virtualbox or VMWare Workstation, just because there's less to Xen. It's a far smaller codebase, and when you're using hardware virtualization with paravirtualized devices (virtio-type interfaces), there's just not as much surface exposed for attack. It's not impossible, but I would generally consider VMWare/Virtualbox somewhat softer targets to escape from than Xen.
Again, does any of this matter for casual use? No. But if you're going to use Tor for things that have actual consequences, it may very well matter a lot, and at that point, fully understanding the various threats and how they've been used over the years may be a matter of your freedom.
For whatever it's worth, I try to add Tor traffic where I can, just to help with the noise factor.
Just a heads up for Android users: the Play Store version is a few releases out of date; to get the current one, use F-Droid and make sure the Guardian Project repo is selected (it's not by default).
Question for the Mozillans/Googlers: how is it that Firefox Nightly is fast-track released multiple times a day to the Play Store, but stable Tor Browser updates are stuck for weeks? Is there a 'skip the review' option for nightly releases?
It's a JavaScript engine bug and JS is disabled by default. Still important, but I question whether anyone who enables JS in Tor is worth compromising.
Because the web isn't practically browsable without JS, so rather than have users fiddle with NoScript and end up disabling a ton of security features, they just turned JS on.
> This vulnerability doesn’t break the anonymity and encryption of Tor connections.
> The Safest security level of Tor Browser is not affected because JavaScript is disabled at this security level.
> For example, after you visit a malicious website, an attacker controlling this website might access the password or other sensitive information that you send to other websites afterwards during the same Tails session.
We are not aware of any such thing. As rebelwebmaster noted, when we know that, we put it in our advisory.
Clearly the vulnerabilities are exploitable as demonstrated by Manfred Paul's winning Pwn2Own entry. The details were disclosed only to Zero Day Initiative staff (the contest organizers) and Mozilla. They have not been discovered on any website in the wild.
Also, they've specifically called that out in the advisory when they're aware of that being the case. See the last out-of-band security update they released for example:
A reminder that Tor Browser might be one of the least safe browsers you can run: it's a fork of Firefox, meaning that its maintainers have to coordinate and port patches from the mainline project. Firefox is already not one of the most hardened browser engines. Meanwhile, the fork you'll be running is specifically designed to hide sensitive traffic, and collapses all those users into a single version for exploits to target.
I'm ambivalent about Tor, but if you're using Tor, don't use the Browser Bundle.
> A reminder that Tor Browser might be one of the least safe browsers you can run: it's a fork of Firefox, meaning that its maintainers have to coordinate and port patches from the mainline project.
Tor Browser ships updates as soon as new ESR versions come out.
> Firefox is already not one of the most hardened browser engines.
That might've been true in the past; it's hard to argue for it now.
> Meanwhile, the fork you'll be running is specifically designed to hide sensitive traffic, and collapses all those users into a single version for exploits to target.
The overwhelming majority of exit traffic now uses HTTPS, and Tor Browser ships with HTTPS Everywhere to avoid SSL stripping attacks (in fact, the next version of the Tor Browser will have HTTPS-Only mode enabled by default; it's already being tested in the alpha release), so how would those evil exit nodes burn those exploits?
> I'm ambivalent about Tor, but if you're using Tor, don't use the Browser Bundle.
First off, "Tor Browser Bundle" is a deprecated name. If you're not using the Tor Browser, you're making yourself both less secure (it ships with a smaller attack surface - no WebGL, for example) and more fingerprintable, thus defeating the full privacy advantages of the Tor Browser. There is simply no alternative.
Let's be real: you need to be using JavaScript for the internet to be functional, even within Tor. Anybody claiming they regularly use the internet with JS disabled is just lying for some sort of feeling of superiority.
> Let's be real: you need to be using JavaScript for the internet to be functional
Nonsense. I use w3m for browsing and much more than 90 percent of the web works fine. Fully 100 percent of "the internet" works fine, because that has nothing to do with JavaScript. Please stop over-dramatising and catastrophising as a way to throw cold water on what is a very good security practice. More than one medium-security environment I've worked in recently doesn't allow JS (although admittedly the sites we are allowed to access from there are limited).
I just read the top 100 website list and went to some of the top 20, like Yahoo, YouTube, Twitter, Instagram, Amazon, and Live.com (Microsoft).
YouTube, Twitter and Instagram don't work at all. Live.com wouldn't let me log in without JS. Amazon worked until checkout. Yahoo worked until login.
I think you are incorrect with your "nonsense" judgement, as this sampling of top sites is pretty representative.
EDIT: `ewzimm` makes a good criticism of my analysis: these aren't necessarily the top sites used by Tor. However, how many Tor users (in authoritarian countries or just regular users) don't use it to visit the banned sites on the Top-100 list?
If I told you I don’t listen to the Billboard top 100 songs, would you say “nonsense, you don’t listen to music?”
I also prefer w3m and find most of the web much better as text only, switching over to another browser when I want video or some other JS feature. Or I can use something like youtube-dl to fetch a video. And there’s much more out there than the top 100 websites.
> If I told you I don’t listen to the Billboard top 100 songs, would you say “nonsense, you don’t listen to music?”
No, but the response is more like: "I only listen to indie; Billboard isn't music."
The vast majority of internet traffic, i.e. the most popular sites, mostly requires JS. If you only visit obscure indie-rock sites, then fine, but we're talking about the masses, not the small niche exceptions.
It’s true that most people will likely stick to the most popular websites, but how likely are they to use Tor, especially self-configured outside the Tor browser? I’d bet the people who would do that are much more likely to spend more time outside the most popular websites.
That's a good point: this discussion is in the context of Tor, so that does self-select to some extent. It would make more sense for my argument if I knew the top-20 sites used over Tor and their JS requirements. I know people use Tor for Twitter in Turkey, so there's a problem right there!
To be fair, the websites you listed are extremely difficult to use anonymously with or without JS enabled. Most of the popular sites go pretty far out of their way to attach you to an identity that can be used to identify you outside their website.
If you’re using Tor to do your Amazon shopping, I wouldn’t recommend using the same environment for anything where your anonymity being compromised could put you in danger, since you just gave Amazon your credit card and mailing address.
With firefox and NoScript, you can whitelist the specific JS you need to make those sites work. You do it one time for a site you know you'll come back to often, and then you're done. In my case for example I whitelisted the scripts at old.reddit.com and redditstatic.com, and leave everything else blocked by default and it works fine for my needs (reading comments).
The fact that there are a handful of very frequently used websites that use JS doesn't make it impossible or overly burdensome to take sensible steps to limit which scripts you allow.
I use amazon in firefox with NoScript without issue, and while amazon gets to run some scripts, none of the JS at amazon-adsystem.com ever runs in my browser.
YouTube wants to load JS from over a dozen different places, but you only need to allow a couple to get videos to play. (I personally prefer to just download YouTube videos to disk and watch them in VLC, avoiding that issue entirely.)
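That download-and-watch-locally habit is easy to script, for what it's worth; here's a small sketch using yt-dlp (youtube-dl's maintained fork) and its Python API - the URL and options are placeholders, not recommendations:

    # Fetch a video to disk instead of letting youtube.com run JS in the browser.
    # Requires: pip install yt-dlp
    from yt_dlp import YoutubeDL

    opts = {
        "format": "best[height<=720]",    # keep the file reasonably small
        "outtmpl": "%(title)s.%(ext)s",   # save as <title>.<ext> in the current dir
    }
    with YoutubeDL(opts) as ydl:
        ydl.download(["https://www.youtube.com/watch?v=EXAMPLE_ID"])  # placeholder URL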
It does not matter, however, for the large majority of people who use those top 100, or even top 100,000 or top 1,000,000, websites, and who do not have the education, skill or time to learn about all the alternatives, if there even are any. It doesn't matter for people living under repressive regimes who want to inform themselves on foreign news sites, access foreign NGO sites, or even watch things on YouTube or look at and/or participate in social media. And so on...
A large part of the web is not functional without JS, and just because you choose not to use that part of the web (much) doesn't invalidate that point.
So I'd politely suggest you tone it down a little when it comes to calling "nonsense".
> and do not have the education, skill or time to learn about all the alternatives,
With respect, this is Hacker News. When I converse with people here I do so with a different expectation of intellect and curiosity. There are voices here who excuse technological abuses by appeal to the ignorance of "the masses" - completely missing that there is a different spirit going on in the sub-text of innovation and entrepreneurialism here. If, as you claim, the majority are using defective technologies, then that is a bigger problem, not something to be celebrated. They deserve better, and it's our job to help them get that.
There are alternatives that will work without JS, though. Obviously the majority of people use it by default, but if you don't want to have it enabled there are plenty of other options.
I doubt that. Monetization is a goal of most of these platforms, which makes the JS and privacy-hostile stuff absolutely intentional and implemented with malice aforethought. There's no reason to require client-side computing for the vast majority of sites, and the remainder are relatively niche webapps. Media-viewing sites aren't niche, and have good UX reasons for JS, but I'd bet the vast majority of use is outside of the web interface anyway (i.e., on mobile).
For everyday browsing I use NoScript, and rarely allow JS to run (I don't have JS right now!). With Tor, JS is always disabled, 100% of the time.
Tor is a niche use case, and not running JS is a cost that comes with the increased anonymity. I'm not using Tor to watch my "How to cook rice" videos or funny cat videos.
I mean, I don't have JS enabled for HN, although I don't know if you count that as "major".
But I'm not really keeping track, honestly. If I come across a website that isn't working with JS, I make the decision "is this worth allowing JS?". Sometimes the answer is yes, sometimes it is no. Often it means enabling the first-party domain to run JS but no others.
Conveniently, some of the paywalls on various news sites don't work with JS, but you can still read the article. So that'd be some of them that arguably work better without JS.
I don't use Tor for everyday browsing, only for the times I need/want it. In those cases, the equation always comes out to "no JS" - that's the reason I'm using Tor in the first place.
> Let's be real: you need to be using JavaScript for the internet to be functional, even within Tor
That's incorrect, especially the last part. Dark services work very hard to design their websites to work without JS, due to these exact vulnerabilities. Nobody on the dark web trusts JS, at all.
Actually some DNMs heavily encourage you or even force you to turn off Javascript before they let you log in/interact with the website. So while I think that JS is probably necessary for most of the regular web, that's not really the case here. It's only true if you use Tor to browse the clear net, which is probably not recommended anyways.
Hm, this is probably a joke, but I do the vast majority of my browsing without JavaScript (NoScript + uMatrix, or w3m). It's especially pleasant on news sites, which are crammed with junk the few times I carelessly open them in the JS-enabled profile I reserve for Google's app suite.
I use brave and browse with JS disabled by default. Some sites don't work, some do. I regularly decide the info I'm looking for can be found somewhere else and back out of a broken site because of it. Some sites I enable and proceed with.
> Anybody claiming they regularly use the internet with JS disabled is just lying for some sort of feeling of superiority.
Nope. I do, and I'm not lying. I started because it was required for my work and I just got used to it and now do it everywhere. The internet with NoScript is the best way to browse 90% of the time.
Even today, the vast majority of the sites I visit (including the one linked to in this post) work just fine (for what I want) without JS. That means the text I clicked to read is displayed and is readable, the images I clicked to view are displayed, etc. Other parts of the site may not work (menus, for example), but if I'm just following a link to an article I want to read and I can read it without JavaScript, why do I care if the menus on the site are broken or if I can't leave a comment?
For the sites I regularly visit that really do need JS I enable only the JS files needed to accomplish the things that I want and that's only necessary to do one time for each site. NoScript remembers my preferences on each domain.
For those rare occasions when I actually need to enable JS to get the functionality I want on a site I'm visiting only once, I can just temp-allow only the scripts I need to get the content I want, and the next time I close my browser (or clear those temp permissions by hand) that site is no longer allowed to use JS. I'll admit that for some random sites I wasn't that interested in in the first place, there are times when I'll still just close the tab and move on.
I really don't understand why people think it's so hard to use the web with NoScript. Overall, websites load much faster and look cleaner without JS and I'm much much more secure. Most of the time, it's really not a problem.
I will say, I do have an add-on called NukeAnything that lets you right click and remove whatever you want from webpages (only until the page is reloaded) and that occasionally does help fix some issues for sites that don't handle the lack of JS gracefully. If somebody's poorly designed JS heavy menu is spewed all over the page and covering the content I want to see, I can just right click and remove it. Same with obnoxious "we use cookies" banners that I refuse to interact with.
Honestly, it's the other things I've done to harden the browser (disabling redirects, service workers, WebGL, WebRTC, Wasm, location sharing, DRM, plugins, cookies, web storage, etc.) that cause the most problems with sites, and I do keep another unhardened browser around (Brave atm) to handle the sites I absolutely need to access that depend on that junk.
I use a text-only browser that has no support for JS or CSS. I use it to read and comment on HN and to read every website submitted to HN. I have no idea what these websites look like in graphical browsers, but I can read 100% of them. I do not see fonts, images, layout, etc. I just read text and download files. For searching and downloading video from YouTube, I do not even use a text-only browser; I do everything from the command line. The only time I use a graphical browser that runs JavaScript is for online shopping, banking and so forth. That is a very small percentage of overall internet use for me.
I regularly browse internet via Lynx, which does not support JavaScript. A lot of sites appear to be actively hostile toward Lynx but there are some sites that are very functional and even enjoyable.
True, disabling JavaScript and surfing the (mainstream) web is deep in the no-fun zone, maybe just above "using Lynx as a day-to-day browser". :D
But what one could do is somewhat reduce the risk by only running JavaScript from the actual domain and its subdomains by default, with something like µMatrix[1]. Most sites are already usable that way, and it's often obvious (to most people on this site) which domains have to be whitelisted to make them fully functional if they aren't. Or actually whitelist the domain for every website on the first visit. Tedious, but you only need to do it once per site.
Doing so at least protects a bit against malicious iframes or injected scripts from third-party domains, doesn't it? :)
Wikipedia does not require Javascript to be "functional". Also, the "internet" is much more than the www. The majority of protocols used on the internet do not rely on Javascript to be functional.
Nonsense. I was hired freelance to create a web forum for someone who wanted it to run on Tor and making everything work without JavaScript was the top requirement. The guy wanted an option to enable JS for those who were willing to trust it, but it was disabled by default and I designed all parts of the forum to run without JS.
No one said it's impossible to design a site without JavaScript, just that the vast majority of the internet, including sites users rely on, is unusable without it enabled.
I understand. And if anyone wants to use one of the sites that requires JavaScript within tor, then JS is needed within tor for them. Just because some random forum was developed to work without JS doesn't help if they want to use a site that wasn't developed to work without JS.
Terrible advice. If there is one thing I know from ~8 years of following the darkweb markets, it's that there's nothing worse than stepping outside the common practice: use Tor, use Tails, use Whonix.
If you read the DOJ indictments of Tor users, what they have in common is that they stepped out of those bounds.
There _was_ a period when Firefox (hence: the Tor Browser) was terrible and 0days were cheap (which is why most of the darkweb switched to using VMs behind their browsers), but those days are over[0]
I can't recall a recent indictment where the adversary in the USA broke Tor Browser. If you are a dissident in Turkey, Syria or Russia... you're more than safe using the Tor Browser Bundle.
The NSA aren't burning 0days in Firefox and VMs on 99.9% of Tor users - if you're in that other 0.1% then good luck to you[1]; your threat model is very different from that of people hiding from oppressive regimes.
As somebody with an infosec background, this is where I feel the industry fails in the sense of "perfect is the enemy of good" - there is no such thing as perfect (I bet most who preach against Tor Browser wouldn't be able to come up with a model that is) - the practical advice today is, and always has been, use Tor (Browser), use Tails, use Whonix
[0] I used ungoogled-chromium in that period, until a DNM administrator told me during a chat that he could spot me in his access logs.
This chart does not support the referenced claim at all. Payouts are not only linked to a browser's hardening but also to the number of affected users. Given the low market share of Firefox's engine, it's not very surprising that payouts for its vulnerabilities are lower than for Chrome.
Firefox, Safari, and Edge being in the same price bracket, and lower than Google Chrome, is not a reflection of their relative security but of their much smaller market share.
That's not a reliable source or claim to support the argument made here. It's more aligned with market demand and whatever that company wants to pay out.
tar RCE and Linux & macOS LPE valued less than Adobe PDF/cPanel? Interesting.
If you look at the number of CVEs[1], Chrome is above Firefox, but I admit that, especially given the market share, that doesn't say much. I wish they had some score-weighted rank.
Using just this image, it would imply Chrome was the least secure browser, but I'm not sure I can really infer much at all from it other than that bugs have been found in all browsers.
Was this intended to show that Firefox is somehow the least hardened browser?
I no longer use Tor either (unless I have to for work projects such as remote pentesting).
What is your opinion of Landlock (Linux kernel 5.13 and newer)? If we wrap vanilla Firefox in Landlock, proxy it through Tor, and use AppArmor/Tomoyo to further limit what Firefox can do (when it gets compromised), then I think that would be a much safer approach than using the Tor Browser Bundle.
> Meanwhile, the fork you'll be running is specifically designed to hide sensitive traffic, and collapses all those users into a single version for exploits to target.
Yeah, I was never a fan of their position on this. It's basically "let all websites track you and push ads at you all day long, but we've customized 50,000 settings so that you should look identical to everyone else using the Tor Browser", whereas I don't trust that they've managed to cover every possible means of fingerprinting a specific user/browser install.
Instead, I prefer to limit the amount of data websites can collect about me in the first place. I harden the browser as best as I can, block all active content by default, block all the ads I can, and I randomize a few little details (like screen and window resolution or user agent) which in total makes me feel better about my chances of avoiding being fingerprinted across sites and prevents most of the vulnerabilities that would cause a person to get compromised just by browsing to a website.
I still love the Tor Browser project, though, because they're great at spotting things introduced into Firefox that would make it easier for you to be fingerprinted, and while I prefer to not give data, or to give random data, I do understand their reasoning for what they do.
> Firefox is already not one of the most hardened browser engines
I'm pretty sure it's one of the most hardened, because the list of major engines only numbers approximately three in the first place. If you want to claim that Blink or WebKit is more secure, that's a reasonable argument, but just say that.
Yeah, Gecko is one of the most hardened browser engines out there at this point. Fission and the win32k.sys isolation basically bring the general architecture up to par with Chromium. Chromium got those features earlier and hence has more mature implementations of them, so the edge goes to Chromium, but there's not much of a large-scale difference anymore.
There are a few areas in which one browser has the edge over the other in terms of security (e.g. JIT hardening in Chromium's V8 gives it an advantage over Firefox, memory safety of pdf.js in Firefox reduces attack surface over the C++ PDFium in Chromium), but these are nowhere near the old days of "Chrome has a sandbox and Firefox doesn't" or even "Chrome isolates tabs from each other and Firefox doesn't".
This has been eloquently addressed by Tor veteran Mike Perry:[1]
Concerns about Javascript are rooted in two avenues:
1. Fingerprinting concerns.
2. Zero-day exploits against Firefox.
The reason we feel that leaving Javascript enabled trumps these concerns
is:
1. We want enough people to actually use Tor Browser such that it
becomes less interesting that you're a Tor user. We have plenty of
academic research and mathematical proofs that tell us quite clearly
that the more people use Tor, the better the privacy, anonymity, and
traffic analysis resistance properties will become.
In fact, my personal goal is to grab the entire "Do Not Track" userbase
from Mozilla. That userbase is probably well in excess of 12.5 million
people:
http://www.techworld.com.au/article/400248/
I do not believe we can capture that userbase if we ship a
JS-disabled-by-default browser.
2. Exploitable vulnerabilities can be anywhere in the browser, not just
in the JS interpreter. We disable and/or click-to-play the known major
vectors, but the best solutions here are providing bug bounties (Mozilla
does this; we should too, if we had any money) and sandboxing systems
(Seatbelt, AppArmor, SELinux).
> Meanwhile, the fork you'll be running is specifically designed to hide sensitive traffic, and collapses all those users into a single version for exploits to target.
That's a good thing too, because of browser fingerprinting. It takes a lot of identifying data points away by having everyone use the same version.
Perhaps you mean "don't rely on just the Tor Browser"? How else would one use tor to browse the web? Certainly Whonix or another protection layer is advisable if you're doing anything serious as well.
Does anyone know how much the Tor Browser 'Safer' security level mitigates real exploits? Among other things, it disables the JavaScript JIT, which has been a known exploit vector.
What about the Brave browser in a private window with Tor? That uses Tor but theoretically also has some added protection because of the browser. I’d love to hear your thoughts.
The Brave browser has a notoriously bad history with its Tor implementation. I would not trust it [1].
Brave’s Tor mode, introduced in 2018, was sending requests for .onion domains to DNS resolvers, rather than private Tor nodes. A DNS resolver is a server that converts domain names into IP addresses. This means the .onion sites people searched for, with the understanding those searches would be private, were not. In fact, they could be observed by centralized internet service providers (ISPs).
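That class of bug is easy to reproduce outside Brave, for what it's worth. With Python's requests, the difference between leaking and not leaking is one letter in the proxy scheme (assuming a local tor SOCKS port on 9050; the onion address is a placeholder):

    # How a ".onion sent to the DNS resolver" leak happens, sketched with requests.
    # Requires: pip install requests[socks]
    import requests

    ONION = "http://exampleonionaddressxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxx.onion/"  # placeholder

    # Leaky: socks5:// resolves the hostname locally, so the .onion name goes to
    # the system DNS resolver (visible to your ISP) before Tor is ever involved.
    leaky = {"http": "socks5://127.0.0.1:9050", "https": "socks5://127.0.0.1:9050"}

    # Correct: socks5h:// hands the hostname to Tor, which resolves it inside the
    # network; the .onion name never touches local DNS.
    safe = {"http": "socks5h://127.0.0.1:9050", "https": "socks5h://127.0.0.1:9050"}

    requests.get(ONION, proxies=safe, timeout=60)  # the leaky variant is deliberately never sent

The reported Brave bug was essentially this class of mistake happening inside the browser's own resolver path.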
> Tor Browser notifies the user of canvas read attempts and provides the option to return blank image data to prevent fingerprinting.
> Canvas Defender, a browser add-on, spoofs Canvas fingerprints.
> The LibreWolf browser project includes technology to block access to the HTML5 canvas by default
It doesn't seem to be the case that anything with JavaScript must leak canvas fingerprints.
Are you saying that Brave is unsafe because it has JS like every other browser on the planet or because it doesn't resist canvas fingerprinting specifically?
Can't tell why this was downvoted, it sounds like a legitimate question and on-topic given that this is an alternative to the TBB which GP was recommending to avoid.
For any such agency, a handful of Tor nodes gives your own agents a useful secure channel. An overwhelming majority of nodes would give you good insight into what other users are doing, but it's very hard to get such a majority since of course all your competitors think the same. Putting in place a handful of nodes to benefit your own agents is very possible, so that's what you do.
You can just hack into existing nodes. There are few enough nodes that accessing a large proportion of them is easily within the budget of a state security agency.
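To put rough numbers on "a large proportion" (using the usual simplification that watching a fraction c of guard bandwidth and a fraction c of exit bandwidth lets you correlate a circuit whenever you hold both ends):

    # Back-of-envelope: share of circuits an observer can end-to-end correlate.
    # Ignores guard pinning, bandwidth weights, and long-term intersection attacks.
    for c in (0.01, 0.05, 0.10, 0.25):
        print(f"watch {c:.0%} of guards and exits -> correlate ~{c * c:.2%} of circuits")

Even watching 10% of both ends only gets you about 1% of circuits, which is part of why "a handful of nodes" mostly buys a secure channel for your own agents rather than visibility into everyone else.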
Multiple state agencies can fight over control of nodes, and if one of them somehow controls enough nodes, it gets such information.
If you're right that such agencies can afford to, I dunno, expend zero days to seize control of nodes, they're all going to do that. That doesn't magically create more nodes, just makes it harder to decide who (if anyone) controls them.
The most likely outcome isn't that multiple state agencies can get this done, but that none of them can despite their fervent wishes otherwise.
With repetition and italics - well that's more convincing!
They don't need to control the node; good hacking and spying doesn't reveal your presence unless that is beneficial. Nothing stops multiple attackers from having access unless they want to interfere with each other.
During the Blitz and through the V-weapon attacks, Germany relied heavily on field agents to let it know what it was really hitting in England. If the agents consistently reported that attacks were striking outer North West London for example, German targets would be adjusted South East to compensate. Like when target shooting.
Except, those agents didn't actually work for the Germans. Twenty Committee (because twenty = XX in Roman Numerals, a Double Cross) had identified all the German agents and offered them either indefinite imprisonment for Espionage, or service as agents feeding bogus information to their German masters (and we can infer, the third alternative was death). You can guess what most of them chose.
Twenty Committee in effect ran German Espionage in WWII. If they had destroyed all these agents the Germans would have known and perhaps, in time, the Germans would have replaced them, but instead the Germans believed they had a working on-the-ground network of agents in Britain.
The point of the story is: "Just" having accurate information when actually somebody else controls your source of intelligence isn't actually having accurate information at all, it means you're a fool. Either you have control or you do not.
> I've always assumed that Tor was a top target for 3 letter agencies
Tor doesn't defend against a global adversary like a three-letter agency with capabilities to monitor network traffic and latency globally, panopticon-style. This is explained plainly in the Tor design spec.
"Comments should get more thoughtful and substantive, not less, as a topic gets more divisive." https://news.ycombinator.com/newsguidelines.html (Not sure a rhetorical question to make some vague accusation counts as a substantive comment)
The vague accusation is that because "onion routing"[1] has roots in the military, it must have a backdoor that we haven't uncovered in decades. If the person had posted this Wikipedia link with the info you mentioned, for example, I wouldn't have thought it unsubstantial per the guidelines (even if the claim/accusation itself is unsubstantiated by the evidence, that's a difference of opinion and not a guidelines thing).
[1] Not the cryptography, not even the code implementation, but just the general concept: having a message packed in several layers of encryption such that intermediate routers don't know the contents. https://en.wikipedia.org/wiki/Onion_routing
The more unique your browser (i.e., the more you deviate from the Tor Browser based on Firefox ESR), the more unique and therefore fingerprintable you are.
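The usual way to put a number on "more fingerprintable" is Panopticlick-style bits of identifying information; a toy sketch, with the population fractions made up purely for illustration:

    # If a fraction p of users share your exact fingerprint, it carries
    # log2(1/p) bits of identifying information - fewer bits means a bigger crowd.
    from math import log2

    for label, p in [("stock Tor Browser (large crowd)",   1 / 1_000),
                     ("Tor Browser plus one odd tweak",    1 / 100_000),
                     ("heavily customized configuration",  1 / 10_000_000)]:
        print(f"{label:35s} ~{log2(1 / p):4.1f} bits")

Every bit you add by deviating from the stock configuration roughly halves the crowd you blend into.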
The Tor Browser is 100% unique; it makes no attempt to pretend to be anything other than itself. Your anonymity set is other Tor users, not other Firefox users.
The fact that they can detect that you're using the Tor Browser configuration isn't that shocking when they also see that you are coming out of a Tor exit node, or that the site you are loading is an onion site. The anonymity comes from looking like every other person who downloaded Tails.
This doesn't contradict anything I said. If you believe you are contradicting what I said, perhaps you could rephrase what you thought my comment was communicating. Otherwise, I will interpret the intent of your reply as adding supporting details.
> And then package all that into Qubes and use it that way, because a disposable Whonix VM set is probably the safest way to browse the web...
And still disable JavaScript.