People who say Windows is insecure aren't comparing it to Flash or Java, they are comparing it to OS X, Linux, or *BSD. Without addressing that comparison, this article is useless.
This stuff is hard to pin down; for instance, there's a lot of hardening you can (but most people do not) do on Linux, and Lion changes the game on countermeasures for OS X. So it's a uniquely bad time to try to resolve this controversy decisively. But it's also hard to ignore the fact that much of what Lion is doing was pioneered in WinAPI.
(I'm an OS X user; also, I do systems software security).
Whether Windows (Vista+) is safer than Linux is up for debate. But safety and security are different concepts.
I believe I've seen tptacek say a liquor store in Englewood, Chicago with bulletproof glass in front of the cashier is more secure than a 7-11 in Lincoln Park, but the 7-11 is safer. Or something like that. :)
Home folders and UAC have very little to do with the real-world security of Windows vs. other operating systems. The reality is, even today, if you get arbitrary code execution on any operating system you are probably boned.
The kernel is not some almighty, bug-free program. The question then is simply whether the person who has gained the ability to run code on your system knows an exploit that will give them root privileges.
The Android thing works because you aren't just running arbitrary code; you are probably running code you got from the Android Market, which was probably screened to make sure it doesn't do something bad.
How would they be screened? If you include native libraries in your APK, they're compiled binaries, not reviewable source. They could scan the binary for API calls, but you're permitted to call dlopen() etc., so you could always hide a shared object in there and call it dynamically.
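To make the screening problem concrete, here's a minimal shell sketch (the file name, its contents, and the ROT13 encoding are purely illustrative): a screen that greps a binary for suspicious API names finds nothing when the payload stores the name obfuscated and decodes it at runtime, which is essentially what a hidden dlopen()/dlsym() call lets you do.

```shell
# Simulated "native library" whose payload stores the string "dlopen"
# ROT13-encoded, so the literal API name never appears in the file.
printf 'qybcra' > fake_lib.bin

# The naive screen: grep for suspicious API names. It finds nothing.
grep -q 'dlopen' fake_lib.bin && echo flagged || echo passed

# At runtime the payload trivially recovers the real name and can
# look the symbol up dynamically.
tr 'a-z' 'n-za-m' < fake_lib.bin   # decodes back to: dlopen
```

A screener that wants to catch this has to reason about runtime behavior rather than scan for strings, which is a much harder problem.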
Okay, you can't screw up the whole system, but you can install a keylogger (X has got to be the most insecure windowing system out there), and you can use the system for a DDoS botnet.
Also if you use sudo or su in X, they now have root thanks to the keylogger.
And with "threaded code" attacks (i.e., return-oriented programming) you don't even need to download an executable. You can run "script-level" code by repurposing existing executable sequences.
If people cared about "we did x first" as a reason why some software is better, everyone would be browsing with Opera. What matters is how well they do something now.
I can't imagine how bad the guys from Opera must be feeling, when everyone points to them as an example of a pioneer, yet nobody actually uses Opera (a startup once called me "their Opera user")...
Windows XP is still supported, and even if I wanted to run Windows 7, it's too bloated to run on fully a third of the hardware in my organization (and that's just counting x86 desktop/laptops.) Windows XP is a problem, and Microsoft hasn't solved it.
I'm talking about machines which barely fit the minimum specs for Windows XP - I guess this is hard for you to grasp, but some people can't afford to buy a $300 netbook when they have a perfectly functional desktop from 6 years ago.
Also, we have several applications which, while they will run on Windows 7, simply do not run well. From a business standpoint, it's really hard to justify an upgrade that reduces functionality in the name of security. I can clean up after malware pretty quickly; people working with outdated but business-critical applications lose hours of their day when software starts acting up because it wasn't written with a good security model in mind and is being forced into that environment.
You can say Windows 7 solved the problem all you want, but there remain countless areas where Windows XP is required, and virtualization is not a magic bullet - networking is always tricky. And in any case, we do all that work of migration and what do we gain? Exactly what we have right now, except with hardware accelerated graphics that don't support all our hardware and some difficult to quantify reduction in malware attacks.
5-7 year old hardware. This is an affluent community - I know it's unusual to think about it here but some people absolutely cannot afford to upgrade. I'm talking about hardware that barely meets the minimum specs for XP.
I'm not one of them, but I see them every day. Windows 7 does not solve the Windows XP problem. (The Windows XP problem will be solved when any given computer can run Windows 7 easily.)
Criticism does not mean being "anti". It can, but if it's a complex issue and people take the time to understand its various aspects, the critics are rarely "anti".
From Windows 95 on, I was hooked on GUIs. I didn't like Microsoft, but I liked Windows. I'd almost never care enough about Microsoft to say anything, but when it came to Windows I always had an opinion I wanted to air, and it was always in the spirit of wanting an improved product.
Mac OS 9 felt like a joke and the various Linux distributions felt unfinished.
But eventually I switched to OS X when one of its later "kitties" came out. Among my friends who have a Mac, I'm pretty much the first to ever criticize Apple products, but if you look at how much money I spend on the platform and how much time I invest in learning about it, you wouldn't think I was a critic. I don't have opinions about Windows beyond wishing IE weren't such a pain to design for.
Being critical does not equal being against something. When your critics no longer want to share their opinion of your product or service, that's when you should be truly scared. That is when you're in a really, really bad place.
Fair enough. But Preston Gralla is certainly not a walking MS PR firm. Compare and contrast Preston's stories with someone like John Gruber for Apple: Gruber has written maybe 100 blurbs about Apple in the past year, and maybe one was critical. Daniel Dilger is basically the same, except maybe he hasn't written the critical Apple article yet.
Preston OTOH regularly writes critical MS articles, more often than he writes those that defend MS.
The author who referred to Preston as the equivalent of an MS PR firm doesn't know his history well.
I don't really follow any tech writers, and my remarks are probably pretty off topic, but I think it's important to recognize the differences. To some it might just be semantics, but for others it's a whole world of difference because it affects their lives.
My comment is merely meant as a small act of resistance on behalf of the misunderstood reflector and doubter. Far too often, people promote division with constant "us"-and-"them" rhetoric and an "either you're with us or against us" policy.
It's healthy to doubt and question.
It's exactly how hacktivists such as Anonymous are alienated, and how projects such as WikiLeaks face attempts at criminalization. Whistleblowers are not the enemy, nor are they the ones threatening the safety of the people.
Windows 7 is pretty damn secure, perhaps more so than Mac OS X and Linux with a default install. The problem is all the common shortcuts people (in some cases are forced to) take to use the applications they need/want.
I still see regular end users routinely made administrators of their computers for no good reason, or due to sloppy software (hello, Intuit).
Absolutely. For most of the existence of Windows, even when it was notably insecure, the vulnerabilities were all the worse because most users routinely used "Administrator" as their default account. For many years this was in fact the default out of the box, so no wonder that it was such a common thing. This then precipitated the number of software applications that required Administrator permission to install or in some cases even to run, because it was assumed this was "normal" anyway.
Pwn2own doesn't work the way you think it does. The participants use prepared exploits. You can't infer anything about the relative security of different systems that both get exploited there.
Zero-day hackers and malicious Chinese hacker-spies also use prepared exploits, so you can infer something about the relative ease of finding exploitable holes.
Pwn2own may use prepared exploits, but researchers tend to go for the easy, low-hanging fruit, so there is a lot to infer from who falls on the first day, etc.
In the major distros (Fedora, Ubuntu, OpenSuSE) you get a lot of packages installed, some of which most users never touch, and sometimes default iptables rules, sometimes no iptables rules at all.
In newer, less friendly but more tech-oriented distros like Arch (those aren't opposing qualities, though in the Linux world they're often misunderstood to be), you get a lot less.
Windows 7 especially does seem fairly safe and secure. The only time I've had a virus on my current computer is when I've actively downloaded files and run them without properly scanning them first (which is 100% my own fault of course; not a wise idea to download when rushing!)
I know some non-tech-savvy users who literally have 200+ programs installed and 3+ of those spammy toolbars on their IE installs. When you've got people who will download anything and everything they see, it's no surprise they get loads of viruses - and then blame Windows when things go awry.
I agree that the architecture of Unix-based OSes is probably more secure than Windows 7 overall, although this doesn't mean that W7 is insecure.
"...that conventional wisdom is wrong..." ... "...For the very first time in its history...". So people are wrong to assume that Windows products are vulnerable just because, for the first time in years, one happened not to be?
This article is pointless, IMO; it doesn't explain why Windows is any safer, it just asserts it because some hacker HIRED by Microsoft said so.
Security isn't just about lack of buffer overflows. It's in the way users interact with the system, and the way they're encouraged to interact.
In most Linux distributions, the user is encouraged (by the design of the system) to, in almost all cases, either install cryptographically-signed software packages vetted and maintained by the vendor; or download a source package and compile it themselves. In Windows, the user is encouraged (by the design of the system) to download unsigned binaries from all over the Web and run them.
That's why I'll never consider Windows to be comparably secure to modern Linux distributions. Sure, you can keep a Windows system airtight, but the way the system is designed makes it require effort, so users, in general, don't.
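As a rough illustration of the gap (a sketch only: real repositories verify GPG signatures, which prove who published a package, whereas the bare checksum below only proves the bytes weren't altered; the file and its contents are stand-ins), here is the kind of manual check a user downloading an unsigned binary has to remember to do, and which a signed repository performs automatically on every install and update:

```shell
# Stand-in for a file downloaded from some random website.
printf 'pretend-package-bytes' > pkg.tar

# Compare the file's digest against a hash the vendor published out of band.
# A package manager does the equivalent (plus an authenticity check against
# the distro's signing key) without the user having to think about it.
sha256sum pkg.tar
```

The point is not that the check is hard; it's that security which depends on every user remembering an optional step is security most users won't have.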
I'm a Linux user and I agree, but I have to tell you: the lack of universal binaries you can just download and install on a Linux workstation is one of the reasons Linux will never be as popular as Windows or even OS X.
People want convenience over anything else. A computer has to be secure in spite of people downloading software from all over the web.
Considering how I use computers, most of the time I find it far more convenient to do an apt-cache search and, 20 seconds later, an apt-get install (or the equivalent in whatever package manager you use) than to download a random piece of software from a random Internet site. From that point on it's a risk if it doesn't auto-update, and if it does, it's via yet another rewritten auto-updater that permanently eats 50 MB of RAM for a single installed program, even when that program isn't running.
And considering the success of "app stores" and the general public's dislike of traditional Windows desktop computers (unless you're fool enough to help them for free with all the busywork maintenance tasks that don't even exist on more serious systems), I'd guess the centralized packaging and distribution model is also pretty convenient for the general public...
Apparently you haven't seen the latest trends in the Ruby development world. Here are two examples I've seen lately; the first is for OS X only, but the second is for *nix.
curl get.pow.cx | sh
bash < <(curl -s https://rvm.beginrescueend.com/install/rvm)
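Both of those one-liners execute whatever the server happens to send, sight unseen. A more cautious equivalent of the same install, sketched here with a local file standing in for the download (the script contents are illustrative), is to save the script, read it, and only then run it:

```shell
# Stand-in for: curl -fsSL -o installer.sh <one of the URLs above>
printf '%s\n' 'echo hello from installer' > installer.sh

cat installer.sh   # audit the script before executing anything
sh installer.sh    # run it only after inspection
```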
It's not necessarily my favorite OS, but I think that in general a lot of commenters have been living in the past (particularly with respect to performance and reliability) since Windows switched from 3.1 to NT.
To be fair, almost all of the proclamations I see from people about Windows being insecure are based on old, pre-SP2, pre-SDLC anecdotes and experiences (which were definitely true at one point, but haven't been true for quite a while).
It would be like if I railed against Linux being usable by complaining about how bad the sound subsystem was...
This Usenix video from last year of Crispin Cowan going over improvements to Windows security was interesting:
I'm not a security expert, but saying something isn't in the top-10 worst anymore isn't the same as saying it's now better; Flash and Java may simply have become more insecure.
I know that if a popup tells me I can get free cat screensavers at 'randomwebsitewitharandomname.com', it's most likely fake and might have a virus attached. I don't download it and my system is 'secure'. However, if I don't know it's fake (i.e., I'm not experienced with computers at all,) I might download it. Now my system is insecure. Whoopee.
You could say that a system is only secure if it can catch things like that before it's too late. The problem there is that Windows, like other OSes, does not come with any anti-virus program built into the system itself (I'm excluding security measures built into Windows as this doesn't fully protect it.) Windows is only secure from viruses when an antivirus program is used.
Of course, that doesn't make other systems safe. Just because Mac and GNU/Linux viruses are rare doesn't mean those systems are secure. This mentality is 'security by unpopularity'. The real argument is about how these systems would protect themselves against viruses that actually ran on them, if those viruses were as prevalent as Windows viruses and the systems were used as much as Windows. On that, we can only speculate.
My point is that I'm only secure if I know what I'm doing; software is not built to replace common sense.
I once installed Windows XP on a machine that got compromised in the first 20 minutes just because I forgot to unplug the network cable while installing it -- although I wasn't prepared for offline updating anyway.
When a machine gets compromised by just being connected to a network, that's pretty bad and I sure hope Windows 7 is a lot better.
You were doing a fresh install (no SP) of a 10 year-old operating system (xp) on an infected network and you were surprised you got compromised?
Now, XP is, first of all, not a great standard of security practice. But without any security updates, on a platform hackers have had 10 years to find exploits for? If you installed that on an infected network, I'd be surprised if you didn't get compromised.
On the one hand, the evidence provided to back this claim is pretty lame:
1. Microsoft vendor says so.
2. Windows blogger says so.
3. Person *hired* by Microsoft says so.
That said, if Adobe can no longer say "hey, at least we're not as bad as the people who make the operating system", we may finally see some effort by them. More public shaming the better.
The millions of hijacked Windows XP machines ready to do a botnet's bidding might beg to differ.
Let us never forget that Microsoft Windows is responsible for, among much else, the transfer of critical trade secrets, diplomatic communications, and weapons technologies to our competitors and enemies. If China wins World War 3 in 50 years, Windows, albeit indirectly, will be significantly responsible.
Windows 7 may be more secure, but XP is still a major drain on society.
By that same token, Windows was responsible for commoditizing the hardware by keeping the OEMs in competition with each other, leading to free fall of computer prices and thus putting a computer on almost every desktop and in most laps around the world. Even Apple moved to the x86 platform to take advantage of the scale and prices.
If you think Apple hardware is overpriced now, imagine if they didn't have any competition from Sony, Dell, HP, Acer, etc.
>She highlighted a fake document titled “Draft US-China Joint Statement” that was circulated among people with e-mail accounts at the State Department, the Defense Department, the Defense Intelligence Agency and Gmail. Clicking to download the document directed users instead to a fake Gmail log-in page that captured their passwords.
So that attack wouldn't have worked if the user was on an iPad or Droid or a Macbook Air or running the most hardened Linux computer. Right?
Zero days have been found in every browser/OS combo around. It's hard to see how OS X would fare better in a very targeted attack as Safari/OS X is usually one of the first to fall in PWN2OWN where the reward is a Macbook (not a win in a World War).
Stop getting your news only from places like Boycott Novell, Groklaw, Slashdot, HN and the comments there. It warps your mind.
You are correct that Windows was responsible for the commoditization of computing. You are also correct that many hacks occurred through means other than unpatched Windows machines. Neither of those truths changes the fact that Windows XP has been and continues to be a serious issue for national and corporate security.
Almost every computer in China runs a pirated version of XP that cannot be patched, is vulnerable, and is likely infected with some sort of botnet software. Many corporations and agencies in the US have been compromised this way as well.