Many people here work for (or even own) companies that enable this sort of data collection, whether or not they take responsibility for it. As a community we are burdened with this issue, and it is our responsibility to stay informed about what is going on so we can understand and deal with it.
Rust looks really great and I would love to switch all of my C development over to Rust, but the last time I tried to compile it, it took a good hour or two. Has this changed at all?
Well, we have a production-quality optimizer (LLVM). So it'll always take some time to compile.
Eventually, once all of our LLVM patches are upstream and versions including them make it into common repositories, you may be able to use your system LLVM to build the Rust compiler, at which point compile times will drop dramatically.
We're also continually working on the compile speed of Rust code itself.
Will you also work on providing a better Windows package? I don't understand why you don't just bundle all the dependencies, given that they all seem to be open source and redistributable.
Call me spoiled, but I expect a one-click installer in this day and age. I still haven't actually tried to use Rust because of that, and I am sure I am not the only one.
But Windows is (unfortunately) a bit of a second-class platform at the moment (although there have been some very big strides in the last week or so). So part of the reason for the poor Windows packaging is that Rust isn't ready on Windows yet (even more so than its normal pre-alpha state on Linux and Mac).
The compiler is pretty slow (although a big drive started a few days ago to bring certain aspects down, mostly the time to compile small files, because the slow test suite was dragging down work on the compiler). But compiling the compiler itself is particularly bad:
- it has to build LLVM: Rust currently has to use a patched version; the goal is for this to be unnecessary so that the system LLVM is OK.
- it has to build itself and the standard libraries 3 times: rustc is written in Rust, so it has to bootstrap.
- it eats a lot of memory, so bootstrapping on low-memory systems is painful. Much of this is historical: large portions of the compiler were written when the language was less developed, and so are very non-idiomatic. It's progressively being modernised.
The first and last can be fixed; the second can't, at least until there are binary releases.
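For the curious, the whole dance looks roughly like the sketch below. The stage0/stage1/stage2 names are the ones the build actually uses; the exact commands and the --llvm-root flag are from memory, so treat this as an illustration rather than a recipe:

    # stage0: a pre-built snapshot of an older rustc is downloaded
    # stage1: stage0 compiles the current sources (new compiler, built by the old one)
    # stage2: stage1 recompiles the same sources (new compiler, built by itself)
    # the standard libraries get rebuilt at each stage too, hence "3 times"
    ./configure        # hypothetically --llvm-root=/usr, once a system LLVM is usable
    make               # builds the bundled LLVM first, then stage0 -> stage1 -> stage2

Binary releases would let end users skip all of this and just download the equivalent of a stage2 compiler.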
Given my experience with compiler development back when I was at university, my expectation is that the second point would be pretty fast anyway, if the first and last issues were solved.
I have good experience with languages that use modules; some of them, like the Oberon family, had bootstrapped compilers.
That is why I said I am hoping for these issues to be solved, as I think building LLVM is the main culprit.
There are periodic snapshots of the current master branch, i.e. a normal bootstrap saved as a binary (made when someone feels like it, normally every one or two weeks). These are uploaded to rust-lang.org and downloaded (once; they are cached) when building a revision that requires that snapshot.
Exactly, this is the weirdest part to me. The US government feels alright with the fact that they have publicly declared anyone living outside the US a possible enemy. That is definitely not how I feel at all as a citizen and that viewpoint is obviously extremely dangerous. It seems like the US is becoming something like a xenophobic elderly person who isn't able to adjust to the times.
Thanks for saying this! ("That is definitely not how I feel at all as a citizen")
Yet another two cents: the idea of side-stepping national regulations isn't that far-fetched. At least the UK and France are perfectly capable of operating their own systems (and are already employing them on a limited scale). So this isn't a constitutional affair of domestic interest only; it has a human rights aspect too.
And as a European citizen, I'm feeling pretty much under attack by this. I think there's a foreseeable point in the future where the remote threat of terrorism begins to wear off while people still feel exposed to a foreign surveillance they can do nothing about. Eventually this could be the single greatest damage to the reputation of the US.
>The US government feels alright with the fact that they have publicly declared anyone living outside the US a possible enemy.
I disagree with this. The people tasked with preventing foreign powers from attacking, destabilizing, or hindering U.S. interests have that as their stated mission, and as such it is their minimum starting point.
If you look at other arms of the USG, namely the State Department, their charter is exactly the opposite: building partnerships, making friends, etc.
So yes, the DoD and CIA view the rest of the world as hostile to the US by default; that's what they are there to defend against, so it makes sense. That said, there are tons of partnership-building, humanitarian relief, and international relations missions running from the DoD to foreign allies and partners. It's actually growing quite a bit - which likewise makes defending the nation against materiel proliferation, attacks, espionage, etc. harder.
Sorry for intervening (as this wasn't exactly a reply to my post).
I really would like Americans to understand that there is great concern outside of the US about the directions the US, its government and its arms are taking.
The US had built a great reputation in the 20th century as "the beacon of freedom" – and lost some of it during the G.W. Bush administration. Remember the very warm welcome Obama received when visiting Europe just before his inauguration (for example in Berlin)? This was really an expression of the wish to get back to terms as they used to be, and to close the book on what then seemed to have been just an episode. Or take Obama's (quite premature) Nobel Prize as yet another expression of this wish.
In the meantime things have changed, but it wasn't the change expected. From outside, it looks a bit as if the US has fallen out of balance. Of the State Department and the other arms directed at foreign policy, not much is perceived abroad. (I really can't remember when the US State Department was last in the news, but it feels like it was years ago.) What is perceived is the intelligence apparatus, the DoD, drones, the NSA, etc. From outside it appears as if the US, with its self-description as the "blessed nation", has lost interest in co-operation on a large scale, as it targets even its closest allies. What had been the epicenter of freedom, cool, and hip has started to feel a bit like a looming shadow. (It might be worth noting here that most political parties in Europe have their origins in the revolutions of 1848, which were essentially a revolt against surveillance and police control. Even in the social network age some of these values are still alive and nurture a certain sensibility on this subject.)

There's even talk of the cold war returning, but this time with the US playing the bad guy. This is not what the allies of the US have learned to expect from their partner, nor is it what its friends would wish it to be. It is not how people would like to perceive the US, but eventually they begin to feel unable to do anything about it. There's a feeling of disappointment. And there's a great wish for co-operation and trust.
I think those are all totally reasonable and justifiable feelings. I agree in general that our military/intelligence arm has been carrying the US brand the loudest since 9/11. I think that was by design. I won't argue whether that is good or bad, because I could make the case either way.
My intent was to say that yes the self protective parts of the government are going to be inherently anti-foreign, by design. I think what your comment adds is basically replying to that with: "Ok, well if you guys make that part of your government the loudest and strongest part to the rest of the world, we probably will stop liking you guys, and will stop wanting to play with you."
I think that is a very valid criticism. It is one for the legislators and the public to take on, however, not for the arms of the government that are intended to protect it.
Hm, every new bit of news relating to the US government and the Internet makes me question what exactly the Internet is doing for me and if I could live without it.
Lately I've begun to realize that within the next 5 years my Internet usage will be extremely minimal (if I use it at all) unless there is some kind of huge change.
I'd rather just cancel all of my accounts on major websites than be forced to use this creepy ID system.
I feel like most people don't consider that the content people post on Facebook is posted because the person wants everyone to see it... They wouldn't post about the absolutely mundane stuff that we all go through every day unless it's significant in some way. This is why it can be depressing: you're looking at the best moments of hundreds of your friends' lives while you're just sitting there at the computer.
But when you post something interesting, others feel the exact same way about you that you did when they posted something interesting. Sort of a weird situation.
Why are people so interested in recreating a 'more open X' (where X is something like Dropbox, Skype, etc.) when they could actually be iterating on these ideas and creating something different? Is it because there is lots of money/publicity surrounding this newfound interest in "openness", or do these people actually believe that this is the most important thing to be working on? I am just trying to imagine what kind of new openness we could create if we began thinking outside the framework that current applications already provide.
This is not to say that they're not doing enough; it's just a question that's been bouncing around in my head lately.
I totally agree; this isn't necessarily about the projects listed in the article. It is more a reaction against the overabundance of announcements like "it's like X but open!" that don't really provide any significant openness over Google Drive or whatever they're attempting to replace.
After checking out the projects mentioned in the article I would definitely say that they don't fall into this category.
> Why are people so interested in recreating a 'more open X'
To me, the biggest benefit is avoiding the problem of orphaned products.
Think about the problems caused by the shutdown of a major web service, for example Google Reader. Reader users relied on a unique service provided by a single company. When the company decided, for whatever reason, to discontinue the service, those users were left high and dry.
If you use a self-hostable version that runs on a commodity technology stack (i.e. the underlying OS/webserver/Redis/whatever layers are offered by many different providers and/or self-hostable), you can be pretty sure you'll be able to retire the service on your schedule, not the provider's.
If the product is open-source, this is even better because it makes the code more resistant to "bit-rot" (the tendency of code to stop working even though no changes are made, due to changes in lower layers).
Openness lets people who aren't ready to reimplement the whole thing iterate on their idea on top of what already exists, rather than redoing the 98% of the work that is the same as the existing product before getting to the even harder 2% of brand-new concept work.
Camlistore is "like Dropbox" only for the purpose of a Wired article. In fact it is one of the most exciting projects around on the internet today.
It is a little like BitTorrent. When BitTorrent first came out, there was a large community who were most excited about it because "it was open" - both open source and open in the sense that it didn't require servers.
Has that "openness" been important to the success of BitTorrent? Perhaps, but most people now talk about more specific qualities: the fact that it resists legal attack, and the high download speeds that can be obtained without paying for bandwidth. Some of these are actually what people meant when they spoke of "openness" initially.
So "open" = "nice", but discussion of the specific qualities that it implies is much more important.
I'd note that "open" is only mentioned once on http://camlistore.org/, and that is a very specific quality ("Open Source" - not some general "open" principle).
I agree that we would do well to think completely outside the existing framework of services. Under present circumstances, openness has moved to the forefront of people's thinking, in place of imagining more fundamental changes. This is probably a good thing, but even better would be building completely new and open tools and services.
I'd imagine it's not seen as an improvement because (most) people don't value it.
You might as well say you're improving a service by localizing it for Esperanto. To some (small) audience, that may well be an improvement, but most people are going to wonder why you're "wasting" that time.
That may be true; however, while we may understand that Google has been doing this since day one, many people have no idea about this kind of stuff. Not everyone understands the Internet and how everything works... to some people, I'd imagine, this could be quite shocking.
But I do agree that this is definitely part of a trend, and it could have been reported on at any time over the last couple of years.
I personally found that some packages wouldn't install because they had dependencies that were broken in some way. This was about a year ago, though, so I don't know how things have changed since then. Also, node packages tend to be small and specific, so npm downloads many dependencies for each package. Again, I only played around with node for a little while, but I got a bit frustrated with all the packages and dependencies.
I think the issue you are describing really stems from poor use of semantic versioning. People peg to the specific revisions they happen to use rather than depending on the general major or minor revision level, so small fixes don't naturally propagate to second- or third-level dependencies. There's no good solution to the problem that doesn't lead to other sorts of hell (imagine asking users to manage the revisions of third-level dependencies).
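To make the pegging concrete, here's a minimal package.json sketch (the package names are made up; the range syntax is npm's standard semver notation):

    {
      "dependencies": {
        "pegged-util": "1.2.3",
        "ranged-util": "~1.2.3"
      }
    }

"pegged-util" is locked to exactly 1.2.3 and will never see a 1.2.4 bugfix without a manual update, while "ranged-util" uses a tilde range and picks up any 1.2.x patch release automatically. If everyone used the second form, small fixes would propagate through second- and third-level dependencies on their own.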
Were you using Windows? Although node.js "works" on Windows, there are a number of modules that only compile under UNIX, which makes Windows basically useless for node development.
Your analogy would only work if the website were actually hosting its own analytics. Using Google Analytics is quite literally participating in a larger tracking program.
It is not the same as individual shops collecting data on customers for their own use.
I don't know the details of what they do with it. But, for instance, someone can purchase ads targeting a specific type of user, and Google probably uses this browsing data to decide whether or not a user should be targeted by a specific kind of ad. That advertiser in turn gets better ad targeting because you use Google Analytics on your website.