So... it's actually a reasonable objection to bzip2, then? I mean, you explained why it does not work with bzip2.
I think their argument is sound, and it makes bzip2 less useful in certain situations. I was once saved while resolving a problem when I figured out that concatenating gzipped files just works out of the box. Otherwise, it would have meant a bit more code, lots of additional testing, etc.
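For anyone curious, that property is easy to demonstrate with Python's stdlib (just an illustration, not the code from the incident I mentioned):

```python
import gzip
import io

# Two independently gzipped chunks, as if produced by separate runs of `gzip`.
part1 = gzip.compress(b"hello ")
part2 = gzip.compress(b"world\n")

# Naive byte concatenation of the two compressed files...
combined = part1 + part2

# ...decompresses to the concatenation of the original data, because a
# gzip stream may contain multiple members back to back.
with gzip.GzipFile(fileobj=io.BytesIO(combined)) as f:
    result = f.read()

print(result)  # b'hello world\n'
```

This is exactly why `cat a.gz b.gz > c.gz` produces a valid archive; bzip2's block structure is what complicates the equivalent trick for some decompressors.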
Totally agree with the statement, though I feel it's not an objection to bzip2 itself but to how it was implemented in programs that use it. But I'm not 100% sure, since admittedly I did not personally reverse engineer bzip2-capable programs to see the current state of affairs. I am simply going by descriptions posted in comments and general system knowledge.
How to compress data has little to no relation to how that compression can be implemented in programs. Still, how it's implemented will reflect on how the quality of the algorithm is perceived, because the two are not separate from a user's perspective.
I write lots of automated tests, but almost always after the development is finished. The only exception is when reproducing a bug, where I first write the test that reproduces it, then I fix the code.
TDD is about writing tests first, then writing the code to make those tests pass. I know several people who gave it an honest try but gave up a few months later. They still advocate that everyone should try the approach, though, simply because it will make you write production code that's easier to test later on.
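In case it helps anyone picture the loop: a minimal red/green sketch, using a made-up `slugify` function (the name and behavior are mine, not from any particular codebase):

```python
# Step 1 (red): the test is written before the code it exercises.
# Running it at this point would fail with a NameError.
def test_slugify():
    assert slugify("Hello World") == "hello-world"

# Step 2 (green): write just enough code to make the test pass.
def slugify(title: str) -> str:
    return title.lower().replace(" ", "-")

# Step 3: run the test; refactor with the test as a safety net.
test_slugify()
print("ok")
```

The discipline is the ordering, not the tooling; the same loop works with unittest or pytest.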
... hmm, just looked it up. According to some sites on the web, TDD was created by Kent Beck as part of Extreme Programming in the 90s, and automated testing is a big part of TDD. Having lived through that era, thinking back, I'd say that TDD did help to popularize automated testing. It made us realize that focusing heavily on writing tests had a lot of benefits (and yeah, most of us didn't do the test-first development part).
But this is kind of splitting hairs on what TDD is, not too important.
I don't agree with this. In my view, there are plenty of cases where the product changes are shoved down our throats.
I think the problem is that the product folks don't actually listen to the market. They read Jobs' biography and are convinced that they will tell their users what product they will like and that they will see the light later on.
The sad reality is: they are not Jobs (and even he was not faultless). So we get Mac-like Windows interfaces, we get mail clients losing features, we get AI in every single app you see, etc.
Funny story: I once bought and started up Galactic Civilizations 3.
It looked horrible; the textures just wouldn't load no matter what I tried. Finally, on a forum, some other user, presumably also from Europe, noted that you have to use a decimal point as the decimal separator (my locale uses a comma). And that solved the problem.
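I have no idea how the game actually parses its data files, but the general failure mode is easy to illustrate: a parser that assumes a point as the decimal separator chokes on comma-formatted values (the strings below are made-up examples):

```python
# A numeric value as some European locales would format it, versus
# the point-separated form most parsers expect.
value_eu = "1,5"
value_us = "1.5"

# The point form parses fine.
print(float(value_us))  # 1.5

# The comma form raises, because float() only accepts the point.
try:
    float(value_eu)
    comma_parsed = True
except ValueError:
    comma_parsed = False
    print("parse failed for", value_eu)
```

The subtle version of this bug is the reverse: code that *writes* numbers via a locale-aware formatter and later reads them back with a locale-naive parser, which is consistent with textures silently failing to load on comma-locale systems.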
In my experience, companies are perfectly happy with US companies, as long as the data doesn't leave Europe. This means we have to prove we only store data in European datacenters.
I guess that's fine for now, but it would be better if we could get European alternatives to AWS or GCP.
And why wouldn't this European equivalent do something that a lot of people in Europe dislike too, in the future? The entire model of large cloud companies is bad.
US companies are subject to US laws, so your data will never be safe. Companies can be gagged, forced to hand over their customer data, and forced to lie about it, by law!
I'm not sure if it's accurate, but according to the summary on Wikipedia at least, the law "provides mechanisms for the companies or the courts to reject or challenge these if they believe the request violates the privacy rights of the foreign country the data is stored in."[0]
If that's accurate, your country's privacy laws would supersede US law. That said, as things are going, it's unlikely that they do.
Yes, development was being done in SVN but it was a huge pain. Continuous communication was required with the server (history lookups took ages, changing a file required a checkout, etc.) and that was just horribly inefficient for distributed teams. Even within Europe, much more so when cross-continent.
A DVCS was definitely required. And I would say git won out due to Linus inventing and then backing it, not because of a platform that would serve it.
I'm not sure; it seems I did misremember. Though it's possible I was actually working with needs-lock files. I can definitely see a certain coworker from that time putting this on all files :/
And even in P4, you could check out files only at the end, kind of like `git add`. Though this could cause some annoyance if someone had locked the file upstream.