I'm curious about what you mean by "telling". What is the tell that you perceive? I would like to understand whether I misrepresented myself.
I agree with you that none of it is easy. That is precisely why I used to doodle, crafting small projects to understand the core essence of what is going on: building an entire TCP/IP stack, writing a compiler, writing a database, an editor, etc. That practice has allowed me to deploy a fair amount of efficient code into production.
But now, I find myself in the role of a project manager telling my highly capable coding buddy what to do, a role that I do not relish.
I don't know if it came from my work getting a PhD, or from my work in startups [1], or from earlier than that, but I think any side project that has been done before is not worth doing. For me, any side project has to be something I can demo to an audience that, with a dash of showmanship, will knock their socks off.
For instance, I knew a machine-learning-based RSS reader was possible in 2004, and almost 20 years later it hurt that nobody else had made one, so I made one. I got interested in heart-rate variability and couldn't understand why I couldn't find any web-based HRV apps that used the BTLE API, so I made one.
I wrote the prototype of that using Junie, the agent built into IntelliJ IDEA. I had a lot of anxiety because how do I know if I coded it wrong or if the Windows Bluetooth stack is just being the Windows Bluetooth stack? The fact that I couldn't find public examples that could connect to a heart rate monitor made me wonder if there was a showstopper problem: what if I invest hours in studying the documentation and "it just doesn't work"?
With Junie I had something up and running in 20 minutes that I understood and was ready to continue developing. Now I can study the documentation and experiment with things without the fear that I'm going to get stuck.
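For what it's worth, the HRV-relevant part of that protocol is small and testable without any Bluetooth hardware. Here is a minimal Python sketch (the function name and sample bytes are my own illustration, not code from the app above) of parsing the standard GATT Heart Rate Measurement characteristic (0x2A37), which is where the RR intervals an HRV app needs actually come from:

```python
import struct

def parse_heart_rate_measurement(data: bytes):
    """Parse a GATT Heart Rate Measurement (0x2A37) notification payload.

    Returns (bpm, rr_intervals_seconds). The RR intervals are what an
    HRV app needs; they arrive as uint16 values in units of 1/1024 s.
    """
    flags = data[0]
    offset = 1
    if flags & 0x01:           # bit 0: heart rate is uint16, little-endian
        bpm = struct.unpack_from("<H", data, offset)[0]
        offset += 2
    else:                      # otherwise a single uint8
        bpm = data[offset]
        offset += 1
    if flags & 0x08:           # bit 3: Energy Expended field present; skip it
        offset += 2
    rr = []
    if flags & 0x10:           # bit 4: one or more RR-interval fields follow
        while offset + 1 < len(data):
            raw = struct.unpack_from("<H", data, offset)[0]
            rr.append(raw / 1024.0)   # convert 1/1024 s units to seconds
            offset += 2
    return bpm, rr
```

The nice thing about isolating this parser is that the "is it my code or the Windows Bluetooth stack" question gets smaller: you can unit-test the byte handling and leave only the transport in doubt.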
If you're making things that make no difference, like another TCP/IP stack and another compiler and another database and another editor, no wonder you have been working on it for decades and have nothing public to show for it. You could have made an implementation of any of those things that was unique and different and shipped it, which requires an entirely different kind of craftsmanship (whether you use AI or not) and leaves you with a very different kind of feeling in the end.
[1] like oil and water in most people's mind, but like peanut butter and jelly in my mind.
"no wonder you have been working on it for decades and have nothing public to show for it"
That's because it was never my intention to show it off. Your motivation comes from making something new and showing it off. My motivation comes from learning something new _to me_ and capturing aha insights, even if that thing has been done before. It isn't "wrong" per se, just a different path. I'm not necessarily interested in carrying my side project to completion, just as many artists carry a notebook for their sketches that are not meant for public consumption.
I too did a PhD and several startups and produced several new products and projects that were well received in the market, and incorporated much that I learnt from my side projects.
But your comment did help jog me out of my local minimum. Thanks for your input.
I'm finding that in this build fast and break things culture, it is hard to revisit a project that is more than 3 years old.
I have a couple of Android projects that are four years old. I have the architecture documented, my notes (to self) about some important details that I thought I was liable to forget, and a raft of tests. Now I can't even get them to load inside the new version of Android Studio, let alone build them. There's a ton of indirection between different components spread over properties files, XML, and Kotlin, but what makes it worse is that any attempt to upgrade is a delicate dance between different versions and working one's way around deprecated APIs. And it isn't just the mobile ecosystem.
I have relatively good experience with both Rust and Go here. It still works, and maybe you need to update 2-3 dependencies that released an incompatible version, but it's not all completely falling apart just because you went on vacation (looking at you, npm).
Build fast and break things works great if you're the consumer, not the dev polishing the dark side of the monolith (it helps if you're getting paid well, though).
As a consumer, I cannot remember a single feature I was so enamored of having a week earlier than I otherwise would have that it was worth breaking things.
No, they didn't do a "quick analysis". They were in a race with Linus Pauling to figure out the structure. Pauling's son happened to leak the fact that Linus Pauling's lab had a triple helix, so they casually asked the son for notes. That, along with Gosling and Franklin's X-ray data, convinced them that their own original model (and Pauling's) were flawed.
I'm surprised they weren't already. If one discards the ethical and moral issues (as one would expect from an Amazon product), they do have a lot of opportunities to work with each other's data.
This is just the systemization and scaling of what existed previously. It's a rather predictable pattern in America at this point: introduction through innocuous means, expansion through a combination of convenience and fear, then systematic expansion once the slaves have become accustomed to the new state of things. It's a rather common ratcheting normalization staircase.
Even public information clearly describes how it is the CIA's one-trick pony, whether it's orchestrating a "color revolution" for "democracy", instigating conflicts and wars to feign innocent self-defense, implementing social engineering and Constitutional subversion, or implementing mass surveillance specifically. It's the same wife-beater and child-rapist pattern of grooming abuse that then feigns innocence and deflects blame onto anything and anyone else.
Most people are really not all that different, psychologically, from any run-of-the-mill battered wife (even if only in the making). I get it a lot when I point out what a trap and what an illegitimate, enemy entity the EU is (not to pick on the EU, because it also applies to the US and many other places, but it's far more pronounced with the "EU-cultists")... You get the constant, predictable defenses of the love-bombing abusive boyfriend/wife-beater in the making: "you don't understand", "the EU really loves me", "you never want anything good for me", "he showers me with all kinds of benefits and slick marketing", "we are going to be happy forever".
It's sad, and as someone who has watched that cycle unfold even in my own family, it's really kind of demoralizing and somewhat depressing to know exactly where it's heading and to be unable to counter forces that took root a long, long time ago, forces of nature. So the US and the EU will have to suffer that which was predictable and preventable, no matter how much they wanted to see the world through rose-colored glasses.
Maybe for humanity's sake, China can free the world of the scourge of this cycle and the psychopathic, narcissistic, maniacal group of people that causes it all... if they don't just kill all life on the planet because if they can't be in control then no one can be in control.
This particular example of thundering herd isn't convincing. First, the database has a cache too, and the first query would end up benefiting the other queries for the same key. The only extra overhead is the network round trip, which a distributed lock would also incur.
I would think that in the rare instance of multiple concurrent requests for the same key, where none of the caches have it, it might just be worth taking the slightly increased hit (if any) of going to the db instead of complicating it further and slowing down everyone else with the same mechanism.
> The only extra overhead is of the network, which is something a distributed lock would also have.
Well, there's also the 'overhead' of connection pooling. I put it that way because I've definitely run into the case of a 'hot' key (i.e. imagine hundreds of users that all need to download the same set of data because they are in the same group). Next thing you know, your connection pool is saturated with these requests.
To your point, however, I've also had cases where frankly querying the database is always fast enough (i.e. a simple lookup on a table small enough that the DB engine practically always has it in memory anyway), so a cache would just be wasted dev time.