I have used id() in production code, but the circumstances were extremely specific: It was in some monitoring-related code that was trying to dig down into the guts of the interpreter for the purposes of tracking memory usage on the level of individual pages. It was a whole thing.
Which is to say: Users of Python almost certainly don't want to call id(), unless they are mucking around with something related to the functioning of the interpreter itself. The only other circumstance in which I've used it in anything like real code was in a quick hack where what I really wanted was to put a bunch of dicts in a set, but you can't do that, so I made a set of their IDs instead (but this wasn't in what I would call "production code").
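That dicts-in-a-set hack can be sketched in a few lines. The important caveat is that it dedupes by object identity rather than equality, and an id() is only unique among live objects, so you must keep references to the objects for as long as you use their ids (here, the list does):

```python
d1 = {"a": 1}
d2 = {"a": 1}  # equal to d1, but a distinct object

items = [d1, d2, d1]

# Dicts aren't hashable, so a set of dicts fails; track their ids instead.
seen = set()
distinct = []
for d in items:
    if id(d) not in seen:
        seen.add(id(d))
        distinct.append(d)

# d1 appears once, and d2 survives even though d2 == d1, because this
# dedupes by identity, not equality. Remember: an id() may be reused
# once its object is garbage-collected, so hold references while using it.
```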
In general, most of the "wtfs" listed here are pretty banal. If you're using `is` on immutable objects, you're basically just asking for trouble. The functioning of string and integer interning and constant folding are implementation curiosities, and if you ever write code in which such differences matter, then you have made an error.
When it comes to profiling in Python, never underestimate the power of the standard library's profiler. You can supply it with a custom timing function when instantiating the Profile type [1], and as far as the module is concerned, this can be any function which returns a monotonically-increasing counter.
This means that you can turn the standard profiler into a memory profiler by providing a timing function which reports either total memory allocation or a meaningful proxy for allocation. I've had good results in the past using a timing function which returns the number of minor page faults (via resource.getrusage).
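Here's a sketch of that trick, under a couple of assumptions: it's Unix-only (it relies on the `resource` module), and the function names are mine. All the profiler needs is a timer that never decreases, and the process's cumulative minor page fault count satisfies that:

```python
import cProfile
import pstats
import resource


def minor_page_faults():
    """Custom timer: cumulative minor page faults for this process.

    Monotonically non-decreasing, which is all cProfile requires
    of a timer function.
    """
    return resource.getrusage(resource.RUSAGE_SELF).ru_minflt


def allocate_some_memory():
    # Touch a few megabytes so some minor page faults are likely.
    return [bytearray(4096) for _ in range(1000)]


profiler = cProfile.Profile(timer=minor_page_faults)
profiler.enable()
allocate_some_memory()
profiler.disable()

stats = pstats.Stats(profiler)
stats.sort_stats("cumulative").print_stats(5)
```

The report looks like an ordinary profile, except the "time" columns now count page faults attributed to each call rather than seconds, giving a rough per-function picture of memory pressure.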
There's a whole generation of games from the mid-to-late 90s (and perhaps into the early 00s) that's remarkably annoying to play these days because of this issue. The game I've personally tried to get running, every once in a while, is Mechwarrior 3, but it's a complete disaster. The last time I tried to run it, I actually managed to load into the first mission, but the in-game physics were remarkably broken, with the amusing result that the first enemies you face in the game (a couple of little tanks) drove up towards me, hit a little bump, and then immediately rocketed into the sky.
At least part of the issue is that the game uses multiple threads, but was designed for systems with a single CPU, with a clock speed that's considerably slower than is present in modern systems. Something about this difference in timing breaks the whole thing in ways which are diverse and inexplicable.
Now, this comment thread contains plenty of possible solutions I could attempt, but if it's really a matter of the game relying on something like the CPU speeds of contemporary hardware (not to mention contemporary graphics hardware) then I start to think that I'd need to track down some kind of Pentium 3-era gaming PC to really make it work.
Part of the issue there is that the 3D-acceleration-enabled versions of Mechwarrior 2 (and Mercenaries) were buggy as heck even at the time. (Mercenaries was even pretty buggy before they did the 3D card patch, and it only got more so.) It really doesn't help this kind of games preservation when the games themselves were kind of junk, from a technical standpoint.
I played a ton of Mechwarrior 2 (ATI 3D Rage Edition) - the Aptiva bundle version - and didn't notice it being buggy.... Maybe this is a later version and earlier versions were more buggy?
My own experience was with the 3D-enhanced version of Mechwarrior 2: Mercenaries, specifically. It is possible that the equivalent version for Mechwarrior 2 was more stable. It's also possible my 3D drivers of the time were the cause of the issues I had, but Mercenaries was sufficiently buggy in all sorts of other ways (even the DOS version loved to crash) that I always just blamed the game.
I am taken a little aback by the name. There is, of course, already the D programming language, which has even had a major version number of 2 for quite some time.
It also brings to mind the JavaScript library d3, which, while not strictly for making diagrams, can easily lend itself to the purpose.
Calling this thing "D2" seems potentially confusing.
"D4" library/tool for Declarative Data-Driven Documents[4]
"D4" implementation[5] of the data language specification[1]
Overall, I think "D2" is objectively the best choice here. We have at least three "D"s, two "D4"s, and one "D3", so it makes sense to put it in as "D2". I certainly wouldn't want another "D" or, heaven forbid, a "D5".
I wish developers would stop with the one- or two-letter names for their products. In most cases there is already another product with the same name, and you just cannot do a web search for it without a hassle.
All right, behold my idiotic story about the time I got a job for one of the major tech companies.
It's the late 00's. I am in my early 20s. I have an associate's degree from the local community college, and I have been messing around with programming since I was 8 years old. I am unemployed, having not long before been laid off from a minimum-wage job at a failing retail establishment. I am living at my parents' house, spending most of my time messing with code and hanging out on IRC.
I am contacted on IRC, one day, by a recruiter for one of the major tech companies. He tells me that I sound like I know what I'm talking about, and would I like to interview for a job?
Well, yes, of course I do.
This major tech company has an office in a city a few hours' drive from where I live. I go there for the interview process, which is somewhat grueling.
I don't get the job. I had a brain fart and whiffed some basic question about computational complexity; I'd forgotten what O(n log n) means. So it goes.
Perhaps a week later, the recruiter calls me back, saying that, out of the blue, a different subsidiary of the major tech company would like to interview me. This subsidiary is an acquired startup, a major website in its own right, and would I be interested in interviewing there?
Well, yes, of course I am.
This subsidiary's office is located in the Bay Area, which is a somewhat greater journey, but I fly there and do the interview process again.
I got the job, that time. I aced the interview, in fact. Pro tip: It is a good sign if you manage to blow through all of an interviewer's prepared questions, and force him to resort to asking riddles, and then manage to answer the riddle correctly, too.
It was only later, after I started to work there, that I learned the full story. The website's ops team needed to hire someone. Although this subsidiary was owned by the major tech company, it was still in many ways a separate company, not yet assimilated into the greater corporate entity. Thus, they basically just walked over to HR, and asked to poke around in their resume database.
They did a simple keyword search, and my resume popped up. It was a total coincidence. One of the keywords they used was the programming language they used, which I had also happened to use in some open-source stuff. Another keyword was related to the subject of the subsidiary's website, which, entirely unrelatedly, also happened to be a word used in the name of the retail job I had recently been laid off from.
My resume was very nearly a blank sheet of paper, aside from these things. I was told that the site's lead architect, on seeing it, reacted along the lines of, "Oh, we have got to interview this guy."
So they did, and I spent the next several years working there.
This is an incredibly stupid story. There is absolutely no part of this process which I would point to as advice for other people. To the extent that this has led to my success, it was blind, stinking luck, and I doubt it would ever happen again.
I'm not sure what moral you could take from this. "Know your shit" seems patronizing. I would like to think that possessing technical know-how with no relevant degree, certifications, or experience is still a state which can lead to success, but I suspect that such extraordinary luck is still a necessary component as well.
I think people massively underestimate the “being in the right place at the right time” factor in their lives.
My current job - the company got a big contract and desperately needed to scale up. I was fed up with my previous job, and just happened to have used Django. Got headhunted and got a huge pay rise. Got to the company to find that they literally couldn’t find Django developers for love nor money locally. I’d only really used Django on a tiny side project in my last job - had I not used it, they probably wouldn’t have been interested in me here.
It takes time - now I’m here, we’re training existing people up, but if they’ve not got a good mental model of backend development then it’s going to be a challenge.
I once wrote a popular browser-based tool for a popular game. Other tools existed in the space at the time I started the project, but I felt the need to go at the problem my own way, and I managed to add some mathematical sophistication that the other extant tools lacked. I did all of this purely for my own benefit: I was playing this game, and I wanted this tool to exist to aid in doing so.
I threw it online for other people to use, and it quickly gained traction. I'd estimate that it has had over ten thousand users, possibly multiple tens of thousands. Some of these users expressed, unprompted, an interest in giving me money for providing this tool. I simply set up a page on a payments website that caters to creators, and linked it from the tool. I provide no benefits for contributions; it is purely to provide an avenue for those that wish to do so.
If you total up the amount of work I've put into the project, and the total amount of money I've received, I would estimate I've earned somewhere in the vicinity of $1/hour for my work.
I didn't do the project for the money. I did it because I wanted the thing I made. It is an entirely selfish project. That others find it useful is gratifying, but the motive wasn't profit, and to try to pivot to a profit motive would be nonsense to me. (Not to mention, I don't want to insult the developers of the game for which the project is an aid, by charging money off the back of their work.)
I run a language learning site / web app that is totally free, no ads too, going on about 8 years now. It's for one specific language that has not had much attention from the software community but could use it.
I've never tried to promote it and have never updated it much, but there have been some random people over the years who have given me money. I've had a few people send me $50 unprompted, plus some people I met IRL pay for dinners and things like that when the topic came up.