Hacker News | rewgs's comments

> I have a Mac, not an XDG desktop. I would only expect and want X applications I run through Xquartz (all zero of them) to follow that.

XDG has nothing to do with X11. XDG stands for "Cross-Desktop Group," and is designed specifically for any Unix or Unix-like operating system, which includes macOS.


XDG stands for X Desktop Group. It absolutely does not stand for Cross Desktop Group and has nothing to do with macOS or Windows, outside of the aforementioned X apps on macOS via XQuartz, which as far as I know is completely dead.

the successor to xdg, freedesktop.org, however, acknowledges the need for cross-platform openness. that's exactly why you can indeed configure where the three main "stores" of compliant applications live: their config, their data, and their caches.

you can point them to %APPDATA%..., ~/Library or the Linux defaults.

my point in this is: there are free and open conventions, and we wouldn't need this "my HOME is cluttered" fuss if technical teams embraced them.

so why don't they respect XDG_ env vars for their config and data?
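respecting the spec costs almost nothing, too. a minimal sketch in Python (the function and app name are illustrative placeholders, not any real library's API), assuming the basedir rule of $XDG_CONFIG_HOME with a ~/.config fallback:

```python
import os
from pathlib import Path

def config_dir(app_name: str) -> Path:
    """Resolve an app's config dir per the XDG basedir spec:
    $XDG_CONFIG_HOME if set, otherwise ~/.config."""
    base = os.environ.get("XDG_CONFIG_HOME")
    return (Path(base) if base else Path.home() / ".config") / app_name
```

the same two-line pattern applies to XDG_DATA_HOME (~/.local/share) and XDG_CACHE_HOME (~/.cache).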


> the need for cross platform openness.

The need for cross platform openness? Did anyone at FDO ask Apple or Microsoft if they wanted to comment or make suggestions on the basedir spec? Did FDO look at either of the platforms' existing specifications and see if there were any ideas they could copy? Or did they just do a Linux-oid thing, disregarding norms and specifications that already exist on the other platforms they didn't invite to the party, and force some weird thing onto those platforms under the assumption that it was good because it's open and cross platform?

My complaints come from two things.

1. On my Linux computers, I am inundated by bad applications that do not follow the conventions set out by XDG for basedir. The one good thing basedir lets me do is set my own goddamn directories that aren't dot-file trash, but I still can't depend on it being followed or respected. On this, we seem to agree. But I think the defaults from basedir are terrible and dot-files in general are terrible stupid things, hence the rant.

2. On my Windows and Mac computers, there is no reason for XDG envvars to be set, because those platforms do not purport to follow the XDG basedir specification. Thus, even though I could set an envvar for XDG_CONFIG and so on, they are more often ignored by crappy developers. However, I am still highly annoyed by now having at least two places where all of this crap might be, because even if an app developer follows XDG, it is up to me to force them to follow the platform's conventions. All of the basedir directories already have better analogues on macOS: ~/Library. Why shouldn't it be on the developer to do a simple `if macOS then put config data in ~/Library/Application Support/APPNAME and caches into ~/Library/Caches/APP_NAME`, as is expected and typical on the Mac? Especially since there are no easy ways to set envvars for GUI applications on the Mac. What, am I supposed to make my own shortcuts for everything, setting 5 XDG_BLANK envvars to launch an app instead of double-clicking an app bundle?
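The `if macOS then ~/Library` branching described above really is that small. A sketch in Python (function and app names are hypothetical, for illustration only), assuming each platform's documented convention with XDG as the fallback:

```python
import os
import sys
from pathlib import Path

def config_dir(app_name: str) -> Path:
    """Pick the platform-native config location, falling back to XDG."""
    if sys.platform == "darwin":
        # macOS convention: ~/Library/Application Support/<app>
        return Path.home() / "Library" / "Application Support" / app_name
    if sys.platform == "win32":
        # Windows convention: %APPDATA%\<app>
        appdata = os.environ.get("APPDATA", str(Path.home() / "AppData" / "Roaming"))
        return Path(appdata) / app_name
    # Everything else: XDG basedir spec
    base = os.environ.get("XDG_CONFIG_HOME")
    return (Path(base) if base else Path.home() / ".config") / app_name
```

Caches work the same way, swapping in ~/Library/Caches and XDG_CACHE_HOME.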


Err, nope, it is 100% Cross Desktop Group: https://www.freedesktop.org/wiki/

That said, you are correct that it has nothing to do with Windows (and I never said that it did).


That is a backronym[1]; it absolutely meant X Desktop Group, and likely changed to "Cross Desktop Group" when they switched to Wayland. D-Bus, .desktop files, and MPRIS are all listed as FDO specifications alongside the basedir spec, and none of them are appropriate for macOS either.

FDO applies to Linux and "Unix-like" operating systems. macOS is not "Unix-like": Apple still bothers to get it certified under UNIX 03, so it is technically a Unix, not a Unix-like. Again, just because it has a /usr folder and a /var folder and can run a bash shell out of the box doesn't mean all the same, mostly-just-OK standards from Linux should be copy-pasted over.

[1] https://lwn.net/2000/0427/a/freedesktop.html


"gen 2x2" is Microsoft level bad naming.

And USB gen 4x4 is for off-roading.

Is there a right kind of anthill to sit on?


In my area, the wrong kind of anthill contains anything in the genus Myrmecia, and the right kind contains almost anything else.


I used to play on top of a giant (for kid me, anyway) anthill in a nearby forest.

That's how I learned that forest ants, at least the local ones, are incredibly docile. I never got bothered by them.


Bullet ants, on the other hand, are not fun. Not even a little bit.


If you’re an ant, sure!


There's a Far Side cartoon in that

"Oh boy, look at that all that melting ice cream.. I hope he sits on our anthill!"


black ants, cuz they tickle the nethers instead of biting them


I'm a lifelong musician, went to music school to study jazz and orchestration, was a professional film composer for 15 years prior to pivoting to programming. I've read quite a few books on the intersection of math and music.

And not once have I ever felt that these so-called intersections were anything other than contrived.

Of course we can interface with music from a mathematical perspective, but that doesn't mean that we should, or that there's anything particularly illuminating to glean from doing so.

Beyond the very basic math (honestly even that's perhaps too strong a word -- just because something is expressed in numbers doesn't make it _math_) of time signatures and some harmonic concepts up to maybe some of Slonimsky's work, doing so is IMO a fool's errand that exists only to fill space on a TEDx stage.


Holy shit, a wild Everett Bogue sighting. I read your blog way back. Hope you’re doing well!


lol, email me! I'm still an active user. ev@evbogue.com or 773-510-8601


But here's the thing: learning Android dev is nothing like "learning" to use an LLM.

Obviously there are tons of tools and systems building up around LLMs, and I don't intend to minimize that, but at the end of the day, an LLM is more analogous to a tool such as an IDE than a programming language. And I've never seen a job posting that dictated one must have X number of years in Y IDE; if they exist, they're rare, and it's hardly a massive hill to climb.

Sure, there's a continuum with regards to the difficulty of picking up a tool, e.g. learning a new editor is probably easier than learning, say, git. But learning git still has nothing on learning a whole tech stack.

I was very against LLM-assisted programming, but over time my position has softened, and Claude Code has become a regular part of my workflow. I've begun expanding out into the ancillary tools that interact with LLMs, and it's...not at all difficult to pick up. It's nothing like, say, learning iOS development. It's more like learning how to configure Neovim.

In fact, isn't this precisely one of the primary value propositions of LLMs -- that non-technical people can pick up these tools with ease and start doing technical work that they don't understand? If non-technical folks can pick up Claude Code, why would it be even _kind_ of difficult for a developer to?

So, I'm with the post author here: what is there to get left behind _from_?


"must have X number of years in Y IDE"

Not quite on topic, but as an engineering manager responsible for IDE development, I had a hard time explaining to recruiters and candidates that I wanted engineers who developed IDEs, not just used them. Unfortunately, that message couldn't get through, so I saw many resumes claiming, say, 5 years of Eclipse experience, but I would later determine they knew nothing of the internals of an IDE.

Presumably, people now claim 3 years of machine learning experience but via ChatGPT prompting.


My theory is that they're going to release a new Mac Pro that's about half the size of the current one. Enough space for some PCIe slots, but otherwise smaller given the enormous amount of wasted space in that thing since moving from Intel to Apple Silicon. Guessing the rack-mount model, should they continue selling it, will be 3 or 4u instead of 5u.

I know everyone thinks they're going to just kill it, but I don't see it. Apple's move under Tim Cook has been to exhaust supplies (see: filling the Intel Mac Pro chassis with air and not updating the CPU), letting people predict its death (see: 2013 -> 2019 Mac Pro silence), and then redesigning it into something people want while utilizing it as an opportunity to segment specs across their SKUs.

The Studio will remain the high-powered creator machine, whereas the Mac Pro will be retooled into an AI beast.


Why people buy the Studio with the high-RAM config is actually the unified memory. This is unique to Apple. I'm not sure what a Mac Pro would do with PCIe cards. It would be useless for AI, because what you want is unified memory that can be used by the GPU for AI, not just plain RAM.


It's not entirely unique to Apple: the Ryzen AI Max platform (in, e.g., the Framework Desktop) is a unified memory platform. The PlayStation 5 also has a unified memory architecture (which, given that the chip was made by AMD, is not too surprising). (People sleep on PlayStation hardware engineering; they're far better at skating to where the puck is headed than most hardware tech companies. Remember Cell?)


Thank you! I was not aware of the Framework Desktop. Unfortunately it seems it's even more limited in RAM (128GB vs 512GB on the Mac Studio).


> I'm not sure what Mac Pro would do with PCIe cards .

Video and audio engineers [1] would like to have a word. Not to mention PCIe network cards. And they do use all the slots in the Cheese Grater, although I believe a modern version could have cut those in half.

[1] https://www.production-expert.com/production-expert-1/2020/7...


PCIe cards would indeed be useless for AI unless Apple supports third-party GPUs, but there are certainly some pro creators that would still prefer to have them. I myself work in large-template film/game scoring and while we all love our Mac Studios, they're usually housed in a Sonnet chassis so that we can continue to use PCIe cards. Had Apple kept them in parity with the Studio w/r/t CPU and RAM, the rack-mount version of the Pro would've been a no-brainer.


It is already a walking zombie, Apple clearly no longer cares about the workstation market, regardless of how many "I still believe" t-shirts get sold to wear at WWDC.


I'm selling one! Email in profile, get in touch.


The article specifically talks about B2B and MDM-like features. The "average consumer" isn't the point here -- rather, governments, defense, high-security corporations, etc.


I always try to keep in mind that we typically think of software as having three versions -- alpha, beta, and release -- before it's considered even kind of "finished."

In my own work, this often looks like writing the quick and dirty version (alpha), then polishing it (beta), then rewriting it from scratch (release) with all the knowledge gained along the way.

The trick is to not get caught up on the beta. It's all too tempting to chase perfection too early.

