Hacker News | bayindirh's comments

As a (sane) audiophile, I happily use Apple devices for enjoyable listening. Their headphones have amazing clarity and soundstage for their size. If you keep in mind that AirPods are calibrated to your ears with your iPhone's FaceID camera, they provide nice, tailored sound.

I also have nice, but not over-the-top equipment. Yes, some of it sounds nicer and more detailed (you can't compare large, 100 W/channel bookshelf speakers with headphones, can you?), but getting 95% of what they provide without any effort is pretty worth it.

Last but not least, Apple used Wolfson DACs in its iPods for most of their lifetime. Their replacement DACs are no worse than the Wolfsons, and are probably even better.


> If you keep in mind that AirPods are calibrated to your ears with your iPhone's FaceID camera, they provide nice, tailored sound.

That's only for spatial audio.


That’s what Apple states, yes, but I suspect it’s also used for calibrating the inner microphones of newer AirPods, which are used for the “live EQ” that works by listening to the feedback inside the ear.

In my experience, Apple can sometimes “forget” to mention things.


>> As a (sane) audiophile

This is something you believe about yourself, but an oxymoron for everyone else.


I love this “oxymoron” label being slapped on me by someone who doesn’t know what audiophile actually means.

Its meaning has been distorted as much as the word hacker has.

Yes, I love listening to music and quality audio, but I don’t have a reference soundtrack to benchmark systems with. My bar is simple: do I enjoy what I hear? It doesn’t have to fit into a recipe. It should be enjoyable, period.

A pair of Apple AirPods can be as enjoyable as two $10K speakers powered by a separate stack costing $20K. It’s akin to loving that hole-in-the-wall restaurant as well as that Michelin-starred one. Both are enjoyable in their own way.

Well, I have used the same amp, turntable and tuner for the last 30 years, and the same CD player and speakers for the last 10 years.

I changed the speakers since I had no space for the older Akai set, and replaced the CD player since the older one was acting up.

I replaced the Logitech Bluetooth receiver with a FiiO DAC last week since I found one for a bargain.

Everything is connected with high-quality yet 30-year-old cables.

I believe that’s a pretty sane evolution for someone who grew up with music, and has performed some.


Do you honestly not believe that there is an audible difference between audio equipment? I think that makes you the insane one.

Apple knows its cookies when it comes to sound. I can say that as someone who uses proper HiFi systems and has played in orchestras.

As said, different markets. If you look at it from the same perspective, the last iPhone I ordered is 3x the price of a last-generation MacBook Air.

$549 is pretty reasonable if the headphones have the sound detail they're advertised to have. Given how the AirPods Gen 3 sound, I'm sure that thing sounds pretty amazing.


No, the company wants its problems solved and its needs addressed.

When things are put into production as soon as possible with no regard for quality, we see the results all the time:

Bloat, performance problems, angry customers, Windows 11...

You get the idea.


The reason your login takes 45 seconds and your database locks up with 10 concurrent users isn’t that developers failed to write good code following the correct GoF pattern.

If companies cared about bloat and performance, you wouldn’t see web apps with dozens of dependencies, cross-platform mobile apps, and Electron apps.


> Your company isn’t paying you to solve puzzles. If you aren’t putting things into production, what good are you as an employee?

No, the company is paying its employees exactly to solve puzzles, which the company labels as problems or requirements.

And when an employee focuses on solving puzzles and enjoys it, the code naturally ends up in production and gets forgotten, because the puzzle was solved well.


And if they could solve the problem faster with AI?

But there is no “puzzle” in solving most enterprise problems as far as code goes - just grind.

And code doesn’t magically go from dev to production without a lot of work and coordination in between.


> And if they could solve the problem faster with AI

It's such a shame that everyone only cares about "faster" and not "better".

What a shameful mentality. Absolutely zero respect for quality or craftsmanship, only speed.


I care about exchanging labor for money to support my addiction to food and shelter.

My employer just like any other employer cares about keeping up with the competition and maximizing profit.

Customers don’t care about the “craftsmanship” of your code - aside from maybe the UI. But if you are a B2B company where the user is not the customer, they probably don’t even care about that.

I bet you most developers here are using the same set of Electron apps.


Yes, what you are describing is true, and it also highlights how bankrupt we are as a society.

Just because things are this way doesn't mean they should be, or that we should accept that they must always be this way.


Now I work in consulting (AWS + app dev) as a staff consultant, and unless you work in the internal consulting department at AWS (been there, done that as a blue-badge, RSU-earning employee) or GCP, it’s almost impossible to get a job as an American as anything but sales or a lead. It’s a race to the bottom, with everyone hiring in LatAm if you are lucky (same time zone, more willing to push back against bad ideas, more able to handle ambiguity) or in India.

Everything is a race to the bottom. The only way I can justify not being in presales is that I can now do the work of 3 people with AI.


A Russian word for this is "пофигизм" -- the cynical belief that everything is fucked, so why bother.

There still is. In most enterprises, the tasks are usually to take some data from somewhere and transform it into the input of another process, or to make a tweak to an existing one. Most other types of problems (information storage, communication, accounting, ...) have been solved already (and were solved before the digital world).

People can see it as grind, but the pleasure comes from solving the meta-problem instead of the one in front of you (the latter always creates brittle systems). I agree, though, that it can become hell if no care went into building the current systems.


And they are tasks with standardized best practices. I knew that I wanted to write an internal web app that allowed users to upload a file to S3 using Lambda and store the CSV rows in Postgres.

I just told it to do it.

It got the “create an S3 pre-signed URL to upload to” part right. But then it chose the naive implementation (download the file and do a bulk upsert) instead of using the AWS extension for Postgres to load it from S3. Once I told it to do that, it knew what to do.
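For illustration, here is a minimal sketch (in Python, with made-up table, column, and bucket names) of the S3-import approach described above, assuming the aws_s3 extension that RDS/Aurora Postgres provides: the database pulls the CSV from S3 itself, rather than the app downloading the file and inserting rows.

```python
# Hedged sketch: build the aws_s3.table_import_from_s3 statement that
# makes Postgres (with the RDS/Aurora aws_s3 extension) fetch the CSV
# from S3 itself. All names below are hypothetical.

def table_import_sql(table: str, columns: str, bucket: str, key: str, region: str) -> str:
    """Build the SQL call for importing a CSV object from S3 into a table."""
    return (
        "SELECT aws_s3.table_import_from_s3("
        f"'{table}', '{columns}', '(format csv, header true)', "
        f"'{bucket}', '{key}', '{region}');"
    )

sql = table_import_sql(
    table="uploads",                # hypothetical table
    columns="id,name,amount",
    bucket="example-bucket",        # hypothetical bucket
    key="incoming/rows.csv",        # hypothetical object key
    region="us-east-1",
)
print(sql)
```

The app would run this statement over a normal Postgres connection after the pre-signed-URL upload completes, avoiding the download-and-insert round trip entirely.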

But still, I cared about systems and architecture, not whether it decided to use a for loop or a while loop.

Knowing that, or knowing how best to upload files to Redshift, or other data-engineering practices, isn’t new or novel.


They aren’t. But there are a lot of mistakes that can happen, and until an AI workflow is proven not to make them, it’s best to monitor it, and then the speed increase is moot. Humans can make the same kinds of mistakes, but they are incentivized not to (losing their reputation and their jobs). AI tools don’t have that lever against them.

And so are mid-level developers. A mid-level developer who didn’t have 8 years of experience with AWS would have made the same mistake without my advice.

I would have been just as responsible for their inefficient implementation in front of the customer (consulting) as I would be with Claude, and it would have been on me to guide the mid-level developer just like it was on me to guide Claude.

The mid-level developer would never have been called out for it in front of my customer (consulting) or in front of my CTO when I was working for a product company. Either way, I was the responsible individual.


This is what GitHub promised years ago: showing repositories where similar code is present, so you can guess the license and use appropriate outputs.

I’m not sure whether this is implemented or not since I don’t use generative AI for coding.


You not only relinquish your voice, but everything standing behind that voice. Thoughts, opinions, perspective, capacity to think, everything.

I'll kindly disagree. Even I, someone who doesn't use any "Chat" tools from the big three, can feel when something is AI-generated. We're slowly being educated into detecting it. This is why the human brain is awesome.

Every model, every computer generation has a subtle signature, and we (as in humans) can understand it.

Moreover, this is a very human-enforced place. Many of us already don't like being answered by a bot here, so the community is also a deterrent. Plus, having an official guideline will multiply that deterrent.

Not everything is lost. Have some faith in your fellow humans.


This is quite an interesting question, because I believe there are two facets to it.

Given that you're interacting with a competent hacker (i.e., a person who is into tech not for money but for tinkering), you can't impress them. You can pique their interest, and they may praise you, but if they are informed enough, anything that looks like magic can be dissected easily. So technical excellence is meaningless.

Given that you're interacting with a competent hacker, again, everything technical will be subjective. Creating is deciding trade-offs all the way down and beyond. Their preferences will probably lie at a different balance of trade-offs. Even if you achieve "objective" perfection, even that perfection has nuances (see USB audio interfaces: they all have flat response curves, but they all sound different), hence technical excellence is not only meaningless, it's subjective.

On a deeper level, a genuine person who knows their cookies well, even with gaps, is a much more interesting and nicer person to interact with. They'll be genuinely interested in talking with you and learning something from you, or showing what they know gently, so both parties can grow together. They might not be knowledgeable in the most intricate details, but they are genuinely human, open to improvement, and into the conversation itself, not there to prove themselves and win a meaningless battle to stroke their own ego.

An LLM-generated response is the opposite. It's lazy, it's impersonal, it's like low-quality canned food. A new user recently wrote an LLM-generated rebuttal to one of my comments. It's white-labeled gibberish, an insincere word skirmish. It's so off-putting that I don't see the point of replying to them. They'll just paste my reply into a nondescript box and add “write a rebuttal reply, press this point”. This is not a discussion, it's a meaningless fight for internet points.

I prefer genuine opinions, imperfect replies, vulnerable humans at the other end of the wire, not a box of numbers spitting out grammatically correct yet empty sentences.


> Given that you're interacting with a competent hacker (i.e., a person who is into tech not for money but for tinkering), you can't impress them.

I disagree with this; I would instead argue that a technical expert (in any field) being impressed with your work can be the most satisfying reward of craft.

Laypeople can be awed, but the expert can bestow an entirely different quality of respect on your work.


I agree with you that some people find this very rewarding, but this is not a given.

I, for one, don't care whether anyone is impressed by my work. That's a nice bonus, but not a requirement. Instead, when I improve on my own previous work, the satisfaction I get is far greater than any external validation. I seek my satisfaction inside myself.

That's completely true that I love discussing what I did with a competent technical expert, yet it's not why I'm doing this.


> I seek my satisfaction inside myself.

> That's completely true that I love discussing what I did with a competent technical expert, yet it's not why I'm doing this.

I agree with this sentiment completely. I do consider "the reason for craft" (which is a joy in itself) to be separate from the "bonus reward" of being able to discuss it with other craftsmen.

... and the latter often ends up surfacing even more challenging/interesting ideas to work on for both sides, which is a huge win.


Seriously asking: where does Go sit in this categorization?

Nowhere, or wherever C# would sit. Go is a high-level managed language.

Go is modern Java, at least based on the main area of usage: server infrastructure and backend services.

Tbh Go is also really nice for various local tools where you don’t want something as complex as C++ but also don’t want to depend on the full C# runtime (or large bundles when self-contained), or the same with Java.

With Wails it’s also a low-friction way to build desktop software (using the heretical web tech that people often reach for, even for this use case), though there are a few GUI frameworks as well.

Either way, self-contained executables that are easy to make, a rich standard library, and a language that's not too hard to use go a long way during development!


Go is modern/faster Python.

- It was explicitly intended to "feel dynamically-typed"

- Tries to live by the zen of Python (more than Python itself!)

- Was built during the time it was fashionable to use Python for the kinds of systems it was designed for, with Google thinking at the time that they would benefit from moving their C++ systems to that model if they could avoid incurring the performance problems associated with Python. Guido van Rossum was also employed at Google during this time. They were invested in that sort of direction.

- Often reads just like Python (when one hasn't gone deep down the rabbit hole of all the crazy Python features)


I wonder what makes Go more modern than Java, in terms of features.

The tooling and dependency management, probably.

I still don't understand how they managed to make a build system as bad as Gradle. It's like they tried to make it as horrible as possible to use.

Yes, every time I fire up an old Android project it needs to download 500 MB just for Gradle upgrades. It's nuts.

It's also a modern C.

If you enjoy C and wish it was less verbose and more modern, try Go.


Go has a garbage collector though. This makes it unsuitable for many use cases where you could have used C or C++ in the past. Rust and Zig don't have a GC, so they are able to fill this role.

GC is a showstopper for my day job (hard realtime industrial machine control/robotics), but would also be unwanted for other use cases where worst case latency is important, such as realtime audio/video processing, games (where you don't want stutter, remember Minecraft in Java?), servers where tail latency matters a lot, etc.
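The concern above is language-agnostic. As a hedged illustration (in Python rather than Go, purely because it keeps the pattern short; Go's closest knob is `debug.SetGCPercent`), one common mitigation in GC'd languages is to disable automatic collection around a deadline-sensitive section and collect at a moment you choose. Hard-realtime systems avoid a GC altogether because even this doesn't bound the worst case.

```python
# Minimal sketch: keep collector pauses out of a latency-critical
# section by disabling automatic collection for its duration, then
# paying the collection cost at a deadline-safe point afterwards.
import gc

def run_critical_section(work):
    was_enabled = gc.isenabled()
    gc.disable()               # no automatic collection pauses inside
    try:
        return work()
    finally:
        if was_enabled:
            gc.enable()
        gc.collect()           # collect at a moment we choose

result = run_critical_section(lambda: sum(range(1_000)))
print(result)  # 499500
```

This only moves the pause, of course; allocation inside the section still builds up garbage that must be collected eventually, which is exactly why truly hard-realtime code avoids heap allocation in the hot path.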


> GC is a showstopper for my day job (hard realtime industrial machine control/robotics)

Which is a very niche use case to begin with, isn't it? It doesn't really contradict what the parent comment stated about Go feeling like modern C (with a Boehm GC included, if you will). We're using it this way and it feels just fine. I'd be happy to see parts of our C codebase rewritten in Go, but since that code is security-sensitive and has already been through a number of security reviews, there's little motivation to do so.


> Which is a very niche use case to begin with, isn't it?

My specific use case is yes, but there are a ton of microcontrollers running realtime tasks all around us: brakes in cars, washing machine controllers, PID loops to regulate fans in your computer, ...

Embedded systems in general are far more common than "normal" computers, and many of them have varying levels of realtime requirements. Don't believe me? Every classical computer or phone will contain multiple microcontrollers, such as an SSD controller, a fan controller, wifi module, cellular baseband processor, ethernet NIC, etc. Depending on the exact specs of your device of course. Each SOC, CPU or GPU will contain multiple hidden helper cores that effectively run as embedded systems (Intel ME, AMD PSP, thermal management, and more). Add to that all the appliances, cars, toys, IOT things, smartcards, etc all around us.

No, I don't think it is niche. Fewer people may work on these, but they run in far more places.


See TamaGo, which is used to write firmware in Go and is shipped in production.

Not familiar with it, but reading the GitHub page it isn't clear how it deals with GC. Do you happen to know?

Some embedded use cases would be fine with a GC (MicroPython is also a thing, after all). Some want deterministic deallocation. Some want no dynamic allocator at all. From what I have seen, far more products are in the latter two categories, while many hobby projects fall into the first two. That is of course a broad generalization, but there is some truth to it.

Many products want to avoid allocation entirely, either because of realtime properties, or because they are cost-sensitive and it is worth spending a little extra dev effort to save a euro or two and use a cheaper microcontroller where the allocator overhead won't fit (either the code in flash, or just the bookkeeping in RAM).
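As a sketch of the "no allocation in the hot path" pattern described above (in Python purely for illustration; real firmware would preallocate statically in C or similar), a fixed set of buffers is created up front and reused, so the steady state never touches the allocator:

```python
# Hypothetical sketch: preallocate a fixed pool of buffers at startup
# and recycle them, so steady-state operation performs no dynamic
# allocation. Pool exhaustion surfaces as an error instead of an
# unbounded allocation.
from collections import deque

class BufferPool:
    def __init__(self, count: int, size: int):
        # All allocation happens once, here, at startup.
        self._free = deque(bytearray(size) for _ in range(count))

    def acquire(self) -> bytearray:
        # Raises IndexError when the pool is exhausted.
        return self._free.popleft()

    def release(self, buf: bytearray) -> None:
        self._free.append(buf)

pool = BufferPool(count=4, size=256)
buf = pool.acquire()
buf[0] = 0xFF          # use the buffer in the hot path
pool.release(buf)      # return it for reuse, no free() either
```

The same idea underlies static buffers in C firmware and object pools in GC'd languages: bounding memory up front is what makes worst-case behavior analyzable.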


Yes. Just like the real-time Java implementations for embedded targets from PTC and Aicas, it is its own implementation with a different GC algorithm. Additionally, there are runtime APIs for regions/arenas.

Here is the commercial product for which it was designed:

https://reversec.com/usb-armory

A presentation from 2024:

https://www.osfc.io/2024/talks/tamago-bare-metal-go-for-arm-...


Not everybody is writing web apps.

You can also see it differently: if the language dictates a 4x increase in memory or CPU usage, you have also set a 4x closer deadline before you need to upgrade the machine or rearchitect your code into a distributed system.

Previously, delivering a system (likely in C++) that consumed 4x fewer resources cost developer time at a much higher factor, especially if you had uptime requirements. With Rust and similar low-overhead languages, the ratio changes drastically: it is much cheaper to deliver high-performance solutions that scale to the full capabilities of the hardware.


Thanks. I write some Go, and feel the same about it. I really enjoy it actually.

Maybe I'll jump to Zig as a side-gig (ha, it rhymes), but I still can't motivate myself to play with Rust. I'm happy with C++ in that regard.

Maybe gccrs will change that, IDK, yet.


Go is a language that sits perfectly in the space where garbage collection is no problem for you.

