Hacker News | past | comments | ask | show | jobs | submit | pizza234's comments

> Nobody is objecting to the loss of bad jobs. The jobs themselves are not the problem.

Very strong disagree; a lot of people are objecting. A job on an assembly line may be "bad" for somebody, but for somebody else it can be a lifeline, if they aren't able to find another job soon enough and/or on reasonable terms. Long-term, the job market can rebalance (and if unemployed people are supported in their education, that's great), but short-term displacement is a serious issue.


If your job is so tedious a robot could do it, it's a bad job. Do you think Sam Altman wastes a single minute on operations and the actual minutiae of running a business? Fuck no, he gets wageslaves like me and you to do it.

Every year, fewer and fewer people are capable of doing jobs that robots cannot do. That's sort of the whole conundrum here.

"Robots" broadly defined are getting more capable and more intelligent at a significantly faster rate than humans are.

This obviously produces incredible economic surplus, but 1) that surplus is naturally captured by the owners of those robots and not the people they replaced, and 2) it isn't clear that all the negative consequences of mass obsolescence are solvable by economic surplus even in theory.


Search "Humans are becoming horses" by CGP Grey. He's making the exact same point as you, except his version is 15 years old and still hasn't come to pass.

I ask you to follow your premise to its conclusion... who's paying for these robots, and who buys the stuff the robots make? Other robots?? In this world where robot serves robot, where exactly did we disappear to?


If you want to see what productivity improvements alone (with no social innovations) naturally do, you can go read about the Gilded Age. Productivity improvements are necessary but not sufficient to enhance human wellbeing. By themselves, they appear to suppress quality of life for those below the productivity and/or capital ownership bar while increasing it for those above.

Yes, an economy is perfectly capable of orienting itself around satisfying the wants of the few people who have a lot of capital at the expense of the many who have little capital. Why wouldn't this be possible?

It obviously creates systemic risk in the economy, which is one of many reasons it should be mitigated by policy and taxation, but I'm not sure why you're acting like it's some mathematical impossibility.

Not sure anyone said anything about humans "disappearing," just driven to extreme economic hardship despite ample overall productivity, which again we have literally hundreds of real world examples of throughout history.


... You realize you just made exactly the same point I did, right? I know you have two eyes and 10 fingers but give those appendages a rest and reread

Careful system administration and web browsing were relatively safe; nowadays, even upgrading the local libraries carries risk that must be assessed.

It has always been that way. Literally the only distro that encourages an update process with the requisite effort is Slackware. You should be reading the source code you build. You should be building from source. You should fully understand your toolchains. Binary-only distros have always been the equivalent of wearing a condom to have sex: usually fine, but technically outsourcing the hard work to someone that, let's be real, 90% of users never get to know well enough to credibly trust to any degree. NPM and language-level package management just doubled down on the real estate you have to sift through.

Being a responsible programmer/sysadmin has always been read-heavy, as long as I've been alive. Write-only code is antithetical to the basis of running a trustworthy system.


> Because I would have to reboot into windows including any active applications I have?

In a gaming-only setup, Windows requires virtually no maintenance. Plus gaming itself is a monotasking activity.

I actually see having to reboot as a positive: I start a gaming session, I only play, and when I'm done, I'm done. I get the appeal of everything-in-Linux (it used to be my setup), but it's also a hassle.


> In a gaming-only setup, Windows requires virtually no maintenance.

This is not remotely true anymore, with Windows updates automatically restarting computers, Windows updates pushing breaking changes (especially with regard to GPU drivers), and more anti-cheats requiring Secure Boot.


These points are not (all) technically correct; for example, Windows does not restart "automatically" - it gives multiple options (this is important for dual booting).

Besides that, the root discussion is dual-booting vs. virtualizing Windows; maintenance applies equally to both - it doesn't disappear when virtualizing Windows. The difference is the value one places on context switching.


I used VFIO in the past, and it's not true that setups like VFIO or custom kernel/virtualization "just" work. For starters, custom setups need management. There are even latest-generation GPUs whose drivers are not fully VFIO compatible.

VFIO had a host of problems that are rarely mentioned, because VFIO supposedly "just" works: power management, card drivers, compatibility, audio passthrough (or maybe not), USB passthrough (or maybe not), stuttering, and so on.


VFIO is in a significantly better place than it was 10 years ago, though. Proper IOMMU groups are more common on motherboards, flashing GPU BIOSes is less necessary, etc., and most importantly the community is bigger and older, so there is more knowledge about parts compatibility and VFIO setup.

That said it’s almost entirely unnecessary with the state of Linux gaming now.


Better or not, even the latest generation AMD GPUs don't automatically guarantee a very stable VFIO, which makes the technology still immature.

Sure, in 10 years we’ve gone from bleeding edge, with server/workstation motherboards being necessary, to immature, with having to do a little homework on which consumer hardware to buy. It’s not like VFIO is something for the general public anyway.

I've been using TB for a decade and I too can't find anything better (even if my use case is very simple).

However, I find TB's development very misguided - it's evident to me that they give very little priority to stability:

- add-on support (APIs) is a dumpster fire, and IMO a large add-on ecosystem is what makes a client unique

- not so long ago, they added an instant messaging client, which has been a waste of dev resources

- at some point they overhauled the UI, but the result was a bloated slow mess (on some platforms), even with broken defaults

- there are bugs open for at least a decade (I consistently hit one)

It gives me the impression that the management prioritizes work that looks good on a screenshot, rather than stability.

I think it'd be positive if the Thunderbird org shut down. There are more pragmatic teams who could take over the project (see Betterbird).


There are several legal issues, including:

- You created profiles for students without consent

- You enabled anonymous posts about identifiable individuals

- There was no effective moderation/control system

Core issue: once you're aware of harmful content, you’re expected to act. If you don't address it in a reasonable time, you can become legally liable.


There are places where the car is simply the means of transport - to the point where using the car is preferred to literally a five-minute walk.

In contexts like this, using a car is perceived as a right - restricting usage doesn't make people think "I'll take the chance to use the bike", but rather "How the f*ck do I get there now?".


The trouble is that the backlash occurs even in places that are pedestrian and transit dominated.


> Can someone knowledgeable comment on this? It seems extreme to say there's no safe level.

Not a direct answer, but the article reports how far the measured concentrations exceeded the proposed limit:

> Maximum concentrations reached 351 mg/kg, dramatically exceeding the 10 mg/kg limit originally proposed by the European Chemicals Agency (ECHA).


For things you put on your skin that could be absorbed, shouldn't the limit take into account the area touching your skin? If I lay on a bed containing 10 mg/kg BPA, I would absorb a lot more than if I touch a headphone. So maybe it should be mg/kg*m^2 or something?
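A toy model (my own illustration, with made-up area figures, not from any regulation) of why a pure mg/kg limit ignores contact area:

```python
# Hypothetical sketch: absorbed exposure scales with both the BPA
# concentration of the material and the skin area in contact with it.
# The numbers below are illustrative guesses, not measured values.

def relative_exposure(conc_mg_per_kg: float, contact_area_m2: float) -> float:
    """Relative exposure score in mg/kg * m^2; only ratios are meaningful."""
    return conc_mg_per_kg * contact_area_m2

# Same 10 mg/kg limit, very different contact areas:
headphone_pad = relative_exposure(10, 0.005)  # ~50 cm^2 touching the ear
mattress = relative_exposure(10, 1.5)         # large fraction of body surface

print(round(mattress / headphone_pad))  # same limit, ~300x the exposure
```

Under this (simplistic) model, a bed at the limit would expose you to roughly 300 times more than a headphone pad at the same limit, which is the intuition behind an mg/kg*m^2-style unit.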


>> In other words up to 10% of all the crashes Firefox users see are not software bugs, they're caused by hardware defects!

> Bold claim. From my gut feeling this must be incorrect; I don't seem to get the same amount of crashes using chromium-based browsers such as thorium.

That's a misinterpretation. The finding refers to the composition of crashes, not the overall crash rate (which is not reported by the post). Taken to the extreme, there could have been 10 (reported) crashes in the history of Firefox, 1 of them due to faulty hardware, and the statement would still be correct.
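The composition-vs-rate distinction can be made concrete with toy numbers (mine, not from the post):

```python
# Two hypothetical browsers with the same *share* of hardware-caused
# crashes but wildly different absolute crash rates. The share alone
# tells you nothing about how often crashes happen.

def hardware_share(hardware_crashes: int, total_crashes: int) -> float:
    """Fraction of all reported crashes attributed to hardware."""
    return hardware_crashes / total_crashes

print(hardware_share(1, 10))              # 0.1 -- 10 crashes ever
print(hardware_share(100_000, 1_000_000)) # 0.1 -- a million crashes
```

Both report "10% hardware-caused", so comparing your personal crash frequency on Chromium against Firefox's composition statistic compares two different quantities.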


> There was a level of creativity back in those days that we seem to not have as much nowadays. Now things seem to be based more on math and things like signing.

Copy protections nowadays are actually extremely complex - just look at Denuvo and VMProtect. I presume there are fewer copy-protection schemes today because producing a resilient one is too complex for small developer teams.

