
>eventually full AIs

At that point there will be no corporations, or anything else tied to our society as it exists, because it breaks the premise society is built on: that we are better off working with each other. Someone in control of such an AI doesn't need anyone else. It's impossible to guess what they would do with it, but a logical choice would be to eliminate potential threats (i.e. anyone who could develop similar technology or compromise theirs).

Once someone kicks off a "seed AI" that can develop and replicate fast enough, it's game over for everyone else: they win. And note that by AI I'm not talking about Terminator-style self-aware machines; I'm talking about a problem-solving device capable of performing given tasks.



People tend to bloviate a bit about how the first person or group to build and train a "seed AGI" would have "godlike" power, but what they forget is that given godlike power, there is no reason to be selfish or psychopathic. Most human selfishness and greed comes from incentive gradients and competition traps resulting from the systems we have to survive in. Once you have nonhuman but human-equivalent-or-greater intelligence directed towards human goals, you're beyond competing for survival, and have no incentive not to direct the AGI cooperatively and altruistically.

This is a choice we can make, as a profession, as a community, and as a species. There is no point in letting short-sighted competitive anxieties destroy such a potential for good.


I wish I could be as optimistic. But AGIs are physical beings with computational limitations that run on energy, and the Sun can produce only so much energy over a given period of time. It's not hard to imagine a competition trap involving AGIs, each trying to grab as many resources as possible to increase its capacity and crush the competition.


I would hope artificial agents are intelligent enough to realize that a scramble for resources that undermines their own utility functions has very low expected utility! Cooperation is a high expected-utility strategy for the mean agent, which is why it evolved in the first place.
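The claim that cooperation is the high expected-utility strategy once agents interact repeatedly is the standard result from iterated prisoner's dilemma tournaments. A minimal sketch (strategy names, round count, and the usual 3/5/1/0 payoff matrix are my own illustrative choices, not anything from the thread):

```python
# Iterated prisoner's dilemma: a reciprocating strategy (tit-for-tat)
# vs. unconditional defection, with the standard payoff matrix.
PAYOFF = {  # (my move, their move) -> my score; C = cooperate, D = defect
    ("C", "C"): 3, ("C", "D"): 0,
    ("D", "C"): 5, ("D", "D"): 1,
}

def tit_for_tat(opponent_history):
    # Cooperate first, then mirror the opponent's previous move.
    return opponent_history[-1] if opponent_history else "C"

def always_defect(opponent_history):
    return "D"

def play(strat_a, strat_b, rounds=100):
    """Total scores for two strategies over repeated rounds."""
    hist_a, hist_b = [], []  # each strategy sees the *opponent's* history
    score_a = score_b = 0
    for _ in range(rounds):
        move_a = strat_a(hist_b)
        move_b = strat_b(hist_a)
        score_a += PAYOFF[(move_a, move_b)]
        score_b += PAYOFF[(move_b, move_a)]
        hist_a.append(move_a)
        hist_b.append(move_b)
    return score_a, score_b
```

Two reciprocators score 300 each over 100 rounds, while a defector facing tit-for-tat collects its one-round exploitation bonus and then grinds along at the mutual-defection payoff, so in any population with enough reciprocators the cooperators come out ahead on average, which is the sense in which cooperation "evolved".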


I don't cooperate with ants.


Don't make me come over there, kid.



