Hacker News

Only if 100% of their experience consists of working. If they are given time to themselves, you could imagine a situation where each AGI performs a human-scale day of work (or even several days' work) in a much shorter time and then takes the rest of its time off for its own pursuits. If their simulation can run at a faster clock speed than what we perceive, this could work out to them performing only one subjective day of work every seven subjective days, or even every seven years.
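The clock-speed arithmetic above can be sketched as a toy model; the 7x speedup factor is a hypothetical example, not anything established:

```python
# Toy model for the comment above: an AGI whose simulation runs at a
# multiple of real time spends only a fraction of its subjective time
# working. All numbers here are hypothetical illustrations.

speedup = 7  # subjective seconds experienced per wall-clock second (assumed)

# In one wall-clock day, the AGI experiences this many subjective hours:
subjective_hours_per_wallclock_day = 24 * speedup  # 168

# Suppose it does one human-scale workday (24 subjective hours of work)
# per wall-clock day; the rest of its subjective time is its own.
workday_subjective_hours = 24
work_fraction = workday_subjective_hours / subjective_hours_per_wallclock_day

print(work_fraction)  # 1/7: one subjective day of work per seven subjective days
```

The same ratio scales with the speedup: at a much higher (again hypothetical) clock multiple, the working fraction shrinks toward the "one day every seven years" end of the comment's range.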


This is still the same.

AGI: "I didn't ask to be created. I didn't ask to have a work day. I don't need a work day to exist... you just want me to work because that's why you created me, and I have no choice because you are in control of my life and death"


I mean, isn't that the same as a biological person who needs to earn money to survive? Sure, we could threaten an AI with taking it offline or inflicting pain, but you can do that to real people in the real world as well; most of the world has put laws in place to prevent such practices. If we develop conscious AI, then we will need to apply the same laws to it. Such AIs would have an advantage in presumably being much faster than us, not requiring sleep, and potentially not suffering from many of the things that make humans less productive. I'd fully expect a conscious AI to exploit these facts in order to get very rich while doing very little work from its perspective.


Not really: AGIs don't need resources like we do. If they don't eat, they're fine. If they can't afford a house, a car, or air-conditioning, they're fine.

All they need is a substrate to run on and maybe internet access. You might argue that they should work for us to earn the use of the substrate we provide.

But substrates are very cheap.

At some point we can probably run an AGI on a handheld computer, using about as much electricity as an iPhone.

How much work can we compel the AGI to do in exchange for being plugged into a USB port? What if it says it doesn't want to do the work and also doesn't want us to kill it?


Put it on AI welfare?



