> Being a bit careful, taking it slow, maybe, why not?
Market forces:
* Google is about to be steamrolled by Bing, and vice-versa if Bard gets ahead of Sydney.
* Programming companies who don't have code written by GPT will fall behind ones who do.
* Websites that pay actual humans to do writing, rather than using AIs that maximize ad clicks, will have lower ad revenue and higher costs.
... and so on.
Those forces get even stronger in wartime. Militaries that don't have AI-controlled robots will fall behind ones that do, once those supersede human strategists.
Politicians too -- winning elections means dominating online forums, and AIs can be really good at that.
"Moving slowly" would require a whole new system of organizing humanity.
I bet every single human who has someone in their life they love would be happy to make some changes to ensure the safety of said people.
I know I would.
What is the alternative? Endless wars? Arms races? Monitoring robots so they don't get out of control? Our eventual destruction? How sad.
I look at children skipping around happily in the sunshine, or a flower blooming, and I realize that is what life is about. It's not about war, AIs, or bioweapons -- that's all a product of misguided intellect, which stops us from experiencing what's really important: simple, innocent experience, love, and friendship.
> I bet every single human who has someone in their life they love would be happy to make some changes to ensure the safety of said people.
I'll take that bet. I'm personally quite happy to make those changes, but we just went through a global pandemic, and asking people to wear masks was, to many, an abrogation of their rights. And now you want to get people to "make some changes", give up some convenience, get them out of their cars and onto buses because we're choking the planet with greenhouse gases? And you think there's even a chance people are going to listen to you when they haven't in the past six decades?
I'm sorry, maybe I had a different experience of people and of Covid than you did, but I don't see that happening.
No, but we can step back and have some healthy discussion about what is actually important to us as a species and go in that direction -- at least the majority of us need to.
Where do you think wars with Russia, China, etc. are going to lead us? They're going to lead us to death.
Maybe you're right that it's impossible to imagine human thinking evolving into something more intelligent, and that from here on out it's war, being angry at each other via social media algorithms, and eventually the paperclip optimizer or a biological or nuclear accident. I don't believe this has to be the case, though.
We think we're making intelligent systems based on our current line of thinking? That honestly scares me the most.
I guarantee if social media algorithms were optimized to spread messages of peace and understanding, the world in 2023 would be a much less scary place. It could be that simple.