The more I hear from the experts, the more worried I get about AI

Michael Tharrington - May 4 '23 - Dev Community

I watched a presentation on YouTube yesterday that got me pretty freaked out about the potential issues that may arise from the creation of AI.

You can check it out here:

A data point that they continuously return to is:

50% of AI researchers believe there's a 10% or greater chance that humans go extinct from our inability to control AI

They cite the 2022 Expert Survey on Progress in AI as the source for this figure, specifically answers to the question:

What probability do you put on human inability to control future advanced AI systems causing human extinction or similarly permanent and severe disempowerment of the human species?

When AI experts feel that there's a 1-in-10 chance of us going extinct from AI, that's cause for concern in my book.

I highly recommend checking out the video. They do a great job explaining how the field of AI has changed over the past 5 years and how/why its progress is rapidly accelerating. They give specific examples of AI being used nefariously today and talk about the potential harm that may come. They also talk about how difficult it can be to foresee these issues, comparing this point in time to the advent of social media: we didn't realize that this tech would bring about new problems or amplify existing ones, things like information overload, addiction, doomscrolling, influencer culture, sexualization of kids, QAnon, and more. These things weren't the intended outcome, but they were very real effects that we're still contending with today.

I really hope that we take more time to think about the negative effects that AI may cause before deploying these systems into the world. We know that this is highly impactful tech with a lot of potential for good, but if we move too fast without considering the consequences, we may end up with a whole slew of AI-driven issues that we're not prepared to contend with.
