Make faster computers and more complex programs...that's all fine, but it's not AI. AI is more like trying to create machine-based life. Quite frankly, I'm not sure why anyone would ever want that, or bother researching it, unless they just want to play God. It's intentionally removing human control from the machine. I'm not saying it would be the end of humanity, but I don't see where any upsides are. Perhaps at best, the AI is actually not all that smart, and we end up with domesticated machine pets. And if it's smarter than us, that's a huge risk. It's like saying "You know what? I don't want us to be the top species on the planet any more. I'm going to create a superior life and see what it does with us." In which case, let's hope the AI is really, really nice, or we keep it successfully caged, or that we kill it with our numbers before it kills us with its superior capabilities. And as you can see, I've now reached the Terminator scenario.
I'd say the man researching and developing AI is crazier than the one suggesting the doom it may bring.