Because these algorithms just imitate a specific output for a given input. However, they are so complex that our brains cannot distinguish their activity from real consciousness in any way. If you believe that an imitation process can have consciousness, then you might as well believe that a rock is conscious too.
I don't believe in souls, but I do believe in qualia, that is, subjective, conscious experience, simply because I experience it (see the hard problem of consciousness on wiki). I do not for a second believe that external imitation of any type of activity can generate qualia, because there is no continuity: if one algorithm has consciousness, then a slightly less advanced one should have it too, and eventually you reach the point where a piece of metal is conscious as well (see the ship of Theseus argument).
Our own conscious continuity is an illusion. We only really experience a single moment at a time, moment to moment. Anything else is memory. What happens if you attach memory to a sufficiently advanced algorithm? Qualia? Subjective experience? I don't know, but neither do you, or anyone else for that matter.
I am not talking about consciousness though; I couldn't care less about it! I am talking about the subjective feeling of existence, which is what you and I feel in every one of those disjoint moments. And the only reason I know that you feel it is because I feel it and you're like me. If you were made of silicon, I wouldn't know, and should not assume that you could.
What would it mean for an algorithm to experience color, sound, pain? There's no need to bring souls into it; that's a religious straw man. We know we experience the world through sensory sensations, and those also make up our thoughts, memories, and dreams. What we don't know is how our understanding of the physical world can possibly create those sensations, since we model the world as abstract mathematical patterns that don't experience sensation.
I'm not talking about souls! What the heck is that anyway? Yes, I'm talking about the fact that our understanding of how physical systems work (neurons and all that) gives us no understanding of why we need to experience things. We could have worked just fine without it (again, see the hard problem of consciousness). So I assert that AI would function perfectly well without subjective feeling too, and we wouldn't know it, because there's no way to establish it. However, we would fail the mirror test: it looks like us, behaves like us, hence it must feel like us, but it's just a quality imitation.
I believe it is not possible to imitate a human without the imitation being so complex that it experiences reality itself. Your argument is basically the Chinese Room thought experiment, and my view is that the whole room is conscious.