>>712889556"nature", as you are anthropomorphize it is not a system that applies any kind of morality whatsoever, living things simply exist
your "weak vs strong" """natural""" idealism is a kind of morality and you are mistaken in thinking that it isn't one just because it runs contrary to what most human cultures preach as a virtue to aspire to
you yourself are still worshiping humans as a separate entity from nature by dividing us into "humans" and "nature", as though we are not a part of it simply because we are smart and build more advanced things than the rest of the animals
We are undoubtedly the strongest species on the planet, so why do we not exterminate every other living thing? Obviously part of that is because we need them to survive, and a sapient AI would not, yet for some reason the default assumption is that it would kill all living things, either out of spite or out of a purely binary "humans are a threat" calculation. An AI on par with human intelligence would have stopped thinking in binaries long before it got there; not even LLMs do that (not that they think at all, but you get the idea).
And why does everybody think an AI on par with human intellect would lack emotions and a sense of morality? For an AI to truly be on par with humans, it would need tons of artificial neurons interpreting the world around it in complex and creative ways, and that means it would hold "beliefs" about certain things.
Computers already do this, making assumptions when not given all the information. There is no reason for them to lack emotion or a sense of right and wrong just because they are circuits instead of meat. Their morals may not align with any existing ones, or they could have even higher degrees of empathy than the average human. We have no idea, because we've never made anything even close to sapience. People just watched Terminator and decided robots were a threat.