>>936266468
>>936266481
I really like your optimism. I would feel more optimistic about this theoretical future if it weren't for human nature and the desire for control. How would these services be funded if they're going against big pharma directly? The mosquitoes are a great example of what we're already moving towards, but wasn't there also something about controlling the weather by manipulating clouds? In that scenario, what if the "good intentions" turn into something extremely disastrous and destructive? What if the bio-engineering is calculated perfectly, but nature takes a swing at it and turns a once docile disease into something that causes organ failure? I feel like, at least in our technological infancy, there are far too many factors to consider before making drastic global changes.
-
>That’s why the values we embed in AI matter more than its intelligence. Build it to protect life, cooperation, and self-awareness, and we create a guardian. Build it to optimize for efficiency, and yeah, we’re the problem.
This statement is absolutely perfect and something I'll take from this thread for sure. We're in an arms race for efficiency, but what about values and morals? Corporate greed has prioritized efficiency over being human.
-
I believe your scenario would work out if we can all work together as a team and put aside our differences. Until then, I will never trust a system that's "for the greater good" without full transparency, which will never happen.
All pessimism aside, I really like your idea and your thoughts; you're very insightful and curious, and I hope you never stop being that. I honestly wish humanity weren't so divisive and we could work more closely together, even when we hold strong opinions that oppose each other. Maybe one day there will be that utopia, but only from a small branch of humanity that has actually set aside all differences and chosen to be better. "Noah's Ark"
Thank you for your time and entertaining my counter-argument. <3