Anonymous
10/20/2025, 2:02:33 AM
No.106943913
>>106942599
Yeah, I haven't. I feel like a lot of programs don't, though. For the ones we use every day, now that most nerdy guys are "terminally online", much of our day is spent waiting for data to be downloaded from the web, or uploaded to it. Micro-optimizations become less important than legibility then, because the hard cap of network speed means no optimization you do is going to matter, unless you cleverly restructure things to get parallel downloads or whatever (which is then logic flow, not exclusive to any language).
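Just to illustrate (a minimal sketch in Python, with made-up placeholder URLs): restructuring the logic to download in parallel is what actually moves the needle, while the raw speed of the language barely registers because every request is mostly just waiting on the network anyway.
[code]
# Minimal sketch, placeholder URLs: the win comes from the parallel logic flow,
# not from the language the loop happens to be written in.
from concurrent.futures import ThreadPoolExecutor
from urllib.request import urlopen

URLS = [f"https://example.com/file{i}.bin" for i in range(8)]  # hypothetical endpoints

def fetch(url: str) -> int:
    # Nearly all of this call's time is spent waiting on the network,
    # not executing our code.
    with urlopen(url, timeout=30) as resp:
        return len(resp.read())

def fetch_serial() -> int:
    return sum(fetch(u) for u in URLS)        # time ~ sum of all download times

def fetch_parallel() -> int:
    with ThreadPoolExecutor(max_workers=8) as pool:
        return sum(pool.map(fetch, URLS))     # time ~ the slowest single download
[/code]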
There are many scenarios where what you are saying will matter, but I imagine that only comes up if you are working for a company doing something we never need in our leisure time. Everything I ever interface with is capped primarily by internet speed. So I'd rather have a program that is incredibly stable (because the code is so high level that developers can IMMEDIATELY understand it and quickly change it to adapt to ever-changing web endpoints, etc.) and completes in 3 hours 30 seconds, than one written in assembly that takes 3 hours 0 seconds.
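Something like this is what I mean by adapting quickly (a minimal sketch with hypothetical endpoints and field names): when the site revamps its API, the fix is a couple of obvious lines, not an archaeology dig through hand-tuned low-level code.
[code]
# Hypothetical example: the endpoint and the response shape are the only things
# that change when the site revamps, and both live in one obvious place.
import json
from urllib.request import urlopen

API_URL = "https://example.com/api/v2/posts"  # was /api/v1/posts before the revamp

def fetch_posts() -> list[dict]:
    with urlopen(API_URL, timeout=30) as resp:
        payload = json.load(resp)
    # v2 nests the list under "data"; v1 returned the list at the top level.
    return payload["data"] if isinstance(payload, dict) else payload
[/code]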
The "good old days" stuff, the entire world was different and people forget this. Software came on a disc and once the software is out, there aren't patches every day, so you could produce a piece of software that basically never needs to change where you can be pretty confident that once you release it, you won't have to touch it again and users may just have to live with small ultra edge case bugs. Rollercoaster Tycoon 2 didn't need to suddenly make big alterations because all of a sudden some website it calls had the bright idea to completely revamp the site and now every single thing you use to scrape and parse and post to it is broken.
So I think that is where computing and software are at now. Legibility, adaptability, iteration speed (as in compile times), maintainability in general: that is now the most important element.