>>715871378
that has nothing to do with java. it's just that the scope of minecraft's simulation, and the fact that most of it can't be parallelized (due to deterministic simulation order), make it really hard to optimize, especially since they keep adding features and new worldgen algorithms that are more expensive to process.
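Rough sketch of what I mean by deterministic order (made-up names, nothing like mojang's actual code): every scheduled update can schedule more updates, so the loop has to drain them one at a time, in the order they were queued. Split that across threads and your redstone contraption behaves differently every run.
```java
// Hypothetical tick loop: updates must run in a fixed, deterministic order
// because the result of one update (e.g. a piston firing) feeds into the next.
import java.util.ArrayDeque;
import java.util.Queue;

class TickLoopSketch {
    record BlockUpdate(int x, int y, int z, String action) {}

    private final Queue<BlockUpdate> scheduled = new ArrayDeque<>();

    void schedule(BlockUpdate u) { scheduled.add(u); }

    void tick() {
        // Drain only what was scheduled before this tick started; applying an
        // update may schedule new ones (e.g. a repeater re-arming) for later.
        int toProcess = scheduled.size();
        for (int i = 0; i < toProcess; i++) {
            apply(scheduled.poll());
        }
    }

    private void apply(BlockUpdate u) {
        // Placeholder for real block logic; order of these calls is what matters.
        System.out.println("applying " + u);
    }
}
```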
In fact, the c++ version of minecraft gets better performance largely because they've gimped a lot of features from the original game to make it run smoother: redstone, the spawning algorithm, random ticks, etc.
On the client side, most of the time spent rendering a frame is spent on the gpu, so java isn't the bottleneck there at all. All java does is process user inputs, build meshes to send to the gpu, and send packets to the server (yes, even in single-player, ever since 1.13), and that's it.
The only part where java doesn't shine is memory usage and the GC spikes you'll get after playing for a long time, especially since the game is known to have had memory leaks for a while. As those accumulate, the GC spikes get more and more extreme, to the point where every 5 seconds your game freezes for a second.
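If you want to see what that looks like, here's a toy leak (made-up names, obviously not the game's code): a cache with no eviction keeps everything reachable, the live heap only grows, and the GC has more to trace every cycle, hence the ballooning pauses. Run it with -Xlog:gc and watch the pause times climb until it OOMs.
```java
import java.util.HashMap;
import java.util.Map;

class LeakSketch {
    // "Cache" with no eviction: every entry stays strongly reachable forever.
    static final Map<Long, byte[]> chunkCache = new HashMap<>();

    public static void main(String[] args) {
        long key = 0;
        while (true) {
            // Simulate loading a chunk (~16 KB of data) and never unloading it.
            chunkCache.put(key++, new byte[16 * 1024]);
            if (key % 10_000 == 0) {
                System.out.println("live entries: " + key);
            }
        }
        // Eventually dies with OutOfMemoryError; the lengthening GC pauses
        // along the way are the "spikes" described above.
    }
}
```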
In general, java is expected to be around 1.5-2x slower than natively compiled languages, but that varies a lot depending on which java features you rely on. If you rely heavily on jvm reflection, it's gonna run like dogshit, but overall minecraft is pretty well optimized in that regard.
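Quick and dirty way to see the reflection tax for yourself (naive timing loop, not a real JMH benchmark, so treat the numbers as a vibe check only):
```java
import java.lang.reflect.Method;

class ReflectionCostSketch {
    public int add(int a, int b) { return a + b; }

    public static void main(String[] args) throws Exception {
        ReflectionCostSketch s = new ReflectionCostSketch();
        Method m = ReflectionCostSketch.class.getMethod("add", int.class, int.class);

        long sum = 0;

        // Direct call: the JIT can inline this.
        long t0 = System.nanoTime();
        for (int i = 0; i < 10_000_000; i++) sum += s.add(i, 1);
        long direct = System.nanoTime() - t0;

        // Reflective call: boxing the args plus Method.invoke overhead per call.
        long t1 = System.nanoTime();
        for (int i = 0; i < 10_000_000; i++) sum += (int) m.invoke(s, i, 1);
        long reflective = System.nanoTime() - t1;

        System.out.println("direct:     " + direct / 1_000_000 + " ms");
        System.out.println("reflective: " + reflective / 1_000_000 + " ms");
        System.out.println(sum); // keep the JIT from eliding the loops
    }
}
```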
You'd be surprised how many AAA games have most of their gameplay logic written in an interpreted scripting language, like lua or papyrus (bethesda's). The C++ code is mostly what's responsible for initializing everything, handling inputs and talking to the GPU via DirectX or Vulkan. But like in most games, most of the frame time is spent by the gpu rendering stuff.
t. modder since 1.6