Many years ago, when I used to play Minecraft, the common wisdom was simply to get as much memory as possible so that the GC would run less frequently. With the debug overlay you could watch the game allocate at roughly 300-500 MB/s until the JVM GC kicked in, which caused noticeable lag spikes. I used to know a few MC server admins, and I was surprised just how massive the game's resource consumption was. One server I usually played on ran across multiple machines; the specs were, I believe, around 128 GB of memory and 32 cores, with 16 or so dedicated to GC...
Something I remember from that forum post was that instead of using an array-based representation for the meshes, it used classes for every kind of value (Vector, Point, ...). Very nice textbook OOP, but not always the best approach.
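To make the contrast concrete, here's a minimal sketch of the two layouts. The class and field names are made up for illustration, not taken from Minecraft's actual code: one version allocates a heap object per vertex, the other packs everything into a single flat array, which means one allocation per mesh and far less work for the GC.

```java
// OOP style: one heap object per vertex.
// N vertices -> N allocations plus an array of pointers, and the GC has
// to trace every one of them.
class Vec3 {
    double x, y, z;
    Vec3(double x, double y, double z) { this.x = x; this.y = y; this.z = z; }
}

class ObjectMesh {
    final Vec3[] vertices;
    ObjectMesh(int n) {
        vertices = new Vec3[n];
        for (int i = 0; i < n; i++) vertices[i] = new Vec3(i, i, i);
    }
}

// Data-oriented style: one flat array for the whole mesh.
// A single allocation, contiguous in memory (cache friendly), and a
// single object for the GC to track regardless of vertex count.
class FlatMesh {
    final double[] coords; // layout: x0, y0, z0, x1, y1, z1, ...
    FlatMesh(int n) {
        coords = new double[n * 3];
        for (int i = 0; i < n; i++) {
            coords[i * 3]     = i;
            coords[i * 3 + 1] = i;
            coords[i * 3 + 2] = i;
        }
    }
    double x(int i) { return coords[i * 3]; }
    double y(int i) { return coords[i * 3 + 1]; }
    double z(int i) { return coords[i * 3 + 2]; }
}
```

For a mesh with millions of vertices being rebuilt every time a chunk changes, the difference between "millions of small objects" and "one big array" is exactly the kind of allocation rate the debug overlay was showing.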