ARTIFACTORY: Encountering unaccounted-for high memory usage by Java? Here are some tips on how to diagnose and tune it

Inbar Cisling
2022-05-26 11:29

If you encounter a steady growth in memory usage in your Java memory graphs, or a scenario similar to the following:

  • System deployed with 20GB RAM
  • Java heap is 65% of RAM, which is 13GB (Xmx=13g)
  • After running for a while, memory fills up and the Java RSS reaches 19GB. This means there are 6GB of off-heap memory in use; the expected off-heap usage is below 1GB for any deployment size.
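The arithmetic in the scenario above can be checked with a quick shell calculation (the numbers here are the example's, not measured values):

```shell
#!/bin/sh
# Worked example using the scenario above: 20GB RAM, Xmx=13g, RSS of 19GB.
HEAP_GB=13   # configured via -Xmx13g
RSS_GB=19    # resident set size reported for the java process
OFFHEAP_GB=$((RSS_GB - HEAP_GB))
echo "approximate off-heap usage: ${OFFHEAP_GB}GB"   # 6GB, well above the expected <1GB
```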

It is a good idea to add these Java memory graphs to your monitoring, so you can identify where the growth is coming from.

There are a couple of things you should validate to support the theory that you are dealing with off-heap memory growth:

1. Run the ‘top’ command to identify which process consumes the most memory, like this:
PID    USER      PR  NI    VIRT    RES    SHR S  %CPU %MEM  TIME+    COMMAND
XXX    root      20   0  146.4g 118.6g   8724 S 411.1 62.9  91897:38 java
       root      20   0  892316   3748   1280 S  38.9  0.0   6453:21 collectd
       root      20   0  162652   2584   1512 R  16.7  0.0   0:00.07 top
       root      20   0 7010468  81116   8660 S   5.6  0.0  14640:08 jf-router
VIRT represents the total virtual memory used by the process, while RES represents the actual physical RAM the process is using. As we can see, the top consumer is Java.

Take additional runs of the top command to check whether the RES usage decreases at some point or only keeps increasing over time.
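A minimal sketch of such repeated sampling, assuming a Linux ps (it samples the current shell's own PID for demonstration; substitute the Java PID from top, and use a longer interval such as 60s for a real trend):

```shell
#!/bin/sh
# Sample a process's RSS (in KB) a few times to see whether it keeps growing.
# PID=$$ (the current shell) is a stand-in for the Java PID from top.
PID=$$
for i in 1 2 3; do
  ps -o rss= -p "$PID"
  sleep 1
done
```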

2. You can generate a full heap dump (a valid option for small heap sizes), or use lighter commands such as jmap -histo, to identify the top memory-consuming classes in your heap. Here is an example of a jmap -histo output:
num     #instances         #bytes  class name (module)
-------------------------------------------------------
   1:     183948116    20051625016  [B (java.base@11.0.10)
   2:     107411561    10480182648  [Ljava.lang.Object; (java.base@11.0.10)
   3:      14622663     4640754192  [I (java.base@11.0.10)
   4:      66011886     3696665616  java.io.ObjectStreamClass$WeakClassKey       
   5:      67176598     2149651136  java.lang.String (java.base@11.0.10)
   6:          8447     1625706088  [Ljava.io.ObjectInputStream$HandleTable$HandleList; 
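For reference, the commands behind step 2 look roughly like this (a sketch using standard JDK tooling; <pid> is the Java process ID and the dump file path is just an example):

```shell
# Print a class histogram of live objects (the :live option triggers a full GC first):
jmap -histo:live <pid> | head -n 20

# Or capture a full heap dump in HPROF binary format for offline analysis.
# Only practical for small heaps; the file can be roughly as large as the heap.
jmap -dump:live,format=b,file=/tmp/heap.hprof <pid>
```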

If the JVM process memory usage is well above your configured heap size (for example, the heap size is only 50GB while memory consumption reaches 100GB), it may imply that the memory increase is related to off-heap usage and not on-heap.

3. Off-heap usage can be monitored with Native Memory Tracking. Enable it with the following JVM startup flag, then use jcmd to take a baseline and, later, a diff:
-XX:NativeMemoryTracking=detail
jcmd <pid> VM.native_memory baseline
jcmd <pid> VM.native_memory detail.diff
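Put together, the workflow looks roughly like this (app.jar is a placeholder for your Java service; NMT must be enabled at startup and adds some runtime overhead):

```shell
# 1. Start the JVM with Native Memory Tracking enabled (a startup flag, not a command):
java -XX:NativeMemoryTracking=detail -Xmx13g -jar app.jar &
PID=$!

# 2. After the application warms up, record a baseline:
jcmd "$PID" VM.native_memory baseline

# 3. Later, print per-area changes relative to that baseline:
jcmd "$PID" VM.native_memory detail.diff
```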

 

An example output of Native Memory Tracking:

Total: reserved=112659368KB +849436KB, committed=110938608KB +311636KB

          Java Heap (reserved=104857600KB, committed=104857600KB)
                    (mmap: reserved=104857600KB, committed=104857600KB)

              Class (reserved=390942KB +6691KB, committed=389834KB +6179KB)
                    (classes #67968 +629)
                    (  instance classes #65290 +621, array classes #2678 +8)
                    (malloc=14110KB +547KB #236512 +4073)
                    (mmap: reserved=376832KB +6144KB, committed=375724KB +5632KB)
                    (  Metadata:   )
                    (    reserved=0KB, committed=0KB)
                    (    used=0KB)
                    (    free=0KB)
                    (    waste=0KB =-nan%)

             Thread (reserved=1918398KB +628973KB, committed=226478KB +84781KB)
                    (thread #1858 +609)
                    (stack: reserved=1909532KB +626052KB, committed=217612KB +81860KB)
                    (malloc=6690KB +2208KB #11150 +3654)
                    (arena=2176KB +714 #3715 +1218)

                    (malloc=14554KB +209KB #34757 -4963)
                    (mmap: reserved=247684KB, committed=219952KB +6904KB)

                 GC (reserved=4275279KB +71298KB, committed=4275279KB +71298KB)
                    (malloc=346483KB +71298KB #180670 +19138)
                    (mmap: reserved=3928796KB, committed=3928796KB)

           Compiler (reserved=11553KB +4492KB, committed=11553KB +4492KB)
                    (malloc=11420KB +4492KB #7690 +1681)
                    (arena=133KB #5)

           Internal (reserved=81594KB +24067KB, committed=81594KB +24067KB)
                    (malloc=81562KB +24067KB #33110 +8076)
                    (mmap: reserved=32KB, committed=32KB)

              Other (reserved=666518KB +3121KB, committed=666518KB +3121KB)

As you can see, the malloc value is increasing for each memory area. If you take a couple more NMT snapshots and see this value still increasing, we recommend tuning the glibc malloc arena setting as follows:

Set this environment variable on your system: export MALLOC_ARENA_MAX=<2 x (# of CPU cores)>

MALLOC_ARENA_MAX sets the maximum number of malloc arenas (memory pools) glibc will use, regardless of the number of cores. The default is 8 times the number of cores on the machine. We recommend first reducing it to twice the number of cores (if the machine has 16 cores, set it to 32). If you would like to reduce it even more, do so gradually and check the effect; setting it too low can impact CPU performance, so please take that into account.
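A small sketch of setting the recommended starting value, assuming Linux's nproc is available:

```shell
#!/bin/sh
# Start from 2x the number of CPU cores, per the recommendation above.
CORES=$(nproc)
export MALLOC_ARENA_MAX=$((2 * CORES))
echo "MALLOC_ARENA_MAX=${MALLOC_ARENA_MAX}"   # e.g. 32 on a 16-core machine
```

In containerized or service-managed deployments, the export typically goes in the service's environment configuration so the JVM inherits it at startup.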

 

Here are some references you can use for reading about Java memory usage settings:
https://blog.malt.engineering/java-in-k8s-how-weve-reduced-memory-usage-without-changing-any-code-cbef5d740ad
https://newbedev.com/growing-resident-memory-usage-rss-of-java-process
https://blog.picnic.nl/quest-to-the-os-java-native-memory-5d3ef68ffc0a