What should be done about GC overhead limit exceeded?

Our project hit an out-of-memory error reported as java.lang.OutOfMemoryError: GC overhead limit exceeded. Analyzing the heap dump with the MAT tool showed the heap occupying about 1.6 GB (Xmx is set to 2 GB), and the BlockingQueue of the business thread pool was nearly at capacity; however, according to the sizing estimate we made at the time, the queue should not overflow memory even when full.
I found an article online: https://blog.csdn.net/renfufei/article/details/77585294. Can I add the JVM parameter -XX:-UseGCOverheadLimit to resolve this situation?
I would also like to ask whether GC overhead limit exceeded has anything to do with the metaspace, because -XX:MaxMetaspaceSize=128m is configured and it was observed at around 98% during GC.



The error java.lang.OutOfMemoryError: GC overhead limit exceeded is actually an inference made by the JVM: if garbage collection takes more than 98% of the time but reclaims less than 2% of the heap, the JVM concludes that an OOM is imminent and terminates the program early. You can indeed turn this check off with -XX:-UseGCOverheadLimit.
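
To make that concrete, here is a minimal sketch (the class name GcOverheadDemo and the -Xmx64m value are illustrative, not taken from your setup) that tends to trigger this error when run with a small heap; with -XX:-UseGCOverheadLimit the same program generally just fails later with a plain Java heap space error once the heap is truly exhausted.

```java
// Run with a tiny heap, e.g.:  java -Xmx64m GcOverheadDemo
import java.util.ArrayList;
import java.util.List;

public class GcOverheadDemo {
    public static void main(String[] args) {
        List<int[]> retained = new ArrayList<>();
        while (true) {
            // Every allocation stays reachable, so each GC reclaims almost
            // nothing while collection time climbs past the 98% threshold.
            retained.add(new int[1024]);
        }
    }
}
```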

But that does not really help: with the check turned off, the program will still end up reporting an OOM. In the final analysis, it is either insufficient memory or a memory leak. Judging from the memory you have allocated, it looks like a memory leak, so you should find out where the memory is leaking and fix it. You said the BlockingQueue of the business thread pool is almost full: could it be that you are using an unbounded queue, tasks are consumed too slowly, and they pile up?
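
For illustration, here is a minimal sketch of a bounded thread pool (the class name, pool sizes, and queue capacity are made-up example values, not your configuration): a LinkedBlockingQueue created without a capacity is effectively unbounded and lets a slow consumer turn the backlog into heap growth, while a bounded queue plus a rejection policy such as CallerRunsPolicy applies back-pressure on submitters instead.

```java
import java.util.concurrent.ArrayBlockingQueue;
import java.util.concurrent.ThreadPoolExecutor;
import java.util.concurrent.TimeUnit;

public class BoundedPoolSketch {
    public static void main(String[] args) {
        ThreadPoolExecutor pool = new ThreadPoolExecutor(
                4,                                   // core threads (example value)
                8,                                   // max threads (example value)
                60, TimeUnit.SECONDS,                // idle thread keep-alive
                new ArrayBlockingQueue<>(1_000),     // bounded queue (example capacity)
                new ThreadPoolExecutor.CallerRunsPolicy()); // back-pressure when the queue is full

        for (int i = 0; i < 10_000; i++) {
            pool.execute(() -> { /* business task */ });
        }
        pool.shutdown();
    }
}
```

With a bounded queue, a pile-up shows up as rejected or caller-run tasks rather than as unbounded heap growth, which makes the slow-consumer problem visible long before an OOM.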

Besides, as far as I know, this error has nothing to do with the metaspace.
