FAQ
| Memory Analyzer | Minimum Java Version |
|---|---|
| 1.16+ | 17 |
| 1.12 - 1.15 | 11 |
| 1.8 - 1.11 | 1.8 |
For all officially released versions the following applies:
- the two most recent releases from the latest major version will be available on http://download.eclipse.org
- releases included in the Eclipse simultaneous release will be available on http://download.eclipse.org
- older releases will be archived on http://archive.eclipse.org and remain available there indefinitely
This applies both for RCPs and update sites.
Milestone builds, developer builds, or previews may be fully removed from the download sites or archived, at the discretion of the Memory Analyzer team.
You may receive one of the following error messages, or something quite similar:

```
Incompatible JVM
Version 1.4.2 of the JVM is not suitable for this product. Version 1.5.0 or
greater is required.
```

or

```
Incompatible JVM
Version 11.0.17 of the JVM is not suitable for this product. Version 17 or
greater is required.
```
Check the compatibility matrix above to ensure you are using an appropriate Java version and that JAVA_HOME is set appropriately.
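A quick way to see which Java a launcher would find is the sketch below; it assumes a POSIX shell and only inspects the PATH and the JAVA_HOME variable:

```shell
# Print the version of the java found on the PATH, if any
if command -v java >/dev/null 2>&1; then
    java -version
fi
# Show whether JAVA_HOME is set
echo "JAVA_HOME=${JAVA_HOME:-<not set>}"
```

If the printed version is lower than the minimum required by your Memory Analyzer release, point -vm (or JAVA_HOME) at a suitable JDK as described below.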
If necessary, the runtime can be provided in two ways. On the command line:

```
MemoryAnalyzer.exe -vm path/to/java17/bin
```

Or, for a permanent update, the MemoryAnalyzer.ini file can be updated. It is important that the arguments are across two lines, and they must come before the -vmargs lines:

```
-vm
path/to/java17/bin
```

This error happens because the MAT plug-in requires Java 1.8 via its MANIFEST.MF file and the OSGi runtime dutifully does not activate the plug-in.
See directly above; the steps are identical to those for the Incompatible JVM error.
This error occurs when attempting to run an older version of Eclipse MAT with a newer version of Java. Upgrading to a newer version of MAT should work; check the compatibility matrix above.
```
java.lang.reflect.InaccessibleObjectException: Unable to make boolean sun.nio.fs.WindowsFileAttributes.isDirectoryLink() accessible: module java.base
does not "opens sun.nio.fs" to unnamed module @776d8097
```

As a workaround for this issue, you can also set the following in MemoryAnalyzer.ini:

```
...
-vmargs
--add-opens=java.base/sun.nio.fs=ALL-UNNAMED
```
Analyzing big heap dumps can require more heap space. Be sure that the machine doing the parsing has sufficiently large available memory, and then configure Memory Analyzer to be able to use it.
```
# This example sets a 4GB maximum; you can experiment with much larger values
MemoryAnalyzer.exe -vmargs -Xmx4g -XX:-UseGCOverheadLimit
```

You can also provide the values in the MemoryAnalyzer.ini file of the installation:

```
-vmargs
-Xmx2g
-XX:-UseGCOverheadLimit
```

The -vmargs lines must come last in the MemoryAnalyzer.ini file.
As a rough guide, Memory Analyzer itself needs 32 to 64 bytes for each object in
the analyzed heap, so -Xmx2g might allow a heap dump containing 30 to 60
million objects to be analyzed.
It is possible to analyze heaps with up to 2 billion objects, for which you would need a sufficiently large heap.
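The rule of thumb above can be turned into a quick back-of-the-envelope calculation. This is a sketch: the 48 bytes per object is an assumed midpoint of the 32 to 64 byte range, and the object count is illustrative:

```shell
# Estimate the MAT heap needed to analyze a dump with a given object count,
# assuming ~48 bytes of MAT heap per analyzed object (midpoint of 32-64)
objects=60000000          # number of objects in the heap dump
bytes_per_object=48
needed_mib=$(( objects * bytes_per_object / 1024 / 1024 ))
echo "Suggest -Xmx of at least ${needed_mib} MiB"   # ~2746 MiB for 60M objects
```

Since the real per-object cost varies between dumps, round the result up generously when choosing -Xmx.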
If you are running the Memory Analyzer inside your Eclipse SDK, you need to
edit the eclipse.ini file instead of MemoryAnalyzer.ini.
In short: if you run a 64-bit VM, then all native parts must also be 64-bit. But what if, as with Motif on AIX, the native SWT libraries are only available as a 32-bit version? One can still run the command-line parsing on 64-bit by following the same steps as below for headless processing.
The initial parse and generation of the dominator tree uses the most memory, so it can be useful to do the initial parse on a large machine, then copy the heap dump and index files to a more convenient machine for further analysis.
There is a simple script that allows you to do the indexing on a remote machine, as long as a MAT installation is available there.
```
# just do the indexing
/path/to/mat/ParseHeapDump.sh /path/to/heapdump.hprof
# also run some suspects reports
/path/to/mat/ParseHeapDump.sh /path/to/heapdump.hprof org.eclipse.mat.api:suspects
```

Command-line reports:
- `org.eclipse.mat.api:suspects` creates a leak suspects ZIP report.
- `org.eclipse.mat.api:overview` creates the overview ZIP report.
- `org.eclipse.mat.api:top_components` creates the top components report.
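Several reports can also be requested in a single invocation, so the dump is parsed only once (paths as in the example above):

```
/path/to/mat/ParseHeapDump.sh /path/to/heapdump.hprof \
    org.eclipse.mat.api:suspects \
    org.eclipse.mat.api:overview \
    org.eclipse.mat.api:top_components
```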
For more details, check out the section Running Eclipse in the Help Center.
With Memory Analyzer 0.8, but not Memory Analyzer 1.0 or later, the IBM DTFJ adapter has to be initialized in advance. For parsing IBM dumps with the IBM DTFJ adapter, Memory Analyzer 0.8 should be started with this command:
```
/usr/java5_64/jre/bin/java \
-Dosgi.bundles=org.eclipse.mat.dtfj@4:start,org.eclipse.equinox.common@2:start,org.eclipse.update.configurator@3:start,org.eclipse.core.runtime@start \
-jar plugins/org.eclipse.equinox.launcher_*.jar \
-consoleLog \
-application org.eclipse.mat.api.parse path/to/mydump.dmp.zip \
org.eclipse.mat.api:suspects \
org.eclipse.mat.api:overview \
org.eclipse.mat.api:top_components
```

This error indicates an inconsistent heap dump: the data in the heap dump is written in various segments, and in this case an address expected in a class segment was written into an instance segment.
The problem has been reported for heap dumps generated by jmap on Linux and Solaris operating systems with jdk1.5.0_13 and below. Solution: use the latest jdk/jmap version, or use jconsole to write the heap dump (needs jdk6).
This almost always means the heap dump has not been written properly by the virtual machine, and the Memory Analyzer is not able to read it.
If you are able to read the dump with other tools, please file a bug report. Using the HPROF options with debug output enabled may help in debugging this problem.
This warning message is printed to the log file if the heap dump was written via the (obsolete and unstable) HPROF agent. The agent can write multiple heap dumps into one HPROF file. Memory Analyzer 1.2 and earlier has no UI support for deciding which heap dump to read; by default, MAT takes the first heap dump. If you want to read an alternative dump, start MAT with the system property `MAT_HPROF_DUMP_NR=<index>`.
Memory Analyzer 1.3 provides a dialog for the user to select the appropriate dump.
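For Memory Analyzer 1.2 and earlier, the property can be passed as a VM argument, for example by appending it to MemoryAnalyzer.ini (a sketch; `<index>` stands for the dump number, as described above):

```
-vmargs
-DMAT_HPROF_DUMP_NR=<index>
```

If MemoryAnalyzer.ini already contains a -vmargs section, add only the -D line to it.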
Eclipse MAT currently only supports heaps with up to ~2 billion objects, as it uses Java arrays internally when processing the file (and these are limited to 2^31 entries).
To work around this, Eclipse MAT supports a setting to "discard" some percentage of objects so that only a fraction of the objects are loaded.
To configure it, enable object discard via Window > Preferences > Memory Analyzer > Enable discard. Then you should be able to open the file with Eclipse MAT.
This is useful because it allows loading heaps with many objects. However, some links and object references may be dropped during processing, so the results will be less accurate.
See the MAT Configuration help page for more information on 'discard'.
To show debug output of MAT:

- Create or append to the file `.options` in the Eclipse main directory the lines:

```
org.eclipse.mat.parser/debug=true
org.eclipse.mat.report/debug=true
org.eclipse.mat.dtfj/debug=true
org.eclipse.mat.dtfj/debug/verbose=true
org.eclipse.mat.hprof/debug=true
org.eclipse.mat.hprof/debug/parser=true
```

Edit this file to remove some lines if you are not interested in output from a particular plug-in. On macOS, this file should be placed in `*.app/Contents/MacOS/.options`.
- Start Eclipse with the `-debug` option. This can be done by appending `-debug` to the `eclipse.ini` or `MemoryAnalyzer.ini` file in the same directory as the `.options` file.
- Be sure to also enable the `-consoleLog` option to actually see the output.
- If you want to enable debug output for the stand-alone Memory Analyzer, create the `.options` file in the mat directory and start Memory Analyzer using:

```
MemoryAnalyzer -debug -consoleLog
```

Symptom: When monitoring the memory usage interactively, the used heap size is much bigger than what MAT reports.
During the index creation, the Memory Analyzer removes unreachable objects because the various garbage collector algorithms tend to leave some garbage behind (if the object is too small, moving and re-assigning addresses is too expensive). This should, however, be no more than 3 to 4 percent.
If you want to know what objects are removed, enable debug output.
Another reason could be that the heap dump was not written properly. Especially older VMs (1.4, 1.5) can have problems if the heap dump is written via jmap.
Otherwise, feel free to report a bug.
By default, unreachable objects are removed from the heap dump while parsing and will not appear in the class histogram, dominator tree, etc. Yet it is possible to open a histogram of unreachable objects. You can do it:

- from the link on the Overview page
- from the Query Browser via Java Basics --> Unreachable Objects Histogram
This histogram has no object graph behind it (unreachable objects are removed during the parsing of the heap dump, only class names are stored). Thus it is not possible to see e.g. a list of references for a particular unreachable object.
But there is a possibility to keep unreachable objects while parsing. For this you need to either:

- parse the heap dump from the command line, providing the argument `-keep_unreachable_objects`, as in

```
ParseHeapDump.bat -keep_unreachable_objects <heap dump>
```

- or, for a permanent setting, set the preference using 'Window' > 'Preferences' > 'Memory Analyzer' > 'Keep Unreachable Objects', then parse the dump.

Memory Analyzer version 1.1 and later has this preference page option to select keep_unreachable_objects.
Depending on the type of crash, consider testing with one or more of these options in MemoryAnalyzer.ini:
```
-Dorg.eclipse.swt.browser.XULRunnerPath=/usr/lib/xulrunner-compat/
```

Normally you must first install your distribution's xulrunner-compat package.

```
-Dorg.eclipse.swt.browser.UseWebKitGTK=true
```
Is it possible to extend the Memory Analyzer to analyze the memory consumption of C or C++ programs?
No, this is not possible. The design of the Memory Analyzer is specific to Java heap dumps.