Profiling and logging tools - release notes

Profiling and logging tools component release notes

1.0 Limitations
   1.1 How to open the Console view in the Profiling and Logging perspective
   1.2 Probekit: Consider choosing UTF-8 encoding for Probekit source files
   1.3 Leak Analysis: Leak Analysis not available on OS/400 iSeries(TM)
   1.4 Tips for profiling your application
   1.5 Leak Analysis: Unexpected behavior with "Send profiling data to a file"
   1.6 Leak Analysis: Opening an Object Reference Graph for an IBM® OS/390® (SVC) heap dump can take a long time
   1.7 Leak Analysis: Location and management of Hyades Optimized heap dump files
   1.8 Leak Analysis: The leak analysis log file
   1.9 Leak Analysis: Setting RADLEAKJVMSIZE to analyze very large heaps
   1.10 Thread Analysis: Deadlock detection does not work with IBM JRE 1.4.2
2.0 Known problems
   2.1 Probekit: Do not use non-ASCII characters in Probekit source file names
   2.2 Probekit: Building Probekit source files
   2.3 Probekit: Non-ASCII characters in Probekit Target specifications
   2.4 Method and Line Level Coverage: 'flush on method' with non-ASCII characters
   2.5 Method and Line Level Coverage: EXCLUDE filters must not begin with a wildcard
   2.6 Profiling requires libstdc++-libc6.2-2.so.3 patch
   2.7 Leak Analysis: No support for IBM heap dumps created by J9 JVM
   2.8 Leak Analysis: Must specify a new Project or Monitor when importing heap files
   2.9 Line Level Coverage, Probekit: Must restart project to collect data from already loaded classes
   2.10 Leak Analysis: Location of heap dumps with WAS during leak analysis
   2.11 Double byte characters do not show up in console view
   2.12 Leak Analysis: Locale for analysis needs to be the same as for data collection
   2.13 Thread Analysis: Missing thread owner of locks with IBM JRE 1.4.1 or earlier
   2.14 When remote profiling on Solaris, Sun's JVM may crash

Profiling and logging tools component release notes

1.0 Limitations

1.1 How to open the Console view in the Profiling and Logging perspective

When you profile an application, the console view does not appear in the Profiling and Logging perspective by default.

To open the console view in the Profiling and Logging perspective, select Window->Show View->Console.

To make stdout appear in the Console view, click Window->Preferences->Run/Debug->Console and select "Show when program writes to standard out".

1.2 Probekit: Consider choosing UTF-8 encoding for Probekit source files

When creating a new Probekit source file, the wizard lets you choose the XML encoding to use. The default selection is ASCII. If you want to use non-ASCII characters anywhere in the probe source file (for example, in the Label or Description fields, or in a fragment's Java code), you must choose UTF-8 encoding, not ASCII.

To change the encoding of an existing probe source file, right-click on the file and select Open With -> Text Editor. Change the encoding in the XML header to "UTF-8" and save and close the file. Right-click again and choose Open With -> Probe Editor to edit the contents.
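For example, assuming the file begins with a standard XML declaration, after the change the first line of the probe source file should read:

<?xml version="1.0" encoding="UTF-8"?>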

1.3 Leak Analysis: Leak Analysis not available on OS/400 iSeries(TM)

The Leak Analysis feature is not available for user programs run on OS/400® iSeries(TM). The Hyades Optimized heap dumps generated on this platform are incomplete, and it is not possible to generate heap dumps in any other format.

1.4 Tips for profiling your application

The performance of the profiling tools is directly related to the amount of data collected and the rate at which that data is transferred to the workbench. As the amount of data increases, performance decreases, both in the time it takes to perform analysis and in the memory available for other tasks. You can improve profiling performance in several ways, chiefly by reducing the amount of data collected.

1.5 Leak Analysis: Unexpected behavior with "Send profiling data to a file"

When collecting the binary Hyades Optimized heap dumps, if you send the data to a trcxml file by selecting "Send profiling data to a file", please be aware of the following:

You must have Agent Controller running on the deployment host to access the heap files that are saved there. The first time you run Import->Profiling file on the trcxml file, leak analysis and viewing Object Reference Graphs work as expected.

If you run Import->Profiling file a second time, the import works, but attempts to run Leak Analysis or view an Object Reference Graph may fail, because the required heap files may no longer be available on the deployment host.

If you encounter this problem, please access the heap files from the project where you first imported the trcxml file. The heap files are in a directory named "leakanalysisheapdir" under the project directory.

1.6 Leak Analysis: Opening an Object Reference Graph for an IBM® OS/390® (SVC) heap dump can take a long time

The IBM® OS/390® (SVC) heap dumps are very large. Expanding large heap dumps to view them in the Object Reference Graph view can take a long time. As a result, the operation may seem to hang. The workbench may still be actively expanding the heap dump even when the progress monitor appears stuck at 100%.

1.7 Leak Analysis: Location and management of Hyades Optimized heap dump files

Performing the "Capture heap dump" action generates Hyades Optimized heap dumps on the host where the target application is deployed. The heap dump destination directory is controlled by the setting of LOCAL_AGENT_TEMP_DIR in Agent Controller's configuration file, serviceconfig.xml. For information on locating and modifying this file, see the Help topic "Administering the Agent Controller" under "Detecting and analyzing runtime problems."

If you get either of the following error messages, "Expand Heap Dump failed in step: ...Reading file" or "Leak Analysis failed in step: Creating heap object reference graph", please verify that Agent Controller is running on the deployment host and retry your command. Agent Controller assists in copying the files from the deployment host to the workbench project directory.

1.8 Leak Analysis: The leak analysis log file

If you experience problems during leak analysis, you may find the Leak Analysis log file helpful.

During Leak Analysis, diagnostic information is written to the LeakAnalysis.log file. LeakAnalysis.log contains the output of the various steps performed during leak analysis and will indicate the success or failure of the leak analysis run.

LeakAnalysis.log is written to the profiling project associated with the profile data. For example, on Windows, <my_workspace>\ProfileProject\LeakAnalysis.log.

Additional information can be written to the log file by using the RADLEAKREGIONDUMP system property. Add this option to the rationalsdp.ini file:

VMArgs=-DRADLEAKREGIONDUMP=1

The rationalsdp.ini file is found in the Rational Software Architect installation directory.

1.9 Leak Analysis: Setting RADLEAKJVMSIZE to analyze very large heaps

If your leak analysis fails with the following message in the LeakAnalysis.log file, you must increase the heap size of the leak analysis process:

JVMDUMP006I Processing Dump Event "uncaught", detail "java/lang/OutOfMemoryError"

To do this, please set the Rational Software Architect system attribute RADLEAKJVMSIZE. This attribute controls the JVM heap size available during leak analysis.

To set RADLEAKJVMSIZE, add this option to the rationalsdp.ini file:

VMArgs=-DRADLEAKJVMSIZE=value

Where value is the new heap size limit, such as 1024M. The default value is 512M. You must append M or G to indicate whether the heap size is expressed in megabytes or gigabytes.
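For example, to allow the leak analysis process to use a 1 GB heap, the line would read:

VMArgs=-DRADLEAKJVMSIZE=1024M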

The rationalsdp.ini file is found in the Rational Software Architect installation directory.

1.10 Thread Analysis: Deadlock detection does not work with IBM JRE 1.4.2

When using the IBM classic JVM with the Thread Analysis profiling feature, the Thread View of the Profiling and Logging perspective does not display 'Waiting for lock' states for all the threads involved in a deadlock. This is due to missing information in the collected data. Workaround: Use the IBM J9 JVM by adding -Xj9 to the VM arguments field of the Arguments tab of the Profile dialog.

2.0 Known problems

2.1 Probekit: Do not use non-ASCII characters in Probekit source file names

Probekit source files with non-ASCII characters in their names will not be processed correctly. Use only ASCII characters in Probekit source file names.

2.2 Probekit: Building Probekit source files

Do not use the Probekit->Compile action that appears in the context menu for *.probe files. Instead, convert the project containing the *.probe file to a Probekit project, and use the standard build mechanism. (To convert a Java project to a Probekit project, use File->New->Other and from the Profiling and Logging section choose Convert Java projects to Probekit projects).

2.3 Probekit: Non-ASCII characters in Probekit Target specifications

Do not use non-ASCII characters in the patterns for Probekit "Target" specifications. Probes which contain non-ASCII characters in Target patterns will not be processed correctly.

2.4 Method and Line Level Coverage: 'flush on method' with non-ASCII characters

Do not use non-ASCII characters when adding method patterns for "Flush coverage data when..."

If you enter non-ASCII characters in the package, class, or method fields of the method pattern Add dialog, an invalid input error is displayed and you will not be able to dismiss the dialog.

Workaround: Use a wildcard (asterisk) character in place of the non-ASCII characters in your patterns.

2.5 Method and Line Level Coverage: EXCLUDE filters must not begin with a wildcard

An EXCLUDE filter beginning with a wildcard character (asterisk), such as "*foo", causes the Coverage Statistics, Coverage Navigator and Annotated Source views to display no data. Workaround: Do not use such an EXCLUDE filter.

2.6 Profiling requires libstdc++-libc6.2-2.so.3 patch

Before you can collect profiling data, Agent Controller must be running on the machine from which you intend to collect the data. On RedHat Linux machines, Agent Controller requires the libstdc++.so patch libstdc++-libc6.2-2.so.3.

2.7 Leak Analysis: No support for IBM heap dumps created by J9 JVM

The Leak Analysis feature is not available for user programs running the IBM J9 JVM.

The IBM J9 JVM creates heap files with names similar to heapdump.20041012.093936.2192.dmp when you set the environment variable IBM_HEAPDUMP and send a "kill -3" signal to the running Java process. These .dmp files must be post-processed with j9extract and jdmpview to create IBM heap dumps.
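For example, on a Unix-style system (a hypothetical session; the application name and process ID are placeholders), the environment variable must be set before the JVM starts, and the signal is then sent to the running process:

export IBM_HEAPDUMP=true
java MyApp &
kill -3 <java_process_id>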

The format of these heap dumps is not identical to the format of heap dumps generated by the classic IBM JVM.

2.8 Leak Analysis: Must specify a new Project or Monitor when importing heap files

If you import multiple sets of heap dumps with the same monitor name into an existing project, you may lose data if you later save the project, or exit the workbench.

To prevent this, specify a unique Project/Monitor combination for each set of heap dumps that you import.

2.9 Line Level Coverage, Probekit: Must restart project to collect data from already loaded classes

If you start a WAS server and attach to it, Probekit and Line Level Coverage profiling types will not collect data for any class that has already been loaded in the target JVM. Workaround: To collect data from these classes, restart the project containing these classes.

2.10 Leak Analysis: Location of heap dumps with WAS during leak analysis

While profiling your WAS applications for leak analysis on Linux, the optheap files are placed in the following directory:

For WAS 6.0, in runtimes/base_v6/profiles/default in the Rational Software Architect installation directory

For WAS 5.x, in the Rational Software Architect installation directory.

2.11 Double byte characters do not show up in console view

During profiling, all double byte characters show up as ???? in the console view.

2.12 Leak Analysis: Locale for analysis needs to be the same as for data collection

The locale setting on the workbench host, the remote deployment host, and the target application, must all be the same when collecting Hyades Optimized heap dumps.

2.13 Thread Analysis: Missing thread owner of locks with IBM JRE 1.4.1 or earlier

When profiling for Thread Analysis with IBM JVM 1.4.1 or earlier, the Threads View in the Profiling and Logging perspective does not show the thread owner of lock monitors as this data is not collected. Workaround: Upgrade to IBM JRE 1.4.2.

2.14 When remote profiling on Solaris, Sun's JVM may crash

When profiling remotely on Solaris, a defect in the Sun 1.4.x JRE can cause the JVM to crash with some combinations of profiling features, especially when memory profiling or thread analysis is enabled. Sun's site describes this problem: http://developer.java.sun.com/developer/bugParade/bugs/4614956.html Workaround: Use Sun JRE 1.4.2_06 or later.

Return to the main readme file