8220639: Need a way to augment JTREG_LAUNCHER_OPTIONS from command-line
Reviewed-by: erikj, dholmes
commit aacb827896
parent 47e465cf1b
@@ -137,6 +137,8 @@ TEST FAILURE</code></pre>
 <h4 id="timeout_factor-1">TIMEOUT_FACTOR</h4>
 <p>The timeout factor (<code>-timeoutFactor</code>).</p>
 <p>Defaults to 4.</p>
+<h4 id="failure_handler_timeout">FAILURE_HANDLER_TIMEOUT</h4>
+<p>Sets the argument <code>-timeoutHandlerTimeout</code> for JTReg. The default value is 0. This is only valid if the failure handler is built.</p>
 <h4 id="test_mode">TEST_MODE</h4>
 <p>The test mode (<code>agentvm</code> or <code>othervm</code>).</p>
 <p>Defaults to <code>agentvm</code>.</p>
@@ -153,8 +155,10 @@ TEST FAILURE</code></pre>
 <p>Limit memory consumption (<code>-Xmx</code> and <code>-vmoption:-Xmx</code>, or none).</p>
 <p>Limit memory consumption for JTReg test framework and VM under test. Set to 0 to disable the limits.</p>
 <p>Defaults to 512m, except for hotspot, where it defaults to 0 (no limit).</p>
+<h4 id="max_output">MAX_OUTPUT</h4>
+<p>Set the property <code>javatest.maxOutputSize</code> for the launcher, to change the default JTReg log limit.</p>
 <h4 id="keywords">KEYWORDS</h4>
-<p>JTReg kewords sent to JTReg using <code>-k</code>. Please be careful in making sure that spaces and special characters (like <code>!</code>) are properly quoted. To avoid some issues, the special value <code>%20</code> can be used instead of space.</p>
+<p>JTReg keywords sent to JTReg using <code>-k</code>. Please be careful in making sure that spaces and special characters (like <code>!</code>) are properly quoted. To avoid some issues, the special value <code>%20</code> can be used instead of space.</p>
 <h4 id="extra_problem_lists">EXTRA_PROBLEM_LISTS</h4>
 <p>Use additional problem lists file or files, in addition to the default ProblemList.txt located at the JTReg test roots.</p>
 <p>If multiple file names are specified, they should be separated by space (or, to help avoid quoting issues, the special value <code>%20</code>).</p>
@@ -170,6 +174,8 @@ TEST FAILURE</code></pre>
 <h4 id="vm_options-1">VM_OPTIONS</h4>
 <p>Additional Java options to be used when compiling and running classes (sent to JTReg as <code>-vmoption</code>).</p>
 <p>This option is only needed in special circumstances. To pass Java options to your test classes, use <code>JAVA_OPTIONS</code>.</p>
+<h4 id="launcher_options">LAUNCHER_OPTIONS</h4>
+<p>Additional Java options that are sent to the java launcher that starts the JTReg harness.</p>
 <h4 id="aot_modules-1">AOT_MODULES</h4>
 <p>Generate AOT modules before testing for the specified module, or set of modules. If multiple modules are specified, they should be separated by space (or, to help avoid quoting issues, the special value <code>%20</code>).</p>
 <h4 id="retry_count">RETRY_COUNT</h4>
@@ -205,14 +211,19 @@ TEST FAILURE</code></pre>
 <p>Docker tests with default parameters may fail on systems with glibc versions not compatible with the one used in the default docker image (e.g., Oracle Linux 7.6 for x86). For example, they pass on Ubuntu 16.04 but fail on Ubuntu 18.04 if run like this on x86:</p>
 <pre><code>$ make test TEST="jtreg:test/hotspot/jtreg/containers/docker"</code></pre>
 <p>To run these tests correctly, additional parameters for the correct docker image are required on Ubuntu 18.04 by using <code>JAVA_OPTIONS</code>.</p>
-<pre><code>$ make test TEST="jtreg:test/hotspot/jtreg/containers/docker" JTREG="JAVA_OPTIONS=-Djdk.test.docker.image.name=ubuntu -Djdk.test.docker.image.version=latest"</code></pre>
+<pre><code>$ make test TEST="jtreg:test/hotspot/jtreg/containers/docker" \
+    JTREG="JAVA_OPTIONS=-Djdk.test.docker.image.name=ubuntu
+    -Djdk.test.docker.image.version=latest"</code></pre>
 <h3 id="non-us-locale">Non-US locale</h3>
-<p>If your locale is non-US, some tests are likely to fail. To work around this you can set the locale to US. On Unix platforms simply setting <code>LANG="en_US"</code> in the environment before running tests should work. On Windows, setting <code>JTREG="VM_OPTIONS=-Duser.language=en -Duser.country=US"</code> helps for most, but not all test cases. For example:</p>
+<p>If your locale is non-US, some tests are likely to fail. To work around this you can set the locale to US. On Unix platforms simply setting <code>LANG="en_US"</code> in the environment before running tests should work. On Windows, setting <code>JTREG="VM_OPTIONS=-Duser.language=en -Duser.country=US"</code> helps for most, but not all test cases.</p>
+<p>For example:</p>
 <pre><code>$ export LANG="en_US" && make test TEST=...
 $ make test JTREG="VM_OPTIONS=-Duser.language=en -Duser.country=US" TEST=...</code></pre>
 <h3 id="pkcs11-tests">PKCS11 Tests</h3>
-<p>It is highly recommended to use the latest NSS version when running PKCS11 tests. Improper NSS version may lead to unexpected failures which are hard to diagnose. For example, sun/security/pkcs11/Secmod/AddTrustedCert.java may fail on Ubuntu 18.04 with the default NSS version in the system. To run these tests correctly, the system property <code>test.nss.lib.paths</code> is required on Ubuntu 18.04 to specify the alternative NSS lib directories. For example:</p>
-<pre><code>$ make test TEST="jtreg:sun/security/pkcs11/Secmod/AddTrustedCert.java" JTREG="JAVA_OPTIONS=-Dtest.nss.lib.paths=/path/to/your/latest/NSS-libs"</code></pre>
+<p>It is highly recommended to use the latest NSS version when running PKCS11 tests. Improper NSS version may lead to unexpected failures which are hard to diagnose. For example, sun/security/pkcs11/Secmod/AddTrustedCert.java may fail on Ubuntu 18.04 with the default NSS version in the system. To run these tests correctly, the system property <code>test.nss.lib.paths</code> is required on Ubuntu 18.04 to specify the alternative NSS lib directories.</p>
+<p>For example:</p>
+<pre><code>$ make test TEST="jtreg:sun/security/pkcs11/Secmod/AddTrustedCert.java" \
+    JTREG="JAVA_OPTIONS=-Dtest.nss.lib.paths=/path/to/your/latest/NSS-libs"</code></pre>
 <p>For more notes about the PKCS11 tests, please refer to test/jdk/sun/security/pkcs11/README.</p>
 <h3 id="client-ui-tests">Client UI Tests</h3>
 <p>Some Client UI tests use key sequences which may be reserved by the operating system. Usually that causes the test failure. So it is highly recommended to disable system key shortcuts prior testing. The steps to access and disable system key shortcuts for various platforms are provided below.</p>
158 doc/testing.md
@@ -37,11 +37,11 @@ Note that this option should point to the JTReg home, i.e. the top directory,
 containing `lib/jtreg.jar` etc. (An alternative is to set the `JT_HOME`
 environment variable to point to the JTReg home before running `configure`.)
 
-To be able to run microbenchmarks, `configure` needs to know where to find
-the JMH dependency. Use `--with-jmh=<path to JMH jars>` to point to a directory
-containing the core JMH and transitive dependencies. The recommended dependencies
-can be retrieved by running `sh make/devkit/createJMHBundle.sh`, after which
-`--with-jmh=build/jmh/jars` should work.
+To be able to run microbenchmarks, `configure` needs to know where to find the
+JMH dependency. Use `--with-jmh=<path to JMH jars>` to point to a directory
+containing the core JMH and transitive dependencies. The recommended
+dependencies can be retrieved by running `sh make/devkit/createJMHBundle.sh`,
+after which `--with-jmh=build/jmh/jars` should work.
 
 ## Test selection
 
@@ -182,10 +182,10 @@ variables.
 These variables use a keyword=value approach to allow multiple values to be
 set. So, for instance, `JTREG="JOBS=1;TIMEOUT_FACTOR=8"` will set the JTReg
 concurrency level to 1 and the timeout factor to 8. This is equivalent to
-setting `JTREG_JOBS=1 JTREG_TIMEOUT_FACTOR=8`, but using the keyword format means that
-the `JTREG` variable is parsed and verified for correctness, so
-`JTREG="TMIEOUT_FACTOR=8"` would give an error, while `JTREG_TMIEOUT_FACTOR=8` would just
-pass unnoticed.
+setting `JTREG_JOBS=1 JTREG_TIMEOUT_FACTOR=8`, but using the keyword format
+means that the `JTREG` variable is parsed and verified for correctness, so
+`JTREG="TMIEOUT_FACTOR=8"` would give an error, while `JTREG_TMIEOUT_FACTOR=8`
+would just pass unnoticed.
 
 To separate multiple keyword=value pairs, use `;` (semicolon). Since the shell
 normally eats `;`, the recommended usage is to write the assignment inside
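The keyword=value expansion described in the hunk above can be sketched in plain shell. This is an illustration only, not the build system's actual make code; the option string is a made-up example.

```shell
# Illustrative sketch: how a keyword=value control variable such as
# JTREG="JOBS=1;TIMEOUT_FACTOR=8" corresponds to the equivalent JTREG_*
# variables. Split on ';' and prefix each key with JTREG_.
opts="JOBS=1;TIMEOUT_FACTOR=8"
result=""
old_ifs=$IFS
IFS=';'
for pair in $opts; do
  key=${pair%%=*}   # text before the first '='
  val=${pair#*=}    # text after the first '='
  result="${result}${result:+ }JTREG_${key}=${val}"
done
IFS=$old_ifs
echo "$result"
```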
@@ -203,9 +203,10 @@ test suites.
 
 ### General keywords (TEST_OPTS)
 
-Some keywords are valid across different test suites. If you want to run
-tests from multiple test suites, or just don't want to care which test suite specific
-control variable to use, then you can use the general TEST_OPTS control variable.
+Some keywords are valid across different test suites. If you want to run tests
+from multiple test suites, or just don't want to care which test suite specific
+control variable to use, then you can use the general TEST_OPTS control
+variable.
 
 There are also some keywords that applies globally to the test runner system,
 not to any specific test suites. These are also available as TEST_OPTS keywords.
@@ -252,12 +253,13 @@ for only recently changed code. JCOV_DIFF_CHANGESET specifies a source
 revision. A textual report will be generated showing coverage of the diff
 between the specified revision and the repository tip.
 
-The report is stored in `build/$BUILD/test-results/jcov-output/diff_coverage_report`
-file.
+The report is stored in
+`build/$BUILD/test-results/jcov-output/diff_coverage_report` file.
 
 ### JTReg keywords
 
 #### JOBS
 
 The test concurrency (`-concurrency`).
 
 Defaults to TEST_JOBS (if set by `--with-test-jobs=`), otherwise it defaults to
@@ -265,32 +267,43 @@ JOBS, except for Hotspot, where the default is *number of CPU cores/2*,
 but never more than *memory size in GB/2*.
 
 #### TIMEOUT_FACTOR
 
 The timeout factor (`-timeoutFactor`).
 
 Defaults to 4.
 
+#### FAILURE_HANDLER_TIMEOUT
+
+Sets the argument `-timeoutHandlerTimeout` for JTReg. The default value is 0.
+This is only valid if the failure handler is built.
+
 #### TEST_MODE
 
 The test mode (`agentvm` or `othervm`).
 
 Defaults to `agentvm`.
 
 #### ASSERT
 
 Enable asserts (`-ea -esa`, or none).
 
 Set to `true` or `false`. If true, adds `-ea -esa`. Defaults to true, except
 for hotspot.
 
 #### VERBOSE
 
 The verbosity level (`-verbose`).
 
 Defaults to `fail,error,summary`.
 
 #### RETAIN
 
 What test data to retain (`-retain`).
 
 Defaults to `fail,error`.
 
 #### MAX_MEM
 
 Limit memory consumption (`-Xmx` and `-vmoption:-Xmx`, or none).
 
 Limit memory consumption for JTReg test framework and VM under test. Set to 0
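The JOBS default mentioned in the hunk header above (half the CPU cores, but never more than half the memory size in GB) can be sketched as follows. This is illustrative shell with made-up machine values, not the actual make logic.

```shell
# Illustrative sketch (assumed values): compute a JOBS default of
# cores/2, capped at memory-in-GB/2, as described for Hotspot.
cores=16
mem_gb=8
jobs=$((cores / 2))
cap=$((mem_gb / 2))
if [ "$jobs" -gt "$cap" ]; then
  jobs=$cap   # never exceed the memory-based cap
fi
echo "$jobs"
```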
@@ -298,9 +311,14 @@ to disable the limits.
 
 Defaults to 512m, except for hotspot, where it defaults to 0 (no limit).
 
+#### MAX_OUTPUT
+
+Set the property `javatest.maxOutputSize` for the launcher, to change the
+default JTReg log limit.
+
 #### KEYWORDS
 
-JTReg kewords sent to JTReg using `-k`. Please be careful in making sure that
+JTReg keywords sent to JTReg using `-k`. Please be careful in making sure that
 spaces and special characters (like `!`) are properly quoted. To avoid some
 issues, the special value `%20` can be used instead of space.
 
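The `%20` convention documented above can be sketched in shell. This is an illustration with a hypothetical keyword expression; the actual substitution happens inside the build system's make code.

```shell
# Illustrative sketch: expand the special %20 value back into spaces,
# as described for KEYWORDS (and EXTRA_PROBLEM_LISTS / AOT_MODULES).
# 'headful%20&%20!ignore' is a made-up keyword expression.
keywords='headful%20&%20!ignore'
expanded=$(printf '%s' "$keywords" | sed 's/%20/ /g')
echo "$expanded"
```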
@@ -323,23 +341,30 @@ Set to `true` or `false`.
 If `true`, JTReg will use `-match:` option, otherwise `-exclude:` will be used.
 Default is `false`.
 
 
 #### OPTIONS
 
 Additional options to the JTReg test framework.
 
 Use `JTREG="OPTIONS=--help all"` to see all available JTReg options.
 
 #### JAVA_OPTIONS
 
 Additional Java options for running test classes (sent to JTReg as
 `-javaoption`).
 
 #### VM_OPTIONS
 
 Additional Java options to be used when compiling and running classes (sent to
 JTReg as `-vmoption`).
 
 This option is only needed in special circumstances. To pass Java options to
 your test classes, use `JAVA_OPTIONS`.
 
+#### LAUNCHER_OPTIONS
+
+Additional Java options that are sent to the java launcher that starts the
+JTReg harness.
+
 #### AOT_MODULES
 
 Generate AOT modules before testing for the specified module, or set of
@@ -353,6 +378,7 @@ Retry failed tests up to a set number of times. Defaults to 0.
 ### Gtest keywords
 
 #### REPEAT
 
 The number of times to repeat the tests (`--gtest_repeat`).
 
 Default is 1. Set to -1 to repeat indefinitely. This can be especially useful
@@ -360,6 +386,7 @@ combined with `OPTIONS=--gtest_break_on_failure` to reproduce an intermittent
 problem.
 
 #### OPTIONS
 
 Additional options to the Gtest test framework.
 
 Use `GTEST="OPTIONS=--help"` to see all available Gtest options.
@@ -373,98 +400,127 @@ modules. If multiple modules are specified, they should be separated by space
 ### Microbenchmark keywords
 
 #### FORK
 
 Override the number of benchmark forks to spawn. Same as specifying `-f <num>`.
 
 #### ITER
 
 Number of measurement iterations per fork. Same as specifying `-i <num>`.
 
 #### TIME
 
 Amount of time to spend in each measurement iteration, in seconds. Same as
 specifying `-r <num>`
 
 #### WARMUP_ITER
 
 Number of warmup iterations to run before the measurement phase in each fork.
 Same as specifying `-wi <num>`.
 
 #### WARMUP_TIME
 
 Amount of time to spend in each warmup iteration. Same as specifying `-w <num>`.
 
 #### RESULTS_FORMAT
 
 Specify to have the test run save a log of the values. Accepts the same values
 as `-rff`, i.e., `text`, `csv`, `scsv`, `json`, or `latex`.
 
 #### VM_OPTIONS
 
 Additional VM arguments to provide to forked off VMs. Same as `-jvmArgs <args>`
 
 #### OPTIONS
 
 Additional arguments to send to JMH.
 
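The MICRO keywords above each map to a JMH flag; that mapping can be sketched in shell. Illustration only with made-up values, not the build system's actual make logic.

```shell
# Illustrative sketch: translate MICRO keywords into the JMH flags they
# correspond to (FORK -> -f, ITER -> -i, TIME -> -r). Values are examples.
MICRO_FORK=2
MICRO_ITER=5
MICRO_TIME=10
jmh_args=""
[ -n "$MICRO_FORK" ] && jmh_args="$jmh_args -f $MICRO_FORK"
[ -n "$MICRO_ITER" ] && jmh_args="$jmh_args -i $MICRO_ITER"
[ -n "$MICRO_TIME" ] && jmh_args="$jmh_args -r $MICRO_TIME"
echo "$jmh_args"
```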
 ## Notes for Specific Tests
 
 ### Docker Tests
 
-Docker tests with default parameters may fail on systems with glibc versions not
-compatible with the one used in the default docker image (e.g., Oracle Linux 7.6 for x86).
-For example, they pass on Ubuntu 16.04 but fail on Ubuntu 18.04 if run like this on x86:
+Docker tests with default parameters may fail on systems with glibc versions
+not compatible with the one used in the default docker image (e.g., Oracle
+Linux 7.6 for x86). For example, they pass on Ubuntu 16.04 but fail on Ubuntu
+18.04 if run like this on x86:
 
-    $ make test TEST="jtreg:test/hotspot/jtreg/containers/docker"
+```
+$ make test TEST="jtreg:test/hotspot/jtreg/containers/docker"
+```
 
-To run these tests correctly, additional parameters for the correct docker image are
-required on Ubuntu 18.04 by using `JAVA_OPTIONS`.
+To run these tests correctly, additional parameters for the correct docker
+image are required on Ubuntu 18.04 by using `JAVA_OPTIONS`.
 
-    $ make test TEST="jtreg:test/hotspot/jtreg/containers/docker" JTREG="JAVA_OPTIONS=-Djdk.test.docker.image.name=ubuntu -Djdk.test.docker.image.version=latest"
+```
+$ make test TEST="jtreg:test/hotspot/jtreg/containers/docker" \
+    JTREG="JAVA_OPTIONS=-Djdk.test.docker.image.name=ubuntu
+    -Djdk.test.docker.image.version=latest"
+```
 
 ### Non-US locale
 
-If your locale is non-US, some tests are likely to fail. To work around this you can
-set the locale to US. On Unix platforms simply setting `LANG="en_US"` in the
-environment before running tests should work. On Windows, setting
-`JTREG="VM_OPTIONS=-Duser.language=en -Duser.country=US"` helps for most, but not all test cases.
+If your locale is non-US, some tests are likely to fail. To work around this
+you can set the locale to US. On Unix platforms simply setting `LANG="en_US"`
+in the environment before running tests should work. On Windows, setting
+`JTREG="VM_OPTIONS=-Duser.language=en -Duser.country=US"` helps for most, but
+not all test cases.
 
 For example:
 
-    $ export LANG="en_US" && make test TEST=...
-    $ make test JTREG="VM_OPTIONS=-Duser.language=en -Duser.country=US" TEST=...
+```
+$ export LANG="en_US" && make test TEST=...
+$ make test JTREG="VM_OPTIONS=-Duser.language=en -Duser.country=US" TEST=...
+```
 
 ### PKCS11 Tests
 
-It is highly recommended to use the latest NSS version when running PKCS11 tests.
-Improper NSS version may lead to unexpected failures which are hard to diagnose.
-For example, sun/security/pkcs11/Secmod/AddTrustedCert.java may fail on Ubuntu
-18.04 with the default NSS version in the system.
-To run these tests correctly, the system property `test.nss.lib.paths` is required
-on Ubuntu 18.04 to specify the alternative NSS lib directories.
+It is highly recommended to use the latest NSS version when running PKCS11
+tests. Improper NSS version may lead to unexpected failures which are hard to
+diagnose. For example, sun/security/pkcs11/Secmod/AddTrustedCert.java may fail
+on Ubuntu 18.04 with the default NSS version in the system. To run these tests
+correctly, the system property `test.nss.lib.paths` is required on Ubuntu 18.04
+to specify the alternative NSS lib directories.
 
 For example:
 
-    $ make test TEST="jtreg:sun/security/pkcs11/Secmod/AddTrustedCert.java" JTREG="JAVA_OPTIONS=-Dtest.nss.lib.paths=/path/to/your/latest/NSS-libs"
+```
+$ make test TEST="jtreg:sun/security/pkcs11/Secmod/AddTrustedCert.java" \
+    JTREG="JAVA_OPTIONS=-Dtest.nss.lib.paths=/path/to/your/latest/NSS-libs"
+```
 
-For more notes about the PKCS11 tests, please refer to test/jdk/sun/security/pkcs11/README.
+For more notes about the PKCS11 tests, please refer to
+test/jdk/sun/security/pkcs11/README.
 
 ### Client UI Tests
 
 Some Client UI tests use key sequences which may be reserved by the operating
-system. Usually that causes the test failure. So it is highly recommended to disable
-system key shortcuts prior testing. The steps to access and disable system key shortcuts
-for various platforms are provided below.
+system. Usually that causes the test failure. So it is highly recommended to
+disable system key shortcuts prior testing. The steps to access and disable
+system key shortcuts for various platforms are provided below.
 
 #### MacOS
 
 Choose Apple menu; System Preferences, click Keyboard, then click Shortcuts;
 select or deselect desired shortcut.
 
-For example, test/jdk/javax/swing/TooltipManager/JMenuItemToolTipKeyBindingsTest/JMenuItemToolTipKeyBindingsTest.java fails
-on MacOS because it uses `CTRL + F1` key sequence to show or hide tooltip message
-but the key combination is reserved by the operating system. To run the test correctly
-the default global key shortcut should be disabled using the steps described above, and then deselect
-"Turn keyboard access on or off" option which is responsible for `CTRL + F1` combination.
+For example,
+test/jdk/javax/swing/TooltipManager/JMenuItemToolTipKeyBindingsTest/JMenuItemToolTipKeyBindingsTest.java
+fails on MacOS because it uses `CTRL + F1` key sequence to show or hide tooltip
+message but the key combination is reserved by the operating system. To run the
+test correctly the default global key shortcut should be disabled using the
+steps described above, and then deselect "Turn keyboard access on or off"
+option which is responsible for `CTRL + F1` combination.
 
 #### Linux
-Open the Activities overview and start typing Settings; Choose Settings, click Devices,
-then click Keyboard; set or override desired shortcut.
+
+Open the Activities overview and start typing Settings; Choose Settings, click
+Devices, then click Keyboard; set or override desired shortcut.
 
 #### Windows
-Type `gpedit` in the Search and then click Edit group policy; navigate to
-User Configuration -> Administrative Templates -> Windows Components -> File Explorer;
-in the right-side pane look for "Turn off Windows key hotkeys" and double click on it;
-enable or disable hotkeys.
+
+Type `gpedit` in the Search and then click Edit group policy; navigate to User
+Configuration -> Administrative Templates -> Windows Components -> File
+Explorer; in the right-side pane look for "Turn off Windows key hotkeys" and
+double click on it; enable or disable hotkeys.
 
 Note: restart is required to make the settings take effect.
 
@@ -101,9 +101,9 @@ help:
 $(info $(_) # method is 'auto', 'ignore' or 'fail' (default))
 $(info $(_) TEST="test1 ..." # Use the given test descriptor(s) for testing, e.g.)
 $(info $(_) # make test TEST="jdk_lang gtest:all")
-$(info $(_) JTREG="OPT1=x;OPT2=y" # Control the JTREG test harness)
-$(info $(_) GTEST="OPT1=x;OPT2=y" # Control the GTEST test harness)
-$(info $(_) MICRO="OPT1=x;OPT2=y" # Control the MICRO test harness)
+$(info $(_) JTREG="OPT1=x;OPT2=y" # Control the JTREG test harness, use 'help' to list)
+$(info $(_) GTEST="OPT1=x;OPT2=y" # Control the GTEST test harness, use 'help' to list)
+$(info $(_) MICRO="OPT1=x;OPT2=y" # Control the MICRO test harness, use 'help' to list)
 $(info $(_) TEST_OPTS="OPT1=x;..." # Generic control of all test harnesses)
 $(info $(_) TEST_VM_OPTS="ARG ..." # Same as setting TEST_OPTS to VM_OPTIONS="ARG ...")
 $(info )
@@ -143,9 +143,6 @@ endif
 # Optionally create AOT libraries for specified modules before running tests.
 # Note, this could not be done during JDK build time.
 ################################################################################
-
-# Note, this could not be done during JDK build time.
-
 # Parameter 1 is the name of the rule.
 #
 # Remaining parameters are named arguments.
@@ -198,6 +195,10 @@ define SetupAotModuleBody
   $1_AOT_TARGETS += $$($1_AOT_LIB)
 endef
 
+################################################################################
+# Optionally create AOT libraries before running tests.
+# Note, this could not be done during JDK build time.
+################################################################################
 # Parameter 1 is the name of the rule.
 #
 # Remaining parameters are named arguments.
@@ -291,9 +292,9 @@ $(eval $(call SetTestOpt,FAILURE_HANDLER_TIMEOUT,JTREG))
 $(eval $(call ParseKeywordVariable, JTREG, \
     SINGLE_KEYWORDS := JOBS TIMEOUT_FACTOR FAILURE_HANDLER_TIMEOUT \
         TEST_MODE ASSERT VERBOSE RETAIN MAX_MEM RUN_PROBLEM_LISTS \
-        RETRY_COUNT, \
+        RETRY_COUNT MAX_OUTPUT, \
     STRING_KEYWORDS := OPTIONS JAVA_OPTIONS VM_OPTIONS KEYWORDS \
-        EXTRA_PROBLEM_LISTS AOT_MODULES, \
+        EXTRA_PROBLEM_LISTS AOT_MODULES LAUNCHER_OPTIONS, \
 ))
 
 ifneq ($(JTREG), )
@@ -844,6 +845,14 @@ define SetupRunJtregTestBody
   JTREG_RUN_PROBLEM_LISTS ?= false
   JTREG_RETRY_COUNT ?= 0
 
+  ifneq ($$(JTREG_LAUNCHER_OPTIONS), )
+    $1_JTREG_LAUNCHER_OPTIONS += $$(JTREG_LAUNCHER_OPTIONS)
+  endif
+
+  ifneq ($$(JTREG_MAX_OUTPUT), )
+    $1_JTREG_LAUNCHER_OPTIONS += -Djavatest.maxOutputSize=$$(JTREG_MAX_OUTPUT)
+  endif
+
   ifneq ($$($1_JTREG_MAX_MEM), 0)
     $1_JTREG_BASIC_OPTIONS += -vmoption:-Xmx$$($1_JTREG_MAX_MEM)
     $1_JTREG_LAUNCHER_OPTIONS += -Xmx$$($1_JTREG_MAX_MEM)
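The conditional appends added in the hunk above can be mirrored in shell to show the resulting launcher command line. Illustration only, with hypothetical option values; the real logic lives in the make code.

```shell
# Illustrative shell equivalent (assumed values) of the conditional appends:
# user-supplied LAUNCHER_OPTIONS and MAX_OUTPUT both end up as options for
# the java launcher that starts the JTReg harness.
JTREG_LAUNCHER_OPTIONS="-Xss10m"     # hypothetical example value
JTREG_MAX_OUTPUT="2000000"           # hypothetical example value
launcher_options=""
if [ -n "$JTREG_LAUNCHER_OPTIONS" ]; then
  launcher_options="$launcher_options $JTREG_LAUNCHER_OPTIONS"
fi
if [ -n "$JTREG_MAX_OUTPUT" ]; then
  launcher_options="$launcher_options -Djavatest.maxOutputSize=$JTREG_MAX_OUTPUT"
fi
echo "$launcher_options"
```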
|
@ -220,6 +220,10 @@ define ParseKeywordVariableBody
|
|||||||
$$(eval mangled_part_eval := $$(call DoubleDollar, $$(mangled_part))) \
|
$$(eval mangled_part_eval := $$(call DoubleDollar, $$(mangled_part))) \
|
||||||
$$(eval part := $$$$(subst ||||,$$$$(SPACE),$$$$(mangled_part_eval))) \
|
$$(eval part := $$$$(subst ||||,$$$$(SPACE),$$$$(mangled_part_eval))) \
|
||||||
$$(eval $1_NO_MATCH := true) \
|
$$(eval $1_NO_MATCH := true) \
|
||||||
|
$$(if $$(filter help, $$(part)), \
|
||||||
|
$$(info Valid keywords for $1:) \
|
||||||
|
$$(info $$($1_SINGLE_KEYWORDS) $$($1_STRING_KEYWORDS).) \
|
||||||
|
$$(error Re-run without 'help' to continue)) \
|
||||||
$$(foreach keyword, $$($1_SINGLE_KEYWORDS), \
|
$$(foreach keyword, $$($1_SINGLE_KEYWORDS), \
|
||||||
$$(eval keyword_eval := $$(call DoubleDollar, $$(keyword))) \
|
$$(eval keyword_eval := $$(call DoubleDollar, $$(keyword))) \
|
||||||
$$(if $$(filter $$(keyword)=%, $$(part)), \
|
$$(if $$(filter $$(keyword)=%, $$(part)), \
|
||||||
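The `help` handling added in the hunk above can be sketched in shell: if any part of the control variable is the literal word `help`, the valid keywords are listed instead of running tests. Illustration only; the keyword lists here are abbreviated examples, not the full sets.

```shell
# Illustrative sketch of the 'help' keyword: scan the parts of a control
# variable and, on the literal word 'help', report the valid keywords.
single_keywords="JOBS TIMEOUT_FACTOR MAX_OUTPUT"   # abbreviated example list
string_keywords="OPTIONS LAUNCHER_OPTIONS"         # abbreviated example list
input="help"
result=""
for part in $input; do
  if [ "$part" = "help" ]; then
    result="Valid keywords: $single_keywords $string_keywords"
  fi
done
echo "$result"
```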