Compare commits

..

33 Commits

Author SHA1 Message Date
Ludovic Henry
6af643e5a1 8248657: Windows: strengthening in ThreadCritical regarding memory model
Reviewed-by: dholmes, kbarrett, aph, stuefe
2020-07-29 10:38:28 +02:00
Aleksey Shipilev
3349e10b7f 8250612: jvmciCompilerToVM.cpp declares jio_printf with "void" return type, should be "int"
Reviewed-by: thartmann, kvn
2020-07-29 09:48:08 +02:00
Andrei Pangin
a72a8984a9 8249719: MethodHandle performance suffers from bad ResolvedMethodTable hash function
Reviewed-by: simonis, stuefe, coleenp
2020-07-24 15:33:38 +03:00
Mikael Vidstedt
25d1305f7e Merge
2020-07-28 22:37:23 -07:00
Chris Plummer
76baa501fa 8250742: ProblemList serviceability/sa/ClhsdbPstack.java #id0 and #id1 for ZGC
Reviewed-by: sspitsyn
2020-07-28 16:41:07 -07:00
Joe Wang
64d130efc4 8249643: Clarify DOM documentation
Reviewed-by: lancea
2020-07-28 23:29:33 +00:00
Joe Darcy
77a10a182f 8250580: Address reliance on default constructors in java.rmi
Reviewed-by: smarks
2020-07-28 16:26:28 -07:00
Igor Ignatyev
0b42b1cf15 8250739: remove Compile::Generate_*_Graph methods declarations
Reviewed-by: kvn
2020-07-28 15:31:10 -07:00
Igor Ignatyev
f4301530b4 8250738: C2Compiler::is_intrinsic_supported(methodHandle&, bool) shouldn't be virtual
Reviewed-by: xliu, kvn
2020-07-28 15:31:09 -07:00
Harold Seigel
99ae9558fe 8250562: Clean up weird comment in vmTestbase class Terminator.java
Delete the weird comment.

Reviewed-by: lfoltan
2020-07-28 20:14:01 +00:00
Chris Plummer
816a7060ba 8248882: SA PMap and PStack support on OSX works with core files. Enable them
Reviewed-by: sspitsyn, amenkov
2020-07-28 12:04:58 -07:00
Chris Plummer
ab729d7075 8247515: OSX pc_to_symbol() lookup does not work with core files
Reviewed-by: sspitsyn, kevinw
2020-07-28 09:52:07 -07:00
Joe Darcy
1a5ef6606f 8249219: Update --release 15 symbol information for JDK 15 build 33
Reviewed-by: jlahoda
2020-07-28 09:25:23 -07:00
Joe Darcy
8a9675663f 8250640: Address reliance on default constructors in jdk.jdi
Reviewed-by: alanb
2020-07-28 09:21:04 -07:00
Christian Hagedorn
31368cd1f0 8249602: C2: assert(cnt == _outcnt) failed: no insertions allowed
Use DUIterator instead of DUIterator_Fast due to legit insertions.

Reviewed-by: kvn, thartmann
2020-07-28 16:05:30 +02:00
Coleen Phillimore
aff80ee900 8250589: Move Universe::_reference_pending_list into OopHandle
Use synchronization to reference the _reference_pending_list with OopHandle

Reviewed-by: shade, kbarrett, dholmes, eosterlund
2020-07-28 08:10:43 -04:00
Coleen Phillimore
42ac8e1856 8250042: Clean up methodOop and method_oop names from the code
Reviewed-by: dholmes, sspitsyn, cjplummer, chagedorn
2020-07-28 07:33:51 -04:00
Nick Gasson
0ebcf5c59d 8237483: AArch64 C1 OopMap inserted twice fatal error
Reviewed-by: aph
2020-07-28 16:50:32 +08:00
Aleksey Shipilev
5b99c6ae1f 8250605: Linux x86_32 builds fail after JDK-8249821
Reviewed-by: erikj, prr
2020-07-28 09:05:36 +02:00
Mikael Vidstedt
e7289aa4d4 Merge
2020-07-27 22:26:00 -07:00
Kim Barrett
188ad9714d 8247976: Update HotSpot Style Guide for C++14 adoption
Update and move style guide from wiki to jdk repo.

Reviewed-by: jrose, stefank, dholmes, mikael, stuefe, kvn
2020-07-27 22:19:33 -04:00
Jamil Nimeh
2aa291ad2c 8247630: Use two key share entries
Reviewed-by: xuelei
2020-07-27 18:20:57 -07:00
Doug Simon
f2e69156c8 8250556: revert JVMCI part of JDK-8230395
Reviewed-by: never, dholmes
2020-07-27 22:59:27 +02:00
Daniil Titov
277ec3d260 8216324: GetClassMethods is confused by the presence of default methods in super interfaces
Reviewed-by: sspitsyn, amenkov
2020-07-27 11:34:19 -07:00
Joe Darcy
ed7f796494 8250213: Address use of default constructors in com.sun.source.util
Reviewed-by: jjg
2020-07-27 11:07:30 -07:00
Patric Hedlin
761a92d7c9 8247766: [aarch64] guarantee(val < (1U << nbits)) failed: Field too big for insn
Reviewed-by: neliasso, aph
2020-07-27 10:56:51 +02:00
Martin Balao
31753ef9bf 8250582: Revert Principal Name type to NT-UNKNOWN when requesting TGS Kerberos tickets
Reviewed-by: weijun
2020-07-25 01:02:51 -03:00
Vicente Romero
6c2ff1781b 8249829: javac is issuing an incorrect static access error
Reviewed-by: jlahoda
2020-07-27 10:12:30 -04:00
Albert Yang
af8c3b4a7e 8242036: G1 HeapRegionRemSet::_n_coarse_entries could be a bool
Reviewed-by: kbarrett, eosterlund, tschatzl, lkorinth
2020-07-27 12:59:32 +02:00
Christian Hagedorn
417e8e449d 8248552: C2 crashes with SIGFPE due to division by zero
Bail out in PhaseIdealLoop:split_thru_phi when trying to split a Div or ModNode iv phi whose zero check was removed but could potentially still be zero based on type information.

Reviewed-by: kvn, thartmann
2020-07-27 11:03:17 +02:00
Yasumasa Suenaga
f84b5d2f80 8248362: JVMTI frame operations should use Thread-Local Handshake
Reviewed-by: sspitsyn, dholmes, dcubed
2020-07-27 15:49:53 +09:00
David Holmes
3dba35d248 8247296: Optimize JVM_GetDeclaringClass
Co-authored-by: Christoph Dreis <christoph.dreis@freenet.de>
Reviewed-by: shade, minqi
2020-07-26 20:29:42 -04:00
Ioi Lam
112bbcb396 8249087: Always initialize _body[0..1] in Symbol constructor
Reviewed-by: dholmes, lfoltan
2020-07-24 13:56:45 -07:00
23511 changed files with 953120 additions and 1645303 deletions

.gitattributes (1 line changed)

@@ -1 +0,0 @@
* -text

File diff suppressed because it is too large.

.gitignore (6 lines changed)

@@ -2,7 +2,6 @@
/dist/
/.idea/
/.vscode/
/nbproject/
nbproject/private/
/webrev
/.src-rev
@@ -15,8 +14,3 @@ test/nashorn/lib
NashornProfile.txt
**/JTreport/**
**/JTwork/**
/src/utils/LogCompilation/target/
/.project/
/.settings/
*.class
.idea/workspace.xml

.hgignore (new file, 18 lines)

@@ -0,0 +1,18 @@
^build/
^dist/
^.idea/
^.vscode/
nbproject/private/
^webrev
^.src-rev$
^.jib/
(^|/)\.DS_Store
(^|/)\.metadata/
(^|/)\.recommenders/
test/nashorn/script/external
test/nashorn/lib
NashornProfile.txt
(^|/)JTreport/
(^|/)JTwork/
(^|/)\.git/
^src/utils/hsdis/build/

.hgtags (10 lines changed)

@@ -652,13 +652,3 @@ a32f58c6b8be81877411767de7ba9c4cf087c1b5 jdk-15+31
4a8fd81d64bafa523cddb45f82805536edace106 jdk-16+6
6b65f4e7a975628df51ef755b02642075390041d jdk-15+33
c3a4a7ea7c304cabdacdc31741eb94c51351668d jdk-16+7
b0817631d2f4395508cb10e81c3858a94d9ae4de jdk-15+34
0a73d6f3aab48ff6d7e61e47f0bc2d87a054f217 jdk-16+8
fd60c3146a024037cdd9be34c645bb793995a7cc jdk-15+35
c075a286cc7df767cce28e8057d6ec5051786490 jdk-16+9
b01985b4f88f554f97901e53e1ba314681dd9c19 jdk-16+10
e3f940bd3c8fcdf4ca704c6eb1ac745d155859d5 jdk-15+36
5c18d696c7ce724ca36df13933aa53f50e12b9e0 jdk-16+11
fc8e62b399bd93d06e8d13dc3b384c450e853dcd jdk-16+12
fd07cdb26fc70243ef23d688b545514f4ddf1c2b jdk-16+13
36b29df125dc88f11657ce93b4998aa9ff5f5d41 jdk-16+14

.jcheck/conf

@@ -1,34 +1,2 @@
[general]
project=jdk
jbs=JDK
[checks]
error=author,committer,reviewers,merge,issues,executable,symlink,message,hg-tag,whitespace,problemlists
[repository]
tags=(?:jdk-(?:[1-9]([0-9]*)(?:\.(?:0|[1-9][0-9]*)){0,4})(?:\+(?:(?:[0-9]+))|(?:-ga)))|(?:jdk[4-9](?:u\d{1,3})?-(?:(?:b\d{2,3})|(?:ga)))|(?:hs\d\d(?:\.\d{1,2})?-b\d\d)
branches=
[census]
version=0
domain=openjdk.org
[checks "whitespace"]
files=.*\.cpp|.*\.hpp|.*\.c|.*\.h|.*\.java|.*\.cc|.*\.hh|.*\.m|.*\.mm|.*\.gmk|.*\.m4|.*\.ac|Makefile
ignore-tabs=.*\.gmk|Makefile
[checks "merge"]
message=Merge
[checks "reviewers"]
reviewers=1
ignore=duke
[checks "committer"]
role=committer
[checks "issues"]
pattern=^([124-8][0-9]{6}): (\S.*)$
[checks "problemlists"]
dirs=test/jdk|test/langtools|test/lib-test|test/hotspot/jtreg|test/jaxp
bugids=dup

CONTRIBUTING.md

@@ -1,3 +0,0 @@
# Contributing to the JDK
Please see <https://openjdk.java.net/contribute/> for how to contribute.

README (new file, 12 lines)

@@ -0,0 +1,12 @@
Welcome to the JDK!
===================
For build instructions, please see either of these files:
* doc/building.html (html version)
* doc/building.md (markdown version)
See https://openjdk.java.net/ for more information about
the OpenJDK Community and the JDK.

README.md (174 lines changed)

@@ -1,174 +0,0 @@
[![official JetBrains project](http://jb.gg/badges/official.svg)](https://confluence.jetbrains.com/display/ALL/JetBrains+on+GitHub)
# Welcome to JetBrains Runtime!
JetBrains Runtime is a fork of [OpenJDK](https://github.com/openjdk/jdk) available for Windows, Mac OS X, and Linux.
It includes a number of enhancements in font rendering, HiDPI support, ligatures, performance improvements, and bugfixes.
> **_NOTE_**: This is a **development** branch that is periodically synchronized with
> the [OpenJDK master](https://github.com/openjdk/jdk/tree/master) branch.
>
> Release builds are based on these branches:
> * [master](https://github.com/JetBrains/JetBrainsRuntime/tree/master) (JDK 11)
> * [master17](https://github.com/JetBrains/JetBrainsRuntime/tree/master17) (JDK 17)
## Contents
- [Welcome to JetBrains Runtime](#jetbrains-runtime)
- [Products Built on JetBrains Runtime](#products-built-on-jetbrains-runtime)
- [Getting Sources](#getting-sources)
- [macOS, Linux](#macos-linux)
- [Windows](#sources-windows)
- [Configuring the Build Environment](#configuring-the-build-environment)
- [Linux (Docker)](#linux-docker)
- [Ubuntu Linux](#ubuntu-linux)
- [Windows](#build-windows)
- [macOS](#macos)
- [Developing](#developing)
- [Contributing](#contributing)
- [Resources](#resources)
## Products Built on JetBrains Runtime
* [Android Studio](https://developer.android.com/studio). The official IDE for Google's Android operating system.
* [CLion](https://www.jetbrains.com/clion/). A cross-platform IDE for C and C++ from JetBrains.
* [DataGrip](https://www.jetbrains.com/datagrip/). The IDE for Databases and SQL from JetBrains.
* [GoLand](https://www.jetbrains.com/go/). The cross-platform Go IDE from JetBrains.
* [IntelliJ IDEA](https://www.jetbrains.com/idea/). The IDE for JVM from JetBrains.
* [JProfiler](https://www.ej-technologies.com/products/jprofiler/overview.html). The Java profiler.
* [PhpStorm](https://www.jetbrains.com/phpstorm/). The PHP IDE from JetBrains.
* [PyCharm](https://www.jetbrains.com/pycharm/). The Python IDE from JetBrains.
* [Rider](https://www.jetbrains.com/rider/). The cross-platform .NET IDE from JetBrains.
* [RubyMine](https://www.jetbrains.com/ruby/). The Ruby and Rails IDE from JetBrains.
* [WebStorm](https://www.jetbrains.com/webstorm/). The JavaScript IDE from JetBrains.
* [YourKit](https://www.yourkit.com/). Java and .NET profilers.
## Getting Sources
### macOS, Linux
```
git config --global core.autocrlf input
git clone git@github.com:JetBrains/JetBrainsRuntime.git
```
### Windows
<a name="sources-windows"></a>
```
git config --global core.autocrlf false
git clone git@github.com:JetBrains/JetBrainsRuntime.git
```
## Configuring the Build Environment
Here are quick per-platform instructions for those who can't wait to get started.
Please refer to [OpenJDK build docs](https://openjdk.java.net/groups/build/doc/building.html) for in-depth
coverage of all the details.
> **_TIP:_** To get a preliminary report of what's missing, run `./configure` and check its output.
> It usually gives meaningful advice on how to solve the problem.
### Linux (Docker)
Create a container:
```
$ cd jb/project/docker
$ docker build .
...
Successfully built 942ea9900054
```
Run these commands in the new container:
```
$ docker run -v `pwd`/../../../../:/JetBrainsRuntime -it 942ea9900054
# cd /JetBrainsRuntime
# sh ./configure
# make images CONF=linux-x86_64-normal-server-release
```
### Ubuntu Linux
Install the necessary tools, libraries, and headers with:
```
$ sudo apt-get install autoconf make build-essential libx11-dev libxext-dev libxrender-dev libxtst-dev \
libxt-dev libxrandr-dev libcups2-dev libfontconfig1-dev libasound2-dev
```
Get Java 17 (for instance, [Azul Zulu Builds of OpenJDK 17](https://www.azul.com/downloads/?version=java-17-ea&package=jdk)).
Then run the following:
```
$ cd JetBrainsRuntime
$ git checkout jbr-dev
$ sh ./configure
$ make images
```
This will build the release configuration under `./build/linux-x86_64-server-release/`.
### Windows
<a name="build-windows"></a>
Install the following:
* [Cygwin x64](http://www.cygwin.com/).
Required packages: `autoconf`, `binutils`, `cpio`, `diffutils`, `file`, `gawk`, `gcc-core`, `make`, `m4`, `unzip`, `zip`.
Install those together with Cygwin.
* [Visual Studio compiler toolset](https://visualstudio.microsoft.com/downloads/).
Install with the desktop development kit, which includes Windows SDK and compilers.
Visual Studio 2019 is supported by default.
* Java 17 (for instance, [Azul Zulu Builds of OpenJDK 17](https://www.azul.com/downloads/?version=java-17-ea&os=windows&architecture=x86-64-bit&package=jdk)).
If you have problems while configuring, read [Java tips on Cygwin](http://horstmann.com/articles/cygwin-tips.html).
From the command line:
```
"C:\Program Files (x86)\Microsoft Visual Studio\2019\Community\VC\Auxiliary\Build\vcvarsall.bat" amd64
"c:\Program_Files\cygwin64\bin\mintty.exe" /bin/bash -l
```
The first command sets up environment variables, the second starts a Cygwin shell with the proper environment.
In the Cygwin shell:
```
$ cd JetBrainsRuntime
$ git checkout jbr-dev
$ bash configure --with-toolchain-version=2019
$ make images
```
This will build the release configuration under `./build/windows-x86_64-server-release/`.
### macOS
Install the following:
* Xcode command line developer tools and `autoconf` via [Homebrew](https://brew.sh/).
* Java 17 (for instance, [Azul Zulu Builds of OpenJDK 17](https://www.azul.com/downloads/?version=java-17-ea&package=jdk)).
From the command line:
```
$ cd JetBrainsRuntime
$ git checkout jbr-dev
$ sh ./configure
$ make images
```
This will build the release configuration under `./build/macosx-x86_64-server-release/`.
## Developing
You can use [CLion](https://www.jetbrains.com/clion/) to develop native parts of the JetBrains Runtime and
[IntelliJ IDEA](https://www.jetbrains.com/idea/) for the parts written in Java.
Both require projects to be created.
### CLion
Run
```
$ make compile-commands
```
in the git root and open the resulting `build/.../compile_commands.json` file as a project.
Then use `Tools | Compilation Database | Change Project Root` to point to git root of this repository.
See also this detailed step-by-step tutorial for all platforms:
[How to develop OpenJDK with CLion](https://blog.jetbrains.com/clion/2020/03/openjdk-with-clion/).
### IDEA
Run
```
$ sh ./bin/idea.sh
```
in the git root to generate project files (add `--help` for options). If you have multiple
configurations (for example, `release` and `fastdebug`), supply the `--conf <conf_name>` argument.
Then open the git root directory as a project in IDEA.
## Contributing
We are happy to receive your pull requests!
Before you submit one, please sign our [Contributor License Agreement (CLA)](https://www.jetbrains.com/agreements/cla/).
## Resources
* [JetBrains Runtime on github](https://github.com/JetBrains/JetBrainsRuntime).
* [OpenJDK build instructions](https://openjdk.java.net/groups/build/doc/building.html).
* [OpenJDK test instructions](https://htmlpreview.github.io/?https://raw.githubusercontent.com/openjdk/jdk/master/doc/building.html#running-tests).

bin/idea.sh

@@ -25,26 +25,7 @@
# Shell script for generating an IDEA project from a given list of modules
usage() {
echo "Usage: $0 [-h|--help] [-q|--quiet] [-a|--absolute-paths] [-r|--root <path>] [-o|--output <path>] [-c|--conf <conf_name>] [modules...]"
echo " -h | --help"
echo " -q | --quiet
No stdout output"
echo " -a | --absolute-paths
Use absolute paths to this jdk, so that generated .idea
project files can be moved independently of jdk sources"
echo " -r | --root <path>
Project content root
Default: $TOPLEVEL_DIR"
echo " -o | --output <path>
Where .idea directory with project files will be generated
(e.g. using '-o .' will place project files in './.idea')
Default: same as --root"
echo " -c | --conf <conf_name>
make configuration (release, slowdebug etc)"
echo " [modules...]
Generate project modules for specific java modules
(e.g. 'java.base java.desktop')
Default: all existing modules (java.* and jdk.*)"
echo "usage: $0 [-h|--help] [-v|--verbose] [-o|--output <path>] [modules]+"
exit 1
}
@@ -52,14 +33,10 @@ SCRIPT_DIR=`dirname $0`
#assume TOP is the dir from which the script has been called
TOP=`pwd`
cd $SCRIPT_DIR; SCRIPT_DIR=`pwd`
if [ "x$TOPLEVEL_DIR" = "x" ] ; then
cd .. ; TOPLEVEL_DIR=`pwd`
fi
cd $TOP;
VERBOSE=true
ABSOLUTE_PATHS=false
CONF_ARG=
IDEA_OUTPUT=$TOP/.idea
VERBOSE="false"
while [ $# -gt 0 ]
do
case $1 in
@@ -67,26 +44,12 @@ do
usage
;;
-q | --quiet )
VERBOSE=false
;;
-a | --absolute-paths )
ABSOLUTE_PATHS=true
;;
-r | --root )
TOPLEVEL_DIR="$2"
shift
-v | --vebose )
VERBOSE="true"
;;
-o | --output )
IDEA_OUTPUT="$2/.idea"
shift
;;
-c | --conf )
CONF_ARG="CONF_NAME=$2"
IDEA_OUTPUT=$2/.idea
shift
;;
@@ -101,21 +64,20 @@ do
shift
done
if [ "x$IDEA_OUTPUT" = "x" ] ; then
IDEA_OUTPUT="$TOPLEVEL_DIR/.idea"
mkdir -p $IDEA_OUTPUT || exit 1
cd $IDEA_OUTPUT; IDEA_OUTPUT=`pwd`
if [ "x$TOPLEVEL_DIR" = "x" ] ; then
cd $SCRIPT_DIR/..
TOPLEVEL_DIR=`pwd`
cd $IDEA_OUTPUT
fi
mkdir -p $IDEA_OUTPUT || exit 1
cd "$TOP" ; cd $TOPLEVEL_DIR; TOPLEVEL_DIR=`pwd`
cd "$TOP" ; cd $IDEA_OUTPUT; IDEA_OUTPUT=`pwd`
cd ..; IDEA_OUTPUT_PARENT=`pwd`
cd "$SCRIPT_DIR/.." ; OPENJDK_DIR=`pwd`
IDEA_MAKE="$OPENJDK_DIR/make/ide/idea/jdk"
MAKE_DIR="$SCRIPT_DIR/../make"
IDEA_MAKE="$MAKE_DIR/ide/idea/jdk"
IDEA_TEMPLATE="$IDEA_MAKE/template"
cp -rn "$TOPLEVEL_DIR/jb/project/idea-project-files"/* "$IDEA_OUTPUT"
cp -rn "$IDEA_TEMPLATE"/* "$IDEA_OUTPUT"
cp -r "$IDEA_TEMPLATE"/* "$IDEA_OUTPUT"
#override template
if [ -d "$TEMPLATES_OVERRIDE" ] ; then
@@ -124,31 +86,31 @@ if [ -d "$TEMPLATES_OVERRIDE" ] ; then
done
fi
if [ "$VERBOSE" = true ] ; then
echo "Will generate IDEA project files in \"$IDEA_OUTPUT\" for project \"$TOPLEVEL_DIR\""
if [ "$VERBOSE" = "true" ] ; then
echo "output dir: $IDEA_OUTPUT"
echo "idea template dir: $IDEA_TEMPLATE"
fi
cd $TOP ; make -f "$IDEA_MAKE/idea.gmk" -I "$OPENJDK_DIR" idea TOPLEVEL_DIR="$TOPLEVEL_DIR" \
MAKEOVERRIDES= IDEA_OUTPUT_PARENT="$IDEA_OUTPUT_PARENT" OUT="$IDEA_OUTPUT/env.cfg" MODULES="$*" $CONF_ARG || exit 1
cd $TOP ; make -f "$IDEA_MAKE/idea.gmk" -I $MAKE_DIR/.. idea MAKEOVERRIDES= OUT=$IDEA_OUTPUT/env.cfg MODULES="$*" || exit 1
cd $SCRIPT_DIR
. $IDEA_OUTPUT/env.cfg
# Expect MODULES, MODULE_NAMES, RELATIVE_PROJECT_DIR, RELATIVE_BUILD_DIR to be set
if [ "xMODULES" = "x" ] ; then
echo "FATAL: MODULES is empty" >&2; exit 1
# Expect MODULE_ROOTS, MODULE_NAMES, BOOT_JDK & SPEC to be set
if [ "x$MODULE_ROOTS" = "x" ] ; then
echo "FATAL: MODULE_ROOTS is empty" >&2; exit 1
fi
if [ "x$MODULE_NAMES" = "x" ] ; then
echo "FATAL: MODULE_NAMES is empty" >&2; exit 1
fi
if [ "x$RELATIVE_PROJECT_DIR" = "x" ] ; then
echo "FATAL: RELATIVE_PROJECT_DIR is empty" >&2; exit 1
if [ "x$BOOT_JDK" = "x" ] ; then
echo "FATAL: BOOT_JDK is empty" >&2; exit 1
fi
if [ "x$RELATIVE_BUILD_DIR" = "x" ] ; then
echo "FATAL: RELATIVE_BUILD_DIR is empty" >&2; exit 1
if [ "x$SPEC" = "x" ] ; then
echo "FATAL: SPEC is empty" >&2; exit 1
fi
if [ -d "$TOPLEVEL_DIR/.hg" ] ; then
@@ -159,43 +121,6 @@ if [ -d "$TOPLEVEL_DIR/.git" ] ; then
VCS_TYPE="Git"
fi
if [ "$ABSOLUTE_PATHS" = true ] ; then
if [ "x$PATHTOOL" != "x" ]; then
PROJECT_DIR="`$PATHTOOL -am $OPENJDK_DIR`"
TOPLEVEL_PROJECT_DIR="`$PATHTOOL -am $TOPLEVEL_DIR`"
else
PROJECT_DIR="$OPENJDK_DIR"
TOPLEVEL_PROJECT_DIR="$TOPLEVEL_DIR"
fi
MODULE_DIR="$PROJECT_DIR"
TOPLEVEL_MODULE_DIR="$TOPLEVEL_PROJECT_DIR"
cd "$IDEA_OUTPUT_PARENT" && cd "$RELATIVE_BUILD_DIR" && BUILD_DIR="`pwd`"
CLION_SCRIPT_TOPDIR="$OPENJDK_DIR"
CLION_PROJECT_DIR="$PROJECT_DIR"
else
if [ "$RELATIVE_PROJECT_DIR" = "." ] ; then
PROJECT_DIR=""
else
PROJECT_DIR="/$RELATIVE_PROJECT_DIR"
fi
if [ "$RELATIVE_TOPLEVEL_PROJECT_DIR" = "." ] ; then
TOPLEVEL_PROJECT_DIR=""
else
TOPLEVEL_PROJECT_DIR="/$RELATIVE_TOPLEVEL_PROJECT_DIR"
fi
MODULE_DIR="\$MODULE_DIR\$$PROJECT_DIR"
PROJECT_DIR="\$PROJECT_DIR\$$PROJECT_DIR"
TOPLEVEL_MODULE_DIR="\$MODULE_DIR\$$TOPLEVEL_PROJECT_DIR"
TOPLEVEL_PROJECT_DIR="\$PROJECT_DIR\$$TOPLEVEL_PROJECT_DIR"
BUILD_DIR="\$PROJECT_DIR\$/$RELATIVE_BUILD_DIR"
CLION_SCRIPT_TOPDIR="$CLION_RELATIVE_PROJECT_DIR"
CLION_PROJECT_DIR="\$PROJECT_DIR\$/$CLION_SCRIPT_TOPDIR"
fi
if [ "$VERBOSE" = true ] ; then
echo "Project root: $PROJECT_DIR"
echo "Generating IDEA project files..."
fi
### Replace template variables
NUM_REPLACEMENTS=0
@@ -219,106 +144,68 @@ add_replacement() {
eval TO$NUM_REPLACEMENTS='$2'
}
add_replacement "###PATHTOOL###" "$PATHTOOL"
add_replacement "###CLION_SCRIPT_TOPDIR###" "$CLION_SCRIPT_TOPDIR"
add_replacement "###CLION_PROJECT_DIR###" "$CLION_PROJECT_DIR"
add_replacement "###PROJECT_DIR###" "$PROJECT_DIR"
add_replacement "###MODULE_DIR###" "$MODULE_DIR"
add_replacement "###TOPLEVEL_PROJECT_DIR###" "$TOPLEVEL_PROJECT_DIR"
add_replacement "###TOPLEVEL_MODULE_DIR###" "$TOPLEVEL_MODULE_DIR"
add_replacement "###MODULE_NAMES###" "$MODULE_NAMES"
add_replacement "###VCS_TYPE###" "$VCS_TYPE"
add_replacement "###BUILD_DIR###" "$BUILD_DIR"
add_replacement "###RELATIVE_BUILD_DIR###" "$RELATIVE_BUILD_DIR"
if [ "x$PATHTOOL" != "x" ]; then
add_replacement "###BASH_RUNNER_PREFIX###" "\$PROJECT_DIR\$/.idea/bash.bat"
SPEC_DIR=`dirname $SPEC`
if [ "x$CYGPATH" = "x" ]; then
add_replacement "###BUILD_DIR###" "$SPEC_DIR"
add_replacement "###JTREG_HOME###" "$JT_HOME"
add_replacement "###IMAGES_DIR###" "$SPEC_DIR/images/jdk"
add_replacement "###ROOT_DIR###" "$TOPLEVEL_DIR"
add_replacement "###IDEA_DIR###" "$IDEA_OUTPUT"
else
add_replacement "###BASH_RUNNER_PREFIX###" ""
fi
if [ "x$PATHTOOL" != "x" ]; then
add_replacement "###BUILD_DIR###" "`cygpath -am $SPEC_DIR`"
add_replacement "###IMAGES_DIR###" "`cygpath -am $SPEC_DIR`/images/jdk"
add_replacement "###ROOT_DIR###" "`cygpath -am $TOPLEVEL_DIR`"
add_replacement "###IDEA_DIR###" "`cygpath -am $IDEA_OUTPUT`"
if [ "x$JT_HOME" = "x" ]; then
add_replacement "###JTREG_HOME###" ""
else
add_replacement "###JTREG_HOME###" "`$PATHTOOL -am $JT_HOME`"
add_replacement "###JTREG_HOME###" "`cygpath -am $JT_HOME`"
fi
else
add_replacement "###JTREG_HOME###" "$JT_HOME"
fi
MODULE_IMLS=""
TEST_MODULE_DEPENDENCIES=""
for module in $MODULE_NAMES; do
MODULE_IMLS="$MODULE_IMLS<module fileurl=\"file://\$PROJECT_DIR$/.idea/$module.iml\" filepath=\"\$PROJECT_DIR$/.idea/$module.iml\" /> "
TEST_MODULE_DEPENDENCIES="$TEST_MODULE_DEPENDENCIES<orderEntry type=\"module\" module-name=\"$module\" scope=\"TEST\" /> "
SOURCE_PREFIX="<sourceFolder url=\"file://"
SOURCE_POSTFIX="\" isTestSource=\"false\" />"
for root in $MODULE_ROOTS; do
if [ "x$CYGPATH" != "x" ]; then
root=`cygpath -am $root`
fi
SOURCES=$SOURCES" $SOURCE_PREFIX""$root""$SOURCE_POSTFIX"
done
add_replacement "###MODULE_IMLS###" "$MODULE_IMLS"
add_replacement "###TEST_MODULE_DEPENDENCIES###" "$TEST_MODULE_DEPENDENCIES"
add_replacement "###SOURCE_ROOTS###" "$SOURCES"
replace_template_dir "$IDEA_OUTPUT"
### Generate module project files
### Compile the custom Logger
if [ "$VERBOSE" = true ] ; then
echo "Generating project modules:"
fi
(
DEFAULT_IFS="$IFS"
IFS='#'
for value in $MODULES; do
(
eval "$value"
if [ "$VERBOSE" = true ] ; then
echo " $module"
fi
MAIN_SOURCE_DIRS=""
CONTENT_ROOTS=""
IFS=' '
for dir in $moduleSrcDirs; do
case $dir in
"src/"*) MAIN_SOURCE_DIRS="$MAIN_SOURCE_DIRS <sourceFolder url=\"file://$MODULE_DIR/$dir\" isTestSource=\"false\" />" ;;
*"/support/gensrc/$module") ;; # Exclude generated sources to avoid module-info conflicts, see https://youtrack.jetbrains.com/issue/IDEA-185108
*) CONTENT_ROOTS="$CONTENT_ROOTS <content url=\"file://$MODULE_DIR/$dir\">\
<sourceFolder url=\"file://$MODULE_DIR/$dir\" isTestSource=\"false\" generated=\"true\" /></content>" ;;
esac
done
if [ "x$MAIN_SOURCE_DIRS" != "x" ] ; then
CONTENT_ROOTS="<content url=\"file://$MODULE_DIR/src/$module\">$MAIN_SOURCE_DIRS</content>$CONTENT_ROOTS"
fi
add_replacement "###MODULE_CONTENT_ROOTS###" "$CONTENT_ROOTS"
DEPENDENCIES=""
for dep in $moduleDependencies; do
case $MODULE_NAMES in # Exclude skipped modules from dependencies
*"$dep"*) DEPENDENCIES="$DEPENDENCIES<orderEntry type=\"module\" module-name=\"$dep\" /> "
esac
done
add_replacement "###DEPENDENCIES###" "$DEPENDENCIES"
cp "$IDEA_OUTPUT/module.iml" "$IDEA_OUTPUT/$module.iml"
IFS="$DEFAULT_IFS"
replace_template_file "$IDEA_OUTPUT/$module.iml"
)
done
)
rm "$IDEA_OUTPUT/module.iml"
CLASSES=$IDEA_OUTPUT/classes
### Create shell script runner for Windows
if [ "x$ANT_HOME" = "x" ] ; then
# try some common locations, before giving up
if [ -f "/usr/share/ant/lib/ant.jar" ] ; then
ANT_HOME="/usr/share/ant"
elif [ -f "/usr/local/Cellar/ant/1.9.4/libexec/lib/ant.jar" ] ; then
ANT_HOME="/usr/local/Cellar/ant/1.9.4/libexec"
else
echo "FATAL: cannot find ant. Try setting ANT_HOME." >&2; exit 1
fi
fi
CP=$ANT_HOME/lib/ant.jar
rm -rf $CLASSES; mkdir $CLASSES
if [ "x$PATHTOOL" != "x" ]; then
echo "@echo off" > "$IDEA_OUTPUT/bash.bat"
if [ "x$WSL_DISTRO_NAME" != "x" ] ; then
echo "wsl -d $WSL_DISTRO_NAME --cd \"%cd%\" -e %*" >> "$IDEA_OUTPUT/bash.bat"
else
echo "$WINENV_ROOT\bin\bash.exe -l -c \"cd %CD:\=/%/ && %*\"" >> "$IDEA_OUTPUT/bash.bat"
fi
if [ "x$CYGPATH" = "x" ] ; then ## CYGPATH may be set in env.cfg
JAVAC_SOURCE_FILE=$IDEA_OUTPUT/src/idea/IdeaLoggerWrapper.java
JAVAC_SOURCE_PATH=$IDEA_OUTPUT/src
JAVAC_CLASSES=$CLASSES
JAVAC_CP=$CP
else
JAVAC_SOURCE_FILE=`cygpath -am $IDEA_OUTPUT/src/idea/IdeaLoggerWrapper.java`
JAVAC_SOURCE_PATH=`cygpath -am $IDEA_OUTPUT/src`
JAVAC_CLASSES=`cygpath -am $CLASSES`
JAVAC_CP=`cygpath -am $CP`
fi
if [ "$VERBOSE" = true ] ; then
IDEA_PROJECT_DIR="`dirname $IDEA_OUTPUT`"
if [ "x$PATHTOOL" != "x" ]; then
IDEA_PROJECT_DIR="`$PATHTOOL -am $IDEA_PROJECT_DIR`"
fi
echo "
Now you can open \"$IDEA_PROJECT_DIR\" as IDEA project
You can also run 'bash \"$IDEA_OUTPUT/jdk-clion/update-project.sh\"' to generate Clion project"
fi
$BOOT_JDK/bin/javac -d $JAVAC_CLASSES -sourcepath $JAVAC_SOURCE_PATH -cp $JAVAC_CP $JAVAC_SOURCE_FILE

configure (0 lines changed; Executable file → Normal file)

doc/building.html

@@ -76,9 +76,8 @@
<li><a href="#specifying-the-target-platform">Specifying the Target Platform</a></li>
<li><a href="#toolchain-considerations">Toolchain Considerations</a></li>
<li><a href="#native-libraries">Native Libraries</a></li>
<li><a href="#cross-compiling-with-debian-sysroots">Cross compiling with Debian sysroots</a></li>
<li><a href="#creating-and-using-sysroots-with-qemu-deboostrap">Creating And Using Sysroots With qemu-deboostrap</a></li>
<li><a href="#building-for-armaarch64">Building for ARM/aarch64</a></li>
<li><a href="#building-for-musl">Building for musl</a></li>
<li><a href="#verifying-the-build">Verifying the Build</a></li>
</ul></li>
<li><a href="#build-performance">Build Performance</a><ul>
@@ -97,10 +96,12 @@
<li><a href="#getting-help">Getting Help</a></li>
</ul></li>
<li><a href="#hints-and-suggestions-for-advanced-users">Hints and Suggestions for Advanced Users</a><ul>
<li><a href="#setting-up-a-repository-for-pushing-changes-defpath">Setting Up a Repository for Pushing Changes (defpath)</a></li>
<li><a href="#bash-completion">Bash Completion</a></li>
<li><a href="#using-multiple-configurations">Using Multiple Configurations</a></li>
<li><a href="#handling-reconfigurations">Handling Reconfigurations</a></li>
<li><a href="#using-fine-grained-make-targets">Using Fine-Grained Make Targets</a></li>
<li><a href="#learn-about-mercurial">Learn About Mercurial</a></li>
</ul></li>
<li><a href="#understanding-the-build-system">Understanding the Build System</a><ul>
<li><a href="#configurations">Configurations</a></li>
@@ -114,10 +115,10 @@
</ul>
</nav>
<h2 id="tldr-instructions-for-the-impatient">TL;DR (Instructions for the Impatient)</h2>
<p>If you are eager to try out building the JDK, these simple steps works most of the time. They assume that you have installed Git (and Cygwin if running on Windows) and cloned the top-level JDK repository that you want to build.</p>
<p>If you are eager to try out building the JDK, these simple steps works most of the time. They assume that you have installed Mercurial (and Cygwin if running on Windows) and cloned the top-level JDK repository that you want to build.</p>
<ol type="1">
<li><p><a href="#getting-the-source-code">Get the complete source code</a>:<br />
<code>git clone https://git.openjdk.java.net/jdk/</code></p></li>
<code>hg clone http://hg.openjdk.java.net/jdk/jdk</code></p></li>
<li><p><a href="#running-configure">Run configure</a>:<br />
<code>bash configure</code></p>
<p>If <code>configure</code> fails due to missing dependencies (to either the <a href="#native-compiler-toolchain-requirements">toolchain</a>, <a href="#build-tools-requirements">build tools</a>, <a href="#external-library-requirements">external libraries</a> or the <a href="#boot-jdk-requirements">boot JDK</a>), most of the time it prints a suggestion on how to resolve the situation on your platform. Follow the instructions, and try running <code>bash configure</code> again.</p></li>
@@ -133,8 +134,8 @@
<p>The JDK is a complex software project. Building it requires a certain amount of technical expertise, a fair number of dependencies on external software, and reasonably powerful hardware.</p>
<p>If you just want to use the JDK and not build it yourself, this document is not for you. See for instance <a href="http://openjdk.java.net/install">OpenJDK installation</a> for some methods of installing a prebuilt JDK.</p>
<h2 id="getting-the-source-code">Getting the Source Code</h2>
<p>Make sure you are getting the correct version. As of JDK 10, the source is no longer split into separate repositories so you only need to clone one single repository. At the <a href="https://git.openjdk.java.net/">OpenJDK Git site</a> you can see a list of all available repositories. If you want to build an older version, e.g. JDK 11, it is recommended that you get the <code>jdk11u</code> repo, which contains incremental updates, instead of the <code>jdk11</code> repo, which was frozen at JDK 11 GA.</p>
<p>If you are new to Git, a good place to start is the book <a href="https://git-scm.com/book/en/v2">Pro Git</a>. The rest of this document assumes a working knowledge of Git.</p>
<p>Make sure you are getting the correct version. As of JDK 10, the source is no longer split into separate repositories so you only need to clone one single repository. At the <a href="http://hg.openjdk.java.net/">OpenJDK Mercurial server</a> you can see a list of all available repositories. If you want to build an older version, e.g. JDK 8, it is recommended that you get the <code>jdk8u</code> forest, which contains incremental updates, instead of the <code>jdk8</code> forest, which was frozen at JDK 8 GA.</p>
<p>If you are new to Mercurial, a good place to start is the <a href="http://www.mercurial-scm.org/guide">Mercurial Beginner's Guide</a>. The rest of this document assumes a working knowledge of Mercurial.</p>
<h3 id="special-considerations">Special Considerations</h3>
<p>For a smooth building experience, it is recommended that you follow these rules on where and how to check out the source code.</p>
<ul>
@@ -145,11 +146,7 @@
<ul>
<li><p>Create the directory that is going to contain the top directory of the JDK clone by using the <code>mkdir</code> command in the Cygwin bash shell. That is, do <em>not</em> create it using Windows Explorer. This will ensure that it will have proper Cygwin attributes, and that it's children will inherit those attributes.</p></li>
<li><p>Do not put the JDK clone in a path under your Cygwin home directory. This is especially important if your user name contains spaces and/or mixed upper and lower case letters.</p></li>
<li><p>You need to install a git client. You have two choices, Cygwin git or Git for Windows. Unfortunately there are pros and cons with each choice.</p>
<ul>
<li><p>The Cygwin <code>git</code> client has no line ending issues and understands Cygwin paths (which are used throughout the JDK build system). However, it does not currently work well with the Skara CLI tooling. Please see the <a href="https://wiki.openjdk.java.net/display/SKARA/Skara#Skara-Git">Skara wiki on Git clients</a> for up-to-date information about the Skara git client support.</p></li>
<li><p>The <a href="https://gitforwindows.org">Git for Windows</a> client has issues with line endings, and do not understand Cygwin paths. It does work well with the Skara CLI tooling, however. To alleviate the line ending problems, make sure you set <code>core.autocrlf</code> to <code>false</code> (this is asked during installation).</p></li>
</ul></li>
<li><p>Clone the JDK repository using the Cygwin command line <code>hg</code> client as instructed in this document. That is, do <em>not</em> use another Mercurial client such as TortoiseHg.</p></li>
</ul>
<p>Failure to follow this procedure might result in hard-to-debug build problems.</p></li>
</ul>
@@ -196,7 +193,7 @@
<p>Windows XP is not a supported platform, but all newer Windows should be able to build the JDK.</p>
<p>On Windows, it is important that you pay attention to the instructions in the <a href="#special-considerations">Special Considerations</a>.</p>
<p>Windows is the only non-POSIX OS supported by the JDK, and as such, requires some extra care. A POSIX support layer is required to build on Windows. Currently, the only supported such layers are Cygwin and Windows Subsystem for Linux (WSL). (Msys is no longer supported due to a too old bash; msys2 would likely be possible to support in a future version but that would require effort to implement.)</p>
<p>Internally in the build system, all paths are represented as Unix-style paths, e.g. <code>/cygdrive/c/git/jdk/Makefile</code> rather than <code>C:\git\jdk\Makefile</code>. This rule also applies to input to the build system, e.g. in arguments to <code>configure</code>. So, use <code>--with-msvcr-dll=/cygdrive/c/msvcr100.dll</code> rather than <code>--with-msvcr-dll=c:\msvcr100.dll</code>. For details on this conversion, see the section on <a href="#fixpath">Fixpath</a>.</p>
<p>Internally in the build system, all paths are represented as Unix-style paths, e.g. <code>/cygdrive/c/hg/jdk9/Makefile</code> rather than <code>C:\hg\jdk9\Makefile</code>. This rule also applies to input to the build system, e.g. in arguments to <code>configure</code>. So, use <code>--with-msvcr-dll=/cygdrive/c/msvcr100.dll</code> rather than <code>--with-msvcr-dll=c:\msvcr100.dll</code>. For details on this conversion, see the section on <a href="#fixpath">Fixpath</a>.</p>
<h4 id="cygwin">Cygwin</h4>
<p>A functioning <a href="http://www.cygwin.com/">Cygwin</a> environment is required for building the JDK on Windows. If you have a 64-bit OS, we strongly recommend using the 64-bit version of Cygwin.</p>
<p><strong>Note:</strong> Cygwin has a model of continuously updating all packages without any easy way to install or revert to a specific version of a package. This means that whenever you add or update a package in Cygwin, you might (inadvertently) update tools that are used by the JDK build process, and that can cause unexpected build problems.</p>
@@ -227,8 +224,6 @@
<pre><code>sudo apt-get install build-essential</code></pre>
<p>For rpm-based distributions (Fedora, Red Hat, etc), try this:</p>
<pre><code>sudo yum groupinstall &quot;Development Tools&quot;</code></pre>
<p>For Alpine Linux, aside from basic tooling, install the GNU versions of some programs:</p>
<pre><code>sudo apk add build-base bash grep zip</code></pre>
<h3 id="aix">AIX</h3>
<p>Please consult the AIX section of the <a href="https://wiki.openjdk.java.net/display/Build/Supported+Build+Platforms">Supported Build Platforms</a> OpenJDK Build Wiki page for details about which versions of AIX are supported.</p>
<h2 id="native-compiler-toolchain-requirements">Native Compiler (Toolchain) Requirements</h2>
@@ -270,7 +265,7 @@
<tbody>
<tr class="odd">
<td style="text-align: left;">Linux</td>
<td style="text-align: left;">gcc 10.2.0</td>
<td style="text-align: left;">gcc 9.2.0</td>
</tr>
<tr class="even">
<td style="text-align: left;">macOS</td>
@@ -278,14 +273,14 @@
</tr>
<tr class="odd">
<td style="text-align: left;">Windows</td>
<td style="text-align: left;">Microsoft Visual Studio 2019 update 16.7.2</td>
<td style="text-align: left;">Microsoft Visual Studio 2019 update 16.5.3</td>
</tr>
</tbody>
</table>
<p>All compilers are expected to be able to compile to the C99 language standard, as some C99 features are used in the source code. Microsoft Visual Studio doesn't fully support C99 so in practice shared code is limited to using C99 features that it does support.</p>
<h3 id="gcc">gcc</h3>
<p>The minimum accepted version of gcc is 5.0. Older versions will generate a warning by <code>configure</code> and are unlikely to work.</p>
<p>The JDK is currently known to be able to compile with at least version 10.2 of gcc.</p>
<p>The JDK is currently known to be able to compile with at least version 9.2 of gcc.</p>
<p>In general, any version between these two should be usable.</p>
<h3 id="clang">clang</h3>
<p>The minimum accepted version of clang is 3.5. Older versions will not be accepted by <code>configure</code>.</p>
@@ -299,7 +294,6 @@
<h3 id="microsoft-visual-studio">Microsoft Visual Studio</h3>
<p>The minimum accepted version of Visual Studio is 2017. Older versions will not be accepted by <code>configure</code> and will not work. The maximum accepted version of Visual Studio is 2019.</p>
<p>If you have multiple versions of Visual Studio installed, <code>configure</code> will by default pick the latest. You can request a specific version to be used by setting <code>--with-toolchain-version</code>, e.g. <code>--with-toolchain-version=2017</code>.</p>
<p>If you have Visual Studio installed but <code>configure</code> fails to detect it, it may be because of <a href="#spaces-in-path">spaces in path</a>.</p>
<h3 id="ibm-xl-cc">IBM XL C/C++</h3>
<p>Please consult the AIX section of the <a href="https://wiki.openjdk.java.net/display/Build/Supported+Build+Platforms">Supported Build Platforms</a> OpenJDK Build Wiki page for details about which versions of XLC are supported.</p>
<h2 id="boot-jdk-requirements">Boot JDK Requirements</h2>
@@ -319,8 +313,6 @@
<ul>
<li>To install on an apt-based Linux, try running <code>sudo apt-get install libfreetype6-dev</code>.</li>
<li>To install on an rpm-based Linux, try running <code>sudo yum install freetype-devel</code>.</li>
<li>To install on Alpine Linux, try running <code>sudo apk add freetype-dev</code>.</li>
<li>To install on macOS, try running <code>brew install freetype</code>.</li>
</ul>
<p>Use <code>--with-freetype-include=&lt;path&gt;</code> and <code>--with-freetype-lib=&lt;path&gt;</code> if <code>configure</code> does not automatically locate the platform FreeType files.</p>
<h3 id="cups">CUPS</h3>
@@ -328,7 +320,6 @@
<ul>
<li>To install on an apt-based Linux, try running <code>sudo apt-get install libcups2-dev</code>.</li>
<li>To install on an rpm-based Linux, try running <code>sudo yum install cups-devel</code>.</li>
<li>To install on Alpine Linux, try running <code>sudo apk add cups-dev</code>.</li>
</ul>
<p>Use <code>--with-cups=&lt;path&gt;</code> if <code>configure</code> does not properly locate your CUPS files.</p>
<h3 id="x11">X11</h3>
@@ -336,7 +327,6 @@
<ul>
<li>To install on an apt-based Linux, try running <code>sudo apt-get install libx11-dev libxext-dev libxrender-dev libxrandr-dev libxtst-dev libxt-dev</code>.</li>
<li>To install on an rpm-based Linux, try running <code>sudo yum install libXtst-devel libXt-devel libXrender-devel libXrandr-devel libXi-devel</code>.</li>
<li>To install on Alpine Linux, try running <code>sudo apk add libx11-dev libxext-dev libxrender-dev libxrandr-dev libxtst-dev libxt-dev</code>.</li>
</ul>
<p>Use <code>--with-x=&lt;path&gt;</code> if <code>configure</code> does not properly locate your X11 files.</p>
<h3 id="alsa">ALSA</h3>
@@ -344,7 +334,6 @@
<ul>
<li>To install on an apt-based Linux, try running <code>sudo apt-get install libasound2-dev</code>.</li>
<li>To install on an rpm-based Linux, try running <code>sudo yum install alsa-lib-devel</code>.</li>
<li>To install on Alpine Linux, try running <code>sudo apk add alsa-lib-dev</code>.</li>
</ul>
<p>Use <code>--with-alsa=&lt;path&gt;</code> if <code>configure</code> does not properly locate your ALSA files.</p>
<h3 id="libffi">libffi</h3>
@@ -352,7 +341,6 @@
<ul>
<li>To install on an apt-based Linux, try running <code>sudo apt-get install libffi-dev</code>.</li>
<li>To install on an rpm-based Linux, try running <code>sudo yum install libffi-devel</code>.</li>
<li>To install on Alpine Linux, try running <code>sudo apk add libffi-dev</code>.</li>
</ul>
<p>Use <code>--with-libffi=&lt;path&gt;</code> if <code>configure</code> does not properly locate your libffi files.</p>
<h2 id="build-tools-requirements">Build Tools Requirements</h2>
@@ -361,7 +349,6 @@
<ul>
<li>To install on an apt-based Linux, try running <code>sudo apt-get install autoconf</code>.</li>
<li>To install on an rpm-based Linux, try running <code>sudo yum install autoconf</code>.</li>
<li>To install on Alpine Linux, try running <code>sudo apk add autoconf</code>.</li>
<li>To install on macOS, try running <code>brew install autoconf</code>.</li>
<li>To install on Windows, try running <code>&lt;path to Cygwin setup&gt;/setup-x86_64 -q -P autoconf</code>.</li>
</ul>
@@ -404,7 +391,7 @@
<li><code>--enable-jvm-feature-&lt;feature&gt;</code> or <code>--disable-jvm-feature-&lt;feature&gt;</code> - Include (or exclude) <code>&lt;feature&gt;</code> as a JVM feature in Hotspot. You can also specify a list of features to be enabled, separated by space or comma, as <code>--with-jvm-features=&lt;feature&gt;[,&lt;feature&gt;...]</code>. If you prefix <code>&lt;feature&gt;</code> with a <code>-</code>, it will be disabled. These options will modify the default list of features for the JVM variant(s) you are building. For the <code>custom</code> JVM variant, the default list is empty. A complete list of valid JVM features can be found using <code>bash configure --help</code>.</li>
<li><code>--with-target-bits=&lt;bits&gt;</code> - Create a target binary suitable for running on a <code>&lt;bits&gt;</code> platform. Use this to create 32-bit output on a 64-bit build platform, instead of doing a full cross-compile. (This is known as a <em>reduced</em> build.)</li>
</ul>
<p>On Linux, BSD and AIX, it is possible to override where Java by default searches for runtime/JNI libraries. This can be useful in situations where there is a special shared directory for system JNI libraries. This setting can in turn be overridden at runtime by setting the <code>java.library.path</code> property.</p>
<p>On Linux, BSD and AIX, it is possible to override where Java by default searches for runtime/JNI libraries. This can be useful in situations where there is a special shared directory for system JNI libraries. This setting can in turn be overriden at runtime by setting the <code>java.library.path</code> property.</p>
<ul>
<li><code>--with-jni-libpath=&lt;path&gt;</code> - Use the specified path as a default when searching for runtime libraries.</li>
</ul>
@@ -443,7 +430,7 @@
<h3 id="configure-control-variables">Configure Control Variables</h3>
<p>It is possible to control certain aspects of <code>configure</code> by overriding the value of <code>configure</code> variables, either on the command line or in the environment.</p>
<p>Normally, this is <strong>not recommended</strong>. If used improperly, it can lead to a broken configuration. Unless you're well versed in the build system, this is hard to use properly. Therefore, <code>configure</code> will print a warning if this is detected.</p>
<p>However, there are a few <code>configure</code> variables, known as <em>control variables</em> that are supposed to be overridden on the command line. These are variables that describe the location of tools needed by the build, like <code>MAKE</code> or <code>GREP</code>. If any such variable is specified, <code>configure</code> will use that value instead of trying to autodetect the tool. For instance, <code>bash configure MAKE=/opt/gnumake4.0/bin/make</code>.</p>
<p>However, there are a few <code>configure</code> variables, known as <em>control variables</em> that are supposed to be overriden on the command line. These are variables that describe the location of tools needed by the build, like <code>MAKE</code> or <code>GREP</code>. If any such variable is specified, <code>configure</code> will use that value instead of trying to autodetect the tool. For instance, <code>bash configure MAKE=/opt/gnumake4.0/bin/make</code>.</p>
<p>If a configure argument exists, use that instead, e.g. use <code>--with-jtreg</code> instead of setting <code>JTREGEXE</code>.</p>
<p>Also note that, despite what autoconf claims, setting <code>CFLAGS</code> will not accomplish anything. Instead use <code>--with-extra-cflags</code> (and similar for <code>cxxflags</code> and <code>ldflags</code>).</p>
<h2 id="running-make">Running Make</h2>
@@ -480,7 +467,7 @@
<h3 id="make-control-variables">Make Control Variables</h3>
<p>It is possible to control <code>make</code> behavior by overriding the value of <code>make</code> variables, either on the command line or in the environment.</p>
<p>Normally, this is <strong>not recommended</strong>. If used improperly, it can lead to a broken build. Unless you're well versed in the build system, this is hard to use properly. Therefore, <code>make</code> will print a warning if this is detected.</p>
<p>However, there are a few <code>make</code> variables, known as <em>control variables</em> that are supposed to be overridden on the command line. These make up the &quot;make time&quot; configuration, as opposed to the &quot;configure time&quot; configuration.</p>
<p>However, there are a few <code>make</code> variables, known as <em>control variables</em> that are supposed to be overriden on the command line. These make up the &quot;make time&quot; configuration, as opposed to the &quot;configure time&quot; configuration.</p>
<h4 id="general-make-control-variables">General Make Control Variables</h4>
<ul>
<li><code>JOBS</code> - Specify the number of jobs to build with. See <a href="#build-performance">Build Performance</a>.</li>
@@ -488,7 +475,7 @@
<li><code>CONF</code> and <code>CONF_NAME</code> - Selecting the configuration(s) to use. See <a href="#using-multiple-configurations">Using Multiple Configurations</a></li>
</ul>
<h4 id="test-make-control-variables">Test Make Control Variables</h4>
<p>These make control variables only make sense when running tests. Please see <strong>Testing the JDK</strong> (<a href="testing.html">html</a>, <a href="testing.md">markdown</a>) for details.</p>
<p>These make control variables only make sense when running tests. Please see <a href="testing.html">Testing the JDK</a> for details.</p>
<ul>
<li><code>TEST</code></li>
<li><code>TEST_JOBS</code></li>
@@ -506,7 +493,7 @@
</ul>
<h2 id="running-tests">Running Tests</h2>
<p>Most of the JDK tests are using the <a href="http://openjdk.java.net/jtreg">JTReg</a> test framework. Make sure that your configuration knows where to find your installation of JTReg. If this is not picked up automatically, use the <code>--with-jtreg=&lt;path to jtreg home&gt;</code> option to point to the JTReg framework. Note that this option should point to the JTReg home, i.e. the top directory, containing <code>lib/jtreg.jar</code> etc.</p>
<p>The <a href="https://wiki.openjdk.java.net/display/Adoption">Adoption Group</a> provides recent builds of jtreg <a href="https://ci.adoptopenjdk.net/view/Dependencies/job/dependency_pipeline/lastSuccessfulBuild/artifact/jtreg/">here</a>. Download the latest <code>.tar.gz</code> file, unpack it, and point <code>--with-jtreg</code> to the <code>jtreg</code> directory that you just unpacked.</p>
<p>The <a href="https://wiki.openjdk.java.net/display/Adoption">Adoption Group</a> provides recent builds of jtreg <a href="https://ci.adoptopenjdk.net/view/Dependencies/job/jtreg/lastSuccessfulBuild/artifact">here</a>. Download the latest <code>.tar.gz</code> file, unpack it, and point <code>--with-jtreg</code> to the <code>jtreg</code> directory that you just unpacked.</p>
<p>Building of Hotspot Gtest suite requires the source code of Google Test framework. The top directory, which contains both <code>googletest</code> and <code>googlemock</code> directories, should be specified via <code>--with-gtest</code>. The supported version of Google Test is 1.8.1, whose source code can be obtained:</p>
<ul>
<li>by downloading and unpacking the source bundle from <a href="https://github.com/google/googletest/releases/tag/release-1.8.1">here</a></li>
@@ -514,7 +501,7 @@
</ul>
<p>To execute the most basic tests (tier 1), use:</p>
<pre><code>make run-test-tier1</code></pre>
<p>For more details on how to run tests, please see <strong>Testing the JDK</strong> (<a href="testing.html">html</a>, <a href="testing.md">markdown</a>).</p>
<p>For more details on how to run tests, please see the <a href="testing.html">Testing the JDK</a> document.</p>
<h2 id="cross-compiling">Cross-compiling</h2>
<p>Cross-compiling means using one platform (the <em>build</em> platform) to generate output that can ran on another platform (the <em>target</em> platform).</p>
<p>The typical reason for cross-compiling is that the build is performed on a more powerful desktop computer, but the resulting binaries will be able to run on a different, typically low-performing system. Most of the complications that arise when building for embedded is due to this separation of <em>build</em> and <em>target</em> systems.</p>
@@ -629,154 +616,78 @@ cp: cannot stat `arm-linux-gnueabihf/libSM.so&#39;: No such file or directory
cp: cannot stat `arm-linux-gnueabihf/libXt.so&#39;: No such file or directory</code></pre></li>
<li><p>If the X11 libraries are not properly detected by <code>configure</code>, you can point them out by <code>--with-x</code>.</p></li>
</ul>
<h3 id="cross-compiling-with-debian-sysroots">Cross compiling with Debian sysroots</h3>
<h3 id="creating-and-using-sysroots-with-qemu-deboostrap">Creating And Using Sysroots With qemu-deboostrap</h3>
<p>Fortunately, you can create sysroots for foreign architectures with tools provided by your OS. On Debian/Ubuntu systems, one could use <code>qemu-deboostrap</code> to create the <em>target</em> system chroot, which would have the native libraries and headers specific to that <em>target</em> system. After that, we can use the cross-compiler on the <em>build</em> system, pointing into chroot to get the build dependencies right. This allows building for foreign architectures with native compilation speed.</p>
<p>For example, cross-compiling to AArch64 from x86_64 could be done like this:</p>
<ul>
<li><p>Install cross-compiler on the <em>build</em> system:</p>
<pre><code>apt install g++-aarch64-linux-gnu gcc-aarch64-linux-gnu</code></pre></li>
<li><p>Create chroot on the <em>build</em> system, configuring it for <em>target</em> system:</p>
<pre><code>sudo qemu-debootstrap \
--arch=arm64 \
--verbose \
--include=fakeroot,symlinks,build-essential,libx11-dev,libxext-dev,libxrender-dev,libxrandr-dev,libxtst-dev,libxt-dev,libcups2-dev,libfontconfig1-dev,libasound2-dev,libfreetype6-dev,libpng-dev,libffi-dev \
--resolve-deps \
buster \
~/sysroot-arm64 \
http://httpredir.debian.org/debian/</code></pre></li>
<li><p>Make sure the symlinks inside the newly created chroot point to proper locations:</p>
<pre><code>sudo chroot ~/sysroot-arm64 symlinks -cr .</code></pre></li>
<li><p>Configure and build with newly created chroot as sysroot/toolchain-path:</p>
<pre><code>sh ./configure \
--openjdk-target=aarch64-linux-gnu \
--with-sysroot=~/sysroot-arm64
make images
ls build/linux-aarch64-server-release/</code></pre></li>
<li>Install cross-compiler on the <em>build</em> system:</li>
</ul>
<p>The build does not create new files in that chroot, so it can be reused for multiple builds without additional cleanup.</p>
<p>The build system should automatically detect the toolchain paths and dependencies, but sometimes it might require a little nudge with:</p>
<pre><code>apt install g++-aarch64-linux-gnu gcc-aarch64-linux-gnu</code></pre>
<ul>
<li><p>Native compilers: override <code>CC</code> or <code>CXX</code> for <code>./configure</code></p></li>
<li><p>Freetype lib location: override <code>--with-freetype-lib</code>, for example <code>${sysroot}/usr/lib/${target}/</code></p></li>
<li><p>Freetype includes location: override <code>--with-freetype-include</code> for example <code>${sysroot}/usr/include/freetype2/</code></p></li>
<li><p>X11 libraries location: override <code>--x-libraries</code>, for example <code>${sysroot}/usr/lib/${target}/</code></p></li>
<li>Create chroot on the <em>build</em> system, configuring it for <em>target</em> system:</li>
</ul>
<pre><code>sudo qemu-debootstrap --arch=arm64 --verbose \
--include=fakeroot,build-essential,libx11-dev,libxext-dev,libxrender-dev,libxrandr-dev,libxtst-dev,libxt-dev,libcups2-dev,libfontconfig1-dev,libasound2-dev,libfreetype6-dev,libpng12-dev \
--resolve-deps jessie /chroots/arm64 http://httpredir.debian.org/debian/</code></pre>
<ul>
<li>Configure and build with newly created chroot as sysroot/toolchain-path:</li>
</ul>
<pre><code>CC=aarch64-linux-gnu-gcc CXX=aarch64-linux-gnu-g++ sh ./configure --openjdk-target=aarch64-linux-gnu --with-sysroot=/chroots/arm64/ --with-toolchain-path=/chroots/arm64/
make images
ls build/linux-aarch64-normal-server-release/</code></pre>
<p>The build does not create new files in that chroot, so it can be reused for multiple builds without additional cleanup.</p>
<p>Architectures that are known to successfully cross-compile like this are:</p>
<table>
<thead>
<tr class="header">
<th style="text-align: left;">Target</th>
<th style="text-align: left;">Debian tree</th>
<th style="text-align: left;">Debian arch</th>
<th style="text-align: left;"><code>CC</code></th>
<th style="text-align: left;"><code>CXX</code></th>
<th style="text-align: left;"><code>--arch=...</code></th>
<th style="text-align: left;"><code>--openjdk-target=...</code></th>
<th><code>--with-jvm-variants=...</code></th>
</tr>
</thead>
<tbody>
<tr class="odd"><td>x86</td><td>buster</td><td>i386</td><td>default</td><td>default</td><td>i386</td><td>i386-linux-gnu</td><td>(all)</td></tr>
<tr class="even"><td>arm</td><td>buster</td><td>armhf</td><td>gcc-arm-linux-gnueabihf</td><td>g++-arm-linux-gnueabihf</td><td>armhf</td><td>arm-linux-gnueabihf</td><td>(all)</td></tr>
<tr class="odd"><td>aarch64</td><td>buster</td><td>arm64</td><td>gcc-aarch64-linux-gnu</td><td>g++-aarch64-linux-gnu</td><td>arm64</td><td>aarch64-linux-gnu</td><td>(all)</td></tr>
<tr class="even"><td>ppc64le</td><td>buster</td><td>ppc64el</td><td>gcc-powerpc64le-linux-gnu</td><td>g++-powerpc64le-linux-gnu</td><td>ppc64el</td><td>powerpc64le-linux-gnu</td><td>(all)</td></tr>
<tr class="odd"><td>s390x</td><td>buster</td><td>s390x</td><td>gcc-s390x-linux-gnu</td><td>g++-s390x-linux-gnu</td><td>s390x</td><td>s390x-linux-gnu</td><td>(all)</td></tr>
<tr class="even"><td>mipsle</td><td>buster</td><td>mipsel</td><td></td><td></td><td></td><td>mipsel-linux-gnu</td><td>zero</td></tr>
<tr class="odd"><td>mips64le</td><td>buster</td><td>mips64el</td><td></td><td></td><td></td><td>mips64el-linux-gnueabi64</td><td>zero</td></tr>
<tr class="even"><td>armel</td><td>buster</td><td>arm</td><td></td><td></td><td></td><td>arm-linux-gnueabi</td><td>zero</td></tr>
<tr class="odd"><td>ppc</td><td>sid</td><td>powerpc</td><td></td><td></td><td></td><td>powerpc-linux-gnu</td><td>zero</td></tr>
<tr class="even"><td>ppc64be</td><td>sid</td><td>ppc64</td><td></td><td></td><td></td><td>powerpc64-linux-gnu</td><td>(all)</td></tr>
<tr class="odd"><td>m68k</td><td>sid</td><td>m68k</td><td></td><td></td><td></td><td>m68k-linux-gnu</td><td>zero</td></tr>
<tr class="even"><td>alpha</td><td>sid</td><td>alpha</td><td></td><td></td><td></td><td>alpha-linux-gnu</td><td>zero</td></tr>
<tr class="odd"><td>sh4</td><td>sid</td><td>sh4</td><td></td><td></td><td></td><td>sh4-linux-gnu</td><td>zero</td></tr>
</tbody>
</table>
<p>Additional architectures might be supported by Debian/Ubuntu Ports.</p>
<h3 id="building-for-armaarch64">Building for ARM/aarch64</h3>
<p>A common cross-compilation target is the ARM CPU. When building for ARM, it is useful to set the ABI profile. A number of pre-defined ABI profiles are available using <code>--with-abi-profile</code>: arm-vfp-sflt, arm-vfp-hflt, arm-sflt, armv5-vfp-sflt, armv6-vfp-hflt. Note that soft-float ABIs are no longer properly supported by the JDK.</p>
<h3 id="building-for-musl">Building for musl</h3>
<p>Just like it's possible to cross-compile for a different CPU, it's possible to cross-compile for musl libc on a glibc-based <em>build</em> system. A devkit suitable for most target CPU architectures can be obtained from <a href="https://musl.cc">musl.cc</a>. After installing the required packages in the sysroot, configure the build with <code>--openjdk-target</code>:</p>
<pre><code>sh ./configure --with-jvm-variants=server \
--with-boot-jdk=$BOOT_JDK \
--with-build-jdk=$BUILD_JDK \
--openjdk-target=x86_64-unknown-linux-musl \
--with-devkit=$DEVKIT \
--with-sysroot=$SYSROOT</code></pre>
<p>and run <code>make</code> normally.</p>
<h3 id="verifying-the-build">Verifying the Build</h3>
<p>The build will end up in a directory named like <code>build/linux-arm-normal-server-release</code>.</p>
<p>Inside this build output directory, the <code>images/jdk</code> directory will contain the newly built JDK for your <em>target</em> system.</p>
@@ -820,14 +731,14 @@ ls build/linux-aarch64-server-release/</code></pre></li>
=== Output from failing command(s) repeated here ===
* For target hotspot_variant-server_libjvm_objs_psMemoryPool.o:
/localhome/git/jdk-sandbox/hotspot/src/share/vm/services/psMemoryPool.cpp:1:1: error: &#39;failhere&#39; does not name a type
/localhome/hg/jdk9-sandbox/hotspot/src/share/vm/services/psMemoryPool.cpp:1:1: error: &#39;failhere&#39; does not name a type
... (rest of output omitted)
* All command lines available in /localhome/git/jdk-sandbox/build/linux-x64/make-support/failure-logs.
* All command lines available in /localhome/hg/jdk9-sandbox/build/linux-x64/make-support/failure-logs.
=== End of repeated output ===
=== Make failed targets repeated here ===
lib/CompileJvm.gmk:207: recipe for target &#39;/localhome/git/jdk-sandbox/build/linux-x64/hotspot/variant-server/libjvm/objs/psMemoryPool.o&#39; failed
lib/CompileJvm.gmk:207: recipe for target &#39;/localhome/hg/jdk9-sandbox/build/linux-x64/hotspot/variant-server/libjvm/objs/psMemoryPool.o&#39; failed
make/Main.gmk:263: recipe for target &#39;hotspot-server-libs&#39; failed
=== End of repeated output ===
@@ -854,7 +765,7 @@ Hint: If caused by a warning, try configure --disable-warnings-as-errors.</code>
<p>Here is a suggested list of things to try if you are having unexpected build problems. Each step requires more time than the one before, so try them in order. Most issues will be solved at step 1 or 2.</p>
<ol type="1">
<li><p>Make sure your repository is up-to-date</p>
<p>Run <code>git pull origin master</code> to make sure you have the latest changes.</p></li>
<p>Run <code>hg pull -u</code> to make sure you have the latest changes.</p></li>
<li><p>Clean build results</p>
<p>The simplest way to fix incremental rebuild issues is to run <code>make clean</code>. This will remove all build results, but not the configuration or any build system support artifacts. In most cases, this will solve build errors resulting from incremental build mismatches.</p></li>
<li><p>Completely clean the build directory.</p>
@@ -863,8 +774,8 @@ Hint: If caused by a warning, try configure --disable-warnings-as-errors.</code>
make dist-clean
bash configure $(cat current-configuration)
make</code></pre></li>
<li><p>Re-clone the Git repository</p>
<p>Sometimes the Git repository gets in a state that causes the product to be un-buildable. In such a case, the simplest solution is often the &quot;sledgehammer approach&quot;: delete the entire repository, and re-clone it. If you have local changes, save them first to a different location using <code>git format-patch</code>.</p></li>
<li><p>Re-clone the Mercurial repository</p>
<p>Sometimes the Mercurial repository gets in a state that causes the product to be un-buildable. In such a case, the simplest solution is often the &quot;sledgehammer approach&quot;: delete the entire repository, and re-clone it. If you have local changes, save them first to a different location using <code>hg export</code>.</p></li>
</ol>
<h3 id="specific-build-issues">Specific Build Issues</h3>
<h4 id="clock-skew">Clock Skew</h4>
@@ -879,12 +790,23 @@ Clock skew detected. Your build may be incomplete.</code></pre>
cannot create ... Permission denied
spawn failed</code></pre>
<p>This can be a sign of a Cygwin problem. See the information about solving problems in the <a href="#cygwin">Cygwin</a> section. Rebooting the computer might help temporarily.</p>
<h4 id="spaces-in-path">Spaces in Path</h4>
<p>On Windows, when configuring, <code>fixpath.sh</code> may report that some directory names have spaces. Usually, it assumes those directories have <a href="https://docs.microsoft.com/en-us/windows-server/administration/windows-commands/fsutil-8dot3name">short paths</a>. You can run <code>fsutil file setshortname</code> in <code>cmd</code> on certain directories, such as <code>Microsoft Visual Studio</code> or <code>Windows Kits</code>, to assign arbitrary short paths so <code>configure</code> can access them.</p>
<h3 id="getting-help">Getting Help</h3>
<p>If none of the suggestions in this document helps you, or if you find what you believe is a bug in the build system, please contact the Build Group by sending a mail to <a href="mailto:build-dev@openjdk.java.net">build-dev@openjdk.java.net</a>. Please include the relevant parts of the configure and/or build log.</p>
<p>If you need general help or advice about developing for the JDK, you can also contact the Adoption Group. See the section on <a href="#contributing-to-openjdk">Contributing to OpenJDK</a> for more information.</p>
<h2 id="hints-and-suggestions-for-advanced-users">Hints and Suggestions for Advanced Users</h2>
<h3 id="setting-up-a-repository-for-pushing-changes-defpath">Setting Up a Repository for Pushing Changes (defpath)</h3>
<p>To help you prepare a proper push path for a Mercurial repository, there is a useful tool known as <a href="http://openjdk.java.net/projects/code-tools/defpath">defpath</a>. It helps you set up a push path for pushing changes to the JDK.</p>
<p>Install the extension by cloning <code>http://hg.openjdk.java.net/code-tools/defpath</code> and updating your <code>.hgrc</code> file. Here's one way to do this:</p>
<pre><code>cd ~
mkdir hg-ext
cd hg-ext
hg clone http://hg.openjdk.java.net/code-tools/defpath
cat &lt;&lt; EOT &gt;&gt; ~/.hgrc
[extensions]
defpath=~/hg-ext/defpath/defpath.py
EOT</code></pre>
<p>You can now set up a proper push path using:</p>
<pre><code>hg defpath -d -u &lt;your OpenJDK username&gt;</code></pre>
<h3 id="bash-completion">Bash Completion</h3>
<p>The <code>configure</code> and <code>make</code> commands try to play nice with bash command-line completion (using <code>&lt;tab&gt;</code> or <code>&lt;tab&gt;&lt;tab&gt;</code>). To use this functionality, make sure you enable completion in your <code>~/.bashrc</code> (see instructions for bash in your operating system).</p>
<p>Make completion will work out of the box, and will complete valid make targets. For instance, typing <code>make jdk-i&lt;tab&gt;</code> will complete to <code>make jdk-image</code>.</p>
@@ -908,7 +830,7 @@ sudo mv /tmp/configure /usr/local/bin</code></pre>
<p>If you update the repository and part of the configure script has changed, the build system will force you to re-run <code>configure</code>.</p>
<p>Most of the time, you will be fine by running <code>configure</code> again with the same arguments as the last time, which can easily be performed by <code>make reconfigure</code>. To simplify this, you can use the <code>CONF_CHECK</code> make control variable, either as <code>make CONF_CHECK=auto</code>, or by setting an environment variable. For instance, if you add <code>export CONF_CHECK=auto</code> to your <code>.bashrc</code> file, <code>make</code> will always run <code>reconfigure</code> automatically whenever the configure script has changed.</p>
<p>You can also use <code>CONF_CHECK=ignore</code> to skip the check for a needed configure update. This might speed up the build, but comes at the risk of an incorrect build result. This is only recommended if you know what you're doing.</p>
<p>From time to time, you will also need to modify the command line to <code>configure</code> due to changes. Use <code>make print-configuration</code> to show the command line used for your current configuration.</p>
<p>From time to time, you will also need to modify the command line to <code>configure</code> due to changes. Use <code>make print-configure</code> to show the command line used for your current configuration.</p>
<h3 id="using-fine-grained-make-targets">Using Fine-Grained Make Targets</h3>
<p>The default behavior for make is to create consistent and correct output, at the expense of build speed, if necessary.</p>
<p>If you are prepared to take some risk of an incorrect build, and know enough of the system to understand how things build and interact, you can speed up the build process considerably by instructing make to only build a portion of the product.</p>
@@ -937,6 +859,14 @@ sudo mv /tmp/configure /usr/local/bin</code></pre>
<h4 id="rebuilding-part-of-java.base-jdk_filter">Rebuilding Part of java.base (JDK_FILTER)</h4>
<p>If you are modifying files in <code>java.base</code>, which is by far the largest module in the JDK, then you need to rebuild all those files whenever a single file has changed. (This inefficiency will hopefully be addressed in JDK 10.)</p>
<p>As a hack, you can use the make control variable <code>JDK_FILTER</code> to specify a pattern that will be used to limit the set of files being recompiled. For instance, <code>make java.base JDK_FILTER=javax/crypto</code> (or, to combine methods, <code>make java.base-java-only JDK_FILTER=javax/crypto</code>) will limit the compilation to files in the <code>javax.crypto</code> package.</p>
<h3 id="learn-about-mercurial">Learn About Mercurial</h3>
<p>To become an efficient JDK developer, it is recommended that you invest in learning Mercurial properly. Here are some links that can get you started:</p>
<ul>
<li><a href="http://www.mercurial-scm.org/wiki/GitConcepts">Mercurial for git users</a></li>
<li><a href="http://www.mercurial-scm.org/wiki/Tutorial">The official Mercurial tutorial</a></li>
<li><a href="http://hginit.com/">hg init</a></li>
<li><a href="http://hgbook.red-bean.com/read/">Mercurial: The Definitive Guide</a></li>
</ul>
<h2 id="understanding-the-build-system">Understanding the Build System</h2>
<p>This section will give you a more technical description on the details of the build system.</p>
<h3 id="configurations">Configurations</h3>

View File

@@ -3,11 +3,11 @@
## TL;DR (Instructions for the Impatient)
If you are eager to try out building the JDK, these simple steps works most of
the time. They assume that you have installed Git (and Cygwin if running
the time. They assume that you have installed Mercurial (and Cygwin if running
on Windows) and cloned the top-level JDK repository that you want to build.
1. [Get the complete source code](#getting-the-source-code): \
`git clone https://git.openjdk.java.net/jdk/`
`hg clone http://hg.openjdk.java.net/jdk/jdk`
2. [Run configure](#running-configure): \
`bash configure`
@@ -47,14 +47,14 @@ JDK.
Make sure you are getting the correct version. As of JDK 10, the source is no
longer split into separate repositories so you only need to clone one single
repository. At the [OpenJDK Git site](https://git.openjdk.java.net/) you
repository. At the [OpenJDK Mercurial server](http://hg.openjdk.java.net/) you
can see a list of all available repositories. If you want to build an older version,
e.g. JDK 11, it is recommended that you get the `jdk11u` repo, which contains
incremental updates, instead of the `jdk11` repo, which was frozen at JDK 11 GA.
e.g. JDK 8, it is recommended that you get the `jdk8u` forest, which contains
incremental updates, instead of the `jdk8` forest, which was frozen at JDK 8 GA.
If you are new to Git, a good place to start is the book [Pro
Git](https://git-scm.com/book/en/v2). The rest of this document
assumes a working knowledge of Git.
If you are new to Mercurial, a good place to start is the [Mercurial Beginner's
Guide](http://www.mercurial-scm.org/guide). The rest of this document assumes a
working knowledge of Mercurial.
### Special Considerations
@@ -89,21 +89,9 @@ on where and how to check out the source code.
directory. This is especially important if your user name contains
spaces and/or mixed upper and lower case letters.
* You need to install a git client. You have two choices, Cygwin git or
Git for Windows. Unfortunately there are pros and cons with each choice.
* The Cygwin `git` client has no line ending issues and understands
Cygwin paths (which are used throughout the JDK build system).
However, it does not currently work well with the Skara CLI tooling.
Please see the [Skara wiki on Git clients](
https://wiki.openjdk.java.net/display/SKARA/Skara#Skara-Git) for
up-to-date information about the Skara git client support.
* The [Git for Windows](https://gitforwindows.org) client has issues
with line endings, and do not understand Cygwin paths. It does work
well with the Skara CLI tooling, however. To alleviate the line ending
problems, make sure you set `core.autocrlf` to `false` (this is asked
during installation).
* Clone the JDK repository using the Cygwin command line `hg` client
as instructed in this document. That is, do *not* use another Mercurial
client such as TortoiseHg.
Failure to follow this procedure might result in hard-to-debug build
problems.
@@ -185,7 +173,7 @@ likely be possible to support in a future version but that would require effort
to implement.)
Internally in the build system, all paths are represented as Unix-style paths,
e.g. `/cygdrive/c/git/jdk/Makefile` rather than `C:\git\jdk\Makefile`. This
e.g. `/cygdrive/c/hg/jdk9/Makefile` rather than `C:\hg\jdk9\Makefile`. This
rule also applies to input to the build system, e.g. in arguments to
`configure`. So, use `--with-msvcr-dll=/cygdrive/c/msvcr100.dll` rather than
`--with-msvcr-dll=c:\msvcr100.dll`. For details on this conversion, see the section
@@ -285,13 +273,6 @@ For rpm-based distributions (Fedora, Red Hat, etc), try this:
sudo yum groupinstall "Development Tools"
```
For Alpine Linux, aside from basic tooling, install the GNU versions of some
programs:
```
sudo apk add build-base bash grep zip
```
### AIX
Please consult the AIX section of the [Supported Build Platforms](
@@ -321,9 +302,9 @@ issues.
Operating system Toolchain version
------------------ -------------------------------------------------------
Linux gcc 10.2.0
Linux gcc 9.2.0
macOS Apple Xcode 10.1 (using clang 10.0.0)
Windows Microsoft Visual Studio 2019 update 16.7.2
Windows Microsoft Visual Studio 2019 update 16.5.3
All compilers are expected to be able to compile to the C99 language standard,
as some C99 features are used in the source code. Microsoft Visual Studio
@@ -335,7 +316,7 @@ features that it does support.
The minimum accepted version of gcc is 5.0. Older versions will generate a warning
by `configure` and are unlikely to work.
The JDK is currently known to be able to compile with at least version 10.2 of
The JDK is currently known to be able to compile with at least version 9.2 of
gcc.
In general, any version between these two should be usable.
@@ -382,9 +363,6 @@ If you have multiple versions of Visual Studio installed, `configure` will by
default pick the latest. You can request a specific version to be used by
setting `--with-toolchain-version`, e.g. `--with-toolchain-version=2017`.
If you have Visual Studio installed but `configure` fails to detect it, it may
be because of [spaces in path](#spaces-in-path).
### IBM XL C/C++
Please consult the AIX section of the [Supported Build Platforms](
@@ -453,8 +431,6 @@ rather than bundling the JDK's own copy.
libfreetype6-dev`.
* To install on an rpm-based Linux, try running `sudo yum install
freetype-devel`.
* To install on Alpine Linux, try running `sudo apk add freetype-dev`.
* To install on macOS, try running `brew install freetype`.
Use `--with-freetype-include=<path>` and `--with-freetype-lib=<path>`
if `configure` does not automatically locate the platform FreeType files.
@@ -469,7 +445,6 @@ your operating system.
libcups2-dev`.
* To install on an rpm-based Linux, try running `sudo yum install
cups-devel`.
* To install on Alpine Linux, try running `sudo apk add cups-dev`.
Use `--with-cups=<path>` if `configure` does not properly locate your CUPS
files.
@@ -483,8 +458,6 @@ Linux.
libx11-dev libxext-dev libxrender-dev libxrandr-dev libxtst-dev libxt-dev`.
* To install on an rpm-based Linux, try running `sudo yum install
libXtst-devel libXt-devel libXrender-devel libXrandr-devel libXi-devel`.
* To install on Alpine Linux, try running `sudo apk add libx11-dev
libxext-dev libxrender-dev libxrandr-dev libxtst-dev libxt-dev`.
Use `--with-x=<path>` if `configure` does not properly locate your X11 files.
@@ -497,7 +470,6 @@ required on Linux. At least version 0.9.1 of ALSA is required.
libasound2-dev`.
* To install on an rpm-based Linux, try running `sudo yum install
alsa-lib-devel`.
* To install on Alpine Linux, try running `sudo apk add alsa-lib-dev`.
Use `--with-alsa=<path>` if `configure` does not properly locate your ALSA
files.
@@ -512,7 +484,6 @@ Hotspot.
libffi-dev`.
* To install on an rpm-based Linux, try running `sudo yum install
libffi-devel`.
* To install on Alpine Linux, try running `sudo apk add libffi-dev`.
Use `--with-libffi=<path>` if `configure` does not properly locate your libffi
files.
@@ -528,7 +499,6 @@ platforms. At least version 2.69 is required.
autoconf`.
* To install on an rpm-based Linux, try running `sudo yum install
autoconf`.
* To install on Alpine Linux, try running `sudo apk add autoconf`.
* To install on macOS, try running `brew install autoconf`.
* To install on Windows, try running `<path to Cygwin setup>/setup-x86_64 -q
-P autoconf`.
@@ -657,7 +627,7 @@ features, use `bash configure --help=short` instead.)
On Linux, BSD and AIX, it is possible to override where Java by default
searches for runtime/JNI libraries. This can be useful in situations where
there is a special shared directory for system JNI libraries. This setting
can in turn be overridden at runtime by setting the `java.library.path` property.
can in turn be overriden at runtime by setting the `java.library.path` property.
* `--with-jni-libpath=<path>` - Use the specified path as a default
when searching for runtime libraries.
@@ -723,7 +693,7 @@ hard to use properly. Therefore, `configure` will print a warning if this is
detected.
However, there are a few `configure` variables, known as *control variables*
that are supposed to be overridden on the command line. These are variables that
that are supposed to be overriden on the command line. These are variables that
describe the location of tools needed by the build, like `MAKE` or `GREP`. If
any such variable is specified, `configure` will use that value instead of
trying to autodetect the tool. For instance, `bash configure
@@ -803,7 +773,7 @@ broken build. Unless you're well versed in the build system, this is hard to
use properly. Therefore, `make` will print a warning if this is detected.
However, there are a few `make` variables, known as *control variables* that
are supposed to be overridden on the command line. These make up the "make time"
are supposed to be overriden on the command line. These make up the "make time"
configuration, as opposed to the "configure time" configuration.
#### General Make Control Variables
@@ -818,7 +788,7 @@ configuration, as opposed to the "configure time" configuration.
#### Test Make Control Variables
These make control variables only make sense when running tests. Please see
**Testing the JDK** ([html](testing.html), [markdown](testing.md)) for details.
[Testing the JDK](testing.html) for details.
* `TEST`
* `TEST_JOBS`
@@ -848,7 +818,7 @@ containing `lib/jtreg.jar` etc.
The [Adoption Group](https://wiki.openjdk.java.net/display/Adoption) provides
recent builds of jtreg [here](
https://ci.adoptopenjdk.net/view/Dependencies/job/dependency_pipeline/lastSuccessfulBuild/artifact/jtreg/).
https://ci.adoptopenjdk.net/view/Dependencies/job/jtreg/lastSuccessfulBuild/artifact).
Download the latest `.tar.gz` file, unpack it, and point `--with-jtreg` to the
`jtreg` directory that you just unpacked.
@@ -865,8 +835,8 @@ To execute the most basic tests (tier 1), use:
make run-test-tier1
```
For more details on how to run tests, please see **Testing the JDK**
([html](testing.html), [markdown](testing.md)).
For more details on how to run tests, please see the [Testing
the JDK](testing.html) document.
## Cross-compiling
@@ -1090,7 +1060,7 @@ Note that X11 is needed even if you only want to build a headless JDK.
* If the X11 libraries are not properly detected by `configure`, you can
point them out by `--with-x`.
### Cross compiling with Debian sysroots
### Creating And Using Sysroots With qemu-deboostrap
Fortunately, you can create sysroots for foreign architectures with tools
provided by your OS. On Debian/Ubuntu systems, one could use `qemu-deboostrap` to
@@ -1102,67 +1072,38 @@ for foreign architectures with native compilation speed.
For example, cross-compiling to AArch64 from x86_64 could be done like this:
* Install cross-compiler on the *build* system:
```
apt install g++-aarch64-linux-gnu gcc-aarch64-linux-gnu
```
```
apt install g++-aarch64-linux-gnu gcc-aarch64-linux-gnu
```
* Create chroot on the *build* system, configuring it for *target* system:
```
sudo qemu-debootstrap \
--arch=arm64 \
--verbose \
--include=fakeroot,symlinks,build-essential,libx11-dev,libxext-dev,libxrender-dev,libxrandr-dev,libxtst-dev,libxt-dev,libcups2-dev,libfontconfig1-dev,libasound2-dev,libfreetype6-dev,libpng-dev,libffi-dev \
--resolve-deps \
buster \
~/sysroot-arm64 \
http://httpredir.debian.org/debian/
```
* Make sure the symlinks inside the newly created chroot point to proper locations:
```
sudo chroot ~/sysroot-arm64 symlinks -cr .
```
```
sudo qemu-debootstrap --arch=arm64 --verbose \
--include=fakeroot,build-essential,libx11-dev,libxext-dev,libxrender-dev,libxrandr-dev,libxtst-dev,libxt-dev,libcups2-dev,libfontconfig1-dev,libasound2-dev,libfreetype6-dev,libpng12-dev \
--resolve-deps jessie /chroots/arm64 http://httpredir.debian.org/debian/
```
* Configure and build with newly created chroot as sysroot/toolchain-path:
```
sh ./configure \
--openjdk-target=aarch64-linux-gnu \
--with-sysroot=~/sysroot-arm64
make images
ls build/linux-aarch64-server-release/
```
```
CC=aarch64-linux-gnu-gcc CXX=aarch64-linux-gnu-g++ sh ./configure --openjdk-target=aarch64-linux-gnu --with-sysroot=/chroots/arm64/ --with-toolchain-path=/chroots/arm64/
make images
ls build/linux-aarch64-normal-server-release/
```
The build does not create new files in that chroot, so it can be reused for multiple builds
without additional cleanup.
The build system should automatically detect the toolchain paths and dependencies, but sometimes
it might require a little nudge with:
* Native compilers: override `CC` or `CXX` for `./configure`
* Freetype lib location: override `--with-freetype-lib`, for example `${sysroot}/usr/lib/${target}/`
* Freetype includes location: override `--with-freetype-include` for example `${sysroot}/usr/include/freetype2/`
* X11 libraries location: override `--x-libraries`, for example `${sysroot}/usr/lib/${target}/`
Architectures that are known to successfully cross-compile like this are:
Target Debian tree Debian arch `--openjdk-target=...` `--with-jvm-variants=...`
------------ ------------ ------------- ------------------------ --------------
x86 buster i386 i386-linux-gnu (all)
arm buster armhf arm-linux-gnueabihf (all)
aarch64 buster arm64 aarch64-linux-gnu (all)
ppc64le buster ppc64el powerpc64le-linux-gnu (all)
s390x buster s390x s390x-linux-gnu (all)
mipsle buster mipsel mipsel-linux-gnu zero
mips64le buster mips64el mips64el-linux-gnueabi64 zero
armel buster arm arm-linux-gnueabi zero
ppc sid powerpc powerpc-linux-gnu zero
ppc64be sid ppc64 powerpc64-linux-gnu (all)
m68k sid m68k m68k-linux-gnu zero
alpha sid alpha alpha-linux-gnu zero
sh4 sid sh4 sh4-linux-gnu zero
Target `CC` `CXX` `--arch=...` `--openjdk-target=...`
------------ ------------------------- --------------------------- ------------- -----------------------
x86 default default i386 i386-linux-gnu
armhf gcc-arm-linux-gnueabihf g++-arm-linux-gnueabihf armhf arm-linux-gnueabihf
aarch64 gcc-aarch64-linux-gnu g++-aarch64-linux-gnu arm64 aarch64-linux-gnu
ppc64el gcc-powerpc64le-linux-gnu g++-powerpc64le-linux-gnu ppc64el powerpc64le-linux-gnu
s390x gcc-s390x-linux-gnu g++-s390x-linux-gnu s390x s390x-linux-gnu
Additional architectures might be supported by Debian/Ubuntu Ports.
### Building for ARM/aarch64
@@ -1172,25 +1113,6 @@ available using `--with-abi-profile`: arm-vfp-sflt, arm-vfp-hflt, arm-sflt,
armv5-vfp-sflt, armv6-vfp-hflt. Note that soft-float ABIs are no longer
properly supported by the JDK.
### Building for musl
Just like it's possible to cross-compile for a different CPU, it's possible to
cross-compile for musl libc on a glibc-based *build* system.
A devkit suitable for most target CPU architectures can be obtained from
[musl.cc](https://musl.cc). After installing the required packages in the
sysroot, configure the build with `--openjdk-target`:
```
sh ./configure --with-jvm-variants=server \
--with-boot-jdk=$BOOT_JDK \
--with-build-jdk=$BUILD_JDK \
--openjdk-target=x86_64-unknown-linux-musl \
--with-devkit=$DEVKIT \
--with-sysroot=$SYSROOT
```
and run `make` normally.
### Verifying the Build
The build will end up in a directory named like
@@ -1315,14 +1237,14 @@ ERROR: Build failed for target 'hotspot' in configuration 'linux-x64' (exit code
=== Output from failing command(s) repeated here ===
* For target hotspot_variant-server_libjvm_objs_psMemoryPool.o:
/localhome/git/jdk-sandbox/hotspot/src/share/vm/services/psMemoryPool.cpp:1:1: error: 'failhere' does not name a type
/localhome/hg/jdk9-sandbox/hotspot/src/share/vm/services/psMemoryPool.cpp:1:1: error: 'failhere' does not name a type
... (rest of output omitted)
* All command lines available in /localhome/git/jdk-sandbox/build/linux-x64/make-support/failure-logs.
* All command lines available in /localhome/hg/jdk9-sandbox/build/linux-x64/make-support/failure-logs.
=== End of repeated output ===
=== Make failed targets repeated here ===
lib/CompileJvm.gmk:207: recipe for target '/localhome/git/jdk-sandbox/build/linux-x64/hotspot/variant-server/libjvm/objs/psMemoryPool.o' failed
lib/CompileJvm.gmk:207: recipe for target '/localhome/hg/jdk9-sandbox/build/linux-x64/hotspot/variant-server/libjvm/objs/psMemoryPool.o' failed
make/Main.gmk:263: recipe for target 'hotspot-server-libs' failed
=== End of repeated output ===
@@ -1420,7 +1342,7 @@ order. Most issues will be solved at step 1 or 2.
1. Make sure your repository is up-to-date
Run `git pull origin master` to make sure you have the latest changes.
Run `hg pull -u` to make sure you have the latest changes.
2. Clean build results
@@ -1445,13 +1367,13 @@ order. Most issues will be solved at step 1 or 2.
make
```
4. Re-clone the Git repository
4. Re-clone the Mercurial repository
Sometimes the Git repository gets in a state that causes the product
Sometimes the Mercurial repository gets in a state that causes the product
to be un-buildable. In such a case, the simplest solution is often the
"sledgehammer approach": delete the entire repository, and re-clone it.
If you have local changes, save them first to a different location using
`git format-patch`.
`hg export`.
### Specific Build Issues
@@ -1483,15 +1405,6 @@ This can be a sign of a Cygwin problem. See the information about solving
problems in the [Cygwin](#cygwin) section. Rebooting the computer might help
temporarily.
#### Spaces in Path
On Windows, when configuring, `fixpath.sh` may report that some directory
names have spaces. Usually, it assumes those directories have
[short paths](https://docs.microsoft.com/en-us/windows-server/administration/windows-commands/fsutil-8dot3name).
You can run `fsutil file setshortname` in `cmd` on certain directories, such as
`Microsoft Visual Studio` or `Windows Kits`, to assign arbitrary short paths so
`configure` can access them.
### Getting Help
If none of the suggestions in this document helps you, or if you find what you
@@ -1505,6 +1418,33 @@ contact the Adoption Group. See the section on [Contributing to OpenJDK](
## Hints and Suggestions for Advanced Users
### Setting Up a Repository for Pushing Changes (defpath)
To help you prepare a proper push path for a Mercurial repository, there exists
a useful tool known as [defpath](
http://openjdk.java.net/projects/code-tools/defpath). It will help you setup a
proper push path for pushing changes to the JDK.
Install the extension by cloning
`http://hg.openjdk.java.net/code-tools/defpath` and updating your `.hgrc` file.
Here's one way to do this:
```
cd ~
mkdir hg-ext
cd hg-ext
hg clone http://hg.openjdk.java.net/code-tools/defpath
cat << EOT >> ~/.hgrc
[extensions]
defpath=~/hg-ext/defpath/defpath.py
EOT
```
You can now setup a proper push path using:
```
hg defpath -d -u <your OpenJDK username>
```
### Bash Completion
The `configure` and `make` commands tries to play nice with bash command-line
@@ -1570,8 +1510,8 @@ update. This might speed up the build, but comes at the risk of an incorrect
build result. This is only recommended if you know what you're doing.
From time to time, you will also need to modify the command line to `configure`
due to changes. Use `make print-configuration` to show the command line used
for your current configuration.
due to changes. Use `make print-configure` to show the command line used for
your current configuration.
### Using Fine-Grained Make Targets
@@ -1645,6 +1585,16 @@ instance, `make java.base JDK_FILTER=javax/crypto` (or, to combine methods,
`make java.base-java-only JDK_FILTER=javax/crypto`) will limit the compilation
to files in the `javax.crypto` package.
### Learn About Mercurial
To become an efficient JDK developer, it is recommended that you invest in
learning Mercurial properly. Here are some links that can get you started:
* [Mercurial for git users](http://www.mercurial-scm.org/wiki/GitConcepts)
* [The official Mercurial tutorial](http://www.mercurial-scm.org/wiki/Tutorial)
* [hg init](http://hginit.com/)
* [Mercurial: The Definitive Guide](http://hgbook.red-bean.com/read/)
## Understanding the Build System
This section will give you a more technical description on the details of the

View File

@@ -28,8 +28,7 @@
</ul></li>
<li><a href="#structure-and-formatting">Structure and Formatting</a><ul>
<li><a href="#factoring-and-class-design">Factoring and Class Design</a></li>
<li><a href="#source-files">Source Files</a></li>
<li><a href="#jtreg-tests">JTReg Tests</a></li>
<li><a href="#files">Files</a></li>
<li><a href="#naming">Naming</a></li>
<li><a href="#commenting">Commenting</a></li>
<li><a href="#macros">Macros</a></li>
@@ -49,7 +48,6 @@
<li><a href="#thread_local">thread_local</a></li>
<li><a href="#nullptr">nullptr</a></li>
<li><a href="#atomic">&lt;atomic&gt;</a></li>
<li><a href="#uniform-initialization">Uniform Initialization</a></li>
<li><a href="#additional-permitted-features">Additional Permitted Features</a></li>
<li><a href="#excluded-features">Excluded Features</a></li>
<li><a href="#undecided-features">Undecided Features</a></li>
@@ -89,26 +87,16 @@
<li><p>Don't use the Copy and Paste keys to replicate more than a couple lines of code. Name what you must repeat.</p></li>
<li><p>If a class needs a member function to change a user-visible attribute, the change should be done with a &quot;setter&quot; accessor matched to the simple &quot;getter&quot;.</p></li>
</ul>
<h3 id="source-files">Source Files</h3>
<h3 id="files">Files</h3>
<ul>
<li><p>All source files must have a globally unique basename. The build system depends on this uniqueness.</p></li>
<li><p>Do not put non-trivial function implementations in .hpp files. If the implementation depends on other .hpp files, put it in a .cpp or a .inline.hpp file.</p></li>
<li><p>.inline.hpp files should only be included in .cpp or .inline.hpp files.</p></li>
<li><p>All .inline.hpp files should include their corresponding .hpp file as the first include line. Declarations needed by other files should be put in the .hpp file, and not in the .inline.hpp file. This rule exists to resolve problems with circular dependencies between .inline.hpp files.</p></li>
<li><p>All .cpp files include precompiled.hpp as the first include line.</p></li>
<li><p>precompiled.hpp is just a build time optimization, so don't rely on it to resolve include problems.</p></li>
<li><p>Keep the include lines alphabetically sorted.</p></li>
<li><p>Put conditional inclusions (<code>#if ...</code>) at the end of the include list (see the sketch after this list).</p></li>
</ul>
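<p>For illustration only, a <code>.cpp</code> file following these rules might begin like this (the particular headers are just examples, not a prescription):</p>
<pre><code>#include &quot;precompiled.hpp&quot;             // always the first include in a .cpp file
#include &quot;gc/shared/barrierSet.hpp&quot;    // remaining includes sorted alphabetically
#include &quot;runtime/os.hpp&quot;
#include &quot;utilities/growableArray.hpp&quot;
#if INCLUDE_G1GC                             // conditional inclusions go last
#include &quot;gc/g1/g1CollectedHeap.hpp&quot;
#endif</code></pre>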
<h3 id="jtreg-tests">JTReg Tests</h3>
<ul>
<li><p>JTReg tests should have meaningful names.</p></li>
<li><p>JTReg tests associated with specific bugs should be tagged with the <code>@bug</code> keyword in the test description.</p></li>
<li><p>JTReg tests should be organized by component or feature under <code>test/</code>, in a directory hierarchy that generally follows that of the <code>src/</code> directory. There may be additional subdirectories to further categorize tests by feature. This structure makes it easy to run a collection of tests associated with a specific feature by specifying the associated directory as the source of the tests to run.</p>
<ul>
<li>Some (older) tests use the associated bug number in the directory name, the test name, or both. That naming style should no longer be used, with existing tests using that style being candidates for migration.</li>
</ul></li>
</ul>
<h3 id="naming">Naming</h3>
<ul>
<li><p>The length of a name may be correlated to the size of its scope. In particular, short names (even single letter names) may be fine in a small scope, but are usually inappropriate for larger scopes.</p></li>
@@ -277,17 +265,6 @@ while ( test_foo(args...) ) { // No, excess spaces around control</code></pre></
<p>Do not use facilities provided by the <code>&lt;atomic&gt;</code> header (<a href="http://www.open-std.org/jtc1/sc22/wg21/docs/papers/2007/n2427.html">n2427</a>), (<a href="http://www.open-std.org/jtc1/sc22/wg21/docs/papers/2008/n2752.htm">n2752</a>); instead, use the HotSpot <code>Atomic</code> class and related facilities.</p>
<p>Atomic operations in HotSpot code must have semantics which are consistent with those provided by the JDK's compilers for Java. There are platform-specific implementation choices that a C++ compiler might make or change that are outside the scope of the C++ Standard, and might differ from what the Java compilers implement.</p>
<p>In addition, HotSpot <code>Atomic</code> has a concept of &quot;conservative&quot; memory ordering, which may differ from (may be stronger than) sequentially consistent. There are algorithms in HotSpot that are believed to rely on that ordering.</p>
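<p>As a rough sketch only (the variable and function names are made up, and the exact <code>Atomic</code> signatures may differ between releases), a shared counter would go through <code>Atomic</code> rather than <code>&lt;atomic&gt;</code>:</p>
<pre><code>#include &quot;runtime/atomic.hpp&quot;

static volatile int _pending_requests = 0;   // hypothetical shared counter

void note_request() {
  Atomic::inc(&amp;_pending_requests);           // instead of std::atomic increment
}

int pending_requests() {
  return Atomic::load(&amp;_pending_requests);   // instead of std::atomic load
}</code></pre>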
<h3 id="uniform-initialization">Uniform Initialization</h3>
<p>The use of <em>uniform initialization</em> (<a href="http://www.open-std.org/jtc1/sc22/wg21/docs/papers/2008/n2672.htm">n2672</a>), also known as <em>brace initialization</em>, is permitted.</p>
<p>Some relevant sections from cppreference.com:</p>
<ul>
<li><a href="https://en.cppreference.com/w/cpp/language/initialization">initialization</a></li>
<li><a href="https://en.cppreference.com/w/cpp/language/value_initialization">value initialization</a></li>
<li><a href="https://en.cppreference.com/w/cpp/language/direct_initialization">direct initialization</a></li>
<li><a href="https://en.cppreference.com/w/cpp/language/list_initialization">list initialization</a></li>
<li><a href="https://en.cppreference.com/w/cpp/language/aggregate_initialization">aggregate initialization</a></li>
</ul>
<p>Although related, the use of <code>std::initializer_list</code> remains forbidden, as part of the avoidance of the C++ Standard Library in HotSpot code.</p>
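<p>A small illustrative example (the names are hypothetical) of the permitted forms, none of which involve <code>std::initializer_list</code>:</p>
<pre><code>struct Range {      // an aggregate
  int _start;
  int _end;
};

Range r{10, 20};    // aggregate initialization
int count{};        // value initialization: count == 0
void* cookie{};     // value initialization: cookie is a null pointer</code></pre>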
<h3 id="additional-permitted-features">Additional Permitted Features</h3>
<ul>
<li><p><code>constexpr</code> (<a href="http://www.open-std.org/jtc1/sc22/wg21/docs/papers/2007/n2235.pdf">n2235</a>) (<a href="https://isocpp.org/files/papers/N3652.html">n3652</a>)</p></li>
@@ -304,9 +281,7 @@ while ( test_foo(args...) ) { // No, excess spaces around control</code></pre></
<li><p>Defaulted and deleted functions (<a href="http://www.open-std.org/jtc1/sc22/wg21/docs/papers/2007/n2346.htm">n2346</a>)</p></li>
<li><p>Dynamic initialization and destruction with concurrency (<a href="http://www.open-std.org/jtc1/sc22/wg21/docs/papers/2008/n2660.htm">n2660</a>)</p></li>
<li><p><code>final</code> virtual specifiers for classes and virtual functions (<a href="http://www.open-std.org/JTC1/SC22/WG21/docs/papers/2009/n2928.htm">n2928</a>), (<a href="http://www.open-std.org/jtc1/sc22/wg21/docs/papers/2010/n3206.htm">n3206</a>), (<a href="http://www.open-std.org/jtc1/sc22/wg21/docs/papers/2011/n3272.htm">n3272</a>)</p></li>
<li><p><code>override</code> virtual specifiers for virtual functions (<a href="http://www.open-std.org/JTC1/SC22/WG21/docs/papers/2009/n2928.htm">n2928</a>), (<a href="http://www.open-std.org/jtc1/sc22/wg21/docs/papers/2010/n3206.htm">n3206</a>), (<a href="http://www.open-std.org/jtc1/sc22/wg21/docs/papers/2011/n3272.htm">n3272</a>)</p></li>
<li><p>Local and unnamed types as template parameters (<a href="http://www.open-std.org/jtc1/sc22/wg21/docs/papers/2008/n2657.htm">n2657</a>)</p></li>
<li><p>Range-based <code>for</code> loops (<a href="http://www.open-std.org/JTC1/SC22/WG21/docs/papers/2009/n2930.html">n2930</a>) (<a href="https://en.cppreference.com/w/cpp/language/range-for">range-for</a>)</p></li>
</ul>
<h3 id="excluded-features">Excluded Features</h3>
<ul>
@@ -332,6 +307,7 @@ while ( test_foo(args...) ) { // No, excess spaces around control</code></pre></
<h3 id="undecided-features">Undecided Features</h3>
<p>This list is incomplete; it serves to explicitly call out some features that have not yet been discussed.</p>
<ul>
<li><p><code>overrides</code> virtual specifiers for virtual functions (<a href="http://www.open-std.org/jtc1/sc22/wg21/docs/papers/2011/n3272.htm">n3272</a>)</p></li>
<li><p>Trailing return type syntax for functions (<a href="http://www.open-std.org/jtc1/sc22/wg21/docs/papers/2008/n2541.htm">n2541</a>)</p></li>
<li><p>Variable templates (<a href="https://isocpp.org/files/papers/N3651.pdf">n3651</a>)</p></li>
<li><p>Member initializers and aggregates (<a href="http://www.open-std.org/jtc1/sc22/wg21/docs/papers/2013/n3653.html">n3653</a>)</p></li>

View File

@@ -126,7 +126,7 @@ lines of code. Name what you must repeat.
change should be done with a "setter" accessor matched to the simple
"getter".
### Source Files
### Files
* All source files must have a globally unique basename. The build
system depends on this uniqueness.
@@ -138,11 +138,6 @@ a .inline.hpp file.
* .inline.hpp files should only be included in .cpp or .inline.hpp
files.
* All .inline.hpp files should include their corresponding .hpp file as
the first include line. Declarations needed by other files should be put
in the .hpp file, and not in the .inline.hpp file. This rule exists to
resolve problems with circular dependencies between .inline.hpp files.
* All .cpp files include precompiled.hpp as the first include line.
* precompiled.hpp is just a build time optimization, so don't rely on
@@ -152,24 +147,6 @@ it to resolve include problems.
* Put conditional inclusions (`#if ...`) at the end of the include list.
### JTReg Tests
* JTReg tests should have meaningful names.
* JTReg tests associated with specific bugs should be tagged with the
`@bug` keyword in the test description.
* JTReg tests should be organized by component or feature under
`test/`, in a directory hierarchy that generally follows that of the
`src/` directory. There may be additional subdirectories to further
categorize tests by feature. This structure makes it easy to run a
collection of tests associated with a specific feature by specifying
the associated directory as the source of the tests to run.
* Some (older) tests use the associated bug number in the directory
name, the test name, or both. That naming style should no longer be
used, with existing tests using that style being candidates for migration.
### Naming
* The length of a name may be correlated to the size of its scope. In
@@ -686,23 +663,6 @@ ordering, which may differ from (may be stronger than) sequentially
consistent. There are algorithms in HotSpot that are believed to rely
on that ordering.
### Uniform Initialization
The use of _uniform initialization_
([n2672](http://www.open-std.org/jtc1/sc22/wg21/docs/papers/2008/n2672.htm)),
also known as _brace initialization_, is permitted.
Some relevant sections from cppreference.com:
* [initialization](https://en.cppreference.com/w/cpp/language/initialization)
* [value initialization](https://en.cppreference.com/w/cpp/language/value_initialization)
* [direct initialization](https://en.cppreference.com/w/cpp/language/direct_initialization)
* [list initialization](https://en.cppreference.com/w/cpp/language/list_initialization)
* [aggregate initialization](https://en.cppreference.com/w/cpp/language/aggregate_initialization)
Although related, the use of `std::initializer_list` remains forbidden, as
part of the avoidance of the C++ Standard Library in HotSpot code.
### Additional Permitted Features
* `constexpr`
@@ -752,18 +712,9 @@ part of the avoidance of the C++ Standard Library in HotSpot code.
([n3206](http://www.open-std.org/jtc1/sc22/wg21/docs/papers/2010/n3206.htm)),
([n3272](http://www.open-std.org/jtc1/sc22/wg21/docs/papers/2011/n3272.htm))
* `override` virtual specifiers for virtual functions
([n2928](http://www.open-std.org/JTC1/SC22/WG21/docs/papers/2009/n2928.htm)),
([n3206](http://www.open-std.org/jtc1/sc22/wg21/docs/papers/2010/n3206.htm)),
([n3272](http://www.open-std.org/jtc1/sc22/wg21/docs/papers/2011/n3272.htm))
* Local and unnamed types as template parameters
([n2657](http://www.open-std.org/jtc1/sc22/wg21/docs/papers/2008/n2657.htm))
* Range-based `for` loops
([n2930](http://www.open-std.org/JTC1/SC22/WG21/docs/papers/2009/n2930.html))
([range-for](https://en.cppreference.com/w/cpp/language/range-for))
### Excluded Features
* New string and character literals
@@ -823,6 +774,9 @@ in HotSpot code because of the "no implicit boolean" guideline.)
This list is incomplete; it serves to explicitly call out some
features that have not yet been discussed.
* `overrides` virtual specifiers for virtual functions
([n3272](http://www.open-std.org/jtc1/sc22/wg21/docs/papers/2011/n3272.htm))
* Trailing return type syntax for functions
([n2541](http://www.open-std.org/jtc1/sc22/wg21/docs/papers/2008/n2541.htm))

View File

@@ -1,223 +0,0 @@
<!DOCTYPE html>
<html xmlns="http://www.w3.org/1999/xhtml" lang="" xml:lang="">
<head>
<meta charset="utf-8" />
<meta name="generator" content="pandoc" />
<meta name="viewport" content="width=device-width, initial-scale=1.0, user-scalable=yes" />
<title>Native/Unit Test Development Guidelines</title>
<style type="text/css">
code{white-space: pre-wrap;}
span.smallcaps{font-variant: small-caps;}
span.underline{text-decoration: underline;}
div.column{display: inline-block; vertical-align: top; width: 50%;}
</style>
<link rel="stylesheet" href="../make/data/docs-resources/resources/jdk-default.css" />
<!--[if lt IE 9]>
<script src="//cdnjs.cloudflare.com/ajax/libs/html5shiv/3.7.3/html5shiv-printshiv.min.js"></script>
<![endif]-->
</head>
<body>
<header id="title-block-header">
<h1 class="title">Native/Unit Test Development Guidelines</h1>
</header>
<nav id="TOC">
<ul>
<li><a href="#good-test-properties">Good test properties</a><ul>
<li><a href="#lightness">Lightness</a></li>
<li><a href="#isolation">Isolation</a></li>
<li><a href="#atomicity-and-self-containment">Atomicity and self-containment</a></li>
<li><a href="#repeatability">Repeatability</a></li>
<li><a href="#informativeness">Informativeness</a></li>
<li><a href="#testing-instead-of-visiting">Testing instead of visiting</a></li>
<li><a href="#nearness">Nearness</a></li>
</ul></li>
<li><a href="#asserts">Asserts</a><ul>
<li><a href="#several-checks">Several checks</a></li>
<li><a href="#first-parameter-is-expected-value">First parameter is expected value</a></li>
<li><a href="#floating-point-comparison">Floating-point comparison</a></li>
<li><a href="#c-string-comparison">C string comparison</a></li>
<li><a href="#error-messages">Error messages</a></li>
<li><a href="#uncluttered-output">Uncluttered output</a></li>
<li><a href="#failures-propagation">Failures propagation</a></li>
</ul></li>
<li><a href="#naming-and-grouping">Naming and Grouping</a><ul>
<li><a href="#test-group-names">Test group names</a></li>
<li><a href="#filename">Filename</a></li>
<li><a href="#file-location">File location</a></li>
<li><a href="#test-names">Test names</a></li>
<li><a href="#fixture-classes">Fixture classes</a></li>
<li><a href="#friend-classes">Friend classes</a></li>
<li><a href="#oscpu-specific-tests">OS/CPU specific tests</a></li>
</ul></li>
<li><a href="#miscellaneous">Miscellaneous</a><ul>
<li><a href="#hotspot-style">Hotspot style</a></li>
<li><a href="#codetest-metrics">Code/test metrics</a></li>
<li><a href="#access-to-non-public-members">Access to non-public members</a></li>
<li><a href="#death-tests">Death tests</a></li>
<li><a href="#external-flags">External flags</a></li>
<li><a href="#test-specific-flags">Test-specific flags</a></li>
<li><a href="#flag-restoring">Flag restoring</a></li>
<li><a href="#googletest-documentation">GoogleTest documentation</a></li>
</ul></li>
<li><a href="#todo">TODO</a></li>
</ul>
</nav>
<p>The purpose of these guidelines is to establish a shared vision of what kind of native tests we want, and how we want to develop them, for Hotspot using GoogleTest. Hence these guidelines include style items as well as test-approach items.</p>
<p>The first section of this document describes properties of good tests, which are common to almost all types of tests regardless of language, framework, etc. Further sections provide recommendations for achieving those properties, along with other HotSpot- and/or GoogleTest-specific guidelines.</p>
<h2 id="good-test-properties">Good test properties</h2>
<h3 id="lightness">Lightness</h3>
<p>Use the most lightweight type of tests.</p>
<p>In Hotspot, there are three different types of tests with respect to their dependency on a JVM; each successive level is slower than the previous one (see the sketch after this list):</p>
<ul>
<li><p><code>TEST</code> : a test does not depend on a JVM</p></li>
<li><p><code>TEST_VM</code> : a test depends on an initialized JVM, but is expected not to break the JVM, i.e. to leave it in a workable state.</p></li>
<li><p><code>TEST_OTHER_VM</code> : a test depends on a JVM and either requires a freshly initialized JVM or leaves the JVM in a non-workable state</p></li>
</ul>
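<p>A minimal sketch of the difference (the group and test names are made up, and the includes are abbreviated): a <code>TEST</code> needs no JVM at all, while a <code>TEST_VM</code> may query an already-initialized JVM but must leave it workable.</p>
<pre><code>#include &quot;unittest.hpp&quot;      // HotSpot gtest wrapper providing TEST/TEST_VM
#include &quot;runtime/os.hpp&quot;

// Does not depend on a JVM.
TEST(example, max2_picks_larger_value) {
  EXPECT_EQ(7, MAX2(3, 7));
}

// Depends on an initialized JVM, but leaves it in a workable state.
TEST_VM(example, page_size_is_positive) {
  EXPECT_GT(os::vm_page_size(), 0);
}</code></pre>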
<h3 id="isolation">Isolation</h3>
<p>Tests have to be isolated: they must not have visible side effects or influence the results of other tests.</p>
<p>The result of one test should not depend on test execution order or on other tests; otherwise it becomes almost impossible to find out why a test failed. Due to HotSpot specifics, it is not easy to get full isolation, e.g. an initialized JVM is shared between all <code>TEST_VM</code> tests, so if your test changes the JVM's state too drastically and does not change it back, you had better consider <code>TEST_OTHER_VM</code>.</p>
<h3 id="atomicity-and-self-containment">Atomicity and self-containment</h3>
<p>Tests should be <em>atomic</em> and <em>self-contained</em> at the same time.</p>
<p>One test should check a particular part of a class, subsystem, functionality, etc. Then it is quite easy to determine which parts of a product are broken based on test failures. On the other hand, a test should test that part more-or-less entirely, because when one sees a test <code>FooTest::bar</code>, they assume all aspects of bar from <code>Foo</code> are tested.</p>
<p>However, it is impossible to cover all aspects even of a method, not to mention a subsystem. In such cases, it is recommended to have several tests, one for each aspect of the thing under test. For example, one test checks how <code>Foo::bar</code> works if an argument is <code>null</code>, another test checks how it works if the argument is acceptable but <code>Foo</code> is not in the right state to accept it, and so on. This helps not only to make tests atomic and self-contained, but also makes test names self-descriptive (discussed in more detail in <a href="#test-names">Test names</a>).</p>
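<p>Sketched with the hypothetical <code>Foo::bar</code> above (all names, helpers and return values are made up), that could look like:</p>
<pre><code>TEST(Foo, bar_returns_0_if_arg_is_null) {
  Foo foo = make_ready_foo();              // hypothetical helper
  EXPECT_EQ(0, foo.bar(NULL));
}

TEST(Foo, bar_returns_error_if_foo_is_not_ready) {
  Foo foo = make_unready_foo();            // hypothetical helper
  EXPECT_EQ(FOO_NOT_READY, foo.bar(some_valid_arg()));
}</code></pre>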
<h3 id="repeatability">Repeatability</h3>
<p>Tests have to be repeatable.</p>
<p>Reproducibility is crucial for a test. No one likes sporadic test failures: they are hard to investigate, to fix, and to verify the fix for.</p>
<p>In some cases, it is quite hard to write a 100% repeatable test, since besides the test itself there can be other moving parts, e.g. in the case of <code>TEST_VM</code> there are several concurrently running threads. Despite this, we should try to make a test as reproducible as possible.</p>
<h3 id="informativeness">Informativeness</h3>
<p>In case of a failure, a test should be as <em>informative</em> as possible.</p>
<p>Having more information about a test failure than just the compared values can be very useful for troubleshooting; it can reduce or even completely eliminate hours of debugging. This is even more important for failures that are not 100% reproducible.</p>
<p>In pursuit of this property, one can easily make a test too verbose, so that it becomes really hard to find the useful information in an ocean of useless information. Hence one should think not only about how to provide <a href="#error-messages">good information</a>, but also about <a href="#uncluttered-output">when to do it</a>.</p>
<h3 id="testing-instead-of-visiting">Testing instead of visiting</h3>
<p>Tests should <em>test</em>.</p>
<p>It is not enough just to &quot;visit&quot; some code; a test should check that the code does what it is supposed to do: compare return values with expected values, check that the desired side effects happen and undesired ones do not, and so on. In other words, a test should contain at least one GoogleTest assertion and should not rely on JVM asserts.</p>
<p>Generally speaking, to write a good test, one should create a model of the system under test and a model of possible bugs (or of the bugs one wants to find), and design tests using those models.</p>
<h3 id="nearness">Nearness</h3>
<p>Prefer having checks inside test code.</p>
<p>Having test logic outside the test, e.g. in a verification method, or depending on asserts in product code, not only contradicts several of the items above but also decreases test readability and stability. It is much easier to understand what a test is testing when all the testing logic is located inside the test or nearby in shared test libraries. As a rule of thumb, the closer a check is to the test, the better.</p>
<h2 id="asserts">Asserts</h2>
<h3 id="several-checks">Several checks</h3>
<p>Prefer <code>EXPECT</code> over <code>ASSERT</code> if possible.</p>
<p>This is related to the <a href="#informativeness">informativeness</a> property of tests: information from the other checks can help to better localize a defect's root cause. One should use <code>ASSERT</code> if it is impossible, or does not make much sense, to continue test execution. Later in the text, <code>EXPECT</code> forms will be used to refer to both <code>ASSERT</code> and <code>EXPECT</code>.</p>
<p>When it is possible to make several different checks, but impossible to continue test execution if at least one check fails, you can use the <code>::testing::Test::HasNonfatalFailure()</code> function. The recommended way to express that is <code>ASSERT_FALSE(::testing::Test::HasNonfatalFailure())</code>. Besides making it clear why the test is aborted, it also allows you to provide more information about the failure.</p>
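<p>A hypothetical sketch of this pattern (the <code>Header</code> class, constants and helpers are made up): several non-fatal checks first, then an abort if continuing would be meaningless.</p>
<pre><code>TEST(Header, parse_accepts_valid_input) {
  Header h = Header::parse(valid_bytes());   // hypothetical helpers
  EXPECT_EQ(MAGIC, h.magic());               // non-fatal checks
  EXPECT_EQ(CURRENT_VERSION, h.version());
  // If the header is already broken, checking the payload makes no sense.
  ASSERT_FALSE(::testing::Test::HasNonfatalFailure())
      &lt;&lt; &quot;malformed header, skipping payload checks&quot;;
  EXPECT_EQ(16u, h.payload_size());
}</code></pre>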
<h3 id="first-parameter-is-expected-value">First parameter is expected value</h3>
<p>In all equality assertions, expected values should be passed as the first parameter.</p>
<p>This convention is adopted by GoogleTest, and there is a slight difference in how GoogleTest treats the two parameters, the most important one being <code>null</code> detection. For various reasons, <code>null</code> detection is enabled only for the first parameter; that is to say, <code>EXPECT_EQ(NULL, object)</code> checks that object is <code>null</code>, while <code>EXPECT_EQ(object, NULL)</code> checks that object equals <code>NULL</code>. GoogleTest is very strict regarding the types of compared values, so the latter generates a compile-time error.</p>
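<p>For example (with a hypothetical <code>make_foo()</code> factory), the expected value is always the first argument:</p>
<pre><code>Foo* foo = make_foo();             // hypothetical factory
EXPECT_EQ(NULL, foo-&gt;parent());    // null detection: checks parent() is null
EXPECT_EQ(42, foo-&gt;id());          // expected value first, actual value second</code></pre>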
<h3 id="floating-point-comparison">Floating-point comparison</h3>
<p>Use floating-point special macros to compare <code>float/double</code> values.</p>
<p>Because of floating-point number representation and round-off errors, regular equality comparison will not return true in most cases. There are special <code>EXPECT_FLOAT_EQ/EXPECT_DOUBLE_EQ</code> assertions which check that the distance between the compared values is not more than 4 ULPs; there is also <code>EXPECT_NEAR(v1, v2, eps)</code>, which checks that the absolute value of the difference between <code>v1</code> and <code>v2</code> is not greater than <code>eps</code>.</p>
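<p>A minimal example of the difference:</p>
<pre><code>double d = 0.1 + 0.2;
EXPECT_DOUBLE_EQ(0.3, d);     // passes: the values are within 4 ULPs
EXPECT_NEAR(0.3, d, 1e-9);    // passes: the difference is not greater than 1e-9
// EXPECT_EQ(0.3, d) fails due to round-off error.</code></pre>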
<h3 id="c-string-comparison">C string comparison</h3>
<p>Use string special macros for C strings comparisons.</p>
<p><code>EXPECT_EQ</code> just compares pointer values, which is rarely what one wants when comparing C strings. GoogleTest provides the <code>EXPECT_STREQ</code> and <code>EXPECT_STRNE</code> macros to compare C string contents. There are also case-insensitive versions, <code>EXPECT_STRCASEEQ</code> and <code>EXPECT_STRCASENE</code>.</p>
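<p>A short sketch (the <code>os_name()</code> helper is made up):</p>
<pre><code>const char* name = os_name();        // hypothetical helper returning &quot;Linux&quot;
EXPECT_STREQ(&quot;Linux&quot;, name);         // compares contents
EXPECT_STRCASEEQ(&quot;linux&quot;, name);     // case-insensitive content comparison
// EXPECT_EQ(&quot;Linux&quot;, name) would compare pointer values, not contents.</code></pre>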
<h3 id="error-messages">Error messages</h3>
<p>Provide informative, but not too verbose error messages.</p>
<p>All GoogleTest asserts print the compared expressions and their values, so there is no need to repeat them in error messages. Asserts print only the compared values; they do not print any interim variables, e.g. <code>ASSERT_TRUE((val1 == val2 &amp;&amp; isFail(foo(8))) || i == 18)</code> prints only one value. If you use complex predicates, please consider the <code>EXPECT_PRED*</code> or <code>EXPECT_PRED_FORMAT*</code> assertion families; they check that a predicate returns true/success and print out all parameter values.</p>
<p>However, in some cases the default information is not enough. A commonly used example is an assert inside a loop: GoogleTest will not print the iteration values (unless one is a parameter of the assert). Other demonstrative examples are printing an error code together with the corresponding error message, and printing internal state which might have an impact on the results. One should add this information to the assert message using the <code>&lt;&lt;</code> operator.</p>
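<p>A hypothetical example of an assert inside a loop, where the iteration index and some internal state are appended to the message with <code>&lt;&lt;</code> (<code>CACHE_SIZE</code>, <code>cache</code> and <code>expected</code> are made up):</p>
<pre><code>for (int i = 0; i &lt; CACHE_SIZE; i++) {
  EXPECT_EQ(expected[i], cache.get(i))
      &lt;&lt; &quot;mismatch at index &quot; &lt;&lt; i
      &lt;&lt; &quot;, cache state: &quot; &lt;&lt; cache.state();
}</code></pre>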
<h3 id="uncluttered-output">Uncluttered output</h3>
<p>Print information only if it is needed.</p>
<p>Overly verbose tests which print all information even when they pass are a very bad practice. They just pollute the output, so it becomes harder to find the useful information. In order not to print information until it is really needed, one should consider saving it to a temporary buffer and passing it to an assert. <a href="https://hg.openjdk.java.net/jdk/jdk/file/tip/test/hotspot/gtest/gc/shared/test_memset_with_concurrent_readers.cpp" class="uri">https://hg.openjdk.java.net/jdk/jdk/file/tip/test/hotspot/gtest/gc/shared/test_memset_with_concurrent_readers.cpp</a> has a good example of how to do that.</p>
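<p>One possible shape of that approach (a sketch only; the checks, the loop bound and the use of HotSpot's <code>stringStream</code> as the buffer are illustrative assumptions):</p>
<pre><code>stringStream st;                       // buffer details instead of printing eagerly
bool ok = true;
for (int i = 0; i &lt; N; i++) {          // N, check_slot(), slot() are made up
  ok = ok &amp;&amp; check_slot(i);
  st.print(&quot;slot %d = %d; &quot;, i, slot(i));
}
// The buffered details are printed only if the check fails.
EXPECT_TRUE(ok) &lt;&lt; st.as_string();</code></pre>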
<h3 id="failures-propagation">Failures propagation</h3>
<p>Wrap a subroutine call into <code>EXPECT_NO_FATAL_FAILURE</code> macro to propagate failures.</p>
<p><code>ASSERT</code> and <code>FAIL</code> abort only the current function, so if you have them in a subroutine, a test will not be aborted after the subroutine even if <code>ASSERT</code> or <code>FAIL</code> fails. You should wrap calls to such subroutines in the <code>ASSERT_NO_FATAL_FAILURE</code> macro to propagate fatal failures and abort the test. <code>(EXPECT|ASSERT)_NO_FATAL_FAILURE</code> can also be used to provide more information.</p>
<p>For obvious reasons, there are no <code>(EXPECT|ASSERT)_NO_NONFATAL_FAILURE</code> macros. However, if you need to check if a subroutine generated a nonfatal failure (failed an <code>EXPECT</code>), you can use the <code>::testing::Test::HasNonfatalFailure</code> function, or the <code>::testing::Test::HasFailure</code> function to check if a subroutine generated any failures; see <a href="#several-checks">Several checks</a>.</p>
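<p>A minimal sketch of the pattern (the <code>Foo</code> class is hypothetical):</p>
<pre><code>static void check_invariants(const Foo&amp; foo) {
  ASSERT_TRUE(foo.is_initialized());   // a fatal failure aborts only this function
  ASSERT_EQ(0, foo.error_count());
}

TEST(Foo, invariants_hold_after_reset) {
  Foo foo;
  foo.reset();
  // without the wrapper, the test body would continue even after a fatal failure above
  ASSERT_NO_FATAL_FAILURE(check_invariants(foo));
  EXPECT_EQ(0, foo.size());
}
</code></pre>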
<h2 id="naming-and-grouping">Naming and Grouping</h2>
<h3 id="test-group-names">Test group names</h3>
<p>Test group names should be in CamelCase, start and end with a letter. A test group should be named after the tested class, functionality, subsystem, etc.</p>
<p>This naming scheme helps to find and filter tests and simplifies test failure analysis. For example: class <code>Foo</code> - test group <code>Foo</code>; the compiler logging subsystem - test group <code>CompilerLogging</code>; G1 GC - test group <code>G1GC</code>; and so forth.</p>
<h3 id="filename">Filename</h3>
<p>A test file must have <code>test_</code> prefix and <code>.cpp</code> suffix.</p>
<p>Both are actually requirements from the current build system to recognize your tests.</p>
<h3 id="file-location">File location</h3>
<p>Test file location should reflect a location of the tested part of the product.</p>
<ul>
<li><p>All unit tests for a class from <code>foo/bar/baz.cpp</code> should be placed in <code>foo/bar/test_baz.cpp</code> in the <code>hotspot/test/native/</code> directory. Having all tests for a class in one file is a common practice for unit tests; it helps to see all existing tests at once and to share functions and/or resources without losing encapsulation.</p></li>
<li><p>For tests which test more than one class, directory hierarchy should be the same as product hierarchy, and file name should reflect the name of the tested subsystem/functionality. For example, if a sub-system under tests belongs to <code>gc/g1</code>, tests should be placed in <code>gc/g1</code> directory.</p></li>
</ul>
<p>Please note that the framework prepends the directory name to the test group name. For example, if <code>TEST(foo, check_this)</code> and <code>TEST(bar, check_that)</code> are defined in the <code>hotspot/test/native/gc/shared/test_foo.cpp</code> file, they will be reported as <code>gc/shared/foo::check_this</code> and <code>gc/shared/bar::check_that</code>.</p>
<h3 id="test-names">Test names</h3>
<p>Test names should be in small_snake_case, start and end with a letter. A test name should reflect what a test checks.</p>
<p>Such naming makes tests self-descriptive and helps a lot during the whole test life cycle: it is easy to do test planning and test inventory, to see what things are not tested, to review tests, to analyze test failures, to evolve a test, etc. For example, <code>foo_return_0_if_name_is_null</code> is better than <code>foo_sanity</code>, <code>foo_basic</code>, or just <code>foo</code>; <code>humongous_objects_can_not_be_moved_by_young_gc</code> is better than <code>ho_young_gc</code>.</p>
<p>Strictly speaking, using underscores is against the GoogleTest project convention, because it can lead to illegal identifiers; however, that rule is stricter than necessary. Restricting underscores to test names only and prohibiting test names from starting or ending with an underscore is enough to be safe.</p>
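<p>For example (a sketch; the test bodies are omitted):</p>
<pre><code>// group: CamelCase, named after the tested class or subsystem
// name: small_snake_case, describing what is checked
TEST(G1GC, humongous_objects_can_not_be_moved_by_young_gc) { /* ... */ }
TEST(Foo, bar_returns_0_if_name_is_null) { /* ... */ }
</code></pre>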
<h3 id="fixture-classes">Fixture classes</h3>
<p>Fixture classes should be named after tested classes, subsystems, etc (follow <a href="#test-group-names">Test group names rule</a>) and have <code>Test</code> suffix to prevent class name conflicts.</p>
<h3 id="friend-classes">Friend classes</h3>
<p>All test purpose friends should have either <code>Test</code> or <code>Testable</code> suffix.</p>
<p>It greatly simplifies understanding of the friendship's purpose and allows statically checking that private members are not exposed unexpectedly. Having <code>FooTest</code> as a friend of <code>Foo</code> without any comment will be understood as a necessary evil to get testability.</p>
<h3 id="oscpu-specific-tests">OS/CPU specific tests</h3>
<p>Guard OS/CPU specific tests by <code>#ifdef</code> and have OS/CPU name in filename.</p>
<p>For the time being, we do not support separate directories for OS-, CPU-, or OS-CPU-specific tests; if we end up with lots of such tests, we will change the directory layout and build system to support that in the same way it is done in hotspot.</p>
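<p>A minimal sketch, assuming a Linux-only check in a file whose name carries the OS name (the test itself is illustrative only):</p>
<pre><code>// in test_os_linux.cpp
#ifdef LINUX
TEST_VM(OsLinux, active_processor_count_is_positive) {
  EXPECT_GT(os::active_processor_count(), 0);
}
#endif
</code></pre>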
<h2 id="miscellaneous">Miscellaneous</h2>
<h3 id="hotspot-style">Hotspot style</h3>
<p>Abide by the norms and rules accepted in the Hotspot style guide.</p>
<p>Tests are a part of Hotspot, so everything we use for Hotspot should, where applicable, be used for tests as well. These guidelines cover only test-specific things.</p>
<h3 id="codetest-metrics">Code/test metrics</h3>
<p>Coverage information and other code/test metrics are quite useful to decide what tests should be written, what tests should be improved and what can be removed.</p>
<p>For unit tests, a widely used and well-known coverage metric is branch coverage, which provides good test quality with a relatively easy test development process. For other levels of testing, branch coverage is not as good, and one should consider other metrics, e.g. transaction flow coverage or data flow coverage.</p>
<h3 id="access-to-non-public-members">Access to non-public members</h3>
<p>Use explicit friend class to get access to non-public members.</p>
<p>We do not use the GoogleTest macro to declare a friendship relation, because, from our point of view, it is less clear than an explicit declaration.</p>
<p>Declaring a test fixture class as a friend class of the tested class is the easiest and the clearest way to get access. However, it has some disadvantages; here are some of them:</p>
<ul>
<li>Each test has to be declared as a friend</li>
<li>Subclasses do not inherit the friendship relation</li>
</ul>
<p>In other words, it is harder to share code between tests. Hence, if you want to share code or expect it to be useful in other tests, you should consider making the members in the tested class protected and introducing a shared test-only class which exposes those members via public functions, or even making the members publicly accessible right away in the product class. If changing member visibility is not an option, one can create a friend class which exposes the members.</p>
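<p>A minimal sketch of such a friend class (all names are hypothetical):</p>
<pre><code>// in the product header
class Foo {
  friend class FooTestable;   // test-only friend
 private:
  int _cache_size = 0;
};

// in test code
class FooTestable {
 public:
  static int cache_size(const Foo&amp; foo) { return foo._cache_size; }
};

TEST(Foo, default_cache_size_is_zero) {
  Foo foo;
  EXPECT_EQ(0, FooTestable::cache_size(foo));
}
</code></pre>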
<h3 id="death-tests">Death tests</h3>
<p>You can not use death tests inside <code>TEST_OTHER_VM</code> and <code>TEST_VM_ASSERT*</code>.</p>
<p>We tried to make the Hotspot-GoogleTest integration as transparent as possible; however, due to the current implementation of <code>TEST_OTHER_VM</code> and <code>TEST_VM_ASSERT*</code> tests, you cannot use death test functionality in them. These tests are implemented as GoogleTest death tests, and GoogleTest does not allow a death test inside another death test.</p>
<h3 id="external-flags">External flags</h3>
<p>Passing external flags to a tested JVM is not supported.</p>
<p>The rationale for this design decision is to simplify both the tests and the test framework, and to avoid failures related to incompatible flag combinations until there is a good solution for that. However, if one wants to test a JVM with a specific flag combination, the <code>_JAVA_OPTIONS</code> environment variable can be used to do that. Flags from <code>_JAVA_OPTIONS</code> will be used in <code>TEST_VM</code>, <code>TEST_OTHER_VM</code> and <code>TEST_VM_ASSERT*</code> tests.</p>
<h3 id="test-specific-flags">Test-specific flags</h3>
<p>Passing flags to a tested JVM in <code>TEST_OTHER_VM</code> and <code>TEST_VM_ASSERT*</code> should be possible, but is not implemented yet.</p>
<p>A facility to pass test-specific flags is needed for system, regression, or other types of tests which require a fully initialized JVM in a particular configuration, e.g. with Serial GC selected. There is no support for such tests now; however, there is a plan to add it in upcoming releases.</p>
<p>For now, if a test depends on flag values, it should have <code>if (!&lt;flag&gt;) { return }</code> guards at the very beginning and a <code>@requires</code> comment, similar to the jtreg <code>@requires</code> directive, right before the test macro. <a href="https://hg.openjdk.java.net/jdk/jdk/file/tip/test/hotspot/gtest/gc/g1/test_g1IHOPControl.cpp" class="uri">https://hg.openjdk.java.net/jdk/jdk/file/tip/test/hotspot/gtest/gc/g1/test_g1IHOPControl.cpp</a> has an example of this temporary workaround. It is important to follow that pattern, as it allows us to easily find all such tests and update them as soon as the flag-passing facility is implemented.</p>
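<p>A sketch of the workaround (the test name and body are illustrative only; <code>UseG1GC</code> is an existing flag):</p>
<pre><code>// @requires UseG1GC
TEST_VM(G1GC, region_related_invariant_holds) {
  if (!UseG1GC) {
    return;
  }
  // ... checks that only make sense when G1 is selected ...
}
</code></pre>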
<p>In the long term, we expect jtreg to support GoogleTest tests as first-class citizens, that is to say, jtreg will parse <code>@requires</code> comments and filter out inapplicable tests.</p>
<h3 id="flag-restoring">Flag restoring</h3>
<p>Restore changed flags.</p>
<p>It is quite common for tests to configure the JVM in a certain way by changing flag values. GoogleTest provides two ways to set up the environment before a test and restore it afterward: using either a constructor and destructor or the <code>SetUp</code> and <code>TearDown</code> functions. Both ways require using a test fixture class, which is sometimes too wordy. Simpler facilities like the <code>FLAG_GUARD</code> macro or the <code>*FlagSetting</code> classes can be used in such cases to set and restore values, as in the sketch after the caveats below.</p>
<p>Caveats:</p>
<ul>
<li><p>Changing a flag's value could break invariants between flag values and hence could lead to an unexpected/unsupported JVM state.</p></li>
<li><p><code>FLAG_SET_*</code> macros can change more than one flag (in order to maintain invariants), so it is hard to predict which flags will be changed, and that makes restoring all changed flags a nontrivial task. Thus, if one uses <code>FLAG_SET_*</code> macros, they should use the <code>TEST_OTHER_VM</code> test type.</p></li>
</ul>
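<p>A rough sketch of the intended usage, assuming <code>FLAG_GUARD</code> records the current flag value and restores it when the guard goes out of scope (<code>SomeBoolFlag</code> is a placeholder, not a real flag):</p>
<pre><code>TEST_VM(FlagRestoring, behavior_with_flag_enabled) {
  FLAG_GUARD(SomeBoolFlag);   // remember the current value
  SomeBoolFlag = true;        // change it for this test only
  // ... assertions that rely on the changed value ...
}                             // the guard restores SomeBoolFlag here
</code></pre>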
<h3 id="googletest-documentation">GoogleTest documentation</h3>
<p>In case you have any questions regarding GoogleTest itself, its asserts, test declaration macros, other macros, etc, please consult its documentation.</p>
<h2 id="todo">TODO</h2>
<p>Although this document provides guidelines on the most important parts of test development using GTest, it still misses a few items:</p>
<ul>
<li><p>Examples, esp for <a href="#access-to-non-public-members">access to non-public members</a></p></li>
<li>test types: purpose, drawbacks, limitation
<ul>
<li><code>TEST_VM</code></li>
<li><code>TEST_VM_F</code></li>
<li><code>TEST_OTHER_VM</code></li>
<li><code>TEST_VM_ASSERT</code></li>
<li><code>TEST_VM_ASSERT_MSG</code></li>
</ul></li>
<li>Miscellaneous
<ul>
<li>Test libraries
<ul>
<li>where to place</li>
<li>how to write</li>
<li>how to use</li>
</ul></li>
<li>test your tests
<ul>
<li>how to run tests in random order</li>
<li>how to run only specific tests</li>
<li>how to run each test separately</li>
<li>check that a test can find bugs it is supposed to by introducing them</li>
</ul></li>
<li>mocks/stubs/dependency injection</li>
<li>setUp/tearDown
<ul>
<li>vs c-tor/d-tor</li>
<li>empty test to test them</li>
</ul></li>
<li>internal (declared in .cpp) struct/classes</li>
</ul></li>
</ul>
</body>
</html>

View File

@@ -1,451 +0,0 @@
% Native/Unit Test Development Guidelines
The purpose of these guidelines is to establish a shared vision of
what kinds of native tests we want for Hotspot and how we want to
develop them using GoogleTest. Hence these guidelines include style
items as well as test approach items.
The first section of this document describes properties of good tests
which are common to almost all types of tests regardless of language,
framework, etc. Further sections provide recommendations for achieving
those properties as well as other HotSpot and/or GoogleTest specific
guidelines.
## Good test properties
### Lightness
Use the most lightweight type of tests.
In Hotspot, there are 3 different types of tests regarding their
dependency on a JVM; each next level is slower than the previous one
(all three forms are sketched after this list):
* `TEST` : a test does not depend on a JVM
* `TEST_VM` : a test depends on an initialized JVM, but is supposed
not to break it, i.e. it leaves the JVM in a workable state.
* `TEST_OTHER_VM` : a test depends on a JVM and requires a freshly
initialized JVM or leaves the JVM in a non-workable state
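A minimal sketch of the three declaration forms (the checks and names
are illustrative only):
```c++
// no JVM needed: pure logic and utilities
TEST(Utilities, max_then_min_is_clamped) {
  EXPECT_EQ(5, MIN2(MAX2(5, 0), 10));
}
// needs an initialized JVM, but leaves it in a workable state
TEST_VM(Os, vm_page_size_is_positive) {
  EXPECT_GT(os::vm_page_size(), 0);
}
// needs a fresh JVM, or may leave the JVM in a non-workable state
TEST_OTHER_VM(Runtime, example_that_changes_global_state) {
  // ...
}
```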
### Isolation
Tests have to be isolated: not to have visible side-effects,
influences on other tests results.
Results of one test should not depend on test execution order or on
other tests; otherwise it becomes almost impossible to find out why a
test failed. Due to Hotspot specifics, it is not so easy to get full
isolation, e.g. we share an initialized JVM between all `TEST_VM` tests,
so if your test changes the JVM's state too drastically and does not
change it back, you had better consider `TEST_OTHER_VM`.
### Atomicity and self-containment
Tests should be *atomic* and *self-contained* at the same time.
One test should check a particular part of a class, subsystem,
functionality, etc. Then it is quite easy to determine what parts of a
product are broken based on test failures. On the other hand, a test
should test that part more-or-less entirely, because when one sees a
test `FooTest::bar`, they assume all aspects of bar from `Foo` are tested.
However, it is impossible to cover all aspects even of a method, not
to mention a subsystem. In such cases, it is recommended to have
several tests, one for each aspect of the thing under test. For
example, one test checks how `Foo::bar` works if an argument is `null`,
another test checks how it works if an argument is acceptable but `Foo`
is not in the right state to accept it, and so on. This not only helps
to make tests atomic and self-contained but also makes test names
self-descriptive (discussed in more detail in [Test names](#test-names)).
### Repeatability
Tests have to be repeatable.
Reproducibility is crucial for a test. No one likes sporadic test
failures: they are hard to investigate, fix, and verify.
In some cases, it is quite hard to write a 100% repeatable test, since
besides a test there can be other moving parts, e.g. in case of
`TEST_VM` there are several concurrently running threads. Despite this,
we should try to make a test as reproducible as possible.
### Informativeness
In case of a failure, a test should be as *informative* as possible.
Having more information about a test failure than just the compared
values can be very useful for failure troubleshooting; it can reduce or
even completely eliminate hours of debugging. This is even more
important for failures which are not 100% reproducible.
While achieving this property, one can easily make a test too verbose,
so that it becomes really hard to find the useful information in an
ocean of useless information. Hence one should think not only about how
to provide [good information](#error-messages), but also about
[when to do it](#uncluttered-output).
### Testing instead of visiting
Tests should *test*.
It is not enough just to "visit" some code; a test should check that
the code does what it has to do: compare return values with expected
values, check that desired side effects happen and undesired ones do
not, and so on. In other words, a test should contain at least one
GoogleTest assertion and not rely on JVM asserts.
Generally speaking, to write a good test, one should create a model of
the system under test and a model of possible bugs (or bugs which one
wants to find) and design tests using those models.
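For example, a sketch of the difference (the `Foo` API is hypothetical):
```c++
// "visiting": exercises the code but proves nothing
TEST(Foo, bar_visited) {
  Foo().bar(17);
}
// testing: checks the return value and a side effect
TEST(Foo, bar_returns_id_and_records_call) {
  Foo foo;
  EXPECT_EQ(17, foo.bar(17));
  EXPECT_EQ(1, foo.call_count());
}
```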
### Nearness
Prefer having checks inside test code.
Having test logic outside the test (e.g. in a verification method) or
depending on asserts in product code not only contradicts several of
the items above, but also decreases test readability and stability. It
is much easier to understand what a test is testing when all testing
logic is located inside the test or nearby in shared test libraries. As
a rule of thumb, the closer a check is to a test, the better.
## Asserts
### Several checks
Prefer `EXPECT` over `ASSERT` if possible.
This is related to the [informativeness](#informativeness) property of
tests; information from other checks can help to better localize a
defect's root cause. One should use `ASSERT` if it is impossible to
continue test execution or if it does not make much sense. Later in
the text, `EXPECT` forms will be used to refer to both
`ASSERT/EXPECT`.
When it is possible to make several different checks, but impossible
to continue test execution if at least one check fails, you can
use the `::testing::Test::HasNonfatalFailure()` function. The recommended
way to express that is
`ASSERT_FALSE(::testing::Test::HasNonfatalFailure())`. Besides making it
clear why a test is aborted, it also allows you to provide more
information about a failure.
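A sketch of the recommended pattern (the `Foo` API is hypothetical):
```c++
TEST(Foo, header_is_consistent) {
  Foo foo;
  EXPECT_EQ(42, foo.magic());       // nonfatal: keep checking
  EXPECT_TRUE(foo.has_header());    // nonfatal: keep checking
  // inspecting the header makes no sense if any check above failed
  ASSERT_FALSE(::testing::Test::HasNonfatalFailure());
  EXPECT_EQ(0, foo.header()->flags());
}
```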
### First parameter is expected value
In all equality assertions, expected values should be passed as the
first parameter.
This convention is adopted by GoogleTest, and GoogleTest treats the
two parameters slightly differently; the most important difference is
`null` detection. For several reasons, `null` detection is enabled
only for the first parameter: that is to say, `EXPECT_EQ(NULL, object)`
checks that object is `null`, while `EXPECT_EQ(object, NULL)` checks
that object equals `NULL`. Since GoogleTest is very strict about the
types of compared values, the latter generates a compile-time error.
### Floating-point comparison
Use floating-point special macros to compare `float/double` values.
Because of floating-point number representations and round-off errors,
a regular equality comparison will not return true in most cases. There
are special `EXPECT_FLOAT_EQ/EXPECT_DOUBLE_EQ` assertions which check
that the distance between the compared values is not more than 4 ULPs.
There is also `EXPECT_NEAR(v1, v2, eps)`, which checks that the absolute
value of the difference between `v1` and `v2` is not greater than `eps`.
### C string comparison
Use string special macros for C strings comparisons.
`EXPECT_EQ` just compares pointer values, which is hardly what one
wants when comparing C strings. GoogleTest provides the `EXPECT_STREQ`
and `EXPECT_STRNE` macros to compare C string contents. There are also
case-insensitive versions: `EXPECT_STRCASEEQ`, `EXPECT_STRCASENE`.
### Error messages
Provide informative, but not too verbose error messages.
All GoogleTest asserts print the compared expressions and their
values, so there is no need to repeat them in error messages. However,
asserts print only the compared values; they do not print any interim
variables, e.g. `ASSERT_TRUE(val1 == val2 && isFail(foo(8)) || i == 18)`
prints only one value. If you use complex predicates, please consider
the `EXPECT_PRED*` or `EXPECT_PRED_FORMAT*` assertion families; they
check that a predicate returns true/success and print out all
parameter values.
However, in some cases the default information is not enough. A common
example is an assert inside a loop: GoogleTest will not print the
iteration value (unless it is an assert's parameter). Other
demonstrative examples are printing an error code and the corresponding
error message, or printing internal state which might have an impact on
the result. One should add such information to the assert message using
the `<<` operator.
### Uncluttered output
Print information only if it is needed.
Overly verbose tests which print all information even when they pass
are a bad practice: they just pollute the output, so it becomes harder
to find useful information. In order not to print information until it
is really needed, one should consider saving it to a temporary buffer
and passing the buffer to an assert.
<https://hg.openjdk.java.net/jdk/jdk/file/tip/test/hotspot/gtest/gc/shared/test_memset_with_concurrent_readers.cpp>
has a good example of how to do that.
### Failures propagation
Wrap a subroutine call into `EXPECT_NO_FATAL_FAILURE` macro to
propagate failures.
`ASSERT` and `FAIL` abort only the current function, so if you have them
in a subroutine, a test will not be aborted after the subroutine even
if `ASSERT` or `FAIL` fails. You should wrap calls to such subroutines
in the `ASSERT_NO_FATAL_FAILURE` macro to propagate fatal failures and
abort the test. `(EXPECT|ASSERT)_NO_FATAL_FAILURE` can also be used to
provide more information.
For obvious reasons, there are no
`(EXPECT|ASSERT)_NO_NONFATAL_FAILURE` macros. However, if you need to
check if a subroutine generated a nonfatal failure (failed an `EXPECT`),
you can use `::testing::Test::HasNonfatalFailure` function,
or `::testing::Test::HasFailure` function to check if a subroutine
generated any failures, see [Several checks](#several-checks).
## Naming and Grouping
### Test group names
Test group names should be in CamelCase, start and end with a letter.
A test group should be named after the tested class, functionality,
subsystem, etc.
This naming scheme helps to find and filter tests and simplifies test
failure analysis. For example: class `Foo` - test group `Foo`; the
compiler logging subsystem - test group `CompilerLogging`; G1 GC - test
group `G1GC`; and so forth.
### Filename
A test file must have `test_` prefix and `.cpp` suffix.
Both are actually requirements from the current build system to
recognize your tests.
### File location
Test file location should reflect a location of the tested part of the product.
* All unit tests for a class from `foo/bar/baz.cpp` should be placed
in `foo/bar/test_baz.cpp` in the `hotspot/test/native/` directory. Having
all tests for a class in one file is a common practice for unit tests;
it helps to see all existing tests at once and to share functions and/or
resources without losing encapsulation.
* For tests which test more than one class, directory hierarchy should
be the same as product hierarchy, and file name should reflect the
name of the tested subsystem/functionality. For example, if a
sub-system under tests belongs to `gc/g1`, tests should be placed in
`gc/g1` directory.
Please note that the framework prepends the directory name to the test
group name. For example, if `TEST(foo, check_this)` and `TEST(bar, check_that)`
are defined in the `hotspot/test/native/gc/shared/test_foo.cpp` file, they
will be reported as `gc/shared/foo::check_this` and
`gc/shared/bar::check_that`.
### Test names
Test names should be in small_snake_case, start and end with a letter.
A test name should reflect what a test checks.
Such naming makes tests self-descriptive and helps a lot during the
whole test life cycle: it is easy to do test planning and test
inventory, to see what things are not tested, to review tests, to
analyze test failures, to evolve a test, etc. For example,
`foo_return_0_if_name_is_null` is better than `foo_sanity` or `foo_basic`
or just `foo`; `humongous_objects_can_not_be_moved_by_young_gc` is better
than `ho_young_gc`.
Strictly speaking, using underscores is against the GoogleTest project
convention, because it can lead to illegal identifiers; however, that
rule is stricter than necessary. Restricting underscores to test names
only and prohibiting test names from starting or ending with an
underscore is enough to be safe.
### Fixture classes
Fixture classes should be named after tested classes, subsystems, etc
(follow [Test group names rule](#test-group-names)) and have
`Test` suffix to prevent class name conflicts.
### Friend classes
All test purpose friends should have either `Test` or `Testable` suffix.
It greatly simplifies understanding of the friendship's purpose and
allows statically checking that private members are not exposed
unexpectedly. Having `FooTest` as a friend of `Foo` without any comment
will be understood as a necessary evil to get testability.
### OS/CPU specific tests
Guard OS/CPU specific tests by `#ifdef` and have OS/CPU name in filename.
For the time being, we do not support separate directories for OS-,
CPU-, or OS-CPU-specific tests; if we end up with lots of such tests,
we will change the directory layout and build system to support that in
the same way it is done in hotspot.
## Miscellaneous
### Hotspot style
Abide by the norms and rules accepted in the Hotspot style guide.
Tests are a part of Hotspot, so everything we use for Hotspot should,
where applicable, be used for tests as well. These guidelines cover
only test-specific things.
### Code/test metrics
Coverage information and other code/test metrics are quite useful to
decide what tests should be written, what tests should be improved and
what can be removed.
For unit tests, a widely used and well-known coverage metric is branch
coverage, which provides good test quality with a relatively easy test
development process. For other levels of testing, branch coverage is
not as good, and one should consider other metrics, e.g. transaction
flow coverage or data flow coverage.
### Access to non-public members
Use explicit friend class to get access to non-public members.
We do not use the GoogleTest macro to declare a friendship relation,
because, from our point of view, it is less clear than an explicit
declaration.
Declaring a test fixture class as a friend class of the tested class is
the easiest and the clearest way to get access. However, it has some
disadvantages; here are some of them:
* Each test has to be declared as a friend
* Subclasses do not inherit the friendship relation
In other words, it is harder to share code between tests. Hence, if you
want to share code or expect it to be useful in other tests, you should
consider making the members in the tested class protected and
introducing a shared test-only class which exposes those members via
public functions, or even making the members publicly accessible right
away in the product class. If changing member visibility is not an
option, one can create a friend class which exposes the members.
### Death tests
You can not use death tests inside `TEST_OTHER_VM` and `TEST_VM_ASSERT*`.
We tried to make the Hotspot-GoogleTest integration as transparent as
possible; however, due to the current implementation of `TEST_OTHER_VM`
and `TEST_VM_ASSERT*` tests, you cannot use death test functionality in
them. These tests are implemented as GoogleTest death tests, and
GoogleTest does not allow a death test inside another death
test.
### External flags
Passing external flags to a tested JVM is not supported.
The rationale for this design decision is to simplify both the tests
and the test framework, and to avoid failures related to incompatible
flag combinations until there is a good solution for that. However, if
one wants to test a JVM with a specific flag combination, the
`_JAVA_OPTIONS` environment variable can be used to do that. Flags from
`_JAVA_OPTIONS` will be used in `TEST_VM`, `TEST_OTHER_VM` and
`TEST_VM_ASSERT*` tests.
### Test-specific flags
Passing flags to a tested JVM in `TEST_OTHER_VM` and `TEST_VM_ASSERT*`
should be possible, but is not implemented yet.
A facility to pass test-specific flags is needed for system, regression,
or other types of tests which require a fully initialized JVM in a
particular configuration, e.g. with Serial GC selected. There is no
support for such tests now; however, there is a plan to add it in
upcoming releases.
For now, if a test depends on flag values, it should have `if
(!<flag>) { return }` guards at the very beginning and a `@requires`
comment, similar to the jtreg `@requires` directive, right before the test macro.
<https://hg.openjdk.java.net/jdk/jdk/file/tip/test/hotspot/gtest/gc/g1/test_g1IHOPControl.cpp>
has an example of this temporary workaround. It is important to follow
that pattern, as it allows us to easily find all such tests and update
them as soon as the flag-passing facility is implemented.
In the long term, we expect jtreg to support GoogleTest tests as first
class citizens, that is to say, jtreg will parse @requires comments
and filter out inapplicable tests.
### Flag restoring
Restore changed flags.
It is quite common for tests to configure the JVM in a certain way by
changing flag values. GoogleTest provides two ways to set up the
environment before a test and restore it afterward: using either a
constructor and destructor or the `SetUp` and `TearDown` functions. Both
ways require using a test fixture class, which is sometimes too wordy.
Simpler facilities like the `FLAG_GUARD` macro or the `*FlagSetting`
classes can be used in such cases to set and restore values.
Caveats:
* Changing a flag's value could break invariants between flag values and hence could lead to an unexpected/unsupported JVM state.
* `FLAG_SET_*` macros can change more than one flag (in order to
maintain invariants), so it is hard to predict which flags will be
changed, and that makes restoring all changed flags a nontrivial task.
Thus, if one uses `FLAG_SET_*` macros, they should use the `TEST_OTHER_VM`
test type.
### GoogleTest documentation
In case you have any questions regarding GoogleTest itself, its
asserts, test declaration macros, other macros, etc, please consult
its documentation.
## TODO
Although this document provides guidelines on the most important parts
of test development using GTest, it still misses a few items:
* Examples, esp for [access to non-public members](#access-to-non-public-members)
* test types: purpose, drawbacks, limitation
* `TEST_VM`
* `TEST_VM_F`
* `TEST_OTHER_VM`
* `TEST_VM_ASSERT`
* `TEST_VM_ASSERT_MSG`
* Miscellaneous
* Test libraries
* where to place
* how to write
* how to use
* test your tests
* how to run tests in random order
* how to run only specific tests
* how to run each test separately
* check that a test can find bugs it is supposed to by introducing them
* mocks/stubs/dependency injection
* setUp/tearDown
* vs c-tor/d-tor
* empty test to test them
* internal (declared in .cpp) struct/classes

View File

@@ -1,230 +0,0 @@
#!/usr/bin/env python3
import argparse
import math
import os.path
import sys
import subprocess
def fatal(msg):
sys.stderr.write(f"[fatal] {msg}\n")
sys.exit(1)
def verbose(options, *msg):
if options.verbose:
sys.stdout.write(f"[verbose] ")
sys.stdout.write(*msg)
sys.stdout.write('\n')
def first_line(str):
return "" if not str else str.splitlines()[0]
class Options:
def __init__(self):
ap = argparse.ArgumentParser(description="Show bugfixes differences between JBR and OpenJDK git repos",
epilog="Example: %(prog)s --jdk ./jdk11u/ --jbr ./JetBrainsRuntime/ --path src/hotspot --limit 200")
ap.add_argument('--jdk', dest='jdkpath', help='path to OpenJDK git repo', required=True)
ap.add_argument('--jbr', dest='jbrpath', help='path to JBR git repo', required=True)
ap.add_argument('--path', dest='path', help='limit to changes in this path (relative to git root)')
ap.add_argument('--limit', dest='limit', help='limit to this many log entries in --jdk repo', type=int, default=-1)
ap.add_argument('-o', dest="output_dir", help="save patches to this directory (created if necessary)")
ap.add_argument('-v', dest='verbose', help="verbose output", default=False, action='store_true')
args = ap.parse_args()
if not os.path.isdir(args.jdkpath):
fatal(f"{args.jdkpath} not a directory")
if not os.path.isdir(args.jbrpath):
fatal(f"{args.jbrpath} not a directory")
if not git_is_available():
fatal("can't run git commands; make sure git is in PATH")
self.jdkpath = args.jdkpath
self.jbrpath = args.jbrpath
self.path = args.path
self.limit = args.limit
self.output_dir = args.output_dir
self.verbose = args.verbose
class GitRepo:
def __init__(self, rootpath):
self.rootpath = rootpath
def run_git_cmd(self, git_args):
args = ["git", "-C", self.rootpath]
args.extend(git_args)
# print(f"Runnig git cmd '{' '.join(args)}'")
p = subprocess.run(args, capture_output=True, text=True)
if p.returncode != 0:
fatal(f"git returned non-zero code in {self.rootpath} ({first_line(p.stderr)})")
return p.stdout
def save_git_cmd(self, fname, git_args):
args = ["git", "-C", self.rootpath]
args.extend(git_args)
# print(f"Runnig git cmd '{' '.join(args)}'")
with open(fname, "w") as stdout_file:
p = subprocess.run(args, stdout=stdout_file)
if p.returncode != 0:
fatal(f"git returned non-zero code in {self.rootpath} ({first_line(p.stderr)})")
def current_branch(self):
branch_name = self.run_git_cmd(["branch", "--show-current"]).strip()
return branch_name
def log(self, path=None, limit=None):
cmds = ["log", "--no-decorate"]
if limit:
cmds.extend(["-n", str(limit)])
if path:
cmds.append(path)
full_log = self.run_git_cmd(cmds)
return full_log
class Commit:
def __init__(self, lines):
self.sha = lines[0].split()[1]
self.message = ""
self.bugid = None
# Commit message starts after one blank line
read_message = False
for l in lines:
if read_message:
self.message += l + "\n"
if not read_message and l == "":
read_message = True
if self.message and self.message != "" and ":" in self.message:
maybe_bugid = self.message.split(":")[0].strip()
if 10 >= len(maybe_bugid) >= 4:
self.bugid = maybe_bugid
class History:
def __init__(self, log):
log_itr = iter(log.splitlines())
self.commits = []
commit_lines = []
for line in log_itr:
if line.startswith("commit ") and len(commit_lines) > 0:
commit = Commit(commit_lines)
self.commits.append(commit)
commit_lines = []
commit_lines.append(line)
if len(commit_lines) > 0:
commit = Commit(commit_lines)
self.commits.append(commit)
def contains(self, str):
return any(str in commit.message for commit in self.commits)
def size(self):
return len(self.commits)
def print_explanation(options, jdk, jbr):
verbose(options, f"Reading history from '{jdk.rootpath}' on branch '{jdk.current_branch()}'")
if options.path:
verbose(options, f"\t(only under '{options.path}')")
verbose(options, f"Searching for same fixes in '{jbr.rootpath}' on branch '{jbr.current_branch()}'")
def git_is_available():
p = None
try:
p = subprocess.run(["git", "--help"], stdout=subprocess.DEVNULL, stderr=subprocess.DEVNULL)
except:
pass
return p is not None and p.returncode == 0
def main():
check_python_min_requirements()
options = Options()
jdk = GitRepo(options.jdkpath)
jbr = GitRepo(options.jbrpath)
print_explanation(options, jdk, jbr)
commits_to_save = []
try:
jdk_log = jdk.log(options.path, options.limit)
jdk_history = History(jdk_log)
jbr_log = jbr.log(options.path)
jbr_history = History(jbr_log)
verbose(options, f"Read {jdk_history.size()} commits in JDK, {jbr_history.size()} in JBR")
for c in jdk_history.commits:
if c.bugid:
verbose(options, f"Looking for bugfix for {c.bugid}")
if not jbr_history.contains(c.bugid):
commits_to_save.append(c)
print(f"[note] Fix for {c.bugid} not found in JBR ({jbr.rootpath})")
print(f" commit {c.sha}")
print(f" {first_line(c.message).strip()}")
except KeyboardInterrupt:
fatal("Interrupted")
if len(commits_to_save) > 0 and options.output_dir:
print()
if not os.path.exists(options.output_dir):
verbose(options, f"Creating output directory {options.output_dir}")
os.makedirs(options.output_dir)
nzeroes = len(str(len(commits_to_save)))
for i, c in enumerate(reversed(commits_to_save)):
fname = os.path.join(options.output_dir, f"{str(i).zfill(nzeroes)}-{c.bugid}.patch")
print(f"[note] {c.bugid} saved as {fname}")
fname = os.path.abspath(fname)
jdk.save_git_cmd(fname, ["format-patch", "-1", c.sha, "--stdout"])
script_fname = os.path.join(options.output_dir, "apply.sh")
with open(script_fname, "w") as script_file:
print(apply_script_code.format(os.path.abspath(jbr.rootpath), os.path.abspath(options.output_dir)),
file=script_file)
print(f"[note] Execute 'bash {script_fname}' to apply patches to {jbr.rootpath}")
def check_python_min_requirements():
if sys.version_info < (3, 6):
fatal("Minimum version 3.6 is required to run this script")
apply_script_code = """
#!/bin/bash
GITROOT={0}
PATCHROOT={1}
cd $PATCHROOT || exit 1
PATCHES=$(find $PATCHROOT -name '*.patch' | sort -n)
for P in $PATCHES; do
git -C $GITROOT am $P
if [ $? != 0 ]; then
mv "$P" "$P.failed"
echo "[ERROR] Patch $P did not apply cleanly. Try applying it manually."
echo "[NOTE] Execute this script to apply the remaining patches."
exit 1
else
mv "$P" "$P.done"
fi
done
echo "[NOTE] Done applying patches; check $PATCHROOT for .patch and .patch.failed to see if all have been applied."
"""
if __name__ == '__main__':
main()

View File

@@ -1,13 +0,0 @@
# jetbrains/runtime:jbr15env
FROM centos:7
RUN yum -y install centos-release-scl
RUN yum -y install devtoolset-8
RUN yum -y install zip bzip2 unzip tar wget make autoconf automake libtool gcc gcc-c++ libstdc++-devel alsa-devel cups-devel xorg-x11-devel libjpeg62-devel giflib-devel freetype-devel file which libXtst-devel libXt-devel libXrender-devel alsa-lib-devel fontconfig-devel libXrandr-devel libXi-devel git
# Install Java 16
RUN wget https://cdn.azul.com/zulu/bin/zulu16.28.11-ca-jdk16.0.0-linux_x64.tar.gz \
-O - | tar xz -C /
RUN mv /zulu16.28.11-ca-jdk16.0.0-linux_x64 /jdk16.0.0
ENV PATH /opt/rh/devtoolset-8/root/usr/bin:$PATH
RUN mkdir .git
RUN git config user.email "teamcity@jetbrains.com"
RUN git config user.name "builduser"

View File

@@ -1,7 +0,0 @@
FROM i386/ubuntu:xenial
RUN linux32 apt-get update && apt-get install -y --no-install-recommends apt-utils
COPY jbrsdk-11.0.5-b1 /jbrsdk-11.0.5-b1
RUN linux32 apt-get -y install file build-essential zip unzip curl libx11-dev libxext-dev \
libxrender-dev libxrandr-dev libxtst-dev libxt-dev libcups2-dev libasound2-data \
libpng12-0 libasound2 libfreetype6 libfontconfig1-dev libasound2-dev autoconf

View File

@@ -1,9 +0,0 @@
<component name="CopyrightManager">
<copyright>
<option name="notice" value="Copyright &amp;#36;originalComment.match(&quot;Copyright (\d+)&quot;, 1, &quot;-&quot;)&amp;#36;today.year JetBrains s.r.o.&#10;DO NOT ALTER OR REMOVE COPYRIGHT NOTICES OR THIS FILE HEADER.&#10;&#10;This code is free software; you can redistribute it and/or modify it&#10;under the terms of the GNU General Public License version 2 only, as&#10;published by the Free Software Foundation.&#10;&#10;This code is distributed in the hope that it will be useful, but WITHOUT&#10;ANY WARRANTY; without even the implied warranty of MERCHANTABILITY or&#10;FITNESS FOR A PARTICULAR PURPOSE. See the GNU General Public License&#10;version 2 for more details (a copy is included in the LICENSE file that&#10;accompanied this code).&#10;&#10;You should have received a copy of the GNU General Public License version&#10;2 along with this work; if not, write to the Free Software Foundation,&#10;Inc., 51 Franklin St, Fifth Floor, Boston, MA 02110-1301 USA.&#10;&#10;Please contact Oracle, 500 Oracle Parkway, Redwood Shores, CA 94065 USA&#10;or visit www.oracle.com if you need additional information or have any&#10;questions." />
<option name="keyword" value="Copyright" />
<option name="allowReplaceKeyword" value="JetBrains" />
<option name="myName" value="JetBrains" />
<option name="myLocal" value="true" />
</copyright>
</component>

View File

@@ -1,3 +0,0 @@
<component name="CopyrightManager">
<settings default="JetBrains" />
</component>

View File

@@ -1 +0,0 @@
JetBrainsRuntime

View File

@@ -1,9 +0,0 @@
<component name="CopyrightManager">
<copyright>
<option name="notice" value="Copyright 2000-&amp;#36;today.year JetBrains s.r.o.&#10;&#10;Licensed under the Apache License, Version 2.0 (the &quot;License&quot;);&#10;you may not use this file except in compliance with the License.&#10;You may obtain a copy of the License at&#10;&#10;http://www.apache.org/licenses/LICENSE-2.0&#10;&#10;Unless required by applicable law or agreed to in writing, software&#10;distributed under the License is distributed on an &quot;AS IS&quot; BASIS,&#10;WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.&#10;See the License for the specific language governing permissions and&#10;limitations under the License." />
<option name="keyword" value="Copyright" />
<option name="allowReplaceKeyword" value="JetBrains" />
<option name="myName" value="JetBrains" />
<option name="myLocal" value="true" />
</copyright>
</component>

View File

@@ -1,3 +0,0 @@
<component name="CopyrightManager">
<settings default="JetBrains" />
</component>

View File

@@ -1,20 +0,0 @@
<?xml version="1.0" encoding="UTF-8"?>
<project version="4">
<component name="IssueNavigationConfiguration">
<option name="links">
<list>
<IssueNavigationLink>
<option name="issueRegexp" value="(?:^|\s|\p{Punct})([A-Z]+\-\d+)(?=$|\s|\p{Punct})" />
<option name="linkRegexp" value="https://youtrack.jetbrains.com/issue/$1" />
</IssueNavigationLink>
<IssueNavigationLink>
<option name="issueRegexp" value="(?:^|\s|\p{Punct})(?:JDK-)?(\d{7})(?=$|\s|\p{Punct})" />
<option name="linkRegexp" value="https://bugs.openjdk.java.net/browse/JDK-$1" />
</IssueNavigationLink>
</list>
</option>
</component>
<component name="VcsDirectoryMappings">
<mapping directory="$PROJECT_DIR$/../.." vcs="Git" />
</component>
</project>

View File

@@ -1,20 +0,0 @@
<?xml version="1.0" encoding="UTF-8"?>
<project version="4">
<component name="IssueNavigationConfiguration">
<option name="links">
<list>
<IssueNavigationLink>
<option name="issueRegexp" value="(?:^|\s|\p{Punct})([A-Z]+\-\d+)(?=$|\s|\p{Punct})" />
<option name="linkRegexp" value="https://youtrack.jetbrains.com/issue/$1" />
</IssueNavigationLink>
<IssueNavigationLink>
<option name="issueRegexp" value="(?:^|\s|\p{Punct})(?:JDK-)?(\d{7})(?=$|\s|\p{Punct})" />
<option name="linkRegexp" value="https://bugs.openjdk.java.net/browse/JDK-$1" />
</IssueNavigationLink>
</list>
</option>
</component>
<component name="VcsDirectoryMappings">
<mapping directory="$PROJECT_DIR$" vcs="Git" />
</component>
</project>

View File

@@ -1,135 +0,0 @@
apply plugin: 'java'
import org.gradle.internal.os.OperatingSystem
repositories {
mavenCentral()
}
def test_jvm = {
if (project.hasProperty('jbsdkhome')) {
file(jbsdkhome + (OperatingSystem.current().isWindows()?"/bin/java.exe" : "/bin/java")).absolutePath
} else {
if (OperatingSystem.current().isMacOsX()) {
file('../../../build/macosx-x86_64-normal-server-release/images/jdk-bundle/jdk-11.0.4.jdk/Contents/Home/bin/java').absolutePath
} else if (OperatingSystem.current().isLinux()) {
file('../../../build/linux-x86_64-normal-server-release/images/jdk/bin/java').absolutePath
} else {
file('../../../build/windows-x86_64-normal-server-release/images/jdk/bin/java.exe').absolutePath
}
}
}
dependencies {
testCompile('junit:junit:4.12'){
exclude group: 'org.hamcrest'
}
testCompile 'org.hamcrest:hamcrest-library:1.3'
testCompile 'net.java.dev.jna:jna:4.4.0'
testCompile 'com.twelvemonkeys.imageio:imageio-tiff:3.3.2'
testCompile 'org.apache.commons:commons-lang3:3.0'
}
def jdk_modules = ["java.base", "java.logging", "java.prefs",
"java.se.ee", "java.sql", "java.datatransfer",
"java.management", "java.rmi", "java.security.jgss",
"java.sql.rowset", "java.desktop", "java.management.rmi",
"java.scripting", "java.security.sasl", "java.transaction",
"java.instrument", "java.naming", "java.se",
"java.smartcardio", "java.xml.crypto"]
def jdk_class_dirs = []
jdk_modules.collect(jdk_class_dirs) {
new File("../../../src/" + it + "/share/classes")
}
if (OperatingSystem.current().isMacOsX())
jdk_modules.collect(jdk_class_dirs) {
"../../../src/" + it + "/macosx/classes"
}
else if (OperatingSystem.current().isLinux()) {
jdk_modules.collect(jdk_class_dirs) {
"../../../src/" + it + "/solaris/classes"
}
jdk_modules.collect(jdk_class_dirs) {
"../../../src/" + it + "/unix/classes"
}
} else
jdk_modules.collect(jdk_class_dirs) {
"../../../src/" + it + "/windows/classes"
}
sourceSets.main.java.srcDirs = jdk_class_dirs
sourceSets {
test {
java {
srcDir "../../../test/jdk/jbu"
}
}
}
test.dependsOn.clear()
test.dependsOn tasks.compileTestJava
test {
systemProperty "jb.java2d.metal", "true"
systemProperty "testdata", file('../../../test/jdk/jbu/testdata').absolutePath
// Generate golden images for DroidFontTest and MixedTextTest
// systemProperty "gentestdata", ""
// Enable Java2D logging (https://confluence.jetbrains.com/display/JRE/Java2D+Rendering+Logging)
// systemProperty "sun.java2d.trace", "log"
// systemProperty "sun.java2d.trace", "log,pimpl"
outputs.upToDateWhen { false }
executable = test_jvm()
// Enable async/dtrace profiler
jvmArgs "-XX:+PreserveFramePointer"
// Enable native J2D logging (only in debug build)
// Can be turned on for J2D by adding "#define DEBUG 1" into jdk/src/share/native/sun/java2d/Trace.h
// environment 'J2D_TRACE_LEVEL', '4'
}
def buildDir = project.buildscript.sourceFile.parentFile.parentFile.parentFile.parentFile
def make_cmd = "make"
if (OperatingSystem.current().isWindows()) {
def cyg_make_cmd = new File("c:/cygwin64/bin/make.exe")
if (cyg_make_cmd.exists()) make_cmd = cyg_make_cmd.absolutePath
}
def test_run = false
task make_images {
doLast {
if (!test_run) {
def pb = new ProcessBuilder().command(make_cmd.toString(), "-C", buildDir.absolutePath, "images")
def proc = pb.redirectErrorStream(true).start()
proc.inputStream.eachLine { println it }
assert proc.waitFor() == 0
}
}
}
task make_clean {
doLast {
def pb = new ProcessBuilder().command(make_cmd.toString(), "-C", buildDir.absolutePath, "clean")
def proc = pb.redirectErrorStream(true).start()
proc.inputStream.eachLine { println it }
assert proc.waitFor() == 0
}
}
task run_test {
doLast {
test_run = true
}
}
tasks.cleanTest.dependsOn tasks.run_test
classes.dependsOn.clear()
classes.dependsOn tasks.make_images
tasks.cleanClasses.dependsOn tasks.make_clean

View File

@@ -1,64 +0,0 @@
VENDOR_NAME="JetBrains s.r.o."
VENDOR_VERSION_STRING="JBR-${JBSDK_VERSION_WITH_DOTS}.${JDK_BUILD_NUMBER}-${build_number}"
[ -z "$bundle_type" ] || VENDOR_VERSION_STRING="${VENDOR_VERSION_STRING}-${bundle_type}"
do_reset_changes=0
do_reset_dcevm=0
HEAD_REVISION=0
function do_exit() {
exit_code=$1
[ $do_reset_changes -eq 1 ] && git checkout HEAD modules.list src/java.desktop/share/classes/module-info.java
if [ $do_reset_dcevm -eq 1 ]; then
[ ! -z $HEAD_REVISION ] && git reset --hard $HEAD_REVISION
fi
exit "$exit_code"
}
function update_jsdk_mods() {
__jsdk=$1
__jcef_mods=$2
__orig_jsdk_mods=$3
__updated_jsdk_mods=$4
# re-create java.desktop.jmod with updated module-info.class
tmp=.java.desktop.$$.tmp
mkdir "$tmp" || exit $?
"$__jsdk"/bin/jmod extract --dir "$tmp" "$__orig_jsdk_mods"/java.desktop.jmod || exit $?
"$__jsdk"/bin/javac \
--patch-module java.desktop="$__orig_jsdk_mods"/java.desktop.jmod \
--module-path "$__jcef_mods" -d "$tmp"/classes src/java.desktop/share/classes/module-info.java || exit $?
"$__jsdk"/bin/jmod \
create --class-path "$tmp"/classes --config "$tmp"/conf --header-files "$tmp"/include --legal-notice "$tmp"/legal --libs "$tmp"/lib \
java.desktop.jmod || exit $?
mv java.desktop.jmod "$__updated_jsdk_mods" || exit $?
rm -rf "$tmp"
# re-create java.base.jmod with updated hashes
tmp=.java.base.$$.tmp
mkdir "$tmp" || exit $?
hash_modules=$("$JSDK"/bin/jmod describe "$__orig_jsdk_mods"/java.base.jmod | grep hashes | awk '{print $2}' | tr '\n' '|' | sed s/\|$//) || exit $?
"$__jsdk"/bin/jmod extract --dir "$tmp" "$__orig_jsdk_mods"/java.base.jmod || exit $?
rm "$__updated_jsdk_mods"/java.base.jmod || exit $? # temp exclude from path
"$__jsdk"/bin/jmod \
create --module-path "$__updated_jsdk_mods" --hash-modules "$hash_modules" \
--class-path "$tmp"/classes --cmds "$tmp"/bin --config "$tmp"/conf --header-files "$tmp"/include --legal-notice "$tmp"/legal --libs "$tmp"/lib \
java.base.jmod || exit $?
mv java.base.jmod "$__updated_jsdk_mods" || exit $?
rm -rf "$tmp"
}
function get_mods_list() {
__mods=$1
echo $(ls $__mods) | sed s/\.jmod/,/g | sed s/,$//g | sed s/' '//g
}
function copy_jmods() {
__mods_list=$1
__jmods_from=$2
__jmods_to=$3
mkdir -p $__jmods_to
echo "${__mods_list}," | while read -d, mod; do cp $__jmods_from/$mod.jmod $__jmods_to/; done
}

View File

@@ -1,84 +0,0 @@
#!/bin/bash -x
# The following parameters must be specified:
# JBSDK_VERSION - specifies the current version of OpenJDK e.g. 11_0_6
# JDK_BUILD_NUMBER - specifies the number of OpenJDK build or the value of --with-version-build argument to configure
# build_number - specifies the number of JetBrainsRuntime build
#
# jbrsdk-${JBSDK_VERSION}-osx-x64-b${build_number}.tar.gz
# jbr-${JBSDK_VERSION}-osx-x64-b${build_number}.tar.gz
#
# $ ./java --version
# openjdk 11.0.6 2020-01-14
# OpenJDK Runtime Environment (build 11.0.6+${JDK_BUILD_NUMBER}-b${build_number})
# OpenJDK 64-Bit Server VM (build 11.0.6+${JDK_BUILD_NUMBER}-b${build_number}, mixed mode)
#
JBSDK_VERSION=$1
JDK_BUILD_NUMBER=$2
build_number=$3
JBSDK_VERSION_WITH_DOTS=$(echo $JBSDK_VERSION | sed 's/_/\./g')
source jb/project/tools/common/scripts/common.sh
JBRSDK_BASE_NAME=jbrsdk-${JBSDK_VERSION}
[ -z "$bundle_type" ] && (git apply -p0 < jb/project/tools/patches/exclude_jcef_module.patch || exit $?)
sh configure \
--with-debug-level=release \
--with-vendor-name="${VENDOR_NAME}" \
--with-vendor-version-string="${VENDOR_VERSION_STRING}" \
--with-jvm-features=shenandoahgc \
--with-version-pre= \
--with-version-build="${JDK_BUILD_NUMBER}" \
--with-version-opt=b${build_number} \
--with-boot-jdk=${BOOT_JDK} \
--enable-cds=yes || exit $?
make clean CONF=linux-aarch64-server-release || exit $?
make images CONF=linux-aarch64-server-release test-image || exit $?
JBSDK=${JBRSDK_BASE_NAME}-linux-aarch64-b${build_number}
BASE_DIR=build/linux-aarch64-server-release/images
JSDK=${BASE_DIR}/jdk
JBRSDK_BUNDLE=jbrsdk
echo Fixing permissions
chmod -R a+r $JSDK
rm -rf $BASE_DIR/$JBRSDK_BUNDLE
cp -r $JSDK $BASE_DIR/$JBRSDK_BUNDLE || exit $?
echo Creating $JBSDK.tar.gz ...
sed 's/JBR/JBRSDK/g' ${BASE_DIR}/${JBRSDK_BUNDLE}/release > release
mv release ${BASE_DIR}/${JBRSDK_BUNDLE}/release
tar -pcf $JBSDK.tar \
--exclude=*.debuginfo --exclude=demo --exclude=sample --exclude=man \
-C $BASE_DIR ${JBRSDK_BUNDLE} || exit $?
gzip $JBSDK.tar || exit $?
JBR_BUNDLE=jbr
JBR_BASE_NAME=jbr-$JBSDK_VERSION
rm -rf $BASE_DIR/$JBR_BUNDLE
JBR=$JBR_BASE_NAME-linux-aarch64-b$build_number
grep -v javafx modules.list | grep -v "jdk.internal.vm\|jdk.aot\|jcef" > modules.list.aarch64
echo Running jlink....
${JSDK}/bin/jlink \
--module-path ${JSDK}/jmods --no-man-pages --compress=2 \
--add-modules $(xargs < modules.list.aarch64 | sed s/" "//g | sed s/',$'//g) \
--output ${BASE_DIR}/${JBR_BUNDLE} || exit $?
echo Modifying release info ...
grep -v \"^JAVA_VERSION\" ${JSDK}/release | grep -v \"^MODULES\" >> ${BASE_DIR}/${JBR_BUNDLE}/release
echo Creating $JBR.tar.gz ...
tar -pcf $JBR.tar -C $BASE_DIR ${JBR_BUNDLE} || exit $?
gzip $JBR.tar || exit $?
JBRSDK_TEST=$JBRSDK_BASE_NAME-linux-test-aarch64-b$build_number
echo Creating $JBRSDK_TEST.tar.gz ...
tar -pcf $JBRSDK_TEST.tar -C $BASE_DIR --exclude='test/jdk/demos' test || exit $?
gzip $JBRSDK_TEST.tar || exit $?

View File

@@ -1,139 +0,0 @@
#!/bin/bash -x
# The following parameters must be specified:
# JBSDK_VERSION - specifies major version of OpenJDK e.g. 11_0_6 (instead of dots '.' underbars "_" are used)
# JDK_BUILD_NUMBER - specifies update release of OpenJDK build or the value of --with-version-build argument to configure
# build_number - specifies the number of JetBrainsRuntime build
# bundle_type - specifies bundle to be built; possible values:
# <empty> or nomod - the release bundles without any additional modules (jcef)
# jcef - the release bundles with jcef
# fd - the fastdebug bundles which also include the jcef module
#
# jbrsdk-${JBSDK_VERSION}-osx-x64-b${build_number}.tar.gz
# jbr-${JBSDK_VERSION}-osx-x64-b${build_number}.tar.gz
#
# $ ./java --version
# openjdk 11.0.6 2020-01-14
# OpenJDK Runtime Environment (build 11.0.6+${JDK_BUILD_NUMBER}-b${build_number})
# OpenJDK 64-Bit Server VM (build 11.0.6+${JDK_BUILD_NUMBER}-b${build_number}, mixed mode)
#
# Environment variables:
# JCEF_PATH - specifies the path to the directory with JCEF binaries.
# By default JCEF binaries should be located in ./jcef_linux_x64
JBSDK_VERSION=$1
JDK_BUILD_NUMBER=$2
build_number=$3
bundle_type=$4
JBSDK_VERSION_WITH_DOTS=$(echo $JBSDK_VERSION | sed 's/_/\./g')
JCEF_PATH=${JCEF_PATH:=./jcef_linux_x64}
source jb/project/tools/common/scripts/common.sh
function create_image_bundle {
__bundle_name=$1
__arch_name=$2
__modules_path=$3
__modules=$4
[ "$bundle_type" == "fd" ] && [ "$__arch_name" == "$JBRSDK_BUNDLE" ] && __bundle_name=$__arch_name && fastdebug_infix="fastdebug-"
JBR=${__bundle_name}-${JBSDK_VERSION}-linux-x64-${fastdebug_infix}b${build_number}
echo Running jlink....
[ -d "$IMAGES_DIR"/"$__arch_name" ] && rm -rf "${IMAGES_DIR:?}"/"$__arch_name"
$JSDK/bin/jlink \
--module-path "$__modules_path" --no-man-pages --compress=2 \
--add-modules "$__modules" --output "$IMAGES_DIR"/"$__arch_name"
grep -v "^JAVA_VERSION" "$JSDK"/release | grep -v "^MODULES" >> "$IMAGES_DIR"/"$__arch_name"/release
if [ "$__arch_name" == "$JBRSDK_BUNDLE" ]; then
sed 's/JBR/JBRSDK/g' "$IMAGES_DIR"/"$__arch_name"/release > release
mv release "$IMAGES_DIR"/"$__arch_name"/release
copy_jmods "$__modules" "$__modules_path" "$IMAGES_DIR"/"$__arch_name"/jmods
fi
# jmod does not preserve file permissions (JDK-8173610)
[ -f "$IMAGES_DIR"/"$__arch_name"/lib/jcef_helper ] && chmod a+x "$IMAGES_DIR"/"$__arch_name"/lib/jcef_helper
echo Creating "$JBR".tar.gz ...
tar -pcf "$JBR".tar -C "$IMAGES_DIR" "$__arch_name" || do_exit $?
[ -f "$JBR".tar.gz ] && rm "$JBR.tar.gz"
gzip "$JBR".tar || do_exit $?
rm -rf "${IMAGES_DIR:?}"/"$__arch_name"
}
WITH_DEBUG_LEVEL="--with-debug-level=release"
RELEASE_NAME=linux-x86_64-server-release
case "$bundle_type" in
"jcef")
do_reset_changes=0
;;
"dcevm")
HEAD_REVISION=$(git rev-parse HEAD)
git am jb/project/tools/patches/dcevm/*.patch || do_exit $?
do_reset_dcevm=0
do_reset_changes=0
;;
"nomod" | "")
bundle_type=""
;;
"fd")
do_reset_changes=0
WITH_DEBUG_LEVEL="--with-debug-level=fastdebug"
RELEASE_NAME=linux-x86_64-server-fastdebug
;;
esac
sh configure \
$WITH_DEBUG_LEVEL \
--with-vendor-name="$VENDOR_NAME" \
--with-vendor-version-string="$VENDOR_VERSION_STRING" \
--with-jvm-features=shenandoahgc \
--with-version-pre= \
--with-version-build="$JDK_BUILD_NUMBER" \
--with-version-opt=b"$build_number" \
--with-boot-jdk="$BOOT_JDK" \
--enable-cds=yes || do_exit $?
make clean CONF=$RELEASE_NAME || exit $?
make images CONF=$RELEASE_NAME || do_exit $?
IMAGES_DIR=build/$RELEASE_NAME/images
JSDK=$IMAGES_DIR/jdk
JSDK_MODS_DIR=$IMAGES_DIR/jmods
JBRSDK_BUNDLE=jbrsdk
echo Fixing permissions
chmod -R a+r $JSDK
if [ "$bundle_type" == "jcef" ] || [ "$bundle_type" == "dcevm" ] || [ "$bundle_type" == "fd" ]; then
git apply -p0 < jb/project/tools/patches/add_jcef_module.patch || do_exit $?
update_jsdk_mods $JSDK $JCEF_PATH/jmods $JSDK/jmods $JSDK_MODS_DIR || do_exit $?
cp $JCEF_PATH/jmods/* $JSDK_MODS_DIR # $JSDK/jmods is not changed
jbr_name_postfix="_${bundle_type}"
[ "$bundle_type" != "fd" ] && jbrsdk_name_postfix="_${bundle_type}"
fi
# create runtime image bundle
modules=$(xargs < modules.list | sed s/" "//g) || do_exit $?
create_image_bundle "jbr${jbr_name_postfix}" "jbr" $JSDK_MODS_DIR "$modules" || do_exit $?
# create sdk image bundle
modules=$(cat $JSDK/release | grep MODULES | sed s/MODULES=//g | sed s/' '/','/g | sed s/\"//g | sed s/\\n//g) || do_exit $?
if [ "$bundle_type" == "jcef" ] || [ "$bundle_type" == "dcevm" ] || [ "$bundle_type" == "fd" ] || [ "$bundle_type" == "$JBRSDK_BUNDLE" ]; then
modules=${modules},$(get_mods_list "$JCEF_PATH"/jmods)
fi
create_image_bundle "$JBRSDK_BUNDLE${jbr_name_postfix}" $JBRSDK_BUNDLE $JSDK_MODS_DIR "$modules" || do_exit $?
if [ -z "$bundle_type" ]; then
JBRSDK_TEST=${JBRSDK_BUNDLE}-${JBSDK_VERSION}-linux-test-x64-b${build_number}
echo Creating "$JBRSDK_TEST" ...
make test-image CONF=$RELEASE_NAME || do_exit $?
tar -pcf "$JBRSDK_TEST".tar -C $IMAGES_DIR --exclude='test/jdk/demos' test || do_exit $?
[ -f "$JBRSDK_TEST.tar.gz" ] && rm "$JBRSDK_TEST.tar.gz"
gzip "$JBRSDK_TEST".tar || do_exit $?
fi
do_exit 0

View File

@@ -1,80 +0,0 @@
#!/bin/bash -x
# The following parameters must be specified:
# JBSDK_VERSION - specifies the current version of OpenJDK e.g. 11_0_6
# JDK_BUILD_NUMBER - specifies the number of OpenJDK build or the value of --with-version-build argument to configure
# build_number - specifies the number of JetBrainsRuntime build
#
# jbrsdk-${JBSDK_VERSION}-osx-x64-b${build_number}.tar.gz
# jbr-${JBSDK_VERSION}-osx-x64-b${build_number}.tar.gz
#
# $ ./java --version
# openjdk 11.0.6 2020-01-14
# OpenJDK Runtime Environment (build 11.0.6+${JDK_BUILD_NUMBER}-b${build_number})
# OpenJDK 64-Bit Server VM (build 11.0.6+${JDK_BUILD_NUMBER}-b${build_number}, mixed mode)
#
JBSDK_VERSION=$1
JDK_BUILD_NUMBER=$2
build_number=$3
JBSDK_VERSION_WITH_DOTS=$(echo $JBSDK_VERSION | sed 's/_/\./g')
source jb/project/tools/common/scripts/common.sh
JBRSDK_BASE_NAME=jbrsdk-${JBSDK_VERSION}
[ -z "$bundle_type" ] && (git apply -p0 < jb/project/tools/patches/exclude_jcef_module.patch || exit $?)
linux32 bash configure \
--with-debug-level=release \
--with-vendor-name="${VENDOR_NAME}" \
--with-vendor-version-string="${VENDOR_VERSION_STRING}" \
--with-version-pre= \
--with-version-build=$JDK_BUILD_NUMBER \
--with-version-opt=b${build_number} \
--with-boot-jdk=${BOOT_JDK} \
--enable-cds=yes || exit $?
make clean CONF=linux-x86-server-release || exit $?
make images CONF=linux-x86-server-release test-image || exit $?
JBSDK=${JBRSDK_BASE_NAME}-linux-x86-b${build_number}
BASE_DIR=build/linux-x86-server-release/images
JSDK=${BASE_DIR}/jdk
JBRSDK_BUNDLE=jbrsdk
echo Fixing permissions
chmod -R a+r $JSDK
rm -rf $BASE_DIR/$JBRSDK_BUNDLE
cp -r $JSDK $BASE_DIR/$JBRSDK_BUNDLE || exit $?
echo Creating $JBSDK.tar.gz ...
sed 's/JBR/JBRSDK/g' ${BASE_DIR}/${JBRSDK_BUNDLE}/release > release
mv release ${BASE_DIR}/${JBRSDK_BUNDLE}/release
tar -pcf $JBSDK.tar --exclude=*.debuginfo --exclude=demo --exclude=sample --exclude=man -C $BASE_DIR ${JBRSDK_BUNDLE} || exit $?
gzip $JBSDK.tar || exit $?
JBR_BUNDLE=jbr
JBR_BASE_NAME=jbr-$JBSDK_VERSION
rm -rf $BASE_DIR/$JBR_BUNDLE
JBR=$JBR_BASE_NAME-linux-x86-b$build_number
grep -v javafx modules.list | grep -v "jdk.internal.vm\|jdk.aot\|jcef" > modules.list.x86
echo Running jlink....
${JSDK}/bin/jlink \
--module-path ${JSDK}/jmods --no-man-pages --compress=2 \
--add-modules $(xargs < modules.list.x86 | sed s/" "//g | sed s/,$//g) --output ${BASE_DIR}/${JBR_BUNDLE} || exit $?
echo Modifying release info ...
grep -v \"^JAVA_VERSION\" ${JSDK}/release | grep -v \"^MODULES\" >> ${BASE_DIR}/${JBR_BUNDLE}/release
echo Creating $JBR.tar.gz ...
tar -pcf $JBR.tar -C $BASE_DIR $JBR_BUNDLE || exit $?
gzip $JBR.tar || exit $?
JBRSDK_TEST=$JBRSDK_BASE_NAME-linux-test-x86-b$build_number
echo Creating $JBRSDK_TEST.tar.gz ...
tar -pcf $JBRSDK_TEST.tar -C $BASE_DIR --exclude='test/jdk/demos' --exclude='test/hotspot/gtest' test || exit $?
gzip $JBRSDK_TEST.tar || exit $?

@@ -1,16 +0,0 @@
<?xml version="1.0" encoding="UTF-8"?>
<!DOCTYPE plist PUBLIC "-//Apple//DTD PLIST 1.0//EN" "http://www.apple.com/DTDs/PropertyList-1.0.dtd">
<plist version="1.0">
<dict>
<key>com.apple.security.cs.allow-jit</key>
<true/>
<key>com.apple.security.cs.allow-unsigned-executable-memory</key>
<true/>
<key>com.apple.security.cs.allow-dyld-environment-variables</key>
<true/>
<key>com.apple.security.cs.disable-library-validation</key>
<true/>
<key>com.apple.security.cs.disable-executable-page-protection</key>
<true/>
</dict>
</plist>

@@ -1,178 +0,0 @@
#!/bin/bash -x
# The following parameters must be specified:
# JBSDK_VERSION - specifies the major version of OpenJDK, e.g. 11_0_6 (underscores '_' are used instead of dots '.')
# JDK_BUILD_NUMBER - specifies the update release of the OpenJDK build, or the value of the --with-version-build argument to configure
# build_number - specifies the JetBrainsRuntime build number
# bundle_type - specifies the bundle to be built; possible values:
# <empty> or nomod - the release bundles without any additional modules (no jcef)
# jcef - the release bundles with the jcef module
# dcevm - the release bundles with the dcevm (enhanced class redefinition) patches applied
# fd - the fastdebug bundles, which also include the jcef module
#
# jbrsdk-${JBSDK_VERSION}-osx-x64-b${build_number}.tar.gz
# jbr-${JBSDK_VERSION}-osx-x64-b${build_number}.tar.gz
#
# $ ./java --version
# openjdk 11.0.6 2020-01-14
# OpenJDK Runtime Environment (build 11.0.6+${JDK_BUILD_NUMBER}-b${build_number})
# OpenJDK 64-Bit Server VM (build 11.0.6+${JDK_BUILD_NUMBER}-b${build_number}, mixed mode)
#
# Environment variables:
# JCEF_PATH - specifies the path to the directory with JCEF binaries.
# By default JCEF binaries should be located in ./jcef_mac
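# A sketch of a typical invocation (values are hypothetical):
#   JCEF_PATH=./jcef_mac bash <this-script> 11_0_6 1234 567 jcef x64
# would produce jbrsdk_jcef-11_0_6-osx-x64-b567.tar.gz and jbr_jcef-11_0_6-osx-x64-b567.tar.gz.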
JBSDK_VERSION=$1
JDK_BUILD_NUMBER=$2
build_number=$3
bundle_type=$4
architecture=$5 # aarch64 or x64
enable_aot=$6 # temporary param for building test jre with aot under aarch64
JBSDK_VERSION_WITH_DOTS=$(echo $JBSDK_VERSION | sed 's/_/\./g')
WITH_IMPORT_MODULES="--with-import-modules=${MODULAR_SDK_PATH:=./modular-sdk}"
JCEF_PATH=${JCEF_PATH:=./jcef_mac}
architecture=${architecture:=x64}
MAJOR_JBSDK_VERSION=$(echo "$JBSDK_VERSION_WITH_DOTS" | awk -F "." '{print $1}')
BOOT_JDK=${BOOT_JDK:=$(/usr/libexec/java_home -v 16)}
source jb/project/tools/common/scripts/common.sh
function copyJNF {
__contents_dir=$1
mkdir -p ${__contents_dir}/Frameworks
cp -Rp Frameworks/JavaNativeFoundation.framework ${__contents_dir}/Frameworks
}
function create_image_bundle {
__bundle_name=$1
__arch_name=$2
__modules_path=$3
__modules=$4
tmp=.bundle.$$.tmp
mkdir "$tmp" || do_exit $?
[ "$bundle_type" == "fd" ] && [ "$__arch_name" == "$JBRSDK_BUNDLE" ] && __bundle_name=$__arch_name && fastdebug_infix="fastdebug-"
JBR=${__bundle_name}-${JBSDK_VERSION}-osx-${architecture}-${fastdebug_infix}b${build_number}
JRE_CONTENTS=$tmp/$__arch_name/Contents
mkdir -p "$JRE_CONTENTS" || do_exit $?
echo Running jlink...
"$JSDK"/bin/jlink \
--module-path "$__modules_path" --no-man-pages --compress=2 \
--add-modules "$__modules" --output "$JRE_CONTENTS/Home" || do_exit $?
grep -v "^JAVA_VERSION" "$JSDK"/release | grep -v "^MODULES" >> "$JRE_CONTENTS/Home/release"
if [ "$__arch_name" == "$JBRSDK_BUNDLE" ]; then
sed 's/JBR/JBRSDK/g' $JRE_CONTENTS/Home/release > release
mv release $JRE_CONTENTS/Home/release
copy_jmods "$__modules" "$__modules_path" "$JRE_CONTENTS"/Home/jmods
fi
cp -R "$JSDK"/../MacOS "$JRE_CONTENTS"
cp "$JSDK"/../Info.plist "$JRE_CONTENTS"
if [[ "${architecture}" == *aarch64* ]]; then
# we can't notarize this library as a usual framework (with headers and a tbd file),
# but a single library notarizes correctly
copyJNF $JRE_CONTENTS
fi
[ -n "$bundle_type" ] && (cp -a $JCEF_PATH/Frameworks "$JRE_CONTENTS" || do_exit $?)
echo Creating "$JBR".tar.gz ...
COPYFILE_DISABLE=1 tar -pczf "$JBR".tar.gz --exclude='*.dSYM' --exclude='man' -C "$tmp" "$__arch_name" || do_exit $?
rm -rf "$tmp"
}
WITH_DEBUG_LEVEL="--with-debug-level=release"
CONF_ARCHITECTURE=x86_64
if [[ "${architecture}" == *aarch64* ]]; then
CONF_ARCHITECTURE=aarch64
fi
RELEASE_NAME=macosx-${CONF_ARCHITECTURE}-server-release
case "$bundle_type" in
"jcef")
do_reset_changes=0
;;
"dcevm")
HEAD_REVISION=$(git rev-parse HEAD)
git am jb/project/tools/patches/dcevm/*.patch || do_exit $?
do_reset_dcevm=0
do_reset_changes=0
;;
"nomod" | "")
bundle_type=""
;;
"fd")
do_reset_changes=0
WITH_DEBUG_LEVEL="--with-debug-level=fastdebug"
RELEASE_NAME=macosx-${CONF_ARCHITECTURE}-server-fastdebug
JBSDK=macosx-${architecture}-server-release
;;
esac
if [[ "${architecture}" == *aarch64* ]]; then
sh configure \
$WITH_DEBUG_LEVEL \
--with-vendor-name="${VENDOR_NAME}" \
--with-vendor-version-string="${VENDOR_VERSION_STRING}" \
--with-jvm-features=shenandoahgc \
--with-version-pre= \
--with-version-build="${JDK_BUILD_NUMBER}" \
--with-version-opt=b"${build_number}" \
--with-boot-jdk="$BOOT_JDK" \
--disable-hotspot-gtest --disable-javac-server --disable-full-docs --disable-manpages \
--enable-cds=no \
--with-extra-cflags="-F$(pwd)/Frameworks" \
--with-extra-cxxflags="-F$(pwd)/Frameworks" \
--with-extra-ldflags="-F$(pwd)/Frameworks" || do_exit $?
else
sh configure \
$WITH_DEBUG_LEVEL \
--with-vendor-name="$VENDOR_NAME" \
--with-vendor-version-string="$VENDOR_VERSION_STRING" \
--with-jvm-features=shenandoahgc \
--with-version-pre= \
--with-version-build="$JDK_BUILD_NUMBER" \
--with-version-opt=b"$build_number" \
--with-boot-jdk="$BOOT_JDK" \
--enable-cds=yes || do_exit $?
fi
make clean CONF=$RELEASE_NAME || do_exit $?
make images CONF=$RELEASE_NAME || do_exit $?
IMAGES_DIR=build/$RELEASE_NAME/images
JSDK=$IMAGES_DIR/jdk-bundle/jdk-$MAJOR_JBSDK_VERSION.jdk/Contents/Home
JSDK_MODS_DIR=$IMAGES_DIR/jmods
JBRSDK_BUNDLE=jbrsdk
if [ "$bundle_type" == "jcef" ] || [ "$bundle_type" == "dcevm" ] || [ "$bundle_type" == "fd" ]; then
git apply -p0 < jb/project/tools/patches/add_jcef_module.patch || do_exit $?
update_jsdk_mods "$JSDK" "$JCEF_PATH"/jmods "$JSDK"/jmods "$JSDK_MODS_DIR" || do_exit $?
cp $JCEF_PATH/jmods/* $JSDK_MODS_DIR # $JSDK/jmods is not changed
jbr_name_postfix="_${bundle_type}"
fi
# create runtime image bundle
modules=$(xargs < modules.list | sed s/" "//g) || do_exit $?
create_image_bundle "jbr${jbr_name_postfix}" "jbr" $JSDK_MODS_DIR "$modules" || do_exit $?
# create sdk image bundle
modules=$(cat "$JSDK"/release | grep MODULES | sed s/MODULES=//g | sed s/' '/','/g | sed s/\"//g | sed s/\\n//g) || do_exit $?
if [ "$bundle_type" == "jcef" ] || [ "$bundle_type" == "dcevm" ] || [ "$bundle_type" == "fd" ] || [ "$bundle_type" == "$JBRSDK_BUNDLE" ]; then
modules=${modules},$(get_mods_list "$JCEF_PATH"/jmods)
fi
create_image_bundle "$JBRSDK_BUNDLE${jbr_name_postfix}" "$JBRSDK_BUNDLE" "$JSDK_MODS_DIR" "$modules" || do_exit $?
if [ -z "$bundle_type" ]; then
JBRSDK_TEST=${JBRSDK_BUNDLE}-${JBSDK_VERSION}-osx-test-${architecture}-b${build_number}
echo Creating "$JBRSDK_TEST" ...
make test-image CONF=$RELEASE_NAME || do_exit $?
[ -f "$JBRSDK_TEST.tar.gz" ] && rm "$JBRSDK_TEST.tar.gz"
COPYFILE_DISABLE=1 tar -pczf "$JBRSDK_TEST".tar.gz -C $IMAGES_DIR --exclude='test/jdk/demos' test || do_exit $?
fi
do_exit 0

@@ -1,120 +0,0 @@
#!/bin/bash
APP_DIRECTORY=$1
APPL_USER=$2
APPL_PASSWORD=$3
APP_NAME=$4
BUNDLE_ID=$5
FAKE_ROOT="${6:-fake-root}"
if [[ -z "$APP_DIRECTORY" ]] || [[ -z "$APPL_USER" ]] || [[ -z "$APPL_PASSWORD" ]]; then
echo "Usage: $0 AppDirectory Username Password"
exit 1
fi
if [[ ! -d "$APP_DIRECTORY" ]]; then
echo "AppDirectory '$APP_DIRECTORY' does not exist or not a directory"
exit 1
fi
function log() {
echo "$(date '+[%H:%M:%S]') $*"
}
function publish-log() {
id=$1
file=$2
curl -T "$file" "$ARTIFACTORY_URL/$id" || true
}
function altool-upload() {
# Since altool uses the same file for the upload token, we have to trick it into using different folders for the token file location
# It also copies the zip into TMPDIR, so we override that too to simplify cleanup
OLD_HOME="$HOME"
export HOME="$FAKE_ROOT/home"
export TMPDIR="$FAKE_ROOT/tmp"
mkdir -p "$HOME"
mkdir -p "$TMPDIR"
export _JAVA_OPTIONS="-Duser.home=$HOME -Djava.io.tmpdir=$TMPDIR"
# Reduce amount of downloads, cache transporter libraries
shared_itmstransporter="$OLD_HOME/shared-itmstransporter"
if [[ -f "$shared_itmstransporter" ]]; then
cp -r "$shared_itmstransporter" "$HOME/.itmstransporter"
fi
# For some reason altool prints everything to stderr, not stdout
set +e
xcrun altool --notarize-app \
--username "$APPL_USER" --password "$APPL_PASSWORD" \
--primary-bundle-id "$BUNDLE_ID" \
--asc-provider JetBrainssro --file "$1" 2>&1 | tee "altool.init.out"
unset TMPDIR
export HOME="$OLD_HOME"
set -e
}
#immediately exit script with an error if a command fails
set -euo pipefail
file="$APP_NAME.zip"
log "Zipping $file..."
rm -rf "$file"
ditto -c -k --sequesterRsrc --keepParent "$APP_DIRECTORY" "$file"
log "Notarizing $file..."
rm -rf "altool.init.out" "altool.check.out"
altool-upload "$file"
rm -rf "$file"
notarization_info="$(grep -e "RequestUUID" "altool.init.out" | grep -oE '([0-9a-f-]{36})')"
if [ -z "$notarization_info" ]; then
log "Faile to read RequestUUID from altool.init.out"
exit 10
fi
PATH="$PATH:/usr/local/bin/"
log "Notarization request sent, awaiting response"
spent=0
while true; do
# For some reason altool prints everything to stderr, not stdout
xcrun altool --username "$APPL_USER" --notarization-info "$notarization_info" --password "$APPL_PASSWORD" >"altool.check.out" 2>&1 || true
status="$(grep -oe 'Status: .*' "altool.check.out" | cut -c 9- || true)"
log "Current status: $status"
if [ "$status" = "invalid" ]; then
log "Notarization failed"
ec=1
elif [ "$status" = "success" ]; then
log "Notarization succeeded"
ec=0
else
if [ "$status" != "in progress" ]; then
log "Unknown notarization status, waiting more, altool output:"
cat "altool.check.out"
fi
if [[ $spent -gt 60 ]]; then
log "Waiting time out (apx 60 minutes)"
ec=2
break
fi
sleep 60
((spent += 1))
continue
fi
developer_log="developer_log.json"
log "Fetching $developer_log"
# TODO: Replace cut with trim or something better
url="$(grep -oe 'LogFileURL: .*' "altool.check.out" | cut -c 13-)"
wget "$url" -O "$developer_log" && cat "$developer_log" || true
if [ $ec != 0 ]; then
log "Publishing $developer_log"
publish-log "$notarization_info" "$developer_log"
fi
break
done
cat "altool.check.out"
rm -rf "altool.init.out" "altool.check.out"
exit $ec

@@ -1,107 +0,0 @@
#!/bin/bash
APP_DIRECTORY=$1
JB_CERT=$2
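# A hypothetical example call (certificate name is a placeholder):
#   ./sign.sh jbr.app "Developer ID Application: Example Corp"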
if [[ -z "$APP_DIRECTORY" ]] || [[ -z "$JB_CERT" ]]; then
echo "Usage: $0 AppDirectory CertificateID"
exit 1
fi
if [[ ! -d "$APP_DIRECTORY" ]]; then
echo "AppDirectory '$APP_DIRECTORY' does not exist or not a directory"
exit 1
fi
function log() {
echo "$(date '+[%H:%M:%S]') $*"
}
#immediately exit script with an error if a command fails
set -euo pipefail
# Cleanup files left from previous sign attempt (if any)
find "$APP_DIRECTORY" -name '*.cstemp' -exec rm '{}' \;
log "Signing libraries and executables..."
# -perm +111 searches for executables
for f in \
"Contents/Home/bin" \
"Contents/Home/lib"; do
if [ -d "$APP_DIRECTORY/$f" ]; then
find "$APP_DIRECTORY/$f" \
-type f \( -name "*.jnilib" -o -name "*.dylib" -o -name "*.so" -o -perm +111 \) \
-exec codesign --timestamp --force \
-v -s "$JB_CERT" --options=runtime \
--entitlements entitlements.xml {} \;
fi
done
if [ -d "$APP_DIRECTORY/Contents/Frameworks" ]; then
log "Signing frameworks..."
for f in $APP_DIRECTORY/Contents/Frameworks/*; do
find "$f" \
-type f \( -name "*.jnilib" -o -name "*.dylib" -o -name "*.so" \) \
-exec codesign --timestamp --force \
-v -s "$JB_CERT" \
--entitlements entitlements.xml {} \;
codesign --timestamp --force \
-v -s "$JB_CERT" --options=runtime \
--entitlements entitlements.xml "$f"
done
fi
log "Signing libraries in jars in $PWD"
# todo: add set -euo pipefail; into the inner sh -c
# `-e` breaks the `grep -q && printf` logic
# with `-o pipefail` there's no input for the 'while' loop
find "$APP_DIRECTORY" -name '*.jar' \
-exec sh -c "set -u; unzip -l \"\$0\" | grep -q -e '\.dylib\$' -e '\.jnilib\$' -e '\.so\$' -e '^jattach\$' && printf \"\$0\0\" " {} \; |
while IFS= read -r -d $'\0' file; do
log "Processing libraries in $file"
rm -rf jarfolder jar.jar
mkdir jarfolder
filename="${file##*/}"
log "Filename: $filename"
cp "$file" jarfolder && (cd jarfolder && jar xf "$filename" && rm "$filename")
find jarfolder \
-type f \( -name "*.jnilib" -o -name "*.dylib" -o -name "*.so" -o -name "jattach" \) \
-exec codesign --timestamp --force \
-v -s "$JB_CERT" --options=runtime \
--entitlements entitlements.xml {} \;
(cd jarfolder; zip -q -r -o ../jar.jar .)
mv jar.jar "$file"
done
rm -rf jarfolder jar.jar
log "Signing other files..."
for f in \
"Contents/MacOS"; do
if [ -d "$APP_DIRECTORY/$f" ]; then
find "$APP_DIRECTORY/$f" \
-type f \( -name "*.jnilib" -o -name "*.dylib" -o -name "*.so" -o -perm +111 \) \
-exec codesign --timestamp --force \
-v -s "$JB_CERT" --options=runtime \
--entitlements entitlements.xml {} \;
fi
done
#log "Signing executable..."
#codesign --timestamp \
# -v -s "$JB_CERT" --options=runtime \
# --force \
# --entitlements entitlements.xml "$APP_DIRECTORY/Contents/MacOS/idea"
log "Signing whole app..."
codesign --timestamp \
-v -s "$JB_CERT" --options=runtime \
--force \
--entitlements entitlements.xml "$APP_DIRECTORY"
log "Verifying java is not broken"
find "$APP_DIRECTORY" \
-type f -name 'java' -perm +111 -exec {} -version \;

@@ -1,134 +0,0 @@
#!/bin/bash -x
#immediately exit script with an error if a command fails
set -euo pipefail
export COPY_EXTENDED_ATTRIBUTES_DISABLE=true
export COPYFILE_DISABLE=true
INPUT_FILE=$1
EXPLODED=$2.exploded
BACKUP_JMODS=$2.backup
USERNAME=$3
PASSWORD=$4
CODESIGN_STRING=$5
NOTARIZE=$6
BUNDLE_ID=$7
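# A hypothetical example call (script name and values are placeholders):
#   ./signapp.sh jbr-11_0_6-osx-x64-b567.tar.gz work jbuser keychain-password "Developer ID Application: Example Corp" yes com.example.jbr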
cd "$(dirname "$0")"
function log() {
echo "$(date '+[%H:%M:%S]') $*"
}
log "Deleting $EXPLODED ..."
if test -d "$EXPLODED"; then
find "$EXPLODED" -mindepth 1 -maxdepth 1 -exec chmod -R u+wx '{}' \;
fi
rm -rf "$EXPLODED"
mkdir "$EXPLODED"
rm -rf "$BACKUP_JMODS"
mkdir "$BACKUP_JMODS"
log "Unzipping $INPUT_FILE to $EXPLODED ..."
tar -xzvf "$INPUT_FILE" --directory $EXPLODED
BUILD_NAME="$(ls "$EXPLODED")"
sed -i '' s/BNDL/APPL/ $EXPLODED/$BUILD_NAME/Contents/Info.plist
rm -f $EXPLODED/$BUILD_NAME/Contents/CodeResources
rm "$INPUT_FILE"
if test -d $EXPLODED/$BUILD_NAME/Contents/Home/jmods; then
mv $EXPLODED/$BUILD_NAME/Contents/Home/jmods $BACKUP_JMODS
fi
log "$INPUT_FILE extracted and removed"
APPLICATION_PATH="$EXPLODED/$BUILD_NAME"
find "$APPLICATION_PATH/Contents/Home/bin" \
-maxdepth 1 -type f -name '*.jnilib' -print0 |
while IFS= read -r -d $'\0' file; do
if [ -f "$file" ]; then
log "Linking $file"
b="$(basename "$file" .jnilib)"
ln -sf "$b.jnilib" "$(dirname "$file")/$b.dylib"
fi
done
find "$APPLICATION_PATH/Contents/" \
-maxdepth 1 -type f -name '*.txt' -print0 |
while IFS= read -r -d $'\0' file; do
if [ -f "$file" ]; then
log "Moving $file"
mv "$file" "$APPLICATION_PATH/Contents/Resources"
fi
done
non_plist=$(find "$APPLICATION_PATH/Contents/" -maxdepth 1 -type f -and -not -name 'Info.plist' | wc -l)
if [[ $non_plist -gt 0 ]]; then
log "Only Info.plist file is allowed in Contents directory but found $non_plist file(s):"
log "$(find "$APPLICATION_PATH/Contents/" -maxdepth 1 -type f -and -not -name 'Info.plist')"
exit 1
fi
log "Unlocking keychain..."
# Make sure *.p12 is imported into local KeyChain
security unlock-keychain -p "$PASSWORD" "/Users/$USERNAME/Library/Keychains/login.keychain"
attempt=1
limit=3
set +e
while [[ $attempt -le $limit ]]; do
log "Signing (attempt $attempt) $APPLICATION_PATH ..."
./sign.sh "$APPLICATION_PATH" "$CODESIGN_STRING"
ec=$?
if [[ $ec -ne 0 ]]; then
((attempt += 1))
if [ $attempt -eq $limit ]; then
set -e
fi
log "Signing failed, wait for 30 sec and try to sign again"
sleep 30
else
log "Signing done"
codesign -v "$APPLICATION_PATH" -vvvvv
log "Check sign done"
((attempt += limit))
fi
done
set -e
if [ "$NOTARIZE" = "yes" ]; then
log "Notarizing..."
# shellcheck disable=SC1090
source "$HOME/.notarize_token"
APP_NAME=$(echo ${INPUT_FILE} | awk -F"." '{ print $1 }')
# Since the notarization tool uses the same file for the upload token, we have to trick it into using different folders, hence the fake root
# It also leaves a copy of the zip file in TMPDIR, so notarize.sh overrides TMPDIR and uses FAKE_ROOT as the location for the temporary TMPDIR
FAKE_ROOT="$(pwd)/fake-root"
mkdir -p "$FAKE_ROOT"
echo "Notarization will use fake root: $FAKE_ROOT"
./notarize.sh "$APPLICATION_PATH" "$APPLE_USERNAME" "$APPLE_PASSWORD" "$APP_NAME" "$BUNDLE_ID" "$FAKE_ROOT"
rm -rf "$FAKE_ROOT"
set +e
log "Stapling..."
xcrun stapler staple "$APPLICATION_PATH"
else
log "Notarization disabled"
log "Stapling disabled"
fi
log "Zipping $BUILD_NAME to $INPUT_FILE ..."
(
#cd "$EXPLODED"
#ditto -c -k --sequesterRsrc --keepParent "$BUILD_NAME" "../$INPUT_FILE"
if test -d $BACKUP_JMODS/jmods; then
mv $BACKUP_JMODS/jmods $EXPLODED/$BUILD_NAME/Contents/Home
fi
tar -pczvf $INPUT_FILE --exclude='*.dSYM' --exclude='man' -C $EXPLODED $BUILD_NAME
log "Finished zipping"
)
rm -rf "$EXPLODED"
log "Done"

@@ -1,56 +0,0 @@
#!/bin/bash -x
# How to call this script:
# eval $(jb/project/tools/mkjbrapi.sh)
# It is used to build the jetbrains.api module
# After properly calling this script, you can use the following variables:
# JBR_API_JAR - absolute path to resulting JAR
# JBR_API_SOURCES_JAR - absolute path to JAR with sources
# JBR_API_VERSION - JBR API version in form <major>.<minor>
# JBR_API_VERSION_MAJOR, JBR_API_VERSION_MINOR - JBR API version components
# JBR_BUILD_DIR - absolute path to JBR build directory
# JBR_BOOT_JDK - absolute path to used boot JDK
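# A minimal usage sketch (the destination path is hypothetical):
#   eval $(jb/project/tools/mkjbrapi.sh)
#   cp "$JBR_API_JAR" "out/jbr-api-$JBR_API_VERSION.jar"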
ROOT=$(pwd)
sh configure --with-debug-level=release --disable-warnings-as-errors 1>&2 || exit $?
# Get boot JDK & build directory using make script
make -f $ROOT/make/JBRApi.gmk -I $ROOT jbr-api MAKEOVERRIDES= CONF=release OUT="$ROOT/build/jbr-api.cfg" 1>&2 || exit $?
source "$ROOT/build/jbr-api.cfg" || exit $?
# Build module
make jetbrains.api 1>&2 || exit $?
# Get JBR API version from compiled class
JSHELL_COMMAND='
System.out.println("\nVERSION_MAJOR=" + com.jetbrains.JBRApi.getMajorVersion());
System.out.println("\nVERSION_MINOR=" + com.jetbrains.JBRApi.getMinorVersion());
/exit'
VERSION_VARIABLES=$("$BOOT_JDK/bin/jshell" -s --module-path "$BUILD_DIR/jdk/modules/jetbrains.api" \
--add-modules jetbrains.api <<< "$JSHELL_COMMAND" | grep "^VERSION\|^|") || exit $?
eval "$VERSION_VARIABLES" || exit $?
# Create JAR
(
cd "$BUILD_DIR/jdk/modules/jetbrains.api"
"$BOOT_JDK/bin/jar" -cf "$BUILD_DIR/jbr-api.jar" * 1>&2
) || exit $?
# Create source JAR
(
cd "src/jetbrains.api/share/classes"
"$BOOT_JDK/bin/jar" -cf "$BUILD_DIR/jbr-api-sources.jar" * 1>&2
) || exit $?
# Print output values
echo "JBR_API_JAR=$BUILD_DIR/jbr-api.jar"
echo "JBR_API_SOURCES_JAR=$BUILD_DIR/jbr-api-sources.jar"
echo "JBR_API_VERSION=$VERSION_MAJOR.$VERSION_MINOR"
echo "JBR_API_VERSION_MAJOR=$VERSION_MAJOR"
echo "JBR_API_VERSION_MINOR=$VERSION_MINOR"
echo "JBR_BUILD_DIR=$BUILD_DIR"
echo "JBR_BOOT_JDK=$BOOT_JDK"
echo "Success!" 1>&2

@@ -1,30 +0,0 @@
diff --git modules.list modules.list
index dcf610a6a56..f8797505c23 100644
--- modules.list
+++ modules.list
@@ -51,4 +51,7 @@ jdk.zipfs,
jdk.hotspot.agent,
jetbrains.api,
jetbrains.api.impl,
-jdk.jcmd
+jdk.jcmd,
+jcef,
+gluegen.rt,
+jogl.all
diff --git src/java.desktop/share/classes/module-info.java src/java.desktop/share/classes/module-info.java
index 897647ee368..781d1809493 100644
--- src/java.desktop/share/classes/module-info.java
+++ src/java.desktop/share/classes/module-info.java
@@ -116,7 +116,11 @@ module java.desktop {
// see make/GensrcModuleInfo.gmk
exports sun.awt to
jdk.accessibility,
- jdk.unsupported.desktop;
+ jdk.unsupported.desktop,
+ jcef,
+ jogl.all;
+
+ exports java.awt.peer to jcef;
exports java.awt.dnd.peer to jdk.unsupported.desktop;
exports sun.awt.dnd to jdk.unsupported.desktop;

File diff suppressed because it is too large

@@ -1,29 +0,0 @@
From 960dafbeeba190911955c208b611fecc15d66738 Mon Sep 17 00:00:00 2001
From: Vladimir Dvorak <vladimir.dvorak@jetbrains.com>
Date: Wed, 11 Mar 2020 14:19:34 +0100
Subject: [PATCH 03/34] Fix class cast exception on redefinition of class A,
that is superclass of B that has anonymous class C
---
src/hotspot/share/oops/instanceKlass.cpp | 5 ++++-
1 file changed, 4 insertions(+), 1 deletion(-)
diff --git a/src/hotspot/share/oops/instanceKlass.cpp b/src/hotspot/share/oops/instanceKlass.cpp
index 994fc8a3bc8..3be3a09ef8f 100644
--- a/src/hotspot/share/oops/instanceKlass.cpp
+++ b/src/hotspot/share/oops/instanceKlass.cpp
@@ -953,7 +953,10 @@ bool InstanceKlass::link_class_impl(TRAPS) {
if (!is_linked()) {
if (!is_rewritten()) {
- {
+ // (DCEVM): If class A is being redefined and class B->A (B is extended from A) and B is host class of anonymous class C
+ // then second redefinition fails with cannot cast klass exception. So we currently turn off bytecode verification
+ // on redefinition.
+ if (!AllowEnhancedClassRedefinition || !newest_version()->is_redefining()) {
bool verify_ok = verify_code(THREAD);
if (!verify_ok) {
return false;
--
2.23.0

@@ -1,240 +0,0 @@
From 39df5f163d4a0f1fd6b92313a5570808f19d5e20 Mon Sep 17 00:00:00 2001
From: Vladimir Dvorak <vladimir.dvorak@jetbrains.com>
Date: Sun, 4 Oct 2020 21:12:12 +0200
Subject: [PATCH 05/34] Support for Lambda class redefinition
---
.../share/classfile/classLoaderData.cpp | 9 +++
.../share/classfile/classLoaderData.hpp | 2 +-
.../share/classfile/systemDictionary.cpp | 12 +++-
.../share/classfile/systemDictionary.hpp | 1 +
.../prims/jvmtiEnhancedRedefineClasses.cpp | 65 +++++++++++++++++--
.../prims/jvmtiEnhancedRedefineClasses.hpp | 1 +
.../share/prims/resolvedMethodTable.cpp | 2 +
src/hotspot/share/prims/unsafe.cpp | 1 +
8 files changed, 83 insertions(+), 10 deletions(-)
diff --git a/src/hotspot/share/classfile/classLoaderData.cpp b/src/hotspot/share/classfile/classLoaderData.cpp
index 0cd90bb8c27..4d64c6b454a 100644
--- a/src/hotspot/share/classfile/classLoaderData.cpp
+++ b/src/hotspot/share/classfile/classLoaderData.cpp
@@ -593,6 +593,15 @@ Dictionary* ClassLoaderData::create_dictionary() {
return new Dictionary(this, size, resizable);
}
+void ClassLoaderData::exchange_holders(ClassLoaderData* cld) {
+ oop holder_oop = _holder.peek();
+ _holder.replace(cld->_holder.peek());
+ cld->_holder.replace(holder_oop);
+ WeakHandle<vm_class_loader_data> exchange = _holder;
+ _holder = cld->_holder;
+ cld->_holder = exchange;
+}
+
// Tell the GC to keep this klass alive while iterating ClassLoaderDataGraph
oop ClassLoaderData::holder_phantom() const {
// A klass that was previously considered dead can be looked up in the
diff --git a/src/hotspot/share/classfile/classLoaderData.hpp b/src/hotspot/share/classfile/classLoaderData.hpp
index ba2393f8dd0..e2ae0a77351 100644
--- a/src/hotspot/share/classfile/classLoaderData.hpp
+++ b/src/hotspot/share/classfile/classLoaderData.hpp
@@ -181,7 +181,7 @@ class ClassLoaderData : public CHeapObj<mtClass> {
bool has_accumulated_modified_oops() { return _accumulated_modified_oops; }
oop holder_no_keepalive() const;
oop holder_phantom() const;
-
+ void exchange_holders(ClassLoaderData* cld);
private:
void unload();
bool keep_alive() const { return _keep_alive > 0; }
diff --git a/src/hotspot/share/classfile/systemDictionary.cpp b/src/hotspot/share/classfile/systemDictionary.cpp
index bd0cae7cb9b..8f2b46add4d 100644
--- a/src/hotspot/share/classfile/systemDictionary.cpp
+++ b/src/hotspot/share/classfile/systemDictionary.cpp
@@ -1062,10 +1062,14 @@ InstanceKlass* SystemDictionary::parse_stream(Symbol* class_name,
Handle class_loader,
ClassFileStream* st,
const ClassLoadInfo& cl_info,
+ InstanceKlass* old_klass,
TRAPS) {
EventClassLoad class_load_start_event;
ClassLoaderData* loader_data;
+
+ bool is_redefining = (old_klass != NULL);
+
bool is_unsafe_anon_class = cl_info.unsafe_anonymous_host() != NULL;
// - for unsafe anonymous class: create a new CLD whith a class holder that uses
@@ -1094,8 +1098,12 @@ InstanceKlass* SystemDictionary::parse_stream(Symbol* class_name,
class_name,
loader_data,
cl_info,
- false, // pick_newest
+ is_redefining, // pick_newest
CHECK_NULL);
+ if (is_redefining && k != NULL) {
+ k->set_redefining(true);
+ k->set_old_version(old_klass);
+ }
if ((cl_info.is_hidden() || is_unsafe_anon_class) && k != NULL) {
// Hidden classes that are not strong and unsafe anonymous classes must update
@@ -1998,7 +2006,7 @@ void SystemDictionary::remove_from_hierarchy(InstanceKlass* k) {
k->remove_from_sibling_list();
}
-// (DCEVM)
+// (DCEVM)
void SystemDictionary::update_constraints_after_redefinition() {
constraints()->update_after_redefinition();
}
diff --git a/src/hotspot/share/classfile/systemDictionary.hpp b/src/hotspot/share/classfile/systemDictionary.hpp
index 4547449dbec..931e655d631 100644
--- a/src/hotspot/share/classfile/systemDictionary.hpp
+++ b/src/hotspot/share/classfile/systemDictionary.hpp
@@ -329,6 +329,7 @@ public:
Handle class_loader,
ClassFileStream* st,
const ClassLoadInfo& cl_info,
+ InstanceKlass* old_klass,
TRAPS);
// Resolve from stream (called by jni_DefineClass and JVM_DefineClass)
diff --git a/src/hotspot/share/prims/jvmtiEnhancedRedefineClasses.cpp b/src/hotspot/share/prims/jvmtiEnhancedRedefineClasses.cpp
index 92ce6c27b8a..8b765623dcd 100644
--- a/src/hotspot/share/prims/jvmtiEnhancedRedefineClasses.cpp
+++ b/src/hotspot/share/prims/jvmtiEnhancedRedefineClasses.cpp
@@ -494,6 +494,8 @@ void VM_EnhancedRedefineClasses::doit() {
ClassLoaderDataGraph::classes_do(&clear_cpool_cache);
+ // SystemDictionary::methods_do(fix_invoke_method);
+
// JSR-292 support
if (_any_class_has_resolved_methods) {
bool trace_name_printed = false;
@@ -756,12 +758,34 @@ jvmtiError VM_EnhancedRedefineClasses::load_new_class_versions(TRAPS) {
// load hook event.
state->set_class_being_redefined(the_class, _class_load_kind);
- InstanceKlass* k = SystemDictionary::resolve_from_stream(the_class_sym,
- the_class_loader,
- protection_domain,
- &st,
- the_class,
- THREAD);
+ InstanceKlass* k;
+
+ if (InstanceKlass::cast(the_class)->is_anonymous()) {
+ const InstanceKlass* host_class = the_class->host_klass();
+
+ // Make sure it's the real host class, not another anonymous class.
+ while (host_class != NULL && host_class->is_anonymous()) {
+ host_class = host_class->host_klass();
+ }
+
+ k = SystemDictionary::parse_stream(the_class_sym,
+ the_class_loader,
+ protection_domain,
+ &st,
+ host_class,
+ the_class,
+ NULL,
+ THREAD);
+ k->class_loader_data()->exchange_holders(the_class->class_loader_data());
+ the_class->class_loader_data()->inc_keep_alive();
+ } else {
+ k = SystemDictionary::resolve_from_stream(the_class_sym,
+ the_class_loader,
+ protection_domain,
+ &st,
+ the_class,
+ THREAD);
+ }
// Clear class_being_redefined just to be sure.
state->clear_class_being_redefined();
@@ -1442,6 +1466,30 @@ void VM_EnhancedRedefineClasses::MethodDataCleaner::do_klass(Klass* k) {
}
}
+void VM_EnhancedRedefineClasses::fix_invoke_method(Method* method) {
+
+ constantPoolHandle other_cp = constantPoolHandle(method->constants());
+
+ for (int i = 0; i < other_cp->length(); i++) {
+ if (other_cp->tag_at(i).is_klass()) {
+ Klass* klass = other_cp->resolved_klass_at(i);
+ if (klass->new_version() != NULL) {
+ // Constant pool entry points to redefined class -- update to the new version
+ other_cp->klass_at_put(i, klass->newest_version());
+ }
+ assert(other_cp->resolved_klass_at(i)->new_version() == NULL, "Must be new klass!");
+ }
+ }
+
+ ConstantPoolCache* cp_cache = other_cp->cache();
+ if (cp_cache != NULL) {
+ cp_cache->clear_entries();
+ }
+
+}
+
+
+
void VM_EnhancedRedefineClasses::update_jmethod_ids() {
for (int j = 0; j < _matching_methods_length; ++j) {
Method* old_method = _matching_old_methods[j];
@@ -1979,7 +2027,10 @@ jvmtiError VM_EnhancedRedefineClasses::find_sorted_affected_classes(TRAPS) {
// Find classes not directly redefined, but affected by a redefinition (because one of its supertypes is redefined)
AffectedKlassClosure closure(_affected_klasses);
// Updated in j10, from original SystemDictionary::classes_do
- ClassLoaderDataGraph::dictionary_classes_do(&closure);
+
+ ClassLoaderDataGraph::classes_do(&closure);
+ //ClassLoaderDataGraph::dictionary_classes_do(&closure);
+
log_trace(redefine, class, load)("%d classes affected", _affected_klasses->length());
// Sort the affected klasses such that a supertype is always on a smaller array index than its subtype.
diff --git a/src/hotspot/share/prims/jvmtiEnhancedRedefineClasses.hpp b/src/hotspot/share/prims/jvmtiEnhancedRedefineClasses.hpp
index 60b62c3170a..d8a11b51fe9 100644
--- a/src/hotspot/share/prims/jvmtiEnhancedRedefineClasses.hpp
+++ b/src/hotspot/share/prims/jvmtiEnhancedRedefineClasses.hpp
@@ -116,6 +116,7 @@ class VM_EnhancedRedefineClasses: public VM_GC_Operation {
void rollback();
static void mark_as_scavengable(nmethod* nm);
static void unpatch_bytecode(Method* method);
+ static void fix_invoke_method(Method* method);
// Figure out which new methods match old methods in name and signature,
// which methods have been added, and which are no longer present
diff --git a/src/hotspot/share/prims/resolvedMethodTable.cpp b/src/hotspot/share/prims/resolvedMethodTable.cpp
index 122bb8c186b..81b3aa96564 100644
--- a/src/hotspot/share/prims/resolvedMethodTable.cpp
+++ b/src/hotspot/share/prims/resolvedMethodTable.cpp
@@ -414,6 +414,8 @@ void ResolvedMethodTable::adjust_method_entries_dcevm(bool * trace_name_printed)
InstanceKlass* newer_klass = InstanceKlass::cast(old_method->method_holder()->new_version());
Method* newer_method = newer_klass->method_with_idnum(old_method->orig_method_idnum());
+ log_info(redefine, class, load, exceptions)("Adjusting method: '%s' of new class %s", newer_method->name_and_sig_as_C_string(), newer_klass->name()->as_C_string());
+
assert(newer_klass == newer_method->method_holder(), "call after swapping redefined guts");
assert(newer_method != NULL, "method_with_idnum() should not be NULL");
assert(old_method != newer_method, "sanity check");
diff --git a/src/hotspot/share/prims/unsafe.cpp b/src/hotspot/share/prims/unsafe.cpp
index 72d81ec9d6c..027afa3fabd 100644
--- a/src/hotspot/share/prims/unsafe.cpp
+++ b/src/hotspot/share/prims/unsafe.cpp
@@ -865,6 +865,7 @@ Unsafe_DefineAnonymousClass_impl(JNIEnv *env,
host_loader,
&st,
cl_info,
+ NULL,
CHECK_NULL);
if (anonk == NULL) {
return NULL;
--
2.23.0

@@ -1,135 +0,0 @@
From 5af1daedc86b5fec0f222cbdda3afbdf518985ea Mon Sep 17 00:00:00 2001
From: Vladimir Dvorak <vladimir.dvorak@jetbrains.com>
Date: Sat, 23 May 2020 10:02:15 +0200
Subject: [PATCH 06/34] Fix "no original bytecode found" error if method with
bkp is missing
Sometimes the IDE can deploy a class with an erroneous method; such a method has
no bytecode, but a breakpoint position can still exist.
---
src/hotspot/share/interpreter/bytecodes.cpp | 2 +-
.../share/interpreter/interpreterRuntime.cpp | 2 +-
src/hotspot/share/oops/method.cpp | 8 ++++----
src/hotspot/share/oops/method.hpp | 4 ++--
.../prims/jvmtiEnhancedRedefineClasses.cpp | 18 ++++++++++--------
5 files changed, 18 insertions(+), 16 deletions(-)
diff --git a/src/hotspot/share/interpreter/bytecodes.cpp b/src/hotspot/share/interpreter/bytecodes.cpp
index e377e36b88c..262ecc021b2 100644
--- a/src/hotspot/share/interpreter/bytecodes.cpp
+++ b/src/hotspot/share/interpreter/bytecodes.cpp
@@ -84,7 +84,7 @@ Bytecodes::Code Bytecodes::code_at(Method* method, int bci) {
Bytecodes::Code Bytecodes::non_breakpoint_code_at(const Method* method, address bcp) {
assert(method != NULL, "must have the method for breakpoint conversion");
assert(method->contains(bcp), "must be valid bcp in method");
- return method->orig_bytecode_at(method->bci_from(bcp));
+ return method->orig_bytecode_at(method->bci_from(bcp), false);
}
int Bytecodes::special_length_at(Bytecodes::Code code, address bcp, address end) {
diff --git a/src/hotspot/share/interpreter/interpreterRuntime.cpp b/src/hotspot/share/interpreter/interpreterRuntime.cpp
index ed3cc3eb6a2..504e59caf51 100644
--- a/src/hotspot/share/interpreter/interpreterRuntime.cpp
+++ b/src/hotspot/share/interpreter/interpreterRuntime.cpp
@@ -814,7 +814,7 @@ JRT_END
// Invokes
JRT_ENTRY(Bytecodes::Code, InterpreterRuntime::get_original_bytecode_at(JavaThread* thread, Method* method, address bcp))
- return method->orig_bytecode_at(method->bci_from(bcp));
+ return method->orig_bytecode_at(method->bci_from(bcp), false);
JRT_END
JRT_ENTRY(void, InterpreterRuntime::set_original_bytecode_at(JavaThread* thread, Method* method, address bcp, Bytecodes::Code new_code))
diff --git a/src/hotspot/share/oops/method.cpp b/src/hotspot/share/oops/method.cpp
index 516f2bb8f2f..1c88511a5fc 100644
--- a/src/hotspot/share/oops/method.cpp
+++ b/src/hotspot/share/oops/method.cpp
@@ -1853,14 +1853,14 @@ bool CompressedLineNumberReadStream::read_pair() {
#if INCLUDE_JVMTI
-Bytecodes::Code Method::orig_bytecode_at(int bci) const {
+Bytecodes::Code Method::orig_bytecode_at(int bci, bool no_fatal) const {
BreakpointInfo* bp = method_holder()->breakpoints();
for (; bp != NULL; bp = bp->next()) {
if (bp->match(this, bci)) {
return bp->orig_bytecode();
}
}
- {
+ if (!no_fatal) {
ResourceMark rm;
fatal("no original bytecode found in %s at bci %d", name_and_sig_as_C_string(), bci);
}
@@ -2006,7 +2006,7 @@ BreakpointInfo::BreakpointInfo(Method* m, int bci) {
_signature_index = m->signature_index();
_orig_bytecode = (Bytecodes::Code) *m->bcp_from(_bci);
if (_orig_bytecode == Bytecodes::_breakpoint)
- _orig_bytecode = m->orig_bytecode_at(_bci);
+ _orig_bytecode = m->orig_bytecode_at(_bci, false);
_next = NULL;
}
@@ -2015,7 +2015,7 @@ void BreakpointInfo::set(Method* method) {
{
Bytecodes::Code code = (Bytecodes::Code) *method->bcp_from(_bci);
if (code == Bytecodes::_breakpoint)
- code = method->orig_bytecode_at(_bci);
+ code = method->orig_bytecode_at(_bci, false);
assert(orig_bytecode() == code, "original bytecode must be the same");
}
#endif
diff --git a/src/hotspot/share/oops/method.hpp b/src/hotspot/share/oops/method.hpp
index 83ed2d9c3c1..4d4cc6dc012 100644
--- a/src/hotspot/share/oops/method.hpp
+++ b/src/hotspot/share/oops/method.hpp
@@ -230,7 +230,7 @@ class Method : public Metadata {
// JVMTI breakpoints
#if !INCLUDE_JVMTI
- Bytecodes::Code orig_bytecode_at(int bci) const {
+ Bytecodes::Code orig_bytecode_at(int bci, bool no_fatal) const {
ShouldNotReachHere();
return Bytecodes::_shouldnotreachhere;
}
@@ -239,7 +239,7 @@ class Method : public Metadata {
};
u2 number_of_breakpoints() const {return 0;}
#else // !INCLUDE_JVMTI
- Bytecodes::Code orig_bytecode_at(int bci) const;
+ Bytecodes::Code orig_bytecode_at(int bci, bool no_fatal) const;
void set_orig_bytecode_at(int bci, Bytecodes::Code code);
void set_breakpoint(int bci);
void clear_breakpoint(int bci);
diff --git a/src/hotspot/share/prims/jvmtiEnhancedRedefineClasses.cpp b/src/hotspot/share/prims/jvmtiEnhancedRedefineClasses.cpp
index 8b765623dcd..a859b8e1162 100644
--- a/src/hotspot/share/prims/jvmtiEnhancedRedefineClasses.cpp
+++ b/src/hotspot/share/prims/jvmtiEnhancedRedefineClasses.cpp
@@ -1362,14 +1362,16 @@ void VM_EnhancedRedefineClasses::unpatch_bytecode(Method* method) {
if (code == Bytecodes::_breakpoint) {
int bci = method->bci_from(bcp);
- code = method->orig_bytecode_at(bci);
- java_code = Bytecodes::java_code(code);
- if (code != java_code &&
- (java_code == Bytecodes::_getfield ||
- java_code == Bytecodes::_putfield ||
- java_code == Bytecodes::_aload_0)) {
- // Let breakpoint table handling unpatch bytecode
- method->set_orig_bytecode_at(bci, java_code);
+ code = method->orig_bytecode_at(bci, true);
+ if (code != Bytecodes::_shouldnotreachhere) {
+ java_code = Bytecodes::java_code(code);
+ if (code != java_code &&
+ (java_code == Bytecodes::_getfield ||
+ java_code == Bytecodes::_putfield ||
+ java_code == Bytecodes::_aload_0)) {
+ // Let breakpoint table handling unpatch bytecode
+ method->set_orig_bytecode_at(bci, java_code);
+ }
}
} else {
java_code = Bytecodes::java_code(code);
--
2.23.0

@@ -1,57 +0,0 @@
From 19d2274a5dff6e6b31474252b45e5e7484f0180b Mon Sep 17 00:00:00 2001
From: Vladimir Dvorak <vladimir.dvorak@jetbrains.com>
Date: Sun, 24 May 2020 12:07:42 +0200
Subject: [PATCH 07/34] Replace deleted method with
Universe::throw_no_such_method_error
---
.../share/prims/resolvedMethodTable.cpp | 28 +++++++++----------
1 file changed, 14 insertions(+), 14 deletions(-)
diff --git a/src/hotspot/share/prims/resolvedMethodTable.cpp b/src/hotspot/share/prims/resolvedMethodTable.cpp
index 81b3aa96564..caf03ffe56d 100644
--- a/src/hotspot/share/prims/resolvedMethodTable.cpp
+++ b/src/hotspot/share/prims/resolvedMethodTable.cpp
@@ -404,25 +404,25 @@ void ResolvedMethodTable::adjust_method_entries_dcevm(bool * trace_name_printed)
if (old_method->is_old()) {
+ InstanceKlass* newer_klass = InstanceKlass::cast(old_method->method_holder()->new_version());
+ Method* newer_method;
+
// Method* new_method;
if (old_method->is_deleted()) {
- // FIXME:(DCEVM) - check if exception can be thrown
- // new_method = Universe::throw_no_such_method_error();
- continue;
- }
-
- InstanceKlass* newer_klass = InstanceKlass::cast(old_method->method_holder()->new_version());
- Method* newer_method = newer_klass->method_with_idnum(old_method->orig_method_idnum());
+ newer_method = Universe::throw_no_such_method_error();
+ } else {
+ newer_method = newer_klass->method_with_idnum(old_method->orig_method_idnum());
- log_info(redefine, class, load, exceptions)("Adjusting method: '%s' of new class %s", newer_method->name_and_sig_as_C_string(), newer_klass->name()->as_C_string());
+ log_info(redefine, class, load, exceptions)("Adjusting method: '%s' of new class %s", newer_method->name_and_sig_as_C_string(), newer_klass->name()->as_C_string());
- assert(newer_klass == newer_method->method_holder(), "call after swapping redefined guts");
- assert(newer_method != NULL, "method_with_idnum() should not be NULL");
- assert(old_method != newer_method, "sanity check");
+ assert(newer_klass == newer_method->method_holder(), "call after swapping redefined guts");
+ assert(newer_method != NULL, "method_with_idnum() should not be NULL");
+ assert(old_method != newer_method, "sanity check");
- if (_the_table->lookup(newer_method) != NULL) {
- // old method was already adjusted if new method exists in _the_table
- continue;
+ if (_the_table->lookup(newer_method) != NULL) {
+ // old method was already adjusted if new method exists in _the_table
+ continue;
+ }
}
java_lang_invoke_ResolvedMethodName::set_vmtarget(mem_name, newer_method);
--
2.23.0

File diff suppressed because it is too large

@@ -1,50 +0,0 @@
From ca47ab5a0a6ce8e2644736f323a335a957311af9 Mon Sep 17 00:00:00 2001
From: Vladimir Dvorak <vladimir.dvorak@jetbrains.com>
Date: Sat, 13 Jun 2020 18:50:59 +0200
Subject: [PATCH 09/34] Change log level in advanced redefinition
- Change log level for "Comparing different class ver.." to debug
- Fix adjust_method_entries_dcevm logging levels and severity
---
src/hotspot/share/prims/jvmtiEnhancedRedefineClasses.cpp | 2 +-
src/hotspot/share/prims/resolvedMethodTable.cpp | 4 ++--
2 files changed, 3 insertions(+), 3 deletions(-)
diff --git a/src/hotspot/share/prims/jvmtiEnhancedRedefineClasses.cpp b/src/hotspot/share/prims/jvmtiEnhancedRedefineClasses.cpp
index 6c9eb40ecf5..b09ba554e07 100644
--- a/src/hotspot/share/prims/jvmtiEnhancedRedefineClasses.cpp
+++ b/src/hotspot/share/prims/jvmtiEnhancedRedefineClasses.cpp
@@ -916,7 +916,7 @@ jvmtiError VM_EnhancedRedefineClasses::load_new_class_versions(TRAPS) {
// Calculated the difference between new and old class (field change, method change, supertype change, ...).
int VM_EnhancedRedefineClasses::calculate_redefinition_flags(InstanceKlass* new_class) {
int result = Klass::NoRedefinition;
- log_info(redefine, class, load)("Comparing different class versions of class %s",new_class->name()->as_C_string());
+ log_debug(redefine, class, load)("Comparing different class versions of class %s",new_class->name()->as_C_string());
assert(new_class->old_version() != NULL, "must have old version");
InstanceKlass* the_class = InstanceKlass::cast(new_class->old_version());
diff --git a/src/hotspot/share/prims/resolvedMethodTable.cpp b/src/hotspot/share/prims/resolvedMethodTable.cpp
index caf03ffe56d..eb9fcda44f3 100644
--- a/src/hotspot/share/prims/resolvedMethodTable.cpp
+++ b/src/hotspot/share/prims/resolvedMethodTable.cpp
@@ -413,7 +413,7 @@ void ResolvedMethodTable::adjust_method_entries_dcevm(bool * trace_name_printed)
} else {
newer_method = newer_klass->method_with_idnum(old_method->orig_method_idnum());
- log_info(redefine, class, load, exceptions)("Adjusting method: '%s' of new class %s", newer_method->name_and_sig_as_C_string(), newer_klass->name()->as_C_string());
+ log_debug(redefine, class, update)("Adjusting method: '%s' of new class %s", newer_method->name_and_sig_as_C_string(), newer_klass->name()->as_C_string());
assert(newer_klass == newer_method->method_holder(), "call after swapping redefined guts");
assert(newer_method != NULL, "method_with_idnum() should not be NULL");
@@ -433,7 +433,7 @@ void ResolvedMethodTable::adjust_method_entries_dcevm(bool * trace_name_printed)
ResourceMark rm;
if (!(*trace_name_printed)) {
- log_info(redefine, class, update)("adjust: name=%s", old_method->method_holder()->external_name());
+ log_debug(redefine, class, update)("adjust: name=%s", old_method->method_holder()->external_name());
*trace_name_printed = true;
}
log_debug(redefine, class, update, constantpool)
--
2.23.0

@@ -1,26 +0,0 @@
From 7e236beee0375656d1955fc1168143c1639fb7f1 Mon Sep 17 00:00:00 2001
From: Vladimir Dvorak <vladimir.dvorak@jetbrains.com>
Date: Tue, 6 Oct 2020 22:15:31 +0200
Subject: [PATCH 10/34] AllowEnhancedClassRedefinition is false (disabled) by
default
---
src/hotspot/share/runtime/globals.hpp | 2 +-
1 file changed, 1 insertion(+), 1 deletion(-)
diff --git a/src/hotspot/share/runtime/globals.hpp b/src/hotspot/share/runtime/globals.hpp
index 5b367704800..2710c6ea0e5 100644
--- a/src/hotspot/share/runtime/globals.hpp
+++ b/src/hotspot/share/runtime/globals.hpp
@@ -2466,7 +2466,7 @@ const size_t minimumSymbolTableSize = 1024;
diagnostic(bool, DeoptimizeNMethodBarriersALot, false, \
"Make nmethod barriers deoptimise a lot.") \
\
- product(bool, AllowEnhancedClassRedefinition, true, \
+ product(bool, AllowEnhancedClassRedefinition, false, \
"Allow enhanced class redefinition beyond swapping method " \
"bodies") \
\
--
2.23.0

@@ -1,25 +0,0 @@
From d56e73885111b386771f564ec6beb305338993df Mon Sep 17 00:00:00 2001
From: Vladimir Dvorak <vladimir.dvorak@jetbrains.com>
Date: Mon, 19 Oct 2020 20:00:04 +0200
Subject: [PATCH 12/34] Set HOTSPOT_VM_DISTRO=Dynamic Code Evolution
---
make/autoconf/version-numbers | 2 +-
1 file changed, 1 insertion(+), 1 deletion(-)
diff --git a/make/autoconf/version-numbers b/make/autoconf/version-numbers
index aabdc5bed20..df8025a2e84 100644
--- a/make/autoconf/version-numbers
+++ b/make/autoconf/version-numbers
@@ -45,7 +45,7 @@ PRODUCT_NAME=OpenJDK
PRODUCT_SUFFIX="Runtime Environment"
JDK_RC_PLATFORM_NAME=Platform
COMPANY_NAME=N/A
-HOTSPOT_VM_DISTRO="OpenJDK"
+HOTSPOT_VM_DISTRO="Dynamic Code Evolution"
VENDOR_URL=https://openjdk.java.net/
VENDOR_URL_BUG=https://bugreport.java.com/bugreport/
VENDOR_URL_VM_BUG=https://bugreport.java.com/bugreport/crash.jsp
--
2.23.0

@@ -1,71 +0,0 @@
From 7ebad43ed45805b0a3736c510f708ff17697ba7e Mon Sep 17 00:00:00 2001
From: Vladimir Dvorak <vladimir.dvorak@jetbrains.com>
Date: Sun, 11 Oct 2020 10:43:28 +0200
Subject: [PATCH 13/34] Fix G1 nmethod registration
---
.../prims/jvmtiEnhancedRedefineClasses.cpp | 19 ++++++++++++++++---
.../prims/jvmtiEnhancedRedefineClasses.hpp | 3 ++-
2 files changed, 18 insertions(+), 4 deletions(-)
diff --git a/src/hotspot/share/prims/jvmtiEnhancedRedefineClasses.cpp b/src/hotspot/share/prims/jvmtiEnhancedRedefineClasses.cpp
index b09ba554e07..718426f2819 100644
--- a/src/hotspot/share/prims/jvmtiEnhancedRedefineClasses.cpp
+++ b/src/hotspot/share/prims/jvmtiEnhancedRedefineClasses.cpp
@@ -216,7 +216,14 @@ void VM_EnhancedRedefineClasses::mark_as_scavengable(nmethod* nm) {
}
}
-void VM_EnhancedRedefineClasses::mark_as_scavengable_g1(nmethod* nm) {
+void VM_EnhancedRedefineClasses::unregister_nmethod_g1(nmethod* nm) {
+ // It should work not only for G1 but also for another GCs, but this way is safer now
+ if (!nm->is_zombie() && !nm->is_unloaded()) {
+ Universe::heap()->unregister_nmethod(nm);
+ }
+}
+
+void VM_EnhancedRedefineClasses::register_nmethod_g1(nmethod* nm) {
// It should work not only for G1 but also for another GCs, but this way is safer now
if (!nm->is_zombie() && !nm->is_unloaded()) {
Universe::heap()->register_nmethod(nm);
@@ -521,8 +528,9 @@ void VM_EnhancedRedefineClasses::doit() {
// For now, mark all nmethod's as scavengable that are not scavengable already
if (ScavengeRootsInCode) {
if (UseG1GC) {
- // this should work also for other GCs
- CodeCache::nmethods_do(mark_as_scavengable_g1);
+ // G1 holds references to nmethods in regions based on oops values. Since oops in nmethod can be changed in ChangePointers* closures
+ // we unregister nmethods from G1 heap, then closures are processed (oops are changed) and finally we register nmethod to G1 again
+ CodeCache::nmethods_do(unregister_nmethod_g1);
} else {
CodeCache::nmethods_do(mark_as_scavengable);
}
@@ -545,6 +553,11 @@ void VM_EnhancedRedefineClasses::doit() {
Universe::root_oops_do(&oopClosureNoBarrier);
+ if (UseG1GC) {
+ // this should work also for other GCs
+ CodeCache::nmethods_do(register_nmethod_g1);
+ }
+
}
log_trace(redefine, class, obsolete, metadata)("After updating instances");
diff --git a/src/hotspot/share/prims/jvmtiEnhancedRedefineClasses.hpp b/src/hotspot/share/prims/jvmtiEnhancedRedefineClasses.hpp
index 9755944d70b..4c0412d343d 100644
--- a/src/hotspot/share/prims/jvmtiEnhancedRedefineClasses.hpp
+++ b/src/hotspot/share/prims/jvmtiEnhancedRedefineClasses.hpp
@@ -116,7 +116,8 @@ class VM_EnhancedRedefineClasses: public VM_GC_Operation {
void rollback();
static void mark_as_scavengable(nmethod* nm);
- static void mark_as_scavengable_g1(nmethod* nm);
+ static void unregister_nmethod_g1(nmethod* nm);
+ static void register_nmethod_g1(nmethod* nm);
static void unpatch_bytecode(Method* method);
static void fix_invoke_method(Method* method);
--
2.23.0

@@ -1,26 +0,0 @@
From 5c7e5f245f79d7e8575461dab0c356ed74c8e9a3 Mon Sep 17 00:00:00 2001
From: Vladimir Dvorak <vladimir.dvorak@jetbrains.com>
Date: Thu, 22 Oct 2020 20:15:20 +0200
Subject: [PATCH 14/34] Initialize method's _new_version/_old_version to NULL
---
src/hotspot/share/oops/method.cpp | 3 ++-
1 file changed, 2 insertions(+), 1 deletion(-)
diff --git a/src/hotspot/share/oops/method.cpp b/src/hotspot/share/oops/method.cpp
index 1c88511a5fc..ce940cf10a9 100644
--- a/src/hotspot/share/oops/method.cpp
+++ b/src/hotspot/share/oops/method.cpp
@@ -91,7 +91,8 @@ Method* Method::allocate(ClassLoaderData* loader_data,
return new (loader_data, size, MetaspaceObj::MethodType, THREAD) Method(cm, access_flags);
}
-Method::Method(ConstMethod* xconst, AccessFlags access_flags) {
+Method::Method(ConstMethod* xconst, AccessFlags access_flags) : _new_version(NULL),
+ _old_version(NULL) {
NoSafepointVerifier no_safepoint;
set_constMethod(xconst);
set_access_flags(access_flags);
--
2.23.0

@@ -1,193 +0,0 @@
From 6ffac6e5064ec6633fdbeb8520333dca00bc6a62 Mon Sep 17 00:00:00 2001
From: Vladimir Dvorak <vladimir.dvorak@jetbrains.com>
Date: Fri, 23 Oct 2020 10:20:26 +0200
Subject: [PATCH 15/34] Clear dcevm code separation
---
src/hotspot/share/classfile/systemDictionary.cpp | 4 ++--
src/hotspot/share/gc/serial/genMarkSweep.cpp | 8 +++++---
src/hotspot/share/interpreter/linkResolver.cpp | 16 +++++++++++-----
.../instrumentation/jfrEventClassTransformer.cpp | 2 +-
src/hotspot/share/oops/instanceKlass.cpp | 10 ++++++----
src/hotspot/share/oops/method.cpp | 2 +-
.../share/prims/jvmtiGetLoadedClasses.cpp | 2 +-
src/hotspot/share/runtime/reflection.cpp | 2 +-
8 files changed, 28 insertions(+), 18 deletions(-)
diff --git a/src/hotspot/share/classfile/systemDictionary.cpp b/src/hotspot/share/classfile/systemDictionary.cpp
index 8f2b46add4d..9ac6ec96cb5 100644
--- a/src/hotspot/share/classfile/systemDictionary.cpp
+++ b/src/hotspot/share/classfile/systemDictionary.cpp
@@ -1241,7 +1241,7 @@ InstanceKlass* SystemDictionary::resolve_from_stream(Symbol* class_name,
MutexLocker mu(THREAD, SystemDictionary_lock);
Klass* check = find_class(h_name, k->class_loader_data());
- assert((check == k && !k->is_redefining()) || (k->is_redefining() && check == k->old_version()), "should be present in the dictionary");
+ assert(check == k && !k->is_redefining() || k->is_redefining() && check == k->old_version(), "should be present in the dictionary");
} );
return k;
@@ -2290,7 +2290,7 @@ void SystemDictionary::check_constraints(unsigned int d_hash,
// also hold array classes.
assert(check->is_instance_klass(), "noninstance in systemdictionary");
- if ((defining == true) || ((k != check) && k->old_version() != check)) {
+ if ((defining == true) || (k != check && (!AllowEnhancedClassRedefinition || k->old_version() != check))) {
throwException = true;
ss.print("loader %s", loader_data->loader_name_and_id());
ss.print(" attempted duplicate %s definition for %s. (%s)",
diff --git a/src/hotspot/share/gc/serial/genMarkSweep.cpp b/src/hotspot/share/gc/serial/genMarkSweep.cpp
index 1d13c647452..548df01e557 100644
--- a/src/hotspot/share/gc/serial/genMarkSweep.cpp
+++ b/src/hotspot/share/gc/serial/genMarkSweep.cpp
@@ -334,7 +334,9 @@ void GenMarkSweep::mark_sweep_phase4() {
GenCompactClosure blk;
gch->generation_iterate(&blk, true);
- DcevmSharedGC::copy_rescued_objects_back(MarkSweep::_rescued_oops, true);
- DcevmSharedGC::clear_rescued_objects_resource(MarkSweep::_rescued_oops);
- MarkSweep::_rescued_oops = NULL;
+ if (AllowEnhancedClassRedefinition) {
+ DcevmSharedGC::copy_rescued_objects_back(MarkSweep::_rescued_oops, true);
+ DcevmSharedGC::clear_rescued_objects_resource(MarkSweep::_rescued_oops);
+ MarkSweep::_rescued_oops = NULL;
+ }
}
diff --git a/src/hotspot/share/interpreter/linkResolver.cpp b/src/hotspot/share/interpreter/linkResolver.cpp
index b6e9e0a308d..b2f24ddbeda 100644
--- a/src/hotspot/share/interpreter/linkResolver.cpp
+++ b/src/hotspot/share/interpreter/linkResolver.cpp
@@ -282,9 +282,14 @@ void LinkResolver::check_klass_accessibility(Klass* ref_klass, Klass* sel_klass,
if (!base_klass->is_instance_klass()) {
return; // no relevant check to do
}
-
- Reflection::VerifyClassAccessResults vca_result =
- Reflection::verify_class_access(ref_klass->newest_version(), InstanceKlass::cast(base_klass->newest_version()), true);
+ Klass* refKlassNewest = ref_klass;
+ Klass* baseKlassNewest = base_klass;
+ if (AllowEnhancedClassRedefinition) {
+ refKlassNewest = ref_klass->newest_version();
+ baseKlassNewest = base_klass->newest_version();
+ }
+ Reflection::VerifyClassAccessResults vca_result =
+ Reflection::verify_class_access(refKlassNewest, InstanceKlass::cast(baseKlassNewest), true);
if (vca_result != Reflection::ACCESS_OK) {
ResourceMark rm(THREAD);
char* msg = Reflection::verify_class_access_msg(ref_klass,
@@ -566,7 +571,8 @@ void LinkResolver::check_method_accessability(Klass* ref_klass,
// We'll check for the method name first, as that's most likely
// to be false (so we'll short-circuit out of these tests).
if (sel_method->name() == vmSymbols::clone_name() &&
- sel_klass->newest_version() == SystemDictionary::Object_klass()->newest_version() &&
+ ( !AllowEnhancedClassRedefinition && sel_klass == SystemDictionary::Object_klass() ||
+ AllowEnhancedClassRedefinition && sel_klass->newest_version() == SystemDictionary::Object_klass()->newest_version()) &&
resolved_klass->is_array_klass()) {
// We need to change "protected" to "public".
assert(flags.is_protected(), "clone not protected?");
@@ -1011,7 +1017,7 @@ void LinkResolver::resolve_field(fieldDescriptor& fd,
// or by the <init> method (in case of an instance field).
if (is_put && fd.access_flags().is_final()) {
- if (sel_klass != current_klass && sel_klass != current_klass->active_version()) {
+ if (sel_klass != current_klass && (!AllowEnhancedClassRedefinition || sel_klass != current_klass->active_version())) {
ResourceMark rm(THREAD);
stringStream ss;
ss.print("Update to %s final field %s.%s attempted from a different class (%s) than the field's declaring class",
diff --git a/src/hotspot/share/jfr/instrumentation/jfrEventClassTransformer.cpp b/src/hotspot/share/jfr/instrumentation/jfrEventClassTransformer.cpp
index 96fc139bea3..f7284197c5a 100644
--- a/src/hotspot/share/jfr/instrumentation/jfrEventClassTransformer.cpp
+++ b/src/hotspot/share/jfr/instrumentation/jfrEventClassTransformer.cpp
@@ -1471,7 +1471,7 @@ static InstanceKlass* create_new_instance_klass(InstanceKlass* ik, ClassFileStre
cld,
&cl_info,
ClassFileParser::INTERNAL, // internal visibility
- false,
+ false,
THREAD);
if (HAS_PENDING_EXCEPTION) {
log_pending_exception(PENDING_EXCEPTION);
diff --git a/src/hotspot/share/oops/instanceKlass.cpp b/src/hotspot/share/oops/instanceKlass.cpp
index 3be3a09ef8f..f8e60941046 100644
--- a/src/hotspot/share/oops/instanceKlass.cpp
+++ b/src/hotspot/share/oops/instanceKlass.cpp
@@ -199,7 +199,9 @@ bool InstanceKlass::has_nest_member(InstanceKlass* k, TRAPS) const {
// able to perform that loading but we can't exclude the compiler threads from
// executing this logic. But it should actually be impossible to trigger loading here.
Klass* k2 = _constants->klass_at(cp_index, THREAD);
- k2 = k2->newest_version();
+ if (AllowEnhancedClassRedefinition) {
+ k2 = k2->newest_version();
+ }
assert(!HAS_PENDING_EXCEPTION || PENDING_EXCEPTION->is_a(SystemDictionary::VirtualMachineError_klass()),
"Exceptions should not be possible here");
if (k2 == k) {
@@ -1003,7 +1005,7 @@ bool InstanceKlass::link_class_impl(TRAPS) {
#endif
set_init_state(linked);
// (DCEVM) Must check for old version in order to prevent infinite loops.
- if (JvmtiExport::should_post_class_prepare() && old_version() == NULL /* JVMTI deadlock otherwise */) {
+ if (JvmtiExport::should_post_class_prepare() && (!AllowEnhancedClassRedefinition || old_version() == NULL) /* JVMTI deadlock otherwise */) {
Thread *thread = THREAD;
assert(thread->is_Java_thread(), "thread->is_Java_thread()");
JvmtiExport::post_class_prepare((JavaThread *) thread, this);
@@ -1084,7 +1086,7 @@ void InstanceKlass::initialize_impl(TRAPS) {
// we might end up throwing IE from link/symbol resolution sites
// that aren't expected to throw. This would wreak havoc. See 6320309.
while ((is_being_initialized() && !is_reentrant_initialization(jt))
- || (old_version() != NULL && InstanceKlass::cast(old_version())->is_being_initialized())) {
+ || (AllowEnhancedClassRedefinition && old_version() != NULL && InstanceKlass::cast(old_version())->is_being_initialized())) {
wait = true;
jt->set_class_to_be_initialized(this);
ol.wait_uninterruptibly(jt);
@@ -3782,7 +3784,7 @@ void InstanceKlass::verify_on(outputStream* st) {
guarantee(sib->is_klass(), "should be klass");
// TODO: (DCEVM) explain
- guarantee(sib->super() == super || super->newest_version() == SystemDictionary::Object_klass(), "siblings should have same superklass");
+ guarantee(sib->super() == super || AllowEnhancedClassRedefinition && super->newest_version() == SystemDictionary::Object_klass(), "siblings should have same superklass");
}
// Verify local interfaces
diff --git a/src/hotspot/share/oops/method.cpp b/src/hotspot/share/oops/method.cpp
index ce940cf10a9..2d8e5b0256b 100644
--- a/src/hotspot/share/oops/method.cpp
+++ b/src/hotspot/share/oops/method.cpp
@@ -2208,7 +2208,7 @@ void Method::ensure_jmethod_ids(ClassLoaderData* loader_data, int capacity) {
// Add a method id to the jmethod_ids
jmethodID Method::make_jmethod_id(ClassLoaderData* loader_data, Method* m) {
// FIXME: (DCEVM) ???
- if (m != m->newest_version()) {
+ if (AllowEnhancedClassRedefinition && m != m->newest_version()) {
m = m->newest_version();
}
ClassLoaderData* cld = loader_data;
diff --git a/src/hotspot/share/prims/jvmtiGetLoadedClasses.cpp b/src/hotspot/share/prims/jvmtiGetLoadedClasses.cpp
index 1c7677f270f..6c12ee64a6e 100644
--- a/src/hotspot/share/prims/jvmtiGetLoadedClasses.cpp
+++ b/src/hotspot/share/prims/jvmtiGetLoadedClasses.cpp
@@ -75,7 +75,7 @@ public:
// the new version (SystemDictionary stores only new versions). But the LoadedClassesClosure's functionality was
// changed in java8 where jvmtiLoadedClasses collects all classes from all classloaders, therefore we
// must use new versions only.
- if (k->new_version()==NULL) {
+ if (AllowEnhancedClassRedefinition && k->new_version()==NULL) {
_classStack.push((jclass) _env->jni_reference(Handle(_cur_thread, k->java_mirror())));
if (_dictionary_walk) {
// Collect array classes this way when walking the dictionary (because array classes are
diff --git a/src/hotspot/share/runtime/reflection.cpp b/src/hotspot/share/runtime/reflection.cpp
index 0e7722dba7d..d67457f02ac 100644
--- a/src/hotspot/share/runtime/reflection.cpp
+++ b/src/hotspot/share/runtime/reflection.cpp
@@ -628,7 +628,7 @@ bool Reflection::verify_member_access(const Klass* current_class,
TRAPS) {
// (DCEVM) Decide accessibility based on active version
- if (current_class != NULL) {
+ if (AllowEnhancedClassRedefinition && current_class != NULL) {
current_class = current_class->active_version();
}
--
2.23.0
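Note: the hunks in the patch above all apply the same guard, that is, DCEVM-specific version lookups (newest_version(), active_version(), old_version()) are only consulted when AllowEnhancedClassRedefinition is enabled, so the stock code paths are untouched when the flag is off. The fragment below is a minimal, self-contained C++ sketch of that gating pattern; Klass, newer, and resolve_for_linking() are simplified stand-ins for illustration, not the real HotSpot declarations.

// Illustrative only: models the AllowEnhancedClassRedefinition gating pattern.
struct Klass {
    Klass* newer = nullptr;              // set when this class has been redefined
    Klass* newest_version() {            // walk to the most recent definition
        Klass* k = this;
        while (k->newer != nullptr) k = k->newer;
        return k;
    }
};

static bool AllowEnhancedClassRedefinition = true;  // -XX:+AllowEnhancedClassRedefinition

static Klass* resolve_for_linking(Klass* k) {
    // Only follow the redefinition chain when enhanced redefinition is enabled;
    // otherwise behave exactly like the unpatched VM.
    return AllowEnhancedClassRedefinition ? k->newest_version() : k;
}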



@@ -1,26 +0,0 @@
From dc675de6ac42819b8536827ea450fcad13a97448 Mon Sep 17 00:00:00 2001
From: Vladimir Dvorak <vladimir.dvorak@jetbrains.com>
Date: Wed, 11 Nov 2020 18:45:15 +0100
Subject: [PATCH 16/34] Fix LoadedClassesClosure - fixes problems with remote
debugging
---
src/hotspot/share/prims/jvmtiGetLoadedClasses.cpp | 2 +-
1 file changed, 1 insertion(+), 1 deletion(-)
diff --git a/src/hotspot/share/prims/jvmtiGetLoadedClasses.cpp b/src/hotspot/share/prims/jvmtiGetLoadedClasses.cpp
index 6c12ee64a6e..2a469555dbd 100644
--- a/src/hotspot/share/prims/jvmtiGetLoadedClasses.cpp
+++ b/src/hotspot/share/prims/jvmtiGetLoadedClasses.cpp
@@ -75,7 +75,7 @@ public:
// the new version (SystemDictionary stores only new versions). But the LoadedClassesClosure's functionality was
// changed in java8 where jvmtiLoadedClasses collects all classes from all classloaders, therefore we
// must use new versions only.
- if (AllowEnhancedClassRedefinition && k->new_version()==NULL) {
+ if (!AllowEnhancedClassRedefinition || k->new_version()==NULL) {
_classStack.push((jclass) _env->jni_reference(Handle(_cur_thread, k->java_mirror())));
if (_dictionary_walk) {
// Collect array classes this way when walking the dictionary (because array classes are
--
2.23.0
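Note: this one-line fix changes which classes GetLoadedClasses reports. With enhanced redefinition enabled, only the newest version of each class should be pushed; with it disabled, every loaded class must be reported, since the old-version bookkeeping is never populated. A small illustrative predicate follows (hypothetical names, not HotSpot API):

// Should GetLoadedClasses report this class?
// - enhanced redefinition off: report every loaded class
// - enhanced redefinition on:  report only classes that have no newer version
static bool should_report(bool allow_enhanced_redefinition, const void* new_version) {
    return !allow_enhanced_redefinition || new_version == nullptr;
}

With the flag off, should_report(false, ...) is now always true, which is what restores the classes that remote debuggers were missing before this fix.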


@@ -1,183 +0,0 @@
From 1d682efa88c716e1849163d5abff3a3367581d16 Mon Sep 17 00:00:00 2001
From: Vladimir Dvorak <vladimir.dvorak@jetbrains.com>
Date: Mon, 16 Nov 2020 21:11:19 +0100
Subject: [PATCH 18/34] pre dcevm15 - fix GC spaces originally in removed CMS
patch
---
src/hotspot/share/gc/shared/space.cpp | 16 ++++++++--------
src/hotspot/share/gc/shared/space.hpp | 6 +++---
src/hotspot/share/gc/shared/space.inline.hpp | 14 ++++++++------
.../share/prims/jvmtiEnhancedRedefineClasses.cpp | 6 ++----
4 files changed, 21 insertions(+), 21 deletions(-)
diff --git a/src/hotspot/share/gc/shared/space.cpp b/src/hotspot/share/gc/shared/space.cpp
index 875a6dc854f..9772c32c42e 100644
--- a/src/hotspot/share/gc/shared/space.cpp
+++ b/src/hotspot/share/gc/shared/space.cpp
@@ -375,11 +375,11 @@ HeapWord* CompactibleSpace::forward_compact_top(size_t size, CompactPoint* cp, H
}
HeapWord* CompactibleSpace::forward(oop q, size_t size,
- CompactPoint* cp, HeapWord* compact_top) {
+ CompactPoint* cp, HeapWord* compact_top, bool force_forward) {
compact_top = forward_compact_top(size, cp, compact_top);
// store the forwarding pointer into the mark word
- if (cast_from_oop<HeapWord*>(q) != compact_top || (size_t)q->size() != size) {
+ if (force_forward || cast_from_oop<HeapWord*>(q) != compact_top || (size_t)q->size() != size) {
q->forward_to(oop(compact_top));
assert(q->is_gc_marked(), "encoding the pointer should preserve the mark");
} else {
@@ -501,7 +501,7 @@ bool CompactibleSpace::must_rescue(oop old_obj, oop new_obj) {
} else {
assert(space_index(old_obj) != space_index(new_obj), "old_obj and new_obj must be in different spaces");
- if (tenured_gen->is_in_reserved(new_obj)) {
+ if (new_in_tenured) {
// Must never rescue when moving from the new into the old generation.
assert(GenCollectedHeap::heap()->young_gen()->is_in_reserved(old_obj), "old_obj must be in DefNewGeneration");
assert(space_index(old_obj) > space_index(new_obj), "must be");
@@ -824,14 +824,14 @@ void OffsetTableContigSpace::verify() const {
// Compute the forward sizes and leave out objects whose position could
// possibly overlap other objects.
HeapWord* CompactibleSpace::forward_with_rescue(HeapWord* q, size_t size,
- CompactPoint* cp, HeapWord* compact_top) {
+ CompactPoint* cp, HeapWord* compact_top, bool force_forward) {
size_t forward_size = size;
// (DCEVM) There is a new version of the class of q => different size
if (oop(q)->klass()->new_version() != NULL) {
size_t new_size = oop(q)->size_given_klass(oop(q)->klass()->new_version());
- assert(size != new_size, "instances without changed size have to be updated prior to GC run");
+ // assert(size != new_size, "instances without changed size have to be updated prior to GC run");
forward_size = new_size;
}
@@ -845,7 +845,7 @@ HeapWord* CompactibleSpace::forward_with_rescue(HeapWord* q, size_t size,
return compact_top;
}
- return forward(oop(q), forward_size, cp, compact_top);
+ return forward(oop(q), forward_size, cp, compact_top, force_forward);
}
// Compute the forwarding addresses for the objects that need to be rescued.
@@ -861,11 +861,11 @@ HeapWord* CompactibleSpace::forward_rescued(CompactPoint* cp, HeapWord* compact_
// (DCEVM) There is a new version of the class of q => different size
if (oop(q)->klass()->new_version() != NULL) {
size_t new_size = oop(q)->size_given_klass(oop(q)->klass()->new_version());
- assert(size != new_size, "instances without changed size have to be updated prior to GC run");
+ // assert(size != new_size, "instances without changed size have to be updated prior to GC run");
size = new_size;
}
- compact_top = cp->space->forward(oop(q), size, cp, compact_top);
+ compact_top = cp->space->forward(oop(q), size, cp, compact_top, true);
assert(compact_top <= end(), "must not write over end of space!");
}
MarkSweep::_rescued_oops->clear();
diff --git a/src/hotspot/share/gc/shared/space.hpp b/src/hotspot/share/gc/shared/space.hpp
index c9bfc365f0f..f7648995454 100644
--- a/src/hotspot/share/gc/shared/space.hpp
+++ b/src/hotspot/share/gc/shared/space.hpp
@@ -405,7 +405,7 @@ public:
virtual void prepare_for_compaction(CompactPoint* cp) = 0;
// MarkSweep support phase3
DEBUG_ONLY(int space_index(oop obj));
- bool must_rescue(oop old_obj, oop new_obj);
+ virtual bool must_rescue(oop old_obj, oop new_obj);
HeapWord* rescue(HeapWord* old_obj);
virtual void adjust_pointers();
// MarkSweep support phase4
@@ -436,11 +436,11 @@ public:
// function of the then-current compaction space, and updates "cp->threshold
// accordingly".
virtual HeapWord* forward(oop q, size_t size, CompactPoint* cp,
- HeapWord* compact_top);
+ HeapWord* compact_top, bool force_forward);
// (DCEVM) same as forward, but can rescue objects. Invoked only during
// redefinition runs
HeapWord* forward_with_rescue(HeapWord* q, size_t size, CompactPoint* cp,
- HeapWord* compact_top);
+ HeapWord* compact_top, bool force_forward);
HeapWord* forward_rescued(CompactPoint* cp, HeapWord* compact_top);
diff --git a/src/hotspot/share/gc/shared/space.inline.hpp b/src/hotspot/share/gc/shared/space.inline.hpp
index 5a93e93471b..fa645423685 100644
--- a/src/hotspot/share/gc/shared/space.inline.hpp
+++ b/src/hotspot/share/gc/shared/space.inline.hpp
@@ -163,6 +163,8 @@ inline void CompactibleSpace::scan_and_forward(SpaceType* space, CompactPoint* c
HeapWord* cur_obj = space->bottom();
HeapWord* scan_limit = space->scan_limit();
+ bool force_forward = false;
+
while (cur_obj < scan_limit) {
assert(!space->scanned_block_is_obj(cur_obj) ||
oop(cur_obj)->mark_raw().is_marked() || oop(cur_obj)->mark_raw().is_unlocked() ||
@@ -174,14 +176,15 @@ inline void CompactibleSpace::scan_and_forward(SpaceType* space, CompactPoint* c
size_t size = space->scanned_block_size(cur_obj);
if (redefinition_run) {
- compact_top = cp->space->forward_with_rescue(cur_obj, size, cp, compact_top);
+ compact_top = cp->space->forward_with_rescue(cur_obj, size, cp, compact_top, force_forward);
if (first_dead == NULL && oop(cur_obj)->is_gc_marked()) {
/* Was moved (otherwise, forward would reset mark),
set first_dead to here */
first_dead = cur_obj;
+ force_forward = true;
}
} else {
- compact_top = cp->space->forward(oop(cur_obj), size, cp, compact_top);
+ compact_top = cp->space->forward(oop(cur_obj), size, cp, compact_top, false);
}
cur_obj += size;
@@ -197,9 +200,9 @@ inline void CompactibleSpace::scan_and_forward(SpaceType* space, CompactPoint* c
// see if we might want to pretend this object is alive so that
// we don't have to compact quite as often.
- if (cur_obj == compact_top && dead_spacer.insert_deadspace(cur_obj, end)) {
+ if (!redefinition_run && cur_obj == compact_top && dead_spacer.insert_deadspace(cur_obj, end)) {
oop obj = oop(cur_obj);
- compact_top = cp->space->forward(obj, obj->size(), cp, compact_top);
+ compact_top = cp->space->forward(obj, obj->size(), cp, compact_top, force_forward);
end_of_live = end;
} else {
// otherwise, it really is a free region.
@@ -362,8 +365,7 @@ inline void CompactibleSpace::scan_and_compact(SpaceType* space, bool redefiniti
Prefetch::write(compaction_top, copy_interval);
// copy object and reinit its mark
- assert(cur_obj != compaction_top || oop(cur_obj)->klass()->new_version() != NULL,
- "everything in this pass should be moving");
+ assert(redefinition_run || cur_obj != compaction_top, "everything in this pass should be moving");
if (redefinition_run && oop(cur_obj)->klass()->new_version() != NULL) {
Klass* new_version = oop(cur_obj)->klass()->new_version();
if (new_version->update_information() == NULL) {
diff --git a/src/hotspot/share/prims/jvmtiEnhancedRedefineClasses.cpp b/src/hotspot/share/prims/jvmtiEnhancedRedefineClasses.cpp
index 718426f2819..1da6661dd3e 100644
--- a/src/hotspot/share/prims/jvmtiEnhancedRedefineClasses.cpp
+++ b/src/hotspot/share/prims/jvmtiEnhancedRedefineClasses.cpp
@@ -431,13 +431,11 @@ public:
Klass* new_klass = obj->klass()->new_version();
if (new_klass->update_information() != NULL) {
- int size_diff = obj->size() - obj->size_given_klass(new_klass);
-
- // Either new size is bigger or gap is too small to be filled
- if (size_diff < 0 || (size_diff > 0 && (size_t) size_diff < CollectedHeap::min_fill_size())) {
+ if (obj->size() - obj->size_given_klass(new_klass) != 0) {
// We need an instance update => set back to old klass
_needs_instance_update = true;
} else {
+ // Either new size is bigger or gap is too small to be filled
oop src = obj;
if (new_klass->is_copying_backwards()) {
copy_to_tmp(obj);
--
2.23.0
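Note: the GC changes above thread a force_forward flag through the mark-compact forwarding path so that, during a redefinition run, rescued objects always receive an explicit forwarding pointer, even when they would not otherwise move, because their size can change under the new class version. A rough, self-contained sketch of the decision the patched forward() now makes; the function and parameter names are simplified stand-ins rather than the real signatures.

#include <cstddef>

// Does this object need an explicit forwarding pointer during compaction?
// force_forward is set on redefinition runs, where an object's size may differ
// under its new class version even if its address stays the same.
static bool needs_forwarding(bool force_forward,
                             const char* obj_addr,
                             const char* compact_top,
                             size_t mark_size,
                             size_t scanned_size) {
    return force_forward
        || obj_addr != compact_top      // object is moving
        || mark_size != scanned_size;   // size differs under the new klass
}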


@@ -1,942 +0,0 @@
From 297f564f6af79fb824f5b4e9119f1d3d0c827fb0 Mon Sep 17 00:00:00 2001
From: Vladimir Dvorak <vladimir.dvorak@jetbrains.com>
Date: Mon, 16 Nov 2020 20:20:12 +0100
Subject: [PATCH 19/34] dcevm15 - fix java15 patch compilation issues
---
.../share/classfile/classFileParser.hpp | 8 +-
.../share/classfile/classLoaderData.cpp | 2 +-
src/hotspot/share/classfile/dictionary.hpp | 10 +-
src/hotspot/share/classfile/javaClasses.hpp | 2 +
.../share/gc/g1/g1FullGCCompactTask.cpp | 4 +-
.../share/gc/g1/g1FullGCCompactionPoint.cpp | 8 +-
.../share/gc/g1/g1FullGCPrepareTask.cpp | 4 +-
src/hotspot/share/gc/shared/dcevmSharedGC.cpp | 14 +-
src/hotspot/share/gc/shared/dcevmSharedGC.hpp | 2 +-
src/hotspot/share/gc/shared/gcConfig.cpp | 2 +-
src/hotspot/share/gc/shared/space.cpp | 4 +-
.../share/interpreter/linkResolver.cpp | 2 +-
src/hotspot/share/oops/instanceKlass.cpp | 17 ++-
src/hotspot/share/oops/instanceKlass.hpp | 1 +
src/hotspot/share/oops/klass.cpp | 8 +-
src/hotspot/share/prims/jvm.cpp | 2 +
.../prims/jvmtiEnhancedRedefineClasses.cpp | 129 +++++++++---------
.../prims/jvmtiEnhancedRedefineClasses.hpp | 14 +-
src/hotspot/share/prims/jvmtiEnv.cpp | 11 +-
.../share/prims/jvmtiRedefineClasses.cpp | 1 +
src/hotspot/share/prims/methodHandles.hpp | 3 +
src/hotspot/share/runtime/arguments.cpp | 22 +--
src/hotspot/share/runtime/mutexLocker.cpp | 2 +-
23 files changed, 159 insertions(+), 113 deletions(-)
diff --git a/src/hotspot/share/classfile/classFileParser.hpp b/src/hotspot/share/classfile/classFileParser.hpp
index e5761e61767..0d266b9047e 100644
--- a/src/hotspot/share/classfile/classFileParser.hpp
+++ b/src/hotspot/share/classfile/classFileParser.hpp
@@ -150,9 +150,6 @@ class ClassFileParser {
const intArray* _method_ordering;
GrowableArray<Method*>* _all_mirandas;
- // Enhanced class redefinition
- const bool _pick_newest;
-
enum { fixed_buffer_size = 128 };
u_char _linenumbertable_buffer[fixed_buffer_size];
@@ -206,6 +203,9 @@ class ClassFileParser {
bool _has_vanilla_constructor;
int _max_bootstrap_specifier_index; // detects BSS values
+ // (DCEVM) Enhanced class redefinition
+ const bool _pick_newest;
+
void parse_stream(const ClassFileStream* const stream, TRAPS);
void mangle_hidden_class_name(InstanceKlass* const ik);
@@ -582,7 +582,7 @@ class ClassFileParser {
ClassLoaderData* loader_data() const { return _loader_data; }
const Symbol* class_name() const { return _class_name; }
const InstanceKlass* super_klass() const { return _super_klass; }
- Array<Klass*>* local_interfaces() const { return _local_interfaces; }
+ Array<InstanceKlass*>* local_interfaces() const { return _local_interfaces; }
ReferenceType reference_type() const { return _rt; }
AccessFlags access_flags() const { return _access_flags; }
diff --git a/src/hotspot/share/classfile/classLoaderData.cpp b/src/hotspot/share/classfile/classLoaderData.cpp
index 4d64c6b454a..aadcd50ef4a 100644
--- a/src/hotspot/share/classfile/classLoaderData.cpp
+++ b/src/hotspot/share/classfile/classLoaderData.cpp
@@ -597,7 +597,7 @@ void ClassLoaderData::exchange_holders(ClassLoaderData* cld) {
oop holder_oop = _holder.peek();
_holder.replace(cld->_holder.peek());
cld->_holder.replace(holder_oop);
- WeakHandle<vm_class_loader_data> exchange = _holder;
+ WeakHandle<vm_weak_data> exchange = _holder;
_holder = cld->_holder;
cld->_holder = exchange;
}
diff --git a/src/hotspot/share/classfile/dictionary.hpp b/src/hotspot/share/classfile/dictionary.hpp
index 114a983e783..a50f4ff84d2 100644
--- a/src/hotspot/share/classfile/dictionary.hpp
+++ b/src/hotspot/share/classfile/dictionary.hpp
@@ -84,6 +84,11 @@ public:
void print_on(outputStream* st) const;
void verify();
+ // (DCEVM) Enhanced class redefinition
+ bool update_klass(unsigned int hash, Symbol* name, ClassLoaderData* loader_data, InstanceKlass* k, InstanceKlass* old_klass);
+
+ void rollback_redefinition();
+
private:
DictionaryEntry* new_entry(unsigned int hash, InstanceKlass* klass);
@@ -106,11 +111,6 @@ public:
void free_entry(DictionaryEntry* entry);
- // Enhanced class redefinition
- bool update_klass(unsigned int hash, Symbol* name, ClassLoaderData* loader_data, InstanceKlass* k, InstanceKlass* old_klass);
-
- void rollback_redefinition();
-
// (DCEVM) return old class if redefining in AllowEnhancedClassRedefinition, otherwise return "k"
static InstanceKlass* old_if_redefined(InstanceKlass* k) {
return (k != NULL && k->is_redefining()) ? ((InstanceKlass* )k->old_version()) : k;
diff --git a/src/hotspot/share/classfile/javaClasses.hpp b/src/hotspot/share/classfile/javaClasses.hpp
index a68c5139151..9abf2e1d105 100644
--- a/src/hotspot/share/classfile/javaClasses.hpp
+++ b/src/hotspot/share/classfile/javaClasses.hpp
@@ -255,7 +255,9 @@ class java_lang_Class : AllStatic {
static void set_init_lock(oop java_class, oop init_lock);
static void set_protection_domain(oop java_class, oop protection_domain);
static void set_class_loader(oop java_class, oop class_loader);
+ public: // DCEVM
static void set_component_mirror(oop java_class, oop comp_mirror);
+ private:
static void initialize_mirror_fields(Klass* k, Handle mirror, Handle protection_domain,
Handle classData, TRAPS);
static void initialize_mirror_fields(Klass* k, Handle mirror, Handle protection_domain, TRAPS);
diff --git a/src/hotspot/share/gc/g1/g1FullGCCompactTask.cpp b/src/hotspot/share/gc/g1/g1FullGCCompactTask.cpp
index f70f4606dc8..a22ed48560d 100644
--- a/src/hotspot/share/gc/g1/g1FullGCCompactTask.cpp
+++ b/src/hotspot/share/gc/g1/g1FullGCCompactTask.cpp
@@ -157,14 +157,14 @@ void G1FullGCCompactTask::serial_compaction_dcevm() {
size_t G1FullGCCompactTask::G1CompactRegionClosureDcevm::apply(oop obj) {
size_t size = obj->size();
- HeapWord* destination = (HeapWord*)obj->forwardee();
+ HeapWord* destination = cast_from_oop<HeapWord*>(obj->forwardee());
if (destination == NULL) {
// Object not moving
return size;
}
// copy object and reinit its mark
- HeapWord* obj_addr = (HeapWord*) obj;
+ HeapWord* obj_addr = cast_from_oop<HeapWord*>(obj);
if (!_rescue_oops_it->at_end() && **_rescue_oops_it == obj_addr) {
++(*_rescue_oops_it);
diff --git a/src/hotspot/share/gc/g1/g1FullGCCompactionPoint.cpp b/src/hotspot/share/gc/g1/g1FullGCCompactionPoint.cpp
index 1e49571c999..755935a2c91 100644
--- a/src/hotspot/share/gc/g1/g1FullGCCompactionPoint.cpp
+++ b/src/hotspot/share/gc/g1/g1FullGCCompactionPoint.cpp
@@ -174,7 +174,7 @@ void G1FullGCCompactionPoint::forward_dcevm(oop object, size_t size, bool force_
assert(_current_region != NULL, "Must have been initialized");
// Store a forwarding pointer if the object should be moved.
- if ((HeapWord*)object != _compaction_top || force_forward) {
+ if (cast_from_oop<HeapWord*>(object) != _compaction_top || force_forward) {
object->forward_to(oop(_compaction_top));
} else {
if (object->forwardee() != NULL) {
@@ -188,11 +188,11 @@ void G1FullGCCompactionPoint::forward_dcevm(oop object, size_t size, bool force_
} else {
// Make sure object has the correct mark-word set or that it will be
// fixed when restoring the preserved marks.
- assert(object->mark_raw() == markOopDesc::prototype_for_object(object) || // Correct mark
- object->mark_raw()->must_be_preserved(object) || // Will be restored by PreservedMarksSet
+ assert(object->mark_raw() == markWord::prototype_for_klass(object->klass()) || // Correct mark
+ object->mark_must_be_preserved() || // Will be restored by PreservedMarksSet
(UseBiasedLocking && object->has_bias_pattern_raw()), // Will be restored by BiasedLocking
"should have correct prototype obj: " PTR_FORMAT " mark: " PTR_FORMAT " prototype: " PTR_FORMAT,
- p2i(object), p2i(object->mark_raw()), p2i(markOopDesc::prototype_for_object(object)));
+ p2i(object), object->mark_raw().value(), markWord::prototype_for_klass(object->klass()).value());
}
assert(object->forwardee() == NULL, "should be forwarded to NULL");
}
diff --git a/src/hotspot/share/gc/g1/g1FullGCPrepareTask.cpp b/src/hotspot/share/gc/g1/g1FullGCPrepareTask.cpp
index a45681b60cf..2f06b9617e4 100644
--- a/src/hotspot/share/gc/g1/g1FullGCPrepareTask.cpp
+++ b/src/hotspot/share/gc/g1/g1FullGCPrepareTask.cpp
@@ -269,7 +269,7 @@ size_t G1FullGCPrepareTask::G1PrepareCompactLiveClosureDcevm::apply(oop object)
HeapWord* compact_top = _cp->forward_compact_top(forward_size);
if (compact_top == NULL || must_rescue(object, oop(compact_top))) {
- _cp->rescued_oops()->append((HeapWord*)object);
+ _cp->rescued_oops()->append(cast_from_oop<HeapWord*>(object));
} else {
_cp->forward_dcevm(object, forward_size, (size != forward_size));
}
@@ -295,7 +295,7 @@ bool G1FullGCPrepareTask::G1PrepareCompactLiveClosureDcevm::must_rescue(oop old_
int new_size = old_obj->size_given_klass(oop(old_obj)->klass()->new_version());
int original_size = old_obj->size();
- bool overlap = ((HeapWord*)old_obj + original_size < (HeapWord*)new_obj + new_size);
+ bool overlap = (cast_from_oop<HeapWord*>(old_obj) + original_size < cast_from_oop<HeapWord*>(new_obj) + new_size);
return overlap;
}
diff --git a/src/hotspot/share/gc/shared/dcevmSharedGC.cpp b/src/hotspot/share/gc/shared/dcevmSharedGC.cpp
index 803e645f843..3dee097f1d3 100644
--- a/src/hotspot/share/gc/shared/dcevmSharedGC.cpp
+++ b/src/hotspot/share/gc/shared/dcevmSharedGC.cpp
@@ -58,10 +58,10 @@ void DcevmSharedGC::copy_rescued_objects_back(GrowableArray<HeapWord*>* rescued_
DcevmSharedGC::update_fields(rescued_obj, new_obj);
} else {
rescued_obj->set_klass(new_klass);
- Copy::aligned_disjoint_words((HeapWord*)rescued_obj, (HeapWord*)new_obj, size);
+ Copy::aligned_disjoint_words(cast_from_oop<HeapWord*>(rescued_obj), cast_from_oop<HeapWord*>(new_obj), size);
}
} else {
- Copy::aligned_disjoint_words((HeapWord*)rescued_obj, (HeapWord*)new_obj, size);
+ Copy::aligned_disjoint_words(cast_from_oop<HeapWord*>(rescued_obj), cast_from_oop<HeapWord*>(new_obj), size);
}
new_obj->init_mark_raw();
@@ -111,11 +111,11 @@ void DcevmSharedGC::update_fields(oop q, oop new_location) {
// Save object somewhere, there is an overlap in fields
if (new_klass_oop->is_copying_backwards()) {
- if (((HeapWord *)q >= (HeapWord *)new_location && (HeapWord *)q < (HeapWord *)new_location + new_size) ||
- ((HeapWord *)new_location >= (HeapWord *)q && (HeapWord *)new_location < (HeapWord *)q + size)) {
+ if ((cast_from_oop<HeapWord*>(q) >= cast_from_oop<HeapWord*>(new_location) && cast_from_oop<HeapWord*>(q) < cast_from_oop<HeapWord*>(new_location) + new_size) ||
+ (cast_from_oop<HeapWord*>(new_location) >= cast_from_oop<HeapWord*>(q) && cast_from_oop<HeapWord*>(new_location) < cast_from_oop<HeapWord*>(q) + size)) {
tmp = NEW_RESOURCE_ARRAY(HeapWord, size);
q = (oop) tmp;
- Copy::aligned_disjoint_words((HeapWord*)tmp_obj, (HeapWord*)q, size);
+ Copy::aligned_disjoint_words(cast_from_oop<HeapWord*>(tmp_obj), cast_from_oop<HeapWord*>(q), size);
}
}
@@ -131,13 +131,13 @@ void DcevmSharedGC::update_fields(oop q, oop new_location) {
void DcevmSharedGC::update_fields(oop new_location, oop tmp_obj, int *cur) {
assert(cur != NULL, "just checking");
- char* to = (char*)(HeapWord*)new_location;
+ char* to = (char*)cast_from_oop<HeapWord*>(new_location);
while (*cur != 0) {
int size = *cur;
if (size > 0) {
cur++;
int offset = *cur;
- HeapWord* from = (HeapWord*)(((char *)(HeapWord*)tmp_obj) + offset);
+ HeapWord* from = (HeapWord*)(((char *)cast_from_oop<HeapWord*>(tmp_obj)) + offset);
if (size == HeapWordSize) {
*((HeapWord*)to) = *from;
} else if (size == HeapWordSize * 2) {
diff --git a/src/hotspot/share/gc/shared/dcevmSharedGC.hpp b/src/hotspot/share/gc/shared/dcevmSharedGC.hpp
index e2ef0171fb2..a4e27e00280 100644
--- a/src/hotspot/share/gc/shared/dcevmSharedGC.hpp
+++ b/src/hotspot/share/gc/shared/dcevmSharedGC.hpp
@@ -29,7 +29,7 @@
#include "gc/shared/genOopClosures.hpp"
#include "gc/shared/taskqueue.hpp"
#include "memory/iterator.hpp"
-#include "oops/markOop.hpp"
+#include "oops/markWord.hpp"
#include "oops/oop.hpp"
#include "runtime/timer.hpp"
#include "utilities/growableArray.hpp"
diff --git a/src/hotspot/share/gc/shared/gcConfig.cpp b/src/hotspot/share/gc/shared/gcConfig.cpp
index f01d64d1434..5c1a09390f1 100644
--- a/src/hotspot/share/gc/shared/gcConfig.cpp
+++ b/src/hotspot/share/gc/shared/gcConfig.cpp
@@ -100,7 +100,7 @@ void GCConfig::fail_if_non_included_gc_is_selected() {
void GCConfig::select_gc_ergonomically() {
if (AllowEnhancedClassRedefinition && !UseG1GC) {
// Enhanced class redefinition only supports serial GC at the moment
- FLAG_SET_ERGO(bool, UseSerialGC, true);
+ FLAG_SET_ERGO(UseSerialGC, true);
} else if (os::is_server_class_machine()) {
#if INCLUDE_G1GC
FLAG_SET_ERGO_IF_DEFAULT(UseG1GC, true);
diff --git a/src/hotspot/share/gc/shared/space.cpp b/src/hotspot/share/gc/shared/space.cpp
index 9772c32c42e..e8e3d7884c2 100644
--- a/src/hotspot/share/gc/shared/space.cpp
+++ b/src/hotspot/share/gc/shared/space.cpp
@@ -440,7 +440,7 @@ int CompactibleSpace::space_index(oop obj) {
index++;
}
- tty->print_cr("could not compute space_index for %08xh", (HeapWord*)obj);
+ tty->print_cr("could not compute space_index for %08xh", cast_from_oop<HeapWord*>(obj));
index = 0;
Generation* gen = heap->old_gen();
@@ -485,7 +485,7 @@ bool CompactibleSpace::must_rescue(oop old_obj, oop new_obj) {
bool new_in_tenured = tenured_gen->is_in_reserved(new_obj);
if (old_in_tenured == new_in_tenured) {
// Rescue if object may overlap with a higher memory address.
- bool overlap = ((HeapWord*)old_obj + original_size < (HeapWord*)new_obj + new_size);
+ bool overlap = (cast_from_oop<HeapWord*>(old_obj) + original_size < cast_from_oop<HeapWord*>(new_obj) + new_size);
if (old_in_tenured) {
// Old and new address are in same space, so just compare the address.
// Must rescue if object moves towards the top of the space.
diff --git a/src/hotspot/share/interpreter/linkResolver.cpp b/src/hotspot/share/interpreter/linkResolver.cpp
index b2f24ddbeda..9daeeb70b34 100644
--- a/src/hotspot/share/interpreter/linkResolver.cpp
+++ b/src/hotspot/share/interpreter/linkResolver.cpp
@@ -1031,7 +1031,7 @@ void LinkResolver::resolve_field(fieldDescriptor& fd,
assert(m != NULL, "information about the current method must be available for 'put' bytecodes");
bool is_initialized_static_final_update = (byte == Bytecodes::_putstatic &&
fd.is_static() &&
- !(m()->is_static_initializer() || m()->name() == vmSymbols::ha_class_initializer_name()));
+ !(m->is_static_initializer() || m->name() == vmSymbols::ha_class_initializer_name()));
bool is_initialized_instance_final_update = ((byte == Bytecodes::_putfield || byte == Bytecodes::_nofast_putfield) &&
!fd.is_static() &&
!m->is_object_initializer());
diff --git a/src/hotspot/share/oops/instanceKlass.cpp b/src/hotspot/share/oops/instanceKlass.cpp
index f8e60941046..5e40d78a87e 100644
--- a/src/hotspot/share/oops/instanceKlass.cpp
+++ b/src/hotspot/share/oops/instanceKlass.cpp
@@ -1316,7 +1316,7 @@ void InstanceKlass::init_implementor() {
// (DCEVM) - init_implementor() for dcevm
void InstanceKlass::init_implementor_from_redefine() {
assert(is_interface(), "not interface");
- Klass** addr = adr_implementor();
+ Klass* volatile* addr = adr_implementor();
assert(addr != NULL, "null addr");
if (addr != NULL) {
*addr = NULL;
@@ -1659,6 +1659,21 @@ void InstanceKlass::methods_do(void f(Method* method)) {
}
}
+void InstanceKlass::methods_do(void f(Method* method, TRAPS), TRAPS) {
+ // Methods aren't stable until they are loaded. This can be read outside
+ // a lock through the ClassLoaderData for profiling
+ if (!is_loaded()) {
+ return;
+ }
+
+ int len = methods()->length();
+ for (int index = 0; index < len; index++) {
+ Method* m = methods()->at(index);
+ assert(m->is_method(), "must be method");
+ f(m, CHECK);
+ }
+}
+
// (DCEVM) Update information contains mapping of fields from old class to the new class.
// Info is stored on HEAP, you need to call clear_update_information to free the space.
void InstanceKlass::store_update_information(GrowableArray<int> &values) {
diff --git a/src/hotspot/share/oops/instanceKlass.hpp b/src/hotspot/share/oops/instanceKlass.hpp
index 6ead9426728..b56d42cb177 100644
--- a/src/hotspot/share/oops/instanceKlass.hpp
+++ b/src/hotspot/share/oops/instanceKlass.hpp
@@ -1069,6 +1069,7 @@ public:
void clear_update_information();
void methods_do(void f(Method* method));
+ void methods_do(void f(Method* method, TRAPS), TRAPS);
void array_klasses_do(void f(Klass* k));
void array_klasses_do(void f(Klass* k, TRAPS), TRAPS);
diff --git a/src/hotspot/share/oops/klass.cpp b/src/hotspot/share/oops/klass.cpp
index 352d8f84631..88f5ec9ba4a 100644
--- a/src/hotspot/share/oops/klass.cpp
+++ b/src/hotspot/share/oops/klass.cpp
@@ -200,13 +200,13 @@ void* Klass::operator new(size_t size, ClassLoaderData* loader_data, size_t word
Klass::Klass(KlassID id) : _id(id),
_java_mirror(NULL),
_prototype_header(markWord::prototype()),
- _shared_class_path_index(-1),
- _new_version(NULL),
_old_version(NULL),
+ _new_version(NULL),
+ _redefinition_flags(Klass::NoRedefinition),
_is_redefining(false),
+ _update_information(NULL),
_is_copying_backwards(false),
- _redefinition_flags(Klass::NoRedefinition),
- _update_information(NULL) {
+ _shared_class_path_index(-1) {
CDS_ONLY(_shared_class_flags = 0;)
CDS_JAVA_HEAP_ONLY(_archived_mirror = 0;)
_primary_supers[0] = this;
diff --git a/src/hotspot/share/prims/jvm.cpp b/src/hotspot/share/prims/jvm.cpp
index 333b65ccfc1..13bcac352fb 100644
--- a/src/hotspot/share/prims/jvm.cpp
+++ b/src/hotspot/share/prims/jvm.cpp
@@ -1054,6 +1054,7 @@ static jclass jvm_lookup_define_class(JNIEnv *env, jclass lookup, const char *na
class_loader,
protection_domain,
&st,
+ NULL,
CHECK_NULL);
if (log_is_enabled(Debug, class, resolve) && defined_k != NULL) {
@@ -1074,6 +1075,7 @@ static jclass jvm_lookup_define_class(JNIEnv *env, jclass lookup, const char *na
class_loader,
&st,
cl_info,
+ NULL,
CHECK_NULL);
if (defined_k == NULL) {
THROW_MSG_0(vmSymbols::java_lang_Error(), "Failure to define a hidden class");
diff --git a/src/hotspot/share/prims/jvmtiEnhancedRedefineClasses.cpp b/src/hotspot/share/prims/jvmtiEnhancedRedefineClasses.cpp
index 1da6661dd3e..619e3988e3a 100644
--- a/src/hotspot/share/prims/jvmtiEnhancedRedefineClasses.cpp
+++ b/src/hotspot/share/prims/jvmtiEnhancedRedefineClasses.cpp
@@ -24,11 +24,14 @@
#include "precompiled.hpp"
#include "aot/aotLoader.hpp"
+#include "classfile/classFileParser.hpp"
#include "classfile/classFileStream.hpp"
#include "classfile/metadataOnStackMark.hpp"
#include "classfile/systemDictionary.hpp"
#include "classfile/verifier.hpp"
#include "classfile/dictionary.hpp"
+#include "classfile/classLoaderDataGraph.hpp"
+#include "interpreter/linkResolver.hpp"
#include "interpreter/oopMapCache.hpp"
#include "interpreter/rewriter.hpp"
#include "logging/logStream.hpp"
@@ -37,17 +40,22 @@
#include "memory/resourceArea.hpp"
#include "memory/iterator.inline.hpp"
#include "oops/fieldStreams.hpp"
+#include "oops/fieldStreams.inline.hpp"
#include "oops/klassVtable.hpp"
#include "oops/oop.inline.hpp"
#include "oops/constantPool.inline.hpp"
+#include "oops/metadata.hpp"
+#include "oops/methodData.hpp"
#include "prims/jvmtiImpl.hpp"
#include "prims/jvmtiClassFileReconstituter.hpp"
#include "prims/jvmtiEnhancedRedefineClasses.hpp"
#include "prims/methodComparator.hpp"
#include "prims/resolvedMethodTable.hpp"
+#include "prims/methodHandles.hpp"
#include "runtime/deoptimization.hpp"
#include "runtime/jniHandles.inline.hpp"
#include "runtime/relocator.hpp"
+#include "runtime/fieldDescriptor.hpp"
#include "runtime/fieldDescriptor.inline.hpp"
#include "utilities/bitMap.inline.hpp"
#include "prims/jvmtiThreadState.inline.hpp"
@@ -55,6 +63,8 @@
#include "oops/constantPool.inline.hpp"
#include "gc/g1/g1CollectedHeap.hpp"
#include "gc/shared/dcevmSharedGC.hpp"
+#include "gc/shared/scavengableNMethods.hpp"
+#include "ci/ciObjectFactory.hpp"
Array<Method*>* VM_EnhancedRedefineClasses::_old_methods = NULL;
Array<Method*>* VM_EnhancedRedefineClasses::_new_methods = NULL;
@@ -66,6 +76,7 @@ int VM_EnhancedRedefineClasses::_matching_methods_length = 0;
int VM_EnhancedRedefineClasses::_deleted_methods_length = 0;
int VM_EnhancedRedefineClasses::_added_methods_length = 0;
Klass* VM_EnhancedRedefineClasses::_the_class_oop = NULL;
+u8 VM_EnhancedRedefineClasses::_id_counter = 0;
//
// Create new instance of enhanced class redefiner.
@@ -88,6 +99,7 @@ VM_EnhancedRedefineClasses::VM_EnhancedRedefineClasses(jint class_count, const j
_class_load_kind = class_load_kind;
_res = JVMTI_ERROR_NONE;
_any_class_has_resolved_methods = false;
+ _id = next_id();
}
static inline InstanceKlass* get_ik(jclass def) {
@@ -211,9 +223,7 @@ class FieldCopier : public FieldClosure {
// TODO: review...
void VM_EnhancedRedefineClasses::mark_as_scavengable(nmethod* nm) {
- if (!nm->on_scavenge_root_list()) {
- CodeCache::add_scavenge_root_nmethod(nm);
- }
+ ScavengableNMethods::register_nmethod(nm);
}
void VM_EnhancedRedefineClasses::unregister_nmethod_g1(nmethod* nm) {
@@ -414,7 +424,7 @@ public:
_tmp_obj_size = size;
_tmp_obj = (oop)resource_allocate_bytes(size * HeapWordSize);
}
- Copy::aligned_disjoint_words((HeapWord*)o, (HeapWord*)_tmp_obj, size);
+ Copy::aligned_disjoint_words(cast_from_oop<HeapWord*>(o), cast_from_oop<HeapWord*>(_tmp_obj), size);
}
virtual void do_object(oop obj) {
@@ -505,9 +515,6 @@ void VM_EnhancedRedefineClasses::doit() {
ClearCpoolCacheAndUnpatch clear_cpool_cache(thread);
ClassLoaderDataGraph::classes_do(&clear_cpool_cache);
-
- // SystemDictionary::methods_do(fix_invoke_method);
-
// JSR-292 support
if (_any_class_has_resolved_methods) {
bool trace_name_printed = false;
@@ -564,8 +571,8 @@ void VM_EnhancedRedefineClasses::doit() {
InstanceKlass* old = InstanceKlass::cast(cur->old_version());
// Swap marks to have same hashcodes
- markOop cur_mark = cur->prototype_header();
- markOop old_mark = old->prototype_header();
+ markWord cur_mark = cur->prototype_header();
+ markWord old_mark = old->prototype_header();
cur->set_prototype_header(old_mark);
old->set_prototype_header(cur_mark);
@@ -579,14 +586,14 @@ void VM_EnhancedRedefineClasses::doit() {
// Revert pool holder for old version of klass (it was updated by one of ours closure!)
old->constants()->set_pool_holder(old);
- Klass* array_klasses = old->array_klasses();
+ ObjArrayKlass* array_klasses = old->array_klasses();
if (array_klasses != NULL) {
assert(cur->array_klasses() == NULL, "just checking");
// Transfer the array classes, otherwise we might get cast exceptions when casting array types.
// Also, set array klasses element klass.
cur->set_array_klasses(array_klasses);
- ObjArrayKlass::cast(array_klasses)->set_element_klass(cur);
+ array_klasses->set_element_klass(cur);
java_lang_Class::release_set_array_klass(cur->java_mirror(), array_klasses);
java_lang_Class::set_component_mirror(array_klasses->java_mirror(), cur->java_mirror());
}
@@ -641,11 +648,15 @@ void VM_EnhancedRedefineClasses::doit() {
//ClassLoaderDataGraph::classes_do(&clean_weak_method_links);
// Disable any dependent concurrent compilations
- SystemDictionary::notice_modification();
+ // SystemDictionary::notice_modification();
+
+ JvmtiExport::increment_redefinition_count();
// Set flag indicating that some invariants are no longer true.
// See jvmtiExport.hpp for detailed explanation.
- JvmtiExport::set_has_redefined_a_class();
+
+ // dcevm15: handled by _redefinition_count
+ // JvmtiExport::set_has_redefined_a_class();
#ifdef PRODUCT
if (log_is_enabled(Trace, redefine, class, obsolete, metadata)) {
@@ -718,7 +729,7 @@ bool VM_EnhancedRedefineClasses::is_modifiable_class(oop klass_mirror) {
}
// Cannot redefine or retransform an anonymous class.
- if (InstanceKlass::cast(k)->is_anonymous()) {
+ if (InstanceKlass::cast(k)->is_unsafe_anonymous()) {
return false;
}
return true;
@@ -804,22 +815,30 @@ jvmtiError VM_EnhancedRedefineClasses::load_new_class_versions(TRAPS) {
InstanceKlass* k;
- if (InstanceKlass::cast(the_class)->is_anonymous()) {
- const InstanceKlass* host_class = the_class->host_klass();
+ if (InstanceKlass::cast(the_class)->is_unsafe_anonymous()) {
+ const InstanceKlass* host_class = the_class->unsafe_anonymous_host();
// Make sure it's the real host class, not another anonymous class.
- while (host_class != NULL && host_class->is_anonymous()) {
- host_class = host_class->host_klass();
+ while (host_class != NULL && host_class->is_unsafe_anonymous()) {
+ host_class = host_class->unsafe_anonymous_host();
}
+ ClassLoadInfo cl_info(protection_domain,
+ host_class,
+ NULL, // dynamic_nest_host
+ NULL, // cp_patches
+ Handle(), // classData
+ false, // is_hidden
+ false, // is_strong_hidden
+ true); // FIXME: check if correct. can_access_vm_annotations
+
k = SystemDictionary::parse_stream(the_class_sym,
the_class_loader,
- protection_domain,
&st,
- host_class,
+ cl_info,
the_class,
- NULL,
THREAD);
+
k->class_loader_data()->exchange_holders(the_class->class_loader_data());
the_class->class_loader_data()->inc_keep_alive();
} else {
@@ -966,7 +985,7 @@ int VM_EnhancedRedefineClasses::calculate_redefinition_flags(InstanceKlass* new_
// Check interfaces
// Interfaces removed?
- Array<Klass*>* old_interfaces = the_class->transitive_interfaces();
+ Array<InstanceKlass*>* old_interfaces = the_class->transitive_interfaces();
for (i = 0; i < old_interfaces->length(); i++) {
InstanceKlass* old_interface = InstanceKlass::cast(old_interfaces->at(i));
if (!new_class->implements_interface_any_version(old_interface)) {
@@ -976,7 +995,7 @@ int VM_EnhancedRedefineClasses::calculate_redefinition_flags(InstanceKlass* new_
}
// Interfaces added?
- Array<Klass*>* new_interfaces = new_class->transitive_interfaces();
+ Array<InstanceKlass*>* new_interfaces = new_class->transitive_interfaces();
for (i = 0; i<new_interfaces->length(); i++) {
if (!the_class->implements_interface_any_version(new_interfaces->at(i))) {
result = result | Klass::ModifyClass;
@@ -1389,8 +1408,8 @@ void VM_EnhancedRedefineClasses::rollback() {
// Rewrite faster byte-codes back to their slower equivalent. Undoes rewriting happening in templateTable_xxx.cpp
// The reason is that once we zero cpool caches, we need to re-resolve all entries again. Faster bytecodes do not
// do that, they assume that cache entry is resolved already.
-void VM_EnhancedRedefineClasses::unpatch_bytecode(Method* method) {
- RawBytecodeStream bcs(method);
+void VM_EnhancedRedefineClasses::unpatch_bytecode(Method* method, TRAPS) {
+ RawBytecodeStream bcs(methodHandle(THREAD, method));
Bytecodes::Code code;
Bytecodes::Code java_code;
while (!bcs.is_last_bytecode()) {
@@ -1454,11 +1473,11 @@ void VM_EnhancedRedefineClasses::ClearCpoolCacheAndUnpatch::do_klass(Klass* k) {
HandleMark hm(_thread);
InstanceKlass *ik = InstanceKlass::cast(k);
- constantPoolHandle other_cp = constantPoolHandle(ik->constants());
+ constantPoolHandle other_cp = constantPoolHandle(_thread, ik->constants());
// Update host klass of anonymous classes (for example, produced by lambdas) to newest version.
- if (ik->is_anonymous() && ik->host_klass()->new_version() != NULL) {
- ik->set_host_klass(InstanceKlass::cast(ik->host_klass()->newest_version()));
+ if (ik->is_unsafe_anonymous() && ik->unsafe_anonymous_host()->new_version() != NULL) {
+ ik->set_unsafe_anonymous_host(InstanceKlass::cast(ik->unsafe_anonymous_host()->newest_version()));
}
// Update implementor if there is only one, in this case implementor() can reference old class
@@ -1492,7 +1511,18 @@ void VM_EnhancedRedefineClasses::ClearCpoolCacheAndUnpatch::do_klass(Klass* k) {
// If bytecode rewriting is enabled, we also need to unpatch bytecode to force resolution of zeroed entries
if (RewriteBytecodes) {
- ik->methods_do(unpatch_bytecode);
+ ik->methods_do(unpatch_bytecode, _thread);
+ }
+}
+
+u8 VM_EnhancedRedefineClasses::next_id() {
+ while (true) {
+ u8 id = _id_counter;
+ u8 next_id = id + 1;
+ u8 result = Atomic::cmpxchg(&_id_counter, id, next_id);
+ if (result == id) {
+ return next_id;
+ }
}
}
@@ -1512,31 +1542,8 @@ void VM_EnhancedRedefineClasses::MethodDataCleaner::do_klass(Klass* k) {
}
}
-void VM_EnhancedRedefineClasses::fix_invoke_method(Method* method) {
-
- constantPoolHandle other_cp = constantPoolHandle(method->constants());
-
- for (int i = 0; i < other_cp->length(); i++) {
- if (other_cp->tag_at(i).is_klass()) {
- Klass* klass = other_cp->resolved_klass_at(i);
- if (klass->new_version() != NULL) {
- // Constant pool entry points to redefined class -- update to the new version
- other_cp->klass_at_put(i, klass->newest_version());
- }
- assert(other_cp->resolved_klass_at(i)->new_version() == NULL, "Must be new klass!");
- }
- }
-
- ConstantPoolCache* cp_cache = other_cp->cache();
- if (cp_cache != NULL) {
- cp_cache->clear_entries();
- }
-
-}
-
-
-void VM_EnhancedRedefineClasses::update_jmethod_ids() {
+void VM_EnhancedRedefineClasses::update_jmethod_ids(TRAPS) {
for (int j = 0; j < _matching_methods_length; ++j) {
Method* old_method = _matching_old_methods[j];
jmethodID jmid = old_method->find_jmethod_id_or_null();
@@ -1547,10 +1554,10 @@ void VM_EnhancedRedefineClasses::update_jmethod_ids() {
if (jmid != NULL) {
// There is a jmethodID, change it to point to the new method
- methodHandle new_method_h(_matching_new_methods[j]);
+ methodHandle new_method_h(THREAD, _matching_new_methods[j]);
if (old_method->new_version() == NULL) {
- methodHandle old_method_h(_matching_old_methods[j]);
+ methodHandle old_method_h(THREAD, _matching_old_methods[j]);
jmethodID new_jmethod_id = Method::make_jmethod_id(old_method_h->method_holder()->class_loader_data(), old_method_h());
bool result = InstanceKlass::cast(old_method_h->method_holder())->update_jmethod_id(old_method_h(), new_jmethod_id);
} else {
@@ -1887,7 +1894,7 @@ void VM_EnhancedRedefineClasses::redefine_single_class(InstanceKlass* new_class_
// track number of methods that are EMCP for add_previous_version() call below
check_methods_and_mark_as_obsolete();
- update_jmethod_ids();
+ update_jmethod_ids(THREAD);
_any_class_has_resolved_methods = the_class->has_resolved_methods() || _any_class_has_resolved_methods;
@@ -2119,12 +2126,12 @@ jvmtiError VM_EnhancedRedefineClasses::do_topological_class_sorting(TRAPS) {
Handle protection_domain(THREAD, klass->protection_domain());
+ ClassLoadInfo cl_info(protection_domain);
+
ClassFileParser parser(&st,
klass->name(),
klass->class_loader_data(),
- protection_domain,
- NULL, // host_klass
- NULL, // cp_patches
+ &cl_info,
ClassFileParser::INTERNAL, // publicity level
true,
THREAD);
@@ -2134,7 +2141,7 @@ jvmtiError VM_EnhancedRedefineClasses::do_topological_class_sorting(TRAPS) {
links.append(KlassPair(super_klass, klass));
}
- Array<Klass*>* local_interfaces = parser.local_interfaces();
+ Array<InstanceKlass*>* local_interfaces = parser.local_interfaces();
for (int j = 0; j < local_interfaces->length(); j++) {
Klass* iface = local_interfaces->at(j);
if (iface != NULL && _affected_klasses->contains(iface)) {
@@ -2157,7 +2164,7 @@ jvmtiError VM_EnhancedRedefineClasses::do_topological_class_sorting(TRAPS) {
links.append(KlassPair(super_klass, klass));
}
- Array<Klass*>* local_interfaces = klass->local_interfaces();
+ Array<InstanceKlass*>* local_interfaces = klass->local_interfaces();
for (int j = 0; j < local_interfaces->length(); j++) {
Klass* interfaceKlass = local_interfaces->at(j);
if (_affected_klasses->contains(interfaceKlass)) {
diff --git a/src/hotspot/share/prims/jvmtiEnhancedRedefineClasses.hpp b/src/hotspot/share/prims/jvmtiEnhancedRedefineClasses.hpp
index 4c0412d343d..0066088b3b0 100644
--- a/src/hotspot/share/prims/jvmtiEnhancedRedefineClasses.hpp
+++ b/src/hotspot/share/prims/jvmtiEnhancedRedefineClasses.hpp
@@ -32,7 +32,7 @@
#include "memory/resourceArea.hpp"
#include "oops/objArrayKlass.hpp"
#include "oops/objArrayOop.hpp"
-#include "gc/shared/vmGCOperations.hpp"
+#include "gc/shared/gcVMOperations.hpp"
#include "../../../java.base/unix/native/include/jni_md.h"
//
@@ -59,6 +59,7 @@ class VM_EnhancedRedefineClasses: public VM_GC_Operation {
static int _deleted_methods_length;
static int _added_methods_length;
static Klass* _the_class_oop;
+ static u8 _id_counter;
// The instance fields are used to pass information from
// doit_prologue() to doit() and doit_epilogue().
@@ -91,6 +92,9 @@ class VM_EnhancedRedefineClasses: public VM_GC_Operation {
elapsedTimer _timer_heap_iterate;
elapsedTimer _timer_heap_full_gc;
+ // Redefinition id used by JFR
+ u8 _id;
+
// These routines are roughly in call order unless otherwise noted.
// Load and link new classes (either redefined or affected by redefinition - subclass, ...)
@@ -118,15 +122,14 @@ class VM_EnhancedRedefineClasses: public VM_GC_Operation {
static void mark_as_scavengable(nmethod* nm);
static void unregister_nmethod_g1(nmethod* nm);
static void register_nmethod_g1(nmethod* nm);
- static void unpatch_bytecode(Method* method);
- static void fix_invoke_method(Method* method);
+ static void unpatch_bytecode(Method* method, TRAPS);
// Figure out which new methods match old methods in name and signature,
// which methods have been added, and which are no longer present
void compute_added_deleted_matching_methods();
// Change jmethodIDs to point to the new methods
- void update_jmethod_ids();
+ void update_jmethod_ids(TRAPS);
// marking methods as old and/or obsolete
void check_methods_and_mark_as_obsolete();
@@ -141,6 +144,8 @@ class VM_EnhancedRedefineClasses: public VM_GC_Operation {
void flush_dependent_code(InstanceKlass* k_h, TRAPS);
+ u8 next_id();
+
static void check_class(InstanceKlass* k_oop, TRAPS);
static void dump_methods();
@@ -181,6 +186,7 @@ class VM_EnhancedRedefineClasses: public VM_GC_Operation {
bool allow_nested_vm_operations() const { return true; }
jvmtiError check_error() { return _res; }
+ u8 id() { return _id; }
// Modifiable test must be shared between IsModifiableClass query
// and redefine implementation
diff --git a/src/hotspot/share/prims/jvmtiEnv.cpp b/src/hotspot/share/prims/jvmtiEnv.cpp
index b6838ac034d..fba0f48abd7 100644
--- a/src/hotspot/share/prims/jvmtiEnv.cpp
+++ b/src/hotspot/share/prims/jvmtiEnv.cpp
@@ -456,20 +456,23 @@ JvmtiEnv::RetransformClasses(jint class_count, const jclass* classes) {
EventRetransformClasses event;
jvmtiError error;
+ u8 op_id;
if (AllowEnhancedClassRedefinition) {
MutexLocker sd_mutex(EnhancedRedefineClasses_lock);
VM_EnhancedRedefineClasses op(class_count, class_definitions, jvmti_class_load_kind_retransform);
VMThread::execute(&op);
+ op_id = op.id();
error = (op.check_error());
} else {
VM_RedefineClasses op(class_count, class_definitions, jvmti_class_load_kind_retransform);
VMThread::execute(&op);
+ op_id = op.id();
error = op.check_error();
}
if (error == JVMTI_ERROR_NONE) {
event.set_classCount(class_count);
- event.set_redefinitionId(op.id());
+ event.set_redefinitionId(op_id);
event.commit();
}
return error;
@@ -484,19 +487,23 @@ JvmtiEnv::RedefineClasses(jint class_count, const jvmtiClassDefinition* class_de
EventRedefineClasses event;
jvmtiError error;
+ u8 op_id;
+
if (AllowEnhancedClassRedefinition) {
MutexLocker sd_mutex(EnhancedRedefineClasses_lock);
VM_EnhancedRedefineClasses op(class_count, class_definitions, jvmti_class_load_kind_redefine);
VMThread::execute(&op);
+ op_id = op.id();
error = (op.check_error());
} else {
VM_RedefineClasses op(class_count, class_definitions, jvmti_class_load_kind_redefine);
VMThread::execute(&op);
+ op_id = op.id();
error = op.check_error();
}
if (error == JVMTI_ERROR_NONE) {
event.set_classCount(class_count);
- event.set_redefinitionId(op.id());
+ event.set_redefinitionId(op_id);
event.commit();
}
return error;
diff --git a/src/hotspot/share/prims/jvmtiRedefineClasses.cpp b/src/hotspot/share/prims/jvmtiRedefineClasses.cpp
index a7840848e10..346eac7c431 100644
--- a/src/hotspot/share/prims/jvmtiRedefineClasses.cpp
+++ b/src/hotspot/share/prims/jvmtiRedefineClasses.cpp
@@ -1271,6 +1271,7 @@ jvmtiError VM_RedefineClasses::load_new_class_versions(TRAPS) {
the_class_loader,
&st,
cl_info,
+ NULL,
THREAD);
// Clear class_being_redefined just to be sure.
state->clear_class_being_redefined();
diff --git a/src/hotspot/share/prims/methodHandles.hpp b/src/hotspot/share/prims/methodHandles.hpp
index 54f36202a5f..917d31efd77 100644
--- a/src/hotspot/share/prims/methodHandles.hpp
+++ b/src/hotspot/share/prims/methodHandles.hpp
@@ -180,6 +180,9 @@ public:
assert(ref_kind_is_valid(ref_kind), "");
return (ref_kind & 1) != 0;
}
+ static bool ref_kind_is_static(int ref_kind) {
+ return !ref_kind_has_receiver(ref_kind) && (ref_kind != JVM_REF_newInvokeSpecial);
+ }
static int ref_kind_to_flags(int ref_kind);
diff --git a/src/hotspot/share/runtime/arguments.cpp b/src/hotspot/share/runtime/arguments.cpp
index d05a2893498..3a92b8869dc 100644
--- a/src/hotspot/share/runtime/arguments.cpp
+++ b/src/hotspot/share/runtime/arguments.cpp
@@ -2128,13 +2128,15 @@ bool Arguments::check_gc_consistency() {
// of collectors.
uint i = 0;
if (UseSerialGC) i++;
- if (UseConcMarkSweepGC) i++;
- if (UseParallelGC || UseParallelOldGC) i++;
+ if (UseParallelGC) i++;
if (UseG1GC) i++;
+ if (UseEpsilonGC) i++;
+ if (UseZGC) i++;
+ if (UseShenandoahGC) i++;
if (AllowEnhancedClassRedefinition) {
// Must use serial GC. This limitation applies because the instance size changing GC modifications
// are only built into the mark and compact algorithm.
- if ((!UseSerialGC && !UseG1GC) && i >= 1) {
+ if (!UseSerialGC && !UseG1GC && i >= 1) {
jio_fprintf(defaultStream::error_stream(),
"Must use the Serial or G1 GC with enhanced class redefinition.\n");
return false;
@@ -4494,18 +4496,18 @@ void Arguments::setup_hotswap_agent() {
// TODO: open it only for org.hotswap.agent module
// Use to access java.lang.reflect.Proxy/proxyCache
- create_numbered_property("jdk.module.addopens", "java.base/java.lang=ALL-UNNAMED", addopens_count++);
+ create_numbered_module_property("jdk.module.addopens", "java.base/java.lang=ALL-UNNAMED", addopens_count++);
// Class of field java.lang.reflect.Proxy/proxyCache
- create_numbered_property("jdk.module.addopens", "java.base/jdk.internal.loader=ALL-UNNAMED", addopens_count++);
+ create_numbered_module_property("jdk.module.addopens", "java.base/jdk.internal.loader=ALL-UNNAMED", addopens_count++);
// Use to access java.io.Reader, java.io.InputStream, java.io.FileInputStream
- create_numbered_property("jdk.module.addopens", "java.base/java.io=ALL-UNNAMED", addopens_count++);
+ create_numbered_module_property("jdk.module.addopens", "java.base/java.io=ALL-UNNAMED", addopens_count++);
// java.beans.Introspector access
- create_numbered_property("jdk.module.addopens", "java.desktop/java.beans=ALL-UNNAMED", addopens_count++);
+ create_numbered_module_property("jdk.module.addopens", "java.desktop/java.beans=ALL-UNNAMED", addopens_count++);
// java.beans.Introspector access
- create_numbered_property("jdk.module.addopens", "java.desktop/com.sun.beans=ALL-UNNAMED", addopens_count++);
+ create_numbered_module_property("jdk.module.addopens", "java.desktop/com.sun.beans=ALL-UNNAMED", addopens_count++);
// com.sun.beans.introspect.ClassInfo access
- create_numbered_property("jdk.module.addopens", "java.desktop/com.sun.beans.introspect=ALL-UNNAMED", addopens_count++);
+ create_numbered_module_property("jdk.module.addopens", "java.desktop/com.sun.beans.introspect=ALL-UNNAMED", addopens_count++);
// com.sun.beans.introspect.util.Cache access
- create_numbered_property("jdk.module.addopens", "java.desktop/com.sun.beans.util=ALL-UNNAMED", addopens_count++);
+ create_numbered_module_property("jdk.module.addopens", "java.desktop/com.sun.beans.util=ALL-UNNAMED", addopens_count++);
}
diff --git a/src/hotspot/share/runtime/mutexLocker.cpp b/src/hotspot/share/runtime/mutexLocker.cpp
index 6f982072909..14a3ed730fe 100644
--- a/src/hotspot/share/runtime/mutexLocker.cpp
+++ b/src/hotspot/share/runtime/mutexLocker.cpp
@@ -287,7 +287,7 @@ void mutex_init() {
def(InitCompleted_lock , PaddedMonitor, leaf, true, _safepoint_check_never);
def(VtableStubs_lock , PaddedMutex , nonleaf, true, _safepoint_check_never);
def(Notify_lock , PaddedMonitor, nonleaf, true, _safepoint_check_always);
- def(EnhancedRedefineClasses_lock , PaddedMutex , nonleaf+7, false, Monitor::_safepoint_check_always); // for ensuring that class redefinition is not done in parallel
+ def(EnhancedRedefineClasses_lock , PaddedMutex , nonleaf+7, false, _safepoint_check_always); // for ensuring that class redefinition is not done in parallel
def(JNICritical_lock , PaddedMonitor, nonleaf, true, _safepoint_check_always); // used for JNI critical regions
def(AdapterHandlerLibrary_lock , PaddedMutex , nonleaf, true, _safepoint_check_always);
--
2.23.0
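Note: among the JDK 15 API updates above, the redefinition id reported to JFR is now taken from a static counter advanced with Atomic::cmpxchg. The standalone approximation below uses std::atomic instead of HotSpot's Atomic class, so it is only an illustration of the same compare-and-swap loop.

#include <atomic>
#include <cstdint>

// Monotonically increasing redefinition id, allocated lock-free.
static std::atomic<uint64_t> id_counter{0};

static uint64_t next_id() {
    for (;;) {
        uint64_t id = id_counter.load(std::memory_order_relaxed);
        uint64_t next = id + 1;
        // Publish next only if no other thread advanced the counter first.
        if (id_counter.compare_exchange_weak(id, next)) {
            return next;
        }
    }
}

A plain fetch_add would produce the same ids; the CAS loop here simply mirrors the structure of the patched next_id().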


@@ -1,25 +0,0 @@
From 336cab4f72c6e642e3077ea8d1a4860de33f5a4d Mon Sep 17 00:00:00 2001
From: Vladimir Dvorak <vladimir.dvorak@jetbrains.com>
Date: Tue, 17 Nov 2020 17:40:24 +0100
Subject: [PATCH 20/34] dcevm15 - G1 fixes
---
src/hotspot/share/gc/g1/g1FullGCPrepareTask.cpp | 2 +-
1 file changed, 1 insertion(+), 1 deletion(-)
diff --git a/src/hotspot/share/gc/g1/g1FullGCPrepareTask.cpp b/src/hotspot/share/gc/g1/g1FullGCPrepareTask.cpp
index 2f06b9617e4..476728a5d26 100644
--- a/src/hotspot/share/gc/g1/g1FullGCPrepareTask.cpp
+++ b/src/hotspot/share/gc/g1/g1FullGCPrepareTask.cpp
@@ -240,7 +240,7 @@ void G1FullGCPrepareTask::prepare_serial_compaction_dcevm() {
// collect remaining, not forwarded rescued oops using serial compact point
while (cp->last_rescued_oop() < cp->rescued_oops()->length()) {
- HeapRegion* hr = G1CollectedHeap::heap()->new_region(HeapRegion::GrainBytes / HeapWordSize, false, true);
+ HeapRegion* hr = G1CollectedHeap::heap()->new_region(HeapRegion::GrainBytes / HeapWordSize, HeapRegionType::Eden, true, G1NUMA::AnyNodeIndex);
if (hr == NULL) {
vm_exit_out_of_memory(0, OOM_MMAP_ERROR, "G1 - not enough of free regions after redefinition.");
}
--
2.23.0


@@ -1,133 +0,0 @@
From cea4e2cca3c37233c728be7235f8f9d8be136cb5 Mon Sep 17 00:00:00 2001
From: Vladimir Dvorak <vladimir.dvorak@jetbrains.com>
Date: Tue, 17 Nov 2020 18:52:57 +0100
Subject: [PATCH 21/34] dcevm15 - Fix flush dependent code
---
.../prims/jvmtiEnhancedRedefineClasses.cpp | 57 +++++++------------
.../prims/jvmtiEnhancedRedefineClasses.hpp | 4 +-
2 files changed, 25 insertions(+), 36 deletions(-)
diff --git a/src/hotspot/share/prims/jvmtiEnhancedRedefineClasses.cpp b/src/hotspot/share/prims/jvmtiEnhancedRedefineClasses.cpp
index 619e3988e3a..efaf11e1666 100644
--- a/src/hotspot/share/prims/jvmtiEnhancedRedefineClasses.cpp
+++ b/src/hotspot/share/prims/jvmtiEnhancedRedefineClasses.cpp
@@ -508,7 +508,7 @@ void VM_EnhancedRedefineClasses::doit() {
// Deoptimize all compiled code that depends on this class (do only once, because it clears whole cache)
// if (_max_redefinition_flags > Klass::ModifyClass) {
- flush_dependent_code(NULL, thread);
+ flush_dependent_code(thread);
// }
// Adjust constantpool caches for all classes that reference methods of the evolved class.
@@ -647,17 +647,8 @@ void VM_EnhancedRedefineClasses::doit() {
//MethodDataCleaner clean_weak_method_links;
//ClassLoaderDataGraph::classes_do(&clean_weak_method_links);
- // Disable any dependent concurrent compilations
- // SystemDictionary::notice_modification();
-
JvmtiExport::increment_redefinition_count();
- // Set flag indicating that some invariants are no longer true.
- // See jvmtiExport.hpp for detailed explanation.
-
- // dcevm15: handled by _redefinition_count
- // JvmtiExport::set_has_redefined_a_class();
-
#ifdef PRODUCT
if (log_is_enabled(Trace, redefine, class, obsolete, metadata)) {
#endif
@@ -1746,6 +1737,18 @@ void VM_EnhancedRedefineClasses::transfer_old_native_function_registrations(Inst
transfer.transfer_registrations(_matching_old_methods, _matching_methods_length);
}
+// First step is to walk the code cache for each class redefined and mark
+// dependent methods. Wait until all classes are processed to deoptimize everything.
+void VM_EnhancedRedefineClasses::mark_dependent_code(InstanceKlass* ik) {
+ assert_locked_or_safepoint(Compile_lock);
+
+ // All dependencies have been recorded from startup or this is a second or
+ // subsequent use of RedefineClasses
+ if (0 && JvmtiExport::all_dependencies_are_recorded()) {
+ CodeCache::mark_for_evol_deoptimization(ik);
+ }
+}
+
// DCEVM - it always deoptimizes everything! (because it is very difficult to find only correct dependencies)
// Deoptimize all compiled code that depends on this class.
//
@@ -1762,33 +1765,21 @@ void VM_EnhancedRedefineClasses::transfer_old_native_function_registrations(Inst
// subsequent calls to RedefineClasses need only throw away code
// that depends on the class.
//
-void VM_EnhancedRedefineClasses::flush_dependent_code(InstanceKlass* k_h, TRAPS) {
+void VM_EnhancedRedefineClasses::flush_dependent_code(TRAPS) {
assert_locked_or_safepoint(Compile_lock);
// All dependencies have been recorded from startup or this is a second or
// subsequent use of RedefineClasses
// FIXME: for now, deoptimize all!
- if (0 && k_h != NULL && JvmtiExport::all_dependencies_are_recorded()) {
- CodeCache::flush_evol_dependents_on(k_h);
- Klass* superCl = k_h->super();
- // Deoptimize super classes since redefined class can has a new method override
- while (superCl != NULL && !superCl->is_redefining()) {
- CodeCache::flush_evol_dependents_on(InstanceKlass::cast(superCl));
- superCl = superCl->super();
+ if (0 && JvmtiExport::all_dependencies_are_recorded()) {
+ int deopt = CodeCache::mark_dependents_for_evol_deoptimization();
+ log_debug(redefine, class, nmethod)("Marked %d dependent nmethods for deopt", deopt);
+ if (deopt != 0) {
+ CodeCache::flush_evol_dependents();
}
} else {
- CodeCache::mark_all_nmethods_for_deoptimization();
-
- ResourceMark rm(THREAD);
- DeoptimizationMarker dm;
-
- // Deoptimize all activations depending on marked nmethods
- Deoptimization::deoptimize_dependents();
-
- // Make the dependent methods not entrant
- CodeCache::make_marked_nmethods_not_entrant();
-
- // From now on we know that the dependency information is complete
+ CodeCache::mark_all_nmethods_for_evol_deoptimization();
+ CodeCache::flush_evol_dependents();
JvmtiExport::set_all_dependencies_are_recorded(true);
}
}
@@ -1881,11 +1872,7 @@ void VM_EnhancedRedefineClasses::redefine_single_class(InstanceKlass* new_class_
JvmtiBreakpoints& jvmti_breakpoints = JvmtiCurrentBreakpoints::get_jvmti_breakpoints();
jvmti_breakpoints.clearall_in_class_at_safepoint(the_class);
- // DCEVM Deoptimization is always for whole java world, call only once after all classes are redefined
- // Deoptimize all compiled code that depends on this class
-// if (_max_redefinition_flags <= Klass::ModifyClass) {
-// flush_dependent_code(the_class, THREAD);
-// }
+ mark_dependent_code(the_class);
_old_methods = the_class->methods();
_new_methods = new_class->methods();
diff --git a/src/hotspot/share/prims/jvmtiEnhancedRedefineClasses.hpp b/src/hotspot/share/prims/jvmtiEnhancedRedefineClasses.hpp
index 0066088b3b0..bd5e7d153be 100644
--- a/src/hotspot/share/prims/jvmtiEnhancedRedefineClasses.hpp
+++ b/src/hotspot/share/prims/jvmtiEnhancedRedefineClasses.hpp
@@ -142,7 +142,9 @@ class VM_EnhancedRedefineClasses: public VM_GC_Operation {
// and in all direct and indirect subclasses.
void increment_class_counter(InstanceKlass *ik, TRAPS);
- void flush_dependent_code(InstanceKlass* k_h, TRAPS);
+ void mark_dependent_code(InstanceKlass* ik);
+
+ void flush_dependent_code(TRAPS);
u8 next_id();
--
2.23.0
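The comments added above describe a two-phase scheme: mark_dependent_code() does only cheap per-class marking while each class is redefined, and flush_dependent_code() performs a single global deoptimization pass from doit() once all classes have been processed. Below is a minimal, self-contained C++ sketch of that "mark per item, flush once" batching pattern; it is an illustrative analogy only, not HotSpot code, and every name in it is invented.

// Illustrative analogy only -- plain C++, not HotSpot code; all names are invented.
#include <cstdio>
#include <vector>

struct KlassInfo {
    const char* name;
    bool        marked = false;        // "dependent code needs deoptimization"
};

// Per-class step: cheap marking only (plays the role of mark_dependent_code()).
void mark_dependent_code(KlassInfo& k) {
    k.marked = true;
}

// Single global step, run once after every class has been processed
// (plays the role of flush_dependent_code() being called once from doit()).
void flush_dependent_code(std::vector<KlassInfo>& all) {
    int flushed = 0;
    for (KlassInfo& k : all) {
        if (k.marked) {
            ++flushed;                 // stands in for the expensive deoptimization work
            k.marked = false;
        }
    }
    std::printf("deoptimized dependent code for %d classes in one pass\n", flushed);
}

int main() {
    std::vector<KlassInfo> redefined = { {"A"}, {"B"}, {"C"} };
    for (KlassInfo& k : redefined) {
        mark_dependent_code(k);        // called per class, like redefine_single_class()
    }
    flush_dependent_code(redefined);   // called exactly once, like doit()
    return 0;
}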


@@ -1,211 +0,0 @@
From 4f88dcec830d39452f69d1117729469fdb768a8f Mon Sep 17 00:00:00 2001
From: Vladimir Dvorak <vladimir.dvorak@jetbrains.com>
Date: Sun, 22 Nov 2020 12:05:26 +0100
Subject: [PATCH 22/34] dcevm15 - fix ResolvedMethodTable
---
src/hotspot/share/classfile/javaClasses.cpp | 5 -
src/hotspot/share/classfile/javaClasses.hpp | 1 -
.../share/prims/resolvedMethodTable.cpp | 139 +++++++++++-------
3 files changed, 84 insertions(+), 61 deletions(-)
diff --git a/src/hotspot/share/classfile/javaClasses.cpp b/src/hotspot/share/classfile/javaClasses.cpp
index 9b086a241f7..9a627786d0f 100644
--- a/src/hotspot/share/classfile/javaClasses.cpp
+++ b/src/hotspot/share/classfile/javaClasses.cpp
@@ -3996,11 +3996,6 @@ void java_lang_invoke_ResolvedMethodName::set_vmholder(oop resolved_method, oop
resolved_method->obj_field_put(_vmholder_offset, holder);
}
-void java_lang_invoke_ResolvedMethodName::set_vmholder_offset(oop resolved_method, Method* m) {
- assert(is_instance(resolved_method), "wrong type");
- resolved_method->obj_field_put(_vmholder_offset, m->method_holder()->java_mirror());
-}
-
oop java_lang_invoke_ResolvedMethodName::find_resolved_method(const methodHandle& m, TRAPS) {
const Method* method = m();
diff --git a/src/hotspot/share/classfile/javaClasses.hpp b/src/hotspot/share/classfile/javaClasses.hpp
index 9abf2e1d105..8f5993b7225 100644
--- a/src/hotspot/share/classfile/javaClasses.hpp
+++ b/src/hotspot/share/classfile/javaClasses.hpp
@@ -1107,7 +1107,6 @@ class java_lang_invoke_ResolvedMethodName : AllStatic {
static Method* vmtarget(oop resolved_method);
static void set_vmtarget(oop resolved_method, Method* method);
- static void set_vmholder_offset(oop resolved_method, Method* method);
static void set_vmholder(oop resolved_method, oop holder);
diff --git a/src/hotspot/share/prims/resolvedMethodTable.cpp b/src/hotspot/share/prims/resolvedMethodTable.cpp
index eb9fcda44f3..d0f1667b967 100644
--- a/src/hotspot/share/prims/resolvedMethodTable.cpp
+++ b/src/hotspot/share/prims/resolvedMethodTable.cpp
@@ -375,6 +375,67 @@ public:
}
};
+class AdjustMethodEntriesDcevm : public StackObj {
+ bool* _trace_name_printed;
+ GrowableArray<oop>* _oops_to_add;
+public:
+ AdjustMethodEntriesDcevm(GrowableArray<oop>* oops_to_add, bool* trace_name_printed) : _trace_name_printed(trace_name_printed), _oops_to_add(oops_to_add) {};
+ bool operator()(WeakHandle<vm_resolved_method_table_data>* entry) {
+ oop mem_name = entry->peek();
+ if (mem_name == NULL) {
+ // Removed
+ return true;
+ }
+
+ Method* old_method = (Method*)java_lang_invoke_ResolvedMethodName::vmtarget(mem_name);
+
+ if (old_method->is_old()) {
+
+ InstanceKlass* newer_klass = InstanceKlass::cast(old_method->method_holder()->new_version());
+ Method* newer_method;
+
+ // Method* new_method;
+ if (old_method->is_deleted()) {
+ newer_method = Universe::throw_no_such_method_error();
+ } else {
+ newer_method = newer_klass->method_with_idnum(old_method->orig_method_idnum());
+
+ log_debug(redefine, class, update)("Adjusting method: '%s' of new class %s", newer_method->name_and_sig_as_C_string(), newer_klass->name()->as_C_string());
+
+ assert(newer_klass == newer_method->method_holder(), "call after swapping redefined guts");
+ assert(newer_method != NULL, "method_with_idnum() should not be NULL");
+ assert(old_method != newer_method, "sanity check");
+
+ Thread* thread = Thread::current();
+ ResolvedMethodTableLookup lookup(thread, method_hash(newer_method), newer_method);
+ ResolvedMethodGet rmg(thread, newer_method);
+
+ if (_local_table->get(thread, lookup, rmg)) {
+ // old method was already adjusted if new method exists in _the_table
+ return true;
+ }
+ }
+
+ java_lang_invoke_ResolvedMethodName::set_vmtarget(mem_name, newer_method);
+ java_lang_invoke_ResolvedMethodName::set_vmholder(mem_name, newer_method->method_holder()->java_mirror());
+
+ newer_klass->set_has_resolved_methods();
+ _oops_to_add->append(mem_name);
+
+ ResourceMark rm;
+ if (!(*_trace_name_printed)) {
+ log_debug(redefine, class, update)("adjust: name=%s", old_method->method_holder()->external_name());
+ *_trace_name_printed = true;
+ }
+ log_debug(redefine, class, update, constantpool)
+ ("ResolvedMethod method update: %s(%s)",
+ newer_method->name()->as_C_string(), newer_method->signature()->as_C_string());
+ }
+
+ return true;
+ }
+};
+
// It is called at safepoint only for RedefineClasses
void ResolvedMethodTable::adjust_method_entries(bool * trace_name_printed) {
assert(SafepointSynchronize::is_at_safepoint(), "only called at safepoint");
@@ -382,73 +443,41 @@ void ResolvedMethodTable::adjust_method_entries(bool * trace_name_printed) {
AdjustMethodEntries adjust(trace_name_printed);
_local_table->do_safepoint_scan(adjust);
}
-#endif // INCLUDE_JVMTI
-// (DCEVM) It is called at safepoint only for RedefineClasses
+// It is called at safepoint only for RedefineClasses
void ResolvedMethodTable::adjust_method_entries_dcevm(bool * trace_name_printed) {
assert(SafepointSynchronize::is_at_safepoint(), "only called at safepoint");
// For each entry in RMT, change to new method
- GrowableArray<oop>* oops_to_add = new GrowableArray<oop>();
-
- for (int i = 0; i < _the_table->table_size(); ++i) {
- for (ResolvedMethodEntry* entry = _the_table->bucket(i);
- entry != NULL;
- entry = entry->next()) {
-
- oop mem_name = entry->object_no_keepalive();
- // except ones removed
- if (mem_name == NULL) {
- continue;
- }
- Method* old_method = (Method*)java_lang_invoke_ResolvedMethodName::vmtarget(mem_name);
-
- if (old_method->is_old()) {
-
- InstanceKlass* newer_klass = InstanceKlass::cast(old_method->method_holder()->new_version());
- Method* newer_method;
-
- // Method* new_method;
- if (old_method->is_deleted()) {
- newer_method = Universe::throw_no_such_method_error();
- } else {
- newer_method = newer_klass->method_with_idnum(old_method->orig_method_idnum());
-
- log_debug(redefine, class, update)("Adjusting method: '%s' of new class %s", newer_method->name_and_sig_as_C_string(), newer_klass->name()->as_C_string());
-
- assert(newer_klass == newer_method->method_holder(), "call after swapping redefined guts");
- assert(newer_method != NULL, "method_with_idnum() should not be NULL");
- assert(old_method != newer_method, "sanity check");
-
- if (_the_table->lookup(newer_method) != NULL) {
- // old method was already adjusted if new method exists in _the_table
- continue;
- }
- }
+ GrowableArray<oop> oops_to_add(0);
+ AdjustMethodEntriesDcevm adjust(&oops_to_add, trace_name_printed);
+ _local_table->do_safepoint_scan(adjust);
+ Thread* thread = Thread::current();
+ for (int i = 0; i < oops_to_add.length(); i++) {
+ oop mem_name = oops_to_add.at(i);
+ Method* method = (Method*)java_lang_invoke_ResolvedMethodName::vmtarget(mem_name);
- java_lang_invoke_ResolvedMethodName::set_vmtarget(mem_name, newer_method);
- java_lang_invoke_ResolvedMethodName::set_vmholder_offset(mem_name, newer_method);
+ // The hash table takes ownership of the WeakHandle, even if it's not inserted.
- newer_klass->set_has_resolved_methods();
- oops_to_add->append(mem_name);
+ ResolvedMethodTableLookup lookup(thread, method_hash(method), method);
+ ResolvedMethodGet rmg(thread, method);
- ResourceMark rm;
- if (!(*trace_name_printed)) {
- log_debug(redefine, class, update)("adjust: name=%s", old_method->method_holder()->external_name());
- *trace_name_printed = true;
- }
- log_debug(redefine, class, update, constantpool)
- ("ResolvedMethod method update: %s(%s)",
- newer_method->name()->as_C_string(), newer_method->signature()->as_C_string());
+ while (true) {
+ if (_local_table->get(thread, lookup, rmg)) {
+ break;
+ }
+ WeakHandle<vm_resolved_method_table_data> wh = WeakHandle<vm_resolved_method_table_data>::create(Handle(thread, mem_name));
+ // The hash table takes ownership of the WeakHandle, even if it's not inserted.
+ if (_local_table->insert(thread, lookup, wh)) {
+ log_insert(method);
+ wh.resolve();
+ break;
}
- }
- for (int i = 0; i < oops_to_add->length(); i++) {
- oop mem_name = oops_to_add->at(i);
- Method* method = (Method*)java_lang_invoke_ResolvedMethodName::vmtarget(mem_name);
- _the_table->basic_add(method, Handle(Thread::current(), mem_name));
}
}
}
+#endif // INCLUDE_JVMTI
+
// Verification
class VerifyResolvedMethod : StackObj {
public:
--
2.23.0
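The replacement loop above follows a get-or-insert shape: for each adjusted ResolvedMethodName it first tries to look the new method up in the table, and only if that fails does it create a WeakHandle and attempt the insert, retrying the lookup if the insert does not succeed. A standalone C++ analogy of that shape is sketched below; it is a single-threaded toy with no WeakHandle or oop semantics, and all names are invented.

// Standalone analogy only -- plain C++, not HotSpot code; all names are invented.
#include <string>
#include <unordered_map>

using Table = std::unordered_map<std::string, int>;

// Returns the value stored for `key`, inserting `value` first if the key is absent.
// Shaped like the get / insert / retry loop in the patch above.
int get_or_insert(Table& table, const std::string& key, int value) {
    while (true) {
        auto it = table.find(key);
        if (it != table.end()) {
            return it->second;             // "get" succeeded -> done
        }
        auto inserted = table.emplace(key, value);
        if (inserted.second) {
            return inserted.first->second; // "insert" succeeded -> done
        }
        // Lost a (hypothetical) race with another inserter: look the key up again.
    }
}

int main() {
    Table resolved_methods;
    get_or_insert(resolved_methods, "Foo::bar()V", 1);
    // The second call finds the existing entry instead of overwriting it.
    return get_or_insert(resolved_methods, "Foo::bar()V", 2) == 1 ? 0 : 1;
}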


@@ -1,88 +0,0 @@
From 5379e56465d3d3930ec7ea91b1c64db2cdf70170 Mon Sep 17 00:00:00 2001
From: Vladimir Dvorak <vladimir.dvorak@jetbrains.com>
Date: Sun, 22 Nov 2020 12:05:50 +0100
Subject: [PATCH 23/34] dcevm15 - fix Universe::root_oops_do
---
src/hotspot/share/memory/universe.cpp | 38 +++++++++------------------
1 file changed, 12 insertions(+), 26 deletions(-)
diff --git a/src/hotspot/share/memory/universe.cpp b/src/hotspot/share/memory/universe.cpp
index f6e4253b5a5..8dad437bd51 100644
--- a/src/hotspot/share/memory/universe.cpp
+++ b/src/hotspot/share/memory/universe.cpp
@@ -39,6 +39,7 @@
#include "gc/shared/gcConfig.hpp"
#include "gc/shared/gcLogPrecious.hpp"
#include "gc/shared/gcTraceTime.inline.hpp"
+#include "gc/shared/weakProcessor.hpp"
#include "interpreter/interpreter.hpp"
#include "logging/log.hpp"
#include "logging/logStream.hpp"
@@ -75,6 +76,7 @@
#include "runtime/thread.inline.hpp"
#include "runtime/timerTrace.hpp"
#include "runtime/vmOperations.hpp"
+#include "services/management.hpp"
#include "services/memoryService.hpp"
#include "utilities/align.hpp"
#include "utilities/copy.hpp"
@@ -180,45 +182,29 @@ void Universe::basic_type_classes_do(KlassClosure *closure) {
// FIXME: (DCEVM) This method should iterate all pointers that are not within heap objects.
void Universe::root_oops_do(OopClosure *oopClosure) {
-
- class AlwaysTrueClosure: public BoolObjectClosure {
- public:
- void do_object(oop p) { ShouldNotReachHere(); }
- bool do_object_b(oop p) { return true; }
- };
- AlwaysTrueClosure always_true;
-
Universe::oops_do(oopClosure);
// ReferenceProcessor::oops_do(oopClosure); (tw) check why no longer there
JNIHandles::oops_do(oopClosure); // Global (strong) JNI handles
Threads::oops_do(oopClosure, NULL);
ObjectSynchronizer::oops_do(oopClosure);
- // TODO: review, flat profiler was removed in j10
- // FlatProfiler::oops_do(oopClosure);
- JvmtiExport::oops_do(oopClosure);
+ // (DCEVM) TODO: Check if this is correct?
+ Management::oops_do(oopClosure);
+ OopStorageSet::vm_global()->oops_do(oopClosure);
+ CLDToOopClosure cld_closure(oopClosure, ClassLoaderData::_claim_none);
+ ClassLoaderDataGraph::cld_do(&cld_closure);
// Now adjust pointers in remaining weak roots. (All of which should
// have been cleared if they pointed to non-surviving objects.)
// Global (weak) JNI handles
- JNIHandles::weak_oops_do(&always_true, oopClosure);
+ WeakProcessor::oops_do(oopClosure);
CodeBlobToOopClosure blobClosure(oopClosure, CodeBlobToOopClosure::FixRelocations);
CodeCache::blobs_do(&blobClosure);
- StringTable::oops_do(oopClosure);
+ AOT_ONLY(AOTLoader::oops_do(oopClosure);)
+ // StringTable::oops_do was removed in j15
+ // StringTable::oops_do(oopClosure);
- // (DCEVM) TODO: Check if this is correct?
- //CodeCache::scavenge_root_nmethods_oops_do(oopClosure);
- //Management::oops_do(oopClosure);
- //ref_processor()->weak_oops_do(&oopClosure);
- //PSScavenge::reference_processor()->weak_oops_do(&oopClosure);
-
-#if INCLUDE_AOT
- if (UseAOT) {
- AOTLoader::oops_do(oopClosure);
- }
-#endif
- // SO_AllClasses
- SystemDictionary::oops_do(oopClosure);
+ // PSScavenge::reference_processor()->weak_oops_do(oopClosure);
}
void Universe::oops_do(OopClosure* f) {
--
2.23.0


@@ -1,67 +0,0 @@
From c6ea68e66d37d70739f7b0ee74131322b4526a68 Mon Sep 17 00:00:00 2001
From: Vladimir Dvorak <vladimir.dvorak@jetbrains.com>
Date: Sun, 22 Nov 2020 12:03:32 +0100
Subject: [PATCH 24/34] Cleanup dcevm comments
---
src/hotspot/share/classfile/classLoaderDataGraph.hpp | 2 +-
src/hotspot/share/classfile/systemDictionary.hpp | 2 +-
src/hotspot/share/gc/shared/gcConfig.cpp | 2 +-
src/hotspot/share/prims/jvmtiEnhancedRedefineClasses.hpp | 2 +-
4 files changed, 4 insertions(+), 4 deletions(-)
diff --git a/src/hotspot/share/classfile/classLoaderDataGraph.hpp b/src/hotspot/share/classfile/classLoaderDataGraph.hpp
index f380aa3fa34..8ce94cccb47 100644
--- a/src/hotspot/share/classfile/classLoaderDataGraph.hpp
+++ b/src/hotspot/share/classfile/classLoaderDataGraph.hpp
@@ -104,7 +104,7 @@ class ClassLoaderDataGraph : public AllStatic {
static void dictionary_classes_do(KlassClosure* klass_closure);
- // Enhanced class redefinition
+ // (DCEVM) Enhanced class redefinition
static void rollback_redefinition();
// VM_CounterDecay iteration support
diff --git a/src/hotspot/share/classfile/systemDictionary.hpp b/src/hotspot/share/classfile/systemDictionary.hpp
index 931e655d631..1019dbd0d04 100644
--- a/src/hotspot/share/classfile/systemDictionary.hpp
+++ b/src/hotspot/share/classfile/systemDictionary.hpp
@@ -455,7 +455,7 @@ public:
static bool is_well_known_klass(Symbol* class_name);
#endif
- // Enhanced class redefinition
+ // (DCEVM) Enhanced class redefinition
static void remove_from_hierarchy(InstanceKlass* k);
static void update_constraints_after_redefinition();
diff --git a/src/hotspot/share/gc/shared/gcConfig.cpp b/src/hotspot/share/gc/shared/gcConfig.cpp
index 5c1a09390f1..23fbf715378 100644
--- a/src/hotspot/share/gc/shared/gcConfig.cpp
+++ b/src/hotspot/share/gc/shared/gcConfig.cpp
@@ -99,7 +99,7 @@ void GCConfig::fail_if_non_included_gc_is_selected() {
void GCConfig::select_gc_ergonomically() {
if (AllowEnhancedClassRedefinition && !UseG1GC) {
- // Enhanced class redefinition only supports serial GC at the moment
+ // (DCEVM) Enhanced class redefinition only supports serial GC at the moment
FLAG_SET_ERGO(UseSerialGC, true);
} else if (os::is_server_class_machine()) {
#if INCLUDE_G1GC
diff --git a/src/hotspot/share/prims/jvmtiEnhancedRedefineClasses.hpp b/src/hotspot/share/prims/jvmtiEnhancedRedefineClasses.hpp
index bd5e7d153be..5de375fb888 100644
--- a/src/hotspot/share/prims/jvmtiEnhancedRedefineClasses.hpp
+++ b/src/hotspot/share/prims/jvmtiEnhancedRedefineClasses.hpp
@@ -78,7 +78,7 @@ class VM_EnhancedRedefineClasses: public VM_GC_Operation {
// have any entries.
bool _any_class_has_resolved_methods;
- // Enhanced class redefinition, affected klasses contain all classes which should be redefined
+ // (DCEVM) Enhanced class redefinition, affected klasses contain all classes which should be redefined
// either because of redefine, class hierarchy or interface change
GrowableArray<Klass*>* _affected_klasses;
--
2.23.0


@@ -1,43 +0,0 @@
From 507d97966c7145d0ae2533459cc504c7b0d6d5b6 Mon Sep 17 00:00:00 2001
From: Vladimir Dvorak <vladimir.dvorak@jetbrains.com>
Date: Sun, 22 Nov 2020 18:49:05 +0100
Subject: [PATCH 25/34] Fix cpCache when AllowEnhancedClassRedefinition is disabled
---
src/hotspot/share/oops/cpCache.hpp | 8 +++++---
1 file changed, 5 insertions(+), 3 deletions(-)
diff --git a/src/hotspot/share/oops/cpCache.hpp b/src/hotspot/share/oops/cpCache.hpp
index 121a13b1dda..64dcf6223f5 100644
--- a/src/hotspot/share/oops/cpCache.hpp
+++ b/src/hotspot/share/oops/cpCache.hpp
@@ -148,13 +148,13 @@ class ConstantPoolCacheEntry {
void set_bytecode_2(Bytecodes::Code code);
void set_f1(Metadata* f1) {
Metadata* existing_f1 = _f1; // read once
- //assert(existing_f1 == NULL || existing_f1 == f1, "illegal field change");
+ assert(AllowEnhancedClassRedefinition || existing_f1 == NULL || existing_f1 == f1, "illegal field change");
_f1 = f1;
}
void release_set_f1(Metadata* f1);
void set_f2(intx f2) {
intx existing_f2 = _f2; // read once
- //assert(existing_f2 == 0 || existing_f2 == f2, "illegal field change");
+ assert(AllowEnhancedClassRedefinition || existing_f2 == 0 || existing_f2 == f2, "illegal field change");
_f2 = f2;
}
void set_f2_as_vfinal_method(Method* f2) {
@@ -215,7 +215,9 @@ class ConstantPoolCacheEntry {
void initialize_resolved_reference_index(int ref_index) {
assert(_f2 == 0, "set once"); // note: ref_index might be zero also
_f2 = ref_index;
- _flags = 1 << is_resolved_ref_shift;
+ if (AllowEnhancedClassRedefinition) {
+ _flags = 1 << is_resolved_ref_shift;
+ }
}
void set_field( // sets entry to resolved field state
--
2.23.0


@@ -1,32 +0,0 @@
From b516b615c20fafa2094dfb9f4cb08245b26418d0 Mon Sep 17 00:00:00 2001
From: Vladimir Dvorak <vladimir.dvorak@jetbrains.com>
Date: Sun, 22 Nov 2020 19:51:46 +0100
Subject: [PATCH 26/34] dcevm15 - add ClassLoaderDataGraph_lock on
ClassLoaderDataGraph::classes_do
ClassLoaderDataGraph::classes_do needs a safepoint or the ClassLoaderDataGraph_lock;
find_sorted_affected_classes does not run at a safepoint, therefore the call
must be locked
---
src/hotspot/share/prims/jvmtiEnhancedRedefineClasses.cpp | 5 ++++-
1 file changed, 4 insertions(+), 1 deletion(-)
diff --git a/src/hotspot/share/prims/jvmtiEnhancedRedefineClasses.cpp b/src/hotspot/share/prims/jvmtiEnhancedRedefineClasses.cpp
index efaf11e1666..197e1c0029f 100644
--- a/src/hotspot/share/prims/jvmtiEnhancedRedefineClasses.cpp
+++ b/src/hotspot/share/prims/jvmtiEnhancedRedefineClasses.cpp
@@ -2063,7 +2063,10 @@ jvmtiError VM_EnhancedRedefineClasses::find_sorted_affected_classes(TRAPS) {
AffectedKlassClosure closure(_affected_klasses);
// Updated in j10, from original SystemDictionary::classes_do
- ClassLoaderDataGraph::classes_do(&closure);
+ {
+ MutexLocker mcld(ClassLoaderDataGraph_lock);
+ ClassLoaderDataGraph::classes_do(&closure);
+ }
//ClassLoaderDataGraph::dictionary_classes_do(&closure);
log_trace(redefine, class, load)("%d classes affected", _affected_klasses->length());
--
2.23.0


@@ -1,29 +0,0 @@
From c6498946006879314bdc6218ee72da5d9c88f237 Mon Sep 17 00:00:00 2001
From: Vladimir Dvorak <vladimir.dvorak@jetbrains.com>
Date: Sat, 28 Nov 2020 19:29:42 +0100
Subject: [PATCH 27/34] dcevm15 - check if has_nestmate_access_to has newest
host class
---
src/hotspot/share/oops/instanceKlass.cpp | 5 +++++
1 file changed, 5 insertions(+)
diff --git a/src/hotspot/share/oops/instanceKlass.cpp b/src/hotspot/share/oops/instanceKlass.cpp
index 5e40d78a87e..1d9623f2446 100644
--- a/src/hotspot/share/oops/instanceKlass.cpp
+++ b/src/hotspot/share/oops/instanceKlass.cpp
@@ -445,6 +445,11 @@ bool InstanceKlass::has_nestmate_access_to(InstanceKlass* k, TRAPS) {
return false;
}
+ if (AllowEnhancedClassRedefinition) {
+ // TODO: (DCEVM) check if this is correct. It fixes problems with lambdas (hidden classes)
+ cur_host = InstanceKlass::cast(cur_host->newest_version());
+ }
+
Klass* k_nest_host = k->nest_host(CHECK_false);
if (k_nest_host == NULL) {
return false;
--
2.23.0


@@ -1,24 +0,0 @@
From 86c27155386c1c40642c99c63a242d1f5d8601a5 Mon Sep 17 00:00:00 2001
From: Vladimir Dvorak <vladimir.dvorak@jetbrains.com>
Date: Sat, 28 Nov 2020 19:31:08 +0100
Subject: [PATCH 28/34] Remove unused fieldType
---
src/hotspot/share/classfile/vmSymbols.hpp | 1 -
1 file changed, 1 deletion(-)
diff --git a/src/hotspot/share/classfile/vmSymbols.hpp b/src/hotspot/share/classfile/vmSymbols.hpp
index 6a3b234b222..eb06684a288 100644
--- a/src/hotspot/share/classfile/vmSymbols.hpp
+++ b/src/hotspot/share/classfile/vmSymbols.hpp
@@ -465,7 +465,6 @@
template(static_offset_name, "staticOffset") \
template(static_base_name, "staticBase") \
template(field_offset_name, "fieldOffset") \
- template(field_type_name, "fieldType") \
\
/* name symbols needed by intrinsics */ \
\
--
2.23.0


@@ -1,54 +0,0 @@
From 025d0d2903963fb79f83cf0d90418783d3ef6813 Mon Sep 17 00:00:00 2001
From: Vladimir Dvorak <vladimir.dvorak@jetbrains.com>
Date: Sun, 29 Nov 2020 17:18:16 +0100
Subject: [PATCH 29/34] mark_as_scavengable only alive methods
---
.../share/prims/jvmtiEnhancedRedefineClasses.cpp | 14 ++++++++------
1 file changed, 8 insertions(+), 6 deletions(-)
diff --git a/src/hotspot/share/prims/jvmtiEnhancedRedefineClasses.cpp b/src/hotspot/share/prims/jvmtiEnhancedRedefineClasses.cpp
index 197e1c0029f..e00fac1f693 100644
--- a/src/hotspot/share/prims/jvmtiEnhancedRedefineClasses.cpp
+++ b/src/hotspot/share/prims/jvmtiEnhancedRedefineClasses.cpp
@@ -223,19 +223,21 @@ class FieldCopier : public FieldClosure {
// TODO: review...
void VM_EnhancedRedefineClasses::mark_as_scavengable(nmethod* nm) {
- ScavengableNMethods::register_nmethod(nm);
+ if (nm->is_alive()) {
+ ScavengableNMethods::register_nmethod(nm);
+ }
}
void VM_EnhancedRedefineClasses::unregister_nmethod_g1(nmethod* nm) {
// It should work not only for G1 but also for another GCs, but this way is safer now
- if (!nm->is_zombie() && !nm->is_unloaded()) {
+ if (nm->is_alive()) {
Universe::heap()->unregister_nmethod(nm);
}
}
void VM_EnhancedRedefineClasses::register_nmethod_g1(nmethod* nm) {
// It should work not only for G1 but also for another GCs, but this way is safer now
- if (!nm->is_zombie() && !nm->is_unloaded()) {
+ if (nm->is_alive()) {
Universe::heap()->register_nmethod(nm);
}
}
@@ -511,9 +513,9 @@ void VM_EnhancedRedefineClasses::doit() {
flush_dependent_code(thread);
// }
- // Adjust constantpool caches for all classes that reference methods of the evolved class.
- ClearCpoolCacheAndUnpatch clear_cpool_cache(thread);
- ClassLoaderDataGraph::classes_do(&clear_cpool_cache);
+ // Adjust constantpool caches for all classes that reference methods of the evolved class.
+ ClearCpoolCacheAndUnpatch clear_cpool_cache(thread);
+ ClassLoaderDataGraph::classes_do(&clear_cpool_cache);
// JSR-292 support
if (_any_class_has_resolved_methods) {
--
2.23.0


@@ -1,28 +0,0 @@
From 27aabfefe7d799545049bb81ba19d4ed2ff6379c Mon Sep 17 00:00:00 2001
From: Vladimir Dvorak <vladimir.dvorak@jetbrains.com>
Date: Sun, 29 Nov 2020 17:20:11 +0100
Subject: [PATCH 30/34] dcevm15 - lock on
ClassLoaderDataGraph::rollback_redefinition
rollback does not run at a safepoint, therefore it must be locked
---
src/hotspot/share/prims/jvmtiEnhancedRedefineClasses.cpp | 2 ++
1 file changed, 2 insertions(+)
diff --git a/src/hotspot/share/prims/jvmtiEnhancedRedefineClasses.cpp b/src/hotspot/share/prims/jvmtiEnhancedRedefineClasses.cpp
index e00fac1f693..db5fb1c472b 100644
--- a/src/hotspot/share/prims/jvmtiEnhancedRedefineClasses.cpp
+++ b/src/hotspot/share/prims/jvmtiEnhancedRedefineClasses.cpp
@@ -1382,7 +1382,9 @@ void VM_EnhancedRedefineClasses::calculate_instance_update_information(Klass* ne
// Rollback all changes - clear new classes from the system dictionary, return old classes to directory, free memory.
void VM_EnhancedRedefineClasses::rollback() {
log_info(redefine, class, load)("Rolling back redefinition, result=%d", _res);
+ ClassLoaderDataGraph_lock->lock();
ClassLoaderDataGraph::rollback_redefinition();
+ ClassLoaderDataGraph_lock->unlock();
for (int i = 0; i < _new_classes->length(); i++) {
SystemDictionary::remove_from_hierarchy(_new_classes->at(i));
--
2.23.0
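The fix above brackets rollback_redefinition() with explicit lock()/unlock() calls on ClassLoaderDataGraph_lock. HotSpot's usual idiom for this is a scoped MutexLocker, which patch 26 above already uses around ClassLoaderDataGraph::classes_do. The standalone C++ sketch below shows the same RAII idea with std::lock_guard standing in for MutexLocker; it is an analogy only, and all names are invented.

// Standalone analogy only -- plain C++, not HotSpot code; all names are invented.
#include <mutex>
#include <vector>

static std::mutex       graph_lock;           // stands in for ClassLoaderDataGraph_lock
static std::vector<int> redefined_versions;   // stands in for the state being rolled back

void rollback_redefinition() {
    std::lock_guard<std::mutex> guard(graph_lock);  // RAII: released on every exit path
    if (redefined_versions.empty()) {
        return;                                     // the lock is released here too
    }
    redefined_versions.pop_back();
}

int main() {
    redefined_versions.push_back(1);   // pretend one redefined version was registered
    rollback_redefinition();
    return 0;
}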


@@ -1,28 +0,0 @@
From 9b405cb642d5935c39c8dbd522ea2fdecfc29ef3 Mon Sep 17 00:00:00 2001
From: Vladimir Dvorak <vladimir.dvorak@jetbrains.com>
Date: Sun, 29 Nov 2020 19:59:50 +0100
Subject: [PATCH 31/34] ResourceMark in G1IterateObjectClosureTask fixing
memory leaks
G1IterateObjectClosureTask is used only in the redefinition full GC run
---
src/hotspot/share/gc/g1/g1CollectedHeap.cpp | 3 +++
1 file changed, 3 insertions(+)
diff --git a/src/hotspot/share/gc/g1/g1CollectedHeap.cpp b/src/hotspot/share/gc/g1/g1CollectedHeap.cpp
index a29d2dddc2d..2af6df6c1e4 100644
--- a/src/hotspot/share/gc/g1/g1CollectedHeap.cpp
+++ b/src/hotspot/share/gc/g1/g1CollectedHeap.cpp
@@ -2362,6 +2362,9 @@ class G1IterateObjectClosureTask : public AbstractGangTask {
_cl(cl), _g1h(g1h), _hrclaimer(g1h->workers()->active_workers()) { }
virtual void work(uint worker_id) {
+ Thread *thread = Thread::current();
+ HandleMark hm(thread); // make sure any handles created are deleted
+ ResourceMark rm(thread);
IterateObjectClosureRegionClosure blk(_cl);
_g1h->heap_region_par_iterate_from_worker_offset(&blk, &_hrclaimer, worker_id);
}
--
2.23.0
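The patch above gives each work() invocation its own HandleMark and ResourceMark so that temporary handles and resource-area allocations made while iterating a region are released when the marks go out of scope, instead of accumulating for the whole full-GC run. The standalone C++ sketch below illustrates that watermark idea with a toy arena; it is an analogy only, not HotSpot's ResourceArea, and all names are invented.

// Standalone analogy only -- plain C++, not HotSpot code; all names are invented.
#include <cstddef>
#include <vector>

struct Arena {
    std::vector<char> storage = std::vector<char>(64 * 1024);
    size_t            top     = 0;
    void* alloc(size_t n) { void* p = &storage[top]; top += n; return p; }
};

// Records the current watermark and restores it on scope exit,
// playing the role of ResourceMark rm(thread) in the patch above.
struct ScopedMark {
    Arena& arena;
    size_t saved;
    explicit ScopedMark(Arena& a) : arena(a), saved(a.top) {}
    ~ScopedMark() { arena.top = saved; }
};

void work(Arena& arena) {
    ScopedMark rm(arena);     // without this, every call would leak 1 KB
    arena.alloc(1024);        // temporary per-task allocation
}                             // watermark restored here

int main() {
    Arena arena;
    for (int i = 0; i < 1000; i++) {
        work(arena);          // the arena does not grow across iterations
    }
    return arena.top == 0 ? 0 : 1;
}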


@@ -1,91 +0,0 @@
From 40fe40884d4efc50864bb3f2dd88f0a2e7122d5a Mon Sep 17 00:00:00 2001
From: Vladimir Dvorak <vladimir.dvorak@jetbrains.com>
Date: Sun, 29 Nov 2020 20:05:03 +0100
Subject: [PATCH 32/34] dcevm15 - fix hidden classes
---
.../prims/jvmtiEnhancedRedefineClasses.cpp | 41 ++++++++++++++-----
1 file changed, 30 insertions(+), 11 deletions(-)
diff --git a/src/hotspot/share/prims/jvmtiEnhancedRedefineClasses.cpp b/src/hotspot/share/prims/jvmtiEnhancedRedefineClasses.cpp
index db5fb1c472b..590f7fdfafe 100644
--- a/src/hotspot/share/prims/jvmtiEnhancedRedefineClasses.cpp
+++ b/src/hotspot/share/prims/jvmtiEnhancedRedefineClasses.cpp
@@ -722,7 +722,8 @@ bool VM_EnhancedRedefineClasses::is_modifiable_class(oop klass_mirror) {
}
// Cannot redefine or retransform an anonymous class.
- if (InstanceKlass::cast(k)->is_unsafe_anonymous()) {
+ // TODO: check if this is correct in j15
+ if (InstanceKlass::cast(k)->is_unsafe_anonymous() || InstanceKlass::cast(k)->is_hidden()) {
return false;
}
return true;
@@ -808,21 +809,27 @@ jvmtiError VM_EnhancedRedefineClasses::load_new_class_versions(TRAPS) {
InstanceKlass* k;
- if (InstanceKlass::cast(the_class)->is_unsafe_anonymous()) {
- const InstanceKlass* host_class = the_class->unsafe_anonymous_host();
+ if (the_class->is_unsafe_anonymous() || the_class->is_hidden()) {
+ InstanceKlass* dynamic_host_class = NULL;
+ InstanceKlass* unsafe_anonymous_host = NULL;
- // Make sure it's the real host class, not another anonymous class.
- while (host_class != NULL && host_class->is_unsafe_anonymous()) {
- host_class = host_class->unsafe_anonymous_host();
+ if (the_class->is_hidden()) {
+ log_debug(redefine, class, load)("loading hidden class %s", the_class->name()->as_C_string());
+ dynamic_host_class = the_class->nest_host(THREAD);
+ }
+
+ if (the_class->is_unsafe_anonymous()) {
+ log_debug(redefine, class, load)("loading unsafe anonymous %s", the_class->name()->as_C_string());
+ unsafe_anonymous_host = the_class->unsafe_anonymous_host();
}
ClassLoadInfo cl_info(protection_domain,
- host_class,
- NULL, // dynamic_nest_host
+ unsafe_anonymous_host,
NULL, // cp_patches
+ dynamic_host_class, // dynamic_nest_host
Handle(), // classData
- false, // is_hidden
- false, // is_strong_hidden
+ the_class->is_hidden(), // is_hidden
+ !the_class->is_non_strong_hidden(), // is_strong_hidden
true); // FIXME: check if correct. can_access_vm_annotations
k = SystemDictionary::parse_stream(the_class_sym,
@@ -833,7 +840,17 @@ jvmtiError VM_EnhancedRedefineClasses::load_new_class_versions(TRAPS) {
THREAD);
k->class_loader_data()->exchange_holders(the_class->class_loader_data());
- the_class->class_loader_data()->inc_keep_alive();
+
+ if (the_class->is_hidden()) {
+ // from jvm_lookup_define_class() (jvm.cpp):
+ // The hidden class loader data has been artificially been kept alive to
+ // this point. The mirror and any instances of this class have to keep
+ // it alive afterwards.
+ the_class->class_loader_data()->dec_keep_alive();
+ } else {
+ the_class->class_loader_data()->inc_keep_alive();
+ }
+
} else {
k = SystemDictionary::resolve_from_stream(the_class_sym,
the_class_loader,
@@ -1475,6 +1492,8 @@ void VM_EnhancedRedefineClasses::ClearCpoolCacheAndUnpatch::do_klass(Klass* k) {
ik->set_unsafe_anonymous_host(InstanceKlass::cast(ik->unsafe_anonymous_host()->newest_version()));
}
+ // FIXME: check new nest_host for hidden
+
// Update implementor if there is only one, in this case implementor() can reference old class
if (ik->is_interface()) {
Klass* implKlass = ik->implementor();
--
2.23.0


@@ -1,27 +0,0 @@
From 29920b076b4ad96d85adbce0a1d947e5022ba3ad Mon Sep 17 00:00:00 2001
From: Vladimir Dvorak <vladimir.dvorak@jetbrains.com>
Date: Sun, 29 Nov 2020 20:08:57 +0100
Subject: [PATCH 33/34] dcevm15 - DON'T clear F2 in CP cache after indy
unevolving
It's not clear why it was cleared in dcevm7-11
---
src/hotspot/share/oops/cpCache.cpp | 2 +-
1 file changed, 1 insertion(+), 1 deletion(-)
diff --git a/src/hotspot/share/oops/cpCache.cpp b/src/hotspot/share/oops/cpCache.cpp
index 79a38dbeff0..650e6fab42d 100644
--- a/src/hotspot/share/oops/cpCache.cpp
+++ b/src/hotspot/share/oops/cpCache.cpp
@@ -650,7 +650,7 @@ void ConstantPoolCacheEntry::clear_entry() {
if (clearData) {
if (!is_resolved_reference()) {
- _f2 = 0;
+ // _f2 = 0;
}
// FIXME: (DCEVM) we want to clear flags, but parameter size is actually used
// after we return from the method, before entry is re-initialized. So let's
--
2.23.0


@@ -1,49 +0,0 @@
From 1f13b20ab5553182680045b7d7324ff92da7e7f0 Mon Sep 17 00:00:00 2001
From: Vladimir Dvorak <vladimir.dvorak@jetbrains.com>
Date: Sun, 29 Nov 2020 21:28:06 +0100
Subject: [PATCH 34/34] dcevm15 - fix Universe::root_oops_do
The removed ClassLoaderDataGraph::cld_do was the cause of crashes due to multiple
oop patching. In dcevm15, ClassLoaderDataGraph::cld_do had replaced the previously
used (and since removed) SystemDictionary::oops_do
---
src/hotspot/share/memory/universe.cpp | 11 ++++++++---
1 file changed, 8 insertions(+), 3 deletions(-)
diff --git a/src/hotspot/share/memory/universe.cpp b/src/hotspot/share/memory/universe.cpp
index 8dad437bd51..0199962a684 100644
--- a/src/hotspot/share/memory/universe.cpp
+++ b/src/hotspot/share/memory/universe.cpp
@@ -190,21 +190,26 @@ void Universe::root_oops_do(OopClosure *oopClosure) {
// (DCEVM) TODO: Check if this is correct?
Management::oops_do(oopClosure);
OopStorageSet::vm_global()->oops_do(oopClosure);
- CLDToOopClosure cld_closure(oopClosure, ClassLoaderData::_claim_none);
- ClassLoaderDataGraph::cld_do(&cld_closure);
+ // CLDToOopClosure cld_closure(oopClosure, ClassLoaderData::_claim_none);
+ // ClassLoaderDataGraph::cld_do(&cld_closure);
// Now adjust pointers in remaining weak roots. (All of which should
// have been cleared if they pointed to non-surviving objects.)
// Global (weak) JNI handles
WeakProcessor::oops_do(oopClosure);
+ JvmtiExport::oops_do(oopClosure);
+
CodeBlobToOopClosure blobClosure(oopClosure, CodeBlobToOopClosure::FixRelocations);
CodeCache::blobs_do(&blobClosure);
+
AOT_ONLY(AOTLoader::oops_do(oopClosure);)
+
// StringTable::oops_do was removed in j15
// StringTable::oops_do(oopClosure);
- // PSScavenge::reference_processor()->weak_oops_do(oopClosure);
+ // OopStorageSet::vm_global()->oops_do(oopClosure);
+
}
void Universe::oops_do(OopClosure* f) {
--
2.23.0
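The commit message above attributes the crashes to the same oop slot being patched more than once when it stays reachable through two different strong-root iterations (the CLD walk plus the other strong roots). The standalone C++ sketch below shows why applying a relocation-style adjustment twice corrupts a slot and how visiting each slot exactly once avoids it; the patch achieves the same effect by dropping the duplicate iteration rather than by tracking visited slots, and all names below are invented.

// Standalone analogy only -- plain C++, not HotSpot code; all names are invented.
#include <cassert>
#include <unordered_set>
#include <vector>

struct Slot { long addr; };   // a root slot holding an "address"

// Applies a fixed relocation delta exactly once per slot; applying it twice
// would leave the slot pointing past the relocated object.
void adjust_once(Slot* s, long delta, std::unordered_set<Slot*>& visited) {
    if (!visited.insert(s).second) {
        return;               // already adjusted via another root path
    }
    s->addr += delta;
}

int main() {
    Slot shared{100};
    std::vector<Slot*> roots_a{&shared};   // reached once via one strong-root set
    std::vector<Slot*> roots_b{&shared};   // reached again via a second root walk
    std::unordered_set<Slot*> visited;

    for (Slot* s : roots_a) adjust_once(s, 64, visited);
    for (Slot* s : roots_b) adjust_once(s, 64, visited);

    assert(shared.addr == 164);            // adjusted exactly once, not twice
    return 0;
}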


@@ -1,126 +0,0 @@
#!/bin/bash -x
# The following parameters must be specified:
# JBSDK_VERSION - specifies the major version of OpenJDK, e.g. 11_0_6 (underscores '_' are used instead of dots '.')
# JDK_BUILD_NUMBER - specifies the update release of the OpenJDK build or the value of the --with-version-build argument to configure
# build_number - specifies the number of JetBrainsRuntime build
# bundle_type - specifies the bundle to be built; possible values:
# <empty> or nomod - the release bundles without any additional modules (jcef)
# jcef - the release bundles with jcef
# dcevm - the release bundles built with the dcevm patches applied (they also include jcef)
# fd - the fastdebug bundles which also include the jcef module
#
# jbrsdk-${JBSDK_VERSION}-osx-x64-b${build_number}.tar.gz
# jbr-${JBSDK_VERSION}-osx-x64-b${build_number}.tar.gz
#
# $ ./java --version
# openjdk 11.0.6 2020-01-14
# OpenJDK Runtime Environment (build 11.0.6+${JDK_BUILD_NUMBER}-b${build_number})
# OpenJDK 64-Bit Server VM (build 11.0.6+${JDK_BUILD_NUMBER}-b${build_number}, mixed mode)
#
# Environment variables:
# JCEF_PATH - specifies the path to the directory with JCEF binaries.
# By default JCEF binaries should be located in ./jcef_win_x64
JBSDK_VERSION=$1
JDK_BUILD_NUMBER=$2
build_number=$3
bundle_type=$4
JBSDK_VERSION_WITH_DOTS=$(echo $JBSDK_VERSION | sed 's/_/\./g')
WORK_DIR=$(pwd)
JCEF_PATH=${JCEF_PATH:=$WORK_DIR/jcef_win_x64}
source jb/project/tools/common/scripts/common.sh
function create_image_bundle {
__bundle_name=$1
__arch_name=$2
__modules_path=$3
__modules=$4
[ -d $__bundle_name ] && rm -rf $__bundle_name
echo Running jlink ...
${JSDK}/bin/jlink \
--module-path $__modules_path --no-man-pages --compress=2 \
--add-modules $__modules --output $__bundle_name || do_exit $?
grep -v "^JAVA_VERSION" "$JSDK"/release | grep -v "^MODULES" >> $__bundle_name/release
if [ "$__arch_name" == "$JBRSDK_BUNDLE" ]; then
sed 's/JBR/JBRSDK/g' $__bundle_name/release > release
mv release $__bundle_name/release
copy_jmods "$__modules" "$__modules_path" "$__bundle_name"/jmods
fi
}
WITH_DEBUG_LEVEL="--with-debug-level=release"
RELEASE_NAME=windows-x86_64-server-release
case "$bundle_type" in
"jcef")
do_reset_changes=0
;;
"dcevm")
HEAD_REVISION=$(git rev-parse HEAD)
git am jb/project/tools/patches/dcevm/*.patch || do_exit $?
do_reset_dcevm=0
do_reset_changes=0
;;
"nomod" | "")
bundle_type=""
;;
"fd")
do_reset_changes=0
WITH_DEBUG_LEVEL="--with-debug-level=fastdebug"
RELEASE_NAME=windows-x86_64-server-fastdebug
;;
esac
sh ./configure \
$WITH_DEBUG_LEVEL \
--with-vendor-name="$VENDOR_NAME" \
--with-vendor-version-string="$VENDOR_VERSION_STRING" \
--with-jvm-features=shenandoahgc \
--with-version-pre= \
--with-version-build=$JDK_BUILD_NUMBER \
--with-version-opt=b${build_number} \
--with-toolchain-version=$TOOLCHAIN_VERSION \
--with-boot-jdk=$BOOT_JDK \
--disable-ccache \
--enable-cds=yes || do_exit $?
if [ -z "$bundle_type" ]; then
make LOG=info CONF=$RELEASE_NAME clean images test-image || do_exit $?
else
make LOG=info CONF=$RELEASE_NAME clean images || do_exit $?
fi
IMAGES_DIR=build/$RELEASE_NAME/images
JSDK=$IMAGES_DIR/jdk
JSDK_MODS_DIR=$IMAGES_DIR/jmods
JBRSDK_BUNDLE=jbrsdk
where cygpath
if [ $? -eq 0 ]; then
JCEF_PATH="$(cygpath -w $JCEF_PATH | sed 's/\\/\//g')"
fi
if [ "$bundle_type" == "jcef" ] || [ "$bundle_type" == "dcevm" ] || [ "$bundle_type" == "fd" ]; then
git apply -p0 < jb/project/tools/patches/add_jcef_module.patch || do_exit $?
update_jsdk_mods "$JSDK" "$JCEF_PATH"/jmods "$JSDK"/jmods "$JSDK_MODS_DIR" || do_exit $?
cp $JCEF_PATH/jmods/* ${JSDK_MODS_DIR} # $JSDK/jmods is not unchanged
jbr_name_postfix="_${bundle_type}"
fi
# create runtime image bundle
modules=$(xargs < modules.list | sed s/" "//g) || do_exit $?
create_image_bundle "jbr${jbr_name_postfix}" "jbr" $JSDK_MODS_DIR "$modules" || do_exit $?
# create sdk image bundle
modules=$(cat ${JSDK}/release | grep MODULES | sed s/MODULES=//g | sed s/' '/','/g | sed s/\"//g | sed s/\\r//g | sed s/\\n//g)
if [ "$bundle_type" == "jcef" ] || [ "$bundle_type" == "dcevm" ] || [ "$bundle_type" == "fd" ] || [ "$bundle_type" == "$JBRSDK_BUNDLE" ]; then
modules=${modules},$(get_mods_list "$JCEF_PATH"/jmods)
fi
create_image_bundle "$JBRSDK_BUNDLE${jbr_name_postfix}" "$JBRSDK_BUNDLE" "$JSDK_MODS_DIR" "$modules" || do_exit $?
do_exit 0

View File

@@ -1,62 +0,0 @@
#!/bin/bash -x
# The following parameters must be specified:
# JBSDK_VERSION - specifies the current version of OpenJDK e.g. 11_0_6
# JDK_BUILD_NUMBER - specifies the number of OpenJDK build or the value of --with-version-build argument to configure
# build_number - specifies the number of JetBrainsRuntime build
#
# jbrsdk-${JBSDK_VERSION}-osx-x64-b${build_number}.tar.gz
# jbr-${JBSDK_VERSION}-osx-x64-b${build_number}.tar.gz
#
# $ ./java --version
# openjdk 11.0.6 2020-01-14
# OpenJDK Runtime Environment (build 11.0.6+${JDK_BUILD_NUMBER}-b${build_number})
# OpenJDK 64-Bit Server VM (build 11.0.6+${JDK_BUILD_NUMBER}-b${build_number}, mixed mode)
#
JBSDK_VERSION=$1
JDK_BUILD_NUMBER=$2
build_number=$3
JBSDK_VERSION_WITH_DOTS=$(echo $JBSDK_VERSION | sed 's/_/\./g')
source jb/project/tools/common/scripts/common.sh
JBRSDK_BASE_NAME=jbrsdk-${JBSDK_VERSION}
WORK_DIR=$(pwd)
[ -z "$bundle_type" ] && (git apply -p0 < jb/project/tools/patches/exclude_jcef_module.patch || exit $?)
PATH="/usr/local/bin:/usr/bin:${PATH}"
./configure \
--with-target-bits=32 \
--with-vendor-name="${VENDOR_NAME}" \
--with-vendor-version-string="${VENDOR_VERSION_STRING}" \
--with-version-pre= \
--with-version-build=${JDK_BUILD_NUMBER} \
--with-version-opt=b${build_number} \
--with-toolchain-version=${TOOLCHAIN_VERSION} \
--with-boot-jdk=${BOOT_JDK} \
--disable-ccache \
--enable-cds=yes || exit 1
make clean CONF=windows-x86-server-release || exit 1
make LOG=info images CONF=windows-x86-server-release test-image || exit 1
JBSDK=${JBRSDK_BASE_NAME}-windows-x86-b${build_number}
BASE_DIR=build/windows-x86-server-release/images
JSDK=${BASE_DIR}/jdk
JBRSDK_BUNDLE=jbrsdk
rm -rf ${BASE_DIR}/${JBRSDK_BUNDLE} && rsync -a --exclude demo --exclude sample ${JSDK}/ ${JBRSDK_BUNDLE} || exit 1
sed 's/JBR/JBRSDK/g' ${JSDK}/release > release
mv release ${JBRSDK_BUNDLE}/release
JBR_BUNDLE=jbr
rm -rf ${JBR_BUNDLE}
grep -v javafx modules.list | grep -v "jdk.internal.vm\|jdk.aot\|jcef" > modules.list.x86
${JSDK}/bin/jlink \
--module-path ${JSDK}/jmods --no-man-pages --compress=2 \
--add-modules $(xargs < modules.list.x86 | sed s/" "//g) --output ${JBR_BUNDLE} || exit $?
echo Modifying release info ...
#grep -v \"^JAVA_VERSION\" ${JSDK}/release | grep -v \"^MODULES\" >> ${JBR_BUNDLE}/release


@@ -1,58 +0,0 @@
#!/bin/bash -x
# The following parameters must be specified:
# JBSDK_VERSION - specifies the major version of OpenJDK, e.g. 11_0_6 (underscores '_' are used instead of dots '.')
# JDK_BUILD_NUMBER - specifies the update release of the OpenJDK build or the value of the --with-version-build argument to configure
# build_number - specifies the number of JetBrainsRuntime build
# bundle_type - specifies the bundle to be built; possible values:
# <empty> or nomod - the bundles without any additional modules (jcef)
# jcef - the bundles with jcef
# dcevm - the bundles built with the dcevm patches (they also include jcef)
# fd - the fastdebug bundles which also include the jcef module
#
# jbrsdk-${JBSDK_VERSION}-osx-x64-b${build_number}.tar.gz
# jbr-${JBSDK_VERSION}-osx-x64-b${build_number}.tar.gz
#
# $ ./java --version
# openjdk 11.0.6 2020-01-14
# OpenJDK Runtime Environment (build 11.0.6+${JDK_BUILD_NUMBER}-b${build_number})
# OpenJDK 64-Bit Server VM (build 11.0.6+${JDK_BUILD_NUMBER}-b${build_number}, mixed mode)
#
source jb/project/tools/common/scripts/common.sh
JBSDK_VERSION=$1
JDK_BUILD_NUMBER=$2
build_number=$3
bundle_type=$4
function pack_jbr {
__bundle_name=$1
__arch_name=$2
[ "$bundle_type" == "fd" ] && [ "$__arch_name" == "$JBRSDK_BUNDLE" ] && __bundle_name=$__arch_name && fastdebug_infix="fastdebug-"
JBR=${__bundle_name}-${JBSDK_VERSION}-windows-x64-${fastdebug_infix}b${build_number}
echo Creating $JBR.tar.gz ...
/usr/bin/tar -czf $JBR.tar.gz -C $BASE_DIR $__bundle_name || do_exit $?
}
[ "$bundle_type" == "nomod" ] && bundle_type=""
JBRSDK_BUNDLE=jbrsdk
RELEASE_NAME=windows-x86_64-server-release
IMAGES_DIR=build/$RELEASE_NAME/images
JBSDK=$JBRSDK_BASE_NAME-windows-x64-b$build_number
BASE_DIR=.
if [ "$bundle_type" == "jcef" ] || [ "$bundle_type" == "dcevm" ] || [ "$bundle_type" == "fd" ]; then
jbr_name_postfix="_${bundle_type}"
fi
pack_jbr jbr${jbr_name_postfix}
pack_jbr jbrsdk${jbr_name_postfix}
if [ -z "$bundle_type" ]; then
JBRSDK_TEST=$JBRSDK_BASE_NAME-windows-test-x64-b$build_number
echo Creating $JBRSDK_TEST.tar.gz ...
/usr/bin/tar -czf $JBRSDK_TEST.tar.gz -C $IMAGES_DIR --exclude='test/jdk/demos' test || do_exit $?
fi


@@ -1,42 +0,0 @@
#!/bin/bash -x
# The following parameters must be specified:
# JBSDK_VERSION - specifies the current version of OpenJDK e.g. 11_0_6
# JDK_BUILD_NUMBER - specifies the number of OpenJDK build or the value of --with-version-build argument to configure
# build_number - specifies the number of JetBrainsRuntime build
#
# jbrsdk-${JBSDK_VERSION}-osx-x64-b${build_number}.tar.gz
# jbr-${JBSDK_VERSION}-osx-x64-b${build_number}.tar.gz
#
# $ ./java --version
# openjdk 11.0.6 2020-01-14
# OpenJDK Runtime Environment (build 11.0.6+${JDK_BUILD_NUMBER}-b${build_number})
# OpenJDK 64-Bit Server VM (build 11.0.6+${JDK_BUILD_NUMBER}-b${build_number}, mixed mode)
#
JBSDK_VERSION=$1
JDK_BUILD_NUMBER=$2
build_number=$3
JBRSDK_BASE_NAME=jbrsdk-$JBSDK_VERSION
JBR_BASE_NAME=jbr-$JBSDK_VERSION
IMAGES_DIR=build/windows-x86-server-release/images
JSDK=$IMAGES_DIR/jdk
JBSDK=$JBRSDK_BASE_NAME-windows-x86-b$build_number
BASE_DIR=.
JBRSDK_BUNDLE=jbrsdk
echo Creating $JBSDK.tar.gz ...
/usr/bin/tar -czf $JBSDK.tar.gz $JBRSDK_BUNDLE || exit 1
JBR_BUNDLE=jbr
JBR_BASE_NAME=jbr-${JBSDK_VERSION}
JBR=$JBR_BASE_NAME-windows-x86-b$build_number
echo Creating $JBR.tar.gz ...
/usr/bin/tar -czf $JBR.tar.gz -C $BASE_DIR ${JBR_BUNDLE} || exit 1
JBRSDK_TEST=$JBRSDK_BASE_NAME-windows-test-x86-b$build_number
echo Creating $JBRSDK_TEST.tar.gz ...
/usr/bin/tar -czf $JBRSDK_TEST.tar.gz -C $IMAGES_DIR --exclude='test/jdk/demos' test || exit 1


@@ -242,6 +242,16 @@ ifneq ($(filter product-bundles% legacy-bundles, $(MAKECMDGOALS)), )
)
JDK_SYMBOLS_BUNDLE_FILES := \
$(filter \
$(JDK_SYMBOLS_EXCLUDE_PATTERN) \
$(SYMBOLS_EXCLUDE_PATTERN) \
, \
$(filter-out \
$(JDK_IMAGE_HOMEDIR)/demo/% %.stripped.pdb \
, \
$(ALL_JDK_SYMBOLS_FILES) \
) \
) \
$(call FindFiles, $(SYMBOLS_IMAGE_DIR))
TEST_DEMOS_BUNDLE_FILES := $(filter $(JDK_DEMOS_IMAGE_HOMEDIR)/demo/%, \
@@ -373,7 +383,7 @@ ifneq ($(filter product-bundles% legacy-bundles, $(MAKECMDGOALS)), )
$(eval $(call SetupBundleFile, BUILD_JDK_SYMBOLS_BUNDLE, \
BUNDLE_NAME := $(JDK_SYMBOLS_BUNDLE_NAME), \
FILES := $(JDK_SYMBOLS_BUNDLE_FILES), \
BASE_DIRS := $(SYMBOLS_IMAGE_DIR), \
BASE_DIRS := $(JDK_SYMBOLS_IMAGE_DIR) $(wildcard $(SYMBOLS_IMAGE_DIR)), \
SUBDIR := $(JDK_BUNDLE_SUBDIR), \
UNZIP_DEBUGINFO := true, \
))
@@ -410,43 +420,17 @@ endif
################################################################################
ifneq ($(filter docs-jdk-bundles, $(MAKECMDGOALS)), )
DOCS_JDK_BUNDLE_FILES := $(call FindFiles, $(DOCS_JDK_IMAGE_DIR))
ifneq ($(filter docs-bundles, $(MAKECMDGOALS)), )
DOCS_BUNDLE_FILES := $(call FindFiles, $(DOCS_IMAGE_DIR))
$(eval $(call SetupBundleFile, BUILD_DOCS_JDK_BUNDLE, \
BUNDLE_NAME := $(DOCS_JDK_BUNDLE_NAME), \
FILES := $(DOCS_JDK_BUNDLE_FILES), \
BASE_DIRS := $(DOCS_JDK_IMAGE_DIR), \
$(eval $(call SetupBundleFile, BUILD_DOCS_BUNDLE, \
BUNDLE_NAME := $(DOCS_BUNDLE_NAME), \
FILES := $(DOCS_BUNDLE_FILES), \
BASE_DIRS := $(DOCS_IMAGE_DIR), \
SUBDIR := docs, \
))
DOCS_JDK_TARGETS += $(BUILD_DOCS_JDK_BUNDLE)
endif
ifneq ($(filter docs-javase-bundles, $(MAKECMDGOALS)), )
DOCS_JAVASE_BUNDLE_FILES := $(call FindFiles, $(DOCS_JAVASE_IMAGE_DIR))
$(eval $(call SetupBundleFile, BUILD_DOCS_JAVASE_BUNDLE, \
BUNDLE_NAME := $(DOCS_JAVASE_BUNDLE_NAME), \
FILES := $(DOCS_JAVASE_BUNDLE_FILES), \
BASE_DIRS := $(DOCS_JAVASE_IMAGE_DIR), \
SUBDIR := docs-javase, \
))
DOCS_JAVASE_TARGETS += $(BUILD_DOCS_JAVASE_BUNDLE)
endif
ifneq ($(filter docs-reference-bundles, $(MAKECMDGOALS)), )
DOCS_REFERENCE_BUNDLE_FILES := $(call FindFiles, $(DOCS_REFERENCE_IMAGE_DIR))
$(eval $(call SetupBundleFile, BUILD_DOCS_REFERENCE_BUNDLE, \
BUNDLE_NAME := $(DOCS_REFERENCE_BUNDLE_NAME), \
FILES := $(DOCS_REFERENCE_BUNDLE_FILES), \
BASE_DIRS := $(DOCS_REFERENCE_IMAGE_DIR), \
SUBDIR := docs-reference, \
))
DOCS_REFERENCE_TARGETS += $(BUILD_DOCS_REFERENCE_BUNDLE)
DOCS_TARGETS += $(BUILD_DOCS_BUNDLE)
endif
################################################################################
@@ -495,12 +479,9 @@ $(eval $(call IncludeCustomExtension, Bundles.gmk))
product-bundles: $(PRODUCT_TARGETS)
legacy-bundles: $(LEGACY_TARGETS)
test-bundles: $(TEST_TARGETS)
docs-jdk-bundles: $(DOCS_JDK_TARGETS)
docs-javase-bundles: $(DOCS_JAVASE_TARGETS)
docs-reference-bundles: $(DOCS_REFERENCE_TARGETS)
docs-bundles: $(DOCS_TARGETS)
static-libs-bundles: $(STATIC_LIBS_TARGETS)
jcov-bundles: $(JCOV_TARGETS)
.PHONY: all default product-bundles test-bundles \
docs-jdk-bundles docs-javase-bundles docs-reference-bundles \
.PHONY: all default product-bundles test-bundles docs-bundles \
static-libs-bundles jcov-bundles


@@ -1,5 +1,5 @@
#
# Copyright (c) 2018, 2020, Oracle and/or its affiliates. All rights reserved.
# Copyright (c) 2018, Oracle and/or its affiliates. All rights reserved.
# DO NOT ALTER OR REMOVE COPYRIGHT NOTICES OR THIS FILE HEADER.
#
# This code is free software; you can redistribute it and/or modify it
@@ -45,6 +45,11 @@ $(OUTPUTDIR)/compile_commands.json: $(wildcard $(MAKESUPPORT_OUTPUTDIR)/compile-
$(RM) $@
$(FIND) $(MAKESUPPORT_OUTPUTDIR)/compile-commands/ -name \*.json | \
$(SORT) | $(XARGS) $(CAT) >> $@.tmp
$(if $(FIXPATH),$(FIXPATH) $(AWK) 'BEGIN { \
tmpfile = substr(ARGV[2],2); \
cmd = "$(CP) " "\047" tmpfile "\047" " $@.tmp"; \
system(cmd); \
}' -- @$@.tmp)
$(SED) -e '1s/^/[\$(NEWLINE)/' -e '$(DOLLAR)s/,\s\{0,\}$(DOLLAR)/\$(NEWLINE)]/' $@.tmp > $@
$(RM) $@.tmp


@@ -1,5 +1,5 @@
#
# Copyright (c) 2014, 2021, Oracle and/or its affiliates. All rights reserved.
# Copyright (c) 2014, 2020, Oracle and/or its affiliates. All rights reserved.
# DO NOT ALTER OR REMOVE COPYRIGHT NOTICES OR THIS FILE HEADER.
#
# This code is free software; you can redistribute it and/or modify it
@@ -49,9 +49,8 @@ TARGETS += $(patsubst %, $(BUILDTOOLS_OUTPUTDIR)/gensrc/%/module-info.java, \
$(INTERIM_LANGTOOLS_MODULES))
$(eval $(call SetupCopyFiles, COPY_PREVIEW_FEATURES, \
FILES := $(TOPDIR)/src/java.base/share/classes/jdk/internal/javac/PreviewFeature.java \
$(TOPDIR)/src/java.base/share/classes/jdk/internal/javac/NoPreview.java, \
DEST := $(BUILDTOOLS_OUTPUTDIR)/gensrc/java.base.interim/jdk/internal/javac/, \
FILES := $(TOPDIR)/src/java.base/share/classes/jdk/internal/PreviewFeature.java, \
DEST := $(BUILDTOOLS_OUTPUTDIR)/gensrc/java.base.interim/jdk/internal/, \
))
TARGETS += $(COPY_PREVIEW_FEATURES)
@@ -75,15 +74,15 @@ define SetupInterimModule
EXCLUDE_FILES := $(TOPDIR)/src/$1/share/classes/module-info.java \
Standard.java, \
EXTRA_FILES := $(BUILDTOOLS_OUTPUTDIR)/gensrc/$1.interim/module-info.java, \
COPY := .gif .png .xml .css .js .js.template .txt javax.tools.JavaCompilerTool, \
COPY := .gif .png .xml .css .js javax.tools.JavaCompilerTool, \
BIN := $(BUILDTOOLS_OUTPUTDIR)/interim_langtools_modules/$1.interim, \
DISABLED_WARNINGS := module options, \
JAVAC_FLAGS := \
--module-path $(BUILDTOOLS_OUTPUTDIR)/interim_langtools_modules \
$$(INTERIM_LANGTOOLS_ADD_EXPORTS) \
--patch-module java.base=$(BUILDTOOLS_OUTPUTDIR)/gensrc/java.base.interim \
--add-exports java.base/jdk.internal.javac=java.compiler.interim \
--add-exports java.base/jdk.internal.javac=jdk.compiler.interim, \
--add-exports java.base/jdk.internal=java.compiler.interim \
--add-exports java.base/jdk.internal=jdk.compiler.interim, \
))
$1_DEPS_INTERIM := $$(addsuffix .interim, $$(filter \


@@ -31,6 +31,507 @@ include MakeBase.gmk
include Modules.gmk
include JavaCompilation.gmk
# Hook to include the corresponding custom file, if present.
$(eval $(call IncludeCustomExtension, CompileJavaModules.gmk))
################################################################################
# Module specific build settings
java.base_DOCLINT += -Xdoclint:all/protected,-reference,-accessibility \
'-Xdoclint/package:java.*,javax.*'
java.base_JAVAC_FLAGS += -XDstringConcat=inline
java.base_COPY += .icu .dat .spp .nrm content-types.properties \
hijrah-config-Hijrah-umalqura_islamic-umalqura.properties
java.base_CLEAN += intrinsic.properties
java.base_EXCLUDE_FILES += \
$(TOPDIR)/src/java.base/share/classes/jdk/internal/module/ModuleLoaderMap.java
java.base_EXCLUDES += java/lang/doc-files
# Exclude BreakIterator classes that are just used in compile process to generate
# data files and shouldn't go in the product
java.base_EXCLUDE_FILES += sun/text/resources/BreakIteratorRules.java
ifeq ($(call isTargetOs, macosx aix), false)
java.base_EXCLUDE_FILES += sun/nio/fs/PollingWatchService.java
endif
ifeq ($(call isTargetOs, windows), true)
java.base_EXCLUDE_FILES += \
sun/nio/ch/SimpleAsynchronousFileChannelImpl.java \
#
endif
################################################################################
java.compiler_DOCLINT += -Xdoclint:all/protected \
'-Xdoclint/package:java.*,javax.*'
################################################################################
java.datatransfer_DOCLINT += -Xdoclint:all/protected,-reference \
'-Xdoclint/package:java.*,javax.*'
java.datatransfer_COPY += flavormap.properties
################################################################################
java.desktop_DOCLINT += -Xdoclint:all/protected,-reference \
'-Xdoclint/package:java.*,javax.*'
java.desktop_COPY += .gif .png .wav .txt .xml .css .pf
java.desktop_CLEAN += iio-plugin.properties cursors.properties
java.desktop_EXCLUDES += \
java/awt/doc-files \
javax/swing/doc-files \
javax/swing/text/doc-files \
javax/swing/plaf/synth/doc-files \
javax/swing/undo/doc-files \
sun/awt/X11/doc-files \
#
java.desktop_EXCLUDE_FILES += \
javax/swing/plaf/nimbus/InternalFrameTitlePanePainter.java \
javax/swing/plaf/nimbus/OptionPaneMessageAreaPainter.java \
javax/swing/plaf/nimbus/ScrollBarPainter.java \
javax/swing/plaf/nimbus/SliderPainter.java \
javax/swing/plaf/nimbus/SpinnerPainter.java \
javax/swing/plaf/nimbus/SplitPanePainter.java \
javax/swing/plaf/nimbus/TabbedPanePainter.java \
sun/awt/resources/security-icon-bw16.png \
sun/awt/resources/security-icon-bw24.png \
sun/awt/resources/security-icon-bw32.png \
sun/awt/resources/security-icon-bw48.png \
sun/awt/resources/security-icon-interim16.png \
sun/awt/resources/security-icon-interim24.png \
sun/awt/resources/security-icon-interim32.png \
sun/awt/resources/security-icon-interim48.png \
sun/awt/resources/security-icon-yellow16.png \
sun/awt/resources/security-icon-yellow24.png \
sun/awt/resources/security-icon-yellow32.png \
sun/awt/resources/security-icon-yellow48.png \
sun/awt/X11/java-icon16.png \
sun/awt/X11/java-icon24.png \
sun/awt/X11/java-icon32.png \
sun/awt/X11/java-icon48.png \
.template \
#
ifeq ($(call isTargetOs, macosx), true)
# exclude all X11 on Mac.
java.desktop_EXCLUDES += \
sun/awt/X11 \
sun/java2d/x11 \
sun/java2d/jules \
sun/java2d/xr \
com/sun/java/swing/plaf/gtk \
#
java.desktop_EXCLUDE_FILES += \
$(wildcard $(TOPDIR)/src/java.desktop/unix/classes/sun/java2d/*.java) \
$(wildcard $(TOPDIR)/src/java.desktop/unix/classes/sun/java2d/opengl/*.java) \
$(wildcard $(TOPDIR)/src/java.desktop/unix/classes/sun/awt/*.java) \
$(wildcard $(TOPDIR)/src/java.desktop/unix/classes/sun/font/*.java) \
#
else
# TBD: figure out how to eliminate this long list
java.desktop_EXCLUDE_FILES += \
sun/awt/X11/ScreenFormat.java \
sun/awt/X11/XArc.java \
sun/awt/X11/XChar2b.java \
sun/awt/X11/XCharStruct.java \
sun/awt/X11/XClassHint.java \
sun/awt/X11/XComposeStatus.java \
sun/awt/X11/XExtCodes.java \
sun/awt/X11/XFontProp.java \
sun/awt/X11/XFontSetExtents.java \
sun/awt/X11/XFontStruct.java \
sun/awt/X11/XGCValues.java \
sun/awt/X11/XHostAddress.java \
sun/awt/X11/XIMCallback.java \
sun/awt/X11/XIMHotKeyTrigger.java \
sun/awt/X11/XIMHotKeyTriggers.java \
sun/awt/X11/XIMPreeditCaretCallbackStruct.java \
sun/awt/X11/XIMPreeditDrawCallbackStruct.java \
sun/awt/X11/XIMPreeditStateNotifyCallbackStruct.java \
sun/awt/X11/XIMStatusDrawCallbackStruct.java \
sun/awt/X11/XIMStringConversionCallbackStruct.java \
sun/awt/X11/XIMStringConversionText.java \
sun/awt/X11/XIMStyles.java \
sun/awt/X11/XIMText.java \
sun/awt/X11/XIMValuesList.java \
sun/awt/X11/XImage.java \
sun/awt/X11/XKeyboardControl.java \
sun/awt/X11/XKeyboardState.java \
sun/awt/X11/XOMCharSetList.java \
sun/awt/X11/XOMFontInfo.java \
sun/awt/X11/XOMOrientation.java \
sun/awt/X11/XPoint.java \
sun/awt/X11/XRectangle.java \
sun/awt/X11/XSegment.java \
sun/awt/X11/XStandardColormap.java \
sun/awt/X11/XTextItem.java \
sun/awt/X11/XTextItem16.java \
sun/awt/X11/XTextProperty.java \
sun/awt/X11/XTimeCoord.java \
sun/awt/X11/XWindowChanges.java \
sun/awt/X11/XdbeSwapInfo.java \
sun/awt/X11/XmbTextItem.java \
sun/awt/X11/XwcTextItem.java
endif
ifeq ($(call isTargetOs, windows), true)
java.desktop_EXCLUDES += com/sun/java/swing/plaf/gtk
endif
ifdef BUILD_HEADLESS_ONLY
java.desktop_EXCLUDES += sun/applet
endif
ifeq ($(call isTargetOs, windows macosx), false)
java.desktop_EXCLUDE_FILES += sun/awt/AWTCharset.java
endif
# These files do not appear in the build result of the old build. This
# is because they are generated sources, but the AUTO_JAVA_FILES won't
# pick them up since they aren't generated when the source dirs are
# searched and they aren't referenced by any other classes so they won't
# be picked up by implicit compilation. On a rebuild, they are picked up
# and compiled. Exclude them here to produce the same rt.jar as the old
# build does when building just once.
java.desktop_EXCLUDE_FILES += \
javax/swing/plaf/nimbus/InternalFrameTitlePanePainter.java \
javax/swing/plaf/nimbus/OptionPaneMessageAreaPainter.java \
javax/swing/plaf/nimbus/ScrollBarPainter.java \
javax/swing/plaf/nimbus/SliderPainter.java \
javax/swing/plaf/nimbus/SpinnerPainter.java \
javax/swing/plaf/nimbus/SplitPanePainter.java \
javax/swing/plaf/nimbus/TabbedPanePainter.java \
#
################################################################################
java.scripting_DOCLINT += -Xdoclint:all/protected \
'-Xdoclint/package:java.*,javax.*'
java.scripting_COPY += .js
java.scripting_CLEAN += .properties
################################################################################
java.instrument_DOCLINT += -Xdoclint:all/protected,-accessibility \
'-Xdoclint/package:java.*,javax.*'
################################################################################
java.logging_DOCLINT += -Xdoclint:all/protected,-reference,-accessibility \
'-Xdoclint/package:java.*,javax.*'
################################################################################
java.management_DOCLINT += -Xdoclint:all/protected,-reference,-accessibility \
'-Xdoclint/package:java.*,javax.*'
################################################################################
java.management.rmi_DOCLINT += -Xdoclint:all/protected \
'-Xdoclint/package:javax.*'
################################################################################
java.prefs_DOCLINT += -Xdoclint:all/protected \
'-Xdoclint/package:java.*,javax.*'
################################################################################
java.transaction.xa_DOCLINT += -Xdoclint:all/protected \
'-Xdoclint/package:javax.*'
################################################################################
java.sql_DOCLINT += -Xdoclint:all/protected \
'-Xdoclint/package:java.*,javax.*'
################################################################################
java.sql.rowset_DOCLINT += -Xdoclint:all/protected,-accessibility \
'-Xdoclint/package:java.*,javax.*'
java.sql.rowset_CLEAN_FILES += $(wildcard \
$(TOPDIR)/src/java.sql.rowset/share/classes/com/sun/rowset/*.properties \
$(TOPDIR)/src/java.sql.rowset/share/classes/javax/sql/rowset/*.properties)
################################################################################
java.rmi_DOCLINT += -Xdoclint:all/protected \
'-Xdoclint/package:java.*,javax.*'
java.rmi_CLEAN_FILES += $(wildcard \
$(TOPDIR)/src/java.rmi/share/classes/sun/rmi/registry/resources/*.properties \
$(TOPDIR)/src/java.rmi/share/classes/sun/rmi/server/resources/*.properties)
################################################################################
java.xml_DOCLINT += -Xdoclint:all/protected,-accessibility \
'-Xdoclint/package:$(call CommaList, javax.xml.catalog javax.xml.datatype \
javax.xml.transform javax.xml.validation javax.xml.xpath)'
java.xml_CLEAN += .properties
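The doclint package list for java.xml above is assembled with the CommaList helper, whose definition (in MakeBase.gmk) is not part of this diff. As a rough sketch only, a helper in that spirit could be written like this (MyCommaList is a made-up name, not the real definition):

  EMPTY :=
  SPACE := $(EMPTY) $(EMPTY)
  COMMA := ,
  # Join the words of a single-space-separated list with commas,
  # e.g. $(call MyCommaList, a b c) -> a,b,c
  MyCommaList = $(subst $(SPACE),$(COMMA),$(strip $1))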
################################################################################
java.naming_DOCLINT += -Xdoclint:all/protected,-accessibility \
'-Xdoclint/package:java.*,javax.*'
java.naming_CLEAN += jndiprovider.properties
################################################################################
java.security.jgss_DOCLINT += -Xdoclint:all/protected \
'-Xdoclint/package:java.*,javax.*'
################################################################################
java.smartcardio_DOCLINT += -Xdoclint:all/protected,-accessibility \
'-Xdoclint/package:java.*,javax.*'
################################################################################
java.xml.crypto_DOCLINT += -Xdoclint:all/protected \
'-Xdoclint/package:java.*,javax.*'
java.xml.crypto_COPY += .dtd .xml
java.xml.crypto_CLEAN += .properties
################################################################################
jdk.charsets_COPY += .dat
################################################################################
################################################################################
jdk.compiler_DOCLINT += -Xdoclint:all/protected \
'-Xdoclint/package:-com.sun.tools.*,-jdk.internal.*,sun.tools.serialver.resources.*'
jdk.compiler_JAVAC_FLAGS += -XDstringConcat=inline
jdk.compiler_CLEAN_FILES += $(wildcard \
$(patsubst %, $(TOPDIR)/src/jdk.compiler/share/classes/%/*.properties, \
sun/tools/serialver/resources))
################################################################################
jdk.hotspot.agent_DISABLED_WARNINGS += rawtypes serial cast static overrides \
fallthrough
jdk.hotspot.agent_COPY += .gif .png .properties
################################################################################
jdk.editpad_COPY += .properties
################################################################################
jdk.jshell_COPY += .jsh .properties
################################################################################
jdk.internal.le_COPY += .properties .caps .txt
################################################################################
jdk.internal.opt_COPY += .properties
################################################################################
jdk.jcmd_COPY += _options
################################################################################
jdk.dynalink_CLEAN += .properties
################################################################################
jdk.javadoc_COPY += .xml .css .js .png
################################################################################
jdk.jartool_JAVAC_FLAGS += -XDstringConcat=inline
################################################################################
# No SCTP implementation on Mac OS X or AIX. These classes should be excluded.
SCTP_IMPL_CLASSES = \
$(TOPDIR)/src/jdk.sctp/unix/classes/sun/nio/ch/sctp/AssociationChange.java \
$(TOPDIR)/src/jdk.sctp/unix/classes/sun/nio/ch/sctp/AssociationImpl.java \
$(TOPDIR)/src/jdk.sctp/unix/classes/sun/nio/ch/sctp/PeerAddrChange.java \
$(TOPDIR)/src/jdk.sctp/unix/classes/sun/nio/ch/sctp/ResultContainer.java \
$(TOPDIR)/src/jdk.sctp/unix/classes/sun/nio/ch/sctp/SctpChannelImpl.java \
$(TOPDIR)/src/jdk.sctp/unix/classes/sun/nio/ch/sctp/SctpMultiChannelImpl.java \
$(TOPDIR)/src/jdk.sctp/unix/classes/sun/nio/ch/sctp/SctpNet.java \
$(TOPDIR)/src/jdk.sctp/unix/classes/sun/nio/ch/sctp/SctpNotification.java \
$(TOPDIR)/src/jdk.sctp/unix/classes/sun/nio/ch/sctp/SctpServerChannelImpl.java \
$(TOPDIR)/src/jdk.sctp/unix/classes/sun/nio/ch/sctp/SendFailed.java \
$(TOPDIR)/src/jdk.sctp/unix/classes/sun/nio/ch/sctp/Shutdown.java
ifeq ($(call isTargetOs, macosx), true)
jdk.sctp_EXCLUDE_FILES += $(SCTP_IMPL_CLASSES)
endif
ifeq ($(call isTargetOs, aix), true)
jdk.sctp_EXCLUDE_FILES += $(SCTP_IMPL_CLASSES)
endif
################################################################################
jdk.incubator.jpackage_COPY += .gif .png .txt .spec .script .prerm .preinst \
.postrm .postinst .list .sh .desktop .copyright .control .plist .template \
.icns .scpt .wxs .wxl .wxi .ico .bmp .tiff
jdk.incubator.jpackage_CLEAN += .properties
################################################################################
jdk.jconsole_COPY += .gif .png
jdk.jconsole_CLEAN_FILES += $(wildcard \
$(TOPDIR)/src/jdk.jconsole/share/classes/sun/tools/jconsole/resources/*.properties)
################################################################################
jdk.jdeps_COPY += .txt
jdk.jdeps_CLEAN_FILES += $(wildcard \
$(TOPDIR)/src/jdk.jdeps/share/classes/com/sun/tools/jdeps/resources/*.properties \
$(TOPDIR)/src/jdk.jdeps/share/classes/com/sun/tools/javap/resources/*.properties)
################################################################################
jdk.jdi_EXCLUDES += \
com/sun/tools/example/debug/bdi \
com/sun/tools/example/debug/event \
com/sun/tools/example/debug/gui \
com/sun/jdi/doc-files \
#
jdk.jdi_EXCLUDE_FILES += jdi-overview.html
################################################################################
jdk.dev_CLEAN_FILES += $(wildcard \
$(patsubst %, $(TOPDIR)/src/jdk.dev/share/classes/%/*.properties, \
com/sun/tools/script/shell))
jdk.dev_COPY += .js oqlhelp.html .txt
################################################################################
jdk.internal.jvmstat_COPY += aliasmap
################################################################################
# -parameters provides method parameter information in the class file;
# JVMCI compilers make use of that information for various sanity checks.
# Don't use Indy string concatenation, to keep JVMCI startup performance good.
# The exports are needed since JVMCI is dynamically exported (see
# jdk.vm.ci.services.internal.ReflectionAccessJDK::openJVMCITo).
jdk.internal.vm.ci_JAVAC_FLAGS += -parameters -XDstringConcat=inline
################################################################################
jdk.internal.vm.compiler_JAVAC_FLAGS += -parameters -XDstringConcat=inline \
--add-exports jdk.internal.vm.ci/jdk.vm.ci.aarch64=jdk.internal.vm.compiler \
--add-exports jdk.internal.vm.ci/jdk.vm.ci.amd64=jdk.internal.vm.compiler \
--add-exports jdk.internal.vm.ci/jdk.vm.ci.code=jdk.internal.vm.compiler \
--add-exports jdk.internal.vm.ci/jdk.vm.ci.code.site=jdk.internal.vm.compiler \
--add-exports jdk.internal.vm.ci/jdk.vm.ci.code.stack=jdk.internal.vm.compiler \
--add-exports jdk.internal.vm.ci/jdk.vm.ci.common=jdk.internal.vm.compiler \
--add-exports jdk.internal.vm.ci/jdk.vm.ci.hotspot=jdk.internal.vm.compiler \
--add-exports jdk.internal.vm.ci/jdk.vm.ci.hotspot.aarch64=jdk.internal.vm.compiler \
--add-exports jdk.internal.vm.ci/jdk.vm.ci.hotspot.amd64=jdk.internal.vm.compiler \
--add-exports jdk.internal.vm.ci/jdk.vm.ci.meta=jdk.internal.vm.compiler \
--add-exports jdk.internal.vm.ci/jdk.vm.ci.runtime=jdk.internal.vm.compiler \
#
jdk.internal.vm.compiler_EXCLUDES += \
jdk.internal.vm.compiler.collections.test \
jdk.tools.jaotc.test \
org.graalvm.compiler.api.directives.test \
org.graalvm.compiler.api.test \
org.graalvm.compiler.asm.aarch64.test \
org.graalvm.compiler.asm.amd64.test \
org.graalvm.compiler.asm.test \
org.graalvm.compiler.core.aarch64.test \
org.graalvm.compiler.core.amd64.test \
org.graalvm.compiler.core.jdk9.test \
org.graalvm.compiler.core.match.processor \
org.graalvm.compiler.core.test \
org.graalvm.compiler.debug.test \
org.graalvm.compiler.graph.test \
org.graalvm.compiler.hotspot.aarch64.test \
org.graalvm.compiler.hotspot.amd64.test \
org.graalvm.compiler.hotspot.jdk15.test \
org.graalvm.compiler.hotspot.jdk9.test \
org.graalvm.compiler.hotspot.lir.test \
org.graalvm.compiler.hotspot.test \
org.graalvm.compiler.jtt \
org.graalvm.compiler.lir.jtt \
org.graalvm.compiler.lir.test \
org.graalvm.compiler.loop.test \
org.graalvm.compiler.microbenchmarks \
org.graalvm.compiler.nodeinfo.processor \
org.graalvm.compiler.nodes.test \
org.graalvm.compiler.options.processor \
org.graalvm.compiler.options.test \
org.graalvm.compiler.phases.common.test \
org.graalvm.compiler.processor \
org.graalvm.compiler.replacements.jdk10.test \
org.graalvm.compiler.replacements.jdk12.test \
org.graalvm.compiler.replacements.jdk9.test \
org.graalvm.compiler.replacements.processor \
org.graalvm.compiler.replacements.test \
org.graalvm.compiler.serviceprovider.processor \
org.graalvm.compiler.test \
org.graalvm.compiler.virtual.bench \
org.graalvm.micro.benchmarks \
org.graalvm.util.test \
#
################################################################################
# -parameters provides method parameter information in the class file;
# JVMCI compilers make use of that information for various sanity checks.
# Don't use Indy string concatenation, to keep JAOTC startup performance good.
# The exports are needed since JVMCI is dynamically exported (see
# jdk.vm.ci.services.internal.ReflectionAccessJDK::openJVMCITo).
jdk.aot_JAVAC_FLAGS += -parameters -XDstringConcat=inline \
--add-exports jdk.internal.vm.ci/jdk.vm.ci.aarch64=jdk.internal.vm.compiler,jdk.aot \
--add-exports jdk.internal.vm.ci/jdk.vm.ci.amd64=jdk.internal.vm.compiler,jdk.aot \
--add-exports jdk.internal.vm.ci/jdk.vm.ci.code=jdk.internal.vm.compiler,jdk.aot \
--add-exports jdk.internal.vm.ci/jdk.vm.ci.code.site=jdk.internal.vm.compiler,jdk.aot \
--add-exports jdk.internal.vm.ci/jdk.vm.ci.code.stack=jdk.internal.vm.compiler,jdk.aot \
--add-exports jdk.internal.vm.ci/jdk.vm.ci.common=jdk.internal.vm.compiler,jdk.aot \
--add-exports jdk.internal.vm.ci/jdk.vm.ci.hotspot=jdk.internal.vm.compiler,jdk.aot \
--add-exports jdk.internal.vm.ci/jdk.vm.ci.hotspot.aarch64=jdk.internal.vm.compiler,jdk.aot \
--add-exports jdk.internal.vm.ci/jdk.vm.ci.hotspot.amd64=jdk.internal.vm.compiler,jdk.aot \
--add-exports jdk.internal.vm.ci/jdk.vm.ci.meta=jdk.internal.vm.compiler,jdk.aot \
--add-exports jdk.internal.vm.ci/jdk.vm.ci.runtime=jdk.internal.vm.compiler,jdk.aot \
#
jdk.aot_EXCLUDES += \
jdk.tools.jaotc.test
#
################################################################################
sun.charsets_COPY += .dat
################################################################################
jdk.localedata_COPY += _dict _th
# Exclude BreakIterator classes that are only used in the compile process to
# generate data files and shouldn't go into the product.
jdk.localedata_EXCLUDE_FILES += sun/text/resources/ext/BreakIteratorRules_th.java
jdk.localedata_KEEP_ALL_TRANSLATIONS := true
################################################################################
jdk.jfr_DISABLED_WARNINGS += exports
jdk.jfr_COPY := .xsd .xml .dtd
jdk.jfr_JAVAC_FLAGS := -XDstringConcat=inline
################################################################################
# If this is an imported module that has prebuilt classes, only compile
# module-info.java.
@@ -67,8 +568,60 @@ MODULESOURCEPATH := $(call GetModuleSrcPath)
# Add imported modules to the modulepath
MODULEPATH := $(call PathList, $(IMPORT_MODULES_CLASSES))
ifeq ($(MODULE), jdk.internal.vm.ci)
## WORKAROUND jdk.internal.vm.ci source structure issue
JVMCI_MODULESOURCEPATH := $(MODULESOURCEPATH) \
$(subst /$(MODULE)/,/*/, $(filter-out %processor/src, \
$(wildcard $(TOPDIR)/src/$(MODULE)/share/classes/*/src)))
MODULESOURCEPATH := $(call PathList, $(JVMCI_MODULESOURCEPATH))
endif
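A note on the workaround above (and the two analogous blocks for jdk.internal.vm.compiler and jdk.aot that follow): the wildcard picks up each subproject's src directory, processor directories are filtered out, and the module-name path component is generalized to *, presumably so that javac's --module-source-path can substitute the module name at that position. With a hypothetical path:

  #   $(TOPDIR)/src/jdk.internal.vm.ci/share/classes/jdk.vm.ci.code/src
  # becomes
  #   $(TOPDIR)/src/*/share/classes/jdk.vm.ci.code/src
  # in the resulting MODULESOURCEPATH.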
ifeq ($(MODULE), jdk.internal.vm.compiler)
## WORKAROUND jdk.internal.vm.compiler source structure issue
VM_COMPILER_MODULESOURCEPATH := $(MODULESOURCEPATH) \
$(subst /$(MODULE)/,/*/, $(filter-out %processor/src %test/src %jtt/src %bench/src %microbenchmarks/src, \
$(wildcard $(TOPDIR)/src/$(MODULE)/share/classes/*/src)))
MODULESOURCEPATH := $(call PathList, $(VM_COMPILER_MODULESOURCEPATH))
endif
ifeq ($(MODULE), jdk.aot)
## WORKAROUND jdk.aot source structure issue
AOT_MODULESOURCEPATH := $(MODULESOURCEPATH) \
$(subst /$(MODULE)/,/*/, $(filter-out %processor/src, \
$(wildcard $(TOPDIR)/src/$(MODULE)/share/classes/*/src)))
MODULESOURCEPATH := $(call PathList, $(AOT_MODULESOURCEPATH))
endif
$(eval $(call SetupJavaCompilation, $(MODULE), \
SMALL_JAVA := false, \
MODULE := $(MODULE), \
SRC := $(wildcard $(MODULE_SRC_DIRS)), \
INCLUDES := $(JDK_USER_DEFINED_FILTER), \
FAIL_NO_SRC := $(FAIL_NO_SRC), \
BIN := $(if $($(MODULE)_BIN), $($(MODULE)_BIN), $(JDK_OUTPUTDIR)/modules), \
HEADERS := $(SUPPORT_OUTPUTDIR)/headers, \
CREATE_API_DIGEST := true, \
JAVAC_FLAGS := \
$($(MODULE)_DOCLINT) \
$($(MODULE)_JAVAC_FLAGS) \
--module-source-path $(MODULESOURCEPATH) \
--module-path $(MODULEPATH) \
--system none, \
))
TARGETS += $($(MODULE)) $($(MODULE)_COPY_EXTRA)
# Declare dependencies between java compilations of different modules.
# Since the other modules are declared in different invocations of this file,
# use the macro to find the correct target file to depend on.
# Only the javac compilation actually depends on other modules so limit
# dependency declaration to that by using the *_COMPILE_TARGET variable.
$($(MODULE)_COMPILE_TARGET): $(foreach d, $(call FindDepsForModule, $(MODULE)), \
$(call SetupJavaCompilationApiTarget, $d, \
$(if $($d_BIN), $($d_BIN), $(JDK_OUTPUTDIR)/modules/$d)))
################################################################################
# Copy zh_HK properties files from zh_TW (needed by some modules)
# Copy zh_HK properties files from zh_TW
$(JDK_OUTPUTDIR)/modules/%_zh_HK.properties: $(JDK_OUTPUTDIR)/modules/%_zh_TW.properties
$(install-file)
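The pattern rule above copies an already-built *_zh_TW.properties file in the modules output directory to its *_zh_HK.properties sibling. The CreateHkTargets helper that produces those target paths is only partially visible in the next hunk; as an illustration only (ExampleHkTargets is a made-up name, not the real definition), a helper with that job might look roughly like:

  # Map source-tree *_zh_TW.properties paths to the corresponding
  # *_zh_HK.properties targets in the modules output directory, which the
  # pattern rule above then satisfies via install-file.
  ExampleHkTargets = \
      $(patsubst $(TOPDIR)/src/%,$(JDK_OUTPUTDIR)/modules/%, \
        $(subst /share/classes,, \
          $(subst _zh_TW,_zh_HK,$(filter %_zh_TW.properties, $1))))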
@@ -83,48 +636,13 @@ CreateHkTargets = \
.properties \
)
################################################################################
# Include module specific build settings
ifeq ($(MODULE), java.sql.rowset)
TARGETS += $(call CreateHkTargets, $(java.sql.rowset_CLEAN_FILES))
endif
-include Java.gmk
################################################################################
# Setup the main compilation
$(eval $(call SetupJavaCompilation, $(MODULE), \
SMALL_JAVA := false, \
MODULE := $(MODULE), \
SRC := $(wildcard $(MODULE_SRC_DIRS)), \
INCLUDES := $(JDK_USER_DEFINED_FILTER), \
FAIL_NO_SRC := $(FAIL_NO_SRC), \
BIN := $(if $($(MODULE)_BIN), $($(MODULE)_BIN), $(JDK_OUTPUTDIR)/modules), \
HEADERS := $(SUPPORT_OUTPUTDIR)/headers, \
CREATE_API_DIGEST := true, \
CLEAN := $(CLEAN), \
CLEAN_FILES := $(CLEAN_FILES), \
COPY := $(COPY), \
DISABLED_WARNINGS := $(DISABLED_WARNINGS_java), \
EXCLUDES := $(EXCLUDES), \
EXCLUDE_FILES := $(EXCLUDE_FILES), \
KEEP_ALL_TRANSLATIONS := $(KEEP_ALL_TRANSLATIONS), \
JAVAC_FLAGS := \
$(DOCLINT) \
$(JAVAC_FLAGS) \
--module-source-path $(MODULESOURCEPATH) \
--module-path $(MODULEPATH) \
--system none, \
))
TARGETS += $($(MODULE))
# Declare dependencies between java compilations of different modules.
# Since the other modules are declared in different invocations of this file,
# use the macro to find the correct target file to depend on.
# Only the javac compilation actually depends on other modules so limit
# dependency declaration to that by using the *_COMPILE_TARGET variable.
$($(MODULE)_COMPILE_TARGET): $(foreach d, $(call FindDepsForModule, $(MODULE)), \
$(call SetupJavaCompilationApiTarget, $d, \
$(if $($d_BIN), $($d_BIN), $(JDK_OUTPUTDIR)/modules/$d)))
ifeq ($(MODULE), java.rmi)
TARGETS += $(call CreateHkTargets, $(java.rmi_CLEAN_FILES))
endif
################################################################################
# If this is an imported module, copy the pre built classes and resources into
@@ -148,6 +666,10 @@ endif
################################################################################
$(eval $(call IncludeCustomExtension, CompileJavaModules-post.gmk))
################################################################################
all: $(TARGETS)
.PHONY: all


@@ -33,20 +33,8 @@ include JavaCompilation.gmk
TOOLS_CLASSES_DIR := $(BUILDTOOLS_OUTPUTDIR)/tools_jigsaw_classes
# When using an external BUILDJDK, make it possible to shortcut building of
# these tools using the BUILD_JAVAC instead of having to build the complete
# exploded image first.
ifeq ($(EXTERNAL_BUILDJDK), true)
COMPILER := buildjdk
TARGET_RELEASE := $(TARGET_RELEASE_NEWJDK)
else
COMPILER := interim
TARGET_RELEASE := $(TARGET_RELEASE_NEWJDK_UPGRADED)
endif
$(eval $(call SetupJavaCompilation, BUILD_JIGSAW_TOOLS, \
TARGET_RELEASE := $(TARGET_RELEASE), \
COMPILER := $(COMPILER), \
TARGET_RELEASE := $(TARGET_RELEASE_NEWJDK_UPGRADED), \
SRC := $(TOPDIR)/make/jdk/src/classes, \
INCLUDES := build/tools/deps \
build/tools/docs \


@@ -52,6 +52,94 @@ $(eval $(call SetupJavaCompilation, BUILD_TOOLS_HOTSPOT, \
TARGETS += $(BUILD_TOOLS_HOTSPOT)
################################################################################
# Graal build tools
ifeq ($(INCLUDE_GRAAL), true)
VM_CI_SRC_DIR := $(TOPDIR)/src/jdk.internal.vm.ci/share/classes
SRC_DIR := $(TOPDIR)/src/jdk.internal.vm.compiler/share/classes
##############################################################################
# Compile the annotation processors
$(eval $(call SetupJavaCompilation, BUILD_VM_COMPILER_MATCH_PROCESSOR, \
TARGET_RELEASE := $(TARGET_RELEASE_BOOTJDK), \
SRC := \
$(SRC_DIR)/org.graalvm.compiler.processor/src \
$(SRC_DIR)/org.graalvm.compiler.core.match.processor/src \
, \
EXCLUDE_FILES := $(EXCLUDE_FILES), \
BIN := $(BUILDTOOLS_OUTPUTDIR)/jdk.vm.compiler.match.processor, \
JAR := $(BUILDTOOLS_OUTPUTDIR)/jdk.vm.compiler.match.processor.jar, \
DISABLED_WARNINGS := options, \
))
TARGETS += $(BUILD_VM_COMPILER_MATCH_PROCESSOR)
##############################################################################
$(eval $(call SetupJavaCompilation, BUILD_VM_COMPILER_NODEINFO_PROCESSOR, \
TARGET_RELEASE := $(TARGET_RELEASE_BOOTJDK), \
SRC := \
$(SRC_DIR)/org.graalvm.compiler.processor/src \
$(SRC_DIR)/org.graalvm.compiler.nodeinfo.processor/src \
, \
BIN := $(BUILDTOOLS_OUTPUTDIR)/jdk.vm.compiler.nodeinfo.processor, \
JAR := $(BUILDTOOLS_OUTPUTDIR)/jdk.vm.compiler.nodeinfo.processor.jar, \
DISABLED_WARNINGS := options, \
))
TARGETS += $(BUILD_VM_COMPILER_NODEINFO_PROCESSOR)
##############################################################################
$(eval $(call SetupJavaCompilation, BUILD_VM_COMPILER_OPTIONS_PROCESSOR, \
TARGET_RELEASE := $(TARGET_RELEASE_BOOTJDK), \
DISABLED_WARNINGS := options, \
SRC := \
$(SRC_DIR)/org.graalvm.compiler.processor/src \
$(SRC_DIR)/org.graalvm.compiler.options.processor/src \
, \
BIN := $(BUILDTOOLS_OUTPUTDIR)/jdk.vm.compiler.options.processor, \
JAR := $(BUILDTOOLS_OUTPUTDIR)/jdk.vm.compiler.options.processor.jar, \
))
TARGETS += $(BUILD_VM_COMPILER_OPTIONS_PROCESSOR)
##############################################################################
$(eval $(call SetupJavaCompilation, BUILD_VM_COMPILER_REPLACEMENTS_PROCESSOR, \
TARGET_RELEASE := $(TARGET_RELEASE_BOOTJDK), \
SRC := \
$(SRC_DIR)/org.graalvm.compiler.processor/src \
$(SRC_DIR)/org.graalvm.compiler.replacements.processor/src \
, \
EXCLUDE_FILES := $(EXCLUDE_FILES), \
BIN := $(BUILDTOOLS_OUTPUTDIR)/jdk.vm.compiler.replacements.verifier, \
JAR := $(BUILDTOOLS_OUTPUTDIR)/jdk.vm.compiler.replacements.verifier.jar, \
DISABLED_WARNINGS := options, \
))
TARGETS += $(BUILD_VM_COMPILER_REPLACEMENTS_PROCESSOR)
##############################################################################
$(eval $(call SetupJavaCompilation, BUILD_VM_COMPILER_SERVICEPROVIDER_PROCESSOR, \
TARGET_RELEASE := $(TARGET_RELEASE_BOOTJDK), \
SRC := \
$(SRC_DIR)/org.graalvm.compiler.processor/src \
$(SRC_DIR)/org.graalvm.compiler.serviceprovider.processor/src \
, \
EXCLUDE_FILES := $(EXCLUDE_FILES), \
BIN := $(BUILDTOOLS_OUTPUTDIR)/jdk.vm.compiler.serviceprovider.processor, \
JAR := $(BUILDTOOLS_OUTPUTDIR)/jdk.vm.compiler.serviceprovider.processor.jar, \
DISABLED_WARNINGS := options, \
))
TARGETS += $(BUILD_VM_COMPILER_SERVICEPROVIDER_PROCESSOR)
##############################################################################
endif
all: $(TARGETS)
.PHONY: all


@@ -56,8 +56,7 @@ $(eval $(call SetupJavaCompilation, BUILD_TOOLS_JDK, \
DISABLED_WARNINGS := options, \
JAVAC_FLAGS := \
--add-exports java.desktop/sun.awt=ALL-UNNAMED \
--add-exports java.base/sun.text=ALL-UNNAMED \
--add-exports java.base/sun.security.util=ALL-UNNAMED, \
--add-exports java.base/sun.text=ALL-UNNAMED, \
))
TARGETS += $(BUILD_TOOLS_JDK)


@@ -213,12 +213,12 @@ endif
ifeq ($(call isTargetOs, windows), true)
ifeq ($(SHIP_DEBUG_SYMBOLS), )
JMOD_FLAGS += --exclude '**{_the.*,_*.marker*,*.diz,*.pdb,*.map}'
JMOD_FLAGS += --exclude '**{_the.*,_*.marker,*.diz,*.pdb,*.map}'
else
JMOD_FLAGS += --exclude '**{_the.*,_*.marker*,*.diz,*.map}'
JMOD_FLAGS += --exclude '**{_the.*,_*.marker,*.diz,*.map}'
endif
else
JMOD_FLAGS += --exclude '**{_the.*,_*.marker*,*.diz,*.debuginfo,*.dSYM/**,*.dSYM}'
JMOD_FLAGS += --exclude '**{_the.*,_*.marker,*.diz,*.debuginfo,*.dSYM/**,*.dSYM}'
endif
# Create jmods in the support dir and then move them into place to keep the


@@ -1,4 +1,4 @@
# Copyright (c) 1997, 2021, Oracle and/or its affiliates. All rights reserved.
# Copyright (c) 1997, 2020, Oracle and/or its affiliates. All rights reserved.
# DO NOT ALTER OR REMOVE COPYRIGHT NOTICES OR THIS FILE HEADER.
#
# This code is free software; you can redistribute it and/or modify it
@@ -57,11 +57,14 @@ $(eval $(call IncludeCustomExtension, Docs.gmk))
################################################################################
# Javadoc settings
# Include configuration for URLs in generated javadoc
include $(TOPDIR)/make/conf/javadoc.conf
MODULES_SOURCE_PATH := $(call PathList, $(call GetModuleSrcPath) )
# URLs
JAVADOC_BASE_URL := https://docs.oracle.com/pls/topic/lookup?ctx=javase$(VERSION_NUMBER)&amp;id=homepage
BUG_SUBMIT_URL := https://bugreport.java.com/bugreport/
COPYRIGHT_URL := legal/copyright.html
LICENSE_URL := https://www.oracle.com/java/javase/terms/license/java$(VERSION_NUMBER)speclicense.html
REDISTRIBUTION_URL := https://www.oracle.com/technetwork/java/redist-137594.html
# In order to get a specific ordering, it's necessary to specify the total
# ordering of tags, as tags are otherwise emitted in order of definition.
@@ -89,6 +92,7 @@ JAVADOC_TAGS := \
-tag see \
-taglet build.tools.taglet.ExtLink \
-taglet build.tools.taglet.Incubating \
-taglet build.tools.taglet.Preview \
-tagletpath $(BUILDTOOLS_OUTPUTDIR)/jdk_tools_classes \
$(CUSTOM_JAVADOC_TAGS) \
#
@@ -99,8 +103,7 @@ JAVADOC_TAGS := \
REFERENCE_TAGS := $(JAVADOC_TAGS)
# Which doclint checks to ignore
JAVADOC_DISABLED_DOCLINT_WARNINGS := missing
JAVADOC_DISABLED_DOCLINT_PACKAGES := org.w3c.* javax.smartcardio
JAVADOC_DISABLED_DOCLINT := accessibility html missing syntax reference
# The initial set of options for javadoc
JAVADOC_OPTIONS := -use -keywords -notimestamp \
@@ -154,14 +157,13 @@ COPYRIGHT_BOTTOM = \
<a href="$(REDISTRIBUTION_URL)">documentation redistribution policy</a>. \
$(DRAFT_MARKER_STR) <!-- Version $(VERSION_STRING) -->
# $1 - Optional "Other Versions" link
JAVADOC_BOTTOM = \
JAVADOC_BOTTOM := \
<a href="$(BUG_SUBMIT_URL)">Report a bug or suggest an enhancement</a><br> \
For further API reference and developer documentation see the \
<a href="$(JAVADOC_BASE_URL)" target="_blank">Java SE \
Documentation</a>, which contains more detailed, \
developer-targeted descriptions with conceptual overviews, definitions \
of terms, workarounds, and working code examples. $1<br> \
of terms, workarounds, and working code examples.<br> \
Java is a trademark or registered trademark of $(FULL_COMPANY_NAME) in \
the US and other countries.<br> \
$(call COPYRIGHT_BOTTOM, {@docroot}/../)
@@ -263,7 +265,6 @@ endef
# SHORT_NAME - The short name of this documentation collection
# LONG_NAME - The long name of this documentation collection
# TARGET_DIR - Where to store the output
# OTHER_VERSIONS - URL for other page listing versions
#
SetupApiDocsGeneration = $(NamedParamsMacroTemplate)
define SetupApiDocsGenerationBody
@@ -296,26 +297,18 @@ define SetupApiDocsGenerationBody
# Create a string like "-Xdoclint:all,-syntax,-html,..."
$1_OPTIONS += -Xdoclint:all,$$(call CommaList, $$(addprefix -, \
$$(JAVADOC_DISABLED_DOCLINT_WARNINGS)))
# Ignore the doclint warnings in certain packages
$1_OPTIONS += -Xdoclint/package:$$(call CommaList, $$(addprefix -, \
$$(JAVADOC_DISABLED_DOCLINT_PACKAGES)))
$$(JAVADOC_DISABLED_DOCLINT)))
$1_DOC_TITLE := $$($1_LONG_NAME)<br>Version $$(VERSION_SPECIFICATION) API \
Specification
$1_WINDOW_TITLE := $$(subst &amp;,&,$$($1_SHORT_NAME))$$(DRAFT_MARKER_TITLE)
$1_HEADER_TITLE := <div $$(HEADER_STYLE)><strong>$$($1_SHORT_NAME)</strong> \
$$(DRAFT_MARKER_STR)</div>
ifneq ($$($1_OTHER_VERSIONS), )
$1_JAVADOC_BOTTOM := $$(call JAVADOC_BOTTOM, <a href="$$($1_OTHER_VERSIONS)">Other versions.</a>)
else
$1_JAVADOC_BOTTOM := $$(call JAVADOC_BOTTOM, )
endif
$1_OPTIONS += -doctitle '$$($1_DOC_TITLE)'
$1_OPTIONS += -windowtitle '$$($1_WINDOW_TITLE)'
$1_OPTIONS += -header '$$($1_HEADER_TITLE)'
$1_OPTIONS += -bottom '$$($1_JAVADOC_BOTTOM)'
$1_OPTIONS += -bottom '$$(JAVADOC_BOTTOM)'
ifeq ($$(IS_DRAFT), true)
$1_OPTIONS += -top '$$(JAVADOC_TOP)'
endif
@@ -332,12 +325,6 @@ define SetupApiDocsGenerationBody
$$(eval $$(call create_overview_file,$1))
$1_OPTIONS += -overview $$($1_OVERVIEW)
# Add summary pages for new/deprecated APIs in recent releases
$1_OPTIONS += --since $(call CommaList, \
$(filter-out $(VERSION_DOCS_API_SINCE), \
$(call sequence, $(VERSION_DOCS_API_SINCE), $(VERSION_FEATURE))))
$1_OPTIONS += --since-label "New API since JDK $(VERSION_DOCS_API_SINCE)"
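The --since value above leans on the sequence and CommaList helpers from MakeBase.gmk, which are outside this diff. With made-up version values the expansion would go roughly as follows:

  # Assuming VERSION_DOCS_API_SINCE = 11 and VERSION_FEATURE = 15:
  #   $(call sequence, 11, 15)           ->  11 12 13 14 15
  #   $(filter-out 11, 11 12 13 14 15)   ->  12 13 14 15
  #   $(call CommaList, 12 13 14 15)     ->  12,13,14,15
  # so javadoc ends up being invoked with
  #   --since 12,13,14,15 --since-label "New API since JDK 11"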
$$(foreach g, $$($1_GROUPS), \
$$(eval $1_OPTIONS += -group "$$($$g_GROUP_NAME)" "$$($$g_GROUP_MODULES)") \
)
@@ -453,7 +440,6 @@ $(eval $(call SetupApiDocsGeneration, JDK_API, \
SHORT_NAME := $(JDK_SHORT_NAME), \
LONG_NAME := $(JDK_LONG_NAME), \
TARGET_DIR := $(DOCS_OUTPUTDIR)/api, \
OTHER_VERSIONS := $(OTHER_JDK_VERSIONS_URL), \
))
# Targets generated are returned in JDK_API_JAVADOC_TARGETS and
@@ -470,7 +456,7 @@ $(eval $(call SetupApiDocsGeneration, JAVASE_API, \
MODULES := $(JAVASE_MODULES), \
SHORT_NAME := $(JAVASE_SHORT_NAME), \
LONG_NAME := $(JAVASE_LONG_NAME), \
TARGET_DIR := $(DOCS_JAVASE_IMAGE_DIR)/api, \
TARGET_DIR := $(IMAGES_OUTPUTDIR)/javase-docs/api, \
))
# Targets generated are returned in JAVASE_API_JAVADOC_TARGETS and
@@ -488,8 +474,8 @@ $(eval $(call SetupApiDocsGeneration, REFERENCE_API, \
MODULES := $(JAVASE_MODULES), \
SHORT_NAME := $(JAVASE_SHORT_NAME), \
LONG_NAME := $(JAVASE_LONG_NAME), \
TARGET_DIR := $(DOCS_REFERENCE_IMAGE_DIR)/api, \
JAVADOC_CMD := $(DOCS_REFERENCE_JAVADOC), \
TARGET_DIR := $(IMAGES_OUTPUTDIR)/reference-docs/api, \
JAVADOC_CMD := $(JAVADOC), \
OPTIONS := $(REFERENCE_OPTIONS), \
TAGS := $(REFERENCE_TAGS), \
))


@@ -1,5 +1,5 @@
#
# Copyright (c) 2016, 2021, Oracle and/or its affiliates. All rights reserved.
# Copyright (c) 2016, 2020, Oracle and/or its affiliates. All rights reserved.
# DO NOT ALTER OR REMOVE COPYRIGHT NOTICES OR THIS FILE HEADER.
#
# This code is free software; you can redistribute it and/or modify it
@@ -61,7 +61,7 @@ endif
# Save the stderr output of the command and print it along with stdout in case
# something goes wrong.
$(CLASSLIST_FILE): $(INTERIM_IMAGE_DIR)/bin/java$(EXECUTABLE_SUFFIX) $(CLASSLIST_JAR)
$(CLASSLIST_FILE): $(INTERIM_IMAGE_DIR)/bin/java$(EXE_SUFFIX) $(CLASSLIST_JAR)
$(call MakeDir, $(LINK_OPT_DIR))
$(call LogInfo, Generating $(patsubst $(OUTPUTDIR)/%, %, $@))
$(call LogInfo, Generating $(patsubst $(OUTPUTDIR)/%, %, $(JLI_TRACE_FILE)))
@@ -88,16 +88,13 @@ $(CLASSLIST_FILE): $(INTERIM_IMAGE_DIR)/bin/java$(EXECUTABLE_SUFFIX) $(CLASSLIST
$(CAT) $(LINK_OPT_DIR)/stderr $(JLI_TRACE_FILE) ; \
exit $$exitcode \
)
$(GREP) -v HelloClasslist $@.raw.2 > $@.raw.3
$(FIXPATH) $(INTERIM_IMAGE_DIR)/bin/java \
-cp $(SUPPORT_OUTPUTDIR)/classlist.jar \
build.tools.classlist.SortClasslist $@.raw.3 > $@
$(GREP) -v HelloClasslist $@.raw.2 > $@
# The jli trace is created by the same recipe as classlist. By declaring these
# dependencies, make will correctly rebuild both jli trace and classlist
# incrementally using the single recipe above.
$(CLASSLIST_FILE): $(JLI_TRACE_FILE)
$(JLI_TRACE_FILE): $(INTERIM_IMAGE_DIR)/bin/java$(EXECUTABLE_SUFFIX) $(CLASSLIST_JAR)
$(JLI_TRACE_FILE): $(INTERIM_IMAGE_DIR)/bin/java$(EXE_SUFFIX) $(CLASSLIST_JAR)
TARGETS += $(CLASSLIST_FILE) $(JLI_TRACE_FILE)


@@ -44,7 +44,7 @@ ALL_MODULES := $(call FindAllModules)
$(eval $(call ReadImportMetaData))
JRE_MODULES += $(filter $(ALL_MODULES), $(BOOT_MODULES) \
$(PLATFORM_MODULES) jdk.jdwp.agent)
$(PLATFORM_MODULES) $(JRE_TOOL_MODULES))
JDK_MODULES += $(ALL_MODULES)
JRE_MODULES_LIST := $(call CommaList, $(JRE_MODULES))
@@ -238,7 +238,6 @@ endif
ALL_JDK_MODULES := $(JDK_MODULES)
ALL_JRE_MODULES := $(sort $(JRE_MODULES), $(foreach m, $(JRE_MODULES), \
$(call FindTransitiveDepsForModule, $m)))
ALL_SYMBOLS_MODULES := $(JDK_MODULES)
ifeq ($(call isTargetOs, windows), true)
LIBS_TARGET_SUBDIR := bin
@@ -294,7 +293,6 @@ SetupCopyDebuginfo = \
# implementation above.
$(call SetupCopyDebuginfo,JDK)
$(call SetupCopyDebuginfo,JRE)
$(call SetupCopyDebuginfo,SYMBOLS)
################################################################################


@@ -200,7 +200,7 @@ ifeq ($(HAS_SPEC),)
COMPARE_BUILD="$(COMPARE_BUILD):NODRYRUN=true" main && \
$(MAKE) $(MFLAGS) $(MAKE_LOG_FLAGS) -r -R -f $(topdir)/make/Init.gmk \
SPEC=$(spec) HAS_SPEC=true ACTUAL_TOPDIR=$(topdir) \
COMPARE_BUILD="$(COMPARE_BUILD):NODRYRUN=true" post-compare-build && \
COMPARE_BUILD="$(COMPARE_BUILD)" post-compare-build && \
) \
) true ) \
$(eval TARGET_DONE=true) \


@@ -1,5 +1,5 @@
#
# Copyright (c) 2016, 2020, Oracle and/or its affiliates. All rights reserved.
# Copyright (c) 2016, 2018, Oracle and/or its affiliates. All rights reserved.
# DO NOT ALTER OR REMOVE COPYRIGHT NOTICES OR THIS FILE HEADER.
#
# This code is free software; you can redistribute it and/or modify it
@@ -32,7 +32,7 @@ include Modules.gmk
################################################################################
# Use this file inside the image as target for make rule
JIMAGE_TARGET_FILE := bin/java$(EXECUTABLE_SUFFIX)
JIMAGE_TARGET_FILE := bin/java$(EXE_SUFFIX)
INTERIM_MODULES_LIST := $(call CommaList, $(INTERIM_IMAGE_MODULES))


@@ -1,20 +0,0 @@
include Makefile
include make/MainSupport.gmk
.PHONY: jbr-api
ifeq ($(SPEC),)
ifneq ($(words $(SPECS)),1)
@echo "Error: Multiple build specification files found. Please select one explicitly."
@exit 2
endif
jbr-api:
@cd $(topdir)
@$(MAKE) $(MFLAGS) $(MAKE_LOG_FLAGS) -r -R -j 1 -f $(topdir)/make/JBRApi.gmk SPEC=$(SPECS) HAS_SPEC=true ACTUAL_TOPDIR=$(topdir) MODULES="$(MODULES)" jbr-api
else #with SPEC
jbr-api:
$(ECHO) "BUILD_DIR=$(OUTPUTDIR)" > $(OUT)
$(ECHO) "BOOT_JDK=\"$(BOOT_JDK)\"" >> $(OUT)
endif


@@ -38,8 +38,11 @@ ifeq ($(call isTargetOs, macosx), true)
MACOSX_PLIST_SRC := $(TOPDIR)/make/data/bundle
BUNDLE_ID := $(MACOSX_BUNDLE_ID_BASE).$(VERSION_SHORT)
BUNDLE_NAME := $(MACOSX_BUNDLE_NAME_BASE) $(VERSION_SHORT)
BUNDLE_INFO := $(MACOSX_BUNDLE_NAME_BASE) $(VERSION_STRING)
BUNDLE_PLATFORM_VERSION := $(VERSION_FEATURE).$(VERSION_INTERIM)
BUNDLE_VERSION := $(VERSION_NUMBER)
ifeq ($(COMPANY_NAME), N/A)
BUNDLE_VENDOR := UNDEFINED
else
@@ -72,26 +75,24 @@ ifeq ($(call isTargetOs, macosx), true)
SOURCE_FILES := $(MACOSX_PLIST_SRC)/JDK-Info.plist, \
OUTPUT_FILE := $(JDK_MACOSX_CONTENTS_DIR)/Info.plist, \
REPLACEMENTS := \
@@ID@@ => $(MACOSX_BUNDLE_ID_BASE).jdk ; \
@@ID@@ => $(BUNDLE_ID).jdk ; \
@@NAME@@ => $(BUNDLE_NAME) ; \
@@INFO@@ => $(BUNDLE_INFO) ; \
@@VERSION@@ => $(VERSION_NUMBER) ; \
@@BUILD_VERSION@@ => $(MACOSX_BUNDLE_BUILD_VERSION) ; \
@@VENDOR@@ => $(BUNDLE_VENDOR) ; \
@@MACOSX_VERSION_MIN@@ => $(MACOSX_VERSION_MIN) , \
@@PLATFORM_VERSION@@ => $(BUNDLE_PLATFORM_VERSION) ; \
@@VERSION@@ => $(BUNDLE_VERSION) ; \
@@VENDOR@@ => $(BUNDLE_VENDOR) , \
))
$(eval $(call SetupTextFileProcessing, BUILD_JRE_PLIST, \
SOURCE_FILES := $(MACOSX_PLIST_SRC)/JRE-Info.plist, \
OUTPUT_FILE := $(JRE_MACOSX_CONTENTS_DIR)/Info.plist, \
REPLACEMENTS := \
@@ID@@ => $(MACOSX_BUNDLE_ID_BASE).jre ; \
@@ID@@ => $(BUNDLE_ID).jre ; \
@@NAME@@ => $(BUNDLE_NAME) ; \
@@INFO@@ => $(BUNDLE_INFO) ; \
@@VERSION@@ => $(VERSION_NUMBER) ; \
@@BUILD_VERSION@@ => $(BUNDLE_BUILD_VERSION) ; \
@@VENDOR@@ => $(BUNDLE_VENDOR) ; \
@@MACOSX_VERSION_MIN@@ => $(MACOSX_VERSION_MIN) , \
@@PLATFORM_VERSION@@ => $(BUNDLE_PLATFORM_VERSION) ; \
@@VERSION@@ => $(BUNDLE_VERSION) ; \
@@VENDOR@@ => $(BUNDLE_VENDOR) , \
))
$(SUPPORT_OUTPUTDIR)/images/_jdk_bundle_attribute_set: $(COPY_JDK_IMAGE)


@@ -1,5 +1,5 @@
#
# Copyright (c) 2011, 2021, Oracle and/or its affiliates. All rights reserved.
# Copyright (c) 2011, 2020, Oracle and/or its affiliates. All rights reserved.
# DO NOT ALTER OR REMOVE COPYRIGHT NOTICES OR THIS FILE HEADER.
#
# This code is free software; you can redistribute it and/or modify it
@@ -90,10 +90,12 @@ $(eval $(call SetupTarget, buildtools-jdk, \
$(eval $(call SetupTarget, buildtools-modules, \
MAKEFILE := CompileModuleTools, \
DEPS := exploded-image-base, \
))
$(eval $(call SetupTarget, buildtools-hotspot, \
MAKEFILE := CompileToolsHotspot, \
DEPS := interim-langtools, \
))
################################################################################
@@ -110,6 +112,7 @@ $(eval $(call SetupTarget, generate-exported-symbols, \
$(eval $(call DeclareRecipesForPhase, GENSRC, \
TARGET_SUFFIX := gensrc-src, \
FILE_PREFIX := Gensrc, \
MAKE_SUBDIR := gensrc, \
CHECK_MODULES := $(ALL_MODULES), \
))
@@ -147,6 +150,7 @@ ALL_TARGETS += $(GENSRC_TARGETS)
$(eval $(call DeclareRecipesForPhase, GENDATA, \
TARGET_SUFFIX := gendata, \
FILE_PREFIX := Gendata, \
MAKE_SUBDIR := gendata, \
CHECK_MODULES := $(ALL_MODULES), \
))
@@ -157,6 +161,7 @@ ALL_TARGETS += $(GENDATA_TARGETS)
$(eval $(call DeclareRecipesForPhase, COPY, \
TARGET_SUFFIX := copy, \
FILE_PREFIX := Copy, \
MAKE_SUBDIR := copy, \
CHECK_MODULES := $(ALL_MODULES), \
))
@@ -186,7 +191,6 @@ JAVA_TARGETS := $(addsuffix -java, $(JAVA_MODULES))
define DeclareCompileJavaRecipe
$1-java:
+($(CD) $(TOPDIR)/make && $(MAKE) $(MAKE_ARGS) \
$(patsubst %,-I%/modules/$1,$(PHASE_MAKEDIRS)) \
-f CompileJavaModules.gmk MODULE=$1)
endef
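Expanding the template above for a concrete module makes the generated recipe easier to read; for example, with $1 = java.base (a mechanical, illustrative substitution):

  # java.base-java:
  #     +($(CD) $(TOPDIR)/make && $(MAKE) $(MAKE_ARGS) \
  #         $(patsubst %,-I%/modules/java.base,$(PHASE_MAKEDIRS)) \
  #         -f CompileJavaModules.gmk MODULE=java.base)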
@@ -199,6 +203,7 @@ ALL_TARGETS += $(JAVA_TARGETS)
$(eval $(call DeclareRecipesForPhase, LIBS, \
TARGET_SUFFIX := libs, \
FILE_PREFIX := Lib, \
MAKE_SUBDIR := lib, \
CHECK_MODULES := $(ALL_MODULES), \
))
@@ -211,6 +216,7 @@ ALL_TARGETS += $(LIBS_TARGETS)
$(eval $(call DeclareRecipesForPhase, STATIC_LIBS, \
TARGET_SUFFIX := static-libs, \
FILE_PREFIX := Lib, \
MAKE_SUBDIR := lib, \
CHECK_MODULES := $(ALL_MODULES), \
EXTRA_ARGS := STATIC_LIBS=true, \
))
@@ -222,6 +228,7 @@ ALL_TARGETS += $(STATIC_LIBS_TARGETS)
$(eval $(call DeclareRecipesForPhase, LAUNCHER, \
TARGET_SUFFIX := launchers, \
FILE_PREFIX := Launcher, \
MAKE_SUBDIR := launcher, \
CHECK_MODULES := $(ALL_MODULES), \
))
@@ -338,7 +345,7 @@ $(eval $(call SetupTarget, test-image-demos-jdk, \
$(eval $(call SetupTarget, generate-summary, \
MAKEFILE := GenerateModuleSummary, \
DEPS := jmods buildtools-modules runnable-buildjdk, \
DEPS := jmods buildtools-modules, \
))
################################################################################
@@ -468,7 +475,7 @@ $(eval $(call SetupTarget, docs-jdk-api-javadoc, \
$(eval $(call SetupTarget, docs-jdk-api-modulegraph, \
MAKEFILE := Docs, \
TARGET := docs-jdk-api-modulegraph, \
DEPS := buildtools-modules runnable-buildjdk, \
DEPS := exploded-image buildtools-modules, \
))
$(eval $(call SetupTarget, docs-javase-api-javadoc, \
@@ -479,7 +486,7 @@ $(eval $(call SetupTarget, docs-javase-api-javadoc, \
$(eval $(call SetupTarget, docs-javase-api-modulegraph, \
MAKEFILE := Docs, \
TARGET := docs-javase-api-modulegraph, \
DEPS := buildtools-modules runnable-buildjdk, \
DEPS := exploded-image buildtools-modules, \
))
$(eval $(call SetupTarget, docs-reference-api-javadoc, \
@@ -490,7 +497,7 @@ $(eval $(call SetupTarget, docs-reference-api-javadoc, \
$(eval $(call SetupTarget, docs-reference-api-modulegraph, \
MAKEFILE := Docs, \
TARGET := docs-reference-api-modulegraph, \
DEPS := buildtools-modules runnable-buildjdk, \
DEPS := exploded-image buildtools-modules, \
))
# The gensrc steps for jdk.jdi create html spec files.
@@ -659,6 +666,18 @@ $(eval $(call SetupTarget, test-image-libtest-jtreg-native, \
DEPS := build-test-libtest-jtreg-native, \
))
$(eval $(call SetupTarget, build-test-hotspot-jtreg-graal, \
MAKEFILE := test/JtregGraalUnit, \
TARGET := build-test-hotspot-jtreg-graal, \
DEPS := exploded-image, \
))
$(eval $(call SetupTarget, test-image-hotspot-jtreg-graal, \
MAKEFILE := test/JtregGraalUnit, \
TARGET := test-image-hotspot-jtreg-graal, \
DEPS := build-test-hotspot-jtreg-graal, \
))
ifneq ($(GTEST_FRAMEWORK_SRC), )
$(eval $(call SetupTarget, test-image-hotspot-gtest, \
MAKEFILE := hotspot/test/GtestImage, \
@@ -736,24 +755,12 @@ $(eval $(call SetupTarget, test-bundles, \
DEPS := test-image, \
))
$(eval $(call SetupTarget, docs-jdk-bundles, \
$(eval $(call SetupTarget, docs-bundles, \
MAKEFILE := Bundles, \
TARGET := docs-jdk-bundles, \
TARGET := docs-bundles, \
DEPS := docs-image, \
))
$(eval $(call SetupTarget, docs-javase-bundles, \
MAKEFILE := Bundles, \
TARGET := docs-javase-bundles, \
DEPS := docs-javase-image, \
))
$(eval $(call SetupTarget, docs-reference-bundles, \
MAKEFILE := Bundles, \
TARGET := docs-reference-bundles, \
DEPS := docs-reference-image, \
))
$(eval $(call SetupTarget, static-libs-bundles, \
MAKEFILE := Bundles, \
TARGET := static-libs-bundles, \
@@ -857,14 +864,23 @@ else
# virtual target.
jdk.jdwp.agent-libs: jdk.jdwp.agent-gensrc
# jdk.jfr-gendata uses TOOL_JFR_GEN from buildtools-hotspot
jdk.jfr-gendata: buildtools-hotspot
# The swing beans need to have java base properly generated to avoid errors
# in javadoc. The X11 wrappers need the java.base include files to have been
# copied and processed.
java.desktop-gensrc-src: java.base-gensrc java.base-copy
# The annotation processing for jdk.internal.vm.compiler
# and jdk.internal.vm.compiler.management needs classes from the current JDK.
jdk.internal.vm.compiler-gensrc-src: $(addsuffix -java, \
$(call FindTransitiveDepsForModule, jdk.internal.vm.compiler))
jdk.internal.vm.compiler.management-gensrc-src: $(addsuffix -java, \
$(call FindTransitiveDepsForModule, jdk.internal.vm.compiler.management))
# For these modules, the gensrc step is generating a module-info.java.extra
# file to be processed by the gensrc-moduleinfo target.
jdk.internal.vm.compiler-gensrc-moduleinfo: jdk.internal.vm.compiler-gensrc-src
jdk.internal.vm.compiler.management-gensrc-moduleinfo: jdk.internal.vm.compiler.management-gensrc-src
jdk.jdeps-gendata: java
# The ct.sym generation uses all the moduleinfos as input
@@ -935,13 +951,10 @@ else
$(JMOD_TARGETS) $(INTERIM_JMOD_TARGETS): java.base-libs java.base-copy \
java.base-gendata jdk.jlink-launchers java
endif
else ifeq ($(EXTERNAL_BUILDJDK), false)
# The normal non cross compilation usecase needs to wait for the full
else
# The normal non cross compilation case needs to wait for the full
# exploded-image to avoid a race with the optimize target.
$(JMOD_TARGETS) $(INTERIM_JMOD_TARGETS): exploded-image
# The buildtools-modules are used for the exploded-image-optimize target,
# but can be built either using the exploded-image or an external BUILDJDK.
buildtools-modules: exploded-image-base
endif
# All modules include the main license files from java.base.
@@ -1062,18 +1075,6 @@ ifneq ($(COMPILE_TYPE), cross)
exploded-image: exploded-image-optimize
endif
# The runnable-buildjdk target guarantees that the buildjdk is done
# building and ready to be used. The exact set of dependencies it needs
# depends on what kind of buildjdk is used for the current configuration.
runnable-buildjdk:
ifeq ($(CREATE_BUILDJDK), true)
ifneq ($(CREATING_BUILDJDK), true)
runnable-buildjdk: create-buildjdk
endif
else ifeq ($(EXTERNAL_BUILDJDK), false)
runnable-buildjdk: exploded-image
endif
create-buildjdk: create-buildjdk-interim-image
docs-jdk-api: docs-jdk-api-javadoc
@@ -1127,16 +1128,8 @@ ifeq ($(call isTargetOs, macosx), true)
legacy-images: mac-legacy-jre-bundle
endif
# These targets build the various documentation images
docs-jdk-image: docs-jdk
docs-javase-image: docs-javase
docs-reference-image: docs-reference
# The docs-jdk-image is what most users expect to be built
docs-image: docs-jdk-image
all-docs-images: docs-jdk-image docs-javase-image docs-reference-image
docs-bundles: docs-jdk-bundles
all-docs-bundles: docs-jdk-bundles docs-javase-bundles docs-reference-bundles
# This target builds the documentation image
docs-image: docs-jdk
# This target builds the test image
test-image: prepare-test-image test-image-jdk-jtreg-native \
@@ -1152,6 +1145,10 @@ else
ifneq ($(GTEST_FRAMEWORK_SRC), )
test-image: test-image-hotspot-gtest
endif
ifeq ($(INCLUDE_GRAAL), true)
test-image: test-image-hotspot-jtreg-graal
endif
endif
ifeq ($(BUILD_FAILURE_HANDLER), true)
@@ -1165,7 +1162,7 @@ endif
################################################################################
# all-images builds all our deliverables as images.
all-images: product-images test-image all-docs-images
all-images: product-images test-image docs-image
# all-bundles packages all our deliverables as tar.gz bundles.
all-bundles: product-bundles test-bundles docs-bundles static-libs-bundles
@@ -1173,11 +1170,10 @@ all-bundles: product-bundles test-bundles docs-bundles static-libs-bundles
ALL_TARGETS += buildtools hotspot hotspot-libs hotspot-gensrc gensrc gendata \
copy java libs static-libs launchers jmods \
jdk.jdwp.agent-gensrc $(ALL_MODULES) demos \
exploded-image-base exploded-image runnable-buildjdk \
exploded-image-base exploded-image \
create-buildjdk docs-jdk-api docs-javase-api docs-reference-api docs-jdk \
docs-javase docs-reference docs-javadoc mac-bundles product-images legacy-images \
docs-image docs-javase-image docs-reference-image all-docs-images \
docs-bundles all-docs-bundles test-image all-images \
docs-image test-image all-images \
all-bundles
################################################################################


@@ -150,7 +150,9 @@ define DeclareRecipeForModuleMakefile
$2-$$($1_TARGET_SUFFIX):
+($(CD) $(TOPDIR)/make && $(MAKE) $(MAKE_ARGS) \
-f ModuleWrapper.gmk -I $$(TOPDIR)/make/common/modules \
$$(patsubst %,-I%/modules/$2,$$(PHASE_MAKEDIRS)) \
$$(addprefix -I, $$(PHASE_MAKEDIRS) \
$$(addsuffix /modules/$2, $$(PHASE_MAKEDIRS)) \
) \
MODULE=$2 MAKEFILE_PREFIX=$$($1_FILE_PREFIX) $$($1_EXTRA_ARGS))
endef
@@ -183,6 +185,7 @@ endef
# Param 1: Name of list to add targets to
# Named params:
# TARGET_SUFFIX : Suffix of target to create for recipe
# MAKE_SUBDIR : Subdir for this build phase
# FILE_PREFIX : File prefix for this build phase
# CHECK_MODULES : List of modules to try
# MULTIPLE_MAKEFILES : Set to true to handle makefiles for the same module and


@@ -39,7 +39,7 @@ include MakeBase.gmk
TARGETS :=
# Include the file being wrapped.
include $(MAKEFILE_PREFIX).gmk
include modules/$(MODULE)/$(MAKEFILE_PREFIX).gmk
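Both include forms above are resolved per module and phase; with made-up values this is easier to see:

  # Illustrative values: MODULE = java.base, MAKEFILE_PREFIX = Lib
  #   include Lib.gmk
  #   include modules/java.base/Lib.gmk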
# Setup copy rules from the modules directories to the jdk image directory.
ifeq ($(call isTargetOs, windows), true)


@@ -53,7 +53,6 @@ define create-info-file
$(call info-file-item, "JAVA_VERSION_DATE", "$(VERSION_DATE)")
$(call info-file-item, "OS_NAME", "$(RELEASE_FILE_OS_NAME)")
$(call info-file-item, "OS_ARCH", "$(RELEASE_FILE_OS_ARCH)")
$(call info-file-item, "LIBC", "$(RELEASE_FILE_LIBC)")
endef
# Param 1 - The file containing the MODULES list


@@ -1,5 +1,5 @@
#
# Copyright (c) 2016, 2021, Oracle and/or its affiliates. All rights reserved.
# Copyright (c) 2016, 2020, Oracle and/or its affiliates. All rights reserved.
# DO NOT ALTER OR REMOVE COPYRIGHT NOTICES OR THIS FILE HEADER.
#
# This code is free software; you can redistribute it and/or modify it
@@ -46,7 +46,7 @@ endif
$(eval $(call ParseKeywordVariable, TEST_OPTS, \
SINGLE_KEYWORDS := JOBS TIMEOUT_FACTOR JCOV JCOV_DIFF_CHANGESET, \
STRING_KEYWORDS := VM_OPTIONS JAVA_OPTIONS, \
STRING_KEYWORDS := VM_OPTIONS JAVA_OPTIONS AOT_MODULES, \
))
# Helper function to propagate TEST_OPTS values.
@@ -60,14 +60,19 @@ define SetTestOpt
endif
endef
# Setup _NT_SYMBOL_PATH on Windows, which points to our pdb files.
# Setup _NT_SYMBOL_PATH on Windows
ifeq ($(call isTargetOs, windows), true)
ifndef _NT_SYMBOL_PATH
SYMBOL_PATH := $(call PathList, $(sort $(patsubst %/, %, $(dir $(wildcard \
$(addprefix $(SYMBOLS_IMAGE_DIR)/bin/, *.pdb */*.pdb))))))
export _NT_SYMBOL_PATH := $(subst \\,\, $(call FixPath, \
$(subst $(DQUOTE),, $(SYMBOL_PATH))))
$(call LogDebug, Setting _NT_SYMBOL_PATH to $(_NT_SYMBOL_PATH))
# Can't use PathList here as it adds quotes around the value.
_NT_SYMBOL_PATH := \
$(subst $(SPACE),;,$(strip \
$(foreach p, $(sort $(dir $(wildcard \
$(addprefix $(SYMBOLS_IMAGE_DIR)/bin/, *.pdb */*.pdb)))), \
$(call FixPath, $p) \
) \
))
export _NT_SYMBOL_PATH
$(call LogDebug, Rewriting _NT_SYMBOL_PATH to $(_NT_SYMBOL_PATH))
endif
endif
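To illustrate the comment above about PathList (its definition is not part of this diff), here is what the hand-rolled join produces, with made-up paths:

  # With pdb files under $(SYMBOLS_IMAGE_DIR)/bin/ and .../bin/server/:
  #   _NT_SYMBOL_PATH := c:/symbols/bin/;c:/symbols/bin/server/
  # A PathList-built value would instead carry literal quotes around it,
  #   "c:/symbols/bin/;c:/symbols/bin/server/"
  # which would then end up verbatim in the exported environment variable.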
@@ -134,6 +139,92 @@ ifeq ($(GCOV_ENABLED), true)
JTREG_COV_OPTIONS += -e:GCOV_PREFIX="$(GCOV_OUTPUT_DIR)"
endif
################################################################################
# Optionally create AOT libraries for specified modules before running tests.
# Note that this cannot be done at JDK build time.
################################################################################
# Parameter 1 is the name of the rule.
#
# Remaining parameters are named arguments.
# MODULE The module to generate a library for
# BIN Output directory in which to put the library
# VM_OPTIONS List of JVM arguments to use when creating library
# OPTIONS_VAR Name of variable to put AOT java options in
# PREREQS_VAR Name of variable to put all AOT prerequisite rule targets in
# for test rules to depend on
#
SetupAotModule = $(NamedParamsMacroTemplate)
define SetupAotModuleBody
$1_AOT_LIB := $$($1_BIN)/$$(call SHARED_LIBRARY,$$($1_MODULE))
$1_AOT_CCLIST := $$(wildcard $$(TOPDIR)/test/hotspot/jtreg/compiler/aot/scripts/$$($1_MODULE)-list.txt)
# Create jaotc flags.
# VM flags which don't affect AOT code generation are filtered out:
# -Xcomp, -XX:+-TieredCompilation
$1_JAOTC_OPTS := \
-J-Xmx4g --info \
$$(addprefix -J, $$(filter-out -Xcomp %TieredCompilation, $$($1_VM_OPTIONS))) \
$$(addprefix --compile-commands$(SPACE), $$($1_AOT_CCLIST)) \
--linker-path $$(LD_JAOTC) \
#
ifneq ($$(filter -ea, $$($1_VM_OPTIONS)), )
$1_JAOTC_OPTS += --compile-with-assertions
endif
$$($1_AOT_LIB): $$(JDK_UNDER_TEST)/release \
$$(call DependOnVariable, $1_JAOTC_OPTS) \
$$(call DependOnVariable, JDK_UNDER_TEST)
$$(call LogWarn, Generating $$(patsubst $$(OUTPUTDIR)/%, %, $$@))
$$(call MakeTargetDir)
$$(call ExecuteWithLog, $$@, \
$$(COV_ENVIRONMENT) \
$$(FIXPATH) $$(JDK_UNDER_TEST)/bin/jaotc \
$$($1_JAOTC_OPTS) --output $$@ --module $$($1_MODULE) \
)
$$(call ExecuteWithLog, $$@.check, ( \
$$(FIXPATH) $$(JDK_UNDER_TEST)/bin/java \
$$($1_VM_OPTIONS) -XX:+UnlockDiagnosticVMOptions -XX:+UnlockExperimentalVMOptions \
-XX:+PrintAOT -XX:+UseAOTStrictLoading \
-XX:AOTLibrary=$$@ -version \
> $$@.verify-aot \
))
$1_AOT_OPTIONS += -XX:+UnlockExperimentalVMOptions
$1_AOT_OPTIONS += -XX:AOTLibrary=$$($1_AOT_LIB)
$1_AOT_TARGETS += $$($1_AOT_LIB)
endef
################################################################################
# Optionally create AOT libraries before running tests.
# Note that this cannot be done at JDK build time.
################################################################################
# Parameter 1 is the name of the rule.
#
# Remaining parameters are named arguments.
# MODULES The modules to generate a library for
# VM_OPTIONS List of JVM arguments to use when creating libraries
#
# After calling this, the following variables are defined
# $1_AOT_OPTIONS List of all java options needed to use the AOT libraries
# $1_AOT_TARGETS List of all targets that the test rule will need to depend on
#
SetupAot = $(NamedParamsMacroTemplate)
define SetupAotBody
$$(info Running with AOTd libraries for $$($1_MODULES))
# Put AOT libraries in a separate directory so they are not deleted between
# test runs and can be reused between make invocations.
$$(foreach m, $$($1_MODULES), \
$$(eval $$(call SetupAotModule, $1_$$m, \
MODULE := $$m, \
BIN := $$(TEST_SUPPORT_DIR)/aot/$1, \
VM_OPTIONS := $$($1_VM_OPTIONS), \
)) \
$$(eval $1_AOT_OPTIONS += $$($1_$$m_AOT_OPTIONS)) \
$$(eval $1_AOT_TARGETS += $$($1_$$m_AOT_TARGETS)) \
)
endef
################################################################################
# Setup global test running parameters
################################################################################
@@ -192,6 +283,7 @@ endif
$(eval $(call SetTestOpt,VM_OPTIONS,JTREG))
$(eval $(call SetTestOpt,JAVA_OPTIONS,JTREG))
$(eval $(call SetTestOpt,AOT_MODULES,JTREG))
$(eval $(call SetTestOpt,JOBS,JTREG))
$(eval $(call SetTestOpt,TIMEOUT_FACTOR,JTREG))
@@ -202,7 +294,7 @@ $(eval $(call ParseKeywordVariable, JTREG, \
TEST_MODE ASSERT VERBOSE RETAIN MAX_MEM RUN_PROBLEM_LISTS \
RETRY_COUNT MAX_OUTPUT, \
STRING_KEYWORDS := OPTIONS JAVA_OPTIONS VM_OPTIONS KEYWORDS \
EXTRA_PROBLEM_LISTS LAUNCHER_OPTIONS, \
EXTRA_PROBLEM_LISTS AOT_MODULES LAUNCHER_OPTIONS, \
))
ifneq ($(JTREG), )
@@ -214,10 +306,11 @@ endif
$(eval $(call SetTestOpt,VM_OPTIONS,GTEST))
$(eval $(call SetTestOpt,JAVA_OPTIONS,GTEST))
$(eval $(call SetTestOpt,AOT_MODULES,GTEST))
$(eval $(call ParseKeywordVariable, GTEST, \
SINGLE_KEYWORDS := REPEAT, \
STRING_KEYWORDS := OPTIONS VM_OPTIONS JAVA_OPTIONS, \
STRING_KEYWORDS := OPTIONS VM_OPTIONS JAVA_OPTIONS AOT_MODULES, \
))
ifneq ($(GTEST), )
@@ -258,6 +351,8 @@ langtools_JTREG_PROBLEM_LIST += $(TOPDIR)/test/langtools/ProblemList.txt
hotspot_JTREG_PROBLEM_LIST += $(TOPDIR)/test/hotspot/jtreg/ProblemList.txt
lib-test_JTREG_PROBLEM_LIST += $(TOPDIR)/test/lib-test/ProblemList.txt
langtools_JTREG_MAX_MEM := 768m
################################################################################
# Parse test selection
#
@@ -498,18 +593,23 @@ define SetupRunGtestTestBody
$1_GTEST_REPEAT :=--gtest_repeat=$$(GTEST_REPEAT)
endif
run-test-$1: pre-run-test
ifneq ($$(GTEST_AOT_MODULES), )
$$(eval $$(call SetupAot, $1, \
MODULES := $$(GTEST_AOT_MODULES), \
VM_OPTIONS := $$(GTEST_VM_OPTIONS) $$(GTEST_JAVA_OPTIONS), \
))
endif
run-test-$1: pre-run-test $$($1_AOT_TARGETS)
$$(call LogWarn)
$$(call LogWarn, Running test '$$($1_TEST)')
$$(call MakeDir, $$($1_TEST_RESULTS_DIR) $$($1_TEST_SUPPORT_DIR))
$$(call ExecuteWithLog, $$($1_TEST_SUPPORT_DIR)/gtest, ( \
$$(CD) $$($1_TEST_SUPPORT_DIR) && \
$$(FIXPATH) $$(TEST_IMAGE_DIR)/hotspot/gtest/$$($1_VARIANT)/gtestLauncher \
-jdk $(JDK_UNDER_TEST) $$($1_GTEST_FILTER) \
--gtest_output=xml:$$($1_TEST_RESULTS_DIR)/gtest.xml \
--gtest_catch_exceptions=0 \
$$($1_GTEST_REPEAT) $$(GTEST_OPTIONS) $$(GTEST_VM_OPTIONS) \
$$(GTEST_JAVA_OPTIONS) \
$$(GTEST_JAVA_OPTIONS) $$($1_AOT_OPTIONS) \
> >($(TEE) $$($1_TEST_RESULTS_DIR)/gtest.txt) \
&& $$(ECHO) $$$$? > $$($1_EXITCODE) \
|| $$(ECHO) $$$$? > $$($1_EXITCODE) \
@@ -720,7 +820,7 @@ define SetupRunJtregTestBody
# Convert JTREG_foo into $1_JTREG_foo with a suitable value.
$$(eval $$(call SetJtregValue,$1,JTREG_TEST_MODE,agentvm))
$$(eval $$(call SetJtregValue,$1,JTREG_ASSERT,true))
$$(eval $$(call SetJtregValue,$1,JTREG_MAX_MEM,768m))
$$(eval $$(call SetJtregValue,$1,JTREG_MAX_MEM,512m))
$$(eval $$(call SetJtregValue,$1,JTREG_NATIVEPATH))
$$(eval $$(call SetJtregValue,$1,JTREG_BASIC_OPTIONS))
$$(eval $$(call SetJtregValue,$1,JTREG_PROBLEM_LIST))
@@ -736,7 +836,7 @@ define SetupRunJtregTestBody
# Make sure MaxRAMPercentage is high enough not to cause OOM or swapping,
# since we may end up with a lot of JVMs.
$1_JTREG_MAX_RAM_PERCENTAGE := $$(shell $(AWK) 'BEGIN { print 25 / $$($1_JTREG_JOBS); }')
$1_JTREG_MAX_RAM_PERCENTAGE := $$(shell $$(EXPR) 25 / $$($1_JTREG_JOBS))
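The awk and expr variants above compute the same ratio but differ in precision; with a made-up job count:

  # With $1_JTREG_JOBS = 6:
  #   $(AWK) 'BEGIN { print 25 / 6; }'   ->  4.16667   (floating point)
  #   $(EXPR) 25 / 6                     ->  4         (integer division)
  # so the awk form gives each JVM a slightly larger -XX:MaxRAMPercentage.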
JTREG_TIMEOUT_FACTOR ?= 4
@@ -789,15 +889,6 @@ define SetupRunJtregTestBody
$1_JTREG_BASIC_OPTIONS += -ea -esa
endif
ifeq ($$(ASAN_ENABLED), yes)
$1_JTREG_BASIC_OPTIONS += -e:ASAN_OPTIONS=handle_segv=0:handle_sigfpe=0:detect_leaks=false
$1_JTREG_BASIC_OPTIONS += -e:LD_PRELOAD=libasan.so.5
endif
ifeq ($$(USAN_ENABLED), yes)
$1_JTREG_BASIC_OPTIONS += -e:LD_PRELOAD=libubsan.so.1
endif
ifneq ($$($1_JTREG_NATIVEPATH), )
$1_JTREG_BASIC_OPTIONS += -nativepath:$$($1_JTREG_NATIVEPATH)
endif
@@ -825,6 +916,7 @@ define SetupRunJtregTestBody
endif
$1_JTREG_BASIC_OPTIONS += -e:TEST_IMAGE_DIR=$(TEST_IMAGE_DIR)
$1_JTREG_BASIC_OPTIONS += -e:TEST_IMAGE_GRAAL_DIR=$(TEST_IMAGE_DIR)/hotspot/jtreg/graal
ifneq ($$(JTREG_FAILURE_HANDLER_OPTIONS), )
$1_JTREG_LAUNCHER_OPTIONS += -Djava.library.path="$(JTREG_FAILURE_HANDLER_DIR)"
@@ -841,6 +933,17 @@ define SetupRunJtregTestBody
endif
endif
ifneq ($$(JTREG_AOT_MODULES), )
$$(eval $$(call SetupAot, $1, \
MODULES := $$(JTREG_AOT_MODULES), \
VM_OPTIONS := $$(JTREG_VM_OPTIONS) $$(JTREG_JAVA_OPTIONS), \
))
endif
ifneq ($$($1_AOT_OPTIONS), )
$1_JTREG_BASIC_OPTIONS += -vmoptions:"$$($1_AOT_OPTIONS)"
endif
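To tie this back to the keyword parsing earlier in the file: the AOT options only kick in when AOT_MODULES is supplied by the caller. A sketch of how that might be driven from the command line (hypothetical invocations, assuming the usual make test entry point):

  #   make test TEST=tier1 JTREG="AOT_MODULES=java.base"
  #   make test TEST=gtest:all TEST_OPTS="AOT_MODULES=java.base"
  # Either form populates JTREG_AOT_MODULES / GTEST_AOT_MODULES, which
  # triggers SetupAot above and passes the resulting -XX:AOTLibrary=...
  # options to the test JVM.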
clean-workdir-$1:
$$(RM) -r $$($1_TEST_SUPPORT_DIR)
@@ -875,7 +978,7 @@ define SetupRunJtregTestBody
done
endif
run-test-$1: pre-run-test clean-workdir-$1
run-test-$1: pre-run-test clean-workdir-$1 $$($1_AOT_TARGETS)
$$(call LogWarn)
$$(call LogWarn, Running test '$$($1_TEST)')
$$(call MakeDir, $$($1_TEST_RESULTS_DIR) $$($1_TEST_SUPPORT_DIR) \


@@ -1,5 +1,5 @@
#
# Copyright (c) 2017, 2021, Oracle and/or its affiliates. All rights reserved.
# Copyright (c) 2017, 2020, Oracle and/or its affiliates. All rights reserved.
# DO NOT ALTER OR REMOVE COPYRIGHT NOTICES OR THIS FILE HEADER.
#
# This code is free software; you can redistribute it and/or modify it
@@ -189,11 +189,15 @@ ifeq ($(OPENJDK_TARGET_CPU), x86_64)
endif
ifeq ($(OPENJDK_TARGET_OS), windows)
FIXPATH_BASE := $(BASH) $(TOPDIR)/make/scripts/fixpath.sh
FIXPATH := $(FIXPATH_BASE) exec
ifeq ($(wildcard $(TEST_IMAGE_DIR)/bin/fixpath.exe), )
$(info Error: fixpath is missing from test image '$(TEST_IMAGE_DIR)')
$(error Cannot continue.)
endif
FIXPATH := $(TEST_IMAGE_DIR)/bin/fixpath.exe -c
PATH_SEP:=;
else
FIXPATH_BASE :=
FIXPATH :=
PATH_SEP:=:
endif
# Check number of cores and memory in MB
@@ -222,6 +226,25 @@ ifeq ($(MEMORY_SIZE), )
MEMORY_SIZE := 1024
endif
# Setup LD for AOT support
ifneq ($(DEVKIT_HOME), )
ifeq ($(OPENJDK_TARGET_OS), windows)
LD_JAOTC := $(DEVKIT_HOME)/VC/bin/x64/link.exe
LIBRARY_PREFIX :=
SHARED_LIBRARY_SUFFIX := .dll
else ifeq ($(OPENJDK_TARGET_OS), linux)
LD_JAOTC := $(DEVKIT_HOME)/bin/ld
LIBRARY_PREFIX := lib
SHARED_LIBRARY_SUFFIX := .so
else ifeq ($(OPENJDK_TARGET_OS), macosx)
LD_JAOTC := $(DEVKIT_HOME)/Xcode.app/Contents/Developer/Toolchains/XcodeDefault.xctoolchain/usr/bin/ld
LIBRARY_PREFIX := lib
SHARED_LIBRARY_SUFFIX := .dylib
endif
else
LD := ld
endif
ifneq ($(wildcard $(JDK_IMAGE_DIR)/template.xml), )
TEST_OPTS_JCOV := true
JCOV_IMAGE_DIR := $(JDK_IMAGE_DIR)
@@ -256,8 +279,8 @@ $(call CreateNewSpec, $(NEW_SPEC), \
MAKE := $(MAKE), \
BASH := $(BASH), \
JIB_JAR := $(JIB_JAR), \
FIXPATH_BASE := $(FIXPATH_BASE), \
FIXPATH := $(FIXPATH), \
PATH_SEP := $(PATH_SEP), \
OPENJDK_TARGET_OS := $(OPENJDK_TARGET_OS), \
OPENJDK_TARGET_OS_TYPE := $(OPENJDK_TARGET_OS_TYPE), \
OPENJDK_TARGET_OS_ENV := $(OPENJDK_TARGET_OS_ENV), \
@@ -267,6 +290,9 @@ $(call CreateNewSpec, $(NEW_SPEC), \
OPENJDK_TARGET_CPU_ENDIAN := $(OPENJDK_TARGET_CPU_ENDIAN), \
NUM_CORES := $(NUM_CORES), \
MEMORY_SIZE := $(MEMORY_SIZE), \
LD_JAOTC := $(LD_JAOTC), \
LIBRARY_PREFIX := $(LIBRARY_PREFIX), \
SHARED_LIBRARY_SUFFIX := $(SHARED_LIBRARY_SUFFIX), \
include $(TOPDIR)/make/RunTestsPrebuiltSpec.gmk, \
TEST_OPTS_JCOV := $(TEST_OPTS_JCOV), \
$(CUSTOM_NEW_SPEC_LINE), \


@@ -116,13 +116,16 @@ JAVAC_CMD := $(BOOT_JDK)/bin/javac
JAR_CMD := $(BOOT_JDK)/bin/jar
JLINK_CMD := $(JDK_OUTPUTDIR)/bin/jlink
JMOD_CMD := $(JDK_OUTPUTDIR)/bin/jmod
JARSIGNER_CMD := $(BOOT_JDK)/bin/jarsigner
JAVA := $(FIXPATH) $(JAVA_CMD) $(JAVA_FLAGS_BIG) $(JAVA_FLAGS)
JAVA_SMALL := $(FIXPATH) $(JAVA_CMD) $(JAVA_FLAGS_SMALL) $(JAVA_FLAGS)
JAVA_DETACH := $(FIXPATH) $(FIXPATH_DETACH_FLAG) $(JAVA_CMD) $(JAVA_FLAGS_BIG) $(JAVA_FLAGS)
JAVAC := $(FIXPATH) $(JAVAC_CMD)
JAR := $(FIXPATH) $(JAR_CMD)
JLINK := $(FIXPATH) $(JLINK_CMD)
JMOD := $(FIXPATH) $(JMOD_CMD)
JARSIGNER := $(FIXPATH) $(JARSIGNER_CMD)
BUILD_JAVA := $(JDK_IMAGE_DIR)/bin/JAVA
################################################################################
@@ -150,6 +153,7 @@ LN := ln
MIG := mig
MKDIR := mkdir
MV := mv
NAWK := nawk
NICE := nice
PATCH := patch
PRINTF := printf
@@ -163,6 +167,7 @@ TAIL := tail
TEE := tee
TR := tr
TOUCH := touch
UNIQ := uniq
WC := wc
XARGS := xargs
ZIPEXE := zip
@@ -173,7 +178,7 @@ HG := hg
ULIMIT := ulimit
ifeq ($(OPENJDK_BUILD_OS), windows)
PATHTOOL := cygpath
CYGPATH := cygpath
endif
################################################################################


@@ -30,25 +30,34 @@ include MakeBase.gmk
############################################################################
ifeq ($(call isTargetOs, windows), true)
FIXPATH_COPY := $(TEST_IMAGE_DIR)/bin/fixpath.exe
$(FIXPATH_COPY): $(firstword $(FIXPATH))
$(call install-file)
FIXPATH_WORKSPACE_ROOT := $(call FixPath, $(WORKSPACE_ROOT))
FIXPATH_OUTPUTDIR := $(call FixPath, $(OUTPUTDIR))
else
FIXPATH_WORKSPACE_ROOT := $(WORKSPACE_ROOT)
FIXPATH_OUTPUTDIR := $(OUTPUTDIR)
endif
BUILD_INFO_PROPERTIES := $(TEST_IMAGE_DIR)/build-info.properties
$(BUILD_INFO_PROPERTIES):
$(call MakeTargetDir)
$(ECHO) "# Build info properties for JDK tests" > $@
$(ECHO) "build.workspace.root=$(call FixPath, $(WORKSPACE_ROOT))" >> $@
$(ECHO) "build.output.root=$(call FixPath, $(OUTPUTDIR))" >> $@
$(ECHO) "build.workspace.root=$(FIXPATH_WORKSPACE_ROOT)" >> $@
$(ECHO) "build.output.root=$(FIXPATH_OUTPUTDIR)" >> $@
README := $(TEST_IMAGE_DIR)/Readme.txt
$(README):
$(call MakeTargetDir)
$(ECHO) > $@ 'JDK test image'
TARGETS += $(BUILD_INFO_PROPERTIES) $(README)
prepare-test-image: $(FIXPATH_COPY) $(BUILD_INFO_PROPERTIES)
$(call MakeDir, $(TEST_IMAGE_DIR))
$(ECHO) > $(TEST_IMAGE_DIR)/Readme.txt 'JDK test image'
################################################################################
prepare-test-image: $(TARGETS)
all: prepare-test-image
.PHONY: default all prepare-test-image
