Compare commits

..

25 Commits

Author SHA1 Message Date
Maxim Kartashev
d05fa193aa JBR-4621 Implemented key repeat 2022-09-30 13:22:41 +03:00
Maxim Kartashev
0c13996782 JBR-4621 Input events support for Wayland
This includes basic mouse and keyboard support.
2022-09-23 09:23:18 +03:00
Maxim Kartashev
221ea45b95 Let WLToolkit work with DISPLAY unset 2022-07-08 11:44:31 +03:00
Alexey Ushakov
d7a5df34ef Improved sun.awt.wl.WLGraphicsEnvironment to support createGraphics() 2022-06-30 09:54:52 +02:00
Maxim Kartashev
194bce3a6e Added libwakefield source code to the tree
It is not integrated into the build infrastructure both for simplicity
and to avoid otherwise unnecessary dependencies on weston, pixman, etc.

Also fixed copyrights in the recently added files, including the
auto-generated ones.
2022-04-27 15:10:29 +03:00
Maxim Kartashev
87d83f4b27 Made it possible for Wayland tests to run in parallel
Also fixed a potential crash in getLocationOnScreen().
2022-04-26 19:27:03 +03:00
Maxim Kartashev
5f99c93eae Wayland test harness and sample test 2022-04-26 18:23:21 +03:00
Maxim Kartashev
7ed196e3c6 AWT Robot to support Wayland natively
Requires the presence of the 'wakefield' protocol extension on the
server side; will throw UOE on use otherwise. Can be completely
disabled by undefining WAKEFIELD_ROBOT during compilation.

Provides the ability to re-position the surface to the given absolute
coordinates, query the surface's position, obtain RGB of a pixel at the
given absolute coordinates and take a screenshot of an area.
2022-04-19 19:24:29 +03:00
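(For context: the capabilities listed above are exercised through the standard `java.awt.Robot` API; the 'wakefield' protocol extension only performs the Wayland-side work. A minimal, illustrative sketch of the calls involved, assuming a build from this branch running on a compositor that provides the extension:)
```
import java.awt.AWTException;
import java.awt.Color;
import java.awt.Rectangle;
import java.awt.Robot;
import java.awt.image.BufferedImage;

public class RobotProbe {
    public static void main(String[] args) throws AWTException {
        // Per the commit message above, Robot throws UnsupportedOperationException
        // on use if the compositor lacks the 'wakefield' extension.
        Robot robot = new Robot();

        // Obtain the RGB of a pixel at absolute screen coordinates.
        Color c = robot.getPixelColor(10, 10);
        System.out.println("Pixel at (10, 10): " + c);

        // Take a screenshot of an area.
        BufferedImage shot = robot.createScreenCapture(new Rectangle(0, 0, 200, 100));
        System.out.println("Captured " + shot.getWidth() + "x" + shot.getHeight());
    }
}
```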
Nikita Gubarkov
7836128343 Suppress unused-result warning for libfontmanager 2022-04-14 23:50:47 +03:00
nikita.gubarkov
62b72beede Text rendering support
Extracted X11-related code from libfontmanager into libfontmanager_xawt
2022-04-14 16:49:21 +03:00
Maxim Kartashev
77e472ba63 Reduced xdg_wm_base protocol version to 1 in order to run under Weston
This was done purely for convenience. The version can be bumped back up
at any time, but the change will require a more recent version
of Weston for testing.
2022-04-08 07:58:41 -07:00
Alexey Ushakov
2767704152 Added JFrame support 2022-04-06 21:03:31 +02:00
Alexey Ushakov
d352fe2b32 Fixed child hw component position 2022-03-31 22:04:37 +02:00
Alexey Ushakov
0e47c87515 Implemented heavyweight button rendering 2022-02-18 20:45:39 +01:00
Alexey Ushakov
297303bfa6 Moved native window management to WLComponentPeer 2022-02-15 22:45:06 +01:00
Alexey Ushakov
d5f8602c3b Added WLRepaintArea 2022-02-15 21:52:10 +01:00
Alexey Ushakov
dfd5f679e5 Refactored peers 2022-02-11 21:51:43 +01:00
Alexey Ushakov
5cc0bfb209 Added stubs for WLTK button peer 2022-02-11 20:00:46 +01:00
Alexey Ushakov
43d6b5012f Added 2d surface support 2022-02-11 20:00:22 +01:00
Alexey Ushakov
659ea0935c Added support for background color. Refactoring 2022-02-11 20:00:22 +01:00
Alexey Ushakov
eb90129dca Make simple awt window visible 2022-02-11 20:00:22 +01:00
Dmitry Batrak
89db5c8960 window showing and event loop prototype 2022-02-11 20:00:22 +01:00
Dmitry Batrak
cc8e7d1ee7 more stubbing for WLToolkit, add WLFramePeer 2022-02-11 20:00:22 +01:00
Dmitry Batrak
b359a00f63 more stubbing for WLToolkit 2022-02-11 20:00:21 +01:00
Alexey Ushakov
0c9bbd67d3 Created stub version of WLToolkit
A wayland base toolkit with native part linked to wayland-client library
2022-02-11 20:00:21 +01:00
8785 changed files with 134428 additions and 548373 deletions

File diff suppressed because it is too large.

.gitignore

@@ -18,5 +18,3 @@ NashornProfile.txt
/src/utils/LogCompilation/target/
/.project/
/.settings/
*.class
.idea/workspace.xml


@@ -1,7 +1,64 @@
[![official JetBrains project](http://jb.gg/badges/official.svg)](https://confluence.jetbrains.com/display/ALL/JetBrains+on+GitHub)
# Welcome to the JDK!
# Welcome to JetBrains Runtime!
<a name="jetbrains-runtime"></a>
## Wakefield
This is a temporary section created to host information on the
[Wakefield](https://wiki.openjdk.java.net/display/wakefield) project.
Do not use this branch. It is [outdated](https://youtrack.jetbrains.com/issue/JBR-4375/New-branch-naming-policy-in-JBR-repo).
Please use [main](https://github.com/JetBrains/JetBrainsRuntime/tree/main) instead.
### Building
There are two additional `configure` arguments:
```
--with-wayland specify prefix directory for the wayland package
(expecting the headers under PATH/include)
--with-wayland-include specify directory for the wayland include files
```
As usual, there should be no need to specify those explicitly unless you're doing
something tricky.
However, a variant of `libwayland-dev` needs to be installed on the build system.
### Running
Make sure your system is configured such that `libwayland` can find the socket to connect to;
usually this means that the environment variable `WAYLAND_DISPLAY` is set to something
sensible. Then pass this argument to `java`:
```
-Dawt.toolkit.name=WLToolkit
```
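For a quick sanity check, a minimal AWT program (all names below are illustrative) can be
launched with that flag, for example `java -Dawt.toolkit.name=WLToolkit HelloWayland.java`.
What exactly renders depends on how far this branch has progressed, so treat this as a sketch:
```
import java.awt.Frame;
import java.awt.Label;
import java.awt.Toolkit;

public class HelloWayland {
    public static void main(String[] args) {
        // Prints the active toolkit class; it should be the Wayland toolkit
        // when the flag above took effect.
        System.out.println("Toolkit: " + Toolkit.getDefaultToolkit().getClass().getName());

        Frame frame = new Frame("Hello, Wayland");
        frame.add(new Label("Running on WLToolkit"));
        frame.setSize(300, 100);
        frame.setVisible(true);
    }
}
```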
### Testing
Testing that involves `Robot` is done inside a [Weston](https://gitlab.freedesktop.org/wayland/weston/)
instance with a special module, `libwakefield`, loaded to provide the necessary
functionality. The Wayland-specific tests are therefore executed with a dedicated test driver,
`test/jdk/java/awt/wakefield/WakefieldTestDriver.java`. The driver also provides an easy
way to run a test in several configurations that differ in the size and even the number
of "outputs" (monitors); see the sketch at the end of this section.
To run the Wayland-specific tests, perform these steps:
* Install Weston version 9 (earlier versions are known NOT to work).
* Obtain `libwakefield.so` either by building from source (available under
`src/java.desktop/share/native/libwakefield` and not integrated into the rest of the
build infrastructure; see `README.md` there)
or by fetching the latest pre-built `x64` binary
```
wget https://github.com/mkartashev/wakefield/raw/main/libwakefield.so
```
* Set the `LIBWAKEFIELD` environment variable to the full path of `libwakefield.so`
```
export LIBWAKEFIELD=/tmp/wakefield-testing/libwakefield.so
```
* Run `jtreg` like so
```
jtreg -e:XDG_RUNTIME_DIR -e:LIBWAKEFIELD -testjdk:... test/jdk/java/awt/wakefield/
```
This was verified to work in `Ubuntu 21.10`.
This does NOT work in `Ubuntu 21.04` or `Fedora 34`.
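The "outputs" made available by the test driver are visible to a test through the regular
`GraphicsEnvironment` API (assuming the usual AWT mapping of one output to one
`GraphicsDevice`). A small, illustrative sketch a test could use to report its screen
configuration:
```
import java.awt.GraphicsDevice;
import java.awt.GraphicsEnvironment;
import java.awt.Rectangle;

public class ListOutputs {
    public static void main(String[] args) {
        GraphicsEnvironment ge = GraphicsEnvironment.getLocalGraphicsEnvironment();
        for (GraphicsDevice gd : ge.getScreenDevices()) {
            // One entry per output (monitor) configured by the test driver.
            Rectangle b = gd.getDefaultConfiguration().getBounds();
            System.out.println(gd.getIDstring() + ": " + b.width + "x" + b.height
                    + " at (" + b.x + "," + b.y + ")");
        }
    }
}
```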
## Generic Info (not Wakefield-specific)
For build instructions please see the
[online documentation](https://openjdk.java.net/groups/build/doc/building.html),
or either of these files:
- [doc/building.html](doc/building.html) (html version)
- [doc/building.md](doc/building.md) (markdown version)
See <https://openjdk.java.net/> for more information about
the OpenJDK Community and the JDK.


@@ -25,26 +25,7 @@
# Shell script for generating an IDEA project from a given list of modules
usage() {
echo "Usage: $0 [-h|--help] [-q|--quiet] [-a|--absolute-paths] [-r|--root <path>] [-o|--output <path>] [-c|--conf <conf_name>] [modules...]"
echo " -h | --help"
echo " -q | --quiet
No stdout output"
echo " -a | --absolute-paths
Use absolute paths to this jdk, so that generated .idea
project files can be moved independently of jdk sources"
echo " -r | --root <path>
Project content root
Default: $TOPLEVEL_DIR"
echo " -o | --output <path>
Where .idea directory with project files will be generated
(e.g. using '-o .' will place project files in './.idea')
Default: same as --root"
echo " -c | --conf <conf_name>
make configuration (release, slowdebug etc)"
echo " [modules...]
Generate project modules for specific java modules
(e.g. 'java.base java.desktop')
Default: all existing modules (java.* and jdk.*)"
echo "usage: $0 [-h|--help] [-v|--verbose] [-o|--output <path>] [-c|--conf <conf_name>] [modules]+"
exit 1
}
@@ -52,13 +33,10 @@ SCRIPT_DIR=`dirname $0`
#assume TOP is the dir from which the script has been called
TOP=`pwd`
cd $SCRIPT_DIR; SCRIPT_DIR=`pwd`
if [ "x$TOPLEVEL_DIR" = "x" ] ; then
cd .. ; TOPLEVEL_DIR=`pwd`
fi
cd $TOP;
VERBOSE=true
ABSOLUTE_PATHS=false
IDEA_OUTPUT=$TOP/.idea
VERBOSE="false"
CONF_ARG=
while [ $# -gt 0 ]
do
@@ -67,24 +45,14 @@ do
usage
;;
-q | --quiet )
VERBOSE=false
;;
-a | --absolute-paths )
ABSOLUTE_PATHS=true
;;
-r | --root )
TOPLEVEL_DIR="$2"
shift
-v | --vebose )
VERBOSE="true"
;;
-o | --output )
IDEA_OUTPUT="$2/.idea"
IDEA_OUTPUT=$2/.idea
shift
;;
-c | --conf )
CONF_ARG="CONF_NAME=$2"
shift
@@ -101,21 +69,23 @@ do
shift
done
if [ "x$IDEA_OUTPUT" = "x" ] ; then
IDEA_OUTPUT="$TOPLEVEL_DIR/.idea"
if [ -e $IDEA_OUTPUT ] ; then
rm -r $IDEA_OUTPUT
fi
mkdir -p $IDEA_OUTPUT || exit 1
cd $IDEA_OUTPUT; IDEA_OUTPUT=`pwd`
if [ "x$TOPLEVEL_DIR" = "x" ] ; then
cd $SCRIPT_DIR/..
TOPLEVEL_DIR=`pwd`
cd $IDEA_OUTPUT
fi
mkdir -p $IDEA_OUTPUT || exit 1
cd "$TOP" ; cd $TOPLEVEL_DIR; TOPLEVEL_DIR=`pwd`
cd "$TOP" ; cd $IDEA_OUTPUT; IDEA_OUTPUT=`pwd`
cd ..; IDEA_OUTPUT_PARENT=`pwd`
cd "$SCRIPT_DIR/.." ; OPENJDK_DIR=`pwd`
IDEA_MAKE="$OPENJDK_DIR/make/ide/idea/jdk"
MAKE_DIR="$SCRIPT_DIR/../make"
IDEA_MAKE="$MAKE_DIR/ide/idea/jdk"
IDEA_TEMPLATE="$IDEA_MAKE/template"
cp -rn "$TOPLEVEL_DIR/jb/project/idea-project-files"/* "$IDEA_OUTPUT"
cp -rn "$IDEA_TEMPLATE"/* "$IDEA_OUTPUT"
cp -r "$IDEA_TEMPLATE"/* "$IDEA_OUTPUT"
#override template
if [ -d "$TEMPLATES_OVERRIDE" ] ; then
@@ -124,31 +94,31 @@ if [ -d "$TEMPLATES_OVERRIDE" ] ; then
done
fi
if [ "$VERBOSE" = true ] ; then
echo "Will generate IDEA project files in \"$IDEA_OUTPUT\" for project \"$TOPLEVEL_DIR\""
if [ "$VERBOSE" = "true" ] ; then
echo "output dir: $IDEA_OUTPUT"
echo "idea template dir: $IDEA_TEMPLATE"
fi
cd $TOP ; make -f "$IDEA_MAKE/idea.gmk" -I "$OPENJDK_DIR" idea TOPLEVEL_DIR="$TOPLEVEL_DIR" \
MAKEOVERRIDES= IDEA_OUTPUT_PARENT="$IDEA_OUTPUT_PARENT" OUT="$IDEA_OUTPUT/env.cfg" MODULES="$*" $CONF_ARG || exit 1
cd $TOP ; make -f "$IDEA_MAKE/idea.gmk" -I $MAKE_DIR/.. idea MAKEOVERRIDES= OUT=$IDEA_OUTPUT/env.cfg MODULES="$*" $CONF_ARG || exit 1
cd $SCRIPT_DIR
. $IDEA_OUTPUT/env.cfg
# Expect MODULES, MODULE_NAMES, RELATIVE_PROJECT_DIR, RELATIVE_BUILD_DIR to be set
if [ "xMODULES" = "x" ] ; then
echo "FATAL: MODULES is empty" >&2; exit 1
# Expect MODULE_ROOTS, MODULE_NAMES, BOOT_JDK & SPEC to be set
if [ "x$MODULE_ROOTS" = "x" ] ; then
echo "FATAL: MODULE_ROOTS is empty" >&2; exit 1
fi
if [ "x$MODULE_NAMES" = "x" ] ; then
echo "FATAL: MODULE_NAMES is empty" >&2; exit 1
fi
if [ "x$RELATIVE_PROJECT_DIR" = "x" ] ; then
echo "FATAL: RELATIVE_PROJECT_DIR is empty" >&2; exit 1
if [ "x$BOOT_JDK" = "x" ] ; then
echo "FATAL: BOOT_JDK is empty" >&2; exit 1
fi
if [ "x$RELATIVE_BUILD_DIR" = "x" ] ; then
echo "FATAL: RELATIVE_BUILD_DIR is empty" >&2; exit 1
if [ "x$SPEC" = "x" ] ; then
echo "FATAL: SPEC is empty" >&2; exit 1
fi
if [ -d "$TOPLEVEL_DIR/.hg" ] ; then
@@ -159,43 +129,6 @@ if [ -d "$TOPLEVEL_DIR/.git" ] ; then
VCS_TYPE="Git"
fi
if [ "$ABSOLUTE_PATHS" = true ] ; then
if [ "x$PATHTOOL" != "x" ]; then
PROJECT_DIR="`$PATHTOOL -am $OPENJDK_DIR`"
TOPLEVEL_PROJECT_DIR="`$PATHTOOL -am $TOPLEVEL_DIR`"
else
PROJECT_DIR="$OPENJDK_DIR"
TOPLEVEL_PROJECT_DIR="$TOPLEVEL_DIR"
fi
MODULE_DIR="$PROJECT_DIR"
TOPLEVEL_MODULE_DIR="$TOPLEVEL_PROJECT_DIR"
cd "$IDEA_OUTPUT_PARENT" && cd "$RELATIVE_BUILD_DIR" && BUILD_DIR="`pwd`"
CLION_SCRIPT_TOPDIR="$OPENJDK_DIR"
CLION_PROJECT_DIR="$PROJECT_DIR"
else
if [ "$RELATIVE_PROJECT_DIR" = "." ] ; then
PROJECT_DIR=""
else
PROJECT_DIR="/$RELATIVE_PROJECT_DIR"
fi
if [ "$RELATIVE_TOPLEVEL_PROJECT_DIR" = "." ] ; then
TOPLEVEL_PROJECT_DIR=""
else
TOPLEVEL_PROJECT_DIR="/$RELATIVE_TOPLEVEL_PROJECT_DIR"
fi
MODULE_DIR="\$MODULE_DIR\$$PROJECT_DIR"
PROJECT_DIR="\$PROJECT_DIR\$$PROJECT_DIR"
TOPLEVEL_MODULE_DIR="\$MODULE_DIR\$$TOPLEVEL_PROJECT_DIR"
TOPLEVEL_PROJECT_DIR="\$PROJECT_DIR\$$TOPLEVEL_PROJECT_DIR"
BUILD_DIR="\$PROJECT_DIR\$/$RELATIVE_BUILD_DIR"
CLION_SCRIPT_TOPDIR="$CLION_RELATIVE_PROJECT_DIR"
CLION_PROJECT_DIR="\$PROJECT_DIR\$/$CLION_SCRIPT_TOPDIR"
fi
if [ "$VERBOSE" = true ] ; then
echo "Project root: $PROJECT_DIR"
echo "Generating IDEA project files..."
fi
### Replace template variables
NUM_REPLACEMENTS=0
@@ -219,106 +152,112 @@ add_replacement() {
eval TO$NUM_REPLACEMENTS='$2'
}
add_replacement "###PATHTOOL###" "$PATHTOOL"
add_replacement "###CLION_SCRIPT_TOPDIR###" "$CLION_SCRIPT_TOPDIR"
add_replacement "###CLION_PROJECT_DIR###" "$CLION_PROJECT_DIR"
add_replacement "###PROJECT_DIR###" "$PROJECT_DIR"
add_replacement "###MODULE_DIR###" "$MODULE_DIR"
add_replacement "###TOPLEVEL_PROJECT_DIR###" "$TOPLEVEL_PROJECT_DIR"
add_replacement "###TOPLEVEL_MODULE_DIR###" "$TOPLEVEL_MODULE_DIR"
add_replacement "###MODULE_NAMES###" "$MODULE_NAMES"
add_replacement "###VCS_TYPE###" "$VCS_TYPE"
add_replacement "###BUILD_DIR###" "$BUILD_DIR"
add_replacement "###RELATIVE_BUILD_DIR###" "$RELATIVE_BUILD_DIR"
if [ "x$PATHTOOL" != "x" ]; then
add_replacement "###BASH_RUNNER_PREFIX###" "\$PROJECT_DIR\$/.idea/bash.bat"
else
add_replacement "###BASH_RUNNER_PREFIX###" ""
fi
if [ "x$PATHTOOL" != "x" ]; then
SPEC_DIR=`dirname $SPEC`
if [ "x$CYGPATH" != "x" ]; then
add_replacement "###BUILD_DIR###" "`$CYGPATH -am $SPEC_DIR`"
add_replacement "###IMAGES_DIR###" "`$CYGPATH -am $SPEC_DIR`/images/jdk"
add_replacement "###ROOT_DIR###" "`$CYGPATH -am $TOPLEVEL_DIR`"
add_replacement "###IDEA_DIR###" "`$CYGPATH -am $IDEA_OUTPUT`"
if [ "x$JT_HOME" = "x" ]; then
add_replacement "###JTREG_HOME###" ""
else
add_replacement "###JTREG_HOME###" "`$PATHTOOL -am $JT_HOME`"
add_replacement "###JTREG_HOME###" "`$CYGPATH -am $JT_HOME`"
fi
elif [ "x$WSL_DISTRO_NAME" != "x" ]; then
add_replacement "###BUILD_DIR###" "`wslpath -am $SPEC_DIR`"
add_replacement "###IMAGES_DIR###" "`wslpath -am $SPEC_DIR`/images/jdk"
add_replacement "###ROOT_DIR###" "`wslpath -am $TOPLEVEL_DIR`"
add_replacement "###IDEA_DIR###" "`wslpath -am $IDEA_OUTPUT`"
if [ "x$JT_HOME" = "x" ]; then
add_replacement "###JTREG_HOME###" ""
else
add_replacement "###JTREG_HOME###" "`wslpath -am $JT_HOME`"
fi
else
add_replacement "###BUILD_DIR###" "$SPEC_DIR"
add_replacement "###JTREG_HOME###" "$JT_HOME"
add_replacement "###IMAGES_DIR###" "$SPEC_DIR/images/jdk"
add_replacement "###ROOT_DIR###" "$TOPLEVEL_DIR"
add_replacement "###IDEA_DIR###" "$IDEA_OUTPUT"
fi
MODULE_IMLS=""
TEST_MODULE_DEPENDENCIES=""
for module in $MODULE_NAMES; do
MODULE_IMLS="$MODULE_IMLS<module fileurl=\"file://\$PROJECT_DIR$/.idea/$module.iml\" filepath=\"\$PROJECT_DIR$/.idea/$module.iml\" /> "
TEST_MODULE_DEPENDENCIES="$TEST_MODULE_DEPENDENCIES<orderEntry type=\"module\" module-name=\"$module\" scope=\"TEST\" /> "
SOURCE_PREFIX="<sourceFolder url=\"file://"
SOURCE_POSTFIX="\" isTestSource=\"false\" />"
for root in $MODULE_ROOTS; do
if [ "x$CYGPATH" != "x" ]; then
root=`$CYGPATH -am $root`
elif [ "x$WSL_DISTRO_NAME" != "x" ]; then
root=`wslpath -am $root`
fi
VM_CI="jdk.internal.vm.ci/share/classes"
VM_COMPILER="src/jdk.internal.vm.compiler/share/classes"
if test "${root#*$VM_CI}" != "$root" || test "${root#*$VM_COMPILER}" != "$root"; then
for subdir in "$root"/*; do
if [ -d "$subdir" ]; then
SOURCES=$SOURCES" $SOURCE_PREFIX""$subdir"/src"$SOURCE_POSTFIX"
fi
done
else
SOURCES=$SOURCES" $SOURCE_PREFIX""$root""$SOURCE_POSTFIX"
fi
done
add_replacement "###MODULE_IMLS###" "$MODULE_IMLS"
add_replacement "###TEST_MODULE_DEPENDENCIES###" "$TEST_MODULE_DEPENDENCIES"
add_replacement "###SOURCE_ROOTS###" "$SOURCES"
replace_template_dir "$IDEA_OUTPUT"
### Generate module project files
### Compile the custom Logger
if [ "$VERBOSE" = true ] ; then
echo "Generating project modules:"
fi
(
DEFAULT_IFS="$IFS"
IFS='#'
for value in $MODULES; do
(
eval "$value"
if [ "$VERBOSE" = true ] ; then
echo " $module"
fi
MAIN_SOURCE_DIRS=""
CONTENT_ROOTS=""
IFS=' '
for dir in $moduleSrcDirs; do
case $dir in
"src/"*) MAIN_SOURCE_DIRS="$MAIN_SOURCE_DIRS <sourceFolder url=\"file://$MODULE_DIR/$dir\" isTestSource=\"false\" />" ;;
*"/support/gensrc/$module") ;; # Exclude generated sources to avoid module-info conflicts, see https://youtrack.jetbrains.com/issue/IDEA-185108
*) CONTENT_ROOTS="$CONTENT_ROOTS <content url=\"file://$MODULE_DIR/$dir\">\
<sourceFolder url=\"file://$MODULE_DIR/$dir\" isTestSource=\"false\" generated=\"true\" /></content>" ;;
esac
done
if [ "x$MAIN_SOURCE_DIRS" != "x" ] ; then
CONTENT_ROOTS="<content url=\"file://$MODULE_DIR/src/$module\">$MAIN_SOURCE_DIRS</content>$CONTENT_ROOTS"
fi
add_replacement "###MODULE_CONTENT_ROOTS###" "$CONTENT_ROOTS"
DEPENDENCIES=""
for dep in $moduleDependencies; do
case $MODULE_NAMES in # Exclude skipped modules from dependencies
*"$dep"*) DEPENDENCIES="$DEPENDENCIES<orderEntry type=\"module\" module-name=\"$dep\" /> "
esac
done
add_replacement "###DEPENDENCIES###" "$DEPENDENCIES"
cp "$IDEA_OUTPUT/module.iml" "$IDEA_OUTPUT/$module.iml"
IFS="$DEFAULT_IFS"
replace_template_file "$IDEA_OUTPUT/$module.iml"
)
done
)
rm "$IDEA_OUTPUT/module.iml"
CLASSES=$IDEA_OUTPUT/classes
### Create shell script runner for Windows
if [ "x$ANT_HOME" = "x" ] ; then
# try some common locations, before giving up
if [ -f "/usr/share/ant/lib/ant.jar" ] ; then
ANT_HOME="/usr/share/ant"
elif [ -f "/usr/local/Cellar/ant/1.9.4/libexec/lib/ant.jar" ] ; then
ANT_HOME="/usr/local/Cellar/ant/1.9.4/libexec"
else
echo "FATAL: cannot find ant. Try setting ANT_HOME." >&2; exit 1
fi
fi
CP=$ANT_HOME/lib/ant.jar
rm -rf $CLASSES; mkdir $CLASSES
if [ "x$PATHTOOL" != "x" ]; then
echo "@echo off" > "$IDEA_OUTPUT/bash.bat"
if [ "x$WSL_DISTRO_NAME" != "x" ] ; then
echo "wsl -d $WSL_DISTRO_NAME --cd \"%cd%\" -e %*" >> "$IDEA_OUTPUT/bash.bat"
else
echo "$WINENV_ROOT\bin\bash.exe -l -c \"cd %CD:\=/%/ && %*\"" >> "$IDEA_OUTPUT/bash.bat"
fi
# If we have a Windows boot JDK, we need a .exe suffix
if [ -e "$BOOT_JDK/bin/java.exe" ] ; then
JAVAC=javac.exe
else
JAVAC=javac
fi
# If we are on WSL, the boot JDK might be either Windows or Linux,
# and we need to use realpath instead of CYGPATH to make javac work on both.
# We need to handle this case first since CYGPATH might be set on WSL.
if [ "x$WSL_DISTRO_NAME" != "x" ]; then
JAVAC_SOURCE_FILE=`realpath --relative-to=./ $IDEA_OUTPUT/src/idea/IdeaLoggerWrapper.java`
JAVAC_SOURCE_PATH=`realpath --relative-to=./ $IDEA_OUTPUT/src`
JAVAC_CLASSES=`realpath --relative-to=./ $CLASSES`
ANT_TEMP=`mktemp -d -p ./`
cp $ANT_HOME/lib/ant.jar $ANT_TEMP/ant.jar
JAVAC_CP=$ANT_TEMP/ant.jar
elif [ "x$CYGPATH" != "x" ] ; then ## CYGPATH may be set in env.cfg
JAVAC_SOURCE_FILE=`$CYGPATH -am $IDEA_OUTPUT/src/idea/IdeaLoggerWrapper.java`
JAVAC_SOURCE_PATH=`$CYGPATH -am $IDEA_OUTPUT/src`
JAVAC_CLASSES=`$CYGPATH -am $CLASSES`
JAVAC_CP=`$CYGPATH -am $CP`
else
JAVAC_SOURCE_FILE=$IDEA_OUTPUT/src/idea/IdeaLoggerWrapper.java
JAVAC_SOURCE_PATH=$IDEA_OUTPUT/src
JAVAC_CLASSES=$CLASSES
JAVAC_CP=$CP
fi
$BOOT_JDK/bin/$JAVAC -d $JAVAC_CLASSES -sourcepath $JAVAC_SOURCE_PATH -cp $JAVAC_CP $JAVAC_SOURCE_FILE
if [ "$VERBOSE" = true ] ; then
IDEA_PROJECT_DIR="`dirname $IDEA_OUTPUT`"
if [ "x$PATHTOOL" != "x" ]; then
IDEA_PROJECT_DIR="`$PATHTOOL -am $IDEA_PROJECT_DIR`"
fi
echo "
Now you can open \"$IDEA_PROJECT_DIR\" as IDEA project
You can also run 'bash \"$IDEA_OUTPUT/jdk-clion/update-project.sh\"' to generate Clion project"
if [ "x$WSL_DISTRO_NAME" != "x" ]; then
rm -rf $ANT_TEMP
fi


@@ -1,5 +1,5 @@
/*
* Copyright (c) 2011, 2012, Oracle and/or its affiliates. All rights reserved.
* Copyright (c) 2015, 2016, Oracle and/or its affiliates. All rights reserved.
* DO NOT ALTER OR REMOVE COPYRIGHT NOTICES OR THIS FILE HEADER.
*
* This code is free software; you can redistribute it and/or modify it
@@ -23,16 +23,23 @@
* questions.
*/
package com.apple.eawt.event;
/**
* Listener interface for receiving pressure events.
/*
* This little utility can be used to expand the jib-profiles configuration
* files into plain json.
*
* @see PressureEvent
* @see GestureUtilities
* Usage:
*
* jjs -scripting print-config.js -- [<jib-profiles.js>]
*
* @since Java for Mac OS X 10.5 Update 7, Java for Mac OS X 10.6 Update 2
*/
public interface PressureListener extends GestureListener {
public void pressure(final PressureEvent e);
var file = $ARG[0];
if (file == null) {
file = new java.io.File(__DIR__, "../conf/jib-profiles.js").getCanonicalPath();
}
load(file);
var input = {};
input.get = function(dependencyName, attribute) {
return "\${" + dependencyName + "." + attribute + "}";
};
print(JSON.stringify(getJibProfiles(input), null, 2));

configure: Executable file → Normal file (no content changes)


@@ -163,8 +163,6 @@
<h3 id="building-on-aarch64">Building on aarch64</h3>
<p>At a minimum, a machine with 8 cores is advisable, as well as 8 GB of RAM. (The more cores to use, the more memory you need.) At least 6 GB of free disk space is required.</p>
<p>If you do not have access to sufficiently powerful hardware, it is also possible to use <a href="#cross-compiling">cross-compiling</a>.</p>
<h4 id="branch-protection">Branch Protection</h4>
<p>In order to use Branch Protection features in the VM, <code>--enable-branch-protection</code> must be used. This option requires C++ compiler support (GCC 9.1.0+ or Clang 10+). The resulting build can be run on both machines with and without support for branch protection in hardware. Branch Protection is only supported for Linux targets.</p>
<h3 id="building-on-32-bit-arm">Building on 32-bit arm</h3>
<p>This is not recommended. Instead, see the section on <a href="#cross-compiling">Cross-compiling</a>.</p>
<h2 id="operating-system-requirements">Operating System Requirements</h2>
@@ -216,7 +214,7 @@
<p>Unfortunately, Cygwin can be unreliable in certain circumstances. If you experience build tool crashes or strange issues when building on Windows, please check the Cygwin FAQ on the <a href="https://cygwin.com/faq/faq.html#faq.using.bloda">&quot;BLODA&quot; list</a> and the section on <a href="https://cygwin.com/faq/faq.html#faq.using.fixing-fork-failures">fork() failures</a>.</p>
<h4 id="windows-subsystem-for-linux-wsl">Windows Subsystem for Linux (WSL)</h4>
<p>Windows 10 1809 or newer is supported due to a dependency on the wslpath utility and support for environment variable sharing through WSLENV. Version 1803 can work but intermittent build failures have been observed.</p>
<p>It's possible to build both Windows and Linux binaries from WSL. To build Windows binaries, you must use a Windows boot JDK (located in a Windows-accessible directory). To build Linux binaries, you must use a Linux boot JDK. The default behavior is to build for Windows. To build for Linux, pass <code>--build=x86_64-unknown-linux-gnu --openjdk-target=x86_64-unknown-linux-gnu</code> to <code>configure</code>.</p>
<p>It's possible to build both Windows and Linux binaries from WSL. To build Windows binaries, you must use a Windows boot JDK (located in a Windows-accessible directory). To build Linux binaries, you must use a Linux boot JDK. The default behavior is to build for Windows. To build for Linux, pass <code>--build=x86_64-unknown-linux-gnu --host=x86_64-unknown-linux-gnu</code> to <code>configure</code>.</p>
<p>If building Windows binaries, the source code must be located in a Windows- accessible directory. This is because Windows executables (such as Visual Studio and the boot JDK) must be able to access the source code. Also, the drive where the source is stored must be mounted as case-insensitive by changing either /etc/fstab or /etc/wsl.conf in WSL. Individual directories may be corrected using the fsutil tool in case the source was cloned before changing the mount options.</p>
<p>Note that while it's possible to build on WSL, testing is still not fully supported.</p>
<h3 id="macos">macOS</h3>
@@ -273,7 +271,7 @@
<tbody>
<tr class="odd">
<td style="text-align: left;">Linux</td>
<td style="text-align: left;">gcc 11.2.0</td>
<td style="text-align: left;">gcc 10.2.0</td>
</tr>
<tr class="even">
<td style="text-align: left;">macOS</td>
@@ -281,14 +279,14 @@
</tr>
<tr class="odd">
<td style="text-align: left;">Windows</td>
<td style="text-align: left;">Microsoft Visual Studio 2022 update 17.1.0</td>
<td style="text-align: left;">Microsoft Visual Studio 2019 update 16.7.2</td>
</tr>
</tbody>
</table>
<p>All compilers are expected to be able to compile to the C99 language standard, as some C99 features are used in the source code. Microsoft Visual Studio doesn't fully support C99 so in practice shared code is limited to using C99 features that it does support.</p>
<h3 id="gcc">gcc</h3>
<p>The minimum accepted version of gcc is 5.0. Older versions will generate a warning by <code>configure</code> and are unlikely to work.</p>
<p>The JDK is currently known to be able to compile with at least version 11.2 of gcc.</p>
<p>The JDK is currently known to be able to compile with at least version 10.2 of gcc.</p>
<p>In general, any version between these two should be usable.</p>
<h3 id="clang">clang</h3>
<p>The minimum accepted version of clang is 3.5. Older versions will not be accepted by <code>configure</code>.</p>
@@ -570,8 +568,7 @@ x86_64-linux-gnu-to-ppc64le-linux-gnu</code></pre>
<p>To be able to build, we need a &quot;Build JDK&quot;, which is a JDK built from the current sources (that is, the same as the end result of the entire build process), but able to run on the <em>build</em> system, and not the <em>target</em> system. (In contrast, the Boot JDK should be from an older release, e.g. JDK 8 when building JDK 9.)</p>
<p>The build process will create a minimal Build JDK for you, as part of building. To speed up the build, you can use <code>--with-build-jdk</code> to <code>configure</code> to point to a pre-built Build JDK. Please note that the build result is unpredictable, and can possibly break in subtle ways, if the Build JDK does not <strong>exactly</strong> match the current sources.</p>
<h3 id="specifying-the-target-platform">Specifying the Target Platform</h3>
<p>You <em>must</em> specify the target platform when cross-compiling. Doing so will also automatically turn the build into a cross-compiling mode. The simplest way to do this is to use the <code>--openjdk-target</code> argument, e.g. <code>--openjdk-target=arm-linux-gnueabihf</code>. or <code>--openjdk-target=aarch64-oe-linux</code>. This will automatically set the <code>--host</code> and <code>--target</code> options for autoconf, which can otherwise be confusing. (In autoconf terminology, the &quot;target&quot; is known as &quot;host&quot;, and &quot;target&quot; is used for building a Canadian cross-compiler.)</p>
<p>If <code>--build</code> has not been explicitly passed to configure, <code>--openjdk-target</code> will autodetect the build platform and internally set the flag automatically, otherwise the platform that was explicitly passed to <code>--build</code> will be used instead.</p>
<p>You <em>must</em> specify the target platform when cross-compiling. Doing so will also automatically turn the build into a cross-compiling mode. The simplest way to do this is to use the <code>--openjdk-target</code> argument, e.g. <code>--openjdk-target=arm-linux-gnueabihf</code>. or <code>--openjdk-target=aarch64-oe-linux</code>. This will automatically set the <code>--build</code>, <code>--host</code> and <code>--target</code> options for autoconf, which can otherwise be confusing. (In autoconf terminology, the &quot;target&quot; is known as &quot;host&quot;, and &quot;target&quot; is used for building a Canadian cross-compiler.)</p>
<h3 id="toolchain-considerations">Toolchain Considerations</h3>
<p>You will need two copies of your toolchain, one which generates output that can run on the target system (the normal, or <em>target</em>, toolchain), and one that generates output that can run on the build system (the <em>build</em> toolchain). Note that cross-compiling is only supported for gcc at the time being. The gcc standard is to prefix cross-compiling toolchains with the target denominator. If you follow this standard, <code>configure</code> is likely to pick up the toolchain correctly.</p>
<p>The <em>build</em> toolchain will be autodetected just the same way the normal <em>build</em>/<em>target</em> toolchain will be autodetected when not cross-compiling. If this is not what you want, or if the autodetection fails, you can specify a devkit containing the <em>build</em> toolchain using <code>--with-build-devkit</code> to <code>configure</code>, or by giving <code>BUILD_CC</code> and <code>BUILD_CXX</code> arguments.</p>
@@ -890,28 +887,30 @@ spawn failed</code></pre>
<p>If you need general help or advice about developing for the JDK, you can also contact the Adoption Group. See the section on <a href="#contributing-to-openjdk">Contributing to OpenJDK</a> for more information.</p>
<h2 id="reproducible-builds">Reproducible Builds</h2>
<p>Build reproducibility is the property of getting exactly the same bits out when building, every time, independent on who builds the product, or where. This is for many reasons a harder goal than it initially appears, but it is an important goal, for security reasons and others. Please see <a href="https://reproducible-builds.org">Reproducible Builds</a> for more information about the background and reasons for reproducible builds.</p>
<p>Currently, it is not possible to build OpenJDK fully reproducibly, but getting there is an ongoing effort.</p>
<p>An absolute prerequisite for building reproducible is to speficy a fixed build time, since time stamps are embedded in many file formats. This is done by setting the <code>SOURCE_DATE_EPOCH</code> environment variable, which is an <a href="https://reproducible-builds.org/docs/source-date-epoch/">industry standard</a>, that many tools, such as gcc, recognize, and use in place of the current time when generating output.</p>
<p>To generate reproducible builds, you must set <code>SOURCE_DATE_EPOCH</code> before running <code>configure</code>. The value in <code>SOURCE_DATE_EPOCH</code> will be stored in the configuration, and used by <code>make</code>. Setting <code>SOURCE_DATE_EPOCH</code> before running <code>make</code> will have no effect on the build.</p>
<p>You must also make sure your build does not rely on <code>configure</code>'s default adhoc version strings. Default adhoc version strings <code>OPT</code> segment include user name and source directory. You can either override just the <code>OPT</code> segment using <code>--with-version-opt=&lt;any fixed string&gt;</code>, or you can specify the entire version string using <code>--with-version-string=&lt;your version&gt;</code>.</p>
<p>This is a typical example of how to build the JDK in a reproducible way:</p>
<pre><code>export SOURCE_DATE_EPOCH=946684800
bash configure --with-version-opt=adhoc
make</code></pre>
<p>Note that regardless if you specify a source date for <code>configure</code> or not, the JDK build system will set <code>SOURCE_DATE_EPOCH</code> for all build tools when building. If <code>--with-source-date</code> has the value <code>updated</code> (which is the default unless <code>SOURCE_DATE_EPOCH</code> is found by in the environment by <code>configure</code>), the source date value will be determined at build time.</p>
<p>There are several aspects of reproducible builds that can be individually adjusted by <code>configure</code> arguments. If any of these are given, they will override the value derived from <code>SOURCE_DATE_EPOCH</code>. These arguments are:</p>
<p>Currently, it is not possible to build OpenJDK fully reproducibly, but getting there is an ongoing effort. There are some things you can do to minimize non-determinism and make a larger part of the build reproducible:</p>
<ul>
<li><p><code>--with-source-date</code></p>
<p>This option controls how the JDK build sets <code>SOURCE_DATE_EPOCH</code> when building. It can be set to a value describing a date, either an epoch based timestamp as an integer, or a valid ISO-8601 date.</p>
<p>It can also be set to one of the special values <code>current</code>, <code>updated</code> or <code>version</code>. <code>current</code> means that the time of running <code>configure</code> will be used. <code>version</code> will use the nominal release date for the current JDK version. <code>updated</code>, which means that <code>SOURCE_DATE_EPOCH</code> will be set to the current time each time you are running <code>make</code>. All choices, except for <code>updated</code>, will set a fixed value for the source date timestamp.</p>
<p>When <code>SOURCE_DATE_EPOCH</code> is set, the default value for <code>--with-source-date</code> will be the value given by <code>SOURCE_DATE_EPOCH</code>. Otherwise, the default value is <code>updated</code>.</p></li>
<li><p><code>--with-hotspot-build-time</code></p>
<p>This option controls the build time string that will be included in the hotspot library (<code>libjvm.so</code> or <code>jvm.dll</code>). When the source date is fixed (e.g. by setting <code>SOURCE_DATE_EPOCH</code>), the default value for <code>--with-hotspot-build-time</code> will be an ISO 8601 representation of that time stamp. Otherwise the default value will be the current time when building hotspot.</p></li>
<li><p><code>--with-copyright-year</code></p>
<p>This option controls the copyright year in some generated text files. When the source date is fixed (e.g. by setting <code>SOURCE_DATE_EPOCH</code>), the default value for <code>--with-copyright-year</code> will be the year of that time stamp. Otherwise the default is the current year at the time of running configure. This can be overridden by <code>--with-copyright-year=&lt;year&gt;</code>.</p></li>
<li><p><code>--enable-reproducible-build</code></p>
<p>This option controls some additional behavior needed to make the build reproducible. When the source date is fixed (e.g. by setting <code>SOURCE_DATE_EPOCH</code>), this flag will be turned on by default. Otherwise, the value is determined by heuristics. If it is explicitly turned off, the build might not be reproducible.</p></li>
<li>Turn on build system support for reproducible builds</li>
</ul>
<p>Add the flag <code>--enable-reproducible-build</code> to your <code>configure</code> command line. This will turn on support for reproducible builds where it could otherwise be lacking.</p>
<ul>
<li>Do not rely on <code>configure</code>'s default adhoc version strings</li>
</ul>
<p>Default adhoc version strings OPT segment include user name, source directory and timestamp. You can either override just the OPT segment using <code>--with-version-opt=&lt;any fixed string&gt;</code>, or you can specify the entire version string using <code>--with-version-string=&lt;your version&gt;</code>.</p>
<ul>
<li>Specify how the build sets <code>SOURCE_DATE_EPOCH</code></li>
</ul>
<p>The JDK build system will set the <code>SOURCE_DATE_EPOCH</code> environment variable during building, depending on the value of the <code>--with-source-date</code> option for <code>configure</code>. The default value is <code>updated</code>, which means that <code>SOURCE_DATE_EPOCH</code> will be set to the current time each time you are running <code>make</code>.</p>
<p>The <a href="https://reproducible-builds.org/docs/source-date-epoch/"><code>SOURCE_DATE_EPOCH</code> environment variable</a> is an industry standard, that many tools, such as gcc, recognize, and use in place of the current time when generating output.</p>
<p>For reproducible builds, you need to set this to a fixed value. You can use the special value <code>version</code> which will use the nominal release date for the current JDK version, or a value describing a date, either an epoch based timestamp as an integer, or a valid ISO-8601 date.</p>
<p><strong>Hint:</strong> If your build environment already sets <code>SOURCE_DATE_EPOCH</code>, you can propagate this using <code>--with-source-date=$SOURCE_DATE_EPOCH</code>.</p>
<ul>
<li>Specify a hotspot build time</li>
</ul>
<p>Set a fixed hotspot build time. This will be included in the hotspot library (<code>libjvm.so</code> or <code>jvm.dll</code>) and defaults to the current time when building hotspot. Use <code>--with-hotspot-build-time=&lt;any fixed string&gt;</code> for reproducible builds. It's a string so you don't need to format it specifically, so e.g. <code>n/a</code> will do. Another solution is to use the <code>SOURCE_DATE_EPOCH</code> variable, e.g. <code>--with-hotspot-build-time=$(date --date=@$SOURCE_DATE_EPOCH)</code>.</p>
<ul>
<li>Copyright year</li>
</ul>
<p>The copyright year in some generated text files are normally set to the current year. This can be overridden by <code>--with-copyright-year=&lt;year&gt;</code>. For fully reproducible builds, this needs to be set to a fixed value.</p>
<h2 id="hints-and-suggestions-for-advanced-users">Hints and Suggestions for Advanced Users</h2>
<h3 id="bash-completion">Bash Completion</h3>
<p>The <code>configure</code> and <code>make</code> commands tries to play nice with bash command-line completion (using <code>&lt;tab&gt;</code> or <code>&lt;tab&gt;&lt;tab&gt;</code>). To use this functionality, make sure you enable completion in your <code>~/.bashrc</code> (see instructions for bash in your operating system).</p>


@@ -135,14 +135,6 @@ space is required.
If you do not have access to sufficiently powerful hardware, it is also
possible to use [cross-compiling](#cross-compiling).
#### Branch Protection
In order to use Branch Protection features in the VM, `--enable-branch-protection`
must be used. This option requires C++ compiler support (GCC 9.1.0+ or Clang
10+). The resulting build can be run on both machines with and without support
for branch protection in hardware. Branch Protection is only supported for
Linux targets.
### Building on 32-bit arm
This is not recommended. Instead, see the section on [Cross-compiling](
@@ -244,8 +236,8 @@ It's possible to build both Windows and Linux binaries from WSL. To build
Windows binaries, you must use a Windows boot JDK (located in a
Windows-accessible directory). To build Linux binaries, you must use a Linux
boot JDK. The default behavior is to build for Windows. To build for Linux, pass
`--build=x86_64-unknown-linux-gnu --openjdk-target=x86_64-unknown-linux-gnu`
to `configure`.
`--build=x86_64-unknown-linux-gnu --host=x86_64-unknown-linux-gnu` to
`configure`.
If building Windows binaries, the source code must be located in a Windows-
accessible directory. This is because Windows executables (such as Visual Studio
@@ -329,9 +321,9 @@ issues.
Operating system Toolchain version
------------------ -------------------------------------------------------
Linux gcc 11.2.0
Linux gcc 10.2.0
macOS Apple Xcode 10.1 (using clang 10.0.0)
Windows Microsoft Visual Studio 2022 update 17.1.0
Windows Microsoft Visual Studio 2019 update 16.7.2
All compilers are expected to be able to compile to the C99 language standard,
as some C99 features are used in the source code. Microsoft Visual Studio
@@ -343,7 +335,7 @@ features that it does support.
The minimum accepted version of gcc is 5.0. Older versions will generate a warning
by `configure` and are unlikely to work.
The JDK is currently known to be able to compile with at least version 11.2 of
The JDK is currently known to be able to compile with at least version 10.2 of
gcc.
In general, any version between these two should be usable.
@@ -382,9 +374,9 @@ available for this update.
### Microsoft Visual Studio
For aarch64 machines running Windows the minimum accepted version is Visual Studio 2019
(16.8 or higher). For all other platforms the minimum accepted version of
Visual Studio is 2017. Older versions will not be accepted by `configure` and will
For aarch64 machines running Windows the minimum accepted version is Visual Studio 2019
(16.8 or higher). For all other platforms the minimum accepted version of
Visual Studio is 2017. Older versions will not be accepted by `configure` and will
not work. For all platforms the maximum accepted version of Visual Studio is 2022.
If you have multiple versions of Visual Studio installed, `configure` will by
@@ -986,16 +978,11 @@ You *must* specify the target platform when cross-compiling. Doing so will also
automatically turn the build into a cross-compiling mode. The simplest way to
do this is to use the `--openjdk-target` argument, e.g.
`--openjdk-target=arm-linux-gnueabihf`. or `--openjdk-target=aarch64-oe-linux`.
This will automatically set the `--host` and `--target` options for
This will automatically set the `--build`, `--host` and `--target` options for
autoconf, which can otherwise be confusing. (In autoconf terminology, the
"target" is known as "host", and "target" is used for building a Canadian
cross-compiler.)
If `--build` has not been explicitly passed to configure, `--openjdk-target`
will autodetect the build platform and internally set the flag automatically,
otherwise the platform that was explicitly passed to `--build` will be used
instead.
### Toolchain Considerations
You will need two copies of your toolchain, one which generates output that can
@@ -1527,85 +1514,57 @@ https://reproducible-builds.org) for more information about the background and
reasons for reproducible builds.
Currently, it is not possible to build OpenJDK fully reproducibly, but getting
there is an ongoing effort.
there is an ongoing effort. There are some things you can do to minimize
non-determinism and make a larger part of the build reproducible:
An absolute prerequisite for building reproducible is to speficy a fixed build
time, since time stamps are embedded in many file formats. This is done by
setting the `SOURCE_DATE_EPOCH` environment variable, which is an [industry
standard]( https://reproducible-builds.org/docs/source-date-epoch/), that many
tools, such as gcc, recognize, and use in place of the current time when
generating output.
* Turn on build system support for reproducible builds
To generate reproducible builds, you must set `SOURCE_DATE_EPOCH` before running
`configure`. The value in `SOURCE_DATE_EPOCH` will be stored in the
configuration, and used by `make`. Setting `SOURCE_DATE_EPOCH` before running
`make` will have no effect on the build.
Add the flag `--enable-reproducible-build` to your `configure` command line.
This will turn on support for reproducible builds where it could otherwise be
lacking.
You must also make sure your build does not rely on `configure`'s default adhoc
version strings. Default adhoc version strings `OPT` segment include user name
and source directory. You can either override just the `OPT` segment using
* Do not rely on `configure`'s default adhoc version strings
Default adhoc version strings OPT segment include user name, source directory
and timestamp. You can either override just the OPT segment using
`--with-version-opt=<any fixed string>`, or you can specify the entire version
string using `--with-version-string=<your version>`.
This is a typical example of how to build the JDK in a reproducible way:
* Specify how the build sets `SOURCE_DATE_EPOCH`
```
export SOURCE_DATE_EPOCH=946684800
bash configure --with-version-opt=adhoc
make
```
The JDK build system will set the `SOURCE_DATE_EPOCH` environment variable
during building, depending on the value of the `--with-source-date` option for
`configure`. The default value is `updated`, which means that
`SOURCE_DATE_EPOCH` will be set to the current time each time you are running
`make`.
Note that regardless if you specify a source date for `configure` or not, the
JDK build system will set `SOURCE_DATE_EPOCH` for all build tools when building.
If `--with-source-date` has the value `updated` (which is the default unless
`SOURCE_DATE_EPOCH` is found by in the environment by `configure`), the source
date value will be determined at build time.
The [`SOURCE_DATE_EPOCH` environment variable](
https://reproducible-builds.org/docs/source-date-epoch/) is an industry
standard, that many tools, such as gcc, recognize, and use in place of the
current time when generating output.
There are several aspects of reproducible builds that can be individually
adjusted by `configure` arguments. If any of these are given, they will override
the value derived from `SOURCE_DATE_EPOCH`. These arguments are:
For reproducible builds, you need to set this to a fixed value. You can use the
special value `version` which will use the nominal release date for the current
JDK version, or a value describing a date, either an epoch based timestamp as an
integer, or a valid ISO-8601 date.
* `--with-source-date`
**Hint:** If your build environment already sets `SOURCE_DATE_EPOCH`, you can
propagate this using `--with-source-date=$SOURCE_DATE_EPOCH`.
This option controls how the JDK build sets `SOURCE_DATE_EPOCH` when
building. It can be set to a value describing a date, either an epoch based
timestamp as an integer, or a valid ISO-8601 date.
* Specify a hotspot build time
It can also be set to one of the special values `current`, `updated` or
`version`. `current` means that the time of running `configure` will be
used. `version` will use the nominal release date for the current JDK
version. `updated`, which means that `SOURCE_DATE_EPOCH` will be set to the
current time each time you are running `make`. All choices, except for
`updated`, will set a fixed value for the source date timestamp.
Set a fixed hotspot build time. This will be included in the hotspot library
(`libjvm.so` or `jvm.dll`) and defaults to the current time when building
hotspot. Use `--with-hotspot-build-time=<any fixed string>` for reproducible
builds. It's a string so you don't need to format it specifically, so e.g. `n/a`
will do. Another solution is to use the `SOURCE_DATE_EPOCH` variable, e.g.
`--with-hotspot-build-time=$(date --date=@$SOURCE_DATE_EPOCH)`.
When `SOURCE_DATE_EPOCH` is set, the default value for `--with-source-date`
will be the value given by `SOURCE_DATE_EPOCH`. Otherwise, the default value
is `updated`.
* Copyright year
* `--with-hotspot-build-time`
This option controls the build time string that will be included in the
hotspot library (`libjvm.so` or `jvm.dll`). When the source date is fixed
(e.g. by setting `SOURCE_DATE_EPOCH`), the default value for
`--with-hotspot-build-time` will be an ISO 8601 representation of that time
stamp. Otherwise the default value will be the current time when building
hotspot.
* `--with-copyright-year`
This option controls the copyright year in some generated text files. When
the source date is fixed (e.g. by setting `SOURCE_DATE_EPOCH`), the default
value for `--with-copyright-year` will be the year of that time stamp.
Otherwise the default is the current year at the time of running configure.
This can be overridden by `--with-copyright-year=<year>`.
* `--enable-reproducible-build`
This option controls some additional behavior needed to make the build
reproducible. When the source date is fixed (e.g. by setting
`SOURCE_DATE_EPOCH`), this flag will be turned on by default. Otherwise, the
value is determined by heuristics. If it is explicitly turned off, the build
might not be reproducible.
The copyright year in some generated text files are normally set to the current
year. This can be overridden by `--with-copyright-year=<year>`. For fully
reproducible builds, this needs to be set to a fixed value.
## Hints and Suggestions for Advanced Users


@@ -51,7 +51,6 @@
<li><a href="#atomic">&lt;atomic&gt;</a></li>
<li><a href="#uniform-initialization">Uniform Initialization</a></li>
<li><a href="#local-function-objects">Local Function Objects</a></li>
<li><a href="#inheriting-constructors">Inheriting constructors</a></li>
<li><a href="#additional-permitted-features">Additional Permitted Features</a></li>
<li><a href="#excluded-features">Excluded Features</a></li>
<li><a href="#undecided-features">Undecided Features</a></li>
@@ -68,9 +67,7 @@
<h3 id="counterexamples-and-updates">Counterexamples and Updates</h3>
<p>Many of the guidelines mentioned here have (sometimes widespread) counterexamples in the HotSpot code base. Finding a counterexample is not sufficient justification for new code to follow the counterexample as a precedent, since readers of your code will rightfully expect your code to follow the greater bulk of precedents documented here.</p>
<p>Occasionally a guideline mentioned here may be just out of synch with the actual HotSpot code base. If you find that a guideline is consistently contradicted by a large number of counterexamples, please bring it up for discussion and possible change. The architectural rule, of course, is &quot;When in Rome do as the Romans&quot;. Sometimes in the suburbs of Rome the rules are a little different; these differences can be pointed out here.</p>
<p>Proposed changes should be discussed on the <a href="mailto:hotspot-dev@openjdk.java.net">HotSpot Developers</a> mailing list. Changes are likely to be cautious and incremental, since HotSpot coders have been using these guidelines for years.</p>
<p>Substantive changes are approved by <a href="https://www.rfc-editor.org/rfc/rfc7282.html">rough consensus</a> of the <a href="https://openjdk.java.net/census#hotspot">HotSpot Group</a> Members. The Group Lead determines whether consensus has been reached.</p>
<p>Editorial changes (changes that only affect the description of HotSpot style, not its substance) do not require the full consensus gathering process. The normal HotSpot pull request process may be used for editorial changes, with the additional requirement that the requisite reviewers are also HotSpot Group Members.</p>
<p>Proposed changes should be discussed on the <a href="mailto:hotspot-dev@openjdk.java.net">HotSpot Developers</a> mailing list, and approved by <a href="https://en.wikipedia.org/wiki/Rough_consensus">rough consensus</a> of the <a href="https://openjdk.java.net/census#hotspot">HotSpot Group</a> Members. The Group Lead determines whether consensus has been reached. Changes are likely to be cautious and incremental, since HotSpot coders have been using these guidelines for years.</p>
<h2 id="structure-and-formatting">Structure and Formatting</h2>
<h3 id="factoring-and-class-design">Factoring and Class Design</h3>
<ul>
@@ -154,7 +151,7 @@
<h3 id="whitespace">Whitespace</h3>
<ul>
<li><p>In general, don't change whitespace unless it improves readability or consistency. Gratuitous whitespace changes will make integrations and backports more difficult.</p></li>
<li><p>Use <a href="https://en.wikipedia.org/wiki/Indentation_style#Variant:_1TBS_(OTBS)">One-True-Brace-Style</a>. The opening brace for a function or class is normally at the end of the line; it is sometimes moved to the beginning of the next line for emphasis. Substatements are enclosed in braces, even if there is only a single statement. Extremely simple one-line statements may drop braces around a substatement.</p></li>
<li><p>Use One-True-Brace-Style. The opening brace for a function or class is normally at the end of the line; it is sometimes moved to the beginning of the next line for emphasis. Substatements are enclosed in braces, even if there is only a single statement. Extremely simple one-line statements may drop braces around a substatement.</p></li>
<li><p>Indentation levels are two columns.</p></li>
<li><p>There is no hard line length limit. That said, bear in mind that excessively long lines can cause difficulties. Some people like to have multiple side-by-side windows in their editors, and long lines may force them to choose among unpleasant options. They can use wide windows, reducing the number that can fit across the screen, and wasting a lot of screen real estate because most lines are not that long. Alternatively, they can have more windows across the screen, with long lines wrapping (or worse, requiring scrolling to see in their entirety), which is harder to read. Similar issues exist for side-by-side code reviews.</p></li>
<li><p>Tabs are not allowed in code. Set your editor accordingly.<br> (Emacs: <code>(setq-default indent-tabs-mode nil)</code>.)</p></li>
@@ -211,7 +208,7 @@ while ( test_foo(args...) ) { // No, excess spaces around control</code></pre></
<p>Rationale: Other than to implement exceptions (which HotSpot doesn't use), most potential uses of <a href="https://en.wikipedia.org/wiki/Run-time_type_information" title="Runtime Type Information">RTTI</a> are better done via virtual functions. Some of the remainder can be replaced by bespoke mechanisms. The cost of the additional runtime data structures needed to support <a href="https://en.wikipedia.org/wiki/Run-time_type_information" title="Runtime Type Information">RTTI</a> are deemed not worthwhile, given the alternatives.</p>
<h3 id="memory-allocation">Memory Allocation</h3>
<p>Do not use the standard global allocation and deallocation functions (operator new and related functions). Use of these functions by HotSpot code is disabled for some platforms.</p>
<p>Rationale: HotSpot often uses &quot;resource&quot; or &quot;arena&quot; allocation. Even where heap allocation is used, the standard global functions are avoided in favor of wrappers around malloc and free that support the VM's Native Memory Tracking (NMT) feature. Typically, uses of the global operator new are inadvertent and therefore often associated with memory leaks.</p>
<p>Rationale: HotSpot often uses &quot;resource&quot; or &quot;arena&quot; allocation. Even where heap allocation is used, the standard global functions are avoided in favor of wrappers around malloc and free that support the VM's Native Memory Tracking (NMT) feature.</p>
<p>Native memory allocation failures are often treated as non-recoverable. The place where &quot;out of memory&quot; is (first) detected may be an innocent bystander, unrelated to the actual culprit.</p>
<h3 id="class-inheritance">Class Inheritance</h3>
<p>Use public single inheritance.</p>
@@ -271,8 +268,8 @@ while ( test_foo(args...) ) { // No, excess spaces around control</code></pre></
<p>The underlying type of a <em>scoped-enum</em> should also be specified explicitly if conversions may be applied to values of that type.</p>
<p>Due to bugs in certain (very old) compilers, there is widespread use of enums and avoidance of in-class initialization of static integral constant members. Compilers having such bugs are no longer supported. Except where an enum is semantically appropriate, new code should use integral constants.</p>
<h3 id="thread_local">thread_local</h3>
<p>Avoid use of <code>thread_local</code> (<a href="http://www.open-std.org/jtc1/sc22/wg21/docs/papers/2008/n2659.htm">n2659</a>); and instead, use the HotSpot macro <code>THREAD_LOCAL</code>, for which the initializer must be a constant expression. When <code>thread_local</code> must be used, use the Hotspot macro <code>APPROVED_CPP_THREAD_LOCAL</code> to indicate that the use has been given appropriate consideration.</p>
<p>As was discussed in the review for <a href="https://mail.openjdk.java.net/pipermail/hotspot-dev/2019-September/039487.html">JDK-8230877</a>, <code>thread_local</code> allows dynamic initialization and destruction semantics. However, that support requires a run-time penalty for references to non-function-local <code>thread_local</code> variables defined in a different translation unit, even if they don't need dynamic initialization. Dynamic initialization and destruction of non-local <code>thread_local</code> variables also has the same ordering problems as for ordinary non-local variables. So we avoid use of <code>thread_local</code> in general, limiting its use to only those cases where dynamic initialization or destruction are essential. See <a href="https://bugs.openjdk.java.net/browse/JDK-8282469">JDK-8282469</a> for further discussion.</p>
<p>Do not use <code>thread_local</code> (<a href="http://www.open-std.org/jtc1/sc22/wg21/docs/papers/2008/n2659.htm">n2659</a>); instead, use the HotSpot macro <code>THREAD_LOCAL</code>. The initializer must be a constant expression.</p>
<p>As was discussed in the review for <a href="https://mail.openjdk.java.net/pipermail/hotspot-dev/2019-September/039487.html">JDK-8230877</a>, <code>thread_local</code> allows dynamic initialization and destruction semantics. However, that support requires a run-time penalty for references to non-function-local <code>thread_local</code> variables defined in a different translation unit, even if they don't need dynamic initialization. Dynamic initialization and destruction of namespace-scoped thread local variables also has the same ordering problems as for ordinary namespace-scoped variables.</p>
<h3 id="nullptr">nullptr</h3>
<p>Prefer <code>nullptr</code> (<a href="http://www.open-std.org/jtc1/sc22/wg21/docs/papers/2007/n2431.pdf">n2431</a>) to <code>NULL</code>. Don't use (constexpr or literal) 0 for pointers.</p>
<p>For historical reasons there are widespread uses of both <code>NULL</code> and of integer 0 as a pointer value.</p>
@@ -406,10 +403,6 @@ while ( test_foo(args...) ) { // No, excess spaces around control</code></pre></
<ul>
<li>Make () more optional for lambdas (<a href="http://www.open-std.org/jtc1/sc22/wg21/docs/papers/2020/p1102r2.html">p1102r2</a>)</li>
</ul>
<h3 id="inheriting-constructors">Inheriting constructors</h3>
<p>Do not use <em>inheriting constructors</em> (<a href="http://www.open-std.org/jtc1/sc22/wg21/docs/papers/2008/n2540.htm">n2540</a>).</p>
<p>C++11 provides simple syntax allowing a class to inherit the constructors of a base class. Unfortunately there are a number of problems with the original specification, and C++17 contains significant revisions (<a href="http://www.open-std.org/jtc1/sc22/wg21/docs/papers/2015/p0136r1.html" title="p0136r1">p0136r1</a> opens with a list of 8 Core Issues). Since HotSpot doesn't support use of C++17, use of inherited constructors could run into those problems. Such uses might also change behavior in a future HotSpot update to use C++17 or later, potentially in subtle ways that could lead to hard to diagnose problems. Because of this, HotSpot code must not use inherited constructors.</p>
<p>Note that gcc7 provides the <code>-fnew-inheriting-ctors</code> option to use the <a href="http://www.open-std.org/jtc1/sc22/wg21/docs/papers/2015/p0136r1.html" title="p0136r1">p0136r1</a> semantics. This is enabled by default when using C++17 or later. It is also enabled by default for <code>fabi-version=11</code> (introduced by gcc7) or higher when using C++11/14, as the change is considered a Defect Report that applies to those versions. Earlier versions of gcc don't have that option, and other supported compilers may not have anything similar.</p>
<h3 id="additional-permitted-features">Additional Permitted Features</h3>
<ul>
<li><p><code>constexpr</code> (<a href="http://www.open-std.org/jtc1/sc22/wg21/docs/papers/2007/n2235.pdf">n2235</a>) (<a href="https://isocpp.org/files/papers/N3652.html">n3652</a>)</p></li>
@@ -428,7 +421,6 @@ while ( test_foo(args...) ) { // No, excess spaces around control</code></pre></
<li><p><code>final</code> virtual specifiers for classes and virtual functions (<a href="http://www.open-std.org/JTC1/SC22/WG21/docs/papers/2009/n2928.htm">n2928</a>), (<a href="http://www.open-std.org/jtc1/sc22/wg21/docs/papers/2010/n3206.htm">n3206</a>), (<a href="http://www.open-std.org/jtc1/sc22/wg21/docs/papers/2011/n3272.htm">n3272</a>)</p></li>
<li><p><code>override</code> virtual specifiers for virtual functions (<a href="http://www.open-std.org/JTC1/SC22/WG21/docs/papers/2009/n2928.htm">n2928</a>), (<a href="http://www.open-std.org/jtc1/sc22/wg21/docs/papers/2010/n3206.htm">n3206</a>), (<a href="http://www.open-std.org/jtc1/sc22/wg21/docs/papers/2011/n3272.htm">n3272</a>)</p></li>
<li><p>Range-based <code>for</code> loops (<a href="http://www.open-std.org/JTC1/SC22/WG21/docs/papers/2009/n2930.html">n2930</a>) (<a href="https://en.cppreference.com/w/cpp/language/range-for">range-for</a>)</p></li>
<li><p>Unrestricted Unions (<a href="http://www.open-std.org/jtc1/sc22/wg21/docs/papers/2008/n2544.pdf">n2544</a>)</p></li>
</ul>
<h3 id="excluded-features">Excluded Features</h3>
<ul>
@@ -444,7 +436,7 @@ while ( test_foo(args...) ) { // No, excess spaces around control</code></pre></
<li><p>Inline namespaces (<a href="http://www.open-std.org/jtc1/sc22/wg21/docs/papers/2008/n2535.htm">n2535</a>) — HotSpot makes very limited use of namespaces.</p></li>
<li><p><code>using namespace</code> directives. In particular, don't use <code>using namespace std;</code> to avoid needing to qualify Standard Library names.</p></li>
<li><p>Propagating exceptions (<a href="http://www.open-std.org/jtc1/sc22/wg21/docs/papers/2007/n2179.html">n2179</a>) — HotSpot does not permit the use of exceptions, so this feature isn't useful.</p></li>
<li><p>Avoid non-local variables with non-constexpr initialization. In particular, avoid variables with types requiring non-trivial initialization or destruction. Initialization order problems can be difficult to deal with and lead to surprises, as can destruction ordering. HotSpot doesn't generally try to cleanup on exit, and running destructors at exit can also lead to problems.</p></li>
<li><p>Avoid namespace-scoped variables with non-constexpr initialization. In particular, avoid variables with types requiring non-trivial initialization or destruction. Initialization order problems can be difficult to deal with and lead to surprises, as can destruction ordering. HotSpot doesn't generally try to cleanup on exit, and running destructors at exit can also lead to problems.</p></li>
<li><p><code>[[deprecated]]</code> attribute (<a href="http://www.open-std.org/jtc1/sc22/wg21/docs/papers/2013/n3760.html">n3760</a>) — Not relevant in HotSpot code.</p></li>
<li><p>Avoid most operator overloading, preferring named functions. When operator overloading is used, ensure the semantics conform to the normal expected behavior of the operation.</p></li>
<li><p>Avoid most implicit conversion constructors and (implicit or explicit) conversion operators. (Note that conversion to <code>bool</code> isn't needed in HotSpot code because of the &quot;no implicit boolean&quot; guideline.)</p></li>

View File

@@ -56,19 +56,12 @@ can be pointed out here.
Proposed changes should be discussed on the
[HotSpot Developers](mailto:hotspot-dev@openjdk.java.net) mailing
list. Changes are likely to be cautious and incremental, since HotSpot
coders have been using these guidelines for years.
Substantive changes are approved by
[rough consensus](https://www.rfc-editor.org/rfc/rfc7282.html) of
list, and approved by
[rough consensus](https://en.wikipedia.org/wiki/Rough_consensus) of
the [HotSpot Group](https://openjdk.java.net/census#hotspot) Members.
The Group Lead determines whether consensus has been reached.
Editorial changes (changes that only affect the description of HotSpot
style, not its substance) do not require the full consensus gathering
process. The normal HotSpot pull request process may be used for
editorial changes, with the additional requirement that the requisite
reviewers are also HotSpot Group Members.
Changes are likely to be cautious and incremental, since HotSpot
coders have been using these guidelines for years.
## Structure and Formatting
@@ -294,9 +287,7 @@ well.
or consistency. Gratuitous whitespace changes will make integrations
and backports more difficult.
* Use [One-True-Brace-Style](
https://en.wikipedia.org/wiki/Indentation_style#Variant:_1TBS_(OTBS)).
The opening brace for a function or class
* Use One-True-Brace-Style. The opening brace for a function or class
is normally at the end of the line; it is sometimes moved to the
beginning of the next line for emphasis. Substatements are enclosed
in braces, even if there is only a single statement. Extremely simple
@@ -471,9 +462,7 @@ code is disabled for some platforms.
Rationale: HotSpot often uses "resource" or "arena" allocation. Even
where heap allocation is used, the standard global functions are
avoided in favor of wrappers around malloc and free that support the
VM's Native Memory Tracking (NMT) feature. Typically, uses of the global
operator new are inadvertent and therefore often associated with memory
leaks.
VM's Native Memory Tracking (NMT) feature.
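A rough sketch of the preferred shape (the wrapper and memory-flag names used
here are assumptions about the internal allocation API, shown only for
illustration):

```
static char* allocate_scratch_buffer(size_t size) {
  // Avoid: the global operator new/new[] bypasses NMT accounting.
  //   return new char[size];

  // Preferred shape: allocate through the VM's malloc wrapper so the
  // block is tagged for Native Memory Tracking.
  return (char*) os::malloc(size, mtInternal);
}
```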
Native memory allocation failures are often treated as non-recoverable.
The place where "out of memory" is (first) detected may be an innocent
@@ -633,7 +622,7 @@ Here are a few closely related example bugs:<br>
### enum
Where appropriate, _scoped-enums_ should be used.
([n2347](http://www.open-std.org/jtc1/sc22/wg21/docs/papers/2007/n2347.pdf))
([n2347](http://www.open-std.org/jtc1/sc22/wg21/docs/papers/2007/n2347.pdf))
Use of _unscoped-enums_ is permitted, though ordinary constants may be
preferable when the automatic initializer feature isn't used.
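For instance, a minimal sketch of the distinction (illustrative code, not
taken from the HotSpot sources):

```
#include <stdint.h>

// Scoped enum with an explicit underlying type, so conversions applied to
// its values are well defined.
enum class BarrierKind : uint8_t { none, pre, post };

// A plain integral constant is often preferable to an unscoped enum when
// the automatic-initializer feature isn't being used.
static const int max_barrier_kinds = 3;
```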
@@ -653,12 +642,10 @@ integral constants.
### thread_local
Avoid use of `thread_local`
Do not use `thread_local`
([n2659](http://www.open-std.org/jtc1/sc22/wg21/docs/papers/2008/n2659.htm));
and instead, use the HotSpot macro `THREAD_LOCAL`, for which the initializer must
be a constant expression. When `thread_local` must be used, use the Hotspot macro
`APPROVED_CPP_THREAD_LOCAL` to indicate that the use has been given appropriate
consideration.
instead, use the HotSpot macro `THREAD_LOCAL`. The initializer must
be a constant expression.
As was discussed in the review for
[JDK-8230877](https://mail.openjdk.java.net/pipermail/hotspot-dev/2019-September/039487.html),
@@ -667,18 +654,14 @@ semantics. However, that support requires a run-time penalty for
references to non-function-local `thread_local` variables defined in a
different translation unit, even if they don't need dynamic
initialization. Dynamic initialization and destruction of
non-local `thread_local` variables also has the same ordering
problems as for ordinary non-local variables. So we avoid use of
`thread_local` in general, limiting its use to only those cases where dynamic
initialization or destruction are essential. See
[JDK-8282469](https://bugs.openjdk.java.net/browse/JDK-8282469)
for further discussion.
namespace-scoped thread local variables also has the same ordering
problems as for ordinary namespace-scoped variables.
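A minimal sketch of the intended form (using the HotSpot `THREAD_LOCAL`
macro referred to above; the variable names are invented):

```
// Fine: the initializer is a constant expression, so no dynamic
// initialization or destruction is involved.
static THREAD_LOCAL int _nesting_depth = 0;

// Not acceptable in this form: a type with a non-trivial constructor
// would require dynamic initialization of the thread-local variable.
// static THREAD_LOCAL SomeStatefulObject _per_thread_state;
```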
### nullptr
Prefer `nullptr`
([n2431](http://www.open-std.org/jtc1/sc22/wg21/docs/papers/2007/n2431.pdf))
to `NULL`. Don't use (constexpr or literal) 0 for pointers.
to `NULL`. Don't use (constexpr or literal) 0 for pointers.
For historical reasons there are widespread uses of both `NULL` and of
integer 0 as a pointer value.
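For example (purely illustrative):

```
static void clear_slot(int** slot) {
  *slot = nullptr;   // preferred
  // *slot = NULL;   // legacy; avoid in new code
  // *slot = 0;      // don't use (constexpr or literal) 0 for a pointer
}
```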
@@ -947,7 +930,7 @@ References:
* Generalized lambda capture (init-capture) ([N3648])
* Generic (polymorphic) lambda expressions ([N3649])
[n2657]: http://www.open-std.org/jtc1/sc22/wg21/docs/papers/2008/n2657.htm
[n2657]: http://www.open-std.org/jtc1/sc22/wg21/docs/papers/2008/n2657.htm
[n2927]: http://www.open-std.org/jtc1/sc22/wg21/docs/papers/2009/n2927.pdf
[N3648]: https://isocpp.org/files/papers/N3648.html
[N3649]: https://isocpp.org/files/papers/N3649.html
@@ -985,34 +968,10 @@ References from C++23
[p1102r2]: http://www.open-std.org/jtc1/sc22/wg21/docs/papers/2020/p1102r2.html
### Inheriting constructors
Do not use _inheriting constructors_
([n2540](http://www.open-std.org/jtc1/sc22/wg21/docs/papers/2008/n2540.htm)).
C++11 provides simple syntax allowing a class to inherit the constructors of a
base class. Unfortunately there are a number of problems with the original
specification, and C++17 contains significant revisions ([p0136r1] opens with
a list of 8 Core Issues). Since HotSpot doesn't support use of C++17, use of
inherited constructors could run into those problems. Such uses might also
change behavior in a future HotSpot update to use C++17 or later, potentially
in subtle ways that could lead to hard to diagnose problems. Because of this,
HotSpot code must not use inherited constructors.
Note that gcc7 provides the `-fnew-inheriting-ctors` option to use the
[p0136r1] semantics. This is enabled by default when using C++17 or later.
It is also enabled by default for `fabi-version=11` (introduced by gcc7) or
higher when using C++11/14, as the change is considered a Defect Report that
applies to those versions. Earlier versions of gcc don't have that option,
and other supported compilers may not have anything similar.
[p0136r1]: http://www.open-std.org/jtc1/sc22/wg21/docs/papers/2015/p0136r1.html
"p0136r1"
### Additional Permitted Features
* `constexpr`
([n2235](http://www.open-std.org/jtc1/sc22/wg21/docs/papers/2007/n2235.pdf))
([n2235](http://www.open-std.org/jtc1/sc22/wg21/docs/papers/2007/n2235.pdf))
([n3652](https://isocpp.org/files/papers/N3652.html))
* Sized deallocation
@@ -1067,9 +1026,6 @@ and other supported compilers may not have anything similar.
([n2930](http://www.open-std.org/JTC1/SC22/WG21/docs/papers/2009/n2930.html))
([range-for](https://en.cppreference.com/w/cpp/language/range-for))
* Unrestricted Unions
([n2544](http://www.open-std.org/jtc1/sc22/wg21/docs/papers/2008/n2544.pdf))
### Excluded Features
* New string and character literals
@@ -1101,7 +1057,7 @@ namespace std;` to avoid needing to qualify Standard Library names.
([n2179](http://www.open-std.org/jtc1/sc22/wg21/docs/papers/2007/n2179.html)) &mdash;
HotSpot does not permit the use of exceptions, so this feature isn't useful.
* Avoid non-local variables with non-constexpr initialization.
* Avoid namespace-scoped variables with non-constexpr initialization.
In particular, avoid variables with types requiring non-trivial
initialization or destruction. Initialization order problems can be
difficult to deal with and lead to surprises, as can destruction
@@ -1122,14 +1078,14 @@ in HotSpot code because of the "no implicit boolean" guideline.)
* Avoid covariant return types.
* Avoid `goto` statements.
* Avoid `goto` statements.
### Undecided Features
This list is incomplete; it serves to explicitly call out some
features that have not yet been discussed.
* Trailing return type syntax for functions
* Trailing return type syntax for functions
([n2541](http://www.open-std.org/jtc1/sc22/wg21/docs/papers/2008/n2541.htm))
* Variable templates
@@ -1143,7 +1099,7 @@ features that have not yet been discussed.
* Rvalue references and move semantics
[ADL]: https://en.cppreference.com/w/cpp/language/adl
[ADL]: https://en.cppreference.com/w/cpp/language/adl
"Argument Dependent Lookup"
[ODR]: https://en.cppreference.com/w/cpp/language/definition

View File

@@ -1,230 +0,0 @@
#!/usr/bin/env python3
import argparse
import math
import os.path
import sys
import subprocess
def fatal(msg):
sys.stderr.write(f"[fatal] {msg}\n")
sys.exit(1)
def verbose(options, *msg):
if options.verbose:
sys.stdout.write(f"[verbose] ")
sys.stdout.write(*msg)
sys.stdout.write('\n')
def first_line(str):
return "" if not str else str.splitlines()[0]
class Options:
def __init__(self):
ap = argparse.ArgumentParser(description="Show bugfixes differences between JBR and OpenJDK git repos",
epilog="Example: %(prog)s --jdk ./jdk11u/ --jbr ./JetBrainsRuntime/ --path src/hotspot --limit 200")
ap.add_argument('--jdk', dest='jdkpath', help='path to OpenJDK git repo', required=True)
ap.add_argument('--jbr', dest='jbrpath', help='path to JBR git repo', required=True)
ap.add_argument('--path', dest='path', help='limit to changes in this path (relative to git root)')
ap.add_argument('--limit', dest='limit', help='limit to this many log entries in --jdk repo', type=int, default=-1)
ap.add_argument('-o', dest="output_dir", help="save patches to this directory (created if necessary)")
ap.add_argument('-v', dest='verbose', help="verbose output", default=False, action='store_true')
args = ap.parse_args()
if not os.path.isdir(args.jdkpath):
fatal(f"{args.jdkpath} not a directory")
if not os.path.isdir(args.jbrpath):
fatal(f"{args.jbrpath} not a directory")
if not git_is_available():
fatal("can't run git commands; make sure git is in PATH")
self.jdkpath = args.jdkpath
self.jbrpath = args.jbrpath
self.path = args.path
self.limit = args.limit
self.output_dir = args.output_dir
self.verbose = args.verbose
class GitRepo:
def __init__(self, rootpath):
self.rootpath = rootpath
def run_git_cmd(self, git_args):
args = ["git", "-C", self.rootpath]
args.extend(git_args)
# print(f"Runnig git cmd '{' '.join(args)}'")
p = subprocess.run(args, capture_output=True, text=True)
if p.returncode != 0:
fatal(f"git returned non-zero code in {self.rootpath} ({first_line(p.stderr)})")
return p.stdout
def save_git_cmd(self, fname, git_args):
args = ["git", "-C", self.rootpath]
args.extend(git_args)
# print(f"Runnig git cmd '{' '.join(args)}'")
with open(fname, "w") as stdout_file:
p = subprocess.run(args, stdout=stdout_file)
if p.returncode != 0:
fatal(f"git returned non-zero code in {self.rootpath} ({first_line(p.stderr)})")
def current_branch(self):
branch_name = self.run_git_cmd(["branch", "--show-current"]).strip()
return branch_name
def log(self, path=None, limit=None):
cmds = ["log", "--no-decorate"]
if limit:
cmds.extend(["-n", str(limit)])
if path:
cmds.append(path)
full_log = self.run_git_cmd(cmds)
return full_log
class Commit:
def __init__(self, lines):
self.sha = lines[0].split()[1]
self.message = ""
self.bugid = None
# Commit message starts after one blank line
read_message = False
for l in lines:
if read_message:
self.message += l + "\n"
if not read_message and l == "":
read_message = True
if self.message and self.message != "" and ":" in self.message:
maybe_bugid = self.message.split(":")[0].strip()
if 10 >= len(maybe_bugid) >= 4:
self.bugid = maybe_bugid
class History:
def __init__(self, log):
log_itr = iter(log.splitlines())
self.commits = []
commit_lines = []
for line in log_itr:
if line.startswith("commit ") and len(commit_lines) > 0:
commit = Commit(commit_lines)
self.commits.append(commit)
commit_lines = []
commit_lines.append(line)
if len(commit_lines) > 0:
commit = Commit(commit_lines)
self.commits.append(commit)
def contains(self, str):
return any(str in commit.message for commit in self.commits)
def size(self):
return len(self.commits)
def print_explanation(options, jdk, jbr):
verbose(options, f"Reading history from '{jdk.rootpath}' on branch '{jdk.current_branch()}'")
if options.path:
verbose(options, f"\t(only under '{options.path}')")
verbose(options, f"Searching for same fixes in '{jbr.rootpath}' on branch '{jbr.current_branch()}'")
def git_is_available():
p = None
try:
p = subprocess.run(["git", "--help"], stdout=subprocess.DEVNULL, stderr=subprocess.DEVNULL)
except:
pass
return p is not None and p.returncode == 0
def main():
check_python_min_requirements()
options = Options()
jdk = GitRepo(options.jdkpath)
jbr = GitRepo(options.jbrpath)
print_explanation(options, jdk, jbr)
commits_to_save = []
try:
jdk_log = jdk.log(options.path, options.limit)
jdk_history = History(jdk_log)
jbr_log = jbr.log(options.path)
jbr_history = History(jbr_log)
verbose(options, f"Read {jdk_history.size()} commits in JDK, {jbr_history.size()} in JBR")
for c in jdk_history.commits:
if c.bugid:
verbose(options, f"Looking for bugfix for {c.bugid}")
if not jbr_history.contains(c.bugid):
commits_to_save.append(c)
print(f"[note] Fix for {c.bugid} not found in JBR ({jbr.rootpath})")
print(f" commit {c.sha}")
print(f" {first_line(c.message).strip()}")
except KeyboardInterrupt:
fatal("Interrupted")
if len(commits_to_save) > 0 and options.output_dir:
print()
if not os.path.exists(options.output_dir):
verbose(options, f"Creating output directory {options.output_dir}")
os.makedirs(options.output_dir)
nzeroes = len(str(len(commits_to_save)))
for i, c in enumerate(reversed(commits_to_save)):
fname = os.path.join(options.output_dir, f"{str(i).zfill(nzeroes)}-{c.bugid}.patch")
print(f"[note] {c.bugid} saved as {fname}")
fname = os.path.abspath(fname)
jdk.save_git_cmd(fname, ["format-patch", "-1", c.sha, "--stdout"])
script_fname = os.path.join(options.output_dir, "apply.sh")
with open(script_fname, "w") as script_file:
print(apply_script_code.format(os.path.abspath(jbr.rootpath), os.path.abspath(options.output_dir)),
file=script_file)
print(f"[note] Execute 'bash {script_fname}' to apply patches to {jbr.rootpath}")
def check_python_min_requirements():
if sys.version_info < (3, 6):
fatal("Minimum version 3.6 is required to run this script")
apply_script_code = """
#!/bin/bash
GITROOT={0}
PATCHROOT={1}
cd $PATCHROOT || exit 1
PATCHES=$(find $PATCHROOT -name '*.patch' | sort -n)
for P in $PATCHES; do
git -C $GITROOT am $P
if [ $? != 0 ]; then
mv "$P" "$P.failed"
echo "[ERROR] Patch $P did not apply cleanly. Try applying it manually."
echo "[NOTE] Execute this script to apply the remaining patches."
exit 1
else
mv "$P" "$P.done"
fi
done
echo "[NOTE] Done applying patches; check $PATCHROOT for .patch and .patch.failed to see if all have been applied."
"""
if __name__ == '__main__':
main()

View File

@@ -1,13 +0,0 @@
# jetbrains/runtime:jbr15env
FROM centos:7
RUN yum -y install centos-release-scl
RUN yum -y install devtoolset-8
RUN yum -y install zip bzip2 unzip tar wget make autoconf automake libtool gcc gcc-c++ libstdc++-devel alsa-devel cups-devel xorg-x11-devel libjpeg62-devel giflib-devel freetype-devel file which libXtst-devel libXt-devel libXrender-devel alsa-lib-devel fontconfig-devel libXrandr-devel libXi-devel git
# Install Java 17
RUN wget https://cdn.azul.com/zulu/bin/zulu17.28.13-ca-jdk17.0.0-linux_x64.tar.gz \
-O - | tar xz -C /
RUN mv /zulu17.28.13-ca-jdk17.0.0-linux_x64 /jdk17.0.0
ENV PATH /opt/rh/devtoolset-8/root/usr/bin:$PATH
RUN mkdir .git
RUN git config user.email "teamcity@jetbrains.com"
RUN git config user.name "builduser"

View File

@@ -1,47 +0,0 @@
# NOTE: This Dockerfile is meant to be used from the mkdocker_aarch64.sh script.
# Pull a concrete version of Linux that does NOT receive updates after it's
# been created. This is so that the image is as stable as possible to make
# image creation reproducible.
# NB: this also means there may be no security-related fixes there, need to
# move the version to the next manually.
FROM arm64v8/ubuntu:focal-20211006
# Install the necessary build tools
RUN export DEBIAN_FRONTEND=noninteractive \
export DEBCONF_NONINTERACTIVE_SEEN=true && \
echo 'tzdata tzdata/Areas select Etc' | debconf-set-selections; \
echo 'tzdata tzdata/Zones/Etc select UTC' | debconf-set-selections; \
apt-get update -qy && \
apt-get install -qy \
autoconf \
build-essential \
bzip2 \
file \
g++-10=10.3.0-1ubuntu1~20.04 \
gcc-10=10.3.0-1ubuntu1~20.04 \
git \
libasound2-dev \
libcups2-dev \
libfontconfig1-dev \
libx11-dev \
libxext-dev \
libxrandr-dev \
libxrender-dev \
libxt-dev \
libxtst-dev \
make \
tar \
unzip \
zip && \
update-alternatives --install /usr/bin/gcc gcc /usr/bin/gcc-10 100 --slave /usr/bin/g++ g++ /usr/bin/g++-10 && \
apt-get clean -qy && \
rm -rf /var/lib/apt/lists/* /tmp/* /var/tmp/*
# Set up boot JDK for building
COPY boot_jdk.tar.gz /jdk17/
RUN cd /jdk17 && tar --strip-components=1 -xzf boot_jdk.tar.gz && rm /jdk17/boot_jdk.tar.gz
ENV BOOT_JDK=/jdk17
RUN git config --global user.email "teamcity@jetbrains.com" && \
git config --global user.name "builduser"

View File

@@ -1,22 +0,0 @@
# NOTE: This Dockerfile is meant to be used from the mkdocker_musl_aarch64.sh script.
# Pull a concrete version of Linux that does NOT receive updates after it's
# been created. This is so that the image is as stable as possible to make
# image creation reproducible.
# NB: this also means there may be no security-related fixes there, need to
# move the version to the next manually.
FROM arm64v8/alpine:3.12
# Install the necessary build tools
RUN apk --no-cache add --update bash grep tar zip bzip2 fontconfig build-base \
git libx11-dev libxext-dev libxrandr-dev libxrender-dev libxt-dev \
libxtst-dev autoconf freetype-dev cups-dev alsa-lib-dev file \
fontconfig fontconfig-dev linux-headers
# Set up boot JDK for building
COPY boot_jdk_musl_aarch64.tar.gz /jdk17/
RUN cd /jdk17 && tar --strip-components=1 -xzf boot_jdk_musl_aarch64.tar.gz && rm /jdk17/boot_jdk_musl_aarch64.tar.gz
ENV BOOT_JDK=/jdk17
RUN git config --global user.email "teamcity@jetbrains.com" && \
git config --global user.name "builduser"

View File

@@ -1,22 +0,0 @@
# NOTE: This Dockerfile is meant to be used from the mkdocker_musl_x64.sh script.
# Pull a concrete version of Linux that does NOT recieve updates after it's
# been created. This is so that the image is as stable as possible to make
# image creation reproducible.
# NB: this also means there may be no security-related fixes there, need to
# move the version to the next manually.
FROM alpine:3.5
# Install the necessary build tools
RUN apk --no-cache add --update bash grep tar zip bzip2 fontconfig build-base \
git libx11-dev libxext-dev libxrandr-dev libxrender-dev libxt-dev \
libxtst-dev autoconf freetype-dev cups-dev alsa-lib-dev file \
fontconfig fontconfig-dev linux-headers
# Set up boot JDK for building
COPY boot_jdk_musl_amd64.tar.gz /jdk17/
RUN cd /jdk17 && tar --strip-components=1 -xzf boot_jdk_musl_amd64.tar.gz && rm /jdk17/boot_jdk_musl_amd64.tar.gz
ENV BOOT_JDK=/jdk17
RUN git config --global user.email "teamcity@jetbrains.com" && \
git config --global user.name "builduser"

View File

@@ -1,7 +0,0 @@
FROM i386/ubuntu:xenial
RUN linux32 apt-get update && apt-get install -y --no-install-recommends apt-utils
COPY jbrsdk-11.0.5-b1 /jbrsdk-11.0.5-b1
RUN linux32 apt-get -y install file build-essential zip unzip curl libx11-dev libxext-dev \
libxrender-dev libxrandr-dev libxtst-dev libxt-dev libcups2-dev libasound2-data \
libpng12-0 libasound2 libfreetype6 libfontconfig1-dev libasound2-dev autoconf

View File

@@ -1,26 +0,0 @@
#!/bin/bash -x
# This script creates a Docker image suitable for building AArch64 variant
# of the JetBrains Runtime "dev" version.
BOOT_JDK_REMOTE_FILE=zulu17.30.15-ca-jdk17.0.1-linux_aarch64.tar.gz
BOOT_JDK_SHA=4d9c9116eb0cdd2d7fb220d6d27059f4bf1b7e95cc93d5512bd8ce3791af86c7
BOOT_JDK_LOCAL_FILE=boot_jdk.tar.gz
if [ ! -f $BOOT_JDK_LOCAL_FILE ]; then
# Obtain "boot JDK" from outside of the container.
wget -nc https://cdn.azul.com/zulu/bin/${BOOT_JDK_REMOTE_FILE} -O $BOOT_JDK_LOCAL_FILE
else
echo "boot JDK \"$BOOT_JDK_LOCAL_FILE\" present, skipping download"
fi
# Verify that what we've downloaded can be trusted.
sha256sum -c - <<EOF
$BOOT_JDK_SHA *$BOOT_JDK_LOCAL_FILE
EOF
docker build -t jbrdevenv_arm64v8 -f Dockerfile.aarch64 .
# NB: the resulting container can (and should) be used without the network
# connection (--network none) during build in order to reduce the chance
# of build contamination.

View File

@@ -1,26 +0,0 @@
#!/bin/bash -x
# This script creates a Docker image suitable for building musl AArch64 variant
# of the JetBrains Runtime version 17.
BOOT_JDK_REMOTE_FILE=zulu17.32.13-ca-jdk17.0.2-linux_musl_aarch64.tar.gz
BOOT_JDK_SHA=6b920559abafbe9bdef386a20ecf3a2f318bc1f0d8359eb1f95aee26606bbc70
BOOT_JDK_LOCAL_FILE=boot_jdk_musl_aarch64.tar.gz
if [ ! -f $BOOT_JDK_LOCAL_FILE ]; then
# Obtain "boot JDK" from outside of the container.
wget -nc https://cdn.azul.com/zulu/bin/${BOOT_JDK_REMOTE_FILE} -O $BOOT_JDK_LOCAL_FILE
else
echo "boot JDK \"$BOOT_JDK_LOCAL_FILE\" present, skipping download"
fi
# Verify that what we've downloaded can be trusted.
sha256sum -c - <<EOF
$BOOT_JDK_SHA *$BOOT_JDK_LOCAL_FILE
EOF
docker build -t jbr17buildenv -f Dockerfile.musl_aarch64 .
# NB: the resulting container can (and should) be used without the network
# connection (--network none) during build in order to reduce the chance
# of build contamination.

View File

@@ -1,26 +0,0 @@
#!/bin/bash -x
# This script creates a Docker image suitable for building musl-x64 variant
# of the JetBrains Runtime version 17.
BOOT_JDK_REMOTE_FILE=zulu17.32.13-ca-jdk17.0.2-linux_musl_x64.tar.gz
BOOT_JDK_SHA=bcc5342011bd9f3643372aadbdfa68d47463ff0d8621668a0bdf2910614d95c6
BOOT_JDK_LOCAL_FILE=boot_jdk_musl_amd64.tar.gz
if [ ! -f $BOOT_JDK_LOCAL_FILE ]; then
# Obtain "boot JDK" from outside of the container.
wget -nc https://cdn.azul.com/zulu/bin/${BOOT_JDK_REMOTE_FILE} -O $BOOT_JDK_LOCAL_FILE
else
echo "boot JDK \"$BOOT_JDK_LOCAL_FILE\" present, skipping download"
fi
# Verify that what we've downloaded can be trusted.
sha256sum -c - <<EOF
$BOOT_JDK_SHA *$BOOT_JDK_LOCAL_FILE
EOF
docker build -t jbr17buildenv -f Dockerfile.musl_x64 .
# NB: the resulting container can (and should) be used without the network
# connection (--network none) during build in order to reduce the chance
# of build contamination.

View File

@@ -1,9 +0,0 @@
<component name="CopyrightManager">
<copyright>
<option name="notice" value="Copyright &amp;#36;originalComment.match(&quot;Copyright (\d+)&quot;, 1, &quot;-&quot;)&amp;#36;today.year JetBrains s.r.o.&#10;DO NOT ALTER OR REMOVE COPYRIGHT NOTICES OR THIS FILE HEADER.&#10;&#10;This code is free software; you can redistribute it and/or modify it&#10;under the terms of the GNU General Public License version 2 only, as&#10;published by the Free Software Foundation.&#10;&#10;This code is distributed in the hope that it will be useful, but WITHOUT&#10;ANY WARRANTY; without even the implied warranty of MERCHANTABILITY or&#10;FITNESS FOR A PARTICULAR PURPOSE. See the GNU General Public License&#10;version 2 for more details (a copy is included in the LICENSE file that&#10;accompanied this code).&#10;&#10;You should have received a copy of the GNU General Public License version&#10;2 along with this work; if not, write to the Free Software Foundation,&#10;Inc., 51 Franklin St, Fifth Floor, Boston, MA 02110-1301 USA.&#10;&#10;Please contact Oracle, 500 Oracle Parkway, Redwood Shores, CA 94065 USA&#10;or visit www.oracle.com if you need additional information or have any&#10;questions." />
<option name="keyword" value="Copyright" />
<option name="allowReplaceKeyword" value="JetBrains" />
<option name="myName" value="JetBrains" />
<option name="myLocal" value="true" />
</copyright>
</component>

View File

@@ -1,3 +0,0 @@
<component name="CopyrightManager">
<settings default="JetBrains" />
</component>

View File

@@ -1 +0,0 @@
JetBrainsRuntime

View File

@@ -1,9 +0,0 @@
<component name="CopyrightManager">
<copyright>
<option name="notice" value="Copyright 2000-&amp;#36;today.year JetBrains s.r.o.&#10;&#10;Licensed under the Apache License, Version 2.0 (the &quot;License&quot;);&#10;you may not use this file except in compliance with the License.&#10;You may obtain a copy of the License at&#10;&#10;http://www.apache.org/licenses/LICENSE-2.0&#10;&#10;Unless required by applicable law or agreed to in writing, software&#10;distributed under the License is distributed on an &quot;AS IS&quot; BASIS,&#10;WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.&#10;See the License for the specific language governing permissions and&#10;limitations under the License." />
<option name="keyword" value="Copyright" />
<option name="allowReplaceKeyword" value="JetBrains" />
<option name="myName" value="JetBrains" />
<option name="myLocal" value="true" />
</copyright>
</component>

View File

@@ -1,3 +0,0 @@
<component name="CopyrightManager">
<settings default="JetBrains" />
</component>

View File

@@ -1,20 +0,0 @@
<?xml version="1.0" encoding="UTF-8"?>
<project version="4">
<component name="IssueNavigationConfiguration">
<option name="links">
<list>
<IssueNavigationLink>
<option name="issueRegexp" value="(?:^|\s|\p{Punct})([A-Z]+\-\d+)(?=$|\s|\p{Punct})" />
<option name="linkRegexp" value="https://youtrack.jetbrains.com/issue/$1" />
</IssueNavigationLink>
<IssueNavigationLink>
<option name="issueRegexp" value="(?:^|\s|\p{Punct})(?:JDK-)?(\d{7})(?=$|\s|\p{Punct})" />
<option name="linkRegexp" value="https://bugs.openjdk.java.net/browse/JDK-$1" />
</IssueNavigationLink>
</list>
</option>
</component>
<component name="VcsDirectoryMappings">
<mapping directory="$PROJECT_DIR$/../.." vcs="Git" />
</component>
</project>

View File

@@ -1,13 +0,0 @@
<?xml version="1.0" encoding="UTF-8"?>
<module type="JAVA_MODULE" version="4">
<component name="NewModuleRootManager" inherit-compiler-output="true">
<exclude-output />
<content url="file://$MODULE_DIR$/src/jetbrains.api">
<sourceFolder url="file://$MODULE_DIR$/src/jetbrains.api/src" isTestSource="false" />
<sourceFolder url="file://$MODULE_DIR$/src/jetbrains.api/templates" isTestSource="false" />
</content>
<orderEntry type="sourceFolder" forTests="false" />
<orderEntry type="inheritedJdk" />
</component>
</module>

View File

@@ -1,12 +0,0 @@
<?xml version="1.0" encoding="UTF-8"?>
<project version="4">
<component name="ProjectModuleManager">
<modules>
<module fileurl="file://$PROJECT_DIR$/.idea/jdk.iml" filepath="$PROJECT_DIR$/.idea/jdk.iml" />
###MODULE_IMLS###
<module fileurl="file://$PROJECT_DIR$/.idea/jetbrains.api.iml" filepath="$PROJECT_DIR$/.idea/jetbrains.api.iml" />
<module fileurl="file://$PROJECT_DIR$/.idea/test.iml" filepath="$PROJECT_DIR$/.idea/test.iml" />
</modules>
</component>
</project>

View File

@@ -1,20 +0,0 @@
<?xml version="1.0" encoding="UTF-8"?>
<project version="4">
<component name="IssueNavigationConfiguration">
<option name="links">
<list>
<IssueNavigationLink>
<option name="issueRegexp" value="(?:^|\s|\p{Punct})([A-Z]+\-\d+)(?=$|\s|\p{Punct})" />
<option name="linkRegexp" value="https://youtrack.jetbrains.com/issue/$1" />
</IssueNavigationLink>
<IssueNavigationLink>
<option name="issueRegexp" value="(?:^|\s|\p{Punct})(?:JDK-)?(\d{7})(?=$|\s|\p{Punct})" />
<option name="linkRegexp" value="https://bugs.openjdk.java.net/browse/JDK-$1" />
</IssueNavigationLink>
</list>
</option>
</component>
<component name="VcsDirectoryMappings">
<mapping directory="$PROJECT_DIR$" vcs="Git" />
</component>
</project>

View File

@@ -1,135 +0,0 @@
apply plugin: 'java'
import org.gradle.internal.os.OperatingSystem
repositories {
mavenCentral()
}
def test_jvm = {
if (project.hasProperty('jbsdkhome')) {
file(jbsdkhome + (OperatingSystem.current().isWindows()?"/bin/java.exe" : "/bin/java")).absolutePath
} else {
if (OperatingSystem.current().isMacOsX()) {
file('../../../build/macosx-x86_64-normal-server-release/images/jdk-bundle/jdk-11.0.4.jdk/Contents/Home/bin/java').absolutePath
} else if (OperatingSystem.current().isLinux()) {
file('../../../build/linux-x86_64-normal-server-release/images/jdk/bin/java').absolutePath
} else {
file('../../../build/windows-x86_64-normal-server-release/images/jdk/bin/java.exe').absolutePath
}
}
}
dependencies {
testCompile('junit:junit:4.12'){
exclude group: 'org.hamcrest'
}
testCompile 'org.hamcrest:hamcrest-library:1.3'
testCompile 'net.java.dev.jna:jna:4.4.0'
testCompile 'com.twelvemonkeys.imageio:imageio-tiff:3.3.2'
testCompile 'org.apache.commons:commons-lang3:3.0'
}
def jdk_modules = ["java.base", "java.logging", "java.prefs",
"java.se.ee", "java.sql", "java.datatransfer",
"java.management", "java.rmi", "java.security.jgss",
"java.sql.rowset", "java.desktop", "java.management.rmi",
"java.scripting", "java.security.sasl", "java.transaction",
"java.instrument", "java.naming", "java.se",
"java.smartcardio", "java.xml.crypto"]
def jdk_class_dirs = []
jdk_modules.collect(jdk_class_dirs) {
new File("../../../src/" + it + "/share/classes")
}
if (OperatingSystem.current().isMacOsX())
jdk_modules.collect(jdk_class_dirs) {
"../../../src/" + it + "/macosx/classes"
}
else if (OperatingSystem.current().isLinux()) {
jdk_modules.collect(jdk_class_dirs) {
"../../../src/" + it + "/solaris/classes"
}
jdk_modules.collect(jdk_class_dirs) {
"../../../src/" + it + "/unix/classes"
}
} else
jdk_modules.collect(jdk_class_dirs) {
"../../../src/" + it + "/windows/classes"
}
sourceSets.main.java.srcDirs = jdk_class_dirs
sourceSets {
test {
java {
srcDir "../../../test/jdk/jbu"
}
}
}
test.dependsOn.clear()
test.dependsOn tasks.compileTestJava
test {
systemProperty "jb.java2d.metal", "true"
systemProperty "testdata", file('../../../test/jdk/jbu/testdata').absolutePath
// Generate golden images for DroidFontTest and MixedTextTest
// systemProperty "gentestdata", ""
// Enable Java2D logging (https://confluence.jetbrains.com/display/JRE/Java2D+Rendering+Logging)
// systemProperty "sun.java2d.trace", "log"
// systemProperty "sun.java2d.trace", "log,pimpl"
outputs.upToDateWhen { false }
executable = test_jvm()
// Enable async/dtrace profiler
jvmArgs "-XX:+PreserveFramePointer"
// Enable native J2D logging (only in debug build)
// Can be turned on for J2D by adding "#define DEBUG 1" into jdk/src/share/native/sun/java2d/Trace.h
// environment 'J2D_TRACE_LEVEL', '4'
}
def buildDir = project.buildscript.sourceFile.parentFile.parentFile.parentFile.parentFile
def make_cmd = "make"
if (OperatingSystem.current().isWindows()) {
def cyg_make_cmd = new File("c:/cygwin64/bin/make.exe")
if (cyg_make_cmd.exists()) make_cmd = cyg_make_cmd.absolutePath
}
def test_run = false
task make_images {
doLast {
if (!test_run) {
def pb = new ProcessBuilder().command(make_cmd.toString(), "-C", buildDir.absolutePath, "images")
def proc = pb.redirectErrorStream(true).start()
proc.inputStream.eachLine { println it }
assert proc.waitFor() == 0
}
}
}
task make_clean {
doLast {
def pb = new ProcessBuilder().command(make_cmd.toString(), "-C", buildDir.absolutePath, "clean")
def proc = pb.redirectErrorStream(true).start()
proc.inputStream.eachLine { println it }
assert proc.waitFor() == 0
}
}
task run_test {
doLast {
test_run = true
}
}
tasks.cleanTest.dependsOn tasks.run_test
classes.dependsOn.clear()
classes.dependsOn tasks.make_images
tasks.cleanClasses.dependsOn tasks.make_clean

View File

@@ -1,54 +0,0 @@
java.base,
java.compiler,
java.datatransfer,
java.desktop,
java.instrument,
java.logging,
java.management,
java.management.rmi,
java.naming,
java.net.http,
java.prefs,
java.rmi,
java.scripting,
java.se,
java.security.jgss,
java.security.sasl,
java.smartcardio,
java.sql,
java.sql.rowset,
java.transaction.xa,
java.xml,
java.xml.crypto,
jdk.accessibility,
jdk.attach,
jdk.charsets,
jdk.compiler,
jdk.crypto.cryptoki,
jdk.crypto.ec,
jdk.dynalink,
jdk.httpserver,
jdk.internal.ed,
jdk.internal.le,
jdk.internal.vm.ci,
jdk.javadoc,
jdk.jdi,
jdk.jdwp.agent,
jdk.jfr,
jdk.jsobject,
jdk.localedata,
jdk.management,
jdk.management.agent,
jdk.management.jfr,
jdk.naming.dns,
jdk.naming.rmi,
jdk.net,
jdk.sctp,
jdk.security.auth,
jdk.security.jgss,
jdk.unsupported,
jdk.unsupported.desktop,
jdk.xml.dom,
jdk.zipfs,
jdk.hotspot.agent,
jdk.jcmd

View File

@@ -1,16 +0,0 @@
#!/bin/sh
# $1 - Boot JDK
# $2 - JBR part of API version
cd "`dirname "$0"`/../../../../.."
PWD="`pwd`"
CONF="$PWD/build/jbr-api.conf"
./configure --with-debug-level=release --with-boot-jdk=$1 || exit $?
make jbr-api CONF=release MAKEOVERRIDES= "JBR_API_CONF_FILE=$CONF" JBR_API_JBR_VERSION=$2 || exit $?
. $CONF || exit $?
echo "##teamcity[buildNumber '$VERSION']"
cp "$JAR" ./jbr-api-${VERSION}.jar || exit $?
cp "$SOURCES_JAR" ./jbr-api-${VERSION}-sources.jar || exit $?
echo "##teamcity[publishArtifacts '$PWD/jbr-api-${VERSION}.jar']"
echo "##teamcity[publishArtifacts '$PWD/jbr-api-${VERSION}-sources.jar']"

View File

@@ -1,138 +0,0 @@
#!/bin/bash -x
function do_maketest() {
if [ "${bundle_type: -1}" == "t" ]; then
echo ${bundle_type%?}
return 1
else
echo ${bundle_type}
return 0
fi
}
function getVersionProp() {
grep "^${1}" make/conf/version-numbers.conf | cut -d'=' -f2
}
while getopts ":i?" o; do
case "${o}" in
i)
i="incremental build"
INC_BUILD=1
;;
esac
done
shift $((OPTIND-1))
build_number=$1
bundle_type=$2
architecture=$3 # aarch64 or x64
bundle_type=$(do_maketest)
do_maketest=$?
tag_prefix="jdk-"
OPENJDK_TAG=$(git log --simplify-by-decoration --decorate=short --pretty=short | grep "$tag_prefix" | cut -d "(" -f2 | cut -d ")" -f1 | awk '{print $2}' | sort -t "-" -k 2 -g | tail -n 1)
VERSION_FEATURE=$(getVersionProp "DEFAULT_VERSION_FEATURE")
VERSION_INTERIM=$(getVersionProp "DEFAULT_VERSION_INTERIM")
VERSION_UPDATE=$(getVersionProp "DEFAULT_VERSION_UPDATE")
[[ $VERSION_UPDATE = 0 ]] && JBSDK_VERSION="$VERSION_FEATURE" || JBSDK_VERSION="${VERSION_FEATURE}.${VERSION_INTERIM}.${VERSION_UPDATE}"
echo "##teamcity[setParameter name='env.JBSDK_VERSION' value='${JBSDK_VERSION}']"
JDK_BUILD_NUMBER=${JDK_BUILD_NUMBER:=$(echo $OPENJDK_TAG | awk -F "-|[+]" '{print $3}')}
[ -z $JDK_BUILD_NUMBER ] && JDK_BUILD_NUMBER=1
echo "##teamcity[setParameter name='env.JDK_UPDATE_NUMBER' value='${JDK_BUILD_NUMBER}']"
VENDOR_NAME="JetBrains s.r.o."
VENDOR_VERSION_STRING="JBR-${JBSDK_VERSION}+${JDK_BUILD_NUMBER}-${build_number}"
[ -z "$bundle_type" ] || VENDOR_VERSION_STRING="${VENDOR_VERSION_STRING}-${bundle_type}"
do_reset_changes=0
do_reset_dcevm=0
HEAD_REVISION=0
OS_NAME=$(uname -s)
# Enable reproducible builds
TZ=UTC
export TZ
SOURCE_DATE_EPOCH="$(git log -1 --pretty=%ct)"
export SOURCE_DATE_EPOCH
case "$OS_NAME" in
Linux)
COPYRIGHT_YEAR="$(date --utc --date=@$SOURCE_DATE_EPOCH +%Y)"
BUILD_TIME="$(date --utc --date=@$SOURCE_DATE_EPOCH +%F)"
REPRODUCIBLE_TAR_OPTS="--mtime=@$SOURCE_DATE_EPOCH --owner=0 --group=0 --numeric-owner --pax-option=exthdr.name=%d/PaxHeaders/%f,delete=atime,delete=ctime"
;;
Darwin)
COPYRIGHT_YEAR="$(date -u -r $SOURCE_DATE_EPOCH +%Y)"
BUILD_TIME="$(date -u -r $SOURCE_DATE_EPOCH +%F)"
TOUCH_TIME="$(date -u -r $SOURCE_DATE_EPOCH +%Y%m%d%H%M.%S)"
REPRODUCIBLE_TAR_OPTS="--uid 0 --gid 0 --numeric-owner"
;;
*)
# TODO: Windows
;;
esac
REPRODUCIBLE_BUILD_OPTS="--enable-reproducible-build
--with-source-date=$SOURCE_DATE_EPOCH
--with-hotspot-build-time=$BUILD_TIME
--with-copyright-year=$COPYRIGHT_YEAR
--disable-absolute-paths-in-output
--with-build-user=builduser"
function do_exit() {
exit_code=$1
[ $do_reset_changes -eq 1 ] && git checkout HEAD jb/project/tools/common/modules.list src/java.desktop/share/classes/module-info.java
if [ $do_reset_dcevm -eq 1 ]; then
[ ! -z $HEAD_REVISION ] && git reset --hard $HEAD_REVISION
fi
exit "$exit_code"
}
function update_jsdk_mods() {
__jsdk=$1
__jcef_mods=$2
__orig_jsdk_mods=$3
__updated_jsdk_mods=$4
# re-create java.desktop.jmod with updated module-info.class
tmp=.java.desktop.$$.tmp
mkdir "$tmp" || exit $?
"$__jsdk"/bin/jmod extract --dir "$tmp" "$__orig_jsdk_mods"/java.desktop.jmod || exit $?
"$__jsdk"/bin/javac \
--patch-module java.desktop="$__orig_jsdk_mods"/java.desktop.jmod \
--module-path "$__jcef_mods" -d "$tmp"/classes src/java.desktop/share/classes/module-info.java || exit $?
"$__jsdk"/bin/jmod \
create --class-path "$tmp"/classes --config "$tmp"/conf --header-files "$tmp"/include --legal-notice "$tmp"/legal --libs "$tmp"/lib \
java.desktop.jmod || exit $?
mv java.desktop.jmod "$__updated_jsdk_mods" || exit $?
rm -rf "$tmp"
# re-create java.base.jmod with updated hashes
tmp=.java.base.$$.tmp
mkdir "$tmp" || exit $?
hash_modules=$("$JSDK"/bin/jmod describe "$__orig_jsdk_mods"/java.base.jmod | grep hashes | awk '{print $2}' | tr '\n' '|' | sed s/\|$//) || exit $?
"$__jsdk"/bin/jmod extract --dir "$tmp" "$__orig_jsdk_mods"/java.base.jmod || exit $?
rm "$__updated_jsdk_mods"/java.base.jmod || exit $? # temp exclude from path
"$__jsdk"/bin/jmod \
create --module-path "$__updated_jsdk_mods" --hash-modules "$hash_modules" \
--class-path "$tmp"/classes --cmds "$tmp"/bin --config "$tmp"/conf --header-files "$tmp"/include --legal-notice "$tmp"/legal --libs "$tmp"/lib \
java.base.jmod || exit $?
mv java.base.jmod "$__updated_jsdk_mods" || exit $?
rm -rf "$tmp"
}
function get_mods_list() {
__mods=$1
echo $(ls $__mods) | sed s/\.jmod/,/g | sed s/,$//g | sed s/' '//g
}
function copy_jmods() {
__mods_list=$1
__jmods_from=$2
__jmods_to=$3
mkdir -p $__jmods_to
echo "${__mods_list}," | while read -d, mod; do cp $__jmods_from/$mod.jmod $__jmods_to/; done
}

View File

@@ -1,159 +0,0 @@
#!/bin/bash -x
# The following parameters must be specified:
# build_number - specifies the number of JetBrainsRuntime build
# bundle_type - specifies the bundle to be built; possible values:
# <empty> or nomod - the release bundles without any additional modules (jcef)
# jcef - the release bundles with jcef
# fd - the fastdebug bundles which also include the jcef module
#
# This script makes test-image along with JDK images when bundle_type is set to "jcef".
# If the character 't' is added at the end of bundle_type then it also makes test-image along with JDK images.
#
# Environment variables:
# JDK_BUILD_NUMBER - specifies update release of OpenJDK build or the value of --with-version-build argument
# to configure
# By default JDK_BUILD_NUMBER is set to zero
# JCEF_PATH - specifies the path to the directory with JCEF binaries.
# By default JCEF binaries should be located in ./jcef_linux_aarch64
source jb/project/tools/common/scripts/common.sh
JCEF_PATH=${JCEF_PATH:=./jcef_linux_aarch64}
function do_configure {
sh configure \
$WITH_DEBUG_LEVEL \
--with-vendor-name="$VENDOR_NAME" \
--with-vendor-version-string="$VENDOR_VERSION_STRING" \
--with-jvm-features=shenandoahgc \
--with-version-pre= \
--with-version-build="$JDK_BUILD_NUMBER" \
--with-version-opt=b"$build_number" \
--with-boot-jdk="$BOOT_JDK" \
--enable-cds=yes \
$REPRODUCIBLE_BUILD_OPTS \
|| do_exit $?
}
function is_musl {
libc=$(ldd /bin/ls | grep 'musl' | head -1 | cut -d ' ' -f1)
if [ -z $libc ]; then
# This is not Musl, return 1 == false
return 1
fi
return 0
}
function create_image_bundle {
__bundle_name=$1
__arch_name=$2
__modules_path=$3
__modules=$4
libc_type_suffix=''
if is_musl; then libc_type_suffix='musl-' ; fi
[ "$bundle_type" == "fd" ] && [ "$__arch_name" == "$JBRSDK_BUNDLE" ] && __bundle_name=$__arch_name && fastdebug_infix="fastdebug-"
JBR=${__bundle_name}-${JBSDK_VERSION}-linux-${libc_type_suffix}aarch64-${fastdebug_infix}b${build_number}
echo Running jlink....
[ -d "$IMAGES_DIR"/"$__arch_name" ] && rm -rf "${IMAGES_DIR:?}"/"$__arch_name"
$JSDK/bin/jlink \
--module-path "$__modules_path" --no-man-pages --compress=2 \
--add-modules "$__modules" --output "$IMAGES_DIR"/"$__arch_name"
grep -v "^JAVA_VERSION" "$JSDK"/release | grep -v "^MODULES" >> "$IMAGES_DIR"/"$__arch_name"/release
if [ "$__arch_name" == "$JBRSDK_BUNDLE" ]; then
sed 's/JBR/JBRSDK/g' "$IMAGES_DIR"/"$__arch_name"/release > release
mv release "$IMAGES_DIR"/"$__arch_name"/release
cp $IMAGES_DIR/jdk/lib/src.zip "$IMAGES_DIR"/"$__arch_name"/lib
copy_jmods "$__modules" "$__modules_path" "$IMAGES_DIR"/"$__arch_name"/jmods
fi
# jmod does not preserve file permissions (JDK-8173610)
[ -f "$IMAGES_DIR"/"$__arch_name"/lib/jcef_helper ] && chmod a+x "$IMAGES_DIR"/"$__arch_name"/lib/jcef_helper
echo Creating "$JBR".tar.gz ...
(cd "$IMAGES_DIR" &&
find "$__arch_name" -print0 | LC_ALL=C sort -z | \
tar $REPRODUCIBLE_TAR_OPTS \
--no-recursion --null -T - -cf "$JBR".tar) || do_exit $?
mv "$IMAGES_DIR"/"$JBR".tar ./"$JBR".tar
[ -f "$JBR".tar.gz ] && rm "$JBR.tar.gz"
touch -c -d "@$SOURCE_DATE_EPOCH" "$JBR".tar
gzip "$JBR".tar || do_exit $?
rm -rf "${IMAGES_DIR:?}"/"$__arch_name"
}
WITH_DEBUG_LEVEL="--with-debug-level=release"
RELEASE_NAME=linux-aarch64-server-release
case "$bundle_type" in
"jcef")
do_reset_changes=1
do_maketest=1
;;
"dcevm")
HEAD_REVISION=$(git rev-parse HEAD)
git am jb/project/tools/patches/dcevm/*.patch || do_exit $?
do_reset_dcevm=0
do_reset_changes=0
;;
"nomod" | "")
bundle_type=""
;;
"fd")
do_reset_changes=1
WITH_DEBUG_LEVEL="--with-debug-level=fastdebug"
RELEASE_NAME=linux-aarch64-server-fastdebug
;;
esac
if [ -z "$INC_BUILD" ]; then
do_configure || do_exit $?
make clean CONF=$RELEASE_NAME || do_exit $?
fi
make images CONF=$RELEASE_NAME || do_exit $?
IMAGES_DIR=build/$RELEASE_NAME/images
JSDK=$IMAGES_DIR/jdk
JSDK_MODS_DIR=$IMAGES_DIR/jmods
JBRSDK_BUNDLE=jbrsdk
echo Fixing permissions
chmod -R a+r $JSDK
if [ "$bundle_type" == "jcef" ] || [ "$bundle_type" == "dcevm" ] || [ "$bundle_type" == "fd" ]; then
git apply -p0 < jb/project/tools/patches/add_jcef_module_aarch64.patch || do_exit $?
update_jsdk_mods $JSDK $JCEF_PATH/jmods $JSDK/jmods $JSDK_MODS_DIR || do_exit $?
cp $JCEF_PATH/jmods/* $JSDK_MODS_DIR # $JSDK/jmods is not changed
jbr_name_postfix="_${bundle_type}"
[ "$bundle_type" != "fd" ] && jbrsdk_name_postfix="_${bundle_type}"
fi
# create runtime image bundle
modules=$(xargs < jb/project/tools/common/modules.list | sed s/" "//g) || do_exit $?
create_image_bundle "jbr${jbr_name_postfix}" "jbr" $JSDK_MODS_DIR "$modules" || do_exit $?
# create sdk image bundle
modules=$(cat $JSDK/release | grep MODULES | sed s/MODULES=//g | sed s/' '/','/g | sed s/\"//g | sed s/\\n//g) || do_exit $?
if [ "$bundle_type" == "jcef" ] || [ "$bundle_type" == "dcevm" ] || [ "$bundle_type" == "fd" ] || [ "$bundle_type" == "$JBRSDK_BUNDLE" ]; then
modules=${modules},$(get_mods_list "$JCEF_PATH"/jmods)
fi
create_image_bundle "$JBRSDK_BUNDLE${jbr_name_postfix}" $JBRSDK_BUNDLE $JSDK_MODS_DIR "$modules" || do_exit $?
if [ $do_maketest -eq 1 ]; then
JBRSDK_TEST=${JBRSDK_BUNDLE}-${JBSDK_VERSION}-linux-${libc_type_suffix}test-aarch64-b${build_number}
echo Creating "$JBRSDK_TEST" ...
[ $do_reset_changes -eq 1 ] && git checkout HEAD jb/project/tools/common/modules.list src/java.desktop/share/classes/module-info.java
make test-image CONF=$RELEASE_NAME || do_exit $?
tar -pcf "$JBRSDK_TEST".tar -C $IMAGES_DIR --exclude='test/jdk/demos' test || do_exit $?
[ -f "$JBRSDK_TEST.tar.gz" ] && rm "$JBRSDK_TEST.tar.gz"
gzip "$JBRSDK_TEST".tar || do_exit $?
fi
do_exit 0

View File

@@ -1,159 +0,0 @@
#!/bin/bash -x
# The following parameters must be specified:
# build_number - specifies the number of JetBrainsRuntime build
# bundle_type - specifies the bundle to be built; possible values:
# <empty> or nomod - the release bundles without any additional modules (jcef)
# jcef - the release bundles with jcef
# fd - the fastdebug bundles which also include the jcef module
#
# This script makes test-image along with JDK images when bundle_type is set to "jcef".
# If the character 't' is added at the end of bundle_type then it also makes test-image along with JDK images.
#
# Environment variables:
# JDK_BUILD_NUMBER - specifies update release of OpenJDK build or the value of --with-version-build argument
# to configure
# By default JDK_BUILD_NUMBER is set to zero
# JCEF_PATH - specifies the path to the directory with JCEF binaries.
# By default JCEF binaries should be located in ./jcef_linux_x64
source jb/project/tools/common/scripts/common.sh
JCEF_PATH=${JCEF_PATH:=./jcef_linux_x64}
function do_configure {
sh configure \
$WITH_DEBUG_LEVEL \
--with-vendor-name="$VENDOR_NAME" \
--with-vendor-version-string="$VENDOR_VERSION_STRING" \
--with-jvm-features=shenandoahgc \
--with-version-pre= \
--with-version-build="$JDK_BUILD_NUMBER" \
--with-version-opt=b"$build_number" \
--with-boot-jdk="$BOOT_JDK" \
--enable-cds=yes \
$REPRODUCIBLE_BUILD_OPTS \
|| do_exit $?
}
function is_musl {
libc=$(ldd /bin/ls | grep 'musl' | head -1 | cut -d ' ' -f1)
if [ -z $libc ]; then
# This is not Musl, return 1 == false
return 1
fi
return 0
}
function create_image_bundle {
__bundle_name=$1
__arch_name=$2
__modules_path=$3
__modules=$4
libc_type_suffix=''
if is_musl; then libc_type_suffix='musl-' ; fi
[ "$bundle_type" == "fd" ] && [ "$__arch_name" == "$JBRSDK_BUNDLE" ] && __bundle_name=$__arch_name && fastdebug_infix="fastdebug-"
JBR=${__bundle_name}-${JBSDK_VERSION}-linux-${libc_type_suffix}x64-${fastdebug_infix}b${build_number}
echo Running jlink....
[ -d "$IMAGES_DIR"/"$__arch_name" ] && rm -rf "${IMAGES_DIR:?}"/"$__arch_name"
$JSDK/bin/jlink \
--module-path "$__modules_path" --no-man-pages --compress=2 \
--add-modules "$__modules" --output "$IMAGES_DIR"/"$__arch_name"
grep -v "^JAVA_VERSION" "$JSDK"/release | grep -v "^MODULES" >> "$IMAGES_DIR"/"$__arch_name"/release
if [ "$__arch_name" == "$JBRSDK_BUNDLE" ]; then
sed 's/JBR/JBRSDK/g' "$IMAGES_DIR"/"$__arch_name"/release > release
mv release "$IMAGES_DIR"/"$__arch_name"/release
cp $IMAGES_DIR/jdk/lib/src.zip "$IMAGES_DIR"/"$__arch_name"/lib
copy_jmods "$__modules" "$__modules_path" "$IMAGES_DIR"/"$__arch_name"/jmods
fi
# jmod does not preserve file permissions (JDK-8173610)
[ -f "$IMAGES_DIR"/"$__arch_name"/lib/jcef_helper ] && chmod a+x "$IMAGES_DIR"/"$__arch_name"/lib/jcef_helper
echo Creating "$JBR".tar.gz ...
(cd "$IMAGES_DIR" &&
find "$__arch_name" -print0 | LC_ALL=C sort -z | \
tar $REPRODUCIBLE_TAR_OPTS \
--no-recursion --null -T - -cf "$JBR".tar) || do_exit $?
mv "$IMAGES_DIR"/"$JBR".tar ./"$JBR".tar
[ -f "$JBR".tar.gz ] && rm "$JBR.tar.gz"
touch -c -d "@$SOURCE_DATE_EPOCH" "$JBR".tar
gzip "$JBR".tar || do_exit $?
rm -rf "${IMAGES_DIR:?}"/"$__arch_name"
}
WITH_DEBUG_LEVEL="--with-debug-level=release"
RELEASE_NAME=linux-x86_64-server-release
case "$bundle_type" in
"jcef")
do_reset_changes=1
do_maketest=1
;;
"dcevm")
HEAD_REVISION=$(git rev-parse HEAD)
git am jb/project/tools/patches/dcevm/*.patch || do_exit $?
do_reset_dcevm=0
do_reset_changes=0
;;
"nomod" | "")
bundle_type=""
;;
"fd")
do_reset_changes=1
WITH_DEBUG_LEVEL="--with-debug-level=fastdebug"
RELEASE_NAME=linux-x86_64-server-fastdebug
;;
esac
if [ -z "$INC_BUILD" ]; then
do_configure || do_exit $?
make clean CONF=$RELEASE_NAME || do_exit $?
fi
make images CONF=$RELEASE_NAME || do_exit $?
IMAGES_DIR=build/$RELEASE_NAME/images
JSDK=$IMAGES_DIR/jdk
JSDK_MODS_DIR=$IMAGES_DIR/jmods
JBRSDK_BUNDLE=jbrsdk
echo Fixing permissions
chmod -R a+r $JSDK
if [ "$bundle_type" == "jcef" ] || [ "$bundle_type" == "dcevm" ] || [ "$bundle_type" == "fd" ]; then
git apply -p0 < jb/project/tools/patches/add_jcef_module.patch || do_exit $?
update_jsdk_mods $JSDK $JCEF_PATH/jmods $JSDK/jmods $JSDK_MODS_DIR || do_exit $?
cp $JCEF_PATH/jmods/* $JSDK_MODS_DIR # $JSDK/jmods is not changed
jbr_name_postfix="_${bundle_type}"
[ "$bundle_type" != "fd" ] && jbrsdk_name_postfix="_${bundle_type}"
fi
# create runtime image bundle
modules=$(xargs < jb/project/tools/common/modules.list | sed s/" "//g) || do_exit $?
create_image_bundle "jbr${jbr_name_postfix}" "jbr" $JSDK_MODS_DIR "$modules" || do_exit $?
# create sdk image bundle
modules=$(cat $JSDK/release | grep MODULES | sed s/MODULES=//g | sed s/' '/','/g | sed s/\"//g | sed s/\\n//g) || do_exit $?
if [ "$bundle_type" == "jcef" ] || [ "$bundle_type" == "dcevm" ] || [ "$bundle_type" == "fd" ] || [ "$bundle_type" == "$JBRSDK_BUNDLE" ]; then
modules=${modules},$(get_mods_list "$JCEF_PATH"/jmods)
fi
create_image_bundle "$JBRSDK_BUNDLE${jbr_name_postfix}" $JBRSDK_BUNDLE $JSDK_MODS_DIR "$modules" || do_exit $?
if [ $do_maketest -eq 1 ]; then
JBRSDK_TEST=${JBRSDK_BUNDLE}-${JBSDK_VERSION}-linux-${libc_type_suffix}test-x64-b${build_number}
echo Creating "$JBRSDK_TEST" ...
[ $do_reset_changes -eq 1 ] && git checkout HEAD jb/project/tools/common/modules.list src/java.desktop/share/classes/module-info.java
make test-image CONF=$RELEASE_NAME || do_exit $?
tar -pcf "$JBRSDK_TEST".tar -C $IMAGES_DIR --exclude='test/jdk/demos' test || do_exit $?
[ -f "$JBRSDK_TEST.tar.gz" ] && rm "$JBRSDK_TEST.tar.gz"
gzip "$JBRSDK_TEST".tar || do_exit $?
fi
do_exit 0

View File

@@ -1,80 +0,0 @@
#!/bin/bash -x
# The following parameters must be specified:
# JBSDK_VERSION - specifies the current version of OpenJDK e.g. 11_0_6
# JDK_BUILD_NUMBER - specifies the number of OpenJDK build or the value of --with-version-build argument to configure
# build_number - specifies the number of JetBrainsRuntime build
#
# jbrsdk-${JBSDK_VERSION}-linux-x86-b${build_number}.tar.gz
# jbr-${JBSDK_VERSION}-linux-x86-b${build_number}.tar.gz
#
# $ ./java --version
# openjdk 11.0.6 2020-01-14
# OpenJDK Runtime Environment (build 11.0.6+${JDK_BUILD_NUMBER}-b${build_number})
# OpenJDK 64-Bit Server VM (build 11.0.6+${JDK_BUILD_NUMBER}-b${build_number}, mixed mode)
#
JBSDK_VERSION=$1
JDK_BUILD_NUMBER=$2
build_number=$3
JBSDK_VERSION_WITH_DOTS=$(echo $JBSDK_VERSION | sed 's/_/\./g')
source jb/project/tools/common/scripts/common.sh
JBRSDK_BASE_NAME=jbrsdk-${JBSDK_VERSION}
[ -z "$bundle_type" ] && (git apply -p0 < jb/project/tools/patches/exclude_jcef_module.patch || exit $?)
linux32 bash configure \
--with-debug-level=release \
--with-vendor-name="${VENDOR_NAME}" \
--with-vendor-version-string="${VENDOR_VERSION_STRING}" \
--with-version-pre= \
--with-version-build=$JDK_BUILD_NUMBER \
--with-version-opt=b${build_number} \
--with-boot-jdk=${BOOT_JDK} \
--enable-cds=yes || exit $?
make clean CONF=linux-x86-server-release || exit $?
make images CONF=linux-x86-server-release test-image || exit $?
JBSDK=${JBRSDK_BASE_NAME}-linux-x86-b${build_number}
BASE_DIR=build/linux-x86-server-release/images
JSDK=${BASE_DIR}/jdk
JBRSDK_BUNDLE=jbrsdk
echo Fixing permissions
chmod -R a+r $JSDK
rm -rf $BASE_DIR/$JBRSDK_BUNDLE
cp -r $JSDK $BASE_DIR/$JBRSDK_BUNDLE || exit $?
echo Creating $JBSDK.tar.gz ...
sed 's/JBR/JBRSDK/g' ${BASE_DIR}/${JBRSDK_BUNDLE}/release > release
mv release ${BASE_DIR}/${JBRSDK_BUNDLE}/release
tar -pcf $JBSDK.tar --exclude=*.debuginfo --exclude=demo --exclude=sample --exclude=man -C $BASE_DIR ${JBRSDK_BUNDLE} || exit $?
gzip $JBSDK.tar || exit $?
JBR_BUNDLE=jbr
JBR_BASE_NAME=jbr-$JBSDK_VERSION
rm -rf $BASE_DIR/$JBR_BUNDLE
JBR=$JBR_BASE_NAME-linux-x86-b$build_number
grep -v javafx jb/project/tools/common/modules.list | grep -v "jdk.internal.vm\|jdk.aot\|jcef" > modules.list.x86
echo Running jlink....
${JSDK}/bin/jlink \
--module-path ${JSDK}/jmods --no-man-pages --compress=2 \
--add-modules $(xargs < modules.list.x86 | sed s/" "//g | sed s/,$//g) --output ${BASE_DIR}/${JBR_BUNDLE} || exit $?
echo Modifying release info ...
grep -v \"^JAVA_VERSION\" ${JSDK}/release | grep -v \"^MODULES\" >> ${BASE_DIR}/${JBR_BUNDLE}/release
echo Creating $JBR.tar.gz ...
tar -pcf $JBR.tar -C $BASE_DIR $JBR_BUNDLE || exit $?
gzip $JBR.tar || exit $?
JBRSDK_TEST=$JBRSDK_BASE_NAME-linux-test-x86-b$build_number
echo Creating $JBRSDK_TEST.tar.gz ...
tar -pcf $JBRSDK_TEST.tar -C $BASE_DIR --exclude='test/jdk/demos' --exclude='test/hotspot/gtest' test || exit $?
gzip $JBRSDK_TEST.tar || exit $?

View File

@@ -1,16 +0,0 @@
<?xml version="1.0" encoding="UTF-8"?>
<!DOCTYPE plist PUBLIC "-//Apple//DTD PLIST 1.0//EN" "http://www.apple.com/DTDs/PropertyList-1.0.dtd">
<plist version="1.0">
<dict>
<key>com.apple.security.cs.allow-jit</key>
<true/>
<key>com.apple.security.cs.allow-unsigned-executable-memory</key>
<true/>
<key>com.apple.security.cs.allow-dyld-environment-variables</key>
<true/>
<key>com.apple.security.cs.disable-library-validation</key>
<true/>
<key>com.apple.security.cs.disable-executable-page-protection</key>
<true/>
</dict>
</plist>

View File

@@ -1,178 +0,0 @@
#!/bin/bash -x
# The following parameters must be specified:
# build_number - specifies the JetBrainsRuntime build number
# bundle_type - specifies the bundle to be built; possible values:
# <empty> or nomod - the release bundles without any additional modules (jcef)
# jcef - the release bundles with jcef
# fd - the fastdebug bundles which also include the jcef module
#
# This script makes the test-image along with the JDK images when bundle_type is set to "jcef".
# If the character 't' is appended to bundle_type, the test-image is also built along with the JDK images.
#
# Environment variables:
# JDK_BUILD_NUMBER - specifies the update release of the OpenJDK build, i.e. the value of the
#                    --with-version-build argument to configure.
#                    By default JDK_BUILD_NUMBER is set to zero.
# JCEF_PATH - specifies the path to the directory with JCEF binaries.
# By default JCEF binaries should be located in ./jcef_mac
source jb/project/tools/common/scripts/common.sh
JCEF_PATH=${JCEF_PATH:=./jcef_mac}
architecture=${architecture:=x64}
BOOT_JDK=${BOOT_JDK:=$(/usr/libexec/java_home -v 17)}
function do_configure {
if [[ "${architecture}" == *aarch64* ]]; then
sh configure \
$WITH_DEBUG_LEVEL \
--with-vendor-name="${VENDOR_NAME}" \
--with-vendor-version-string="${VENDOR_VERSION_STRING}" \
--with-macosx-bundle-name-base=${VENDOR_VERSION_STRING} \
--with-macosx-bundle-id-base="com.jetbrains.jbr" \
--with-jvm-features=shenandoahgc \
--with-version-pre= \
--with-version-build="${JDK_BUILD_NUMBER}" \
--with-version-opt=b"${build_number}" \
--with-boot-jdk="$BOOT_JDK" \
--disable-hotspot-gtest --disable-javac-server --disable-full-docs --disable-manpages \
--enable-cds=no \
--with-extra-cflags="-F$(pwd)/Frameworks" \
--with-extra-cxxflags="-F$(pwd)/Frameworks" \
--with-extra-ldflags="-F$(pwd)/Frameworks" \
$REPRODUCIBLE_BUILD_OPTS \
|| do_exit $?
else
sh configure \
$WITH_DEBUG_LEVEL \
--with-vendor-name="$VENDOR_NAME" \
--with-vendor-version-string="$VENDOR_VERSION_STRING" \
--with-macosx-bundle-name-base=${VENDOR_VERSION_STRING} \
--with-macosx-bundle-id-base="com.jetbrains.jbr" \
--with-jvm-features=shenandoahgc \
--with-version-pre= \
--with-version-build="$JDK_BUILD_NUMBER" \
--with-version-opt=b"$build_number" \
--with-boot-jdk="$BOOT_JDK" \
--enable-cds=yes \
$REPRODUCIBLE_BUILD_OPTS \
|| do_exit $?
fi
}
function create_image_bundle {
__bundle_name=$1
__arch_name=$2
__modules_path=$3
__modules=$4
tmp=.bundle.$$.tmp
mkdir "$tmp" || do_exit $?
[ "$bundle_type" == "fd" ] && [ "$__arch_name" == "$JBRSDK_BUNDLE" ] && __bundle_name=$__arch_name && fastdebug_infix="fastdebug-"
JBR=${__bundle_name}-${JBSDK_VERSION}-osx-${architecture}-${fastdebug_infix}b${build_number}
JRE_CONTENTS=$tmp/$__arch_name/Contents
mkdir -p "$JRE_CONTENTS" || do_exit $?
echo Running jlink...
"$JSDK"/bin/jlink \
--module-path "$__modules_path" --no-man-pages --compress=2 \
--add-modules "$__modules" --output "$JRE_CONTENTS/Home" || do_exit $?
grep -v "^JAVA_VERSION" "$JSDK"/release | grep -v "^MODULES" >> "$JRE_CONTENTS/Home/release"
if [ "$__arch_name" == "$JBRSDK_BUNDLE" ]; then
sed 's/JBR/JBRSDK/g' $JRE_CONTENTS/Home/release > release
mv release $JRE_CONTENTS/Home/release
cp $IMAGES_DIR/jdk-bundle/jdk-$JBSDK_VERSION.jdk/Contents/Home/lib/src.zip $JRE_CONTENTS/Home/lib
copy_jmods "$__modules" "$__modules_path" "$JRE_CONTENTS"/Home/jmods
fi
cp -R "$JSDK"/../MacOS "$JRE_CONTENTS"
cp "$JSDK"/../Info.plist "$JRE_CONTENTS"
[ -n "$bundle_type" ] && (cp -a $JCEF_PATH/Frameworks "$JRE_CONTENTS" || do_exit $?)
echo Creating "$JBR".tar.gz ...
# Normalize timestamp
find "$tmp"/"$__arch_name" -print0 | xargs -0 touch -c -h -t "$TOUCH_TIME"
(cd "$tmp" &&
find "$__arch_name" -print0 | LC_ALL=C sort -z | \
COPYFILE_DISABLE=1 tar $REPRODUCIBLE_TAR_OPTS --no-recursion --null -T - \
-czf "$JBR".tar.gz --exclude='*.dSYM' --exclude='man') || do_exit $?
mv "$tmp"/"$JBR".tar.gz "$JBR".tar.gz
rm -rf "$tmp"
}
WITH_DEBUG_LEVEL="--with-debug-level=release"
CONF_ARCHITECTURE=x86_64
if [[ "${architecture}" == *aarch64* ]]; then
CONF_ARCHITECTURE=aarch64
fi
RELEASE_NAME=macosx-${CONF_ARCHITECTURE}-server-release
case "$bundle_type" in
"jcef")
do_reset_changes=1
do_maketest=1
;;
"dcevm")
HEAD_REVISION=$(git rev-parse HEAD)
git am jb/project/tools/patches/dcevm/*.patch || do_exit $?
do_reset_dcevm=0
do_reset_changes=0
;;
"nomod" | "")
bundle_type=""
;;
"fd")
do_reset_changes=1
WITH_DEBUG_LEVEL="--with-debug-level=fastdebug"
RELEASE_NAME=macosx-${CONF_ARCHITECTURE}-server-fastdebug
JBSDK=macosx-${architecture}-server-release
;;
esac
if [ -z "$INC_BUILD" ]; then
do_configure || do_exit $?
make clean CONF=$RELEASE_NAME || do_exit $?
fi
make images CONF=$RELEASE_NAME || do_exit $?
IMAGES_DIR=build/$RELEASE_NAME/images
JSDK=$IMAGES_DIR/jdk-bundle/jdk-$JBSDK_VERSION.jdk/Contents/Home
JSDK_MODS_DIR=$IMAGES_DIR/jmods
JBRSDK_BUNDLE=jbrsdk
if [ "$bundle_type" == "jcef" ] || [ "$bundle_type" == "dcevm" ] || [ "$bundle_type" == "fd" ]; then
git apply -p0 < jb/project/tools/patches/add_jcef_module.patch || do_exit $?
update_jsdk_mods "$JSDK" "$JCEF_PATH"/jmods "$JSDK"/jmods "$JSDK_MODS_DIR" || do_exit $?
cp $JCEF_PATH/jmods/* $JSDK_MODS_DIR # $JSDK/jmods is not changed
jbr_name_postfix="_${bundle_type}"
fi
# create runtime image bundle
modules=$(xargs < jb/project/tools/common/modules.list | sed s/" "//g) || do_exit $?
create_image_bundle "jbr${jbr_name_postfix}" "jbr" $JSDK_MODS_DIR "$modules" || do_exit $?
# create sdk image bundle
modules=$(cat "$JSDK"/release | grep MODULES | sed s/MODULES=//g | sed s/' '/','/g | sed s/\"//g | sed s/\\n//g) || do_exit $?
if [ "$bundle_type" == "jcef" ] || [ "$bundle_type" == "dcevm" ] || [ "$bundle_type" == "fd" ] || [ "$bundle_type" == "$JBRSDK_BUNDLE" ]; then
modules=${modules},$(get_mods_list "$JCEF_PATH"/jmods)
fi
create_image_bundle "$JBRSDK_BUNDLE${jbr_name_postfix}" "$JBRSDK_BUNDLE" "$JSDK_MODS_DIR" "$modules" || do_exit $?
if [ $do_maketest -eq 1 ]; then
JBRSDK_TEST=${JBRSDK_BUNDLE}-${JBSDK_VERSION}-osx-test-${architecture}-b${build_number}
echo Creating "$JBRSDK_TEST" ...
[ $do_reset_changes -eq 1 ] && git checkout HEAD jb/project/tools/common/modules.list src/java.desktop/share/classes/module-info.java
make test-image CONF=$RELEASE_NAME || do_exit $?
[ -f "$JBRSDK_TEST.tar.gz" ] && rm "$JBRSDK_TEST.tar.gz"
COPYFILE_DISABLE=1 tar -pczf "$JBRSDK_TEST".tar.gz -C $IMAGES_DIR --exclude='test/jdk/demos' test || do_exit $?
fi
do_exit 0
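
Since the script above is driven by environment variables plus common.sh, a hedged invocation sketch (the script filename is a placeholder; JBSDK_VERSION, the vendor strings and the reproducible-build options are assumed to come from common.sh or the environment) could be:

# Hypothetical jcef build on Apple Silicon; per the naming in create_image_bundle this yields
# jbr_jcef-<version>-osx-aarch64-b1234.tar.gz and jbrsdk_jcef-<version>-osx-aarch64-b1234.tar.gz.
export architecture=aarch64
export JCEF_PATH=./jcef_mac      # JCEF binaries, as documented in the header comment
export JDK_BUILD_NUMBER=8
export bundle_type=jcef          # also triggers the test-image (do_maketest=1)
export build_number=1234
bash mkimages.sh                 # placeholder script name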

View File

@@ -1,118 +0,0 @@
#!/bin/bash -x
APP_DIRECTORY=$1
APPL_USER=$2
APPL_PASSWORD=$3
APP_NAME=$4
BUNDLE_ID=$5
FAKE_ROOT="${6:-fake-root}"
if [[ -z "$APP_DIRECTORY" ]] || [[ -z "$APPL_USER" ]] || [[ -z "$APPL_PASSWORD" ]]; then
echo "Usage: $0 AppDirectory Username Password AppName BundleId [FakeRoot]"
exit 1
fi
if [[ ! -d "$APP_DIRECTORY" ]]; then
echo "AppDirectory '$APP_DIRECTORY' does not exist or is not a directory"
exit 1
fi
function log() {
echo "$(date '+[%H:%M:%S]') $*"
}
function publish-log() {
id=$1
file=$2
curl -T "$file" "$ARTIFACTORY_URL/$id" || true
}
function altool-upload() {
# Since altool uses the same file for the upload token, we have to trick it into using different folders for the token file location.
# It also copies the zip into TMPDIR, so we override that too to simplify cleanup.
OLD_HOME="$HOME"
export HOME="$FAKE_ROOT/home"
export TMPDIR="$FAKE_ROOT/tmp"
mkdir -p "$HOME"
mkdir -p "$TMPDIR"
export _JAVA_OPTIONS="-Duser.home=$HOME -Djava.io.tmpdir=$TMPDIR"
# Reduce amount of downloads, cache transporter libraries
shared_itmstransporter="$OLD_HOME/shared-itmstransporter"
if [[ -f "$shared_itmstransporter" ]]; then
cp -r "$shared_itmstransporter" "$HOME/.itmstransporter"
fi
# For some reason altool prints everything to stderr, not stdout
set +e
xcrun altool --notarize-app \
--username "$APPL_USER" --password "$APPL_PASSWORD" \
--primary-bundle-id "$BUNDLE_ID" \
--asc-provider JetBrainssro --file "$1" 2>&1 | tee "altool.init.out"
unset TMPDIR
export HOME="$OLD_HOME"
set -e
}
#immediately exit script with an error if a command fails
set -euo pipefail
#file="$APP_NAME.zip"
#log "Zipping $file..."
#rm -rf "$file"
#ditto -c -k --sequesterRsrc --keepParent "$APP_DIRECTORY" "$file"
log "Notarizing $APP_NAME..."
rm -rf "altool.init.out" "altool.check.out"
altool-upload "$APP_NAME"
notarization_info="$(grep -e "RequestUUID" "altool.init.out" | grep -oE '([0-9a-f-]{36})')"
if [ -z "$notarization_info" ]; then
log "Failed to read RequestUUID from altool.init.out"
exit 10
fi
PATH="$PATH:/usr/local/bin/"
log "Notarization request sent, awaiting response"
spent=0
while true; do
# For some reason altool prints everything to stderr, not stdout
xcrun altool --username "$APPL_USER" --notarization-info "$notarization_info" --password "$APPL_PASSWORD" >"altool.check.out" 2>&1 || true
status="$(grep -oe 'Status: .*' "altool.check.out" | cut -c 9- || true)"
log "Current status: $status"
if [ "$status" = "invalid" ]; then
log "Notarization failed"
ec=1
elif [ "$status" = "success" ]; then
log "Notarization succeeded"
ec=0
else
if [ "$status" != "in progress" ]; then
log "Unknown notarization status, waiting more, altool output:"
cat "altool.check.out"
fi
if [[ $spent -gt 60 ]]; then
log "Waiting timed out (approx. 60 minutes)"
ec=2
break
fi
sleep 60
((spent += 1))
continue
fi
developer_log="developer_log.json"
log "Fetching $developer_log"
# TODO: Replace cut with trim or something better
url="$(grep -oe 'LogFileURL: .*' "altool.check.out" | cut -c 13-)"
wget "$url" -O "$developer_log" && cat "$developer_log" || true
if [ $ec != 0 ]; then
log "Publishing $developer_log"
publish-log "$notarization_info" "$developer_log"
fi
break
done
cat "altool.check.out"
rm -rf "altool.init.out" "altool.check.out"
exit $ec
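
The script name notarize.sh and its argument order are taken from the caller later in this diff; a hedged example (the credentials, bundle directory and ARTIFACTORY_URL are placeholders; ARTIFACTORY_URL is only needed for publishing the developer log on failure):

export ARTIFACTORY_URL=https://artifactory.example.com/notarization-logs   # placeholder
./notarize.sh "jbrsdk-17_0_4-osx-aarch64-b1234" "appleid@example.com" "app-specific-password" \
              "jbr-17_0_4-osx-aarch64-b1234.pkg" "com.jetbrains.jbr" "./fake-root"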

View File

@@ -1,111 +0,0 @@
#!/bin/bash -x
APPLICATION_PATH=$1
APP_NAME=$2
BUNDLE_ID=$3
JB_DEVELOPER_CERT=$4
JB_INSTALLER_CERT=$5
if [[ -z "$APPLICATION_PATH" ]] || [[ -z "$JB_DEVELOPER_CERT" ]]; then
echo "Usage: $0 AppDirectory AppName BundleId DeveloperCertificateID InstallerCertificateID"
exit 1
fi
if [[ ! -d "$APPLICATION_PATH" ]]; then
echo "AppDirectory '$APPLICATION_PATH' does not exist or is not a directory"
exit 1
fi
function log() {
echo "$(date '+[%H:%M:%S]') $*"
}
#immediately exit script with an error if a command fails
set -euo pipefail
# Cleanup files left from previous sign attempt (if any)
find "$APPLICATION_PATH" -name '*.cstemp' -exec rm '{}' \;
log "Signing libraries and executables..."
# -perm +111 searches for executables
for f in \
"Contents/Home/lib" "Contents/MacOS" \
"Contents/Home/Frameworks" \
"Contents/Frameworks"; do
if [ -d "$APPLICATION_PATH/$f" ]; then
find "$APPLICATION_PATH/$f" \
-type f \( -name "*.jnilib" -o -name "*.dylib" -o -name "*.so" -o -name "*.tbd" -o -name "*.node" -o -perm +111 \) \
-exec codesign --timestamp \
-v -s "$JB_DEVELOPER_CERT" --options=runtime --force \
--entitlements entitlements.xml {} \;
fi
done
log "Signing libraries in jars in $PWD"
# TODO: add set -euo pipefail; into the inner sh -c
# `-e` would break the `grep -q && printf` logic,
# and with `-o pipefail` there is no input for the 'while' loop
find "$APPLICATION_PATH" -name '*.jar' \
-exec sh -c "set -u; unzip -l \"\$0\" | grep -q -e '\.dylib\$' -e '\.jnilib\$' -e '\.so\$' -e '\.tbd\$' -e '^jattach\$' && printf \"\$0\0\" " {} \; |
while IFS= read -r -d $'\0' file; do
log "Processing libraries in $file"
rm -rf jarfolder jar.jar
mkdir jarfolder
filename="${file##*/}"
log "Filename: $filename"
cp "$file" jarfolder && (cd jarfolder && jar xf "$filename" && rm "$filename")
find jarfolder \
-type f \( -name "*.jnilib" -o -name "*.dylib" -o -name "*.so" -o -name "*.tbd" -o -name "jattach" \) \
-exec codesign --timestamp \
--force \
-v -s "$JB_DEVELOPER_CERT" --options=runtime \
--entitlements entitlements.xml {} \;
(cd jarfolder; zip -q -r -o -0 ../jar.jar .)
mv jar.jar "$file"
done
rm -rf jarfolder jar.jar
log "Signing other files..."
for f in \
"Contents/Home/bin"; do
if [ -d "$APPLICATION_PATH/$f" ]; then
find "$APPLICATION_PATH/$f" \
-type f \( -name "*.jnilib" -o -name "*.dylib" -o -name "*.so" -o -name "*.tbd" -o -perm +111 \) \
-exec codesign --timestamp \
-v -s "$JB_DEVELOPER_CERT" --options=runtime --force \
--entitlements entitlements.xml {} \;
fi
done
#log "Signing executable..."
#codesign --timestamp \
# -v -s "$JB_DEVELOPER_CERT" --options=runtime \
# --force \
# --entitlements entitlements.xml "$APPLICATION_PATH/Contents/MacOS/idea"
log "Signing whole app..."
codesign --timestamp \
-v -s "$JB_DEVELOPER_CERT" --options=runtime \
--force \
--entitlements entitlements.xml "$APPLICATION_PATH"
BUILD_NAME=$(echo $APPLICATION_PATH | awk -F"/" '{ print $2 }')
log "Creating $APP_NAME.pkg..."
rm -rf "$APP_NAME.pkg"
pkgbuild --identifier $BUNDLE_ID --sign "$JB_INSTALLER_CERT" --root $APPLICATION_PATH \
--install-location /Library/Java/JavaVirtualMachines/${BUILD_NAME} ${APP_NAME}.pkg
#log "Signing whole app..."
#codesign --timestamp \
# -v -s "$JB_DEVELOPER_CERT" --options=runtime \
# --force \
# --entitlements entitlements.xml $APP_NAME.pkg
log "Verifying java is not broken"
find "$APPLICATION_PATH" \
-type f -name 'java' -perm +111 -exec {} -version \;
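
The script name sign.sh and its five positional parameters match the caller further down in this diff; a hedged invocation sketch with placeholder bundle and certificate identities:

# Arguments: APPLICATION_PATH APP_NAME BUNDLE_ID JB_DEVELOPER_CERT JB_INSTALLER_CERT
./sign.sh "jbrsdk-17_0_4-osx-aarch64-b1234" "jbrsdk-17_0_4-osx-aarch64-b1234" \
          "com.jetbrains.jbr" \
          "Developer ID Application: Example" \
          "Developer ID Installer: Example"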

View File

@@ -1,138 +0,0 @@
#!/bin/bash -x
#immediately exit script with an error if a command fails
set -euo pipefail
export COPY_EXTENDED_ATTRIBUTES_DISABLE=true
export COPYFILE_DISABLE=true
INPUT_FILE=$1
EXPLODED=$2.exploded
BACKUP_JMODS=$2.backup
USERNAME=$3
PASSWORD=$4
CODESIGN_STRING=$5
JB_INSTALLER_CERT=$6
NOTARIZE=$7
BUNDLE_ID=$8
cd "$(dirname "$0")"
function log() {
echo "$(date '+[%H:%M:%S]') $*"
}
log "Deleting $EXPLODED ..."
if test -d "$EXPLODED"; then
find "$EXPLODED" -mindepth 1 -maxdepth 1 -exec chmod -R u+wx '{}' \;
fi
rm -rf "$EXPLODED"
mkdir "$EXPLODED"
rm -rf "$BACKUP_JMODS"
mkdir "$BACKUP_JMODS"
log "Unzipping $INPUT_FILE to $EXPLODED ..."
tar -xzvf "$INPUT_FILE" --directory $EXPLODED
BUILD_NAME="$(ls "$EXPLODED")"
#sed -i '' s/BNDL/APPL/ $EXPLODED/$BUILD_NAME/Contents/Info.plist
rm -f $EXPLODED/$BUILD_NAME/Contents/CodeResources
rm "$INPUT_FILE"
if test -d $EXPLODED/$BUILD_NAME/Contents/Home/jmods; then
mv $EXPLODED/$BUILD_NAME/Contents/Home/jmods $BACKUP_JMODS
fi
log "$INPUT_FILE extracted and removed"
APP_NAME=$(echo ${INPUT_FILE} | awk -F".tar" '{ print $1 }')
APPLICATION_PATH=$(sed "s/osx-//" <<< "$EXPLODED/$APP_NAME")
mv $EXPLODED/$BUILD_NAME $APPLICATION_PATH
find "$APPLICATION_PATH/Contents/Home/bin" \
-maxdepth 1 -type f -name '*.jnilib' -print0 |
while IFS= read -r -d $'\0' file; do
if [ -f "$file" ]; then
log "Linking $file"
b="$(basename "$file" .jnilib)"
ln -sf "$b.jnilib" "$(dirname "$file")/$b.dylib"
fi
done
find "$APPLICATION_PATH/Contents/" \
-maxdepth 1 -type f -name '*.txt' -print0 |
while IFS= read -r -d $'\0' file; do
if [ -f "$file" ]; then
log "Moving $file"
mv "$file" "$APPLICATION_PATH/Contents/Resources"
fi
done
non_plist=$(find "$APPLICATION_PATH/Contents/" -maxdepth 1 -type f -and -not -name 'Info.plist' | wc -l)
if [[ $non_plist -gt 0 ]]; then
log "Only the Info.plist file is allowed in the Contents directory, but found $non_plist other file(s):"
log "$(find "$APPLICATION_PATH/Contents/" -maxdepth 1 -type f -and -not -name 'Info.plist')"
exit 1
fi
log "Unlocking keychain..."
# Make sure *.p12 is imported into local KeyChain
security unlock-keychain -p "$PASSWORD" "/Users/$USERNAME/Library/Keychains/login.keychain"
attempt=1
limit=3
set +e
while [[ $attempt -le $limit ]]; do
log "Signing (attempt $attempt) $APPLICATION_PATH ..."
./sign.sh "$APPLICATION_PATH" "$APP_NAME" "$BUNDLE_ID" "$CODESIGN_STRING" "$JB_INSTALLER_CERT"
ec=$?
if [[ $ec -ne 0 ]]; then
((attempt += 1))
if [ $attempt -eq $limit ]; then
set -e
fi
log "Signing failed, wait for 30 sec and try to sign again"
sleep 30
else
log "Signing done"
codesign -v "$APPLICATION_PATH" -vvvvv
log "Check sign done"
spctl -a -v $APPLICATION_PATH
((attempt += limit))
fi
done
set -e
if [ "$NOTARIZE" = "yes" ]; then
log "Notarizing..."
# shellcheck disable=SC1090
source "$HOME/.notarize_token"
# Since the notarization tool uses the same file for the upload token, we have to trick it into using different folders, hence the fake root.
# It also leaves a copy of the zip file in TMPDIR, so notarize.sh overrides it and uses FAKE_ROOT as the location for a temporary TMPDIR.
FAKE_ROOT="$(pwd)/fake-root"
mkdir -p "$FAKE_ROOT"
echo "Notarization will use fake root: $FAKE_ROOT"
./notarize.sh "$APPLICATION_PATH" "$APPLE_USERNAME" "$APPLE_PASSWORD" "$APP_NAME.pkg" "$BUNDLE_ID" "$FAKE_ROOT"
rm -rf "$FAKE_ROOT"
set +e
log "Stapling..."
xcrun stapler staple "$APPLICATION_PATH"
else
log "Notarization disabled"
log "Stapling disabled"
fi
log "Zipping $BUILD_NAME to $INPUT_FILE ..."
(
#cd "$EXPLODED"
#ditto -c -k --sequesterRsrc --keepParent "$BUILD_NAME" "../$INPUT_FILE"
if test -d $BACKUP_JMODS/jmods; then
mv $BACKUP_JMODS/jmods $APPLICATION_PATH/Contents/Home
fi
mv $APPLICATION_PATH $EXPLODED/$BUILD_NAME
tar -pczvf $INPUT_FILE --exclude='man' -C $EXPLODED $BUILD_NAME
log "Finished zipping"
)
rm -rf "$EXPLODED"
log "Done"
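
The driver script above takes eight positional arguments; a hedged invocation sketch (the script filename, credentials and identities are placeholders):

# Arguments: INPUT_FILE WORK_BASE USERNAME PASSWORD CODESIGN_STRING JB_INSTALLER_CERT NOTARIZE BUNDLE_ID
./mac_sign_notarize.sh "jbrsdk-17_0_4-osx-aarch64-b1234.tar.gz" "work" \
    "builduser" "login-keychain-password" \
    "Developer ID Application: Example" "Developer ID Installer: Example" \
    "yes" "com.jetbrains.jbr"
# When NOTARIZE=yes, $HOME/.notarize_token is sourced and is expected to set APPLE_USERNAME and APPLE_PASSWORD.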

View File

@@ -1,30 +0,0 @@
diff --git jb/project/tools/common/modules.list jb/project/tools/common/modules.list
index 522acb7cb43..c40e689d5de 100644
--- jb/project/tools/common/modules.list
+++ jb/project/tools/common/modules.list
@@ -51,4 +51,7 @@ jdk.unsupported.desktop,
jdk.xml.dom,
jdk.zipfs,
jdk.hotspot.agent,
-jdk.jcmd
+jdk.jcmd,
+jcef,
+gluegen.rt,
+jogl.all
diff --git src/java.desktop/share/classes/module-info.java src/java.desktop/share/classes/module-info.java
index 897647ee368..781d1809493 100644
--- src/java.desktop/share/classes/module-info.java
+++ src/java.desktop/share/classes/module-info.java
@@ -116,7 +116,11 @@ module java.desktop {
// see make/GensrcModuleInfo.gmk
exports sun.awt to
jdk.accessibility,
- jdk.unsupported.desktop;
+ jdk.unsupported.desktop,
+ jcef,
+ jogl.all;
+
+ exports java.awt.peer to jcef;
exports java.awt.dnd.peer to jdk.unsupported.desktop;
exports sun.awt.dnd to jdk.unsupported.desktop;
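
Judging by its content, this removed patch is the add_jcef_module.patch referenced by the build scripts earlier in this diff; the apply/revert sequence, taken from those scripts, is roughly:

# Apply before building the jcef-enabled images ...
git apply -p0 < jb/project/tools/patches/add_jcef_module.patch
# ... and revert the touched files before making the test image.
git checkout HEAD jb/project/tools/common/modules.list src/java.desktop/share/classes/module-info.java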

View File

@@ -1,30 +0,0 @@
diff --git jb/project/tools/common/modules.list jb/project/tools/common/modules.list
index 522acb7..c40e689 100644
--- jb/project/tools/common/modules.list
+++ jb/project/tools/common/modules.list
@@ -51,4 +51,7 @@ jdk.unsupported.desktop,
jdk.xml.dom,
jdk.zipfs,
jdk.hotspot.agent,
-jdk.jcmd
+jdk.jcmd,
+jcef,
+gluegen.rt,
+jogl.all
diff --git src/java.desktop/share/classes/module-info.java src/java.desktop/share/classes/module-info.java
index 897647e..781d180 100644
--- src/java.desktop/share/classes/module-info.java
+++ src/java.desktop/share/classes/module-info.java
@@ -116,7 +116,11 @@ module java.desktop {
// see make/GensrcModuleInfo.gmk
exports sun.awt to
jdk.accessibility,
- jdk.unsupported.desktop;
+ jdk.unsupported.desktop,
+ jcef,
+ jogl.all;
+
+ exports java.awt.peer to jcef;
exports java.awt.dnd.peer to jdk.unsupported.desktop;
exports sun.awt.dnd to jdk.unsupported.desktop;

File diff suppressed because it is too large Load Diff

View File

@@ -1,29 +0,0 @@
From 960dafbeeba190911955c208b611fecc15d66738 Mon Sep 17 00:00:00 2001
From: Vladimir Dvorak <vladimir.dvorak@jetbrains.com>
Date: Wed, 11 Mar 2020 14:19:34 +0100
Subject: [PATCH 03/34] Fix class cast exception on redefinition of class A,
that is superclass of B that has anonymous class C
---
src/hotspot/share/oops/instanceKlass.cpp | 5 ++++-
1 file changed, 4 insertions(+), 1 deletion(-)
diff --git a/src/hotspot/share/oops/instanceKlass.cpp b/src/hotspot/share/oops/instanceKlass.cpp
index 994fc8a3bc8..3be3a09ef8f 100644
--- a/src/hotspot/share/oops/instanceKlass.cpp
+++ b/src/hotspot/share/oops/instanceKlass.cpp
@@ -953,7 +953,10 @@ bool InstanceKlass::link_class_impl(TRAPS) {
if (!is_linked()) {
if (!is_rewritten()) {
- {
+ // (DCEVM): If class A is being redefined and class B->A (B is extended from A) and B is host class of anonymous class C
+ // then second redefinition fails with cannot cast klass exception. So we currently turn off bytecode verification
+ // on redefinition.
+ if (!AllowEnhancedClassRedefinition || !newest_version()->is_redefining()) {
bool verify_ok = verify_code(THREAD);
if (!verify_ok) {
return false;
--
2.23.0

View File

@@ -1,240 +0,0 @@
From 39df5f163d4a0f1fd6b92313a5570808f19d5e20 Mon Sep 17 00:00:00 2001
From: Vladimir Dvorak <vladimir.dvorak@jetbrains.com>
Date: Sun, 4 Oct 2020 21:12:12 +0200
Subject: [PATCH 05/34] Support for Lambda class redefinition
---
.../share/classfile/classLoaderData.cpp | 9 +++
.../share/classfile/classLoaderData.hpp | 2 +-
.../share/classfile/systemDictionary.cpp | 12 +++-
.../share/classfile/systemDictionary.hpp | 1 +
.../prims/jvmtiEnhancedRedefineClasses.cpp | 65 +++++++++++++++++--
.../prims/jvmtiEnhancedRedefineClasses.hpp | 1 +
.../share/prims/resolvedMethodTable.cpp | 2 +
src/hotspot/share/prims/unsafe.cpp | 1 +
8 files changed, 83 insertions(+), 10 deletions(-)
diff --git a/src/hotspot/share/classfile/classLoaderData.cpp b/src/hotspot/share/classfile/classLoaderData.cpp
index 0cd90bb8c27..4d64c6b454a 100644
--- a/src/hotspot/share/classfile/classLoaderData.cpp
+++ b/src/hotspot/share/classfile/classLoaderData.cpp
@@ -593,6 +593,15 @@ Dictionary* ClassLoaderData::create_dictionary() {
return new Dictionary(this, size, resizable);
}
+void ClassLoaderData::exchange_holders(ClassLoaderData* cld) {
+ oop holder_oop = _holder.peek();
+ _holder.replace(cld->_holder.peek());
+ cld->_holder.replace(holder_oop);
+ WeakHandle<vm_class_loader_data> exchange = _holder;
+ _holder = cld->_holder;
+ cld->_holder = exchange;
+}
+
// Tell the GC to keep this klass alive while iterating ClassLoaderDataGraph
oop ClassLoaderData::holder_phantom() const {
// A klass that was previously considered dead can be looked up in the
diff --git a/src/hotspot/share/classfile/classLoaderData.hpp b/src/hotspot/share/classfile/classLoaderData.hpp
index ba2393f8dd0..e2ae0a77351 100644
--- a/src/hotspot/share/classfile/classLoaderData.hpp
+++ b/src/hotspot/share/classfile/classLoaderData.hpp
@@ -181,7 +181,7 @@ class ClassLoaderData : public CHeapObj<mtClass> {
bool has_accumulated_modified_oops() { return _accumulated_modified_oops; }
oop holder_no_keepalive() const;
oop holder_phantom() const;
-
+ void exchange_holders(ClassLoaderData* cld);
private:
void unload();
bool keep_alive() const { return _keep_alive > 0; }
diff --git a/src/hotspot/share/classfile/systemDictionary.cpp b/src/hotspot/share/classfile/systemDictionary.cpp
index bd0cae7cb9b..8f2b46add4d 100644
--- a/src/hotspot/share/classfile/systemDictionary.cpp
+++ b/src/hotspot/share/classfile/systemDictionary.cpp
@@ -1062,10 +1062,14 @@ InstanceKlass* SystemDictionary::parse_stream(Symbol* class_name,
Handle class_loader,
ClassFileStream* st,
const ClassLoadInfo& cl_info,
+ InstanceKlass* old_klass,
TRAPS) {
EventClassLoad class_load_start_event;
ClassLoaderData* loader_data;
+
+ bool is_redefining = (old_klass != NULL);
+
bool is_unsafe_anon_class = cl_info.unsafe_anonymous_host() != NULL;
// - for unsafe anonymous class: create a new CLD whith a class holder that uses
@@ -1094,8 +1098,12 @@ InstanceKlass* SystemDictionary::parse_stream(Symbol* class_name,
class_name,
loader_data,
cl_info,
- false, // pick_newest
+ is_redefining, // pick_newest
CHECK_NULL);
+ if (is_redefining && k != NULL) {
+ k->set_redefining(true);
+ k->set_old_version(old_klass);
+ }
if ((cl_info.is_hidden() || is_unsafe_anon_class) && k != NULL) {
// Hidden classes that are not strong and unsafe anonymous classes must update
@@ -1998,7 +2006,7 @@ void SystemDictionary::remove_from_hierarchy(InstanceKlass* k) {
k->remove_from_sibling_list();
}
-// (DCEVM)
+// (DCEVM)
void SystemDictionary::update_constraints_after_redefinition() {
constraints()->update_after_redefinition();
}
diff --git a/src/hotspot/share/classfile/systemDictionary.hpp b/src/hotspot/share/classfile/systemDictionary.hpp
index 4547449dbec..931e655d631 100644
--- a/src/hotspot/share/classfile/systemDictionary.hpp
+++ b/src/hotspot/share/classfile/systemDictionary.hpp
@@ -329,6 +329,7 @@ public:
Handle class_loader,
ClassFileStream* st,
const ClassLoadInfo& cl_info,
+ InstanceKlass* old_klass,
TRAPS);
// Resolve from stream (called by jni_DefineClass and JVM_DefineClass)
diff --git a/src/hotspot/share/prims/jvmtiEnhancedRedefineClasses.cpp b/src/hotspot/share/prims/jvmtiEnhancedRedefineClasses.cpp
index 92ce6c27b8a..8b765623dcd 100644
--- a/src/hotspot/share/prims/jvmtiEnhancedRedefineClasses.cpp
+++ b/src/hotspot/share/prims/jvmtiEnhancedRedefineClasses.cpp
@@ -494,6 +494,8 @@ void VM_EnhancedRedefineClasses::doit() {
ClassLoaderDataGraph::classes_do(&clear_cpool_cache);
+ // SystemDictionary::methods_do(fix_invoke_method);
+
// JSR-292 support
if (_any_class_has_resolved_methods) {
bool trace_name_printed = false;
@@ -756,12 +758,34 @@ jvmtiError VM_EnhancedRedefineClasses::load_new_class_versions(TRAPS) {
// load hook event.
state->set_class_being_redefined(the_class, _class_load_kind);
- InstanceKlass* k = SystemDictionary::resolve_from_stream(the_class_sym,
- the_class_loader,
- protection_domain,
- &st,
- the_class,
- THREAD);
+ InstanceKlass* k;
+
+ if (InstanceKlass::cast(the_class)->is_anonymous()) {
+ const InstanceKlass* host_class = the_class->host_klass();
+
+ // Make sure it's the real host class, not another anonymous class.
+ while (host_class != NULL && host_class->is_anonymous()) {
+ host_class = host_class->host_klass();
+ }
+
+ k = SystemDictionary::parse_stream(the_class_sym,
+ the_class_loader,
+ protection_domain,
+ &st,
+ host_class,
+ the_class,
+ NULL,
+ THREAD);
+ k->class_loader_data()->exchange_holders(the_class->class_loader_data());
+ the_class->class_loader_data()->inc_keep_alive();
+ } else {
+ k = SystemDictionary::resolve_from_stream(the_class_sym,
+ the_class_loader,
+ protection_domain,
+ &st,
+ the_class,
+ THREAD);
+ }
// Clear class_being_redefined just to be sure.
state->clear_class_being_redefined();
@@ -1442,6 +1466,30 @@ void VM_EnhancedRedefineClasses::MethodDataCleaner::do_klass(Klass* k) {
}
}
+void VM_EnhancedRedefineClasses::fix_invoke_method(Method* method) {
+
+ constantPoolHandle other_cp = constantPoolHandle(method->constants());
+
+ for (int i = 0; i < other_cp->length(); i++) {
+ if (other_cp->tag_at(i).is_klass()) {
+ Klass* klass = other_cp->resolved_klass_at(i);
+ if (klass->new_version() != NULL) {
+ // Constant pool entry points to redefined class -- update to the new version
+ other_cp->klass_at_put(i, klass->newest_version());
+ }
+ assert(other_cp->resolved_klass_at(i)->new_version() == NULL, "Must be new klass!");
+ }
+ }
+
+ ConstantPoolCache* cp_cache = other_cp->cache();
+ if (cp_cache != NULL) {
+ cp_cache->clear_entries();
+ }
+
+}
+
+
+
void VM_EnhancedRedefineClasses::update_jmethod_ids() {
for (int j = 0; j < _matching_methods_length; ++j) {
Method* old_method = _matching_old_methods[j];
@@ -1979,7 +2027,10 @@ jvmtiError VM_EnhancedRedefineClasses::find_sorted_affected_classes(TRAPS) {
// Find classes not directly redefined, but affected by a redefinition (because one of its supertypes is redefined)
AffectedKlassClosure closure(_affected_klasses);
// Updated in j10, from original SystemDictionary::classes_do
- ClassLoaderDataGraph::dictionary_classes_do(&closure);
+
+ ClassLoaderDataGraph::classes_do(&closure);
+ //ClassLoaderDataGraph::dictionary_classes_do(&closure);
+
log_trace(redefine, class, load)("%d classes affected", _affected_klasses->length());
// Sort the affected klasses such that a supertype is always on a smaller array index than its subtype.
diff --git a/src/hotspot/share/prims/jvmtiEnhancedRedefineClasses.hpp b/src/hotspot/share/prims/jvmtiEnhancedRedefineClasses.hpp
index 60b62c3170a..d8a11b51fe9 100644
--- a/src/hotspot/share/prims/jvmtiEnhancedRedefineClasses.hpp
+++ b/src/hotspot/share/prims/jvmtiEnhancedRedefineClasses.hpp
@@ -116,6 +116,7 @@ class VM_EnhancedRedefineClasses: public VM_GC_Operation {
void rollback();
static void mark_as_scavengable(nmethod* nm);
static void unpatch_bytecode(Method* method);
+ static void fix_invoke_method(Method* method);
// Figure out which new methods match old methods in name and signature,
// which methods have been added, and which are no longer present
diff --git a/src/hotspot/share/prims/resolvedMethodTable.cpp b/src/hotspot/share/prims/resolvedMethodTable.cpp
index 122bb8c186b..81b3aa96564 100644
--- a/src/hotspot/share/prims/resolvedMethodTable.cpp
+++ b/src/hotspot/share/prims/resolvedMethodTable.cpp
@@ -414,6 +414,8 @@ void ResolvedMethodTable::adjust_method_entries_dcevm(bool * trace_name_printed)
InstanceKlass* newer_klass = InstanceKlass::cast(old_method->method_holder()->new_version());
Method* newer_method = newer_klass->method_with_idnum(old_method->orig_method_idnum());
+ log_info(redefine, class, load, exceptions)("Adjusting method: '%s' of new class %s", newer_method->name_and_sig_as_C_string(), newer_klass->name()->as_C_string());
+
assert(newer_klass == newer_method->method_holder(), "call after swapping redefined guts");
assert(newer_method != NULL, "method_with_idnum() should not be NULL");
assert(old_method != newer_method, "sanity check");
diff --git a/src/hotspot/share/prims/unsafe.cpp b/src/hotspot/share/prims/unsafe.cpp
index 72d81ec9d6c..027afa3fabd 100644
--- a/src/hotspot/share/prims/unsafe.cpp
+++ b/src/hotspot/share/prims/unsafe.cpp
@@ -865,6 +865,7 @@ Unsafe_DefineAnonymousClass_impl(JNIEnv *env,
host_loader,
&st,
cl_info,
+ NULL,
CHECK_NULL);
if (anonk == NULL) {
return NULL;
--
2.23.0

View File

@@ -1,135 +0,0 @@
From 5af1daedc86b5fec0f222cbdda3afbdf518985ea Mon Sep 17 00:00:00 2001
From: Vladimir Dvorak <vladimir.dvorak@jetbrains.com>
Date: Sat, 23 May 2020 10:02:15 +0200
Subject: [PATCH 06/34] Fix "no original bytecode found" error if method with
bkp is missing
Sometimes the IDE can deploy a class with an erroneous method; such a method has
no bytecode, but a breakpoint position can still exist.
---
src/hotspot/share/interpreter/bytecodes.cpp | 2 +-
.../share/interpreter/interpreterRuntime.cpp | 2 +-
src/hotspot/share/oops/method.cpp | 8 ++++----
src/hotspot/share/oops/method.hpp | 4 ++--
.../prims/jvmtiEnhancedRedefineClasses.cpp | 18 ++++++++++--------
5 files changed, 18 insertions(+), 16 deletions(-)
diff --git a/src/hotspot/share/interpreter/bytecodes.cpp b/src/hotspot/share/interpreter/bytecodes.cpp
index e377e36b88c..262ecc021b2 100644
--- a/src/hotspot/share/interpreter/bytecodes.cpp
+++ b/src/hotspot/share/interpreter/bytecodes.cpp
@@ -84,7 +84,7 @@ Bytecodes::Code Bytecodes::code_at(Method* method, int bci) {
Bytecodes::Code Bytecodes::non_breakpoint_code_at(const Method* method, address bcp) {
assert(method != NULL, "must have the method for breakpoint conversion");
assert(method->contains(bcp), "must be valid bcp in method");
- return method->orig_bytecode_at(method->bci_from(bcp));
+ return method->orig_bytecode_at(method->bci_from(bcp), false);
}
int Bytecodes::special_length_at(Bytecodes::Code code, address bcp, address end) {
diff --git a/src/hotspot/share/interpreter/interpreterRuntime.cpp b/src/hotspot/share/interpreter/interpreterRuntime.cpp
index ed3cc3eb6a2..504e59caf51 100644
--- a/src/hotspot/share/interpreter/interpreterRuntime.cpp
+++ b/src/hotspot/share/interpreter/interpreterRuntime.cpp
@@ -814,7 +814,7 @@ JRT_END
// Invokes
JRT_ENTRY(Bytecodes::Code, InterpreterRuntime::get_original_bytecode_at(JavaThread* thread, Method* method, address bcp))
- return method->orig_bytecode_at(method->bci_from(bcp));
+ return method->orig_bytecode_at(method->bci_from(bcp), false);
JRT_END
JRT_ENTRY(void, InterpreterRuntime::set_original_bytecode_at(JavaThread* thread, Method* method, address bcp, Bytecodes::Code new_code))
diff --git a/src/hotspot/share/oops/method.cpp b/src/hotspot/share/oops/method.cpp
index 516f2bb8f2f..1c88511a5fc 100644
--- a/src/hotspot/share/oops/method.cpp
+++ b/src/hotspot/share/oops/method.cpp
@@ -1853,14 +1853,14 @@ bool CompressedLineNumberReadStream::read_pair() {
#if INCLUDE_JVMTI
-Bytecodes::Code Method::orig_bytecode_at(int bci) const {
+Bytecodes::Code Method::orig_bytecode_at(int bci, bool no_fatal) const {
BreakpointInfo* bp = method_holder()->breakpoints();
for (; bp != NULL; bp = bp->next()) {
if (bp->match(this, bci)) {
return bp->orig_bytecode();
}
}
- {
+ if (!no_fatal) {
ResourceMark rm;
fatal("no original bytecode found in %s at bci %d", name_and_sig_as_C_string(), bci);
}
@@ -2006,7 +2006,7 @@ BreakpointInfo::BreakpointInfo(Method* m, int bci) {
_signature_index = m->signature_index();
_orig_bytecode = (Bytecodes::Code) *m->bcp_from(_bci);
if (_orig_bytecode == Bytecodes::_breakpoint)
- _orig_bytecode = m->orig_bytecode_at(_bci);
+ _orig_bytecode = m->orig_bytecode_at(_bci, false);
_next = NULL;
}
@@ -2015,7 +2015,7 @@ void BreakpointInfo::set(Method* method) {
{
Bytecodes::Code code = (Bytecodes::Code) *method->bcp_from(_bci);
if (code == Bytecodes::_breakpoint)
- code = method->orig_bytecode_at(_bci);
+ code = method->orig_bytecode_at(_bci, false);
assert(orig_bytecode() == code, "original bytecode must be the same");
}
#endif
diff --git a/src/hotspot/share/oops/method.hpp b/src/hotspot/share/oops/method.hpp
index 83ed2d9c3c1..4d4cc6dc012 100644
--- a/src/hotspot/share/oops/method.hpp
+++ b/src/hotspot/share/oops/method.hpp
@@ -230,7 +230,7 @@ class Method : public Metadata {
// JVMTI breakpoints
#if !INCLUDE_JVMTI
- Bytecodes::Code orig_bytecode_at(int bci) const {
+ Bytecodes::Code orig_bytecode_at(int bci, bool no_fatal) const {
ShouldNotReachHere();
return Bytecodes::_shouldnotreachhere;
}
@@ -239,7 +239,7 @@ class Method : public Metadata {
};
u2 number_of_breakpoints() const {return 0;}
#else // !INCLUDE_JVMTI
- Bytecodes::Code orig_bytecode_at(int bci) const;
+ Bytecodes::Code orig_bytecode_at(int bci, bool no_fatal) const;
void set_orig_bytecode_at(int bci, Bytecodes::Code code);
void set_breakpoint(int bci);
void clear_breakpoint(int bci);
diff --git a/src/hotspot/share/prims/jvmtiEnhancedRedefineClasses.cpp b/src/hotspot/share/prims/jvmtiEnhancedRedefineClasses.cpp
index 8b765623dcd..a859b8e1162 100644
--- a/src/hotspot/share/prims/jvmtiEnhancedRedefineClasses.cpp
+++ b/src/hotspot/share/prims/jvmtiEnhancedRedefineClasses.cpp
@@ -1362,14 +1362,16 @@ void VM_EnhancedRedefineClasses::unpatch_bytecode(Method* method) {
if (code == Bytecodes::_breakpoint) {
int bci = method->bci_from(bcp);
- code = method->orig_bytecode_at(bci);
- java_code = Bytecodes::java_code(code);
- if (code != java_code &&
- (java_code == Bytecodes::_getfield ||
- java_code == Bytecodes::_putfield ||
- java_code == Bytecodes::_aload_0)) {
- // Let breakpoint table handling unpatch bytecode
- method->set_orig_bytecode_at(bci, java_code);
+ code = method->orig_bytecode_at(bci, true);
+ if (code != Bytecodes::_shouldnotreachhere) {
+ java_code = Bytecodes::java_code(code);
+ if (code != java_code &&
+ (java_code == Bytecodes::_getfield ||
+ java_code == Bytecodes::_putfield ||
+ java_code == Bytecodes::_aload_0)) {
+ // Let breakpoint table handling unpatch bytecode
+ method->set_orig_bytecode_at(bci, java_code);
+ }
}
} else {
java_code = Bytecodes::java_code(code);
--
2.23.0

View File

@@ -1,57 +0,0 @@
From 19d2274a5dff6e6b31474252b45e5e7484f0180b Mon Sep 17 00:00:00 2001
From: Vladimir Dvorak <vladimir.dvorak@jetbrains.com>
Date: Sun, 24 May 2020 12:07:42 +0200
Subject: [PATCH 07/34] Replace deleted method with
Universe::throw_no_such_method_error
---
.../share/prims/resolvedMethodTable.cpp | 28 +++++++++----------
1 file changed, 14 insertions(+), 14 deletions(-)
diff --git a/src/hotspot/share/prims/resolvedMethodTable.cpp b/src/hotspot/share/prims/resolvedMethodTable.cpp
index 81b3aa96564..caf03ffe56d 100644
--- a/src/hotspot/share/prims/resolvedMethodTable.cpp
+++ b/src/hotspot/share/prims/resolvedMethodTable.cpp
@@ -404,25 +404,25 @@ void ResolvedMethodTable::adjust_method_entries_dcevm(bool * trace_name_printed)
if (old_method->is_old()) {
+ InstanceKlass* newer_klass = InstanceKlass::cast(old_method->method_holder()->new_version());
+ Method* newer_method;
+
// Method* new_method;
if (old_method->is_deleted()) {
- // FIXME:(DCEVM) - check if exception can be thrown
- // new_method = Universe::throw_no_such_method_error();
- continue;
- }
-
- InstanceKlass* newer_klass = InstanceKlass::cast(old_method->method_holder()->new_version());
- Method* newer_method = newer_klass->method_with_idnum(old_method->orig_method_idnum());
+ newer_method = Universe::throw_no_such_method_error();
+ } else {
+ newer_method = newer_klass->method_with_idnum(old_method->orig_method_idnum());
- log_info(redefine, class, load, exceptions)("Adjusting method: '%s' of new class %s", newer_method->name_and_sig_as_C_string(), newer_klass->name()->as_C_string());
+ log_info(redefine, class, load, exceptions)("Adjusting method: '%s' of new class %s", newer_method->name_and_sig_as_C_string(), newer_klass->name()->as_C_string());
- assert(newer_klass == newer_method->method_holder(), "call after swapping redefined guts");
- assert(newer_method != NULL, "method_with_idnum() should not be NULL");
- assert(old_method != newer_method, "sanity check");
+ assert(newer_klass == newer_method->method_holder(), "call after swapping redefined guts");
+ assert(newer_method != NULL, "method_with_idnum() should not be NULL");
+ assert(old_method != newer_method, "sanity check");
- if (_the_table->lookup(newer_method) != NULL) {
- // old method was already adjusted if new method exists in _the_table
- continue;
+ if (_the_table->lookup(newer_method) != NULL) {
+ // old method was already adjusted if new method exists in _the_table
+ continue;
+ }
}
java_lang_invoke_ResolvedMethodName::set_vmtarget(mem_name, newer_method);
--
2.23.0

File diff suppressed because it is too large Load Diff

View File

@@ -1,50 +0,0 @@
From ca47ab5a0a6ce8e2644736f323a335a957311af9 Mon Sep 17 00:00:00 2001
From: Vladimir Dvorak <vladimir.dvorak@jetbrains.com>
Date: Sat, 13 Jun 2020 18:50:59 +0200
Subject: [PATCH 09/34] Change log level in advanced redefinition
- Change log level for "Comparing different class ver.." to debug
- Fix adjust_method_entries_dcevm logging levels and severity
---
src/hotspot/share/prims/jvmtiEnhancedRedefineClasses.cpp | 2 +-
src/hotspot/share/prims/resolvedMethodTable.cpp | 4 ++--
2 files changed, 3 insertions(+), 3 deletions(-)
diff --git a/src/hotspot/share/prims/jvmtiEnhancedRedefineClasses.cpp b/src/hotspot/share/prims/jvmtiEnhancedRedefineClasses.cpp
index 6c9eb40ecf5..b09ba554e07 100644
--- a/src/hotspot/share/prims/jvmtiEnhancedRedefineClasses.cpp
+++ b/src/hotspot/share/prims/jvmtiEnhancedRedefineClasses.cpp
@@ -916,7 +916,7 @@ jvmtiError VM_EnhancedRedefineClasses::load_new_class_versions(TRAPS) {
// Calculated the difference between new and old class (field change, method change, supertype change, ...).
int VM_EnhancedRedefineClasses::calculate_redefinition_flags(InstanceKlass* new_class) {
int result = Klass::NoRedefinition;
- log_info(redefine, class, load)("Comparing different class versions of class %s",new_class->name()->as_C_string());
+ log_debug(redefine, class, load)("Comparing different class versions of class %s",new_class->name()->as_C_string());
assert(new_class->old_version() != NULL, "must have old version");
InstanceKlass* the_class = InstanceKlass::cast(new_class->old_version());
diff --git a/src/hotspot/share/prims/resolvedMethodTable.cpp b/src/hotspot/share/prims/resolvedMethodTable.cpp
index caf03ffe56d..eb9fcda44f3 100644
--- a/src/hotspot/share/prims/resolvedMethodTable.cpp
+++ b/src/hotspot/share/prims/resolvedMethodTable.cpp
@@ -413,7 +413,7 @@ void ResolvedMethodTable::adjust_method_entries_dcevm(bool * trace_name_printed)
} else {
newer_method = newer_klass->method_with_idnum(old_method->orig_method_idnum());
- log_info(redefine, class, load, exceptions)("Adjusting method: '%s' of new class %s", newer_method->name_and_sig_as_C_string(), newer_klass->name()->as_C_string());
+ log_debug(redefine, class, update)("Adjusting method: '%s' of new class %s", newer_method->name_and_sig_as_C_string(), newer_klass->name()->as_C_string());
assert(newer_klass == newer_method->method_holder(), "call after swapping redefined guts");
assert(newer_method != NULL, "method_with_idnum() should not be NULL");
@@ -433,7 +433,7 @@ void ResolvedMethodTable::adjust_method_entries_dcevm(bool * trace_name_printed)
ResourceMark rm;
if (!(*trace_name_printed)) {
- log_info(redefine, class, update)("adjust: name=%s", old_method->method_holder()->external_name());
+ log_debug(redefine, class, update)("adjust: name=%s", old_method->method_holder()->external_name());
*trace_name_printed = true;
}
log_debug(redefine, class, update, constantpool)
--
2.23.0

View File

@@ -1,26 +0,0 @@
From 7e236beee0375656d1955fc1168143c1639fb7f1 Mon Sep 17 00:00:00 2001
From: Vladimir Dvorak <vladimir.dvorak@jetbrains.com>
Date: Tue, 6 Oct 2020 22:15:31 +0200
Subject: [PATCH 10/34] AllowEnhancedClassRedefinition is false (disabled) by
default
---
src/hotspot/share/runtime/globals.hpp | 2 +-
1 file changed, 1 insertion(+), 1 deletion(-)
diff --git a/src/hotspot/share/runtime/globals.hpp b/src/hotspot/share/runtime/globals.hpp
index 5b367704800..2710c6ea0e5 100644
--- a/src/hotspot/share/runtime/globals.hpp
+++ b/src/hotspot/share/runtime/globals.hpp
@@ -2466,7 +2466,7 @@ const size_t minimumSymbolTableSize = 1024;
diagnostic(bool, DeoptimizeNMethodBarriersALot, false, \
"Make nmethod barriers deoptimise a lot.") \
\
- product(bool, AllowEnhancedClassRedefinition, true, \
+ product(bool, AllowEnhancedClassRedefinition, false, \
"Allow enhanced class redefinition beyond swapping method " \
"bodies") \
\
--
2.23.0
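
Because this patch flips the default to false, DCEVM-style redefinition now has to be switched on explicitly; a minimal sketch (the flag spelling is taken from the globals.hpp hunk above, the rest is a generic java invocation):

# Enable enhanced class redefinition for a single run; without the flag only method-body swaps are allowed.
java -XX:+AllowEnhancedClassRedefinition -version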

View File

@@ -1,25 +0,0 @@
From d56e73885111b386771f564ec6beb305338993df Mon Sep 17 00:00:00 2001
From: Vladimir Dvorak <vladimir.dvorak@jetbrains.com>
Date: Mon, 19 Oct 2020 20:00:04 +0200
Subject: [PATCH 12/34] Set HOTSPOT_VM_DISTRO=Dynamic Code Evolution
---
make/autoconf/version-numbers | 2 +-
1 file changed, 1 insertion(+), 1 deletion(-)
diff --git a/make/autoconf/version-numbers b/make/autoconf/version-numbers
index aabdc5bed20..df8025a2e84 100644
--- a/make/autoconf/version-numbers
+++ b/make/autoconf/version-numbers
@@ -45,7 +45,7 @@ PRODUCT_NAME=OpenJDK
PRODUCT_SUFFIX="Runtime Environment"
JDK_RC_PLATFORM_NAME=Platform
COMPANY_NAME=N/A
-HOTSPOT_VM_DISTRO="OpenJDK"
+HOTSPOT_VM_DISTRO="Dynamic Code Evolution"
VENDOR_URL=https://openjdk.java.net/
VENDOR_URL_BUG=https://bugreport.java.com/bugreport/
VENDOR_URL_VM_BUG=https://bugreport.java.com/bugreport/crash.jsp
--
2.23.0

View File

@@ -1,71 +0,0 @@
From 7ebad43ed45805b0a3736c510f708ff17697ba7e Mon Sep 17 00:00:00 2001
From: Vladimir Dvorak <vladimir.dvorak@jetbrains.com>
Date: Sun, 11 Oct 2020 10:43:28 +0200
Subject: [PATCH 13/34] Fix G1 nmethod registration
---
.../prims/jvmtiEnhancedRedefineClasses.cpp | 19 ++++++++++++++++---
.../prims/jvmtiEnhancedRedefineClasses.hpp | 3 ++-
2 files changed, 18 insertions(+), 4 deletions(-)
diff --git a/src/hotspot/share/prims/jvmtiEnhancedRedefineClasses.cpp b/src/hotspot/share/prims/jvmtiEnhancedRedefineClasses.cpp
index b09ba554e07..718426f2819 100644
--- a/src/hotspot/share/prims/jvmtiEnhancedRedefineClasses.cpp
+++ b/src/hotspot/share/prims/jvmtiEnhancedRedefineClasses.cpp
@@ -216,7 +216,14 @@ void VM_EnhancedRedefineClasses::mark_as_scavengable(nmethod* nm) {
}
}
-void VM_EnhancedRedefineClasses::mark_as_scavengable_g1(nmethod* nm) {
+void VM_EnhancedRedefineClasses::unregister_nmethod_g1(nmethod* nm) {
+ // It should work not only for G1 but also for another GCs, but this way is safer now
+ if (!nm->is_zombie() && !nm->is_unloaded()) {
+ Universe::heap()->unregister_nmethod(nm);
+ }
+}
+
+void VM_EnhancedRedefineClasses::register_nmethod_g1(nmethod* nm) {
// It should work not only for G1 but also for another GCs, but this way is safer now
if (!nm->is_zombie() && !nm->is_unloaded()) {
Universe::heap()->register_nmethod(nm);
@@ -521,8 +528,9 @@ void VM_EnhancedRedefineClasses::doit() {
// For now, mark all nmethod's as scavengable that are not scavengable already
if (ScavengeRootsInCode) {
if (UseG1GC) {
- // this should work also for other GCs
- CodeCache::nmethods_do(mark_as_scavengable_g1);
+ // G1 holds references to nmethods in regions based on oops values. Since oops in nmethod can be changed in ChangePointers* closures
+ // we unregister nmethods from G1 heap, then closures are processed (oops are changed) and finally we register nmethod to G1 again
+ CodeCache::nmethods_do(unregister_nmethod_g1);
} else {
CodeCache::nmethods_do(mark_as_scavengable);
}
@@ -545,6 +553,11 @@ void VM_EnhancedRedefineClasses::doit() {
Universe::root_oops_do(&oopClosureNoBarrier);
+ if (UseG1GC) {
+ // this should work also for other GCs
+ CodeCache::nmethods_do(register_nmethod_g1);
+ }
+
}
log_trace(redefine, class, obsolete, metadata)("After updating instances");
diff --git a/src/hotspot/share/prims/jvmtiEnhancedRedefineClasses.hpp b/src/hotspot/share/prims/jvmtiEnhancedRedefineClasses.hpp
index 9755944d70b..4c0412d343d 100644
--- a/src/hotspot/share/prims/jvmtiEnhancedRedefineClasses.hpp
+++ b/src/hotspot/share/prims/jvmtiEnhancedRedefineClasses.hpp
@@ -116,7 +116,8 @@ class VM_EnhancedRedefineClasses: public VM_GC_Operation {
void rollback();
static void mark_as_scavengable(nmethod* nm);
- static void mark_as_scavengable_g1(nmethod* nm);
+ static void unregister_nmethod_g1(nmethod* nm);
+ static void register_nmethod_g1(nmethod* nm);
static void unpatch_bytecode(Method* method);
static void fix_invoke_method(Method* method);
--
2.23.0

View File

@@ -1,26 +0,0 @@
From 5c7e5f245f79d7e8575461dab0c356ed74c8e9a3 Mon Sep 17 00:00:00 2001
From: Vladimir Dvorak <vladimir.dvorak@jetbrains.com>
Date: Thu, 22 Oct 2020 20:15:20 +0200
Subject: [PATCH 14/34] Initialize method's _new_version/_old_version to NULL
---
src/hotspot/share/oops/method.cpp | 3 ++-
1 file changed, 2 insertions(+), 1 deletion(-)
diff --git a/src/hotspot/share/oops/method.cpp b/src/hotspot/share/oops/method.cpp
index 1c88511a5fc..ce940cf10a9 100644
--- a/src/hotspot/share/oops/method.cpp
+++ b/src/hotspot/share/oops/method.cpp
@@ -91,7 +91,8 @@ Method* Method::allocate(ClassLoaderData* loader_data,
return new (loader_data, size, MetaspaceObj::MethodType, THREAD) Method(cm, access_flags);
}
-Method::Method(ConstMethod* xconst, AccessFlags access_flags) {
+Method::Method(ConstMethod* xconst, AccessFlags access_flags) : _new_version(NULL),
+ _old_version(NULL) {
NoSafepointVerifier no_safepoint;
set_constMethod(xconst);
set_access_flags(access_flags);
--
2.23.0

View File

@@ -1,193 +0,0 @@
From 6ffac6e5064ec6633fdbeb8520333dca00bc6a62 Mon Sep 17 00:00:00 2001
From: Vladimir Dvorak <vladimir.dvorak@jetbrains.com>
Date: Fri, 23 Oct 2020 10:20:26 +0200
Subject: [PATCH 15/34] Clear dcevm code separation
---
src/hotspot/share/classfile/systemDictionary.cpp | 4 ++--
src/hotspot/share/gc/serial/genMarkSweep.cpp | 8 +++++---
src/hotspot/share/interpreter/linkResolver.cpp | 16 +++++++++++-----
.../instrumentation/jfrEventClassTransformer.cpp | 2 +-
src/hotspot/share/oops/instanceKlass.cpp | 10 ++++++----
src/hotspot/share/oops/method.cpp | 2 +-
.../share/prims/jvmtiGetLoadedClasses.cpp | 2 +-
src/hotspot/share/runtime/reflection.cpp | 2 +-
8 files changed, 28 insertions(+), 18 deletions(-)
diff --git a/src/hotspot/share/classfile/systemDictionary.cpp b/src/hotspot/share/classfile/systemDictionary.cpp
index 8f2b46add4d..9ac6ec96cb5 100644
--- a/src/hotspot/share/classfile/systemDictionary.cpp
+++ b/src/hotspot/share/classfile/systemDictionary.cpp
@@ -1241,7 +1241,7 @@ InstanceKlass* SystemDictionary::resolve_from_stream(Symbol* class_name,
MutexLocker mu(THREAD, SystemDictionary_lock);
Klass* check = find_class(h_name, k->class_loader_data());
- assert((check == k && !k->is_redefining()) || (k->is_redefining() && check == k->old_version()), "should be present in the dictionary");
+ assert(check == k && !k->is_redefining() || k->is_redefining() && check == k->old_version(), "should be present in the dictionary");
} );
return k;
@@ -2290,7 +2290,7 @@ void SystemDictionary::check_constraints(unsigned int d_hash,
// also hold array classes.
assert(check->is_instance_klass(), "noninstance in systemdictionary");
- if ((defining == true) || ((k != check) && k->old_version() != check)) {
+ if ((defining == true) || (k != check && (!AllowEnhancedClassRedefinition || k->old_version() != check))) {
throwException = true;
ss.print("loader %s", loader_data->loader_name_and_id());
ss.print(" attempted duplicate %s definition for %s. (%s)",
diff --git a/src/hotspot/share/gc/serial/genMarkSweep.cpp b/src/hotspot/share/gc/serial/genMarkSweep.cpp
index 1d13c647452..548df01e557 100644
--- a/src/hotspot/share/gc/serial/genMarkSweep.cpp
+++ b/src/hotspot/share/gc/serial/genMarkSweep.cpp
@@ -334,7 +334,9 @@ void GenMarkSweep::mark_sweep_phase4() {
GenCompactClosure blk;
gch->generation_iterate(&blk, true);
- DcevmSharedGC::copy_rescued_objects_back(MarkSweep::_rescued_oops, true);
- DcevmSharedGC::clear_rescued_objects_resource(MarkSweep::_rescued_oops);
- MarkSweep::_rescued_oops = NULL;
+ if (AllowEnhancedClassRedefinition) {
+ DcevmSharedGC::copy_rescued_objects_back(MarkSweep::_rescued_oops, true);
+ DcevmSharedGC::clear_rescued_objects_resource(MarkSweep::_rescued_oops);
+ MarkSweep::_rescued_oops = NULL;
+ }
}
diff --git a/src/hotspot/share/interpreter/linkResolver.cpp b/src/hotspot/share/interpreter/linkResolver.cpp
index b6e9e0a308d..b2f24ddbeda 100644
--- a/src/hotspot/share/interpreter/linkResolver.cpp
+++ b/src/hotspot/share/interpreter/linkResolver.cpp
@@ -282,9 +282,14 @@ void LinkResolver::check_klass_accessibility(Klass* ref_klass, Klass* sel_klass,
if (!base_klass->is_instance_klass()) {
return; // no relevant check to do
}
-
- Reflection::VerifyClassAccessResults vca_result =
- Reflection::verify_class_access(ref_klass->newest_version(), InstanceKlass::cast(base_klass->newest_version()), true);
+ Klass* refKlassNewest = ref_klass;
+ Klass* baseKlassNewest = base_klass;
+ if (AllowEnhancedClassRedefinition) {
+ refKlassNewest = ref_klass->newest_version();
+ baseKlassNewest = base_klass->newest_version();
+ }
+ Reflection::VerifyClassAccessResults vca_result =
+ Reflection::verify_class_access(refKlassNewest, InstanceKlass::cast(baseKlassNewest), true);
if (vca_result != Reflection::ACCESS_OK) {
ResourceMark rm(THREAD);
char* msg = Reflection::verify_class_access_msg(ref_klass,
@@ -566,7 +571,8 @@ void LinkResolver::check_method_accessability(Klass* ref_klass,
// We'll check for the method name first, as that's most likely
// to be false (so we'll short-circuit out of these tests).
if (sel_method->name() == vmSymbols::clone_name() &&
- sel_klass->newest_version() == SystemDictionary::Object_klass()->newest_version() &&
+ ( !AllowEnhancedClassRedefinition && sel_klass == SystemDictionary::Object_klass() ||
+ AllowEnhancedClassRedefinition && sel_klass->newest_version() == SystemDictionary::Object_klass()->newest_version()) &&
resolved_klass->is_array_klass()) {
// We need to change "protected" to "public".
assert(flags.is_protected(), "clone not protected?");
@@ -1011,7 +1017,7 @@ void LinkResolver::resolve_field(fieldDescriptor& fd,
// or by the <init> method (in case of an instance field).
if (is_put && fd.access_flags().is_final()) {
- if (sel_klass != current_klass && sel_klass != current_klass->active_version()) {
+ if (sel_klass != current_klass && (!AllowEnhancedClassRedefinition || sel_klass != current_klass->active_version())) {
ResourceMark rm(THREAD);
stringStream ss;
ss.print("Update to %s final field %s.%s attempted from a different class (%s) than the field's declaring class",
diff --git a/src/hotspot/share/jfr/instrumentation/jfrEventClassTransformer.cpp b/src/hotspot/share/jfr/instrumentation/jfrEventClassTransformer.cpp
index 96fc139bea3..f7284197c5a 100644
--- a/src/hotspot/share/jfr/instrumentation/jfrEventClassTransformer.cpp
+++ b/src/hotspot/share/jfr/instrumentation/jfrEventClassTransformer.cpp
@@ -1471,7 +1471,7 @@ static InstanceKlass* create_new_instance_klass(InstanceKlass* ik, ClassFileStre
cld,
&cl_info,
ClassFileParser::INTERNAL, // internal visibility
- false,
+ false,
THREAD);
if (HAS_PENDING_EXCEPTION) {
log_pending_exception(PENDING_EXCEPTION);
diff --git a/src/hotspot/share/oops/instanceKlass.cpp b/src/hotspot/share/oops/instanceKlass.cpp
index 3be3a09ef8f..f8e60941046 100644
--- a/src/hotspot/share/oops/instanceKlass.cpp
+++ b/src/hotspot/share/oops/instanceKlass.cpp
@@ -199,7 +199,9 @@ bool InstanceKlass::has_nest_member(InstanceKlass* k, TRAPS) const {
// able to perform that loading but we can't exclude the compiler threads from
// executing this logic. But it should actually be impossible to trigger loading here.
Klass* k2 = _constants->klass_at(cp_index, THREAD);
- k2 = k2->newest_version();
+ if (AllowEnhancedClassRedefinition) {
+ k2 = k2->newest_version();
+ }
assert(!HAS_PENDING_EXCEPTION || PENDING_EXCEPTION->is_a(SystemDictionary::VirtualMachineError_klass()),
"Exceptions should not be possible here");
if (k2 == k) {
@@ -1003,7 +1005,7 @@ bool InstanceKlass::link_class_impl(TRAPS) {
#endif
set_init_state(linked);
// (DCEVM) Must check for old version in order to prevent infinite loops.
- if (JvmtiExport::should_post_class_prepare() && old_version() == NULL /* JVMTI deadlock otherwise */) {
+ if (JvmtiExport::should_post_class_prepare() && (!AllowEnhancedClassRedefinition || old_version() == NULL) /* JVMTI deadlock otherwise */) {
Thread *thread = THREAD;
assert(thread->is_Java_thread(), "thread->is_Java_thread()");
JvmtiExport::post_class_prepare((JavaThread *) thread, this);
@@ -1084,7 +1086,7 @@ void InstanceKlass::initialize_impl(TRAPS) {
// we might end up throwing IE from link/symbol resolution sites
// that aren't expected to throw. This would wreak havoc. See 6320309.
while ((is_being_initialized() && !is_reentrant_initialization(jt))
- || (old_version() != NULL && InstanceKlass::cast(old_version())->is_being_initialized())) {
+ || (AllowEnhancedClassRedefinition && old_version() != NULL && InstanceKlass::cast(old_version())->is_being_initialized())) {
wait = true;
jt->set_class_to_be_initialized(this);
ol.wait_uninterruptibly(jt);
@@ -3782,7 +3784,7 @@ void InstanceKlass::verify_on(outputStream* st) {
guarantee(sib->is_klass(), "should be klass");
// TODO: (DCEVM) explain
- guarantee(sib->super() == super || super->newest_version() == SystemDictionary::Object_klass(), "siblings should have same superklass");
+ guarantee(sib->super() == super || AllowEnhancedClassRedefinition && super->newest_version() == SystemDictionary::Object_klass(), "siblings should have same superklass");
}
// Verify local interfaces
diff --git a/src/hotspot/share/oops/method.cpp b/src/hotspot/share/oops/method.cpp
index ce940cf10a9..2d8e5b0256b 100644
--- a/src/hotspot/share/oops/method.cpp
+++ b/src/hotspot/share/oops/method.cpp
@@ -2208,7 +2208,7 @@ void Method::ensure_jmethod_ids(ClassLoaderData* loader_data, int capacity) {
// Add a method id to the jmethod_ids
jmethodID Method::make_jmethod_id(ClassLoaderData* loader_data, Method* m) {
// FIXME: (DCEVM) ???
- if (m != m->newest_version()) {
+ if (AllowEnhancedClassRedefinition && m != m->newest_version()) {
m = m->newest_version();
}
ClassLoaderData* cld = loader_data;
diff --git a/src/hotspot/share/prims/jvmtiGetLoadedClasses.cpp b/src/hotspot/share/prims/jvmtiGetLoadedClasses.cpp
index 1c7677f270f..6c12ee64a6e 100644
--- a/src/hotspot/share/prims/jvmtiGetLoadedClasses.cpp
+++ b/src/hotspot/share/prims/jvmtiGetLoadedClasses.cpp
@@ -75,7 +75,7 @@ public:
// the new version (SystemDictionary stores only new versions). But the LoadedClassesClosure's functionality was
// changed in java8 where jvmtiLoadedClasses collects all classes from all classloaders, therefore we
// must use new versions only.
- if (k->new_version()==NULL) {
+ if (AllowEnhancedClassRedefinition && k->new_version()==NULL) {
_classStack.push((jclass) _env->jni_reference(Handle(_cur_thread, k->java_mirror())));
if (_dictionary_walk) {
// Collect array classes this way when walking the dictionary (because array classes are
diff --git a/src/hotspot/share/runtime/reflection.cpp b/src/hotspot/share/runtime/reflection.cpp
index 0e7722dba7d..d67457f02ac 100644
--- a/src/hotspot/share/runtime/reflection.cpp
+++ b/src/hotspot/share/runtime/reflection.cpp
@@ -628,7 +628,7 @@ bool Reflection::verify_member_access(const Klass* current_class,
TRAPS) {
// (DCEVM) Decide accessibility based on active version
- if (current_class != NULL) {
+ if (AllowEnhancedClassRedefinition && current_class != NULL) {
current_class = current_class->active_version();
}
--
2.23.0

View File

@@ -1,26 +0,0 @@
From dc675de6ac42819b8536827ea450fcad13a97448 Mon Sep 17 00:00:00 2001
From: Vladimir Dvorak <vladimir.dvorak@jetbrains.com>
Date: Wed, 11 Nov 2020 18:45:15 +0100
Subject: [PATCH 16/34] Fix LoadedClassesClosure - fixes problems with remote
debugging
---
src/hotspot/share/prims/jvmtiGetLoadedClasses.cpp | 2 +-
1 file changed, 1 insertion(+), 1 deletion(-)
diff --git a/src/hotspot/share/prims/jvmtiGetLoadedClasses.cpp b/src/hotspot/share/prims/jvmtiGetLoadedClasses.cpp
index 6c12ee64a6e..2a469555dbd 100644
--- a/src/hotspot/share/prims/jvmtiGetLoadedClasses.cpp
+++ b/src/hotspot/share/prims/jvmtiGetLoadedClasses.cpp
@@ -75,7 +75,7 @@ public:
// the new version (SystemDictionary stores only new versions). But the LoadedClassesClosure's functionality was
// changed in java8 where jvmtiLoadedClasses collects all classes from all classloaders, therefore we
// must use new versions only.
- if (AllowEnhancedClassRedefinition && k->new_version()==NULL) {
+ if (!AllowEnhancedClassRedefinition || k->new_version()==NULL) {
_classStack.push((jclass) _env->jni_reference(Handle(_cur_thread, k->java_mirror())));
if (_dictionary_walk) {
// Collect array classes this way when walking the dictionary (because array classes are
--
2.23.0
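
The one-line change above inverts the guard in LoadedClassesClosure: in the previous form the whole condition was false whenever AllowEnhancedClassRedefinition was off, so the closure collected nothing and remote debuggers saw an empty loaded-class list. A self-contained truth-table sketch of the corrected predicate (illustrative only; has_newer_version stands for k->new_version() != NULL):

#include <cassert>

// collect = !AllowEnhancedClassRedefinition || k->new_version() == NULL
static bool should_collect(bool allow_enhanced, bool has_newer_version) {
  return !allow_enhanced || !has_newer_version;
}

int main() {
  assert(should_collect(false, false));  // flag off: every class is reported (stock behaviour)
  assert(should_collect(false, true));   // flag off: still reported, regardless of version state
  assert(should_collect(true,  false));  // flag on: the newest version of a class is reported
  assert(!should_collect(true, true));   // flag on: superseded versions are skipped
  return 0;
}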


@@ -1,183 +0,0 @@
From 1d682efa88c716e1849163d5abff3a3367581d16 Mon Sep 17 00:00:00 2001
From: Vladimir Dvorak <vladimir.dvorak@jetbrains.com>
Date: Mon, 16 Nov 2020 21:11:19 +0100
Subject: [PATCH 18/34] pre dcevm15 - fix GC spaces originally in removed CMS
patch
---
src/hotspot/share/gc/shared/space.cpp | 16 ++++++++--------
src/hotspot/share/gc/shared/space.hpp | 6 +++---
src/hotspot/share/gc/shared/space.inline.hpp | 14 ++++++++------
.../share/prims/jvmtiEnhancedRedefineClasses.cpp | 6 ++----
4 files changed, 21 insertions(+), 21 deletions(-)
diff --git a/src/hotspot/share/gc/shared/space.cpp b/src/hotspot/share/gc/shared/space.cpp
index 875a6dc854f..9772c32c42e 100644
--- a/src/hotspot/share/gc/shared/space.cpp
+++ b/src/hotspot/share/gc/shared/space.cpp
@@ -375,11 +375,11 @@ HeapWord* CompactibleSpace::forward_compact_top(size_t size, CompactPoint* cp, H
}
HeapWord* CompactibleSpace::forward(oop q, size_t size,
- CompactPoint* cp, HeapWord* compact_top) {
+ CompactPoint* cp, HeapWord* compact_top, bool force_forward) {
compact_top = forward_compact_top(size, cp, compact_top);
// store the forwarding pointer into the mark word
- if (cast_from_oop<HeapWord*>(q) != compact_top || (size_t)q->size() != size) {
+ if (force_forward || cast_from_oop<HeapWord*>(q) != compact_top || (size_t)q->size() != size) {
q->forward_to(oop(compact_top));
assert(q->is_gc_marked(), "encoding the pointer should preserve the mark");
} else {
@@ -501,7 +501,7 @@ bool CompactibleSpace::must_rescue(oop old_obj, oop new_obj) {
} else {
assert(space_index(old_obj) != space_index(new_obj), "old_obj and new_obj must be in different spaces");
- if (tenured_gen->is_in_reserved(new_obj)) {
+ if (new_in_tenured) {
// Must never rescue when moving from the new into the old generation.
assert(GenCollectedHeap::heap()->young_gen()->is_in_reserved(old_obj), "old_obj must be in DefNewGeneration");
assert(space_index(old_obj) > space_index(new_obj), "must be");
@@ -824,14 +824,14 @@ void OffsetTableContigSpace::verify() const {
// Compute the forward sizes and leave out objects whose position could
// possibly overlap other objects.
HeapWord* CompactibleSpace::forward_with_rescue(HeapWord* q, size_t size,
- CompactPoint* cp, HeapWord* compact_top) {
+ CompactPoint* cp, HeapWord* compact_top, bool force_forward) {
size_t forward_size = size;
// (DCEVM) There is a new version of the class of q => different size
if (oop(q)->klass()->new_version() != NULL) {
size_t new_size = oop(q)->size_given_klass(oop(q)->klass()->new_version());
- assert(size != new_size, "instances without changed size have to be updated prior to GC run");
+ // assert(size != new_size, "instances without changed size have to be updated prior to GC run");
forward_size = new_size;
}
@@ -845,7 +845,7 @@ HeapWord* CompactibleSpace::forward_with_rescue(HeapWord* q, size_t size,
return compact_top;
}
- return forward(oop(q), forward_size, cp, compact_top);
+ return forward(oop(q), forward_size, cp, compact_top, force_forward);
}
// Compute the forwarding addresses for the objects that need to be rescued.
@@ -861,11 +861,11 @@ HeapWord* CompactibleSpace::forward_rescued(CompactPoint* cp, HeapWord* compact_
// (DCEVM) There is a new version of the class of q => different size
if (oop(q)->klass()->new_version() != NULL) {
size_t new_size = oop(q)->size_given_klass(oop(q)->klass()->new_version());
- assert(size != new_size, "instances without changed size have to be updated prior to GC run");
+ // assert(size != new_size, "instances without changed size have to be updated prior to GC run");
size = new_size;
}
- compact_top = cp->space->forward(oop(q), size, cp, compact_top);
+ compact_top = cp->space->forward(oop(q), size, cp, compact_top, true);
assert(compact_top <= end(), "must not write over end of space!");
}
MarkSweep::_rescued_oops->clear();
diff --git a/src/hotspot/share/gc/shared/space.hpp b/src/hotspot/share/gc/shared/space.hpp
index c9bfc365f0f..f7648995454 100644
--- a/src/hotspot/share/gc/shared/space.hpp
+++ b/src/hotspot/share/gc/shared/space.hpp
@@ -405,7 +405,7 @@ public:
virtual void prepare_for_compaction(CompactPoint* cp) = 0;
// MarkSweep support phase3
DEBUG_ONLY(int space_index(oop obj));
- bool must_rescue(oop old_obj, oop new_obj);
+ virtual bool must_rescue(oop old_obj, oop new_obj);
HeapWord* rescue(HeapWord* old_obj);
virtual void adjust_pointers();
// MarkSweep support phase4
@@ -436,11 +436,11 @@ public:
// function of the then-current compaction space, and updates "cp->threshold
// accordingly".
virtual HeapWord* forward(oop q, size_t size, CompactPoint* cp,
- HeapWord* compact_top);
+ HeapWord* compact_top, bool force_forward);
// (DCEVM) same as forward, but can rescue objects. Invoked only during
// redefinition runs
HeapWord* forward_with_rescue(HeapWord* q, size_t size, CompactPoint* cp,
- HeapWord* compact_top);
+ HeapWord* compact_top, bool force_forward);
HeapWord* forward_rescued(CompactPoint* cp, HeapWord* compact_top);
diff --git a/src/hotspot/share/gc/shared/space.inline.hpp b/src/hotspot/share/gc/shared/space.inline.hpp
index 5a93e93471b..fa645423685 100644
--- a/src/hotspot/share/gc/shared/space.inline.hpp
+++ b/src/hotspot/share/gc/shared/space.inline.hpp
@@ -163,6 +163,8 @@ inline void CompactibleSpace::scan_and_forward(SpaceType* space, CompactPoint* c
HeapWord* cur_obj = space->bottom();
HeapWord* scan_limit = space->scan_limit();
+ bool force_forward = false;
+
while (cur_obj < scan_limit) {
assert(!space->scanned_block_is_obj(cur_obj) ||
oop(cur_obj)->mark_raw().is_marked() || oop(cur_obj)->mark_raw().is_unlocked() ||
@@ -174,14 +176,15 @@ inline void CompactibleSpace::scan_and_forward(SpaceType* space, CompactPoint* c
size_t size = space->scanned_block_size(cur_obj);
if (redefinition_run) {
- compact_top = cp->space->forward_with_rescue(cur_obj, size, cp, compact_top);
+ compact_top = cp->space->forward_with_rescue(cur_obj, size, cp, compact_top, force_forward);
if (first_dead == NULL && oop(cur_obj)->is_gc_marked()) {
/* Was moved (otherwise, forward would reset mark),
set first_dead to here */
first_dead = cur_obj;
+ force_forward = true;
}
} else {
- compact_top = cp->space->forward(oop(cur_obj), size, cp, compact_top);
+ compact_top = cp->space->forward(oop(cur_obj), size, cp, compact_top, false);
}
cur_obj += size;
@@ -197,9 +200,9 @@ inline void CompactibleSpace::scan_and_forward(SpaceType* space, CompactPoint* c
// see if we might want to pretend this object is alive so that
// we don't have to compact quite as often.
- if (cur_obj == compact_top && dead_spacer.insert_deadspace(cur_obj, end)) {
+ if (!redefinition_run && cur_obj == compact_top && dead_spacer.insert_deadspace(cur_obj, end)) {
oop obj = oop(cur_obj);
- compact_top = cp->space->forward(obj, obj->size(), cp, compact_top);
+ compact_top = cp->space->forward(obj, obj->size(), cp, compact_top, force_forward);
end_of_live = end;
} else {
// otherwise, it really is a free region.
@@ -362,8 +365,7 @@ inline void CompactibleSpace::scan_and_compact(SpaceType* space, bool redefiniti
Prefetch::write(compaction_top, copy_interval);
// copy object and reinit its mark
- assert(cur_obj != compaction_top || oop(cur_obj)->klass()->new_version() != NULL,
- "everything in this pass should be moving");
+ assert(redefinition_run || cur_obj != compaction_top, "everything in this pass should be moving");
if (redefinition_run && oop(cur_obj)->klass()->new_version() != NULL) {
Klass* new_version = oop(cur_obj)->klass()->new_version();
if (new_version->update_information() == NULL) {
diff --git a/src/hotspot/share/prims/jvmtiEnhancedRedefineClasses.cpp b/src/hotspot/share/prims/jvmtiEnhancedRedefineClasses.cpp
index 718426f2819..1da6661dd3e 100644
--- a/src/hotspot/share/prims/jvmtiEnhancedRedefineClasses.cpp
+++ b/src/hotspot/share/prims/jvmtiEnhancedRedefineClasses.cpp
@@ -431,13 +431,11 @@ public:
Klass* new_klass = obj->klass()->new_version();
if (new_klass->update_information() != NULL) {
- int size_diff = obj->size() - obj->size_given_klass(new_klass);
-
- // Either new size is bigger or gap is to small to be filled
- if (size_diff < 0 || (size_diff > 0 && (size_t) size_diff < CollectedHeap::min_fill_size())) {
+ if (obj->size() - obj->size_given_klass(new_klass) != 0) {
// We need an instance update => set back to old klass
_needs_instance_update = true;
} else {
+ // Either new size is bigger or gap is to small to be filled
oop src = obj;
if (new_klass->is_copying_backwards()) {
copy_to_tmp(obj);
--
2.23.0
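
The heart of the patch above is the force_forward flag threaded through CompactibleSpace::forward() and forward_with_rescue(): the scan_and_forward() hunk switches it to true once the first live object of a redefinition run has been forwarded, so every later object records an explicit forwarding pointer even when it would not otherwise move or change size. A self-contained sketch of the resulting condition (names are illustrative and do not match the real HotSpot signature):

#include <cstddef>
#include <cstdint>

// An object needs a forwarding pointer when it moves, when its size will change
// under its redefined klass, or - the addition in this patch - unconditionally once
// force_forward has been set for the rest of the compaction pass.
bool needs_forwarding_pointer(uintptr_t obj_addr, uintptr_t compact_top,
                              size_t old_size, size_t new_size, bool force_forward) {
  return force_forward || obj_addr != compact_top || old_size != new_size;
}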


@@ -1,942 +0,0 @@
From 297f564f6af79fb824f5b4e9119f1d3d0c827fb0 Mon Sep 17 00:00:00 2001
From: Vladimir Dvorak <vladimir.dvorak@jetbrains.com>
Date: Mon, 16 Nov 2020 20:20:12 +0100
Subject: [PATCH 19/34] dcevm15 - fix java15 patch compilation issues
---
.../share/classfile/classFileParser.hpp | 8 +-
.../share/classfile/classLoaderData.cpp | 2 +-
src/hotspot/share/classfile/dictionary.hpp | 10 +-
src/hotspot/share/classfile/javaClasses.hpp | 2 +
.../share/gc/g1/g1FullGCCompactTask.cpp | 4 +-
.../share/gc/g1/g1FullGCCompactionPoint.cpp | 8 +-
.../share/gc/g1/g1FullGCPrepareTask.cpp | 4 +-
src/hotspot/share/gc/shared/dcevmSharedGC.cpp | 14 +-
src/hotspot/share/gc/shared/dcevmSharedGC.hpp | 2 +-
src/hotspot/share/gc/shared/gcConfig.cpp | 2 +-
src/hotspot/share/gc/shared/space.cpp | 4 +-
.../share/interpreter/linkResolver.cpp | 2 +-
src/hotspot/share/oops/instanceKlass.cpp | 17 ++-
src/hotspot/share/oops/instanceKlass.hpp | 1 +
src/hotspot/share/oops/klass.cpp | 8 +-
src/hotspot/share/prims/jvm.cpp | 2 +
.../prims/jvmtiEnhancedRedefineClasses.cpp | 129 +++++++++---------
.../prims/jvmtiEnhancedRedefineClasses.hpp | 14 +-
src/hotspot/share/prims/jvmtiEnv.cpp | 11 +-
.../share/prims/jvmtiRedefineClasses.cpp | 1 +
src/hotspot/share/prims/methodHandles.hpp | 3 +
src/hotspot/share/runtime/arguments.cpp | 22 +--
src/hotspot/share/runtime/mutexLocker.cpp | 2 +-
23 files changed, 159 insertions(+), 113 deletions(-)
diff --git a/src/hotspot/share/classfile/classFileParser.hpp b/src/hotspot/share/classfile/classFileParser.hpp
index e5761e61767..0d266b9047e 100644
--- a/src/hotspot/share/classfile/classFileParser.hpp
+++ b/src/hotspot/share/classfile/classFileParser.hpp
@@ -150,9 +150,6 @@ class ClassFileParser {
const intArray* _method_ordering;
GrowableArray<Method*>* _all_mirandas;
- // Enhanced class redefinition
- const bool _pick_newest;
-
enum { fixed_buffer_size = 128 };
u_char _linenumbertable_buffer[fixed_buffer_size];
@@ -206,6 +203,9 @@ class ClassFileParser {
bool _has_vanilla_constructor;
int _max_bootstrap_specifier_index; // detects BSS values
+ // (DCEVM) Enhanced class redefinition
+ const bool _pick_newest;
+
void parse_stream(const ClassFileStream* const stream, TRAPS);
void mangle_hidden_class_name(InstanceKlass* const ik);
@@ -582,7 +582,7 @@ class ClassFileParser {
ClassLoaderData* loader_data() const { return _loader_data; }
const Symbol* class_name() const { return _class_name; }
const InstanceKlass* super_klass() const { return _super_klass; }
- Array<Klass*>* local_interfaces() const { return _local_interfaces; }
+ Array<InstanceKlass*>* local_interfaces() const { return _local_interfaces; }
ReferenceType reference_type() const { return _rt; }
AccessFlags access_flags() const { return _access_flags; }
diff --git a/src/hotspot/share/classfile/classLoaderData.cpp b/src/hotspot/share/classfile/classLoaderData.cpp
index 4d64c6b454a..aadcd50ef4a 100644
--- a/src/hotspot/share/classfile/classLoaderData.cpp
+++ b/src/hotspot/share/classfile/classLoaderData.cpp
@@ -597,7 +597,7 @@ void ClassLoaderData::exchange_holders(ClassLoaderData* cld) {
oop holder_oop = _holder.peek();
_holder.replace(cld->_holder.peek());
cld->_holder.replace(holder_oop);
- WeakHandle<vm_class_loader_data> exchange = _holder;
+ WeakHandle<vm_weak_data> exchange = _holder;
_holder = cld->_holder;
cld->_holder = exchange;
}
diff --git a/src/hotspot/share/classfile/dictionary.hpp b/src/hotspot/share/classfile/dictionary.hpp
index 114a983e783..a50f4ff84d2 100644
--- a/src/hotspot/share/classfile/dictionary.hpp
+++ b/src/hotspot/share/classfile/dictionary.hpp
@@ -84,6 +84,11 @@ public:
void print_on(outputStream* st) const;
void verify();
+ // (DCEVM) Enhanced class redefinition
+ bool update_klass(unsigned int hash, Symbol* name, ClassLoaderData* loader_data, InstanceKlass* k, InstanceKlass* old_klass);
+
+ void rollback_redefinition();
+
private:
DictionaryEntry* new_entry(unsigned int hash, InstanceKlass* klass);
@@ -106,11 +111,6 @@ public:
void free_entry(DictionaryEntry* entry);
- // Enhanced class redefinition
- bool update_klass(unsigned int hash, Symbol* name, ClassLoaderData* loader_data, InstanceKlass* k, InstanceKlass* old_klass);
-
- void rollback_redefinition();
-
// (DCEVM) return old class if redefining in AllowEnhancedClassRedefinition, otherwise return "k"
static InstanceKlass* old_if_redefined(InstanceKlass* k) {
return (k != NULL && k->is_redefining()) ? ((InstanceKlass* )k->old_version()) : k;
diff --git a/src/hotspot/share/classfile/javaClasses.hpp b/src/hotspot/share/classfile/javaClasses.hpp
index a68c5139151..9abf2e1d105 100644
--- a/src/hotspot/share/classfile/javaClasses.hpp
+++ b/src/hotspot/share/classfile/javaClasses.hpp
@@ -255,7 +255,9 @@ class java_lang_Class : AllStatic {
static void set_init_lock(oop java_class, oop init_lock);
static void set_protection_domain(oop java_class, oop protection_domain);
static void set_class_loader(oop java_class, oop class_loader);
+ public: // DCEVM
static void set_component_mirror(oop java_class, oop comp_mirror);
+ private:
static void initialize_mirror_fields(Klass* k, Handle mirror, Handle protection_domain,
Handle classData, TRAPS);
static void initialize_mirror_fields(Klass* k, Handle mirror, Handle protection_domain, TRAPS);
diff --git a/src/hotspot/share/gc/g1/g1FullGCCompactTask.cpp b/src/hotspot/share/gc/g1/g1FullGCCompactTask.cpp
index f70f4606dc8..a22ed48560d 100644
--- a/src/hotspot/share/gc/g1/g1FullGCCompactTask.cpp
+++ b/src/hotspot/share/gc/g1/g1FullGCCompactTask.cpp
@@ -157,14 +157,14 @@ void G1FullGCCompactTask::serial_compaction_dcevm() {
size_t G1FullGCCompactTask::G1CompactRegionClosureDcevm::apply(oop obj) {
size_t size = obj->size();
- HeapWord* destination = (HeapWord*)obj->forwardee();
+ HeapWord* destination = cast_from_oop<HeapWord*>(obj->forwardee());
if (destination == NULL) {
// Object not moving
return size;
}
// copy object and reinit its mark
- HeapWord* obj_addr = (HeapWord*) obj;
+ HeapWord* obj_addr = cast_from_oop<HeapWord*>(obj);
if (!_rescue_oops_it->at_end() && **_rescue_oops_it == obj_addr) {
++(*_rescue_oops_it);
diff --git a/src/hotspot/share/gc/g1/g1FullGCCompactionPoint.cpp b/src/hotspot/share/gc/g1/g1FullGCCompactionPoint.cpp
index 1e49571c999..755935a2c91 100644
--- a/src/hotspot/share/gc/g1/g1FullGCCompactionPoint.cpp
+++ b/src/hotspot/share/gc/g1/g1FullGCCompactionPoint.cpp
@@ -174,7 +174,7 @@ void G1FullGCCompactionPoint::forward_dcevm(oop object, size_t size, bool force_
assert(_current_region != NULL, "Must have been initialized");
// Store a forwarding pointer if the object should be moved.
- if ((HeapWord*)object != _compaction_top || force_forward) {
+ if (cast_from_oop<HeapWord*>(object) != _compaction_top || force_forward) {
object->forward_to(oop(_compaction_top));
} else {
if (object->forwardee() != NULL) {
@@ -188,11 +188,11 @@ void G1FullGCCompactionPoint::forward_dcevm(oop object, size_t size, bool force_
} else {
// Make sure object has the correct mark-word set or that it will be
// fixed when restoring the preserved marks.
- assert(object->mark_raw() == markOopDesc::prototype_for_object(object) || // Correct mark
- object->mark_raw()->must_be_preserved(object) || // Will be restored by PreservedMarksSet
+ assert(object->mark_raw() == markWord::prototype_for_klass(object->klass()) || // Correct mark
+ object->mark_must_be_preserved() || // Will be restored by PreservedMarksSet
(UseBiasedLocking && object->has_bias_pattern_raw()), // Will be restored by BiasedLocking
"should have correct prototype obj: " PTR_FORMAT " mark: " PTR_FORMAT " prototype: " PTR_FORMAT,
- p2i(object), p2i(object->mark_raw()), p2i(markOopDesc::prototype_for_object(object)));
+ p2i(object), object->mark_raw().value(), markWord::prototype_for_klass(object->klass()).value());
}
assert(object->forwardee() == NULL, "should be forwarded to NULL");
}
diff --git a/src/hotspot/share/gc/g1/g1FullGCPrepareTask.cpp b/src/hotspot/share/gc/g1/g1FullGCPrepareTask.cpp
index a45681b60cf..2f06b9617e4 100644
--- a/src/hotspot/share/gc/g1/g1FullGCPrepareTask.cpp
+++ b/src/hotspot/share/gc/g1/g1FullGCPrepareTask.cpp
@@ -269,7 +269,7 @@ size_t G1FullGCPrepareTask::G1PrepareCompactLiveClosureDcevm::apply(oop object)
HeapWord* compact_top = _cp->forward_compact_top(forward_size);
if (compact_top == NULL || must_rescue(object, oop(compact_top))) {
- _cp->rescued_oops()->append((HeapWord*)object);
+ _cp->rescued_oops()->append(cast_from_oop<HeapWord*>(object));
} else {
_cp->forward_dcevm(object, forward_size, (size != forward_size));
}
@@ -295,7 +295,7 @@ bool G1FullGCPrepareTask::G1PrepareCompactLiveClosureDcevm::must_rescue(oop old_
int new_size = old_obj->size_given_klass(oop(old_obj)->klass()->new_version());
int original_size = old_obj->size();
- bool overlap = ((HeapWord*)old_obj + original_size < (HeapWord*)new_obj + new_size);
+ bool overlap = (cast_from_oop<HeapWord*>(old_obj) + original_size < cast_from_oop<HeapWord*>(new_obj) + new_size);
return overlap;
}
diff --git a/src/hotspot/share/gc/shared/dcevmSharedGC.cpp b/src/hotspot/share/gc/shared/dcevmSharedGC.cpp
index 803e645f843..3dee097f1d3 100644
--- a/src/hotspot/share/gc/shared/dcevmSharedGC.cpp
+++ b/src/hotspot/share/gc/shared/dcevmSharedGC.cpp
@@ -58,10 +58,10 @@ void DcevmSharedGC::copy_rescued_objects_back(GrowableArray<HeapWord*>* rescued_
DcevmSharedGC::update_fields(rescued_obj, new_obj);
} else {
rescued_obj->set_klass(new_klass);
- Copy::aligned_disjoint_words((HeapWord*)rescued_obj, (HeapWord*)new_obj, size);
+ Copy::aligned_disjoint_words(cast_from_oop<HeapWord*>(rescued_obj), cast_from_oop<HeapWord*>(new_obj), size);
}
} else {
- Copy::aligned_disjoint_words((HeapWord*)rescued_obj, (HeapWord*)new_obj, size);
+ Copy::aligned_disjoint_words(cast_from_oop<HeapWord*>(rescued_obj), cast_from_oop<HeapWord*>(new_obj), size);
}
new_obj->init_mark_raw();
@@ -111,11 +111,11 @@ void DcevmSharedGC::update_fields(oop q, oop new_location) {
// Save object somewhere, there is an overlap in fields
if (new_klass_oop->is_copying_backwards()) {
- if (((HeapWord *)q >= (HeapWord *)new_location && (HeapWord *)q < (HeapWord *)new_location + new_size) ||
- ((HeapWord *)new_location >= (HeapWord *)q && (HeapWord *)new_location < (HeapWord *)q + size)) {
+ if ((cast_from_oop<HeapWord*>(q) >= cast_from_oop<HeapWord*>(new_location) && cast_from_oop<HeapWord*>(q) < cast_from_oop<HeapWord*>(new_location) + new_size) ||
+ (cast_from_oop<HeapWord*>(new_location) >= cast_from_oop<HeapWord*>(q) && cast_from_oop<HeapWord*>(new_location) < cast_from_oop<HeapWord*>(q) + size)) {
tmp = NEW_RESOURCE_ARRAY(HeapWord, size);
q = (oop) tmp;
- Copy::aligned_disjoint_words((HeapWord*)tmp_obj, (HeapWord*)q, size);
+ Copy::aligned_disjoint_words(cast_from_oop<HeapWord*>(tmp_obj), cast_from_oop<HeapWord*>(q), size);
}
}
@@ -131,13 +131,13 @@ void DcevmSharedGC::update_fields(oop q, oop new_location) {
void DcevmSharedGC::update_fields(oop new_location, oop tmp_obj, int *cur) {
assert(cur != NULL, "just checking");
- char* to = (char*)(HeapWord*)new_location;
+ char* to = (char*)cast_from_oop<HeapWord*>(new_location);
while (*cur != 0) {
int size = *cur;
if (size > 0) {
cur++;
int offset = *cur;
- HeapWord* from = (HeapWord*)(((char *)(HeapWord*)tmp_obj) + offset);
+ HeapWord* from = (HeapWord*)(((char *)cast_from_oop<HeapWord*>(tmp_obj)) + offset);
if (size == HeapWordSize) {
*((HeapWord*)to) = *from;
} else if (size == HeapWordSize * 2) {
diff --git a/src/hotspot/share/gc/shared/dcevmSharedGC.hpp b/src/hotspot/share/gc/shared/dcevmSharedGC.hpp
index e2ef0171fb2..a4e27e00280 100644
--- a/src/hotspot/share/gc/shared/dcevmSharedGC.hpp
+++ b/src/hotspot/share/gc/shared/dcevmSharedGC.hpp
@@ -29,7 +29,7 @@
#include "gc/shared/genOopClosures.hpp"
#include "gc/shared/taskqueue.hpp"
#include "memory/iterator.hpp"
-#include "oops/markOop.hpp"
+#include "oops/markWord.hpp"
#include "oops/oop.hpp"
#include "runtime/timer.hpp"
#include "utilities/growableArray.hpp"
diff --git a/src/hotspot/share/gc/shared/gcConfig.cpp b/src/hotspot/share/gc/shared/gcConfig.cpp
index f01d64d1434..5c1a09390f1 100644
--- a/src/hotspot/share/gc/shared/gcConfig.cpp
+++ b/src/hotspot/share/gc/shared/gcConfig.cpp
@@ -100,7 +100,7 @@ void GCConfig::fail_if_non_included_gc_is_selected() {
void GCConfig::select_gc_ergonomically() {
if (AllowEnhancedClassRedefinition && !UseG1GC) {
// Enhanced class redefinition only supports serial GC at the moment
- FLAG_SET_ERGO(bool, UseSerialGC, true);
+ FLAG_SET_ERGO(UseSerialGC, true);
} else if (os::is_server_class_machine()) {
#if INCLUDE_G1GC
FLAG_SET_ERGO_IF_DEFAULT(UseG1GC, true);
diff --git a/src/hotspot/share/gc/shared/space.cpp b/src/hotspot/share/gc/shared/space.cpp
index 9772c32c42e..e8e3d7884c2 100644
--- a/src/hotspot/share/gc/shared/space.cpp
+++ b/src/hotspot/share/gc/shared/space.cpp
@@ -440,7 +440,7 @@ int CompactibleSpace::space_index(oop obj) {
index++;
}
- tty->print_cr("could not compute space_index for %08xh", (HeapWord*)obj);
+ tty->print_cr("could not compute space_index for %08xh", cast_from_oop<HeapWord*>(obj));
index = 0;
Generation* gen = heap->old_gen();
@@ -485,7 +485,7 @@ bool CompactibleSpace::must_rescue(oop old_obj, oop new_obj) {
bool new_in_tenured = tenured_gen->is_in_reserved(new_obj);
if (old_in_tenured == new_in_tenured) {
// Rescue if object may overlap with a higher memory address.
- bool overlap = ((HeapWord*)old_obj + original_size < (HeapWord*)new_obj + new_size);
+ bool overlap = (cast_from_oop<HeapWord*>(old_obj) + original_size < cast_from_oop<HeapWord*>(new_obj) + new_size);
if (old_in_tenured) {
// Old and new address are in same space, so just compare the address.
// Must rescue if object moves towards the top of the space.
diff --git a/src/hotspot/share/interpreter/linkResolver.cpp b/src/hotspot/share/interpreter/linkResolver.cpp
index b2f24ddbeda..9daeeb70b34 100644
--- a/src/hotspot/share/interpreter/linkResolver.cpp
+++ b/src/hotspot/share/interpreter/linkResolver.cpp
@@ -1031,7 +1031,7 @@ void LinkResolver::resolve_field(fieldDescriptor& fd,
assert(m != NULL, "information about the current method must be available for 'put' bytecodes");
bool is_initialized_static_final_update = (byte == Bytecodes::_putstatic &&
fd.is_static() &&
- !(m()->is_static_initializer() || m()->name() == vmSymbols::ha_class_initializer_name()));
+ !(m->is_static_initializer() || m->name() == vmSymbols::ha_class_initializer_name()));
bool is_initialized_instance_final_update = ((byte == Bytecodes::_putfield || byte == Bytecodes::_nofast_putfield) &&
!fd.is_static() &&
!m->is_object_initializer());
diff --git a/src/hotspot/share/oops/instanceKlass.cpp b/src/hotspot/share/oops/instanceKlass.cpp
index f8e60941046..5e40d78a87e 100644
--- a/src/hotspot/share/oops/instanceKlass.cpp
+++ b/src/hotspot/share/oops/instanceKlass.cpp
@@ -1316,7 +1316,7 @@ void InstanceKlass::init_implementor() {
// (DCEVM) - init_implementor() for dcevm
void InstanceKlass::init_implementor_from_redefine() {
assert(is_interface(), "not interface");
- Klass** addr = adr_implementor();
+ Klass* volatile* addr = adr_implementor();
assert(addr != NULL, "null addr");
if (addr != NULL) {
*addr = NULL;
@@ -1659,6 +1659,21 @@ void InstanceKlass::methods_do(void f(Method* method)) {
}
}
+void InstanceKlass::methods_do(void f(Method* method, TRAPS), TRAPS) {
+ // Methods aren't stable until they are loaded. This can be read outside
+ // a lock through the ClassLoaderData for profiling
+ if (!is_loaded()) {
+ return;
+ }
+
+ int len = methods()->length();
+ for (int index = 0; index < len; index++) {
+ Method* m = methods()->at(index);
+ assert(m->is_method(), "must be method");
+ f(m, CHECK);
+ }
+}
+
// (DCEVM) Update information contains mapping of fields from old class to the new class.
// Info is stored on HEAP, you need to call clear_update_information to free the space.
void InstanceKlass::store_update_information(GrowableArray<int> &values) {
diff --git a/src/hotspot/share/oops/instanceKlass.hpp b/src/hotspot/share/oops/instanceKlass.hpp
index 6ead9426728..b56d42cb177 100644
--- a/src/hotspot/share/oops/instanceKlass.hpp
+++ b/src/hotspot/share/oops/instanceKlass.hpp
@@ -1069,6 +1069,7 @@ public:
void clear_update_information();
void methods_do(void f(Method* method));
+ void methods_do(void f(Method* method, TRAPS), TRAPS);
void array_klasses_do(void f(Klass* k));
void array_klasses_do(void f(Klass* k, TRAPS), TRAPS);
diff --git a/src/hotspot/share/oops/klass.cpp b/src/hotspot/share/oops/klass.cpp
index 352d8f84631..88f5ec9ba4a 100644
--- a/src/hotspot/share/oops/klass.cpp
+++ b/src/hotspot/share/oops/klass.cpp
@@ -200,13 +200,13 @@ void* Klass::operator new(size_t size, ClassLoaderData* loader_data, size_t word
Klass::Klass(KlassID id) : _id(id),
_java_mirror(NULL),
_prototype_header(markWord::prototype()),
- _shared_class_path_index(-1),
- _new_version(NULL),
_old_version(NULL),
+ _new_version(NULL),
+ _redefinition_flags(Klass::NoRedefinition),
_is_redefining(false),
+ _update_information(NULL),
_is_copying_backwards(false),
- _redefinition_flags(Klass::NoRedefinition),
- _update_information(NULL) {
+ _shared_class_path_index(-1) {
CDS_ONLY(_shared_class_flags = 0;)
CDS_JAVA_HEAP_ONLY(_archived_mirror = 0;)
_primary_supers[0] = this;
diff --git a/src/hotspot/share/prims/jvm.cpp b/src/hotspot/share/prims/jvm.cpp
index 333b65ccfc1..13bcac352fb 100644
--- a/src/hotspot/share/prims/jvm.cpp
+++ b/src/hotspot/share/prims/jvm.cpp
@@ -1054,6 +1054,7 @@ static jclass jvm_lookup_define_class(JNIEnv *env, jclass lookup, const char *na
class_loader,
protection_domain,
&st,
+ NULL,
CHECK_NULL);
if (log_is_enabled(Debug, class, resolve) && defined_k != NULL) {
@@ -1074,6 +1075,7 @@ static jclass jvm_lookup_define_class(JNIEnv *env, jclass lookup, const char *na
class_loader,
&st,
cl_info,
+ NULL,
CHECK_NULL);
if (defined_k == NULL) {
THROW_MSG_0(vmSymbols::java_lang_Error(), "Failure to define a hidden class");
diff --git a/src/hotspot/share/prims/jvmtiEnhancedRedefineClasses.cpp b/src/hotspot/share/prims/jvmtiEnhancedRedefineClasses.cpp
index 1da6661dd3e..619e3988e3a 100644
--- a/src/hotspot/share/prims/jvmtiEnhancedRedefineClasses.cpp
+++ b/src/hotspot/share/prims/jvmtiEnhancedRedefineClasses.cpp
@@ -24,11 +24,14 @@
#include "precompiled.hpp"
#include "aot/aotLoader.hpp"
+#include "classfile/classFileParser.hpp"
#include "classfile/classFileStream.hpp"
#include "classfile/metadataOnStackMark.hpp"
#include "classfile/systemDictionary.hpp"
#include "classfile/verifier.hpp"
#include "classfile/dictionary.hpp"
+#include "classfile/classLoaderDataGraph.hpp"
+#include "interpreter/linkResolver.hpp"
#include "interpreter/oopMapCache.hpp"
#include "interpreter/rewriter.hpp"
#include "logging/logStream.hpp"
@@ -37,17 +40,22 @@
#include "memory/resourceArea.hpp"
#include "memory/iterator.inline.hpp"
#include "oops/fieldStreams.hpp"
+#include "oops/fieldStreams.inline.hpp"
#include "oops/klassVtable.hpp"
#include "oops/oop.inline.hpp"
#include "oops/constantPool.inline.hpp"
+#include "oops/metadata.hpp"
+#include "oops/methodData.hpp"
#include "prims/jvmtiImpl.hpp"
#include "prims/jvmtiClassFileReconstituter.hpp"
#include "prims/jvmtiEnhancedRedefineClasses.hpp"
#include "prims/methodComparator.hpp"
#include "prims/resolvedMethodTable.hpp"
+#include "prims/methodHandles.hpp"
#include "runtime/deoptimization.hpp"
#include "runtime/jniHandles.inline.hpp"
#include "runtime/relocator.hpp"
+#include "runtime/fieldDescriptor.hpp"
#include "runtime/fieldDescriptor.inline.hpp"
#include "utilities/bitMap.inline.hpp"
#include "prims/jvmtiThreadState.inline.hpp"
@@ -55,6 +63,8 @@
#include "oops/constantPool.inline.hpp"
#include "gc/g1/g1CollectedHeap.hpp"
#include "gc/shared/dcevmSharedGC.hpp"
+#include "gc/shared/scavengableNMethods.hpp"
+#include "ci/ciObjectFactory.hpp"
Array<Method*>* VM_EnhancedRedefineClasses::_old_methods = NULL;
Array<Method*>* VM_EnhancedRedefineClasses::_new_methods = NULL;
@@ -66,6 +76,7 @@ int VM_EnhancedRedefineClasses::_matching_methods_length = 0;
int VM_EnhancedRedefineClasses::_deleted_methods_length = 0;
int VM_EnhancedRedefineClasses::_added_methods_length = 0;
Klass* VM_EnhancedRedefineClasses::_the_class_oop = NULL;
+u8 VM_EnhancedRedefineClasses::_id_counter = 0;
//
// Create new instance of enhanced class redefiner.
@@ -88,6 +99,7 @@ VM_EnhancedRedefineClasses::VM_EnhancedRedefineClasses(jint class_count, const j
_class_load_kind = class_load_kind;
_res = JVMTI_ERROR_NONE;
_any_class_has_resolved_methods = false;
+ _id = next_id();
}
static inline InstanceKlass* get_ik(jclass def) {
@@ -211,9 +223,7 @@ class FieldCopier : public FieldClosure {
// TODO: review...
void VM_EnhancedRedefineClasses::mark_as_scavengable(nmethod* nm) {
- if (!nm->on_scavenge_root_list()) {
- CodeCache::add_scavenge_root_nmethod(nm);
- }
+ ScavengableNMethods::register_nmethod(nm);
}
void VM_EnhancedRedefineClasses::unregister_nmethod_g1(nmethod* nm) {
@@ -414,7 +424,7 @@ public:
_tmp_obj_size = size;
_tmp_obj = (oop)resource_allocate_bytes(size * HeapWordSize);
}
- Copy::aligned_disjoint_words((HeapWord*)o, (HeapWord*)_tmp_obj, size);
+ Copy::aligned_disjoint_words(cast_from_oop<HeapWord*>(o), cast_from_oop<HeapWord*>(_tmp_obj), size);
}
virtual void do_object(oop obj) {
@@ -505,9 +515,6 @@ void VM_EnhancedRedefineClasses::doit() {
ClearCpoolCacheAndUnpatch clear_cpool_cache(thread);
ClassLoaderDataGraph::classes_do(&clear_cpool_cache);
-
- // SystemDictionary::methods_do(fix_invoke_method);
-
// JSR-292 support
if (_any_class_has_resolved_methods) {
bool trace_name_printed = false;
@@ -564,8 +571,8 @@ void VM_EnhancedRedefineClasses::doit() {
InstanceKlass* old = InstanceKlass::cast(cur->old_version());
// Swap marks to have same hashcodes
- markOop cur_mark = cur->prototype_header();
- markOop old_mark = old->prototype_header();
+ markWord cur_mark = cur->prototype_header();
+ markWord old_mark = old->prototype_header();
cur->set_prototype_header(old_mark);
old->set_prototype_header(cur_mark);
@@ -579,14 +586,14 @@ void VM_EnhancedRedefineClasses::doit() {
// Revert pool holder for old version of klass (it was updated by one of ours closure!)
old->constants()->set_pool_holder(old);
- Klass* array_klasses = old->array_klasses();
+ ObjArrayKlass* array_klasses = old->array_klasses();
if (array_klasses != NULL) {
assert(cur->array_klasses() == NULL, "just checking");
// Transfer the array classes, otherwise we might get cast exceptions when casting array types.
// Also, set array klasses element klass.
cur->set_array_klasses(array_klasses);
- ObjArrayKlass::cast(array_klasses)->set_element_klass(cur);
+ array_klasses->set_element_klass(cur);
java_lang_Class::release_set_array_klass(cur->java_mirror(), array_klasses);
java_lang_Class::set_component_mirror(array_klasses->java_mirror(), cur->java_mirror());
}
@@ -641,11 +648,15 @@ void VM_EnhancedRedefineClasses::doit() {
//ClassLoaderDataGraph::classes_do(&clean_weak_method_links);
// Disable any dependent concurrent compilations
- SystemDictionary::notice_modification();
+ // SystemDictionary::notice_modification();
+
+ JvmtiExport::increment_redefinition_count();
// Set flag indicating that some invariants are no longer true.
// See jvmtiExport.hpp for detailed explanation.
- JvmtiExport::set_has_redefined_a_class();
+
+ // dcevm15: handled by _redefinition_count
+ // JvmtiExport::set_has_redefined_a_class();
#ifdef PRODUCT
if (log_is_enabled(Trace, redefine, class, obsolete, metadata)) {
@@ -718,7 +729,7 @@ bool VM_EnhancedRedefineClasses::is_modifiable_class(oop klass_mirror) {
}
// Cannot redefine or retransform an anonymous class.
- if (InstanceKlass::cast(k)->is_anonymous()) {
+ if (InstanceKlass::cast(k)->is_unsafe_anonymous()) {
return false;
}
return true;
@@ -804,22 +815,30 @@ jvmtiError VM_EnhancedRedefineClasses::load_new_class_versions(TRAPS) {
InstanceKlass* k;
- if (InstanceKlass::cast(the_class)->is_anonymous()) {
- const InstanceKlass* host_class = the_class->host_klass();
+ if (InstanceKlass::cast(the_class)->is_unsafe_anonymous()) {
+ const InstanceKlass* host_class = the_class->unsafe_anonymous_host();
// Make sure it's the real host class, not another anonymous class.
- while (host_class != NULL && host_class->is_anonymous()) {
- host_class = host_class->host_klass();
+ while (host_class != NULL && host_class->is_unsafe_anonymous()) {
+ host_class = host_class->unsafe_anonymous_host();
}
+ ClassLoadInfo cl_info(protection_domain,
+ host_class,
+ NULL, // dynamic_nest_host
+ NULL, // cp_patches
+ Handle(), // classData
+ false, // is_hidden
+ false, // is_strong_hidden
+ true); // FIXME: check if correct. can_access_vm_annotations
+
k = SystemDictionary::parse_stream(the_class_sym,
the_class_loader,
- protection_domain,
&st,
- host_class,
+ cl_info,
the_class,
- NULL,
THREAD);
+
k->class_loader_data()->exchange_holders(the_class->class_loader_data());
the_class->class_loader_data()->inc_keep_alive();
} else {
@@ -966,7 +985,7 @@ int VM_EnhancedRedefineClasses::calculate_redefinition_flags(InstanceKlass* new_
// Check interfaces
// Interfaces removed?
- Array<Klass*>* old_interfaces = the_class->transitive_interfaces();
+ Array<InstanceKlass*>* old_interfaces = the_class->transitive_interfaces();
for (i = 0; i < old_interfaces->length(); i++) {
InstanceKlass* old_interface = InstanceKlass::cast(old_interfaces->at(i));
if (!new_class->implements_interface_any_version(old_interface)) {
@@ -976,7 +995,7 @@ int VM_EnhancedRedefineClasses::calculate_redefinition_flags(InstanceKlass* new_
}
// Interfaces added?
- Array<Klass*>* new_interfaces = new_class->transitive_interfaces();
+ Array<InstanceKlass*>* new_interfaces = new_class->transitive_interfaces();
for (i = 0; i<new_interfaces->length(); i++) {
if (!the_class->implements_interface_any_version(new_interfaces->at(i))) {
result = result | Klass::ModifyClass;
@@ -1389,8 +1408,8 @@ void VM_EnhancedRedefineClasses::rollback() {
// Rewrite faster byte-codes back to their slower equivalent. Undoes rewriting happening in templateTable_xxx.cpp
// The reason is that once we zero cpool caches, we need to re-resolve all entries again. Faster bytecodes do not
// do that, they assume that cache entry is resolved already.
-void VM_EnhancedRedefineClasses::unpatch_bytecode(Method* method) {
- RawBytecodeStream bcs(method);
+void VM_EnhancedRedefineClasses::unpatch_bytecode(Method* method, TRAPS) {
+ RawBytecodeStream bcs(methodHandle(THREAD, method));
Bytecodes::Code code;
Bytecodes::Code java_code;
while (!bcs.is_last_bytecode()) {
@@ -1454,11 +1473,11 @@ void VM_EnhancedRedefineClasses::ClearCpoolCacheAndUnpatch::do_klass(Klass* k) {
HandleMark hm(_thread);
InstanceKlass *ik = InstanceKlass::cast(k);
- constantPoolHandle other_cp = constantPoolHandle(ik->constants());
+ constantPoolHandle other_cp = constantPoolHandle(_thread, ik->constants());
// Update host klass of anonymous classes (for example, produced by lambdas) to newest version.
- if (ik->is_anonymous() && ik->host_klass()->new_version() != NULL) {
- ik->set_host_klass(InstanceKlass::cast(ik->host_klass()->newest_version()));
+ if (ik->is_unsafe_anonymous() && ik->unsafe_anonymous_host()->new_version() != NULL) {
+ ik->set_unsafe_anonymous_host(InstanceKlass::cast(ik->unsafe_anonymous_host()->newest_version()));
}
// Update implementor if there is only one, in this case implementor() can reference old class
@@ -1492,7 +1511,18 @@ void VM_EnhancedRedefineClasses::ClearCpoolCacheAndUnpatch::do_klass(Klass* k) {
// If bytecode rewriting is enabled, we also need to unpatch bytecode to force resolution of zeroed entries
if (RewriteBytecodes) {
- ik->methods_do(unpatch_bytecode);
+ ik->methods_do(unpatch_bytecode, _thread);
+ }
+}
+
+u8 VM_EnhancedRedefineClasses::next_id() {
+ while (true) {
+ u8 id = _id_counter;
+ u8 next_id = id + 1;
+ u8 result = Atomic::cmpxchg(&_id_counter, id, next_id);
+ if (result == id) {
+ return next_id;
+ }
}
}
@@ -1512,31 +1542,8 @@ void VM_EnhancedRedefineClasses::MethodDataCleaner::do_klass(Klass* k) {
}
}
-void VM_EnhancedRedefineClasses::fix_invoke_method(Method* method) {
-
- constantPoolHandle other_cp = constantPoolHandle(method->constants());
-
- for (int i = 0; i < other_cp->length(); i++) {
- if (other_cp->tag_at(i).is_klass()) {
- Klass* klass = other_cp->resolved_klass_at(i);
- if (klass->new_version() != NULL) {
- // Constant pool entry points to redefined class -- update to the new version
- other_cp->klass_at_put(i, klass->newest_version());
- }
- assert(other_cp->resolved_klass_at(i)->new_version() == NULL, "Must be new klass!");
- }
- }
-
- ConstantPoolCache* cp_cache = other_cp->cache();
- if (cp_cache != NULL) {
- cp_cache->clear_entries();
- }
-
-}
-
-
-void VM_EnhancedRedefineClasses::update_jmethod_ids() {
+void VM_EnhancedRedefineClasses::update_jmethod_ids(TRAPS) {
for (int j = 0; j < _matching_methods_length; ++j) {
Method* old_method = _matching_old_methods[j];
jmethodID jmid = old_method->find_jmethod_id_or_null();
@@ -1547,10 +1554,10 @@ void VM_EnhancedRedefineClasses::update_jmethod_ids() {
if (jmid != NULL) {
// There is a jmethodID, change it to point to the new method
- methodHandle new_method_h(_matching_new_methods[j]);
+ methodHandle new_method_h(THREAD, _matching_new_methods[j]);
if (old_method->new_version() == NULL) {
- methodHandle old_method_h(_matching_old_methods[j]);
+ methodHandle old_method_h(THREAD, _matching_old_methods[j]);
jmethodID new_jmethod_id = Method::make_jmethod_id(old_method_h->method_holder()->class_loader_data(), old_method_h());
bool result = InstanceKlass::cast(old_method_h->method_holder())->update_jmethod_id(old_method_h(), new_jmethod_id);
} else {
@@ -1887,7 +1894,7 @@ void VM_EnhancedRedefineClasses::redefine_single_class(InstanceKlass* new_class_
// track number of methods that are EMCP for add_previous_version() call below
check_methods_and_mark_as_obsolete();
- update_jmethod_ids();
+ update_jmethod_ids(THREAD);
_any_class_has_resolved_methods = the_class->has_resolved_methods() || _any_class_has_resolved_methods;
@@ -2119,12 +2126,12 @@ jvmtiError VM_EnhancedRedefineClasses::do_topological_class_sorting(TRAPS) {
Handle protection_domain(THREAD, klass->protection_domain());
+ ClassLoadInfo cl_info(protection_domain);
+
ClassFileParser parser(&st,
klass->name(),
klass->class_loader_data(),
- protection_domain,
- NULL, // host_klass
- NULL, // cp_patches
+ &cl_info,
ClassFileParser::INTERNAL, // publicity level
true,
THREAD);
@@ -2134,7 +2141,7 @@ jvmtiError VM_EnhancedRedefineClasses::do_topological_class_sorting(TRAPS) {
links.append(KlassPair(super_klass, klass));
}
- Array<Klass*>* local_interfaces = parser.local_interfaces();
+ Array<InstanceKlass*>* local_interfaces = parser.local_interfaces();
for (int j = 0; j < local_interfaces->length(); j++) {
Klass* iface = local_interfaces->at(j);
if (iface != NULL && _affected_klasses->contains(iface)) {
@@ -2157,7 +2164,7 @@ jvmtiError VM_EnhancedRedefineClasses::do_topological_class_sorting(TRAPS) {
links.append(KlassPair(super_klass, klass));
}
- Array<Klass*>* local_interfaces = klass->local_interfaces();
+ Array<InstanceKlass*>* local_interfaces = klass->local_interfaces();
for (int j = 0; j < local_interfaces->length(); j++) {
Klass* interfaceKlass = local_interfaces->at(j);
if (_affected_klasses->contains(interfaceKlass)) {
diff --git a/src/hotspot/share/prims/jvmtiEnhancedRedefineClasses.hpp b/src/hotspot/share/prims/jvmtiEnhancedRedefineClasses.hpp
index 4c0412d343d..0066088b3b0 100644
--- a/src/hotspot/share/prims/jvmtiEnhancedRedefineClasses.hpp
+++ b/src/hotspot/share/prims/jvmtiEnhancedRedefineClasses.hpp
@@ -32,7 +32,7 @@
#include "memory/resourceArea.hpp"
#include "oops/objArrayKlass.hpp"
#include "oops/objArrayOop.hpp"
-#include "gc/shared/vmGCOperations.hpp"
+#include "gc/shared/gcVMOperations.hpp"
#include "../../../java.base/unix/native/include/jni_md.h"
//
@@ -59,6 +59,7 @@ class VM_EnhancedRedefineClasses: public VM_GC_Operation {
static int _deleted_methods_length;
static int _added_methods_length;
static Klass* _the_class_oop;
+ static u8 _id_counter;
// The instance fields are used to pass information from
// doit_prologue() to doit() and doit_epilogue().
@@ -91,6 +92,9 @@ class VM_EnhancedRedefineClasses: public VM_GC_Operation {
elapsedTimer _timer_heap_iterate;
elapsedTimer _timer_heap_full_gc;
+ // Redefinition id used by JFR
+ u8 _id;
+
// These routines are roughly in call order unless otherwise noted.
// Load and link new classes (either redefined or affected by redefinition - subclass, ...)
@@ -118,15 +122,14 @@ class VM_EnhancedRedefineClasses: public VM_GC_Operation {
static void mark_as_scavengable(nmethod* nm);
static void unregister_nmethod_g1(nmethod* nm);
static void register_nmethod_g1(nmethod* nm);
- static void unpatch_bytecode(Method* method);
- static void fix_invoke_method(Method* method);
+ static void unpatch_bytecode(Method* method, TRAPS);
// Figure out which new methods match old methods in name and signature,
// which methods have been added, and which are no longer present
void compute_added_deleted_matching_methods();
// Change jmethodIDs to point to the new methods
- void update_jmethod_ids();
+ void update_jmethod_ids(TRAPS);
// marking methods as old and/or obsolete
void check_methods_and_mark_as_obsolete();
@@ -141,6 +144,8 @@ class VM_EnhancedRedefineClasses: public VM_GC_Operation {
void flush_dependent_code(InstanceKlass* k_h, TRAPS);
+ u8 next_id();
+
static void check_class(InstanceKlass* k_oop, TRAPS);
static void dump_methods();
@@ -181,6 +186,7 @@ class VM_EnhancedRedefineClasses: public VM_GC_Operation {
bool allow_nested_vm_operations() const { return true; }
jvmtiError check_error() { return _res; }
+ u8 id() { return _id; }
// Modifiable test must be shared between IsModifiableClass query
// and redefine implementation
diff --git a/src/hotspot/share/prims/jvmtiEnv.cpp b/src/hotspot/share/prims/jvmtiEnv.cpp
index b6838ac034d..fba0f48abd7 100644
--- a/src/hotspot/share/prims/jvmtiEnv.cpp
+++ b/src/hotspot/share/prims/jvmtiEnv.cpp
@@ -456,20 +456,23 @@ JvmtiEnv::RetransformClasses(jint class_count, const jclass* classes) {
EventRetransformClasses event;
jvmtiError error;
+ u8 op_id;
if (AllowEnhancedClassRedefinition) {
MutexLocker sd_mutex(EnhancedRedefineClasses_lock);
VM_EnhancedRedefineClasses op(class_count, class_definitions, jvmti_class_load_kind_retransform);
VMThread::execute(&op);
+ op_id = op.id();
error = (op.check_error());
} else {
VM_RedefineClasses op(class_count, class_definitions, jvmti_class_load_kind_retransform);
VMThread::execute(&op);
+ op_id = op.id();
error = op.check_error();
}
if (error == JVMTI_ERROR_NONE) {
event.set_classCount(class_count);
- event.set_redefinitionId(op.id());
+ event.set_redefinitionId(op_id);
event.commit();
}
return error;
@@ -484,19 +487,23 @@ JvmtiEnv::RedefineClasses(jint class_count, const jvmtiClassDefinition* class_de
EventRedefineClasses event;
jvmtiError error;
+ u8 op_id;
+
if (AllowEnhancedClassRedefinition) {
MutexLocker sd_mutex(EnhancedRedefineClasses_lock);
VM_EnhancedRedefineClasses op(class_count, class_definitions, jvmti_class_load_kind_redefine);
VMThread::execute(&op);
+ op_id = op.id();
error = (op.check_error());
} else {
VM_RedefineClasses op(class_count, class_definitions, jvmti_class_load_kind_redefine);
VMThread::execute(&op);
+ op_id = op.id();
error = op.check_error();
}
if (error == JVMTI_ERROR_NONE) {
event.set_classCount(class_count);
- event.set_redefinitionId(op.id());
+ event.set_redefinitionId(op_id);
event.commit();
}
return error;
diff --git a/src/hotspot/share/prims/jvmtiRedefineClasses.cpp b/src/hotspot/share/prims/jvmtiRedefineClasses.cpp
index a7840848e10..346eac7c431 100644
--- a/src/hotspot/share/prims/jvmtiRedefineClasses.cpp
+++ b/src/hotspot/share/prims/jvmtiRedefineClasses.cpp
@@ -1271,6 +1271,7 @@ jvmtiError VM_RedefineClasses::load_new_class_versions(TRAPS) {
the_class_loader,
&st,
cl_info,
+ NULL,
THREAD);
// Clear class_being_redefined just to be sure.
state->clear_class_being_redefined();
diff --git a/src/hotspot/share/prims/methodHandles.hpp b/src/hotspot/share/prims/methodHandles.hpp
index 54f36202a5f..917d31efd77 100644
--- a/src/hotspot/share/prims/methodHandles.hpp
+++ b/src/hotspot/share/prims/methodHandles.hpp
@@ -180,6 +180,9 @@ public:
assert(ref_kind_is_valid(ref_kind), "");
return (ref_kind & 1) != 0;
}
+ static bool ref_kind_is_static(int ref_kind) {
+ return !ref_kind_has_receiver(ref_kind) && (ref_kind != JVM_REF_newInvokeSpecial);
+ }
static int ref_kind_to_flags(int ref_kind);
diff --git a/src/hotspot/share/runtime/arguments.cpp b/src/hotspot/share/runtime/arguments.cpp
index d05a2893498..3a92b8869dc 100644
--- a/src/hotspot/share/runtime/arguments.cpp
+++ b/src/hotspot/share/runtime/arguments.cpp
@@ -2128,13 +2128,15 @@ bool Arguments::check_gc_consistency() {
// of collectors.
uint i = 0;
if (UseSerialGC) i++;
- if (UseConcMarkSweepGC) i++;
- if (UseParallelGC || UseParallelOldGC) i++;
+ if (UseParallelGC) i++;
if (UseG1GC) i++;
+ if (UseEpsilonGC) i++;
+ if (UseZGC) i++;
+ if (UseShenandoahGC) i++;
if (AllowEnhancedClassRedefinition) {
// Must use serial GC. This limitation applies because the instance size changing GC modifications
// are only built into the mark and compact algorithm.
- if ((!UseSerialGC && !UseG1GC) && i >= 1) {
+ if (!UseSerialGC && !UseG1GC && i >= 1) {
jio_fprintf(defaultStream::error_stream(),
"Must use the Serial or G1 GC with enhanced class redefinition.\n");
return false;
@@ -4494,18 +4496,18 @@ void Arguments::setup_hotswap_agent() {
// TODO: open it only for org.hotswap.agent module
// Use to access java.lang.reflect.Proxy/proxyCache
- create_numbered_property("jdk.module.addopens", "java.base/java.lang=ALL-UNNAMED", addopens_count++);
+ create_numbered_module_property("jdk.module.addopens", "java.base/java.lang=ALL-UNNAMED", addopens_count++);
// Class of field java.lang.reflect.Proxy/proxyCache
- create_numbered_property("jdk.module.addopens", "java.base/jdk.internal.loader=ALL-UNNAMED", addopens_count++);
+ create_numbered_module_property("jdk.module.addopens", "java.base/jdk.internal.loader=ALL-UNNAMED", addopens_count++);
// Use to access java.io.Reader, java.io.InputStream, java.io.FileInputStream
- create_numbered_property("jdk.module.addopens", "java.base/java.io=ALL-UNNAMED", addopens_count++);
+ create_numbered_module_property("jdk.module.addopens", "java.base/java.io=ALL-UNNAMED", addopens_count++);
// java.beans.Introspector access
- create_numbered_property("jdk.module.addopens", "java.desktop/java.beans=ALL-UNNAMED", addopens_count++);
+ create_numbered_module_property("jdk.module.addopens", "java.desktop/java.beans=ALL-UNNAMED", addopens_count++);
// java.beans.Introspector access
- create_numbered_property("jdk.module.addopens", "java.desktop/com.sun.beans=ALL-UNNAMED", addopens_count++);
+ create_numbered_module_property("jdk.module.addopens", "java.desktop/com.sun.beans=ALL-UNNAMED", addopens_count++);
// com.sun.beans.introspect.ClassInfo access
- create_numbered_property("jdk.module.addopens", "java.desktop/com.sun.beans.introspect=ALL-UNNAMED", addopens_count++);
+ create_numbered_module_property("jdk.module.addopens", "java.desktop/com.sun.beans.introspect=ALL-UNNAMED", addopens_count++);
// com.sun.beans.introspect.util.Cache access
- create_numbered_property("jdk.module.addopens", "java.desktop/com.sun.beans.util=ALL-UNNAMED", addopens_count++);
+ create_numbered_module_property("jdk.module.addopens", "java.desktop/com.sun.beans.util=ALL-UNNAMED", addopens_count++);
}
diff --git a/src/hotspot/share/runtime/mutexLocker.cpp b/src/hotspot/share/runtime/mutexLocker.cpp
index 6f982072909..14a3ed730fe 100644
--- a/src/hotspot/share/runtime/mutexLocker.cpp
+++ b/src/hotspot/share/runtime/mutexLocker.cpp
@@ -287,7 +287,7 @@ void mutex_init() {
def(InitCompleted_lock , PaddedMonitor, leaf, true, _safepoint_check_never);
def(VtableStubs_lock , PaddedMutex , nonleaf, true, _safepoint_check_never);
def(Notify_lock , PaddedMonitor, nonleaf, true, _safepoint_check_always);
- def(EnhancedRedefineClasses_lock , PaddedMutex , nonleaf+7, false, Monitor::_safepoint_check_always); // for ensuring that class redefinition is not done in parallel
+ def(EnhancedRedefineClasses_lock , PaddedMutex , nonleaf+7, false, _safepoint_check_always); // for ensuring that class redefinition is not done in parallel
def(JNICritical_lock , PaddedMonitor, nonleaf, true, _safepoint_check_always); // used for JNI critical regions
def(AdapterHandlerLibrary_lock , PaddedMutex , nonleaf, true, _safepoint_check_always);
--
2.23.0
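
Buried in the JDK 15 API renames above is one new piece of logic: each redefinition operation now gets a unique id, allocated with a CAS loop on _id_counter and later attached to the JFR RedefineClasses/RetransformClasses events via set_redefinitionId(). A stand-alone equivalent using std::atomic (HotSpot itself uses Atomic::cmpxchg on a u8 counter):

#include <atomic>
#include <cstdint>

static std::atomic<uint64_t> id_counter{0};

// Equivalent of VM_EnhancedRedefineClasses::next_id(): hand out strictly increasing
// ids, retrying until our compare-and-swap wins the race.
uint64_t next_id() {
  for (;;) {
    uint64_t id = id_counter.load();
    if (id_counter.compare_exchange_strong(id, id + 1)) {
      return id + 1;  // like the patch, the incremented value is returned
    }
  }
}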


@@ -1,25 +0,0 @@
From 336cab4f72c6e642e3077ea8d1a4860de33f5a4d Mon Sep 17 00:00:00 2001
From: Vladimir Dvorak <vladimir.dvorak@jetbrains.com>
Date: Tue, 17 Nov 2020 17:40:24 +0100
Subject: [PATCH 20/34] dcevm15 - G1 fixes
---
src/hotspot/share/gc/g1/g1FullGCPrepareTask.cpp | 2 +-
1 file changed, 1 insertion(+), 1 deletion(-)
diff --git a/src/hotspot/share/gc/g1/g1FullGCPrepareTask.cpp b/src/hotspot/share/gc/g1/g1FullGCPrepareTask.cpp
index 2f06b9617e4..476728a5d26 100644
--- a/src/hotspot/share/gc/g1/g1FullGCPrepareTask.cpp
+++ b/src/hotspot/share/gc/g1/g1FullGCPrepareTask.cpp
@@ -240,7 +240,7 @@ void G1FullGCPrepareTask::prepare_serial_compaction_dcevm() {
// collect remaining, not forwarded rescued oops using serial compact point
while (cp->last_rescued_oop() < cp->rescued_oops()->length()) {
- HeapRegion* hr = G1CollectedHeap::heap()->new_region(HeapRegion::GrainBytes / HeapWordSize, false, true);
+ HeapRegion* hr = G1CollectedHeap::heap()->new_region(HeapRegion::GrainBytes / HeapWordSize, HeapRegionType::Eden, true, G1NUMA::AnyNodeIndex);
if (hr == NULL) {
vm_exit_out_of_memory(0, OOM_MMAP_ERROR, "G1 - not enough of free regions after redefinition.");
}
--
2.23.0


@@ -1,133 +0,0 @@
From cea4e2cca3c37233c728be7235f8f9d8be136cb5 Mon Sep 17 00:00:00 2001
From: Vladimir Dvorak <vladimir.dvorak@jetbrains.com>
Date: Tue, 17 Nov 2020 18:52:57 +0100
Subject: [PATCH 21/34] dcevm15 - Fix flush dependent code
---
.../prims/jvmtiEnhancedRedefineClasses.cpp | 57 +++++++------------
.../prims/jvmtiEnhancedRedefineClasses.hpp | 4 +-
2 files changed, 25 insertions(+), 36 deletions(-)
diff --git a/src/hotspot/share/prims/jvmtiEnhancedRedefineClasses.cpp b/src/hotspot/share/prims/jvmtiEnhancedRedefineClasses.cpp
index 619e3988e3a..efaf11e1666 100644
--- a/src/hotspot/share/prims/jvmtiEnhancedRedefineClasses.cpp
+++ b/src/hotspot/share/prims/jvmtiEnhancedRedefineClasses.cpp
@@ -508,7 +508,7 @@ void VM_EnhancedRedefineClasses::doit() {
// Deoptimize all compiled code that depends on this class (do only once, because it clears whole cache)
// if (_max_redefinition_flags > Klass::ModifyClass) {
- flush_dependent_code(NULL, thread);
+ flush_dependent_code(thread);
// }
// Adjust constantpool caches for all classes that reference methods of the evolved class.
@@ -647,17 +647,8 @@ void VM_EnhancedRedefineClasses::doit() {
//MethodDataCleaner clean_weak_method_links;
//ClassLoaderDataGraph::classes_do(&clean_weak_method_links);
- // Disable any dependent concurrent compilations
- // SystemDictionary::notice_modification();
-
JvmtiExport::increment_redefinition_count();
- // Set flag indicating that some invariants are no longer true.
- // See jvmtiExport.hpp for detailed explanation.
-
- // dcevm15: handled by _redefinition_count
- // JvmtiExport::set_has_redefined_a_class();
-
#ifdef PRODUCT
if (log_is_enabled(Trace, redefine, class, obsolete, metadata)) {
#endif
@@ -1746,6 +1737,18 @@ void VM_EnhancedRedefineClasses::transfer_old_native_function_registrations(Inst
transfer.transfer_registrations(_matching_old_methods, _matching_methods_length);
}
+// First step is to walk the code cache for each class redefined and mark
+// dependent methods. Wait until all classes are processed to deoptimize everything.
+void VM_EnhancedRedefineClasses::mark_dependent_code(InstanceKlass* ik) {
+ assert_locked_or_safepoint(Compile_lock);
+
+ // All dependencies have been recorded from startup or this is a second or
+ // subsequent use of RedefineClasses
+ if (0 && JvmtiExport::all_dependencies_are_recorded()) {
+ CodeCache::mark_for_evol_deoptimization(ik);
+ }
+}
+
// DCEVM - it always deoptimizes everything! (because it is very difficult to find only correct dependencies)
// Deoptimize all compiled code that depends on this class.
//
@@ -1762,33 +1765,21 @@ void VM_EnhancedRedefineClasses::transfer_old_native_function_registrations(Inst
// subsequent calls to RedefineClasses need only throw away code
// that depends on the class.
//
-void VM_EnhancedRedefineClasses::flush_dependent_code(InstanceKlass* k_h, TRAPS) {
+void VM_EnhancedRedefineClasses::flush_dependent_code(TRAPS) {
assert_locked_or_safepoint(Compile_lock);
// All dependencies have been recorded from startup or this is a second or
// subsequent use of RedefineClasses
// FIXME: for now, deoptimize all!
- if (0 && k_h != NULL && JvmtiExport::all_dependencies_are_recorded()) {
- CodeCache::flush_evol_dependents_on(k_h);
- Klass* superCl = k_h->super();
- // Deoptimize super classes since redefined class can has a new method override
- while (superCl != NULL && !superCl->is_redefining()) {
- CodeCache::flush_evol_dependents_on(InstanceKlass::cast(superCl));
- superCl = superCl->super();
+ if (0 && JvmtiExport::all_dependencies_are_recorded()) {
+ int deopt = CodeCache::mark_dependents_for_evol_deoptimization();
+ log_debug(redefine, class, nmethod)("Marked %d dependent nmethods for deopt", deopt);
+ if (deopt != 0) {
+ CodeCache::flush_evol_dependents();
}
} else {
- CodeCache::mark_all_nmethods_for_deoptimization();
-
- ResourceMark rm(THREAD);
- DeoptimizationMarker dm;
-
- // Deoptimize all activations depending on marked nmethods
- Deoptimization::deoptimize_dependents();
-
- // Make the dependent methods not entrant
- CodeCache::make_marked_nmethods_not_entrant();
-
- // From now on we know that the dependency information is complete
+ CodeCache::mark_all_nmethods_for_evol_deoptimization();
+ CodeCache::flush_evol_dependents();
JvmtiExport::set_all_dependencies_are_recorded(true);
}
}
@@ -1881,11 +1872,7 @@ void VM_EnhancedRedefineClasses::redefine_single_class(InstanceKlass* new_class_
JvmtiBreakpoints& jvmti_breakpoints = JvmtiCurrentBreakpoints::get_jvmti_breakpoints();
jvmti_breakpoints.clearall_in_class_at_safepoint(the_class);
- // DCEVM Deoptimization is always for whole java world, call only once after all classes are redefined
- // Deoptimize all compiled code that depends on this class
-// if (_max_redefinition_flags <= Klass::ModifyClass) {
-// flush_dependent_code(the_class, THREAD);
-// }
+ mark_dependent_code(the_class);
_old_methods = the_class->methods();
_new_methods = new_class->methods();
diff --git a/src/hotspot/share/prims/jvmtiEnhancedRedefineClasses.hpp b/src/hotspot/share/prims/jvmtiEnhancedRedefineClasses.hpp
index 0066088b3b0..bd5e7d153be 100644
--- a/src/hotspot/share/prims/jvmtiEnhancedRedefineClasses.hpp
+++ b/src/hotspot/share/prims/jvmtiEnhancedRedefineClasses.hpp
@@ -142,7 +142,9 @@ class VM_EnhancedRedefineClasses: public VM_GC_Operation {
// and in all direct and indirect subclasses.
void increment_class_counter(InstanceKlass *ik, TRAPS);
- void flush_dependent_code(InstanceKlass* k_h, TRAPS);
+ void mark_dependent_code(InstanceKlass* ik);
+
+ void flush_dependent_code(TRAPS);
u8 next_id();
--
2.23.0
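
With precise dependency tracking still disabled (note the "if (0 && ...)" guards above), every enhanced redefinition now takes the blanket path: mark the whole code cache for evolution deoptimization, flush it, and record that dependencies are complete. A self-contained sketch of that flow (the namespaced functions below are stubs standing in for the HotSpot calls shown in the hunk):

#include <cstdio>

// Stubs that stand in for the real HotSpot/DCEVM entry points used in the hunk above.
namespace CodeCache {
  inline void mark_all_nmethods_for_evol_deoptimization() { std::puts("mark every nmethod"); }
  inline void flush_evol_dependents()                     { std::puts("deoptimize and make not entrant"); }
}
namespace JvmtiExport {
  inline void set_all_dependencies_are_recorded(bool)     {}
}

// DCEVM deliberately deoptimizes everything once per redefinition batch instead of
// computing the exact set of dependent nmethods, which it considers too error-prone.
void flush_dependent_code_sketch() {
  CodeCache::mark_all_nmethods_for_evol_deoptimization();
  CodeCache::flush_evol_dependents();
  JvmtiExport::set_all_dependencies_are_recorded(true);
}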


@@ -1,211 +0,0 @@
From 4f88dcec830d39452f69d1117729469fdb768a8f Mon Sep 17 00:00:00 2001
From: Vladimir Dvorak <vladimir.dvorak@jetbrains.com>
Date: Sun, 22 Nov 2020 12:05:26 +0100
Subject: [PATCH 22/34] dcevm15 - fix ResolvedMethodTable
---
src/hotspot/share/classfile/javaClasses.cpp | 5 -
src/hotspot/share/classfile/javaClasses.hpp | 1 -
.../share/prims/resolvedMethodTable.cpp | 139 +++++++++++-------
3 files changed, 84 insertions(+), 61 deletions(-)
diff --git a/src/hotspot/share/classfile/javaClasses.cpp b/src/hotspot/share/classfile/javaClasses.cpp
index 9b086a241f7..9a627786d0f 100644
--- a/src/hotspot/share/classfile/javaClasses.cpp
+++ b/src/hotspot/share/classfile/javaClasses.cpp
@@ -3996,11 +3996,6 @@ void java_lang_invoke_ResolvedMethodName::set_vmholder(oop resolved_method, oop
resolved_method->obj_field_put(_vmholder_offset, holder);
}
-void java_lang_invoke_ResolvedMethodName::set_vmholder_offset(oop resolved_method, Method* m) {
- assert(is_instance(resolved_method), "wrong type");
- resolved_method->obj_field_put(_vmholder_offset, m->method_holder()->java_mirror());
-}
-
oop java_lang_invoke_ResolvedMethodName::find_resolved_method(const methodHandle& m, TRAPS) {
const Method* method = m();
diff --git a/src/hotspot/share/classfile/javaClasses.hpp b/src/hotspot/share/classfile/javaClasses.hpp
index 9abf2e1d105..8f5993b7225 100644
--- a/src/hotspot/share/classfile/javaClasses.hpp
+++ b/src/hotspot/share/classfile/javaClasses.hpp
@@ -1107,7 +1107,6 @@ class java_lang_invoke_ResolvedMethodName : AllStatic {
static Method* vmtarget(oop resolved_method);
static void set_vmtarget(oop resolved_method, Method* method);
- static void set_vmholder_offset(oop resolved_method, Method* method);
static void set_vmholder(oop resolved_method, oop holder);
diff --git a/src/hotspot/share/prims/resolvedMethodTable.cpp b/src/hotspot/share/prims/resolvedMethodTable.cpp
index eb9fcda44f3..d0f1667b967 100644
--- a/src/hotspot/share/prims/resolvedMethodTable.cpp
+++ b/src/hotspot/share/prims/resolvedMethodTable.cpp
@@ -375,6 +375,67 @@ public:
}
};
+class AdjustMethodEntriesDcevm : public StackObj {
+ bool* _trace_name_printed;
+ GrowableArray<oop>* _oops_to_add;
+public:
+ AdjustMethodEntriesDcevm(GrowableArray<oop>* oops_to_add, bool* trace_name_printed) : _trace_name_printed(trace_name_printed), _oops_to_add(oops_to_add) {};
+ bool operator()(WeakHandle<vm_resolved_method_table_data>* entry) {
+ oop mem_name = entry->peek();
+ if (mem_name == NULL) {
+ // Removed
+ return true;
+ }
+
+ Method* old_method = (Method*)java_lang_invoke_ResolvedMethodName::vmtarget(mem_name);
+
+ if (old_method->is_old()) {
+
+ InstanceKlass* newer_klass = InstanceKlass::cast(old_method->method_holder()->new_version());
+ Method* newer_method;
+
+ // Method* new_method;
+ if (old_method->is_deleted()) {
+ newer_method = Universe::throw_no_such_method_error();
+ } else {
+ newer_method = newer_klass->method_with_idnum(old_method->orig_method_idnum());
+
+ log_debug(redefine, class, update)("Adjusting method: '%s' of new class %s", newer_method->name_and_sig_as_C_string(), newer_klass->name()->as_C_string());
+
+ assert(newer_klass == newer_method->method_holder(), "call after swapping redefined guts");
+ assert(newer_method != NULL, "method_with_idnum() should not be NULL");
+ assert(old_method != newer_method, "sanity check");
+
+ Thread* thread = Thread::current();
+ ResolvedMethodTableLookup lookup(thread, method_hash(newer_method), newer_method);
+ ResolvedMethodGet rmg(thread, newer_method);
+
+ if (_local_table->get(thread, lookup, rmg)) {
+ // old method was already adjusted if new method exists in _the_table
+ return true;
+ }
+ }
+
+ java_lang_invoke_ResolvedMethodName::set_vmtarget(mem_name, newer_method);
+ java_lang_invoke_ResolvedMethodName::set_vmholder(mem_name, newer_method->method_holder()->java_mirror());
+
+ newer_klass->set_has_resolved_methods();
+ _oops_to_add->append(mem_name);
+
+ ResourceMark rm;
+ if (!(*_trace_name_printed)) {
+ log_debug(redefine, class, update)("adjust: name=%s", old_method->method_holder()->external_name());
+ *_trace_name_printed = true;
+ }
+ log_debug(redefine, class, update, constantpool)
+ ("ResolvedMethod method update: %s(%s)",
+ newer_method->name()->as_C_string(), newer_method->signature()->as_C_string());
+ }
+
+ return true;
+ }
+};
+
// It is called at safepoint only for RedefineClasses
void ResolvedMethodTable::adjust_method_entries(bool * trace_name_printed) {
assert(SafepointSynchronize::is_at_safepoint(), "only called at safepoint");
@@ -382,73 +443,41 @@ void ResolvedMethodTable::adjust_method_entries(bool * trace_name_printed) {
AdjustMethodEntries adjust(trace_name_printed);
_local_table->do_safepoint_scan(adjust);
}
-#endif // INCLUDE_JVMTI
-// (DCEVM) It is called at safepoint only for RedefineClasses
+// It is called at safepoint only for RedefineClasses
void ResolvedMethodTable::adjust_method_entries_dcevm(bool * trace_name_printed) {
assert(SafepointSynchronize::is_at_safepoint(), "only called at safepoint");
// For each entry in RMT, change to new method
- GrowableArray<oop>* oops_to_add = new GrowableArray<oop>();
-
- for (int i = 0; i < _the_table->table_size(); ++i) {
- for (ResolvedMethodEntry* entry = _the_table->bucket(i);
- entry != NULL;
- entry = entry->next()) {
-
- oop mem_name = entry->object_no_keepalive();
- // except ones removed
- if (mem_name == NULL) {
- continue;
- }
- Method* old_method = (Method*)java_lang_invoke_ResolvedMethodName::vmtarget(mem_name);
-
- if (old_method->is_old()) {
-
- InstanceKlass* newer_klass = InstanceKlass::cast(old_method->method_holder()->new_version());
- Method* newer_method;
-
- // Method* new_method;
- if (old_method->is_deleted()) {
- newer_method = Universe::throw_no_such_method_error();
- } else {
- newer_method = newer_klass->method_with_idnum(old_method->orig_method_idnum());
-
- log_debug(redefine, class, update)("Adjusting method: '%s' of new class %s", newer_method->name_and_sig_as_C_string(), newer_klass->name()->as_C_string());
-
- assert(newer_klass == newer_method->method_holder(), "call after swapping redefined guts");
- assert(newer_method != NULL, "method_with_idnum() should not be NULL");
- assert(old_method != newer_method, "sanity check");
-
- if (_the_table->lookup(newer_method) != NULL) {
- // old method was already adjusted if new method exists in _the_table
- continue;
- }
- }
+ GrowableArray<oop> oops_to_add(0);
+ AdjustMethodEntriesDcevm adjust(&oops_to_add, trace_name_printed);
+ _local_table->do_safepoint_scan(adjust);
+ Thread* thread = Thread::current();
+ for (int i = 0; i < oops_to_add.length(); i++) {
+ oop mem_name = oops_to_add.at(i);
+ Method* method = (Method*)java_lang_invoke_ResolvedMethodName::vmtarget(mem_name);
- java_lang_invoke_ResolvedMethodName::set_vmtarget(mem_name, newer_method);
- java_lang_invoke_ResolvedMethodName::set_vmholder_offset(mem_name, newer_method);
+ // The hash table takes ownership of the WeakHandle, even if it's not inserted.
- newer_klass->set_has_resolved_methods();
- oops_to_add->append(mem_name);
+ ResolvedMethodTableLookup lookup(thread, method_hash(method), method);
+ ResolvedMethodGet rmg(thread, method);
- ResourceMark rm;
- if (!(*trace_name_printed)) {
- log_debug(redefine, class, update)("adjust: name=%s", old_method->method_holder()->external_name());
- *trace_name_printed = true;
- }
- log_debug(redefine, class, update, constantpool)
- ("ResolvedMethod method update: %s(%s)",
- newer_method->name()->as_C_string(), newer_method->signature()->as_C_string());
+ while (true) {
+ if (_local_table->get(thread, lookup, rmg)) {
+ break;
+ }
+ WeakHandle<vm_resolved_method_table_data> wh = WeakHandle<vm_resolved_method_table_data>::create(Handle(thread, mem_name));
+ // The hash table takes ownership of the WeakHandle, even if it's not inserted.
+ if (_local_table->insert(thread, lookup, wh)) {
+ log_insert(method);
+ wh.resolve();
+ break;
}
- }
- for (int i = 0; i < oops_to_add->length(); i++) {
- oop mem_name = oops_to_add->at(i);
- Method* method = (Method*)java_lang_invoke_ResolvedMethodName::vmtarget(mem_name);
- _the_table->basic_add(method, Handle(Thread::current(), mem_name));
}
}
}
+#endif // INCLUDE_JVMTI
+
// Verification
class VerifyResolvedMethod : StackObj {
public:
--
2.23.0

View File

@@ -1,88 +0,0 @@
From 5379e56465d3d3930ec7ea91b1c64db2cdf70170 Mon Sep 17 00:00:00 2001
From: Vladimir Dvorak <vladimir.dvorak@jetbrains.com>
Date: Sun, 22 Nov 2020 12:05:50 +0100
Subject: [PATCH 23/34] dcevm15 - fix Universe::root_oops_do
---
src/hotspot/share/memory/universe.cpp | 38 +++++++++------------------
1 file changed, 12 insertions(+), 26 deletions(-)
diff --git a/src/hotspot/share/memory/universe.cpp b/src/hotspot/share/memory/universe.cpp
index f6e4253b5a5..8dad437bd51 100644
--- a/src/hotspot/share/memory/universe.cpp
+++ b/src/hotspot/share/memory/universe.cpp
@@ -39,6 +39,7 @@
#include "gc/shared/gcConfig.hpp"
#include "gc/shared/gcLogPrecious.hpp"
#include "gc/shared/gcTraceTime.inline.hpp"
+#include "gc/shared/weakProcessor.hpp"
#include "interpreter/interpreter.hpp"
#include "logging/log.hpp"
#include "logging/logStream.hpp"
@@ -75,6 +76,7 @@
#include "runtime/thread.inline.hpp"
#include "runtime/timerTrace.hpp"
#include "runtime/vmOperations.hpp"
+#include "services/management.hpp"
#include "services/memoryService.hpp"
#include "utilities/align.hpp"
#include "utilities/copy.hpp"
@@ -180,45 +182,29 @@ void Universe::basic_type_classes_do(KlassClosure *closure) {
// FIXME: (DCEVM) This method should iterate all pointers that are not within heap objects.
void Universe::root_oops_do(OopClosure *oopClosure) {
-
- class AlwaysTrueClosure: public BoolObjectClosure {
- public:
- void do_object(oop p) { ShouldNotReachHere(); }
- bool do_object_b(oop p) { return true; }
- };
- AlwaysTrueClosure always_true;
-
Universe::oops_do(oopClosure);
// ReferenceProcessor::oops_do(oopClosure); (tw) check why no longer there
JNIHandles::oops_do(oopClosure); // Global (strong) JNI handles
Threads::oops_do(oopClosure, NULL);
ObjectSynchronizer::oops_do(oopClosure);
- // TODO: review, flat profiler was removed in j10
- // FlatProfiler::oops_do(oopClosure);
- JvmtiExport::oops_do(oopClosure);
+ // (DCEVM) TODO: Check if this is correct?
+ Management::oops_do(oopClosure);
+ OopStorageSet::vm_global()->oops_do(oopClosure);
+ CLDToOopClosure cld_closure(oopClosure, ClassLoaderData::_claim_none);
+ ClassLoaderDataGraph::cld_do(&cld_closure);
// Now adjust pointers in remaining weak roots. (All of which should
// have been cleared if they pointed to non-surviving objects.)
// Global (weak) JNI handles
- JNIHandles::weak_oops_do(&always_true, oopClosure);
+ WeakProcessor::oops_do(oopClosure);
CodeBlobToOopClosure blobClosure(oopClosure, CodeBlobToOopClosure::FixRelocations);
CodeCache::blobs_do(&blobClosure);
- StringTable::oops_do(oopClosure);
+ AOT_ONLY(AOTLoader::oops_do(oopClosure);)
+ // StringTable::oops_do was removed in j15
+ // StringTable::oops_do(oopClosure);
- // (DCEVM) TODO: Check if this is correct?
- //CodeCache::scavenge_root_nmethods_oops_do(oopClosure);
- //Management::oops_do(oopClosure);
- //ref_processor()->weak_oops_do(&oopClosure);
- //PSScavenge::reference_processor()->weak_oops_do(&oopClosure);
-
-#if INCLUDE_AOT
- if (UseAOT) {
- AOTLoader::oops_do(oopClosure);
- }
-#endif
- // SO_AllClasses
- SystemDictionary::oops_do(oopClosure);
+ // PSScavenge::reference_processor()->weak_oops_do(oopClosure);
}
void Universe::oops_do(OopClosure* f) {
--
2.23.0

View File

@@ -1,67 +0,0 @@
From c6ea68e66d37d70739f7b0ee74131322b4526a68 Mon Sep 17 00:00:00 2001
From: Vladimir Dvorak <vladimir.dvorak@jetbrains.com>
Date: Sun, 22 Nov 2020 12:03:32 +0100
Subject: [PATCH 24/34] Cleanup dcevm comments
---
src/hotspot/share/classfile/classLoaderDataGraph.hpp | 2 +-
src/hotspot/share/classfile/systemDictionary.hpp | 2 +-
src/hotspot/share/gc/shared/gcConfig.cpp | 2 +-
src/hotspot/share/prims/jvmtiEnhancedRedefineClasses.hpp | 2 +-
4 files changed, 4 insertions(+), 4 deletions(-)
diff --git a/src/hotspot/share/classfile/classLoaderDataGraph.hpp b/src/hotspot/share/classfile/classLoaderDataGraph.hpp
index f380aa3fa34..8ce94cccb47 100644
--- a/src/hotspot/share/classfile/classLoaderDataGraph.hpp
+++ b/src/hotspot/share/classfile/classLoaderDataGraph.hpp
@@ -104,7 +104,7 @@ class ClassLoaderDataGraph : public AllStatic {
static void dictionary_classes_do(KlassClosure* klass_closure);
- // Enhanced class redefinition
+ // (DCEVM) Enhanced class redefinition
static void rollback_redefinition();
// VM_CounterDecay iteration support
diff --git a/src/hotspot/share/classfile/systemDictionary.hpp b/src/hotspot/share/classfile/systemDictionary.hpp
index 931e655d631..1019dbd0d04 100644
--- a/src/hotspot/share/classfile/systemDictionary.hpp
+++ b/src/hotspot/share/classfile/systemDictionary.hpp
@@ -455,7 +455,7 @@ public:
static bool is_well_known_klass(Symbol* class_name);
#endif
- // Enhanced class redefinition
+ // (DCEVM) Enhanced class redefinition
static void remove_from_hierarchy(InstanceKlass* k);
static void update_constraints_after_redefinition();
diff --git a/src/hotspot/share/gc/shared/gcConfig.cpp b/src/hotspot/share/gc/shared/gcConfig.cpp
index 5c1a09390f1..23fbf715378 100644
--- a/src/hotspot/share/gc/shared/gcConfig.cpp
+++ b/src/hotspot/share/gc/shared/gcConfig.cpp
@@ -99,7 +99,7 @@ void GCConfig::fail_if_non_included_gc_is_selected() {
void GCConfig::select_gc_ergonomically() {
if (AllowEnhancedClassRedefinition && !UseG1GC) {
- // Enhanced class redefinition only supports serial GC at the moment
+ // (DCEVM) Enhanced class redefinition only supports serial GC at the moment
FLAG_SET_ERGO(UseSerialGC, true);
} else if (os::is_server_class_machine()) {
#if INCLUDE_G1GC
diff --git a/src/hotspot/share/prims/jvmtiEnhancedRedefineClasses.hpp b/src/hotspot/share/prims/jvmtiEnhancedRedefineClasses.hpp
index bd5e7d153be..5de375fb888 100644
--- a/src/hotspot/share/prims/jvmtiEnhancedRedefineClasses.hpp
+++ b/src/hotspot/share/prims/jvmtiEnhancedRedefineClasses.hpp
@@ -78,7 +78,7 @@ class VM_EnhancedRedefineClasses: public VM_GC_Operation {
// have any entries.
bool _any_class_has_resolved_methods;
- // Enhanced class redefinition, affected klasses contain all classes which should be redefined
+ // (DCEVM) Enhanced class redefinition, affected klasses contain all classes which should be redefined
// either because of redefine, class hierarchy or interface change
GrowableArray<Klass*>* _affected_klasses;
--
2.23.0

View File

@@ -1,43 +0,0 @@
From 507d97966c7145d0ae2533459cc504c7b0d6d5b6 Mon Sep 17 00:00:00 2001
From: Vladimir Dvorak <vladimir.dvorak@jetbrains.com>
Date: Sun, 22 Nov 2020 18:49:05 +0100
Subject: [PATCH 25/34] Fix cpCache in not AllowEnhancedClassRedefinition mode
---
src/hotspot/share/oops/cpCache.hpp | 8 +++++---
1 file changed, 5 insertions(+), 3 deletions(-)
diff --git a/src/hotspot/share/oops/cpCache.hpp b/src/hotspot/share/oops/cpCache.hpp
index 121a13b1dda..64dcf6223f5 100644
--- a/src/hotspot/share/oops/cpCache.hpp
+++ b/src/hotspot/share/oops/cpCache.hpp
@@ -148,13 +148,13 @@ class ConstantPoolCacheEntry {
void set_bytecode_2(Bytecodes::Code code);
void set_f1(Metadata* f1) {
Metadata* existing_f1 = _f1; // read once
- //assert(existing_f1 == NULL || existing_f1 == f1, "illegal field change");
+ assert(AllowEnhancedClassRedefinition || existing_f1 == NULL || existing_f1 == f1, "illegal field change");
_f1 = f1;
}
void release_set_f1(Metadata* f1);
void set_f2(intx f2) {
intx existing_f2 = _f2; // read once
- //assert(existing_f2 == 0 || existing_f2 == f2, "illegal field change");
+ assert(AllowEnhancedClassRedefinition || existing_f2 == 0 || existing_f2 == f2, "illegal field change");
_f2 = f2;
}
void set_f2_as_vfinal_method(Method* f2) {
@@ -215,7 +215,9 @@ class ConstantPoolCacheEntry {
void initialize_resolved_reference_index(int ref_index) {
assert(_f2 == 0, "set once"); // note: ref_index might be zero also
_f2 = ref_index;
- _flags = 1 << is_resolved_ref_shift;
+ if (AllowEnhancedClassRedefinition) {
+ _flags = 1 << is_resolved_ref_shift;
+ }
}
void set_field( // sets entry to resolved field state
--
2.23.0

View File

@@ -1,32 +0,0 @@
From b516b615c20fafa2094dfb9f4cb08245b26418d0 Mon Sep 17 00:00:00 2001
From: Vladimir Dvorak <vladimir.dvorak@jetbrains.com>
Date: Sun, 22 Nov 2020 19:51:46 +0100
Subject: [PATCH 26/34] dcevm15 - add ClassLoaderDataGraph_lock on
ClassLoaderDataGraph::classes_do
ClassLoaderDataGraph::classes_do need safepoint or lock,
find_sorted_affected_classes is not in safepoint therefore it must be
locked
---
src/hotspot/share/prims/jvmtiEnhancedRedefineClasses.cpp | 5 ++++-
1 file changed, 4 insertions(+), 1 deletion(-)
diff --git a/src/hotspot/share/prims/jvmtiEnhancedRedefineClasses.cpp b/src/hotspot/share/prims/jvmtiEnhancedRedefineClasses.cpp
index efaf11e1666..197e1c0029f 100644
--- a/src/hotspot/share/prims/jvmtiEnhancedRedefineClasses.cpp
+++ b/src/hotspot/share/prims/jvmtiEnhancedRedefineClasses.cpp
@@ -2063,7 +2063,10 @@ jvmtiError VM_EnhancedRedefineClasses::find_sorted_affected_classes(TRAPS) {
AffectedKlassClosure closure(_affected_klasses);
// Updated in j10, from original SystemDictionary::classes_do
- ClassLoaderDataGraph::classes_do(&closure);
+ {
+ MutexLocker mcld(ClassLoaderDataGraph_lock);
+ ClassLoaderDataGraph::classes_do(&closure);
+ }
//ClassLoaderDataGraph::dictionary_classes_do(&closure);
log_trace(redefine, class, load)("%d classes affected", _affected_klasses->length());
--
2.23.0

View File

@@ -1,29 +0,0 @@
From c6498946006879314bdc6218ee72da5d9c88f237 Mon Sep 17 00:00:00 2001
From: Vladimir Dvorak <vladimir.dvorak@jetbrains.com>
Date: Sat, 28 Nov 2020 19:29:42 +0100
Subject: [PATCH 27/34] dcevm15 - check if has_nestmate_access_to has newest
host class
---
src/hotspot/share/oops/instanceKlass.cpp | 5 +++++
1 file changed, 5 insertions(+)
diff --git a/src/hotspot/share/oops/instanceKlass.cpp b/src/hotspot/share/oops/instanceKlass.cpp
index 5e40d78a87e..1d9623f2446 100644
--- a/src/hotspot/share/oops/instanceKlass.cpp
+++ b/src/hotspot/share/oops/instanceKlass.cpp
@@ -445,6 +445,11 @@ bool InstanceKlass::has_nestmate_access_to(InstanceKlass* k, TRAPS) {
return false;
}
+ if (AllowEnhancedClassRedefinition) {
+ // TODO: (DCEVM) check if it correct. It fix problems with lambdas (hidden)
+ cur_host = InstanceKlass::cast(cur_host->newest_version());
+ }
+
Klass* k_nest_host = k->nest_host(CHECK_false);
if (k_nest_host == NULL) {
return false;
--
2.23.0

View File

@@ -1,24 +0,0 @@
From 86c27155386c1c40642c99c63a242d1f5d8601a5 Mon Sep 17 00:00:00 2001
From: Vladimir Dvorak <vladimir.dvorak@jetbrains.com>
Date: Sat, 28 Nov 2020 19:31:08 +0100
Subject: [PATCH 28/34] Remove unused fieldType
---
src/hotspot/share/classfile/vmSymbols.hpp | 1 -
1 file changed, 1 deletion(-)
diff --git a/src/hotspot/share/classfile/vmSymbols.hpp b/src/hotspot/share/classfile/vmSymbols.hpp
index 6a3b234b222..eb06684a288 100644
--- a/src/hotspot/share/classfile/vmSymbols.hpp
+++ b/src/hotspot/share/classfile/vmSymbols.hpp
@@ -465,7 +465,6 @@
template(static_offset_name, "staticOffset") \
template(static_base_name, "staticBase") \
template(field_offset_name, "fieldOffset") \
- template(field_type_name, "fieldType") \
\
/* name symbols needed by intrinsics */ \
\
--
2.23.0

View File

@@ -1,54 +0,0 @@
From 025d0d2903963fb79f83cf0d90418783d3ef6813 Mon Sep 17 00:00:00 2001
From: Vladimir Dvorak <vladimir.dvorak@jetbrains.com>
Date: Sun, 29 Nov 2020 17:18:16 +0100
Subject: [PATCH 29/34] mark_as_scavengable only alive methods
---
.../share/prims/jvmtiEnhancedRedefineClasses.cpp | 14 ++++++++------
1 file changed, 8 insertions(+), 6 deletions(-)
diff --git a/src/hotspot/share/prims/jvmtiEnhancedRedefineClasses.cpp b/src/hotspot/share/prims/jvmtiEnhancedRedefineClasses.cpp
index 197e1c0029f..e00fac1f693 100644
--- a/src/hotspot/share/prims/jvmtiEnhancedRedefineClasses.cpp
+++ b/src/hotspot/share/prims/jvmtiEnhancedRedefineClasses.cpp
@@ -223,19 +223,21 @@ class FieldCopier : public FieldClosure {
// TODO: review...
void VM_EnhancedRedefineClasses::mark_as_scavengable(nmethod* nm) {
- ScavengableNMethods::register_nmethod(nm);
+ if (nm->is_alive()) {
+ ScavengableNMethods::register_nmethod(nm);
+ }
}
void VM_EnhancedRedefineClasses::unregister_nmethod_g1(nmethod* nm) {
// It should work not only for G1 but also for another GCs, but this way is safer now
- if (!nm->is_zombie() && !nm->is_unloaded()) {
+ if (nm->is_alive()) {
Universe::heap()->unregister_nmethod(nm);
}
}
void VM_EnhancedRedefineClasses::register_nmethod_g1(nmethod* nm) {
// It should work not only for G1 but also for another GCs, but this way is safer now
- if (!nm->is_zombie() && !nm->is_unloaded()) {
+ if (nm->is_alive()) {
Universe::heap()->register_nmethod(nm);
}
}
@@ -511,9 +513,9 @@ void VM_EnhancedRedefineClasses::doit() {
flush_dependent_code(thread);
// }
- // Adjust constantpool caches for all classes that reference methods of the evolved class.
- ClearCpoolCacheAndUnpatch clear_cpool_cache(thread);
- ClassLoaderDataGraph::classes_do(&clear_cpool_cache);
+ // Adjust constantpool caches for all classes that reference methods of the evolved class.
+ ClearCpoolCacheAndUnpatch clear_cpool_cache(thread);
+ ClassLoaderDataGraph::classes_do(&clear_cpool_cache);
// JSR-292 support
if (_any_class_has_resolved_methods) {
--
2.23.0

View File

@@ -1,28 +0,0 @@
From 27aabfefe7d799545049bb81ba19d4ed2ff6379c Mon Sep 17 00:00:00 2001
From: Vladimir Dvorak <vladimir.dvorak@jetbrains.com>
Date: Sun, 29 Nov 2020 17:20:11 +0100
Subject: [PATCH 30/34] dcevm15 - lock on
ClassLoaderDataGraph::rollback_redefinition
rollback is not in safepoint, therefore must be locked
---
src/hotspot/share/prims/jvmtiEnhancedRedefineClasses.cpp | 2 ++
1 file changed, 2 insertions(+)
diff --git a/src/hotspot/share/prims/jvmtiEnhancedRedefineClasses.cpp b/src/hotspot/share/prims/jvmtiEnhancedRedefineClasses.cpp
index e00fac1f693..db5fb1c472b 100644
--- a/src/hotspot/share/prims/jvmtiEnhancedRedefineClasses.cpp
+++ b/src/hotspot/share/prims/jvmtiEnhancedRedefineClasses.cpp
@@ -1382,7 +1382,9 @@ void VM_EnhancedRedefineClasses::calculate_instance_update_information(Klass* ne
// Rollback all changes - clear new classes from the system dictionary, return old classes to directory, free memory.
void VM_EnhancedRedefineClasses::rollback() {
log_info(redefine, class, load)("Rolling back redefinition, result=%d", _res);
+ ClassLoaderDataGraph_lock->lock();
ClassLoaderDataGraph::rollback_redefinition();
+ ClassLoaderDataGraph_lock->unlock();
for (int i = 0; i < _new_classes->length(); i++) {
SystemDictionary::remove_from_hierarchy(_new_classes->at(i));
--
2.23.0

View File

@@ -1,28 +0,0 @@
From 9b405cb642d5935c39c8dbd522ea2fdecfc29ef3 Mon Sep 17 00:00:00 2001
From: Vladimir Dvorak <vladimir.dvorak@jetbrains.com>
Date: Sun, 29 Nov 2020 19:59:50 +0100
Subject: [PATCH 31/34] ResourceMark in G1IterateObjectClosureTask fixing
memory leaks
G1IterateObjectClosureTask is used only in redefinition full GC run
---
src/hotspot/share/gc/g1/g1CollectedHeap.cpp | 3 +++
1 file changed, 3 insertions(+)
diff --git a/src/hotspot/share/gc/g1/g1CollectedHeap.cpp b/src/hotspot/share/gc/g1/g1CollectedHeap.cpp
index a29d2dddc2d..2af6df6c1e4 100644
--- a/src/hotspot/share/gc/g1/g1CollectedHeap.cpp
+++ b/src/hotspot/share/gc/g1/g1CollectedHeap.cpp
@@ -2362,6 +2362,9 @@ class G1IterateObjectClosureTask : public AbstractGangTask {
_cl(cl), _g1h(g1h), _hrclaimer(g1h->workers()->active_workers()) { }
virtual void work(uint worker_id) {
+ Thread *thread = Thread::current();
+ HandleMark hm(thread); // make sure any handles created are deleted
+ ResourceMark rm(thread);
IterateObjectClosureRegionClosure blk(_cl);
_g1h->heap_region_par_iterate_from_worker_offset(&blk, &_hrclaimer, worker_id);
}
--
2.23.0

View File

@@ -1,91 +0,0 @@
From 40fe40884d4efc50864bb3f2dd88f0a2e7122d5a Mon Sep 17 00:00:00 2001
From: Vladimir Dvorak <vladimir.dvorak@jetbrains.com>
Date: Sun, 29 Nov 2020 20:05:03 +0100
Subject: [PATCH 32/34] dcevm15 - fix hidded classes
---
.../prims/jvmtiEnhancedRedefineClasses.cpp | 41 ++++++++++++++-----
1 file changed, 30 insertions(+), 11 deletions(-)
diff --git a/src/hotspot/share/prims/jvmtiEnhancedRedefineClasses.cpp b/src/hotspot/share/prims/jvmtiEnhancedRedefineClasses.cpp
index db5fb1c472b..590f7fdfafe 100644
--- a/src/hotspot/share/prims/jvmtiEnhancedRedefineClasses.cpp
+++ b/src/hotspot/share/prims/jvmtiEnhancedRedefineClasses.cpp
@@ -722,7 +722,8 @@ bool VM_EnhancedRedefineClasses::is_modifiable_class(oop klass_mirror) {
}
// Cannot redefine or retransform an anonymous class.
- if (InstanceKlass::cast(k)->is_unsafe_anonymous()) {
+ // TODO: check if is correct in j15
+ if (InstanceKlass::cast(k)->is_unsafe_anonymous() || InstanceKlass::cast(k)->is_hidden()) {
return false;
}
return true;
@@ -808,21 +809,27 @@ jvmtiError VM_EnhancedRedefineClasses::load_new_class_versions(TRAPS) {
InstanceKlass* k;
- if (InstanceKlass::cast(the_class)->is_unsafe_anonymous()) {
- const InstanceKlass* host_class = the_class->unsafe_anonymous_host();
+ if (the_class->is_unsafe_anonymous() || the_class->is_hidden()) {
+ InstanceKlass* dynamic_host_class = NULL;
+ InstanceKlass* unsafe_anonymous_host = NULL;
- // Make sure it's the real host class, not another anonymous class.
- while (host_class != NULL && host_class->is_unsafe_anonymous()) {
- host_class = host_class->unsafe_anonymous_host();
+ if (the_class->is_hidden()) {
+ log_debug(redefine, class, load)("loading hidden class %s", the_class->name()->as_C_string());
+ dynamic_host_class = the_class->nest_host(THREAD);
+ }
+
+ if (the_class->is_unsafe_anonymous()) {
+ log_debug(redefine, class, load)("loading usafe anonymous %s", the_class->name()->as_C_string());
+ unsafe_anonymous_host = the_class->unsafe_anonymous_host();
}
ClassLoadInfo cl_info(protection_domain,
- host_class,
- NULL, // dynamic_nest_host
+ unsafe_anonymous_host,
NULL, // cp_patches
+ dynamic_host_class, // dynamic_nest_host
Handle(), // classData
- false, // is_hidden
- false, // is_strong_hidden
+ the_class->is_hidden(), // is_hidden
+ !the_class->is_non_strong_hidden(), // is_strong_hidden
true); // FIXME: check if correct. can_access_vm_annotations
k = SystemDictionary::parse_stream(the_class_sym,
@@ -833,7 +840,17 @@ jvmtiError VM_EnhancedRedefineClasses::load_new_class_versions(TRAPS) {
THREAD);
k->class_loader_data()->exchange_holders(the_class->class_loader_data());
- the_class->class_loader_data()->inc_keep_alive();
+
+ if (the_class->is_hidden()) {
+ // from jvm_lookup_define_class() (jvm.cpp):
+ // The hidden class loader data has been artificially been kept alive to
+ // this point. The mirror and any instances of this class have to keep
+ // it alive afterwards.
+ the_class->class_loader_data()->dec_keep_alive();
+ } else {
+ the_class->class_loader_data()->inc_keep_alive();
+ }
+
} else {
k = SystemDictionary::resolve_from_stream(the_class_sym,
the_class_loader,
@@ -1475,6 +1492,8 @@ void VM_EnhancedRedefineClasses::ClearCpoolCacheAndUnpatch::do_klass(Klass* k) {
ik->set_unsafe_anonymous_host(InstanceKlass::cast(ik->unsafe_anonymous_host()->newest_version()));
}
+ // FIXME: check new nest_host for hidden
+
// Update implementor if there is only one, in this case implementor() can reference old class
if (ik->is_interface()) {
Klass* implKlass = ik->implementor();
--
2.23.0

View File

@@ -1,27 +0,0 @@
From 29920b076b4ad96d85adbce0a1d947e5022ba3ad Mon Sep 17 00:00:00 2001
From: Vladimir Dvorak <vladimir.dvorak@jetbrains.com>
Date: Sun, 29 Nov 2020 20:08:57 +0100
Subject: [PATCH 33/34] dcevm15 - DON'T clear F2 in CP cache after indy
unevolving
It's not clear why it was cleared in dcevm7-11
---
src/hotspot/share/oops/cpCache.cpp | 2 +-
1 file changed, 1 insertion(+), 1 deletion(-)
diff --git a/src/hotspot/share/oops/cpCache.cpp b/src/hotspot/share/oops/cpCache.cpp
index 79a38dbeff0..650e6fab42d 100644
--- a/src/hotspot/share/oops/cpCache.cpp
+++ b/src/hotspot/share/oops/cpCache.cpp
@@ -650,7 +650,7 @@ void ConstantPoolCacheEntry::clear_entry() {
if (clearData) {
if (!is_resolved_reference()) {
- _f2 = 0;
+ // _f2 = 0;
}
// FIXME: (DCEVM) we want to clear flags, but parameter size is actually used
// after we return from the method, before entry is re-initialized. So let's
--
2.23.0

View File

@@ -1,49 +0,0 @@
From 1f13b20ab5553182680045b7d7324ff92da7e7f0 Mon Sep 17 00:00:00 2001
From: Vladimir Dvorak <vladimir.dvorak@jetbrains.com>
Date: Sun, 29 Nov 2020 21:28:06 +0100
Subject: [PATCH 34/34] dcevm15 - fix Universe::root_oops_do
Removed ClassLoaderDataGraph::cld_do was cause of crashes due multiple
oop patching. ClassLoaderDataGraph::cld_do replaced in dcevm15
previously used and removed SystemDictionary:oops_do
---
src/hotspot/share/memory/universe.cpp | 11 ++++++++---
1 file changed, 8 insertions(+), 3 deletions(-)
diff --git a/src/hotspot/share/memory/universe.cpp b/src/hotspot/share/memory/universe.cpp
index 8dad437bd51..0199962a684 100644
--- a/src/hotspot/share/memory/universe.cpp
+++ b/src/hotspot/share/memory/universe.cpp
@@ -190,21 +190,26 @@ void Universe::root_oops_do(OopClosure *oopClosure) {
// (DCEVM) TODO: Check if this is correct?
Management::oops_do(oopClosure);
OopStorageSet::vm_global()->oops_do(oopClosure);
- CLDToOopClosure cld_closure(oopClosure, ClassLoaderData::_claim_none);
- ClassLoaderDataGraph::cld_do(&cld_closure);
+ // CLDToOopClosure cld_closure(oopClosure, ClassLoaderData::_claim_none);
+ // ClassLoaderDataGraph::cld_do(&cld_closure);
// Now adjust pointers in remaining weak roots. (All of which should
// have been cleared if they pointed to non-surviving objects.)
// Global (weak) JNI handles
WeakProcessor::oops_do(oopClosure);
+ JvmtiExport::oops_do(oopClosure);
+
CodeBlobToOopClosure blobClosure(oopClosure, CodeBlobToOopClosure::FixRelocations);
CodeCache::blobs_do(&blobClosure);
+
AOT_ONLY(AOTLoader::oops_do(oopClosure);)
+
// StringTable::oops_do was removed in j15
// StringTable::oops_do(oopClosure);
- // PSScavenge::reference_processor()->weak_oops_do(oopClosure);
+ // OopStorageSet::vm_global()->oops_do(oopClosure);
+
}
void Universe::oops_do(OopClosure* f) {
--
2.23.0

View File

@@ -1,83 +0,0 @@
#!/bin/bash -x
usage ()
{
echo "Usage: perfcmp.sh [options] <test_results_cur> <test_results_ref> <results> <test_prefix> <noHeaders>"
echo "Options:"
echo -e " -h, --help\tdisplay this help"
echo -e " -tc\tprint teacmity statistic"
echo -e "test_results_cur - the file with metrics values for the current measuring"
echo -e "test_results_ref - the file with metrics values for the reference measuring"
echo -e "results - results of comaprison"
echo -e "test_prefix - specifys measuring type, makes sense for enabled -tc, by default no prefixes"
echo -e "noHeaders - by default 1-st line contains headers"
echo -e ""
echo -e "test_results_* files content should be in csv format with header and tab separator:"
echo -e "The 1-st column is the test name"
echo -e "The 2-st column is the test value"
echo -e ""
echo -e "Example:"
echo -e "Test Value"
echo -e "Testname 51.54"
}
while [ -n "$1" ]
do
case "$1" in
-h | --help) usage
exit 1 ;;
-tc) tc=1
shift
break ;;
*) break;;
esac
done
if [[ "$#" < "3" ]]; then
echo "Error: Invalid arguments"
usage
exit 1
fi
curFile=$1
refFile=$2
resFile=$3
testNamePrefix=$4
noHeaders=$5
echo $curFile
echo $refFile
echo $resFile
curValues=`cat "$curFile" | cut -f 2 | tr -d '\t'`
if [ -z $noHeaders ]; then
curValuesHeader=`echo "$curValues" | head -n +1`_cur
header=`cat "$refFile" | head -n +1 | awk -F'\t' -v x=$curValuesHeader '{print " "$1"\t"$2"_ref\t"x"\tratio"}'`
testContent=`paste -d '\t' $refFile <(echo "$curValues") | tail -n +2`
else
testContent=`paste -d '\t' $refFile <(echo "$curValues") | tail -n +1`
fi
testContent=`echo "$testContent" | awk -F'\t' '{ if ($3>$2+$2*0.1) {print "* "$1"\t"$2"\t"$3"\t"(($2==0)?"-":$3/$2)} else {print " "$1"\t"$2"\t"$3"\t"(($2==0)?"-":$3/$2)} }'`
if [ -z $noHeaders ]; then
echo "$header" > $resFile
fi
echo "$testContent" >> $resFile
cat "$resFile" | tr '\t' ';' | column -t -s ';' | tee $resFile
if [ -z $tc ]; then
exit 0
fi
echo "$testContent" 2>&1 | (
while read -r s; do
testname=`echo "$s" | cut -f 1 | tr -d "[:space:]" | tr -d "*"`
duration=`echo "$s" | cut -f 3`
failed=`echo "$s" | cut -c1 | grep -c "*"`
echo \#\#teamcity[testStarted name=\'$testNamePrefix$testname\']
echo "===>$s"
echo \#\#teamcity[buildStatisticValue key=\'$testNamePrefix$testname\' value=\'$duration\']
[ $failed -eq 1 ] && echo \#\#teamcity[testFailed name=\'$testNamePrefix$testname\' message=\'$s\']
echo \#\#teamcity[testFinished name=\'$testNamePrefix$testname\' duration=\'$duration\']
failed=0
done
)
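A minimal sketch of how the comparison script above is meant to be driven; the file names, test name and numbers are made up for illustration, and only the tab-separated layout comes from the usage text.
# cur.tsv and ref.tsv are hypothetical inputs: tab-separated, header in the first line,
# e.g. "Test<TAB>Value" followed by rows such as "Startup<TAB>51.54".
bash perfcmp.sh -tc cur.tsv ref.tsv result.tsv startup_
# result.tsv then lists the reference and current values side by side with a ratio column;
# rows whose current value exceeds the reference by more than 10% are flagged with '*',
# and -tc additionally re-emits each row as TeamCity service messages.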

View File

@@ -1,131 +0,0 @@
#!/bin/bash -x
# The following parameters must be specified:
# build_number - specifies the number of JetBrainsRuntime build
# bundle_type - specifies the bundle to be built; possible values:
# <empty> or nomod - the release bundles without any additional modules (jcef)
# jcef - the release bundles with jcef
# fd - the fastdebug bundles which also include the jcef module
#
# This script makes test-image along with JDK images when bundle_type is set to "jcef".
# If the character 't' is added at the end of bundle_type then it also makes test-image along with JDK images.
#
# Environment variables:
# JDK_BUILD_NUMBER - specifies update release of OpenJDK build or the value of --with-version-build argument
# to configure
# By default JDK_BUILD_NUMBER is set to zero
# JCEF_PATH - specifies the path to the directory with JCEF binaries.
# By default JCEF binaries should be located in ./jcef_win_x64
source jb/project/tools/common/scripts/common.sh
WORK_DIR=$(pwd)
JCEF_PATH=${JCEF_PATH:=$WORK_DIR/jcef_win_x64}
function do_configure {
sh ./configure \
$WITH_DEBUG_LEVEL \
--with-vendor-name="$VENDOR_NAME" \
--with-vendor-version-string="$VENDOR_VERSION_STRING" \
--with-jvm-features=shenandoahgc \
--with-version-pre= \
--with-version-build=$JDK_BUILD_NUMBER \
--with-version-opt=b${build_number} \
--with-toolchain-version=$TOOLCHAIN_VERSION \
--with-boot-jdk=$BOOT_JDK \
--disable-ccache \
--enable-cds=yes || do_exit $?
}
function create_image_bundle {
__bundle_name=$1
__arch_name=$2
__modules_path=$3
__modules=$4
[ "$bundle_type" == "fd" ] && [ "$__arch_name" == "$JBRSDK_BUNDLE" ] && __bundle_name=$__arch_name && fastdebug_infix="fastdebug-"
echo Running jlink ...
${JSDK}/bin/jlink \
--module-path $__modules_path --no-man-pages --compress=2 \
--add-modules $__modules --output $__arch_name || do_exit $?
grep -v "^JAVA_VERSION" "$JSDK"/release | grep -v "^MODULES" >> $__arch_name/release
if [ "$__arch_name" == "$JBRSDK_BUNDLE" ]; then
sed 's/JBR/JBRSDK/g' $__arch_name/release > release
mv release $__arch_name/release
cp $IMAGES_DIR/jdk/lib/src.zip $__arch_name/lib
copy_jmods "$__modules" "$__modules_path" "$__arch_name"/jmods
fi
}
WITH_DEBUG_LEVEL="--with-debug-level=release"
RELEASE_NAME=windows-x86_64-server-release
case "$bundle_type" in
"jcef")
do_reset_changes=0
do_maketest=1
;;
"dcevm")
HEAD_REVISION=$(git rev-parse HEAD)
git am jb/project/tools/patches/dcevm/*.patch || do_exit $?
do_reset_dcevm=0
do_reset_changes=0
;;
"nomod" | "")
bundle_type=""
;;
"fd")
do_reset_changes=0
WITH_DEBUG_LEVEL="--with-debug-level=fastdebug"
RELEASE_NAME=windows-x86_64-server-fastdebug
;;
esac
if [ -z "$INC_BUILD" ]; then
do_configure || do_exit $?
if [ $do_maketest -eq 1 ]; then
make LOG=info CONF=$RELEASE_NAME clean images test-image || do_exit $?
else
make LOG=info CONF=$RELEASE_NAME clean images || do_exit $?
fi
else
if [ $do_maketest -eq 1 ]; then
make LOG=info CONF=$RELEASE_NAME images test-image || do_exit $?
else
make LOG=info CONF=$RELEASE_NAME images || do_exit $?
fi
fi
IMAGES_DIR=build/$RELEASE_NAME/images
JSDK=$IMAGES_DIR/jdk
JSDK_MODS_DIR=$IMAGES_DIR/jmods
JBRSDK_BUNDLE=jbrsdk
where cygpath
if [ $? -eq 0 ]; then
JCEF_PATH="$(cygpath -w $JCEF_PATH | sed 's/\\/\//g')"
fi
if [ "$bundle_type" == "jcef" ] || [ "$bundle_type" == "dcevm" ] || [ "$bundle_type" == "fd" ]; then
git apply -p0 < jb/project/tools/patches/add_jcef_module.patch || do_exit $?
update_jsdk_mods "$JSDK" "$JCEF_PATH"/jmods "$JSDK"/jmods "$JSDK_MODS_DIR" || do_exit $?
cp $JCEF_PATH/jmods/* ${JSDK_MODS_DIR} # $JSDK/jmods is not unchanged
jbr_name_postfix="_${bundle_type}"
fi
# create runtime image bundle
modules=$(xargs < jb/project/tools/common/modules.list | sed s/" "//g) || do_exit $?
modules+=",jdk.crypto.mscapi"
create_image_bundle "jbr${jbr_name_postfix}" "jbr" $JSDK_MODS_DIR "$modules" || do_exit $?
# create sdk image bundle
modules=$(cat ${JSDK}/release | grep MODULES | sed s/MODULES=//g | sed s/' '/','/g | sed s/\"//g | sed s/\\r//g | sed s/\\n//g)
if [ "$bundle_type" == "jcef" ] || [ "$bundle_type" == "dcevm" ] || [ "$bundle_type" == "fd" ] || [ "$bundle_type" == "$JBRSDK_BUNDLE" ]; then
modules=${modules},$(get_mods_list "$JCEF_PATH"/jmods)
fi
create_image_bundle "$JBRSDK_BUNDLE${jbr_name_postfix}" "$JBRSDK_BUNDLE" "$JSDK_MODS_DIR" "$modules" || do_exit $?
do_exit 0
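A hedged sketch of an invocation of the build script above, using only the parameters documented in its header comments; the concrete values are placeholders and the script path is left generic because it is not spelled out here. The script is expected to run from the repository root, since it sources jb/project/tools/common/scripts/common.sh.
# Values below are illustrative only.
export JDK_BUILD_NUMBER=9
export JCEF_PATH="$(pwd)/jcef_win_x64"
build_number=1234 bundle_type=jcef bash <path-to-this-script>
# bundle_type=dcevm additionally applies jb/project/tools/patches/dcevm/*.patch before configuring,
# and bundle_type=fd switches configure to --with-debug-level=fastdebug.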

View File

@@ -1,63 +0,0 @@
#!/bin/bash -x
# The following parameters must be specified:
# JBSDK_VERSION - specifies the current version of OpenJDK e.g. 11_0_6
# JDK_BUILD_NUMBER - specifies the number of OpenJDK build or the value of --with-version-build argument to configure
# build_number - specifies the number of JetBrainsRuntime build
#
# jbrsdk-${JBSDK_VERSION}-osx-x64-b${build_number}.tar.gz
# jbr-${JBSDK_VERSION}-osx-x64-b${build_number}.tar.gz
#
# $ ./java --version
# openjdk 11.0.6 2020-01-14
# OpenJDK Runtime Environment (build 11.0.6+${JDK_BUILD_NUMBER}-b${build_number})
# OpenJDK 64-Bit Server VM (build 11.0.6+${JDK_BUILD_NUMBER}-b${build_number}, mixed mode)
#
JBSDK_VERSION=$1
JDK_BUILD_NUMBER=$2
build_number=$3
JBSDK_VERSION_WITH_DOTS=$(echo $JBSDK_VERSION | sed 's/_/\./g')
source jb/project/tools/common/scripts/common.sh
JBRSDK_BASE_NAME=jbrsdk-${JBSDK_VERSION}
WORK_DIR=$(pwd)
[ -z "$bundle_type" ] && (git apply -p0 < jb/project/tools/patches/exclude_jcef_module.patch || exit $?)
PATH="/usr/local/bin:/usr/bin:${PATH}"
./configure \
--with-target-bits=32 \
--with-vendor-name="${VENDOR_NAME}" \
--with-vendor-version-string="${VENDOR_VERSION_STRING}" \
--with-version-pre= \
--with-version-build=${JDK_BUILD_NUMBER} \
--with-version-opt=b${build_number} \
--with-toolchain-version=${TOOLCHAIN_VERSION} \
--with-boot-jdk=${BOOT_JDK} \
--disable-ccache \
--enable-cds=yes || exit 1
make clean CONF=windows-x86-server-release || exit 1
make LOG=info images CONF=windows-x86-server-release test-image || exit 1
JBSDK=${JBRSDK_BASE_NAME}-windows-x86-b${build_number}
BASE_DIR=build/windows-x86-server-release/images
JSDK=${BASE_DIR}/jdk
JBRSDK_BUNDLE=jbrsdk
rm -rf ${BASE_DIR}/${JBRSDK_BUNDLE} && rsync -a --exclude demo --exclude sample ${JSDK}/ ${JBRSDK_BUNDLE} || exit 1
sed 's/JBR/JBRSDK/g' ${JSDK}/release > release
mv release ${JBRSDK_BUNDLE}/release
JBR_BUNDLE=jbr
rm -rf ${JBR_BUNDLE}
grep -v javafx jb/project/tools/common/modules.list | grep -v "jdk.internal.vm\|jdk.aot\|jcef" > modules.list.x86
echo ",jdk.crypto.mscapi" >> modules.list.x86
${JSDK}/bin/jlink \
--module-path ${JSDK}/jmods --no-man-pages --compress=2 \
--add-modules $(xargs < modules.list.x86 | sed s/" "//g) --output ${JBR_BUNDLE} || exit $?
echo Modifying release info ...
#grep -v \"^JAVA_VERSION\" ${JSDK}/release | grep -v \"^MODULES\" >> ${JBR_BUNDLE}/release

View File

@@ -1,48 +0,0 @@
#!/bin/bash -x
# The following parameters must be specified:
# build_number - specifies the number of JetBrainsRuntime build
# bundle_type - specifies the bundle to be built; possible values:
# <empty> or nomod - the release bundles without any additional modules (jcef)
# jcef - the release bundles with jcef
# fd - the fastdebug bundles which also include the jcef module
#
# This script packs test-image along with JDK images when bundle_type is set to "jcef".
# If the character 't' is added at the end of bundle_type then it also packs the test-image along with the JDK images.
#
source jb/project/tools/common/scripts/common.sh
[ "$bundle_type" == "jcef" ] && do_maketest=1
function pack_jbr {
__bundle_name=$1
__arch_name=$2
[ "$bundle_type" == "fd" ] && [ "$__arch_name" == "$JBRSDK_BUNDLE" ] && __bundle_name=$__arch_name && fastdebug_infix="fastdebug-"
JBR=${__bundle_name}-${JBSDK_VERSION}-windows-x64-${fastdebug_infix}b${build_number}
echo Creating $JBR.tar.gz ...
/usr/bin/tar -czf $JBR.tar.gz -C $BASE_DIR $__arch_name || do_exit $?
}
[ "$bundle_type" == "nomod" ] && bundle_type=""
JBRSDK_BUNDLE=jbrsdk
RELEASE_NAME=windows-x86_64-server-release
IMAGES_DIR=build/$RELEASE_NAME/images
BASE_DIR=.
if [ "$bundle_type" == "jcef" ] || [ "$bundle_type" == "dcevm" ] || [ "$bundle_type" == "fd" ]; then
jbr_name_postfix="_${bundle_type}"
fi
pack_jbr jbr${jbr_name_postfix} jbr
pack_jbr jbrsdk${jbr_name_postfix} jbrsdk
if [ $do_maketest -eq 1 ]; then
JBRSDK_TEST=$JBRSDK_BUNDLE-$JBSDK_VERSION-windows-test-x64-b$build_number
echo Creating $JBRSDK_TEST.tar.gz ...
/usr/bin/tar -czf $JBRSDK_TEST.tar.gz -C $IMAGES_DIR --exclude='test/jdk/demos' test || do_exit $?
fi
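For orientation, the archive names produced by the packing script above follow the patterns in pack_jbr and the test-image branch; the version and build number below are placeholders that reuse the 11_0_6 example format from the neighbouring scripts.
# Illustrative results for bundle_type=jcef, JBSDK_VERSION=11_0_6, build_number=1234:
#   jbr_jcef-11_0_6-windows-x64-b1234.tar.gz
#   jbrsdk_jcef-11_0_6-windows-x64-b1234.tar.gz
#   jbrsdk-11_0_6-windows-test-x64-b1234.tar.gz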

View File

@@ -1,42 +0,0 @@
#!/bin/bash -x
# The following parameters must be specified:
# JBSDK_VERSION - specifies the current version of OpenJDK e.g. 11_0_6
# JDK_BUILD_NUMBER - specifies the number of OpenJDK build or the value of --with-version-build argument to configure
# build_number - specifies the number of JetBrainsRuntime build
#
# jbrsdk-${JBSDK_VERSION}-osx-x64-b${build_number}.tar.gz
# jbr-${JBSDK_VERSION}-osx-x64-b${build_number}.tar.gz
#
# $ ./java --version
# openjdk 11.0.6 2020-01-14
# OpenJDK Runtime Environment (build 11.0.6+${JDK_BUILD_NUMBER}-b${build_number})
# OpenJDK 64-Bit Server VM (build 11.0.6+${JDK_BUILD_NUMBER}-b${build_number}, mixed mode)
#
JBSDK_VERSION=$1
JDK_BUILD_NUMBER=$2
build_number=$3
JBRSDK_BASE_NAME=jbrsdk-$JBSDK_VERSION
JBR_BASE_NAME=jbr-$JBSDK_VERSION
IMAGES_DIR=build/windows-x86-server-release/images
JSDK=$IMAGES_DIR/jdk
JBSDK=$JBRSDK_BASE_NAME-windows-x86-b$build_number
BASE_DIR=.
JBRSDK_BUNDLE=jbrsdk
echo Creating $JBSDK.tar.gz ...
/usr/bin/tar -czf $JBSDK.tar.gz $JBRSDK_BUNDLE || exit 1
JBR_BUNDLE=jbr
JBR_BASE_NAME=jbr-${JBSDK_VERSION}
JBR=$JBR_BASE_NAME-windows-x86-b$build_number
echo Creating $JBR.tar.gz ...
/usr/bin/tar -czf $JBR.tar.gz -C $BASE_DIR ${JBR_BUNDLE} || exit 1
JBRSDK_TEST=$JBRSDK_BASE_NAME-windows-test-x86-b$build_number
echo Creating $JBRSDK_TEST.tar.gz ...
/usr/bin/tar -czf $JBRSDK_TEST.tar.gz -C $IMAGES_DIR --exclude='test/jdk/demos' test || exit 1

View File

@@ -29,7 +29,7 @@ include $(SPEC)
include MakeBase.gmk
# When FIXPATH is set, let it process the file to make sure all paths are usable
# by system native tools. The FIXPATH tool assumes arguments preceded by an @
# by system native tools. The FIXPATH tool assumes arguments preceeded by an @
# character points to a text file containing further arguments (similar to a
# linker). It replaces any such arguments with a different temporary filename,
# whose contents has been processed to make any paths native. To obtain a

View File

@@ -104,7 +104,7 @@ define SetupBuildDemoBody
ifneq ($$($1_SRC_SUB_DIR), )
$1_MAIN_SRC := $$($1_SRC_BASE)/$$($1_SRC_SUB_DIR)
else
# for almost all
# for allmost all
$1_MAIN_SRC := $$($1_SRC_BASE)
endif

View File

@@ -131,7 +131,7 @@ JAVA_PLATFORM := Java Platform
ifeq ($(IS_DRAFT), true)
DRAFT_MARKER_STR := <br><strong>DRAFT $(VERSION_STRING)</strong>
ifeq ($(VERSION_BUILD), )
ifeq ($(VERSION_BUILD), 0)
DRAFT_MARKER_TITLE := $(SPACE)[ad-hoc build]
else
DRAFT_MARKER_TITLE := $(SPACE)[build $(VERSION_BUILD)]

View File

@@ -1,148 +0,0 @@
#
# Copyright (c) 2022, Oracle and/or its affiliates. All rights reserved.
# DO NOT ALTER OR REMOVE COPYRIGHT NOTICES OR THIS FILE HEADER.
#
# This code is free software; you can redistribute it and/or modify it
# under the terms of the GNU General Public License version 2 only, as
# published by the Free Software Foundation. Oracle designates this
# particular file as subject to the "Classpath" exception as provided
# by Oracle in the LICENSE file that accompanied this code.
#
# This code is distributed in the hope that it will be useful, but WITHOUT
# ANY WARRANTY; without even the implied warranty of MERCHANTABILITY or
# FITNESS FOR A PARTICULAR PURPOSE. See the GNU General Public License
# version 2 for more details (a copy is included in the LICENSE file that
# accompanied this code).
#
# You should have received a copy of the GNU General Public License version
# 2 along with this work; if not, write to the Free Software Foundation,
# Inc., 51 Franklin St, Fifth Floor, Boston, MA 02110-1301 USA.
#
# Please contact Oracle, 500 Oracle Parkway, Redwood Shores, CA 94065 USA
# or visit www.oracle.com if you need additional information or have any
# questions.
#
default: all
include $(SPEC)
include MakeBase.gmk
# Hook to include the corresponding custom file, if present.
$(eval $(call IncludeCustomExtension, Doctor.gmk))
################################################################################
#
# Help user diagnose possible errors and problems with the build environment.
#
prologue:
$(ECHO)
$(ECHO) '"make doctor" will help you analyze your build environment. It can highlight'
$(ECHO) 'certain well-known problems, but it can never find all possible errors.'
TARGETS += prologue
check-git: prologue
$(ECHO)
$(ECHO) '* Verifying that configure has picked up git...'
ifeq ($(GIT), )
$(ECHO) 'WARNING: "git" is not present. This will disable several checks.'
$(ECHO) '! Correct by installing git and verifying that it is in the PATH'
endif
TARGETS += check-git
ifneq ($(GIT), )
AUTOCRLF := $(shell $(GIT) config core.autocrlf)
endif
check-autocrlf: check-git
ifneq ($(GIT), )
ifeq ($(call isBuildOs, windows), true)
$(ECHO)
$(ECHO) '* Verifying git core.autocrlf value...'
ifneq ($(AUTOCRLF), false)
$(ECHO) 'WARNING: core.autocrlf is not "false". HIGH RISK of build failure!'
$(ECHO) '! Correct by running 'git config --global core.autocrlf false' and re-cloning the repo'
endif
endif
endif
TARGETS += check-autocrlf
check-configure-warnings: check-autocrlf
$(ECHO)
$(ECHO) '* Checking for warnings from configure...'
warning_output=`$(GREP) -e "^\* Memory limit:" -A 300 $(OUTPUTDIR)/configure.log | $(TAIL) -n +3 | $(SED) -e '$(DOLLAR){/^$(DOLLAR)/d;}'` && \
if test -n "$$warning_output" ; then \
$(ECHO) ' ---' ; \
$(GREP) -e "^\* Memory limit:" -A 300 $(OUTPUTDIR)/configure.log | $(TAIL) -n +3 | $(SED) -e '$(DOLLAR){/^$(DOLLAR)/d;}' ; \
$(ECHO) ' ---' ; \
$(ECHO) '! Inspect the warnings, fix any problems, and re-run configure' ; \
fi
TARGETS += check-configure-warnings
ifneq ($(GIT), )
# This might have been set by custom component
UNTRACKED_FILES ?= $(shell $(GIT) status --porcelain --ignored | $(CUT) -c 4-)
endif
check-core-files: check-configure-warnings
ifneq ($(GIT), )
$(ECHO)
$(ECHO) '* Checking for left-over core files...'
core_files_found=`echo "$(UNTRACKED_FILES)" | $(TR) ' ' '\n' | $(GREP) core` && \
if test -n "$$core_files_found" ; then \
$(ECHO) 'Found these potential core files. They might interfere with the build process:' ; \
$(ECHO) ' ---' ; \
$(ECHO) $$core_files_found | $(TR) ' ' '\n'; \
$(ECHO) ' ---' ; \
$(ECHO) '! Remove left-over core files' ; \
fi || : # do nothing if grep returns non-0 value
endif
TARGETS += check-core-files
check-bad-file-names: check-core-files
ifneq ($(GIT), )
$(ECHO)
$(ECHO) '* Checking for untracked files with illegal names...'
core_files_found=`echo "$(UNTRACKED_FILES)" | $(TR) ' ' '\n' | $(GREP) '#'` && \
if test -n "$$core_files_found" ; then \
$(ECHO) 'Found these files with illegal names. They *will* cause build failures:' ; \
$(ECHO) ' ---' ; \
$(ECHO) $$core_files_found | $(TR) ' ' '\n'; \
$(ECHO) ' ---' ; \
$(ECHO) '! Remove all files with '#' in their name from the JDK source tree' ; \
fi || : # do nothing if grep returns non-0 value
endif
TARGETS += check-bad-file-names
epilogue: check-bad-file-names
$(ECHO)
$(ECHO) '* If all else fails, try removing the entire build directory and re-creating'
$(ECHO) 'the same configuration using:'
$(ECHO) ' ---' ; \
$(ECHO) configure_command_line=\$$\(make print-configuration\)
$(ECHO) make dist-clean
$(ECHO) bash configure \$$configure_command_line
$(ECHO) ' ---' ; \
$(ECHO)
$(ECHO) '* The build README (doc/building.md) is a great source of information,'
$(ECHO) 'especially the chapter "Fixing Unexpected Build Failures". Check it out!'
$(ECHO)
$(ECHO) '* If you still need assistance please contact build-dev@openjdk.java.net.'
$(ECHO)
TARGETS += epilogue
################################################################################
doctor: $(TARGETS)
all: doctor
.PHONY: default all doctor $(TARGETS)
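Because Doctor.gmk chains its checks purely through make prerequisites (prologue, check-git, check-autocrlf, ..., epilogue), a single top-level invocation runs them all in order; the recovery sequence below is simply the one the epilogue itself prints.
# Run the diagnostics from an already configured build tree:
make doctor
# Last-resort recovery, as echoed by the epilogue above:
configure_command_line=$(make print-configuration)
make dist-clean
bash configure $configure_command_line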

View File

@@ -1,5 +1,5 @@
#
# Copyright (c) 2012, 2022, Oracle and/or its affiliates. All rights reserved.
# Copyright (c) 2012, 2020, Oracle and/or its affiliates. All rights reserved.
# DO NOT ALTER OR REMOVE COPYRIGHT NOTICES OR THIS FILE HEADER.
#
# This code is free software; you can redistribute it and/or modify it
@@ -53,17 +53,14 @@ help:
$(info $(_) make docs-jdk-api # Create just JDK javadocs)
$(info $(_) make bootcycle-images # Build images twice, second time with newly built JDK)
$(info $(_) make install # Install the generated images locally)
$(info $(_) make reconfigure # Rerun configure with the same arguments as last time)
$(info $(_) make help # Give some help on using make)
$(info $(_) make check # Run basic testing (currently tier1))
$(info $(_) make test-<test> # Run test, e.g. test-tier1)
$(info $(_) make test TEST=<t> # Run test(s) given by TEST specification)
$(info $(_) make exploded-test TEST=<t> # Run test(s) on the exploded image instead of)
$(info $(_) # the full jdk image)
$(info )
$(info Targets for troubleshooting)
$(info $(_) make help # Give some help on using make)
$(info $(_) make doctor # Diagnose build environment problems)
$(info $(_) make reconfigure # Rerun configure with the same arguments as last time)
$(info )
$(info Targets for cleaning)
$(info $(_) make clean # Remove all files generated by make, but not those)
$(info $(_) # generated by configure)

View File

@@ -23,7 +23,7 @@
# questions.
#
# This makefile creates a jdk image overlaid with statically linked core
# This makefile creates a jdk image overlayed with statically linked core
# libraries.
default: all

View File

@@ -1,5 +1,5 @@
#
# Copyright (c) 2020, 2022, Oracle and/or its affiliates. All rights reserved.
# Copyright (c) 2020, 2021, Oracle and/or its affiliates. All rights reserved.
# DO NOT ALTER OR REMOVE COPYRIGHT NOTICES OR THIS FILE HEADER.
#
# This code is free software; you can redistribute it and/or modify it
@@ -35,165 +35,107 @@ include JdkNativeCompilation.gmk
################################################################################
HSDIS_OUTPUT_DIR := $(SUPPORT_OUTPUTDIR)/hsdis
REAL_HSDIS_NAME := hsdis-$(OPENJDK_TARGET_CPU_LEGACY_LIB)$(SHARED_LIBRARY_SUFFIX)
BUILT_HSDIS_LIB := $(HSDIS_OUTPUT_DIR)/$(REAL_HSDIS_NAME)
HSDIS_TOOLCHAIN := TOOLCHAIN_DEFAULT
HSDIS_TOOLCHAIN_CFLAGS := $(CFLAGS_JDKLIB)
HSDIS_TOOLCHAIN_LDFLAGS := $(LDFLAGS_JDKLIB)
ifeq ($(call isTargetOs, windows), true)
INSTALLED_HSDIS_DIR := $(JDK_OUTPUTDIR)/bin
ifeq ($(HSDIS_BACKEND), capstone)
ifeq ($(call isTargetCpuArch, x86), true)
CAPSTONE_ARCH := CS_ARCH_X86
CAPSTONE_MODE := CS_MODE_$(OPENJDK_TARGET_CPU_BITS)
else ifeq ($(call isTargetCpuArch, aarch64), true)
CAPSTONE_ARCH := CS_ARCH_ARM64
CAPSTONE_MODE := CS_MODE_ARM
else
$(error No support for Capstone on this platform)
endif
# On windows, we need to "fake" a completely different toolchain using gcc
# instead of the normal microsoft toolchain. This is quite hacky...
HSDIS_CFLAGS += -DCAPSTONE_ARCH=$(CAPSTONE_ARCH) \
-DCAPSTONE_MODE=$(CAPSTONE_MODE)
endif
MINGW_BASE := x86_64-w64-mingw32
ifeq ($(HSDIS_BACKEND), llvm)
# Use C++ instead of C
HSDIS_TOOLCHAIN_CFLAGS := $(CXXFLAGS_JDKLIB)
HSDIS_TOOLCHAIN := TOOLCHAIN_LINK_CXX
ifeq ($(call isTargetOs, linux), true)
LLVM_OS := pc-linux-gnu
else ifeq ($(call isTargetOs, macosx), true)
LLVM_OS := apple-darwin
else ifeq ($(call isTargetOs, windows), true)
LLVM_OS := pc-windows-msvc
else
$(error No support for LLVM on this platform)
endif
HSDIS_CFLAGS += -DLLVM_DEFAULT_TRIPLET='"$(OPENJDK_TARGET_CPU)-$(LLVM_OS)"'
endif
ifeq ($(HSDIS_BACKEND), binutils)
ifeq ($(call isTargetOs, windows), true)
# On windows, we need to "fake" a completely different toolchain using gcc
# instead of the normal microsoft toolchain. This is quite hacky...
MINGW_BASE := x86_64-w64-mingw32
MINGW_SYSROOT = $(shell $(MINGW_BASE)-gcc -print-sysroot)
MINGW_SYSROOT = $(shell $(MINGW_BASE)-gcc -print-sysroot)
ifeq ($(wildcard $(MINGW_SYSROOT)), )
# Use fallback path
MINGW_SYSROOT := /usr/$(MINGW_BASE)
ifeq ($(wildcard $(MINGW_SYSROOT)), )
# Use fallback path
MINGW_SYSROOT := /usr/$(MINGW_BASE)
ifeq ($(wildcard $(MINGW_SYSROOT)), )
$(error mingw sysroot not found)
endif
$(error mingw sysroot not found)
endif
$(eval $(call DefineNativeToolchain, TOOLCHAIN_MINGW, \
CC := $(MINGW_BASE)-gcc, \
LD := $(MINGW_BASE)-ld, \
OBJCOPY := $(MINGW_BASE)-objcopy, \
RC := $(RC), \
SYSROOT_CFLAGS := --sysroot=$(MINGW_SYSROOT), \
SYSROOT_LDFLAGS := --sysroot=$(MINGW_SYSROOT), \
))
MINGW_SYSROOT_LIB_PATH := $(MINGW_SYSROOT)/mingw/lib
ifeq ($(wildcard $(MINGW_SYSROOT_LIB_PATH)), )
# Try without mingw
MINGW_SYSROOT_LIB_PATH := $(MINGW_SYSROOT)/lib
ifeq ($(wildcard $(MINGW_SYSROOT_LIB_PATH)), )
$(error mingw sysroot lib path not found)
endif
endif
MINGW_VERSION = $(shell $(MINGW_BASE)-gcc -v 2>&1 | $(GREP) "gcc version" | $(CUT) -d " " -f 3)
MINGW_GCC_LIB_PATH := /usr/lib/gcc/$(MINGW_BASE)/$(MINGW_VERSION)
ifeq ($(wildcard $(MINGW_GCC_LIB_PATH)), )
# Try using only major version number
MINGW_VERSION_MAJOR := $(firstword $(subst ., , $(MINGW_VERSION)))
MINGW_GCC_LIB_PATH := /usr/lib/gcc/$(MINGW_BASE)/$(MINGW_VERSION_MAJOR)
ifeq ($(wildcard $(MINGW_GCC_LIB_PATH)), )
$(error mingw gcc lib path not found)
endif
endif
TOOLCHAIN_TYPE := gcc
OPENJDK_TARGET_OS := linux
CC_OUT_OPTION := -o$(SPACE)
LD_OUT_OPTION := -o$(SPACE)
GENDEPS_FLAGS := -MMD -MF
CFLAGS_DEBUG_SYMBOLS := -g
DISABLED_WARNINGS :=
DISABLE_WARNING_PREFIX := -Wno-
CFLAGS_WARNINGS_ARE_ERRORS := -Werror
SHARED_LIBRARY_FLAGS := -shared
HSDIS_TOOLCHAIN := TOOLCHAIN_MINGW
HSDIS_TOOLCHAIN_CFLAGS :=
HSDIS_TOOLCHAIN_LDFLAGS := -L$(MINGW_GCC_LIB_PATH) -L$(MINGW_SYSROOT_LIB_PATH)
MINGW_DLLCRT := $(MINGW_SYSROOT_LIB_PATH)/dllcrt2.o
HSDIS_TOOLCHAIN_LIBS := $(MINGW_DLLCRT) -lmingw32 -lgcc -lgcc_eh -lmoldname \
-lmingwex -lmsvcrt -lpthread -ladvapi32 -lshell32 -luser32 -lkernel32
else
HSDIS_TOOLCHAIN_LIBS := -ldl
endif
$(eval $(call DefineNativeToolchain, TOOLCHAIN_MINGW, \
CC := $(MINGW_BASE)-gcc, \
LD := $(MINGW_BASE)-ld, \
OBJCOPY := $(MINGW_BASE)-objcopy, \
RC := $(RC), \
SYSROOT_CFLAGS := --sysroot=$(MINGW_SYSROOT), \
SYSROOT_LDFLAGS := --sysroot=$(MINGW_SYSROOT), \
))
MINGW_SYSROOT_LIB_PATH := $(MINGW_SYSROOT)/mingw/lib
ifeq ($(wildcard $(MINGW_SYSROOT_LIB_PATH)), )
# Try without mingw
MINGW_SYSROOT_LIB_PATH := $(MINGW_SYSROOT)/lib
ifeq ($(wildcard $(MINGW_SYSROOT_LIB_PATH)), )
$(error mingw sysroot lib path not found)
endif
endif
MINGW_VERSION = $(shell $(MINGW_BASE)-gcc -v 2>&1 | $(GREP) "gcc version" | $(CUT) -d " " -f 3)
MINGW_GCC_LIB_PATH := /usr/lib/gcc/$(MINGW_BASE)/$(MINGW_VERSION)
ifeq ($(wildcard $(MINGW_GCC_LIB_PATH)), )
# Try using only major version number
MINGW_VERSION_MAJOR := $(firstword $(subst ., , $(MINGW_VERSION)))
MINGW_GCC_LIB_PATH := /usr/lib/gcc/$(MINGW_BASE)/$(MINGW_VERSION_MAJOR)
ifeq ($(wildcard $(MINGW_GCC_LIB_PATH)), )
$(error mingw gcc lib path not found)
endif
endif
TOOLCHAIN_TYPE := gcc
OPENJDK_TARGET_OS := linux
CC_OUT_OPTION := -o$(SPACE)
LD_OUT_OPTION := -o$(SPACE)
GENDEPS_FLAGS := -MMD -MF
CFLAGS_DEBUG_SYMBOLS := -g
DISABLED_WARNINGS :=
DISABLE_WARNING_PREFIX := -Wno-
CFLAGS_WARNINGS_ARE_ERRORS := -Werror
SHARED_LIBRARY_FLAGS := -shared
HSDIS_TOOLCHAIN := TOOLCHAIN_MINGW
HSDIS_TOOLCHAIN_CFLAGS :=
HSDIS_TOOLCHAIN_LDFLAGS := -L$(MINGW_GCC_LIB_PATH) -L$(MINGW_SYSROOT_LIB_PATH)
MINGW_DLLCRT := $(MINGW_SYSROOT_LIB_PATH)/dllcrt2.o
HSDIS_TOOLCHAIN_LIBS := $(MINGW_DLLCRT) -lmingw32 -lgcc -lgcc_eh -lmoldname \
-lmingwex -lmsvcrt -lpthread -ladvapi32 -lshell32 -luser32 -lkernel32
else
INSTALLED_HSDIS_DIR := $(JDK_OUTPUTDIR)/lib
HSDIS_TOOLCHAIN := TOOLCHAIN_DEFAULT
HSDIS_TOOLCHAIN_CFLAGS := $(CFLAGS_JDKLIB)
HSDIS_TOOLCHAIN_LDFLAGS := $(LDFLAGS_JDKLIB)
HSDIS_TOOLCHAIN_LIBS := -ldl
endif
$(eval $(call SetupJdkLibrary, BUILD_HSDIS, \
NAME := hsdis, \
SRC := $(TOPDIR)/src/utils/hsdis/$(HSDIS_BACKEND), \
EXTRA_HEADER_DIRS := $(TOPDIR)/src/utils/hsdis, \
SRC := $(TOPDIR)/src/utils/hsdis, \
TOOLCHAIN := $(HSDIS_TOOLCHAIN), \
OUTPUT_DIR := $(HSDIS_OUTPUT_DIR), \
OBJECT_DIR := $(HSDIS_OUTPUT_DIR), \
DISABLED_WARNINGS_gcc := undef format-nonliteral sign-compare, \
DISABLED_WARNINGS_clang := undef format-nonliteral, \
CFLAGS := $(HSDIS_TOOLCHAIN_CFLAGS) $(HSDIS_CFLAGS), \
LDFLAGS := $(HSDIS_TOOLCHAIN_LDFLAGS) $(HSDIS_LDFLAGS) $(SHARED_LIBRARY_FLAGS), \
LDFLAGS := $(HSDIS_TOOLCHAIN_LDFLAGS) $(SHARED_LIBRARY_FLAGS), \
LIBS := $(HSDIS_LIBS) $(HSDIS_TOOLCHAIN_LIBS), \
))
$(BUILT_HSDIS_LIB): $(BUILD_HSDIS_TARGET)
$(install-file)
build: $(BUILD_HSDIS) $(BUILT_HSDIS_LIB)
build: $(BUILD_HSDIS)
TARGETS += build
ifeq ($(ENABLE_HSDIS_BUNDLING), false)
INSTALLED_HSDIS_NAME := hsdis-$(OPENJDK_TARGET_CPU_LEGACY_LIB)$(SHARED_LIBRARY_SUFFIX)
ifeq ($(call isTargetOs, windows), true)
JDK_HSDIS_DIR := $(JDK_OUTPUTDIR)/bin
IMAGE_HSDIS_DIR := $(JDK_IMAGE_DIR)/bin
else
JDK_HSDIS_DIR := $(JDK_OUTPUTDIR)/lib
IMAGE_HSDIS_DIR := $(JDK_IMAGE_DIR)/lib
endif
INSTALLED_HSDIS := $(INSTALLED_HSDIS_DIR)/$(INSTALLED_HSDIS_NAME)
INSTALLED_HSDIS_JDK := $(JDK_HSDIS_DIR)/$(REAL_HSDIS_NAME)
INSTALLED_HSDIS_IMAGE := $(IMAGE_HSDIS_DIR)/$(REAL_HSDIS_NAME)
$(INSTALLED_HSDIS_JDK): $(BUILT_HSDIS_LIB)
ifeq ($(HSDIS_BACKEND), binutils)
$(call LogWarn, NOTE: The resulting build might not be redistributable. Seek legal advice before distributing.)
endif
$(INSTALLED_HSDIS): $(BUILD_HSDIS_TARGET)
$(call LogWarn, NOTE: The resulting build might not be redistributable. Seek legal advice before distibuting.)
$(install-file)
$(INSTALLED_HSDIS_IMAGE): $(BUILT_HSDIS_LIB)
$(install-file)
install: $(INSTALLED_HSDIS_JDK) $(INSTALLED_HSDIS_IMAGE)
else
install:
$(ECHO) NOTE: make install-hsdis is a no-op with --enable-hsdis-bundling
endif
install: $(INSTALLED_HSDIS)
TARGETS += install
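
For reference, the MinGW toolchain setup above resolves the gcc support library directory by probing candidate paths with $(wildcard) and falling back to a coarser guess before giving up; the same pattern is used for the MinGW sysroot lib path. A minimal, generic sketch of that probe-and-fallback idiom, with purely illustrative paths and variable names:

# Illustrative only: locate a versioned tool directory, fall back to the
# major version, and stop with a clear error if neither candidate exists.
TOOL_VERSION       := 10.3.0
TOOL_VERSION_MAJOR := $(firstword $(subst ., , $(TOOL_VERSION)))

TOOL_LIB_PATH := /usr/lib/mytool/$(TOOL_VERSION)
ifeq ($(wildcard $(TOOL_LIB_PATH)), )
  TOOL_LIB_PATH := /usr/lib/mytool/$(TOOL_VERSION_MAJOR)
  ifeq ($(wildcard $(TOOL_LIB_PATH)), )
    $(error mytool lib path not found)
  endif
endif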


@@ -1,5 +1,5 @@
#
# Copyright (c) 2012, 2022, Oracle and/or its affiliates. All rights reserved.
# Copyright (c) 2012, 2020, Oracle and/or its affiliates. All rights reserved.
# DO NOT ALTER OR REMOVE COPYRIGHT NOTICES OR THIS FILE HEADER.
#
# This code is free software; you can redistribute it and/or modify it
@@ -352,7 +352,7 @@ else # HAS_SPEC=true
$(call PrintFailureReports)
$(call PrintBuildLogFailures)
$(call ReportProfileTimes)
$(PRINTF) "HELP: Run 'make doctor' to diagnose build problems.\n\n"
$(PRINTF) "Hint: See doc/building.html#troubleshooting for assistance.\n\n"
ifneq ($(COMPARE_BUILD), )
$(call CleanupCompareBuild)
endif


@@ -1,5 +1,5 @@
#
# Copyright (c) 2011, 2022, Oracle and/or its affiliates. All rights reserved.
# Copyright (c) 2011, 2021, Oracle and/or its affiliates. All rights reserved.
# DO NOT ALTER OR REMOVE COPYRIGHT NOTICES OR THIS FILE HEADER.
#
# This code is free software; you can redistribute it and/or modify it
@@ -310,9 +310,17 @@ else # $(HAS_SPEC)=true
# level of reproducible builds
define SetupReproducibleBuild
ifeq ($$(SOURCE_DATE), updated)
# For static values of SOURCE_DATE (not "updated"), these are set in spec.gmk
export SOURCE_DATE_EPOCH := $$(shell $$(DATE) +"%s")
export SOURCE_DATE_ISO_8601 := $$(call EpochToISO8601, $$(SOURCE_DATE_EPOCH))
SOURCE_DATE := $$(shell $$(DATE) +"%s")
endif
export SOURCE_DATE_EPOCH := $$(SOURCE_DATE)
ifeq ($$(IS_GNU_DATE), yes)
export SOURCE_DATE_ISO_8601 := $$(shell $$(DATE) --utc \
--date="@$$(SOURCE_DATE_EPOCH)" \
+"%Y-%m-%dT%H:%M:%SZ" 2> /dev/null)
else
export SOURCE_DATE_ISO_8601 := $$(shell $$(DATE) -u \
-j -f "%s" "$$(SOURCE_DATE_EPOCH)" \
+"%Y-%m-%dT%H:%M:%SZ" 2> /dev/null)
endif
endef
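
The older block shown above derives SOURCE_DATE_ISO_8601 from SOURCE_DATE_EPOCH using either GNU or BSD date syntax. A self-contained sketch of that conversion, assuming IS_GNU_DATE is set elsewhere (as in spec.gmk):

# Convert a Unix epoch (in seconds) into an ISO-8601 UTC timestamp.
SOURCE_DATE_EPOCH ?= $(shell date +"%s")

ifeq ($(IS_GNU_DATE), yes)
  # GNU date (Linux): --date accepts an "@<epoch>" argument.
  SOURCE_DATE_ISO_8601 := $(shell date --utc --date="@$(SOURCE_DATE_EPOCH)" \
      +"%Y-%m-%dT%H:%M:%SZ")
else
  # BSD date (macOS): -j -f "%s" parses the epoch without touching the clock.
  SOURCE_DATE_ISO_8601 := $(shell date -u -j -f "%s" "$(SOURCE_DATE_EPOCH)" \
      +"%Y-%m-%dT%H:%M:%SZ")
endif
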
@@ -461,10 +469,10 @@ else # $(HAS_SPEC)=true
$(PRINTF) "\n=== Make failed targets repeated here ===\n" ; \
$(GREP) "recipe for target .* failed" $(BUILD_LOG) ; \
$(PRINTF) "=== End of repeated output ===\n" ; \
$(PRINTF) "\nHELP: Try searching the build log for the name of the first failed target.\n" ; \
$(PRINTF) "\nHint: Try searching the build log for the name of the first failed target.\n" ; \
else \
$(PRINTF) "\nNo indication of failed target found.\n" ; \
$(PRINTF) "HELP: Try searching the build log for '] Error'.\n" ; \
$(PRINTF) "Hint: Try searching the build log for '] Error'.\n" ; \
fi \
)
endef


@@ -1,84 +0,0 @@
#
# Copyright 2000-2021 JetBrains s.r.o.
#
# Licensed under the Apache License, Version 2.0 (the "License");
# you may not use this file except in compliance with the License.
# You may obtain a copy of the License at
#
# http://www.apache.org/licenses/LICENSE-2.0
#
# Unless required by applicable law or agreed to in writing, software
# distributed under the License is distributed on an "AS IS" BASIS,
# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
# See the License for the specific language governing permissions and
# limitations under the License.
#
include $(SPEC)
include MakeBase.gmk
include JavaCompilation.gmk
JBR_API_ROOT_DIR := $(TOPDIR)/src/jetbrains.api
JBR_API_TOOLS_DIR := $(JBR_API_ROOT_DIR)/tools
JBR_API_SRC_DIR := $(JBR_API_ROOT_DIR)/src
JBR_API_OUTPUT_DIR := $(OUTPUTDIR)/jbr-api
JBR_API_GENSRC_DIR := $(JBR_API_OUTPUT_DIR)/gensrc
JBR_API_BIN_DIR := $(JBR_API_OUTPUT_DIR)/bin
JBR_API_VERSION_PROPERTIES := $(JBR_API_ROOT_DIR)/version.properties
JBR_API_VERSION_GENSRC := $(JBR_API_OUTPUT_DIR)/jbr-api.version
JBR_API_GENSRC_BATCH := $(JBR_API_VERSION_GENSRC)
JBR_API_SRC_FILES := $(call FindFiles, $(JBR_API_SRC_DIR))
JBR_API_GENSRC_FILES := $(foreach f, $(call FindFiles, $(JBR_API_SRC_DIR)), \
$(JBR_API_GENSRC_DIR)/$(call RelativePath, $f, $(JBR_API_SRC_DIR)))
ifeq ($(JBR_API_JBR_VERSION),)
JBR_API_JBR_VERSION := <DEVELOPMENT>
JBR_API_FAIL_ON_HASH_MISMATCH := false
else
.PHONY: $(JBR_API_VERSION_PROPERTIES)
JBR_API_FAIL_ON_HASH_MISMATCH := true
endif
ARCHIVE_BUILD_JBR_API_BIN := $(JBR_API_BIN_DIR)
$(eval $(call SetupJavaCompilation, BUILD_JBR_API, \
SMALL_JAVA := true, \
COMPILER := bootjdk, \
SRC := $(JBR_API_GENSRC_DIR), \
EXTRA_FILES := $(JBR_API_GENSRC_FILES), \
BIN := $(JBR_API_BIN_DIR), \
JAR := $(JBR_API_OUTPUT_DIR)/jbr-api.jar, \
))
$(eval $(call SetupJarArchive, BUILD_JBR_API_SOURCES_JAR, \
DEPENDENCIES := $(JBR_API_GENSRC_FILES), \
SRCS := $(JBR_API_GENSRC_DIR), \
JAR := $(JBR_API_OUTPUT_DIR)/jbr-api-sources.jar, \
SUFFIXES := .java, \
BIN := $(JBR_API_BIN_DIR), \
))
# Grouped targets may not be supported, so hack dependencies: sources -> version file -> generated sources
$(JBR_API_VERSION_GENSRC): $(JBR_API_SRC_FILES) $(JBR_API_VERSION_PROPERTIES) $(JBR_API_TOOLS_DIR)/Gensrc.java
$(ECHO) Generating sources for JBR API
$(JAVA_CMD) $(JAVA_FLAGS_SMALL) "$(JBR_API_TOOLS_DIR)/Gensrc.java" \
"$(TOPDIR)/src" "$(JBR_API_OUTPUT_DIR)" "$(JBR_API_JBR_VERSION)"
$(JBR_API_GENSRC_FILES): $(JBR_API_VERSION_GENSRC)
$(TOUCH) $@
jbr-api-check-version: $(JBR_API_GENSRC_FILES) $(JBR_API_VERSION_PROPERTIES)
$(JAVA_CMD) $(JAVA_FLAGS_SMALL) "$(JBR_API_TOOLS_DIR)/CheckVersion.java" \
"$(JBR_API_ROOT_DIR)" "$(JBR_API_GENSRC_DIR)" "$(JBR_API_FAIL_ON_HASH_MISMATCH)"
jbr-api: $(BUILD_JBR_API) $(BUILD_JBR_API_SOURCES_JAR) jbr-api-check-version
.PHONY: jbr-api jbr-api-check-version
ifneq ($(JBR_API_CONF_FILE),)
$(JBR_API_CONF_FILE): $(JBR_API_GENSRC_FILES)
$(ECHO) "VERSION=`$(CAT) $(JBR_API_VERSION_GENSRC)`" > $(JBR_API_CONF_FILE)
$(ECHO) "JAR=$(JBR_API_OUTPUT_DIR)/jbr-api.jar" >> $(JBR_API_CONF_FILE)
$(ECHO) "SOURCES_JAR=$(JBR_API_OUTPUT_DIR)/jbr-api-sources.jar" >> $(JBR_API_CONF_FILE)
jbr-api: $(JBR_API_CONF_FILE)
.PHONY: $(JBR_API_CONF_FILE)
endif
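
The JBR API makefile shown above notes that grouped targets may not be supported and instead routes all generated sources through a single version file. A hedged sketch contrasting the two approaches, with illustrative file names (GNU Make 4.3 and newer provide the &: grouped-target separator; recipe lines are tab-indented; use one approach or the other, not both):

# GNU Make 4.3+: one recipe is declared to produce several files at once.
gen/Service.java gen/Provider.java gen/jbr-api.version &: tools/Gensrc.java
	java tools/Gensrc.java src gen

# Portable workaround, as in the makefile above: one file owns the recipe,
# and the remaining outputs are chained behind it with a cheap touch.
gen/jbr-api.version: tools/Gensrc.java
	java tools/Gensrc.java src gen

gen/Service.java gen/Provider.java: gen/jbr-api.version
	touch $@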


@@ -1,5 +1,5 @@
#
# Copyright (c) 2011, 2022, Oracle and/or its affiliates. All rights reserved.
# Copyright (c) 2011, 2021, Oracle and/or its affiliates. All rights reserved.
# DO NOT ALTER OR REMOVE COPYRIGHT NOTICES OR THIS FILE HEADER.
#
# This code is free software; you can redistribute it and/or modify it
@@ -260,13 +260,6 @@ $(eval $(call SetupTarget, hotspot-ide-project, \
ALL_TARGETS += $(HOTSPOT_VARIANT_TARGETS) $(HOTSPOT_VARIANT_GENSRC_TARGETS) \
$(HOTSPOT_VARIANT_LIBS_TARGETS)
################################################################################
# Help and user support
$(eval $(call SetupTarget, doctor, \
MAKEFILE := Doctor, \
))
################################################################################
# Generate libs and launcher targets for creating compile_commands.json fragments
define DeclareCompileCommandsRecipe
@@ -542,7 +535,6 @@ ifneq ($(HSDIS_BACKEND), none)
$(eval $(call SetupTarget, install-hsdis, \
MAKEFILE := Hsdis, \
TARGET := install, \
DEPS := jdk-image, \
))
endif
@@ -869,10 +861,6 @@ else
$(foreach t, $(filter-out java.base-libs, $(LIBS_TARGETS)), \
$(eval $t: java.base-libs))
ifeq ($(ENABLE_HSDIS_BUNDLING), true)
java.base-copy: build-hsdis
endif
# jdk.accessibility depends on java.desktop
jdk.accessibility-libs: java.desktop-libs
@@ -1349,14 +1337,6 @@ create-main-targets-include:
@$(ECHO) ALL_MAIN_TARGETS := $(sort $(ALL_TARGETS)) > \
$(MAKESUPPORT_OUTPUTDIR)/main-targets.gmk
################################################################################
# JBR API
$(eval $(call SetupTarget, jbr-api, \
MAKEFILE := JBRApi, \
TARGET := jbr-api \
))
################################################################################
# Hook to include the corresponding custom file, if present.
$(eval $(call IncludeCustomExtension, Main-post.gmk))


@@ -1,5 +1,5 @@
#
# Copyright (c) 2014, 2022, Oracle and/or its affiliates. All rights reserved.
# Copyright (c) 2014, 2020, Oracle and/or its affiliates. All rights reserved.
# DO NOT ALTER OR REMOVE COPYRIGHT NOTICES OR THIS FILE HEADER.
#
# This code is free software; you can redistribute it and/or modify it
@@ -35,8 +35,6 @@ default: all
include $(SPEC)
include MakeBase.gmk
MODULE_SRC := $(TOPDIR)/src/$(MODULE)
# All makefiles should add the targets to be built to this variable.
TARGETS :=


@@ -139,7 +139,7 @@ endif
################################################################################
# Each factor variable comes in 3 variants. The first one is reserved for users
# to use on command line. The other two are for predefined configurations in JDL
# to use on command line. The other two are for predifined configurations in JDL
# and for machine specific configurations respectively.
TEST_JOBS_FACTOR ?= 1
TEST_JOBS_FACTOR_JDL ?= 1
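
The three factor variants above (command line, predefined configuration, machine-specific) are intended to be multiplied together when sizing the parallel test job count. A simplified, hypothetical sketch of such a combination; the real computation in the test makefile also weighs core count and memory and is not reproduced here, and NUM_CORES and TEST_JOBS_FACTOR_MACHINE are assumed names:

TEST_JOBS_FACTOR         ?= 1
TEST_JOBS_FACTOR_JDL     ?= 1
TEST_JOBS_FACTOR_MACHINE ?= 1

# Scale the core count by all three factors, round to a whole number,
# and never go below one job.
TEST_JOBS := $(shell awk 'BEGIN { j = $(NUM_CORES) * $(TEST_JOBS_FACTOR) \
    * $(TEST_JOBS_FACTOR_JDL) * $(TEST_JOBS_FACTOR_MACHINE); \
    if (j < 1) j = 1; printf "%.0f", j }')
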
@@ -524,7 +524,7 @@ define SetupRunGtestTestBody
$$(subst $$(TOPDIR)/, , $$($1_TEST_RESULTS_DIR))))
$$(if $$(wildcard $$($1_RESULT_FILE)), \
$$(eval $1_TOTAL := $$(shell $$(AWK) '/==========.* tests? from .* \
test (cases?|suites?) ran/ { print $$$$2 }' $$($1_RESULT_FILE))) \
test cases? ran/ { print $$$$2 }' $$($1_RESULT_FILE))) \
$$(if $$($1_TOTAL), , $$(eval $1_TOTAL := 0)) \
$$(eval $1_PASSED := $$(shell $$(AWK) '/\[ PASSED \] .* tests?./ \
{ print $$$$4 }' $$($1_RESULT_FILE))) \
@@ -782,6 +782,8 @@ define SetupRunJtregTestBody
# Make it possible to specify the JIB_DATA_DIR for tests using the
# JIB Artifact resolver
$1_JTREG_BASIC_OPTIONS += -e:JIB_DATA_DIR
# Some tests needs to find a boot JDK using the JDK8_HOME variable.
$1_JTREG_BASIC_OPTIONS += -e:JDK8_HOME=$$(BOOT_JDK)
# If running on Windows, propagate the _NT_SYMBOL_PATH to enable
# symbol lookup in hserr files
ifeq ($$(call isTargetOs, windows), true)
@@ -797,15 +799,6 @@ define SetupRunJtregTestBody
$1_JTREG_BASIC_OPTIONS += -ea -esa
endif
ifeq ($$(ASAN_ENABLED), yes)
$1_JTREG_BASIC_OPTIONS += -e:ASAN_OPTIONS=handle_segv=0:handle_sigfpe=0:detect_leaks=false
$1_JTREG_BASIC_OPTIONS += -e:LD_PRELOAD=libasan.so.5
endif
ifeq ($$(USAN_ENABLED), yes)
$1_JTREG_BASIC_OPTIONS += -e:LD_PRELOAD=libubsan.so.1
endif
ifneq ($$($1_JTREG_NATIVEPATH), )
$1_JTREG_BASIC_OPTIONS += -nativepath:$$($1_JTREG_NATIVEPATH)
endif


@@ -1,5 +1,5 @@
#
# Copyright (c) 2016, 2022, Oracle and/or its affiliates. All rights reserved.
# Copyright (c) 2016, 2018, Oracle and/or its affiliates. All rights reserved.
# DO NOT ALTER OR REMOVE COPYRIGHT NOTICES OR THIS FILE HEADER.
#
# This code is free software; you can redistribute it and/or modify it
@@ -50,8 +50,8 @@ else ifneq ($(and $(GIT), $(wildcard $(TOPDIR)/.git)), )
USE_SCM := true
SCM_DIR := .git
ID_COMMAND := $(PRINTF) "git:%s%s\n" \
"$$($(GIT) log -n1 --format=%H | cut -c1-12)" \
"$$(if test -n "$$($(GIT) status --porcelain)"; then printf '+'; fi)"
"$$(git log -n1 --format=%H | cut -c1-12)" \
"$$(if test -n "$$(git status --porcelain)"; then printf '+'; fi)"
endif
ifeq ($(USE_SCM), true)
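
The hunk above differs only in whether the configured $(GIT) or a bare git is invoked; the string it builds is a short revision id of the form git:<12-char-hash>, with a trailing + when the working tree has uncommitted changes. A standalone sketch of the same idea, assuming the build runs from a git checkout:

GIT ?= git

REVISION_HASH  := $(shell $(GIT) log -n1 --format=%H | cut -c1-12)
REVISION_DIRTY := $(shell test -n "$$($(GIT) status --porcelain)" && printf '+')
REVISION_ID    := git:$(REVISION_HASH)$(REVISION_DIRTY)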


@@ -1,5 +1,5 @@
#
# Copyright (c) 2011, 2022, Oracle and/or its affiliates. All rights reserved.
# Copyright (c) 2011, 2021, Oracle and/or its affiliates. All rights reserved.
# DO NOT ALTER OR REMOVE COPYRIGHT NOTICES OR THIS FILE HEADER.
#
# This code is free software; you can redistribute it and/or modify it
@@ -44,13 +44,16 @@ TOOL_COMPILEFONTCONFIG = $(JAVA_SMALL) -cp $(BUILDTOOLS_OUTPUTDIR)/jdk_tools_cla
--add-exports java.desktop/sun.awt=ALL-UNNAMED \
build.tools.compilefontconfig.CompileFontConfig
TOOL_COMPILEPROPERTIES = $(JAVA_SMALL) -cp $(BUILDTOOLS_OUTPUTDIR)/jdk_tools_classes \
build.tools.compileproperties.CompileProperties
TOOL_GENERATECHARACTER = $(JAVA_SMALL) -cp $(BUILDTOOLS_OUTPUTDIR)/jdk_tools_classes \
build.tools.generatecharacter.GenerateCharacter
TOOL_CHARACTERNAME = $(JAVA_SMALL) -cp $(BUILDTOOLS_OUTPUTDIR)/jdk_tools_classes \
build.tools.generatecharacter.CharacterName
TOOL_DTDBUILDER = $(JAVA_SMALL) -Ddtd_home=$(TOPDIR)/src/java.desktop/share/data/dtdbuilder \
TOOL_DTDBUILDER = $(JAVA_SMALL) -Ddtd_home=$(TOPDIR)/make/data/dtdbuilder \
-Djava.awt.headless=true \
-cp $(BUILDTOOLS_OUTPUTDIR)/jdk_tools_classes build.tools.dtdbuilder.DTDBuilder


@@ -1,5 +1,5 @@
#
# Copyright (c) 2012, 2022, Oracle and/or its affiliates. All rights reserved.
# Copyright (c) 2012, 2019, Oracle and/or its affiliates. All rights reserved.
# DO NOT ALTER OR REMOVE COPYRIGHT NOTICES OR THIS FILE HEADER.
#
# This code is free software; you can redistribute it and/or modify it
@@ -50,7 +50,7 @@ endif
X11WRAPPERS_OUTPUT := $(SUPPORT_OUTPUTDIR)/x11wrappers
GENERATOR_SOURCE_FILE := $(X11WRAPPERS_OUTPUT)/src/data_generator.c
GENSRC_X11WRAPPERS_DATADIR := $(TOPDIR)/src/java.desktop/unix/data/x11wrappergen
GENSRC_X11WRAPPERS_DATADIR := $(TOPDIR)/make/data/x11wrappergen
WRAPPER_OUTPUT_FILE := $(GENSRC_X11WRAPPERS_DATADIR)/sizes-$(BITS).txt
BITS := $(OPENJDK_TARGET_CPU_BITS)


@@ -172,7 +172,7 @@ AC_DEFUN_ONCE([BASIC_SETUP_DEVKIT],
UTIL_PREPEND_TO_PATH([TOOLCHAIN_PATH],$DEVKIT_TOOLCHAIN_PATH)
# If DEVKIT_SYSROOT is set, use that, otherwise try a couple of known
# places for backwards compatibility.
# places for backwards compatiblity.
if test "x$DEVKIT_SYSROOT" != x; then
SYSROOT="$DEVKIT_SYSROOT"
elif test -d "$DEVKIT_ROOT/$host_alias/libc"; then
@@ -193,7 +193,7 @@ AC_DEFUN_ONCE([BASIC_SETUP_DEVKIT],
# You can force the sysroot if the sysroot encoded into the compiler tools
# is not correct.
AC_ARG_WITH(sys-root, [AS_HELP_STRING([--with-sys-root],
[alias for --with-sysroot for backwards compatibility])],
[alias for --with-sysroot for backwards compatability])],
[SYSROOT=$with_sys_root]
)
@@ -496,7 +496,7 @@ AC_DEFUN_ONCE([BASIC_SETUP_DEFAULT_MAKE_TARGET],
AC_DEFUN_ONCE([BASIC_SETUP_DEFAULT_LOG],
[
AC_ARG_WITH(log, [AS_HELP_STRING([--with-log],
[[default value for make LOG argument [warn]]])])
[[default vaue for make LOG argument [warn]]])])
AC_MSG_CHECKING([for default LOG value])
if test "x$with_log" = x; then
DEFAULT_LOG=""


@@ -80,7 +80,6 @@ AC_DEFUN_ONCE([BASIC_SETUP_FUNDAMENTAL_TOOLS],
# Optional tools, we can do without them
UTIL_LOOKUP_PROGS(DF, df)
UTIL_LOOKUP_PROGS(GIT, git)
UTIL_LOOKUP_PROGS(NICE, nice)
UTIL_LOOKUP_PROGS(READLINK, greadlink readlink)
@@ -161,23 +160,25 @@ AC_DEFUN([BASIC_CHECK_MAKE_VERSION],
AC_DEFUN([BASIC_CHECK_MAKE_OUTPUT_SYNC],
[
# Check if make supports the output sync option and if so, setup using it.
UTIL_ARG_WITH(NAME: output-sync, TYPE: literal,
VALID_VALUES: [none recurse line target], DEFAULT: none,
OPTIONAL: true, ENABLED_DEFAULT: true,
ENABLED_RESULT: OUTPUT_SYNC_SUPPORTED,
CHECKING_MSG: [for make --output-sync value],
DESC: [set make --output-sync type if supported by make],
CHECK_AVAILABLE:
[
AC_MSG_CHECKING([if make --output-sync is supported])
if ! $MAKE --version -O > /dev/null 2>&1; then
AC_MSG_RESULT([no])
AVAILABLE=false
else
AC_MSG_RESULT([yes])
fi
]
)
AC_MSG_CHECKING([if make --output-sync is supported])
if $MAKE --version -O > /dev/null 2>&1; then
OUTPUT_SYNC_SUPPORTED=true
AC_MSG_RESULT([yes])
AC_MSG_CHECKING([for output-sync value])
AC_ARG_WITH([output-sync], [AS_HELP_STRING([--with-output-sync],
[set make output sync type if supported by make. @<:@recurse@:>@])],
[OUTPUT_SYNC=$with_output_sync])
if test "x$OUTPUT_SYNC" = "x"; then
OUTPUT_SYNC=none
fi
AC_MSG_RESULT([$OUTPUT_SYNC])
if ! $MAKE --version -O$OUTPUT_SYNC > /dev/null 2>&1; then
AC_MSG_ERROR([Make did not the support the value $OUTPUT_SYNC as output sync type.])
fi
else
OUTPUT_SYNC_SUPPORTED=false
AC_MSG_RESULT([no])
fi
AC_SUBST(OUTPUT_SYNC_SUPPORTED)
AC_SUBST(OUTPUT_SYNC)
])
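
Both variants of the macro above detect --output-sync support by checking whether "$MAKE --version -O" succeeds, which requires GNU Make 4.0 or newer. A minimal sketch of the same probe written as makefile logic rather than configure shell (variable names are illustrative):

# GNU Make 4.0+ accepts -O/--output-sync; older versions reject the flag.
OUTPUT_SYNC_SUPPORTED := $(shell $(MAKE) --version -O > /dev/null 2>&1 \
    && echo true || echo false)

ifeq ($(OUTPUT_SYNC_SUPPORTED), true)
  # Valid values are none, recurse, line and target; "line" groups the
  # output of each recipe line, which keeps parallel build logs readable.
  OUTPUT_SYNC ?= none
  OUTPUT_SYNC_FLAG := -O$(OUTPUT_SYNC)
endif
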
@@ -269,8 +270,6 @@ AC_DEFUN([BASIC_CHECK_TAR],
TAR_TYPE="bsd"
elif test "x$($TAR -v | $GREP "bsdtar")" != "x"; then
TAR_TYPE="bsd"
elif test "x$($TAR --version | $GREP "busybox")" != "x"; then
TAR_TYPE="busybox"
elif test "x$OPENJDK_BUILD_OS" = "xaix"; then
TAR_TYPE="aix"
fi
@@ -282,12 +281,9 @@ AC_DEFUN([BASIC_CHECK_TAR],
TAR_SUPPORTS_TRANSFORM="true"
elif test "x$TAR_TYPE" = "aix"; then
# -L InputList of aix tar: name of file listing the files and directories
# that need to be archived or extracted
# that need to be archived or extracted
TAR_INCLUDE_PARAM="L"
TAR_SUPPORTS_TRANSFORM="false"
elif test "x$TAR_TYPE" = "xbusybox"; then
TAR_INCLUDE_PARAM="T"
TAR_SUPPORTS_TRANSFORM="false"
else
TAR_INCLUDE_PARAM="I"
TAR_SUPPORTS_TRANSFORM="false"
@@ -343,6 +339,7 @@ AC_DEFUN_ONCE([BASIC_SETUP_COMPLEX_TOOLS],
UTIL_LOOKUP_PROGS(READELF, greadelf readelf)
UTIL_LOOKUP_PROGS(DOT, dot)
UTIL_LOOKUP_PROGS(HG, hg)
UTIL_LOOKUP_PROGS(GIT, git)
UTIL_LOOKUP_PROGS(STAT, stat)
UTIL_LOOKUP_PROGS(TIME, time)
UTIL_LOOKUP_PROGS(FLOCK, flock)
@@ -351,7 +348,7 @@ AC_DEFUN_ONCE([BASIC_SETUP_COMPLEX_TOOLS],
UTIL_LOOKUP_PROGS(DTRACE, dtrace, $PATH:/usr/sbin)
UTIL_LOOKUP_PROGS(PATCH, gpatch patch)
# Check if it's GNU time
[ IS_GNU_TIME=`$TIME --version 2>&1 | $GREP 'GNU [Tt]ime'` ]
IS_GNU_TIME=`$TIME --version 2>&1 | $GREP 'GNU time'`
if test "x$IS_GNU_TIME" != x; then
IS_GNU_TIME=yes
else
@@ -377,15 +374,17 @@ AC_DEFUN_ONCE([BASIC_SETUP_COMPLEX_TOOLS],
UTIL_REQUIRE_PROGS(XATTR, xattr)
UTIL_LOOKUP_PROGS(CODESIGN, codesign)
# Check for user provided code signing identity.
UTIL_ARG_WITH(NAME: macosx-codesign-identity, TYPE: string,
DEFAULT: openjdk_codesign, CHECK_VALUE: UTIL_CHECK_STRING_NON_EMPTY,
DESC: [specify the macosx code signing identity],
CHECKING_MSG: [for macosx code signing identity]
)
AC_SUBST(MACOSX_CODESIGN_IDENTITY)
if test "x$CODESIGN" != "x"; then
# Check for user provided code signing identity.
# If no identity was provided, fall back to "openjdk_codesign".
AC_ARG_WITH([macosx-codesign-identity], [AS_HELP_STRING([--with-macosx-codesign-identity],
[specify the code signing identity])],
[MACOSX_CODESIGN_IDENTITY=$with_macosx_codesign_identity],
[MACOSX_CODESIGN_IDENTITY=openjdk_codesign]
)
AC_SUBST(MACOSX_CODESIGN_IDENTITY)
# Verify that the codesign certificate is present
AC_MSG_CHECKING([if codesign certificate is present])
$RM codesign-testfile


@@ -1,5 +1,5 @@
#
# Copyright (c) 2011, 2022, Oracle and/or its affiliates. All rights reserved.
# Copyright (c) 2011, 2021, Oracle and/or its affiliates. All rights reserved.
# DO NOT ALTER OR REMOVE COPYRIGHT NOTICES OR THIS FILE HEADER.
#
# This code is free software; you can redistribute it and/or modify it
@@ -185,16 +185,6 @@ AC_DEFUN([BASIC_SETUP_PATHS_WINDOWS],
AC_MSG_RESULT([unknown])
AC_MSG_WARN([It seems that your find utility is non-standard.])
fi
if test "x$GIT" != x && test -e $TOPDIR/.git; then
git_autocrlf=`$GIT config core.autocrlf`
if test "x$git_autocrlf" != x && test "x$git_autocrlf" != "xfalse"; then
AC_MSG_NOTICE([Your git configuration does not set core.autocrlf to false.])
AC_MSG_NOTICE([If you checked out this code using that setting, the build WILL fail.])
AC_MSG_NOTICE([To correct, run "git config --global core.autocrlf false" and re-clone the repo.])
AC_MSG_WARN([Code is potentially incorrectly cloned. HIGH RISK of build failure!])
fi
fi
])
# Verify that the directory is usable on Windows
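
The git core.autocrlf check shown above warns Windows users whose checkout translated line endings, since anything other than false is likely to break the build. A hedged sketch of the equivalent check as makefile logic (the original is configure shell; names here are illustrative):

GIT_AUTOCRLF := $(shell $(GIT) config core.autocrlf 2> /dev/null)

# Warn only when the setting exists and is not "false", mirroring the check above.
ifneq ($(filter-out false, $(GIT_AUTOCRLF)), )
  $(warning core.autocrlf is '$(GIT_AUTOCRLF)'; the checkout may contain CRLF line endings and the build is likely to fail)
endif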

Some files were not shown because too many files have changed in this diff.