Compare commits

..

62 Commits

Author SHA1 Message Date
Nikita Gubarkov
fd009fe88e Enumerate and filter physical devices, create a logical device. Corrections. 2023-04-05 20:32:21 +03:00
Nikita Gubarkov
738f762d3e Enumerate and filter physical devices, create a logical device. 2023-03-31 19:55:13 +03:00
Nikita Gubarkov
ef9172f50e Remove bundled Vulkan headers, use stubs when Vulkan is disabled. Corrections. 2023-03-31 12:17:36 +03:00
Nikita Gubarkov
ed8a7ad702 Remove bundled Vulkan headers, use stubs when Vulkan is disabled. 2023-03-30 21:53:47 +03:00
Alexey Ushakov
6238545ead Vulkan sdk support in configure. Corrections 2023-03-17 21:30:47 +01:00
Alexey Ushakov
9bda49c60c Vulkan sdk support in configure 2023-03-16 21:13:22 +01:00
Nikita Gubarkov
a8cdbbfb8d Added debug messenger. 2023-03-16 22:10:10 +02:00
Nikita Gubarkov
2d7c8037fa Decoupled Vulkan from Wayland.
Extracted protocol-independent logic into "share".
2023-03-16 20:57:18 +02:00
Nikita Gubarkov
b51d9559b0 Forgot to stage vk_video headers. 2023-03-16 18:10:04 +02:00
Nikita Gubarkov
f410ae50af Imported Vulkan headers, started using Vulkan-Hpp. 2023-03-16 18:05:38 +02:00
Alexey Ushakov
7ebdeb734b Added detection of Vulkan support. Corrections. 2023-03-16 11:21:15 +01:00
Alexey Ushakov
f7821f8d45 Added detection of Vulkan support 2023-03-15 19:29:22 +01:00
Maxim Kartashev
28a609691a Implemented GraphicsEnvironment and Device, HiDPI (scaling) support 2022-12-13 10:06:03 +03:00
Maxim Kartashev
2f47290c19 Dialog and window menu support 2022-11-28 10:57:38 +03:00
Maxim Kartashev
ba1c4a464d dlopen version 0 of xkbcommon if non-versioned file is missing 2022-11-23 13:19:15 +03:00
Dmitry Batrak
466f1dd0fd generate correct mouse events after click in window resize area 2022-11-23 11:17:30 +03:00
Dmitry Batrak
fd37168007 set cursor on pointer enter event, as per Wayland API requirement
in the initial implementation cursor was only updated on pointer move events
2022-11-18 19:51:17 +03:00
Dmitry Batrak
607dcb78a8 support setting mouse cursors 2022-11-18 14:22:21 +03:00
Maxim Kartashev
f9f5359629 Prevent deadlock when scrolling
SurfaceData need to be able to be locked twice during scrolling,
but the associated mutex wasn't recursive, which led to a deadlock.
2022-11-16 17:22:24 +03:00
Maxim Kartashev
6b19a747f7 Basic support for VolatileImage
The image is actually a non-volatile software implementation.
2022-11-15 11:52:49 +03:00
Maxim Kartashev
7d7d9f9bf5 Prevent race condition when destroying buffer manager
Also implemented AWT_LOCK() family of macros
2022-11-15 11:52:23 +03:00
Dmitry Batrak
66c28a3606 prevent crashes on concurrent access to AWT API 2022-11-14 19:12:31 +03:00
Maxim Kartashev
e2321b5594 Implemented getColorModel() and createAcceleratedImage()
This is enough to make J2Ddemo and StylePad work
2022-11-11 12:04:24 +03:00
Dmitry Batrak
5813b10e65 maximize/un-maximize improvements
* remove 'roundtrip' calls - they don't seem to be needed after recent changes to paint logic
* remove unneeded lock in WLFramePeer.setState - corresponding code doesn't query or modify any state
* always repaint client decorations on frame state change - it might not be accompanied by size change
* remember the size of frame before maximization, use it on de-maximization, if compositor doesn't propose a size itself
2022-11-09 16:18:28 +03:00
Maxim Kartashev
d121a93cb1 JBR-4918 More bugfixes in Wayland buffers management
Event-driven painting of client decorations.
Smooth window resize.
Transactional commits at AWT and Swing level
based on frame numbers.
2022-11-08 08:40:18 +03:00
Dmitry Batrak
f7638abee2 initialize memory allocated for WLFrame
just in case, to prevent potential usage of uninitialized fields in future
2022-11-02 12:41:08 +03:00
Dmitry Batrak
55b1310c24 support setting state to a window before making it visible, and right afterwards 2022-10-31 12:23:16 +03:00
Maxim Kartashev
bfe03f4bd1 Revert "JBR-4918 More bugfixes in Wayland buffers management"
This reverts commit 15a09a1564.
2022-10-28 13:20:27 +03:00
Maxim Kartashev
15a09a1564 JBR-4918 More bugfixes in Wayland buffers management
Event-driven painting of client decorations.
Smooth window resize.
Transactional commits at AWT and Swing level
based on frame numbers.
2022-10-28 11:28:57 +03:00
Dmitry Batrak
858380c36d fix assertion in WLKeyboardFocusManagerPeer 2022-10-21 18:34:02 +03:00
Dmitry Batrak
a81b44d79d client-side decorations, and some fixes for minimize/maximize window functionality 2022-10-21 16:46:36 +03:00
Dmitry Batrak
7f9aee3c7f make default component focused on frame activation 2022-10-21 12:05:01 +03:00
Maxim Kartashev
0478a24483 JBR-4918 Additional bugfixes in Wayland buffers management 2022-10-21 09:48:51 +03:00
Maxim Kartashev
c113772448 JBR-4865 Support xdg-shell functions
Implemented maximize/fullscreen together with the reverse functions.
2022-10-19 11:24:08 +03:00
Maxim Kartashev
757194800f JBR-4918 Implement support for window size change 2022-10-18 11:07:01 +03:00
Maxim Kartashev
b9c4ac35ec JBR-4865 Support xdg-shell functions 2022-10-18 11:06:57 +03:00
Dmitry Batrak
adf8d95f7b simplify Wayland events dispatching, fix known issues 2022-10-13 10:23:03 +03:00
Maxim Kartashev
b2986aef46 JBR-4621 Implemented key repeat 2022-10-12 14:23:09 +03:00
Maxim Kartashev
cea81933d9 JBR-4621 Input events support for Wayland
This includes basic mouse and keyboard support.
2022-10-12 14:23:09 +03:00
Maxim Kartashev
6779e2c59b Let WLToolkit work with DISPLAY unset 2022-10-12 14:23:06 +03:00
Alexey Ushakov
4b7c5f62a9 Improved sun.awt.wl.WLGraphicsEnvironment to support createCraphics() 2022-10-12 14:22:34 +03:00
Maxim Kartashev
df204bb882 Added libwakefield source code to the tree
It is not integrated into the build infrastructure both for simplicity
and to avoid otherwise unnecessary dependencies on weston, pixman, etc.

Also fixed copyrights in the recently added files, including the
auto-generated ones.
2022-10-12 14:22:34 +03:00
Maxim Kartashev
0d7fdcf415 Made it possible for Wayland tests to run in parallel
Also fixed a potential crash in getLocationOnScreen().
2022-10-12 14:22:34 +03:00
Maxim Kartashev
73c8c50262 Wayland test harness and sample test 2022-10-12 14:22:34 +03:00
Maxim Kartashev
16cacd0b55 AWT Robot to support Wayland natively
Requires the presence of the 'wakefield' protocol extension on the
server side; will throw UOE on use otherwise. Can be completely
disabled by undefining WAKEFIELD_ROBOT during compilation.

Provides the ability to re-position the surface to the given absolute
coordinates, query the surface's position, obtain RGB of a pixel at the
given absolute coordinates and take a screenshot of an area.
2022-10-12 14:22:34 +03:00
Nikita Gubarkov
19496fcef9 Suppress unused-result warning for libfontmanager 2022-10-12 14:22:34 +03:00
nikita.gubarkov
9cb4769361 Text rendering support
Extracted X11-related code from libfontmanager into libfontmanager_xawt
2022-10-12 14:22:33 +03:00
Maxim Kartashev
dc36d0afaf Reduced xdg_wm_base protocol version to 1 in order to run under Weston
This was done purely for convenience. The version can be bumped back up
at any time, but the change will require a more recent version
of Weston for testing.
2022-10-12 14:22:33 +03:00
Alexey Ushakov
20ca5a41f4 Added JFrame support 2022-10-12 14:22:33 +03:00
Alexey Ushakov
e864ea8469 Fixed child hw component position 2022-10-12 14:22:33 +03:00
Alexey Ushakov
b3e31866ec Implemented heavyweight button rendering 2022-10-12 14:22:33 +03:00
Alexey Ushakov
4cdce4b44a Moved native window management to WLComponentPeer 2022-10-12 14:22:33 +03:00
Alexey Ushakov
2187957e7e Added WLRepaintArea 2022-10-12 14:22:33 +03:00
Alexey Ushakov
14aa544c86 Refactored peers 2022-10-12 14:22:32 +03:00
Alexey Ushakov
de5214531a Added stubs for WLTK button peer 2022-10-12 14:22:32 +03:00
Alexey Ushakov
ea6f74d64f Added 2d surface support 2022-10-12 14:22:32 +03:00
Alexey Ushakov
56e174709b Added support for background color. Refactoring 2022-10-12 14:22:32 +03:00
Alexey Ushakov
c5103ff4a8 Make simple awt window visible 2022-10-12 14:22:32 +03:00
Dmitry Batrak
49c103709e window showing and event loop prototype 2022-10-12 14:22:32 +03:00
Dmitry Batrak
4b21d041d8 more stubbing for WLToolkit, add WLFramePeer 2022-10-12 14:22:31 +03:00
Dmitry Batrak
c1ee18adfb more stubbing for WLToolkit 2022-10-12 14:22:31 +03:00
Alexey Ushakov
693e16b0a1 Created stub version of WLToolkit
A wayland base toolkit with native part linked to wayland-client library
2022-10-12 14:22:28 +03:00
15120 changed files with 534351 additions and 1187370 deletions


@@ -42,5 +42,5 @@ runs:
run: |
# Extract value from configuration file
value="$(grep -h ${{ inputs.var }}= make/conf/github-actions.conf | cut -d '=' -f 2-)"
echo "value=$value" >> $GITHUB_OUTPUT
echo "::set-output name=value::$value"
shell: bash


@@ -61,7 +61,7 @@ runs:
$build_dir/make-support/failure-summary.log \
$build_dir/make-support/failure-logs/* \
failure-logs/ 2> /dev/null || true
echo 'failure=true' >> $GITHUB_OUTPUT
echo '::set-output name=failure::true'
fi
shell: bash


@@ -42,7 +42,7 @@ runs:
run: |
# Convert platform name to upper case
platform_prefix="$(echo ${{ inputs.platform }} | tr [a-z-] [A-Z_])"
echo "value=$platform_prefix" >> $GITHUB_OUTPUT
echo "::set-output name=value::$platform_prefix"
shell: bash
- name: 'Get URL configuration'
@@ -105,5 +105,5 @@ runs:
id: path-name
run: |
# Export the path
echo 'path=bootjdk/jdk' >> $GITHUB_OUTPUT
echo '::set-output name=path::bootjdk/jdk'
shell: bash


@@ -103,7 +103,7 @@ runs:
tests_dir="$(cygpath $tests_dir)"
fi
echo "jdk=$jdk_dir" >> $GITHUB_OUTPUT
echo "symbols=$symbols_dir" >> $GITHUB_OUTPUT
echo "tests=$tests_dir" >> $GITHUB_OUTPUT
echo "::set-output name=jdk::$jdk_dir"
echo "::set-output name=symbols::$symbols_dir"
echo "::set-output name=tests::$tests_dir"
shell: bash


@@ -1,5 +1,5 @@
#
# Copyright (c) 2022, 2023, Oracle and/or its affiliates. All rights reserved.
# Copyright (c) 2022, Oracle and/or its affiliates. All rights reserved.
# DO NOT ALTER OR REMOVE COPYRIGHT NOTICES OR THIS FILE HEADER.
#
# This code is free software; you can redistribute it and/or modify it
@@ -43,12 +43,12 @@ runs:
uses: actions/checkout@v3
with:
repository: google/googletest
ref: 'v${{ steps.version.outputs.value }}'
ref: 'release-${{ steps.version.outputs.value }}'
path: gtest
- name: 'Export path to where GTest is installed'
id: path-name
run: |
# Export the path
echo 'path=gtest' >> $GITHUB_OUTPUT
echo '::set-output name=path::gtest'
shell: bash


@@ -1,5 +1,5 @@
#
# Copyright (c) 2023, Oracle and/or its affiliates. All rights reserved.
# Copyright (c) 2022, Oracle and/or its affiliates. All rights reserved.
# DO NOT ALTER OR REMOVE COPYRIGHT NOTICES OR THIS FILE HEADER.
#
# This code is free software; you can redistribute it and/or modify it
@@ -57,7 +57,7 @@ runs:
- name: 'Build JTReg'
run: |
# Build JTReg and move files to the proper locations
bash make/build.sh --jdk "$JAVA_HOME_17_X64"
bash make/build.sh --jdk "$JAVA_HOME_11_X64"
mkdir ../installed
mv build/images/jtreg/* ../installed
working-directory: jtreg/src
@@ -68,5 +68,5 @@ runs:
id: path-name
run: |
# Export the path
echo 'path=jtreg/installed' >> $GITHUB_OUTPUT
echo '::set-output name=path::jtreg/installed'
shell: bash


@@ -30,16 +30,15 @@ runs:
using: composite
steps:
- name: 'Install MSYS2'
# use a specific release of msys2/setup-msys2 to prevent jtreg build failures on newer release
uses: msys2/setup-msys2@7efe20baefed56359985e327d329042cde2434ff
uses: msys2/setup-msys2@v2
with:
install: 'autoconf tar unzip zip make'
path-type: minimal
location: ${{ runner.tool_cache }}/msys2
location: msys2
# We can't run bash until this is completed, so stick with pwsh
- name: 'Set MSYS2 path'
run: |
# Prepend msys2/msys64/usr/bin to the PATH
echo "$env:RUNNER_TOOL_CACHE/msys2/msys64/usr/bin" >> $env:GITHUB_PATH
echo "$env:GITHUB_WORKSPACE/msys2/msys64/usr/bin" >> $env:GITHUB_PATH
shell: pwsh


@@ -62,9 +62,9 @@ runs:
fi
if [[ "$jdk_bundle_zip$jdk_bundle_tar_gz$symbols_bundle$tests_bundle" != "" ]]; then
echo 'bundles-found=true' >> $GITHUB_OUTPUT
echo '::set-output name=bundles-found::true'
else
echo 'bundles-found=false' >> $GITHUB_OUTPUT
echo '::set-output name=bundles-found::false'
fi
shell: bash


@@ -25,7 +25,6 @@
#
GITHUB_STEP_SUMMARY="$1"
GITHUB_OUTPUT="$2"
test_suite_name=$(cat build/run-test-prebuilt/test-support/test-last-ids.txt)
results_dir=build/run-test-prebuilt/test-results/$test_suite_name/text
@@ -42,12 +41,12 @@ error_count=$(echo $errors | wc -w || true)
if [[ "$failures" = "" && "$errors" = "" ]]; then
# We know something went wrong, but not what
echo 'error-message=Unspecified test suite failure. Please see log for job for details.' >> $GITHUB_OUTPUT
echo '::set-output name=error-message::Unspecified test suite failure. Please see log for job for details.'
exit 0
fi
echo 'failure=true' >> $GITHUB_OUTPUT
echo "error-message=Test run reported $failure_count test failure(s) and $error_count error(s). See summary for details." >> $GITHUB_OUTPUT
echo '::set-output name=failure::true'
echo "::set-output name=error-message::Test run reported $failure_count test failure(s) and $error_count error(s). See summary for details."
echo '### :boom: Test failures summary' >> $GITHUB_STEP_SUMMARY


@@ -40,12 +40,6 @@ on:
extra-conf-options:
required: false
type: string
configure-arguments:
required: false
type: string
make-arguments:
required: false
type: string
jobs:
build-cross-compile:
@@ -171,7 +165,7 @@ jobs:
--with-jmod-compress=zip-1
CC=${{ matrix.gnu-arch }}-linux-gnu${{ matrix.gnu-abi}}-gcc-${{ inputs.gcc-major-version }}
CXX=${{ matrix.gnu-arch }}-linux-gnu${{ matrix.gnu-abi}}-g++-${{ inputs.gcc-major-version }}
${{ inputs.extra-conf-options }} ${{ inputs.configure-arguments }} || (
${{ inputs.extra-conf-options }} || (
echo "Dumping config.log:" &&
cat config.log &&
exit 1)
@@ -180,5 +174,5 @@ jobs:
id: build
uses: ./.github/actions/do-build
with:
make-target: 'hotspot ${{ inputs.make-arguments }}'
make-target: 'hotspot'
platform: linux-${{ matrix.target-cpu }}


@@ -1,5 +1,5 @@
#
# Copyright (c) 2022, 2023, Oracle and/or its affiliates. All rights reserved.
# Copyright (c) 2022, Oracle and/or its affiliates. All rights reserved.
# DO NOT ALTER OR REMOVE COPYRIGHT NOTICES OR THIS FILE HEADER.
#
# This code is free software; you can redistribute it and/or modify it
@@ -58,12 +58,6 @@ on:
apt-extra-packages:
required: false
type: string
configure-arguments:
required: false
type: string
make-arguments:
required: false
type: string
jobs:
build-linux:
@@ -102,7 +96,7 @@ jobs:
run: |
# Set a proper suffix for packages if using a different architecture
if [[ '${{ inputs.apt-architecture }}' != '' ]]; then
echo 'suffix=:${{ inputs.apt-architecture }}' >> $GITHUB_OUTPUT
echo '::set-output name=suffix:::${{ inputs.apt-architecture }}'
fi
# Upgrading apt to solve libc6 installation bugs, see JDK-8260460.
@@ -126,9 +120,10 @@ jobs:
--with-boot-jdk=${{ steps.bootjdk.outputs.path }}
--with-jtreg=${{ steps.jtreg.outputs.path }}
--with-gtest=${{ steps.gtest.outputs.path }}
--enable-jtreg-failure-handler
--with-zlib=system
--with-jmod-compress=zip-1
${{ inputs.extra-conf-options }} ${{ inputs.configure-arguments }} || (
${{ inputs.extra-conf-options }} || (
echo "Dumping config.log:" &&
cat config.log &&
exit 1)
@@ -137,7 +132,7 @@ jobs:
id: build
uses: ./.github/actions/do-build
with:
make-target: '${{ inputs.make-target }} ${{ inputs.make-arguments }}'
make-target: '${{ inputs.make-target }}'
platform: ${{ inputs.platform }}
debug-suffix: '${{ matrix.suffix }}'


@@ -1,5 +1,5 @@
#
# Copyright (c) 2022, 2023, Oracle and/or its affiliates. All rights reserved.
# Copyright (c) 2022, Oracle and/or its affiliates. All rights reserved.
# DO NOT ALTER OR REMOVE COPYRIGHT NOTICES OR THIS FILE HEADER.
#
# This code is free software; you can redistribute it and/or modify it
@@ -45,12 +45,6 @@ on:
xcode-toolset-version:
required: true
type: string
configure-arguments:
required: false
type: string
make-arguments:
required: false
type: string
jobs:
build-macos:
@@ -101,9 +95,10 @@ jobs:
--with-boot-jdk=${{ steps.bootjdk.outputs.path }}
--with-jtreg=${{ steps.jtreg.outputs.path }}
--with-gtest=${{ steps.gtest.outputs.path }}
--enable-jtreg-failure-handler
--with-zlib=system
--with-jmod-compress=zip-1
${{ inputs.extra-conf-options }} ${{ inputs.configure-arguments }} || (
${{ inputs.extra-conf-options }} || (
echo "Dumping config.log:" &&
cat config.log &&
exit 1)
@@ -112,7 +107,7 @@ jobs:
id: build
uses: ./.github/actions/do-build
with:
make-target: '${{ inputs.make-target }} ${{ inputs.make-arguments }}'
make-target: '${{ inputs.make-target }}'
platform: ${{ inputs.platform }}
debug-suffix: '${{ matrix.suffix }}'


@@ -1,5 +1,5 @@
#
# Copyright (c) 2022, 2023, Oracle and/or its affiliates. All rights reserved.
# Copyright (c) 2022, Oracle and/or its affiliates. All rights reserved.
# DO NOT ALTER OR REMOVE COPYRIGHT NOTICES OR THIS FILE HEADER.
#
# This code is free software; you can redistribute it and/or modify it
@@ -48,12 +48,6 @@ on:
msvc-toolset-architecture:
required: true
type: string
configure-arguments:
required: false
type: string
make-arguments:
required: false
type: string
env:
# These are needed to make the MSYS2 bash work properly
@@ -98,26 +92,12 @@ jobs:
id: gtest
uses: ./.github/actions/get-gtest
- name: 'Check toolchain installed'
id: toolchain-check
run: |
set +e
'/c/Program Files (x86)/Microsoft Visual Studio/2019/Enterprise/vc/auxiliary/build/vcvars64.bat' -vcvars_ver=${{ inputs.msvc-toolset-version }}
if [ $? -eq 0 ]; then
echo "Toolchain is already installed"
echo "toolchain-installed=true" >> $GITHUB_OUTPUT
else
echo "Toolchain is not yet installed"
echo "toolchain-installed=false" >> $GITHUB_OUTPUT
fi
- name: 'Install toolchain and dependencies'
run: |
# Run Visual Studio Installer
'/c/Program Files (x86)/Microsoft Visual Studio/Installer/vs_installer.exe' \
modify --quiet --installPath 'C:\Program Files (x86)\Microsoft Visual Studio\2019\Enterprise' \
modify --quiet --installPath 'C:/Program Files (x86)/Microsoft Visual Studio/2019/Enterprise' \
--add Microsoft.VisualStudio.Component.VC.${{ inputs.msvc-toolset-version }}.${{ inputs.msvc-toolset-architecture }}
if: steps.toolchain-check.outputs.toolchain-installed != 'true'
- name: 'Configure'
run: >
@@ -128,9 +108,10 @@ jobs:
--with-boot-jdk=${{ steps.bootjdk.outputs.path }}
--with-jtreg=${{ steps.jtreg.outputs.path }}
--with-gtest=${{ steps.gtest.outputs.path }}
--enable-jtreg-failure-handler
--with-msvc-toolset-version=${{ inputs.msvc-toolset-version }}
--with-jmod-compress=zip-1
${{ inputs.extra-conf-options }} ${{ inputs.configure-arguments }} || (
${{ inputs.extra-conf-options }} || (
echo "Dumping config.log:" &&
cat config.log &&
exit 1)
@@ -138,13 +119,12 @@ jobs:
# We need a minimal PATH on Windows
# Set PATH to "", so just GITHUB_PATH is included
PATH: ''
shell: env /usr/bin/bash --login -eo pipefail {0}
- name: 'Build'
id: build
uses: ./.github/actions/do-build
with:
make-target: '${{ inputs.make-target }} ${{ inputs.make-arguments }}'
make-target: '${{ inputs.make-target }}'
platform: ${{ inputs.platform }}
debug-suffix: '${{ matrix.suffix }}'


@@ -26,18 +26,16 @@
name: 'OpenJDK GHA Sanity Checks'
on:
push:
branches-ignore:
- master
- pr/*
workflow_dispatch:
inputs:
platforms:
description: 'Platform(s) to execute on (comma separated, e.g. "linux-x64, macos, aarch64")'
required: true
default: 'linux-x64, linux-x86, linux-x64-variants, linux-cross-compile, macos-x64, macos-aarch64, windows-x64, windows-aarch64, docs'
configure-arguments:
description: 'Additional configure arguments'
required: false
make-arguments:
description: 'Additional make arguments'
required: false
default: 'linux-x64, linux-x86, linux-x64-variants, linux-cross-compile, macos-x64, macos-aarch64, windows-x64, windows-aarch64'
concurrency:
group: ${{ github.workflow }}-${{ github.ref }}
@@ -55,14 +53,12 @@ jobs:
outputs:
linux-x64: ${{ steps.include.outputs.linux-x64 }}
linux-x86: ${{ steps.include.outputs.linux-x86 }}
# additional build options for linux-x64 are disabled
# linux-x64-variants: ${{ steps.include.outputs.linux-x64-variants }}
linux-x64-variants: ${{ steps.include.outputs.linux-x64-variants }}
linux-cross-compile: ${{ steps.include.outputs.linux-cross-compile }}
macos-x64: ${{ steps.include.outputs.macos-x64 }}
macos-aarch64: ${{ steps.include.outputs.macos-aarch64 }}
windows-x64: ${{ steps.include.outputs.windows-x64 }}
windows-aarch64: ${{ steps.include.outputs.windows-aarch64 }}
docs: ${{ steps.include.outputs.docs }}
steps:
# This function must be inlined in main.yml, or we'd be forced to checkout the repo
@@ -75,17 +71,19 @@ jobs:
# 'false' otherwise.
# arg $1: platform name or names to look for
function check_platform() {
if [[ '${{ !secrets.JDK_SUBMIT_FILTER || startsWith(github.ref, 'refs/heads/submit/') }}' == 'false' ]]; then
# If JDK_SUBMIT_FILTER is set, and this is not a "submit/" branch, don't run anything
echo 'false'
return
fi
if [[ $GITHUB_EVENT_NAME == workflow_dispatch ]]; then
input='${{ github.event.inputs.platforms }}'
elif [[ $GITHUB_EVENT_NAME == push ]]; then
if [[ '${{ !secrets.JDK_SUBMIT_FILTER || startsWith(github.ref, 'refs/heads/submit/') }}' == 'false' ]]; then
# If JDK_SUBMIT_FILTER is set, and this is not a "submit/" branch, don't run anything
>&2 echo 'JDK_SUBMIT_FILTER is set and not a "submit/" branch'
echo 'false'
return
else
input='${{ secrets.JDK_SUBMIT_PLATFORMS }}'
fi
input='${{ secrets.JDK_SUBMIT_PLATFORMS }}'
else
echo 'Internal error in GHA'
exit 1
fi
normalized_input="$(echo ,$input, | tr -d ' ')"
@@ -106,15 +104,14 @@ jobs:
echo 'false'
}
echo "linux-x64=$(check_platform linux-x64 linux x64)" >> $GITHUB_OUTPUT
echo "linux-x86=$(check_platform linux-x86 linux x86)" >> $GITHUB_OUTPUT
echo "linux-x64-variants=$(check_platform linux-x64-variants variants)" >> $GITHUB_OUTPUT
echo "linux-cross-compile=$(check_platform linux-cross-compile cross-compile)" >> $GITHUB_OUTPUT
echo "macos-x64=$(check_platform macos-x64 macos x64)" >> $GITHUB_OUTPUT
echo "macos-aarch64=$(check_platform macos-aarch64 macos aarch64)" >> $GITHUB_OUTPUT
echo "windows-x64=$(check_platform windows-x64 windows x64)" >> $GITHUB_OUTPUT
echo "windows-aarch64=$(check_platform windows-aarch64 windows aarch64)" >> $GITHUB_OUTPUT
echo "docs=$(check_platform docs)" >> $GITHUB_OUTPUT
echo "::set-output name=linux-x64::$(check_platform linux-x64 linux x64)"
echo "::set-output name=linux-x86::$(check_platform linux-x86 linux x86)"
echo "::set-output name=linux-x64-variants::$(check_platform linux-x64-variants variants)"
echo "::set-output name=linux-cross-compile::$(check_platform linux-cross-compile cross-compile)"
echo "::set-output name=macos-x64::$(check_platform macos-x64 macos x64)"
echo "::set-output name=macos-aarch64::$(check_platform macos-aarch64 macos aarch64)"
echo "::set-output name=windows-x64::$(check_platform windows-x64 windows x64)"
echo "::set-output name=windows-aarch64::$(check_platform windows-aarch64 windows aarch64)"
###
### Build jobs
@@ -127,9 +124,7 @@ jobs:
with:
platform: linux-x64
gcc-major-version: '10'
apt-gcc-version: '10.4.0-4ubuntu1~22.04'
configure-arguments: ${{ github.event.inputs.configure-arguments }}
make-arguments: ${{ github.event.inputs.make-arguments }}
apt-gcc-version: '10.3.0-15ubuntu1'
# The linux-x64 jdk bundle is used as buildjdk for the cross-compile job
if: needs.select.outputs.linux-x64 == 'true' || needs.select.outputs.linux-cross-compile == 'true'
@@ -141,14 +136,12 @@ jobs:
platform: linux-x86
gcc-major-version: '10'
gcc-package-suffix: '-multilib'
apt-gcc-version: '10.4.0-4ubuntu1~22.04'
apt-gcc-version: '10.3.0-15ubuntu1'
apt-architecture: 'i386'
# Some multilib libraries do not have proper inter-dependencies, so we have to
# install their dependencies manually.
apt-extra-packages: 'libfreetype6-dev:i386 libtiff-dev:i386 libcupsimage2-dev:i386 libc6-i386 libgcc-s1:i386 libstdc++6:i386'
apt-extra-packages: 'libfreetype6-dev:i386 libtiff-dev:i386 libcupsimage2-dev:i386 libc6-i386'
extra-conf-options: '--with-target-bits=32'
configure-arguments: ${{ github.event.inputs.configure-arguments }}
make-arguments: ${{ github.event.inputs.make-arguments }}
if: needs.select.outputs.linux-x86 == 'true'
build-linux-x64-hs-nopch:
@@ -160,10 +153,8 @@ jobs:
make-target: 'hotspot'
debug-levels: '[ "debug" ]'
gcc-major-version: '10'
apt-gcc-version: '10.4.0-4ubuntu1~22.04'
apt-gcc-version: '10.3.0-15ubuntu1'
extra-conf-options: '--disable-precompiled-headers'
configure-arguments: ${{ github.event.inputs.configure-arguments }}
make-arguments: ${{ github.event.inputs.make-arguments }}
if: needs.select.outputs.linux-x64-variants == 'true'
build-linux-x64-hs-zero:
@@ -175,10 +166,8 @@ jobs:
make-target: 'hotspot'
debug-levels: '[ "debug" ]'
gcc-major-version: '10'
apt-gcc-version: '10.4.0-4ubuntu1~22.04'
apt-gcc-version: '10.3.0-15ubuntu1'
extra-conf-options: '--with-jvm-variants=zero --disable-precompiled-headers'
configure-arguments: ${{ github.event.inputs.configure-arguments }}
make-arguments: ${{ github.event.inputs.make-arguments }}
if: needs.select.outputs.linux-x64-variants == 'true'
build-linux-x64-hs-minimal:
@@ -190,10 +179,8 @@ jobs:
make-target: 'hotspot'
debug-levels: '[ "debug" ]'
gcc-major-version: '10'
apt-gcc-version: '10.4.0-4ubuntu1~22.04'
apt-gcc-version: '10.3.0-15ubuntu1'
extra-conf-options: '--with-jvm-variants=minimal --disable-precompiled-headers'
configure-arguments: ${{ github.event.inputs.configure-arguments }}
make-arguments: ${{ github.event.inputs.make-arguments }}
if: needs.select.outputs.linux-x64-variants == 'true'
build-linux-x64-hs-optimized:
@@ -206,10 +193,8 @@ jobs:
# Technically this is not the "debug" level, but we can't inject a new matrix state for just this job
debug-levels: '[ "debug" ]'
gcc-major-version: '10'
apt-gcc-version: '10.4.0-4ubuntu1~22.04'
apt-gcc-version: '10.3.0-15ubuntu1'
extra-conf-options: '--with-debug-level=optimized --disable-precompiled-headers'
configure-arguments: ${{ github.event.inputs.configure-arguments }}
make-arguments: ${{ github.event.inputs.make-arguments }}
if: needs.select.outputs.linux-x64-variants == 'true'
build-linux-cross-compile:
@@ -220,10 +205,8 @@ jobs:
uses: ./.github/workflows/build-cross-compile.yml
with:
gcc-major-version: '10'
apt-gcc-version: '10.4.0-4ubuntu1~22.04'
apt-gcc-cross-version: '10.4.0-4ubuntu1~22.04cross1'
configure-arguments: ${{ github.event.inputs.configure-arguments }}
make-arguments: ${{ github.event.inputs.make-arguments }}
apt-gcc-version: '10.3.0-15ubuntu1'
apt-gcc-cross-version: '10.3.0-8ubuntu1cross1'
if: needs.select.outputs.linux-cross-compile == 'true'
build-macos-x64:
@@ -232,9 +215,7 @@ jobs:
uses: ./.github/workflows/build-macos.yml
with:
platform: macos-x64
xcode-toolset-version: '12.5.1'
configure-arguments: ${{ github.event.inputs.configure-arguments }}
make-arguments: ${{ github.event.inputs.make-arguments }}
xcode-toolset-version: '11.7'
if: needs.select.outputs.macos-x64 == 'true'
build-macos-aarch64:
@@ -243,10 +224,8 @@ jobs:
uses: ./.github/workflows/build-macos.yml
with:
platform: macos-aarch64
xcode-toolset-version: '12.5.1'
xcode-toolset-version: '12.4'
extra-conf-options: '--openjdk-target=aarch64-apple-darwin'
configure-arguments: ${{ github.event.inputs.configure-arguments }}
make-arguments: ${{ github.event.inputs.make-arguments }}
if: needs.select.outputs.macos-aarch64 == 'true'
build-windows-x64:
@@ -257,8 +236,6 @@ jobs:
platform: windows-x64
msvc-toolset-version: '14.29'
msvc-toolset-architecture: 'x86.x64'
configure-arguments: ${{ github.event.inputs.configure-arguments }}
make-arguments: ${{ github.event.inputs.make-arguments }}
if: needs.select.outputs.windows-x64 == 'true'
build-windows-aarch64:
@@ -271,27 +248,8 @@ jobs:
msvc-toolset-architecture: 'arm64'
make-target: 'hotspot'
extra-conf-options: '--openjdk-target=aarch64-unknown-cygwin'
configure-arguments: ${{ github.event.inputs.configure-arguments }}
make-arguments: ${{ github.event.inputs.make-arguments }}
if: needs.select.outputs.windows-aarch64 == 'true'
build-docs:
name: docs
needs: select
uses: ./.github/workflows/build-linux.yml
with:
platform: linux-x64
debug-levels: '[ "debug" ]'
make-target: 'docs-jdk-bundles'
# Make sure we never try to make full docs, since that would require a
# build JDK, and we do not need the additional testing of the graphs.
extra-conf-options: '--disable-full-docs'
gcc-major-version: '10'
apt-gcc-version: '10.4.0-4ubuntu1~22.04'
configure-arguments: ${{ github.event.inputs.configure-arguments }}
make-arguments: ${{ github.event.inputs.make-arguments }}
if: needs.select.outputs.docs == 'true'
###
### Test jobs
###


@@ -1,270 +0,0 @@
#
# Copyright 2000-2023 JetBrains s.r.o.
# DO NOT ALTER OR REMOVE COPYRIGHT NOTICES OR THIS FILE HEADER.
#
# This code is free software; you can redistribute it and/or modify it
# under the terms of the GNU General Public License version 2 only, as
# published by the Free Software Foundation. Oracle designates this
# particular file as subject to the "Classpath" exception as provided
# by Oracle in the LICENSE file that accompanied this code.
#
# This code is distributed in the hope that it will be useful, but WITHOUT
# ANY WARRANTY; without even the implied warranty of MERCHANTABILITY or
# FITNESS FOR A PARTICULAR PURPOSE. See the GNU General Public License
# version 2 for more details (a copy is included in the LICENSE file that
# accompanied this code).
#
# You should have received a copy of the GNU General Public License version
# 2 along with this work; if not, write to the Free Software Foundation,
# Inc., 51 Franklin St, Fifth Floor, Boston, MA 02110-1301 USA.
#
# Please contact Oracle, 500 Oracle Parkway, Redwood Shores, CA 94065 USA
# or visit www.oracle.com if you need additional information or have any
# questions.
#
name: 'Build OpenJDK on pull request'
on:
pull_request:
concurrency:
group: ${{ github.workflow }}-${{ github.ref }}
cancel-in-progress: true
jobs:
###
### Determine platforms to include
###
select:
name: 'Select platforms'
runs-on: ubuntu-22.04
outputs:
linux-x64: ${{ steps.include.outputs.linux-x64 }}
linux-x86: ${{ steps.include.outputs.linux-x86 }}
linux-cross-compile: ${{ steps.include.outputs.linux-cross-compile }}
macos-x64: ${{ steps.include.outputs.macos-x64 }}
macos-aarch64: ${{ steps.include.outputs.macos-aarch64 }}
windows-x64: ${{ steps.include.outputs.windows-x64 }}
windows-aarch64: ${{ steps.include.outputs.windows-aarch64 }}
windows-x86: ${{ steps.include.outputs.windows-x86 }}
steps:
# This function must be inlined in main.yml, or we'd be forced to checkout the repo
- name: 'Check what jobs to run'
id: include
run: |
# Determine which platform jobs to run
# Returns 'true' if the input platform list matches any of the platform monikers given as argument,
# 'false' otherwise.
# arg $1: platform name or names to look for
function check_platform() {
if [[ $GITHUB_EVENT_NAME == workflow_dispatch ]]; then
input='${{ github.event.inputs.platforms }}'
elif [[ $GITHUB_EVENT_NAME == push ]]; then
if [[ '${{ !secrets.JDK_SUBMIT_FILTER || startsWith(github.ref, 'refs/heads/submit/') }}' == 'false' ]]; then
# If JDK_SUBMIT_FILTER is set, and this is not a "submit/" branch, don't run anything
>&2 echo 'JDK_SUBMIT_FILTER is set and not a "submit/" branch'
echo 'false'
return
else
input='${{ secrets.JDK_SUBMIT_PLATFORMS }}'
fi
fi
normalized_input="$(echo ,$input, | tr -d ' ')"
if [[ "$normalized_input" == ",," ]]; then
# For an empty input, assume all platforms should run
echo 'true'
return
else
# Check for all acceptable platform names
for part in $* ; do
if echo "$normalized_input" | grep -q -e ",$part," ; then
echo 'true'
return
fi
done
fi
echo 'false'
}
echo "linux-x64=$(check_platform linux-x64 linux x64)" >> $GITHUB_OUTPUT
echo "linux-x86=$(check_platform linux-x86 linux x86)" >> $GITHUB_OUTPUT
echo "linux-x64-variants=$(check_platform linux-x64-variants variants)" >> $GITHUB_OUTPUT
echo "linux-cross-compile=$(check_platform linux-cross-compile cross-compile)" >> $GITHUB_OUTPUT
echo "macos-x64=$(check_platform macos-x64 macos x64)" >> $GITHUB_OUTPUT
echo "macos-aarch64=$(check_platform macos-aarch64 macos aarch64)" >> $GITHUB_OUTPUT
echo "windows-x64=$(check_platform windows-x64 windows x64)" >> $GITHUB_OUTPUT
echo "windows-x86=$(check_platform windows-x86 windows x86)" >> $GITHUB_OUTPUT
echo "windows-aarch64=$(check_platform windows-aarch64 windows aarch64)" >> $GITHUB_OUTPUT
echo "docs=$(check_platform docs)" >> $GITHUB_OUTPUT
###
### Build jobs
###
build-linux-x64:
name: linux-x64
needs: select
uses: ./.github/workflows/build-linux.yml
with:
platform: linux-x64
gcc-major-version: '10'
apt-gcc-version: '10.4.0-4ubuntu1~22.04'
configure-arguments: ${{ github.event.inputs.configure-arguments }}
make-arguments: ${{ github.event.inputs.make-arguments }}
# The linux-x64 jdk bundle is used as buildjdk for the cross-compile job
if: needs.select.outputs.linux-x64 == 'true' || needs.select.outputs.linux-cross-compile == 'true'
build-linux-x86:
name: linux-x86
needs: select
uses: ./.github/workflows/build-linux.yml
with:
platform: linux-x86
gcc-major-version: '10'
gcc-package-suffix: '-multilib'
apt-gcc-version: '10.4.0-4ubuntu1~22.04'
apt-architecture: 'i386'
# Some multilib libraries do not have proper inter-dependencies, so we have to
# install their dependencies manually.
apt-extra-packages: 'libfreetype6-dev:i386 libtiff-dev:i386 libcupsimage2-dev:i386 libc6-i386'
extra-conf-options: '--with-target-bits=32'
configure-arguments: ${{ github.event.inputs.configure-arguments }}
make-arguments: ${{ github.event.inputs.make-arguments }}
if: needs.select.outputs.linux-x86 == 'true'
build-linux-cross-compile:
name: linux-cross-compile
needs:
- select
- build-linux-x64
uses: ./.github/workflows/build-cross-compile.yml
with:
gcc-major-version: '10'
apt-gcc-version: '10.4.0-4ubuntu1~22.04'
apt-gcc-cross-version: '10.4.0-4ubuntu1~22.04cross1'
configure-arguments: ${{ github.event.inputs.configure-arguments }}
make-arguments: ${{ github.event.inputs.make-arguments }}
if: needs.select.outputs.linux-cross-compile == 'true'
build-macos-x64:
name: macos-x64
needs: select
uses: ./.github/workflows/build-macos.yml
with:
platform: macos-x64
xcode-toolset-version: '12.5.1'
configure-arguments: ${{ github.event.inputs.configure-arguments }}
make-arguments: ${{ github.event.inputs.make-arguments }}
if: needs.select.outputs.macos-x64 == 'true'
build-macos-aarch64:
name: macos-aarch64
needs: select
uses: ./.github/workflows/build-macos.yml
with:
platform: macos-aarch64
xcode-toolset-version: '12.5.1'
extra-conf-options: '--openjdk-target=aarch64-apple-darwin'
configure-arguments: ${{ github.event.inputs.configure-arguments }}
make-arguments: ${{ github.event.inputs.make-arguments }}
if: needs.select.outputs.macos-aarch64 == 'true'
build-windows-x64:
name: windows-x64
needs: select
uses: ./.github/workflows/build-windows.yml
with:
platform: windows-x64
msvc-toolset-version: '14.29'
msvc-toolset-architecture: 'x86.x64'
configure-arguments: ${{ github.event.inputs.configure-arguments }}
make-arguments: ${{ github.event.inputs.make-arguments }}
if: needs.select.outputs.windows-x64 == 'true'
build-windows-x86:
name: windows-x86
needs: select
uses: ./.github/workflows/build-windows.yml
with:
platform: windows-x86
msvc-toolset-version: '14.29'
msvc-toolset-architecture: 'x86'
configure-arguments: ${{ github.event.inputs.configure-arguments }}
make-arguments: ${{ github.event.inputs.make-arguments }}
if: needs.select.outputs.windows-x86 == 'true'
build-windows-aarch64:
name: windows-aarch64
needs: select
uses: ./.github/workflows/build-windows.yml
with:
platform: windows-aarch64
msvc-toolset-version: '14.29'
msvc-toolset-architecture: 'arm64'
make-target: 'hotspot'
extra-conf-options: '--openjdk-target=aarch64-unknown-cygwin'
configure-arguments: ${{ github.event.inputs.configure-arguments }}
make-arguments: ${{ github.event.inputs.make-arguments }}
if: needs.select.outputs.windows-aarch64 == 'true'
build-docs:
name: docs
needs: select
uses: ./.github/workflows/build-linux.yml
with:
platform: linux-x64
debug-levels: '[ "debug" ]'
make-target: 'docs-jdk-bundles'
# Make sure we never try to make full docs, since that would require a
# build JDK, and we do not need the additional testing of the graphs.
extra-conf-options: '--disable-full-docs'
gcc-major-version: '10'
apt-gcc-version: '10.4.0-4ubuntu1~22.04'
configure-arguments: ${{ github.event.inputs.configure-arguments }}
make-arguments: ${{ github.event.inputs.make-arguments }}
if: needs.select.outputs.docs == 'true'
# Remove bundles so they are not misconstrued as binary distributions from the JDK project
remove-bundles:
name: 'Remove bundle artifacts'
runs-on: ubuntu-22.04
if: always()
needs:
- build-linux-x64
- build-linux-x86
- build-linux-cross-compile
- build-macos-x64
- build-macos-aarch64
- build-windows-x64
- build-windows-aarch64
- build-windows-x86
steps:
# Hack to get hold of the api environment variables that are only defined for actions
- name: 'Get API configuration'
id: api
uses: actions/github-script@v6
with:
script: 'return { url: process.env["ACTIONS_RUNTIME_URL"], token: process.env["ACTIONS_RUNTIME_TOKEN"] }'
- name: 'Remove bundle artifacts'
run: |
# Find and remove all bundle artifacts
ALL_ARTIFACT_URLS="$(curl -s \
-H 'Accept: application/json;api-version=6.0-preview' \
-H 'Authorization: Bearer ${{ fromJson(steps.api.outputs.result).token }}' \
'${{ fromJson(steps.api.outputs.result).url }}_apis/pipelines/workflows/${{ github.run_id }}/artifacts?api-version=6.0-preview')"
BUNDLE_ARTIFACT_URLS="$(echo "$ALL_ARTIFACT_URLS" | jq -r -c '.value | map(select(.name|startswith("bundles-"))) | .[].url')"
for url in $BUNDLE_ARTIFACT_URLS; do
echo "Removing $url"
curl -s \
-H 'Accept: application/json;api-version=6.0-preview' \
-H 'Authorization: Bearer ${{ fromJson(steps.api.outputs.result).token }}' \
-X DELETE "$url" \
|| echo "Failed to remove bundle"
done


@@ -64,7 +64,6 @@ jobs:
- 'hs/tier1 gc'
- 'hs/tier1 runtime'
- 'hs/tier1 serviceability'
- 'lib-test/tier1'
include:
- test-name: 'jdk/tier1 part 1'
@@ -99,10 +98,6 @@ jobs:
test-suite: 'test/hotspot/jtreg/:tier1_serviceability'
debug-suffix: -debug
- test-name: 'lib-test/tier1'
test-suite: 'test/lib-test/:tier1'
debug-suffix: -debug
steps:
- name: 'Checkout the JDK source'
uses: actions/checkout@v3
@@ -143,9 +138,9 @@ jobs:
# We need a minimal PATH on Windows
# Set PATH to "", so just GITHUB_PATH is included
if [[ '${{ runner.os }}' == 'Windows' ]]; then
echo "value=" >> $GITHUB_OUTPUT
echo "::set-output name=value::"
else
echo "value=$PATH" >> $GITHUB_OUTPUT
echo "::set-output name=value::$PATH"
fi
- name: 'Run tests'
@@ -159,7 +154,7 @@ jobs:
SYMBOLS_IMAGE_DIR=${{ steps.bundles.outputs.symbols-path }}
TEST_IMAGE_DIR=${{ steps.bundles.outputs.tests-path }}
JTREG='JAVA_OPTIONS=-XX:-CreateCoredumpOnCrash;VERBOSE=fail,error,time;KEYWORDS=!headful'
&& bash ./.github/scripts/gen-test-summary.sh "$GITHUB_STEP_SUMMARY" "$GITHUB_OUTPUT"
&& bash ./.github/scripts/gen-test-summary.sh "$GITHUB_STEP_SUMMARY"
env:
PATH: ${{ steps.path.outputs.value }}
@@ -192,7 +187,7 @@ jobs:
fi
artifact_name="results-${{ inputs.platform }}-$(echo ${{ matrix.test-name }} | tr '/ ' '__')"
echo "artifact-name=$artifact_name" >> $GITHUB_OUTPUT
echo "::set-output name=artifact-name::$artifact_name"
if: always()
- name: 'Upload test results'

.gitignore

@@ -18,8 +18,3 @@ NashornProfile.txt
/src/utils/LogCompilation/target/
/.project/
/.settings/
/.project
/.classpath
/.cproject
/compile_commands.json
/.cache


@@ -1,7 +1,7 @@
[general]
project=jdk
jbs=JDK
version=21
version=20
[checks]
error=author,committer,reviewers,merge,issues,executable,symlink,message,hg-tag,whitespace,problemlists
@@ -15,7 +15,7 @@ version=0
domain=openjdk.org
[checks "whitespace"]
files=.*\.cpp|.*\.hpp|.*\.c|.*\.h|.*\.java|.*\.cc|.*\.hh|.*\.m|.*\.mm|.*\.md|.*\.gmk|.*\.m4|.*\.ac|Makefile
files=.*\.cpp|.*\.hpp|.*\.c|.*\.h|.*\.java|.*\.cc|.*\.hh|.*\.m|.*\.mm|.*\.gmk|.*\.m4|.*\.ac|Makefile
ignore-tabs=.*\.gmk|Makefile
[checks "merge"]


@@ -2,8 +2,8 @@
OPENJDK ASSEMBLY EXCEPTION
The OpenJDK source code made available by Oracle America, Inc. (Oracle) at
openjdk.org ("OpenJDK Code") is distributed under the terms of the GNU
General Public License <https://www.gnu.org/copyleft/gpl.html> version 2
openjdk.java.net ("OpenJDK Code") is distributed under the terms of the GNU
General Public License <http://www.gnu.org/copyleft/gpl.html> version 2
only ("GPL2"), with the following clarification and special exception.
Linking this OpenJDK Code statically or dynamically with other code
@@ -12,7 +12,7 @@ only ("GPL2"), with the following clarification and special exception.
As a special exception, Oracle gives you permission to link this
OpenJDK Code with certain code licensed by Oracle as indicated at
https://openjdk.org/legal/exception-modules-2007-05-08.html
http://openjdk.java.net/legal/exception-modules-2007-05-08.html
("Designated Exception Modules") to produce an executable,
regardless of the license terms of the Designated Exception Modules,
and to copy and distribute the resulting executable under GPL2,


@@ -1,3 +1,3 @@
# Contributing to the JDK
Please see <https://openjdk.org/contribute> for how to contribute.
Please see <https://openjdk.java.net/contribute/> for how to contribute.

README.md

@@ -1,216 +1,64 @@
[![official JetBrains project](http://jb.gg/badges/official.svg)](https://confluence.jetbrains.com/display/ALL/JetBrains+on+GitHub)
# Welcome to the JDK!
# Welcome to JetBrains Runtime!
## Wakefield
This is a temporary section created to host information on the
[Wakefield](https://wiki.openjdk.java.net/display/wakefield) project.
JetBrains Runtime is a fork of [OpenJDK](https://github.com/openjdk/jdk) available for Windows, Mac OS X, and Linux.
It supports enhanced class redefinition ([DCEVM](https://ssw.jku.at/dcevm/)),
features optional [JCEF](https://github.com/JetBrains/jcef), a framework for embedding Chromium-based browsers,
includes a number of improvements in font rendering, keyboard support,
windowing/focus subsystems, HiDPI, accessibility, and performance, provides better desktop integration
and bugfixes not yet present in OpenJDK.
> **_NOTE_**: This is a **development** branch that is periodically synchronized with
> the [OpenJDK master](https://github.com/openjdk/jdk/tree/master) branch.
>
Release builds are based on these branches:
* [jbr11](https://github.com/JetBrains/JetBrainsRuntime/tree/jbr11) (JDK 11)
* [jbr17](https://github.com/JetBrains/JetBrainsRuntime/tree/jbr17) (JDK 17)
Download the latest releases of JetBrains Runtime to use with JetBrains IDEs. The full list
can be found on the [releases page](https://github.com/JetBrains/JetBrainsRuntime/releases).
## Releases based on JDK 17
| IDE Version | Latest JBR | Date Released |
|-------------|--------------------------------------------------------------------------------------------------------|---------------|
| 2023.1 | [17.0.6-b829.5](https://github.com/JetBrains/JetBrainsRuntime/releases/tag/jbr-release-17.0.6b829.5) | 01-Mar-2023 |
| 2022.3 | [17.0.6-b653.34](https://github.com/JetBrains/JetBrainsRuntime/releases/tag/jbr-release-17.0.6b653.34) | 28-Feb-2023 |
| 2022.2 | [17.0.6-b469.82](https://github.com/JetBrains/JetBrainsRuntime/releases/tag/jbr-release-17.0.6b469.82) | 06-Mar-2023 |
## Releases based on JDK 11
| IDE Version | Latest JBR | Date Released |
|-------------|-------------------------------------------------------------------------------------------------------|---------------|
| 2022.1 | [11_0_16-b2043.64](https://github.com/JetBrains/JetBrainsRuntime/releases/tag/jbr11_0_16b2043.64) | 10-Nov-2022 |
| 2021.3 | [11_0_14_1-b1751.46](https://github.com/JetBrains/JetBrainsRuntime/releases/tag/jbr11_0_14_1b1751.46) | 21-Feb-2022 |
| 2021.2 | [11_0_13-b1504.49](https://github.com/JetBrains/JetBrainsRuntime/releases/tag/jb11_0_13-b1504.49) | 15-Nov-2021 |
| 2021.1 | [11.0.11+9-b1341.60](https://github.com/JetBrains/JetBrainsRuntime/issues/171#issuecomment-1248891540)| 15-Jun-2021 |
| 2020.3 | [11_0_10-b1145.115](https://github.com/JetBrains/JetBrainsRuntime/issues/171#issuecomment-1249243977) | 21-Jun-2021 |
## Contents
- [Welcome to JetBrains Runtime](#welcome-to-jetbrains-runtime)
- [Why Use JetBrains Runtime?](#why-use-jetbrains-runtime)
- [Products Built on JetBrains Runtime](#products-built-on-jetbrains-runtime)
- [Getting Sources](#getting-sources)
- [macOS, Linux](#macos-linux)
- [Windows](#sources-windows)
- [Configuring the Build Environment](#configuring-the-build-environment)
- [Linux (Docker)](#linux-docker)
- [Ubuntu Linux](#ubuntu-linux)
- [Windows](#build-windows)
- [macOS](#macos)
- [Developing](#developing)
- [Contributing](#contributing)
- [Resources](#resources)
## Why Use JetBrains Runtime?
* **Embedded browser**: JetBrains Runtime includes the Java Chromium Embedded Framework ([JCEF](https://github.com/JetBrains/jcef)), which
enables you to embed a Chromium-based browser in your JVM-based application.
To use it, [download a build with JCEF](https://github.com/JetBrains/JetBrainsRuntime/releases).
* **Enhanced class re-definition** with the [DCEVM](https://ssw.jku.at/dcevm/) technology that makes it easier to reload
changed code without restarting JVM; this feature needs to be explicitly enabled with `-XX:+AllowEnhancedClassRedefinition`.
* **Better FPS performance** for graphics-intensive applications.
* **Improved font rendering**, **keyboard input** (such as shortcuts and multinational keyboards),
**HiDPI** and **accessibility** support.
* **Robust desktop experience**: GUI-related fixes often reach JetBrains Runtime much earlier than the corresponding version of OpenJDK.
## Products Built on JetBrains Runtime
* [Android Studio](https://developer.android.com/studio). The official IDE for Google's Android operating system.
* [CLion](https://www.jetbrains.com/clion/). A cross-platform IDE for C and C++ from JetBrains.
* [DataGrip](https://www.jetbrains.com/datagrip/). The IDE for Databases and SQL from JetBrains.
* [GoLand](https://www.jetbrains.com/go/). The cross-platform Go IDE from JetBrains.
* [IntelliJ IDEA](https://www.jetbrains.com/idea/). The IDE for JVM from JetBrains.
* [JProfiler](https://www.ej-technologies.com/products/jprofiler/overview.html). The Java profiler.
* [PhpStorm](https://www.jetbrains.com/phpstorm/). The PHP IDE from JetBrains.
* [PyCharm](https://www.jetbrains.com/pycharm/). The Python IDE from JetBrains.
* [Rider](https://www.jetbrains.com/rider/). The cross-platform .NET IDE from JetBrains.
* [RubyMine](https://www.jetbrains.com/ruby/). The Ruby and Rails IDE from JetBrains.
* [Toolbox App](https://www.jetbrains.com/toolbox-app/). JetBrains IDE manager.
* [WebStorm](https://www.jetbrains.com/webstorm/). The JavaScript IDE from JetBrains.
* [YourKit](https://www.yourkit.com/). Java and .NET profilers.
## Getting Sources
### macOS, Linux
### Building
There are two additional `configure` arguments:
```
git config --global core.autocrlf input
git clone git@github.com:JetBrains/JetBrainsRuntime.git
cd JetBrainsRuntime
git checkout jbr21
--with-wayland specify prefix directory for the wayland package
(expecting the headers under PATH/include)
--with-wayland-include specify directory for the wayland include files
```
As usual, there should be no need to specify those explicitly unless you're doing
something tricky.
However, a variant of `libwayland-dev` needs to be installed on the build system.
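For example, a minimal configure sketch (the `/usr` prefix below is an assumption; point it at wherever the Wayland headers actually live):
```
# Usually unnecessary: a plain `bash configure` finds a system-wide libwayland-dev.
bash configure --with-wayland=/usr
```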
### Running
Make sure your system is configured such that `libwayland` can find the socket to connect to;
usually this means that the environment variable `WAYLAND_DISPLAY` is set to something
sensible. Then add this argument to `java`
```
-Dawt.toolkit.name=WLToolkit
```
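For instance, a hypothetical launch of a Swing application under the Wayland toolkit (the jar name is made up; `WAYLAND_DISPLAY` is normally exported by the compositor):
```
# Falls back to the conventional "wayland-0" socket name if the variable is unset.
WAYLAND_DISPLAY=${WAYLAND_DISPLAY:-wayland-0} java -Dawt.toolkit.name=WLToolkit -jar MyApp.jar
```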
### Windows
<a name="sources-windows"></a>
### Testing
Testing that involves `Robot` is done inside a [Weston](https://gitlab.freedesktop.org/wayland/weston/)
instance with a special module loaded called `libwakefield`
that provides the necessary functionality. The Wayland-specific tests are therefore executed with a dedicated test driver
`test/jdk/java/awt/wakefield/WakefieldTestDriver.java`. The driver also provides an easy
way to run the test in several configurations with a different size and even number
of "outputs" (monitors).
To run the Wayland-specific tests, perform these steps:
* Install Weston version 9 (earlier versions are known NOT to work).
* Obtain `libwakefield.so` either by building from source (available under
`src/java.desktop/share/native/libwakefield` and not integrated into the rest of the
build infrastructure; see `README.md` there)
or by fetching the latest pre-built `x64` binary
```
git config --global core.autocrlf false
git clone git@github.com:JetBrains/JetBrainsRuntime.git
cd JetBrainsRuntime
git checkout jbr21
wget https://github.com/mkartashev/wakefield/raw/main/libwakefield.so
```
* Set `LIBWAKEFIELD` environment variable to the full path to `libwakefield.so`
```
export LIBWAKEFIELD=/tmp/wakefield-testing/libwakefield.so
```
* Run `jtreg` like so
```
jtreg -e:XDG_RUNTIME_DIR -e:LIBWAKEFIELD -testjdk:... test/jdk/java/awt/wakefield/
```
## Configuring the Build Environment
Here are quick per-platform instructions for those who can't wait to get started.
Please refer to [OpenJDK build docs](https://openjdk.java.net/groups/build/doc/building.html) for in-depth
coverage of all the details.
This was verified to work in `Ubuntu 21.10`.
This does NOT work in `Ubuntu 21.04` or `Fedora 34`.
> **_TIP:_** To get a preliminary report of what's missing, run `./configure` and check its output.
> It would usually have meaningful advice on how to solve the problem.
## Generic Info (not Wakefield-specific)
For build instructions please see the
[online documentation](https://openjdk.org/groups/build/doc/building.html),
or either of these files:
### Linux (Docker)
Create a container:
```
$ cd jb/project/docker
$ docker build .
...
Successfully built 942ea9900054
```
Run these commands in the new container:
```
$ docker run -v `pwd`../../../../:/JetBrainsRuntime -it 942ea9900054
# cd /JetBrainsRuntime
# sh ./configure
# make images CONF=linux-x86_64-normal-server-release
```
- [doc/building.html](doc/building.html) (html version)
- [doc/building.md](doc/building.md) (markdown version)
### Ubuntu Linux
Install the necessary tools, libraries, and headers with:
```
$ sudo apt-get install autoconf make build-essential libx11-dev libxext-dev libxrender-dev libxtst-dev \
libxt-dev libxrandr-dev libcups2-dev libfontconfig1-dev libasound2-dev
```
Get Java 19 (for instance, [Azul Zulu Builds of OpenJDK 19](https://www.azul.com/downloads/?version=java-19-sts&os=linux&package=jdk)).
Then run the following:
```
$ cd JetBrainsRuntime
$ git checkout main
$ sh ./configure
$ make images
```
This will build the release configuration under `./build/linux-x86_64-server-release/`.
### Windows
<a name="build-windows"></a>
Install the following:
* [Cygwin x64](http://www.cygwin.com/).
Required packages: `autoconf`, `binutils`, `cpio`, `diffutils`, `file`, `gawk`, `gcc-core`, `make`, `m4`, `unzip`, `zip`.
Install those together with Cygwin.
* [Visual Studio compiler toolset](https://visualstudio.microsoft.com/downloads/).
Install with the desktop development kit, which includes Windows SDK and compilers.
Visual Studio 2019 is supported by default.
* Java 19 (for instance, [Azul Zulu Builds of OpenJDK 19](https://www.azul.com/downloads/?version=java-19-sts&os=windows&package=jdk)).
If you have problems while configuring, read [Java tips on Cygwin](http://horstmann.com/articles/cygwin-tips.html).
From the command line:
```
"C:\Program Files (x86)\Microsoft Visual Studio\2019\Community\VC\Auxiliary\Build\vcvarsall.bat" amd64
"c:\Program_Files\cygwin64\bin\mintty.exe" /bin/bash -l
```
The first command sets up environment variables, the second starts a Cygwin shell with the proper environment.
In the Cygwin shell:
```
$ cd JetBrainsRuntime
$ git checkout main
$ bash configure --with-toolchain-version=2019
$ make images
```
This will build the release configuration under `./build/windows-x86_64-server-release/`.
### macOS
Install the following:
* Xcode command line developer tools and `autoconf` via [Homebrew](https://brew.sh/).
* Java 19 (for instance, [Azul Zulu Builds of OpenJDK 19](https://www.azul.com/downloads/?version=java-19-sts&os=macos&package=jdk)).
From the command line:
```
$ cd JetBrainsRuntime
$ git checkout main
$ sh ./configure
$ make images
```
This will build the release configuration under `./build/macosx-x86_64-server-release/`.
## Developing
You can use [CLion](https://www.jetbrains.com/clion/) to develop native parts of the JetBrains Runtime and
[IntelliJ IDEA](https://www.jetbrains.com/idea/) for the parts written in Java.
Both require projects to be created.
### CLion
Run
```
$ make compile-commands
```
in the git root and open the resulting `build/.../compile_commands.json` file as a project.
Then use `Tools | Compilation Database | Change Project Root` to point to git root of this repository.
See also this detailed step-by-step tutorial for all platforms:
[How to develop OpenJDK with CLion](https://blog.jetbrains.com/clion/2020/03/openjdk-with-clion/).
### IDEA
Run
```
$ sh ./bin/idea.sh
```
in the git root to generate project files (add `--help` for options). If you have multiple
configurations (for example, `release` and `fastdebug`), supply the `--conf <conf_name>` argument.
Then open the git root directory as a project in IDEA.
## Contributing
We are happy to receive your pull requests!
Before you submit one, please sign our [Contributor License Agreement (CLA)](https://www.jetbrains.com/agreements/cla/).
## Resources
* [JetBrains Runtime on GitHub](https://github.com/JetBrains/JetBrainsRuntime).
* [OpenJDK build instructions](https://openjdk.java.net/groups/build/doc/building.html).
* [OpenJDK test instructions](https://htmlpreview.github.io/?https://raw.githubusercontent.com/openjdk/jdk/master/doc/building.html#running-tests).
* [How to develop OpenJDK with CLion](https://blog.jetbrains.com/clion/2020/03/openjdk-with-clion/).
See <https://openjdk.org/> for more information about
the OpenJDK Community and the JDK.


@@ -25,26 +25,7 @@
# Shell script for generating an IDEA project from a given list of modules
usage() {
echo "Usage: $0 [-h|--help] [-q|--quiet] [-a|--absolute-paths] [-r|--root <path>] [-o|--output <path>] [-c|--conf <conf_name>] [modules...]"
echo " -h | --help"
echo " -q | --quiet
No stdout output"
echo " -a | --absolute-paths
Use absolute paths to this jdk, so that generated .idea
project files can be moved independently of jdk sources"
echo " -r | --root <path>
Project content root
Default: $TOPLEVEL_DIR"
echo " -o | --output <path>
Where .idea directory with project files will be generated
(e.g. using '-o .' will place project files in './.idea')
Default: same as --root"
echo " -c | --conf <conf_name>
make configuration (release, slowdebug etc)"
echo " [modules...]
Generate project modules for specific java modules
(e.g. 'java.base java.desktop')
Default: all existing modules (java.* and jdk.*)"
echo "usage: $0 [-h|--help] [-v|--verbose] [-o|--output <path>] [-c|--conf <conf_name>] [modules]+"
exit 1
}
@@ -52,13 +33,10 @@ SCRIPT_DIR=`dirname $0`
#assume TOP is the dir from which the script has been called
TOP=`pwd`
cd $SCRIPT_DIR; SCRIPT_DIR=`pwd`
if [ "x$TOPLEVEL_DIR" = "x" ] ; then
cd .. ; TOPLEVEL_DIR=`pwd`
fi
cd $TOP;
VERBOSE=true
ABSOLUTE_PATHS=false
IDEA_OUTPUT=$TOP/.idea
VERBOSE="false"
CONF_ARG=
while [ $# -gt 0 ]
do
@@ -67,24 +45,14 @@ do
usage
;;
-q | --quiet )
VERBOSE=false
;;
-a | --absolute-paths )
ABSOLUTE_PATHS=true
;;
-r | --root )
TOPLEVEL_DIR="$2"
shift
-v | --vebose )
VERBOSE="true"
;;
-o | --output )
IDEA_OUTPUT="$2/.idea"
IDEA_OUTPUT=$2/.idea
shift
;;
-c | --conf )
CONF_ARG="CONF_NAME=$2"
shift
@@ -101,17 +69,20 @@ do
shift
done
if [ "x$IDEA_OUTPUT" = "x" ] ; then
IDEA_OUTPUT="$TOPLEVEL_DIR/.idea"
if [ -e $IDEA_OUTPUT ] ; then
rm -r $IDEA_OUTPUT
fi
mkdir -p $IDEA_OUTPUT || exit 1
cd $IDEA_OUTPUT; IDEA_OUTPUT=`pwd`
if [ "x$TOPLEVEL_DIR" = "x" ] ; then
cd $SCRIPT_DIR/..
TOPLEVEL_DIR=`pwd`
cd $IDEA_OUTPUT
fi
mkdir -p $IDEA_OUTPUT || exit 1
cd "$TOP" ; cd $TOPLEVEL_DIR; TOPLEVEL_DIR=`pwd`
cd "$TOP" ; cd $IDEA_OUTPUT; IDEA_OUTPUT=`pwd`
cd ..; IDEA_OUTPUT_PARENT=`pwd`
cd "$SCRIPT_DIR/.." ; OPENJDK_DIR=`pwd`
IDEA_MAKE="$OPENJDK_DIR/make/ide/idea/jdk"
MAKE_DIR="$SCRIPT_DIR/../make"
IDEA_MAKE="$MAKE_DIR/ide/idea/jdk"
IDEA_TEMPLATE="$IDEA_MAKE/template"
cp -r "$IDEA_TEMPLATE"/* "$IDEA_OUTPUT"
@@ -123,31 +94,31 @@ if [ -d "$TEMPLATES_OVERRIDE" ] ; then
done
fi
if [ "$VERBOSE" = true ] ; then
echo "Will generate IDEA project files in \"$IDEA_OUTPUT\" for project \"$TOPLEVEL_DIR\""
if [ "$VERBOSE" = "true" ] ; then
echo "output dir: $IDEA_OUTPUT"
echo "idea template dir: $IDEA_TEMPLATE"
fi
cd $TOP ; make -f "$IDEA_MAKE/idea.gmk" -I "$OPENJDK_DIR" idea TOPLEVEL_DIR="$TOPLEVEL_DIR" \
MAKEOVERRIDES= IDEA_OUTPUT_PARENT="$IDEA_OUTPUT_PARENT" OUT="$IDEA_OUTPUT/env.cfg" MODULES="$*" $CONF_ARG || exit 1
cd $TOP ; make -f "$IDEA_MAKE/idea.gmk" -I $MAKE_DIR/.. idea MAKEOVERRIDES= OUT=$IDEA_OUTPUT/env.cfg MODULES="$*" $CONF_ARG || exit 1
cd $SCRIPT_DIR
. $IDEA_OUTPUT/env.cfg
# Expect MODULES, MODULE_NAMES, RELATIVE_PROJECT_DIR, RELATIVE_BUILD_DIR to be set
if [ "xMODULES" = "x" ] ; then
echo "FATAL: MODULES is empty" >&2; exit 1
# Expect MODULE_ROOTS, MODULE_NAMES, BOOT_JDK & SPEC to be set
if [ "x$MODULE_ROOTS" = "x" ] ; then
echo "FATAL: MODULE_ROOTS is empty" >&2; exit 1
fi
if [ "x$MODULE_NAMES" = "x" ] ; then
echo "FATAL: MODULE_NAMES is empty" >&2; exit 1
fi
if [ "x$RELATIVE_PROJECT_DIR" = "x" ] ; then
echo "FATAL: RELATIVE_PROJECT_DIR is empty" >&2; exit 1
if [ "x$BOOT_JDK" = "x" ] ; then
echo "FATAL: BOOT_JDK is empty" >&2; exit 1
fi
if [ "x$RELATIVE_BUILD_DIR" = "x" ] ; then
echo "FATAL: RELATIVE_BUILD_DIR is empty" >&2; exit 1
if [ "x$SPEC" = "x" ] ; then
echo "FATAL: SPEC is empty" >&2; exit 1
fi
if [ -d "$TOPLEVEL_DIR/.hg" ] ; then
@@ -158,43 +129,6 @@ if [ -d "$TOPLEVEL_DIR/.git" ] ; then
VCS_TYPE="Git"
fi
if [ "$ABSOLUTE_PATHS" = true ] ; then
if [ "x$PATHTOOL" != "x" ]; then
PROJECT_DIR="`$PATHTOOL -am $OPENJDK_DIR`"
TOPLEVEL_PROJECT_DIR="`$PATHTOOL -am $TOPLEVEL_DIR`"
else
PROJECT_DIR="$OPENJDK_DIR"
TOPLEVEL_PROJECT_DIR="$TOPLEVEL_DIR"
fi
MODULE_DIR="$PROJECT_DIR"
TOPLEVEL_MODULE_DIR="$TOPLEVEL_PROJECT_DIR"
cd "$IDEA_OUTPUT_PARENT" && cd "$RELATIVE_BUILD_DIR" && BUILD_DIR="`pwd`"
CLION_SCRIPT_TOPDIR="$OPENJDK_DIR"
CLION_PROJECT_DIR="$PROJECT_DIR"
else
if [ "$RELATIVE_PROJECT_DIR" = "." ] ; then
PROJECT_DIR=""
else
PROJECT_DIR="/$RELATIVE_PROJECT_DIR"
fi
if [ "$RELATIVE_TOPLEVEL_PROJECT_DIR" = "." ] ; then
TOPLEVEL_PROJECT_DIR=""
else
TOPLEVEL_PROJECT_DIR="/$RELATIVE_TOPLEVEL_PROJECT_DIR"
fi
MODULE_DIR="\$MODULE_DIR\$$PROJECT_DIR"
PROJECT_DIR="\$PROJECT_DIR\$$PROJECT_DIR"
TOPLEVEL_MODULE_DIR="\$MODULE_DIR\$$TOPLEVEL_PROJECT_DIR"
TOPLEVEL_PROJECT_DIR="\$PROJECT_DIR\$$TOPLEVEL_PROJECT_DIR"
BUILD_DIR="\$PROJECT_DIR\$/$RELATIVE_BUILD_DIR"
CLION_SCRIPT_TOPDIR="$CLION_RELATIVE_PROJECT_DIR"
CLION_PROJECT_DIR="\$PROJECT_DIR\$/$CLION_SCRIPT_TOPDIR"
fi
if [ "$VERBOSE" = true ] ; then
echo "Project root: $PROJECT_DIR"
echo "Generating IDEA project files..."
fi
### Replace template variables
NUM_REPLACEMENTS=0
@@ -218,106 +152,126 @@ add_replacement() {
eval TO$NUM_REPLACEMENTS='$2'
}
add_replacement "###PATHTOOL###" "$PATHTOOL"
add_replacement "###CLION_SCRIPT_TOPDIR###" "$CLION_SCRIPT_TOPDIR"
add_replacement "###CLION_PROJECT_DIR###" "$CLION_PROJECT_DIR"
add_replacement "###PROJECT_DIR###" "$PROJECT_DIR"
add_replacement "###MODULE_DIR###" "$MODULE_DIR"
add_replacement "###TOPLEVEL_PROJECT_DIR###" "$TOPLEVEL_PROJECT_DIR"
add_replacement "###TOPLEVEL_MODULE_DIR###" "$TOPLEVEL_MODULE_DIR"
add_replacement "###MODULE_NAMES###" "$MODULE_NAMES"
add_replacement "###VCS_TYPE###" "$VCS_TYPE"
add_replacement "###BUILD_DIR###" "$BUILD_DIR"
add_replacement "###RELATIVE_BUILD_DIR###" "$RELATIVE_BUILD_DIR"
if [ "x$PATHTOOL" != "x" ]; then
add_replacement "###BASH_RUNNER_PREFIX###" "\$PROJECT_DIR\$/.idea/bash.bat"
else
add_replacement "###BASH_RUNNER_PREFIX###" ""
fi
if [ "x$PATHTOOL" != "x" ]; then
SPEC_DIR=`dirname $SPEC`
if [ "x$CYGPATH" != "x" ]; then
add_replacement "###BUILD_DIR###" "`$CYGPATH -am $SPEC_DIR`"
add_replacement "###IMAGES_DIR###" "`$CYGPATH -am $SPEC_DIR`/images/jdk"
add_replacement "###ROOT_DIR###" "`$CYGPATH -am $TOPLEVEL_DIR`"
add_replacement "###IDEA_DIR###" "`$CYGPATH -am $IDEA_OUTPUT`"
if [ "x$JT_HOME" = "x" ]; then
add_replacement "###JTREG_HOME###" ""
else
add_replacement "###JTREG_HOME###" "`$PATHTOOL -am $JT_HOME`"
add_replacement "###JTREG_HOME###" "`$CYGPATH -am $JT_HOME`"
fi
elif [ "x$WSL_DISTRO_NAME" != "x" ]; then
add_replacement "###BUILD_DIR###" "`wslpath -am $SPEC_DIR`"
add_replacement "###IMAGES_DIR###" "`wslpath -am $SPEC_DIR`/images/jdk"
add_replacement "###ROOT_DIR###" "`wslpath -am $TOPLEVEL_DIR`"
add_replacement "###IDEA_DIR###" "`wslpath -am $IDEA_OUTPUT`"
if [ "x$JT_HOME" = "x" ]; then
add_replacement "###JTREG_HOME###" ""
else
add_replacement "###JTREG_HOME###" "`wslpath -am $JT_HOME`"
fi
else
add_replacement "###BUILD_DIR###" "$SPEC_DIR"
add_replacement "###JTREG_HOME###" "$JT_HOME"
add_replacement "###IMAGES_DIR###" "$SPEC_DIR/images/jdk"
add_replacement "###ROOT_DIR###" "$TOPLEVEL_DIR"
add_replacement "###IDEA_DIR###" "$IDEA_OUTPUT"
fi
MODULE_IMLS=""
TEST_MODULE_DEPENDENCIES=""
for module in $MODULE_NAMES; do
MODULE_IMLS="$MODULE_IMLS<module fileurl=\"file://\$PROJECT_DIR$/.idea/$module.iml\" filepath=\"\$PROJECT_DIR$/.idea/$module.iml\" /> "
TEST_MODULE_DEPENDENCIES="$TEST_MODULE_DEPENDENCIES<orderEntry type=\"module\" module-name=\"$module\" scope=\"TEST\" /> "
SOURCE_PREFIX="<sourceFolder url=\"file://"
SOURCE_POSTFIX="\" isTestSource=\"false\" />"
for root in $MODULE_ROOTS; do
if [ "x$CYGPATH" != "x" ]; then
root=`$CYGPATH -am $root`
elif [ "x$WSL_DISTRO_NAME" != "x" ]; then
root=`wslpath -am $root`
fi
VM_CI="jdk.internal.vm.ci/share/classes"
VM_COMPILER="src/jdk.internal.vm.compiler/share/classes"
if test "${root#*$VM_CI}" != "$root" || test "${root#*$VM_COMPILER}" != "$root"; then
for subdir in "$root"/*; do
if [ -d "$subdir" ]; then
SOURCES=$SOURCES" $SOURCE_PREFIX""$subdir"/src"$SOURCE_POSTFIX"
fi
done
else
SOURCES=$SOURCES" $SOURCE_PREFIX""$root""$SOURCE_POSTFIX"
fi
done
add_replacement "###MODULE_IMLS###" "$MODULE_IMLS"
add_replacement "###TEST_MODULE_DEPENDENCIES###" "$TEST_MODULE_DEPENDENCIES"
add_replacement "###SOURCE_ROOTS###" "$SOURCES"
replace_template_dir "$IDEA_OUTPUT"
### Generate module project files
### Compile the custom Logger
if [ "$VERBOSE" = true ] ; then
echo "Generating project modules:"
fi
(
DEFAULT_IFS="$IFS"
IFS='#'
for value in $MODULES; do
(
eval "$value"
if [ "$VERBOSE" = true ] ; then
echo " $module"
fi
MAIN_SOURCE_DIRS=""
CONTENT_ROOTS=""
IFS=' '
for dir in $moduleSrcDirs; do
case $dir in
"src/"*) MAIN_SOURCE_DIRS="$MAIN_SOURCE_DIRS <sourceFolder url=\"file://$MODULE_DIR/$dir\" isTestSource=\"false\" />" ;;
*"/support/gensrc/$module") ;; # Exclude generated sources to avoid module-info conflicts, see https://youtrack.jetbrains.com/issue/IDEA-185108
*) CONTENT_ROOTS="$CONTENT_ROOTS <content url=\"file://$MODULE_DIR/$dir\">\
<sourceFolder url=\"file://$MODULE_DIR/$dir\" isTestSource=\"false\" generated=\"true\" /></content>" ;;
esac
done
if [ "x$MAIN_SOURCE_DIRS" != "x" ] ; then
CONTENT_ROOTS="<content url=\"file://$MODULE_DIR/src/$module\">$MAIN_SOURCE_DIRS</content>$CONTENT_ROOTS"
fi
add_replacement "###MODULE_CONTENT_ROOTS###" "$CONTENT_ROOTS"
DEPENDENCIES=""
for dep in $moduleDependencies; do
case $MODULE_NAMES in # Exclude skipped modules from dependencies
*"$dep"*) DEPENDENCIES="$DEPENDENCIES<orderEntry type=\"module\" module-name=\"$dep\" /> "
esac
done
add_replacement "###DEPENDENCIES###" "$DEPENDENCIES"
cp "$IDEA_OUTPUT/module.iml" "$IDEA_OUTPUT/$module.iml"
IFS="$DEFAULT_IFS"
replace_template_file "$IDEA_OUTPUT/$module.iml"
)
done
)
rm "$IDEA_OUTPUT/module.iml"
CLASSES=$IDEA_OUTPUT/classes
### Create shell script runner for Windows
if [ "x$PATHTOOL" != "x" ]; then
echo "@echo off" > "$IDEA_OUTPUT/bash.bat"
if [ "x$WSL_DISTRO_NAME" != "x" ] ; then
echo "wsl -d $WSL_DISTRO_NAME --cd \"%cd%\" -e %*" >> "$IDEA_OUTPUT/bash.bat"
if [ "x$ANT_HOME" = "x" ] ; then
# try some common locations
if [ -f "/usr/share/ant/lib/ant.jar" ] ; then
ANT_HOME="/usr/share/ant"
else
echo "$WINENV_ROOT\bin\bash.exe -l -c \"cd %CD:\=/%/ && %*\"" >> "$IDEA_OUTPUT/bash.bat"
try_ant=$(ls /opt/homebrew/Cellar/ant/*/libexec/lib/ant.jar 2> /dev/null | sort -r | head -n 1)
if [ "x$try_ant" != "x" ] ; then
ANT_HOME=$(cd $(dirname $try_ant)/.. && pwd)
else
try_ant=$(ls /usr/local/Cellar/ant/*/libexec/lib/ant.jar 2> /dev/null | sort -r | head -n 1)
if [ "x$try_ant" != "x" ] ; then
ANT_HOME=$(cd $(dirname $try_ant)/.. && pwd)
fi
fi
fi
else
if [ ! -f "$ANT_HOME/lib/ant.jar" ] ; then
echo "FATAL: ANT_HOME is incorrect. Try removing it and use autodetection, or fix the value" >&2; exit 1
fi
fi
if [ "x$ANT_HOME" = "x" ] ; then
echo "FATAL: cannot find ant. Try setting ANT_HOME." >&2; exit 1
fi
CP=$ANT_HOME/lib/ant.jar
rm -rf $CLASSES; mkdir $CLASSES
# If we have a Windows boot JDK, we need a .exe suffix
if [ -e "$BOOT_JDK/bin/java.exe" ] ; then
JAVAC=javac.exe
else
JAVAC=javac
fi
if [ "$VERBOSE" = true ] ; then
IDEA_PROJECT_DIR="`dirname $IDEA_OUTPUT`"
if [ "x$PATHTOOL" != "x" ]; then
IDEA_PROJECT_DIR="`$PATHTOOL -am $IDEA_PROJECT_DIR`"
fi
echo "
Now you can open \"$IDEA_PROJECT_DIR\" as IDEA project
You can also run 'bash \"$IDEA_OUTPUT/jdk-clion/update-project.sh\"' to generate Clion project"
# If we are on WSL, the boot JDK might be either Windows or Linux,
# and we need to use realpath instead of CYGPATH to make javac work on both.
# We need to handle this case first since CYGPATH might be set on WSL.
if [ "x$WSL_DISTRO_NAME" != "x" ]; then
JAVAC_SOURCE_FILE=`realpath --relative-to=./ $IDEA_OUTPUT/src/idea/IdeaLoggerWrapper.java`
JAVAC_SOURCE_PATH=`realpath --relative-to=./ $IDEA_OUTPUT/src`
JAVAC_CLASSES=`realpath --relative-to=./ $CLASSES`
ANT_TEMP=`mktemp -d -p ./`
cp $ANT_HOME/lib/ant.jar $ANT_TEMP/ant.jar
JAVAC_CP=$ANT_TEMP/ant.jar
elif [ "x$CYGPATH" != "x" ] ; then ## CYGPATH may be set in env.cfg
JAVAC_SOURCE_FILE=`$CYGPATH -am $IDEA_OUTPUT/src/idea/IdeaLoggerWrapper.java`
JAVAC_SOURCE_PATH=`$CYGPATH -am $IDEA_OUTPUT/src`
JAVAC_CLASSES=`$CYGPATH -am $CLASSES`
JAVAC_CP=`$CYGPATH -am $CP`
else
JAVAC_SOURCE_FILE=$IDEA_OUTPUT/src/idea/IdeaLoggerWrapper.java
JAVAC_SOURCE_PATH=$IDEA_OUTPUT/src
JAVAC_CLASSES=$CLASSES
JAVAC_CP=$CP
fi
$BOOT_JDK/bin/$JAVAC -d $JAVAC_CLASSES -sourcepath $JAVAC_SOURCE_PATH -cp $JAVAC_CP $JAVAC_SOURCE_FILE
if [ "x$WSL_DISTRO_NAME" != "x" ]; then
rm -rf $ANT_TEMP
fi


@@ -1,6 +1,6 @@
#!/bin/bash
#
# Copyright (c) 2015, 2022, Oracle and/or its affiliates. All rights reserved.
# Copyright (c) 2015, 2016, Oracle and/or its affiliates. All rights reserved.
# DO NOT ALTER OR REMOVE COPYRIGHT NOTICES OR THIS FILE HEADER.
#
# This code is free software; you can redistribute it and/or modify it
@@ -128,15 +128,6 @@ install_jib() {
exit 1
fi
fi
# Want to check the filetype using file, to see if we got served a HTML error page.
# This is sensitive to the filename containing a specific string, but good enough.
file "${installed_jib_script}.gz" | grep "gzip compressed data" > /dev/null
if [ $? -ne 0 ]; then
echo "Warning: ${installed_jib_script}.gz is not a gzip file."
echo "If you are behind a proxy you may need to configure exceptions using no_proxy."
echo "The download URL was: ${jib_url}"
exit 1
fi
echo "Extracting JIB bootstrap script"
rm -f "${installed_jib_script}"
gunzip "${installed_jib_script}.gz"
@@ -144,28 +135,6 @@ install_jib() {
echo "${data_string}" > "${install_data}"
}
# Returns a shell-escaped version of the argument given.
shell_quote() {
if [[ -n "$1" ]]; then
# Uses only shell-safe characters? No quoting needed.
# '=' is a zsh meta-character, but only in word-initial position.
if echo "$1" | grep '^[ABCDEFGHIJKLMNOPQRSTUVWXYZabcdefghijklmnopqrstuvwxyz0123456789\.:,%/+=_-]\{1,\}$' > /dev/null \
&& ! echo "$1" | grep '^=' > /dev/null; then
quoted="$1"
else
if echo "$1" | grep "[\'!]" > /dev/null; then
# csh does history expansion within single quotes, but not
# when backslash-escaped!
local quoted_quote="'\\''" quoted_exclam="'\\!'"
word="${1//\'/${quoted_quote}}"
word="${1//\!/${quoted_exclam}}"
fi
quoted="'$1'"
fi
echo "$quoted"
fi
}
# Main body starts here
setup_url
@@ -182,16 +151,4 @@ if [ -z "${JIB_SRC_DIR}" ]; then
export JIB_SRC_DIR="${mydir}/../"
fi
# Save the original command line
conf_quoted_arguments=()
for conf_option; do
conf_quoted_arguments=("${conf_quoted_arguments[@]}" "$(shell_quote "$conf_option")")
done
export REAL_CONFIGURE_COMMAND_LINE="${conf_quoted_arguments[@]}"
myfulldir="$(cd "${mydir}" > /dev/null && pwd)"
export REAL_CONFIGURE_COMMAND_EXEC_FULL="$BASH $myfulldir/$myname"
export REAL_CONFIGURE_COMMAND_EXEC_SHORT="$myname"
${installed_jib_script} "$@"

configure (vendored): Executable file → Normal file (diff too large; not shown)


@@ -126,8 +126,6 @@ space is required.
Even for 32-bit builds, it is recommended to use a 64-bit build machine, and
instead create a 32-bit target using `--with-target-bits=32`.
Note: The Windows 32-bit x86 port is deprecated and may be removed in a future release.
### Building on aarch64
At a minimum, a machine with 8 cores is advisable, as well as 8 GB of RAM.
@@ -164,11 +162,11 @@ This table lists the OS versions used by Oracle when building the JDK. Such
information is always subject to change, but this table is up to date at the
time of writing.
| Operating system | Vendor/version used |
| ----------------- | ---------------------------------- |
| Linux | Oracle Enterprise Linux 6.4 / 7.6 |
| macOS | Mac OS X 10.13 (High Sierra) |
| Windows | Windows Server 2012 R2 |
Operating system Vendor/version used
----------------- -------------------------------------------------------
Linux Oracle Enterprise Linux 6.4 / 7.6
macOS Mac OS X 10.13 (High Sierra)
Windows Windows Server 2012 R2
The double version numbers for Linux are due to the hybrid model
used at Oracle, where header files and external libraries from an older version
@@ -201,8 +199,6 @@ rule also applies to input to the build system, e.g. in arguments to
`--with-msvcr-dll=c:\msvcr100.dll`. For details on this conversion, see the section
on [Fixpath](#fixpath).
Note: The Windows 32-bit x86 port is deprecated and may be removed in a future release.
#### Cygwin
A functioning [Cygwin](http://www.cygwin.com/) environment is required for
@@ -315,17 +311,14 @@ Build Wiki page for details about which versions of AIX are supported.
Large portions of the JDK consists of native code, that needs to be compiled to
be able to run on the target platform. In theory, toolchain and operating
system should be independent factors, but in practice there's more or less a
one-to-one correlation between target operating system and toolchain. There are
ongoing efforts to loosen this strict coupling between compiler and operating
system (see [JDK-8288293](https://bugs.openjdk.org/browse/JDK-8288293)) but it
will likely be a very long time before this goal can be realized.
one-to-one correlation between target operating system and toolchain.
| Operating system | Supported toolchain |
| ------------------ | ------------------------- |
| Linux | gcc, clang |
| macOS | Apple Xcode (using clang) |
| AIX | IBM XL C/C++ |
| Windows | Microsoft Visual Studio |
Operating system Supported toolchain
------------------ -------------------------
Linux gcc, clang
macOS Apple Xcode (using clang)
AIX IBM XL C/C++
Windows Microsoft Visual Studio
Please see the individual sections on the toolchains for version
recommendations. As a reference, these versions of the toolchains are used, at
@@ -334,11 +327,11 @@ possible to compile the JDK with both older and newer versions, but the closer
you stay to this list, the more likely you are to compile successfully without
issues.
| Operating system | Toolchain version |
| ------------------ | ------------------------------------------ |
| Linux | gcc 11.2.0 |
| macOS | Apple Xcode 10.1 (using clang 10.0.0) |
| Windows | Microsoft Visual Studio 2022 update 17.1.0 |
Operating system Toolchain version
------------------ -------------------------------------------------------
Linux gcc 11.2.0
macOS Apple Xcode 10.1 (using clang 10.0.0)
Windows Microsoft Visual Studio 2022 update 17.1.0
All compilers are expected to be able to compile to the C99 language standard,
as some C99 features are used in the source code. Microsoft Visual Studio
@@ -366,20 +359,20 @@ To use clang instead of gcc on Linux, use `--with-toolchain-type=clang`.
The oldest supported version of Xcode is 8.
You will need the Xcode command line developer tools to be able to build
the JDK. (Actually, *only* the command line tools are needed, not the IDE.)
You will need the Xcode command lines developers tools to be able to build
the JDK. (Actually, *only* the command lines tools are needed, not the IDE.)
The simplest way to install these is to run:
```
xcode-select --install
```
When updating Xcode, it is advisable to keep an older version for building the JDK.
To use a specific version of Xcode you have multiple options:
* Use `xcode-select -s` before running `configure`, e.g. `xcode-select -s /Applications/Xcode13.1.app`. The drawback is that the setting
is system wide and you may have to revert it after an OpenJDK build.
* Use configure option `--with-xcode-path`, e.g. `configure --with-xcode-path=/Applications/Xcode13.1.app`
This allows using a specific Xcode version for an OpenJDK build, independently of the active Xcode version by `xcode-select`.
It is advisable to keep an older version of Xcode for building the JDK when
updating Xcode. This [blog page](
http://iosdevelopertips.com/xcode/install-multiple-versions-of-xcode.html) has
good suggestions on managing multiple Xcode versions. To use a specific version
of Xcode, use `xcode-select -s` before running `configure`, or use
`--with-toolchain-path` to point to the version of Xcode to use, e.g.
`configure --with-toolchain-path=/Applications/Xcode8.app/Contents/Developer/usr/bin`
If you have recently (inadvertently) updated your OS and/or Xcode version, and
the JDK can no longer be built, please see the section on [Problems with the
@@ -475,19 +468,6 @@ rather than bundling the JDK's own copy.
Use `--with-freetype-include=<path>` and `--with-freetype-lib=<path>`
if `configure` does not automatically locate the platform FreeType files.
### Fontconfig
Fontconfig from [freedesktop.org Fontconfig](http://fontconfig.org) is required
on all platforms except Windows and macOS.
* To install on an apt-based Linux, try running `sudo apt-get install
libfontconfig-dev`.
* To install on an rpm-based Linux, try running `sudo yum install
fontconfig-devel`.
Use `--with-fontconfig-include=<path>` and `--with-fontconfig=<path>`
if `configure` does not automatically locate the platform Fontconfig files.
### CUPS
CUPS, [Common UNIX Printing System](http://www.cups.org) header files are
@@ -877,18 +857,17 @@ containing `lib/jtreg.jar` etc.
The [Adoption Group](https://wiki.openjdk.org/display/Adoption) provides
recent builds of jtreg [here](
https://ci.adoptium.net/view/Dependencies/job/dependency_pipeline/lastSuccessfulBuild/artifact/jtreg/).
https://ci.adoptopenjdk.net/view/Dependencies/job/dependency_pipeline/lastSuccessfulBuild/artifact/jtreg/).
Download the latest `.tar.gz` file, unpack it, and point `--with-jtreg` to the
`jtreg` directory that you just unpacked.
Building of Hotspot Gtest suite requires the source code of Google
Test framework. The top directory, which contains both `googletest`
and `googlemock` directories, should be specified via `--with-gtest`.
The minimum supported version of Google Test is 1.13.0, whose source
code can be obtained:
Building of Hotspot Gtest suite requires the source code of Google Test framework.
The top directory, which contains both `googletest` and `googlemock`
directories, should be specified via `--with-gtest`.
The supported version of Google Test is 1.8.1, whose source code can be obtained:
* by downloading and unpacking the source bundle from [here](https://github.com/google/googletest/releases/tag/v1.13.0)
* or by checking out `v1.13.0` tag of `googletest` project: `git clone -b v1.13.0 https://github.com/google/googletest`
* by downloading and unpacking the source bundle from [here](https://github.com/google/googletest/releases/tag/release-1.8.1)
* or by checking out `release-1.8.1` tag of `googletest` project: `git clone -b release-1.8.1 https://github.com/google/googletest`
To execute the most basic tests (tier 1), use:
```
@@ -987,14 +966,14 @@ https://sourceware.org/autobook/autobook/autobook_17.html). If no
targets are given, a native toolchain for the current platform will be
created. Currently, at least the following targets are known to work:
| Supported devkit targets |
| ------------------------ |
| x86_64-linux-gnu |
| aarch64-linux-gnu |
| arm-linux-gnueabihf |
| ppc64-linux-gnu |
| ppc64le-linux-gnu |
| s390x-linux-gnu |
Supported devkit targets
-------------------------
x86_64-linux-gnu
aarch64-linux-gnu
arm-linux-gnueabihf
ppc64-linux-gnu
ppc64le-linux-gnu
s390x-linux-gnu
`BASE_OS` must be one of "OEL6" for Oracle Enterprise Linux 6 or
"Fedora" (if not specified "OEL6" will be the default). If the base OS
@@ -1164,7 +1143,7 @@ Note that X11 is needed even if you only want to build a headless JDK.
### Cross compiling with Debian sysroots
Fortunately, you can create sysroots for foreign architectures with tools
provided by your OS. On Debian/Ubuntu systems, one could use `debootstrap` to
provided by your OS. On Debian/Ubuntu systems, one could use `qemu-deboostrap` to
create the *target* system chroot, which would have the native libraries and headers
specific to that *target* system. After that, we can use the cross-compiler on the *build*
system, pointing into chroot to get the build dependencies right. This allows building
@@ -1179,7 +1158,7 @@ For example, cross-compiling to AArch64 from x86_64 could be done like this:
* Create chroot on the *build* system, configuring it for *target* system:
```
sudo debootstrap \
sudo qemu-debootstrap \
--arch=arm64 \
--verbose \
--include=fakeroot,symlinks,build-essential,libx11-dev,libxext-dev,libxrender-dev,libxrandr-dev,libxtst-dev,libxt-dev,libcups2-dev,libfontconfig1-dev,libasound2-dev,libfreetype6-dev,libpng-dev,libffi-dev \
@@ -1187,8 +1166,6 @@ For example, cross-compiling to AArch64 from x86_64 could be done like this:
buster \
~/sysroot-arm64 \
http://httpredir.debian.org/debian/
# If the target architecture is `riscv64`,
# the path should be `debian-ports` instead of `debian`.
```
* Make sure the symlinks inside the newly created chroot point to proper locations:
@@ -1221,22 +1198,21 @@ it might require a little nudge with:
Architectures that are known to successfully cross-compile like this are:
| Target | Debian tree | Debian arch | `--openjdk-target=...` | `--with-jvm-variants=...` |
| ------------ | ------------ | ------------- | ------------------------ | ------------------------- |
| x86 | buster | i386 | i386-linux-gnu | (all) |
| arm | buster | armhf | arm-linux-gnueabihf | (all) |
| aarch64 | buster | arm64 | aarch64-linux-gnu | (all) |
| ppc64le | buster | ppc64el | powerpc64le-linux-gnu | (all) |
| s390x | buster | s390x | s390x-linux-gnu | (all) |
| mipsle | buster | mipsel | mipsel-linux-gnu | zero |
| mips64le | buster | mips64el | mips64el-linux-gnueabi64 | zero |
| armel | buster | arm | arm-linux-gnueabi | zero |
| ppc | sid | powerpc | powerpc-linux-gnu | zero |
| ppc64be | sid | ppc64 | powerpc64-linux-gnu | (all) |
| m68k | sid | m68k | m68k-linux-gnu | zero |
| alpha | sid | alpha | alpha-linux-gnu | zero |
| sh4 | sid | sh4 | sh4-linux-gnu | zero |
| riscv64 | sid | riscv64 | riscv64-linux-gnu | (all) |
Target Debian tree Debian arch `--openjdk-target=...` `--with-jvm-variants=...`
------------ ------------ ------------- ------------------------ --------------
x86 buster i386 i386-linux-gnu (all)
arm buster armhf arm-linux-gnueabihf (all)
aarch64 buster arm64 aarch64-linux-gnu (all)
ppc64le buster ppc64el powerpc64le-linux-gnu (all)
s390x buster s390x s390x-linux-gnu (all)
mipsle buster mipsel mipsel-linux-gnu zero
mips64le buster mips64el mips64el-linux-gnueabi64 zero
armel buster arm arm-linux-gnueabi zero
ppc sid powerpc powerpc-linux-gnu zero
ppc64be sid ppc64 powerpc64-linux-gnu (all)
m68k sid m68k m68k-linux-gnu zero
alpha sid alpha alpha-linux-gnu zero
sh4 sid sh4 sh4-linux-gnu zero
### Building for ARM/aarch64
@@ -1246,44 +1222,6 @@ available using `--with-abi-profile`: arm-vfp-sflt, arm-vfp-hflt, arm-sflt,
armv5-vfp-sflt, armv6-vfp-hflt. Note that soft-float ABIs are no longer
properly supported by the JDK.
### Building for RISC-V
The RISC-V community provides a basic
[GNU compiler toolchain](https://github.com/riscv-collab/riscv-gnu-toolchain),
but the [external libraries](#External-Library-Requirements) required by OpenJDK
complicate the building process. The placeholder `<toolchain-installed-path>`
shown below is the path where you want to install the toolchain.
* Install the RISC-V GNU compiler toolchain:
```
git clone --recursive https://github.com/riscv-collab/riscv-gnu-toolchain
cd riscv-gnu-toolchain
./configure --prefix=<toolchain-installed-path>
make linux
export PATH=<toolchain-installed-path>/bin:$PATH
```
* Cross-compile all the required libraries:
```
# An example for libffi
git clone https://github.com/libffi/libffi
cd libffi
./configure --host=riscv64-unknown-linux-gnu --prefix=<toolchain-installed-path>/sysroot/usr
make
make install
```
* Configure and build OpenJDK:
```
bash configure \
--with-boot-jdk=$BOOT_JDK \
--openjdk-target=riscv64-linux-gnu \
--with-sysroot=<toolchain-installed-path>/sysroot \
--with-toolchain-path=<toolchain-installed-path>/bin \
--with-extra-path=<toolchain-installed-path>/bin
make images
```
### Building for musl
Just like it's possible to cross-compile for a different CPU, it's possible to
@@ -1392,12 +1330,12 @@ it.
To use, setup an icecc network, and install icecc on the build machine. Then
run `configure` using `--enable-icecc`.
### Using the javac server
### Using sjavac
To speed up compilation of Java code, especially during incremental
compilations, the javac server is automatically enabled in the configuration
step by default. To explicitly enable or disable the javac server, use either
`--enable-javac-server` or `--disable-javac-server`.
To speed up compilation of Java code, especially during incremental compilations,
the sjavac server is automatically enabled in the configuration step by default.
To explicitly enable or disable sjavac, use either `--enable-javac-server`
or `--disable-javac-server`.
### Building the Right Target

File diff suppressed because it is too large.


@@ -572,12 +572,8 @@ There are a few exceptions to this rule.
* `#include <new>` to use placement `new`, `std::nothrow`, and `std::nothrow_t`.
* `#include <limits>` to use `std::numeric_limits`.
* `#include <type_traits>` with some restrictions, listed below.
* `#include <cstddef>` to use `std::nullptr_t` and `std::max_align_t`.
Certain restrictions apply to the declarations provided by `<type_traits>`.
* The `alignof` operator should be used rather than `std::alignment_of<>`.
* `#include <type_traits>`.
* `#include <cstddef>` to use `std::nullptr_t`.
TODO: Rather than directly \#including (permitted) Standard Library
headers, use a convention of \#including wrapper headers (in some
@@ -655,51 +651,6 @@ constant members. Compilers having such bugs are no longer supported.
Except where an enum is semantically appropriate, new code should use
integral constants.
### alignas
_Alignment-specifiers_ (`alignas`
[n2341](https://www.open-std.org/jtc1/sc22/wg21/docs/papers/2007/n2341.pdf))
are permitted, with restrictions.
_Alignment-specifiers_ are permitted when the requested alignment is a
_fundamental alignment_ (not greater than `alignof(std::max_align_t)`
[C++14 3.11/2](https://www.open-std.org/jtc1/sc22/wg21/docs/papers/2014/n4296.pdf)).
_Alignment-specifiers_ with an _extended alignment_ (greater than
`alignof(std::max_align_t)`
[C++14 3.11/3](https://www.open-std.org/jtc1/sc22/wg21/docs/papers/2014/n4296.pdf))
may only be used to align variables with static or automatic storage duration
([C++14 3.7.1, 3.7.3](https://www.open-std.org/jtc1/sc22/wg21/docs/papers/2014/n4296.pdf)).
As a consequence, _over-aligned types_ are forbidden; this may change if
HotSpot updates to using C++17 or later
([p0035r4](https://www.open-std.org/jtc1/sc22/wg21/docs/papers/2016/p0035r4.html)).
Large _extended alignments_ should be avoided, particularly for stack
allocated objects. What is a large value may depend on the platform and
configuration. There may also be hard limits for some platforms.
An _alignment-specifier_ must always be applied to a definition
([C++14 10.6.2/6](https://www.open-std.org/jtc1/sc22/wg21/docs/papers/2014/n4296.pdf)).
(C++ allows an _alignment-specifier_ to optionally also be applied to a
declaration, so long as the definition has equivalent alignment. There isn't
any known benefit from duplicating the alignment in a non-definition
declaration, so such duplication should be avoided in HotSpot code.)
Enumerations are forbidden from having _alignment-specifiers_. Aligned
enumerations were originally permitted but insufficiently specified, and were
later (C++20) removed
([CWG 2354](https://cplusplus.github.io/CWG/issues/2354.html)).
Permitting such usage in HotSpot now would just cause problems in the future.
_Alignment-specifiers_ are forbidden in `typedef` and _alias-declarations_.
This may work or may have worked in some versions of some compilers, but was
later (C++14) explicitly disallowed
([CWG 1437](https://cplusplus.github.io/CWG/issues/1437.html)).
The HotSpot macro `ATTRIBUTE_ALIGNED` provides similar capabilities for
platforms that define it. This macro predates the use by HotSpot of C++
versions providing `alignas`. New code should use `alignas`.
### thread_local
Avoid use of `thread_local`
@@ -1058,37 +1009,8 @@ and other supported compilers may not have anything similar.
[p0136r1]: http://www.open-std.org/jtc1/sc22/wg21/docs/papers/2015/p0136r1.html
"p0136r1"
### Attributes
The use of some attributes
([n2761](http://www.open-std.org/jtc1/sc22/wg21/docs/papers/2008/n2761.pdf))
(listed below) is permitted. (Note that some of the attributes defined in
that paper didn't make it into the final specification.)
Attributes are syntactically permitted in a broad set of locations, but
specific attributes are only permitted in a subset of those locations. In
some cases an attribute that appertains to a given element may be placed in
any of several locations with the same meaning. In those cases HotSpot has a
preferred location.
* An attribute that appertains to a function is placed at the beginning of the
function's declaration, rather than between the function name and the parameter
list.
Only the following attributes are permitted:
* `[[noreturn]]`
The following attributes are expressly forbidden:
* `[[carries_dependency]]` - Related to `memory_order_consume`.
* `[[deprecated]]` - Not relevant in HotSpot code.
### Additional Permitted Features
* `alignof`
([n2341](https://www.open-std.org/jtc1/sc22/wg21/docs/papers/2007/n2341.pdf))
* `constexpr`
([n2235](http://www.open-std.org/jtc1/sc22/wg21/docs/papers/2007/n2235.pdf))
([n3652](https://isocpp.org/files/papers/N3652.html))
@@ -1186,6 +1108,10 @@ difficult to deal with and lead to surprises, as can destruction
ordering. HotSpot doesn't generally try to cleanup on exit, and
running destructors at exit can also lead to problems.
* `[[deprecated]]` attribute
([n3760](http://www.open-std.org/jtc1/sc22/wg21/docs/papers/2013/n3760.html)) &mdash;
Not relevant in HotSpot code.
* Avoid most operator overloading, preferring named functions. When
operator overloading is used, ensure the semantics conform to the
normal expected behavior of the operation.
@@ -1210,6 +1136,9 @@ features that have not yet been discussed.
* Member initializers and aggregates
([n3653](http://www.open-std.org/jtc1/sc22/wg21/docs/papers/2013/n3653.html))
* `[[noreturn]]` attribute
([n2761](http://www.open-std.org/jtc1/sc22/wg21/docs/papers/2008/n2761.pdf))
* Rvalue references and move semantics
[ADL]: https://en.cppreference.com/w/cpp/language/adl


@@ -5,19 +5,11 @@
<meta name="generator" content="pandoc" />
<meta name="viewport" content="width=device-width, initial-scale=1.0, user-scalable=yes" />
<title>Native/Unit Test Development Guidelines</title>
<style>
code{white-space: pre-wrap;}
span.smallcaps{font-variant: small-caps;}
div.columns{display: flex; gap: min(4vw, 1.5em);}
div.column{flex: auto; overflow-x: auto;}
div.hanging-indent{margin-left: 1.5em; text-indent: -1.5em;}
ul.task-list{list-style: none;}
ul.task-list li input[type="checkbox"] {
width: 0.8em;
margin: 0 0.8em 0.2em -1.6em;
vertical-align: middle;
}
.display.math{display: block; text-align: center; margin: 0.5rem auto;}
<style type="text/css">
code{white-space: pre-wrap;}
span.smallcaps{font-variant: small-caps;}
span.underline{text-decoration: underline;}
div.column{display: inline-block; vertical-align: top; width: 50%;}
</style>
<link rel="stylesheet" href="../make/data/docs-resources/resources/jdk-default.css" />
<!--[if lt IE 9]>
@@ -28,442 +20,174 @@
<header id="title-block-header">
<h1 class="title">Native/Unit Test Development Guidelines</h1>
</header>
<nav id="TOC" role="doc-toc">
<nav id="TOC">
<ul>
<li><a href="#good-test-properties" id="toc-good-test-properties">Good
test properties</a>
<ul>
<li><a href="#lightness" id="toc-lightness">Lightness</a></li>
<li><a href="#isolation" id="toc-isolation">Isolation</a></li>
<li><a href="#atomicity-and-self-containment"
id="toc-atomicity-and-self-containment">Atomicity and
self-containment</a></li>
<li><a href="#repeatability"
id="toc-repeatability">Repeatability</a></li>
<li><a href="#informativeness"
id="toc-informativeness">Informativeness</a></li>
<li><a href="#testing-instead-of-visiting"
id="toc-testing-instead-of-visiting">Testing instead of
visiting</a></li>
<li><a href="#nearness" id="toc-nearness">Nearness</a></li>
<li><a href="#good-test-properties">Good test properties</a><ul>
<li><a href="#lightness">Lightness</a></li>
<li><a href="#isolation">Isolation</a></li>
<li><a href="#atomicity-and-self-containment">Atomicity and self-containment</a></li>
<li><a href="#repeatability">Repeatability</a></li>
<li><a href="#informativeness">Informativeness</a></li>
<li><a href="#testing-instead-of-visiting">Testing instead of visiting</a></li>
<li><a href="#nearness">Nearness</a></li>
</ul></li>
<li><a href="#asserts" id="toc-asserts">Asserts</a>
<ul>
<li><a href="#several-checks" id="toc-several-checks">Several
checks</a></li>
<li><a href="#first-parameter-is-expected-value"
id="toc-first-parameter-is-expected-value">First parameter is expected
value</a></li>
<li><a href="#floating-point-comparison"
id="toc-floating-point-comparison">Floating-point comparison</a></li>
<li><a href="#c-string-comparison" id="toc-c-string-comparison">C string
comparison</a></li>
<li><a href="#error-messages" id="toc-error-messages">Error
messages</a></li>
<li><a href="#uncluttered-output"
id="toc-uncluttered-output">Uncluttered output</a></li>
<li><a href="#failures-propagation"
id="toc-failures-propagation">Failures propagation</a></li>
<li><a href="#asserts">Asserts</a><ul>
<li><a href="#several-checks">Several checks</a></li>
<li><a href="#first-parameter-is-expected-value">First parameter is expected value</a></li>
<li><a href="#floating-point-comparison">Floating-point comparison</a></li>
<li><a href="#c-string-comparison">C string comparison</a></li>
<li><a href="#error-messages">Error messages</a></li>
<li><a href="#uncluttered-output">Uncluttered output</a></li>
<li><a href="#failures-propagation">Failures propagation</a></li>
</ul></li>
<li><a href="#naming-and-grouping" id="toc-naming-and-grouping">Naming
and Grouping</a>
<ul>
<li><a href="#test-group-names" id="toc-test-group-names">Test group
names</a></li>
<li><a href="#filename" id="toc-filename">Filename</a></li>
<li><a href="#file-location" id="toc-file-location">File
location</a></li>
<li><a href="#test-names" id="toc-test-names">Test names</a></li>
<li><a href="#fixture-classes" id="toc-fixture-classes">Fixture
classes</a></li>
<li><a href="#friend-classes" id="toc-friend-classes">Friend
classes</a></li>
<li><a href="#oscpu-specific-tests" id="toc-oscpu-specific-tests">OS/CPU
specific tests</a></li>
<li><a href="#naming-and-grouping">Naming and Grouping</a><ul>
<li><a href="#test-group-names">Test group names</a></li>
<li><a href="#filename">Filename</a></li>
<li><a href="#file-location">File location</a></li>
<li><a href="#test-names">Test names</a></li>
<li><a href="#fixture-classes">Fixture classes</a></li>
<li><a href="#friend-classes">Friend classes</a></li>
<li><a href="#oscpu-specific-tests">OS/CPU specific tests</a></li>
</ul></li>
<li><a href="#miscellaneous" id="toc-miscellaneous">Miscellaneous</a>
<ul>
<li><a href="#hotspot-style" id="toc-hotspot-style">Hotspot
style</a></li>
<li><a href="#codetest-metrics" id="toc-codetest-metrics">Code/test
metrics</a></li>
<li><a href="#access-to-non-public-members"
id="toc-access-to-non-public-members">Access to non-public
members</a></li>
<li><a href="#death-tests" id="toc-death-tests">Death tests</a></li>
<li><a href="#external-flags" id="toc-external-flags">External
flags</a></li>
<li><a href="#test-specific-flags"
id="toc-test-specific-flags">Test-specific flags</a></li>
<li><a href="#flag-restoring" id="toc-flag-restoring">Flag
restoring</a></li>
<li><a href="#googletest-documentation"
id="toc-googletest-documentation">GoogleTest documentation</a></li>
<li><a href="#miscellaneous">Miscellaneous</a><ul>
<li><a href="#hotspot-style">Hotspot style</a></li>
<li><a href="#codetest-metrics">Code/test metrics</a></li>
<li><a href="#access-to-non-public-members">Access to non-public members</a></li>
<li><a href="#death-tests">Death tests</a></li>
<li><a href="#external-flags">External flags</a></li>
<li><a href="#test-specific-flags">Test-specific flags</a></li>
<li><a href="#flag-restoring">Flag restoring</a></li>
<li><a href="#googletest-documentation">GoogleTest documentation</a></li>
</ul></li>
<li><a href="#todo" id="toc-todo">TODO</a></li>
<li><a href="#todo">TODO</a></li>
</ul>
</nav>
<p>The purpose of these guidelines is to establish a shared vision on
what kind of native tests and how we want to develop them for Hotspot
using GoogleTest. Hence these guidelines include style items as well as
test approach items.</p>
<p>First section of this document describes properties of good tests
which are common for almost all types of test regardless of language,
framework, etc. Further sections provide recommendations to achieve
those properties and other HotSpot and/or GoogleTest specific
guidelines.</p>
<p>The purpose of these guidelines is to establish a shared vision on what kind of native tests and how we want to develop them for Hotspot using GoogleTest. Hence these guidelines include style items as well as test approach items.</p>
<p>First section of this document describes properties of good tests which are common for almost all types of test regardless of language, framework, etc. Further sections provide recommendations to achieve those properties and other HotSpot and/or GoogleTest specific guidelines.</p>
<h2 id="good-test-properties">Good test properties</h2>
<h3 id="lightness">Lightness</h3>
<p>Use the most lightweight type of tests.</p>
<p>In Hotspot, there are 3 different types of tests regarding their
dependency on a JVM, each next level is slower than previous</p>
<p>In Hotspot, there are 3 different types of tests regarding their dependency on a JVM, each next level is slower than previous</p>
<ul>
<li><p><code>TEST</code> : a test does not depend on a JVM</p></li>
<li><p><code>TEST_VM</code> : a test does depend on an initialized JVM,
but are supposed not to break a JVM, i.e. leave it in a workable
state.</p></li>
<li><p><code>TEST_OTHER_VM</code> : a test depends on a JVM and requires
a freshly initialized JVM or leaves a JVM in non-workable state</p></li>
<li><p><code>TEST_VM</code> : a test does depend on an initialized JVM, but are supposed not to break a JVM, i.e. leave it in a workable state.</p></li>
<li><p><code>TEST_OTHER_VM</code> : a test depends on a JVM and requires a freshly initialized JVM or leaves a JVM in non-workable state</p></li>
</ul>
<h3 id="isolation">Isolation</h3>
<p>Tests have to be isolated: not to have visible side-effects,
influences on other tests results.</p>
<p>Results of one test should not depend on test execution order, other
tests, otherwise it is becoming almost impossible to find out why a test
failed. Due to hotspot-specific, it is not so easy to get a full
isolation, e.g. we share an initialized JVM between all
<code>TEST_VM</code> tests, so if your test changes JVM's state too
drastically and does not change it back, you had better consider
<code>TEST_OTHER_VM</code>.</p>
<h3 id="atomicity-and-self-containment">Atomicity and
self-containment</h3>
<p>Tests should be <em>atomic</em> and <em>self-contained</em> at the
same time.</p>
<p>One test should check a particular part of a class, subsystem,
functionality, etc. Then it is quite easy to determine what parts of a
product are broken basing on test failures. On the other hand, a test
should test that part more-or-less entirely, because when one sees a
test <code>FooTest::bar</code>, they assume all aspects of bar from
<code>Foo</code> are tested.</p>
<p>However, it is impossible to cover all aspects even of a method, not
to mention a subsystem. In such cases, it is recommended to have several
tests, one for each aspect of a thing under test. For example one test
to tests how <code>Foo::bar</code> works if an argument is
<code>null</code>, another test to test how it works if an argument is
acceptable but <code>Foo</code> is not in the right state to accept it
and so on. This helps not only to make tests atomic, self-contained but
also makes test name self-descriptive (discussed in more details in <a
href="#test-names">Test names</a>).</p>
<p>Tests have to be isolated: not to have visible side-effects, influences on other tests results.</p>
<p>Results of one test should not depend on test execution order, other tests, otherwise it is becoming almost impossible to find out why a test failed. Due to hotspot-specific, it is not so easy to get a full isolation, e.g. we share an initialized JVM between all <code>TEST_VM</code> tests, so if your test changes JVM's state too drastically and does not change it back, you had better consider <code>TEST_OTHER_VM</code>.</p>
<h3 id="atomicity-and-self-containment">Atomicity and self-containment</h3>
<p>Tests should be <em>atomic</em> and <em>self-contained</em> at the same time.</p>
<p>One test should check a particular part of a class, subsystem, functionality, etc. Then it is quite easy to determine what parts of a product are broken basing on test failures. On the other hand, a test should test that part more-or-less entirely, because when one sees a test <code>FooTest::bar</code>, they assume all aspects of bar from <code>Foo</code> are tested.</p>
<p>However, it is impossible to cover all aspects even of a method, not to mention a subsystem. In such cases, it is recommended to have several tests, one for each aspect of a thing under test. For example one test to tests how <code>Foo::bar</code> works if an argument is <code>null</code>, another test to test how it works if an argument is acceptable but <code>Foo</code> is not in the right state to accept it and so on. This helps not only to make tests atomic, self-contained but also makes test name self-descriptive (discussed in more details in <a href="#test-names">Test names</a>).</p>
<h3 id="repeatability">Repeatability</h3>
<p>Tests have to be repeatable.</p>
<p>Reproducibility is very crucial for a test. No one likes sporadic
test failures, they are hard to investigate, fix and verify a fix.</p>
<p>In some cases, it is quite hard to write a 100% repeatable test,
since besides a test there can be other moving parts, e.g. in case of
<code>TEST_VM</code> there are several concurrently running threads.
Despite this, we should try to make a test as reproducible as
possible.</p>
<p>Reproducibility is very crucial for a test. No one likes sporadic test failures, they are hard to investigate, fix and verify a fix.</p>
<p>In some cases, it is quite hard to write a 100% repeatable test, since besides a test there can be other moving parts, e.g. in case of <code>TEST_VM</code> there are several concurrently running threads. Despite this, we should try to make a test as reproducible as possible.</p>
<h3 id="informativeness">Informativeness</h3>
<p>In case of a failure, a test should be as <em>informative</em> as
possible.</p>
<p>Having more information about a test failure than just compared
values can be very useful for failure troubleshooting, it can reduce or
even completely eliminate debugging hours. This is even more important
in case of not 100% reproducible failures.</p>
<p>Achieving this property, one can easily make a test too verbose, so
it will be really hard to find useful information in the ocean of
useless information. Hence they should not only think about how to
provide <a href="#error-messages">good information</a>, but also <a
href="#uncluttered-output">when to do it</a>.</p>
<p>In case of a failure, a test should be as <em>informative</em> as possible.</p>
<p>Having more information about a test failure than just compared values can be very useful for failure troubleshooting, it can reduce or even completely eliminate debugging hours. This is even more important in case of not 100% reproducible failures.</p>
<p>Achieving this property, one can easily make a test too verbose, so it will be really hard to find useful information in the ocean of useless information. Hence they should not only think about how to provide <a href="#error-messages">good information</a>, but also <a href="#uncluttered-output">when to do it</a>.</p>
<h3 id="testing-instead-of-visiting">Testing instead of visiting</h3>
<p>Tests should <em>test</em>.</p>
<p>It is not enough just to "visit" some code, a test should check that
code does that it has to do, compare return values with expected values,
check that desired side effects are done, and undesired are not, and so
on. In other words, a test should contain at least one GoogleTest
assertion and do not rely on JVM asserts.</p>
<p>Generally speaking to write a good test, one should create a model of
the system under tests, a model of possible bugs (or bugs which one
wants to find) and design tests using those models.</p>
<p>It is not enough just to &quot;visit&quot; some code, a test should check that code does that it has to do, compare return values with expected values, check that desired side effects are done, and undesired are not, and so on. In other words, a test should contain at least one GoogleTest assertion and do not rely on JVM asserts.</p>
<p>Generally speaking to write a good test, one should create a model of the system under tests, a model of possible bugs (or bugs which one wants to find) and design tests using those models.</p>
<h3 id="nearness">Nearness</h3>
<p>Prefer having checks inside test code.</p>
<p>Not only does having test logic outside, e.g. verification method,
depending on asserts in product code contradict with several items above
but also decreases tests readability and stability. It is much easier
to understand that a test is testing when all testing logic is located
inside a test or nearby in shared test libraries. As a rule of thumb,
the closer a check to a test, the better.</p>
<p>Not only does having test logic outside, e.g. verification method, depending on asserts in product code contradict with several items above but also decreases tests readability and stability. It is much easier to understand that a test is testing when all testing logic is located inside a test or nearby in shared test libraries. As a rule of thumb, the closer a check to a test, the better.</p>
<h2 id="asserts">Asserts</h2>
<h3 id="several-checks">Several checks</h3>
<p>Prefer <code>EXPECT</code> over <code>ASSERT</code> if possible.</p>
<p>This is related to the <a href="#informativeness">informativeness</a>
property of tests, information for other checks can help to better
localize a defects root-cause. One should use <code>ASSERT</code> if it
is impossible to continue test execution or if it does not make much
sense. Later in the text, <code>EXPECT</code> forms will be used to
refer to both <code>ASSERT/EXPECT</code>.</p>
<p>When it is possible to make several different checks, but impossible
to continue test execution if at least one check fails, you can use
<code>::testing::Test::HasNonfatalFailure()</code> function. The
recommended way to express that is
<code>ASSERT_FALSE(::testing::Test::HasNonfatalFailure())</code>.
Besides making it clear why a test is aborted, it also allows you to
provide more information about a failure.</p>
<h3 id="first-parameter-is-expected-value">First parameter is expected
value</h3>
<p>In all equality assertions, expected values should be passed as the
first parameter.</p>
<p>This convention is adopted by GoogleTest, and there is a slight
difference in how GoogleTest treats parameters, the most important one
is <code>null</code> detection. Due to different reasons,
<code>null</code> detection is enabled only for the first parameter,
that is to said <code>EXPECT_EQ(NULL, object)</code> checks that object
is <code>null</code>, while <code>EXPECT_EQ(object, NULL)</code> checks
that object equals to <code>NULL</code>, GoogleTest is very strict
regarding types of compared values so the latter will generates a
compile-time error.</p>
<p>This is related to the <a href="#informativeness">informativeness</a> property of tests, information for other checks can help to better localize a defects root-cause. One should use <code>ASSERT</code> if it is impossible to continue test execution or if it does not make much sense. Later in the text, <code>EXPECT</code> forms will be used to refer to both <code>ASSERT/EXPECT</code>.</p>
<p>When it is possible to make several different checks, but impossible to continue test execution if at least one check fails, you can use <code>::testing::Test::HasNonfatalFailure()</code> function. The recommended way to express that is <code>ASSERT_FALSE(::testing::Test::HasNonfatalFailure())</code>. Besides making it clear why a test is aborted, it also allows you to provide more information about a failure.</p>
<h3 id="first-parameter-is-expected-value">First parameter is expected value</h3>
<p>In all equality assertions, expected values should be passed as the first parameter.</p>
<p>This convention is adopted by GoogleTest, and there is a slight difference in how GoogleTest treats parameters, the most important one is <code>null</code> detection. Due to different reasons, <code>null</code> detection is enabled only for the first parameter, that is to said <code>EXPECT_EQ(NULL, object)</code> checks that object is <code>null</code>, while <code>EXPECT_EQ(object, NULL)</code> checks that object equals to <code>NULL</code>, GoogleTest is very strict regarding types of compared values so the latter will generates a compile-time error.</p>
<h3 id="floating-point-comparison">Floating-point comparison</h3>
<p>Use floating-point special macros to compare
<code>float/double</code> values.</p>
<p>Because of floating-point number representations and round-off
errors, regular equality comparison will not return true in most cases.
There are special <code>EXPECT_FLOAT_EQ/EXPECT_DOUBLE_EQ</code>
assertions which check that the distance between compared values is not
more than 4 ULPs, there is also <code>EXPECT_NEAR(v1, v2, eps)</code>
which checks that the absolute value of the difference between
<code>v1</code> and <code>v2</code> is not greater than
<code>eps</code>.</p>
<p>Use floating-point special macros to compare <code>float/double</code> values.</p>
<p>Because of floating-point number representations and round-off errors, regular equality comparison will not return true in most cases. There are special <code>EXPECT_FLOAT_EQ/EXPECT_DOUBLE_EQ</code> assertions which check that the distance between compared values is not more than 4 ULPs, there is also <code>EXPECT_NEAR(v1, v2, eps)</code> which checks that the absolute value of the difference between <code>v1</code> and <code>v2</code> is not greater than <code>eps</code>.</p>
<h3 id="c-string-comparison">C string comparison</h3>
<p>Use string special macros for C strings comparisons.</p>
<p><code>EXPECT_EQ</code> just compares pointers values, which is
hardly what one wants comparing C strings. GoogleTest provides
<code>EXPECT_STREQ</code> and <code>EXPECT_STRNE</code> macros to
compare C string contents. There are also case-insensitive versions
<code>EXPECT_STRCASEEQ</code>, <code>EXPECT_STRCASENE</code>.</p>
<p><code>EXPECT_EQ</code> just compares pointers values, which is hardly what one wants comparing C strings. GoogleTest provides <code>EXPECT_STREQ</code> and <code>EXPECT_STRNE</code> macros to compare C string contents. There are also case-insensitive versions <code>EXPECT_STRCASEEQ</code>, <code>EXPECT_STRCASENE</code>.</p>
<h3 id="error-messages">Error messages</h3>
<p>Provide informative, but not too verbose error messages.</p>
<p>All GoogleTest asserts print compared expressions and their values,
so there is no need to have them in error messages. Asserts print only
compared values, they do not print any of interim variables, e.g.
<code>ASSERT_TRUE((val1 == val2 &amp;&amp; isFail(foo(8)) || i == 18)</code>
prints only one value. If you use some complex predicates, please
consider <code>EXPECT_PRED*</code> or <code>EXPECT_FORMAT_PRED</code>
assertions family, they check that a predicate returns true/success and
print out all parameters values.</p>
<p>However in some cases, default information is not enough, a commonly
used example is an assert inside a loop, GoogleTest will not print
iteration values (unless it is an assert's parameter). Other
demonstrative examples are printing error code and a corresponding error
message; printing internal states which might have an impact on results.
One should add this information to assert message using
<code>&lt;&lt;</code> operator.</p>
<p>All GoogleTest asserts print compared expressions and their values, so there is no need to have them in error messages. Asserts print only compared values, they do not print any of interim variables, e.g. <code>ASSERT_TRUE((val1 == val2 &amp;&amp; isFail(foo(8)) || i == 18)</code> prints only one value. If you use some complex predicates, please consider <code>EXPECT_PRED*</code> or <code>EXPECT_FORMAT_PRED</code> assertions family, they check that a predicate returns true/success and print out all parameters values.</p>
<p>However in some cases, default information is not enough, a commonly used example is an assert inside a loop, GoogleTest will not print iteration values (unless it is an assert's parameter). Other demonstrative examples are printing error code and a corresponding error message; printing internal states which might have an impact on results. One should add this information to assert message using <code>&lt;&lt;</code> operator.</p>
<h3 id="uncluttered-output">Uncluttered output</h3>
<p>Print information only if it is needed.</p>
<p>Too verbose tests which print all information even if they pass are
very bad practice. They just pollute output, so it becomes harder to
find useful information. In order not print information till it is
really needed, one should consider saving it to a temporary buffer and
pass to an assert. <a
href="https://git.openjdk.org/jdk/blob/master/test/hotspot/gtest/gc/shared/test_memset_with_concurrent_readers.cpp"
class="uri">https://git.openjdk.org/jdk/blob/master/test/hotspot/gtest/gc/shared/test_memset_with_concurrent_readers.cpp</a>
has a good example how to do that.</p>
<p>Too verbose tests which print all information even if they pass are very bad practice. They just pollute output, so it becomes harder to find useful information. In order not print information till it is really needed, one should consider saving it to a temporary buffer and pass to an assert. <a href="https://hg.openjdk.java.net/jdk/jdk/file/tip/test/hotspot/gtest/gc/shared/test_memset_with_concurrent_readers.cpp" class="uri">https://hg.openjdk.java.net/jdk/jdk/file/tip/test/hotspot/gtest/gc/shared/test_memset_with_concurrent_readers.cpp</a> has a good example how to do that.</p>
<h3 id="failures-propagation">Failures propagation</h3>
<p>Wrap a subroutine call into <code>EXPECT_NO_FATAL_FAILURE</code>
macro to propagate failures.</p>
<p><code>ASSERT</code> and <code>FAIL</code> abort only the current
function, so if a subroutine uses them, the test will not be aborted
after the subroutine returns even if an <code>ASSERT</code> or
<code>FAIL</code> fired inside it. Call such subroutines inside the
<code>ASSERT_NO_FATAL_FAILURE</code> macro to propagate fatal failures
and abort the test. <code>(EXPECT|ASSERT)_NO_FATAL_FAILURE</code> can
also be used to provide more information.</p>
<p>For obvious reasons, there are no
<code>(EXPECT|ASSERT)_NO_NONFATAL_FAILURE</code> macros. However, if
you need to check whether a subroutine generated a nonfatal failure
(failed an <code>EXPECT</code>), you can use the
<code>::testing::Test::HasNonfatalFailure</code> function, or the
<code>::testing::Test::HasFailure</code> function to check whether a
subroutine generated any failures; see <a href="#several-checks">Several
checks</a>.</p>
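<p>A minimal sketch of both points, assuming a hypothetical
<code>Header</code> type, a <code>parse</code> function and a
<code>valid_input</code> value:</p>
<pre><code>static void check_header(const Header* h) {
  ASSERT_NE(nullptr, h);            // fatal failure, but aborts only check_header
  EXPECT_EQ(0xCAFEBABEu, h-&gt;magic); // nonfatal failure
}

TEST(Parser, accepts_valid_header) {
  const Header* h = parse(valid_input);
  // Abort the whole test if check_header hit an ASSERT.
  ASSERT_NO_FATAL_FAILURE(check_header(h));
  // Detect nonfatal (EXPECT) failures reported by the subroutine.
  EXPECT_FALSE(::testing::Test::HasNonfatalFailure())
      &lt;&lt; &quot;check_header reported EXPECT failures&quot;;
}</code></pre>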
<h2 id="naming-and-grouping">Naming and Grouping</h2>
<h3 id="test-group-names">Test group names</h3>
<p>Test group names should be in CamelCase and start and end with a
letter. A test group should be named after the tested class,
functionality, subsystem, etc.</p>
<p>This naming scheme helps to find tests, filter them, and simplifies
test failure analysis. For example: class <code>Foo</code> - test group
<code>Foo</code>, compiler logging subsystem - test group
<code>CompilerLogging</code>, G1 GC - test group <code>G1GC</code>, and
so forth.</p>
<h3 id="filename">Filename</h3>
<p>A test file must have <code>test_</code> prefix and <code>.cpp</code>
suffix.</p>
<p>Both are actually requirements from the current build system to
recognize your tests.</p>
<h3 id="file-location">File location</h3>
<p>Test file location should reflect the location of the tested part of
the product.</p>
<ul>
<li><p>All unit tests for a class from <code>foo/bar/baz.cpp</code>
should be placed in <code>foo/bar/test_baz.cpp</code> in the
<code>hotspot/test/native/</code> directory. Having all tests for a
class in one file is a common practice for unit tests; it helps to see
all existing tests at once and to share functions and/or resources
without losing encapsulation.</p></li>
<li><p>For tests which cover more than one class, the directory
hierarchy should mirror the product hierarchy, and the file name should
reflect the name of the tested subsystem/functionality. For example, if
a subsystem under test belongs to <code>gc/g1</code>, its tests should
be placed in the <code>gc/g1</code> directory.</p></li>
</ul>
<p>Please note that the framework prepends the directory name to a test
group name. For example, if <code>TEST(foo, check_this)</code> and
<code>TEST(bar, check_that)</code> are defined in the
<code>hotspot/test/native/gc/shared/test_foo.cpp</code> file, they will
be reported as <code>gc/shared/foo::check_this</code> and
<code>gc/shared/bar::check_that</code>.</p>
<h3 id="test-names">Test names</h3>
<p>Test names should be in small_snake_case and start and end with a
letter. A test name should reflect what the test checks.</p>
<p>Such naming makes tests self-descriptive and helps a lot during the
whole test life cycle. It makes it easy to do test planning and test
inventory, to see what is not tested yet, to review tests, to analyze
test failures, to evolve a test, etc. For example,
<code>foo_return_0_if_name_is_null</code> is better than
<code>foo_sanity</code>, <code>foo_basic</code> or just
<code>foo</code>, and
<code>humongous_objects_can_not_be_moved_by_young_gc</code> is better
than <code>ho_young_gc</code>.</p>
<p>Strictly speaking, using underscores is against the GoogleTest
project convention, because it can lead to illegal identifiers;
however, that rule is stricter than necessary. Restricting underscores
to test names only, and prohibiting test names that start or end with
an underscore, is enough to stay safe.</p>
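<p>A sketch of the two naming conventions together (the group, test
name and checked function below are illustrative only):</p>
<pre><code>// CamelCase group named after the tested class, snake_case descriptive test name.
TEST(Utf8Utils, length_returns_0_for_empty_string) {
  EXPECT_EQ(0u, Utf8Utils::length(&quot;&quot;));
}</code></pre>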
<h3 id="fixture-classes">Fixture classes</h3>
<p>Fixture classes should be named after the tested classes,
subsystems, etc. (follow the <a href="#test-group-names">Test group
names rule</a>) and have a <code>Test</code> suffix to prevent class
name conflicts.</p>
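<p>For instance, a fixture for a hypothetical class <code>Foo</code>
could look like this (a sketch only; the <code>initialize</code>,
<code>cleanup</code> and <code>size</code> members are assumed):</p>
<pre><code>class FooTest : public ::testing::Test {
 protected:
  Foo _foo;  // shared by all tests in this fixture
  void SetUp() override    { _foo.initialize(); }
  void TearDown() override { _foo.cleanup(); }
};

TEST_F(FooTest, starts_empty) {
  EXPECT_EQ(0u, _foo.size());
}</code></pre>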
<h3 id="friend-classes">Friend classes</h3>
<p>All test purpose friends should have either <code>Test</code> or
<code>Testable</code> suffix.</p>
<p>It greatly simplifies understanding of a friendship's purpose and
allows statically checking that private members are not exposed
unexpectedly. Having <code>FooTest</code> as a friend of
<code>Foo</code> without any comments will be understood as a necessary
evil to get testability.</p>
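<p>A sketch of the naming, assuming a hypothetical product class
<code>Foo</code>:</p>
<pre><code>class Foo {
  friend class FooTest;      // test fixture: suffix makes the purpose obvious
  friend class FooTestable;  // test-only helper that exposes internals
 private:
  int _state;
};</code></pre>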
<h3 id="oscpu-specific-tests">OS/CPU specific tests</h3>
<p>Guard OS/CPU specific tests with <code>#ifdef</code> and include the
OS/CPU name in the filename.</p>
<p>For the time being, we do not support separate directories for OS,
CPU, or OS-CPU specific tests. If we accumulate a lot of such tests, we
will change the directory layout and the build system to support that
in the same way it is done in HotSpot.</p>
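<p>For example, a Linux-only test placed in a file such as
<code>test_os_linux.cpp</code> could be guarded like this (a sketch;
<code>LINUX</code> is the macro HotSpot defines for Linux builds, and
<code>linux_specific_check()</code> is a hypothetical helper):</p>
<pre><code>#ifdef LINUX
TEST(OsLinux, epoll_backend_is_available) {
  EXPECT_TRUE(linux_specific_check());
}
#endif // LINUX</code></pre>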
<h2 id="miscellaneous">Miscellaneous</h2>
<h3 id="hotspot-style">Hotspot style</h3>
<p>Abide by the norms and rules accepted in the HotSpot style guide.</p>
<p>Tests are a part of HotSpot, so everything we use for HotSpot (where
applicable) should be used for tests as well. These guidelines cover
the test-specific things.</p>
<h3 id="codetest-metrics">Code/test metrics</h3>
<p>Coverage information and other code/test metrics are quite useful to
decide what tests should be written, what tests should be improved and
what can be removed.</p>
<p>For unit tests, a widely used and well-known coverage metric is
branch coverage, which provides good test quality with a relatively
easy test development process. For other levels of testing, branch
coverage is not as good, and one should consider other metrics, e.g.
transaction flow coverage or data flow coverage.</p>
<h3 id="access-to-non-public-members">Access to non-public members</h3>
<p>Use explicit friend class to get access to non-public members.</p>
<p>We do not use the GoogleTest macro (<code>FRIEND_TEST</code>) to
declare a friendship relation, because, from our point of view, it is
less clear than an explicit declaration.</p>
<p>Declaring a test fixture class as a friend of the tested class is
the easiest and clearest way to get access. However, it has some
disadvantages; here are some of them:</p>
<ul>
<li>Each test has to be declared as a friend</li>
<li>Subclasses do not inherit the friendship relation</li>
</ul>
<p>In other words, it is harder to share code between tests. Hence, if
you want to share code or expect it to be useful in other tests, you
should consider making the members of the tested class protected and
introducing a shared test-only class which exposes those members via
public functions, or even making the members publicly accessible right
away in the product class. If changing member visibility is not an
option, one can create a friend class which exposes the members.</p>
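<p>A sketch of the last option, assuming a hypothetical product class
<code>FreeList</code> with a private <code>_head</code> member:</p>
<pre><code>struct Node;

// In the product header:
class FreeList {
  friend class FreeListTestable;
 public:
  FreeList() : _head(nullptr) {}
 private:
  Node* _head;
};

// In test-only code: expose exactly what the tests need via public functions.
class FreeListTestable {
 public:
  static Node* head(const FreeList&amp; l) { return l._head; }
};

TEST(FreeList, starts_empty) {
  FreeList l;
  EXPECT_EQ(nullptr, FreeListTestable::head(l));
}</code></pre>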
<h3 id="death-tests">Death tests</h3>
<p>You cannot use death tests inside <code>TEST_OTHER_VM</code> and
<code>TEST_VM_ASSERT*</code>.</p>
<p>We tried to make the HotSpot-GoogleTest integration as transparent
as possible; however, due to the current implementation of
<code>TEST_OTHER_VM</code> and <code>TEST_VM_ASSERT*</code> tests, you
cannot use death test functionality in them. These tests are
implemented as GoogleTest death tests, and GoogleTest does not allow a
death test inside another death test.</p>
<h3 id="external-flags">External flags</h3>
<p>Passing external flags to a tested JVM is not supported.</p>
<p>The rationale for this design decision is to simplify both the tests
and the test framework, and to avoid failures related to incompatible
flag combinations until there is a good solution for that. However,
there are cases when one wants to test a JVM with a specific flag
combination; the <code>_JAVA_OPTIONS</code> environment variable can be
used to do that. Flags from <code>_JAVA_OPTIONS</code> will be used in
<code>TEST_VM</code>, <code>TEST_OTHER_VM</code> and
<code>TEST_VM_ASSERT*</code> tests.</p>
<h3 id="test-specific-flags">Test-specific flags</h3>
<p>Passing flags to a tested JVM in <code>TEST_OTHER_VM</code> and
<code>TEST_VM_ASSERT*</code> should be possible, but is not implemented
yet.</p>
<p>A facility to pass test-specific flags is needed for system,
regression or other types of tests which require a fully initialized
JVM in some particular configuration, e.g. with Serial GC selected.
There is no support for such tests now; however, there is a plan to add
it in upcoming releases.</p>
<p>For now, if a test depends on flag values, it should have
<code>if (!&lt;flag&gt;) { return }</code> guards at the very beginning
and a <code>@requires</code> comment, similar to the jtreg
<code>@requires</code> directive, right before the test macros. <a
href="https://git.openjdk.org/jdk/blob/master/test/hotspot/gtest/gc/g1/test_g1IHOPControl.cpp"
class="uri">https://git.openjdk.org/jdk/blob/master/test/hotspot/gtest/gc/g1/test_g1IHOPControl.cpp</a>
has an example of this temporary workaround. It is important to follow
that pattern as it allows us to easily find all such tests and update
them as soon as a flag passing facility is implemented.</p>
<p>In the long term, we expect jtreg to support GoogleTest tests as
first-class citizens, that is to say, jtreg will parse <span
class="citation" data-cites="requires">@requires</span> comments and
filter out inapplicable tests.</p>
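<p>A sketch of this temporary workaround, assuming the HotSpot-specific
<code>TEST_VM</code> macro and the <code>UseG1GC</code> flag are
available in the test's context:</p>
<pre><code>// @requires UseG1GC
TEST_VM(G1GC, region_count_is_positive) {
  if (!UseG1GC) {
    return; // flag guard: skip quietly when the required configuration is absent
  }
  // ... checks that need a fully initialized JVM running G1 ...
}</code></pre>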
<h3 id="flag-restoring">Flag restoring</h3>
<p>Restore changed flags.</p>
<p>It is quite common for tests to configure the JVM in a certain way
by changing flag values. GoogleTest provides two ways to set up the
environment before a test and restore it afterward: using either a
constructor and destructor or the <code>SetUp</code> and
<code>TearDown</code> functions. Both ways require a test fixture
class, which is sometimes too wordy. Simpler facilities such as the
<code>FLAG_GUARD</code> macro or the <code>*FlagSetting</code> classes
can be used in such cases to set and restore values.</p>
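<p>A minimal sketch of the fixture-based approach, assuming a
hypothetical boolean flag global <code>UseFeatureX</code>:</p>
<pre><code>class FeatureXTest : public ::testing::Test {
 protected:
  bool _saved;
  void SetUp() override {
    _saved = UseFeatureX;  // remember the original value
    UseFeatureX = true;    // configure the JVM for the tests in this fixture
  }
  void TearDown() override {
    UseFeatureX = _saved;  // always restore, even if a test failed
  }
};</code></pre>
<p>The <code>FLAG_GUARD</code> macro and the <code>*FlagSetting</code>
classes mentioned above achieve the same save-and-restore effect with
less boilerplate.</p>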
<p>Caveats:</p>
<ul>
<li><p>Changing a flag's value could break invariants between flag
values and hence could lead to an unexpected/unsupported JVM
state.</p></li>
<li><p><code>FLAG_SET_*</code> macros can change more than one flag (in
order to maintain invariants), so it is hard to predict which flags
will be changed, and that makes restoring all changed flags a
nontrivial task. Thus, if one uses <code>FLAG_SET_*</code> macros, they
should use the <code>TEST_OTHER_VM</code> test type.</p></li>
</ul>
<h3 id="googletest-documentation">GoogleTest documentation</h3>
<p>In case you have any questions regarding GoogleTest itself, its
asserts, test declaration macros, other macros, etc, please consult its
documentation.</p>
<h2 id="todo">TODO</h2>
<p>Although this document provides guidelines on the most important
parts of test development using GTest, it still misses a few items:</p>
<ul>
<li><p>Examples, esp for <a href="#access-to-non-public-members">access
to non-public members</a></p></li>
<li><p>test types: purpose, drawbacks, limitation</p>
<ul>
<li><code>TEST_VM</code></li>
<li><code>TEST_VM_F</code></li>
<li><code>TEST_VM_ASSERT</code></li>
<li><code>TEST_VM_ASSERT_MSG</code></li>
</ul></li>
<li><p>Miscellaneous</p>
<li>Miscellaneous
<ul>
<li>Test libraries
<ul>
<li>how to run tests in random order</li>
<li>how to run only specific tests</li>
<li>how to run each test separately</li>
<li>check that a test can find bugs it is supposed to by introducing
them</li>
</ul></li>
<li>mocks/stubs/dependency injection</li>
<li>setUp/tearDown

View File

@@ -100,7 +100,7 @@ Generally speaking to write a good test, one should create a model of
the system under tests, a model of possible bugs (or bugs which one
wants to find) and design tests using those models.
### Nearness
### Nearness
Prefer having checks inside test code.
@@ -156,7 +156,7 @@ that the distance between compared values is not more than 4 ULPs,
there is also `EXPECT_NEAR(v1, v2, eps)` which checks that the absolute
value of the difference between `v1` and `v2` is not greater than `eps`.
### C string comparison
### C string comparison
Use string special macros for C strings comparisons.
@@ -194,7 +194,7 @@ very bad practice. They just pollute output, so it becomes harder to
find useful information. In order not print information till it is
really needed, one should consider saving it to a temporary buffer and
pass to an assert.
<https://git.openjdk.org/jdk/blob/master/test/hotspot/gtest/gc/shared/test_memset_with_concurrent_readers.cpp>
<https://hg.openjdk.java.net/jdk/jdk/file/tip/test/hotspot/gtest/gc/shared/test_memset_with_concurrent_readers.cpp>
has a good example how to do that.
### Failures propagation
@@ -229,7 +229,7 @@ test failure analysis. For example, class `Foo` - test group `Foo`,
compiler logging subsystem - test group `CompilerLogging`, G1 GC — test
group `G1GC`, and so forth.
### Filename
### Filename
A test file must have `test_` prefix and `.cpp` suffix.
@@ -283,7 +283,7 @@ Fixture classes should be named after tested classes, subsystems, etc
(follow [Test group names rule](#test-group-names)) and have
`Test` suffix to prevent class name conflicts.
### Friend classes
### Friend classes
All test purpose friends should have either `Test` or `Testable` suffix.
@@ -303,7 +303,7 @@ the same way it is done in hotspot.
## Miscellaneous
### Hotspot style
### Hotspot style
Abide the norms and rules accepted in Hotspot style guide.
@@ -383,7 +383,7 @@ upcoming releases.
For now, if a test depends on flags values, it should have `if
(!<flag>) { return }` guards in the very beginning and `@requires`
comment similar to jtreg `@requires` directive right before test macros.
<https://git.openjdk.org/jdk/blob/master/test/hotspot/gtest/gc/g1/test_g1IHOPControl.cpp>
<https://hg.openjdk.java.net/jdk/jdk/file/tip/test/hotspot/gtest/gc/g1/test_g1IHOPControl.cpp>
has an example of this temporary workaround. It is important to follow
that pattern as it allows us to easily find all such tests and update
them as soon as there is an implementation of flag passing facility.
@@ -392,7 +392,7 @@ In long-term, we expect jtreg to support GoogleTest tests as first
class citizens, that is to say, jtreg will parse @requires comments
and filter out inapplicable tests.
### Flag restoring
### Flag restoring
Restore changed flags.
@@ -404,7 +404,7 @@ require to use a test fixture class, which sometimes is too wordy. The
simpler facilities like `FLAG_GUARD` macro or `*FlagSetting` classes could
be used in such cases to restore/set values.
Caveats:
Caveats:
* Changing a flags value could break the invariants between flags' values and hence could lead to unexpected/unsupported JVM state.

View File

@@ -5,19 +5,11 @@
<meta name="generator" content="pandoc" />
<meta name="viewport" content="width=device-width, initial-scale=1.0, user-scalable=yes" />
<title>IDE support in the JDK</title>
<style>
code{white-space: pre-wrap;}
span.smallcaps{font-variant: small-caps;}
div.columns{display: flex; gap: min(4vw, 1.5em);}
div.column{flex: auto; overflow-x: auto;}
div.hanging-indent{margin-left: 1.5em; text-indent: -1.5em;}
ul.task-list{list-style: none;}
ul.task-list li input[type="checkbox"] {
width: 0.8em;
margin: 0 0.8em 0.2em -1.6em;
vertical-align: middle;
}
.display.math{display: block; text-align: center; margin: 0.5rem auto;}
<style type="text/css">
code{white-space: pre-wrap;}
span.smallcaps{font-variant: small-caps;}
span.underline{text-decoration: underline;}
div.column{display: inline-block; vertical-align: top; width: 50%;}
</style>
<link rel="stylesheet" href="../make/data/docs-resources/resources/jdk-default.css" />
<!--[if lt IE 9]>
@@ -28,144 +20,41 @@
<header id="title-block-header">
<h1 class="title">IDE support in the JDK</h1>
</header>
<nav id="TOC" role="doc-toc">
<nav id="TOC">
<ul>
<li><a href="#introduction" id="toc-introduction">Introduction</a>
<ul>
<li><a href="#ide-support-for-native-code"
id="toc-ide-support-for-native-code">IDE support for native
code</a></li>
<li><a href="#ide-support-for-java-code"
id="toc-ide-support-for-java-code">IDE support for Java code</a></li>
<li><a href="#introduction">Introduction</a><ul>
<li><a href="#ide-support-for-native-code">IDE support for native code</a></li>
<li><a href="#ide-support-for-java-code">IDE support for Java code</a></li>
</ul></li>
</ul>
</nav>
<h2 id="introduction">Introduction</h2>
<p>When you are familiar with building and testing the JDK, you may want
to configure an IDE to work with the source code. The instructions
differ a bit depending on whether you are interested in working with the
native (C/C++) or the Java code.</p>
<h3 id="ide-support-for-native-code">IDE support for native code</h3>
<p>There are a few ways to generate IDE configuration for the native
sources, depending on which IDE to use.</p>
<h4 id="visual-studio-code">Visual Studio Code</h4>
<p>The make system can generate a <a
href="https://code.visualstudio.com">Visual Studio Code</a> workspace
that has C/C++ source indexing configured correctly, as well as launcher
targets for tests and the Java launcher. After configuring, a workspace
for the configuration can be generated using:</p>
<pre class="shell"><code>make vscode-project</code></pre>
<p>This creates a file called <code>jdk.code-workspace</code> in the
build output folder. The full location will be printed after the
workspace has been generated. To use it, choose
<code>File -&gt; Open Workspace...</code> in Visual Studio Code.</p>
<h5 id="alternative-indexers">Alternative indexers</h5>
<p>The main <code>vscode-project</code> target configures the default
C++ support in Visual Studio Code. There are also other source indexers
that can be installed, that may provide additional features. It's
currently possible to generate configuration for two such indexers, <a
href="https://clang.llvm.org/extra/clangd/">clangd</a> and <a
href="https://github.com/Andersbakken/rtags">rtags</a>. These can be
configured by appending the name of the indexer to the make target, such
as:</p>
<pre class="shell"><code>make vscode-project-clangd</code></pre>
<p>Additional instructions for configuring the given indexer will be
displayed after the workspace has been generated.</p>
<h4 id="visual-studio">Visual Studio</h4>
<p>The make system can generate a Visual Studio project for the Hotspot
native source. After configuring, the project is generated using:</p>
<pre class="shell"><code>make hotspot-ide-project</code></pre>
<p>This creates a file named <code>jvm.vcxproj</code> in
<code>ide\hotspot-visualstudio</code> subfolder of the build output
folder. The file can be opened in Visual Studio via
<code>File -&gt; Open -&gt; Project/Solution</code>.</p>
<h4 id="eclipse-cdt">Eclipse CDT</h4>
<p>The make system can generate an Eclipse CDT Workspace that enables
Eclipse indexing for the C and C++ sources throughout the entire
codebase, as well as registering all common make targets to be runnable
from the Eclipse explorer. This can be done after configuring by
running:</p>
<pre><code>make eclipse-native-env</code></pre>
<p>After this is run, simply open and import the workspace in Eclipse
through
<code>File -&gt; Import -&gt; Projects from Folder or Archive</code> and
at <code>Import source</code> click on the directory
<code>ide\eclipse</code>, which can be found in the build output
folder.</p>
<p>If this doesn't work, you can also try
<code>File -&gt; Import -&gt; Existing Projects into Workspace</code>
instead.</p>
<p>Setting up an Eclipse Workspace is relatively lightweight compared to
other supported IDEs, but requires that your CDT installation has Cross
GCC support enabled at the moment, even if you aren't cross compiling.
The Visual C++ compiler is, at present, not supported as an indexer.</p>
<p>If desired, you can instead request make to only include indexing
support for just the Java Virtual Machine instead of the entire native
codebase, by running:</p>
<pre><code>make eclipse-hotspot-env</code></pre>
<p>If you think your particular Eclipse installation can handle the
strain, the make system also supports generating a combined Java and
C/C++ Workspace for Eclipse which can then conveniently switch between
Java and C/C++ natures during development by running:</p>
<pre><code>make eclipse-mixed-env</code></pre>
<p>Do note that this generates all features that come with both Java and
C/C++ natures.</p>
<p>By default, the Eclipse Workspace is located in the ide subdirectory
in the build output. To share the JDK's source directory with the
Eclipse Workspace, you can instead run:</p>
<pre><code>make eclipse-shared-&lt;ENV&gt;-env</code></pre>
<p>Eclipse support in the JDK is relatively new, so do keep in mind that
not everything may work at the moment. As such, the resulting Workspace
also has compilation database parsing support enabled, so you can pass
Eclipse the compile commands file (see below) if all else fails.</p>
<h4 id="compilation-database">Compilation Database</h4>
<p>The make system can generate generic native code indexing support in
the form of a <a
href="https://clang.llvm.org/docs/JSONCompilationDatabase.html">Compilation
Database</a> that can be used by many different IDEs and source code
indexers.</p>
<pre class="shell"><code>make compile-commands</code></pre>
<p>It's also possible to generate the Compilation Database for the
HotSpot source code only, which is a bit faster as it includes less
information.</p>
<pre class="shell"><code>make compile-commands-hotspot</code></pre>
<h3 id="ide-support-for-java-code">IDE support for Java code</h3>
<h4 id="intellij-idea">IntelliJ IDEA</h4>
<p>The JDK project has a script that can be used for indexing the
project with IntelliJ. After configuring and building the JDK, an
IntelliJ workspace can be generated by running the following command in
the top-level folder of the cloned repository:</p>
<pre class="shell"><code>bash bin/idea.sh</code></pre>
<p>To use it, choose <code>File -&gt; Open...</code> in IntelliJ and
select the folder where you ran the above script.</p>
<p>Next, configure the project SDK in IntelliJ. Open
<code>File -&gt; Project Structure -&gt; Project</code> and select
<code>build/&lt;config&gt;/images/jdk</code> as the SDK to use.</p>
<p>In order to run the tests from the IDE, you can use the JTReg plugin.
Instructions for building and using the plugin can be found <a
href="https://github.com/openjdk/jtreg/tree/master/plugins/idea">here</a>.</p>
<h4 id="eclipse">Eclipse</h4>
<p>Eclipse JDT is a widely used Java IDE and has been for a very long
time, being a popular choice alongside IntelliJ IDEA for Java
development. Likewise, the JDK now includes support for developing its
Java sources with Eclipse, which can be achieved by setting up a Java
Workspace by running:</p>
<pre><code>make eclipse-java-env</code></pre>
<p>After the workspace has been generated you can import it in the same
way as you would with Eclipse CDT:</p>
<p>Follow
<code>File -&gt; Import -&gt; Projects from Folder or Archive</code> and
select the <code>ide\eclipse</code> directory in the build output folder
to import the newly created Java Workspace.</p>
<p>If doing so results in an error, you can also import the JDK via
<code>File -&gt; Import -&gt; Existing Projects into Workspace</code> as
a last resort.</p>
<p>Alternatively, if you want a Java Workspace inside the JDK's source
directory, you can instead run:</p>
<pre><code>make eclipse-shared-java-env</code></pre>
<p>As mentioned above for Eclipse CDT, you can create a combined Java
and C/C++ Workspace which can conveniently switch between Java and C/C++
natures during development by running:</p>
<pre><code>make eclipse-mixed-env</code></pre>
</body>
</html>

View File

@@ -56,63 +56,6 @@ This creates a file named `jvm.vcxproj` in `ide\hotspot-visualstudio`
subfolder of the build output folder. The file can be opened in Visual Studio
via `File -> Open -> Project/Solution`.
#### Eclipse CDT
The make system can generate an Eclipse CDT Workspace that enables Eclipse
indexing for the C and C++ sources throughout the entire codebase, as well as
registering all common make targets to be runnable from the Eclipse explorer.
This can be done after configuring by running:
```
make eclipse-native-env
```
After this is run, simply open and import the workspace in Eclipse through
`File -> Import -> Projects from Folder or Archive` and at
`Import source` click on the directory `ide\eclipse`, which can be
found in the build output folder.
If this doesn't work, you can also try
`File -> Import -> Existing Projects into Workspace`
instead.
Setting up an Eclipse Workspace is relatively lightweight compared to other
supported IDEs, but requires that your CDT installation has Cross GCC support
enabled at the moment, even if you aren't cross compiling. The Visual C++
compiler is, at present, not supported as an indexer.
If desired, you can instead request make to only include indexing support for
just the Java Virtual Machine instead of the entire native codebase, by running:
```
make eclipse-hotspot-env
```
If you think your particular Eclipse installation can handle the strain, the
make system also supports generating a combined Java and C/C++ Workspace for
Eclipse which can then conveniently switch between Java and C/C++ natures
during development by running:
```
make eclipse-mixed-env
```
Do note that this generates all features that come with both Java and C/C++
natures.
By default, the Eclipse Workspace is located in the ide subdirectory in the
build output. To share the JDK's source directory with the Eclipse Workspace,
you can instead run:
```
make eclipse-shared-<ENV>-env
```
Eclipse support in the JDK is relatively new, so do keep in mind that not
everything may work at the moment. As such, the resulting Workspace also
has compilation database parsing support enabled, so you can pass Eclipse
the compile commands file (see below) if all else fails.
#### Compilation Database
The make system can generate generic native code indexing support in the form of
@@ -153,40 +96,3 @@ as the SDK to use.
In order to run the tests from the IDE, you can use the JTReg plugin.
Instructions for building and using the plugin can be found
[here](https://github.com/openjdk/jtreg/tree/master/plugins/idea).
#### Eclipse
Eclipse JDT is a widely used Java IDE and has been for a very long time, being
a popular choice alongside IntelliJ IDEA for Java development. Likewise, the
JDK now includes support for developing its Java sources with Eclipse, which
can be achieved by setting up a Java Workspace by running:
```
make eclipse-java-env
```
After the workspace has been generated you can import it in the same way as
you would with Eclipse CDT:
Follow `File -> Import -> Projects from Folder or Archive` and select the
`ide\eclipse` directory in the build output folder to import the newly created
Java Workspace.
If doing so results in an error, you can also import the JDK via
`File -> Import -> Existing Projects into Workspace`
as a last resort.
Alternatively, if you want a Java Workspace inside the JDK's source directory,
you can instead run:
```
make eclipse-shared-java-env
```
As mentioned above for Eclipse CDT, you can create a combined Java and C/C++
Workspace which can conveniently switch between Java and C/C++ natures during
development by running:
```
make eclipse-mixed-env
```

View File

@@ -5,99 +5,54 @@
<meta name="generator" content="pandoc" />
<meta name="viewport" content="width=device-width, initial-scale=1.0, user-scalable=yes" />
<title>Testing the JDK</title>
<style>
code{white-space: pre-wrap;}
span.smallcaps{font-variant: small-caps;}
div.columns{display: flex; gap: min(4vw, 1.5em);}
div.column{flex: auto; overflow-x: auto;}
div.hanging-indent{margin-left: 1.5em; text-indent: -1.5em;}
ul.task-list{list-style: none;}
ul.task-list li input[type="checkbox"] {
width: 0.8em;
margin: 0 0.8em 0.2em -1.6em;
vertical-align: middle;
}
.display.math{display: block; text-align: center; margin: 0.5rem auto;}
<style type="text/css">
code{white-space: pre-wrap;}
span.smallcaps{font-variant: small-caps;}
span.underline{text-decoration: underline;}
div.column{display: inline-block; vertical-align: top; width: 50%;}
</style>
<link rel="stylesheet" href="../make/data/docs-resources/resources/jdk-default.css" />
<style type="text/css">pre, code, tt { color: #1d6ae5; }</style>
<!--[if lt IE 9]>
<script src="//cdnjs.cloudflare.com/ajax/libs/html5shiv/3.7.3/html5shiv-printshiv.min.js"></script>
<![endif]-->
<style type="text/css">pre, code, tt { color: #1d6ae5; }</style>
</head>
<body>
<header id="title-block-header">
<h1 class="title">Testing the JDK</h1>
</header>
<nav id="TOC" role="doc-toc">
<nav id="TOC">
<ul>
<li><a href="#overview" id="toc-overview">Overview</a></li>
<li><a href="#running-tests-locally-with-make-test"
id="toc-running-tests-locally-with-make-test">Running tests locally with
<code>make test</code></a>
<ul>
<li><a href="#configuration"
id="toc-configuration">Configuration</a></li>
<li><a href="#using-make-test-the-run-test-framework">Using &quot;make test&quot; (the run-test framework)</a><ul>
<li><a href="#configuration">Configuration</a></li>
</ul></li>
<li><a href="#test-selection" id="toc-test-selection">Test selection</a>
<ul>
<li><a href="#common-test-groups" id="toc-common-test-groups">Common
Test Groups</a></li>
<li><a href="#jtreg" id="toc-jtreg">JTReg</a></li>
<li><a href="#gtest" id="toc-gtest">Gtest</a></li>
<li><a href="#microbenchmarks"
id="toc-microbenchmarks">Microbenchmarks</a></li>
<li><a href="#special-tests" id="toc-special-tests">Special
tests</a></li>
<li><a href="#test-selection">Test selection</a><ul>
<li><a href="#common-test-groups">Common Test Groups</a></li>
<li><a href="#jtreg">JTReg</a></li>
<li><a href="#gtest">Gtest</a></li>
<li><a href="#microbenchmarks">Microbenchmarks</a></li>
<li><a href="#special-tests">Special tests</a></li>
</ul></li>
<li><a href="#test-results-and-summary"
id="toc-test-results-and-summary">Test results and summary</a></li>
<li><a href="#test-suite-control" id="toc-test-suite-control">Test suite
control</a>
<ul>
<li><a href="#general-keywords-test_opts"
id="toc-general-keywords-test_opts">General keywords
(TEST_OPTS)</a></li>
<li><a href="#jtreg-keywords" id="toc-jtreg-keywords">JTReg
keywords</a></li>
<li><a href="#gtest-keywords" id="toc-gtest-keywords">Gtest
keywords</a></li>
<li><a href="#microbenchmark-keywords"
id="toc-microbenchmark-keywords">Microbenchmark keywords</a></li>
<li><a href="#test-results-and-summary">Test results and summary</a></li>
<li><a href="#test-suite-control">Test suite control</a><ul>
<li><a href="#general-keywords-test_opts">General keywords (TEST_OPTS)</a></li>
<li><a href="#jtreg-keywords">JTReg keywords</a></li>
<li><a href="#gtest-keywords">Gtest keywords</a></li>
<li><a href="#microbenchmark-keywords">Microbenchmark keywords</a></li>
</ul></li>
<li><a href="#notes-for-specific-tests"
id="toc-notes-for-specific-tests">Notes for Specific Tests</a>
<ul>
<li><a href="#docker-tests" id="toc-docker-tests">Docker Tests</a></li>
<li><a href="#non-us-locale" id="toc-non-us-locale">Non-US
locale</a></li>
<li><a href="#pkcs11-tests" id="toc-pkcs11-tests">PKCS11 Tests</a></li>
<li><a href="#client-ui-tests" id="toc-client-ui-tests">Client UI
Tests</a></li>
<li><a href="#notes-for-specific-tests">Notes for Specific Tests</a><ul>
<li><a href="#docker-tests">Docker Tests</a></li>
<li><a href="#non-us-locale">Non-US locale</a></li>
<li><a href="#pkcs11-tests">PKCS11 Tests</a></li>
<li><a href="#client-ui-tests">Client UI Tests</a></li>
</ul></li>
<li><a href="#editing-this-document"
id="toc-editing-this-document">Editing this document</a></li>
<li><a href="#editing-this-document">Editing this document</a></li>
</ul>
</nav>
<h2 id="overview">Overview</h2>
<p>The bulk of JDK tests use <a
href="https://openjdk.org/jtreg/">jtreg</a>, a regression test framework
and test runner built for the JDK's specific needs. Other test
frameworks are also used. The different test frameworks can be executed
directly, but there is also a set of make targets intended to simplify
the interface, and figure out how to run your tests for you.</p>
<h2 id="running-tests-locally-with-make-test">Running tests locally with
<code>make test</code></h2>
<p>This is the easiest way to get started. Assuming you've built the JDK
locally, execute:</p>
<pre><code>$ make test</code></pre>
<p>This will run a default set of tests against the JDK, and present you
with the results. <code>make test</code> is part of a family of
test-related make targets which simplify running tests, because they
invoke the various test frameworks for you. The "make test framework" is
simple to start with, but more complex ad-hoc combination of tests is
also possible. You can always invoke the test frameworks directly if you
want even more control.</p>
<h2 id="using-make-test-the-run-test-framework">Using &quot;make test&quot; (the run-test framework)</h2>
<p>This new way of running tests is developer-centric. It assumes that you have built a JDK locally and want to test it. Running common test targets is simple, and more complex ad-hoc combinations of tests are possible. The user interface is forgiving, and clearly reports errors it cannot resolve.</p>
<p>Some example command-lines:</p>
<pre><code>$ make test-tier1
$ make test-jdk_lang JTREG=&quot;JOBS=8&quot;
@@ -107,214 +62,51 @@ $ make test TEST=&quot;hotspot:hotspot_gc&quot; JTREG=&quot;JOBS=1;TIMEOUT_FACTO
$ make test TEST=&quot;jtreg:test/hotspot:hotspot_gc test/hotspot/jtreg/native_sanity/JniVersion.java&quot;
$ make test TEST=&quot;micro:java.lang.reflect&quot; MICRO=&quot;FORK=1;WARMUP_ITER=2&quot;
$ make exploded-test TEST=tier2</code></pre>
<p>"tier1" and "tier2" refer to tiered testing, see further down. "TEST"
is a test selection argument which the make test framework will use to
try to find the tests you want. It iterates over the available test
frameworks, and if the test isn't present in one, it tries the next one.
The main target <code>test</code> uses the jdk-image as the tested
product. There is also an alternate target <code>exploded-test</code>
that uses the exploded image instead. Not all tests will run
successfully on the exploded image, but using this target can greatly
improve rebuild times for certain workflows.</p>
<p>Previously, <code>make test</code> was used to invoke an old system
for running tests, and <code>make run-test</code> was used for the new
test framework. For backward compatibility with scripts and muscle
memory, <code>run-test</code> and variants like
<code>exploded-run-test</code> or <code>run-test-tier1</code> are kept
as aliases.</p>
<h3 id="configuration">Configuration</h3>
<p>To be able to run JTReg tests, <code>configure</code> needs to know
where to find the JTReg test framework. If it is not picked up
automatically by configure, use the
<code>--with-jtreg=&lt;path to jtreg home&gt;</code> option to point to
the JTReg framework. Note that this option should point to the JTReg
home, i.e. the top directory, containing <code>lib/jtreg.jar</code> etc.
(An alternative is to set the <code>JT_HOME</code> environment variable
to point to the JTReg home before running <code>configure</code>.)</p>
<p>To be able to run microbenchmarks, <code>configure</code> needs to
know where to find the JMH dependency. Use
<code>--with-jmh=&lt;path to JMH jars&gt;</code> to point to a directory
containing the core JMH and transitive dependencies. The recommended
dependencies can be retrieved by running
<code>sh make/devkit/createJMHBundle.sh</code>, after which
<code>--with-jmh=build/jmh/jars</code> should work.</p>
<p>When tests fail or timeout, jtreg runs its failure handler to capture
necessary data from the system where the test was run. This data can
then be used to analyze the test failures. Collecting this data involves
running various commands (which are listed in files residing in
<code>test/failure_handler/src/share/conf</code>) and some of these
commands use <code>sudo</code>. If the system's <code>sudoers</code>
file isn't configured to allow running these commands, then it can
result in password being prompted during the failure handler execution.
Typically, when running locally, collecting this additional data isn't
always necessary. To disable running the failure handler, use
<code>--enable-jtreg-failure-handler=no</code> when running
<code>configure</code>. If, however, you want to let the failure handler
to run and don't want to be prompted for sudo password, then you can
configure your <code>sudoers</code> file appropriately. Please read the
necessary documentation of your operating system to see how to do that;
here we only show one possible way of doing that - edit the
<code>/etc/sudoers.d/sudoers</code> file to include the following
line:</p>
<pre><code>johndoe ALL=(ALL) NOPASSWD: /sbin/dmesg</code></pre>
<p>This line configures <code>sudo</code> to <em>not</em> prompt for
password for the <code>/sbin/dmesg</code> command (this is one of the
commands that is listed in the files at
<code>test/failure_handler/src/share/conf</code>), for the user
<code>johndoe</code>. Here <code>johndoe</code> is the user account
under which the jtreg tests are run. Replace the username with a
relevant user account of your system.</p>
<h2 id="test-selection">Test selection</h2>
<p>All functionality is available using the <code>test</code> make
target. In this use case, the test or tests to be executed are
controlled using the <code>TEST</code> variable. To speed up subsequent
test runs with no source code changes, <code>test-only</code> can be
used instead, which does not depend on the source and test image
build.</p>
<p>For some common top-level tests, direct make targets have been
generated. This includes all JTReg test groups, the hotspot gtest, and
custom tests (if present). This means that <code>make test-tier1</code>
is equivalent to <code>make test TEST="tier1"</code>, but the latter is
more tab-completion friendly. For more complex test runs, the
<code>test TEST="x"</code> solution needs to be used.</p>
<p>The test specifications given in <code>TEST</code> are parsed into
fully qualified test descriptors, which clearly and unambiguously show
which tests will be run. As an example, <code>:tier1</code> will expand
to
<code>jtreg:$(TOPDIR)/test/hotspot/jtreg:tier1 jtreg:$(TOPDIR)/test/jdk:tier1 jtreg:$(TOPDIR)/test/langtools:tier1 jtreg:$(TOPDIR)/test/nashorn:tier1 jtreg:$(TOPDIR)/test/jaxp:tier1</code>.
You can always submit a list of fully qualified test descriptors in the
<code>TEST</code> variable if you want to shortcut the parser.</p>
<h3 id="common-test-groups">Common Test Groups</h3>
<p>Ideally, all tests are run for every change but this may not be
practical due to the limited testing resources, the scope of the change,
etc.</p>
<p>The source tree currently defines a few common test groups in the
relevant <code>TEST.groups</code> files. There are test groups that
cover a specific component, for example <code>hotspot_gc</code>. It is
a good idea to look into <code>TEST.groups</code> files to get a sense
of what tests are relevant to a particular JDK component.</p>
<p>Component-specific tests may miss some unintended consequences of a
change, so other tests should also be run. Again, it might be
impractical to run all tests, and therefore <em>tiered</em> test groups
exist. Tiered test groups are not component-specific, but rather cover
the significant parts of the entire JDK.</p>
<p>Multiple tiers allow balancing test coverage and testing costs. Lower
test tiers are supposed to contain the simpler, quicker and more stable
tests. Higher tiers are supposed to contain progressively more thorough,
slower, and sometimes less stable tests, or the tests that require
special configuration.</p>
<p>Contributors are expected to run the tests for the areas that are
changed, and the first N tiers they can afford to run, but at least
tier1.</p>
<p>A brief description of the tiered test groups:</p>
<ul>
<li><p><code>tier1</code>: This is the lowest test tier. Multiple
developers run these tests every day. Because of the widespread use, the
tests in <code>tier1</code> are carefully selected and optimized to run
fast, and to run in the most stable manner. The test failures in
<code>tier1</code> are usually followed up on quickly, either with
fixes, or by adding the relevant tests to the problem list. GitHub Actions
workflows, if enabled, run <code>tier1</code> tests.</p></li>
<li><p><code>tier2</code>: This test group covers even more ground.
These contain, among other things, tests that either run for too long to
be at <code>tier1</code>, or may require special configuration, or tests
that are less stable, or cover the broader range of non-core JVM and JDK
features/components (for example, XML).</p></li>
<li><p><code>tier3</code>: This test group includes more stressful
tests, the tests for corner cases not covered by previous tiers, plus
the tests that require GUIs. As such, this suite should either be run
with low concurrency (<code>TEST_JOBS=1</code>), or without headful
tests (<code>JTREG_KEYWORDS=\!headful</code>), or both; see the example
after this list.</p></li>
<li><p><code>tier4</code>: This test group includes every other test not
covered by previous tiers. It includes, for example,
<code>vmTestbase</code> suites for Hotspot, which run for many hours
even on large machines. It also runs GUI tests, so the same
<code>TEST_JOBS</code> and <code>JTREG_KEYWORDS</code> caveats
apply.</p></li>
</ul>
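<p>As an illustration of the guidance above (the component group chosen
here is only an example), a change to GC code could be tested with the
following, the second line showing the tier3 caveats in practice:</p>
<pre><code>$ make test TEST=&quot;hotspot_gc tier1&quot;
$ make test TEST=&quot;tier3&quot; TEST_JOBS=1 JTREG_KEYWORDS=\!headful</code></pre>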
<h3 id="jtreg">JTReg</h3>
<p>JTReg tests can be selected either by picking a JTReg test group, or
a selection of files or directories containing JTReg tests.
Documentation can be found at <a
href="https://openjdk.org/jtreg/">https://openjdk.org/jtreg/</a>, note
especially the extensive <a
href="https://openjdk.org/jtreg/faq.html">FAQ</a>.</p>
<p>JTReg test groups can be specified either without a test root, e.g.
<code>:tier1</code> (or <code>tier1</code>, the initial colon is
optional), or with, e.g. <code>hotspot:tier1</code>,
<code>test/jdk:jdk_util</code> or
<code>$(TOPDIR)/test/hotspot/jtreg:hotspot_all</code>. The test root can
be specified either as an absolute path, or a path relative to the JDK
top directory, or the <code>test</code> directory. For simplicity, the
hotspot JTReg test root, which really is <code>hotspot/jtreg</code>, can
be abbreviated as just <code>hotspot</code>.</p>
<p>When specified without a test root, all matching groups from all test
roots will be added. Otherwise, only the group from the specified test
root will be added.</p>
<p>Individual JTReg tests or directories containing JTReg tests can also
be specified, like
<code>test/hotspot/jtreg/native_sanity/JniVersion.java</code> or
<code>hotspot/jtreg/native_sanity</code>. Just like for test root
selection, you can either specify an absolute path (which can even point
to JTReg tests outside the source tree), or a path relative to either
the JDK top directory or the <code>test</code> directory.
<code>hotspot</code> can be used as an alias for
<code>hotspot/jtreg</code> here as well.</p>
<p>As long as the test groups or test paths can be uniquely resolved,
you do not need to enter the <code>jtreg:</code> prefix. If this is not
possible, or if you want to use a fully qualified test descriptor, add
<code>jtreg:</code>, e.g.
<code>jtreg:test/hotspot/jtreg/native_sanity</code>.</p>
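<p>Some illustrative JTReg selections, using the groups and paths
mentioned above:</p>
<pre><code>$ make test TEST=&quot;hotspot:tier1&quot;
$ make test TEST=&quot;test/jdk:jdk_util&quot;
$ make test TEST=&quot;test/hotspot/jtreg/native_sanity/JniVersion.java&quot;
$ make test TEST=&quot;jtreg:test/hotspot/jtreg/native_sanity&quot;</code></pre>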
<h3 id="gtest">Gtest</h3>
<p><strong>Note:</strong> To be able to run the Gtest suite, you need to
configure your build to be able to find a proper version of the gtest
source. For details, see the section <a
href="building.html#running-tests">"Running Tests" in the build
documentation</a>.</p>
<p>Since the Hotspot Gtest suite is so quick, the default is to run all
tests. This is specified by just <code>gtest</code>, or as a fully
qualified test descriptor <code>gtest:all</code>.</p>
<p>If you want, you can single out an individual test or a group of
tests, for instance <code>gtest:LogDecorations</code> or
<code>gtest:LogDecorations.level_test_vm</code>. This can be
particularly useful if you want to run a shaky test repeatedly.</p>
<p>For Gtest, there is a separate test suite for each JVM variant. The
JVM variant is defined by adding <code>/&lt;variant&gt;</code> to the
test descriptor, e.g. <code>gtest:Log/client</code>. If you specify no
variant, gtest will run once for each JVM variant present (e.g. server,
client). So if you only have the server JVM present, then
<code>gtest:all</code> will be equivalent to
<code>gtest:all/server</code>.</p>
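<p>For example, using the test names mentioned above:</p>
<pre><code>$ make test TEST=&quot;gtest&quot;
$ make test TEST=&quot;gtest:LogDecorations&quot;
$ make test TEST=&quot;gtest:LogDecorations/server&quot;</code></pre>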
<h3 id="microbenchmarks">Microbenchmarks</h3>
<p>Which microbenchmarks to run is selected using a regular expression
following the <code>micro:</code> test descriptor, e.g.,
<code>micro:java.lang.reflect</code>. This delegates the test selection
to JMH, meaning package name, class name and even benchmark method names
can be used to select tests.</p>
<p>Using special characters like <code>|</code> in the regular
expression is possible, but needs to be escaped multiple times:
<code>micro:ArrayCopy\\\\\|reflect</code>.</p>
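<p>For example (the benchmark selection is arbitrary, and the second
line reuses the escaping shown above):</p>
<pre><code>$ make test TEST=&quot;micro:java.lang.reflect&quot;
$ make test TEST=micro:ArrayCopy\\\\\|reflect</code></pre>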
<h3 id="special-tests">Special tests</h3>
<p>A handful of odd tests that are not covered by any other testing
framework are accessible using the <code>special:</code> test
descriptor. Currently, this includes <code>failure-handler</code> and
<code>make</code>.</p>
<ul>
<li><p>Failure handler testing is run using
<code>special:failure-handler</code> or just
<code>failure-handler</code> as test descriptor.</p></li>
<li><p>Tests for the build system, including both makefiles and related
functionality, are run using <code>special:make</code> or just
<code>make</code> as test descriptor. This is equivalent to
<code>special:make:all</code>.</p>
<p>A specific make test can be run by supplying it as an argument, e.g.
<code>special:make:idea</code>. As a special syntax, this can also be
expressed as <code>make-idea</code>, which allows for command lines such
as <code>make test-make-idea</code>.</p></li>
</ul>
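<p>Putting the <code>special:</code> descriptors above together, some
example invocations:</p>
<pre><code>$ make test TEST=&quot;special:failure-handler&quot;
$ make test TEST=&quot;special:make:idea&quot;
$ make test-make-idea</code></pre>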
<h2 id="test-results-and-summary">Test results and summary</h2>
<p>At the end of the test run, a summary of all tests run will be
presented. This will have a consistent look, regardless of what test
suites were used. This is a sample summary:</p>
<pre><code>==============================
Test summary
==============================
   TEST                              TOTAL  PASS  FAIL ERROR
   jtreg:nashorn/test:tier1            133   133     0     0
==============================
TEST FAILURE</code></pre>
<p>Tests where the number of TOTAL tests does not equal the number of
PASSed tests will be considered a test failure. These are marked with
the <code>&gt;&gt; ... &lt;&lt;</code> marker for easy
identification.</p>
<p>The classification of non-passed tests differs a bit between test
suites. In the summary, ERROR is used as a catch-all for tests that
neither passed nor are classified as failed by the framework. This might
indicate a test framework error, a timeout, or other problems.</p>
<p>In case of test failures, <code>make test</code> will exit with a
non-zero exit value.</p>
<p>All tests have their result stored in
<code>build/$BUILD/test-results/$TEST_ID</code>, where TEST_ID is a
path-safe conversion from the fully qualified test descriptor, e.g. for
<code>jtreg:jdk/test:tier1</code> the TEST_ID is
<code>jtreg_jdk_test_tier1</code>. This path is also printed in the log
at the end of the test run.</p>
<p>Additional work data is stored in
<code>build/$BUILD/test-support/$TEST_ID</code>. For some frameworks,
this directory might contain information that is useful in determining
the cause of a failed test.</p>
<h2 id="test-suite-control">Test suite control</h2>
<p>It is possible to control various aspects of the test suites using
make control variables.</p>
<p>These variables use a keyword=value approach to allow multiple values
to be set. So, for instance,
<code>JTREG="JOBS=1;TIMEOUT_FACTOR=8"</code> will set the JTReg
concurrency level to 1 and the timeout factor to 8. This is equivalent
to setting <code>JTREG_JOBS=1 JTREG_TIMEOUT_FACTOR=8</code>, but using
the keyword format means that the <code>JTREG</code> variable is parsed
and verified for correctness, so <code>JTREG="TMIEOUT_FACTOR=8"</code>
would give an error, while <code>JTREG_TMIEOUT_FACTOR=8</code> would
just pass unnoticed.</p>
<p>To separate multiple keyword=value pairs, use <code>;</code>
(semicolon). Since the shell normally eats <code>;</code>, the
recommended usage is to write the assignment inside quotes, e.g.
<code>JTREG="...;..."</code>. This will also make sure spaces are
preserved, as in
<code>JTREG="JAVA_OPTIONS=-XshowSettings -Xlog:gc+ref=debug"</code>.</p>
<p>(Other ways are possible, e.g. using backslash:
<code>JTREG=JOBS=1\;TIMEOUT_FACTOR=8</code>. Also, as a special
technique, the string <code>%20</code> will be replaced with space for
certain options, e.g.
<code>JTREG=JAVA_OPTIONS=-XshowSettings%20-Xlog:gc+ref=debug</code>.
This can be useful if you have layers of scripts and have trouble
getting proper quoting of command line arguments through.)</p>
<p>As far as possible, the names of the keywords have been standardized
between test suites.</p>
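<p>For example, combining a test selection with suite control variables
(the values are arbitrary):</p>
<pre><code>$ make test TEST=&quot;tier1&quot; JTREG=&quot;JOBS=1;TIMEOUT_FACTOR=8&quot;
$ make test TEST=&quot;tier1&quot; JTREG=&quot;JAVA_OPTIONS=-XshowSettings -Xlog:gc+ref=debug&quot;</code></pre>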
<h3 id="general-keywords-test_opts">General keywords (TEST_OPTS)</h3>
<p>Some keywords are valid across different test suites. If you want to
run tests from multiple test suites, or simply don't want to keep track
of which test-suite-specific control variable to use, then you can use the
general TEST_OPTS control variable.</p>
<p>There are also some keywords that apply globally to the test runner
system, rather than to any specific test suite. These are also available as
TEST_OPTS keywords.</p>
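<p>A sketch of using the general control variable rather than a
suite-specific one (the keyword values are arbitrary):</p>
<pre><code>$ make test TEST=&quot;tier1 gtest:all&quot; TEST_OPTS=&quot;JOBS=1;TIMEOUT_FACTOR=8&quot;</code></pre>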
<h4 id="jobs">JOBS</h4>
<p>Currently only applies to JTReg.</p>
<h4 id="timeout_factor">TIMEOUT_FACTOR</h4>
<h4 id="aot_modules">AOT_MODULES</h4>
<p>Applies to JTReg and GTest.</p>
<h4 id="jcov">JCOV</h4>
<p>This keyword applies globally to the test runner system. If set to
<code>true</code>, it enables JCov coverage reporting for all tests run.
To be useful, the JDK under test must be run with a JDK built with JCov
instrumentation
(<code>configure --with-jcov=&lt;path to directory containing lib/jcov.jar&gt;</code>,
<code>make jcov-image</code>).</p>
<p>The simplest way to run tests with JCov coverage report is to use the
special target <code>jcov-test</code> instead of <code>test</code>, e.g.
<code>make jcov-test TEST=jdk_lang</code>. This will make sure the JCov
image is built, and that JCov reporting is enabled.</p>
<p>The JCov report is stored in
<code>build/$BUILD/test-results/jcov-output/report</code>.</p>
<p>Please note that running with JCov reporting can be very memory
intensive.</p>
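<p>A minimal sketch of a JCov-enabled run, assuming the JCov
distribution (the directory containing <code>lib/jcov.jar</code>) is
unpacked under the hypothetical path <code>/opt/jcov</code>:</p>
<pre><code>$ bash configure --with-jcov=/opt/jcov
$ make jcov-test TEST=jdk_lang</code></pre>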
<h4 id="jcov_diff_changeset">JCOV_DIFF_CHANGESET</h4>
<p>While collecting code coverage with JCov, it is also possible to find
coverage for only recently changed code. JCOV_DIFF_CHANGESET specifies a
source revision. A textual report will be generated showing coverage of
the diff between the specified revision and the repository tip.</p>
<p>The report is stored in the
<code>build/$BUILD/test-results/jcov-output/diff_coverage_report</code>
file.</p>
<h3 id="jtreg-keywords">JTReg keywords</h3>
<h4 id="jobs-1">JOBS</h4>
<p>The test concurrency (<code>-concurrency</code>).</p>
<p>Defaults to TEST_JOBS (if set by <code>--with-test-jobs=</code>),
otherwise it defaults to JOBS, except for Hotspot, where the default is
<em>number of CPU cores/2</em>, but never more than <em>memory size in
GB/2</em>.</p>
<h4 id="timeout_factor-1">TIMEOUT_FACTOR</h4>
<p>The timeout factor (<code>-timeoutFactor</code>).</p>
<p>Defaults to 4.</p>
<h4 id="failure_handler_timeout">FAILURE_HANDLER_TIMEOUT</h4>
<p>Sets the argument <code>-timeoutHandlerTimeout</code> for JTReg. The
default value is 0. This is only valid if the failure handler is
built.</p>
<h4 id="jtreg_test_thread_factory">JTREG_TEST_THREAD_FACTORY</h4>
<p>Sets the <code>-testThreadFactory</code> for JTReg. It should be the
fully qualified classname of a class which implements
<code>java.util.concurrent.ThreadFactory</code>. One such implementation
class, named Virtual, is currently part of the JDK build in the
<code>test/jtreg_test_thread_factory/</code> directory. This class gets
compiled during the test image build. The implementation of the Virtual
class creates a new virtual thread for executing each test class.</p>
<h4 id="test_mode">TEST_MODE</h4>
<p>The test mode (<code>agentvm</code> or <code>othervm</code>).</p>
<p>Defaults to <code>agentvm</code>.</p>
<h4 id="assert">ASSERT</h4>
<p>Enable asserts (<code>-ea -esa</code>, or none).</p>
<p>Set to <code>true</code> or <code>false</code>. If true, adds
<code>-ea -esa</code>. Defaults to true, except for hotspot.</p>
<h4 id="verbose">VERBOSE</h4>
<p>The verbosity level (<code>-verbose</code>).</p>
<p>Defaults to <code>fail,error,summary</code>.</p>
<h4 id="retain">RETAIN</h4>
<p>What test data to retain (<code>-retain</code>).</p>
<p>Defaults to <code>fail,error</code>.</p>
<h4 id="max_mem">MAX_MEM</h4>
<p>Limit memory consumption (<code>-Xmx</code> and
<code>-vmoption:-Xmx</code>, or none).</p>
<p>Limit memory consumption for JTReg test framework and VM under test.
Set to 0 to disable the limits.</p>
<p>Defaults to 512m, except for hotspot, where it defaults to 0 (no
limit).</p>
<h4 id="max_output">MAX_OUTPUT</h4>
<p>Set the property <code>javatest.maxOutputSize</code> for the
launcher, to change the default JTReg log limit.</p>
<h4 id="keywords">KEYWORDS</h4>
<p>JTReg keywords sent to JTReg using <code>-k</code>. Please be careful
in making sure that spaces and special characters (like <code>!</code>)
are properly quoted. To avoid some issues, the special value
<code>%20</code> can be used instead of space.</p>
<h4 id="extra_problem_lists">EXTRA_PROBLEM_LISTS</h4>
<p>Use an additional problem list file or files, in addition to the
default ProblemList.txt located at the JTReg test roots.</p>
<p>If multiple file names are specified, they should be separated by
space (or, to help avoid quoting issues, the special value
<code>%20</code>).</p>
<p>The file names should be either absolute, or relative to the JTReg
test root of the tests to be run.</p>
<h4 id="run_problem_lists">RUN_PROBLEM_LISTS</h4>
<p>Use the problem lists to select tests instead of excluding them.</p>
<p>Set to <code>true</code> or <code>false</code>. If <code>true</code>,
JTReg will use the <code>-match:</code> option, otherwise
<code>-exclude:</code> will be used. Default is <code>false</code>.</p>
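<p>For example, with a hypothetical extra problem list file named
<code>ProblemList-backup.txt</code> placed in the JTReg test root:</p>
<pre><code>$ make test TEST=&quot;tier1&quot; JTREG=&quot;EXTRA_PROBLEM_LISTS=ProblemList-backup.txt&quot;
$ make test TEST=&quot;tier1&quot; JTREG=&quot;RUN_PROBLEM_LISTS=true&quot;</code></pre>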
<h4 id="options">OPTIONS</h4>
<p>Additional options to the JTReg test framework.</p>
<p>Use <code>JTREG="OPTIONS=--help all"</code> to see all available
JTReg options.</p>
<h4 id="java_options-1">JAVA_OPTIONS</h4>
<p>Additional Java options for running test classes (sent to JTReg as
<code>-javaoption</code>).</p>
<h4 id="vm_options-1">VM_OPTIONS</h4>
<p>Additional Java options to be used when compiling and running classes
(sent to JTReg as <code>-vmoption</code>).</p>
<p>This option is only needed in special circumstances. To pass Java
options to your test classes, use <code>JAVA_OPTIONS</code>.</p>
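<p>For example, passing extra Java options to the test classes with
<code>JAVA_OPTIONS</code>, using the <code>%20</code> escape described
earlier to avoid quoting issues:</p>
<pre><code>$ make test TEST=&quot;tier1&quot; JTREG=&quot;JAVA_OPTIONS=-XshowSettings%20-Xlog:gc+ref=debug&quot;</code></pre>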
<h4 id="launcher_options">LAUNCHER_OPTIONS</h4>
<p>Additional Java options that are sent to the java launcher that
starts the JTReg harness.</p>
<h4 id="aot_modules-1">AOT_MODULES</h4>
<p>Generate AOT modules before testing for the specified module, or set
of modules. If multiple modules are specified, they should be separated
by space (or, to help avoid quoting issues, the special value
<code>%20</code>).</p>
<h4 id="retry_count">RETRY_COUNT</h4>
<p>Retry failed tests up to a set number of times, until they pass. This
allows tests with intermittent failures to pass. Defaults to 0.</p>
<h4 id="repeat_count">REPEAT_COUNT</h4>
<p>Repeat the tests up to a set number of times, stopping at first
failure. This helps to reproduce intermittent test failures. Defaults to
0.</p>
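<p>For instance, to chase an intermittent failure in a single test (the
test path reuses an earlier example):</p>
<pre><code>$ make test TEST=&quot;test/hotspot/jtreg/native_sanity/JniVersion.java&quot; JTREG=&quot;REPEAT_COUNT=10&quot;
$ make test TEST=&quot;test/hotspot/jtreg/native_sanity/JniVersion.java&quot; JTREG=&quot;RETRY_COUNT=2&quot;</code></pre>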
<h4 id="report">REPORT</h4>
<p>Use this report style when reporting test results (sent to JTReg as
<code>-report</code>). Defaults to <code>files</code>.</p>
<h3 id="gtest-keywords">Gtest keywords</h3>
<h4 id="repeat">REPEAT</h4>
<p>The number of times to repeat the tests
(<code>--gtest_repeat</code>).</p>
<p>Default is 1. Set to -1 to repeat indefinitely. This can be
especially useful combined with
<code>OPTIONS=--gtest_break_on_failure</code> to reproduce an
intermittent problem.</p>
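<p>For example, to rerun a shaky Gtest until it breaks (the test name is
taken from the Gtest section above):</p>
<pre><code>$ make test TEST=&quot;gtest:LogDecorations&quot; GTEST=&quot;REPEAT=-1;OPTIONS=--gtest_break_on_failure&quot;</code></pre>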
<h4 id="options-1">OPTIONS</h4>
<p>Additional options to the Gtest test framework.</p>
<p>Use <code>GTEST="OPTIONS=--help"</code> to see all available Gtest
options.</p>
<h4 id="aot_modules-2">AOT_MODULES</h4>
<p>Generate AOT modules before testing for the specified module, or set
of modules. If multiple modules are specified, they should be separated
by space (or, to help avoid quoting issues, the special value
<code>%20</code>).</p>
<h3 id="microbenchmark-keywords">Microbenchmark keywords</h3>
<h4 id="fork">FORK</h4>
<p>Override the number of benchmark forks to spawn. Same as specifying
<code>-f &lt;num&gt;</code>.</p>
<h4 id="iter">ITER</h4>
<p>Number of measurement iterations per fork. Same as specifying
<code>-i &lt;num&gt;</code>.</p>
<h4 id="time">TIME</h4>
<p>Amount of time to spend in each measurement iteration, in seconds.
Same as specifying <code>-r &lt;num&gt;</code>.</p>
<h4 id="warmup_iter">WARMUP_ITER</h4>
<p>Number of warmup iterations to run before the measurement phase in
each fork. Same as specifying <code>-wi &lt;num&gt;</code>.</p>
<h4 id="warmup_time">WARMUP_TIME</h4>
<p>Amount of time to spend in each warmup iteration. Same as specifying
<code>-w &lt;num&gt;</code>.</p>
<h4 id="results_format">RESULTS_FORMAT</h4>
<p>Specify to have the test run save a log of the values. Accepts the
same values as <code>-rff</code>, i.e., <code>text</code>,
<code>csv</code>, <code>scsv</code>, <code>json</code>, or
<code>latex</code>.</p>
<h4 id="vm_options-2">VM_OPTIONS</h4>
<p>Additional VM arguments to provide to forked-off VMs. Same as
<code>-jvmArgs &lt;args&gt;</code>.</p>
<h4 id="options-2">OPTIONS</h4>
<p>Additional arguments to send to JMH.</p>
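<p>For example, combining several of the keywords above (the benchmark
selection and values are arbitrary):</p>
<pre><code>$ make test TEST=&quot;micro:java.lang.reflect&quot; MICRO=&quot;FORK=1;WARMUP_ITER=2;ITER=5;TIME=2;RESULTS_FORMAT=json&quot;</code></pre>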
<h2 id="notes-for-specific-tests">Notes for Specific Tests</h2>
<h3 id="docker-tests">Docker Tests</h3>
<p>Docker tests with default parameters may fail on systems with glibc
versions not compatible with the one used in the default docker image
(e.g., Oracle Linux 7.6 for x86). For example, they pass on Ubuntu 16.04
but fail on Ubuntu 18.04 if run like this on x86:</p>
<pre><code>$ make test TEST=&quot;jtreg:test/hotspot/jtreg/containers/docker&quot;</code></pre>
<p>To run these tests correctly on Ubuntu 18.04, additional parameters
selecting the correct docker image need to be passed using
<code>JAVA_OPTIONS</code>.</p>
<pre><code>$ make test TEST=&quot;jtreg:test/hotspot/jtreg/containers/docker&quot; \
JTREG=&quot;JAVA_OPTIONS=-Djdk.test.docker.image.name=ubuntu
-Djdk.test.docker.image.version=latest&quot;</code></pre>
<h3 id="non-us-locale">Non-US locale</h3>
<p>If your locale is non-US, some tests are likely to fail. To work
around this you can set the locale to US. On Unix platforms simply
setting <code>LANG="en_US"</code> in the environment before running
tests should work. On Windows or MacOS, setting
<code>JTREG="VM_OPTIONS=-Duser.language=en -Duser.country=US"</code>
helps for most, but not all test cases.</p>
<p>For example:</p>
<pre><code>$ export LANG=&quot;en_US&quot; &amp;&amp; make test TEST=...
$ make test JTREG=&quot;VM_OPTIONS=-Duser.language=en -Duser.country=US&quot; TEST=...</code></pre>
<h3 id="pkcs11-tests">PKCS11 Tests</h3>
<p>It is highly recommended to use the latest NSS version when running
PKCS11 tests. An improper NSS version may lead to unexpected failures
that are hard to diagnose. For example,
sun/security/pkcs11/Secmod/AddTrustedCert.java may fail on Ubuntu 18.04
with the default NSS version in the system. To run these tests
correctly, the system property <code>test.nss.lib.paths</code> is
required on Ubuntu 18.04 to specify the alternative NSS lib
directories.</p>
<p>For example:</p>
<pre><code>$ make test TEST=&quot;jtreg:sun/security/pkcs11/Secmod/AddTrustedCert.java&quot; \
JTREG=&quot;JAVA_OPTIONS=-Dtest.nss.lib.paths=/path/to/your/latest/NSS-libs&quot;</code></pre>
<p>For more notes about the PKCS11 tests, please refer to
test/jdk/sun/security/pkcs11/README.</p>
<h3 id="client-ui-tests">Client UI Tests</h3>
<h4 id="system-key-shortcuts">System key shortcuts</h4>
<p>Some Client UI tests use key sequences which may be reserved by the
operating system. Usually that causes test failures, so it is highly
recommended to disable system key shortcuts prior to testing. The steps
to access and disable system key shortcuts for various platforms are
provided below.</p>
<h5 id="macos">macOS</h5>
<p>Choose Apple menu; System Preferences, click Keyboard, then click
Shortcuts; select or deselect desired shortcut.</p>
<p>For example,
test/jdk/javax/swing/TooltipManager/JMenuItemToolTipKeyBindingsTest/JMenuItemToolTipKeyBindingsTest.java
fails on macOS because it uses the <code>CTRL + F1</code> key sequence to
show or hide the tooltip message, but that key combination is reserved by
the operating system. To run the test correctly, the default global key
shortcut should be disabled using the steps described above: deselect
the "Turn keyboard access on or off" option, which is responsible for
the <code>CTRL + F1</code> combination.</p>
<h5 id="linux">Linux</h5>
<p>Open the Activities overview and start typing Settings; Choose
Settings, click Devices, then click Keyboard; set or override desired
shortcut.</p>
<h5 id="windows">Windows</h5>
<p>Type <code>gpedit</code> in the Search and then click Edit group
policy; navigate to User Configuration -&gt; Administrative Templates
-&gt; Windows Components -&gt; File Explorer; in the right-side pane
look for "Turn off Windows key hotkeys" and double click on it; enable
or disable hotkeys.</p>
<p>Note: restart is required to make the settings take effect.</p>
<h4 id="robot-api">Robot API</h4>
<p>Most automated Client UI tests use <code>Robot</code> API to control
the UI. Usually, the default operating system settings need to be
adjusted for Robot to work correctly. The detailed steps on how to
access and update these settings for different platforms are provided
below.</p>
<h5 id="macos-1">macOS</h5>
<p><code>Robot</code> is not permitted to control your Mac by default
since macOS 10.15. To allow it, choose Apple menu -&gt; System Settings,
click Privacy &amp; Security; then click Accessibility and ensure the
following apps are allowed to control your computer: <em>Java</em> and
<em>Terminal</em>. If the tests are run from an IDE, the IDE should be
granted this permission too.</p>
<h5 id="windows-1">Windows</h5>
<p>On Windows, if a Cygwin terminal is used to run the tests, there is a
delay in focus transfer, which usually causes automated UI test failures. To
disable the delay, type <code>regedit</code> in the Search and then
select Registry Editor; navigate to the following key:
<code>HKEY_CURRENT_USER\Control Panel\Desktop</code>; make sure the
<code>ForegroundLockTimeout</code> value is set to 0.</p>
<p>Additional information about Client UI tests configuration for
various operating systems can be obtained at <a
href="https://wiki.openjdk.org/display/ClientLibs/Automated+client+GUI+testing+system+set+up+requirements">Automated
client GUI testing system set up requirements</a>.</p>
<h2 id="editing-this-document">Editing this document</h2>
<p>If you want to contribute changes to this document, edit
<code>doc/testing.md</code> and then run
<code>make update-build-docs</code> to generate the same changes in
<code>doc/testing.html</code>.</p>
</body>
</html>

View File

@@ -1,26 +1,21 @@
% Testing the JDK
## Overview
## Using "make test" (the run-test framework)
The bulk of JDK tests use [jtreg](https://openjdk.org/jtreg/), a regression
test framework and test runner built for the JDK's specific needs. Other test
frameworks are also used. The different test frameworks can be executed
directly, but there is also a set of make targets intended to simplify the
interface, and figure out how to run your tests for you.
This new way of running tests is developer-centric. It assumes that you have
built a JDK locally and want to test it. Running common test targets is simple,
and more complex ad-hoc combination of tests is possible. The user interface is
forgiving, and clearly report errors it cannot resolve.
## Running tests locally with `make test`
The main target `test` uses the jdk-image as the tested product. There is
also an alternate target `exploded-test` that uses the exploded image
instead. Not all tests will run successfully on the exploded image, but using
this target can greatly improve rebuild times for certain workflows.
This is the easiest way to get started. Assuming you've built the JDK locally,
execute:
$ make test
This will run a default set of tests against the JDK, and present you with the
results. `make test` is part of a family of test-related make targets which
simplify running tests, because they invoke the various test frameworks for
you. The "make test framework" is simple to start with, but more complex
ad-hoc combination of tests is also possible. You can always invoke the test
frameworks directly if you want even more control.
Previously, `make test` was used to invoke an old system for running tests, and
`make run-test` was used for the new test framework. For backward compatibility
with scripts and muscle memory, `run-test` (and variants like
`exploded-run-test` or `run-test-tier1`) are kept as aliases.
Some example command-lines:
@@ -33,20 +28,6 @@ Some example command-lines:
$ make test TEST="micro:java.lang.reflect" MICRO="FORK=1;WARMUP_ITER=2"
$ make exploded-test TEST=tier2
"tier1" and "tier2" refer to tiered testing, see further down. "TEST" is a
test selection argument which the make test framework will use to try to
find the tests you want. It iterates over the available test frameworks, and
if the test isn't present in one, it tries the next one. The main target
`test` uses the jdk-image as the tested product. There is also an alternate
target `exploded-test` that uses the exploded image instead. Not all tests
will run successfully on the exploded image, but using this target can
greatly improve rebuild times for certain workflows.
Previously, `make test` was used to invoke an old system for running tests,
and `make run-test` was used for the new test framework. For backward
compatibility with scripts and muscle memory, `run-test` and variants like
`exploded-run-test` or `run-test-tier1` are kept as aliases.
### Configuration
To be able to run JTReg tests, `configure` needs to know where to find the
@@ -110,58 +91,54 @@ if you want to shortcut the parser.
### Common Test Groups
Ideally, all tests are run for every change but this may not be practical due
to the limited testing resources, the scope of the change, etc.
Ideally, all tests are run for every change but this may not be practical due to the limited
testing resources, the scope of the change, etc.
The source tree currently defines a few common test groups in the relevant
`TEST.groups` files. There are test groups that cover a specific component,
for example `hotspot_gc`. It is a good idea to look into `TEST.groups` files
to get a sense what tests are relevant to a particular JDK component.
The source tree currently defines a few common test groups in the relevant `TEST.groups`
files. There are test groups that cover a specific component, for example `hotspot_gc`.
It is a good idea to look into `TEST.groups` files to get a sense what tests are relevant
to a particular JDK component.
Component-specific tests may miss some unintended consequences of a change, so
other tests should also be run. Again, it might be impractical to run all
tests, and therefore
_tiered_ test groups exist. Tiered test groups are not component-specific, but
rather cover the significant parts of the entire JDK.
Component-specific tests may miss some unintended consequences of a change, so other
tests should also be run. Again, it might be impractical to run all tests, and therefore
_tiered_ test groups exist. Tiered test groups are not component-specific, but rather cover
the significant parts of the entire JDK.
Multiple tiers allow balancing test coverage and testing costs. Lower test
tiers are supposed to contain the simpler, quicker and more stable tests.
Higher tiers are supposed to contain progressively more thorough, slower, and
sometimes less stable tests, or the tests that require special
configuration.
Multiple tiers allow balancing test coverage and testing costs. Lower test tiers are supposed to
contain the simpler, quicker and more stable tests. Higher tiers are supposed to contain
progressively more thorough, slower, and sometimes less stable tests, or the tests that require
special configuration.
Contributors are expected to run the tests for the areas that are changed, and
the first N tiers they can afford to run, but at least tier1.
Contributors are expected to run the tests for the areas that are changed, and the first N tiers
they can afford to run, but at least tier1.
A brief description of the tiered test groups:
- `tier1`: This is the lowest test tier. Multiple developers run these tests
every day. Because of the widespread use, the tests in `tier1` are
carefully selected and optimized to run fast, and to run in the most stable
manner. The test failures in `tier1` are usually followed up on quickly,
either with fixes, or adding relevant tests to problem list. GitHub Actions
workflows, if enabled, run `tier1` tests.
- `tier1`: This is the lowest test tier. Multiple developers run these tests every day.
Because of the widespread use, the tests in `tier1` are carefully selected and optimized to run
fast, and to run in the most stable manner. The test failures in `tier1` are usually followed up
on quickly, either with fixes, or adding relevant tests to problem list. GitHub Actions workflows,
if enabled, run `tier1` tests.
- `tier2`: This test group covers even more ground. These contain, among other
things, tests that either run for too long to be at `tier1`, or may require
special configuration, or tests that are less stable, or cover the broader
range of non-core JVM and JDK features/components(for example, XML).
- `tier2`: This test group covers even more ground. These contain, among other things,
tests that either run for too long to be at `tier1`, or may require special configuration,
or tests that are less stable, or cover the broader range of non-core JVM and JDK features/components
(for example, XML).
- `tier3`: This test group includes more stressful tests, the tests for corner
cases not covered by previous tiers, plus the tests that require GUIs. As
such, this suite should either be run with low concurrency
(`TEST_JOBS=1`), or without headful tests(`JTREG_KEYWORDS=\!headful`), or
both.
- `tier3`: This test group includes more stressful tests, the tests for corner cases
not covered by previous tiers, plus the tests that require GUIs. As such, this suite
should either be run with low concurrency (`TEST_JOBS=1`), or without headful tests
(`JTREG_KEYWORDS=\!headful`), or both.
- `tier4`: This test group includes every other test not covered by previous
tiers. It includes, for example, `vmTestbase` suites for Hotspot, which run
for many hours even on large machines. It also runs GUI tests, so the same
`TEST_JOBS` and `JTREG_KEYWORDS` caveats apply.
- `tier4`: This test group includes every other test not covered by previous tiers. It includes,
for example, `vmTestbase` suites for Hotspot, which run for many hours even on large
machines. It also runs GUI tests, so the same `TEST_JOBS` and `JTREG_KEYWORDS` caveats
apply.
### JTReg
JTReg tests can be selected either by picking a JTReg test group, or a selection
of files or directories containing JTReg tests. Documentation can be found at
of files or directories containing JTReg tests. Documentation can be found at
[https://openjdk.org/jtreg/](https://openjdk.org/jtreg/), note especially the
extensive [FAQ](https://openjdk.org/jtreg/faq.html).
@@ -192,11 +169,6 @@ use a fully qualified test descriptor, add `jtreg:`, e.g.
### Gtest
**Note:** To be able to run the Gtest suite, you need to configure your build to
be able to find a proper version of the gtest source. For details, see the
section ["Running Tests" in the build
documentation](building.html#running-tests).
Since the Hotspot Gtest suite is so quick, the default is to run all tests.
This is specified by just `gtest`, or as a fully qualified test descriptor
`gtest:all`.
@@ -378,15 +350,6 @@ Defaults to 4.
Sets the argument `-timeoutHandlerTimeout` for JTReg. The default value is 0.
This is only valid if the failure handler is built.
#### JTREG_TEST_THREAD_FACTORY
Sets the `-testThreadFactory` for JTReg. It should be the fully qualified classname
of a class which implements `java.util.concurrent.ThreadFactory`.
One such implementation class, named Virtual, is currently part of the JDK build
in the `test/jtreg_test_thread_factory/` directory. This class gets compiled during
the test image build. The implementation of the Virtual class creates a new virtual
thread for executing each test class.
#### TEST_MODE
The test mode (`agentvm` or `othervm`).
@@ -493,11 +456,6 @@ Repeat the tests up to a set number of times, stopping at first failure.
This helps to reproduce intermittent test failures.
Defaults to 0.
#### REPORT
Use this report style when reporting test results (sent to JTReg as `-report`).
Defaults to `files`.
### Gtest keywords
#### REPEAT
@@ -615,14 +573,12 @@ test/jdk/sun/security/pkcs11/README.
### Client UI Tests
#### System key shortcuts
Some Client UI tests use key sequences which may be reserved by the operating
system. Usually that causes the test failure. So it is highly recommended to
disable system key shortcuts prior testing. The steps to access and disable
system key shortcuts for various platforms are provided below.
##### macOS
#### MacOS
Choose Apple menu; System Preferences, click Keyboard, then click Shortcuts;
select or deselect desired shortcut.
@@ -635,12 +591,12 @@ test correctly the default global key shortcut should be disabled using the
steps described above, and then deselect "Turn keyboard access on or off"
option which is responsible for `CTRL + F1` combination.
##### Linux
#### Linux
Open the Activities overview and start typing Settings; Choose Settings, click
Devices, then click Keyboard; set or override desired shortcut.
##### Windows
#### Windows
Type `gpedit` in the Search and then click Edit group policy; navigate to User
Configuration -> Administrative Templates -> Windows Components -> File
@@ -649,37 +605,10 @@ double click on it; enable or disable hotkeys.
Note: restart is required to make the settings take effect.
#### Robot API
Most automated Client UI tests use `Robot` API to control the UI. Usually,
the default operating system settings need to be adjusted for Robot
to work correctly. The detailed steps how to access and update these settings
for different platforms are provided below.
##### macOS
`Robot` is not permitted to control your Mac by default since
macOS 10.15. To allow it, choose Apple menu -> System Settings, click
Privacy & Security; then click Accessibility and ensure the following apps are
allowed to control your computer: *Java* and *Terminal*. If the tests are run
from an IDE, the IDE should be granted this permission too.
##### Windows
On Windows if Cygwin terminal is used to run the tests, there is a delay in
focus transfer. Usually it causes automated UI test failure. To disable the
delay, type `regedit` in the Search and then select Registry Editor; navigate
to the following key: `HKEY_CURRENT_USER\Control Panel\Desktop`; make sure
the `ForegroundLockTimeout` value is set to 0.
Additional information about Client UI tests configuration for various operating
systems can be obtained at [Automated client GUI testing system set up
requirements](https://wiki.openjdk.org/display/ClientLibs/Automated+client+GUI+testing+system+set+up+requirements)
## Editing this document
If you want to contribute changes to this document, edit `doc/testing.md` and
then run `make update-build-docs` to generate the same changes in
If you want to contribute changes to this document, edit `doc/testing.md` and
then run `make update-build-docs` to generate the same changes in
`doc/testing.html`.
---

View File

@@ -1,238 +0,0 @@
#!/usr/bin/env python3
import argparse
import os.path
import sys
import subprocess
def fatal(msg):
sys.stderr.write(f"[fatal] {msg}\n")
sys.exit(1)
def verbose(options, *msg):
if options.verbose:
sys.stderr.write(f"[verbose] ")
sys.stderr.write(*msg)
sys.stderr.write('\n')
def first_line(str):
return "" if not str else str.splitlines()[0]
class Options:
def __init__(self):
ap = argparse.ArgumentParser(description="Show commit differences between branches of JBR git repos",
epilog="Example: %(prog)s --from origin/jbr17 --to jbr17.b469 --path "
"src/hotspot --limit 200")
ap.add_argument('--jbr', dest='jbrpath', help='path to JBR git root', required=True)
ap.add_argument('--from', dest='frombranch', help='branch to take commits from', required=True)
ap.add_argument('--to', dest='tobranch', help='branch to apply new commits to', required=True)
ap.add_argument('--path', dest='path', help='limit to changes in this path (relative to git root)')
ap.add_argument('--limit', dest='limit', help='limit to this many log entries in --jdk repo', type=int, default=-1)
ap.add_argument('--html', dest="ishtml", help="print out HTML rather than plain text", action='store_true')
ap.add_argument('-o', dest="output", help="print the list of missing commits to this file"
" to be used as exclude list later")
ap.add_argument('--exclude', dest='exclude', help='exclude commits listed in the given file '
'(can use edited -o output file as input here)')
ap.add_argument('-v', dest='verbose', help="verbose output", default=False, action='store_true')
args = ap.parse_args()
if not os.path.isdir(args.jbrpath):
fatal(f"{args.jbrpath} not a directory")
if not git_is_available():
fatal("can't run git commands; make sure git is in PATH")
self.frombranch = args.frombranch
self.tobranch = args.tobranch
self.jbrpath = args.jbrpath
self.path = args.path
self.limit = args.limit
self.exclude = args.exclude
self.output = args.output
self.ishtml = args.ishtml
self.verbose = args.verbose
class GitRepo:
def __init__(self, rootpath):
self.rootpath = rootpath
def run_git_cmd(self, git_args):
args = ["git", "-C", self.rootpath]
args.extend(git_args)
# print(f"Runnig git cmd '{' '.join(args)}'")
p = subprocess.run(args, capture_output=True, text=True)
if p.returncode != 0:
fatal(f"git returned non-zero code in {self.rootpath} ({first_line(p.stderr)})")
return p.stdout
def save_git_cmd(self, fname, git_args):
args = ["git", "-C", self.rootpath]
args.extend(git_args)
# print(f"Runnig git cmd '{' '.join(args)}'")
with open(fname, "w") as stdout_file:
p = subprocess.run(args, stdout=stdout_file)
if p.returncode != 0:
fatal(f"git returned non-zero code in {self.rootpath} ({first_line(p.stderr)})")
def current_branch(self):
branch_name = self.run_git_cmd(["branch", "--show-current"]).strip()
return branch_name
def log(self, branch, path=None, limit=None):
cmds = ["log", "--no-decorate", branch]
if limit:
cmds.extend(["-n", str(limit)])
if path:
cmds.append(path)
full_log = self.run_git_cmd(cmds)
return full_log
class Commit:
def __init__(self, lines):
self.sha = lines[0].split()[1]
self.message = ""
self.bugid = ""
# Commit message starts after one blank line
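# The first word of that line (or the second one after a "fixup" prefix) is treated as the bug id
# when it looks like one, e.g. "JBR-1234 Fix xyz" -> "JBR-1234", "fixup: 8281234: ..." -> "8281234"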
read_message = False
for l in lines:
if read_message:
self.message = l.strip()
t = self.message.split(' ')
if len(t) > 1:
bugid = t[0]
if bugid.startswith("fixup"):
bugid = t[1]
bugid = bugid.strip(":")
if bugid.startswith("JBR-") or bugid.isnumeric():
self.bugid = bugid
break
if not read_message and l == "":
read_message = True
class History:
def __init__(self, log):
log_itr = iter(log.splitlines())
self.commits = []
commit_lines = []
for line in log_itr:
if line.startswith("commit ") and len(commit_lines) > 0:
commit = Commit(commit_lines)
self.commits.append(commit)
commit_lines = []
commit_lines.append(line)
if len(commit_lines) > 0:
commit = Commit(commit_lines)
self.commits.append(commit)
def contains(self, str):
return any(str in commit.message for commit in self.commits)
def size(self):
return len(self.commits)
def print_explanation(options, jbr):
verbose(options, f"Reading history from '{jbr.rootpath}'")
if options.path:
verbose(options, f"\t(only under '{options.path}')")
if options.limit > 0:
verbose(options, f"\t(up to '{options.limit}' commits)")
verbose(options, f"Searching for missing fixes in '{options.tobranch}' compared with '{options.frombranch}'")
def git_is_available():
p = None
try:
p = subprocess.run(["git", "--help"], stdout=subprocess.DEVNULL, stderr=subprocess.DEVNULL)
except:
pass
return p is not None and p.returncode == 0
def main():
check_python_min_requirements()
options = Options()
jbr = GitRepo(options.jbrpath)
print_explanation(options, jbr)
commits_to_save = []
try:
log_from = jbr.log(options.frombranch, options.path, options.limit)
log_to = jbr.log(options.tobranch, options.path, options.limit)
history_from = History(log_from)
history_to = History(log_to)
verbose(options, f"Read {history_from.size()} commits from '{options.frombranch}', {history_to.size()} from {options.tobranch}")
exclude_list=[]
if options.exclude:
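# The exclude file may be an edited -o output: '#'-prefixed sha lines are ignored,
# the remaining lines are commit messages to skip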
with open(options.exclude, "r") as exclude_file:
l = exclude_file.read().split('\n')
exclude_list = list(filter(lambda line: not line.startswith("#"), l))
for c in history_from.commits:
if c.message:
verbose(options, f"Looking for commit '{c.message}'")
if c.message in exclude_list:
verbose(options, "...nope, in exclude list")
continue
if not history_to.contains(c.message):
commits_to_save.append(c)
except KeyboardInterrupt:
fatal("Interrupted")
print_out_commits(options, commits_to_save)
save_commits_to_file(commits_to_save, options)
def save_commits_to_file(commits_to_save, options):
if len(commits_to_save) > 0 and options.output:
print()
with open(options.output, "w") as out:
for i, c in enumerate(reversed(commits_to_save)):
print(f"# {c.sha}", file=out)
print(c.message, file=out)
def print_out_commits(options, commits_to_save):
if options.ishtml:
print("<html><body>")
print(f"<p><b>Commits on <code>{options.frombranch}</code>"
f" missing from <code>{options.tobranch}</code></b></p></h1>")
if len(commits_to_save) > 0:
for c in sorted(commits_to_save, key=lambda commit: commit.bugid):
if options.ishtml:
msg = c.message
bugurl = ""
if c.bugid:
if c.bugid.isnumeric():
bugurl = f"https://bugs.openjdk.org/browse/JDK-{c.bugid}"
elif c.bugid.startswith("JBR-"):
bugurl = f"https://youtrack.jetbrains.com/issue/{c.bugid}"
if len(bugurl) > 0:
msg = msg.replace(c.bugid, f"<a href='{bugurl}'>{c.bugid}</a>")
sha = f"<a href='https://jetbrains.team/p/jbre/repositories/jbr/commits?commits={c.sha}'>" \
f"{c.sha[0:8]}</a>"
print(f"<li>{msg} ({sha})</li>")
else:
print(f"{c.message} ({c.sha[0:8]})")
if options.ishtml:
print("</body></html>")
def check_python_min_requirements():
if sys.version_info < (3, 6):
fatal("Minimum version 3.6 is required to run this script")
if __name__ == '__main__':
main()

View File

@@ -1,12 +0,0 @@
#!/bin/bash
if [[ -z "$1" ]]; then
SCANNER=wayland-scanner
else
SCANNER="$1"
fi
set -ex
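# Generate the C client header and the private code for the wakefield protocol from its XML description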
"$SCANNER" client-header src/java.desktop/share/native/libwakefield/protocol/wakefield.xml src/java.desktop/unix/native/libawt_wlawt/wakefield-client-protocol.h
"$SCANNER" private-code src/java.desktop/share/native/libwakefield/protocol/wakefield.xml src/java.desktop/unix/native/libawt_wlawt/wakefield-client-protocol.c

View File

@@ -1,230 +0,0 @@
#!/usr/bin/env python3
import argparse
import math
import os.path
import sys
import subprocess
def fatal(msg):
sys.stderr.write(f"[fatal] {msg}\n")
sys.exit(1)
def verbose(options, *msg):
if options.verbose:
sys.stdout.write(f"[verbose] ")
sys.stdout.write(*msg)
sys.stdout.write('\n')
def first_line(str):
return "" if not str else str.splitlines()[0]
class Options:
def __init__(self):
ap = argparse.ArgumentParser(description="Show bugfixes differences between JBR and OpenJDK git repos",
epilog="Example: %(prog)s --jdk ./jdk11u/ --jbr ./JetBrainsRuntime/ --path src/hotspot --limit 200")
ap.add_argument('--jdk', dest='jdkpath', help='path to OpenJDK git repo', required=True)
ap.add_argument('--jbr', dest='jbrpath', help='path to JBR git repo', required=True)
ap.add_argument('--path', dest='path', help='limit to changes in this path (relative to git root)')
ap.add_argument('--limit', dest='limit', help='limit to this many log entries in --jdk repo', type=int, default=-1)
ap.add_argument('-o', dest="output_dir", help="save patches to this directory (created if necessary)")
ap.add_argument('-v', dest='verbose', help="verbose output", default=False, action='store_true')
args = ap.parse_args()
if not os.path.isdir(args.jdkpath):
fatal(f"{args.jdkpath} not a directory")
if not os.path.isdir(args.jbrpath):
fatal(f"{args.jbrpath} not a directory")
if not git_is_available():
fatal("can't run git commands; make sure git is in PATH")
self.jdkpath = args.jdkpath
self.jbrpath = args.jbrpath
self.path = args.path
self.limit = args.limit
self.output_dir = args.output_dir
self.verbose = args.verbose
class GitRepo:
def __init__(self, rootpath):
self.rootpath = rootpath
def run_git_cmd(self, git_args):
args = ["git", "-C", self.rootpath]
args.extend(git_args)
# print(f"Runnig git cmd '{' '.join(args)}'")
p = subprocess.run(args, capture_output=True, text=True)
if p.returncode != 0:
fatal(f"git returned non-zero code in {self.rootpath} ({first_line(p.stderr)})")
return p.stdout
def save_git_cmd(self, fname, git_args):
args = ["git", "-C", self.rootpath]
args.extend(git_args)
# print(f"Runnig git cmd '{' '.join(args)}'")
with open(fname, "w") as stdout_file:
p = subprocess.run(args, stdout=stdout_file)
if p.returncode != 0:
fatal(f"git returned non-zero code in {self.rootpath} ({first_line(p.stderr)})")
def current_branch(self):
branch_name = self.run_git_cmd(["branch", "--show-current"]).strip()
return branch_name
def log(self, path=None, limit=None):
cmds = ["log", "--no-decorate"]
if limit:
cmds.extend(["-n", str(limit)])
if path:
cmds.append(path)
full_log = self.run_git_cmd(cmds)
return full_log
class Commit:
def __init__(self, lines):
self.sha = lines[0].split()[1]
self.message = ""
self.bugid = None
# Commit message starts after one blank line
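# The text before the first ':' is treated as the bug id when it is 4-10 characters long,
# e.g. "8281234: Fix rendering" -> bugid "8281234"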
read_message = False
for l in lines:
if read_message:
self.message += l + "\n"
if not read_message and l == "":
read_message = True
if self.message and self.message != "" and ":" in self.message:
maybe_bugid = self.message.split(":")[0].strip()
if 10 >= len(maybe_bugid) >= 4:
self.bugid = maybe_bugid
class History:
def __init__(self, log):
log_itr = iter(log.splitlines())
self.commits = []
commit_lines = []
for line in log_itr:
if line.startswith("commit ") and len(commit_lines) > 0:
commit = Commit(commit_lines)
self.commits.append(commit)
commit_lines = []
commit_lines.append(line)
if len(commit_lines) > 0:
commit = Commit(commit_lines)
self.commits.append(commit)
def contains(self, str):
return any(str in commit.message for commit in self.commits)
def size(self):
return len(self.commits)
def print_explanation(options, jdk, jbr):
verbose(options, f"Reading history from '{jdk.rootpath}' on branch '{jdk.current_branch()}'")
if options.path:
verbose(options, f"\t(only under '{options.path}')")
verbose(options, f"Searching for same fixes in '{jbr.rootpath}' on branch '{jbr.current_branch()}'")
def git_is_available():
p = None
try:
p = subprocess.run(["git", "--help"], stdout=subprocess.DEVNULL, stderr=subprocess.DEVNULL)
except:
pass
return p is not None and p.returncode == 0
def main():
check_python_min_requirements()
options = Options()
jdk = GitRepo(options.jdkpath)
jbr = GitRepo(options.jbrpath)
print_explanation(options, jdk, jbr)
commits_to_save = []
try:
jdk_log = jdk.log(options.path, options.limit)
jdk_history = History(jdk_log)
jbr_log = jbr.log(options.path)
jbr_history = History(jbr_log)
verbose(options, f"Read {jdk_history.size()} commits in JDK, {jbr_history.size()} in JBR")
for c in jdk_history.commits:
if c.bugid:
verbose(options, f"Looking for bugfix for {c.bugid}")
if not jbr_history.contains(c.bugid):
commits_to_save.append(c)
print(f"[note] Fix for {c.bugid} not found in JBR ({jbr.rootpath})")
print(f" commit {c.sha}")
print(f" {first_line(c.message).strip()}")
except KeyboardInterrupt:
fatal("Interrupted")
if len(commits_to_save) > 0 and options.output_dir:
print()
if not os.path.exists(options.output_dir):
verbose(options, f"Creating output directory {options.output_dir}")
os.makedirs(options.output_dir)
nzeroes = len(str(len(commits_to_save)))
for i, c in enumerate(reversed(commits_to_save)):
fname = os.path.join(options.output_dir, f"{str(i).zfill(nzeroes)}-{c.bugid}.patch")
print(f"[note] {c.bugid} saved as {fname}")
fname = os.path.abspath(fname)
jdk.save_git_cmd(fname, ["format-patch", "-1", c.sha, "--stdout"])
script_fname = os.path.join(options.output_dir, "apply.sh")
with open(script_fname, "w") as script_file:
print(apply_script_code.format(os.path.abspath(jbr.rootpath), os.path.abspath(options.output_dir)),
file=script_file)
print(f"[note] Execute 'bash {script_fname}' to apply patches to {jbr.rootpath}")
def check_python_min_requirements():
if sys.version_info < (3, 6):
fatal("Minimum version 3.6 is required to run this script")
apply_script_code = """
#!/bin/bash
GITROOT={0}
PATCHROOT={1}
cd $PATCHROOT || exit 1
PATCHES=$(find $PATCHROOT -name '*.patch' | sort -n)
for P in $PATCHES; do
git -C $GITROOT am $P
if [ $? != 0 ]; then
mv "$P" "$P.failed"
echo "[ERROR] Patch $P did not apply cleanly. Try applying it manually."
echo "[NOTE] Execute this script to apply the remaining patches."
exit 1
else
mv "$P" "$P.done"
fi
done
echo "[NOTE] Done applying patches; check $PATCHROOT for .patch and .patch.failed to see if all have been applied."
"""
if __name__ == '__main__':
main()

View File

@@ -1,46 +0,0 @@
# NOTE: This Dockerfile is meant to be used from the mkdocker_aarch64.sh script.
# Pull a concrete version of Linux that does NOT receive updates after it's
# been created. This is so that the image is as stable as possible to make
# image creation reproducible.
# NB: this also means there may be no security-related fixes there, need to
# move the version to the next manually.
# jetbrains/runtime:jbr17env_aarch64
FROM arm64v8/centos:7
# Install the necessary build tools
RUN yum -y update; \
yum -y install centos-release-scl; \
yum -y install devtoolset-10-10.1-0.el7; \
yum -y install \
alsa-lib-devel-1.1.8-1.el7.aarch64 \
autoconf-2.69-11.el7.noarch \
automake-1.13.4-3.el7.noarch \
bzip2-1.0.6-13.el7.aarch64 \
cups-devel-1.6.3-51.el7.aarch64 \
file-5.11-37.el7.aarch64 \
fontconfig-devel-2.13.0-4.3.el7.aarch64 \
freetype-devel-2.8-14.el7_9.1.aarch64 \
giflib-devel-4.1.6-9.el7.aarch64 \
git-1.8.3.1-24.el7_9.aarch64 \
libtool-2.4.2-22.el7_3.aarch64 \
libXi-devel-1.7.9-1.el7.aarch64 \
libXrandr-devel-1.5.1-2.el7.aarch64 \
libXrender-devel-0.9.10-1.el7.aarch64 \
libXt-devel-1.1.5-3.el7.aarch64 \
libXtst-devel-1.2.3-1.el7.aarch64 \
make-3.82-24.el7.aarch64 \
rsync-3.1.2-12.el7_9.aarch64 \
tar-1.26-35.el7.aarch64 \
unzip-6.0-24.el7_9.aarch64 \
wayland-devel-1.15.0-1.el7 \
zip-3.0-11.el7.aarch64; \
yum -y clean all
ENV PATH="/opt/rh/devtoolset-10/root/usr/bin:${PATH}"
ENV LD_LIBRARY_PATH="/opt/rh/devtoolset-10/root/usr/lib64:/opt/rh/devtoolset-10/root/usr/lib:/opt/rh/devtoolset-10/root/usr/lib64/dyninst:/opt/rh/devtoolset-10/root/usr/lib/dyninst:/opt/rh/devtoolset-10/root/usr/lib64:/opt/rh/devtoolset-10/root/usr/lib"
ENV PKG_CONFIG_PATH="/opt/rh/devtoolset-10/root/usr/lib64/pkgconfig"
RUN git config --global user.email "teamcity@jetbrains.com" && \
git config --global user.name "builduser"

View File

@@ -1,22 +0,0 @@
# NOTE: This Dockerfile is meant to be used from the mkdocker_musl_aarch64.sh script.
# Pull a concrete version of Linux that does NOT receive updates after it's
# been created. This is so that the image is as stable as possible to make
# image creation reproducible.
# NB: this also means there may be no security-related fixes there, need to
# move the version to the next manually.
FROM arm64v8/alpine:3.12
# Install the necessary build tools
RUN apk --no-cache add --update bash grep tar zip bzip2 rsync fontconfig build-base \
git libx11-dev libxext-dev libxrandr-dev libxrender-dev libxt-dev \
libxtst-dev autoconf freetype-dev cups-dev alsa-lib-dev file \
fontconfig fontconfig-dev linux-headers
# Set up boot JDK for building
COPY boot_jdk_musl_aarch64.tar.gz /jdk17/
RUN cd /jdk17 && tar --strip-components=1 -xzf boot_jdk_musl_aarch64.tar.gz && rm /jdk17/boot_jdk_musl_aarch64.tar.gz
ENV BOOT_JDK=/jdk17
RUN git config --global user.email "teamcity@jetbrains.com" && \
git config --global user.name "builduser"

View File

@@ -1,22 +0,0 @@
# NOTE: This Dockerfile is meant to be used from the mkdocker_musl_x64.sh script.
# Pull a concrete version of Linux that does NOT receive updates after it's
# been created. This is so that the image is as stable as possible to make
# image creation reproducible.
# NB: this also means there may be no security-related fixes there, need to
# move the version to the next manually.
FROM alpine:3.14
# Install the necessary build tools
RUN apk --no-cache add --update bash grep tar zip bzip2 rsync fontconfig build-base \
git libx11-dev libxext-dev libxrandr-dev libxrender-dev libxt-dev \
libxtst-dev autoconf freetype-dev cups-dev alsa-lib-dev file \
fontconfig fontconfig-dev linux-headers
# Set up boot JDK for building
COPY boot_jdk_musl_amd64.tar.gz /jdk17/
RUN cd /jdk17 && tar --strip-components=1 -xzf boot_jdk_musl_amd64.tar.gz && rm /jdk17/boot_jdk_musl_amd64.tar.gz
ENV BOOT_JDK=/jdk17
RUN git config --global user.email "teamcity@jetbrains.com" && \
git config --global user.name "builduser"

View File

@@ -1,55 +0,0 @@
# NOTE: This Dockerfile is meant to be used from the mkdocker_x86.sh script.
# Pull a concrete version of Linux that does NOT receive updates after it's
# been created. This is so that the image is as stable as possible to make
# image creation reproducible.
# NB: this also means there may be no security-related fixes there, need to
# move the version to the next manually.
#FROM i386/ubuntu:xenial
#FROM i386/ubuntu:bionic
FROM i386/ubuntu:focal
RUN linux32 \
apt-get update && apt-get install -y --no-install-recommends apt-utils
RUN export DEBIAN_FRONTEND=noninteractive \
export DEBCONF_NONINTERACTIVE_SEEN=true && \
echo 'tzdata tzdata/Areas select Etc' | debconf-set-selections; \
echo 'tzdata tzdata/Zones/Etc select UTC' | debconf-set-selections; \
linux32 \
apt-get -y install \
autoconf \
build-essential \
curl \
file \
git \
libx11-dev \
libxext-dev \
libxrender-dev \
libxrandr-dev \
libxtst-dev \
libxt-dev \
libcups2-dev \
libasound2-data \
# libpng12-0 \
libasound2 \
libfreetype6 \
libfontconfig1-dev \
libasound2-dev \
rsync \
unzip \
zip
RUN linux32 \
apt-get -y install \
g++-10 \
gcc-10 && \
update-alternatives --install /usr/bin/gcc gcc /usr/bin/gcc-10 100 --slave /usr/bin/g++ g++ /usr/bin/g++-10 && \
apt-get clean -qy && \
rm -rf /var/lib/apt/lists/* /tmp/* /var/tmp/*
# Set up boot JDK for building
COPY boot_jdk_x86.tar.gz /jdk17/
RUN cd /jdk17 && tar --strip-components=1 -xzf boot_jdk_x86.tar.gz && rm /jdk17/boot_jdk_x86.tar.gz
ENV BOOT_JDK=/jdk17
RUN git config --global user.email "teamcity@jetbrains.com" && \
git config --global user.name "builduser"

View File

@@ -1,36 +0,0 @@
# jetbrains/runtime:jbr17env_x86_64
FROM centos:7
RUN yum -y install centos-release-scl; \
yum -y install devtoolset-10-10.1-0.el7; \
yum -y install \
alsa-lib-devel-1.1.8-1.el7 \
autoconf-2.69-11.el7 \
automake-1.13.4-3.el7 \
bzip2-1.0.6-13.el7 \
cups-devel-1.6.3-51.el7 \
file-5.11-37.el7 \
fontconfig-devel-2.13.0-4.3.el7 \
freetype-devel-2.8-14.el7_9.1 \
giflib-devel-4.1.6-9.el7 \
git-1.8.3.1-24.el7_9 \
libtool-2.4.2-22.el7_3 \
libXi-devel-1.7.9-1.el7 \
libXrandr-devel-1.5.1-2.el7 \
libXrender-devel-0.9.10-1.el7 \
libXt-devel-1.1.5-3.el7 \
libXtst-devel-1.2.3-1.el7 \
make-3.82-24.el7 \
tar-1.26-35.el7 \
unzip-6.0-24.el7_9 \
wayland-devel-1.15.0-1.el7 \
wget-1.14-18.el7_6.1 \
which-2.20-7.el7 \
zip-3.0-11.el7
RUN mkdir .git && \
git config user.email "teamcity@jetbrains.com" && \
git config user.name "builduser"
ENV LD_LIBRARY_PATH="/opt/rh/devtoolset-10/root/usr/lib64:/opt/rh/devtoolset-10/root/usr/lib:/opt/rh/devtoolset-10/root/usr/lib64/dyninst:/opt/rh/devtoolset-10/root/usr/lib/dyninst:/opt/rh/devtoolset-10/root/usr/lib64:/opt/rh/devtoolset-10/root/usr/lib"
ENV PATH="/opt/rh/devtoolset-10/root/usr/bin::${PATH}"
ENV PKG_CONFIG_PATH="/opt/rh/devtoolset-10/root/usr/lib64/pkgconfig"

View File

@@ -1,29 +0,0 @@
#!/bin/bash
set -euo pipefail
set -x
# This script creates a Docker image suitable for building AArch64 variant
# of the JetBrains Runtime "dev" version.
BOOT_JDK_REMOTE_FILE=zulu17.30.15-ca-jdk17.0.1-linux_aarch64.tar.gz
BOOT_JDK_SHA=4d9c9116eb0cdd2d7fb220d6d27059f4bf1b7e95cc93d5512bd8ce3791af86c7
BOOT_JDK_LOCAL_FILE=boot_jdk.tar.gz
if [ ! -f $BOOT_JDK_LOCAL_FILE ]; then
# Obtain "boot JDK" from outside of the container.
wget -nc https://cdn.azul.com/zulu/bin/${BOOT_JDK_REMOTE_FILE} -O $BOOT_JDK_LOCAL_FILE
else
echo "boot JDK \"$BOOT_JDK_LOCAL_FILE\" present, skipping download"
fi
# Verify that what we've downloaded can be trusted.
sha256sum -c - <<EOF
$BOOT_JDK_SHA *$BOOT_JDK_LOCAL_FILE
EOF
docker build -t jbrdevenv_arm64v8 -f Dockerfile.aarch64 .
# NB: the resulting container can (and should) be used without the network
# connection (--network none) during build in order to reduce the chance
# of build contamination.

View File

@@ -1,29 +0,0 @@
#!/bin/bash
set -euo pipefail
set -x
# This script creates a Docker image suitable for building musl AArch64 variant
# of the JetBrains Runtime version 17.
BOOT_JDK_REMOTE_FILE=zulu17.32.13-ca-jdk17.0.2-linux_musl_aarch64.tar.gz
BOOT_JDK_SHA=6b920559abafbe9bdef386a20ecf3a2f318bc1f0d8359eb1f95aee26606bbc70
BOOT_JDK_LOCAL_FILE=boot_jdk_musl_aarch64.tar.gz
if [ ! -f $BOOT_JDK_LOCAL_FILE ]; then
# Obtain "boot JDK" from outside of the container.
wget -nc https://cdn.azul.com/zulu/bin/${BOOT_JDK_REMOTE_FILE} -O $BOOT_JDK_LOCAL_FILE
else
echo "boot JDK \"$BOOT_JDK_LOCAL_FILE\" present, skipping download"
fi
# Verify that what we've downloaded can be trusted.
sha256sum -c - <<EOF
$BOOT_JDK_SHA *$BOOT_JDK_LOCAL_FILE
EOF
docker build -t jbr17buildenv -f Dockerfile.musl_aarch64 .
# NB: the resulting container can (and should) be used without the network
# connection (--network none) during build in order to reduce the chance
# of build contamination.

View File

@@ -1,29 +0,0 @@
#!/bin/bash
set -euo pipefail
set -x
# This script creates a Docker image suitable for building musl-x64 variant
# of the JetBrains Runtime version 17.
BOOT_JDK_REMOTE_FILE=zulu17.32.13-ca-jdk17.0.2-linux_musl_x64.tar.gz
BOOT_JDK_SHA=bcc5342011bd9f3643372aadbdfa68d47463ff0d8621668a0bdf2910614d95c6
BOOT_JDK_LOCAL_FILE=boot_jdk_musl_amd64.tar.gz
if [ ! -f $BOOT_JDK_LOCAL_FILE ]; then
# Obtain "boot JDK" from outside of the container.
wget -nc https://cdn.azul.com/zulu/bin/${BOOT_JDK_REMOTE_FILE} -O $BOOT_JDK_LOCAL_FILE
else
echo "boot JDK \"$BOOT_JDK_LOCAL_FILE\" present, skipping download"
fi
# Verify that what we've downloaded can be trusted.
sha256sum -c - <<EOF
$BOOT_JDK_SHA *$BOOT_JDK_LOCAL_FILE
EOF
docker build -t jbr17buildenv -f Dockerfile.musl_x64 .
# NB: the resulting container can (and should) be used without the network
# connection (--network none) during build in order to reduce the chance
# of build contamination.

View File

@@ -1,26 +0,0 @@
#!/bin/bash -x
# This script creates a Docker image suitable for building x86 variant
# of the JetBrains Runtime version 17.
BOOT_JDK_REMOTE_FILE=zulu17.34.19-ca-jdk17.0.3-linux_i686.tar.gz
BOOT_JDK_SHA=1c35c374ba0001e675d6e80819d5be900c4e141636d5e484992a8c550be14481
BOOT_JDK_LOCAL_FILE=boot_jdk_x86.tar.gz
if [ ! -f $BOOT_JDK_LOCAL_FILE ]; then
# Obtain "boot JDK" from outside of the container.
wget -nc https://cdn.azul.com/zulu/bin/${BOOT_JDK_REMOTE_FILE} -O $BOOT_JDK_LOCAL_FILE
else
echo "boot JDK \"$BOOT_JDK_LOCAL_FILE\" present, skipping download"
fi
# Verify that what we've downloaded can be trusted.
sha256sum -c - <<EOF
$BOOT_JDK_SHA *$BOOT_JDK_LOCAL_FILE
EOF
docker build -t jetbrains/runtime:jbr17env_x86 -f Dockerfile.x86 .
# NB: the resulting container can (and should) be used without the network
# connection (--network none) during build in order to reduce the chance
# of build contamination.

View File

@@ -1 +0,0 @@
JetBrainsRuntime

View File

@@ -1,20 +0,0 @@
<?xml version="1.0" encoding="UTF-8"?>
<project version="4">
<component name="IssueNavigationConfiguration">
<option name="links">
<list>
<IssueNavigationLink>
<option name="issueRegexp" value="(?:^|\s|\p{Punct})([A-Z]+\-\d+)(?=$|\s|\p{Punct})" />
<option name="linkRegexp" value="https://youtrack.jetbrains.com/issue/$1" />
</IssueNavigationLink>
<IssueNavigationLink>
<option name="issueRegexp" value="(?:^|\s|\p{Punct})(?:JDK-)?(\d{7})(?=$|\s|\p{Punct})" />
<option name="linkRegexp" value="https://bugs.openjdk.java.net/browse/JDK-$1" />
</IssueNavigationLink>
</list>
</option>
</component>
<component name="VcsDirectoryMappings">
<mapping directory="$PROJECT_DIR$/../.." vcs="Git" />
</component>
</project>

View File

@@ -1,13 +0,0 @@
<?xml version="1.0" encoding="UTF-8"?>
<module type="JAVA_MODULE" version="4">
<component name="NewModuleRootManager" inherit-compiler-output="true">
<exclude-output />
<content url="file://$MODULE_DIR$/src/jetbrains.api">
<sourceFolder url="file://$MODULE_DIR$/src/jetbrains.api/src" isTestSource="false" />
<sourceFolder url="file://$MODULE_DIR$/src/jetbrains.api/templates" isTestSource="false" />
</content>
<orderEntry type="sourceFolder" forTests="false" />
<orderEntry type="inheritedJdk" />
</component>
</module>

View File

@@ -1,12 +0,0 @@
<?xml version="1.0" encoding="UTF-8"?>
<project version="4">
<component name="ProjectModuleManager">
<modules>
<module fileurl="file://$PROJECT_DIR$/.idea/jdk.iml" filepath="$PROJECT_DIR$/.idea/jdk.iml" />
###MODULE_IMLS###
<module fileurl="file://$PROJECT_DIR$/.idea/jetbrains.api.iml" filepath="$PROJECT_DIR$/.idea/jetbrains.api.iml" />
<module fileurl="file://$PROJECT_DIR$/.idea/test.iml" filepath="$PROJECT_DIR$/.idea/test.iml" />
</modules>
</component>
</project>

View File

@@ -1,20 +0,0 @@
<?xml version="1.0" encoding="UTF-8"?>
<project version="4">
<component name="IssueNavigationConfiguration">
<option name="links">
<list>
<IssueNavigationLink>
<option name="issueRegexp" value="(?:^|\s|\p{Punct})([A-Z]+\-\d+)(?=$|\s|\p{Punct})" />
<option name="linkRegexp" value="https://youtrack.jetbrains.com/issue/$1" />
</IssueNavigationLink>
<IssueNavigationLink>
<option name="issueRegexp" value="(?:^|\s|\p{Punct})(?:JDK-)?(\d{7})(?=$|\s|\p{Punct})" />
<option name="linkRegexp" value="https://bugs.openjdk.java.net/browse/JDK-$1" />
</IssueNavigationLink>
</list>
</option>
</component>
<component name="VcsDirectoryMappings">
<mapping directory="$PROJECT_DIR$" vcs="Git" />
</component>
</project>

View File

@@ -1,135 +0,0 @@
apply plugin: 'java'
import org.gradle.internal.os.OperatingSystem
repositories {
mavenCentral()
}
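// Pick the java launcher used to run the tests: from the -Pjbsdkhome property if given,
// otherwise from the default build images directory for the current OS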
def test_jvm = {
if (project.hasProperty('jbsdkhome')) {
file(jbsdkhome + (OperatingSystem.current().isWindows()?"/bin/java.exe" : "/bin/java")).absolutePath
} else {
if (OperatingSystem.current().isMacOsX()) {
file('../../../build/macosx-x86_64-normal-server-release/images/jdk-bundle/jdk-11.0.4.jdk/Contents/Home/bin/java').absolutePath
} else if (OperatingSystem.current().isLinux()) {
file('../../../build/linux-x86_64-normal-server-release/images/jdk/bin/java').absolutePath
} else {
file('../../../build/windows-x86_64-normal-server-release/images/jdk/bin/java.exe').absolutePath
}
}
}
dependencies {
testCompile('junit:junit:4.12'){
exclude group: 'org.hamcrest'
}
testCompile 'org.hamcrest:hamcrest-library:1.3'
testCompile 'net.java.dev.jna:jna:4.4.0'
testCompile 'com.twelvemonkeys.imageio:imageio-tiff:3.3.2'
testCompile 'org.apache.commons:commons-lang3:3.0'
}
def jdk_modules = ["java.base", "java.logging", "java.prefs",
"java.se.ee", "java.sql", "java.datatransfer",
"java.management", "java.rmi", "java.security.jgss",
"java.sql.rowset", "java.desktop", "java.management.rmi",
"java.scripting", "java.security.sasl", "java.transaction",
"java.instrument", "java.naming", "java.se",
"java.smartcardio", "java.xml.crypto"]
def jdk_class_dirs = []
jdk_modules.collect(jdk_class_dirs) {
new File("../../../src/" + it + "/share/classes")
}
if (OperatingSystem.current().isMacOsX())
jdk_modules.collect(jdk_class_dirs) {
"../../../src/" + it + "/macosx/classes"
}
else if (OperatingSystem.current().isLinux()) {
jdk_modules.collect(jdk_class_dirs) {
"../../../src/" + it + "/solaris/classes"
}
jdk_modules.collect(jdk_class_dirs) {
"../../../src/" + it + "/unix/classes"
}
} else
jdk_modules.collect(jdk_class_dirs) {
"../../../src/" + it + "/windows/classes"
}
sourceSets.main.java.srcDirs = jdk_class_dirs
sourceSets {
test {
java {
srcDir "../../../test/jdk/jbu"
}
}
}
test.dependsOn.clear()
test.dependsOn tasks.compileTestJava
test {
systemProperty "jb.java2d.metal", "true"
systemProperty "testdata", file('../../../test/jdk/jbu/testdata').absolutePath
// Generate golden images for DroidFontTest and MixedTextTest
// systemProperty "gentestdata", ""
// Enable Java2D logging (https://confluence.jetbrains.com/display/JRE/Java2D+Rendering+Logging)
// systemProperty "sun.java2d.trace", "log"
// systemProperty "sun.java2d.trace", "log,pimpl"
outputs.upToDateWhen { false }
executable = test_jvm()
// Enable async/dtrace profiler
jvmArgs "-XX:+PreserveFramePointer"
// Enable native J2D logging (only in debug build)
// Can be turned on for J2D by adding "#define DEBUG 1" into jdk/src/share/native/sun/java2d/Trace.h
// environment 'J2D_TRACE_LEVEL', '4'
}
def buildDir = project.buildscript.sourceFile.parentFile.parentFile.parentFile.parentFile
def make_cmd = "make"
if (OperatingSystem.current().isWindows()) {
def cyg_make_cmd = new File("c:/cygwin64/bin/make.exe")
if (cyg_make_cmd.exists()) make_cmd = cyg_make_cmd.absolutePath
}
def test_run = false
task make_images {
doLast {
if (!test_run) {
def pb = new ProcessBuilder().command(make_cmd.toString(), "-C", buildDir.absolutePath, "images")
def proc = pb.redirectErrorStream(true).start()
proc.inputStream.eachLine { println it }
assert proc.waitFor() == 0
}
}
}
task make_clean {
doLast {
def pb = new ProcessBuilder().command(make_cmd.toString(), "-C", buildDir.absolutePath, "clean")
def proc = pb.redirectErrorStream(true).start()
proc.inputStream.eachLine { println it }
assert proc.waitFor() == 0
}
}
task run_test {
doLast {
test_run = true
}
}
tasks.cleanTest.dependsOn tasks.run_test
classes.dependsOn.clear()
classes.dependsOn tasks.make_images
tasks.cleanClasses.dependsOn tasks.make_clean

View File

@@ -1,54 +0,0 @@
java.base,
java.compiler,
java.datatransfer,
java.desktop,
java.instrument,
java.logging,
java.management,
java.management.rmi,
java.naming,
java.net.http,
java.prefs,
java.rmi,
java.scripting,
java.se,
java.security.jgss,
java.security.sasl,
java.smartcardio,
java.sql,
java.sql.rowset,
java.transaction.xa,
java.xml,
java.xml.crypto,
jdk.accessibility,
jdk.attach,
jdk.charsets,
jdk.compiler,
jdk.crypto.cryptoki,
jdk.crypto.ec,
jdk.dynalink,
jdk.httpserver,
jdk.internal.ed,
jdk.internal.le,
jdk.internal.vm.ci,
jdk.javadoc,
jdk.jdi,
jdk.jdwp.agent,
jdk.jfr,
jdk.jsobject,
jdk.localedata,
jdk.management,
jdk.management.agent,
jdk.management.jfr,
jdk.naming.dns,
jdk.naming.rmi,
jdk.net,
jdk.sctp,
jdk.security.auth,
jdk.security.jgss,
jdk.unsupported,
jdk.unsupported.desktop,
jdk.xml.dom,
jdk.zipfs,
jdk.hotspot.agent,
jdk.jcmd

View File

@@ -1,18 +0,0 @@
#!/bin/bash
set -euo pipefail
# $1 - Boot JDK
# $2 - JBR part of API version
cd "`dirname "$0"`/../../../../.."
PWD="`pwd`"
CONF="$PWD/build/jbr-api.conf"
./configure --with-debug-level=release --with-boot-jdk=$1 || exit $?
make jbr-api CONF=release MAKEOVERRIDES= "JBR_API_CONF_FILE=$CONF" JBR_API_JBR_VERSION=$2 || exit $?
. $CONF || exit $?
echo "##teamcity[buildNumber '$VERSION']"
cp "$JAR" ./jbr-api-${VERSION}.jar || exit $?
cp "$SOURCES_JAR" ./jbr-api-${VERSION}-sources.jar || exit $?
echo "##teamcity[publishArtifacts '$PWD/jbr-api-${VERSION}.jar']"
echo "##teamcity[publishArtifacts '$PWD/jbr-api-${VERSION}-sources.jar']"

View File

@@ -1,175 +0,0 @@
#!/bin/bash
set -euo pipefail
set -x
function check_bundle_type_maketest() {
# check whether last char is 't', if so remove it
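# e.g. bundle_type="jceft" -> bundle_type="jcef" and do_maketest=1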
if [ "${bundle_type: -1}" == "t" ]; then
bundle_type="${bundle_type%?}"
do_maketest=1
else
do_maketest=0
fi
}
function getVersionProp() {
grep "^${1}" make/conf/version-numbers.conf | cut -d'=' -f2
}
while getopts ":i?" o; do
case "${o}" in
i) INC_BUILD=1 ;;
esac
done
shift $((OPTIND-1))
if [[ $# -lt 2 ]]; then
echo "Required at least two arguments: build_number bundle_type"
exit 1
fi
build_number=$1
bundle_type=$2
# shellcheck disable=SC2034
architecture=${3:-x64} # aarch64 or x64
check_bundle_type_maketest
tag_prefix="jdk-"
OPENJDK_TAG=$(git log --simplify-by-decoration --decorate=short --pretty=short | grep "$tag_prefix" | cut -d "(" -f2 | cut -d ")" -f1 | awk '{print $2}' | tr -d ',' | sort -t "-" -k 2 -g | tail -n 1)
VERSION_FEATURE=$(getVersionProp "DEFAULT_VERSION_FEATURE")
VERSION_INTERIM=$(getVersionProp "DEFAULT_VERSION_INTERIM")
VERSION_UPDATE=$(getVersionProp "DEFAULT_VERSION_UPDATE")
VERSION_PATCH=$(getVersionProp "DEFAULT_VERSION_PATCH")
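# Compose the version string from the parts above, e.g. FEATURE=17, INTERIM=0, UPDATE=6 -> "17.0.6";
# just "17" when UPDATE is 0, with ".PATCH" appended when PATCH is non-zero (values are illustrative)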
[[ $VERSION_UPDATE = 0 ]] && JBSDK_VERSION="$VERSION_FEATURE" || JBSDK_VERSION="${VERSION_FEATURE}.${VERSION_INTERIM}.${VERSION_UPDATE}"
[[ $VERSION_PATCH = 0 ]] || JBSDK_VERSION="${VERSION_FEATURE}.${VERSION_INTERIM}.${VERSION_UPDATE}.${VERSION_PATCH}"
echo "##teamcity[setParameter name='env.JBSDK_VERSION' value='${JBSDK_VERSION}']"
JDK_BUILD_NUMBER=${JDK_BUILD_NUMBER:=$(echo $OPENJDK_TAG | awk -F "-|[+]" '{print $3}')}
[ -z $JDK_BUILD_NUMBER ] && JDK_BUILD_NUMBER=1
echo "##teamcity[setParameter name='env.JDK_UPDATE_NUMBER' value='${JDK_BUILD_NUMBER}']"
VENDOR_NAME="JetBrains s.r.o."
VENDOR_VERSION_STRING="JBR-${JBSDK_VERSION}+${JDK_BUILD_NUMBER}-${build_number}"
[ -z "$bundle_type" ] || VENDOR_VERSION_STRING="${VENDOR_VERSION_STRING}-${bundle_type}"
do_reset_changes=0
do_reset_dcevm=0
HEAD_REVISION=0
STATIC_CONF_ARGS=""
common_conf_props_file="jb/project/tools/common/static_conf_args.txt"
if [[ -f "$common_conf_props_file" ]]; then
STATIC_CONF_ARGS=$(<$common_conf_props_file)
fi
OS_NAME=$(uname -s)
# Enable reproducible builds
TZ=UTC
export TZ
SOURCE_DATE_EPOCH="$(git log -1 --pretty=%ct)"
export SOURCE_DATE_EPOCH
COPYRIGHT_YEAR=""
BUILD_TIME=""
TOUCH_TIME=""
REPRODUCIBLE_TAR_OPTS=""
case "$OS_NAME" in
Linux)
COPYRIGHT_YEAR="$(date --utc --date=@$SOURCE_DATE_EPOCH +%Y)"
BUILD_TIME="$(date --utc --date=@$SOURCE_DATE_EPOCH +%F)"
REPRODUCIBLE_TAR_OPTS="--mtime=@$SOURCE_DATE_EPOCH --owner=0 --group=0 --numeric-owner --pax-option=exthdr.name=%d/PaxHeaders/%f,delete=atime,delete=ctime"
;;
CYGWIN*)
COPYRIGHT_YEAR="$(date --utc --date=@$SOURCE_DATE_EPOCH +%Y)"
BUILD_TIME="$(date --utc --date=@$SOURCE_DATE_EPOCH +%F)"
REPRODUCIBLE_TAR_OPTS="--mtime=@$SOURCE_DATE_EPOCH --owner=0 --group=0 --numeric-owner --pax-option=exthdr.name=%d/PaxHeaders/%f,delete=atime,delete=ctime"
;;
Darwin)
COPYRIGHT_YEAR="$(date -u -r $SOURCE_DATE_EPOCH +%Y)"
BUILD_TIME="$(date -u -r $SOURCE_DATE_EPOCH +%F)"
TOUCH_TIME="$(date -u -r $SOURCE_DATE_EPOCH +%Y%m%d%H%M.%S)"
REPRODUCIBLE_TAR_OPTS="--uid 0 --gid 0 --numeric-owner"
;;
esac
WITH_ZIPPED_NATIVE_DEBUG_SYMBOLS="--with-native-debug-symbols=zipped"
REPRODUCIBLE_BUILD_OPTS="--with-source-date=$SOURCE_DATE_EPOCH
--with-hotspot-build-time=$BUILD_TIME
--with-copyright-year=$COPYRIGHT_YEAR
--disable-absolute-paths-in-output
--with-build-user=builduser"
function zip_native_debug_symbols() {
image_bundle_path=$(echo $1 | cut -d"/" -f-4)
jdk_name=$(echo $1 | cut -d"/" -f5)
jbr_diz_name=$2
[ -d "dizfiles" ] && rm -rf dizfiles
mkdir dizfiles
rsync_target="../../../../dizfiles"
[ -z "$jdk_name" ] && rsync_target=$rsync_target"/"$jbr_diz_name
(cd $image_bundle_path && find . -name '*.diz' -exec rsync -R {} $rsync_target \;)
[ ! -z "$jdk_name" ] && mv dizfiles/$jdk_name dizfiles/$jbr_diz_name
(cd dizfiles && find $jbr_diz_name -print0 | COPYFILE_DISABLE=1 \
tar --no-recursion --null -T - -czf ../"$jbr_diz_name".tar.gz) || do_exit $?
}
function do_exit() {
exit_code=$1
[ $do_reset_changes -eq 1 ] && git checkout HEAD jb/project/tools/common/modules.list src/java.desktop/share/classes/module-info.java
if [ $do_reset_dcevm -eq 1 ]; then
[ ! -z $HEAD_REVISION ] && git reset --hard $HEAD_REVISION
fi
exit "$exit_code"
}
function update_jsdk_mods() {
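# $1 - JDK image dir, $2 - dir with JCEF jmods, $3 - dir with the original jmods, $4 - dir for the updated jmods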
__jsdk=$1
__jcef_mods=$2
__orig_jsdk_mods=$3
__updated_jsdk_mods=$4
# re-create java.desktop.jmod with updated module-info.class
tmp=.java.desktop.$$.tmp
mkdir "$tmp" || exit $?
"$__jsdk"/bin/jmod extract --dir "$tmp" "$__orig_jsdk_mods"/java.desktop.jmod || exit $?
"$__jsdk"/bin/javac \
--patch-module java.desktop="$__orig_jsdk_mods"/java.desktop.jmod \
--module-path "$__jcef_mods" -d "$tmp"/classes src/java.desktop/share/classes/module-info.java || exit $?
"$__jsdk"/bin/jmod \
create --class-path "$tmp"/classes --config "$tmp"/conf --header-files "$tmp"/include --legal-notice "$tmp"/legal --libs "$tmp"/lib \
java.desktop.jmod || exit $?
mv java.desktop.jmod "$__updated_jsdk_mods" || exit $?
rm -rf "$tmp"
# re-create java.base.jmod with updated hashes
tmp=.java.base.$$.tmp
mkdir "$tmp" || exit $?
hash_modules=$("$__jsdk"/bin/jmod describe "$__orig_jsdk_mods"/java.base.jmod | grep hashes | awk '{print $2}' | tr '\n' '|' | sed s/\|$//) || exit $?
"$__jsdk"/bin/jmod extract --dir "$tmp" "$__orig_jsdk_mods"/java.base.jmod || exit $?
rm "$__updated_jsdk_mods"/java.base.jmod || exit $? # temp exclude from path
"$__jsdk"/bin/jmod \
create --module-path "$__updated_jsdk_mods" --hash-modules "$hash_modules" \
--class-path "$tmp"/classes --cmds "$tmp"/bin --config "$tmp"/conf --header-files "$tmp"/include --legal-notice "$tmp"/legal --libs "$tmp"/lib \
java.base.jmod || exit $?
mv java.base.jmod "$__updated_jsdk_mods" || exit $?
rm -rf "$tmp"
}
function get_mods_list() {
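# Turn a directory of .jmod files into a comma-separated module list,
# e.g. "java.base.jmod java.desktop.jmod" -> "java.base,java.desktop"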
__mods=$1
echo $(ls $__mods) | sed s/\.jmod/,/g | sed s/,$//g | sed s/' '//g
}
function copy_jmods() {
__mods_list=$1
__jmods_from=$2
__jmods_to=$3
mkdir -p $__jmods_to
echo "${__mods_list}," | while read -d, mod; do cp $__jmods_from/$mod.jmod $__jmods_to/; done
}

View File

@@ -1 +0,0 @@
--with-vendor-vm-bug-url=https://youtrack.jetbrains.com/issues/JBR

View File

@@ -1,165 +0,0 @@
#!/bin/bash
set -euo pipefail
set -x
# The following parameters must be specified:
# build_number - specifies the number of JetBrainsRuntime build
# bundle_type - specifies the bundle to be built; possible values:
# <empty> or nomod - the release bundles without any additional modules (jcef)
# jcef - the release bundles with jcef
# fd - the fastdebug bundles which also include the jcef module
#
# This script makes test-image along with JDK images when bundle_type is set to "jcef".
# If the character 't' is added at the end of bundle_type then it also makes test-image along with JDK images.
#
# Environment variables:
# JDK_BUILD_NUMBER - specifies update release of OpenJDK build or the value of --with-version-build argument
# to configure
# By default JDK_BUILD_NUMBER is set to zero
# JCEF_PATH - specifies the path to the directory with JCEF binaries.
# By default JCEF binaries should be located in ./jcef_linux_aarch64
source jb/project/tools/common/scripts/common.sh
JCEF_PATH=${JCEF_PATH:=./jcef_linux_aarch64}
function do_configure {
sh configure \
$WITH_DEBUG_LEVEL \
--with-vendor-name="$VENDOR_NAME" \
--with-vendor-version-string="$VENDOR_VERSION_STRING" \
--with-jvm-features=shenandoahgc \
--with-version-pre= \
--with-version-build="$JDK_BUILD_NUMBER" \
--with-version-opt=b"$build_number" \
--with-boot-jdk="$BOOT_JDK" \
--enable-cds=yes \
$STATIC_CONF_ARGS \
$REPRODUCIBLE_BUILD_OPTS \
$WITH_ZIPPED_NATIVE_DEBUG_SYMBOLS \
|| do_exit $?
}
function is_musl {
libc=$(ldd /bin/ls | grep 'musl' | head -1 | cut -d ' ' -f1)
if [ -z $libc ]; then
# This is not Musl, return 1 == false
return 1
fi
return 0
}
function create_image_bundle {
__bundle_name=$1
__arch_name=$2
__modules_path=$3
__modules=$4
libc_type_suffix=''
fastdebug_infix=''
if is_musl; then libc_type_suffix='musl-' ; fi
[ "$bundle_type" == "fd" ] && [ "$__arch_name" == "$JBRSDK_BUNDLE" ] && __bundle_name=$__arch_name && fastdebug_infix="fastdebug-"
JBR=${__bundle_name}-${JBSDK_VERSION}-linux-${libc_type_suffix}aarch64-${fastdebug_infix}b${build_number}
__root_dir=${__bundle_name}-${JBSDK_VERSION}-linux-${libc_type_suffix}aarch64-${fastdebug_infix:-}b${build_number}
echo Running jlink....
[ -d "$IMAGES_DIR"/"$__root_dir" ] && rm -rf "${IMAGES_DIR:?}"/"$__root_dir"
$JSDK/bin/jlink \
--module-path "$__modules_path" --no-man-pages --compress=2 \
--add-modules "$__modules" --output "$IMAGES_DIR"/"$__root_dir"
grep -v "^JAVA_VERSION" "$JSDK"/release | grep -v "^MODULES" >> "$IMAGES_DIR"/"$__root_dir"/release
if [ "$__arch_name" == "$JBRSDK_BUNDLE" ]; then
sed 's/JBR/JBRSDK/g' "$IMAGES_DIR"/"$__root_dir"/release > release
mv release "$IMAGES_DIR"/"$__root_dir"/release
cp $IMAGES_DIR/jdk/lib/src.zip "$IMAGES_DIR"/"$__root_dir"/lib
copy_jmods "$__modules" "$__modules_path" "$IMAGES_DIR"/"$__root_dir"/jmods
zip_native_debug_symbols $IMAGES_DIR/jdk "${JBR}_diz"
fi
# jmod does not preserve file permissions (JDK-8173610)
[ -f "$IMAGES_DIR"/"$__root_dir"/lib/jcef_helper ] && chmod a+x "$IMAGES_DIR"/"$__root_dir"/lib/jcef_helper
echo Creating "$JBR".tar.gz ...
(cd "$IMAGES_DIR" &&
find "$__root_dir" -print0 | LC_ALL=C sort -z | \
tar $REPRODUCIBLE_TAR_OPTS \
--no-recursion --null -T - -cf "$JBR".tar) || do_exit $?
mv "$IMAGES_DIR"/"$JBR".tar ./"$JBR".tar
[ -f "$JBR".tar.gz ] && rm "$JBR.tar.gz"
touch -c -d "@$SOURCE_DATE_EPOCH" "$JBR".tar
gzip "$JBR".tar || do_exit $?
rm -rf "${IMAGES_DIR:?}"/"$__root_dir"
}
WITH_DEBUG_LEVEL="--with-debug-level=release"
RELEASE_NAME=linux-aarch64-server-release
case "$bundle_type" in
"jcef")
do_reset_changes=1
do_maketest=1
;;
"nomod" | "")
bundle_type=""
;;
"fd")
do_reset_changes=1
WITH_DEBUG_LEVEL="--with-debug-level=fastdebug"
RELEASE_NAME=linux-aarch64-server-fastdebug
;;
esac
if [ -z "${INC_BUILD:-}" ]; then
do_configure || do_exit $?
make clean CONF=$RELEASE_NAME || do_exit $?
fi
make images CONF=$RELEASE_NAME || do_exit $?
IMAGES_DIR=build/$RELEASE_NAME/images
JSDK=$IMAGES_DIR/jdk
JSDK_MODS_DIR=$IMAGES_DIR/jmods
JBRSDK_BUNDLE=jbrsdk
echo Fixing permissions
chmod -R a+r $JSDK
if [ "$bundle_type" == "jcef" ] || [ "$bundle_type" == "fd" ]; then
git apply -p0 < jb/project/tools/patches/add_jcef_module_aarch64.patch || do_exit $?
update_jsdk_mods $JSDK $JCEF_PATH/jmods $JSDK/jmods $JSDK_MODS_DIR || do_exit $?
cp $JCEF_PATH/jmods/* $JSDK_MODS_DIR # $JSDK/jmods is not changed
jbr_name_postfix="_${bundle_type}"
cat $JCEF_PATH/jcef.version >> $JSDK/release
else
jbr_name_postfix=""
fi
# create runtime image bundle
modules=$(xargs < jb/project/tools/common/modules.list | sed s/" "//g) || do_exit $?
create_image_bundle "jbr${jbr_name_postfix}" "jbr" $JSDK_MODS_DIR "$modules" || do_exit $?
# create sdk image bundle
modules=$(cat $JSDK/release | grep MODULES | sed s/MODULES=//g | sed s/' '/','/g | sed s/\"//g | sed s/\\n//g) || do_exit $?
if [ "$bundle_type" == "jcef" ] || [ "$bundle_type" == "fd" ] || [ "$bundle_type" == "$JBRSDK_BUNDLE" ]; then
modules=${modules},$(get_mods_list "$JCEF_PATH"/jmods)
fi
create_image_bundle "$JBRSDK_BUNDLE${jbr_name_postfix}" $JBRSDK_BUNDLE $JSDK_MODS_DIR "$modules" || do_exit $?
if [ $do_maketest -eq 1 ]; then
JBRSDK_TEST=${JBRSDK_BUNDLE}-${JBSDK_VERSION}-linux-${libc_type_suffix}test-aarch64-b${build_number}
echo Creating "$JBRSDK_TEST" ...
[ $do_reset_changes -eq 1 ] && git checkout HEAD jb/project/tools/common/modules.list src/java.desktop/share/classes/module-info.java
make test-image jbr-api CONF=$RELEASE_NAME JBR_API_JBR_VERSION=TEST || do_exit $?
cp "build/${RELEASE_NAME}/jbr-api/jbr-api.jar" "${IMAGES_DIR}/test"
tar -pcf "$JBRSDK_TEST".tar -C $IMAGES_DIR --exclude='test/jdk/demos' test || do_exit $?
[ -f "$JBRSDK_TEST.tar.gz" ] && rm "$JBRSDK_TEST.tar.gz"
gzip "$JBRSDK_TEST".tar || do_exit $?
fi
do_exit 0

View File

@@ -1,164 +0,0 @@
#!/bin/bash
set -euo pipefail
set -x
# The following parameters must be specified:
# build_number - specifies the number of JetBrainsRuntime build
# bundle_type - specifies the bundle to be built; possible values:
# <empty> or nomod - the release bundles without any additional modules (jcef)
# jcef - the release bundles with jcef
# fd - the fastdebug bundles which also include the jcef module
#
# This script makes test-image along with JDK images when bundle_type is set to "jcef".
# If the character 't' is added at the end of bundle_type then it also makes test-image along with JDK images.
#
# Environment variables:
# JDK_BUILD_NUMBER - specifies update release of OpenJDK build or the value of --with-version-build argument
# to configure
# By default JDK_BUILD_NUMBER is set to zero
# JCEF_PATH - specifies the path to the directory with JCEF binaries.
# By default JCEF binaries should be located in ./jcef_linux_x64
source jb/project/tools/common/scripts/common.sh
JCEF_PATH=${JCEF_PATH:=./jcef_linux_x64}
function do_configure {
sh configure \
$WITH_DEBUG_LEVEL \
--with-vendor-name="$VENDOR_NAME" \
--with-vendor-version-string="$VENDOR_VERSION_STRING" \
--with-jvm-features=shenandoahgc \
--with-version-pre= \
--with-version-build="$JDK_BUILD_NUMBER" \
--with-version-opt=b"$build_number" \
--with-boot-jdk="$BOOT_JDK" \
--enable-cds=yes \
$STATIC_CONF_ARGS \
$REPRODUCIBLE_BUILD_OPTS \
$WITH_ZIPPED_NATIVE_DEBUG_SYMBOLS \
|| do_exit $?
}
function is_musl {
libc=$(ldd /bin/ls | grep 'musl' | head -1 | cut -d ' ' -f1)
if [ -z $libc ]; then
# This is not Musl, return 1 == false
return 1
fi
return 0
}
function create_image_bundle {
__bundle_name=$1
__arch_name=$2
__modules_path=$3
__modules=$4
libc_type_suffix=''
fastdebug_infix=''
if is_musl; then libc_type_suffix='musl-' ; fi
[ "$bundle_type" == "fd" ] && [ "$__arch_name" == "$JBRSDK_BUNDLE" ] && __bundle_name=$__arch_name && fastdebug_infix="fastdebug-"
JBR=${__bundle_name}-${JBSDK_VERSION}-linux-${libc_type_suffix}x64-${fastdebug_infix}b${build_number}
__root_dir=${__bundle_name}-${JBSDK_VERSION}-linux-${libc_type_suffix}x64-${fastdebug_infix:-}b${build_number}
echo Running jlink....
[ -d "$IMAGES_DIR"/"$__root_dir" ] && rm -rf "${IMAGES_DIR:?}"/"$__root_dir"
$JSDK/bin/jlink \
--module-path "$__modules_path" --no-man-pages --compress=2 \
--add-modules "$__modules" --output "$IMAGES_DIR"/"$__root_dir"
grep -v "^JAVA_VERSION" "$JSDK"/release | grep -v "^MODULES" >> "$IMAGES_DIR"/"$__root_dir"/release
if [ "$__arch_name" == "$JBRSDK_BUNDLE" ]; then
sed 's/JBR/JBRSDK/g' "$IMAGES_DIR"/"$__root_dir"/release > release
mv release "$IMAGES_DIR"/"$__root_dir"/release
cp $IMAGES_DIR/jdk/lib/src.zip "$IMAGES_DIR"/"$__root_dir"/lib
copy_jmods "$__modules" "$__modules_path" "$IMAGES_DIR"/"$__root_dir"/jmods
zip_native_debug_symbols $IMAGES_DIR/jdk "${JBR}_diz"
fi
# jmod does not preserve file permissions (JDK-8173610)
[ -f "$IMAGES_DIR"/"$__root_dir"/lib/jcef_helper ] && chmod a+x "$IMAGES_DIR"/"$__root_dir"/lib/jcef_helper
echo Creating "$JBR".tar.gz ...
(cd "$IMAGES_DIR" &&
find "$__root_dir" -print0 | LC_ALL=C sort -z | \
tar $REPRODUCIBLE_TAR_OPTS \
--no-recursion --null -T - -cf "$JBR".tar) || do_exit $?
mv "$IMAGES_DIR"/"$JBR".tar ./"$JBR".tar
[ -f "$JBR".tar.gz ] && rm "$JBR.tar.gz"
touch -c -d "@$SOURCE_DATE_EPOCH" "$JBR".tar
gzip "$JBR".tar || do_exit $?
#rm -rf "${IMAGES_DIR:?}"/"$__root_dir"
}
WITH_DEBUG_LEVEL="--with-debug-level=release"
RELEASE_NAME=linux-x86_64-server-release
case "$bundle_type" in
"jcef")
do_reset_changes=1
do_maketest=1
;;
"nomod" | "")
bundle_type=""
;;
"fd")
do_reset_changes=1
WITH_DEBUG_LEVEL="--with-debug-level=fastdebug"
RELEASE_NAME=linux-x86_64-server-fastdebug
;;
esac
if [ -z "${INC_BUILD:-}" ]; then
do_configure || do_exit $?
make clean CONF=$RELEASE_NAME || do_exit $?
fi
make images CONF=$RELEASE_NAME || do_exit $?
IMAGES_DIR=build/$RELEASE_NAME/images
JSDK=$IMAGES_DIR/jdk
JSDK_MODS_DIR=$IMAGES_DIR/jmods
JBRSDK_BUNDLE=jbrsdk
echo Fixing permissions
chmod -R a+r $JSDK
if [ "$bundle_type" == "jcef" ] || [ "$bundle_type" == "fd" ]; then
git apply -p0 < jb/project/tools/patches/add_jcef_module.patch || do_exit $?
update_jsdk_mods $JSDK $JCEF_PATH/jmods $JSDK/jmods $JSDK_MODS_DIR || do_exit $?
cp $JCEF_PATH/jmods/* $JSDK_MODS_DIR # $JSDK/jmods is not changed
jbr_name_postfix="_${bundle_type}"
cat $JCEF_PATH/jcef.version >> $JSDK/release
else
jbr_name_postfix=""
fi
# create runtime image bundle
modules=$(xargs < jb/project/tools/common/modules.list | sed s/" "//g) || do_exit $?
create_image_bundle "jbr${jbr_name_postfix}" "jbr" $JSDK_MODS_DIR "$modules" || do_exit $?
# create sdk image bundle
modules=$(cat $JSDK/release | grep MODULES | sed s/MODULES=//g | sed s/' '/','/g | sed s/\"//g | sed s/\\n//g) || do_exit $?
if [ "$bundle_type" == "jcef" ] || [ "$bundle_type" == "fd" ] || [ "$bundle_type" == "$JBRSDK_BUNDLE" ]; then
modules=${modules},$(get_mods_list "$JCEF_PATH"/jmods)
fi
create_image_bundle "$JBRSDK_BUNDLE${jbr_name_postfix}" $JBRSDK_BUNDLE $JSDK_MODS_DIR "$modules" || do_exit $?
if [ $do_maketest -eq 1 ]; then
JBRSDK_TEST=${JBRSDK_BUNDLE}-${JBSDK_VERSION}-linux-${libc_type_suffix}test-x64-b${build_number}
echo Creating "$JBRSDK_TEST" ...
[ $do_reset_changes -eq 1 ] && git checkout HEAD jb/project/tools/common/modules.list src/java.desktop/share/classes/module-info.java
make test-image jbr-api CONF=$RELEASE_NAME JBR_API_JBR_VERSION=TEST || do_exit $?
cp "build/${RELEASE_NAME}/jbr-api/jbr-api.jar" "${IMAGES_DIR}/test"
tar -pcf "$JBRSDK_TEST".tar -C $IMAGES_DIR --exclude='test/jdk/demos' test || do_exit $?
[ -f "$JBRSDK_TEST.tar.gz" ] && rm "$JBRSDK_TEST.tar.gz"
gzip "$JBRSDK_TEST".tar || do_exit $?
fi
do_exit 0

View File

@@ -1,144 +0,0 @@
#!/bin/bash
set -euo pipefail
set -x
# The following parameters must be specified:
# build_number - specifies the number of JetBrainsRuntime build
# bundle_type - specifies the bundle to be built; possible values:
# <empty> or nomod - the release bundles without any additional modules (jcef)
# jcef - the release bundles with jcef
# fd - the fastdebug bundles which also include the jcef module
#
source jb/project/tools/common/scripts/common.sh
function do_configure {
linux32 bash configure \
$WITH_DEBUG_LEVEL \
--with-vendor-name="$VENDOR_NAME" \
--with-vendor-version-string="$VENDOR_VERSION_STRING" \
--with-jvm-features=shenandoahgc \
--with-version-pre= \
--with-version-build="$JDK_BUILD_NUMBER" \
--with-version-opt=b"$build_number" \
--with-boot-jdk="$BOOT_JDK" \
$STATIC_CONF_ARGS \
--enable-cds=yes \
$REPRODUCIBLE_BUILD_OPTS \
$WITH_ZIPPED_NATIVE_DEBUG_SYMBOLS \
|| do_exit $?
}
function is_musl {
libc=$(ldd /bin/ls | grep 'musl' | head -1 | cut -d ' ' -f1)
if [ -z $libc ]; then
# This is not Musl, return 1 == false
return 1
fi
return 0
}
function create_image_bundle {
__bundle_name=$1
__arch_name=$2
__modules_path=$3
__modules=$4
libc_type_suffix=''
fastdebug_infix=''
if is_musl; then libc_type_suffix='musl-' ; fi
[ "$bundle_type" == "fd" ] && [ "$__arch_name" == "$JBRSDK_BUNDLE" ] && __bundle_name=$__arch_name && fastdebug_infix="fastdebug-"
JBR=${__bundle_name}-${JBSDK_VERSION}-linux-${libc_type_suffix}x86-${fastdebug_infix}b${build_number}
__root_dir=${__bundle_name}-${JBSDK_VERSION}-linux-${libc_type_suffix}x86-${fastdebug_infix:-}b${build_number}
echo Running jlink....
[ -d "$IMAGES_DIR"/"$__root_dir" ] && rm -rf "${IMAGES_DIR:?}"/"$__root_dir"
$JSDK/bin/jlink \
--module-path "$__modules_path" --no-man-pages --compress=2 \
--add-modules "$__modules" --output "$IMAGES_DIR"/"$__root_dir"
grep -v "^JAVA_VERSION" "$JSDK"/release | grep -v "^MODULES" >> "$IMAGES_DIR"/"$__root_dir"/release
if [ "$__arch_name" == "$JBRSDK_BUNDLE" ]; then
sed 's/JBR/JBRSDK/g' "$IMAGES_DIR"/"$__root_dir"/release > release
mv release "$IMAGES_DIR"/"$__root_dir"/release
cp $IMAGES_DIR/jdk/lib/src.zip "$IMAGES_DIR"/"$__root_dir"/lib
copy_jmods "$__modules" "$__modules_path" "$IMAGES_DIR"/"$__root_dir"/jmods
zip_native_debug_symbols $IMAGES_DIR/jdk "${JBR}_diz"
fi
# jmod does not preserve file permissions (JDK-8173610)
[ -f "$IMAGES_DIR"/"$__root_dir"/lib/jcef_helper ] && chmod a+x "$IMAGES_DIR"/"$__root_dir"/lib/jcef_helper
echo Creating "$JBR".tar.gz ...
(cd "$IMAGES_DIR" &&
find "$__root_dir" -print0 | LC_ALL=C sort -z | \
tar $REPRODUCIBLE_TAR_OPTS \
--no-recursion --null -T - -cf "$JBR".tar) || do_exit $?
mv "$IMAGES_DIR"/"$JBR".tar ./"$JBR".tar
[ -f "$JBR".tar.gz ] && rm "$JBR.tar.gz"
touch -c -d "@$SOURCE_DATE_EPOCH" "$JBR".tar
gzip "$JBR".tar || do_exit $?
rm -rf "${IMAGES_DIR:?}"/"$__root_dir"
}
WITH_DEBUG_LEVEL="--with-debug-level=release"
RELEASE_NAME=linux-x86-server-release
case "$bundle_type" in
"jcef")
echo "not implemented" && do_exit 1
;;
"nomod" | "")
bundle_type=""
;;
"fd")
do_reset_changes=1
WITH_DEBUG_LEVEL="--with-debug-level=fastdebug"
RELEASE_NAME=linux-x86-server-fastdebug
;;
esac
if [ -z "${INC_BUILD:-}" ]; then
do_configure || do_exit $?
make clean CONF=$RELEASE_NAME || do_exit $?
fi
make images CONF=$RELEASE_NAME || do_exit $?
IMAGES_DIR=build/$RELEASE_NAME/images
JSDK=$IMAGES_DIR/jdk
JSDK_MODS_DIR=$IMAGES_DIR/jmods
JBRSDK_BUNDLE=jbrsdk
echo Fixing permissions
chmod -R a+r $JSDK
if [ "$bundle_type" == "jcef" ] || [ "$bundle_type" == "fd" ]; then
jbr_name_postfix="_${bundle_type}"
else
jbr_name_postfix=""
fi
# create runtime image bundle
modules=$(grep -v "jdk.internal.vm" jb/project/tools/common/modules.list | xargs | sed s/" "//g) || do_exit $?
create_image_bundle "jbr${jbr_name_postfix}" "jbr" $JSDK_MODS_DIR "$modules" || do_exit $?
# create sdk image bundle
modules=$(cat $JSDK/release | grep MODULES | sed s/MODULES=//g | sed s/' '/','/g | sed s/\"//g | sed s/\\n//g) || do_exit $?
create_image_bundle "$JBRSDK_BUNDLE${jbr_name_postfix}" $JBRSDK_BUNDLE $JSDK_MODS_DIR "$modules" || do_exit $?
if [ $do_maketest -eq 1 ]; then
JBRSDK_TEST=${JBRSDK_BUNDLE}-${JBSDK_VERSION}-linux-${libc_type_suffix}test-x86-b${build_number}
echo Creating "$JBRSDK_TEST" ...
[ $do_reset_changes -eq 1 ] && git checkout HEAD jb/project/tools/common/modules.list src/java.desktop/share/classes/module-info.java
make test-image jbr-api CONF=$RELEASE_NAME JBR_API_JBR_VERSION=TEST || do_exit $?
cp "build/${RELEASE_NAME}/jbr-api/jbr-api.jar" "${IMAGES_DIR}/test"
tar -pcf "$JBRSDK_TEST".tar -C $IMAGES_DIR --exclude='test/jdk/demos' test || do_exit $?
[ -f "$JBRSDK_TEST.tar.gz" ] && rm "$JBRSDK_TEST.tar.gz"
gzip "$JBRSDK_TEST".tar || do_exit $?
fi
do_exit 0

View File

@@ -1,47 +0,0 @@
#!/bin/bash
SCRIPT_DIR="$(cd "$(dirname "$0")" >/dev/null && pwd)"
source "$SCRIPT_DIR/jetsign-common.sh" || exit 1
function isMacOsBinary() {
file "$1" | grep -q 'Mach-O'
}
function isSigned() {
codesign --verify "$1" >/dev/null 2>&1 && ! grep -q Signature=adhoc < <(codesign --display --verbose "$1" 2>&1)
}
# last argument is a path to be signed
pathToBeSigned="$(pwd)/${*: -1}"
jetSignArgs=("${@:1:$#-1}")
if [[ ! -f "$pathToBeSigned" ]]; then
echo "$pathToBeSigned is missing or not a file"
exit 1
elif isSigned "$pathToBeSigned" && ! isForced "${jetSignArgs[@]}" ; then
echo "Already signed: $pathToBeSigned"
elif [[ "$JETSIGN_CLIENT" == "null" ]]; then
echo "JetSign client is missing, cannot proceed with signing"
exit 1
elif ! isMacOsBinary "$pathToBeSigned" && [[ "$pathToBeSigned" != *.sit ]] && [[ "$pathToBeSigned" != *.tar.gz ]]; then
echo "$pathToBeSigned won't be signed, assumed not to be a macOS executable"
else
if isMacOsBinary "$pathToBeSigned" && ! isSigned "$pathToBeSigned" ; then
echo "Unsigned macOS binary: $pathToBeSigned"
fi
workDir=$(dirname "$pathToBeSigned")
pathSigned="$workDir/signed/${pathToBeSigned##*/}"
jetSignExtensions=$(jetSignExtensions "${jetSignArgs[@]}")
contentType=$(jetSignContentType "$pathToBeSigned")
(
cd "$workDir" || exit 1
"$JETSIGN_CLIENT" -log-format text -denoted-content-type "$contentType" -extensions "$jetSignExtensions" "$pathToBeSigned"
# SRE-1223 (Codesign removes execute bits in executable files) workaround
chmod "$(stat -f %A "$pathToBeSigned")" "$pathSigned"
if isMacOsBinary "$pathSigned"; then
isSigned "$pathSigned"
fi
rm "$pathToBeSigned"
mv "$pathSigned" "$pathToBeSigned"
rm -rf "$workDir/signed"
)
fi

View File

@@ -1,16 +0,0 @@
<?xml version="1.0" encoding="UTF-8"?>
<!DOCTYPE plist PUBLIC "-//Apple//DTD PLIST 1.0//EN" "http://www.apple.com/DTDs/PropertyList-1.0.dtd">
<plist version="1.0">
<dict>
<key>com.apple.security.cs.allow-jit</key>
<true/>
<key>com.apple.security.cs.allow-unsigned-executable-memory</key>
<true/>
<key>com.apple.security.cs.allow-dyld-environment-variables</key>
<true/>
<key>com.apple.security.cs.disable-library-validation</key>
<true/>
<key>com.apple.security.cs.disable-executable-page-protection</key>
<true/>
</dict>
</plist>

View File

@@ -1,63 +0,0 @@
#!/bin/bash
set -euo pipefail
function isForced() {
for arg in "$@"; do
if [[ "$arg" == --force ]]; then
return 0
fi
done
return 1
}
function jetSignExtensions() {
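# Translate codesign-style arguments into a comma-separated list of JetSign extensions,
# e.g. "--sign DevID --options=runtime" -> "mac_codesign_identity=DevID,mac_codesign_options=runtime" (illustrative)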
args=("$@")
((lastElementIndex=${#args[@]}-1))
for index in "${!args[@]}"; do
arg=${args[$index]}
case "$arg" in
--sign | -s)
echo -n 'mac_codesign_identity='
continue
;;
--entitlements)
echo -n 'mac_codesign_entitlements='
continue
;;
--options=runtime)
echo -n 'mac_codesign_options=runtime'
;;
--force)
echo -n 'mac_codesign_force=true'
;;
--timestamp | --verbose | -v)
continue
;;
*)
echo -n "$arg"
;;
esac
if [[ $index != "$lastElementIndex" ]]; then
echo -n ","
fi
done
}
# See jetbrains.sign.util.FileUtil.contentType
function jetSignContentType() {
case "${1##*/}" in
*.sit)
echo -n 'application/x-mac-app-zip'
;;
*.tar.gz)
echo -n 'application/x-mac-app-targz'
;;
*.pkg)
echo -n 'application/x-mac-pkg'
;;
*)
echo -n 'application/x-mac-app-bin'
;;
esac
}

View File

@@ -1,171 +0,0 @@
#!/bin/bash
set -euo pipefail
set -x
# The following parameters must be specified:
# build_number - specifies the number of JetBrainsRuntime build
# bundle_type - specifies the bundle to be built; possible values:
# <empty> or nomod - the release bundles without any additional modules (jcef)
# jcef - the release bundles with jcef
# fd - the fastdebug bundles which also include the jcef module
#
# This script makes test-image along with JDK images when bundle_type is set to "jcef".
# If the character 't' is added at the end of bundle_type then it also makes test-image along with JDK images.
#
# Environment variables:
# JDK_BUILD_NUMBER - specifies update release of OpenJDK build or the value of --with-version-build argument
# to configure
# By default JDK_BUILD_NUMBER is set to zero
# JCEF_PATH - specifies the path to the directory with JCEF binaries.
# By default JCEF binaries should be located in ./jcef_mac
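# A hypothetical invocation of this script (values are illustrative, not from a real CI run):
#   export JCEF_PATH=./jcef_mac JDK_BUILD_NUMBER=7
#   bundle_type=jcef build_number=1234 bash <this script>
# This would configure a release build, make the JDK and test images, and bundle them
# together with the JCEF binaries found under ./jcef_mac.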
source jb/project/tools/common/scripts/common.sh
JCEF_PATH=${JCEF_PATH:=./jcef_mac}
BOOT_JDK=${BOOT_JDK:=$(/usr/libexec/java_home -v 17)}
function do_configure {
if [[ "${architecture}" == *aarch64* ]]; then
ENABLE_CDS="--enable-cds=no"
else
ENABLE_CDS="--enable-cds=yes"
fi
sh configure \
$WITH_DEBUG_LEVEL \
--with-vendor-name="$VENDOR_NAME" \
--with-vendor-version-string="$VENDOR_VERSION_STRING" \
--with-macosx-bundle-name-base=${VENDOR_VERSION_STRING} \
--with-macosx-bundle-id-base="com.jetbrains.jbr" \
--with-jvm-features=shenandoahgc \
--with-version-pre= \
--with-version-build="$JDK_BUILD_NUMBER" \
--with-version-opt=b"$build_number" \
--with-boot-jdk="$BOOT_JDK" \
$ENABLE_CDS \
$STATIC_CONF_ARGS \
$REPRODUCIBLE_BUILD_OPTS \
$WITH_ZIPPED_NATIVE_DEBUG_SYMBOLS \
|| do_exit $?
}
function create_image_bundle {
__bundle_name=$1
__arch_name=$2
__modules_path=$3
__modules=$4
fastdebug_infix=''
tmp=.bundle.$$.tmp
mkdir "$tmp" || do_exit $?
[ "$bundle_type" == "fd" ] && [ "$__arch_name" == "$JBRSDK_BUNDLE" ] && __bundle_name=$__arch_name && fastdebug_infix="fastdebug-"
JBR=${__bundle_name}-${JBSDK_VERSION}-osx-${architecture}-${fastdebug_infix:-}b${build_number}
__root_dir=${__bundle_name}-${JBSDK_VERSION}-osx-${architecture}-${fastdebug_infix:-}b${build_number}
JRE_CONTENTS=$tmp/$__root_dir/Contents
mkdir -p "$JRE_CONTENTS" || do_exit $?
echo Running jlink...
"$JSDK"/bin/jlink \
--module-path "$__modules_path" --no-man-pages --compress=2 \
--add-modules "$__modules" --output "$JRE_CONTENTS/Home" || do_exit $?
grep -v "^JAVA_VERSION" "$JSDK"/release | grep -v "^MODULES" >> "$JRE_CONTENTS/Home/release"
if [ "$__arch_name" == "$JBRSDK_BUNDLE" ]; then
sed 's/JBR/JBRSDK/g' $JRE_CONTENTS/Home/release > release
mv release $JRE_CONTENTS/Home/release
cp $IMAGES_DIR/jdk-bundle/jdk-$JBSDK_VERSION.jdk/Contents/Home/lib/src.zip $JRE_CONTENTS/Home/lib
copy_jmods "$__modules" "$__modules_path" "$JRE_CONTENTS"/Home/jmods
zip_native_debug_symbols $IMAGES_DIR/jdk-bundle/jdk-$JBSDK_VERSION.jdk "${JBR}_diz"
fi
if [ "$bundle_type" == "jcef" ]; then
cat $JCEF_PATH/jcef.version >> "$JRE_CONTENTS/Home/release"
fi
cp -R "$JSDK"/../MacOS "$JRE_CONTENTS"
cp "$JSDK"/../Info.plist "$JRE_CONTENTS"
[ -n "$bundle_type" ] && (cp -a $JCEF_PATH/Frameworks "$JRE_CONTENTS" || do_exit $?)
echo Creating "$JBR".tar.gz ...
# Normalize timestamp
find "$tmp"/"$__root_dir" -print0 | xargs -0 touch -c -h -t "$TOUCH_TIME"
(cd "$tmp" &&
find "$__root_dir" -print0 | LC_ALL=C sort -z | \
COPYFILE_DISABLE=1 tar $REPRODUCIBLE_TAR_OPTS --no-recursion --null -T - \
-czf "$JBR".tar.gz --exclude='*.dSYM' --exclude='man') || do_exit $?
mv "$tmp"/"$JBR".tar.gz "$JBR".tar.gz
rm -rf "$tmp"
}
WITH_DEBUG_LEVEL="--with-debug-level=release"
CONF_ARCHITECTURE=x86_64
if [[ "${architecture}" == *aarch64* ]]; then
CONF_ARCHITECTURE=aarch64
fi
RELEASE_NAME=macosx-${CONF_ARCHITECTURE}-server-release
case "$bundle_type" in
"jcef")
do_reset_changes=1
do_maketest=1
;;
"nomod" | "")
bundle_type=""
;;
"fd")
do_reset_changes=1
WITH_DEBUG_LEVEL="--with-debug-level=fastdebug"
RELEASE_NAME=macosx-${CONF_ARCHITECTURE}-server-fastdebug
JBSDK=macosx-${architecture}-server-release
;;
esac
if [ -z "${INC_BUILD:-}" ]; then
do_configure || do_exit $?
make clean CONF=$RELEASE_NAME || do_exit $?
fi
make images CONF=$RELEASE_NAME || do_exit $?
IMAGES_DIR=build/$RELEASE_NAME/images
JSDK=$IMAGES_DIR/jdk-bundle/jdk-$JBSDK_VERSION.jdk/Contents/Home
JSDK_MODS_DIR=$IMAGES_DIR/jmods
JBRSDK_BUNDLE=jbrsdk
if [ "$bundle_type" == "jcef" ] || [ "$bundle_type" == "fd" ]; then
git apply -p0 < jb/project/tools/patches/add_jcef_module.patch || do_exit $?
update_jsdk_mods "$JSDK" "$JCEF_PATH"/jmods "$JSDK"/jmods "$JSDK_MODS_DIR" || do_exit $?
cp $JCEF_PATH/jmods/* $JSDK_MODS_DIR # $JSDK/jmods is not changed
jbr_name_postfix="_${bundle_type}"
else
jbr_name_postfix=""
fi
# create runtime image bundle
modules=$(xargs < jb/project/tools/common/modules.list | sed s/" "//g) || do_exit $?
create_image_bundle "jbr${jbr_name_postfix}" "jbr" $JSDK_MODS_DIR "$modules" || do_exit $?
# create sdk image bundle
modules=$(cat "$JSDK"/release | grep MODULES | sed s/MODULES=//g | sed s/' '/','/g | sed s/\"//g | sed s/\\n//g) || do_exit $?
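# e.g. a release-file line MODULES="java.base java.instrument jdk.jcmd" (illustrative subset)
# becomes java.base,java.instrument,jdk.jcmd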
if [ "$bundle_type" == "jcef" ] || [ "$bundle_type" == "fd" ] || [ "$bundle_type" == "$JBRSDK_BUNDLE" ]; then
modules=${modules},$(get_mods_list "$JCEF_PATH"/jmods)
fi
create_image_bundle "$JBRSDK_BUNDLE${jbr_name_postfix}" "$JBRSDK_BUNDLE" "$JSDK_MODS_DIR" "$modules" || do_exit $?
if [ $do_maketest -eq 1 ]; then
JBRSDK_TEST=${JBRSDK_BUNDLE}-${JBSDK_VERSION}-osx-test-${architecture}-b${build_number}
echo Creating "$JBRSDK_TEST" ...
[ $do_reset_changes -eq 1 ] && git checkout HEAD jb/project/tools/common/modules.list src/java.desktop/share/classes/module-info.java
make test-image jbr-api CONF=$RELEASE_NAME JBR_API_JBR_VERSION=TEST || do_exit $?
cp "build/${RELEASE_NAME}/jbr-api/jbr-api.jar" "${IMAGES_DIR}/test"
[ -f "$JBRSDK_TEST.tar.gz" ] && rm "$JBRSDK_TEST.tar.gz"
COPYFILE_DISABLE=1 tar -pczf "$JBRSDK_TEST".tar.gz -C $IMAGES_DIR --exclude='test/jdk/demos' test || do_exit $?
fi
do_exit 0

View File

@@ -1,40 +0,0 @@
#!/bin/bash
# Immediately exit the script with an error if a command fails
set -euo pipefail
[[ "${SCRIPT_VERBOSE:-}" == "1" ]] && set -x
APP_PATH=$1
if [[ -z "$APP_PATH" ]]; then
echo "Usage: $0 AppPath"
exit 1
fi
if [[ ! -f "$APP_PATH" ]]; then
echo "AppName '$APP_PATH' does not exist or not a file"
exit 1
fi
function log() {
echo "$(date '+[%H:%M:%S]') $*"
}
# check required parameters
: "${APPLE_ISSUER_ID}"
: "${APPLE_KEY_ID}"
: "${APPLE_PRIVATE_KEY}"
# shellcheck disable=SC2064
trap "rm -f \"$PWD/tmp_key\"" INT EXIT RETURN
echo -n "${APPLE_PRIVATE_KEY}" > tmp_key
log "Notarizing $APP_PATH..."
xcrun notarytool submit --key tmp_key --key-id "${APPLE_KEY_ID}" --issuer "${APPLE_ISSUER_ID}" "$APP_PATH" 2>&1 | tee "notarytool.submit.out"
REQUEST_ID="$(grep -e " id: " "notarytool.submit.out" | grep -oE '([0-9a-f-]{36})'| head -n1)"
xcrun notarytool wait "$REQUEST_ID" --key tmp_key --key-id "${APPLE_KEY_ID}" --issuer "${APPLE_ISSUER_ID}" --timeout 6h ||:
xcrun notarytool log "$REQUEST_ID" --key tmp_key --key-id "${APPLE_KEY_ID}" --issuer "${APPLE_ISSUER_ID}" developer_log.json ||:
xcrun notarytool info "$REQUEST_ID" --key tmp_key --key-id "${APPLE_KEY_ID}" --issuer "${APPLE_ISSUER_ID}"
log "Notarizing finished"

View File

@@ -1,41 +0,0 @@
#!/bin/bash
SCRIPT_DIR="$(cd "$(dirname "$0")" >/dev/null && pwd)"
source "$SCRIPT_DIR/jetsign-common.sh" || exit 1
function isSigned() {
pkgutil --check-signature "$1" >/dev/null 2>&1 && grep -q "signed by a developer certificate" < <(pkgutil --check-signature "$1" 2>&1)
}
# second last argument is a path to be signed
pathToBeSigned="$(pwd)/${*:(-2):1}"
# last argument is a path to signed file
pathOut="$(pwd)/${*:(-1)}"
jetSignArgs=("${@:1:$#-2}")
if [[ ! -f "$pathToBeSigned" ]]; then
echo "$pathToBeSigned is missing or not a file"
exit 1
elif isSigned "$pathToBeSigned" && ! isForced "${jetSignArgs[@]}" ; then
echo "Already signed: $pathToBeSigned"
elif [[ "$JETSIGN_CLIENT" == "null" ]]; then
echo "JetSign client is missing, cannot proceed with signing"
exit 1
elif [[ "$pathToBeSigned" != *.pkg ]]; then
echo "$pathToBeSigned won't be signed, assumed not to be a macOS package"
else
if ! isSigned "$pathToBeSigned" ; then
echo "Unsigned macOS package: $pathToBeSigned"
fi
workDir=$(dirname "$pathToBeSigned")
pathSigned="$workDir/signed/${pathToBeSigned##*/}"
jetSignExtensions=$(jetSignExtensions "${jetSignArgs[@]}")
contentType=$(jetSignContentType "$pathToBeSigned")
(
cd "$workDir" || exit 1
"$JETSIGN_CLIENT" -log-format text -denoted-content-type "$contentType" -extensions "$jetSignExtensions" "$pathToBeSigned"
isSigned "$pathSigned"
rm -f "$pathOut"
mv "$pathSigned" "$pathOut"
rm -rf "$workDir/signed"
)
fi

View File

@@ -1,162 +0,0 @@
#!/bin/bash
# Immediately exit the script with an error if a command fails
set -euo pipefail
[[ "${SCRIPT_VERBOSE:-}" == "1" ]] && set -x
if [[ $# -lt 5 ]]; then
echo "Usage: $0 AppDirectory AppName BundleId CertificateID InstallerCertificateID"
exit 1
fi
APPLICATION_PATH=$1
PKG_NAME=$2
BUNDLE_ID=$3
JB_DEVELOPER_CERT=$4
JB_INSTALLER_CERT=$5
SCRIPT_DIR="$(cd "$(dirname "$0")" >/dev/null && pwd)"
# Use JetBrains sign utility if it's available
if [[ "${JETSIGN_CLIENT:=}" == "null" ]] || [[ "$JETSIGN_CLIENT" == "" ]]; then
JB_SIGN=false
SIGN_UTILITY="codesign"
PRODUCTSIGN_UTILITY="productsign"
else
JB_SIGN=true
SIGN_UTILITY="$SCRIPT_DIR/codesign.sh"
PRODUCTSIGN_UTILITY="$SCRIPT_DIR/productsign.sh"
fi
if [[ ! -d "$APPLICATION_PATH" ]]; then
echo "AppDirectory '$APPLICATION_PATH' does not exist or not a directory"
exit 1
fi
function log() {
echo "$(date '+[%H:%M:%S]') $*"
}
# Cleanup files left from previous sign attempt (if any)
find "$APPLICATION_PATH" -name '*.cstemp' -exec rm '{}' \;
log "Signing libraries and executables..."
# -perm +111 searches for executables
for f in \
"Contents/Home/lib" "Contents/MacOS" \
"Contents/Home/Frameworks" \
"Contents/Frameworks"; do
if [ -d "$APPLICATION_PATH/$f" ]; then
find "$APPLICATION_PATH/$f" \
-type f \( -name "*.jnilib" -o -name "*.dylib" -o -name "*.so" -o -name "*.tbd" -o -name "*.node" -o -perm +111 \) \
-exec "$SIGN_UTILITY" --timestamp \
-v -s "$JB_DEVELOPER_CERT" --options=runtime --force \
--entitlements "$SCRIPT_DIR/entitlements.xml" {} \;
fi
done
log "Signing libraries in jars in $APPLICATION_PATH"
# todo: add set -euo pipefail; into the inner sh -c
# `-e` would break the `grep -q && printf` logic
# with `-o pipefail` there would be no input for the 'while' loop
find "$APPLICATION_PATH" -name '*.jar' \
-exec sh -c "set -u; unzip -l \"\$0\" | grep -q -e '\.dylib\$' -e '\.jnilib\$' -e '\.so\$' -e '\.tbd\$' -e '^jattach\$' && printf \"\$0\0\" " {} \; |
while IFS= read -r -d $'\0' file; do
log "Processing libraries in $file"
rm -rf jarfolder jar.jar
mkdir jarfolder
filename="${file##*/}"
log "Filename: $filename"
cp "$file" jarfolder && (cd jarfolder && jar xf "$filename" && rm "$filename")
find jarfolder \
-type f \( -name "*.jnilib" -o -name "*.dylib" -o -name "*.so" -o -name "*.tbd" -o -name "jattach" \) \
-exec "$SIGN_UTILITY" --timestamp \
--force \
-v -s "$JB_DEVELOPER_CERT" --options=runtime \
--entitlements "$SCRIPT_DIR/entitlements.xml" {} \;
(cd jarfolder; zip -q -r -o -0 ../jar.jar .)
mv jar.jar "$file"
done
rm -rf jarfolder jar.jar
log "Signing other files..."
# shellcheck disable=SC2043
for f in \
"Contents/Home/bin"; do
if [ -d "$APPLICATION_PATH/$f" ]; then
find "$APPLICATION_PATH/$f" \
-type f \( -name "*.jnilib" -o -name "*.dylib" -o -name "*.so" -o -name "*.tbd" -o -perm +111 \) \
-exec "$SIGN_UTILITY" --timestamp \
-v -s "$JB_DEVELOPER_CERT" --options=runtime --force \
--entitlements "$SCRIPT_DIR/entitlements.xml" {} \;
fi
done
log "Signing whole frameworks..."
# shellcheck disable=SC2043
if [ "$JB_SIGN" = true ]; then for f in \
"Contents/Home/Frameworks" "Contents/Frameworks"; do
if [ -d "$APPLICATION_PATH/$f" ]; then
find "$APPLICATION_PATH/$f" \( -name '*.framework' -o -name '*.app' \) -maxdepth 1 | while read -r line
do
log "Signing '$line':"
tar -pczf tmp-to-sign.tar.gz -C "$(dirname "$line")" "$(basename "$line")"
"$SIGN_UTILITY" --timestamp \
-v -s "$JB_DEVELOPER_CERT" --options=runtime \
--force \
--entitlements "$SCRIPT_DIR/entitlements.xml" tmp-to-sign.tar.gz
rm -rf "$line"
tar -xzf tmp-to-sign.tar.gz --directory "$(dirname "$line")"
rm -f tmp-to-sign.tar.gz
done
fi
done; fi
log "Checking framework signatures..."
for f in \
"Contents/Home/Frameworks" "Contents/Frameworks"; do
if [ -d "$APPLICATION_PATH/$f" ]; then
find "$APPLICATION_PATH/$f" -name '*.framework' -maxdepth 1 | while read -r line
do
log "Checking '$line':"
codesign --verify --deep --strict --verbose=4 "$line"
done
fi
done
log "Signing whole app..."
if [ "$JB_SIGN" = true ]; then
tar -pczf tmp-to-sign.tar.gz --exclude='man' -C "$(dirname "$APPLICATION_PATH")" "$(basename "$APPLICATION_PATH")"
"$SIGN_UTILITY" --timestamp \
-v -s "$JB_DEVELOPER_CERT" --options=runtime \
--force \
--entitlements "$SCRIPT_DIR/entitlements.xml" tmp-to-sign.tar.gz
rm -rf "$APPLICATION_PATH"
tar -xzf tmp-to-sign.tar.gz --directory "$(dirname "$APPLICATION_PATH")"
rm -f tmp-to-sign.tar.gz
else
"$SIGN_UTILITY" --timestamp \
-v -s "$JB_DEVELOPER_CERT" --options=runtime \
--force \
--entitlements "$SCRIPT_DIR/entitlements.xml" "$APPLICATION_PATH"
fi
BUILD_NAME="$(basename "$APPLICATION_PATH")"
log "Creating $PKG_NAME..."
rm -rf "$PKG_NAME"
mkdir -p unsigned
pkgbuild --identifier $BUNDLE_ID --root $APPLICATION_PATH \
--install-location /Library/Java/JavaVirtualMachines/${BUILD_NAME} unsigned/${PKG_NAME}
log "Signing $PKG_NAME..."
"$PRODUCTSIGN_UTILITY" --timestamp --sign "$JB_INSTALLER_CERT" unsigned/${PKG_NAME} ${PKG_NAME}
log "Verifying java is not broken"
find "$APPLICATION_PATH" \
-type f -name 'java' -perm +111 -exec {} -version \;

View File

@@ -1,136 +0,0 @@
#!/bin/bash
# Immediately exit the script with an error if a command fails
set -euo pipefail
[[ "${SCRIPT_VERBOSE:-}" == "1" ]] && set -x
export COPY_EXTENDED_ATTRIBUTES_DISABLE=true
export COPYFILE_DISABLE=true
INPUT_FILE=$1
EXPLODED=$2.exploded
BACKUP_JMODS=$2.backup
USERNAME=$3
PASSWORD=$4
CODESIGN_STRING=$5
JB_INSTALLER_CERT=$6
NOTARIZE=$7
BUNDLE_ID=$8
SCRIPT_DIR="$(cd "$(dirname "$0")" >/dev/null && pwd)"
function log() {
echo "$(date '+[%H:%M:%S]') $*"
}
log "Deleting $EXPLODED ..."
if test -d "$EXPLODED"; then
find "$EXPLODED" -mindepth 1 -maxdepth 1 -exec chmod -R u+wx '{}' \;
fi
rm -rf "$EXPLODED"
mkdir "$EXPLODED"
rm -rf "$BACKUP_JMODS"
mkdir "$BACKUP_JMODS"
log "Unzipping $INPUT_FILE to $EXPLODED ..."
tar -xzvf "$INPUT_FILE" --directory $EXPLODED
BUILD_NAME="$(ls "$EXPLODED")"
#sed -i '' s/BNDL/APPL/ $EXPLODED/$BUILD_NAME/Contents/Info.plist
rm -f $EXPLODED/$BUILD_NAME/Contents/CodeResources
rm "$INPUT_FILE"
if test -d $EXPLODED/$BUILD_NAME/Contents/Home/jmods; then
mv $EXPLODED/$BUILD_NAME/Contents/Home/jmods $BACKUP_JMODS
fi
log "$INPUT_FILE extracted and removed"
APP_NAME=$(basename "$INPUT_FILE" | awk -F".tar" '{ print $1 }')
PKG_NAME="$APP_NAME.pkg"
APPLICATION_PATH=$EXPLODED/$(ls $EXPLODED)
find "$APPLICATION_PATH/Contents/Home/bin" \
-maxdepth 1 -type f -name '*.jnilib' -print0 |
while IFS= read -r -d $'\0' file; do
if [ -f "$file" ]; then
log "Linking $file"
b="$(basename "$file" .jnilib)"
ln -sf "$b.jnilib" "$(dirname "$file")/$b.dylib"
fi
done
find "$APPLICATION_PATH/Contents/" \
-maxdepth 1 -type f -name '*.txt' -print0 |
while IFS= read -r -d $'\0' file; do
if [ -f "$file" ]; then
log "Moving $file"
mv "$file" "$APPLICATION_PATH/Contents/Resources"
fi
done
non_plist=$(find "$APPLICATION_PATH/Contents/" -maxdepth 1 -type f -and -not -name 'Info.plist' | wc -l)
if [[ $non_plist -gt 0 ]]; then
log "Only Info.plist file is allowed in Contents directory but found $non_plist file(s):"
log "$(find "$APPLICATION_PATH/Contents/" -maxdepth 1 -type f -and -not -name 'Info.plist')"
exit 1
fi
if [[ "${JETSIGN_CLIENT:=}" == "null" ]] || [[ "$JETSIGN_CLIENT" == "" ]]; then
log "Unlocking keychain..."
# Make sure *.p12 is imported into local KeyChain
security unlock-keychain -p "$PASSWORD" "/Users/$USERNAME/Library/Keychains/login.keychain"
fi
attempt=1
limit=3
set +e
while [[ $attempt -le $limit ]]; do
log "Signing (attempt $attempt) $APPLICATION_PATH ..."
"$SCRIPT_DIR/sign.sh" "$APPLICATION_PATH" "$PKG_NAME" "$BUNDLE_ID" "$CODESIGN_STRING" "$JB_INSTALLER_CERT"
ec=$?
if [[ $ec -ne 0 ]]; then
((attempt += 1))
if [ $attempt -eq $limit ]; then
set -e
fi
log "Signing failed, wait for 30 sec and try to sign again"
sleep 30
else
log "Signing done"
codesign -v "$APPLICATION_PATH" -vvvvv
log "Check sign done"
spctl -a -v $APPLICATION_PATH
((attempt += limit))
fi
done
set -e
if [ "$NOTARIZE" = "yes" ]; then
log "Notarizing..."
"$SCRIPT_DIR/notarize.sh" "$PKG_NAME"
log "Stapling..."
xcrun stapler staple "$APPLICATION_PATH" ||:
xcrun stapler staple "$PKG_NAME" ||:
else
log "Notarization disabled"
log "Stapling disabled"
fi
log "Zipping $BUILD_NAME to $INPUT_FILE ..."
(
#cd "$EXPLODED"
#ditto -c -k --sequesterRsrc --keepParent "$BUILD_NAME" "../$INPUT_FILE"
if test -d $BACKUP_JMODS/jmods; then
mv $BACKUP_JMODS/jmods $APPLICATION_PATH/Contents/Home
fi
if [[ "$APPLICATION_PATH" != "$EXPLODED/$BUILD_NAME" ]]; then
mv $APPLICATION_PATH $EXPLODED/$BUILD_NAME
else
echo "No move, source == destination: $APPLICATION_PATH"
fi
tar -pczvf $INPUT_FILE --exclude='man' -C $EXPLODED $BUILD_NAME
log "Finished zipping"
)
rm -rf "$EXPLODED"
log "Done"

View File

@@ -1,30 +0,0 @@
diff --git jb/project/tools/common/modules.list jb/project/tools/common/modules.list
index 522acb7cb43..c40e689d5de 100644
--- jb/project/tools/common/modules.list
+++ jb/project/tools/common/modules.list
@@ -51,4 +51,7 @@ jdk.unsupported.desktop,
jdk.xml.dom,
jdk.zipfs,
jdk.hotspot.agent,
-jdk.jcmd
+jdk.jcmd,
+jcef,
+gluegen.rt,
+jogl.all
diff --git src/java.desktop/share/classes/module-info.java src/java.desktop/share/classes/module-info.java
index 897647ee368..781d1809493 100644
--- src/java.desktop/share/classes/module-info.java
+++ src/java.desktop/share/classes/module-info.java
@@ -116,7 +116,11 @@ module java.desktop {
// see make/GensrcModuleInfo.gmk
exports sun.awt to
jdk.accessibility,
- jdk.unsupported.desktop;
+ jdk.unsupported.desktop,
+ jcef,
+ jogl.all;
+
+ exports java.awt.peer to jcef;
exports java.awt.dnd.peer to jdk.unsupported.desktop;
exports sun.awt.dnd to jdk.unsupported.desktop;

View File

@@ -1,30 +0,0 @@
diff --git jb/project/tools/common/modules.list jb/project/tools/common/modules.list
index 522acb7..c40e689 100644
--- jb/project/tools/common/modules.list
+++ jb/project/tools/common/modules.list
@@ -51,4 +51,7 @@ jdk.unsupported.desktop,
jdk.xml.dom,
jdk.zipfs,
jdk.hotspot.agent,
-jdk.jcmd
+jdk.jcmd,
+jcef,
+gluegen.rt,
+jogl.all
diff --git src/java.desktop/share/classes/module-info.java src/java.desktop/share/classes/module-info.java
index 897647e..781d180 100644
--- src/java.desktop/share/classes/module-info.java
+++ src/java.desktop/share/classes/module-info.java
@@ -116,7 +116,11 @@ module java.desktop {
// see make/GensrcModuleInfo.gmk
exports sun.awt to
jdk.accessibility,
- jdk.unsupported.desktop;
+ jdk.unsupported.desktop,
+ jcef,
+ jogl.all;
+
+ exports java.awt.peer to jcef;
exports java.awt.dnd.peer to jdk.unsupported.desktop;
exports sun.awt.dnd to jdk.unsupported.desktop;

View File

@@ -1,162 +0,0 @@
#!/bin/bash
set -euo pipefail
TC_PRINT=0
# Always print TeamCity service messages if running under TeamCity
[[ -n "${TEAMCITY_VERSION:-}" ]] && TC_PRINT=1
while getopts ":t" o; do
case "${o}" in
t) TC_PRINT=1 ;;
*);;
esac
done
shift $((OPTIND-1))
NEWFILEPATH="$1"
CONFIGID="$2"
BUILDID="$3"
TOKEN="$4"
if [ ! -f "$NEWFILEPATH" ]; then
echo "File not found: $NEWFILEPATH"
exit 1
fi
#
# Get the size of new artifact
#
unameOut="$(uname -s)"
case "${unameOut}" in
Linux*)
NEWFILESIZE=$(stat -c%s "$NEWFILEPATH")
;;
Darwin*)
NEWFILESIZE=$(stat -f%z "$NEWFILEPATH")
;;
CYGWIN*)
NEWFILESIZE=$(stat -c%s "$NEWFILEPATH")
;;
MINGW*)
NEWFILESIZE=$(stat -c%s "$NEWFILEPATH")
;;
*)
echo "Unknown machine: ${unameOut}"
exit 1
esac
FILENAME=$(basename "${NEWFILEPATH}")
#
# Get pattern of artifact name
# Base filename pattern: <BUNDLE_TYPE>-<JDK_VERSION>-<OS>-<ARCH>-b<BUILD>.tar.gz: jbr_dcevm-17.0.2-osx-x64-b1234.tar.gz
# BUNDLE_TYPE: jbr, jbrsdk, jbr_dcevm, jbrsdk_jcef etc.
# OS_ARCH_PATTERN - <os_architecture>: osx-x64, linux-aarch64, linux-musl-x64, windows-x64 etc.
BUNDLE_TYPE=jbrsdk
OS_ARCH_PATTERN=""
FILE_EXTENSION=tar.gz
re='(jbr[a-z_]*).*-[0-9_\.]+-(.+)-b[0-9]+(.+)'
if [[ $FILENAME =~ $re ]]; then
BUNDLE_TYPE=${BASH_REMATCH[1]}
OS_ARCH_PATTERN=${BASH_REMATCH[2]}
FILE_EXTENSION=${BASH_REMATCH[3]}
else
echo "File name $FILENAME does not match regex $re"
exit 1
fi
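# Worked example with the sample name above: jbr_dcevm-17.0.2-osx-x64-b1234.tar.gz
# yields BUNDLE_TYPE=jbr_dcevm, OS_ARCH_PATTERN=osx-x64 and FILE_EXTENSION=.tar.gz
# (with a leading dot, unlike the tar.gz default above).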
function test_started_msg() {
if [ $TC_PRINT -eq 1 ]; then
echo "##teamcity[testStarted name='$1']"
fi
}
function test_failed_msg() {
if [ $TC_PRINT -eq 1 ]; then
echo "##teamcity[testFailed name='$1' message='$2']"
fi
}
function test_finished_msg() {
if [ $TC_PRINT -eq 1 ]; then
echo "##teamcity[testFinished name='$1']"
fi
}
test_name="${BUNDLE_TYPE}_${OS_ARCH_PATTERN//\-/_}${FILE_EXTENSION//\./_}"
test_started_msg "$test_name"
echo "BUNDLE_TYPE: $BUNDLE_TYPE"
echo "OS_ARCH_PATTERN: $OS_ARCH_PATTERN"
echo "FILE_EXTENSION: $FILE_EXTENSION"
echo "Size of $FILENAME is $NEWFILESIZE bytes"
#
# Get previous successful build ID
# Example:
# CONFIGID=IntellijCustomJdk_Jdk17_Master_LinuxX64jcef
# BUILDID=12345678
#
# expected return value
# id="123".number="567"
#
CURL_RESPONSE=$(curl -sSL --header "Authorization: Bearer $TOKEN" "https://buildserver.labs.intellij.net/app/rest/builds/?locator=buildType:(id:$CONFIGID),status:success,count:1,finishDate:(build:$BUILDID,condition:before)")
re='id=\"([0-9]+)\".+number=\"([0-9\.]+)\"'
# ID: Previous successful build id
ID=0
if [[ $CURL_RESPONSE =~ $re ]]; then
ID=${BASH_REMATCH[1]}
echo "Previous build ID: $ID"
echo "Previous build number: ${BASH_REMATCH[2]}"
else
msg="ERROR: cannot find previous build"
echo "$msg"
echo "$CURL_RESPONSE"
test_failed_msg "$test_name" "$msg"
test_finished_msg "$test_name"
exit 1
fi
#
# Get artifacts from previous successful build
#
# expected return value
# name="jbrsdk_jcef*.tar.gz size="123'
#
CURL_RESPONSE=$(curl -sSL --header "Authorization: Bearer $TOKEN" "https://buildserver.labs.intellij.net/app/rest/builds/$ID?fields=id,number,artifacts(file(name,size))")
echo "Artifacts of the previous build:"
echo "$CURL_RESPONSE"
# Find binary size (in response) with reg exp
re="name=\"(${BUNDLE_TYPE}[^\"]+${OS_ARCH_PATTERN}[^\"]+${FILE_EXTENSION})\" size=\"([0-9]+)\""
if [[ $CURL_RESPONSE =~ $re ]]; then
prevFileName=${BASH_REMATCH[1]}
echo "Previous artifact name: $prevFileName"
prevFileSize=${BASH_REMATCH[2]}
echo "Previous artifact size: $prevFileSize"
((allowedSize=prevFileSize+prevFileSize/20)) # use 5% threshold
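# e.g. a previous artifact of 200000000 bytes allows up to 210000000 bytes (illustrative numbers)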
echo "Allowed size: $allowedSize"
if [[ "$NEWFILESIZE" -gt "$allowedSize" ]]; then
msg="ERROR: new size is significantly greater than previous size (need to investigate)"
echo "$msg"
test_failed_msg "$test_name" "$msg"
test_finished_msg "$test_name"
exit 1
else
echo "PASSED"
test_finished_msg "$test_name"
fi
else
msg="ERROR: cannot find string with size in xml response:"
echo "Regex: $re"
echo "$msg"
echo "$CURL_RESPONSE"
test_failed_msg "$test_name" "$msg"
test_finished_msg "$test_name"
exit 1
fi

View File

@@ -1,93 +0,0 @@
#!/bin/bash
set -euo pipefail
set -x
usage ()
{
echo "Usage: perfcmp.sh [options] <test_results_cur> <test_results_ref> <results> <test_prefix> <noHeaders>"
echo "Options:"
echo -e " -h, --help\tdisplay this help"
echo -e " -tc\tprint teacmity statistic"
echo -e "test_results_cur - the file with metrics values for the current measuring"
echo -e "test_results_ref - the file with metrics values for the reference measuring"
echo -e "results - results of comaprison"
echo -e "test_prefix - specifys measuring type, makes sense for enabled -tc, by default no prefixes"
echo -e "noHeaders - by default 1-st line contains headers"
echo -e ""
echo -e "test_results_* files content should be in csv format with header and tab separator:"
echo -e "The 1-st column is the test name"
echo -e "The 2-st column is the test value"
echo -e ""
echo -e "Example:"
echo -e "Test Value"
echo -e "Testname 51.54"
}
while [ -n "$1" ]
do
case "$1" in
-h | --help) usage
exit 1 ;;
-tc) tc=1
shift
break ;;
*) break;;
esac
done
if [[ "$#" < "3" ]]; then
echo "Error: Invalid arguments"
usage
exit 1
fi
curFile=$1
refFile=$2
resFile=$3
testNamePrefix=$4
noHeaders=$5
echo $curFile
echo $refFile
echo $resFile
curValues=`cat "$curFile" | cut -f 2 | tr -d '\t'`
if [ -z $noHeaders ]; then
curValuesHeader=`echo "$curValues" | head -n +1`_cur
header=`cat "$refFile" | head -n +1 | awk -F'\t' -v x=$curValuesHeader '{print " "$1"\t"$2"_ref\t"x"\tratio"}'`
testContent=`paste -d '\t' $refFile <(echo "$curValues") | tail -n +2`
else
testContent=`paste -d '\t' $refFile <(echo "$curValues") | tail -n +1`
fi
testContent=`echo "$testContent" | tr "," "." | awk -F'\t' '{
if ($3>$2+$2*0.1) {
print "* "$1"\t"$2"\t"$3"\t"(($2>0)?$3/$2:"-")
} else {
print " "$1"\t"$2"\t"$3"\t"(($2>0)?$3/$2:"-")
}
}'`
if [ -z $noHeaders ]; then
echo "$header" > $resFile
fi
echo "$testContent" >> $resFile
cat "$resFile" | tr '\t' ';' | column -t -s ';' | tee $resFile
if [ -z $tc ]; then
exit 0
fi
failed=0
echo "$testContent" 2>&1 | (
while read -r s; do
testname=`echo "$s" | cut -f 1 | tr -d "[:space:]" | tr -d "*"`
duration=`echo "$s" | cut -f 3`
echo "$s" | cut -c1 | grep -c "*" && failed=1
echo \#\#teamcity[testStarted name=\'$testNamePrefix$testname\']
echo "===>$s"
echo \#\#teamcity[buildStatisticValue key=\'$testNamePrefix$testname\' value=\'$duration\']
[ $failed -eq 1 ] && echo \#\#teamcity[testFailed name=\'$testNamePrefix$testname\' message=\'$s\']
echo \#\#teamcity[testFinished name=\'$testNamePrefix$testname\' duration=\'$duration\']
failed=0
done
)

View File

@@ -1,151 +0,0 @@
#!/bin/bash
set -euo pipefail
set -x
# The following parameters must be specified:
# build_number - specifies the number of JetBrainsRuntime build
# bundle_type - specifies the bundle to be built; possible values:
# <empty> or nomod - the release bundles without any additional modules (jcef)
# jcef - the release bundles with jcef
# fd - the fastdebug bundles which also include the jcef module
#
# This script makes test-image along with JDK images when bundle_type is set to "jcef".
# If the character 't' is added at the end of bundle_type then it also makes test-image along with JDK images.
#
# Environment variables:
# JDK_BUILD_NUMBER - specifies update release of OpenJDK build or the value of --with-version-build argument
# to configure
# By default JDK_BUILD_NUMBER is set to zero
# JCEF_PATH - specifies the path to the directory with JCEF binaries.
# By default JCEF binaries should be located in ./jcef_win_aarch64
if [ -z "$BUILD_JDK" ]; then
echo "BUILD_JDK environment variable must be specified and point to a JDK built from the current sources" \
" and is able to run on the build system. See OpenJDK documentation for --with-build-jdk for more info."
exit 1
fi
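# Hypothetical example (path is made up): BUILD_JDK=/cygdrive/c/jdk-17-x64 would point to an
# x64 JDK built from these sources that runs on the x64 build host while this configure
# cross-compiles for aarch64 (see --openjdk-target below).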
source jb/project/tools/common/scripts/common.sh
WORK_DIR=$(pwd)
JCEF_PATH=${JCEF_PATH:=$WORK_DIR/jcef_win_aarch64}
NVDA_PATH=${NVDA_PATH:=$WORK_DIR/nvda_controllerClient}
function do_configure {
sh ./configure \
--enable-option-checking=fatal \
--openjdk-target=aarch64-unknown-cygwin \
$WITH_DEBUG_LEVEL \
--with-vendor-name="$VENDOR_NAME" \
--with-vendor-version-string="$VENDOR_VERSION_STRING" \
--with-jvm-features=shenandoahgc \
--with-version-pre= \
--with-version-build=$JDK_BUILD_NUMBER \
--with-version-opt=b${build_number} \
--with-toolchain-version=$TOOLCHAIN_VERSION \
--with-boot-jdk=$BOOT_JDK \
--with-build-jdk=$BUILD_JDK \
--with-nvdacontrollerclient=$NVDA_PATH \
--disable-ccache \
--enable-cds=yes \
$STATIC_CONF_ARGS \
$REPRODUCIBLE_BUILD_OPTS \
|| do_exit $?
}
function create_image_bundle {
__bundle_name=$1
__arch_name=$2
__modules_path=$3
__modules=$4
fastdebug_infix=''
[ "$bundle_type" == "fd" ] && [ "$__arch_name" == "$JBRSDK_BUNDLE" ] && __bundle_name=$__arch_name && fastdebug_infix="fastdebug-"
__root_dir=${__bundle_name}-${JBSDK_VERSION}-windows-aarch64-${fastdebug_infix}b${build_number}
echo Running jlink ...
${BUILD_JDK}/bin/jlink \
--module-path $__modules_path --no-man-pages --compress=2 \
--add-modules $__modules --output $__root_dir || do_exit $?
grep -v "^JAVA_VERSION" "$JSDK"/release | grep -v "^MODULES" >> $__root_dir/release
if [ "$__arch_name" == "$JBRSDK_BUNDLE" ]; then
sed 's/JBR/JBRSDK/g' $__root_dir/release > release
mv release $__root_dir/release
cp $IMAGES_DIR/jdk/lib/src.zip $__root_dir/lib
for dir in $(ls -d $IMAGES_DIR/jdk/*); do
rsync -amv --include="*/" --include="*.pdb" --exclude="*" $dir $__root_dir
done
copy_jmods "$__modules" "$__modules_path" "$__root_dir"/jmods
fi
}
WITH_DEBUG_LEVEL="--with-debug-level=release"
RELEASE_NAME=windows-aarch64-server-release
case "$bundle_type" in
"jcef")
do_reset_changes=0
do_maketest=1
;;
"nomod" | "")
bundle_type=""
;;
"fd")
do_reset_changes=0
WITH_DEBUG_LEVEL="--with-debug-level=fastdebug"
RELEASE_NAME=windows-aarch64-server-fastdebug
;;
esac
if [ -z "${INC_BUILD:-}" ]; then
do_configure || do_exit $?
if [ $do_maketest -eq 1 ]; then
make LOG=info CONF=$RELEASE_NAME clean images test-image jbr-api JBR_API_JBR_VERSION=TEST || do_exit $?
else
make LOG=info CONF=$RELEASE_NAME clean images || do_exit $?
fi
else
if [ $do_maketest -eq 1 ]; then
make LOG=info CONF=$RELEASE_NAME images test-image jbr-api JBR_API_JBR_VERSION=TEST || do_exit $?
else
make LOG=info CONF=$RELEASE_NAME images || do_exit $?
fi
fi
IMAGES_DIR=build/$RELEASE_NAME/images
JSDK=$IMAGES_DIR/jdk
JSDK_MODS_DIR=$IMAGES_DIR/jmods
JBRSDK_BUNDLE=jbrsdk
where cygpath
if [ $? -eq 0 ]; then
JCEF_PATH="$(cygpath -w $JCEF_PATH | sed 's/\\/\//g')"
fi
if [ "$bundle_type" == "jcef" ] || [ "$bundle_type" == "fd" ]; then
git apply -p0 < jb/project/tools/patches/add_jcef_module_aarch64.patch || do_exit $?
update_jsdk_mods "$BUILD_JDK" "$JCEF_PATH"/jmods "$JSDK"/jmods "$JSDK_MODS_DIR" || do_exit $?
cp $JCEF_PATH/jmods/* $JSDK_MODS_DIR # $JSDK/jmods is not changed
jbr_name_postfix="_${bundle_type}"
cat $JCEF_PATH/jcef.version >> $JSDK/release
else
jbr_name_postfix=""
fi
# create runtime image bundle
modules=$(xargs < jb/project/tools/common/modules.list | sed s/" "//g) || do_exit $?
modules+=",jdk.crypto.mscapi"
create_image_bundle "jbr${jbr_name_postfix}" "jbr" $JSDK_MODS_DIR "$modules" || do_exit $?
# create sdk image bundle
modules=$(cat ${JSDK}/release | grep MODULES | sed s/MODULES=//g | sed s/' '/','/g | sed s/\"//g | sed s/\\r//g | sed s/\\n//g) || do_exit $?
if [ "$bundle_type" == "jcef" ] || [ "$bundle_type" == "fd" ] || [ "$bundle_type" == "$JBRSDK_BUNDLE" ]; then
modules=${modules},$(get_mods_list "$JCEF_PATH"/jmods)
fi
create_image_bundle "$JBRSDK_BUNDLE${jbr_name_postfix}" "$JBRSDK_BUNDLE" "$JSDK_MODS_DIR" "$modules" || do_exit $?
do_exit 0

View File

@@ -1,142 +0,0 @@
#!/bin/bash
set -euo pipefail
set -x
# The following parameters must be specified:
# build_number - specifies the number of JetBrainsRuntime build
# bundle_type - specifies the bundle to be built; possible values:
# <empty> or nomod - the release bundles without any additional modules (jcef)
# jcef - the release bundles with jcef
# fd - the fastdebug bundles which also include the jcef module
#
# This script makes test-image along with JDK images when bundle_type is set to "jcef".
# If the character 't' is added at the end of bundle_type then it also makes test-image along with JDK images.
#
# Environment variables:
# JDK_BUILD_NUMBER - specifies update release of OpenJDK build or the value of --with-version-build argument
# to configure
# By default JDK_BUILD_NUMBER is set to zero
# JCEF_PATH - specifies the path to the directory with JCEF binaries.
# By default JCEF binaries should be located in ./jcef_win_x64
source jb/project/tools/common/scripts/common.sh
WORK_DIR=$(pwd)
JCEF_PATH=${JCEF_PATH:=$WORK_DIR/jcef_win_x64}
NVDA_PATH=${NVDA_PATH:=$WORK_DIR/nvda_controllerClient}
function do_configure {
sh ./configure \
$WITH_DEBUG_LEVEL \
--with-vendor-name="$VENDOR_NAME" \
--with-vendor-version-string="$VENDOR_VERSION_STRING" \
--with-jvm-features=shenandoahgc \
--with-version-pre= \
--with-version-build=$JDK_BUILD_NUMBER \
--with-version-opt=b${build_number} \
--with-toolchain-version=$TOOLCHAIN_VERSION \
--with-boot-jdk=$BOOT_JDK \
--with-nvdacontrollerclient=$NVDA_PATH \
--disable-ccache \
--enable-cds=yes \
$STATIC_CONF_ARGS \
$REPRODUCIBLE_BUILD_OPTS \
|| do_exit $?
}
function create_image_bundle {
__bundle_name=$1
__arch_name=$2
__modules_path=$3
__modules=$4
fastdebug_infix=''
[ "$bundle_type" == "fd" ] && [ "$__arch_name" == "$JBRSDK_BUNDLE" ] && __bundle_name=$__arch_name && fastdebug_infix="fastdebug-"
__root_dir=${__bundle_name}-${JBSDK_VERSION}-windows-x64-${fastdebug_infix}b${build_number}
echo Running jlink ...
${JSDK}/bin/jlink \
--module-path $__modules_path --no-man-pages --compress=2 \
--add-modules $__modules --output $__root_dir || do_exit $?
grep -v "^JAVA_VERSION" "$JSDK"/release | grep -v "^MODULES" >> $__root_dir/release
if [ "$__arch_name" == "$JBRSDK_BUNDLE" ]; then
sed 's/JBR/JBRSDK/g' $__root_dir/release > release
mv release $__root_dir/release
cp $IMAGES_DIR/jdk/lib/src.zip $__root_dir/lib
for dir in $(ls -d $IMAGES_DIR/jdk/*); do
rsync -amv --include="*/" --include="*.pdb" --exclude="*" $dir $__root_dir
done
copy_jmods "$__modules" "$__modules_path" "$__root_dir"/jmods
fi
}
WITH_DEBUG_LEVEL="--with-debug-level=release"
RELEASE_NAME=windows-x86_64-server-release
case "$bundle_type" in
"jcef")
do_reset_changes=0
do_maketest=1
;;
"nomod" | "")
bundle_type=""
;;
"fd")
do_reset_changes=0
WITH_DEBUG_LEVEL="--with-debug-level=fastdebug"
RELEASE_NAME=windows-x86_64-server-fastdebug
;;
esac
if [ -z "${INC_BUILD:-}" ]; then
do_configure || do_exit $?
if [ $do_maketest -eq 1 ]; then
make LOG=info CONF=$RELEASE_NAME clean images test-image jbr-api JBR_API_JBR_VERSION=TEST || do_exit $?
else
make LOG=info CONF=$RELEASE_NAME clean images || do_exit $?
fi
else
if [ $do_maketest -eq 1 ]; then
make LOG=info CONF=$RELEASE_NAME images test-image jbr-api JBR_API_JBR_VERSION=TEST || do_exit $?
else
make LOG=info CONF=$RELEASE_NAME images || do_exit $?
fi
fi
IMAGES_DIR=build/$RELEASE_NAME/images
JSDK=$IMAGES_DIR/jdk
JSDK_MODS_DIR=$IMAGES_DIR/jmods
JBRSDK_BUNDLE=jbrsdk
where cygpath
if [ $? -eq 0 ]; then
JCEF_PATH="$(cygpath -w $JCEF_PATH | sed 's/\\/\//g')"
fi
if [ "$bundle_type" == "jcef" ] || [ "$bundle_type" == "fd" ]; then
git apply -p0 < jb/project/tools/patches/add_jcef_module.patch || do_exit $?
update_jsdk_mods "$JSDK" "$JCEF_PATH"/jmods "$JSDK"/jmods "$JSDK_MODS_DIR" || do_exit $?
cp $JCEF_PATH/jmods/* ${JSDK_MODS_DIR} # $JSDK/jmods is not changed
jbr_name_postfix="_${bundle_type}"
cat $JCEF_PATH/jcef.version >> $JSDK/release
else
jbr_name_postfix=""
fi
# create runtime image bundle
modules=$(xargs < jb/project/tools/common/modules.list | sed s/" "//g) || do_exit $?
modules+=",jdk.crypto.mscapi"
create_image_bundle "jbr${jbr_name_postfix}" "jbr" $JSDK_MODS_DIR "$modules" || do_exit $?
# create sdk image bundle
modules=$(cat ${JSDK}/release | grep MODULES | sed s/MODULES=//g | sed s/' '/','/g | sed s/\"//g | sed s/\\r//g | sed s/\\n//g)
if [ "$bundle_type" == "jcef" ] || [ "$bundle_type" == "fd" ] || [ "$bundle_type" == "$JBRSDK_BUNDLE" ]; then
modules=${modules},$(get_mods_list "$JCEF_PATH"/jmods)
fi
create_image_bundle "$JBRSDK_BUNDLE${jbr_name_postfix}" "$JBRSDK_BUNDLE" "$JSDK_MODS_DIR" "$modules" || do_exit $?
do_exit 0

View File

@@ -1,131 +0,0 @@
#!/bin/bash
set -euo pipefail
set -x
# The following parameters must be specified:
# build_number - specifies the number of JetBrainsRuntime build
# bundle_type - specifies the bundle to be built; possible values:
# <empty> or nomod - the release bundles without any additional modules (jcef)
# jcef - the release bundles with jcef
# fd - the fastdebug bundles which also include the jcef module
#
# $ ./java --version
# openjdk 11.0.6 2020-01-14
# OpenJDK Runtime Environment (build 11.0.6+${JDK_BUILD_NUMBER}-b${build_number})
# OpenJDK 64-Bit Server VM (build 11.0.6+${JDK_BUILD_NUMBER}-b${build_number}, mixed mode)
#
source jb/project/tools/common/scripts/common.sh
WORK_DIR=$(pwd)
NVDA_PATH=${NVDA_PATH:=$WORK_DIR/nvda_controllerClient}
function do_configure {
sh ./configure \
$WITH_DEBUG_LEVEL \
--with-vendor-name="$VENDOR_NAME" \
--with-vendor-version-string="$VENDOR_VERSION_STRING" \
--with-jvm-features=shenandoahgc \
--with-version-pre= \
--with-version-build=$JDK_BUILD_NUMBER \
--with-version-opt=b${build_number} \
--with-toolchain-version=$TOOLCHAIN_VERSION \
--with-boot-jdk=$BOOT_JDK \
--with-nvdacontrollerclient=$NVDA_PATH \
--disable-ccache \
--enable-cds=yes \
$STATIC_CONF_ARGS \
$REPRODUCIBLE_BUILD_OPTS \
|| do_exit $?
}
function create_image_bundle {
__bundle_name=$1
__arch_name=$2
__modules_path=$3
__modules=$4
fastdebug_infix=''
[ "$bundle_type" == "fd" ] && [ "$__arch_name" == "$JBRSDK_BUNDLE" ] && __bundle_name=$__arch_name && fastdebug_infix="fastdebug-"
__root_dir=${__bundle_name}-${JBSDK_VERSION}-windows-x86-${fastdebug_infix}b${build_number}
echo Running jlink ...
${JSDK}/bin/jlink \
--module-path $__modules_path --no-man-pages --compress=2 \
--add-modules $__modules --output $__root_dir || do_exit $?
grep -v "^JAVA_VERSION" "$JSDK"/release | grep -v "^MODULES" >> $__root_dir/release
if [ "$__arch_name" == "$JBRSDK_BUNDLE" ]; then
sed 's/JBR/JBRSDK/g' $__root_dir/release > release
mv release $__root_dir/release
cp $IMAGES_DIR/jdk/lib/src.zip $__root_dir/lib
for dir in $(ls -d $IMAGES_DIR/jdk/*); do
rsync -amv --include="*/" --include="*.pdb" --exclude="*" $dir $__root_dir
done
copy_jmods "$__modules" "$__modules_path" "$__root_dir"/jmods
fi
}
WITH_DEBUG_LEVEL="--with-debug-level=release"
RELEASE_NAME=windows-x86_64-server-release
case "$bundle_type" in
"jcef")
echo "not implemented" && do_exit 1
;;
"nomod" | "")
bundle_type=""
;;
"fd")
do_reset_changes=0
WITH_DEBUG_LEVEL="--with-debug-level=fastdebug"
RELEASE_NAME=windows-x86_64-server-fastdebug
;;
esac
if [ -z "${INC_BUILD:-}" ]; then
do_configure || do_exit $?
if [ $do_maketest -eq 1 ]; then
make LOG=info CONF=$RELEASE_NAME clean images test-image jbr-api JBR_API_JBR_VERSION=TEST || do_exit $?
else
make LOG=info CONF=$RELEASE_NAME clean images || do_exit $?
fi
else
if [ $do_maketest -eq 1 ]; then
make LOG=info CONF=$RELEASE_NAME images test-image jbr-api JBR_API_JBR_VERSION=TEST || do_exit $?
else
make LOG=info CONF=$RELEASE_NAME images || do_exit $?
fi
fi
IMAGES_DIR=build/$RELEASE_NAME/images
JSDK=$IMAGES_DIR/jdk
JSDK_MODS_DIR=$IMAGES_DIR/jmods
JBRSDK_BUNDLE=jbrsdk
if [ "$bundle_type" == "jcef" ] || [ "$bundle_type" == "fd" ]; then
git apply -p0 < jb/project/tools/patches/add_jcef_module.patch || do_exit $?
update_jsdk_mods "$JSDK" "$JCEF_PATH"/jmods "$JSDK"/jmods "$JSDK_MODS_DIR" || do_exit $?
cp $JCEF_PATH/jmods/* ${JSDK_MODS_DIR} # $JSDK/jmods is not changed
jbr_name_postfix="_${bundle_type}"
else
jbr_name_postfix=""
fi
# create runtime image bundle
modules=$(grep -v "jdk.internal.vm" jb/project/tools/common/modules.list | xargs | sed s/" "//g) || do_exit $?
modules+=",jdk.crypto.mscapi"
create_image_bundle "jbr${jbr_name_postfix}" "jbr" $JSDK_MODS_DIR "$modules" || do_exit $?
# create sdk image bundle
modules=$(cat ${JSDK}/release | grep MODULES | sed s/MODULES=//g | sed s/' '/','/g | sed s/\"//g | sed s/\\r//g | sed s/\\n//g)
if [ "$bundle_type" == "jcef" ] || [ "$bundle_type" == "fd" ] || [ "$bundle_type" == "$JBRSDK_BUNDLE" ]; then
modules=${modules},$(get_mods_list "$JCEF_PATH"/jmods)
fi
create_image_bundle "$JBRSDK_BUNDLE${jbr_name_postfix}" "$JBRSDK_BUNDLE" "$JSDK_MODS_DIR" "$modules" || do_exit $?
do_exit 0

View File

@@ -1,57 +0,0 @@
#!/bin/bash
set -euo pipefail
set -x
# The following parameters must be specified:
# build_number - specifies the number of JetBrainsRuntime build
# bundle_type - specifies the bundle to be built; possible values:
# <empty> or nomod - the release bundles without any additional modules (jcef)
# jcef - the release bundles with jcef
# fd - the fastdebug bundles which also include the jcef module
#
# This script packs test-image along with JDK images when bundle_type is set to "jcef".
# If the character 't' is added at the end of bundle_type then it also makes test-image along with JDK images.
#
source jb/project/tools/common/scripts/common.sh
[ "$bundle_type" == "jcef" ] && do_maketest=1
function pack_jbr {
__bundle_name=$1
__arch_name=$2
fastdebug_infix=''
[ "$bundle_type" == "fd" ] && [ "$__arch_name" == "$JBRSDK_BUNDLE" ] && __bundle_name=$__arch_name && fastdebug_infix="fastdebug-"
JBR=${__bundle_name}-${JBSDK_VERSION}-windows-aarch64-${fastdebug_infix}b${build_number}
__root_dir=${__bundle_name}-${JBSDK_VERSION}-windows-aarch64-${fastdebug_infix}b${build_number}
echo Creating $JBR.tar.gz ...
/usr/bin/tar -czf $JBR.tar.gz -C $BASE_DIR $__root_dir || do_exit $?
}
[ "$bundle_type" == "nomod" ] && bundle_type=""
JBRSDK_BUNDLE=jbrsdk
RELEASE_NAME=windows-aarch64-server-release
IMAGES_DIR=build/$RELEASE_NAME/images
BASE_DIR=.
if [ "$bundle_type" == "jcef" ] || [ "$bundle_type" == "dcevm" ] || [ "$bundle_type" == "fd" ]; then
jbr_name_postfix="_${bundle_type}"
else
jbr_name_postfix=""
fi
pack_jbr jbr${jbr_name_postfix} jbr
pack_jbr jbrsdk${jbr_name_postfix} jbrsdk
if [ $do_maketest -eq 1 ]; then
JBRSDK_TEST=$JBRSDK_BUNDLE-$JBSDK_VERSION-windows-test-aarch64-b$build_number
cp "build/${RELEASE_NAME}/jbr-api/jbr-api.jar" "${IMAGES_DIR}/test" || do_exit $?
echo Creating $JBRSDK_TEST.tar.gz ...
/usr/bin/tar -czf $JBRSDK_TEST.tar.gz -C $IMAGES_DIR --exclude='test/jdk/demos' test || do_exit $?
fi

View File

@@ -1,57 +0,0 @@
#!/bin/bash
set -euo pipefail
set -x
# The following parameters must be specified:
# build_number - specifies the number of JetBrainsRuntime build
# bundle_type - specifies the bundle to be built; possible values:
# <empty> or nomod - the release bundles without any additional modules (jcef)
# jcef - the release bundles with jcef
# fd - the fastdebug bundles which also include the jcef module
#
# This script packs test-image along with JDK images when bundle_type is set to "jcef".
# If the character 't' is added at the end of bundle_type then it also makes test-image along with JDK images.
#
source jb/project/tools/common/scripts/common.sh
[ "$bundle_type" == "jcef" ] && do_maketest=1
function pack_jbr {
__bundle_name=$1
__arch_name=$2
fastdebug_infix=''
[ "$bundle_type" == "fd" ] && [ "$__arch_name" == "$JBRSDK_BUNDLE" ] && __bundle_name=$__arch_name && fastdebug_infix="fastdebug-"
JBR=${__bundle_name}-${JBSDK_VERSION}-windows-x64-${fastdebug_infix}b${build_number}
__root_dir=${__bundle_name}-${JBSDK_VERSION}-windows-x64-${fastdebug_infix}b${build_number}
echo Creating $JBR.tar.gz ...
chmod -R ug+rwx,o+rx ${BASE_DIR}/$__root_dir
/usr/bin/tar -czf $JBR.tar.gz -C $BASE_DIR $__root_dir || do_exit $?
}
[ "$bundle_type" == "nomod" ] && bundle_type=""
JBRSDK_BUNDLE=jbrsdk
RELEASE_NAME=windows-x86_64-server-release
IMAGES_DIR=build/$RELEASE_NAME/images
BASE_DIR=.
if [ "$bundle_type" == "jcef" ] || [ "$bundle_type" == "dcevm" ] || [ "$bundle_type" == "fd" ]; then
jbr_name_postfix="_${bundle_type}"
else
jbr_name_postfix=""
fi
pack_jbr jbr${jbr_name_postfix} jbr
pack_jbr jbrsdk${jbr_name_postfix} jbrsdk
if [ $do_maketest -eq 1 ]; then
JBRSDK_TEST=$JBRSDK_BUNDLE-$JBSDK_VERSION-windows-test-x64-b$build_number
cp "build/${RELEASE_NAME}/jbr-api/jbr-api.jar" "${IMAGES_DIR}/test" || do_exit $?
echo Creating $JBRSDK_TEST.tar.gz ...
/usr/bin/tar -czf $JBRSDK_TEST.tar.gz -C $IMAGES_DIR --exclude='test/jdk/demos' test || do_exit $?
fi

View File

@@ -1,53 +0,0 @@
#!/bin/bash
set -euo pipefail
set -x
# The following parameters must be specified:
# build_number - specifies the number of JetBrainsRuntime build
# bundle_type - specifies the bundle to be built; possible values:
# <empty> or nomod - the release bundles without any additional modules (jcef)
# fd - the fastdebug bundles which also include the jcef module
#
source jb/project/tools/common/scripts/common.sh
[ "$bundle_type" == "jcef" ] && echo "not implemented" && do_exit 1
function pack_jbr {
__bundle_name=$1
__arch_name=$2
fastdebug_infix=''
[ "$bundle_type" == "fd" ] && [ "$__arch_name" == "$JBRSDK_BUNDLE" ] && __bundle_name=$__arch_name && fastdebug_infix="fastdebug-"
JBR=${__bundle_name}-${JBSDK_VERSION}-windows-x86-${fastdebug_infix}b${build_number}
__root_dir=${__bundle_name}-${JBSDK_VERSION}-windows-x86-${fastdebug_infix}b${build_number}
echo Creating $JBR.tar.gz ...
chmod -R ug+rwx,o+rx ${BASE_DIR}/$__root_dir
/usr/bin/tar -czf $JBR.tar.gz -C $BASE_DIR $__root_dir || do_exit $?
}
[ "$bundle_type" == "nomod" ] && bundle_type=""
JBRSDK_BUNDLE=jbrsdk
RELEASE_NAME=windows-x86_64-server-release
IMAGES_DIR=build/$RELEASE_NAME/images
BASE_DIR=.
if [ "$bundle_type" == "jcef" ] || [ "$bundle_type" == "dcevm" ] || [ "$bundle_type" == "fd" ]; then
jbr_name_postfix="_${bundle_type}"
else
jbr_name_postfix=""
fi
pack_jbr jbr${jbr_name_postfix} jbr
pack_jbr jbrsdk${jbr_name_postfix} jbrsdk
if [ $do_maketest -eq 1 ]; then
JBRSDK_TEST=$JBRSDK_BUNDLE-$JBSDK_VERSION-windows-test-x86-b$build_number
cp "build/${RELEASE_NAME}/jbr-api/jbr-api.jar" "${IMAGES_DIR}/test" || do_exit $?
echo Creating $JBRSDK_TEST.tar.gz ...
/usr/bin/tar -czf $JBRSDK_TEST.tar.gz -C $BASE_DIR --exclude='test/jdk/demos' test || do_exit $?
fi

View File

@@ -1,5 +1,5 @@
#
# Copyright (c) 2016, 2023, Oracle and/or its affiliates. All rights reserved.
# Copyright (c) 2016, 2020, Oracle and/or its affiliates. All rights reserved.
# DO NOT ALTER OR REMOVE COPYRIGHT NOTICES OR THIS FILE HEADER.
#
# This code is free software; you can redistribute it and/or modify it
@@ -449,7 +449,7 @@ ifneq ($(filter jcov-bundles, $(MAKECMDGOALS)), )
BUNDLE_NAME := $(JCOV_BUNDLE_NAME), \
FILES := $(JCOV_BUNDLE_FILES), \
BASE_DIRS := $(JCOV_IMAGE_DIR), \
SUBDIR := jdk-$(VERSION_NUMBER), \
SUBDIR := $(JDK_BUNDLE_SUBDIR), \
))
JCOV_TARGETS += $(BUILD_JCOV_BUNDLE)

View File

@@ -171,41 +171,41 @@ $(BUILD_DEMO_CodePointIM_JAR): $(CODEPOINT_METAINF_SERVICE_FILE)
$(eval $(call SetupBuildDemo, FileChooserDemo, \
DEMO_SUBDIR := jfc, \
DISABLED_WARNINGS := rawtypes deprecation unchecked this-escape, \
DISABLED_WARNINGS := rawtypes deprecation unchecked, \
))
$(eval $(call SetupBuildDemo, SwingSet2, \
DEMO_SUBDIR := jfc, \
EXTRA_COPY_TO_JAR := .java, \
EXTRA_MANIFEST_ATTR := SplashScreen-Image: resources/images/splash.png, \
DISABLED_WARNINGS := rawtypes deprecation unchecked static serial cast this-escape, \
DISABLED_WARNINGS := rawtypes deprecation unchecked static serial cast, \
))
$(eval $(call SetupBuildDemo, Font2DTest, \
DISABLED_WARNINGS := rawtypes deprecation unchecked serial cast this-escape, \
DISABLED_WARNINGS := rawtypes deprecation unchecked serial cast, \
DEMO_SUBDIR := jfc, \
))
$(eval $(call SetupBuildDemo, J2Ddemo, \
DEMO_SUBDIR := jfc, \
MAIN_CLASS := java2d.J2Ddemo, \
DISABLED_WARNINGS := rawtypes deprecation unchecked cast lossy-conversions this-escape, \
DISABLED_WARNINGS := rawtypes deprecation unchecked cast lossy-conversions, \
JAR_NAME := J2Ddemo, \
))
$(eval $(call SetupBuildDemo, Metalworks, \
DISABLED_WARNINGS := rawtypes unchecked this-escape, \
DISABLED_WARNINGS := rawtypes unchecked, \
DEMO_SUBDIR := jfc, \
))
$(eval $(call SetupBuildDemo, Notepad, \
DISABLED_WARNINGS := rawtypes this-escape, \
DISABLED_WARNINGS := rawtypes, \
DEMO_SUBDIR := jfc, \
))
$(eval $(call SetupBuildDemo, Stylepad, \
DEMO_SUBDIR := jfc, \
DISABLED_WARNINGS := rawtypes unchecked this-escape, \
DISABLED_WARNINGS := rawtypes unchecked, \
EXTRA_SRC_DIR := $(DEMO_SHARE_SRC)/jfc/Notepad, \
EXCLUDE_FILES := $(DEMO_SHARE_SRC)/jfc/Notepad/README.txt, \
))
@@ -215,12 +215,11 @@ $(eval $(call SetupBuildDemo, SampleTree, \
))
$(eval $(call SetupBuildDemo, TableExample, \
DISABLED_WARNINGS := rawtypes unchecked deprecation this-escape, \
DISABLED_WARNINGS := rawtypes unchecked deprecation, \
DEMO_SUBDIR := jfc, \
))
$(eval $(call SetupBuildDemo, TransparentRuler, \
DISABLED_WARNINGS := this-escape, \
DEMO_SUBDIR := jfc, \
MAIN_CLASS := transparentruler.Ruler, \
))

View File

@@ -1,5 +1,5 @@
#
# Copyright (c) 2014, 2023, Oracle and/or its affiliates. All rights reserved.
# Copyright (c) 2014, 2022, Oracle and/or its affiliates. All rights reserved.
# DO NOT ALTER OR REMOVE COPYRIGHT NOTICES OR THIS FILE HEADER.
#
# This code is free software; you can redistribute it and/or modify it
@@ -98,7 +98,6 @@ define SetupInterimModule
EXCLUDES := sun javax/tools/snippet-files, \
EXCLUDE_FILES := $(TOPDIR)/src/$1/share/classes/module-info.java \
$(TOPDIR)/src/$1/share/classes/javax/tools/ToolProvider.java \
$(TOPDIR)/src/$1/share/classes/com/sun/tools/javac/launcher/Main.java \
Standard.java, \
EXTRA_FILES := $(BUILDTOOLS_OUTPUTDIR)/gensrc/$1.interim/module-info.java \
$($1.interim_EXTRA_FILES), \
@@ -110,9 +109,7 @@ define SetupInterimModule
$$(INTERIM_LANGTOOLS_ADD_EXPORTS) \
--patch-module java.base=$(BUILDTOOLS_OUTPUTDIR)/gensrc/java.base.interim \
--add-exports java.base/jdk.internal.javac=java.compiler.interim \
--add-exports java.base/jdk.internal.javac=jdk.compiler.interim \
--add-exports jdk.internal.opt/jdk.internal.opt=jdk.compiler.interim \
--add-exports jdk.internal.opt/jdk.internal.opt=jdk.javadoc.interim, \
--add-exports java.base/jdk.internal.javac=jdk.compiler.interim, \
))
$1_DEPS_INTERIM := $$(addsuffix .interim, $$(filter \
@@ -128,20 +125,5 @@ $(foreach m, $(INTERIM_LANGTOOLS_BASE_MODULES), \
)
################################################################################
# Setup the compilation of the javac server build tool. Technically, this is not
# really "interim" langtools, but just like it, it is needed henceforth for all
# java compilation using the interim compiler.
$(eval $(call SetupJavaCompilation, BUILD_JAVAC_SERVER, \
COMPILER := bootjdk, \
TARGET_RELEASE := $(TARGET_RELEASE_BOOTJDK), \
SRC := $(TOPDIR)/make/langtools/tools, \
INCLUDES := javacserver, \
BIN := $(BUILDTOOLS_OUTPUTDIR)/langtools_javacserver_classes, \
))
TARGETS += $(BUILD_JAVAC_SERVER)
################################################################################
all: $(TARGETS)

View File

@@ -53,7 +53,7 @@ $(eval $(call SetupJavaCompilation, BUILD_JIGSAW_TOOLS, \
build/tools/jigsaw, \
COPY := .properties .html, \
BIN := $(TOOLS_CLASSES_DIR), \
DISABLED_WARNINGS := fallthrough this-escape, \
DISABLED_WARNINGS := fallthrough, \
JAVAC_FLAGS := \
--add-modules jdk.jdeps \
--add-exports java.base/jdk.internal.module=ALL-UNNAMED \

View File

@@ -1,5 +1,5 @@
#
# Copyright (c) 2011, 2023, Oracle and/or its affiliates. All rights reserved.
# Copyright (c) 2011, 2022, Oracle and/or its affiliates. All rights reserved.
# DO NOT ALTER OR REMOVE COPYRIGHT NOTICES OR THIS FILE HEADER.
#
# This code is free software; you can redistribute it and/or modify it
@@ -57,9 +57,7 @@ $(eval $(call SetupJavaCompilation, BUILD_TOOLS_JDK, \
JAVAC_FLAGS := \
--add-exports java.desktop/sun.awt=ALL-UNNAMED \
--add-exports java.base/sun.text=ALL-UNNAMED \
--add-exports java.base/sun.security.util=ALL-UNNAMED \
--add-exports jdk.internal.opt/jdk.internal.opt=jdk.compiler.interim \
--add-exports jdk.internal.opt/jdk.internal.opt=jdk.javadoc.interim, \
--add-exports java.base/sun.security.util=ALL-UNNAMED, \
))
TARGETS += $(BUILD_TOOLS_JDK)
@@ -71,13 +69,6 @@ $(eval $(call SetupCopyFiles,COPY_NIMBUS_TEMPLATES, \
TARGETS += $(COPY_NIMBUS_TEMPLATES)
$(eval $(call SetupCopyFiles,COPY_CLDRCONVERTER_PROPERTIES, \
SRC := $(TOPDIR)/make/jdk/src/classes/build/tools/cldrconverter, \
DEST := $(BUILDTOOLS_OUTPUTDIR)/jdk_tools_classes/build/tools/cldrconverter, \
FILES := $(wildcard $(TOPDIR)/make/jdk/src/classes/build/tools/cldrconverter/*.properties)))
TARGETS += $(COPY_CLDRCONVERTER_PROPERTIES)
################################################################################
$(eval $(call SetupJavaCompilation, COMPILE_DEPEND, \
@@ -92,9 +83,7 @@ $(eval $(call SetupJavaCompilation, COMPILE_DEPEND, \
--add-exports jdk.compiler/com.sun.tools.javac.comp=ALL-UNNAMED \
--add-exports jdk.compiler/com.sun.tools.javac.main=ALL-UNNAMED \
--add-exports jdk.compiler/com.sun.tools.javac.tree=ALL-UNNAMED \
--add-exports jdk.compiler/com.sun.tools.javac.util=ALL-UNNAMED \
--add-exports jdk.internal.opt/jdk.internal.opt=jdk.compiler.interim \
--add-exports jdk.internal.opt/jdk.internal.opt=jdk.javadoc.interim, \
--add-exports jdk.compiler/com.sun.tools.javac.util=ALL-UNNAMED, \
))
DEPEND_SERVICE_PROVIDER := $(BUILDTOOLS_OUTPUTDIR)/depend/META-INF/services/com.sun.source.util.Plugin

View File

@@ -1,4 +1,4 @@
# Copyright (c) 1997, 2023, Oracle and/or its affiliates. All rights reserved.
# Copyright (c) 1997, 2021, Oracle and/or its affiliates. All rights reserved.
# DO NOT ALTER OR REMOVE COPYRIGHT NOTICES OR THIS FILE HEADER.
#
# This code is free software; you can redistribute it and/or modify it
@@ -80,7 +80,6 @@ JAVADOC_TAGS := \
-taglet build.tools.taglet.JSpec\$$JLS \
-taglet build.tools.taglet.JSpec\$$JVMS \
-taglet build.tools.taglet.ModuleGraph \
-taglet build.tools.taglet.SealedGraph \
-taglet build.tools.taglet.ToolGuide \
-tag since \
-tag serialData \
@@ -102,23 +101,17 @@ REFERENCE_TAGS := $(JAVADOC_TAGS)
JAVADOC_DISABLED_DOCLINT_WARNINGS := missing
JAVADOC_DISABLED_DOCLINT_PACKAGES := org.w3c.* javax.smartcardio
# Allow overriding on the command line
# (intentionally sharing name with the javac option)
JAVA_WARNINGS_ARE_ERRORS ?= -Werror
# The initial set of options for javadoc
JAVADOC_OPTIONS := -use -keywords -notimestamp \
-encoding ISO-8859-1 -docencoding UTF-8 -breakiterator \
-splitIndex --system none -javafx --expand-requires transitive \
--override-methods=summary \
--no-external-specs-page
--override-methods=summary
# The reference options must stay stable to allow for comparisons across the
# development cycle.
REFERENCE_OPTIONS := -XDignore.symbol.file=true -use -keywords -notimestamp \
-encoding ISO-8859-1 -breakiterator -splitIndex --system none \
-html5 -javafx --expand-requires transitive \
--no-external-specs-page
-html5 -javafx --expand-requires transitive
# Should we add DRAFT stamps to the generated javadoc?
ifeq ($(VERSION_IS_GA), true)
@@ -194,55 +187,25 @@ JAVASE_LONG_NAME := Java<sup>&reg;</sup> Platform, Standard Edition
# Functions
# Helper function for creating a svg file from a dot file generated by the
# GenGraphs tool for a module.
# GenGraphs tool.
# param 1: SetupJavadocGeneration namespace ($1)
# param 2: module name
#
define setup_module_graph_dot_to_svg
$1_$2_DOT_SRC := $$($1_MODULE_GRAPHS_DIR)/$2.dot
define setup_gengraph_dot_to_svg
$1_$2_DOT_SRC := $$($1_GENGRAPHS_DIR)/$2.dot
$1_$2_SVG_TARGET := $$($1_TARGET_DIR)/$2/module-graph.svg
# For each module needing a graph, create a svg file from the dot file
# generated by the GenGraphs tool and store it in the target dir.
$$(eval $$(call SetupExecute, module_graphs_svg_$1_$2, \
$$(eval $$(call SetupExecute, gengraphs_svg_$1_$2, \
INFO := Running dot for module graphs for $2, \
DEPS := $$(module_graphs_dot_$1_TARGET), \
DEPS := $$(gengraphs_$1_TARGET), \
OUTPUT_FILE := $$($1_$2_SVG_TARGET), \
SUPPORT_DIR := $$($1_MODULE_GRAPHS_DIR), \
SUPPORT_DIR := $$($1_GENGRAPHS_DIR), \
COMMAND := $$(DOT) -Tsvg -o $$($1_$2_SVG_TARGET) $$($1_$2_DOT_SRC), \
))
$1_GRAPHS_TARGETS += $$($1_$2_SVG_TARGET)
endef
# Helper function for creating a svg file for a class for which the SealedGraph
# taglet has generated a dot file. The dot file has a special name which
# encodes the module and class the graph belongs to.
#
# param 1: SetupJavadocGeneration namespace ($1)
# param 2: dot file name
#
define setup_sealed_graph_dot_to_svg
$1_$2_DOT_SRC := $$($1_SEALED_GRAPHS_DIR)/$2.dot
$1_$2_TARGET_CLASS := $$(word 2, $$(subst _, , $2))
$1_$2_SLASHED_NAME := $$(subst .,/, $$($1_$2_TARGET_CLASS))
$1_$2_TARGET_MODULE := $$(word 1, $$(subst _, , $2))
$1_$2_TARGET_PATH := $$($1_TARGET_DIR)/$$($1_$2_TARGET_MODULE)/$$(dir $$($1_$2_SLASHED_NAME))
$1_$2_TARGET_NAME := $$(notdir $$($1_$2_SLASHED_NAME))
$1_$2_SVG_TARGET := $$($1_$2_TARGET_PATH)/$$($1_$2_TARGET_NAME)-sealed-graph.svg
$$(call MakeDir, $$($1_$2_TARGET_PATH))
# For each class needing a graph, create a svg file from the dot file
# generated by the SealedGraph taglet and store it in the target dir.
$$(eval $$(call SetupExecute, sealed_graphs_svg_$1_$2, \
INFO := Running dot for sealed graphs for $$($1_$2_TARGET_MODULE)/$$($1_$2_TARGET_CLASS), \
DEPS := $$($1_$2_DOT_SRC), \
OUTPUT_FILE := $$($1_$2_SVG_TARGET), \
SUPPORT_DIR := $$($1_SEALED_GRAPHS_DIR), \
COMMAND := $$(DOT) -Tsvg -o $$($1_$2_SVG_TARGET) $$($1_$2_DOT_SRC), \
))
$1_GRAPHS_TARGETS += $$($1_$2_SVG_TARGET)
$1_MODULEGRAPH_TARGETS += $$($1_$2_SVG_TARGET)
endef
# Helper function to create the overview.html file to use with the -overview
@@ -290,7 +253,7 @@ endef
#
# Parameter 1 is the name of the rule. This name is used as variable prefix.
# Targets generated are returned as $1_JAVADOC_TARGETS and
# $1_GRAPHS_TARGETS. Note that the index.html file will work as a "touch
# $1_MODULEGRAPH_TARGETS. Note that the index.html file will work as a "touch
# file" for all the magnitude of files that are generated by javadoc.
#
# Remaining parameters are named arguments. These include:
@@ -313,12 +276,9 @@ define SetupApiDocsGenerationBody
-Djspec.version=$$(VERSION_SPECIFICATION)
ifeq ($$(ENABLE_FULL_DOCS), true)
$1_SEALED_GRAPHS_DIR := $$(SUPPORT_OUTPUTDIR)/docs/$1-sealed-graphs
# Tell the ModuleGraph and SealedGraph taglets to generate html links to
# soon-to-be-created svg files with module/sealed graphs.
$1_JAVA_ARGS += -DenableModuleGraph=true -DsealedDotOutputDir=$$($1_SEALED_GRAPHS_DIR)
$$(call MakeDir, $$($1_SEALED_GRAPHS_DIR))
# Tell the ModuleGraph taglet to generate html links to soon-to-be-created
# svg files with module graphs.
$1_JAVA_ARGS += -DenableModuleGraph=true
endif
# Start with basic options and tags
@@ -339,7 +299,6 @@ define SetupApiDocsGenerationBody
# Ignore the doclint warnings in certain packages
$1_OPTIONS += -Xdoclint/package:$$(call CommaList, $$(addprefix -, \
$$(JAVADOC_DISABLED_DOCLINT_PACKAGES)))
$1_OPTIONS += $$(JAVA_WARNINGS_ARE_ERRORS)
$1_DOC_TITLE := $$($1_LONG_NAME)<br>Version $$(VERSION_SPECIFICATION) API \
Specification
@@ -425,46 +384,30 @@ define SetupApiDocsGenerationBody
# First we run the GenGraph tool. It will query the module structure of the
# running JVM and output .dot files for all existing modules.
MODULE_GRAPHS_PROPS := \
GENGRAPHS_PROPS := \
$$(TOPDIR)/make/jdk/src/classes/build/tools/jigsaw/javadoc-graphs.properties
$1_MODULE_GRAPHS_DIR := $$(SUPPORT_OUTPUTDIR)/docs/$1-module-graphs
$1_GENGRAPHS_DIR := $$(SUPPORT_OUTPUTDIR)/docs/$1-gengraphs
$$(eval $$(call SetupExecute, module_graphs_dot_$1, \
INFO := Generating module graphs for $1 documentation, \
DEPS := $$(BUILD_JIGSAW_TOOLS) $$(MODULE_GRAPHS_PROPS), \
OUTPUT_DIR := $$($1_MODULE_GRAPHS_DIR), \
COMMAND := $$(TOOL_GENGRAPHS) --spec --output $$($1_MODULE_GRAPHS_DIR) \
--dot-attributes $$(MODULE_GRAPHS_PROPS), \
$$(eval $$(call SetupExecute, gengraphs_$1, \
INFO := Running gengraphs for $1 documentation, \
DEPS := $$(BUILD_JIGSAW_TOOLS) $$(GENGRAPHS_PROPS), \
OUTPUT_DIR := $$($1_GENGRAPHS_DIR), \
COMMAND := $$(TOOL_GENGRAPHS) --spec --output $$($1_GENGRAPHS_DIR) \
--dot-attributes $$(GENGRAPHS_PROPS), \
))
# For each module needing a graph, create a svg file from the dot file
# generated by the GenGraphs tool and store it in the target dir.
# They will depend on module_graphs_dot_$1_TARGET, and will be added to
# $1_GRAPHS_TARGETS.
# They will depend on gengraphs_$1_TARGET, and will be added to $1.
$$(foreach m, $$($1_MODULES_NEEDING_GRAPH), \
$$(eval $$(call setup_module_graph_dot_to_svg,$1,$$m)) \
)
# We have asked SealedGraph to generate dot files and links to svg files.
# Now we must produce the svg files from the dot files.
# Get a list of classes for which SealedGraph has generated dot files
$1_SEALED_CLASSES := $$(patsubst %.dot,%,$$(patsubst \
$$($1_SEALED_GRAPHS_DIR)/%,%, \
$$(wildcard $$($1_SEALED_GRAPHS_DIR)/*.dot)))
# For each class needing a graph, create a svg file from the dot file
# generated by the SealedGraph taglet and store it in the target dir.
# They will be added to $1_GRAPHS_TARGETS.
$$(foreach c, $$($1_SEALED_CLASSES), \
$$(eval $$(call setup_sealed_graph_dot_to_svg,$1,$$c)) \
$$(eval $$(call setup_gengraph_dot_to_svg,$1,$$m)) \
)
endif
endef
################################################################################
# Setup generation of the JDK API documentation (javadoc + graphs)
# Setup generation of the JDK API documentation (javadoc + modulegraph)
# Define the groups of the JDK API documentation
JavaSE_GROUP_NAME := Java SE
@@ -513,10 +456,10 @@ $(eval $(call SetupApiDocsGeneration, JDK_API, \
))
# Targets generated are returned in JDK_API_JAVADOC_TARGETS and
# JDK_API_GRAPHS_TARGETS.
# JDK_API_MODULEGRAPH_TARGETS.
################################################################################
# Setup generation of the Java SE API documentation (javadoc + graphs)
# Setup generation of the Java SE API documentation (javadoc + modulegraph)
# The Java SE module scope is just java.se and its transitive indirect
# exports.
@@ -530,10 +473,10 @@ $(eval $(call SetupApiDocsGeneration, JAVASE_API, \
))
# Targets generated are returned in JAVASE_API_JAVADOC_TARGETS and
# JAVASE_API_GRAPHS_TARGETS.
# JAVASE_API_MODULEGRAPH_TARGETS.
################################################################################
# Setup generation of the reference Java SE API documentation (javadoc + graphs)
# Setup generation of the reference Java SE API documentation (javadoc + modulegraph)
# The reference javadoc is just the same as javase, but using the BootJDK javadoc
# and a stable set of javadoc options. Typically it is used for generating
@@ -551,7 +494,7 @@ $(eval $(call SetupApiDocsGeneration, REFERENCE_API, \
))
# Targets generated are returned in REFERENCE_API_JAVADOC_TARGETS and
# REFERENCE_API_GRAPHS_TARGETS.
# REFERENCE_API_MODULEGRAPH_TARGETS.
################################################################################
@@ -621,7 +564,7 @@ $(foreach n, 0 1 2, \
$(eval specs_bottom_rel_path := $(specs_bottom_rel_path)../) \
)
SPECS_TOP := $(if $(filter true, $(IS_DRAFT)), <header class="draft-header" role="banner">$(DRAFT_TEXT)</header>)
SPECS_TOP := $(if $(filter true, $(IS_DRAFT)), <header class="draft-header">$(DRAFT_TEXT)</header>)
# For all html files in $module/share/specs directories, copy and add the
# copyright footer.
@@ -650,9 +593,6 @@ ifeq ($(ENABLE_PANDOC), true)
# html, if we have pandoc (otherwise we'll just skip this).
GLOBAL_SPECS_DEFAULT_CSS_FILE := $(DOCS_OUTPUTDIR)/resources/jdk-default.css
# Unset the following to suppress the link to the tool guides
NAV_LINK_GUIDES := --nav-link-guides
HEADER_RIGHT_SIDE_INFO := <strong>$(subst &amp;,&,$(JDK_SHORT_NAME))$(DRAFT_MARKER_STR)</strong>
$(foreach m, $(ALL_MODULES), \
$(eval SPECS_$m := $(call FindModuleSpecsDirs, $m)) \
@@ -669,8 +609,7 @@ ifeq ($(ENABLE_PANDOC), true)
REPLACEMENTS := \
@@VERSION_SPECIFICATION@@ => $(VERSION_SPECIFICATION) ; \
@@VERSION_STRING@@ => $(VERSION_STRING), \
POST_PROCESS := $(TOOL_FIXUPPANDOC) --insert-nav --nav-right-info '$(HEADER_RIGHT_SIDE_INFO)' \
--nav-subdirs $($m_$f_NOF_SUBDIRS) $(NAV_LINK_GUIDES), \
POST_PROCESS := $(TOOL_FIXUPPANDOC), \
)) \
$(eval JDK_SPECS_TARGETS += $($($m_$f_NAME))) \
) \
@@ -704,8 +643,7 @@ ifeq ($(ENABLE_PANDOC), true)
@@VERSION_SHORT@@ => $(VERSION_SHORT) ; \
@@VERSION_SPECIFICATION@@ => $(VERSION_SPECIFICATION), \
OPTIONS := --toc -V include-before='$(SPECS_TOP)' -V include-after='$(SPECS_BOTTOM_1)', \
POST_PROCESS := $(TOOL_FIXUPPANDOC) --insert-nav --nav-right-info '$(HEADER_RIGHT_SIDE_INFO)' \
--nav-subdirs 1 --nav-link-guides, \
POST_PROCESS := $(TOOL_FIXUPPANDOC), \
EXTRA_DEPS := $(PANDOC_HTML_MANPAGE_FILTER) \
$(PANDOC_HTML_MANPAGE_FILTER_SOURCE), \
)) \
@@ -720,25 +658,13 @@ endif
# Special treatment for generated documentation
SPEC_HEADER_BLOCK := \
<header id="title-block-header"> \
<div class="navbar"> \
<div>$(HEADER_RIGHT_SIDE_INFO)</div> \
<nav><ul><li><a href="PATH_TO_SPECS/../api/index.html">API</a> \
<li><a href="PATH_TO_SPECS/index.html">OTHER SPECIFICATIONS \
<li><a href="PATH_TO_SPECS/man/index.html">TOOL GUIDES</a></ul></nav> \
</div> \
</header>
JDWP_PROTOCOL := $(SUPPORT_OUTPUTDIR)/gensrc/jdk.jdi/jdwp-protocol.html
ifneq ($(call ApplySpecFilter, $(JDWP_PROTOCOL)), )
JDWP_HEADER_BLOCK := $(subst PATH_TO_SPECS,..,$(SPEC_HEADER_BLOCK))
$(eval $(call SetupTextFileProcessing, PROCESS_JDWP_PROTOCOL, \
SOURCE_FILES := $(JDWP_PROTOCOL), \
OUTPUT_DIR := $(DOCS_OUTPUTDIR)/specs/jdwp, \
REPLACEMENTS := \
<style> => <link rel="stylesheet" href="../../resources/jdk-default.css"/><style> ; \
<body> => <body>$(SPECS_TOP)$(JDWP_HEADER_BLOCK) ; \
<body> => <body>$(SPECS_TOP) ; \
</body> => $(SPECS_BOTTOM_1)</body>, \
))
JDK_SPECS_TARGETS += $(PROCESS_JDWP_PROTOCOL)
@@ -747,13 +673,11 @@ endif
# Get jvmti.html from the main jvm variant (all variants' jvmti.html are identical).
JVMTI_HTML ?= $(HOTSPOT_OUTPUTDIR)/variant-$(JVM_VARIANT_MAIN)/gensrc/jvmtifiles/jvmti.html
ifneq ($(call ApplySpecFilter, $(JVMTI_HTML)), )
JVMTI_HEADER_BLOCK := $(subst PATH_TO_SPECS,.,$(SPEC_HEADER_BLOCK))
$(eval $(call SetupTextFileProcessing, PROCESS_JVMTI_HTML, \
SOURCE_FILES := $(JVMTI_HTML), \
OUTPUT_DIR := $(DOCS_OUTPUTDIR)/specs/, \
REPLACEMENTS := \
<style> => <link rel="stylesheet" href="../resources/jdk-default.css"/><style> ; \
<body> => <body>$(SPECS_TOP)$(JVMTI_HEADER_BLOCK) ; \
<body> => <body>$(SPECS_TOP) ; \
</body> => $(SPECS_BOTTOM_0)</body>, \
))
JDK_SPECS_TARGETS += $(PROCESS_JVMTI_HTML)
@@ -768,7 +692,7 @@ JAVADOC_ZIP_FILE := $(OUTPUTDIR)/bundles/$(JAVADOC_ZIP_NAME)
$(eval $(call SetupZipArchive, BUILD_JAVADOC_ZIP, \
SRC := $(DOCS_OUTPUTDIR), \
ZIP := $(JAVADOC_ZIP_FILE), \
EXTRA_DEPS := $(JDK_API_JAVADOC_TARGETS) $(JDK_API_GRAPHS_TARGETS) \
EXTRA_DEPS := $(JDK_API_JAVADOC_TARGETS) $(JDK_API_MODULEGRAPH_TARGETS) \
$(JDK_SPECS_TARGETS), \
))
@@ -796,15 +720,15 @@ SPECS_ZIP_TARGETS += $(BUILD_SPECS_ZIP)
docs-jdk-api-javadoc: $(JDK_API_JAVADOC_TARGETS) $(JDK_API_CUSTOM_TARGETS)
docs-jdk-api-graphs: $(JDK_API_GRAPHS_TARGETS)
docs-jdk-api-modulegraph: $(JDK_API_MODULEGRAPH_TARGETS)
docs-javase-api-javadoc: $(JAVASE_API_JAVADOC_TARGETS) $(JAVASE_API_CUSTOM_TARGETS)
docs-javase-api-graphs: $(JAVASE_API_GRAPHS_TARGETS)
docs-javase-api-modulegraph: $(JAVASE_API_MODULEGRAPH_TARGETS)
docs-reference-api-javadoc: $(REFERENCE_API_JAVADOC_TARGETS) $(REFERENCE_API_CUSTOM_TARGETS)
docs-reference-api-graphs: $(REFERENCE_API_GRAPHS_TARGETS)
docs-reference-api-modulegraph: $(REFERENCE_API_MODULEGRAPH_TARGETS)
docs-jdk-specs: $(JDK_SPECS_TARGETS)
@@ -814,12 +738,12 @@ docs-zip: $(ZIP_TARGETS)
docs-specs-zip: $(SPECS_ZIP_TARGETS)
all: docs-jdk-api-javadoc docs-jdk-api-graphs docs-javase-api-javadoc \
docs-javase-api-graphs docs-reference-api-javadoc \
docs-reference-api-graphs docs-jdk-specs docs-jdk-index docs-zip \
all: docs-jdk-api-javadoc docs-jdk-api-modulegraph docs-javase-api-javadoc \
docs-javase-api-modulegraph docs-reference-api-javadoc \
docs-reference-api-modulegraph docs-jdk-specs docs-jdk-index docs-zip \
docs-specs-zip
.PHONY: default all docs-jdk-api-javadoc docs-jdk-api-graphs \
docs-javase-api-javadoc docs-javase-api-graphs \
docs-reference-api-javadoc docs-reference-api-graphs docs-jdk-specs \
.PHONY: default all docs-jdk-api-javadoc docs-jdk-api-modulegraph \
docs-javase-api-javadoc docs-javase-api-modulegraph \
docs-reference-api-javadoc docs-reference-api-modulegraph docs-jdk-specs \
docs-jdk-index docs-zip docs-specs-zip
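Note: these are ordinary phony targets; an illustrative top-level invocation (one side of this comparison uses the "graphs" spelling, the other "modulegraph", and the graph rules are only populated when full docs are enabled):
# Illustrative only - depending on which side of this comparison is checked out:
make docs-jdk-api-javadoc docs-jdk-api-graphs
make docs-jdk-api-javadoc docs-jdk-api-modulegraph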


@@ -125,11 +125,6 @@ test-prebuilt:
$(MAKE) --no-print-directory -r -R -I make/common/ -f make/RunTestsPrebuilt.gmk \
test-prebuilt CUSTOM_MAKE_DIR=$(CUSTOM_MAKE_DIR) TEST="$(TEST)" )
test-prebuilt-with-exit-code:
@( cd $(topdir) && \
$(MAKE) --no-print-directory -r -R -I make/common/ -f make/RunTestsPrebuilt.gmk \
test-prebuilt-with-exit-code CUSTOM_MAKE_DIR=$(CUSTOM_MAKE_DIR) TEST="$(TEST)" )
# Alias for backwards compatibility
run-test-prebuilt: test-prebuilt


@@ -277,20 +277,15 @@ else # HAS_SPEC=true
$(ECHO) $(CONFIGURE_COMMAND_LINE)
reconfigure:
ifneq ($(REAL_CONFIGURE_COMMAND_EXEC_FULL), )
$(ECHO) "Re-running configure using original command line '$(REAL_CONFIGURE_COMMAND_EXEC_SHORT) $(REAL_CONFIGURE_COMMAND_LINE)'"
$(eval RECONFIGURE_COMMAND := $(REAL_CONFIGURE_COMMAND_EXEC_FULL) $(REAL_CONFIGURE_COMMAND_LINE))
else ifneq ($(CONFIGURE_COMMAND_LINE), )
ifneq ($(CONFIGURE_COMMAND_LINE), )
$(ECHO) "Re-running configure using arguments '$(CONFIGURE_COMMAND_LINE)'"
$(eval RECONFIGURE_COMMAND := $(BASH) $(TOPDIR)/configure $(CONFIGURE_COMMAND_LINE))
else
$(ECHO) "Re-running configure using default settings"
$(eval RECONFIGURE_COMMAND := $(BASH) $(TOPDIR)/configure)
endif
( cd $(CONFIGURE_START_DIR) && PATH="$(ORIGINAL_PATH)" AUTOCONF="$(AUTOCONF)" \
CUSTOM_ROOT="$(CUSTOM_ROOT)" \
CUSTOM_CONFIG_DIR="$(CUSTOM_CONFIG_DIR)" \
$(RECONFIGURE_COMMAND) )
$(BASH) $(TOPDIR)/configure $(CONFIGURE_COMMAND_LINE) )
##############################################################################
# The main target, for delegating into Main.gmk
@@ -324,7 +319,7 @@ else # HAS_SPEC=true
ifneq ($(PARALLEL_TARGETS), )
$(call PrepareFailureLogs)
$(call StartGlobalTimer)
$(call PrepareJavacServer)
$(call PrepareSmartJavac)
# JOBS will only be empty for a bootcycle-images recursive call
# or if specified via a make argument directly. In those cases
# treat it as NOT using jobs at all.
@@ -339,7 +334,7 @@ else # HAS_SPEC=true
cd $(TOPDIR) && $(MAKE) $(MAKE_ARGS) -j 1 -f make/Init.gmk \
HAS_SPEC=true on-failure ; \
exit $$exitcode ) )
$(call CleanupJavacServer)
$(call CleanupSmartJavac)
$(call StopGlobalTimer)
$(call ReportBuildTimes)
endif
@@ -351,7 +346,7 @@ else # HAS_SPEC=true
endif
on-failure:
$(call CleanupJavacServer)
$(call CleanupSmartJavac)
$(call StopGlobalTimer)
$(call ReportBuildTimes)
$(call PrintFailureReports)
@@ -364,11 +359,11 @@ else # HAS_SPEC=true
# Support targets for COMPARE_BUILD, used for makefile development
pre-compare-build:
$(call WaitForJavacServerFinish)
$(call WaitForSmartJavacFinish)
$(call PrepareCompareBuild)
post-compare-build:
$(call WaitForJavacServerFinish)
$(call WaitForSmartJavacFinish)
$(call CleanupCompareBuild)
$(call CompareBuildDoComparison)


@@ -204,15 +204,6 @@ ifeq ($(HAS_SPEC),)
# Otherwise select those that contain the given CONF string
matching_confs := $$(strip $$(foreach var, $$(all_confs), \
$$(if $$(findstring $$(CONF), $$(var)), $$(var))))
ifneq ($$(filter $$(CONF), $$(matching_confs)), )
# If we found an exact match, use that
matching_confs := $$(CONF)
# Don't repeat this output on make restarts caused by including
# generated files.
ifeq ($$(MAKE_RESTARTS),)
$$(info Using exact match for CONF=$$(CONF) (other matches are possible))
endif
endif
endif
ifeq ($$(matching_confs),)
$$(info Error: No configurations found matching CONF=$$(CONF).)
@@ -435,10 +426,10 @@ else # $(HAS_SPEC)=true
# Compare first and second build. Ignore any error code from compare.sh.
$(ECHO) "Comparing between comparison rebuild (this/new) and baseline (other/old)"
$(if $(COMPARE_BUILD_COMP_DIR), \
+(cd $(COMPARE_BUILD_OUTPUTDIR) && ./compare.sh -vv $(COMPARE_BUILD_COMP_OPTS) \
+(cd $(COMPARE_BUILD_OUTPUTDIR) && ./compare.sh $(COMPARE_BUILD_COMP_OPTS) \
-2dirs $(COMPARE_BUILD_OUTPUTDIR)/$(COMPARE_BUILD_COMP_DIR) \
$(OUTPUTDIR)/$(COMPARE_BUILD_COMP_DIR) $(COMPARE_BUILD_IGNORE_RESULT)), \
+(cd $(COMPARE_BUILD_OUTPUTDIR) && ./compare.sh -vv $(COMPARE_BUILD_COMP_OPTS) \
+(cd $(COMPARE_BUILD_OUTPUTDIR) && ./compare.sh $(COMPARE_BUILD_COMP_OPTS) \
-o $(OUTPUTDIR) $(COMPARE_BUILD_IGNORE_RESULT)) \
)
endef
@@ -502,15 +493,15 @@ else # $(HAS_SPEC)=true
# Remove any javac server logs and port files. This
# prevents a new make run from reusing the previous servers.
define PrepareJavacServer
define PrepareSmartJavac
$(if $(JAVAC_SERVER_DIR), \
$(RM) -r $(JAVAC_SERVER_DIR) 2> /dev/null && \
$(MKDIR) -p $(JAVAC_SERVER_DIR) \
)
endef
define CleanupJavacServer
[ -f $(JAVAC_SERVER_DIR)/server.port ] && $(ECHO) Stopping javac server && \
define CleanupSmartJavac
[ -f $(JAVAC_SERVER_DIR)/server.port ] && $(ECHO) Stopping sjavac server && \
$(TOUCH) $(JAVAC_SERVER_DIR)/server.port.stop; true
endef
@@ -519,13 +510,13 @@ else # $(HAS_SPEC)=true
# move or remove the build output directory. Since we have no proper
# synchronization process, wait for a while and hope it helps. This is only
# used by build comparisons.
define WaitForJavacServerFinish
define WaitForSmartJavacFinish
$(if $(JAVAC_SERVER_DIR), \
sleep 5\
)
endef
else
define WaitForJavacServerFinish
define WaitForSmartJavacFinish
endef
endif
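Note: the cleanup define above is essentially a shell one-liner; a stand-alone sketch of what it does (JAVAC_SERVER_DIR is whatever the spec file sets it to):
# Sketch: signal a running javac/sjavac server to stop by creating the
# ".stop" marker next to its recorded port file.
if [ -f "$JAVAC_SERVER_DIR/server.port" ]; then
  echo "Stopping javac server"
  touch "$JAVAC_SERVER_DIR/server.port.stop"
fi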


@@ -1,93 +0,0 @@
#
# Copyright 2000-2023 JetBrains s.r.o.
# DO NOT ALTER OR REMOVE COPYRIGHT NOTICES OR THIS FILE HEADER.
#
# This code is free software; you can redistribute it and/or modify it
# under the terms of the GNU General Public License version 2 only, as
# published by the Free Software Foundation. Oracle designates this
# particular file as subject to the "Classpath" exception as provided
# by Oracle in the LICENSE file that accompanied this code.
#
# This code is distributed in the hope that it will be useful, but WITHOUT
# ANY WARRANTY; without even the implied warranty of MERCHANTABILITY or
# FITNESS FOR A PARTICULAR PURPOSE. See the GNU General Public License
# version 2 for more details (a copy is included in the LICENSE file that
# accompanied this code).
#
# You should have received a copy of the GNU General Public License version
# 2 along with this work; if not, write to the Free Software Foundation,
# Inc., 51 Franklin St, Fifth Floor, Boston, MA 02110-1301 USA.
#
# Please contact Oracle, 500 Oracle Parkway, Redwood Shores, CA 94065 USA
# or visit www.oracle.com if you need additional information or have any
# questions.
#
include $(SPEC)
include MakeBase.gmk
include JavaCompilation.gmk
JBR_API_ROOT_DIR := $(TOPDIR)/src/jetbrains.api
JBR_API_TOOLS_DIR := $(JBR_API_ROOT_DIR)/tools
JBR_API_SRC_DIR := $(JBR_API_ROOT_DIR)/src
JBR_API_OUTPUT_DIR := $(OUTPUTDIR)/jbr-api
JBR_API_GENSRC_DIR := $(JBR_API_OUTPUT_DIR)/gensrc
JBR_API_BIN_DIR := $(JBR_API_OUTPUT_DIR)/bin
JBR_API_VERSION_PROPERTIES := $(JBR_API_ROOT_DIR)/version.properties
JBR_API_VERSION_GENSRC := $(JBR_API_OUTPUT_DIR)/jbr-api.version
JBR_API_GENSRC_BATCH := $(JBR_API_VERSION_GENSRC)
JBR_API_SRC_FILES := $(call FindFiles, $(JBR_API_SRC_DIR))
JBR_API_GENSRC_FILES := $(foreach f, $(call FindFiles, $(JBR_API_SRC_DIR)), \
$(JBR_API_GENSRC_DIR)/$(call RelativePath, $f, $(JBR_API_SRC_DIR)))
ifeq ($(JBR_API_JBR_VERSION),)
JBR_API_JBR_VERSION := DEVELOPMENT
JBR_API_FAIL_ON_HASH_MISMATCH := false
else
.PHONY: $(JBR_API_VERSION_PROPERTIES)
JBR_API_FAIL_ON_HASH_MISMATCH := true
endif
ARCHIVE_BUILD_JBR_API_BIN := $(JBR_API_BIN_DIR)
$(eval $(call SetupJavaCompilation, BUILD_JBR_API, \
SMALL_JAVA := true, \
COMPILER := bootjdk, \
SRC := $(JBR_API_GENSRC_DIR), \
EXTRA_FILES := $(JBR_API_GENSRC_FILES), \
BIN := $(JBR_API_BIN_DIR), \
JAR := $(JBR_API_OUTPUT_DIR)/jbr-api.jar, \
))
$(eval $(call SetupJarArchive, BUILD_JBR_API_SOURCES_JAR, \
DEPENDENCIES := $(JBR_API_GENSRC_FILES), \
SRCS := $(JBR_API_GENSRC_DIR), \
JAR := $(JBR_API_OUTPUT_DIR)/jbr-api-sources.jar, \
SUFFIXES := .java, \
BIN := $(JBR_API_BIN_DIR), \
))
# Grouped targets may not be supported, so hack dependencies: sources -> version file -> generated sources
$(JBR_API_VERSION_GENSRC): $(JBR_API_SRC_FILES) $(JBR_API_VERSION_PROPERTIES) $(JBR_API_TOOLS_DIR)/Gensrc.java
$(ECHO) Generating sources for JBR API
$(JAVA_CMD) $(JAVA_FLAGS_SMALL) "$(JBR_API_TOOLS_DIR)/Gensrc.java" \
"$(TOPDIR)/src" "$(JBR_API_OUTPUT_DIR)" "$(JBR_API_JBR_VERSION)"
$(JBR_API_GENSRC_FILES): $(JBR_API_VERSION_GENSRC)
$(TOUCH) $@
jbr-api-check-version: $(JBR_API_GENSRC_FILES) $(JBR_API_VERSION_PROPERTIES)
$(JAVA_CMD) $(JAVA_FLAGS_SMALL) "$(JBR_API_TOOLS_DIR)/CheckVersion.java" \
"$(JBR_API_ROOT_DIR)" "$(JBR_API_GENSRC_DIR)" "$(JBR_API_FAIL_ON_HASH_MISMATCH)"
jbr-api: $(BUILD_JBR_API) $(BUILD_JBR_API_SOURCES_JAR) jbr-api-check-version
.PHONY: jbr-api jbr-api-check-version
ifneq ($(JBR_API_CONF_FILE),)
$(JBR_API_CONF_FILE): $(JBR_API_GENSRC_FILES)
$(ECHO) "VERSION=`$(CAT) $(JBR_API_VERSION_GENSRC)`" > $(JBR_API_CONF_FILE)
$(ECHO) "JAR=$(JBR_API_OUTPUT_DIR)/jbr-api.jar" >> $(JBR_API_CONF_FILE)
$(ECHO) "SOURCES_JAR=$(JBR_API_OUTPUT_DIR)/jbr-api-sources.jar" >> $(JBR_API_CONF_FILE)
jbr-api: $(JBR_API_CONF_FILE)
.PHONY: $(JBR_API_CONF_FILE)
endif
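Note: taken together, the rules above generate sources, compile them with the boot JDK, and optionally write a small conf file; a hypothetical invocation (passing these variables on the make command line is an assumption, the makefile only reads them if set, and the version string is made up):
# Hypothetical example:
make jbr-api JBR_API_JBR_VERSION=17.0.7b1000.1 \
    JBR_API_CONF_FILE=build/jbr-api.conf
# build/jbr-api.conf then holds the VERSION=..., JAR=.../jbr-api.jar and
# SOURCES_JAR=.../jbr-api-sources.jar lines written by the recipe above.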


@@ -1,5 +1,5 @@
#
# Copyright (c) 2011, 2023, Oracle and/or its affiliates. All rights reserved.
# Copyright (c) 2011, 2022, Oracle and/or its affiliates. All rights reserved.
# DO NOT ALTER OR REMOVE COPYRIGHT NOTICES OR THIS FILE HEADER.
#
# This code is free software; you can redistribute it and/or modify it
@@ -257,46 +257,6 @@ $(eval $(call SetupTarget, hotspot-ide-project, \
ARGS := -I$(TOPDIR)/make/hotspot, \
))
$(eval $(call SetupTarget, eclipse-java-env, \
MAKEFILE := ide/eclipse/CreateWorkspace, \
ARGS := --always-make WORKSPACE=java SHARED=false, \
))
$(eval $(call SetupTarget, eclipse-hotspot-env, \
MAKEFILE := ide/eclipse/CreateWorkspace, \
ARGS := --always-make WORKSPACE=hotspot SHARED=false, \
))
$(eval $(call SetupTarget, eclipse-native-env, \
MAKEFILE := ide/eclipse/CreateWorkspace, \
ARGS := --always-make WORKSPACE=native SHARED=false, \
))
$(eval $(call SetupTarget, eclipse-mixed-env, \
MAKEFILE := ide/eclipse/CreateWorkspace, \
ARGS := --always-make SHARED=false, \
))
$(eval $(call SetupTarget, eclipse-shared-java-env, \
MAKEFILE := ide/eclipse/CreateWorkspace, \
ARGS := --always-make WORKSPACE=java SHARED=true, \
))
$(eval $(call SetupTarget, eclipse-shared-hotspot-env, \
MAKEFILE := ide/eclipse/CreateWorkspace, \
ARGS := --always-make WORKSPACE=hotspot SHARED=true, \
))
$(eval $(call SetupTarget, eclipse-shared-native-env, \
MAKEFILE := ide/eclipse/CreateWorkspace, \
ARGS := --always-make WORKSPACE=native SHARED=true, \
))
$(eval $(call SetupTarget, eclipse-shared-mixed-env, \
MAKEFILE := ide/eclipse/CreateWorkspace, \
ARGS := --always-make SHARED=true, \
))
ALL_TARGETS += $(HOTSPOT_VARIANT_TARGETS) $(HOTSPOT_VARIANT_GENSRC_TARGETS) \
$(HOTSPOT_VARIANT_LIBS_TARGETS)
@@ -506,15 +466,15 @@ ALL_TARGETS += bootcycle-images
# Docs targets
# If building full docs, to complete docs-*-api we need both the javadoc and
# graphs targets.
# modulegraph targets.
$(eval $(call SetupTarget, docs-jdk-api-javadoc, \
MAKEFILE := Docs, \
TARGET := docs-jdk-api-javadoc, \
))
$(eval $(call SetupTarget, docs-jdk-api-graphs, \
$(eval $(call SetupTarget, docs-jdk-api-modulegraph, \
MAKEFILE := Docs, \
TARGET := docs-jdk-api-graphs, \
TARGET := docs-jdk-api-modulegraph, \
DEPS := buildtools-modules runnable-buildjdk, \
))
@@ -523,9 +483,9 @@ $(eval $(call SetupTarget, docs-javase-api-javadoc, \
TARGET := docs-javase-api-javadoc, \
))
$(eval $(call SetupTarget, docs-javase-api-graphs, \
$(eval $(call SetupTarget, docs-javase-api-modulegraph, \
MAKEFILE := Docs, \
TARGET := docs-javase-api-graphs, \
TARGET := docs-javase-api-modulegraph, \
DEPS := buildtools-modules runnable-buildjdk, \
))
@@ -534,9 +494,9 @@ $(eval $(call SetupTarget, docs-reference-api-javadoc, \
TARGET := docs-reference-api-javadoc, \
))
$(eval $(call SetupTarget, docs-reference-api-graphs, \
$(eval $(call SetupTarget, docs-reference-api-modulegraph, \
MAKEFILE := Docs, \
TARGET := docs-reference-api-graphs, \
TARGET := docs-reference-api-modulegraph, \
DEPS := buildtools-modules runnable-buildjdk, \
))
@@ -747,22 +707,6 @@ ifeq ($(BUILD_FAILURE_HANDLER), true)
))
endif
ifeq ($(BUILD_JTREG_TEST_THREAD_FACTORY), true)
# Builds the test thread factory jtreg extension
$(eval $(call SetupTarget, build-test-test-thread-factory, \
MAKEFILE := test/BuildJtregTestThreadFactory, \
TARGET := build, \
DEPS := interim-langtools exploded-image, \
))
# Copies the jtreg test thread factory into the test image
$(eval $(call SetupTarget, test-image-test-thread-factory, \
MAKEFILE := test/BuildJtregTestThreadFactory, \
TARGET := images, \
DEPS := build-test-test-thread-factory, \
))
endif
$(eval $(call SetupTarget, build-microbenchmark, \
MAKEFILE := test/BuildMicrobenchmark, \
DEPS := interim-langtools exploded-image, \
@@ -1163,14 +1107,9 @@ docs-reference-api: docs-reference-api-javadoc
# If we're building full docs, we must also generate the module graphs to
# get non-broken api documentation.
ifeq ($(ENABLE_FULL_DOCS), true)
docs-jdk-api: docs-jdk-api-graphs
docs-javase-api: docs-javase-api-graphs
docs-reference-api: docs-reference-api-graphs
# We must generate javadoc first so we know what graphs are needed
docs-jdk-api-graphs: docs-jdk-api-javadoc
docs-javase-api-graphs: docs-javase-api-javadoc
docs-reference-api-graphs: docs-reference-api-javadoc
docs-jdk-api: docs-jdk-api-modulegraph
docs-javase-api: docs-javase-api-modulegraph
docs-reference-api: docs-reference-api-modulegraph
endif
docs-jdk: docs-jdk-api docs-jdk-specs docs-jdk-index
@@ -1243,10 +1182,6 @@ ifeq ($(BUILD_FAILURE_HANDLER), true)
test-image: test-image-failure-handler
endif
ifeq ($(BUILD_JTREG_TEST_THREAD_FACTORY), true)
test-image: test-image-test-thread-factory
endif
ifneq ($(JMH_CORE_JAR), )
test-image: build-microbenchmark
endif
@@ -1414,14 +1349,6 @@ create-main-targets-include:
@$(ECHO) ALL_MAIN_TARGETS := $(sort $(ALL_TARGETS)) > \
$(MAKESUPPORT_OUTPUTDIR)/main-targets.gmk
################################################################################
# JBR API
$(eval $(call SetupTarget, jbr-api, \
MAKEFILE := JBRApi, \
TARGET := jbr-api \
))
################################################################################
# Hook to include the corresponding custom file, if present.
$(eval $(call IncludeCustomExtension, Main-post.gmk))


@@ -51,7 +51,6 @@ define create-info-file
$(if $(VENDOR_VERSION_STRING), \
$(call info-file-item, "IMPLEMENTOR_VERSION", "$(VENDOR_VERSION_STRING)"))
$(call info-file-item, "JAVA_VERSION_DATE", "$(VERSION_DATE)")
$(call info-file-item, "JAVA_RUNTIME_VERSION", "$(VERSION_STRING)")
$(call info-file-item, "OS_NAME", "$(RELEASE_FILE_OS_NAME)")
$(call info-file-item, "OS_ARCH", "$(RELEASE_FILE_OS_ARCH)")
$(call info-file-item, "LIBC", "$(RELEASE_FILE_LIBC)")


@@ -1,5 +1,5 @@
#
# Copyright (c) 2016, 2023, Oracle and/or its affiliates. All rights reserved.
# Copyright (c) 2016, 2022, Oracle and/or its affiliates. All rights reserved.
# DO NOT ALTER OR REMOVE COPYRIGHT NOTICES OR THIS FILE HEADER.
#
# This code is free software; you can redistribute it and/or modify it
@@ -93,9 +93,6 @@ endif
JTREG_FAILURE_HANDLER_DIR := $(TEST_IMAGE_DIR)/failure_handler
JTREG_FAILURE_HANDLER := $(JTREG_FAILURE_HANDLER_DIR)/jtregFailureHandler.jar
JTREG_TEST_THREAD_FACTORY_DIR := $(TEST_IMAGE_DIR)/jtreg_test_thread_factory
JTREG_TEST_THREAD_FACTORY_JAR := $(JTREG_TEST_THREAD_FACTORY_DIR)/jtregTestThreadFactory.jar
JTREG_FAILURE_HANDLER_TIMEOUT ?= 0
ifneq ($(wildcard $(JTREG_FAILURE_HANDLER)), )
@@ -199,12 +196,11 @@ $(eval $(call SetTestOpt,JAVA_OPTIONS,JTREG))
$(eval $(call SetTestOpt,JOBS,JTREG))
$(eval $(call SetTestOpt,TIMEOUT_FACTOR,JTREG))
$(eval $(call SetTestOpt,FAILURE_HANDLER_TIMEOUT,JTREG))
$(eval $(call SetTestOpt,REPORT,JTREG))
$(eval $(call ParseKeywordVariable, JTREG, \
SINGLE_KEYWORDS := JOBS TIMEOUT_FACTOR FAILURE_HANDLER_TIMEOUT \
TEST_MODE ASSERT VERBOSE RETAIN TEST_THREAD_FACTORY MAX_MEM RUN_PROBLEM_LISTS \
RETRY_COUNT REPEAT_COUNT MAX_OUTPUT REPORT $(CUSTOM_JTREG_SINGLE_KEYWORDS), \
TEST_MODE ASSERT VERBOSE RETAIN MAX_MEM RUN_PROBLEM_LISTS \
RETRY_COUNT REPEAT_COUNT MAX_OUTPUT $(CUSTOM_JTREG_SINGLE_KEYWORDS), \
STRING_KEYWORDS := OPTIONS JAVA_OPTIONS VM_OPTIONS KEYWORDS \
EXTRA_PROBLEM_LISTS LAUNCHER_OPTIONS \
$(CUSTOM_JTREG_STRING_KEYWORDS), \
@@ -356,7 +352,7 @@ ExpandJtregPath = \
# with test id: dir/Test.java#selection -> Test.java#selection -> .java#selection -> #selection
# without: dir/Test.java -> Test.java -> .java -> <<empty string>>
TestID = \
$(subst .sh,,$(subst .html,,$(subst .java,,$(suffix $(notdir $1)))))
$(subst .java,,$(suffix $(notdir $1)))
# The test id starting with a hash (#testid) will be stripped by all
# evals in ParseJtregTestSelectionInner and will be reinserted by calling
@@ -746,11 +742,9 @@ define SetupRunJtregTestBody
JTREG_VERBOSE ?= fail,error,summary
JTREG_RETAIN ?= fail,error
JTREG_TEST_THREAD_FACTORY ?=
JTREG_RUN_PROBLEM_LISTS ?= false
JTREG_RETRY_COUNT ?= 0
JTREG_REPEAT_COUNT ?= 0
JTREG_REPORT ?= files
ifneq ($$(JTREG_RETRY_COUNT), 0)
ifneq ($$(JTREG_REPEAT_COUNT), 0)
@@ -760,14 +754,6 @@ define SetupRunJtregTestBody
endif
endif
ifneq ($$(JTREG_TEST_THREAD_FACTORY), )
$1_JTREG_BASIC_OPTIONS += -testThreadFactoryPath:$$(JTREG_TEST_THREAD_FACTORY_JAR)
$1_JTREG_BASIC_OPTIONS += -testThreadFactory:$$(JTREG_TEST_THREAD_FACTORY)
$1_JTREG_BASIC_OPTIONS += $$(addprefix $$(JTREG_PROBLEM_LIST_PREFIX), $$(wildcard \
$$(addprefix $$($1_TEST_ROOT)/, ProblemList-$$(JTREG_TEST_THREAD_FACTORY).txt) \
))
endif
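Note: in practice the keyword maps straight onto two jtreg flags; an illustrative test invocation (the factory name "Virtual" and the test group are assumptions, not something this makefile defines):
# Illustrative only:
make test TEST=jdk_lang JTREG="TEST_THREAD_FACTORY=Virtual"
# per the block above this adds roughly:
#   -testThreadFactoryPath:<test-image>/jtreg_test_thread_factory/jtregTestThreadFactory.jar
#   -testThreadFactory:Virtual
# plus ProblemList-Virtual.txt from the test root, if such a file exists.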
ifneq ($$(JTREG_LAUNCHER_OPTIONS), )
$1_JTREG_LAUNCHER_OPTIONS += $$(JTREG_LAUNCHER_OPTIONS)
endif
@@ -784,13 +770,10 @@ define SetupRunJtregTestBody
# Make sure the tmp dir is normalized as some tests will react badly otherwise
$1_TEST_TMP_DIR := $$(abspath $$($1_TEST_SUPPORT_DIR)/tmp)
# test.boot.jdk is used by some test cases that want to execute a previous
# version of the JDK.
$1_JTREG_BASIC_OPTIONS += -$$($1_JTREG_TEST_MODE) \
-verbose:$$(JTREG_VERBOSE) -retain:$$(JTREG_RETAIN) \
-concurrency:$$($1_JTREG_JOBS) -timeoutFactor:$$(JTREG_TIMEOUT_FACTOR) \
-vmoption:-XX:MaxRAMPercentage=$$($1_JTREG_MAX_RAM_PERCENTAGE) \
-vmoption:-Dtest.boot.jdk="$$(BOOT_JDK)" \
-vmoption:-Djava.io.tmpdir="$$($1_TEST_TMP_DIR)"
$1_JTREG_BASIC_OPTIONS += -automatic -ignore:quiet
@@ -871,7 +854,6 @@ define SetupRunJtregTestBody
-dir:$$(JTREG_TOPDIR) \
-reportDir:$$($1_TEST_RESULTS_DIR) \
-workDir:$$($1_TEST_SUPPORT_DIR) \
-report:$${JTREG_REPORT} \
$$$${JTREG_STATUS} \
$$(JTREG_OPTIONS) \
$$(JTREG_FAILURE_HANDLER_OPTIONS) \


@@ -295,11 +295,6 @@ test-prebuilt:
@cd $(TOPDIR) && $(MAKE) $(MAKE_ARGS) -f make/RunTests.gmk run-test \
TEST="$(TEST)"
test-prebuilt-with-exit-code: test-prebuilt
@if test -f $(MAKESUPPORT_OUTPUTDIR)/exit-with-error ; then \
exit 1 ; \
fi
all: test-prebuilt
.PHONY: default all test-prebuilt


@@ -1,5 +1,5 @@
#
# Copyright (c) 2014, 2022, Oracle and/or its affiliates. All rights reserved.
# Copyright (c) 2014, 2020, Oracle and/or its affiliates. All rights reserved.
# DO NOT ALTER OR REMOVE COPYRIGHT NOTICES OR THIS FILE HEADER.
#
# This code is free software; you can redistribute it and/or modify it
@@ -31,7 +31,6 @@ include JavaCompilation.gmk
include Modules.gmk
SRC_ZIP_WORK_DIR := $(SUPPORT_OUTPUTDIR)/src
$(if $(filter $(TOPDIR)/%, $(SUPPORT_OUTPUTDIR)), $(eval SRC_ZIP_BASE := $(TOPDIR)), $(eval SRC_ZIP_BASE := $(SUPPORT_OUTPUTDIR)))
# Hook to include the corresponding custom file, if present.
$(eval $(call IncludeCustomExtension, ZipSource.gmk))
@@ -46,10 +45,10 @@ ALL_MODULES := $(FindAllModules)
# again to create src.zip.
$(foreach m, $(ALL_MODULES), \
$(foreach d, $(call FindModuleSrcDirs, $m), \
$(eval $d_TARGET := $(SRC_ZIP_WORK_DIR)/$(patsubst $(TOPDIR)/%,%,$(patsubst $(SUPPORT_OUTPUTDIR)/%,%,$d))/$m) \
$(eval $d_TARGET := $(SRC_ZIP_WORK_DIR)/$(patsubst $(TOPDIR)/%,%,$d)/$m) \
$(if $(SRC_GENERATED), , \
$(eval $$($d_TARGET): $d ; \
$$(if $(filter $(SRC_ZIP_BASE)/%, $d), $$(link-file-relative), $$(link-file-absolute)) \
$$(if $(filter $(TOPDIR)/%, $d), $$(link-file-relative), $$(link-file-absolute)) \
) \
) \
$(eval SRC_ZIP_SRCS += $$($d_TARGET)) \


@@ -1,5 +1,5 @@
#
# Copyright (c) 2011, 2023, Oracle and/or its affiliates. All rights reserved.
# Copyright (c) 2011, 2022, Oracle and/or its affiliates. All rights reserved.
# DO NOT ALTER OR REMOVE COPYRIGHT NOTICES OR THIS FILE HEADER.
#
# This code is free software; you can redistribute it and/or modify it
@@ -31,11 +31,6 @@ AC_DEFUN_ONCE([BASIC_INIT],
[
# Save the original command line. This is passed to us by the wrapper configure script.
AC_SUBST(CONFIGURE_COMMAND_LINE)
# We might have the original command line if the wrapper was called by some
# other script.
AC_SUBST(REAL_CONFIGURE_COMMAND_EXEC_SHORT)
AC_SUBST(REAL_CONFIGURE_COMMAND_EXEC_FULL)
AC_SUBST(REAL_CONFIGURE_COMMAND_LINE)
# AUTOCONF might be set in the environment by the user. Preserve for "make reconfigure".
AC_SUBST(AUTOCONF)
# Save the path variable before it gets changed
@@ -60,7 +55,6 @@ AC_DEFUN([BASIC_CHECK_LEFTOVER_OVERRIDDEN],
###############################################################################
# Setup basic configuration paths, and platform-specific stuff related to PATHs.
# Make sure to only use tools set up in BASIC_SETUP_FUNDAMENTAL_TOOLS.
AC_DEFUN_ONCE([BASIC_SETUP_PATHS],
[
# Save the current directory this script was started from
@@ -102,29 +96,6 @@ AC_DEFUN_ONCE([BASIC_SETUP_PATHS],
AUTOCONF_DIR=$TOPDIR/make/autoconf
])
###############################################################################
# Setup what kind of build environment type we have (CI or local developer)
AC_DEFUN_ONCE([BASIC_SETUP_BUILD_ENV],
[
if test "x$CI" = "xtrue"; then
DEFAULT_BUILD_ENV="ci"
AC_MSG_NOTICE([CI environment variable set to $CI])
else
DEFAULT_BUILD_ENV="dev"
fi
UTIL_ARG_WITH(NAME: build-env, TYPE: literal,
RESULT: BUILD_ENV,
VALID_VALUES: [auto dev ci], DEFAULT: auto,
CHECKING_MSG: [for build environment type],
DESC: [select build environment type (affects certain default values)],
IF_AUTO: [
RESULT=$DEFAULT_BUILD_ENV
]
)
AC_SUBST(BUILD_ENV)
])
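Note: assuming UTIL_ARG_WITH's usual --with-&lt;name&gt; mapping, the new knob surfaces on the configure command line roughly as follows (the flag spelling is an assumption based on that convention):
# Illustrative only:
bash configure --with-build-env=ci    # force CI defaults
bash configure --with-build-env=dev   # force local-developer defaults
# with the default "auto", the CI=true environment variable selects "ci".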
###############################################################################
# Evaluates platform specific overrides for devkit variables.
# $1: Name of variable
@@ -168,15 +139,6 @@ AC_DEFUN([BASIC_SETUP_XCODE_SYSROOT],
if test $? -ne 0; then
AC_MSG_ERROR([The xcodebuild tool in the devkit reports an error: $XCODEBUILD_OUTPUT])
fi
elif test "x$TOOLCHAIN_PATH" != x; then
UTIL_LOOKUP_PROGS(XCODEBUILD, xcodebuild, $TOOLCHAIN_PATH)
if test "x$XCODEBUILD" != x; then
XCODEBUILD_OUTPUT=`"$XCODEBUILD" -version 2>&1`
if test $? -ne 0; then
AC_MSG_WARN([Ignoring the located xcodebuild tool $XCODEBUILD due to an error: $XCODEBUILD_OUTPUT])
XCODEBUILD=
fi
fi
else
UTIL_LOOKUP_PROGS(XCODEBUILD, xcodebuild)
if test "x$XCODEBUILD" != x; then
@@ -326,22 +288,6 @@ AC_DEFUN_ONCE([BASIC_SETUP_DEVKIT],
[UTIL_PREPEND_TO_PATH([TOOLCHAIN_PATH],$with_toolchain_path)]
)
AC_ARG_WITH([xcode-path], [AS_HELP_STRING([--with-xcode-path],
[set up toolchain on Mac OS using a path to an Xcode installation])])
if test "x$with_xcode_path" != x; then
if test "x$OPENJDK_BUILD_OS" = "xmacosx"; then
UTIL_PREPEND_TO_PATH([TOOLCHAIN_PATH],
$with_xcode_path/Contents/Developer/Toolchains/XcodeDefault.xctoolchain/usr/bin:$with_xcode_path/Contents/Developer/usr/bin)
else
AC_MSG_WARN([Option --with-xcode-path is only valid on Mac OS, ignoring.])
fi
fi
AC_MSG_CHECKING([for toolchain path])
AC_MSG_RESULT([$TOOLCHAIN_PATH])
AC_SUBST(TOOLCHAIN_PATH)
AC_ARG_WITH([extra-path], [AS_HELP_STRING([--with-extra-path],
[prepend these directories to the default path])],
[UTIL_PREPEND_TO_PATH([EXTRA_PATH],$with_extra_path)]
@@ -360,6 +306,10 @@ AC_DEFUN_ONCE([BASIC_SETUP_DEVKIT],
AC_MSG_RESULT([$SYSROOT])
AC_SUBST(SYSROOT)
AC_MSG_CHECKING([for toolchain path])
AC_MSG_RESULT([$TOOLCHAIN_PATH])
AC_SUBST(TOOLCHAIN_PATH)
AC_MSG_CHECKING([for extra path])
AC_MSG_RESULT([$EXTRA_PATH])
])
@@ -478,11 +428,7 @@ AC_DEFUN([BASIC_CHECK_DIR_ON_LOCAL_DISK],
# df on AIX does not understand -l. On modern AIXes it understands "-T local" which
# is the same. On older AIXes we just continue to live with a "not local build" warning.
if test "x$OPENJDK_TARGET_OS" = xaix; then
if "$DF -T local > /dev/null 2>&1"; then
DF_LOCAL_ONLY_OPTION='-T local'
else # AIX may use GNU-utils instead
DF_LOCAL_ONLY_OPTION='-l'
fi
DF_LOCAL_ONLY_OPTION='-T local'
elif test "x$OPENJDK_BUILD_OS_ENV" = "xwindows.wsl1"; then
# In WSL1, we can only build on a drvfs file system (that is, a mounted real Windows drive)
DF_LOCAL_ONLY_OPTION='-t drvfs'


@@ -1,5 +1,5 @@
#
# Copyright (c) 2011, 2023, Oracle and/or its affiliates. All rights reserved.
# Copyright (c) 2011, 2022, Oracle and/or its affiliates. All rights reserved.
# DO NOT ALTER OR REMOVE COPYRIGHT NOTICES OR THIS FILE HEADER.
#
# This code is free software; you can redistribute it and/or modify it
@@ -24,13 +24,8 @@
#
###############################################################################
# It is recommended to use exactly this version of pandoc, especially for
# re-generating checked in html files
RECOMMENDED_PANDOC_VERSION=2.19.2
###############################################################################
# Setup the most fundamental tools, used for setting up build platform and
# path handling.
# Setup the most fundamental tools that rely on not much else to set up,
# but are used by much of the early bootstrap code.
AC_DEFUN_ONCE([BASIC_SETUP_FUNDAMENTAL_TOOLS],
[
# Bootstrapping: These tools are needed by UTIL_LOOKUP_PROGS
@@ -42,28 +37,7 @@ AC_DEFUN_ONCE([BASIC_SETUP_FUNDAMENTAL_TOOLS],
UTIL_CHECK_NONEMPTY(FILE)
AC_PATH_PROGS(LDD, ldd)
# Required tools
UTIL_REQUIRE_PROGS(ECHO, echo)
UTIL_REQUIRE_PROGS(TR, tr)
UTIL_REQUIRE_PROGS(UNAME, uname)
UTIL_REQUIRE_PROGS(WC, wc)
# Required tools with some special treatment
UTIL_REQUIRE_SPECIAL(GREP, [AC_PROG_GREP])
UTIL_REQUIRE_SPECIAL(EGREP, [AC_PROG_EGREP])
UTIL_REQUIRE_SPECIAL(SED, [AC_PROG_SED])
# Tools only needed on some platforms
UTIL_LOOKUP_PROGS(PATHTOOL, cygpath wslpath)
UTIL_LOOKUP_PROGS(CMD, cmd.exe, $PATH:/cygdrive/c/windows/system32:/mnt/c/windows/system32:/c/windows/system32)
])
###############################################################################
# Setup further tools that should be resolved early but after setting up
# build platform and path handling.
AC_DEFUN_ONCE([BASIC_SETUP_TOOLS],
[
# Required tools
# First are all the fundamental required tools.
UTIL_REQUIRE_PROGS(BASH, bash)
UTIL_REQUIRE_PROGS(CAT, cat)
UTIL_REQUIRE_PROGS(CHMOD, chmod)
@@ -71,6 +45,7 @@ AC_DEFUN_ONCE([BASIC_SETUP_TOOLS],
UTIL_REQUIRE_PROGS(CUT, cut)
UTIL_REQUIRE_PROGS(DATE, date)
UTIL_REQUIRE_PROGS(DIFF, gdiff diff)
UTIL_REQUIRE_PROGS(ECHO, echo)
UTIL_REQUIRE_PROGS(EXPR, expr)
UTIL_REQUIRE_PROGS(FIND, find)
UTIL_REQUIRE_PROGS(GUNZIP, gunzip)
@@ -92,20 +67,27 @@ AC_DEFUN_ONCE([BASIC_SETUP_TOOLS],
UTIL_REQUIRE_PROGS(TAR, gtar tar)
UTIL_REQUIRE_PROGS(TEE, tee)
UTIL_REQUIRE_PROGS(TOUCH, touch)
UTIL_REQUIRE_PROGS(TR, tr)
UTIL_REQUIRE_PROGS(UNAME, uname)
UTIL_REQUIRE_PROGS(WC, wc)
UTIL_REQUIRE_PROGS(XARGS, xargs)
# Required tools with some special treatment
# Then required tools that require some special treatment.
UTIL_REQUIRE_SPECIAL(GREP, [AC_PROG_GREP])
UTIL_REQUIRE_SPECIAL(EGREP, [AC_PROG_EGREP])
UTIL_REQUIRE_SPECIAL(FGREP, [AC_PROG_FGREP])
UTIL_REQUIRE_SPECIAL(SED, [AC_PROG_SED])
# Optional tools, we can do without them
UTIL_LOOKUP_PROGS(DF, df)
UTIL_LOOKUP_PROGS(GIT, git)
UTIL_LOOKUP_PROGS(NICE, nice)
UTIL_LOOKUP_PROGS(READLINK, greadlink readlink)
UTIL_LOOKUP_PROGS(WHOAMI, whoami)
# Tools only needed on some platforms
# These are only needed on some platforms
UTIL_LOOKUP_PROGS(PATHTOOL, cygpath wslpath)
UTIL_LOOKUP_PROGS(LSB_RELEASE, lsb_release)
UTIL_LOOKUP_PROGS(CMD, cmd.exe, $PATH:/cygdrive/c/windows/system32:/mnt/c/windows/system32:/c/windows/system32)
# For compare.sh only
UTIL_LOOKUP_PROGS(CMP, cmp)
@@ -298,7 +280,7 @@ AC_DEFUN([BASIC_CHECK_TAR],
if test "x$TAR_TYPE" = "xgnu"; then
TAR_INCLUDE_PARAM="T"
TAR_SUPPORTS_TRANSFORM="true"
elif test "x$TAR_TYPE" = "xaix"; then
elif test "x$TAR_TYPE" = "aix"; then
# -L InputList of aix tar: name of file listing the files and directories
# that need to be archived or extracted
TAR_INCLUDE_PARAM="L"
@@ -444,29 +426,22 @@ AC_DEFUN_ONCE([BASIC_SETUP_PANDOC],
[
UTIL_LOOKUP_PROGS(PANDOC, pandoc)
if test "x$PANDOC" != x; then
AC_MSG_CHECKING([for pandoc version])
PANDOC_VERSION=`$PANDOC --version 2>&1 | $TR -d '\r' | $HEAD -1 | $CUT -d " " -f 2`
AC_MSG_RESULT([$PANDOC_VERSION])
if test "x$PANDOC_VERSION" != x$RECOMMENDED_PANDOC_VERSION; then
AC_MSG_WARN([pandoc is version $PANDOC_VERSION, not the recommended version $RECOMMENDED_PANDOC_VERSION])
fi
PANDOC_MARKDOWN_FLAG="markdown"
AC_MSG_CHECKING([if the pandoc smart extension needs to be disabled for markdown])
if $PANDOC --list-extensions | $GREP -q '+smart'; then
PANDOC_MARKDOWN_FLAG="markdown"
if test -n "$PANDOC"; then
AC_MSG_CHECKING(if the pandoc smart extension needs to be disabled for markdown)
if $PANDOC --list-extensions | $GREP -q '\+smart'; then
AC_MSG_RESULT([yes])
PANDOC_MARKDOWN_FLAG="markdown-smart"
else
AC_MSG_RESULT([no])
fi
fi
if test -n "$PANDOC"; then
ENABLE_PANDOC="true"
else
ENABLE_PANDOC="false"
fi
AC_SUBST(ENABLE_PANDOC)
AC_SUBST(PANDOC_MARKDOWN_FLAG)
])
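Note: the version probe in the newer variant boils down to a small pipeline; a stand-alone sketch (the printed value is just the recommended version declared in this file, not a guaranteed output):
# Sketch of the version check:
pandoc --version 2>&1 | tr -d '\r' | head -1 | cut -d " " -f 2
# e.g. prints: 2.19.2
pandoc --list-extensions | grep -q '+smart' && echo "use markdown-smart"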


@@ -89,8 +89,8 @@ AC_DEFUN([BASIC_SETUP_PATHS_WINDOWS],
WINENV_TEMP_DIR=$($PATHTOOL -u $($CMD /q /c echo %TEMP% 2> /dev/null) | $TR -d '\r\n')
AC_MSG_RESULT([$WINENV_TEMP_DIR])
if test "x$OPENJDK_BUILD_OS_ENV" = "xwindows.wsl1" || test "x$OPENJDK_BUILD_OS_ENV" = "xwindows.wsl2"; then
# Don't trust the current directory for WSL, but change to an OK temp dir
if test "x$OPENJDK_BUILD_OS_ENV" = "xwindows.wsl2"; then
# Don't trust the current directory for WSL2, but change to an OK temp dir
cd "$WINENV_TEMP_DIR"
# Bring along confdefs.h or autoconf gets all confused
cp "$CONFIGURE_START_DIR/confdefs.h" "$WINENV_TEMP_DIR"
@@ -228,7 +228,7 @@ AC_DEFUN([BASIC_WINDOWS_FINALIZE_FIXPATH],
# Platform-specific finalization
AC_DEFUN([BASIC_WINDOWS_FINALIZE],
[
if test "x$OPENJDK_BUILD_OS_ENV" = "xwindows.wsl1" || test "x$OPENJDK_BUILD_OS_ENV" = "xwindows.wsl2"; then
if test "x$OPENJDK_BUILD_OS_ENV" = "xwindows.wsl2"; then
# Change back from temp dir
cd $CONFIGURE_START_DIR
fi


@@ -1,5 +1,5 @@
#
# Copyright (c) 2011, 2023, Oracle and/or its affiliates. All rights reserved.
# Copyright (c) 2011, 2022, Oracle and/or its affiliates. All rights reserved.
# DO NOT ALTER OR REMOVE COPYRIGHT NOTICES OR THIS FILE HEADER.
#
# This code is free software; you can redistribute it and/or modify it
@@ -382,7 +382,7 @@ AC_DEFUN_ONCE([BOOTJDK_SETUP_BOOT_JDK],
# Finally, set some other options...
# Determine if the boot jdk jar supports the --date option
if $JAR --help 2>&1 | $GREP -q -e "--date=TIMESTAMP"; then
if $JAR --help 2>&1 | $GREP -q "\-\-date=TIMESTAMP"; then
BOOT_JDK_JAR_SUPPORTS_DATE=true
else
BOOT_JDK_JAR_SUPPORTS_DATE=false

File diff suppressed because it is too large.

File diff suppressed because it is too large.


@@ -1,6 +1,6 @@
#!/bin/sh
#
# Copyright (c) 2012, 2023, Oracle and/or its affiliates. All rights reserved.
# Copyright (c) 2012, 2022, Oracle and/or its affiliates. All rights reserved.
# Copyright (c) 2021, Azul Systems, Inc. All rights reserved.
# DO NOT ALTER OR REMOVE COPYRIGHT NOTICES OR THIS FILE HEADER.
#
@@ -29,27 +29,16 @@
# and fix the broken property, if needed.
DIR=`dirname $0`
OUT=`. $DIR/autoconf-config.guess 2> /dev/null`
OUT=`. $DIR/autoconf-config.guess`
# Handle some cases that autoconf-config.guess is not capable of
if [ "x$OUT" = x ]; then
if [ `uname -s` = Linux ]; then
# Test and fix little endian MIPS.
if [ `uname -m` = mipsel ]; then
OUT=mipsel-unknown-linux-gnu
fi
# Test and fix cygwin machine arch .x86_64
elif [[ `uname -s` = CYGWIN* ]]; then
if [ `uname -m` = ".x86_64" ]; then
OUT=x86_64-unknown-cygwin
fi
fi
if [ "x$OUT" = x ]; then
# Run autoconf-config.guess again to get the error message.
. $DIR/autoconf-config.guess > /dev/null
else
printf "guessed by custom config.guess... " >&2
# Detect C library.
# Use '-gnu' suffix on systems that use glibc.
# Use '-musl' suffix on systems that use the musl libc.
echo $OUT | grep -- -linux- > /dev/null 2> /dev/null
if test $? = 0; then
libc_vendor=`ldd --version 2>&1 | sed -n '1s/.*\(musl\).*/\1/p'`
if [ x"${libc_vendor}" = x"musl" ]; then
OUT=`echo $OUT | sed 's/-gnu/-musl/'`
fi
fi
@@ -68,11 +57,11 @@ if test $? = 0; then
fi
# Test and fix wsl
echo $OUT | grep '\(unknown\|pc\)-linux-gnu' > /dev/null 2> /dev/null
echo $OUT | grep unknown-linux-gnu > /dev/null 2> /dev/null
if test $? = 0; then
uname -r | grep -i microsoft > /dev/null 2> /dev/null
if test $? = 0; then
OUT=`echo $OUT | sed -e 's/\(unknown\|pc\)-linux-gnu/pc-wsl/'`
OUT=`echo $OUT | sed -e 's/unknown-linux-gnu/pc-wsl/'`
fi
fi
@@ -92,6 +81,57 @@ if test $? = 0; then
OUT=powerpc$KERNEL_BITMODE`echo $OUT | sed -e 's/[^-]*//'`
fi
# Test and fix little endian PowerPC64.
# TODO: should be handled by autoconf-config.guess.
if [ "x$OUT" = x ]; then
if [ `uname -m` = ppc64le ]; then
if [ `uname -s` = Linux ]; then
OUT=powerpc64le-unknown-linux-gnu
fi
fi
fi
# Test and fix little endian MIPS.
if [ "x$OUT" = x ]; then
if [ `uname -s` = Linux ]; then
if [ `uname -m` = mipsel ]; then
OUT=mipsel-unknown-linux-gnu
elif [ `uname -m` = mips64el ]; then
OUT=mips64el-unknown-linux-gnu
fi
fi
fi
# Test and fix LoongArch64.
if [ "x$OUT" = x ]; then
if [ `uname -s` = Linux ]; then
if [ `uname -m` = loongarch64 ]; then
OUT=loongarch64-unknown-linux-gnu
fi
fi
fi
# Test and fix RISC-V.
if [ "x$OUT" = x ]; then
if [ `uname -s` = Linux ]; then
if [ `uname -m` = riscv64 ]; then
OUT=riscv64-unknown-linux-gnu
fi
fi
fi
# Test and fix cpu on macos-aarch64, uname -p reports arm, buildsys expects aarch64
echo $OUT | grep arm-apple-darwin > /dev/null 2> /dev/null
if test $? != 0; then
# The GNU version of uname may be on the PATH which reports arm64 instead
echo $OUT | grep arm64-apple-darwin > /dev/null 2> /dev/null
fi
if test $? = 0; then
if [ `uname -m` = arm64 ]; then
OUT=aarch64`echo $OUT | sed -e 's/[^-]*//'`
fi
fi
# Test and fix cpu on Macosx when C preprocessor is not on the path
echo $OUT | grep i386-apple-darwin > /dev/null 2> /dev/null
if test $? = 0; then


@@ -1,6 +1,6 @@
#!/bin/sh
#
# Copyright (c) 2014, 2023, Oracle and/or its affiliates. All rights reserved.
# Copyright (c) 2014, 2022, Oracle and/or its affiliates. All rights reserved.
# DO NOT ALTER OR REMOVE COPYRIGHT NOTICES OR THIS FILE HEADER.
#
# This code is free software; you can redistribute it and/or modify it
@@ -29,16 +29,46 @@
DIR=`dirname $0`
if echo $* | grep linux-musl >/dev/null ; then
echo $*
exit
fi
# Allow wsl
if echo $* | grep pc-wsl >/dev/null ; then
echo $*
exit
fi
# Allow msys2
if echo $* | grep pc-msys >/dev/null ; then
echo $*
exit
fi
# Canonicalize for riscv which autoconf-config.sub doesn't handle
if echo $* | grep '^riscv\(32\|64\)-linux' >/dev/null ; then
result=`echo $@ | sed 's/linux/unknown-linux/'`
echo $result
exit
fi
# Filter out everything that doesn't begin with "aarch64-"
if ! echo $* | grep '^aarch64-' >/dev/null ; then
. $DIR/autoconf-config.sub "$@"
# autoconf-config.sub exits, so we never reach here, but just in
# case we do:
exit
fi
while test $# -gt 0 ; do
case $1 in
-- ) # Stop option processing
shift; break ;;
aarch64-* )
config=`echo $1 | sed 's/^aarch64-/arm-/'`
sub_args="$sub_args $config"
shift; ;;
- ) # Use stdin as input.
sub_args="$sub_args $1"
shift; break ;;
@@ -51,5 +81,7 @@ done
result=`. $DIR/autoconf-config.sub $sub_args "$@"`
exitcode=$?
result=`echo $result | sed "s/^arm-/aarch64-/"`
echo $result
exit $exitcode
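Note: the net effect of the aarch64 handling above is a rewrite-and-restore around the bundled autoconf-config.sub; a worked example (triplet values illustrative):
# Input:                              aarch64-linux-gnu
# Passed to autoconf-config.sub as:   arm-linux-gnu
# Canonical result from config.sub:   arm-unknown-linux-gnu
# Mapped back by the wrapper:         aarch64-unknown-linux-gnu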

Some files were not shown because too many files have changed in this diff.