diff --git a/.hgtags-top-repo b/.hgtags-top-repo index 18a8b5ba2ef..67c185a8e1b 100644 --- a/.hgtags-top-repo +++ b/.hgtags-top-repo @@ -402,3 +402,6 @@ ef056360ddf3977d7d2ddbeb456a4d612d19ea05 jdk-9+152 783ec7542cf7154e5d2b87f55bb97d28f81e9ada jdk-9+156 4eb77fb98952dc477a4229575c81d2263a9ce711 jdk-9+157 a4087bc10a88a43ea3ad0919b5b4af1c86977221 jdk-9+158 +fe8466adaef8178dba94be53c789a0aaa87d13bb jdk-9+159 +4d29ee32d926ebc960072d51a3bc558f95c1cbad jdk-9+160 +cda60babd152d889aba4d8f20a8f643ab151d3de jdk-9+161 diff --git a/README b/README index 477b38887fc..537dea30aec 100644 --- a/README +++ b/README @@ -1,40 +1,10 @@ -README: - This file should be located at the top of the OpenJDK Mercurial root - repository. A full OpenJDK repository set (forest) should also include - the following 7 nested repositories: - "jdk", "hotspot", "langtools", "nashorn", "corba", "jaxws" and "jaxp". +Welcome to OpenJDK! +=================== - The root repository can be obtained with something like: - hg clone http://hg.openjdk.java.net/jdk9/jdk9 openjdk9 +For information about building OpenJDK, including how to fully retrieve all +source code, please see either of these: - You can run the get_source.sh script located in the root repository to get - the other needed repositories: - cd openjdk9 && sh ./get_source.sh + * common/doc/building.html (html version) + * common/doc/building.md (markdown version) - People unfamiliar with Mercurial should read the first few chapters of - the Mercurial book: http://hgbook.red-bean.com/read/ - - See http://openjdk.java.net/ for more information about OpenJDK. - -Simple Build Instructions: - - 0. Get the necessary system software/packages installed on your system, see - http://hg.openjdk.java.net/jdk9/jdk9/raw-file/tip/README-builds.html - - 1. If you don't have a jdk8 or newer jdk, download and install it from - http://java.sun.com/javase/downloads/index.jsp - Add the /bin directory of this installation to your PATH environment - variable. - - 2. Configure the build: - bash ./configure - - 3. Build the OpenJDK: - make all - The resulting JDK image should be found in build/*/images/jdk - -where make is GNU make 3.81 or newer, /usr/bin/make on Linux usually -is 3.81 or newer. Note that on Solaris, GNU make is called "gmake". - -Complete details are available in the file: - http://hg.openjdk.java.net/jdk9/jdk9/raw-file/tip/README-builds.html +See http://openjdk.java.net/ for more information about OpenJDK. diff --git a/README-builds.html b/README-builds.html deleted file mode 100644 index 6d7d5b52461..00000000000 --- a/README-builds.html +++ /dev/null @@ -1,1406 +0,0 @@ - -
This README file contains build instructions for the OpenJDK. Building the source code for the OpenJDK requires a certain degree of technical expertise.
Some Headlines:

 * "configure && make" style build
 * vsvars*.bat and vcvars*.bat files are run automatically

The OpenJDK sources are maintained with the revision control system Mercurial. If you are new to Mercurial, please see the Beginner Guides or refer to the Mercurial Book. The first few chapters of the book provide an excellent overview of Mercurial, what it is and how it works.
- -For using Mercurial with the OpenJDK refer to the Developer Guide: Installing -and Configuring Mercurial section for more information.
To get the entire set of OpenJDK Mercurial repositories, use the script get_source.sh located in the root repository:

  hg clone http://hg.openjdk.java.net/jdk9/jdk9 YourOpenJDK
  cd YourOpenJDK
  bash ./get_source.sh
Once you have all the repositories, keep in mind that each repository is its own independent repository. You can also re-run ./get_source.sh anytime to pull over all the latest changesets in all the repositories. This set of nested repositories has been given the term "forest", and there are various ways to apply the same hg command to each of the repositories. For example, the script make/scripts/hgforest.sh can be used to repeat the same hg command on every repository, e.g.

  cd YourOpenJDK
  bash ./make/scripts/hgforest.sh status
-The set of repositories and what they contain:
There are some very basic guidelines:

 * All generated files need to be kept isolated from the files managed by the source control system; the standard area for generated files is the top-level build/ directory.
 * The .hgignore file in each repository must exist and should include ^build/, ^dist/ and optionally any nbproject/private directories (a minimal example follows this list). It should NEVER include anything in the src/ or test/ or any managed directory area of a repository.
 * Generated source or binary files should NEVER be added to the repository (that includes javah output). There are some exceptions to this rule, in particular with some of the generated configure scripts.
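A minimal .hgignore sketch matching the guideline above (entries shown are only the ones the guideline itself names):

  syntax: regexp
  ^build/
  ^dist/
  nbproject/private/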
- -Building the OpenJDK is now done with running a configure
script which will
-try and find and verify you have everything you need, followed by running
-make
, e.g.
-- --
bash ./configure
-make all
Where possible the configure
script will attempt to located the various
-components in the default locations or via component specific variable
-settings. When the normal defaults fail or components cannot be found,
-additional configure
options may be necessary to help configure
find the
-necessary tools for the build, or you may need to re-visit the setup of your
-system due to missing software packages.
NOTE: The configure
script file does not have execute permissions and
-will need to be explicitly run with bash
, see the source guidelines.
Before even attempting to use a system to build the OpenJDK there are some very -basic system setups needed. For all systems:
- -Be sure the GNU make utility is version 3.81 (4.0 on windows) or newer, e.g.
-run "make -version
"
Install a Bootstrap JDK. All OpenJDK builds require access to a previously -released JDK called the bootstrap JDK or boot JDK. The general rule is -that the bootstrap JDK must be an instance of the previous major release of -the JDK. In addition, there may be a requirement to use a release at or -beyond a particular update level.
- -Building JDK 9 requires JDK 8. JDK 9 developers should not use JDK 9 as -the boot JDK, to ensure that JDK 9 dependencies are not introduced into the -parts of the system that are built with JDK 8.
- -The JDK 8 binaries can be downloaded from Oracle's JDK 8 download
-site.
-For build performance reasons it is very important that this bootstrap JDK
-be made available on the local disk of the machine doing the build. You
-should add its bin
directory to the PATH
environment variable. If
-configure
has any issues finding this JDK, you may need to use the
-configure
option --with-boot-jdk
.
Ensure that GNU make, the Bootstrap JDK, and the compilers are all in your -PATH environment variable.
And for specific systems:
- -Linux
- -Install all the software development packages needed including -alsa, freetype, cups, and -xrender. See specific system packages.
Solaris
- -Install all the software development packages needed including Studio -Compilers, freetype, cups, and -xrender. See specific system packages.
Windows
- -Mac OS X
- -Install XCode 6.3
With Linux, try and favor the system packages over building your own or getting -packages from other areas. Most Linux builds should be possible with the -system's available packages.
- -Note that some Linux systems have a habit of pre-populating your environment
Note that some Linux systems have a habit of pre-populating your environment variables for you, for example JAVA_HOME might get pre-defined for you to refer to the JDK installed on your Linux system. You will need to unset JAVA_HOME. It's a good idea to run env and verify that the environment variables you are getting from the default system settings make sense for building the OpenJDK.
- -The Solaris Studio installation should contain at least these packages:
- --- -- -
  Package                                            Version
  developer/solarisstudio-124/backend                12.4-1.0.6.0
  developer/solarisstudio-124/c++                    12.4-1.0.10.0
  developer/solarisstudio-124/cc                     12.4-1.0.4.0
  developer/solarisstudio-124/library/c++-libs       12.4-1.0.10.0
  developer/solarisstudio-124/library/math-libs      12.4-1.0.0.1
  developer/solarisstudio-124/library/studio-gccrt   12.4-1.0.0.1
  developer/solarisstudio-124/studio-common          12.4-1.0.0.1
  developer/solarisstudio-124/studio-ja              12.4-1.0.0.1
  developer/solarisstudio-124/studio-legal           12.4-1.0.0.1
  developer/solarisstudio-124/studio-zhCN            12.4-1.0.0.1
In particular backend 12.4-1.0.6.0 contains a critical patch for the sparc -version.
- -Place the bin
directory in PATH
.
The Oracle Solaris Studio Express compilers at: Oracle Solaris Studio Express -Download site are also an option, although these compilers -have not been extensively used yet.
- - - -Building on Windows requires a Unix-like environment, notably a Unix-like -shell. There are several such environments available of which -Cygwin and -MinGW/MSYS are currently supported for the -OpenJDK build. One of the differences of these systems from standard Windows -tools is the way they handle Windows path names, particularly path names which -contain spaces, backslashes as path separators and possibly drive letters. -Depending on the use case and the specifics of each environment these path -problems can be solved by a combination of quoting whole paths, translating -backslashes to forward slashes, escaping backslashes with additional -backslashes and translating the path names to their "8.3" -version.
- - - -CYGWIN is an open source, Linux-like environment which tries to emulate a
-complete POSIX layer on Windows. It tries to be smart about path names and can
-usually handle all kinds of paths if they are correctly quoted or escaped
-although internally it maps drive letters <drive>:
to a virtual directory
-/cygdrive/<drive>
.
You can always use the cygpath
utility to map pathnames with spaces or the
-backslash character into the C:/
style of pathname (called 'mixed'), e.g.
-cygpath -s -m "<path>"
.
Note that the use of CYGWIN creates a unique problem with regards to setting
-PATH
. Normally on Windows the PATH
variable contains directories
-separated with the ";" character (Solaris and Linux use ":"). With CYGWIN, it
-uses ":", but that means that paths like "C:/path" cannot be placed in the
-CYGWIN version of PATH
and instead CYGWIN uses something like
-/cygdrive/c/path
which CYGWIN understands, but only CYGWIN understands.
The OpenJDK build requires CYGWIN version 1.7.16 or newer. Information about -CYGWIN can be obtained from the CYGWIN website at -www.cygwin.com.
- -By default CYGWIN doesn't install all the tools required for building the -OpenJDK. Along with the default installation, you need to install the following -tools.
- --- -- -
  Binary Name   Category      Package    Description
  ar.exe        Devel         binutils   The GNU assembler, linker and binary utilities
  make.exe      Devel         make       The GNU version of the 'make' utility built for CYGWIN
  m4.exe        Interpreters  m4         GNU implementation of the traditional Unix macro processor
  cpio.exe      Utils         cpio       A program to manage archives of files
  gawk.exe      Utils         awk        Pattern-directed scanning and processing language
  file.exe      Utils         file       Determines file type using 'magic' numbers
  zip.exe       Archive       zip        Package and compress (archive) files
  unzip.exe     Archive       unzip      Extract compressed files in a ZIP archive
  free.exe      System        procps     Display amount of free and used memory in the system
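These packages can also be added unattended from a Windows command prompt with the Cygwin installer; this is only a sketch, assuming a 64-bit setup executable and using the package names from the table above:

  setup-x86_64.exe -q -P binutils,make,m4,cpio,awk,file,zip,unzip,procps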
Note that the CYGWIN software can conflict with other non-CYGWIN software on -your Windows system. CYGWIN provides a FAQ for known issues and problems, of particular interest is the -section on BLODA (applications that interfere with -CYGWIN).
- - - -MinGW ("Minimalist GNU for Windows") is a collection of free Windows specific
-header files and import libraries combined with GNU toolsets that allow one to
-produce native Windows programs that do not rely on any 3rd-party C runtime
-DLLs. MSYS is a supplement to MinGW which allows building applications and
-programs which rely on traditional UNIX tools to be present. Among others this
-includes tools like bash
and make
. See MinGW/MSYS for more information.
Like Cygwin, MinGW/MSYS can handle different types of path formats. They are
-internally converted to paths with forward slashes and drive letters
-<drive>:
replaced by a virtual directory /<drive>
. Additionally, MSYS
-automatically detects binaries compiled for the MSYS environment and feeds them
-with the internal, Unix-style path names. If native Windows applications are
-called from within MSYS programs their path arguments are automatically
-converted back to Windows style path names with drive letters and backslashes
-as path separators. This may cause problems for Windows applications which use
-forward slashes as parameter separator (e.g. cl /nologo /I
) because MSYS may
-wrongly replace such parameters by drive letters.
In addition to the tools which will be installed by default, you have to
-manually install the msys-zip
and msys-unzip
packages. This can be easily
-done with the MinGW command line installer:
mingw-get.exe install msys-zip
- mingw-get.exe install msys-unzip
-
-
-
-
-The 32-bit and 64-bit OpenJDK Windows build requires Microsoft Visual Studio
-C++ 2013 (VS2013) Professional Edition or Express compiler. The compiler and
-other tools are expected to reside in the location defined by the variable
-VS120COMNTOOLS
which is set by the Microsoft Visual Studio installer.
Only the C++ part of VS2013 is needed. Try to let the installation go to the -default install directory. Always reboot your system after installing VS2013. -The system environment variable VS120COMNTOOLS should be set in your -environment.
- -Make sure that TMP and TEMP are also set in the environment and refer to
-Windows paths that exist, like C:\temp
, not /tmp
, not /cygdrive/c/temp
,
-and not C:/temp
. C:\temp
is just an example, it is assumed that this area
-is private to the user, so by default after installs you should see a unique
-user path in these variables.
Make sure you get the right XCode version.
- -The basic invocation of the configure
script looks like:
-- --
bash ./configure [options]
This will create an output directory containing the "configuration" and setup -an area for the build result. This directory typically looks like:
- --- --
build/linux-x64-normal-server-release
configure
will try to figure out what system you are running on and where all
-necessary build components are. If you have all prerequisites for building
-installed, it should find everything. If it fails to detect any component
-automatically, it will exit and inform you about the problem. When this
-happens, read more below in the configure
options.
Some examples:
- --- - - -Windows 32bit build with freetype specified:
- -
-bash ./configure --with-freetype=/cygdrive/c/freetype-i586 --with-target- -bits=32
Debug 64bit Build:
-
-bash ./configure --enable-debug --with-target-bits=64
Complete details on all the OpenJDK configure
options can be seen with:
-- --
bash ./configure --help=short
Use -help
to see all the configure
options available. You can generate any
-number of different configurations, e.g. debug, release, 32, 64, etc.
Some of the more commonly used configure
options are:
-- - - --
--enable-debug
- set the debug level to fastdebug (this is a shorthand for--with-debug- - level=fastdebug
)
-- - - -- -
--with-alsa=
path
- select the location of the Advanced Linux Sound Architecture (ALSA)Version 0.9.1 or newer of the ALSA files are required for building the - OpenJDK on Linux. These Linux files are usually available from an "alsa" of - "libasound" development package, and it's highly recommended that you try - and use the package provided by the particular version of Linux that you are - using.
- -- -
--with-boot-jdk=
path
- select the Bootstrap JDK- -
--with-boot-jdk-jvmargs=
"args"
- provide the JVM options to be used to run the Bootstrap JDK- -
--with-cacerts=
path
- select the path to the cacerts file.See Certificate Authority on Wikipedia for a better understanding of the Certificate - Authority (CA). A certificates file named "cacerts" represents a system-wide - keystore with CA certificates. In JDK and JRE binary bundles, the "cacerts" - file contains root CA certificates from several public CAs (e.g., VeriSign, - Thawte, and Baltimore). The source contain a cacerts file without CA root - certificates. Formal JDK builders will need to secure permission from each - public CA and include the certificates into their own custom cacerts file. - Failure to provide a populated cacerts file will result in verification - errors of a certificate chain during runtime. By default an empty cacerts - file is provided and that should be fine for most JDK developers.
-
-- - - -- -
--with-cups=
path
- select the CUPS install locationThe Common UNIX Printing System (CUPS) Headers are required for building the - OpenJDK on Solaris and Linux. The Solaris header files can be obtained by - installing the package print/cups.
- -The CUPS header files can always be downloaded from - www.cups.org.
- -- -
--with-cups-include=
path
- select the CUPS include directory location- -
--with-debug-level=
level
- select the debug information level of release, fastdebug, or slowdebug-
--with-dev-kit=
path
- select location of the compiler install or developer install location
-- - - -- -
--with-freetype=
path
- select the freetype files to use.Expecting the freetype libraries under
- -lib/
and the headers under -include/
.Version 2.3 or newer of FreeType is required. On Unix systems required files - can be available as part of your distribution (while you still may need to - upgrade them). Note that you need development version of package that - includes both the FreeType library and header files.
- -You can always download latest FreeType version from the FreeType - website. Building the freetype 2 libraries from - scratch is also possible, however on Windows refer to the Windows FreeType - DLL build instructions.
- -Note that by default FreeType is built with byte code hinting support - disabled due to licensing restrictions. In this case, text appearance and - metrics are expected to differ from Sun's official JDK build. See the - SourceForge FreeType2 Home Page - for more information.
- -- -
--with-import-hotspot=
path
- select the location to find hotspot binaries from a previous build to avoid - building hotspot- -
--with-target-bits=
arg
- select 32 or 64 bit build- -
--with-jvm-variants=
variants
- select the JVM variants to build from, comma separated list that can - include: server, client, kernel, zero and zeroshark- -
--with-memory-size=
size
- select the RAM size that GNU make will think this system has- -
--with-msvcr-dll=
path
- select themsvcr100.dll
file to include in the Windows builds (C/C++ - runtime library for Visual Studio).This is usually picked up automatically from the redist directories of - Visual Studio 2013.
- --
--with-num-cores=
cores
- select the number of cores to use (processor count or CPU count)
-- -- -
--with-x=
path
- select the location of the X11 and xrender files.The XRender Extension Headers are required for building the OpenJDK on - Solaris and Linux. The Linux header files are usually available from a - "Xrender" development package, it's recommended that you try and use the - package provided by the particular distribution of Linux that you are using. - The Solaris XRender header files is included with the other X11 header files - in the package SFWxwinc on new enough versions of Solaris and will be - installed in
-/usr/X11/include/X11/extensions/Xrender.h
or -/usr/openwin/share/include/X11/extensions/Xrender.h
The basic invocation of the make
utility looks like:
-- --
make all
This will start the build to the output directory containing the
-"configuration" that was created by the configure
script. Run make help
for
-more information on the available targets.
There are some of the make targets that are of general interest:
- --- -empty
 * empty - build everything but no images
 * all - build everything including images
 * all-conf - build all configurations
 * images - create complete j2sdk and j2re images
 * install - install the generated images locally, typically in /usr/local
 * clean - remove all files generated by make, but not those generated by configure
 * dist-clean - remove all files generated by both make and configure (basically killing the configuration)
 * help - give some help on using make, including some interesting make targets
-associated files in the j2sdk-image
directory in the output directory. In
-particular, the build/*/images/j2sdk-image/bin
directory should contain
-executables for the OpenJDK tools and utilities for that configuration. The
-testing tool jtreg
will be needed and can be found at: the jtreg
-site. The provided regression tests in the
-repositories can be run with the command:
-- --
cd test && make PRODUCT_HOME=`pwd`/../build/*/images/j2sdk-image all
Q: The generated-configure.sh
file looks horrible! How are you going to
-edit it?
-A: The generated-configure.sh
file is generated (think "compiled") by the
-autoconf tools. The source code is in configure.ac
and various .m4 files in
-common/autoconf, which are much more readable.
Q: Why is the generated-configure.sh
file checked in, if it is
-generated?
-A: If it was not generated, every user would need to have the autoconf
-tools installed, and re-generate the configure
file as the first step. Our
-goal is to minimize the work needed to be done by the user to start building
-OpenJDK, and to minimize the number of external dependencies required.
Q: Do you require a specific version of autoconf for regenerating
-generated-configure.sh
?
-A: Yes, version 2.69 is required and should be easy enough to aquire on all
-supported operating systems. The reason for this is to avoid large spurious
-changes in generated-configure.sh
.
Q: How do you regenerate generated-configure.sh
after making changes to
-the input files?
-A: Regnerating generated-configure.sh
should always be done using the
-script common/autoconf/autogen.sh
to ensure that the correct files get
-updated. This script should also be run after mercurial tries to merge
-generated-configure.sh
as a merge of the generated file is not guaranteed to
-be correct.
Q: What are the files in common/makefiles/support/*
for? They look like
-gibberish.
-A: They are a somewhat ugly hack to compensate for command line length
-limitations on certain platforms (Windows, Solaris). Due to a combination of
-limitations in make and the shell, command lines containing too many files will
-not work properly. These helper files are part of an elaborate hack that will
-compress the command line in the makefile and then uncompress it safely. We're
-not proud of it, but it does fix the problem. If you have any better
-suggestions, we're all ears! :-)
Q: I want to see the output of the commands that make runs, like in the old
-build. How do I do that?
-A: You specify the LOG
variable to make. There are several log levels:
warn
-- Default and very quiet.info
-- Shows more progress information than warn.debug
-- Echos all command lines and prints all macro calls for
-compilation definitions.trace
-- Echos all $(shell) command lines as well.Q: When do I have to re-run configure
?
-A: Normally you will run configure
only once for creating a
-configuration. You need to re-run configuration only if you want to change any
-configuration options, or if you pull down changes to the configure
script.
Q: I have added a new source file. Do I need to modify the makefiles?
-A: Normally, no. If you want to create e.g. a new native library, you will
-need to modify the makefiles. But for normal file additions or removals, no
-changes are needed. There are certan exceptions for some native libraries where
-the source files are spread over many directories which also contain sources
-for other libraries. In these cases it was simply easier to create include
-lists rather than excludes.
Q: When I run configure --help
, I see many strange options, like
---dvidir
. What is this?
-A: Configure provides a slew of options by default, to all projects that
-use autoconf. Most of them are not used in OpenJDK, so you can safely ignore
-them. To list only OpenJDK specific features, use configure --help=short
-instead.
Q: configure
provides OpenJDK-specific features such as --with-
-builddeps-server
that are not described in this document. What about those?
-A: Try them out if you like! But be aware that most of these are
-experimental features. Many of them don't do anything at all at the moment; the
-option is just a placeholder. Others depend on pieces of code or infrastructure
-that is currently not ready for prime time.
Q: How will you make sure you don't break anything?
-A: We have a script that compares the result of the new build system with
-the result of the old. For most part, we aim for (and achieve) byte-by-byte
-identical output. There are however technical issues with e.g. native binaries,
-which might differ in a byte-by-byte comparison, even when building twice with
-the old build system. For these, we compare relevant aspects (e.g. the symbol
-table and file size). Note that we still don't have 100% equivalence, but we're
-close.
Q: I noticed this thing X in the build that looks very broken by design.
-Why don't you fix it?
-A: Our goal is to produce a build output that is as close as technically
-possible to the old build output. If things were weird in the old build, they
-will be weird in the new build. Often, things were weird before due to
-obscurity, but in the new build system the weird stuff comes up to the surface.
-The plan is to attack these things at a later stage, after the new build system
-is established.
Q: The code in the new build system is not that well-structured. Will you
-fix this?
-A: Yes! The new build system has grown bit by bit as we converted the old
-system. When all of the old build system is converted, we can take a step back
-and clean up the structure of the new build system. Some of this we plan to do
-before replacing the old build system and some will need to wait until after.
Q: Is anything able to use the results of the new build's default make
-target?
-A: Yes, this is the minimal (or roughly minimal) set of compiled output
-needed for a developer to actually execute the newly built JDK. The idea is
-that in an incremental development fashion, when doing a normal make, you
-should only spend time recompiling what's changed (making it purely
-incremental) and only do the work that's needed to actually run and test your
-code. The packaging stuff that is part of the images
target is not needed for
-a normal developer who wants to test his new code. Even if it's quite fast,
-it's still unnecessary. We're targeting sub-second incremental rebuilds! ;-)
-(Or, well, at least single-digit seconds...)
Q: I usually set a specific environment variable when building, but I can't
-find the equivalent in the new build. What should I do?
-A: It might very well be that we have neglected to add support for an
-option that was actually used from outside the build system. Email us and we
-will add support for it!
Building OpenJDK requires a lot of horsepower. Some of the build tools can be
-adjusted to utilize more or less of resources such as parallel threads and
-memory. The configure
script analyzes your system and selects reasonable
-values for such options based on your hardware. If you encounter resource
-problems, such as out of memory conditions, you can modify the detected values
-with:
--with-num-cores
-- number of cores in the build system, e.g.
---with-num-cores=8
--with-memory-size
-- memory (in MB) available in the build system,
-e.g. --with-memory-size=1024
It might also be necessary to specify the JVM arguments passed to the Bootstrap
-JDK, using e.g. --with-boot-jdk-jvmargs="-Xmx8G -enableassertions"
. Doing
-this will override the default JVM arguments passed to the Bootstrap JDK.
One of the top goals of the new build system is to improve the build -performance and decrease the time needed to build. This will soon also apply to -the java compilation when the Smart Javac wrapper is fully supported.
- -At the end of a successful execution of configure
, you will get a performance
-summary, indicating how well the build will perform. Here you will also get
-performance hints. If you want to build fast, pay attention to those!
The OpenJDK build supports building with ccache when using gcc or clang. Using
-ccache can radically speed up compilation of native code if you often rebuild
-the same sources. Your milage may vary however so we recommend evaluating it
-for yourself. To enable it, make sure it's on the path and configure with
---enable-ccache
.
If you are using network shares, e.g. via NFS, for your source code, make sure -the build directory is situated on local disk. The performance penalty is -extremely high for building on a network share, close to unusable.
- -The old build builds multiple JVMs on 32-bit systems (client and server; and on
-Windows kernel as well). In the new build we have changed this default to only
-build server when it's available. This improves build times for those not
-interested in multiple JVMs. To mimic the old behavior on platforms that
-support it, use --with-jvm-variants=client,server
.
By default, configure
will analyze your machine and run the make process in
-parallel with as many threads as you have cores. This behavior can be
-overridden, either "permanently" (on a configure
basis) using
---with-num-cores=N
or for a single build only (on a make basis), using
-make JOBS=N
.
If you want to make a slower build just this time, to save some CPU power for
-other processes, you can run e.g. make JOBS=2
. This will force the makefiles
-to only run 2 parallel processes, or even make JOBS=1
which will disable
-parallelism.
If you want to have it the other way round, namely having slow builds default
-and override with fast if you're impatient, you should call configure
with
---with-num-cores=2
, making 2 the default. If you want to run with more cores,
-run make JOBS=8
If the build fails (and it's not due to a compilation error in a source file
-you've changed), the first thing you should do is to re-run the build with more
-verbosity. Do this by adding LOG=debug
to your make command line.
The build log (with both stdout and stderr intermingled, basically the same as
-you see on your console) can be found as build.log
in your build directory.
You can ask for help on build problems with the new build system on either the -build-dev or the -build-infra-dev -mailing lists. Please include the relevant parts of the build log.
- -A build can fail for any number of reasons. Most failures are a result of
-trying to build in an environment in which all the pre-build requirements have
-not been met. The first step in troubleshooting a build failure is to recheck
-that you have satisfied all the pre-build requirements for your platform.
-Scanning the configure
log is a good first step, making sure that what it
-found makes sense for your system. Look for strange error messages or any
-difficulties that configure
had in finding things.
Some of the more common problems with builds are briefly described below, with -suggestions for remedies.
- -Corrupted Bundles on Windows:
-Some virus scanning software has been known to corrupt the downloading of
-zip bundles. It may be necessary to disable the 'on access' or 'real time'
-virus scanning features to prevent this corruption. This type of 'real time'
-virus scanning can also slow down the build process significantly.
-Temporarily disabling the feature, or excluding the build output directory
-may be necessary to get correct and faster builds.
Slow Builds:
-If your build machine seems to be overloaded from too many simultaneous C++
-compiles, try setting the JOBS=1
on the make
command line. Then try
-increasing the count slowly to an acceptable level for your system. Also:
Creating the javadocs can be very slow, if you are running javadoc, consider -skipping that step.
- -Faster CPUs, more RAM, and a faster DISK usually helps. The VM build tends -to be CPU intensive (many C++ compiles), and the rest of the JDK will often -be disk intensive.
- -Faster compiles are possible using a tool called -ccache.
File time issues:
-If you see warnings that refer to file time stamps, e.g.
-- -Warning message:
-File 'xxx' has modification time in the future.
-Warning message:Clock skew detected. Your build may be incomplete.
These warnings can occur when the clock on the build machine is out of sync -with the timestamps on the source files. Other errors, apparently unrelated -but in fact caused by the clock skew, can occur along with the clock skew -warnings. These secondary errors may tend to obscure the fact that the true -root cause of the problem is an out-of-sync clock.
- -If you see these warnings, reset the clock on the build machine, run
-"gmake clobber
" or delete the directory containing the build output, and
-restart the build from the beginning.
Error message: Trouble writing out table to disk
-Increase the amount of swap space on your build machine. This could be
-caused by overloading the system and it may be necessary to use:
-- --
make JOBS=1
to reduce the load on the system.
Error Message: libstdc++ not found
:
-This is caused by a missing libstdc++.a library. This is installed as part
-of a specific package (e.g. libstdc++.so.devel.386). By default some 64-bit
-Linux versions (e.g. Fedora) only install the 64-bit version of the
-libstdc++ package. Various parts of the JDK build require a static link of
-the C++ runtime libraries to allow for maximum portability of the built
-images.
Linux Error Message: cannot restore segment prot after reloc
-This is probably an issue with SELinux (See SELinux on
-Wikipedia). Parts of the VM is built
-without the -fPIC
for performance reasons.
To completely disable SELinux:
- -$ su root
# system-config-securitylevel
In the window that appears, select the SELinux tab
Disable SELinux
Alternatively, instead of completely disabling it you could disable just -this one check.
- -Windows Error Messages:
-*** fatal error - couldn't allocate heap, ...
-rm fails with "Directory not empty"
-unzip fails with "cannot create ... Permission denied"
-unzip fails with "cannot create ... Error 50"
The CYGWIN software can conflict with other non-CYGWIN software. See the -CYGWIN FAQ section on BLODA (applications that interfere with -CYGWIN).
Windows Error Message: spawn failed
-Try rebooting the system, or there could be some kind of issue with the disk
-or disk partition being used. Sometimes it comes with a "Permission Denied"
-message.
The Makefiles in the OpenJDK are only valid when used with the GNU version of
-the utility command make
(usually called gmake
on Solaris). A few notes
-about using GNU make:
PATH
./usr/bin/make
on Solaris. If your Solaris system
-has the software from the Solaris Developer Companion CD installed, you
-should try and use /usr/bin/gmake
or /usr/gnu/bin/make
.Information on GNU make, and access to ftp download sites, are available on the -GNU make web site . The latest -source to GNU make is available at -ftp.gnu.org/pub/gnu/make/.
- - - -First step is to get the GNU make 3.81 or newer source from -ftp.gnu.org/pub/gnu/make/. Building is a -little different depending on the OS but is basically done with:
- - bash ./configure
- make
-
-
-This file often describes specific requirements for what we call the "minimum -build environments" (MBE) for this specific release of the JDK. What is listed -below is what the Oracle Release Engineering Team will use to build the Oracle -JDK product. Building with the MBE will hopefully generate the most compatible -bits that install on, and run correctly on, the most variations of the same -base OS and hardware architecture. In some cases, these represent what is often -called the least common denominator, but each Operating System has different -aspects to it.
- -In all cases, the Bootstrap JDK version minimum is critical, we cannot -guarantee builds will work with older Bootstrap JDK's. Also in all cases, more -RAM and more processors is better, the minimums listed below are simply -recommendations.
- -With Solaris and Mac OS X, the version listed below is the oldest release we -can guarantee builds and works, and the specific version of the compilers used -could be critical.
- -With Windows the critical aspect is the Visual Studio compiler used, which due -to it's runtime, generally dictates what Windows systems can do the builds and -where the resulting bits can be used.
- -NOTE: We expect a change here off these older Windows OS releases and to a -'less older' one, probably Windows 2008R2 X64.
- -With Linux, it was just a matter of picking a stable distribution that is a -good representative for Linux in general.
- -It is understood that most developers will NOT be using these specific -versions, and in fact creating these specific versions may be difficult due to -the age of some of this software. It is expected that developers are more often -using the more recent releases and distributions of these operating systems.
- -Compilation problems with newer or different C/C++ compilers is a common
-problem. Similarly, compilation problems related to changes to the
-/usr/include
or system header files is also a common problem with older,
-newer, or unreleased OS versions. Please report these types of problems as bugs
-so that they can be dealt with accordingly.
-- -- -
  Base OS and Architecture             OS                           C/C++ Compiler                                          Bootstrap JDK  Processors  RAM Minimum  DISK Needs
  Linux X86 (32-bit) and X64 (64-bit)  Oracle Enterprise Linux 6.4  gcc 4.9.2                                               JDK 8          2 or more   1 GB         6 GB
  Solaris SPARCV9 (64-bit)             Solaris 11 Update 1          Studio 12 Update 4 + patches                            JDK 8          4 or more   4 GB         8 GB
  Solaris X64 (64-bit)                 Solaris 11 Update 1          Studio 12 Update 4 + patches                            JDK 8          4 or more   4 GB         8 GB
  Windows X86 (32-bit)                 Windows Server 2012 R2 x64   Microsoft Visual Studio C++ 2013 Professional Edition   JDK 8          2 or more   2 GB         6 GB
  Windows X64 (64-bit)                 Windows Server 2012 R2 x64   Microsoft Visual Studio C++ 2013 Professional Edition   JDK 8          2 or more   2 GB         6 GB
  Mac OS X X64 (64-bit)                Mac OS X 10.9 "Mavericks"    Xcode 6.3 or newer                                      JDK 8          2 or more   4 GB         6 GB
- -NOTE: The community can help out by updating this part of the document.
- -After installing the latest Fedora you need to
-install several build dependencies. The simplest way to do it is to execute the
-following commands as user root
:
yum-builddep java-1.7.0-openjdk
- yum install gcc gcc-c++
-
-
-In addition, it's necessary to set a few environment variables for the build:
- - export LANG=C
- export PATH="/usr/lib/jvm/java-openjdk/bin:${PATH}"
-
-
-After installing CentOS 5.5 you need to make sure you -have the following Development bundles installed:
- -Plus the following packages:
- -The freetype 2.3 packages don't seem to be available, but the freetype 2.3 -sources can be downloaded, built, and installed easily enough from the -freetype site. Build and install -with something like:
- - bash ./configure
- make
- sudo -u root make install
-
-
-Mercurial packages could not be found easily, but a Google search should find -ones, and they usually include Python if it's needed.
- -After installing Debian 5 you need to install several
-build dependencies. The simplest way to install the build dependencies is to
-execute the following commands as user root
:
aptitude build-dep openjdk-7
- aptitude install openjdk-7-jdk libmotif-dev
-
-
-In addition, it's necessary to set a few environment variables for the build:
- - export LANG=C
- export PATH="/usr/lib/jvm/java-7-openjdk/bin:${PATH}"
-
-
-After installing Ubuntu 12.04 you need to install several -build dependencies. The simplest way to do it is to execute the following -commands:
- - sudo aptitude build-dep openjdk-7
- sudo aptitude install openjdk-7-jdk
-
-
-In addition, it's necessary to set a few environment variables for the build:
- - export LANG=C
- export PATH="/usr/lib/jvm/java-7-openjdk/bin:${PATH}"
-
-
-After installing OpenSUSE 11.1 you need to install -several build dependencies. The simplest way to install the build dependencies -is to execute the following commands:
- - sudo zypper source-install -d java-1_7_0-openjdk
- sudo zypper install make
-
-
-In addition, it is necessary to set a few environment variables for the build:
- - export LANG=C
- export PATH="/usr/lib/jvm/java-1.7.0-openjdk/bin:$[PATH}"
-
-
-Finally, you need to unset the JAVA_HOME
environment variable:
export -n JAVA_HOME`
-
-
-After installing Mandriva Linux One 2009 Spring you need
-to install several build dependencies. The simplest way to install the build
-dependencies is to execute the following commands as user root
:
urpmi java-1.7.0-openjdk-devel make gcc gcc-c++ freetype-devel zip unzip
- libcups2-devel libxrender1-devel libalsa2-devel libstc++-static-devel
- libxtst6-devel libxi-devel
-
-
-In addition, it is necessary to set a few environment variables for the build:
- - export LANG=C
- export PATH="/usr/lib/jvm/java-1.7.0-openjdk/bin:${PATH}"
-
-
-After installing OpenSolaris 2009.06 you need to -install several build dependencies. The simplest way to install the build -dependencies is to execute the following commands:
- - pfexec pkg install SUNWgmake SUNWj7dev sunstudioexpress SUNWcups SUNWzip
- SUNWunzip SUNWxwhl SUNWxorg-headers SUNWaudh SUNWfreetype2
-
-
-In addition, it is necessary to set a few environment variables for the build:
- - export LANG=C
- export PATH="/opt/SunStudioExpress/bin:${PATH}"
-
-
-End of the OpenJDK build README document.
- -Please come again!
- - diff --git a/common/autoconf/basics.m4 b/common/autoconf/basics.m4 index eb5c5f85752..20180c4446a 100644 --- a/common/autoconf/basics.m4 +++ b/common/autoconf/basics.m4 @@ -530,6 +530,7 @@ AC_DEFUN_ONCE([BASIC_SETUP_FUNDAMENTAL_TOOLS], BASIC_PATH_PROGS(DF, df) BASIC_PATH_PROGS(CPIO, [cpio bsdcpio]) BASIC_PATH_PROGS(NICE, nice) + BASIC_PATH_PROGS(PANDOC, pandoc) ]) # Setup basic configuration paths, and platform-specific stuff related to PATHs. diff --git a/common/autoconf/basics_windows.m4 b/common/autoconf/basics_windows.m4 index 1a79bf8fc86..2ae6e34a556 100644 --- a/common/autoconf/basics_windows.m4 +++ b/common/autoconf/basics_windows.m4 @@ -329,8 +329,8 @@ AC_DEFUN([BASIC_CHECK_PATHS_WINDOWS], AC_MSG_ERROR([Something is wrong with your cygwin installation since I cannot find cygpath.exe in your path]) fi AC_MSG_CHECKING([cygwin root directory as unix-style path]) - # The cmd output ends with Windows line endings (CR/LF), the grep command will strip that away - cygwin_winpath_root=`cd / ; cmd /c cd | $GREP ".*"` + # The cmd output ends with Windows line endings (CR/LF) + cygwin_winpath_root=`cd / ; cmd /c cd | $TR -d '\r\n'` # Force cygpath to report the proper root by including a trailing space, and then stripping it off again. CYGWIN_ROOT_PATH=`$CYGPATH -u "$cygwin_winpath_root " | $CUT -f 1 -d " "` AC_MSG_RESULT([$CYGWIN_ROOT_PATH]) diff --git a/common/autoconf/flags.m4 b/common/autoconf/flags.m4 index ca7f625664c..e2d82c67283 100644 --- a/common/autoconf/flags.m4 +++ b/common/autoconf/flags.m4 @@ -355,7 +355,7 @@ AC_DEFUN([FLAGS_SETUP_COMPILER_FLAGS_FOR_LIBS], SHARED_LIBRARY_FLAGS="-dynamiclib -compatibility_version 1.0.0 -current_version 1.0.0 $PICFLAG" JVM_CFLAGS="$JVM_CFLAGS $PICFLAG" fi - SET_EXECUTABLE_ORIGIN='-Wl,-rpath,@loader_path/.' + SET_EXECUTABLE_ORIGIN='-Wl,-rpath,@loader_path[$]1' SET_SHARED_LIBRARY_ORIGIN="$SET_EXECUTABLE_ORIGIN" SET_SHARED_LIBRARY_NAME='-Wl,-install_name,@rpath/[$]1' SET_SHARED_LIBRARY_MAPFILE='-Wl,-exported_symbols_list,[$]1' @@ -375,7 +375,7 @@ AC_DEFUN([FLAGS_SETUP_COMPILER_FLAGS_FOR_LIBS], # Linking is different on MacOSX PICFLAG='' SHARED_LIBRARY_FLAGS="-dynamiclib -compatibility_version 1.0.0 -current_version 1.0.0 $PICFLAG" - SET_EXECUTABLE_ORIGIN='-Wl,-rpath,@loader_path/.' + SET_EXECUTABLE_ORIGIN='-Wl,-rpath,@loader_path[$]1' SET_SHARED_LIBRARY_ORIGIN="$SET_EXECUTABLE_ORIGIN" SET_SHARED_LIBRARY_NAME='-Wl,-install_name,@rpath/[$]1' SET_SHARED_LIBRARY_MAPFILE='-Wl,-exported_symbols_list,[$]1' diff --git a/common/autoconf/generated-configure.sh b/common/autoconf/generated-configure.sh index acba47d7148..147c5869745 100644 --- a/common/autoconf/generated-configure.sh +++ b/common/autoconf/generated-configure.sh @@ -1024,6 +1024,7 @@ build_os build_vendor build_cpu build +PANDOC NICE CPIO DF @@ -1281,6 +1282,7 @@ READLINK DF CPIO NICE +PANDOC MAKE UNZIP ZIPEXE @@ -2244,6 +2246,7 @@ Some influential environment variables: DF Override default value for DF CPIO Override default value for CPIO NICE Override default value for NICE + PANDOC Override default value for PANDOC MAKE Override default value for MAKE UNZIP Override default value for UNZIP ZIPEXE Override default value for ZIPEXE @@ -5043,7 +5046,7 @@ TOOLCHAIN_MINIMUM_VERSION_xlc="" # # $1 = compiler to test (CC or CXX) # $2 = human readable name of compiler (C or C++) -# $3 = list of compiler names to search for +# $3 = compiler name to search for # Detect the core components of the toolchain, i.e. 
the compilers (CC and CXX), @@ -5170,7 +5173,7 @@ VS_SDK_PLATFORM_NAME_2013= #CUSTOM_AUTOCONF_INCLUDE # Do not change or remove the following line, it is needed for consistency checks: -DATE_WHEN_GENERATED=1486679715 +DATE_WHEN_GENERATED=1489410066 ############################################################################### # @@ -15358,6 +15361,203 @@ $as_echo "$tool_specified" >&6; } + # Publish this variable in the help. + + + if [ -z "${PANDOC+x}" ]; then + # The variable is not set by user, try to locate tool using the code snippet + for ac_prog in pandoc +do + # Extract the first word of "$ac_prog", so it can be a program name with args. +set dummy $ac_prog; ac_word=$2 +{ $as_echo "$as_me:${as_lineno-$LINENO}: checking for $ac_word" >&5 +$as_echo_n "checking for $ac_word... " >&6; } +if ${ac_cv_path_PANDOC+:} false; then : + $as_echo_n "(cached) " >&6 +else + case $PANDOC in + [\\/]* | ?:[\\/]*) + ac_cv_path_PANDOC="$PANDOC" # Let the user override the test with a path. + ;; + *) + as_save_IFS=$IFS; IFS=$PATH_SEPARATOR +for as_dir in $PATH +do + IFS=$as_save_IFS + test -z "$as_dir" && as_dir=. + for ac_exec_ext in '' $ac_executable_extensions; do + if as_fn_executable_p "$as_dir/$ac_word$ac_exec_ext"; then + ac_cv_path_PANDOC="$as_dir/$ac_word$ac_exec_ext" + $as_echo "$as_me:${as_lineno-$LINENO}: found $as_dir/$ac_word$ac_exec_ext" >&5 + break 2 + fi +done + done +IFS=$as_save_IFS + + ;; +esac +fi +PANDOC=$ac_cv_path_PANDOC +if test -n "$PANDOC"; then + { $as_echo "$as_me:${as_lineno-$LINENO}: result: $PANDOC" >&5 +$as_echo "$PANDOC" >&6; } +else + { $as_echo "$as_me:${as_lineno-$LINENO}: result: no" >&5 +$as_echo "no" >&6; } +fi + + + test -n "$PANDOC" && break +done + + else + # The variable is set, but is it from the command line or the environment? + + # Try to remove the string !PANDOC! from our list. + try_remove_var=${CONFIGURE_OVERRIDDEN_VARIABLES//!PANDOC!/} + if test "x$try_remove_var" = "x$CONFIGURE_OVERRIDDEN_VARIABLES"; then + # If it failed, the variable was not from the command line. Ignore it, + # but warn the user (except for BASH, which is always set by the calling BASH). + if test "xPANDOC" != xBASH; then + { $as_echo "$as_me:${as_lineno-$LINENO}: WARNING: Ignoring value of PANDOC from the environment. Use command line variables instead." >&5 +$as_echo "$as_me: WARNING: Ignoring value of PANDOC from the environment. Use command line variables instead." >&2;} + fi + # Try to locate tool using the code snippet + for ac_prog in pandoc +do + # Extract the first word of "$ac_prog", so it can be a program name with args. +set dummy $ac_prog; ac_word=$2 +{ $as_echo "$as_me:${as_lineno-$LINENO}: checking for $ac_word" >&5 +$as_echo_n "checking for $ac_word... " >&6; } +if ${ac_cv_path_PANDOC+:} false; then : + $as_echo_n "(cached) " >&6 +else + case $PANDOC in + [\\/]* | ?:[\\/]*) + ac_cv_path_PANDOC="$PANDOC" # Let the user override the test with a path. + ;; + *) + as_save_IFS=$IFS; IFS=$PATH_SEPARATOR +for as_dir in $PATH +do + IFS=$as_save_IFS + test -z "$as_dir" && as_dir=. 
+ for ac_exec_ext in '' $ac_executable_extensions; do + if as_fn_executable_p "$as_dir/$ac_word$ac_exec_ext"; then + ac_cv_path_PANDOC="$as_dir/$ac_word$ac_exec_ext" + $as_echo "$as_me:${as_lineno-$LINENO}: found $as_dir/$ac_word$ac_exec_ext" >&5 + break 2 + fi +done + done +IFS=$as_save_IFS + + ;; +esac +fi +PANDOC=$ac_cv_path_PANDOC +if test -n "$PANDOC"; then + { $as_echo "$as_me:${as_lineno-$LINENO}: result: $PANDOC" >&5 +$as_echo "$PANDOC" >&6; } +else + { $as_echo "$as_me:${as_lineno-$LINENO}: result: no" >&5 +$as_echo "no" >&6; } +fi + + + test -n "$PANDOC" && break +done + + else + # If it succeeded, then it was overridden by the user. We will use it + # for the tool. + + # First remove it from the list of overridden variables, so we can test + # for unknown variables in the end. + CONFIGURE_OVERRIDDEN_VARIABLES="$try_remove_var" + + # Check if we try to supply an empty value + if test "x$PANDOC" = x; then + { $as_echo "$as_me:${as_lineno-$LINENO}: Setting user supplied tool PANDOC= (no value)" >&5 +$as_echo "$as_me: Setting user supplied tool PANDOC= (no value)" >&6;} + { $as_echo "$as_me:${as_lineno-$LINENO}: checking for PANDOC" >&5 +$as_echo_n "checking for PANDOC... " >&6; } + { $as_echo "$as_me:${as_lineno-$LINENO}: result: disabled" >&5 +$as_echo "disabled" >&6; } + else + # Check if the provided tool contains a complete path. + tool_specified="$PANDOC" + tool_basename="${tool_specified##*/}" + if test "x$tool_basename" = "x$tool_specified"; then + # A command without a complete path is provided, search $PATH. + { $as_echo "$as_me:${as_lineno-$LINENO}: Will search for user supplied tool PANDOC=$tool_basename" >&5 +$as_echo "$as_me: Will search for user supplied tool PANDOC=$tool_basename" >&6;} + # Extract the first word of "$tool_basename", so it can be a program name with args. +set dummy $tool_basename; ac_word=$2 +{ $as_echo "$as_me:${as_lineno-$LINENO}: checking for $ac_word" >&5 +$as_echo_n "checking for $ac_word... " >&6; } +if ${ac_cv_path_PANDOC+:} false; then : + $as_echo_n "(cached) " >&6 +else + case $PANDOC in + [\\/]* | ?:[\\/]*) + ac_cv_path_PANDOC="$PANDOC" # Let the user override the test with a path. + ;; + *) + as_save_IFS=$IFS; IFS=$PATH_SEPARATOR +for as_dir in $PATH +do + IFS=$as_save_IFS + test -z "$as_dir" && as_dir=. + for ac_exec_ext in '' $ac_executable_extensions; do + if as_fn_executable_p "$as_dir/$ac_word$ac_exec_ext"; then + ac_cv_path_PANDOC="$as_dir/$ac_word$ac_exec_ext" + $as_echo "$as_me:${as_lineno-$LINENO}: found $as_dir/$ac_word$ac_exec_ext" >&5 + break 2 + fi +done + done +IFS=$as_save_IFS + + ;; +esac +fi +PANDOC=$ac_cv_path_PANDOC +if test -n "$PANDOC"; then + { $as_echo "$as_me:${as_lineno-$LINENO}: result: $PANDOC" >&5 +$as_echo "$PANDOC" >&6; } +else + { $as_echo "$as_me:${as_lineno-$LINENO}: result: no" >&5 +$as_echo "no" >&6; } +fi + + + if test "x$PANDOC" = x; then + as_fn_error $? "User supplied tool $tool_basename could not be found" "$LINENO" 5 + fi + else + # Otherwise we believe it is a complete path. Use it as it is. + { $as_echo "$as_me:${as_lineno-$LINENO}: Will use user supplied tool PANDOC=$tool_specified" >&5 +$as_echo "$as_me: Will use user supplied tool PANDOC=$tool_specified" >&6;} + { $as_echo "$as_me:${as_lineno-$LINENO}: checking for PANDOC" >&5 +$as_echo_n "checking for PANDOC... " >&6; } + if test ! -x "$tool_specified"; then + { $as_echo "$as_me:${as_lineno-$LINENO}: result: not found" >&5 +$as_echo "not found" >&6; } + as_fn_error $? 
"User supplied tool PANDOC=$tool_specified does not exist or is not executable" "$LINENO" 5 + fi + { $as_echo "$as_me:${as_lineno-$LINENO}: result: $tool_specified" >&5 +$as_echo "$tool_specified" >&6; } + fi + fi + fi + + fi + + + + # Now we can determine OpenJDK build and target platforms. This is required to # have early on. # Make sure we can run config.sub. @@ -16200,8 +16400,8 @@ $as_echo "$as_me: Your cygwin is too old. You are running $CYGWIN_VERSION, but a fi { $as_echo "$as_me:${as_lineno-$LINENO}: checking cygwin root directory as unix-style path" >&5 $as_echo_n "checking cygwin root directory as unix-style path... " >&6; } - # The cmd output ends with Windows line endings (CR/LF), the grep command will strip that away - cygwin_winpath_root=`cd / ; cmd /c cd | $GREP ".*"` + # The cmd output ends with Windows line endings (CR/LF) + cygwin_winpath_root=`cd / ; cmd /c cd | $TR -d '\r\n'` # Force cygpath to report the proper root by including a trailing space, and then stripping it off again. CYGWIN_ROOT_PATH=`$CYGPATH -u "$cygwin_winpath_root " | $CUT -f 1 -d " "` { $as_echo "$as_me:${as_lineno-$LINENO}: result: $CYGWIN_ROOT_PATH" >&5 @@ -33138,10 +33338,9 @@ done if test -n "$TOOLCHAIN_PATH"; then PATH_save="$PATH" PATH="$TOOLCHAIN_PATH" - for ac_prog in $SEARCH_LIST -do - # Extract the first word of "$ac_prog", so it can be a program name with args. -set dummy $ac_prog; ac_word=$2 + if test -n "$ac_tool_prefix"; then + # Extract the first word of "${ac_tool_prefix}$SEARCH_LIST", so it can be a program name with args. +set dummy ${ac_tool_prefix}$SEARCH_LIST; ac_word=$2 { $as_echo "$as_me:${as_lineno-$LINENO}: checking for $ac_word" >&5 $as_echo_n "checking for $ac_word... " >&6; } if ${ac_cv_path_TOOLCHAIN_PATH_CC+:} false; then : @@ -33180,20 +33379,73 @@ $as_echo "no" >&6; } fi - test -n "$TOOLCHAIN_PATH_CC" && break +fi +if test -z "$ac_cv_path_TOOLCHAIN_PATH_CC"; then + ac_pt_TOOLCHAIN_PATH_CC=$TOOLCHAIN_PATH_CC + # Extract the first word of "$SEARCH_LIST", so it can be a program name with args. +set dummy $SEARCH_LIST; ac_word=$2 +{ $as_echo "$as_me:${as_lineno-$LINENO}: checking for $ac_word" >&5 +$as_echo_n "checking for $ac_word... " >&6; } +if ${ac_cv_path_ac_pt_TOOLCHAIN_PATH_CC+:} false; then : + $as_echo_n "(cached) " >&6 +else + case $ac_pt_TOOLCHAIN_PATH_CC in + [\\/]* | ?:[\\/]*) + ac_cv_path_ac_pt_TOOLCHAIN_PATH_CC="$ac_pt_TOOLCHAIN_PATH_CC" # Let the user override the test with a path. + ;; + *) + as_save_IFS=$IFS; IFS=$PATH_SEPARATOR +for as_dir in $PATH +do + IFS=$as_save_IFS + test -z "$as_dir" && as_dir=. 
+ for ac_exec_ext in '' $ac_executable_extensions; do + if as_fn_executable_p "$as_dir/$ac_word$ac_exec_ext"; then + ac_cv_path_ac_pt_TOOLCHAIN_PATH_CC="$as_dir/$ac_word$ac_exec_ext" + $as_echo "$as_me:${as_lineno-$LINENO}: found $as_dir/$ac_word$ac_exec_ext" >&5 + break 2 + fi done + done +IFS=$as_save_IFS + + ;; +esac +fi +ac_pt_TOOLCHAIN_PATH_CC=$ac_cv_path_ac_pt_TOOLCHAIN_PATH_CC +if test -n "$ac_pt_TOOLCHAIN_PATH_CC"; then + { $as_echo "$as_me:${as_lineno-$LINENO}: result: $ac_pt_TOOLCHAIN_PATH_CC" >&5 +$as_echo "$ac_pt_TOOLCHAIN_PATH_CC" >&6; } +else + { $as_echo "$as_me:${as_lineno-$LINENO}: result: no" >&5 +$as_echo "no" >&6; } +fi + + if test "x$ac_pt_TOOLCHAIN_PATH_CC" = x; then + TOOLCHAIN_PATH_CC="" + else + case $cross_compiling:$ac_tool_warned in +yes:) +{ $as_echo "$as_me:${as_lineno-$LINENO}: WARNING: using cross tools not prefixed with host triplet" >&5 +$as_echo "$as_me: WARNING: using cross tools not prefixed with host triplet" >&2;} +ac_tool_warned=yes ;; +esac + TOOLCHAIN_PATH_CC=$ac_pt_TOOLCHAIN_PATH_CC + fi +else + TOOLCHAIN_PATH_CC="$ac_cv_path_TOOLCHAIN_PATH_CC" +fi CC=$TOOLCHAIN_PATH_CC PATH="$PATH_save" fi - # AC_PATH_PROGS can't be run multiple times with the same variable, + # AC_PATH_TOOL can't be run multiple times with the same variable, # so create a new name for this run. if test "x$CC" = x; then - for ac_prog in $SEARCH_LIST -do - # Extract the first word of "$ac_prog", so it can be a program name with args. -set dummy $ac_prog; ac_word=$2 + if test -n "$ac_tool_prefix"; then + # Extract the first word of "${ac_tool_prefix}$SEARCH_LIST", so it can be a program name with args. +set dummy ${ac_tool_prefix}$SEARCH_LIST; ac_word=$2 { $as_echo "$as_me:${as_lineno-$LINENO}: checking for $ac_word" >&5 $as_echo_n "checking for $ac_word... " >&6; } if ${ac_cv_path_POTENTIAL_CC+:} false; then : @@ -33232,8 +33484,62 @@ $as_echo "no" >&6; } fi - test -n "$POTENTIAL_CC" && break +fi +if test -z "$ac_cv_path_POTENTIAL_CC"; then + ac_pt_POTENTIAL_CC=$POTENTIAL_CC + # Extract the first word of "$SEARCH_LIST", so it can be a program name with args. +set dummy $SEARCH_LIST; ac_word=$2 +{ $as_echo "$as_me:${as_lineno-$LINENO}: checking for $ac_word" >&5 +$as_echo_n "checking for $ac_word... " >&6; } +if ${ac_cv_path_ac_pt_POTENTIAL_CC+:} false; then : + $as_echo_n "(cached) " >&6 +else + case $ac_pt_POTENTIAL_CC in + [\\/]* | ?:[\\/]*) + ac_cv_path_ac_pt_POTENTIAL_CC="$ac_pt_POTENTIAL_CC" # Let the user override the test with a path. + ;; + *) + as_save_IFS=$IFS; IFS=$PATH_SEPARATOR +for as_dir in $PATH +do + IFS=$as_save_IFS + test -z "$as_dir" && as_dir=. 
+ for ac_exec_ext in '' $ac_executable_extensions; do + if as_fn_executable_p "$as_dir/$ac_word$ac_exec_ext"; then + ac_cv_path_ac_pt_POTENTIAL_CC="$as_dir/$ac_word$ac_exec_ext" + $as_echo "$as_me:${as_lineno-$LINENO}: found $as_dir/$ac_word$ac_exec_ext" >&5 + break 2 + fi done + done +IFS=$as_save_IFS + + ;; +esac +fi +ac_pt_POTENTIAL_CC=$ac_cv_path_ac_pt_POTENTIAL_CC +if test -n "$ac_pt_POTENTIAL_CC"; then + { $as_echo "$as_me:${as_lineno-$LINENO}: result: $ac_pt_POTENTIAL_CC" >&5 +$as_echo "$ac_pt_POTENTIAL_CC" >&6; } +else + { $as_echo "$as_me:${as_lineno-$LINENO}: result: no" >&5 +$as_echo "no" >&6; } +fi + + if test "x$ac_pt_POTENTIAL_CC" = x; then + POTENTIAL_CC="" + else + case $cross_compiling:$ac_tool_warned in +yes:) +{ $as_echo "$as_me:${as_lineno-$LINENO}: WARNING: using cross tools not prefixed with host triplet" >&5 +$as_echo "$as_me: WARNING: using cross tools not prefixed with host triplet" >&2;} +ac_tool_warned=yes ;; +esac + POTENTIAL_CC=$ac_pt_POTENTIAL_CC + fi +else + POTENTIAL_CC="$ac_cv_path_POTENTIAL_CC" +fi CC=$POTENTIAL_CC fi @@ -34439,10 +34745,9 @@ done if test -n "$TOOLCHAIN_PATH"; then PATH_save="$PATH" PATH="$TOOLCHAIN_PATH" - for ac_prog in $SEARCH_LIST -do - # Extract the first word of "$ac_prog", so it can be a program name with args. -set dummy $ac_prog; ac_word=$2 + if test -n "$ac_tool_prefix"; then + # Extract the first word of "${ac_tool_prefix}$SEARCH_LIST", so it can be a program name with args. +set dummy ${ac_tool_prefix}$SEARCH_LIST; ac_word=$2 { $as_echo "$as_me:${as_lineno-$LINENO}: checking for $ac_word" >&5 $as_echo_n "checking for $ac_word... " >&6; } if ${ac_cv_path_TOOLCHAIN_PATH_CXX+:} false; then : @@ -34481,20 +34786,73 @@ $as_echo "no" >&6; } fi - test -n "$TOOLCHAIN_PATH_CXX" && break +fi +if test -z "$ac_cv_path_TOOLCHAIN_PATH_CXX"; then + ac_pt_TOOLCHAIN_PATH_CXX=$TOOLCHAIN_PATH_CXX + # Extract the first word of "$SEARCH_LIST", so it can be a program name with args. +set dummy $SEARCH_LIST; ac_word=$2 +{ $as_echo "$as_me:${as_lineno-$LINENO}: checking for $ac_word" >&5 +$as_echo_n "checking for $ac_word... " >&6; } +if ${ac_cv_path_ac_pt_TOOLCHAIN_PATH_CXX+:} false; then : + $as_echo_n "(cached) " >&6 +else + case $ac_pt_TOOLCHAIN_PATH_CXX in + [\\/]* | ?:[\\/]*) + ac_cv_path_ac_pt_TOOLCHAIN_PATH_CXX="$ac_pt_TOOLCHAIN_PATH_CXX" # Let the user override the test with a path. + ;; + *) + as_save_IFS=$IFS; IFS=$PATH_SEPARATOR +for as_dir in $PATH +do + IFS=$as_save_IFS + test -z "$as_dir" && as_dir=. 
+ for ac_exec_ext in '' $ac_executable_extensions; do + if as_fn_executable_p "$as_dir/$ac_word$ac_exec_ext"; then + ac_cv_path_ac_pt_TOOLCHAIN_PATH_CXX="$as_dir/$ac_word$ac_exec_ext" + $as_echo "$as_me:${as_lineno-$LINENO}: found $as_dir/$ac_word$ac_exec_ext" >&5 + break 2 + fi done + done +IFS=$as_save_IFS + + ;; +esac +fi +ac_pt_TOOLCHAIN_PATH_CXX=$ac_cv_path_ac_pt_TOOLCHAIN_PATH_CXX +if test -n "$ac_pt_TOOLCHAIN_PATH_CXX"; then + { $as_echo "$as_me:${as_lineno-$LINENO}: result: $ac_pt_TOOLCHAIN_PATH_CXX" >&5 +$as_echo "$ac_pt_TOOLCHAIN_PATH_CXX" >&6; } +else + { $as_echo "$as_me:${as_lineno-$LINENO}: result: no" >&5 +$as_echo "no" >&6; } +fi + + if test "x$ac_pt_TOOLCHAIN_PATH_CXX" = x; then + TOOLCHAIN_PATH_CXX="" + else + case $cross_compiling:$ac_tool_warned in +yes:) +{ $as_echo "$as_me:${as_lineno-$LINENO}: WARNING: using cross tools not prefixed with host triplet" >&5 +$as_echo "$as_me: WARNING: using cross tools not prefixed with host triplet" >&2;} +ac_tool_warned=yes ;; +esac + TOOLCHAIN_PATH_CXX=$ac_pt_TOOLCHAIN_PATH_CXX + fi +else + TOOLCHAIN_PATH_CXX="$ac_cv_path_TOOLCHAIN_PATH_CXX" +fi CXX=$TOOLCHAIN_PATH_CXX PATH="$PATH_save" fi - # AC_PATH_PROGS can't be run multiple times with the same variable, + # AC_PATH_TOOL can't be run multiple times with the same variable, # so create a new name for this run. if test "x$CXX" = x; then - for ac_prog in $SEARCH_LIST -do - # Extract the first word of "$ac_prog", so it can be a program name with args. -set dummy $ac_prog; ac_word=$2 + if test -n "$ac_tool_prefix"; then + # Extract the first word of "${ac_tool_prefix}$SEARCH_LIST", so it can be a program name with args. +set dummy ${ac_tool_prefix}$SEARCH_LIST; ac_word=$2 { $as_echo "$as_me:${as_lineno-$LINENO}: checking for $ac_word" >&5 $as_echo_n "checking for $ac_word... " >&6; } if ${ac_cv_path_POTENTIAL_CXX+:} false; then : @@ -34533,8 +34891,62 @@ $as_echo "no" >&6; } fi - test -n "$POTENTIAL_CXX" && break +fi +if test -z "$ac_cv_path_POTENTIAL_CXX"; then + ac_pt_POTENTIAL_CXX=$POTENTIAL_CXX + # Extract the first word of "$SEARCH_LIST", so it can be a program name with args. +set dummy $SEARCH_LIST; ac_word=$2 +{ $as_echo "$as_me:${as_lineno-$LINENO}: checking for $ac_word" >&5 +$as_echo_n "checking for $ac_word... " >&6; } +if ${ac_cv_path_ac_pt_POTENTIAL_CXX+:} false; then : + $as_echo_n "(cached) " >&6 +else + case $ac_pt_POTENTIAL_CXX in + [\\/]* | ?:[\\/]*) + ac_cv_path_ac_pt_POTENTIAL_CXX="$ac_pt_POTENTIAL_CXX" # Let the user override the test with a path. + ;; + *) + as_save_IFS=$IFS; IFS=$PATH_SEPARATOR +for as_dir in $PATH +do + IFS=$as_save_IFS + test -z "$as_dir" && as_dir=. 
+ for ac_exec_ext in '' $ac_executable_extensions; do + if as_fn_executable_p "$as_dir/$ac_word$ac_exec_ext"; then + ac_cv_path_ac_pt_POTENTIAL_CXX="$as_dir/$ac_word$ac_exec_ext" + $as_echo "$as_me:${as_lineno-$LINENO}: found $as_dir/$ac_word$ac_exec_ext" >&5 + break 2 + fi done + done +IFS=$as_save_IFS + + ;; +esac +fi +ac_pt_POTENTIAL_CXX=$ac_cv_path_ac_pt_POTENTIAL_CXX +if test -n "$ac_pt_POTENTIAL_CXX"; then + { $as_echo "$as_me:${as_lineno-$LINENO}: result: $ac_pt_POTENTIAL_CXX" >&5 +$as_echo "$ac_pt_POTENTIAL_CXX" >&6; } +else + { $as_echo "$as_me:${as_lineno-$LINENO}: result: no" >&5 +$as_echo "no" >&6; } +fi + + if test "x$ac_pt_POTENTIAL_CXX" = x; then + POTENTIAL_CXX="" + else + case $cross_compiling:$ac_tool_warned in +yes:) +{ $as_echo "$as_me:${as_lineno-$LINENO}: WARNING: using cross tools not prefixed with host triplet" >&5 +$as_echo "$as_me: WARNING: using cross tools not prefixed with host triplet" >&2;} +ac_tool_warned=yes ;; +esac + POTENTIAL_CXX=$ac_pt_POTENTIAL_CXX + fi +else + POTENTIAL_CXX="$ac_cv_path_POTENTIAL_CXX" +fi CXX=$POTENTIAL_CXX fi @@ -49074,7 +49486,7 @@ $as_echo "$ac_cv_c_bigendian" >&6; } SHARED_LIBRARY_FLAGS="-dynamiclib -compatibility_version 1.0.0 -current_version 1.0.0 $PICFLAG" JVM_CFLAGS="$JVM_CFLAGS $PICFLAG" fi - SET_EXECUTABLE_ORIGIN='-Wl,-rpath,@loader_path/.' + SET_EXECUTABLE_ORIGIN='-Wl,-rpath,@loader_path$1' SET_SHARED_LIBRARY_ORIGIN="$SET_EXECUTABLE_ORIGIN" SET_SHARED_LIBRARY_NAME='-Wl,-install_name,@rpath/$1' SET_SHARED_LIBRARY_MAPFILE='-Wl,-exported_symbols_list,$1' @@ -49094,7 +49506,7 @@ $as_echo "$ac_cv_c_bigendian" >&6; } # Linking is different on MacOSX PICFLAG='' SHARED_LIBRARY_FLAGS="-dynamiclib -compatibility_version 1.0.0 -current_version 1.0.0 $PICFLAG" - SET_EXECUTABLE_ORIGIN='-Wl,-rpath,@loader_path/.' + SET_EXECUTABLE_ORIGIN='-Wl,-rpath,@loader_path$1' SET_SHARED_LIBRARY_ORIGIN="$SET_EXECUTABLE_ORIGIN" SET_SHARED_LIBRARY_NAME='-Wl,-install_name,@rpath/$1' SET_SHARED_LIBRARY_MAPFILE='-Wl,-exported_symbols_list,$1' @@ -52643,12 +53055,12 @@ $as_echo "no, forced" >&6; } # Only enable AOT on linux-X64. if test "x$OPENJDK_TARGET_OS-$OPENJDK_TARGET_CPU" = "xlinux-x86_64"; then if test -e "$HOTSPOT_TOPDIR/src/jdk.aot"; then - if test -e "$HOTSPOT_TOPDIR/src/jdk.vm.compiler"; then + if test -e "$HOTSPOT_TOPDIR/src/jdk.internal.vm.compiler"; then ENABLE_AOT="true" else ENABLE_AOT="false" if test "x$enable_aot" = "xyes"; then - as_fn_error $? "Cannot build AOT without hotspot/src/jdk.vm.compiler sources. Remove --enable-aot." "$LINENO" 5 + as_fn_error $? "Cannot build AOT without hotspot/src/jdk.internal.vm.compiler sources. Remove --enable-aot." "$LINENO" 5 fi fi else @@ -64379,8 +64791,8 @@ $as_echo "$JVM_FEATURES" >&6; } JVM_FEATURES_jvmci="" fi - { $as_echo "$as_me:${as_lineno-$LINENO}: checking if jdk.vm.compiler should be built" >&5 -$as_echo_n "checking if jdk.vm.compiler should be built... " >&6; } + { $as_echo "$as_me:${as_lineno-$LINENO}: checking if jdk.internal.vm.compiler should be built" >&5 +$as_echo_n "checking if jdk.internal.vm.compiler should be built... " >&6; } if [[ " $JVM_FEATURES " =~ " graal " ]] ; then { $as_echo "$as_me:${as_lineno-$LINENO}: result: yes, forced" >&5 $as_echo "yes, forced" >&6; } diff --git a/common/autoconf/hotspot.m4 b/common/autoconf/hotspot.m4 index b295b015a9f..acf2a5fe185 100644 --- a/common/autoconf/hotspot.m4 +++ b/common/autoconf/hotspot.m4 @@ -215,12 +215,12 @@ AC_DEFUN_ONCE([HOTSPOT_ENABLE_DISABLE_AOT], # Only enable AOT on linux-X64. 
if test "x$OPENJDK_TARGET_OS-$OPENJDK_TARGET_CPU" = "xlinux-x86_64"; then if test -e "$HOTSPOT_TOPDIR/src/jdk.aot"; then - if test -e "$HOTSPOT_TOPDIR/src/jdk.vm.compiler"; then + if test -e "$HOTSPOT_TOPDIR/src/jdk.internal.vm.compiler"; then ENABLE_AOT="true" else ENABLE_AOT="false" if test "x$enable_aot" = "xyes"; then - AC_MSG_ERROR([Cannot build AOT without hotspot/src/jdk.vm.compiler sources. Remove --enable-aot.]) + AC_MSG_ERROR([Cannot build AOT without hotspot/src/jdk.internal.vm.compiler sources. Remove --enable-aot.]) fi fi else @@ -327,7 +327,7 @@ AC_DEFUN_ONCE([HOTSPOT_SETUP_JVM_FEATURES], JVM_FEATURES_jvmci="" fi - AC_MSG_CHECKING([if jdk.vm.compiler should be built]) + AC_MSG_CHECKING([if jdk.internal.vm.compiler should be built]) if HOTSPOT_CHECK_JVM_FEATURE(graal); then AC_MSG_RESULT([yes, forced]) if test "x$JVM_FEATURES_jvmci" != "xjvmci" ; then diff --git a/common/autoconf/spec.gmk.in b/common/autoconf/spec.gmk.in index bb6d2209c41..7951c737998 100644 --- a/common/autoconf/spec.gmk.in +++ b/common/autoconf/spec.gmk.in @@ -574,20 +574,31 @@ BUILD_JAVA=@FIXPATH@ $(BUILD_JDK)/bin/java $(BUILD_JAVA_FLAGS) # Use ?= as this can be overridden from bootcycle-spec.gmk BOOT_JDK_MODULAR ?= @BOOT_JDK_MODULAR@ -INTERIM_OVERRIDE_MODULES := java.compiler jdk.compiler \ - jdk.jdeps jdk.javadoc jdk.rmic +INTERIM_LANGTOOLS_OVERRIDE_MODULES := java.compiler jdk.compiler \ + jdk.jdeps jdk.javadoc +INTERIM_RMIC_OVERRIDE_MODULES := jdk.rmic ifeq ($(BOOT_JDK_MODULAR), true) - INTERIM_OVERRIDE_MODULES_ARGS = $(foreach m, $(INTERIM_OVERRIDE_MODULES), \ + INTERIM_LANGTOOLS_OVERRIDE_MODULES_ARGS = $(foreach m, \ + $(INTERIM_LANGTOOLS_OVERRIDE_MODULES), \ --patch-module $m=$(BUILDTOOLS_OUTPUTDIR)/override_modules/$m) - INTERIM_LANGTOOLS_ARGS = $(INTERIM_OVERRIDE_MODULES_ARGS) + INTERIM_RMIC_OVERRIDE_MODULES_ARGS = $(foreach m, \ + $(INTERIM_LANGTOOLS_OVERRIDE_MODULES) \ + $(INTERIM_RMIC_OVERRIDE_MODULES), \ + --patch-module $m=$(BUILDTOOLS_OUTPUTDIR)/override_modules/$m) + INTERIM_LANGTOOLS_ARGS = $(INTERIM_LANGTOOLS_OVERRIDE_MODULES_ARGS) JAVAC_MAIN_CLASS = -m jdk.compiler/com.sun.tools.javac.Main JAVADOC_MAIN_CLASS = -m jdk.javadoc/jdk.javadoc.internal.tool.Main else - INTERIM_OVERRIDE_MODULES_ARGS = \ + INTERIM_LANGTOOLS_OVERRIDE_MODULES_ARGS = \ -Xbootclasspath/p:$(call PathList, \ $(addprefix $(BUILDTOOLS_OUTPUTDIR)/override_modules/, \ - $(INTERIM_OVERRIDE_MODULES))) - INTERIM_LANGTOOLS_ARGS = $(INTERIM_OVERRIDE_MODULES_ARGS) \ + $(INTERIM_LANGTOOLS_OVERRIDE_MODULES))) + INTERIM_RMIC_OVERRIDE_MODULES_ARGS = \ + -Xbootclasspath/p:$(call PathList, \ + $(addprefix $(BUILDTOOLS_OUTPUTDIR)/override_modules/, \ + $(INTERIM_LANGTOOLS_OVERRIDE_MODULES) \ + $(INTERIM_RMIC_OVERRIDE_MODULES))) + INTERIM_LANGTOOLS_ARGS = $(INTERIM_LANGTOOLS_OVERRIDE_MODULES_ARGS) \ -cp $(BUILDTOOLS_OUTPUTDIR)/override_modules/jdk.compiler JAVAC_MAIN_CLASS = com.sun.tools.javac.Main JAVADOC_MAIN_CLASS = jdk.javadoc.internal.tool.Main @@ -637,6 +648,7 @@ MKDIR:=@MKDIR@ MV:=@MV@ NAWK:=@NAWK@ NICE:=@NICE@ +PANDOC:=@PANDOC@ PATCH:=@PATCH@ PRINTF:=@PRINTF@ RM:=@RM@ diff --git a/common/autoconf/toolchain.m4 b/common/autoconf/toolchain.m4 index 16b0df04b4f..29ebea9ccac 100644 --- a/common/autoconf/toolchain.m4 +++ b/common/autoconf/toolchain.m4 @@ -440,7 +440,7 @@ AC_DEFUN([TOOLCHAIN_EXTRACT_COMPILER_VERSION], # # $1 = compiler to test (CC or CXX) # $2 = human readable name of compiler (C or C++) -# $3 = list of compiler names to search for +# $3 = compiler name to search for AC_DEFUN([TOOLCHAIN_FIND_COMPILER], [ 
COMPILER_NAME=$2 @@ -482,15 +482,15 @@ AC_DEFUN([TOOLCHAIN_FIND_COMPILER], if test -n "$TOOLCHAIN_PATH"; then PATH_save="$PATH" PATH="$TOOLCHAIN_PATH" - AC_PATH_PROGS(TOOLCHAIN_PATH_$1, $SEARCH_LIST) + AC_PATH_TOOL(TOOLCHAIN_PATH_$1, $SEARCH_LIST) $1=$TOOLCHAIN_PATH_$1 PATH="$PATH_save" fi - # AC_PATH_PROGS can't be run multiple times with the same variable, + # AC_PATH_TOOL can't be run multiple times with the same variable, # so create a new name for this run. if test "x[$]$1" = x; then - AC_PATH_PROGS(POTENTIAL_$1, $SEARCH_LIST) + AC_PATH_TOOL(POTENTIAL_$1, $SEARCH_LIST) $1=$POTENTIAL_$1 fi diff --git a/common/bin/update-build-readme.sh b/common/bin/update-build-readme.sh deleted file mode 100644 index 9d1968a4c60..00000000000 --- a/common/bin/update-build-readme.sh +++ /dev/null @@ -1,62 +0,0 @@ -#!/bin/bash - -# Get an absolute path to this script, since that determines the top-level -# directory. -this_script_dir=`dirname $0` -TOPDIR=`cd $this_script_dir/../.. > /dev/null && pwd` - -GREP=grep -MD_FILE=$TOPDIR/README-builds.md -HTML_FILE=$TOPDIR/README-builds.html - -# Locate the markdown processor tool and check that it is the correct version. -locate_markdown_processor() { - if [ -z "$MARKDOWN" ]; then - MARKDOWN=`which markdown 2> /dev/null` - if [ -z "$MARKDOWN" ]; then - echo "Error: Cannot locate markdown processor" 1>&2 - exit 1 - fi - fi - - # Test version - MARKDOWN_VERSION=`$MARKDOWN -version | $GREP version` - if [ "x$MARKDOWN_VERSION" != "xThis is Markdown, version 1.0.1." ]; then - echo "Error: Expected markdown version 1.0.1." 1>&2 - echo "Actual version found: $MARKDOWN_VERSION" 1>&2 - echo "Download markdown here: https://daringfireball.net/projects/markdown/" 1>&2 - exit 1 - fi - -} - -# Verify that the source markdown file looks sound. -verify_source_code() { - TOO_LONG_LINES=`$GREP -E -e '^.{80}.+$' $MD_FILE` - if [ "x$TOO_LONG_LINES" != x ]; then - echo "Warning: The following lines are longer than 80 characters:" - $GREP -E -e '^.{80}.+$' $MD_FILE - fi -} - -# Convert the markdown file to html format. -process_source() { - echo "Generating html file from markdown" - cat > $HTML_FILE << END - - -This README file contains build instructions for the OpenJDK. Building the source code for the OpenJDK requires a certain degree of technical expertise.
Some Headlines:

  * "configure && make" style build
  * vsvars*.bat and vcvars*.bat files are run automatically

The OpenJDK sources are maintained with the revision control system Mercurial. If you are new to Mercurial, please see the Beginner Guides or refer to the Mercurial Book. The first few chapters of the book provide an excellent overview of Mercurial, what it is and how it works.
+For using Mercurial with the OpenJDK refer to the Developer Guide: Installing and Configuring Mercurial section for more information.
To get the entire set of OpenJDK Mercurial repositories use the script get_source.sh located in the root repository:

    hg clone http://hg.openjdk.java.net/jdk9/jdk9 YourOpenJDK
    cd YourOpenJDK
    bash ./get_source.sh

Once you have all the repositories, keep in mind that each repository is its own independent repository. You can also re-run ./get_source.sh anytime to pull over all the latest changesets in all the repositories. This set of nested repositories has been given the term "forest" and there are various ways to apply the same hg command to each of the repositories. For example, the script make/scripts/hgforest.sh can be used to repeat the same hg command on every repository, e.g.

    cd YourOpenJDK
    bash ./make/scripts/hgforest.sh status
+The set of repositories and what they contain:
There are some very basic guidelines:

  * Build output goes into the build/ directory and should never be checked in.
  * The .hgignore file in each repository must exist and should include ^build/, ^dist/ and optionally any nbproject/private directories. It should NEVER include anything in the src/ or test/ or any managed directory area of a repository.
  * Generated files (e.g. javah output) should not be checked in. There are some exceptions to this rule, in particular with some of the generated configure scripts.

The very first step in building the OpenJDK is making sure the system itself has everything it needs to do OpenJDK builds. Once a system is set up, it generally doesn't need to be done again.
Building the OpenJDK is now done by running a configure script, which will try to find and verify that you have everything you need, followed by running make, e.g.

    bash ./configure
    make all

Where possible the configure script will attempt to locate the various components in the default locations or via component-specific variable settings. When the normal defaults fail or components cannot be found, additional configure options may be necessary to help configure find the necessary tools for the build, or you may need to re-visit the setup of your system due to missing software packages.

NOTE: The configure script file does not have execute permissions and will need to be explicitly run with bash, see the source guidelines.
Before even attempting to use a system to build the OpenJDK there are some very basic system setups needed. For all systems:

  * Be sure the GNU make utility is version 3.81 or newer, e.g. run "make -version".

  * Install a Bootstrap JDK. All OpenJDK builds require access to a previously released JDK called the bootstrap JDK or boot JDK. The general rule is that the bootstrap JDK must be an instance of the previous major release of the JDK. In addition, there may be a requirement to use a release at or beyond a particular update level.
+Building JDK 9 requires JDK 8. JDK 9 developers should not use JDK 9 as the boot JDK, to ensure that JDK 9 dependencies are not introduced into the parts of the system that are built with JDK 8.
The JDK 8 binaries can be downloaded from Oracle's JDK 8 download site. For build performance reasons it is very important that this bootstrap JDK be made available on the local disk of the machine doing the build. You should add its bin directory to the PATH environment variable. If configure has any issues finding this JDK, you may need to use the configure option --with-boot-jdk.
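For instance, pointing configure at a boot JDK in a non-standard location could look like this; the install path below is only an example:

    # tell configure explicitly where the JDK 8 boot JDK is installed (example path)
    bash ./configure --with-boot-jdk=/opt/jdk1.8.0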
And for specific systems:

  * Linux: Install all the software development packages needed including alsa, freetype, cups, and xrender. See specific system packages.

  * Solaris: Install all the software development packages needed including Studio Compilers, freetype, cups, and xrender. See specific system packages.

  * Windows: Install Visual Studio 2013.

  * Mac OS X: Install XCode 6.3.
With Linux, try to favor the system packages over building your own or getting packages from other areas. Most Linux builds should be possible with the system's available packages.

Note that some Linux systems have a habit of pre-populating your environment variables for you, for example JAVA_HOME might get pre-defined for you to refer to the JDK installed on your Linux system. You will need to unset JAVA_HOME. It's a good idea to run env and verify that the environment variables you are getting from the default system settings make sense for building the OpenJDK.
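A minimal sketch of doing that from a bash shell:

    # clear a pre-set JAVA_HOME and eyeball the remaining Java-related settings
    unset JAVA_HOME
    env | grep -i java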
At a minimum, the Studio 12 Update 4 Compilers (containing version 5.13 of the C and C++ compilers) is required, including specific patches.
The Solaris Studio installation should contain at least these packages:

| Package                                          | Version       |
| ------------------------------------------------ | ------------- |
| developer/solarisstudio-124/backend              | 12.4-1.0.6.0  |
| developer/solarisstudio-124/c++                  | 12.4-1.0.10.0 |
| developer/solarisstudio-124/cc                   | 12.4-1.0.4.0  |
| developer/solarisstudio-124/library/c++-libs     | 12.4-1.0.10.0 |
| developer/solarisstudio-124/library/math-libs    | 12.4-1.0.0.1  |
| developer/solarisstudio-124/library/studio-gccrt | 12.4-1.0.0.1  |
| developer/solarisstudio-124/studio-common        | 12.4-1.0.0.1  |
| developer/solarisstudio-124/studio-ja            | 12.4-1.0.0.1  |
| developer/solarisstudio-124/studio-legal         | 12.4-1.0.0.1  |
| developer/solarisstudio-124/studio-zhCN          | 12.4-1.0.0.1  |

In particular backend 12.4-1.0.6.0 contains a critical patch for the sparc version.

Place the bin directory in PATH.
The Oracle Solaris Studio Express compilers at: Oracle Solaris Studio Express Download site are also an option, although these compilers have not been extensively used yet.
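Adding the Studio bin directory to PATH might, for example, look like the following; the install location is an assumption and should be adjusted to the actual installation:

    # assumed Studio 12.4 install location; adjust as needed
    export PATH="/opt/solarisstudio12.4/bin:${PATH}"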
+Building on Windows requires a Unix-like environment, notably a Unix-like shell. There are several such environments available of which Cygwin and MinGW/MSYS are currently supported for the OpenJDK build. One of the differences of these systems from standard Windows tools is the way they handle Windows path names, particularly path names which contain spaces, backslashes as path separators and possibly drive letters. Depending on the use case and the specifics of each environment these path problems can be solved by a combination of quoting whole paths, translating backslashes to forward slashes, escaping backslashes with additional backslashes and translating the path names to their "8.3" version.
+CYGWIN is an open source, Linux-like environment which tries to emulate a complete POSIX layer on Windows. It tries to be smart about path names and can usually handle all kinds of paths if they are correctly quoted or escaped although internally it maps drive letters <drive>:
to a virtual directory /cygdrive/<drive>
.
You can always use the cygpath utility to map pathnames with spaces or the backslash character into the C:/ style of pathname (called 'mixed'), e.g. cygpath -s -m "<path>".
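A quick sketch of such a conversion (the path is only an example, and the exact short form will vary by system):

    # convert a Windows path with spaces into the short, mixed-style form
    cygpath -s -m "C:\Program Files\Java"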
Note that the use of CYGWIN creates a unique problem with regards to setting PATH
. Normally on Windows the PATH
variable contains directories separated with the ";" character (Solaris and Linux use ":"). With CYGWIN, it uses ":", but that means that paths like "C:/path" cannot be placed in the CYGWIN version of PATH
and instead CYGWIN uses something like /cygdrive/c/path
which CYGWIN understands, but only CYGWIN understands.
The OpenJDK build requires CYGWIN version 1.7.16 or newer. Information about CYGWIN can be obtained from the CYGWIN website at www.cygwin.com.
+By default CYGWIN doesn't install all the tools required for building the OpenJDK. Along with the default installation, you need to install the following tools.
| Binary Name | Category     | Package  | Description                                                |
| ----------- | ------------ | -------- | ---------------------------------------------------------- |
| ar.exe      | Devel        | binutils | The GNU assembler, linker and binary utilities             |
| make.exe    | Devel        | make     | The GNU version of the 'make' utility built for CYGWIN     |
| m4.exe      | Interpreters | m4       | GNU implementation of the traditional Unix macro processor |
| cpio.exe    | Utils        | cpio     | A program to manage archives of files                      |
| gawk.exe    | Utils        | awk      | Pattern-directed scanning and processing language          |
| file.exe    | Utils        | file     | Determines file type using 'magic' numbers                 |
| zip.exe     | Archive      | zip      | Package and compress (archive) files                       |
| unzip.exe   | Archive      | unzip    | Extract compressed files in a ZIP archive                  |
| free.exe    | System       | procps   | Display amount of free and used memory in the system       |
Note that the CYGWIN software can conflict with other non-CYGWIN software on your Windows system. CYGWIN provides a FAQ for known issues and problems, of particular interest is the section on BLODA (applications that interfere with CYGWIN).
+MinGW ("Minimalist GNU for Windows") is a collection of free Windows specific header files and import libraries combined with GNU toolsets that allow one to produce native Windows programs that do not rely on any 3rd-party C runtime DLLs. MSYS is a supplement to MinGW which allows building applications and programs which rely on traditional UNIX tools to be present. Among others this includes tools like bash
and make
. See MinGW/MSYS for more information.
Like Cygwin, MinGW/MSYS can handle different types of path formats. They are internally converted to paths with forward slashes and drive letters <drive>:
replaced by a virtual directory /<drive>
. Additionally, MSYS automatically detects binaries compiled for the MSYS environment and feeds them with the internal, Unix-style path names. If native Windows applications are called from within MSYS programs their path arguments are automatically converted back to Windows style path names with drive letters and backslashes as path separators. This may cause problems for Windows applications which use forward slashes as parameter separator (e.g. cl /nologo /I
) because MSYS may wrongly replace such parameters by drive letters.
In addition to the tools which will be installed by default, you have to manually install the msys-zip
and msys-unzip
packages. This can be easily done with the MinGW command line installer:
mingw-get.exe install msys-zip
+ mingw-get.exe install msys-unzip
+The 32-bit and 64-bit OpenJDK Windows build requires Microsoft Visual Studio C++ 2013 (VS2013) Professional Edition or Express compiler. The compiler and other tools are expected to reside in the location defined by the variable VS120COMNTOOLS
which is set by the Microsoft Visual Studio installer.
Only the C++ part of VS2013 is needed. Try to let the installation go to the default install directory. Always reboot your system after installing VS2013. The system environment variable VS120COMNTOOLS should be set in your environment.
+Make sure that TMP and TEMP are also set in the environment and refer to Windows paths that exist, like C:\temp
, not /tmp
, not /cygdrive/c/temp
, and not C:/temp
. C:\temp
is just an example, it is assumed that this area is private to the user, so by default after installs you should see a unique user path in these variables.
Make sure you get the right XCode version.
The basic invocation of the configure script looks like:

    bash ./configure [options]

This will create an output directory containing the "configuration" and set up an area for the build result. This directory typically looks like:

    build/linux-x64-normal-server-release

configure will try to figure out what system you are running on and where all necessary build components are. If you have all prerequisites for building installed, it should find everything. If it fails to detect any component automatically, it will exit and inform you about the problem. When this happens, read more below in the configure options.
Some examples:

Windows 32bit build with freetype specified:

    bash ./configure --with-freetype=/cygdrive/c/freetype-i586 --with-target-bits=32

Debug 64bit Build:

    bash ./configure --enable-debug --with-target-bits=64
Complete details on all the OpenJDK configure options can be seen with:

    bash ./configure --help=short

Use -help to see all the configure options available. You can generate any number of different configurations, e.g. debug, release, 32, 64, etc.

Some of the more commonly used configure options are:
++ ++
--enable-debug
+set the debug level to fastdebug (this is a shorthand for--with-debug-level=fastdebug
)
+++
--with-alsa=
path
+select the location of the Advanced Linux Sound Architecture (ALSA)
Version 0.9.1 or newer of the ALSA files are required for building the OpenJDK on Linux. These Linux files are usually available from an "alsa" or "libasound" development package, and it's highly recommended that you try and use the package provided by the particular version of Linux that you are using.
+
+++
--with-boot-jdk=
path
+select the Bootstrap JDK
+++
--with-boot-jdk-jvmargs=
"args"
+provide the JVM options to be used to run the Bootstrap JDK
+++
--with-cacerts=
path
+select the path to the cacerts file.
++ +See Certificate Authority on Wikipedia for a better understanding of the Certificate Authority (CA). A certificates file named "cacerts" represents a system-wide keystore with CA certificates. In JDK and JRE binary bundles, the "cacerts" file contains root CA certificates from several public CAs (e.g., VeriSign, Thawte, and Baltimore). The source contain a cacerts file without CA root certificates. Formal JDK builders will need to secure permission from each public CA and include the certificates into their own custom cacerts file. Failure to provide a populated cacerts file will result in verification errors of a certificate chain during runtime. By default an empty cacerts file is provided and that should be fine for most JDK developers.
+
+++
--with-cups=
path
+select the CUPS install location
++The Common UNIX Printing System (CUPS) Headers are required for building the OpenJDK on Solaris and Linux. The Solaris header files can be obtained by installing the package print/cups.
+
++The CUPS header files can always be downloaded from www.cups.org.
+
+++
--with-cups-include=
path
+select the CUPS include directory location
+++
--with-debug-level=
level
+select the debug information level of release, fastdebug, or slowdebug
++ ++
--with-dev-kit=
path
+select location of the compiler install or developer install location
+++
--with-freetype=
path
+select the freetype files to use.
++Expecting the freetype libraries under
+lib/
and the headers underinclude/
.
++Version 2.3 or newer of FreeType is required. On Unix systems required files can be available as part of your distribution (while you still may need to upgrade them). Note that you need development version of package that includes both the FreeType library and header files.
+
++You can always download latest FreeType version from the FreeType website. Building the freetype 2 libraries from scratch is also possible, however on Windows refer to the Windows FreeType DLL build instructions.
+
++Note that by default FreeType is built with byte code hinting support disabled due to licensing restrictions. In this case, text appearance and metrics are expected to differ from Sun's official JDK build. See the SourceForge FreeType2 Home Page for more information.
+
+++
--with-import-hotspot=
path
+select the location to find hotspot binaries from a previous build to avoid building hotspot
+++
--with-target-bits=
arg
+select 32 or 64 bit build
+++
--with-jvm-variants=
variants
+select the JVM variants to build from, comma separated list that can include: server, client, kernel, zero and zeroshark
+++
--with-memory-size=
size
+select the RAM size that GNU make will think this system has
+++
--with-msvcr-dll=
path
+select themsvcr100.dll
file to include in the Windows builds (C/C++ runtime library for Visual Studio).
++This is usually picked up automatically from the redist directories of Visual Studio 2013.
+
++ ++
--with-num-cores=
cores
+select the number of cores to use (processor count or CPU count)
+++
--with-x=
path
+select the location of the X11 and xrender files.
++The XRender Extension Headers are required for building the OpenJDK on Solaris and Linux. The Linux header files are usually available from a "Xrender" development package, it's recommended that you try and use the package provided by the particular distribution of Linux that you are using. The Solaris XRender header files is included with the other X11 header files in the package SFWxwinc on new enough versions of Solaris and will be installed in
+/usr/X11/include/X11/extensions/Xrender.h
or/usr/openwin/share/include/X11/extensions/Xrender.h
The basic invocation of the make
utility looks like:
+++
make all
This will start the build to the output directory containing the "configuration" that was created by the configure
script. Run make help
for more information on the available targets.
There are some of the make targets that are of general interest:
+++empty
+
+build everything but no images
+++
all
+build everything including images
+++
all-conf
+build all configurations
+++
images
+create complete j2sdk and j2re images
+++
install
+install the generated images locally, typically in/usr/local
+++
clean
+remove all files generated by make, but not those generated byconfigure
+++
dist-clean
+remove all files generated by both andconfigure
(basically killing the configuration)
+++
help
+give some help on usingmake
, including some interesting make targets
When the build is completed, you should see the generated binaries and associated files in the j2sdk-image
directory in the output directory. In particular, the build/*/images/j2sdk-image/bin
directory should contain executables for the OpenJDK tools and utilities for that configuration. The testing tool jtreg
will be needed and can be found at: the jtreg site. The provided regression tests in the repositories can be run with the command:
+++
cd test && make PRODUCT_HOME=`pwd`/../build/*/images/j2sdk-image all
Q: The generated-configure.sh
file looks horrible! How are you going to edit it?
+A: The generated-configure.sh
file is generated (think "compiled") by the autoconf tools. The source code is in configure.ac
and various .m4 files in common/autoconf, which are much more readable.
Q: Why is the generated-configure.sh
file checked in, if it is generated?
+A: If it was not generated, every user would need to have the autoconf tools installed, and re-generate the configure
file as the first step. Our goal is to minimize the work needed to be done by the user to start building OpenJDK, and to minimize the number of external dependencies required.
Q: Do you require a specific version of autoconf for regenerating generated-configure.sh
?
+A: Yes, version 2.69 is required and should be easy enough to aquire on all supported operating systems. The reason for this is to avoid large spurious changes in generated-configure.sh
.
Q: How do you regenerate generated-configure.sh
after making changes to the input files?
+A: Regnerating generated-configure.sh
should always be done using the script common/autoconf/autogen.sh
to ensure that the correct files get updated. This script should also be run after mercurial tries to merge generated-configure.sh
as a merge of the generated file is not guaranteed to be correct.
Q: What are the files in common/makefiles/support/*
for? They look like gibberish.
+A: They are a somewhat ugly hack to compensate for command line length limitations on certain platforms (Windows, Solaris). Due to a combination of limitations in make and the shell, command lines containing too many files will not work properly. These helper files are part of an elaborate hack that will compress the command line in the makefile and then uncompress it safely. We're not proud of it, but it does fix the problem. If you have any better suggestions, we're all ears! :-)
Q: I want to see the output of the commands that make runs, like in the old build. How do I do that?
+A: You specify the LOG
variable to make. There are several log levels:
warn
-- Default and very quiet.info
-- Shows more progress information than warn.debug
-- Echos all command lines and prints all macro calls for compilation definitions.trace
-- Echos all $(shell) command lines as well.Q: When do I have to re-run configure
?
+A: Normally you will run configure
only once for creating a configuration. You need to re-run configuration only if you want to change any configuration options, or if you pull down changes to the configure
script.
Q: I have added a new source file. Do I need to modify the makefiles?
A: Normally, no. If you want to create e.g. a new native library, you will need to modify the makefiles. But for normal file additions or removals, no changes are needed. There are certain exceptions for some native libraries where the source files are spread over many directories which also contain sources for other libraries. In these cases it was simply easier to create include lists rather than excludes.
Q: When I run configure --help
, I see many strange options, like --dvidir
. What is this?
+A: Configure provides a slew of options by default, to all projects that use autoconf. Most of them are not used in OpenJDK, so you can safely ignore them. To list only OpenJDK specific features, use configure --help=short
instead.
Q: configure
provides OpenJDK-specific features such as --with-builddeps-server
that are not described in this document. What about those?
+A: Try them out if you like! But be aware that most of these are experimental features. Many of them don't do anything at all at the moment; the option is just a placeholder. Others depend on pieces of code or infrastructure that is currently not ready for prime time.
Q: How will you make sure you don't break anything?
+A: We have a script that compares the result of the new build system with the result of the old. For most part, we aim for (and achieve) byte-by-byte identical output. There are however technical issues with e.g. native binaries, which might differ in a byte-by-byte comparison, even when building twice with the old build system. For these, we compare relevant aspects (e.g. the symbol table and file size). Note that we still don't have 100% equivalence, but we're close.
Q: I noticed this thing X in the build that looks very broken by design. Why don't you fix it?
+A: Our goal is to produce a build output that is as close as technically possible to the old build output. If things were weird in the old build, they will be weird in the new build. Often, things were weird before due to obscurity, but in the new build system the weird stuff comes up to the surface. The plan is to attack these things at a later stage, after the new build system is established.
Q: The code in the new build system is not that well-structured. Will you fix this?
+A: Yes! The new build system has grown bit by bit as we converted the old system. When all of the old build system is converted, we can take a step back and clean up the structure of the new build system. Some of this we plan to do before replacing the old build system and some will need to wait until after.
Q: Is anything able to use the results of the new build's default make target?
+A: Yes, this is the minimal (or roughly minimal) set of compiled output needed for a developer to actually execute the newly built JDK. The idea is that in an incremental development fashion, when doing a normal make, you should only spend time recompiling what's changed (making it purely incremental) and only do the work that's needed to actually run and test your code. The packaging stuff that is part of the images
target is not needed for a normal developer who wants to test his new code. Even if it's quite fast, it's still unnecessary. We're targeting sub-second incremental rebuilds! ;-) (Or, well, at least single-digit seconds...)
Q: I usually set a specific environment variable when building, but I can't find the equivalent in the new build. What should I do?
+A: It might very well be that we have neglected to add support for an option that was actually used from outside the build system. Email us and we will add support for it!
Building OpenJDK requires a lot of horsepower. Some of the build tools can be adjusted to utilize more or less of resources such as parallel threads and memory. The configure
script analyzes your system and selects reasonable values for such options based on your hardware. If you encounter resource problems, such as out of memory conditions, you can modify the detected values with:
--with-num-cores
-- number of cores in the build system, e.g. --with-num-cores=8
--with-memory-size
-- memory (in MB) available in the build system, e.g. --with-memory-size=1024
It might also be necessary to specify the JVM arguments passed to the Bootstrap JDK, using e.g. --with-boot-jdk-jvmargs="-Xmx8G -enableassertions"
. Doing this will override the default JVM arguments passed to the Bootstrap JDK.
One of the top goals of the new build system is to improve the build performance and decrease the time needed to build. This will soon also apply to the java compilation when the Smart Javac wrapper is fully supported.
+At the end of a successful execution of configure
, you will get a performance summary, indicating how well the build will perform. Here you will also get performance hints. If you want to build fast, pay attention to those!
The OpenJDK build supports building with ccache when using gcc or clang. Using ccache can radically speed up compilation of native code if you often rebuild the same sources. Your milage may vary however so we recommend evaluating it for yourself. To enable it, make sure it's on the path and configure with --enable-ccache
.
If you are using network shares, e.g. via NFS, for your source code, make sure the build directory is situated on local disk. The performance penalty is extremely high for building on a network share, close to unusable.
+The old build builds multiple JVMs on 32-bit systems (client and server; and on Windows kernel as well). In the new build we have changed this default to only build server when it's available. This improves build times for those not interested in multiple JVMs. To mimic the old behavior on platforms that support it, use --with-jvm-variants=client,server
.
By default, configure
will analyze your machine and run the make process in parallel with as many threads as you have cores. This behavior can be overridden, either "permanently" (on a configure
basis) using --with-num-cores=N
or for a single build only (on a make basis), using make JOBS=N
.
If you want to make a slower build just this time, to save some CPU power for other processes, you can run e.g. make JOBS=2
. This will force the makefiles to only run 2 parallel processes, or even make JOBS=1
which will disable parallelism.
If you want to have it the other way round, namely having slow builds default and override with fast if you're impatient, you should call configure
with --with-num-cores=2
, making 2 the default. If you want to run with more cores, run make JOBS=8
If the build fails (and it's not due to a compilation error in a source file you've changed), the first thing you should do is to re-run the build with more verbosity. Do this by adding LOG=debug
to your make command line.
The build log (with both stdout and stderr intermingled, basically the same as you see on your console) can be found as build.log
in your build directory.
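A rough sketch for pulling the interesting lines out of that log before asking for help:

    # list the first error and warning lines from the most recent build log
    grep -n -i -e error -e warning build/*/build.log | head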
You can ask for help on build problems with the new build system on either the build-dev or the build-infra-dev mailing lists. Please include the relevant parts of the build log.
+A build can fail for any number of reasons. Most failures are a result of trying to build in an environment in which all the pre-build requirements have not been met. The first step in troubleshooting a build failure is to recheck that you have satisfied all the pre-build requirements for your platform. Scanning the configure
log is a good first step, making sure that what it found makes sense for your system. Look for strange error messages or any difficulties that configure
had in finding things.
Some of the more common problems with builds are briefly described below, with suggestions for remedies.
+Corrupted Bundles on Windows:
+Some virus scanning software has been known to corrupt the downloading of zip bundles. It may be necessary to disable the 'on access' or 'real time' virus scanning features to prevent this corruption. This type of 'real time' virus scanning can also slow down the build process significantly. Temporarily disabling the feature, or excluding the build output directory may be necessary to get correct and faster builds.
Slow Builds:
+If your build machine seems to be overloaded from too many simultaneous C++ compiles, try setting the JOBS=1
on the make
command line. Then try increasing the count slowly to an acceptable level for your system. Also:
Creating the javadocs can be very slow, if you are running javadoc, consider skipping that step.
+Faster CPUs, more RAM, and a faster DISK usually helps. The VM build tends to be CPU intensive (many C++ compiles), and the rest of the JDK will often be disk intensive.
+Faster compiles are possible using a tool called ccache.
+++Warning message:
+File 'xxx' has modification time in the future.
+Warning message:Clock skew detected. Your build may be incomplete.
These warnings can occur when the clock on the build machine is out of sync with the timestamps on the source files. Other errors, apparently unrelated but in fact caused by the clock skew, can occur along with the clock skew warnings. These secondary errors may tend to obscure the fact that the true root cause of the problem is an out-of-sync clock.
+If you see these warnings, reset the clock on the build machine, run "gmake clobber
" or delete the directory containing the build output, and restart the build from the beginning.
Trouble writing out table to disk
+++
make JOBS=1
to reduce the load on the system.
+Error Message: libstdc++ not found
:
+This is caused by a missing libstdc++.a library. This is installed as part of a specific package (e.g. libstdc++.so.devel.386). By default some 64-bit Linux versions (e.g. Fedora) only install the 64-bit version of the libstdc++ package. Various parts of the JDK build require a static link of the C++ runtime libraries to allow for maximum portability of the built images.
Linux Error Message: cannot restore segment prot after reloc
+This is probably an issue with SELinux (See SELinux on Wikipedia). Parts of the VM is built without the -fPIC
for performance reasons.
To completely disable SELinux:
+$ su root
# system-config-securitylevel
In the window that appears, select the SELinux tab
Disable SELinux
Alternatively, instead of completely disabling it you could disable just this one check.
+*** fatal error - couldn't allocate heap, ...
rm fails with "Directory not empty"
unzip fails with "cannot create ... Permission denied"
unzip fails with "cannot create ... Error 50"
The CYGWIN software can conflict with other non-CYGWIN software. See the CYGWIN FAQ section on BLODA (applications that interfere with CYGWIN).
+spawn failed
The Makefiles in the OpenJDK are only valid when used with the GNU version of the utility command make
(usually called gmake
on Solaris). A few notes about using GNU make:
PATH
./usr/bin/make
on Solaris. If your Solaris system has the software from the Solaris Developer Companion CD installed, you should try and use /usr/bin/gmake
or /usr/gnu/bin/make
.Information on GNU make, and access to ftp download sites, are available on the GNU make web site. The latest source to GNU make is available at ftp.gnu.org/pub/gnu/make/.
+First step is to get the GNU make 3.81 or newer source from ftp.gnu.org/pub/gnu/make/. Building is a little different depending on the OS but is basically done with:
+ bash ./configure
+ make
+This file often describes specific requirements for what we call the "minimum build environments" (MBE) for this specific release of the JDK. What is listed below is what the Oracle Release Engineering Team will use to build the Oracle JDK product. Building with the MBE will hopefully generate the most compatible bits that install on, and run correctly on, the most variations of the same base OS and hardware architecture. In some cases, these represent what is often called the least common denominator, but each Operating System has different aspects to it.
+In all cases, the Bootstrap JDK version minimum is critical, we cannot guarantee builds will work with older Bootstrap JDK's. Also in all cases, more RAM and more processors is better, the minimums listed below are simply recommendations.
+With Solaris and Mac OS X, the version listed below is the oldest release we can guarantee builds and works, and the specific version of the compilers used could be critical.
With Windows the critical aspect is the Visual Studio compiler used, which, due to its runtime, generally dictates what Windows systems can do the builds and where the resulting bits can be used.
+NOTE: We expect a change here off these older Windows OS releases and to a 'less older' one, probably Windows 2008R2 X64.
+With Linux, it was just a matter of picking a stable distribution that is a good representative for Linux in general.
+It is understood that most developers will NOT be using these specific versions, and in fact creating these specific versions may be difficult due to the age of some of this software. It is expected that developers are more often using the more recent releases and distributions of these operating systems.
+Compilation problems with newer or different C/C++ compilers is a common problem. Similarly, compilation problems related to changes to the /usr/include
or system header files is also a common problem with older, newer, or unreleased OS versions. Please report these types of problems as bugs so that they can be dealt with accordingly.
Bootstrap JDK: JDK 8
| Base OS and Architecture            | OS                          | C/C++ Compiler                                         | Processors | RAM Minimum | DISK Needs |
| ----------------------------------- | --------------------------- | ------------------------------------------------------ | ---------- | ----------- | ---------- |
| Linux X86 (32-bit) and X64 (64-bit) | Oracle Enterprise Linux 6.4 | gcc 4.9.2                                              | 2 or more  | 1 GB        | 6 GB       |
| Solaris SPARCV9 (64-bit)            | Solaris 11 Update 1         | Studio 12 Update 4 + patches                           | 4 or more  | 4 GB        | 8 GB       |
| Solaris X64 (64-bit)                | Solaris 11 Update 1         | Studio 12 Update 4 + patches                           | 4 or more  | 4 GB        | 8 GB       |
| Windows X86 (32-bit)                | Windows Server 2012 R2 x64  | Microsoft Visual Studio C++ 2013 Professional Edition  | 2 or more  | 2 GB        | 6 GB       |
| Windows X64 (64-bit)                | Windows Server 2012 R2 x64  | Microsoft Visual Studio C++ 2013 Professional Edition  | 2 or more  | 2 GB        | 6 GB       |
| Mac OS X X64 (64-bit)               | Mac OS X 10.9 "Mavericks"   | Xcode 6.3 or newer                                     | 2 or more  | 4 GB        | 6 GB       |
We won't be listing all the possible environments, but we will try to provide what information we have available to us.
+NOTE: The community can help out by updating this part of the document.
+After installing the latest Fedora you need to install several build dependencies. The simplest way to do it is to execute the following commands as user root
:
yum-builddep java-1.7.0-openjdk
+ yum install gcc gcc-c++
+In addition, it's necessary to set a few environment variables for the build:
+ export LANG=C
+ export PATH="/usr/lib/jvm/java-openjdk/bin:${PATH}"
+After installing CentOS 5.5 you need to make sure you have the following Development bundles installed:
+Plus the following packages:
+The freetype 2.3 packages don't seem to be available, but the freetype 2.3 sources can be downloaded, built, and installed easily enough from the freetype site. Build and install with something like:
+ bash ./configure
+ make
+ sudo -u root make install
Mercurial packages could not be found easily, but a Google search should find them, and they usually include Python if it's needed.
+After installing Debian 5 you need to install several build dependencies. The simplest way to install the build dependencies is to execute the following commands as user root
:
aptitude build-dep openjdk-7
+ aptitude install openjdk-7-jdk libmotif-dev
+In addition, it's necessary to set a few environment variables for the build:
+ export LANG=C
+ export PATH="/usr/lib/jvm/java-7-openjdk/bin:${PATH}"
+After installing Ubuntu 12.04 you need to install several build dependencies. The simplest way to do it is to execute the following commands:
+ sudo aptitude build-dep openjdk-7
+ sudo aptitude install openjdk-7-jdk
+In addition, it's necessary to set a few environment variables for the build:
+ export LANG=C
+ export PATH="/usr/lib/jvm/java-7-openjdk/bin:${PATH}"
+After installing OpenSUSE 11.1 you need to install several build dependencies. The simplest way to install the build dependencies is to execute the following commands:
+ sudo zypper source-install -d java-1_7_0-openjdk
+ sudo zypper install make
+In addition, it is necessary to set a few environment variables for the build:
+ export LANG=C
+ export PATH="/usr/lib/jvm/java-1.7.0-openjdk/bin:$[PATH}"
+Finally, you need to unset the JAVA_HOME
environment variable:
export -n JAVA_HOME
+After installing Mandriva Linux One 2009 Spring you need to install several build dependencies. The simplest way to install the build dependencies is to execute the following commands as user root
:
urpmi java-1.7.0-openjdk-devel make gcc gcc-c++ freetype-devel zip unzip
+ libcups2-devel libxrender1-devel libalsa2-devel libstc++-static-devel
+ libxtst6-devel libxi-devel
+In addition, it is necessary to set a few environment variables for the build:
+ export LANG=C
+ export PATH="/usr/lib/jvm/java-1.7.0-openjdk/bin:${PATH}"
+After installing OpenSolaris 2009.06 you need to install several build dependencies. The simplest way to install the build dependencies is to execute the following commands:
+ pfexec pkg install SUNWgmake SUNWj7dev sunstudioexpress SUNWcups SUNWzip
+ SUNWunzip SUNWxwhl SUNWxorg-headers SUNWaudh SUNWfreetype2
+In addition, it is necessary to set a few environment variables for the build:
+ export LANG=C
+ export PATH="/opt/SunStudioExpress/bin:${PATH}"
+End of the OpenJDK build README document.
+Please come again!
+ + diff --git a/README-builds.md b/common/doc/building.md similarity index 77% rename from README-builds.md rename to common/doc/building.md index c6907cd7ab2..3eb96dfd6e1 100644 --- a/README-builds.md +++ b/common/doc/building.md @@ -1,9 +1,9 @@ +% OpenJDK Build README + ![OpenJDK](http://openjdk.java.net/images/openjdk.png) -# OpenJDK Build README -***** +-------------------------------------------------------------------------------- - ## Introduction This README file contains build instructions for the @@ -19,34 +19,34 @@ Some Headlines: is recommended. * The build should scale, i.e. more processors should cause the build to be done in less wall-clock time - * Nested or recursive make invocations have been significantly reduced, - as has the total fork/exec or spawning of sub processes during the build + * Nested or recursive make invocations have been significantly reduced, as + has the total fork/exec or spawning of sub processes during the build * Windows MKS usage is no longer supported * Windows Visual Studio `vsvars*.bat` and `vcvars*.bat` files are run automatically * Ant is no longer used when building the OpenJDK - * Use of ALT_* environment variables for configuring the build is no longer + * Use of ALT\_\* environment variables for configuring the build is no longer supported -***** +------------------------------------------------------------------------------- ## Contents * [Introduction](#introduction) * [Use of Mercurial](#hg) - * [Getting the Source](#get_source) - * [Repositories](#repositories) + * [Getting the Source](#get_source) + * [Repositories](#repositories) * [Building](#building) - * [System Setup](#setup) - * [Linux](#linux) - * [Solaris](#solaris) - * [Mac OS X](#macosx) - * [Windows](#windows) - * [Configure](#configure) - * [Make](#make) + * [System Setup](#setup) + * [Linux](#linux) + * [Solaris](#solaris) + * [Mac OS X](#macosx) + * [Windows](#windows) + * [Configure](#configure) + * [Make](#make) * [Testing](#testing) -***** +------------------------------------------------------------------------------- * [Appendix A: Hints and Tips](#hints) * [FAQ](#faq) @@ -55,23 +55,22 @@ Some Headlines: * [Appendix B: GNU Make Information](#gmake) * [Appendix C: Build Environments](#buildenvironments) -***** +------------------------------------------------------------------------------- - ## Use of Mercurial The OpenJDK sources are maintained with the revision control system [Mercurial](http://mercurial.selenic.com/wiki/Mercurial). If you are new to -Mercurial, please see the [Beginner Guides](http://mercurial.selenic.com/wiki/ -BeginnersGuides) or refer to the [Mercurial Book](http://hgbook.red-bean.com/). -The first few chapters of the book provide an excellent overview of Mercurial, -what it is and how it works. +Mercurial, please see the [Beginner +Guides](http://mercurial.selenic.com/wiki/BeginnersGuides) or refer to the +[Mercurial Book](http://hgbook.red-bean.com/). The first few chapters of the +book provide an excellent overview of Mercurial, what it is and how it works. For using Mercurial with the OpenJDK refer to the [Developer Guide: Installing -and Configuring Mercurial](http://openjdk.java.net/guide/ -repositories.html#installConfig) section for more information. +and Configuring +Mercurial](http://openjdk.java.net/guide/repositories.html#installConfig) +section for more information. 
- ### Getting the Source To get the entire set of OpenJDK Mercurial repositories use the script @@ -83,16 +82,15 @@ To get the entire set of OpenJDK Mercurial repositories use the script Once you have all the repositories, keep in mind that each repository is its own independent repository. You can also re-run `./get_source.sh` anytime to -pull over all the latest changesets in all the repositories. This set of -nested repositories has been given the term "forest" and there are various -ways to apply the same `hg` command to each of the repositories. For -example, the script `make/scripts/hgforest.sh` can be used to repeat the -same `hg` command on every repository, e.g. +pull over all the latest changesets in all the repositories. This set of nested +repositories has been given the term "forest" and there are various ways to +apply the same `hg` command to each of the repositories. For example, the +script `make/scripts/hgforest.sh` can be used to repeat the same `hg` command +on every repository, e.g. cd YourOpenJDK bash ./make/scripts/hgforest.sh status - ### Repositories The set of repositories and what they contain: @@ -135,9 +133,8 @@ There are some very basic guidelines: * Files not needed for typical building or testing of the repository should not be added to the repository. -***** +------------------------------------------------------------------------------- - ## Building The very first step in building the OpenJDK is making sure the system itself @@ -148,7 +145,7 @@ Building the OpenJDK is now done with running a `configure` script which will try and find and verify you have everything you need, followed by running `make`, e.g. -> **`bash ./configure`** +> **`bash ./configure`** \ > **`make all`** Where possible the `configure` script will attempt to located the various @@ -161,9 +158,8 @@ system due to missing software packages. **NOTE:** The `configure` script file does not have execute permissions and will need to be explicitly run with `bash`, see the source guidelines. -***** +------------------------------------------------------------------------------- - ### System Setup Before even attempting to use a system to build the OpenJDK there are some very @@ -174,14 +170,14 @@ basic system setups needed. For all systems: * Install a Bootstrap JDK. All OpenJDK builds require access to a previously - released JDK called the _bootstrap JDK_ or _boot JDK._ The general rule is + released JDK called the *bootstrap JDK* or *boot JDK.* The general rule is that the bootstrap JDK must be an instance of the previous major release of the JDK. In addition, there may be a requirement to use a release at or beyond a particular update level. - **_Building JDK 9 requires JDK 8. JDK 9 developers should not use JDK 9 as + ***Building JDK 9 requires JDK 8. JDK 9 developers should not use JDK 9 as the boot JDK, to ensure that JDK 9 dependencies are not introduced into the - parts of the system that are built with JDK 8._** + parts of the system that are built with JDK 8.*** The JDK 8 binaries can be downloaded from Oracle's [JDK 8 download site](http://www.oracle.com/technetwork/java/javase/downloads/index.html). @@ -217,7 +213,6 @@ And for specific systems: Install [XCode 6.3](https://developer.apple.com/xcode/) - #### Linux With Linux, try and favor the system packages over building your own or getting @@ -231,69 +226,29 @@ refer to the JDK installed on your Linux system. You will need to unset you are getting from the default system settings make sense for building the OpenJDK. 
- #### Solaris - ##### Studio Compilers -At a minimum, the [Studio 12 Update 4 Compilers](http://www.oracle.com/ -technetwork/server-storage/solarisstudio/downloads/index.htm) (containing -version 5.13 of the C and C++ compilers) is required, including specific -patches. +At a minimum, the [Studio 12 Update 4 +Compilers](http://www.oracle.com/technetwork/server-storage/solarisstudio/downloads/index.htm) +(containing version 5.13 of the C and C++ compilers) is required, including +specific patches. The Solaris Studio installation should contain at least these packages: ->**Package** | -**Version** | -
| Package                                          | Version       |
| ------------------------------------------------ | ------------- |
| developer/solarisstudio-124/backend              | 12.4-1.0.6.0  |
| developer/solarisstudio-124/c++                  | 12.4-1.0.10.0 |
| developer/solarisstudio-124/cc                   | 12.4-1.0.4.0  |
| developer/solarisstudio-124/library/c++-libs     | 12.4-1.0.10.0 |
| developer/solarisstudio-124/library/math-libs    | 12.4-1.0.0.1  |
| developer/solarisstudio-124/library/studio-gccrt | 12.4-1.0.0.1  |
| developer/solarisstudio-124/studio-common        | 12.4-1.0.0.1  |
| developer/solarisstudio-124/studio-ja            | 12.4-1.0.0.1  |
| developer/solarisstudio-124/studio-legal         | 12.4-1.0.0.1  |
| developer/solarisstudio-124/studio-zhCN          | 12.4-1.0.0.1  |
| Binary Name | Category     | Package  | Description                                                |
| ----------- | ------------ | -------- | ---------------------------------------------------------- |
| ar.exe      | Devel        | binutils | The GNU assembler, linker and binary utilities             |
| make.exe    | Devel        | make     | The GNU version of the 'make' utility built for CYGWIN     |
| m4.exe      | Interpreters | m4       | GNU implementation of the traditional Unix macro processor |
| cpio.exe    | Utils        | cpio     | A program to manage archives of files                      |
| gawk.exe    | Utils        | awk      | Pattern-directed scanning and processing language          |
| file.exe    | Utils        | file     | Determines file type using 'magic' numbers                 |
| zip.exe     | Archive      | zip      | Package and compress (archive) files                       |
| unzip.exe   | Archive      | unzip    | Extract compressed files in a ZIP archive                  |
| free.exe    | System       | procps   | Display amount of free and used memory in the system       |
| Base OS and Architecture            | OS                          | C/C++ Compiler                                         | Bootstrap JDK | Processors | RAM Minimum | DISK Needs |
| ----------------------------------- | --------------------------- | ------------------------------------------------------ | ------------- | ---------- | ----------- | ---------- |
| Linux X86 (32-bit) and X64 (64-bit) | Oracle Enterprise Linux 6.4 | gcc 4.9.2                                              | JDK 8         | 2 or more  | 1 GB        | 6 GB       |
| Solaris SPARCV9 (64-bit)            | Solaris 11 Update 1         | Studio 12 Update 4 + patches                           | JDK 8         | 4 or more  | 4 GB        | 8 GB       |
| Solaris X64 (64-bit)                | Solaris 11 Update 1         | Studio 12 Update 4 + patches                           | JDK 8         | 4 or more  | 4 GB        | 8 GB       |
| Windows X86 (32-bit)                | Windows Server 2012 R2 x64  | Microsoft Visual Studio C++ 2013 Professional Edition  | JDK 8         | 2 or more  | 2 GB        | 6 GB       |
| Windows X64 (64-bit)                | Windows Server 2012 R2 x64  | Microsoft Visual Studio C++ 2013 Professional Edition  | JDK 8         | 2 or more  | 2 GB        | 6 GB       |
| Mac OS X X64 (64-bit)               | Mac OS X 10.9 "Mavericks"   | Xcode 6.3 or newer                                     | JDK 8         | 2 or more  | 4 GB        | 6 GB       |