diff --git a/src/Doc/System/Environment.thy b/src/Doc/System/Environment.thy --- a/src/Doc/System/Environment.thy +++ b/src/Doc/System/Environment.thy @@ -1,510 +1,510 @@ (*:maxLineLen=78:*) theory Environment imports Base begin chapter \The Isabelle system environment\ text \ This manual describes Isabelle together with related tools as seen from a system-oriented view. See also the \<^emph>\Isabelle/Isar Reference Manual\ @{cite "isabelle-isar-ref"} for the actual Isabelle input language and related concepts, and \<^emph>\The Isabelle/Isar Implementation Manual\ @{cite "isabelle-implementation"} for the main concepts of the underlying implementation in Isabelle/ML. \ section \Isabelle settings \label{sec:settings}\ text \ Isabelle executables may depend on the \<^emph>\Isabelle settings\ within the process environment. This is a statically scoped collection of environment variables, such as @{setting ISABELLE_HOME}, @{setting ML_SYSTEM}, @{setting ML_HOME}. These variables are \<^emph>\not\ intended to be set directly from the shell, but are provided by Isabelle \<^emph>\components\ via their \<^emph>\settings files\ as explained below. \ subsection \Bootstrapping the environment \label{sec:boot}\ text \ Isabelle executables need to be run within a proper settings environment. This is bootstrapped as described below, on the first invocation of one of the outer wrapper scripts (such as @{executable_ref isabelle}). This happens only once for each process tree, i.e.\ the environment is passed to subprocesses according to regular Unix conventions. \<^enum> The special variable @{setting_def ISABELLE_HOME} is determined automatically from the location of the binary that has been run. You should not try to set @{setting ISABELLE_HOME} manually. Also note that the Isabelle executables either have to be run from their original location in the distribution directory, or via the executable objects created by the @{tool install} tool. Symbolic links are admissible, but a plain copy of the \<^dir>\$ISABELLE_HOME/bin\ files will not work! \<^enum> The file \<^file>\$ISABELLE_HOME/etc/settings\ is run as a @{executable_ref bash} shell script with the auto-export option for variables enabled. This file holds a rather long list of shell variable assignments, thus providing the site-wide default settings. The Isabelle distribution already contains a global settings file with sensible defaults for most variables. When installing the system, only a few of these may have to be adapted (probably @{setting ML_SYSTEM} etc.). \<^enum> The file \<^path>\$ISABELLE_HOME_USER/etc/settings\ (if it exists) is run in the same way as the site default settings. Note that the variable @{setting ISABELLE_HOME_USER} has already been set before --- usually to something like \<^verbatim>\$USER_HOME/.isabelle/Isabelle2020\. Thus individual users may override the site-wide defaults. Typically, a user settings file contains only a few lines, with some assignments that are actually changed. Never copy the central \<^file>\$ISABELLE_HOME/etc/settings\ file! Since settings files are regular GNU @{executable_def bash} scripts, one may use complex shell commands, such as \<^verbatim>\if\ or \<^verbatim>\case\ statements, to set variables depending on the system architecture or other environment variables. Such advanced features should be added only with great care, though. In particular, external environment references should be kept at a minimum.
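\<^medskip> For illustration only, a user settings file might use such a \<^verbatim>\case\ statement to dispatch on @{setting ISABELLE_PLATFORM_FAMILY}; the variable \<^verbatim>\MY_TOOL_HOME\ below is a made-up example, not a predefined Isabelle setting:

@{verbatim [display] \case "$ISABELLE_PLATFORM_FAMILY" in
  linux)
    MY_TOOL_HOME="$USER_HOME/opt/mytool/linux"
    ;;
  macos)
    MY_TOOL_HOME="$USER_HOME/opt/mytool/macos"
    ;;
  windows)
    MY_TOOL_HOME="$USER_HOME/opt/mytool/windows"
    ;;
esac\}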
\<^medskip> A few variables are somewhat special, e.g.\ @{setting_def ISABELLE_TOOL} is set automatically to the absolute path name of the @{executable isabelle} executable. \<^medskip> Note that the settings environment may be inspected with the @{tool getenv} tool. This might help to figure out the effect of complex settings scripts. \ subsection \Common variables\ text \ This is a reference of common Isabelle settings variables. Note that the list is somewhat open-ended. Third-party utilities or interfaces may add their own selection. Variables that are special in some sense are marked with \\<^sup>*\. \<^descr>[@{setting_def USER_HOME}\\<^sup>*\] is the cross-platform user home directory. On Unix systems this is usually the same as @{setting HOME}, but on Windows it is the regular home directory of the user, not the one within the Cygwin root file-system.\<^footnote>\Cygwin itself offers another choice whether its HOME should point to the \<^path>\/home\ directory tree or the Windows user home.\ \<^descr>[@{setting_def ISABELLE_HOME}\\<^sup>*\] is the location of the top-level Isabelle distribution directory. This is automatically determined from the Isabelle executable that has been invoked. Do not attempt to set @{setting ISABELLE_HOME} yourself from the shell! \<^descr>[@{setting_def ISABELLE_HOME_USER}] is the user-specific counterpart of @{setting ISABELLE_HOME}. The default value is relative to \<^path>\$USER_HOME/.isabelle\; under rare circumstances this may be changed in the global setting file. Typically, the @{setting ISABELLE_HOME_USER} directory mimics @{setting ISABELLE_HOME} to some extent. In particular, site-wide defaults may be overridden by a private \<^verbatim>\$ISABELLE_HOME_USER/etc/settings\. \<^descr>[@{setting_def ISABELLE_PLATFORM_FAMILY}\\<^sup>*\] is automatically set to the general platform family: \<^verbatim>\linux\, \<^verbatim>\macos\, \<^verbatim>\windows\. Note that platform-dependent tools usually need to refer to the more specific identification according to @{setting ISABELLE_PLATFORM64}, @{setting ISABELLE_PLATFORM32}, @{setting ISABELLE_WINDOWS_PLATFORM64}, @{setting ISABELLE_WINDOWS_PLATFORM32}. \<^descr>[@{setting_def ISABELLE_PLATFORM64}\\<^sup>*\, @{setting_def ISABELLE_PLATFORM32}\\<^sup>*\] indicate the standard Posix platform: \<^verbatim>\x86_64\ for 64 bit and \<^verbatim>\x86\ for 32 bit, together with a symbolic name for the operating system (\<^verbatim>\linux\, \<^verbatim>\darwin\, \<^verbatim>\cygwin\). All platforms support 64 bit executables; some platforms also support 32 bit executables. In GNU bash scripts, it is possible to use the following expressions (with quotes) to specify a preference of 64 bit over 32 bit: @{verbatim [display] \"${ISABELLE_PLATFORM64:-$ISABELLE_PLATFORM32}"\} In contrast, the subsequent expression prefers the old 32 bit variant (which is only relevant for unusual applications): @{verbatim [display] \"${ISABELLE_PLATFORM32:-$ISABELLE_PLATFORM64}"\} \<^descr>[@{setting_def ISABELLE_WINDOWS_PLATFORM64}\\<^sup>*\, @{setting_def ISABELLE_WINDOWS_PLATFORM32}\\<^sup>*\] indicate the native Windows platform. These settings are analogous to (but independent of) those for the standard Posix subsystem: @{setting ISABELLE_PLATFORM64}, @{setting ISABELLE_PLATFORM32}.
In GNU bash scripts, a preference for native Windows platform variants may be specified like this (first 64 bit, second 32 bit): @{verbatim [display] \"${ISABELLE_WINDOWS_PLATFORM64:-${ISABELLE_WINDOWS_PLATFORM32:- ${ISABELLE_PLATFORM64:-$ISABELLE_PLATFORM32}}}"\} \<^descr>[@{setting ISABELLE_TOOL}\\<^sup>*\] is automatically set to the full path name of the @{executable isabelle} executable. \<^descr>[@{setting_def ISABELLE_IDENTIFIER}\\<^sup>*\] refers to the name of this Isabelle distribution, e.g.\ ``\<^verbatim>\Isabelle2020\''. \<^descr>[@{setting_def ML_SYSTEM}, @{setting_def ML_HOME}, @{setting_def ML_OPTIONS}, @{setting_def ML_PLATFORM}, @{setting_def ML_IDENTIFIER}\\<^sup>*\] specify the underlying ML system to be used for Isabelle. There is only a fixed set of admissible @{setting ML_SYSTEM} names (see the \<^file>\$ISABELLE_HOME/etc/settings\ file of the distribution). The actual compiler binary will be run from the directory @{setting ML_HOME}, with @{setting ML_OPTIONS} as first arguments on the command line. The optional @{setting ML_PLATFORM} may specify the binary format of ML heap images, which is useful for cross-platform installations. The value of @{setting ML_IDENTIFIER} is automatically obtained by composing the values of @{setting ML_SYSTEM}, @{setting ML_PLATFORM} and the Isabelle version values. \<^descr>[@{setting_def ISABELLE_JDK_HOME}] points to a full JDK (Java Development Kit) installation with \<^verbatim>\javac\ and \<^verbatim>\jar\ executables. Note that conventional \<^verbatim>\JAVA_HOME\ points to the JRE (Java Runtime Environment), not the JDK. \<^descr>[@{setting_def ISABELLE_JAVA_PLATFORM}] identifies the hardware and operating system platform for the Java installation of Isabelle. That is always the (native) 64 bit variant: \<^verbatim>\x86_64-linux\, \<^verbatim>\x86_64-darwin\, \<^verbatim>\x86_64-windows\. \<^descr>[@{setting_def ISABELLE_BROWSER_INFO}] is the directory where HTML and PDF browser information is stored (see also \secref{sec:info}); its default is \<^path>\$ISABELLE_HOME_USER/browser_info\. For ``system build mode'' (see \secref{sec:tool-build}), @{setting_def ISABELLE_BROWSER_INFO_SYSTEM} is used instead; its default is \<^path>\$ISABELLE_HOME/browser_info\. \<^descr>[@{setting_def ISABELLE_HEAPS}] is the directory where session heap images, log files, and build databases are stored; its default is \<^path>\$ISABELLE_HOME_USER/heaps\. If @{system_option system_heaps} is \<^verbatim>\true\, @{setting_def ISABELLE_HEAPS_SYSTEM} is used instead; its default is \<^path>\$ISABELLE_HOME/heaps\. See also \secref{sec:tool-build}. \<^descr>[@{setting_def ISABELLE_LOGIC}] specifies the default logic to load if none is given explicitly by the user. The default value is \<^verbatim>\HOL\. \<^descr>[@{setting_def ISABELLE_LINE_EDITOR}] specifies the line editor for the @{tool_ref console} interface. \<^descr>[@{setting_def ISABELLE_LATEX}, @{setting_def ISABELLE_PDFLATEX}, @{setting_def ISABELLE_BIBTEX}] refer to {\LaTeX} related tools for Isabelle document preparation (see also \secref{sec:tool-latex}). \<^descr>[@{setting_def ISABELLE_TOOLS}] is a colon-separated list of directories that are scanned by @{executable isabelle} for external utility programs (see also \secref{sec:isabelle-tool}). \<^descr>[@{setting_def ISABELLE_DOCS}] is a colon-separated list of directories with documentation files. \<^descr>[@{setting_def PDF_VIEWER}] specifies the program to be used for displaying \<^verbatim>\pdf\ files.
\<^descr>[@{setting_def DVI_VIEWER}] specifies the program to be used for displaying \<^verbatim>\dvi\ files. \<^descr>[@{setting_def ISABELLE_TMP_PREFIX}\\<^sup>*\] is the prefix from which any running Isabelle ML process derives an individual directory for temporary files. \<^descr>[@{setting_def ISABELLE_TOOL_JAVA_OPTIONS}] is passed to the \<^verbatim>\java\ executable when running Isabelle tools (e.g.\ @{tool build}). This is occasionally helpful to provide more heap space, via additional options like \<^verbatim>\-Xms1g -Xmx4g\. \ subsection \Additional components \label{sec:components}\ text \ Any directory may be registered as an explicit \<^emph>\Isabelle component\. The general layout conventions are those of the main Isabelle distribution itself, and the following two files (both optional) have a special meaning: \<^item> \<^verbatim>\etc/settings\ holds additional settings that are initialized when bootstrapping the overall Isabelle environment, cf.\ \secref{sec:boot}. As usual, the content is interpreted as a GNU bash script. It may refer to the component's enclosing directory via the \<^verbatim>\COMPONENT\ shell variable. For example, the following setting allows to refer to files within the component later on, without having to hardwire absolute paths: @{verbatim [display] \MY_COMPONENT_HOME="$COMPONENT"\} Components can also add to existing Isabelle settings such as @{setting_def ISABELLE_TOOLS}, in order to provide component-specific tools that can be invoked by end-users. For example: @{verbatim [display] \ISABELLE_TOOLS="$ISABELLE_TOOLS:$COMPONENT/lib/Tools"\} \<^item> \<^verbatim>\etc/components\ holds a list of further sub-components of the same structure. The directory specifications given here can be either absolute (with leading \<^verbatim>\/\) or relative to the component's main directory. The root of component initialization is @{setting ISABELLE_HOME} itself. After initializing all of its sub-components recursively, @{setting ISABELLE_HOME_USER} is included in the same manner (if that directory exists). This allows to install private components via \<^path>\$ISABELLE_HOME_USER/etc/components\, although it is often more convenient to do that programmatically via the \<^verbatim>\init_component\ shell function in the \<^verbatim>\etc/settings\ script of \<^verbatim>\$ISABELLE_HOME_USER\ (or any other component directory). For example: @{verbatim [display] \init_component "$HOME/screwdriver-2.0"\} This is tolerant wrt.\ missing component directories, but might produce a warning. \<^medskip> More complex situations may be addressed by initializing components listed in a given catalog file, relative to some base directory: @{verbatim [display] \init_components "$HOME/my_component_store" "some_catalog_file"\} The component directories listed in the catalog file are treated as relative to the given base directory. See also \secref{sec:tool-components} for some tool-support for resolving components that are formally initialized but not installed yet. \ section \The Isabelle tool wrapper \label{sec:isabelle-tool}\ text \ The main \<^emph>\Isabelle tool wrapper\ provides a generic startup environment for Isabelle-related utilities, user interfaces, add-on applications etc. Such tools automatically benefit from the settings mechanism (\secref{sec:settings}). Moreover, this is the standard way to invoke Isabelle/Scala functionality as a separate operating-system process.
Isabelle command-line tools are run uniformly via a common wrapper --- @{executable_ref isabelle}: @{verbatim [display] \Usage: isabelle TOOL [ARGS ...] Start Isabelle TOOL with ARGS; pass "-?" for tool-specific help. Available tools: ...\} Tools may be implemented in Isabelle/Scala or as stand-alone executables (usually as GNU bash scripts). In the invocation of ``@{executable isabelle}~\tool\'', the named \tool\ is resolved as follows (and in the given order). \<^enum> An external tool found in the directories listed in the @{setting ISABELLE_TOOLS} settings variable (colon-separated list in standard POSIX notation). \<^enum> If a file ``\tool\\<^verbatim>\.scala\'' is found, the source needs to define some object that extends the class \<^verbatim>\Isabelle_Tool.Body\. The Scala compiler is invoked on the spot (which may take some time), and the body function is run with the command-line arguments as \<^verbatim>\List[String]\. \<^enum> If an executable file ``\tool\'' is found, it is invoked as a stand-alone program with the command-line arguments provided as \<^verbatim>\argv\ array. \<^enum> An internal tool that is registered in \<^verbatim>\Isabelle_Tool.internal_tools\ within the Isabelle/Scala namespace of \<^verbatim>\Pure.jar\. This is the preferred entry-point for high-end tools implemented in Isabelle/Scala --- compiled once when the Isabelle distribution is built. These tools are provided by Isabelle/Pure and cannot be augmented in user-space. There are also some administrative tools that are available from a bare repository clone of Isabelle, but not in regular distributions. \ subsubsection \Examples\ text \ Show the list of available documentation of the Isabelle distribution: @{verbatim [display] \isabelle doc\} View a certain document as follows: @{verbatim [display] \isabelle doc system\} Query the Isabelle settings environment: @{verbatim [display] \isabelle getenv ISABELLE_HOME_USER\} \ section \The raw Isabelle ML process\ subsection \Batch mode \label{sec:tool-process}\ text \ The @{tool_def process} tool runs the raw ML process in batch mode: @{verbatim [display] \Usage: isabelle process [OPTIONS] Options are: -T THEORY load theory -d DIR include session directory -e ML_EXPR evaluate ML expression on startup -f ML_FILE evaluate ML file on startup -l NAME logic session name (default ISABELLE_LOGIC="HOL") -m MODE add print mode for output -o OPTION override Isabelle system OPTION (via NAME=VAL or NAME) Run the raw Isabelle ML process in batch mode.\} \<^medskip> Options \<^verbatim>\-e\ and \<^verbatim>\-f\ allow to evaluate ML code, before the ML process is started. The source is either given literally or taken from a file. Multiple \<^verbatim>\-e\ and \<^verbatim>\-f\ options are evaluated in the given order. Errors lead to premature exit of the ML process with return code 1. \<^medskip> Option \<^verbatim>\-T\ loads a specified theory file. This is a wrapper for \<^verbatim>\-e\ with a suitable \<^ML>\use_thy\ invocation. \<^medskip> Option \<^verbatim>\-l\ specifies the logic session name. Option \<^verbatim>\-d\ specifies additional directories for session roots, see also \secref{sec:tool-build}. \<^medskip> The \<^verbatim>\-m\ option adds identifiers of print modes to be made active for this session. For example, \<^verbatim>\-m ASCII\ prefers ASCII replacement syntax over mathematical Isabelle symbols. \<^medskip> Option \<^verbatim>\-o\ allows to override Isabelle system options for this process, see also \secref{sec:system-options}.
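\<^medskip> For illustration only, these options may be combined. The following hypothetical invocation selects the \<^verbatim>\HOL\ session, overrides the \<^verbatim>\threads\ system option, and evaluates an ML expression on startup (assuming the \<^verbatim>\HOL\ heap image is available):

@{verbatim [display] \isabelle process -l HOL -o threads=4 -e 'writeln (Context.theory_name (Thy_Info.get_theory "Main"))'\}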
\ subsubsection \Examples\ text \ The subsequent example retrieves the \<^verbatim>\Main\ theory value from the theory loader within ML: @{verbatim [display] \isabelle process -e 'Thy_Info.get_theory "Main"'\} Observe the delicate quoting rules for the GNU bash shell vs.\ ML. The Isabelle/ML and Scala libraries provide functions for that, but here we need to do it manually. \<^medskip> This is how to invoke a function body with proper return code and printing of errors, and without printing of a redundant \<^verbatim>\val it = (): unit\ result: - @{verbatim [display] \isabelle process -e 'Command_Line.tool0 (fn () => writeln "OK")'\} - @{verbatim [display] \isabelle process -e 'Command_Line.tool0 (fn () => error "Bad")'\} + @{verbatim [display] \isabelle process -e 'Command_Line.tool (fn () => writeln "OK")'\} + @{verbatim [display] \isabelle process -e 'Command_Line.tool (fn () => error "Bad")'\} \ subsection \Interactive mode\ text \ The @{tool_def console} tool runs the raw ML process with interactive console and line editor: @{verbatim [display] \Usage: isabelle console [OPTIONS] Options are: -d DIR include session directory -i NAME include session in name-space of theories -l NAME logic session name (default ISABELLE_LOGIC) -m MODE add print mode for output -n no build of session image on startup -o OPTION override Isabelle system OPTION (via NAME=VAL or NAME) -r bootstrap from raw Poly/ML Build a logic session image and run the raw Isabelle ML process in interactive mode, with line editor ISABELLE_LINE_EDITOR.\} \<^medskip> Option \<^verbatim>\-l\ specifies the logic session name. By default, its heap image is checked and built on demand, but the option \<^verbatim>\-n\ skips that. Option \<^verbatim>\-i\ includes additional sessions into the name-space of theories: multiple occurrences are possible. Option \<^verbatim>\-r\ indicates a bootstrap from the raw Poly/ML system, which is relevant for Isabelle/Pure development. \<^medskip> Options \<^verbatim>\-d\, \<^verbatim>\-m\, \<^verbatim>\-o\ have the same meaning as for @{tool process} (\secref{sec:tool-process}). \<^medskip> The Isabelle/ML process is run through the line editor that is specified via the settings variable @{setting ISABELLE_LINE_EDITOR} (e.g.\ @{executable_def rlwrap} for GNU readline); the fall-back is to use plain standard input/output. The user is connected to the raw ML toplevel loop: this is neither Isabelle/Isar nor Isabelle/ML within the usual formal context. The most relevant ML commands at this stage are \<^ML>\use\ (for ML files) and \<^ML>\use_thy\ (for theory files). \ section \The raw Isabelle Java process \label{sec:isabelle-java}\ text \ The @{executable_ref isabelle_java} executable allows to run a Java process within the name space of Java and Scala components that are bundled with Isabelle, but \<^emph>\without\ the Isabelle settings environment (\secref{sec:settings}). After such a JVM cold-start, the Isabelle environment can be accessed via \<^verbatim>\Isabelle_System.getenv\ as usual, but the underlying process environment remains clean. This is e.g.\ relevant when invoking other processes that should remain separate from the current Isabelle installation. \<^medskip> Note that under normal circumstances, Isabelle command-line tools are run \<^emph>\within\ the settings environment, as provided by the @{executable isabelle} wrapper (\secref{sec:isabelle-tool} and \secref{sec:tool-java}). 
\ subsubsection \Example\ text \ The subsequent example creates a raw Java process on the command-line and invokes the main Isabelle application entry point: @{verbatim [display] \isabelle_java isabelle.Main\} \ section \YXML versus XML \label{sec:yxml-vs-xml}\ text \ Isabelle tools often use YXML, which is a simple and efficient syntax for untyped XML trees. The YXML format is defined as follows. \<^enum> The encoding is always UTF-8. \<^enum> Body text is represented verbatim (no escaping, no special treatment of white space, no named entities, no CDATA chunks, no comments). \<^enum> Markup elements are represented via ASCII control characters \\<^bold>X = 5\ and \\<^bold>Y = 6\ as follows: \begin{tabular}{ll} XML & YXML \\\hline \<^verbatim>\<\\name attribute\\<^verbatim>\=\\value \\\<^verbatim>\>\ & \\<^bold>X\<^bold>Yname\<^bold>Yattribute\\<^verbatim>\=\\value\\<^bold>X\ \\ \<^verbatim>\\name\\<^verbatim>\>\ & \\<^bold>X\<^bold>Y\<^bold>X\ \\ \end{tabular} There is no special case for empty body text, i.e.\ \<^verbatim>\\ is treated like \<^verbatim>\\. Also note that \\<^bold>X\ and \\<^bold>Y\ may never occur in well-formed XML documents. Parsing YXML is pretty straight-forward: split the text into chunks separated by \\<^bold>X\, then split each chunk into sub-chunks separated by \\<^bold>Y\. Markup chunks start with an empty sub-chunk, and a second empty sub-chunk indicates close of an element. Any other non-empty chunk consists of plain text. For example, see \<^file>\~~/src/Pure/PIDE/yxml.ML\ or \<^file>\~~/src/Pure/PIDE/yxml.scala\. YXML documents may be detected quickly by checking that the first two characters are \\<^bold>X\<^bold>Y\. \ end diff --git a/src/Pure/Admin/build_history.scala b/src/Pure/Admin/build_history.scala --- a/src/Pure/Admin/build_history.scala +++ b/src/Pure/Admin/build_history.scala @@ -1,593 +1,593 @@ /* Title: Pure/Admin/build_history.scala Author: Makarius Build other history versions. 
*/ package isabelle import java.io.{File => JFile} import java.time.format.DateTimeFormatter import java.util.Locale object Build_History { /* log files */ val engine = "build_history" val log_prefix = engine + "_" /* augment settings */ def augment_settings( other_isabelle: Other_Isabelle, threads: Int, arch_64: Boolean, heap: Int, max_heap: Option[Int], more_settings: List[String]): String = { val (ml_platform, ml_settings) = { val windows_32 = "x86-windows" val windows_64 = "x86_64-windows" val windows_64_32 = "x86_64_32-windows" val platform_32 = other_isabelle.getenv("ISABELLE_PLATFORM32") val platform_64 = other_isabelle.getenv("ISABELLE_PLATFORM64") val platform_64_32 = platform_64.replace("x86_64-", "x86_64_32-") val polyml_home = try { Path.explode(other_isabelle.getenv("ML_HOME")).dir } catch { case ERROR(msg) => error("Bad ML_HOME: " + msg) } def ml_home(platform: String): Path = polyml_home + Path.explode(platform) def err(platform: String): Nothing = error("Platform " + platform + " unavailable on this machine") def check_dir(platform: String): Boolean = platform != "" && ml_home(platform).is_dir val ml_platform = if (Platform.is_windows && arch_64) { if (check_dir(windows_64)) windows_64 else err(windows_64) } else if (Platform.is_windows && !arch_64) { if (check_dir(windows_64_32)) windows_64_32 else if (check_dir(windows_32)) windows_32 else platform_32 // x86-cygwin } else if (arch_64) { if (check_dir(platform_64)) platform_64 else err(platform_64) } else if (check_dir(platform_64_32)) platform_64_32 else platform_32 val ml_options = "--minheap " + heap + (if (max_heap.isDefined) " --maxheap " + max_heap.get else "") + " --gcthreads " + threads + (if (ml_platform.endsWith("-windows")) " --codepage utf8" else "") (ml_platform, List( "ML_HOME=" + File.bash_path(ml_home(ml_platform)), "ML_PLATFORM=" + quote(ml_platform), "ML_OPTIONS=" + quote(ml_options))) } val thread_settings = List( "ISABELLE_JAVA_SYSTEM_OPTIONS=\"$ISABELLE_JAVA_SYSTEM_OPTIONS -Disabelle.threads=" + threads + "\"", "ISABELLE_BUILD_OPTIONS=\"threads=" + threads + "\"") val settings = List(ml_settings, thread_settings) ::: (if (more_settings.isEmpty) Nil else List(more_settings)) File.append(other_isabelle.etc_settings, "\n" + cat_lines(settings.map(terminate_lines(_)))) ml_platform } /** build_history **/ private val default_user_home = Path.explode("$USER_HOME") private val default_rev = "tip" private val default_multicore = (1, 1) private val default_heap = 1500 private val default_isabelle_identifier = "build_history" def build_history( options: Options, root: Path, user_home: Path = default_user_home, progress: Progress = No_Progress, rev: String = default_rev, afp_rev: Option[String] = None, afp_partition: Int = 0, isabelle_identifier: String = default_isabelle_identifier, ml_statistics_step: Int = 1, components_base: Path = Components.default_components_base, fresh: Boolean = false, hostname: String = "", multicore_base: Boolean = false, multicore_list: List[(Int, Int)] = List(default_multicore), arch_64: Boolean = false, heap: Int = default_heap, max_heap: Option[Int] = None, init_settings: List[String] = Nil, more_settings: List[String] = Nil, verbose: Boolean = false, build_tags: List[String] = Nil, build_args: List[String] = Nil): List[(Process_Result, Path)] = { /* sanity checks */ if (File.eq(Path.explode("~~"), root)) error("Repository coincides with ISABELLE_HOME=" + Path.explode("~~").expand) for ((threads, _) <- multicore_list if threads < 1) error("Bad threads value < 1: " + threads) for 
((_, processes) <- multicore_list if processes < 1) error("Bad processes value < 1: " + processes) if (heap < 100) error("Bad heap value < 100: " + heap) if (max_heap.isDefined && max_heap.get < heap) error("Bad max_heap value < heap: " + max_heap.get) System.getenv("ISABELLE_SETTINGS_PRESENT") match { case null | "" => case _ => error("Cannot run build_history within existing Isabelle settings environment") } /* checkout Isabelle + AFP repository */ def checkout(dir: Path, version: String): String = { val hg = Mercurial.repository(dir) hg.update(rev = version, clean = true) progress.echo_if(verbose, hg.log(version, options = "-l1")) hg.id(rev = version) } val isabelle_version = checkout(root, rev) val afp_repos = root + Path.explode("AFP") val afp_version = afp_rev.map(checkout(afp_repos, _)) val (afp_build_args, afp_sessions) = if (afp_rev.isEmpty) (Nil, Nil) else { val afp = AFP.init(options, afp_repos) (List("-d", "~~/AFP/thys"), afp.partition(afp_partition)) } /* main */ val other_isabelle = Other_Isabelle(root, isabelle_identifier = isabelle_identifier, user_home = user_home, progress = progress) val build_host = proper_string(hostname) getOrElse Isabelle_System.hostname() val build_history_date = Date.now() val build_group_id = build_host + ":" + build_history_date.time.ms var first_build = true for ((threads, processes) <- multicore_list) yield { /* init settings */ val component_settings = other_isabelle.init_components( components_base = components_base, catalogs = List("main", "optional")) other_isabelle.init_settings(component_settings ::: init_settings) other_isabelle.resolve_components(verbose) val ml_platform = augment_settings(other_isabelle, threads, arch_64, heap, max_heap, more_settings) val isabelle_output = other_isabelle.isabelle_home_user + Path.explode("heaps") + Path.explode(other_isabelle.getenv("ML_IDENTIFIER")) val isabelle_output_log = isabelle_output + Path.explode("log") val isabelle_base_log = isabelle_output + Path.explode("../base_log") if (first_build) { other_isabelle.resolve_components(verbose) if (fresh) Isabelle_System.rm_tree(other_isabelle.isabelle_home + Path.explode("lib/classes")) other_isabelle.bash( "env PATH=\"" + File.bash_path(Path.explode("~~/lib/dummy_stty").expand) + ":$PATH\" " + "bin/isabelle jedit -b", redirect = true, echo = verbose).check for { tool <- List("ghc_setup", "ocaml_setup") if other_isabelle.getenv("ISABELLE_" + Word.uppercase(tool)) == "true" && (other_isabelle.isabelle_home + Path.explode("lib/Tools/" + tool)).is_file } other_isabelle(tool, echo = verbose) Isabelle_System.rm_tree(isabelle_base_log) } Isabelle_System.rm_tree(isabelle_output) Isabelle_System.mkdirs(isabelle_output) val log_path = other_isabelle.isabelle_home_user + Build_Log.log_subdir(build_history_date) + Build_Log.log_filename(Build_History.engine, build_history_date, List(build_host, ml_platform, "M" + threads) ::: build_tags) Isabelle_System.mkdirs(log_path.dir) val build_out = other_isabelle.isabelle_home_user + Path.explode("log/build.out") val build_out_progress = new File_Progress(build_out) build_out.file.delete /* build */ if (multicore_base && !first_build && isabelle_base_log.is_dir) Isabelle_System.copy_dir(isabelle_base_log, isabelle_output_log) val build_start = Date.now() val build_args1 = List("-v", "-j" + processes) ::: afp_build_args ::: build_args val build_result = Other_Isabelle(root, isabelle_identifier = isabelle_identifier, user_home = user_home, progress = build_out_progress)( "build " + Bash.strings(build_args1 ::: 
afp_sessions), redirect = true, echo = true, strict = false) val build_end = Date.now() val build_info: Build_Log.Build_Info = Build_Log.Log_File(log_path.file_name, build_result.out_lines). parse_build_info(ml_statistics = true) /* output log */ val store = Sessions.store(options + "build_database_server=false") val meta_info = Properties.lines_nonempty(Build_Log.Prop.build_tags.name, build_tags) ::: Properties.lines_nonempty(Build_Log.Prop.build_args.name, build_args1) ::: List( Build_Log.Prop.build_group_id.name -> build_group_id, Build_Log.Prop.build_id.name -> (build_host + ":" + build_start.time.ms), Build_Log.Prop.build_engine.name -> Build_History.engine, Build_Log.Prop.build_host.name -> build_host, Build_Log.Prop.build_start.name -> Build_Log.print_date(build_start), Build_Log.Prop.build_end.name -> Build_Log.print_date(build_end), Build_Log.Prop.isabelle_version.name -> isabelle_version) ::: afp_version.map(Build_Log.Prop.afp_version.name -> _).toList build_out_progress.echo("Reading session build info ...") val session_build_info = build_info.finished_sessions.flatMap(session_name => { val database = isabelle_output + store.database(session_name) if (database.is_file) { using(SQLite.open_database(database))(db => { val theory_timings = try { store.read_theory_timings(db, session_name).map(ps => Protocol.Theory_Timing_Marker((Build_Log.SESSION_NAME, session_name) :: ps)) } catch { case ERROR(_) => Nil } val session_sources = store.read_build(db, session_name).map(_.sources) match { case Some(sources) if sources.length == SHA1.digest_length => List("Sources " + session_name + " " + sources) case _ => Nil } theory_timings ::: session_sources }) } else Nil }) build_out_progress.echo("Reading ML statistics ...") val ml_statistics = build_info.finished_sessions.flatMap(session_name => { val database = isabelle_output + store.database(session_name) val log_gz = isabelle_output + store.log_gz(session_name) val properties = if (database.is_file) { using(SQLite.open_database(database))(db => store.read_ml_statistics(db, session_name)) } else if (log_gz.is_file) { Build_Log.Log_File(log_gz).parse_session_info(ml_statistics = true).ml_statistics } else Nil val trimmed_properties = if (ml_statistics_step <= 0) Nil else if (ml_statistics_step == 1) properties else { (for { (ps, i) <- properties.iterator.zipWithIndex if i % ml_statistics_step == 0 } yield ps).toList } trimmed_properties.map(ps => (Build_Log.SESSION_NAME -> session_name) :: ps) }) build_out_progress.echo("Reading error messages ...") val session_errors = build_info.failed_sessions.flatMap(session_name => { val database = isabelle_output + store.database(session_name) val errors = if (database.is_file) { try { using(SQLite.open_database(database))(db => store.read_errors(db, session_name)) } // column "errors" could be missing catch { case _: java.sql.SQLException => Nil } } else Nil errors.map(msg => List(Build_Log.SESSION_NAME -> session_name, Markup.CONTENT -> msg)) }) build_out_progress.echo("Reading heap sizes ...") val heap_sizes = build_info.finished_sessions.flatMap(session_name => { val heap = isabelle_output + Path.explode(session_name) if (heap.is_file) Some("Heap " + session_name + " (" + Value.Long(heap.file.length) + " bytes)") else None }) build_out_progress.echo("Writing log file " + log_path.ext("xz") + " ...") File.write_xz(log_path.ext("xz"), terminate_lines( Protocol.Meta_Info_Marker(meta_info) :: build_result.out_lines ::: session_build_info ::: ml_statistics.map(Protocol.ML_Statistics_Marker.apply) ::: 
session_errors.map(Protocol.Error_Message_Marker.apply) ::: heap_sizes), XZ.options(6)) /* next build */ if (multicore_base && first_build && isabelle_output_log.is_dir) Isabelle_System.copy_dir(isabelle_output_log, isabelle_base_log) Isabelle_System.rm_tree(isabelle_output) first_build = false (build_result, log_path.ext("xz")) } } /* command line entry point */ private object Multicore { private val Pat1 = """^(\d+)$""".r private val Pat2 = """^(\d+)x(\d+)$""".r def parse(s: String): (Int, Int) = s match { case Pat1(Value.Int(x)) => (x, 1) case Pat2(Value.Int(x), Value.Int(y)) => (x, y) case _ => error("Bad multicore configuration: " + quote(s)) } } def main(args: Array[String]) { - Command_Line.tool0 { + Command_Line.tool { var afp_rev: Option[String] = None var multicore_base = false var components_base: Path = Components.default_components_base var heap: Option[Int] = None var max_heap: Option[Int] = None var multicore_list = List(default_multicore) var isabelle_identifier = default_isabelle_identifier var afp_partition = 0 var more_settings: List[String] = Nil var fresh = false var hostname = "" var init_settings: List[String] = Nil var arch_64 = false var output_file = "" var rev = default_rev var ml_statistics_step = 1 var build_tags = List.empty[String] var user_home = default_user_home var verbose = false var exit_code = false val getopts = Getopts(""" Usage: Admin/build_history [OPTIONS] REPOSITORY [ARGS ...] Options are: -A REV include $ISABELLE_HOME/AFP repository at given revision -B first multicore build serves as base for scheduling information -C DIR base directory for Isabelle components (default: """ + Components.default_components_base + """) -H SIZE minimal ML heap in MB (default: """ + default_heap + """ for x86, """ + default_heap * 2 + """ for x86_64) -M MULTICORE multicore configurations (see below) -N NAME alternative ISABELLE_IDENTIFIER (default: """ + default_isabelle_identifier + """) -P NUMBER AFP partition number (0, 1, 2, default: 0=unrestricted) -U SIZE maximal ML heap in MB (default: unbounded) -e TEXT additional text for generated etc/settings -f fresh build of Isabelle/Scala components (recommended) -h NAME override local hostname -i TEXT initial text for generated etc/settings -m ARCH processor architecture (32=x86, 64=x86_64, default: x86) -o FILE output file for log names (default: stdout) -r REV update to revision (default: """ + default_rev + """) -s NUMBER step size for ML statistics (0=none, 1=all, n=step, default: 1) -t TAG free-form build tag (multiple occurrences possible) -u DIR alternative USER_HOME directory -v verbose -x return overall exit code from build processes Build Isabelle sessions from the history of another REPOSITORY clone, passing ARGS directly to its isabelle build tool. Each MULTICORE configuration consists of one or two numbers (default 1): THREADS or THREADSxPROCESSES, e.g. -M 1,2,4 or -M 1x4,2x2,4. 
""", "A:" -> (arg => afp_rev = Some(arg)), "B" -> (_ => multicore_base = true), "C:" -> (arg => components_base = Path.explode(arg)), "H:" -> (arg => heap = Some(Value.Int.parse(arg))), "M:" -> (arg => multicore_list = space_explode(',', arg).map(Multicore.parse)), "N:" -> (arg => isabelle_identifier = arg), "P:" -> (arg => afp_partition = Value.Int.parse(arg)), "U:" -> (arg => max_heap = Some(Value.Int.parse(arg))), "e:" -> (arg => more_settings = more_settings ::: List(arg)), "f" -> (_ => fresh = true), "h:" -> (arg => hostname = arg), "i:" -> (arg => init_settings = init_settings ::: List(arg)), "m:" -> { case "32" | "x86" => arch_64 = false case "64" | "x86_64" => arch_64 = true case bad => error("Bad processor architecture: " + quote(bad)) }, "o:" -> (arg => output_file = arg), "r:" -> (arg => rev = arg), "s:" -> (arg => ml_statistics_step = Value.Int.parse(arg)), "t:" -> (arg => build_tags = build_tags ::: List(arg)), "u:" -> (arg => user_home = Path.explode(arg)), "v" -> (_ => verbose = true), "x" -> (_ => exit_code = true)) val more_args = getopts(args) val (root, build_args) = more_args match { case root :: build_args => (Path.explode(root), build_args) case _ => getopts.usage() } val progress = new Console_Progress(stderr = true) val results = build_history(Options.init(), root, user_home = user_home, progress = progress, rev = rev, afp_rev = afp_rev, afp_partition = afp_partition, isabelle_identifier = isabelle_identifier, ml_statistics_step = ml_statistics_step, components_base = components_base, fresh = fresh, hostname = hostname, multicore_base = multicore_base, multicore_list = multicore_list, arch_64 = arch_64, heap = heap.getOrElse(if (arch_64) default_heap * 2 else default_heap), max_heap = max_heap, init_settings = init_settings, more_settings = more_settings, verbose = verbose, build_tags = build_tags, build_args = build_args) if (output_file == "") { for ((_, log_path) <- results) Output.writeln(log_path.implode, stdout = true) } else { File.write(Path.explode(output_file), cat_lines(for ((_, log_path) <- results) yield log_path.implode)) } val rc = (0 /: results) { case (rc, (res, _)) => rc max res.rc } if (rc != 0 && exit_code) sys.exit(rc) } } /** remote build_history -- via command-line **/ def remote_build_history( ssh: SSH.Session, isabelle_repos_self: Path, isabelle_repos_other: Path, isabelle_repos_source: String = Isabelle_Cronjob.isabelle_repos_source, afp_repos_source: String = AFP.repos_source, isabelle_identifier: String = "remote_build_history", self_update: Boolean = false, progress: Progress = No_Progress, rev: String = "", afp_rev: Option[String] = None, options: String = "", args: String = ""): List[(String, Bytes)] = { /* Isabelle self repository */ val self_hg = Mercurial.setup_repository(isabelle_repos_source, isabelle_repos_self, ssh = ssh) def execute(cmd: String, args: String, echo: Boolean = false, strict: Boolean = true): Unit = ssh.execute( Isabelle_System.export_isabelle_identifier(isabelle_identifier) + ssh.bash_path(isabelle_repos_self + Path.explode(cmd)) + " " + args, progress_stdout = progress.echo_if(echo, _), progress_stderr = progress.echo_if(echo, _), strict = strict).check if (self_update) { val hg = Mercurial.repository(Path.explode("~~")) hg.push(self_hg.root_url, force = true) self_hg.update(rev = hg.parent(), clean = true) execute("bin/isabelle", "components -I") execute("bin/isabelle", "components -a", echo = true) execute("Admin/build", "jars_fresh") } val rev_id = self_hg.id(rev) /* Isabelle other + AFP repository */ if 
(Mercurial.is_repository(isabelle_repos_other, ssh = ssh)) { ssh.rm_tree(isabelle_repos_other) } val other_hg = Mercurial.clone_repository( ssh.bash_path(isabelle_repos_self), isabelle_repos_other, rev = rev_id, ssh = ssh) val afp_options = if (afp_rev.isEmpty) "" else { val afp_repos = isabelle_repos_other + Path.explode("AFP") val afp_hg = Mercurial.setup_repository(afp_repos_source, afp_repos, ssh = ssh) " -A " + Bash.string(afp_rev.get) } /* Admin/build_history */ ssh.with_tmp_dir(tmp_dir => { val output_file = tmp_dir + Path.explode("output") execute("Admin/build_history", "-o " + ssh.bash_path(output_file) + (if (rev == "") "" else " -r " + Bash.string(rev_id)) + " " + options + afp_options + " " + ssh.bash_path(isabelle_repos_other) + " " + args, echo = true, strict = false) for (line <- split_lines(ssh.read(output_file))) yield { val log = Path.explode(line) val bytes = ssh.read_bytes(log) ssh.rm(log) (log.file_name, bytes) } }) } } diff --git a/src/Pure/Admin/build_polyml.scala b/src/Pure/Admin/build_polyml.scala --- a/src/Pure/Admin/build_polyml.scala +++ b/src/Pure/Admin/build_polyml.scala @@ -1,322 +1,322 @@ /* Title: Pure/Admin/build_polyml.scala Author: Makarius Build Poly/ML from sources. */ package isabelle import scala.util.matching.Regex object Build_PolyML { /** platform-specific build **/ sealed case class Platform_Info( options: List[String] = Nil, setup: String = "", copy_files: List[String] = Nil, ldd_pattern: Option[(String, Regex)] = None) private val platform_info = Map( "linux" -> Platform_Info( options = List("LDFLAGS=-Wl,-rpath,_DUMMY_"), ldd_pattern = Some(("ldd", """\s*libgmp.*=>\s*(\S+).*""".r))), "darwin" -> Platform_Info( options = List("--build=x86_64-darwin", "CFLAGS=-arch x86_64 -O3 -I../libffi/include", "CXXFLAGS=-arch x86_64 -O3 -I../libffi/include", "CCASFLAGS=-arch x86_64", "LDFLAGS=-segprot POLY rwx rwx"), setup = "PATH=/usr/bin:/bin:/usr/sbin:/sbin", ldd_pattern = Some(("otool -L", """\s*(\S+lib(?:polyml|gmp).*dylib).*""".r))), "windows" -> Platform_Info( options = List("--host=x86_64-w64-mingw32", "CPPFLAGS=-I/mingw64/include", "--disable-windows-gui"), setup = """PATH=/usr/bin:/bin:/mingw64/bin export CONFIG_SITE=/mingw64/etc/config.site""", copy_files = List("$MSYS/mingw64/bin/libgcc_s_seh-1.dll", "$MSYS/mingw64/bin/libgmp-10.dll", "$MSYS/mingw64/bin/libstdc++-6.dll", "$MSYS/mingw64/bin/libwinpthread-1.dll"))) def build_polyml( root: Path, sha1_root: Option[Path] = None, progress: Progress = No_Progress, arch_64: Boolean = false, options: List[String] = Nil, msys_root: Option[Path] = None) { if (!((root + Path.explode("configure")).is_file && (root + Path.explode("PolyML")).is_dir)) error("Bad Poly/ML root directory: " + root) val platform_arch = if (arch_64) "x86_64" else "x86_64_32" val platform_os = Platform.os_name val platform = platform_arch + "-" + platform_os val platform_64 = "x86_64-" + platform_os val info = platform_info.get(platform_os) getOrElse error("Bad OS platform: " + quote(platform_os)) val settings = msys_root match { case None if Platform.is_windows => error("Windows requires specification of msys root directory") case None => Isabelle_System.settings() case Some(msys) => Isabelle_System.settings() + ("MSYS" -> msys.expand.implode) } if (Platform.is_linux && !Isabelle_System.bash("chrpath -v").ok) { error("""Missing "chrpath" executable on Linux""") } /* bash */ def bash( cwd: Path, script: String, redirect: Boolean = false, echo: Boolean = false): Process_Result = { val script1 = msys_root match { case None => script case 
Some(msys) => File.bash_path(msys + Path.explode("usr/bin/bash")) + " -c " + Bash.string(script) } progress.bash(script1, cwd = cwd.file, redirect = redirect, echo = echo) } /* configure and make */ val configure_options = List("--disable-shared", "--enable-intinf-as-int", "--with-gmp") ::: info.options ::: options ::: (if (arch_64) Nil else List("--enable-compact32bit")) bash(root, info.setup + "\n" + """ [ -f Makefile ] && make distclean { ./configure --prefix="$PWD/target" """ + Bash.strings(configure_options) + """ rm -rf target make compiler && make compiler && make install } || { echo "Build failed" >&2; exit 2; } """, redirect = true, echo = true).check val ldd_files = { info.ldd_pattern match { case Some((ldd, pattern)) => val lines = bash(root, ldd + " target/bin/poly").check.out_lines for { line <- lines; List(lib) <- pattern.unapplySeq(line) } yield lib case None => Nil } } /* sha1 library */ val sha1_files = if (sha1_root.isDefined) { val dir1 = sha1_root.get bash(dir1, "./build " + platform_64, redirect = true, echo = true).check val dir2 = dir1 + Path.explode(platform_64) File.read_dir(dir2).map(entry => dir2.implode + "/" + entry) } else Nil /* target */ val target = Path.explode(platform) Isabelle_System.rm_tree(target) Isabelle_System.mkdirs(target) for (file <- info.copy_files ::: ldd_files ::: sha1_files) File.copy(Path.explode(file).expand_env(settings), target) for { d <- List("target/bin", "target/lib") dir = root + Path.explode(d) entry <- File.read_dir(dir) } File.move(dir + Path.explode(entry), target) /* poly: library path */ if (Platform.is_linux) { bash(target, "chrpath -r '$ORIGIN' poly", echo = true).check } else if (Platform.is_macos) { for (file <- ldd_files) { bash(target, """install_name_tool -change """ + Bash.string(file) + " " + Bash.string("@executable_path/" + Path.explode(file).file_name) + " poly").check } } /* polyc: directory prefix */ { val polyc_path = target + Path.explode("polyc") val Header = "#! */bin/sh".r val polyc_patched = split_lines(File.read(polyc_path)) match { case Header() :: lines => val lines1 = lines.map(line => if (line.startsWith("prefix=")) "prefix=\"$(cd \"$(dirname \"$0\")\"; pwd)\"" else if (line.startsWith("BINDIR=")) "BINDIR=\"$prefix\"" else if (line.startsWith("LIBDIR=")) "LIBDIR=\"$prefix\"" else line) cat_lines("#!/usr/bin/env bash" ::lines1) case lines => error(cat_lines("Cannot patch polyc -- undetected header:" :: lines.take(3))) } File.write(polyc_path, polyc_patched) } } /** skeleton for component **/ private def extract_sources(source_archive: Path, component_dir: Path) = { if (source_archive.get_ext == "zip") { Isabelle_System.bash( "unzip -x " + File.bash_path(source_archive.absolute), cwd = component_dir.file).check } else { Isabelle_System.gnutar("-xzf " + File.bash_path(source_archive), dir = component_dir).check } val src_dir = component_dir + Path.explode("src") File.read_dir(component_dir) match { case List(s) => File.move(component_dir + Path.basic(s), src_dir) case _ => error("Source archive contains multiple directories") } val lines = split_lines(File.read(src_dir + Path.explode("RootX86.ML"))) val ml_files = for { line <- lines rest <- Library.try_unprefix("use", line) } yield "ML_file" + rest File.write(src_dir + Path.explode("ROOT.ML"), """(* Poly/ML Compiler root file. When this file is open in the Prover IDE, the ML files of the Poly/ML compiler can be explored interactively. This is a separate copy: it does not affect the running ML session. 
*) """ + ml_files.mkString("\n", "\n", "\n")) } def build_polyml_component( source_archive: Path, component_dir: Path, sha1_root: Option[Path] = None) { if (component_dir.is_file || component_dir.is_dir) error("Component directory already exists: " + component_dir) Isabelle_System.mkdirs(component_dir) extract_sources(source_archive, component_dir) File.copy(Path.explode("~~/Admin/polyml/README"), component_dir) val etc_dir = component_dir + Path.explode("etc") Isabelle_System.mkdirs(etc_dir) File.copy(Path.explode("~~/Admin/polyml/settings"), etc_dir) sha1_root match { case Some(dir) => Mercurial.repository(dir).archive(File.standard_path(component_dir + Path.explode("sha1"))) case None => } } /** Isabelle tool wrappers **/ val isabelle_tool1 = Isabelle_Tool("build_polyml", "build Poly/ML from sources", args => { - Command_Line.tool0 { + Command_Line.tool { var msys_root: Option[Path] = None var arch_64 = false var sha1_root: Option[Path] = None val getopts = Getopts(""" Usage: isabelle build_polyml [OPTIONS] ROOT [CONFIGURE_OPTIONS] Options are: -M DIR msys root directory (for Windows) -m ARCH processor architecture (32=x86_64_32, 64=x86_64, default: 32) -s DIR sha1 sources, see https://isabelle.sketis.net/repos/sha1 Build Poly/ML in the ROOT directory of its sources, with additional CONFIGURE_OPTIONS (e.g. --without-gmp). """, "M:" -> (arg => msys_root = Some(Path.explode(arg))), "m:" -> { case "32" | "x86_64_32" => arch_64 = false case "64" | "x86_64" => arch_64 = true case bad => error("Bad processor architecture: " + quote(bad)) }, "s:" -> (arg => sha1_root = Some(Path.explode(arg)))) val more_args = getopts(args) val (root, options) = more_args match { case root :: options => (Path.explode(root), options) case Nil => getopts.usage() } build_polyml(root, sha1_root = sha1_root, progress = new Console_Progress, arch_64 = arch_64, options = options, msys_root = msys_root) } }) val isabelle_tool2 = Isabelle_Tool("build_polyml_component", "make skeleton for Poly/ML component", args => { - Command_Line.tool0 { + Command_Line.tool { var sha1_root: Option[Path] = None val getopts = Getopts(""" Usage: isabelle build_polyml_component [OPTIONS] SOURCE_ARCHIVE COMPONENT_DIR Options are: -s DIR sha1 sources, see https://isabelle.sketis.net/repos/sha1 Make skeleton for Poly/ML component, based on official source archive (zip or tar.gz). """, "s:" -> (arg => sha1_root = Some(Path.explode(arg)))) val more_args = getopts(args) val (source_archive, component_dir) = more_args match { case List(archive, dir) => (Path.explode(archive), Path.explode(dir)) case _ => getopts.usage() } build_polyml_component(source_archive, component_dir, sha1_root = sha1_root) } }) } diff --git a/src/Pure/Admin/build_release.scala b/src/Pure/Admin/build_release.scala --- a/src/Pure/Admin/build_release.scala +++ b/src/Pure/Admin/build_release.scala @@ -1,848 +1,848 @@ /* Title: Pure/Admin/build_release.scala Author: Makarius Build full Isabelle distribution from repository. 
*/ package isabelle object Build_Release { /** release info **/ sealed case class Bundle_Info( platform: Platform.Family.Value, platform_description: String, name: String) { def path: Path = Path.explode(name) } class Release private[Build_Release]( progress: Progress, val date: Date, val dist_name: String, val dist_dir: Path, val dist_version: String, val ident: String) { val isabelle_dir: Path = dist_dir + Path.explode(dist_name) val isabelle_archive: Path = dist_dir + Path.explode(dist_name + ".tar.gz") val isabelle_library_archive: Path = dist_dir + Path.explode(dist_name + "_library.tar.gz") def other_isabelle(dir: Path): Other_Isabelle = Other_Isabelle(dir + Path.explode(dist_name), isabelle_identifier = dist_name + "-build", progress = progress) def bundle_info(platform: Platform.Family.Value): Bundle_Info = platform match { case Platform.Family.linux => Bundle_Info(platform, "Linux", dist_name + "_linux.tar.gz") case Platform.Family.macos => Bundle_Info(platform, "macOS", dist_name + "_macos.tar.gz") case Platform.Family.windows => Bundle_Info(platform, "Windows", dist_name + ".exe") } } /** generated content **/ /* patch release */ private def change_file(dir: Path, name: String, f: String => String) { val file = dir + Path.explode(name) File.write(file, f(File.read(file))) } private val getsettings_file: String = "lib/scripts/getsettings" private val ISABELLE_ID = """ISABELLE_ID="(.+)"""".r def patch_release(release: Release, is_official: Boolean) { val dir = release.isabelle_dir for (name <- List("src/Pure/System/distribution.ML", "src/Pure/System/distribution.scala")) { change_file(dir, name, s => s.replaceAllLiterally("val is_identified = false", "val is_identified = true") .replaceAllLiterally("val is_official = false", "val is_official = " + is_official)) } change_file(dir, getsettings_file, s => s.replaceAllLiterally("ISABELLE_ID=\"\"", "ISABELLE_ID=" + quote(release.ident)) .replaceAllLiterally("ISABELLE_IDENTIFIER=\"\"", "ISABELLE_IDENTIFIER=" + quote(release.dist_name))) change_file(dir, "lib/html/library_index_header.template", s => s.replaceAllLiterally("{ISABELLE}", release.dist_name)) for { name <- List( "src/Pure/System/distribution.ML", "src/Pure/System/distribution.scala", "lib/Tools/version") } { change_file(dir, name, s => s.replaceAllLiterally("repository version", release.dist_version)) } change_file(dir, "README", s => s.replaceAllLiterally("some repository version of Isabelle", release.dist_version)) } /* ANNOUNCE */ def make_announce(release: Release) { File.write(release.isabelle_dir + Path.explode("ANNOUNCE"), """ IMPORTANT NOTE ============== This is a snapshot of Isabelle/""" + release.ident + """ from the repository. 
""") } /* NEWS */ def make_news(other_isabelle: Other_Isabelle, dist_version: String) { val target = other_isabelle.isabelle_home + Path.explode("doc") val target_fonts = target + Path.explode("fonts") Isabelle_System.mkdirs(target_fonts) other_isabelle.copy_fonts(target_fonts) HTML.write_document(target, "NEWS.html", List(HTML.title("NEWS (" + dist_version + ")")), List( HTML.chapter("NEWS"), HTML.source( Symbol.decode(File.read(other_isabelle.isabelle_home + Path.explode("NEWS")))))) } /* bundled components */ class Bundled(platform: Option[Platform.Family.Value] = None) { def detect(s: String): Boolean = s.startsWith("#bundled") && !s.startsWith("#bundled ") def apply(name: String): String = "#bundled" + (platform match { case None => "" case Some(plat) => "-" + plat }) + ":" + name private val Pattern1 = ("""^#bundled:(.*)$""").r private val Pattern2 = ("""^#bundled-(.*):(.*)$""").r def unapply(s: String): Option[String] = s match { case Pattern1(name) => Some(name) case Pattern2(Platform.Family(plat), name) if platform == Some(plat) => Some(name) case _ => None } } def record_bundled_components(dir: Path) { val catalogs = List("main", "bundled").map((_, new Bundled())) ::: default_platform_families.flatMap(platform => List(platform.toString, "bundled-" + platform.toString). map((_, new Bundled(platform = Some(platform))))) File.append(Components.components(dir), terminate_lines("#bundled components" :: (for { (catalog, bundled) <- catalogs.iterator path = Components.admin(dir) + Path.basic(catalog) if path.is_file line <- split_lines(File.read(path)) if line.nonEmpty && !line.startsWith("#") && !line.startsWith("jedit_build") } yield bundled(line)).toList)) } def get_bundled_components(dir: Path, platform: Platform.Family.Value): (List[String], String) = { val Bundled = new Bundled(platform = Some(platform)) val components = for { Bundled(name) <- Components.read_components(dir) if !name.startsWith("jedit_build") } yield name val jdk_component = components.find(_.startsWith("jdk")) getOrElse error("Missing jdk component") (components, jdk_component) } def activate_components(dir: Path, platform: Platform.Family.Value, more_names: List[String]) { def contrib_name(name: String): String = Components.contrib(name = name).implode val Bundled = new Bundled(platform = Some(platform)) Components.write_components(dir, Components.read_components(dir).flatMap(line => line match { case Bundled(name) => if (Components.check_dir(Components.contrib(dir, name))) Some(contrib_name(name)) else None case _ => if (Bundled.detect(line)) None else Some(line) }) ::: more_names.map(contrib_name)) } def make_contrib(dir: Path) { Isabelle_System.mkdirs(Components.contrib(dir)) File.write(Components.contrib(dir, "README"), """This directory contains add-on components that contribute to the main Isabelle distribution. Separate licensing conditions apply, see each directory individually. 
""") } /** build release **/ private def execute(dir: Path, script: String): Unit = Isabelle_System.bash(script, cwd = dir.file).check private def execute_tar(dir: Path, args: String): Unit = Isabelle_System.gnutar(args, dir = dir).check /* build heaps on remote server */ private def remote_build_heaps( options: Options, platform: Platform.Family.Value, build_sessions: List[String], local_dir: Path) { val server_option = "build_release_server_" + platform.toString options.string(server_option) match { case SSH.Target(user, host) => using(SSH.open_session(options, host = host, user = user))(ssh => { Isabelle_System.with_tmp_file("tmp", "tar")(local_tmp_tar => { execute_tar(local_dir, "-cf " + File.bash_path(local_tmp_tar) + " .") ssh.with_tmp_dir(remote_dir => { val remote_tmp_tar = remote_dir + Path.basic("tmp.tar") ssh.write_file(remote_tmp_tar, local_tmp_tar) val remote_commands = List( "cd " + File.bash_path(remote_dir), "tar -xf tmp.tar", "./bin/isabelle build -o system_heaps -b -- " + Bash.strings(build_sessions), "tar -cf tmp.tar heaps") ssh.execute(remote_commands.mkString(" && ")).check ssh.read_file(remote_tmp_tar, local_tmp_tar) }) execute_tar(local_dir, "-xf " + File.bash_path(local_tmp_tar)) }) }) case s => error("Bad " + server_option + ": " + quote(s)) } } /* main */ private val default_platform_families: List[Platform.Family.Value] = List(Platform.Family.linux, Platform.Family.windows, Platform.Family.macos) def build_release(base_dir: Path, options: Options, components_base: Path = Components.default_components_base, progress: Progress = No_Progress, rev: String = "", afp_rev: String = "", official_release: Boolean = false, proper_release_name: Option[String] = None, platform_families: List[Platform.Family.Value] = default_platform_families, more_components: List[Path] = Nil, website: Option[Path] = None, build_sessions: List[String] = Nil, build_library: Boolean = false, parallel_jobs: Int = 1): Release = { val hg = Mercurial.repository(Path.explode("$ISABELLE_HOME")) val release = { val date = Date.now() val dist_name = proper_release_name getOrElse ("Isabelle_" + Date.Format.date(date)) val dist_dir = (base_dir + Path.explode("dist-" + dist_name)).absolute val version = proper_string(rev) orElse proper_release_name getOrElse "tip" val ident = try { hg.id(version) } catch { case ERROR(_) => error("Bad repository version: " + version) } val dist_version = proper_release_name match { case Some(name) => name + ": " + Date.Format("LLLL uuuu")(date) case None => "Isabelle repository snapshot " + ident + " " + Date.Format.date(date) } new Release(progress, date, dist_name, dist_dir, dist_version, ident) } /* make distribution */ if (release.isabelle_archive.is_file) { progress.echo_warning("Release archive already exists: " + release.isabelle_archive) val archive_ident = Isabelle_System.with_tmp_dir("build_release")(tmp_dir => { val getsettings = Path.explode(release.dist_name + "/" + getsettings_file) execute_tar(tmp_dir, "-xzf " + File.bash_path(release.isabelle_archive) + " " + File.bash_path(getsettings)) split_lines(File.read(tmp_dir + getsettings)) .collectFirst({ case ISABELLE_ID(ident) => ident }) .getOrElse(error("Failed to read ISABELLE_ID from " + release.isabelle_archive)) }) if (release.ident != archive_ident) { error("Mismatch of release identification " + release.ident + " vs. 
archive " + archive_ident) } } else { progress.echo_warning("Producing release archive " + release.isabelle_archive + " ...") Isabelle_System.mkdirs(release.dist_dir) if (release.isabelle_dir.is_dir) error("Directory " + release.isabelle_dir + " already exists") progress.echo_warning("Retrieving Mercurial repository version " + release.ident) hg.archive(release.isabelle_dir.expand.implode, rev = release.ident, options = "--type files") for (name <- List(".hg_archival.txt", ".hgtags", ".hgignore", "README_REPOSITORY")) { (release.isabelle_dir + Path.explode(name)).file.delete } progress.echo_warning("Preparing distribution " + quote(release.dist_name)) patch_release(release, proper_release_name.isDefined && official_release) if (proper_release_name.isEmpty) make_announce(release) make_contrib(release.isabelle_dir) execute(release.isabelle_dir, """find . -print | xargs chmod -f u+rw""") record_bundled_components(release.isabelle_dir) /* build tools and documentation */ val other_isabelle = release.other_isabelle(release.dist_dir) other_isabelle.init_settings( other_isabelle.init_components(components_base = components_base, catalogs = List("main"))) other_isabelle.resolve_components(echo = true) try { val export_classpath = "export CLASSPATH=" + Bash.string(other_isabelle.getenv("ISABELLE_CLASSPATH")) + "\n" other_isabelle.bash(export_classpath + "./Admin/build all", echo = true).check other_isabelle.bash(export_classpath + "./bin/isabelle jedit -b", echo = true).check } catch { case ERROR(_) => error("Failed to build tools") } try { other_isabelle.bash( "./bin/isabelle build_doc -a -o system_heaps -j " + parallel_jobs, echo = true).check } catch { case ERROR(_) => error("Failed to build documentation") } make_news(other_isabelle, release.dist_version) for (name <- List("Admin", "browser_info", "heaps")) { Isabelle_System.rm_tree(other_isabelle.isabelle_home + Path.explode(name)) } other_isabelle.cleanup() progress.echo_warning("Creating distribution archive " + release.isabelle_archive) def execute_dist_name(script: String): Unit = Isabelle_System.bash(script, cwd = release.dist_dir.file, env = Isabelle_System.settings() + ("DIST_NAME" -> release.dist_name)).check execute_dist_name(""" set -e chmod -R a+r "$DIST_NAME" chmod -R u+w "$DIST_NAME" chmod -R g=o "$DIST_NAME" find "$DIST_NAME" -type f "(" -name "*.thy" -o -name "*.ML" -o -name "*.scala" ")" -print | xargs chmod -f u-w """) execute_tar(release.dist_dir, "-czf " + File.bash_path(release.isabelle_archive) + " " + Bash.string(release.dist_name)) execute_dist_name(""" set -e mv "$DIST_NAME" "${DIST_NAME}-old" mkdir "$DIST_NAME" mv "${DIST_NAME}-old/README" "${DIST_NAME}-old/NEWS" "${DIST_NAME}-old/ANNOUNCE" \ "${DIST_NAME}-old/COPYRIGHT" "${DIST_NAME}-old/CONTRIBUTORS" "$DIST_NAME" mkdir "$DIST_NAME/doc" mv "${DIST_NAME}-old/doc/"*.pdf \ "${DIST_NAME}-old/doc/"*.html \ "${DIST_NAME}-old/doc/"*.css \ "${DIST_NAME}-old/doc/fonts" \ "${DIST_NAME}-old/doc/Contents" "$DIST_NAME/doc" rm -f Isabelle && ln -sf "$DIST_NAME" Isabelle rm -rf "${DIST_NAME}-old" """) } /* make application bundles */ val bundle_infos = platform_families.map(release.bundle_info) for (bundle_info <- bundle_infos) { val isabelle_name = release.dist_name val platform = bundle_info.platform progress.echo("\nApplication bundle for " + platform) Isabelle_System.with_tmp_dir("build_release")(tmp_dir => { // release archive execute_tar(tmp_dir, "-xzf " + File.bash_path(release.isabelle_archive)) val other_isabelle = release.other_isabelle(tmp_dir) val isabelle_target = 
other_isabelle.isabelle_home // bundled components progress.echo("Bundled components:") val contrib_dir = Components.contrib(isabelle_target) val (bundled_components, jdk_component) = get_bundled_components(isabelle_target, platform) Components.resolve(components_base, bundled_components, target_dir = Some(contrib_dir), copy_dir = Some(release.dist_dir + Path.explode("contrib")), progress = progress) val more_components_names = more_components.map(Components.unpack(contrib_dir, _, progress = progress)) Components.purge(contrib_dir, platform) activate_components(isabelle_target, platform, more_components_names) // Java parameters val java_options_title = "# Java runtime options" val java_options: List[String] = (for { variable <- List( "ISABELLE_JAVA_SYSTEM_OPTIONS", "JEDIT_JAVA_SYSTEM_OPTIONS", "JEDIT_JAVA_OPTIONS") opt <- Word.explode(other_isabelle.getenv(variable)) } yield opt) ::: List("-Disabelle.jedit_server=" + isabelle_name) val classpath: List[Path] = { val base = isabelle_target.absolute Path.split(other_isabelle.getenv("ISABELLE_CLASSPATH")).map(path => { val abs_path = path.absolute File.relative_path(base, abs_path) match { case Some(rel_path) => rel_path case None => error("Bad ISABELLE_CLASSPATH element: " + abs_path) } }) ::: List(Path.explode("src/Tools/jEdit/dist/jedit.jar")) } val jedit_options = Path.explode("src/Tools/jEdit/etc/options") val jedit_props = Path.explode("src/Tools/jEdit/dist/properties/jEdit.props") // build heaps if (build_sessions.nonEmpty) { progress.echo("Building heaps ...") remote_build_heaps(options, platform, build_sessions, isabelle_target) } // application bundling platform match { case Platform.Family.linux => File.write(isabelle_target + jedit_options, File.read(isabelle_target + jedit_options) .replaceAll("jedit_reset_font_size : int =.*", "jedit_reset_font_size : int = 24")) File.write(isabelle_target + jedit_props, File.read(isabelle_target + jedit_props) .replaceAll("console.fontsize=.*", "console.fontsize=18") .replaceAll("helpviewer.fontsize=.*", "helpviewer.fontsize=18") .replaceAll("metal.primary.fontsize=.*", "metal.primary.fontsize=18") .replaceAll("metal.secondary.fontsize=.*", "metal.secondary.fontsize=18") .replaceAll("view.fontsize=.*", "view.fontsize=24") .replaceAll("view.gutter.fontsize=.*", "view.gutter.fontsize=16")) File.write(isabelle_target + Path.explode("Isabelle.options"), terminate_lines(java_options_title :: java_options)) val isabelle_app = isabelle_target + Path.explode("lib/scripts/Isabelle_app") File.write(isabelle_app, File.read(Path.explode("~~/Admin/Linux/Isabelle_app")) .replaceAllLiterally("{CLASSPATH}", classpath.map("$ISABELLE_HOME/" + _).mkString(":")) .replaceAllLiterally("/jdk/", "/" + jdk_component + "/")) File.set_executable(isabelle_app, true) val linux_app = isabelle_target + Path.explode("contrib/linux_app") File.move(linux_app + Path.explode("Isabelle"), isabelle_target + Path.explode(isabelle_name)) Isabelle_System.rm_tree(linux_app) val archive_name = isabelle_name + "_linux.tar.gz" progress.echo("Packaging " + archive_name + " ...") execute_tar(tmp_dir, "-czf " + File.bash_path(release.dist_dir + Path.explode(archive_name)) + " " + Bash.string(isabelle_name)) case Platform.Family.macos => File.write(isabelle_target + jedit_props, File.read(isabelle_target + jedit_props) .replaceAll("lookAndFeel=.*", "lookAndFeel=com.apple.laf.AquaLookAndFeel") .replaceAll("delete-line.shortcut=.*", "delete-line.shortcut=C+d") .replaceAll("delete.shortcut2=.*", "delete.shortcut2=A+d") 
.replaceAll("plugin-blacklist.MacOSX.jar=true", "plugin-blacklist.MacOSX.jar=")) // MacOS application bundle File.move(isabelle_target + Path.explode("contrib/macos_app"), tmp_dir) val isabelle_app = Path.explode(isabelle_name + ".app") val app_dir = tmp_dir + isabelle_app File.move(tmp_dir + Path.explode("macos_app/Isabelle.app"), app_dir) val app_contents = app_dir + Path.explode("Contents") val app_resources = app_contents + Path.explode("Resources") File.move(tmp_dir + Path.explode(isabelle_name), app_resources) File.write(app_contents + Path.explode("Info.plist"), File.read(Path.explode("~~/Admin/MacOS/Info.plist")) .replaceAllLiterally("{ISABELLE_NAME}", isabelle_name) .replaceAllLiterally("{JAVA_OPTIONS}", terminate_lines(java_options.map(opt => "" + opt + "")))) for (cp <- classpath) { File.link( Path.explode("../Resources/" + isabelle_name + "/") + cp, app_contents + Path.explode("Java"), force = true) } File.link( Path.explode("../Resources/" + isabelle_name + "/contrib/" + jdk_component + "/x86_64-darwin"), app_contents + Path.explode("PlugIns/bundled.jdk"), force = true) File.link( Path.explode("../../Info.plist"), app_resources + Path.explode(isabelle_name + "/" + isabelle_name + ".plist"), force = true) File.link( Path.explode("Contents/Resources/" + isabelle_name), app_dir + Path.explode("Isabelle"), force = true) // application archive val archive_name = isabelle_name + "_macos.tar.gz" progress.echo("Packaging " + archive_name + " ...") execute_tar(tmp_dir, "-czf " + File.bash_path(release.dist_dir + Path.explode(archive_name)) + " " + File.bash_path(isabelle_app)) case Platform.Family.windows => File.write(isabelle_target + jedit_props, File.read(isabelle_target + jedit_props) .replaceAll("lookAndFeel=.*", "lookAndFeel=com.sun.java.swing.plaf.windows.WindowsLookAndFeel") .replaceAll("foldPainter=.*", "foldPainter=Square")) // application launcher File.move(isabelle_target + Path.explode("contrib/windows_app"), tmp_dir) val app_template = Path.explode("~~/Admin/Windows/launch4j") File.write(isabelle_target + Path.explode(isabelle_name + ".l4j.ini"), (java_options_title :: java_options).map(_ + "\r\n").mkString) val isabelle_xml = Path.explode("isabelle.xml") val isabelle_exe = Path.explode(isabelle_name + ".exe") File.write(tmp_dir + isabelle_xml, File.read(app_template + isabelle_xml) .replaceAllLiterally("{ISABELLE_NAME}", isabelle_name) .replaceAllLiterally("{OUTFILE}", File.platform_path(isabelle_target + isabelle_exe)) .replaceAllLiterally("{ICON}", File.platform_path(app_template + Path.explode("isabelle_transparent.ico"))) .replaceAllLiterally("{SPLASH}", File.platform_path(app_template + Path.explode("isabelle.bmp"))) .replaceAllLiterally("{CLASSPATH}", cat_lines(classpath.map(cp => " %EXEDIR%\\" + File.platform_path(cp).replace('/', '\\') + ""))) .replaceAllLiterally("\\jdk\\", "\\" + jdk_component + "\\")) execute(tmp_dir, "\"windows_app/launch4j-${ISABELLE_PLATFORM_FAMILY}/launch4j\" isabelle.xml") File.copy(app_template + Path.explode("manifest.xml"), isabelle_target + isabelle_exe.ext("manifest")) // Cygwin setup val cygwin_template = Path.explode("~~/Admin/Windows/Cygwin") File.copy(cygwin_template + Path.explode("Cygwin-Terminal.bat"), isabelle_target) val cygwin_mirror = File.read(isabelle_target + Path.explode("contrib/cygwin/isabelle/cygwin_mirror")) val cygwin_bat = Path.explode("Cygwin-Setup.bat") File.write(isabelle_target + cygwin_bat, File.read(cygwin_template + cygwin_bat) .replaceAllLiterally("{MIRROR}", cygwin_mirror)) 
File.set_executable(isabelle_target + cygwin_bat, true) for (name <- List("isabelle/postinstall", "isabelle/rebaseall")) { val path = Path.explode(name) File.copy(cygwin_template + path, isabelle_target + Path.explode("contrib/cygwin") + path) } execute(isabelle_target, """find . -type f -not -name "*.exe" -not -name "*.dll" """ + (if (Platform.is_macos) "-perm +100" else "-executable") + " -print0 > contrib/cygwin/isabelle/executables") execute(isabelle_target, """find . -type l -exec echo "{}" ";" -exec readlink "{}" ";" """ + """> contrib/cygwin/isabelle/symlinks""") execute(isabelle_target, """find . -type l -exec rm "{}" ";" """) File.write(isabelle_target + Path.explode("contrib/cygwin/isabelle/uninitialized"), "") // executable archive (self-extracting 7z) val archive_name = isabelle_name + ".7z" val exe_archive = tmp_dir + Path.explode(archive_name) exe_archive.file.delete progress.echo("Packaging " + archive_name + " ...") execute(tmp_dir, "7z -y -bd a " + File.bash_path(exe_archive) + " " + Bash.string(isabelle_name)) if (!exe_archive.is_file) error("Failed to create archive: " + exe_archive) val sfx_exe = tmp_dir + Path.explode("windows_app/7zsd_All_x64.sfx") val sfx_txt = File.read(Path.explode("~~/Admin/Windows/Installer/sfx.txt")). replaceAllLiterally("{ISABELLE_NAME}", isabelle_name) Bytes.write(release.dist_dir + isabelle_exe, Bytes.read(sfx_exe) + Bytes(sfx_txt) + Bytes.read(exe_archive)) File.set_executable(release.dist_dir + isabelle_exe, true) } }) progress.echo("DONE") } /* minimal website */ for (dir <- website) { val website_platform_bundles = for { bundle_info <- bundle_infos if (release.dist_dir + bundle_info.path).is_file } yield (bundle_info.name, bundle_info) val isabelle_link = HTML.link(Isabelle_Cronjob.isabelle_repos_source + "/rev/" + release.ident, HTML.text("Isabelle/" + release.ident)) val afp_link = HTML.link(AFP.repos_source + "/rev/" + afp_rev, HTML.text("AFP/" + afp_rev)) HTML.write_document(dir, "index.html", List(HTML.title(release.dist_name)), List( HTML.section(release.dist_name + " (" + release.ident + ")"), HTML.subsection("Platforms"), HTML.itemize( website_platform_bundles.map({ case (bundle, bundle_info) => List(HTML.link(bundle, HTML.text(bundle_info.platform_description))) })), HTML.subsection("Repositories"), HTML.itemize( List(List(isabelle_link)) ::: (if (afp_rev == "") Nil else List(List(afp_link)))))) for ((bundle, _) <- website_platform_bundles) File.copy(release.dist_dir + Path.explode(bundle), dir) } /* HTML library */ if (build_library) { if (release.isabelle_library_archive.is_file) { progress.echo_warning("Library archive already exists: " + release.isabelle_library_archive) } else { Isabelle_System.with_tmp_dir("build_release")(tmp_dir => { val bundle = release.dist_dir + Path.explode(release.dist_name + "_" + Platform.family + ".tar.gz") execute_tar(tmp_dir, "-xzf " + File.bash_path(bundle)) val other_isabelle = release.other_isabelle(tmp_dir) Isabelle_System.mkdirs(other_isabelle.etc) File.write(other_isabelle.etc_preferences, "ML_system_64 = true\n") other_isabelle.bash("bin/isabelle build -f -j " + parallel_jobs + " -o browser_info -o document=pdf -o document_variants=document:outline=/proof,/ML" + " -o system_heaps -c -a -d '~~/src/Benchmarks'", echo = true).check other_isabelle.isabelle_home_user.file.delete execute(tmp_dir, "chmod -R a+r " + Bash.string(release.dist_name)) execute(tmp_dir, "chmod -R g=o " + Bash.string(release.dist_name)) execute_tar(tmp_dir, "-czf " + File.bash_path(release.isabelle_library_archive) + " 
" + Bash.string(release.dist_name + "/browser_info")) }) } } release } /** command line entry point **/ def main(args: Array[String]) { - Command_Line.tool0 { + Command_Line.tool { var afp_rev = "" var components_base: Path = Components.default_components_base var official_release = false var proper_release_name: Option[String] = None var website: Option[Path] = None var build_sessions: List[String] = Nil var more_components: List[Path] = Nil var parallel_jobs = 1 var build_library = false var options = Options.init() var platform_families = default_platform_families var rev = "" val getopts = Getopts(""" Usage: Admin/build_release [OPTIONS] BASE_DIR Options are: -A REV corresponding AFP changeset id -C DIR base directory for Isabelle components (default: """ + Components.default_components_base + """) -O official release (not release-candidate) -R RELEASE proper release with name -W WEBSITE produce minimal website in given directory -b SESSIONS build platform-specific session images (separated by commas) -c ARCHIVE clean bundling with additional component .tar.gz archive -j INT maximum number of parallel jobs (default 1) -l build library -o OPTION override Isabelle system OPTION (via NAME=VAL or NAME) -p NAMES platform families (default: """ + default_platform_families.mkString(",") + """) -r REV Mercurial changeset id (default: RELEASE or tip) Build Isabelle release in base directory, using the local repository clone. """, "A:" -> (arg => afp_rev = arg), "C:" -> (arg => components_base = Path.explode(arg)), "O" -> (_ => official_release = true), "R:" -> (arg => proper_release_name = Some(arg)), "W:" -> (arg => website = Some(Path.explode(arg))), "b:" -> (arg => build_sessions = space_explode(',', arg)), "c:" -> (arg => { val path = Path.explode(arg) Components.Archive.get_name(path.file_name) more_components = more_components ::: List(path) }), "j:" -> (arg => parallel_jobs = Value.Int.parse(arg)), "l" -> (_ => build_library = true), "o:" -> (arg => options = options + arg), "p:" -> (arg => platform_families = space_explode(',', arg).map(Platform.Family.parse)), "r:" -> (arg => rev = arg)) val more_args = getopts(args) val base_dir = more_args match { case List(base_dir) => base_dir case _ => getopts.usage() } val progress = new Console_Progress() if (platform_families.contains(Platform.Family.windows) && !Isabelle_System.bash("7z i").ok) error("Building for windows requires 7z") build_release(Path.explode(base_dir), options, components_base = components_base, progress = progress, rev = rev, afp_rev = afp_rev, official_release = official_release, proper_release_name = proper_release_name, website = website, platform_families = if (platform_families.isEmpty) default_platform_families else platform_families, more_components = more_components, build_sessions = build_sessions, build_library = build_library, parallel_jobs = parallel_jobs) } } } diff --git a/src/Pure/Admin/isabelle_cronjob.scala b/src/Pure/Admin/isabelle_cronjob.scala --- a/src/Pure/Admin/isabelle_cronjob.scala +++ b/src/Pure/Admin/isabelle_cronjob.scala @@ -1,608 +1,608 @@ /* Title: Pure/Admin/isabelle_cronjob.scala Author: Makarius Main entry point for administrative cronjob at TUM. 
*/ package isabelle import java.nio.file.Files import scala.annotation.tailrec import scala.collection.mutable object Isabelle_Cronjob { /* global resources: owned by main cronjob */ val backup = "lxbroy10:cronjob" val main_dir: Path = Path.explode("~/cronjob") val main_state_file: Path = main_dir + Path.explode("run/main.state") val current_log: Path = main_dir + Path.explode("run/main.log") // owned by log service val cumulative_log: Path = main_dir + Path.explode("log/main.log") // owned by log service val isabelle_repos_source = "https://isabelle.sketis.net/repos/isabelle" val isabelle_repos: Path = main_dir + Path.explode("isabelle") val afp_repos: Path = main_dir + Path.explode("AFP") val build_log_dirs = List(Path.explode("~/log"), Path.explode("~/afp/log"), Path.explode("~/cronjob/log")) /** logger tasks **/ sealed case class Logger_Task(name: String = "", body: Logger => Unit) /* init and exit */ def get_rev(): String = Mercurial.repository(isabelle_repos).id() def get_afp_rev(): String = Mercurial.repository(afp_repos).id() val init: Logger_Task = Logger_Task("init", logger => { Isabelle_Devel.make_index() Mercurial.setup_repository(isabelle_repos_source, isabelle_repos) Mercurial.setup_repository(AFP.repos_source, afp_repos) File.write(logger.log_dir + Build_Log.log_filename("isabelle_identify", logger.start_date), Build_Log.Identify.content(logger.start_date, Some(get_rev()), Some(get_afp_rev()))) Isabelle_System.bash( """rsync -a --include="*/" --include="plain_identify*" --exclude="*" """ + Bash.string(backup + "/log/.") + " " + File.bash_path(main_dir) + "/log/.").check if (!Isabelle_Devel.cronjob_log.is_file) Files.createSymbolicLink(Isabelle_Devel.cronjob_log.file.toPath, current_log.file.toPath) }) val exit: Logger_Task = Logger_Task("exit", logger => { Isabelle_System.bash( "rsync -a " + File.bash_path(main_dir) + "/log/." + " " + Bash.string(backup) + "/log/.") .check }) /* build release */ val build_release: Logger_Task = Logger_Task("build_release", logger => { Isabelle_Devel.release_snapshot(logger.options, rev = get_rev(), afp_rev = get_afp_rev()) }) /* integrity test of build_history vs. 
build_history_base */ val build_history_base: Logger_Task = Logger_Task("build_history_base", logger => { using(logger.ssh_context.open_session("lxbroy10"))(ssh => { val results = Build_History.remote_build_history(ssh, isabelle_repos, isabelle_repos.ext("build_history_base"), isabelle_identifier = "cronjob_build_history", self_update = true, rev = "build_history_base", options = "-f", args = "HOL") for ((log_name, bytes) <- results) { Bytes.write(logger.log_dir + Path.explode(log_name), bytes) } }) }) /* remote build_history */ sealed case class Item( known: Boolean, isabelle_version: String, afp_version: Option[String], pull_date: Date) { def unknown: Boolean = !known def versions: (String, Option[String]) = (isabelle_version, afp_version) def known_versions(rev: String, afp_rev: Option[String]): Boolean = known && rev != "" && isabelle_version == rev && (afp_rev.isEmpty || afp_rev.get != "" && afp_version == afp_rev) } def recent_items(db: SQL.Database, days: Int, rev: String, afp_rev: Option[String], sql: SQL.Source): List[Item] = { val afp = afp_rev.isDefined val select = Build_Log.Data.select_recent_versions( days = days, rev = rev, afp_rev = afp_rev, sql = "WHERE " + sql) db.using_statement(select)(stmt => stmt.execute_query().iterator(res => { val known = res.bool(Build_Log.Data.known) val isabelle_version = res.string(Build_Log.Prop.isabelle_version) val afp_version = if (afp) proper_string(res.string(Build_Log.Prop.afp_version)) else None val pull_date = res.date(Build_Log.Data.pull_date(afp)) Item(known, isabelle_version, afp_version, pull_date) }).toList) } def unknown_runs(items: List[Item]): List[List[Item]] = { val (run, rest) = Library.take_prefix[Item](_.unknown, items.dropWhile(_.known)) if (run.nonEmpty) run :: unknown_runs(rest) else Nil } sealed case class Remote_Build( description: String, host: String, actual_host: String = "", user: String = "", port: Int = 0, proxy_host: String = "", proxy_user: String = "", proxy_port: Int = 0, self_update: Boolean = false, historic: Boolean = false, history: Int = 0, history_base: String = "build_history_base", options: String = "", args: String = "", afp: Boolean = false, bulky: Boolean = false, more_hosts: List[String] = Nil, detect: SQL.Source = "", active: Boolean = true) { def ssh_session(context: SSH.Context): SSH.Session = context.open_session(host = host, user = user, port = port, actual_host = actual_host, proxy_host = proxy_host, proxy_user = proxy_user, proxy_port = proxy_port, permissive = proxy_host.nonEmpty) def sql: SQL.Source = Build_Log.Prop.build_engine + " = " + SQL.string(Build_History.engine) + " AND " + SQL.member(Build_Log.Prop.build_host.ident, host :: more_hosts) + (if (detect == "") "" else " AND " + SQL.enclose(detect)) def profile: Build_Status.Profile = Build_Status.Profile(description, history = history, afp = afp, bulky = bulky, sql = sql) def pick( options: Options, rev: String = "", filter: Item => Boolean = _ => true): Option[(String, Option[String])] = { val afp_rev = if (afp) Some(get_afp_rev()) else None val store = Build_Log.store(options) using(store.open_database())(db => { def pick_days(days: Int, gap: Int): Option[(String, Option[String])] = { val items = recent_items(db, days, rev, afp_rev, sql).filter(filter) def runs = unknown_runs(items).filter(run => run.length >= gap) if (historic || items.exists(_.known_versions(rev, afp_rev))) { val longest_run = (List.empty[Item] /: runs)({ case (item1, item2) => if (item1.length >= item2.length) item1 else item2 }) if (longest_run.isEmpty) None 
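            // report the version in the middle of the longest run of unknown builds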
else Some(longest_run(longest_run.length / 2).versions) } else if (rev != "") Some((rev, afp_rev)) else runs.flatten.headOption.map(_.versions) } pick_days(options.int("build_log_history") max history, 2) orElse pick_days(200, 5) orElse pick_days(2000, 1) }) } } val remote_builds_old: List[Remote_Build] = List( Remote_Build("AFP bulky", "lrzcloud1", self_update = true, proxy_host = "lxbroy10", proxy_user = "i21isatest", options = "-m64 -M6 -U30000 -s10 -t AFP", args = "-g large -g slow", afp = true, bulky = true, detect = Build_Log.Prop.build_tags + " = " + SQL.string("AFP")), Remote_Build("AFP", "lxbroy7", args = "-N -X large -X slow", afp = true, detect = Build_Log.Prop.build_tags + " = " + SQL.string("AFP")), Remote_Build("Poly/ML 5.7 Linux", "lxbroy8", history_base = "37074e22e8be", options = "-m32 -B -M1x2,2 -t polyml-5.7 -i 'init_component /home/isabelle/contrib/polyml-5.7'", args = "-N -g timing", detect = Build_Log.Prop.build_tags + " = " + SQL.string("polyml-5.7") + " AND " + Build_Log.Settings.ML_OPTIONS + " <> " + SQL.string("-H 500")), Remote_Build("Poly/ML 5.7.1 Linux", "lxbroy8", history_base = "a9d5b59c3e12", options = "-m32 -B -M1x2,2 -t polyml-5.7.1-pre2 -i 'init_component /home/isabelle/contrib/polyml-test-905dae2ebfda'", args = "-N -g timing", detect = Build_Log.Prop.build_tags + " = " + SQL.string("polyml-5.7.1-pre1") + " OR " + Build_Log.Prop.build_tags + " = " + SQL.string("polyml-5.7.1-pre2")), Remote_Build("Poly/ML 5.7 Mac OS X", "macbroy2", history_base = "37074e22e8be", options = "-m32 -B -M1x4,4 -t polyml-5.7 -i 'init_component /home/isabelle/contrib/polyml-5.7'", args = "-a", detect = Build_Log.Prop.build_tags + " = " + SQL.string("polyml-5.7")), Remote_Build("Poly/ML 5.7.1 Mac OS X", "macbroy2", history_base = "a9d5b59c3e12", options = "-m32 -B -M1x4,4 -t polyml-5.7.1-pre2 -i 'init_component /home/isabelle/contrib/polyml-test-905dae2ebfda'", args = "-a", detect = Build_Log.Prop.build_tags + " = " + SQL.string("polyml-5.7.1-pre1") + " OR " + Build_Log.Prop.build_tags + " = " + SQL.string("polyml-5.7.1-pre2")), Remote_Build("Poly/ML test", "lxbroy8", options = "-m32 -B -M1x2,2 -t polyml-test -i 'init_component /home/isabelle/contrib/polyml-5.7-20170217'", args = "-N -g timing", detect = Build_Log.Prop.build_tags + " = " + SQL.string("polyml-test")), Remote_Build("Mac OS X 10.8 Mountain Lion", "macbroy30", options = "-m32 -M2", args = "-a", detect = Build_Log.Prop.build_start + " < date '2017-03-03'")) val remote_builds1: List[List[Remote_Build]] = { List( List(Remote_Build("Linux A", "lxbroy9", options = "-m32 -B -M1x2,2", args = "-N -g timing")), List(Remote_Build("Linux B", "lxbroy10", historic = true, history = 90, options = "-m32 -B -M1x4,2,4,6", args = "-N -g timing")), List(Remote_Build("Linux Benchmarks", "lxbroy5", historic = true, history = 90, options = "-m32 -B -M1x2,2 -t Benchmarks" + " -e ISABELLE_GHC=ghc -e ISABELLE_MLTON=mlton -e ISABELLE_OCAML=ocaml" + " -e ISABELLE_OCAMLC=ocamlc -e ISABELLE_OCAMLFIND=ocamlfind -e ISABELLE_SMLNJ=sml" + " -e ISABELLE_SWIPL=swipl", args = "-N -a -d '~~/src/Benchmarks'", detect = Build_Log.Prop.build_tags + " = " + SQL.string("Benchmarks"))), List( Remote_Build("Mac OS X", "macbroy2", options = "-m32 -M8" + " -e ISABELLE_GHC=ghc -e ISABELLE_MLTON=mlton -e ISABELLE_OCAML=ocaml" + " -e ISABELLE_OCAMLC=ocamlc -e ISABELLE_OCAML_SETUP=true" + " -e ISABELLE_OPAM_ROOT=\"$ISABELLE_HOME/opam\"" + " -e ISABELLE_SMLNJ=/home/isabelle/smlnj/110.85/bin/sml", args = "-a", detect = Build_Log.Prop.build_tags.undefined, history_base = 
"2c0f24e927dd"), Remote_Build("Mac OS X, quick_and_dirty", "macbroy2", options = "-m32 -M8 -t quick_and_dirty", args = "-a -o quick_and_dirty", detect = Build_Log.Prop.build_tags + " = " + SQL.string("quick_and_dirty"), history_base = "2c0f24e927dd"), Remote_Build("Mac OS X, skip_proofs", "macbroy2", options = "-m32 -M8 -t skip_proofs", args = "-a -o skip_proofs", detect = Build_Log.Prop.build_tags + " = " + SQL.string("skip_proofs"), history_base = "2c0f24e927dd")), List( Remote_Build("Mac OS X 10.12 Sierra", "macbroy30", options = "-m32 -M2", args = "-a", detect = Build_Log.Prop.build_start + " > date '2017-03-03'")), List(Remote_Build("Mac OS X 10.10 Yosemite", "macbroy31", options = "-m32 -M2", args = "-a")), List(Remote_Build("Mac OS X 10.13 High Sierra", "lapbroy68", options = "-m32 -M1,2,4 -e ISABELLE_GHC_SETUP=true", self_update = true, args = "-a -d '~~/src/Benchmarks'")), List(Remote_Build("Mac OS X 10.14 Mojave", "lapnipkow3", options = "-m32 -M1,2 -e ISABELLE_GHC_SETUP=true", self_update = true, args = "-a -d '~~/src/Benchmarks'")), List(Remote_Build("Mac OS X 10.15 Catalina", "laramac01", user = "makarius", proxy_host = "laraserver", proxy_user = "makarius", self_update = true, options = "-m32 -M4 -e ISABELLE_GHC_SETUP=true", args = "-a -d '~~/src/Benchmarks'")), List( Remote_Build("Windows", "vmnipkow9", historic = true, history = 90, self_update = true, options = "-m32 -M4" + " -e ISABELLE_OCAML=ocaml -e ISABELLE_OCAMLC=ocamlc -e ISABELLE_OCAML_SETUP=true" + " -e ISABELLE_GHC_SETUP=true" + " -e ISABELLE_SMLNJ=/usr/local/smlnj-110.81/bin/sml", args = "-a", detect = Build_Log.Settings.ML_PLATFORM + " = " + SQL.string("x86-windows")), Remote_Build("Windows", "vmnipkow9", historic = true, history = 90, self_update = true, options = "-m64 -M4" + " -e ISABELLE_OCAML=ocaml -e ISABELLE_OCAMLC=ocamlc -e ISABELLE_OCAML_SETUP=true" + " -e ISABELLE_GHC_SETUP=true" + " -e ISABELLE_SMLNJ=/usr/local/smlnj-110.81/bin/sml", args = "-a", detect = Build_Log.Settings.ML_PLATFORM + " = " + SQL.string("x86_64-windows"))) ) ::: { for { (n, hosts) <- List(1 -> List("lxbroy6"), 2 -> List("lxbroy8", "lxbroy5")) } yield { List(Remote_Build("AFP", host = hosts.head, more_hosts = hosts.tail, options = "-m32 -M1x2 -t AFP -P" + n + " -e ISABELLE_GHC=ghc" + " -e ISABELLE_MLTON=mlton" + " -e ISABELLE_OCAML=ocaml -e ISABELLE_OCAMLC=ocamlc -e ISABELLE_OCAMLFIND=ocamlfind" + " -e ISABELLE_SMLNJ=sml", args = "-N -X large -X slow", afp = true, detect = Build_Log.Prop.build_tags + " = " + SQL.string("AFP"))) } } } val remote_builds2: List[List[Remote_Build]] = List( List( Remote_Build("AFP2", "lrzcloud2", actual_host = "10.195.2.10", self_update = true, proxy_host = "lxbroy10", proxy_user = "i21isatest", options = "-m32 -M1x8 -t AFP" + " -e ISABELLE_GHC=ghc" + " -e ISABELLE_MLTON=mlton" + " -e ISABELLE_OCAML=ocaml -e ISABELLE_OCAMLC=ocamlc -e ISABELLE_OCAMLFIND=ocamlfind" + " -e ISABELLE_SMLNJ=sml", args = "-a -X large -X slow", afp = true, detect = Build_Log.Prop.build_tags + " = " + SQL.string("AFP")), Remote_Build("AFP bulky2", "lrzcloud2", actual_host = "10.195.2.10", self_update = true, proxy_host = "lxbroy10", proxy_user = "i21isatest", options = "-m64 -M8 -U30000 -s10 -t AFP", args = "-g large -g slow", afp = true, bulky = true, detect = Build_Log.Prop.build_tags + " = " + SQL.string("AFP")))) def remote_build_history(rev: String, afp_rev: Option[String], i: Int, r: Remote_Build) : Logger_Task = { val task_name = "build_history-" + r.host Logger_Task(task_name, logger => { 
using(r.ssh_session(logger.ssh_context))(ssh => { val results = Build_History.remote_build_history(ssh, isabelle_repos, isabelle_repos.ext(r.host), isabelle_identifier = "cronjob_build_history", self_update = r.self_update, rev = rev, afp_rev = afp_rev, options = " -N " + Bash.string(task_name) + (if (i < 0) "" else "_" + (i + 1).toString) + " -f -h " + Bash.string(r.host) + " " + r.options, args = "-o timeout=10800 " + r.args) for ((log_name, bytes) <- results) { logger.log(Date.now(), log_name) Bytes.write(logger.log_dir + Path.explode(log_name), bytes) } }) }) } val build_status_profiles: List[Build_Status.Profile] = (remote_builds_old :: remote_builds1 ::: remote_builds2).flatten.map(_.profile) /** task logging **/ object Log_Service { def apply(options: Options, progress: Progress = No_Progress): Log_Service = new Log_Service(SSH.init_context(options), progress) } class Log_Service private(val ssh_context: SSH.Context, progress: Progress) { current_log.file.delete private val thread: Consumer_Thread[String] = Consumer_Thread.fork("cronjob: logger", daemon = true)( consume = (text: String) => { // critical File.append(current_log, text + "\n") File.append(cumulative_log, text + "\n") progress.echo(text) true }) def shutdown() { thread.shutdown() } val hostname: String = Isabelle_System.hostname() def log(date: Date, task_name: String, msg: String): Unit = if (task_name != "") thread.send( "[" + Build_Log.print_date(date) + ", " + hostname + ", " + task_name + "]: " + msg) def start_logger(start_date: Date, task_name: String): Logger = new Logger(this, start_date, task_name) def run_task(start_date: Date, task: Logger_Task) { val logger = start_logger(start_date, task.name) val res = Exn.capture { task.body(logger) } val end_date = Date.now() val err = res match { case Exn.Res(_) => None case Exn.Exn(exn) => Output.writeln("Exception trace for " + quote(task.name) + ":") exn.printStackTrace() val first_line = split_lines(Exn.message(exn)).headOption getOrElse "exception" Some(first_line) } logger.log_end(end_date, err) } def fork_task(start_date: Date, task: Logger_Task): Task = new Task(task.name, run_task(start_date, task)) } class Logger private[Isabelle_Cronjob]( val log_service: Log_Service, val start_date: Date, val task_name: String) { def ssh_context: SSH.Context = log_service.ssh_context def options: Options = ssh_context.options def log(date: Date, msg: String): Unit = log_service.log(date, task_name, msg) def log_end(end_date: Date, err: Option[String]) { val elapsed_time = end_date.time - start_date.time val msg = (if (err.isEmpty) "finished" else "ERROR " + err.get) + (if (elapsed_time.seconds < 3.0) "" else " (" + elapsed_time.message_hms + " elapsed time)") log(end_date, msg) } val log_dir: Path = main_dir + Build_Log.log_subdir(start_date) Isabelle_System.mkdirs(log_dir) log(start_date, "started") } class Task private[Isabelle_Cronjob](name: String, body: => Unit) { private val future: Future[Unit] = Future.thread("cronjob: " + name) { body } def is_finished: Boolean = future.is_finished } /** cronjob **/ def cronjob(progress: Progress, exclude_task: Set[String]) { /* soft lock */ val still_running = try { Some(File.read(main_state_file)) } catch { case ERROR(_) => None } still_running match { case None | Some("") => case Some(running) => error("Isabelle cronjob appears to be still running: " + running) } /* log service */ val log_service = Log_Service(Options.init(), progress = progress) def run(start_date: Date, task: Logger_Task) { log_service.run_task(start_date, 
task) } def run_now(task: Logger_Task) { run(Date.now(), task) } /* structured tasks */ def SEQ(tasks: List[Logger_Task]): Logger_Task = Logger_Task(body = _ => for (task <- tasks.iterator if !exclude_task(task.name) || task.name == "") run_now(task)) def PAR(tasks: List[Logger_Task]): Logger_Task = Logger_Task(body = _ => { @tailrec def join(running: List[Task]) { running.partition(_.is_finished) match { case (Nil, Nil) => case (Nil, _ :: _) => Thread.sleep(500); join(running) case (_ :: _, remaining) => join(remaining) } } val start_date = Date.now() val running = for (task <- tasks if !exclude_task(task.name)) yield log_service.fork_task(start_date, task) join(running) }) /* repository structure */ val hg = Mercurial.repository(isabelle_repos) val hg_graph = hg.graph() def history_base_filter(r: Remote_Build): Item => Boolean = { val base_rev = hg.id(r.history_base) val nodes = hg_graph.all_succs(List(base_rev)).toSet (item: Item) => nodes(item.isabelle_version) } /* main */ val main_start_date = Date.now() File.write(main_state_file, main_start_date + " " + log_service.hostname) run(main_start_date, Logger_Task("isabelle_cronjob", logger => run_now( SEQ(List( init, build_history_base, build_release, PAR( List(remote_builds1, remote_builds2).map(remote_builds => SEQ(List( PAR(remote_builds.map(_.filter(_.active)).map(seq => SEQ( for { (r, i) <- (if (seq.length <= 1) seq.map((_, -1)) else seq.zipWithIndex) (rev, afp_rev) <- r.pick(logger.options, hg.id(), history_base_filter(r)) } yield remote_build_history(rev, afp_rev, i, r)))), Logger_Task("jenkins_logs", _ => Jenkins.download_logs(logger.options, Jenkins.build_log_jobs, main_dir)), Logger_Task("build_log_database", logger => Isabelle_Devel.build_log_database(logger.options, build_log_dirs)), Logger_Task("build_status", logger => Isabelle_Devel.build_status(logger.options)))))), exit))))) log_service.shutdown() main_state_file.file.delete } /** command line entry point **/ def main(args: Array[String]) { - Command_Line.tool0 { + Command_Line.tool { var force = false var verbose = false var exclude_task = Set.empty[String] val getopts = Getopts(""" Usage: Admin/cronjob/main [OPTIONS] Options are: -f apply force to do anything -v verbose -x NAME exclude tasks with this name """, "f" -> (_ => force = true), "v" -> (_ => verbose = true), "x:" -> (arg => exclude_task += arg)) val more_args = getopts(args) if (more_args.nonEmpty) getopts.usage() val progress = if (verbose) new Console_Progress() else No_Progress if (force) cronjob(progress, exclude_task) else error("Need to apply force to do anything") } } } diff --git a/src/Pure/ML/ml_console.scala b/src/Pure/ML/ml_console.scala --- a/src/Pure/ML/ml_console.scala +++ b/src/Pure/ML/ml_console.scala @@ -1,87 +1,89 @@ /* Title: Pure/ML/ml_console.scala Author: Makarius The raw ML process in interactive mode. 
*/ package isabelle object ML_Console { /* command line entry point */ def main(args: Array[String]) { Command_Line.tool { var dirs: List[Path] = Nil var include_sessions: List[String] = Nil var logic = Isabelle_System.getenv("ISABELLE_LOGIC") var modes: List[String] = Nil var no_build = false var options = Options.init() var raw_ml_system = false val getopts = Getopts(""" Usage: isabelle console [OPTIONS] Options are: -d DIR include session directory -i NAME include session in name-space of theories -l NAME logic session name (default ISABELLE_LOGIC=""" + quote(logic) + """) -m MODE add print mode for output -n no build of session image on startup -o OPTION override Isabelle system OPTION (via NAME=VAL or NAME) -r bootstrap from raw Poly/ML Build a logic session image and run the raw Isabelle ML process in interactive mode, with line editor ISABELLE_LINE_EDITOR=""" + quote(Isabelle_System.getenv("ISABELLE_LINE_EDITOR")) + """. """, "d:" -> (arg => dirs = dirs ::: List(Path.explode(arg))), "i:" -> (arg => include_sessions = include_sessions ::: List(arg)), "l:" -> (arg => logic = arg), "m:" -> (arg => modes = arg :: modes), "n" -> (arg => no_build = true), "o:" -> (arg => options = options + arg), "r" -> (_ => raw_ml_system = true)) val more_args = getopts(args) if (more_args.nonEmpty) getopts.usage() val sessions_structure = Sessions.load_structure(options, dirs = dirs) val store = Sessions.store(options) // build logic if (!no_build && !raw_ml_system) { val progress = new Console_Progress() val rc = progress.interrupt_handler { Build.build_logic(options, logic, build_heap = true, progress = progress, dirs = dirs) } if (rc != 0) sys.exit(rc) } // process loop val process = ML_Process(options, sessions_structure, store, logic = logic, args = List("-i"), redirect = true, modes = if (raw_ml_system) Nil else modes ::: List("ASCII"), raw_ml_system = raw_ml_system, session_base = if (raw_ml_system) None else Some(Sessions.base_info( options, logic, dirs = dirs, include_sessions = include_sessions).check_base)) val tty_loop = new TTY_Loop(process.stdin, process.stdout, Some(process.interrupt _)) val process_result = Future.thread[Int]("process_result") { val rc = process.join tty_loop.cancel // FIXME does not quite work, cannot interrupt blocking read on System.in rc } tty_loop.join - process_result.join + + val rc = process_result.join + if (rc != 0) sys.exit(rc) } } } diff --git a/src/Pure/System/command_line.ML b/src/Pure/System/command_line.ML --- a/src/Pure/System/command_line.ML +++ b/src/Pure/System/command_line.ML @@ -1,30 +1,27 @@ (* Title: Pure/System/command_line.ML Author: Makarius Support for Isabelle/ML command line tools. *) signature COMMAND_LINE = sig exception EXIT of int - val tool: (unit -> int) -> unit - val tool0: (unit -> unit) -> unit + val tool: (unit -> unit) -> unit end; structure Command_Line: COMMAND_LINE = struct exception EXIT of int; fun tool body = Thread_Attributes.uninterruptible (fn restore_attributes => fn () => let val rc = - restore_attributes body () + (restore_attributes body (); 0) handle EXIT rc => rc | exn => Exn.capture_exit 2 (fn () => (Runtime.exn_error_message exn; raise exn)) (); in exit rc end) (); -fun tool0 body = tool (fn () => (body (); 0)); - end; diff --git a/src/Pure/System/command_line.scala b/src/Pure/System/command_line.scala --- a/src/Pure/System/command_line.scala +++ b/src/Pure/System/command_line.scala @@ -1,42 +1,40 @@ /* Title: Pure/System/command_line.scala Author: Makarius Support for Isabelle/Scala command line tools. 
*/ package isabelle object Command_Line { object Chunks { private def chunks(list: List[String]): List[List[String]] = list.indexWhere(_ == "\n") match { case -1 => List(list) case i => val (chunk, rest) = list.splitAt(i) chunk :: chunks(rest.tail) } def unapplySeq(list: List[String]): Option[List[List[String]]] = Some(chunks(list)) } var debug = false - def tool(body: => Int): Nothing = + def tool(body: => Unit): Nothing = { val rc = - try { body } + try { body; 0 } catch { case exn: Throwable => Output.error_message(Exn.message(exn) + (if (debug) "\n" + Exn.trace(exn) else "")) Exn.return_code(exn, 2) } sys.exit(rc) } - def tool0(body: => Unit): Nothing = tool { body; 0 } - - def ML_tool0(body: List[String]): String = - "Command_Line.tool0 (fn () => (" + body.mkString("; ") + "));" + def ML_tool(body: List[String]): String = + "Command_Line.tool (fn () => (" + body.mkString("; ") + "));" } diff --git a/src/Pure/System/isabelle_tool.scala b/src/Pure/System/isabelle_tool.scala --- a/src/Pure/System/isabelle_tool.scala +++ b/src/Pure/System/isabelle_tool.scala @@ -1,182 +1,182 @@ /* Title: Pure/System/isabelle_tool.scala Author: Makarius Author: Lars Hupel Isabelle system tools: external executables or internal Scala functions. */ package isabelle import java.net.URLClassLoader import scala.reflect.runtime.universe import scala.tools.reflect.{ToolBox,ToolBoxError} object Isabelle_Tool { /* Scala source tools */ abstract class Body extends Function[List[String], Unit] private def compile(path: Path): Body = { def err(msg: String): Nothing = cat_error(msg, "The error(s) above occurred in Isabelle/Scala tool " + path) val source = File.read(path) val class_loader = new URLClassLoader(Array(), getClass.getClassLoader) val tool_box = universe.runtimeMirror(class_loader).mkToolBox() try { val symbol = tool_box.parse(source) match { case tree: universe.ModuleDef => tool_box.define(tree) case _ => err("Source does not describe a module (Scala object)") } tool_box.compile(universe.Ident(symbol))() match { case body: Body => body case _ => err("Ill-typed source: Isabelle_Tool.Body expected") } } catch { case e: ToolBoxError => if (tool_box.frontEnd.hasErrors) { val infos = tool_box.frontEnd.infos.toList val msgs = infos.map(info => "Error in line " + info.pos.line + ":\n" + info.msg) err(msgs.mkString("\n")) } else err(e.toString) } } /* external tools */ private def dirs(): List[Path] = Path.split(Isabelle_System.getenv_strict("ISABELLE_TOOLS")) private def is_external(dir: Path, file_name: String): Boolean = { val file = (dir + Path.explode(file_name)).file try { file.isFile && file.canRead && (file_name.endsWith(".scala") || file.canExecute) && !file_name.endsWith("~") && !file_name.endsWith(".orig") } catch { case _: SecurityException => false } } private def list_external(): List[(String, String)] = { val Description = """.*\bDESCRIPTION: *(.*)""".r for { dir <- dirs() if dir.is_dir file_name <- File.read_dir(dir) if is_external(dir, file_name) } yield { val source = File.read(dir + Path.explode(file_name)) val name = Library.perhaps_unsuffix(".scala", file_name) val description = split_lines(source).collectFirst({ case Description(s) => s }) getOrElse "" (name, description) } } private def find_external(name: String): Option[List[String] => Unit] = dirs().collectFirst({ case dir if is_external(dir, name + ".scala") => compile(dir + Path.explode(name + ".scala")) case dir if is_external(dir, name) => (args: List[String]) => { val tool = dir + Path.explode(name) val result = 
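          // run the external (non-Scala) tool as a bash process and exit with its return code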
Isabelle_System.bash(File.bash_path(tool) + " " + Bash.strings(args)) sys.exit(result.print_stdout.rc) } }) /* internal tools */ private lazy val internal_tools: List[Isabelle_Tool] = Isabelle_System.init_classes[Isabelle_Scala_Tools]("ISABELLE_SCALA_TOOLS") .flatMap(_.tools.toList) private def list_internal(): List[(String, String)] = for (tool <- internal_tools.toList) yield (tool.name, tool.description) private def find_internal(name: String): Option[List[String] => Unit] = internal_tools.collectFirst({ case tool if tool.name == name => - args => Command_Line.tool0 { tool.body(args) } + args => Command_Line.tool { tool.body(args) } }) /* command line entry point */ def main(args: Array[String]) { - Command_Line.tool0 { + Command_Line.tool { args.toList match { case Nil | List("-?") => val tool_descriptions = (list_external() ::: list_internal()).sortBy(_._1). map({ case (a, "") => a case (a, b) => a + " - " + b }) Getopts(""" Usage: isabelle TOOL [ARGS ...] Start Isabelle TOOL with ARGS; pass "-?" for tool-specific help. Available tools:""" + tool_descriptions.mkString("\n ", "\n ", "\n")).usage case tool_name :: tool_args => find_external(tool_name) orElse find_internal(tool_name) match { case Some(tool) => tool(tool_args) case None => error("Unknown Isabelle tool: " + quote(tool_name)) } } } } } sealed case class Isabelle_Tool(name: String, description: String, body: List[String] => Unit) class Isabelle_Scala_Tools(val tools: Isabelle_Tool*) class Tools extends Isabelle_Scala_Tools( Build.isabelle_tool, Build_Docker.isabelle_tool, Doc.isabelle_tool, Dump.isabelle_tool, Export.isabelle_tool, ML_Process.isabelle_tool, Mercurial.isabelle_tool, Mkroot.isabelle_tool, Options.isabelle_tool, Phabricator.isabelle_tool1, Phabricator.isabelle_tool2, Phabricator.isabelle_tool3, Phabricator.isabelle_tool4, Present.isabelle_tool, Profiling_Report.isabelle_tool, Server.isabelle_tool, Scala_Project.isabelle_tool, Update.isabelle_tool, Update_Cartouches.isabelle_tool, Update_Comments.isabelle_tool, Update_Header.isabelle_tool, Update_Then.isabelle_tool, Update_Theorems.isabelle_tool, isabelle.vscode.Grammar.isabelle_tool, isabelle.vscode.Server.isabelle_tool) class Admin_Tools extends Isabelle_Scala_Tools( Build_Cygwin.isabelle_tool, Build_Doc.isabelle_tool, Build_Fonts.isabelle_tool, Build_JDK.isabelle_tool, Build_PolyML.isabelle_tool1, Build_PolyML.isabelle_tool2, Build_Status.isabelle_tool, Check_Sources.isabelle_tool, Components.isabelle_tool, isabelle.vscode.Build_VSCode.isabelle_tool) diff --git a/src/Pure/Tools/build.scala b/src/Pure/Tools/build.scala --- a/src/Pure/Tools/build.scala +++ b/src/Pure/Tools/build.scala @@ -1,873 +1,873 @@ /* Title: Pure/Tools/build.scala Author: Makarius Options: :folding=explicit: Build and manage Isabelle sessions. 
*/ package isabelle import scala.collection.SortedSet import scala.annotation.tailrec object Build { /** auxiliary **/ /* persistent build info */ sealed case class Session_Info( sources: String, input_heaps: List[String], output_heap: Option[String], return_code: Int) { def ok: Boolean = return_code == 0 } /* queue with scheduling information */ private object Queue { type Timings = (List[Properties.T], Double) def load_timings(progress: Progress, store: Sessions.Store, name: String): Timings = { val no_timings: Timings = (Nil, 0.0) store.access_database(name) match { case None => no_timings case Some(db) => def ignore_error(msg: String) = { progress.echo_warning("Ignoring bad database " + db + (if (msg == "") "" else "\n" + msg)) no_timings } try { val command_timings = store.read_command_timings(db, name) val session_timing = store.read_session_timing(db, name) match { case Markup.Elapsed(t) => t case _ => 0.0 } (command_timings, session_timing) } catch { case ERROR(msg) => ignore_error(msg) case exn: java.lang.Error => ignore_error(Exn.message(exn)) case _: XML.Error => ignore_error("") } finally { db.close } } } def make_session_timing(sessions_structure: Sessions.Structure, timing: Map[String, Double]) : Map[String, Double] = { val maximals = sessions_structure.build_graph.maximals.toSet def desc_timing(name: String): Double = { if (maximals.contains(name)) timing(name) else { val descendants = sessions_structure.build_descendants(List(name)).toSet val g = sessions_structure.build_graph.restrict(descendants) (0.0 :: g.maximals.flatMap(desc => { val ps = g.all_preds(List(desc)) if (ps.exists(p => timing.get(p).isEmpty)) None else Some(ps.map(timing(_)).sum) })).max } } timing.keySet.iterator.map(name => (name -> desc_timing(name))).toMap.withDefaultValue(0.0) } def apply(progress: Progress, sessions_structure: Sessions.Structure, store: Sessions.Store) : Queue = { val graph = sessions_structure.build_graph val names = graph.keys val timings = names.map(name => (name, load_timings(progress, store, name))) val command_timings = timings.map({ case (name, (ts, _)) => (name, ts) }).toMap.withDefaultValue(Nil) val session_timing = make_session_timing(sessions_structure, timings.map({ case (name, (_, t)) => (name, t) }).toMap) object Ordering extends scala.math.Ordering[String] { def compare_timing(name1: String, name2: String): Int = { val t1 = session_timing(name1) val t2 = session_timing(name2) if (t1 == 0.0 || t2 == 0.0) 0 else t1 compare t2 } def compare(name1: String, name2: String): Int = compare_timing(name2, name1) match { case 0 => sessions_structure(name2).timeout compare sessions_structure(name1).timeout match { case 0 => name1 compare name2 case ord => ord } case ord => ord } } new Queue(graph, SortedSet(names: _*)(Ordering), command_timings) } } private class Queue( graph: Graph[String, Sessions.Info], order: SortedSet[String], val command_timings: String => List[Properties.T]) { def is_inner(name: String): Boolean = !graph.is_maximal(name) def is_empty: Boolean = graph.is_empty def - (name: String): Queue = new Queue(graph.del_node(name), order - name, // FIXME scala-2.10.0 .. 2.12.4 TreeSet problem!? command_timings) def dequeue(skip: String => Boolean): Option[(String, Sessions.Info)] = { val it = order.iterator.dropWhile(name => skip(name) || !graph.defined(name) // FIXME scala-2.10.0 .. 2.12.4 TreeSet problem!? 
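          // a session is ready to be dequeued only when it is minimal in the remaining graph,
          // i.e. all of its dependencies have already finished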
|| !graph.is_minimal(name)) if (it.hasNext) { val name = it.next; Some((name, graph.get_node(name))) } else None } } /* PIDE protocol handler */ class Handler(progress: Progress, session: Session, session_name: String) extends Session.Protocol_Handler { val build_session_errors: Promise[List[String]] = Future.promise override def exit() { build_session_errors.cancel } private def build_session_finished(msg: Prover.Protocol_Output): Boolean = { val errors = try { XML.Decode.list(x => x)(Symbol.decode_yxml(msg.text)). map(err => Pretty.string_of(Protocol_Message.expose_no_reports(err))) } catch { case ERROR(err) => List(err) } build_session_errors.fulfill(errors) session.send_stop() true } private def loading_theory(msg: Prover.Protocol_Output): Boolean = msg.properties match { case Markup.Loading_Theory(name) => progress.theory(Progress.Theory(name, session = session_name)) true case _ => false } val functions = List( Markup.BUILD_SESSION_FINISHED -> build_session_finished, Markup.LOADING_THEORY -> loading_theory) } /* job: running prover process */ private class Job(progress: Progress, name: String, val info: Sessions.Info, deps: Sessions.Deps, store: Sessions.Store, do_store: Boolean, verbose: Boolean, pide: Boolean, val numa_node: Option[Int], command_timings: List[Properties.T]) { private val options = NUMA.policy_options(info.options, numa_node) private val sessions_structure = deps.sessions_structure private val graph_file = Isabelle_System.tmp_file("session_graph", "pdf") isabelle.graphview.Graph_File.write(options, graph_file, deps(name).session_graph_display) private val export_tmp_dir = Isabelle_System.tmp_dir("export") private val export_consumer = Export.consumer(store.open_database(name, output = true), cache = store.xz_cache) private val future_result: Future[Process_Result] = Future.thread("build") { val parent = info.parent.getOrElse("") val base = deps(name) val args_yxml = YXML.string_of_body( { import XML.Encode._ pair(list(pair(string, int)), pair(list(properties), pair(bool, pair(Path.encode, pair(list(pair(Path.encode, Path.encode)), pair(string, pair(string, pair(string, pair(string, pair(Path.encode, pair(list(pair(Options.encode, list(pair(string, properties)))), pair(list(pair(string, properties)), pair(list(pair(string, string)), pair(list(string), pair(list(pair(string, string)), pair(list(string), list(string)))))))))))))))))( (Symbol.codes, (command_timings, (verbose, (store.browser_info, (info.document_files, (File.standard_path(graph_file), (parent, (info.chapter, (name, (Path.current, (info.theories, (sessions_structure.session_positions, (sessions_structure.dest_session_directories, (base.doc_names, (base.global_theories.toList, (base.loaded_theories.keys, info.bibtex_entries.map(_.info)))))))))))))))))) }) val env = Isabelle_System.settings() + ("ISABELLE_EXPORT_TMP" -> File.standard_path(export_tmp_dir)) + ("ISABELLE_ML_DEBUGGER" -> options.bool("ML_debugger").toString) val is_pure = Sessions.is_pure(name) val eval_store = if (!do_store) Nil else { (if (info.theories.nonEmpty) List("ML_Heap.share_common_data ()") else Nil) ::: List("ML_Heap.save_child " + ML_Syntax.print_string_bytes(File.platform_path(store.output_heap(name)))) } if (pide && !is_pure) { val resources = new Resources(sessions_structure, deps(parent)) val session = new Session(options, resources) val handler = new Handler(progress, session, name) session.init_protocol_handler(handler) val process = Isabelle_Process(session, options, sessions_structure, store, logic = parent, cwd = 
info.dir.file, env = env).await_startup session.protocol_command("build_session", args_yxml) val result = process.join handler.build_session_errors.join match { case Nil => result case errors => result.error_rc.output( errors.flatMap(s => split_lines(Output.error_message_text(s))) ::: errors.map(Protocol.Error_Message_Marker.apply)) } } else { val args_file = Isabelle_System.tmp_file("build") File.write(args_file, args_yxml) val eval_build = "Build.build " + ML_Syntax.print_string_bytes(File.standard_path(args_file)) - val eval = Command_Line.ML_tool0(eval_build :: eval_store) + val eval = Command_Line.ML_tool(eval_build :: eval_store) val process = if (is_pure) { ML_Process(options, deps.sessions_structure, store, raw_ml_system = true, args = (for ((root, _) <- Thy_Header.ml_roots) yield List("--use", root)).flatten ::: List("--eval", eval), cwd = info.dir.file, env = env, cleanup = () => args_file.delete) } else { ML_Process(options, deps.sessions_structure, store, logic = parent, args = List("--eval", eval), cwd = info.dir.file, env = env, cleanup = () => args_file.delete) } process.result( progress_stdout = { case Protocol.Loading_Theory_Marker(theory) => progress.theory(Progress.Theory(theory, session = name)) case Protocol.Export.Marker((args, path)) => val body = try { Bytes.read(path) } catch { case ERROR(msg) => error("Failed to read export " + quote(args.compound_name) + "\n" + msg) } path.file.delete export_consumer(name, args, body) case _ => }, progress_limit = options.int("process_output_limit") match { case 0 => None case m => Some(m * 1000000L) }, strict = false) } } def terminate: Unit = future_result.cancel def is_finished: Boolean = future_result.is_finished private val timeout_request: Option[Event_Timer.Request] = { if (info.timeout > Time.zero) Some(Event_Timer.request(Time.now() + info.timeout) { terminate }) else None } def join: (Process_Result, Option[String]) = { val result0 = future_result.join val result1 = export_consumer.shutdown(close = true).map(Output.error_message_text) match { case Nil => result0 case errs => result0.errors(errs).error_rc } Isabelle_System.rm_tree(export_tmp_dir) if (result1.ok) Present.finish(progress, store.browser_info, graph_file, info, name) graph_file.delete val was_timeout = timeout_request match { case None => false case Some(request) => !request.cancel } val result2 = if (result1.interrupted) { if (was_timeout) result1.error(Output.error_message_text("Timeout")).was_timeout else result1.error(Output.error_message_text("Interrupt")) } else result1 val heap_digest = if (result2.ok && do_store && store.output_heap(name).is_file) Some(Sessions.write_heap_digest(store.output_heap(name))) else None (result2, heap_digest) } } /** build with results **/ class Results private[Build](results: Map[String, (Option[Process_Result], Sessions.Info)]) { def sessions: Set[String] = results.keySet def cancelled(name: String): Boolean = results(name)._1.isEmpty def apply(name: String): Process_Result = results(name)._1.getOrElse(Process_Result(1)) def info(name: String): Sessions.Info = results(name)._2 val rc: Int = (0 /: results.iterator.map( { case (_, (Some(r), _)) => r.rc case (_, (None, _)) => 1 }))(_ max _) def ok: Boolean = rc == 0 override def toString: String = rc.toString } def build( options: Options, progress: Progress = No_Progress, check_unknown_files: Boolean = false, build_heap: Boolean = false, clean_build: Boolean = false, dirs: List[Path] = Nil, select_dirs: List[Path] = Nil, infos: List[Sessions.Info] = Nil, numa_shuffling: 
      Boolean = false,
    max_jobs: Int = 1,
    list_files: Boolean = false,
    check_keywords: Set[String] = Set.empty,
    fresh_build: Boolean = false,
    no_build: Boolean = false,
    soft_build: Boolean = false,
    verbose: Boolean = false,
    export_files: Boolean = false,
    pide: Boolean = false,
    requirements: Boolean = false,
    all_sessions: Boolean = false,
    base_sessions: List[String] = Nil,
    exclude_session_groups: List[String] = Nil,
    exclude_sessions: List[String] = Nil,
    session_groups: List[String] = Nil,
    sessions: List[String] = Nil,
    selection: Sessions.Selection = Sessions.Selection.empty): Results =
  {
    val build_options = options.int.update("completion_limit", 0).bool.update("ML_statistics", true)

    val store = Sessions.store(build_options)

    Isabelle_Fonts.init()


    /* session selection and dependencies */

    val full_sessions =
      Sessions.load_structure(build_options, dirs = dirs, select_dirs = select_dirs, infos = infos)

    def sources_stamp(deps: Sessions.Deps, name: String): String =
    {
      val digests =
        full_sessions(name).meta_digest :: deps.sources(name) ::: deps.imported_sources(name)
      SHA1.digest(cat_lines(digests.map(_.toString).sorted)).toString
    }

    val selection1 =
      Sessions.Selection(requirements, all_sessions, base_sessions, exclude_session_groups,
        exclude_sessions, session_groups, sessions) ++ selection

    val deps =
    {
      val deps0 =
        Sessions.deps(full_sessions.selection(selection1),
          progress = progress, inlined_files = true, verbose = verbose,
          list_files = list_files, check_keywords = check_keywords).check_errors

      if (soft_build && !fresh_build) {
        val outdated =
          deps0.sessions_structure.build_topological_order.flatMap(name =>
            store.access_database(name) match {
              case Some(db) =>
                using(db)(store.read_build(_, name)) match {
                  case Some(build)
                  if build.ok && build.sources == sources_stamp(deps0, name) => None
                  case _ => Some(name)
                }
              case None => Some(name)
            })

        Sessions.deps(full_sessions.selection(Sessions.Selection(sessions = outdated)),
          progress = progress, inlined_files = true).check_errors
      }
      else deps0
    }


    /* check unknown files */

    if (check_unknown_files) {
      val source_files =
        (for {
          (_, base) <- deps.session_bases.iterator
          (path, _) <- base.sources.iterator
        } yield path).toList
      val exclude_files = List(Path.explode("$POLYML_EXE")).map(_.canonical_file)
      val unknown_files =
        Mercurial.check_files(source_files)._2.
          filterNot(path => exclude_files.contains(path.canonical_file))
      if (unknown_files.nonEmpty) {
        progress.echo_warning("Unknown files (not part of the underlying Mercurial repository):" +
          unknown_files.map(path => path.expand.implode).sorted.mkString("\n ", "\n ", ""))
      }
    }


    /* main build process */

    val queue = Queue(progress, deps.sessions_structure, store)

    store.prepare_output_dir()

    if (clean_build) {
      for (name <- full_sessions.imports_descendants(full_sessions.imports_selection(selection1))) {
        val (relevant, ok) = store.clean_output(name)
        if (relevant) {
          if (ok) progress.echo("Cleaned " + name)
          else progress.echo(name + " FAILED to clean")
        }
      }
    }

    // scheduler loop
    case class Result(
      current: Boolean,
      heap_digest: Option[String],
      process: Option[Process_Result],
      info: Sessions.Info)
    {
      def ok: Boolean =
        process match {
          case None => false
          case Some(res) => res.rc == 0
        }
    }

    def sleep()
    {
      try { Thread.sleep(500) }
      catch { case Exn.Interrupt() => Exn.Interrupt.impose() }
    }

    val numa_nodes = new NUMA.Nodes(numa_shuffling)

    @tailrec def loop(
      pending: Queue,
      running: Map[String, (List[String], Job)],
      results: Map[String, Result]): Map[String, Result] =
    {
      def used_node(i: Int): Boolean =
        running.iterator.exists(
          { case (_, (_, job)) => job.numa_node.isDefined && job.numa_node.get == i })

      if (pending.is_empty) results
      else {
        if (progress.stopped)
          for ((_, (_, job)) <- running) job.terminate

        running.find({ case (_, (_, job)) => job.is_finished }) match {
          case Some((name, (input_heaps, job))) =>
            //{{{ finish job

            val (process_result, heap_digest) = job.join

            val log_lines = process_result.out_lines.filterNot(Protocol_Message.Marker.test)
            val process_result_tail =
            {
              val tail = job.info.options.int("process_output_tail")
              process_result.copy(
                out_lines =
                  "(see also " + store.output_log(name).file.toString + ")" ::
                  (if (tail == 0) log_lines else log_lines.drop(log_lines.length - tail max 0)))
            }

            // write log file
            if (process_result.ok) {
              File.write_gzip(store.output_log_gz(name), terminate_lines(log_lines))
            }
            else File.write(store.output_log(name), terminate_lines(log_lines))

            // write database
            {
              val build_log =
                Build_Log.Log_File(name, process_result.out_lines).
                  parse_session_info(
                    command_timings = true,
                    theory_timings = true,
                    ml_statistics = true,
                    task_statistics = true)

              using(store.open_database(name, output = true))(db =>
                store.write_session_info(db, name,
                  build_log =
                    if (process_result.timeout) build_log.error("Timeout") else build_log,
                  build =
                    Session_Info(sources_stamp(deps, name), input_heaps, heap_digest,
                      process_result.rc)))
            }

            // messages
            process_result.err_lines.foreach(progress.echo)

            if (process_result.ok)
              progress.echo("Finished " + name + " (" + process_result.timing.message_resources + ")")
            else {
              progress.echo(name + " FAILED")
              if (!process_result.interrupted) progress.echo(process_result_tail.out)
            }

            loop(pending - name, running - name,
              results + (name -> Result(false, heap_digest, Some(process_result_tail), job.info)))
            //}}}
          case None if running.size < (max_jobs max 1) =>
            //{{{ check/start next job

            pending.dequeue(running.isDefinedAt) match {
              case Some((name, info)) =>
                val ancestor_results =
                  deps.sessions_structure.build_requirements(List(name)).filterNot(_ == name).
                    map(results(_))
                val ancestor_heaps = ancestor_results.flatMap(_.heap_digest)

                val do_store = build_heap || Sessions.is_pure(name) || queue.is_inner(name)

                val (current, heap_digest) =
                {
                  store.access_database(name) match {
                    case Some(db) =>
                      using(db)(store.read_build(_, name)) match {
                        case Some(build) =>
                          val heap_digest = store.find_heap_digest(name)
                          val current =
                            !fresh_build &&
                            build.ok &&
                            build.sources == sources_stamp(deps, name) &&
                            build.input_heaps == ancestor_heaps &&
                            build.output_heap == heap_digest &&
                            !(do_store && heap_digest.isEmpty)
                          (current, heap_digest)
                        case None => (false, None)
                      }
                    case None => (false, None)
                  }
                }
                val all_current = current && ancestor_results.forall(_.current)

                if (all_current)
                  loop(pending - name, running,
                    results + (name -> Result(true, heap_digest, Some(Process_Result(0)), info)))
                else if (no_build) {
                  if (verbose) progress.echo("Skipping " + name + " ...")
                  loop(pending - name, running,
                    results + (name -> Result(false, heap_digest, Some(Process_Result(1)), info)))
                }
                else if (ancestor_results.forall(_.ok) && !progress.stopped) {
                  progress.echo((if (do_store) "Building " else "Running ") + name + " ...")

                  store.clean_output(name)
                  using(store.open_database(name, output = true))(store.init_session_info(_, name))

                  val numa_node = numa_nodes.next(used_node)
                  val job =
                    new Job(progress, name, info, deps, store, do_store, verbose,
                      pide = pide, numa_node, queue.command_timings(name))
                  loop(pending, running + (name -> (ancestor_heaps, job)), results)
                }
                else {
                  progress.echo(name + " CANCELLED")
                  loop(pending - name, running,
                    results + (name -> Result(false, heap_digest, None, info)))
                }
              case None => sleep(); loop(pending, running, results)
            }
            //}}}
          case None => sleep(); loop(pending, running, results)
        }
      }
    }


    /* build results */

    val results0 =
      if (deps.is_empty) {
        progress.echo_warning("Nothing to build")
        Map.empty[String, Result]
      }
      else loop(queue, Map.empty, Map.empty)

    val results =
      new Results(
        (for ((name, result) <- results0.iterator)
          yield (name, (result.process, result.info))).toMap)

    if (export_files) {
      for (name <- full_sessions.imports_selection(selection1).iterator if results(name).ok) {
        val info = results.info(name)
        if (info.export_files.nonEmpty) {
          progress.echo("Exporting " + info.name + " ...")
          for ((dir, prune, pats) <- info.export_files) {
            Export.export_files(store, name, info.dir + dir,
              progress = if (verbose) progress else No_Progress,
              export_prune = prune,
              export_patterns = pats)
          }
        }
      }
    }

    if (results.rc != 0 && (verbose || !no_build)) {
      val unfinished =
        (for {
          name <- results.sessions.iterator
          if !results(name).ok
        } yield name).toList.sorted
      progress.echo("Unfinished session(s): " + commas(unfinished))
    }


    /* global browser info */

    if (!no_build) {
      val browser_chapters =
        (for {
          (name, result) <- results0.iterator
          if result.ok
          info = full_sessions(name)
          if info.options.bool("browser_info")
        } yield (info.chapter, (name, info.description))).toList.groupBy(_._1).
          map({ case (chapter, es) => (chapter, es.map(_._2)) }).filterNot(_._2.isEmpty)

      for ((chapter, entries) <- browser_chapters)
        Present.update_chapter_index(store.browser_info, chapter, entries)

      if (browser_chapters.nonEmpty) Present.make_global_index(store.browser_info)
    }

    results
  }


  /* Isabelle tool wrapper */

  val isabelle_tool = Isabelle_Tool("build", "build and manage Isabelle sessions", args =>
  {
    val build_options = Word.explode(Isabelle_System.getenv("ISABELLE_BUILD_OPTIONS"))

    var base_sessions: List[String] = Nil
    var select_dirs: List[Path] = Nil
    var numa_shuffling = false
    var pide = false
    var requirements = false
    var soft_build = false
    var exclude_session_groups: List[String] = Nil
    var all_sessions = false
    var build_heap = false
    var clean_build = false
    var dirs: List[Path] = Nil
    var export_files = false
    var fresh_build = false
    var session_groups: List[String] = Nil
    var max_jobs = 1
    var check_keywords: Set[String] = Set.empty
    var list_files = false
    var no_build = false
    var options = Options.init(opts = build_options)
    var verbose = false
    var exclude_sessions: List[String] = Nil

    val getopts = Getopts("""
Usage: isabelle build [OPTIONS] [SESSIONS ...]

  Options are:
    -B NAME      include session NAME and all descendants
    -D DIR       include session directory and select its sessions
    -N           cyclic shuffling of NUMA CPU nodes (performance tuning)
    -P           build via PIDE protocol
    -R           operate on requirements of selected sessions
    -S           soft build: only observe changes of sources, not heap images
    -X NAME      exclude sessions from group NAME and all descendants
    -a           select all sessions
    -b           build heap images
    -c           clean build
    -d DIR       include session directory
    -e           export files from session specification into file-system
    -f           fresh build
    -g NAME      select session group NAME
    -j INT       maximum number of parallel jobs (default 1)
    -k KEYWORD   check theory sources for conflicts with proposed keywords
    -l           list session source files
    -n           no build -- test dependencies only
    -o OPTION    override Isabelle system OPTION (via NAME=VAL or NAME)
    -v           verbose
    -x NAME      exclude session NAME and all descendants

  Build and manage Isabelle sessions, depending on implicit settings:
""" + Library.prefix_lines(" ", Build_Log.Settings.show()) + "\n",
      "B:" -> (arg => base_sessions = base_sessions ::: List(arg)),
      "D:" -> (arg => select_dirs = select_dirs ::: List(Path.explode(arg))),
      "N" -> (_ => numa_shuffling = true),
      "P" -> (_ => pide = true),
      "R" -> (_ => requirements = true),
      "S" -> (_ => soft_build = true),
      "X:" -> (arg => exclude_session_groups = exclude_session_groups ::: List(arg)),
      "a" -> (_ => all_sessions = true),
      "b" -> (_ => build_heap = true),
      "c" -> (_ => clean_build = true),
      "d:" -> (arg => dirs = dirs ::: List(Path.explode(arg))),
      "e" -> (_ => export_files = true),
      "f" -> (_ => fresh_build = true),
      "g:" -> (arg => session_groups = session_groups ::: List(arg)),
      "j:" -> (arg => max_jobs = Value.Int.parse(arg)),
      "k:" -> (arg => check_keywords = check_keywords + arg),
      "l" -> (_ => list_files = true),
      "n" -> (_ => no_build = true),
      "o:" -> (arg => options = options + arg),
      "v" -> (_ => verbose = true),
      "x:" -> (arg => exclude_sessions = exclude_sessions ::: List(arg)))

    val sessions = getopts(args)

    val progress = new Console_Progress(verbose = verbose)

    val start_date = Date.now()

    if (verbose) {
      progress.echo(
        "Started at " + Build_Log.print_date(start_date) +
          " (" + Isabelle_System.getenv("ML_IDENTIFIER") + " on " + Isabelle_System.hostname() + ")")
      progress.echo(Build_Log.Settings.show() + "\n")
    }

    val results =
      progress.interrupt_handler {
        build(options, progress =
          progress,
          check_unknown_files = Mercurial.is_repository(Path.explode("~~")),
          build_heap = build_heap,
          clean_build = clean_build,
          dirs = dirs,
          select_dirs = select_dirs,
          numa_shuffling = NUMA.enabled_warning(progress, numa_shuffling),
          max_jobs = max_jobs,
          list_files = list_files,
          check_keywords = check_keywords,
          fresh_build = fresh_build,
          no_build = no_build,
          soft_build = soft_build,
          verbose = verbose,
          export_files = export_files,
          pide = pide,
          requirements = requirements,
          all_sessions = all_sessions,
          base_sessions = base_sessions,
          exclude_session_groups = exclude_session_groups,
          exclude_sessions = exclude_sessions,
          session_groups = session_groups,
          sessions = sessions)
      }
    val end_date = Date.now()
    val elapsed_time = end_date.time - start_date.time

    if (verbose) {
      progress.echo("\nFinished at " + Build_Log.print_date(end_date))
    }

    val total_timing =
      (Timing.zero /: results.sessions.iterator.map(a => results(a).timing))(_ + _).
        copy(elapsed = elapsed_time)
    progress.echo(total_timing.message_resources)

    sys.exit(results.rc)
  })


  /* build logic image */

  def build_logic(options: Options, logic: String,
    progress: Progress = No_Progress,
    build_heap: Boolean = false,
    dirs: List[Path] = Nil,
    fresh: Boolean = false,
    strict: Boolean = false): Int =
  {
    val rc =
      if (!fresh && build(options, build_heap = build_heap, no_build = true, dirs = dirs,
        sessions = List(logic)).ok) 0
      else {
        progress.echo("Build started for Isabelle/" + logic + " ...")
        Build.build(options, progress = progress, build_heap = build_heap,
          fresh_build = fresh, dirs = dirs, sessions = List(logic)).rc
      }
    if (strict && rc != 0) error("Failed to build Isabelle/" + logic) else rc
  }
}
diff --git a/src/Pure/Tools/profiling_report.scala b/src/Pure/Tools/profiling_report.scala
--- a/src/Pure/Tools/profiling_report.scala
+++ b/src/Pure/Tools/profiling_report.scala
@@ -1,55 +1,55 @@
/*  Title:      Pure/Tools/profiling_report.scala
    Author:     Makarius

Report Poly/ML profiling information from log files.
*/

package isabelle


import java.util.Locale


object Profiling_Report
{
  def profiling_report(log_file: Build_Log.Log_File): List[(Long, String)] =
  {
    val Line = """^(?:### )?([ 0-9]{10}) (\S+|GARBAGE COLLECTION.*)$""".r
    val Count = """ *(\d+)""".r
    val clean = """-?\(\d+\).*$""".r

    var results = Map.empty[String, Long]
    for (Line(Count(Value.Long(count)), raw_fun) <- log_file.lines) {
      val fun = clean.replaceAllIn(raw_fun, "")
      results += (fun -> (results.getOrElse(fun, 0L) + count))
    }
    for ((fun, count) <- results.toList.sortBy(_._2)) yield (count, fun)
  }


  /* Isabelle tool wrapper */

  val isabelle_tool =
    Isabelle_Tool("profiling_report", "report Poly/ML profiling information from log files", args =>
    {
-      Command_Line.tool0 {
+      Command_Line.tool {
        val getopts = Getopts("""
Usage: isabelle profiling_report [LOGS ...]

  Report Poly/ML profiling output from log files (potentially compressed).
""")
        val log_names = getopts(args)
        for (name <- log_names) {
          val log_file = Build_Log.Log_File(Path.explode(name))
          val results =
            for ((count, fun) <- profiling_report(log_file))
              yield
                String.format(Locale.ROOT, "%14d %s",
                  count.asInstanceOf[AnyRef], fun.asInstanceOf[AnyRef])
          if (results.nonEmpty)
            Output.writeln(cat_lines((log_file.name + ":") :: results))
        }
      }
    })
}