diff --git a/src/Doc/Isar_Ref/Spec.thy b/src/Doc/Isar_Ref/Spec.thy --- a/src/Doc/Isar_Ref/Spec.thy +++ b/src/Doc/Isar_Ref/Spec.thy @@ -1,1538 +1,1548 @@ (*:maxLineLen=78:*) theory Spec imports Main Base begin chapter \Specifications\ text \ The Isabelle/Isar theory format integrates specifications and proofs, with support for interactive development by continuous document editing. There is a separate document preparation system (see \chref{ch:document-prep}), for typesetting formal developments together with informal text. The resulting hyper-linked PDF documents can be used both for WWW presentation and printed copies. The Isar proof language (see \chref{ch:proofs}) is embedded into the theory language as a proper sub-language. Proof mode is entered by stating some \<^theory_text>\theorem\ or \<^theory_text>\lemma\ at the theory level, and left again with the final conclusion (e.g.\ via \<^theory_text>\qed\). \ section \Defining theories \label{sec:begin-thy}\ text \ \begin{matharray}{rcl} @{command_def "theory"} & : & \toplevel \ theory\ \\ @{command_def (global) "end"} & : & \theory \ toplevel\ \\ @{command_def "thy_deps"}\\<^sup>*\ & : & \theory \\ \\ \end{matharray} Isabelle/Isar theories are defined via theory files, which consist of an outermost sequence of definition--statement--proof elements. Some definitions are self-sufficient (e.g.\ \<^theory_text>\fun\ in Isabelle/HOL), with foundational proofs performed internally. Other definitions require an explicit proof as justification (e.g.\ \<^theory_text>\function\ and \<^theory_text>\termination\ in Isabelle/HOL). Plain statements like \<^theory_text>\theorem\ or \<^theory_text>\lemma\ are merely a special case of that, defining a theorem from a given proposition and its proof. The theory body may be sub-structured by means of \<^emph>\local theory targets\, such as \<^theory_text>\locale\ and \<^theory_text>\class\. It is also possible to use \<^theory_text>\context begin \ end\ blocks to delimit a local theory context: a \<^emph>\named context\ to augment a locale or class specification, or an \<^emph>\unnamed context\ to refer to local parameters and assumptions that are discharged later. See \secref{sec:target} for more details. \<^medskip> A theory is commenced by the \<^theory_text>\theory\ command, which indicates imports of previous theories, according to an acyclic foundational order. Before the initial \<^theory_text>\theory\ command, there may be optional document header material (like \<^theory_text>\section\ or \<^theory_text>\text\, see \secref{sec:markup}). The document header is outside of the formal theory context, though. A theory is concluded by a final @{command (global) "end"} command, one that does not belong to a local theory target. No further commands may follow such a global @{command (global) "end"}. \<^rail>\ @@{command theory} @{syntax system_name} @'imports' (@{syntax system_name} +) \ keywords? abbrevs? @'begin' ; keywords: @'keywords' (keyword_decls + @'and') ; keyword_decls: (@{syntax string} +) ('::' @{syntax name} @{syntax tags})? ; abbrevs: @'abbrevs' (((text+) '=' (text+)) + @'and') ; @@{command thy_deps} (thy_bounds thy_bounds?)? ; thy_bounds: @{syntax name} | '(' (@{syntax name} + @'|') ')' \ \<^descr> \<^theory_text>\theory A imports B\<^sub>1 \ B\<^sub>n begin\ starts a new theory \A\ based on the merge of existing theories \B\<^sub>1 \ B\<^sub>n\.
Due to the possibility of importing more than one ancestor, the resulting theory structure of an Isabelle session forms a directed acyclic graph (DAG). Isabelle takes care that sources contributing to the development graph are always up-to-date: changed files are automatically rechecked whenever a theory header specification is processed. Empty imports are only allowed in the bootstrap process of the special theory \<^theory>\Pure\, which is the start of any other formal development based on Isabelle. Regular user theories usually refer to some more complex entry point, such as theory \<^theory>\Main\ in Isabelle/HOL. The @{keyword_def "keywords"} specification declares outer syntax (\chref{ch:outer-syntax}) that is introduced in this theory later on (rare in end-user applications). Both minor keywords and major keywords of the Isar command language need to be specified, in order to make parsing of proof documents work properly. Command keywords need to be classified according to their structural role in the formal text. Examples may be seen in the Isabelle/HOL sources themselves, such as @{keyword "keywords"}~\<^verbatim>\"typedef"\ \:: thy_goal_defn\ or @{keyword "keywords"}~\<^verbatim>\"datatype"\ \:: thy_defn\ for theory-level definitions with and without proof, respectively. Additional @{syntax tags} provide defaults for document preparation (\secref{sec:document-markers}). The @{keyword_def "abbrevs"} specification declares additional abbreviations for syntactic completion. The default for a new keyword is just its name, but completion may be avoided by defining @{keyword "abbrevs"} with empty text. \<^descr> @{command (global) "end"} concludes the current theory definition. Note that some other commands, e.g.\ local theory targets \<^theory_text>\locale\ or \<^theory_text>\class\, may involve a \<^theory_text>\begin\ that needs to be matched by @{command (local) "end"}, according to the usual rules for nested blocks. \<^descr> \<^theory_text>\thy_deps\ visualizes the theory hierarchy as a directed acyclic graph. By default, all imported theories are shown. This may be restricted by specifying bounds wrt.\ the theory inclusion relation. \ section \Local theory targets \label{sec:target}\ text \ \begin{matharray}{rcll} @{command_def "context"} & : & \theory \ local_theory\ \\ @{command_def (local) "end"} & : & \local_theory \ theory\ \\ @{keyword_def "private"} \\ @{keyword_def "qualified"} \\ \end{matharray} A local theory target is a specification context that is managed separately within the enclosing theory. Contexts may introduce parameters (fixed variables) and assumptions (hypotheses). Definitions and theorems depending on the context may be added incrementally later on. \<^emph>\Named contexts\ refer to locales (cf.\ \secref{sec:locale}) or type classes (cf.\ \secref{sec:class}); the name ``\-\'' signifies the global theory context. \<^emph>\Unnamed contexts\ may introduce additional parameters and assumptions, and results produced in the context are generalized accordingly. Such auxiliary contexts may be nested within other targets, like \<^theory_text>\locale\, \<^theory_text>\class\, \<^theory_text>\instantiation\, \<^theory_text>\overloading\. \<^rail>\ @@{command context} @{syntax name} @{syntax_ref "opening"}? @'begin' ; @@{command context} @{syntax_ref "includes"}? (@{syntax context_elem} * ) @'begin' ; @{syntax_def target}: '(' @'in' @{syntax name} ')' \ \<^descr> \<^theory_text>\context c bundles begin\ opens a named context, by recommencing an existing locale or class \c\.
Note that locale and class definitions allow the \<^theory_text>\begin\ keyword to be included as well, in order to continue the local theory immediately after the initial specification. Optionally given \bundles\ only take effect in the surface context within the \<^theory_text>\begin\ / \<^theory_text>\end\ block. \<^descr> \<^theory_text>\context bundles elements begin\ opens an unnamed context, by extending the enclosing global or local theory target by the given declaration bundles (\secref{sec:bundle}) and context elements (\<^theory_text>\fixes\, \<^theory_text>\assumes\ etc.). This means any results stemming from definitions and proofs in the extended context will be exported into the enclosing target by lifting over extra parameters and premises. \<^descr> @{command (local) "end"} concludes the current local theory, according to the nesting of contexts. Note that a global @{command (global) "end"} has a different meaning: it concludes the theory itself (\secref{sec:begin-thy}). \<^descr> \<^theory_text>\private\ or \<^theory_text>\qualified\ may be given as modifiers before any local theory command. This restricts name space accesses to the local scope, as determined by the enclosing \<^theory_text>\context begin \ end\ block. Outside its scope, a \<^theory_text>\private\ name is inaccessible, and a \<^theory_text>\qualified\ name is only accessible with some qualification. Neither a global \<^theory_text>\theory\ nor a \<^theory_text>\locale\ target provides a local scope by itself: an extra unnamed context is required to use \<^theory_text>\private\ or \<^theory_text>\qualified\ here. \<^descr> \(\@{keyword_def "in"}~\c)\ given after any local theory command specifies an immediate target, e.g.\ ``\<^theory_text>\definition (in c)\'' or ``\<^theory_text>\theorem (in c)\''. This works in both local and global theory contexts; the current target context will be suspended for this command only. Note that ``\<^theory_text>\(in -)\'' will always produce a global result independently of the current target context. Any specification element that operates on \local_theory\ according to this manual implicitly allows the above target syntax \<^theory_text>\(in c)\, but individual syntax diagrams omit that aspect for clarity. \<^medskip> The exact meaning of results produced within a local theory context depends on the underlying target infrastructure (locale, type class etc.). The general idea is as follows, considering a context named \c\ with parameter \x\ and assumption \A[x]\. Definitions are exported by introducing a global version with additional arguments; a syntactic abbreviation links the long form with the abstract version of the target context. For example, \a \ t[x]\ becomes \c.a ?x \ t[?x]\ at the theory level (for arbitrary \?x\), together with a local abbreviation \a \ c.a x\ in the target context (for the fixed parameter \x\). Theorems are exported by discharging the assumptions and generalizing the parameters of the context. For example, \a: B[x]\ becomes \c.a: A[?x] \ B[?x]\, again for arbitrary \?x\.
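\<^medskip> Here is a minimal artificial example of an unnamed context; the names \f\, \g\, \f0\, \g0\ are arbitrary. After the closing \<^theory_text>\end\, the exported \g_def\ carries the extra parameter \?f\, and \g0\ additionally the discharged premise \?f 0 = 0\, as described above: \

(*<*)experiment begin(*>*)
context
  fixes f :: "nat => nat"
  assumes f0: "f 0 = 0"
begin

definition g :: "nat => nat" where "g n = f n + n"

lemma g0: "g 0 = 0"
  by (simp add: g_def f0)

end

thm g_def g0  (*exported versions, with the parameter f generalized*)
(*<*)end(*>*)

text \ \<^medskip>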
\ section \Bundled declarations \label{sec:bundle}\ text \ \begin{matharray}{rcl} @{command_def "bundle"} & : & \local_theory \ local_theory\ \\ @{command "bundle"} & : & \theory \ local_theory\ \\ @{command_def "print_bundles"}\\<^sup>*\ & : & \context \\ \\ @{command_def "include"} & : & \proof(state) \ proof(state)\ \\ @{command_def "including"} & : & \proof(prove) \ proof(prove)\ \\ @{keyword_def "includes"} & : & syntax \\ @{command_def "unbundle"} & : & \local_theory \ local_theory\ \\ \end{matharray} The outer syntax of fact expressions (\secref{sec:syn-att}) involves theorems and attributes, which are evaluated in the context and applied to it. Attributes may declare theorems to the context, as in \this_rule [intro] that_rule [elim]\ for example. Configuration options (\secref{sec:config}) are special declaration attributes that operate on the context without a theorem, as in \[[show_types = false]]\ for example. Expressions of this form may be defined as \<^emph>\bundled declarations\ in the context, and included in other situations later on. Including declaration bundles augments a local context casually without logical dependencies, which is in contrast to locales and locale interpretation (\secref{sec:locale}). \<^rail>\ @@{command bundle} @{syntax name} ( '=' @{syntax thms} @{syntax for_fixes} | @'begin') ; @@{command print_bundles} ('!'?) ; (@@{command include} | @@{command including}) (@{syntax name}+) ; @{syntax_def "includes"}: @'includes' (@{syntax name}+) ; @{syntax_def "opening"}: @'opening' (@{syntax name}+) ; @@{command unbundle} (@{syntax name}+) \ \<^descr> \<^theory_text>\bundle b = decls\ defines a bundle of declarations in the current context. The RHS is similar to that of the \<^theory_text>\declare\ command. Bundles defined in local theory targets are subject to transformations via morphisms, when moved into different application contexts; this works analogously to any other local theory specification. \<^descr> \<^theory_text>\bundle b begin body end\ defines a bundle of declarations from the \body\ of local theory specifications. It may consist of commands that are technically equivalent to \<^theory_text>\declare\ or \<^theory_text>\declaration\, which also includes \<^theory_text>\notation\, for example. Named fact declarations like ``\<^theory_text>\lemmas a [simp] = b\'' or ``\<^theory_text>\lemma a [simp]: B \\'' are also admitted, but the name bindings are not recorded in the bundle. \<^descr> \<^theory_text>\print_bundles\ prints the named bundles that are available in the current context; the ``\!\'' option indicates extra verbosity. \<^descr> \<^theory_text>\include b\<^sub>1 \ b\<^sub>n\ activates the declarations from the given bundles in a proof body (forward mode). This is analogous to \<^theory_text>\note\ (\secref{sec:proof-facts}) with the expanded bundles. \<^descr> \<^theory_text>\including b\<^sub>1 \ b\<^sub>n\ is similar to \<^theory_text>\include\, but works in proof refinement (backward mode). This is analogous to \<^theory_text>\using\ (\secref{sec:proof-facts}) with the expanded bundles. \<^descr> \<^theory_text>\includes b\<^sub>1 \ b\<^sub>n\ is similar to \<^theory_text>\include\, but applies to a confined specification context: unnamed \<^theory_text>\context\s and long statements of \<^theory_text>\theorem\. \<^descr> \<^theory_text>\opening b\<^sub>1 \ b\<^sub>n\ is similar to \<^theory_text>\includes\, but applies to a named specification context: \<^theory_text>\locale\s, \<^theory_text>\class\es and named \<^theory_text>\context\s.
The effect is confined to the surface context within the specification block itself and the corresponding \<^theory_text>\begin\ / \<^theory_text>\end\ block. \<^descr> \<^theory_text>\unbundle b\<^sub>1 \ b\<^sub>n\ activates the declarations from the given bundles in the current local theory context. This is analogous to \<^theory_text>\lemmas\ (\secref{sec:theorems}) with the expanded bundles. Here is an artificial example of bundling various configuration options: \ (*<*)experiment begin(*>*) bundle trace = [[simp_trace, linarith_trace, metis_trace, smt_trace]] lemma "x = x" including trace by metis (*<*)end(*>*) section \Term definitions \label{sec:term-definitions}\ text \ \begin{matharray}{rcll} @{command_def "definition"} & : & \local_theory \ local_theory\ \\ @{attribute_def "defn"} & : & \attribute\ \\ @{command_def "print_defn_rules"}\\<^sup>*\ & : & \context \\ \\ @{command_def "abbreviation"} & : & \local_theory \ local_theory\ \\ @{command_def "print_abbrevs"}\\<^sup>*\ & : & \context \\ \\ \end{matharray} Term definitions may either happen within the logic, as equational axioms of a certain form (see also \secref{sec:overloading}), or outside of it, as a rewrite system on abstract syntax. The second form is called ``abbreviation''. \<^rail>\ @@{command definition} decl? definition ; @@{command abbreviation} @{syntax mode}? decl? abbreviation ; @@{command print_abbrevs} ('!'?) ; decl: @{syntax name} ('::' @{syntax type})? @{syntax mixfix}? @'where' ; definition: @{syntax thmdecl}? @{syntax prop} @{syntax spec_prems} @{syntax for_fixes} ; abbreviation: @{syntax prop} @{syntax for_fixes} \ \<^descr> \<^theory_text>\definition c where eq\ produces an internal definition \c \ t\ according to the specification given as \eq\, which is then turned into a proven fact. The given proposition may deviate from internal meta-level equality according to the rewrite rules declared as @{attribute defn} by the object-logic. This usually covers object-level equality \x = y\ and equivalence \A \ B\. End-users normally need not change the @{attribute defn} setup. Definitions may be presented with explicit arguments on the LHS, as well as additional conditions, e.g.\ \f x y = t\ instead of \f \ \x y. t\ and \y \ 0 \ g x y = u\ instead of an unrestricted \g \ \x y. u\. \<^descr> \<^theory_text>\print_defn_rules\ prints the definitional rewrite rules declared via @{attribute defn} in the current context. \<^descr> \<^theory_text>\abbreviation c where eq\ introduces a syntactic constant which is associated with a certain term according to the meta-level equality \eq\. Abbreviations participate in the usual type-inference process, but are expanded before the logic ever sees them. Pretty printing of terms involves higher-order rewriting with rules stemming from reverted abbreviations. This needs some care to avoid overlapping or looping syntactic replacements! The optional \mode\ specification restricts output to a particular print mode; using ``\input\'' here achieves the effect of one-way abbreviations. The mode may also include an ``\<^theory_text>\output\'' qualifier that affects the concrete syntax declared for abbreviations, cf.\ \<^theory_text>\syntax\ in \secref{sec:syn-trans}. \<^descr> \<^theory_text>\print_abbrevs\ prints all constant abbreviations of the current context; the ``\!\'' option indicates extra verbosity. \ section \Axiomatizations \label{sec:axiomatizations}\ text \ \begin{matharray}{rcll} @{command_def "axiomatization"} & : & \theory \ theory\ & (axiomatic!)
\\ \end{matharray} \<^rail>\ @@{command axiomatization} @{syntax vars}? (@'where' axiomatization)? ; axiomatization: (@{syntax thmdecl} @{syntax prop} + @'and') @{syntax spec_prems} @{syntax for_fixes} \ \<^descr> \<^theory_text>\axiomatization c\<^sub>1 \ c\<^sub>m where \\<^sub>1 \ \\<^sub>n\ introduces several constants simultaneously and states axiomatic properties for these. The constants are marked as being specified once and for all, which prevents additional specifications for the same constants later on, but it is always possible to emit axiomatizations without referring to particular constants. Note that lack of precise dependency tracking of axiomatizations may disrupt the well-formedness of an otherwise definitional theory. Axiomatization is restricted to a global theory context: support for local theory targets \secref{sec:target} would introduce an extra dimension of uncertainty about what the written specifications really are, and make it infeasible to argue why they are correct. Axiomatic specifications are required when declaring a new logical system within Isabelle/Pure, but in an application environment like Isabelle/HOL the user normally stays within definitional mechanisms provided by the logic and its libraries. \ section \Generic declarations\ text \ \begin{matharray}{rcl} @{command_def "declaration"} & : & \local_theory \ local_theory\ \\ @{command_def "syntax_declaration"} & : & \local_theory \ local_theory\ \\ @{command_def "declare"} & : & \local_theory \ local_theory\ \\ \end{matharray} Arbitrary operations on the background context may be wrapped up as generic declaration elements. Since the underlying concept of local theories may be subject to later re-interpretation, there is an additional dependency on a morphism that tells the difference between the original declaration context and the application context encountered later on. A fact declaration is an important special case: it consists of a theorem which is applied to the context by means of an attribute. \<^rail>\ (@@{command declaration} | @@{command syntax_declaration}) ('(' @'pervasive' ')')? \ @{syntax text} ; @@{command declare} (@{syntax thms} + @'and') \ \<^descr> \<^theory_text>\declaration d\ adds the declaration function \d\ of ML type \<^ML_type>\declaration\ to the current local theory under construction. In later application contexts, the function is transformed according to the morphisms being involved in the interpretation hierarchy. If the \<^theory_text>\(pervasive)\ option is given, the corresponding declaration is applied to all possible contexts involved, including the global background theory. \<^descr> \<^theory_text>\syntax_declaration\ is similar to \<^theory_text>\declaration\, but is meant to affect only ``syntactic'' tools by convention (such as notation and type-checking information). \<^descr> \<^theory_text>\declare thms\ declares theorems to the current local theory context. No theorem binding is involved here, unlike \<^theory_text>\lemmas\ (cf.\ \secref{sec:theorems}), so \<^theory_text>\declare\ only has the effect of applying attributes as included in the theorem specification. \ section \Locales \label{sec:locale}\ text \ A locale is a functor that maps parameters (including implicit type parameters) and a specification to a list of declarations. The syntax of locales is modeled after the Isar proof context commands (cf.\ \secref{sec:proof-context}). Locale hierarchies are supported by maintaining a graph of dependencies between locale instances in the global theory.
Dependencies may be introduced through import (where a locale is defined as sublocale of the imported instances) or by proving that an existing locale is a sublocale of one or several locale instances. A locale may be opened with the purpose of appending to its list of declarations (cf.\ \secref{sec:target}). When opening a locale, declarations from all dependencies are collected and are presented as a local theory. In this process, which is called \<^emph>\roundup\, redundant locale instances are omitted. A locale instance is redundant if it is subsumed by an instance encountered earlier. A more detailed description of this process is available elsewhere @{cite Ballarin2014}. \ subsection \Locale expressions \label{sec:locale-expr}\ text \ A \<^emph>\locale expression\ denotes a context composed of instances of existing locales. The context consists of the declaration elements from the locale instances. Redundant locale instances are omitted according to roundup. \<^rail>\ @{syntax_def locale_expr}: (instance + '+') @{syntax for_fixes} ; instance: (qualifier ':')? @{syntax name} (pos_insts | named_insts) \ rewrites? ; qualifier: @{syntax name} ('?')? ; pos_insts: ('_' | @{syntax term})* ; named_insts: @'where' (@{syntax name} '=' @{syntax term} + @'and') ; rewrites: @'rewrites' (@{syntax thmdecl}? @{syntax prop} + @'and') \ A locale instance consists of a reference to a locale and either positional or named parameter instantiations optionally followed by rewrites clauses. Identical instantiations (that is, those that instantiate a parameter by itself) may be omitted. The notation ``\_\'' enables omitting the instantiation for a parameter inside a positional instantiation. Terms in instantiations are from the context the locale expression is declared in. Local names may be added to this context with the optional \<^theory_text>\for\ clause. This is useful for shadowing names bound in outer contexts, and for declaring syntax. In addition, syntax declarations from one instance are effective when parsing subsequent instances of the same expression. Instances have an optional qualifier which applies to names in declarations. Names include local definitions and theorem names. If present, the qualifier itself is either mandatory (default) or non-mandatory (when followed by ``\<^verbatim>\?\''). Non-mandatory means that the qualifier may be omitted on input. Qualifiers only affect name spaces; they play no role in determining whether one locale instance subsumes another. Rewrite clauses amend instances with equations that act as rewrite rules. This is particularly useful for changing concepts introduced through definitions. Rewrite clauses are available only in interpretation commands (see \secref{sec:locale-interpretation} below) and must be proved by the user. \ subsection \Locale declarations\ text \ \begin{tabular}{rcl} @{command_def "locale"} & : & \theory \ local_theory\ \\ @{command_def "experiment"} & : & \theory \ local_theory\ \\ @{command_def "print_locale"}\\<^sup>*\ & : & \context \\ \\ @{command_def "print_locales"}\\<^sup>*\ & : & \context \\ \\ @{command_def "locale_deps"}\\<^sup>*\ & : & \context \\ \\ \end{tabular} @{index_ref \\<^theory_text>\fixes\ (element)\} @{index_ref \\<^theory_text>\constrains\ (element)\} @{index_ref \\<^theory_text>\assumes\ (element)\} @{index_ref \\<^theory_text>\defines\ (element)\} @{index_ref \\<^theory_text>\notes\ (element)\} \<^rail>\ @@{command locale} @{syntax name} ('=' @{syntax locale})? @'begin'?
; @@{command experiment} (@{syntax context_elem}*) @'begin' ; @@{command print_locale} '!'? @{syntax name} ; @@{command print_locales} ('!'?) ; @{syntax_def locale}: @{syntax context_elem}+ | @{syntax_ref "opening"} ('+' (@{syntax context_elem}+))? | @{syntax locale_expr} @{syntax_ref "opening"}? ('+' (@{syntax context_elem}+))? ; @{syntax_def context_elem}: @'fixes' @{syntax vars} | @'constrains' (@{syntax name} '::' @{syntax type} + @'and') | @'assumes' (@{syntax props} + @'and') | @'defines' (@{syntax thmdecl}? @{syntax prop} @{syntax prop_pat}? + @'and') | @'notes' (@{syntax thmdef}? @{syntax thms} + @'and') \ \<^descr> \<^theory_text>\locale loc = import opening bundles + body\ defines a new locale \loc\ as a context consisting of a certain view of existing locales (\import\) plus some additional elements (\body\) with declaration \bundles\ enriching the context of the command itself. All \import\, \bundles\ and \body\ are optional; the degenerate form \<^theory_text>\locale loc\ defines an empty locale, which may still be useful to collect declarations of facts later on. Type-inference on locale expressions automatically takes care of the most general typing that the combined context elements may acquire. The \import\ consists of a locale expression; see \secref{sec:locale-expr} above. Its \<^theory_text>\for\ clause defines the parameters of \import\. These are parameters of the defined locale. Locale parameters whose instantiation is omitted automatically extend the (possibly empty) \<^theory_text>\for\ clause: they are inserted at its beginning. This means that these parameters may be referred to from within the expression and also in the subsequent context elements and provides a notational convenience for the inheritance of parameters in locale declarations. Declarations from \bundles\, see \secref{sec:bundle}, are effective in the entire command including a subsequent \<^theory_text>\begin\ / \<^theory_text>\end\ block, but they do not contribute to the declarations stored in the locale. The \body\ consists of context elements: \<^descr> @{element "fixes"}~\x :: \ (mx)\ declares a local parameter of type \\\ and mixfix annotation \mx\ (both are optional). The special syntax declaration ``\(\@{keyword_ref "structure"}\)\'' means that \x\ may be referenced implicitly in this context. \<^descr> @{element "constrains"}~\x :: \\ introduces a type constraint \\\ on the local parameter \x\. This element is deprecated. The type constraint should be introduced in the \<^theory_text>\for\ clause or the relevant @{element "fixes"} element. \<^descr> @{element "assumes"}~\a: \\<^sub>1 \ \\<^sub>n\ introduces local premises, similar to \<^theory_text>\assume\ within a proof (cf.\ \secref{sec:proof-context}). \<^descr> @{element "defines"}~\a: x \ t\ defines a previously declared parameter. This is similar to \<^theory_text>\define\ within a proof (cf.\ \secref{sec:proof-context}), but @{element "defines"} is restricted to Pure equalities and the defined variable needs to be declared beforehand via @{element "fixes"}. The left-hand side of the equation may have additional arguments, e.g.\ ``@{element "defines"}~\f x\<^sub>1 \ x\<^sub>n \ t\'', which need to be free in the context. \<^descr> @{element "notes"}~\a = b\<^sub>1 \ b\<^sub>n\ reconsiders facts within a local context. Most notably, this may include arbitrary declarations in any attribute specifications included here, e.g.\ a local @{attribute simp} rule. 
Both @{element "assumes"} and @{element "defines"} elements contribute to the locale specification. When defining an operation derived from the parameters, \<^theory_text>\definition\ (\secref{sec:term-definitions}) is usually more appropriate. Note that ``\<^theory_text>\(is p\<^sub>1 \ p\<^sub>n)\'' patterns given in the syntax of @{element "assumes"} and @{element "defines"} above are illegal in locale definitions. In the long goal format of \secref{sec:goals}, term bindings may be included as expected, though. \<^medskip> Locale specifications are ``closed up'' by turning the given text into a predicate definition \loc_axioms\ and deriving the original assumptions as local lemmas (modulo local definitions). The predicate statement covers only the newly specified assumptions, omitting the content of included locale expressions. The full cumulative view is only provided on export, involving another predicate \loc\ that refers to the complete specification text. In any case, the predicate arguments are those locale parameters that actually occur in the respective piece of text. Also these predicates operate at the meta-level in theory, but the locale packages attempts to internalize statements according to the object-logic setup (e.g.\ replacing \\\ by \\\, and \\\ by \\\ in HOL; see also \secref{sec:object-logic}). Separate introduction rules \loc_axioms.intro\ and \loc.intro\ are provided as well. \<^descr> \<^theory_text>\experiment body begin\ opens an anonymous locale context with private naming policy. Specifications in its body are inaccessible from outside. This is useful to perform experiments, without polluting the name space. \<^descr> \<^theory_text>\print_locale "locale"\ prints the contents of the named locale. The command omits @{element "notes"} elements by default. Use \<^theory_text>\print_locale!\ to have them included. \<^descr> \<^theory_text>\print_locales\ prints the names of all locales of the current theory; the ``\!\'' option indicates extra verbosity. \<^descr> \<^theory_text>\locale_deps\ visualizes all locales and their relations as a Hasse diagram. This includes locales defined as type classes (\secref{sec:class}). \ subsection \Locale interpretation \label{sec:locale-interpretation}\ text \ \begin{matharray}{rcl} @{command "interpretation"} & : & \local_theory \ proof(prove)\ \\ @{command_def "interpret"} & : & \proof(state) | proof(chain) \ proof(prove)\ \\ @{command_def "global_interpretation"} & : & \theory | local_theory \ proof(prove)\ \\ @{command_def "sublocale"} & : & \theory | local_theory \ proof(prove)\ \\ @{command_def "print_interps"}\\<^sup>*\ & : & \context \\ \\ @{method_def intro_locales} & : & \method\ \\ @{method_def unfold_locales} & : & \method\ \\ @{attribute_def trace_locales} & : & \mbox{\attribute\ \quad default \false\} \\ \end{matharray} Locales may be instantiated, and the resulting instantiated declarations added to the current context. This requires proof (of the instantiated specification) and is called \<^emph>\locale interpretation\. Interpretation is possible within arbitrary local theories (\<^theory_text>\interpretation\), within proof bodies (\<^theory_text>\interpret\), into global theories (\<^theory_text>\global_interpretation\) and into locales (\<^theory_text>\sublocale\). \<^rail>\ @@{command interpretation} @{syntax locale_expr} ; @@{command interpret} @{syntax locale_expr} ; @@{command global_interpretation} @{syntax locale_expr} definitions? ; @@{command sublocale} (@{syntax name} ('<' | '\'))? 
@{syntax locale_expr} \ definitions? ; @@{command print_interps} @{syntax name} ; definitions: @'defines' (@{syntax thmdecl}? @{syntax name} \ @{syntax mixfix}? '=' @{syntax term} + @'and'); \ The core of each interpretation command is a locale expression \expr\; the command generates proof obligations for the instantiated specifications. Once these are discharged by the user, instantiated declarations (in particular, facts) are added to the context in a post-processing phase, in a manner specific to each command. Interpretation commands are aware of interpretations that are already active: post-processing is achieved through a variant of roundup that takes interpretations of the current global or local theory into account. In order to simplify the proof obligations according to existing interpretations, use the methods @{method intro_locales} or @{method unfold_locales}. Rewrites clauses \<^theory_text>\rewrites eqns\ occur within expressions. They amend the morphism through which a locale instance is interpreted with rewrite rules, also called rewrite morphisms. This is particularly useful for interpreting concepts introduced through definitions. The equations must be proved by the user. To enable syntax of the instantiated locale within the equation, while reading a locale expression, equations of a locale instance are read in a temporary context where the instance is already activated. If activation fails, typically due to duplicate constant declarations, processing falls back to reading the equation first. Given definitions \defs\ produce corresponding definitions in the local theory's underlying target \<^emph>\and\ amend the morphism with rewrite rules stemming from the symmetric form of those definitions. Hence these need not be proved explicitly by the user. Such rewrite definitions are an even more useful device for interpreting concepts introduced through definitions, but they are only supported for interpretation commands operating in a local theory whose implementing target actually supports this. Note that despite the suggestive \<^theory_text>\and\ connective, \defs\ are processed sequentially without mutual recursion. \<^descr> \<^theory_text>\interpretation expr\ interprets \expr\ into a local theory such that its lifetime is limited to the current context block (e.g. a locale or unnamed context). At the closing @{command end} of the block the interpretation and its declarations disappear. Hence facts based on interpretation can be established without creating permanent links to the interpreted locale instances, as would be the case with @{command sublocale}. When used on the level of a global theory, there is no end of a current context block, hence \<^theory_text>\interpretation\ then behaves identically to \<^theory_text>\global_interpretation\. \<^descr> \<^theory_text>\interpret expr\ interprets \expr\ into a proof context: the interpretation and its declarations disappear when closing the current proof block. Note that for \<^theory_text>\interpret\ the \eqns\ should be explicitly universally quantified. \<^descr> \<^theory_text>\global_interpretation expr defines defs\ interprets \expr\ into a global theory. When adding declarations to locales, interpreted versions of these declarations are added to the global theory for all interpretations in the global theory as well. That is, interpretations into global theories dynamically participate in any declarations added to locales. Free variables in the interpreted expression are allowed.
They are turned into schematic variables in the generated declarations. In order to use a free variable whose name is already bound in the context --- for example, because a constant of that name exists --- add it to the \<^theory_text>\for\ clause. When used in a nested target, resulting declarations are propagated through the whole target stack. \<^descr> \<^theory_text>\sublocale name \ expr defines defs\ interprets \expr\ into the locale \name\. A proof that the specification of \name\ implies the specification of \expr\ is required. As in the localized version of the theorem command, the proof is in the context of \name\. After the proof obligation has been discharged, the locale hierarchy is changed as if \name\ imported \expr\ (hence the name \<^theory_text>\sublocale\). When the context of \name\ is subsequently entered, traversing the locale hierarchy will involve the locale instances of \expr\, and their declarations will be added to the context. This makes \<^theory_text>\sublocale\ dynamic: extensions of a locale that is instantiated in \expr\ may take place after the \<^theory_text>\sublocale\ declaration and still become available in the context. Circular \<^theory_text>\sublocale\ declarations are allowed as long as they do not lead to infinite chains. If interpretations of \name\ exist in the current global theory, the command adds interpretations for \expr\ as well, with the same qualifier, although only for fragments of \expr\ that are not interpreted in the theory already. Rewrites clauses in the expression or rewrite definitions \defs\ can help break infinite chains induced by circular \<^theory_text>\sublocale\ declarations. In a named context block, the \<^theory_text>\sublocale\ command may also be used, but the locale argument must be omitted. The command then refers to the locale (or class) target of the context block. \<^descr> \<^theory_text>\print_interps name\ lists all interpretations of locale \name\ in the current theory or proof context, including those due to a combination of an \<^theory_text>\interpretation\ or \<^theory_text>\interpret\ and one or several \<^theory_text>\sublocale\ declarations. \<^descr> @{method intro_locales} and @{method unfold_locales} repeatedly expand all introduction rules of locale predicates of the theory. While @{method intro_locales} only applies the \loc.intro\ introduction rules and therefore does not descend to assumptions, @{method unfold_locales} is more aggressive and applies \loc_axioms.intro\ as well. Both methods are aware of locale specifications entailed by the context, both from target statements and from interpretations (see below). New goals that are entailed by the current context are discharged automatically. While @{method unfold_locales} is part of the default method for \<^theory_text>\proof\ and often invoked ``behind the scenes'', @{method intro_locales} helps understand which proof obligations originated from which locale instances. The latter method is useful while developing proofs but rare in finished developments. \<^descr> @{attribute trace_locales}, when set to \true\, prints the locale instances activated during roundup. Use this when locale commands yield obscure errors or for understanding local theories created by complex locale hierarchies. \begin{warn} If a global theory inherits declarations (body elements) for a locale from one parent and an interpretation of that locale from another parent, the interpretation will not be applied to the declarations.
\end{warn} \begin{warn} Since attributes are applied to interpreted theorems, interpretation may modify the context of common proof tools, e.g.\ the Simplifier or Classical Reasoner. As the behaviour of such tools is \<^emph>\not\ stable under interpretation morphisms, manual declarations might have to be added to the target context of the interpretation to revert such declarations. \end{warn} \begin{warn} An interpretation in a local theory or proof context may subsume previous interpretations. This happens if the same specification fragment is interpreted twice and the instantiation of the second interpretation is more general than the instantiation of the first. The locale package does not attempt to remove subsumed interpretations. \end{warn} \begin{warn} While \<^theory_text>\interpretation (in c) \\ is admissible, it is not useful since its result is discarded immediately. \end{warn} \ section \Classes \label{sec:class}\ text \ \begin{matharray}{rcl} @{command_def "class"} & : & \theory \ local_theory\ \\ @{command_def "instantiation"} & : & \theory \ local_theory\ \\ @{command_def "instance"} & : & \local_theory \ local_theory\ \\ @{command "instance"} & : & \theory \ proof(prove)\ \\ @{command_def "subclass"} & : & \local_theory \ local_theory\ \\ @{command_def "print_classes"}\\<^sup>*\ & : & \context \\ \\ @{command_def "class_deps"}\\<^sup>*\ & : & \context \\ \\ @{method_def intro_classes} & : & \method\ \\ \end{matharray} A class is a particular locale with \<^emph>\exactly one\ type variable \\\. Beyond the underlying locale, a corresponding type class is established which is interpreted logically as axiomatic type class @{cite "Wenzel:1997:TPHOL"} whose logical content consists of the assumptions of the locale. Thus, classes provide the full generality of locales combined with the convenience of type classes (notably type-inference). See @{cite "isabelle-classes"} for a short tutorial. \<^rail>\ @@{command class} class_spec @'begin'? ; class_spec: @{syntax name} '=' ((@{syntax name} @{syntax_ref "opening"}? '+' (@{syntax context_elem}+)) | @{syntax name} @{syntax_ref "opening"}? | @{syntax_ref "opening"}? '+' (@{syntax context_elem}+)) ; @@{command instantiation} (@{syntax name} + @'and') '::' @{syntax arity} @'begin' ; @@{command instance} (() | (@{syntax name} + @'and') '::' @{syntax arity} | @{syntax name} ('<' | '\') @{syntax name} ) ; @@{command subclass} @{syntax name} ; @@{command class_deps} (class_bounds class_bounds?)? ; class_bounds: @{syntax sort} | '(' (@{syntax sort} + @'|') ')' \ \<^descr> \<^theory_text>\class c = superclasses bundles + body\ defines a new class \c\, inheriting from \superclasses\. This introduces a locale \c\ with import of all locales \superclasses\. Any @{element "fixes"} in \body\ are lifted to the global theory level (\<^emph>\class operations\ \f\<^sub>1, \, f\<^sub>n\ of class \c\), mapping the local type parameter \\\ to a schematic type variable \?\ :: c\. Likewise, @{element "assumes"} in \body\ are also lifted, mapping each local parameter \f :: \[\]\ to its corresponding global constant \f :: \[?\ :: c]\. The corresponding introduction rule is provided as \c_class_axioms.intro\. This rule should rarely be needed directly --- the @{method intro_classes} method takes care of the details of class membership proofs. Optionally given \bundles\ take effect in the surface context within the \body\ and the potentially following \<^theory_text>\begin\ / \<^theory_text>\end\ block.
\<^descr> \<^theory_text>\instantiation t :: (s\<^sub>1, \, s\<^sub>n)s begin\ opens a target (cf.\ \secref{sec:target}) which allows specifying class operations \f\<^sub>1, \, f\<^sub>n\ corresponding to sort \s\ at the particular type instance \(\\<^sub>1 :: s\<^sub>1, \, \\<^sub>n :: s\<^sub>n) t\. A plain \<^theory_text>\instance\ command in the target body poses a goal stating these type arities. The target is concluded by an @{command_ref (local) "end"} command. Note that a list of simultaneous type constructors may be given; this corresponds nicely to mutually recursive type definitions, e.g.\ in Isabelle/HOL. \<^descr> \<^theory_text>\instance\ in an instantiation target body sets up a goal stating the type arities claimed at the opening \<^theory_text>\instantiation\. The proof would usually proceed by @{method intro_classes}, and then establish the characteristic theorems of the type classes involved. After finishing the proof, the background theory will be augmented by the proven type arities. On the theory level, \<^theory_text>\instance t :: (s\<^sub>1, \, s\<^sub>n)s\ provides a convenient way to instantiate a type class with no need to specify operations: one can continue with the instantiation proof immediately (see the example sketch below). \<^descr> \<^theory_text>\subclass c\ in a class context for class \d\ sets up a goal stating that class \c\ is logically contained in class \d\. After finishing the proof, class \d\ is proven to be a subclass of \c\ and the locale \c\ is interpreted into \d\ simultaneously. A weakened form of this is available through a further variant of @{command instance}: \<^theory_text>\instance c\<^sub>1 \ c\<^sub>2\ opens a proof that class \c\<^sub>2\ implies \c\<^sub>1\ without reference to the underlying locales; this is useful if the properties to prove the logical connection are not sufficient on the locale level but only on the theory level. \<^descr> \<^theory_text>\print_classes\ prints all classes in the current theory. \<^descr> \<^theory_text>\class_deps\ visualizes classes and their subclass relations as a directed acyclic graph. By default, all classes from the current theory context are shown. This may be restricted by optional bounds as follows: \<^theory_text>\class_deps upper\ or \<^theory_text>\class_deps upper lower\. A class is visualized iff it is a subclass of some sort from \upper\ and a superclass of some sort from \lower\. \<^descr> @{method intro_classes} repeatedly expands all class introduction rules of this theory. Note that this method usually need not be named explicitly, as it is already included in the default proof step (e.g.\ of \<^theory_text>\proof\). In particular, instantiation of trivial (syntactic) classes may be performed by a single ``\<^theory_text>\..\'' proof step. \ subsection \The class target\ text \ %FIXME check A named context may refer to a locale (cf.\ \secref{sec:target}). If this locale is also a class \c\, apart from the common locale target behaviour the following happens. \<^item> Local constant declarations \g[\]\ referring to the local type parameter \\\ and local parameters \f[\]\ are accompanied by theory-level constants \g[?\ :: c]\ referring to theory-level class operations \f[?\ :: c]\. \<^item> Local theorem bindings are lifted as are assumptions. \<^item> Local syntax refers to local operations \g[\]\ and global operations \g[?\ :: c]\ uniformly. Type inference resolves ambiguities. In rare cases, manual type annotations are needed.
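\<^medskip> The following artificial sketch combines the class commands described above; the class name \my_semigroup\, its operation \my_mult\, and the fact names are arbitrary. The \<^theory_text>\instance\ proof relies on the default proof step, which includes @{method intro_classes} as explained above: \

class my_semigroup =
  fixes my_mult :: "'a => 'a => 'a"
  assumes my_assoc: "my_mult (my_mult a b) c = my_mult a (my_mult b c)"

instantiation nat :: my_semigroup
begin

definition my_mult_nat :: "nat => nat => nat"
  where "my_mult_nat m n = m + n"  (*class operation my_mult at type nat*)

instance by standard (simp add: my_mult_nat_def add.assoc)

end

text \ \<^medskip>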
\ subsection \Co-regularity of type classes and arities\ text \ The class relation together with the collection of type-constructor arities must obey the principle of \<^emph>\co-regularity\ as defined below. \<^medskip> For the subsequent formulation of co-regularity we assume that the class relation is closed under transitivity and reflexivity. Moreover, the collection of arities \t :: (\<^vec>s)c\ is completed such that \t :: (\<^vec>s)c\ and \c \ c'\ implies \t :: (\<^vec>s)c'\ for all such declarations. Treating sorts as finite sets of classes (meaning the intersection), the class relation \c\<^sub>1 \ c\<^sub>2\ is extended to sorts as follows: \[ \s\<^sub>1 \ s\<^sub>2 \ \c\<^sub>2 \ s\<^sub>2. \c\<^sub>1 \ s\<^sub>1. c\<^sub>1 \ c\<^sub>2\ \] This relation on sorts is further extended to tuples of sorts (of the same length) in the component-wise way. \<^medskip> Co-regularity of the class relation together with the arities relation means: \[ \t :: (\<^vec>s\<^sub>1)c\<^sub>1 \ t :: (\<^vec>s\<^sub>2)c\<^sub>2 \ c\<^sub>1 \ c\<^sub>2 \ \<^vec>s\<^sub>1 \ \<^vec>s\<^sub>2\ \] for all such arities. In other words, whenever the result classes of some type-constructor arities are related, then the argument sorts need to be related in the same way. \<^medskip> Co-regularity is a very fundamental property of the order-sorted algebra of types. For example, it entails principal types and most general unifiers, e.g.\ see @{cite "nipkow-prehofer"}. \ section \Overloaded constant definitions \label{sec:overloading}\ text \ Definitions essentially express abbreviations within the logic. The simplest form of a definition is \c :: \ \ t\, where \c\ is a new constant and \t\ is a closed term that does not mention \c\. Moreover, so-called \<^emph>\hidden polymorphism\ is excluded: all type variables in \t\ need to occur in its type \\\. \<^emph>\Overloading\ means that a constant being declared as \c :: \ decl\ may be defined separately on type instances \c :: (\\<^sub>1, \, \\<^sub>n)\ decl\ for each type constructor \\\. On most occasions, overloading will be used in a Haskell-like fashion together with type classes by means of \<^theory_text>\instantiation\ (see \secref{sec:class}). Sometimes low-level overloading is desirable; this is supported by \<^theory_text>\consts\ and \<^theory_text>\overloading\ explained below. The right-hand side of overloaded definitions may mention overloaded constants recursively at type instances corresponding to the immediate argument types \\\<^sub>1, \, \\<^sub>n\. Incomplete specification patterns impose global constraints on all occurrences. E.g.\ \d :: \ \ \\ on the left-hand side means that all corresponding occurrences on some right-hand side need to be an instance of this, and general \d :: \ \ \\ will be disallowed. Full details are given by Kun\v{c}ar @{cite "Kuncar:2015"}. \<^medskip> The \<^theory_text>\consts\ command and the \<^theory_text>\overloading\ target provide a convenient interface for end-users. Regular specification elements such as @{command definition}, @{command inductive}, @{command function} may be used in the body. It is also possible to use \<^theory_text>\consts c :: \\ with later \<^theory_text>\overloading c \ c :: \\ to keep the declaration and definition of a constant separate. \begin{matharray}{rcl} @{command_def "consts"} & : & \theory \ theory\ \\ @{command_def "overloading"} & : & \theory \ local_theory\ \\ \end{matharray} \<^rail>\ @@{command consts} ((@{syntax name} '::' @{syntax type} @{syntax mixfix}?)
+) ; @@{command overloading} ( spec + ) @'begin' ; spec: @{syntax name} ( '\' | '==' ) @{syntax term} ( '(' @'unchecked' ')' )? \ \<^descr> \<^theory_text>\consts c :: \\ declares constant \c\ to have any instance of type scheme \\\. The optional mixfix annotations may attach concrete syntax to the constants declared. \<^descr> \<^theory_text>\overloading x\<^sub>1 \ c\<^sub>1 :: \\<^sub>1 \ x\<^sub>n \ c\<^sub>n :: \\<^sub>n begin \ end\ defines a theory target (cf.\ \secref{sec:target}) which allows specifying already declared constants via definitions in the body. These are identified by an explicitly given mapping from variable names \x\<^sub>i\ to constants \c\<^sub>i\ at particular type instances. The definitions themselves are established using common specification tools, using the names \x\<^sub>i\ as references to the corresponding constants. Option \<^theory_text>\(unchecked)\ disables global dependency checks for the corresponding definition, which is occasionally useful for exotic overloading; this is a form of axiomatic specification. It is at the discretion of the user to avoid malformed theory specifications! \ subsubsection \Example\ consts Length :: "'a \ nat" overloading Length\<^sub>0 \ "Length :: unit \ nat" Length\<^sub>1 \ "Length :: 'a \ unit \ nat" Length\<^sub>2 \ "Length :: 'a \ 'b \ unit \ nat" Length\<^sub>3 \ "Length :: 'a \ 'b \ 'c \ unit \ nat" begin fun Length\<^sub>0 :: "unit \ nat" where "Length\<^sub>0 () = 0" fun Length\<^sub>1 :: "'a \ unit \ nat" where "Length\<^sub>1 (a, ()) = 1" fun Length\<^sub>2 :: "'a \ 'b \ unit \ nat" where "Length\<^sub>2 (a, b, ()) = 2" fun Length\<^sub>3 :: "'a \ 'b \ 'c \ unit \ nat" where "Length\<^sub>3 (a, b, c, ()) = 3" end lemma "Length (a, b, c, ()) = 3" by simp lemma "Length ((a, b), (c, d), ()) = 2" by simp lemma "Length ((a, b, c, d, e), ()) = 1" by simp section \Incorporating ML code \label{sec:ML}\ text \ \begin{matharray}{rcl} @{command_def "SML_file"} & : & \local_theory \ local_theory\ \\ @{command_def "SML_file_debug"} & : & \local_theory \ local_theory\ \\ @{command_def "SML_file_no_debug"} & : & \local_theory \ local_theory\ \\ @{command_def "ML_file"} & : & \local_theory \ local_theory\ \\ @{command_def "ML_file_debug"} & : & \local_theory \ local_theory\ \\ @{command_def "ML_file_no_debug"} & : & \local_theory \ local_theory\ \\ @{command_def "ML"} & : & \local_theory \ local_theory\ \\ @{command_def "ML_export"} & : & \local_theory \ local_theory\ \\ @{command_def "ML_prf"} & : & \proof \ proof\ \\ @{command_def "ML_val"} & : & \any \\ \\ @{command_def "ML_command"} & : & \any \\ \\ @{command_def "setup"} & : & \theory \ theory\ \\ @{command_def "local_setup"} & : & \local_theory \ local_theory\ \\ @{command_def "attribute_setup"} & : & \local_theory \ local_theory\ \\ \end{matharray} \begin{tabular}{rcll} @{attribute_def ML_print_depth} & : & \attribute\ & default 10 \\ @{attribute_def ML_source_trace} & : & \attribute\ & default \false\ \\ @{attribute_def ML_debugger} & : & \attribute\ & default \false\ \\ @{attribute_def ML_exception_trace} & : & \attribute\ & default \false\ \\ @{attribute_def ML_exception_debugger} & : & \attribute\ & default \false\ \\ @{attribute_def ML_environment} & : & \attribute\ & default \Isabelle\ \\ \end{tabular} \<^rail>\ (@@{command SML_file} | @@{command SML_file_debug} | @@{command SML_file_no_debug} | @@{command ML_file} | @@{command ML_file_debug} | @@{command ML_file_no_debug}) @{syntax name} ';'?
; (@@{command ML} | @@{command ML_export} | @@{command ML_prf} | @@{command ML_val} | @@{command ML_command} | @@{command setup} | @@{command local_setup}) @{syntax text} ; @@{command attribute_setup} @{syntax name} '=' @{syntax text} @{syntax text}? \ \<^descr> \<^theory_text>\SML_file name\ reads and evaluates the given Standard ML file. Top-level SML bindings are stored within the (global or local) theory context; the initial environment is restricted to the Standard ML implementation of Poly/ML, without the many add-ons of Isabelle/ML. Multiple \<^theory_text>\SML_file\ commands may be used to build larger Standard ML projects, independently of the regular Isabelle/ML environment. \<^descr> \<^theory_text>\ML_file name\ reads and evaluates the given ML file. The current theory context is passed down to the ML toplevel and may be modified, using \<^ML>\Context.>>\ or derived ML commands. Top-level ML bindings are stored within the (global or local) theory context. \<^descr> \<^theory_text>\SML_file_debug\, \<^theory_text>\SML_file_no_debug\, \<^theory_text>\ML_file_debug\, and \<^theory_text>\ML_file_no_debug\ change the @{attribute ML_debugger} option locally while the given file is compiled. \<^descr> \<^theory_text>\ML\ is similar to \<^theory_text>\ML_file\, but directly evaluates the given \text\. Top-level ML bindings are stored within the (global or local) theory context. \<^descr> \<^theory_text>\ML_export\ is similar to \<^theory_text>\ML\, but the resulting toplevel bindings are exported to the global bootstrap environment of the ML process --- it has a lasting effect that cannot be retracted. This allows ML evaluation without a formal theory context, e.g. for command-line tools via @{tool process} @{cite "isabelle-system"}. \<^descr> \<^theory_text>\ML_prf\ is analogous to \<^theory_text>\ML\ but works within a proof context. Top-level ML bindings are stored within the proof context in a purely sequential fashion, disregarding the nested proof structure. ML bindings introduced by \<^theory_text>\ML_prf\ are discarded at the end of the proof. \<^descr> \<^theory_text>\ML_val\ and \<^theory_text>\ML_command\ are diagnostic versions of \<^theory_text>\ML\, which means that the context may not be updated. \<^theory_text>\ML_val\ echoes the bindings produced at the ML toplevel, but \<^theory_text>\ML_command\ is silent. \<^descr> \<^theory_text>\setup "text"\ changes the current theory context by applying \text\, which refers to an ML expression of type \<^ML_type>\theory -> theory\. This makes it possible to initialize any object-logic specific tools and packages written in ML, for example. \<^descr> \<^theory_text>\local_setup\ is similar to \<^theory_text>\setup\ for a local theory context, and an ML expression of type \<^ML_type>\local_theory -> local_theory\. This makes it possible to invoke local theory specification packages without going through concrete outer syntax, for example. \<^descr> \<^theory_text>\attribute_setup name = "text" description\ defines an attribute in the current context. The given \text\ has to be an ML expression of type \<^ML_type>\attribute context_parser\, cf.\ basic parsers defined in structure \<^ML_structure>\Args\ and \<^ML_structure>\Attrib\. In principle, attributes can operate both on a given theorem and the implicit context, although in practice only one is modified and the other serves as parameter.
Here are examples for these two cases: \ (*<*)experiment begin(*>*) attribute_setup my_rule = \Attrib.thms >> (fn ths => Thm.rule_attribute ths (fn context: Context.generic => fn th: thm => let val th' = th OF ths in th' end))\ attribute_setup my_declaration = \Attrib.thms >> (fn ths => Thm.declaration_attribute (fn th: thm => fn context: Context.generic => let val context' = context in context' end))\ (*<*)end(*>*) text \ \<^descr> @{attribute ML_print_depth} controls the printing depth of the ML toplevel pretty printer. Typically the limit should be less than 10. Bigger values such as 100--1000 are occasionally useful for debugging. \<^descr> @{attribute ML_source_trace} indicates whether the source text that is given to the ML compiler should be output: it shows the raw Standard ML after expansion of Isabelle/ML antiquotations. \<^descr> @{attribute ML_debugger} controls compilation of sources with or without debugging information. The global system option @{system_option_ref ML_debugger} does the same when building a session image. It is also possible to use commands like \<^theory_text>\ML_file_debug\ etc. The ML debugger is explained further in @{cite "isabelle-jedit"}. \<^descr> @{attribute ML_exception_trace} indicates whether the ML run-time system should print a detailed stack trace on exceptions. The result is dependent on various ML compiler optimizations. The boundary for the exception trace is the current Isar command transaction: it is occasionally better to insert the combinator \<^ML>\Runtime.exn_trace\ into ML code for debugging @{cite "isabelle-implementation"}, closer to the point where it actually happens. \<^descr> @{attribute ML_exception_debugger} controls detailed exception trace via the Poly/ML debugger, at the cost of extra compile-time and run-time overhead. Relevant ML modules need to be compiled beforehand with debugging enabled, see @{attribute ML_debugger} above. \<^descr> @{attribute ML_environment} determines the named ML environment for toplevel declarations, e.g.\ in command \<^theory_text>\ML\ or \<^theory_text>\ML_file\. The following ML environments are predefined in Isabelle/Pure: \<^item> \Isabelle\ for Isabelle/ML. It contains all modules of Isabelle/Pure and further add-ons, e.g. material from Isabelle/HOL. \<^item> \SML\ for official Standard ML. It contains only the initial basis according to \<^url>\http://sml-family.org/Basis/overview.html\. The Isabelle/ML function \<^ML>\ML_Env.setup\ defines a new ML environment. This is useful to incorporate big SML projects in an isolated name space, possibly with variations on ML syntax; the existing setup of \<^ML>\ML_Env.SML_operations\ follows the official standard. It is also possible to move toplevel bindings between ML environments, using a notation with ``\>\'' as separator. For example: \ (*<*)experiment begin(*>*) declare [[ML_environment = "Isabelle>SML"]] ML \val println = writeln\ declare [[ML_environment = "SML"]] ML \println "test"\ declare [[ML_environment = "Isabelle"]] ML \ML \println\ (*bad*) handle ERROR msg => warning msg\ (*<*)end(*>*) section \Generated files and exported files\ text \ Write access to the physical file-system is incompatible with the stateless model of processing Isabelle documents. To avoid bad effects, the following concepts for abstract file-management are provided by Isabelle: \<^descr>[Generated files] are stored within the theory context in Isabelle/ML. This makes it possible to operate on the content in Isabelle/ML, e.g. via the command @{command compile_generated_files}.
\<^descr>[Exported files] are stored within the session database in Isabelle/Scala. This allows delivering artefacts to external tools; see also @{cite "isabelle-system"} for session \<^verbatim>\ROOT\ declaration \<^theory_text>\export_files\, and @{tool build} option \<^verbatim>\-e\. A notable example is the command @{command_ref export_code} (\chref{ch:export-code}): it uses both concepts simultaneously. File names are hierarchically structured, using a slash as separator. The (long) theory name is used as a prefix: the resulting name needs to be globally unique. \begin{matharray}{rcll} @{command_def "generate_file"} & : & \local_theory \ local_theory\ \\ @{command_def "export_generated_files"} & : & \context \\ \\ + @{command_def "scala_build_generated_files"} & : & \context \\ \\ @{command_def "compile_generated_files"} & : & \context \\ \\ @{command_def "external_file"} & : & \any \ any\ \\ \end{matharray} \<^rail>\ @@{command generate_file} path '=' content ; path: @{syntax embedded} ; content: @{syntax embedded} ; @@{command export_generated_files} (files_in_theory + @'and') ; files_in_theory: (@'_' | (path+)) (('(' @'in' @{syntax name} ')')?) ; + @@{command scala_build_generated_files} (files_in_theory + @'and') \ + (@'external_files' (external_files + @'and'))? ; @@{command compile_generated_files} (files_in_theory + @'and') \ (@'external_files' (external_files + @'and'))? \ (@'export_files' (export_files + @'and'))? \ (@'export_prefix' path)? ; external_files: (path+) (('(' @'in' path ')')?) ; export_files: (path+) (executable?) ; executable: '(' ('exe' | 'executable') ')' ; @@{command external_file} @{syntax name} ';'? \ \<^descr> \<^theory_text>\generate_file path = content\ augments the table of generated files within the current theory by a new entry: duplicates are not allowed. The name extension determines a pre-existing file-type; the \content\ is a string that is preprocessed according to the rules of this file-type. For example, Isabelle/Pure supports \<^verbatim>\.hs\ as file-type for Haskell: embedded cartouches are evaluated as Isabelle/ML expressions of type \<^ML_type>\string\, and the result is inlined in Haskell string syntax. \<^descr> \<^theory_text>\export_generated_files paths (in thy)\ retrieves named generated files from the given theory (which needs to be reachable via imports of the current one). By default, the current theory node is used. Using ``\<^verbatim>\_\'' (underscore) instead of explicit path names refers to \emph{all} files of a theory node. The overall list of files is prefixed with the respective (long) theory name and exported to the session database. In Isabelle/jEdit the result can be browsed via the virtual file-system with prefix ``\<^verbatim>\isabelle-export:\'' (using the regular file-browser). + \<^descr> \<^theory_text>\scala_build_generated_files paths (in thy)\ retrieves named generated + files as for \<^theory_text>\export_generated_files\ and writes them into a temporary + directory, which is taken as the starting point for the build process of + Isabelle/Scala/Java modules (see @{cite "isabelle-system"}). The + corresponding @{path build.props} file is expected directly in the toplevel + directory, instead of @{path "etc/build.props"} for Isabelle system + components. These properties need to specify sources, resources, services + etc. as usual. The resulting \<^verbatim>\jar\ module becomes an export artifact of the + session database.
+ \<^descr> \<^theory_text>\compile_generated_files paths (in thy) where compile_body\ retrieves named generated files as for \<^theory_text>\export_generated_files\ and writes them into a temporary directory, such that the \compile_body\ may operate on them as an ML function of type \<^ML_type>\Path.T -> unit\. This may create further files, e.g.\ executables produced by a compiler that is invoked as an external process (e.g.\ via \<^ML>\Isabelle_System.bash\), or any other files. The option ``\<^theory_text>\external_files paths (in base_dir)\'' copies files from the physical file-system into the temporary directory, \emph{before} invoking \compile_body\. The \base_dir\ prefix is removed from each of the \paths\, but the remaining sub-directory structure is reconstructed in the target directory. The option ``\<^theory_text>\export_files paths\'' exports the specified files from the temporary directory to the session database, \emph{after} invoking \compile_body\. Entries may be decorated with ``\<^theory_text>\(exe)\'' to indicate that it is a platform-specific executable program: the executable file-attribute will be set, and on Windows the \<^verbatim>\.exe\ file-extension will be included; ``\<^theory_text>\(executable)\'' only refers to the file-attribute, without special treatment of the \<^verbatim>\.exe\ extension. The option ``\<^theory_text>\export_prefix path\'' specifies an extra path prefix for all exports of \<^theory_text>\export_files\ above. \<^descr> \<^theory_text>\external_file name\ declares a formal dependency on the given file name, such that the Isabelle build process knows about it (see also @{cite "isabelle-system"}). This is required for any files mentioned in \<^theory_text>\compile_generated_files / external_files\ above, in order to document source dependencies properly. It is also possible to use \<^theory_text>\external_file\ alone, e.g.\ when other Isabelle/ML tools use \<^ML>\File.read\, without specific management of content by the Prover IDE. \ section \Primitive specification elements\ subsection \Sorts\ text \ \begin{matharray}{rcll} @{command_def "default_sort"} & : & \local_theory \ local_theory\ \end{matharray} \<^rail>\ @@{command default_sort} @{syntax sort} \ \<^descr> \<^theory_text>\default_sort s\ makes sort \s\ the new default sort for any type variable that is given explicitly in the text, but lacks a sort constraint (wrt.\ the current context). Type variables generated by type inference are not affected. Usually the default sort is only changed when defining a new object-logic. For example, the default sort in Isabelle/HOL is \<^class>\type\, the class of all HOL types. When merging theories, the default sorts of the parents are logically intersected, i.e.\ the representations as lists of classes are joined. \ subsection \Types \label{sec:types-pure}\ text \ \begin{matharray}{rcll} @{command_def "type_synonym"} & : & \local_theory \ local_theory\ \\ @{command_def "typedecl"} & : & \local_theory \ local_theory\ \\ \end{matharray} \<^rail>\ @@{command type_synonym} (@{syntax typespec} '=' @{syntax type} @{syntax mixfix}?) ; @@{command typedecl} @{syntax typespec} @{syntax mixfix}? \ \<^descr> \<^theory_text>\type_synonym (\\<^sub>1, \, \\<^sub>n) t = \\ introduces a \<^emph>\type synonym\ \(\\<^sub>1, \, \\<^sub>n) t\ for the existing type \\\. Unlike the semantic type definitions in Isabelle/HOL, type synonyms are merely syntactic abbreviations without any logical significance. Internally, type synonyms are fully expanded.
\<^descr> \<^theory_text>\typedecl (\\<^sub>1, \, \\<^sub>n) t\ declares a new type constructor \t\. If the object-logic defines a base sort \s\, then the constructor is declared to operate on that, via the axiomatic type-class instance \t :: (s, \, s)s\. \begin{warn} If you introduce a new type axiomatically, i.e.\ via @{command_ref typedecl} and @{command_ref axiomatization} (\secref{sec:axiomatizations}), the minimum requirement is that it has a non-empty model, to avoid immediate collapse of the logical environment. Moreover, one needs to demonstrate that the interpretation of such free-form axiomatizations can coexist with other axiomatization schemes for types, notably @{command_def typedef} in Isabelle/HOL (\secref{sec:hol-typedef}), or any other extension that people might have introduced elsewhere. \end{warn} \ section \Naming existing theorems \label{sec:theorems}\ text \ \begin{matharray}{rcll} @{command_def "lemmas"} & : & \local_theory \ local_theory\ \\ @{command_def "named_theorems"} & : & \local_theory \ local_theory\ \\ \end{matharray} \<^rail>\ @@{command lemmas} (@{syntax thmdef}? @{syntax thms} + @'and') @{syntax for_fixes} ; @@{command named_theorems} (@{syntax name} @{syntax text}? + @'and') \ \<^descr> \<^theory_text>\lemmas a = b\<^sub>1 \ b\<^sub>n\~@{keyword_def "for"}~\x\<^sub>1 \ x\<^sub>m\ evaluates given facts (with attributes) in the current context, which may be augmented by local variables. Results are standardized before being stored, i.e.\ schematic variables are renamed to enforce index \0\ uniformly. \<^descr> \<^theory_text>\named_theorems name description\ declares a dynamic fact within the context. The same \name\ is used to define an attribute with the usual \add\/\del\ syntax (e.g.\ see \secref{sec:simp-rules}) to maintain the content incrementally, in canonical declaration order of the text structure. \ section \Oracles \label{sec:oracles}\ text \ \begin{matharray}{rcll} @{command_def "oracle"} & : & \theory \ theory\ & (axiomatic!) \\ @{command_def "thm_oracles"}\\<^sup>*\ & : & \context \\ \\ \end{matharray} Oracles allow Isabelle to take advantage of external reasoners such as arithmetic decision procedures, model checkers, fast tautology checkers or computer algebra systems. Invoked as an oracle, an external reasoner can create arbitrary Isabelle theorems. It is the responsibility of the user to ensure that the external reasoner is as trustworthy as the application requires. Another typical source of errors is the linkup between Isabelle and the external tool, not just its concrete implementation, but also the required translation between two different logical environments. Isabelle merely guarantees well-formedness of the propositions being asserted, and records within the internal derivation object how presumed theorems depend on unproven suppositions. This also includes implicit type-class reasoning via the order-sorted algebra of class relations and type arities (see also @{command_ref instantiation} and @{command_ref instance}). \<^rail>\ @@{command oracle} @{syntax name} '=' @{syntax text} ; @@{command thm_oracles} @{syntax thms} \ \<^descr> \<^theory_text>\oracle name = "text"\ turns the given ML expression \text\ of type \<^ML_text>\'a -> cterm\ into an ML function of type \<^ML_text>\'a -> thm\, which is bound to the global identifier \<^ML_text>\name\. This acts like an infinitary specification of axioms! Invoking the oracle only works within the scope of the resulting theory. 
See \<^file>\~~/src/HOL/Examples/Iff_Oracle.thy\ for a worked example of defining a new primitive rule as an oracle, and turning it into a proof method. \<^descr> \<^theory_text>\thm_oracles thms\ displays all oracles used in the internal derivation of the given theorems; this covers the full graph of transitive dependencies. \ section \Name spaces\ text \ \begin{matharray}{rcl} @{command_def "alias"} & : & \local_theory \ local_theory\ \\ @{command_def "type_alias"} & : & \local_theory \ local_theory\ \\ @{command_def "hide_class"} & : & \theory \ theory\ \\ @{command_def "hide_type"} & : & \theory \ theory\ \\ @{command_def "hide_const"} & : & \theory \ theory\ \\ @{command_def "hide_fact"} & : & \theory \ theory\ \\ \end{matharray} \<^rail>\ (@{command alias} | @{command type_alias}) @{syntax name} '=' @{syntax name} ; (@{command hide_class} | @{command hide_type} | @{command hide_const} | @{command hide_fact}) ('(' @'open' ')')? (@{syntax name} + ) \ Isabelle organizes any kind of name declarations (of types, constants, theorems etc.) by separate hierarchically structured name spaces. Normally the user does not have to control the behaviour of name spaces by hand, yet the following commands provide some way to do so. \<^descr> \<^theory_text>\alias b = c\ and \<^theory_text>\type_alias b = c\ introduce aliases for existing constants and type constructors, respectively. This allows ad-hoc changes to name-space accesses. \<^descr> \<^theory_text>\hide_class names\ fully removes class declarations from a given name space; with the \(open)\ option, only the unqualified base name is hidden. Note that hiding name space accesses has no impact on logical declarations --- they remain valid internally. Entities that are no longer accessible to the user are printed with the special qualifier ``\??\'' prefixed to the full internal name. \<^descr> \<^theory_text>\hide_type\, \<^theory_text>\hide_const\, and \<^theory_text>\hide_fact\ are similar to \<^theory_text>\hide_class\, but hide types, constants, and facts, respectively. \ end diff --git a/src/Doc/System/Scala.thy b/src/Doc/System/Scala.thy --- a/src/Doc/System/Scala.thy +++ b/src/Doc/System/Scala.thy @@ -1,493 +1,493 @@ (*:maxLineLen=78:*) theory Scala imports Base begin chapter \Isabelle/Scala systems programming \label{sec:scala}\ text \ Isabelle/ML and Isabelle/Scala are the two main implementation languages of the Isabelle environment: \<^item> Isabelle/ML is for \<^emph>\mathematics\, to develop tools within the context of symbolic logic, e.g.\ for constructing proofs or defining domain-specific formal languages. See the \<^emph>\Isabelle/Isar implementation manual\ @{cite "isabelle-implementation"} for more details. \<^item> Isabelle/Scala is for \<^emph>\physics\, to connect with the world of systems and services, including editors and IDE frameworks. There are various ways to access Isabelle/Scala modules and operations: \<^item> Isabelle command-line tools (\secref{sec:scala-tools}) run in a separate Java process. \<^item> Isabelle/ML antiquotations access Isabelle/Scala functions (\secref{sec:scala-functions}) via the PIDE protocol: execution happens within the running Java process underlying Isabelle/Scala. \<^item> The \<^verbatim>\Console/Scala\ plugin of Isabelle/jEdit @{cite "isabelle-jedit"} operates on the running Java application, using the Scala read-eval-print-loop (REPL).
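\<^medskip> For illustration, a hypothetical interaction with the Scala read-eval-print-loop might look like this (the printed result is omitted; it assumes that registered functions render their name via \<^verbatim>\toString\): @{verbatim [display, indent = 2] \scala> isabelle.Scala.functions.map(_.toString)\} This lists all Scala functions (\secref{sec:scala-functions}) registered in the running Java application.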
The main Isabelle/Scala/jEdit functionality is provided by \<^file>\$ISABELLE_HOME/lib/classes/isabelle.jar\. Further underlying Scala and Java libraries are bundled with Isabelle, e.g.\ to access SQLite or PostgreSQL via JDBC. Add-on Isabelle components may augment the system environment by providing suitable configuration in \<^path>\etc/settings\ (GNU bash script). The shell function \<^bash_function>\classpath\ helps to write \<^path>\etc/settings\ in a portable manner: it refers to library \<^verbatim>\jar\ files in standard POSIX path notation. On Windows, this is converted to native platform format before invoking Java (\secref{sec:scala-tools}). \<^medskip> There is also an implicit build process for Isabelle/Scala/Java modules, based on \<^path>\etc/build.props\ within the component directory (see also \secref{sec:scala-build}). \ section \Command-line tools \label{sec:scala-tools}\ subsection \Java Runtime Environment \label{sec:tool-java}\ text \ The @{tool_def java} tool is a direct wrapper for the Java Runtime Environment, within the regular Isabelle settings environment (\secref{sec:settings}) and Isabelle classpath. The command line arguments are those of the bundled Java distribution: see option \<^verbatim>\-help\ in particular. The \<^verbatim>\java\ executable is taken from @{setting ISABELLE_JDK_HOME}, according to the standard directory layout for regular distributions of OpenJDK. The shell function \<^bash_function>\isabelle_jdk\ allows shell scripts to invoke other Java tools robustly (e.g.\ \<^verbatim>\isabelle_jdk jar\), without depending on accidental operating system installations. \ subsection \Scala toplevel \label{sec:tool-scala}\ text \ The @{tool_def scala} tool is a direct wrapper for the Scala toplevel, similar to @{tool java} above. The command line arguments are those of the bundled Scala distribution: see option \<^verbatim>\-help\ in particular. This allows interacting with Isabelle/Scala directly. \ subsubsection \Example\ text \ Explore the Isabelle system environment in Scala: @{verbatim [display, indent = 2] \$ isabelle scala\} @{scala [display, indent = 2] \import isabelle._ val isabelle_home = Isabelle_System.getenv("ISABELLE_HOME") val options = Options.init() options.bool("browser_info") options.string("document")\} \ subsection \Scala script wrapper\ text \ The executable @{executable "$ISABELLE_HOME/bin/isabelle_scala_script"} allows running Isabelle/Scala source files as stand-alone programs, by using a suitable ``hash-bang'' line and executable file permissions. For example: @{verbatim [display, indent = 2] \#!/usr/bin/env isabelle_scala_script\} @{scala [display, indent = 2] \val options = isabelle.Options.init() Console.println("browser_info = " + options.bool("browser_info")) Console.println("document = " + options.string("document"))\} This assumes that the executable may be found via the @{setting PATH} from the process environment: this is the case when Isabelle settings are active, e.g.\ in the context of the main Isabelle tool wrapper (\secref{sec:isabelle-tool}). Alternatively, the full \<^file>\$ISABELLE_HOME/bin/isabelle_scala_script\ may be specified in expanded form. \ subsection \Scala compiler \label{sec:tool-scalac}\ text \ The @{tool_def scalac} tool is a direct wrapper for the Scala compiler; see also @{tool scala} above. The command line arguments are those of the bundled Scala distribution.
This provides a low-level mechanism to compile further Scala modules, depending on existing Isabelle/Scala functionality; the resulting \<^verbatim>\class\ or \<^verbatim>\jar\ files can be added to the Java classpath using the shell function \<^bash_function>\classpath\. A more convenient high-level approach works via \<^path>\etc/build.props\ (see \secref{sec:scala-build}). \ section \Isabelle/Scala/Java modules \label{sec:scala-build}\ subsection \Component configuration via \<^path>\etc/build.props\\ text \ Isabelle components may augment the Isabelle/Scala/Java environment declaratively via properties given in \<^path>\etc/build.props\ (within the component directory). This specifies an output \<^verbatim>\jar\ \<^emph>\module\, based on Scala or Java \<^emph>\sources\, and arbitrary \<^emph>\resources\. Moreover, a module can specify \<^emph>\services\ that are subclasses of \<^scala_type>\isabelle.Isabelle_System.Service\; these have a particular meaning to Isabelle/Scala tools. Before running a Scala or Java process, the Isabelle system implicitly ensures that all provided modules are compiled and packaged (as jars). It is also possible to invoke @{tool scala_build} explicitly, with extra options. \<^medskip> The syntax of \<^path>\etc/build.props\ follows a regular Java properties file\<^footnote>\\<^url>\https://docs.oracle.com/en/java/javase/15/docs/api/java.base/java/util/Properties.html#load(java.io.Reader)\\, but the encoding is \<^verbatim>\UTF-8\, instead of historic \<^verbatim>\ISO 8859-1\ from the API documentation. The subsequent properties are relevant for the Scala/Java build process. Most properties are optional: the default is an empty string (or list). File names are relative to the main component directory and may refer to Isabelle settings variables (e.g. \<^verbatim>\$ISABELLE_HOME\). \<^item> \<^verbatim>\title\ (required) is a human-readable description of the module, used in printed messages. \<^item> \<^verbatim>\module\ specifies a \<^verbatim>\jar\ file name for the output module, as result of the specified sources (and resources). If this is absent (or \<^verbatim>\no_build\ is set, as described below), there is no implicit build process. The contributing sources might be given nonetheless, notably for @{tool scala_project} (\secref{sec:tool-scala-project}), which includes Scala/Java sources of components, while suppressing \<^verbatim>\jar\ modules (to avoid duplication of program content). \<^item> \<^verbatim>\no_build\ is a Boolean property, with default \<^verbatim>\false\. If set to \<^verbatim>\true\, the implicit build process for the given \<^verbatim>\module\ is \<^emph>\omitted\ --- it is assumed to be provided by other means. \<^item> \<^verbatim>\scalac_options\ and \<^verbatim>\javac_options\ augment the default settings @{setting_ref ISABELLE_SCALAC_OPTIONS} and @{setting_ref ISABELLE_JAVAC_OPTIONS} for this component; option syntax follows the regular command-line tools \<^verbatim>\scalac\ and \<^verbatim>\javac\, respectively. \<^item> \<^verbatim>\main\ specifies the main entry point for the \<^verbatim>\jar\ module. This is only relevant for direct invocation like ``\<^verbatim>\java -jar test.jar\''. \<^item> \<^verbatim>\requirements\ is a list of \<^verbatim>\jar\ modules that are needed in the compilation process, but not provided by the regular classpath (notably @{setting ISABELLE_CLASSPATH}). A \<^emph>\normal entry\ refers to a single \<^verbatim>\jar\ file name, possibly with settings variables as usual. E.g. 
\<^file>\$ISABELLE_SCALA_JAR\ for the main \<^file>\$ISABELLE_HOME/lib/classes/isabelle.jar\ (especially relevant for add-on modules). A \<^emph>\special entry\ is of the form \<^verbatim>\env:\\variable\ and refers to a settings variable from the Isabelle environment: its value may consist of multiple \<^verbatim>\jar\ entries (separated by colons). Environment variables are not expanded recursively. \<^item> \<^verbatim>\resources\ is a list of files that should be included in the resulting \<^verbatim>\jar\ file. Each item consists of a pair separated by a colon: \source\\<^verbatim>\:\\target\ means to copy an existing source file (relative to the component directory) to the given target file or directory (relative to the \<^verbatim>\jar\ name space). A \file\ specification without a colon abbreviates \file\\<^verbatim>\:\\file\, i.e. the file is copied while retaining its relative path name. \<^item> \<^verbatim>\sources\ is a list of \<^verbatim>\.scala\ or \<^verbatim>\.java\ files that contribute to the specified module. It is possible to use both languages simultaneously: the Scala and Java compilers will be invoked consecutively to make this work. \<^item> \<^verbatim>\services\ is a list of class names to be registered as Isabelle service providers (subclasses of \<^scala_type>\isabelle.Isabelle_System.Service\). Internal class names of the underlying JVM need to be given: e.g. see method @{scala_method (in java.lang.Object) getClass}. Particular services require particular subclasses: instances are filtered according to their dynamic type. For example, class \<^scala_type>\isabelle.Isabelle_Scala_Tools\ collects Scala command-line tools, and class \<^scala_type>\isabelle.Scala.Functions\ collects Scala functions (\secref{sec:scala-functions}). \ subsection \Explicit Isabelle/Scala/Java build \label{sec:tool-scala-build}\ text \ The @{tool_def scala_build} tool explicitly invokes the build process for all registered components. @{verbatim [display] \Usage: isabelle scala_build [OPTIONS] Options are: -f force fresh build -q quiet mode: suppress stdout/stderr Build Isabelle/Scala/Java modules of all registered components (if required). \} For each registered Isabelle component that provides \<^path>\etc/build.props\, the specified output \<^verbatim>\module\ is checked against the corresponding input \<^verbatim>\requirements\, \<^verbatim>\resources\, \<^verbatim>\sources\. If required, there is an automatic build using \<^verbatim>\scalac\ or \<^verbatim>\javac\ (or both). The identity of input files is recorded within the output \<^verbatim>\jar\, using SHA1 digests in \<^verbatim>\META-INF/isabelle/shasum\. \<^medskip> Option \<^verbatim>\-f\ forces a fresh build, regardless of the up-to-date status of input files vs. the output module. \<^medskip> Option \<^verbatim>\-q\ suppresses all output on stdout/stderr produced by the Scala or Java compiler. - \<^medskip> Explicit invocation of \<^verbatim>\isabelle scala_build\ mainly serves testing or + \<^medskip> Explicit invocation of @{tool scala_build} mainly serves testing or applications with special options: the Isabelle system normally does an automatic build on demand.
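\<^medskip> For illustration, @{tool scala_build} would process a minimal \<^path>\etc/build.props\ of a hypothetical add-on component like the following (the title, file names, and service class are invented for the example): @{verbatim [display, indent = 2] \title = demo module
module = lib/demo.jar
requirements = $ISABELLE_SCALA_JAR
sources = src/Demo.scala
services = demo.Demo_Functions\} If \<^verbatim>\lib/demo.jar\ is out-of-date wrt.\ \<^verbatim>\src/Demo.scala\, it is recompiled against the main Isabelle/Scala module and repackaged.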
\ subsection \Project setup for common Scala IDEs \label{sec:tool-scala-project}\ text \ The @{tool_def scala_project} tool creates a project configuration for all Isabelle/Java/Scala modules specified in components via \<^path>\etc/build.props\, together with additional source files given on the command-line: @{verbatim [display] \Usage: isabelle scala_project [OPTIONS] [MORE_SOURCES ...] Options are: -D DIR project directory (default: "$ISABELLE_HOME_USER/scala_project") -G use Gradle as build tool -L make symlinks to original source files -M use Maven as build tool -f force update of existing directory Setup project for Isabelle/Scala/jEdit --- to support common IDEs such as IntelliJ IDEA. Either option -G or -M is mandatory to specify the build tool.\} The generated configuration is for Gradle\<^footnote>\\<^url>\https://gradle.org\\ or Maven\<^footnote>\\<^url>\https://maven.apache.org\\, but the main purpose is to import it into common IDEs like IntelliJ IDEA\<^footnote>\\<^url>\https://www.jetbrains.com/idea\\. This allows exploring the sources with static analysis and other hints in real time. The generated files refer to physical file-system locations, using the path notation of the underlying OS platform. Thus the project needs to be recreated whenever the Isabelle installation is changed or moved. \<^medskip> Option \<^verbatim>\-G\ selects Gradle and \<^verbatim>\-M\ selects Maven as Java/Scala build tool: either one needs to be specified explicitly. These tools have a tendency to break down unexpectedly, so supporting both increases the chances that the generated IDE project works properly. \<^medskip> Option \<^verbatim>\-L\ produces \<^emph>\symlinks\ to the original files: this allows developing Isabelle/Scala/jEdit modules within an external IDE. The default is to \<^emph>\copy\ source files, so editing them within the IDE has no permanent effect on the originals. \<^medskip> Option \<^verbatim>\-D\ specifies an explicit project directory, instead of the default \<^path>\$ISABELLE_HOME_USER/scala_project\. Option \<^verbatim>\-f\ forces an existing project directory to be \<^emph>\purged\ --- after some sanity checks that it has been generated by @{tool "scala_project"} before. \ subsubsection \Examples\ text \ Create a project directory for editing the original sources: @{verbatim [display] \isabelle scala_project -f -L\} On Windows, this usually requires Administrator rights, in order to create native symlinks. \ section \Registered Isabelle/Scala functions \label{sec:scala-functions}\ subsection \Defining functions in Isabelle/Scala\ text \ The service class \<^scala_type>\isabelle.Scala.Functions\ collects Scala functions of type \<^scala_type>\isabelle.Scala.Fun\: by registering instances via \<^verbatim>\services\ in \<^path>\etc/build.props\ (\secref{sec:scala-build}), it becomes possible to invoke Isabelle/Scala from Isabelle/ML (see below). An example is the predefined collection of \<^scala_type>\isabelle.Scala.Functions\ in \<^file>\$ISABELLE_HOME/etc/build.props\. The overall list of registered functions is accessible in Isabelle/Scala as \<^scala_object>\isabelle.Scala.functions\. The general class \<^scala_type>\isabelle.Scala.Fun\ expects a multi-argument / multi-result function \<^scala_type>\List[isabelle.Bytes] => List[isabelle.Bytes]\; more common are instances of \<^scala_type>\isabelle.Scala.Fun_Strings\ for type \<^scala_type>\List[String] => List[String]\, or \<^scala_type>\isabelle.Scala.Fun_String\ for type \<^scala_type>\String => String\.
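\<^medskip> For illustration, here is a minimal sketch of such a function in Isabelle/Scala; all names are invented for the example, and the service class needs to be listed under \<^verbatim>\services\ in the \<^path>\etc/build.props\ of a suitable component (\secref{sec:scala-build}): @{verbatim [display, indent = 2] \package demo

// hypothetical function of type String => String, registered as "reverse"
object Reverse_Fun extends isabelle.Scala.Fun_String("reverse") {
  def apply(arg: String): String = arg.reverse
}

// service provider: "services = demo.Demo_Functions" in etc/build.props
class Demo_Functions extends isabelle.Scala.Functions(Reverse_Fun)\} After building the module, such a function could be invoked from Isabelle/ML as \@{scala reverse}\ (see below).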
\ subsection \Invoking functions in Isabelle/ML\ text \ Isabelle/PIDE provides a protocol to invoke registered Scala functions in ML: this works both within the Prover IDE and in batch builds. The subsequent ML antiquotations refer to Scala functions in a formally-checked manner. \begin{matharray}{rcl} @{ML_antiquotation_def "scala_function"} & : & \ML_antiquotation\ \\ @{ML_antiquotation_def "scala"} & : & \ML_antiquotation\ \\ @{ML_antiquotation_def "scala_thread"} & : & \ML_antiquotation\ \\ \end{matharray} \<^rail>\ (@{ML_antiquotation scala_function} | @{ML_antiquotation scala} | @{ML_antiquotation scala_thread}) @{syntax embedded} \ \<^descr> \@{scala_function name}\ inlines the checked function name as an ML string literal. \<^descr> \@{scala name}\ and \@{scala_thread name}\ invoke the checked function via the PIDE protocol. In Isabelle/ML this appears as a function of type \<^ML_type>\string list -> string list\ or \<^ML_type>\string -> string\, depending on the definition in Isabelle/Scala. Evaluation is subject to interrupts within the ML runtime environment as usual. A \<^scala>\null\ result in Scala raises an exception \<^ML>\Scala.Null\ in ML. The execution of \@{scala}\ works via a Scala future on a bounded thread farm, while \@{scala_thread}\ always forks a separate Java/VM thread. The standard approach of representing datatypes via strings works via XML in YXML transfer syntax. See Isabelle/ML operations and modules @{ML YXML.string_of_body}, @{ML YXML.parse_body}, @{ML_structure XML.Encode}, @{ML_structure XML.Decode}; similarly for Isabelle/Scala. Isabelle symbols may have to be recoded via Scala operations \<^scala_method>\isabelle.Symbol.decode\ and \<^scala_method>\isabelle.Symbol.encode\. \ subsubsection \Examples\ text \ Invoke the predefined Scala function \<^scala_function>\echo\: \ ML \ val s = "test"; val s' = \<^scala>\echo\ s; \<^assert> (s = s') \ text \ Let the Scala compiler process some toplevel declarations, producing a list of errors: \ ML \ val source = "class A(a: Int, b: Boolean)" val errors = \<^scala>\scala_toplevel\ source |> YXML.parse_body |> let open XML.Decode in list string end; \<^assert> (null errors)\ text \ The above is merely for demonstration. See \<^ML>\Scala_Compiler.toplevel\ for a more convenient version with builtin decoding and treatment of errors. \ section \Documenting Isabelle/Scala entities\ text \ The subsequent document antiquotations help to document Isabelle/Scala entities, with formal checking of names against the Isabelle classpath. \begin{matharray}{rcl} @{antiquotation_def "scala"} & : & \antiquotation\ \\ @{antiquotation_def "scala_object"} & : & \antiquotation\ \\ @{antiquotation_def "scala_type"} & : & \antiquotation\ \\ @{antiquotation_def "scala_method"} & : & \antiquotation\ \\ \end{matharray} \<^rail>\ (@@{antiquotation scala} | @@{antiquotation scala_object}) @{syntax embedded} ; @@{antiquotation scala_type} @{syntax embedded} types ; @@{antiquotation scala_method} class @{syntax embedded} types args ; class: ('(' @'in' @{syntax name} types ')')? ; types: ('[' (@{syntax name} ',' +) ']')? ; args: ('(' (nat | (('_' | @{syntax name}) + ',')) ')')? \ \<^descr> \@{scala s}\ is similar to \@{verbatim s}\, but the given source text is checked by the Scala compiler as a toplevel declaration (without evaluation). This allows writing Isabelle/Scala examples that are statically checked. \<^descr> \@{scala_object x}\ checks the given Scala object name (simple value or ground module) and prints the result verbatim.
\<^descr> \@{scala_type T[A]}\ checks the given Scala type name (with optional type parameters) and prints the result verbatim. \<^descr> \@{scala_method (in c[A]) m[B](n)}\ checks the given Scala method \m\ in the context of class \c\. The method argument slots are either specified by a number \n\ or by a list of (optional) argument types; this may refer to type variables specified for the class or method: \A\ or \B\ above. Everything except for the method name \m\ is optional. The absence of the class context means that this is a static method. The absence of arguments with types means that the method can be determined uniquely as \<^verbatim>\(\\m\\<^verbatim>\ _)\ in Scala (no overloading). \ subsubsection \Examples\ text \ Miscellaneous Isabelle/Scala entities: \<^item> object: \<^scala_object>\isabelle.Isabelle_Process\ \<^item> type without parameter: @{scala_type isabelle.Console_Progress} \<^item> type with parameter: @{scala_type List[A]} \<^item> static method: \<^scala_method>\isabelle.Isabelle_System.bash\ \<^item> class and method with type parameters: @{scala_method (in List[A]) map[B]("A => B")} \<^item> overloaded method with argument type: @{scala_method (in Int) "+" (Int)} \ end diff --git a/src/Pure/General/file.scala b/src/Pure/General/file.scala --- a/src/Pure/General/file.scala +++ b/src/Pure/General/file.scala @@ -1,329 +1,330 @@ /* Title: Pure/General/file.scala Author: Makarius File-system operations. */ package isabelle import java.io.{BufferedWriter, OutputStreamWriter, FileOutputStream, BufferedOutputStream, OutputStream, InputStream, FileInputStream, BufferedInputStream, BufferedReader, InputStreamReader, File => JFile, IOException} import java.nio.file.{StandardOpenOption, Path => JPath, Files, SimpleFileVisitor, FileVisitOption, FileVisitResult} import java.nio.file.attribute.BasicFileAttributes import java.net.{URL, MalformedURLException} import java.util.zip.{GZIPInputStream, GZIPOutputStream} import java.util.EnumSet import org.tukaani.xz.{XZInputStream, XZOutputStream} import scala.collection.mutable object File { /* standard path (Cygwin or Posix) */ def standard_path(path: Path): String = path.expand.implode def standard_path(platform_path: String): String = isabelle.setup.Environment.standard_path(platform_path) def standard_path(file: JFile): String = standard_path(file.getPath) def standard_url(name: String): String = try { val url = new URL(name) if (url.getProtocol == "file" && Url.is_wellformed_file(name)) standard_path(Url.parse_file(name)) else name } catch { case _: MalformedURLException => standard_path(name) } /* platform path (Windows or Posix) */ def platform_path(standard_path: String): String = isabelle.setup.Environment.platform_path(standard_path) def platform_path(path: Path): String = platform_path(standard_path(path)) def platform_file(path: Path): JFile = new JFile(platform_path(path)) /* platform files */ def absolute(file: JFile): JFile = file.toPath.toAbsolutePath.normalize.toFile def absolute_name(file: JFile): String = absolute(file).getPath def canonical(file: JFile): JFile = file.getCanonicalFile def canonical_name(file: JFile): String = canonical(file).getPath def path(file: JFile): Path = Path.explode(standard_path(file)) def pwd(): Path = path(Path.current.absolute_file) /* relative paths */ def relative_path(base: Path, other: Path): Option[Path] = { val base_path = base.java_path val other_path = other.java_path if (other_path.startsWith(base_path)) Some(path(base_path.relativize(other_path).toFile)) else None } /* bash 
path */ def bash_path(path: Path): String = Bash.string(standard_path(path)) def bash_path(file: JFile): String = Bash.string(standard_path(file)) def bash_platform_path(path: Path): String = Bash.string(platform_path(path)) /* directory entries */ def check_dir(path: Path): Path = if (path.is_dir) path else error("No such directory: " + path) def check_file(path: Path): Path = if (path.is_file) path else error("No such file: " + path) /* directory content */ def read_dir(dir: Path): List[String] = { if (!dir.is_dir) error("No such directory: " + dir.toString) val files = dir.file.listFiles if (files == null) Nil else files.toList.map(_.getName).sorted } def get_dir(dir: Path): String = read_dir(dir).filter(name => (dir + Path.basic(name)).is_dir) match { case List(entry) => entry case dirs => error("Exactly one directory entry expected: " + commas_quote(dirs.sorted)) } def find_files( start: JFile, pred: JFile => Boolean = _ => true, include_dirs: Boolean = false, follow_links: Boolean = false ): List[JFile] = { val result = new mutable.ListBuffer[JFile] def check(file: JFile): Unit = if (pred(file)) result += file if (start.isFile) check(start) else if (start.isDirectory) { val options = if (follow_links) EnumSet.of(FileVisitOption.FOLLOW_LINKS) else EnumSet.noneOf(classOf[FileVisitOption]) Files.walkFileTree(start.toPath, options, Integer.MAX_VALUE, new SimpleFileVisitor[JPath] { override def preVisitDirectory( path: JPath, attrs: BasicFileAttributes ): FileVisitResult = { if (include_dirs) check(path.toFile) FileVisitResult.CONTINUE } override def visitFile( path: JPath, attrs: BasicFileAttributes ): FileVisitResult = { val file = path.toFile if (include_dirs || !file.isDirectory) check(file) FileVisitResult.CONTINUE } } ) } result.toList } /* read */ def read(file: JFile): String = Bytes.read(file).text def read(path: Path): String = read(path.file) def read_stream(reader: BufferedReader): String = { val output = new StringBuilder(100) var c = -1 while ({ c = reader.read; c != -1 }) output += c.toChar reader.close() output.toString } def read_stream(stream: InputStream): String = read_stream(new BufferedReader(new InputStreamReader(stream, UTF8.charset))) def read_gzip(file: JFile): String = read_stream(new GZIPInputStream(new BufferedInputStream(new FileInputStream(file)))) def read_gzip(path: Path): String = read_gzip(path.file) def read_xz(file: JFile): String = read_stream(new XZInputStream(new BufferedInputStream(new FileInputStream(file)))) def read_xz(path: Path): String = read_xz(path.file) /* read lines */ def read_line(reader: BufferedReader): Option[String] = { val line = try { reader.readLine} catch { case _: IOException => null } Option(line).map(Library.trim_line) } def read_lines(reader: BufferedReader, progress: String => Unit): List[String] = { val result = new mutable.ListBuffer[String] var line: Option[String] = None while ({ line = read_line(reader); line.isDefined }) { progress(line.get) result += line.get } reader.close() result.toList } /* write */ def writer(file: JFile): BufferedWriter = new BufferedWriter(new OutputStreamWriter(new FileOutputStream(file), UTF8.charset)) def write_file( file: JFile, text: String, make_stream: OutputStream => OutputStream ): Unit = { val stream = make_stream(new FileOutputStream(file)) using(new BufferedWriter(new OutputStreamWriter(stream, UTF8.charset)))(_.append(text)) } def write(file: JFile, text: String): Unit = write_file(file, text, s => s) def write(path: Path, text: String): Unit = write(path.file, text) def 
write_gzip(file: JFile, text: String): Unit = write_file(file, text, (s: OutputStream) => new GZIPOutputStream(new BufferedOutputStream(s))) def write_gzip(path: Path, text: String): Unit = write_gzip(path.file, text) def write_xz(file: JFile, text: String, options: XZ.Options): Unit = File.write_file(file, text, s => new XZOutputStream(new BufferedOutputStream(s), options)) def write_xz(file: JFile, text: String): Unit = write_xz(file, text, XZ.options()) def write_xz(path: Path, text: String, options: XZ.Options): Unit = write_xz(path.file, text, options) def write_xz(path: Path, text: String): Unit = write_xz(path, text, XZ.options()) def write_backup(path: Path, text: String): Unit = { if (path.is_file) Isabelle_System.move_file(path, path.backup) write(path, text) } def write_backup2(path: Path, text: String): Unit = { if (path.is_file) Isabelle_System.move_file(path, path.backup2) write(path, text) } /* append */ def append(file: JFile, text: String): Unit = Files.write(file.toPath, UTF8.bytes(text), StandardOpenOption.APPEND, StandardOpenOption.CREATE) def append(path: Path, text: String): Unit = append(path.file, text) /* change */ def change( path: Path, init: Boolean = false, strict: Boolean = false )(f: String => String): Unit = { if (!path.is_file && init) write(path, "") val x = read(path) val y = f(x) if (x != y) write(path, y) else if (strict) error("Unchanged file: " + path) } def change_lines(path: Path, init: Boolean = false, strict: Boolean = false)( f: List[String] => List[String]): Unit = change(path, init = init, strict = strict)(text => cat_lines(f(split_lines(text)))) /* eq */ def eq(file1: JFile, file2: JFile): Boolean = try { Files.isSameFile(file1.toPath, file2.toPath) } catch { case ERROR(_) => false } def eq(path1: Path, path2: Path): Boolean = eq(path1.file, path2.file) /* eq_content */ def eq_content(file1: JFile, file2: JFile): Boolean = if (eq(file1, file2)) true else if (file1.length != file2.length) false else Bytes.read(file1) == Bytes.read(file2) def eq_content(path1: Path, path2: Path): Boolean = eq_content(path1.file, path2.file) /* permissions */ def is_executable(path: Path): Boolean = { if (Platform.is_windows) Isabelle_System.bash("test -x " + bash_path(path)).check.ok else path.file.canExecute } def set_executable(path: Path, flag: Boolean): Unit = { if (Platform.is_windows && flag) Isabelle_System.chmod("a+x", path) else if (Platform.is_windows) Isabelle_System.chmod("a-x", path) else path.file.setExecutable(flag, false) } /* content */ object Content { - def apply(path: Path, content: Bytes): Content = new Content_Bytes(path, content) - def apply(path: Path, content: String): Content = new Content_String(path, content) + def apply(path: Path, content: Bytes): Content_Bytes = new Content_Bytes(path, content) + def apply(path: Path, content: String): Content_String = new Content_String(path, content) def apply(path: Path, content: XML.Body): Content_XML = new Content_XML(path, content) } trait Content { def path: Path def write(dir: Path): Unit + override def toString: String = path.toString } - final class Content_Bytes private[File](val path: Path, content: Bytes) extends Content { + final class Content_Bytes private[File](val path: Path, val content: Bytes) extends Content { def write(dir: Path): Unit = { val full_path = dir + path Isabelle_System.make_directory(full_path.expand.dir) Bytes.write(full_path, content) } } - final class Content_String private[File](val path: Path, content: String) extends Content { + final class Content_String 
private[File](val path: Path, val content: String) extends Content { def write(dir: Path): Unit = { val full_path = dir + path Isabelle_System.make_directory(full_path.expand.dir) File.write(full_path, content) } } - final class Content_XML private[File](val path: Path, content: XML.Body) { + final class Content_XML private[File](val path: Path, val content: XML.Body) { def output(out: XML.Body => String): Content_String = new Content_String(path, out(content)) } } diff --git a/src/Pure/Pure.thy b/src/Pure/Pure.thy --- a/src/Pure/Pure.thy +++ b/src/Pure/Pure.thy @@ -1,1547 +1,1546 @@ (* Title: Pure/Pure.thy Author: Makarius The Pure theory, with definitions of Isar commands and some lemmas. *) theory Pure keywords "!!" "!" "+" ":" ";" "<" "<=" "==" "=>" "?" "[" "\" "\" "\" "\" "\" "\" "]" "binder" "by" "in" "infix" "infixl" "infixr" "is" "open" "output" "overloaded" "pervasive" "premises" "structure" "unchecked" and "private" "qualified" :: before_command and "assumes" "constrains" "defines" "fixes" "for" "if" "includes" "notes" "rewrites" "obtains" "shows" "when" "where" "|" :: quasi_command and "text" "txt" :: document_body and "text_raw" :: document_raw and "default_sort" :: thy_decl and "typedecl" "nonterminal" "judgment" "consts" "syntax" "no_syntax" "translations" "no_translations" "type_notation" "no_type_notation" "notation" "no_notation" "alias" "type_alias" "declare" "hide_class" "hide_type" "hide_const" "hide_fact" :: thy_decl and "type_synonym" "definition" "abbreviation" "lemmas" :: thy_defn and "axiomatization" :: thy_stmt and "external_file" "bibtex_file" "ROOTS_file" :: thy_load and "generate_file" :: thy_decl and "export_generated_files" :: diag + and "scala_build_generated_files" :: diag and "compile_generated_files" :: diag and "external_files" "export_files" "export_prefix" - and "scala_build_component" "scala_build_directory" :: diag + and "export_classpath" and "ML_file" "ML_file_debug" "ML_file_no_debug" :: thy_load % "ML" and "SML_file" "SML_file_debug" "SML_file_no_debug" :: thy_load % "ML" and "SML_import" "SML_export" "ML_export" :: thy_decl % "ML" and "ML_prf" :: prf_decl % "proof" (* FIXME % "ML" ?? 
*) and "ML_val" "ML_command" :: diag % "ML" and "simproc_setup" :: thy_decl % "ML" and "setup" "local_setup" "attribute_setup" "method_setup" "declaration" "syntax_declaration" "parse_ast_translation" "parse_translation" "print_translation" "typed_print_translation" "print_ast_translation" "oracle" :: thy_decl % "ML" and "bundle" :: thy_decl_block and "unbundle" :: thy_decl and "include" "including" :: prf_decl and "print_bundles" :: diag and "context" "locale" "experiment" :: thy_decl_block and "interpret" :: prf_goal % "proof" and "interpretation" "global_interpretation" "sublocale" :: thy_goal and "class" :: thy_decl_block and "subclass" :: thy_goal and "instantiation" :: thy_decl_block and "instance" :: thy_goal and "overloading" :: thy_decl_block and "opening" :: quasi_command and "code_datatype" :: thy_decl and "theorem" "lemma" "corollary" "proposition" :: thy_goal_stmt and "schematic_goal" :: thy_goal_stmt and "notepad" :: thy_decl_block and "have" :: prf_goal % "proof" and "hence" :: prf_goal % "proof" and "show" :: prf_asm_goal % "proof" and "thus" :: prf_asm_goal % "proof" and "then" "from" "with" :: prf_chain % "proof" and "note" :: prf_decl % "proof" and "supply" :: prf_script % "proof" and "using" "unfolding" :: prf_decl % "proof" and "fix" "assume" "presume" "define" :: prf_asm % "proof" and "consider" :: prf_goal % "proof" and "obtain" :: prf_asm_goal % "proof" and "let" "write" :: prf_decl % "proof" and "case" :: prf_asm % "proof" and "{" :: prf_open % "proof" and "}" :: prf_close % "proof" and "next" :: next_block % "proof" and "qed" :: qed_block % "proof" and "by" ".." "." "sorry" "\" :: "qed" % "proof" and "done" :: "qed_script" % "proof" and "oops" :: qed_global % "proof" and "defer" "prefer" "apply" :: prf_script % "proof" and "apply_end" :: prf_script % "proof" and "subgoal" :: prf_script_goal % "proof" and "proof" :: prf_block % "proof" and "also" "moreover" :: prf_decl % "proof" and "finally" "ultimately" :: prf_chain % "proof" and "back" :: prf_script % "proof" and "help" "print_commands" "print_options" "print_context" "print_theory" "print_definitions" "print_syntax" "print_abbrevs" "print_defn_rules" "print_theorems" "print_locales" "print_classes" "print_locale" "print_interps" "print_attributes" "print_simpset" "print_rules" "print_trans_rules" "print_methods" "print_antiquotations" "print_ML_antiquotations" "thy_deps" "locale_deps" "class_deps" "thm_deps" "thm_oracles" "print_term_bindings" "print_facts" "print_cases" "print_statement" "thm" "prf" "full_prf" "prop" "term" "typ" "print_codesetup" "unused_thms" :: diag and "print_state" :: diag and "welcome" :: diag and "end" :: thy_end and "realizers" :: thy_decl and "realizability" :: thy_decl and "extract_type" "extract" :: thy_decl and "find_theorems" "find_consts" :: diag and "named_theorems" :: thy_decl abbrevs "\\tag" = "\<^marker>\tag \" and "===>" = "===>" (*prevent replacement of very long arrows*) and "--->" = "\\" and "hence" "thus" "default_sort" "simproc_setup" "apply_end" "realizers" "realizability" = "" and "hence" = "then have" and "thus" = "then show" begin section \Isar commands\ subsection \Other files\ ML \ local val _ = Outer_Syntax.command \<^command_keyword>\external_file\ "formal dependency on external file" (Resources.provide_parse_file >> (fn get_file => Toplevel.theory (#2 o get_file))); val _ = Outer_Syntax.command \<^command_keyword>\bibtex_file\ "check bibtex database file in Prover IDE" (Resources.provide_parse_file >> (fn get_file => Toplevel.theory (fn thy => let val ({lines, 
pos, ...}, thy') = get_file thy; val _ = Bibtex.check_database_output pos (cat_lines lines); in thy' end))); val _ = Outer_Syntax.command \<^command_keyword>\ROOTS_file\ "session ROOTS file" (Resources.provide_parse_file >> (fn get_file => Toplevel.theory (fn thy => let val ({src_path, lines, pos = pos0, ...}, thy') = get_file thy; val ctxt = Proof_Context.init_global thy'; val dir = Path.dir (Path.expand (Resources.master_directory thy' + src_path)); val _ = (lines, pos0) |-> fold (fn line => fn pos1 => let val pos2 = Position.symbol_explode line pos1; val range = Position.range (pos1, pos2); val source = Input.source true line range; val _ = if line = "" then () else if String.isPrefix "#" line then Context_Position.report ctxt (#1 range) Markup.comment else (ignore (Resources.check_session_dir ctxt (SOME dir) source) handle ERROR msg => Output.error_message msg); in pos2 |> Position.symbol "\n" end); in thy' end))); val _ = Outer_Syntax.local_theory \<^command_keyword>\generate_file\ "generate source file, with antiquotations" (Parse.path_binding -- (\<^keyword>\=\ |-- Parse.embedded_input) >> Generated_Files.generate_file_cmd); val files_in_theory = (Parse.underscore >> K [] || Scan.repeat1 Parse.path_binding) -- Scan.option (\<^keyword>\(\ |-- Parse.!!! (\<^keyword>\in\ |-- Parse.theory_name --| \<^keyword>\)\)); val _ = Outer_Syntax.command \<^command_keyword>\export_generated_files\ "export generated files from given theories" (Parse.and_list1 files_in_theory >> (fn args => Toplevel.keep (fn st => Generated_Files.export_generated_files_cmd (Toplevel.context_of st) args))); val base_dir = Scan.optional (\<^keyword>\(\ |-- Parse.!!! (\<^keyword>\in\ |-- Parse.path_input --| \<^keyword>\)\)) (Input.string ""); val external_files = Scan.repeat1 Parse.path_input -- base_dir; val exe = Parse.reserved "exe" >> K true || Parse.reserved "executable" >> K false; val executable = \<^keyword>\(\ |-- Parse.!!! (exe --| \<^keyword>\)\) >> SOME || Scan.succeed NONE; val export_files = Scan.repeat1 Parse.path_binding -- executable; val _ = Outer_Syntax.command \<^command_keyword>\compile_generated_files\ "compile generated files and export results" (Parse.and_list files_in_theory -- Scan.optional (\<^keyword>\external_files\ |-- Parse.!!! (Parse.and_list1 external_files)) [] -- Scan.optional (\<^keyword>\export_files\ |-- Parse.!!! (Parse.and_list1 export_files)) [] -- Scan.optional (\<^keyword>\export_prefix\ |-- Parse.path_binding) ("", Position.none) -- (Parse.where_ |-- Parse.!!! Parse.ML_source) >> (fn ((((args, external), export), export_prefix), source) => Toplevel.keep (fn st => Generated_Files.compile_generated_files_cmd (Toplevel.context_of st) args external export export_prefix source))); val _ = - Outer_Syntax.command \<^command_keyword>\scala_build_component\ - "build and export Isabelle/Scala/Java module (defined via etc/build.props)" - (Parse.path_input >> (fn dir => - Toplevel.keep (fn st => Scala_Build.scala_build_component (Toplevel.context_of st) dir))); - - val _ = - Outer_Syntax.command \<^command_keyword>\scala_build_directory\ - "build and export Isabelle/Scala/Java module (defined via build.props)" - (Parse.path_input >> (fn dir => - Toplevel.keep (fn st => Scala_Build.scala_build_directory (Toplevel.context_of st) dir))); + Outer_Syntax.command \<^command_keyword>\scala_build_generated_files\ + "build and export Isabelle/Scala/Java module" + (Parse.and_list files_in_theory -- + Scan.optional (\<^keyword>\external_files\ |-- Parse.!!! 
(Parse.and_list1 external_files)) [] + >> (fn (args, external) => + Toplevel.keep (fn st => + Generated_Files.scala_build_generated_files_cmd + (Toplevel.context_of st) args external))); in end\ external_file "ROOT0.ML" external_file "ROOT.ML" subsection \Embedded ML text\ ML \ local val semi = Scan.option \<^keyword>\;\; val _ = Outer_Syntax.command \<^command_keyword>\ML_file\ "read and evaluate Isabelle/ML file" (Resources.parse_file --| semi >> ML_File.ML NONE); val _ = Outer_Syntax.command \<^command_keyword>\ML_file_debug\ "read and evaluate Isabelle/ML file (with debugger information)" (Resources.parse_file --| semi >> ML_File.ML (SOME true)); val _ = Outer_Syntax.command \<^command_keyword>\ML_file_no_debug\ "read and evaluate Isabelle/ML file (no debugger information)" (Resources.parse_file --| semi >> ML_File.ML (SOME false)); val _ = Outer_Syntax.command \<^command_keyword>\SML_file\ "read and evaluate Standard ML file" (Resources.parse_file --| semi >> ML_File.SML NONE); val _ = Outer_Syntax.command \<^command_keyword>\SML_file_debug\ "read and evaluate Standard ML file (with debugger information)" (Resources.parse_file --| semi >> ML_File.SML (SOME true)); val _ = Outer_Syntax.command \<^command_keyword>\SML_file_no_debug\ "read and evaluate Standard ML file (no debugger information)" (Resources.parse_file --| semi >> ML_File.SML (SOME false)); val _ = Outer_Syntax.command \<^command_keyword>\SML_export\ "evaluate SML within Isabelle/ML environment" (Parse.ML_source >> (fn source => let val flags: ML_Compiler.flags = {environment = ML_Env.SML_export, redirect = false, verbose = true, debug = NONE, writeln = writeln, warning = warning}; in Toplevel.theory (Context.theory_map (ML_Context.exec (fn () => ML_Context.eval_source flags source))) end)); val _ = Outer_Syntax.command \<^command_keyword>\SML_import\ "evaluate Isabelle/ML within SML environment" (Parse.ML_source >> (fn source => let val flags: ML_Compiler.flags = {environment = ML_Env.SML_import, redirect = false, verbose = true, debug = NONE, writeln = writeln, warning = warning}; in Toplevel.generic_theory (ML_Context.exec (fn () => ML_Context.eval_source flags source) #> Local_Theory.propagate_ml_env) end)); val _ = Outer_Syntax.command ("ML_export", \<^here>) "ML text within theory or local theory, and export to bootstrap environment" (Parse.ML_source >> (fn source => Toplevel.generic_theory (fn context => context |> Config.put_generic ML_Env.ML_environment ML_Env.Isabelle |> Config.put_generic ML_Env.ML_write_global true |> ML_Context.exec (fn () => ML_Context.eval_source (ML_Compiler.verbose true ML_Compiler.flags) source) |> Config.restore_generic ML_Env.ML_write_global context |> Config.restore_generic ML_Env.ML_environment context |> Local_Theory.propagate_ml_env))); val _ = Outer_Syntax.command \<^command_keyword>\ML_prf\ "ML text within proof" (Parse.ML_source >> (fn source => Toplevel.proof (Proof.map_context (Context.proof_map (ML_Context.exec (fn () => ML_Context.eval_source (ML_Compiler.verbose true ML_Compiler.flags) source))) #> Proof.propagate_ml_env))); val _ = Outer_Syntax.command \<^command_keyword>\ML_val\ "diagnostic ML text" (Parse.ML_source >> Isar_Cmd.ml_diag true); val _ = Outer_Syntax.command \<^command_keyword>\ML_command\ "diagnostic ML text (silent)" (Parse.ML_source >> Isar_Cmd.ml_diag false); val _ = Outer_Syntax.command \<^command_keyword>\setup\ "ML setup for global theory" (Parse.ML_source >> (Toplevel.theory o Isar_Cmd.setup)); val _ = Outer_Syntax.local_theory 
\<^command_keyword>\local_setup\ "ML setup for local theory" (Parse.ML_source >> Isar_Cmd.local_setup); val _ = Outer_Syntax.command \<^command_keyword>\oracle\ "declare oracle" (Parse.range Parse.name -- Parse.!!! (\<^keyword>\=\ |-- Parse.ML_source) >> (fn (x, y) => Toplevel.theory (Isar_Cmd.oracle x y))); val _ = Outer_Syntax.local_theory \<^command_keyword>\attribute_setup\ "define attribute in ML" (Parse.name_position -- Parse.!!! (\<^keyword>\=\ |-- Parse.ML_source -- Scan.optional Parse.embedded "") >> (fn (name, (txt, cmt)) => Attrib.attribute_setup name txt cmt)); val _ = Outer_Syntax.local_theory \<^command_keyword>\method_setup\ "define proof method in ML" (Parse.name_position -- Parse.!!! (\<^keyword>\=\ |-- Parse.ML_source -- Scan.optional Parse.embedded "") >> (fn (name, (txt, cmt)) => Method.method_setup name txt cmt)); val _ = Outer_Syntax.local_theory \<^command_keyword>\declaration\ "generic ML declaration" (Parse.opt_keyword "pervasive" -- Parse.ML_source >> (fn (pervasive, txt) => Isar_Cmd.declaration {syntax = false, pervasive = pervasive} txt)); val _ = Outer_Syntax.local_theory \<^command_keyword>\syntax_declaration\ "generic ML syntax declaration" (Parse.opt_keyword "pervasive" -- Parse.ML_source >> (fn (pervasive, txt) => Isar_Cmd.declaration {syntax = true, pervasive = pervasive} txt)); val _ = Outer_Syntax.local_theory \<^command_keyword>\simproc_setup\ "define simproc in ML" (Parse.name_position -- (\<^keyword>\(\ |-- Parse.enum1 "|" Parse.term --| \<^keyword>\)\ --| \<^keyword>\=\) -- Parse.ML_source >> (fn ((a, b), c) => Isar_Cmd.simproc_setup a b c)); in end\ subsection \Theory commands\ subsubsection \Sorts and types\ ML \ local val _ = Outer_Syntax.local_theory \<^command_keyword>\default_sort\ "declare default sort for explicit type variables" (Parse.sort >> (fn s => fn lthy => Local_Theory.set_defsort (Syntax.read_sort lthy s) lthy)); val _ = Outer_Syntax.local_theory \<^command_keyword>\typedecl\ "type declaration" (Parse.type_args -- Parse.binding -- Parse.opt_mixfix >> (fn ((args, a), mx) => Typedecl.typedecl {final = true} (a, map (rpair dummyS) args, mx) #> snd)); val _ = Outer_Syntax.local_theory \<^command_keyword>\type_synonym\ "declare type abbreviation" (Parse.type_args -- Parse.binding -- (\<^keyword>\=\ |-- Parse.!!! (Parse.typ -- Parse.opt_mixfix')) >> (fn ((args, a), (rhs, mx)) => snd o Typedecl.abbrev_cmd (a, args, mx) rhs)); in end\ subsubsection \Consts\ ML \ local val _ = Outer_Syntax.command \<^command_keyword>\judgment\ "declare object-logic judgment" (Parse.const_binding >> (Toplevel.theory o Object_Logic.add_judgment_cmd)); val _ = Outer_Syntax.command \<^command_keyword>\consts\ "declare constants" (Scan.repeat1 Parse.const_binding >> (Toplevel.theory o Sign.add_consts_cmd)); in end\ subsubsection \Syntax and translations\ ML \ local val _ = Outer_Syntax.command \<^command_keyword>\nonterminal\ "declare syntactic type constructors (grammar nonterminal symbols)" (Parse.and_list1 Parse.binding >> (Toplevel.theory o Sign.add_nonterminals_global)); val _ = Outer_Syntax.local_theory \<^command_keyword>\syntax\ "add raw syntax clauses" (Parse.syntax_mode -- Scan.repeat1 Parse.const_decl >> uncurry (Local_Theory.syntax_cmd true)); val _ = Outer_Syntax.local_theory \<^command_keyword>\no_syntax\ "delete raw syntax clauses" (Parse.syntax_mode -- Scan.repeat1 Parse.const_decl >> uncurry (Local_Theory.syntax_cmd false)); val trans_pat = Scan.optional (\<^keyword>\(\ |-- Parse.!!! 
(Parse.inner_syntax Parse.name --| \<^keyword>\)\)) "logic" -- Parse.inner_syntax Parse.string; fun trans_arrow toks = ((\<^keyword>\\\ || \<^keyword>\=>\) >> K Syntax.Parse_Rule || (\<^keyword>\\\ || \<^keyword>\<=\) >> K Syntax.Print_Rule || (\<^keyword>\\\ || \<^keyword>\==\) >> K Syntax.Parse_Print_Rule) toks; val trans_line = trans_pat -- Parse.!!! (trans_arrow -- trans_pat) >> (fn (left, (arr, right)) => arr (left, right)); val _ = Outer_Syntax.command \<^command_keyword>\translations\ "add syntax translation rules" (Scan.repeat1 trans_line >> (Toplevel.theory o Isar_Cmd.translations)); val _ = Outer_Syntax.command \<^command_keyword>\no_translations\ "delete syntax translation rules" (Scan.repeat1 trans_line >> (Toplevel.theory o Isar_Cmd.no_translations)); in end\ subsubsection \Translation functions\ ML \ local val _ = Outer_Syntax.command \<^command_keyword>\parse_ast_translation\ "install parse ast translation functions" (Parse.ML_source >> (Toplevel.theory o Isar_Cmd.parse_ast_translation)); val _ = Outer_Syntax.command \<^command_keyword>\parse_translation\ "install parse translation functions" (Parse.ML_source >> (Toplevel.theory o Isar_Cmd.parse_translation)); val _ = Outer_Syntax.command \<^command_keyword>\print_translation\ "install print translation functions" (Parse.ML_source >> (Toplevel.theory o Isar_Cmd.print_translation)); val _ = Outer_Syntax.command \<^command_keyword>\typed_print_translation\ "install typed print translation functions" (Parse.ML_source >> (Toplevel.theory o Isar_Cmd.typed_print_translation)); val _ = Outer_Syntax.command \<^command_keyword>\print_ast_translation\ "install print ast translation functions" (Parse.ML_source >> (Toplevel.theory o Isar_Cmd.print_ast_translation)); in end\ subsubsection \Specifications\ ML \ local val _ = Outer_Syntax.local_theory' \<^command_keyword>\definition\ "constant definition" (Scan.option Parse_Spec.constdecl -- (Parse_Spec.opt_thm_name ":" -- Parse.prop) -- Parse_Spec.if_assumes -- Parse.for_fixes >> (fn (((decl, spec), prems), params) => #2 oo Specification.definition_cmd decl params prems spec)); val _ = Outer_Syntax.local_theory' \<^command_keyword>\abbreviation\ "constant abbreviation" (Parse.syntax_mode -- Scan.option Parse_Spec.constdecl -- Parse.prop -- Parse.for_fixes >> (fn (((mode, decl), spec), params) => Specification.abbreviation_cmd mode decl params spec)); val axiomatization = Parse.and_list1 (Parse_Spec.thm_name ":" -- Parse.prop) -- Parse_Spec.if_assumes -- Parse.for_fixes >> (fn ((a, b), c) => (c, b, a)); val _ = Outer_Syntax.command \<^command_keyword>\axiomatization\ "axiomatic constant specification" (Scan.optional Parse.vars [] -- Scan.optional (Parse.where_ |-- Parse.!!! axiomatization) ([], [], []) >> (fn (a, (b, c, d)) => Toplevel.theory (#2 o Specification.axiomatization_cmd a b c d))); val _ = Outer_Syntax.local_theory \<^command_keyword>\alias\ "name-space alias for constant" (Parse.binding -- (Parse.!!! \<^keyword>\=\ |-- Parse.name_position) >> Specification.alias_cmd); val _ = Outer_Syntax.local_theory \<^command_keyword>\type_alias\ "name-space alias for type constructor" (Parse.binding -- (Parse.!!! 
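(* Illustrative note: the "definition" parser above corresponds to surface
   syntax such as the hypothetical

     definition sq :: "nat \ nat" where "sq n = n * n"

   with optional "if" premises and a trailing "for" clause for parameters. *)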
\<^keyword>\=\ |-- Parse.name_position) >> Specification.type_alias_cmd); in end\ subsubsection \Notation\ ML \ local val _ = Outer_Syntax.local_theory \<^command_keyword>\type_notation\ "add concrete syntax for type constructors" (Parse.syntax_mode -- Parse.and_list1 (Parse.type_const -- Parse.mixfix) >> (fn (mode, args) => Local_Theory.type_notation_cmd true mode args)); val _ = Outer_Syntax.local_theory \<^command_keyword>\no_type_notation\ "delete concrete syntax for type constructors" (Parse.syntax_mode -- Parse.and_list1 (Parse.type_const -- Parse.mixfix) >> (fn (mode, args) => Local_Theory.type_notation_cmd false mode args)); val _ = Outer_Syntax.local_theory \<^command_keyword>\notation\ "add concrete syntax for constants / fixed variables" (Parse.syntax_mode -- Parse.and_list1 (Parse.const -- Parse.mixfix) >> (fn (mode, args) => Local_Theory.notation_cmd true mode args)); val _ = Outer_Syntax.local_theory \<^command_keyword>\no_notation\ "delete concrete syntax for constants / fixed variables" (Parse.syntax_mode -- Parse.and_list1 (Parse.const -- Parse.mixfix) >> (fn (mode, args) => Local_Theory.notation_cmd false mode args)); in end\ subsubsection \Theorems\ ML \ local val long_keyword = Parse_Spec.includes >> K "" || Parse_Spec.long_statement_keyword; val long_statement = Scan.optional (Parse_Spec.opt_thm_name ":" --| Scan.ahead long_keyword) Binding.empty_atts -- Scan.optional Parse_Spec.includes [] -- Parse_Spec.long_statement >> (fn ((binding, includes), (elems, concl)) => (true, binding, includes, elems, concl)); val short_statement = Parse_Spec.statement -- Parse_Spec.if_statement -- Parse.for_fixes >> (fn ((shows, assumes), fixes) => (false, Binding.empty_atts, [], [Element.Fixes fixes, Element.Assumes assumes], Element.Shows shows)); fun theorem spec schematic descr = Outer_Syntax.local_theory_to_proof' spec ("state " ^ descr) ((long_statement || short_statement) >> (fn (long, binding, includes, elems, concl) => ((if schematic then Specification.schematic_theorem_cmd else Specification.theorem_cmd) long Thm.theoremK NONE (K I) binding includes elems concl))); val _ = theorem \<^command_keyword>\theorem\ false "theorem"; val _ = theorem \<^command_keyword>\lemma\ false "lemma"; val _ = theorem \<^command_keyword>\corollary\ false "corollary"; val _ = theorem \<^command_keyword>\proposition\ false "proposition"; val _ = theorem \<^command_keyword>\schematic_goal\ true "schematic goal"; in end\ ML \ local val _ = Outer_Syntax.local_theory' \<^command_keyword>\lemmas\ "define theorems" (Parse_Spec.name_facts -- Parse.for_fixes >> (fn (facts, fixes) => #2 oo Specification.theorems_cmd Thm.theoremK facts fixes)); val _ = Outer_Syntax.local_theory' \<^command_keyword>\declare\ "declare theorems" (Parse.and_list1 Parse.thms1 -- Parse.for_fixes >> (fn (facts, fixes) => #2 oo Specification.theorems_cmd "" [(Binding.empty_atts, flat facts)] fixes)); val _ = Outer_Syntax.local_theory \<^command_keyword>\named_theorems\ "declare named collection of theorems" (Parse.and_list1 (Parse.binding -- Scan.optional Parse.embedded "") >> fold (fn (b, descr) => snd o Named_Theorems.declare b descr)); in end\ subsubsection \Hide names\ ML \ local fun hide_names command_keyword what hide parse prep = Outer_Syntax.command command_keyword ("hide " ^ what ^ " from name space") ((Parse.opt_keyword "open" >> not) -- Scan.repeat1 parse >> (fn (fully, args) => (Toplevel.theory (fn thy => let val ctxt = Proof_Context.init_global thy in fold (hide fully o prep ctxt) args thy end)))); val _ = hide_names 
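(* Illustrative note: long_statement and short_statement above accept the two
   usual statement formats, e.g. with hypothetical propositions A and B:

     lemma foo: fixes x assumes a: "A x" shows "B x"    (long format)
     lemma "B x" if a: "A x" for x                      (short format)
*)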
\<^command_keyword>\hide_class\ "classes" Sign.hide_class Parse.class Proof_Context.read_class; val _ = hide_names \<^command_keyword>\hide_type\ "types" Sign.hide_type Parse.type_const ((#1 o dest_Type) oo Proof_Context.read_type_name {proper = true, strict = false}); val _ = hide_names \<^command_keyword>\hide_const\ "consts" Sign.hide_const Parse.const ((#1 o dest_Const) oo Proof_Context.read_const {proper = true, strict = false}); val _ = hide_names \<^command_keyword>\hide_fact\ "facts" Global_Theory.hide_fact Parse.name_position (Global_Theory.check_fact o Proof_Context.theory_of); in end\ subsection \Bundled declarations\ ML \ local val _ = Outer_Syntax.maybe_begin_local_theory \<^command_keyword>\bundle\ "define bundle of declarations" ((Parse.binding --| \<^keyword>\=\) -- Parse.thms1 -- Parse.for_fixes >> (uncurry Bundle.bundle_cmd)) (Parse.binding --| Parse.begin >> Bundle.init); val _ = Outer_Syntax.local_theory \<^command_keyword>\unbundle\ "activate declarations from bundle in local theory" (Scan.repeat1 Parse.name_position >> Bundle.unbundle_cmd); val _ = Outer_Syntax.command \<^command_keyword>\include\ "activate declarations from bundle in proof body" (Scan.repeat1 Parse.name_position >> (Toplevel.proof o Bundle.include_cmd)); val _ = Outer_Syntax.command \<^command_keyword>\including\ "activate declarations from bundle in goal refinement" (Scan.repeat1 Parse.name_position >> (Toplevel.proof o Bundle.including_cmd)); val _ = Outer_Syntax.command \<^command_keyword>\print_bundles\ "print bundles of declarations" (Parse.opt_bang >> (fn b => Toplevel.keep (Bundle.print_bundles b o Toplevel.context_of))); in end\ subsection \Local theory specifications\ subsubsection \Specification context\ ML \ local val _ = Outer_Syntax.command \<^command_keyword>\context\ "begin local theory context" (((Parse.name_position -- Scan.optional Parse_Spec.opening []) >> (fn (name, incls) => Toplevel.begin_main_target true (Target_Context.context_begin_named_cmd incls name)) || Scan.optional Parse_Spec.includes [] -- Scan.repeat Parse_Spec.context_element >> (fn (incls, elems) => Toplevel.begin_nested_target (Target_Context.context_begin_nested_cmd incls elems))) --| Parse.begin); val _ = Outer_Syntax.command \<^command_keyword>\end\ "end context" (Scan.succeed (Toplevel.exit o Toplevel.end_main_target o Toplevel.end_nested_target o Toplevel.end_proof (K Proof.end_notepad))); in end\ subsubsection \Locales and interpretation\ ML \ local val locale_context_elements = Scan.repeat1 Parse_Spec.context_element; val locale_val = ((Parse_Spec.locale_expression -- Scan.optional Parse_Spec.opening []) || Parse_Spec.opening >> pair ([], [])) -- Scan.optional (\<^keyword>\+\ |-- Parse.!!! locale_context_elements) [] || locale_context_elements >> pair (([], []), []); val _ = Outer_Syntax.command \<^command_keyword>\locale\ "define named specification context" (Parse.binding -- Scan.optional (\<^keyword>\=\ |-- Parse.!!! locale_val) ((([], []), []), []) -- Parse.opt_begin >> (fn ((name, ((expr, includes), elems)), begin) => Toplevel.begin_main_target begin (Expression.add_locale_cmd name Binding.empty includes expr elems #> snd))); val _ = Outer_Syntax.command \<^command_keyword>\experiment\ "open private specification context" (Scan.repeat Parse_Spec.context_element --| Parse.begin >> (fn elems => Toplevel.begin_main_target true (Experiment.experiment_cmd elems #> snd))); val _ = Outer_Syntax.command \<^command_keyword>\interpret\ "prove interpretation of locale expression in proof context" (Parse.!!! 
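(* Illustrative note: locale_val above combines a locale expression with
   optional context elements, as in the hypothetical

     locale partial_order =
       fixes le :: "'a \ 'a \ bool"
       assumes refl: "le x x"

   which "interpretation" and "interpret" then instantiate with concrete
   entities. *)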
Parse_Spec.locale_expression >> (fn expr => Toplevel.proof (Interpretation.interpret_cmd expr))); val interpretation_args_with_defs = Parse.!!! Parse_Spec.locale_expression -- (Scan.optional (\<^keyword>\defines\ |-- Parse.and_list1 (Parse_Spec.opt_thm_name ":" -- ((Parse.binding -- Parse.opt_mixfix') --| \<^keyword>\=\ -- Parse.term))) ([])); val _ = Outer_Syntax.local_theory_to_proof \<^command_keyword>\global_interpretation\ "prove interpretation of locale expression into global theory" (interpretation_args_with_defs >> (fn (expr, defs) => Interpretation.global_interpretation_cmd expr defs)); val _ = Outer_Syntax.command \<^command_keyword>\sublocale\ "prove sublocale relation between a locale and a locale expression" ((Parse.name_position --| (\<^keyword>\\\ || \<^keyword>\<\) -- interpretation_args_with_defs >> (fn (loc, (expr, defs)) => Toplevel.theory_to_proof (Interpretation.global_sublocale_cmd loc expr defs))) || interpretation_args_with_defs >> (fn (expr, defs) => Toplevel.local_theory_to_proof NONE NONE (Interpretation.sublocale_cmd expr defs))); val _ = Outer_Syntax.command \<^command_keyword>\interpretation\ "prove interpretation of locale expression in local theory or into global theory" (Parse.!!! Parse_Spec.locale_expression >> (fn expr => Toplevel.local_theory_to_proof NONE NONE (Interpretation.isar_interpretation_cmd expr))); in end\ subsubsection \Type classes\ ML \ local val class_context_elements = Scan.repeat1 Parse_Spec.context_element; val class_val = ((Parse_Spec.class_expression -- Scan.optional Parse_Spec.opening []) || Parse_Spec.opening >> pair []) -- Scan.optional (\<^keyword>\+\ |-- Parse.!!! class_context_elements) [] || class_context_elements >> pair ([], []); val _ = Outer_Syntax.command \<^command_keyword>\class\ "define type class" (Parse.binding -- Scan.optional (\<^keyword>\=\ |-- class_val) (([], []), []) -- Parse.opt_begin >> (fn ((name, ((supclasses, includes), elems)), begin) => Toplevel.begin_main_target begin (Class_Declaration.class_cmd name includes supclasses elems #> snd))); val _ = Outer_Syntax.local_theory_to_proof \<^command_keyword>\subclass\ "prove a subclass relation" (Parse.class >> Class_Declaration.subclass_cmd); val _ = Outer_Syntax.command \<^command_keyword>\instantiation\ "instantiate and prove type arity" (Parse.multi_arity --| Parse.begin >> (fn arities => Toplevel.begin_main_target true (Class.instantiation_cmd arities))); val _ = Outer_Syntax.command \<^command_keyword>\instance\ "prove type arity or subclass relation" ((Parse.class -- ((\<^keyword>\\\ || \<^keyword>\<\) |-- Parse.!!! 
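(* Illustrative note: class_val mirrors locale_val for type classes; a
   hypothetical surface use of the commands above:

     class semigroup =
       fixes mult :: "'a \ 'a \ 'a"
       assumes assoc: "mult (mult x y) z = mult x (mult y z)"

     instantiation nat :: semigroup begin ... end

   with "instance" discharging the remaining arity proof. *)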
Parse.class) >> Class.classrel_cmd || Parse.multi_arity >> Class.instance_arity_cmd) >> Toplevel.theory_to_proof || Scan.succeed (Toplevel.local_theory_to_proof NONE NONE (Class.instantiation_instance I))); in end\ subsubsection \Arbitrary overloading\ ML \ local val _ = Outer_Syntax.command \<^command_keyword>\overloading\ "overloaded definitions" (Scan.repeat1 (Parse.name --| (\<^keyword>\==\ || \<^keyword>\\\) -- Parse.term -- Scan.optional (\<^keyword>\(\ |-- (\<^keyword>\unchecked\ >> K false) --| \<^keyword>\)\) true >> Scan.triple1) --| Parse.begin >> (fn operations => Toplevel.begin_main_target true (Overloading.overloading_cmd operations))); in end\ subsection \Proof commands\ ML \ local val _ = Outer_Syntax.local_theory_to_proof \<^command_keyword>\notepad\ "begin proof context" (Parse.begin >> K Proof.begin_notepad); in end\ subsubsection \Statements\ ML \ local val structured_statement = Parse_Spec.statement -- Parse_Spec.cond_statement -- Parse.for_fixes >> (fn ((shows, (strict, assumes)), fixes) => (strict, fixes, assumes, shows)); val _ = Outer_Syntax.command \<^command_keyword>\have\ "state local goal" (structured_statement >> (fn (a, b, c, d) => Toplevel.proof' (fn int => Proof.have_cmd a NONE (K I) b c d int #> #2))); val _ = Outer_Syntax.command \<^command_keyword>\show\ "state local goal, to refine pending subgoals" (structured_statement >> (fn (a, b, c, d) => Toplevel.proof' (fn int => Proof.show_cmd a NONE (K I) b c d int #> #2))); val _ = Outer_Syntax.command \<^command_keyword>\hence\ "old-style alias of \"then have\"" (structured_statement >> (fn (a, b, c, d) => Toplevel.proof' (fn int => Proof.chain #> Proof.have_cmd a NONE (K I) b c d int #> #2))); val _ = Outer_Syntax.command \<^command_keyword>\thus\ "old-style alias of \"then show\"" (structured_statement >> (fn (a, b, c, d) => Toplevel.proof' (fn int => Proof.chain #> Proof.show_cmd a NONE (K I) b c d int #> #2))); in end\ subsubsection \Local facts\ ML \ local val facts = Parse.and_list1 Parse.thms1; val _ = Outer_Syntax.command \<^command_keyword>\then\ "forward chaining" (Scan.succeed (Toplevel.proof Proof.chain)); val _ = Outer_Syntax.command \<^command_keyword>\from\ "forward chaining from given facts" (facts >> (Toplevel.proof o Proof.from_thmss_cmd)); val _ = Outer_Syntax.command \<^command_keyword>\with\ "forward chaining from given and current facts" (facts >> (Toplevel.proof o Proof.with_thmss_cmd)); val _ = Outer_Syntax.command \<^command_keyword>\note\ "define facts" (Parse_Spec.name_facts >> (Toplevel.proof o Proof.note_thmss_cmd)); val _ = Outer_Syntax.command \<^command_keyword>\supply\ "define facts during goal refinement (unstructured)" (Parse_Spec.name_facts >> (Toplevel.proof o Proof.supply_cmd)); val _ = Outer_Syntax.command \<^command_keyword>\using\ "augment goal facts" (facts >> (Toplevel.proof o Proof.using_cmd)); val _ = Outer_Syntax.command \<^command_keyword>\unfolding\ "unfold definitions in goal and facts" (facts >> (Toplevel.proof o Proof.unfolding_cmd)); in end\ subsubsection \Proof context\ ML \ local val structured_statement = Parse_Spec.statement -- Parse_Spec.if_statement' -- Parse.for_fixes >> (fn ((shows, assumes), fixes) => (fixes, assumes, shows)); val _ = Outer_Syntax.command \<^command_keyword>\fix\ "fix local variables (Skolem constants)" (Parse.vars >> (Toplevel.proof o Proof.fix_cmd)); val _ = Outer_Syntax.command \<^command_keyword>\assume\ "assume propositions" (structured_statement >> (fn (a, b, c) => Toplevel.proof (Proof.assume_cmd a b c))); val _ = 
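(* Illustrative note: structured_statement above admits assumptions with local
   premises ("if") and locally fixed parameters ("for"), e.g. the hypothetical

     assume a: "B x" if "A x" for x
*)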
Outer_Syntax.command \<^command_keyword>\presume\ "assume propositions, to be established later" (structured_statement >> (fn (a, b, c) => Toplevel.proof (Proof.presume_cmd a b c))); val _ = Outer_Syntax.command \<^command_keyword>\define\ "local definition (non-polymorphic)" ((Parse.vars --| Parse.where_) -- Parse_Spec.statement -- Parse.for_fixes >> (fn ((a, b), c) => Toplevel.proof (Proof.define_cmd a c b))); val _ = Outer_Syntax.command \<^command_keyword>\consider\ "state cases rule" (Parse_Spec.obtains >> (Toplevel.proof' o Obtain.consider_cmd)); val _ = Outer_Syntax.command \<^command_keyword>\obtain\ "generalized elimination" (Parse.parbinding -- Scan.optional (Parse.vars --| Parse.where_) [] -- structured_statement >> (fn ((a, b), (c, d, e)) => Toplevel.proof' (Obtain.obtain_cmd a b c d e))); val _ = Outer_Syntax.command \<^command_keyword>\let\ "bind text variables" (Parse.and_list1 (Parse.and_list1 Parse.term -- (\<^keyword>\=\ |-- Parse.term)) >> (Toplevel.proof o Proof.let_bind_cmd)); val _ = Outer_Syntax.command \<^command_keyword>\write\ "add concrete syntax for constants / fixed variables" (Parse.syntax_mode -- Parse.and_list1 (Parse.const -- Parse.mixfix) >> (fn (mode, args) => Toplevel.proof (Proof.write_cmd mode args))); val _ = Outer_Syntax.command \<^command_keyword>\case\ "invoke local context" (Parse_Spec.opt_thm_name ":" -- (\<^keyword>\(\ |-- Parse.!!! (Parse.name_position -- Scan.repeat (Parse.maybe Parse.binding) --| \<^keyword>\)\) || Parse.name_position >> rpair []) >> (Toplevel.proof o Proof.case_cmd)); in end\ subsubsection \Proof structure\ ML \ local val _ = Outer_Syntax.command \<^command_keyword>\{\ "begin explicit proof block" (Scan.succeed (Toplevel.proof Proof.begin_block)); val _ = Outer_Syntax.command \<^command_keyword>\}\ "end explicit proof block" (Scan.succeed (Toplevel.proof Proof.end_block)); val _ = Outer_Syntax.command \<^command_keyword>\next\ "enter next proof block" (Scan.succeed (Toplevel.proof Proof.next_block)); in end\ subsubsection \End proof\ ML \ local val _ = Outer_Syntax.command \<^command_keyword>\qed\ "conclude proof" (Scan.option Method.parse >> (fn m => (Option.map Method.report m; Isar_Cmd.qed m))); val _ = Outer_Syntax.command \<^command_keyword>\by\ "terminal backward proof" (Method.parse -- Scan.option Method.parse >> (fn (m1, m2) => (Method.report m1; Option.map Method.report m2; Isar_Cmd.terminal_proof (m1, m2)))); val _ = Outer_Syntax.command \<^command_keyword>\..\ "default proof" (Scan.succeed Isar_Cmd.default_proof); val _ = Outer_Syntax.command \<^command_keyword>\.\ "immediate proof" (Scan.succeed Isar_Cmd.immediate_proof); val _ = Outer_Syntax.command \<^command_keyword>\done\ "done proof" (Scan.succeed Isar_Cmd.done_proof); val _ = Outer_Syntax.command \<^command_keyword>\sorry\ "skip proof (quick-and-dirty mode only!)" (Scan.succeed Isar_Cmd.skip_proof); val _ = Outer_Syntax.command \<^command_keyword>\\\ "dummy proof (quick-and-dirty mode only!)" (Scan.succeed Isar_Cmd.skip_proof); val _ = Outer_Syntax.command \<^command_keyword>\oops\ "forget proof" (Scan.succeed Toplevel.forget_proof); in end\ subsubsection \Proof steps\ ML \ local val _ = Outer_Syntax.command \<^command_keyword>\defer\ "shuffle internal proof state" (Scan.optional Parse.nat 1 >> (Toplevel.proof o Proof.defer)); val _ = Outer_Syntax.command \<^command_keyword>\prefer\ "shuffle internal proof state" (Parse.nat >> (Toplevel.proof o Proof.prefer)); val _ = Outer_Syntax.command \<^command_keyword>\apply\ "initial goal refinement step 
(unstructured)" (Method.parse >> (fn m => (Method.report m; Toplevel.proofs (Proof.apply m)))); val _ = Outer_Syntax.command \<^command_keyword>\apply_end\ "terminal goal refinement step (unstructured)" (Method.parse >> (fn m => (Method.report m; Toplevel.proofs (Proof.apply_end m)))); val _ = Outer_Syntax.command \<^command_keyword>\proof\ "backward proof step" (Scan.option Method.parse >> (fn m => (Option.map Method.report m; Toplevel.proof (fn state => let val state' = state |> Proof.proof m |> Seq.the_result ""; val _ = Output.information (Proof_Context.print_cases_proof (Proof.context_of state) (Proof.context_of state')); in state' end)))) in end\ subsubsection \Subgoal focus\ ML \ local val opt_fact_binding = Scan.optional (Parse.binding -- Parse.opt_attribs || Parse.attribs >> pair Binding.empty) Binding.empty_atts; val for_params = Scan.optional (\<^keyword>\for\ |-- Parse.!!! ((Scan.option Parse.dots >> is_some) -- (Scan.repeat1 (Parse.maybe_position Parse.name_position)))) (false, []); val _ = Outer_Syntax.command \<^command_keyword>\subgoal\ "focus on first subgoal within backward refinement" (opt_fact_binding -- (Scan.option (\<^keyword>\premises\ |-- Parse.!!! opt_fact_binding)) -- for_params >> (fn ((a, b), c) => Toplevel.proofs (Seq.make_results o Seq.single o #2 o Subgoal.subgoal_cmd a b c))); in end\ subsubsection \Calculation\ ML \ local val calculation_args = Scan.option (\<^keyword>\(\ |-- Parse.!!! ((Parse.thms1 --| \<^keyword>\)\))); val _ = Outer_Syntax.command \<^command_keyword>\also\ "combine calculation and current facts" (calculation_args >> (Toplevel.proofs' o Calculation.also_cmd)); val _ = Outer_Syntax.command \<^command_keyword>\finally\ "combine calculation and current facts, exhibit result" (calculation_args >> (Toplevel.proofs' o Calculation.finally_cmd)); val _ = Outer_Syntax.command \<^command_keyword>\moreover\ "augment calculation by current facts" (Scan.succeed (Toplevel.proof' Calculation.moreover)); val _ = Outer_Syntax.command \<^command_keyword>\ultimately\ "augment calculation by current facts, exhibit result" (Scan.succeed (Toplevel.proof' Calculation.ultimately)); val _ = Outer_Syntax.command \<^command_keyword>\print_trans_rules\ "print transitivity rules" (Scan.succeed (Toplevel.keep (Calculation.print_rules o Toplevel.context_of))); in end\ subsubsection \Proof navigation\ ML \ local fun report_back () = Output.report [Markup.markup (Markup.bad ()) "Explicit backtracking"]; val _ = Outer_Syntax.command \<^command_keyword>\back\ "explicit backtracking of proof command" (Scan.succeed (Toplevel.actual_proof (fn prf => (report_back (); Proof_Node.back prf)) o Toplevel.skip_proof report_back)); in end\ subsection \Diagnostic commands (for interactive mode only)\ ML \ local val opt_modes = Scan.optional (\<^keyword>\(\ |-- Parse.!!! 
(Scan.repeat1 Parse.name --| \<^keyword>\)\)) []; val _ = Outer_Syntax.command \<^command_keyword>\help\ "retrieve outer syntax commands according to name patterns" (Scan.repeat Parse.name >> (fn pats => Toplevel.keep (fn st => Outer_Syntax.help (Toplevel.theory_of st) pats))); val _ = Outer_Syntax.command \<^command_keyword>\print_commands\ "print outer syntax commands" (Scan.succeed (Toplevel.keep (Outer_Syntax.print_commands o Toplevel.theory_of))); val _ = Outer_Syntax.command \<^command_keyword>\print_options\ "print configuration options" (Parse.opt_bang >> (fn b => Toplevel.keep (Attrib.print_options b o Toplevel.context_of))); val _ = Outer_Syntax.command \<^command_keyword>\print_context\ "print context of local theory target" (Scan.succeed (Toplevel.keep (Pretty.writeln_chunks o Toplevel.pretty_context))); val _ = Outer_Syntax.command \<^command_keyword>\print_theory\ "print logical theory contents" (Parse.opt_bang >> (fn b => Toplevel.keep (Pretty.writeln o Proof_Display.pretty_theory b o Toplevel.context_of))); val _ = Outer_Syntax.command \<^command_keyword>\print_definitions\ "print dependencies of definitional theory content" (Parse.opt_bang >> (fn b => Toplevel.keep (Pretty.writeln o Proof_Display.pretty_definitions b o Toplevel.context_of))); val _ = Outer_Syntax.command \<^command_keyword>\print_syntax\ "print inner syntax of context" (Scan.succeed (Toplevel.keep (Proof_Context.print_syntax o Toplevel.context_of))); val _ = Outer_Syntax.command \<^command_keyword>\print_defn_rules\ "print definitional rewrite rules of context" (Scan.succeed (Toplevel.keep (Local_Defs.print_rules o Toplevel.context_of))); val _ = Outer_Syntax.command \<^command_keyword>\print_abbrevs\ "print constant abbreviations of context" (Parse.opt_bang >> (fn b => Toplevel.keep (Proof_Context.print_abbrevs b o Toplevel.context_of))); val _ = Outer_Syntax.command \<^command_keyword>\print_theorems\ "print theorems of local theory or proof context" (Parse.opt_bang >> (fn b => Toplevel.keep (Pretty.writeln o Pretty.chunks o Isar_Cmd.pretty_theorems b))); val _ = Outer_Syntax.command \<^command_keyword>\print_locales\ "print locales of this theory" (Parse.opt_bang >> (fn verbose => Toplevel.keep (fn state => let val thy = Toplevel.theory_of state in Pretty.writeln (Locale.pretty_locales thy verbose) end))); val _ = Outer_Syntax.command \<^command_keyword>\print_classes\ "print classes of this theory" (Scan.succeed (Toplevel.keep (Class.print_classes o Toplevel.context_of))); val _ = Outer_Syntax.command \<^command_keyword>\print_locale\ "print locale of this theory" (Parse.opt_bang -- Parse.name_position >> (fn (show_facts, raw_name) => Toplevel.keep (fn state => let val thy = Toplevel.theory_of state; val name = Locale.check thy raw_name; in Pretty.writeln (Locale.pretty_locale thy show_facts name) end))); val _ = Outer_Syntax.command \<^command_keyword>\print_interps\ "print interpretations of locale for this theory or proof context" (Parse.name_position >> (fn raw_name => Toplevel.keep (fn state => let val ctxt = Toplevel.context_of state; val thy = Toplevel.theory_of state; val name = Locale.check thy raw_name; in Pretty.writeln (Locale.pretty_registrations ctxt name) end))); val _ = Outer_Syntax.command \<^command_keyword>\print_attributes\ "print attributes of this theory" (Parse.opt_bang >> (fn b => Toplevel.keep (Attrib.print_attributes b o Toplevel.context_of))); val _ = Outer_Syntax.command \<^command_keyword>\print_simpset\ "print context of Simplifier" (Parse.opt_bang >> (fn b => 
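(* Illustrative note: Parse.opt_bang lets many of these print commands take an
   optional "!" for more verbose output, e.g.

     print_theorems!
     print_locales!
*)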
Toplevel.keep (Pretty.writeln o Simplifier.pretty_simpset b o Toplevel.context_of))); val _ = Outer_Syntax.command \<^command_keyword>\print_rules\ "print intro/elim rules" (Scan.succeed (Toplevel.keep (Context_Rules.print_rules o Toplevel.context_of))); val _ = Outer_Syntax.command \<^command_keyword>\print_methods\ "print methods of this theory" (Parse.opt_bang >> (fn b => Toplevel.keep (Method.print_methods b o Toplevel.context_of))); val _ = Outer_Syntax.command \<^command_keyword>\print_antiquotations\ "print document antiquotations" (Parse.opt_bang >> (fn b => Toplevel.keep (Document_Antiquotation.print_antiquotations b o Toplevel.context_of))); val _ = Outer_Syntax.command \<^command_keyword>\print_ML_antiquotations\ "print ML antiquotations" (Parse.opt_bang >> (fn b => Toplevel.keep (ML_Context.print_antiquotations b o Toplevel.context_of))); val _ = Outer_Syntax.command \<^command_keyword>\locale_deps\ "visualize locale dependencies" (Scan.succeed (Toplevel.keep (Toplevel.theory_of #> (fn thy => Locale.pretty_locale_deps thy |> map (fn {name, parents, body} => ((name, Graph_Display.content_node (Locale.extern thy name) [body]), parents)) |> Graph_Display.display_graph_old)))); val _ = Outer_Syntax.command \<^command_keyword>\print_term_bindings\ "print term bindings of proof context" (Scan.succeed (Toplevel.keep (Pretty.writeln_chunks o Proof_Context.pretty_term_bindings o Toplevel.context_of))); val _ = Outer_Syntax.command \<^command_keyword>\print_facts\ "print facts of proof context" (Parse.opt_bang >> (fn b => Toplevel.keep (Proof_Context.print_local_facts b o Toplevel.context_of))); val _ = Outer_Syntax.command \<^command_keyword>\print_cases\ "print cases of proof context" (Scan.succeed (Toplevel.keep (Pretty.writeln_chunks o Proof_Context.pretty_cases o Toplevel.context_of))); val _ = Outer_Syntax.command \<^command_keyword>\print_statement\ "print theorems as long statements" (opt_modes -- Parse.thms1 >> Isar_Cmd.print_stmts); val _ = Outer_Syntax.command \<^command_keyword>\thm\ "print theorems" (opt_modes -- Parse.thms1 >> Isar_Cmd.print_thms); val _ = Outer_Syntax.command \<^command_keyword>\prf\ "print proof terms of theorems" (opt_modes -- Scan.option Parse.thms1 >> Isar_Cmd.print_prfs false); val _ = Outer_Syntax.command \<^command_keyword>\full_prf\ "print full proof terms of theorems" (opt_modes -- Scan.option Parse.thms1 >> Isar_Cmd.print_prfs true); val _ = Outer_Syntax.command \<^command_keyword>\prop\ "read and print proposition" (opt_modes -- Parse.term >> Isar_Cmd.print_prop); val _ = Outer_Syntax.command \<^command_keyword>\term\ "read and print term" (opt_modes -- Parse.term >> Isar_Cmd.print_term); val _ = Outer_Syntax.command \<^command_keyword>\typ\ "read and print type" (opt_modes -- (Parse.typ -- Scan.option (\<^keyword>\::\ |-- Parse.!!! 
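(* Illustrative note: the "typ" parser being assembled here optionally checks
   a sort constraint, e.g.

     typ "'a \ 'b"
     typ 'a :: order
*)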
Parse.sort)) >> Isar_Cmd.print_type); val _ = Outer_Syntax.command \<^command_keyword>\print_codesetup\ "print code generator setup" (Scan.succeed (Toplevel.keep (Code.print_codesetup o Toplevel.theory_of))); val _ = Outer_Syntax.command \<^command_keyword>\print_state\ "print current proof state (if present)" (opt_modes >> (fn modes => Toplevel.keep (Print_Mode.with_modes modes (Output.state o Toplevel.string_of_state)))); val _ = Outer_Syntax.command \<^command_keyword>\welcome\ "print welcome message" (Scan.succeed (Toplevel.keep (fn _ => writeln (Session.welcome ())))); in end\ subsection \Dependencies\ ML \ local val theory_bounds = Parse.theory_name >> single || (\<^keyword>\(\ |-- Parse.enum "|" Parse.theory_name --| \<^keyword>\)\); val _ = Outer_Syntax.command \<^command_keyword>\thy_deps\ "visualize theory dependencies" (Scan.option theory_bounds -- Scan.option theory_bounds >> (fn args => Toplevel.keep (fn st => Thy_Deps.thy_deps_cmd (Toplevel.context_of st) args))); val class_bounds = Parse.sort >> single || (\<^keyword>\(\ |-- Parse.enum "|" Parse.sort --| \<^keyword>\)\); val _ = Outer_Syntax.command \<^command_keyword>\class_deps\ "visualize class dependencies" (Scan.option class_bounds -- Scan.option class_bounds >> (fn args => Toplevel.keep (fn st => Class_Deps.class_deps_cmd (Toplevel.context_of st) args))); val _ = Outer_Syntax.command \<^command_keyword>\thm_deps\ "print theorem dependencies (immediate non-transitive)" (Parse.thms1 >> (fn args => Toplevel.keep (fn st => let val thy = Toplevel.theory_of st; val ctxt = Toplevel.context_of st; in Pretty.writeln (Thm_Deps.pretty_thm_deps thy (Attrib.eval_thms ctxt args)) end))); val _ = Outer_Syntax.command \<^command_keyword>\thm_oracles\ "print all oracles used in theorems (full graph of transitive dependencies)" (Parse.thms1 >> (fn args => Toplevel.keep (fn st => let val ctxt = Toplevel.context_of st; val thms = Attrib.eval_thms ctxt args; in Pretty.writeln (Thm_Deps.pretty_thm_oracles ctxt thms) end))); val thy_names = Scan.repeat1 (Scan.unless Parse.minus Parse.theory_name); val _ = Outer_Syntax.command \<^command_keyword>\unused_thms\ "find unused theorems" (Scan.option ((thy_names --| Parse.minus) -- Scan.option thy_names) >> (fn opt_range => Toplevel.keep (fn st => let val thy = Toplevel.theory_of st; val ctxt = Toplevel.context_of st; fun pretty_thm (a, th) = Proof_Context.pretty_fact ctxt (a, [th]); val check = Theory.check {long = false} ctxt; in Thm_Deps.unused_thms_cmd (case opt_range of NONE => (Theory.parents_of thy, [thy]) | SOME (xs, NONE) => (map check xs, [thy]) | SOME (xs, SOME ys) => (map check xs, map check ys)) |> map pretty_thm |> Pretty.writeln_chunks end))); in end\ subsubsection \Find consts and theorems\ ML \ local val _ = Outer_Syntax.command \<^command_keyword>\find_consts\ "find constants by name / type patterns" (Find_Consts.query_parser >> (fn spec => Toplevel.keep (fn st => Pretty.writeln (Find_Consts.pretty_consts (Toplevel.context_of st) spec)))); val options = Scan.optional (Parse.$$$ "(" |-- Parse.!!! 
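(* Illustrative note: the find_theorems options being parsed here take the
   form of the hypothetical

     find_theorems (50 with_dups) "_ + _"

   i.e. an optional result limit and a flag to retain duplicate variants. *)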
(Scan.option Parse.nat -- Scan.optional (Parse.reserved "with_dups" >> K false) true --| Parse.$$$ ")")) (NONE, true); val _ = Outer_Syntax.command \<^command_keyword>\find_theorems\ "find theorems meeting specified criteria" (options -- Find_Theorems.query_parser >> (fn ((opt_lim, rem_dups), spec) => Toplevel.keep (fn st => Pretty.writeln (Find_Theorems.pretty_theorems (Find_Theorems.proof_state st) opt_lim rem_dups spec)))); in end\ subsection \Code generation\ ML \ local val _ = Outer_Syntax.command \<^command_keyword>\code_datatype\ "define set of code datatype constructors" (Scan.repeat1 Parse.term >> (Toplevel.theory o Code.declare_datatype_cmd)); in end\ subsection \Extraction of programs from proofs\ ML \ local val parse_vars = Scan.optional (Parse.$$$ "(" |-- Parse.list1 Parse.name --| Parse.$$$ ")") []; val _ = Outer_Syntax.command \<^command_keyword>\realizers\ "specify realizers for primitive axioms / theorems, together with correctness proof" (Scan.repeat1 (Parse.name -- parse_vars --| Parse.$$$ ":" -- Parse.string -- Parse.string) >> (fn xs => Toplevel.theory (fn thy => Extraction.add_realizers (map (fn (((a, vs), s1), s2) => (Global_Theory.get_thm thy a, (vs, s1, s2))) xs) thy))); val _ = Outer_Syntax.command \<^command_keyword>\realizability\ "add equations characterizing realizability" (Scan.repeat1 Parse.string >> (Toplevel.theory o Extraction.add_realizes_eqns)); val _ = Outer_Syntax.command \<^command_keyword>\extract_type\ "add equations characterizing type of extracted program" (Scan.repeat1 Parse.string >> (Toplevel.theory o Extraction.add_typeof_eqns)); val _ = Outer_Syntax.command \<^command_keyword>\extract\ "extract terms from proofs" (Scan.repeat1 (Parse.name -- parse_vars) >> (fn xs => Toplevel.theory (fn thy => Extraction.extract (map (apfst (Global_Theory.get_thm thy)) xs) thy))); in end\ section \Auxiliary lemmas\ subsection \Meta-level connectives in assumptions\ lemma meta_mp: assumes "PROP P \ PROP Q" and "PROP P" shows "PROP Q" by (rule \PROP P \ PROP Q\ [OF \PROP P\]) lemmas meta_impE = meta_mp [elim_format] lemma meta_spec: assumes "\x. PROP P x" shows "PROP P x" by (rule \\x. PROP P x\) lemmas meta_allE = meta_spec [elim_format] lemma swap_params: "(\x y. PROP P x y) \ (\y x. PROP P x y)" .. lemma equal_allI: \(\x. PROP P x) \ (\x. PROP Q x)\ if \\x. PROP P x \ PROP Q x\ by (simp only: that) subsection \Meta-level conjunction\ lemma all_conjunction: "(\x. PROP A x &&& PROP B x) \ ((\x. PROP A x) &&& (\x. PROP B x))" proof assume conj: "\x. PROP A x &&& PROP B x" show "(\x. PROP A x) &&& (\x. PROP B x)" proof - fix x from conj show "PROP A x" by (rule conjunctionD1) from conj show "PROP B x" by (rule conjunctionD2) qed next assume conj: "(\x. PROP A x) &&& (\x. 
PROP B x)" fix x show "PROP A x &&& PROP B x" proof - show "PROP A x" by (rule conj [THEN conjunctionD1, rule_format]) show "PROP B x" by (rule conj [THEN conjunctionD2, rule_format]) qed qed lemma imp_conjunction: "(PROP A \ PROP B &&& PROP C) \ ((PROP A \ PROP B) &&& (PROP A \ PROP C))" proof assume conj: "PROP A \ PROP B &&& PROP C" show "(PROP A \ PROP B) &&& (PROP A \ PROP C)" proof - assume "PROP A" from conj [OF \PROP A\] show "PROP B" by (rule conjunctionD1) from conj [OF \PROP A\] show "PROP C" by (rule conjunctionD2) qed next assume conj: "(PROP A \ PROP B) &&& (PROP A \ PROP C)" assume "PROP A" show "PROP B &&& PROP C" proof - from \PROP A\ show "PROP B" by (rule conj [THEN conjunctionD1]) from \PROP A\ show "PROP C" by (rule conj [THEN conjunctionD2]) qed qed lemma conjunction_imp: "(PROP A &&& PROP B \ PROP C) \ (PROP A \ PROP B \ PROP C)" proof assume r: "PROP A &&& PROP B \ PROP C" assume ab: "PROP A" "PROP B" show "PROP C" proof (rule r) from ab show "PROP A &&& PROP B" . qed next assume r: "PROP A \ PROP B \ PROP C" assume conj: "PROP A &&& PROP B" show "PROP C" proof (rule r) from conj show "PROP A" by (rule conjunctionD1) from conj show "PROP B" by (rule conjunctionD2) qed qed declare [[ML_write_global = false]] end diff --git a/src/Pure/ROOT.ML b/src/Pure/ROOT.ML --- a/src/Pure/ROOT.ML +++ b/src/Pure/ROOT.ML @@ -1,366 +1,367 @@ (* Title: Pure/ROOT.ML Author: Makarius Main entry point for the Isabelle/Pure bootstrap process. Note: When this file is open in the Prover IDE, the ML files of Isabelle/Pure can be explored interactively. This is a separate copy of Pure within Pure: it does not affect the running logic session. *) chapter "Isabelle/Pure bootstrap"; ML_file "ML/ml_name_space.ML"; section "Bootstrap phase 0: Poly/ML setup"; ML_file "ML/ml_init.ML"; ML_file "ML/ml_system.ML"; ML_file "General/basics.ML"; ML_file "General/symbol_explode.ML"; ML_file "Concurrent/multithreading.ML"; ML_file "Concurrent/unsynchronized.ML"; ML_file "Concurrent/synchronized.ML"; ML_file "Concurrent/counter.ML"; ML_file "ML/ml_heap.ML"; ML_file "ML/ml_print_depth0.ML"; ML_file "ML/ml_pretty.ML"; ML_file "ML/ml_compiler0.ML"; section "Bootstrap phase 1: towards ML within position context"; subsection "Library of general tools"; ML_file "library.ML"; ML_file "General/print_mode.ML"; ML_file "General/alist.ML"; ML_file "General/table.ML"; ML_file "General/random.ML"; ML_file "General/value.ML"; ML_file "General/properties.ML"; ML_file "General/output.ML"; ML_file "PIDE/markup.ML"; ML_file "General/utf8.ML"; ML_file "General/scan.ML"; ML_file "General/source.ML"; ML_file "General/symbol.ML"; ML_file "General/position.ML"; ML_file "General/symbol_pos.ML"; ML_file "General/input.ML"; ML_file "General/comment.ML"; ML_file "General/antiquote.ML"; ML_file "ML/ml_lex.ML"; ML_file "ML/ml_compiler1.ML"; section "Bootstrap phase 2: towards ML within Isar context"; subsection "Library of general tools"; ML_file "General/integer.ML"; ML_file "General/stack.ML"; ML_file "General/queue.ML"; ML_file "General/heap.ML"; ML_file "General/same.ML"; ML_file "General/ord_list.ML"; ML_file "General/balanced_tree.ML"; ML_file "General/linear_set.ML"; ML_file "General/buffer.ML"; ML_file "General/pretty.ML"; ML_file "General/rat.ML"; ML_file "PIDE/xml.ML"; ML_file "General/path.ML"; ML_file "General/url.ML"; ML_file "System/bash.ML"; ML_file "General/file_stream.ML"; ML_file "General/bytes.ML"; ML_file "General/file.ML"; ML_file "General/long_name.ML"; ML_file "General/binding.ML"; ML_file "General/seq.ML"; 
ML_file "General/time.ML"; ML_file "General/timing.ML"; ML_file "General/sha1.ML"; ML_file "PIDE/yxml.ML"; ML_file "ML/ml_profiling.ML"; ML_file "PIDE/byte_message.ML"; ML_file "PIDE/protocol_message.ML"; ML_file "PIDE/document_id.ML"; ML_file "General/socket_io.ML"; ML_file "General/change_table.ML"; ML_file "General/graph.ML"; ML_file "System/options.ML"; subsection "Fundamental structures"; ML_file "name.ML"; ML_file "term.ML"; ML_file "context.ML"; ML_file "config.ML"; ML_file "context_position.ML"; ML_file "soft_type_system.ML"; subsection "Concurrency within the ML runtime"; ML_file "ML/exn_properties.ML"; ML_file_no_debug "ML/exn_debugger.ML"; ML_file "Concurrent/thread_data_virtual.ML"; ML_file "Concurrent/isabelle_thread.ML"; ML_file "Concurrent/single_assignment.ML"; ML_file "Concurrent/par_exn.ML"; ML_file "Concurrent/task_queue.ML"; ML_file "Concurrent/future.ML"; ML_file "Concurrent/event_timer.ML"; ML_file "Concurrent/timeout.ML"; ML_file "Concurrent/lazy.ML"; ML_file "Concurrent/par_list.ML"; ML_file "Concurrent/mailbox.ML"; ML_file "Concurrent/cache.ML"; ML_file "PIDE/active.ML"; ML_file "Thy/export.ML"; subsection "Inner syntax"; ML_file "Syntax/type_annotation.ML"; ML_file "Syntax/term_position.ML"; ML_file "Syntax/lexicon.ML"; ML_file "Syntax/ast.ML"; ML_file "Syntax/syntax_ext.ML"; ML_file "Syntax/parser.ML"; ML_file "Syntax/syntax_trans.ML"; ML_file "Syntax/mixfix.ML"; ML_file "Syntax/printer.ML"; ML_file "Syntax/syntax.ML"; subsection "Core of tactical proof system"; ML_file "term_ord.ML"; ML_file "term_items.ML"; ML_file "term_subst.ML"; ML_file "General/completion.ML"; ML_file "General/name_space.ML"; ML_file "sorts.ML"; ML_file "type.ML"; ML_file "logic.ML"; ML_file "Syntax/simple_syntax.ML"; ML_file "net.ML"; ML_file "item_net.ML"; ML_file "envir.ML"; ML_file "consts.ML"; ML_file "term_xml.ML"; ML_file "primitive_defs.ML"; ML_file "sign.ML"; ML_file "defs.ML"; ML_file "term_sharing.ML"; ML_file "pattern.ML"; ML_file "unify.ML"; ML_file "theory.ML"; ML_file "proofterm.ML"; ML_file "thm.ML"; ML_file "cterm_items.ML"; ML_file "more_pattern.ML"; ML_file "more_unify.ML"; ML_file "more_thm.ML"; ML_file "facts.ML"; ML_file "thm_name.ML"; ML_file "global_theory.ML"; ML_file "pure_thy.ML"; ML_file "drule.ML"; ML_file "morphism.ML"; ML_file "variable.ML"; ML_file "conv.ML"; ML_file "goal_display.ML"; ML_file "tactical.ML"; ML_file "search.ML"; ML_file "tactic.ML"; ML_file "raw_simplifier.ML"; ML_file "conjunction.ML"; ML_file "assumption.ML"; subsection "Isar -- Intelligible Semi-Automated Reasoning"; (*ML support and global execution*) ML_file "ML/ml_syntax.ML"; ML_file "ML/ml_env.ML"; ML_file "ML/ml_options.ML"; ML_file "ML/ml_print_depth.ML"; ML_file_no_debug "Isar/runtime.ML"; ML_file "PIDE/execution.ML"; ML_file "ML/ml_compiler.ML"; ML_file "skip_proof.ML"; ML_file "goal.ML"; (*outer syntax*) ML_file "Isar/keyword.ML"; ML_file "Isar/token.ML"; ML_file "Isar/parse.ML"; ML_file "Thy/document_source.ML"; ML_file "Thy/thy_header.ML"; ML_file "Thy/document_marker.ML"; (*proof context*) ML_file "Isar/object_logic.ML"; ML_file "Isar/rule_cases.ML"; ML_file "Isar/auto_bind.ML"; ML_file "type_infer.ML"; ML_file "Syntax/local_syntax.ML"; ML_file "Isar/proof_context.ML"; ML_file "type_infer_context.ML"; ML_file "Syntax/syntax_phases.ML"; ML_file "Isar/args.ML"; (*theory specifications*) ML_file "Isar/local_defs.ML"; ML_file "Isar/local_theory.ML"; ML_file "Isar/entity.ML"; ML_file "PIDE/command_span.ML"; ML_file "Thy/thy_element.ML"; ML_file "Thy/markdown.ML"; ML_file 
"Thy/latex.ML"; (*ML with context and antiquotations*) ML_file "ML/ml_context.ML"; ML_file "ML/ml_antiquotation.ML"; ML_file "ML/ml_compiler2.ML"; ML_file "ML/ml_antiquotations.ML"; section "Bootstrap phase 3: towards theory Pure and final ML toplevel setup"; (*basic proof engine*) ML_file "par_tactical.ML"; ML_file "context_tactic.ML"; ML_file "Isar/proof_display.ML"; ML_file "Isar/attrib.ML"; ML_file "Isar/context_rules.ML"; ML_file "Isar/method.ML"; ML_file "Isar/proof.ML"; ML_file "Isar/element.ML"; ML_file "Isar/obtain.ML"; ML_file "Isar/subgoal.ML"; ML_file "Isar/calculation.ML"; (*local theories and targets*) ML_file "Isar/locale.ML"; ML_file "Isar/generic_target.ML"; ML_file "Isar/bundle.ML"; ML_file "Isar/overloading.ML"; ML_file "axclass.ML"; ML_file "Isar/class.ML"; ML_file "Isar/named_target.ML"; ML_file "Isar/expression.ML"; ML_file "Isar/interpretation.ML"; ML_file "Isar/class_declaration.ML"; ML_file "Isar/target_context.ML"; ML_file "Isar/experiment.ML"; ML_file "simplifier.ML"; ML_file "Tools/plugin.ML"; (*executable theory content*) ML_file "Isar/code.ML"; (*specifications*) ML_file "Isar/spec_rules.ML"; ML_file "Isar/specification.ML"; ML_file "Isar/parse_spec.ML"; ML_file "Isar/typedecl.ML"; (*toplevel transactions*) ML_file "Isar/proof_node.ML"; ML_file "Isar/toplevel.ML"; (*proof term operations*) ML_file "Proof/proof_rewrite_rules.ML"; ML_file "Proof/proof_syntax.ML"; ML_file "Proof/proof_checker.ML"; ML_file "Proof/extraction.ML"; (*Isabelle system*) ML_file "PIDE/protocol_command.ML"; +ML_file "System/java.ML"; ML_file "System/scala.ML"; ML_file "System/process_result.ML"; ML_file "System/isabelle_system.ML"; (*theory documents*) ML_file "Thy/term_style.ML"; ML_file "Isar/outer_syntax.ML"; ML_file "Thy/document_antiquotation.ML"; ML_file "Thy/document_output.ML"; ML_file "Thy/document_antiquotations.ML"; ML_file "General/graph_display.ML"; ML_file "pure_syn.ML"; ML_file "PIDE/command.ML"; ML_file "PIDE/query_operation.ML"; ML_file "PIDE/resources.ML"; ML_file "Thy/thy_info.ML"; ML_file "thm_deps.ML"; ML_file "Thy/export_theory.ML"; ML_file "Thy/sessions.ML"; ML_file "PIDE/session.ML"; ML_file "PIDE/document.ML"; (*ML add-ons*) ML_file "ML/ml_pp.ML"; ML_file "ML/ml_thms.ML"; ML_file "ML/ml_instantiate.ML"; ML_file "ML/ml_file.ML"; ML_file "ML/ml_pid.ML"; (*theory and proof operations*) ML_file "Isar/isar_cmd.ML"; subsection "Isabelle/Isar system"; ML_file "System/command_line.ML"; ML_file "System/message_channel.ML"; ML_file "System/isabelle_process.ML"; ML_file "System/scala_compiler.ML"; ML_file "System/isabelle_tool.ML"; ML_file "Thy/bibtex.ML"; ML_file "PIDE/protocol.ML"; ML_file "General/output_primitives_virtual.ML"; subsection "Miscellaneous tools and packages for Pure Isabelle"; ML_file "General/base64.ML"; ML_file "General/xz.ML"; ML_file "Tools/build.ML"; ML_file "Tools/named_thms.ML"; ML_file "Tools/print_operation.ML"; ML_file "Tools/rail.ML"; ML_file "Tools/rule_insts.ML"; ML_file "Tools/thy_deps.ML"; ML_file "Tools/class_deps.ML"; ML_file "Tools/find_theorems.ML"; ML_file "Tools/find_consts.ML"; ML_file "Tools/simplifier_trace.ML"; ML_file_no_debug "Tools/debugger.ML"; ML_file "Tools/named_theorems.ML"; ML_file "Tools/doc.ML"; ML_file "Tools/jedit.ML"; ML_file "Tools/ghc.ML"; +ML_file "Tools/scala_build.ML"; ML_file "Tools/generated_files.ML"; -ML_file "Tools/scala_build.ML"; diff --git a/src/Pure/System/isabelle_system.scala b/src/Pure/System/isabelle_system.scala --- a/src/Pure/System/isabelle_system.scala +++ b/src/Pure/System/isabelle_system.scala 
@@ -1,501 +1,501 @@ /* Title: Pure/System/isabelle_system.scala Author: Makarius Miscellaneous Isabelle system operations. */ package isabelle import java.util.{Map => JMap, HashMap} import java.io.{File => JFile, IOException} import java.nio.file.{Path => JPath, Files, SimpleFileVisitor, FileVisitResult, StandardCopyOption, FileSystemException} import java.nio.file.attribute.BasicFileAttributes import scala.jdk.CollectionConverters._ object Isabelle_System { /* settings environment */ def settings(putenv: List[(String, String)] = Nil): JMap[String, String] = { val env0 = isabelle.setup.Environment.settings() if (putenv.isEmpty) env0 else { val env = new HashMap(env0) for ((a, b) <- putenv) env.put(a, b) env } } def getenv(name: String, env: JMap[String, String] = settings()): String = Option(env.get(name)).getOrElse("") def getenv_strict(name: String, env: JMap[String, String] = settings()): String = proper_string(getenv(name, env)) getOrElse error("Undefined Isabelle environment variable: " + quote(name)) /* services */ abstract class Service @volatile private var _services: Option[List[Class[Service]]] = None def services(): List[Class[Service]] = { if (_services.isEmpty) init() /* unsynchronized check */ _services.get } def make_services[C](c: Class[C]): List[C] = for { c1 <- services() if Library.is_subclass(c1, c) } yield c1.getDeclaredConstructor().newInstance().asInstanceOf[C] /* init settings + services */ - def make_services(): List[Class[Service]] = { + def get_services(): List[Class[Service]] = { def make(where: String, names: List[String]): List[Class[Service]] = { for (name <- names) yield { def err(msg: String): Nothing = error("Bad Isabelle/Scala service " + quote(name) + " in " + where + "\n" + msg) try { Class.forName(name).asInstanceOf[Class[Service]] } catch { case _: ClassNotFoundException => err("Class not found") case exn: Throwable => err(Exn.message(exn)) } } } def from_env(variable: String): List[Class[Service]] = make(quote(variable), space_explode(':', getenv_strict(variable))) def from_jar(platform_jar: String): List[Class[Service]] = make(quote(platform_jar), isabelle.setup.Build.get_services(JPath.of(platform_jar)).asScala.toList) - from_env("ISABELLE_SCALA_SERVICES") ::: Scala.class_path().flatMap(from_jar) + from_env("ISABELLE_SCALA_SERVICES") ::: Scala.get_classpath().flatMap(from_jar) } def init(isabelle_root: String = "", cygwin_root: String = ""): Unit = { isabelle.setup.Environment.init(isabelle_root, cygwin_root) - synchronized { if (_services.isEmpty) { _services = Some(make_services()) } } + synchronized { if (_services.isEmpty) { _services = Some(get_services()) } } } /* getetc -- static distribution parameters */ def getetc(name: String, root: Path = Path.ISABELLE_HOME): Option[String] = { val path = root + Path.basic("etc") + Path.basic(name) if (path.is_file) { Library.trim_split_lines(File.read(path)) match { case Nil => None case List(s) => Some(s) case _ => error("Single line expected in " + path.absolute) } } else None } /* Isabelle distribution identification */ def isabelle_id(root: Path = Path.ISABELLE_HOME): String = getetc("ISABELLE_ID", root = root) orElse Mercurial.archive_id(root) orElse Mercurial.id_repository(root, rev = "") getOrElse error("Failed to identify Isabelle distribution " + root.expand) object Isabelle_Id extends Scala.Fun_String("isabelle_id") { val here = Scala_Project.here def apply(arg: String): String = isabelle_id() } def isabelle_tags(root: Path = Path.ISABELLE_HOME): String = getetc("ISABELLE_TAGS", root = root)
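/* Illustrative note (names hypothetical): a user-defined service provider
   looks like

     class My_Functions extends Scala.Functions(My_Fun)

   and is discovered via ISABELLE_SCALA_SERVICES or a scanned jar, as
   implemented above. */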
orElse Mercurial.archive_tags(root) getOrElse { if (Mercurial.is_repository(root)) { val hg = Mercurial.repository(root) hg.tags(rev = hg.parent()) } else "" } def export_isabelle_identifier(isabelle_identifier: String): String = "export ISABELLE_IDENTIFIER=" + Bash.string(isabelle_identifier) + "\n" def isabelle_identifier(): Option[String] = proper_string(getenv("ISABELLE_IDENTIFIER")) def isabelle_heading(): String = isabelle_identifier() match { case None => "" case Some(version) => " (" + version + ")" } def isabelle_name(): String = getenv_strict("ISABELLE_NAME") def identification(): String = "Isabelle" + (try { "/" + isabelle_id () } catch { case ERROR(_) => "" }) + isabelle_heading() /** file-system operations **/ /* scala functions */ private def apply_paths( args: List[String], fun: PartialFunction[List[Path], Unit] ): List[String] = { fun(args.map(Path.explode)) Nil } private def apply_paths1(args: List[String], fun: Path => Unit): List[String] = apply_paths(args, { case List(path) => fun(path) }) private def apply_paths2(args: List[String], fun: (Path, Path) => Unit): List[String] = apply_paths(args, { case List(path1, path2) => fun(path1, path2) }) private def apply_paths3(args: List[String], fun: (Path, Path, Path) => Unit): List[String] = apply_paths(args, { case List(path1, path2, path3) => fun(path1, path2, path3) }) /* permissions */ def chmod(arg: String, path: Path): Unit = bash("chmod " + arg + " " + File.bash_path(path)).check def chown(arg: String, path: Path): Unit = bash("chown " + arg + " " + File.bash_path(path)).check /* directories */ def make_directory(path: Path): Path = { if (!path.is_dir) { try { Files.createDirectories(path.java_path) } catch { case ERROR(_) => error("Failed to create directory: " + path.absolute) } } path } def new_directory(path: Path): Path = if (path.is_dir) error("Directory already exists: " + path.absolute) else make_directory(path) def copy_dir(dir1: Path, dir2: Path): Unit = { val res = bash("cp -a " + File.bash_path(dir1) + " " + File.bash_path(dir2)) if (!res.ok) { cat_error("Failed to copy directory " + dir1.absolute + " to " + dir2.absolute, res.err) } } def with_copy_dir[A](dir1: Path, dir2: Path)(body: => A): A = { if (dir2.is_file || dir2.is_dir) error("Directory already exists: " + dir2.absolute) else { try { copy_dir(dir1, dir2); body } finally { rm_tree(dir2 ) } } } object Make_Directory extends Scala.Fun_Strings("make_directory") { val here = Scala_Project.here def apply(args: List[String]): List[String] = apply_paths1(args, make_directory) } object Copy_Dir extends Scala.Fun_Strings("copy_dir") { val here = Scala_Project.here def apply(args: List[String]): List[String] = apply_paths2(args, copy_dir) } /* copy files */ def copy_file(src: JFile, dst: JFile): Unit = { val target = if (dst.isDirectory) new JFile(dst, src.getName) else dst if (!File.eq(src, target)) { try { Files.copy(src.toPath, target.toPath, StandardCopyOption.COPY_ATTRIBUTES, StandardCopyOption.REPLACE_EXISTING) } catch { case ERROR(msg) => cat_error("Failed to copy file " + File.path(src).absolute + " to " + File.path(dst).absolute, msg) } } } def copy_file(src: Path, dst: Path): Unit = copy_file(src.file, dst.file) def copy_file_base(base_dir: Path, src: Path, target_dir: Path): Unit = { val src1 = src.expand val src1_dir = src1.dir if (!src1.starts_basic) error("Illegal path specification " + src1 + " beyond base directory") copy_file(base_dir + src1, Isabelle_System.make_directory(target_dir + src1_dir)) } object Copy_File extends 
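/* Illustrative note: each Fun_Strings object in this section registers a
   file-system operation as an Isabelle/Scala function, so that it can be
   invoked by name (e.g. "copy_file") from Isabelle/ML via the Scala
   protocol. */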
Scala.Fun_Strings("copy_file") { val here = Scala_Project.here def apply(args: List[String]): List[String] = apply_paths2(args, copy_file) } object Copy_File_Base extends Scala.Fun_Strings("copy_file_base") { val here = Scala_Project.here def apply(args: List[String]): List[String] = apply_paths3(args, copy_file_base) } /* move files */ def move_file(src: JFile, dst: JFile): Unit = { val target = if (dst.isDirectory) new JFile(dst, src.getName) else dst if (!File.eq(src, target)) Files.move(src.toPath, target.toPath, StandardCopyOption.REPLACE_EXISTING) } def move_file(src: Path, dst: Path): Unit = move_file(src.file, dst.file) /* symbolic link */ def symlink(src: Path, dst: Path, force: Boolean = false, native: Boolean = false): Unit = { val src_file = src.file val dst_file = dst.file val target = if (dst_file.isDirectory) new JFile(dst_file, src_file.getName) else dst_file if (force) target.delete def cygwin_link(): Unit = { if (native) { error("Failed to create native symlink on Windows: " + quote(src_file.toString) + "\n(but it could work as Administrator)") } else isabelle.setup.Environment.cygwin_link(File.standard_path(src), target) } try { Files.createSymbolicLink(target.toPath, src_file.toPath) } catch { case _: UnsupportedOperationException if Platform.is_windows => cygwin_link() case _: FileSystemException if Platform.is_windows => cygwin_link() } } /* tmp files */ def isabelle_tmp_prefix(): JFile = { val path = Path.explode("$ISABELLE_TMP_PREFIX") path.file.mkdirs // low-level mkdirs to avoid recursion via Isabelle environment File.platform_file(path) } def tmp_file(name: String, ext: String = "", base_dir: JFile = isabelle_tmp_prefix()): JFile = { val suffix = if (ext == "") "" else "." + ext val file = Files.createTempFile(base_dir.toPath, name, suffix).toFile file.deleteOnExit() file } def with_tmp_file[A](name: String, ext: String = "")(body: Path => A): A = { val file = tmp_file(name, ext) try { body(File.path(file)) } finally { file.delete } } /* tmp dirs */ def rm_tree(root: JFile): Unit = { root.delete if (root.isDirectory) { Files.walkFileTree(root.toPath, new SimpleFileVisitor[JPath] { override def visitFile(file: JPath, attrs: BasicFileAttributes): FileVisitResult = { try { Files.deleteIfExists(file) } catch { case _: IOException => } FileVisitResult.CONTINUE } override def postVisitDirectory(dir: JPath, e: IOException): FileVisitResult = { if (e == null) { try { Files.deleteIfExists(dir) } catch { case _: IOException => } FileVisitResult.CONTINUE } else throw e } } ) } } def rm_tree(root: Path): Unit = rm_tree(root.file) object Rm_Tree extends Scala.Fun_Strings("rm_tree") { val here = Scala_Project.here def apply(args: List[String]): List[String] = apply_paths1(args, rm_tree) } def tmp_dir(name: String, base_dir: JFile = isabelle_tmp_prefix()): JFile = { val dir = Files.createTempDirectory(base_dir.toPath, name).toFile dir.deleteOnExit() dir } def with_tmp_dir[A](name: String)(body: Path => A): A = { val dir = tmp_dir(name) try { body(File.path(dir)) } finally { rm_tree(dir) } } /* quasi-atomic update of directory */ def update_directory(dir: Path, f: Path => Unit): Unit = { val new_dir = dir.ext("new") val old_dir = dir.ext("old") rm_tree(new_dir) rm_tree(old_dir) f(new_dir) if (dir.is_dir) move_file(dir, old_dir) move_file(new_dir, dir) rm_tree(old_dir) } /** external processes **/ /* GNU bash */ def bash(script: String, description: String = "", cwd: JFile = null, env: JMap[String, String] = settings(), redirect: Boolean = false, input: String = "", 
progress_stdout: String => Unit = (_: String) => (), progress_stderr: String => Unit = (_: String) => (), watchdog: Option[Bash.Watchdog] = None, strict: Boolean = true, cleanup: () => Unit = () => () ): Process_Result = { Bash.process(script, description = description, cwd = cwd, env = env, redirect = redirect, cleanup = cleanup). result(input = input, progress_stdout = progress_stdout, progress_stderr = progress_stderr, watchdog = watchdog, strict = strict) } /* command-line tools */ def require_command(cmd: String, test: String = "--version"): Unit = { if (!bash(Bash.string(cmd) + " " + test).ok) error("Missing system command: " + quote(cmd)) } private lazy val gnutar_check: Boolean = try { bash("tar --version").check.out.containsSlice("GNU tar") || error("") } catch { case ERROR(_) => false } def gnutar( args: String, dir: Path = Path.current, original_owner: Boolean = false, strip: Int = 0, redirect: Boolean = false ): Process_Result = { val options = (if (dir.is_current) "" else "-C " + File.bash_path(dir) + " ") + (if (original_owner) "" else "--owner=root --group=staff ") + (if (strip <= 0) "" else "--strip-components=" + strip + " ") if (gnutar_check) bash("tar " + options + args, redirect = redirect) else error("Expected to find GNU tar executable") } def make_patch(base_dir: Path, src: Path, dst: Path, diff_options: String = ""): String = { with_tmp_file("patch") { patch => Isabelle_System.bash( "diff -ru " + diff_options + " -- " + File.bash_path(src) + " " + File.bash_path(dst) + " > " + File.bash_path(patch), cwd = base_dir.file).check_rc(_ <= 1) File.read(patch) } } def hostname(): String = bash("hostname -s").check.out def open(arg: String): Unit = bash("exec \"$ISABELLE_OPEN\" " + Bash.string(arg) + " >/dev/null 2>/dev/null &") def pdf_viewer(arg: Path): Unit = bash("exec \"$PDF_VIEWER\" " + File.bash_path(arg) + " >/dev/null 2>/dev/null &") def open_external_file(name: String): Boolean = { val ext = Library.take_suffix((c: Char) => c != '.', name.toList)._2.mkString val external = ext.nonEmpty && Library.space_explode(':', getenv("ISABELLE_EXTERNAL_FILES")).contains(ext) if (external) { if (ext == "pdf" && Path.is_wellformed(name)) pdf_viewer(Path.explode(name)) else open(name) } external } /** Isabelle resources **/ /* repository clone with Admin */ def admin(): Boolean = Path.explode("~~/Admin").is_dir /* default logic */ def default_logic(args: String*): String = { args.find(_ != "") match { case Some(logic) => logic case None => getenv_strict("ISABELLE_LOGIC") } } /* download file */ def download(url_name: String, progress: Progress = new Progress): HTTP.Content = { val url = Url(url_name) progress.echo("Getting " + quote(url_name)) try { HTTP.Client.get(url) } catch { case ERROR(msg) => cat_error("Failed to download " + quote(url_name), msg) } } def download_file(url_name: String, file: Path, progress: Progress = new Progress): Unit = Bytes.write(file, download(url_name, progress = progress).bytes) object Download extends Scala.Fun("download", thread = true) { val here = Scala_Project.here override def invoke(args: List[Bytes]): List[Bytes] = args.map(url => download(url.text).bytes) } /* repositories */ val isabelle_repository: Mercurial.Server = Mercurial.Server("https://isabelle.sketis.net/repos/isabelle") val afp_repository: Mercurial.Server = Mercurial.Server("https://isabelle.sketis.net/repos/afp-devel") def official_releases(): List[String] = Library.trim_split_lines( isabelle_repository.read_file(Path.explode("Admin/Release/official"))) } diff --git 
a/src/Pure/System/java.ML new file mode 100644 --- /dev/null +++ b/src/Pure/System/java.ML @@ -0,0 +1,42 @@ +(* Title: Pure/System/java.ML + Author: Makarius + +Support for Java language. +*) + +signature JAVA = +sig + val print_string: string -> string +end; + +structure Java: JAVA = +struct + +(* string literals *) + +local + +val print_str = + fn "\b" => "\\b" + | "\t" => "\\t" + | "\n" => "\\n" + | "\f" => "\\f" + | "\r" => "\\r" + | "\"" => "\\\"" + | "\\" => "\\\\" + | s => + let val c = ord s in + if c < 16 then "\\u000" ^ Int.fmt StringCvt.HEX c + else if c < 128 then "\\u00" ^ Int.fmt StringCvt.HEX c + else raise Fail ("Cannot print non-ASCII Java/Scala string literal: " ^ quote s) + end; + +in + +fun print_string str = + quote (translate_string print_str str) + handle Fail msg => error msg; + +end; + +end; diff --git a/src/Pure/System/scala.scala b/src/Pure/System/scala.scala --- a/src/Pure/System/scala.scala +++ b/src/Pure/System/scala.scala @@ -1,390 +1,390 @@ /* Title: Pure/System/scala.scala Author: Makarius Support for Scala at runtime. */ package isabelle import java.io.{File => JFile, PrintStream, ByteArrayOutputStream, OutputStream} import scala.collection.mutable import scala.annotation.tailrec import dotty.tools.dotc.CompilationUnit import dotty.tools.dotc.ast.Trees.PackageDef import dotty.tools.dotc.ast.untpd import dotty.tools.dotc.core.Contexts.{Context => CompilerContext} import dotty.tools.dotc.core.NameOps.moduleClassName import dotty.tools.dotc.core.{Phases, StdNames} import dotty.tools.dotc.interfaces import dotty.tools.dotc.reporting.{Diagnostic, ConsoleReporter} import dotty.tools.dotc.util.{SourceFile, SourcePosition, NoSourcePosition} import dotty.tools.repl import dotty.tools.repl.{ReplCompiler, ReplDriver} object Scala { /** registered functions **/ abstract class Fun(val name: String, val thread: Boolean = false) { override def toString: String = name def single: Boolean = false def bytes: Boolean = false def position: Properties.T = here.position def here: Scala_Project.Here def invoke(args: List[Bytes]): List[Bytes] } trait Single_Fun extends Fun { override def single: Boolean = true } trait Bytes_Fun extends Fun { override def bytes: Boolean = true } abstract class Fun_Strings(name: String, thread: Boolean = false) extends Fun(name, thread = thread) { override def invoke(args: List[Bytes]): List[Bytes] = apply(args.map(_.text)).map(Bytes.apply) def apply(args: List[String]): List[String] } abstract class Fun_String(name: String, thread: Boolean = false) extends Fun_Strings(name, thread = thread) with Single_Fun { override def apply(args: List[String]): List[String] = List(apply(Library.the_single(args))) def apply(arg: String): String } abstract class Fun_Bytes(name: String, thread: Boolean = false) extends Fun(name, thread = thread) with Single_Fun with Bytes_Fun { override def invoke(args: List[Bytes]): List[Bytes] = List(apply(Library.the_single(args))) def apply(arg: Bytes): Bytes } val encode_fun: XML.Encode.T[Fun] = { fun => import XML.Encode._ pair(string, pair(pair(bool, bool), properties))( fun.name, ((fun.single, fun.bytes), fun.position)) } class Functions(val functions: Fun*) extends Isabelle_System.Service lazy val functions: List[Fun] = Isabelle_System.make_services(classOf[Functions]).flatMap(_.functions) /** demo functions **/ object Echo extends Fun_String("echo") { val here = Scala_Project.here def apply(arg: String): String = arg } object Sleep extends
Fun_String("sleep") { val here = Scala_Project.here def apply(seconds: String): String = { val t = seconds match { case Value.Double(s) => Time.seconds(s) case _ => error("Malformed argument: " + quote(seconds)) } val t0 = Time.now() t.sleep() val t1 = Time.now() (t1 - t0).toString } } /** compiler **/ - def class_path(): List[String] = + def get_classpath(): List[String] = space_explode(JFile.pathSeparatorChar, System.getProperty("java.class.path", "")) .filter(_.nonEmpty) object Compiler { object Message { object Kind extends Enumeration { val error, warning, info, other = Value } private val Header = """^--.* (Error|Warning|Info): .*$""".r val header_kind: String => Kind.Value = { case "Error" => Kind.error case "Warning" => Kind.warning case "Info" => Kind.info case _ => Kind.other } // see compiler/src/dotty/tools/dotc/reporting/MessageRendering.scala def split(str: String): List[Message] = { var kind = Kind.other val text = new mutable.StringBuilder val result = new mutable.ListBuffer[Message] def flush(): Unit = { if (text.nonEmpty) { result += Message(kind, text.toString) } kind = Kind.other text.clear() } for (line <- Library.trim_split_lines(str)) { line match { case Header(k) => flush(); kind = header_kind(k) case _ => if (line.startsWith("-- ")) flush() } if (text.nonEmpty) { text += '\n' } text ++= line } flush() result.toList } } sealed case class Message(kind: Message.Kind.Value, text: String) { def is_error: Boolean = kind == Message.Kind.error override def toString: String = text } sealed case class Result( state: repl.State, messages: List[Message], unit: Option[CompilationUnit] = None ) { val errors: List[String] = messages.flatMap(msg => if (msg.is_error) Some(msg.text) else None) def ok: Boolean = errors.isEmpty def check_state: repl.State = if (ok) state else error(cat_lines(errors)) override def toString: String = if (ok) "Result(ok)" else "Result(error)" } def context( settings: List[String] = Nil, jar_dirs: List[JFile] = Nil, class_loader: Option[ClassLoader] = None ): Context = { val isabelle_settings = Word.explode(Isabelle_System.getenv_strict("ISABELLE_SCALAC_OPTIONS")) def find_jars(dir: JFile): List[String] = File.find_files(dir, file => file.getName.endsWith(".jar")). 
map(File.absolute_name) - val classpath = (class_path() ::: jar_dirs.flatMap(find_jars)).mkString(JFile.pathSeparator) + val classpath = (get_classpath() ::: jar_dirs.flatMap(find_jars)).mkString(JFile.pathSeparator) val settings1 = isabelle_settings ::: settings ::: List("-classpath", classpath) new Context(settings1, class_loader) } class Context private [Compiler]( val settings: List[String], val class_loader: Option[ClassLoader] = None ) { private val out_stream = new ByteArrayOutputStream(1024) private val out = new PrintStream(out_stream) private val driver: ReplDriver = new ReplDriver(settings.toArray, out, class_loader) def init_state: repl.State = driver.initialState def compile(source: String, state: repl.State = init_state): Result = { out.flush() out_stream.reset() val state1 = driver.run(source)(state) out.flush() val messages = Message.split(out_stream.toString(UTF8.charset)) out_stream.reset() Result(state1, messages) } } } object Toplevel extends Fun_String("scala_toplevel") { val here = Scala_Project.here def apply(source: String): String = { val errors = try { Compiler.context().compile(source).errors.map("Scala error: " + _) } catch { case ERROR(msg) => List(msg) } locally { import XML.Encode._; YXML.string_of_body(list(string)(errors)) } } } /** interpreter thread **/ object Interpreter { /* requests */ sealed abstract class Request case class Execute(command: (Compiler.Context, repl.State) => repl.State) extends Request case object Shutdown extends Request /* known interpreters */ private val known = Synchronized(Set.empty[Interpreter]) def add(interpreter: Interpreter): Unit = known.change(_ + interpreter) def del(interpreter: Interpreter): Unit = known.change(_ - interpreter) def get[A](which: PartialFunction[Interpreter, A]): Option[A] = known.value.collectFirst(which) } class Interpreter(context: Compiler.Context, out: OutputStream = Console.out) { interpreter => private val running = Synchronized[Option[Thread]](None) def running_thread(thread: Thread): Boolean = running.value.contains(thread) def interrupt_thread(): Unit = running.change({ opt => opt.foreach(_.interrupt()); opt }) private var state = context.init_state private lazy val thread: Consumer_Thread[Interpreter.Request] = Consumer_Thread.fork("Scala.Interpreter") { case Interpreter.Execute(command) => try { running.change(_ => Some(Thread.currentThread())) state = command(context, state) } finally { running.change(_ => None) Exn.Interrupt.dispose() } true case Interpreter.Shutdown => Interpreter.del(interpreter) false } def shutdown(): Unit = { thread.send(Interpreter.Shutdown) interrupt_thread() thread.shutdown() } def execute(command: (Compiler.Context, repl.State) => repl.State): Unit = thread.send(Interpreter.Execute(command)) def reset(): Unit = thread.send(Interpreter.Execute((context, _) => context.init_state)) Interpreter.add(interpreter) thread } /** invoke Scala functions from ML **/ /* invoke function */ object Tag extends Enumeration { val NULL, OK, ERROR, FAIL, INTERRUPT = Value } def function_thread(name: String): Boolean = functions.find(fun => fun.name == name) match { case Some(fun) => fun.thread case None => false } def function_body(name: String, args: List[Bytes]): (Tag.Value, List[Bytes]) = functions.find(fun => fun.name == name) match { case Some(fun) => Exn.capture { fun.invoke(args) } match { case Exn.Res(null) => (Tag.NULL, Nil) case Exn.Res(res) => (Tag.OK, res) case Exn.Exn(Exn.Interrupt()) => (Tag.INTERRUPT, Nil) case Exn.Exn(e) => (Tag.ERROR, List(Bytes(Exn.message(e)))) } 
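// NB (editorial gloss, not in the original source): these tags mirror
// Scala.Tag above -- OK carries regular results, ERROR an exception message,
// INTERRUPT a cancelled evaluation, NULL a null result; FAIL below reports
// an unknown function name back to Isabelle/ML.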
case None => (Tag.FAIL, List(Bytes("Unknown Isabelle/Scala function: " + quote(name)))) } /* protocol handler */ class Handler extends Session.Protocol_Handler { private var session: Session = null private var futures = Map.empty[String, Future[Unit]] override def init(session: Session): Unit = synchronized { this.session = session } override def exit(): Unit = synchronized { for ((id, future) <- futures) cancel(id, future) futures = Map.empty } private def result(id: String, tag: Scala.Tag.Value, res: List[Bytes]): Unit = synchronized { if (futures.isDefinedAt(id)) { session.protocol_command_raw("Scala.result", Bytes(id) :: Bytes(tag.id.toString) :: res) futures -= id } } private def cancel(id: String, future: Future[Unit]): Unit = { future.cancel() result(id, Scala.Tag.INTERRUPT, Nil) } private def invoke_scala(msg: Prover.Protocol_Output): Boolean = synchronized { msg.properties match { case Markup.Invoke_Scala(name, id) => def body(): Unit = { val (tag, res) = Scala.function_body(name, msg.chunks) result(id, tag, res) } val future = if (Scala.function_thread(name)) { Future.thread(name = Isabelle_Thread.make_name(base = "invoke_scala"))(body()) } else Future.fork(body()) futures += (id -> future) true case _ => false } } private def cancel_scala(msg: Prover.Protocol_Output): Boolean = synchronized { msg.properties match { case Markup.Cancel_Scala(id) => futures.get(id) match { case Some(future) => cancel(id, future) case None => } true case _ => false } } override val functions: Session.Protocol_Functions = List( Markup.Invoke_Scala.name -> invoke_scala, Markup.Cancel_Scala.name -> cancel_scala) } } class Scala_Functions extends Scala.Functions( Scala.Echo, Scala.Sleep, Scala.Toplevel, Scala_Build.Scala_Fun, Base64.Decode, Base64.Encode, XZ.Compress, XZ.Uncompress, Doc.Doc_Names, Bibtex.Check_Database, Isabelle_System.Make_Directory, Isabelle_System.Copy_Dir, Isabelle_System.Copy_File, Isabelle_System.Copy_File_Base, Isabelle_System.Rm_Tree, Isabelle_System.Download, Isabelle_System.Isabelle_Id, Isabelle_Tool.Isabelle_Tools, isabelle.atp.SystemOnTPTP.List_Systems, isabelle.atp.SystemOnTPTP.Run_System) diff --git a/src/Pure/Thy/document_build.scala b/src/Pure/Thy/document_build.scala --- a/src/Pure/Thy/document_build.scala +++ b/src/Pure/Thy/document_build.scala @@ -1,491 +1,494 @@ /* Title: Pure/Thy/document_build.scala Author: Makarius Build theory document (PDF) from session database. 
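Editorial gloss (not part of the original header): an Engine prepares the
LaTeX sources of each document variant in a temporary directory, runs
lualatex/pdflatex or a custom build script, and stores the resulting PDF
together with a compressed build log in the "isabelle_documents" database
table.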
*/ package isabelle object Document_Build { /* document variants */ abstract class Document_Name { def name: String def path: Path = Path.basic(name) override def toString: String = name } object Document_Variant { def parse(opt: String): Document_Variant = Library.space_explode('=', opt) match { case List(name) => Document_Variant(name, Latex.Tags.empty) case List(name, tags) => Document_Variant(name, Latex.Tags(tags)) case _ => error("Malformed document variant: " + quote(opt)) } } sealed case class Document_Variant(name: String, tags: Latex.Tags) extends Document_Name { def print: String = if (tags.toString.isEmpty) name else name + "=" + tags.toString } sealed case class Document_Input(name: String, sources: SHA1.Digest) extends Document_Name sealed case class Document_Output(name: String, sources: SHA1.Digest, log_xz: Bytes, pdf: Bytes) extends Document_Name { def log: String = log_xz.uncompress().text def log_lines: List[String] = split_lines(log) def write(db: SQL.Database, session_name: String): Unit = write_document(db, session_name, this) def write(dir: Path): Path = { val path = dir + Path.basic(name).pdf Isabelle_System.make_directory(path.expand.dir) Bytes.write(path, pdf) path } } /* SQL data model */ object Data { val session_name = SQL.Column.string("session_name").make_primary_key val name = SQL.Column.string("name").make_primary_key val sources = SQL.Column.string("sources") val log_xz = SQL.Column.bytes("log_xz") val pdf = SQL.Column.bytes("pdf") val table = SQL.Table("isabelle_documents", List(session_name, name, sources, log_xz, pdf)) def where_equal(session_name: String, name: String = ""): SQL.Source = "WHERE " + Data.session_name.equal(session_name) + (if (name == "") "" else " AND " + Data.name.equal(name)) } def read_documents(db: SQL.Database, session_name: String): List[Document_Input] = { val select = Data.table.select(List(Data.name, Data.sources), Data.where_equal(session_name)) db.using_statement(select)(stmt => stmt.execute_query().iterator({ res => val name = res.string(Data.name) val sources = res.string(Data.sources) Document_Input(name, SHA1.fake_digest(sources)) }).toList) } def read_document( db: SQL.Database, session_name: String, name: String ): Option[Document_Output] = { val select = Data.table.select(sql = Data.where_equal(session_name, name)) db.using_statement(select)({ stmt => val res = stmt.execute_query() if (res.next()) { val name = res.string(Data.name) val sources = res.string(Data.sources) val log_xz = res.bytes(Data.log_xz) val pdf = res.bytes(Data.pdf) Some(Document_Output(name, SHA1.fake_digest(sources), log_xz, pdf)) } else None }) } def write_document(db: SQL.Database, session_name: String, doc: Document_Output): Unit = { db.using_statement(Data.table.insert()){ stmt => stmt.string(1) = session_name stmt.string(2) = doc.name stmt.string(3) = doc.sources.toString stmt.bytes(4) = doc.log_xz stmt.bytes(5) = doc.pdf stmt.execute() } } /* context */ val texinputs: Path = Path.explode("~~/lib/texinputs") val isabelle_styles: List[Path] = List("isabelle.sty", "isabellesym.sty", "pdfsetup.sty", "railsetup.sty"). 
map(name => texinputs + Path.basic(name)) def context( session: String, deps: Sessions.Deps, db_context: Sessions.Database_Context, progress: Progress = new Progress ): Context = { - val info = deps.sessions_structure(session) + val structure = deps.sessions_structure + val info = structure(session) val base = deps(session) val hierarchy = deps.sessions_structure.build_hierarchy(session) - new Context(info, base, hierarchy, db_context, progress) + val classpath = db_context.get_classpath(structure, session) + new Context(info, base, hierarchy, db_context, classpath, progress) } final class Context private[Document_Build]( info: Sessions.Info, base: Sessions.Base, hierarchy: List[String], db_context: Sessions.Database_Context, + val classpath: List[File.Content_Bytes], val progress: Progress = new Progress ) { /* session info */ def session: String = info.name def options: Options = info.options def document_bibliography: Boolean = options.bool("document_bibliography") def document_logo: Option[String] = options.string("document_logo") match { case "" => None case "_" => Some("") case name => Some(name) } def document_build: String = options.string("document_build") def get_engine(): Engine = { val name = document_build engines.find(_.name == name).getOrElse(error("Bad document_build engine " + quote(name))) } def get_export(theory: String, name: String): Export.Entry = db_context.get_export(hierarchy, theory, name) /* document content */ def documents: List[Document_Variant] = info.documents def session_theories: List[Document.Node.Name] = base.session_theories def document_theories: List[Document.Node.Name] = session_theories ::: base.document_theories lazy val document_latex: List[File.Content_XML] = for (name <- document_theories) yield { val path = Path.basic(tex_name(name)) val content = YXML.parse_body(get_export(name.theory, Export.DOCUMENT_LATEX).text) File.Content(path, content) } lazy val session_graph: File.Content = { val path = Presentation.session_graph_path val content = graphview.Graph_File.make_pdf(options, base.session_graph_display) File.Content(path, content) } lazy val session_tex: File.Content = { val path = Path.basic("session.tex") val content = Library.terminate_lines( base.session_theories.map(name => "\\input{" + tex_name(name) + "}")) File.Content(path, content) } lazy val isabelle_logo: Option[File.Content] = { document_logo.map(logo_name => Isabelle_System.with_tmp_file("logo", ext = "pdf") { tmp_path => Logo.create_logo(logo_name, output_file = tmp_path, quiet = true) val path = Path.basic("isabelle_logo.pdf") val content = Bytes.read(tmp_path) File.Content(path, content) }) } /* document directory */ def prepare_directory( dir: Path, doc: Document_Variant, latex_output: Latex.Output ): Directory = { val doc_dir = Isabelle_System.make_directory(dir + Path.basic(doc.name)) /* actual sources: with SHA1 digest */ isabelle_styles.foreach(Isabelle_System.copy_file(_, doc_dir)) val comment_latex = options.bool("document_comment_latex") if (!comment_latex) { Isabelle_System.copy_file(texinputs + Path.basic("comment.sty"), doc_dir) } doc.tags.sty(comment_latex).write(doc_dir) for ((base_dir, src) <- info.document_files) { Isabelle_System.copy_file_base(info.dir + base_dir, src, doc_dir) } session_tex.write(doc_dir) for (content <- document_latex) { content.output(latex_output(_, file_pos = content.path.implode_symbolic)) .write(doc_dir) } val root_name1 = "root_" + doc.name val root_name = if ((doc_dir + Path.explode(root_name1).tex).is_file) root_name1 else "root" 
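// NB (editorial, not in the changeset): a document variant may ship its own
// main LaTeX file "root_<name>.tex"; otherwise the shared "root.tex" serves
// as entry point.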
val digests1 = List(doc.print, document_logo.toString, document_build).map(SHA1.digest) val digests2 = File.find_files(doc_dir.file, follow_links = true).map(SHA1.digest) val sources = SHA1.digest_set(digests1 ::: digests2) /* derived material: without SHA1 digest */ isabelle_logo.foreach(_.write(doc_dir)) session_graph.write(doc_dir) Directory(doc_dir, doc, root_name, sources) } def old_document(directory: Directory): Option[Document_Output] = for { old_doc <- db_context.input_database(session)(read_document(_, _, directory.doc.name)) if old_doc.sources == directory.sources } yield old_doc } sealed case class Directory( doc_dir: Path, doc: Document_Variant, root_name: String, sources: SHA1.Digest ) { def root_name_script(ext: String = ""): String = Bash.string(if (ext.isEmpty) root_name else root_name + "." + ext) def conditional_script(ext: String, exe: String, after: String = ""): String = "if [ -f " + root_name_script(ext) + " ]\n" + "then\n" + " " + exe + " " + root_name_script() + "\n" + (if (after.isEmpty) "" else " " + after) + "fi\n" def log_errors(): List[String] = Latex.latex_errors(doc_dir, root_name) ::: Bibtex.bibtex_errors(doc_dir, root_name) def make_document(log: List[String], errors: List[String]): Document_Output = { val root_pdf = Path.basic(root_name).pdf val result_pdf = doc_dir + root_pdf if (errors.nonEmpty) { val errors1 = errors ::: List("Failed to build document " + quote(doc.name)) throw new Build_Error(log, Exn.cat_message(errors1: _*)) } else if (!result_pdf.is_file) { val message = "Bad document result: expected to find " + root_pdf throw new Build_Error(log, message) } else { val log_xz = Bytes(cat_lines(log)).compress() val pdf = Bytes.read(result_pdf) Document_Output(doc.name, sources, log_xz, pdf) } } } /* build engines */ lazy val engines: List[Engine] = Isabelle_System.make_services(classOf[Engine]) abstract class Engine(val name: String) extends Isabelle_System.Service { override def toString: String = name def prepare_directory(context: Context, dir: Path, doc: Document_Variant): Directory def build_document(context: Context, directory: Directory, verbose: Boolean): Document_Output } abstract class Bash_Engine(name: String) extends Engine(name) { def prepare_directory(context: Context, dir: Path, doc: Document_Variant): Directory = context.prepare_directory(dir, doc, new Latex.Output(context.options)) def use_pdflatex: Boolean = false def latex_script(context: Context, directory: Directory): String = (if (use_pdflatex) "$ISABELLE_PDFLATEX" else "$ISABELLE_LUALATEX") + " " + directory.root_name_script() + "\n" def bibtex_script(context: Context, directory: Directory, latex: Boolean = false): String = { val ext = if (context.document_bibliography) "aux" else "bib" directory.conditional_script(ext, "$ISABELLE_BIBTEX", after = if (latex) latex_script(context, directory) else "") } def makeindex_script(context: Context, directory: Directory, latex: Boolean = false): String = directory.conditional_script("idx", "$ISABELLE_MAKEINDEX", after = if (latex) latex_script(context, directory) else "") def use_build_script: Boolean = false def build_script(context: Context, directory: Directory): String = { val has_build_script = (directory.doc_dir + Path.explode("build")).is_file if (!use_build_script && has_build_script) { error("Unexpected document build script for option document_build=" + quote(context.document_build)) } else if (use_build_script && !has_build_script) error("Missing document build script") else if (has_build_script) "./build pdf " + 
Bash.string(directory.doc.name) else { "set -e\n" + latex_script(context, directory) + bibtex_script(context, directory, latex = true) + makeindex_script(context, directory) + latex_script(context, directory) + makeindex_script(context, directory, latex = true) } } def build_document( context: Context, directory: Directory, verbose: Boolean ): Document_Output = { val result = context.progress.bash( build_script(context, directory), cwd = directory.doc_dir.file, echo = verbose, watchdog = Time.seconds(0.5)) val log = result.out_lines ::: result.err_lines val errors = (if (result.ok) Nil else List(result.err)) ::: directory.log_errors() directory.make_document(log, errors) } } class LuaLaTeX_Engine extends Bash_Engine("lualatex") class PDFLaTeX_Engine extends Bash_Engine("pdflatex") { override def use_pdflatex: Boolean = true } class Build_Engine extends Bash_Engine("build") { override def use_build_script: Boolean = true } /* build documents */ def tex_name(name: Document.Node.Name): String = name.theory_base_name + ".tex" class Build_Error(val log_lines: List[String], val message: String) extends Exn.User_Error(message) def build_documents( context: Context, output_sources: Option[Path] = None, output_pdf: Option[Path] = None, verbose: Boolean = false ): List[Document_Output] = { val progress = context.progress val engine = context.get_engine() val documents = for (doc <- context.documents) yield { Isabelle_System.with_tmp_dir("document") { tmp_dir => progress.echo("Preparing " + context.session + "/" + doc.name + " ...") val start = Time.now() output_sources.foreach(engine.prepare_directory(context, _, doc)) val directory = engine.prepare_directory(context, tmp_dir, doc) val document = context.old_document(directory) getOrElse engine.build_document(context, directory, verbose) val stop = Time.now() val timing = stop - start progress.echo("Finished " + context.session + "/" + doc.name + " (" + timing.message_hms + " elapsed time)") document } } for (dir <- output_pdf; doc <- documents) { val path = doc.write(dir) progress.echo("Document at " + path.absolute) } documents } /* Isabelle tool wrapper */ val isabelle_tool = Isabelle_Tool("document", "prepare session theory document", Scala_Project.here, { args => var output_sources: Option[Path] = None var output_pdf: Option[Path] = None var verbose_latex = false var dirs: List[Path] = Nil var options = Options.init() var verbose_build = false val getopts = Getopts(""" Usage: isabelle document [OPTIONS] SESSION Options are: -O DIR output directory for LaTeX sources and resulting PDF -P DIR output directory for resulting PDF -S DIR output directory for LaTeX sources -V verbose latex -d DIR include session directory -o OPTION override Isabelle system OPTION (via NAME=VAL or NAME) -v verbose build Prepare the theory document of a session. 
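
  Example (illustrative only):

    isabelle document -V -P output HOL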
""", "O:" -> (arg => { val dir = Path.explode(arg) output_sources = Some(dir) output_pdf = Some(dir) }), "P:" -> (arg => { output_pdf = Some(Path.explode(arg)) }), "S:" -> (arg => { output_sources = Some(Path.explode(arg)) }), "V" -> (_ => verbose_latex = true), "d:" -> (arg => dirs = dirs ::: List(Path.explode(arg))), "o:" -> (arg => options = options + arg), "v" -> (_ => verbose_build = true)) val more_args = getopts(args) val session = more_args match { case List(a) => a case _ => getopts.usage() } val progress = new Console_Progress(verbose = verbose_build) val store = Sessions.store(options) progress.interrupt_handler { val res = Build.build(options, selection = Sessions.Selection.session(session), dirs = dirs, progress = progress, verbose = verbose_build) if (!res.ok) error("Failed to build session " + quote(session)) val deps = Sessions.load_structure(options + "document=pdf", dirs = dirs). selection_deps(Sessions.Selection.session(session)) if (output_sources.isEmpty && output_pdf.isEmpty) { progress.echo_warning("No output directory") } using(store.open_database_context()) { db_context => build_documents(context(session, deps, db_context, progress = progress), output_sources = output_sources, output_pdf = output_pdf, verbose = verbose_latex) } } }) } diff --git a/src/Pure/Thy/export.scala b/src/Pure/Thy/export.scala --- a/src/Pure/Thy/export.scala +++ b/src/Pure/Thy/export.scala @@ -1,472 +1,456 @@ /* Title: Pure/Thy/export.scala Author: Makarius Manage theory exports: compressed blobs. */ package isabelle import scala.annotation.tailrec import scala.util.matching.Regex object Export { /* artefact names */ val DOCUMENT_ID = "PIDE/document_id" val FILES = "PIDE/files" val MARKUP = "PIDE/markup" val MESSAGES = "PIDE/messages" val DOCUMENT_PREFIX = "document/" val DOCUMENT_LATEX = DOCUMENT_PREFIX + "latex" val DOCUMENT_CITATIONS = DOCUMENT_PREFIX + "citations" val THEORY_PREFIX: String = "theory/" val PROOFS_PREFIX: String = "proofs/" def explode_name(s: String): List[String] = space_explode('/', s) def implode_name(elems: Iterable[String]): String = elems.mkString("/") /* SQL data model */ object Data { val session_name = SQL.Column.string("session_name").make_primary_key val theory_name = SQL.Column.string("theory_name").make_primary_key val name = SQL.Column.string("name").make_primary_key val executable = SQL.Column.bool("executable") val compressed = SQL.Column.bool("compressed") val body = SQL.Column.bytes("body") val table = SQL.Table("isabelle_exports", List(session_name, theory_name, name, executable, compressed, body)) def where_equal(session_name: String, theory_name: String = "", name: String = ""): SQL.Source = "WHERE " + Data.session_name.equal(session_name) + (if (theory_name == "") "" else " AND " + Data.theory_name.equal(theory_name)) + (if (name == "") "" else " AND " + Data.name.equal(name)) } - def read_name( - db: SQL.Database, - session_name: String, - theory_name: String, - name: String - ): Boolean = { - val select = - Data.table.select(List(Data.name), Data.where_equal(session_name, theory_name, name)) - db.using_statement(select)(stmt => stmt.execute_query().next()) - } + def compound_name(a: String, b: String): String = + if (a.isEmpty) b else a + ":" + b - def read_names(db: SQL.Database, session_name: String, theory_name: String): List[String] = { - val select = Data.table.select(List(Data.name), Data.where_equal(session_name, theory_name)) - db.using_statement(select)(stmt => - stmt.execute_query().iterator(res => res.string(Data.name)).toList) + sealed 
case class Entry_Name(session: String = "", theory: String = "", name: String = "") { + val compound_name: String = Export.compound_name(theory, name) + + def make_path(prune: Int = 0): Path = { + val elems = theory :: space_explode('/', name) + if (elems.length < prune + 1) { + error("Cannot prune path by " + prune + " element(s): " + Path.make(elems)) + } + else Path.make(elems.drop(prune)) + } + + def readable(db: SQL.Database): Boolean = { + val select = Data.table.select(List(Data.name), Data.where_equal(session, theory, name)) + db.using_statement(select)(stmt => stmt.execute_query().next()) + } + + def read(db: SQL.Database, cache: XML.Cache): Option[Entry] = { + val select = + Data.table.select(List(Data.executable, Data.compressed, Data.body), + Data.where_equal(session, theory, name)) + db.using_statement(select) { stmt => + val res = stmt.execute_query() + if (res.next()) { + val executable = res.bool(Data.executable) + val compressed = res.bool(Data.compressed) + val bytes = res.bytes(Data.body) + val body = Future.value(compressed, bytes) + Some(Entry(this, executable, body, cache)) + } + else None + } + } + + def read(dir: Path, cache: XML.Cache): Option[Entry] = { + val path = dir + Path.basic(theory) + Path.explode(name) + if (path.is_file) { + val executable = File.is_executable(path) + val uncompressed = Bytes.read(path) + val body = Future.value((false, uncompressed)) + Some(Entry(this, executable, body, cache)) + } + else None + } } def read_theory_names(db: SQL.Database, session_name: String): List[String] = { val select = Data.table.select(List(Data.theory_name), Data.where_equal(session_name), distinct = true) db.using_statement(select)(stmt => stmt.execute_query().iterator(_.string(Data.theory_name)).toList) } - def read_theory_exports(db: SQL.Database, session_name: String): List[(String, String)] = { + def read_entry_names(db: SQL.Database, session_name: String): List[Entry_Name] = { val select = Data.table.select(List(Data.theory_name, Data.name), Data.where_equal(session_name)) + " ORDER BY " + Data.theory_name + ", " + Data.name db.using_statement(select)(stmt => stmt.execute_query().iterator(res => - (res.string(Data.theory_name), res.string(Data.name))).toList) + Entry_Name(session = session_name, + theory = res.string(Data.theory_name), + name = res.string(Data.name))).toList) } def message(msg: String, theory_name: String, name: String): String = msg + " " + quote(name) + " for theory " + quote(theory_name) - def compound_name(a: String, b: String): String = - if (a.isEmpty) b else a + ":" + b - def empty_entry(theory_name: String, name: String): Entry = - Entry("", theory_name, name, false, Future.value(false, Bytes.empty), XML.Cache.none) + Entry(Entry_Name(theory = theory_name, name = name), + false, Future.value(false, Bytes.empty), XML.Cache.none) sealed case class Entry( - session_name: String, - theory_name: String, - name: String, + entry_name: Entry_Name, executable: Boolean, body: Future[(Boolean, Bytes)], cache: XML.Cache ) { + def session_name: String = entry_name.session + def theory_name: String = entry_name.theory + def name: String = entry_name.name override def toString: String = name - def compound_name: String = Export.compound_name(theory_name, name) + def compound_name: String = entry_name.compound_name def name_has_prefix(s: String): Boolean = name.startsWith(s) val name_elems: List[String] = explode_name(name) def name_extends(elems: List[String]): Boolean = name_elems.startsWith(elems) && name_elems != elems def text: String = 
uncompressed.text def uncompressed: Bytes = { val (compressed, bytes) = body.join if (compressed) bytes.uncompress(cache = cache.xz) else bytes } def uncompressed_yxml: XML.Body = YXML.parse_body(UTF8.decode_permissive(uncompressed), cache = cache) def write(db: SQL.Database): Unit = { val (compressed, bytes) = body.join db.using_statement(Data.table.insert()) { stmt => stmt.string(1) = session_name stmt.string(2) = theory_name stmt.string(3) = name stmt.bool(4) = executable stmt.bool(5) = compressed stmt.bytes(6) = bytes stmt.execute() } } } def make_regex(pattern: String): Regex = { @tailrec def make(result: List[String], depth: Int, chs: List[Char]): Regex = chs match { case '*' :: '*' :: rest => make("[^:]*" :: result, depth, rest) case '*' :: rest => make("[^:/]*" :: result, depth, rest) case '?' :: rest => make("[^:/]" :: result, depth, rest) case '\\' :: c :: rest => make(("\\" + c) :: result, depth, rest) case '{' :: rest => make("(" :: result, depth + 1, rest) case ',' :: rest if depth > 0 => make("|" :: result, depth, rest) case '}' :: rest if depth > 0 => make(")" :: result, depth - 1, rest) case c :: rest if ".+()".contains(c) => make(("\\" + c) :: result, depth, rest) case c :: rest => make(c.toString :: result, depth, rest) case Nil => result.reverse.mkString.r } make(Nil, 0, pattern.toList) } - def make_matcher(pats: List[String]): (String, String) => Boolean = { + def make_matcher(pats: List[String]): Entry_Name => Boolean = { val regs = pats.map(make_regex) - { - (theory_name: String, name: String) => - val s = compound_name(theory_name, name) - regs.exists(_.pattern.matcher(s).matches) - } + (entry_name: Entry_Name) => + regs.exists(_.pattern.matcher(entry_name.compound_name).matches) } def make_entry( session_name: String, args: Protocol.Export.Args, bytes: Bytes, cache: XML.Cache ): Entry = { val body = if (args.compress) Future.fork(bytes.maybe_compress(cache = cache.xz)) else Future.value((false, bytes)) - Entry(session_name, args.theory_name, args.name, args.executable, body, cache) - } - - def read_entry( - db: SQL.Database, - cache: XML.Cache, - session_name: String, - theory_name: String, - name: String - ): Option[Entry] = { - val select = - Data.table.select(List(Data.executable, Data.compressed, Data.body), - Data.where_equal(session_name, theory_name, name)) - db.using_statement(select) { stmt => - val res = stmt.execute_query() - if (res.next()) { - val executable = res.bool(Data.executable) - val compressed = res.bool(Data.compressed) - val bytes = res.bytes(Data.body) - val body = Future.value(compressed, bytes) - Some(Entry(session_name, theory_name, name, executable, body, cache)) - } - else None - } - } - - def read_entry( - dir: Path, - cache: XML.Cache, - session_name: String, - theory_name: String, - name: String - ): Option[Entry] = { - val path = dir + Path.basic(theory_name) + Path.explode(name) - if (path.is_file) { - val executable = File.is_executable(path) - val uncompressed = Bytes.read(path) - val body = Future.value((false, uncompressed)) - Some(Entry(session_name, theory_name, name, executable, body, cache)) - } - else None + val entry_name = Entry_Name(session = session_name, theory = args.theory_name, name = args.name) + Entry(entry_name, args.executable, body, cache) } /* database consumer thread */ def consumer(db: SQL.Database, cache: XML.Cache, progress: Progress = new Progress): Consumer = new Consumer(db, cache, progress) class Consumer private[Export](db: SQL.Database, cache: XML.Cache, progress: Progress) { private val errors = 
Synchronized[List[String]](Nil) private val consumer = Consumer_Thread.fork_bulk[(Entry, Boolean)](name = "export")( bulk = { case (entry, _) => entry.body.is_finished }, consume = { (args: List[(Entry, Boolean)]) => val results = db.transaction { for ((entry, strict) <- args) yield { if (progress.stopped) { entry.body.cancel() Exn.Res(()) } - else if (read_name(db, entry.session_name, entry.theory_name, entry.name)) { + else if (entry.entry_name.readable(db)) { if (strict) { val msg = message("Duplicate export", entry.theory_name, entry.name) errors.change(msg :: _) } Exn.Res(()) } else Exn.capture { entry.write(db) } } } (results, true) }) def apply(session_name: String, args: Protocol.Export.Args, body: Bytes): Unit = { if (!progress.stopped) { consumer.send(make_entry(session_name, args, body, cache) -> args.strict) } } def shutdown(close: Boolean = false): List[String] = { consumer.shutdown() if (close) db.close() errors.value.reverse ::: (if (progress.stopped) List("Export stopped") else Nil) } } /* abstract provider */ object Provider { def none: Provider = new Provider { def apply(export_name: String): Option[Entry] = None def focus(other_theory: String): Provider = this override def toString: String = "none" } def database_context( context: Sessions.Database_Context, session_hierarchy: List[String], theory_name: String): Provider = new Provider { def apply(export_name: String): Option[Entry] = context.read_export(session_hierarchy, theory_name, export_name) def focus(other_theory: String): Provider = this override def toString: String = context.toString } def database( db: SQL.Database, cache: XML.Cache, session_name: String, theory_name: String ) : Provider = { new Provider { def apply(export_name: String): Option[Entry] = - read_entry(db, cache, session_name, theory_name, export_name) + Entry_Name(session = session_name, theory = theory_name, name = export_name) + .read(db, cache) def focus(other_theory: String): Provider = if (other_theory == theory_name) this else Provider.database(db, cache, session_name, other_theory) override def toString: String = db.toString } } def snapshot(snapshot: Document.Snapshot): Provider = new Provider { def apply(export_name: String): Option[Entry] = snapshot.exports_map.get(export_name) def focus(other_theory: String): Provider = if (other_theory == snapshot.node_name.theory) this else { val node_name = snapshot.version.nodes.theory_name(other_theory) getOrElse error("Bad theory " + quote(other_theory)) Provider.snapshot(snapshot.state.snapshot(node_name)) } override def toString: String = snapshot.toString } def directory( dir: Path, cache: XML.Cache, session_name: String, theory_name: String ) : Provider = { new Provider { def apply(export_name: String): Option[Entry] = - read_entry(dir, cache, session_name, theory_name, export_name) + Entry_Name(session = session_name, theory = theory_name, name = export_name) + .read(dir, cache) def focus(other_theory: String): Provider = if (other_theory == theory_name) this else Provider.directory(dir, cache, session_name, other_theory) override def toString: String = dir.toString } } } trait Provider { def apply(export_name: String): Option[Entry] def uncompressed_yxml(export_name: String): XML.Body = apply(export_name) match { case Some(entry) => entry.uncompressed_yxml case None => Nil } def focus(other_theory: String): Provider } /* export to file-system */ def export_files( store: Sessions.Store, session_name: String, export_dir: Path, progress: Progress = new Progress, export_prune: Int = 0, 
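// NB (editorial, not in the changeset): export_prune is passed on to
// Entry_Name.make_path(prune = ...) above, which drops that many leading
// path elements (the theory name comes first).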
export_list: Boolean = false, export_patterns: List[String] = Nil ): Unit = { using(store.open_database(session_name)) { db => db.transaction { - val export_names = read_theory_exports(db, session_name) + val entry_names = read_entry_names(db, session_name) // list if (export_list) { - for ((theory_name, name) <- export_names) { - progress.echo(compound_name(theory_name, name)) - } + for (entry_name <- entry_names) progress.echo(entry_name.compound_name) } // export if (export_patterns.nonEmpty) { val matcher = make_matcher(export_patterns) for { - (theory_name, name) <- export_names if matcher(theory_name, name) - entry <- read_entry(db, store.cache, session_name, theory_name, name) + entry_name <- entry_names if matcher(entry_name) + entry <- entry_name.read(db, store.cache) } { - val elems = theory_name :: space_explode('/', name) - val path = - if (elems.length < export_prune + 1) { - error("Cannot prune path by " + export_prune + " element(s): " + Path.make(elems)) - } - else export_dir + Path.make(elems.drop(export_prune)) - + val path = export_dir + entry_name.make_path(prune = export_prune) progress.echo("export " + path + (if (entry.executable) " (executable)" else "")) Isabelle_System.make_directory(path.dir) val bytes = entry.uncompressed if (!path.is_file || Bytes.read(path) != bytes) Bytes.write(path, bytes) File.set_executable(path, entry.executable) } } } } } /* Isabelle tool wrapper */ val default_export_dir: Path = Path.explode("export") val isabelle_tool = Isabelle_Tool("export", "retrieve theory exports", Scala_Project.here, { args => /* arguments */ var export_dir = default_export_dir var dirs: List[Path] = Nil var export_list = false var no_build = false var options = Options.init() var export_prune = 0 var export_patterns: List[String] = Nil val getopts = Getopts(""" Usage: isabelle export [OPTIONS] SESSION Options are: -O DIR output directory for exported files (default: """ + default_export_dir + """) -d DIR include session directory -l list exports -n no build of session -o OPTION override Isabelle system OPTION (via NAME=VAL or NAME) -p NUM prune path of exported files by NUM elements -x PATTERN extract files matching pattern (e.g. "*:**" for all) List or export theory exports for SESSION: named blobs produced by isabelle build. Option -l or -x is required; option -x may be repeated. The PATTERN language resembles glob patterns in the shell, with ? and * (both excluding ":" and "/"), ** (excluding ":"), and [abc] or [^abc], and variants {pattern1,pattern2,pattern3}. 
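
  Illustrative examples (added for exposition): "*:**" matches all exports,
  "HOL.List:**" every export of theory HOL.List, and "*:document/*" the
  top-level "document/" exports of any theory.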
""", "O:" -> (arg => export_dir = Path.explode(arg)), "d:" -> (arg => dirs = dirs ::: List(Path.explode(arg))), "l" -> (_ => export_list = true), "n" -> (_ => no_build = true), "o:" -> (arg => options = options + arg), "p:" -> (arg => export_prune = Value.Int.parse(arg)), "x:" -> (arg => export_patterns ::= arg)) val more_args = getopts(args) val session_name = more_args match { case List(session_name) if export_list || export_patterns.nonEmpty => session_name case _ => getopts.usage() } val progress = new Console_Progress() /* build */ if (!no_build) { val rc = progress.interrupt_handler { Build.build_logic(options, session_name, progress = progress, dirs = dirs) } if (rc != Process_Result.RC.ok) sys.exit(rc) } /* export files */ val store = Sessions.store(options) export_files(store, session_name, export_dir, progress = progress, export_prune = export_prune, export_list = export_list, export_patterns = export_patterns) }) } diff --git a/src/Pure/Thy/sessions.ML b/src/Pure/Thy/sessions.ML --- a/src/Pure/Thy/sessions.ML +++ b/src/Pure/Thy/sessions.ML @@ -1,130 +1,134 @@ (* Title: Pure/Thy/sessions.ML Author: Makarius Support for session ROOT syntax. *) signature SESSIONS = sig val root_name: string val theory_name: string val command_parser: (Toplevel.transition -> Toplevel.transition) parser end; structure Sessions: SESSIONS = struct val root_name = "ROOT"; val theory_name = "Pure.Sessions"; local val theory_entry = Parse.input Parse.theory_name --| Parse.opt_keyword "global"; val theories = Parse.$$$ "theories" |-- Parse.!!! (Scan.optional Parse.options [] -- Scan.repeat1 theory_entry); val in_path = Parse.$$$ "(" |-- Parse.!!! (Parse.$$$ "in" |-- Parse.path_input --| Parse.$$$ ")"); val document_theories = Parse.$$$ "document_theories" |-- Scan.repeat1 (Parse.input Parse.theory_name); val document_files = Parse.$$$ "document_files" |-- Parse.!!! (Scan.optional in_path (Input.string "document") -- Scan.repeat1 Parse.path_input); val prune = Scan.optional (Parse.$$$ "[" |-- Parse.!!! (Parse.nat --| Parse.$$$ "]")) 0; val export_files = Parse.$$$ "export_files" |-- Parse.!!! (Scan.optional in_path (Input.string "export") -- prune -- Scan.repeat1 Parse.embedded); +val export_classpath = + Parse.$$$ "export_classpath" |-- Scan.repeat Parse.embedded; + fun path_source source path = Input.source (Input.is_delimited source) (Path.implode path) (Input.range_of source); in val command_parser = Parse.session_name -- Scan.optional (Parse.$$$ "(" |-- Parse.!!! (Scan.repeat1 Parse.name --| Parse.$$$ ")")) [] -- Scan.optional (Parse.$$$ "in" |-- Parse.!!! Parse.path_input) (Input.string ".") -- (Parse.$$$ "=" |-- Parse.!!! (Scan.option (Parse.session_name --| Parse.!!! (Parse.$$$ "+")) -- Scan.optional (Parse.$$$ "description" |-- Parse.!!! (Parse.input Parse.embedded)) Input.empty -- Scan.optional (Parse.$$$ "options" |-- Parse.!!! Parse.options) [] -- Scan.optional (Parse.$$$ "sessions" |-- Parse.!!! (Scan.repeat1 Parse.session_name)) [] -- Scan.optional (Parse.$$$ "directories" |-- Parse.!!! 
(Scan.repeat1 Parse.path_input)) [] -- Scan.repeat theories -- Scan.optional document_theories [] -- Scan.repeat document_files -- - Scan.repeat export_files)) + Scan.repeat export_files -- + Scan.optional export_classpath [])) >> (fn (((((session, _), _), dir), - (((((((((parent, descr), options), sessions), directories), theories), - document_theories), document_files), export_files)))) => + ((((((((((parent, descr), options), sessions), directories), theories), + document_theories), document_files), export_files), _)))) => Toplevel.keep (fn state => let val ctxt = Toplevel.context_of state; val session_dir = Resources.check_dir ctxt NONE dir; val _ = (the_list parent @ sessions) |> List.app (fn arg => ignore (Resources.check_session ctxt arg) handle ERROR msg => Output.error_message msg); val _ = Context_Position.report ctxt (Position.range_position (Symbol_Pos.range (Input.source_explode descr))) Markup.comment; val _ = (options @ maps #1 theories) |> List.app (fn (x, y) => ignore (Completion.check_option_value ctxt x y (Options.default ())) handle ERROR msg => Output.error_message msg); fun check_thy source = ignore (Resources.check_file ctxt (SOME Path.current) source) handle ERROR msg => Output.error_message msg; val _ = maps #2 theories |> List.app (fn source => let val s = Input.string_of source; val pos = Input.pos_of source; val {node_name, theory_name, ...} = Resources.import_name session session_dir s handle ERROR msg => error (msg ^ Position.here pos); val thy_path = the_default node_name (Resources.find_theory_file theory_name); in check_thy (path_source source thy_path) end); val _ = directories |> List.app (ignore o Resources.check_dir ctxt (SOME session_dir)); val _ = document_theories |> List.app (fn source => let val thy = Input.string_of source; val pos = Input.pos_of source; in (case Resources.find_theory_file thy of NONE => Output.error_message ("Unknown theory " ^ quote thy ^ Position.here pos) | SOME path => check_thy (path_source source path)) end); val _ = document_files |> List.app (fn (doc_dir, doc_files) => let val dir = Resources.check_dir ctxt (SOME session_dir) doc_dir; val _ = List.app (ignore o Resources.check_file ctxt (SOME dir)) doc_files; in () end); val _ = export_files |> List.app (fn ((export_dir, _), _) => ignore (Resources.check_path ctxt (SOME session_dir) export_dir)); in () end)); end; end; diff --git a/src/Pure/Thy/sessions.scala b/src/Pure/Thy/sessions.scala --- a/src/Pure/Thy/sessions.scala +++ b/src/Pure/Thy/sessions.scala @@ -1,1488 +1,1521 @@ /* Title: Pure/Thy/sessions.scala Author: Makarius Cumulative session information. 
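Editorial gloss (not part of the original header): this changeset threads the
new export_classpath session parameter through Session_Entry and Info, to be
consulted via Database_Context.get_classpath when building documents.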
*/ package isabelle import java.io.{File => JFile} import java.nio.ByteBuffer import java.nio.channels.FileChannel import java.nio.file.StandardOpenOption import scala.collection.immutable.{SortedSet, SortedMap} import scala.collection.mutable object Sessions { /* session and theory names */ val ROOTS: Path = Path.explode("ROOTS") val ROOT: Path = Path.explode("ROOT") val roots_name: String = "ROOTS" val root_name: String = "ROOT" val theory_import: String = "Pure.Sessions" val UNSORTED = "Unsorted" val DRAFT = "Draft" def is_pure(name: String): Boolean = name == Thy_Header.PURE def exclude_session(name: String): Boolean = name == "" || name == DRAFT def exclude_theory(name: String): Boolean = name == root_name || name == "README" || name == "index" || name == "bib" /* ROOTS file format */ class File_Format extends isabelle.File_Format { val format_name: String = roots_name val file_ext = "" override def detect(name: String): Boolean = Thy_Header.split_file_name(name) match { case Some((_, file_name)) => file_name == roots_name case None => false } override def theory_suffix: String = "ROOTS_file" override def theory_content(name: String): String = """theory "ROOTS" imports Pure begin ROOTS_file """ + Outer_Syntax.quote_string(name) + """ end""" } /* base info and source dependencies */ sealed case class Base( pos: Position.T = Position.none, session_directories: Map[JFile, String] = Map.empty, global_theories: Map[String, String] = Map.empty, session_theories: List[Document.Node.Name] = Nil, document_theories: List[Document.Node.Name] = Nil, loaded_theories: Graph[String, Outer_Syntax] = Graph.string, used_theories: List[(Document.Node.Name, Options)] = Nil, load_commands: Map[Document.Node.Name, List[Command_Span.Span]] = Map.empty, known_theories: Map[String, Document.Node.Entry] = Map.empty, known_loaded_files: Map[String, List[Path]] = Map.empty, overall_syntax: Outer_Syntax = Outer_Syntax.empty, imported_sources: List[(Path, SHA1.Digest)] = Nil, sources: List[(Path, SHA1.Digest)] = Nil, session_graph_display: Graph_Display.Graph = Graph_Display.empty_graph, errors: List[String] = Nil ) { override def toString: String = "Sessions.Base(loaded_theories = " + loaded_theories.size + ", used_theories = " + used_theories.length + ")" def theory_qualifier(name: String): String = global_theories.getOrElse(name, Long_Name.qualifier(name)) def theory_qualifier(name: Document.Node.Name): String = theory_qualifier(name.theory) def loaded_theory(name: String): Boolean = loaded_theories.defined(name) def loaded_theory(name: Document.Node.Name): Boolean = loaded_theory(name.theory) def loaded_theory_syntax(name: String): Option[Outer_Syntax] = if (loaded_theory(name)) Some(loaded_theories.get_node(name)) else None def loaded_theory_syntax(name: Document.Node.Name): Option[Outer_Syntax] = loaded_theory_syntax(name.theory) def theory_syntax(name: Document.Node.Name): Outer_Syntax = loaded_theory_syntax(name) getOrElse overall_syntax def node_syntax(nodes: Document.Nodes, name: Document.Node.Name): Outer_Syntax = nodes(name).syntax orElse loaded_theory_syntax(name) getOrElse overall_syntax } sealed case class Deps(sessions_structure: Structure, session_bases: Map[String, Base]) { override def toString: String = "Sessions.Deps(" + sessions_structure + ")" def is_empty: Boolean = session_bases.isEmpty def apply(name: String): Base = session_bases(name) def get(name: String): Option[Base] = session_bases.get(name) def imported_sources(name: String): List[SHA1.Digest] = 
session_bases(name).imported_sources.map(_._2) def sources(name: String): List[SHA1.Digest] = session_bases(name).sources.map(_._2) def errors: List[String] = (for { (name, base) <- session_bases.iterator if base.errors.nonEmpty } yield cat_lines(base.errors) + "\nThe error(s) above occurred in session " + quote(name) + Position.here(base.pos) ).toList def check_errors: Deps = errors match { case Nil => this case errs => error(cat_lines(errs)) } } def deps(sessions_structure: Structure, progress: Progress = new Progress, inlined_files: Boolean = false, verbose: Boolean = false, list_files: Boolean = false, check_keywords: Set[String] = Set.empty ): Deps = { var cache_sources = Map.empty[JFile, SHA1.Digest] def check_sources(paths: List[Path]): List[(Path, SHA1.Digest)] = { for { path <- paths file = path.file if cache_sources.isDefinedAt(file) || file.isFile } yield { cache_sources.get(file) match { case Some(digest) => (path, digest) case None => val digest = SHA1.digest(file) cache_sources = cache_sources + (file -> digest) (path, digest) } } } val session_bases = sessions_structure.imports_topological_order.foldLeft(Map("" -> sessions_structure.bootstrap)) { case (session_bases, session_name) => progress.expose_interrupt() val info = sessions_structure(session_name) try { val deps_base = info.deps_base(session_bases) val resources = new Resources(sessions_structure, deps_base) if (verbose || list_files) { val groups = if (info.groups.isEmpty) "" else info.groups.mkString(" (", " ", ")") progress.echo("Session " + info.chapter_session + groups) } val dependencies = resources.session_dependencies(info) val overall_syntax = dependencies.overall_syntax val session_theories = dependencies.theories.filter(name => deps_base.theory_qualifier(name) == session_name) val theory_files = dependencies.theories.map(_.path) dependencies.load_commands val (load_commands, load_commands_errors) = try { if (inlined_files) (dependencies.load_commands, Nil) else (Nil, Nil) } catch { case ERROR(msg) => (Nil, List(msg)) } val loaded_files = load_commands.map({ case (name, spans) => dependencies.loaded_files(name, spans) }) val session_files = (theory_files ::: loaded_files.flatMap(_._2) ::: info.document_files.map(file => info.dir + file._1 + file._2)).map(_.expand) val imported_files = if (inlined_files) dependencies.imported_files else Nil if (list_files) progress.echo(cat_lines(session_files.map(_.implode).sorted.map(" " + _))) if (check_keywords.nonEmpty) { Check_Keywords.check_keywords( progress, overall_syntax.keywords, check_keywords, theory_files) } val session_graph_display: Graph_Display.Graph = { def session_node(name: String): Graph_Display.Node = Graph_Display.Node("[" + name + "]", "session." + name) def node(name: Document.Node.Name): Graph_Display.Node = { val qualifier = deps_base.theory_qualifier(name) if (qualifier == info.name) Graph_Display.Node(name.theory_base_name, "theory." 
+ name.theory) else session_node(qualifier) } val required_sessions = dependencies.loaded_theories.all_preds(dependencies.theories.map(_.theory)) .map(theory => deps_base.theory_qualifier(theory)) .filter(name => name != info.name && sessions_structure.defined(name)) val required_subgraph = sessions_structure.imports_graph .restrict(sessions_structure.imports_graph.all_preds(required_sessions).toSet) .transitive_closure .restrict(required_sessions.toSet) .transitive_reduction_acyclic val graph0 = required_subgraph.topological_order.foldLeft(Graph_Display.empty_graph) { case (g, session) => val a = session_node(session) val bs = required_subgraph.imm_preds(session).toList.map(session_node) bs.foldLeft((a :: bs).foldLeft(g)(_.default_node(_, Nil)))(_.add_edge(_, a)) } dependencies.entries.foldLeft(graph0) { case (g, entry) => val a = node(entry.name) val bs = entry.header.imports.map(node).filterNot(_ == a) bs.foldLeft((a :: bs).foldLeft(g)(_.default_node(_, Nil)))(_.add_edge(_, a)) } } val known_theories = dependencies.entries.iterator.map(entry => entry.name.theory -> entry). foldLeft(deps_base.known_theories)(_ + _) val known_loaded_files = deps_base.known_loaded_files ++ loaded_files val import_errors = { val known_sessions = sessions_structure.imports_requirements(List(session_name)).toSet for { name <- dependencies.theories qualifier = deps_base.theory_qualifier(name) if !known_sessions(qualifier) } yield "Bad import of theory " + quote(name.toString) + ": need to include sessions " + quote(qualifier) + " in ROOT" } val document_errors = info.document_theories.flatMap( { case (thy, pos) => val build_hierarchy = if (sessions_structure.build_graph.defined(session_name)) { sessions_structure.build_hierarchy(session_name) } else Nil def err(msg: String): Option[String] = Some(msg + " " + quote(thy) + Position.here(pos)) known_theories.get(thy).map(_.name) match { case None => err("Unknown document theory") case Some(name) => val qualifier = deps_base.theory_qualifier(name) if (session_theories.contains(name)) { err("Redundant document theory from this session:") } else if (build_hierarchy.contains(qualifier)) None else if (dependencies.theories.contains(name)) None else err("Document theory from other session not imported properly:") } }) val document_theories = info.document_theories.map({ case (thy, _) => known_theories(thy).name }) val dir_errors = { val ok = info.dirs.map(_.canonical_file).toSet val bad = (for { name <- session_theories.iterator path = name.master_dir_path if !ok(path.canonical_file) path1 = File.relative_path(info.dir.canonical, path).getOrElse(path) } yield (path1, name)).toList val bad_dirs = (for { (path1, _) <- bad } yield path1.toString).distinct.sorted val errs1 = for { (path1, name) <- bad } yield "Implicit use of directory " + path1 + " for theory " + quote(name.toString) val errs2 = if (bad_dirs.isEmpty) Nil else List("Implicit use of session directories: " + commas(bad_dirs)) val errs3 = for (p <- info.dirs if !p.is_dir) yield "No such directory: " + p val errs4 = (for { name <- session_theories.iterator name1 <- resources.find_theory_node(name.theory) if name.node != name1.node } yield "Incoherent theory file import:\n " + name.path + " vs. 
\n " + name1.path) .toList errs1 ::: errs2 ::: errs3 ::: errs4 } val sources_errors = for (p <- session_files if !p.is_file) yield "No such file: " + p val path_errors = try { Path.check_case_insensitive(session_files ::: imported_files); Nil } catch { case ERROR(msg) => List(msg) } val bibtex_errors = try { info.bibtex_entries; Nil } catch { case ERROR(msg) => List(msg) } val base = Base( pos = info.pos, session_directories = sessions_structure.session_directories, global_theories = sessions_structure.global_theories, session_theories = session_theories, document_theories = document_theories, loaded_theories = dependencies.loaded_theories, used_theories = dependencies.theories_adjunct, load_commands = load_commands.toMap, known_theories = known_theories, known_loaded_files = known_loaded_files, overall_syntax = overall_syntax, imported_sources = check_sources(imported_files), sources = check_sources(session_files), session_graph_display = session_graph_display, errors = dependencies.errors ::: load_commands_errors ::: import_errors ::: document_errors ::: dir_errors ::: sources_errors ::: path_errors ::: bibtex_errors) session_bases + (info.name -> base) } catch { case ERROR(msg) => cat_error(msg, "The error(s) above occurred in session " + quote(info.name) + Position.here(info.pos)) } } Deps(sessions_structure, session_bases) } /* base info */ sealed case class Base_Info( session: String, sessions_structure: Structure, errors: List[String], base: Base, infos: List[Info] ) { def check: Base_Info = if (errors.isEmpty) this else error(cat_lines(errors)) } def base_info(options: Options, session: String, progress: Progress = new Progress, dirs: List[Path] = Nil, include_sessions: List[String] = Nil, session_ancestor: Option[String] = None, session_requirements: Boolean = false ): Base_Info = { val full_sessions = load_structure(options, dirs = dirs) val selected_sessions = full_sessions.selection(Selection(sessions = session :: session_ancestor.toList)) val info = selected_sessions(session) val ancestor = session_ancestor orElse info.parent val (session1, infos1) = if (session_requirements && ancestor.isDefined) { val deps = Sessions.deps(selected_sessions, progress = progress) val base = deps(session) val ancestor_loaded = deps.get(ancestor.get) match { case Some(ancestor_base) if !selected_sessions.imports_requirements(List(ancestor.get)).contains(session) => ancestor_base.loaded_theories.defined _ case _ => error("Bad ancestor " + quote(ancestor.get) + " for session " + quote(session)) } val required_theories = for { thy <- base.loaded_theories.keys if !ancestor_loaded(thy) && base.theory_qualifier(thy) != session } yield thy if (required_theories.isEmpty) (ancestor.get, Nil) else { val other_name = info.name + "_requirements(" + ancestor.get + ")" Isabelle_System.isabelle_tmp_prefix() (other_name, List( make_info(info.options, dir_selected = false, dir = Path.explode("$ISABELLE_TMP_PREFIX"), chapter = info.chapter, Session_Entry( pos = info.pos, name = other_name, groups = info.groups, path = ".", parent = ancestor, description = "Required theory imports from other sessions", options = Nil, imports = info.deps, directories = Nil, theories = List((Nil, required_theories.map(thy => ((thy, Position.none), false)))), document_theories = Nil, document_files = Nil, - export_files = Nil)))) + export_files = Nil, + export_classpath = Nil)))) } } else (session, Nil) val full_sessions1 = if (infos1.isEmpty) full_sessions else load_structure(options, dirs = dirs, infos = infos1) val 
selected_sessions1 = full_sessions1.selection(Selection(sessions = session1 :: session :: include_sessions)) val deps1 = Sessions.deps(selected_sessions1, progress = progress) Base_Info(session1, full_sessions1, deps1.errors, deps1(session1), infos1) } /* cumulative session info */ sealed case class Info( name: String, chapter: String, dir_selected: Boolean, pos: Position.T, groups: List[String], dir: Path, parent: Option[String], description: String, directories: List[Path], options: Options, imports: List[String], theories: List[(Options, List[(String, Position.T)])], global_theories: List[String], document_theories: List[(String, Position.T)], document_files: List[(Path, Path)], export_files: List[(Path, Int, List[String])], + export_classpath: List[String], meta_digest: SHA1.Digest ) { def chapter_session: String = chapter + "/" + name def relative_path(info1: Info): String = if (name == info1.name) "" else if (chapter == info1.chapter) "../" + info1.name + "/" else "../../" + info1.chapter_session + "/" def deps: List[String] = parent.toList ::: imports def deps_base(session_bases: String => Base): Base = { val parent_base = session_bases(parent.getOrElse("")) val imports_bases = imports.map(session_bases) parent_base.copy( known_theories = (for { base <- imports_bases.iterator (_, entry) <- base.known_theories.iterator } yield (entry.name.theory -> entry)).foldLeft(parent_base.known_theories)(_ + _), known_loaded_files = imports_bases.iterator.map(_.known_loaded_files). foldLeft(parent_base.known_loaded_files)(_ ++ _)) } def dirs: List[Path] = dir :: directories def timeout_ignored: Boolean = !options.bool("timeout_build") || Time.seconds(options.real("timeout")) < Time.ms(1) def timeout: Time = Time.seconds(options.real("timeout") * options.real("timeout_scale")) def document_enabled: Boolean = options.string("document") match { case "" | "false" => false case "pdf" | "true" => true case doc => error("Bad document specification " + quote(doc)) } def document_variants: List[Document_Build.Document_Variant] = { val variants = Library.space_explode(':', options.string("document_variants")). 
map(Document_Build.Document_Variant.parse) val dups = Library.duplicates(variants.map(_.name)) if (dups.nonEmpty) error("Duplicate document variants: " + commas_quote(dups)) variants } def documents: List[Document_Build.Document_Variant] = { val variants = document_variants if (!document_enabled || document_files.isEmpty) Nil else variants } def document_output: Option[Path] = options.string("document_output") match { case "" => None case s => Some(dir + Path.explode(s)) } def browser_info: Boolean = options.bool("browser_info") lazy val bibtex_entries: List[Text.Info[String]] = (for { (document_dir, file) <- document_files.iterator if Bibtex.is_bibtex(file.file_name) info <- Bibtex.entries(File.read(dir + document_dir + file)).iterator } yield info).toList def record_proofs: Boolean = options.int("record_proofs") >= 2 def is_afp: Boolean = chapter == AFP.chapter def is_afp_bulky: Boolean = is_afp && groups.exists(AFP.groups_bulky.contains) } def make_info( options: Options, dir_selected: Boolean, dir: Path, chapter: String, entry: Session_Entry ): Info = { try { val name = entry.name if (exclude_session(name)) error("Bad session name") if (is_pure(name) && entry.parent.isDefined) error("Illegal parent session") if (!is_pure(name) && !entry.parent.isDefined) error("Missing parent session") val session_path = dir + Path.explode(entry.path) val directories = entry.directories.map(dir => session_path + Path.explode(dir)) val session_options = options ++ entry.options val theories = entry.theories.map({ case (opts, thys) => (session_options ++ opts, thys.map({ case ((thy, pos), _) => if (exclude_theory(thy)) error("Bad theory name " + quote(thy) + Position.here(pos)) else (thy, pos) })) }) val global_theories = for { (_, thys) <- entry.theories; ((thy, pos), global) <- thys if global } yield { val thy_name = Path.explode(thy).file_name if (Long_Name.is_qualified(thy_name)) error("Bad qualified name for global theory " + quote(thy_name) + Position.here(pos)) else thy_name } val conditions = theories.flatMap(thys => space_explode(',', thys._1.string("condition"))).distinct.sorted. 
map(x => (x, Isabelle_System.getenv(x) != "")) val document_files = entry.document_files.map({ case (s1, s2) => (Path.explode(s1), Path.explode(s2)) }) val export_files = entry.export_files.map({ case (dir, prune, pats) => (Path.explode(dir), prune, pats) }) val meta_digest = SHA1.digest( (name, chapter, entry.parent, entry.directories, entry.options, entry.imports, entry.theories_no_position, conditions, entry.document_theories_no_position, entry.document_files) .toString) Info(name, chapter, dir_selected, entry.pos, entry.groups, session_path, entry.parent, entry.description, directories, session_options, entry.imports, theories, global_theories, entry.document_theories, document_files, - export_files, meta_digest) + export_files, entry.export_classpath, meta_digest) } catch { case ERROR(msg) => error(msg + "\nThe error(s) above occurred in session entry " + quote(entry.name) + Position.here(entry.pos)) } } object Selection { val empty: Selection = Selection() val all: Selection = Selection(all_sessions = true) def session(session: String): Selection = Selection(sessions = List(session)) } sealed case class Selection( requirements: Boolean = false, all_sessions: Boolean = false, base_sessions: List[String] = Nil, exclude_session_groups: List[String] = Nil, exclude_sessions: List[String] = Nil, session_groups: List[String] = Nil, sessions: List[String] = Nil ) { def ++ (other: Selection): Selection = Selection( requirements = requirements || other.requirements, all_sessions = all_sessions || other.all_sessions, base_sessions = Library.merge(base_sessions, other.base_sessions), exclude_session_groups = Library.merge(exclude_session_groups, other.exclude_session_groups), exclude_sessions = Library.merge(exclude_sessions, other.exclude_sessions), session_groups = Library.merge(session_groups, other.session_groups), sessions = Library.merge(sessions, other.sessions)) } object Structure { val empty: Structure = make(Nil) def make(infos: List[Info]): Structure = { def add_edges( graph: Graph[String, Info], kind: String, edges: Info => Iterable[String] ) : Graph[String, Info] = { def add_edge(pos: Position.T, name: String, g: Graph[String, Info], parent: String) = { if (!g.defined(parent)) error("Bad " + kind + " session " + quote(parent) + " for " + quote(name) + Position.here(pos)) try { g.add_edge_acyclic(parent, name) } catch { case exn: Graph.Cycles[_] => error(cat_lines(exn.cycles.map(cycle => "Cyclic session dependency of " + cycle.map(c => quote(c.toString)).mkString(" via "))) + Position.here(pos)) } } graph.iterator.foldLeft(graph) { case (g, (name, (info, _))) => edges(info).foldLeft(g)(add_edge(info.pos, name, _, _)) } } val info_graph = infos.foldLeft(Graph.string[Info]) { case (graph, info) => if (graph.defined(info.name)) error("Duplicate session " + quote(info.name) + Position.here(info.pos) + Position.here(graph.get_node(info.name).pos)) else graph.new_node(info.name, info) } val build_graph = add_edges(info_graph, "parent", _.parent) val imports_graph = add_edges(build_graph, "imports", _.imports) val session_positions: List[(String, Position.T)] = (for ((name, (info, _)) <- info_graph.iterator) yield (name, info.pos)).toList val session_directories: Map[JFile, String] = (for { session <- imports_graph.topological_order.iterator info = info_graph.get_node(session) dir <- info.dirs.iterator } yield (info, dir)).foldLeft(Map.empty[JFile, String]) { case (dirs, (info, dir)) => val session = info.name val canonical_dir = dir.canonical_file dirs.get(canonical_dir) match { case 
Some(session1) => val info1 = info_graph.get_node(session1) error("Duplicate use of directory " + dir + "\n for session " + quote(session1) + Position.here(info1.pos) + "\n vs. session " + quote(session) + Position.here(info.pos)) case None => dirs + (canonical_dir -> session) } } val global_theories: Map[String, String] = (for { session <- imports_graph.topological_order.iterator info = info_graph.get_node(session) thy <- info.global_theories.iterator } yield (info, thy)).foldLeft(Thy_Header.bootstrap_global_theories.toMap) { case (global, (info, thy)) => val qualifier = info.name global.get(thy) match { case Some(qualifier1) if qualifier != qualifier1 => error("Duplicate global theory " + quote(thy) + Position.here(info.pos)) case _ => global + (thy -> qualifier) } } new Structure( session_positions, session_directories, global_theories, build_graph, imports_graph) } } final class Structure private[Sessions]( val session_positions: List[(String, Position.T)], val session_directories: Map[JFile, String], val global_theories: Map[String, String], val build_graph: Graph[String, Info], val imports_graph: Graph[String, Info] ) { sessions_structure => def bootstrap: Base = Base( session_directories = session_directories, global_theories = global_theories, overall_syntax = Thy_Header.bootstrap_syntax) def dest_session_directories: List[(String, String)] = for ((file, session) <- session_directories.toList) yield (File.standard_path(file), session) lazy val chapters: SortedMap[String, List[Info]] = build_graph.iterator.foldLeft(SortedMap.empty[String, List[Info]]) { case (chs, (_, (info, _))) => chs + (info.chapter -> (info :: chs.getOrElse(info.chapter, Nil))) } def build_graph_display: Graph_Display.Graph = Graph_Display.make_graph(build_graph) def imports_graph_display: Graph_Display.Graph = Graph_Display.make_graph(imports_graph) def defined(name: String): Boolean = imports_graph.defined(name) def apply(name: String): Info = imports_graph.get_node(name) def get(name: String): Option[Info] = if (defined(name)) Some(apply(name)) else None def theory_qualifier(name: String): String = global_theories.getOrElse(name, Long_Name.qualifier(name)) def check_sessions(names: List[String]): Unit = { val bad_sessions = SortedSet(names.filterNot(defined): _*).toList if (bad_sessions.nonEmpty) error("Undefined session(s): " + commas_quote(bad_sessions)) } def check_sessions(sel: Selection): Unit = check_sessions(sel.base_sessions ::: sel.exclude_sessions ::: sel.sessions) private def selected(graph: Graph[String, Info], sel: Selection): List[String] = { check_sessions(sel) val select_group = sel.session_groups.toSet val select_session = sel.sessions.toSet ++ imports_graph.all_succs(sel.base_sessions) val selected0 = if (sel.all_sessions) graph.keys else { (for { (name, (info, _)) <- graph.iterator if info.dir_selected || select_session(name) || info.groups.exists(select_group) } yield name).toList } if (sel.requirements) (graph.all_preds(selected0).toSet -- selected0).toList else selected0 } def selection(sel: Selection): Structure = { check_sessions(sel) val excluded = { val exclude_group = sel.exclude_session_groups.toSet val exclude_group_sessions = (for { (name, (info, _)) <- imports_graph.iterator if imports_graph.get_node(name).groups.exists(exclude_group) } yield name).toList imports_graph.all_succs(exclude_group_sessions ::: sel.exclude_sessions).toSet } def restrict(graph: Graph[String, Info]): Graph[String, Info] = { val sessions = graph.all_preds(selected(graph, sel)).filterNot(excluded) 
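      /* Note (editorial, derived from the code above and below): exclusion
         closes downward over the imports graph (all_succs of the excluded
         sessions and groups), while the restriction closes upward over
         predecessors; a session survives only if it is required by the
         selection and is not covered by an exclusion. */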
graph.restrict(graph.all_preds(sessions).toSet)
      }
      new Structure(
        session_positions, session_directories, global_theories,
        restrict(build_graph), restrict(imports_graph))
    }

    def selection(session: String): Structure = selection(Selection.session(session))

    def selection_deps(
      selection: Selection,
      progress: Progress = new Progress,
      loading_sessions: Boolean = false,
      inlined_files: Boolean = false,
      verbose: Boolean = false
    ): Deps = {
      val deps =
        Sessions.deps(sessions_structure.selection(selection),
          progress = progress, inlined_files = inlined_files, verbose = verbose)
      if (loading_sessions) {
        val selection_size = deps.sessions_structure.build_graph.size
        if (selection_size > 1) progress.echo("Loading " + selection_size + " sessions ...")
      }
      deps
    }

    def build_selection(sel: Selection): List[String] = selected(build_graph, sel)
    def build_descendants(ss: List[String]): List[String] = build_graph.all_succs(ss)
    def build_requirements(ss: List[String]): List[String] = build_graph.all_preds_rev(ss)
    def build_topological_order: List[String] = build_graph.topological_order
    def build_hierarchy(session: String): List[String] = build_graph.all_preds(List(session))

    def imports_selection(sel: Selection): List[String] = selected(imports_graph, sel)
    def imports_descendants(ss: List[String]): List[String] = imports_graph.all_succs(ss)
    def imports_requirements(ss: List[String]): List[String] = imports_graph.all_preds_rev(ss)
    def imports_topological_order: List[String] = imports_graph.topological_order
    def imports_hierarchy(session: String): List[String] = imports_graph.all_preds(List(session))

    def bibtex_entries: List[(String, List[String])] =
      build_topological_order.flatMap(name =>
        apply(name).bibtex_entries match {
          case Nil => None
          case entries => Some(name -> entries.map(_.info))
        })

    override def toString: String =
      imports_graph.keys_iterator.mkString("Sessions.Structure(", ", ", ")")
  }


  /* parser */

  private val CHAPTER = "chapter"
  private val SESSION = "session"
  private val IN = "in"
  private val DESCRIPTION = "description"
  private val DIRECTORIES = "directories"
  private val OPTIONS = "options"
  private val SESSIONS = "sessions"
  private val THEORIES = "theories"
  private val GLOBAL = "global"
  private val DOCUMENT_THEORIES = "document_theories"
  private val DOCUMENT_FILES = "document_files"
  private val EXPORT_FILES = "export_files"
+ private val EXPORT_CLASSPATH = "export_classpath"

  val root_syntax: Outer_Syntax =
    Outer_Syntax.empty + "(" + ")" + "+" + "," + "=" + "[" + "]" +
      GLOBAL + IN +
      (CHAPTER, Keyword.THY_DECL) +
      (SESSION, Keyword.THY_DECL) +
      (DESCRIPTION, Keyword.QUASI_COMMAND) +
      (DIRECTORIES, Keyword.QUASI_COMMAND) +
      (OPTIONS, Keyword.QUASI_COMMAND) +
      (SESSIONS, Keyword.QUASI_COMMAND) +
      (THEORIES, Keyword.QUASI_COMMAND) +
      (DOCUMENT_THEORIES, Keyword.QUASI_COMMAND) +
      (DOCUMENT_FILES, Keyword.QUASI_COMMAND) +
-     (EXPORT_FILES, Keyword.QUASI_COMMAND)
+     (EXPORT_FILES, Keyword.QUASI_COMMAND) +
+     (EXPORT_CLASSPATH, Keyword.QUASI_COMMAND)

  abstract class Entry
  sealed case class Chapter(name: String) extends Entry

  sealed case class Session_Entry(
    pos: Position.T,
    name: String,
    groups: List[String],
    path: String,
    parent: Option[String],
    description: String,
    options: List[Options.Spec],
    imports: List[String],
    directories: List[String],
    theories: List[(List[Options.Spec], List[((String, Position.T), Boolean)])],
    document_theories: List[(String, Position.T)],
    document_files: List[(String, String)],
-   export_files: List[(String, Int, List[String])]
+   export_files: List[(String, Int, List[String])],
+   export_classpath: List[String]
  ) extends Entry {
    def theories_no_position: List[(List[Options.Spec], List[(String, Boolean)])] =
      theories.map({ case (a, b) => (a, b.map({ case ((c, _), d) => (c, d) })) })
    def document_theories_no_position: List[String] =
      document_theories.map(_._1)
  }

  private object Parsers extends Options.Parsers {
    private val chapter: Parser[Chapter] = {
      val chapter_name = atom("chapter name", _.is_name)
      command(CHAPTER) ~! chapter_name ^^ { case _ ~ a => Chapter(a) }
    }

    private val session_entry: Parser[Session_Entry] = {
      val option =
        option_name ~ opt($$$("=") ~! option_value ^^ { case _ ~ x => x }) ^^
          { case x ~ y => (x, y) }
      val options = $$$("[") ~> rep1sep(option, $$$(",")) <~ $$$("]")

      val theory_entry =
        position(theory_name) ~ opt_keyword(GLOBAL) ^^ { case x ~ y => (x, y) }

      val theories =
        $$$(THEORIES) ~! ((options | success(Nil)) ~ rep1(theory_entry)) ^^
          { case _ ~ (x ~ y) => (x, y) }

      val in_path = $$$("(") ~! ($$$(IN) ~ path ~ $$$(")")) ^^ { case _ ~ (_ ~ x ~ _) => x }

      val document_theories =
        $$$(DOCUMENT_THEORIES) ~! rep1(position(name)) ^^ { case _ ~ x => x }

      val document_files =
        $$$(DOCUMENT_FILES) ~! ((in_path | success("document")) ~ rep1(path)) ^^
          { case _ ~ (x ~ y) => y.map((x, _)) }

      val prune = $$$("[") ~! (nat ~ $$$("]")) ^^ { case _ ~ (x ~ _) => x } | success(0)

      val export_files =
        $$$(EXPORT_FILES) ~! ((in_path | success("export")) ~ prune ~ rep1(embedded)) ^^
          { case _ ~ (x ~ y ~ z) => (x, y, z) }

+     val export_classpath =
+       $$$(EXPORT_CLASSPATH) ~! (rep1(embedded) | success(List("*:classpath/*.jar"))) ^^
+         { case _ ~ x => x }
+
      command(SESSION) ~!
        (position(session_name) ~
          (($$$("(") ~! (rep1(name) <~ $$$(")")) ^^ { case _ ~ x => x }) | success(Nil)) ~
          (($$$(IN) ~! path ^^ { case _ ~ x => x }) | success(".")) ~
          ($$$("=") ~!
            (opt(session_name ~! $$$("+") ^^ { case x ~ _ => x }) ~
              (($$$(DESCRIPTION) ~! text ^^ { case _ ~ x => x }) | success("")) ~
              (($$$(OPTIONS) ~! options ^^ { case _ ~ x => x }) | success(Nil)) ~
              (($$$(SESSIONS) ~! rep1(session_name) ^^ { case _ ~ x => x }) | success(Nil)) ~
              (($$$(DIRECTORIES) ~! rep1(path) ^^ { case _ ~ x => x }) | success(Nil)) ~
              rep(theories) ~
              (opt(document_theories) ^^ (x => x.getOrElse(Nil))) ~
              (rep(document_files) ^^ (x => x.flatten)) ~
-             rep(export_files)))) ^^
-       { case _ ~ ((a, pos) ~ b ~ c ~ (_ ~ (d ~ e ~ f ~ g ~ h ~ i ~ j ~ k ~ l))) =>
-           Session_Entry(pos, a, b, c, d, e, f, g, h, i, j, k, l) }
+             rep(export_files) ~
+             opt(export_classpath)))) ^^
+       { case _ ~ ((a, pos) ~ b ~ c ~ (_ ~ (d ~ e ~ f ~ g ~ h ~ i ~ j ~ k ~ l ~ m))) =>
+           Session_Entry(pos, a, b, c, d, e, f, g, h, i, j, k, l, m.getOrElse(Nil)) }
    }

    def parse_root(path: Path): List[Entry] = {
      val toks = Token.explode(root_syntax.keywords, File.read(path))
      val start = Token.Pos.file(path.implode)
      parse_all(rep(chapter | session_entry), Token.reader(toks, start)) match {
        case Success(result, _) => result
        case bad => error(bad.toString)
      }
    }
  }

  def parse_root(path: Path): List[Entry] = Parsers.parse_root(path)

  def parse_root_entries(path: Path): List[Session_Entry] =
    for (entry <- Parsers.parse_root(path) if entry.isInstanceOf[Session_Entry])
      yield entry.asInstanceOf[Session_Entry]

  def read_root(options: Options, select: Boolean, path: Path): List[Info] = {
    var entry_chapter = UNSORTED
    val infos = new mutable.ListBuffer[Info]
    parse_root(path).foreach {
      case Chapter(name) => entry_chapter = name
      case entry: Session_Entry =>
        infos += make_info(options, select, path.dir, entry_chapter, entry)
    }
    infos.toList
  }

  def parse_roots(roots: Path): List[String] = {
    for {
      line <- split_lines(File.read(roots))
      if !(line == "" || line.startsWith("#"))
    } yield line
  }


  /* load sessions from certain directories */

  def is_session_dir(dir: Path): Boolean =
    (dir + ROOT).is_file || (dir + ROOTS).is_file

  def check_session_dir(dir: Path): Path =
    if (is_session_dir(dir)) File.pwd() + dir.expand
    else
      error("Bad session root directory (missing ROOT or ROOTS): " + dir.expand.toString)

  def directories(dirs: List[Path], select_dirs: List[Path]): List[(Boolean, Path)] = {
    val default_dirs = Components.directories().filter(is_session_dir)
    for {
      (select, dir) <- (default_dirs ::: dirs).map((false, _)) ::: select_dirs.map((true, _))
    } yield (select, dir.canonical)
  }

  def load_structure(
    options: Options,
    dirs: List[Path] = Nil,
    select_dirs: List[Path] = Nil,
    infos: List[Info] = Nil
  ): Structure = {
    def load_dir(select: Boolean, dir: Path): List[(Boolean, Path)] =
      load_root(select, dir) ::: load_roots(select, dir)

    def load_root(select: Boolean, dir: Path): List[(Boolean, Path)] = {
      val root = dir + ROOT
      if (root.is_file) List((select, root)) else Nil
    }

    def load_roots(select: Boolean, dir: Path): List[(Boolean, Path)] = {
      val roots = dir + ROOTS
      if (roots.is_file) {
        for {
          entry <- parse_roots(roots)
          dir1 =
            try { check_session_dir(dir + Path.explode(entry)) }
            catch {
              case ERROR(msg) =>
                error(msg + "\nThe error(s) above occurred in session catalog " + roots.toString)
            }
          res <- load_dir(select, dir1)
        } yield res
      }
      else Nil
    }

    val roots =
      for {
        (select, dir) <- directories(dirs, select_dirs)
        res <- load_dir(select, check_session_dir(dir))
      } yield res

    val unique_roots =
      roots.foldLeft(Map.empty[JFile, (Boolean, Path)]) {
        case (m, (select, path)) =>
          val file = path.canonical_file
          m.get(file) match {
            case None => m + (file -> (select, path))
            case Some((select1, path1)) => m + (file -> (select1 || select, path1))
          }
      }.toList.map(_._2)

    Structure.make(unique_roots.flatMap(p => read_root(options, p._1, p._2)) ::: infos)
  }


  /* Isabelle tool wrapper */

  val isabelle_tool = Isabelle_Tool("sessions", "explore structure of Isabelle sessions",
    Scala_Project.here, {
args => var base_sessions: List[String] = Nil var select_dirs: List[Path] = Nil var requirements = false var exclude_session_groups: List[String] = Nil var all_sessions = false var dirs: List[Path] = Nil var session_groups: List[String] = Nil var exclude_sessions: List[String] = Nil val getopts = Getopts(""" Usage: isabelle sessions [OPTIONS] [SESSIONS ...] Options are: -B NAME include session NAME and all descendants -D DIR include session directory and select its sessions -R refer to requirements of selected sessions -X NAME exclude sessions from group NAME and all descendants -a select all sessions -d DIR include session directory -g NAME select session group NAME -x NAME exclude session NAME and all descendants Explore the structure of Isabelle sessions and print result names in topological order (on stdout). """, "B:" -> (arg => base_sessions = base_sessions ::: List(arg)), "D:" -> (arg => select_dirs = select_dirs ::: List(Path.explode(arg))), "R" -> (_ => requirements = true), "X:" -> (arg => exclude_session_groups = exclude_session_groups ::: List(arg)), "a" -> (_ => all_sessions = true), "d:" -> (arg => dirs = dirs ::: List(Path.explode(arg))), "g:" -> (arg => session_groups = session_groups ::: List(arg)), "x:" -> (arg => exclude_sessions = exclude_sessions ::: List(arg))) val sessions = getopts(args) val options = Options.init() val selection = Selection(requirements = requirements, all_sessions = all_sessions, base_sessions = base_sessions, exclude_session_groups = exclude_session_groups, exclude_sessions = exclude_sessions, session_groups = session_groups, sessions = sessions) val sessions_structure = load_structure(options, dirs = dirs, select_dirs = select_dirs).selection(selection) for (name <- sessions_structure.imports_topological_order) { Output.writeln(name, stdout = true) } }) /** heap file with SHA1 digest **/ private val sha1_prefix = "SHA1:" def read_heap_digest(heap: Path): Option[String] = { if (heap.is_file) { using(FileChannel.open(heap.java_path, StandardOpenOption.READ)) { file => val len = file.size val n = sha1_prefix.length + SHA1.digest_length if (len >= n) { file.position(len - n) val buf = ByteBuffer.allocate(n) var i = 0 var m = 0 var cont = true while (cont) { m = file.read(buf) if (m != -1) i += m cont = (m != -1 && n > i) } if (i == n) { val prefix = new String(buf.array(), 0, sha1_prefix.length, UTF8.charset) val s = new String(buf.array(), sha1_prefix.length, SHA1.digest_length, UTF8.charset) if (prefix == sha1_prefix) Some(s) else None } else None } else None } } else None } def write_heap_digest(heap: Path): String = read_heap_digest(heap) match { case None => val s = SHA1.digest(heap).toString File.append(heap, sha1_prefix + s) s case Some(s) => s } /** persistent store **/ object Session_Info { val session_name = SQL.Column.string("session_name").make_primary_key // Build_Log.Session_Info val session_timing = SQL.Column.bytes("session_timing") val command_timings = SQL.Column.bytes("command_timings") val theory_timings = SQL.Column.bytes("theory_timings") val ml_statistics = SQL.Column.bytes("ml_statistics") val task_statistics = SQL.Column.bytes("task_statistics") val errors = SQL.Column.bytes("errors") val build_log_columns = List(session_name, session_timing, command_timings, theory_timings, ml_statistics, task_statistics, errors) // Build.Session_Info val sources = SQL.Column.string("sources") val input_heaps = SQL.Column.string("input_heaps") val output_heap = SQL.Column.string("output_heap") val return_code = SQL.Column.int("return_code") 
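    /* Note (editorial): the columns above fall into two groups, marked
       Build_Log.Session_Info and Build.Session_Info; both are combined
       below into the single "isabelle_session_info" table, keyed by
       session_name. */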
val build_columns = List(sources, input_heaps, output_heap, return_code)

    val table = SQL.Table("isabelle_session_info", build_log_columns ::: build_columns)
  }

  class Database_Context private[Sessions](
    val store: Sessions.Store,
    database_server: Option[SQL.Database]
  ) extends AutoCloseable {
    def cache: Term.Cache = store.cache

    def close(): Unit = database_server.foreach(_.close())

    def output_database[A](session: String)(f: SQL.Database => A): A =
      database_server match {
        case Some(db) => f(db)
        case None => using(store.open_database(session, output = true))(f)
      }

    def input_database[A](session: String)(f: (SQL.Database, String) => Option[A]): Option[A] =
      database_server match {
        case Some(db) => f(db, session)
        case None =>
          store.try_open_database(session) match {
            case Some(db) => using(db)(f(_, session))
            case None => None
          }
      }

    def read_export(
      sessions: List[String],
      theory_name: String,
      name: String
    ): Option[Export.Entry] = {
      val attempts =
        database_server match {
          case Some(db) =>
            sessions.view.map(session_name =>
-             Export.read_entry(db, store.cache, session_name, theory_name, name))
+             Export.Entry_Name(session = session_name, theory = theory_name, name = name)
+               .read(db, store.cache))
          case None =>
            sessions.view.map(session_name =>
              store.try_open_database(session_name) match {
                case Some(db) =>
-                 using(db)(Export.read_entry(_, store.cache, session_name, theory_name, name))
+                 using(db) { _ =>
+                   Export.Entry_Name(session = session_name, theory = theory_name, name = name)
+                     .read(db, store.cache) }
                case None => None
              })
        }
      attempts.collectFirst({ case Some(entry) => entry })
    }

    def get_export(
        session_hierarchy: List[String], theory_name: String, name: String): Export.Entry =
      read_export(session_hierarchy, theory_name, name) getOrElse
        Export.empty_entry(theory_name, name)

+   def get_classpath(structure: Structure, session: String): List[File.Content_Bytes] =
+   {
+     (for {
+       name <- structure.build_requirements(List(session))
+       patterns = structure(name).export_classpath if patterns.nonEmpty
+     } yield {
+       input_database(name)((db, _) =>
+         db.transaction {
+           val matcher = Export.make_matcher(patterns)
+           val res =
+             for {
+               entry_name <- Export.read_entry_names(db, name) if matcher(entry_name)
+               entry <- entry_name.read(db, store.cache)
+             } yield File.Content(entry.entry_name.make_path(), entry.uncompressed)
+           Some(res)
+         }
+       ).getOrElse(Nil)
+     }).flatten
+   }
+
    override def toString: String = {
      val s =
        database_server match {
          case Some(db) => db.toString
          case None => "input_dirs = " + store.input_dirs.map(_.absolute).mkString(", ")
        }
      "Database_Context(" + s + ")"
    }
  }

  def store(options: Options, cache: Term.Cache = Term.Cache.make()): Store =
    new Store(options, cache)

  class Store private[Sessions](val options: Options, val cache: Term.Cache) {
    store =>

    override def toString: String = "Store(output_dir = " + output_dir.absolute + ")"


    /* directories */

    val system_output_dir: Path = Path.explode("$ISABELLE_HEAPS_SYSTEM/$ML_IDENTIFIER")
    val user_output_dir: Path = Path.explode("$ISABELLE_HEAPS/$ML_IDENTIFIER")

    def system_heaps: Boolean = options.bool("system_heaps")

    val output_dir: Path =
      if (system_heaps) system_output_dir else user_output_dir

    val input_dirs: List[Path] =
      if (system_heaps) List(system_output_dir)
      else List(user_output_dir, system_output_dir)

    def presentation_dir: Path =
      if (system_heaps) Path.explode("$ISABELLE_BROWSER_INFO_SYSTEM")
      else Path.explode("$ISABELLE_BROWSER_INFO")


    /* file names */

    def heap(name: String): Path = Path.basic(name)
    def database(name: String): Path = Path.basic("log")
+ Path.basic(name).ext("db") def log(name: String): Path = Path.basic("log") + Path.basic(name) def log_gz(name: String): Path = log(name).ext("gz") def output_heap(name: String): Path = output_dir + heap(name) def output_database(name: String): Path = output_dir + database(name) def output_log(name: String): Path = output_dir + log(name) def output_log_gz(name: String): Path = output_dir + log_gz(name) def prepare_output_dir(): Unit = Isabelle_System.make_directory(output_dir + Path.basic("log")) /* heap */ def find_heap(name: String): Option[Path] = input_dirs.map(_ + heap(name)).find(_.is_file) def find_heap_digest(name: String): Option[String] = find_heap(name).flatMap(read_heap_digest) def the_heap(name: String): Path = find_heap(name) getOrElse error("Missing heap image for session " + quote(name) + " -- expected in:\n" + cat_lines(input_dirs.map(dir => " " + dir.expand.implode))) /* database */ def find_database(name: String): Option[Path] = input_dirs.map(_ + database(name)).find(_.is_file) def database_server: Boolean = options.bool("build_database_server") def open_database_server(): SQL.Database = PostgreSQL.open_database( user = options.string("build_database_user"), password = options.string("build_database_password"), database = options.string("build_database_name"), host = options.string("build_database_host"), port = options.int("build_database_port"), ssh = options.proper_string("build_database_ssh_host").map(ssh_host => SSH.open_session(options, host = ssh_host, user = options.string("build_database_ssh_user"), port = options.int("build_database_ssh_port"))), ssh_close = true) def open_database_context(): Database_Context = new Database_Context(store, if (database_server) Some(open_database_server()) else None) def try_open_database(name: String, output: Boolean = false): Option[SQL.Database] = { def check(db: SQL.Database): Option[SQL.Database] = if (output || session_info_exists(db)) Some(db) else { db.close(); None } if (database_server) check(open_database_server()) else if (output) Some(SQLite.open_database(output_database(name))) else { (for { dir <- input_dirs.view path = dir + database(name) if path.is_file db <- check(SQLite.open_database(path)) } yield db).headOption } } def open_database(name: String, output: Boolean = false): SQL.Database = try_open_database(name, output = output) getOrElse error("Missing build database for session " + quote(name)) def clean_output(name: String): (Boolean, Boolean) = { val relevant_db = database_server && { try_open_database(name) match { case Some(db) => try { db.transaction { val relevant_db = session_info_defined(db, name) init_session_info(db, name) relevant_db } } finally { db.close() } case None => false } } val del = for { dir <- (if (system_heaps) List(user_output_dir, system_output_dir) else List(user_output_dir)) file <- List(heap(name), database(name), log(name), log_gz(name)) path = dir + file if path.is_file } yield path.file.delete val relevant = relevant_db || del.nonEmpty val ok = del.forall(b => b) (relevant, ok) } /* SQL database content */ def read_bytes(db: SQL.Database, name: String, column: SQL.Column): Bytes = db.using_statement(Session_Info.table.select(List(column), Session_Info.session_name.where_equal(name))) { stmt => val res = stmt.execute_query() if (!res.next()) Bytes.empty else res.bytes(column) } def read_properties(db: SQL.Database, name: String, column: SQL.Column): List[Properties.T] = Properties.uncompress(read_bytes(db, name, column), cache = cache) /* session info */ def 
init_session_info(db: SQL.Database, name: String): Unit = { db.transaction { db.create_table(Session_Info.table) db.using_statement( Session_Info.table.delete(Session_Info.session_name.where_equal(name)))(_.execute()) db.create_table(Export.Data.table) db.using_statement( Export.Data.table.delete(Export.Data.session_name.where_equal(name)))(_.execute()) db.create_table(Document_Build.Data.table) db.using_statement( Document_Build.Data.table.delete( Document_Build.Data.session_name.where_equal(name)))(_.execute()) } } def session_info_exists(db: SQL.Database): Boolean = { val tables = db.tables tables.contains(Session_Info.table.name) && tables.contains(Export.Data.table.name) } def session_info_defined(db: SQL.Database, name: String): Boolean = db.transaction { session_info_exists(db) && { db.using_statement( Session_Info.table.select(List(Session_Info.session_name), Session_Info.session_name.where_equal(name)))(stmt => stmt.execute_query().next()) } } def write_session_info( db: SQL.Database, name: String, build_log: Build_Log.Session_Info, build: Build.Session_Info ): Unit = { db.transaction { db.using_statement(Session_Info.table.insert()) { stmt => stmt.string(1) = name stmt.bytes(2) = Properties.encode(build_log.session_timing) stmt.bytes(3) = Properties.compress(build_log.command_timings, cache = cache.xz) stmt.bytes(4) = Properties.compress(build_log.theory_timings, cache = cache.xz) stmt.bytes(5) = Properties.compress(build_log.ml_statistics, cache = cache.xz) stmt.bytes(6) = Properties.compress(build_log.task_statistics, cache = cache.xz) stmt.bytes(7) = Build_Log.compress_errors(build_log.errors, cache = cache.xz) stmt.string(8) = build.sources stmt.string(9) = cat_lines(build.input_heaps) stmt.string(10) = build.output_heap getOrElse "" stmt.int(11) = build.return_code stmt.execute() } } } def read_session_timing(db: SQL.Database, name: String): Properties.T = Properties.decode(read_bytes(db, name, Session_Info.session_timing), cache = cache) def read_command_timings(db: SQL.Database, name: String): List[Properties.T] = read_properties(db, name, Session_Info.command_timings) def read_theory_timings(db: SQL.Database, name: String): List[Properties.T] = read_properties(db, name, Session_Info.theory_timings) def read_ml_statistics(db: SQL.Database, name: String): List[Properties.T] = read_properties(db, name, Session_Info.ml_statistics) def read_task_statistics(db: SQL.Database, name: String): List[Properties.T] = read_properties(db, name, Session_Info.task_statistics) def read_theories(db: SQL.Database, name: String): List[String] = read_theory_timings(db, name).flatMap(Markup.Name.unapply) def read_errors(db: SQL.Database, name: String): List[String] = Build_Log.uncompress_errors(read_bytes(db, name, Session_Info.errors), cache = cache) def read_build(db: SQL.Database, name: String): Option[Build.Session_Info] = { if (db.tables.contains(Session_Info.table.name)) { db.using_statement(Session_Info.table.select(Session_Info.build_columns, Session_Info.session_name.where_equal(name))) { stmt => val res = stmt.execute_query() if (!res.next()) None else { Some( Build.Session_Info( res.string(Session_Info.sources), split_lines(res.string(Session_Info.input_heaps)), res.string(Session_Info.output_heap) match { case "" => None case s => Some(s) }, res.int(Session_Info.return_code))) } } } else None } } } diff --git a/src/Pure/Tools/generated_files.ML b/src/Pure/Tools/generated_files.ML --- a/src/Pure/Tools/generated_files.ML +++ b/src/Pure/Tools/generated_files.ML @@ -1,401 +1,444 @@ (* 
Title: Pure/Tools/generated_files.ML Author: Makarius Generated source files for other languages: with antiquotations, without Isabelle symbols. *) signature GENERATED_FILES = sig val add_files: Path.binding * Bytes.T -> theory -> theory type file = {path: Path.T, pos: Position.T, content: Bytes.T} val file_binding: file -> Path.binding val file_markup: file -> Markup.T val print_file: file -> string val report_file: Proof.context -> Position.T -> file -> unit val get_files: theory -> file list val get_file: theory -> Path.binding -> file val get_files_in: Path.binding list * theory -> (file * Position.T) list val check_files_in: Proof.context -> (string * Position.T) list * (string * Position.T) option -> Path.binding list * theory val write_file: Path.T -> file -> unit val export_file: theory -> file -> unit type file_type = {name: string, ext: string, make_comment: string -> string, make_string: string -> string} val file_type: binding -> {ext: string, make_comment: string -> string, make_string: string -> string} -> theory -> theory val antiquotation: binding -> 'a Token.context_parser -> ({context: Proof.context, source: Token.src, file_type: file_type, argument: 'a} -> string) -> theory -> theory val set_string: string -> Proof.context -> Proof.context val generate_file: Path.binding * Input.source -> Proof.context -> local_theory val generate_file_cmd: (string * Position.T) * Input.source -> local_theory -> local_theory val export_generated_files: Proof.context -> (Path.binding list * theory) list -> unit val export_generated_files_cmd: Proof.context -> ((string * Position.T) list * (string * Position.T) option) list -> unit + val check_external_files: Proof.context -> + Input.source list * Input.source -> Path.T list * Path.T + val get_external_files: Path.T -> Path.T list * Path.T -> unit + val scala_build_generated_files: Proof.context -> (Path.binding list * theory) list -> + (Path.T list * Path.T) list -> unit + val scala_build_generated_files_cmd: Proof.context -> + ((string * Position.T) list * (string * Position.T) option) list -> + (Input.source list * Input.source) list -> unit val with_compile_dir: (Path.T -> unit) -> unit val compile_generated_files: Proof.context -> (Path.binding list * theory) list -> (Path.T list * Path.T) list -> (Path.binding list * bool option) list -> Path.binding -> Input.source -> unit val compile_generated_files_cmd: Proof.context -> ((string * Position.T) list * (string * Position.T) option) list -> (Input.source list * Input.source) list -> ((string * Position.T) list * bool option) list -> string * Position.T -> Input.source -> unit val execute: Path.T -> Input.source -> string -> string end; structure Generated_Files: GENERATED_FILES = struct (** context data **) type file = Path.T * (Position.T * Bytes.T); type file_type = {name: string, ext: string, make_comment: string -> string, make_string: string -> string}; type antiquotation = Token.src -> Proof.context -> file_type -> string; fun err_dup_files path pos pos' = error ("Duplicate generated file: " ^ Path.print path ^ Position.here_list [pos, pos']); structure Data = Theory_Data ( type T = file list Symtab.table * (*files for named theory*) file_type Name_Space.table * (*file types*) antiquotation Name_Space.table; (*antiquotations*) val empty = (Symtab.empty, Name_Space.empty_table Markup.file_typeN, Name_Space.empty_table Markup.antiquotationN); fun merge ((files1, types1, antiqs1), (files2, types2, antiqs2)) = let val files' = (files1, files2) |> Symtab.join (fn _ => Library.merge 
(fn ((path1, (pos1, bytes1)), (path2, (pos2, bytes2))) => if path1 <> path2 then false else if pos1 = pos2 andalso Bytes.eq (bytes1, bytes2) then true else err_dup_files path1 pos1 pos2)); val types' = Name_Space.merge_tables (types1, types2); val antiqs' = Name_Space.merge_tables (antiqs1, antiqs2); in (files', types', antiqs') end; ); fun lookup_files thy = Symtab.lookup_list (#1 (Data.get thy)) (Context.theory_long_name thy); fun update_files thy_files' thy = (Data.map o @{apply 3(1)}) (Symtab.update (Context.theory_long_name thy, thy_files')) thy; fun add_files (binding, content) thy = let val _ = Path.proper_binding binding; val (path, pos) = Path.dest_binding binding; val thy_files = lookup_files thy; val thy_files' = (case AList.lookup (op =) thy_files path of SOME (pos', _) => err_dup_files path pos pos' | NONE => (path, (pos, content)) :: thy_files); in update_files thy_files' thy end; (* get files *) type file = {path: Path.T, pos: Position.T, content: Bytes.T}; fun file_binding (file: file) = Path.binding (#path file, #pos file); fun file_markup (file: file) = (Markup.entityN, Position.def_properties_of (#pos file)); fun print_file (file: file) = Markup.markup (file_markup file) (quote (Path.implode (#path file))); fun report_file ctxt pos file = Context_Position.report ctxt pos (file_markup file); fun get_files thy = lookup_files thy |> rev |> map (fn (path, (pos, content)) => {path = path, pos = pos, content = content}: file); fun get_file thy binding = let val (path, pos) = Path.dest_binding binding in (case find_first (fn file => #path file = path) (get_files thy) of SOME file => file | NONE => error ("Missing generated file " ^ Path.print path ^ " in theory " ^ quote (Context.theory_long_name thy) ^ Position.here pos)) end; fun get_files_in ([], thy) = map (rpair Position.none) (get_files thy) | get_files_in (files, thy) = map (fn binding => (get_file thy binding, Path.pos_of_binding binding)) files; (* check and output files *) fun check_files_in ctxt (files, opt_thy) = let val thy = (case opt_thy of SOME name => Theory.check {long = false} ctxt name | NONE => Proof_Context.theory_of ctxt); in (map Path.explode_binding files, thy) end; fun write_file dir (file: file) = let val path = Path.expand (dir + #path file); val _ = Isabelle_System.make_directory (Path.dir path); val content = #content file; val write_content = (case try Bytes.read path of SOME old_content => not (Bytes.eq (content, old_content)) | NONE => true) in if write_content then Bytes.write path content else () end; fun export_file thy (file: file) = Export.export thy (file_binding file) (Bytes.contents_blob (#content file)); (* file types *) fun get_file_type thy ext = if ext = "" then error "Bad file extension" else (#2 (Data.get thy), NONE) |-> Name_Space.fold_table (fn (_, file_type) => fn res => if #ext file_type = ext then SOME file_type else res) |> (fn SOME file_type => file_type | NONE => error ("Unknown file type for extension " ^ quote ext)); fun file_type binding {ext, make_comment, make_string} thy = let val name = Binding.name_of binding; val pos = Binding.pos_of binding; val file_type = {name = name, ext = ext, make_comment = make_comment, make_string = make_string}; val table = #2 (Data.get thy); val space = Name_Space.space_of_table table; val context = Context.Theory thy |> Name_Space.map_naming (K Name_Space.global_naming); val (_, data') = table |> Name_Space.define context true (Binding.make (name, pos), file_type); val _ = (case try (#name o get_file_type thy) ext of NONE => () | SOME 
name' =>
      error ("Extension " ^ quote ext ^ " already registered for file type " ^
        quote (Markup.markup (Name_Space.markup space name') name') ^ Position.here pos));
  in thy |> (Data.map o @{apply 3(2)}) (K data') end;


(* antiquotations *)

val get_antiquotations = #3 o Data.get o Proof_Context.theory_of;

fun antiquotation name scan body thy =
  let
    fun ant src ctxt file_type : string =
      let val (x, ctxt') = Token.syntax scan src ctxt
      in body {context = ctxt', source = src, file_type = file_type, argument = x} end;
  in
    thy
    |> (Data.map o @{apply 3(3)})
        (Name_Space.define (Context.Theory thy) true (name, ant) #> #2)
  end;

val scan_antiq =
  Antiquote.scan_control Antiquote.err_prefix >> Antiquote.Control ||
  Scan.one (Symbol.not_eof o Symbol_Pos.symbol) >> (Antiquote.Text o Symbol_Pos.symbol);

fun read_source ctxt (file_type: file_type) source =
  let
    val _ =
      Context_Position.report ctxt (Input.pos_of source)
        (Markup.language
          {name = #name file_type, symbols = false, antiquotes = true,
            delimited = Input.is_delimited source});

    val (input, _) =
      Input.source_explode source
      |> Scan.error (Scan.finite Symbol_Pos.stopper (Scan.repeat scan_antiq));

    val _ = Context_Position.reports ctxt (Antiquote.antiq_reports input);

    fun expand antiq =
      (case antiq of
        Antiquote.Text s => s
      | Antiquote.Control {name, body, ...} =>
          let
            val src = Token.make_src name (if null body then [] else [Token.read_cartouche body]);
            val (src', ant) = Token.check_src ctxt get_antiquotations src;
          in ant src' ctxt file_type end
      | Antiquote.Antiq antiq =>
          error ("Bad antiquotation " ^ Position.here (#1 (#range antiq))));
  in implode (map expand input) end;



(** Isar commands **)

(* generate_file *)

fun generate_file (binding, source) lthy =
  let
    val thy = Proof_Context.theory_of lthy;

    val (path, pos) = Path.dest_binding binding;
    val file_type =
      get_file_type thy (#2 (Path.split_ext path))
        handle ERROR msg => error (msg ^ Position.here pos);

    val header = #make_comment file_type " generated by Isabelle ";
    val content = Bytes.string (header ^ "\n" ^ read_source lthy file_type source);
  in lthy |> (Local_Theory.background_theory o add_files) (binding, content) end;

fun generate_file_cmd (file, source) lthy =
  generate_file (Path.explode_binding file, source) lthy;


(* export_generated_files *)

fun export_generated_files ctxt args =
  let val thy = Proof_Context.theory_of ctxt in
    (case map #1 (maps get_files_in args) of
      [] => ()
    | files =>
        (List.app (export_file thy) files;
          writeln (Export.message thy Path.current);
          writeln (cat_lines (map (prefix "  " o print_file) files))))
  end;

fun export_generated_files_cmd ctxt args =
  export_generated_files ctxt (map (check_files_in ctxt) args);


+(* external files *)
+
+fun check_external_files ctxt (raw_files, raw_base_dir) =
+  let
+    val base_dir = Resources.check_dir ctxt NONE raw_base_dir;
+    fun check source =
+      (Resources.check_file ctxt (SOME base_dir) source;
+        Path.explode (Input.string_of source));
+    val files = map check raw_files;
+  in (files, base_dir) end;
+
+fun get_external_files dir (files, base_dir) =
+  files |> List.app (fn file => Isabelle_System.copy_file_base (base_dir, file) dir);
+
+
+(* scala_build_generated_files *)
+
+fun scala_build_generated_files ctxt args external =
+  Isabelle_System.with_tmp_dir "scala_build" (fn dir =>
+    let
+      val files = maps get_files_in args;
+      val _ = List.app (fn (file, pos) => report_file ctxt pos file) files;
+      val _ = List.app (write_file dir o #1) files;
+      val _ = List.app (get_external_files dir) external;
+    in Scala_Build.scala_build ctxt dir end);
+
+fun scala_build_generated_files_cmd ctxt args external =
+  scala_build_generated_files ctxt
+    (map (check_files_in ctxt) args)
+    (map (check_external_files ctxt) external)
+
+
(* compile_generated_files *)

val compile_dir = Config.declare_string ("compile_dir", \<^here>) (K "");

fun with_compile_dir body : unit =
  body (Path.explode (Config.get (Context.the_local_context ()) compile_dir));

fun compile_generated_files ctxt args external export export_prefix source =
  Isabelle_System.with_tmp_dir "compile" (fn dir =>
    let
      val thy = Proof_Context.theory_of ctxt;

      val files = maps get_files_in args;
      val _ = List.app (fn (file, pos) => report_file ctxt pos file) files;
      val _ = List.app (write_file dir o #1) files;
-     val _ =
-       external |> List.app (fn (files, base_dir) =>
-         files |> List.app (fn file => Isabelle_System.copy_file_base (base_dir, file) dir));
+     val _ = List.app (get_external_files dir) external;

      val _ =
        ML_Context.eval_in (SOME (Config.put compile_dir (Path.implode dir) ctxt))
          ML_Compiler.flags (Input.pos_of source)
          (ML_Lex.read "Generated_Files.with_compile_dir (" @
            ML_Lex.read_source source @ ML_Lex.read ")");

      val _ =
        export |> List.app (fn (bindings, executable) =>
          bindings |> List.app (fn binding0 =>
            let
              val binding = binding0 |> Path.map_binding (executable = SOME true ? Path.exe);
              val (path, pos) = Path.dest_binding binding;
              val content =
                (case try Bytes.read (dir + path) of
                  SOME bytes => Bytes.contents_blob bytes
                | NONE => error ("Missing result file " ^ Path.print path ^ Position.here pos));
              val _ = Export.report_export thy export_prefix;
              val binding' =
                Path.map_binding (Path.append (Path.path_of_binding export_prefix)) binding;
            in
              (if is_some executable then Export.export_executable else Export.export)
                thy binding' content
            end));

      val _ =
        if null export then ()
        else writeln (Export.message thy (Path.path_of_binding export_prefix));
    in () end);

fun compile_generated_files_cmd ctxt args external export export_prefix source =
  compile_generated_files ctxt
    (map (check_files_in ctxt) args)
-   (external |> map (fn (raw_files, raw_base_dir) =>
-     let
-       val base_dir = Resources.check_dir ctxt NONE raw_base_dir;
-       fun check source =
-         (Resources.check_file ctxt (SOME base_dir) source;
-           Path.explode (Input.string_of source));
-       val files = map check raw_files;
-     in (files, base_dir) end))
+   (map (check_external_files ctxt) external)
    ((map o apfst o map) Path.explode_binding export)
    (Path.explode_binding export_prefix)
    source;


(* execute compiler -- auxiliary *)

fun execute dir title compile =
  Isabelle_System.bash_process (Bash.script compile |> Bash.cwd dir)
  |> Process_Result.check
  |> Process_Result.out
  handle ERROR msg =>
    let val (s, pos) = Input.source_content title
    in error (s ^ " failed" ^ Position.here pos ^ ":\n" ^ msg) end;



(** concrete file types **)

val _ =
  Theory.setup
    (file_type \<^binding>\Haskell\
      {ext = "hs",
       make_comment = enclose "{-" "-}",
-      make_string = GHC.print_string});
+      make_string = GHC.print_string} #>
+    file_type \<^binding>\Java\
+      {ext = "java",
+       make_comment = enclose "/*" "*/",
+       make_string = Java.print_string} #>
+    file_type \<^binding>\Scala\
+      {ext = "scala",
+       make_comment = enclose "/*" "*/",
+       make_string = Java.print_string} #>
+    file_type \<^binding>\Properties\
+      {ext = "props",
+       make_comment = enclose "#" "",
+       make_string = I});



(** concrete antiquotations **)

(* ML expression as string literal *)

structure Local_Data = Proof_Data
(
  type T = string option;
  fun init _ = NONE;
);

val set_string = Local_Data.put o SOME;

fun the_string ctxt =
  (case
Local_Data.get ctxt of
    SOME s => s
  | NONE => raise Fail "No result string");

val _ =
  Theory.setup
    (antiquotation \<^binding>\cartouche\
      (Scan.lift Args.cartouche_input)
      (fn {context = ctxt, file_type, argument, ...} =>
        ctxt |> Context.proof_map
          (ML_Context.expression (Input.pos_of argument)
            (ML_Lex.read "Theory.local_setup (Generated_Files.set_string (" @
             ML_Lex.read_source argument @ ML_Lex.read "))"))
        |> the_string |> #make_string file_type));


(* file-system paths *)

fun path_antiquotation binding check =
  antiquotation binding
    (Args.context -- Scan.lift Parse.path_input >> (fn (ctxt, source) =>
      (check ctxt (SOME Path.current) source |> Path.implode)))
    (fn {file_type, argument, ...} => #make_string file_type argument);

val _ =
  Theory.setup
   (path_antiquotation \<^binding>\path\ Resources.check_path #>
    path_antiquotation \<^binding>\file\ Resources.check_file #>
    path_antiquotation \<^binding>\dir\ Resources.check_dir);

end;
diff --git a/src/Pure/Tools/scala_build.ML b/src/Pure/Tools/scala_build.ML
--- a/src/Pure/Tools/scala_build.ML
+++ b/src/Pure/Tools/scala_build.ML
@@ -1,32 +1,26 @@
(*  Title:      Pure/Tools/scala_build.ML
    Author:     Makarius

Build and export Isabelle/Scala/Java modules.
*)

signature SCALA_BUILD =
sig
- val scala_build_component: Proof.context -> Input.source -> unit
- val scala_build_directory: Proof.context -> Input.source -> unit
+ val scala_build: Proof.context -> Path.T -> unit
end

structure Scala_Build: SCALA_BUILD =
struct

-fun scala_build component ctxt dir =
+fun scala_build ctxt dir =
  let
-   val path = Resources.check_dir ctxt NONE dir;
    val [jar_name, jar_bytes, output] =
-     \<^scala>\scala_build\
-       [Bytes.string (Value.print_bool component), Bytes.string (Path.implode path)];
+     \<^scala>\scala_build\ [Bytes.string (Isabelle_System.absolute_path dir)];
    val _ = writeln (Bytes.content output);
  in
    Export.export (Proof_Context.theory_of ctxt)
      (Path.explode_binding0 (Bytes.content jar_name))
      (Bytes.contents_blob jar_bytes)
  end;

-val scala_build_component = scala_build true;
-val scala_build_directory = scala_build false;
-
end;
diff --git a/src/Pure/Tools/scala_build.scala b/src/Pure/Tools/scala_build.scala
--- a/src/Pure/Tools/scala_build.scala
+++ b/src/Pure/Tools/scala_build.scala
@@ -1,117 +1,115 @@
/*  Title:      Pure/Tools/scala_build.scala
    Author:     Makarius

Manage and build Isabelle/Scala/Java modules.
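Editorial note, derived from the code below: the "scala_build" Scala function
takes a single directory argument, builds it via isabelle.setup.Build (driven
by its build.props), and returns the jar name, the jar bytes, and the compiler
output; the jar is exported under the "classpath/" prefix, which matches the
default export_classpath pattern "*:classpath/*.jar" in session ROOT entries.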
*/

package isabelle

import java.util.{Properties => JProperties}
import java.io.{ByteArrayOutputStream, PrintStream}
import java.nio.file.Files

import scala.jdk.CollectionConverters._


object Scala_Build {
  class Context private[Scala_Build](java_context: isabelle.setup.Build.Context) {
    override def toString: String = java_context.toString

    def is_module(path: Path): Boolean = {
      val module_name = java_context.module_name()
      module_name.nonEmpty && File.eq(java_context.path(module_name).toFile, path.file)
    }

    def module_result: Option[Path] = {
      java_context.module_result() match {
        case "" => None
        case module => Some(File.path(java_context.path(module).toFile))
      }
    }

    def sources: List[Path] =
      java_context.sources().asScala.toList.map(s => File.path(java_context.path(s).toFile))

    def requirements: List[Path] =
      (for {
        s <- java_context.requirements().asScala.iterator
        p <- java_context.requirement_paths(s).asScala.iterator
      } yield (File.path(p.toFile))).toList

    def build(fresh: Boolean = false): String = {
      val output0 = new ByteArrayOutputStream
      val output = new PrintStream(output0)
      def get_output(): String = {
        output.flush()
        Library.trim_line(output0.toString(UTF8.charset))
      }
      try {
        isabelle.setup.Build.build(output, java_context, fresh)
        get_output()
      }
      catch { case ERROR(msg) => cat_error(get_output(), msg) }
    }
  }

  def context(dir: Path,
    component: Boolean = false,
    no_title: Boolean = false,
    do_build: Boolean = false,
    module: Option[Path] = None
  ): Context = {
    val props_name =
      if (component) isabelle.setup.Build.COMPONENT_BUILD_PROPS
      else isabelle.setup.Build.BUILD_PROPS
    val props_path = dir + Path.explode(props_name)

    val props = new JProperties
    props.load(Files.newBufferedReader(props_path.java_path))
    if (no_title) props.remove(isabelle.setup.Build.TITLE)
    if (do_build) props.remove(isabelle.setup.Build.NO_BUILD)
    if (module.isDefined) props.put(isabelle.setup.Build.MODULE, File.standard_path(module.get))

    new Context(new isabelle.setup.Build.Context(dir.java_path, props, props_path.implode))
  }

  sealed case class Result(output: String, jar_bytes: Bytes, jar_path: Option[Path]) {
    def write(): Unit = {
      if (jar_path.isDefined) {
        Isabelle_System.make_directory(jar_path.get.dir)
        Bytes.write(jar_path.get, jar_bytes)
      }
    }
  }

  def build_result(dir: Path, component: Boolean = false): Result = {
    Isabelle_System.with_tmp_file("result", "jar") { tmp_file =>
      val output =
        context(dir, component = component, no_title = true, do_build = true,
          module = Some(tmp_file)).build()
      val jar_bytes = Bytes.read(tmp_file)
      val jar_path = context(dir, component = component).module_result
      Result(output, jar_bytes, jar_path)
    }
  }

  object Scala_Fun extends Scala.Fun("scala_build") with Scala.Bytes_Fun {
    val here = Scala_Project.here
    def invoke(args: List[Bytes]): List[Bytes] =
      args match {
-       case List(component, dir) =>
-         val result =
-           build_result(Path.explode(dir.text),
-             component = Value.Boolean.parse(component.text))
+       case List(dir) =>
+         val result = build_result(Path.explode(dir.text))
          val jar_name =
            result.jar_path match {
              case Some(path) => path.file_name
-             case None => "result.jar"
+             case None => "scala_build.jar"
            }
-         List(Bytes("scala_build/" + jar_name), result.jar_bytes, Bytes(result.output))
+         List(Bytes("classpath/" + jar_name), result.jar_bytes, Bytes(result.output))
        case _ => error("Bad arguments")
      }
  }

  def component_contexts(): List[Context] =
    isabelle.setup.Build.component_contexts().asScala.toList.map(new Context(_))
}
diff --git a/src/Pure/Tools/server_commands.scala b/src/Pure/Tools/server_commands.scala
---
a/src/Pure/Tools/server_commands.scala +++ b/src/Pure/Tools/server_commands.scala @@ -1,334 +1,334 @@ /* Title: Pure/Tools/server_commands.scala Author: Makarius Miscellaneous Isabelle server commands. */ package isabelle object Server_Commands { def default_preferences: String = Options.read_prefs() object Help extends Server.Command("help") { override val command_body: Server.Command_Body = { case (context, ()) => context.command_list } } object Echo extends Server.Command("echo") { override val command_body: Server.Command_Body = { case (_, t) => t } } object Cancel extends Server.Command("cancel") { sealed case class Args(task: UUID.T) def unapply(json: JSON.T): Option[Args] = for { task <- JSON.uuid(json, "task") } yield Args(task) override val command_body: Server.Command_Body = { case (context, Cancel(args)) => context.cancel_task(args.task) } } object Shutdown extends Server.Command("shutdown") { override val command_body: Server.Command_Body = { case (context, ()) => context.server.shutdown() } } object Session_Build extends Server.Command("session_build") { sealed case class Args( session: String, preferences: String = default_preferences, options: List[String] = Nil, dirs: List[String] = Nil, include_sessions: List[String] = Nil, verbose: Boolean = false) def unapply(json: JSON.T): Option[Args] = for { session <- JSON.string(json, "session") preferences <- JSON.string_default(json, "preferences", default_preferences) options <- JSON.strings_default(json, "options") dirs <- JSON.strings_default(json, "dirs") include_sessions <- JSON.strings_default(json, "include_sessions") verbose <- JSON.bool_default(json, "verbose") } yield { Args(session, preferences = preferences, options = options, dirs = dirs, include_sessions = include_sessions, verbose = verbose) } def command( args: Args, progress: Progress = new Progress ) : (JSON.Object.T, Build.Results, Options, Sessions.Base_Info) = { val options = Options.init(prefs = args.preferences, opts = args.options) val dirs = args.dirs.map(Path.explode) val base_info = Sessions.base_info(options, args.session, progress = progress, dirs = dirs, include_sessions = args.include_sessions).check val results = Build.build(options, selection = Sessions.Selection.session(args.session), progress = progress, build_heap = true, dirs = dirs, infos = base_info.infos, verbose = args.verbose) val sessions_order = base_info.sessions_structure.imports_topological_order.zipWithIndex. 
toMap.withDefaultValue(-1) val results_json = JSON.Object( "ok" -> results.ok, "return_code" -> results.rc, "sessions" -> results.sessions.toList.sortBy(sessions_order).map { session => val result = results(session) JSON.Object( "session" -> session, "ok" -> result.ok, "return_code" -> result.rc, "timeout" -> result.timeout, "timing" -> result.timing.json) }) if (results.ok) (results_json, results, options, base_info) else { throw new Server.Error("Session build failed: " + Process_Result.RC.print(results.rc), results_json) } } override val command_body: Server.Command_Body = { case (context, Session_Build(args)) => context.make_task(task => Session_Build.command(args, progress = task.progress)._1) } } object Session_Start extends Server.Command("session_start") { sealed case class Args( build: Session_Build.Args, print_mode: List[String] = Nil) def unapply(json: JSON.T): Option[Args] = for { build <- Session_Build.unapply(json) print_mode <- JSON.strings_default(json, "print_mode") } yield Args(build = build, print_mode = print_mode) def command( args: Args, progress: Progress = new Progress, log: Logger = No_Logger ) : (JSON.Object.T, (UUID.T, Headless.Session)) = { val (_, _, options, base_info) = try { Session_Build.command(args.build, progress = progress) } catch { case exn: Server.Error => error(exn.message) } val resources = Headless.Resources(options, base_info, log = log) val session = resources.start_session(print_mode = args.print_mode, progress = progress) val id = UUID.random() val res = JSON.Object( "session_id" -> id.toString, "tmp_dir" -> File.path(session.tmp_dir).implode) (res, id -> session) } override val command_body: Server.Command_Body = { case (context, Session_Start(args)) => context.make_task { task => val (res, entry) = Session_Start.command(args, progress = task.progress, log = context.server.log) context.server.add_session(entry) res } } } object Session_Stop extends Server.Command("session_stop") { def unapply(json: JSON.T): Option[UUID.T] = JSON.uuid(json, "session_id") def command(session: Headless.Session): (JSON.Object.T, Process_Result) = { val result = session.stop() val result_json = JSON.Object("ok" -> result.ok, "return_code" -> result.rc) if (result.ok) (result_json, result) else throw new Server.Error("Session shutdown failed: " + result.print_rc, result_json) } override val command_body: Server.Command_Body = { case (context, Session_Stop(id)) => context.make_task(_ => { val session = context.server.remove_session(id) Session_Stop.command(session)._1 }) } } object Use_Theories extends Server.Command("use_theories") { sealed case class Args( session_id: UUID.T, theories: List[String], master_dir: String = "", pretty_margin: Double = Pretty.default_margin, unicode_symbols: Boolean = false, export_pattern: String = "", check_delay: Option[Time] = None, check_limit: Option[Int] = None, watchdog_timeout: Option[Time] = None, nodes_status_delay: Option[Time] = None, commit_cleanup_delay: Option[Time] = None) def unapply(json: JSON.T): Option[Args] = for { session_id <- JSON.uuid(json, "session_id") theories <- JSON.strings(json, "theories") master_dir <- JSON.string_default(json, "master_dir") pretty_margin <- JSON.double_default(json, "pretty_margin", Pretty.default_margin) unicode_symbols <- JSON.bool_default(json, "unicode_symbols") export_pattern <- JSON.string_default(json, "export_pattern") check_delay = JSON.seconds(json, "check_delay") check_limit = JSON.int(json, "check_limit") watchdog_timeout = JSON.seconds(json, "watchdog_timeout") 
nodes_status_delay = JSON.seconds(json, "nodes_status_delay") commit_cleanup_delay = JSON.seconds(json, "commit_cleanup_delay") } yield { Args(session_id, theories, master_dir = master_dir, pretty_margin = pretty_margin, unicode_symbols = unicode_symbols, export_pattern = export_pattern, check_delay = check_delay, check_limit = check_limit, watchdog_timeout = watchdog_timeout, nodes_status_delay = nodes_status_delay, commit_cleanup_delay = commit_cleanup_delay) } def command(args: Args, session: Headless.Session, id: UUID.T = UUID.random(), progress: Progress = new Progress ): (JSON.Object.T, Headless.Use_Theories_Result) = { val result = session.use_theories(args.theories, master_dir = args.master_dir, check_delay = args.check_delay.getOrElse(session.default_check_delay), check_limit = args.check_limit.getOrElse(session.default_check_limit), watchdog_timeout = args.watchdog_timeout.getOrElse(session.default_watchdog_timeout), nodes_status_delay = args.nodes_status_delay.getOrElse(session.default_nodes_status_delay), commit_cleanup_delay = args.commit_cleanup_delay.getOrElse(session.default_commit_cleanup_delay), id = id, progress = progress) def output_text(text: String): String = Symbol.output(args.unicode_symbols, text) def output_message(tree: XML.Tree, pos: Position.T): JSON.Object.T = { val position = "pos" -> Position.JSON(pos) tree match { case XML.Text(msg) => Server.Reply.message(output_text(msg)) + position case elem: XML.Elem => val msg = XML.content(Pretty.formatted(List(elem), margin = args.pretty_margin)) val kind = Markup.messages.collectFirst({ case (a, b) if b == elem.name => if (Protocol.is_legacy(elem)) Markup.WARNING else a }) getOrElse "" Server.Reply.message(output_text(msg), kind = kind) + position } } val result_json = JSON.Object( "ok" -> result.ok, "errors" -> (for { (name, status) <- result.nodes if !status.ok (tree, pos) <- result.snapshot(name).messages if Protocol.is_error(tree) } yield output_message(tree, pos)), "nodes" -> (for ((name, status) <- result.nodes) yield { val snapshot = result.snapshot(name) name.json + ("status" -> status.json) + ("messages" -> (for { (tree, pos) <- snapshot.messages if Protocol.is_exported(tree) } yield output_message(tree, pos))) + ("exports" -> (if (args.export_pattern == "") Nil else { val matcher = Export.make_matcher(List(args.export_pattern)) - for { entry <- snapshot.exports if matcher(entry.theory_name, entry.name) } + for { entry <- snapshot.exports if matcher(entry.entry_name) } yield { val (base64, body) = entry.uncompressed.maybe_encode_base64 JSON.Object("name" -> entry.name, "base64" -> base64, "body" -> body) } })) })) (result_json, result) } override val command_body: Server.Command_Body = { case (context, Use_Theories(args)) => context.make_task { task => val session = context.server.the_session(args.session_id) Use_Theories.command(args, session, id = task.id, progress = task.progress)._1 } } } object Purge_Theories extends Server.Command("purge_theories") { sealed case class Args( session_id: UUID.T, theories: List[String] = Nil, master_dir: String = "", all: Boolean = false) def unapply(json: JSON.T): Option[Args] = for { session_id <- JSON.uuid(json, "session_id") theories <- JSON.strings_default(json, "theories") master_dir <- JSON.string_default(json, "master_dir") all <- JSON.bool_default(json, "all") } yield { Args(session_id, theories = theories, master_dir = master_dir, all = all) } def command( args: Args, session: Headless.Session ) : (JSON.Object.T, (List[Document.Node.Name], 
List[Document.Node.Name])) = { val (purged, retained) = session.purge_theories( theories = args.theories, master_dir = args.master_dir, all = args.all) val result_json = JSON.Object("purged" -> purged.map(_.json), "retained" -> retained.map(_.json)) (result_json, (purged, retained)) } override val command_body: Server.Command_Body = { case (context, Purge_Theories(args)) => val session = context.server.the_session(args.session_id) command(args, session)._1 } } } class Server_Commands extends Server.Commands( Server_Commands.Help, Server_Commands.Echo, Server_Commands.Cancel, Server_Commands.Shutdown, Server_Commands.Session_Build, Server_Commands.Session_Start, Server_Commands.Session_Stop, Server_Commands.Use_Theories, Server_Commands.Purge_Theories)
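
Note (not part of the changeset): a minimal usage sketch of the revised Scala_Build
API, assuming only build_result, Result.output, and Result.write as they appear in
the diff above; the module directory path is hypothetical.

    // Build an Isabelle/Scala/Java module directory and keep the resulting jar.
    // "~~/src/Tools/Demo" is a made-up module location for illustration only.
    val result: Scala_Build.Result =
      Scala_Build.build_result(Path.explode("~~/src/Tools/Demo"))
    println(result.output)  // build log from isabelle.setup.Build
    result.write()          // write jar_bytes to jar_path, if the module defines one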