diff --git a/CONTRIBUTORS b/CONTRIBUTORS --- a/CONTRIBUTORS +++ b/CONTRIBUTORS @@ -1,1043 +1,1046 @@ For the purposes of the license agreement in the file COPYRIGHT, a 'contributor' is anybody who is listed in this file (CONTRIBUTORS) or who is listed as an author in one of the source files of this Isabelle distribution. Contributions to this Isabelle version -------------------------------------- +* March 2021: Lukas Stevens + New order prover + * March 2021: Florian Haftmann Dedicated session for combinatorics. * March 2021: Simon Foster and Leo Freitas More symbol definitions for Z Notation: Isabelle fonts and LaTeX macros. * February 2021: Manuel Eberl New material in HOL-Analysis/HOL-Probability, most notably Hoeffding's inequality and the negative binomial distribution * January 2021: Jakub Kądziołka Some lemmas for HOL-Computational_Algebra. * January 2021: Martin Rasyzk Fast set operations for red-black trees. Contributions to Isabelle2021 ----------------------------- * January 2021: Manuel Eberl Characteristic of a semiring. * January 2021: Manuel Eberl Algebraic integers in HOL-Computational_Algebra. * December 2020: Stepan Holub Contributed lemmas for theory HOL.List. * December 2020: Martin Desharnais Zipperposition 2.0 as external prover for Sledgehammer. * December 2020: Walter Guttmann Extension of session HOL-Hoare with total correctness proof system. * November / December 2020: Makarius Wenzel Improved HTML presentation and PDF document preparation, using mostly Isabelle/Scala instead of Isabelle/ML. * November 2020: Stepan Holub Removed preconditions from lemma comm_append_are_replicate. * November 2020: Florian Haftmann Bundle mixins for locale and class expressions. * November 2020: Jakub Kądziołka Stronger lemmas about orders of group elements (generate_pow_card). * October 2020: Jasmin Blanchette, Martin Desharnais, Mathias Fleury Support veriT as external prover in Sledgehammer. * October 2020: Mathias Fleury Updated proof reconstruction for the SMT solver veriT in the smt method. * October 2020: Jasmin Blanchette, Martin Desharnais Support E prover 2.5 as external prover in Sledgehammer. * September 2020: Florian Haftmann Substantial reworking and modularization of Word library, with generic type conversions. * August 2020: Makarius Wenzel Finally enable PIDE protocol for batch-builds, with various consequences of handling session build databases, Isabelle/Scala within Isabelle/ML etc. * August 2020: Makarius Wenzel Improved monitoring of runtime statistics: ML GC progress and Java. * July 2020: Martin Desharnais Update to Metis 2.4. * June 2020: Makarius Wenzel Batch-builds via "isabelle build" allow to invoke Scala from ML. * June 2020: Florian Haftmann Simproc defined_all for more aggressive substitution with variables from assumptions. * May 2020: Makarius Wenzel Antiquotations for Isabelle systems programming, notably @{scala_function} and @{scala} to invoke Scala from ML. * May 2020: Florian Haftmann Generic algebraically founded bit operations NOT, AND, OR, XOR. Contributions to Isabelle2020 ----------------------------- * February 2020: E. Gunther, M. Pagano and P. Sánchez Terraf Simplified, generalised version of ZF/Constructible. * January 2020: LC Paulson The full finite Ramsey's theorem and elements of finite and infinite Ramsey theory. * December 2019: Basil Fürer, Andreas Lochbihler, Joshua Schneider, Dmitriy Traytel Extension of lift_bnf to support quotient types. 
* November 2019: Peter Zeller, TU Kaiserslautern Update of Isabelle/VSCode to WebviewPanel API. * October..December 2019: Makarius Wenzel Isabelle/Phabricator server setup, including Linux platform support in Isabelle/Scala. Client-side tool "isabelle hg_setup". * October 2019: Maximilian Schäffeler Port of the HOL Light decision procedure for metric spaces. * October 2019: Makarius Wenzel More scalable Isabelle dump and underlying headless PIDE session. * August 2019: Makarius Wenzel Better support for proof terms in Isabelle/Pure, notably via method combinator SUBPROOFS (ML) and "subproofs" (Isar). * July 2019: Alexander Krauss, Makarius Wenzel Minimal support for a soft-type system within the Isabelle logical framework. Contributions to Isabelle2019 ----------------------------- * April 2019: LC Paulson Homology and supporting lemmas on topology and group theory * April 2019: Paulo de Vilhena and Martin Baillon Group theory developments, esp. algebraic closure of a field * February/March 2019: Makarius Wenzel Stateless management of export artifacts in the Isabelle/HOL code generator. * February 2019: Manuel Eberl Exponentiation by squaring, used to implement "power" in monoid_mult and fast modular exponentiation. * February 2019: Manuel Eberl Carmichael's function, primitive roots in residue rings, more properties of the order in residue rings. * February 2019: Jeremy Sylvestre Formal Laurent Series and overhaul of Formal power series. * January 2019: Florian Haftmann Clarified syntax and congruence rules for big operators on sets involving the image operator. * January 2019: Florian Haftmann Renovation of code generation, particularly export into session data and proper strings and proper integers based on zarith for OCaml. * January 2019: Andreas Lochbihler New implementation for case_of_simps based on Code_Lazy's pattern matching elimination algorithm. * November/December 2018: Makarius Wenzel Support for Isabelle/Haskell applications of Isabelle/PIDE. * August/September 2018: Makarius Wenzel Improvements of headless Isabelle/PIDE session and server, and systematic exports from theory documents. * December 2018: Florian Haftmann Generic executable sorting algorithms based on executable comparators. * October 2018: Mathias Fleury Proof reconstruction for the SMT solver veriT in the smt method. Contributions to Isabelle2018 ----------------------------- * July 2018: Manuel Eberl "real_asymp" proof method for automatic proofs of real limits, "Big-O" statements, etc. * June 2018: Fabian Immler More tool support for HOL-Types_To_Sets. * June 2018: Martin Baillon and Paulo Emílio de Vilhena A variety of contributions to HOL-Algebra. * June 2018: Wenda Li New/strengthened results involving analysis, topology, etc. * May/June 2018: Makarius Wenzel System infrastructure to export blobs as theory presentation, and to dump PIDE database content in batch mode. * May 2018: Manuel Eberl Landau symbols and asymptotic equivalence (moved from the AFP). * May 2018: Jose Divasón (Universidad de la Rioja), Jesús Aransay (Universidad de la Rioja), Johannes Hölzl (VU Amsterdam), Fabian Immler (TUM) Generalizations in the formalization of linear algebra. * May 2018: Florian Haftmann Consolidation of string-like types in HOL. * May 2018: Andreas Lochbihler (Digital Asset), Pascal Stoop (ETH Zurich) Code generation with lazy evaluation semantics. 
* March 2018: Florian Haftmann Abstract bit operations push_bit, take_bit, drop_bit, alongside with an algebraic foundation for bit strings and word types in HOL-ex. * March 2018: Viorel Preoteasa Generalisation of complete_distrib_lattice * February 2018: Wenda Li A unified definition for the order of zeros and poles. Improved reasoning around non-essential singularities. * January 2018: Sebastien Gouezel Various small additions to HOL-Analysis * December 2017: Jan Gilcher, Andreas Lochbihler, Dmitriy Traytel A new conditional parametricity prover. * October 2017: Alexander Maletzky Derivation of axiom "iff" in theory HOL.HOL from the other axioms. Contributions to Isabelle2017 ----------------------------- * September 2017: Lawrence Paulson HOL-Analysis, e.g. simplicial complexes, Jordan Curve Theorem. * September 2017: Jasmin Blanchette Further integration of Nunchaku model finder. * November 2016 - June 2017: Makarius Wenzel New Isabelle/VSCode, with underlying restructuring of Isabelle/PIDE. * 2017: Makarius Wenzel Session-qualified theory names (theory imports and ROOT files). Prover IDE improvements. Support for SQL databases in Isabelle/Scala: SQLite and PostgreSQL. * August 2017: Andreas Lochbihler, ETH Zurich type of unordered pairs (HOL-Library.Uprod) * August 2017: Manuel Eberl, TUM HOL-Analysis: infinite products over natural numbers, infinite sums over arbitrary sets, connection between formal power series and analytic complex functions * March 2017: Alasdair Armstrong, University of Sheffield and Simon Foster, University of York Fixed-point theory and Galois Connections in HOL-Algebra. * February 2017: Florian Haftmann, TUM Statically embedded computations implemented by generated code. Contributions to Isabelle2016-1 ------------------------------- * December 2016: Ondřej Kunčar, TUM Types_To_Sets: experimental extension of Higher-Order Logic to allow translation of types to sets. * October 2016: Jasmin Blanchette Integration of Nunchaku model finder. * October 2016: Jaime Mendizabal Roche, TUM Ported remaining theories of session Old_Number_Theory to the new Number_Theory and removed Old_Number_Theory. * September 2016: Sascha Boehme Proof method "argo" based on SMT technology for a combination of quantifier-free propositional logic, equality and linear real arithmetic * July 2016: Daniel Stuewe Height-size proofs in HOL-Data_Structures. * July 2016: Manuel Eberl, TUM Algebraic foundation for primes; generalization from nat to general factorial rings. * June 2016: Andreas Lochbihler, ETH Zurich Formalisation of discrete subprobability distributions. * June 2016: Florian Haftmann, TUM Improvements to code generation: optional timing measurements, more succinct closures for static evaluation, fewer ambiguities concerning Scala implicits. * May 2016: Manuel Eberl, TUM Code generation for Probability Mass Functions. * March 2016: Florian Haftmann, TUM Abstract factorial rings with unique factorization. * March 2016: Florian Haftmann, TUM Reworking of the HOL char type as special case of a finite numeral type. * March 2016: Andreas Lochbihler, ETH Zurich Reasoning support for monotonicity, continuity and admissibility in chain-complete partial orders. * February - October 2016: Makarius Wenzel Prover IDE improvements. ML IDE improvements: bootstrap of Pure. Isar language consolidation. Notational modernization of HOL. Tight Poly/ML integration. More Isabelle/Scala system programming modules (e.g. SSH, Mercurial). 
* Winter 2016: Jasmin Blanchette, Inria & LORIA & MPII, Aymeric Bouzy, Ecole polytechnique, Andreas Lochbihler, ETH Zurich, Andrei Popescu, Middlesex University, and Dmitriy Traytel, ETH Zurich 'corec' command and friends. * January 2016: Florian Haftmann, TUM Abolition of compound operators INFIMUM and SUPREMUM for complete lattices. Contributions to Isabelle2016 ----------------------------- * Winter 2016: Manuel Eberl, TUM Support for real exponentiation ("powr") in the "approximation" method. (This was removed in Isabelle 2015 due to a changed definition of "powr".) * Summer 2015 - Winter 2016: Lawrence C Paulson, Cambridge General, homology form of Cauchy's integral theorem and supporting material (ported from HOL Light). * Winter 2015/16: Gerwin Klein, NICTA New print_record command. * May - December 2015: Makarius Wenzel Prover IDE improvements. More Isar language elements. Document language refinements. Poly/ML debugger integration. Improved multi-platform and multi-architecture support. * Winter 2015: Manuel Eberl, TUM The radius of convergence of power series and various summability tests. Harmonic numbers and the Euler-Mascheroni constant. The Generalised Binomial Theorem. The complex and real Gamma/log-Gamma/Digamma/Polygamma functions and their most important properties. * Autumn 2015: Manuel Eberl, TUM Proper definition of division (with remainder) for formal power series; Euclidean Ring and GCD instance for formal power series. * Autumn 2015: Florian Haftmann, TUM Rewrite definitions for global interpretations and sublocale declarations. * Autumn 2015: Andreas Lochbihler Bourbaki-Witt fixpoint theorem for increasing functions on chain-complete partial orders. * Autumn 2015: Chaitanya Mangla, Lawrence C Paulson, and Manuel Eberl A large number of additional binomial identities. * Summer 2015: Daniel Matichuk, NICTA and Makarius Wenzel Isar subgoal command for proof structure within unstructured proof scripts. * Summer 2015: Florian Haftmann, TUM Generic partial division in rings as inverse operation of multiplication. * Summer 2015: Manuel Eberl and Florian Haftmann, TUM Type class hierarchy with common algebraic notions of integral (semi)domains like units, associated elements and normalization wrt. units. * Summer 2015: Florian Haftmann, TUM Fundamentals of abstract type class for factorial rings. * Summer 2015: Julian Biendarra, TUM and Dmitriy Traytel, ETH Zurich Command to lift a BNF structure on the raw type to the abstract type for typedefs. * Summer 2014: Jeremy Avigad, Luke Serafin, CMU, and Johannes Hölzl, TUM Proof of the central limit theorem: includes weak convergence, characteristic functions, and Levy's uniqueness and continuity theorem. Contributions to Isabelle2015 ----------------------------- * 2014/2015: Daniel Matichuk, Toby Murray, NICTA and Makarius Wenzel The Eisbach proof method language and "match" method. * Winter 2014 and Spring 2015: Ondrej Kuncar, TUM Extension of lift_definition to execute lifted functions that have as a return type a datatype containing a subtype. * March 2015: Jasmin Blanchette, Inria & LORIA & MPII, Mathias Fleury, MPII, and Dmitriy Traytel, TUM More multiset theorems, syntax, and operations. * December 2014: Johannes Hölzl, Manuel Eberl, Sudeep Kanav, TUM, and Jeremy Avigad, Luke Serafin, CMU Various integration theorems: mostly integration on intervals and substitution. * September 2014: Florian Haftmann, TUM Lexicographic order on functions and sum/product over function bodies. 
* August 2014: Andreas Lochbihler, ETH Zurich Test infrastructure for executing generated code in target languages. * August 2014: Manuel Eberl, TUM Generic euclidean algorithms for GCD et al. Contributions to Isabelle2014 ----------------------------- * July 2014: Thomas Sewell, NICTA Preserve equality hypotheses in "clarify" and friends. New "hypsubst_thin" method configuration option. * Summer 2014: Florian Haftmann, TUM Consolidation and generalization of facts concerning (abelian) semigroups and monoids, particularly products (resp. sums) on finite sets. * Summer 2014: Mathias Fleury, ENS Rennes, and Albert Steckermeier, TUM Work on exotic automatic theorem provers for Sledgehammer (LEO-II, veriT, Waldmeister, etc.). * June 2014: Florian Haftmann, TUM Internal reorganisation of the local theory / named target stack. * June 2014: Sudeep Kanav, TUM, Jeremy Avigad, CMU, and Johannes Hölzl, TUM Various properties of exponentially, Erlang, and normal distributed random variables. * May 2014: Cezary Kaliszyk, University of Innsbruck, and Jasmin Blanchette, TUM SML-based engines for MaSh. * March 2014: René Thiemann Improved code generation for multisets. * February 2014: Florian Haftmann, TUM Permanent interpretation inside theory, locale and class targets with mixin definitions. * Spring 2014: Lawrence C Paulson, Cambridge Theory Complex_Basic_Analysis. Tidying up Number_Theory vs Old_Number_Theory * Winter 2013 and Spring 2014: Ondrej Kuncar, TUM Various improvements to Lifting/Transfer, integration with the BNF package. * Winter 2013 and Spring 2014: Makarius Wenzel, Université Paris-Sud / LRI Improvements of Isabelle/Scala and Isabelle/jEdit Prover IDE. * Fall 2013 and Winter 2014: Martin Desharnais, Lorenz Panny, Dmitriy Traytel, and Jasmin Blanchette, TUM Various improvements to the BNF-based (co)datatype package, including a more polished "primcorec" command, optimizations, and integration in the "HOL" session. * Winter/Spring 2014: Sascha Boehme, QAware GmbH, and Jasmin Blanchette, TUM "SMT2" module and "smt2" proof method, based on SMT-LIB 2 and Z3 4.3. * January 2014: Lars Hupel, TUM An improved, interactive simplifier trace with integration into the Isabelle/jEdit Prover IDE. * December 2013: Florian Haftmann, TUM Consolidation of abstract interpretations concerning min and max. * November 2013: Florian Haftmann, TUM Abolition of negative numeral literals in the logic. Contributions to Isabelle2013-1 ------------------------------- * September 2013: Lars Noschinski, TUM Conversion between function definitions as list of equations and case expressions in HOL. New library Simps_Case_Conv with commands case_of_simps, simps_of_case. * September 2013: Nik Sultana, University of Cambridge Improvements to HOL/TPTP parser and import facilities. * September 2013: Johannes Hölzl and Dmitriy Traytel, TUM New "coinduction" method (residing in HOL-BNF) to avoid boilerplate. * Summer 2013: Makarius Wenzel, Université Paris-Sud / LRI Improvements of Isabelle/Scala and Isabelle/jEdit Prover IDE. * Summer 2013: Manuel Eberl, TUM Generation of elimination rules in the function package. New command "fun_cases". * Summer 2013: Christian Sternagel, JAIST Improved support for ad hoc overloading of constants, including documentation and examples. * Spring and Summer 2013: Lorenz Panny, Dmitriy Traytel, and Jasmin Blanchette, TUM Various improvements to the BNF-based (co)datatype package, including "primrec_new" and "primcorec" commands and a compatibility layer. 
* Spring and Summer 2013: Ondrej Kuncar, TUM Various improvements of Lifting and Transfer packages. * Spring 2013: Brian Huffman, Galois Inc. Improvements of the Transfer package. * Summer 2013: Daniel Kühlwein, ICIS, Radboud University Nijmegen Jasmin Blanchette, TUM Various improvements to MaSh, including a server mode. * First half of 2013: Steffen Smolka, TUM Further improvements to Sledgehammer's Isar proof generator. * May 2013: Florian Haftmann, TUM Ephemeral interpretation in local theories. * May 2013: Lukas Bulwahn and Nicolai Schaffroth, TUM Spec_Check: A Quickcheck tool for Isabelle/ML. * April 2013: Stefan Berghofer, secunet Security Networks AG Dmitriy Traytel, TUM Makarius Wenzel, Université Paris-Sud / LRI Case translations as a separate check phase independent of the datatype package. * March 2013: Florian Haftmann, TUM Reform of "big operators" on sets. * March 2013: Florian Haftmann, TUM Algebraic locale hierarchy for orderings and (semi)lattices. * February 2013: Florian Haftmann, TUM Reworking and consolidation of code generation for target language numerals. * February 2013: Florian Haftmann, TUM Sieve of Eratosthenes. Contributions to Isabelle2013 ----------------------------- * 2012: Makarius Wenzel, Université Paris-Sud / LRI Improvements of Isabelle/Scala and Isabelle/jEdit Prover IDE. * Fall 2012: Daniel Kühlwein, ICIS, Radboud University Nijmegen Jasmin Blanchette, TUM Implemented Machine Learning for Sledgehammer (MaSh). * Fall 2012: Steffen Smolka, TUM Various improvements to Sledgehammer's Isar proof generator, including a smart type annotation algorithm and proof shrinking. * December 2012: Alessandro Coglio, Kestrel Contributions to HOL's Lattice library. * November 2012: Fabian Immler, TUM "Symbols" dockable for Isabelle/jEdit. * November 2012: Fabian Immler, TUM Proof of the Daniell-Kolmogorov theorem: the existence of the limit of projective families. * October 2012: Andreas Lochbihler, KIT Efficient construction of red-black trees from sorted associative lists. * September 2012: Florian Haftmann, TUM Lattice instances for type option. * September 2012: Christian Sternagel, JAIST Consolidated HOL/Library (theories: Prefix_Order, Sublist, and Sublist_Order) w.r.t. prefixes, suffixes, and embedding on lists. * August 2012: Dmitriy Traytel, Andrei Popescu, Jasmin Blanchette, TUM New BNF-based (co)datatype package. * August 2012: Andrei Popescu and Dmitriy Traytel, TUM Theories of ordinals and cardinals. * July 2012: Makarius Wenzel, Université Paris-Sud / LRI Advanced support for Isabelle sessions and build management, notably "isabelle build". * June 2012: Felix Kuperjans, Lukas Bulwahn, TUM and Rafal Kolanski, NICTA Simproc for rewriting set comprehensions into pointfree expressions. * May 2012: Andreas Lochbihler, KIT Theory of almost everywhere constant functions. * 2010-2012: Markus Kaiser and Lukas Bulwahn, TUM Graphview in Scala/Swing. Contributions to Isabelle2012 ----------------------------- * April 2012: Johannes Hölzl, TUM Probability: Introduced type to represent measures instead of locales. * April 2012: Johannes Hölzl, Fabian Immler, TUM Float: Moved to Dyadic rationals to represent floating point numbers. * April 2012: Thomas Sewell, NICTA and 2010: Sascha Boehme, TUM Theory HOL/Word/WordBitwise: logic/circuit expansion of bitvector equalities/inequalities. * March 2012: Christian Sternagel, JAIST Consolidated theory of relation composition. * March 2012: Nik Sultana, University of Cambridge HOL/TPTP parser and import facilities. 
* March 2012: Cezary Kaliszyk, University of Innsbruck and Alexander Krauss, QAware GmbH Faster and more scalable Import mechanism for HOL Light proofs. * January 2012: Florian Haftmann, TUM, et al. (Re-)Introduction of the "set" type constructor. * 2012: Ondrej Kuncar, TUM New package Lifting, various improvements and refinements to the Quotient package. * 2011/2012: Jasmin Blanchette, TUM Various improvements to Sledgehammer, notably: tighter integration with SPASS, support for more provers (Alt-Ergo, iProver, iProver-Eq). * 2011/2012: Makarius Wenzel, Université Paris-Sud / LRI Various refinements of local theory infrastructure. Improvements of Isabelle/Scala layer and Isabelle/jEdit Prover IDE. Contributions to Isabelle2011-1 ------------------------------- * September 2011: Peter Gammie Theory HOL/Library/Saturated: numbers with saturated arithmetic. * August 2011: Florian Haftmann, Johannes Hölzl and Lars Noschinski, TUM Refined theory on complete lattices. * August 2011: Brian Huffman, Portland State University Miscellaneous cleanup of Complex_Main and Multivariate_Analysis. * June 2011: Brian Huffman, Portland State University Proof method "countable_datatype" for theory Library/Countable. * 2011: Jasmin Blanchette, TUM Various improvements to Sledgehammer, notably: use of sound translations, support for more provers (Waldmeister, LEO-II, Satallax). Further development of Nitpick and 'try' command. * 2011: Andreas Lochbihler, Karlsruhe Institute of Technology Theory HOL/Library/Cset_Monad allows do notation for computable sets (cset) via the generic monad ad-hoc overloading facility. * 2011: Johannes Hölzl, Armin Heller, TUM and Bogdan Grechuk, University of Edinburgh Theory HOL/Library/Extended_Reals: real numbers extended with plus and minus infinity. * 2011: Makarius Wenzel, Université Paris-Sud / LRI Various building blocks for Isabelle/Scala layer and Isabelle/jEdit Prover IDE. Contributions to Isabelle2011 ----------------------------- * January 2011: Stefan Berghofer, secunet Security Networks AG HOL-SPARK: an interactive prover back-end for SPARK. * October 2010: Bogdan Grechuk, University of Edinburgh Extended convex analysis in Multivariate Analysis. * October 2010: Dmitriy Traytel, TUM Coercive subtyping via subtype constraints. * October 2010: Alexander Krauss, TUM Command partial_function for function definitions based on complete partial orders in HOL. * September 2010: Florian Haftmann, TUM Refined concepts for evaluation, i.e., normalization of terms using different techniques. * September 2010: Florian Haftmann, TUM Code generation for Scala. * August 2010: Johannes Hoelzl, Armin Heller, and Robert Himmelmann, TUM Improved Probability theory in HOL. * July 2010: Florian Haftmann, TUM Reworking and extension of the Imperative HOL framework. * July 2010: Alexander Krauss, TUM and Christian Sternagel, University of Innsbruck Ad-hoc overloading. Generic do notation for monads. Contributions to Isabelle2009-2 ------------------------------- * 2009/2010: Stefan Berghofer, Alexander Krauss, and Andreas Schropp, TUM, Makarius Wenzel, TUM / LRI Elimination of type classes from proof terms. * April 2010: Florian Haftmann, TUM Reorganization of abstract algebra type classes. * April 2010: Florian Haftmann, TUM Code generation for data representations involving invariants; various collections available in theories Fset, Dlist, RBT, Mapping and AssocList. * March 2010: Sascha Boehme, TUM Efficient SHA1 library for Poly/ML. 
* February 2010: Cezary Kaliszyk and Christian Urban, TUM Quotient type package for Isabelle/HOL. Contributions to Isabelle2009-1 ------------------------------- * November 2009, Brian Huffman, PSU New definitional domain package for HOLCF. * November 2009: Robert Himmelmann, TUM Derivation and Brouwer's fixpoint theorem in Multivariate Analysis. * November 2009: Stefan Berghofer and Lukas Bulwahn, TUM A tabled implementation of the reflexive transitive closure. * November 2009: Lukas Bulwahn, TUM Predicate Compiler: a compiler for inductive predicates to equational specifications. * November 2009: Sascha Boehme, TUM and Burkhart Wolff, LRI Paris HOL-Boogie: an interactive prover back-end for Boogie and VCC. * October 2009: Jasmin Blanchette, TUM Nitpick: yet another counterexample generator for Isabelle/HOL. * October 2009: Sascha Boehme, TUM Extension of SMT method: proof-reconstruction for the SMT solver Z3. * October 2009: Florian Haftmann, TUM Refinement of parts of the HOL datatype package. * October 2009: Florian Haftmann, TUM Generic term styles for term antiquotations. * September 2009: Thomas Sewell, NICTA More efficient HOL/record implementation. * September 2009: Sascha Boehme, TUM SMT method using external SMT solvers. * September 2009: Florian Haftmann, TUM Refinement of sets and lattices. * July 2009: Jeremy Avigad and Amine Chaieb New number theory. * July 2009: Philipp Meyer, TUM HOL/Library/Sum_Of_Squares: functionality to call a remote csdp prover. * July 2009: Florian Haftmann, TUM New quickcheck implementation using new code generator. * July 2009: Florian Haftmann, TUM HOL/Library/Fset: an explicit type of sets; finite sets ready to use for code generation. * June 2009: Florian Haftmann, TUM HOL/Library/Tree: search trees implementing mappings, ready to use for code generation. * March 2009: Philipp Meyer, TUM Minimization tool for results from Sledgehammer. Contributions to Isabelle2009 ----------------------------- * March 2009: Robert Himmelmann, TUM and Amine Chaieb, University of Cambridge Elementary topology in Euclidean space. * March 2009: Johannes Hoelzl, TUM Method "approximation", which proves real valued inequalities by computation. * February 2009: Filip Maric, Univ. of Belgrade A Serbian theory. * February 2009: Jasmin Christian Blanchette, TUM Misc cleanup of HOL/refute. * February 2009: Timothy Bourke, NICTA New find_consts command. * February 2009: Timothy Bourke, NICTA "solves" criterion for find_theorems and auto_solve option * December 2008: Clemens Ballarin, TUM New locale implementation. * December 2008: Armin Heller, TUM and Alexander Krauss, TUM Method "sizechange" for advanced termination proofs. * November 2008: Timothy Bourke, NICTA Performance improvement (factor 50) for find_theorems. * 2008: Florian Haftmann, TUM Various extensions and restructurings in HOL, improvements in evaluation mechanisms, new module binding.ML for name bindings. * October 2008: Fabian Immler, TUM ATP manager for Sledgehammer, based on ML threads instead of Posix processes. Additional ATP wrappers, including remote SystemOnTPTP services. * September 2008: Stefan Berghofer, TUM and Marc Bezem, Univ. Bergen Prover for coherent logic. * August 2008: Fabian Immler, TUM Vampire wrapper script for remote SystemOnTPTP service. Contributions to Isabelle2008 ----------------------------- * 2007/2008: Alexander Krauss, TUM and Florian Haftmann, TUM and Stefan Berghofer, TUM HOL library improvements. * 2007/2008: Brian Huffman, PSU HOLCF library improvements. 
* 2007/2008: Stefan Berghofer, TUM HOL-Nominal package improvements. * March 2008: Markus Reiter, TUM HOL/Library/RBT: red-black trees. * February 2008: Alexander Krauss, TUM and Florian Haftmann, TUM and Lukas Bulwahn, TUM and John Matthews, Galois: HOL/Library/Imperative_HOL: Haskell-style imperative data structures for HOL. * December 2007: Norbert Schirmer, Uni Saarbruecken Misc improvements of record package in HOL. * December 2007: Florian Haftmann, TUM Overloading and class instantiation target. * December 2007: Florian Haftmann, TUM New version of primrec package for local theories. * December 2007: Alexander Krauss, TUM Method "induction_scheme" in HOL. * November 2007: Peter Lammich, Uni Muenster HOL-Lattice: some more lemmas. Contributions to Isabelle2007 ----------------------------- * October 2007: Norbert Schirmer, TUM / Uni Saarbruecken State Spaces: The Locale Way (in HOL). * October 2007: Mark A. Hillebrand, DFKI Robust sub/superscripts in LaTeX document output. * August 2007: Jeremy Dawson, NICTA and Paul Graunke, Galois and Brian Huffman, PSU and Gerwin Klein, NICTA and John Matthews, Galois HOL-Word: a library for fixed-size machine words in Isabelle. * August 2007: Brian Huffman, PSU HOL/Library/Boolean_Algebra and HOL/Library/Numeral_Type. * June 2007: Amine Chaieb, TUM Semiring normalization and Groebner Bases. Support for dense linear orders. * June 2007: Joe Hurd, Oxford Metis theorem-prover. * 2007: Kong W. Susanto, Cambridge HOL: Metis prover integration. * 2007: Stefan Berghofer, TUM HOL: inductive predicates and sets. * 2007: Norbert Schirmer, TUM HOL/record: misc improvements. * 2006/2007: Alexander Krauss, TUM HOL: function package and related theories on termination. * 2006/2007: Florian Haftmann, TUM Pure: generic code generator framework. Pure: class package. HOL: theory reorganization, code generator setup. * 2006/2007: Christian Urban, TUM and Stefan Berghofer, TUM and Julien Narboux, TUM HOL/Nominal package and related tools. * November 2006: Lukas Bulwahn, TUM HOL: method "lexicographic_order" for function package. * October 2006: Stefan Hohe, TUM HOL-Algebra: ideals and quotients over rings. * August 2006: Amine Chaieb, TUM Experimental support for generic reflection and reification in HOL. * July 2006: Rafal Kolanski, NICTA Hex (0xFF) and binary (0b1011) numerals. * May 2006: Klaus Aehlig, LMU Command 'normal_form': normalization by evaluation. * May 2006: Amine Chaieb, TUM HOL-Complex: Ferrante and Rackoff Algorithm for linear real arithmetic. * February 2006: Benjamin Porter, NICTA HOL and HOL-Complex: generalised mean value theorem, continuum is not denumerable, harmonic and arithmetic series, and denumerability of rationals. * October 2005: Martin Wildmoser, TUM Sketch for Isar 'guess' element. Contributions to Isabelle2005 ----------------------------- * September 2005: Lukas Bulwahn and Bernhard Haeupler, TUM HOL-Complex: Formalization of Taylor series. * September 2005: Stephan Merz, Alwen Tiu, QSL Loria Components for SAT solver method using zChaff. * September 2005: Ning Zhang and Christian Urban, LMU Munich A Chinese theory. * September 2005: Bernhard Haeupler, TUM Method comm_ring for proving equalities in commutative rings. * July/August 2005: Jeremy Avigad, Carnegie Mellon University Various improvements of the HOL and HOL-Complex library. * July 2005: Florian Zuleger, Johannes Hoelzl, and Simon Funke, TUM Some structured proofs about completeness of real numbers. 
* May 2005: Rafal Kolanski and Gerwin Klein, NICTA Improved retrieval of facts from theory/proof context. * February 2005: Lucas Dixon, University of Edinburgh Improved subst method. * 2005: Brian Huffman, OGI Various improvements of HOLCF. Some improvements of the HOL-Complex library. * 2005: Claire Quigley and Jia Meng, University of Cambridge Some support for asynchronous communication with external provers (experimental). * 2005: Florian Haftmann, TUM Contributions to document 'sugar'. Various ML combinators, notably linear functional transformations. Some cleanup of ML legacy. Additional antiquotations. Improved Isabelle web site. * 2004/2005: David Aspinall, University of Edinburgh Various elements of XML and PGIP based communication with user interfaces (experimental). * 2004/2005: Gerwin Klein, NICTA Contributions to document 'sugar'. Improved Isabelle web site. Improved HTML presentation of theories. * 2004/2005: Clemens Ballarin, TUM Provers: tools for transitive relations and quasi orders. Improved version of locales, notably interpretation of locales. Improved version of HOL-Algebra. * 2004/2005: Amine Chaieb, TUM Improved version of HOL presburger method. * 2004/2005: Steven Obua, TUM Improved version of HOL/Import, support for HOL-Light. Improved version of HOL-Complex-Matrix. Pure/defs: more sophisticated checks on well-formedness of overloading. Pure/Tools: an experimental evaluator for lambda terms. * 2004/2005: Norbert Schirmer, TUM Contributions to document 'sugar'. Improved version of HOL/record. * 2004/2005: Sebastian Skalberg, TUM Improved version of HOL/Import. Some internal ML reorganizations. * 2004/2005: Tjark Weber, TUM SAT solver method using zChaff. Improved version of HOL/refute. :maxLineLen=78: diff --git a/src/HOL/Analysis/Interval_Integral.thy b/src/HOL/Analysis/Interval_Integral.thy --- a/src/HOL/Analysis/Interval_Integral.thy +++ b/src/HOL/Analysis/Interval_Integral.thy @@ -1,1106 +1,1107 @@ (* Title: HOL/Analysis/Interval_Integral.thy Author: Jeremy Avigad (CMU), Johannes Hölzl (TUM), Luke Serafin (CMU) Lebesgue integral over an interval (with endpoints possibly +-\) *) theory Interval_Integral (*FIX ME rename? Lebesgue *) imports Equivalence_Lebesgue_Henstock_Integration begin definition "einterval a b = {x. a < ereal x \ ereal x < b}" lemma einterval_eq[simp]: shows einterval_eq_Icc: "einterval (ereal a) (ereal b) = {a <..< b}" and einterval_eq_Ici: "einterval (ereal a) \ = {a <..}" and einterval_eq_Iic: "einterval (- \) (ereal b) = {..< b}" and einterval_eq_UNIV: "einterval (- \) \ = UNIV" by (auto simp: einterval_def) lemma einterval_same: "einterval a a = {}" by (auto simp: einterval_def) lemma einterval_iff: "x \ einterval a b \ a < ereal x \ ereal x < b" by (simp add: einterval_def) lemma einterval_nonempty: "a < b \ \c. c \ einterval a b" by (cases a b rule: ereal2_cases, auto simp: einterval_def intro!: dense gt_ex lt_ex) lemma open_einterval[simp]: "open (einterval a b)" by (cases a b rule: ereal2_cases) (auto simp: einterval_def intro!: open_Collect_conj open_Collect_less continuous_intros) lemma borel_einterval[measurable]: "einterval a b \ sets borel" unfolding einterval_def by measurable subsection \Approximating a (possibly infinite) interval\ lemma filterlim_sup1: "(LIM x F. f x :> G1) \ (LIM x F. f x :> (sup G1 G2))" unfolding filterlim_def by (auto intro: le_supI1) lemma ereal_incseq_approx: fixes a b :: ereal assumes "a < b" obtains X :: "nat \ real" where "incseq X" "\i. a < X i" "\i. 
X i < b" "X \ b" proof (cases b) case PInf with \a < b\ have "a = -\ \ (\r. a = ereal r)" by (cases a) auto moreover have "(\x. ereal (real (Suc x))) \ \" by (simp add: Lim_PInfty filterlim_sequentially_Suc) (metis le_SucI of_nat_Suc of_nat_mono order_trans real_arch_simple) moreover have "\r. (\x. ereal (r + real (Suc x))) \ \" by (simp add: filterlim_sequentially_Suc Lim_PInfty) (metis add.commute diff_le_eq nat_ceiling_le_eq) ultimately show thesis by (intro that[of "\i. real_of_ereal a + Suc i"]) (auto simp: incseq_def PInf) next case (real b') define d where "d = b' - (if a = -\ then b' - 1 else real_of_ereal a)" with \a < b\ have a': "0 < d" by (cases a) (auto simp: real) moreover have "\i r. r < b' \ (b' - r) * 1 < (b' - r) * real (Suc (Suc i))" by (intro mult_strict_left_mono) auto with \a < b\ a' have "\i. a < ereal (b' - d / real (Suc (Suc i)))" by (cases a) (auto simp: real d_def field_simps) moreover have "(\i. b' - d / real i) \ b'" by (force intro: tendsto_eq_intros tendsto_divide_0[OF tendsto_const] filterlim_sup1 simp: at_infinity_eq_at_top_bot filterlim_real_sequentially) then have "(\i. b' - d / Suc (Suc i)) \ b'" by (blast intro: dest: filterlim_sequentially_Suc [THEN iffD2]) ultimately show thesis by (intro that[of "\i. b' - d / Suc (Suc i)"]) (auto simp: real incseq_def intro!: divide_left_mono) qed (insert \a < b\, auto) lemma ereal_decseq_approx: fixes a b :: ereal assumes "a < b" obtains X :: "nat \ real" where "decseq X" "\i. a < X i" "\i. X i < b" "X \ a" proof - have "-b < -a" using \a < b\ by simp from ereal_incseq_approx[OF this] guess X . then show thesis apply (intro that[of "\i. - X i"]) apply (auto simp: decseq_def incseq_def simp flip: uminus_ereal.simps) apply (metis ereal_minus_less_minus ereal_uminus_uminus ereal_Lim_uminus)+ done qed proposition einterval_Icc_approximation: fixes a b :: ereal assumes "a < b" obtains u l :: "nat \ real" where "einterval a b = (\i. {l i .. u i})" "incseq u" "decseq l" "\i. l i < u i" "\i. a < l i" "\i. u i < b" "l \ a" "u \ b" proof - from dense[OF \a < b\] obtain c where "a < c" "c < b" by safe from ereal_incseq_approx[OF \c < b\] guess u . note u = this from ereal_decseq_approx[OF \a < c\] guess l . note l = this { fix i from less_trans[OF \l i < c\ \c < u i\] have "l i < u i" by simp } have "einterval a b = (\i. {l i .. u i})" proof (auto simp: einterval_iff) fix x assume "a < ereal x" "ereal x < b" have "eventually (\i. ereal (l i) < ereal x) sequentially" using l(4) \a < ereal x\ by (rule order_tendstoD) moreover have "eventually (\i. ereal x < ereal (u i)) sequentially" using u(4) \ereal x< b\ by (rule order_tendstoD) ultimately have "eventually (\i. l i < x \ x < u i) sequentially" by eventually_elim auto then show "\i. l i \ x \ x \ u i" by (auto intro: less_imp_le simp: eventually_sequentially) next fix x i assume "l i \ x" "x \ u i" with \a < ereal (l i)\ \ereal (u i) < b\ show "a < ereal x" "ereal x < b" by (auto simp flip: ereal_less_eq(3)) qed show thesis by (intro that) fact+ qed (* TODO: in this definition, it would be more natural if einterval a b included a and b when they are real. *) definition\<^marker>\tag important\ interval_lebesgue_integral :: "real measure \ ereal \ ereal \ (real \ 'a) \ 'a::{banach, second_countable_topology}" where "interval_lebesgue_integral M a b f = (if a \ b then (LINT x:einterval a b|M. f x) else - (LINT x:einterval b a|M. f x))" syntax "_ascii_interval_lebesgue_integral" :: "pttrn \ real \ real \ real measure \ real \ real" ("(5LINT _=_.._|_. 
_)" [0,60,60,61,100] 60) translations "LINT x=a..b|M. f" == "CONST interval_lebesgue_integral M a b (\x. f)" definition\<^marker>\tag important\ interval_lebesgue_integrable :: "real measure \ ereal \ ereal \ (real \ 'a::{banach, second_countable_topology}) \ bool" where "interval_lebesgue_integrable M a b f = (if a \ b then set_integrable M (einterval a b) f else set_integrable M (einterval b a) f)" syntax "_ascii_interval_lebesgue_borel_integral" :: "pttrn \ real \ real \ real \ real" ("(4LBINT _=_.._. _)" [0,60,60,61] 60) translations "LBINT x=a..b. f" == "CONST interval_lebesgue_integral CONST lborel a b (\x. f)" subsection\Basic properties of integration over an interval\ lemma interval_lebesgue_integral_cong: "a \ b \ (\x. x \ einterval a b \ f x = g x) \ einterval a b \ sets M \ interval_lebesgue_integral M a b f = interval_lebesgue_integral M a b g" by (auto intro: set_lebesgue_integral_cong simp: interval_lebesgue_integral_def) lemma interval_lebesgue_integral_cong_AE: "f \ borel_measurable M \ g \ borel_measurable M \ a \ b \ AE x \ einterval a b in M. f x = g x \ einterval a b \ sets M \ interval_lebesgue_integral M a b f = interval_lebesgue_integral M a b g" by (auto intro: set_lebesgue_integral_cong_AE simp: interval_lebesgue_integral_def) lemma interval_integrable_mirror: shows "interval_lebesgue_integrable lborel a b (\x. f (-x)) \ interval_lebesgue_integrable lborel (-b) (-a) f" proof - have *: "indicator (einterval a b) (- x) = (indicator (einterval (-b) (-a)) x :: real)" for a b :: ereal and x :: real by (cases a b rule: ereal2_cases) (auto simp: einterval_def split: split_indicator) show ?thesis unfolding interval_lebesgue_integrable_def using lborel_integrable_real_affine_iff[symmetric, of "-1" "\x. indicator (einterval _ _) x *\<^sub>R f x" 0] by (simp add: * set_integrable_def) qed lemma interval_lebesgue_integral_add [intro, simp]: fixes M a b f assumes "interval_lebesgue_integrable M a b f" "interval_lebesgue_integrable M a b g" shows "interval_lebesgue_integrable M a b (\x. f x + g x)" and "interval_lebesgue_integral M a b (\x. f x + g x) = interval_lebesgue_integral M a b f + interval_lebesgue_integral M a b g" using assms by (auto simp: interval_lebesgue_integral_def interval_lebesgue_integrable_def field_simps) lemma interval_lebesgue_integral_diff [intro, simp]: fixes M a b f assumes "interval_lebesgue_integrable M a b f" "interval_lebesgue_integrable M a b g" shows "interval_lebesgue_integrable M a b (\x. f x - g x)" and "interval_lebesgue_integral M a b (\x. f x - g x) = interval_lebesgue_integral M a b f - interval_lebesgue_integral M a b g" using assms by (auto simp: interval_lebesgue_integral_def interval_lebesgue_integrable_def field_simps) lemma interval_lebesgue_integrable_mult_right [intro, simp]: fixes M a b c and f :: "real \ 'a::{banach, real_normed_field, second_countable_topology}" shows "(c \ 0 \ interval_lebesgue_integrable M a b f) \ interval_lebesgue_integrable M a b (\x. c * f x)" by (simp add: interval_lebesgue_integrable_def) lemma interval_lebesgue_integrable_mult_left [intro, simp]: fixes M a b c and f :: "real \ 'a::{banach, real_normed_field, second_countable_topology}" shows "(c \ 0 \ interval_lebesgue_integrable M a b f) \ interval_lebesgue_integrable M a b (\x. 
f x * c)" by (simp add: interval_lebesgue_integrable_def) lemma interval_lebesgue_integrable_divide [intro, simp]: fixes M a b c and f :: "real \ 'a::{banach, real_normed_field, field, second_countable_topology}" shows "(c \ 0 \ interval_lebesgue_integrable M a b f) \ interval_lebesgue_integrable M a b (\x. f x / c)" by (simp add: interval_lebesgue_integrable_def) lemma interval_lebesgue_integral_mult_right [simp]: fixes M a b c and f :: "real \ 'a::{banach, real_normed_field, second_countable_topology}" shows "interval_lebesgue_integral M a b (\x. c * f x) = c * interval_lebesgue_integral M a b f" by (simp add: interval_lebesgue_integral_def) lemma interval_lebesgue_integral_mult_left [simp]: fixes M a b c and f :: "real \ 'a::{banach, real_normed_field, second_countable_topology}" shows "interval_lebesgue_integral M a b (\x. f x * c) = interval_lebesgue_integral M a b f * c" by (simp add: interval_lebesgue_integral_def) lemma interval_lebesgue_integral_divide [simp]: fixes M a b c and f :: "real \ 'a::{banach, real_normed_field, field, second_countable_topology}" shows "interval_lebesgue_integral M a b (\x. f x / c) = interval_lebesgue_integral M a b f / c" by (simp add: interval_lebesgue_integral_def) lemma interval_lebesgue_integral_uminus: "interval_lebesgue_integral M a b (\x. - f x) = - interval_lebesgue_integral M a b f" by (auto simp: interval_lebesgue_integral_def interval_lebesgue_integrable_def set_lebesgue_integral_def) lemma interval_lebesgue_integral_of_real: "interval_lebesgue_integral M a b (\x. complex_of_real (f x)) = of_real (interval_lebesgue_integral M a b f)" unfolding interval_lebesgue_integral_def by (auto simp: interval_lebesgue_integral_def set_integral_complex_of_real) lemma interval_lebesgue_integral_le_eq: fixes a b f assumes "a \ b" shows "interval_lebesgue_integral M a b f = (LINT x : einterval a b | M. f x)" using assms by (auto simp: interval_lebesgue_integral_def) lemma interval_lebesgue_integral_gt_eq: fixes a b f assumes "a > b" shows "interval_lebesgue_integral M a b f = -(LINT x : einterval b a | M. f x)" using assms by (auto simp: interval_lebesgue_integral_def less_imp_le einterval_def) lemma interval_lebesgue_integral_gt_eq': fixes a b f assumes "a > b" shows "interval_lebesgue_integral M a b f = - interval_lebesgue_integral M b a f" using assms by (auto simp: interval_lebesgue_integral_def less_imp_le einterval_def) lemma interval_integral_endpoints_same [simp]: "(LBINT x=a..a. f x) = 0" by (simp add: interval_lebesgue_integral_def set_lebesgue_integral_def einterval_same) lemma interval_integral_endpoints_reverse: "(LBINT x=a..b. f x) = -(LBINT x=b..a. f x)" by (cases a b rule: linorder_cases) (auto simp: interval_lebesgue_integral_def set_lebesgue_integral_def einterval_same) lemma interval_integrable_endpoints_reverse: "interval_lebesgue_integrable lborel a b f \ interval_lebesgue_integrable lborel b a f" by (cases a b rule: linorder_cases) (auto simp: interval_lebesgue_integrable_def einterval_same) lemma interval_integral_reflect: "(LBINT x=a..b. f x) = (LBINT x=-b..-a. f (-x))" proof (induct a b rule: linorder_wlog) case (sym a b) then show ?case by (auto simp: interval_lebesgue_integral_def interval_integrable_endpoints_reverse split: if_split_asm) next case (le a b) have "LBINT x:{x. - x \ einterval a b}. f (- x) = LBINT x:einterval (- b) (- a). 
f (- x)" unfolding interval_lebesgue_integrable_def set_lebesgue_integral_def apply (rule Bochner_Integration.integral_cong [OF refl]) by (auto simp: einterval_iff ereal_uminus_le_reorder ereal_uminus_less_reorder not_less simp flip: uminus_ereal.simps split: split_indicator) then show ?case unfolding interval_lebesgue_integral_def by (subst set_integral_reflect) (simp add: le) qed lemma interval_lebesgue_integral_0_infty: "interval_lebesgue_integrable M 0 \ f \ set_integrable M {0<..} f" "interval_lebesgue_integral M 0 \ f = (LINT x:{0<..}|M. f x)" unfolding zero_ereal_def by (auto simp: interval_lebesgue_integral_le_eq interval_lebesgue_integrable_def) lemma interval_integral_to_infinity_eq: "(LINT x=ereal a..\ | M. f x) = (LINT x : {a<..} | M. f x)" unfolding interval_lebesgue_integral_def by auto proposition interval_integrable_to_infinity_eq: "(interval_lebesgue_integrable M a \ f) = (set_integrable M {a<..} f)" unfolding interval_lebesgue_integrable_def by auto subsection\Basic properties of integration over an interval wrt lebesgue measure\ lemma interval_integral_zero [simp]: fixes a b :: ereal shows "LBINT x=a..b. 0 = 0" unfolding interval_lebesgue_integral_def set_lebesgue_integral_def einterval_eq by simp lemma interval_integral_const [intro, simp]: fixes a b c :: real shows "interval_lebesgue_integrable lborel a b (\x. c)" and "LBINT x=a..b. c = c * (b - a)" unfolding interval_lebesgue_integral_def interval_lebesgue_integrable_def einterval_eq by (auto simp: less_imp_le field_simps measure_def set_integrable_def set_lebesgue_integral_def) lemma interval_integral_cong_AE: assumes [measurable]: "f \ borel_measurable borel" "g \ borel_measurable borel" assumes "AE x \ einterval (min a b) (max a b) in lborel. f x = g x" shows "interval_lebesgue_integral lborel a b f = interval_lebesgue_integral lborel a b g" using assms proof (induct a b rule: linorder_wlog) case (sym a b) then show ?case by (simp add: min.commute max.commute interval_integral_endpoints_reverse[of a b]) next case (le a b) then show ?case by (auto simp: interval_lebesgue_integral_def max_def min_def intro!: set_lebesgue_integral_cong_AE) qed lemma interval_integral_cong: assumes "\x. x \ einterval (min a b) (max a b) \ f x = g x" shows "interval_lebesgue_integral lborel a b f = interval_lebesgue_integral lborel a b g" using assms proof (induct a b rule: linorder_wlog) case (sym a b) then show ?case by (simp add: min.commute max.commute interval_integral_endpoints_reverse[of a b]) next case (le a b) then show ?case by (auto simp: interval_lebesgue_integral_def max_def min_def intro!: set_lebesgue_integral_cong) qed lemma interval_lebesgue_integrable_cong_AE: "f \ borel_measurable lborel \ g \ borel_measurable lborel \ AE x \ einterval (min a b) (max a b) in lborel. f x = g x \ interval_lebesgue_integrable lborel a b f = interval_lebesgue_integrable lborel a b g" apply (simp add: interval_lebesgue_integrable_def) apply (intro conjI impI set_integrable_cong_AE) apply (auto simp: min_def max_def) done lemma interval_integrable_abs_iff: fixes f :: "real \ real" shows "f \ borel_measurable lborel \ interval_lebesgue_integrable lborel a b (\x. \f x\) = interval_lebesgue_integrable lborel a b f" unfolding interval_lebesgue_integrable_def by (subst (1 2) set_integrable_abs_iff') simp_all lemma interval_integral_Icc: fixes a b :: real shows "a \ b \ (LBINT x=a..b. f x) = (LBINT x : {a..b}. 
f x)" by (auto intro!: set_integral_discrete_difference[where X="{a, b}"] simp add: interval_lebesgue_integral_def) lemma interval_integral_Icc': "a \ b \ (LBINT x=a..b. f x) = (LBINT x : {x. a \ ereal x \ ereal x \ b}. f x)" by (auto intro!: set_integral_discrete_difference[where X="{real_of_ereal a, real_of_ereal b}"] simp add: interval_lebesgue_integral_def einterval_iff) lemma interval_integral_Ioc: "a \ b \ (LBINT x=a..b. f x) = (LBINT x : {a<..b}. f x)" by (auto intro!: set_integral_discrete_difference[where X="{a, b}"] simp add: interval_lebesgue_integral_def einterval_iff) (* TODO: other versions as well? *) (* Yes: I need the Icc' version. *) lemma interval_integral_Ioc': "a \ b \ (LBINT x=a..b. f x) = (LBINT x : {x. a < ereal x \ ereal x \ b}. f x)" by (auto intro!: set_integral_discrete_difference[where X="{real_of_ereal a, real_of_ereal b}"] simp add: interval_lebesgue_integral_def einterval_iff) lemma interval_integral_Ico: "a \ b \ (LBINT x=a..b. f x) = (LBINT x : {a..a\ < \ \ (LBINT x=a..\. f x) = (LBINT x : {real_of_ereal a <..}. f x)" by (auto simp: interval_lebesgue_integral_def einterval_iff) lemma interval_integral_Ioo: "a \ b \ \a\ < \ ==> \b\ < \ \ (LBINT x=a..b. f x) = (LBINT x : {real_of_ereal a <..< real_of_ereal b}. f x)" by (auto simp: interval_lebesgue_integral_def einterval_iff) lemma interval_integral_discrete_difference: fixes f :: "real \ 'b::{banach, second_countable_topology}" and a b :: ereal assumes "countable X" and eq: "\x. a \ b \ a < x \ x < b \ x \ X \ f x = g x" and anti_eq: "\x. b \ a \ b < x \ x < a \ x \ X \ f x = g x" assumes "\x. x \ X \ emeasure M {x} = 0" "\x. x \ X \ {x} \ sets M" shows "interval_lebesgue_integral M a b f = interval_lebesgue_integral M a b g" unfolding interval_lebesgue_integral_def set_lebesgue_integral_def apply (intro if_cong refl arg_cong[where f="\x. - x"] integral_discrete_difference[of X] assms) apply (auto simp: eq anti_eq einterval_iff split: split_indicator) done lemma interval_integral_sum: fixes a b c :: ereal assumes integrable: "interval_lebesgue_integrable lborel (min a (min b c)) (max a (max b c)) f" shows "(LBINT x=a..b. f x) + (LBINT x=b..c. f x) = (LBINT x=a..c. f x)" proof - let ?I = "\a b. LBINT x=a..b. f x" { fix a b c :: ereal assume "interval_lebesgue_integrable lborel a c f" "a \ b" "b \ c" then have ord: "a \ b" "b \ c" "a \ c" and f': "set_integrable lborel (einterval a c) f" by (auto simp: interval_lebesgue_integrable_def) then have f: "set_borel_measurable borel (einterval a c) f" unfolding set_integrable_def set_borel_measurable_def by (drule_tac borel_measurable_integrable) simp have "(LBINT x:einterval a c. f x) = (LBINT x:einterval a b \ einterval b c. f x)" proof (rule set_integral_cong_set) show "AE x in lborel. (x \ einterval a b \ einterval b c) = (x \ einterval a c)" using AE_lborel_singleton[of "real_of_ereal b"] ord by (cases a b c rule: ereal3_cases) (auto simp: einterval_iff) show "set_borel_measurable lborel (einterval a c) f" "set_borel_measurable lborel (einterval a b \ einterval b c) f" unfolding set_borel_measurable_def using ord by (auto simp: einterval_iff intro!: set_borel_measurable_subset[OF f, unfolded set_borel_measurable_def]) qed also have "\ = (LBINT x:einterval a b. f x) + (LBINT x:einterval b c. 
f x)" using ord by (intro set_integral_Un_AE) (auto intro!: set_integrable_subset[OF f'] simp: einterval_iff not_less) finally have "?I a b + ?I b c = ?I a c" using ord by (simp add: interval_lebesgue_integral_def) } note 1 = this { fix a b c :: ereal assume "interval_lebesgue_integrable lborel a c f" "a \ b" "b \ c" from 1[OF this] have "?I b c + ?I a b = ?I a c" by (metis add.commute) } note 2 = this have 3: "\a b. b \ a \ (LBINT x=a..b. f x) = - (LBINT x=b..a. f x)" by (rule interval_integral_endpoints_reverse) show ?thesis using integrable - by (cases a b b c a c rule: linorder_le_cases[case_product linorder_le_cases linorder_cases]) - (simp_all add: min_absorb1 min_absorb2 max_absorb1 max_absorb2 field_simps 1 2 3) + apply (cases a b b c a c rule: linorder_le_cases[case_product linorder_le_cases linorder_cases]) + apply simp_all (* remove some looping cases *) + by (simp_all add: min_absorb1 min_absorb2 max_absorb1 max_absorb2 field_simps 1 2 3) qed lemma interval_integrable_isCont: fixes a b and f :: "real \ 'a::{banach, second_countable_topology}" shows "(\x. min a b \ x \ x \ max a b \ isCont f x) \ interval_lebesgue_integrable lborel a b f" proof (induct a b rule: linorder_wlog) case (le a b) then show ?case unfolding interval_lebesgue_integrable_def set_integrable_def by (auto simp: interval_lebesgue_integrable_def intro!: set_integrable_subset[unfolded set_integrable_def, OF borel_integrable_compact[of "{a .. b}"]] continuous_at_imp_continuous_on) qed (auto intro: interval_integrable_endpoints_reverse[THEN iffD1]) lemma interval_integrable_continuous_on: fixes a b :: real and f assumes "a \ b" and "continuous_on {a..b} f" shows "interval_lebesgue_integrable lborel a b f" using assms unfolding interval_lebesgue_integrable_def apply simp by (rule set_integrable_subset, rule borel_integrable_atLeastAtMost' [of a b], auto) lemma interval_integral_eq_integral: fixes f :: "real \ 'a::euclidean_space" shows "a \ b \ set_integrable lborel {a..b} f \ LBINT x=a..b. f x = integral {a..b} f" by (subst interval_integral_Icc, simp) (rule set_borel_integral_eq_integral) lemma interval_integral_eq_integral': fixes f :: "real \ 'a::euclidean_space" shows "a \ b \ set_integrable lborel (einterval a b) f \ LBINT x=a..b. f x = integral (einterval a b) f" by (subst interval_lebesgue_integral_le_eq, simp) (rule set_borel_integral_eq_integral) subsection\General limit approximation arguments\ proposition interval_integral_Icc_approx_nonneg: fixes a b :: ereal assumes "a < b" fixes u l :: "nat \ real" assumes approx: "einterval a b = (\i. {l i .. u i})" "incseq u" "decseq l" "\i. l i < u i" "\i. a < l i" "\i. u i < b" "l \ a" "u \ b" fixes f :: "real \ real" assumes f_integrable: "\i. set_integrable lborel {l i..u i} f" assumes f_nonneg: "AE x in lborel. a < ereal x \ ereal x < b \ 0 \ f x" assumes f_measurable: "set_borel_measurable lborel (einterval a b) f" assumes lbint_lim: "(\i. LBINT x=l i.. u i. f x) \ C" shows "set_integrable lborel (einterval a b) f" "(LBINT x=a..b. f x) = C" proof - have 1 [unfolded set_integrable_def]: "\i. set_integrable lborel {l i..u i} f" by (rule f_integrable) have 2: "AE x in lborel. mono (\n. indicator {l n..u n} x *\<^sub>R f x)" proof - from f_nonneg have "AE x in lborel. \i. 
l i \ x \ x \ u i \ 0 \ f x" by eventually_elim (metis approx(5) approx(6) dual_order.strict_trans1 ereal_less_eq(3) le_less_trans) then show ?thesis apply eventually_elim apply (auto simp: mono_def split: split_indicator) apply (metis approx(3) decseqD order_trans) apply (metis approx(2) incseqD order_trans) done qed have 3: "AE x in lborel. (\i. indicator {l i..u i} x *\<^sub>R f x) \ indicator (einterval a b) x *\<^sub>R f x" proof - { fix x i assume "l i \ x" "x \ u i" then have "eventually (\i. l i \ x \ x \ u i) sequentially" apply (auto simp: eventually_sequentially intro!: exI[of _ i]) apply (metis approx(3) decseqD order_trans) apply (metis approx(2) incseqD order_trans) done then have "eventually (\i. f x * indicator {l i..u i} x = f x) sequentially" by eventually_elim auto } then show ?thesis unfolding approx(1) by (auto intro!: AE_I2 tendsto_eventually split: split_indicator) qed have 4: "(\i. \ x. indicator {l i..u i} x *\<^sub>R f x \lborel) \ C" using lbint_lim by (simp add: interval_integral_Icc [unfolded set_lebesgue_integral_def] approx less_imp_le) have 5: "(\x. indicat_real (einterval a b) x *\<^sub>R f x) \ borel_measurable lborel" using f_measurable set_borel_measurable_def by blast have "(LBINT x=a..b. f x) = lebesgue_integral lborel (\x. indicator (einterval a b) x *\<^sub>R f x)" using assms by (simp add: interval_lebesgue_integral_def set_lebesgue_integral_def less_imp_le) also have "\ = C" by (rule integral_monotone_convergence [OF 1 2 3 4 5]) finally show "(LBINT x=a..b. f x) = C" . show "set_integrable lborel (einterval a b) f" unfolding set_integrable_def by (rule integrable_monotone_convergence[OF 1 2 3 4 5]) qed proposition interval_integral_Icc_approx_integrable: fixes u l :: "nat \ real" and a b :: ereal fixes f :: "real \ 'a::{banach, second_countable_topology}" assumes "a < b" assumes approx: "einterval a b = (\i. {l i .. u i})" "incseq u" "decseq l" "\i. l i < u i" "\i. a < l i" "\i. u i < b" "l \ a" "u \ b" assumes f_integrable: "set_integrable lborel (einterval a b) f" shows "(\i. LBINT x=l i.. u i. f x) \ (LBINT x=a..b. f x)" proof - have "(\i. LBINT x:{l i.. u i}. f x) \ (LBINT x:einterval a b. f x)" unfolding set_lebesgue_integral_def proof (rule integral_dominated_convergence) show "integrable lborel (\x. norm (indicator (einterval a b) x *\<^sub>R f x))" using f_integrable integrable_norm set_integrable_def by blast show "(\x. indicat_real (einterval a b) x *\<^sub>R f x) \ borel_measurable lborel" using f_integrable by (simp add: set_integrable_def) then show "\i. (\x. indicat_real {l i..u i} x *\<^sub>R f x) \ borel_measurable lborel" by (rule set_borel_measurable_subset [unfolded set_borel_measurable_def]) (auto simp: approx) show "\i. AE x in lborel. norm (indicator {l i..u i} x *\<^sub>R f x) \ norm (indicator (einterval a b) x *\<^sub>R f x)" by (intro AE_I2) (auto simp: approx split: split_indicator) show "AE x in lborel. (\i. indicator {l i..u i} x *\<^sub>R f x) \ indicator (einterval a b) x *\<^sub>R f x" proof (intro AE_I2 tendsto_intros tendsto_eventually) fix x { fix i assume "l i \ x" "x \ u i" with \incseq u\[THEN incseqD, of i] \decseq l\[THEN decseqD, of i] have "eventually (\i. l i \ x \ x \ u i) sequentially" by (auto simp: eventually_sequentially decseq_def incseq_def intro: order_trans) } then show "eventually (\xa. 
indicator {l xa..u xa} x = (indicator (einterval a b) x::real)) sequentially" using approx order_tendstoD(2)[OF \l \ a\, of x] order_tendstoD(1)[OF \u \ b\, of x] by (auto split: split_indicator) qed qed with \a < b\ \\i. l i < u i\ show ?thesis by (simp add: interval_lebesgue_integral_le_eq[symmetric] interval_integral_Icc less_imp_le) qed subsection\A slightly stronger Fundamental Theorem of Calculus\ text\Three versions: first, for finite intervals, and then two versions for arbitrary intervals.\ (* TODO: make the older versions corollaries of these (using continuous_at_imp_continuous_on, etc.) *) lemma interval_integral_FTC_finite: fixes f F :: "real \ 'a::euclidean_space" and a b :: real assumes f: "continuous_on {min a b..max a b} f" assumes F: "\x. min a b \ x \ x \ max a b \ (F has_vector_derivative (f x)) (at x within {min a b..max a b})" shows "(LBINT x=a..b. f x) = F b - F a" proof (cases "a \ b") case True have "(LBINT x=a..b. f x) = (LBINT x. indicat_real {a..b} x *\<^sub>R f x)" by (simp add: True interval_integral_Icc set_lebesgue_integral_def) also have "\ = F b - F a" proof (rule integral_FTC_atLeastAtMost [OF True]) show "continuous_on {a..b} f" using True f by linarith show "\x. \a \ x; x \ b\ \ (F has_vector_derivative f x) (at x within {a..b})" by (metis F True max.commute max_absorb1 min_def) qed finally show ?thesis . next case False then have "b \ a" by simp have "- interval_lebesgue_integral lborel (ereal b) (ereal a) f = - (LBINT x. indicat_real {b..a} x *\<^sub>R f x)" by (simp add: \b \ a\ interval_integral_Icc set_lebesgue_integral_def) also have "\ = F b - F a" proof (subst integral_FTC_atLeastAtMost [OF \b \ a\]) show "continuous_on {b..a} f" using False f by linarith show "\x. \b \ x; x \ a\ \ (F has_vector_derivative f x) (at x within {b..a})" by (metis F False max_def min_def) qed auto finally show ?thesis by (metis interval_integral_endpoints_reverse) qed lemma interval_integral_FTC_nonneg: fixes f F :: "real \ real" and a b :: ereal assumes "a < b" assumes F: "\x. a < ereal x \ ereal x < b \ DERIV F x :> f x" assumes f: "\x. a < ereal x \ ereal x < b \ isCont f x" assumes f_nonneg: "AE x in lborel. a < ereal x \ ereal x < b \ 0 \ f x" assumes A: "((F \ real_of_ereal) \ A) (at_right a)" assumes B: "((F \ real_of_ereal) \ B) (at_left b)" shows "set_integrable lborel (einterval a b) f" "(LBINT x=a..b. f x) = B - A" proof - obtain u l where approx: "einterval a b = (\i. {l i .. u i})" "incseq u" "decseq l" "\i. l i < u i" "\i. a < l i" "\i. u i < b" "l \ a" "u \ b" by (blast intro: einterval_Icc_approximation[OF \a < b\]) have [simp]: "\x i. l i \ x \ a < ereal x" by (rule order_less_le_trans, rule approx, force) have [simp]: "\x i. x \ u i \ ereal x < b" by (rule order_le_less_trans, subst ereal_less_eq(3), assumption, rule approx) have FTCi: "\i. (LBINT x=l i..u i. f x) = F (u i) - F (l i)" using assms approx apply (intro interval_integral_FTC_finite) apply (auto simp: less_imp_le min_def max_def has_field_derivative_iff_has_vector_derivative[symmetric]) apply (rule continuous_at_imp_continuous_on, auto intro!: f) by (rule DERIV_subset [OF F], auto) have 1: "\i. set_integrable lborel {l i..u i} f" proof - fix i show "set_integrable lborel {l i .. 
u i} f" using \a < l i\ \u i < b\ unfolding set_integrable_def by (intro borel_integrable_compact f continuous_at_imp_continuous_on compact_Icc ballI) (auto simp flip: ereal_less_eq) qed have 2: "set_borel_measurable lborel (einterval a b) f" unfolding set_borel_measurable_def by (auto simp del: real_scaleR_def intro!: borel_measurable_continuous_on_indicator simp: continuous_on_eq_continuous_at einterval_iff f) have 3: "(\i. LBINT x=l i..u i. f x) \ B - A" apply (subst FTCi) apply (intro tendsto_intros) using B approx unfolding tendsto_at_iff_sequentially comp_def using tendsto_at_iff_sequentially[where 'a=real] apply (elim allE[of _ "\i. ereal (u i)"], auto) using A approx unfolding tendsto_at_iff_sequentially comp_def by (elim allE[of _ "\i. ereal (l i)"], auto) show "(LBINT x=a..b. f x) = B - A" by (rule interval_integral_Icc_approx_nonneg [OF \a < b\ approx 1 f_nonneg 2 3]) show "set_integrable lborel (einterval a b) f" by (rule interval_integral_Icc_approx_nonneg [OF \a < b\ approx 1 f_nonneg 2 3]) qed theorem interval_integral_FTC_integrable: fixes f F :: "real \ 'a::euclidean_space" and a b :: ereal assumes "a < b" assumes F: "\x. a < ereal x \ ereal x < b \ (F has_vector_derivative f x) (at x)" assumes f: "\x. a < ereal x \ ereal x < b \ isCont f x" assumes f_integrable: "set_integrable lborel (einterval a b) f" assumes A: "((F \ real_of_ereal) \ A) (at_right a)" assumes B: "((F \ real_of_ereal) \ B) (at_left b)" shows "(LBINT x=a..b. f x) = B - A" proof - obtain u l where approx: "einterval a b = (\i. {l i .. u i})" "incseq u" "decseq l" "\i. l i < u i" "\i. a < l i" "\i. u i < b" "l \ a" "u \ b" by (blast intro: einterval_Icc_approximation[OF \a < b\]) have [simp]: "\x i. l i \ x \ a < ereal x" by (rule order_less_le_trans, rule approx, force) have [simp]: "\x i. x \ u i \ ereal x < b" by (rule order_le_less_trans, subst ereal_less_eq(3), assumption, rule approx) have FTCi: "\i. (LBINT x=l i..u i. f x) = F (u i) - F (l i)" using assms approx by (auto simp: less_imp_le min_def max_def intro!: f continuous_at_imp_continuous_on interval_integral_FTC_finite intro: has_vector_derivative_at_within) have "(\i. LBINT x=l i..u i. f x) \ B - A" unfolding FTCi proof (intro tendsto_intros) show "(\x. F (l x)) \ A" using A approx unfolding tendsto_at_iff_sequentially comp_def by (elim allE[of _ "\i. ereal (l i)"], auto) show "(\x. F (u x)) \ B" using B approx unfolding tendsto_at_iff_sequentially comp_def by (elim allE[of _ "\i. ereal (u i)"], auto) qed moreover have "(\i. LBINT x=l i..u i. f x) \ (LBINT x=a..b. f x)" by (rule interval_integral_Icc_approx_integrable [OF \a < b\ approx f_integrable]) ultimately show ?thesis by (elim LIMSEQ_unique) qed (* The second Fundamental Theorem of Calculus and existence of antiderivatives on an einterval. *) theorem interval_integral_FTC2: fixes a b c :: real and f :: "real \ 'a::euclidean_space" assumes "a \ c" "c \ b" and contf: "continuous_on {a..b} f" fixes x :: real assumes "a \ x" and "x \ b" shows "((\u. LBINT y=c..u. f y) has_vector_derivative (f x)) (at x within {a..b})" proof - let ?F = "(\u. LBINT y=a..u. f y)" have intf: "set_integrable lborel {a..b} f" by (rule borel_integrable_atLeastAtMost', rule contf) have "((\u. integral {a..u} f) has_vector_derivative f x) (at x within {a..b})" using \a \ x\ \x \ b\ by (auto intro: integral_has_vector_derivative continuous_on_subset [OF contf]) then have "((\u. 
integral {a..u} f) has_vector_derivative (f x)) (at x within {a..b})" by simp then have "(?F has_vector_derivative (f x)) (at x within {a..b})" by (rule has_vector_derivative_weaken) (auto intro!: assms interval_integral_eq_integral[symmetric] set_integrable_subset [OF intf]) then have "((\x. (LBINT y=c..a. f y) + ?F x) has_vector_derivative (f x)) (at x within {a..b})" by (auto intro!: derivative_eq_intros) then show ?thesis proof (rule has_vector_derivative_weaken) fix u assume "u \ {a .. b}" then show "(LBINT y=c..a. f y) + (LBINT y=a..u. f y) = (LBINT y=c..u. f y)" using assms apply (intro interval_integral_sum) apply (auto simp: interval_lebesgue_integrable_def simp del: real_scaleR_def) by (rule set_integrable_subset [OF intf], auto simp: min_def max_def) qed (insert assms, auto) qed proposition einterval_antiderivative: fixes a b :: ereal and f :: "real \ 'a::euclidean_space" assumes "a < b" and contf: "\x :: real. a < x \ x < b \ isCont f x" shows "\F. \x :: real. a < x \ x < b \ (F has_vector_derivative f x) (at x)" proof - from einterval_nonempty [OF \a < b\] obtain c :: real where [simp]: "a < c" "c < b" by (auto simp: einterval_def) let ?F = "(\u. LBINT y=c..u. f y)" show ?thesis proof (rule exI, clarsimp) fix x :: real assume [simp]: "a < x" "x < b" have 1: "a < min c x" by simp from einterval_nonempty [OF 1] obtain d :: real where [simp]: "a < d" "d < c" "d < x" by (auto simp: einterval_def) have 2: "max c x < b" by simp from einterval_nonempty [OF 2] obtain e :: real where [simp]: "c < e" "x < e" "e < b" by (auto simp: einterval_def) have "(?F has_vector_derivative f x) (at x within {d<..x. \d \ x; x \ e\ \ a < ereal x" using \a < ereal d\ ereal_less_ereal_Ex by auto show "\x. \d \ x; x \ e\ \ ereal x < b" using \ereal e < b\ ereal_less_eq(3) le_less_trans by blast qed then show "(?F has_vector_derivative f x) (at x within {d..e})" by (intro interval_integral_FTC2) (use \d < c\ \c < e\ \d < x\ \x < e\ in \linarith+\) qed auto then show "(?F has_vector_derivative f x) (at x)" by (force simp: has_vector_derivative_within_open [of _ "{d<..The substitution theorem\ text\Once again, three versions: first, for finite intervals, and then two versions for arbitrary intervals.\ theorem interval_integral_substitution_finite: fixes a b :: real and f :: "real \ 'a::euclidean_space" assumes "a \ b" and derivg: "\x. a \ x \ x \ b \ (g has_real_derivative (g' x)) (at x within {a..b})" and contf : "continuous_on (g ` {a..b}) f" and contg': "continuous_on {a..b} g'" shows "LBINT x=a..b. g' x *\<^sub>R f (g x) = LBINT y=g a..g b. f y" proof- have v_derivg: "\x. a \ x \ x \ b \ (g has_vector_derivative (g' x)) (at x within {a..b})" using derivg unfolding has_field_derivative_iff_has_vector_derivative . then have contg [simp]: "continuous_on {a..b} g" by (rule continuous_on_vector_derivative) auto have 1: "\x\{a..b}. u = g x" if "min (g a) (g b) \ u" "u \ max (g a) (g b)" for u by (cases "g a \ g b") (use that assms IVT' [of g a u b] IVT2' [of g b u a] in \auto simp: min_def max_def\) obtain c d where g_im: "g ` {a..b} = {c..d}" and "c \ d" by (metis continuous_image_closed_interval contg \a \ b\) obtain F where derivF: "\x. \a \ x; x \ b\ \ (F has_vector_derivative (f (g x))) (at (g x) within (g ` {a..b}))" using continuous_on_subset [OF contf] g_im by (metis antiderivative_continuous atLeastAtMost_iff image_subset_iff set_eq_subset) have contfg: "continuous_on {a..b} (\x. f (g x))" by (blast intro: continuous_on_compose2 contf contg) have "LBINT x. 
indicat_real {a..b} x *\<^sub>R g' x *\<^sub>R f (g x) = F (g b) - F (g a)" apply (rule integral_FTC_atLeastAtMost [OF \a \ b\ vector_diff_chain_within[OF v_derivg derivF, unfolded comp_def]]) apply (auto intro!: continuous_on_scaleR contg' contfg) done then have "LBINT x=a..b. g' x *\<^sub>R f (g x) = F (g b) - F (g a)" by (simp add: assms interval_integral_Icc set_lebesgue_integral_def) moreover have "LBINT y=(g a)..(g b). f y = F (g b) - F (g a)" proof (rule interval_integral_FTC_finite) show "continuous_on {min (g a) (g b)..max (g a) (g b)} f" by (rule continuous_on_subset [OF contf]) (auto simp: image_def 1) show "(F has_vector_derivative f y) (at y within {min (g a) (g b)..max (g a) (g b)})" if y: "min (g a) (g b) \ y" "y \ max (g a) (g b)" for y proof - obtain x where "a \ x" "x \ b" "y = g x" using 1 y by force then show ?thesis by (auto simp: image_def intro!: 1 has_vector_derivative_within_subset [OF derivF]) qed qed ultimately show ?thesis by simp qed (* TODO: is it possible to lift the assumption here that g' is nonnegative? *) theorem interval_integral_substitution_integrable: fixes f :: "real \ 'a::euclidean_space" and a b u v :: ereal assumes "a < b" and deriv_g: "\x. a < ereal x \ ereal x < b \ DERIV g x :> g' x" and contf: "\x. a < ereal x \ ereal x < b \ isCont f (g x)" and contg': "\x. a < ereal x \ ereal x < b \ isCont g' x" and g'_nonneg: "\x. a \ ereal x \ ereal x \ b \ 0 \ g' x" and A: "((ereal \ g \ real_of_ereal) \ A) (at_right a)" and B: "((ereal \ g \ real_of_ereal) \ B) (at_left b)" and integrable: "set_integrable lborel (einterval a b) (\x. g' x *\<^sub>R f (g x))" and integrable2: "set_integrable lborel (einterval A B) (\x. f x)" shows "(LBINT x=A..B. f x) = (LBINT x=a..b. g' x *\<^sub>R f (g x))" proof - obtain u l where approx [simp]: "einterval a b = (\i. {l i .. u i})" "incseq u" "decseq l" "\i. l i < u i" "\i. a < l i" "\i. u i < b" "l \ a" "u \ b" by (blast intro: einterval_Icc_approximation[OF \a < b\]) note less_imp_le [simp] have [simp]: "\x i. l i \ x \ a < ereal x" by (rule order_less_le_trans, rule approx, force) have [simp]: "\x i. x \ u i \ ereal x < b" by (rule order_le_less_trans, subst ereal_less_eq(3), assumption, rule approx) then have lessb[simp]: "\i. l i < b" using approx(4) less_eq_real_def by blast have [simp]: "\i. a < u i" by (rule order_less_trans, rule approx, auto, rule approx) have lle[simp]: "\i j. i \ j \ l j \ l i" by (rule decseqD, rule approx) have [simp]: "\i j. i \ j \ u i \ u j" by (rule incseqD, rule approx) have g_nondec [simp]: "g x \ g y" if "a < x" "x \ y" "y < b" for x y proof (rule DERIV_nonneg_imp_nondecreasing [OF \x \ y\], intro exI conjI) show "\u. x \ u \ u \ y \ (g has_real_derivative g' u) (at u)" by (meson deriv_g ereal_less_eq(3) le_less_trans less_le_trans that) show "\u. x \ u \ u \ y \ 0 \ g' u" by (meson assms(5) dual_order.trans le_ereal_le less_imp_le order_refl that) qed have "A \ B" and un: "einterval A B = (\i. {g(l i)<..i. g (l i)) \ A" using A apply (auto simp: einterval_def tendsto_at_iff_sequentially comp_def) by (drule_tac x = "\i. ereal (l i)" in spec, auto) hence A3: "\i. g (l i) \ A" by (intro decseq_ge, auto simp: decseq_def) have B2: "(\i. g (u i)) \ B" using B apply (auto simp: einterval_def tendsto_at_iff_sequentially comp_def) by (drule_tac x = "\i. ereal (u i)" in spec, auto) hence B3: "\i. 
g (u i) \ B" by (intro incseq_le, auto simp: incseq_def) have "ereal (g (l 0)) \ ereal (g (u 0))" by auto then show "A \ B" by (meson A3 B3 order.trans) { fix x :: real assume "A < x" and "x < B" then have "eventually (\i. ereal (g (l i)) < x \ x < ereal (g (u i))) sequentially" by (fast intro: eventually_conj order_tendstoD A2 B2) hence "\i. g (l i) < x \ x < g (u i)" by (simp add: eventually_sequentially, auto) } note AB = this show "einterval A B = (\i. {g(l i)<.. (\i. {g(l i)<..i. {g(l i)<.. einterval A B" proof (clarsimp simp add: einterval_def, intro conjI) show "\x i. \g (l i) < x; x < g (u i)\ \ A < ereal x" using A3 le_ereal_less by blast show "\x i. \g (l i) < x; x < g (u i)\ \ ereal x < B" using B3 ereal_le_less by blast qed qed qed (* finally, the main argument *) have eq1: "(LBINT x=l i.. u i. g' x *\<^sub>R f (g x)) = (LBINT y=g (l i)..g (u i). f y)" for i apply (rule interval_integral_substitution_finite [OF _ DERIV_subset [OF deriv_g]]) unfolding has_field_derivative_iff_has_vector_derivative[symmetric] apply (auto intro!: continuous_at_imp_continuous_on contf contg') done have "(\i. LBINT x=l i..u i. g' x *\<^sub>R f (g x)) \ (LBINT x=a..b. g' x *\<^sub>R f (g x))" apply (rule interval_integral_Icc_approx_integrable [OF \a < b\ approx]) by (rule assms) hence 2: "(\i. (LBINT y=g (l i)..g (u i). f y)) \ (LBINT x=a..b. g' x *\<^sub>R f (g x))" by (simp add: eq1) have incseq: "incseq (\i. {g (l i)<..i. set_lebesgue_integral lborel {g (l i)<.. set_lebesgue_integral lborel (einterval A B) f" unfolding un by (rule set_integral_cont_up) (use incseq integrable2 un in auto) then have "(\i. (LBINT y=g (l i)..g (u i). f y)) \ (LBINT x = A..B. f x)" by (simp add: interval_lebesgue_integral_le_eq \A \ B\) thus ?thesis by (intro LIMSEQ_unique [OF _ 2]) qed (* TODO: the last two proofs are only slightly different. Factor out common part? An alternative: make the second one the main one, and then have another lemma that says that if f is nonnegative and all the other hypotheses hold, then it is integrable. *) theorem interval_integral_substitution_nonneg: fixes f g g':: "real \ real" and a b u v :: ereal assumes "a < b" and deriv_g: "\x. a < ereal x \ ereal x < b \ DERIV g x :> g' x" and contf: "\x. a < ereal x \ ereal x < b \ isCont f (g x)" and contg': "\x. a < ereal x \ ereal x < b \ isCont g' x" and f_nonneg: "\x. a < ereal x \ ereal x < b \ 0 \ f (g x)" (* TODO: make this AE? *) and g'_nonneg: "\x. a \ ereal x \ ereal x \ b \ 0 \ g' x" and A: "((ereal \ g \ real_of_ereal) \ A) (at_right a)" and B: "((ereal \ g \ real_of_ereal) \ B) (at_left b)" and integrable_fg: "set_integrable lborel (einterval a b) (\x. f (g x) * g' x)" shows "set_integrable lborel (einterval A B) f" "(LBINT x=A..B. f x) = (LBINT x=a..b. (f (g x) * g' x))" proof - from einterval_Icc_approximation[OF \a < b\] guess u l . note approx [simp] = this note less_imp_le [simp] have aless[simp]: "\x i. l i \ x \ a < ereal x" by (rule order_less_le_trans, rule approx, force) have lessb[simp]: "\x i. x \ u i \ ereal x < b" by (rule order_le_less_trans, subst ereal_less_eq(3), assumption, rule approx) have llb[simp]: "\i. l i < b" using lessb approx(4) less_eq_real_def by blast have alu[simp]: "\i. a < u i" by (rule order_less_trans, rule approx, auto, rule approx) have [simp]: "\i j. i \ j \ l j \ l i" by (rule decseqD, rule approx) have uleu[simp]: "\i j. 
i \ j \ u i \ u j" by (rule incseqD, rule approx) have g_nondec [simp]: "g x \ g y" if "a < x" "x \ y" "y < b" for x y proof (rule DERIV_nonneg_imp_nondecreasing [OF \x \ y\], intro exI conjI) show "\u. x \ u \ u \ y \ (g has_real_derivative g' u) (at u)" by (meson deriv_g ereal_less_eq(3) le_less_trans less_le_trans that) show "\u. x \ u \ u \ y \ 0 \ g' u" by (meson g'_nonneg less_ereal.simps(1) less_trans not_less that) qed have "A \ B" and un: "einterval A B = (\i. {g(l i)<..i. g (l i)) \ A" using A apply (auto simp: einterval_def tendsto_at_iff_sequentially comp_def) by (drule_tac x = "\i. ereal (l i)" in spec, auto) hence A3: "\i. g (l i) \ A" by (intro decseq_ge, auto simp: decseq_def) have B2: "(\i. g (u i)) \ B" using B apply (auto simp: einterval_def tendsto_at_iff_sequentially comp_def) by (drule_tac x = "\i. ereal (u i)" in spec, auto) hence B3: "\i. g (u i) \ B" by (intro incseq_le, auto simp: incseq_def) have "ereal (g (l 0)) \ ereal (g (u 0))" by auto then show "A \ B" by (meson A3 B3 order.trans) { fix x :: real assume "A < x" and "x < B" then have "eventually (\i. ereal (g (l i)) < x \ x < ereal (g (u i))) sequentially" by (fast intro: eventually_conj order_tendstoD A2 B2) hence "\i. g (l i) < x \ x < g (u i)" by (simp add: eventually_sequentially, auto) } note AB = this show "einterval A B = (\i. {g(l i)<.. (\i. {g (l i)<..i. {g (l i)<.. einterval A B" apply (clarsimp simp: einterval_def, intro conjI) using A3 le_ereal_less apply blast using B3 ereal_le_less by blast qed qed (* finally, the main argument *) have eq1: "(LBINT x=l i.. u i. (f (g x) * g' x)) = (LBINT y=g (l i)..g (u i). f y)" for i proof - have "(LBINT x=l i.. u i. g' x *\<^sub>R f (g x)) = (LBINT y=g (l i)..g (u i). f y)" apply (rule interval_integral_substitution_finite [OF _ DERIV_subset [OF deriv_g]]) unfolding has_field_derivative_iff_has_vector_derivative[symmetric] apply (auto intro!: continuous_at_imp_continuous_on contf contg') done then show ?thesis by (simp add: ac_simps) qed have incseq: "incseq (\i. {g (l i)<..c \ l i. c \ u i \ x = g c" if "g (l i) \ x" "x \ g (u i)" for x i proof - have "continuous_on {l i..u i} g" by (force intro!: DERIV_isCont deriv_g continuous_at_imp_continuous_on) with that show ?thesis using IVT' [of g] approx(4) dual_order.strict_implies_order by blast qed have "continuous_on {g (l i)..g (u i)} f" for i apply (intro continuous_intros continuous_at_imp_continuous_on) using contf img by force then have int_f: "\i. set_integrable lborel {g (l i)<..i. {g (l i)<..i. LBINT x=l i..u i. f (g x) * g' x) \ ?l" by (intro assms interval_integral_Icc_approx_integrable [OF \a < b\ approx]) hence "(\i. (LBINT y=g (l i)..g (u i). f y)) \ ?l" by (simp add: eq1) then show "(\i. set_lebesgue_integral lborel {g (l i)<.. ?l" unfolding interval_lebesgue_integral_def by auto have "\x i. g (l i) \ x \ x \ g (u i) \ 0 \ f x" using aless f_nonneg img lessb by blast then show "\x i. x \ {g (l i)<.. 0 \ f x" using less_eq_real_def by auto qed (auto simp: greaterThanLessThan_borel) thus "set_integrable lborel (einterval A B) f" by (simp add: un) have "(LBINT x=A..B. f x) = (LBINT x=a..b. g' x *\<^sub>R f (g x))" proof (rule interval_integral_substitution_integrable) show "set_integrable lborel (einterval a b) (\x. g' x *\<^sub>R f (g x))" using integrable_fg by (simp add: ac_simps) qed fact+ then show "(LBINT x=A..B. f x) = (LBINT x=a..b. (f (g x) * g' x))" by (simp add: ac_simps) qed syntax "_complex_lebesgue_borel_integral" :: "pttrn \ real \ complex" ("(2CLBINT _. 
_)" [0,60] 60) translations "CLBINT x. f" == "CONST complex_lebesgue_integral CONST lborel (\x. f)" syntax "_complex_set_lebesgue_borel_integral" :: "pttrn \ real set \ real \ complex" ("(3CLBINT _:_. _)" [0,60,61] 60) translations "CLBINT x:A. f" == "CONST complex_set_lebesgue_integral CONST lborel A (\x. f)" abbreviation complex_interval_lebesgue_integral :: "real measure \ ereal \ ereal \ (real \ complex) \ complex" where "complex_interval_lebesgue_integral M a b f \ interval_lebesgue_integral M a b f" abbreviation complex_interval_lebesgue_integrable :: "real measure \ ereal \ ereal \ (real \ complex) \ bool" where "complex_interval_lebesgue_integrable M a b f \ interval_lebesgue_integrable M a b f" syntax "_ascii_complex_interval_lebesgue_borel_integral" :: "pttrn \ ereal \ ereal \ real \ complex" ("(4CLBINT _=_.._. _)" [0,60,60,61] 60) translations "CLBINT x=a..b. f" == "CONST complex_interval_lebesgue_integral CONST lborel a b (\x. f)" proposition interval_integral_norm: fixes f :: "real \ 'a :: {banach, second_countable_topology}" shows "interval_lebesgue_integrable lborel a b f \ a \ b \ norm (LBINT t=a..b. f t) \ LBINT t=a..b. norm (f t)" using integral_norm_bound[of lborel "\x. indicator (einterval a b) x *\<^sub>R f x"] by (auto simp: interval_lebesgue_integral_def interval_lebesgue_integrable_def set_lebesgue_integral_def) proposition interval_integral_norm2: "interval_lebesgue_integrable lborel a b f \ norm (LBINT t=a..b. f t) \ \LBINT t=a..b. norm (f t)\" proof (induct a b rule: linorder_wlog) case (sym a b) then show ?case by (simp add: interval_integral_endpoints_reverse[of a b] interval_integrable_endpoints_reverse[of a b]) next case (le a b) then have "\LBINT t=a..b. norm (f t)\ = LBINT t=a..b. norm (f t)" using integrable_norm[of lborel "\x. indicator (einterval a b) x *\<^sub>R f x"] by (auto simp: interval_lebesgue_integral_def interval_lebesgue_integrable_def set_lebesgue_integral_def intro!: integral_nonneg_AE abs_of_nonneg) then show ?case using le by (simp add: interval_integral_norm) qed (* TODO: should we have a library of facts like these? *) lemma integral_cos: "t \ 0 \ LBINT x=a..b. 
cos (t * x) = sin (t * b) / t - sin (t * a) / t" apply (intro interval_integral_FTC_finite continuous_intros) by (auto intro!: derivative_eq_intros simp: has_field_derivative_iff_has_vector_derivative[symmetric]) end diff --git a/src/HOL/Data_Structures/Brother12_Map.thy b/src/HOL/Data_Structures/Brother12_Map.thy --- a/src/HOL/Data_Structures/Brother12_Map.thy +++ b/src/HOL/Data_Structures/Brother12_Map.thy @@ -1,209 +1,212 @@ (* Author: Tobias Nipkow *) section \1-2 Brother Tree Implementation of Maps\ theory Brother12_Map imports Brother12_Set Map_Specs begin fun lookup :: "('a \ 'b) bro \ 'a::linorder \ 'b option" where "lookup N0 x = None" | "lookup (N1 t) x = lookup t x" | "lookup (N2 l (a,b) r) x = (case cmp x a of LT \ lookup l x | EQ \ Some b | GT \ lookup r x)" locale update = insert begin fun upd :: "'a::linorder \ 'b \ ('a\'b) bro \ ('a\'b) bro" where "upd x y N0 = L2 (x,y)" | "upd x y (N1 t) = n1 (upd x y t)" | "upd x y (N2 l (a,b) r) = (case cmp x a of LT \ n2 (upd x y l) (a,b) r | EQ \ N2 l (a,y) r | GT \ n2 l (a,b) (upd x y r))" definition update :: "'a::linorder \ 'b \ ('a\'b) bro \ ('a\'b) bro" where "update x y t = tree(upd x y t)" end context delete begin fun del :: "'a::linorder \ ('a\'b) bro \ ('a\'b) bro" where "del _ N0 = N0" | "del x (N1 t) = N1 (del x t)" | "del x (N2 l (a,b) r) = (case cmp x a of LT \ n2 (del x l) (a,b) r | GT \ n2 l (a,b) (del x r) | EQ \ (case split_min r of None \ N1 l | Some (ab, r') \ n2 l ab r'))" definition delete :: "'a::linorder \ ('a\'b) bro \ ('a\'b) bro" where "delete a t = tree (del a t)" end subsection "Functional Correctness Proofs" subsubsection "Proofs for lookup" lemma lookup_map_of: "t \ T h \ sorted1(inorder t) \ lookup t x = map_of (inorder t) x" by(induction h arbitrary: t) (auto simp: map_of_simps split: option.splits) subsubsection "Proofs for update" context update begin lemma inorder_upd: "t \ T h \ sorted1(inorder t) \ inorder(upd x y t) = upd_list x y (inorder t)" by(induction h arbitrary: t) (auto simp: upd_list_simps inorder_n1 inorder_n2) lemma inorder_update: "t \ T h \ sorted1(inorder t) \ inorder(update x y t) = upd_list x y (inorder t)" by(simp add: update_def inorder_upd inorder_tree) end subsubsection \Proofs for deletion\ context delete begin lemma inorder_del: "t \ T h \ sorted1(inorder t) \ inorder(del x t) = del_list x (inorder t)" -by(induction h arbitrary: t) (auto simp: del_list_simps inorder_n2 + apply (induction h arbitrary: t) + apply (auto simp: del_list_simps inorder_n2) + apply (auto simp: del_list_simps inorder_n2 inorder_split_min[OF UnI1] inorder_split_min[OF UnI2] split: option.splits) + done lemma inorder_delete: "t \ T h \ sorted1(inorder t) \ inorder(delete x t) = del_list x (inorder t)" by(simp add: delete_def inorder_del inorder_tree) end subsection \Invariant Proofs\ subsubsection \Proofs for update\ context update begin lemma upd_type: "(t \ B h \ upd x y t \ Bp h) \ (t \ U h \ upd x y t \ T h)" apply(induction h arbitrary: t) apply (simp) apply (fastforce simp: Bp_if_B n2_type dest: n1_type) done lemma update_type: "t \ B h \ update x y t \ B h \ B (Suc h)" unfolding update_def by (metis upd_type tree_type) end subsubsection "Proofs for deletion" context delete begin lemma del_type: "t \ B h \ del x t \ T h" "t \ U h \ del x t \ Um h" proof (induction h arbitrary: x t) case (Suc h) { case 1 then obtain l a b r where [simp]: "t = N2 l (a,b) r" and lr: "l \ T h" "r \ T h" "l \ B h \ r \ B h" by auto have ?case if "x < a" proof cases assume "l \ B h" from n2_type3[OF Suc.IH(1)[OF this] 
lr(2)] show ?thesis using \x by(simp) next assume "l \ B h" hence "l \ U h" "r \ B h" using lr by auto from n2_type1[OF Suc.IH(2)[OF this(1)] this(2)] show ?thesis using \x by(simp) qed moreover have ?case if "x > a" proof cases assume "r \ B h" from n2_type3[OF lr(1) Suc.IH(1)[OF this]] show ?thesis using \x>a\ by(simp) next assume "r \ B h" hence "l \ B h" "r \ U h" using lr by auto from n2_type2[OF this(1) Suc.IH(2)[OF this(2)]] show ?thesis using \x>a\ by(simp) qed moreover have ?case if [simp]: "x=a" proof (cases "split_min r") case None show ?thesis proof cases assume "r \ B h" with split_minNoneN0[OF this None] lr show ?thesis by(simp) next assume "r \ B h" hence "r \ U h" using lr by auto with split_minNoneN1[OF this None] lr(3) show ?thesis by (simp) qed next case [simp]: (Some br') obtain b r' where [simp]: "br' = (b,r')" by fastforce show ?thesis proof cases assume "r \ B h" from split_min_type(1)[OF this] n2_type3[OF lr(1)] show ?thesis by simp next assume "r \ B h" hence "l \ B h" and "r \ U h" using lr by auto from split_min_type(2)[OF this(2)] n2_type2[OF this(1)] show ?thesis by simp qed qed ultimately show ?case by auto } { case 2 with Suc.IH(1) show ?case by auto } qed auto lemma delete_type: "t \ B h \ delete x t \ B h \ B(h-1)" unfolding delete_def by (cases h) (simp, metis del_type(1) tree_type Suc_eq_plus1 diff_Suc_1) end subsection "Overall correctness" interpretation Map_by_Ordered where empty = empty and lookup = lookup and update = update.update and delete = delete.delete and inorder = inorder and inv = "\t. \h. t \ B h" proof (standard, goal_cases) case 2 thus ?case by(auto intro!: lookup_map_of) next case 3 thus ?case by(auto intro!: update.inorder_update) next case 4 thus ?case by(auto intro!: delete.inorder_delete) next case 6 thus ?case using update.update_type by (metis Un_iff) next case 7 thus ?case using delete.delete_type by blast qed (auto simp: empty_def) end diff --git a/src/HOL/Data_Structures/Brother12_Set.thy b/src/HOL/Data_Structures/Brother12_Set.thy --- a/src/HOL/Data_Structures/Brother12_Set.thy +++ b/src/HOL/Data_Structures/Brother12_Set.thy @@ -1,549 +1,552 @@ (* Author: Tobias Nipkow, Daniel Stüwe *) section \1-2 Brother Tree Implementation of Sets\ theory Brother12_Set imports Cmp Set_Specs "HOL-Number_Theory.Fib" begin subsection \Data Type and Operations\ datatype 'a bro = N0 | N1 "'a bro" | N2 "'a bro" 'a "'a bro" | (* auxiliary constructors: *) L2 'a | N3 "'a bro" 'a "'a bro" 'a "'a bro" definition empty :: "'a bro" where "empty = N0" fun inorder :: "'a bro \ 'a list" where "inorder N0 = []" | "inorder (N1 t) = inorder t" | "inorder (N2 l a r) = inorder l @ a # inorder r" | "inorder (L2 a) = [a]" | "inorder (N3 t1 a1 t2 a2 t3) = inorder t1 @ a1 # inorder t2 @ a2 # inorder t3" fun isin :: "'a bro \ 'a::linorder \ bool" where "isin N0 x = False" | "isin (N1 t) x = isin t x" | "isin (N2 l a r) x = (case cmp x a of LT \ isin l x | EQ \ True | GT \ isin r x)" fun n1 :: "'a bro \ 'a bro" where "n1 (L2 a) = N2 N0 a N0" | "n1 (N3 t1 a1 t2 a2 t3) = N2 (N2 t1 a1 t2) a2 (N1 t3)" | "n1 t = N1 t" hide_const (open) insert locale insert begin fun n2 :: "'a bro \ 'a \ 'a bro \ 'a bro" where "n2 (L2 a1) a2 t = N3 N0 a1 N0 a2 t" | "n2 (N3 t1 a1 t2 a2 t3) a3 (N1 t4) = N2 (N2 t1 a1 t2) a2 (N2 t3 a3 t4)" | "n2 (N3 t1 a1 t2 a2 t3) a3 t4 = N3 (N2 t1 a1 t2) a2 (N1 t3) a3 t4" | "n2 t1 a1 (L2 a2) = N3 t1 a1 N0 a2 N0" | "n2 (N1 t1) a1 (N3 t2 a2 t3 a3 t4) = N2 (N2 t1 a1 t2) a2 (N2 t3 a3 t4)" | "n2 t1 a1 (N3 t2 a2 t3 a3 t4) = N3 t1 a1 (N1 t2) a2 (N2 t3 a3 t4)" | "n2 
t1 a t2 = N2 t1 a t2" fun ins :: "'a::linorder \ 'a bro \ 'a bro" where "ins x N0 = L2 x" | "ins x (N1 t) = n1 (ins x t)" | "ins x (N2 l a r) = (case cmp x a of LT \ n2 (ins x l) a r | EQ \ N2 l a r | GT \ n2 l a (ins x r))" fun tree :: "'a bro \ 'a bro" where "tree (L2 a) = N2 N0 a N0" | "tree (N3 t1 a1 t2 a2 t3) = N2 (N2 t1 a1 t2) a2 (N1 t3)" | "tree t = t" definition insert :: "'a::linorder \ 'a bro \ 'a bro" where "insert x t = tree(ins x t)" end locale delete begin fun n2 :: "'a bro \ 'a \ 'a bro \ 'a bro" where "n2 (N1 t1) a1 (N1 t2) = N1 (N2 t1 a1 t2)" | "n2 (N1 (N1 t1)) a1 (N2 (N1 t2) a2 (N2 t3 a3 t4)) = N1 (N2 (N2 t1 a1 t2) a2 (N2 t3 a3 t4))" | "n2 (N1 (N1 t1)) a1 (N2 (N2 t2 a2 t3) a3 (N1 t4)) = N1 (N2 (N2 t1 a1 t2) a2 (N2 t3 a3 t4))" | "n2 (N1 (N1 t1)) a1 (N2 (N2 t2 a2 t3) a3 (N2 t4 a4 t5)) = N2 (N2 (N1 t1) a1 (N2 t2 a2 t3)) a3 (N1 (N2 t4 a4 t5))" | "n2 (N2 (N1 t1) a1 (N2 t2 a2 t3)) a3 (N1 (N1 t4)) = N1 (N2 (N2 t1 a1 t2) a2 (N2 t3 a3 t4))" | "n2 (N2 (N2 t1 a1 t2) a2 (N1 t3)) a3 (N1 (N1 t4)) = N1 (N2 (N2 t1 a1 t2) a2 (N2 t3 a3 t4))" | "n2 (N2 (N2 t1 a1 t2) a2 (N2 t3 a3 t4)) a5 (N1 (N1 t5)) = N2 (N1 (N2 t1 a1 t2)) a2 (N2 (N2 t3 a3 t4) a5 (N1 t5))" | "n2 t1 a1 t2 = N2 t1 a1 t2" fun split_min :: "'a bro \ ('a \ 'a bro) option" where "split_min N0 = None" | "split_min (N1 t) = (case split_min t of None \ None | Some (a, t') \ Some (a, N1 t'))" | "split_min (N2 t1 a t2) = (case split_min t1 of None \ Some (a, N1 t2) | Some (b, t1') \ Some (b, n2 t1' a t2))" fun del :: "'a::linorder \ 'a bro \ 'a bro" where "del _ N0 = N0" | "del x (N1 t) = N1 (del x t)" | "del x (N2 l a r) = (case cmp x a of LT \ n2 (del x l) a r | GT \ n2 l a (del x r) | EQ \ (case split_min r of None \ N1 l | Some (b, r') \ n2 l b r'))" fun tree :: "'a bro \ 'a bro" where "tree (N1 t) = t" | "tree t = t" definition delete :: "'a::linorder \ 'a bro \ 'a bro" where "delete a t = tree (del a t)" end subsection \Invariants\ fun B :: "nat \ 'a bro set" and U :: "nat \ 'a bro set" where "B 0 = {N0}" | "B (Suc h) = { N2 t1 a t2 | t1 a t2. t1 \ B h \ U h \ t2 \ B h \ t1 \ B h \ t2 \ B h \ U h}" | "U 0 = {}" | "U (Suc h) = N1 ` B h" abbreviation "T h \ B h \ U h" fun Bp :: "nat \ 'a bro set" where "Bp 0 = B 0 \ L2 ` UNIV" | "Bp (Suc 0) = B (Suc 0) \ {N3 N0 a N0 b N0|a b. True}" | "Bp (Suc(Suc h)) = B (Suc(Suc h)) \ {N3 t1 a t2 b t3 | t1 a t2 b t3. 
t1 \ B (Suc h) \ t2 \ U (Suc h) \ t3 \ B (Suc h)}" fun Um :: "nat \ 'a bro set" where "Um 0 = {}" | "Um (Suc h) = N1 ` T h" subsection "Functional Correctness Proofs" subsubsection "Proofs for isin" lemma isin_set: "t \ T h \ sorted(inorder t) \ isin t x = (x \ set(inorder t))" by(induction h arbitrary: t) (fastforce simp: isin_simps split: if_splits)+ subsubsection "Proofs for insertion" lemma inorder_n1: "inorder(n1 t) = inorder t" by(cases t rule: n1.cases) (auto simp: sorted_lems) context insert begin lemma inorder_n2: "inorder(n2 l a r) = inorder l @ a # inorder r" by(cases "(l,a,r)" rule: n2.cases) (auto simp: sorted_lems) lemma inorder_tree: "inorder(tree t) = inorder t" by(cases t) auto lemma inorder_ins: "t \ T h \ sorted(inorder t) \ inorder(ins a t) = ins_list a (inorder t)" by(induction h arbitrary: t) (auto simp: ins_list_simps inorder_n1 inorder_n2) lemma inorder_insert: "t \ T h \ sorted(inorder t) \ inorder(insert a t) = ins_list a (inorder t)" by(simp add: insert_def inorder_ins inorder_tree) end subsubsection \Proofs for deletion\ context delete begin lemma inorder_tree: "inorder(tree t) = inorder t" by(cases t) auto lemma inorder_n2: "inorder(n2 l a r) = inorder l @ a # inorder r" by(cases "(l,a,r)" rule: n2.cases) (auto) lemma inorder_split_min: "t \ T h \ (split_min t = None \ inorder t = []) \ (split_min t = Some(a,t') \ inorder t = a # inorder t')" by(induction h arbitrary: t a t') (auto simp: inorder_n2 split: option.splits) lemma inorder_del: "t \ T h \ sorted(inorder t) \ inorder(del x t) = del_list x (inorder t)" -by(induction h arbitrary: t) (auto simp: del_list_simps inorder_n2 + apply (induction h arbitrary: t) + apply (auto simp: del_list_simps inorder_n2 split: option.splits) + apply (auto simp: del_list_simps inorder_n2 inorder_split_min[OF UnI1] inorder_split_min[OF UnI2] split: option.splits) + done lemma inorder_delete: "t \ T h \ sorted(inorder t) \ inorder(delete x t) = del_list x (inorder t)" by(simp add: delete_def inorder_del inorder_tree) end subsection \Invariant Proofs\ subsubsection \Proofs for insertion\ lemma n1_type: "t \ Bp h \ n1 t \ T (Suc h)" by(cases h rule: Bp.cases) auto context insert begin lemma tree_type: "t \ Bp h \ tree t \ B h \ B (Suc h)" by(cases h rule: Bp.cases) auto lemma n2_type: "(t1 \ Bp h \ t2 \ T h \ n2 t1 a t2 \ Bp (Suc h)) \ (t1 \ T h \ t2 \ Bp h \ n2 t1 a t2 \ Bp (Suc h))" apply(cases h rule: Bp.cases) apply (auto)[2] apply(rule conjI impI | erule conjE exE imageE | simp | erule disjE)+ done lemma Bp_if_B: "t \ B h \ t \ Bp h" by (cases h rule: Bp.cases) simp_all text\An automatic proof:\ lemma "(t \ B h \ ins x t \ Bp h) \ (t \ U h \ ins x t \ T h)" apply(induction h arbitrary: t) apply (simp) apply (fastforce simp: Bp_if_B n2_type dest: n1_type) done text\A detailed proof:\ lemma ins_type: shows "t \ B h \ ins x t \ Bp h" and "t \ U h \ ins x t \ T h" proof(induction h arbitrary: t) case 0 { case 1 thus ?case by simp next case 2 thus ?case by simp } next case (Suc h) { case 1 then obtain t1 a t2 where [simp]: "t = N2 t1 a t2" and t1: "t1 \ T h" and t2: "t2 \ T h" and t12: "t1 \ B h \ t2 \ B h" by auto have ?case if "x < a" proof - have "n2 (ins x t1) a t2 \ Bp (Suc h)" proof cases assume "t1 \ B h" with t2 show ?thesis by (simp add: Suc.IH(1) n2_type) next assume "t1 \ B h" hence 1: "t1 \ U h" and 2: "t2 \ B h" using t1 t12 by auto show ?thesis by (metis Suc.IH(2)[OF 1] Bp_if_B[OF 2] n2_type) qed with \x < a\ show ?case by simp qed moreover have ?case if "a < x" proof - have "n2 t1 a (ins x t2) \ Bp (Suc h)" proof 
cases assume "t2 \ B h" with t1 show ?thesis by (simp add: Suc.IH(1) n2_type) next assume "t2 \ B h" hence 1: "t1 \ B h" and 2: "t2 \ U h" using t2 t12 by auto show ?thesis by (metis Bp_if_B[OF 1] Suc.IH(2)[OF 2] n2_type) qed with \a < x\ show ?case by simp qed moreover have ?case if "x = a" proof - from 1 have "t \ Bp (Suc h)" by(rule Bp_if_B) thus "?case" using \x = a\ by simp qed ultimately show ?case by auto next case 2 thus ?case using Suc(1) n1_type by fastforce } qed lemma insert_type: "t \ B h \ insert x t \ B h \ B (Suc h)" unfolding insert_def by (metis ins_type(1) tree_type) end subsubsection "Proofs for deletion" lemma B_simps[simp]: "N1 t \ B h = False" "L2 y \ B h = False" "(N3 t1 a1 t2 a2 t3) \ B h = False" "N0 \ B h \ h = 0" by (cases h, auto)+ context delete begin lemma n2_type1: "\t1 \ Um h; t2 \ B h\ \ n2 t1 a t2 \ T (Suc h)" apply(cases h rule: Bp.cases) apply auto[2] apply(erule exE bexE conjE imageE | simp | erule disjE)+ done lemma n2_type2: "\t1 \ B h ; t2 \ Um h \ \ n2 t1 a t2 \ T (Suc h)" apply(cases h rule: Bp.cases) apply auto[2] apply(erule exE bexE conjE imageE | simp | erule disjE)+ done lemma n2_type3: "\t1 \ T h ; t2 \ T h \ \ n2 t1 a t2 \ T (Suc h)" apply(cases h rule: Bp.cases) apply auto[2] apply(erule exE bexE conjE imageE | simp | erule disjE)+ done lemma split_minNoneN0: "\t \ B h; split_min t = None\ \ t = N0" by (cases t) (auto split: option.splits) lemma split_minNoneN1 : "\t \ U h; split_min t = None\ \ t = N1 N0" by (cases h) (auto simp: split_minNoneN0 split: option.splits) lemma split_min_type: "t \ B h \ split_min t = Some (a, t') \ t' \ T h" "t \ U h \ split_min t = Some (a, t') \ t' \ Um h" proof (induction h arbitrary: t a t') case (Suc h) { case 1 then obtain t1 a t2 where [simp]: "t = N2 t1 a t2" and t12: "t1 \ T h" "t2 \ T h" "t1 \ B h \ t2 \ B h" by auto show ?case proof (cases "split_min t1") case None show ?thesis proof cases assume "t1 \ B h" with split_minNoneN0[OF this None] 1 show ?thesis by(auto) next assume "t1 \ B h" thus ?thesis using 1 None by (auto) qed next case [simp]: (Some bt') obtain b t1' where [simp]: "bt' = (b,t1')" by fastforce show ?thesis proof cases assume "t1 \ B h" from Suc.IH(1)[OF this] 1 have "t1' \ T h" by simp from n2_type3[OF this t12(2)] 1 show ?thesis by auto next assume "t1 \ B h" hence t1: "t1 \ U h" and t2: "t2 \ B h" using t12 by auto from Suc.IH(2)[OF t1] have "t1' \ Um h" by simp from n2_type1[OF this t2] 1 show ?thesis by auto qed qed } { case 2 then obtain t1 where [simp]: "t = N1 t1" and t1: "t1 \ B h" by auto show ?case proof (cases "split_min t1") case None with split_minNoneN0[OF t1 None] 2 show ?thesis by(auto) next case [simp]: (Some bt') obtain b t1' where [simp]: "bt' = (b,t1')" by fastforce from Suc.IH(1)[OF t1] have "t1' \ T h" by simp thus ?thesis using 2 by auto qed } qed auto lemma del_type: "t \ B h \ del x t \ T h" "t \ U h \ del x t \ Um h" proof (induction h arbitrary: x t) case (Suc h) { case 1 then obtain l a r where [simp]: "t = N2 l a r" and lr: "l \ T h" "r \ T h" "l \ B h \ r \ B h" by auto have ?case if "x < a" proof cases assume "l \ B h" from n2_type3[OF Suc.IH(1)[OF this] lr(2)] show ?thesis using \x by(simp) next assume "l \ B h" hence "l \ U h" "r \ B h" using lr by auto from n2_type1[OF Suc.IH(2)[OF this(1)] this(2)] show ?thesis using \x by(simp) qed moreover have ?case if "x > a" proof cases assume "r \ B h" from n2_type3[OF lr(1) Suc.IH(1)[OF this]] show ?thesis using \x>a\ by(simp) next assume "r \ B h" hence "l \ B h" "r \ U h" using lr by auto from n2_type2[OF 
this(1) Suc.IH(2)[OF this(2)]] show ?thesis using \x>a\ by(simp) qed moreover have ?case if [simp]: "x=a" proof (cases "split_min r") case None show ?thesis proof cases assume "r \ B h" with split_minNoneN0[OF this None] lr show ?thesis by(simp) next assume "r \ B h" hence "r \ U h" using lr by auto with split_minNoneN1[OF this None] lr(3) show ?thesis by (simp) qed next case [simp]: (Some br') obtain b r' where [simp]: "br' = (b,r')" by fastforce show ?thesis proof cases assume "r \ B h" from split_min_type(1)[OF this] n2_type3[OF lr(1)] show ?thesis by simp next assume "r \ B h" hence "l \ B h" and "r \ U h" using lr by auto from split_min_type(2)[OF this(2)] n2_type2[OF this(1)] show ?thesis by simp qed qed ultimately show ?case by auto } { case 2 with Suc.IH(1) show ?case by auto } qed auto lemma tree_type: "t \ T (h+1) \ tree t \ B (h+1) \ B h" by(auto) lemma delete_type: "t \ B h \ delete x t \ B h \ B(h-1)" unfolding delete_def by (cases h) (simp, metis del_type(1) tree_type Suc_eq_plus1 diff_Suc_1) end subsection "Overall correctness" interpretation Set_by_Ordered where empty = empty and isin = isin and insert = insert.insert and delete = delete.delete and inorder = inorder and inv = "\t. \h. t \ B h" proof (standard, goal_cases) case 2 thus ?case by(auto intro!: isin_set) next case 3 thus ?case by(auto intro!: insert.inorder_insert) next case 4 thus ?case by(auto intro!: delete.inorder_delete) next case 6 thus ?case using insert.insert_type by blast next case 7 thus ?case using delete.delete_type by blast qed (auto simp: empty_def) subsection \Height-Size Relation\ text \By Daniel St\"uwe\ fun fib_tree :: "nat \ unit bro" where "fib_tree 0 = N0" | "fib_tree (Suc 0) = N2 N0 () N0" | "fib_tree (Suc(Suc h)) = N2 (fib_tree (h+1)) () (N1 (fib_tree h))" fun fib' :: "nat \ nat" where "fib' 0 = 0" | "fib' (Suc 0) = 1" | "fib' (Suc(Suc h)) = 1 + fib' (Suc h) + fib' h" fun size :: "'a bro \ nat" where "size N0 = 0" | "size (N1 t) = size t" | "size (N2 t1 _ t2) = 1 + size t1 + size t2" lemma fib_tree_B: "fib_tree h \ B h" by (induction h rule: fib_tree.induct) auto declare [[names_short]] lemma size_fib': "size (fib_tree h) = fib' h" by (induction h rule: fib_tree.induct) auto lemma fibfib: "fib' h + 1 = fib (Suc(Suc h))" by (induction h rule: fib_tree.induct) auto lemma B_N2_cases[consumes 1]: assumes "N2 t1 a t2 \ B (Suc n)" obtains (BB) "t1 \ B n" and "t2 \ B n" | (UB) "t1 \ U n" and "t2 \ B n" | (BU) "t1 \ B n" and "t2 \ U n" using assms by auto lemma size_bounded: "t \ B h \ size t \ size (fib_tree h)" unfolding size_fib' proof (induction h arbitrary: t rule: fib'.induct) case (3 h t') note main = 3 then obtain t1 a t2 where t': "t' = N2 t1 a t2" by auto with main have "N2 t1 a t2 \ B (Suc (Suc h))" by auto thus ?case proof (cases rule: B_N2_cases) case BB then obtain x y z where t2: "t2 = N2 x y z \ t2 = N2 z y x" "x \ B h" by auto show ?thesis unfolding t' using main(1)[OF BB(1)] main(2)[OF t2(2)] t2(1) by auto next case UB then obtain t11 where t1: "t1 = N1 t11" "t11 \ B h" by auto show ?thesis unfolding t' t1(1) using main(2)[OF t1(2)] main(1)[OF UB(2)] by simp next case BU then obtain t22 where t2: "t2 = N1 t22" "t22 \ B h" by auto show ?thesis unfolding t' t2(1) using main(2)[OF t2(2)] main(1)[OF BU(1)] by simp qed qed auto theorem "t \ B h \ fib (h + 2) \ size t + 1" using size_bounded by (simp add: size_fib' fibfib[symmetric] del: fib.simps) end diff --git a/src/HOL/Data_Structures/Selection.thy b/src/HOL/Data_Structures/Selection.thy --- a/src/HOL/Data_Structures/Selection.thy 
+++ b/src/HOL/Data_Structures/Selection.thy @@ -1,1038 +1,1040 @@ (* File: Data_Structures/Selection.thy Author: Manuel Eberl, TU München *) section \The Median-of-Medians Selection Algorithm\ theory Selection imports Complex_Main Sorting Time_Funs begin text \ Note that there is significant overlap between this theory (which is intended mostly for the Functional Data Structures book) and the Median-of-Medians AFP entry. \ subsection \Auxiliary material\ lemma replicate_numeral: "replicate (numeral n) x = x # replicate (pred_numeral n) x" by (simp add: numeral_eq_Suc) lemma isort_correct: "isort xs = sort xs" using sorted_isort mset_isort by (metis properties_for_sort) lemma sum_list_replicate [simp]: "sum_list (replicate n x) = n * x" by (induction n) auto lemma mset_concat: "mset (concat xss) = sum_list (map mset xss)" by (induction xss) simp_all lemma set_mset_sum_list [simp]: "set_mset (sum_list xs) = (\x\set xs. set_mset x)" by (induction xs) auto lemma filter_mset_image_mset: "filter_mset P (image_mset f A) = image_mset f (filter_mset (\x. P (f x)) A)" by (induction A) auto lemma filter_mset_sum_list: "filter_mset P (sum_list xs) = sum_list (map (filter_mset P) xs)" by (induction xs) simp_all lemma sum_mset_mset_mono: assumes "(\x. x \# A \ f x \# g x)" shows "(\x\#A. f x) \# (\x\#A. g x)" using assms by (induction A) (auto intro!: subset_mset.add_mono) lemma mset_filter_mono: assumes "A \# B" "\x. x \# A \ P x \ Q x" shows "filter_mset P A \# filter_mset Q B" by (rule mset_subset_eqI) (insert assms, auto simp: mset_subset_eq_count count_eq_zero_iff) lemma size_mset_sum_mset_distrib: "size (sum_mset A :: 'a multiset) = sum_mset (image_mset size A)" by (induction A) auto lemma sum_mset_mono: assumes "\x. x \# A \ f x \ (g x :: 'a :: {ordered_ab_semigroup_add,comm_monoid_add})" shows "(\x\#A. f x) \ (\x\#A. g x)" using assms by (induction A) (auto intro!: add_mono) lemma filter_mset_is_empty_iff: "filter_mset P A = {#} \ (\x. x \# A \ \P x)" by (auto simp: multiset_eq_iff count_eq_zero_iff) lemma sort_eq_Nil_iff [simp]: "sort xs = [] \ xs = []" by (metis set_empty set_sort) lemma sort_mset_cong: "mset xs = mset ys \ sort xs = sort ys" by (metis sorted_list_of_multiset_mset) lemma Min_set_sorted: "sorted xs \ xs \ [] \ Min (set xs) = hd xs" by (cases xs; force intro: Min_insert2) lemma hd_sort: fixes xs :: "'a :: linorder list" shows "xs \ [] \ hd (sort xs) = Min (set xs)" by (subst Min_set_sorted [symmetric]) auto lemma length_filter_conv_size_filter_mset: "length (filter P xs) = size (filter_mset P (mset xs))" by (induction xs) auto lemma sorted_filter_less_subset_take: assumes "sorted xs" and "i < length xs" shows "{#x \# mset xs. x < xs ! i#} \# mset (take i xs)" using assms proof (induction xs arbitrary: i rule: list.induct) case (Cons x xs i) show ?case proof (cases i) case 0 thus ?thesis using Cons.prems by (auto simp: filter_mset_is_empty_iff) next case (Suc i') have "{#y \# mset (x # xs). y < (x # xs) ! i#} \# add_mset x {#y \# mset xs. y < xs ! i'#}" using Suc Cons.prems by (auto) also have "\ \# add_mset x (mset (take i' xs))" unfolding mset_subset_eq_add_mset_cancel using Cons.prems Suc by (intro Cons.IH) (auto) also have "\ = mset (take i (x # xs))" by (simp add: Suc) finally show ?thesis . qed qed auto lemma sorted_filter_greater_subset_drop: assumes "sorted xs" and "i < length xs" shows "{#x \# mset xs. x > xs ! 
i#} \# mset (drop (Suc i) xs)" using assms proof (induction xs arbitrary: i rule: list.induct) case (Cons x xs i) show ?case proof (cases i) case 0 thus ?thesis by (auto simp: sorted_append filter_mset_is_empty_iff) next case (Suc i') have "{#y \# mset (x # xs). y > (x # xs) ! i#} \# {#y \# mset xs. y > xs ! i'#}" using Suc Cons.prems by (auto simp: set_conv_nth) also have "\ \# mset (drop (Suc i') xs)" using Cons.prems Suc by (intro Cons.IH) (auto) also have "\ = mset (drop (Suc i) (x # xs))" by (simp add: Suc) finally show ?thesis . qed qed auto subsection \Chopping a list into equally-sized bits\ fun chop :: "nat \ 'a list \ 'a list list" where "chop 0 _ = []" | "chop _ [] = []" | "chop n xs = take n xs # chop n (drop n xs)" lemmas [simp del] = chop.simps text \ This is an alternative induction rule for \<^const>\chop\, which is often nicer to use. \ lemma chop_induct' [case_names trivial reduce]: assumes "\n xs. n = 0 \ xs = [] \ P n xs" assumes "\n xs. n > 0 \ xs \ [] \ P n (drop n xs) \ P n xs" shows "P n xs" using assms proof induction_schema show "wf (measure (length \ snd))" by auto qed (blast | simp)+ lemma chop_eq_Nil_iff [simp]: "chop n xs = [] \ n = 0 \ xs = []" by (induction n xs rule: chop.induct; subst chop.simps) auto lemma chop_0 [simp]: "chop 0 xs = []" by (simp add: chop.simps) lemma chop_Nil [simp]: "chop n [] = []" by (cases n) (auto simp: chop.simps) lemma chop_reduce: "n > 0 \ xs \ [] \ chop n xs = take n xs # chop n (drop n xs)" by (cases n; cases xs) (auto simp: chop.simps) lemma concat_chop [simp]: "n > 0 \ concat (chop n xs) = xs" by (induction n xs rule: chop.induct; subst chop.simps) auto lemma chop_elem_not_Nil [dest]: "ys \ set (chop n xs) \ ys \ []" by (induction n xs rule: chop.induct; subst (asm) chop.simps) (auto simp: eq_commute[of "[]"] split: if_splits) lemma length_chop_part_le: "ys \ set (chop n xs) \ length ys \ n" by (induction n xs rule: chop.induct; subst (asm) chop.simps) (auto split: if_splits) lemma length_chop: assumes "n > 0" shows "length (chop n xs) = nat \length xs / n\" proof - from \n > 0\ have "real n * length (chop n xs) \ length xs" by (induction n xs rule: chop.induct; subst chop.simps) (auto simp: field_simps) moreover from \n > 0\ have "real n * length (chop n xs) < length xs + n" by (induction n xs rule: chop.induct; subst chop.simps) (auto simp: field_simps split: nat_diff_split_asm)+ ultimately have "length (chop n xs) \ length xs / n" and "length (chop n xs) < length xs / n + 1" using assms by (auto simp: field_simps) thus ?thesis by linarith qed lemma sum_msets_chop: "n > 0 \ (\ys\chop n xs. mset ys) = mset xs" by (subst mset_concat [symmetric]) simp_all lemma UN_sets_chop: "n > 0 \ (\ys\set (chop n xs). set ys) = set xs" by (simp only: set_concat [symmetric] concat_chop) lemma chop_append: "d dvd length xs \ chop d (xs @ ys) = chop d xs @ chop d ys" by (induction d xs rule: chop_induct') (auto simp: chop_reduce dvd_imp_le) lemma chop_replicate [simp]: "d > 0 \ chop d (replicate d xs) = [replicate d xs]" by (subst chop_reduce) auto lemma chop_replicate_dvd [simp]: assumes "d dvd n" shows "chop d (replicate n x) = replicate (n div d) (replicate d x)" proof (cases "d = 0") case False from assms obtain k where k: "n = d * k" by blast have "chop d (replicate (d * k) x) = replicate k (replicate d x)" using False by (induction k) (auto simp: replicate_add chop_append) thus ?thesis using False by (simp add: k) qed auto lemma chop_concat: assumes "\xs\set xss. 
length xs = d" and "d > 0" shows "chop d (concat xss) = xss" using assms proof (induction xss) case (Cons xs xss) have "chop d (concat (xs # xss)) = chop d (xs @ concat xss)" by simp also have "\ = chop d xs @ chop d (concat xss)" using Cons.prems by (intro chop_append) auto also have "chop d xs = [xs]" using Cons.prems by (subst chop_reduce) auto also have "chop d (concat xss) = xss" using Cons.prems by (intro Cons.IH) auto finally show ?case by simp qed auto subsection \Selection\ definition select :: "nat \ ('a :: linorder) list \ 'a" where "select k xs = sort xs ! k" lemma select_0: "xs \ [] \ select 0 xs = Min (set xs)" by (simp add: hd_sort select_def flip: hd_conv_nth) lemma select_mset_cong: "mset xs = mset ys \ select k xs = select k ys" using sort_mset_cong[of xs ys] unfolding select_def by auto lemma select_in_set [intro,simp]: assumes "k < length xs" shows "select k xs \ set xs" proof - from assms have "sort xs ! k \ set (sort xs)" by (intro nth_mem) auto also have "set (sort xs) = set xs" by simp finally show ?thesis by (simp add: select_def) qed lemma assumes "n < length xs" shows size_less_than_select: "size {#y \# mset xs. y < select n xs#} \ n" and size_greater_than_select: "size {#y \# mset xs. y > select n xs#} < length xs - n" proof - have "size {#y \# mset (sort xs). y < select n xs#} \ size (mset (take n (sort xs)))" unfolding select_def using assms by (intro size_mset_mono sorted_filter_less_subset_take) auto thus "size {#y \# mset xs. y < select n xs#} \ n" by simp have "size {#y \# mset (sort xs). y > select n xs#} \ size (mset (drop (Suc n) (sort xs)))" unfolding select_def using assms by (intro size_mset_mono sorted_filter_greater_subset_drop) auto thus "size {#y \# mset xs. y > select n xs#} < length xs - n" using assms by simp qed subsection \The designated median of a list\ definition median where "median xs = select ((length xs - 1) div 2) xs" lemma median_in_set [intro, simp]: assumes "xs \ []" shows "median xs \ set xs" proof - from assms have "length xs > 0" by auto hence "(length xs - 1) div 2 < length xs" by linarith thus ?thesis by (simp add: median_def) qed lemma size_less_than_median: "size {#y \# mset xs. y < median xs#} \ (length xs - 1) div 2" proof (cases "xs = []") case False hence "length xs > 0" by auto hence less: "(length xs - 1) div 2 < length xs" by linarith show "size {#y \# mset xs. y < median xs#} \ (length xs - 1) div 2" using size_less_than_select[OF less] by (simp add: median_def) qed auto lemma size_greater_than_median: "size {#y \# mset xs. y > median xs#} \ length xs div 2" proof (cases "xs = []") case False hence "length xs > 0" by auto hence less: "(length xs - 1) div 2 < length xs" by linarith have "size {#y \# mset xs. y > median xs#} < length xs - (length xs - 1) div 2" using size_greater_than_select[OF less] by (simp add: median_def) also have "\ = length xs div 2 + 1" using \length xs > 0\ by linarith finally show "size {#y \# mset xs. y > median xs#} \ length xs div 2" by simp qed auto lemmas median_props = size_less_than_median size_greater_than_median subsection \A recurrence for selection\ definition partition3 :: "'a \ 'a :: linorder list \ 'a list \ 'a list \ 'a list" where "partition3 x xs = (filter (\y. y < x) xs, filter (\y. y = x) xs, filter (\y. 
y > x) xs)" lemma partition3_code [code]: "partition3 x [] = ([], [], [])" "partition3 x (y # ys) = (case partition3 x ys of (ls, es, gs) \ if y < x then (y # ls, es, gs) else if x = y then (ls, y # es, gs) else (ls, es, y # gs))" by (auto simp: partition3_def) lemma sort_append: assumes "\x\set xs. \y\set ys. x \ y" shows "sort (xs @ ys) = sort xs @ sort ys" using assms by (intro properties_for_sort) (auto simp: sorted_append) lemma select_append: assumes "\y\set ys. \z\set zs. y \ z" shows "k < length ys \ select k (ys @ zs) = select k ys" and "k \ {length ys.. select k (ys @ zs) = select (k - length ys) zs" using assms by (simp_all add: select_def sort_append nth_append) lemma select_append': assumes "\y\set ys. \z\set zs. y \ z" and "k < length ys + length zs" shows "select k (ys @ zs) = (if k < length ys then select k ys else select (k - length ys) zs)" using assms by (auto intro!: select_append) theorem select_rec_partition: assumes "k < length xs" shows "select k xs = ( let (ls, es, gs) = partition3 x xs in if k < length ls then select k ls else if k < length ls + length es then x else select (k - length ls - length es) gs )" (is "_ = ?rhs") proof - define ls es gs where "ls = filter (\y. y < x) xs" and "es = filter (\y. y = x) xs" and "gs = filter (\y. y > x) xs" define nl ne where [simp]: "nl = length ls" "ne = length es" have mset_eq: "mset xs = mset ls + mset es + mset gs" unfolding ls_def es_def gs_def by (induction xs) auto have length_eq: "length xs = length ls + length es + length gs" - unfolding ls_def es_def gs_def by (induction xs) auto + unfolding ls_def es_def gs_def + using [[simp_depth_limit = 1]] by (induction xs) auto have [simp]: "select i es = x" if "i < length es" for i proof - have "select i es \ set (sort es)" unfolding select_def using that by (intro nth_mem) auto thus ?thesis by (auto simp: es_def) qed have "select k xs = select k (ls @ (es @ gs))" by (intro select_mset_cong) (simp_all add: mset_eq) also have "\ = (if k < nl then select k ls else select (k - nl) (es @ gs))" unfolding nl_ne_def using assms by (intro select_append') (auto simp: ls_def es_def gs_def length_eq) also have "\ = (if k < nl then select k ls else if k < nl + ne then x else select (k - nl - ne) gs)" proof (rule if_cong) assume "\k < nl" have "select (k - nl) (es @ gs) = (if k - nl < ne then select (k - nl) es else select (k - nl - ne) gs)" unfolding nl_ne_def using assms \\k < nl\ by (intro select_append') (auto simp: ls_def es_def gs_def length_eq) also have "\ = (if k < nl + ne then x else select (k - nl - ne) gs)" using \\k < nl\ by auto finally show "select (k - nl) (es @ gs) = \" . qed simp_all also have "\ = ?rhs" by (simp add: partition3_def ls_def es_def gs_def) finally show ?thesis . qed subsection \The size of the lists in the recursive calls\ text \ We now derive an upper bound for the number of elements of a list that are smaller (resp. bigger) than the median of medians with chopping size 5. To avoid having to do the same proof twice, we do it generically for an operation \\\ that we will later instantiate with either \<\ or \>\. \ context fixes xs :: "'a :: linorder list" fixes M defines "M \ median (map median (chop 5 xs))" begin lemma size_median_of_medians_aux: fixes R :: "'a :: linorder \ 'a \ bool" (infix "\" 50) assumes R: "R \ {(<), (>)}" shows "size {#y \# mset xs. 
y \ M#} \ nat \0.7 * length xs + 3\" proof - define n and m where [simp]: "n = length xs" and "m = length (chop 5 xs)" text \We define an abbreviation for the multiset of all the chopped-up groups:\ text \We then split that multiset into those groups whose medians is less than @{term M} and the rest.\ define Y_small ("Y\<^sub>\") where "Y\<^sub>\ = filter_mset (\ys. median ys \ M) (mset (chop 5 xs))" define Y_big ("Y\<^sub>\") where "Y\<^sub>\ = filter_mset (\ys. \(median ys \ M)) (mset (chop 5 xs))" have "m = size (mset (chop 5 xs))" by (simp add: m_def) also have "mset (chop 5 xs) = Y\<^sub>\ + Y\<^sub>\" unfolding Y_small_def Y_big_def by (rule multiset_partition) finally have m_eq: "m = size Y\<^sub>\ + size Y\<^sub>\" by simp text \At most half of the lists have a median that is smaller than the median of medians:\ have "size Y\<^sub>\ = size (image_mset median Y\<^sub>\)" by simp also have "image_mset median Y\<^sub>\ = {#y \# mset (map median (chop 5 xs)). y \ M#}" unfolding Y_small_def by (subst filter_mset_image_mset [symmetric]) simp_all also have "size \ \ (length (map median (chop 5 xs))) div 2" unfolding M_def using median_props[of "map median (chop 5 xs)"] R by auto also have "\ = m div 2" by (simp add: m_def) finally have size_Y\<^sub>_small: "size Y\<^sub>\ \ m div 2" . text \We estimate the number of elements less than @{term M} by grouping them into elements coming from @{term "Y\<^sub>\"} and elements coming from @{term "Y\<^sub>\"}:\ have "{#y \# mset xs. y \ M#} = {#y \# (\ys\chop 5 xs. mset ys). y \ M#}" by (subst sum_msets_chop) simp_all also have "\ = (\ys\chop 5 xs. {#y \# mset ys. y \ M#})" by (subst filter_mset_sum_list) (simp add: o_def) also have "\ = (\ys\#mset (chop 5 xs). {#y \# mset ys. y \ M#})" by (subst sum_mset_sum_list [symmetric]) simp_all also have "mset (chop 5 xs) = Y\<^sub>\ + Y\<^sub>\" by (simp add: Y_small_def Y_big_def not_le) also have "(\ys\#\. {#y \# mset ys. y \ M#}) = (\ys\#Y\<^sub>\. {#y \# mset ys. y \ M#}) + (\ys\#Y\<^sub>\. {#y \# mset ys. y \ M#})" by simp text \Next, we overapproximate the elements contributed by @{term "Y\<^sub>\"}: instead of those elements that are smaller than the median, we take \<^emph>\all\ the elements of each group. For the elements contributed by @{term "Y\<^sub>\"}, we overapproximate by taking all those that are less than their median instead of only those that are less than @{term M}.\ also have "\ \# (\ys\#Y\<^sub>\. mset ys) + (\ys\#Y\<^sub>\. {#y \# mset ys. y \ median ys#})" using R by (intro subset_mset.add_mono sum_mset_mset_mono mset_filter_mono) (auto simp: Y_big_def) finally have "size {# y \# mset xs. y \ M#} \ size \" by (rule size_mset_mono) hence "size {# y \# mset xs. y \ M#} \ (\ys\#Y\<^sub>\. length ys) + (\ys\#Y\<^sub>\. size {#y \# mset ys. y \ median ys#})" by (simp add: size_mset_sum_mset_distrib multiset.map_comp o_def) text \Next, we further overapproximate the first sum by noting that each group has at most size 5.\ also have "(\ys\#Y\<^sub>\. length ys) \ (\ys\#Y\<^sub>\. 5)" by (intro sum_mset_mono) (auto simp: Y_small_def length_chop_part_le) also have "\ = 5 * size Y\<^sub>\" by simp text \Next, we note that each group in @{term "Y\<^sub>\"} can have at most 2 elements that are smaller than its median.\ also have "(\ys\#Y\<^sub>\. size {#y \# mset ys. y \ median ys#}) \ (\ys\#Y\<^sub>\. length ys div 2)" proof (intro sum_mset_mono, goal_cases) fix ys assume "ys \# Y\<^sub>\" hence "ys \ []" by (auto simp: Y_big_def) thus "size {#y \# mset ys. 
y \ median ys#} \ length ys div 2" using R median_props[of ys] by auto qed also have "\ \ (\ys\#Y\<^sub>\. 2)" by (intro sum_mset_mono div_le_mono diff_le_mono) (auto simp: Y_big_def dest: length_chop_part_le) also have "\ = 2 * size Y\<^sub>\" by simp text \Simplifying gives us the main result.\ also have "5 * size Y\<^sub>\ + 2 * size Y\<^sub>\ = 2 * m + 3 * size Y\<^sub>\" by (simp add: m_eq) also have "\ \ 3.5 * m" using \size Y\<^sub>\ \ m div 2\ by linarith also have "\ = 3.5 * \n / 5\" by (simp add: m_def length_chop) also have "\ \ 0.7 * n + 3.5" by linarith finally have "size {#y \# mset xs. y \ M#} \ 0.7 * n + 3.5" by simp thus "size {#y \# mset xs. y \ M#} \ nat \0.7 * n + 3\" by linarith qed lemma size_less_than_median_of_medians: "size {#y \# mset xs. y < M#} \ nat \0.7 * length xs + 3\" using size_median_of_medians_aux[of "(<)"] by simp lemma size_greater_than_median_of_medians: "size {#y \# mset xs. y > M#} \ nat \0.7 * length xs + 3\" using size_median_of_medians_aux[of "(>)"] by simp end subsection \Efficient algorithm\ text \ We handle the base cases and computing the median for the chopped-up sublists of size 5 using the naive selection algorithm where we sort the list using insertion sort. \ definition slow_select where "slow_select k xs = isort xs ! k" definition slow_median where "slow_median xs = slow_select ((length xs - 1) div 2) xs" lemma slow_select_correct: "slow_select k xs = select k xs" by (simp add: slow_select_def select_def isort_correct) lemma slow_median_correct: "slow_median xs = median xs" by (simp add: median_def slow_median_def slow_select_correct) text \ The definition of the selection algorithm is complicated somewhat by the fact that its termination is contingent on its correctness: if the first recursive call were to return an element for \x\ that is e.g. smaller than all list elements, the algorithm would not terminate. Therefore, we first prove partial correctness, then termination, and then combine the two to obtain total correctness. \ function mom_select where "mom_select k xs = ( if length xs \ 20 then slow_select k xs else let M = mom_select (((length xs + 4) div 5 - 1) div 2) (map slow_median (chop 5 xs)); (ls, es, gs) = partition3 M xs in if k < length ls then mom_select k ls else if k < length ls + length es then M else mom_select (k - length ls - length es) gs )" by auto text \ If @{const "mom_select"} terminates, it agrees with @{const select}: \ lemma mom_select_correct_aux: assumes "mom_select_dom (k, xs)" and "k < length xs" shows "mom_select k xs = select k xs" using assms proof (induction rule: mom_select.pinduct) case (1 k xs) show "mom_select k xs = select k xs" proof (cases "length xs \ 20") case True thus "mom_select k xs = select k xs" using "1.prems" "1.hyps" by (subst mom_select.psimps) (auto simp: select_def slow_select_correct) next case False define x where "x = mom_select (((length xs + 4) div 5 - 1) div 2) (map slow_median (chop 5 xs))" define ls es gs where "ls = filter (\y. y < x) xs" and "es = filter (\y. y = x) xs" and "gs = filter (\y. y > x) xs" define nl ne where "nl = length ls" and "ne = length es" note defs = nl_def ne_def x_def ls_def es_def gs_def have tw: "(ls, es, gs) = partition3 x xs" unfolding partition3_def defs One_nat_def .. 
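(* Proof sketch for the recursive case: the induction hypotheses turn the two
   recursive calls on ls and gs into select, and select_rec_partition then
   recombines the three-way case distinction on k back into select k xs. *)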
have length_eq: "length xs = nl + ne + length gs" - unfolding nl_def ne_def ls_def es_def gs_def by (induction xs) auto + unfolding nl_def ne_def ls_def es_def gs_def + using [[simp_depth_limit = 1]] by (induction xs) auto note IH = "1.IH"(2,3)[OF False x_def tw refl refl] have "mom_select k xs = (if k < nl then mom_select k ls else if k < nl + ne then x else mom_select (k - nl - ne) gs)" using "1.hyps" False by (subst mom_select.psimps) (simp_all add: partition3_def flip: defs One_nat_def) also have "\ = (if k < nl then select k ls else if k < nl + ne then x else select (k - nl - ne) gs)" using IH length_eq "1.prems" by (simp add: ls_def es_def gs_def nl_def ne_def) also have "\ = select k xs" using \k < length xs\ by (subst (3) select_rec_partition[of _ _ x]) (simp_all add: nl_def ne_def flip: tw) finally show "mom_select k xs = select k xs" . qed qed text \ @{const mom_select} indeed terminates for all inputs: \ lemma mom_select_termination: "All mom_select_dom" proof (relation "measure (length \ snd)"; (safe)?) fix k :: nat and xs :: "'a list" assume "\ length xs \ 20" thus "((((length xs + 4) div 5 - 1) div 2, map slow_median (chop 5 xs)), k, xs) \ measure (length \ snd)" by (auto simp: length_chop nat_less_iff ceiling_less_iff) next fix k :: nat and xs ls es gs :: "'a list" define x where "x = mom_select (((length xs + 4) div 5 - 1) div 2) (map slow_median (chop 5 xs))" assume A: "\ length xs \ 20" "(ls, es, gs) = partition3 x xs" "mom_select_dom (((length xs + 4) div 5 - 1) div 2, map slow_median (chop 5 xs))" have less: "((length xs + 4) div 5 - 1) div 2 < nat \length xs / 5\" using A(1) by linarith text \For termination, it suffices to prove that @{term x} is in the list.\ have "x = select (((length xs + 4) div 5 - 1) div 2) (map slow_median (chop 5 xs))" using less unfolding x_def by (intro mom_select_correct_aux A) (auto simp: length_chop) also have "\ \ set (map slow_median (chop 5 xs))" using less by (intro select_in_set) (simp_all add: length_chop) also have "\ \ set xs" unfolding set_map proof safe fix ys assume ys: "ys \ set (chop 5 xs)" hence "median ys \ set ys" by auto also have "set ys \ \(set ` set (chop 5 xs))" using ys by blast also have "\ = set xs" by (rule UN_sets_chop) simp_all finally show "slow_median ys \ set xs" by (simp add: slow_median_correct) qed finally have "x \ set xs" . 
thus "((k, ls), k, xs) \ measure (length \ snd)" and "((k - length ls - length es, gs), k, xs) \ measure (length \ snd)" using A(1,2) by (auto simp: partition3_def intro!: length_filter_less[of x]) qed termination mom_select by (rule mom_select_termination) lemmas [simp del] = mom_select.simps lemma mom_select_correct: "k < length xs \ mom_select k xs = select k xs" using mom_select_correct_aux and mom_select_termination by blast subsection \Running time analysis\ fun T_partition3 :: "'a \ 'a list \ nat" where "T_partition3 x [] = 1" | "T_partition3 x (y # ys) = T_partition3 x ys + 1" lemma T_partition3_eq: "T_partition3 x xs = length xs + 1" by (induction x xs rule: T_partition3.induct) auto definition T_slow_select :: "nat \ 'a :: linorder list \ nat" where "T_slow_select k xs = T_isort xs + T_nth (isort xs) k + 1" definition T_slow_median :: "'a :: linorder list \ nat" where "T_slow_median xs = T_slow_select ((length xs - 1) div 2) xs + 1" lemma T_slow_select_le: "T_slow_select k xs \ length xs ^ 2 + 3 * length xs + 3" proof - have "T_slow_select k xs \ (length xs + 1)\<^sup>2 + (length (isort xs) + 1) + 1" unfolding T_slow_select_def by (intro add_mono T_isort_length) (auto simp: T_nth_eq) also have "\ = length xs ^ 2 + 3 * length xs + 3" by (simp add: isort_correct algebra_simps power2_eq_square) finally show ?thesis . qed lemma T_slow_median_le: "T_slow_median xs \ length xs ^ 2 + 3 * length xs + 4" unfolding T_slow_median_def using T_slow_select_le[of "(length xs - 1) div 2" xs] by simp fun T_chop :: "nat \ 'a list \ nat" where "T_chop 0 _ = 1" | "T_chop _ [] = 1" | "T_chop n xs = T_take n xs + T_drop n xs + T_chop n (drop n xs)" lemmas [simp del] = T_chop.simps lemma T_chop_Nil [simp]: "T_chop d [] = 1" by (cases d) (auto simp: T_chop.simps) lemma T_chop_0 [simp]: "T_chop 0 xs = 1" by (auto simp: T_chop.simps) lemma T_chop_reduce: "n > 0 \ xs \ [] \ T_chop n xs = T_take n xs + T_drop n xs + T_chop n (drop n xs)" by (cases n; cases xs) (auto simp: T_chop.simps) lemma T_chop_le: "T_chop d xs \ 5 * length xs + 1" by (induction d xs rule: T_chop.induct) (auto simp: T_chop_reduce T_take_eq T_drop_eq) text \ The option \domintros\ here allows us to explicitly reason about where the function does and does not terminate. With this, we can skip the termination proof this time because we can reuse the one for \<^const>\mom_select\. 
\ function (domintros) T_mom_select :: "nat \ 'a :: linorder list \ nat" where "T_mom_select k xs = ( if length xs \ 20 then T_slow_select k xs else let xss = chop 5 xs; ms = map slow_median xss; idx = (((length xs + 4) div 5 - 1) div 2); x = mom_select idx ms; (ls, es, gs) = partition3 x xs; nl = length ls; ne = length es in (if k < nl then T_mom_select k ls else if k < nl + ne then 0 else T_mom_select (k - nl - ne) gs) + T_mom_select idx ms + T_chop 5 xs + T_map T_slow_median xss + T_partition3 x xs + T_length ls + T_length es + 1 )" by auto termination T_mom_select proof (rule allI, safe) fix k :: nat and xs :: "'a :: linorder list" have "mom_select_dom (k, xs)" using mom_select_termination by blast thus "T_mom_select_dom (k, xs)" by (induction k xs rule: mom_select.pinduct) (rule T_mom_select.domintros, simp_all) qed lemmas [simp del] = T_mom_select.simps function T'_mom_select :: "nat \ nat" where "T'_mom_select n = (if n \ 20 then 463 else T'_mom_select (nat \0.2*n\) + T'_mom_select (nat \0.7*n+3\) + 17 * n + 50)" by force+ termination by (relation "measure id"; simp; linarith) lemmas [simp del] = T'_mom_select.simps lemma T'_mom_select_ge: "T'_mom_select n \ 463" by (induction n rule: T'_mom_select.induct; subst T'_mom_select.simps) auto lemma T'_mom_select_mono: "m \ n \ T'_mom_select m \ T'_mom_select n" proof (induction n arbitrary: m rule: less_induct) case (less n m) show ?case proof (cases "m \ 20") case True hence "T'_mom_select m = 463" by (subst T'_mom_select.simps) auto also have "\ \ T'_mom_select n" by (rule T'_mom_select_ge) finally show ?thesis . next case False hence "T'_mom_select m = T'_mom_select (nat \0.2*m\) + T'_mom_select (nat \0.7*m + 3\) + 17 * m + 50" by (subst T'_mom_select.simps) auto also have "\ \ T'_mom_select (nat \0.2*n\) + T'_mom_select (nat \0.7*n + 3\) + 17 * n + 50" using \m \ n\ and False by (intro add_mono less.IH; linarith) also have "\ = T'_mom_select n" using \m \ n\ and False by (subst T'_mom_select.simps) auto finally show ?thesis . qed qed lemma T_mom_select_le_aux: "T_mom_select k xs \ T'_mom_select (length xs)" proof (induction k xs rule: T_mom_select.induct) case (1 k xs) define n where [simp]: "n = length xs" define x where "x = mom_select (((length xs + 4) div 5 - 1) div 2) (map slow_median (chop 5 xs))" define ls es gs where "ls = filter (\y. y < x) xs" and "es = filter (\y. y = x) xs" and "gs = filter (\y. y > x) xs" define nl ne where "nl = length ls" and "ne = length es" note defs = nl_def ne_def x_def ls_def es_def gs_def have tw: "(ls, es, gs) = partition3 x xs" unfolding partition3_def defs One_nat_def .. 
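(* Cost sketch: the first recursive call works on the list of group medians
   (about n/5 elements), the second on ls or gs (at most about 0.7*n + 3
   elements); together with the linear overhead this matches the recurrence
   T'_mom_select, and since 0.2 + 0.7 < 1 the Akra--Bazzi-style argument
   further below yields a linear bound. *)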
note IH = "1.IH"(1,2,3)[OF _ refl refl refl x_def tw refl refl refl refl] show ?case proof (cases "length xs \ 20") case True \ \base case\ hence "T_mom_select k xs \ (length xs)\<^sup>2 + 3 * length xs + 3" using T_slow_select_le[of k xs] by (subst T_mom_select.simps) auto also have "\ \ 20\<^sup>2 + 3 * 20 + 3" using True by (intro add_mono power_mono) auto also have "\ \ 463" by simp also have "\ = T'_mom_select (length xs)" using True by (simp add: T'_mom_select.simps) finally show ?thesis by simp next case False \ \recursive case\ have "((n + 4) div 5 - 1) div 2 < nat \n / 5\" using False unfolding n_def by linarith hence "x = select (((n + 4) div 5 - 1) div 2) (map slow_median (chop 5 xs))" unfolding x_def n_def by (intro mom_select_correct) (auto simp: length_chop) also have "((n + 4) div 5 - 1) div 2 = (nat \n / 5\ - 1) div 2" by linarith also have "select \ (map slow_median (chop 5 xs)) = median (map slow_median (chop 5 xs))" by (auto simp: median_def length_chop) finally have x_eq: "x = median (map slow_median (chop 5 xs))" . text \The cost of computing the medians of all the subgroups:\ define T_ms where "T_ms = T_map T_slow_median (chop 5 xs)" have "T_ms \ 9 * n + 45" proof - have "T_ms = (\ys\chop 5 xs. T_slow_median ys) + length (chop 5 xs) + 1" by (simp add: T_ms_def T_map_eq) also have "(\ys\chop 5 xs. T_slow_median ys) \ (\ys\chop 5 xs. 44)" proof (intro sum_list_mono) fix ys assume "ys \ set (chop 5 xs)" hence "length ys \ 5" using length_chop_part_le by blast have "T_slow_median ys \ (length ys) ^ 2 + 3 * length ys + 4" by (rule T_slow_median_le) also have "\ \ 5 ^ 2 + 3 * 5 + 4" using \length ys \ 5\ by (intro add_mono power_mono) auto finally show "T_slow_median ys \ 44" by simp qed also have "(\ys\chop 5 xs. 44) + length (chop 5 xs) + 1 = 45 * nat \real n / 5\ + 1" by (simp add: map_replicate_const length_chop) also have "\ \ 9 * n + 45" by linarith finally show "T_ms \ 9 * n + 45" by simp qed text \The cost of the first recursive call (to compute the median of medians):\ define T_rec1 where "T_rec1 = T_mom_select (((length xs + 4) div 5 - 1) div 2) (map slow_median (chop 5 xs))" have "T_rec1 \ T'_mom_select (length (map slow_median (chop 5 xs)))" using False unfolding T_rec1_def by (intro IH(3)) auto hence "T_rec1 \ T'_mom_select (nat \0.2 * n\)" by (simp add: length_chop) text \The cost of the second recursive call (to compute the final result):\ define T_rec2 where "T_rec2 = (if k < nl then T_mom_select k ls else if k < nl + ne then 0 else T_mom_select (k - nl - ne) gs)" consider "k < nl" | "k \ {nl.. nl+ne" by force hence "T_rec2 \ T'_mom_select (nat \0.7 * n + 3\)" proof cases assume "k < nl" hence "T_rec2 = T_mom_select k ls" by (simp add: T_rec2_def) also have "\ \ T'_mom_select (length ls)" by (rule IH(1)) (use \k < nl\ False in \auto simp: defs\) also have "length ls \ nat \0.7 * n + 3\" unfolding ls_def using size_less_than_median_of_medians[of xs] by (auto simp: length_filter_conv_size_filter_mset slow_median_correct[abs_def] x_eq) hence "T'_mom_select (length ls) \ T'_mom_select (nat \0.7 * n + 3\)" by (rule T'_mom_select_mono) finally show ?thesis . 
next assume "k \ {nl..0.7 * n + 3\"] by simp next assume "k \ nl + ne" hence "T_rec2 = T_mom_select (k - nl - ne) gs" by (simp add: T_rec2_def) also have "\ \ T'_mom_select (length gs)" unfolding nl_def ne_def by (rule IH(2)) (use \k \ nl + ne\ False in \auto simp: defs\) also have "length gs \ nat \0.7 * n + 3\" unfolding gs_def using size_greater_than_median_of_medians[of xs] by (auto simp: length_filter_conv_size_filter_mset slow_median_correct[abs_def] x_eq) hence "T'_mom_select (length gs) \ T'_mom_select (nat \0.7 * n + 3\)" by (rule T'_mom_select_mono) finally show ?thesis . qed text \Now for the final inequality chain:\ have "T_mom_select k xs = T_rec2 + T_rec1 + T_ms + n + nl + ne + T_chop 5 xs + 4" using False by (subst T_mom_select.simps, unfold Let_def tw [symmetric] defs [symmetric]) (simp_all add: nl_def ne_def T_rec1_def T_rec2_def T_partition3_eq T_length_eq T_ms_def) also have "nl \ n" by (simp add: nl_def ls_def) also have "ne \ n" by (simp add: ne_def es_def) also note \T_ms \ 9 * n + 45\ also have "T_chop 5 xs \ 5 * n + 1" using T_chop_le[of 5 xs] by simp also note \T_rec1 \ T'_mom_select (nat \0.2*n\)\ also note \T_rec2 \ T'_mom_select (nat \0.7*n + 3\)\ finally have "T_mom_select k xs \ T'_mom_select (nat \0.7*n + 3\) + T'_mom_select (nat \0.2*n\) + 17 * n + 50" by simp also have "\ = T'_mom_select n" using False by (subst T'_mom_select.simps) auto finally show ?thesis by simp qed qed subsection \Akra--Bazzi Light\ lemma akra_bazzi_light_aux1: fixes a b :: real and n n0 :: nat assumes ab: "a > 0" "a < 1" "n > n0" assumes "n0 \ (max 0 b + 1) / (1 - a)" shows "nat \a*n+b\ < n" proof - have "a * real n + max 0 b \ 0" using ab by simp hence "real (nat \a*n+b\) \ a * n + max 0 b + 1" by linarith also { have "n0 \ (max 0 b + 1) / (1 - a)" by fact also have "\ < real n" using assms by simp finally have "a * real n + max 0 b + 1 < real n" using ab by (simp add: field_simps) } finally show "nat \a*n+b\ < n" using \n > n0\ by linarith qed lemma akra_bazzi_light_aux2: fixes f :: "nat \ real" fixes n\<^sub>0 :: nat and a b c d :: real and C1 C2 C\<^sub>1 C\<^sub>2 :: real assumes bounds: "a > 0" "c > 0" "a + c < 1" "C\<^sub>1 \ 0" assumes rec: "\n>n\<^sub>0. f n = f (nat \a*n+b\) + f (nat \c*n+d\) + C\<^sub>1 * n + C\<^sub>2" assumes ineqs: "n\<^sub>0 > (max 0 b + max 0 d + 2) / (1 - a - c)" "C\<^sub>3 \ C\<^sub>1 / (1 - a - c)" "C\<^sub>3 \ (C\<^sub>1 * n\<^sub>0 + C\<^sub>2 + C\<^sub>4) / ((1 - a - c) * n\<^sub>0 - max 0 b - max 0 d - 2)" "\n\n\<^sub>0. f n \ C\<^sub>4" shows "f n \ C\<^sub>3 * n + C\<^sub>4" proof (induction n rule: less_induct) case (less n) have "0 \ C\<^sub>1 / (1 - a - c)" using bounds by auto also have "\ \ C\<^sub>3" by fact finally have "C\<^sub>3 \ 0" . show ?case proof (cases "n > n\<^sub>0") case False hence "f n \ C\<^sub>4" using ineqs(4) by auto also have "\ \ C\<^sub>3 * real n + C\<^sub>4" using bounds \C\<^sub>3 \ 0\ by auto finally show ?thesis . 
next case True have nonneg: "a * n \ 0" "c * n \ 0" using bounds by simp_all have "(max 0 b + 1) / (1 - a) \ (max 0 b + max 0 d + 2) / (1 - a - c)" using bounds by (intro frac_le) auto hence "n\<^sub>0 \ (max 0 b + 1) / (1 - a)" using ineqs(1) by linarith hence rec_less1: "nat \a*n+b\ < n" using bounds \n > n\<^sub>0\ by (intro akra_bazzi_light_aux1[of _ n\<^sub>0]) auto have "(max 0 d + 1) / (1 - c) \ (max 0 b + max 0 d + 2) / (1 - a - c)" using bounds by (intro frac_le) auto hence "n\<^sub>0 \ (max 0 d + 1) / (1 - c)" using ineqs(1) by linarith hence rec_less2: "nat \c*n+d\ < n" using bounds \n > n\<^sub>0\ by (intro akra_bazzi_light_aux1[of _ n\<^sub>0]) auto have "f n = f (nat \a*n+b\) + f (nat \c*n+d\) + C\<^sub>1 * n + C\<^sub>2" using \n > n\<^sub>0\ by (subst rec) auto also have "\ \ (C\<^sub>3 * nat \a*n+b\ + C\<^sub>4) + (C\<^sub>3 * nat \c*n+d\ + C\<^sub>4) + C\<^sub>1 * n + C\<^sub>2" using rec_less1 rec_less2 by (intro add_mono less.IH) auto also have "\ \ (C\<^sub>3 * (a*n+max 0 b+1) + C\<^sub>4) + (C\<^sub>3 * (c*n+max 0 d+1) + C\<^sub>4) + C\<^sub>1 * n + C\<^sub>2" using bounds \C\<^sub>3 \ 0\ nonneg by (intro add_mono mult_left_mono order.refl; linarith) also have "\ = C\<^sub>3 * n + ((C\<^sub>3 * (max 0 b + max 0 d + 2) + 2 * C\<^sub>4 + C\<^sub>2) - (C\<^sub>3 * (1 - a - c) - C\<^sub>1) * n)" by (simp add: algebra_simps) also have "\ \ C\<^sub>3 * n + ((C\<^sub>3 * (max 0 b + max 0 d + 2) + 2 * C\<^sub>4 + C\<^sub>2) - (C\<^sub>3 * (1 - a - c) - C\<^sub>1) * n\<^sub>0)" using \n > n\<^sub>0\ ineqs(2) bounds by (intro add_mono diff_mono order.refl mult_left_mono) (auto simp: field_simps) also have "(C\<^sub>3 * (max 0 b + max 0 d + 2) + 2 * C\<^sub>4 + C\<^sub>2) - (C\<^sub>3 * (1 - a - c) - C\<^sub>1) * n\<^sub>0 \ C\<^sub>4" using ineqs bounds by (simp add: field_simps) finally show "f n \ C\<^sub>3 * real n + C\<^sub>4" by (simp add: mult_right_mono) qed qed lemma akra_bazzi_light: fixes f :: "nat \ real" fixes n\<^sub>0 :: nat and a b c d C\<^sub>1 C\<^sub>2 :: real assumes bounds: "a > 0" "c > 0" "a + c < 1" "C\<^sub>1 \ 0" assumes rec: "\n>n\<^sub>0. f n = f (nat \a*n+b\) + f (nat \c*n+d\) + C\<^sub>1 * n + C\<^sub>2" shows "\C\<^sub>3 C\<^sub>4. \n. f n \ C\<^sub>3 * real n + C\<^sub>4" proof - define n\<^sub>0' where "n\<^sub>0' = max n\<^sub>0 (nat \(max 0 b + max 0 d + 2) / (1 - a - c) + 1\)" define C\<^sub>4 where "C\<^sub>4 = Max (f ` {..n\<^sub>0'})" define C\<^sub>3 where "C\<^sub>3 = max (C\<^sub>1 / (1 - a - c)) ((C\<^sub>1 * n\<^sub>0' + C\<^sub>2 + C\<^sub>4) / ((1 - a - c) * n\<^sub>0' - max 0 b - max 0 d - 2))" have "f n \ C\<^sub>3 * n + C\<^sub>4" for n proof (rule akra_bazzi_light_aux2[OF bounds _]) show "\n>n\<^sub>0'. f n = f (nat \a*n+b\) + f (nat \c*n+d\) + C\<^sub>1 * n + C\<^sub>2" using rec by (auto simp: n\<^sub>0'_def) next show "C\<^sub>3 \ C\<^sub>1 / (1 - a - c)" and "C\<^sub>3 \ (C\<^sub>1 * n\<^sub>0' + C\<^sub>2 + C\<^sub>4) / ((1 - a - c) * n\<^sub>0' - max 0 b - max 0 d - 2)" by (simp_all add: C\<^sub>3_def) next have "(max 0 b + max 0 d + 2) / (1 - a - c) < nat \(max 0 b + max 0 d + 2) / (1 - a - c) + 1\" by linarith also have "\ \ n\<^sub>0'" by (simp add: n\<^sub>0'_def) finally show "(max 0 b + max 0 d + 2) / (1 - a - c) < real n\<^sub>0'" . next show "\n\n\<^sub>0'. 
f n \ C\<^sub>4" by (auto simp: C\<^sub>4_def) qed thus ?thesis by blast qed lemma akra_bazzi_light_nat: fixes f :: "nat \ nat" fixes n\<^sub>0 :: nat and a b c d :: real and C\<^sub>1 C\<^sub>2 :: nat assumes bounds: "a > 0" "c > 0" "a + c < 1" "C\<^sub>1 \ 0" assumes rec: "\n>n\<^sub>0. f n = f (nat \a*n+b\) + f (nat \c*n+d\) + C\<^sub>1 * n + C\<^sub>2" shows "\C\<^sub>3 C\<^sub>4. \n. f n \ C\<^sub>3 * n + C\<^sub>4" proof - have "\C\<^sub>3 C\<^sub>4. \n. real (f n) \ C\<^sub>3 * real n + C\<^sub>4" using assms by (intro akra_bazzi_light[of a c C\<^sub>1 n\<^sub>0 f b d C\<^sub>2]) auto then obtain C\<^sub>3 C\<^sub>4 where le: "\n. real (f n) \ C\<^sub>3 * real n + C\<^sub>4" by blast have "f n \ nat \C\<^sub>3\ * n + nat \C\<^sub>4\" for n proof - have "real (f n) \ C\<^sub>3 * real n + C\<^sub>4" using le by blast also have "\ \ real (nat \C\<^sub>3\) * real n + real (nat \C\<^sub>4\)" by (intro add_mono mult_right_mono; linarith) also have "\ = real (nat \C\<^sub>3\ * n + nat \C\<^sub>4\)" by simp finally show ?thesis by linarith qed thus ?thesis by blast qed lemma T'_mom_select_le': "\C\<^sub>1 C\<^sub>2. \n. T'_mom_select n \ C\<^sub>1 * n + C\<^sub>2" proof (rule akra_bazzi_light_nat) show "\n>20. T'_mom_select n = T'_mom_select (nat \0.2 * n + 0\) + T'_mom_select (nat \0.7 * n + 3\) + 17 * n + 50" using T'_mom_select.simps by auto qed auto end \ No newline at end of file diff --git a/src/HOL/Data_Structures/Set2_Join.thy b/src/HOL/Data_Structures/Set2_Join.thy --- a/src/HOL/Data_Structures/Set2_Join.thy +++ b/src/HOL/Data_Structures/Set2_Join.thy @@ -1,356 +1,376 @@ (* Author: Tobias Nipkow *) section "Join-Based Implementation of Sets" theory Set2_Join imports Isin2 begin text \This theory implements the set operations \insert\, \delete\, \union\, \inter\section and \diff\erence. The implementation is based on binary search trees. All operations are reduced to a single operation \join l x r\ that joins two BSTs \l\ and \r\ and an element \x\ such that \l < x < r\. The theory is based on theory \<^theory>\HOL-Data_Structures.Tree2\ where nodes have an additional field. This field is ignored here but it means that this theory can be instantiated with red-black trees (see theory \<^file>\Set2_Join_RBT.thy\) and other balanced trees. This approach is very concrete and fixes the type of trees. Alternatively, one could assume some abstract type \<^typ>\'t\ of trees with suitable decomposition and recursion operators on it.\ locale Set2_Join = fixes join :: "('a::linorder*'b) tree \ 'a \ ('a*'b) tree \ ('a*'b) tree" fixes inv :: "('a*'b) tree \ bool" assumes set_join: "set_tree (join l a r) = set_tree l \ {a} \ set_tree r" assumes bst_join: "bst (Node l (a, b) r) \ bst (join l a r)" assumes inv_Leaf: "inv \\" assumes inv_join: "\ inv l; inv r \ \ inv (join l a r)" assumes inv_Node: "\ inv (Node l (a,b) r) \ \ inv l \ inv r" begin declare set_join [simp] Let_def[simp] subsection "\split_min\" fun split_min :: "('a*'b) tree \ 'a \ ('a*'b) tree" where "split_min (Node l (a, _) r) = (if l = Leaf then (a,r) else let (m,l') = split_min l in (m, join l' a r))" lemma split_min_set: "\ split_min t = (m,t'); t \ Leaf \ \ m \ set_tree t \ set_tree t = {m} \ set_tree t'" proof(induction t arbitrary: t' rule: tree2_induct) case Node thus ?case by(auto split: prod.splits if_splits dest: inv_Node) next case Leaf thus ?case by simp qed lemma split_min_bst: "\ split_min t = (m,t'); bst t; t \ Leaf \ \ bst t' \ (\x \ set_tree t'. 
m < x)" proof(induction t arbitrary: t' rule: tree2_induct) case Node thus ?case by(fastforce simp: split_min_set bst_join split: prod.splits if_splits) next case Leaf thus ?case by simp qed lemma split_min_inv: "\ split_min t = (m,t'); inv t; t \ Leaf \ \ inv t'" proof(induction t arbitrary: t' rule: tree2_induct) case Node thus ?case by(auto simp: inv_join split: prod.splits if_splits dest: inv_Node) next case Leaf thus ?case by simp qed subsection "\join2\" definition join2 :: "('a*'b) tree \ ('a*'b) tree \ ('a*'b) tree" where "join2 l r = (if r = Leaf then l else let (m,r') = split_min r in join l m r')" lemma set_join2[simp]: "set_tree (join2 l r) = set_tree l \ set_tree r" by(simp add: join2_def split_min_set split: prod.split) lemma bst_join2: "\ bst l; bst r; \x \ set_tree l. \y \ set_tree r. x < y \ \ bst (join2 l r)" by(simp add: join2_def bst_join split_min_set split_min_bst split: prod.split) lemma inv_join2: "\ inv l; inv r \ \ inv (join2 l r)" by(simp add: join2_def inv_join split_min_set split_min_inv split: prod.split) subsection "\split\" fun split :: "('a*'b)tree \ 'a \ ('a*'b)tree \ bool \ ('a*'b)tree" where "split Leaf k = (Leaf, False, Leaf)" | "split (Node l (a, _) r) x = (case cmp x a of LT \ let (l1,b,l2) = split l x in (l1, b, join l2 a r) | GT \ let (r1,b,r2) = split r x in (join l a r1, b, r2) | EQ \ (l, True, r))" lemma split: "split t x = (l,b,r) \ bst t \ set_tree l = {a \ set_tree t. a < x} \ set_tree r = {a \ set_tree t. x < a} \ (b = (x \ set_tree t)) \ bst l \ bst r" proof(induction t arbitrary: l b r rule: tree2_induct) case Leaf thus ?case by simp next - case Node thus ?case by(force split!: prod.splits if_splits intro!: bst_join) + case (Node y a b z l c r) + consider (LT) l1 xin l2 where "(l1,xin,l2) = split y x" + and "split \y, (a, b), z\ x = (l1, xin, join l2 a z)" and "cmp x a = LT" + | (GT) r1 xin r2 where "(r1,xin,r2) = split z x" + and "split \y, (a, b), z\ x = (join y a r1, xin, r2)" and "cmp x a = GT" + | (EQ) "split \y, (a, b), z\ x = (y, True, z)" and "cmp x a = EQ" + by (force split: cmp_val.splits prod.splits if_splits) + + thus ?case + proof cases + case (LT l1 xin l2) + with Node.IH(1)[OF \(l1,xin,l2) = split y x\[symmetric]] Node.prems + show ?thesis by (force intro!: bst_join) + next + case (GT r1 xin r2) + with Node.IH(2)[OF \(r1,xin,r2) = split z x\[symmetric]] Node.prems + show ?thesis by (force intro!: bst_join) + next + case EQ + with Node.prems show ?thesis by auto + qed qed lemma split_inv: "split t x = (l,b,r) \ inv t \ inv l \ inv r" proof(induction t arbitrary: l b r rule: tree2_induct) case Leaf thus ?case by simp next case Node thus ?case by(force simp: inv_join split!: prod.splits if_splits dest!: inv_Node) qed declare split.simps[simp del] subsection "\insert\" definition insert :: "'a \ ('a*'b) tree \ ('a*'b) tree" where "insert x t = (let (l,_,r) = split t x in join l x r)" lemma set_tree_insert: "bst t \ set_tree (insert x t) = {x} \ set_tree t" by(auto simp add: insert_def split split: prod.split) lemma bst_insert: "bst t \ bst (insert x t)" by(auto simp add: insert_def bst_join dest: split split: prod.split) lemma inv_insert: "inv t \ inv (insert x t)" by(force simp: insert_def inv_join dest: split_inv split: prod.split) subsection "\delete\" definition delete :: "'a \ ('a*'b) tree \ ('a*'b) tree" where "delete x t = (let (l,_,r) = split t x in join2 l r)" lemma set_tree_delete: "bst t \ set_tree (delete x t) = set_tree t - {x}" by(auto simp: delete_def split split: prod.split) lemma bst_delete: "bst t \ bst (delete 
x t)" by(force simp add: delete_def intro: bst_join2 dest: split split: prod.split) lemma inv_delete: "inv t \ inv (delete x t)" by(force simp: delete_def inv_join2 dest: split_inv split: prod.split) subsection "\union\" fun union :: "('a*'b)tree \ ('a*'b)tree \ ('a*'b)tree" where "union t1 t2 = (if t1 = Leaf then t2 else if t2 = Leaf then t1 else case t1 of Node l1 (a, _) r1 \ let (l2,_ ,r2) = split t2 a; l' = union l1 l2; r' = union r1 r2 in join l' a r')" declare union.simps [simp del] lemma set_tree_union: "bst t2 \ set_tree (union t1 t2) = set_tree t1 \ set_tree t2" proof(induction t1 t2 rule: union.induct) case (1 t1 t2) then show ?case by (auto simp: union.simps[of t1 t2] split split: tree.split prod.split) qed lemma bst_union: "\ bst t1; bst t2 \ \ bst (union t1 t2)" proof(induction t1 t2 rule: union.induct) case (1 t1 t2) thus ?case by(fastforce simp: union.simps[of t1 t2] set_tree_union split intro!: bst_join split: tree.split prod.split) qed lemma inv_union: "\ inv t1; inv t2 \ \ inv (union t1 t2)" proof(induction t1 t2 rule: union.induct) case (1 t1 t2) thus ?case by(auto simp:union.simps[of t1 t2] inv_join split_inv split!: tree.split prod.split dest: inv_Node) qed subsection "\inter\" fun inter :: "('a*'b)tree \ ('a*'b)tree \ ('a*'b)tree" where "inter t1 t2 = (if t1 = Leaf then Leaf else if t2 = Leaf then Leaf else case t1 of Node l1 (a, _) r1 \ let (l2,b,r2) = split t2 a; l' = inter l1 l2; r' = inter r1 r2 in if b then join l' a r' else join2 l' r')" declare inter.simps [simp del] lemma set_tree_inter: "\ bst t1; bst t2 \ \ set_tree (inter t1 t2) = set_tree t1 \ set_tree t2" proof(induction t1 t2 rule: inter.induct) case (1 t1 t2) show ?case proof (cases t1 rule: tree2_cases) case Leaf thus ?thesis by (simp add: inter.simps) next case [simp]: (Node l1 a _ r1) show ?thesis proof (cases "t2 = Leaf") case True thus ?thesis by (simp add: inter.simps) next case False let ?L1 = "set_tree l1" let ?R1 = "set_tree r1" have *: "a \ ?L1 \ ?R1" using \bst t1\ by (fastforce) obtain l2 b r2 where sp: "split t2 a = (l2,b,r2)" using prod_cases3 by blast let ?L2 = "set_tree l2" let ?R2 = "set_tree r2" let ?A = "if b then {a} else {}" have t2: "set_tree t2 = ?L2 \ ?R2 \ ?A" and **: "?L2 \ ?R2 = {}" "a \ ?L2 \ ?R2" "?L1 \ ?R2 = {}" "?L2 \ ?R1 = {}" using split[OF sp] \bst t1\ \bst t2\ by (force, force, force, force, force) have IHl: "set_tree (inter l1 l2) = set_tree l1 \ set_tree l2" using "1.IH"(1)[OF _ False _ _ sp[symmetric]] "1.prems"(1,2) split[OF sp] by simp have IHr: "set_tree (inter r1 r2) = set_tree r1 \ set_tree r2" using "1.IH"(2)[OF _ False _ _ sp[symmetric]] "1.prems"(1,2) split[OF sp] by simp have "set_tree t1 \ set_tree t2 = (?L1 \ ?R1 \ {a}) \ (?L2 \ ?R2 \ ?A)" by(simp add: t2) also have "\ = (?L1 \ ?L2) \ (?R1 \ ?R2) \ ?A" using * ** by auto also have "\ = set_tree (inter t1 t2)" using IHl IHr sp inter.simps[of t1 t2] False by(simp) finally show ?thesis by simp qed qed qed lemma bst_inter: "\ bst t1; bst t2 \ \ bst (inter t1 t2)" proof(induction t1 t2 rule: inter.induct) case (1 t1 t2) thus ?case by(fastforce simp: inter.simps[of t1 t2] set_tree_inter split intro!: bst_join bst_join2 split: tree.split prod.split) qed lemma inv_inter: "\ inv t1; inv t2 \ \ inv (inter t1 t2)" proof(induction t1 t2 rule: inter.induct) case (1 t1 t2) thus ?case by(auto simp: inter.simps[of t1 t2] inv_join inv_join2 split_inv split!: tree.split prod.split dest: inv_Node) qed subsection "\diff\" fun diff :: "('a*'b)tree \ ('a*'b)tree \ ('a*'b)tree" where "diff t1 t2 = (if t1 = Leaf then Leaf else 
if t2 = Leaf then t1 else case t2 of Node l2 (a, _) r2 \ let (l1,_,r1) = split t1 a; l' = diff l1 l2; r' = diff r1 r2 in join2 l' r')" declare diff.simps [simp del] lemma set_tree_diff: "\ bst t1; bst t2 \ \ set_tree (diff t1 t2) = set_tree t1 - set_tree t2" proof(induction t1 t2 rule: diff.induct) case (1 t1 t2) show ?case proof (cases t2 rule: tree2_cases) case Leaf thus ?thesis by (simp add: diff.simps) next case [simp]: (Node l2 a _ r2) show ?thesis proof (cases "t1 = Leaf") case True thus ?thesis by (simp add: diff.simps) next case False let ?L2 = "set_tree l2" let ?R2 = "set_tree r2" obtain l1 b r1 where sp: "split t1 a = (l1,b,r1)" using prod_cases3 by blast let ?L1 = "set_tree l1" let ?R1 = "set_tree r1" let ?A = "if b then {a} else {}" have t1: "set_tree t1 = ?L1 \ ?R1 \ ?A" and **: "a \ ?L1 \ ?R1" "?L1 \ ?R2 = {}" "?L2 \ ?R1 = {}" using split[OF sp] \bst t1\ \bst t2\ by (force, force, force, force) have IHl: "set_tree (diff l1 l2) = set_tree l1 - set_tree l2" using "1.IH"(1)[OF False _ _ _ sp[symmetric]] "1.prems"(1,2) split[OF sp] by simp have IHr: "set_tree (diff r1 r2) = set_tree r1 - set_tree r2" using "1.IH"(2)[OF False _ _ _ sp[symmetric]] "1.prems"(1,2) split[OF sp] by simp have "set_tree t1 - set_tree t2 = (?L1 \ ?R1) - (?L2 \ ?R2 \ {a})" by(simp add: t1) also have "\ = (?L1 - ?L2) \ (?R1 - ?R2)" using ** by auto also have "\ = set_tree (diff t1 t2)" using IHl IHr sp diff.simps[of t1 t2] False by(simp) finally show ?thesis by simp qed qed qed lemma bst_diff: "\ bst t1; bst t2 \ \ bst (diff t1 t2)" proof(induction t1 t2 rule: diff.induct) case (1 t1 t2) thus ?case by(fastforce simp: diff.simps[of t1 t2] set_tree_diff split intro!: bst_join bst_join2 split: tree.split prod.split) qed lemma inv_diff: "\ inv t1; inv t2 \ \ inv (diff t1 t2)" proof(induction t1 t2 rule: diff.induct) case (1 t1 t2) thus ?case by(auto simp: diff.simps[of t1 t2] inv_join inv_join2 split_inv split!: tree.split prod.split dest: inv_Node) qed text \Locale \<^locale>\Set2_Join\ implements locale \<^locale>\Set2\:\ sublocale Set2 where empty = Leaf and insert = insert and delete = delete and isin = isin and union = union and inter = inter and diff = diff and set = set_tree and invar = "\t. inv t \ bst t" proof (standard, goal_cases) case 1 show ?case by (simp) next case 2 thus ?case by(simp add: isin_set_tree) next case 3 thus ?case by (simp add: set_tree_insert) next case 4 thus ?case by (simp add: set_tree_delete) next case 5 thus ?case by (simp add: inv_Leaf) next case 6 thus ?case by (simp add: bst_insert inv_insert) next case 7 thus ?case by (simp add: bst_delete inv_delete) next case 8 thus ?case by(simp add: set_tree_union) next case 9 thus ?case by(simp add: set_tree_inter) next case 10 thus ?case by(simp add: set_tree_diff) next case 11 thus ?case by (simp add: bst_union inv_union) next case 12 thus ?case by (simp add: bst_inter inv_inter) next case 13 thus ?case by (simp add: bst_diff inv_diff) qed end interpretation unbal: Set2_Join where join = "\l x r. Node l (x, ()) r" and inv = "\t. 
True" proof (standard, goal_cases) case 1 show ?case by simp next case 2 thus ?case by simp next case 3 thus ?case by simp next case 4 thus ?case by simp next case 5 thus ?case by simp qed end \ No newline at end of file diff --git a/src/HOL/Library/RBT_Impl.thy b/src/HOL/Library/RBT_Impl.thy --- a/src/HOL/Library/RBT_Impl.thy +++ b/src/HOL/Library/RBT_Impl.thy @@ -1,3087 +1,3087 @@ (* Title: HOL/Library/RBT_Impl.thy Author: Markus Reiter, TU Muenchen Author: Alexander Krauss, TU Muenchen *) section \Implementation of Red-Black Trees\ theory RBT_Impl imports Main begin text \ For applications, you should use theory \RBT\ which defines an abstract type of red-black tree obeying the invariant. \ subsection \Datatype of RB trees\ datatype color = R | B datatype ('a, 'b) rbt = Empty | Branch color "('a, 'b) rbt" 'a 'b "('a, 'b) rbt" lemma rbt_cases: obtains (Empty) "t = Empty" | (Red) l k v r where "t = Branch R l k v r" | (Black) l k v r where "t = Branch B l k v r" proof (cases t) case Empty with that show thesis by blast next case (Branch c) with that show thesis by (cases c) blast+ qed subsection \Tree properties\ subsubsection \Content of a tree\ primrec entries :: "('a, 'b) rbt \ ('a \ 'b) list" where "entries Empty = []" | "entries (Branch _ l k v r) = entries l @ (k,v) # entries r" abbreviation (input) entry_in_tree :: "'a \ 'b \ ('a, 'b) rbt \ bool" where "entry_in_tree k v t \ (k, v) \ set (entries t)" definition keys :: "('a, 'b) rbt \ 'a list" where "keys t = map fst (entries t)" lemma keys_simps [simp, code]: "keys Empty = []" "keys (Branch c l k v r) = keys l @ k # keys r" by (simp_all add: keys_def) lemma entry_in_tree_keys: assumes "(k, v) \ set (entries t)" shows "k \ set (keys t)" proof - from assms have "fst (k, v) \ fst ` set (entries t)" by (rule imageI) then show ?thesis by (simp add: keys_def) qed lemma keys_entries: "k \ set (keys t) \ (\v. (k, v) \ set (entries t))" by (auto intro: entry_in_tree_keys) (auto simp add: keys_def) lemma non_empty_rbt_keys: "t \ rbt.Empty \ keys t \ []" by (cases t) simp_all subsubsection \Search tree properties\ context ord begin definition rbt_less :: "'a \ ('a, 'b) rbt \ bool" where rbt_less_prop: "rbt_less k t \ (\x\set (keys t). x < k)" abbreviation rbt_less_symbol (infix "|\" 50) where "t |\ x \ rbt_less x t" definition rbt_greater :: "'a \ ('a, 'b) rbt \ bool" (infix "\|" 50) where rbt_greater_prop: "rbt_greater k t = (\x\set (keys t). 
k < x)" lemma rbt_less_simps [simp]: "Empty |\ k = True" "Branch c lt kt v rt |\ k \ kt < k \ lt |\ k \ rt |\ k" by (auto simp add: rbt_less_prop) lemma rbt_greater_simps [simp]: "k \| Empty = True" "k \| (Branch c lt kt v rt) \ k < kt \ k \| lt \ k \| rt" by (auto simp add: rbt_greater_prop) lemmas rbt_ord_props = rbt_less_prop rbt_greater_prop lemmas rbt_greater_nit = rbt_greater_prop entry_in_tree_keys lemmas rbt_less_nit = rbt_less_prop entry_in_tree_keys lemma (in order) shows rbt_less_eq_trans: "l |\ u \ u \ v \ l |\ v" and rbt_less_trans: "t |\ x \ x < y \ t |\ y" and rbt_greater_eq_trans: "u \ v \ v \| r \ u \| r" and rbt_greater_trans: "x < y \ y \| t \ x \| t" by (auto simp: rbt_ord_props) primrec rbt_sorted :: "('a, 'b) rbt \ bool" where "rbt_sorted Empty = True" | "rbt_sorted (Branch c l k v r) = (l |\ k \ k \| r \ rbt_sorted l \ rbt_sorted r)" end context linorder begin lemma rbt_sorted_entries: "rbt_sorted t \ List.sorted (map fst (entries t))" by (induct t) (force simp: sorted_append rbt_ord_props dest!: entry_in_tree_keys)+ lemma distinct_entries: "rbt_sorted t \ distinct (map fst (entries t))" by (induct t) (force simp: sorted_append rbt_ord_props dest!: entry_in_tree_keys)+ lemma distinct_keys: "rbt_sorted t \ distinct (keys t)" by (simp add: distinct_entries keys_def) subsubsection \Tree lookup\ primrec (in ord) rbt_lookup :: "('a, 'b) rbt \ 'a \ 'b" where "rbt_lookup Empty k = None" | "rbt_lookup (Branch _ l x y r) k = (if k < x then rbt_lookup l k else if x < k then rbt_lookup r k else Some y)" lemma rbt_lookup_keys: "rbt_sorted t \ dom (rbt_lookup t) = set (keys t)" by (induct t) (auto simp: dom_def rbt_greater_prop rbt_less_prop) lemma dom_rbt_lookup_Branch: "rbt_sorted (Branch c t1 k v t2) \ dom (rbt_lookup (Branch c t1 k v t2)) = Set.insert k (dom (rbt_lookup t1) \ dom (rbt_lookup t2))" proof - assume "rbt_sorted (Branch c t1 k v t2)" then show ?thesis by (simp add: rbt_lookup_keys) qed lemma finite_dom_rbt_lookup [simp, intro!]: "finite (dom (rbt_lookup t))" proof (induct t) case Empty then show ?case by simp next case (Branch color t1 a b t2) let ?A = "Set.insert a (dom (rbt_lookup t1) \ dom (rbt_lookup t2))" have "dom (rbt_lookup (Branch color t1 a b t2)) \ ?A" by (auto split: if_split_asm) moreover from Branch have "finite (insert a (dom (rbt_lookup t1) \ dom (rbt_lookup t2)))" by simp ultimately show ?case by (rule finite_subset) qed end context ord begin lemma rbt_lookup_rbt_less[simp]: "t |\ k \ rbt_lookup t k = None" by (induct t) auto lemma rbt_lookup_rbt_greater[simp]: "k \| t \ rbt_lookup t k = None" by (induct t) auto lemma rbt_lookup_Empty: "rbt_lookup Empty = Map.empty" by (rule ext) simp end context linorder begin lemma map_of_entries: "rbt_sorted t \ map_of (entries t) = rbt_lookup t" proof (induct t) case Empty thus ?case by (simp add: rbt_lookup_Empty) next case (Branch c t1 k v t2) have "rbt_lookup (Branch c t1 k v t2) = rbt_lookup t2 ++ [k\v] ++ rbt_lookup t1" proof (rule ext) fix x from Branch have RBT_SORTED: "rbt_sorted (Branch c t1 k v t2)" by simp let ?thesis = "rbt_lookup (Branch c t1 k v t2) x = (rbt_lookup t2 ++ [k \ v] ++ rbt_lookup t1) x" have DOM_T1: "!!k'. k'\dom (rbt_lookup t1) \ k>k'" proof - fix k' from RBT_SORTED have "t1 |\ k" by simp with rbt_less_prop have "\k'\set (keys t1). k>k'" by auto moreover assume "k'\dom (rbt_lookup t1)" ultimately show "k>k'" using rbt_lookup_keys RBT_SORTED by auto qed have DOM_T2: "!!k'. k'\dom (rbt_lookup t2) \ k| t2" by simp with rbt_greater_prop have "\k'\set (keys t2). 
kdom (rbt_lookup t2)" ultimately show "kdom [k\v]" by simp moreover have "x \ dom (rbt_lookup t2)" proof assume "x \ dom (rbt_lookup t2)" with DOM_T2 have "k v] x" by simp moreover have "x \ dom (rbt_lookup t1)" proof assume "x \ dom (rbt_lookup t1)" with DOM_T1 have "k>x" by blast thus False by simp qed ultimately have ?thesis by (simp add: map_add_upd_left map_add_dom_app_simps) } moreover { assume C: "x>k" hence "rbt_lookup (Branch c t1 k v t2) x = rbt_lookup t2 x" by (simp add: less_not_sym[of k x]) moreover from C have "x\dom [k\v]" by simp moreover have "x\dom (rbt_lookup t1)" proof assume "x\dom (rbt_lookup t1)" with DOM_T1 have "k>x" by simp with C show False by simp qed ultimately have ?thesis by (simp add: map_add_upd_left map_add_dom_app_simps) } ultimately show ?thesis using less_linear by blast qed also from Branch have "rbt_lookup t2 ++ [k \ v] ++ rbt_lookup t1 = map_of (entries (Branch c t1 k v t2))" by simp finally show ?case by simp qed lemma rbt_lookup_in_tree: "rbt_sorted t \ rbt_lookup t k = Some v \ (k, v) \ set (entries t)" by (simp add: map_of_entries [symmetric] distinct_entries) lemma set_entries_inject: assumes rbt_sorted: "rbt_sorted t1" "rbt_sorted t2" shows "set (entries t1) = set (entries t2) \ entries t1 = entries t2" proof - from rbt_sorted have "distinct (map fst (entries t1))" "distinct (map fst (entries t2))" by (auto intro: distinct_entries) with rbt_sorted show ?thesis by (auto intro: map_sorted_distinct_set_unique rbt_sorted_entries simp add: distinct_map) qed lemma entries_eqI: assumes rbt_sorted: "rbt_sorted t1" "rbt_sorted t2" assumes rbt_lookup: "rbt_lookup t1 = rbt_lookup t2" shows "entries t1 = entries t2" proof - from rbt_sorted rbt_lookup have "map_of (entries t1) = map_of (entries t2)" by (simp add: map_of_entries) with rbt_sorted have "set (entries t1) = set (entries t2)" by (simp add: map_of_inject_set distinct_entries) with rbt_sorted show ?thesis by (simp add: set_entries_inject) qed lemma entries_rbt_lookup: assumes "rbt_sorted t1" "rbt_sorted t2" shows "entries t1 = entries t2 \ rbt_lookup t1 = rbt_lookup t2" using assms by (auto intro: entries_eqI simp add: map_of_entries [symmetric]) lemma rbt_lookup_from_in_tree: assumes "rbt_sorted t1" "rbt_sorted t2" and "\v. 
(k, v) \ set (entries t1) \ (k, v) \ set (entries t2)" shows "rbt_lookup t1 k = rbt_lookup t2 k" proof - from assms have "k \ dom (rbt_lookup t1) \ k \ dom (rbt_lookup t2)" by (simp add: keys_entries rbt_lookup_keys) with assms show ?thesis by (auto simp add: rbt_lookup_in_tree [symmetric]) qed end subsubsection \Red-black properties\ primrec color_of :: "('a, 'b) rbt \ color" where "color_of Empty = B" | "color_of (Branch c _ _ _ _) = c" primrec bheight :: "('a,'b) rbt \ nat" where "bheight Empty = 0" | "bheight (Branch c lt k v rt) = (if c = B then Suc (bheight lt) else bheight lt)" primrec inv1 :: "('a, 'b) rbt \ bool" where "inv1 Empty = True" | "inv1 (Branch c lt k v rt) \ inv1 lt \ inv1 rt \ (c = B \ color_of lt = B \ color_of rt = B)" primrec inv1l :: "('a, 'b) rbt \ bool" \ \Weaker version\ where "inv1l Empty = True" | "inv1l (Branch c l k v r) = (inv1 l \ inv1 r)" lemma [simp]: "inv1 t \ inv1l t" by (cases t) simp+ primrec inv2 :: "('a, 'b) rbt \ bool" where "inv2 Empty = True" | "inv2 (Branch c lt k v rt) = (inv2 lt \ inv2 rt \ bheight lt = bheight rt)" context ord begin definition is_rbt :: "('a, 'b) rbt \ bool" where "is_rbt t \ inv1 t \ inv2 t \ color_of t = B \ rbt_sorted t" lemma is_rbt_rbt_sorted [simp]: "is_rbt t \ rbt_sorted t" by (simp add: is_rbt_def) theorem Empty_is_rbt [simp]: "is_rbt Empty" by (simp add: is_rbt_def) end subsection \Insertion\ text \The function definitions are based on the book by Okasaki.\ fun (* slow, due to massive case splitting *) balance :: "('a,'b) rbt \ 'a \ 'b \ ('a,'b) rbt \ ('a,'b) rbt" where "balance (Branch R a w x b) s t (Branch R c y z d) = Branch R (Branch B a w x b) s t (Branch B c y z d)" | "balance (Branch R (Branch R a w x b) s t c) y z d = Branch R (Branch B a w x b) s t (Branch B c y z d)" | "balance (Branch R a w x (Branch R b s t c)) y z d = Branch R (Branch B a w x b) s t (Branch B c y z d)" | "balance a w x (Branch R b s t (Branch R c y z d)) = Branch R (Branch B a w x b) s t (Branch B c y z d)" | "balance a w x (Branch R (Branch R b s t c) y z d) = Branch R (Branch B a w x b) s t (Branch B c y z d)" | "balance a s t b = Branch B a s t b" lemma balance_inv1: "\inv1l l; inv1l r\ \ inv1 (balance l k v r)" by (induct l k v r rule: balance.induct) auto lemma balance_bheight: "bheight l = bheight r \ bheight (balance l k v r) = Suc (bheight l)" by (induct l k v r rule: balance.induct) auto lemma balance_inv2: assumes "inv2 l" "inv2 r" "bheight l = bheight r" shows "inv2 (balance l k v r)" using assms by (induct l k v r rule: balance.induct) auto context ord begin lemma balance_rbt_greater[simp]: "(v \| balance a k x b) = (v \| a \ v \| b \ v < k)" by (induct a k x b rule: balance.induct) auto lemma balance_rbt_less[simp]: "(balance a k x b |\ v) = (a |\ v \ b |\ v \ k < v)" by (induct a k x b rule: balance.induct) auto end lemma (in linorder) balance_rbt_sorted: fixes k :: "'a" assumes "rbt_sorted l" "rbt_sorted r" "l |\ k" "k \| r" shows "rbt_sorted (balance l k v r)" using assms proof (induct l k v r rule: balance.induct) case ("2_2" a x w b y t c z s va vb vd vc) hence "y < z \ z \| Branch B va vb vd vc" by (auto simp add: rbt_ord_props) hence "y \| (Branch B va vb vd vc)" by (blast dest: rbt_greater_trans) with "2_2" show ?case by simp next case ("3_2" va vb vd vc x w b y s c z) from "3_2" have "x < y \ Branch B va vb vd vc |\ x" by simp hence "Branch B va vb vd vc |\ y" by (blast dest: rbt_less_trans) with "3_2" show ?case by simp next case ("3_3" x w b y s c z t va vb vd vc) from "3_3" have "y < z \ z \| Branch B va vb vd 
vc" by simp hence "y \| Branch B va vb vd vc" by (blast dest: rbt_greater_trans) with "3_3" show ?case by simp next case ("3_4" vd ve vg vf x w b y s c z t va vb vii vc) hence "x < y \ Branch B vd ve vg vf |\ x" by simp hence 1: "Branch B vd ve vg vf |\ y" by (blast dest: rbt_less_trans) from "3_4" have "y < z \ z \| Branch B va vb vii vc" by simp hence "y \| Branch B va vb vii vc" by (blast dest: rbt_greater_trans) with 1 "3_4" show ?case by simp next case ("4_2" va vb vd vc x w b y s c z t dd) hence "x < y \ Branch B va vb vd vc |\ x" by simp hence "Branch B va vb vd vc |\ y" by (blast dest: rbt_less_trans) with "4_2" show ?case by simp next case ("5_2" x w b y s c z t va vb vd vc) hence "y < z \ z \| Branch B va vb vd vc" by simp hence "y \| Branch B va vb vd vc" by (blast dest: rbt_greater_trans) with "5_2" show ?case by simp next case ("5_3" va vb vd vc x w b y s c z t) hence "x < y \ Branch B va vb vd vc |\ x" by simp hence "Branch B va vb vd vc |\ y" by (blast dest: rbt_less_trans) with "5_3" show ?case by simp next case ("5_4" va vb vg vc x w b y s c z t vd ve vii vf) hence "x < y \ Branch B va vb vg vc |\ x" by simp hence 1: "Branch B va vb vg vc |\ y" by (blast dest: rbt_less_trans) from "5_4" have "y < z \ z \| Branch B vd ve vii vf" by simp hence "y \| Branch B vd ve vii vf" by (blast dest: rbt_greater_trans) with 1 "5_4" show ?case by simp qed simp+ lemma entries_balance [simp]: "entries (balance l k v r) = entries l @ (k, v) # entries r" by (induct l k v r rule: balance.induct) auto lemma keys_balance [simp]: "keys (balance l k v r) = keys l @ k # keys r" by (simp add: keys_def) lemma balance_in_tree: "entry_in_tree k x (balance l v y r) \ entry_in_tree k x l \ k = v \ x = y \ entry_in_tree k x r" by (auto simp add: keys_def) lemma (in linorder) rbt_lookup_balance[simp]: fixes k :: "'a" assumes "rbt_sorted l" "rbt_sorted r" "l |\ k" "k \| r" shows "rbt_lookup (balance l k v r) x = rbt_lookup (Branch B l k v r) x" by (rule rbt_lookup_from_in_tree) (auto simp:assms balance_in_tree balance_rbt_sorted) primrec paint :: "color \ ('a,'b) rbt \ ('a,'b) rbt" where "paint c Empty = Empty" | "paint c (Branch _ l k v r) = Branch c l k v r" lemma paint_inv1l[simp]: "inv1l t \ inv1l (paint c t)" by (cases t) auto lemma paint_inv1[simp]: "inv1l t \ inv1 (paint B t)" by (cases t) auto lemma paint_inv2[simp]: "inv2 t \ inv2 (paint c t)" by (cases t) auto lemma paint_color_of[simp]: "color_of (paint B t) = B" by (cases t) auto lemma paint_in_tree[simp]: "entry_in_tree k x (paint c t) = entry_in_tree k x t" by (cases t) auto context ord begin lemma paint_rbt_sorted[simp]: "rbt_sorted t \ rbt_sorted (paint c t)" by (cases t) auto lemma paint_rbt_lookup[simp]: "rbt_lookup (paint c t) = rbt_lookup t" by (rule ext) (cases t, auto) lemma paint_rbt_greater[simp]: "(v \| paint c t) = (v \| t)" by (cases t) auto lemma paint_rbt_less[simp]: "(paint c t |\ v) = (t |\ v)" by (cases t) auto fun rbt_ins :: "('a \ 'b \ 'b \ 'b) \ 'a \ 'b \ ('a,'b) rbt \ ('a,'b) rbt" where "rbt_ins f k v Empty = Branch R Empty k v Empty" | "rbt_ins f k v (Branch B l x y r) = (if k < x then balance (rbt_ins f k v l) x y r else if k > x then balance l x y (rbt_ins f k v r) else Branch B l x (f k y v) r)" | "rbt_ins f k v (Branch R l x y r) = (if k < x then Branch R (rbt_ins f k v l) x y r else if k > x then Branch R l x y (rbt_ins f k v r) else Branch R l x (f k y v) r)" lemma ins_inv1_inv2: assumes "inv1 t" "inv2 t" shows "inv2 (rbt_ins f k x t)" "bheight (rbt_ins f k x t) = bheight t" "color_of t = B \ inv1 (rbt_ins f k x 
t)" "inv1l (rbt_ins f k x t)" using assms by (induct f k x t rule: rbt_ins.induct) (auto simp: balance_inv1 balance_inv2 balance_bheight) end context linorder begin lemma ins_rbt_greater[simp]: "(v \| rbt_ins f (k :: 'a) x t) = (v \| t \ k > v)" by (induct f k x t rule: rbt_ins.induct) auto lemma ins_rbt_less[simp]: "(rbt_ins f k x t |\ v) = (t |\ v \ k < v)" by (induct f k x t rule: rbt_ins.induct) auto lemma ins_rbt_sorted[simp]: "rbt_sorted t \ rbt_sorted (rbt_ins f k x t)" by (induct f k x t rule: rbt_ins.induct) (auto simp: balance_rbt_sorted) lemma keys_ins: "set (keys (rbt_ins f k v t)) = { k } \ set (keys t)" by (induct f k v t rule: rbt_ins.induct) auto lemma rbt_lookup_ins: fixes k :: "'a" assumes "rbt_sorted t" shows "rbt_lookup (rbt_ins f k v t) x = ((rbt_lookup t)(k |-> case rbt_lookup t k of None \ v | Some w \ f k w v)) x" using assms by (induct f k v t rule: rbt_ins.induct) auto end context ord begin definition rbt_insert_with_key :: "('a \ 'b \ 'b \ 'b) \ 'a \ 'b \ ('a,'b) rbt \ ('a,'b) rbt" where "rbt_insert_with_key f k v t = paint B (rbt_ins f k v t)" definition rbt_insertw_def: "rbt_insert_with f = rbt_insert_with_key (\_. f)" definition rbt_insert :: "'a \ 'b \ ('a, 'b) rbt \ ('a, 'b) rbt" where "rbt_insert = rbt_insert_with_key (\_ _ nv. nv)" end context linorder begin lemma rbt_insertwk_rbt_sorted: "rbt_sorted t \ rbt_sorted (rbt_insert_with_key f (k :: 'a) x t)" by (auto simp: rbt_insert_with_key_def) theorem rbt_insertwk_is_rbt: assumes inv: "is_rbt t" shows "is_rbt (rbt_insert_with_key f k x t)" using assms unfolding rbt_insert_with_key_def is_rbt_def by (auto simp: ins_inv1_inv2) lemma rbt_lookup_rbt_insertwk: assumes "rbt_sorted t" shows "rbt_lookup (rbt_insert_with_key f k v t) x = ((rbt_lookup t)(k |-> case rbt_lookup t k of None \ v | Some w \ f k w v)) x" unfolding rbt_insert_with_key_def using assms by (simp add:rbt_lookup_ins) lemma rbt_insertw_rbt_sorted: "rbt_sorted t \ rbt_sorted (rbt_insert_with f k v t)" by (simp add: rbt_insertwk_rbt_sorted rbt_insertw_def) theorem rbt_insertw_is_rbt: "is_rbt t \ is_rbt (rbt_insert_with f k v t)" by (simp add: rbt_insertwk_is_rbt rbt_insertw_def) lemma rbt_lookup_rbt_insertw: "is_rbt t \ rbt_lookup (rbt_insert_with f k v t) = (rbt_lookup t)(k \ (if k \ dom (rbt_lookup t) then f (the (rbt_lookup t k)) v else v))" by (rule ext, cases "rbt_lookup t k") (auto simp: rbt_lookup_rbt_insertwk dom_def rbt_insertw_def) lemma rbt_insert_rbt_sorted: "rbt_sorted t \ rbt_sorted (rbt_insert k v t)" by (simp add: rbt_insertwk_rbt_sorted rbt_insert_def) theorem rbt_insert_is_rbt [simp]: "is_rbt t \ is_rbt (rbt_insert k v t)" by (simp add: rbt_insertwk_is_rbt rbt_insert_def) lemma rbt_lookup_rbt_insert: "is_rbt t \ rbt_lookup (rbt_insert k v t) = (rbt_lookup t)(k\v)" by (rule ext) (simp add: rbt_insert_def rbt_lookup_rbt_insertwk split: option.split) end subsection \Deletion\ lemma bheight_paintR'[simp]: "color_of t = B \ bheight (paint R t) = bheight t - 1" by (cases t rule: rbt_cases) auto text \ The function definitions are based on the Haskell code by Stefan Kahrs at \<^url>\http://www.cs.ukc.ac.uk/people/staff/smk/redblack/rb.html\. 
\ fun balance_left :: "('a,'b) rbt \ 'a \ 'b \ ('a,'b) rbt \ ('a,'b) rbt" where "balance_left (Branch R a k x b) s y c = Branch R (Branch B a k x b) s y c" | "balance_left bl k x (Branch B a s y b) = balance bl k x (Branch R a s y b)" | "balance_left bl k x (Branch R (Branch B a s y b) t z c) = Branch R (Branch B bl k x a) s y (balance b t z (paint R c))" | "balance_left t k x s = Empty" lemma balance_left_inv2_with_inv1: assumes "inv2 lt" "inv2 rt" "bheight lt + 1 = bheight rt" "inv1 rt" shows "bheight (balance_left lt k v rt) = bheight lt + 1" and "inv2 (balance_left lt k v rt)" using assms by (induct lt k v rt rule: balance_left.induct) (auto simp: balance_inv2 balance_bheight) lemma balance_left_inv2_app: assumes "inv2 lt" "inv2 rt" "bheight lt + 1 = bheight rt" "color_of rt = B" shows "inv2 (balance_left lt k v rt)" "bheight (balance_left lt k v rt) = bheight rt" using assms by (induct lt k v rt rule: balance_left.induct) (auto simp add: balance_inv2 balance_bheight)+ lemma balance_left_inv1: "\inv1l a; inv1 b; color_of b = B\ \ inv1 (balance_left a k x b)" by (induct a k x b rule: balance_left.induct) (simp add: balance_inv1)+ lemma balance_left_inv1l: "\ inv1l lt; inv1 rt \ \ inv1l (balance_left lt k x rt)" by (induct lt k x rt rule: balance_left.induct) (auto simp: balance_inv1) lemma (in linorder) balance_left_rbt_sorted: "\ rbt_sorted l; rbt_sorted r; rbt_less k l; k \| r \ \ rbt_sorted (balance_left l k v r)" apply (induct l k v r rule: balance_left.induct) apply (auto simp: balance_rbt_sorted) apply (unfold rbt_greater_prop rbt_less_prop) by force+ context order begin lemma balance_left_rbt_greater: fixes k :: "'a" assumes "k \| a" "k \| b" "k < x" shows "k \| balance_left a x t b" using assms by (induct a x t b rule: balance_left.induct) auto lemma balance_left_rbt_less: fixes k :: "'a" assumes "a |\ k" "b |\ k" "x < k" shows "balance_left a x t b |\ k" using assms by (induct a x t b rule: balance_left.induct) auto end lemma balance_left_in_tree: assumes "inv1l l" "inv1 r" "bheight l + 1 = bheight r" shows "entry_in_tree k v (balance_left l a b r) = (entry_in_tree k v l \ k = a \ v = b \ entry_in_tree k v r)" using assms by (induct l k v r rule: balance_left.induct) (auto simp: balance_in_tree) fun balance_right :: "('a,'b) rbt \ 'a \ 'b \ ('a,'b) rbt \ ('a,'b) rbt" where "balance_right a k x (Branch R b s y c) = Branch R a k x (Branch B b s y c)" | "balance_right (Branch B a k x b) s y bl = balance (Branch R a k x b) s y bl" | "balance_right (Branch R a k x (Branch B b s y c)) t z bl = Branch R (balance (paint R a) k x b) s y (Branch B c t z bl)" | "balance_right t k x s = Empty" lemma balance_right_inv2_with_inv1: assumes "inv2 lt" "inv2 rt" "bheight lt = bheight rt + 1" "inv1 lt" shows "inv2 (balance_right lt k v rt) \ bheight (balance_right lt k v rt) = bheight lt" using assms by (induct lt k v rt rule: balance_right.induct) (auto simp: balance_inv2 balance_bheight) lemma balance_right_inv1: "\inv1 a; inv1l b; color_of a = B\ \ inv1 (balance_right a k x b)" by (induct a k x b rule: balance_right.induct) (simp add: balance_inv1)+ lemma balance_right_inv1l: "\ inv1 lt; inv1l rt \ \inv1l (balance_right lt k x rt)" by (induct lt k x rt rule: balance_right.induct) (auto simp: balance_inv1) lemma (in linorder) balance_right_rbt_sorted: "\ rbt_sorted l; rbt_sorted r; rbt_less k l; k \| r \ \ rbt_sorted (balance_right l k v r)" apply (induct l k v r rule: balance_right.induct) apply (auto simp:balance_rbt_sorted) apply (unfold rbt_less_prop rbt_greater_prop) by force+ context 
order begin lemma balance_right_rbt_greater: fixes k :: "'a" assumes "k \| a" "k \| b" "k < x" shows "k \| balance_right a x t b" using assms by (induct a x t b rule: balance_right.induct) auto lemma balance_right_rbt_less: fixes k :: "'a" assumes "a |\ k" "b |\ k" "x < k" shows "balance_right a x t b |\ k" using assms by (induct a x t b rule: balance_right.induct) auto end lemma balance_right_in_tree: assumes "inv1 l" "inv1l r" "bheight l = bheight r + 1" "inv2 l" "inv2 r" shows "entry_in_tree x y (balance_right l k v r) = (entry_in_tree x y l \ x = k \ y = v \ entry_in_tree x y r)" using assms by (induct l k v r rule: balance_right.induct) (auto simp: balance_in_tree) fun combine :: "('a,'b) rbt \ ('a,'b) rbt \ ('a,'b) rbt" where "combine Empty x = x" | "combine x Empty = x" | "combine (Branch R a k x b) (Branch R c s y d) = (case (combine b c) of Branch R b2 t z c2 \ (Branch R (Branch R a k x b2) t z (Branch R c2 s y d)) | bc \ Branch R a k x (Branch R bc s y d))" | "combine (Branch B a k x b) (Branch B c s y d) = (case (combine b c) of Branch R b2 t z c2 \ Branch R (Branch B a k x b2) t z (Branch B c2 s y d) | bc \ balance_left a k x (Branch B bc s y d))" | "combine a (Branch R b k x c) = Branch R (combine a b) k x c" | "combine (Branch R a k x b) c = Branch R a k x (combine b c)" lemma combine_inv2: assumes "inv2 lt" "inv2 rt" "bheight lt = bheight rt" shows "bheight (combine lt rt) = bheight lt" "inv2 (combine lt rt)" using assms by (induct lt rt rule: combine.induct) (auto simp: balance_left_inv2_app split: rbt.splits color.splits) lemma combine_inv1: assumes "inv1 lt" "inv1 rt" shows "color_of lt = B \ color_of rt = B \ inv1 (combine lt rt)" "inv1l (combine lt rt)" using assms by (induct lt rt rule: combine.induct) (auto simp: balance_left_inv1 split: rbt.splits color.splits) context linorder begin lemma combine_rbt_greater[simp]: fixes k :: "'a" assumes "k \| l" "k \| r" shows "k \| combine l r" using assms by (induct l r rule: combine.induct) (auto simp: balance_left_rbt_greater split:rbt.splits color.splits) lemma combine_rbt_less[simp]: fixes k :: "'a" assumes "l |\ k" "r |\ k" shows "combine l r |\ k" using assms by (induct l r rule: combine.induct) (auto simp: balance_left_rbt_less split:rbt.splits color.splits) lemma combine_rbt_sorted: fixes k :: "'a" assumes "rbt_sorted l" "rbt_sorted r" "l |\ k" "k \| r" shows "rbt_sorted (combine l r)" using assms proof (induct l r rule: combine.induct) case (3 a x v b c y w d) hence ineqs: "a |\ x" "x \| b" "b |\ k" "k \| c" "c |\ y" "y \| d" by auto with 3 show ?case by (cases "combine b c" rule: rbt_cases) (auto, (metis combine_rbt_greater combine_rbt_less ineqs ineqs rbt_less_simps(2) rbt_greater_simps(2) rbt_greater_trans rbt_less_trans)+) next case (4 a x v b c y w d) hence "x < k \ rbt_greater k c" by simp hence "rbt_greater x c" by (blast dest: rbt_greater_trans) with 4 have 2: "rbt_greater x (combine b c)" by (simp add: combine_rbt_greater) from 4 have "k < y \ rbt_less k b" by simp hence "rbt_less y b" by (blast dest: rbt_less_trans) with 4 have 3: "rbt_less y (combine b c)" by (simp add: combine_rbt_less) show ?case proof (cases "combine b c" rule: rbt_cases) case Empty from 4 have "x < y \ rbt_greater y d" by auto hence "rbt_greater x d" by (blast dest: rbt_greater_trans) with 4 Empty have "rbt_sorted a" and "rbt_sorted (Branch B Empty y w d)" and "rbt_less x a" and "rbt_greater x (Branch B Empty y w d)" by auto with Empty show ?thesis by (simp add: balance_left_rbt_sorted) next case (Red lta va ka rta) with 2 4 have "x < va 
\ rbt_less x a" by simp hence 5: "rbt_less va a" by (blast dest: rbt_less_trans) from Red 3 4 have "va < y \ rbt_greater y d" by simp hence "rbt_greater va d" by (blast dest: rbt_greater_trans) with Red 2 3 4 5 show ?thesis by simp next case (Black lta va ka rta) from 4 have "x < y \ rbt_greater y d" by auto hence "rbt_greater x d" by (blast dest: rbt_greater_trans) with Black 2 3 4 have "rbt_sorted a" and "rbt_sorted (Branch B (combine b c) y w d)" and "rbt_less x a" and "rbt_greater x (Branch B (combine b c) y w d)" by auto with Black show ?thesis by (simp add: balance_left_rbt_sorted) qed next case (5 va vb vd vc b x w c) hence "k < x \ rbt_less k (Branch B va vb vd vc)" by simp hence "rbt_less x (Branch B va vb vd vc)" by (blast dest: rbt_less_trans) with 5 show ?case by (simp add: combine_rbt_less) next case (6 a x v b va vb vd vc) hence "x < k \ rbt_greater k (Branch B va vb vd vc)" by simp hence "rbt_greater x (Branch B va vb vd vc)" by (blast dest: rbt_greater_trans) with 6 show ?case by (simp add: combine_rbt_greater) qed simp+ end lemma combine_in_tree: assumes "inv2 l" "inv2 r" "bheight l = bheight r" "inv1 l" "inv1 r" shows "entry_in_tree k v (combine l r) = (entry_in_tree k v l \ entry_in_tree k v r)" using assms proof (induct l r rule: combine.induct) case (4 _ _ _ b c) hence a: "bheight (combine b c) = bheight b" by (simp add: combine_inv2) from 4 have b: "inv1l (combine b c)" by (simp add: combine_inv1) show ?case proof (cases "combine b c" rule: rbt_cases) case Empty with 4 a show ?thesis by (auto simp: balance_left_in_tree) next case (Red lta ka va rta) with 4 show ?thesis by auto next case (Black lta ka va rta) with a b 4 show ?thesis by (auto simp: balance_left_in_tree) qed qed (auto split: rbt.splits color.splits) context ord begin fun rbt_del_from_left :: "'a \ ('a,'b) rbt \ 'a \ 'b \ ('a,'b) rbt \ ('a,'b) rbt" and rbt_del_from_right :: "'a \ ('a,'b) rbt \ 'a \ 'b \ ('a,'b) rbt \ ('a,'b) rbt" and rbt_del :: "'a\ ('a,'b) rbt \ ('a,'b) rbt" where "rbt_del x Empty = Empty" | "rbt_del x (Branch c a y s b) = (if x < y then rbt_del_from_left x a y s b else (if x > y then rbt_del_from_right x a y s b else combine a b))" | "rbt_del_from_left x (Branch B lt z v rt) y s b = balance_left (rbt_del x (Branch B lt z v rt)) y s b" | "rbt_del_from_left x a y s b = Branch R (rbt_del x a) y s b" | "rbt_del_from_right x a y s (Branch B lt z v rt) = balance_right a y s (rbt_del x (Branch B lt z v rt))" | "rbt_del_from_right x a y s b = Branch R a y s (rbt_del x b)" end context linorder begin lemma assumes "inv2 lt" "inv1 lt" shows "\inv2 rt; bheight lt = bheight rt; inv1 rt\ \ inv2 (rbt_del_from_left x lt k v rt) \ bheight (rbt_del_from_left x lt k v rt) = bheight lt \ (color_of lt = B \ color_of rt = B \ inv1 (rbt_del_from_left x lt k v rt) \ (color_of lt \ B \ color_of rt \ B) \ inv1l (rbt_del_from_left x lt k v rt))" and "\inv2 rt; bheight lt = bheight rt; inv1 rt\ \ inv2 (rbt_del_from_right x lt k v rt) \ bheight (rbt_del_from_right x lt k v rt) = bheight lt \ (color_of lt = B \ color_of rt = B \ inv1 (rbt_del_from_right x lt k v rt) \ (color_of lt \ B \ color_of rt \ B) \ inv1l (rbt_del_from_right x lt k v rt))" and rbt_del_inv1_inv2: "inv2 (rbt_del x lt) \ (color_of lt = R \ bheight (rbt_del x lt) = bheight lt \ inv1 (rbt_del x lt) \ color_of lt = B \ bheight (rbt_del x lt) = bheight lt - 1 \ inv1l (rbt_del x lt))" using assms proof (induct x lt k v rt and x lt k v rt and x lt rule: rbt_del_from_left_rbt_del_from_right_rbt_del.induct) case (2 y c _ y') have "y = y' \ y < y' \ y > 
y'" by auto thus ?case proof (elim disjE) assume "y = y'" with 2 show ?thesis by (cases c) (simp add: combine_inv2 combine_inv1)+ next assume "y < y'" with 2 show ?thesis by (cases c) auto next assume "y' < y" with 2 show ?thesis by (cases c) auto qed next case (3 y lt z v rta y' ss bb) thus ?case by (cases "color_of (Branch B lt z v rta) = B \ color_of bb = B") (simp add: balance_left_inv2_with_inv1 balance_left_inv1 balance_left_inv1l)+ next case (5 y a y' ss lt z v rta) thus ?case by (cases "color_of a = B \ color_of (Branch B lt z v rta) = B") (simp add: balance_right_inv2_with_inv1 balance_right_inv1 balance_right_inv1l)+ next case ("6_1" y a y' ss) thus ?case by (cases "color_of a = B \ color_of Empty = B") simp+ qed auto lemma rbt_del_from_left_rbt_less: "\ lt |\ v; rt |\ v; k < v\ \ rbt_del_from_left x lt k y rt |\ v" and rbt_del_from_right_rbt_less: "\lt |\ v; rt |\ v; k < v\ \ rbt_del_from_right x lt k y rt |\ v" and rbt_del_rbt_less: "lt |\ v \ rbt_del x lt |\ v" by (induct x lt k y rt and x lt k y rt and x lt rule: rbt_del_from_left_rbt_del_from_right_rbt_del.induct) (auto simp: balance_left_rbt_less balance_right_rbt_less) lemma rbt_del_from_left_rbt_greater: "\v \| lt; v \| rt; k > v\ \ v \| rbt_del_from_left x lt k y rt" and rbt_del_from_right_rbt_greater: "\v \| lt; v \| rt; k > v\ \ v \| rbt_del_from_right x lt k y rt" and rbt_del_rbt_greater: "v \| lt \ v \| rbt_del x lt" by (induct x lt k y rt and x lt k y rt and x lt rule: rbt_del_from_left_rbt_del_from_right_rbt_del.induct) (auto simp: balance_left_rbt_greater balance_right_rbt_greater) lemma "\rbt_sorted lt; rbt_sorted rt; lt |\ k; k \| rt\ \ rbt_sorted (rbt_del_from_left x lt k y rt)" and "\rbt_sorted lt; rbt_sorted rt; lt |\ k; k \| rt\ \ rbt_sorted (rbt_del_from_right x lt k y rt)" and rbt_del_rbt_sorted: "rbt_sorted lt \ rbt_sorted (rbt_del x lt)" proof (induct x lt k y rt and x lt k y rt and x lt rule: rbt_del_from_left_rbt_del_from_right_rbt_del.induct) case (3 x lta zz v rta yy ss bb) from 3 have "Branch B lta zz v rta |\ yy" by simp hence "rbt_del x (Branch B lta zz v rta) |\ yy" by (rule rbt_del_rbt_less) with 3 show ?case by (simp add: balance_left_rbt_sorted) next case ("4_2" x vaa vbb vdd vc yy ss bb) hence "Branch R vaa vbb vdd vc |\ yy" by simp hence "rbt_del x (Branch R vaa vbb vdd vc) |\ yy" by (rule rbt_del_rbt_less) with "4_2" show ?case by simp next case (5 x aa yy ss lta zz v rta) hence "yy \| Branch B lta zz v rta" by simp hence "yy \| rbt_del x (Branch B lta zz v rta)" by (rule rbt_del_rbt_greater) with 5 show ?case by (simp add: balance_right_rbt_sorted) next case ("6_2" x aa yy ss vaa vbb vdd vc) hence "yy \| Branch R vaa vbb vdd vc" by simp hence "yy \| rbt_del x (Branch R vaa vbb vdd vc)" by (rule rbt_del_rbt_greater) with "6_2" show ?case by simp qed (auto simp: combine_rbt_sorted) lemma "\rbt_sorted lt; rbt_sorted rt; lt |\ kt; kt \| rt; inv1 lt; inv1 rt; inv2 lt; inv2 rt; bheight lt = bheight rt; x < kt\ \ entry_in_tree k v (rbt_del_from_left x lt kt y rt) = (False \ (x \ k \ entry_in_tree k v (Branch c lt kt y rt)))" and "\rbt_sorted lt; rbt_sorted rt; lt |\ kt; kt \| rt; inv1 lt; inv1 rt; inv2 lt; inv2 rt; bheight lt = bheight rt; x > kt\ \ entry_in_tree k v (rbt_del_from_right x lt kt y rt) = (False \ (x \ k \ entry_in_tree k v (Branch c lt kt y rt)))" and rbt_del_in_tree: "\rbt_sorted t; inv1 t; inv2 t\ \ entry_in_tree k v (rbt_del x t) = (False \ (x \ k \ entry_in_tree k v t))" proof (induct x lt kt y rt and x lt kt y rt and x t rule: 
rbt_del_from_left_rbt_del_from_right_rbt_del.induct) case (2 xx c aa yy ss bb) have "xx = yy \ xx < yy \ xx > yy" by auto from this 2 show ?case proof (elim disjE) assume "xx = yy" with 2 show ?thesis proof (cases "xx = k") case True from 2 \xx = yy\ \xx = k\ have "rbt_sorted (Branch c aa yy ss bb) \ k = yy" by simp hence "\ entry_in_tree k v aa" "\ entry_in_tree k v bb" by (auto simp: rbt_less_nit rbt_greater_prop) with \xx = yy\ 2 \xx = k\ show ?thesis by (simp add: combine_in_tree) qed (simp add: combine_in_tree) qed simp+ next case (3 xx lta zz vv rta yy ss bb) define mt where [simp]: "mt = Branch B lta zz vv rta" from 3 have "inv2 mt \ inv1 mt" by simp hence "inv2 (rbt_del xx mt) \ (color_of mt = R \ bheight (rbt_del xx mt) = bheight mt \ inv1 (rbt_del xx mt) \ color_of mt = B \ bheight (rbt_del xx mt) = bheight mt - 1 \ inv1l (rbt_del xx mt))" by (blast dest: rbt_del_inv1_inv2) with 3 have 4: "entry_in_tree k v (rbt_del_from_left xx mt yy ss bb) = (False \ xx \ k \ entry_in_tree k v mt \ (k = yy \ v = ss) \ entry_in_tree k v bb)" by (simp add: balance_left_in_tree) thus ?case proof (cases "xx = k") case True from 3 True have "yy \| bb \ yy > k" by simp hence "k \| bb" by (blast dest: rbt_greater_trans) with 3 4 True show ?thesis by (auto simp: rbt_greater_nit) qed auto next case ("4_1" xx yy ss bb) show ?case proof (cases "xx = k") case True with "4_1" have "yy \| bb \ k < yy" by simp hence "k \| bb" by (blast dest: rbt_greater_trans) with "4_1" \xx = k\ have "entry_in_tree k v (Branch R Empty yy ss bb) = entry_in_tree k v Empty" by (auto simp: rbt_greater_nit) thus ?thesis by auto qed simp+ next case ("4_2" xx vaa vbb vdd vc yy ss bb) thus ?case proof (cases "xx = k") case True with "4_2" have "k < yy \ yy \| bb" by simp hence "k \| bb" by (blast dest: rbt_greater_trans) with True "4_2" show ?thesis by (auto simp: rbt_greater_nit) qed auto next case (5 xx aa yy ss lta zz vv rta) define mt where [simp]: "mt = Branch B lta zz vv rta" from 5 have "inv2 mt \ inv1 mt" by simp hence "inv2 (rbt_del xx mt) \ (color_of mt = R \ bheight (rbt_del xx mt) = bheight mt \ inv1 (rbt_del xx mt) \ color_of mt = B \ bheight (rbt_del xx mt) = bheight mt - 1 \ inv1l (rbt_del xx mt))" by (blast dest: rbt_del_inv1_inv2) with 5 have 3: "entry_in_tree k v (rbt_del_from_right xx aa yy ss mt) = (entry_in_tree k v aa \ (k = yy \ v = ss) \ False \ xx \ k \ entry_in_tree k v mt)" by (simp add: balance_right_in_tree) thus ?case proof (cases "xx = k") case True from 5 True have "aa |\ yy \ yy < k" by simp hence "aa |\ k" by (blast dest: rbt_less_trans) with 3 5 True show ?thesis by (auto simp: rbt_less_nit) qed auto next case ("6_1" xx aa yy ss) show ?case proof (cases "xx = k") case True with "6_1" have "aa |\ yy \ k > yy" by simp hence "aa |\ k" by (blast dest: rbt_less_trans) with "6_1" \xx = k\ show ?thesis by (auto simp: rbt_less_nit) qed simp next case ("6_2" xx aa yy ss vaa vbb vdd vc) thus ?case proof (cases "xx = k") case True with "6_2" have "k > yy \ aa |\ yy" by simp hence "aa |\ k" by (blast dest: rbt_less_trans) with True "6_2" show ?thesis by (auto simp: rbt_less_nit) qed auto qed simp definition (in ord) rbt_delete where "rbt_delete k t = paint B (rbt_del k t)" theorem rbt_delete_is_rbt [simp]: assumes "is_rbt t" shows "is_rbt (rbt_delete k t)" proof - from assms have "inv2 t" and "inv1 t" unfolding is_rbt_def by auto hence "inv2 (rbt_del k t) \ (color_of t = R \ bheight (rbt_del k t) = bheight t \ inv1 (rbt_del k t) \ color_of t = B \ bheight (rbt_del k t) = bheight t - 1 \ inv1l (rbt_del k t))" 
by (rule rbt_del_inv1_inv2) hence "inv2 (rbt_del k t) \ inv1l (rbt_del k t)" by (cases "color_of t") auto with assms show ?thesis unfolding is_rbt_def rbt_delete_def by (auto intro: paint_rbt_sorted rbt_del_rbt_sorted) qed lemma rbt_delete_in_tree: assumes "is_rbt t" shows "entry_in_tree k v (rbt_delete x t) = (x \ k \ entry_in_tree k v t)" using assms unfolding is_rbt_def rbt_delete_def by (auto simp: rbt_del_in_tree) lemma rbt_lookup_rbt_delete: assumes is_rbt: "is_rbt t" shows "rbt_lookup (rbt_delete k t) = (rbt_lookup t)|`(-{k})" proof fix x show "rbt_lookup (rbt_delete k t) x = (rbt_lookup t |` (-{k})) x" proof (cases "x = k") assume "x = k" with is_rbt show ?thesis by (cases "rbt_lookup (rbt_delete k t) k") (auto simp: rbt_lookup_in_tree rbt_delete_in_tree) next assume "x \ k" thus ?thesis by auto (metis is_rbt rbt_delete_is_rbt rbt_delete_in_tree is_rbt_rbt_sorted rbt_lookup_from_in_tree) qed qed end subsection \Modifying existing entries\ context ord begin primrec rbt_map_entry :: "'a \ ('b \ 'b) \ ('a, 'b) rbt \ ('a, 'b) rbt" where "rbt_map_entry k f Empty = Empty" | "rbt_map_entry k f (Branch c lt x v rt) = (if k < x then Branch c (rbt_map_entry k f lt) x v rt else if k > x then (Branch c lt x v (rbt_map_entry k f rt)) else Branch c lt x (f v) rt)" lemma rbt_map_entry_color_of: "color_of (rbt_map_entry k f t) = color_of t" by (induct t) simp+ lemma rbt_map_entry_inv1: "inv1 (rbt_map_entry k f t) = inv1 t" by (induct t) (simp add: rbt_map_entry_color_of)+ lemma rbt_map_entry_inv2: "inv2 (rbt_map_entry k f t) = inv2 t" "bheight (rbt_map_entry k f t) = bheight t" by (induct t) simp+ lemma rbt_map_entry_rbt_greater: "rbt_greater a (rbt_map_entry k f t) = rbt_greater a t" by (induct t) simp+ lemma rbt_map_entry_rbt_less: "rbt_less a (rbt_map_entry k f t) = rbt_less a t" by (induct t) simp+ lemma rbt_map_entry_rbt_sorted: "rbt_sorted (rbt_map_entry k f t) = rbt_sorted t" by (induct t) (simp_all add: rbt_map_entry_rbt_less rbt_map_entry_rbt_greater) theorem rbt_map_entry_is_rbt [simp]: "is_rbt (rbt_map_entry k f t) = is_rbt t" unfolding is_rbt_def by (simp add: rbt_map_entry_inv2 rbt_map_entry_color_of rbt_map_entry_rbt_sorted rbt_map_entry_inv1 ) end theorem (in linorder) rbt_lookup_rbt_map_entry: "rbt_lookup (rbt_map_entry k f t) = (rbt_lookup t)(k := map_option f (rbt_lookup t k))" by (induct t) (auto split: option.splits simp add: fun_eq_iff) subsection \Mapping all entries\ primrec map :: "('a \ 'b \ 'c) \ ('a, 'b) rbt \ ('a, 'c) rbt" where "map f Empty = Empty" | "map f (Branch c lt k v rt) = Branch c (map f lt) k (f k v) (map f rt)" lemma map_entries [simp]: "entries (map f t) = List.map (\(k, v). 
(k, f k v)) (entries t)" by (induct t) auto lemma map_keys [simp]: "keys (map f t) = keys t" by (simp add: keys_def split_def) lemma map_color_of: "color_of (map f t) = color_of t" by (induct t) simp+ lemma map_inv1: "inv1 (map f t) = inv1 t" by (induct t) (simp add: map_color_of)+ lemma map_inv2: "inv2 (map f t) = inv2 t" "bheight (map f t) = bheight t" by (induct t) simp+ context ord begin lemma map_rbt_greater: "rbt_greater k (map f t) = rbt_greater k t" by (induct t) simp+ lemma map_rbt_less: "rbt_less k (map f t) = rbt_less k t" by (induct t) simp+ lemma map_rbt_sorted: "rbt_sorted (map f t) = rbt_sorted t" by (induct t) (simp add: map_rbt_less map_rbt_greater)+ theorem map_is_rbt [simp]: "is_rbt (map f t) = is_rbt t" unfolding is_rbt_def by (simp add: map_inv1 map_inv2 map_rbt_sorted map_color_of) end theorem (in linorder) rbt_lookup_map: "rbt_lookup (map f t) x = map_option (f x) (rbt_lookup t x)" - apply(induct t) - apply auto - apply(rename_tac a b c, subgoal_tac "x = a") - apply auto - done + by (induct t) (auto simp: antisym_conv3) (* FIXME: simproc "antisym less" does not work for linorder context, only for linorder type class by (induct t) auto *) hide_const (open) map subsection \Folding over entries\ definition fold :: "('a \ 'b \ 'c \ 'c) \ ('a, 'b) rbt \ 'c \ 'c" where "fold f t = List.fold (case_prod f) (entries t)" lemma fold_simps [simp]: "fold f Empty = id" "fold f (Branch c lt k v rt) = fold f rt \ f k v \ fold f lt" by (simp_all add: fold_def fun_eq_iff) lemma fold_code [code]: "fold f Empty x = x" "fold f (Branch c lt k v rt) x = fold f rt (f k v (fold f lt x))" by(simp_all) \ \fold with continuation predicate\ fun foldi :: "('c \ bool) \ ('a \ 'b \ 'c \ 'c) \ ('a :: linorder, 'b) rbt \ 'c \ 'c" where "foldi c f Empty s = s" | "foldi c f (Branch col l k v r) s = ( if (c s) then let s' = foldi c f l s in if (c s') then foldi c f r (f k v s') else s' else s )" subsection \Bulkloading a tree\ definition (in ord) rbt_bulkload :: "('a \ 'b) list \ ('a, 'b) rbt" where "rbt_bulkload xs = foldr (\(k, v). rbt_insert k v) xs Empty" context linorder begin lemma rbt_bulkload_is_rbt [simp, intro]: "is_rbt (rbt_bulkload xs)" unfolding rbt_bulkload_def by (induct xs) auto lemma rbt_lookup_rbt_bulkload: "rbt_lookup (rbt_bulkload xs) = map_of xs" proof - obtain ys where "ys = rev xs" by simp have "\t. is_rbt t \ rbt_lookup (List.fold (case_prod rbt_insert) ys t) = rbt_lookup t ++ map_of (rev ys)" by (induct ys) (simp_all add: rbt_bulkload_def rbt_lookup_rbt_insert case_prod_beta) from this Empty_is_rbt have "rbt_lookup (List.fold (case_prod rbt_insert) (rev xs) Empty) = rbt_lookup Empty ++ map_of xs" by (simp add: \ys = rev xs\) then show ?thesis by (simp add: rbt_bulkload_def rbt_lookup_Empty foldr_conv_fold) qed end subsection \Building a RBT from a sorted list\ text \ These functions have been adapted from Andrew W. 
Appel, Efficient Verified Red-Black Trees (September 2011) \ fun rbtreeify_f :: "nat \ ('a \ 'b) list \ ('a, 'b) rbt \ ('a \ 'b) list" and rbtreeify_g :: "nat \ ('a \ 'b) list \ ('a, 'b) rbt \ ('a \ 'b) list" where "rbtreeify_f n kvs = (if n = 0 then (Empty, kvs) else if n = 1 then case kvs of (k, v) # kvs' \ (Branch R Empty k v Empty, kvs') else if (n mod 2 = 0) then case rbtreeify_f (n div 2) kvs of (t1, (k, v) # kvs') \ apfst (Branch B t1 k v) (rbtreeify_g (n div 2) kvs') else case rbtreeify_f (n div 2) kvs of (t1, (k, v) # kvs') \ apfst (Branch B t1 k v) (rbtreeify_f (n div 2) kvs'))" | "rbtreeify_g n kvs = (if n = 0 \ n = 1 then (Empty, kvs) else if n mod 2 = 0 then case rbtreeify_g (n div 2) kvs of (t1, (k, v) # kvs') \ apfst (Branch B t1 k v) (rbtreeify_g (n div 2) kvs') else case rbtreeify_f (n div 2) kvs of (t1, (k, v) # kvs') \ apfst (Branch B t1 k v) (rbtreeify_g (n div 2) kvs'))" definition rbtreeify :: "('a \ 'b) list \ ('a, 'b) rbt" where "rbtreeify kvs = fst (rbtreeify_g (Suc (length kvs)) kvs)" declare rbtreeify_f.simps [simp del] rbtreeify_g.simps [simp del] lemma rbtreeify_f_code [code]: "rbtreeify_f n kvs = (if n = 0 then (Empty, kvs) else if n = 1 then case kvs of (k, v) # kvs' \ (Branch R Empty k v Empty, kvs') else let (n', r) = Divides.divmod_nat n 2 in if r = 0 then case rbtreeify_f n' kvs of (t1, (k, v) # kvs') \ apfst (Branch B t1 k v) (rbtreeify_g n' kvs') else case rbtreeify_f n' kvs of (t1, (k, v) # kvs') \ apfst (Branch B t1 k v) (rbtreeify_f n' kvs'))" by (subst rbtreeify_f.simps) (simp only: Let_def divmod_nat_def prod.case) lemma rbtreeify_g_code [code]: "rbtreeify_g n kvs = (if n = 0 \ n = 1 then (Empty, kvs) else let (n', r) = Divides.divmod_nat n 2 in if r = 0 then case rbtreeify_g n' kvs of (t1, (k, v) # kvs') \ apfst (Branch B t1 k v) (rbtreeify_g n' kvs') else case rbtreeify_f n' kvs of (t1, (k, v) # kvs') \ apfst (Branch B t1 k v) (rbtreeify_g n' kvs'))" by(subst rbtreeify_g.simps)(simp only: Let_def divmod_nat_def prod.case) lemma Suc_double_half: "Suc (2 * n) div 2 = n" by simp lemma div2_plus_div2: "n div 2 + n div 2 = (n :: nat) - n mod 2" by arith lemma rbtreeify_f_rec_aux_lemma: "\k - n div 2 = Suc k'; n \ k; n mod 2 = Suc 0\ \ k' - n div 2 = k - n" apply(rule add_right_imp_eq[where a = "n - n div 2"]) apply(subst add_diff_assoc2, arith) apply(simp add: div2_plus_div2) done lemma rbtreeify_f_simps: "rbtreeify_f 0 kvs = (Empty, kvs)" "rbtreeify_f (Suc 0) ((k, v) # kvs) = (Branch R Empty k v Empty, kvs)" "0 < n \ rbtreeify_f (2 * n) kvs = (case rbtreeify_f n kvs of (t1, (k, v) # kvs') \ apfst (Branch B t1 k v) (rbtreeify_g n kvs'))" "0 < n \ rbtreeify_f (Suc (2 * n)) kvs = (case rbtreeify_f n kvs of (t1, (k, v) # kvs') \ apfst (Branch B t1 k v) (rbtreeify_f n kvs'))" by(subst (1) rbtreeify_f.simps, simp add: Suc_double_half)+ lemma rbtreeify_g_simps: "rbtreeify_g 0 kvs = (Empty, kvs)" "rbtreeify_g (Suc 0) kvs = (Empty, kvs)" "0 < n \ rbtreeify_g (2 * n) kvs = (case rbtreeify_g n kvs of (t1, (k, v) # kvs') \ apfst (Branch B t1 k v) (rbtreeify_g n kvs'))" "0 < n \ rbtreeify_g (Suc (2 * n)) kvs = (case rbtreeify_f n kvs of (t1, (k, v) # kvs') \ apfst (Branch B t1 k v) (rbtreeify_g n kvs'))" by(subst (1) rbtreeify_g.simps, simp add: Suc_double_half)+ declare rbtreeify_f_simps[simp] rbtreeify_g_simps[simp] lemma length_rbtreeify_f: "n \ length kvs \ length (snd (rbtreeify_f n kvs)) = length kvs - n" and length_rbtreeify_g:"\ 0 < n; n \ Suc (length kvs) \ \ length (snd (rbtreeify_g n kvs)) = Suc (length kvs) - n" proof(induction n kvs and n kvs rule: 
rbtreeify_f_rbtreeify_g.induct) case (1 n kvs) show ?case proof(cases "n \ 1") case True thus ?thesis using "1.prems" by(cases n kvs rule: nat.exhaust[case_product list.exhaust]) auto next case False hence "n \ 0" "n \ 1" by simp_all note IH = "1.IH"[OF this] show ?thesis proof(cases "n mod 2 = 0") case True hence "length (snd (rbtreeify_f n kvs)) = length (snd (rbtreeify_f (2 * (n div 2)) kvs))" by(metis minus_nat.diff_0 minus_mod_eq_mult_div [symmetric]) also from "1.prems" False obtain k v kvs' where kvs: "kvs = (k, v) # kvs'" by(cases kvs) auto also have "0 < n div 2" using False by(simp) note rbtreeify_f_simps(3)[OF this] also note kvs[symmetric] also let ?rest1 = "snd (rbtreeify_f (n div 2) kvs)" from "1.prems" have "n div 2 \ length kvs" by simp with True have len: "length ?rest1 = length kvs - n div 2" by(rule IH) with "1.prems" False obtain t1 k' v' kvs'' where kvs'': "rbtreeify_f (n div 2) kvs = (t1, (k', v') # kvs'')" by(cases ?rest1)(auto simp add: snd_def split: prod.split_asm) note this also note prod.case also note list.simps(5) also note prod.case also note snd_apfst also have "0 < n div 2" "n div 2 \ Suc (length kvs'')" using len "1.prems" False unfolding kvs'' by simp_all with True kvs''[symmetric] refl refl have "length (snd (rbtreeify_g (n div 2) kvs'')) = Suc (length kvs'') - n div 2" by(rule IH) finally show ?thesis using len[unfolded kvs''] "1.prems" True by(simp add: Suc_diff_le[symmetric] mult_2[symmetric] minus_mod_eq_mult_div [symmetric]) next case False hence "length (snd (rbtreeify_f n kvs)) = length (snd (rbtreeify_f (Suc (2 * (n div 2))) kvs))" by (simp add: mod_eq_0_iff_dvd) also from "1.prems" \\ n \ 1\ obtain k v kvs' where kvs: "kvs = (k, v) # kvs'" by(cases kvs) auto also have "0 < n div 2" using \\ n \ 1\ by(simp) note rbtreeify_f_simps(4)[OF this] also note kvs[symmetric] also let ?rest1 = "snd (rbtreeify_f (n div 2) kvs)" from "1.prems" have "n div 2 \ length kvs" by simp with False have len: "length ?rest1 = length kvs - n div 2" by(rule IH) with "1.prems" \\ n \ 1\ obtain t1 k' v' kvs'' where kvs'': "rbtreeify_f (n div 2) kvs = (t1, (k', v') # kvs'')" by(cases ?rest1)(auto simp add: snd_def split: prod.split_asm) note this also note prod.case also note list.simps(5) also note prod.case also note snd_apfst also have "n div 2 \ length kvs''" using len "1.prems" False unfolding kvs'' by simp arith with False kvs''[symmetric] refl refl have "length (snd (rbtreeify_f (n div 2) kvs'')) = length kvs'' - n div 2" by(rule IH) finally show ?thesis using len[unfolded kvs''] "1.prems" False by simp(rule rbtreeify_f_rec_aux_lemma[OF sym]) qed qed next case (2 n kvs) show ?case proof(cases "n > 1") case False with \0 < n\ show ?thesis by(cases n kvs rule: nat.exhaust[case_product list.exhaust]) simp_all next case True hence "\ (n = 0 \ n = 1)" by simp note IH = "2.IH"[OF this] show ?thesis proof(cases "n mod 2 = 0") case True hence "length (snd (rbtreeify_g n kvs)) = length (snd (rbtreeify_g (2 * (n div 2)) kvs))" by(metis minus_nat.diff_0 minus_mod_eq_mult_div [symmetric]) also from "2.prems" True obtain k v kvs' where kvs: "kvs = (k, v) # kvs'" by(cases kvs) auto also have "0 < n div 2" using \1 < n\ by(simp) note rbtreeify_g_simps(3)[OF this] also note kvs[symmetric] also let ?rest1 = "snd (rbtreeify_g (n div 2) kvs)" from "2.prems" \1 < n\ have "0 < n div 2" "n div 2 \ Suc (length kvs)" by simp_all with True have len: "length ?rest1 = Suc (length kvs) - n div 2" by(rule IH) with "2.prems" obtain t1 k' v' kvs'' where kvs'': "rbtreeify_g (n div 2) kvs = (t1, 
(k', v') # kvs'')" by(cases ?rest1)(auto simp add: snd_def split: prod.split_asm) note this also note prod.case also note list.simps(5) also note prod.case also note snd_apfst also have "n div 2 \ Suc (length kvs'')" using len "2.prems" unfolding kvs'' by simp with True kvs''[symmetric] refl refl \0 < n div 2\ have "length (snd (rbtreeify_g (n div 2) kvs'')) = Suc (length kvs'') - n div 2" by(rule IH) finally show ?thesis using len[unfolded kvs''] "2.prems" True by(simp add: Suc_diff_le[symmetric] mult_2[symmetric] minus_mod_eq_mult_div [symmetric]) next case False hence "length (snd (rbtreeify_g n kvs)) = length (snd (rbtreeify_g (Suc (2 * (n div 2))) kvs))" by (simp add: mod_eq_0_iff_dvd) also from "2.prems" \1 < n\ obtain k v kvs' where kvs: "kvs = (k, v) # kvs'" by(cases kvs) auto also have "0 < n div 2" using \1 < n\ by(simp) note rbtreeify_g_simps(4)[OF this] also note kvs[symmetric] also let ?rest1 = "snd (rbtreeify_f (n div 2) kvs)" from "2.prems" have "n div 2 \ length kvs" by simp with False have len: "length ?rest1 = length kvs - n div 2" by(rule IH) with "2.prems" \1 < n\ False obtain t1 k' v' kvs'' where kvs'': "rbtreeify_f (n div 2) kvs = (t1, (k', v') # kvs'')" by(cases ?rest1)(auto simp add: snd_def split: prod.split_asm, arith) note this also note prod.case also note list.simps(5) also note prod.case also note snd_apfst also have "n div 2 \ Suc (length kvs'')" using len "2.prems" False unfolding kvs'' by simp arith with False kvs''[symmetric] refl refl \0 < n div 2\ have "length (snd (rbtreeify_g (n div 2) kvs'')) = Suc (length kvs'') - n div 2" by(rule IH) finally show ?thesis using len[unfolded kvs''] "2.prems" False by(simp add: div2_plus_div2) qed qed qed lemma rbtreeify_induct [consumes 1, case_names f_0 f_1 f_even f_odd g_0 g_1 g_even g_odd]: fixes P Q defines "f0 == (\kvs. P 0 kvs)" and "f1 == (\k v kvs. P (Suc 0) ((k, v) # kvs))" and "feven == (\n kvs t k v kvs'. \ n > 0; n \ length kvs; P n kvs; rbtreeify_f n kvs = (t, (k, v) # kvs'); n \ Suc (length kvs'); Q n kvs' \ \ P (2 * n) kvs)" and "fodd == (\n kvs t k v kvs'. \ n > 0; n \ length kvs; P n kvs; rbtreeify_f n kvs = (t, (k, v) # kvs'); n \ length kvs'; P n kvs' \ \ P (Suc (2 * n)) kvs)" and "g0 == (\kvs. Q 0 kvs)" and "g1 == (\kvs. Q (Suc 0) kvs)" and "geven == (\n kvs t k v kvs'. \ n > 0; n \ Suc (length kvs); Q n kvs; rbtreeify_g n kvs = (t, (k, v) # kvs'); n \ Suc (length kvs'); Q n kvs' \ \ Q (2 * n) kvs)" and "godd == (\n kvs t k v kvs'. 
\ n > 0; n \ length kvs; P n kvs; rbtreeify_f n kvs = (t, (k, v) # kvs'); n \ Suc (length kvs'); Q n kvs' \ \ Q (Suc (2 * n)) kvs)" shows "\ n \ length kvs; PROP f0; PROP f1; PROP feven; PROP fodd; PROP g0; PROP g1; PROP geven; PROP godd \ \ P n kvs" and "\ n \ Suc (length kvs); PROP f0; PROP f1; PROP feven; PROP fodd; PROP g0; PROP g1; PROP geven; PROP godd \ \ Q n kvs" proof - assume f0: "PROP f0" and f1: "PROP f1" and feven: "PROP feven" and fodd: "PROP fodd" and g0: "PROP g0" and g1: "PROP g1" and geven: "PROP geven" and godd: "PROP godd" show "n \ length kvs \ P n kvs" and "n \ Suc (length kvs) \ Q n kvs" proof(induction rule: rbtreeify_f_rbtreeify_g.induct) case (1 n kvs) show ?case proof(cases "n \ 1") case True thus ?thesis using "1.prems" by(cases n kvs rule: nat.exhaust[case_product list.exhaust]) (auto simp add: f0[unfolded f0_def] f1[unfolded f1_def]) next case False hence ns: "n \ 0" "n \ 1" by simp_all hence ge0: "n div 2 > 0" by simp note IH = "1.IH"[OF ns] show ?thesis proof(cases "n mod 2 = 0") case True note ge0 moreover from "1.prems" have n2: "n div 2 \ length kvs" by simp moreover from True n2 have "P (n div 2) kvs" by(rule IH) moreover from length_rbtreeify_f[OF n2] ge0 "1.prems" obtain t k v kvs' where kvs': "rbtreeify_f (n div 2) kvs = (t, (k, v) # kvs')" by(cases "snd (rbtreeify_f (n div 2) kvs)") (auto simp add: snd_def split: prod.split_asm) moreover from "1.prems" length_rbtreeify_f[OF n2] ge0 have n2': "n div 2 \ Suc (length kvs')" by(simp add: kvs') moreover from True kvs'[symmetric] refl refl n2' have "Q (n div 2) kvs'" by(rule IH) moreover note feven[unfolded feven_def] (* FIXME: why does by(rule feven[unfolded feven_def]) not work? *) ultimately have "P (2 * (n div 2)) kvs" by - thus ?thesis using True by (metis minus_mod_eq_div_mult [symmetric] minus_nat.diff_0 mult.commute) next case False note ge0 moreover from "1.prems" have n2: "n div 2 \ length kvs" by simp moreover from False n2 have "P (n div 2) kvs" by(rule IH) moreover from length_rbtreeify_f[OF n2] ge0 "1.prems" obtain t k v kvs' where kvs': "rbtreeify_f (n div 2) kvs = (t, (k, v) # kvs')" by(cases "snd (rbtreeify_f (n div 2) kvs)") (auto simp add: snd_def split: prod.split_asm) moreover from "1.prems" length_rbtreeify_f[OF n2] ge0 False have n2': "n div 2 \ length kvs'" by(simp add: kvs') arith moreover from False kvs'[symmetric] refl refl n2' have "P (n div 2) kvs'" by(rule IH) moreover note fodd[unfolded fodd_def] ultimately have "P (Suc (2 * (n div 2))) kvs" by - thus ?thesis using False by simp (metis One_nat_def Suc_eq_plus1_left le_add_diff_inverse mod_less_eq_dividend minus_mod_eq_mult_div [symmetric]) qed qed next case (2 n kvs) show ?case proof(cases "n \ 1") case True thus ?thesis using "2.prems" by(cases n kvs rule: nat.exhaust[case_product list.exhaust]) (auto simp add: g0[unfolded g0_def] g1[unfolded g1_def]) next case False hence ns: "\ (n = 0 \ n = 1)" by simp hence ge0: "n div 2 > 0" by simp note IH = "2.IH"[OF ns] show ?thesis proof(cases "n mod 2 = 0") case True note ge0 moreover from "2.prems" have n2: "n div 2 \ Suc (length kvs)" by simp moreover from True n2 have "Q (n div 2) kvs" by(rule IH) moreover from length_rbtreeify_g[OF ge0 n2] ge0 "2.prems" obtain t k v kvs' where kvs': "rbtreeify_g (n div 2) kvs = (t, (k, v) # kvs')" by(cases "snd (rbtreeify_g (n div 2) kvs)") (auto simp add: snd_def split: prod.split_asm) moreover from "2.prems" length_rbtreeify_g[OF ge0 n2] ge0 have n2': "n div 2 \ Suc (length kvs')" by(simp add: kvs') moreover from True kvs'[symmetric] refl refl 
n2' have "Q (n div 2) kvs'" by(rule IH) moreover note geven[unfolded geven_def] ultimately have "Q (2 * (n div 2)) kvs" by - thus ?thesis using True by(metis minus_mod_eq_div_mult [symmetric] minus_nat.diff_0 mult.commute) next case False note ge0 moreover from "2.prems" have n2: "n div 2 \ length kvs" by simp moreover from False n2 have "P (n div 2) kvs" by(rule IH) moreover from length_rbtreeify_f[OF n2] ge0 "2.prems" False obtain t k v kvs' where kvs': "rbtreeify_f (n div 2) kvs = (t, (k, v) # kvs')" by(cases "snd (rbtreeify_f (n div 2) kvs)") (auto simp add: snd_def split: prod.split_asm, arith) moreover from "2.prems" length_rbtreeify_f[OF n2] ge0 False have n2': "n div 2 \ Suc (length kvs')" by(simp add: kvs') arith moreover from False kvs'[symmetric] refl refl n2' have "Q (n div 2) kvs'" by(rule IH) moreover note godd[unfolded godd_def] ultimately have "Q (Suc (2 * (n div 2))) kvs" by - thus ?thesis using False by simp (metis One_nat_def Suc_eq_plus1_left le_add_diff_inverse mod_less_eq_dividend minus_mod_eq_mult_div [symmetric]) qed qed qed qed lemma inv1_rbtreeify_f: "n \ length kvs \ inv1 (fst (rbtreeify_f n kvs))" and inv1_rbtreeify_g: "n \ Suc (length kvs) \ inv1 (fst (rbtreeify_g n kvs))" by(induct n kvs and n kvs rule: rbtreeify_induct) simp_all fun plog2 :: "nat \ nat" where "plog2 n = (if n \ 1 then 0 else plog2 (n div 2) + 1)" declare plog2.simps [simp del] lemma plog2_simps [simp]: "plog2 0 = 0" "plog2 (Suc 0) = 0" "0 < n \ plog2 (2 * n) = 1 + plog2 n" "0 < n \ plog2 (Suc (2 * n)) = 1 + plog2 n" by(subst plog2.simps, simp add: Suc_double_half)+ lemma bheight_rbtreeify_f: "n \ length kvs \ bheight (fst (rbtreeify_f n kvs)) = plog2 n" and bheight_rbtreeify_g: "n \ Suc (length kvs) \ bheight (fst (rbtreeify_g n kvs)) = plog2 n" by(induct n kvs and n kvs rule: rbtreeify_induct) simp_all lemma bheight_rbtreeify_f_eq_plog2I: "\ rbtreeify_f n kvs = (t, kvs'); n \ length kvs \ \ bheight t = plog2 n" using bheight_rbtreeify_f[of n kvs] by simp lemma bheight_rbtreeify_g_eq_plog2I: "\ rbtreeify_g n kvs = (t, kvs'); n \ Suc (length kvs) \ \ bheight t = plog2 n" using bheight_rbtreeify_g[of n kvs] by simp hide_const (open) plog2 lemma inv2_rbtreeify_f: "n \ length kvs \ inv2 (fst (rbtreeify_f n kvs))" and inv2_rbtreeify_g: "n \ Suc (length kvs) \ inv2 (fst (rbtreeify_g n kvs))" by(induct n kvs and n kvs rule: rbtreeify_induct) (auto simp add: bheight_rbtreeify_f bheight_rbtreeify_g intro: bheight_rbtreeify_f_eq_plog2I bheight_rbtreeify_g_eq_plog2I) lemma "n \ length kvs \ True" and color_of_rbtreeify_g: "\ n \ Suc (length kvs); 0 < n \ \ color_of (fst (rbtreeify_g n kvs)) = B" by(induct n kvs and n kvs rule: rbtreeify_induct) simp_all lemma entries_rbtreeify_f_append: "n \ length kvs \ entries (fst (rbtreeify_f n kvs)) @ snd (rbtreeify_f n kvs) = kvs" and entries_rbtreeify_g_append: "n \ Suc (length kvs) \ entries (fst (rbtreeify_g n kvs)) @ snd (rbtreeify_g n kvs) = kvs" by(induction rule: rbtreeify_induct) simp_all lemma length_entries_rbtreeify_f: "n \ length kvs \ length (entries (fst (rbtreeify_f n kvs))) = n" and length_entries_rbtreeify_g: "n \ Suc (length kvs) \ length (entries (fst (rbtreeify_g n kvs))) = n - 1" by(induct rule: rbtreeify_induct) simp_all lemma rbtreeify_f_conv_drop: "n \ length kvs \ snd (rbtreeify_f n kvs) = drop n kvs" using entries_rbtreeify_f_append[of n kvs] by(simp add: append_eq_conv_conj length_entries_rbtreeify_f) lemma rbtreeify_g_conv_drop: "n \ Suc (length kvs) \ snd (rbtreeify_g n kvs) = drop (n - 1) kvs" using entries_rbtreeify_g_append[of n 
kvs] by(simp add: append_eq_conv_conj length_entries_rbtreeify_g) lemma entries_rbtreeify_f [simp]: "n \ length kvs \ entries (fst (rbtreeify_f n kvs)) = take n kvs" using entries_rbtreeify_f_append[of n kvs] by(simp add: append_eq_conv_conj length_entries_rbtreeify_f) lemma entries_rbtreeify_g [simp]: "n \ Suc (length kvs) \ entries (fst (rbtreeify_g n kvs)) = take (n - 1) kvs" using entries_rbtreeify_g_append[of n kvs] by(simp add: append_eq_conv_conj length_entries_rbtreeify_g) lemma keys_rbtreeify_f [simp]: "n \ length kvs \ keys (fst (rbtreeify_f n kvs)) = take n (map fst kvs)" by(simp add: keys_def take_map) lemma keys_rbtreeify_g [simp]: "n \ Suc (length kvs) \ keys (fst (rbtreeify_g n kvs)) = take (n - 1) (map fst kvs)" by(simp add: keys_def take_map) lemma rbtreeify_fD: "\ rbtreeify_f n kvs = (t, kvs'); n \ length kvs \ \ entries t = take n kvs \ kvs' = drop n kvs" using rbtreeify_f_conv_drop[of n kvs] entries_rbtreeify_f[of n kvs] by simp lemma rbtreeify_gD: "\ rbtreeify_g n kvs = (t, kvs'); n \ Suc (length kvs) \ \ entries t = take (n - 1) kvs \ kvs' = drop (n - 1) kvs" using rbtreeify_g_conv_drop[of n kvs] entries_rbtreeify_g[of n kvs] by simp lemma entries_rbtreeify [simp]: "entries (rbtreeify kvs) = kvs" by(simp add: rbtreeify_def entries_rbtreeify_g) context linorder begin lemma rbt_sorted_rbtreeify_f: "\ n \ length kvs; sorted (map fst kvs); distinct (map fst kvs) \ \ rbt_sorted (fst (rbtreeify_f n kvs))" and rbt_sorted_rbtreeify_g: "\ n \ Suc (length kvs); sorted (map fst kvs); distinct (map fst kvs) \ \ rbt_sorted (fst (rbtreeify_g n kvs))" proof(induction n kvs and n kvs rule: rbtreeify_induct) case (f_even n kvs t k v kvs') from rbtreeify_fD[OF \rbtreeify_f n kvs = (t, (k, v) # kvs')\ \n \ length kvs\] have "entries t = take n kvs" and kvs': "drop n kvs = (k, v) # kvs'" by simp_all hence unfold: "kvs = take n kvs @ (k, v) # kvs'" by(metis append_take_drop_id) from \sorted (map fst kvs)\ kvs' have "(\(x, y) \ set (take n kvs). x \ k) \ (\(x, y) \ set kvs'. k \ x)" by(subst (asm) unfold)(auto simp add: sorted_append) moreover from \distinct (map fst kvs)\ kvs' have "(\(x, y) \ set (take n kvs). x \ k) \ (\(x, y) \ set kvs'. x \ k)" by(subst (asm) unfold)(auto intro: rev_image_eqI) ultimately have "(\(x, y) \ set (take n kvs). x < k) \ (\(x, y) \ set kvs'. k < x)" by fastforce hence "fst (rbtreeify_f n kvs) |\ k" "k \| fst (rbtreeify_g n kvs')" using \n \ Suc (length kvs')\ \n \ length kvs\ set_take_subset[of "n - 1" kvs'] by(auto simp add: ord.rbt_greater_prop ord.rbt_less_prop take_map split_def) moreover from \sorted (map fst kvs)\ \distinct (map fst kvs)\ have "rbt_sorted (fst (rbtreeify_f n kvs))" by(rule f_even.IH) moreover have "sorted (map fst kvs')" "distinct (map fst kvs')" using \sorted (map fst kvs)\ \distinct (map fst kvs)\ by(subst (asm) (1 2) unfold, simp add: sorted_append)+ hence "rbt_sorted (fst (rbtreeify_g n kvs'))" by(rule f_even.IH) ultimately show ?case using \0 < n\ \rbtreeify_f n kvs = (t, (k, v) # kvs')\ by simp next case (f_odd n kvs t k v kvs') from rbtreeify_fD[OF \rbtreeify_f n kvs = (t, (k, v) # kvs')\ \n \ length kvs\] have "entries t = take n kvs" and kvs': "drop n kvs = (k, v) # kvs'" by simp_all hence unfold: "kvs = take n kvs @ (k, v) # kvs'" by(metis append_take_drop_id) from \sorted (map fst kvs)\ kvs' have "(\(x, y) \ set (take n kvs). x \ k) \ (\(x, y) \ set kvs'. k \ x)" by(subst (asm) unfold)(auto simp add: sorted_append) moreover from \distinct (map fst kvs)\ kvs' have "(\(x, y) \ set (take n kvs). x \ k) \ (\(x, y) \ set kvs'. 
x \ k)" by(subst (asm) unfold)(auto intro: rev_image_eqI) ultimately have "(\(x, y) \ set (take n kvs). x < k) \ (\(x, y) \ set kvs'. k < x)" by fastforce hence "fst (rbtreeify_f n kvs) |\ k" "k \| fst (rbtreeify_f n kvs')" using \n \ length kvs'\ \n \ length kvs\ set_take_subset[of n kvs'] by(auto simp add: rbt_greater_prop rbt_less_prop take_map split_def) moreover from \sorted (map fst kvs)\ \distinct (map fst kvs)\ have "rbt_sorted (fst (rbtreeify_f n kvs))" by(rule f_odd.IH) moreover have "sorted (map fst kvs')" "distinct (map fst kvs')" using \sorted (map fst kvs)\ \distinct (map fst kvs)\ by(subst (asm) (1 2) unfold, simp add: sorted_append)+ hence "rbt_sorted (fst (rbtreeify_f n kvs'))" by(rule f_odd.IH) ultimately show ?case using \0 < n\ \rbtreeify_f n kvs = (t, (k, v) # kvs')\ by simp next case (g_even n kvs t k v kvs') from rbtreeify_gD[OF \rbtreeify_g n kvs = (t, (k, v) # kvs')\ \n \ Suc (length kvs)\] have t: "entries t = take (n - 1) kvs" and kvs': "drop (n - 1) kvs = (k, v) # kvs'" by simp_all hence unfold: "kvs = take (n - 1) kvs @ (k, v) # kvs'" by(metis append_take_drop_id) from \sorted (map fst kvs)\ kvs' have "(\(x, y) \ set (take (n - 1) kvs). x \ k) \ (\(x, y) \ set kvs'. k \ x)" by(subst (asm) unfold)(auto simp add: sorted_append) moreover from \distinct (map fst kvs)\ kvs' have "(\(x, y) \ set (take (n - 1) kvs). x \ k) \ (\(x, y) \ set kvs'. x \ k)" by(subst (asm) unfold)(auto intro: rev_image_eqI) ultimately have "(\(x, y) \ set (take (n - 1) kvs). x < k) \ (\(x, y) \ set kvs'. k < x)" by fastforce hence "fst (rbtreeify_g n kvs) |\ k" "k \| fst (rbtreeify_g n kvs')" using \n \ Suc (length kvs')\ \n \ Suc (length kvs)\ set_take_subset[of "n - 1" kvs'] by(auto simp add: rbt_greater_prop rbt_less_prop take_map split_def) moreover from \sorted (map fst kvs)\ \distinct (map fst kvs)\ have "rbt_sorted (fst (rbtreeify_g n kvs))" by(rule g_even.IH) moreover have "sorted (map fst kvs')" "distinct (map fst kvs')" using \sorted (map fst kvs)\ \distinct (map fst kvs)\ by(subst (asm) (1 2) unfold, simp add: sorted_append)+ hence "rbt_sorted (fst (rbtreeify_g n kvs'))" by(rule g_even.IH) ultimately show ?case using \0 < n\ \rbtreeify_g n kvs = (t, (k, v) # kvs')\ by simp next case (g_odd n kvs t k v kvs') from rbtreeify_fD[OF \rbtreeify_f n kvs = (t, (k, v) # kvs')\ \n \ length kvs\] have "entries t = take n kvs" and kvs': "drop n kvs = (k, v) # kvs'" by simp_all hence unfold: "kvs = take n kvs @ (k, v) # kvs'" by(metis append_take_drop_id) from \sorted (map fst kvs)\ kvs' have "(\(x, y) \ set (take n kvs). x \ k) \ (\(x, y) \ set kvs'. k \ x)" by(subst (asm) unfold)(auto simp add: sorted_append) moreover from \distinct (map fst kvs)\ kvs' have "(\(x, y) \ set (take n kvs). x \ k) \ (\(x, y) \ set kvs'. x \ k)" by(subst (asm) unfold)(auto intro: rev_image_eqI) ultimately have "(\(x, y) \ set (take n kvs). x < k) \ (\(x, y) \ set kvs'. 
k < x)" by fastforce hence "fst (rbtreeify_f n kvs) |\ k" "k \| fst (rbtreeify_g n kvs')" using \n \ Suc (length kvs')\ \n \ length kvs\ set_take_subset[of "n - 1" kvs'] by(auto simp add: rbt_greater_prop rbt_less_prop take_map split_def) moreover from \sorted (map fst kvs)\ \distinct (map fst kvs)\ have "rbt_sorted (fst (rbtreeify_f n kvs))" by(rule g_odd.IH) moreover have "sorted (map fst kvs')" "distinct (map fst kvs')" using \sorted (map fst kvs)\ \distinct (map fst kvs)\ by(subst (asm) (1 2) unfold, simp add: sorted_append)+ hence "rbt_sorted (fst (rbtreeify_g n kvs'))" by(rule g_odd.IH) ultimately show ?case using \0 < n\ \rbtreeify_f n kvs = (t, (k, v) # kvs')\ by simp qed simp_all lemma rbt_sorted_rbtreeify: "\ sorted (map fst kvs); distinct (map fst kvs) \ \ rbt_sorted (rbtreeify kvs)" by(simp add: rbtreeify_def rbt_sorted_rbtreeify_g) lemma is_rbt_rbtreeify: "\ sorted (map fst kvs); distinct (map fst kvs) \ \ is_rbt (rbtreeify kvs)" by(simp add: is_rbt_def rbtreeify_def inv1_rbtreeify_g inv2_rbtreeify_g rbt_sorted_rbtreeify_g color_of_rbtreeify_g) lemma rbt_lookup_rbtreeify: "\ sorted (map fst kvs); distinct (map fst kvs) \ \ rbt_lookup (rbtreeify kvs) = map_of kvs" by(simp add: map_of_entries[symmetric] rbt_sorted_rbtreeify) end text \ Functions to compare the height of two rbt trees, taken from Andrew W. Appel, Efficient Verified Red-Black Trees (September 2011) \ fun skip_red :: "('a, 'b) rbt \ ('a, 'b) rbt" where "skip_red (Branch color.R l k v r) = l" | "skip_red t = t" definition skip_black :: "('a, 'b) rbt \ ('a, 'b) rbt" where "skip_black t = (let t' = skip_red t in case t' of Branch color.B l k v r \ l | _ \ t')" datatype compare = LT | GT | EQ partial_function (tailrec) compare_height :: "('a, 'b) rbt \ ('a, 'b) rbt \ ('a, 'b) rbt \ ('a, 'b) rbt \ compare" where "compare_height sx s t tx = (case (skip_red sx, skip_red s, skip_red t, skip_red tx) of (Branch _ sx' _ _ _, Branch _ s' _ _ _, Branch _ t' _ _ _, Branch _ tx' _ _ _) \ compare_height (skip_black sx') s' t' (skip_black tx') | (_, rbt.Empty, _, Branch _ _ _ _ _) \ LT | (Branch _ _ _ _ _, _, rbt.Empty, _) \ GT | (Branch _ sx' _ _ _, Branch _ s' _ _ _, Branch _ t' _ _ _, rbt.Empty) \ compare_height (skip_black sx') s' t' rbt.Empty | (rbt.Empty, Branch _ s' _ _ _, Branch _ t' _ _ _, Branch _ tx' _ _ _) \ compare_height rbt.Empty s' t' (skip_black tx') | _ \ EQ)" declare compare_height.simps [code] hide_type (open) compare hide_const (open) compare_height skip_black skip_red LT GT EQ case_compare rec_compare Abs_compare Rep_compare hide_fact (open) Abs_compare_cases Abs_compare_induct Abs_compare_inject Abs_compare_inverse Rep_compare Rep_compare_cases Rep_compare_induct Rep_compare_inject Rep_compare_inverse compare.simps compare.exhaust compare.induct compare.rec compare.simps compare.size compare.case_cong compare.case_cong_weak compare.case compare.nchotomy compare.split compare.split_asm compare.eq.refl compare.eq.simps equal_compare_def skip_red.simps skip_red.cases skip_red.induct skip_black_def compare_height.simps subsection \union and intersection of sorted associative lists\ context ord begin function sunion_with :: "('a \ 'b \ 'b \ 'b) \ ('a \ 'b) list \ ('a \ 'b) list \ ('a \ 'b) list" where "sunion_with f ((k, v) # as) ((k', v') # bs) = (if k > k' then (k', v') # sunion_with f ((k, v) # as) bs else if k < k' then (k, v) # sunion_with f as ((k', v') # bs) else (k, f k v v') # sunion_with f as bs)" | "sunion_with f [] bs = bs" | "sunion_with f as [] = as" by pat_completeness auto termination by 
lexicographic_order function sinter_with :: "('a \ 'b \ 'b \ 'b) \ ('a \ 'b) list \ ('a \ 'b) list \ ('a \ 'b) list" where "sinter_with f ((k, v) # as) ((k', v') # bs) = (if k > k' then sinter_with f ((k, v) # as) bs else if k < k' then sinter_with f as ((k', v') # bs) else (k, f k v v') # sinter_with f as bs)" | "sinter_with f [] _ = []" | "sinter_with f _ [] = []" by pat_completeness auto termination by lexicographic_order end declare ord.sunion_with.simps [code] ord.sinter_with.simps[code] context linorder begin lemma set_fst_sunion_with: "set (map fst (sunion_with f xs ys)) = set (map fst xs) \ set (map fst ys)" by(induct f xs ys rule: sunion_with.induct) auto lemma sorted_sunion_with [simp]: "\ sorted (map fst xs); sorted (map fst ys) \ \ sorted (map fst (sunion_with f xs ys))" by(induct f xs ys rule: sunion_with.induct) (auto simp add: set_fst_sunion_with simp del: set_map) lemma distinct_sunion_with [simp]: "\ distinct (map fst xs); distinct (map fst ys); sorted (map fst xs); sorted (map fst ys) \ \ distinct (map fst (sunion_with f xs ys))" proof(induct f xs ys rule: sunion_with.induct) case (1 f k v xs k' v' ys) have "\ \ k < k'; \ k' < k \ \ k = k'" by simp thus ?case using "1" by(auto simp add: set_fst_sunion_with simp del: set_map) qed simp_all lemma map_of_sunion_with: "\ sorted (map fst xs); sorted (map fst ys) \ \ map_of (sunion_with f xs ys) k = (case map_of xs k of None \ map_of ys k | Some v \ case map_of ys k of None \ Some v | Some w \ Some (f k v w))" by(induct f xs ys rule: sunion_with.induct)(auto split: option.split dest: map_of_SomeD bspec) lemma set_fst_sinter_with [simp]: "\ sorted (map fst xs); sorted (map fst ys) \ \ set (map fst (sinter_with f xs ys)) = set (map fst xs) \ set (map fst ys)" by(induct f xs ys rule: sinter_with.induct)(auto simp del: set_map) lemma set_fst_sinter_with_subset1: "set (map fst (sinter_with f xs ys)) \ set (map fst xs)" by(induct f xs ys rule: sinter_with.induct) auto lemma set_fst_sinter_with_subset2: "set (map fst (sinter_with f xs ys)) \ set (map fst ys)" by(induct f xs ys rule: sinter_with.induct)(auto simp del: set_map) lemma sorted_sinter_with [simp]: "\ sorted (map fst xs); sorted (map fst ys) \ \ sorted (map fst (sinter_with f xs ys))" by(induct f xs ys rule: sinter_with.induct)(auto simp del: set_map) lemma distinct_sinter_with [simp]: "\ distinct (map fst xs); distinct (map fst ys) \ \ distinct (map fst (sinter_with f xs ys))" proof(induct f xs ys rule: sinter_with.induct) case (1 f k v as k' v' bs) have "\ \ k < k'; \ k' < k \ \ k = k'" by simp thus ?case using "1" set_fst_sinter_with_subset1[of f as bs] set_fst_sinter_with_subset2[of f as bs] by(auto simp del: set_map) qed simp_all lemma map_of_sinter_with: "\ sorted (map fst xs); sorted (map fst ys) \ \ map_of (sinter_with f xs ys) k = (case map_of xs k of None \ None | Some v \ map_option (f k v) (map_of ys k))" apply(induct f xs ys rule: sinter_with.induct) apply(auto simp add: map_option_case split: option.splits dest: map_of_SomeD bspec) done end lemma distinct_map_of_rev: "distinct (map fst xs) \ map_of (rev xs) = map_of xs" by(induct xs)(auto 4 3 simp add: map_add_def intro!: ext split: option.split intro: rev_image_eqI) lemma map_map_filter: "map f (List.map_filter g xs) = List.map_filter (map_option f \ g) xs" by(auto simp add: List.map_filter_def) lemma map_filter_map_option_const: "List.map_filter (\x. map_option (\y. f x) (g (f x))) xs = filter (\x. 
g x \ None) (map f xs)" by(auto simp add: map_filter_def filter_map o_def) lemma set_map_filter: "set (List.map_filter P xs) = the ` (P ` set xs - {None})" by(auto simp add: List.map_filter_def intro: rev_image_eqI) (* Split and Join *) definition is_rbt_empty :: "('a, 'b) rbt \ bool" where "is_rbt_empty t \ (case t of RBT_Impl.Empty \ True | _ \ False)" lemma is_rbt_empty_prop[simp]: "is_rbt_empty t \ t = RBT_Impl.Empty" by (auto simp: is_rbt_empty_def split: RBT_Impl.rbt.splits) definition small_rbt :: "('a, 'b) rbt \ bool" where "small_rbt t \ bheight t < 4" definition flip_rbt :: "('a, 'b) rbt \ ('a, 'b) rbt \ bool" where "flip_rbt t1 t2 \ bheight t2 < bheight t1" abbreviation (input) MR where "MR l a b r \ Branch RBT_Impl.R l a b r" abbreviation (input) MB where "MB l a b r \ Branch RBT_Impl.B l a b r" fun rbt_baliL :: "('a, 'b) rbt \ 'a \ 'b \ ('a, 'b) rbt \ ('a, 'b) rbt" where "rbt_baliL (MR (MR t1 a b t2) a' b' t3) a'' b'' t4 = MR (MB t1 a b t2) a' b' (MB t3 a'' b'' t4)" | "rbt_baliL (MR t1 a b (MR t2 a' b' t3)) a'' b'' t4 = MR (MB t1 a b t2) a' b' (MB t3 a'' b'' t4)" | "rbt_baliL t1 a b t2 = MB t1 a b t2" fun rbt_baliR :: "('a, 'b) rbt \ 'a \ 'b \ ('a, 'b) rbt \ ('a, 'b) rbt" where "rbt_baliR t1 a b (MR t2 a' b' (MR t3 a'' b'' t4)) = MR (MB t1 a b t2) a' b' (MB t3 a'' b'' t4)" | "rbt_baliR t1 a b (MR (MR t2 a' b' t3) a'' b'' t4) = MR (MB t1 a b t2) a' b' (MB t3 a'' b'' t4)" | "rbt_baliR t1 a b t2 = MB t1 a b t2" fun rbt_baldL :: "('a, 'b) rbt \ 'a \ 'b \ ('a, 'b) rbt \ ('a, 'b) rbt" where "rbt_baldL (MR t1 a b t2) a' b' t3 = MR (MB t1 a b t2) a' b' t3" | "rbt_baldL t1 a b (MB t2 a' b' t3) = rbt_baliR t1 a b (MR t2 a' b' t3)" | "rbt_baldL t1 a b (MR (MB t2 a' b' t3) a'' b'' t4) = MR (MB t1 a b t2) a' b' (rbt_baliR t3 a'' b'' (paint RBT_Impl.R t4))" | "rbt_baldL t1 a b t2 = MR t1 a b t2" fun rbt_baldR :: "('a, 'b) rbt \ 'a \ 'b \ ('a, 'b) rbt \ ('a, 'b) rbt" where "rbt_baldR t1 a b (MR t2 a' b' t3) = MR t1 a b (MB t2 a' b' t3)" | "rbt_baldR (MB t1 a b t2) a' b' t3 = rbt_baliL (MR t1 a b t2) a' b' t3" | "rbt_baldR (MR t1 a b (MB t2 a' b' t3)) a'' b'' t4 = MR (rbt_baliL (paint RBT_Impl.R t1) a b t2) a' b' (MB t3 a'' b'' t4)" | "rbt_baldR t1 a b t2 = MR t1 a b t2" fun rbt_app :: "('a, 'b) rbt \ ('a, 'b) rbt \ ('a, 'b) rbt" where "rbt_app RBT_Impl.Empty t = t" | "rbt_app t RBT_Impl.Empty = t" | "rbt_app (MR t1 a b t2) (MR t3 a'' b'' t4) = (case rbt_app t2 t3 of MR u2 a' b' u3 \ (MR (MR t1 a b u2) a' b' (MR u3 a'' b'' t4)) | t23 \ MR t1 a b (MR t23 a'' b'' t4))" | "rbt_app (MB t1 a b t2) (MB t3 a'' b'' t4) = (case rbt_app t2 t3 of MR u2 a' b' u3 \ MR (MB t1 a b u2) a' b' (MB u3 a'' b'' t4) | t23 \ rbt_baldL t1 a b (MB t23 a'' b'' t4))" | "rbt_app t1 (MR t2 a b t3) = MR (rbt_app t1 t2) a b t3" | "rbt_app (MR t1 a b t2) t3 = MR t1 a b (rbt_app t2 t3)" fun rbt_joinL :: "('a, 'b) rbt \ 'a \ 'b \ ('a, 'b) rbt \ ('a, 'b) rbt" where "rbt_joinL l a b r = (if bheight l \ bheight r then MR l a b r else case r of MB l' a' b' r' \ rbt_baliL (rbt_joinL l a b l') a' b' r' | MR l' a' b' r' \ MR (rbt_joinL l a b l') a' b' r')" declare rbt_joinL.simps[simp del] fun rbt_joinR :: "('a, 'b) rbt \ 'a \ 'b \ ('a, 'b) rbt \ ('a, 'b) rbt" where "rbt_joinR l a b r = (if bheight l \ bheight r then MR l a b r else case l of MB l' a' b' r' \ rbt_baliR l' a' b' (rbt_joinR r' a b r) | MR l' a' b' r' \ MR l' a' b' (rbt_joinR r' a b r))" declare rbt_joinR.simps[simp del] definition rbt_join :: "('a, 'b) rbt \ 'a \ 'b \ ('a, 'b) rbt \ ('a, 'b) rbt" where "rbt_join l a b r = (let bhl = bheight l; bhr = bheight r in if 
bhl > bhr then paint RBT_Impl.B (rbt_joinR l a b r) else if bhl < bhr then paint RBT_Impl.B (rbt_joinL l a b r) else MB l a b r)" lemma size_paint[simp]: "size (paint c t) = size t" by (cases t) auto lemma size_baliL[simp]: "size (rbt_baliL t1 a b t2) = Suc (size t1 + size t2)" by (induction t1 a b t2 rule: rbt_baliL.induct) auto lemma size_baliR[simp]: "size (rbt_baliR t1 a b t2) = Suc (size t1 + size t2)" by (induction t1 a b t2 rule: rbt_baliR.induct) auto lemma size_baldL[simp]: "size (rbt_baldL t1 a b t2) = Suc (size t1 + size t2)" by (induction t1 a b t2 rule: rbt_baldL.induct) auto lemma size_baldR[simp]: "size (rbt_baldR t1 a b t2) = Suc (size t1 + size t2)" by (induction t1 a b t2 rule: rbt_baldR.induct) auto lemma size_rbt_app[simp]: "size (rbt_app t1 t2) = size t1 + size t2" by (induction t1 t2 rule: rbt_app.induct) (auto split: RBT_Impl.rbt.splits RBT_Impl.color.splits) lemma size_rbt_joinL[simp]: "size (rbt_joinL t1 a b t2) = Suc (size t1 + size t2)" by (induction t1 a b t2 rule: rbt_joinL.induct) (auto simp: rbt_joinL.simps split: RBT_Impl.rbt.splits RBT_Impl.color.splits) lemma size_rbt_joinR[simp]: "size (rbt_joinR t1 a b t2) = Suc (size t1 + size t2)" by (induction t1 a b t2 rule: rbt_joinR.induct) (auto simp: rbt_joinR.simps split: RBT_Impl.rbt.splits RBT_Impl.color.splits) lemma size_rbt_join[simp]: "size (rbt_join t1 a b t2) = Suc (size t1 + size t2)" by (auto simp: rbt_join_def Let_def) definition "inv_12 t \ inv1 t \ inv2 t" lemma rbt_Node: "inv_12 (RBT_Impl.Branch c l a b r) \ inv_12 l \ inv_12 r" by (auto simp: inv_12_def) lemma paint2: "paint c2 (paint c1 t) = paint c2 t" by (cases t) auto lemma inv1_rbt_baliL: "inv1l l \ inv1 r \ inv1 (rbt_baliL l a b r)" by (induct l a b r rule: rbt_baliL.induct) auto lemma inv1_rbt_baliR: "inv1 l \ inv1l r \ inv1 (rbt_baliR l a b r)" by (induct l a b r rule: rbt_baliR.induct) auto lemma rbt_bheight_rbt_baliL: "bheight l = bheight r \ bheight (rbt_baliL l a b r) = Suc (bheight l)" by (induct l a b r rule: rbt_baliL.induct) auto lemma rbt_bheight_rbt_baliR: "bheight l = bheight r \ bheight (rbt_baliR l a b r) = Suc (bheight l)" by (induct l a b r rule: rbt_baliR.induct) auto lemma inv2_rbt_baliL: "inv2 l \ inv2 r \ bheight l = bheight r \ inv2 (rbt_baliL l a b r)" by (induct l a b r rule: rbt_baliL.induct) auto lemma inv2_rbt_baliR: "inv2 l \ inv2 r \ bheight l = bheight r \ inv2 (rbt_baliR l a b r)" by (induct l a b r rule: rbt_baliR.induct) auto lemma inv_rbt_baliR: "inv2 l \ inv2 r \ inv1 l \ inv1l r \ bheight l = bheight r \ inv1 (rbt_baliR l a b r) \ inv2 (rbt_baliR l a b r) \ bheight (rbt_baliR l a b r) = Suc (bheight l)" by (induct l a b r rule: rbt_baliR.induct) auto lemma inv_rbt_baliL: "inv2 l \ inv2 r \ inv1l l \ inv1 r \ bheight l = bheight r \ inv1 (rbt_baliL l a b r) \ inv2 (rbt_baliL l a b r) \ bheight (rbt_baliL l a b r) = Suc (bheight l)" by (induct l a b r rule: rbt_baliL.induct) auto lemma inv2_rbt_baldL_inv1: "inv2 l \ inv2 r \ bheight l + 1 = bheight r \ inv1 r \ inv2 (rbt_baldL l a b r) \ bheight (rbt_baldL l a b r) = bheight r" by (induct l a b r rule: rbt_baldL.induct) (auto simp: inv2_rbt_baliR rbt_bheight_rbt_baliR) lemma inv2_rbt_baldL_B: "inv2 l \ inv2 r \ bheight l + 1 = bheight r \ color_of r = RBT_Impl.B \ inv2 (rbt_baldL l a b r) \ bheight (rbt_baldL l a b r) = bheight r" by (induct l a b r rule: rbt_baldL.induct) (auto simp add: inv2_rbt_baliR rbt_bheight_rbt_baliR) lemma inv1_rbt_baldL: "inv1l l \ inv1 r \ color_of r = RBT_Impl.B \ inv1 (rbt_baldL l a b r)" by (induct l a b r rule: 
rbt_baldL.induct) (simp_all add: inv1_rbt_baliR) lemma inv1lI: "inv1 t \ inv1l t" by (cases t) auto lemma neq_Black[simp]: "(c \ RBT_Impl.B) = (c = RBT_Impl.R)" by (cases c) auto lemma inv1l_rbt_baldL: "inv1l l \ inv1 r \ inv1l (rbt_baldL l a b r)" by (induct l a b r rule: rbt_baldL.induct) (auto simp: inv1_rbt_baliR paint2) lemma inv2_rbt_baldR_inv1: "inv2 l \ inv2 r \ bheight l = bheight r + 1 \ inv1 l \ inv2 (rbt_baldR l a b r) \ bheight (rbt_baldR l a b r) = bheight l" by (induct l a b r rule: rbt_baldR.induct) (auto simp: inv2_rbt_baliL rbt_bheight_rbt_baliL) lemma inv1_rbt_baldR: "inv1 l \ inv1l r \ color_of l = RBT_Impl.B \ inv1 (rbt_baldR l a b r)" by (induct l a b r rule: rbt_baldR.induct) (simp_all add: inv1_rbt_baliL) lemma inv1l_rbt_baldR: "inv1 l \ inv1l r \inv1l (rbt_baldR l a b r)" by (induct l a b r rule: rbt_baldR.induct) (auto simp: inv1_rbt_baliL paint2) lemma inv2_rbt_app: "inv2 l \ inv2 r \ bheight l = bheight r \ inv2 (rbt_app l r) \ bheight (rbt_app l r) = bheight l" by (induct l r rule: rbt_app.induct) (auto simp: inv2_rbt_baldL_B split: RBT_Impl.rbt.splits RBT_Impl.color.splits) lemma inv1_rbt_app: "inv1 l \ inv1 r \ (color_of l = RBT_Impl.B \ color_of r = RBT_Impl.B \ inv1 (rbt_app l r)) \ inv1l (rbt_app l r)" by (induct l r rule: rbt_app.induct) (auto simp: inv1_rbt_baldL split: RBT_Impl.rbt.splits RBT_Impl.color.splits) lemma inv_rbt_baldL: "inv2 l \ inv2 r \ bheight l + 1 = bheight r \ inv1l l \ inv1 r \ inv2 (rbt_baldL l a b r) \ bheight (rbt_baldL l a b r) = bheight r \ inv1l (rbt_baldL l a b r) \ (color_of r = RBT_Impl.B \ inv1 (rbt_baldL l a b r))" by (induct l a b r rule: rbt_baldL.induct) (auto simp: inv_rbt_baliR rbt_bheight_rbt_baliR paint2) lemma inv_rbt_baldR: "inv2 l \ inv2 r \ bheight l = bheight r + 1 \ inv1 l \ inv1l r \ inv2 (rbt_baldR l a b r) \ bheight (rbt_baldR l a b r) = bheight l \ inv1l (rbt_baldR l a b r) \ (color_of l = RBT_Impl.B \ inv1 (rbt_baldR l a b r))" by (induct l a b r rule: rbt_baldR.induct) (auto simp: inv_rbt_baliL rbt_bheight_rbt_baliL paint2) lemma inv_rbt_app: "inv2 l \ inv2 r \ bheight l = bheight r \ inv1 l \ inv1 r \ inv2 (rbt_app l r) \ bheight (rbt_app l r) = bheight l \ inv1l (rbt_app l r) \ (color_of l = RBT_Impl.B \ color_of r = RBT_Impl.B \ inv1 (rbt_app l r))" by (induct l r rule: rbt_app.induct) (auto simp: inv2_rbt_baldL_B inv_rbt_baldL split: RBT_Impl.rbt.splits RBT_Impl.color.splits) lemma inv1l_rbt_joinL: "inv1 l \ inv1 r \ bheight l \ bheight r \ inv1l (rbt_joinL l a b r) \ (bheight l \ bheight r \ color_of r = RBT_Impl.B \ inv1 (rbt_joinL l a b r))" proof (induct l a b r rule: rbt_joinL.induct) case (1 l a b r) then show ?case by (auto simp: inv1_rbt_baliL rbt_joinL.simps[of l a b r] split!: RBT_Impl.rbt.splits RBT_Impl.color.splits) qed lemma inv1l_rbt_joinR: "inv1 l \ inv2 l \ inv1 r \ inv2 r \ bheight l \ bheight r \ inv1l (rbt_joinR l a b r) \ (bheight l \ bheight r \ color_of l = RBT_Impl.B \ inv1 (rbt_joinR l a b r))" proof (induct l a b r rule: rbt_joinR.induct) case (1 l a b r) then show ?case by (fastforce simp: inv1_rbt_baliR rbt_joinR.simps[of l a b r] split!: RBT_Impl.rbt.splits RBT_Impl.color.splits) qed lemma bheight_rbt_joinL: "inv2 l \ inv2 r \ bheight l \ bheight r \ bheight (rbt_joinL l a b r) = bheight r" proof (induct l a b r rule: rbt_joinL.induct) case (1 l a b r) then show ?case by (auto simp: rbt_bheight_rbt_baliL rbt_joinL.simps[of l a b r] split!: RBT_Impl.rbt.splits RBT_Impl.color.splits) qed lemma inv2_rbt_joinL: "inv2 l \ inv2 r \ bheight l \ bheight r \ inv2 (rbt_joinL l a b 
r)" proof (induct l a b r rule: rbt_joinL.induct) case (1 l a b r) then show ?case by (auto simp: inv2_rbt_baliL bheight_rbt_joinL rbt_joinL.simps[of l a b r] split!: RBT_Impl.rbt.splits RBT_Impl.color.splits) qed lemma bheight_rbt_joinR: "inv2 l \ inv2 r \ bheight l \ bheight r \ bheight (rbt_joinR l a b r) = bheight l" proof (induct l a b r rule: rbt_joinR.induct) case (1 l a b r) then show ?case by (fastforce simp: rbt_bheight_rbt_baliR rbt_joinR.simps[of l a b r] split!: RBT_Impl.rbt.splits RBT_Impl.color.splits) qed lemma inv2_rbt_joinR: "inv2 l \ inv2 r \ bheight l \ bheight r \ inv2 (rbt_joinR l a b r)" proof (induct l a b r rule: rbt_joinR.induct) case (1 l a b r) then show ?case by (fastforce simp: inv2_rbt_baliR bheight_rbt_joinR rbt_joinR.simps[of l a b r] split!: RBT_Impl.rbt.splits RBT_Impl.color.splits) qed lemma keys_paint[simp]: "RBT_Impl.keys (paint c t) = RBT_Impl.keys t" by (cases t) auto lemma keys_rbt_baliL: "RBT_Impl.keys (rbt_baliL l a b r) = RBT_Impl.keys l @ a # RBT_Impl.keys r" by (cases "(l,a,b,r)" rule: rbt_baliL.cases) auto lemma keys_rbt_baliR: "RBT_Impl.keys (rbt_baliR l a b r) = RBT_Impl.keys l @ a # RBT_Impl.keys r" by (cases "(l,a,b,r)" rule: rbt_baliR.cases) auto lemma keys_rbt_baldL: "RBT_Impl.keys (rbt_baldL l a b r) = RBT_Impl.keys l @ a # RBT_Impl.keys r" by (cases "(l,a,b,r)" rule: rbt_baldL.cases) (auto simp: keys_rbt_baliL keys_rbt_baliR) lemma keys_rbt_baldR: "RBT_Impl.keys (rbt_baldR l a b r) = RBT_Impl.keys l @ a # RBT_Impl.keys r" by (cases "(l,a,b,r)" rule: rbt_baldR.cases) (auto simp: keys_rbt_baliL keys_rbt_baliR) lemma keys_rbt_app: "RBT_Impl.keys (rbt_app l r) = RBT_Impl.keys l @ RBT_Impl.keys r" by (induction l r rule: rbt_app.induct) (auto simp: keys_rbt_baldL keys_rbt_baldR split: RBT_Impl.rbt.splits RBT_Impl.color.splits) lemma keys_rbt_joinL: "bheight l \ bheight r \ RBT_Impl.keys (rbt_joinL l a b r) = RBT_Impl.keys l @ a # RBT_Impl.keys r" proof (induction l a b r rule: rbt_joinL.induct) case (1 l a b r) thus ?case by (auto simp: keys_rbt_baliL rbt_joinL.simps[of l a b r] split!: RBT_Impl.rbt.splits RBT_Impl.color.splits) qed lemma keys_rbt_joinR: "RBT_Impl.keys (rbt_joinR l a b r) = RBT_Impl.keys l @ a # RBT_Impl.keys r" proof (induction l a b r rule: rbt_joinR.induct) case (1 l a b r) thus ?case by (force simp: keys_rbt_baliR rbt_joinR.simps[of l a b r] split!: RBT_Impl.rbt.splits RBT_Impl.color.splits) qed lemma rbt_set_rbt_baliL: "set (RBT_Impl.keys (rbt_baliL l a b r)) = set (RBT_Impl.keys l) \ {a} \ set (RBT_Impl.keys r)" by (cases "(l,a,b,r)" rule: rbt_baliL.cases) auto lemma set_rbt_joinL: "set (RBT_Impl.keys (rbt_joinL l a b r)) = set (RBT_Impl.keys l) \ {a} \ set (RBT_Impl.keys r)" proof (induction l a b r rule: rbt_joinL.induct) case (1 l a b r) thus ?case by (auto simp: rbt_set_rbt_baliL rbt_joinL.simps[of l a b r] split!: RBT_Impl.rbt.splits RBT_Impl.color.splits) qed lemma rbt_set_rbt_baliR: "set (RBT_Impl.keys (rbt_baliR l a b r)) = set (RBT_Impl.keys l) \ {a} \ set (RBT_Impl.keys r)" by (cases "(l,a,b,r)" rule: rbt_baliR.cases) auto lemma set_rbt_joinR: "set (RBT_Impl.keys (rbt_joinR l a b r)) = set (RBT_Impl.keys l) \ {a} \ set (RBT_Impl.keys r)" proof (induction l a b r rule: rbt_joinR.induct) case (1 l a b r) thus ?case by (force simp: rbt_set_rbt_baliR rbt_joinR.simps[of l a b r] split!: RBT_Impl.rbt.splits RBT_Impl.color.splits) qed lemma set_keys_paint: "set (RBT_Impl.keys (paint c t)) = set (RBT_Impl.keys t)" by (cases t) auto lemma set_rbt_join: "set (RBT_Impl.keys (rbt_join l a b r)) = set (RBT_Impl.keys l) \ 
{a} \ set (RBT_Impl.keys r)" by (simp add: set_rbt_joinL set_rbt_joinR set_keys_paint rbt_join_def Let_def) lemma inv_rbt_join: "inv_12 l \ inv_12 r \ inv_12 (rbt_join l a b r)" by (auto simp: rbt_join_def Let_def inv1l_rbt_joinL inv1l_rbt_joinR inv2_rbt_joinL inv2_rbt_joinR inv_12_def) fun rbt_recolor :: "('a, 'b) rbt \ ('a, 'b) rbt" where "rbt_recolor (Branch RBT_Impl.R t1 k v t2) = (if color_of t1 = RBT_Impl.B \ color_of t2 = RBT_Impl.B then Branch RBT_Impl.B t1 k v t2 else Branch RBT_Impl.R t1 k v t2)" | "rbt_recolor t = t" lemma rbt_recolor: "inv_12 t \ inv_12 (rbt_recolor t)" by (induction t rule: rbt_recolor.induct) (auto simp: inv_12_def) fun rbt_split_min :: "('a, 'b) rbt \ 'a \ 'b \ ('a, 'b) rbt" where "rbt_split_min RBT_Impl.Empty = undefined" | "rbt_split_min (RBT_Impl.Branch _ l a b r) = (if is_rbt_empty l then (a,b,r) else let (a',b',l') = rbt_split_min l in (a',b',rbt_join l' a b r))" lemma rbt_split_min_set: "rbt_split_min t = (a,b,t') \ t \ RBT_Impl.Empty \ a \ set (RBT_Impl.keys t) \ set (RBT_Impl.keys t) = {a} \ set (RBT_Impl.keys t')" by (induction t arbitrary: t') (auto simp: set_rbt_join split: prod.splits if_splits) lemma rbt_split_min_inv: "rbt_split_min t = (a,b,t') \ inv_12 t \ t \ RBT_Impl.Empty \ inv_12 t'" by (induction t arbitrary: t') (auto simp: inv_rbt_join split: if_splits prod.splits dest: rbt_Node) definition rbt_join2 :: "('a, 'b) rbt \ ('a, 'b) rbt \ ('a, 'b) rbt" where "rbt_join2 l r = (if is_rbt_empty r then l else let (a,b,r') = rbt_split_min r in rbt_join l a b r')" lemma set_rbt_join2[simp]: "set (RBT_Impl.keys (rbt_join2 l r)) = set (RBT_Impl.keys l) \ set (RBT_Impl.keys r)" by (simp add: rbt_join2_def rbt_split_min_set set_rbt_join split: prod.split) lemma inv_rbt_join2: "inv_12 l \ inv_12 r \ inv_12 (rbt_join2 l r)" by (simp add: rbt_join2_def inv_rbt_join rbt_split_min_set rbt_split_min_inv split: prod.split) context ord begin fun rbt_split :: "('a, 'b) rbt \ 'a \ ('a, 'b) rbt \ 'b option \ ('a, 'b) rbt" where "rbt_split RBT_Impl.Empty k = (RBT_Impl.Empty, None, RBT_Impl.Empty)" | "rbt_split (RBT_Impl.Branch _ l a b r) x = (if x < a then (case rbt_split l x of (l1, \, l2) \ (l1, \, rbt_join l2 a b r)) else if a < x then (case rbt_split r x of (r1, \, r2) \ (rbt_join l a b r1, \, r2)) else (l, Some b, r))" lemma rbt_split: "rbt_split t x = (l,\,r) \ inv_12 t \ inv_12 l \ inv_12 r" by (induction t arbitrary: l r) (auto simp: set_rbt_join inv_rbt_join rbt_greater_prop rbt_less_prop split: if_splits prod.splits dest!: rbt_Node) lemma rbt_split_size: "(l2,\,r2) = rbt_split t2 a \ size l2 + size r2 \ size t2" by (induction t2 a arbitrary: l2 r2 rule: rbt_split.induct) (auto split: if_splits prod.splits) function rbt_union_rec :: "('a \ 'b \ 'b \ 'b) \ ('a, 'b) rbt \ ('a, 'b) rbt \ ('a, 'b) rbt" where "rbt_union_rec f t1 t2 = (let (f, t2, t1) = if flip_rbt t2 t1 then (\k v v'. f k v' v, t1, t2) else (f, t2, t1) in if small_rbt t2 then RBT_Impl.fold (rbt_insert_with_key f) t2 t1 else (case t1 of RBT_Impl.Empty \ t2 | RBT_Impl.Branch _ l1 a b r1 \ case rbt_split t2 a of (l2, \, r2) \ rbt_join (rbt_union_rec f l1 l2) a (case \ of None \ b | Some b' \ f a b b') (rbt_union_rec f r1 r2)))" by pat_completeness auto termination using rbt_split_size by (relation "measure (\(f,t1,t2). 
size t1 + size t2)") (fastforce split: if_splits)+ declare rbt_union_rec.simps[simp del] function rbt_union_swap_rec :: "('a \ 'b \ 'b \ 'b) \ bool \ ('a, 'b) rbt \ ('a, 'b) rbt \ ('a, 'b) rbt" where "rbt_union_swap_rec f \ t1 t2 = (let (\, t2, t1) = if flip_rbt t2 t1 then (\\, t1, t2) else (\, t2, t1); f' = (if \ then (\k v v'. f k v' v) else f) in if small_rbt t2 then RBT_Impl.fold (rbt_insert_with_key f') t2 t1 else (case t1 of RBT_Impl.Empty \ t2 | RBT_Impl.Branch _ l1 a b r1 \ case rbt_split t2 a of (l2, \, r2) \ rbt_join (rbt_union_swap_rec f \ l1 l2) a (case \ of None \ b | Some b' \ f' a b b') (rbt_union_swap_rec f \ r1 r2)))" by pat_completeness auto termination using rbt_split_size by (relation "measure (\(f,\,t1,t2). size t1 + size t2)") (fastforce split: if_splits)+ declare rbt_union_swap_rec.simps[simp del] lemma rbt_union_swap_rec: "rbt_union_swap_rec f \ t1 t2 = rbt_union_rec (if \ then (\k v v'. f k v' v) else f) t1 t2" proof (induction f \ t1 t2 rule: rbt_union_swap_rec.induct) case (1 f \ t1 t2) show ?case using 1[OF refl _ refl refl _ refl _ refl] unfolding rbt_union_swap_rec.simps[of _ _ t1] rbt_union_rec.simps[of _ t1] by (auto simp: Let_def split: rbt.splits prod.splits option.splits) (* slow *) qed lemma rbt_fold_rbt_insert: assumes "inv_12 t2" shows "inv_12 (RBT_Impl.fold (rbt_insert_with_key f) t1 t2)" proof - define xs where "xs = RBT_Impl.entries t1" from assms show ?thesis unfolding RBT_Impl.fold_def xs_def[symmetric] by (induct xs rule: rev_induct) (auto simp: inv_12_def rbt_insert_with_key_def ins_inv1_inv2) qed lemma rbt_union_rec: "inv_12 t1 \ inv_12 t2 \ inv_12 (rbt_union_rec f t1 t2)" proof (induction f t1 t2 rule: rbt_union_rec.induct) case (1 t1 t2) thus ?case by (auto simp: rbt_union_rec.simps[of t1 t2] inv_rbt_join rbt_split rbt_fold_rbt_insert split!: RBT_Impl.rbt.splits RBT_Impl.color.splits prod.split if_splits dest: rbt_Node) qed definition "map_filter_inter f t1 t2 = List.map_filter (\(k, v). case rbt_lookup t1 k of None \ None | Some v' \ Some (k, f k v' v)) (RBT_Impl.entries t2)" function rbt_inter_rec :: "('a \ 'b \ 'b \ 'b) \ ('a, 'b) rbt \ ('a, 'b) rbt \ ('a, 'b) rbt" where "rbt_inter_rec f t1 t2 = (let (f, t2, t1) = if flip_rbt t2 t1 then (\k v v'. f k v' v, t1, t2) else (f, t2, t1) in if small_rbt t2 then rbtreeify (map_filter_inter f t1 t2) else case t1 of RBT_Impl.Empty \ RBT_Impl.Empty | RBT_Impl.Branch _ l1 a b r1 \ case rbt_split t2 a of (l2, \, r2) \ let l' = rbt_inter_rec f l1 l2; r' = rbt_inter_rec f r1 r2 in (case \ of None \ rbt_join2 l' r' | Some b' \ rbt_join l' a (f a b b') r'))" by pat_completeness auto termination using rbt_split_size by (relation "measure (\(f,t1,t2). size t1 + size t2)") (fastforce split: if_splits)+ declare rbt_inter_rec.simps[simp del] function rbt_inter_swap_rec :: "('a \ 'b \ 'b \ 'b) \ bool \ ('a, 'b) rbt \ ('a, 'b) rbt \ ('a, 'b) rbt" where "rbt_inter_swap_rec f \ t1 t2 = (let (\, t2, t1) = if flip_rbt t2 t1 then (\\, t1, t2) else (\, t2, t1); f' = (if \ then (\k v v'. f k v' v) else f) in if small_rbt t2 then rbtreeify (map_filter_inter f' t1 t2) else case t1 of RBT_Impl.Empty \ RBT_Impl.Empty | RBT_Impl.Branch _ l1 a b r1 \ case rbt_split t2 a of (l2, \, r2) \ let l' = rbt_inter_swap_rec f \ l1 l2; r' = rbt_inter_swap_rec f \ r1 r2 in (case \ of None \ rbt_join2 l' r' | Some b' \ rbt_join l' a (f' a b b') r'))" by pat_completeness auto termination using rbt_split_size by (relation "measure (\(f,\,t1,t2). 
size t1 + size t2)") (fastforce split: if_splits)+ declare rbt_inter_swap_rec.simps[simp del] lemma rbt_inter_swap_rec: "rbt_inter_swap_rec f \ t1 t2 = rbt_inter_rec (if \ then (\k v v'. f k v' v) else f) t1 t2" proof (induction f \ t1 t2 rule: rbt_inter_swap_rec.induct) case (1 f \ t1 t2) show ?case using 1[OF refl _ refl refl _ refl _ refl] unfolding rbt_inter_swap_rec.simps[of _ _ t1] rbt_inter_rec.simps[of _ t1] by (auto simp add: Let_def split: rbt.splits prod.splits option.splits) qed lemma rbt_rbtreeify[simp]: "inv_12 (rbtreeify kvs)" by (simp add: inv_12_def rbtreeify_def inv1_rbtreeify_g inv2_rbtreeify_g) lemma rbt_inter_rec: "inv_12 t1 \ inv_12 t2 \ inv_12 (rbt_inter_rec f t1 t2)" proof(induction f t1 t2 rule: rbt_inter_rec.induct) case (1 t1 t2) thus ?case by (auto simp: rbt_inter_rec.simps[of t1 t2] inv_rbt_join inv_rbt_join2 rbt_split Let_def split!: RBT_Impl.rbt.splits RBT_Impl.color.splits prod.split if_splits option.splits dest!: rbt_Node) qed definition "filter_minus t1 t2 = filter (\(k, _). rbt_lookup t2 k = None) (RBT_Impl.entries t1)" fun rbt_minus_rec :: "('a, 'b) rbt \ ('a, 'b) rbt \ ('a, 'b) rbt" where "rbt_minus_rec t1 t2 = (if small_rbt t2 then RBT_Impl.fold (\k _ t. rbt_delete k t) t2 t1 else if small_rbt t1 then rbtreeify (filter_minus t1 t2) else case t2 of RBT_Impl.Empty \ t1 | RBT_Impl.Branch _ l2 a b r2 \ case rbt_split t1 a of (l1, _, r1) \ rbt_join2 (rbt_minus_rec l1 l2) (rbt_minus_rec r1 r2))" declare rbt_minus_rec.simps[simp del] end context linorder begin lemma rbt_sorted_entries_right_unique: "\ (k, v) \ set (entries t); (k, v') \ set (entries t); rbt_sorted t \ \ v = v'" by(auto dest!: distinct_entries inj_onD[where x="(k, v)" and y="(k, v')"] simp add: distinct_map) lemma rbt_sorted_fold_rbt_insertwk: "rbt_sorted t \ rbt_sorted (List.fold (\(k, v). rbt_insert_with_key f k v) xs t)" by(induct xs rule: rev_induct)(auto simp add: rbt_insertwk_rbt_sorted) lemma is_rbt_fold_rbt_insertwk: assumes "is_rbt t1" shows "is_rbt (fold (rbt_insert_with_key f) t2 t1)" proof - define xs where "xs = entries t2" from assms show ?thesis unfolding fold_def xs_def[symmetric] by(induct xs rule: rev_induct)(auto simp add: rbt_insertwk_is_rbt) qed lemma rbt_delete: "inv_12 t \ inv_12 (rbt_delete x t)" using rbt_del_inv1_inv2[of t x] by (auto simp: inv_12_def rbt_delete_def rbt_del_inv1_inv2) lemma rbt_sorted_delete: "rbt_sorted t \ rbt_sorted (rbt_delete x t)" by (auto simp: rbt_delete_def rbt_del_rbt_sorted) lemma rbt_fold_rbt_delete: assumes "inv_12 t2" shows "inv_12 (RBT_Impl.fold (\k _ t. 
rbt_delete k t) t1 t2)" proof - define xs where "xs = RBT_Impl.entries t1" from assms show ?thesis unfolding RBT_Impl.fold_def xs_def[symmetric] by (induct xs rule: rev_induct) (auto simp: rbt_delete) qed lemma rbt_minus_rec: "inv_12 t1 \ inv_12 t2 \ inv_12 (rbt_minus_rec t1 t2)" proof(induction t1 t2 rule: rbt_minus_rec.induct) case (1 t1 t2) thus ?case by (auto simp: rbt_minus_rec.simps[of t1 t2] inv_rbt_join inv_rbt_join2 rbt_split rbt_fold_rbt_delete split!: RBT_Impl.rbt.splits RBT_Impl.color.splits prod.split if_splits dest: rbt_Node) qed end context linorder begin lemma rbt_sorted_rbt_baliL: "rbt_sorted l \ rbt_sorted r \ l |\ a \ a \| r \ rbt_sorted (rbt_baliL l a b r)" using rbt_greater_trans rbt_less_trans by (cases "(l,a,b,r)" rule: rbt_baliL.cases) fastforce+ lemma rbt_lookup_rbt_baliL: "rbt_sorted l \ rbt_sorted r \ l |\ a \ a \| r \ rbt_lookup (rbt_baliL l a b r) k = (if k < a then rbt_lookup l k else if k = a then Some b else rbt_lookup r k)" by (cases "(l,a,b,r)" rule: rbt_baliL.cases) (auto split!: if_splits) lemma rbt_sorted_rbt_baliR: "rbt_sorted l \ rbt_sorted r \ l |\ a \ a \| r \ rbt_sorted (rbt_baliR l a b r)" using rbt_greater_trans rbt_less_trans by (cases "(l,a,b,r)" rule: rbt_baliR.cases) fastforce+ lemma rbt_lookup_rbt_baliR: "rbt_sorted l \ rbt_sorted r \ l |\ a \ a \| r \ rbt_lookup (rbt_baliR l a b r) k = (if k < a then rbt_lookup l k else if k = a then Some b else rbt_lookup r k)" by (cases "(l,a,b,r)" rule: rbt_baliR.cases) (auto split!: if_splits) lemma rbt_sorted_rbt_joinL: "rbt_sorted (RBT_Impl.Branch c l a b r) \ bheight l \ bheight r \ rbt_sorted (rbt_joinL l a b r)" proof (induction l a b r arbitrary: c rule: rbt_joinL.induct) case (1 l a b r) thus ?case by (auto simp: rbt_set_rbt_baliL rbt_joinL.simps[of l a b r] set_rbt_joinL rbt_less_prop intro!: rbt_sorted_rbt_baliL split!: RBT_Impl.rbt.splits RBT_Impl.color.splits) qed lemma rbt_lookup_rbt_joinL: "rbt_sorted l \ rbt_sorted r \ l |\ a \ a \| r \ rbt_lookup (rbt_joinL l a b r) k = (if k < a then rbt_lookup l k else if k = a then Some b else rbt_lookup r k)" proof (induction l a b r rule: rbt_joinL.induct) case (1 l a b r) have less_rbt_joinL: "rbt_sorted r1 \ r1 |\ x \ a \| r1 \ a < x \ rbt_joinL l a b r1 |\ x" for x r1 using 1(5) by (auto simp: rbt_less_prop rbt_greater_prop set_rbt_joinL) show ?case using 1 less_rbt_joinL rbt_lookup_rbt_baliL[OF rbt_sorted_rbt_joinL[of _ l a b], where ?k=k] by (auto simp: rbt_joinL.simps[of l a b r] split!: if_splits rbt.splits color.splits) qed lemma rbt_sorted_rbt_joinR: "rbt_sorted l \ rbt_sorted r \ l |\ a \ a \| r \ rbt_sorted (rbt_joinR l a b r)" proof (induction l a b r rule: rbt_joinR.induct) case (1 l a b r) thus ?case by (auto simp: rbt_set_rbt_baliR rbt_joinR.simps[of l a b r] set_rbt_joinR rbt_greater_prop intro!: rbt_sorted_rbt_baliR split!: RBT_Impl.rbt.splits RBT_Impl.color.splits) qed lemma rbt_lookup_rbt_joinR: "rbt_sorted l \ rbt_sorted r \ l |\ a \ a \| r \ rbt_lookup (rbt_joinR l a b r) k = (if k < a then rbt_lookup l k else if k = a then Some b else rbt_lookup r k)" proof (induction l a b r rule: rbt_joinR.induct) case (1 l a b r) have less_rbt_joinR: "rbt_sorted l1 \ x \| l1 \ l1 |\ a \ x < a \ x \| rbt_joinR l1 a b r" for x l1 using 1(6) by (auto simp: rbt_less_prop rbt_greater_prop set_rbt_joinR) show ?case using 1 less_rbt_joinR rbt_lookup_rbt_baliR[OF _ rbt_sorted_rbt_joinR[of _ r a b], where ?k=k] by (auto simp: rbt_joinR.simps[of l a b r] split!: if_splits rbt.splits color.splits) qed lemma rbt_sorted_paint: "rbt_sorted (paint c t) = 
rbt_sorted t" by (cases t) auto lemma rbt_sorted_rbt_join: "rbt_sorted (RBT_Impl.Branch c l a b r) \ rbt_sorted (rbt_join l a b r)" by (auto simp: rbt_sorted_paint rbt_sorted_rbt_joinL rbt_sorted_rbt_joinR rbt_join_def Let_def) lemma rbt_lookup_rbt_join: "rbt_sorted l \ rbt_sorted r \ l |\ a \ a \| r \ rbt_lookup (rbt_join l a b r) k = (if k < a then rbt_lookup l k else if k = a then Some b else rbt_lookup r k)" by (auto simp: rbt_join_def Let_def rbt_lookup_rbt_joinL rbt_lookup_rbt_joinR) lemma rbt_split_min_rbt_sorted: "rbt_split_min t = (a,b,t') \ rbt_sorted t \ t \ RBT_Impl.Empty \ rbt_sorted t' \ (\x \ set (RBT_Impl.keys t'). a < x)" by (induction t arbitrary: t') (fastforce simp: rbt_split_min_set rbt_sorted_rbt_join set_rbt_join rbt_less_prop rbt_greater_prop split: if_splits prod.splits)+ lemma rbt_split_min_rbt_lookup: "rbt_split_min t = (a,b,t') \ rbt_sorted t \ t \ RBT_Impl.Empty \ rbt_lookup t k = (if k < a then None else if k = a then Some b else rbt_lookup t' k)" - by (induction t arbitrary: a b t') - (auto simp: rbt_less_trans antisym_conv3 rbt_less_prop rbt_split_min_set rbt_lookup_rbt_join - rbt_split_min_rbt_sorted split!: if_splits prod.splits) + apply (induction t arbitrary: a b t') + apply(simp_all split: if_splits prod.splits) + apply(auto simp: rbt_less_prop rbt_split_min_set rbt_lookup_rbt_join rbt_split_min_rbt_sorted) + done lemma rbt_sorted_rbt_join2: "rbt_sorted l \ rbt_sorted r \ \x \ set (RBT_Impl.keys l). \y \ set (RBT_Impl.keys r). x < y \ rbt_sorted (rbt_join2 l r)" by (simp add: rbt_join2_def rbt_sorted_rbt_join rbt_split_min_set rbt_split_min_rbt_sorted set_rbt_join rbt_greater_prop rbt_less_prop split: prod.split) lemma rbt_lookup_rbt_join2: "rbt_sorted l \ rbt_sorted r \ \x \ set (RBT_Impl.keys l). \y \ set (RBT_Impl.keys r). x < y \ rbt_lookup (rbt_join2 l r) k = (case rbt_lookup l k of None \ rbt_lookup r k | Some v \ Some v)" using rbt_lookup_keys by (fastforce simp: rbt_join2_def rbt_greater_prop rbt_less_prop rbt_lookup_rbt_join rbt_split_min_rbt_lookup rbt_split_min_rbt_sorted rbt_split_min_set split: option.splits prod.splits) lemma rbt_split_props: "rbt_split t x = (l,\,r) \ rbt_sorted t \ set (RBT_Impl.keys l) = {a \ set (RBT_Impl.keys t). a < x} \ set (RBT_Impl.keys r) = {a \ set (RBT_Impl.keys t). 
x < a} \ rbt_sorted l \ rbt_sorted r" - by (induction t arbitrary: l r) - (force simp: Let_def set_rbt_join rbt_greater_prop rbt_less_prop - split: if_splits prod.splits intro!: rbt_sorted_rbt_join)+ + apply (induction t arbitrary: l r) + apply(simp_all split!: prod.splits if_splits) + apply(force simp: set_rbt_join rbt_greater_prop rbt_less_prop + intro: rbt_sorted_rbt_join)+ + done lemma rbt_split_lookup: "rbt_split t x = (l,\,r) \ rbt_sorted t \ rbt_lookup t k = (if k < x then rbt_lookup l k else if k = x then \ else rbt_lookup r k)" proof (induction t arbitrary: x l \ r) case (Branch c t1 a b t2) have "rbt_sorted r1" "r1 |\ a" if "rbt_split t1 x = (l, \, r1)" for r1 using rbt_split_props Branch(4) that by (fastforce simp: rbt_less_prop)+ moreover have "rbt_sorted l1" "a \| l1" if "rbt_split t2 x = (l1, \, r)" for l1 using rbt_split_props Branch(4) that by (fastforce simp: rbt_greater_prop)+ ultimately show ?case using Branch rbt_lookup_rbt_join[of t1 _ a b k] rbt_lookup_rbt_join[of _ t2 a b k] by (auto split!: if_splits prod.splits) qed simp lemma rbt_sorted_fold_insertwk: "rbt_sorted t \ rbt_sorted (RBT_Impl.fold (rbt_insert_with_key f) t' t)" by (induct t' arbitrary: t) (simp_all add: rbt_insertwk_rbt_sorted) lemma rbt_lookup_iff_keys: "rbt_sorted t \ set (RBT_Impl.keys t) = {k. \v. rbt_lookup t k = Some v}" "rbt_sorted t \ rbt_lookup t k = None \ k \ set (RBT_Impl.keys t)" "rbt_sorted t \ (\v. rbt_lookup t k = Some v) \ k \ set (RBT_Impl.keys t)" using entry_in_tree_keys rbt_lookup_keys[of t] by force+ lemma rbt_lookup_fold_rbt_insertwk: assumes t1: "rbt_sorted t1" and t2: "rbt_sorted t2" shows "rbt_lookup (fold (rbt_insert_with_key f) t1 t2) k = (case rbt_lookup t1 k of None \ rbt_lookup t2 k | Some v \ case rbt_lookup t2 k of None \ Some v | Some w \ Some (f k w v))" proof - define xs where "xs = entries t1" hence dt1: "distinct (map fst xs)" using t1 by(simp add: distinct_entries) with t2 show ?thesis unfolding fold_def map_of_entries[OF t1, symmetric] xs_def[symmetric] distinct_map_of_rev[OF dt1, symmetric] apply(induct xs rule: rev_induct) apply(auto simp add: rbt_lookup_rbt_insertwk rbt_sorted_fold_rbt_insertwk split: option.splits) apply(auto simp add: distinct_map_of_rev intro: rev_image_eqI) done qed lemma rbt_lookup_union_rec: "rbt_sorted t1 \ rbt_sorted t2 \ rbt_sorted (rbt_union_rec f t1 t2) \ rbt_lookup (rbt_union_rec f t1 t2) k = (case rbt_lookup t1 k of None \ rbt_lookup t2 k | Some v \ (case rbt_lookup t2 k of None \ Some v | Some w \ Some (f k v w)))" proof(induction f t1 t2 arbitrary: k rule: rbt_union_rec.induct) case (1 f t1 t2) obtain f' t1' t2' where flip: "(f', t2', t1') = (if flip_rbt t2 t1 then (\k v v'. 
f k v' v, t1, t2) else (f, t2, t1))" by fastforce have rbt_sorted': "rbt_sorted t1'" "rbt_sorted t2'" using 1(3,4) flip by (auto split: if_splits) show ?case proof (cases t1') case Empty show ?thesis unfolding rbt_union_rec.simps[of _ t1] flip[symmetric] using flip rbt_sorted' rbt_split_props[of t2] by (auto simp: Empty rbt_lookup_fold_rbt_insertwk intro!: rbt_sorted_fold_insertwk split: if_splits option.splits) next case (Branch c l1 a b r1) { assume not_small: "\small_rbt t2'" obtain l2 \ r2 where rbt_split_t2': "rbt_split t2' a = (l2, \, r2)" by (cases "rbt_split t2' a") auto have rbt_sort: "rbt_sorted l1" "rbt_sorted r1" using 1(3,4) flip by (auto simp: Branch split: if_splits) note rbt_split_t2'_props = rbt_split_props[OF rbt_split_t2' rbt_sorted'(2)] have union_l1_l2: "rbt_sorted (rbt_union_rec f' l1 l2)" "rbt_lookup (rbt_union_rec f' l1 l2) k = (case rbt_lookup l1 k of None \ rbt_lookup l2 k | Some v \ (case rbt_lookup l2 k of None \ Some v | Some w \ Some (f' k v w)))" for k using 1(1)[OF flip refl refl _ Branch rbt_split_t2'[symmetric]] rbt_sort rbt_split_t2'_props by (auto simp: not_small) have union_r1_r2: "rbt_sorted (rbt_union_rec f' r1 r2)" "rbt_lookup (rbt_union_rec f' r1 r2) k = (case rbt_lookup r1 k of None \ rbt_lookup r2 k | Some v \ (case rbt_lookup r2 k of None \ Some v | Some w \ Some (f' k v w)))" for k using 1(2)[OF flip refl refl _ Branch rbt_split_t2'[symmetric]] rbt_sort rbt_split_t2'_props by (auto simp: not_small) have union_l1_l2_keys: "set (RBT_Impl.keys (rbt_union_rec f' l1 l2)) = set (RBT_Impl.keys l1) \ set (RBT_Impl.keys l2)" using rbt_sorted'(1) rbt_split_t2'_props by (auto simp: Branch rbt_lookup_iff_keys(1) union_l1_l2 split: option.splits) have union_r1_r2_keys: "set (RBT_Impl.keys (rbt_union_rec f' r1 r2)) = set (RBT_Impl.keys r1) \ set (RBT_Impl.keys r2)" using rbt_sorted'(1) rbt_split_t2'_props by (auto simp: Branch rbt_lookup_iff_keys(1) union_r1_r2 split: option.splits) have union_l1_l2_less: "rbt_union_rec f' l1 l2 |\ a" using rbt_sorted'(1) rbt_split_t2'_props by (auto simp: Branch rbt_less_prop union_l1_l2_keys) have union_r1_r2_greater: "a \| rbt_union_rec f' r1 r2" using rbt_sorted'(1) rbt_split_t2'_props by (auto simp: Branch rbt_greater_prop union_r1_r2_keys) have "rbt_lookup (rbt_union_rec f t1 t2) k = (case rbt_lookup t1' k of None \ rbt_lookup t2' k | Some v \ (case rbt_lookup t2' k of None \ Some v | Some w \ Some (f' k v w)))" using rbt_sorted' union_l1_l2 union_r1_r2 rbt_split_t2'_props union_l1_l2_less union_r1_r2_greater not_small by (auto simp: rbt_union_rec.simps[of _ t1] flip[symmetric] Branch rbt_split_t2' rbt_lookup_rbt_join rbt_split_lookup[OF rbt_split_t2'] split: option.splits) moreover have "rbt_sorted (rbt_union_rec f t1 t2)" using rbt_sorted' rbt_split_t2'_props not_small by (auto simp: rbt_union_rec.simps[of _ t1] flip[symmetric] Branch rbt_split_t2' union_l1_l2 union_r1_r2 union_l1_l2_keys union_r1_r2_keys rbt_less_prop rbt_greater_prop intro!: rbt_sorted_rbt_join) ultimately have ?thesis using flip by (auto split: if_splits option.splits) } then show ?thesis unfolding rbt_union_rec.simps[of _ t1] flip[symmetric] using rbt_sorted' flip by (auto simp: rbt_sorted_fold_insertwk rbt_lookup_fold_rbt_insertwk split: option.splits) qed qed lemma rbtreeify_map_filter_inter: fixes f :: "'a \ 'b \ 'b \ 'b" assumes "rbt_sorted t2" shows "rbt_sorted (rbtreeify (map_filter_inter f t1 t2))" "rbt_lookup (rbtreeify (map_filter_inter f t1 t2)) k = (case rbt_lookup t1 k of None \ None | Some v \ (case rbt_lookup t2 k of None \ None | 
Some w \ Some (f k v w)))" proof - have map_of_map_filter: "map_of (List.map_filter (\(k, v). case rbt_lookup t1 k of None \ None | Some v' \ Some (k, f k v' v)) xs) k = (case rbt_lookup t1 k of None \ None | Some v \ (case map_of xs k of None \ None | Some w \ Some (f k v w)))" for xs k by (induction xs) (auto simp: List.map_filter_def split: option.splits) (* slow *) have map_fst_map_filter: "map fst (List.map_filter (\(k, v). case rbt_lookup t1 k of None \ None | Some v' \ Some (k, f k v' v)) xs) = filter (\k. rbt_lookup t1 k \ None) (map fst xs)" for xs by (induction xs) (auto simp: List.map_filter_def split: option.splits) have "sorted (map fst (map_filter_inter f t1 t2))" using sorted_filter[of id] rbt_sorted_entries[OF assms] by (auto simp: map_filter_inter_def map_fst_map_filter) moreover have "distinct (map fst (map_filter_inter f t1 t2))" using distinct_filter distinct_entries[OF assms] by (auto simp: map_filter_inter_def map_fst_map_filter) ultimately show "rbt_sorted (rbtreeify (map_filter_inter f t1 t2))" "rbt_lookup (rbtreeify (map_filter_inter f t1 t2)) k = (case rbt_lookup t1 k of None \ None | Some v \ (case rbt_lookup t2 k of None \ None | Some w \ Some (f k v w)))" using rbt_sorted_rbtreeify by (auto simp: rbt_lookup_rbtreeify map_filter_inter_def map_of_map_filter map_of_entries[OF assms] split: option.splits) qed lemma rbt_lookup_inter_rec: "rbt_sorted t1 \ rbt_sorted t2 \ rbt_sorted (rbt_inter_rec f t1 t2) \ rbt_lookup (rbt_inter_rec f t1 t2) k = (case rbt_lookup t1 k of None \ None | Some v \ (case rbt_lookup t2 k of None \ None | Some w \ Some (f k v w)))" proof(induction f t1 t2 arbitrary: k rule: rbt_inter_rec.induct) case (1 f t1 t2) obtain f' t1' t2' where flip: "(f', t2', t1') = (if flip_rbt t2 t1 then (\k v v'. f k v' v, t1, t2) else (f, t2, t1))" by fastforce have rbt_sorted': "rbt_sorted t1'" "rbt_sorted t2'" using 1(3,4) flip by (auto split: if_splits) show ?case proof (cases t1') case Empty show ?thesis unfolding rbt_inter_rec.simps[of _ t1] flip[symmetric] using flip rbt_sorted' rbt_split_props[of t2] rbtreeify_map_filter_inter[OF rbt_sorted'(2)] by (auto simp: Empty split: option.splits) next case (Branch c l1 a b r1) { assume not_small: "\small_rbt t2'" obtain l2 \ r2 where rbt_split_t2': "rbt_split t2' a = (l2, \, r2)" by (cases "rbt_split t2' a") auto note rbt_split_t2'_props = rbt_split_props[OF rbt_split_t2' rbt_sorted'(2)] have rbt_sort: "rbt_sorted l1" "rbt_sorted r1" "rbt_sorted l2" "rbt_sorted r2" using 1(3,4) flip by (auto simp: Branch rbt_split_t2'_props split: if_splits) have inter_l1_l2: "rbt_sorted (rbt_inter_rec f' l1 l2)" "rbt_lookup (rbt_inter_rec f' l1 l2) k = (case rbt_lookup l1 k of None \ None | Some v \ (case rbt_lookup l2 k of None \ None | Some w \ Some (f' k v w)))" for k using 1(1)[OF flip refl refl _ Branch rbt_split_t2'[symmetric]] rbt_sort rbt_split_t2'_props by (auto simp: not_small) have inter_r1_r2: "rbt_sorted (rbt_inter_rec f' r1 r2)" "rbt_lookup (rbt_inter_rec f' r1 r2) k = (case rbt_lookup r1 k of None \ None | Some v \ (case rbt_lookup r2 k of None \ None | Some w \ Some (f' k v w)))" for k using 1(2)[OF flip refl refl _ Branch rbt_split_t2'[symmetric]] rbt_sort rbt_split_t2'_props by (auto simp: not_small) have inter_l1_l2_keys: "set (RBT_Impl.keys (rbt_inter_rec f' l1 l2)) = set (RBT_Impl.keys l1) \ set (RBT_Impl.keys l2)" using inter_l1_l2(1) by (auto simp: rbt_lookup_iff_keys(1) inter_l1_l2(2) rbt_sort split: option.splits) have inter_r1_r2_keys: "set (RBT_Impl.keys (rbt_inter_rec f' r1 r2)) = set (RBT_Impl.keys 
r1) \ set (RBT_Impl.keys r2)" using inter_r1_r2(1) by (auto simp: rbt_lookup_iff_keys(1) inter_r1_r2(2) rbt_sort split: option.splits) have inter_l1_l2_less: "rbt_inter_rec f' l1 l2 |\ a" using rbt_sorted'(1) rbt_split_t2'_props by (auto simp: Branch rbt_less_prop inter_l1_l2_keys) have inter_r1_r2_greater: "a \| rbt_inter_rec f' r1 r2" using rbt_sorted'(1) rbt_split_t2'_props by (auto simp: Branch rbt_greater_prop inter_r1_r2_keys) have rbt_lookup_join2: "rbt_lookup (rbt_join2 (rbt_inter_rec f' l1 l2) (rbt_inter_rec f' r1 r2)) k = (case rbt_lookup (rbt_inter_rec f' l1 l2) k of None \ rbt_lookup (rbt_inter_rec f' r1 r2) k | Some v \ Some v)" for k using rbt_lookup_rbt_join2[OF inter_l1_l2(1) inter_r1_r2(1)] rbt_sorted'(1) by (fastforce simp: Branch inter_l1_l2_keys inter_r1_r2_keys rbt_less_prop rbt_greater_prop) have rbt_lookup_l1_k: "rbt_lookup l1 k = Some v \ k < a" for k v using rbt_sorted'(1) rbt_lookup_iff_keys(3) by (auto simp: Branch rbt_less_prop) have rbt_lookup_r1_k: "rbt_lookup r1 k = Some v \ a < k" for k v using rbt_sorted'(1) rbt_lookup_iff_keys(3) by (auto simp: Branch rbt_greater_prop) have "rbt_lookup (rbt_inter_rec f t1 t2) k = (case rbt_lookup t1' k of None \ None | Some v \ (case rbt_lookup t2' k of None \ None | Some w \ Some (f' k v w)))" by (auto simp: Let_def rbt_inter_rec.simps[of _ t1] flip[symmetric] not_small Branch rbt_split_t2' rbt_lookup_join2 rbt_lookup_rbt_join inter_l1_l2_less inter_r1_r2_greater rbt_split_lookup[OF rbt_split_t2' rbt_sorted'(2)] inter_l1_l2 inter_r1_r2 split!: if_splits option.splits dest: rbt_lookup_l1_k rbt_lookup_r1_k) moreover have "rbt_sorted (rbt_inter_rec f t1 t2)" using rbt_sorted' inter_l1_l2 inter_r1_r2 rbt_split_t2'_props not_small by (auto simp: Let_def rbt_inter_rec.simps[of _ t1] flip[symmetric] Branch rbt_split_t2' rbt_less_prop rbt_greater_prop inter_l1_l2_less inter_r1_r2_greater inter_l1_l2_keys inter_r1_r2_keys intro!: rbt_sorted_rbt_join rbt_sorted_rbt_join2 split: if_splits option.splits dest!: bspec) ultimately have ?thesis using flip by (auto split: if_splits split: option.splits) } then show ?thesis unfolding rbt_inter_rec.simps[of _ t1] flip[symmetric] using rbt_sorted' flip rbtreeify_map_filter_inter[OF rbt_sorted'(2)] by (auto split: option.splits) qed qed lemma rbt_lookup_delete: assumes "inv_12 t" "rbt_sorted t" shows "rbt_lookup (rbt_delete x t) k = (if x = k then None else rbt_lookup t k)" proof - note rbt_sorted_del = rbt_del_rbt_sorted[OF assms(2), of x] show ?thesis using assms rbt_sorted_del rbt_del_in_tree rbt_lookup_from_in_tree[OF assms(2) rbt_sorted_del] by (fastforce simp: inv_12_def rbt_delete_def rbt_lookup_iff_keys(2) keys_entries) qed lemma fold_rbt_delete: assumes "inv_12 t1" "rbt_sorted t1" "rbt_sorted t2" shows "inv_12 (RBT_Impl.fold (\k _ t. rbt_delete k t) t2 t1) \ rbt_sorted (RBT_Impl.fold (\k _ t. rbt_delete k t) t2 t1) \ rbt_lookup (RBT_Impl.fold (\k _ t. rbt_delete k t) t2 t1) k = (case rbt_lookup t1 k of None \ None | Some v \ (case rbt_lookup t2 k of None \ Some v | _ \ None))" proof - define xs where "xs = RBT_Impl.entries t2" show "inv_12 (RBT_Impl.fold (\k _ t. rbt_delete k t) t2 t1) \ rbt_sorted (RBT_Impl.fold (\k _ t. rbt_delete k t) t2 t1) \ rbt_lookup (RBT_Impl.fold (\k _ t. 
rbt_delete k t) t2 t1) k = (case rbt_lookup t1 k of None \ None | Some v \ (case rbt_lookup t2 k of None \ Some v | _ \ None))" using assms(1,2) unfolding map_of_entries[OF assms(3), symmetric] RBT_Impl.fold_def xs_def[symmetric] by (induction xs arbitrary: t1 rule: rev_induct) (auto simp: rbt_delete rbt_sorted_delete rbt_lookup_delete split!: option.splits) qed lemma rbtreeify_filter_minus: assumes "rbt_sorted t1" shows "rbt_sorted (rbtreeify (filter_minus t1 t2)) \ rbt_lookup (rbtreeify (filter_minus t1 t2)) k = (case rbt_lookup t1 k of None \ None | Some v \ (case rbt_lookup t2 k of None \ Some v | _ \ None))" proof - have map_of_filter: "map_of (filter (\(k, _). rbt_lookup t2 k = None) xs) k = (case map_of xs k of None \ None | Some v \ (case rbt_lookup t2 k of None \ Some v | Some x \ Map.empty x))" for xs :: "('a \ 'b) list" by (induction xs) (auto split: option.splits) have map_fst_filter_minus: "map fst (filter_minus t1 t2) = filter (\k. rbt_lookup t2 k = None) (map fst (RBT_Impl.entries t1))" by (auto simp: filter_minus_def filter_map comp_def case_prod_unfold) have "sorted (map fst (filter_minus t1 t2))" "distinct (map fst (filter_minus t1 t2))" using distinct_filter distinct_entries[OF assms] sorted_filter[of id] rbt_sorted_entries[OF assms] by (auto simp: map_fst_filter_minus intro!: rbt_sorted_rbtreeify) then show ?thesis by (auto simp: rbt_lookup_rbtreeify filter_minus_def map_of_filter map_of_entries[OF assms] intro!: rbt_sorted_rbtreeify) qed lemma rbt_lookup_minus_rec: "inv_12 t1 \ rbt_sorted t1 \ rbt_sorted t2 \ rbt_sorted (rbt_minus_rec t1 t2) \ rbt_lookup (rbt_minus_rec t1 t2) k = (case rbt_lookup t1 k of None \ None | Some v \ (case rbt_lookup t2 k of None \ Some v | _ \ None))" proof(induction t1 t2 arbitrary: k rule: rbt_minus_rec.induct) case (1 t1 t2) show ?case proof (cases t2) case Empty show ?thesis using rbtreeify_filter_minus[OF 1(4)] 1(4) by (auto simp: rbt_minus_rec.simps[of t1] Empty split: option.splits) next case (Branch c l2 a b r2) { assume not_small: "\small_rbt t2" "\small_rbt t1" obtain l1 \ r1 where rbt_split_t1: "rbt_split t1 a = (l1, \, r1)" by (cases "rbt_split t1 a") auto note rbt_split_t1_props = rbt_split_props[OF rbt_split_t1 1(4)] have minus_l1_l2: "rbt_sorted (rbt_minus_rec l1 l2)" "rbt_lookup (rbt_minus_rec l1 l2) k = (case rbt_lookup l1 k of None \ None | Some v \ (case rbt_lookup l2 k of None \ Some v | Some x \ None))" for k using 1(1)[OF not_small Branch rbt_split_t1[symmetric] refl] 1(5) rbt_split_t1_props rbt_split[OF rbt_split_t1 1(3)] by (auto simp: Branch) have minus_r1_r2: "rbt_sorted (rbt_minus_rec r1 r2)" "rbt_lookup (rbt_minus_rec r1 r2) k = (case rbt_lookup r1 k of None \ None | Some v \ (case rbt_lookup r2 k of None \ Some v | Some x \ None))" for k using 1(2)[OF not_small Branch rbt_split_t1[symmetric] refl] 1(5) rbt_split_t1_props rbt_split[OF rbt_split_t1 1(3)] by (auto simp: Branch) have minus_l1_l2_keys: "set (RBT_Impl.keys (rbt_minus_rec l1 l2)) = set (RBT_Impl.keys l1) - set (RBT_Impl.keys l2)" using minus_l1_l2(1) 1(5) rbt_lookup_iff_keys(3) rbt_split_t1_props by (auto simp: Branch rbt_lookup_iff_keys(1) minus_l1_l2(2) split: option.splits) have minus_r1_r2_keys: "set (RBT_Impl.keys (rbt_minus_rec r1 r2)) = set (RBT_Impl.keys r1) - set (RBT_Impl.keys r2)" using minus_r1_r2(1) 1(5) rbt_lookup_iff_keys(3) rbt_split_t1_props by (auto simp: Branch rbt_lookup_iff_keys(1) minus_r1_r2(2) split: option.splits) have rbt_lookup_join2: "rbt_lookup (rbt_join2 (rbt_minus_rec l1 l2) (rbt_minus_rec r1 r2)) k = (case rbt_lookup 
(rbt_minus_rec l1 l2) k of None \ rbt_lookup (rbt_minus_rec r1 r2) k | Some v \ Some v)" for k using rbt_lookup_rbt_join2[OF minus_l1_l2(1) minus_r1_r2(1)] rbt_split_t1_props by (fastforce simp: minus_l1_l2_keys minus_r1_r2_keys) have lookup_l1_r1_a: "rbt_lookup l1 a = None" "rbt_lookup r1 a = None" using rbt_split_t1_props by (auto simp: rbt_lookup_iff_keys(2)) have "rbt_lookup (rbt_minus_rec t1 t2) k = (case rbt_lookup t1 k of None \ None | Some v \ (case rbt_lookup t2 k of None \ Some v | _ \ None))" using not_small rbt_lookup_iff_keys(2)[of l1] rbt_lookup_iff_keys(3)[of l1] rbt_lookup_iff_keys(3)[of r1] rbt_split_t1_props + using [[simp_depth_limit = 2]] by (auto simp: rbt_minus_rec.simps[of t1] Branch rbt_split_t1 rbt_lookup_join2 minus_l1_l2(2) minus_r1_r2(2) rbt_split_lookup[OF rbt_split_t1 1(4)] lookup_l1_r1_a split: option.splits) moreover have "rbt_sorted (rbt_minus_rec t1 t2)" using not_small minus_l1_l2(1) minus_r1_r2(1) rbt_split_t1_props rbt_sorted_rbt_join2 by (fastforce simp: rbt_minus_rec.simps[of t1] Branch rbt_split_t1 minus_l1_l2_keys minus_r1_r2_keys) ultimately have ?thesis by (auto split: if_splits split: option.splits) } then show ?thesis using fold_rbt_delete[OF 1(3,4,5)] rbtreeify_filter_minus[OF 1(4)] by (auto simp: rbt_minus_rec.simps[of t1]) qed qed end context ord begin definition rbt_union_with_key :: "('a \ 'b \ 'b \ 'b) \ ('a, 'b) rbt \ ('a, 'b) rbt \ ('a, 'b) rbt" where "rbt_union_with_key f t1 t2 = paint B (rbt_union_swap_rec f False t1 t2)" definition rbt_union_with where "rbt_union_with f = rbt_union_with_key (\_. f)" definition rbt_union where "rbt_union = rbt_union_with_key (%_ _ rv. rv)" definition rbt_inter_with_key :: "('a \ 'b \ 'b \ 'b) \ ('a, 'b) rbt \ ('a, 'b) rbt \ ('a, 'b) rbt" where "rbt_inter_with_key f t1 t2 = paint B (rbt_inter_swap_rec f False t1 t2)" definition rbt_inter_with where "rbt_inter_with f = rbt_inter_with_key (\_. f)" definition rbt_inter where "rbt_inter = rbt_inter_with_key (\_ _ rv. 
rv)" definition rbt_minus where "rbt_minus t1 t2 = paint B (rbt_minus_rec t1 t2)" end context linorder begin lemma is_rbt_rbt_unionwk [simp]: "\ is_rbt t1; is_rbt t2 \ \ is_rbt (rbt_union_with_key f t1 t2)" using rbt_union_rec rbt_lookup_union_rec by (fastforce simp: rbt_union_with_key_def rbt_union_swap_rec is_rbt_def inv_12_def) lemma rbt_lookup_rbt_unionwk: "\ rbt_sorted t1; rbt_sorted t2 \ \ rbt_lookup (rbt_union_with_key f t1 t2) k = (case rbt_lookup t1 k of None \ rbt_lookup t2 k | Some v \ case rbt_lookup t2 k of None \ Some v | Some w \ Some (f k v w))" using rbt_lookup_union_rec by (auto simp: rbt_union_with_key_def rbt_union_swap_rec) lemma rbt_unionw_is_rbt: "\ is_rbt lt; is_rbt rt \ \ is_rbt (rbt_union_with f lt rt)" by(simp add: rbt_union_with_def) lemma rbt_union_is_rbt: "\ is_rbt lt; is_rbt rt \ \ is_rbt (rbt_union lt rt)" by(simp add: rbt_union_def) lemma rbt_lookup_rbt_union: "\ rbt_sorted s; rbt_sorted t \ \ rbt_lookup (rbt_union s t) = rbt_lookup s ++ rbt_lookup t" by(rule ext)(simp add: rbt_lookup_rbt_unionwk rbt_union_def map_add_def split: option.split) lemma rbt_interwk_is_rbt [simp]: "\ is_rbt t1; is_rbt t2 \ \ is_rbt (rbt_inter_with_key f t1 t2)" using rbt_inter_rec rbt_lookup_inter_rec by (fastforce simp: rbt_inter_with_key_def rbt_inter_swap_rec is_rbt_def inv_12_def rbt_sorted_paint) lemma rbt_interw_is_rbt: "\ is_rbt t1; is_rbt t2 \ \ is_rbt (rbt_inter_with f t1 t2)" by(simp add: rbt_inter_with_def) lemma rbt_inter_is_rbt: "\ is_rbt t1; is_rbt t2 \ \ is_rbt (rbt_inter t1 t2)" by(simp add: rbt_inter_def) lemma rbt_lookup_rbt_interwk: "\ rbt_sorted t1; rbt_sorted t2 \ \ rbt_lookup (rbt_inter_with_key f t1 t2) k = (case rbt_lookup t1 k of None \ None | Some v \ case rbt_lookup t2 k of None \ None | Some w \ Some (f k v w))" using rbt_lookup_inter_rec by (auto simp: rbt_inter_with_key_def rbt_inter_swap_rec) lemma rbt_lookup_rbt_inter: "\ rbt_sorted t1; rbt_sorted t2 \ \ rbt_lookup (rbt_inter t1 t2) = rbt_lookup t2 |` dom (rbt_lookup t1)" by(auto simp add: rbt_inter_def rbt_lookup_rbt_interwk restrict_map_def split: option.split) lemma rbt_minus_is_rbt: "\ is_rbt t1; is_rbt t2 \ \ is_rbt (rbt_minus t1 t2)" using rbt_minus_rec[of t1 t2] rbt_lookup_minus_rec[of t1 t2] by (auto simp: rbt_minus_def is_rbt_def inv_12_def) lemma rbt_lookup_rbt_minus: "\ is_rbt t1; is_rbt t2 \ \ rbt_lookup (rbt_minus t1 t2) = rbt_lookup t1 |` (- dom (rbt_lookup t2))" by (rule ext) (auto simp: rbt_minus_def is_rbt_def inv_12_def restrict_map_def rbt_lookup_minus_rec split: option.splits) end subsection \Code generator setup\ lemmas [code] = ord.rbt_less_prop ord.rbt_greater_prop ord.rbt_sorted.simps ord.rbt_lookup.simps ord.is_rbt_def ord.rbt_ins.simps ord.rbt_insert_with_key_def ord.rbt_insertw_def ord.rbt_insert_def ord.rbt_del_from_left.simps ord.rbt_del_from_right.simps ord.rbt_del.simps ord.rbt_delete_def ord.rbt_split.simps ord.rbt_union_swap_rec.simps ord.map_filter_inter_def ord.rbt_inter_swap_rec.simps ord.filter_minus_def ord.rbt_minus_rec.simps ord.rbt_union_with_key_def ord.rbt_union_with_def ord.rbt_union_def ord.rbt_inter_with_key_def ord.rbt_inter_with_def ord.rbt_inter_def ord.rbt_minus_def ord.rbt_map_entry.simps ord.rbt_bulkload_def text \More efficient implementations for \<^term>\entries\ and \<^term>\keys\\ definition gen_entries :: "(('a \ 'b) \ ('a, 'b) rbt) list \ ('a, 'b) rbt \ ('a \ 'b) list" where "gen_entries kvts t = entries t @ concat (map (\(kv, t). 
kv # entries t) kvts)" lemma gen_entries_simps [simp, code]: "gen_entries [] Empty = []" "gen_entries ((kv, t) # kvts) Empty = kv # gen_entries kvts t" "gen_entries kvts (Branch c l k v r) = gen_entries (((k, v), r) # kvts) l" by(simp_all add: gen_entries_def) lemma entries_code [code]: "entries = gen_entries []" by(simp add: gen_entries_def fun_eq_iff) definition gen_keys :: "('a \ ('a, 'b) rbt) list \ ('a, 'b) rbt \ 'a list" where "gen_keys kts t = RBT_Impl.keys t @ concat (List.map (\(k, t). k # keys t) kts)" lemma gen_keys_simps [simp, code]: "gen_keys [] Empty = []" "gen_keys ((k, t) # kts) Empty = k # gen_keys kts t" "gen_keys kts (Branch c l k v r) = gen_keys ((k, r) # kts) l" by(simp_all add: gen_keys_def) lemma keys_code [code]: "keys = gen_keys []" by(simp add: gen_keys_def fun_eq_iff) text \Restore original type constraints for constants\ setup \ fold Sign.add_const_constraint [(\<^const_name>\rbt_less\, SOME \<^typ>\('a :: order) \ ('a, 'b) rbt \ bool\), (\<^const_name>\rbt_greater\, SOME \<^typ>\('a :: order) \ ('a, 'b) rbt \ bool\), (\<^const_name>\rbt_sorted\, SOME \<^typ>\('a :: linorder, 'b) rbt \ bool\), (\<^const_name>\rbt_lookup\, SOME \<^typ>\('a :: linorder, 'b) rbt \ 'a \ 'b\), (\<^const_name>\is_rbt\, SOME \<^typ>\('a :: linorder, 'b) rbt \ bool\), (\<^const_name>\rbt_ins\, SOME \<^typ>\('a::linorder \ 'b \ 'b \ 'b) \ 'a \ 'b \ ('a,'b) rbt \ ('a,'b) rbt\), (\<^const_name>\rbt_insert_with_key\, SOME \<^typ>\('a::linorder \ 'b \ 'b \ 'b) \ 'a \ 'b \ ('a,'b) rbt \ ('a,'b) rbt\), (\<^const_name>\rbt_insert_with\, SOME \<^typ>\('b \ 'b \ 'b) \ ('a :: linorder) \ 'b \ ('a,'b) rbt \ ('a,'b) rbt\), (\<^const_name>\rbt_insert\, SOME \<^typ>\('a :: linorder) \ 'b \ ('a,'b) rbt \ ('a,'b) rbt\), (\<^const_name>\rbt_del_from_left\, SOME \<^typ>\('a::linorder) \ ('a,'b) rbt \ 'a \ 'b \ ('a,'b) rbt \ ('a,'b) rbt\), (\<^const_name>\rbt_del_from_right\, SOME \<^typ>\('a::linorder) \ ('a,'b) rbt \ 'a \ 'b \ ('a,'b) rbt \ ('a,'b) rbt\), (\<^const_name>\rbt_del\, SOME \<^typ>\('a::linorder) \ ('a,'b) rbt \ ('a,'b) rbt\), (\<^const_name>\rbt_delete\, SOME \<^typ>\('a::linorder) \ ('a,'b) rbt \ ('a,'b) rbt\), (\<^const_name>\rbt_union_with_key\, SOME \<^typ>\('a::linorder \ 'b \ 'b \ 'b) \ ('a,'b) rbt \ ('a,'b) rbt \ ('a,'b) rbt\), (\<^const_name>\rbt_union_with\, SOME \<^typ>\('b \ 'b \ 'b) \ ('a::linorder,'b) rbt \ ('a,'b) rbt \ ('a,'b) rbt\), (\<^const_name>\rbt_union\, SOME \<^typ>\('a::linorder,'b) rbt \ ('a,'b) rbt \ ('a,'b) rbt\), (\<^const_name>\rbt_map_entry\, SOME \<^typ>\'a::linorder \ ('b \ 'b) \ ('a,'b) rbt \ ('a,'b) rbt\), (\<^const_name>\rbt_bulkload\, SOME \<^typ>\('a \ 'b) list \ ('a::linorder,'b) rbt\)] \ hide_const (open) MR MB R B Empty entries keys fold gen_keys gen_entries end diff --git a/src/HOL/Nonstandard_Analysis/Examples/NSPrimes.thy b/src/HOL/Nonstandard_Analysis/Examples/NSPrimes.thy --- a/src/HOL/Nonstandard_Analysis/Examples/NSPrimes.thy +++ b/src/HOL/Nonstandard_Analysis/Examples/NSPrimes.thy @@ -1,290 +1,288 @@ (* Title : NSPrimes.thy Author : Jacques D. Fleuriot Copyright : 2002 University of Edinburgh Conversion to Isar and new proofs by Lawrence C Paulson, 2004 *) section \The Nonstandard Primes as an Extension of the Prime Numbers\ theory NSPrimes imports "HOL-Computational_Algebra.Primes" "HOL-Nonstandard_Analysis.Hyperreal" begin text \These can be used to derive an alternative proof of the infinitude of primes by considering a property of nonstandard sets.\ definition starprime :: "hypnat set" where [transfer_unfold]: "starprime = *s* {p. 
prime p}" definition choicefun :: "'a set \ 'a" where "choicefun E = (SOME x. \X \ Pow E - {{}}. x \ X)" primrec injf_max :: "nat \ 'a::order set \ 'a" where injf_max_zero: "injf_max 0 E = choicefun E" | injf_max_Suc: "injf_max (Suc n) E = choicefun ({e. e \ E \ injf_max n E < e})" lemma dvd_by_all2: "\N>0. \m. 0 < m \ m \ M \ m dvd N" for M :: nat apply (induct M) apply auto apply (rule_tac x = "N * Suc M" in exI) apply auto apply (metis dvdI dvd_add_times_triv_left_iff dvd_add_triv_right_iff dvd_refl dvd_trans le_Suc_eq mult_Suc_right) done lemma dvd_by_all: "\M::nat. \N>0. \m. 0 < m \ m \ M \ m dvd N" using dvd_by_all2 by blast lemma hypnat_of_nat_le_zero_iff [simp]: "hypnat_of_nat n \ 0 \ n = 0" by transfer simp text \Goldblatt: Exercise 5.11(2) -- p. 57.\ lemma hdvd_by_all: "\M. \N. 0 < N \ (\m::hypnat. 0 < m \ m \ M \ m dvd N)" by transfer (rule dvd_by_all) lemmas hdvd_by_all2 = hdvd_by_all [THEN spec] text \Goldblatt: Exercise 5.11(2) -- p. 57.\ lemma hypnat_dvd_all_hypnat_of_nat: "\N::hypnat. 0 < N \ (\n \ - {0::nat}. hypnat_of_nat n dvd N)" apply (cut_tac hdvd_by_all) apply (drule_tac x = whn in spec) apply auto apply (rule exI) apply auto apply (drule_tac x = "hypnat_of_nat n" in spec) apply (auto simp add: linorder_not_less) done text \The nonstandard extension of the set prime numbers consists of precisely those hypernaturals exceeding 1 that have no nontrivial factors.\ text \Goldblatt: Exercise 5.11(3a) -- p 57.\ lemma starprime: "starprime = {p. 1 < p \ (\m. m dvd p \ m = 1 \ m = p)}" by transfer (auto simp add: prime_nat_iff) text \Goldblatt Exercise 5.11(3b) -- p 57.\ lemma hyperprime_factor_exists: "\n. 1 < n \ \k \ starprime. k dvd n" by transfer (simp add: prime_factor_nat) text \Goldblatt Exercise 3.10(1) -- p. 29.\ lemma NatStar_hypnat_of_nat: "finite A \ *s* A = hypnat_of_nat ` A" by (rule starset_finite) subsection \Another characterization of infinite set of natural numbers\ lemma finite_nat_set_bounded: "finite N \ \n::nat. \i \ N. i < n" apply (erule_tac F = N in finite_induct) apply auto apply (rule_tac x = "Suc n + x" in exI) apply auto done lemma finite_nat_set_bounded_iff: "finite N \ (\n::nat. \i \ N. i < n)" by (blast intro: finite_nat_set_bounded bounded_nat_set_is_finite) lemma not_finite_nat_set_iff: "\ finite N \ (\n::nat. \i \ N. n \ i)" by (auto simp add: finite_nat_set_bounded_iff not_less) lemma bounded_nat_set_is_finite2: "\i::nat \ N. i \ n \ finite N" apply (rule finite_subset) apply (rule_tac [2] finite_atMost) apply auto done lemma finite_nat_set_bounded2: "finite N \ \n::nat. \i \ N. i \ n" apply (erule_tac F = N in finite_induct) apply auto apply (rule_tac x = "n + x" in exI) apply auto done lemma finite_nat_set_bounded_iff2: "finite N \ (\n::nat. \i \ N. i \ n)" by (blast intro: finite_nat_set_bounded2 bounded_nat_set_is_finite2) lemma not_finite_nat_set_iff2: "\ finite N \ (\n::nat. \i \ N. n < i)" by (auto simp add: finite_nat_set_bounded_iff2 not_le) subsection \An injective function cannot define an embedded natural number\ lemma lemma_infinite_set_singleton: "\m n. m \ n \ f n \ f m \ {n. f n = N} = {} \ (\m. {n. f n = N} = {m})" apply auto apply (drule_tac x = x in spec, auto) apply (subgoal_tac "\n. 
f n = f x \ x = n") apply auto done lemma inj_fun_not_hypnat_in_SHNat: fixes f :: "nat \ nat" assumes inj_f: "inj f" shows "starfun f whn \ Nats" proof from inj_f have inj_f': "inj (starfun f)" by (transfer inj_on_def Ball_def UNIV_def) assume "starfun f whn \ Nats" then obtain N where N: "starfun f whn = hypnat_of_nat N" by (auto simp: Nats_def) then have "\n. starfun f n = hypnat_of_nat N" .. then have "\n. f n = N" by transfer then obtain n where "f n = N" .. then have "starfun f (hypnat_of_nat n) = hypnat_of_nat N" by transfer with N have "starfun f whn = starfun f (hypnat_of_nat n)" by simp with inj_f' have "whn = hypnat_of_nat n" by (rule injD) then show False by (simp add: whn_neq_hypnat_of_nat) qed lemma range_subset_mem_starsetNat: "range f \ A \ starfun f whn \ *s* A" apply (rule_tac x="whn" in spec) apply transfer apply auto done text \ Gleason Proposition 11-5.5. pg 149, pg 155 (ex. 3) and pg. 360. Let \E\ be a nonvoid ordered set with no maximal elements (note: effectively an infinite set if we take \E = N\ (Nats)). Then there exists an order-preserving injection from \N\ to \E\. Of course, (as some doofus will undoubtedly point out! :-)) can use notion of least element in proof (i.e. no need for choice) if dealing with nats as we have well-ordering property. \ lemma lemmaPow3: "E \ {} \ \x. \X \ Pow E - {{}}. x \ X" by auto lemma choicefun_mem_set [simp]: "E \ {} \ choicefun E \ E" apply (unfold choicefun_def) apply (rule lemmaPow3 [THEN someI2_ex], auto) done lemma injf_max_mem_set: "E \{} \ \x. \y \ E. x < y \ injf_max n E \ E" apply (induct n) apply force apply (simp add: choicefun_def) apply (rule lemmaPow3 [THEN someI2_ex], auto) done lemma injf_max_order_preserving: "\x. \y \ E. x < y \ injf_max n E < injf_max (Suc n) E" apply (simp add: choicefun_def) apply (rule lemmaPow3 [THEN someI2_ex]) apply auto done lemma injf_max_order_preserving2: "\x. \y \ E. x < y \ \n m. m < n \ injf_max m E < injf_max n E" apply (rule allI) apply (induct_tac n) apply auto apply (simp add: choicefun_def) apply (rule lemmaPow3 [THEN someI2_ex]) apply (auto simp add: less_Suc_eq) apply (drule_tac x = m in spec) apply (drule subsetD) apply auto - apply (drule_tac x = "injf_max m E" in order_less_trans) - apply auto done lemma inj_injf_max: "\x. \y \ E. x < y \ inj (\n. injf_max n E)" apply (rule inj_onI) apply (rule ccontr) apply auto apply (drule injf_max_order_preserving2) apply (metis antisym_conv3 order_less_le) done lemma infinite_set_has_order_preserving_inj: "E \ {} \ \x. \y \ E. x < y \ \f. range f \ E \ inj f \ (\m. f m < f (Suc m))" for E :: "'a::order set" and f :: "nat \ 'a" apply (rule_tac x = "\n. 
injf_max n E" in exI) apply safe apply (rule injf_max_mem_set) apply (rule_tac [3] inj_injf_max) apply (rule_tac [4] injf_max_order_preserving) apply auto done text \Only need the existence of an injective function from \N\ to \A\ for proof.\ lemma hypnat_infinite_has_nonstandard: "\ finite A \ hypnat_of_nat ` A < ( *s* A)" apply auto apply (subgoal_tac "A \ {}") prefer 2 apply force apply (drule infinite_set_has_order_preserving_inj) apply (erule not_finite_nat_set_iff2 [THEN iffD1]) apply auto apply (drule inj_fun_not_hypnat_in_SHNat) apply (drule range_subset_mem_starsetNat) apply (auto simp add: SHNat_eq) done lemma starsetNat_eq_hypnat_of_nat_image_finite: "*s* A = hypnat_of_nat ` A \ finite A" by (metis hypnat_infinite_has_nonstandard less_irrefl) lemma finite_starsetNat_iff: "*s* A = hypnat_of_nat ` A \ finite A" by (blast intro!: starsetNat_eq_hypnat_of_nat_image_finite NatStar_hypnat_of_nat) lemma hypnat_infinite_has_nonstandard_iff: "\ finite A \ hypnat_of_nat ` A < *s* A" apply (rule iffI) apply (blast intro!: hypnat_infinite_has_nonstandard) apply (auto simp add: finite_starsetNat_iff [symmetric]) done subsection \Existence of Infinitely Many Primes: a Nonstandard Proof\ lemma lemma_not_dvd_hypnat_one [simp]: "\ (\n \ - {0}. hypnat_of_nat n dvd 1)" apply auto apply (rule_tac x = 2 in bexI) apply transfer apply auto done lemma lemma_not_dvd_hypnat_one2 [simp]: "\n \ - {0}. \ hypnat_of_nat n dvd 1" using lemma_not_dvd_hypnat_one by (auto simp del: lemma_not_dvd_hypnat_one) lemma hypnat_add_one_gt_one: "\N::hypnat. 0 < N \ 1 < N + 1" by transfer simp lemma hypnat_of_nat_zero_not_prime [simp]: "hypnat_of_nat 0 \ starprime" by transfer simp lemma hypnat_zero_not_prime [simp]: "0 \ starprime" using hypnat_of_nat_zero_not_prime by simp lemma hypnat_of_nat_one_not_prime [simp]: "hypnat_of_nat 1 \ starprime" by transfer simp lemma hypnat_one_not_prime [simp]: "1 \ starprime" using hypnat_of_nat_one_not_prime by simp lemma hdvd_diff: "\k m n :: hypnat. k dvd m \ k dvd n \ k dvd (m - n)" by transfer (rule dvd_diff_nat) lemma hdvd_one_eq_one: "\x::hypnat. is_unit x \ x = 1" by transfer simp text \Already proved as \primes_infinite\, but now using non-standard naturals.\ theorem not_finite_prime: "\ finite {p::nat. prime p}" apply (rule hypnat_infinite_has_nonstandard_iff [THEN iffD2]) using hypnat_dvd_all_hypnat_of_nat apply clarify apply (drule hypnat_add_one_gt_one) apply (drule hyperprime_factor_exists) apply clarify apply (subgoal_tac "k \ hypnat_of_nat ` {p. 
prime p}") apply (force simp: starprime_def) apply (metis Compl_iff add.commute dvd_add_left_iff empty_iff hdvd_one_eq_one hypnat_one_not_prime imageE insert_iff mem_Collect_eq not_prime_0) done end diff --git a/src/HOL/Orderings.thy b/src/HOL/Orderings.thy --- a/src/HOL/Orderings.thy +++ b/src/HOL/Orderings.thy @@ -1,1805 +1,1702 @@ (* Title: HOL/Orderings.thy Author: Tobias Nipkow, Markus Wenzel, and Larry Paulson *) section \Abstract orderings\ theory Orderings imports HOL keywords "print_orders" :: diag begin -ML_file \~~/src/Provers/order.ML\ +ML_file \~~/src/Provers/order_procedure.ML\ +ML_file \~~/src/Provers/order_tac.ML\ subsection \Abstract ordering\ locale partial_preordering = fixes less_eq :: \'a \ 'a \ bool\ (infix \\<^bold>\\ 50) assumes refl: \a \<^bold>\ a\ \ \not \iff\: makes problems due to multiple (dual) interpretations\ and trans: \a \<^bold>\ b \ b \<^bold>\ c \ a \<^bold>\ c\ locale preordering = partial_preordering + fixes less :: \'a \ 'a \ bool\ (infix \\<^bold><\ 50) assumes strict_iff_not: \a \<^bold>< b \ a \<^bold>\ b \ \ b \<^bold>\ a\ begin lemma strict_implies_order: \a \<^bold>< b \ a \<^bold>\ b\ by (simp add: strict_iff_not) lemma irrefl: \ \not \iff\: makes problems due to multiple (dual) interpretations\ \\ a \<^bold>< a\ by (simp add: strict_iff_not) lemma asym: \a \<^bold>< b \ b \<^bold>< a \ False\ by (auto simp add: strict_iff_not) lemma strict_trans1: \a \<^bold>\ b \ b \<^bold>< c \ a \<^bold>< c\ by (auto simp add: strict_iff_not intro: trans) lemma strict_trans2: \a \<^bold>< b \ b \<^bold>\ c \ a \<^bold>< c\ by (auto simp add: strict_iff_not intro: trans) lemma strict_trans: \a \<^bold>< b \ b \<^bold>< c \ a \<^bold>< c\ by (auto intro: strict_trans1 strict_implies_order) end lemma preordering_strictI: \ \Alternative introduction rule with bias towards strict order\ fixes less_eq (infix \\<^bold>\\ 50) and less (infix \\<^bold><\ 50) assumes less_eq_less: \\a b. a \<^bold>\ b \ a \<^bold>< b \ a = b\ assumes asym: \\a b. a \<^bold>< b \ \ b \<^bold>< a\ assumes irrefl: \\a. \ a \<^bold>< a\ assumes trans: \\a b c. a \<^bold>< b \ b \<^bold>< c \ a \<^bold>< c\ shows \preordering (\<^bold>\) (\<^bold><)\ proof fix a b show \a \<^bold>< b \ a \<^bold>\ b \ \ b \<^bold>\ a\ by (auto simp add: less_eq_less asym irrefl) next fix a show \a \<^bold>\ a\ by (auto simp add: less_eq_less) next fix a b c assume \a \<^bold>\ b\ and \b \<^bold>\ c\ then show \a \<^bold>\ c\ by (auto simp add: less_eq_less intro: trans) qed lemma preordering_dualI: fixes less_eq (infix \\<^bold>\\ 50) and less (infix \\<^bold><\ 50) assumes \preordering (\a b. b \<^bold>\ a) (\a b. b \<^bold>< a)\ shows \preordering (\<^bold>\) (\<^bold><)\ proof - from assms interpret preordering \\a b. b \<^bold>\ a\ \\a b. b \<^bold>< a\ . 
show ?thesis by standard (auto simp: strict_iff_not refl intro: trans) qed locale ordering = partial_preordering + fixes less :: \'a \ 'a \ bool\ (infix \\<^bold><\ 50) assumes strict_iff_order: \a \<^bold>< b \ a \<^bold>\ b \ a \ b\ assumes antisym: \a \<^bold>\ b \ b \<^bold>\ a \ a = b\ begin sublocale preordering \(\<^bold>\)\ \(\<^bold><)\ proof show \a \<^bold>< b \ a \<^bold>\ b \ \ b \<^bold>\ a\ for a b by (auto simp add: strict_iff_order intro: antisym) qed lemma strict_implies_not_eq: \a \<^bold>< b \ a \ b\ by (simp add: strict_iff_order) lemma not_eq_order_implies_strict: \a \ b \ a \<^bold>\ b \ a \<^bold>< b\ by (simp add: strict_iff_order) lemma order_iff_strict: \a \<^bold>\ b \ a \<^bold>< b \ a = b\ by (auto simp add: strict_iff_order refl) lemma eq_iff: \a = b \ a \<^bold>\ b \ b \<^bold>\ a\ by (auto simp add: refl intro: antisym) end lemma ordering_strictI: \ \Alternative introduction rule with bias towards strict order\ fixes less_eq (infix \\<^bold>\\ 50) and less (infix \\<^bold><\ 50) assumes less_eq_less: \\a b. a \<^bold>\ b \ a \<^bold>< b \ a = b\ assumes asym: \\a b. a \<^bold>< b \ \ b \<^bold>< a\ assumes irrefl: \\a. \ a \<^bold>< a\ assumes trans: \\a b c. a \<^bold>< b \ b \<^bold>< c \ a \<^bold>< c\ shows \ordering (\<^bold>\) (\<^bold><)\ proof fix a b show \a \<^bold>< b \ a \<^bold>\ b \ a \ b\ by (auto simp add: less_eq_less asym irrefl) next fix a show \a \<^bold>\ a\ by (auto simp add: less_eq_less) next fix a b c assume \a \<^bold>\ b\ and \b \<^bold>\ c\ then show \a \<^bold>\ c\ by (auto simp add: less_eq_less intro: trans) next fix a b assume \a \<^bold>\ b\ and \b \<^bold>\ a\ then show \a = b\ by (auto simp add: less_eq_less asym) qed lemma ordering_dualI: fixes less_eq (infix \\<^bold>\\ 50) and less (infix \\<^bold><\ 50) assumes \ordering (\a b. b \<^bold>\ a) (\a b. b \<^bold>< a)\ shows \ordering (\<^bold>\) (\<^bold><)\ proof - from assms interpret ordering \\a b. b \<^bold>\ a\ \\a b. b \<^bold>< a\ . 
show ?thesis by standard (auto simp: strict_iff_order refl intro: antisym trans) qed locale ordering_top = ordering + fixes top :: \'a\ (\\<^bold>\\) assumes extremum [simp]: \a \<^bold>\ \<^bold>\\ begin lemma extremum_uniqueI: \\<^bold>\ \<^bold>\ a \ a = \<^bold>\\ by (rule antisym) auto lemma extremum_unique: \\<^bold>\ \<^bold>\ a \ a = \<^bold>\\ by (auto intro: antisym) lemma extremum_strict [simp]: \\ (\<^bold>\ \<^bold>< a)\ using extremum [of a] by (auto simp add: order_iff_strict intro: asym irrefl) lemma not_eq_extremum: \a \ \<^bold>\ \ a \<^bold>< \<^bold>\\ by (auto simp add: order_iff_strict intro: not_eq_order_implies_strict extremum) end subsection \Syntactic orders\ class ord = fixes less_eq :: "'a \ 'a \ bool" and less :: "'a \ 'a \ bool" begin notation less_eq ("'(\')") and less_eq ("(_/ \ _)" [51, 51] 50) and less ("'(<')") and less ("(_/ < _)" [51, 51] 50) abbreviation (input) greater_eq (infix "\" 50) where "x \ y \ y \ x" abbreviation (input) greater (infix ">" 50) where "x > y \ y < x" notation (ASCII) less_eq ("'(<=')") and less_eq ("(_/ <= _)" [51, 51] 50) notation (input) greater_eq (infix ">=" 50) end subsection \Quasi orders\ class preorder = ord + assumes less_le_not_le: "x < y \ x \ y \ \ (y \ x)" and order_refl [iff]: "x \ x" and order_trans: "x \ y \ y \ z \ x \ z" begin sublocale order: preordering less_eq less + dual_order: preordering greater_eq greater proof - interpret preordering less_eq less by standard (auto intro: order_trans simp add: less_le_not_le) show \preordering less_eq less\ by (fact preordering_axioms) then show \preordering greater_eq greater\ by (rule preordering_dualI) qed text \Reflexivity.\ lemma eq_refl: "x = y \ x \ y" \ \This form is useful with the classical reasoner.\ by (erule ssubst) (rule order_refl) lemma less_irrefl [iff]: "\ x < x" by (simp add: less_le_not_le) lemma less_imp_le: "x < y \ x \ y" by (simp add: less_le_not_le) text \Asymmetry.\ lemma less_not_sym: "x < y \ \ (y < x)" by (simp add: less_le_not_le) lemma less_asym: "x < y \ (\ P \ y < x) \ P" by (drule less_not_sym, erule contrapos_np) simp text \Transitivity.\ lemma less_trans: "x < y \ y < z \ x < z" by (auto simp add: less_le_not_le intro: order_trans) lemma le_less_trans: "x \ y \ y < z \ x < z" by (auto simp add: less_le_not_le intro: order_trans) lemma less_le_trans: "x < y \ y \ z \ x < z" by (auto simp add: less_le_not_le intro: order_trans) text \Useful for simplification, but too risky to include by default.\ lemma less_imp_not_less: "x < y \ (\ y < x) \ True" by (blast elim: less_asym) lemma less_imp_triv: "x < y \ (y < x \ P) \ True" by (blast elim: less_asym) text \Transitivity rules for calculational reasoning\ lemma less_asym': "a < b \ b < a \ P" by (rule less_asym) text \Dual order\ lemma dual_preorder: \class.preorder (\) (>)\ by standard (auto simp add: less_le_not_le intro: order_trans) end subsection \Partial orders\ class order = preorder + assumes order_antisym: "x \ y \ y \ x \ x = y" begin lemma less_le: "x < y \ x \ y \ x \ y" by (auto simp add: less_le_not_le intro: order_antisym) sublocale order: ordering less_eq less + dual_order: ordering greater_eq greater proof - interpret ordering less_eq less by standard (auto intro: order_antisym order_trans simp add: less_le) show "ordering less_eq less" by (fact ordering_axioms) then show "ordering greater_eq greater" by (rule ordering_dualI) qed print_theorems text \Reflexivity.\ lemma le_less: "x \ y \ x < y \ x = y" \ \NOT suitable for iff, since it can cause PROOF FAILED.\ by (fact 
order.order_iff_strict) lemma le_imp_less_or_eq: "x \ y \ x < y \ x = y" by (simp add: less_le) text \Useful for simplification, but too risky to include by default.\ lemma less_imp_not_eq: "x < y \ (x = y) \ False" by auto lemma less_imp_not_eq2: "x < y \ (y = x) \ False" by auto text \Transitivity rules for calculational reasoning\ lemma neq_le_trans: "a \ b \ a \ b \ a < b" by (fact order.not_eq_order_implies_strict) lemma le_neq_trans: "a \ b \ a \ b \ a < b" by (rule order.not_eq_order_implies_strict) text \Asymmetry.\ lemma order_eq_iff: "x = y \ x \ y \ y \ x" by (fact order.eq_iff) lemma antisym_conv: "y \ x \ x \ y \ x = y" by (simp add: order.eq_iff) lemma less_imp_neq: "x < y \ x \ y" by (fact order.strict_implies_not_eq) lemma antisym_conv1: "\ x < y \ x \ y \ x = y" by (simp add: local.le_less) lemma antisym_conv2: "x \ y \ \ x < y \ x = y" by (simp add: local.less_le) lemma leD: "y \ x \ \ x < y" by (auto simp: less_le order.antisym) text \Least value operator\ definition (in ord) Least :: "('a \ bool) \ 'a" (binder "LEAST " 10) where "Least P = (THE x. P x \ (\y. P y \ x \ y))" lemma Least_equality: assumes "P x" and "\y. P y \ x \ y" shows "Least P = x" unfolding Least_def by (rule the_equality) (blast intro: assms order.antisym)+ lemma LeastI2_order: assumes "P x" and "\y. P y \ x \ y" and "\x. P x \ \y. P y \ x \ y \ Q x" shows "Q (Least P)" unfolding Least_def by (rule theI2) (blast intro: assms order.antisym)+ lemma Least_ex1: assumes "\!x. P x \ (\y. P y \ x \ y)" shows Least1I: "P (Least P)" and Least1_le: "P z \ Least P \ z" using theI'[OF assms] unfolding Least_def by auto text \Greatest value operator\ definition Greatest :: "('a \ bool) \ 'a" (binder "GREATEST " 10) where "Greatest P = (THE x. P x \ (\y. P y \ x \ y))" lemma GreatestI2_order: "\ P x; \y. P y \ x \ y; \x. \ P x; \y. P y \ x \ y \ \ Q x \ \ Q (Greatest P)" unfolding Greatest_def by (rule theI2) (blast intro: order.antisym)+ lemma Greatest_equality: "\ P x; \y. P y \ x \ y \ \ Greatest P = x" unfolding Greatest_def by (rule the_equality) (blast intro: order.antisym)+ end lemma ordering_orderI: fixes less_eq (infix "\<^bold>\" 50) and less (infix "\<^bold><" 50) assumes "ordering less_eq less" shows "class.order less_eq less" proof - from assms interpret ordering less_eq less . show ?thesis by standard (auto intro: antisym trans simp add: refl strict_iff_order) qed lemma order_strictI: fixes less (infix "\" 50) and less_eq (infix "\" 50) assumes "\a b. a \ b \ a \ b \ a = b" assumes "\a b. a \ b \ \ b \ a" assumes "\a. \ a \ a" assumes "\a b c. a \ b \ b \ c \ a \ c" shows "class.order less_eq less" by (rule ordering_orderI) (rule ordering_strictI, (fact assms)+) context order begin text \Dual order\ lemma dual_order: "class.order (\) (>)" using dual_order.ordering_axioms by (rule ordering_orderI) end subsection \Linear (total) orders\ class linorder = order + assumes linear: "x \ y \ y \ x" begin lemma less_linear: "x < y \ x = y \ y < x" unfolding less_le using less_le linear by blast lemma le_less_linear: "x \ y \ y < x" by (simp add: le_less less_linear) lemma le_cases [case_names le ge]: "(x \ y \ P) \ (y \ x \ P) \ P" using linear by blast lemma (in linorder) le_cases3: "\\x \ y; y \ z\ \ P; \y \ x; x \ z\ \ P; \x \ z; z \ y\ \ P; \z \ y; y \ x\ \ P; \y \ z; z \ x\ \ P; \z \ x; x \ y\ \ P\ \ P" by (blast intro: le_cases) lemma linorder_cases [case_names less equal greater]: "(x < y \ P) \ (x = y \ P) \ (y < x \ P) \ P" using less_linear by blast lemma linorder_wlog[case_names le sym]: "(\a b. 
a \ b \ P a b) \ (\a b. P b a \ P a b) \ P a b" by (cases rule: le_cases[of a b]) blast+ lemma not_less: "\ x < y \ y \ x" unfolding less_le using linear by (blast intro: order.antisym) lemma not_less_iff_gr_or_eq: "\(x < y) \ (x > y \ x = y)" by (auto simp add:not_less le_less) lemma not_le: "\ x \ y \ y < x" unfolding less_le using linear by (blast intro: order.antisym) lemma neq_iff: "x \ y \ x < y \ y < x" by (cut_tac x = x and y = y in less_linear, auto) lemma neqE: "x \ y \ (x < y \ R) \ (y < x \ R) \ R" by (simp add: neq_iff) blast lemma antisym_conv3: "\ y < x \ \ x < y \ x = y" by (blast intro: order.antisym dest: not_less [THEN iffD1]) lemma leI: "\ x < y \ y \ x" unfolding not_less . lemma not_le_imp_less: "\ y \ x \ x < y" unfolding not_le . lemma linorder_less_wlog[case_names less refl sym]: "\\a b. a < b \ P a b; \a. P a a; \a b. P b a \ P a b\ \ P a b" using antisym_conv3 by blast text \Dual order\ lemma dual_linorder: "class.linorder (\) (>)" by (rule class.linorder.intro, rule dual_order) (unfold_locales, rule linear) end text \Alternative introduction rule with bias towards strict order\ lemma linorder_strictI: fixes less_eq (infix "\<^bold>\" 50) and less (infix "\<^bold><" 50) assumes "class.order less_eq less" assumes trichotomy: "\a b. a \<^bold>< b \ a = b \ b \<^bold>< a" shows "class.linorder less_eq less" proof - interpret order less_eq less by (fact \class.order less_eq less\) show ?thesis proof fix a b show "a \<^bold>\ b \ b \<^bold>\ a" using trichotomy by (auto simp add: le_less) qed qed subsection \Reasoning tools setup\ ML \ -signature ORDERS = -sig - val print_structures: Proof.context -> unit - val order_tac: Proof.context -> thm list -> int -> tactic - val add_struct: string * term list -> string -> attribute - val del_struct: string * term list -> attribute -end; - -structure Orders: ORDERS = -struct - -(* context data *) +structure Logic_Signature : LOGIC_SIGNATURE = struct + val mk_Trueprop = HOLogic.mk_Trueprop + val dest_Trueprop = HOLogic.dest_Trueprop + val Trueprop_conv = HOLogic.Trueprop_conv + val Not = HOLogic.Not + val conj = HOLogic.conj + val disj = HOLogic.disj + + val notI = @{thm notI} + val ccontr = @{thm ccontr} + val conjI = @{thm conjI} + val conjE = @{thm conjE} + val disjE = @{thm disjE} -fun struct_eq ((s1: string, ts1), (s2, ts2)) = - s1 = s2 andalso eq_list (op aconv) (ts1, ts2); + val not_not_conv = Conv.rewr_conv @{thm eq_reflection[OF not_not]} + val de_Morgan_conj_conv = Conv.rewr_conv @{thm eq_reflection[OF de_Morgan_conj]} + val de_Morgan_disj_conv = Conv.rewr_conv @{thm eq_reflection[OF de_Morgan_disj]} + val conj_disj_distribL_conv = Conv.rewr_conv @{thm eq_reflection[OF conj_disj_distribL]} + val conj_disj_distribR_conv = Conv.rewr_conv @{thm eq_reflection[OF conj_disj_distribR]} +end -structure Data = Generic_Data -( - type T = ((string * term list) * Order_Tac.less_arith) list; - (* Order structures: - identifier of the structure, list of operations and record of theorems - needed to set up the transitivity reasoner, - identifier and operations identify the structure uniquely. *) - val empty = []; - val extend = I; - fun merge data = AList.join struct_eq (K fst) data; -); +structure HOL_Base_Order_Tac = Base_Order_Tac( + structure Logic_Sig = Logic_Signature; + (* Exclude types with specialised solvers. 
*) + val excluded_types = [HOLogic.natT, HOLogic.intT, HOLogic.realT] +) -fun print_structures ctxt = +structure HOL_Order_Tac = Order_Tac(structure Base_Tac = HOL_Base_Order_Tac) + +fun print_orders ctxt0 = let - val structs = Data.get (Context.Proof ctxt); + val ctxt = Config.put show_sorts true ctxt0 + val orders = HOL_Order_Tac.Data.get (Context.Proof ctxt) fun pretty_term t = Pretty.block [Pretty.quote (Syntax.pretty_term ctxt t), Pretty.brk 1, Pretty.str "::", Pretty.brk 1, - Pretty.quote (Syntax.pretty_typ ctxt (type_of t))]; - fun pretty_struct ((s, ts), _) = Pretty.block - [Pretty.str s, Pretty.str ":", Pretty.brk 1, - Pretty.enclose "(" ")" (Pretty.breaks (map pretty_term ts))]; + Pretty.quote (Syntax.pretty_typ ctxt (type_of t)), Pretty.brk 1] + fun pretty_order ({kind = kind, ops = ops, ...}, _) = + Pretty.block ([Pretty.str (@{make_string} kind), Pretty.str ":", Pretty.brk 1] + @ map pretty_term ops) in - Pretty.writeln (Pretty.big_list "order structures:" (map pretty_struct structs)) - end; + Pretty.writeln (Pretty.big_list "order structures:" (map pretty_order orders)) + end val _ = Outer_Syntax.command \<^command_keyword>\print_orders\ "print order structures available to transitivity reasoner" - (Scan.succeed (Toplevel.keep (print_structures o Toplevel.context_of))); - - -(* tactics *) + (Scan.succeed (Toplevel.keep (print_orders o Toplevel.context_of))) -fun struct_tac ((s, ops), thms) ctxt facts = - let - val [eq, le, less] = ops; - fun decomp thy (\<^const>\Trueprop\ $ t) = - let - fun excluded t = - (* exclude numeric types: linear arithmetic subsumes transitivity *) - let val T = type_of t - in - T = HOLogic.natT orelse T = HOLogic.intT orelse T = HOLogic.realT - end; - fun rel (bin_op $ t1 $ t2) = - if excluded t1 then NONE - else if Pattern.matches thy (eq, bin_op) then SOME (t1, "=", t2) - else if Pattern.matches thy (le, bin_op) then SOME (t1, "<=", t2) - else if Pattern.matches thy (less, bin_op) then SOME (t1, "<", t2) - else NONE - | rel _ = NONE; - fun dec (Const (\<^const_name>\Not\, _) $ t) = - (case rel t of NONE => - NONE - | SOME (t1, rel, t2) => SOME (t1, "~" ^ rel, t2)) - | dec x = rel x; - in dec t end - | decomp _ _ = NONE; - in - (case s of - "order" => Order_Tac.partial_tac decomp thms ctxt facts - | "linorder" => Order_Tac.linear_tac decomp thms ctxt facts - | _ => error ("Unknown order kind " ^ quote s ^ " encountered in transitivity reasoner")) - end - -fun order_tac ctxt facts = - FIRST' (map (fn s => CHANGED o struct_tac s ctxt facts) (Data.get (Context.Proof ctxt))); - - -(* attributes *) - -fun add_struct s tag = - Thm.declaration_attribute - (fn thm => Data.map (AList.map_default struct_eq (s, Order_Tac.empty TrueI) (Order_Tac.update tag thm))); -fun del_struct s = - Thm.declaration_attribute - (fn _ => Data.map (AList.delete struct_eq s)); - -end; \ -attribute_setup order = \ - Scan.lift ((Args.add -- Args.name >> (fn (_, s) => SOME s) || Args.del >> K NONE) --| - Args.colon (* FIXME || Scan.succeed true *) ) -- Scan.lift Args.name -- - Scan.repeat Args.term - >> (fn ((SOME tag, n), ts) => Orders.add_struct (n, ts) tag - | ((NONE, n), ts) => Orders.del_struct (n, ts)) -\ "theorems controlling transitivity reasoner" - method_setup order = \ - Scan.succeed (fn ctxt => SIMPLE_METHOD' (Orders.order_tac ctxt [])) + Scan.succeed (fn ctxt => SIMPLE_METHOD' (HOL_Order_Tac.tac [] ctxt)) \ "transitivity reasoner" text \Declarations to set up transitivity reasoner of partial and linear orders.\ context order begin -(* The type constraint on @{term 
(=}) below is necessary since the operation - is not a parameter of the locale. *) - -declare less_irrefl [THEN notE, order add less_reflE: order "(=) :: 'a \ 'a \ bool" "(<=)" "(<)"] - -declare order_refl [order add le_refl: order "(=) :: 'a => 'a => bool" "(<=)" "(<)"] - -declare less_imp_le [order add less_imp_le: order "(=) :: 'a => 'a => bool" "(<=)" "(<)"] - -declare order.antisym [order add eqI: order "(=) :: 'a => 'a => bool" "(<=)" "(<)"] - -declare eq_refl [order add eqD1: order "(=) :: 'a => 'a => bool" "(<=)" "(<)"] - -declare sym [THEN eq_refl, order add eqD2: order "(=) :: 'a => 'a => bool" "(<=)" "(<)"] +lemma nless_le: "(\ a < b) \ (\ a \ b) \ a = b" + using local.dual_order.order_iff_strict by blast -declare less_trans [order add less_trans: order "(=) :: 'a => 'a => bool" "(<=)" "(<)"] - -declare less_le_trans [order add less_le_trans: order "(=) :: 'a => 'a => bool" "(<=)" "(<)"] - -declare le_less_trans [order add le_less_trans: order "(=) :: 'a => 'a => bool" "(<=)" "(<)"] - -declare order_trans [order add le_trans: order "(=) :: 'a => 'a => bool" "(<=)" "(<)"] - -declare le_neq_trans [order add le_neq_trans: order "(=) :: 'a => 'a => bool" "(<=)" "(<)"] - -declare neq_le_trans [order add neq_le_trans: order "(=) :: 'a => 'a => bool" "(<=)" "(<)"] - -declare less_imp_neq [order add less_imp_neq: order "(=) :: 'a => 'a => bool" "(<=)" "(<)"] - -declare eq_neq_eq_imp_neq [order add eq_neq_eq_imp_neq: order "(=) :: 'a => 'a => bool" "(<=)" "(<)"] - -declare not_sym [order add not_sym: order "(=) :: 'a => 'a => bool" "(<=)" "(<)"] +local_setup \ + HOL_Order_Tac.declare_order { + ops = {eq = @{term \(=) :: 'a \ 'a \ bool\}, le = @{term \(\)\}, lt = @{term \(<)\}}, + thms = {trans = @{thm order_trans}, refl = @{thm order_refl}, eqD1 = @{thm eq_refl}, + eqD2 = @{thm eq_refl[OF sym]}, antisym = @{thm order_antisym}, contr = @{thm notE}}, + conv_thms = {less_le = @{thm eq_reflection[OF less_le]}, + nless_le = @{thm eq_reflection[OF nless_le]}} + } +\ end context linorder begin -declare [[order del: order "(=) :: 'a => 'a => bool" "(<=)" "(<)"]] - -declare less_irrefl [THEN notE, order add less_reflE: linorder "(=) :: 'a => 'a => bool" "(<=)" "(<)"] - -declare order_refl [order add le_refl: linorder "(=) :: 'a => 'a => bool" "(<=)" "(<)"] - -declare less_imp_le [order add less_imp_le: linorder "(=) :: 'a => 'a => bool" "(<=)" "(<)"] - -declare not_less [THEN iffD2, order add not_lessI: linorder "(=) :: 'a => 'a => bool" "(<=)" "(<)"] - -declare not_le [THEN iffD2, order add not_leI: linorder "(=) :: 'a => 'a => bool" "(<=)" "(<)"] - -declare not_less [THEN iffD1, order add not_lessD: linorder "(=) :: 'a => 'a => bool" "(<=)" "(<)"] - -declare not_le [THEN iffD1, order add not_leD: linorder "(=) :: 'a => 'a => bool" "(<=)" "(<)"] - -declare order.antisym [order add eqI: linorder "(=) :: 'a => 'a => bool" "(<=)" "(<)"] +lemma nle_le: "(\ a \ b) \ b \ a \ b \ a" + using not_le less_le by simp -declare eq_refl [order add eqD1: linorder "(=) :: 'a => 'a => bool" "(<=)" "(<)"] - -declare sym [THEN eq_refl, order add eqD2: linorder "(=) :: 'a => 'a => bool" "(<=)" "(<)"] - -declare less_trans [order add less_trans: linorder "(=) :: 'a => 'a => bool" "(<=)" "(<)"] - -declare less_le_trans [order add less_le_trans: linorder "(=) :: 'a => 'a => bool" "(<=)" "(<)"] - -declare le_less_trans [order add le_less_trans: linorder "(=) :: 'a => 'a => bool" "(<=)" "(<)"] - -declare order_trans [order add le_trans: linorder "(=) :: 'a => 'a => bool" "(<=)" "(<)"] - -declare le_neq_trans [order add 
le_neq_trans: linorder "(=) :: 'a => 'a => bool" "(<=)" "(<)"] - -declare neq_le_trans [order add neq_le_trans: linorder "(=) :: 'a => 'a => bool" "(<=)" "(<)"] - -declare less_imp_neq [order add less_imp_neq: linorder "(=) :: 'a => 'a => bool" "(<=)" "(<)"] - -declare eq_neq_eq_imp_neq [order add eq_neq_eq_imp_neq: linorder "(=) :: 'a => 'a => bool" "(<=)" "(<)"] - -declare not_sym [order add not_sym: linorder "(=) :: 'a => 'a => bool" "(<=)" "(<)"] +local_setup \ + HOL_Order_Tac.declare_linorder { + ops = {eq = @{term \(=) :: 'a \ 'a \ bool\}, le = @{term \(\)\}, lt = @{term \(<)\}}, + thms = {trans = @{thm order_trans}, refl = @{thm order_refl}, eqD1 = @{thm eq_refl}, + eqD2 = @{thm eq_refl[OF sym]}, antisym = @{thm order_antisym}, contr = @{thm notE}}, + conv_thms = {less_le = @{thm eq_reflection[OF less_le]}, + nless_le = @{thm eq_reflection[OF not_less]}, + nle_le = @{thm eq_reflection[OF nle_le]}} + } +\ end setup \ map_theory_simpset (fn ctxt0 => ctxt0 addSolver - mk_solver "Transitivity" (fn ctxt => Orders.order_tac ctxt (Simplifier.prems_of ctxt))) - (*Adding the transitivity reasoners also as safe solvers showed a slight - speed up, but the reasoning strength appears to be not higher (at least - no breaking of additional proofs in the entire HOL distribution, as - of 5 March 2004, was observed).*) + mk_solver "Transitivity" (fn ctxt => HOL_Order_Tac.tac (Simplifier.prems_of ctxt) ctxt)) \ ML \ local fun prp t thm = Thm.prop_of thm = t; (* FIXME proper aconv!? *) in fun antisym_le_simproc ctxt ct = (case Thm.term_of ct of (le as Const (_, T)) $ r $ s => (let val prems = Simplifier.prems_of ctxt; val less = Const (\<^const_name>\less\, T); val t = HOLogic.mk_Trueprop(le $ s $ r); in (case find_first (prp t) prems of NONE => let val t = HOLogic.mk_Trueprop(HOLogic.Not $ (less $ r $ s)) in (case find_first (prp t) prems of NONE => NONE | SOME thm => SOME(mk_meta_eq(thm RS @{thm antisym_conv1}))) end | SOME thm => SOME (mk_meta_eq (thm RS @{thm order_class.antisym_conv}))) end handle THM _ => NONE) | _ => NONE); fun antisym_less_simproc ctxt ct = (case Thm.term_of ct of NotC $ ((less as Const(_,T)) $ r $ s) => (let val prems = Simplifier.prems_of ctxt; val le = Const (\<^const_name>\less_eq\, T); val t = HOLogic.mk_Trueprop(le $ r $ s); in (case find_first (prp t) prems of NONE => let val t = HOLogic.mk_Trueprop (NotC $ (less $ s $ r)) in (case find_first (prp t) prems of NONE => NONE | SOME thm => SOME (mk_meta_eq(thm RS @{thm linorder_class.antisym_conv3}))) end | SOME thm => SOME (mk_meta_eq (thm RS @{thm antisym_conv2}))) - end handle THM _ => NONE) + end handle THM _ => NONE) | _ => NONE); end; \ simproc_setup antisym_le ("(x::'a::order) \ y") = "K antisym_le_simproc" simproc_setup antisym_less ("\ (x::'a::linorder) < y") = "K antisym_less_simproc" subsection \Bounded quantifiers\ syntax (ASCII) "_All_less" :: "[idt, 'a, bool] => bool" ("(3ALL _<_./ _)" [0, 0, 10] 10) "_Ex_less" :: "[idt, 'a, bool] => bool" ("(3EX _<_./ _)" [0, 0, 10] 10) "_All_less_eq" :: "[idt, 'a, bool] => bool" ("(3ALL _<=_./ _)" [0, 0, 10] 10) "_Ex_less_eq" :: "[idt, 'a, bool] => bool" ("(3EX _<=_./ _)" [0, 0, 10] 10) "_All_greater" :: "[idt, 'a, bool] => bool" ("(3ALL _>_./ _)" [0, 0, 10] 10) "_Ex_greater" :: "[idt, 'a, bool] => bool" ("(3EX _>_./ _)" [0, 0, 10] 10) "_All_greater_eq" :: "[idt, 'a, bool] => bool" ("(3ALL _>=_./ _)" [0, 0, 10] 10) "_Ex_greater_eq" :: "[idt, 'a, bool] => bool" ("(3EX _>=_./ _)" [0, 0, 10] 10) "_All_neq" :: "[idt, 'a, bool] => bool" ("(3ALL _~=_./ _)" [0, 0, 10] 10) "_Ex_neq" 
:: "[idt, 'a, bool] => bool" ("(3EX _~=_./ _)" [0, 0, 10] 10) syntax "_All_less" :: "[idt, 'a, bool] => bool" ("(3\_<_./ _)" [0, 0, 10] 10) "_Ex_less" :: "[idt, 'a, bool] => bool" ("(3\_<_./ _)" [0, 0, 10] 10) "_All_less_eq" :: "[idt, 'a, bool] => bool" ("(3\_\_./ _)" [0, 0, 10] 10) "_Ex_less_eq" :: "[idt, 'a, bool] => bool" ("(3\_\_./ _)" [0, 0, 10] 10) "_All_greater" :: "[idt, 'a, bool] => bool" ("(3\_>_./ _)" [0, 0, 10] 10) "_Ex_greater" :: "[idt, 'a, bool] => bool" ("(3\_>_./ _)" [0, 0, 10] 10) "_All_greater_eq" :: "[idt, 'a, bool] => bool" ("(3\_\_./ _)" [0, 0, 10] 10) "_Ex_greater_eq" :: "[idt, 'a, bool] => bool" ("(3\_\_./ _)" [0, 0, 10] 10) "_All_neq" :: "[idt, 'a, bool] => bool" ("(3\_\_./ _)" [0, 0, 10] 10) "_Ex_neq" :: "[idt, 'a, bool] => bool" ("(3\_\_./ _)" [0, 0, 10] 10) syntax (input) "_All_less" :: "[idt, 'a, bool] => bool" ("(3! _<_./ _)" [0, 0, 10] 10) "_Ex_less" :: "[idt, 'a, bool] => bool" ("(3? _<_./ _)" [0, 0, 10] 10) "_All_less_eq" :: "[idt, 'a, bool] => bool" ("(3! _<=_./ _)" [0, 0, 10] 10) "_Ex_less_eq" :: "[idt, 'a, bool] => bool" ("(3? _<=_./ _)" [0, 0, 10] 10) "_All_neq" :: "[idt, 'a, bool] => bool" ("(3! _~=_./ _)" [0, 0, 10] 10) "_Ex_neq" :: "[idt, 'a, bool] => bool" ("(3? _~=_./ _)" [0, 0, 10] 10) translations "\x "\x. x < y \ P" "\x "\x. x < y \ P" "\x\y. P" \ "\x. x \ y \ P" "\x\y. P" \ "\x. x \ y \ P" "\x>y. P" \ "\x. x > y \ P" "\x>y. P" \ "\x. x > y \ P" "\x\y. P" \ "\x. x \ y \ P" "\x\y. P" \ "\x. x \ y \ P" "\x\y. P" \ "\x. x \ y \ P" "\x\y. P" \ "\x. x \ y \ P" print_translation \ let val All_binder = Mixfix.binder_name \<^const_syntax>\All\; val Ex_binder = Mixfix.binder_name \<^const_syntax>\Ex\; val impl = \<^const_syntax>\HOL.implies\; val conj = \<^const_syntax>\HOL.conj\; val less = \<^const_syntax>\less\; val less_eq = \<^const_syntax>\less_eq\; val trans = [((All_binder, impl, less), (\<^syntax_const>\_All_less\, \<^syntax_const>\_All_greater\)), ((All_binder, impl, less_eq), (\<^syntax_const>\_All_less_eq\, \<^syntax_const>\_All_greater_eq\)), ((Ex_binder, conj, less), (\<^syntax_const>\_Ex_less\, \<^syntax_const>\_Ex_greater\)), ((Ex_binder, conj, less_eq), (\<^syntax_const>\_Ex_less_eq\, \<^syntax_const>\_Ex_greater_eq\))]; fun matches_bound v t = (case t of Const (\<^syntax_const>\_bound\, _) $ Free (v', _) => v = v' | _ => false); fun contains_var v = Term.exists_subterm (fn Free (x, _) => x = v | _ => false); fun mk x c n P = Syntax.const c $ Syntax_Trans.mark_bound_body x $ n $ P; fun tr' q = (q, fn _ => (fn [Const (\<^syntax_const>\_bound\, _) $ Free (v, T), Const (c, _) $ (Const (d, _) $ t $ u) $ P] => (case AList.lookup (=) trans (q, c, d) of NONE => raise Match | SOME (l, g) => if matches_bound v t andalso not (contains_var v u) then mk (v, T) l u P else if matches_bound v u andalso not (contains_var v t) then mk (v, T) g t P else raise Match) | _ => raise Match)); in [tr' All_binder, tr' Ex_binder] end \ subsection \Transitivity reasoning\ context ord begin lemma ord_le_eq_trans: "a \ b \ b = c \ a \ c" by (rule subst) lemma ord_eq_le_trans: "a = b \ b \ c \ a \ c" by (rule ssubst) lemma ord_less_eq_trans: "a < b \ b = c \ a < c" by (rule subst) lemma ord_eq_less_trans: "a = b \ b < c \ a < c" by (rule ssubst) end lemma order_less_subst2: "(a::'a::order) < b ==> f b < (c::'c::order) ==> (!!x y. x < y ==> f x < f y) ==> f a < c" proof - assume r: "!!x y. x < y ==> f x < f y" assume "a < b" hence "f a < f b" by (rule r) also assume "f b < c" finally (less_trans) show ?thesis . 
qed lemma order_less_subst1: "(a::'a::order) < f b ==> (b::'b::order) < c ==> (!!x y. x < y ==> f x < f y) ==> a < f c" proof - assume r: "!!x y. x < y ==> f x < f y" assume "a < f b" also assume "b < c" hence "f b < f c" by (rule r) finally (less_trans) show ?thesis . qed lemma order_le_less_subst2: "(a::'a::order) <= b ==> f b < (c::'c::order) ==> (!!x y. x <= y ==> f x <= f y) ==> f a < c" proof - assume r: "!!x y. x <= y ==> f x <= f y" assume "a <= b" hence "f a <= f b" by (rule r) also assume "f b < c" finally (le_less_trans) show ?thesis . qed lemma order_le_less_subst1: "(a::'a::order) <= f b ==> (b::'b::order) < c ==> (!!x y. x < y ==> f x < f y) ==> a < f c" proof - assume r: "!!x y. x < y ==> f x < f y" assume "a <= f b" also assume "b < c" hence "f b < f c" by (rule r) finally (le_less_trans) show ?thesis . qed lemma order_less_le_subst2: "(a::'a::order) < b ==> f b <= (c::'c::order) ==> (!!x y. x < y ==> f x < f y) ==> f a < c" proof - assume r: "!!x y. x < y ==> f x < f y" assume "a < b" hence "f a < f b" by (rule r) also assume "f b <= c" finally (less_le_trans) show ?thesis . qed lemma order_less_le_subst1: "(a::'a::order) < f b ==> (b::'b::order) <= c ==> (!!x y. x <= y ==> f x <= f y) ==> a < f c" proof - assume r: "!!x y. x <= y ==> f x <= f y" assume "a < f b" also assume "b <= c" hence "f b <= f c" by (rule r) finally (less_le_trans) show ?thesis . qed lemma order_subst1: "(a::'a::order) <= f b ==> (b::'b::order) <= c ==> (!!x y. x <= y ==> f x <= f y) ==> a <= f c" proof - assume r: "!!x y. x <= y ==> f x <= f y" assume "a <= f b" also assume "b <= c" hence "f b <= f c" by (rule r) finally (order_trans) show ?thesis . qed lemma order_subst2: "(a::'a::order) <= b ==> f b <= (c::'c::order) ==> (!!x y. x <= y ==> f x <= f y) ==> f a <= c" proof - assume r: "!!x y. x <= y ==> f x <= f y" assume "a <= b" hence "f a <= f b" by (rule r) also assume "f b <= c" finally (order_trans) show ?thesis . qed lemma ord_le_eq_subst: "a <= b ==> f b = c ==> (!!x y. x <= y ==> f x <= f y) ==> f a <= c" proof - assume r: "!!x y. x <= y ==> f x <= f y" assume "a <= b" hence "f a <= f b" by (rule r) also assume "f b = c" finally (ord_le_eq_trans) show ?thesis . qed lemma ord_eq_le_subst: "a = f b ==> b <= c ==> (!!x y. x <= y ==> f x <= f y) ==> a <= f c" proof - assume r: "!!x y. x <= y ==> f x <= f y" assume "a = f b" also assume "b <= c" hence "f b <= f c" by (rule r) finally (ord_eq_le_trans) show ?thesis . qed lemma ord_less_eq_subst: "a < b ==> f b = c ==> (!!x y. x < y ==> f x < f y) ==> f a < c" proof - assume r: "!!x y. x < y ==> f x < f y" assume "a < b" hence "f a < f b" by (rule r) also assume "f b = c" finally (ord_less_eq_trans) show ?thesis . qed lemma ord_eq_less_subst: "a = f b ==> b < c ==> (!!x y. x < y ==> f x < f y) ==> a < f c" proof - assume r: "!!x y. x < y ==> f x < f y" assume "a = f b" also assume "b < c" hence "f b < f c" by (rule r) finally (ord_eq_less_trans) show ?thesis . qed text \ Note that this list of rules is in reverse order of priorities. 
\ lemmas [trans] = order_less_subst2 order_less_subst1 order_le_less_subst2 order_le_less_subst1 order_less_le_subst2 order_less_le_subst1 order_subst2 order_subst1 ord_le_eq_subst ord_eq_le_subst ord_less_eq_subst ord_eq_less_subst forw_subst back_subst rev_mp mp lemmas (in order) [trans] = neq_le_trans le_neq_trans lemmas (in preorder) [trans] = less_trans less_asym' le_less_trans less_le_trans order_trans lemmas (in order) [trans] = order.antisym lemmas (in ord) [trans] = ord_le_eq_trans ord_eq_le_trans ord_less_eq_trans ord_eq_less_trans lemmas [trans] = trans lemmas order_trans_rules = order_less_subst2 order_less_subst1 order_le_less_subst2 order_le_less_subst1 order_less_le_subst2 order_less_le_subst1 order_subst2 order_subst1 ord_le_eq_subst ord_eq_le_subst ord_less_eq_subst ord_eq_less_subst forw_subst back_subst rev_mp mp neq_le_trans le_neq_trans less_trans less_asym' le_less_trans less_le_trans order_trans order.antisym ord_le_eq_trans ord_eq_le_trans ord_less_eq_trans ord_eq_less_trans trans text \These support proving chains of decreasing inequalities a >= b >= c ... in Isar proofs.\ lemma xt1 [no_atp]: "a = b \ b > c \ a > c" "a > b \ b = c \ a > c" "a = b \ b \ c \ a \ c" "a \ b \ b = c \ a \ c" "(x::'a::order) \ y \ y \ x \ x = y" "(x::'a::order) \ y \ y \ z \ x \ z" "(x::'a::order) > y \ y \ z \ x > z" "(x::'a::order) \ y \ y > z \ x > z" "(a::'a::order) > b \ b > a \ P" "(x::'a::order) > y \ y > z \ x > z" "(a::'a::order) \ b \ a \ b \ a > b" "(a::'a::order) \ b \ a \ b \ a > b" "a = f b \ b > c \ (\x y. x > y \ f x > f y) \ a > f c" "a > b \ f b = c \ (\x y. x > y \ f x > f y) \ f a > c" "a = f b \ b \ c \ (\x y. x \ y \ f x \ f y) \ a \ f c" "a \ b \ f b = c \ (\x y. x \ y \ f x \ f y) \ f a \ c" by auto lemma xt2 [no_atp]: "(a::'a::order) >= f b ==> b >= c ==> (!!x y. x >= y ==> f x >= f y) ==> a >= f c" by (subgoal_tac "f b >= f c", force, force) lemma xt3 [no_atp]: "(a::'a::order) >= b ==> (f b::'b::order) >= c ==> (!!x y. x >= y ==> f x >= f y) ==> f a >= c" by (subgoal_tac "f a >= f b", force, force) lemma xt4 [no_atp]: "(a::'a::order) > f b ==> (b::'b::order) >= c ==> (!!x y. x >= y ==> f x >= f y) ==> a > f c" by (subgoal_tac "f b >= f c", force, force) lemma xt5 [no_atp]: "(a::'a::order) > b ==> (f b::'b::order) >= c==> (!!x y. x > y ==> f x > f y) ==> f a > c" by (subgoal_tac "f a > f b", force, force) lemma xt6 [no_atp]: "(a::'a::order) >= f b ==> b > c ==> (!!x y. x > y ==> f x > f y) ==> a > f c" by (subgoal_tac "f b > f c", force, force) lemma xt7 [no_atp]: "(a::'a::order) >= b ==> (f b::'b::order) > c ==> (!!x y. x >= y ==> f x >= f y) ==> f a > c" by (subgoal_tac "f a >= f b", force, force) lemma xt8 [no_atp]: "(a::'a::order) > f b ==> (b::'b::order) > c ==> (!!x y. x > y ==> f x > f y) ==> a > f c" by (subgoal_tac "f b > f c", force, force) lemma xt9 [no_atp]: "(a::'a::order) > b ==> (f b::'b::order) > c ==> (!!x y. x > y ==> f x > f y) ==> f a > c" by (subgoal_tac "f a > f b", force, force) lemmas xtrans = xt1 xt2 xt3 xt4 xt5 xt6 xt7 xt8 xt9 (* Since "a >= b" abbreviates "b <= a", the abbreviation "..." stands for the wrong thing in an Isar proof. The extra transitivity rules can be used as follows: lemma "(a::'a::order) > z" proof - have "a >= b" (is "_ >= ?rhs") sorry also have "?rhs >= c" (is "_ >= ?rhs") sorry also (xtrans) have "?rhs = d" (is "_ = ?rhs") sorry also (xtrans) have "?rhs >= e" (is "_ >= ?rhs") sorry also (xtrans) have "?rhs > f" (is "_ > ?rhs") sorry also (xtrans) have "?rhs > z" sorry finally (xtrans) show ?thesis . 
qed Alternatively, one can use "declare xtrans [trans]" and then leave out the "(xtrans)" above. *) subsection \Monotonicity\ context order begin definition mono :: "('a \ 'b::order) \ bool" where "mono f \ (\x y. x \ y \ f x \ f y)" lemma monoI [intro?]: fixes f :: "'a \ 'b::order" shows "(\x y. x \ y \ f x \ f y) \ mono f" unfolding mono_def by iprover lemma monoD [dest?]: fixes f :: "'a \ 'b::order" shows "mono f \ x \ y \ f x \ f y" unfolding mono_def by iprover lemma monoE: fixes f :: "'a \ 'b::order" assumes "mono f" assumes "x \ y" obtains "f x \ f y" proof from assms show "f x \ f y" by (simp add: mono_def) qed definition antimono :: "('a \ 'b::order) \ bool" where "antimono f \ (\x y. x \ y \ f x \ f y)" lemma antimonoI [intro?]: fixes f :: "'a \ 'b::order" shows "(\x y. x \ y \ f x \ f y) \ antimono f" unfolding antimono_def by iprover lemma antimonoD [dest?]: fixes f :: "'a \ 'b::order" shows "antimono f \ x \ y \ f x \ f y" unfolding antimono_def by iprover lemma antimonoE: fixes f :: "'a \ 'b::order" assumes "antimono f" assumes "x \ y" obtains "f x \ f y" proof from assms show "f x \ f y" by (simp add: antimono_def) qed definition strict_mono :: "('a \ 'b::order) \ bool" where "strict_mono f \ (\x y. x < y \ f x < f y)" lemma strict_monoI [intro?]: assumes "\x y. x < y \ f x < f y" shows "strict_mono f" using assms unfolding strict_mono_def by auto lemma strict_monoD [dest?]: "strict_mono f \ x < y \ f x < f y" unfolding strict_mono_def by auto lemma strict_mono_mono [dest?]: assumes "strict_mono f" shows "mono f" proof (rule monoI) fix x y assume "x \ y" show "f x \ f y" proof (cases "x = y") case True then show ?thesis by simp next case False with \x \ y\ have "x < y" by simp with assms strict_monoD have "f x < f y" by auto then show ?thesis by simp + qed qed end context linorder begin lemma mono_invE: fixes f :: "'a \ 'b::order" assumes "mono f" assumes "f x < f y" obtains "x \ y" proof show "x \ y" proof (rule ccontr) assume "\ x \ y" then have "y \ x" by simp with \mono f\ obtain "f y \ f x" by (rule monoE) with \f x < f y\ show False by simp qed qed lemma mono_strict_invE: fixes f :: "'a \ 'b::order" assumes "mono f" assumes "f x < f y" obtains "x < y" proof show "x < y" proof (rule ccontr) assume "\ x < y" then have "y \ x" by simp with \mono f\ obtain "f y \ f x" by (rule monoE) with \f x < f y\ show False by simp qed qed lemma strict_mono_eq: assumes "strict_mono f" shows "f x = f y \ x = y" proof assume "f x = f y" show "x = y" proof (cases x y rule: linorder_cases) case less with assms strict_monoD have "f x < f y" by auto with \f x = f y\ show ?thesis by simp next case equal then show ?thesis . 
next case greater with assms strict_monoD have "f y < f x" by auto with \f x = f y\ show ?thesis by simp qed qed simp lemma strict_mono_less_eq: assumes "strict_mono f" shows "f x \ f y \ x \ y" proof assume "x \ y" with assms strict_mono_mono monoD show "f x \ f y" by auto next assume "f x \ f y" show "x \ y" proof (rule ccontr) assume "\ x \ y" then have "y < x" by simp with assms strict_monoD have "f y < f x" by auto with \f x \ f y\ show False by simp qed qed lemma strict_mono_less: assumes "strict_mono f" shows "f x < f y \ x < y" using assms by (auto simp add: less_le Orderings.less_le strict_mono_eq strict_mono_less_eq) end subsection \min and max -- fundamental\ definition (in ord) min :: "'a \ 'a \ 'a" where "min a b = (if a \ b then a else b)" definition (in ord) max :: "'a \ 'a \ 'a" where "max a b = (if a \ b then b else a)" lemma min_absorb1: "x \ y \ min x y = x" by (simp add: min_def) lemma max_absorb2: "x \ y \ max x y = y" by (simp add: max_def) lemma min_absorb2: "(y::'a::order) \ x \ min x y = y" by (simp add:min_def) lemma max_absorb1: "(y::'a::order) \ x \ max x y = x" by (simp add: max_def) lemma max_min_same [simp]: fixes x y :: "'a :: linorder" shows "max x (min x y) = x" "max (min x y) x = x" "max (min x y) y = y" "max y (min x y) = y" by(auto simp add: max_def min_def) subsection \(Unique) top and bottom elements\ class bot = fixes bot :: 'a ("\") class order_bot = order + bot + assumes bot_least: "\ \ a" begin sublocale bot: ordering_top greater_eq greater bot by standard (fact bot_least) lemma le_bot: "a \ \ \ a = \" by (fact bot.extremum_uniqueI) lemma bot_unique: "a \ \ \ a = \" by (fact bot.extremum_unique) lemma not_less_bot: "\ a < \" by (fact bot.extremum_strict) lemma bot_less: "a \ \ \ \ < a" by (fact bot.not_eq_extremum) lemma max_bot[simp]: "max bot x = x" by(simp add: max_def bot_unique) lemma max_bot2[simp]: "max x bot = x" by(simp add: max_def bot_unique) lemma min_bot[simp]: "min bot x = bot" by(simp add: min_def bot_unique) lemma min_bot2[simp]: "min x bot = bot" by(simp add: min_def bot_unique) end class top = fixes top :: 'a ("\") class order_top = order + top + assumes top_greatest: "a \ \" begin sublocale top: ordering_top less_eq less top by standard (fact top_greatest) lemma top_le: "\ \ a \ a = \" by (fact top.extremum_uniqueI) lemma top_unique: "\ \ a \ a = \" by (fact top.extremum_unique) lemma not_top_less: "\ \ < a" by (fact top.extremum_strict) lemma less_top: "a \ \ \ a < \" by (fact top.not_eq_extremum) lemma max_top[simp]: "max top x = top" by(simp add: max_def top_unique) lemma max_top2[simp]: "max x top = top" by(simp add: max_def top_unique) lemma min_top[simp]: "min top x = x" by(simp add: min_def top_unique) lemma min_top2[simp]: "min x top = x" by(simp add: min_def top_unique) end subsection \Dense orders\ class dense_order = order + assumes dense: "x < y \ (\z. x < z \ z < y)" class dense_linorder = linorder + dense_order begin lemma dense_le: fixes y z :: 'a assumes "\x. x < y \ x \ z" shows "y \ z" proof (rule ccontr) assume "\ ?thesis" hence "z < y" by simp from dense[OF this] obtain x where "x < y" and "z < x" by safe moreover have "x \ z" using assms[OF \x < y\] . ultimately show False by auto qed lemma dense_le_bounded: fixes x y z :: 'a assumes "x < y" assumes *: "\w. 
\ x < w ; w < y \ \ w \ z" shows "y \ z" proof (rule dense_le) fix w assume "w < y" from dense[OF \x < y\] obtain u where "x < u" "u < y" by safe from linear[of u w] show "w \ z" proof (rule disjE) assume "u \ w" from less_le_trans[OF \x < u\ \u \ w\] \w < y\ show "w \ z" by (rule *) next assume "w \ u" from \w \ u\ *[OF \x < u\ \u < y\] show "w \ z" by (rule order_trans) qed qed lemma dense_ge: fixes y z :: 'a assumes "\x. z < x \ y \ x" shows "y \ z" proof (rule ccontr) assume "\ ?thesis" hence "z < y" by simp from dense[OF this] obtain x where "x < y" and "z < x" by safe moreover have "y \ x" using assms[OF \z < x\] . ultimately show False by auto qed lemma dense_ge_bounded: fixes x y z :: 'a assumes "z < x" assumes *: "\w. \ z < w ; w < x \ \ y \ w" shows "y \ z" proof (rule dense_ge) fix w assume "z < w" from dense[OF \z < x\] obtain u where "z < u" "u < x" by safe from linear[of u w] show "y \ w" proof (rule disjE) assume "w \ u" from \z < w\ le_less_trans[OF \w \ u\ \u < x\] show "y \ w" by (rule *) next assume "u \ w" from *[OF \z < u\ \u < x\] \u \ w\ show "y \ w" by (rule order_trans) qed qed end class no_top = order + assumes gt_ex: "\y. x < y" class no_bot = order + assumes lt_ex: "\y. y < x" class unbounded_dense_linorder = dense_linorder + no_top + no_bot subsection \Wellorders\ class wellorder = linorder + assumes less_induct [case_names less]: "(\x. (\y. y < x \ P y) \ P x) \ P a" begin lemma wellorder_Least_lemma: fixes k :: 'a assumes "P k" shows LeastI: "P (LEAST x. P x)" and Least_le: "(LEAST x. P x) \ k" proof - have "P (LEAST x. P x) \ (LEAST x. P x) \ k" using assms proof (induct k rule: less_induct) case (less x) then have "P x" by simp show ?case proof (rule classical) assume assm: "\ (P (LEAST a. P a) \ (LEAST a. P a) \ x)" have "\y. P y \ x \ y" proof (rule classical) fix y assume "P y" and "\ x \ y" with less have "P (LEAST a. P a)" and "(LEAST a. P a) \ y" by (auto simp add: not_le) with assm have "x < (LEAST a. P a)" and "(LEAST a. P a) \ y" by auto then show "x \ y" by auto qed with \P x\ have Least: "(LEAST a. P a) = x" by (rule Least_equality) with \P x\ show ?thesis by simp qed qed then show "P (LEAST x. P x)" and "(LEAST x. P x) \ k" by auto qed \ \The following 3 lemmas are due to Brian Huffman\ lemma LeastI_ex: "\x. P x \ P (Least P)" by (erule exE) (erule LeastI) lemma LeastI2: "P a \ (\x. P x \ Q x) \ Q (Least P)" by (blast intro: LeastI) lemma LeastI2_ex: "\a. P a \ (\x. P x \ Q x) \ Q (Least P)" by (blast intro: LeastI_ex) lemma LeastI2_wellorder: assumes "P a" and "\a. \ P a; \b. P b \ a \ b \ \ Q a" shows "Q (Least P)" proof (rule LeastI2_order) show "P (Least P)" using \P a\ by (rule LeastI) next fix y assume "P y" thus "Least P \ y" by (rule Least_le) next fix x assume "P x" "\y. P y \ x \ y" thus "Q x" by (rule assms(2)) qed lemma LeastI2_wellorder_ex: assumes "\x. P x" and "\a. \ P a; \b. P b \ a \ b \ \ Q a" shows "Q (Least P)" using assms by clarify (blast intro!: LeastI2_wellorder) lemma not_less_Least: "k < (LEAST x. P x) \ \ P k" apply (simp add: not_le [symmetric]) apply (erule contrapos_nn) apply (erule Least_le) done lemma exists_least_iff: "(\n. P n) \ (\n. P n \ (\m < n. \ P m))" (is "?lhs \ ?rhs") proof assume ?rhs thus ?lhs by blast next assume H: ?lhs then obtain n where n: "P n" by blast let ?x = "Least P" { fix m assume m: "m < ?x" from not_less_Least[OF m] have "\ P m" . 
} with LeastI_ex[OF H] show ?rhs by blast qed end subsection \Order on \<^typ>\bool\\ instantiation bool :: "{order_bot, order_top, linorder}" begin definition le_bool_def [simp]: "P \ Q \ P \ Q" definition [simp]: "(P::bool) < Q \ \ P \ Q" definition [simp]: "\ \ False" definition [simp]: "\ \ True" instance proof qed auto end lemma le_boolI: "(P \ Q) \ P \ Q" by simp lemma le_boolI': "P \ Q \ P \ Q" by simp lemma le_boolE: "P \ Q \ P \ (Q \ R) \ R" by simp lemma le_boolD: "P \ Q \ P \ Q" by simp lemma bot_boolE: "\ \ P" by simp lemma top_boolI: \ by simp lemma [code]: "False \ b \ True" "True \ b \ b" "False < b \ b" "True < b \ False" by simp_all subsection \Order on \<^typ>\_ \ _\\ instantiation "fun" :: (type, ord) ord begin definition le_fun_def: "f \ g \ (\x. f x \ g x)" definition "(f::'a \ 'b) < g \ f \ g \ \ (g \ f)" instance .. end instance "fun" :: (type, preorder) preorder proof qed (auto simp add: le_fun_def less_fun_def intro: order_trans order.antisym) instance "fun" :: (type, order) order proof qed (auto simp add: le_fun_def intro: order.antisym) instantiation "fun" :: (type, bot) bot begin definition "\ = (\x. \)" instance .. end instantiation "fun" :: (type, order_bot) order_bot begin lemma bot_apply [simp, code]: "\ x = \" by (simp add: bot_fun_def) instance proof qed (simp add: le_fun_def) end instantiation "fun" :: (type, top) top begin definition [no_atp]: "\ = (\x. \)" instance .. end instantiation "fun" :: (type, order_top) order_top begin lemma top_apply [simp, code]: "\ x = \" by (simp add: top_fun_def) instance proof qed (simp add: le_fun_def) end lemma le_funI: "(\x. f x \ g x) \ f \ g" unfolding le_fun_def by simp lemma le_funE: "f \ g \ (f x \ g x \ P) \ P" unfolding le_fun_def by simp lemma le_funD: "f \ g \ f x \ g x" by (rule le_funE) lemma mono_compose: "mono Q \ mono (\i x. Q i (f x))" unfolding mono_def le_fun_def by auto subsection \Order on unary and binary predicates\ lemma predicate1I: assumes PQ: "\x. P x \ Q x" shows "P \ Q" apply (rule le_funI) apply (rule le_boolI) apply (rule PQ) apply assumption done lemma predicate1D: "P \ Q \ P x \ Q x" apply (erule le_funE) apply (erule le_boolE) apply assumption+ done lemma rev_predicate1D: "P x \ P \ Q \ Q x" by (rule predicate1D) lemma predicate2I: assumes PQ: "\x y. 
P x y \ Q x y" shows "P \ Q" apply (rule le_funI)+ apply (rule le_boolI) apply (rule PQ) apply assumption done lemma predicate2D: "P \ Q \ P x y \ Q x y" apply (erule le_funE)+ apply (erule le_boolE) apply assumption+ done lemma rev_predicate2D: "P x y \ P \ Q \ Q x y" by (rule predicate2D) lemma bot1E [no_atp]: "\ x \ P" by (simp add: bot_fun_def) lemma bot2E: "\ x y \ P" by (simp add: bot_fun_def) lemma top1I: "\ x" by (simp add: top_fun_def) lemma top2I: "\ x y" by (simp add: top_fun_def) subsection \Name duplicates\ lemmas antisym = order.antisym lemmas eq_iff = order.eq_iff lemmas order_eq_refl = preorder_class.eq_refl lemmas order_less_irrefl = preorder_class.less_irrefl lemmas order_less_imp_le = preorder_class.less_imp_le lemmas order_less_not_sym = preorder_class.less_not_sym lemmas order_less_asym = preorder_class.less_asym lemmas order_less_trans = preorder_class.less_trans lemmas order_le_less_trans = preorder_class.le_less_trans lemmas order_less_le_trans = preorder_class.less_le_trans lemmas order_less_imp_not_less = preorder_class.less_imp_not_less lemmas order_less_imp_triv = preorder_class.less_imp_triv lemmas order_less_asym' = preorder_class.less_asym' lemmas order_less_le = order_class.less_le lemmas order_le_less = order_class.le_less lemmas order_le_imp_less_or_eq = order_class.le_imp_less_or_eq lemmas order_less_imp_not_eq = order_class.less_imp_not_eq lemmas order_less_imp_not_eq2 = order_class.less_imp_not_eq2 lemmas order_neq_le_trans = order_class.neq_le_trans lemmas order_le_neq_trans = order_class.le_neq_trans lemmas order_eq_iff = order_class.order.eq_iff lemmas order_antisym_conv = order_class.antisym_conv lemmas linorder_linear = linorder_class.linear lemmas linorder_less_linear = linorder_class.less_linear lemmas linorder_le_less_linear = linorder_class.le_less_linear lemmas linorder_le_cases = linorder_class.le_cases lemmas linorder_not_less = linorder_class.not_less lemmas linorder_not_le = linorder_class.not_le lemmas linorder_neq_iff = linorder_class.neq_iff lemmas linorder_neqE = linorder_class.neqE end diff --git a/src/Provers/order.ML b/src/Provers/order.ML deleted file mode 100644 --- a/src/Provers/order.ML +++ /dev/null @@ -1,1264 +0,0 @@ -(* Title: Provers/order.ML - Author: Oliver Kutter, TU Muenchen - -Transitivity reasoner for partial and linear orders. -*) - -(* TODO: reduce number of input thms *) - -(* - -The package provides tactics partial_tac and linear_tac that use all -premises of the form - - t = u, t ~= u, t < u, t <= u, ~(t < u) and ~(t <= u) - -to -1. either derive a contradiction, - in which case the conclusion can be any term, -2. or prove the conclusion, which must be of the same form as the - premises (excluding ~(t < u) and ~(t <= u) for partial orders) - -The package is not limited to the relation <= and friends. It can be -instantiated to any partial and/or linear order --- for example, the -divisibility relation "dvd". In order to instantiate the package for -a partial order only, supply dummy theorems to the rules for linear -orders, and don't use linear_tac! 
- -*) - -signature ORDER_TAC = -sig - (* Theorems required by the reasoner *) - type less_arith - val empty : thm -> less_arith - val update : string -> thm -> less_arith -> less_arith - - (* Tactics *) - val partial_tac: - (theory -> term -> (term * string * term) option) -> less_arith -> - Proof.context -> thm list -> int -> tactic - val linear_tac: - (theory -> term -> (term * string * term) option) -> less_arith -> - Proof.context -> thm list -> int -> tactic -end; - -structure Order_Tac: ORDER_TAC = -struct - -(* Record to handle input theorems in a convenient way. *) - -type less_arith = - { - (* Theorems for partial orders *) - less_reflE: thm, (* x < x ==> P *) - le_refl: thm, (* x <= x *) - less_imp_le: thm, (* x < y ==> x <= y *) - eqI: thm, (* [| x <= y; y <= x |] ==> x = y *) - eqD1: thm, (* x = y ==> x <= y *) - eqD2: thm, (* x = y ==> y <= x *) - less_trans: thm, (* [| x < y; y < z |] ==> x < z *) - less_le_trans: thm, (* [| x < y; y <= z |] ==> x < z *) - le_less_trans: thm, (* [| x <= y; y < z |] ==> x < z *) - le_trans: thm, (* [| x <= y; y <= z |] ==> x <= z *) - le_neq_trans : thm, (* [| x <= y ; x ~= y |] ==> x < y *) - neq_le_trans : thm, (* [| x ~= y ; x <= y |] ==> x < y *) - not_sym : thm, (* x ~= y ==> y ~= x *) - - (* Additional theorems for linear orders *) - not_lessD: thm, (* ~(x < y) ==> y <= x *) - not_leD: thm, (* ~(x <= y) ==> y < x *) - not_lessI: thm, (* y <= x ==> ~(x < y) *) - not_leI: thm, (* y < x ==> ~(x <= y) *) - - (* Additional theorems for subgoals of form x ~= y *) - less_imp_neq : thm, (* x < y ==> x ~= y *) - eq_neq_eq_imp_neq : thm (* [| x = u ; u ~= v ; v = z|] ==> x ~= z *) - } - -fun empty dummy_thm = - {less_reflE= dummy_thm, le_refl= dummy_thm, less_imp_le= dummy_thm, eqI= dummy_thm, - eqD1= dummy_thm, eqD2= dummy_thm, - less_trans= dummy_thm, less_le_trans= dummy_thm, le_less_trans= dummy_thm, - le_trans= dummy_thm, le_neq_trans = dummy_thm, neq_le_trans = dummy_thm, - not_sym = dummy_thm, - not_lessD= dummy_thm, not_leD= dummy_thm, not_lessI= dummy_thm, not_leI= dummy_thm, - less_imp_neq = dummy_thm, eq_neq_eq_imp_neq = dummy_thm} - -fun change thms f = - let - val {less_reflE, le_refl, less_imp_le, eqI, eqD1, eqD2, less_trans, - less_le_trans, le_less_trans, le_trans, le_neq_trans, neq_le_trans, - not_sym, not_lessD, not_leD, not_lessI, not_leI, less_imp_neq, - eq_neq_eq_imp_neq} = thms; - val (less_reflE', le_refl', less_imp_le', eqI', eqD1', eqD2', less_trans', - less_le_trans', le_less_trans', le_trans', le_neq_trans', neq_le_trans', - not_sym', not_lessD', not_leD', not_lessI', not_leI', less_imp_neq', - eq_neq_eq_imp_neq') = - f (less_reflE, le_refl, less_imp_le, eqI, eqD1, eqD2, less_trans, - less_le_trans, le_less_trans, le_trans, le_neq_trans, neq_le_trans, - not_sym, not_lessD, not_leD, not_lessI, not_leI, less_imp_neq, - eq_neq_eq_imp_neq) - in {less_reflE = less_reflE', le_refl= le_refl', - less_imp_le = less_imp_le', eqI = eqI', eqD1 = eqD1', eqD2 = eqD2', - less_trans = less_trans', less_le_trans = less_le_trans', - le_less_trans = le_less_trans', le_trans = le_trans', le_neq_trans = le_neq_trans', - neq_le_trans = neq_le_trans', not_sym = not_sym', - not_lessD = not_lessD', not_leD = not_leD', not_lessI = not_lessI', - not_leI = not_leI', - less_imp_neq = less_imp_neq', eq_neq_eq_imp_neq = eq_neq_eq_imp_neq'} - end; - -fun update "less_reflE" thm thms = - change thms (fn (less_reflE, le_refl, less_imp_le, eqI, eqD1, eqD2, less_trans, - less_le_trans, le_less_trans, le_trans, le_neq_trans, neq_le_trans, - not_sym, 
not_lessD, not_leD, not_lessI, not_leI, less_imp_neq, - eq_neq_eq_imp_neq) => - (thm, le_refl, less_imp_le, eqI, eqD1, eqD2, less_trans, - less_le_trans, le_less_trans, le_trans, le_neq_trans, neq_le_trans, - not_sym, not_lessD, not_leD, not_lessI, not_leI, less_imp_neq, eq_neq_eq_imp_neq)) - | update "le_refl" thm thms = - change thms (fn (less_reflE, le_refl, less_imp_le, eqI, eqD1, eqD2, less_trans, - less_le_trans, le_less_trans, le_trans, le_neq_trans, neq_le_trans, - not_sym, not_lessD, not_leD, not_lessI, not_leI, less_imp_neq, - eq_neq_eq_imp_neq) => - (less_reflE, thm, less_imp_le, eqI, eqD1, eqD2, less_trans, - less_le_trans, le_less_trans, le_trans, le_neq_trans, neq_le_trans, - not_sym, not_lessD, not_leD, not_lessI, not_leI, less_imp_neq, eq_neq_eq_imp_neq)) - | update "less_imp_le" thm thms = - change thms (fn (less_reflE, le_refl, less_imp_le, eqI, eqD1, eqD2, less_trans, - less_le_trans, le_less_trans, le_trans, le_neq_trans, neq_le_trans, - not_sym, not_lessD, not_leD, not_lessI, not_leI, less_imp_neq, - eq_neq_eq_imp_neq) => - (less_reflE, le_refl, thm, eqI, eqD1, eqD2, less_trans, - less_le_trans, le_less_trans, le_trans, le_neq_trans, neq_le_trans, - not_sym, not_lessD, not_leD, not_lessI, not_leI, less_imp_neq, eq_neq_eq_imp_neq)) - | update "eqI" thm thms = - change thms (fn (less_reflE, le_refl, less_imp_le, eqI, eqD1, eqD2, less_trans, - less_le_trans, le_less_trans, le_trans, le_neq_trans, neq_le_trans, - not_sym, not_lessD, not_leD, not_lessI, not_leI, less_imp_neq, - eq_neq_eq_imp_neq) => - (less_reflE, le_refl, less_imp_le, thm, eqD1, eqD2, less_trans, - less_le_trans, le_less_trans, le_trans, le_neq_trans, neq_le_trans, - not_sym, not_lessD, not_leD, not_lessI, not_leI, less_imp_neq, eq_neq_eq_imp_neq)) - | update "eqD1" thm thms = - change thms (fn (less_reflE, le_refl, less_imp_le, eqI, eqD1, eqD2, less_trans, - less_le_trans, le_less_trans, le_trans, le_neq_trans, neq_le_trans, - not_sym, not_lessD, not_leD, not_lessI, not_leI, less_imp_neq, - eq_neq_eq_imp_neq) => - (less_reflE, le_refl, less_imp_le, eqI, thm, eqD2, less_trans, - less_le_trans, le_less_trans, le_trans, le_neq_trans, neq_le_trans, - not_sym, not_lessD, not_leD, not_lessI, not_leI, less_imp_neq, eq_neq_eq_imp_neq)) - | update "eqD2" thm thms = - change thms (fn (less_reflE, le_refl, less_imp_le, eqI, eqD1, eqD2, less_trans, - less_le_trans, le_less_trans, le_trans, le_neq_trans, neq_le_trans, - not_sym, not_lessD, not_leD, not_lessI, not_leI, less_imp_neq, - eq_neq_eq_imp_neq) => - (less_reflE, le_refl, less_imp_le, eqI, eqD1, thm, less_trans, - less_le_trans, le_less_trans, le_trans, le_neq_trans, neq_le_trans, - not_sym, not_lessD, not_leD, not_lessI, not_leI, less_imp_neq, eq_neq_eq_imp_neq)) - | update "less_trans" thm thms = - change thms (fn (less_reflE, le_refl, less_imp_le, eqI, eqD1, eqD2, less_trans, - less_le_trans, le_less_trans, le_trans, le_neq_trans, neq_le_trans, - not_sym, not_lessD, not_leD, not_lessI, not_leI, less_imp_neq, - eq_neq_eq_imp_neq) => - (less_reflE, le_refl, less_imp_le, eqI, eqD1, eqD2, thm, - less_le_trans, le_less_trans, le_trans, le_neq_trans, neq_le_trans, - not_sym, not_lessD, not_leD, not_lessI, not_leI, less_imp_neq, eq_neq_eq_imp_neq)) - | update "less_le_trans" thm thms = - change thms (fn (less_reflE, le_refl, less_imp_le, eqI, eqD1, eqD2, less_trans, - less_le_trans, le_less_trans, le_trans, le_neq_trans, neq_le_trans, - not_sym, not_lessD, not_leD, not_lessI, not_leI, less_imp_neq, - eq_neq_eq_imp_neq) => - (less_reflE, le_refl, less_imp_le, eqI, 
eqD1, eqD2, less_trans, - thm, le_less_trans, le_trans, le_neq_trans, neq_le_trans, - not_sym, not_lessD, not_leD, not_lessI, not_leI, less_imp_neq, eq_neq_eq_imp_neq)) - | update "le_less_trans" thm thms = - change thms (fn (less_reflE, le_refl, less_imp_le, eqI, eqD1, eqD2, less_trans, - less_le_trans, le_less_trans, le_trans, le_neq_trans, neq_le_trans, - not_sym, not_lessD, not_leD, not_lessI, not_leI, less_imp_neq, - eq_neq_eq_imp_neq) => - (less_reflE, le_refl, less_imp_le, eqI, eqD1, eqD2, less_trans, - less_le_trans, thm, le_trans, le_neq_trans, neq_le_trans, - not_sym, not_lessD, not_leD, not_lessI, not_leI, less_imp_neq, eq_neq_eq_imp_neq)) - | update "le_trans" thm thms = - change thms (fn (less_reflE, le_refl, less_imp_le, eqI, eqD1, eqD2, less_trans, - less_le_trans, le_less_trans, le_trans, le_neq_trans, neq_le_trans, - not_sym, not_lessD, not_leD, not_lessI, not_leI, less_imp_neq, - eq_neq_eq_imp_neq) => - (less_reflE, le_refl, less_imp_le, eqI, eqD1, eqD2, less_trans, - less_le_trans, le_less_trans, thm, le_neq_trans, neq_le_trans, - not_sym, not_lessD, not_leD, not_lessI, not_leI, less_imp_neq, eq_neq_eq_imp_neq)) - | update "le_neq_trans" thm thms = - change thms (fn (less_reflE, le_refl, less_imp_le, eqI, eqD1, eqD2, less_trans, - less_le_trans, le_less_trans, le_trans, le_neq_trans, neq_le_trans, - not_sym, not_lessD, not_leD, not_lessI, not_leI, less_imp_neq, - eq_neq_eq_imp_neq) => - (less_reflE, le_refl, less_imp_le, eqI, eqD1, eqD2, less_trans, - less_le_trans, le_less_trans, le_trans, thm, neq_le_trans, - not_sym, not_lessD, not_leD, not_lessI, not_leI, less_imp_neq, eq_neq_eq_imp_neq)) - | update "neq_le_trans" thm thms = - change thms (fn (less_reflE, le_refl, less_imp_le, eqI, eqD1, eqD2, less_trans, - less_le_trans, le_less_trans, le_trans, le_neq_trans, neq_le_trans, - not_sym, not_lessD, not_leD, not_lessI, not_leI, less_imp_neq, - eq_neq_eq_imp_neq) => - (less_reflE, le_refl, less_imp_le, eqI, eqD1, eqD2, less_trans, - less_le_trans, le_less_trans, le_trans, le_neq_trans, thm, - not_sym, not_lessD, not_leD, not_lessI, not_leI, less_imp_neq, eq_neq_eq_imp_neq)) - | update "not_sym" thm thms = - change thms (fn (less_reflE, le_refl, less_imp_le, eqI, eqD1, eqD2, less_trans, - less_le_trans, le_less_trans, le_trans, le_neq_trans, neq_le_trans, - not_sym, not_lessD, not_leD, not_lessI, not_leI, less_imp_neq, - eq_neq_eq_imp_neq) => - (less_reflE, le_refl, less_imp_le, eqI, eqD1, eqD2, less_trans, - less_le_trans, le_less_trans, le_trans, le_neq_trans, neq_le_trans, - thm, not_lessD, not_leD, not_lessI, not_leI, less_imp_neq, eq_neq_eq_imp_neq)) - | update "not_lessD" thm thms = - change thms (fn (less_reflE, le_refl, less_imp_le, eqI, eqD1, eqD2, less_trans, - less_le_trans, le_less_trans, le_trans, le_neq_trans, neq_le_trans, - not_sym, not_lessD, not_leD, not_lessI, not_leI, less_imp_neq, - eq_neq_eq_imp_neq) => - (less_reflE, le_refl, less_imp_le, eqI, eqD1, eqD2, less_trans, - less_le_trans, le_less_trans, le_trans, le_neq_trans, neq_le_trans, - not_sym, thm, not_leD, not_lessI, not_leI, less_imp_neq, eq_neq_eq_imp_neq)) - | update "not_leD" thm thms = - change thms (fn (less_reflE, le_refl, less_imp_le, eqI, eqD1, eqD2, less_trans, - less_le_trans, le_less_trans, le_trans, le_neq_trans, neq_le_trans, - not_sym, not_lessD, not_leD, not_lessI, not_leI, less_imp_neq, - eq_neq_eq_imp_neq) => - (less_reflE, le_refl, less_imp_le, eqI, eqD1, eqD2, less_trans, - less_le_trans, le_less_trans, le_trans, le_neq_trans, neq_le_trans, - not_sym, not_lessD, thm, 
not_lessI, not_leI, less_imp_neq, eq_neq_eq_imp_neq)) - | update "not_lessI" thm thms = - change thms (fn (less_reflE, le_refl, less_imp_le, eqI, eqD1, eqD2, less_trans, - less_le_trans, le_less_trans, le_trans, le_neq_trans, neq_le_trans, - not_sym, not_lessD, not_leD, not_lessI, not_leI, less_imp_neq, - eq_neq_eq_imp_neq) => - (less_reflE, le_refl, less_imp_le, eqI, eqD1, eqD2, less_trans, - less_le_trans, le_less_trans, le_trans, le_neq_trans, neq_le_trans, - not_sym, not_lessD, not_leD, thm, not_leI, less_imp_neq, eq_neq_eq_imp_neq)) - | update "not_leI" thm thms = - change thms (fn (less_reflE, le_refl, less_imp_le, eqI, eqD1, eqD2, less_trans, - less_le_trans, le_less_trans, le_trans, le_neq_trans, neq_le_trans, - not_sym, not_lessD, not_leD, not_lessI, not_leI, less_imp_neq, - eq_neq_eq_imp_neq) => - (less_reflE, le_refl, less_imp_le, eqI, eqD1, eqD2, less_trans, - less_le_trans, le_less_trans, le_trans, le_neq_trans, neq_le_trans, - not_sym, not_lessD, not_leD, not_lessI, thm, less_imp_neq, eq_neq_eq_imp_neq)) - | update "less_imp_neq" thm thms = - change thms (fn (less_reflE, le_refl, less_imp_le, eqI, eqD1, eqD2, less_trans, - less_le_trans, le_less_trans, le_trans, le_neq_trans, neq_le_trans, - not_sym, not_lessD, not_leD, not_lessI, not_leI, less_imp_neq, - eq_neq_eq_imp_neq) => - (less_reflE, le_refl, less_imp_le, eqI, eqD1, eqD2, less_trans, - less_le_trans, le_less_trans, le_trans, le_neq_trans, neq_le_trans, - not_sym, not_lessD, not_leD, not_lessI, not_leI, thm, eq_neq_eq_imp_neq)) - | update "eq_neq_eq_imp_neq" thm thms = - change thms (fn (less_reflE, le_refl, less_imp_le, eqI, eqD1, eqD2, less_trans, - less_le_trans, le_less_trans, le_trans, le_neq_trans, neq_le_trans, - not_sym, not_lessD, not_leD, not_lessI, not_leI, less_imp_neq, - eq_neq_eq_imp_neq) => - (less_reflE, le_refl, less_imp_le, eqI, eqD1, eqD2, less_trans, - less_le_trans, le_less_trans, le_trans, le_neq_trans, neq_le_trans, - not_sym, not_lessD, not_leD, not_lessI, not_leI, less_imp_neq, thm)); - -(* Internal datatype for the proof *) -datatype proof - = Asm of int - | Thm of proof list * thm; - -exception Cannot; - (* Internal exception, raised if conclusion cannot be derived from - assumptions. *) -exception Contr of proof; - (* Internal exception, raised if contradiction ( x < x ) was derived *) - -fun prove asms = - let - fun pr (Asm i) = nth asms i - | pr (Thm (prfs, thm)) = map pr prfs MRS thm; - in pr end; - -(* Internal datatype for inequalities *) -datatype less - = Less of term * term * proof - | Le of term * term * proof - | NotEq of term * term * proof; - -(* Misc functions for datatype less *) -fun lower (Less (x, _, _)) = x - | lower (Le (x, _, _)) = x - | lower (NotEq (x,_,_)) = x; - -fun upper (Less (_, y, _)) = y - | upper (Le (_, y, _)) = y - | upper (NotEq (_,y,_)) = y; - -fun getprf (Less (_, _, p)) = p -| getprf (Le (_, _, p)) = p -| getprf (NotEq (_,_, p)) = p; - - -(* ************************************************************************ *) -(* *) -(* mkasm_partial *) -(* *) -(* Tuple (t, n) (t an assumption, n its index in the assumptions) is *) -(* translated to an element of type less. *) -(* Partial orders only. 
*) -(* *) -(* ************************************************************************ *) - -fun mkasm_partial decomp (less_thms : less_arith) sign (n, t) = - case decomp sign t of - SOME (x, rel, y) => (case rel of - "<" => if (x aconv y) then raise Contr (Thm ([Asm n], #less_reflE less_thms)) - else [Less (x, y, Asm n)] - | "~<" => [] - | "<=" => [Le (x, y, Asm n)] - | "~<=" => [] - | "=" => [Le (x, y, Thm ([Asm n], #eqD1 less_thms)), - Le (y, x, Thm ([Asm n], #eqD2 less_thms))] - | "~=" => if (x aconv y) then - raise Contr (Thm ([(Thm ([(Thm ([], #le_refl less_thms)) ,(Asm n)], #le_neq_trans less_thms))], #less_reflE less_thms)) - else [ NotEq (x, y, Asm n), - NotEq (y, x,Thm ( [Asm n], #not_sym less_thms))] - | _ => error ("partial_tac: unknown relation symbol ``" ^ rel ^ - "''returned by decomp.")) - | NONE => []; - -(* ************************************************************************ *) -(* *) -(* mkasm_linear *) -(* *) -(* Tuple (t, n) (t an assumption, n its index in the assumptions) is *) -(* translated to an element of type less. *) -(* Linear orders only. *) -(* *) -(* ************************************************************************ *) - -fun mkasm_linear decomp (less_thms : less_arith) sign (n, t) = - case decomp sign t of - SOME (x, rel, y) => (case rel of - "<" => if (x aconv y) then raise Contr (Thm ([Asm n], #less_reflE less_thms)) - else [Less (x, y, Asm n)] - | "~<" => [Le (y, x, Thm ([Asm n], #not_lessD less_thms))] - | "<=" => [Le (x, y, Asm n)] - | "~<=" => if (x aconv y) then - raise (Contr (Thm ([Thm ([Asm n], #not_leD less_thms)], #less_reflE less_thms))) - else [Less (y, x, Thm ([Asm n], #not_leD less_thms))] - | "=" => [Le (x, y, Thm ([Asm n], #eqD1 less_thms)), - Le (y, x, Thm ([Asm n], #eqD2 less_thms))] - | "~=" => if (x aconv y) then - raise Contr (Thm ([(Thm ([(Thm ([], #le_refl less_thms)) ,(Asm n)], #le_neq_trans less_thms))], #less_reflE less_thms)) - else [ NotEq (x, y, Asm n), - NotEq (y, x,Thm ( [Asm n], #not_sym less_thms))] - | _ => error ("linear_tac: unknown relation symbol ``" ^ rel ^ - "''returned by decomp.")) - | NONE => []; - -(* ************************************************************************ *) -(* *) -(* mkconcl_partial *) -(* *) -(* Translates conclusion t to an element of type less. *) -(* Partial orders only. *) -(* *) -(* ************************************************************************ *) - -fun mkconcl_partial decomp (less_thms : less_arith) sign t = - case decomp sign t of - SOME (x, rel, y) => (case rel of - "<" => ([Less (x, y, Asm ~1)], Asm 0) - | "<=" => ([Le (x, y, Asm ~1)], Asm 0) - | "=" => ([Le (x, y, Asm ~1), Le (y, x, Asm ~1)], - Thm ([Asm 0, Asm 1], #eqI less_thms)) - | "~=" => ([NotEq (x,y, Asm ~1)], Asm 0) - | _ => raise Cannot) - | NONE => raise Cannot; - -(* ************************************************************************ *) -(* *) -(* mkconcl_linear *) -(* *) -(* Translates conclusion t to an element of type less. *) -(* Linear orders only. 
*) -(* *) -(* ************************************************************************ *) - -fun mkconcl_linear decomp (less_thms : less_arith) sign t = - case decomp sign t of - SOME (x, rel, y) => (case rel of - "<" => ([Less (x, y, Asm ~1)], Asm 0) - | "~<" => ([Le (y, x, Asm ~1)], Thm ([Asm 0], #not_lessI less_thms)) - | "<=" => ([Le (x, y, Asm ~1)], Asm 0) - | "~<=" => ([Less (y, x, Asm ~1)], Thm ([Asm 0], #not_leI less_thms)) - | "=" => ([Le (x, y, Asm ~1), Le (y, x, Asm ~1)], - Thm ([Asm 0, Asm 1], #eqI less_thms)) - | "~=" => ([NotEq (x,y, Asm ~1)], Asm 0) - | _ => raise Cannot) - | NONE => raise Cannot; - - -(*** The common part for partial and linear orders ***) - -(* Analysis of premises and conclusion: *) -(* decomp (`x Rel y') should yield (x, Rel, y) - where Rel is one of "<", "<=", "~<", "~<=", "=" and "~=", - other relation symbols cause an error message *) - -fun gen_order_tac mkasm mkconcl decomp' (less_thms : less_arith) ctxt prems = - -let - -fun decomp sign t = Option.map (fn (x, rel, y) => - (Envir.beta_eta_contract x, rel, Envir.beta_eta_contract y)) (decomp' sign t); - -(* ******************************************************************* *) -(* *) -(* mergeLess *) -(* *) -(* Merge two elements of type less according to the following rules *) -(* *) -(* x < y && y < z ==> x < z *) -(* x < y && y <= z ==> x < z *) -(* x <= y && y < z ==> x < z *) -(* x <= y && y <= z ==> x <= z *) -(* x <= y && x ~= y ==> x < y *) -(* x ~= y && x <= y ==> x < y *) -(* x < y && x ~= y ==> x < y *) -(* x ~= y && x < y ==> x < y *) -(* *) -(* ******************************************************************* *) - -fun mergeLess (Less (x, _, p) , Less (_ , z, q)) = - Less (x, z, Thm ([p,q] , #less_trans less_thms)) -| mergeLess (Less (x, _, p) , Le (_ , z, q)) = - Less (x, z, Thm ([p,q] , #less_le_trans less_thms)) -| mergeLess (Le (x, _, p) , Less (_ , z, q)) = - Less (x, z, Thm ([p,q] , #le_less_trans less_thms)) -| mergeLess (Le (x, z, p) , NotEq (x', z', q)) = - if (x aconv x' andalso z aconv z' ) - then Less (x, z, Thm ([p,q] , #le_neq_trans less_thms)) - else error "linear/partial_tac: internal error le_neq_trans" -| mergeLess (NotEq (x, z, p) , Le (x' , z', q)) = - if (x aconv x' andalso z aconv z') - then Less (x, z, Thm ([p,q] , #neq_le_trans less_thms)) - else error "linear/partial_tac: internal error neq_le_trans" -| mergeLess (NotEq (x, z, p) , Less (x' , z', q)) = - if (x aconv x' andalso z aconv z') - then Less ((x' , z', q)) - else error "linear/partial_tac: internal error neq_less_trans" -| mergeLess (Less (x, z, p) , NotEq (x', z', q)) = - if (x aconv x' andalso z aconv z') - then Less (x, z, p) - else error "linear/partial_tac: internal error less_neq_trans" -| mergeLess (Le (x, _, p) , Le (_ , z, q)) = - Le (x, z, Thm ([p,q] , #le_trans less_thms)) -| mergeLess (_, _) = - error "linear/partial_tac: internal error: undefined case"; - - -(* ******************************************************************** *) -(* tr checks for valid transitivity step *) -(* ******************************************************************** *) - -infix tr; -fun (Less (_, y, _)) tr (Le (x', _, _)) = ( y aconv x' ) - | (Le (_, y, _)) tr (Less (x', _, _)) = ( y aconv x' ) - | (Less (_, y, _)) tr (Less (x', _, _)) = ( y aconv x' ) - | (Le (_, y, _)) tr (Le (x', _, _)) = ( y aconv x' ) - | _ tr _ = false; - - -(* ******************************************************************* *) -(* *) -(* transPath (Lesslist, Less): (less list * less) -> less *) -(* *) -(* If a path represented by a 
list of elements of type less is found, *) -(* this needs to be contracted to a single element of type less. *) -(* Prior to each transitivity step it is checked whether the step is *) -(* valid. *) -(* *) -(* ******************************************************************* *) - -fun transPath ([],lesss) = lesss -| transPath (x::xs,lesss) = - if lesss tr x then transPath (xs, mergeLess(lesss,x)) - else error "linear/partial_tac: internal error transpath"; - -(* ******************************************************************* *) -(* *) -(* less1 subsumes less2 : less -> less -> bool *) -(* *) -(* subsumes checks whether less1 implies less2 *) -(* *) -(* ******************************************************************* *) - -infix subsumes; -fun (Less (x, y, _)) subsumes (Le (x', y', _)) = - (x aconv x' andalso y aconv y') - | (Less (x, y, _)) subsumes (Less (x', y', _)) = - (x aconv x' andalso y aconv y') - | (Le (x, y, _)) subsumes (Le (x', y', _)) = - (x aconv x' andalso y aconv y') - | (Less (x, y, _)) subsumes (NotEq (x', y', _)) = - (x aconv x' andalso y aconv y') orelse (y aconv x' andalso x aconv y') - | (NotEq (x, y, _)) subsumes (NotEq (x', y', _)) = - (x aconv x' andalso y aconv y') orelse (y aconv x' andalso x aconv y') - | (Le _) subsumes (Less _) = - error "linear/partial_tac: internal error: Le cannot subsume Less" - | _ subsumes _ = false; - -(* ******************************************************************* *) -(* *) -(* triv_solv less1 : less -> proof option *) -(* *) -(* Solves trivial goal x <= x. *) -(* *) -(* ******************************************************************* *) - -fun triv_solv (Le (x, x', _)) = - if x aconv x' then SOME (Thm ([], #le_refl less_thms)) - else NONE -| triv_solv _ = NONE; - -(* ********************************************************************* *) -(* Graph functions *) -(* ********************************************************************* *) - - - -(* ******************************************************************* *) -(* *) -(* General: *) -(* *) -(* Inequalities are represented by various types of graphs. *) -(* *) -(* 1. (Term.term * (Term.term * less) list) list *) -(* - Graph of this type is generated from the assumptions, *) -(* it does contain information on which edge stems from which *) -(* assumption. *) -(* - Used to compute strongly connected components *) -(* - Used to compute component subgraphs *) -(* - Used for component subgraphs to reconstruct paths in components*) -(* *) -(* 2. (int * (int * less) list ) list *) -(* - Graph of this type is generated from the strong components of *) -(* graph of type 1. It consists of the strong components of *) -(* graph 1, where nodes are indices of the components. *) -(* Only edges between components are part of this graph. *) -(* - Used to reconstruct paths between several components. 
*) -(* *) -(* ******************************************************************* *) - - -(* *********************************************************** *) -(* Functions for constructing graphs *) -(* *********************************************************** *) - -fun addEdge (v,d,[]) = [(v,d)] -| addEdge (v,d,((u,dl)::el)) = if v aconv u then ((v,d@dl)::el) - else (u,dl):: (addEdge(v,d,el)); - -(* ********************************************************************* *) -(* *) -(* mkGraphs constructs from a list of objects of type less a graph g, *) -(* by taking all edges that are candidate for a <=, and a list neqE, by *) -(* taking all edges that are candiate for a ~= *) -(* *) -(* ********************************************************************* *) - -fun mkGraphs [] = ([],[],[]) -| mkGraphs lessList = - let - -fun buildGraphs ([],leqG,neqG,neqE) = (leqG, neqG, neqE) -| buildGraphs (l::ls, leqG,neqG, neqE) = case l of - (Less (x,y,p)) => - buildGraphs (ls, addEdge (x,[(y,(Less (x, y, p)))],leqG) , - addEdge (x,[(y,(Less (x, y, p)))],neqG), l::neqE) -| (Le (x,y,p)) => - buildGraphs (ls, addEdge (x,[(y,(Le (x, y,p)))],leqG) , neqG, neqE) -| (NotEq (x,y,p)) => - buildGraphs (ls, leqG , addEdge (x,[(y,(NotEq (x, y, p)))],neqG), l::neqE) ; - - in buildGraphs (lessList, [], [], []) end; - - -(* *********************************************************************** *) -(* *) -(* adjacent g u : (''a * 'b list ) list -> ''a -> 'b list *) -(* *) -(* List of successors of u in graph g *) -(* *) -(* *********************************************************************** *) - -fun adjacent eq_comp ((v,adj)::el) u = - if eq_comp (u, v) then adj else adjacent eq_comp el u -| adjacent _ [] _ = [] - - -(* *********************************************************************** *) -(* *) -(* transpose g: *) -(* (''a * ''a list) list -> (''a * ''a list) list *) -(* *) -(* Computes transposed graph g' from g *) -(* by reversing all edges u -> v to v -> u *) -(* *) -(* *********************************************************************** *) - -fun transpose eq_comp g = - let - (* Compute list of reversed edges for each adjacency list *) - fun flip (u,(v,l)::el) = (v,(u,l)) :: flip (u,el) - | flip (_,[]) = [] - - (* Compute adjacency list for node u from the list of edges - and return a likewise reduced list of edges. The list of edges - is searches for edges starting from u, and these edges are removed. *) - fun gather (u,(v,w)::el) = - let - val (adj,edges) = gather (u,el) - in - if eq_comp (u, v) then (w::adj,edges) - else (adj,(v,w)::edges) - end - | gather (_,[]) = ([],[]) - - (* For every node in the input graph, call gather to find all reachable - nodes in the list of edges *) - fun assemble ((u,_)::el) edges = - let val (adj,edges) = gather (u,edges) - in (u,adj) :: assemble el edges - end - | assemble [] _ = [] - - (* Compute, for each adjacency list, the list with reversed edges, - and concatenate these lists. *) - val flipped = maps flip g - - in assemble g flipped end - -(* *********************************************************************** *) -(* *) -(* scc_term : (term * term list) list -> term list list *) -(* *) -(* The following is based on the algorithm for finding strongly connected *) -(* components described in Introduction to Algorithms, by Cormon, Leiserson*) -(* and Rivest, section 23.5. The input G is an adjacency list description *) -(* of a directed graph. The output is a list of the strongly connected *) -(* components (each a list of vertices). 
*) -(* *) -(* *) -(* *********************************************************************** *) - -fun scc_term G = - let - (* Ordered list of the vertices that DFS has finished with; - most recently finished goes at the head. *) - val finish : term list Unsynchronized.ref = Unsynchronized.ref [] - - (* List of vertices which have been visited. *) - val visited : term list Unsynchronized.ref = Unsynchronized.ref [] - - fun been_visited v = exists (fn w => w aconv v) (!visited) - - (* Given the adjacency list rep of a graph (a list of pairs), - return just the first element of each pair, yielding the - vertex list. *) - val members = map (fn (v,_) => v) - - (* Returns the nodes in the DFS tree rooted at u in g *) - fun dfs_visit g u : term list = - let - val _ = visited := u :: !visited - val descendents = - List.foldr (fn ((v,l),ds) => if been_visited v then ds - else v :: dfs_visit g v @ ds) - [] (adjacent (op aconv) g u) - in - finish := u :: !finish; - descendents - end - in - - (* DFS on the graph; apply dfs_visit to each vertex in - the graph, checking first to make sure the vertex is - as yet unvisited. *) - List.app (fn u => if been_visited u then () - else (dfs_visit G u; ())) (members G); - visited := []; - - (* We don't reset finish because its value is used by - foldl below, and it will never be used again (even - though dfs_visit will continue to modify it). *) - - (* DFS on the transpose. The vertices returned by - dfs_visit along with u form a connected component. We - collect all the connected components together in a - list, which is what is returned. *) - fold (fn u => fn comps => - if been_visited u then comps - else (u :: dfs_visit (transpose (op aconv) G) u) :: comps) (!finish) [] -end; - - -(* *********************************************************************** *) -(* *) -(* dfs_int_reachable g u: *) -(* (int * int list) list -> int -> int list *) -(* *) -(* Computes list of all nodes reachable from u in g. *) -(* *) -(* *********************************************************************** *) - -fun dfs_int_reachable g u = - let - (* List of vertices which have been visited. *) - val visited : int list Unsynchronized.ref = Unsynchronized.ref [] - - fun been_visited v = exists (fn w => w = v) (!visited) - - fun dfs_visit g u : int list = - let - val _ = visited := u :: !visited - val descendents = - List.foldr (fn ((v,l),ds) => if been_visited v then ds - else v :: dfs_visit g v @ ds) - [] (adjacent (op =) g u) - in descendents end - - in u :: dfs_visit g u end; - - -fun indexNodes IndexComp = - maps (fn (index, comp) => (map (fn v => (v,index)) comp)) IndexComp; - -fun getIndex v [] = ~1 -| getIndex v ((v',k)::vs) = if v aconv v' then k else getIndex v vs; - - - -(* *********************************************************************** *) -(* *) -(* dfs eq_comp g u v: *) -(* ('a * 'a -> bool) -> ('a *( 'a * less) list) list -> *) -(* 'a -> 'a -> (bool * ('a * less) list) *) -(* *) -(* Depth first search of v from u. *) -(* Returns (true, path(u, v)) if successful, otherwise (false, []). 
*) -(* *) -(* *********************************************************************** *) - -fun dfs eq_comp g u v = - let - val pred = Unsynchronized.ref []; - val visited = Unsynchronized.ref []; - - fun been_visited v = exists (fn w => eq_comp (w, v)) (!visited) - - fun dfs_visit u' = - let val _ = visited := u' :: (!visited) - - fun update (x,l) = let val _ = pred := (x,l) ::(!pred) in () end; - - in if been_visited v then () - else (List.app (fn (v',l) => if been_visited v' then () else ( - update (v',l); - dfs_visit v'; ()) )) (adjacent eq_comp g u') - end - in - dfs_visit u; - if (been_visited v) then (true, (!pred)) else (false , []) - end; - - -(* *********************************************************************** *) -(* *) -(* completeTermPath u v g: *) -(* Term.term -> Term.term -> (Term.term * (Term.term * less) list) list *) -(* -> less list *) -(* *) -(* Complete the path from u to v in graph g. Path search is performed *) -(* with dfs_term g u v. This yields for each node v' its predecessor u' *) -(* and the edge u' -> v'. Allows traversing graph backwards from v and *) -(* finding the path u -> v. *) -(* *) -(* *********************************************************************** *) - - -fun completeTermPath u v g = - let - val (found, tmp) = dfs (op aconv) g u v ; - val pred = map snd tmp; - - fun path x y = - let - - (* find predecessor u of node v and the edge u -> v *) - - fun lookup v [] = raise Cannot - | lookup v (e::es) = if (upper e) aconv v then e else lookup v es; - - (* traverse path backwards and return list of visited edges *) - fun rev_path v = - let val l = lookup v pred - val u = lower l; - in - if u aconv x then [l] - else (rev_path u) @ [l] - end - in rev_path y end; - - in - if found then (if u aconv v then [(Le (u, v, (Thm ([], #le_refl less_thms))))] - else path u v ) else raise Cannot -end; - - -(* *********************************************************************** *) -(* *) -(* findProof (sccGraph, neqE, ntc, sccSubgraphs) subgoal: *) -(* *) -(* (int * (int * less) list) list * less list * (Term.term * int) list *) -(* * ((term * (term * less) list) list) list -> Less -> proof *) -(* *) -(* findProof constructs from graphs (sccGraph, sccSubgraphs) and neqE a *) -(* proof for subgoal. Raises exception Cannot if this is not possible. 
*) -(* *) -(* *********************************************************************** *) - -fun findProof (sccGraph, neqE, ntc, sccSubgraphs) subgoal = -let - - (* complete path x y from component graph *) - fun completeComponentPath x y predlist = - let - val xi = getIndex x ntc - val yi = getIndex y ntc - - fun lookup k [] = raise Cannot - | lookup k ((h: int,l)::us) = if k = h then l else lookup k us; - - fun rev_completeComponentPath y' = - let val edge = lookup (getIndex y' ntc) predlist - val u = lower edge - val v = upper edge - in - if (getIndex u ntc) = xi then - completeTermPath x u (nth sccSubgraphs xi) @ [edge] @ - completeTermPath v y' (nth sccSubgraphs (getIndex y' ntc)) - else - rev_completeComponentPath u @ [edge] @ - completeTermPath v y' (nth sccSubgraphs (getIndex y' ntc)) - end - in - if x aconv y then - [(Le (x, y, (Thm ([], #le_refl less_thms))))] - else if xi = yi then completeTermPath x y (nth sccSubgraphs xi) - else rev_completeComponentPath y - end; - -(* ******************************************************************* *) -(* findLess e x y xi yi xreachable yreachable *) -(* *) -(* Find a path from x through e to y, of weight < *) -(* ******************************************************************* *) - - fun findLess e x y xi yi xreachable yreachable = - let val u = lower e - val v = upper e - val ui = getIndex u ntc - val vi = getIndex v ntc - - in - if member (op =) xreachable ui andalso member (op =) xreachable vi andalso - member (op =) yreachable ui andalso member (op =) yreachable vi then ( - - (case e of (Less (_, _, _)) => - let - val (xufound, xupred) = dfs (op =) sccGraph xi (getIndex u ntc) - in - if xufound then ( - let - val (vyfound, vypred) = dfs (op =) sccGraph (getIndex v ntc) yi - in - if vyfound then ( - let - val xypath = (completeComponentPath x u xupred)@[e]@(completeComponentPath v y vypred) - val xyLesss = transPath (tl xypath, hd xypath) - in - if xyLesss subsumes subgoal then SOME (getprf xyLesss) - else NONE - end) - else NONE - end) - else NONE - end - | _ => - let val (uvfound, uvpred) = dfs (op =) sccGraph (getIndex u ntc) (getIndex v ntc) - in - if uvfound then ( - let - val (xufound, xupred) = dfs (op =) sccGraph xi (getIndex u ntc) - in - if xufound then ( - let - val (vyfound, vypred) = dfs (op =) sccGraph (getIndex v ntc) yi - in - if vyfound then ( - let - val uvpath = completeComponentPath u v uvpred - val uvLesss = mergeLess ( transPath (tl uvpath, hd uvpath), e) - val xypath = (completeComponentPath x u xupred)@[uvLesss]@(completeComponentPath v y vypred) - val xyLesss = transPath (tl xypath, hd xypath) - in - if xyLesss subsumes subgoal then SOME (getprf xyLesss) - else NONE - end ) - else NONE - end) - else NONE - end ) - else NONE - end ) - ) else NONE -end; - - -in - (* looking for x <= y: any path from x to y is sufficient *) - case subgoal of (Le (x, y, _)) => ( - if null sccGraph then raise Cannot else ( - let - val xi = getIndex x ntc - val yi = getIndex y ntc - (* searches in sccGraph a path from xi to yi *) - val (found, pred) = dfs (op =) sccGraph xi yi - in - if found then ( - let val xypath = completeComponentPath x y pred - val xyLesss = transPath (tl xypath, hd xypath) - in - (case xyLesss of - (Less (_, _, q)) => if xyLesss subsumes subgoal then (Thm ([q], #less_imp_le less_thms)) - else raise Cannot - | _ => if xyLesss subsumes subgoal then (getprf xyLesss) - else raise Cannot) - end ) - else raise Cannot - end - ) - ) - (* looking for x < y: particular path required, which is not necessarily - found 
by normal dfs *) - | (Less (x, y, _)) => ( - if null sccGraph then raise Cannot else ( - let - val xi = getIndex x ntc - val yi = getIndex y ntc - val sccGraph_transpose = transpose (op =) sccGraph - (* all components that can be reached from component xi *) - val xreachable = dfs_int_reachable sccGraph xi - (* all comonents reachable from y in the transposed graph sccGraph' *) - val yreachable = dfs_int_reachable sccGraph_transpose yi - (* for all edges u ~= v or u < v check if they are part of path x < y *) - fun processNeqEdges [] = raise Cannot - | processNeqEdges (e::es) = - case (findLess e x y xi yi xreachable yreachable) of (SOME prf) => prf - | _ => processNeqEdges es - - in - processNeqEdges neqE - end - ) - ) -| (NotEq (x, y, _)) => ( - (* if there aren't any edges that are candidate for a ~= raise Cannot *) - if null neqE then raise Cannot - (* if there aren't any edges that are candidate for <= then just search a edge in neqE that implies the subgoal *) - else if null sccSubgraphs then ( - (case (find_first (fn fact => fact subsumes subgoal) neqE, subgoal) of - ( SOME (NotEq (x, y, p)), NotEq (x', y', _)) => - if (x aconv x' andalso y aconv y') then p - else Thm ([p], #not_sym less_thms) - | ( SOME (Less (x, y, p)), NotEq (x', y', _)) => - if x aconv x' andalso y aconv y' then (Thm ([p], #less_imp_neq less_thms)) - else (Thm ([(Thm ([p], #less_imp_neq less_thms))], #not_sym less_thms)) - | _ => raise Cannot) - ) else ( - - let val xi = getIndex x ntc - val yi = getIndex y ntc - val sccGraph_transpose = transpose (op =) sccGraph - val xreachable = dfs_int_reachable sccGraph xi - val yreachable = dfs_int_reachable sccGraph_transpose yi - - fun processNeqEdges [] = raise Cannot - | processNeqEdges (e::es) = ( - let val u = lower e - val v = upper e - val ui = getIndex u ntc - val vi = getIndex v ntc - - in - (* if x ~= y follows from edge e *) - if e subsumes subgoal then ( - case e of (Less (u, v, q)) => ( - if u aconv x andalso v aconv y then (Thm ([q], #less_imp_neq less_thms)) - else (Thm ([(Thm ([q], #less_imp_neq less_thms))], #not_sym less_thms)) - ) - | (NotEq (u,v, q)) => ( - if u aconv x andalso v aconv y then q - else (Thm ([q], #not_sym less_thms)) - ) - ) - (* if SCC_x is linked to SCC_y via edge e *) - else if ui = xi andalso vi = yi then ( - case e of (Less (_, _,_)) => ( - let - val xypath = - completeTermPath x u (nth sccSubgraphs ui) @ [e] @ - completeTermPath v y (nth sccSubgraphs vi) - val xyLesss = transPath (tl xypath, hd xypath) - in (Thm ([getprf xyLesss], #less_imp_neq less_thms)) end) - | _ => ( - let - val xupath = completeTermPath x u (nth sccSubgraphs ui) - val uxpath = completeTermPath u x (nth sccSubgraphs ui) - val vypath = completeTermPath v y (nth sccSubgraphs vi) - val yvpath = completeTermPath y v (nth sccSubgraphs vi) - val xuLesss = transPath (tl xupath, hd xupath) - val uxLesss = transPath (tl uxpath, hd uxpath) - val vyLesss = transPath (tl vypath, hd vypath) - val yvLesss = transPath (tl yvpath, hd yvpath) - val x_eq_u = (Thm ([(getprf xuLesss),(getprf uxLesss)], #eqI less_thms)) - val v_eq_y = (Thm ([(getprf vyLesss),(getprf yvLesss)], #eqI less_thms)) - in - (Thm ([x_eq_u , (getprf e), v_eq_y ], #eq_neq_eq_imp_neq less_thms)) - end - ) - ) else if ui = yi andalso vi = xi then ( - case e of (Less (_, _,_)) => ( - let - val xypath = - completeTermPath y u (nth sccSubgraphs ui) @ [e] @ - completeTermPath v x (nth sccSubgraphs vi) - val xyLesss = transPath (tl xypath, hd xypath) - in (Thm ([(Thm ([getprf xyLesss], #less_imp_neq less_thms))] 
, #not_sym less_thms)) end ) - | _ => ( - - let val yupath = completeTermPath y u (nth sccSubgraphs ui) - val uypath = completeTermPath u y (nth sccSubgraphs ui) - val vxpath = completeTermPath v x (nth sccSubgraphs vi) - val xvpath = completeTermPath x v (nth sccSubgraphs vi) - val yuLesss = transPath (tl yupath, hd yupath) - val uyLesss = transPath (tl uypath, hd uypath) - val vxLesss = transPath (tl vxpath, hd vxpath) - val xvLesss = transPath (tl xvpath, hd xvpath) - val y_eq_u = (Thm ([(getprf yuLesss),(getprf uyLesss)], #eqI less_thms)) - val v_eq_x = (Thm ([(getprf vxLesss),(getprf xvLesss)], #eqI less_thms)) - in - (Thm ([(Thm ([y_eq_u , (getprf e), v_eq_x ], #eq_neq_eq_imp_neq less_thms))], #not_sym less_thms)) - end - ) - ) else ( - (* there exists a path x < y or y < x such that - x ~= y may be concluded *) - case (findLess e x y xi yi xreachable yreachable) of - (SOME prf) => (Thm ([prf], #less_imp_neq less_thms)) - | NONE => ( - let - val yr = dfs_int_reachable sccGraph yi - val xr = dfs_int_reachable sccGraph_transpose xi - in - case (findLess e y x yi xi yr xr) of - (SOME prf) => (Thm ([(Thm ([prf], #less_imp_neq less_thms))], #not_sym less_thms)) - | _ => processNeqEdges es - end) - ) end) - in processNeqEdges neqE end) - ) -end; - - -(* ******************************************************************* *) -(* *) -(* mk_sccGraphs components leqG neqG ntc : *) -(* Term.term list list -> *) -(* (Term.term * (Term.term * less) list) list -> *) -(* (Term.term * (Term.term * less) list) list -> *) -(* (Term.term * int) list -> *) -(* (int * (int * less) list) list * *) -(* ((Term.term * (Term.term * less) list) list) list *) -(* *) -(* *) -(* Computes, from graph leqG, list of all its components and the list *) -(* ntc (nodes, index of component) a graph whose nodes are the *) -(* indices of the components of g. Egdes of the new graph are *) -(* only the edges of g linking two components. Also computes for each *) -(* component the subgraph of leqG that forms this component. *) -(* *) -(* For each component SCC_i is checked if there exists a edge in neqG *) -(* that leads to a contradiction. *) -(* *) -(* We have a contradiction for edge u ~= v and u < v if: *) -(* - u and v are in the same component, *) -(* that is, a path u <= v and a path v <= u exist, hence u = v. *) -(* From irreflexivity of < follows u < u or v < v. Ex false quodlibet. 
*) -(* *) -(* ******************************************************************* *) - -fun mk_sccGraphs _ [] _ _ = ([],[]) -| mk_sccGraphs components leqG neqG ntc = - let - (* Liste (Index der Komponente, Komponente *) - val IndexComp = map_index I components; - - - fun handleContr edge g = - (case edge of - (Less (x, y, _)) => ( - let - val xxpath = edge :: (completeTermPath y x g ) - val xxLesss = transPath (tl xxpath, hd xxpath) - val q = getprf xxLesss - in - raise (Contr (Thm ([q], #less_reflE less_thms ))) - end - ) - | (NotEq (x, y, _)) => ( - let - val xypath = (completeTermPath x y g ) - val yxpath = (completeTermPath y x g ) - val xyLesss = transPath (tl xypath, hd xypath) - val yxLesss = transPath (tl yxpath, hd yxpath) - val q = getprf (mergeLess ((mergeLess (edge, xyLesss)),yxLesss )) - in - raise (Contr (Thm ([q], #less_reflE less_thms ))) - end - ) - | _ => error "trans_tac/handleContr: invalid Contradiction"); - - - (* k is index of the actual component *) - - fun processComponent (k, comp) = - let - (* all edges with weight <= of the actual component *) - val leqEdges = maps (adjacent (op aconv) leqG) comp; - (* all edges with weight ~= of the actual component *) - val neqEdges = map snd (maps (adjacent (op aconv) neqG) comp); - - (* find an edge leading to a contradiction *) - fun findContr [] = NONE - | findContr (e::es) = - let val ui = (getIndex (lower e) ntc) - val vi = (getIndex (upper e) ntc) - in - if ui = vi then SOME e - else findContr es - end; - - (* sort edges into component internal edges and - edges pointing away from the component *) - fun sortEdges [] (intern,extern) = (intern,extern) - | sortEdges ((v,l)::es) (intern, extern) = - let val k' = getIndex v ntc in - if k' = k then - sortEdges es (l::intern, extern) - else sortEdges es (intern, (k',l)::extern) end; - - (* Insert edge into sorted list of edges, where edge is - only added if - - it is found for the first time - - it is a <= edge and no parallel < edge was found earlier - - it is a < edge - *) - fun insert (h: int,l) [] = [(h,l)] - | insert (h,l) ((k',l')::es) = if h = k' then ( - case l of (Less (_, _, _)) => (h,l)::es - | _ => (case l' of (Less (_, _, _)) => (h,l')::es - | _ => (k',l)::es) ) - else (k',l'):: insert (h,l) es; - - (* Reorganise list of edges such that - - duplicate edges are removed - - if a < edge and a <= edge exist at the same time, - remove <= edge *) - fun reOrganizeEdges [] sorted = sorted: (int * less) list - | reOrganizeEdges (e::es) sorted = reOrganizeEdges es (insert e sorted); - - (* construct the subgraph forming the strongly connected component - from the edge list *) - fun sccSubGraph [] g = g - | sccSubGraph (l::ls) g = - sccSubGraph ls (addEdge ((lower l),[((upper l),l)],g)) - - val (intern, extern) = sortEdges leqEdges ([], []); - val subGraph = sccSubGraph intern []; - - in - case findContr neqEdges of SOME e => handleContr e subGraph - | _ => ((k, (reOrganizeEdges (extern) [])), subGraph) - end; - - val tmp = map processComponent IndexComp -in - ( (map fst tmp), (map snd tmp)) -end; - - -(** Find proof if possible. 
**) - -fun gen_solve mkconcl sign (asms, concl) = - let - val (leqG, neqG, neqE) = mkGraphs asms - val components = scc_term leqG - val ntc = indexNodes (map_index I components) - val (sccGraph, sccSubgraphs) = mk_sccGraphs components leqG neqG ntc - in - let - val (subgoals, prf) = mkconcl decomp less_thms sign concl - fun solve facts less = - (case triv_solv less of NONE => findProof (sccGraph, neqE, ntc, sccSubgraphs) less - | SOME prf => prf ) - in - map (solve asms) subgoals - end - end; - -in - SUBGOAL (fn (A, n) => fn st => - let - val thy = Proof_Context.theory_of ctxt; - val rfrees = map Free (Term.rename_wrt_term A (Logic.strip_params A)); - val Hs = - map Thm.prop_of prems @ - map (fn H => subst_bounds (rfrees, H)) (Logic.strip_assums_hyp A) - val C = subst_bounds (rfrees, Logic.strip_assums_concl A) - val lesss = flat (map_index (mkasm decomp less_thms thy) Hs) - val prfs = gen_solve mkconcl thy (lesss, C); - val (subgoals, prf) = mkconcl decomp less_thms thy C; - in - Subgoal.FOCUS (fn {context = ctxt', prems = asms, ...} => - let val thms = map (prove (prems @ asms)) prfs - in resolve_tac ctxt' [prove thms prf] 1 end) ctxt n st - end - handle Contr p => - (Subgoal.FOCUS (fn {context = ctxt', prems = asms, ...} => - resolve_tac ctxt' [prove asms p] 1) ctxt n st - handle General.Subscript => Seq.empty) - | Cannot => Seq.empty - | General.Subscript => Seq.empty) -end; - -(* partial_tac - solves partial orders *) -val partial_tac = gen_order_tac mkasm_partial mkconcl_partial; - -(* linear_tac - solves linear/total orders *) -val linear_tac = gen_order_tac mkasm_linear mkconcl_linear; - -end; diff --git a/src/Provers/order_procedure.ML b/src/Provers/order_procedure.ML new file mode 100644 --- /dev/null +++ b/src/Provers/order_procedure.ML @@ -0,0 +1,604 @@ +structure Order_Procedure : sig + datatype int = Int_of_integer of IntInf.int + val integer_of_int : int -> IntInf.int + datatype 'a fm = Atom of 'a | And of 'a fm * 'a fm | Or of 'a fm * 'a fm | + Neg of 'a fm + datatype trm = Const of string | App of trm * trm | Var of int + datatype prf_trm = PThm of string | Appt of prf_trm * trm | + AppP of prf_trm * prf_trm | AbsP of trm * prf_trm | Bound of trm | + Conv of trm * prf_trm * prf_trm + datatype o_atom = EQ of int * int | LEQ of int * int | LESS of int * int + val lo_contr_prf : (bool * o_atom) fm -> prf_trm option + val po_contr_prf : (bool * o_atom) fm -> prf_trm option +end = struct + +datatype int = Int_of_integer of IntInf.int; + +fun integer_of_int (Int_of_integer k) = k; + +fun equal_inta k l = (((integer_of_int k) : IntInf.int) = (integer_of_int l)); + +type 'a equal = {equal : 'a -> 'a -> bool}; +val equal = #equal : 'a equal -> 'a -> 'a -> bool; + +val equal_int = {equal = equal_inta} : int equal; + +fun less_eq_int k l = IntInf.<= (integer_of_int k, integer_of_int l); + +type 'a ord = {less_eq : 'a -> 'a -> bool, less : 'a -> 'a -> bool}; +val less_eq = #less_eq : 'a ord -> 'a -> 'a -> bool; +val less = #less : 'a ord -> 'a -> 'a -> bool; + +fun less_int k l = IntInf.< (integer_of_int k, integer_of_int l); + +val ord_int = {less_eq = less_eq_int, less = less_int} : int ord; + +type 'a preorder = {ord_preorder : 'a ord}; +val ord_preorder = #ord_preorder : 'a preorder -> 'a ord; + +type 'a order = {preorder_order : 'a preorder}; +val preorder_order = #preorder_order : 'a order -> 'a preorder; + +val preorder_int = {ord_preorder = ord_int} : int preorder; + +val order_int = {preorder_order = preorder_int} : int order; + +type 'a linorder = {order_linorder : 'a 
order}; +val order_linorder = #order_linorder : 'a linorder -> 'a order; + +val linorder_int = {order_linorder = order_int} : int linorder; + +fun eq A_ a b = equal A_ a b; + +fun equal_proda A_ B_ (x1, x2) (y1, y2) = eq A_ x1 y1 andalso eq B_ x2 y2; + +fun equal_prod A_ B_ = {equal = equal_proda A_ B_} : ('a * 'b) equal; + +fun less_eq_prod A_ B_ (x1, y1) (x2, y2) = + less A_ x1 x2 orelse less_eq A_ x1 x2 andalso less_eq B_ y1 y2; + +fun less_prod A_ B_ (x1, y1) (x2, y2) = + less A_ x1 x2 orelse less_eq A_ x1 x2 andalso less B_ y1 y2; + +fun ord_prod A_ B_ = {less_eq = less_eq_prod A_ B_, less = less_prod A_ B_} : + ('a * 'b) ord; + +fun preorder_prod A_ B_ = + {ord_preorder = ord_prod (ord_preorder A_) (ord_preorder B_)} : + ('a * 'b) preorder; + +fun order_prod A_ B_ = + {preorder_order = preorder_prod (preorder_order A_) (preorder_order B_)} : + ('a * 'b) order; + +fun linorder_prod A_ B_ = + {order_linorder = order_prod (order_linorder A_) (order_linorder B_)} : + ('a * 'b) linorder; + +datatype nat = Zero_nat | Suc of nat; + +datatype color = R | B; + +datatype ('a, 'b) rbta = Empty | + Branch of color * ('a, 'b) rbta * 'a * 'b * ('a, 'b) rbta; + +datatype ('b, 'a) rbt = RBT of ('b, 'a) rbta; + +datatype 'a set = Set of 'a list | Coset of 'a list; + +datatype 'a fm = Atom of 'a | And of 'a fm * 'a fm | Or of 'a fm * 'a fm | + Neg of 'a fm; + +datatype trm = Const of string | App of trm * trm | Var of int; + +datatype prf_trm = PThm of string | Appt of prf_trm * trm | + AppP of prf_trm * prf_trm | AbsP of trm * prf_trm | Bound of trm | + Conv of trm * prf_trm * prf_trm; + +datatype ('a, 'b) mapping = Mapping of ('a, 'b) rbt; + +datatype o_atom = EQ of int * int | LEQ of int * int | LESS of int * int; + +fun id x = (fn xa => xa) x; + +fun impl_of B_ (RBT x) = x; + +fun folda f (Branch (c, lt, k, v, rt)) x = folda f rt (f k v (folda f lt x)) + | folda f Empty x = x; + +fun fold A_ x xc = folda x (impl_of A_ xc); + +fun gen_keys kts (Branch (c, l, k, v, r)) = gen_keys ((k, r) :: kts) l + | gen_keys ((k, t) :: kts) Empty = k :: gen_keys kts t + | gen_keys [] Empty = []; + +fun keysb x = gen_keys [] x; + +fun keys A_ x = keysb (impl_of A_ x); + +fun maps f [] = [] + | maps f (x :: xs) = f x @ maps f xs; + +fun empty A_ = RBT Empty; + +fun foldl f a [] = a + | foldl f a (x :: xs) = foldl f (f a x) xs; + +fun foldr f [] = id + | foldr f (x :: xs) = f x o foldr f xs; + +fun balance (Branch (R, a, w, x, b)) s t (Branch (R, c, y, z, d)) = + Branch (R, Branch (B, a, w, x, b), s, t, Branch (B, c, y, z, d)) + | balance (Branch (R, Branch (R, a, w, x, b), s, t, c)) y z Empty = + Branch (R, Branch (B, a, w, x, b), s, t, Branch (B, c, y, z, Empty)) + | balance (Branch (R, Branch (R, a, w, x, b), s, t, c)) y z + (Branch (B, va, vb, vc, vd)) = + Branch + (R, Branch (B, a, w, x, b), s, t, + Branch (B, c, y, z, Branch (B, va, vb, vc, vd))) + | balance (Branch (R, Empty, w, x, Branch (R, b, s, t, c))) y z Empty = + Branch (R, Branch (B, Empty, w, x, b), s, t, Branch (B, c, y, z, Empty)) + | balance + (Branch (R, Branch (B, va, vb, vc, vd), w, x, Branch (R, b, s, t, c))) y z + Empty = + Branch + (R, Branch (B, Branch (B, va, vb, vc, vd), w, x, b), s, t, + Branch (B, c, y, z, Empty)) + | balance (Branch (R, Empty, w, x, Branch (R, b, s, t, c))) y z + (Branch (B, va, vb, vc, vd)) = + Branch + (R, Branch (B, Empty, w, x, b), s, t, + Branch (B, c, y, z, Branch (B, va, vb, vc, vd))) + | balance + (Branch (R, Branch (B, ve, vf, vg, vh), w, x, Branch (R, b, s, t, c))) y z + (Branch (B, va, vb, vc, vd)) = + Branch 
+ (R, Branch (B, Branch (B, ve, vf, vg, vh), w, x, b), s, t, + Branch (B, c, y, z, Branch (B, va, vb, vc, vd))) + | balance Empty w x (Branch (R, b, s, t, Branch (R, c, y, z, d))) = + Branch (R, Branch (B, Empty, w, x, b), s, t, Branch (B, c, y, z, d)) + | balance (Branch (B, va, vb, vc, vd)) w x + (Branch (R, b, s, t, Branch (R, c, y, z, d))) = + Branch + (R, Branch (B, Branch (B, va, vb, vc, vd), w, x, b), s, t, + Branch (B, c, y, z, d)) + | balance Empty w x (Branch (R, Branch (R, b, s, t, c), y, z, Empty)) = + Branch (R, Branch (B, Empty, w, x, b), s, t, Branch (B, c, y, z, Empty)) + | balance Empty w x + (Branch (R, Branch (R, b, s, t, c), y, z, Branch (B, va, vb, vc, vd))) = + Branch + (R, Branch (B, Empty, w, x, b), s, t, + Branch (B, c, y, z, Branch (B, va, vb, vc, vd))) + | balance (Branch (B, va, vb, vc, vd)) w x + (Branch (R, Branch (R, b, s, t, c), y, z, Empty)) = + Branch + (R, Branch (B, Branch (B, va, vb, vc, vd), w, x, b), s, t, + Branch (B, c, y, z, Empty)) + | balance (Branch (B, va, vb, vc, vd)) w x + (Branch (R, Branch (R, b, s, t, c), y, z, Branch (B, ve, vf, vg, vh))) = + Branch + (R, Branch (B, Branch (B, va, vb, vc, vd), w, x, b), s, t, + Branch (B, c, y, z, Branch (B, ve, vf, vg, vh))) + | balance Empty s t Empty = Branch (B, Empty, s, t, Empty) + | balance Empty s t (Branch (B, va, vb, vc, vd)) = + Branch (B, Empty, s, t, Branch (B, va, vb, vc, vd)) + | balance Empty s t (Branch (v, Empty, vb, vc, Empty)) = + Branch (B, Empty, s, t, Branch (v, Empty, vb, vc, Empty)) + | balance Empty s t (Branch (v, Branch (B, ve, vf, vg, vh), vb, vc, Empty)) = + Branch + (B, Empty, s, t, Branch (v, Branch (B, ve, vf, vg, vh), vb, vc, Empty)) + | balance Empty s t (Branch (v, Empty, vb, vc, Branch (B, vf, vg, vh, vi))) = + Branch + (B, Empty, s, t, Branch (v, Empty, vb, vc, Branch (B, vf, vg, vh, vi))) + | balance Empty s t + (Branch (v, Branch (B, ve, vj, vk, vl), vb, vc, Branch (B, vf, vg, vh, vi))) + = Branch + (B, Empty, s, t, + Branch + (v, Branch (B, ve, vj, vk, vl), vb, vc, Branch (B, vf, vg, vh, vi))) + | balance (Branch (B, va, vb, vc, vd)) s t Empty = + Branch (B, Branch (B, va, vb, vc, vd), s, t, Empty) + | balance (Branch (B, va, vb, vc, vd)) s t (Branch (B, ve, vf, vg, vh)) = + Branch (B, Branch (B, va, vb, vc, vd), s, t, Branch (B, ve, vf, vg, vh)) + | balance (Branch (B, va, vb, vc, vd)) s t (Branch (v, Empty, vf, vg, Empty)) + = Branch + (B, Branch (B, va, vb, vc, vd), s, t, Branch (v, Empty, vf, vg, Empty)) + | balance (Branch (B, va, vb, vc, vd)) s t + (Branch (v, Branch (B, vi, vj, vk, vl), vf, vg, Empty)) = + Branch + (B, Branch (B, va, vb, vc, vd), s, t, + Branch (v, Branch (B, vi, vj, vk, vl), vf, vg, Empty)) + | balance (Branch (B, va, vb, vc, vd)) s t + (Branch (v, Empty, vf, vg, Branch (B, vj, vk, vl, vm))) = + Branch + (B, Branch (B, va, vb, vc, vd), s, t, + Branch (v, Empty, vf, vg, Branch (B, vj, vk, vl, vm))) + | balance (Branch (B, va, vb, vc, vd)) s t + (Branch (v, Branch (B, vi, vn, vo, vp), vf, vg, Branch (B, vj, vk, vl, vm))) + = Branch + (B, Branch (B, va, vb, vc, vd), s, t, + Branch + (v, Branch (B, vi, vn, vo, vp), vf, vg, Branch (B, vj, vk, vl, vm))) + | balance (Branch (v, Empty, vb, vc, Empty)) s t Empty = + Branch (B, Branch (v, Empty, vb, vc, Empty), s, t, Empty) + | balance (Branch (v, Empty, vb, vc, Branch (B, ve, vf, vg, vh))) s t Empty = + Branch + (B, Branch (v, Empty, vb, vc, Branch (B, ve, vf, vg, vh)), s, t, Empty) + | balance (Branch (v, Branch (B, vf, vg, vh, vi), vb, vc, Empty)) s t Empty = + Branch + (B, Branch (v, Branch (B, 
vf, vg, vh, vi), vb, vc, Empty), s, t, Empty) + | balance + (Branch (v, Branch (B, vf, vg, vh, vi), vb, vc, Branch (B, ve, vj, vk, vl))) + s t Empty = + Branch + (B, Branch + (v, Branch (B, vf, vg, vh, vi), vb, vc, Branch (B, ve, vj, vk, vl)), + s, t, Empty) + | balance (Branch (v, Empty, vf, vg, Empty)) s t (Branch (B, va, vb, vc, vd)) + = Branch + (B, Branch (v, Empty, vf, vg, Empty), s, t, Branch (B, va, vb, vc, vd)) + | balance (Branch (v, Empty, vf, vg, Branch (B, vi, vj, vk, vl))) s t + (Branch (B, va, vb, vc, vd)) = + Branch + (B, Branch (v, Empty, vf, vg, Branch (B, vi, vj, vk, vl)), s, t, + Branch (B, va, vb, vc, vd)) + | balance (Branch (v, Branch (B, vj, vk, vl, vm), vf, vg, Empty)) s t + (Branch (B, va, vb, vc, vd)) = + Branch + (B, Branch (v, Branch (B, vj, vk, vl, vm), vf, vg, Empty), s, t, + Branch (B, va, vb, vc, vd)) + | balance + (Branch (v, Branch (B, vj, vk, vl, vm), vf, vg, Branch (B, vi, vn, vo, vp))) + s t (Branch (B, va, vb, vc, vd)) = + Branch + (B, Branch + (v, Branch (B, vj, vk, vl, vm), vf, vg, Branch (B, vi, vn, vo, vp)), + s, t, Branch (B, va, vb, vc, vd)); + +fun rbt_ins A_ f k v Empty = Branch (R, Empty, k, v, Empty) + | rbt_ins A_ f k v (Branch (B, l, x, y, r)) = + (if less A_ k x then balance (rbt_ins A_ f k v l) x y r + else (if less A_ x k then balance l x y (rbt_ins A_ f k v r) + else Branch (B, l, x, f k y v, r))) + | rbt_ins A_ f k v (Branch (R, l, x, y, r)) = + (if less A_ k x then Branch (R, rbt_ins A_ f k v l, x, y, r) + else (if less A_ x k then Branch (R, l, x, y, rbt_ins A_ f k v r) + else Branch (R, l, x, f k y v, r))); + +fun paint c Empty = Empty + | paint c (Branch (uu, l, k, v, r)) = Branch (c, l, k, v, r); + +fun rbt_insert_with_key A_ f k v t = paint B (rbt_ins A_ f k v t); + +fun rbt_insert A_ = rbt_insert_with_key A_ (fn _ => fn _ => fn nv => nv); + +fun insert A_ xc xd xe = + RBT (rbt_insert ((ord_preorder o preorder_order o order_linorder) A_) xc xd + (impl_of A_ xe)); + +fun rbt_lookup A_ Empty k = NONE + | rbt_lookup A_ (Branch (uu, l, x, y, r)) k = + (if less A_ k x then rbt_lookup A_ l k + else (if less A_ x k then rbt_lookup A_ r k else SOME y)); + +fun lookup A_ x = + rbt_lookup ((ord_preorder o preorder_order o order_linorder) A_) + (impl_of A_ x); + +fun member A_ [] y = false + | member A_ (x :: xs) y = eq A_ x y orelse member A_ xs y; + +fun hd (x21 :: x22) = x21; + +fun tl [] = [] + | tl (x21 :: x22) = x22; + +fun remdups A_ [] = [] + | remdups A_ (x :: xs) = + (if member A_ xs x then remdups A_ xs else x :: remdups A_ xs); + +fun dnf_and_fm (Or (phi_1, phi_2)) psi = + Or (dnf_and_fm phi_1 psi, dnf_and_fm phi_2 psi) + | dnf_and_fm (Atom v) (Or (phi_1, phi_2)) = + Or (dnf_and_fm (Atom v) phi_1, dnf_and_fm (Atom v) phi_2) + | dnf_and_fm (And (v, va)) (Or (phi_1, phi_2)) = + Or (dnf_and_fm (And (v, va)) phi_1, dnf_and_fm (And (v, va)) phi_2) + | dnf_and_fm (Neg v) (Or (phi_1, phi_2)) = + Or (dnf_and_fm (Neg v) phi_1, dnf_and_fm (Neg v) phi_2) + | dnf_and_fm (Atom v) (Atom va) = And (Atom v, Atom va) + | dnf_and_fm (Atom v) (And (va, vb)) = And (Atom v, And (va, vb)) + | dnf_and_fm (Atom v) (Neg va) = And (Atom v, Neg va) + | dnf_and_fm (And (v, va)) (Atom vb) = And (And (v, va), Atom vb) + | dnf_and_fm (And (v, va)) (And (vb, vc)) = And (And (v, va), And (vb, vc)) + | dnf_and_fm (And (v, va)) (Neg vb) = And (And (v, va), Neg vb) + | dnf_and_fm (Neg v) (Atom va) = And (Neg v, Atom va) + | dnf_and_fm (Neg v) (And (va, vb)) = And (Neg v, And (va, vb)) + | dnf_and_fm (Neg v) (Neg va) = And (Neg v, Neg va); + +fun dnf_fm (And 
(phi_1, phi_2)) = dnf_and_fm (dnf_fm phi_1) (dnf_fm phi_2) + | dnf_fm (Or (phi_1, phi_2)) = Or (dnf_fm phi_1, dnf_fm phi_2) + | dnf_fm (Atom v) = Atom v + | dnf_fm (Neg v) = Neg v; + +fun keysa A_ (Mapping t) = Set (keys A_ t); + +fun amap_fm h (Atom a) = h a + | amap_fm h (And (phi_1, phi_2)) = And (amap_fm h phi_1, amap_fm h phi_2) + | amap_fm h (Or (phi_1, phi_2)) = Or (amap_fm h phi_1, amap_fm h phi_2) + | amap_fm h (Neg phi) = Neg (amap_fm h phi); + +fun emptya A_ = Mapping (empty A_); + +fun lookupa A_ (Mapping t) = lookup A_ t; + +fun update A_ k v (Mapping t) = Mapping (insert A_ k v t); + +fun gen_length n (x :: xs) = gen_length (Suc n) xs + | gen_length n [] = n; + +fun size_list x = gen_length Zero_nat x; + +fun card A_ (Set xs) = size_list (remdups A_ xs); + +fun conj_list (And (phi_1, phi_2)) = conj_list phi_1 @ conj_list phi_2 + | conj_list (Atom a) = [a]; + +fun trm_of_fm f (Atom a) = f a + | trm_of_fm f (And (phi_1, phi_2)) = + App (App (Const "conj", trm_of_fm f phi_1), trm_of_fm f phi_2) + | trm_of_fm f (Or (phi_1, phi_2)) = + App (App (Const "disj", trm_of_fm f phi_1), trm_of_fm f phi_2) + | trm_of_fm f (Neg phi) = App (Const "Not", trm_of_fm f phi); + +fun dnf_and_fm_prf (Or (phi_1, phi_2)) psi = + foldl (fn a => fn b => AppP (a, b)) (PThm "then_conv") + [PThm "conj_disj_distribR_conv", + foldl (fn a => fn b => AppP (a, b)) (PThm "combination_conv") + [AppP (PThm "arg_conv", + hd [dnf_and_fm_prf phi_1 psi, dnf_and_fm_prf phi_2 psi]), + hd (tl [dnf_and_fm_prf phi_1 psi, dnf_and_fm_prf phi_2 psi])]] + | dnf_and_fm_prf (Atom v) (Or (phi_1, phi_2)) = + foldl (fn a => fn b => AppP (a, b)) (PThm "then_conv") + [PThm "conj_disj_distribL_conv", + foldl (fn a => fn b => AppP (a, b)) (PThm "combination_conv") + [AppP (PThm "arg_conv", + hd [dnf_and_fm_prf (Atom v) phi_1, + dnf_and_fm_prf (Atom v) phi_2]), + hd (tl [dnf_and_fm_prf (Atom v) phi_1, + dnf_and_fm_prf (Atom v) phi_2])]] + | dnf_and_fm_prf (And (v, va)) (Or (phi_1, phi_2)) = + foldl (fn a => fn b => AppP (a, b)) (PThm "then_conv") + [PThm "conj_disj_distribL_conv", + foldl (fn a => fn b => AppP (a, b)) (PThm "combination_conv") + [AppP (PThm "arg_conv", + hd [dnf_and_fm_prf (And (v, va)) phi_1, + dnf_and_fm_prf (And (v, va)) phi_2]), + hd (tl [dnf_and_fm_prf (And (v, va)) phi_1, + dnf_and_fm_prf (And (v, va)) phi_2])]] + | dnf_and_fm_prf (Neg v) (Or (phi_1, phi_2)) = + foldl (fn a => fn b => AppP (a, b)) (PThm "then_conv") + [PThm "conj_disj_distribL_conv", + foldl (fn a => fn b => AppP (a, b)) (PThm "combination_conv") + [AppP (PThm "arg_conv", + hd [dnf_and_fm_prf (Neg v) phi_1, + dnf_and_fm_prf (Neg v) phi_2]), + hd (tl [dnf_and_fm_prf (Neg v) phi_1, + dnf_and_fm_prf (Neg v) phi_2])]] + | dnf_and_fm_prf (Atom v) (Atom va) = PThm "all_conv" + | dnf_and_fm_prf (Atom v) (And (va, vb)) = PThm "all_conv" + | dnf_and_fm_prf (Atom v) (Neg va) = PThm "all_conv" + | dnf_and_fm_prf (And (v, va)) (Atom vb) = PThm "all_conv" + | dnf_and_fm_prf (And (v, va)) (And (vb, vc)) = PThm "all_conv" + | dnf_and_fm_prf (And (v, va)) (Neg vb) = PThm "all_conv" + | dnf_and_fm_prf (Neg v) (Atom va) = PThm "all_conv" + | dnf_and_fm_prf (Neg v) (And (va, vb)) = PThm "all_conv" + | dnf_and_fm_prf (Neg v) (Neg va) = PThm "all_conv"; + +fun dnf_fm_prf (And (phi_1, phi_2)) = + foldl (fn a => fn b => AppP (a, b)) (PThm "then_conv") + [foldl (fn a => fn b => AppP (a, b)) (PThm "combination_conv") + [AppP (PThm "arg_conv", hd [dnf_fm_prf phi_1, dnf_fm_prf phi_2]), + hd (tl [dnf_fm_prf phi_1, dnf_fm_prf phi_2])], + dnf_and_fm_prf (dnf_fm phi_1) (dnf_fm 
phi_2)] + | dnf_fm_prf (Or (phi_1, phi_2)) = + foldl (fn a => fn b => AppP (a, b)) (PThm "combination_conv") + [AppP (PThm "arg_conv", hd [dnf_fm_prf phi_1, dnf_fm_prf phi_2]), + hd (tl [dnf_fm_prf phi_1, dnf_fm_prf phi_2])] + | dnf_fm_prf (Atom v) = PThm "all_conv" + | dnf_fm_prf (Neg v) = PThm "all_conv"; + +fun of_alist A_ xs = foldr (fn (a, b) => update A_ a b) xs (emptya A_); + +fun deneg (true, LESS (x, y)) = + And (Atom (true, LEQ (x, y)), Atom (false, EQ (x, y))) + | deneg (false, LESS (x, y)) = Atom (true, LEQ (y, x)) + | deneg (false, LEQ (x, y)) = + And (Atom (true, LEQ (y, x)), Atom (false, EQ (y, x))) + | deneg (false, EQ (v, vb)) = Atom (false, EQ (v, vb)) + | deneg (v, EQ (vb, vc)) = Atom (v, EQ (vb, vc)) + | deneg (true, LEQ (vb, vc)) = Atom (true, LEQ (vb, vc)); + +fun from_conj_prf trm_of_atom p (And (a, b)) = + foldl (fn aa => fn ba => AppP (aa, ba)) (PThm "conjE") + [Bound (trm_of_fm trm_of_atom (And (a, b))), + AbsP (trm_of_fm trm_of_atom a, + AbsP (trm_of_fm trm_of_atom b, + from_conj_prf trm_of_atom (from_conj_prf trm_of_atom p b) + a))] + | from_conj_prf trm_of_atom p (Atom a) = p; + +fun contr_fm_prf trm_of_atom contr_atom_prf (Or (c, d)) = + (case (contr_fm_prf trm_of_atom contr_atom_prf c, + contr_fm_prf trm_of_atom contr_atom_prf d) + of (NONE, _) => NONE | (SOME _, NONE) => NONE + | (SOME p1, SOME p2) => + SOME (foldl (fn a => fn b => AppP (a, b)) (PThm "disjE") + [Bound (trm_of_fm trm_of_atom (Or (c, d))), + AbsP (trm_of_fm trm_of_atom c, p1), + AbsP (trm_of_fm trm_of_atom d, p2)])) + | contr_fm_prf trm_of_atom contr_atom_prf (And (a, b)) = + (case contr_atom_prf (conj_list (And (a, b))) of NONE => NONE + | SOME p => SOME (from_conj_prf trm_of_atom p (And (a, b)))) + | contr_fm_prf trm_of_atom contr_atom_prf (Atom a) = contr_atom_prf [a]; + +fun deless (true, LESS (x, y)) = + And (Atom (true, LEQ (x, y)), Atom (false, EQ (x, y))) + | deless (false, LESS (x, y)) = + Or (Atom (false, LEQ (x, y)), Atom (true, EQ (x, y))) + | deless (false, EQ (v, vb)) = Atom (false, EQ (v, vb)) + | deless (false, LEQ (v, vb)) = Atom (false, LEQ (v, vb)) + | deless (v, EQ (vb, vc)) = Atom (v, EQ (vb, vc)) + | deless (v, LEQ (vb, vc)) = Atom (v, LEQ (vb, vc)); + +fun deneg_prf (true, LESS (x, y)) = PThm "less_le" + | deneg_prf (false, LESS (x, y)) = PThm "nless_le" + | deneg_prf (false, LEQ (x, y)) = PThm "nle_le" + | deneg_prf (false, EQ (v, vb)) = PThm "all_conv" + | deneg_prf (v, EQ (vb, vc)) = PThm "all_conv" + | deneg_prf (true, LEQ (vb, vc)) = PThm "all_conv"; + +val one_nat : nat = Suc Zero_nat; + +fun map_option f NONE = NONE + | map_option f (SOME x2) = SOME (f x2); + +fun deless_prf (true, LESS (x, y)) = PThm "less_le" + | deless_prf (false, LESS (x, y)) = PThm "nless_le" + | deless_prf (false, EQ (v, vb)) = PThm "all_conv" + | deless_prf (false, LEQ (v, vb)) = PThm "all_conv" + | deless_prf (v, EQ (vb, vc)) = PThm "all_conv" + | deless_prf (v, LEQ (vb, vc)) = PThm "all_conv"; + +fun trm_of_oatom (EQ (x, y)) = App (App (Const "eq", Var x), Var y) + | trm_of_oatom (LEQ (x, y)) = App (App (Const "le", Var x), Var y) + | trm_of_oatom (LESS (x, y)) = App (App (Const "lt", Var x), Var y); + +fun minus_nat (Suc m) (Suc n) = minus_nat m n + | minus_nat Zero_nat n = Zero_nat + | minus_nat m Zero_nat = m; + +fun mapping_fold A_ f (Mapping t) a = fold A_ f t a; + +fun relcomp1_mapping A_ (B1_, B2_) x y1 pxy pm pma = + mapping_fold (linorder_prod B2_ B2_) + (fn (y2, z) => fn pyz => fn pmb => + (if eq B1_ y1 y2 andalso not (eq B1_ y2 z) + then update (linorder_prod A_ B2_) (x, z) + 
(foldl (fn a => fn b => AppP (a, b)) (PThm "trans") [pxy, pyz]) + pmb + else pmb)) + pm pma; + +fun relcomp_mapping (A1_, A2_) pm1 pm2 pma = + mapping_fold (linorder_prod A2_ A2_) + (fn (x, y) => fn pxy => fn pm => + (if eq A1_ x y then pm + else relcomp1_mapping A2_ (A1_, A2_) x y pxy pm2 pm)) + pm1 pma; + +fun ntrancl_mapping (A1_, A2_) Zero_nat m = m + | ntrancl_mapping (A1_, A2_) (Suc k) m = + let + val trclm = ntrancl_mapping (A1_, A2_) k m; + in + relcomp_mapping (A1_, A2_) trclm m trclm + end; + +fun trancl_mapping (A1_, A2_) m = + ntrancl_mapping (A1_, A2_) + (minus_nat (card (equal_prod A1_ A1_) (keysa (linorder_prod A2_ A2_) m)) + one_nat) + m; + +fun is_in_leq leqm l = + let + val (x, y) = l; + in + (if equal_inta x y then SOME (Appt (PThm "refl", Var x)) + else lookupa (linorder_prod linorder_int linorder_int) leqm l) + end; + +fun is_in_eq leqm l = + let + val (x, y) = l; + in + (case (is_in_leq leqm (x, y), is_in_leq leqm (y, x)) of (NONE, _) => NONE + | (SOME _, NONE) => NONE + | (SOME p1, SOME p2) => + SOME (foldl (fn a => fn b => AppP (a, b)) (PThm "antisym") [p1, p2])) + end; + +fun trm_of_oliteral (true, a) = trm_of_oatom a + | trm_of_oliteral (false, a) = App (Const "Not", trm_of_oatom a); + +fun contr1_list leqm (false, LEQ (x, y)) = + map_option + (fn a => + AppP (AppP (PThm "contr", Bound (trm_of_oliteral (false, LEQ (x, y)))), + a)) + (is_in_leq leqm (x, y)) + | contr1_list leqm (false, EQ (x, y)) = + map_option + (fn a => + AppP (AppP (PThm "contr", Bound (trm_of_oliteral (false, EQ (x, y)))), + a)) + (is_in_eq leqm (x, y)) + | contr1_list uu (true, va) = NONE + | contr1_list uu (v, LESS (vb, vc)) = NONE; + +fun contr_list_aux leqm [] = NONE + | contr_list_aux leqm (l :: ls) = + (case contr1_list leqm l of NONE => contr_list_aux leqm ls + | SOME a => SOME a); + +fun leq1_member_list (true, LEQ (x, y)) = + [((x, y), Bound (trm_of_oliteral (true, LEQ (x, y))))] + | leq1_member_list (true, EQ (x, y)) = + [((x, y), AppP (PThm "eqD1", Bound (trm_of_oliteral (true, EQ (x, y))))), + ((y, x), AppP (PThm "eqD2", Bound (trm_of_oliteral (true, EQ (x, y)))))] + | leq1_member_list (false, va) = [] + | leq1_member_list (v, LESS (vb, vc)) = []; + +fun leq1_list a = maps leq1_member_list a; + +fun leq1_mapping a = + of_alist (linorder_prod linorder_int linorder_int) (leq1_list a); + +fun contr_list a = + contr_list_aux (trancl_mapping (equal_int, linorder_int) (leq1_mapping a)) a; + +fun contr_prf atom_conv phi = + contr_fm_prf trm_of_oliteral contr_list (dnf_fm (amap_fm atom_conv phi)); + +fun amap_f_m_prf ap (Atom a) = AppP (PThm "atom_conv", ap a) + | amap_f_m_prf ap (And (phi_1, phi_2)) = + foldl (fn a => fn b => AppP (a, b)) (PThm "combination_conv") + [AppP (PThm "arg_conv", + hd [amap_f_m_prf ap phi_1, amap_f_m_prf ap phi_2]), + hd (tl [amap_f_m_prf ap phi_1, amap_f_m_prf ap phi_2])] + | amap_f_m_prf ap (Or (phi_1, phi_2)) = + foldl (fn a => fn b => AppP (a, b)) (PThm "combination_conv") + [AppP (PThm "arg_conv", + hd [amap_f_m_prf ap phi_1, amap_f_m_prf ap phi_2]), + hd (tl [amap_f_m_prf ap phi_1, amap_f_m_prf ap phi_2])] + | amap_f_m_prf ap (Neg phi) = AppP (PThm "arg_conv", amap_f_m_prf ap phi); + +fun lo_contr_prf phi = + map_option + ((fn a => + Conv (trm_of_fm trm_of_oliteral phi, amap_f_m_prf deneg_prf phi, a)) o + (fn a => + Conv (trm_of_fm trm_of_oliteral (amap_fm deneg phi), + dnf_fm_prf (amap_fm deneg phi), a))) + (contr_prf deneg phi); + +fun po_contr_prf phi = + map_option + ((fn a => + Conv (trm_of_fm trm_of_oliteral phi, amap_f_m_prf deless_prf phi, a)) o + 
(fn a =>
+        Conv (trm_of_fm trm_of_oliteral (amap_fm deless phi),
+          dnf_fm_prf (amap_fm deless phi), a)))
+    (contr_prf deless phi);
+
+end; (*struct Order_Procedure*)
diff --git a/src/Provers/order_tac.ML b/src/Provers/order_tac.ML
new file mode 100644
--- /dev/null
+++ b/src/Provers/order_tac.ML
@@ -0,0 +1,456 @@
+signature REIFY_TABLE =
+sig
+  type table
+  val empty : table
+  val get_var : term -> table -> (int * table)
+  val get_term : int -> table -> term option
+end
+
+structure Reifytab: REIFY_TABLE =
+struct
+  type table = (int * int Termtab.table * term Inttab.table)
+
+  val empty = (0, Termtab.empty, Inttab.empty)
+
+  fun get_var t (tab as (max_var, termtab, inttab)) =
+    (case Termtab.lookup termtab t of
+      SOME v => (v, tab)
+    | NONE => (max_var,
+        (max_var + 1, Termtab.update (t, max_var) termtab, Inttab.update (max_var, t) inttab))
+    )
+
+  fun get_term v (_, _, inttab) = Inttab.lookup inttab v
+end
+
+signature LOGIC_SIGNATURE =
+sig
+  val mk_Trueprop : term -> term
+  val dest_Trueprop : term -> term
+  val Trueprop_conv : conv -> conv
+  val Not : term
+  val conj : term
+  val disj : term
+
+  val notI : thm (* (P \<Longrightarrow> False) \<Longrightarrow> \<not> P *)
+  val ccontr : thm (* (\<not> P \<Longrightarrow> False) \<Longrightarrow> P *)
+  val conjI : thm (* P \<Longrightarrow> Q \<Longrightarrow> P \<and> Q *)
+  val conjE : thm (* P \<and> Q \<Longrightarrow> (P \<Longrightarrow> Q \<Longrightarrow> R) \<Longrightarrow> R *)
+  val disjE : thm (* P \<or> Q \<Longrightarrow> (P \<Longrightarrow> R) \<Longrightarrow> (Q \<Longrightarrow> R) \<Longrightarrow> R *)
+
+  val not_not_conv : conv (* \<not> (\<not> P) \<equiv> P *)
+  val de_Morgan_conj_conv : conv (* \<not> (P \<and> Q) \<equiv> \<not> P \<or> \<not> Q *)
+  val de_Morgan_disj_conv : conv (* \<not> (P \<or> Q) \<equiv> \<not> P \<and> \<not> Q *)
+  val conj_disj_distribL_conv : conv (* P \<and> (Q \<or> R) \<equiv> (P \<and> Q) \<or> (P \<and> R) *)
+  val conj_disj_distribR_conv : conv (* (Q \<or> R) \<and> P \<equiv> (Q \<and> P) \<or> (R \<and> P) *)
+end
+
+(* Control tracing output of the solver. *)
+val order_trace_cfg = Attrib.setup_config_bool @{binding "order_trace"} (K false)
+(* In partial orders, literals of the form \<not> x < y will force the order solver to perform case
+   distinctions, which leads to an exponential blowup of the runtime. The split limit controls
+   the number of literals of this form that are passed to the solver.
+ *) +val order_split_limit_cfg = Attrib.setup_config_int @{binding "order_split_limit"} (K 8) + +datatype order_kind = Order | Linorder + +type order_literal = (bool * Order_Procedure.o_atom) + +type order_context = { + kind : order_kind, + ops : term list, thms : (string * thm) list, conv_thms : (string * thm) list + } + +signature BASE_ORDER_TAC = +sig + + val tac : + (order_literal Order_Procedure.fm -> Order_Procedure.prf_trm option) + -> order_context -> thm list + -> Proof.context -> int -> tactic +end + +functor Base_Order_Tac( + structure Logic_Sig : LOGIC_SIGNATURE; val excluded_types : typ list) : BASE_ORDER_TAC = +struct + open Order_Procedure + + fun expect _ (SOME x) = x + | expect f NONE = f () + + fun matches_skeleton t s = t = Term.dummy orelse + (case (t, s) of + (t0 $ t1, s0 $ s1) => matches_skeleton t0 s0 andalso matches_skeleton t1 s1 + | _ => t aconv s) + + fun dest_binop t = + let + val binop_skel = Term.dummy $ Term.dummy $ Term.dummy + val not_binop_skel = Logic_Sig.Not $ binop_skel + in + if matches_skeleton not_binop_skel t + then (case t of (_ $ (t1 $ t2 $ t3)) => (false, (t1, t2, t3))) + else if matches_skeleton binop_skel t + then (case t of (t1 $ t2 $ t3) => (true, (t1, t2, t3))) + else raise TERM ("Not a binop literal", [t]) + end + + fun find_term t = Library.find_first (fn (t', _) => t' aconv t) + + fun reify_order_atom (eq, le, lt) t reifytab = + let + val (b, (t0, t1, t2)) = + (dest_binop t) handle TERM (_, _) => raise TERM ("Can't reify order literal", [t]) + val binops = [(eq, EQ), (le, LEQ), (lt, LESS)] + in + case find_term t0 binops of + SOME (_, reified_bop) => + reifytab + |> Reifytab.get_var t1 ||> Reifytab.get_var t2 + |> (fn (v1, (v2, vartab')) => + ((b, reified_bop (Int_of_integer v1, Int_of_integer v2)), vartab')) + |>> Atom + | NONE => raise TERM ("Can't reify order literal", [t]) + end + + fun reify consts reify_atom t reifytab = + let + fun reify' (t1 $ t2) reifytab = + let + val (t0, ts) = strip_comb (t1 $ t2) + val consts_of_arity = filter (fn (_, (_, ar)) => length ts = ar) consts + in + (case find_term t0 consts_of_arity of + SOME (_, (reified_op, _)) => fold_map reify' ts reifytab |>> reified_op + | NONE => reify_atom (t1 $ t2) reifytab) + end + | reify' t reifytab = reify_atom t reifytab + in + reify' t reifytab + end + + fun list_curry0 f = (fn [] => f, 0) + fun list_curry1 f = (fn [x] => f x, 1) + fun list_curry2 f = (fn [x, y] => f x y, 2) + + fun reify_order_conj ord_ops = + let + val consts = map (apsnd (list_curry2 o curry)) [(Logic_Sig.conj, And), (Logic_Sig.disj, Or)] + in + reify consts (reify_order_atom ord_ops) + end + + fun dereify_term consts reifytab t = + let + fun dereify_term' (App (t1, t2)) = (dereify_term' t1) $ (dereify_term' t2) + | dereify_term' (Const s) = + AList.lookup (op =) consts s + |> expect (fn () => raise TERM ("Const " ^ s ^ " not in", map snd consts)) + | dereify_term' (Var v) = Reifytab.get_term (integer_of_int v) reifytab |> the + in + dereify_term' t + end + + fun dereify_order_fm (eq, le, lt) reifytab t = + let + val consts = [ + ("eq", eq), ("le", le), ("lt", lt), + ("Not", Logic_Sig.Not), ("disj", Logic_Sig.disj), ("conj", Logic_Sig.conj) + ] + in + dereify_term consts reifytab t + end + + fun strip_AppP t = + let fun strip (AppP (f, s), ss) = strip (f, s::ss) + | strip x = x + in strip (t, []) end + + fun replay_conv convs cvp = + let + val convs = convs @ + [("all_conv", list_curry0 Conv.all_conv)] @ + map (apsnd list_curry1) [ + ("atom_conv", I), + ("neg_atom_conv", I), + ("arg_conv", 
Conv.arg_conv)] @ + map (apsnd list_curry2) [ + ("combination_conv", Conv.combination_conv), + ("then_conv", curry (op then_conv))] + + fun lookup_conv convs c = AList.lookup (op =) convs c + |> expect (fn () => error ("Can't replay conversion: " ^ c)) + + fun rp_conv t = + (case strip_AppP t ||> map rp_conv of + (PThm c, cvs) => + let val (conv, arity) = lookup_conv convs c + in if arity = length cvs + then conv cvs + else error ("Expected " ^ Int.toString arity ^ " arguments for conversion " ^ + c ^ " but got " ^ (length cvs |> Int.toString) ^ " arguments") + end + | _ => error "Unexpected constructor in conversion proof") + in + rp_conv cvp + end + + fun replay_prf_trm replay_conv dereify ctxt thmtab assmtab p = + let + fun replay_prf_trm' _ (PThm s) = + AList.lookup (op =) thmtab s + |> expect (fn () => error ("Cannot replay theorem: " ^ s)) + | replay_prf_trm' assmtab (Appt (p, t)) = + replay_prf_trm' assmtab p + |> Drule.infer_instantiate' ctxt [SOME (Thm.cterm_of ctxt (dereify t))] + | replay_prf_trm' assmtab (AppP (p1, p2)) = + apply2 (replay_prf_trm' assmtab) (p2, p1) |> (op COMP) + | replay_prf_trm' assmtab (AbsP (reified_t, p)) = + let + val t = dereify reified_t + val t_thm = Logic_Sig.mk_Trueprop t |> Thm.cterm_of ctxt |> Assumption.assume ctxt + val rp = replay_prf_trm' (Termtab.update (Thm.prop_of t_thm, t_thm) assmtab) p + in + Thm.implies_intr (Thm.cprop_of t_thm) rp + end + | replay_prf_trm' assmtab (Bound reified_t) = + let + val t = dereify reified_t |> Logic_Sig.mk_Trueprop + in + Termtab.lookup assmtab t + |> expect (fn () => raise TERM ("Assumption not found:", t::Termtab.keys assmtab)) + end + | replay_prf_trm' assmtab (Conv (t, cp, p)) = + let + val thm = replay_prf_trm' assmtab (Bound t) + val conv = Logic_Sig.Trueprop_conv (replay_conv cp) + val conv_thm = Conv.fconv_rule conv thm + val conv_term = Thm.prop_of conv_thm + in + replay_prf_trm' (Termtab.update (conv_term, conv_thm) assmtab) p + end + in + replay_prf_trm' assmtab p + end + + fun replay_order_prf_trm ord_ops {thms = thms, conv_thms = conv_thms, ...} ctxt reifytab assmtab = + let + val thmtab = thms @ [ + ("conjE", Logic_Sig.conjE), ("conjI", Logic_Sig.conjI), ("disjE", Logic_Sig.disjE) + ] + val convs = map (apsnd list_curry0) ( + map (apsnd Conv.rewr_conv) conv_thms @ + [ + ("not_not_conv", Logic_Sig.not_not_conv), + ("de_Morgan_conj_conv", Logic_Sig.de_Morgan_conj_conv), + ("de_Morgan_disj_conv", Logic_Sig.de_Morgan_disj_conv), + ("conj_disj_distribR_conv", Logic_Sig.conj_disj_distribR_conv), + ("conj_disj_distribL_conv", Logic_Sig.conj_disj_distribL_conv) + ]) + + val dereify = dereify_order_fm ord_ops reifytab + in + replay_prf_trm (replay_conv convs) dereify ctxt thmtab assmtab + end + + fun is_binop_term t = + let + fun is_included t = forall (curry (op <>) (t |> fastype_of |> domain_type)) excluded_types + in + (case dest_binop (Logic_Sig.dest_Trueprop t) of + (_, (binop, t1, t2)) => + is_included binop andalso + (* Exclude terms with schematic variables since the solver can't deal with them. + More specifically, the solver uses Assumption.assume which does not allow schematic + variables in the assumed cterm. 
+ *) + Term.add_var_names (binop $ t1 $ t2) [] = [] + ) handle TERM (_, _) => false + end + + fun partition_matches ctxt term_of pats ys = + let + val thy = Proof_Context.theory_of ctxt + + fun find_match t env = + Library.get_first (try (fn pat => Pattern.match thy (pat, t) env)) pats + + fun filter_matches xs = fold (fn x => fn (mxs, nmxs, env) => + case find_match (term_of x) env of + SOME env' => (x::mxs, nmxs, env') + | NONE => (mxs, x::nmxs, env)) xs ([], [], (Vartab.empty, Vartab.empty)) + + fun partition xs = + case filter_matches xs of + ([], _, _) => [] + | (mxs, nmxs, env) => (env, mxs) :: partition nmxs + in + partition ys + end + + fun limit_not_less [_, _, lt] ctxt prems = + let + val thy = Proof_Context.theory_of ctxt + val trace = Config.get ctxt order_trace_cfg + val limit = Config.get ctxt order_split_limit_cfg + + fun is_not_less_term t = + (case dest_binop (Logic_Sig.dest_Trueprop t) of + (false, (t0, _, _)) => Pattern.matches thy (lt, t0) + | _ => false) + handle TERM _ => false + + val not_less_prems = filter (is_not_less_term o Thm.prop_of) prems + val _ = if trace andalso length not_less_prems > limit + then tracing "order split limit exceeded" + else () + in + filter_out (is_not_less_term o Thm.prop_of) prems @ + take limit not_less_prems + end + + fun order_tac raw_order_proc octxt simp_prems = + Subgoal.FOCUS (fn {prems=prems, context=ctxt, ...} => + let + val trace = Config.get ctxt order_trace_cfg + + val binop_prems = filter (is_binop_term o Thm.prop_of) (prems @ simp_prems) + val strip_binop = (fn (x, _, _) => x) o snd o dest_binop + val binop_of = strip_binop o Logic_Sig.dest_Trueprop o Thm.prop_of + + (* Due to local_setup, the operators of the order may contain schematic term and type + variables. We partition the premises according to distinct instances of those operators. 
+ *) + val part_prems = partition_matches ctxt binop_of (#ops octxt) binop_prems + |> (case #kind octxt of + Order => map (fn (env, prems) => + (env, limit_not_less (#ops octxt) ctxt prems)) + | _ => I) + + fun order_tac' (_, []) = no_tac + | order_tac' (env, prems) = + let + val [eq, le, lt] = #ops octxt + val subst_contract = Envir.eta_contract o Envir.subst_term env + val ord_ops = (subst_contract eq, + subst_contract le, + subst_contract lt) + + val _ = if trace then @{print} (ord_ops, prems) else (ord_ops, prems) + + val prems_conj_thm = foldl1 (fn (x, a) => Logic_Sig.conjI OF [x, a]) prems + |> Conv.fconv_rule Thm.eta_conversion + val prems_conj = prems_conj_thm |> Thm.prop_of + val (reified_prems_conj, reifytab) = + reify_order_conj ord_ops (Logic_Sig.dest_Trueprop prems_conj) Reifytab.empty + + val proof = raw_order_proc reified_prems_conj + + val assmtab = Termtab.make [(prems_conj, prems_conj_thm)] + val replay = replay_order_prf_trm ord_ops octxt ctxt reifytab assmtab + in + case proof of + NONE => no_tac + | SOME p => SOLVED' (resolve_tac ctxt [replay p]) 1 + end + in + FIRST (map order_tac' part_prems) + end) + + val ad_absurdum_tac = SUBGOAL (fn (A, i) => + case try (Logic_Sig.dest_Trueprop o Logic.strip_assums_concl) A of + SOME (nt $ _) => + if nt = Logic_Sig.Not + then resolve0_tac [Logic_Sig.notI] i + else resolve0_tac [Logic_Sig.ccontr] i + | SOME _ => resolve0_tac [Logic_Sig.ccontr] i + | NONE => resolve0_tac [Logic_Sig.ccontr] i) + + fun tac raw_order_proc octxt simp_prems ctxt = + EVERY' [ + ad_absurdum_tac, + CONVERSION Thm.eta_conversion, + order_tac raw_order_proc octxt simp_prems ctxt + ] + +end + +functor Order_Tac(structure Base_Tac : BASE_ORDER_TAC) = struct + + fun order_context_eq ({kind = kind1, ops = ops1, ...}, {kind = kind2, ops = ops2, ...}) = + kind1 = kind2 andalso eq_list (op aconv) (ops1, ops2) + + fun order_data_eq (x, y) = order_context_eq (fst x, fst y) + + structure Data = Generic_Data( + type T = (order_context * (order_context -> thm list -> Proof.context -> int -> tactic)) list + val empty = [] + val extend = I + fun merge data = Library.merge order_data_eq data + ) + + fun declare (octxt as {kind = kind, raw_proc = raw_proc, ...}) lthy = + lthy |> Local_Theory.declaration {syntax = false, pervasive = false} (fn phi => fn context => + let + val ops = map (Morphism.term phi) (#ops octxt) + val thms = map (fn (s, thm) => (s, Morphism.thm phi thm)) (#thms octxt) + val conv_thms = map (fn (s, thm) => (s, Morphism.thm phi thm)) (#conv_thms octxt) + val octxt' = {kind = kind, ops = ops, thms = thms, conv_thms = conv_thms} + in + context |> Data.map (Library.insert order_data_eq (octxt', raw_proc)) + end) + + fun declare_order { + ops = {eq = eq, le = le, lt = lt}, + thms = { + trans = trans, (* x \<le> y \<Longrightarrow> y \<le> z \<Longrightarrow> x \<le> z *) + refl = refl, (* x \<le> x *) + eqD1 = eqD1, (* x = y \<Longrightarrow> x \<le> y *) + eqD2 = eqD2, (* x = y \<Longrightarrow> y \<le> x *) + antisym = antisym, (* x \<le> y \<Longrightarrow> y \<le> x \<Longrightarrow> x = y *) + contr = contr (* \<not> P \<Longrightarrow> P \<Longrightarrow> R *) + }, + conv_thms = { + less_le = less_le, (* x < y \<equiv> x \<le> y \<and> x \<noteq> y *) + nless_le = nless_le (* \<not> a < b \<equiv> \<not> a \<le> b \<or> a = b *) + } + } = + declare { + kind = Order, + ops = [eq, le, lt], + thms = [("trans", trans), ("refl", refl), ("eqD1", eqD1), ("eqD2", eqD2), + ("antisym", antisym), ("contr", contr)], + conv_thms = [("less_le", less_le), ("nless_le", nless_le)], + raw_proc = Base_Tac.tac Order_Procedure.po_contr_prf + } + + fun declare_linorder { + ops = {eq = eq, le = le, lt = lt}, + thms = { + trans = trans, (* x \<le> y \<Longrightarrow> y \<le> z \<Longrightarrow> x \<le> z *) + refl = refl, (* x
\<le> x *) + eqD1 = eqD1, (* x = y \<Longrightarrow> x \<le> y *) + eqD2 = eqD2, (* x = y \<Longrightarrow> y \<le> x *) + antisym = antisym, (* x \<le> y \<Longrightarrow> y \<le> x \<Longrightarrow> x = y *) + contr = contr (* \<not> P \<Longrightarrow> P \<Longrightarrow> R *) + }, + conv_thms = { + less_le = less_le, (* x < y \<equiv> x \<le> y \<and> x \<noteq> y *) + nless_le = nless_le, (* \<not> x < y \<equiv> y \<le> x *) + nle_le = nle_le (* \<not> a \<le> b \<equiv> b \<le> a \<and> b \<noteq> a *) + } + } = + declare { + kind = Linorder, + ops = [eq, le, lt], + thms = [("trans", trans), ("refl", refl), ("eqD1", eqD1), ("eqD2", eqD2), + ("antisym", antisym), ("contr", contr)], + conv_thms = [("less_le", less_le), ("nless_le", nless_le), ("nle_le", nle_le)], + raw_proc = Base_Tac.tac Order_Procedure.lo_contr_prf + } + + (* Try to solve the goal by calling the order solver with each of the declared orders. *) + fun tac simp_prems ctxt = + let fun app_tac (octxt, tac0) = CHANGED o tac0 octxt simp_prems ctxt + in FIRST' (map app_tac (Data.get (Context.Proof ctxt))) end +end
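The functors above are logic-agnostic: an object logic supplies a structure matching LOGIC_SIGNATURE plus a list of excluded types, and obtains a tactic via Base_Order_Tac and Order_Tac. The following minimal sketch (not part of the patch) shows such an instantiation inside an ML block of a theory; the structure name Example_Logic_Sig and the HOL facts and HOLogic constants on the right-hand sides are illustrative assumptions about the object logic, not taken from this diff.

(* Sketch: instantiating the order tactic for a HOL-like object logic. *)
structure Example_Logic_Sig : LOGIC_SIGNATURE =
struct
  val mk_Trueprop = HOLogic.mk_Trueprop
  val dest_Trueprop = HOLogic.dest_Trueprop
  val Trueprop_conv = HOLogic.Trueprop_conv
  val Not = HOLogic.Not
  val conj = HOLogic.conj
  val disj = HOLogic.disj

  val notI = @{thm notI}
  val ccontr = @{thm ccontr}
  val conjI = @{thm conjI}
  val conjE = @{thm conjE}
  val disjE = @{thm disjE}

  (* Each conversion rewrites with a meta-equality of the statement documented in
     LOGIC_SIGNATURE, here obtained from the corresponding HOL facts. *)
  val not_not_conv = Conv.rewr_conv @{thm eq_reflection[OF not_not]}
  val de_Morgan_conj_conv = Conv.rewr_conv @{thm eq_reflection[OF de_Morgan_conj]}
  val de_Morgan_disj_conv = Conv.rewr_conv @{thm eq_reflection[OF de_Morgan_disj]}
  val conj_disj_distribL_conv = Conv.rewr_conv @{thm eq_reflection[OF conj_disj_distribL]}
  val conj_disj_distribR_conv = Conv.rewr_conv @{thm eq_reflection[OF conj_disj_distribR]}
end

(* Excluded types are those on which the solver should not fire, e.g. because a
   stronger arithmetic procedure already handles them. *)
structure Example_Base_Tac =
  Base_Order_Tac(structure Logic_Sig = Example_Logic_Sig; val excluded_types = [@{typ nat}])

structure Example_Order_Tac = Order_Tac(structure Base_Tac = Example_Base_Tac)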
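A concrete partial or linear order is then registered from a theory through declare_order or declare_linorder, whose record fields and expected theorem statements are documented above. The sketch below (again not part of the patch) registers a hypothetical partial order: Example_Order_Tac is the functor instance from the previous sketch, notE is the usual HOL elimination rule matching the contr field, and the ex_* names stand for lemmas the client theory is assumed to have proved with exactly the statements repeated in the comments.

local_setup \<open>
  Example_Order_Tac.declare_order {
    ops = {eq = @{term "(=) :: 'a :: order \<Rightarrow> 'a \<Rightarrow> bool"},
           le = @{term "(\<le>) :: 'a :: order \<Rightarrow> 'a \<Rightarrow> bool"},
           lt = @{term "(<) :: 'a :: order \<Rightarrow> 'a \<Rightarrow> bool"}},
    thms = {trans = @{thm ex_trans},      (* x \<le> y \<Longrightarrow> y \<le> z \<Longrightarrow> x \<le> z *)
            refl = @{thm ex_refl},        (* x \<le> x *)
            eqD1 = @{thm ex_eqD1},        (* x = y \<Longrightarrow> x \<le> y *)
            eqD2 = @{thm ex_eqD2},        (* x = y \<Longrightarrow> y \<le> x *)
            antisym = @{thm ex_antisym},  (* x \<le> y \<Longrightarrow> y \<le> x \<Longrightarrow> x = y *)
            contr = @{thm notE}},         (* \<not> P \<Longrightarrow> P \<Longrightarrow> R *)
    conv_thms = {less_le = @{thm ex_less_le},    (* x < y \<equiv> x \<le> y \<and> x \<noteq> y *)
                 nless_le = @{thm ex_nless_le}}  (* \<not> a < b \<equiv> \<not> a \<le> b \<or> a = b *)
  }
\<close>

Once registered, the solver is invoked through Example_Order_Tac.tac (typically wrapped in a proof method), and its behaviour can be tuned with the configuration options declared in order_tac.ML, e.g. using [[order_trace = true, order_split_limit = 4]] inside a proof.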