diff --git a/metadata/metadata b/metadata/metadata --- a/metadata/metadata +++ b/metadata/metadata @@ -1,10192 +1,10239 @@ [Arith_Prog_Rel_Primes] title = Arithmetic progressions and relative primes author = José Manuel Rodríguez Caballero topic = Mathematics/Number theory date = 2020-02-01 notify = jose.manuel.rodriguez.caballero@ut.ee abstract = This article provides a formalization of the solution obtained by the author of the Problem “ARITHMETIC PROGRESSIONS” from the Putnam exam problems of 2002. The statement of the problem is as follows: For which integers n > 1 does the set of positive integers less than and relatively prime to n constitute an arithmetic progression? [Banach_Steinhaus] title = Banach-Steinhaus Theorem author = Dominique Unruh , Jose Manuel Rodriguez Caballero topic = Mathematics/Analysis date = 2020-05-02 notify = jose.manuel.rodriguez.caballero@ut.ee, unruh@ut.ee abstract = We formalize in Isabelle/HOL a result due to S. Banach and H. Steinhaus known as the Banach-Steinhaus theorem or Uniform boundedness principle: a pointwise-bounded family of continuous linear operators from a Banach space to a normed space is uniformly bounded. Our approach is an adaptation to Isabelle/HOL of a proof due to A. Sokal. [Complex_Geometry] title = Complex Geometry author = Filip Marić , Danijela Simić topic = Mathematics/Geometry date = 2019-12-16 notify = danijela@matf.bg.ac.rs, filip@matf.bg.ac.rs, boutry@unistra.fr abstract = A formalization of geometry of complex numbers is presented. Fundamental objects that are investigated are the complex plane extended by a single infinite point, its objects (points, lines and circles), and groups of transformations that act on them (e.g., inversions and Möbius transformations). Most objects are defined algebraically, but correspondence with classical geometric definitions is shown. [Poincare_Disc] title = Poincaré Disc Model author = Danijela Simić , Filip Marić , Pierre Boutry topic = Mathematics/Geometry date = 2019-12-16 notify = danijela@matf.bg.ac.rs, filip@matf.bg.ac.rs, boutry@unistra.fr abstract = We describe formalization of the Poincaré disc model of hyperbolic geometry within the Isabelle/HOL proof assistant. The model is defined within the extended complex plane (one dimensional complex projectives space ℂP1), formalized in the AFP entry “Complex Geometry”. Points, lines, congruence of pairs of points, betweenness of triples of points, circles, and isometries are defined within the model. It is shown that the model satisfies all Tarski's axioms except the Euclid's axiom. It is shown that it satisfies its negation and the limiting parallels axiom (which proves it to be a model of hyperbolic geometry). [Fourier] title = Fourier Series author = Lawrence C Paulson topic = Mathematics/Analysis date = 2019-09-06 notify = lp15@cam.ac.uk abstract = This development formalises the square integrable functions over the reals and the basics of Fourier series. It culminates with a proof that every well-behaved periodic function can be approximated by a Fourier series. The material is ported from HOL Light: https://github.com/jrh13/hol-light/blob/master/100/fourier.ml [Generic_Deriving] title = Deriving generic class instances for datatypes author = Jonas Rädle , Lars Hupel topic = Computer science/Data structures date = 2018-11-06 notify = jonas.raedle@gmail.com abstract =

We provide a framework for automatically deriving instances for generic type classes. Our approach is inspired by Haskell's generic-deriving package and Scala's shapeless library. In addition to generating the code for type class functions, we also attempt to automatically prove type class laws for these instances. As of now, however, some manual proofs are still required for recursive datatypes.

Note: There are already articles in the AFP that provide automatic instantiation for a number of classes. Concretely, Deriving allows the automatic instantiation of comparators, linear orders, equality, and hashing. Show instantiates a Haskell-style show class.

Our approach works for arbitrary classes (with some Isabelle/HOL overhead for each class), but only for a smaller set of datatypes.

[Partial_Order_Reduction] title = Partial Order Reduction author = Julian Brunner topic = Computer science/Automata and formal languages date = 2018-06-05 notify = brunnerj@in.tum.de abstract = This entry provides a formalization of the abstract theory of ample set partial order reduction. The formalization includes transition systems with actions, trace theory, as well as basics on finite, infinite, and lazy sequences. We also provide a basic framework for static analysis on concurrent systems with respect to the ample set condition. [CakeML] title = CakeML author = Lars Hupel , Yu Zhang <> contributors = Johannes Åman Pohjola <> topic = Computer science/Programming languages/Language definitions date = 2018-03-12 notify = hupel@in.tum.de abstract = CakeML is a functional programming language with a proven-correct compiler and runtime system. This entry contains an unofficial version of the CakeML semantics that has been exported from the Lem specifications to Isabelle. Additionally, there are some hand-written theory files that adapt the exported code to Isabelle and port proofs from the HOL4 formalization, e.g. termination and equivalence proofs. [CakeML_Codegen] title = A Verified Code Generator from Isabelle/HOL to CakeML author = Lars Hupel topic = Computer science/Programming languages/Compiling, Logic/Rewriting date = 2019-07-08 notify = lars@hupel.info abstract = This entry contains the formalization that accompanies my PhD thesis (see https://lars.hupel.info/research/codegen/). I develop a verified compilation toolchain from executable specifications in Isabelle/HOL to CakeML abstract syntax trees. This improves over the state-of-the-art in Isabelle by providing a trustworthy procedure for code generation. [DiscretePricing] title = Pricing in discrete financial models author = Mnacho Echenim topic = Mathematics/Probability theory, Mathematics/Games and economics date = 2018-07-16 notify = mnacho.echenim@univ-grenoble-alpes.fr abstract = We have formalized the computation of fair prices for derivative products in discrete financial models. As an application, we derive a way to compute fair prices of derivative products in the Cox-Ross-Rubinstein model of a financial market, thus completing the work that was presented in this paper. extra-history = Change history: [2019-05-12]: Renamed discr_mkt predicate to stk_strict_subs and got rid of predicate A for a more natural definition of the type discrete_market; renamed basic quantity processes for coherent notation; renamed value_process into val_process and closing_value_process to cls_val_process; relaxed hypothesis of lemma CRR_market_fair_price. Added functions to price some basic options. (revision 0b813a1a833f)
[Pell] title = Pell's Equation author = Manuel Eberl topic = Mathematics/Number theory date = 2018-06-23 notify = eberlm@in.tum.de abstract =

This article gives the basic theory of Pell's equation x² = 1 + Dy², where D ∈ ℕ is a parameter and x, y are integer variables.

The main result that is proven is the following: If D is not a perfect square, then there exists a fundamental solution (x₀, y₀) that is not the trivial solution (1, 0) and which generates all other solutions (x, y) in the sense that there exists some n ∈ ℕ such that |x| + |y| √D = (x₀ + y₀ √D)ⁿ. This also implies that the set of solutions is infinite, and it gives us an explicit and executable characterisation of all the solutions.

Based on this, simple executable algorithms for computing the fundamental solution and the infinite sequence of all non-negative solutions are also provided.
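
For illustration only (this sketch is not part of the formalisation, and the helper name pell_solutions is made up here), the generation law above can be rendered as a small Python program that brute-forces the fundamental solution and then repeatedly multiplies by x₀ + y₀ √D to enumerate further non-negative solutions:

  # Illustrative sketch; the verified, executable algorithms live in the Isabelle entry.
  import math

  def pell_solutions(D, count=5):
      # find the fundamental solution by brute force (assumes D is not a perfect square)
      y0 = 1
      while True:
          x0 = math.isqrt(1 + D * y0 * y0)
          if x0 * x0 == 1 + D * y0 * y0:
              break
          y0 += 1
      x, y, sols = 1, 0, []          # start from the trivial solution (1, 0)
      for _ in range(count):
          sols.append((x, y))
          # multiply (x + y sqrt(D)) by (x0 + y0 sqrt(D))
          x, y = x0 * x + D * y0 * y, x0 * y + y0 * x
      return sols

  print(pell_solutions(2))           # [(1, 0), (3, 2), (17, 12), (99, 70), (577, 408)]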

[WebAssembly] title = WebAssembly author = Conrad Watt topic = Computer science/Programming languages/Language definitions date = 2018-04-29 notify = caw77@cam.ac.uk abstract = This is a mechanised specification of the WebAssembly language, drawn mainly from the previously published paper formalisation of Haas et al. Also included is a full proof of soundness of the type system, together with a verified type checker and interpreter. We include only a partial procedure for the extraction of the type checker and interpreter here. For more details, please see our paper in CPP 2018. [Knuth_Morris_Pratt] title = The string search algorithm by Knuth, Morris and Pratt author = Fabian Hellauer , Peter Lammich topic = Computer science/Algorithms date = 2017-12-18 notify = hellauer@in.tum.de, lammich@in.tum.de abstract = The Knuth-Morris-Pratt algorithm is often used to show that the problem of finding a string s in a text t can be solved deterministically in O(|s| + |t|) time. We use the Isabelle Refinement Framework to formulate and verify the algorithm. Via refinement, we apply some optimisations and finally use the Sepref tool to obtain executable code in Imperative/HOL. [Minkowskis_Theorem] title = Minkowski's Theorem author = Manuel Eberl topic = Mathematics/Geometry, Mathematics/Number theory date = 2017-07-13 notify = eberlm@in.tum.de abstract =

Minkowski's theorem relates a subset of ℝⁿ, the Lebesgue measure, and the integer lattice ℤⁿ: It states that any convex subset of ℝⁿ that is symmetric about the origin and has volume greater than 2ⁿ contains at least one lattice point from ℤⁿ\{0}, i. e. a non-zero point with integer coefficients.

A related theorem which directly implies this is Blichfeldt's theorem, which states that any subset of ℝⁿ with a volume greater than 1 contains two different points whose difference vector has integer components.

The entry contains a proof of both theorems.
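
In conventional notation, the two results read roughly as follows (a paraphrase, not the exact Isabelle statements):

  \[ C \subseteq \mathbb{R}^n \text{ convex and symmetric about } 0,\; \mathrm{vol}(C) > 2^n \;\Longrightarrow\; C \cap (\mathbb{Z}^n \setminus \{0\}) \neq \emptyset \]
  \[ S \subseteq \mathbb{R}^n \text{ measurable},\; \mathrm{vol}(S) > 1 \;\Longrightarrow\; \exists\, x, y \in S.\; x \neq y \,\wedge\, x - y \in \mathbb{Z}^n \]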

[Name_Carrying_Type_Inference] title = Verified Metatheory and Type Inference for a Name-Carrying Simply-Typed Lambda Calculus author = Michael Rawson topic = Computer science/Programming languages/Type systems date = 2017-07-09 notify = mr644@cam.ac.uk, michaelrawson76@gmail.com abstract = I formalise a Church-style simply-typed \(\lambda\)-calculus, extended with pairs, a unit value, and projection functions, and show some metatheory of the calculus, such as the subject reduction property. Particular attention is paid to the treatment of names in the calculus. A nominal style of binding is used, but I use a manual approach over Nominal Isabelle in order to extract an executable type inference algorithm. More information can be found in my undergraduate dissertation. [Propositional_Proof_Systems] title = Propositional Proof Systems author = Julius Michaelis , Tobias Nipkow topic = Logic/Proof theory date = 2017-06-21 notify = maintainafpppt@liftm.de abstract = We formalize a range of proof systems for classical propositional logic (sequent calculus, natural deduction, Hilbert systems, resolution) and prove the most important meta-theoretic results about semantics and proofs: compactness, soundness, completeness, translations between proof systems, cut-elimination, interpolation and model existence. [Optics] title = Optics author = Simon Foster , Frank Zeyda topic = Computer science/Functional programming, Mathematics/Algebra date = 2017-05-25 notify = simon.foster@york.ac.uk abstract = Lenses provide an abstract interface for manipulating data types through spatially-separated views. They are defined abstractly in terms of two functions: get, which returns a value from the source type, and put, which updates the value. We mechanise the underlying theory of lenses, in terms of an algebraic hierarchy of lenses, including well-behaved and very well-behaved lenses, each lens class being characterised by a set of lens laws. We also mechanise a lens algebra in Isabelle that enables their composition and comparison, so as to allow construction of complex lenses. This is accompanied by a large library of algebraic laws. Moreover we also show how the lens classes can be applied by instantiating them with a number of Isabelle data types. extra-history = Change history: [2020-03-02]: Added partial bijective and symmetric lenses. Improved alphabet command generating additional lenses and results. Several additional lens relations, including observational equivalence. Additional theorems throughout. Adaptations for Isabelle 2020. (revision 44e2e5c) [2021-01-27] Addition of new theorems throughout, particularly for prisms. New "chantype" command allows the definition of an algebraic datatype with generated prisms. New "dataspace" command allows the definition of a local-based state space, including lenses and prisms. Addition of various examples for the above. (revision 89cf045a) [Game_Based_Crypto] title = Game-based cryptography in HOL author = Andreas Lochbihler , S. Reza Sefidgar <>, Bhargav Bhatt topic = Computer science/Security/Cryptography date = 2017-05-05 notify = mail@andreas-lochbihler.de abstract =

In this AFP entry, we show how to specify game-based cryptographic security notions and formally prove secure several cryptographic constructions from the literature using the CryptHOL framework. Among others, we formalise the notions of a random oracle, a pseudo-random function, an unpredictable function, and of encryption schemes that are indistinguishable under chosen plaintext and/or ciphertext attacks. We prove the random-permutation/random-function switching lemma, security of the Elgamal and hashed Elgamal public-key encryption scheme and correctness and security of several constructions with pseudo-random functions.

Our proofs follow the game-hopping style advocated by Shoup and Bellare and Rogaway, from which most of the examples have been taken. We generalise some of their results such that they can be reused in other proofs. Thanks to CryptHOL's integration with Isabelle's parametricity infrastructure, many simple hops are easily justified using the theory of representation independence.

extra-history = Change history: [2018-09-28]: added the CryptHOL tutorial for game-based cryptography (revision 489a395764ae) [Multi_Party_Computation] title = Multi-Party Computation author = David Aspinall , David Butler topic = Computer science/Security date = 2019-05-09 notify = dbutler@turing.ac.uk abstract = We use CryptHOL to consider Multi-Party Computation (MPC) protocols. MPC was first considered by Yao in 1983 and recent advances in efficiency and an increased demand mean it is now deployed in the real world. Security is considered using the real/ideal world paradigm. We first define security in the semi-honest security setting where parties are assumed not to deviate from the protocol transcript. In this setting we prove multiple Oblivious Transfer (OT) protocols secure and then show security for the gates of the GMW protocol. We then define malicious security, this is a stronger notion of security where parties are assumed to be fully corrupted by an adversary. In this setting we again consider OT, as it is a fundamental building block of almost all MPC protocols. [Sigma_Commit_Crypto] title = Sigma Protocols and Commitment Schemes author = David Butler , Andreas Lochbihler topic = Computer science/Security/Cryptography date = 2019-10-07 notify = dbutler@turing.ac.uk abstract = We use CryptHOL to formalise commitment schemes and Sigma-protocols. Both are widely used fundamental two party cryptographic primitives. Security for commitment schemes is considered using game-based definitions whereas the security of Sigma-protocols is considered using both the game-based and simulation-based security paradigms. In this work, we first define security for both primitives and then prove secure multiple case studies: the Schnorr, Chaum-Pedersen and Okamoto Sigma-protocols as well as a construction that allows for compound (AND and OR statements) Sigma-protocols and the Pedersen and Rivest commitment schemes. We also prove that commitment schemes can be constructed from Sigma-protocols. We formalise this proof at an abstract level, only assuming the existence of a Sigma-protocol; consequently, the instantiations of this result for the concrete Sigma-protocols we consider come for free. [CryptHOL] title = CryptHOL author = Andreas Lochbihler topic = Computer science/Security/Cryptography, Computer science/Functional programming, Mathematics/Probability theory date = 2017-05-05 notify = mail@andreas-lochbihler.de abstract =

CryptHOL provides a framework for formalising cryptographic arguments in Isabelle/HOL. It shallowly embeds a probabilistic functional programming language in higher order logic. The language features monadic sequencing, recursion, random sampling, failures and failure handling, and black-box access to oracles. Oracles are probabilistic functions which maintain hidden state between different invocations. All operators are defined in the new semantic domain of generative probabilistic values, a codatatype. We derive proof rules for the operators and establish a connection with the theory of relational parametricity. Thus, the resulting proofs are trustworthy and comprehensible, and the framework is extensible and widely applicable.

The framework is used in the accompanying AFP entry "Game-based Cryptography in HOL". There, we show-case our framework by formalizing different game-based proofs from the literature. This formalisation continues the work described in the author's ESOP 2016 paper.

[Constructive_Cryptography] title = Constructive Cryptography in HOL author = Andreas Lochbihler , S. Reza Sefidgar<> topic = Computer science/Security/Cryptography, Mathematics/Probability theory date = 2018-12-17 notify = mail@andreas-lochbihler.de, reza.sefidgar@inf.ethz.ch abstract = Inspired by Abstract Cryptography, we extend CryptHOL, a framework for formalizing game-based proofs, with an abstract model of Random Systems and provide proof rules about their composition and equality. This foundation facilitates the formalization of Constructive Cryptography proofs, where the security of a cryptographic scheme is realized as a special form of construction in which a complex random system is built from simpler ones. This is a first step towards a fully-featured compositional framework, similar to Universal Composability framework, that supports formalization of simulation-based proofs. [Probabilistic_While] title = Probabilistic while loop author = Andreas Lochbihler topic = Computer science/Functional programming, Mathematics/Probability theory, Computer science/Algorithms date = 2017-05-05 notify = mail@andreas-lochbihler.de abstract = This AFP entry defines a probabilistic while operator based on sub-probability mass functions and formalises zero-one laws and variant rules for probabilistic loop termination. As applications, we implement probabilistic algorithms for the Bernoulli, geometric and arbitrary uniform distributions that only use fair coin flips, and prove them correct and terminating with probability 1. extra-history = Change history: [2018-02-02]: Added a proof that probabilistic conditioning can be implemented by repeated sampling. (revision 305867c4e911)
[Monad_Normalisation] title = Monad normalisation author = Joshua Schneider <>, Manuel Eberl , Andreas Lochbihler topic = Tools, Computer science/Functional programming, Logic/Rewriting date = 2017-05-05 notify = mail@andreas-lochbihler.de abstract = The usual monad laws can directly be used as rewrite rules for Isabelle’s simplifier to normalise monadic HOL terms and decide equivalences. In a commutative monad, however, the commutativity law is a higher-order permutative rewrite rule that makes the simplifier loop. This AFP entry implements a simproc that normalises monadic expressions in commutative monads using ordered rewriting. The simproc can also permute computations across control operators like if and case. [Monomorphic_Monad] title = Effect polymorphism in higher-order logic author = Andreas Lochbihler topic = Computer science/Functional programming date = 2017-05-05 notify = mail@andreas-lochbihler.de abstract = The notion of a monad cannot be expressed within higher-order logic (HOL) due to type system restrictions. We show that if a monad is used with values of only one type, this notion can be formalised in HOL. Based on this idea, we develop a library of effect specifications and implementations of monads and monad transformers. Hence, we can abstract over the concrete monad in HOL definitions and thus use the same definition for different (combinations of) effects. We illustrate the usefulness of effect polymorphism with a monadic interpreter for a simple language. extra-history = Change history: [2018-02-15]: added further specifications and implementations of non-determinism; more examples (revision bc5399eea78e)
[Constructor_Funs] title = Constructor Functions author = Lars Hupel topic = Tools date = 2017-04-19 notify = hupel@in.tum.de abstract = Isabelle's code generator performs various adaptations for target languages. Among others, constructor applications have to be fully saturated. That means that for constructor calls occurring as arguments to higher-order functions, synthetic lambdas have to be inserted. This entry provides tooling to avoid this construction altogether by introducing constructor functions. [Lazy_Case] title = Lazifying case constants author = Lars Hupel topic = Tools date = 2017-04-18 notify = hupel@in.tum.de abstract = Isabelle's code generator performs various adaptations for target languages. Among others, case statements are printed as match expressions. Internally, this is a sophisticated procedure, because in HOL, case statements are represented as nested calls to the case combinators as generated by the datatype package. Furthermore, the procedure relies on laziness of match expressions in the target language, i.e., that branches guarded by patterns that fail to match are not evaluated. Similarly, if-then-else is printed to the corresponding construct in the target language. This entry provides tooling to replace these special cases in the code generator by ignoring these target language features, instead printing case expressions and if-then-else as functions. [Dict_Construction] title = Dictionary Construction author = Lars Hupel topic = Tools date = 2017-05-24 notify = hupel@in.tum.de abstract = Isabelle's code generator natively supports type classes. For targets that do not have language support for classes and instances, it performs the well-known dictionary translation, as described by Haftmann and Nipkow. This translation happens outside the logic, i.e., there is no guarantee that it is correct, besides the pen-and-paper proof. This work implements a certified dictionary translation that produces new class-free constants and derives equality theorems. [Higher_Order_Terms] title = An Algebra for Higher-Order Terms author = Lars Hupel contributors = Yu Zhang <> topic = Computer science/Programming languages/Lambda calculi date = 2019-01-15 notify = lars@hupel.info abstract = In this formalization, I introduce a higher-order term algebra, generalizing the notions of free variables, matching, and substitution. The need arose from the work on a verified compiler from Isabelle to CakeML. Terms can be thought of as consisting of a generic (free variables, constants, application) and a specific part. As example applications, this entry provides instantiations for de-Bruijn terms, terms with named variables, and Blanchette’s λ-free higher-order terms. Furthermore, I implement translation functions between de-Bruijn terms and named terms and prove their correctness. [Subresultants] title = Subresultants author = Sebastiaan Joosten , René Thiemann , Akihisa Yamada topic = Mathematics/Algebra date = 2017-04-06 notify = rene.thiemann@uibk.ac.at abstract = We formalize the theory of subresultants and the subresultant polynomial remainder sequence as described by Brown and Traub. As a result, we obtain efficient certified algorithms for computing the resultant and the greatest common divisor of polynomials. [Comparison_Sort_Lower_Bound] title = Lower bound on comparison-based sorting algorithms author = Manuel Eberl topic = Computer science/Algorithms date = 2017-03-15 notify = eberlm@in.tum.de abstract =

This article contains a formal proof of the well-known fact that the number of comparisons that a comparison-based sorting algorithm needs to perform to sort a list of length n is at least log₂(n!) in the worst case, i. e. Ω(n log n).

For this purpose, a shallow embedding for comparison-based sorting algorithms is defined: a sorting algorithm is a recursive datatype containing either a HOL function or a query of a comparison oracle with a continuation containing the remaining computation. This makes it possible to force the algorithm to use only comparisons and to track the number of comparisons made.
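
As a purely numerical illustration (not part of the entry), the information-theoretic lower bound ⌈log₂(n!)⌉ can be tabulated against n log₂ n for a few list lengths:

  # Illustrative only; the entry proves the log2(n!) bound formally.
  import math

  for n in [4, 16, 64, 256]:
      worst_case = math.ceil(math.log2(math.factorial(n)))   # minimum comparisons in the worst case
      print(n, worst_case, round(n * math.log2(n)))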

[Quick_Sort_Cost] title = The number of comparisons in QuickSort author = Manuel Eberl topic = Computer science/Algorithms date = 2017-03-15 notify = eberlm@in.tum.de abstract =

We give a formal proof of the well-known results about the number of comparisons performed by two variants of QuickSort: first, the expected number of comparisons of randomised QuickSort (i. e. QuickSort with random pivot choice) is 2 (n+1) Hₙ - 4 n, which is asymptotically equivalent to 2 n ln n; second, the number of comparisons performed by the classic non-randomised QuickSort has the same distribution in the average case as the randomised one.
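
For illustration (a sketch, not the formal development; the function name qs_comparisons is invented here), the closed form 2 (n+1) Hₙ - 4 n can be checked empirically against a simulation of randomised QuickSort:

  # Sketch: compare the expected-comparison formula with a Monte Carlo estimate.
  import random

  def qs_comparisons(xs):
      if len(xs) <= 1:
          return 0
      pivot = random.choice(xs)
      smaller = [x for x in xs if x < pivot]
      larger = [x for x in xs if x > pivot]
      # each element other than the pivot is compared with the pivot once
      return len(xs) - 1 + qs_comparisons(smaller) + qs_comparisons(larger)

  n = 100
  harmonic = sum(1 / k for k in range(1, n + 1))
  closed_form = 2 * (n + 1) * harmonic - 4 * n
  estimate = sum(qs_comparisons(list(range(n))) for _ in range(2000)) / 2000
  print(closed_form, estimate)       # the two numbers should be close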

[Random_BSTs] title = Expected Shape of Random Binary Search Trees author = Manuel Eberl topic = Computer science/Data structures date = 2017-04-04 notify = eberlm@in.tum.de abstract =

This entry contains proofs for the textbook results about the distributions of the height and internal path length of random binary search trees (BSTs), i. e. BSTs that are formed by taking an empty BST and inserting elements from a fixed set in random order.

In particular, we prove a logarithmic upper bound on the expected height and the Θ(n log n) closed-form solution for the expected internal path length in terms of the harmonic numbers. We also show how the internal path length relates to the average-case cost of a lookup in a BST.
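
The following Python sketch (illustrative only; the conventions used in the entry may differ, e.g. whether the root counts with depth 0) builds a BST from a random insertion order and reports its height and internal path length:

  # Sketch: height and internal path length of a BST built from a random permutation.
  import random

  def insert(tree, x):               # unbalanced BST insertion; a tree is None or (key, left, right)
      if tree is None:
          return (x, None, None)
      key, left, right = tree
      return (key, insert(left, x), right) if x < key else (key, left, insert(right, x))

  def height(tree):
      return 0 if tree is None else 1 + max(height(tree[1]), height(tree[2]))

  def ipl(tree, depth=0):            # internal path length: sum of the depths of all nodes
      return 0 if tree is None else depth + ipl(tree[1], depth + 1) + ipl(tree[2], depth + 1)

  keys = list(range(200))
  random.shuffle(keys)
  t = None
  for k in keys:
      t = insert(t, k)
  print(height(t), ipl(t))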

[Randomised_BSTs] title = Randomised Binary Search Trees author = Manuel Eberl topic = Computer science/Data structures date = 2018-10-19 notify = eberlm@in.tum.de abstract =

This work is a formalisation of the Randomised Binary Search Trees introduced by Martínez and Roura, including definitions and correctness proofs.

Like randomised treaps, they are a probabilistic data structure that behaves exactly as if elements were inserted into a non-balancing BST in random order. However, unlike treaps, they only use discrete probability distributions, but their use of randomness is more complicated.
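
A rough Python sketch of the Martínez–Roura insertion idea follows (illustrative only, with invented helper names; the Isabelle development is the authoritative version): a new key becomes the root of a subtree of size n with probability 1/(n+1), otherwise insertion recurses into the appropriate child.

  # Sketch: randomised BST insertion; a tree is None or (key, left, right, size).
  import random

  def size(t):
      return 0 if t is None else t[3]

  def node(k, l, r):
      return (k, l, r, 1 + size(l) + size(r))

  def split(t, x):                   # split into keys below and above x (x assumed absent)
      if t is None:
          return None, None
      k, l, r, _ = t
      if x < k:
          ll, lr = split(l, x)
          return ll, node(k, lr, r)
      rl, rr = split(r, x)
      return node(k, l, rl), rr

  def insert(t, x):
      if random.randrange(size(t) + 1) == 0:   # probability 1/(size+1): x becomes the root
          l, r = split(t, x)
          return node(x, l, r)
      k, l, r, _ = t
      return node(k, insert(l, x), r) if x < k else node(k, l, insert(r, x))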

[E_Transcendental] title = The Transcendence of e author = Manuel Eberl topic = Mathematics/Analysis, Mathematics/Number theory date = 2017-01-12 notify = eberlm@in.tum.de abstract =

This work contains a proof that Euler's number e is transcendental. The proof follows the standard approach of assuming that e is algebraic and then using a specific integer polynomial to derive two inconsistent bounds, leading to a contradiction.

This kind of approach can be found in many different sources; this formalisation mostly follows a PlanetMath article by Roger Lipsett.

[Pi_Transcendental] title = The Transcendence of π author = Manuel Eberl topic = Mathematics/Number theory date = 2018-09-28 notify = eberlm@in.tum.de abstract =

This entry shows the transcendence of π based on the classic proof using the fundamental theorem of symmetric polynomials first given by von Lindemann in 1882, but the formalisation mostly follows the version by Niven. The proof reuses much of the machinery developed in the AFP entry on the transcendence of e.

[DFS_Framework] title = A Framework for Verifying Depth-First Search Algorithms author = Peter Lammich , René Neumann notify = lammich@in.tum.de date = 2016-07-05 topic = Computer science/Algorithms/Graph abstract =

This entry presents a framework for the modular verification of DFS-based algorithms, which is described in our [CPP-2015] paper. It provides a generic DFS algorithm framework that can be parameterized with user-defined actions on certain events (e.g. the discovery of a new node). It comes with an extensible library of invariants, which can be used to derive invariants of a specific parameterization. Using refinement techniques, efficient implementations of the algorithms can easily be derived. Here, the framework comes with templates for a recursive and a tail-recursive implementation, and also with several templates for implementing the data structures required by the DFS algorithm. Finally, this entry contains a set of re-usable DFS-based algorithms, which illustrate the application of the framework.

[CPP-2015] Peter Lammich, René Neumann: A Framework for Verifying Depth-First Search Algorithms. CPP 2015: 137-146
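
To convey the flavour of such a parameterisation (a Python sketch only, unrelated to the actual interface of the Isabelle framework), a DFS can be extended with user-supplied hooks for events such as the discovery of a new node:

  # Sketch: DFS parameterised by event callbacks.
  def dfs(graph, start, on_discover=lambda v: None, on_finish=lambda v: None):
      visited = set()
      def visit(v):
          visited.add(v)
          on_discover(v)             # event: a new node has been discovered
          for w in graph.get(v, []):
              if w not in visited:
                  visit(w)
          on_finish(v)               # event: all successors of v have been processed
      visit(start)
      return visited

  order = []
  dfs({1: [2, 3], 2: [3], 3: []}, 1, on_discover=order.append)
  print(order)                       # [1, 2, 3]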

[Flow_Networks] title = Flow Networks and the Min-Cut-Max-Flow Theorem author = Peter Lammich , S. Reza Sefidgar <> topic = Mathematics/Graph theory date = 2017-06-01 notify = lammich@in.tum.de abstract = We present a formalization of flow networks and the Min-Cut-Max-Flow theorem. Our formal proof closely follows a standard textbook proof, and is accessible even without being an expert in Isabelle/HOL, the interactive theorem prover used for the formalization. [Prpu_Maxflow] title = Formalizing Push-Relabel Algorithms author = Peter Lammich , S. Reza Sefidgar <> topic = Computer science/Algorithms/Graph, Mathematics/Graph theory date = 2017-06-01 notify = lammich@in.tum.de abstract = We present a formalization of push-relabel algorithms for computing the maximum flow in a network. We start with Goldberg et al.'s generic push-relabel algorithm, for which we show correctness and the time complexity bound of O(V^2E). We then derive the relabel-to-front and FIFO implementation. Using stepwise refinement techniques, we derive an efficient verified implementation. Our formal proof of the abstract algorithms closely follows a standard textbook proof. It is accessible even without being an expert in Isabelle/HOL, the interactive theorem prover used for the formalization. [Buildings] title = Chamber Complexes, Coxeter Systems, and Buildings author = Jeremy Sylvestre notify = jeremy.sylvestre@ualberta.ca date = 2016-07-01 topic = Mathematics/Algebra, Mathematics/Geometry abstract = We provide a basic formal framework for the theory of chamber complexes and Coxeter systems, and for buildings as thick chamber complexes endowed with a system of apartments. Along the way, we develop some of the general theory of abstract simplicial complexes and of groups (relying on the group_add class for the basics), including free groups and group presentations, and their universal properties. The main results verified are that the deletion condition is both necessary and sufficient for a group with a set of generators of order two to be a Coxeter system, and that the apartments in a (thick) building are all uniformly Coxeter. [Algebraic_VCs] title = Program Construction and Verification Components Based on Kleene Algebra author = Victor B. F. Gomes , Georg Struth notify = victor.gomes@cl.cam.ac.uk, g.struth@sheffield.ac.uk date = 2016-06-18 topic = Mathematics/Algebra abstract = Variants of Kleene algebra support program construction and verification by algebraic reasoning. This entry provides a verification component for Hoare logic based on Kleene algebra with tests, verification components for weakest preconditions and strongest postconditions based on Kleene algebra with domain and a component for step-wise refinement based on refinement Kleene algebra with tests. In addition to these components for the partial correctness of while programs, a verification component for total correctness based on divergence Kleene algebras and one for partial correctness of recursive programs based on domain quantales are provided. Finally, we have integrated memory models for programs with pointers and a program trace semantics into the weakest precondition component.
[C2KA_DistributedSystems] title = Communicating Concurrent Kleene Algebra for Distributed Systems Specification author = Maxime Buyse , Jason Jaskolka topic = Computer science/Automata and formal languages, Mathematics/Algebra date = 2019-08-06 notify = maxime.buyse@polytechnique.edu, jason.jaskolka@carleton.ca abstract = Communicating Concurrent Kleene Algebra (C²KA) is a mathematical framework for capturing the communicating and concurrent behaviour of agents in distributed systems. It extends Hoare et al.'s Concurrent Kleene Algebra (CKA) with communication actions through the notions of stimuli and shared environments. C²KA has applications in studying system-level properties of distributed systems such as safety, security, and reliability. In this work, we formalize results about C²KA and its application for distributed systems specification. We first formalize the stimulus structure and behaviour structure (CKA). Next, we combine them to formalize C²KA and its properties. Then, we formalize notions and properties related to the topology of distributed systems and the potential for communication via stimuli and via shared environments of agents, all within the algebraic setting of C²KA. [Card_Equiv_Relations] title = Cardinality of Equivalence Relations author = Lukas Bulwahn notify = lukas.bulwahn@gmail.com date = 2016-05-24 topic = Mathematics/Combinatorics abstract = This entry provides formulae for counting the number of equivalence relations and partial equivalence relations over a finite carrier set with given cardinality. To count the number of equivalence relations, we provide bijections between equivalence relations and set partitions, and then transfer the main results of the two AFP entries, Cardinality of Set Partitions and Spivey's Generalized Recurrence for Bell Numbers, to theorems on equivalence relations. To count the number of partial equivalence relations, we observe that counting partial equivalence relations over a set A is equivalent to counting all equivalence relations over all subsets of the set A. From this observation and the results on equivalence relations, we show that the cardinality of partial equivalence relations over a finite set of cardinality n is equal to the n+1-th Bell number. [Twelvefold_Way] title = The Twelvefold Way author = Lukas Bulwahn topic = Mathematics/Combinatorics date = 2016-12-29 notify = lukas.bulwahn@gmail.com abstract = This entry provides all cardinality theorems of the Twelvefold Way. The Twelvefold Way systematically classifies twelve related combinatorial problems concerning two finite sets, which include counting permutations, combinations, multisets, set partitions and number partitions. This development builds upon the existing formal developments with cardinality theorems for those structures. It provides twelve bijections from the various structures to different equivalence classes on finite functions, and hence, proves cardinality formulae for these equivalence classes on finite functions. [Chord_Segments] title = Intersecting Chords Theorem author = Lukas Bulwahn notify = lukas.bulwahn@gmail.com date = 2016-10-11 topic = Mathematics/Geometry abstract = This entry provides a geometric proof of the intersecting chords theorem. The theorem states that when two chords intersect each other inside a circle, the products of their segments are equal. 
After a short review of existing proofs in the literature, I decided to use a proof approach that employs reasoning about lengths of line segments, the orthogonality of two lines and the Pythagoras Law. Hence, one can understand the formalized proof easily with the knowledge of a few general geometric facts that are commonly taught in high-school. This theorem is the 55th theorem of the Top 100 Theorems list. [Category3] title = Category Theory with Adjunctions and Limits author = Eugene W. Stark notify = stark@cs.stonybrook.edu date = 2016-06-26 topic = Mathematics/Category theory abstract =

This article attempts to develop a usable framework for doing category theory in Isabelle/HOL. Our point of view, which to some extent differs from that of the previous AFP articles on the subject, is to try to explore how category theory can be done efficaciously within HOL, rather than trying to match exactly the way things are done using a traditional approach. To this end, we define the notion of category in an "object-free" style, in which a category is represented by a single partial composition operation on arrows. This way of defining categories provides some advantages in the context of HOL, including the ability to avoid the use of records and the possibility of defining functors and natural transformations simply as certain functions on arrows, rather than as composite objects. We define various constructions associated with the basic notions, including: dual category, product category, functor category, discrete category, free category, functor composition, and horizontal and vertical composite of natural transformations. A "set category" locale is defined that axiomatizes the notion "category of all sets at a type and all functions between them," and a fairly extensive set of properties of set categories is derived from the locale assumptions. The notion of a set category is used to prove the Yoneda Lemma in a general setting of a category equipped with a "hom embedding," which maps arrows of the category to the "universe" of the set category. We also give a treatment of adjunctions, defining adjunctions via left and right adjoint functors, natural bijections between hom-sets, and unit and counit natural transformations, and showing the equivalence of these definitions. We also develop the theory of limits, including representations of functors, diagrams and cones, and diagonal functors. We show that right adjoint functors preserve limits, and that limits can be constructed via products and equalizers. We characterize the conditions under which limits exist in a set category. We also examine the case of limits in a functor category, ultimately culminating in a proof that the Yoneda embedding preserves limits.

Revisions made subsequent to the first version of this article added material on equivalence of categories, cartesian categories, categories with pullbacks, categories with finite limits, and cartesian closed categories. A construction was given of the category of hereditarily finite sets and functions between them, and it was shown that this category is cartesian closed.

extra-history = Change history: [2018-05-29]: Revised axioms for the category locale. Introduced notation for composition and "in hom". (revision 8318366d4575)
[2020-02-15]: Move ConcreteCategory.thy from Bicategory to Category3 and use it systematically. Make other minor improvements throughout. (revision a51840d36867)
[2020-07-10]: Added new material, mostly centered around cartesian categories. (revision 06640f317a79)
[2020-11-04]: Minor modifications and extensions made in conjunction with the addition of new material to Bicategory. (revision 472cb2268826)
[MonoidalCategory] title = Monoidal Categories author = Eugene W. Stark topic = Mathematics/Category theory date = 2017-05-04 notify = stark@cs.stonybrook.edu abstract =

Building on the formalization of basic category theory set out in the author's previous AFP article, the present article formalizes some basic aspects of the theory of monoidal categories. Among the notions defined here are monoidal category, monoidal functor, and equivalence of monoidal categories. The main theorems formalized are MacLane's coherence theorem and the constructions of the free monoidal category and free strict monoidal category generated by a given category. The coherence theorem is proved syntactically, using a structurally recursive approach to reduction of terms that might have some novel aspects. We also give proofs of some results given by Etingof et al, which may prove useful in a formal setting. In particular, we show that the left and right unitors need not be taken as given data in the definition of monoidal category, nor does the definition of monoidal functor need to take as given a specific isomorphism expressing the preservation of the unit object. Our definitions of monoidal category and monoidal functor are stated so as to take advantage of the economy afforded by these facts.

Revisions made subsequent to the first version of this article added material on cartesian monoidal categories; showing that the underlying category of a cartesian monoidal category is a cartesian category, and that every cartesian category extends to a cartesian monoidal category.

extra-history = Change history: [2017-05-18]: Integrated material from MonoidalCategory/Category3Adapter into Category3/ and deleted adapter. (revision 015543cdd069)
[2018-05-29]: Modifications required due to 'Category3' changes. Introduced notation for "in hom". (revision 8318366d4575)
[2020-02-15]: Cosmetic improvements. (revision a51840d36867)
[2020-07-10]: Added new material on cartesian monoidal categories. (revision 06640f317a79)
[Card_Multisets] title = Cardinality of Multisets author = Lukas Bulwahn notify = lukas.bulwahn@gmail.com date = 2016-06-26 topic = Mathematics/Combinatorics abstract =

This entry provides three lemmas to count the number of multisets of a given size and finite carrier set. The first lemma provides a cardinality formula assuming that the multiset's elements are chosen from the given carrier set. The latter two lemmas provide formulas assuming that the multiset's elements also cover the given carrier set, i.e., each element of the carrier set occurs in the multiset at least once.

The proof of the first lemma uses the argument of the recurrence relation for counting multisets. The proof of the second lemma is straightforward, and the proof of the third lemma is easily obtained using the first cardinality lemma. A challenge for the formalization is the derivation of the required induction rule, which is a special combination of the induction rules for finite sets and natural numbers. The induction rule is derived by defining a suitable inductive predicate and transforming the predicate's induction rule.
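
In conventional notation these counts correspond to the familiar stars-and-bars formulas (stated here as a paraphrase; the Isabelle lemmas may be phrased differently): for a carrier set of size n,

  \[ \#\{\text{multisets of size } k\} = \binom{n+k-1}{k}, \qquad \#\{\text{multisets of size } k \text{ covering the carrier}\} = \binom{k-1}{n-1}. \]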

[Posix-Lexing] title = POSIX Lexing with Derivatives of Regular Expressions author = Fahad Ausaf , Roy Dyckhoff , Christian Urban notify = christian.urban@kcl.ac.uk date = 2016-05-24 topic = Computer science/Automata and formal languages abstract = Brzozowski introduced the notion of derivatives for regular expressions. They can be used for a very simple regular expression matching algorithm. Sulzmann and Lu cleverly extended this algorithm in order to deal with POSIX matching, which is the underlying disambiguation strategy for regular expressions needed in lexers. In this entry we give our inductive definition of what a POSIX value is and show (i) that such a value is unique (for given regular expression and string being matched) and (ii) that Sulzmann and Lu's algorithm always generates such a value (provided that the regular expression matches the string). We also prove the correctness of an optimised version of the POSIX matching algorithm. [LocalLexing] title = Local Lexing author = Steven Obua topic = Computer science/Automata and formal languages date = 2017-04-28 notify = steven@recursivemind.com abstract = This formalisation accompanies the paper Local Lexing which introduces a novel parsing concept of the same name. The paper also gives a high-level algorithm for local lexing as an extension of Earley's algorithm. This formalisation proves the algorithm to be correct with respect to its local lexing semantics. As a special case, this formalisation thus also contains a proof of the correctness of Earley's algorithm. The paper contains a short outline of how this formalisation is organised. [MFMC_Countable] title = A Formal Proof of the Max-Flow Min-Cut Theorem for Countable Networks author = Andreas Lochbihler date = 2016-05-09 topic = Mathematics/Graph theory abstract = This article formalises a proof of the maximum-flow minimal-cut theorem for networks with countably many edges. A network is a directed graph with non-negative real-valued edge labels and two dedicated vertices, the source and the sink. A flow in a network assigns non-negative real numbers to the edges such that for all vertices except for the source and the sink, the sum of values on incoming edges equals the sum of values on outgoing edges. A cut is a subset of the vertices which contains the source, but not the sink. Our theorem states that in every network, there is a flow and a cut such that the flow saturates all the edges going out of the cut and is zero on all the incoming edges. The proof is based on the paper The Max-Flow Min-Cut theorem for countable networks by Aharoni et al. Additionally, we prove a characterisation of the lifting operation for relations on discrete probability distributions, which leads to a concise proof of its distributivity over relation composition. notify = mail@andreas-lochbihler.de extra-history = Change history: [2017-09-06]: derive characterisation for the lifting operations on discrete distributions from finite version of the max-flow min-cut theorem (revision a7a198f5bab0)
[2020-12-19]: simpler proof of linkability for bounded unhindered bipartite webs, leading to a simpler proof for networks with bounded out-capacities (revision 93ca33f4d915)
[Liouville_Numbers] title = Liouville numbers author = Manuel Eberl date = 2015-12-28 topic = Mathematics/Analysis, Mathematics/Number theory abstract =

Liouville numbers are a class of transcendental numbers that can be approximated particularly well with rational numbers. Historically, they were the first numbers whose transcendence was proven.

In this entry, we define the concept of Liouville numbers as well as the standard construction to obtain Liouville numbers (including Liouville's constant) and we prove their most important properties: irrationality and transcendence.

The proof is very elementary and requires only standard arithmetic, the Mean Value Theorem for polynomials, and the boundedness of polynomials on compact intervals.
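
For reference, the standard definition underlying the entry (paraphrased): a real number x is a Liouville number if for every positive integer n there exist integers p and q with q > 1 such that

  \[ 0 < \left| x - \frac{p}{q} \right| < \frac{1}{q^n}, \]

and Liouville's constant is \( \sum_{k \ge 1} 10^{-k!} \).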

notify = eberlm@in.tum.de [Triangle] title = Basic Geometric Properties of Triangles author = Manuel Eberl date = 2015-12-28 topic = Mathematics/Geometry abstract =

This entry contains a definition of angles between vectors and between three points. Building on this, we prove basic geometric properties of triangles, such as the Isosceles Triangle Theorem, the Law of Sines and the Law of Cosines, that the sum of the angles of a triangle is π, and the congruence theorems for triangles.

The definitions and proofs were developed following those by John Harrison in HOL Light. However, due to Isabelle's type class system, all definitions and theorems in the Isabelle formalisation hold for all real inner product spaces.
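
In conventional notation, for a triangle with side lengths a, b, c and opposite angles α, β, γ, the main statements read as follows (a paraphrase of the formalised results):

  \[ \frac{a}{\sin\alpha} = \frac{b}{\sin\beta} = \frac{c}{\sin\gamma}, \qquad c^2 = a^2 + b^2 - 2ab\cos\gamma, \qquad \alpha + \beta + \gamma = \pi. \]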

notify = eberlm@in.tum.de [Prime_Harmonic_Series] title = The Divergence of the Prime Harmonic Series author = Manuel Eberl date = 2015-12-28 topic = Mathematics/Number theory abstract =

In this work, we prove the lower bound ln(Hₙ) - ln(5/3) for the partial sum of the Prime Harmonic series and, based on this, the divergence of the Prime Harmonic Series ∑[p prime] 1/p.

The proof relies on the unique squarefree decomposition of natural numbers. This is similar to Euler's original proof (which was highly informal and morally questionable). Its advantage over proofs by contradiction, like the famous one by Paul Erdős, is that it provides a relatively good lower bound for the partial sums.
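
A quick numerical sanity check of the bound (illustrative only, not part of the formal proof; the exact indexing of the partial sum in the entry may differ slightly):

  # Compare the prime harmonic partial sum with the bound ln(H_n) - ln(5/3).
  import math

  def primes_up_to(n):
      sieve = [True] * (n + 1)
      sieve[0:2] = [False, False]
      for p in range(2, math.isqrt(n) + 1):
          if sieve[p]:
              sieve[p * p :: p] = [False] * len(sieve[p * p :: p])
      return [p for p in range(2, n + 1) if sieve[p]]

  n = 10000
  prime_sum = sum(1 / p for p in primes_up_to(n))
  H_n = sum(1 / k for k in range(1, n + 1))
  print(prime_sum, math.log(H_n) - math.log(5 / 3))   # the first value exceeds the second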

notify = eberlm@in.tum.de [Descartes_Sign_Rule] title = Descartes' Rule of Signs author = Manuel Eberl date = 2015-12-28 topic = Mathematics/Analysis abstract =

Descartes' Rule of Signs relates the number of positive real roots of a polynomial with the number of sign changes in its coefficient sequence.

Our proof follows the simple inductive proof given by Rob Arthan, which was also used by John Harrison in his HOL Light formalisation. We proved most of the lemmas for arbitrary linearly-ordered integrity domains (e.g. integers, rationals, reals); the main result, however, requires the intermediate value theorem and was therefore only proven for real polynomials.
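
To make the statement concrete (an illustration only, not the formal development): the rule says that the number of positive real roots is at most the number of sign changes in the coefficient sequence, and differs from it by an even number.

  # Count sign changes in a coefficient sequence (zero coefficients are skipped).
  def sign_changes(coeffs):
      signs = [c for c in coeffs if c != 0]
      return sum(1 for a, b in zip(signs, signs[1:]) if a * b < 0)

  # x^3 - 6x^2 + 11x - 6 = (x-1)(x-2)(x-3) has three positive roots and three sign changes
  print(sign_changes([1, -6, 11, -6]))   # 3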

notify = eberlm@in.tum.de [Euler_MacLaurin] title = The Euler–MacLaurin Formula author = Manuel Eberl topic = Mathematics/Analysis date = 2017-03-10 notify = eberlm@in.tum.de abstract =

The Euler-MacLaurin formula relates the value of a discrete sum to that of the corresponding integral in terms of the derivatives at the borders of the summation and a remainder term. Since the remainder term is often very small as the summation bounds grow, this can be used to compute asymptotic expansions for sums.

This entry contains a proof of this formula for functions from the reals to an arbitrary Banach space. Two variants of the formula are given: the standard textbook version and a variant outlined in Concrete Mathematics that is more useful for deriving asymptotic estimates.

As example applications, we use that formula to derive the full asymptotic expansion of the harmonic numbers and the sum of inverse squares.
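
For orientation, the textbook form of the formula in conventional notation (a paraphrase, with B_j the Bernoulli numbers and R_m the remainder term) is

  \[ \sum_{k=a}^{b} f(k) \;=\; \int_a^b f(x)\,\mathrm{d}x \;+\; \frac{f(a)+f(b)}{2} \;+\; \sum_{j=1}^{m} \frac{B_{2j}}{(2j)!}\bigl(f^{(2j-1)}(b) - f^{(2j-1)}(a)\bigr) \;+\; R_m. \]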

[Card_Partitions] title = Cardinality of Set Partitions author = Lukas Bulwahn date = 2015-12-12 topic = Mathematics/Combinatorics abstract = The theory's main theorem states that the cardinality of set partitions of size k on a carrier set of size n is expressed by Stirling numbers of the second kind. In Isabelle, Stirling numbers of the second kind are defined in the AFP entry `Discrete Summation` through their well-known recurrence relation. The main theorem relates them to the alternative definition as cardinality of set partitions. The proof follows the simple and short explanation in Richard P. Stanley's `Enumerative Combinatorics: Volume 1` and Wikipedia, and unravels the full details and implicit reasoning steps of these explanations. notify = lukas.bulwahn@gmail.com [Card_Number_Partitions] title = Cardinality of Number Partitions author = Lukas Bulwahn date = 2016-01-14 topic = Mathematics/Combinatorics abstract = This entry provides a basic library for number partitions, defines the two-argument partition function through its recurrence relation and relates this partition function to the cardinality of number partitions. The main proof shows that the recursively-defined partition function with arguments n and k equals the cardinality of number partitions of n with exactly k parts. The combinatorial proof follows the proof sketch of Theorem 2.4.1 in Mazur's textbook `Combinatorics: A Guided Tour`. This entry can serve as starting point for various more intrinsic properties about number partitions, the partition function and related recurrence relations. notify = lukas.bulwahn@gmail.com [Multirelations] title = Binary Multirelations author = Hitoshi Furusawa , Georg Struth date = 2015-06-11 topic = Mathematics/Algebra abstract = Binary multirelations associate elements of a set with its subsets; hence they are binary relations from a set to its power set. Applications include alternating automata, models and logics for games, program semantics with dual demonic and angelic nondeterministic choices and concurrent dynamic logics. This proof document supports an arXiv article that formalises the basic algebra of multirelations and proposes axiom systems for them, ranging from weak bi-monoids to weak bi-quantales. notify = [Noninterference_Generic_Unwinding] title = The Generic Unwinding Theorem for CSP Noninterference Security author = Pasquale Noce date = 2015-06-11 topic = Computer science/Security, Computer science/Concurrency/Process calculi abstract =

The classical definition of noninterference security for a deterministic state machine with outputs requires one to consider the outputs produced by machine actions after any trace, i.e. any indefinitely long sequence of actions, of the machine. In order to render the verification of the security of such a machine more straightforward, there is a need for some sufficient condition for security such that just individual actions, rather than unbounded sequences of actions, have to be considered.

By extending previous results applying to transitive noninterference policies, Rushby has proven an unwinding theorem that provides a sufficient condition of this kind in the general case of a possibly intransitive policy. This condition has to be satisfied by a generic function mapping security domains into equivalence relations over machine states.

An analogous problem arises for CSP noninterference security, whose definition requires one to consider any possible future, i.e. any indefinitely long sequence of subsequent events and any indefinitely large set of refused events associated with that sequence, for each process trace.

This paper provides a sufficient condition for CSP noninterference security, which indeed requires one to consider just individual accepted and refused events and applies to the general case of a possibly intransitive policy. This condition follows Rushby's condition for classical noninterference security, and has to be satisfied by a generic function mapping security domains into equivalence relations over process traces; hence its name, Generic Unwinding Theorem. Variants of this theorem applying to deterministic processes and trace set processes are also proven. Finally, the sufficient condition for security expressed by the theorem is shown not to be a necessary condition as well, viz. there exists a secure process such that no domain-relation map satisfying the condition exists.

notify = [Noninterference_Ipurge_Unwinding] title = The Ipurge Unwinding Theorem for CSP Noninterference Security author = Pasquale Noce date = 2015-06-11 topic = Computer science/Security abstract =

The definition of noninterference security for Communicating Sequential Processes requires one to consider any possible future, i.e. any indefinitely long sequence of subsequent events and any indefinitely large set of refused events associated with that sequence, for each process trace. In order to render the verification of the security of a process more straightforward, there is a need for some sufficient condition for security such that just individual accepted and refused events, rather than unbounded sequences and sets of events, have to be considered.

Of course, if such a sufficient condition were necessary as well, it would be even more valuable, since it would make it possible to prove not only that a process is secure by verifying that the condition holds, but also that a process is not secure by verifying that the condition fails to hold.

This paper provides a necessary and sufficient condition for CSP noninterference security, which indeed requires one to consider just individual accepted and refused events and applies to the general case of a possibly intransitive policy. This condition follows Rushby's output consistency for deterministic state machines with outputs, and has to be satisfied by a specific function mapping security domains into equivalence relations over process traces. The definition of this function makes use of an intransitive purge function following Rushby's; hence the name given to the condition, Ipurge Unwinding Theorem.

Furthermore, in accordance with Hoare's formal definition of deterministic processes, it is shown that a process is deterministic just in case it is a trace set process, i.e. it may be identified by means of a trace set alone, matching the set of its traces, in place of a failures-divergences pair. Then, variants of the Ipurge Unwinding Theorem are proven for deterministic processes and trace set processes.

notify = [Relational_Method] title = The Relational Method with Message Anonymity for the Verification of Cryptographic Protocols author = Pasquale Noce topic = Computer science/Security date = 2020-12-05 notify = pasquale.noce.lavoro@gmail.com abstract = This paper introduces a new method for the formal verification of cryptographic protocols, the relational method, derived from Paulson's inductive method by means of some enhancements aimed at streamlining formal definitions and proofs, specially for protocols using public key cryptography. Moreover, this paper proposes a method to formalize a further security property, message anonymity, in addition to message confidentiality and authenticity. The relational method, including message anonymity, is then applied to the verification of a sample authentication protocol, comprising Password Authenticated Connection Establishment (PACE) with Chip Authentication Mapping followed by the explicit verification of an additional password over the PACE secure channel. [List_Interleaving] title = Reasoning about Lists via List Interleaving author = Pasquale Noce date = 2015-06-11 topic = Computer science/Data structures abstract =

Among the various mathematical tools introduced in his outstanding work on Communicating Sequential Processes, Hoare has defined "interleaves" as the predicate satisfied by any three lists such that the first list may be split into sublists alternately extracted from the other two, whatever criterion is used for extracting an item from either one list or the other in each step.

This paper enriches Hoare's definition by identifying this criterion with the truth value of a predicate taking as inputs the head and the tail of the first list. This enhanced "interleaves" predicate turns out to permit the proof of equalities between lists without the need for induction. Some rules that allow "interleaves" statements to be inferred without induction, particularly those applying to the addition or removal of a prefix to the input lists, are also proven. Finally, a stronger version of the predicate, named "Interleaves", is shown to fulfil further rules applying to the addition or removal of a suffix to the input lists.
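
For illustration only, the enhanced predicate can be read as the following Haskell sketch on finite lists (the names and details are hypothetical and do not reproduce the Isabelle definition); the parameter p plays the role of the criterion applied to the head and the tail of the first list:

    -- Hypothetical finite-list sketch, not the entry's Isabelle definition.
    -- interleavesP p xs (ys, zs) checks that xs is built by repeatedly taking
    -- its head from ys or from zs, the choice being fixed by p applied to the
    -- head and the tail of xs.
    interleavesP :: Eq a => (a -> [a] -> Bool) -> [a] -> ([a], [a]) -> Bool
    interleavesP _ []     (ys, zs) = null ys && null zs
    interleavesP p (x:xs) (ys, zs)
      | p x xs    = not (null ys) && head ys == x && interleavesP p xs (tail ys, zs)
      | otherwise = not (null zs) && head zs == x && interleavesP p xs (ys, tail zs)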

notify = [Residuated_Lattices] title = Residuated Lattices author = Victor B. F. Gomes , Georg Struth date = 2015-04-15 topic = Mathematics/Algebra abstract = The theory of residuated lattices, first proposed by Ward and Dilworth, is formalised in Isabelle/HOL. This includes concepts of residuated functions; their adjoints and conjugates. It also contains necessary and sufficient conditions for the existence of these operations in an arbitrary lattice. The mathematical components for residuated lattices are linked to the AFP entry for relation algebra. In particular, we prove Jonsson and Tsinakis conditions for a residuated boolean algebra to form a relation algebra. notify = g.struth@sheffield.ac.uk [ConcurrentGC] title = Relaxing Safely: Verified On-the-Fly Garbage Collection for x86-TSO author = Peter Gammie , Tony Hosking , Kai Engelhardt <> date = 2015-04-13 topic = Computer science/Algorithms/Concurrent abstract =

We use ConcurrentIMP to model Schism, a state-of-the-art real-time garbage collection scheme for weak memory, and show that it is safe on x86-TSO.

This development accompanies the PLDI 2015 paper of the same name.

notify = peteg42@gmail.com [List_Update] title = Analysis of List Update Algorithms author = Maximilian P.L. Haslbeck , Tobias Nipkow date = 2016-02-17 topic = Computer science/Algorithms/Online abstract =

These theories formalize the quantitative analysis of a number of classical algorithms for the list update problem: 2-competitiveness of move-to-front, the lower bound of 2 for the competitiveness of deterministic list update algorithms and 1.6-competitiveness of the randomized COMB algorithm, the best randomized list update algorithm known to date. The material is based on the first two chapters of Online Computation and Competitive Analysis by Borodin and El-Yaniv.
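
As an illustration of the cost model (a naive Haskell sketch under the usual assumptions, not part of the formalization): serving a request at position i costs i, and move-to-front then moves the requested item to the head of the list.

    import Data.List (delete, elemIndex)

    -- Illustrative sketch only: total access cost of move-to-front on a
    -- request sequence, with 1-based position cost and no paid exchanges.
    mtfCost :: Eq a => [a] -> [a] -> Int
    mtfCost _  []     = 0
    mtfCost xs (q:qs) =
      case elemIndex q xs of
        Just i  -> (i + 1) + mtfCost (q : delete q xs) qs
        Nothing -> error "requested item not in the list"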

For an informal description see the FSTTCS 2016 publication Verified Analysis of List Update Algorithms by Haslbeck and Nipkow.

notify = nipkow@in.tum.de [ConcurrentIMP] title = Concurrent IMP author = Peter Gammie date = 2015-04-13 topic = Computer science/Programming languages/Logics abstract = ConcurrentIMP extends the small imperative language IMP with control non-determinism and constructs for synchronous message passing. notify = peteg42@gmail.com [TortoiseHare] title = The Tortoise and Hare Algorithm author = Peter Gammie date = 2015-11-18 topic = Computer science/Algorithms abstract = We formalize the Tortoise and Hare cycle-finding algorithm ascribed to Floyd by Knuth, and an improved version due to Brent. notify = peteg42@gmail.com [UPF] title = The Unified Policy Framework (UPF) author = Achim D. Brucker , Lukas Brügger , Burkhart Wolff date = 2014-11-28 topic = Computer science/Security abstract = We present the Unified Policy Framework (UPF), a generic framework for modelling security (access-control) policies. UPF emphasizes the view that a policy is a policy decision function that grants or denies access to resources, permissions, etc. In other words, instead of modelling the relations of permitted or prohibited requests directly, we model the concrete function that implements the policy decision point in a system. In more detail, UPF is based on the following four principles: 1) Functional representation of policies, 2) No conflicts are possible, 3) Three-valued decision type (allow, deny, undefined), 4) Output type not containing only the decision. notify = adbrucker@0x5f.org, wolff@lri.fr, lukas.a.bruegger@gmail.com [UPF_Firewall] title = Formal Network Models and Their Application to Firewall Policies author = Achim D. Brucker , Lukas Brügger<>, Burkhart Wolff topic = Computer science/Security, Computer science/Networks date = 2017-01-08 notify = adbrucker@0x5f.org abstract = We present a formal model of network protocols and their application to modeling firewall policies. The formalization is based on the Unified Policy Framework (UPF). The formalization was originally developed for generating test cases for testing the security configuration of actual firewalls and routers (middle-boxes) using HOL-TestGen. Our work focuses on modeling application level protocols on top of TCP/IP. [AODV] title = Loop freedom of the (untimed) AODV routing protocol author = Timothy Bourke , Peter Höfner date = 2014-10-23 topic = Computer science/Concurrency/Process calculi abstract =

The Ad hoc On-demand Distance Vector (AODV) routing protocol allows the nodes in a Mobile Ad hoc Network (MANET) or a Wireless Mesh Network (WMN) to know where to forward data packets. Such a protocol is ‘loop free’ if it never leads to routing decisions that forward packets in circles.

This development mechanises an existing pen-and-paper proof of loop freedom of AODV. The protocol is modelled in the Algebra of Wireless Networks (AWN), which is the subject of an earlier paper and AFP mechanization. The proof relies on a novel compositional approach for lifting invariants to networks of nodes.

We exploit the mechanization to analyse several variants of AODV and show that Isabelle/HOL can re-establish most proof obligations automatically and identify exactly the steps that are no longer valid.

notify = tim@tbrk.org [Show] title = Haskell's Show Class in Isabelle/HOL author = Christian Sternagel , René Thiemann date = 2014-07-29 topic = Computer science/Functional programming license = LGPL abstract = We implemented a type class for "to-string" functions, similar to Haskell's Show class. Moreover, we provide instantiations for Isabelle/HOL's standard types like bool, prod, sum, nats, ints, and rats. It is further possible to automatically derive show functions for arbitrary user-defined datatypes similar to Haskell's "deriving Show". extra-history = Change history: [2015-03-11]: Adapted development to new-style (BNF-based) datatypes.
[2015-04-10]: Moved development for old-style datatypes into subdirectory "Old_Datatype".
notify = christian.sternagel@uibk.ac.at, rene.thiemann@uibk.ac.at [Certification_Monads] title = Certification Monads author = Christian Sternagel , René Thiemann date = 2014-10-03 topic = Computer science/Functional programming abstract = This entry provides several monads intended for the development of stand-alone certifiers via code generation from Isabelle/HOL. More specifically, there are three flavors of error monads (the sum type, for the case where all monadic functions are total; an instance of the former, the so-called check monad, yielding either success without any further information or an error message; as well as a variant of the sum type that accommodates partial functions by providing an explicit bottom element) and a parser monad built on top. All of these monads are heavily used in the IsaFoR/CeTA project which thus provides many examples of their usage. notify = c.sternagel@gmail.com, rene.thiemann@uibk.ac.at [CISC-Kernel] title = Formal Specification of a Generic Separation Kernel author = Freek Verbeek , Sergey Tverdyshev , Oto Havle , Holger Blasum , Bruno Langenstein , Werner Stephan , Yakoub Nemouchi , Abderrahmane Feliachi , Burkhart Wolff , Julien Schmaltz date = 2014-07-18 topic = Computer science/Security abstract =

Intransitive noninterference has been a widely studied topic in the last few decades. Several well-established methodologies apply interactive theorem proving to formulate a noninterference theorem over abstract academic models. In joint work with several industrial and academic partners throughout Europe, we are helping in the certification process of PikeOS, an industrial separation kernel developed at SYSGO. In this process, established theories could not be applied. We present a new generic model of separation kernels and a new theory of intransitive noninterference. The model is rich in detail, making it suitable for formal verification of realistic and industrial systems such as PikeOS. Using a refinement-based theorem proving approach, we ensure that proofs remain manageable.

This document corresponds to the deliverable D31.1 of the EURO-MILS Project http://www.euromils.eu.

notify = [pGCL] title = pGCL for Isabelle author = David Cock date = 2014-07-13 topic = Computer science/Programming languages/Language definitions abstract =

pGCL is both a programming language and a specification language that incorporates probabilistic and nondeterministic choice in a unified manner. Program verification is by refinement or annotation (or both), using either Hoare triples or weakest-precondition entailment, in the style of GCL.
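
For orientation, the standard expectation-transformer reading of the two choice operators (following McIver and Morgan; the notation used in the entry may differ) is:

    % Illustrative only: probabilistic choice with weight p and demonic
    % choice, acting on a post-expectation R.
    \[
      wp(a \mathbin{{}_{p}\oplus} b).R \;=\; p \cdot wp(a).R + (1-p) \cdot wp(b).R,
      \qquad
      wp(a \sqcap b).R \;=\; \min\bigl(wp(a).R,\; wp(b).R\bigr).
    \]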

This package provides both a shallow embedding of the language primitives, and an annotation and refinement framework. The generated document includes a brief tutorial.

notify = [Noninterference_CSP] title = Noninterference Security in Communicating Sequential Processes author = Pasquale Noce date = 2014-05-23 topic = Computer science/Security abstract =

An extension of classical noninterference security for deterministic state machines, as introduced by Goguen and Meseguer and elegantly formalized by Rushby, to nondeterministic systems should satisfy two fundamental requirements: it should be based on a mathematically precise theory of nondeterminism, and should be equivalent to (or at least not weaker than) the classical notion in the degenerate deterministic case.

This paper proposes a definition of noninterference security applying to Hoare's Communicating Sequential Processes (CSP) in the general case of a possibly intransitive noninterference policy, and proves the equivalence of this security property to classical noninterference security for processes representing deterministic state machines.

Furthermore, McCullough's generalized noninterference security is shown to be weaker than both the proposed notion of CSP noninterference security for a generic process, and classical noninterference security for processes representing deterministic state machines. This renders CSP noninterference security preferable as an extension of classical noninterference security to nondeterministic systems.

notify = pasquale.noce.lavoro@gmail.com [Floyd_Warshall] title = The Floyd-Warshall Algorithm for Shortest Paths author = Simon Wimmer , Peter Lammich topic = Computer science/Algorithms/Graph date = 2017-05-08 notify = wimmers@in.tum.de abstract = The Floyd-Warshall algorithm [Flo62, Roy59, War62] is a classic dynamic programming algorithm to compute the length of all shortest paths between any two vertices in a graph (i.e. to solve the all-pairs shortest path problem, or APSP for short). Given a representation of the graph as a matrix of weights M, it computes another matrix M' which represents a graph with the same path lengths and contains the length of the shortest path between any two vertices i and j. This is only possible if the graph does not contain any negative cycles. However, in this case the Floyd-Warshall algorithm will detect the situation by calculating a negative diagonal entry. This entry includes a formalization of the algorithm and of these key properties. The algorithm is refined to an efficient imperative version using the Imperative Refinement Framework. [Roy_Floyd_Warshall] title = Transitive closure according to Roy-Floyd-Warshall author = Makarius Wenzel <> date = 2014-05-23 topic = Computer science/Algorithms/Graph abstract = This formulation of the Roy-Floyd-Warshall algorithm for the transitive closure bypasses matrices and arrays, but uses a more direct mathematical model with adjacency functions for immediate predecessors and successors. This can be implemented efficiently in functional programming languages and is particularly adequate for sparse relations. notify = [GPU_Kernel_PL] title = Syntax and semantics of a GPU kernel programming language author = John Wickerson date = 2014-04-03 topic = Computer science/Programming languages/Language definitions abstract = This document accompanies the article "The Design and Implementation of a Verification Technique for GPU Kernels" by Adam Betts, Nathan Chong, Alastair F. Donaldson, Jeroen Ketema, Shaz Qadeer, Paul Thomson and John Wickerson. It formalises all of the definitions provided in Sections 3 and 4 of the article. notify = [AWN] title = Mechanization of the Algebra for Wireless Networks (AWN) author = Timothy Bourke date = 2014-03-08 topic = Computer science/Concurrency/Process calculi abstract =

AWN is a process algebra developed for modelling and analysing protocols for Mobile Ad hoc Networks (MANETs) and Wireless Mesh Networks (WMNs). AWN models comprise five distinct layers: sequential processes, local parallel compositions, nodes, partial networks, and complete networks.

This development mechanises the original operational semantics of AWN and introduces a variant 'open' operational semantics that enables the compositional statement and proof of invariants across distinct network nodes. It supports labels (for weakening invariants) and (abstract) data state manipulations. A framework for compositional invariant proofs is developed, including a tactic (inv_cterms) for inductive invariant proofs of sequential processes, lifting rules for the open versions of the higher layers, and a rule for transferring lifted properties back to the standard semantics. A notion of 'control terms' reduces proof obligations to the subset of subterms that act directly (in contrast to operators for combining terms and joining processes).

notify = tim@tbrk.org [Selection_Heap_Sort] title = Verification of Selection and Heap Sort Using Locales author = Danijela Petrovic date = 2014-02-11 topic = Computer science/Algorithms abstract = Stepwise program refinement techniques can be used to simplify program verification. Programs are better understood since their main properties are clearly stated, and verification of rather complex algorithms is reduced to proving simple statements connecting successive program specifications. Additionally, it is easy to analyze similar algorithms and to compare their properties within a single formalization. Usually, formal analysis is not done in educational setting due to complexity of verification and a lack of tools and procedures to make comparison easy. Verification of an algorithm should not only give correctness proof, but also better understanding of an algorithm. If the verification is based on small step program refinement, it can become simple enough to be demonstrated within the university-level computer science curriculum. In this paper we demonstrate this and give a formal analysis of two well known algorithms (Selection Sort and Heap Sort) using proof assistant Isabelle/HOL and program refinement techniques. notify = [Real_Impl] title = Implementing field extensions of the form Q[sqrt(b)] author = René Thiemann date = 2014-02-06 license = LGPL topic = Mathematics/Analysis abstract = We apply data refinement to implement the real numbers, where we support all numbers in the field extension Q[sqrt(b)], i.e., all numbers of the form p + q * sqrt(b) for rational numbers p and q and some fixed natural number b. To this end, we also developed algorithms to precisely compute roots of a rational number, and to perform a factorization of natural numbers which eliminates duplicate prime factors.
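
For illustration only (a Haskell sketch, not the entry's Isabelle code), numbers of the form p + q * sqrt(b) can be represented by their rational coefficients over a fixed base b, with arithmetic given by the usual identities:

    -- Illustrative sketch: p + q * sqrt b for a fixed natural number b,
    -- represented by the rational coefficients p and q.
    data Ext = Ext { extBase :: Integer, extP :: Rational, extQ :: Rational }

    addExt :: Ext -> Ext -> Ext
    addExt (Ext b p1 q1) (Ext _ p2 q2) = Ext b (p1 + p2) (q1 + q2)

    -- (p1 + q1*sqrt b) * (p2 + q2*sqrt b)
    --   = (p1*p2 + q1*q2*b) + (p1*q2 + p2*q1) * sqrt b
    mulExt :: Ext -> Ext -> Ext
    mulExt (Ext b p1 q1) (Ext _ p2 q2) =
      Ext b (p1*p2 + q1*q2 * fromInteger b) (p1*q2 + p2*q1)

Both arguments are assumed to share the same fixed base b.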

Our results have been used to certify termination proofs which involve polynomial interpretations over the reals. extra-history = Change history: [2014-07-11]: Moved NthRoot_Impl to Sqrt-Babylonian. notify = rene.thiemann@uibk.ac.at [ShortestPath] title = An Axiomatic Characterization of the Single-Source Shortest Path Problem author = Christine Rizkallah date = 2013-05-22 topic = Mathematics/Graph theory abstract = This theory is split into two sections. In the first section, we give a formal proof that a well-known axiomatic characterization of the single-source shortest path problem is correct. Namely, we prove that in a directed graph with a non-negative cost function on the edges the single-source shortest path function is the only function that satisfies a set of four axioms. In the second section, we give a formal proof of the correctness of an axiomatic characterization of the single-source shortest path problem for directed graphs with general cost functions. The axioms here are more involved because we have to account for potential negative cycles in the graph. The axioms are summarized in three Isabelle locales. notify = [Launchbury] title = The Correctness of Launchbury's Natural Semantics for Lazy Evaluation author = Joachim Breitner date = 2013-01-31 topic = Computer science/Programming languages/Lambda calculi, Computer science/Semantics abstract = In his seminal paper "Natural Semantics for Lazy Evaluation", John Launchbury proves his semantics correct with respect to a denotational semantics, and outlines an adequacy proof. We have formalized both semantics and machine-checked the correctness proof, clarifying some details. Furthermore, we provide a new and more direct adequacy proof that does not require intermediate operational semantics. extra-history = Change history: [2014-05-24]: Added the proof of adequacy, as well as simplified and improved the existing proofs. Adjusted abstract accordingly. [2015-03-16]: Booleans and if-then-else added to syntax and semantics, making this entry suitable to be used by the entry "Call_Arity". notify = [Call_Arity] title = The Safety of Call Arity author = Joachim Breitner date = 2015-02-20 topic = Computer science/Programming languages/Transformations abstract = We formalize the Call Arity analysis, as implemented in GHC, and prove both functional correctness and, more interestingly, safety (i.e. the transformation does not increase allocation).

We use syntax and the denotational semantics from the entry "Launchbury", where we formalized Launchbury's natural semantics for lazy evaluation.

The functional correctness of Call Arity is proved with regard to that denotational semantics. The operational properties are shown with regard to a small-step semantics akin to Sestoft's mark 1 machine, which we prove to be equivalent to Launchbury's semantics.

We use Christian Urban's Nominal2 package to define our terms and make use of Brian Huffman's HOLCF package for the domain-theoretical aspects of the development. extra-history = Change history: [2015-03-16]: This entry now builds on top of the Launchbury entry, and the equivalency proof of the natural and the small-step semantics was added. notify = [CCS] title = CCS in nominal logic author = Jesper Bengtson date = 2012-05-29 topic = Computer science/Concurrency/Process calculi abstract = We formalise a large portion of CCS as described in Milner's book 'Communication and Concurrency' using the nominal datatype package in Isabelle. Our results include many of the standard theorems of bisimulation equivalence and congruence, for both weak and strong versions. One main goal of this formalisation is to keep the machine-checked proofs as close to their pen-and-paper counterpart as possible.

This entry is described in detail in Bengtson's thesis. notify = [Pi_Calculus] title = The pi-calculus in nominal logic author = Jesper Bengtson date = 2012-05-29 topic = Computer science/Concurrency/Process calculi abstract = We formalise the pi-calculus using the nominal datatype package, based on ideas from the nominal logic by Pitts et al., and demonstrate an implementation in Isabelle/HOL. The purpose is to derive powerful induction rules for the semantics in order to conduct machine checkable proofs, closely following the intuitive arguments found in manual proofs. In this way we have covered many of the standard theorems of bisimulation equivalence and congruence, both late and early, and both strong and weak in a uniform manner. We thus provide one of the most extensive formalisations of the pi-calculus ever done inside a theorem prover.

A significant gain in our formulation is that agents are identified up to alpha-equivalence, thereby greatly reducing the arguments about bound names. This is a normal strategy for manual proofs about the pi-calculus, but that kind of hand waving has previously been difficult to incorporate smoothly in an interactive theorem prover. We show how the nominal logic formalism and its support in Isabelle accomplishes this and thus significantly reduces the tedium of conducting completely formal proofs. This improves on previous work using weak higher order abstract syntax since we do not need extra assumptions to filter out exotic terms and can keep all arguments within a familiar first-order logic.

This entry is described in detail in Bengtson's thesis. notify = [Psi_Calculi] title = Psi-calculi in Isabelle author = Jesper Bengtson date = 2012-05-29 topic = Computer science/Concurrency/Process calculi abstract = Psi-calculi are extensions of the pi-calculus, accommodating arbitrary nominal datatypes to represent not only data but also communication channels, assertions and conditions, giving it an expressive power beyond the applied pi-calculus and the concurrent constraint pi-calculus.

We have formalised psi-calculi in the interactive theorem prover Isabelle using its nominal datatype package. One distinctive feature is that the framework needs to treat binding sequences, as opposed to single binders, in an efficient way. While different methods for formalising single binder calculi have been proposed over the last decades, representations for such binding sequences are not very well explored.

The main effort in the formalisation is to keep the machine-checked proofs as close to their pen-and-paper counterparts as possible. This includes treating all binding sequences as atomic elements, and creating custom induction and inversion rules that remove the bulk of manual alpha-conversions.

This entry is described in detail in Bengtson's thesis. notify = [Encodability_Process_Calculi] title = Analysing and Comparing Encodability Criteria for Process Calculi author = Kirstin Peters , Rob van Glabbeek date = 2015-08-10 topic = Computer science/Concurrency/Process calculi abstract = Encodings or the proof of their absence are the main way to compare process calculi. To analyse the quality of encodings and to rule out trivial or meaningless encodings, they are augmented with quality criteria. There exists a bunch of different criteria and different variants of criteria in order to reason in different settings. This leads to incomparable results. Moreover it is not always clear whether the criteria used to obtain a result in a particular setting do indeed fit to this setting. We show how to formally reason about and compare encodability criteria by mapping them on requirements on a relation between source and target terms that is induced by the encoding function. In particular we analyse the common criteria full abstraction, operational correspondence, divergence reflection, success sensitiveness, and respect of barbs; e.g. we analyse the exact nature of the simulation relation (coupled simulation versus bisimulation) that is induced by different variants of operational correspondence. This way we reduce the problem of analysing or comparing encodability criteria to the better understood problem of comparing relations on processes. notify = kirstin.peters@tu-berlin.de [Circus] title = Isabelle/Circus author = Abderrahmane Feliachi , Burkhart Wolff , Marie-Claude Gaudel contributors = Makarius Wenzel date = 2012-05-27 topic = Computer science/Concurrency/Process calculi, Computer science/System description languages abstract = The Circus specification language combines elements for complex data and behavior specifications, using an integration of Z and CSP with a refinement calculus. Its semantics is based on Hoare and He's Unifying Theories of Programming (UTP). Isabelle/Circus is a formalization of the UTP and the Circus language in Isabelle/HOL. It contains proof rules and tactic support that allows for proofs of refinement for Circus processes (involving both data and behavioral aspects).

The Isabelle/Circus environment supports a syntax for the semantic definitions which is close to textbook presentations of Circus. This article contains an extended version of the corresponding VSTTE paper together with the complete formal development of its underlying commented theories. extra-history = Change history: [2014-06-05]: More polishing, shorter proofs, added Circus syntax, added Makarius Wenzel as contributor. notify = [Dijkstra_Shortest_Path] title = Dijkstra's Shortest Path Algorithm author = Benedikt Nordhoff , Peter Lammich topic = Computer science/Algorithms/Graph date = 2012-01-30 abstract = We implement and prove correct Dijkstra's algorithm for the single source shortest path problem, conceived in 1956 by E. Dijkstra. The algorithm is implemented using the data refinement framework for monadic, nondeterministic programs. An efficient implementation is derived using data structures from the Isabelle Collection Framework. notify = lammich@in.tum.de [Refine_Monadic] title = Refinement for Monadic Programs author = Peter Lammich topic = Computer science/Programming languages/Logics date = 2012-01-30 abstract = We provide a framework for program and data refinement in Isabelle/HOL. The framework is based on a nondeterminism-monad with assertions, i.e., the monad carries a set of results or an assertion failure. Recursion is expressed by fixed points. For convenience, we also provide while and foreach combinators.
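
A rough Haskell analogue of such a nondeterminism monad with assertions (illustrative only; in the Isabelle development failure is the top element of a refinement ordering and recursion is handled via fixed points) might look as follows:

    import qualified Data.Set as Set
    import Data.Set (Set)

    -- Illustrative analogue only: a computation either fails an assertion
    -- or yields a set of possible results.
    data Nres a = Fail | Res (Set a)

    ret :: a -> Nres a
    ret = Res . Set.singleton

    bind :: Ord b => Nres a -> (a -> Nres b) -> Nres b
    bind Fail     _ = Fail
    bind (Res xs) f = foldr step (Res Set.empty) (Set.toList xs)
      where
        step x acc = case (f x, acc) of
          (Fail, _)          -> Fail
          (_, Fail)          -> Fail
          (Res ys, Res rest) -> Res (Set.union ys rest)

    assert :: Bool -> Nres ()
    assert b = if b then ret () else Fail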

The framework provides tools to automate canonical tasks, such as generating verification conditions, finding appropriate data refinement relations, and refining an executable program to a form that is accepted by the Isabelle/HOL code generator.

This submission comes with a collection of examples and a user-guide, illustrating the usage of the framework. extra-history = Change history: [2012-04-23] Introduced ordered FOREACH loops
[2012-06] New features: REC_rule_arb and RECT_rule_arb allow for generalizing over variables. prepare_code_thms - command extracts code equations for recursion combinators.
[2012-07] New example: Nested DFS for emptiness check of Buchi-automata with witness.
New feature: fo_rule method to apply resolution using first-order matching. Useful for arg_conf, fun_cong.
[2012-08] Adaptation to ICF v2.
[2012-10-05] Adaptations to include support for Automatic Refinement Framework.
[2013-09] This entry now depends on Automatic Refinement
[2014-06] New feature: vc_solve method to solve verification conditions. Maintenance changes: VCG-rules for nfoldli, improved setup for FOREACH-loops.
[2014-07] Now defining recursion via flat domain. Dropped many single-valued prerequisites. Changed notion of data refinement. In single-valued case, this matches the old notion. In non-single valued case, the new notion allows for more convenient rules. In particular, the new definitions allow for projecting away ghost variables as a refinement step.
[2014-11] New features: le-or-fail relation (leof), modular reasoning about loop invariants. notify = lammich@in.tum.de [Refine_Imperative_HOL] title = The Imperative Refinement Framework author = Peter Lammich notify = lammich@in.tum.de date = 2016-08-08 topic = Computer science/Programming languages/Transformations,Computer science/Data structures abstract = We present the Imperative Refinement Framework (IRF), a tool that supports a stepwise refinement based approach to imperative programs. This entry is based on the material we presented in [ITP-2015, CPP-2016]. It uses the Monadic Refinement Framework as a frontend for the specification of the abstract programs, and Imperative/HOL as a backend to generate executable imperative programs. The IRF comes with tool support to synthesize imperative programs from more abstract, functional ones, using efficient imperative implementations for the abstract data structures. This entry also includes the Imperative Isabelle Collection Framework (IICF), which provides a library of re-usable imperative collection data structures. Moreover, this entry contains a quickstart guide and a reference manual, which provide an introduction to using the IRF for Isabelle/HOL experts. It also provides a collection of (partly commented) practical examples, some highlights being Dijkstra's Algorithm, Nested-DFS, and a generic worklist algorithm with subsumption. Finally, this entry contains benchmark scripts that compare the runtime of some examples against reference implementations of the algorithms in Java and C++. [ITP-2015] Peter Lammich: Refinement to Imperative/HOL. ITP 2015: 253--269 [CPP-2016] Peter Lammich: Refinement based verification of imperative data structures. CPP 2016: 27--36 [Automatic_Refinement] title = Automatic Data Refinement author = Peter Lammich topic = Computer science/Programming languages/Logics date = 2013-10-02 abstract = We present the Autoref tool for Isabelle/HOL, which automatically refines algorithms specified over abstract concepts like maps and sets to algorithms over concrete implementations like red-black-trees, and produces a refinement theorem. It is based on ideas borrowed from relational parametricity due to Reynolds and Wadler. The tool allows for rapid prototyping of verified, executable algorithms. Moreover, it can be configured to fine-tune the result to the user's needs. Our tool is able to automatically instantiate generic algorithms, which greatly simplifies the implementation of executable data structures.

This AFP-entry provides the basic tool, which is then used by the Refinement and Collection Framework to provide automatic data refinement for the nondeterminism monad and various collection datastructures. notify = lammich@in.tum.de [EdmondsKarp_Maxflow] title = Formalizing the Edmonds-Karp Algorithm author = Peter Lammich , S. Reza Sefidgar<> notify = lammich@in.tum.de date = 2016-08-12 topic = Computer science/Algorithms/Graph abstract = We present a formalization of the Ford-Fulkerson method for computing the maximum flow in a network. Our formal proof closely follows a standard textbook proof, and is accessible even without being an expert in Isabelle/HOL--- the interactive theorem prover used for the formalization. We then use stepwise refinement to obtain the Edmonds-Karp algorithm, and formally prove a bound on its complexity. Further refinement yields a verified implementation, whose execution time compares well to an unverified reference implementation in Java. This entry is based on our ITP-2016 paper with the same title. [VerifyThis2018] title = VerifyThis 2018 - Polished Isabelle Solutions author = Peter Lammich , Simon Wimmer topic = Computer science/Algorithms date = 2018-04-27 notify = lammich@in.tum.de abstract = VerifyThis 2018 was a program verification competition associated with ETAPS 2018. It was the 7th event in the VerifyThis competition series. In this entry, we present polished and completed versions of our solutions that we created during the competition. [PseudoHoops] title = Pseudo Hoops author = George Georgescu <>, Laurentiu Leustean <>, Viorel Preoteasa topic = Mathematics/Algebra date = 2011-09-22 abstract = Pseudo-hoops are algebraic structures introduced by B. Bosbach under the name of complementary semigroups. In this formalization we prove some properties of pseudo-hoops and we define the basic concepts of filter and normal filter. The lattice of normal filters is isomorphic with the lattice of congruences of a pseudo-hoop. We also study some important classes of pseudo-hoops. Bounded Wajsberg pseudo-hoops are equivalent to pseudo-Wajsberg algebras and bounded basic pseudo-hoops are equivalent to pseudo-BL algebras. Some examples of pseudo-hoops are given in the last section of the formalization. notify = viorel.preoteasa@aalto.fi [MonoBoolTranAlgebra] title = Algebra of Monotonic Boolean Transformers author = Viorel Preoteasa topic = Computer science/Programming languages/Logics date = 2011-09-22 abstract = Algebras of imperative programming languages have been successful in reasoning about programs. In general an algebra of programs is an algebraic structure with programs as elements and with program compositions (sequential composition, choice, skip) as algebra operations. Various versions of these algebras were introduced to model partial correctness, total correctness, refinement, demonic choice, and other aspects. We formalize here an algebra which can be used to model total correctness, refinement, demonic and angelic choice. The basic model of this algebra are monotonic Boolean transformers (monotonic functions from a Boolean algebra to itself). notify = viorel.preoteasa@aalto.fi [LatticeProperties] title = Lattice Properties author = Viorel Preoteasa topic = Mathematics/Order date = 2011-09-22 abstract = This formalization introduces and collects some algebraic structures based on lattices and complete lattices for use in other developments. The structures introduced are modular, and lattice ordered groups. 
In addition to the results proved for the new lattices, this formalization also introduces theorems about latices and complete lattices in general. extra-history = Change history: [2012-01-05]: Removed the theory about distributive complete lattices which is in the standard library now. Added a theory about well founded and transitive relations and a result about fixpoints in complete lattices and well founded relations. Moved the results about conjunctive and disjunctive functions to a new theory. Removed the syntactic classes for inf and sup which are in the standard library now. notify = viorel.preoteasa@aalto.fi [Impossible_Geometry] title = Proving the Impossibility of Trisecting an Angle and Doubling the Cube author = Ralph Romanos , Lawrence C. Paulson topic = Mathematics/Algebra, Mathematics/Geometry date = 2012-08-05 abstract = Squaring the circle, doubling the cube and trisecting an angle, using a compass and straightedge alone, are classic unsolved problems first posed by the ancient Greeks. All three problems were proved to be impossible in the 19th century. The following document presents the proof of the impossibility of solving the latter two problems using Isabelle/HOL, following a proof by Carrega. The proof uses elementary methods: no Galois theory or field extensions. The set of points constructible using a compass and straightedge is defined inductively. Radical expressions, which involve only square roots and arithmetic of rational numbers, are defined, and we find that all constructive points have radical coordinates. Finally, doubling the cube and trisecting certain angles requires solving certain cubic equations that can be proved to have no rational roots. The Isabelle proofs require a great many detailed calculations. notify = ralph.romanos@student.ecp.fr, lp15@cam.ac.uk [IP_Addresses] title = IP Addresses author = Cornelius Diekmann , Julius Michaelis , Lars Hupel notify = diekmann@net.in.tum.de date = 2016-06-28 topic = Computer science/Networks abstract = This entry contains a definition of IP addresses and a library to work with them. Generic IP addresses are modeled as machine words of arbitrary length. Derived from this generic definition, IPv4 addresses are 32bit machine words, IPv6 addresses are 128bit words. Additionally, IPv4 addresses can be represented in dot-decimal notation and IPv6 addresses in (compressed) colon-separated notation. We support toString functions and parsers for both notations. Sets of IP addresses can be represented with a netmask (e.g. 192.168.0.0/255.255.0.0) or in CIDR notation (e.g. 192.168.0.0/16). To provide executable code for set operations on IP address ranges, the library includes a datatype to work on arbitrary intervals of machine words. [Simple_Firewall] title = Simple Firewall author = Cornelius Diekmann , Julius Michaelis , Maximilian Haslbeck notify = diekmann@net.in.tum.de, max.haslbeck@gmx.de date = 2016-08-24 topic = Computer science/Networks abstract = We present a simple model of a firewall. The firewall can accept or drop a packet and can match on interfaces, IP addresses, protocol, and ports. It was designed to feature nice mathematical properties: The type of match expressions was carefully crafted such that the conjunction of two match expressions is only one match expression. This model is too simplistic to mirror all aspects of the real world. In the upcoming entry "Iptables Semantics", we will translate the Linux firewall iptables to this model. For a fixed service (e.g. 
ssh, http), we provide an algorithm to compute an overview of the firewall's filtering behavior. The algorithm computes minimal service matrices, i.e. graphs which partition the complete IPv4 and IPv6 address space and visualize the allowed accesses between partitions. For a detailed description, see Verified iptables Firewall Analysis, IFIP Networking 2016. [Iptables_Semantics] title = Iptables Semantics author = Cornelius Diekmann , Lars Hupel notify = diekmann@net.in.tum.de, hupel@in.tum.de date = 2016-09-09 topic = Computer science/Networks abstract = We present a big step semantics of the filtering behavior of the Linux/netfilter iptables firewall. We provide algorithms to simplify complex iptables rulests to a simple firewall model (c.f. AFP entry Simple_Firewall) and to verify spoofing protection of a ruleset. Internally, we embed our semantics into ternary logic, ultimately supporting every iptables match condition by abstracting over unknowns. Using this AFP entry and all entries it depends on, we created an easy-to-use, stand-alone haskell tool called fffuu. The tool does not require any input —except for the iptables-save dump of the analyzed firewall— and presents interesting results about the user's ruleset. Real-Word firewall errors have been uncovered, and the correctness of rulesets has been proved, with the help of our tool. [Routing] title = Routing author = Julius Michaelis , Cornelius Diekmann notify = afp@liftm.de date = 2016-08-31 topic = Computer science/Networks abstract = This entry contains definitions for routing with routing tables/longest prefix matching. A routing table entry is modelled as a record of a prefix match, a metric, an output port, and an optional next hop. A routing table is a list of entries, sorted by prefix length and metric. Additionally, a parser and serializer for the output of the ip-route command, a function to create a relation from output port to corresponding destination IP space, and a model of a Linux-style router are included. [KBPs] title = Knowledge-based programs author = Peter Gammie topic = Computer science/Automata and formal languages date = 2011-05-17 abstract = Knowledge-based programs (KBPs) are a formalism for directly relating agents' knowledge and behaviour. Here we present a general scheme for compiling KBPs to executable automata with a proof of correctness in Isabelle/HOL. We develop the algorithm top-down, using Isabelle's locale mechanism to structure these proofs, and show that two classic examples can be synthesised using Isabelle's code generator. extra-history = Change history: [2012-03-06]: Add some more views and revive the code generation. notify = kleing@cse.unsw.edu.au [Tarskis_Geometry] title = The independence of Tarski's Euclidean axiom author = T. J. M. Makarios topic = Mathematics/Geometry date = 2012-10-30 abstract = Tarski's axioms of plane geometry are formalized and, using the standard real Cartesian model, shown to be consistent. A substantial theory of the projective plane is developed. Building on this theory, the Klein-Beltrami model of the hyperbolic plane is defined and shown to satisfy all of Tarski's axioms except his Euclidean axiom; thus Tarski's Euclidean axiom is shown to be independent of his other axioms of plane geometry.

An earlier version of this work was the subject of the author's MSc thesis, which contains natural-language explanations of some of the more interesting proofs. notify = tjm1983@gmail.com +[IsaGeoCoq] +title = Tarski's Parallel Postulate implies the 5th Postulate of Euclid, the Postulate of Playfair and the original Parallel Postulate of Euclid +author = Roland Coghetto +topic = Mathematics/Geometry +license = LGPL +date = 2021-01-31 +notify = roland_coghetto@hotmail.com +abstract = +

The GeoCoq library contains a formalization of geometry using the Coq proof assistant. It contains both proofs about the foundations of geometry and high-level proofs in the same style as in high school. We port a part of the GeoCoq 2.4.0 library to Isabelle/HOL: more precisely, the files Chap02.v to Chap13_3.v, suma.v, as well as the associated definitions and some useful files for the proof of certain parallel postulates. The synthetic approach of the proofs is directly inspired by those contained in GeoCoq. The names of the lemmas and theorems used, as well as the definitions, are kept as close to the originals as possible.

It should be noted that T.J.M. Makarios has carried out some proofs in Tarski's Geometry. His development uses a definition that does not quite coincide with the one used in GeoCoq and here. Furthermore, the corresponding definitions in the Poincaré Disc Model development are not identical to those defined in GeoCoq.

In the last part, it is formalized that, in neutral/absolute space, the axiom of parallels of Tarski's system implies the Playfair axiom, the 5th postulate of Euclid and Euclid's original parallel postulate. These proofs, which are not constructive, are directly inspired by those of Pierre Boutry, Charly Gries, Julien Narboux and Pascal Schreck.

+ [General-Triangle] title = The General Triangle Is Unique author = Joachim Breitner topic = Mathematics/Geometry date = 2011-04-01 abstract = Some acute-angled triangles are special, e.g. right-angled or isoscele triangles. Some are not of this kind, but, without measuring angles, look as if they were. In that sense, there is exactly one general triangle. This well-known fact is proven here formally. notify = mail@joachim-breitner.de [LightweightJava] title = Lightweight Java author = Rok Strniša , Matthew Parkinson topic = Computer science/Programming languages/Language definitions date = 2011-02-07 abstract = A fully-formalized and extensible minimal imperative fragment of Java. notify = rok@strnisa.com [Lower_Semicontinuous] title = Lower Semicontinuous Functions author = Bogdan Grechuk topic = Mathematics/Analysis date = 2011-01-08 abstract = We define the notions of lower and upper semicontinuity for functions from a metric space to the extended real line. We prove that a function is both lower and upper semicontinuous if and only if it is continuous. We also give several equivalent characterizations of lower semicontinuity. In particular, we prove that a function is lower semicontinuous if and only if its epigraph is a closed set. Also, we introduce the notion of the lower semicontinuous hull of an arbitrary function and prove its basic properties. notify = hoelzl@in.tum.de [RIPEMD-160-SPARK] title = RIPEMD-160 author = Fabian Immler topic = Computer science/Programming languages/Static analysis date = 2011-01-10 abstract = This work presents a verification of an implementation in SPARK/ADA of the cryptographic hash-function RIPEMD-160. A functional specification of RIPEMD-160 is given in Isabelle/HOL. Proofs for the verification conditions generated by the static-analysis toolset of SPARK certify the functional correctness of the implementation. extra-history = Change history: [2015-11-09]: Entry is now obsolete, moved to Isabelle distribution. notify = immler@in.tum.de [Regular-Sets] title = Regular Sets and Expressions author = Alexander Krauss , Tobias Nipkow contributors = Manuel Eberl topic = Computer science/Automata and formal languages date = 2010-05-12 abstract = This is a library of constructions on regular expressions and languages. It provides the operations of concatenation, Kleene star and derivative on languages. Regular expressions and their meaning are defined. An executable equivalence checker for regular expressions is verified; it does not need automata but works directly on regular expressions. By mapping regular expressions to binary relations, an automatic and complete proof method for (in)equalities of binary relations over union, concatenation and (reflexive) transitive closure is obtained.
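
For readers unfamiliar with the derivative-based approach, the textbook Brzozowski construction can be sketched in Haskell as follows (illustration only; the entry's Isabelle definitions and simplifications differ in detail):

    -- Textbook Brzozowski derivatives; a word is matched iff the iterated
    -- derivative of the regular expression is nullable.
    data RE a = Zero | One | Atom a | Plus (RE a) (RE a)
              | Times (RE a) (RE a) | Star (RE a)

    nullable :: RE a -> Bool
    nullable Zero        = False
    nullable One         = True
    nullable (Atom _)    = False
    nullable (Plus r s)  = nullable r || nullable s
    nullable (Times r s) = nullable r && nullable s
    nullable (Star _)    = True

    deriv :: Eq a => a -> RE a -> RE a
    deriv _ Zero        = Zero
    deriv _ One         = Zero
    deriv c (Atom a)    = if a == c then One else Zero
    deriv c (Plus r s)  = Plus (deriv c r) (deriv c s)
    deriv c (Times r s)
      | nullable r      = Plus (Times (deriv c r) s) (deriv c s)
      | otherwise       = Times (deriv c r) s
    deriv c (Star r)    = Times (deriv c r) (Star r)

    matches :: Eq a => RE a -> [a] -> Bool
    matches r = nullable . foldl (flip deriv) r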

Extended regular expressions with complement and intersection are also defined and an equivalence checker is provided. extra-history = Change history: [2011-08-26]: Christian Urban added a theory about derivatives and partial derivatives of regular expressions
[2012-05-10]: Tobias Nipkow added extended regular expressions
[2012-05-10]: Tobias Nipkow added equivalence checking with partial derivatives notify = nipkow@in.tum.de, krauss@in.tum.de, christian.urban@kcl.ac.uk [Regex_Equivalence] title = Unified Decision Procedures for Regular Expression Equivalence author = Tobias Nipkow , Dmitriy Traytel topic = Computer science/Automata and formal languages date = 2014-01-30 abstract = We formalize a unified framework for verified decision procedures for regular expression equivalence. Five recently published formalizations of such decision procedures (three based on derivatives, two on marked regular expressions) can be obtained as instances of the framework. We discover that the two approaches based on marked regular expressions, which were previously thought to be the same, are different, and one seems to produce uniformly smaller automata. The common framework makes it possible to compare the performance of the different decision procedures in a meaningful way. The formalization is described in a paper of the same name presented at Interactive Theorem Proving 2014. notify = nipkow@in.tum.de, traytel@in.tum.de [MSO_Regex_Equivalence] title = Decision Procedures for MSO on Words Based on Derivatives of Regular Expressions author = Dmitriy Traytel , Tobias Nipkow topic = Computer science/Automata and formal languages, Logic/General logic/Decidability of theories date = 2014-06-12 abstract = Monadic second-order logic on finite words (MSO) is a decidable yet expressive logic into which many decision problems can be encoded. Since MSO formulas correspond to regular languages, equivalence of MSO formulas can be reduced to the equivalence of some regular structures (e.g. automata). We verify an executable decision procedure for MSO formulas that is not based on automata but on regular expressions.

Decision procedures for regular expression equivalence have been formalized before, usually based on Brzozowski derivatives. Yet, for a straightforward embedding of MSO formulas into regular expressions an extension of regular expressions with a projection operation is required. We prove total correctness and completeness of an equivalence checker for regular expressions extended in that way. We also define a language-preserving translation of formulas into regular expressions with respect to two different semantics of MSO.

The formalization is described in this ICFP 2013 functional pearl. notify = traytel@in.tum.de, nipkow@in.tum.de [Formula_Derivatives] title = Derivatives of Logical Formulas author = Dmitriy Traytel topic = Computer science/Automata and formal languages, Logic/General logic/Decidability of theories date = 2015-05-28 abstract = We formalize new decision procedures for WS1S, M2L(Str), and Presburger Arithmetics. Formulas of these logics denote regular languages. Unlike traditional decision procedures, we do not translate formulas into automata (nor into regular expressions), at least not explicitly. Instead we devise notions of derivatives (inspired by Brzozowski derivatives for regular expressions) that operate on formulas directly and compute a syntactic bisimulation using these derivatives. The treatment of Boolean connectives and quantifiers is uniform for all mentioned logics and is abstracted into a locale. This locale is then instantiated by different atomic formulas and their derivatives (which may differ even for the same logic under different encodings of interpretations as formal words).

The WS1S instance is described in the draft paper A Coalgebraic Decision Procedure for WS1S by the author. notify = traytel@in.tum.de [Myhill-Nerode] title = The Myhill-Nerode Theorem Based on Regular Expressions author = Chunhan Wu <>, Xingyuan Zhang <>, Christian Urban contributors = Manuel Eberl topic = Computer science/Automata and formal languages date = 2011-08-26 abstract = There are many proofs of the Myhill-Nerode theorem using automata. In this library we give a proof entirely based on regular expressions, since regularity of languages can be conveniently defined using regular expressions (it is more painful in HOL to define regularity in terms of automata). We prove the first direction of the Myhill-Nerode theorem by solving equational systems that involve regular expressions. For the second direction we give two proofs: one using tagging-functions and another using partial derivatives. We also establish various closure properties of regular languages. Most details of the theories are described in our ITP 2011 paper. notify = christian.urban@kcl.ac.uk [Universal_Turing_Machine] title = Universal Turing Machine author = Jian Xu<>, Xingyuan Zhang<>, Christian Urban , Sebastiaan J. C. Joosten topic = Logic/Computability, Computer science/Automata and formal languages date = 2019-02-08 notify = sjcjoosten@gmail.com, christian.urban@kcl.ac.uk abstract = We formalise results from computability theory: recursive functions, undecidability of the halting problem, and the existence of a universal Turing machine. This formalisation is the AFP entry corresponding to the paper Mechanising Turing Machines and Computability Theory in Isabelle/HOL, ITP 2013. [CYK] title = A formalisation of the Cocke-Younger-Kasami algorithm author = Maksym Bortin date = 2016-04-27 topic = Computer science/Algorithms, Computer science/Automata and formal languages abstract = The theory provides a formalisation of the Cocke-Younger-Kasami algorithm (CYK for short), an approach to solving the word problem for context-free languages. CYK decides if a word is in the languages generated by a context-free grammar in Chomsky normal form. The formalized algorithm is executable. notify = maksym.bortin@nicta.com.au [Boolean_Expression_Checkers] title = Boolean Expression Checkers author = Tobias Nipkow date = 2014-06-08 topic = Computer science/Algorithms, Logic/General logic/Mechanization of proofs abstract = This entry provides executable checkers for the following properties of boolean expressions: satisfiability, tautology and equivalence. Internally, the checkers operate on binary decision trees and are reasonably efficient (for purely functional algorithms). extra-history = Change history: [2015-09-23]: Salomon Sickert added an interface that does not require the usage of the Boolean formula datatype. Furthermore the general Mapping type is used instead of an association list. notify = nipkow@in.tum.de [Presburger-Automata] title = Formalizing the Logic-Automaton Connection author = Stefan Berghofer , Markus Reiter <> date = 2009-12-03 topic = Computer science/Automata and formal languages, Logic/General logic/Decidability of theories abstract = This work presents a formalization of a library for automata on bit strings. It forms the basis of a reflection-based decision procedure for Presburger arithmetic, which is efficiently executable thanks to Isabelle's code generator. With this work, we therefore provide a mechanized proof of a well-known connection between logic and automata theory. 
The formalization is also described in a publication [TPHOLs 2009]. notify = berghofe@in.tum.de [Functional-Automata] title = Functional Automata author = Tobias Nipkow date = 2004-03-30 topic = Computer science/Automata and formal languages abstract = This theory defines deterministic and nondeterministic automata in a functional representation: the transition function/relation and the finality predicate are just functions. Hence the state space may be infinite. It is shown how to convert regular expressions into such automata. A scanner (generator) is implemented with the help of functional automata: the scanner chops the input up into longest recognized substrings. Finally we also show how to convert a certain subclass of functional automata (essentially the finite deterministic ones) into regular sets. notify = nipkow@in.tum.de [Statecharts] title = Formalizing Statecharts using Hierarchical Automata author = Steffen Helke , Florian Kammüller topic = Computer science/Automata and formal languages date = 2010-08-08 abstract = We formalize in Isabelle/HOL the abtract syntax and a synchronous step semantics for the specification language Statecharts. The formalization is based on Hierarchical Automata which allow a structural decomposition of Statecharts into Sequential Automata. To support the composition of Statecharts, we introduce calculating operators to construct a Hierarchical Automaton in a stepwise manner. Furthermore, we present a complete semantics of Statecharts including a theory of data spaces, which enables the modelling of racing effects. We also adapt CTL for Statecharts to build a bridge for future combinations with model checking. However the main motivation of this work is to provide a sound and complete basis for reasoning on Statecharts. As a central meta theorem we prove that the well-formedness of a Statechart is preserved by the semantics. notify = nipkow@in.tum.de [Stuttering_Equivalence] title = Stuttering Equivalence author = Stephan Merz topic = Computer science/Automata and formal languages date = 2012-05-07 abstract =

Two omega-sequences are stuttering equivalent if they differ only by finite repetitions of elements. Stuttering equivalence is a fundamental concept in the theory of concurrent and distributed systems. Notably, Lamport argues that refinement notions for such systems should be insensitive to finite stuttering. Peled and Wilke showed that all PLTL (propositional linear-time temporal logic) properties that are insensitive to stuttering equivalence can be expressed without the next-time operator. Stuttering equivalence is also important for certain verification techniques such as partial-order reduction for model checking.

We formalize stuttering equivalence in Isabelle/HOL. Our development relies on the notion of stuttering sampling functions that may skip blocks of identical sequence elements. We also encode PLTL and prove the theorem due to Peled and Wilke.
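
On finite lists the idea can be illustrated by collapsing maximal blocks of equal adjacent elements (a toy Haskell analogue only; the entry itself works with omega-sequences via stuttering sampling functions):

    import Data.List (group)

    -- Toy finite-list analogue, for illustration only.
    destutter :: Eq a => [a] -> [a]
    destutter = map head . group

    stutterEquiv :: Eq a => [a] -> [a] -> Bool
    stutterEquiv xs ys = destutter xs == destutter ys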

extra-history = Change history: [2013-01-31]: Added encoding of PLTL and proved Peled and Wilke's theorem. Adjusted abstract accordingly. notify = Stephan.Merz@loria.fr [Coinductive_Languages] title = A Codatatype of Formal Languages author = Dmitriy Traytel topic = Computer science/Automata and formal languages date = 2013-11-15 abstract =

We define formal languages as a codatatype of infinite trees branching over the alphabet. Each node in such a tree indicates whether the path to this node constitutes a word inside or outside of the language. This codatatype is isomorphic to the set-of-lists representation of languages, but caters for definitions by corecursion and proofs by coinduction.

Regular operations on languages are then defined by primitive corecursion. A difficulty arises here, since the standard definitions of concatenation and iteration from the coalgebraic literature are not primitively corecursive: they require guardedness up-to union/concatenation. Without support for up-to corecursion, these operations must be defined as a composition of primitive ones (and proved equal to the standard definitions). As an exercise in coinduction we also prove the axioms of Kleene algebra for the defined regular operations.
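
A Haskell analogue of the codatatype (illustrative only; the names are hypothetical) shows how membership and, for example, union arise directly from the tree structure:

    -- Illustrative only: a language as an infinite tree; the Bool at a node
    -- says whether the word spelled out by the path to it is in the language.
    data Lang a = Lang { eps :: Bool, delta :: a -> Lang a }

    member :: [a] -> Lang a -> Bool
    member []     l = eps l
    member (x:xs) l = member xs (delta l x)

    -- Union is node-wise disjunction, defined corecursively.
    union :: Lang a -> Lang a -> Lang a
    union l1 l2 = Lang (eps l1 || eps l2) (\x -> union (delta l1 x) (delta l2 x))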

Furthermore, a language for context-free grammars given by productions in Greibach normal form and an initial nonterminal is constructed by primitive corecursion, yielding an executable decision procedure for the word problem without further ado.

notify = traytel@in.tum.de [Tree-Automata] title = Tree Automata author = Peter Lammich date = 2009-11-25 topic = Computer science/Automata and formal languages abstract = This work presents a machine-checked tree automata library for Standard-ML, OCaml and Haskell. The algorithms are efficient by using appropriate data structures like RB-trees. The available algorithms for non-deterministic automata include membership query, reduction, intersection, union, and emptiness check with computation of a witness for non-emptiness. The executable algorithms are derived from less-concrete, non-executable algorithms using data-refinement techniques. The concrete data structures are from the Isabelle Collections Framework. Moreover, this work contains a formalization of the class of tree-regular languages and its closure properties under set operations. notify = peter.lammich@uni-muenster.de, nipkow@in.tum.de [Depth-First-Search] title = Depth First Search author = Toshiaki Nishihara <>, Yasuhiko Minamide <> date = 2004-06-24 topic = Computer science/Algorithms/Graph abstract = Depth-first search of a graph is formalized with recdef. It is shown that it visits all of the reachable nodes from a given list of nodes. Executable ML code of depth-first search is obtained using the code generation feature of Isabelle/HOL. notify = lp15@cam.ac.uk, krauss@in.tum.de [FFT] title = Fast Fourier Transform author = Clemens Ballarin date = 2005-10-12 topic = Computer science/Algorithms/Mathematical abstract = We formalise a functional implementation of the FFT algorithm over the complex numbers, and its inverse. Both are shown equivalent to the usual definitions of these operations through Vandermonde matrices. They are also shown to be inverse to each other, more precisely, that composition of the inverse and the transformation yield the identity up to a scalar. notify = ballarin@in.tum.de [Gauss-Jordan-Elim-Fun] title = Gauss-Jordan Elimination for Matrices Represented as Functions author = Tobias Nipkow date = 2011-08-19 topic = Computer science/Algorithms/Mathematical, Mathematics/Algebra abstract = This theory provides a compact formulation of Gauss-Jordan elimination for matrices represented as functions. Its distinctive feature is succinctness. It is not meant for large computations. notify = nipkow@in.tum.de [UpDown_Scheme] title = Verification of the UpDown Scheme author = Johannes Hölzl date = 2015-01-28 topic = Computer science/Algorithms/Mathematical abstract = The UpDown scheme is a recursive scheme used to compute the stiffness matrix on a special form of sparse grids. Usually, when discretizing a Euclidean space of dimension d we need O(n^d) points, for n points along each dimension. Sparse grids are a hierarchical representation where the number of points is reduced to O(n * log(n)^d). One disadvantage of such sparse grids is that the algorithm now operate recursively in the dimensions and levels of the sparse grid.

The UpDown scheme allows us to compute the stiffness matrix on such a sparse grid. The stiffness matrix represents the influence of each representation function on the L^2 scalar product. For a detailed description see Dirk Pflüger's PhD thesis. This formalization was developed as an interdisciplinary project (IDP) at the Technische Universität München. notify = hoelzl@in.tum.de [GraphMarkingIBP] title = Verification of the Deutsch-Schorr-Waite Graph Marking Algorithm using Data Refinement author = Viorel Preoteasa , Ralph-Johan Back date = 2010-05-28 topic = Computer science/Algorithms/Graph abstract = The verification of the Deutsch-Schorr-Waite graph marking algorithm is used as a benchmark in many formalizations of pointer programs. The main purpose of this mechanization is to show how data refinement of invariant based programs can be used in verifying practical algorithms. The verification starts with an abstract algorithm working on a graph given by a relation next on nodes. Gradually the abstract program is refined into Deutsch-Schorr-Waite graph marking algorithm where only one bit per graph node of additional memory is used for marking. extra-history = Change history: [2012-01-05]: Updated for the new definition of data refinement and the new syntax for demonic and angelic update statements notify = viorel.preoteasa@aalto.fi [Efficient-Mergesort] title = Efficient Mergesort topic = Computer science/Algorithms date = 2011-11-09 author = Christian Sternagel abstract = We provide a formalization of the mergesort algorithm as used in GHC's Data.List module, proving correctness and stability. Furthermore, experimental data suggests that generated (Haskell-)code for this algorithm is much faster than for previous algorithms available in the Isabelle distribution. extra-history = Change history: [2012-10-24]: Added reference to journal article.
[2018-09-17]: Added theory Efficient_Mergesort that works exclusively with the mutual induction schemas generated by the function package.
[2018-09-19]: Added theory Mergesort_Complexity that proves an upper bound on the number of comparisons that are required by mergesort.
[2018-09-19]: Theory Efficient_Mergesort replaces theory Efficient_Sort while keeping the old name Efficient_Sort. [2020-11-20]: Additional theory Natural_Mergesort that develops an efficient mergesort algorithm without key-functions for educational purposes. notify = c.sternagel@gmail.com [SATSolverVerification] title = Formal Verification of Modern SAT Solvers author = Filip Marić date = 2008-07-23 topic = Computer science/Algorithms abstract = This document contains formal correctness proofs of modern SAT solvers. Following (Krstic et al, 2007) and (Nieuwenhuis et al., 2006), solvers are described using state-transition systems. Several different SAT solver descriptions are given and their partial correctness and termination are proved. These include:

  • a solver based on classical DPLL procedure (using only a backtrack-search with unit propagation),
  • a very general solver with backjumping and learning (similar to the description given in (Nieuwenhuis et al., 2006)), and
  • a solver with a specific conflict analysis algorithm (similar to the description given in (Krstic et al., 2007)).
Within the SAT solver correctness proofs, a large number of lemmas about propositional logic and CNF formulae are proved. This theory is self-contained and could be used for further exploration of properties of CNF-based SAT algorithms. notify = [Transitive-Closure] title = Executable Transitive Closures of Finite Relations topic = Computer science/Algorithms/Graph date = 2011-03-14 author = Christian Sternagel , René Thiemann license = LGPL abstract = We provide a generic work-list algorithm to compute the transitive closure of finite relations where only successors of newly detected states are generated. This algorithm is then instantiated for lists over arbitrary carriers and red-black trees (which are faster but require a linear order on the carrier), respectively. Our formalization was performed as part of the IsaFoR/CeTA project where reflexive transitive closures of large tree automata have to be computed. extra-history = Change history: [2014-09-04] added example simprocs in Finite_Transitive_Closure_Simprocs notify = c.sternagel@gmail.com, rene.thiemann@uibk.ac.at [Transitive-Closure-II] title = Executable Transitive Closures topic = Computer science/Algorithms/Graph date = 2012-02-29 author = René Thiemann license = LGPL abstract =

We provide a generic work-list algorithm to compute the (reflexive-)transitive closure of relations where only successors of newly detected states are generated. In contrast to our previous work, the relations do not have to be finite, but each element must only have finitely many (indirect) successors. Moreover, a subsumption relation can be used instead of pure equality. An executable variant of the algorithm is available where the generic operations are instantiated with list operations.
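The basic shape of such a work-list computation can be sketched as follows (Haskell, for illustration; the entry's algorithm is generic in the successor function and additionally supports a subsumption relation instead of plain equality):

  import qualified Data.Set as Set

  -- Only successors of newly discovered states are generated.
  reachable :: Ord a => (a -> [a]) -> [a] -> Set.Set a
  reachable succs = go Set.empty
    where
      go seen []       = seen
      go seen (x:todo)
        | x `Set.member` seen = go seen todo
        | otherwise           = go (Set.insert x seen) (succs x ++ todo)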

This formalization was performed as part of the IsaFoR/CeTA project, and it has been used to certify size-change termination proofs where large transitive closures have to be computed.

notify = rene.thiemann@uibk.ac.at [MuchAdoAboutTwo] title = Much Ado About Two author = Sascha Böhme date = 2007-11-06 topic = Computer science/Algorithms abstract = This article is an Isabelle formalisation of a paper with the same title. In a similar way as Knuth's 0-1-principle for sorting algorithms, that paper develops a 0-1-2-principle for parallel prefix computations. notify = boehmes@in.tum.de [DiskPaxos] title = Proving the Correctness of Disk Paxos date = 2005-06-22 author = Mauro Jaskelioff , Stephan Merz topic = Computer science/Algorithms/Distributed abstract = Disk Paxos is an algorithm for building arbitrary fault-tolerant distributed systems. The specification of Disk Paxos has been proved correct informally and tested using the TLC model checker, but up to now, it has never been fully formally verified. In this work we have formally verified its correctness using the Isabelle theorem prover and the HOL logic system, showing that Isabelle is a practical tool for verifying properties of TLA+ specifications. notify = kleing@cse.unsw.edu.au [GenClock] title = Formalization of a Generalized Protocol for Clock Synchronization author = Alwen Tiu date = 2005-06-24 topic = Computer science/Algorithms/Distributed abstract = We formalize the generalized Byzantine fault-tolerant clock synchronization protocol of Schneider. This protocol abstracts from particular algorithms or implementations for clock synchronization. This abstraction includes several assumptions on the behaviors of physical clocks and on general properties of concrete algorithms/implementations. Based on these assumptions the correctness of the protocol is proved by Schneider. His proof was later verified by Shankar using the theorem prover EHDM (precursor to PVS). Our formalization in Isabelle/HOL is based on Shankar's formalization. notify = kleing@cse.unsw.edu.au [ClockSynchInst] title = Instances of Schneider's generalized protocol of clock synchronization author = Damián Barsotti date = 2006-03-15 topic = Computer science/Algorithms/Distributed abstract = F. B. Schneider ("Understanding protocols for Byzantine clock synchronization") generalizes a number of protocols for Byzantine fault-tolerant clock synchronization and presents a uniform proof for their correctness. In Schneider's schema, each processor maintains a local clock by periodically adjusting each value to one computed by a convergence function applied to the readings of all the clocks. Then, correctness of an algorithm, i.e. that the readings of two clocks at any time are within a fixed bound of each other, is based upon some conditions on the convergence function. To prove that a particular clock synchronization algorithm is correct it suffices to show that the convergence function used by the algorithm meets Schneider's conditions. Using the theorem prover Isabelle, we formalize the proofs that the convergence functions of two algorithms, namely, the Interactive Convergence Algorithm (ICA) of Lamport and Melliar-Smith and the Fault-tolerant Midpoint algorithm of Lundelius-Lynch, meet Schneider's conditions. Furthermore, we experiment on handling some parts of the proofs with fully automatic tools like ICS and CVC-lite. These theories are part of a joint work with Alwen Tiu and Leonor P. Nieto "Verification of Clock Synchronization Algorithms: Experiments on a combination of deductive tools" in proceedings of AVOCS 2005. In this work the correctness of Schneider schema was also verified using Isabelle (entry GenClock in AFP). 
notify = kleing@cse.unsw.edu.au [Heard_Of] title = Verifying Fault-Tolerant Distributed Algorithms in the Heard-Of Model date = 2012-07-27 author = Henri Debrat , Stephan Merz topic = Computer science/Algorithms/Distributed abstract = Distributed computing is inherently based on replication, promising increased tolerance to failures of individual computing nodes or communication channels. Realizing this promise, however, involves quite subtle algorithmic mechanisms, and requires precise statements about the kinds and numbers of faults that an algorithm tolerates (such as process crashes, communication faults or corrupted values). The landmark theorem due to Fischer, Lynch, and Paterson shows that it is impossible to achieve Consensus among N asynchronously communicating nodes in the presence of even a single permanent failure. Existing solutions must rely on assumptions of "partial synchrony".

Indeed, there have been numerous misunderstandings on what exactly a given algorithm is supposed to realize in what kinds of environments. Moreover, the abundance of subtly different computational models complicates comparisons between different algorithms. Charron-Bost and Schiper introduced the Heard-Of model for representing algorithms and failure assumptions in a uniform framework, simplifying comparisons between algorithms.

In this contribution, we represent the Heard-Of model in Isabelle/HOL. We define two semantics of runs of algorithms with different unit of atomicity and relate these through a reduction theorem that allows us to verify algorithms in the coarse-grained semantics (where proofs are easier) and infer their correctness for the fine-grained one (which corresponds to actual executions). We instantiate the framework by verifying six Consensus algorithms that differ in the underlying algorithmic mechanisms and the kinds of faults they tolerate. notify = Stephan.Merz@loria.fr [Consensus_Refined] title = Consensus Refined date = 2015-03-18 author = Ognjen Maric <>, Christoph Sprenger topic = Computer science/Algorithms/Distributed abstract = Algorithms for solving the consensus problem are fundamental to distributed computing. Despite their brevity, their ability to operate in concurrent, asynchronous and failure-prone environments comes at the cost of complex and subtle behaviors. Accordingly, understanding how they work and proving their correctness is a non-trivial endeavor where abstraction is immensely helpful. Moreover, research on consensus has yielded a large number of algorithms, many of which appear to share common algorithmic ideas. A natural question is whether and how these similarities can be distilled and described in a precise, unified way. In this work, we combine stepwise refinement and lockstep models to provide an abstract and unified view of a sizeable family of consensus algorithms. Our models provide insights into the design choices underlying the different algorithms, and classify them based on those choices. notify = sprenger@inf.ethz.ch [Key_Agreement_Strong_Adversaries] title = Refining Authenticated Key Agreement with Strong Adversaries author = Joseph Lallemand , Christoph Sprenger topic = Computer science/Security license = LGPL date = 2017-01-31 notify = joseph.lallemand@loria.fr, sprenger@inf.ethz.ch abstract = We develop a family of key agreement protocols that are correct by construction. Our work substantially extends prior work on developing security protocols by refinement. First, we strengthen the adversary by allowing him to compromise different resources of protocol participants, such as their long-term keys or their session keys. This enables the systematic development of protocols that ensure strong properties such as perfect forward secrecy. Second, we broaden the class of protocols supported to include those with non-atomic keys and equationally defined cryptographic operators. We use these extensions to develop key agreement protocols including signed Diffie-Hellman and the core of IKEv1 and SKEME. [Security_Protocol_Refinement] title = Developing Security Protocols by Refinement author = Christoph Sprenger , Ivano Somaini<> topic = Computer science/Security license = LGPL date = 2017-05-24 notify = sprenger@inf.ethz.ch abstract = We propose a development method for security protocols based on stepwise refinement. Our refinement strategy transforms abstract security goals into protocols that are secure when operating over an insecure channel controlled by a Dolev-Yao-style intruder. As intermediate levels of abstraction, we employ messageless guard protocols and channel protocols communicating over channels with security properties. These abstractions provide insights on why protocols are secure and foster the development of families of protocols sharing common structure and properties. 
We have implemented our method in Isabelle/HOL and used it to develop different entity authentication and key establishment protocols, including realistic features such as key confirmation, replay caches, and encrypted tickets. Our development highlights that guard protocols and channel protocols provide fundamental abstractions for bridging the gap between security properties and standard protocol descriptions based on cryptographic messages. It also shows that our refinement approach scales to protocols of nontrivial size and complexity. [Abortable_Linearizable_Modules] title = Abortable Linearizable Modules author = Rachid Guerraoui , Viktor Kuncak , Giuliano Losa date = 2012-03-01 topic = Computer science/Algorithms/Distributed abstract = We define the Abortable Linearizable Module automaton (ALM for short) and prove its key composition property using the IOA theory of HOLCF. The ALM is at the heart of the Speculative Linearizability framework. This framework simplifies devising correct speculative algorithms by enabling their decomposition into independent modules that can be analyzed and proved correct in isolation. It is particularly useful when working in a distributed environment, where the need to tolerate faults and asynchrony has made current monolithic protocols so intricate that it is no longer tractable to check their correctness. Our theory contains a typical example of a refinement proof in the I/O-automata framework of Lynch and Tuttle. notify = giuliano@losa.fr, nipkow@in.tum.de [Amortized_Complexity] title = Amortized Complexity Verified author = Tobias Nipkow date = 2014-07-07 topic = Computer science/Data structures abstract = A framework for the analysis of the amortized complexity of functional data structures is formalized in Isabelle/HOL and applied to a number of standard examples and to the following non-trivial ones: skew heaps, splay trees, splay heaps and pairing heaps.
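The analyses follow the usual potential method (stated here in standard textbook notation, which need not match the entry's own formulation):

  \[ a_i = t_i + \Phi(s_i) - \Phi(s_{i-1}), \qquad \sum_{i=1}^{n} t_i \le \sum_{i=1}^{n} a_i \quad \text{whenever } \Phi \ge 0 \text{ and } \Phi(s_0) = 0, \]

where t_i is the actual cost of the i-th operation, a_i its amortized cost, and s_i the data structure after that operation.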

A preliminary version of this work (without pairing heaps) is described in a paper published in the proceedings of the conference on Interactive Theorem Proving ITP 2015. An extended version of this publication is available here. extra-history = Change history: [2015-03-17]: Added pairing heaps by Hauke Brinkop.
[2016-07-12]: Moved splay heaps from here to Splay_Tree
[2016-07-14]: Moved pairing heaps from here to the new Pairing_Heap notify = nipkow@in.tum.de [Dynamic_Tables] title = Parameterized Dynamic Tables author = Tobias Nipkow date = 2015-06-07 topic = Computer science/Data structures abstract = This article formalizes the amortized analysis of dynamic tables parameterized with their minimal and maximal load factors and the expansion and contraction factors.

A full description is found in a companion paper. notify = nipkow@in.tum.de [AVL-Trees] title = AVL Trees author = Tobias Nipkow , Cornelia Pusch <> date = 2004-03-19 topic = Computer science/Data structures abstract = Two formalizations of AVL trees with room for extensions. The first formalization is monolithic and shorter, the second one in two stages, longer and a bit simpler. The final implementation is the same. If you are interested in developing this further, please contact gerwin.klein@nicta.com.au. extra-history = Change history: [2011-04-11]: Ondrej Kuncar added delete function notify = kleing@cse.unsw.edu.au [BDD] title = BDD Normalisation author = Veronika Ortner <>, Norbert Schirmer <> date = 2008-02-29 topic = Computer science/Data structures abstract = We present the verification of the normalisation of a binary decision diagram (BDD). The normalisation follows the original algorithm presented by Bryant in 1986 and transforms an ordered BDD in a reduced, ordered and shared BDD. The verification is based on Hoare logics. notify = kleing@cse.unsw.edu.au, norbert.schirmer@web.de [BinarySearchTree] title = Binary Search Trees author = Viktor Kuncak date = 2004-04-05 topic = Computer science/Data structures abstract = The correctness is shown of binary search tree operations (lookup, insert and remove) implementing a set. Two versions are given, for both structured and linear (tactic-style) proofs. An implementation of integer-indexed maps is also verified. notify = lp15@cam.ac.uk [Splay_Tree] title = Splay Tree author = Tobias Nipkow notify = nipkow@in.tum.de date = 2014-08-12 topic = Computer science/Data structures abstract = Splay trees are self-adjusting binary search trees which were invented by Sleator and Tarjan [JACM 1985]. This entry provides executable and verified functional splay trees as well as the related splay heaps (due to Okasaki).

The amortized complexity of splay trees and heaps is analyzed in the AFP entry Amortized Complexity. extra-history = Change history: [2016-07-12]: Moved splay heaps here from Amortized_Complexity [Root_Balanced_Tree] title = Root-Balanced Tree author = Tobias Nipkow notify = nipkow@in.tum.de date = 2017-08-20 topic = Computer science/Data structures abstract =

Andersson introduced general balanced trees, search trees based on the design principle of partial rebuilding: perform update operations naively until the tree becomes too unbalanced, at which point a whole subtree is rebalanced. This article defines and analyzes a functional version of general balanced trees, which we call root-balanced trees. Using a lightweight model of execution time, amortized logarithmic complexity is verified in the theorem prover Isabelle.
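The rebuilding step itself is simple; here is a Haskell sketch over a plain unbalanced binary search tree (illustration only, not the entry's definitions, which also fix when rebuilding is triggered):

  data Tree a = Leaf | Node (Tree a) a (Tree a)

  inorder :: Tree a -> [a]
  inorder Leaf         = []
  inorder (Node l x r) = inorder l ++ [x] ++ inorder r

  -- Rebuild a subtree into one of minimal height from its sorted contents.
  rebuild :: Tree a -> Tree a
  rebuild = fromSorted . inorder
    where
      fromSorted [] = Leaf
      fromSorted xs = let (l, x:r) = splitAt (length xs `div` 2) xs
                      in Node (fromSorted l) x (fromSorted r)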

This is the Isabelle formalization of the material described in the APLAS 2017 article Verified Root-Balanced Trees by the same author, which also presents experimental results that show the competitiveness of root-balanced trees with AVL and red-black trees.

[Skew_Heap] title = Skew Heap author = Tobias Nipkow date = 2014-08-13 topic = Computer science/Data structures abstract = Skew heaps are an amazingly simple and lightweight implementation of priority queues. They were invented by Sleator and Tarjan [SIAM 1986] and have logarithmic amortized complexity. This entry provides executable and verified functional skew heaps.
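For readers who have not seen them, a skew heap fits in a few lines of Haskell (illustration only; the verified versions are the Isabelle functions in this entry):

  data SkewHeap a = Empty | Node (SkewHeap a) a (SkewHeap a)

  -- The only trick: after melding, the children of the new root are swapped.
  merge :: Ord a => SkewHeap a -> SkewHeap a -> SkewHeap a
  merge Empty h = h
  merge h Empty = h
  merge h1@(Node l1 x1 r1) h2@(Node l2 x2 r2)
    | x1 <= x2  = Node (merge h2 r1) x1 l1
    | otherwise = Node (merge h1 r2) x2 l2

  insert :: Ord a => a -> SkewHeap a -> SkewHeap a
  insert x = merge (Node Empty x Empty)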

The amortized complexity of skew heaps is analyzed in the AFP entry Amortized Complexity. notify = nipkow@in.tum.de [Pairing_Heap] title = Pairing Heap author = Hauke Brinkop , Tobias Nipkow date = 2016-07-14 topic = Computer science/Data structures abstract = This library defines three different versions of pairing heaps: a functional version of the original design based on binary trees [Fredman et al. 1986], the version by Okasaki [1998] and a modified version of the latter that is free of structural invariants.
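As an informal reference point, Okasaki-style pairing heaps look roughly as follows in Haskell (a sketch only; it does not reflect the entry's three Isabelle versions or their invariants):

  data PHeap a = PEmpty | PNode a [PHeap a]

  meld :: Ord a => PHeap a -> PHeap a -> PHeap a
  meld PEmpty h = h
  meld h PEmpty = h
  meld h1@(PNode x hs1) h2@(PNode y hs2)
    | x <= y    = PNode x (h2 : hs1)
    | otherwise = PNode y (h1 : hs2)

  -- delete-min melds the root's children pairwise, then melds the results.
  mergePairs :: Ord a => [PHeap a] -> PHeap a
  mergePairs []         = PEmpty
  mergePairs [h]        = h
  mergePairs (h1:h2:hs) = meld (meld h1 h2) (mergePairs hs)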

The amortized complexity of pairing heaps is analyzed in the AFP article Amortized Complexity. extra-0 = Origin: This library was extracted from Amortized Complexity and extended. notify = nipkow@in.tum.de [Priority_Queue_Braun] title = Priority Queues Based on Braun Trees author = Tobias Nipkow date = 2014-09-04 topic = Computer science/Data structures abstract = This entry verifies priority queues based on Braun trees. Insertion and deletion take logarithmic time and preserve the balanced nature of Braun trees. Two implementations of deletion are provided. notify = nipkow@in.tum.de extra-history = Change history: [2019-12-16]: Added theory Priority_Queue_Braun2 with second version of del_min [Binomial-Queues] title = Functional Binomial Queues author = René Neumann date = 2010-10-28 topic = Computer science/Data structures abstract = Priority queues are an important data structure and efficient implementations of them are crucial. We implement a functional variant of binomial queues in Isabelle/HOL and show its functional correctness. A verification against an abstract reference specification of priority queues has also been attempted, but could not be achieved to the full extent. notify = florian.haftmann@informatik.tu-muenchen.de [Binomial-Heaps] title = Binomial Heaps and Skew Binomial Heaps author = Rene Meis , Finn Nielsen , Peter Lammich date = 2010-10-28 topic = Computer science/Data structures abstract = We implement and prove correct binomial heaps and skew binomial heaps. Both are data-structures for priority queues. While binomial heaps have logarithmic findMin, deleteMin, insert, and meld operations, skew binomial heaps have constant time findMin, insert, and meld operations, and only the deleteMin-operation is logarithmic. This is achieved by using skew links to avoid cascading linking on insert-operations, and data-structural bootstrapping to get constant-time findMin and meld operations. Our implementation follows the paper by Brodal and Okasaki. notify = peter.lammich@uni-muenster.de [Finger-Trees] title = Finger Trees author = Benedikt Nordhoff , Stefan Körner , Peter Lammich date = 2010-10-28 topic = Computer science/Data structures abstract = We implement and prove correct 2-3 finger trees. Finger trees are a general purpose data structure, that can be used to efficiently implement other data structures, such as priority queues. Intuitively, a finger tree is an annotated sequence, where the annotations are elements of a monoid. Apart from operations to access the ends of the sequence, the main operation is to split the sequence at the point where a monotone predicate over the sum of the left part of the sequence becomes true for the first time. The implementation follows the paper of Hinze and Paterson. The code generator can be used to get efficient, verified code. notify = peter.lammich@uni-muenster.de [Trie] title = Trie author = Andreas Lochbihler , Tobias Nipkow date = 2015-03-30 topic = Computer science/Data structures abstract = This article formalizes the ``trie'' data structure invented by Fredkin [CACM 1960]. It also provides a specialization where the entries in the trie are lists. extra-0 = Origin: This article was extracted from existing articles by the authors. notify = nipkow@in.tum.de [FinFun] title = Code Generation for Functions as Data author = Andreas Lochbihler date = 2009-05-06 topic = Computer science/Data structures abstract = FinFuns are total functions that are constant except for a finite set of points, i.e. a generalisation of finite maps. 
They are formalised as a new type in Isabelle/HOL such that the code generator can handle equality tests and quantification on FinFuns. On the code output level, FinFuns are explicitly represented by constant functions and pointwise updates, similarly to associative lists. Inside the logic, they behave like ordinary functions with extensionality. Via the update/constant pattern, a recursion combinator and an induction rule for FinFuns allow for defining and reasoning about operators on FinFun that are also executable. extra-history = Change history: [2010-08-13]: new concept domain of a FinFun as a FinFun (revision 34b3517cbc09)
[2010-11-04]: new conversion function from FinFun to list of elements in the domain (revision 0c167102e6ed)
[2012-03-07]: replace sets as FinFuns by predicates as FinFuns because the set type constructor has been reintroduced (revision b7aa87989f3a) notify = nipkow@in.tum.de [Collections] title = Collections Framework author = Peter Lammich contributors = Andreas Lochbihler , Thomas Tuerk <> date = 2009-11-25 topic = Computer science/Data structures abstract = This development provides an efficient, extensible, machine checked collections framework. The library adopts the concepts of interface, implementation and generic algorithm from object-oriented programming and implements them in Isabelle/HOL. The framework features the use of data refinement techniques to refine an abstract specification (using high-level concepts like sets) to a more concrete implementation (using collection datastructures, like red-black-trees). The code-generator of Isabelle/HOL can be used to generate efficient code. extra-history = Change history: [2010-10-08]: New Interfaces: OrderedSet, OrderedMap, List. Fifo now implements list-interface: Function names changed: put/get --> enqueue/dequeue. New Implementations: ArrayList, ArrayHashMap, ArrayHashSet, TrieMap, TrieSet. Invariant-free datastructures: Invariant implicitely hidden in typedef. Record-interfaces: All operations of an interface encapsulated as record. Examples moved to examples subdirectory.
[2010-12-01]: New Interfaces: Priority Queues, Annotated Lists. Implemented by finger trees, (skew) binomial queues.
[2011-10-10]: SetSpec: Added operations: sng, isSng, bexists, size_abort, diff, filter, iterate_rule_insertP. MapSpec: Added operations: sng, isSng, iterate_rule_insertP, bexists, size, size_abort, restrict, map_image_filter, map_value_image_filter. Some maintenance changes
[2012-04-25]: New iterator foundation by Tuerk. Various maintenance changes.
[2012-08]: Collections V2. New features: Polymorphic iterators. Generic algorithm instantiation where required. Naming scheme changed from xx_opname to xx.opname. A compatibility file CollectionsV1 tries to simplify porting of existing theories, by providing old naming scheme and the old monomorphic iterator locales.
[2013-09]: Added Generic Collection Framework based on Autoref. The GenCF provides: Arbitrary nesting, full integration with Autoref.
[2014-06]: Maintenance changes to GenCF: Optimized inj_image on list_set. op_set_cart (Cartesian product). big-Union operation. atLeastLessThan - operation ({a..<b})
notify = lammich@in.tum.de [Containers] title = Light-weight Containers author = Andreas Lochbihler contributors = René Thiemann date = 2013-04-15 topic = Computer science/Data structures abstract = This development provides a framework for container types like sets and maps such that generated code implements these containers with different (efficient) data structures. Thanks to type classes and refinement during code generation, this light-weight approach can seamlessly replace Isabelle's default setup for code generation. Heuristics automatically pick one of the available data structures depending on the type of elements to be stored, but users can also choose on their own. The extensible design permits adding more implementations at any time.

To support arbitrary nesting of sets, we define a linear order on sets based on a linear order of the elements and provide efficient implementations. It even allows comparing complements with non-complements. extra-history = Change history: [2013-07-11]: add pretty printing for sets (revision 7f3f52c5f5fa)
[2013-09-20]: provide generators for canonical type class instantiations (revision 159f4401f4a8 by René Thiemann)
[2014-07-08]: add support for going from partial functions to mappings (revision 7a6fc957e8ed)
[2018-03-05]: add two application examples: depth-first search and 2SAT (revision e5e1a1da2411) notify = mail@andreas-lochbihler.de [FileRefinement] title = File Refinement author = Karen Zee , Viktor Kuncak date = 2004-12-09 topic = Computer science/Data structures abstract = These theories illustrate the verification of basic file operations (file creation, file read and file write) in the Isabelle theorem prover. We describe a file at two levels of abstraction: an abstract file represented as a resizable array, and a concrete file represented using data blocks. notify = kkz@mit.edu [Datatype_Order_Generator] title = Generating linear orders for datatypes author = René Thiemann date = 2012-08-07 topic = Computer science/Data structures abstract = We provide a framework for registering automatic methods to derive class instances of datatypes, as it is possible using Haskell's ``deriving Ord, Show, ...'' feature.
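For comparison only, the Haskell feature being mimicked (this is not Isabelle input):

  data Expr = Var String | App Expr Expr
    deriving (Eq, Ord, Show)   -- instances generated by the compiler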

We further implemented such automatic methods to derive (linear) orders or hash-functions which are required in the Isabelle Collection Framework. Moreover, for the tactic of Huffman and Krauss to show that a datatype is countable, we implemented a wrapper so that this tactic becomes accessible in our framework.

Our formalization was performed as part of the IsaFoR/CeTA project. With our new tactic we could completely remove tedious proofs for linear orders of two datatypes.

This development is aimed at datatypes generated by the "old_datatype" command. notify = rene.thiemann@uibk.ac.at [Deriving] title = Deriving class instances for datatypes author = Christian Sternagel , René Thiemann date = 2015-03-11 topic = Computer science/Data structures abstract =

We provide a framework for registering automatic methods to derive class instances of datatypes, as it is possible using Haskell's ``deriving Ord, Show, ...'' feature.

We further implemented such automatic methods to derive comparators, linear orders, parametrizable equality functions, and hash-functions which are required in the Isabelle Collection Framework and the Container Framework. Moreover, for the tactic of Blanchette to show that a datatype is countable, we implemented a wrapper so that this tactic becomes accessible in our framework. All of the generators are based on the infrastructure that is provided by the BNF-based datatype package.

Our formalization was performed as part of the IsaFoR/CeTA project. With our new tactics we could remove several tedious proofs for (conditional) linear orders, and conditional equality operators within IsaFoR and the Container Framework.

notify = rene.thiemann@uibk.ac.at [List-Index] title = List Index date = 2010-02-20 author = Tobias Nipkow topic = Computer science/Data structures abstract = This theory provides functions for finding the index of an element in a list, by predicate and by value. notify = nipkow@in.tum.de [List-Infinite] title = Infinite Lists date = 2011-02-23 author = David Trachtenherz <> topic = Computer science/Data structures abstract = We introduce a theory of infinite lists in HOL formalized as functions over naturals (folder ListInf, theories ListInf and ListInf_Prefix). It also provides additional results for finite lists (theory ListInf/List2), natural numbers (folder CommonArith, esp. division/modulo, naturals with infinity), sets (folder CommonSet, esp. cutting/truncating sets, traversing sets of naturals). notify = nipkow@in.tum.de [Matrix] title = Executable Matrix Operations on Matrices of Arbitrary Dimensions topic = Computer science/Data structures date = 2010-06-17 author = Christian Sternagel , René Thiemann license = LGPL abstract = We provide the operations of matrix addition, multiplication, transposition, and matrix comparisons as executable functions over ordered semirings. Moreover, it is proven that strongly normalizing (monotone) orders can be lifted to strongly normalizing (monotone) orders over matrices. We further show that the standard semirings over the naturals, integers, and rationals, as well as the arctic semirings satisfy the axioms that are required by our matrix theory. Our formalization is part of the CeTA system which contains several termination techniques. The provided theories have been essential to formalize matrix-interpretations and arctic interpretations. extra-history = Change history: [2010-09-17]: Moved theory on arbitrary (ordered) semirings to Abstract Rewriting. notify = rene.thiemann@uibk.ac.at, christian.sternagel@uibk.ac.at [Matrix_Tensor] title = Tensor Product of Matrices topic = Computer science/Data structures, Mathematics/Algebra date = 2016-01-18 author = T.V.H. Prathamesh abstract = In this work, the Kronecker tensor product of matrices and the proofs of some of its properties are formalized. Properties which have been formalized include associativity of the tensor product and the mixed-product property. notify = prathamesh@imsc.res.in [Huffman] title = The Textbook Proof of Huffman's Algorithm author = Jasmin Christian Blanchette date = 2008-10-15 topic = Computer science/Data structures abstract = Huffman's algorithm is a procedure for constructing a binary tree with minimum weighted path length. This report presents a formal proof of the correctness of Huffman's algorithm written using Isabelle/HOL. Our proof closely follows the sketches found in standard algorithms textbooks, uncovering a few snags in the process. Another distinguishing feature of our formalization is the use of custom induction rules to help Isabelle's automatic tactics, leading to very short proofs for most of the lemmas. notify = jasmin.blanchette@gmail.com [Partial_Function_MR] title = Mutually Recursive Partial Functions author = René Thiemann topic = Computer science/Functional programming date = 2014-02-18 license = LGPL abstract = We provide a wrapper around the partial-function command that supports mutual recursion. 
notify = rene.thiemann@uibk.ac.at [Lifting_Definition_Option] title = Lifting Definition Option author = René Thiemann topic = Computer science/Functional programming date = 2014-10-13 license = LGPL abstract = We implemented a command that can be used to easily generate elements of a restricted type {x :: 'a. P x}, provided the definition is of the form f ys = (if check ys then Some(generate ys :: 'a) else None) where ys is a list of variables y1 ... yn and check ys ==> P(generate ys) can be proved.
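The supported shape of definition corresponds roughly to a smart constructor for an invariant-carrying type; a loose Haskell analogue with hypothetical names (not the Isabelle construction):

  newtype Pos = Pos Int          -- intended invariant: the wrapped Int is positive

  mkPos :: Int -> Maybe Pos
  mkPos n = if n > 0 then Just (Pos n) else Nothing   -- the check is performed once, here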

In principle, such a definition is also directly possible using the lift_definition command. However, this definition will then not be suitable for code-generation. To this end, we automated a more complex construction of Joachim Breitner which is amenable to code-generation, and where the test check ys will only be performed once. In the automation, one auxiliary type is created, and Isabelle's lifting- and transfer-package is invoked several times.
The initial theory was contributed by Paulson and Wenzel. Extensions and other coinductive formalisations of general interest are welcome. extra-history = Change history: [2010-06-10]: coinductive lists: setup for quotient package (revision 015574f3bf3c)
[2010-06-28]: new codatatype terminated lazy lists (revision e12de475c558)
[2010-08-04]: terminated lazy lists: setup for quotient package; more lemmas (revision 6ead626f1d01)
[2010-08-17]: Koenig's lemma as an example application for coinductive lists (revision f81ce373fa96)
[2011-02-01]: lazy implementation of coinductive (terminated) lists for the code generator (revision 6034973dce83)
[2011-07-20]: new codatatype resumption (revision 811364c776c7)
[2012-06-27]: new codatatype stream with operations (with contributions by Peter Gammie) (revision dd789a56473c)
[2013-03-13]: construct codatatypes with the BNF package and adjust the definitions and proofs, setup for lifting and transfer packages (revision f593eda5b2c0)
[2013-09-20]: stream theory uses type and operations from HOL/BNF/Examples/Stream (revision 692809b2b262)
[2014-04-03]: ccpo structure on codatatypes used to define ldrop, ldropWhile, lfilter, lconcat as least fixpoint; ccpo topology on coinductive lists contributed by Johannes Hölzl; added examples (revision 23cd8156bd42)
notify = mail@andreas-lochbihler.de [Stream-Fusion] title = Stream Fusion author = Brian Huffman topic = Computer science/Functional programming date = 2009-04-29 abstract = Stream Fusion is a system for removing intermediate list structures from Haskell programs; it consists of a Haskell library along with several compiler rewrite rules. (The library is available online.)
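For orientation, the stream type at the core of the library is usually presented along these lines (simplified Haskell sketch; the formalization itself is carried out in HOLCF):

  {-# LANGUAGE ExistentialQuantification #-}

  data Step s a = Done | Skip s | Yield a s
  data Stream a = forall s. Stream (s -> Step s a) s

  stream :: [a] -> Stream a
  stream xs0 = Stream next xs0
    where next []     = Done
          next (x:xs) = Yield x xs

  unstream :: Stream a -> [a]
  unstream (Stream next s0) = go s0
    where go s = case next s of
                   Done       -> []
                   Skip s'    -> go s'
                   Yield x s' -> x : go s'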

These theories contain a formalization of much of the Stream Fusion library in HOLCF. Lazy list and stream types are defined, along with coercions between the two types, as well as an equivalence relation for streams that generate the same list. List and stream versions of map, filter, foldr, enumFromTo, append, zipWith, and concatMap are defined, and the stream versions are shown to respect stream equivalence. notify = brianh@cs.pdx.edu [Tycon] title = Type Constructor Classes and Monad Transformers author = Brian Huffman date = 2012-06-26 topic = Computer science/Functional programming abstract = These theories contain a formalization of first class type constructors and axiomatic constructor classes for HOLCF. This work is described in detail in the ICFP 2012 paper Formal Verification of Monad Transformers by the author. The formalization is a revised and updated version of earlier joint work with Matthews and White.

Based on the hierarchy of type classes in Haskell, we define classes for functors, monads, monad-plus, etc. Each one includes all the standard laws as axioms. We also provide a new user command, tycondef, for defining new type constructors in HOLCF. Using tycondef, we instantiate the type class hierarchy with various monads and monad transformers. notify = huffman@in.tum.de [CoreC++] title = CoreC++ author = Daniel Wasserrab date = 2006-05-15 topic = Computer science/Programming languages/Language definitions abstract = We present an operational semantics and type safety proof for multiple inheritance in C++. The semantics models the behavior of method calls, field accesses, and two forms of casts in C++ class hierarchies. For explanations see the OOPSLA 2006 paper by Wasserrab, Nipkow, Snelting and Tip. notify = nipkow@in.tum.de [FeatherweightJava] title = A Theory of Featherweight Java in Isabelle/HOL author = J. Nathan Foster , Dimitrios Vytiniotis date = 2006-03-31 topic = Computer science/Programming languages/Language definitions abstract = We formalize the type system, small-step operational semantics, and type soundness proof for Featherweight Java, a simple object calculus, in Isabelle/HOL. notify = kleing@cse.unsw.edu.au [Jinja] title = Jinja is not Java author = Gerwin Klein , Tobias Nipkow date = 2005-06-01 topic = Computer science/Programming languages/Language definitions abstract = We introduce Jinja, a Java-like programming language with a formal semantics designed to exhibit core features of the Java language architecture. Jinja is a compromise between realism of the language and tractability and clarity of the formal semantics. The following aspects are formalised: a big and a small step operational semantics for Jinja and a proof of their equivalence; a type system and a definite initialisation analysis; a type safety proof of the small step semantics; a virtual machine (JVM), its operational semantics and its type system; a type safety proof for the JVM; a bytecode verifier, i.e. data flow analyser for the JVM; a correctness proof of the bytecode verifier w.r.t. the type system; a compiler and a proof that it preserves semantics and well-typedness. The emphasis of this work is not on particular language features but on providing a unified model of the source language, the virtual machine and the compiler. The whole development has been carried out in the theorem prover Isabelle/HOL. notify = kleing@cse.unsw.edu.au, nipkow@in.tum.de [JinjaThreads] title = Jinja with Threads author = Andreas Lochbihler date = 2007-12-03 topic = Computer science/Programming languages/Language definitions abstract = We extend the Jinja source code semantics by Klein and Nipkow with Java-style arrays and threads. Concurrency is captured in a generic framework semantics for adding concurrency through interleaving to a sequential semantics, which features dynamic thread creation, inter-thread communication via shared memory, lock synchronisation and joins. Also, threads can suspend themselves and be notified by others. We instantiate the framework with the adapted versions of both Jinja source and byte code and show type safety for the multithreaded case. Equally, the compiler from source to byte code is extended, for which we prove weak bisimilarity between the source code small step semantics and the defensive Jinja virtual machine. On top of this, we formalise the JMM and show the DRF guarantee and consistency. 
For a description of the different parts, see Lochbihler's papers at FOOL 2008, ESOP 2010, ITP 2011, and ESOP 2012. extra-history = Change history: [2008-04-23]: added bytecode formalisation with arrays and threads, added thread joins (revision f74a8be156a7)
[2009-04-27]: added verified compiler from source code to bytecode; encapsulate native methods in separate semantics (revision e4f26541e58a)
[2009-11-30]: extended compiler correctness proof to infinite and deadlocking computations (revision e50282397435)
[2010-06-08]: added thread interruption; new abstract memory model with sequential consistency as implementation (revision 0cb9e8dbd78d)
[2010-06-28]: new thread interruption model (revision c0440d0a1177)
[2010-10-15]: preliminary version of the Java memory model for source code (revision 02fee0ef3ca2)
[2010-12-16]: improved version of the Java memory model, also for bytecode; executable scheduler for source code semantics (revision 1f41c1842f5a)
[2011-02-02]: simplified code generator setup; new random scheduler (revision 3059dafd013f)
[2011-07-21]: new interruption model, generalized JMM proof of DRF guarantee, allow class Object to declare methods and fields, simplified subtyping relation, corrected division and modulo implementation (revision 46e4181ed142)
[2012-02-16]: added example programs (revision bf0b06c8913d)
[2012-11-21]: type safety proof for the Java memory model, allow spurious wake-ups (revision 76063d860ae0)
[2013-05-16]: support for non-deterministic memory allocators (revision cc3344a49ced)
[2017-10-20]: add an atomic compare-and-swap operation for volatile fields (revision a6189b1d6b30)
notify = mail@andreas-lochbihler.de [Locally-Nameless-Sigma] title = Locally Nameless Sigma Calculus author = Ludovic Henrio , Florian Kammüller , Bianca Lutz , Henry Sudhof date = 2010-04-30 topic = Computer science/Programming languages/Language definitions abstract = We present a Theory of Objects based on the original functional sigma-calculus by Abadi and Cardelli but with an additional parameter to methods. We prove confluence of the operational semantics following the outline of Nipkow's proof of confluence for the lambda-calculus reusing his theory Commutation, a generic diamond lemma reduction. We furthermore formalize a simple type system for our sigma-calculus including a proof of type safety. The entire development uses the concept of Locally Nameless representation for binders. We reuse an earlier proof of confluence for a simpler sigma-calculus based on de Bruijn indices and lists to represent objects. notify = nipkow@in.tum.de [Attack_Trees] title = Attack Trees in Isabelle for GDPR compliance of IoT healthcare systems author = Florian Kammueller topic = Computer science/Security date = 2020-04-27 notify = florian.kammuller@gmail.com abstract = In this article, we present a proof theory for Attack Trees. Attack Trees are a well established and useful model for the construction of attacks on systems since they allow a stepwise exploration of high level attacks in application scenarios. Using the expressiveness of Higher Order Logic in Isabelle, we develop a generic theory of Attack Trees with a state-based semantics based on Kripke structures and CTL. The resulting framework allows mechanically supported logic analysis of the meta-theory of the proof calculus of Attack Trees and at the same time the developed proof theory enables application to case studies. A central correctness and completeness result proved in Isabelle establishes a connection between the notion of Attack Tree validity and CTL. The application is illustrated on the example of a healthcare IoT system and GDPR compliance verification. [AutoFocus-Stream] title = AutoFocus Stream Processing for Single-Clocking and Multi-Clocking Semantics author = David Trachtenherz <> date = 2011-02-23 topic = Computer science/Programming languages/Language definitions abstract = We formalize the AutoFocus Semantics (a time-synchronous subset of the Focus formalism) as stream processing functions on finite and infinite message streams represented as finite/infinite lists. The formalization comprises both the conventional single-clocking semantics (uniform global clock for all components and communications channels) and its extension to multi-clocking semantics (internal execution clocking of a component may be a multiple of the external communication clocking). The semantics is defined by generic stream processing functions making it suitable for simulation/code generation in Isabelle/HOL. Furthermore, a number of AutoFocus semantics properties are formalized using definitions from the IntervalLogic theories. notify = nipkow@in.tum.de [FocusStreamsCaseStudies] title = Stream Processing Components: Isabelle/HOL Formalisation and Case Studies author = Maria Spichkova date = 2013-11-14 topic = Computer science/Programming languages/Language definitions abstract = This set of theories presents an Isabelle/HOL formalisation of stream processing components introduced in Focus, a framework for formal specification and development of interactive systems. 
This is an extended and updated version of the formalisation, which was elaborated within the methodology "Focus on Isabelle". In addition, we also applied the formalisation on three case studies that cover different application areas: process control (Steam Boiler System), data transmission (FlexRay communication protocol), memory and processing components (Automotive-Gateway System). notify = lp15@cam.ac.uk, maria.spichkova@rmit.edu.au [Isabelle_Meta_Model] title = A Meta-Model for the Isabelle API author = Frédéric Tuong , Burkhart Wolff date = 2015-09-16 topic = Computer science/Programming languages/Language definitions abstract = We represent a theory of (a fragment of) Isabelle/HOL in Isabelle/HOL. The purpose of this exercise is to write packages for domain-specific specifications such as class models, B-machines, ..., and generally speaking, any domain-specific languages whose abstract syntax can be defined by a HOL "datatype". On this basis, the Isabelle code-generator can then be used to generate code for global context transformations as well as tactic code.

Consequently the package is geared towards parsing, printing and code-generation to the Isabelle API. It is at the moment not sufficiently rich for doing meta theory on Isabelle itself. Extensions in this direction are possible though.

Moreover, the chosen fragment is fairly rudimentary. However it should be easily adapted to one's needs if a package is written on top of it. The supported API contains types, terms, transformation of global context like definitions and data-type declarations as well as infrastructure for Isar-setups.

This theory is drawn from the Featherweight OCL project where it is used to construct a package for object-oriented data-type theories generated from UML class diagrams. The Featherweight OCL, for example, allows for both the direct execution of compiled tactic code by the Isabelle API as well as the generation of ".thy"-files for debugging purposes.

Gained experience from this project shows that the compiled code is sufficiently efficient for practical purposes while being based on a formal model on which properties of the package can be proven such as termination of certain transformations, correctness, etc. notify = tuong@users.gforge.inria.fr, wolff@lri.fr [Clean] title = Clean - An Abstract Imperative Programming Language and its Theory author = Frédéric Tuong , Burkhart Wolff topic = Computer science/Programming languages, Computer science/Semantics date = 2019-10-04 notify = wolff@lri.fr, ftuong@lri.fr abstract = Clean is based on a simple, abstract execution model for an imperative target language. “Abstract” is understood in contrast to “Concrete Semantics”; alternatively, the term “shallow-style embedding” could be used. It strives for a type-safe notion of program-variables, an incremental construction of the typed state-space, support of incremental verification, and open-world extensibility of new type definitions being intertwined with the program definitions. Clean is based on a “no-frills” state-exception monad with the usual definitions of bind and unit for the compositional glue of state-based computations. Clean offers conditionals and loops supporting C-like control-flow operators such as break and return. The state-space construction is based on the extensible record package. Direct recursion of procedures is supported. Clean’s design strives for extreme simplicity. It is geared towards symbolic execution and proven correct verification tools. The underlying libraries of this package, however, deliberately restrict themselves to the most elementary infrastructure for these tasks. The package is intended to serve as demonstrator semantic backend for Isabelle/C, or for the test-generation techniques. [PCF] title = Logical Relations for PCF author = Peter Gammie date = 2012-07-01 topic = Computer science/Programming languages/Lambda calculi abstract = We apply Andy Pitts's methods of defining relations over domains to several classical results in the literature. We show that the Y combinator coincides with the domain-theoretic fixpoint operator, that parallel-or and the Plotkin existential are not definable in PCF, that the continuation semantics for PCF coincides with the direct semantics, and that our domain-theoretic semantics for PCF is adequate for reasoning about contextual equivalence in an operational semantics. Our version of PCF is untyped and has both strict and non-strict function abstractions. The development is carried out in HOLCF. notify = peteg42@gmail.com [POPLmark-deBruijn] title = POPLmark Challenge Via de Bruijn Indices author = Stefan Berghofer date = 2007-08-02 topic = Computer science/Programming languages/Lambda calculi abstract = We present a solution to the POPLmark challenge designed by Aydemir et al., which has as a goal the formalization of the meta-theory of System F<:. The formalization is carried out in the theorem prover Isabelle/HOL using an encoding based on de Bruijn indices. We start with a relatively simple formalization covering only the basic features of System F<:, and explain how it can be extended to also cover records and more advanced binding constructs. notify = berghofe@in.tum.de [Lam-ml-Normalization] title = Strong Normalization of Moggis's Computational Metalanguage author = Christian Doczkal date = 2010-08-29 topic = Computer science/Programming languages/Lambda calculi abstract = Handling variable binding is one of the main difficulties in formal proofs. 
In this context, Moggi's computational metalanguage serves as an interesting case study. It features monadic types and a commuting conversion rule that rearranges the binding structure. Lindley and Stark have given an elegant proof of strong normalization for this calculus. The key construction in their proof is a notion of relational TT-lifting, using stacks of elimination contexts to obtain a Girard-Tait style logical relation. I give a formalization of their proof in Isabelle/HOL-Nominal with a particular emphasis on the treatment of bound variables. notify = doczkal@ps.uni-saarland.de, nipkow@in.tum.de [MiniML] title = Mini ML author = Wolfgang Naraschewski <>, Tobias Nipkow date = 2004-03-19 topic = Computer science/Programming languages/Type systems abstract = This theory defines the type inference rules and the type inference algorithm W for MiniML (simply-typed lambda terms with let) due to Milner. It proves the soundness and completeness of W w.r.t. the rules. notify = kleing@cse.unsw.edu.au [Simpl] title = A Sequential Imperative Programming Language Syntax, Semantics, Hoare Logics and Verification Environment author = Norbert Schirmer <> date = 2008-02-29 topic = Computer science/Programming languages/Language definitions, Computer science/Programming languages/Logics license = LGPL abstract = We present the theory of Simpl, a sequential imperative programming language. We introduce its syntax, its semantics (big and small-step operational semantics) and Hoare logics for both partial as well as total correctness. We prove soundness and completeness of the Hoare logic. We integrate and automate the Hoare logic in Isabelle/HOL to obtain a practically usable verification environment for imperative programs. Simpl is independent of a concrete programming language but expressive enough to cover all common language features: mutually recursive procedures, abrupt termination and exceptions, runtime faults, local and global variables, pointers and heap, expressions with side effects, pointers to procedures, partial application and closures, dynamic method invocation and also unbounded nondeterminism. notify = kleing@cse.unsw.edu.au, norbert.schirmer@web.de [Separation_Algebra] title = Separation Algebra author = Gerwin Klein , Rafal Kolanski , Andrew Boyton date = 2012-05-11 topic = Computer science/Programming languages/Logics license = BSD abstract = We present a generic type class implementation of separation algebra for Isabelle/HOL as well as lemmas and generic tactics which can be used directly for any instantiation of the type class.

The ex directory contains example instantiations that include structures such as a heap or virtual memory.

The abstract separation algebra is based upon "Abstract Separation Logic" by Calcagno et al. These theories are also the basis of the ITP 2012 rough diamond "Mechanised Separation Algebra" by the authors.
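For illustration, a separation algebra can be pictured as a partial commutative monoid of sub-states together with a disjointness relation. The following Haskell sketch conveys the interface that the generic lemmas and tactics build on; the names (SepAlgebra, emp, ##, <+>) and the heap instance are hypothetical and only mirror the idea, they are not the entry's Isabelle/HOL definitions.

    import qualified Data.Map.Strict as M

    -- Hypothetical interface of a separation algebra: an empty state, a disjointness
    -- test, and a combination operator intended for disjoint arguments only.
    class SepAlgebra h where
      emp   :: h
      (##)  :: h -> h -> Bool
      (<+>) :: h -> h -> h

    -- Laws one would prove once and for all in the abstract theory:
    --   x ## emp,  x ## y implies y ## x,  x <+> emp = x,
    --   x ## y implies x <+> y = y <+> x,  and associativity on pairwise disjoint states.

    -- Example instantiation in the spirit of the ex directory: heaps as finite maps.
    newtype Heap = Heap (M.Map Int Int) deriving Show

    instance SepAlgebra Heap where
      emp               = Heap M.empty
      Heap a ## Heap b  = M.null (M.intersection a b)
      Heap a <+> Heap b = Heap (M.union a b)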

The aim of this work is to support and significantly reduce the effort for future separation logic developments in Isabelle/HOL by factoring out the part of separation logic that can be treated abstractly once and for all. This includes developing typical default rule sets for reasoning as well as automated tactic support for separation logic. notify = kleing@cse.unsw.edu.au, rafal.kolanski@nicta.com.au [Separation_Logic_Imperative_HOL] title = A Separation Logic Framework for Imperative HOL author = Peter Lammich , Rene Meis date = 2012-11-14 topic = Computer science/Programming languages/Logics license = BSD abstract = We provide a framework for separation-logic based correctness proofs of Imperative HOL programs. Our framework comes with a set of proof methods to automate canonical tasks such as verification condition generation and frame inference. Moreover, we provide a set of examples that show the applicability of our framework. The examples include algorithms on lists, hash-tables, and union-find trees. We also provide abstract interfaces for lists, maps, and sets that allow us to develop generic imperative algorithms and to use data-refinement techniques.
As we target Imperative HOL, our programs can be translated to efficiently executable code in various target languages, including ML, OCaml, Haskell, and Scala. notify = lammich@in.tum.de [Inductive_Confidentiality] title = Inductive Study of Confidentiality author = Giampaolo Bella date = 2012-05-02 topic = Computer science/Security abstract = This document contains the full theory files accompanying article Inductive Study of Confidentiality --- for Everyone in Formal Aspects of Computing. They aim at an illustrative and didactic presentation of the Inductive Method of protocol analysis, focusing on the treatment of one of the main goals of security protocols: confidentiality against a threat model. The treatment of confidentiality, which in fact forms a key aspect of all protocol analysis tools, has been found cryptic by many learners of the Inductive Method, hence the motivation for this work. The theory files in this document guide the reader step by step towards design and proof of significant confidentiality theorems. These are developed against two threat models, the standard Dolev-Yao and a more audacious one, the General Attacker, which turns out to be particularly useful also for teaching purposes. notify = giamp@dmi.unict.it [Possibilistic_Noninterference] title = Possibilistic Noninterference author = Andrei Popescu , Johannes Hölzl date = 2012-09-10 topic = Computer science/Security, Computer science/Programming languages/Type systems abstract = We formalize a wide variety of Volpano/Smith-style noninterference notions for a while language with parallel composition. We systematize and classify these notions according to compositionality w.r.t. the language constructs. Compositionality yields sound syntactic criteria (a.k.a. type systems) in a uniform way.

An article about these proofs is published in the proceedings of the conference Certified Programs and Proofs 2012. notify = hoelzl@in.tum.de [SIFUM_Type_Systems] title = A Formalization of Assumptions and Guarantees for Compositional Noninterference author = Sylvia Grewe , Heiko Mantel , Daniel Schoepe date = 2014-04-23 topic = Computer science/Security, Computer science/Programming languages/Type systems abstract = Research in information-flow security aims at developing methods to identify undesired information leaks within programs from private (high) sources to public (low) sinks. For a concurrent system, it is desirable to have compositional analysis methods that allow for analyzing each thread independently and that nevertheless guarantee that the parallel composition of successfully analyzed threads satisfies a global security guarantee. However, such a compositional analysis should not be overly pessimistic about what an environment might do with shared resources. Otherwise, the analysis will reject many intuitively secure programs.

The paper "Assumptions and Guarantees for Compositional Noninterference" by Mantel et. al. presents one solution for this problem: an approach for compositionally reasoning about non-interference in concurrent programs via rely-guarantee-style reasoning. We present an Isabelle/HOL formalization of the concepts and proofs of this approach. notify = [Dependent_SIFUM_Type_Systems] title = A Dependent Security Type System for Concurrent Imperative Programs author = Toby Murray , Robert Sison<>, Edward Pierzchalski<>, Christine Rizkallah notify = toby.murray@unimelb.edu.au date = 2016-06-25 topic = Computer science/Security, Computer science/Programming languages/Type systems abstract = The paper "Compositional Verification and Refinement of Concurrent Value-Dependent Noninterference" by Murray et. al. (CSF 2016) presents a dependent security type system for compositionally verifying a value-dependent noninterference property, defined in (Murray, PLAS 2015), for concurrent programs. This development formalises that security definition, the type system and its soundness proof, and demonstrates its application on some small examples. It was derived from the SIFUM_Type_Systems AFP entry, by Sylvia Grewe, Heiko Mantel and Daniel Schoepe, and whose structure it inherits. extra-history = Change history: [2016-08-19]: Removed unused "stop" parameter and "stop_no_eval" assumption from the sifum_security locale. (revision dbc482d36372) [2016-09-27]: Added security locale support for the imposition of requirements on the initial memory. (revision cce4ceb74ddb) [Dependent_SIFUM_Refinement] title = Compositional Security-Preserving Refinement for Concurrent Imperative Programs author = Toby Murray , Robert Sison<>, Edward Pierzchalski<>, Christine Rizkallah notify = toby.murray@unimelb.edu.au date = 2016-06-28 topic = Computer science/Security abstract = The paper "Compositional Verification and Refinement of Concurrent Value-Dependent Noninterference" by Murray et. al. (CSF 2016) presents a compositional theory of refinement for a value-dependent noninterference property, defined in (Murray, PLAS 2015), for concurrent programs. This development formalises that refinement theory, and demonstrates its application on some small examples. extra-history = Change history: [2016-08-19]: Removed unused "stop" parameters from the sifum_refinement locale. (revision dbc482d36372) [2016-09-02]: TobyM extended "simple" refinement theory to be usable for all bisimulations. (revision 547f31c25f60) [Relational-Incorrectness-Logic] title = An Under-Approximate Relational Logic author = Toby Murray topic = Computer science/Programming languages/Logics, Computer science/Security date = 2020-03-12 notify = toby.murray@unimelb.edu.au abstract = Recently, authors have proposed under-approximate logics for reasoning about programs. So far, all such logics have been confined to reasoning about individual program behaviours. Yet there exist many over-approximate relational logics for reasoning about pairs of programs and relating their behaviours. We present the first under-approximate relational logic, for the simple imperative language IMP. We prove our logic is both sound and complete. Additionally, we show how reasoning in this logic can be decomposed into non-relational reasoning in an under-approximate Hoare logic, mirroring Beringer’s result for over-approximate relational logics. We illustrate the application of our logic on some small examples in which we provably demonstrate the presence of insecurity. 
[Strong_Security] title = A Formalization of Strong Security author = Sylvia Grewe , Alexander Lux , Heiko Mantel , Jens Sauer date = 2014-04-23 topic = Computer science/Security, Computer science/Programming languages/Type systems abstract = Research in information-flow security aims at developing methods to identify undesired information leaks within programs from private sources to public sinks. Noninterference captures this intuition. Strong security from Sabelfeld and Sands formalizes noninterference for concurrent systems.

We present an Isabelle/HOL formalization of strong security for arbitrary security lattices (Sabelfeld and Sands use a two-element security lattice in the original publication). The formalization includes compositionality proofs for strong security and a soundness proof for a security type system that checks strong security for programs in a simple while language with dynamic thread creation.
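Roughly, and glossing over the details of dynamic thread creation, strong security can be phrased via strong low-bisimulations: a symmetric relation $R$ on thread pools of equal length such that, whenever $V \mathrel{R} V'$ and the memories $s$ and $s'$ agree on low variables ($s =_L s'$), every step of a thread of $V$ can be matched by the corresponding thread of $V'$,
\[
\langle V_i, s\rangle \rightarrow \langle W, t\rangle \;\Longrightarrow\;
\exists W', t'.\; \langle V'_i, s'\rangle \rightarrow \langle W', t'\rangle \;\wedge\;
V[i := W] \mathrel{R} V'[i := W'] \;\wedge\; t =_L t',
\]
and a program is strongly secure if it is related to itself by some such relation. This is only an informal sketch; the precise definition formalized here is the one by Sabelfeld and Sands, generalized to arbitrary security lattices.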

Our formalization of the security type system is abstract in the language for expressions and in the semantic side conditions for expressions. It can easily be instantiated with different syntactic approximations for these side conditions. The soundness proof of such an instantiation boils down to showing that these syntactic approximations imply the semantic side conditions. notify = [WHATandWHERE_Security] title = A Formalization of Declassification with WHAT-and-WHERE-Security author = Sylvia Grewe , Alexander Lux , Heiko Mantel , Jens Sauer date = 2014-04-23 topic = Computer science/Security, Computer science/Programming languages/Type systems abstract = Research in information-flow security aims at developing methods to identify undesired information leaks within programs from private sources to public sinks. Noninterference captures this intuition by requiring that no information whatsoever flows from private sources to public sinks. However, in practice this definition is often too strict: Depending on the intuitive desired security policy, the controlled declassification of certain private information (WHAT) at certain points in the program (WHERE) might not result in an undesired information leak.

We present an Isabelle/HOL formalization of such a security property for controlled declassification, namely WHAT&WHERE-security from "Scheduler-Independent Declassification" by Lux, Mantel, and Perner. The formalization includes compositionality proofs for WHAT&WHERE-security and a soundness proof for a security type system that checks WHAT&WHERE-security for programs in a simple while language with dynamic thread creation.

Our formalization of the security type system is abstract in the language for expressions and in the semantic side conditions for expressions. It can easily be instantiated with different syntactic approximations for these side conditions. The soundness proof of such an instantiation boils down to showing that these syntactic approximations imply the semantic side conditions.

This Isabelle/HOL formalization uses theories from the entry Strong Security. notify = [VolpanoSmith] title = A Correctness Proof for the Volpano/Smith Security Typing System author = Gregor Snelting , Daniel Wasserrab date = 2008-09-02 topic = Computer science/Programming languages/Type systems, Computer science/Security abstract = The Volpano/Smith/Irvine security type system requires that variables are annotated as high (secret) or low (public), and provides typing rules which guarantee that secret values cannot leak to public output ports. This property of a program is called confidentiality. For a simple while-language without threads, our proof shows that typeability in the Volpano/Smith system guarantees noninterference. Noninterference means that if two initial states for program execution are low-equivalent, then the final states are low-equivalent as well. This indeed implies that secret values cannot leak to public ports. The proof defines an abstract syntax and operational semantics for programs, formalizes noninterference, and then proceeds by rule induction on the operational semantics. The mathematically most intricate part is the treatment of implicit flows. Note that the Volpano/Smith system is not flow-sensitive and thus quite imprecise, resulting in false alarms. However, due to the correctness property, all potential breaks of confidentiality are discovered. notify = [Abstract-Hoare-Logics] title = Abstract Hoare Logics author = Tobias Nipkow date = 2006-08-08 topic = Computer science/Programming languages/Logics abstract = These theories describe Hoare logics for a number of imperative language constructs, from while-loops to mutually recursive procedures. Both partial and total correctness are treated. In particular, a proof system for total correctness of recursive procedures in the presence of unbounded nondeterminism is presented. notify = nipkow@in.tum.de [Stone_Algebras] title = Stone Algebras author = Walter Guttmann notify = walter.guttmann@canterbury.ac.nz date = 2016-09-06 topic = Mathematics/Order abstract = A range of algebras between lattices and Boolean algebras generalise the notion of a complement. We develop a hierarchy of these pseudo-complemented algebras that includes Stone algebras. Independently of this theory we study filters based on partial orders. Both theories are combined to prove Chen and Grätzer's construction theorem for Stone algebras. The latter involves extensive reasoning about algebraic structures in addition to reasoning in algebraic structures. [Kleene_Algebra] title = Kleene Algebra author = Alasdair Armstrong <>, Georg Struth , Tjark Weber date = 2013-01-15 topic = Computer science/Programming languages/Logics, Computer science/Automata and formal languages, Mathematics/Algebra abstract = These files contain a formalisation of variants of Kleene algebras and their most important models as axiomatic type classes in Isabelle/HOL. Kleene algebras are foundational structures in computing with applications ranging from automata and language theory to computational modeling, program construction and verification.

We start with formalising dioids, which are additively idempotent semirings, and expand them by axiomatisations of the Kleene star for finite iteration and an omega operation for infinite iteration. We show that powersets over a given monoid, (regular) languages, sets of paths in a graph, sets of computation traces, binary relations and formal power series form Kleene algebras, and consider further models based on lattices, max-plus semirings and min-plus semirings. We also demonstrate that dioids are closed under the formation of matrices (proofs for Kleene algebras remain to be completed).
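For orientation, one common (Kozen-style) axiomatisation captured by this hierarchy is the following: a dioid is a semiring with idempotent addition, $x + x = x$, ordered by $x \le y \iff x + y = y$, and a Kleene algebra additionally has a star operation satisfying the unfold and induction laws
\[
1 + x \cdot x^* \le x^*, \qquad 1 + x^* \cdot x \le x^*,
\]
\[
z + x \cdot y \le y \;\Longrightarrow\; x^* \cdot z \le y, \qquad
z + y \cdot x \le y \;\Longrightarrow\; z \cdot x^* \le y.
\]
The omega operation for infinite iteration is axiomatised analogously by an unfold law and a coinduction law. The exact selection and naming of axioms in the formalisation may differ; this is only meant as a reference point.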

On the one hand, we have aimed at a reference formalisation that covers a wide range of variants of Kleene algebras and their core theorems in a structured and modular way and provides readable proofs at textbook level. On the other hand, we intend to use this algebraic hierarchy and its models as a generic algebraic middle-layer from which programming applications can quickly be explored, implemented and verified. notify = g.struth@sheffield.ac.uk, tjark.weber@it.uu.se [KAT_and_DRA] title = Kleene Algebra with Tests and Demonic Refinement Algebras author = Alasdair Armstrong <>, Victor B. F. Gomes , Georg Struth date = 2014-01-23 topic = Computer science/Programming languages/Logics, Computer science/Automata and formal languages, Mathematics/Algebra abstract = We formalise Kleene algebra with tests (KAT) and demonic refinement algebra (DRA) in Isabelle/HOL. KAT is relevant for program verification and correctness proofs in the partial correctness setting, while DRA targets similar applications in the context of total correctness. Our formalisation contains the two most important models of these algebras: binary relations in the case of KAT and predicate transformers in the case of DRA. In addition, we derive the inference rules for Hoare logic in KAT and its relational model and present a simple formally verified program verification tool prototype based on the algebraic approach. notify = g.struth@dcs.shef.ac.uk [KAD] title = Kleene Algebras with Domain author = Victor B. F. Gomes , Walter Guttmann , Peter Höfner , Georg Struth , Tjark Weber date = 2016-04-12 topic = Computer science/Programming languages/Logics, Computer science/Automata and formal languages, Mathematics/Algebra abstract = Kleene algebras with domain are Kleene algebras endowed with an operation that maps each element of the algebra to its domain of definition (or its complement) in abstract fashion. They form a simple algebraic basis for Hoare logics, dynamic logics or predicate transformer semantics. We formalise a modular hierarchy of algebras with domain and antidomain (domain complement) operations in Isabelle/HOL that ranges from domain and antidomain semigroups to modal Kleene algebras and divergence Kleene algebras. We link these algebras with models of binary relations and program traces. We include some examples from modal logics, termination and program analysis. notify = walter.guttman@canterbury.ac.nz, g.struth@sheffield.ac.uk, tjark.weber@it.uu.se [Regular_Algebras] title = Regular Algebras author = Simon Foster , Georg Struth date = 2014-05-21 topic = Computer science/Automata and formal languages, Mathematics/Algebra abstract = Regular algebras axiomatise the equational theory of regular expressions as induced by regular language identity. We use Isabelle/HOL for a detailed systematic study of regular algebras given by Boffa, Conway, Kozen and Salomaa. We investigate the relationships between these classes, formalise a soundness proof for the smallest class (Salomaa's) and obtain completeness of the largest one (Boffa's) relative to a deep result by Krob. In addition we provide a large collection of regular identities in the general setting of Boffa's axiom. Our regular algebra hierarchy is orthogonal to the Kleene algebra hierarchy in the Archive of Formal Proofs; we have not aimed at an integration for pragmatic reasons.
notify = simon.foster@york.ac.uk, g.struth@sheffield.ac.uk [BytecodeLogicJmlTypes] title = A Bytecode Logic for JML and Types author = Lennart Beringer <>, Martin Hofmann date = 2008-12-12 topic = Computer science/Programming languages/Logics abstract = This document contains the Isabelle/HOL sources underlying the paper A bytecode logic for JML and types by Beringer and Hofmann, updated to Isabelle 2008. We present a program logic for a subset of sequential Java bytecode that is suitable for representing both, features found in high-level specification language JML as well as interpretations of high-level type systems. To this end, we introduce a fine-grained collection of assertions, including strong invariants, local annotations and VDM-reminiscent partial-correctness specifications. Thanks to a goal-oriented structure and interpretation of judgements, verification may proceed without recourse to an additional control flow analysis. The suitability for interpreting intensional type systems is illustrated by the proof-carrying-code style encoding of a type system for a first-order functional language which guarantees a constant upper bound on the number of objects allocated throughout an execution, be the execution terminating or non-terminating. Like the published paper, the formal development is restricted to a comparatively small subset of the JVML, lacking (among other features) exceptions, arrays, virtual methods, and static fields. This shortcoming has been overcome meanwhile, as our paper has formed the basis of the Mobius base logic, a program logic for the full sequential fragment of the JVML. Indeed, the present formalisation formed the basis of a subsequent formalisation of the Mobius base logic in the proof assistant Coq, which includes a proof of soundness with respect to the Bicolano operational semantics by Pichardie. notify = [DataRefinementIBP] title = Semantics and Data Refinement of Invariant Based Programs author = Viorel Preoteasa , Ralph-Johan Back date = 2010-05-28 topic = Computer science/Programming languages/Logics abstract = The invariant based programming is a technique of constructing correct programs by first identifying the basic situations (pre- and post-conditions and invariants) that can occur during the execution of the program, and then defining the transitions and proving that they preserve the invariants. Data refinement is a technique of building correct programs working on concrete datatypes as refinements of more abstract programs. In the theories presented here we formalize the predicate transformer semantics for invariant based programs and their data refinement. extra-history = Change history: [2012-01-05]: Moved some general complete lattice properties to the AFP entry Lattice Properties. Changed the definition of the data refinement relation to be more general and updated all corresponding theorems. Added new syntax for demonic and angelic update statements. notify = viorel.preoteasa@aalto.fi [RefinementReactive] title = Formalization of Refinement Calculus for Reactive Systems author = Viorel Preoteasa date = 2014-10-08 topic = Computer science/Programming languages/Logics abstract = We present a formalization of refinement calculus for reactive systems. Refinement calculus is based on monotonic predicate transformers (monotonic functions from sets of post-states to sets of pre-states), and it is a powerful formalism for reasoning about imperative programs. 
We model reactive systems as monotonic property transformers that transform sets of output infinite sequences into sets of input infinite sequences. Within this semantics we can model refinement of reactive systems, (unbounded) angelic and demonic nondeterminism, sequential composition, and other semantic properties. We can model systems that may fail for some inputs, and we can model compatibility of systems. We can specify systems that have liveness properties using linear temporal logic, and we can refine system specifications into systems based on symbolic transition systems, suitable for implementations. notify = viorel.preoteasa@aalto.fi [SIFPL] title = Secure information flow and program logics author = Lennart Beringer <>, Martin Hofmann date = 2008-11-10 topic = Computer science/Programming languages/Logics, Computer science/Security abstract = We present interpretations of type systems for secure information flow in Hoare logic, complementing previous encodings in relational program logics. We first treat the imperative language IMP, extended by a simple procedure call mechanism. For this language we consider base-line non-interference in the style of Volpano et al. and the flow-sensitive type system by Hunt and Sands. In both cases, we show how typing derivations may be used to automatically generate proofs in the program logic that certify the absence of illicit flows. We then add instructions for object creation and manipulation, and derive appropriate proof rules for base-line non-interference. As a consequence of our work, standard verification technology may be used for verifying that a concrete program satisfies the non-interference property.

The present proof development represents an update of the formalisation underlying our paper [CSF 2007] and is intended to resolve any ambiguities that may be present in the paper. notify = lennart.beringer@ifi.lmu.de [TLA] title = A Definitional Encoding of TLA* in Isabelle/HOL author = Gudmund Grov , Stephan Merz date = 2011-11-19 topic = Computer science/Programming languages/Logics abstract = We mechanise the logic TLA* [Merz 1999], an extension of Lamport's Temporal Logic of Actions (TLA) [Lamport 1994] for specifying and reasoning about concurrent and reactive systems. Aiming at a framework for mechanising the verification of TLA (or TLA*) specifications, this contribution reuses some elements from a previous axiomatic encoding of TLA in Isabelle/HOL by the second author [Merz 1998], which has been part of the Isabelle distribution. In contrast to that previous work, we give here a shallow, definitional embedding, with the following highlights:

  • a theory of infinite sequences, including a formalisation of the concepts of stuttering invariance central to TLA and TLA*;
  • a definition of the semantics of TLA*, which extends TLA by a mutually-recursive definition of formulas and pre-formulas, generalising TLA action formulas;
  • a substantial set of derived proof rules, including the TLA* axioms and Lamport's proof rules for system verification;
  • a set of examples illustrating the usage of Isabelle/TLA* for reasoning about systems.
Note that this work is unrelated to the ongoing development of a proof system for the specification language TLA+, which includes an encoding of TLA+ as a new Isabelle object logic [Chaudhuri et al 2010]. notify = ggrov@inf.ed.ac.uk [Compiling-Exceptions-Correctly] title = Compiling Exceptions Correctly author = Tobias Nipkow date = 2004-07-09 topic = Computer science/Programming languages/Compiling abstract = An exception compilation scheme that dynamically creates and removes exception handler entries on the stack. A formalization of an article of the same name by Hutton and Wright. notify = nipkow@in.tum.de [NormByEval] title = Normalization by Evaluation author = Klaus Aehlig , Tobias Nipkow date = 2008-02-18 topic = Computer science/Programming languages/Compiling abstract = This article formalizes normalization by evaluation as implemented in Isabelle. Lambda calculus plus term rewriting is compiled into a functional program with pattern matching. It is proved that the result of a successful evaluation is a) correct, i.e. equivalent to the input, and b) in normal form. notify = nipkow@in.tum.de [Program-Conflict-Analysis] title = Formalization of Conflict Analysis of Programs with Procedures, Thread Creation, and Monitors topic = Computer science/Programming languages/Static analysis author = Peter Lammich , Markus Müller-Olm date = 2007-12-14 abstract = In this work we formally verify the soundness and precision of a static program analysis that detects conflicts (e. g. data races) in programs with procedures, thread creation and monitors with the Isabelle theorem prover. As common in static program analysis, our program model abstracts guarded branching by nondeterministic branching, but completely interprets the call-/return behavior of procedures, synchronization by monitors, and thread creation. The analysis is based on the observation that all conflicts already occur in a class of particularly restricted schedules. These restricted schedules are suited to constraint-system-based program analysis. The formalization is based upon a flowgraph-based program model with an operational semantics as reference point. notify = peter.lammich@uni-muenster.de [Shivers-CFA] title = Shivers' Control Flow Analysis topic = Computer science/Programming languages/Static analysis author = Joachim Breitner date = 2010-11-16 abstract = In his dissertation, Olin Shivers introduces a concept of control flow graphs for functional languages, provides an algorithm to statically derive a safe approximation of the control flow graph and proves this algorithm correct. In this research project, Shivers' algorithms and proofs are formalized in the HOLCF extension of HOL. notify = mail@joachim-breitner.de, nipkow@in.tum.de [Slicing] title = Towards Certified Slicing author = Daniel Wasserrab date = 2008-09-16 topic = Computer science/Programming languages/Static analysis abstract = Slicing is a widely-used technique with applications in e.g. compiler technology and software security. Thus verification of algorithms in these areas is often based on the correctness of slicing, which should ideally be proven independent of concrete programming languages and with the help of well-known verifying techniques such as proof assistants. As a first step in this direction, this contribution presents a framework for dynamic and static intraprocedural slicing based on control flow and program dependence graphs. 
Abstracting from concrete syntax we base the framework on a graph representation of the program fulfilling certain structural and well-formedness properties.

The formalization consists of the basic framework (in subdirectory Basic/), the correctness proof for dynamic slicing (in subdirectory Dynamic/), the correctness proof for static intraprocedural slicing (in subdirectory StaticIntra/) and instantiations of the framework with a simple While language (in subdirectory While/) and the sophisticated object-oriented bytecode language of Jinja (in subdirectory JinjaVM/). For more information on the framework, see the TPHOLS 2008 paper by Wasserrab and Lochbihler and the PLAS 2009 paper by Wasserrab et al. notify = [HRB-Slicing] title = Backing up Slicing: Verifying the Interprocedural Two-Phase Horwitz-Reps-Binkley Slicer author = Daniel Wasserrab date = 2009-11-13 topic = Computer science/Programming languages/Static analysis abstract = After verifying dynamic and static interprocedural slicing, we present a modular framework for static interprocedural slicing. To this end, we formalized the standard two-phase slicer from Horwitz, Reps and Binkley (see their TOPLAS 12(1) 1990 paper) together with summary edges as presented by Reps et al. (see FSE 1994). The framework is again modular in the programming language by using an abstract CFG, defined via structural and well-formedness properties. Using a weak simulation between the original and sliced graph, we were able to prove the correctness of static interprocedural slicing. We also instantiate our framework with a simple While language with procedures. This shows that the chosen abstractions are indeed valid. notify = nipkow@in.tum.de [WorkerWrapper] title = The Worker/Wrapper Transformation author = Peter Gammie date = 2009-10-30 topic = Computer science/Programming languages/Transformations abstract = Gill and Hutton formalise the worker/wrapper transformation, building on the work of Launchbury and Peyton-Jones who developed it as a way of changing the type at which a recursive function operates. This development establishes the soundness of the technique and several examples of its use. notify = peteg42@gmail.com, nipkow@in.tum.de [JiveDataStoreModel] title = Jive Data and Store Model author = Nicole Rauch , Norbert Schirmer <> date = 2005-06-20 license = LGPL topic = Computer science/Programming languages/Misc abstract = This document presents the formalization of an object-oriented data and store model in Isabelle/HOL. This model is being used in the Java Interactive Verification Environment, Jive. notify = kleing@cse.unsw.edu.au, schirmer@in.tum.de [HotelKeyCards] title = Hotel Key Card System author = Tobias Nipkow date = 2006-09-09 topic = Computer science/Security abstract = Two models of an electronic hotel key card system are contrasted: a state based and a trace based one. Both are defined, verified, and proved equivalent in the theorem prover Isabelle/HOL. It is shown that if a guest follows a certain safety policy regarding her key cards, she can be sure that nobody but her can enter her room. notify = nipkow@in.tum.de [RSAPSS] title = SHA1, RSA, PSS and more author = Christina Lindenberg <>, Kai Wirt <> date = 2005-05-02 topic = Computer science/Security/Cryptography abstract = Formal verification is getting more and more important in computer science. However the state of the art formal verification methods in cryptography are very rudimentary. These theories are one step to provide a tool box allowing the use of formal methods in every aspect of cryptography. Moreover we present a proof of concept for the feasibility of verification techniques to a standard signature algorithm. 
notify = nipkow@in.tum.de [InformationFlowSlicing] title = Information Flow Noninterference via Slicing author = Daniel Wasserrab date = 2010-03-23 topic = Computer science/Security abstract =

In this contribution, we show how correctness proofs for intra- and interprocedural slicing can be used to prove that slicing is able to guarantee information flow noninterference. Moreover, we also illustrate how to lift the control flow graphs of the respective frameworks such that they fulfil the additional assumptions needed in the noninterference proofs. A detailed description of the intraprocedural proof and its interplay with the slicing framework can be found in the PLAS'09 paper by Wasserrab et al.

This entry contains the part for intra-procedural slicing. See entry InformationFlowSlicing_Inter for the inter-procedural part.

extra-history = Change history: [2016-06-10]: The original entry InformationFlowSlicing, which contained both the inter- and intra-procedural case, was split into two for easier maintenance. notify = [InformationFlowSlicing_Inter] title = Inter-Procedural Information Flow Noninterference via Slicing author = Daniel Wasserrab date = 2010-03-23 topic = Computer science/Security abstract =

In this contribution, we show how correctness proofs for intra- and interprocedural slicing can be used to prove that slicing is able to guarantee information flow noninterference. Moreover, we also illustrate how to lift the control flow graphs of the respective frameworks such that they fulfil the additional assumptions needed in the noninterference proofs. A detailed description of the intraprocedural proof and its interplay with the slicing framework can be found in the PLAS'09 paper by Wasserrab et al.

This entry contains the part for inter-procedural slicing. See entry InformationFlowSlicing for the intra-procedural part.

extra-history = Change history: [2016-06-10]: The original entry InformationFlowSlicing, which contained both the inter- and intra-procedural case, was split into two for easier maintenance. notify = [ComponentDependencies] title = Formalisation and Analysis of Component Dependencies author = Maria Spichkova date = 2014-04-28 topic = Computer science/System description languages abstract = This set of theories presents a formalisation in Isabelle/HOL of data dependencies between components. The approach allows the system structure to be analysed with a view towards efficient checking of the system: it aims at elaborating, for a concrete system, which parts of the system are necessary to check a given property. notify = maria.spichkova@rmit.edu.au [Verified-Prover] title = A Mechanically Verified, Efficient, Sound and Complete Theorem Prover For First Order Logic author = Tom Ridge <> date = 2004-09-28 topic = Logic/General logic/Mechanization of proofs abstract = Soundness and completeness for a system of first order logic are formally proved, building on James Margetson's formalization of work by Wainer and Wallen. The completeness proofs naturally suggest an algorithm to derive proofs. This algorithm, which can be implemented tail recursively, is formalized in Isabelle/HOL. The algorithm can be executed via the rewriting tactics of Isabelle. Alternatively, the definitions can be exported to OCaml, yielding a directly executable program. notify = lp15@cam.ac.uk [Completeness] title = Completeness theorem author = James Margetson <>, Tom Ridge <> date = 2004-09-20 topic = Logic/Proof theory abstract = The completeness of first-order logic is proved, following the first five pages of Wainer and Wallen's chapter of the book Proof Theory by Aczel et al., CUP, 1992. Their presentation of formulas allows the proofs to use symmetry arguments. Margetson formalized this theorem by early 2000. The Isar conversion is thanks to Tom Ridge. A paper describing the formalization is available [pdf]. notify = lp15@cam.ac.uk [Ordinal] title = Countable Ordinals author = Brian Huffman date = 2005-11-11 topic = Logic/Set theory abstract = This development defines a well-ordered type of countable ordinals. It includes notions of continuous and normal functions, recursively defined functions over ordinals, least fixed-points, and derivatives. Much of ordinal arithmetic is formalized, including exponentials and logarithms. The development concludes with formalizations of Cantor Normal Form and Veblen hierarchies over normal functions. notify = lcp@cl.cam.ac.uk [Ordinals_and_Cardinals] title = Ordinals and Cardinals author = Andrei Popescu date = 2009-09-01 topic = Logic/Set theory abstract = We develop a basic theory of ordinals and cardinals in Isabelle/HOL, up to the point where some cardinality facts relevant for the ``working mathematician'' become available. Unlike in set theory, here we do not have at hand canonical notions of ordinal and cardinal. Therefore, here an ordinal is merely a well-order relation and a cardinal is an ordinal that is minimal w.r.t. order embedding on its field. extra-history = Change history: [2012-09-25]: This entry has been discontinued because it is now part of the Isabelle distribution.
notify = uuomul@yahoo.com, nipkow@in.tum.de [FOL-Fitting] title = First-Order Logic According to Fitting author = Stefan Berghofer contributors = Asta Halkjær From date = 2007-08-02 topic = Logic/General logic/Classical first-order logic abstract = We present a formalization of parts of Melvin Fitting's book "First-Order Logic and Automated Theorem Proving". The formalization covers the syntax of first-order logic, its semantics, the model existence theorem, a natural deduction proof calculus together with a proof of correctness and completeness, as well as the Löwenheim-Skolem theorem. extra-history = Change history: [2018-07-21]: Proved completeness theorem for open formulas. Proofs are now written in the declarative style. Enumeration of pairs and datatypes is automated using the Countable theory. notify = berghofe@in.tum.de [Epistemic_Logic] title = Epistemic Logic author = Asta Halkjær From topic = Logic/General logic/Logics of knowledge and belief date = 2018-10-29 notify = ahfrom@dtu.dk abstract = This work is a formalization of epistemic logic with countably many agents. It includes proofs of soundness and completeness for the axiom system K. The completeness proof is based on the textbook "Reasoning About Knowledge" by Fagin, Halpern, Moses and Vardi (MIT Press 1995). [SequentInvertibility] title = Invertibility in Sequent Calculi author = Peter Chapman <> date = 2009-08-28 topic = Logic/Proof theory license = LGPL abstract = The invertibility of the rules of a sequent calculus is important for guiding proof search and can be used in some formalised proofs of Cut admissibility. We present sufficient conditions for when a rule is invertible with respect to a calculus. We illustrate the conditions with examples. It must be noted we give purely syntactic criteria; no guarantees are given as to the suitability of the rules. notify = pc@cs.st-andrews.ac.uk, nipkow@in.tum.de [LinearQuantifierElim] title = Quantifier Elimination for Linear Arithmetic author = Tobias Nipkow date = 2008-01-11 topic = Logic/General logic/Decidability of theories abstract = This article formalizes quantifier elimination procedures for dense linear orders, linear real arithmetic and Presburger arithmetic. In each case both a DNF-based non-elementary algorithm and one or more (doubly) exponential NNF-based algorithms are formalized, including the well-known algorithms by Ferrante and Rackoff and by Cooper. The NNF-based algorithms for dense linear orders are new but based on Ferrante and Rackoff and on an algorithm by Loos and Weisspfenning which simulates infenitesimals. All algorithms are directly executable. In particular, they yield reflective quantifier elimination procedures for HOL itself. The formalization makes heavy use of locales and is therefore highly modular. notify = nipkow@in.tum.de [Nat-Interval-Logic] title = Interval Temporal Logic on Natural Numbers author = David Trachtenherz <> date = 2011-02-23 topic = Logic/General logic/Temporal logic abstract = We introduce a theory of temporal logic operators using sets of natural numbers as time domain, formalized in a shallow embedding manner. 
The theory comprises special natural intervals (theory IL_Interval: open and closed intervals, continuous and modulo intervals, interval traversing results), operators for shifting intervals to left/right on the number axis as well as expanding/contracting intervals by constant factors (theory IL_IntervalOperators.thy), and ultimately definitions and results for unary and binary temporal operators on arbitrary natural sets (theory IL_TemporalOperators). notify = nipkow@in.tum.de [Recursion-Theory-I] title = Recursion Theory I author = Michael Nedzelsky <> date = 2008-04-05 topic = Logic/Computability abstract = This document presents the formalization of introductory material from recursion theory --- definitions and basic properties of primitive recursive functions, Cantor pairing function and computably enumerable sets (including a proof of existence of a one-complete computably enumerable set and a proof of the Rice's theorem). notify = MichaelNedzelsky@yandex.ru [Free-Boolean-Algebra] topic = Logic/General logic/Classical propositional logic title = Free Boolean Algebra author = Brian Huffman date = 2010-03-29 abstract = This theory defines a type constructor representing the free Boolean algebra over a set of generators. Values of type (α)formula represent propositional formulas with uninterpreted variables from type α, ordered by implication. In addition to all the standard Boolean algebra operations, the library also provides a function for building homomorphisms to any other Boolean algebra type. notify = brianh@cs.pdx.edu [Sort_Encodings] title = Sound and Complete Sort Encodings for First-Order Logic author = Jasmin Christian Blanchette , Andrei Popescu date = 2013-06-27 topic = Logic/General logic/Mechanization of proofs abstract = This is a formalization of the soundness and completeness properties for various efficient encodings of sorts in unsorted first-order logic used by Isabelle's Sledgehammer tool.

Essentially, the encodings proceed as follows: a many-sorted problem is decorated with (as few as possible) tags or guards that make the problem monotonic; then sorts can be soundly erased.
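As a schematic example (not taken from the entry), the guard variant decorates a sorted clause such as $\forall x{:}\mathsf{nat}.\; x + 0 \approx x$ into the unsorted clause
\[
\neg\, g_{\mathsf{nat}}(x) \;\vee\; x + 0 \approx x,
\]
and adds typing axioms like $g_{\mathsf{nat}}(0)$ and $g_{\mathsf{nat}}(x + y)$ stating that function symbols of result sort $\mathsf{nat}$ produce $\mathsf{nat}$ values; the tag variant instead wraps terms of sort $\mathsf{nat}$ in a function $t_{\mathsf{nat}}(\cdot)$. A monotonicity analysis allows most of these guards or tags to be omitted, which is what makes the encodings efficient.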

The development employs a formalization of many-sorted first-order logic in clausal form (clauses, structures and the basic properties of the satisfaction relation), which could be of interest as the starting point for other formalizations of first-order logic metatheory. notify = uuomul@yahoo.com [Lambda_Free_RPOs] title = Formalization of Recursive Path Orders for Lambda-Free Higher-Order Terms author = Jasmin Christian Blanchette , Uwe Waldmann , Daniel Wand date = 2016-09-23 topic = Logic/Rewriting abstract = This Isabelle/HOL formalization defines recursive path orders (RPOs) for higher-order terms without lambda-abstraction and proves many useful properties about them. The main order fully coincides with the standard RPO on first-order terms also in the presence of currying, distinguishing it from previous work. An optimized variant is formalized as well. It appears promising as the basis of a higher-order superposition calculus. notify = jasmin.blanchette@gmail.com [Lambda_Free_KBOs] title = Formalization of Knuth–Bendix Orders for Lambda-Free Higher-Order Terms author = Heiko Becker , Jasmin Christian Blanchette , Uwe Waldmann , Daniel Wand date = 2016-11-12 topic = Logic/Rewriting abstract = This Isabelle/HOL formalization defines Knuth–Bendix orders for higher-order terms without lambda-abstraction and proves many useful properties about them. The main order fully coincides with the standard transfinite KBO with subterm coefficients on first-order terms. It appears promising as the basis of a higher-order superposition calculus. notify = jasmin.blanchette@gmail.com [Lambda_Free_EPO] title = Formalization of the Embedding Path Order for Lambda-Free Higher-Order Terms author = Alexander Bentkamp topic = Logic/Rewriting date = 2018-10-19 notify = a.bentkamp@vu.nl abstract = This Isabelle/HOL formalization defines the Embedding Path Order (EPO) for higher-order terms without lambda-abstraction and proves many useful properties about it. In contrast to the lambda-free recursive path orders, it does not fully coincide with RPO on first-order terms, but it is compatible with arbitrary higher-order contexts. [Nested_Multisets_Ordinals] title = Formalization of Nested Multisets, Hereditary Multisets, and Syntactic Ordinals author = Jasmin Christian Blanchette , Mathias Fleury , Dmitriy Traytel date = 2016-11-12 topic = Logic/Rewriting abstract = This Isabelle/HOL formalization introduces a nested multiset datatype and defines Dershowitz and Manna's nested multiset order. The order is proved well founded and linear. By removing one constructor, we transform the nested multisets into hereditary multisets. These are isomorphic to the syntactic ordinals—the ordinals can be recursively expressed in Cantor normal form. Addition, subtraction, multiplication, and linear orders are provided on this type. notify = jasmin.blanchette@gmail.com [Abstract-Rewriting] title = Abstract Rewriting topic = Logic/Rewriting date = 2010-06-14 author = Christian Sternagel , René Thiemann license = LGPL abstract = We present an Isabelle formalization of abstract rewriting (see, e.g., the book by Baader and Nipkow). First, we define standard relations like joinability, meetability, conversion, etc. Then, we formalize important properties of abstract rewrite systems, e.g., confluence and strong normalization. Our main concern is on strong normalization, since this formalization is the basis of CeTA (which is mainly about strong normalization of term rewrite systems). 
Hence lemmas involving strong normalization constitute by far the biggest part of this theory. One of those is Newman's lemma. extra-history = Change history: [2010-09-17]: Added theories defining several (ordered) semirings related to strong normalization and giving some standard instances.
[2013-10-16]: Generalized delta-orders from rationals to Archimedean fields. notify = christian.sternagel@uibk.ac.at, rene.thiemann@uibk.ac.at [First_Order_Terms] title = First-Order Terms author = Christian Sternagel , René Thiemann topic = Logic/Rewriting, Computer science/Algorithms license = LGPL date = 2018-02-06 notify = c.sternagel@gmail.com, rene.thiemann@uibk.ac.at abstract = We formalize basic results on first-order terms, including matching and a first-order unification algorithm, as well as well-foundedness of the subsumption order. This entry is part of the Isabelle Formalization of Rewriting IsaFoR, where first-order terms are omni-present: the unification algorithm is used to certify several confluence and termination techniques, like critical-pair computation and dependency graph approximations; and the subsumption order is a crucial ingredient for completion. [Free-Groups] title = Free Groups author = Joachim Breitner date = 2010-06-24 topic = Mathematics/Algebra abstract = Free Groups are, in a sense, the most generic kind of group. They are defined over a set of generators with no additional relations in between them. They play an important role in the definition of group presentations and in other fields. This theory provides the definition of Free Group as the set of fully canceled words in the generators. The universal property is proven, as well as some isomorphisms results about Free Groups. extra-history = Change history: [2011-12-11]: Added the Ping Pong Lemma. notify = [CofGroups] title = An Example of a Cofinitary Group in Isabelle/HOL author = Bart Kastermans date = 2009-08-04 topic = Mathematics/Algebra abstract = We formalize the usual proof that the group generated by the function k -> k + 1 on the integers gives rise to a cofinitary group. notify = nipkow@in.tum.de [Group-Ring-Module] title = Groups, Rings and Modules author = Hidetsune Kobayashi <>, L. Chen <>, H. Murao <> date = 2004-05-18 topic = Mathematics/Algebra abstract = The theory of groups, rings and modules is developed to a great depth. Group theory results include Zassenhaus's theorem and the Jordan-Hoelder theorem. The ring theory development includes ideals, quotient rings and the Chinese remainder theorem. The module development includes the Nakayama lemma, exact sequences and Tensor products. notify = lp15@cam.ac.uk [Robbins-Conjecture] title = A Complete Proof of the Robbins Conjecture author = Matthew Wampler-Doty <> date = 2010-05-22 topic = Mathematics/Algebra abstract = This document gives a formalization of the proof of the Robbins conjecture, following A. Mann, A Complete Proof of the Robbins Conjecture, 2003. notify = nipkow@in.tum.de [Valuation] title = Fundamental Properties of Valuation Theory and Hensel's Lemma author = Hidetsune Kobayashi <> date = 2007-08-08 topic = Mathematics/Algebra abstract = Convergence with respect to a valuation is discussed as convergence of a Cauchy sequence. Cauchy sequences of polynomials are defined. They are used to formalize Hensel's lemma. notify = lp15@cam.ac.uk [Rank_Nullity_Theorem] title = Rank-Nullity Theorem in Linear Algebra author = Jose Divasón , Jesús Aransay topic = Mathematics/Algebra date = 2013-01-16 abstract = In this contribution, we present some formalizations based on the HOL-Multivariate-Analysis session of Isabelle. Firstly, a generalization of several theorems of such library are presented. Secondly, some definitions and proofs involving Linear Algebra and the four fundamental subspaces of a matrix are shown. 
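For reference, for an $m \times n$ matrix $A$ of rank $r$ over a field, these four fundamental subspaces and their dimensions are
\[
\dim C(A) = r, \qquad \dim N(A) = n - r, \qquad \dim C(A^T) = r, \qquad \dim N(A^T) = m - r,
\]
where $C(A)$ is the column space (the range of the associated linear map), $N(A)$ the null space (its kernel), $C(A^T)$ the row space and $N(A^T)$ the left null space; the first two equalities are exactly the matrix form of the theorem discussed next.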
Finally, we present a proof of the result known in Linear Algebra as the ``Rank-Nullity Theorem'', which states that, given any linear map f from a finite dimensional vector space V to a vector space W, the dimension of V is equal to the sum of the dimension of the kernel of f (which is a subspace of V) and the dimension of the range of f (which is a subspace of W). The proof presented here is based on the one given by Sheldon Axler in his book Linear Algebra Done Right. As a corollary of the previous theorem, and taking advantage of the relationship between linear maps and matrices, we prove that, for every matrix A (which has an associated linear map between finite dimensional vector spaces), the sum of the dimensions of its null space and its column space (the column space being equal to the range of the linear map) is equal to the number of columns of A. extra-history = Change history: [2014-07-14]: Added some generalizations that allow us to formalize the Rank-Nullity Theorem over finite dimensional vector spaces, instead of over the more particular euclidean spaces. Updated abstract. notify = jose.divasonm@unirioja.es, jesus-maria.aransay@unirioja.es [Affine_Arithmetic] title = Affine Arithmetic author = Fabian Immler date = 2014-02-07 topic = Mathematics/Analysis abstract = We give a formalization of affine forms as abstract representations of zonotopes. We provide affine operations as well as overapproximations of some non-affine operations like multiplication and division. Expressions involving those operations can automatically be turned into (executable) functions approximating the original expression in affine arithmetic. extra-history = Change history: [2015-01-31]: added algorithm for zonotope/hyperplane intersection
[2017-09-20]: linear approximations for all symbols from the floatarith data type notify = immler@in.tum.de [Laplace_Transform] title = Laplace Transform author = Fabian Immler topic = Mathematics/Analysis date = 2019-08-14 notify = fimmler@cs.cmu.edu abstract = This entry formalizes the Laplace transform and concrete Laplace transforms for arithmetic functions, frequency shift, integration and (higher) differentiation in the time domain. It proves Lerch's lemma and uniqueness of the Laplace transform for continuous functions. In order to formalize the foundational assumptions, this entry contains a formalization of piecewise continuous functions and functions of exponential order. [Cauchy] title = Cauchy's Mean Theorem and the Cauchy-Schwarz Inequality author = Benjamin Porter <> date = 2006-03-14 topic = Mathematics/Analysis abstract = This document presents the mechanised proofs of two popular theorems attributed to Augustin Louis Cauchy - Cauchy's Mean Theorem and the Cauchy-Schwarz Inequality. notify = kleing@cse.unsw.edu.au [Integration] title = Integration theory and random variables author = Stefan Richter date = 2004-11-19 topic = Mathematics/Analysis abstract = Lebesgue-style integration plays a major role in advanced probability. We formalize concepts of elementary measure theory, real-valued random variables as Borel-measurable functions, and a stepwise inductive definition of the integral itself. All proofs are carried out in human readable style using the Isar language. extra-note = Note: This article is of historical interest only. Lebesgue-style integration and probability theory are now available as part of the Isabelle/HOL distribution (directory Probability). notify = richter@informatik.rwth-aachen.de, nipkow@in.tum.de, hoelzl@in.tum.de [Ordinary_Differential_Equations] title = Ordinary Differential Equations author = Fabian Immler , Johannes Hölzl topic = Mathematics/Analysis date = 2012-04-26 abstract =

Session Ordinary-Differential-Equations formalizes ordinary differential equations (ODEs) and initial value problems. This work comprises proofs for local and global existence of unique solutions (Picard-Lindelöf theorem). Moreover, it contains a formalization of the (continuous or even differentiable) dependency of the flow on initial conditions as the flow of ODEs.
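In the standard textbook formulation, an initial value problem and the Picard-Lindelöf theorem read as follows (the formalisation phrases this in terms of the flow, and the precise hypotheses there may differ slightly): the problem
\[
\dot x(t) = f(t, x(t)), \qquad x(t_0) = x_0,
\]
has a unique solution on some interval around $t_0$ whenever $f$ is continuous and locally Lipschitz continuous in its second argument; the flow then maps an initial value $x_0$ to the value of this solution at time $t$, and its continuous or differentiable dependence on $x_0$ is what the entry establishes.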

Not in the generated document are the following sessions:

  • HOL-ODE-Numerics: Rigorous numerical algorithms for computing enclosures of solutions based on Runge-Kutta methods and affine arithmetic. Reachability analysis with splitting and reduction at hyperplanes.
  • HOL-ODE-Examples: Applications of the numerical algorithms to concrete systems of ODEs.
  • Lorenz_C0, Lorenz_C1: Verified algorithms for checking C1-information according to Tucker's proof, computation of C0-information.

extra-history = Change history: [2014-02-13]: added an implementation of the Euler method based on affine arithmetic
[2016-04-14]: added flow and variational equation
[2016-08-03]: numerical algorithms for reachability analysis (using second-order Runge-Kutta methods, splitting, and reduction) implemented using Lammich's framework for automatic refinement
[2017-09-20]: added Poincare map and propagation of variational equation in reachability analysis, verified algorithms for C1-information and computations for C0-information of the Lorenz attractor. notify = immler@in.tum.de, hoelzl@in.tum.de [Polynomials] title = Executable Multivariate Polynomials author = Christian Sternagel , René Thiemann , Alexander Maletzky , Fabian Immler , Florian Haftmann , Andreas Lochbihler , Alexander Bentkamp date = 2010-08-10 topic = Mathematics/Analysis, Mathematics/Algebra, Computer science/Algorithms/Mathematical license = LGPL abstract = We define multivariate polynomials over arbitrary (ordered) semirings in combination with (executable) operations like addition, multiplication, and substitution. We also define (weak) monotonicity of polynomials and comparison of polynomials where we provide standard estimations like absolute positiveness or the more recent approach of Neurauter, Zankl, and Middeldorp. Moreover, it is proven that strongly normalizing (monotone) orders can be lifted to strongly normalizing (monotone) orders over polynomials. Our formalization was performed as part of the IsaFoR/CeTA-system which contains several termination techniques. The provided theories have been essential to formalize polynomial interpretations.
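For a concrete picture of the abstract representation described in the next paragraph (coefficient functions with finite support over a type of power-products), the following Haskell sketch shows one way such polynomials and their ring operations can be modelled; the names are hypothetical and Num merely stands in for a semiring of coefficients, while the entry itself of course works with Isabelle/HOL types.

    import qualified Data.Map.Strict as M

    -- A power-product maps each variable to its exponent; a polynomial maps
    -- power-products to non-zero coefficients (finite support).
    type PowerProduct v = M.Map v Integer
    type MPoly v c      = M.Map (PowerProduct v) c

    add :: (Ord v, Eq c, Num c) => MPoly v c -> MPoly v c -> MPoly v c
    add p q = M.filter (/= 0) (M.unionWith (+) p q)

    mul :: (Ord v, Eq c, Num c) => MPoly v c -> MPoly v c -> MPoly v c
    mul p q = M.filter (/= 0) $ M.fromListWith (+)
      [ (M.unionWith (+) m1 m2, c1 * c2)
      | (m1, c1) <- M.toList p, (m2, c2) <- M.toList q ]

    -- Example: (x + 2) * (x*y) = x^2*y + 2*x*y over Integer coefficients.
    example :: MPoly String Integer
    example = mul (M.fromList [(M.fromList [("x", 1)], 1), (M.empty, 2)])
                  (M.fromList [(M.fromList [("x", 1), ("y", 1)], 1)])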

This formalization also contains an abstract representation as coefficient functions with finite support and a type of power-products. If this type is ordered by a linear (term) ordering, various additional notions, such as leading power-product, leading coefficient etc., are introduced as well. Furthermore, a lot of generic properties of, and functions on, multivariate polynomials are formalized, including the substitution and evaluation homomorphisms, embeddings of polynomial rings into larger rings (i.e. with one additional indeterminate), homogenization and dehomogenization of polynomials, and the canonical isomorphism between R[X,Y] and R[X][Y]. extra-history = Change history: [2010-09-17]: Moved theories on arbitrary (ordered) semirings to Abstract Rewriting.
[2016-10-28]: Added abstract representation of polynomials and authors Maletzky/Immler.
[2018-01-23]: Added authors Haftmann, Lochbihler after incorporating their formalization of multivariate polynomials based on Polynomial mappings. Moved material from Bentkamp's entry "Deep Learning".
[2019-04-18]: Added material about polynomials whose power-products are represented themselves by polynomial mappings. notify = rene.thiemann@uibk.ac.at, christian.sternagel@uibk.ac.at, alexander.maletzky@risc.jku.at, immler@in.tum.de [Sqrt_Babylonian] title = Computing N-th Roots using the Babylonian Method author = René Thiemann date = 2013-01-03 topic = Mathematics/Analysis license = LGPL abstract = We implement the Babylonian method to compute n-th roots of numbers. We provide precise algorithms for naturals, integers and rationals, and offer an approximation algorithm for square roots over linear ordered fields. Moreover, there are precise algorithms to compute the floor and the ceiling of n-th roots. extra-history = Change history: [2013-10-16]: Added algorithms to compute floor and ceiling of sqrt of integers. [2014-07-11]: Moved NthRoot_Impl from Real-Impl to this entry. notify = rene.thiemann@uibk.ac.at [Sturm_Sequences] title = Sturm's Theorem author = Manuel Eberl date = 2014-01-11 topic = Mathematics/Analysis abstract = Sturm's Theorem states that polynomial sequences with certain properties, so-called Sturm sequences, can be used to count the number of real roots of a real polynomial. This work contains a proof of Sturm's Theorem and code for constructing Sturm sequences efficiently. It also provides the “sturm” proof method, which can decide certain statements about the roots of real polynomials, such as “the polynomial P has exactly n roots in the interval I” or “P(x) > Q(x) for all x ∈ ℝ”. notify = eberlm@in.tum.de [Sturm_Tarski] title = The Sturm-Tarski Theorem author = Wenda Li date = 2014-09-19 topic = Mathematics/Analysis abstract = We have formalized the Sturm-Tarski theorem (also referred to as the Tarski theorem), which generalizes Sturm's theorem. Sturm's theorem is usually used as a way to count distinct real roots, while the Sturm-Tarski theorem forms the basis for Tarski's classic quantifier elimination for real closed fields. notify = wl302@cam.ac.uk [Markov_Models] title = Markov Models author = Johannes Hölzl , Tobias Nipkow date = 2012-01-03 topic = Mathematics/Probability theory, Computer science/Automata and formal languages abstract = This is a formalization of Markov models in Isabelle/HOL. It builds on Isabelle's probability theory. The available models are currently Discrete-Time Markov Chains and extensions of them with rewards.

As application of these models we formalize probabilistic model checking of pCTL formulas, analysis of IPv4 address allocation in ZeroConf and an analysis of the anonymity of the Crowds protocol. See here for the corresponding paper. notify = hoelzl@in.tum.de [Probabilistic_System_Zoo] title = A Zoo of Probabilistic Systems author = Johannes Hölzl , Andreas Lochbihler , Dmitriy Traytel date = 2015-05-27 topic = Computer science/Automata and formal languages abstract = Numerous models of probabilistic systems are studied in the literature. Coalgebra has been used to classify them into system types and compare their expressiveness. We formalize the resulting hierarchy of probabilistic system types by modeling the semantics of the different systems as codatatypes. This approach yields simple and concise proofs, as bisimilarity coincides with equality for codatatypes.

This work is described in detail in the ITP 2015 publication by the authors. notify = traytel@in.tum.de [Density_Compiler] title = A Verified Compiler for Probability Density Functions author = Manuel Eberl , Johannes Hölzl , Tobias Nipkow date = 2014-10-09 topic = Mathematics/Probability theory, Computer science/Programming languages/Compiling abstract = Bhat et al. [TACAS 2013] developed an inductive compiler that computes density functions for probability spaces described by programs in a probabilistic functional language. In this work, we implement such a compiler for a modified version of this language within the theorem prover Isabelle and give a formal proof of its soundness w.r.t. the semantics of the source and target language. Together with Isabelle's code generation for inductive predicates, this yields a fully verified, executable density compiler. The proof is done in two steps: First, an abstract compiler working with abstract functions modelled directly in the theorem prover's logic is defined and proved sound. Then, this compiler is refined to a concrete version that returns a target-language expression.

An article with the same title and authors is published in the proceedings of ESOP 2015. A detailed presentation of this work can be found in the first author's master's thesis. notify = hoelzl@in.tum.de [CAVA_Automata] title = The CAVA Automata Library author = Peter Lammich date = 2014-05-28 topic = Computer science/Automata and formal languages abstract = We report on the graph and automata library that is used in the fully verified LTL model checker CAVA. As most components of CAVA use some type of graphs or automata, a common automata library simplifies assembly of the components and reduces redundancy.

The CAVA Automata Library provides a hierarchy of graph and automata classes, together with some standard algorithms. Its object oriented design allows for sharing of algorithms, theorems, and implementations between its classes, and also simplifies extensions of the library. Moreover, it is integrated into the Automatic Refinement Framework, supporting automatic refinement of the abstract automata types to efficient data structures.

Note that the CAVA Automata Library is work in progress. Currently, it is very specifically tailored towards the requirements of the CAVA model checker. Nevertheless, the formalization techniques presented here allow an extension of the library to a wider scope. Moreover, they are not limited to graph libraries, but apply to class hierarchies in general.

The CAVA Automata Library is described in the paper: Peter Lammich, The CAVA Automata Library, Isabelle Workshop 2014. notify = lammich@in.tum.de [LTL] title = Linear Temporal Logic author = Salomon Sickert contributors = Benedikt Seidl date = 2016-03-01 topic = Logic/General logic/Temporal logic, Computer science/Automata and formal languages abstract = This theory provides a formalisation of linear temporal logic (LTL) and unifies previous formalisations within the AFP. This entry establishes syntax and semantics for this logic and decouples it from existing entries, yielding a common environment for theories reasoning about LTL. Furthermore a parser written in SML and an executable simplifier are provided. extra-history = Change history: [2019-03-12]: Support for additional operators, implementation of common equivalence relations, definition of syntactic fragments of LTL and the minimal disjunctive normal form.
notify = sickert@in.tum.de [LTL_to_GBA] title = Converting Linear-Time Temporal Logic to Generalized Büchi Automata author = Alexander Schimpf , Peter Lammich date = 2014-05-28 topic = Computer science/Automata and formal languages abstract = We formalize linear-time temporal logic (LTL) and the algorithm by Gerth et al. to convert LTL formulas to generalized Büchi automata. We also formalize some syntactic rewrite rules that can be applied to optimize the LTL formula before conversion. Moreover, we integrate the Stuttering Equivalence AFP-Entry by Stefan Merz, adapting the lemma that next-free LTL formulas cannot distinguish between stuttering equivalent runs to our setting.

We use the Isabelle Refinement and Collection framework, as well as the Autoref tool, to obtain a refined version of our algorithm, from which efficiently executable code can be extracted. notify = lammich@in.tum.de [Gabow_SCC] title = Verified Efficient Implementation of Gabow's Strongly Connected Components Algorithm author = Peter Lammich date = 2014-05-28 topic = Computer science/Algorithms/Graph, Mathematics/Graph theory abstract = We present an Isabelle/HOL formalization of Gabow's algorithm for finding the strongly connected components of a directed graph. Using data refinement techniques, we extract efficient code that performs comparable to a reference implementation in Java. Our style of formalization allows for re-using large parts of the proofs when defining variants of the algorithm. We demonstrate this by verifying an algorithm for the emptiness check of generalized Büchi automata, re-using most of the existing proofs. notify = lammich@in.tum.de [Promela] title = Promela Formalization author = René Neumann date = 2014-05-28 topic = Computer science/System description languages abstract = We present an executable formalization of the language Promela, the description language for models of the model checker SPIN. This formalization is part of the work for a completely verified model checker (CAVA), but also serves as a useful (and executable!) description of the semantics of the language itself, something that is currently missing. The formalization uses three steps: It takes an abstract syntax tree generated from an SML parser, removes syntactic sugar and enriches it with type information. This further gets translated into a transition system, on which the semantic engine (read: successor function) operates. notify = [CAVA_LTL_Modelchecker] title = A Fully Verified Executable LTL Model Checker author = Javier Esparza , Peter Lammich , René Neumann , Tobias Nipkow , Alexander Schimpf , Jan-Georg Smaus date = 2014-05-28 topic = Computer science/Automata and formal languages abstract = We present an LTL model checker whose code has been completely verified using the Isabelle theorem prover. The checker consists of over 4000 lines of ML code. The code is produced using the Isabelle Refinement Framework, which allows us to split its correctness proof into (1) the proof of an abstract version of the checker, consisting of a few hundred lines of ``formalized pseudocode'', and (2) a verified refinement step in which mathematical sets and other abstract structures are replaced by implementations of efficient structures like red-black trees and functional arrays. This leads to a checker that, while still slower than unverified checkers, can already be used as a trusted reference implementation against which advanced implementations can be tested.

An early version of this model checker is described in the CAV 2013 paper with the same title. notify = lammich@in.tum.de [Fermat3_4] title = Fermat's Last Theorem for Exponents 3 and 4 and the Parametrisation of Pythagorean Triples author = Roelof Oosterhuis <> date = 2007-08-12 topic = Mathematics/Number theory abstract = This document presents the mechanised proofs of

  • Fermat's Last Theorem for exponents 3 and 4 and
  • the parametrisation of Pythagorean Triples.
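As a small, purely illustrative sketch of the second item (Euclid's parametrisation itself, not the Isabelle development; the function name pythTriples is made up here), primitive Pythagorean triples can be enumerated in Haskell as follows:

  -- Euclid's parametrisation: for m > n > 0 coprime and of opposite parity,
  -- (m^2 - n^2, 2*m*n, m^2 + n^2) is a primitive Pythagorean triple.
  pythTriples :: [(Integer, Integer, Integer)]
  pythTriples =
    [ (m*m - n*n, 2*m*n, m*m + n*n)
    | m <- [2 ..], n <- [1 .. m - 1]
    , gcd m n == 1, odd (m - n) ]

  -- take 3 pythTriples == [(3,4,5),(5,12,13),(15,8,17)]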
notify = nipkow@in.tum.de, roelofoosterhuis@gmail.com [Perfect-Number-Thm] title = Perfect Number Theorem author = Mark Ijbema date = 2009-11-22 topic = Mathematics/Number theory abstract = These theories present the mechanised proof of the Perfect Number Theorem. notify = nipkow@in.tum.de [SumSquares] title = Sums of Two and Four Squares author = Roelof Oosterhuis <> date = 2007-08-12 topic = Mathematics/Number theory abstract = This document presents the mechanised proofs of the following results:
  • any prime number of the form 4m+1 can be written as the sum of two squares;
  • any natural number can be written as the sum of four squares
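As a rough illustration of the first result (a brute-force search exhibiting one decomposition, unrelated to the mechanised proof; the names twoSquares and isqrt are ad hoc):

  import Data.Maybe (listToMaybe)

  -- Naive search for a, b with a^2 + b^2 = p; intended for small inputs only,
  -- since the square root is taken via Double arithmetic.
  twoSquares :: Integer -> Maybe (Integer, Integer)
  twoSquares p = listToMaybe
    [ (a, b) | a <- [0 .. isqrt p], let b2 = p - a*a, let b = isqrt b2, b*b == b2 ]
    where isqrt = floor . sqrt . (fromIntegral :: Integer -> Double)

  -- twoSquares 29 == Just (2,5), since 29 = 4 + 25 and 29 ≡ 1 (mod 4)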
notify = nipkow@in.tum.de, roelofoosterhuis@gmail.com [Lehmer] title = Lehmer's Theorem author = Simon Wimmer , Lars Noschinski date = 2013-07-22 topic = Mathematics/Number theory abstract = In 1927, Lehmer presented criteria for primality, based on the converse of Fermat's little theorem. This work formalizes the second criterion from Lehmer's paper, a necessary and sufficient condition for primality.
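As an informal illustration of this kind of criterion (a Lucas-style witness check, not the Isabelle formalization; the names powMod and lehmerWitness are made up, and the prime factors of n-1 are assumed to be supplied):

  -- n is prime iff some a satisfies a^(n-1) ≡ 1 (mod n) while
  -- a^((n-1)/q) ≢ 1 (mod n) for every prime q dividing n-1.
  powMod :: Integer -> Integer -> Integer -> Integer
  powMod b e m
    | e == 0    = 1 `mod` m
    | even e    = powMod b (e `div` 2) m ^ 2 `mod` m
    | otherwise = b * powMod b (e - 1) m `mod` m

  lehmerWitness :: Integer -> Integer -> [Integer] -> Bool
  lehmerWitness n a qs =
    powMod a (n - 1) n == 1 && all (\q -> powMod a ((n - 1) `div` q) n /= 1) qs

  -- lehmerWitness 13 2 [2,3] == True   (12 = 2^2 * 3, and 2 generates (Z/13)*)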

As a side product we formalize some properties of Euler's phi-function, the notion of the order of an element of a group, and the cyclicity of the multiplicative group of a finite field. notify = noschinl@gmail.com, simon.wimmer@tum.de [Pratt_Certificate] title = Pratt's Primality Certificates author = Simon Wimmer , Lars Noschinski date = 2013-07-22 topic = Mathematics/Number theory abstract = In 1975, Pratt introduced a proof system for certifying primes. He showed that a number p is prime iff a primality certificate for p exists. By showing a logarithmic upper bound on the length of the certificates in size of the prime number, he concluded that the decision problem for prime numbers is in NP. This work formalizes soundness and completeness of Pratt's proof system as well as an upper bound for the size of the certificate. notify = noschinl@gmail.com, simon.wimmer@tum.de [Monad_Memo_DP] title = Monadification, Memoization and Dynamic Programming author = Simon Wimmer , Shuwei Hu , Tobias Nipkow topic = Computer science/Programming languages/Transformations, Computer science/Algorithms, Computer science/Functional programming date = 2018-05-22 notify = wimmers@in.tum.de abstract = We present a lightweight framework for the automatic verified (functional or imperative) memoization of recursive functions. Our tool can turn a pure Isabelle/HOL function definition into a monadified version in a state monad or the Imperative HOL heap monad, and prove a correspondence theorem. We provide a variety of memory implementations for the two types of monads. A number of simple techniques allow us to achieve bottom-up computation and space-efficient memoization. The framework’s utility is demonstrated on a number of representative dynamic programming problems. A detailed description of our work can be found in the accompanying paper [2]. [Probabilistic_Timed_Automata] title = Probabilistic Timed Automata author = Simon Wimmer , Johannes Hölzl topic = Mathematics/Probability theory, Computer science/Automata and formal languages date = 2018-05-24 notify = wimmers@in.tum.de, hoelzl@in.tum.de abstract = We present a formalization of probabilistic timed automata (PTA) for which we try to follow the formula MDP + TA = PTA as far as possible: our work starts from our existing formalizations of Markov decision processes (MDP) and timed automata (TA) and combines them modularly. We prove the fundamental result for probabilistic timed automata: the region construction that is known from timed automata carries over to the probabilistic setting. In particular, this allows us to prove that minimum and maximum reachability probabilities can be computed via a reduction to MDP model checking, including the case where one wants to disregard unrealizable behavior. Further information can be found in our ITP paper [2]. [Hidden_Markov_Models] title = Hidden Markov Models author = Simon Wimmer topic = Mathematics/Probability theory, Computer science/Algorithms date = 2018-05-25 notify = wimmers@in.tum.de abstract = This entry contains a formalization of hidden Markov models [3] based on Johannes Hölzl's formalization of discrete time Markov chains [1]. The basic definitions are provided and the correctness of two main (dynamic programming) algorithms for hidden Markov models is proved: the forward algorithm for computing the likelihood of an observed sequence, and the Viterbi algorithm for decoding the most probable hidden state sequence. The Viterbi algorithm is made executable including memoization. 
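For intuition, here is a rough, unverified sketch of the idea behind Viterbi decoding (plain dynamic programming over log-probabilities; it is not the memoized Isabelle version, and all names are ad hoc):

  import Data.List (maximumBy)
  import Data.Ord (comparing)

  -- states, initial log-probs, transition log-probs, emission log-probs, observations
  viterbi :: [s] -> (s -> Double) -> (s -> s -> Double) -> (s -> o -> Double) -> [o] -> [s]
  viterbi _      _   _     _    []      = []
  viterbi states ini trans emit (o0:os) =
    reverse (go [ (ini s + emit s o0, [s]) | s <- states ] os)
    where
      go layer []       = snd (maximumBy (comparing fst) layer)
      go layer (o:rest) = go [ extend s o layer | s <- states ] rest
      -- best predecessor path for state s, then account for emitting o in s
      extend s o layer =
        let (p, path) = maximumBy (comparing fst)
                          [ (p' + trans prev s, path') | (p', path'@(prev:_)) <- layer ]
        in (p + emit s o, s : path)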
Hidden Markov models have various applications in natural language processing. For an introduction, see Jurafsky and Martin [2]. [ArrowImpossibilityGS] title = Arrow and Gibbard-Satterthwaite author = Tobias Nipkow date = 2008-09-01 topic = Mathematics/Games and economics abstract = This article formalizes two proofs of Arrow's impossibility theorem due to Geanakoplos and derives the Gibbard-Satterthwaite theorem as a corollary. One formalization is based on utility functions, the other one on strict partial orders.

An article about these proofs is found here. notify = nipkow@in.tum.de [SenSocialChoice] title = Some classical results in Social Choice Theory author = Peter Gammie date = 2008-11-09 topic = Mathematics/Games and economics abstract = Drawing on Sen's landmark work "Collective Choice and Social Welfare" (1970), this development proves Arrow's General Possibility Theorem, Sen's Liberal Paradox and May's Theorem in a general setting. The goal was to make precise the classical statements and proofs of these results, and to provide a foundation for more recent results such as the Gibbard-Satterthwaite and Duggan-Schwartz theorems. notify = nipkow@in.tum.de [Vickrey_Clarke_Groves] title = VCG - Combinatorial Vickrey-Clarke-Groves Auctions author = Marco B. Caminati <>, Manfred Kerber , Christoph Lange, Colin Rowat date = 2015-04-30 topic = Mathematics/Games and economics abstract = A VCG auction (named after their inventors Vickrey, Clarke, and Groves) is a generalization of the single-good, second price Vickrey auction to the case of a combinatorial auction (multiple goods, from which any participant can bid on each possible combination). We formalize in this entry VCG auctions, including tie-breaking and prove that the functions for the allocation and the price determination are well-defined. Furthermore we show that the allocation function allocates goods only to participants, only goods in the auction are allocated, and no good is allocated twice. We also show that the price function is non-negative. These properties also hold for the automatically extracted Scala code. notify = mnfrd.krbr@gmail.com [Topology] title = Topology author = Stefan Friedrich <> date = 2004-04-26 topic = Mathematics/Topology abstract = This entry contains two theories. The first, Topology, develops the basic notions of general topology. The second, which can be viewed as a demonstration of the first, is called LList_Topology. It develops the topology of lazy lists. notify = lcp@cl.cam.ac.uk [Knot_Theory] title = Knot Theory author = T.V.H. Prathamesh date = 2016-01-20 topic = Mathematics/Topology abstract = This work contains a formalization of some topics in knot theory. The concepts that were formalized include definitions of tangles, links, framed links and link/tangle equivalence. The formalization is based on a formulation of links in terms of tangles. We further construct and prove the invariance of the Bracket polynomial. Bracket polynomial is an invariant of framed links closely linked to the Jones polynomial. This is perhaps the first attempt to formalize any aspect of knot theory in an interactive proof assistant. notify = prathamesh@imsc.res.in [Graph_Theory] title = Graph Theory author = Lars Noschinski date = 2013-04-28 topic = Mathematics/Graph theory abstract = This development provides a formalization of directed graphs, supporting (labelled) multi-edges and infinite graphs. A polymorphic edge type allows edges to be treated as pairs of vertices, if multi-edges are not required. Formalized properties are i.a. walks (and related concepts), connectedness and subgraphs and basic properties of isomorphisms.

This formalization is used to prove characterizations of Euler Trails, Shortest Paths and Kuratowski subgraphs. notify = noschinl@gmail.com [Planarity_Certificates] title = Planarity Certificates author = Lars Noschinski date = 2015-11-11 topic = Mathematics/Graph theory abstract = This development provides a formalization of planarity based on combinatorial maps and proves that Kuratowski's theorem implies combinatorial planarity. Moreover, it contains verified implementations of programs checking certificates for planarity (i.e., a combinatorial map) or non-planarity (i.e., a Kuratowski subgraph). notify = noschinl@gmail.com [Max-Card-Matching] title = Maximum Cardinality Matching author = Christine Rizkallah date = 2011-07-21 topic = Mathematics/Graph theory abstract =

A matching in a graph G is a subset M of the edges of G such that no two share an endpoint. A matching has maximum cardinality if its cardinality is at least as large as that of any other matching. An odd-set cover OSC of a graph G is a labeling of the nodes of G with integers such that every edge of G is either incident to a node labeled 1 or connects two nodes labeled with the same number i ≥ 2.

This article proves Edmonds' theorem:
Let M be a matching in a graph G and let OSC be an odd-set cover of G. For any i ≥ 0, let n(i) be the number of nodes labeled i. If |M| = n(1) + ∑_{i ≥ 2} (n(i) div 2), then M is a maximum cardinality matching.
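As a tiny illustration of the counting identity only (it assumes that M is indeed a matching and that the labelling is a valid odd-set cover, neither of which is checked; the name edmondsCertificate is made up here):

  -- Checks |M| = n(1) + sum over i >= 2 of (n(i) div 2).
  edmondsCertificate :: [(a, a)] -> [(a, Int)] -> Bool
  edmondsCertificate matching labelling =
    length matching == count 1 + sum [ count i `div` 2 | i <- [2 .. maxLabel] ]
    where
      count i  = length [ v | (v, l) <- labelling, l == i ]
      maxLabel = maximum (0 : map snd labelling)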

notify = nipkow@in.tum.de [Girth_Chromatic] title = A Probabilistic Proof of the Girth-Chromatic Number Theorem author = Lars Noschinski date = 2012-02-06 topic = Mathematics/Graph theory abstract = This works presents a formalization of the Girth-Chromatic number theorem in graph theory, stating that graphs with arbitrarily large girth and chromatic number exist. The proof uses the theory of Random Graphs to prove the existence with probabilistic arguments. notify = noschinl@gmail.com [Random_Graph_Subgraph_Threshold] title = Properties of Random Graphs -- Subgraph Containment author = Lars Hupel date = 2014-02-13 topic = Mathematics/Graph theory, Mathematics/Probability theory abstract = Random graphs are graphs with a fixed number of vertices, where each edge is present with a fixed probability. We are interested in the probability that a random graph contains a certain pattern, for example a cycle or a clique. A very high edge probability gives rise to perhaps too many edges (which degrades performance for many algorithms), whereas a low edge probability might result in a disconnected graph. We prove a theorem about a threshold probability such that a higher edge probability will asymptotically almost surely produce a random graph with the desired subgraph. notify = hupel@in.tum.de [Flyspeck-Tame] title = Flyspeck I: Tame Graphs author = Gertrud Bauer <>, Tobias Nipkow date = 2006-05-22 topic = Mathematics/Graph theory abstract = These theories present the verified enumeration of tame plane graphs as defined by Thomas C. Hales in his proof of the Kepler Conjecture in his book Dense Sphere Packings. A Blueprint for Formal Proofs. [CUP 2012]. The values of the constants in the definition of tameness are identical to those in the Flyspeck project. The IJCAR 2006 paper by Nipkow, Bauer and Schultz refers to the original version of Hales' proof, the ITP 2011 paper by Nipkow refers to the Blueprint version of the proof. extra-history = Change history: [2010-11-02]: modified theories to reflect the modified definition of tameness in Hales' revised proof.
[2014-07-03]: modified constants in def of tameness and Archive according to the final state of the Flyspeck proof. notify = nipkow@in.tum.de [Well_Quasi_Orders] title = Well-Quasi-Orders author = Christian Sternagel date = 2012-04-13 topic = Mathematics/Combinatorics abstract = Based on Isabelle/HOL's type class for preorders, we introduce a type class for well-quasi-orders (wqo) which is characterized by the absence of "bad" sequences (our proofs are along the lines of the proof of Nash-Williams, from which we also borrow terminology). Our main results are instantiations for the product type, the list type, and a type of finite trees, which (almost) directly follow from our proofs of (1) Dickson's Lemma, (2) Higman's Lemma, and (3) Kruskal's Tree Theorem. More concretely:
  • If the sets A and B are wqo then their Cartesian product is wqo.
  • If the set A is wqo then the set of finite lists over A is wqo.
  • If the set A is wqo then the set of finite trees over A is wqo.
The research was funded by the Austrian Science Fund (FWF): J3202. extra-history = Change history: [2012-06-11]: Added Kruskal's Tree Theorem.
[2012-12-19]: New variant of Kruskal's tree theorem for terms (as opposed to variadic terms, i.e., trees), plus finite version of the tree theorem as corollary.
[2013-05-16]: Simplified construction of minimal bad sequences.
[2014-07-09]: Simplified proofs of Higman's lemma and Kruskal's tree theorem, based on homogeneous sequences.
[2016-01-03]: An alternative proof of Higman's lemma by open induction.
[2017-06-08]: Proved (classical) equivalence to inductive definition of almost-full relations according to the ITP 2012 paper "Stop When You Are Almost-Full" by Vytiniotis, Coquand, and Wahlstedt. notify = c.sternagel@gmail.com [Marriage] title = Hall's Marriage Theorem author = Dongchen Jiang , Tobias Nipkow date = 2010-12-17 topic = Mathematics/Combinatorics abstract = Two proofs of Hall's Marriage Theorem: one due to Halmos and Vaughan, one due to Rado. extra-history = Change history: [2011-09-09]: Added Rado's proof notify = nipkow@in.tum.de [Bondy] title = Bondy's Theorem author = Jeremy Avigad , Stefan Hetzl date = 2012-10-27 topic = Mathematics/Combinatorics abstract = A proof of Bondy's theorem following B. Bollabas, Combinatorics, 1986, Cambridge University Press. notify = avigad@cmu.edu, hetzl@logic.at [Ramsey-Infinite] title = Ramsey's theorem, infinitary version author = Tom Ridge <> date = 2004-09-20 topic = Mathematics/Combinatorics abstract = This formalization of Ramsey's theorem (infinitary version) is taken from Boolos and Jeffrey, Computability and Logic, 3rd edition, Chapter 26. It differs slightly from the text by assuming a slightly stronger hypothesis. In particular, the induction hypothesis is stronger, holding for any infinite subset of the naturals. This avoids the rather peculiar mapping argument between kj and aikj on p.263, which is unnecessary and slightly mars this really beautiful result. notify = lp15@cam.ac.uk [Derangements] title = Derangements Formula author = Lukas Bulwahn date = 2015-06-27 topic = Mathematics/Combinatorics abstract = The Derangements Formula describes the number of fixpoint-free permutations as a closed formula. This theorem is the 88th theorem in a list of the ``Top 100 Mathematical Theorems''. notify = lukas.bulwahn@gmail.com [Euler_Partition] title = Euler's Partition Theorem author = Lukas Bulwahn date = 2015-11-19 topic = Mathematics/Combinatorics abstract = Euler's Partition Theorem states that the number of partitions with only distinct parts is equal to the number of partitions with only odd parts. The combinatorial proof follows John Harrison's HOL Light formalization. This theorem is the 45th theorem of the Top 100 Theorems list. notify = lukas.bulwahn@gmail.com [Discrete_Summation] title = Discrete Summation author = Florian Haftmann contributors = Amine Chaieb <> date = 2014-04-13 topic = Mathematics/Combinatorics abstract = These theories introduce basic concepts and proofs about discrete summation: shifts, formal summation, falling factorials and stirling numbers. As proof of concept, a simple summation conversion is provided. notify = florian.haftmann@informatik.tu-muenchen.de [Open_Induction] title = Open Induction author = Mizuhito Ogawa <>, Christian Sternagel date = 2012-11-02 topic = Mathematics/Combinatorics abstract = A proof of the open induction schema based on J.-C. Raoult, Proving open properties by induction, Information Processing Letters 29, 1988, pp.19-23.

This research was supported by the Austrian Science Fund (FWF): J3202.

notify = c.sternagel@gmail.com [Category] title = Category Theory to Yoneda's Lemma author = Greg O'Keefe date = 2005-04-21 topic = Mathematics/Category theory license = LGPL abstract = This development proves Yoneda's lemma and aims to be readable by humans. It only defines what is needed for the lemma: categories, functors and natural transformations. Limits, adjunctions and other important concepts are not included. extra-history = Change history: [2010-04-23]: The definition of the constant equinumerous was slightly too weak in the original submission and has been fixed in revision 8c2b5b3c995f. notify = lcp@cl.cam.ac.uk [Category2] title = Category Theory author = Alexander Katovsky date = 2010-06-20 topic = Mathematics/Category theory abstract = This article presents a development of Category Theory in Isabelle/HOL. A Category is defined using records and locales. Functors and Natural Transformations are also defined. The main result that has been formalized is that the Yoneda functor is a full and faithful embedding. We also formalize the completeness of many sorted monadic equational logic. Extensive use is made of the HOLZF theory in both cases. For an informal description see here [pdf]. notify = alexander.katovsky@cantab.net [FunWithFunctions] title = Fun With Functions author = Tobias Nipkow date = 2008-08-26 topic = Mathematics/Misc abstract = This is a collection of cute puzzles of the form ``Show that if a function satisfies the following constraints, it must be ...'' Please add further examples to this collection! notify = nipkow@in.tum.de [FunWithTilings] title = Fun With Tilings author = Tobias Nipkow , Lawrence C. Paulson date = 2008-11-07 topic = Mathematics/Misc abstract = Tilings are defined inductively. It is shown that one form of mutilated chess board cannot be tiled with dominoes, while another one can be tiled with L-shaped tiles. Please add further fun examples of this kind! notify = nipkow@in.tum.de [Lazy-Lists-II] title = Lazy Lists II author = Stefan Friedrich <> date = 2004-04-26 topic = Computer science/Data structures abstract = This theory contains some useful extensions to the LList (lazy list) theory by Larry Paulson, including finite, infinite, and positive llists over an alphabet, as well as the new constants take and drop and the prefix order of llists. Finally, the notions of safety and liveness in the sense of Alpern and Schneider (1985) are defined. notify = lcp@cl.cam.ac.uk [Ribbon_Proofs] title = Ribbon Proofs author = John Wickerson <> date = 2013-01-19 topic = Computer science/Programming languages/Logics abstract = This document concerns the theory of ribbon proofs: a diagrammatic proof system, based on separation logic, for verifying program correctness. We include the syntax, proof rules, and soundness results for two alternative formalisations of ribbon proofs.

Compared to traditional proof outlines, ribbon proofs emphasise the structure of a proof, so are intelligible and pedagogical. Because they contain less redundancy than proof outlines, and allow each proof step to be checked locally, they may be more scalable. Where proof outlines are cumbersome to modify, ribbon proofs can be visually manoeuvred to yield proofs of variant programs. notify = [Koenigsberg_Friendship] title = The Königsberg Bridge Problem and the Friendship Theorem author = Wenda Li date = 2013-07-19 topic = Mathematics/Graph theory abstract = This development provides a formalization of undirected graphs and simple graphs, which are based on Benedikt Nordhoff and Peter Lammich's simple formalization of labelled directed graphs in the archive. Then, with our formalization of graphs, we show both necessary and sufficient conditions for Eulerian trails and circuits as well as the fact that the Königsberg Bridge Problem does not have a solution. In addition, we show the Friendship Theorem in simple graphs. notify = [Tree_Decomposition] title = Tree Decomposition author = Christoph Dittmann notify = date = 2016-05-31 topic = Mathematics/Graph theory abstract = We formalize tree decompositions and tree width in Isabelle/HOL, proving that trees have treewidth 1. We also show that every edge of a tree decomposition is a separation of the underlying graph. As an application of this theorem we prove that complete graphs of size n have treewidth n-1. [Menger] title = Menger's Theorem author = Christoph Dittmann topic = Mathematics/Graph theory date = 2017-02-26 notify = isabelle@christoph-d.de abstract = We present a formalization of Menger's Theorem for directed and undirected graphs in Isabelle/HOL. This well-known result shows that if two non-adjacent distinct vertices u, v in a directed graph have no separator smaller than n, then there exist n internally vertex-disjoint paths from u to v. The version for undirected graphs follows immediately because undirected graphs are a special case of directed graphs. [IEEE_Floating_Point] title = A Formal Model of IEEE Floating Point Arithmetic author = Lei Yu contributors = Fabian Hellauer , Fabian Immler date = 2013-07-27 topic = Computer science/Data structures abstract = This development provides a formal model of IEEE-754 floating-point arithmetic. This formalization, including formal specification of the standard and proofs of important properties of floating-point arithmetic, forms the foundation for verifying programs with floating-point computation. There is also a code generation setup for floats so that we can execute programs using this formalization in functional programming languages. notify = lp15@cam.ac.uk, immler@in.tum.de extra-history = Change history: [2017-09-25]: Added conversions from and to software floating point numbers (by Fabian Hellauer and Fabian Immler).
[2018-02-05]: 'Modernized' representation following the formalization in HOL4: former "float_format" and predicate "is_valid" is now encoded in a type "('e, 'f) float" where 'e and 'f encode the size of exponent and fraction. [Native_Word] title = Native Word author = Andreas Lochbihler contributors = Peter Lammich date = 2013-09-17 topic = Computer science/Data structures abstract = This entry makes machine words and machine arithmetic available for code generation from Isabelle/HOL. It provides a common abstraction that hides the differences between the different target languages. The code generator maps these operations to the APIs of the target languages. Apart from that, we extend the available bit operations on types int and integer, and map them to the operations in the target languages. extra-history = Change history: [2013-11-06]: added conversion function between native words and characters (revision fd23d9a7fe3a)
[2014-03-31]: added words of default size in the target language (by Peter Lammich) (revision 25caf5065833)
[2014-10-06]: proper test setup with compilation and execution of tests in all target languages (revision 5d7a1c9ae047)
[2017-09-02]: added 64-bit words (revision c89f86244e3c)
[2018-07-15]: added cast operators for default-size words (revision fc1f1fb8dd30)
notify = mail@andreas-lochbihler.de [XML] title = XML author = Christian Sternagel , René Thiemann date = 2014-10-03 topic = Computer science/Functional programming, Computer science/Data structures abstract = This entry provides an XML library for Isabelle/HOL. This includes parsing and pretty printing of XML trees as well as combinators for transforming XML trees into arbitrary user-defined data. The main contribution of this entry is an interface (fit for code generation) that allows for communication between verified programs formalized in Isabelle/HOL and the outside world via XML. This library was developed as part of the IsaFoR/CeTA project to which we refer for examples of its usage. notify = c.sternagel@gmail.com, rene.thiemann@uibk.ac.at [HereditarilyFinite] title = The Hereditarily Finite Sets author = Lawrence C. Paulson date = 2013-11-17 topic = Logic/Set theory abstract = The theory of hereditarily finite sets is formalised, following the development of Swierczkowski. An HF set is a finite collection of other HF sets; they enjoy an induction principle and satisfy all the axioms of ZF set theory apart from the axiom of infinity, which is negated. All constructions that are possible in ZF set theory (Cartesian products, disjoint sums, natural numbers, functions) without using infinite sets are possible here. The definition of addition for the HF sets follows Kirby. This development forms the foundation for the Isabelle proof of Gödel's incompleteness theorems, which has been formalised separately. extra-history = Change history: [2015-02-23]: Added the theory "Finitary" defining the class of types that can be embedded in hf, including int, char, option, list, etc. notify = lp15@cam.ac.uk [Incompleteness] title = Gödel's Incompleteness Theorems author = Lawrence C. Paulson date = 2013-11-17 topic = Logic/Proof theory abstract = Gödel's two incompleteness theorems are formalised, following a careful presentation by Swierczkowski, in the theory of hereditarily finite sets. This represents the first ever machine-assisted proof of the second incompleteness theorem. Compared with traditional formalisations using Peano arithmetic (see e.g. Boolos), coding is simpler, with no need to formalise the notion of multiplication (let alone that of a prime number) in the formalised calculus upon which the theorem is based. However, other technical problems had to be solved in order to complete the argument. notify = lp15@cam.ac.uk [Finite_Automata_HF] title = Finite Automata in Hereditarily Finite Set Theory author = Lawrence C. Paulson date = 2015-02-05 topic = Computer science/Automata and formal languages abstract = Finite Automata, both deterministic and non-deterministic, for regular languages. The Myhill-Nerode Theorem. Closure under intersection, concatenation, etc. Regular expressions define regular languages. Closure under reversal; the powerset construction mapping NFAs to DFAs. Left and right languages; minimal DFAs. Brzozowski's minimization algorithm. Uniqueness up to isomorphism of minimal DFAs. notify = lp15@cam.ac.uk [Decreasing-Diagrams] title = Decreasing Diagrams author = Harald Zankl license = LGPL date = 2013-11-01 topic = Logic/Rewriting abstract = This theory contains a formalization of decreasing diagrams showing that any locally decreasing abstract rewrite system is confluent. We consider the valley (van Oostrom, TCS 1994) and the conversion version (van Oostrom, RTA 2008) and closely follow the original proofs. As an application we prove Newman's lemma. 
notify = Harald.Zankl@uibk.ac.at [Decreasing-Diagrams-II] title = Decreasing Diagrams II author = Bertram Felgenhauer license = LGPL date = 2015-08-20 topic = Logic/Rewriting abstract = This theory formalizes the commutation version of decreasing diagrams for Church-Rosser modulo. The proof follows Felgenhauer and van Oostrom (RTA 2013). The theory also provides important specializations, in particular van Oostrom’s conversion version (TCS 2008) of decreasing diagrams. notify = bertram.felgenhauer@uibk.ac.at [GoedelGod] title = Gödel's God in Isabelle/HOL author = Christoph Benzmüller , Bruno Woltzenlogel Paleo date = 2013-11-12 topic = Logic/Philosophical aspects abstract = Dana Scott's version of Gödel's proof of God's existence is formalized in quantified modal logic KB (QML KB). QML KB is modeled as a fragment of classical higher-order logic (HOL); thus, the formalization is essentially a formalization in HOL. notify = lp15@cam.ac.uk, c.benzmueller@fu-berlin.de [Types_Tableaus_and_Goedels_God] title = Types, Tableaus and Gödel’s God in Isabelle/HOL author = David Fuenmayor , Christoph Benzmüller topic = Logic/Philosophical aspects date = 2017-05-01 notify = davfuenmayor@gmail.com, c.benzmueller@gmail.com abstract = A computer-formalisation of the essential parts of Fitting's textbook "Types, Tableaus and Gödel's God" in Isabelle/HOL is presented. In particular, Fitting's (and Anderson's) variant of the ontological argument is verified and confirmed. This variant avoids the modal collapse, which has been criticised as an undesirable side-effect of Kurt Gödel's (and Dana Scott's) versions of the ontological argument. Fitting's work is employing an intensional higher-order modal logic, which we shallowly embed here in classical higher-order logic. We then utilize the embedded logic for the formalisation of Fitting's argument. (See also the earlier AFP entry ``Gödel's God in Isabelle/HOL''.) [GewirthPGCProof] title = Formalisation and Evaluation of Alan Gewirth's Proof for the Principle of Generic Consistency in Isabelle/HOL author = David Fuenmayor , Christoph Benzmüller topic = Logic/Philosophical aspects date = 2018-10-30 notify = davfuenmayor@gmail.com, c.benzmueller@gmail.com abstract = An ambitious ethical theory ---Alan Gewirth's "Principle of Generic Consistency"--- is encoded and analysed in Isabelle/HOL. Gewirth's theory has stirred much attention in philosophy and ethics and has been proposed as a potential means to bound the impact of artificial general intelligence. extra-history = Change history: [2019-04-09]: added proof for a stronger variant of the PGC and examplary inferences (revision 88182cb0a2f6)
[Lowe_Ontological_Argument] title = Computer-assisted Reconstruction and Assessment of E. J. Lowe's Modal Ontological Argument author = David Fuenmayor , Christoph Benzmüller topic = Logic/Philosophical aspects date = 2017-09-21 notify = davfuenmayor@gmail.com, c.benzmueller@gmail.com abstract = Computers may help us to understand --not just verify-- philosophical arguments. By utilizing modern proof assistants in an iterative interpretive process, we can reconstruct and assess an argument by fully formal means. Through the mechanization of a variant of St. Anselm's ontological argument by E. J. Lowe, which is a paradigmatic example of a natural-language argument with strong ties to metaphysics and religion, we offer an ideal showcase for our computer-assisted interpretive method. [AnselmGod] title = Anselm's God in Isabelle/HOL author = Ben Blumson topic = Logic/Philosophical aspects date = 2017-09-06 notify = benblumson@gmail.com abstract = Paul Oppenheimer and Edward Zalta's formalisation of Anselm's ontological argument for the existence of God is automated by embedding a free logic for definite descriptions within Isabelle/HOL. [Tail_Recursive_Functions] title = A General Method for the Proof of Theorems on Tail-recursive Functions author = Pasquale Noce date = 2013-12-01 topic = Computer science/Functional programming abstract =

Tail-recursive function definitions are sometimes more straightforward than alternatives, but proving theorems on them may be roundabout because of the peculiar form of the resulting recursion induction rules.

This paper describes a proof method that provides a general solution to this problem by means of suitable invariants over inductive sets, and illustrates the application of this method by examining two case studies.

notify = pasquale.noce.lavoro@gmail.com [CryptoBasedCompositionalProperties] title = Compositional Properties of Crypto-Based Components author = Maria Spichkova date = 2014-01-11 topic = Computer science/Security abstract = This paper presents an Isabelle/HOL set of theories which allows the specification of crypto-based components and the verification of their composition properties wrt. cryptographic aspects. We introduce a formalisation of the security property of data secrecy, the corresponding definitions and proofs. Please note that here we import the Isabelle/HOL theory ListExtras.thy, presented in the AFP entry FocusStreamsCaseStudies-AFP. notify = maria.spichkova@rmit.edu.au [Featherweight_OCL] title = Featherweight OCL: A Proposal for a Machine-Checked Formal Semantics for OCL 2.5 author = Achim D. Brucker , Frédéric Tuong , Burkhart Wolff date = 2014-01-16 topic = Computer science/System description languages abstract = The Unified Modeling Language (UML) is one of the few modeling languages that is widely used in industry. While UML is mostly known as diagrammatic modeling language (e.g., visualizing class models), it is complemented by a textual language, called Object Constraint Language (OCL). The current version of OCL is based on a four-valued logic that turns UML into a formal language. Any type comprises the elements "invalid" and "null" which are propagated as strict and non-strict, respectively. Unfortunately, the former semi-formal semantics of this specification language, captured in the "Annex A" of the OCL standard, leads to different interpretations of corner cases. We formalize the core of OCL: denotational definitions, a logical calculus and operational rules that allow for the execution of OCL expressions by a mixture of term rewriting and code compilation. Our formalization reveals several inconsistencies and contradictions in the current version of the OCL standard. Overall, this document is intended to provide the basis for a machine-checked text "Annex A" of the OCL standard targeting at tool implementors. extra-history = Change history: [2015-10-13]: afp-devel@ea3b38fc54d6 and hol-testgen@12148
   Update of Featherweight OCL including a change in the abstract.
[2014-01-16]: afp-devel@9091ce05cb20 and hol-testgen@10241
   New Entry: Featherweight OCL notify = brucker@spamfence.net, tuong@users.gforge.inria.fr, wolff@lri.fr [Relation_Algebra] title = Relation Algebra author = Alasdair Armstrong <>, Simon Foster , Georg Struth , Tjark Weber date = 2014-01-25 topic = Mathematics/Algebra abstract = Tarski's algebra of binary relations is formalised along the lines of the standard textbooks of Maddux and Schmidt and Ströhlein. This includes relation-algebraic concepts such as subidentities, vectors and a domain operation as well as various notions associated to functions. Relation algebras are also expanded by a reflexive transitive closure operation, and they are linked with Kleene algebras and models of binary relations and Boolean matrices. notify = g.struth@sheffield.ac.uk, tjark.weber@it.uu.se [PSemigroupsConvolution] title = Partial Semigroups and Convolution Algebras author = Brijesh Dongol , Victor B. F. Gomes , Ian J. Hayes , Georg Struth topic = Mathematics/Algebra date = 2017-06-13 notify = g.struth@sheffield.ac.uk, victor.gomes@cl.cam.ac.uk abstract = Partial Semigroups are relevant to the foundations of quantum mechanics and combinatorics as well as to interval and separation logics. Convolution algebras can be understood either as algebras of generalised binary modalities over ternary Kripke frames, in particular over partial semigroups, or as algebras of quantale-valued functions which are equipped with a convolution-style operation of multiplication that is parametrised by a ternary relation. Convolution algebras provide algebraic semantics for various substructural logics, including categorial, relevance and linear logics, for separation logic and for interval logics; they cover quantitative and qualitative applications. These mathematical components for partial semigroups and convolution algebras provide uniform foundations from which models of computation based on relations, program traces or pomsets, and verification components for separation or interval temporal logics can be built with little effort. [Secondary_Sylow] title = Secondary Sylow Theorems author = Jakob von Raumer date = 2014-01-28 topic = Mathematics/Algebra abstract = These theories extend the existing proof of the first Sylow theorem (written by Florian Kammueller and L. C. Paulson) by what are often called the second, third and fourth Sylow theorems. These theorems state propositions about the number of Sylow p-subgroups of a group and the fact that they are conjugate to each other. The proofs make use of an implementation of group actions and their properties. notify = psxjv4@nottingham.ac.uk [Jordan_Hoelder] title = The Jordan-Hölder Theorem author = Jakob von Raumer date = 2014-09-09 topic = Mathematics/Algebra abstract = This submission contains theories that lead to a formalization of the proof of the Jordan-Hölder theorem about composition series of finite groups. The theories formalize the notions of isomorphism classes of groups, simple groups, normal series, composition series, maximal normal subgroups. Furthermore, they provide proofs of the second isomorphism theorem for groups, the characterization theorem for maximal normal subgroups as well as many useful lemmas about normal subgroups and factor groups. The proof is inspired by course notes of Stuart Rankin. 
notify = psxjv4@nottingham.ac.uk [Cayley_Hamilton] title = The Cayley-Hamilton Theorem author = Stephan Adelsberger , Stefan Hetzl , Florian Pollak date = 2014-09-15 topic = Mathematics/Algebra abstract = This document contains a proof of the Cayley-Hamilton theorem based on the development of matrices in HOL/Multivariate Analysis. notify = stvienna@gmail.com [Probabilistic_Noninterference] title = Probabilistic Noninterference author = Andrei Popescu , Johannes Hölzl date = 2014-03-11 topic = Computer science/Security abstract = We formalize a probabilistic noninterference for a multi-threaded language with uniform scheduling, where probabilistic behaviour comes from both the scheduler and the individual threads. We define notions probabilistic noninterference in two variants: resumption-based and trace-based. For the resumption-based notions, we prove compositionality w.r.t. the language constructs and establish sound type-system-like syntactic criteria. This is a formalization of the mathematical development presented at CPP 2013 and CALCO 2013. It is the probabilistic variant of the Possibilistic Noninterference AFP entry. notify = hoelzl@in.tum.de [HyperCTL] title = A shallow embedding of HyperCTL* author = Markus N. Rabe , Peter Lammich , Andrei Popescu date = 2014-04-16 topic = Computer science/Security, Logic/General logic/Temporal logic abstract = We formalize HyperCTL*, a temporal logic for expressing security properties. We first define a shallow embedding of HyperCTL*, within which we prove inductive and coinductive rules for the operators. Then we show that a HyperCTL* formula captures Goguen-Meseguer noninterference, a landmark information flow property. We also define a deep embedding and connect it to the shallow embedding by a denotational semantics, for which we prove sanity w.r.t. dependence on the free variables. Finally, we show that under some finiteness assumptions about the model, noninterference is given by a (finitary) syntactic formula. notify = uuomul@yahoo.com [Bounded_Deducibility_Security] title = Bounded-Deducibility Security author = Andrei Popescu , Peter Lammich date = 2014-04-22 topic = Computer science/Security abstract = This is a formalization of bounded-deducibility security (BD security), a flexible notion of information-flow security applicable to arbitrary input-output automata. It generalizes Sutherland's classic notion of nondeducibility by factoring in declassification bounds and trigger, whereas nondeducibility states that, in a system, information cannot flow between specified sources and sinks, BD security indicates upper bounds for the flow and triggers under which these upper bounds are no longer guaranteed. notify = uuomul@yahoo.com, lammich@in.tum.de [Network_Security_Policy_Verification] title = Network Security Policy Verification author = Cornelius Diekmann date = 2014-07-04 topic = Computer science/Security abstract = We present a unified theory for verifying network security policies. A security policy is represented as directed graph. To check high-level security goals, security invariants over the policy are expressed. We cover monotonic security invariants, i.e. prohibiting more does not harm security. We provide the following contributions for the security invariant theory.
  • Secure auto-completion of scenario-specific knowledge, which eases usability.
  • Security violations can be repaired by tightening the policy iff the security invariants hold for the deny-all policy.
  • An algorithm to compute a security policy.
  • A formalization of stateful connection semantics in network security mechanisms.
  • An algorithm to compute a secure stateful implementation of a policy.
  • An executable implementation of all the theory.
  • Examples, ranging from an aircraft cabin data network to the analysis of a large real-world firewall.
  • More examples: A fully automated translation of high-level security goals to both firewall and SDN configurations (see Examples/Distributed_WebApp.thy).
For a detailed description, see extra-history = Change history: [2015-04-14]: Added Distributed WebApp example and improved graphviz visualization (revision 4dde08ca2ab8)
notify = diekmann@net.in.tum.de [Abstract_Completeness] title = Abstract Completeness author = Jasmin Christian Blanchette , Andrei Popescu , Dmitriy Traytel date = 2014-04-16 topic = Logic/Proof theory abstract = A formalization of an abstract property of possibly infinite derivation trees (modeled by a codatatype), representing the core of a proof (in Beth/Hintikka style) of the first-order logic completeness theorem, independent of the concrete syntax or inference rules. This work is described in detail in the IJCAR 2014 publication by the authors. The abstract proof can be instantiated for a wide range of Gentzen and tableau systems as well as various flavors of FOL---e.g., with or without predicates, equality, or sorts. Here, we give only a toy example instantiation with classical propositional logic. A more serious instance---many-sorted FOL with equality---is described elsewhere [Blanchette and Popescu, FroCoS 2013]. notify = traytel@in.tum.de [Pop_Refinement] title = Pop-Refinement author = Alessandro Coglio date = 2014-07-03 topic = Computer science/Programming languages/Misc abstract = Pop-refinement is an approach to stepwise refinement, carried out inside an interactive theorem prover by constructing a monotonically decreasing sequence of predicates over deeply embedded target programs. The sequence starts with a predicate that characterizes the possible implementations, and ends with a predicate that characterizes a unique program in explicit syntactic form. Pop-refinement enables more requirements (e.g. program-level and non-functional) to be captured in the initial specification and preserved through refinement. Security requirements expressed as hyperproperties (i.e. predicates over sets of traces) are always preserved by pop-refinement, unlike the popular notion of refinement as trace set inclusion. Two simple examples in Isabelle/HOL are presented, featuring program-level requirements, non-functional requirements, and hyperproperties. notify = coglio@kestrel.edu [VectorSpace] title = Vector Spaces author = Holden Lee date = 2014-08-29 topic = Mathematics/Algebra abstract = This formalisation of basic linear algebra is based completely on locales, building off HOL-Algebra. It includes basic definitions: linear combinations, span, linear independence; linear transformations; interpretation of function spaces as vector spaces; the direct sum of vector spaces, sum of subspaces; the replacement theorem; existence of bases in finite-dimensional; vector spaces, definition of dimension; the rank-nullity theorem. Some concepts are actually defined and proved for modules as they also apply there. Infinite-dimensional vector spaces are supported, but dimension is only supported for finite-dimensional vector spaces. The proofs are standard; the proofs of the replacement theorem and rank-nullity theorem roughly follow the presentation in Linear Algebra by Friedberg, Insel, and Spence. The rank-nullity theorem generalises the existing development in the Archive of Formal Proof (originally using type classes, now using a mix of type classes and locales). notify = holdenl@princeton.edu [Special_Function_Bounds] title = Real-Valued Special Functions: Upper and Lower Bounds author = Lawrence C. Paulson date = 2014-08-29 topic = Mathematics/Analysis abstract = This development proves upper and lower bounds for several familiar real-valued functions. For sin, cos, exp and sqrt, it defines and verifies infinite families of upper and lower bounds, mostly based on Taylor series expansions. 
For arctan, ln and exp, it verifies a finite collection of upper and lower bounds, originally obtained from the functions' continued fraction expansions using the computer algebra system Maple. A common theme in these proofs is to take the difference between a function and its approximation, which should be zero at one point, and then consider the sign of the derivative. The immediate purpose of this development is to verify axioms used by MetiTarski, an automatic theorem prover for real-valued special functions. Crucial to MetiTarski's operation is the provision of upper and lower bounds for each function of interest. notify = lp15@cam.ac.uk [Landau_Symbols] title = Landau Symbols author = Manuel Eberl date = 2015-07-14 topic = Mathematics/Analysis abstract = This entry provides Landau symbols to describe and reason about the asymptotic growth of functions for sufficiently large inputs. A number of simplification procedures are provided for additional convenience: cancelling of dominated terms in sums under a Landau symbol, cancelling of common factors in products, and a decision procedure for Landau expressions containing products of powers of functions like x, ln(x), ln(ln(x)) etc. notify = eberlm@in.tum.de [Error_Function] title = The Error Function author = Manuel Eberl topic = Mathematics/Analysis date = 2018-02-06 notify = eberlm@in.tum.de abstract =

This entry provides the definitions and basic properties of the complex and real error function erf and the complementary error function erfc. Additionally, it gives their full asymptotic expansions.

[Akra_Bazzi] title = The Akra-Bazzi theorem and the Master theorem author = Manuel Eberl date = 2015-07-14 topic = Mathematics/Analysis abstract = This article contains a formalisation of the Akra-Bazzi method based on a proof by Leighton. It is a generalisation of the well-known Master Theorem for analysing the complexity of Divide & Conquer algorithms. We also include a generalised version of the Master theorem based on the Akra-Bazzi theorem, which is easier to apply than the Akra-Bazzi theorem itself.
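For orientation, a back-of-the-envelope sketch of the classic Master theorem in its simplified form for T(n) = a·T(n/b) + Θ(n^k) (this is not the generalised, Akra-Bazzi-based version of the entry; the name masterTheorem is made up here):

  -- Asymptotic growth class of T(n) = a*T(n/b) + Theta(n^k), for the
  -- simplified case where the non-recursive cost is a plain power n^k.
  masterTheorem :: Double -> Double -> Double -> String
  masterTheorem a b k
    | c > k     = "Theta(n^" ++ show c ++ ")"
    | c == k    = "Theta(n^" ++ show k ++ " * log n)"
    | otherwise = "Theta(n^" ++ show k ++ ")"
    where c = logBase b a    -- the critical exponent log_b a

  -- masterTheorem 2 2 1 == "Theta(n^1.0 * log n)"   (e.g. mergesort)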

Some proof methods that facilitate applying the Master theorem are also included. For a more detailed explanation of the formalisation and the proof methods, see the accompanying paper (publication forthcoming). notify = eberlm@in.tum.de [Dirichlet_Series] title = Dirichlet Series author = Manuel Eberl topic = Mathematics/Number theory date = 2017-10-12 notify = eberlm@in.tum.de abstract = This entry is a formalisation of much of Chapters 2, 3, and 11 of Apostol's “Introduction to Analytic Number Theory”. This includes:

  • Definitions and basic properties for several number-theoretic functions (Euler's φ, Möbius μ, Liouville's λ, the divisor function σ, von Mangoldt's Λ)
  • Executable code for most of these functions, the most efficient implementations using the factoring algorithm by Thiemann et al.
  • Dirichlet products and formal Dirichlet series
  • Analytic results connecting convergent formal Dirichlet series to complex functions
  • Euler product expansions
  • Asymptotic estimates of number-theoretic functions including the density of squarefree integers and the average number of divisors of a natural number
These results are useful as a basis for developing more number-theoretic results, such as the Prime Number Theorem. [Gauss_Sums] title = Gauss Sums and the Pólya–Vinogradov Inequality author = Rodrigo Raya , Manuel Eberl topic = Mathematics/Number theory date = 2019-12-10 notify = manuel.eberl@tum.de abstract =

This article provides a full formalisation of Chapter 8 of Apostol's Introduction to Analytic Number Theory. The subjects covered are:

  • periodic arithmetic functions and their finite Fourier series
  • (generalised) Ramanujan sums
  • Gauss sums and separable characters
  • induced moduli and primitive characters
  • the Pólya–Vinogradov inequality
[Zeta_Function] title = The Hurwitz and Riemann ζ Functions author = Manuel Eberl topic = Mathematics/Number theory, Mathematics/Analysis date = 2017-10-12 notify = eberlm@in.tum.de abstract =

This entry builds upon the results about formal and analytic Dirichlet series to define the Hurwitz ζ function ζ(a,s) and, based on that, the Riemann ζ function ζ(s). This is done by first defining them for ℜ(s) > 1 and then successively extending the domain to the left using the Euler–MacLaurin formula.
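
For reference, the defining series on the initial domain, in the entry's argument order ζ(a,s) (a standard definition, stated here only for context):

    $\zeta(a, s) = \sum_{n=0}^{\infty} \frac{1}{(n + a)^{s}} \quad \text{for } \Re(s) > 1, \qquad \zeta(s) = \zeta(1, s)$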

Apart from the most basic facts such as analyticity, the following results are provided:

  • the Stieltjes constants and the Laurent expansion of ζ(s) at s = 1
  • the non-vanishing of ζ(s) for ℜ(s) ≥ 1
  • the relationship between ζ(a,s) and Γ
  • the special values at negative integers and positive even integers
  • Hurwitz's formula and the reflection formula for ζ(s)
  • the Hadjicostas–Chapman formula

The entry also contains Euler's analytic proof of the infinitude of primes, based on the fact that ζ(s) has a pole at s = 1.

[Linear_Recurrences] title = Linear Recurrences author = Manuel Eberl topic = Mathematics/Analysis date = 2017-10-12 notify = eberlm@in.tum.de abstract =

Linear recurrences with constant coefficients are an interesting class of recurrence equations that can be solved explicitly. The most famous example is certainly the Fibonacci numbers with the equation f(n) = f(n-1) + f(n-2) and the quite non-obvious closed form (φⁿ - (-φ)⁻ⁿ) / √5 where φ is the golden ratio.
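
As a quick, informal illustration (a plain Haskell sketch written for this abstract, not the verified solver; the names fib and binet are invented here), the recurrence and the closed form can be compared numerically:

    -- fib: the recurrence f(n) = f(n-1) + f(n-2) with f(0) = 0, f(1) = 1
    fib :: Int -> Integer
    fib n = fibs !! n
      where fibs = 0 : 1 : zipWith (+) fibs (tail fibs)

    -- binet: the closed form (phi^n - (-phi)^(-n)) / sqrt 5
    binet :: Int -> Double
    binet n = (phi ^^ n - (-phi) ^^ (-n)) / sqrt 5
      where phi = (1 + sqrt 5) / 2

    main :: IO ()
    main = mapM_ (\n -> print (fib n, binet n)) [0 .. 10]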

In this work, I build on existing tools in Isabelle – such as formal power series and polynomial factorisation algorithms – to develop a theory of these recurrences and derive a fully executable solver for them that can be exported to programming languages like Haskell.

[Lambert_W] title = The Lambert W Function on the Reals author = Manuel Eberl topic = Mathematics/Analysis date = 2020-04-24 notify = eberlm@in.tum.de abstract =

The Lambert W function is a multi-valued function defined as the inverse function of x ↦ x eˣ. Besides numerous applications in combinatorics, physics, and engineering, it also frequently occurs when solving equations containing both eˣ and x, or both x and log x.

This article provides a definition of the two real-valued branches W₀(x) and W₋₁(x) and proves various properties such as basic identities and inequalities, monotonicity, differentiability, asymptotic expansions, and the MacLaurin series of W₀(x) at x = 0.
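
A rough numerical sketch of the W₀ branch for x ≥ 0 (plain Haskell, Newton iteration on w·exp(w) = x; the function lambertW0 is a name made up here, and this is not part of the formalisation):

    -- approximate the principal branch W_0(x) for x >= 0 by Newton iteration
    -- on f(w) = w * exp w - x, starting from log (1 + x)
    lambertW0 :: Double -> Double
    lambertW0 x = go (log (1 + x)) (50 :: Int)
      where
        go w 0 = w
        go w k =
          let w' = w - (w * exp w - x) / (exp w * (w + 1))
          in if abs (w' - w) < 1e-12 then w' else go w' (k - 1)

    -- sanity check: W_0(x) * exp (W_0(x)) should reproduce x
    main :: IO ()
    main = print [ lambertW0 x * exp (lambertW0 x) | x <- [0, 1, 10, 100] ]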

[Cartan_FP] title = The Cartan Fixed Point Theorems author = Lawrence C. Paulson date = 2016-03-08 topic = Mathematics/Analysis abstract = The Cartan fixed point theorems concern the group of holomorphic automorphisms on a connected open set of ℂⁿ. Ciolli et al. have formalised the one-dimensional case of these theorems in HOL Light. This entry contains their proofs, ported to Isabelle/HOL. Thus it addresses the authors' remark that "it would be important to write a formal proof in a language that can be read by both humans and machines". notify = lp15@cam.ac.uk [Gauss_Jordan] title = Gauss-Jordan Algorithm and Its Applications author = Jose Divasón , Jesús Aransay topic = Computer science/Algorithms/Mathematical date = 2014-09-03 abstract = The Gauss-Jordan algorithm states that any matrix over a field can be transformed by means of elementary row operations to a matrix in reduced row echelon form. The formalization is based on the Rank Nullity Theorem entry of the AFP and on the HOL-Multivariate-Analysis session of Isabelle, where matrices are represented as functions over finite types. We have set up the code generator to make this representation executable. In order to improve the performance, a refinement to immutable arrays has been carried out. We have formalized some of the applications of the Gauss-Jordan algorithm. Thanks to this development, the following facts can be computed over matrices whose elements belong to a field: Ranks, Determinants, Inverses, Bases and dimensions and Solutions of systems of linear equations. Code can be exported to SML and Haskell. notify = jose.divasonm@unirioja.es, jesus-maria.aransay@unirioja.es [Echelon_Form] title = Echelon Form author = Jose Divasón , Jesús Aransay topic = Computer science/Algorithms/Mathematical, Mathematics/Algebra date = 2015-02-12 abstract = We formalize an algorithm to compute the Echelon Form of a matrix. We have proved its existence over Bézout domains and made it executable over Euclidean domains, such as the integer ring and the univariate polynomials over a field. This allows us to compute determinants, inverses and characteristic polynomials of matrices. The work is based on the HOL-Multivariate Analysis library, and on both the Gauss-Jordan and Cayley-Hamilton AFP entries. As a by-product, some algebraic structures have been implemented (principal ideal domains, Bézout domains...). The algorithm has been refined to immutable arrays and code can be generated to functional languages as well. notify = jose.divasonm@unirioja.es, jesus-maria.aransay@unirioja.es [QR_Decomposition] title = QR Decomposition author = Jose Divasón , Jesús Aransay topic = Computer science/Algorithms/Mathematical, Mathematics/Algebra date = 2015-02-12 abstract = QR decomposition is an algorithm to decompose a real matrix A into the product of two other matrices Q and R, where Q is orthogonal and R is invertible and upper triangular. The algorithm is useful for the least squares problem; i.e., the computation of the best approximation of an unsolvable system of linear equations. As a side-product, the Gram-Schmidt process has also been formalized. A refinement using immutable arrays is presented as well. The development relies, among others, on the AFP entry "Implementing field extensions of the form Q[sqrt(b)]" by René Thiemann, which allows execution of the algorithm using symbolic computations. Verified code can be generated and executed using floats as well.
extra-history = Change history: [2015-06-18]: The second part of the Fundamental Theorem of Linear Algebra has been generalized to more general inner product spaces. notify = jose.divasonm@unirioja.es, jesus-maria.aransay@unirioja.es [Hermite] title = Hermite Normal Form author = Jose Divasón , Jesús Aransay topic = Computer science/Algorithms/Mathematical, Mathematics/Algebra date = 2015-07-07 abstract = Hermite Normal Form is a canonical matrix analogue of Reduced Echelon Form, but involving matrices over more general rings. In this work we formalise an algorithm to compute the Hermite Normal Form of a matrix by means of elementary row operations, taking advantage of the Echelon Form AFP entry. We have proven the correctness of such an algorithm and refined it to immutable arrays. Furthermore, we have also formalised the uniqueness of the Hermite Normal Form of a matrix. Code can be exported and some examples of execution involving integer matrices and polynomial matrices are presented as well. notify = jose.divasonm@unirioja.es, jesus-maria.aransay@unirioja.es [Imperative_Insertion_Sort] title = Imperative Insertion Sort author = Christian Sternagel date = 2014-09-25 topic = Computer science/Algorithms abstract = The insertion sort algorithm of Cormen et al. (Introduction to Algorithms) is expressed in Imperative HOL and proved to be correct and terminating. For this purpose we also provide a theory about imperative loop constructs with accompanying induction/invariant rules for proving partial and total correctness. Furthermore, the formalized algorithm is fit for code generation. notify = lp15@cam.ac.uk [Stream_Fusion_Code] title = Stream Fusion in HOL with Code Generation author = Andreas Lochbihler , Alexandra Maximova date = 2014-10-10 topic = Computer science/Functional programming abstract = Stream Fusion is a system for removing intermediate list data structures from functional programs, in particular Haskell. This entry adapts stream fusion to Isabelle/HOL and its code generator. We define stream types for finite and possibly infinite lists and stream versions for most of the fusible list functions in the theories List and Coinductive_List, and prove them correct with respect to the conversion functions between lists and streams. The Stream Fusion transformation itself is implemented as a simproc in the preprocessor of the code generator. [Brian Huffman's AFP entry formalises stream fusion in HOLCF for the domain of lazy lists to prove the GHC compiler rewrite rules correct. In contrast, this work enables Isabelle's code generator to perform stream fusion itself. To that end, it covers both finite and coinductive lists from the HOL library and the Coinductive entry. The fusible list functions require specification and proof principles different from Huffman's.] notify = mail@andreas-lochbihler.de [Case_Labeling] title = Generating Cases from Labeled Subgoals author = Lars Noschinski date = 2015-07-21 topic = Tools, Computer science/Programming languages/Misc abstract = Isabelle/Isar provides named cases to structure proofs. This article contains an implementation of a proof method casify, which can be used to easily extend proof tools with support for named cases. Such a proof tool must produce labeled subgoals, which are then interpreted by casify.

As examples, this work contains verification condition generators producing named cases for three languages: The Hoare language from HOL/Library, a monadic language for computations with failure (inspired by the AutoCorres tool), and a language of conditional expressions. These VCGs are demonstrated by a number of example programs. notify = noschinl@gmail.com [DPT-SAT-Solver] title = A Fast SAT Solver for Isabelle in Standard ML topic = Tools author = Armin Heller <> date = 2009-12-09 abstract = This contribution contains a fast SAT solver for Isabelle written in Standard ML. By loading the theory DPT_SAT_Solver, the SAT solver installs itself (under the name ``dptsat'') and certain Isabelle tools like Refute will start using it automatically. This is a port of the DPT (Decision Procedure Toolkit) SAT Solver written in OCaml. notify = jasmin.blanchette@gmail.com [Rep_Fin_Groups] title = Representations of Finite Groups topic = Mathematics/Algebra author = Jeremy Sylvestre date = 2015-08-12 abstract = We provide a formal framework for the theory of representations of finite groups, as modules over the group ring. Along the way, we develop the general theory of groups (relying on the group_add class for the basics), modules, and vector spaces, to the extent required for theory of group representations. We then provide formal proofs of several important introductory theorems in the subject, including Maschke's theorem, Schur's lemma, and Frobenius reciprocity. We also prove that every irreducible representation is isomorphic to a submodule of the group ring, leading to the fact that for a finite group there are only finitely many isomorphism classes of irreducible representations. In all of this, no restriction is made on the characteristic of the ring or field of scalars until the definition of a group representation, and then the only restriction made is that the characteristic must not divide the order of the group. notify = jsylvest@ualberta.ca [Noninterference_Inductive_Unwinding] title = The Inductive Unwinding Theorem for CSP Noninterference Security topic = Computer science/Security author = Pasquale Noce date = 2015-08-18 abstract =

The necessary and sufficient condition for CSP noninterference security stated by the Ipurge Unwinding Theorem is expressed in terms of a pair of event lists varying over the set of process traces. This does not render it suitable for the subsequent application of rule induction in the case of a process defined inductively, since rule induction may rather be applied to a single variable ranging over an inductively defined set.

Starting from the Ipurge Unwinding Theorem, this paper derives a necessary and sufficient condition for CSP noninterference security that involves a single event list varying over the set of process traces, and is thus suitable for rule induction; hence its name, Inductive Unwinding Theorem. Similarly to the Ipurge Unwinding Theorem, the new theorem only requires considering individual accepted and refused events for each process trace, and applies to the general case of a possibly intransitive noninterference policy. Specific variants of this theorem are additionally proven for deterministic processes and trace set processes.

notify = pasquale.noce.lavoro@gmail.com [Password_Authentication_Protocol] title = Verification of a Diffie-Hellman Password-based Authentication Protocol by Extending the Inductive Method author = Pasquale Noce topic = Computer science/Security date = 2017-01-03 notify = pasquale.noce.lavoro@gmail.com abstract = This paper constructs a formal model of a Diffie-Hellman password-based authentication protocol between a user and a smart card, and proves its security. The protocol provides for the dispatch of the user's password to the smart card on a secure messaging channel established by means of Password Authenticated Connection Establishment (PACE), where the mapping method being used is Chip Authentication Mapping. By applying and suitably extending Paulson's Inductive Method, this paper proves that the protocol establishes trustworthy secure messaging channels, preserves the secrecy of users' passwords, and provides an effective mutual authentication service. What is more, these security properties turn out to hold independently of the secrecy of the PACE authentication key. [Jordan_Normal_Form] title = Matrices, Jordan Normal Forms, and Spectral Radius Theory topic = Mathematics/Algebra author = René Thiemann , Akihisa Yamada contributors = Alexander Bentkamp date = 2015-08-21 abstract =

Matrix interpretations are useful as measure functions in termination proving. In order to use these interpretations also for complexity analysis, the growth rate of matrix powers has to be examined. Here, we formalized a central result of spectral radius theory, namely that the growth rate is polynomially bounded if and only if the spectral radius of a matrix is at most one.
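
In symbols, the characterisation reads roughly as follows (for any fixed matrix norm ‖·‖; an informal restatement, not the Isabelle formulation):

    $\rho(A) = \max\{\, \lvert\lambda\rvert : \lambda \text{ is an eigenvalue of } A \,\}, \qquad \bigl(\exists\, \text{polynomial } p.\ \forall n.\ \lVert A^{n} \rVert \le p(n)\bigr) \iff \rho(A) \le 1$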

To formally prove this result we first studied the growth rates of matrices in Jordan normal form, and proved that every complex matrix has a Jordan normal form, using a constructive proof via Schur decomposition.

The whole development is based on a new abstract type for matrices, which is also executable by a suitable setup of the code generator. It completely subsumes our former AFP-entry on executable matrices, and its main advantage is its close connection to the HMA-representation which allowed us to easily adapt existing proofs on determinants.

All the results have been applied to improve CeTA, our certifier to validate termination and complexity proof certificates.

extra-history = Change history: [2016-01-07]: Added Schur-decomposition, Gram-Schmidt orthogonalization, uniqueness of Jordan normal forms
[2018-04-17]: Integrated lemmas from deep-learning AFP-entry of Alexander Bentkamp notify = rene.thiemann@uibk.ac.at, ayamada@trs.cm.is.nagoya-u.ac.jp [LTL_to_DRA] title = Converting Linear Temporal Logic to Deterministic (Generalized) Rabin Automata topic = Computer science/Automata and formal languages author = Salomon Sickert date = 2015-09-04 abstract = Recently, Javier Esparza and Jan Kretinsky proposed a new method directly translating linear temporal logic (LTL) formulas to deterministic (generalized) Rabin automata. Compared to the existing approaches of constructing a non-deterministic Buechi-automaton in the first step and then applying a determinization procedure (e.g. some variant of Safra's construction) in a second step, this new approach preserves a relation between the formula and the states of the resulting automaton. While the old approach produced a monolithic structure, the new method is compositional. Furthermore, in some cases the resulting automata are much smaller than the automata generated by existing approaches. In order to ensure the correctness of the construction, this entry contains a complete formalisation and verification of the translation. Furthermore, from this basis executable code is generated. extra-history = Change history: [2015-09-23]: Enable code export for the eager unfolding optimisation and reduce running time of the generated tool. Moreover, add support for the mlton SML compiler.
[2016-03-24]: Make use of the LTL entry and include the simplifier. notify = sickert@in.tum.de [Timed_Automata] title = Timed Automata author = Simon Wimmer date = 2016-03-08 topic = Computer science/Automata and formal languages abstract = Timed automata are a widely used formalism for modeling real-time systems, which is employed in a class of successful model checkers such as UPPAAL [LPY97], HyTech [HHWt97] or Kronos [Yov97]. This work formalizes the theory for the subclass of diagonal-free timed automata, which is sufficient to model many interesting problems. We first define the basic concepts and semantics of diagonal-free timed automata. Based on this, we prove two types of decidability results for the language emptiness problem. The first is the classic result of Alur and Dill [AD90, AD94], which uses a finite partitioning of the state space into so-called `regions`. Our second result focuses on an approach based on `Difference Bound Matrices (DBMs)`, which is practically used by model checkers. We prove the correctness of the basic forward analysis operations on DBMs. One of these operations is the Floyd-Warshall algorithm for the all-pairs shortest paths problem. To obtain a finite search space, a widening operation has to be used for this kind of analysis. We use Patricia Bouyer's [Bou04] approach to prove that this widening operation is correct in the sense that DBM-based forward analysis in combination with the widening operation also decides language emptiness. The interesting property of this proof is that the first decidability result is reused to obtain the second one. notify = wimmers@in.tum.de [Parity_Game] title = Positional Determinacy of Parity Games author = Christoph Dittmann date = 2015-11-02 topic = Mathematics/Games and economics, Mathematics/Graph theory abstract = We present a formalization of parity games (a two-player game on directed graphs) and a proof of their positional determinacy in Isabelle/HOL. This proof works for both finite and infinite games. notify = [Ergodic_Theory] title = Ergodic Theory author = Sebastien Gouezel date = 2015-12-01 topic = Mathematics/Probability theory abstract = Ergodic theory is the branch of mathematics that studies the behaviour of measure preserving transformations, in finite or infinite measure. It interacts both with probability theory (mainly through measure theory) and with geometry, as a lot of interesting examples are of geometric origin. We implement the first definitions and theorems of ergodic theory, including notably the Poincaré recurrence theorem for finite measure preserving systems (together with the notion of conservativity in general), induced maps, Kac's theorem, Birkhoff theorem (arguably the most important theorem in ergodic theory), and variations around it such as conservativity of the corresponding skew product, or Atkinson lemma. notify = sebastien.gouezel@univ-rennes1.fr, hoelzl@in.tum.de [Latin_Square] title = Latin Square author = Alexander Bentkamp date = 2015-12-02 topic = Mathematics/Combinatorics abstract = A Latin Square is an n x n table filled with integers from 1 to n where each number appears exactly once in each row and each column. A Latin Rectangle is a partially filled n x n table with r filled rows and n-r empty rows, such that each number appears at most once in each row and each column. The main result of this theory is that any Latin Rectangle can be completed to a Latin Square.
notify = bentkamp@gmail.com [Deep_Learning] title = Expressiveness of Deep Learning author = Alexander Bentkamp date = 2016-11-10 topic = Computer science/Machine learning, Mathematics/Analysis abstract = Deep learning has had a profound impact on computer science in recent years, with applications to search engines, image recognition and language processing, bioinformatics, and more. Recently, Cohen et al. provided theoretical evidence for the superiority of deep learning over shallow learning. This formalization of their work simplifies and generalizes the original proof, while working around the limitations of the Isabelle type system. To support the formalization, I developed reusable libraries of formalized mathematics, including results about the matrix rank, the Lebesgue measure, and multivariate polynomials, as well as a library for tensor analysis. notify = bentkamp@gmail.com [Inductive_Inference] title = Some classical results in inductive inference of recursive functions author = Frank J. Balbach topic = Logic/Computability, Computer science/Machine learning date = 2020-08-31 notify = frank-balbach@gmx.de abstract =

This entry formalizes some classical concepts and results from inductive inference of recursive functions. In the basic setting a partial recursive function ("strategy") must identify ("learn") all functions from a set ("class") of recursive functions. To that end the strategy receives more and more values $f(0), f(1), f(2), \ldots$ of some function $f$ from the given class and in turn outputs descriptions of partial recursive functions, for example, Gödel numbers. The strategy is considered successful if the sequence of outputs ("hypotheses") converges to a description of $f$. A class of functions learnable in this sense is called "learnable in the limit". The set of all these classes is denoted by LIM.

Other types of inference considered are finite learning (FIN), behaviorally correct learning in the limit (BC), and some variants of LIM with restrictions on the hypotheses: total learning (TOTAL), consistent learning (CONS), and class-preserving learning (CP). The main results formalized are the proper inclusions $\mathrm{FIN} \subset \mathrm{CP} \subset \mathrm{TOTAL} \subset \mathrm{CONS} \subset \mathrm{LIM} \subset \mathrm{BC} \subset 2^{\mathcal{R}}$, where $\mathcal{R}$ is the set of all total recursive functions. Further results show that for all these inference types except CONS, strategies can be assumed to be total recursive functions; that all inference types but CP are closed under the subset relation between classes; and that no inference type is closed under the union of classes.

The above is based on a formalization of recursive functions heavily inspired by the Universal Turing Machine entry by Xu et al., but different in that it models partial functions with codomain nat option. The formalization contains a construction of a universal partial recursive function, without resorting to Turing machines, introduces decidability and recursive enumerability, and proves some standard results: existence of a Kleene normal form, the s-m-n theorem, Rice's theorem, and assorted fixed-point theorems (recursion theorems) by Kleene, Rogers, and Smullyan.

[Applicative_Lifting] title = Applicative Lifting author = Andreas Lochbihler , Joshua Schneider <> date = 2015-12-22 topic = Computer science/Functional programming abstract = Applicative functors augment computations with effects by lifting function application to types which model the effects. As the structure of the computation cannot depend on the effects, applicative expressions can be analysed statically. This allows us to lift universally quantified equations to the effectful types, as observed by Hinze. Thus, equational reasoning over effectful computations can be reduced to pure types.

This entry provides a package for registering applicative functors and two proof methods for lifting of equations over applicative functors. The first method normalises applicative expressions according to the laws of applicative functors. This way, equations whose two sides contain the same list of variables can be lifted to every applicative functor.
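
As a small Haskell illustration of the first method's scope (the concrete functor, equation and names are chosen here for exposition and are not taken from the entry): associativity of addition has the same variable list on both sides, so its point-wise lifting holds in any applicative functor, for instance ZipList:

    import Control.Applicative (ZipList(..), liftA2)

    xs, ys, zs :: ZipList Integer
    xs = ZipList [1, 2, 3]
    ys = ZipList [10, 20, 30]
    zs = ZipList [100, 200, 300]

    -- (x + y) + z = x + (y + z), lifted point-wise through the applicative functor
    main :: IO ()
    main = print (getZipList (liftA2 (+) (liftA2 (+) xs ys) zs)
                  == getZipList (liftA2 (+) xs (liftA2 (+) ys zs)))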

To lift larger classes of equations, the second method exploits a number of additional properties (e.g., commutativity of effects) provided the properties have been declared for the concrete applicative functor at hand upon registration.

We declare several types from the Isabelle library as applicative functors and illustrate the use of the methods with two examples: the lifting of the arithmetic type class hierarchy to streams and the verification of a relabelling function on binary trees. We also formalise and verify the normalisation algorithm used by the first proof method.

extra-history = Change history: [2016-03-03]: added formalisation of lifting with combinators
[2016-06-10]: implemented automatic derivation of lifted combinator reductions; support arbitrary lifted relations using relators; improved compatibility with locale interpretation (revision ec336f354f37)
notify = mail@andreas-lochbihler.de [Stern_Brocot] title = The Stern-Brocot Tree author = Peter Gammie , Andreas Lochbihler date = 2015-12-22 topic = Mathematics/Number theory abstract = The Stern-Brocot tree contains all rational numbers exactly once and in their lowest terms. We formalise the Stern-Brocot tree as a coinductive tree using recursive and iterative specifications, which we have proven equivalent, and show that it indeed contains all the numbers as stated. Following Hinze, we prove that the Stern-Brocot tree can be linearised looplessly into Stern's diatomic sequence (also known as Dijkstra's fusc function) and that it is a permutation of the Bird tree.
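
For readers unfamiliar with fusc, a compact Haskell rendering of Dijkstra's function (the standard textbook definition, included only for illustration and not extracted from the entry):

    -- Stern's diatomic sequence / Dijkstra's fusc function
    fusc :: Integer -> Integer
    fusc 0 = 0
    fusc 1 = 1
    fusc n
      | even n    = fusc (n `div` 2)
      | otherwise = fusc (n `div` 2) + fusc (n `div` 2 + 1)

    -- the quotients fusc n / fusc (n + 1) hit every positive rational
    -- exactly once, in lowest terms
    main :: IO ()
    main = print [ (fusc n, fusc (n + 1)) | n <- [1 .. 10] ]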

The reasoning stays at an abstract level by appealing to the uniqueness of solutions of guarded recursive equations and lifting algebraic laws point-wise to trees and streams using applicative functors.

notify = mail@andreas-lochbihler.de [Algebraic_Numbers] title = Algebraic Numbers in Isabelle/HOL topic = Mathematics/Algebra author = René Thiemann , Akihisa Yamada , Sebastiaan Joosten contributors = Manuel Eberl date = 2015-12-22 abstract = Based on existing libraries for matrices, factorization of rational polynomials, and Sturm's theorem, we formalized algebraic numbers in Isabelle/HOL. Our development serves as an implementation for real and complex numbers, and it allows us to compute roots and completely factorize real and complex polynomials, provided that all coefficients are rational numbers. Moreover, we provide two implementations to display algebraic numbers, an injective and expensive one, or a faster but approximative version.

To this end, we mechanized several results on resultants, which also required us to prove that polynomials over a unique factorization domain form again a unique factorization domain.

extra-history = Change history: [2016-01-29]: Split off Polynomial Interpolation and Polynomial Factorization
[2017-04-16]: Use certified Berlekamp-Zassenhaus factorization, use subresultant algorithm for computing resultants, improved bisection algorithm notify = rene.thiemann@uibk.ac.at, ayamada@trs.cm.is.nagoya-u.ac.jp, sebastiaan.joosten@uibk.ac.at [Polynomial_Interpolation] title = Polynomial Interpolation topic = Mathematics/Algebra author = René Thiemann , Akihisa Yamada date = 2016-01-29 abstract = We formalized three algorithms for polynomial interpolation over arbitrary fields: Lagrange's explicit expression, the recursive algorithm of Neville and Aitken, and the Newton interpolation in combination with an efficient implementation of divided differences. Variants of these algorithms for integer polynomials are also available, where sometimes the interpolation can fail; e.g., there is no linear integer polynomial p such that p(0) = 0 and p(2) = 1. Moreover, for the Newton interpolation for integer polynomials, we proved that all intermediate results that are computed during the algorithm must be integers. This admits an early failure detection in the implementation. Finally, we proved the uniqueness of polynomial interpolation.

The development also contains improved code equations to speed up the division of integers in target languages. notify = rene.thiemann@uibk.ac.at, ayamada@trs.cm.is.nagoya-u.ac.jp [Polynomial_Factorization] title = Polynomial Factorization topic = Mathematics/Algebra author = René Thiemann , Akihisa Yamada date = 2016-01-29 abstract = Based on existing libraries for polynomial interpolation and matrices, we formalized several factorization algorithms for polynomials, including Kronecker's algorithm for integer polynomials, Yun's square-free factorization algorithm for field polynomials, and Berlekamp's algorithm for polynomials over finite fields. By combining the last one with Hensel's lifting, we derive an efficient factorization algorithm for the integer polynomials, which is then lifted for rational polynomials by mechanizing Gauss' lemma. Finally, we assembled a combined factorization algorithm for rational polynomials, which combines all the mentioned algorithms and additionally uses the explicit formula for roots of quadratic polynomials and a rational root test.

As side products, we developed division algorithms for polynomials over integral domains, as well as primality-testing and prime-factorization algorithms for integers. notify = rene.thiemann@uibk.ac.at, ayamada@trs.cm.is.nagoya-u.ac.jp [Perron_Frobenius] title = Perron-Frobenius Theorem for Spectral Radius Analysis author = Jose Divasón , Ondřej Kunčar , René Thiemann , Akihisa Yamada notify = rene.thiemann@uibk.ac.at date = 2016-05-20 topic = Mathematics/Algebra abstract =

The spectral radius of a matrix A is the maximum norm of all eigenvalues of A. In previous work we already formalized that for a complex matrix A, the values in Aⁿ grow polynomially in n if and only if the spectral radius is at most one. One problem with the above characterization is the determination of all complex eigenvalues. In case A contains only non-negative real values, a simplification is possible with the help of the Perron–Frobenius theorem, which tells us that it suffices to consider only the real eigenvalues of A, i.e., applying Sturm's method can decide the polynomial growth of Aⁿ.

We formalize the Perron–Frobenius theorem based on a proof via Brouwer's fixpoint theorem, which is available in the HOL multivariate analysis (HMA) library. Since the results on the spectral radius are based on matrices in the Jordan normal form (JNF) library, we further develop a connection which allows us to easily transfer theorems between HMA and JNF. With this connection we derive the combined result: if A is a non-negative real matrix, and no real eigenvalue of A is strictly larger than one, then Aⁿ is polynomially bounded in n.

extra-history = Change history: [2017-10-18]: added Perron-Frobenius theorem for irreducible matrices with generalization (revision bda1f1ce8a1c)
[2018-05-17]: prove conjecture of CPP'18 paper: Jordan blocks of spectral radius have maximum size (revision ffdb3794e5d5) [Stochastic_Matrices] title = Stochastic Matrices and the Perron-Frobenius Theorem author = René Thiemann topic = Mathematics/Algebra, Computer science/Automata and formal languages date = 2017-11-22 notify = rene.thiemann@uibk.ac.at abstract = Stochastic matrices are a convenient way to model discrete-time and finite state Markov chains. The Perron–Frobenius theorem tells us something about the existence and uniqueness of non-negative eigenvectors of a stochastic matrix. In this entry, we formalize stochastic matrices, link the formalization to the existing AFP-entry on Markov chains, and apply the Perron–Frobenius theorem to prove that stationary distributions always exist, and they are unique if the stochastic matrix is irreducible. [Formal_SSA] title = Verified Construction of Static Single Assignment Form author = Sebastian Ullrich , Denis Lohner date = 2016-02-05 topic = Computer science/Algorithms, Computer science/Programming languages/Transformations abstract =

We define a functional variant of the static single assignment (SSA) form construction algorithm described by Braun et al., which combines simplicity and efficiency. The definition is based on a general, abstract control flow graph representation using Isabelle locales.

We prove that the algorithm's output is semantically equivalent to the input according to a small-step semantics, and that it is in minimal SSA form for the common special case of reducible inputs. We then show the satisfiability of the locale assumptions by giving instantiations for a simple While language.

Furthermore, we use a generic instantiation based on typedefs in order to extract OCaml code and replace the unverified SSA construction algorithm of the CompCertSSA project with it.

A more detailed description of the verified SSA construction can be found in the paper Verified Construction of Static Single Assignment Form, CC 2016.

notify = denis.lohner@kit.edu [Minimal_SSA] title = Minimal Static Single Assignment Form author = Max Wagner , Denis Lohner topic = Computer science/Programming languages/Transformations date = 2017-01-17 notify = denis.lohner@kit.edu abstract =

This formalization is an extension to "Verified Construction of Static Single Assignment Form". In their work, the authors have shown that Braun et al.'s static single assignment (SSA) construction algorithm produces minimal SSA form for input programs with a reducible control flow graph (CFG). However, Braun et al. also proposed an extension to their algorithm that they claim produces minimal SSA form even for irreducible CFGs.
In this formalization we support that claim by giving a mechanized proof.

As the extension of Braun et al.'s algorithm aims at removing so-called redundant strongly connected components of phi functions, we show that this suffices to guarantee minimality according to Cytron et al.

[PropResPI] title = Propositional Resolution and Prime Implicates Generation author = Nicolas Peltier notify = Nicolas.Peltier@imag.fr date = 2016-03-11 topic = Logic/General logic/Mechanization of proofs abstract = We provide formal proofs in Isabelle-HOL (using mostly structured Isar proofs) of the soundness and completeness of the Resolution rule in propositional logic. The completeness proofs take into account the usual redundancy elimination rules (tautology elimination and subsumption), and several refinements of the Resolution rule are considered: ordered resolution (with selection functions), positive and negative resolution, semantic resolution and unit resolution (the latter refinement is complete only for clause sets that are Horn-renamable). We also define a concrete procedure for computing saturated sets and establish its soundness and completeness. The clause sets are not assumed to be finite, so that the results can be applied to formulas obtained by grounding sets of first-order clauses (however, a total ordering among atoms is assumed to be given). Next, we show that the unrestricted Resolution rule is deductive-complete, in the sense that it is able to generate all (prime) implicates of any set of propositional clauses (i.e., all entailment-minimal, non-valid, clausal consequences of the considered set). The generation of prime implicates is an important problem, with many applications in artificial intelligence and verification (for abductive reasoning, knowledge compilation, diagnosis, debugging etc.). We also show that implicates can be computed in an incremental way, by fixing an ordering among all the atoms in the considered sets and resolving upon these atoms one by one in the considered order (with no backtracking). This feature is critical for the efficient computation of prime implicates. Building on these results, we provide a procedure for computing such implicates and establish its soundness and completeness. [SuperCalc] title = A Variant of the Superposition Calculus author = Nicolas Peltier notify = Nicolas.Peltier@imag.fr date = 2016-09-06 topic = Logic/Proof theory abstract = We provide a formalization of a variant of the superposition calculus, together with formal proofs of soundness and refutational completeness (w.r.t. the usual redundancy criteria based on clause ordering). This version of the calculus uses all the standard restrictions of the superposition rules, together with the following refinement, inspired by the basic superposition calculus: each clause is associated with a set of terms which are assumed to be in normal form -- thus any application of the replacement rule on these terms is blocked. The set is initially empty and terms may be added or removed at each inference step. The set of terms that are assumed to be in normal form includes any term introduced by previous unifiers as well as any term occurring in the parent clauses at a position that is smaller (according to some given ordering on positions) than a previously replaced term. The standard superposition calculus corresponds to the case where the set of irreducible terms is always empty. [Nominal2] title = Nominal 2 author = Christian Urban , Stefan Berghofer , Cezary Kaliszyk date = 2013-02-21 topic = Tools abstract =

Dealing with binders, renaming of bound variables, capture-avoiding substitution, etc., is very often a major problem in formal proofs, especially in proofs by structural and rule induction. Nominal Isabelle is designed to make such proofs easy to formalise: it provides an infrastructure for declaring nominal datatypes (that is, alpha-equivalence classes) and for defining functions over them by structural recursion. It also provides induction principles that have Barendregt’s variable convention already built in.

This entry can be used as a more advanced replacement for HOL/Nominal in the Isabelle distribution.

notify = christian.urban@kcl.ac.uk [First_Welfare_Theorem] title = Microeconomics and the First Welfare Theorem author = Julian Parsert , Cezary Kaliszyk topic = Mathematics/Games and economics license = LGPL date = 2017-09-01 notify = julian.parsert@uibk.ac.at, cezary.kaliszyk@uibk.ac.at abstract = Economic activity has always been a fundamental part of society. Due to modern day politics, economic theory has gained even more influence on our lives. Thus we want models and theories to be as precise as possible. This can be achieved using certification with the help of formal proof technology. Hence we will use Isabelle/HOL to construct two economic models, that of the pure exchange economy and a version of the Arrow-Debreu Model. We will prove that the First Theorem of Welfare Economics holds within both. The theorem is the mathematical formulation of Adam Smith's famous invisible hand and states that a group of self-interested and rational actors will eventually achieve an efficient allocation of goods and services. extra-history = Change history: [2018-06-17]: Added some lemmas and a theory file, also introduced Microeconomics folder.
[Noninterference_Sequential_Composition] title = Conservation of CSP Noninterference Security under Sequential Composition author = Pasquale Noce date = 2016-04-26 topic = Computer science/Security, Computer science/Concurrency/Process calculi abstract =

In his outstanding work on Communicating Sequential Processes, Hoare has defined two fundamental binary operations allowing one to compose the input processes into another, typically more complex, process: sequential composition and concurrent composition. In particular, the output of the former operation is a process that initially behaves like the first operand, and then like the second operand once the execution of the first one has terminated successfully, as long as it does.

This paper formalizes Hoare's definition of sequential composition and proves, in the general case of a possibly intransitive policy, that CSP noninterference security is conserved under this operation, provided that successful termination cannot be affected by confidential events and cannot occur as an alternative to other events in the traces of the first operand. Both of these assumptions are shown, by means of counterexamples, to be necessary for the theorem to hold.

notify = pasquale.noce.lavoro@gmail.com [Noninterference_Concurrent_Composition] title = Conservation of CSP Noninterference Security under Concurrent Composition author = Pasquale Noce notify = pasquale.noce.lavoro@gmail.com date = 2016-06-13 topic = Computer science/Security, Computer science/Concurrency/Process calculi abstract =

In his outstanding work on Communicating Sequential Processes, Hoare has defined two fundamental binary operations allowing one to compose the input processes into another, typically more complex, process: sequential composition and concurrent composition. In particular, the output of the latter operation is a process in which any event not shared by both operands can occur whenever the operand that admits the event can engage in it, whereas any event shared by both operands can occur just in case both can engage in it.

This paper formalizes Hoare's definition of concurrent composition and proves, in the general case of a possibly intransitive policy, that CSP noninterference security is conserved under this operation. This result, along with the previous analogous one concerning sequential composition, enables the construction of more and more complex processes enforcing noninterference security by composing, sequentially or concurrently, simpler secure processes, whose security can in turn be proven using either the definition of security, or unwinding theorems.

[ROBDD] title = Algorithms for Reduced Ordered Binary Decision Diagrams author = Julius Michaelis , Maximilian Haslbeck , Peter Lammich , Lars Hupel date = 2016-04-27 topic = Computer science/Algorithms, Computer science/Data structures abstract = We present a verified and executable implementation of ROBDDs in Isabelle/HOL. Our implementation relates pointer-based computation in the Heap monad to operations on an abstract definition of boolean functions. Internally, we implemented the if-then-else combinator in a recursive fashion, following the Shannon decomposition of the argument functions. The implementation mixes and adapts known techniques and is built with efficiency in mind. notify = bdd@liftm.de, haslbecm@in.tum.de [No_FTL_observers] title = No Faster-Than-Light Observers author = Mike Stannett , István Németi date = 2016-04-28 topic = Mathematics/Physics abstract = We provide a formal proof within First Order Relativity Theory that no observer can travel faster than the speed of light. Originally reported in Stannett & Németi (2014) "Using Isabelle/HOL to verify first-order relativity theory", Journal of Automated Reasoning 52(4), pp. 361-378. notify = m.stannett@sheffield.ac.uk [Groebner_Bases] title = Gröbner Bases Theory author = Fabian Immler , Alexander Maletzky date = 2016-05-02 topic = Mathematics/Algebra, Computer science/Algorithms/Mathematical abstract = This formalization is concerned with the theory of Gröbner bases in (commutative) multivariate polynomial rings over fields, originally developed by Buchberger in his 1965 PhD thesis. Apart from the statement and proof of the main theorem of the theory, the formalization also implements Buchberger's algorithm for actually computing Gröbner bases as a tail-recursive function, thus allowing to effectively decide ideal membership in finitely generated polynomial ideals. Furthermore, all functions can be executed on a concrete representation of multivariate polynomials as association lists. extra-history = Change history: [2019-04-18]: Specialized Gröbner bases to less abstract representation of polynomials, where power-products are represented as polynomial mappings.
notify = alexander.maletzky@risc.jku.at [Nullstellensatz] title = Hilbert's Nullstellensatz author = Alexander Maletzky topic = Mathematics/Algebra, Mathematics/Geometry date = 2019-06-16 notify = alexander.maletzky@risc-software.at abstract = This entry formalizes Hilbert's Nullstellensatz, an important theorem in algebraic geometry that can be viewed as the generalization of the Fundamental Theorem of Algebra to multivariate polynomials: If a set of (multivariate) polynomials over an algebraically closed field has no common zero, then the ideal it generates is the entire polynomial ring. The formalization proves several equivalent versions of this celebrated theorem: the weak Nullstellensatz, the strong Nullstellensatz (connecting algebraic varieties and radical ideals), and the field-theoretic Nullstellensatz. The formalization follows Chapter 4.1. of Ideals, Varieties, and Algorithms by Cox, Little and O'Shea. [Bell_Numbers_Spivey] title = Spivey's Generalized Recurrence for Bell Numbers author = Lukas Bulwahn date = 2016-05-04 topic = Mathematics/Combinatorics abstract = This entry defines the Bell numbers as the cardinality of set partitions for a carrier set of given size, and derives Spivey's generalized recurrence relation for Bell numbers following his elegant and intuitive combinatorial proof.

As the set construction for the combinatorial proof requires construction of three intermediate structures, the main difficulty of the formalization is handling the overall combinatorial argument in a structured way. The introduced proof structure allows us to compose the combinatorial argument from its subparts, and helps to keep track of how the detailed proof steps are related to the overall argument. To obtain this structure, this entry uses set monad notation for the set construction's definition, introduces suitable predicates and rules, and follows a repeating structure in its Isar proof. notify = lukas.bulwahn@gmail.com [Randomised_Social_Choice] title = Randomised Social Choice Theory author = Manuel Eberl date = 2016-05-05 topic = Mathematics/Games and economics abstract = This work contains a formalisation of basic Randomised Social Choice, including Stochastic Dominance and Social Decision Schemes (SDSs) along with some of their most important properties (Anonymity, Neutrality, ex-post- and SD-Efficiency, SD-Strategy-Proofness) and two particular SDSs – Random Dictatorship and Random Serial Dictatorship (with proofs of the properties that they satisfy). Many important properties of these concepts are also proven – such as the two equivalent characterisations of Stochastic Dominance and the fact that SD-efficiency of a lottery only depends on the support. The entry also provides convenient commands to define Preference Profiles, prove their well-formedness, and automatically derive restrictions that sufficiently nice SDSs need to satisfy on the defined profiles. Currently, the formalisation focuses on weak preferences and Stochastic Dominance, but it should be easy to extend it to other domains – such as strict preferences – or other lottery extensions – such as Bilinear Dominance or Pairwise Comparison. notify = eberlm@in.tum.de [SDS_Impossibility] title = The Incompatibility of SD-Efficiency and SD-Strategy-Proofness author = Manuel Eberl date = 2016-05-04 topic = Mathematics/Games and economics abstract = This formalisation contains the proof that there is no anonymous and neutral Social Decision Scheme for at least four voters and alternatives that fulfils both SD-Efficiency and SD-Strategy-Proofness. The proof is a fully structured and quasi-human-readable one. It was derived from the (unstructured) SMT proof of the case for exactly four voters and alternatives by Brandl et al. Their proof relies on an unverified translation of the original problem to SMT, and the proof that lifts the argument for exactly four voters and alternatives to the general case is also not machine-checked. In this Isabelle proof, on the other hand, all of these steps are fully proven and machine-checked. This is particularly important seeing as a previously published informal proof of a weaker statement contained a mistake in precisely this lifting step. notify = eberlm@in.tum.de [Median_Of_Medians_Selection] title = The Median-of-Medians Selection Algorithm author = Manuel Eberl topic = Computer science/Algorithms date = 2017-12-21 notify = eberlm@in.tum.de abstract =

This entry provides an executable functional implementation of the Median-of-Medians algorithm for selecting the k-th smallest element of an unsorted list deterministically in linear time. The size bounds for the recursive call that lead to the linear upper bound on the run-time of the algorithm are also proven.
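
A rough functional sketch of the selection scheme in plain Haskell (illustrative only; the function names are invented here and this is not the verified Isabelle implementation):

    import Data.List (sort)

    -- select k xs: the k-th smallest element (0-indexed) of xs, assuming 0 <= k < length xs
    select :: Ord a => Int -> [a] -> a
    select k xs
      | length xs <= 20 = sort xs !! k
      | k < nLess       = select k less
      | k < nLess + nEq = pivot
      | otherwise       = select (k - nLess - nEq) greater
      where
        pivot   = medianOfMedians xs
        less    = [ x | x <- xs, x < pivot ]
        equal   = [ x | x <- xs, x == pivot ]
        greater = [ x | x <- xs, x > pivot ]
        nLess   = length less
        nEq     = length equal

    -- pivot choice: median of the medians of groups of five elements
    medianOfMedians :: Ord a => [a] -> a
    medianOfMedians xs = select (length ms `div` 2) ms
      where
        ms = [ sort g !! (length g `div` 2) | g <- chunks xs ]
        chunks [] = []
        chunks ys = take 5 ys : chunks (drop 5 ys)

    main :: IO ()
    main = print (select 11 vals == sort vals !! 11)
      where vals = [ (i * 37) `mod` 101 | i <- [1 .. 60 :: Int] ]

The guarantee that the recursive calls shrink fast enough for a linear running time is exactly the size-bound argument mentioned above.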

[Mason_Stothers] title = The Mason–Stothers Theorem author = Manuel Eberl topic = Mathematics/Algebra date = 2017-12-21 notify = eberlm@in.tum.de abstract =

This article provides a formalisation of Snyder’s simple and elegant proof of the Mason–Stothers theorem, which is the polynomial analogue of the famous abc Conjecture for integers. Remarkably, Snyder found this very elegant proof when he was still a high-school student.

In short, the statement of the theorem is that three non-zero coprime polynomials A, B, C over a field which sum to 0 and do not all have vanishing derivatives fulfil max{deg(A), deg(B), deg(C)} < deg(rad(ABC)), where rad(P) denotes the radical of P, i.e. the product of all unique irreducible factors of P.

This theorem also implies a kind of polynomial analogue of Fermat’s Last Theorem for polynomials: except for trivial cases, Aⁿ + Bⁿ + Cⁿ = 0 implies n ≤ 2 for coprime polynomials A, B, C over a field.

[FLP] title = A Constructive Proof for FLP author = Benjamin Bisping , Paul-David Brodmann , Tim Jungnickel , Christina Rickmann , Henning Seidler , Anke Stüber , Arno Wilhelm-Weidner , Kirstin Peters , Uwe Nestmann date = 2016-05-18 topic = Computer science/Concurrency abstract = The impossibility of distributed consensus with one faulty process is a result with important consequences for real world distributed systems e.g., commits in replicated databases. Since proofs are not immune to faults and even plausible proofs with a profound formalism can conclude wrong results, we validate the fundamental result named FLP after Fischer, Lynch and Paterson. We present a formalization of distributed systems and the aforementioned consensus problem. Our proof is based on Hagen Völzer's paper "A constructive proof for FLP". In addition to the enhanced confidence in the validity of Völzer's proof, we contribute the missing gaps to show the correctness in Isabelle/HOL. We clarify the proof details and even prove fairness of the infinite execution that contradicts consensus. Our Isabelle formalization can also be reused for further proofs of properties of distributed systems. notify = henning.seidler@mailbox.tu-berlin.de [IMAP-CRDT] title = The IMAP CmRDT author = Tim Jungnickel , Lennart Oldenburg <>, Matthias Loibl <> topic = Computer science/Algorithms/Distributed, Computer science/Data structures date = 2017-11-09 notify = tim.jungnickel@tu-berlin.de abstract = We provide our Isabelle/HOL formalization of a Conflict-free Replicated Datatype for Internet Message Access Protocol commands. We show that Strong Eventual Consistency (SEC) is guaranteed by proving the commutativity of concurrent operations. We base our formalization on the recently proposed "framework for establishing Strong Eventual Consistency for Conflict-free Replicated Datatypes" (AFP.CRDT) from Gomes et al. Hence, we provide an additional example of how the recently proposed framework can be used to design and prove CRDTs. [Incredible_Proof_Machine] title = The meta theory of the Incredible Proof Machine author = Joachim Breitner , Denis Lohner date = 2016-05-20 topic = Logic/Proof theory abstract = The Incredible Proof Machine is an interactive visual theorem prover which represents proofs as port graphs. We model this proof representation in Isabelle, and prove that it is just as powerful as natural deduction. notify = mail@joachim-breitner.de [Word_Lib] title = Finite Machine Word Library author = Joel Beeren<>, Matthew Fernandez<>, Xin Gao<>, Gerwin Klein , Rafal Kolanski<>, Japheth Lim<>, Corey Lewis<>, Daniel Matichuk<>, Thomas Sewell<> notify = kleing@unsw.edu.au date = 2016-06-09 topic = Computer science/Data structures abstract = This entry contains an extension to the Isabelle library for fixed-width machine words. In particular, the entry adds quickcheck setup for words, printing as hexadecimals, additional operations, reasoning about alignment, signed words, enumerations of words, normalisation of word numerals, and an extensive library of properties about generic fixed-width words, as well as an instantiation of many of these to the commonly used 32 and 64-bit bases. [Catalan_Numbers] title = Catalan Numbers author = Manuel Eberl notify = eberlm@in.tum.de date = 2016-06-21 topic = Mathematics/Combinatorics abstract =

In this work, we define the Catalan numbers Cₙ and prove several equivalent definitions (including some closed-form formulae). We also show one of their applications (counting the number of binary trees of size n), prove the asymptotic growth approximation Cₙ ∼ 4ⁿ / (√π · n^1.5), and provide reasonably efficient executable code to compute them.
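
A small Haskell sketch relating a standard closed form to the asymptotic estimate (illustrative only, not the entry's generated code; catalan and approx are names made up here):

    -- C_n = (2n choose n) / (n + 1)
    catalan :: Integer -> Integer
    catalan n = binom (2 * n) n `div` (n + 1)
      where binom m k = product [m - k + 1 .. m] `div` product [1 .. k]

    -- the approximation 4^n / (sqrt pi * n^1.5)
    approx :: Integer -> Double
    approx n = 4 ** fromInteger n / (sqrt pi * fromInteger n ** 1.5)

    main :: IO ()
    main = mapM_ (\n -> print (catalan n, approx n)) [1, 5, 10, 20]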

The derivation of the closed-form formulae uses algebraic manipulations of the ordinary generating function of the Catalan numbers, and the asymptotic approximation is then done using generalised binomial coefficients and the Gamma function. Thanks to these highly non-elementary mathematical tools, the proofs are very short and simple.

[Fisher_Yates] title = Fisher–Yates shuffle author = Manuel Eberl notify = eberlm@in.tum.de date = 2016-09-30 topic = Computer science/Algorithms abstract =

This work defines and proves the correctness of the Fisher–Yates algorithm for shuffling – i.e. producing a random permutation – of a list. The algorithm proceeds by traversing the list and in each step swapping the current element with a random element from the remaining list.
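
The step described above looks roughly as follows on immutable lists (a plain Haskell sketch of the idea using the random package, not the formalised algorithm; shuffle is a name invented here):

    import System.Random (randomRIO)

    -- repeatedly pick a uniformly random element of the remaining list,
    -- emit it, and recurse on what is left
    shuffle :: [a] -> IO [a]
    shuffle [] = return []
    shuffle xs = do
      i <- randomRIO (0, length xs - 1)
      let (front, x : back) = splitAt i xs
      rest <- shuffle (front ++ back)
      return (x : rest)

    main :: IO ()
    main = shuffle [1 .. 10 :: Int] >>= print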

[Bertrands_Postulate] title = Bertrand's postulate author = Julian Biendarra<>, Manuel Eberl contributors = Lawrence C. Paulson topic = Mathematics/Number theory date = 2017-01-17 notify = eberlm@in.tum.de abstract =

Bertrand's postulate is an early result on the distribution of prime numbers: For every integer n > 1, there exists a prime number that lies strictly between n and 2n. The proof is ported from John Harrison's formalisation in HOL Light. It proceeds by first showing that the property is true for all n greater than or equal to 600 and then showing that it also holds for all n below 600 by case distinction.
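
A naive computational check of the finitely many cases handled by the case distinction (1 < n ≤ 600), as a plain Haskell sketch unrelated to the formal proof:

    -- trial-division primality test, adequate for this small range
    isPrime :: Int -> Bool
    isPrime n = n > 1 && all (\d -> n `mod` d /= 0) [2 .. isqrt]
      where isqrt = floor (sqrt (fromIntegral n :: Double))

    -- for every n with 1 < n <= 600 there is a prime strictly between n and 2n
    main :: IO ()
    main = print (all (\n -> any isPrime [n + 1 .. 2 * n - 1]) [2 .. 600])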

[Rewriting_Z] title = The Z Property author = Bertram Felgenhauer<>, Julian Nagele<>, Vincent van Oostrom<>, Christian Sternagel notify = bertram.felgenhauer@uibk.ac.at, julian.nagele@uibk.ac.at, c.sternagel@gmail.com date = 2016-06-30 topic = Logic/Rewriting abstract = We formalize the Z property introduced by Dehornoy and van Oostrom. First we show that for any abstract rewrite system, Z implies confluence. Then we give two examples of proofs using Z: confluence of lambda-calculus with respect to beta-reduction and confluence of combinatory logic. [Resolution_FOL] title = The Resolution Calculus for First-Order Logic author = Anders Schlichtkrull notify = andschl@dtu.dk date = 2016-06-30 topic = Logic/General logic/Mechanization of proofs abstract = This theory is a formalization of the resolution calculus for first-order logic. It is proven sound and complete. The soundness proof uses the substitution lemma, which shows a correspondence between substitutions and updates to an environment. The completeness proof uses semantic trees, i.e. trees whose paths are partial Herbrand interpretations. It employs Herbrand's theorem in a formulation which states that an unsatisfiable set of clauses has a finite closed semantic tree. It also uses the lifting lemma which lifts resolution derivation steps from the ground world up to the first-order world. The theory is presented in a paper in the Journal of Automated Reasoning [Sch18] which extends a paper presented at the International Conference on Interactive Theorem Proving [Sch16]. An earlier version was presented in an MSc thesis [Sch15]. The formalization mostly follows textbooks by Ben-Ari [BA12], Chang and Lee [CL73], and Leitsch [Lei97]. The theory is part of the IsaFoL project [IsaFoL].

[Sch18] Anders Schlichtkrull. "Formalization of the Resolution Calculus for First-Order Logic". Journal of Automated Reasoning, 2018.
[Sch16] Anders Schlichtkrull. "Formalization of the Resolution Calculus for First-Order Logic". In: ITP 2016. Vol. 9807. LNCS. Springer, 2016.
[Sch15] Anders Schlichtkrull. "Formalization of Resolution Calculus in Isabelle". https://people.compute.dtu.dk/andschl/Thesis.pdf. MSc thesis. Technical University of Denmark, 2015.
[BA12] Mordechai Ben-Ari. Mathematical Logic for Computer Science. 3rd ed. Springer, 2012.
[CL73] Chin-Liang Chang and Richard Char-Tung Lee. Symbolic Logic and Mechanical Theorem Proving. 1st ed. Academic Press, Inc., 1973.
[Lei97] Alexander Leitsch. The Resolution Calculus. Texts in theoretical computer science. Springer, 1997.
[IsaFoL] IsaFoL authors. IsaFoL: Isabelle Formalization of Logic. https://bitbucket.org/jasmin_blanchette/isafol. extra-history = Change history: [2018-01-24]: added several new versions of the soundness and completeness theorems as described in the paper [Sch18].
[2018-03-20]: added a concrete instance of the unification and completeness theorems using the First-Order Terms AFP-entry from IsaFoR as described in the papers [Sch16] and [Sch18]. [Surprise_Paradox] title = Surprise Paradox author = Joachim Breitner notify = mail@joachim-breitner.de date = 2016-07-17 topic = Logic/Proof theory abstract = In 1964, Fitch showed that the paradox of the surprise hanging can be resolved by showing that the judge’s verdict is inconsistent. His formalization builds on Gödel’s coding of provability. In this theory, we reproduce his proof in Isabelle, building on Paulson’s formalisation of Gödel’s incompleteness theorems. [Ptolemys_Theorem] title = Ptolemy's Theorem author = Lukas Bulwahn notify = lukas.bulwahn@gmail.com date = 2016-08-07 topic = Mathematics/Geometry abstract = This entry provides an analytic proof of Ptolemy's Theorem using polar form transformation and trigonometric identities. In this formalization, we use ideas from John Harrison's HOL Light formalization and the proof sketch on the Wikipedia entry of Ptolemy's Theorem. This theorem is the 95th theorem of the Top 100 Theorems list. [Falling_Factorial_Sum] title = The Falling Factorial of a Sum author = Lukas Bulwahn topic = Mathematics/Combinatorics date = 2017-12-22 notify = lukas.bulwahn@gmail.com abstract = This entry shows that the falling factorial of a sum can be computed with an expression using binomial coefficients and the falling factorial of its summands. The entry provides three different proofs: a combinatorial proof, an induction proof and an algebraic proof using the Vandermonde identity. The three formalizations try to follow their informal presentations from a Mathematics Stack Exchange page as closely as possible. The induction and algebraic formalizations end up being very close to their informal presentations, whereas the combinatorial proof first requires the introduction of list interleavings, and significantly more detail than its informal presentation. [InfPathElimination] title = Infeasible Paths Elimination by Symbolic Execution Techniques: Proof of Correctness and Preservation of Paths author = Romain Aissat<>, Frederic Voisin<>, Burkhart Wolff notify = wolff@lri.fr date = 2016-08-18 topic = Computer science/Programming languages/Static analysis abstract = TRACER is a tool for verifying safety properties of sequential C programs. TRACER attempts to build a finite symbolic execution graph which over-approximates the set of all concrete reachable states and the set of feasible paths. We present an abstract framework for TRACER and similar CEGAR-like systems. The framework provides 1) a graph-transformation-based method for reducing the feasible paths in control-flow graphs, 2) a model for symbolic execution, subsumption, predicate abstraction and invariant generation. In this framework we formally prove two key properties: correct construction of the symbolic states and preservation of feasible paths. The framework focuses on core operations, leaving it to concrete prototypes to “fit in” heuristics for combining them. The accompanying paper (published in ITP 2016) can be found at https://www.lri.fr/∼wolff/papers/conf/2016-itp-InfPathsNSE.pdf. [Stirling_Formula] title = Stirling's formula author = Manuel Eberl notify = eberlm@in.tum.de date = 2016-09-01 topic = Mathematics/Analysis abstract =

This work contains a proof of Stirling's formula both for the factorial $n! \sim \sqrt{2\pi n} (n/e)^n$ on natural numbers and the real Gamma function $\Gamma(x)\sim \sqrt{2\pi/x} (x/e)^x$. The proof is based on work by Graham Jameson.

This is then extended to the full asymptotic expansion $$\log\Gamma(z) = \big(z - \tfrac{1}{2}\big)\log z - z + \tfrac{1}{2}\log(2\pi) + \sum_{k=1}^{n-1} \frac{B_{k+1}}{k(k+1)} z^{-k}\\ {} - \frac{1}{n} \int_0^\infty B_n([t])(t + z)^{-n}\,\text{d}t$$ uniformly for all complex $z\neq 0$ in the cone $|\text{arg}(z)|\leq \alpha$ for any $\alpha\in(0,\pi)$, with which the above asymptotic relation for Γ is also extended to complex arguments.
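As an informal numerical sanity check (independent of the formal development), the factorial version $n! \sim \sqrt{2\pi n} (n/e)^n$ can be observed directly in Python:

  from math import factorial, sqrt, pi, e

  def stirling(n):
      # leading-order Stirling approximation: n! ~ sqrt(2*pi*n) * (n/e)^n
      return sqrt(2 * pi * n) * (n / e) ** n

  for n in [5, 10, 20, 50]:
      # the ratio tends to 1 (more precisely, it behaves like 1 + 1/(12 n))
      print(n, factorial(n) / stirling(n))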

[Lp] title = Lp spaces author = Sebastien Gouezel notify = sebastien.gouezel@univ-rennes1.fr date = 2016-10-05 topic = Mathematics/Analysis abstract = Lp is the space of functions whose p-th power is integrable. It is one of the most fundamental Banach spaces used in analysis and probability. We develop a framework for function spaces, and then implement the Lp spaces in this framework using the existing integration theory in Isabelle/HOL. Our development contains the most fundamental properties of Lp spaces, notably the Hölder and Minkowski inequalities, completeness of Lp, duality, stability under almost sure convergence, multiplication of functions in Lp and Lq, and stability under conditional expectation. [Berlekamp_Zassenhaus] title = The Factorization Algorithm of Berlekamp and Zassenhaus author = Jose Divasón , Sebastiaan Joosten , René Thiemann , Akihisa Yamada notify = rene.thiemann@uibk.ac.at date = 2016-10-14 topic = Mathematics/Algebra abstract =

We formalize the Berlekamp-Zassenhaus algorithm for factoring square-free integer polynomials in Isabelle/HOL. We further adapt an existing formalization of Yun’s square-free factorization algorithm to integer polynomials, and thus provide an efficient and certified factorization algorithm for arbitrary univariate polynomials.

The algorithm first performs a factorization in the prime field GF(p) and then performs computations in the integer ring modulo p^k, where both p and k are determined at runtime. Since a natural modeling of these structures via dependent types is not possible in Isabelle/HOL, we formalize the whole algorithm using Isabelle’s recent addition of local type definitions.

Through experiments we verify that our algorithm factors polynomials of degree 100 within seconds.
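The following toy Python fragment is not the Berlekamp–Zassenhaus algorithm; it merely illustrates, on a deliberately simple made-up example, the two-stage structure described above: compute a factor modulo a prime p first, then lift it to a factor modulo a higher power of p (here p^2, via one Hensel/Newton step):

  def poly_eval(coeffs, x, m):
      # evaluate a polynomial (coefficients listed from lowest degree) at x modulo m
      return sum(c * pow(x, i, m) for i, c in enumerate(coeffs)) % m

  def poly_deriv(coeffs):
      return [i * c for i, c in enumerate(coeffs)][1:]

  f, p = [1, 0, 1], 5                    # f(x) = x^2 + 1, to be factored mod powers of 5
  # stage 1: a root of f mod p by brute force (stands in for the factorisation over GF(p))
  r = next(x for x in range(p) if poly_eval(f, x, p) == 0)          # r = 2
  # stage 2: Hensel-lift the root from mod p to mod p^2 (stands in for the lifting to Z/p^k)
  inv = pow(poly_eval(poly_deriv(f), r, p * p), -1, p * p)
  r2 = (r - poly_eval(f, r, p * p) * inv) % (p * p)
  assert poly_eval(f, r2, p * p) == 0 and r2 % p == r
  print(r, r2)   # 2 and 7: (x - 2) divides f mod 5, and (x - 7) divides f mod 25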

[Allen_Calculus] title = Allen's Interval Calculus author = Fadoua Ghourabi <> notify = fadouaghourabi@gmail.com date = 2016-09-29 topic = Logic/General logic/Temporal logic, Mathematics/Order abstract = Allen’s interval calculus is a qualitative temporal representation of time events. Allen introduced 13 binary relations that describe all the possible arrangements between two events, i.e. intervals with non-zero finite length. The compositions are pertinent to reasoning about knowledge of time. In particular, a consistency problem over relation constraints is commonly solved using these compositions as a guideline. We formalize the relations together with an axiomatic system. We prove the validity of the 169 compositions of these relations. We also define nests as the sets of intervals that share a meeting point. We prove that nests give the ordering properties of points without introducing a new datatype for points. [1] J.F. Allen. Maintaining Knowledge about Temporal Intervals. In Commun. ACM, volume 26, pages 832–843, 1983. [2] J. F. Allen and P. J. Hayes. A Common-sense Theory of Time. In Proceedings of the 9th International Joint Conference on Artificial Intelligence (IJCAI’85), pages 528–531, 1985. [Source_Coding_Theorem] title = Source Coding Theorem author = Quentin Hibon , Lawrence C. Paulson notify = qh225@cl.cam.ac.uk date = 2016-10-19 topic = Mathematics/Probability theory abstract = This document contains a proof of the necessary condition on the code rate of a source code, namely that this code rate is bounded by the entropy of the source. This represents one half of Shannon's source coding theorem, which is itself an equivalence. [Buffons_Needle] title = Buffon's Needle Problem author = Manuel Eberl topic = Mathematics/Probability theory, Mathematics/Geometry date = 2017-06-06 notify = eberlm@in.tum.de abstract = In the 18th century, Georges-Louis Leclerc, Comte de Buffon posed and later solved the following problem, which is often called the first problem ever solved in geometric probability: Given a floor divided into vertical strips of the same width, what is the probability that a needle thrown onto the floor randomly will cross two strips? This entry formally defines the problem in the case where the needle's position is chosen uniformly at random in a single strip around the origin (which is equivalent to larger arrangements due to symmetry). It then provides proofs of the simple solution in the case where the needle's length is no greater than the width of the strips and the more complicated solution in the opposite case. [SPARCv8] title = A formal model for the SPARCv8 ISA and a proof of non-interference for the LEON3 processor author = Zhe Hou , David Sanan , Alwen Tiu , Yang Liu notify = zhe.hou@ntu.edu.sg, sanan@ntu.edu.sg date = 2016-10-19 topic = Computer science/Security, Computer science/Hardware abstract = We formalise the SPARCv8 instruction set architecture (ISA), which is used in processors such as LEON3. Our formalisation can be specialised to any SPARCv8 CPU; here we use LEON3 as a running example. Our model covers the operational semantics for all the instructions in the integer unit of the SPARCv8 architecture and it supports Isabelle code export, which effectively turns the Isabelle model into a SPARCv8 CPU simulator. We prove the language-based non-interference property for the LEON3 processor. Our model is based on a deterministic monad, which is a modified version of the non-deterministic monad from NICTA/l4v.
[Separata] title = Separata: Isabelle tactics for Separation Algebra author = Zhe Hou , David Sanan , Alwen Tiu , Rajeev Gore , Ranald Clouston notify = zhe.hou@ntu.edu.sg date = 2016-11-16 topic = Computer science/Programming languages/Logics, Tools abstract = We bring the labelled sequent calculus $LS_{PASL}$ for propositional abstract separation logic to Isabelle. The tactics given here are directly applied to an extension of the Separation Algebra in the AFP. In addition to the cancellative separation algebra, we further consider some useful properties in the heap model of separation logic, such as indivisible unit, disjointness, and cross-split. The tactics are essentially a proof search procedure for the calculus $LS_{PASL}$. We wrap the tactics in an Isabelle method called separata, and give a few examples of separation logic formulae which are provable by separata. [LOFT] title = LOFT — Verified Migration of Linux Firewalls to SDN author = Julius Michaelis , Cornelius Diekmann notify = isabelleopenflow@liftm.de date = 2016-10-21 topic = Computer science/Networks abstract = We present LOFT — Linux firewall OpenFlow Translator, a system that transforms the main routing table and FORWARD chain of iptables of a Linux-based firewall into a set of static OpenFlow rules. Our implementation is verified against a model of a simplified Linux-based router, and we can directly show how much of the original functionality is preserved. [Stable_Matching] title = Stable Matching author = Peter Gammie notify = peteg42@gmail.com date = 2016-10-24 topic = Mathematics/Games and economics abstract = We mechanize proofs of several results from the matching with contracts literature, which generalize those of the classical two-sided matching scenarios that go by the name of stable marriage. Our focus is on game-theoretic issues. Along the way we develop executable algorithms for computing optimal stable matches. [Modal_Logics_for_NTS] title = Modal Logics for Nominal Transition Systems author = Tjark Weber , Lars-Henrik Eriksson , Joachim Parrow , Johannes Borgström , Ramunas Gutkovas notify = tjark.weber@it.uu.se date = 2016-10-25 topic = Computer science/Concurrency/Process calculi, Logic/General logic/Modal logic abstract = We formalize a uniform semantic substrate for a wide variety of process calculi where states and action labels can be from arbitrary nominal sets. A Hennessy-Milner logic for these systems is defined, and proved adequate for bisimulation equivalence. A main novelty is the construction of an infinitary nominal data type to model formulas with (finitely supported) infinite conjunctions and actions that may contain binding names. The logic is generalized to treat different bisimulation variants such as early, late and open in a systematic way. extra-history = Change history: [2017-01-29]: Formalization of weak bisimilarity added (revision c87cc2057d9c) [Abs_Int_ITP2012] title = Abstract Interpretation of Annotated Commands author = Tobias Nipkow notify = nipkow@in.tum.de date = 2016-11-23 topic = Computer science/Programming languages/Static analysis abstract = This is the Isabelle formalization of the material described in the eponymous ITP 2012 paper. It develops a generic abstract interpreter for a while-language, including widening and narrowing. The collecting semantics and the abstract interpreter operate on annotated commands: the program is represented as a syntax tree with the semantic information directly embedded, without auxiliary labels.
The aim of the formalization is simplicity, not efficiency or precision. This is motivated by the inclusion of the material in a theorem prover based course on semantics. A similar (but more polished) development is covered in the book Concrete Semantics. [Complx] title = COMPLX: A Verification Framework for Concurrent Imperative Programs author = Sidney Amani<>, June Andronick<>, Maksym Bortin<>, Corey Lewis<>, Christine Rizkallah<>, Joseph Tuong<> notify = sidney.amani@data61.csiro.au, corey.lewis@data61.csiro.au date = 2016-11-29 topic = Computer science/Programming languages/Logics, Computer science/Programming languages/Language definitions abstract = We propose a concurrency reasoning framework for imperative programs, based on the Owicki-Gries (OG) foundational shared-variable concurrency method. Our framework combines the approaches of Hoare-Parallel, a formalisation of OG in Isabelle/HOL for a simple while-language, and Simpl, a generic imperative language embedded in Isabelle/HOL, allowing formal reasoning on C programs. We define the Complx language, extending the syntax and semantics of Simpl with support for parallel composition and synchronisation. We additionally define an OG logic, which we prove sound w.r.t. the semantics, and a verification condition generator, both supporting involved low-level imperative constructs such as function calls and abrupt termination. We illustrate our framework on an example that features exceptions, guards and function calls. We aim to then target concurrent operating systems, such as the interruptible eChronos embedded operating system for which we already have a model-level OG proof using Hoare-Parallel. extra-history = Change history: [2017-01-13]: Improve VCG for nested parallels and sequential sections (revision 30739dbc3dcb) [Paraconsistency] title = Paraconsistency author = Anders Schlichtkrull , Jørgen Villadsen topic = Logic/General logic/Paraconsistent logics date = 2016-12-07 notify = andschl@dtu.dk, jovi@dtu.dk abstract = Paraconsistency is about handling inconsistency in a coherent way. In classical and intuitionistic logic everything follows from an inconsistent theory. A paraconsistent logic avoids the explosion. Quite a few applications in computer science and engineering are discussed in the Intelligent Systems Reference Library Volume 110: Towards Paraconsistent Engineering (Springer 2016). We formalize a paraconsistent many-valued logic that we motivated and described in a special issue on logical approaches to paraconsistency (Journal of Applied Non-Classical Logics 2005). We limit ourselves to the propositional fragment of the higher-order logic. The logic is based on so-called key equalities and has a countably infinite number of truth values. We prove theorems in the logic using the definition of validity. We verify truth tables and also counterexamples for non-theorems. We prove meta-theorems about the logic and finally we investigate a case study. [Proof_Strategy_Language] title = Proof Strategy Language author = Yutaka Nagashima<> topic = Tools date = 2016-12-20 notify = Yutaka.Nagashima@data61.csiro.au abstract = Isabelle includes various automatic tools for finding proofs under certain conditions. However, for each conjecture, knowing which automation to use, and how to tweak its parameters, is currently labour intensive. We have developed a language, PSL, designed to capture high level proof strategies. 
PSL offloads the construction of human-readable, fast-to-replay proof scripts to automatic search, making use of search-time information about each conjecture. Our preliminary evaluations show that PSL reduces the labour cost of interactive theorem proving. This submission contains the implementation of PSL and an example theory file, Example.thy, showing how to write proof strategies in PSL. [Concurrent_Ref_Alg] title = Concurrent Refinement Algebra and Rely Quotients author = Julian Fell , Ian J. Hayes , Andrius Velykis topic = Computer science/Concurrency date = 2016-12-30 notify = Ian.Hayes@itee.uq.edu.au abstract = The concurrent refinement algebra developed here is designed to provide a foundation for rely/guarantee reasoning about concurrent programs. The algebra builds on a complete lattice of commands by providing sequential composition, parallel composition and a novel weak conjunction operator. The weak conjunction operator coincides with the lattice supremum provided its arguments are non-aborting, but aborts if either of its arguments does. Weak conjunction provides an abstract version of a guarantee condition as a guarantee process. We distinguish between models that distribute sequential composition over non-deterministic choice from the left (referred to as being conjunctive in the refinement calculus literature) and those that don't. Least and greatest fixed points of monotone functions are provided to allow recursion and iteration operators to be added to the language. Additional iteration laws are available for conjunctive models. The rely quotient of processes c and i is the process that, if executed in parallel with i, implements c. It represents an abstract version of a rely condition generalised to a process. [FOL_Harrison] title = First-Order Logic According to Harrison author = Alexander Birch Jensen , Anders Schlichtkrull , Jørgen Villadsen topic = Logic/General logic/Mechanization of proofs date = 2017-01-01 notify = aleje@dtu.dk, andschl@dtu.dk, jovi@dtu.dk abstract =

We present a certified declarative first-order prover with equality based on John Harrison's Handbook of Practical Logic and Automated Reasoning, Cambridge University Press, 2009. ML code reflection is used such that the entire prover can be executed within Isabelle as a very simple interactive proof assistant. As examples we consider Pelletier's problems 1-46.

Reference: Programming and Verifying a Declarative First-Order Prover in Isabelle/HOL. Alexander Birch Jensen, John Bruntse Larsen, Anders Schlichtkrull & Jørgen Villadsen. AI Communications 31:281-299 2018. https://content.iospress.com/articles/ai-communications/aic764

See also: Students' Proof Assistant (SPA). https://github.com/logic-tools/spa

extra-history = Change history: [2018-07-21]: Proof of Pelletier's problem 34 (Andrews's Challenge) thanks to Asta Halkjær From. [Bernoulli] title = Bernoulli Numbers author = Lukas Bulwahn, Manuel Eberl topic = Mathematics/Analysis, Mathematics/Number theory date = 2017-01-24 notify = eberlm@in.tum.de abstract =

Bernoulli numbers were first discovered in the closed-form expansion of the sum $1^m + 2^m + \ldots + n^m$ for a fixed m and appear in many other places. This entry provides three different definitions for them: a recursive one, an explicit one, and one through their exponential generating function.

In addition, we prove some basic facts, e.g. their relation to sums of powers of integers and that all odd Bernoulli numbers except the first are zero, and some advanced facts like their relationship to the Riemann zeta function on positive even integers.

We also prove the correctness of the Akiyama–Tanigawa algorithm for computing Bernoulli numbers with reasonable efficiency, and we define the periodic Bernoulli polynomials (which appear e.g. in the Euler–Maclaurin summation formula and the expansion of the log-Gamma function) and prove their basic properties.
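The Akiyama–Tanigawa algorithm is short enough to sketch in Python with exact rational arithmetic (an informal illustration, not the verified version; note that this variant produces the convention B_1 = +1/2, which may differ in sign from the convention used in the entry):

  from fractions import Fraction

  def bernoulli_akiyama_tanigawa(n):
      # returns [B_0, ..., B_n] in the B_1 = +1/2 convention
      a = [Fraction(0)] * (n + 1)
      bs = []
      for m in range(n + 1):
          a[m] = Fraction(1, m + 1)
          for j in range(m, 0, -1):
              a[j - 1] = j * (a[j - 1] - a[j])
          bs.append(a[0])
      return bs

  print([str(b) for b in bernoulli_akiyama_tanigawa(8)])
  # ['1', '1/2', '1/6', '0', '-1/30', '0', '1/42', '0', '-1/30']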

[Stone_Relation_Algebras] title = Stone Relation Algebras author = Walter Guttmann topic = Mathematics/Algebra date = 2017-02-07 notify = walter.guttmann@canterbury.ac.nz abstract = We develop Stone relation algebras, which generalise relation algebras by replacing the underlying Boolean algebra structure with a Stone algebra. We show that finite matrices over extended real numbers form an instance. As a consequence, relation-algebraic concepts and methods can be used for reasoning about weighted graphs. We also develop a fixpoint calculus and apply it to compare different definitions of reflexive-transitive closures in semirings. [Stone_Kleene_Relation_Algebras] title = Stone-Kleene Relation Algebras author = Walter Guttmann topic = Mathematics/Algebra date = 2017-07-06 notify = walter.guttmann@canterbury.ac.nz abstract = We develop Stone-Kleene relation algebras, which expand Stone relation algebras with a Kleene star operation to describe reachability in weighted graphs. Many properties of the Kleene star arise as a special case of a more general theory of iteration based on Conway semirings extended by simulation axioms. This includes several theorems representing complex program transformations. We formally prove the correctness of Conway's automata-based construction of the Kleene star of a matrix. We prove numerous results useful for reasoning about weighted graphs. [Abstract_Soundness] title = Abstract Soundness author = Jasmin Christian Blanchette , Andrei Popescu , Dmitriy Traytel topic = Logic/Proof theory date = 2017-02-10 notify = jasmin.blanchette@gmail.com abstract = A formalized coinductive account of the abstract development of Brotherston, Gorogiannis, and Petersen [APLAS 2012], in a slightly more general form since we work with arbitrary infinite proofs, which may be acyclic. This work is described in detail in an article by the authors, published in 2017 in the Journal of Automated Reasoning. The abstract proof can be instantiated for various formalisms, including first-order logic with inductive predicates. [Differential_Dynamic_Logic] title = Differential Dynamic Logic author = Brandon Bohrer topic = Logic/General logic/Modal logic, Computer science/Programming languages/Logics date = 2017-02-13 notify = bbohrer@cs.cmu.edu abstract = We formalize differential dynamic logic, a logic for proving properties of hybrid systems. The proof calculus in this formalization is based on the uniform substitution principle. We show it is sound with respect to our denotational semantics, which provides increased confidence in the correctness of the KeYmaera X theorem prover based on this calculus. As an application, we include a proof term checker embedded in Isabelle/HOL with several example proofs. Published in: Brandon Bohrer, Vincent Rahli, Ivana Vukotic, Marcus Völp, André Platzer: Formally verified differential dynamic logic. CPP 2017. [Syntax_Independent_Logic] title = Syntax-Independent Logic Infrastructure author = Andrei Popescu , Dmitriy Traytel topic = Logic/Proof theory date = 2020-09-16 notify = a.popescu@sheffield.ac.uk, traytel@di.ku.dk abstract = We formalize a notion of logic whose terms and formulas are kept abstract. In particular, logical connectives, substitution, free variables, and provability are not defined, but characterized by their general properties as locale assumptions. Based on this abstract characterization, we develop further reusable reasoning infrastructure. 
For example, we define parallel substitution (along with proving its characterizing theorems) from single-point substitution. Similarly, we develop a natural deduction style proof system starting from the abstract Hilbert-style one. These one-time efforts benefit different concrete logics satisfying our locales' assumptions. We instantiate the syntax-independent logic infrastructure to Robinson arithmetic (also known as Q) in the AFP entry Robinson_Arithmetic and to hereditarily finite set theory in the AFP entries Goedel_HFSet_Semantic and Goedel_HFSet_Semanticless, which are part of our formalization of Gödel's Incompleteness Theorems described in our CADE-27 paper A Formally Verified Abstract Account of Gödel's Incompleteness Theorems. [Goedel_Incompleteness] title = An Abstract Formalization of Gödel's Incompleteness Theorems author = Andrei Popescu , Dmitriy Traytel topic = Logic/Proof theory date = 2020-09-16 notify = a.popescu@sheffield.ac.uk, traytel@di.ku.dk abstract = We present an abstract formalization of Gödel's incompleteness theorems. We analyze sufficient conditions for the theorems' applicability to a partially specified logic. Our abstract perspective enables a comparison between alternative approaches from the literature. These include Rosser's variation of the first theorem, Jeroslow's variation of the second theorem, and the Swierczkowski–Paulson semantics-based approach. This AFP entry is the main entry point to the results described in our CADE-27 paper A Formally Verified Abstract Account of Gödel's Incompleteness Theorems. As part of our abstract formalization's validation, we instantiate our locales twice in the separate AFP entries Goedel_HFSet_Semantic and Goedel_HFSet_Semanticless. [Goedel_HFSet_Semantic] title = From Abstract to Concrete Gödel's Incompleteness Theorems—Part I author = Andrei Popescu , Dmitriy Traytel topic = Logic/Proof theory date = 2020-09-16 notify = a.popescu@sheffield.ac.uk, traytel@di.ku.dk abstract = We validate an abstract formulation of Gödel's First and Second Incompleteness Theorems from a separate AFP entry by instantiating them to the case of finite sound extensions of the Hereditarily Finite (HF) Set theory, i.e., FOL theories extending the HF Set theory with a finite set of axioms that are sound in the standard model. The concrete results had been previously formalised in an AFP entry by Larry Paulson; our instantiation reuses the infrastructure developed in that entry. [Goedel_HFSet_Semanticless] title = From Abstract to Concrete Gödel's Incompleteness Theorems—Part II author = Andrei Popescu , Dmitriy Traytel topic = Logic/Proof theory date = 2020-09-16 notify = a.popescu@sheffield.ac.uk, traytel@di.ku.dk abstract = We validate an abstract formulation of Gödel's Second Incompleteness Theorem from a separate AFP entry by instantiating it to the case of finite consistent extensions of the Hereditarily Finite (HF) Set theory, i.e., consistent FOL theories extending the HF Set theory with a finite set of axioms. The instantiation draws heavily on infrastructure previously developed by Larry Paulson in his direct formalisation of the concrete result. It strengthens Paulson's formalization of Gödel's Second from that entry by not assuming soundness, and in fact not relying on any notion of model or semantic interpretation. 
The strengthening was obtained by first replacing some of Paulson’s semantic arguments with proofs within his HF calculus, and then plugging in some of Paulson's (modified) lemmas to instantiate our soundness-free Gödel's Second locale. [Robinson_Arithmetic] title = Robinson Arithmetic author = Andrei Popescu , Dmitriy Traytel topic = Logic/Proof theory date = 2020-09-16 notify = a.popescu@sheffield.ac.uk, traytel@di.ku.dk abstract = We instantiate our syntax-independent logic infrastructure developed in a separate AFP entry to the FOL theory of Robinson arithmetic (also known as Q). The latter was formalised using Nominal Isabelle by adapting Larry Paulson’s formalization of the Hereditarily Finite Set theory. [Elliptic_Curves_Group_Law] title = The Group Law for Elliptic Curves author = Stefan Berghofer topic = Computer science/Security/Cryptography date = 2017-02-28 notify = berghofe@in.tum.de abstract = We prove the group law for elliptic curves in Weierstrass form over fields of characteristic greater than 2. In addition to affine coordinates, we also formalize projective coordinates, which allow for more efficient computations. By specializing the abstract formalization to prime fields, we can apply the curve operations to parameters used in standard security protocols. [Example-Submission] title = Example Submission author = Gerwin Klein topic = Mathematics/Analysis, Mathematics/Number theory date = 2004-02-25 notify = kleing@cse.unsw.edu.au abstract =

This is an example submission to the Archive of Formal Proofs. It shows submission requirements and explains the structure of a simple typical submission.

Note that you can use HTML tags and LaTeX formulae like $\sum_{n=1}^\infty \frac{1}{n^2} = \frac{\pi^2}{6}$ in the abstract. Display formulae like $$ \int_0^1 x^{-x}\,\text{d}x = \sum_{n=1}^\infty n^{-n}$$ are also possible. Please read the submission guidelines before using this.

extra-no-index = no-index: true [CRDT] title = A framework for establishing Strong Eventual Consistency for Conflict-free Replicated Datatypes author = Victor B. F. Gomes , Martin Kleppmann, Dominic P. Mulligan, Alastair R. Beresford topic = Computer science/Algorithms/Distributed, Computer science/Data structures date = 2017-07-07 notify = vb358@cam.ac.uk, dominic.p.mulligan@googlemail.com abstract = In this work, we focus on the correctness of Conflict-free Replicated Data Types (CRDTs), a class of algorithms that provide strong eventual consistency guarantees for replicated data. We develop a modular and reusable framework for verifying the correctness of CRDT algorithms. We avoid correctness issues that have dogged previous mechanised proofs in this area by including a network model in our formalisation, and proving that our theorems hold in all possible network behaviours. Our axiomatic network model is a standard abstraction that accurately reflects the behaviour of real-world computer networks. Moreover, we identify an abstract convergence theorem, a property of order relations, which provides a formal definition of strong eventual consistency. We then obtain the first machine-checked correctness theorems for three concrete CRDTs: the Replicated Growable Array, the Observed-Remove Set, and an Increment-Decrement Counter. [HOLCF-Prelude] title = HOLCF-Prelude author = Joachim Breitner, Brian Huffman<>, Neil Mitchell<>, Christian Sternagel topic = Computer science/Functional programming date = 2017-07-15 notify = c.sternagel@gmail.com, joachim@cis.upenn.edu, hupel@in.tum.de abstract = The Isabelle/HOLCF-Prelude is a formalization of a large part of Haskell's standard prelude in Isabelle/HOLCF. We use it to prove the correctness of Eratosthenes' Sieve, in its self-referential implementation commonly used to showcase Haskell's laziness; prove correctness of GHC's "fold/build" rule and related rewrite rules; and certify a number of hints suggested by HLint. [Decl_Sem_Fun_PL] title = Declarative Semantics for Functional Languages author = Jeremy Siek topic = Computer science/Programming languages date = 2017-07-21 notify = jsiek@indiana.edu abstract = We present a semantics for an applied call-by-value lambda-calculus that is compositional, extensional, and elementary. We present four different views of the semantics: 1) as a relational (big-step) semantics that is not operational but instead declarative, 2) as a denotational semantics that does not use domain theory, 3) as a non-deterministic interpreter, and 4) as a variant of the intersection type systems of the Torino group. We prove that the semantics is correct by showing that it is sound and complete with respect to operational semantics on programs and that it is sound with respect to contextual equivalence. We have not yet investigated whether it is fully abstract. We demonstrate that this approach to semantics is useful with three case studies. First, we use the semantics to prove correctness of a compiler optimization that inlines function application. Second, we adapt the semantics to the polymorphic lambda-calculus extended with general recursion and prove semantic type soundness. Third, we adapt the semantics to the call-by-value lambda-calculus with mutable references.
The paper that accompanies these Isabelle theories is available on arXiv. [DynamicArchitectures] title = Dynamic Architectures author = Diego Marmsoler topic = Computer science/System description languages date = 2017-07-28 notify = diego.marmsoler@tum.de abstract = The architecture of a system describes the system's overall organization into components and connections between those components. With the emergence of mobile computing, dynamic architectures have become increasingly important. In such architectures, components may appear or disappear, and connections may change over time. In the following we mechanize a theory of dynamic architectures and verify the soundness of a corresponding calculus. Therefore, we first formalize the notion of configuration traces as a model for dynamic architectures. Then, the behavior of single components is formalized in terms of behavior traces and an operator is introduced and studied to extract the behavior of a single component out of a given configuration trace. Then, behavior trace assertions are introduced as a temporal specification technique to specify behavior of components. Reasoning about component behavior in a dynamic context is formalized in terms of a calculus for dynamic architectures. Finally, the soundness of the calculus is verified by introducing an alternative interpretation for behavior trace assertions over configuration traces and proving the rules of the calculus. Since projection may lead to finite as well as infinite behavior traces, they are formalized in terms of coinductive lists. Thus, our theory is based on Lochbihler's formalization of coinductive lists. The theory may be applied to verify properties for dynamic architectures. extra-history = Change history: [2018-06-07]: adding logical operators to specify configuration traces (revision 09178f08f050)
[Stewart_Apollonius] title = Stewart's Theorem and Apollonius' Theorem author = Lukas Bulwahn topic = Mathematics/Geometry date = 2017-07-31 notify = lukas.bulwahn@gmail.com abstract = This entry formalizes two geometric theorems, Stewart's theorem and Apollonius' theorem. Stewart's Theorem relates the length of a triangle's cevian to the lengths of the triangle's two sides. Apollonius' Theorem is a specialisation of Stewart's theorem, restricting the cevian to be the median. The proof applies the law of cosines, some basic geometric facts about triangles and then simply transforms the terms algebraically to yield the conjectured relation. The formalization in Isabelle can closely follow the informal proofs described in the Wikipedia articles of those two theorems. [LambdaMu] title = The LambdaMu-calculus author = Cristina Matache , Victor B. F. Gomes , Dominic P. Mulligan topic = Computer science/Programming languages/Lambda calculi, Logic/General logic/Lambda calculus date = 2017-08-16 notify = victorborgesfg@gmail.com, dominic.p.mulligan@googlemail.com abstract = The propositions-as-types correspondence is ordinarily presented as linking the metatheory of typed λ-calculi and the proof theory of intuitionistic logic. Griffin observed that this correspondence could be extended to classical logic through the use of control operators. This observation set off a flurry of further research, leading to the development of Parigot's λμ-calculus. In this work, we formalise the λμ-calculus in Isabelle/HOL and prove several metatheoretical properties such as type preservation and progress. [Orbit_Stabiliser] title = Orbit-Stabiliser Theorem with Application to Rotational Symmetries author = Jonas Rädle topic = Mathematics/Algebra date = 2017-08-20 notify = jonas.raedle@tum.de abstract = The Orbit-Stabiliser theorem is a basic result in the algebra of groups that factors the order of a group into the sizes of its orbits and stabilisers. We formalize the notion of a group action and the related concepts of orbits and stabilisers. This allows us to prove the orbit-stabiliser theorem. In the second part of this work, we formalize the tetrahedral group and use the orbit-stabiliser theorem to prove that there are twelve (orientation-preserving) rotations of the tetrahedron. [PLM] title = Representation and Partial Automation of the Principia Logico-Metaphysica in Isabelle/HOL author = Daniel Kirchner topic = Logic/Philosophical aspects date = 2017-09-17 notify = daniel@ekpyron.org abstract =

We present an embedding of the second-order fragment of the Theory of Abstract Objects as described in Edward Zalta's upcoming work Principia Logico-Metaphysica (PLM) in the automated reasoning framework Isabelle/HOL. The Theory of Abstract Objects is a metaphysical theory that reifies property patterns, as they occur, for example, in the abstract reasoning of mathematics, as abstract objects and provides an axiomatic framework that allows one to reason about these objects. It thereby serves as a fundamental metaphysical theory that can be used to axiomatize and describe a wide range of philosophical objects, such as Platonic forms or Leibniz' concepts, and has the ambition to function as a foundational theory of mathematics. The target theory of our embedding as described in chapters 7-9 of PLM employs a modal relational type theory as its logical foundation, for which a representation in functional type theory is known to be challenging.

Nevertheless, we arrive at a functioning representation of the theory in the functional logic of Isabelle/HOL based on a semantical representation of an Aczel-model of the theory. Based on this representation we construct an implementation of the deductive system of PLM, which allows theorems of PLM to be found and verified both automatically and interactively.

Our work thereby supports the concept of shallow semantical embeddings of logical systems in HOL as a universal tool for logical reasoning as promoted by Christoph Benzmüller.

The most notable result of the presented work is the discovery of a previously unknown paradox in the formulation of the Theory of Abstract Objects. The embedding of the theory in Isabelle/HOL played a vital part in this discovery. Furthermore it was possible to immediately offer several options to modify the theory to guarantee its consistency. Thereby our work could provide a significant contribution to the development of a proper grounding for object theory.

[KD_Tree] title = Multidimensional Binary Search Trees author = Martin Rau<> topic = Computer science/Data structures date = 2019-05-30 notify = martin.rau@tum.de, mrtnrau@googlemail.com abstract = This entry provides a formalization of multidimensional binary trees, also known as k-d trees. It includes a balanced build algorithm as well as the nearest neighbor algorithm and the range search algorithm. It is based on the papers Multidimensional binary search trees used for associative searching and An Algorithm for Finding Best Matches in Logarithmic Expected Time. extra-history = Change history: [2020-04-15]: Change representation of k-dimensional points from 'list' to HOL-Analysis.Finite_Cartesian_Product 'vec'. Update proofs to incorporate HOL-Analysis 'dist' and 'cbox' primitives. [Closest_Pair_Points] title = Closest Pair of Points Algorithms author = Martin Rau , Tobias Nipkow topic = Computer science/Algorithms/Geometry date = 2020-01-13 notify = martin.rau@tum.de, nipkow@in.tum.de abstract = This entry provides two related verified divide-and-conquer algorithms solving the fundamental Closest Pair of Points problem in Computational Geometry. Functional correctness and the optimal running time of O(n log n) are proved. Executable code is generated which is empirically competitive with handwritten reference implementations. extra-history = Change history: [2020-04-14]: Incorporate Time_Monad of the AFP entry Root_Balanced_Tree. [Approximation_Algorithms] title = Verified Approximation Algorithms author = Robin Eßmann , Tobias Nipkow , Simon Robillard topic = Computer science/Algorithms/Approximation date = 2020-01-16 notify = nipkow@in.tum.de abstract = We present the first formal verification of approximation algorithms for NP-complete optimization problems: vertex cover, independent set, load balancing, and bin packing. The proofs correct incompletenesses in existing proofs and improve the approximation ratio in one case. [Diophantine_Eqns_Lin_Hom] title = Homogeneous Linear Diophantine Equations author = Florian Messner , Julian Parsert , Jonas Schöpf , Christian Sternagel topic = Computer science/Algorithms/Mathematical, Mathematics/Number theory, Tools license = LGPL date = 2017-10-14 notify = c.sternagel@gmail.com, julian.parsert@gmail.com abstract = We formalize the theory of homogeneous linear diophantine equations, focusing on two main results: (1) an abstract characterization of minimal complete sets of solutions, and (2) an algorithm computing them. Both the characterization and the algorithm are based on previous work by Huet. Our starting point is a simple but inefficient variant of Huet's lexicographic algorithm incorporating improved bounds due to Clausen and Fortenbacher. We proceed by proving its soundness and completeness. Finally, we employ code equations to obtain a reasonably efficient implementation. Thus, we provide a formally verified solver for homogeneous linear diophantine equations. [Winding_Number_Eval] title = Evaluate Winding Numbers through Cauchy Indices author = Wenda Li topic = Mathematics/Analysis date = 2017-10-17 notify = wl302@cam.ac.uk, liwenda1990@hotmail.com abstract = In complex analysis, the winding number measures the number of times a path (counterclockwise) winds around a point, while the Cauchy index can approximate how the path winds. This entry provides a formalisation of the Cauchy index, which is then shown to be related to the winding number.
In addition, this entry offers a tactic that enables users to evaluate the winding number by calculating Cauchy indices. [Count_Complex_Roots] title = Count the Number of Complex Roots author = Wenda Li topic = Mathematics/Analysis date = 2017-10-17 notify = wl302@cam.ac.uk, liwenda1990@hotmail.com abstract = Based on evaluating Cauchy indices through remainder sequences, this entry provides an effective procedure to count the number of complex roots (with multiplicity) of a polynomial within a rectangular box or a half-plane. Potential applications of this entry include certified complex root isolation (of a polynomial) and testing the Routh-Hurwitz stability criterion (i.e., to check whether all the roots of some characteristic polynomial have negative real parts). [Buchi_Complementation] title = Büchi Complementation author = Julian Brunner topic = Computer science/Automata and formal languages date = 2017-10-19 notify = brunnerj@in.tum.de abstract = This entry provides a verified implementation of rank-based Büchi Complementation. The verification is done in three steps:
  1. Definition of odd rankings and proof that an automaton rejects a word iff there exists an odd ranking for it.
  2. Definition of the complement automaton and proof that it accepts exactly those words for which there is an odd ranking.
  3. Verified implementation of the complement automaton using the Isabelle Collections Framework.
[Transition_Systems_and_Automata] title = Transition Systems and Automata author = Julian Brunner topic = Computer science/Automata and formal languages date = 2017-10-19 notify = brunnerj@in.tum.de abstract = This entry provides a very abstract theory of transition systems that can be instantiated to express various types of automata. A transition system is typically instantiated by providing a set of initial states, a predicate for enabled transitions, and a transition execution function. From this, it defines the concepts of finite and infinite paths as well as the set of reachable states, among other things. Many useful theorems, from basic path manipulation rules to coinduction and run construction rules, are proven in this abstract transition system context. The library comes with instantiations for DFAs, NFAs, and Büchi automata. [Kuratowski_Closure_Complement] title = The Kuratowski Closure-Complement Theorem author = Peter Gammie , Gianpaolo Gioiosa<> topic = Mathematics/Topology date = 2017-10-26 notify = peteg42@gmail.com abstract = We discuss a topological curiosity discovered by Kuratowski (1922): the fact that the number of distinct operators on a topological space generated by compositions of closure and complement never exceeds 14, and is exactly 14 in the case of R. In addition, we prove a theorem due to Chagrov (1982) that classifies topological spaces according to the number of such operators they support. [Hybrid_Multi_Lane_Spatial_Logic] title = Hybrid Multi-Lane Spatial Logic author = Sven Linker topic = Logic/General logic/Modal logic date = 2017-11-06 notify = s.linker@liverpool.ac.uk abstract = We present a semantic embedding of a spatio-temporal multi-modal logic, specifically defined to reason about motorway traffic, into Isabelle/HOL. The semantic model is an abstraction of a motorway, emphasising local spatial properties, and parameterised by the types of sensors deployed in the vehicles. We use the logic to define controller constraints to ensure safety, i.e., the absence of collisions on the motorway. After proving safety with a restrictive definition of sensors, we relax these assumptions and show how to amend the controller constraints to still guarantee safety. [Dirichlet_L] title = Dirichlet L-Functions and Dirichlet's Theorem author = Manuel Eberl topic = Mathematics/Number theory, Mathematics/Algebra date = 2017-12-21 notify = eberlm@in.tum.de abstract =

This article provides a formalisation of Dirichlet characters and Dirichlet L-functions, including proofs of their basic properties – most notably their analyticity, their areas of convergence, and their non-vanishing for ℜ(s) ≥ 1. All of this is built in a very high-level style using Dirichlet series. The proof of the non-vanishing follows a very short and elegant proof by Newman, which we attempt to reproduce faithfully at a similar level of abstraction in Isabelle.

This also leads to a relatively short proof of Dirichlet’s Theorem, which states that, if h and n are coprime, there are infinitely many primes p with p ≡ h (mod n).
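The theorem is easy to illustrate numerically; the following small Python snippet (entirely unrelated to the formal proof) lists the first few primes in a sample residue class h mod n with h and n coprime:

  from math import gcd

  def is_prime(k):
      if k < 2:
          return False
      d = 2
      while d * d <= k:
          if k % d == 0:
              return False
          d += 1
      return True

  def primes_in_progression(h, n, count):
      # first `count` primes p with p ≡ h (mod n); assumes gcd(h, n) = 1
      assert gcd(h, n) == 1
      found, p = [], 2
      while len(found) < count:
          if p % n == h % n and is_prime(p):
              found.append(p)
          p += 1
      return found

  print(primes_in_progression(3, 10, 8))   # [3, 13, 23, 43, 53, 73, 83, 103]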

[Symmetric_Polynomials] title = Symmetric Polynomials author = Manuel Eberl topic = Mathematics/Algebra date = 2018-09-25 notify = eberlm@in.tum.de abstract =

A symmetric polynomial is a polynomial in variables $X_1,\ldots,X_n$ that does not discriminate between its variables, i. e. it is invariant under any permutation of them. These polynomials are important in the study of the relationship between the coefficients of a univariate polynomial and its roots in its algebraic closure.

This article provides a definition of symmetric polynomials and the elementary symmetric polynomials $e_1,\ldots,e_n$ and proofs of their basic properties, including three notable ones:

  • Vieta's formula, which gives an explicit expression for the k-th coefficient of a univariate monic polynomial in terms of its roots $x_1,\ldots,x_n$, namely $c_k = (-1)^{n-k} e_{n-k}(x_1,\ldots,x_n)$.
  • Second, the Fundamental Theorem of Symmetric Polynomials, which states that any symmetric polynomial is itself a uniquely determined polynomial combination of the elementary symmetric polynomials.
  • Third, as a corollary of the previous two, that given a polynomial over some ring R, any symmetric polynomial combination of its roots is also in R even when the roots are not.

Both the symmetry property itself and the witness for the Fundamental Theorem are executable.
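Vieta's formula in particular is easy to check numerically; the following Python sketch (independent of the formal development) expands the monic polynomial with prescribed integer roots and compares each coefficient with the corresponding elementary symmetric polynomial of the roots:

  from itertools import combinations
  from math import prod

  def elementary_symmetric(k, xs):
      # e_k(xs): sum over all products of k distinct entries of xs
      return sum(prod(c) for c in combinations(xs, k))

  def monic_from_roots(xs):
      # coefficients c_0, ..., c_n (lowest degree first) of the product of (X - x) for x in xs
      coeffs = [1]
      for x in xs:
          shifted = [0] + coeffs
          coeffs = [shifted[i] - x * (shifted[i + 1] if i + 1 < len(shifted) else 0)
                    for i in range(len(shifted))]
      return coeffs

  roots = [2, -1, 3, 5]
  n = len(roots)
  coeffs = monic_from_roots(roots)
  for k in range(n + 1):
      # Vieta: c_k = (-1)^(n-k) * e_{n-k}(roots)
      assert coeffs[k] == (-1) ** (n - k) * elementary_symmetric(n - k, roots)
  print(coeffs)   # [-30, 1, 21, -9, 1] for these roots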

[Taylor_Models] title = Taylor Models author = Christoph Traut<>, Fabian Immler topic = Computer science/Algorithms/Mathematical, Computer science/Data structures, Mathematics/Analysis, Mathematics/Algebra date = 2018-01-08 notify = immler@in.tum.de abstract = We present a formally verified implementation of multivariate Taylor models. Taylor models are a form of rigorous polynomial approximation, consisting of an approximation polynomial based on Taylor expansions, combined with a rigorous bound on the approximation error. Taylor models were introduced as a tool to mitigate the dependency problem of interval arithmetic. Our implementation automatically computes Taylor models for the class of elementary functions, expressed by composition of arithmetic operations and basic functions like exp, sin, or square root. [Green] title = An Isabelle/HOL formalisation of Green's Theorem author = Mohammad Abdulaziz , Lawrence C. Paulson topic = Mathematics/Analysis date = 2018-01-11 notify = mohammad.abdulaziz8@gmail.com, lp15@cam.ac.uk abstract = We formalise a statement of Green’s theorem—the first formalisation to our knowledge—in Isabelle/HOL. The theorem statement that we formalise is enough for most applications, especially in physics and engineering. Our formalisation is made possible by a novel proof that avoids the ubiquitous line integral cancellation argument. This eliminates the need to formalise orientations and region boundaries explicitly with respect to the outwards-pointing normal vector. Instead we appeal to a homological argument about equivalences between paths. [AI_Planning_Languages_Semantics] title = AI Planning Languages Semantics author = Mohammad Abdulaziz , Peter Lammich topic = Computer science/Artificial intelligence date = 2020-10-29 notify = mohammad.abdulaziz8@gmail.com abstract = This is an Isabelle/HOL formalisation of the semantics of the multi-valued planning tasks language that is used by the planning system Fast-Downward, the STRIPS fragment of the Planning Domain Definition Language (PDDL), and the STRIPS soundness meta-theory developed by Vladimir Lifschitz. It also contains formally verified checkers for checking the well-formedness of problems specified in either language as well as the correctness of potential solutions. The formalisation in this entry was described in an earlier publication. [Verified_SAT_Based_AI_Planning] title = Verified SAT-Based AI Planning author = Mohammad Abdulaziz , Friedrich Kurz <> topic = Computer science/Artificial intelligence date = 2020-10-29 notify = mohammad.abdulaziz8@gmail.com abstract = We present an executable formally verified SAT encoding of classical AI planning that is based on the encodings by Kautz and Selman and the one by Rintanen et al. The encoding was experimentally tested and shown to be usable for reasonably sized standard AI planning benchmarks. We also use it as a reference to test a state-of-the-art SAT-based planner, showing that it sometimes falsely claims that problems have no solutions of certain lengths. The formalisation in this submission was described in an independent publication. [Gromov_Hyperbolicity] title = Gromov Hyperbolicity author = Sebastien Gouezel<> topic = Mathematics/Geometry date = 2018-01-16 notify = sebastien.gouezel@univ-rennes1.fr abstract = A geodesic metric space is Gromov hyperbolic if all its geodesic triangles are thin, i.e., every side is contained in a fixed thickening of the two other sides.
While this definition looks innocuous, it has proved extremely important and versatile in modern geometry since its introduction by Gromov. We formalize the basic classical properties of Gromov hyperbolic spaces, notably the Morse lemma asserting that quasigeodesics are close to geodesics, and the invariance of hyperbolicity under quasi-isometries. We define and study the Gromov boundary and its associated distance, and prove that a quasi-isometry between Gromov hyperbolic spaces extends to a homeomorphism of the boundaries. We also prove a less classical theorem, by Bonk and Schramm, asserting that a Gromov hyperbolic space embeds isometrically in a geodesic Gromov-hyperbolic space. As the original proof uses a transfinite sequence of Cauchy completions, this is an interesting formalization exercise. Along the way, we introduce basic material on isometries, quasi-isometries, Lipschitz maps, geodesic spaces, the Hausdorff distance, the Cauchy completion of a metric space, and the exponential on extended real numbers. [Ordered_Resolution_Prover] title = Formalization of Bachmair and Ganzinger's Ordered Resolution Prover author = Anders Schlichtkrull , Jasmin Christian Blanchette , Dmitriy Traytel , Uwe Waldmann topic = Logic/General logic/Mechanization of proofs date = 2018-01-18 notify = andschl@dtu.dk, j.c.blanchette@vu.nl abstract = This Isabelle/HOL formalization covers Sections 2 to 4 of Bachmair and Ganzinger's "Resolution Theorem Proving" chapter in the Handbook of Automated Reasoning. This includes soundness and completeness of unordered and ordered variants of ground resolution with and without literal selection, the standard redundancy criterion, a general framework for refutational theorem proving, and soundness and completeness of an abstract first-order prover. [Chandy_Lamport] title = A Formal Proof of The Chandy--Lamport Distributed Snapshot Algorithm author = Ben Fiedler , Dmitriy Traytel topic = Computer science/Algorithms/Distributed date = 2020-07-21 notify = ben.fiedler@inf.ethz.ch, traytel@inf.ethz.ch abstract = We provide a suitable distributed system model and implementation of the Chandy--Lamport distributed snapshot algorithm [ACM Transactions on Computer Systems, 3, 63-75, 1985]. Our main result is a formal termination and correctness proof of the Chandy--Lamport algorithm and its use in stable property detection. [BNF_Operations] title = Operations on Bounded Natural Functors author = Jasmin Christian Blanchette , Andrei Popescu , Dmitriy Traytel topic = Tools date = 2017-12-19 notify = jasmin.blanchette@gmail.com,uuomul@yahoo.com,traytel@inf.ethz.ch abstract = This entry formalizes the closure property of bounded natural functors (BNFs) under seven operations. These operations and the corresponding proofs constitute the core of Isabelle's (co)datatype package. To be close to the implemented tactics, the proofs are deliberately formulated as detailed apply scripts. The (co)datatypes together with (co)induction principles and (co)recursors are byproducts of the fixpoint operations LFP and GFP. Composition of BNFs is subdivided into four simpler operations: Compose, Kill, Lift, and Permute. The N2M operation provides mutual (co)induction principles and (co)recursors for nested (co)datatypes.
[LLL_Basis_Reduction] title = A verified LLL algorithm author = Ralph Bottesch <>, Jose Divasón , Maximilian Haslbeck , Sebastiaan Joosten , René Thiemann , Akihisa Yamada<> topic = Computer science/Algorithms/Mathematical, Mathematics/Algebra date = 2018-02-02 notify = ralph.bottesch@uibk.ac.at, jose.divason@unirioja.es, maximilian.haslbeck@uibk.ac.at, s.j.c.joosten@utwente.nl, rene.thiemann@uibk.ac.at, ayamada@trs.cm.is.nagoya-u.ac.jp abstract = The Lenstra-Lenstra-Lovász basis reduction algorithm, also known as the LLL algorithm, is an algorithm to find a basis with short, nearly orthogonal vectors of an integer lattice. Thereby, it can also be seen as an algorithm to approximately solve the shortest vector problem (SVP), which is an NP-hard problem, where the approximation quality solely depends on the dimension of the lattice, but not the lattice itself. The algorithm also possesses many applications in diverse fields of computer science, from cryptanalysis to number theory, but it is especially well known since it was used to implement the first polynomial-time algorithm to factor polynomials. In this work we present the first mechanized soundness proof of the LLL algorithm to compute short vectors in lattices. The formalization follows a textbook by von zur Gathen and Gerhard. extra-history = Change history: [2018-04-16]: Integrated formal complexity bounds (Haslbeck, Thiemann) [2018-05-25]: Integrated much faster LLL implementation based on integer arithmetic (Bottesch, Haslbeck, Thiemann) [LLL_Factorization] title = A verified factorization algorithm for integer polynomials with polynomial complexity author = Jose Divasón , Sebastiaan Joosten , René Thiemann , Akihisa Yamada topic = Mathematics/Algebra date = 2018-02-06 notify = jose.divason@unirioja.es, s.j.c.joosten@utwente.nl, rene.thiemann@uibk.ac.at, ayamada@trs.cm.is.nagoya-u.ac.jp abstract = Short vectors in lattices and factors of integer polynomials are related. Each factor of an integer polynomial belongs to a certain lattice. When factoring polynomials, the condition that we are looking for an irreducible polynomial means that we must look for a small element in a lattice, which can be done by a basis reduction algorithm. In this development we formalize this connection and thereby one main application of the LLL basis reduction algorithm: an algorithm to factor square-free integer polynomials which runs in polynomial time. The work is based on our previous Berlekamp–Zassenhaus development, where the exponential reconstruction phase has been replaced by the polynomial-time basis reduction algorithm. Thanks to this formalization we found a serious flaw in a textbook. [Treaps] title = Treaps author = Maximilian Haslbeck , Manuel Eberl , Tobias Nipkow topic = Computer science/Data structures date = 2018-02-06 notify = eberlm@in.tum.de abstract =

A Treap is a binary tree whose nodes contain pairs consisting of some payload and an associated priority. It must have the search-tree property w.r.t. the payloads and the heap property w.r.t. the priorities. Treaps are an interesting data structure that is related to binary search trees (BSTs) in the following way: if one forgets all the priorities of a treap, the resulting BST is exactly the same as if one had inserted the elements into an empty BST in order of ascending priority. This means that a treap behaves like a BST where we can pretend the elements were inserted in a different order from the one in which they were actually inserted.

In particular, by choosing these priorities at random upon insertion of an element, we can pretend that we inserted the elements in random order, so that the shape of the resulting tree is that of a random BST no matter in what order we insert the elements. This is the main result of this formalisation.
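To make the insertion-order intuition concrete, here is a rough Python sketch of treap insertion (illustrative only, not the verified Isabelle/HOL development; Node, insert and the rotation helpers are invented names). Insertion keeps the search-tree property on the keys and restores the heap property on the priorities by rotations; drawing the priority uniformly at random yields the random-BST shape described above.

    import random

    class Node:
        def __init__(self, key, prio, left=None, right=None):
            self.key, self.prio, self.left, self.right = key, prio, left, right

    def rotate_right(t):
        l = t.left
        t.left, l.right = l.right, t
        return l

    def rotate_left(t):
        r = t.right
        t.right, r.left = r.left, t
        return r

    def insert(t, key, prio=None):
        # choosing the priority uniformly at random makes the tree shape random
        if prio is None:
            prio = random.random()
        if t is None:
            return Node(key, prio)
        if key < t.key:
            t.left = insert(t.left, key, prio)
            if t.left.prio < t.prio:    # restore the (min-)heap property
                t = rotate_right(t)
        elif key > t.key:
            t.right = insert(t.right, key, prio)
            if t.right.prio < t.prio:
                t = rotate_left(t)
        return t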

[Skip_Lists] title = Skip Lists author = Max W. Haslbeck , Manuel Eberl topic = Computer science/Data structures date = 2020-01-09 notify = max.haslbeck@gmx.de abstract =

Skip lists are sorted linked lists enhanced with shortcuts and are an alternative to binary search trees. A skip list consists of multiple levels of sorted linked lists where a list on level n is a subsequence of the list on level n − 1. In the ideal case, elements are skipped in such a way that a lookup in a skip list takes O(log n) time. In a randomised skip list the skipped elements are chosen randomly.

This entry contains formalized proofs of the textbook results about the expected height and the expected length of a search path in a randomised skip list.
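As an informal illustration of the randomisation (a small Python sketch, not part of the formalisation), the level of each inserted element is drawn from a geometric distribution, and the height of the skip list is the maximum of these levels, which is roughly log base 1/p of n in expectation:

    import random, math

    def random_level(p=0.5):
        # level of a newly inserted element: geometrically distributed
        level = 1
        while random.random() < p:
            level += 1
        return level

    # the height of a randomised skip list with n elements is the maximum level
    n = 10_000
    height = max(random_level() for _ in range(n))
    print(height, math.log2(n))   # the two values are typically close for p = 0.5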

[Mersenne_Primes] title = Mersenne primes and the Lucas–Lehmer test author = Manuel Eberl topic = Mathematics/Number theory date = 2020-01-17 notify = eberlm@in.tum.de abstract =

This article provides formal proofs of basic properties of Mersenne numbers, i. e. numbers of the form 2^n - 1, and especially of Mersenne primes.

In particular, an efficient, verified, and executable version of the Lucas–Lehmer test is developed. This test decides primality for Mersenne numbers in time polynomial in n.
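For orientation, the classical Lucas–Lehmer recurrence can be sketched in a few lines of Python (this only illustrates the underlying algorithm and is not the verified implementation of this entry):

    def lucas_lehmer(p):
        # for an odd prime p, M = 2^p - 1 is prime iff s_{p-2} = 0 where
        # s_0 = 4 and s_{k+1} = s_k^2 - 2 (mod M)
        if p == 2:
            return True                  # M_2 = 3 is prime
        m = (1 << p) - 1
        s = 4
        for _ in range(p - 2):
            s = (s * s - 2) % m
        return s == 0

    for p in [3, 5, 7, 11, 13, 17, 19]:
        print(p, lucas_lehmer(p))        # True except for p = 11 (2047 = 23 * 89)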

[Hoare_Time] title = Hoare Logics for Time Bounds author = Maximilian P. L. Haslbeck , Tobias Nipkow topic = Computer science/Programming languages/Logics date = 2018-02-26 notify = haslbema@in.tum.de abstract = We study three different Hoare logics for reasoning about time bounds of imperative programs and formalize them in Isabelle/HOL: a classical Hoare like logic due to Nielson, a logic with potentials due to Carbonneaux et al. and a separation logic following work by Atkey, Chaguérand and Pottier. These logics are formally shown to be sound and complete. Verification condition generators are developed and are shown sound and complete too. We also consider variants of the systems where we abstract from multiplicative constants in the running time bounds, thus supporting a big-O style of reasoning. Finally we compare the expressive power of the three systems. [Architectural_Design_Patterns] title = A Theory of Architectural Design Patterns author = Diego Marmsoler topic = Computer science/System description languages date = 2018-03-01 notify = diego.marmsoler@tum.de abstract = The following document formalizes and verifies several architectural design patterns. Each pattern specification is formalized in terms of a locale where the locale assumptions correspond to the assumptions which a pattern poses on an architecture. Thus, pattern specifications may build on top of each other by interpreting the corresponding locale. A pattern is verified using the framework provided by the AFP entry Dynamic Architectures. Currently, the document consists of formalizations of 4 different patterns: the singleton, the publisher subscriber, the blackboard pattern, and the blockchain pattern. Thereby, the publisher component of the publisher subscriber pattern is modeled as an instance of the singleton pattern and the blackboard pattern is modeled as an instance of the publisher subscriber pattern. In general, this entry provides the first steps towards an overall theory of architectural design patterns. extra-history = Change history: [2018-05-25]: changing the major assumption for blockchain architectures from alternative minings to relative mining frequencies (revision 5043c5c71685)
[2019-04-08]: adapting the terminology: honest instead of trusted, dishonest instead of untrusted (revision 7af3431a22ae) [Weight_Balanced_Trees] title = Weight-Balanced Trees author = Tobias Nipkow , Stefan Dirix<> topic = Computer science/Data structures date = 2018-03-13 notify = nipkow@in.tum.de abstract = This theory provides a verified implementation of weight-balanced trees following the work of Hirai and Yamamoto who proved that all parameters in a certain range are valid, i.e. guarantee that insertion and deletion preserve weight-balance. Instead of a general theorem we provide parameterized proofs of preservation of the invariant that work for many (all?) valid parameters. [Fishburn_Impossibility] title = The Incompatibility of Fishburn-Strategyproofness and Pareto-Efficiency author = Felix Brandt , Manuel Eberl , Christian Saile , Christian Stricker topic = Mathematics/Games and economics date = 2018-03-22 notify = eberlm@in.tum.de abstract =

This formalisation contains the proof that there is no anonymous Social Choice Function for at least three agents and alternatives that fulfils both Pareto-Efficiency and Fishburn-Strategyproofness. It was derived from a proof of Brandt et al., which relies on an unverified translation of a fixed finite instance of the original problem to SAT. This Isabelle proof contains a machine-checked version of both the statement for exactly three agents and alternatives and the lifting to the general case.

[BNF_CC] title = Bounded Natural Functors with Covariance and Contravariance author = Andreas Lochbihler , Joshua Schneider topic = Computer science/Functional programming, Tools date = 2018-04-24 notify = mail@andreas-lochbihler.de, joshua.schneider@inf.ethz.ch abstract = Bounded natural functors (BNFs) provide a modular framework for the construction of (co)datatypes in higher-order logic. Their functorial operations, the mapper and relator, are restricted to a subset of the parameters, namely those where recursion can take place. For certain applications, such as free theorems, data refinement, quotients, and generalised rewriting, it is desirable that these operations do not ignore the other parameters. In this article, we formalise the generalisation BNFCC that extends the mapper and relator to covariant and contravariant parameters. We show that
  1. BNFCCs are closed under functor composition and least and greatest fixpoints,
  2. subtypes inherit the BNFCC structure under conditions that generalise those for the BNF case, and
  3. BNFCCs preserve quotients under mild conditions.
These proofs are carried out for abstract BNFCCs similar to the AFP entry BNF Operations. In addition, we apply the BNFCC theory to several concrete functors. [Modular_Assembly_Kit_Security] title = An Isabelle/HOL Formalization of the Modular Assembly Kit for Security Properties author = Oliver Bračevac , Richard Gay , Sylvia Grewe , Heiko Mantel , Henning Sudbrock , Markus Tasch topic = Computer science/Security date = 2018-05-07 notify = tasch@mais.informatik.tu-darmstadt.de abstract = The "Modular Assembly Kit for Security Properties" (MAKS) is a framework for both the definition and verification of possibilistic information-flow security properties at the specification-level. MAKS supports the uniform representation of a wide range of possibilistic information-flow properties and provides support for the verification of such properties via unwinding results and compositionality results. We provide a formalization of this framework in Isabelle/HOL. [AxiomaticCategoryTheory] title = Axiom Systems for Category Theory in Free Logic author = Christoph Benzmüller , Dana Scott topic = Mathematics/Category theory date = 2018-05-23 notify = c.benzmueller@gmail.com abstract = This document provides a concise overview on the core results of our previous work on the exploration of axioms systems for category theory. Extending the previous studies (http://arxiv.org/abs/1609.01493) we include one further axiomatic theory in our experiments. This additional theory has been suggested by Mac Lane in 1948. We show that the axioms proposed by Mac Lane are equivalent to the ones we studied before, which includes an axioms set suggested by Scott in the 1970s and another axioms set proposed by Freyd and Scedrov in 1990, which we slightly modified to remedy a minor technical issue. [OpSets] title = OpSets: Sequential Specifications for Replicated Datatypes author = Martin Kleppmann , Victor B. F. Gomes , Dominic P. Mulligan , Alastair R. Beresford topic = Computer science/Algorithms/Distributed, Computer science/Data structures date = 2018-05-10 notify = vb358@cam.ac.uk abstract = We introduce OpSets, an executable framework for specifying and reasoning about the semantics of replicated datatypes that provide eventual consistency in a distributed system, and for mechanically verifying algorithms that implement these datatypes. Our approach is simple but expressive, allowing us to succinctly specify a variety of abstract datatypes, including maps, sets, lists, text, graphs, trees, and registers. Our datatypes are also composable, enabling the construction of complex data structures. To demonstrate the utility of OpSets for analysing replication algorithms, we highlight an important correctness property for collaborative text editing that has traditionally been overlooked; algorithms that do not satisfy this property can exhibit awkward interleaving of text. We use OpSets to specify this correctness property and prove that although one existing replication algorithm satisfies this property, several other published algorithms do not. [Irrationality_J_Hancl] title = Irrational Rapidly Convergent Series author = Angeliki Koutsoukou-Argyraki , Wenda Li topic = Mathematics/Number theory, Mathematics/Analysis date = 2018-05-23 notify = ak2110@cam.ac.uk, wl302@cam.ac.uk abstract = We formalize with Isabelle/HOL a proof of a theorem by J. Hancl asserting the irrationality of the sum of a series consisting of rational numbers, built up by sequences that fulfill certain properties. 
Even though the criterion is a number theoretic result, the proof makes use only of analytical arguments. We also formalize a corollary of the theorem for a specific series fulfilling the assumptions of the theorem. [Optimal_BST] title = Optimal Binary Search Trees author = Tobias Nipkow , Dániel Somogyi <> topic = Computer science/Algorithms, Computer science/Data structures date = 2018-05-27 notify = nipkow@in.tum.de abstract = This article formalizes recursive algorithms for the construction of optimal binary search trees given fixed access frequencies. We follow Knuth (1971), Yao (1980) and Mehlhorn (1984). The algorithms are memoized with the help of the AFP article Monadification, Memoization and Dynamic Programming, thus yielding dynamic programming algorithms. [Projective_Geometry] title = Projective Geometry author = Anthony Bordg topic = Mathematics/Geometry date = 2018-06-14 notify = apdb3@cam.ac.uk abstract = We formalize the basics of projective geometry. In particular, we give a proof of the so-called Hessenberg's theorem in projective plane geometry. We also provide a proof of the so-called Desargues's theorem based on an axiomatization of (higher) projective space geometry using the notion of rank of a matroid. This last approach allows to handle incidence relations in an homogeneous way dealing only with points and without the need of talking explicitly about lines, planes or any higher entity. [Localization_Ring] title = The Localization of a Commutative Ring author = Anthony Bordg topic = Mathematics/Algebra date = 2018-06-14 notify = apdb3@cam.ac.uk abstract = We formalize the localization of a commutative ring R with respect to a multiplicative subset (i.e. a submonoid of R seen as a multiplicative monoid). This localization is itself a commutative ring and we build the natural homomorphism of rings from R to its localization. [Minsky_Machines] title = Minsky Machines author = Bertram Felgenhauer<> topic = Logic/Computability date = 2018-08-14 notify = int-e@gmx.de abstract =

We formalize undecidability results for Minsky machines. To this end, we also formalize recursive inseparability.

We start by proving that Minsky machines can compute arbitrary primitive recursive and recursive functions. We then show that there is a deterministic Minsky machine with one argument and two final states such that the set of inputs that are accepted in one state is recursively inseparable from the set of inputs that are accepted in the other state.

As a corollary, the set of Minsky configurations that reach the first state but not the second is recursively inseparable from the set of Minsky configurations that reach the second state but not the first. In particular, both of these sets are undecidable.

We do not prove that recursive functions can simulate Minsky machines.
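To give an informal impression of the machine model, the following Python sketch interprets counter programs; the instruction encoding is invented for illustration and does not reflect the formalisation:

    # registers hold natural numbers; two kinds of instructions:
    #   ('inc', r, j)     increment register r, go to instruction j
    #   ('dec', r, j, k)  if register r > 0, decrement it and go to j, else go to k
    def run(prog, regs, pc=0, fuel=10_000):
        # fuel bounds the simulation, since halting is undecidable in general
        while 0 <= pc < len(prog) and fuel > 0:
            ins = prog[pc]
            if ins[0] == 'inc':
                _, r, j = ins
                regs[r] += 1
                pc = j
            else:
                _, r, j, k = ins
                if regs[r] > 0:
                    regs[r] -= 1
                    pc = j
                else:
                    pc = k
            fuel -= 1
        return pc, regs

    # example: move register 1 into register 0, i.e. compute r0 + r1
    add = [('dec', 1, 1, 2), ('inc', 0, 0)]
    print(run(add, {0: 3, 1: 4}))   # halts at pc = 2 with r0 = 7, r1 = 0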

[Neumann_Morgenstern_Utility] title = Von-Neumann-Morgenstern Utility Theorem author = Julian Parsert, Cezary Kaliszyk topic = Mathematics/Games and economics license = LGPL date = 2018-07-04 notify = julian.parsert@uibk.ac.at, cezary.kaliszyk@uibk.ac.at abstract = Utility functions form an essential part of game theory and economics. In order to guarantee the existence of utility functions most of the time sufficient properties are assumed in an axiomatic manner. One famous and very common set of such assumptions is that of expected utility theory. Here, the rationality, continuity, and independence of preferences is assumed. The von-Neumann-Morgenstern Utility theorem shows that these assumptions are necessary and sufficient for an expected utility function to exists. This theorem was proven by Neumann and Morgenstern in ``Theory of Games and Economic Behavior'' which is regarded as one of the most influential works in game theory. The formalization includes formal definitions of the underlying concepts including continuity and independence of preferences. [Simplex] title = An Incremental Simplex Algorithm with Unsatisfiable Core Generation author = Filip Marić , Mirko Spasić , René Thiemann topic = Computer science/Algorithms/Optimization date = 2018-08-24 notify = rene.thiemann@uibk.ac.at abstract = We present an Isabelle/HOL formalization and total correctness proof for the incremental version of the Simplex algorithm which is used in most state-of-the-art SMT solvers. It supports extraction of satisfying assignments, extraction of minimal unsatisfiable cores, incremental assertion of constraints and backtracking. The formalization relies on stepwise program refinement, starting from a simple specification, going through a number of refinement steps, and ending up in a fully executable functional implementation. Symmetries present in the algorithm are handled with special care. [Budan_Fourier] title = The Budan-Fourier Theorem and Counting Real Roots with Multiplicity author = Wenda Li topic = Mathematics/Analysis date = 2018-09-02 notify = wl302@cam.ac.uk, liwenda1990@hotmail.com abstract = This entry is mainly about counting and approximating real roots (of a polynomial) with multiplicity. We have first formalised the Budan-Fourier theorem: given a polynomial with real coefficients, we can calculate sign variations on Fourier sequences to over-approximate the number of real roots (counting multiplicity) within an interval. When all roots are known to be real, the over-approximation becomes tight: we can utilise this theorem to count real roots exactly. It is also worth noting that Descartes' rule of sign is a direct consequence of the Budan-Fourier theorem, and has been included in this entry. In addition, we have extended previous formalised Sturm's theorem to count real roots with multiplicity, while the original Sturm's theorem only counts distinct real roots. Compared to the Budan-Fourier theorem, our extended Sturm's theorem always counts roots exactly but may suffer from greater computational cost. [Quaternions] title = Quaternions author = Lawrence C. Paulson topic = Mathematics/Algebra, Mathematics/Geometry date = 2018-09-05 notify = lp15@cam.ac.uk abstract = This theory is inspired by the HOL Light development of quaternions, but follows its own route. Quaternions are developed coinductively, as in the existing formalisation of the complex numbers. Quaternions are quickly shown to belong to the type classes of real normed division algebras and real inner product spaces. 
And therefore they inherit a great body of facts involving algebraic laws, limits, continuity, etc., which must be proved explicitly in the HOL Light version. The development concludes with the geometric interpretation of the product of imaginary quaternions. [Octonions] title = Octonions author = Angeliki Koutsoukou-Argyraki topic = Mathematics/Algebra, Mathematics/Geometry date = 2018-09-14 notify = ak2110@cam.ac.uk abstract = We develop the basic theory of Octonions, including various identities and properties of the octonions and of the octonionic product, a description of 7D isometries and representations of orthogonal transformations. To this end we first develop the theory of the vector cross product in 7 dimensions. The development of the theory of Octonions is inspired by that of the theory of Quaternions by Lawrence Paulson. However, we do not work within the type class real_algebra_1 because the octonionic product is not associative. [Aggregation_Algebras] title = Aggregation Algebras author = Walter Guttmann topic = Mathematics/Algebra date = 2018-09-15 notify = walter.guttmann@canterbury.ac.nz abstract = We develop algebras for aggregation and minimisation for weight matrices and for edge weights in graphs. We verify the correctness of Prim's and Kruskal's minimum spanning tree algorithms based on these algebras. We also show numerous instances of these algebras based on linearly ordered commutative semigroups. [Prime_Number_Theorem] title = The Prime Number Theorem author = Manuel Eberl , Lawrence C. Paulson topic = Mathematics/Number theory date = 2018-09-19 notify = eberlm@in.tum.de abstract =

This article provides a short proof of the Prime Number Theorem in several equivalent forms, most notably π(x) ~ x/ln x where π(x) is the number of primes no larger than x. It also defines other basic number-theoretic functions related to primes like Chebyshev's functions ϑ and ψ and the “n-th prime number” function p_n. Various bounds and relationships between these functions are also shown. Lastly, we derive Mertens' First and Second Theorem, i. e. ∑_{p≤x} ln p/p = ln x + O(1) and ∑_{p≤x} 1/p = ln ln x + M + O(1/ln x). We also give explicit bounds for the remainder terms.
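As an informal numerical illustration of Mertens' First Theorem (not part of the entry), one can check that ∑_{p≤x} ln p/p − ln x stays bounded as x grows; a small Python sketch:

    from math import log

    def primes_upto(n):
        sieve = [True] * (n + 1)
        sieve[0:2] = [False, False]
        for i in range(2, int(n ** 0.5) + 1):
            if sieve[i]:
                sieve[i*i::i] = [False] * len(sieve[i*i::i])
        return [p for p, is_p in enumerate(sieve) if is_p]

    for x in [10**3, 10**4, 10**5]:
        s = sum(log(p) / p for p in primes_upto(x))
        print(x, s - log(x))   # stays bounded, as Mertens' First Theorem asserts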

The proof of the Prime Number Theorem builds on a library of Dirichlet series and analytic combinatorics. We essentially follow the presentation by Newman. The core part of the proof is a Tauberian theorem for Dirichlet series, which is proven using complex analysis and then used to strengthen Mertens' First Theorem to ∑_{p≤x} ln p/p = ln x + c + o(1).

A variant of this proof has been formalised before by Harrison in HOL Light, and formalisations of Selberg's elementary proof exist both by Avigad et al. in Isabelle and by Carneiro in Metamath. The advantage of the analytic proof is that, while it requires more powerful mathematical tools, it is considerably shorter and clearer. This article attempts to provide a short and clear formalisation of all components of that proof using the full range of mathematical machinery available in Isabelle, staying as close as possible to Newman's simple paper proof.

[Signature_Groebner] title = Signature-Based Gröbner Basis Algorithms author = Alexander Maletzky topic = Mathematics/Algebra, Computer science/Algorithms/Mathematical date = 2018-09-20 notify = alexander.maletzky@risc.jku.at abstract =

This article formalizes signature-based algorithms for computing Gröbner bases. Such algorithms are, in general, superior to other algorithms in terms of efficiency, and have not been formalized in any proof assistant so far. The present development is both generic, in the sense that most known variants of signature-based algorithms are covered by it, and effectively executable on concrete input thanks to Isabelle's code generator. Sample computations of benchmark problems show that the verified implementation of signature-based algorithms indeed outperforms the existing implementation of Buchberger's algorithm in Isabelle/HOL.

Besides total correctness of the algorithms, the article also proves that under certain conditions they a-priori detect and avoid all useless zero-reductions, and always return 'minimal' (in some sense) Gröbner bases if an input parameter is chosen in the right way.

The formalization follows the recent survey article by Eder and Faugère.

[Factored_Transition_System_Bounding] title = Upper Bounding Diameters of State Spaces of Factored Transition Systems author = Friedrich Kurz <>, Mohammad Abdulaziz topic = Computer science/Automata and formal languages, Mathematics/Graph theory date = 2018-10-12 notify = friedrich.kurz@tum.de, mohammad.abdulaziz@in.tum.de abstract = A completeness threshold is required to guarantee the completeness of planning as satisfiability, and bounded model checking of safety properties. One valid completeness threshold is the diameter of the underlying transition system. The diameter is the maximum element in the set of lengths of all shortest paths between pairs of states. The diameter is not calculated exactly in our setting, where the transition system is succinctly described using a (propositionally) factored representation. Rather, an upper bound on the diameter is calculated compositionally, by bounding the diameters of small abstract subsystems, and then composing those. We port a HOL4 formalisation of a compositional algorithm for computing a relatively tight upper bound on the system diameter. This compositional algorithm exploits acyclicity in the state space to achieve compositionality, and it was introduced by Abdulaziz et al. The formalisation that we port is described as a part of another paper by Abdulaziz et al. As a part of this porting, we developed a library about transition systems, which shall be of use in future related mechanisation efforts. [Smooth_Manifolds] title = Smooth Manifolds author = Fabian Immler , Bohua Zhan topic = Mathematics/Analysis, Mathematics/Topology date = 2018-10-22 notify = immler@in.tum.de, bzhan@ios.ac.cn abstract = We formalize the definition and basic properties of smooth manifolds in Isabelle/HOL. Concepts covered include partition of unity, tangent and cotangent spaces, and the fundamental theorem of path integrals. We also examine some concrete manifolds such as spheres and projective spaces. The formalization makes extensive use of the analysis and linear algebra libraries in Isabelle/HOL, in particular its “types-to-sets” mechanism. [Matroids] title = Matroids author = Jonas Keinholz<> topic = Mathematics/Combinatorics date = 2018-11-16 notify = eberlm@in.tum.de abstract =

This article defines the combinatorial structures known as Independence Systems and Matroids and provides basic concepts and theorems related to them. These structures play an important role in combinatorial optimisation, e. g. greedy algorithms such as Kruskal's algorithm. The development is based on Oxley's `What is a Matroid?'.
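To illustrate the connection to greedy algorithms (an informal Python sketch, not part of the formalisation; the function names are invented), the generic greedy algorithm over an independence oracle returns a maximum-weight basis whenever the independence system is a matroid; instantiated with the graphic matroid it becomes Kruskal's algorithm.

    def greedy_max_weight_basis(elements, weight, independent):
        # scan elements by decreasing weight, keeping those that preserve independence
        basis = set()
        for e in sorted(elements, key=weight, reverse=True):
            if independent(basis | {e}):
                basis.add(e)
        return basis

    # toy example: the uniform matroid U_{2,4}, where a set is independent
    # iff it has at most 2 elements
    elems = ['a', 'b', 'c', 'd']
    w = {'a': 3, 'b': 1, 'c': 4, 'd': 2}.__getitem__
    print(greedy_max_weight_basis(elems, w, lambda s: len(s) <= 2))   # {'a', 'c'}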

[Graph_Saturation] title = Graph Saturation author = Sebastiaan J. C. Joosten<> topic = Logic/Rewriting, Mathematics/Graph theory date = 2018-11-23 notify = sjcjoosten@gmail.com abstract = This is an Isabelle/HOL formalisation of graph saturation, closely following a paper by the author on graph saturation. Nine out of ten lemmas of the original paper are proven in this formalisation. The formalisation additionally includes two theorems that show the main premise of the paper: that consistency and entailment are decided through graph saturation. This formalisation does not give executable code, and it did not implement any of the optimisations suggested in the paper. [Functional_Ordered_Resolution_Prover] title = A Verified Functional Implementation of Bachmair and Ganzinger's Ordered Resolution Prover author = Anders Schlichtkrull , Jasmin Christian Blanchette , Dmitriy Traytel topic = Logic/General logic/Mechanization of proofs date = 2018-11-23 notify = andschl@dtu.dk,j.c.blanchette@vu.nl,traytel@inf.ethz.ch abstract = This Isabelle/HOL formalization refines the abstract ordered resolution prover presented in Section 4.3 of Bachmair and Ganzinger's "Resolution Theorem Proving" chapter in the Handbook of Automated Reasoning. The result is a functional implementation of a first-order prover. [Auto2_HOL] title = Auto2 Prover author = Bohua Zhan topic = Tools date = 2018-11-20 notify = bzhan@ios.ac.cn abstract = Auto2 is a saturation-based heuristic prover for higher-order logic, implemented as a tactic in Isabelle. This entry contains the instantiation of auto2 for Isabelle/HOL, along with two basic examples: solutions to some of the Pelletier’s problems, and elementary number theory of primes. [Order_Lattice_Props] title = Properties of Orderings and Lattices author = Georg Struth topic = Mathematics/Order date = 2018-12-11 notify = g.struth@sheffield.ac.uk abstract = These components add further fundamental order and lattice-theoretic concepts and properties to Isabelle's libraries. They follow by and large the introductory sections of the Compendium of Continuous Lattices, covering directed and filtered sets, down-closed and up-closed sets, ideals and filters, Galois connections, closure and co-closure operators. Some emphasis is on duality and morphisms between structures, as in the Compendium. To this end, three ad-hoc approaches to duality are compared. [Quantales] title = Quantales author = Georg Struth topic = Mathematics/Algebra date = 2018-12-11 notify = g.struth@sheffield.ac.uk abstract = These mathematical components formalise basic properties of quantales, together with some important models, constructions, and concepts, including quantic nuclei and conuclei. [Transformer_Semantics] title = Transformer Semantics author = Georg Struth topic = Mathematics/Algebra, Computer science/Semantics date = 2018-12-11 notify = g.struth@sheffield.ac.uk abstract = These mathematical components formalise predicate transformer semantics for programs, yet currently only for partial correctness and in the absence of faults. A first part for isotone (or monotone), Sup-preserving and Inf-preserving transformers follows Back and von Wright's approach, with additional emphasis on the quantalic structure of algebras of transformers. 
The second part develops Sup-preserving and Inf-preserving predicate transformers from the powerset monad, via its Kleisli category and Eilenberg-Moore algebras, with emphasis on adjunctions and dualities, as well as isomorphisms between relations, state transformers and predicate transformers. [Concurrent_Revisions] title = Formalization of Concurrent Revisions author = Roy Overbeek topic = Computer science/Concurrency date = 2018-12-25 notify = Roy.Overbeek@cwi.nl abstract = Concurrent revisions is a concurrency control model developed by Microsoft Research. It has many interesting properties that distinguish it from other well-known models such as transactional memory. One of these properties is determinacy: programs written within the model always produce the same outcome, independent of scheduling activity. The concurrent revisions model has an operational semantics, with an informal proof of determinacy. This document contains an Isabelle/HOL formalization of this semantics and the proof of determinacy. [Core_DOM] title = A Formal Model of the Document Object Model author = Achim D. Brucker , Michael Herzberg topic = Computer science/Data structures date = 2018-12-26 notify = adbrucker@0x5f.org abstract = In this AFP entry, we formalize the core of the Document Object Model (DOM). At its core, the DOM defines a tree-like data structure for representing documents in general and HTML documents in particular. It is the heart of any modern web browser. Formalizing the key concepts of the DOM is a prerequisite for the formal reasoning over client-side JavaScript programs and for the analysis of security concepts in modern web browsers. We present a formalization of the core DOM, with focus on the node-tree and the operations defined on node-trees, in Isabelle/HOL. We use the formalization to verify the functional correctness of the most important functions defined in the DOM standard. Moreover, our formalization is 1) extensible, i.e., can be extended without the need of re-proving already proven properties and 2) executable, i.e., we can generate executable code from our specification. [Core_SC_DOM] title = The Safely Composable DOM author = Achim D. Brucker , Michael Herzberg topic = Computer science/Data structures date = 2020-09-28 notify = adbrucker@0x5f.org, mail@michael-herzberg.de abstract = In this AFP entry, we formalize the core of the Safely Composable Document Object Model (SC DOM). The SC DOM improve the standard DOM (as formalized in the AFP entry "Core DOM") by strengthening the tree boundaries set by shadow roots: in the SC DOM, the shadow root is a sub-class of the document class (instead of a base class). This modifications also results in changes to some API methods (e.g., getOwnerDocument) to return the nearest shadow root rather than the document root. As a result, many API methods that, when called on a node inside a shadow tree, would previously ``break out'' and return or modify nodes that are possibly outside the shadow tree, now stay within its boundaries. This change in behavior makes programs that operate on shadow trees more predictable for the developer and allows them to make more assumptions about other code accessing the DOM. [Shadow_SC_DOM] title = A Formal Model of the Safely Composable Document Object Model with Shadow Roots author = Achim D. 
Brucker , Michael Herzberg topic = Computer science/Data structures date = 2020-09-28 notify = adbrucker@0x5f.org, mail@michael-herzberg.de abstract = In this AFP entry, we extend our formalization of the safely composable DOM with Shadow Roots. This is a proposal for Shadow Roots with stricter safety guarantees than the standard-compliant formalization (see "Shadow DOM"). Shadow Roots are a recent proposal of the web community to support a component-based development approach for client-side web applications. Shadow roots are a significant extension to the DOM standard and, as web standards are condemned to be backward compatible, such extensions often result in complex specifications that may contain unwanted subtleties that can be detected by a formalization. Our Isabelle/HOL formalization is, in the sense of object-orientation, an extension of our formalization of the core DOM and enjoys the same basic properties, i.e., it is extensible (it can be extended without the need of re-proving already proven properties) and executable (we can generate executable code from our specification). We exploit the executability to show that our formalization complies with the official standard of the W3C, respectively, the WHATWG. [SC_DOM_Components] title = A Formalization of Safely Composable Web Components author = Achim D. Brucker , Michael Herzberg topic = Computer science/Data structures date = 2020-09-28 notify = adbrucker@0x5f.org, mail@michael-herzberg.de abstract = While the (safely composable) DOM with shadow trees provides the technical basis for defining web components, it neither defines the concept of web components nor specifies the safety properties that web components should guarantee. Consequently, the standard also does not discuss how or even if the methods for modifying the DOM respect component boundaries. In this AFP entry, we present a formally verified model of safely composable web components and define safety properties which ensure that different web components can only interact with each other using well-defined interfaces. Moreover, our verification of the application programming interface (API) of the DOM revealed numerous invariants that implementations of the DOM API need to preserve to ensure the integrity of components. In comparison to the strict standard compliance formalization of Web Components in the AFP entry "DOM_Components", the notion of components in this entry (based on "SC_DOM" and "Shadow_SC_DOM") provides much stronger safety guarantees. [Store_Buffer_Reduction] title = A Reduction Theorem for Store Buffers author = Ernie Cohen , Norbert Schirmer topic = Computer science/Concurrency date = 2019-01-07 notify = norbert.schirmer@web.de abstract = When verifying a concurrent program, it is usual to assume that memory is sequentially consistent. However, most modern multiprocessors depend on store buffering for efficiency, and provide native sequential consistency only at a substantial performance penalty. To regain sequential consistency, a programmer has to follow an appropriate programming discipline. However, naïve disciplines, such as protecting all shared accesses with locks, are not flexible enough for building high-performance multiprocessor software. We present a new discipline for concurrent programming under TSO (total store order, with store buffer forwarding). It does not depend on concurrency primitives, such as locks. Instead, threads use ghost operations to acquire and release ownership of memory addresses.
A thread can write to an address only if no other thread owns it, and can read from an address only if it owns it or it is shared and the thread has flushed its store buffer since it last wrote to an address it did not own. This discipline covers both coarse-grained concurrency (where data is protected by locks) as well as fine-grained concurrency (where atomic operations race to memory). We formalize this discipline in Isabelle/HOL, and prove that if every execution of a program in a system without store buffers follows the discipline, then every execution of the program with store buffers is sequentially consistent. Thus, we can show sequential consistency under TSO by ordinary assertional reasoning about the program, without having to consider store buffers at all. [IMP2] title = IMP2 – Simple Program Verification in Isabelle/HOL author = Peter Lammich , Simon Wimmer topic = Computer science/Programming languages/Logics, Computer science/Algorithms date = 2019-01-15 notify = lammich@in.tum.de abstract = IMP2 is a simple imperative language together with Isabelle tooling to create a program verification environment in Isabelle/HOL. The tools include a C-like syntax, a verification condition generator, and Isabelle commands for the specification of programs. The framework is modular, i.e., it allows easy reuse of already proved programs within larger programs. This entry comes with a quickstart guide and a large collection of examples, spanning basic algorithms with simple proofs to more advanced algorithms and proof techniques like data refinement. Some highlights from the examples are:
  • Bisection Square Root,
  • Extended Euclid,
  • Exponentiation by Squaring,
  • Binary Search,
  • Insertion Sort,
  • Quicksort,
  • Depth First Search.
The abstract syntax and semantics are very simple and well-documented. They are suitable to be used in a course, as extension to the IMP language which comes with the Isabelle distribution. While this entry is limited to a simple imperative language, the ideas could be extended to more sophisticated languages. [Farkas] title = Farkas' Lemma and Motzkin's Transposition Theorem author = Ralph Bottesch , Max W. Haslbeck , René Thiemann topic = Mathematics/Algebra date = 2019-01-17 notify = rene.thiemann@uibk.ac.at abstract = We formalize a proof of Motzkin's transposition theorem and Farkas' lemma in Isabelle/HOL. Our proof is based on the formalization of the simplex algorithm which, given a set of linear constraints, either returns a satisfying assignment to the problem or detects unsatisfiability. By reusing facts about the simplex algorithm we show that a set of linear constraints is unsatisfiable if and only if there is a linear combination of the constraints which evaluates to a trivially unsatisfiable inequality. [Auto2_Imperative_HOL] title = Verifying Imperative Programs using Auto2 author = Bohua Zhan topic = Computer science/Algorithms, Computer science/Data structures date = 2018-12-21 notify = bzhan@ios.ac.cn abstract = This entry contains the application of auto2 to verifying functional and imperative programs. Algorithms and data structures that are verified include linked lists, binary search trees, red-black trees, interval trees, priority queue, quicksort, union-find, Dijkstra's algorithm, and a sweep-line algorithm for detecting rectangle intersection. The imperative verification is based on Imperative HOL and its separation logic framework. A major goal of this work is to set up automation in order to reduce the length of proof that the user needs to provide, both for verifying functional programs and for working with separation logic. [UTP] title = Isabelle/UTP: Mechanised Theory Engineering for Unifying Theories of Programming author = Simon Foster , Frank Zeyda<>, Yakoub Nemouchi , Pedro Ribeiro<>, Burkhart Wolff topic = Computer science/Programming languages/Logics date = 2019-02-01 notify = simon.foster@york.ac.uk abstract = Isabelle/UTP is a mechanised theory engineering toolkit based on Hoare and He’s Unifying Theories of Programming (UTP). UTP enables the creation of denotational, algebraic, and operational semantics for different programming languages using an alphabetised relational calculus. We provide a semantic embedding of the alphabetised relational calculus in Isabelle/HOL, including new type definitions, relational constructors, automated proof tactics, and accompanying algebraic laws. Isabelle/UTP can be used to both capture laws of programming for different languages, and put these fundamental theorems to work in the creation of associated verification tools, using calculi like Hoare logics. This document describes the relational core of the UTP in Isabelle/HOL. [HOL-CSP] title = HOL-CSP Version 2.0 author = Safouan Taha , Lina Ye , Burkhart Wolff topic = Computer science/Concurrency/Process calculi, Computer science/Semantics date = 2019-04-26 notify = wolff@lri.fr abstract = This is a complete formalization of the work of Hoare and Roscoe on the denotational semantics of the Failure/Divergence Model of CSP. It follows essentially the presentation of CSP in Roscoe’s Book ”Theory and Practice of Concurrency” [8] and the semantic details in a joint Paper of Roscoe and Brooks ”An improved failures model for communicating processes". 
The present work is based on a prior formalization attempt, called HOL-CSP 1.0, done in 1997 by H. Tej and B. Wolff with the Isabelle proof technology available at that time. This work revealed minor, but omnipresent foundational errors in key concepts like the process invariant. The present version HOL-CSP profits from substantially improved libraries (notably HOLCF), improved automated proof techniques, and structured proof techniques in Isar and is substantially shorter but more complete. [Probabilistic_Prime_Tests] title = Probabilistic Primality Testing author = Daniel Stüwe<>, Manuel Eberl topic = Mathematics/Number theory date = 2019-02-11 notify = eberlm@in.tum.de abstract =

The most efficient known primality tests are probabilistic in the sense that they use randomness and may, with some probability, mistakenly classify a composite number as prime – but never a prime number as composite. Examples of this are the Miller–Rabin test, the Solovay–Strassen test, and (in most cases) Fermat's test.

This entry defines these three tests and proves their correctness. It also develops some of the number-theoretic foundations, such as Carmichael numbers and the Jacobi symbol with an efficient executable algorithm to compute it.
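For orientation, a rough Python sketch of the Miller–Rabin test mentioned above (illustrative only; this is not the verified Isabelle/HOL definition):

    import random

    def miller_rabin(n, rounds=20):
        # may wrongly call a composite number 'prime', but never a prime 'composite'
        if n < 2:
            return False
        if n in (2, 3):
            return True
        if n % 2 == 0:
            return False
        s, d = 0, n - 1                  # write n - 1 = 2^s * d with d odd
        while d % 2 == 0:
            s, d = s + 1, d // 2
        for _ in range(rounds):
            a = random.randrange(2, n - 1)
            x = pow(a, d, n)
            if x in (1, n - 1):
                continue
            for _ in range(s - 1):
                x = pow(x, 2, n)
                if x == n - 1:
                    break
            else:
                return False             # a witnesses compositeness
        return True                      # probably prime

    print([n for n in range(2, 40) if miller_rabin(n)])   # the primes below 40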

[Kruskal] title = Kruskal's Algorithm for Minimum Spanning Forest author = Maximilian P.L. Haslbeck , Peter Lammich , Julian Biendarra<> topic = Computer science/Algorithms/Graph date = 2019-02-14 notify = haslbema@in.tum.de, lammich@in.tum.de abstract = This Isabelle/HOL formalization defines a greedy algorithm for finding a minimum weight basis on a weighted matroid and proves its correctness. This algorithm is an abstract version of Kruskal's algorithm. We interpret the abstract algorithm for the cycle matroid (i.e. forests in a graph) and refine it to imperative executable code using an efficient union-find data structure. Our formalization can be instantiated for different graph representations. We provide instantiations for undirected graphs and symmetric directed graphs. [List_Inversions] title = The Inversions of a List author = Manuel Eberl topic = Computer science/Algorithms date = 2019-02-01 notify = eberlm@in.tum.de abstract =

This entry defines the set of inversions of a list, i.e. the pairs of indices that violate sortedness. It also proves the correctness of the well-known O(n log n) divide-and-conquer algorithm to compute the number of inversions.
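A rough Python sketch of that divide-and-conquer algorithm (for illustration only; the verified development is in Isabelle/HOL): the inversions are counted during the merge step of a merge sort.

    def count_inversions(xs):
        # returns (sorted list, number of pairs i < j with xs[i] > xs[j])
        if len(xs) <= 1:
            return xs, 0
        mid = len(xs) // 2
        left, a = count_inversions(xs[:mid])
        right, b = count_inversions(xs[mid:])
        merged, c, i, j = [], 0, 0, 0
        while i < len(left) and j < len(right):
            if left[i] <= right[j]:
                merged.append(left[i]); i += 1
            else:
                merged.append(right[j]); j += 1
                c += len(left) - i       # every remaining left element is an inversion
        merged += left[i:] + right[j:]
        return merged, a + b + c

    print(count_inversions([3, 1, 4, 1, 5, 9, 2, 6])[1])   # 8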

[Prime_Distribution_Elementary] title = Elementary Facts About the Distribution of Primes author = Manuel Eberl topic = Mathematics/Number theory date = 2019-02-21 notify = eberlm@in.tum.de abstract =

This entry is a formalisation of Chapter 4 (and parts of Chapter 3) of Apostol's Introduction to Analytic Number Theory. The main topics that are addressed are properties of the distribution of prime numbers that can be shown in an elementary way (i. e. without the Prime Number Theorem), the various equivalent forms of the PNT (which imply each other in elementary ways), and consequences that follow from the PNT in elementary ways. The latter include, most notably, asymptotic bounds for the number of distinct prime factors of n, the divisor function d(n), Euler's totient function φ(n), and lcm(1,…,n).
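For reference, naive Python definitions of some of the arithmetic functions whose asymptotics are studied here (illustrative only, not part of the entry):

    from math import gcd

    def omega(n):       # number of distinct prime factors of n
        count, p = 0, 2
        while p * p <= n:
            if n % p == 0:
                count += 1
                while n % p == 0:
                    n //= p
            p += 1
        return count + (1 if n > 1 else 0)

    def d(n):           # number of divisors of n
        return sum(1 for k in range(1, n + 1) if n % k == 0)

    def phi(n):         # Euler's totient function
        return sum(1 for k in range(1, n + 1) if gcd(k, n) == 1)

    print(omega(360), d(360), phi(360))   # 3, 24, 96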

[Safe_OCL] title = Safe OCL author = Denis Nikiforov <> topic = Computer science/Programming languages/Language definitions license = LGPL date = 2019-03-09 notify = denis.nikif@gmail.com abstract =

The theory is a formalization of the OCL type system, its abstract syntax and expression typing rules. The theory does not define a concrete syntax or a semantics. In contrast to Featherweight OCL, it is based on a deep embedding approach. The type system is defined from scratch; it is not based on the Isabelle/HOL type system.

Safe OCL distinguishes between nullable and non-nullable types. The theory also gives a formal definition of safe navigation operations. The Safe OCL typing rules are much stricter than the rules given in the OCL specification, which allows one to catch more errors during type checking.

The type theory presented is four-layered: classes, basic types, generic types, errorable types. We introduce the following new types: non-nullable types (T[1]), nullable types (T[?]), OclSuper. OclSuper is a supertype of all other types (basic types, collections, tuples). This type allows us to define a total supremum function, so types form an upper semilattice. It allows us to define rich expression typing rules in an elegant manner.
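As a toy illustration of the supremum idea (a Python sketch with an invented miniature class hierarchy; it does not reflect the actual definitions of the theory): a type is a base class together with a nullability flag, OclSuper is the top element, and the supremum joins both components, so these toy types form an upper semilattice.

    # invented example hierarchy: Integer <: Real <: OclSuper, Boolean <: OclSuper
    SUPER = {'Integer': 'Real', 'Real': 'OclSuper',
             'Boolean': 'OclSuper', 'OclSuper': None}

    def ancestors(t):
        chain = [t]
        while SUPER[t] is not None:
            t = SUPER[t]
            chain.append(t)
        return chain

    def sup(t1, t2):
        # a type is (base class, nullable?); join base classes and nullability
        (b1, null1), (b2, null2) = t1, t2
        base = next(b for b in ancestors(b1) if b in ancestors(b2))
        return (base, null1 or null2)

    print(sup(('Integer', False), ('Real', True)))       # ('Real', True)
    print(sup(('Integer', False), ('Boolean', False)))   # ('OclSuper', False)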

The Preliminaries Chapter of the theory defines a number of helper lemmas for transitive closures and tuples. It also defines a generic object model independent of OCL. It allows one to use the theory as a reference for formalization of analogous languages.

[QHLProver] title = Quantum Hoare Logic author = Junyi Liu<>, Bohua Zhan , Shuling Wang<>, Shenggang Ying<>, Tao Liu<>, Yangjia Li<>, Mingsheng Ying<>, Naijun Zhan<> topic = Computer science/Programming languages/Logics, Computer science/Semantics date = 2019-03-24 notify = bzhan@ios.ac.cn abstract = We formalize quantum Hoare logic as given in [1]. In particular, we specify the syntax and denotational semantics of a simple model of quantum programs. Then, we write down the rules of quantum Hoare logic for partial correctness, and show the soundness and completeness of the resulting proof system. As an application, we verify the correctness of Grover’s algorithm. [Transcendence_Series_Hancl_Rucki] title = The Transcendence of Certain Infinite Series author = Angeliki Koutsoukou-Argyraki , Wenda Li topic = Mathematics/Analysis, Mathematics/Number theory date = 2019-03-27 notify = wl302@cam.ac.uk, ak2110@cam.ac.uk abstract = We formalize the proofs of two transcendence criteria by J. Hančl and P. Rucki that assert the transcendence of the sums of certain infinite series built up by sequences that fulfil certain properties. Both proofs make use of Roth's celebrated theorem on diophantine approximations to algebraic numbers from 1955 which we implement as an assumption without having formalised its proof. [Binding_Syntax_Theory] title = A General Theory of Syntax with Bindings author = Lorenzo Gheri , Andrei Popescu topic = Computer science/Programming languages/Lambda calculi, Computer science/Functional programming, Logic/General logic/Mechanization of proofs date = 2019-04-06 notify = a.popescu@mdx.ac.uk, lor.gheri@gmail.com abstract = We formalize a theory of syntax with bindings that has been developed and refined over the last decade to support several large formalization efforts. Terms are defined for an arbitrary number of constructors of varying numbers of inputs, quotiented to alpha-equivalence and sorted according to a binding signature. The theory includes many properties of the standard operators on terms: substitution, swapping and freshness. It also includes bindings-aware induction and recursion principles and support for semantic interpretation. This work has been presented in the ITP 2017 paper “A Formalized General Theory of Syntax with Bindings”. [LTL_Master_Theorem] title = A Compositional and Unified Translation of LTL into ω-Automata author = Benedikt Seidl , Salomon Sickert topic = Computer science/Automata and formal languages date = 2019-04-16 notify = benedikt.seidl@tum.de, s.sickert@tum.de abstract = We present a formalisation of the unified translation approach of linear temporal logic (LTL) into ω-automata from [1]. This approach decomposes LTL formulas into ``simple'' languages and allows a clear separation of concerns: first, we formalise the purely logical result yielding this decomposition; second, we instantiate this generic theory to obtain a construction for deterministic (state-based) Rabin automata (DRA). We extract from this particular instantiation an executable tool translating LTL to DRAs. To the best of our knowledge this is the first verified translation from LTL to DRAs that is proven to be double exponential in the worst case which asymptotically matches the known lower bound.

[1] Javier Esparza, Jan Kretínský, Salomon Sickert. One Theorem to Rule Them All: A Unified Translation of LTL into ω-Automata. LICS 2018 [LambdaAuth] title = Formalization of Generic Authenticated Data Structures author = Matthias Brun<>, Dmitriy Traytel topic = Computer science/Security, Computer science/Programming languages/Lambda calculi date = 2019-05-14 notify = traytel@inf.ethz.ch abstract = Authenticated data structures are a technique for outsourcing data storage and maintenance to an untrusted server. The server is required to produce an efficiently checkable and cryptographically secure proof that it carried out precisely the requested computation. Miller et al. introduced λ• (pronounced lambda auth)—a functional programming language with a built-in primitive authentication construct, which supports a wide range of user-specified authenticated data structures while guaranteeing certain correctness and security properties for all well-typed programs. We formalize λ• and prove its correctness and security properties. With Isabelle's help, we uncover and repair several mistakes in the informal proofs and lemma statements. Our findings are summarized in a paper draft. [IMP2_Binary_Heap] title = Binary Heaps for IMP2 author = Simon Griebel<> topic = Computer science/Data structures, Computer science/Algorithms date = 2019-06-13 notify = s.griebel@tum.de abstract = In this submission array-based binary minimum heaps are formalized. The correctness of the following heap operations is proved: insert, get-min, delete-min and make-heap. These are then used to verify an in-place heapsort. The formalization is based on IMP2, an imperative program verification framework implemented in Isabelle/HOL. The verified heap functions are iterative versions of the partly recursive functions found in "Algorithms and Data Structures – The Basic Toolbox" by K. Mehlhorn and P. Sanders and "Introduction to Algorithms" by T. H. Cormen, C. E. Leiserson, R. L. Rivest and C. Stein. [Groebner_Macaulay] title = Gröbner Bases, Macaulay Matrices and Dubé's Degree Bounds author = Alexander Maletzky topic = Mathematics/Algebra date = 2019-06-15 notify = alexander.maletzky@risc.jku.at abstract = This entry formalizes the connection between Gröbner bases and Macaulay matrices (sometimes also referred to as `generalized Sylvester matrices'). In particular, it contains a method for computing Gröbner bases, which proceeds by first constructing some Macaulay matrix of the initial set of polynomials, then row-reducing this matrix, and finally converting the result back into a set of polynomials. The output is shown to be a Gröbner basis if the Macaulay matrix constructed in the first step is sufficiently large. In order to obtain concrete upper bounds on the size of the matrix (and hence turn the method into an effectively executable algorithm), Dubé's degree bounds on Gröbner bases are utilized; consequently, they are also part of the formalization. [Linear_Inequalities] title = Linear Inequalities author = Ralph Bottesch , Alban Reynaud <>, René Thiemann topic = Mathematics/Algebra date = 2019-06-21 notify = rene.thiemann@uibk.ac.at abstract = We formalize results about linear inqualities, mainly from Schrijver's book. The main results are the proof of the fundamental theorem on linear inequalities, Farkas' lemma, Carathéodory's theorem, the Farkas-Minkowsky-Weyl theorem, the decomposition theorem of polyhedra, and Meyer's result that the integer hull of a polyhedron is a polyhedron itself. 
Several theorems include bounds on the appearing numbers, and in particular we provide an a-priori bound on mixed-integer solutions of linear inequalities. [Linear_Programming] title = Linear Programming author = Julian Parsert , Cezary Kaliszyk topic = Mathematics/Algebra date = 2019-08-06 notify = julian.parsert@gmail.com, cezary.kaliszyk@uibk.ac.at abstract = We use the previous formalization of the general simplex algorithm to formulate an algorithm for solving linear programs. We encode the linear programs using only linear constraints. Solving these constraints also solves the original linear program. This algorithm is proven to be sound by applying the weak duality theorem which is also part of this formalization. [Differential_Game_Logic] title = Differential Game Logic author = André Platzer topic = Computer science/Programming languages/Logics date = 2019-06-03 notify = aplatzer@cs.cmu.edu abstract = This formalization provides differential game logic (dGL), a logic for proving properties of hybrid game. In addition to the syntax and semantics, it formalizes a uniform substitution calculus for dGL. Church's uniform substitutions substitute a term or formula for a function or predicate symbol everywhere. The uniform substitutions for dGL also substitute hybrid games for a game symbol everywhere. We prove soundness of one-pass uniform substitutions and the axioms of differential game logic with respect to their denotational semantics. One-pass uniform substitutions are faster by postponing soundness-critical admissibility checks with a linear pass homomorphic application and regain soundness by a variable condition at the replacements. The formalization is based on prior non-mechanized soundness proofs for dGL. [Complete_Non_Orders] title = Complete Non-Orders and Fixed Points author = Akihisa Yamada , Jérémy Dubut topic = Mathematics/Order date = 2019-06-27 notify = akihisayamada@nii.ac.jp, dubut@nii.ac.jp abstract = We develop an Isabelle/HOL library of order-theoretic concepts, such as various completeness conditions and fixed-point theorems. We keep our formalization as general as possible: we reprove several well-known results about complete orders, often without any properties of ordering, thus complete non-orders. In particular, we generalize the Knaster–Tarski theorem so that we ensure the existence of a quasi-fixed point of monotone maps over complete non-orders, and show that the set of quasi-fixed points is complete under a mild condition—attractivity—which is implied by either antisymmetry or transitivity. This result generalizes and strengthens a result by Stauti and Maaden. Finally, we recover Kleene’s fixed-point theorem for omega-complete non-orders, again using attractivity to prove that Kleene’s fixed points are least quasi-fixed points. [Priority_Search_Trees] title = Priority Search Trees author = Peter Lammich , Tobias Nipkow topic = Computer science/Data structures date = 2019-06-25 notify = lammich@in.tum.de abstract = We present a new, purely functional, simple and efficient data structure combining a search tree and a priority queue, which we call a priority search tree. The salient feature of priority search trees is that they offer a decrease-key operation, something that is missing from other simple, purely functional priority queue implementations. Priority search trees can be implemented on top of any search tree. This entry does the implementation for red-black trees. 
This entry formalizes the first part of our ITP-2019 proof pearl Purely Functional, Simple and Efficient Priority Search Trees and Applications to Prim and Dijkstra. [Prim_Dijkstra_Simple] title = Purely Functional, Simple, and Efficient Implementation of Prim and Dijkstra author = Peter Lammich , Tobias Nipkow topic = Computer science/Algorithms/Graph date = 2019-06-25 notify = lammich@in.tum.de abstract = We verify purely functional, simple and efficient implementations of Prim's and Dijkstra's algorithms. This constitutes the first verification of an executable and even efficient version of Prim's algorithm. This entry formalizes the second part of our ITP-2019 proof pearl Purely Functional, Simple and Efficient Priority Search Trees and Applications to Prim and Dijkstra. [MFOTL_Monitor] title = Formalization of a Monitoring Algorithm for Metric First-Order Temporal Logic author = Joshua Schneider , Dmitriy Traytel topic = Computer science/Algorithms, Logic/General logic/Temporal logic, Computer science/Automata and formal languages date = 2019-07-04 notify = joshua.schneider@inf.ethz.ch, traytel@inf.ethz.ch abstract = A monitor is a runtime verification tool that solves the following problem: Given a stream of time-stamped events and a policy formulated in a specification language, decide whether the policy is satisfied at every point in the stream. We verify the correctness of an executable monitor for specifications given as formulas in metric first-order temporal logic (MFOTL), an expressive extension of linear temporal logic with real-time constraints and first-order quantification. The verified monitor implements a simplified variant of the algorithm used in the efficient MonPoly monitoring tool. The formalization is presented in a RV 2019 paper, which also compares the output of the verified monitor to that of other monitoring tools on randomly generated inputs. This case study revealed several errors in the optimized but unverified tools. extra-history = Change history: [2020-08-13]: added the formalization of the abstract slicing framework and joint data slicer (revision b1639ed541b7)
[FOL_Seq_Calc1] title = A Sequent Calculus for First-Order Logic author = Asta Halkjær From contributors = Alexander Birch Jensen , Anders Schlichtkrull , Jørgen Villadsen topic = Logic/Proof theory date = 2019-07-18 notify = ahfrom@dtu.dk abstract = This work formalizes soundness and completeness of a one-sided sequent calculus for first-order logic. The completeness is shown via a translation from a complete semantic tableau calculus, the proof of which is based on the First-Order Logic According to Fitting theory. The calculi and proof techniques are taken from Ben-Ari's Mathematical Logic for Computer Science. [Szpilrajn] title = Szpilrajn Extension Theorem author = Peter Zeller topic = Mathematics/Order date = 2019-07-27 notify = p_zeller@cs.uni-kl.de abstract = We formalize the Szpilrajn extension theorem, also known as the order-extension principle: Every strict partial order can be extended to a strict linear order. [TESL_Language] title = A Formal Development of a Polychronous Polytimed Coordination Language author = Hai Nguyen Van , Frédéric Boulanger , Burkhart Wolff topic = Computer science/System description languages, Computer science/Semantics, Computer science/Concurrency date = 2019-07-30 notify = frederic.boulanger@centralesupelec.fr, burkhart.wolff@lri.fr abstract = The design of complex systems involves different formalisms for modeling their different parts or aspects. The global model of a system may therefore consist of a coordination of concurrent sub-models that use different paradigms. We develop here a theory for a language used to specify the timed coordination of such heterogeneous subsystems by addressing the following issues:

  • the behavior of the sub-systems is observed only at a series of discrete instants,
  • events may occur in different sub-systems at unrelated times, leading to polychronous systems, which do not necessarily have a common base clock,
  • coordination between subsystems involves causality, so the occurrence of an event may enforce the occurrence of other events, possibly after a certain duration has elapsed or an event has occurred a given number of times,
  • the domain of time (discrete, rational, continuous...) may be different in the subsystems, leading to polytimed systems,
  • the time frames of different sub-systems may be related (for instance, time in a GPS satellite and in a GPS receiver on Earth are related although they are not the same).
Firstly, a denotational semantics of the language is defined. Then, in order to be able to incrementally check the behavior of systems, an operational semantics is given, with proofs of progress, soundness and completeness with regard to the denotational semantics. These proofs are made according to a setup that can scale up when new operators are added to the language. In order for specifications to be composed in a clean way, the language should be invariant by stuttering (i.e., adding observation instants at which nothing happens). The proof of this invariance is also given. [Stellar_Quorums] title = Stellar Quorum Systems author = Giuliano Losa topic = Computer science/Algorithms/Distributed date = 2019-08-01 notify = giuliano@galois.com abstract = We formalize the static properties of personal Byzantine quorum systems (PBQSs) and Stellar quorum systems, as described in the paper ``Stellar Consensus by Reduction'' (to appear at DISC 2019). [IMO2019] title = Selected Problems from the International Mathematical Olympiad 2019 author = Manuel Eberl topic = Mathematics/Misc date = 2019-08-05 notify = eberlm@in.tum.de abstract =

This entry contains formalisations of the answers to three of the six problems of the International Mathematical Olympiad 2019, namely Q1, Q4, and Q5.

These problems were chosen because they are particularly amenable to formalisation: they can be solved with minimal use of libraries. The remaining three concern geometry and graph theory, which, in the author's opinion, are more difficult to formalise or require a more complex library, respectively.

[Adaptive_State_Counting] title = Formalisation of an Adaptive State Counting Algorithm author = Robert Sachtleben topic = Computer science/Automata and formal languages, Computer science/Algorithms date = 2019-08-16 notify = rob_sac@uni-bremen.de abstract = This entry provides a formalisation of a refinement of an adaptive state counting algorithm, used to test for reduction between finite state machines. The algorithm has been originally presented by Hierons in the paper Testing from a Non-Deterministic Finite State Machine Using Adaptive State Counting. Definitions for finite state machines and adaptive test cases are given and many useful theorems are derived from these. The algorithm is formalised using mutually recursive functions, for which it is proven that the generated test suite is sufficient to test for reduction against finite state machines of a certain fault domain. Additionally, the algorithm is specified in a simple WHILE-language and its correctness is shown using Hoare-logic. [Jacobson_Basic_Algebra] title = A Case Study in Basic Algebra author = Clemens Ballarin topic = Mathematics/Algebra date = 2019-08-30 notify = ballarin@in.tum.de abstract = The focus of this case study is re-use in abstract algebra. It contains locale-based formalisations of selected parts of set, group and ring theory from Jacobson's Basic Algebra leading to the respective fundamental homomorphism theorems. The study is not intended as a library base for abstract algebra. It rather explores an approach towards abstract algebra in Isabelle. [Hybrid_Systems_VCs] title = Verification Components for Hybrid Systems author = Jonathan Julian Huerta y Munive <> topic = Mathematics/Algebra, Mathematics/Analysis date = 2019-09-10 notify = jjhuertaymunive1@sheffield.ac.uk, jonjulian23@gmail.com abstract = These components formalise a semantic framework for the deductive verification of hybrid systems. They support reasoning about continuous evolutions of hybrid programs in the style of differential dynamics logic. Vector fields or flows model these evolutions, and their verification is done with invariants for the former or orbits for the latter. Laws of modal Kleene algebra or categorical predicate transformers implement the verification condition generation. Examples show the approach at work. extra-history = Change history: [2020-12-13]: added components based on Kleene algebras with tests. These implement differential Hoare logic (dH) and a Morgan-style differential refinement calculus (dR) for verification of hybrid programs. [Generic_Join] title = Formalization of Multiway-Join Algorithms author = Thibault Dardinier<> topic = Computer science/Algorithms date = 2019-09-16 notify = tdardini@student.ethz.ch, traytel@inf.ethz.ch abstract = Worst-case optimal multiway-join algorithms are recent seminal achievement of the database community. These algorithms compute the natural join of multiple relational databases and improve in the worst case over traditional query plan optimizations of nested binary joins. In 2014, Ngo, Ré, and Rudra gave a unified presentation of different multi-way join algorithms. We formalized and proved correct their "Generic Join" algorithm and extended it to support negative joins. 
[Aristotles_Assertoric_Syllogistic] title = Aristotle's Assertoric Syllogistic author = Angeliki Koutsoukou-Argyraki topic = Logic/Philosophical aspects date = 2019-10-08 notify = ak2110@cam.ac.uk abstract = We formalise with Isabelle/HOL some basic elements of Aristotle's assertoric syllogistic following the article from the Stanford Encyclopedia of Philosophy by Robin Smith. To this end, we use a set theoretic formulation (covering both individual and general predication). In particular, we formalise the deductions in the Figures and after that we present Aristotle's metatheoretical observation that all deductions in the Figures can in fact be reduced to either Barbara or Celarent. As the formal proofs prove to be straightforward, the interest of this entry lies in illustrating the functionality of Isabelle and high efficiency of Sledgehammer for simple exercises in philosophy. [VerifyThis2019] title = VerifyThis 2019 -- Polished Isabelle Solutions author = Peter Lammich<>, Simon Wimmer topic = Computer science/Algorithms date = 2019-10-16 notify = lammich@in.tum.de, wimmers@in.tum.de abstract = VerifyThis 2019 (http://www.pm.inf.ethz.ch/research/verifythis.html) was a program verification competition associated with ETAPS 2019. It was the 8th event in the VerifyThis competition series. In this entry, we present polished and completed versions of our solutions that we created during the competition. [ZFC_in_HOL] title = Zermelo Fraenkel Set Theory in Higher-Order Logic author = Lawrence C. Paulson topic = Logic/Set theory date = 2019-10-24 notify = lp15@cam.ac.uk abstract =

This entry is a new formalisation of ZFC set theory in Isabelle/HOL. It is logically equivalent to Obua's HOLZF; the point is to have the closest possible integration with the rest of Isabelle/HOL, minimising the amount of new notations and exploiting type classes.

There is a type V of sets and a function elts :: V => V set mapping a set to its elements. Classes simply have type V set, and a predicate identifies the small classes: those that correspond to actual sets. Type classes connected with orders and lattices are used to minimise the amount of new notation for concepts such as the subset relation, union and intersection. Basic concepts — Cartesian products, disjoint sums, natural numbers, functions, etc. — are formalised.
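To give a rough impression of this interface, here is a minimal, self-contained sketch (not the actual theory text: only the names V, elts and small are taken from the description above; the auxiliary name vle, the ⊑ syntax and the particular formulation of small are illustrative guesses, and V is presented axiomatically here purely for brevity):

  typedecl V
  axiomatization elts :: "V ⇒ V set"

  (* the subset relation between sets is simply inclusion of their elements *)
  abbreviation vle :: "V ⇒ V ⇒ bool"  (infix "⊑" 50)
    where "x ⊑ y ≡ elts x ⊆ elts y"

  (* a class of 'a-elements is small if it injects into the elements of some set *)
  definition small :: "'a set ⇒ bool"
    where "small A ⟷ (∃x :: V. ∃f. inj_on f A ∧ f ` A ⊆ elts x)"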

More advanced set-theoretic concepts, such as transfinite induction, ordinals, cardinals and the transitive closure of a set, are also provided. The definition of addition and multiplication for general sets (not just ordinals) follows Kirby.

The theory provides two type classes with the aim of facilitating developments that combine V with other Isabelle/HOL types: embeddable, the class of types that can be injected into V (including V itself as well as V*V, etc.), and small, the class of types that correspond to some ZF set.

extra-history = Change history: [2020-01-28]: Generalisation of the "small" predicate and order types to arbitrary sets; ordinal exponentiation; introduction of the coercion ord_of_nat :: "nat => V"; numerous new lemmas. (revision 6081d5be8d08) [Interval_Arithmetic_Word32] title = Interval Arithmetic on 32-bit Words author = Brandon Bohrer topic = Computer science/Data structures date = 2019-11-27 notify = bjbohrer@gmail.com, bbohrer@cs.cmu.edu abstract = Interval_Arithmetic implements conservative interval arithmetic computations, then uses this interval arithmetic to implement a simple programming language where all terms have 32-bit signed word values, with explicit infinities for terms outside the representable bounds. Our target use case is interpreters for languages that must have a well-understood low-level behavior. We include a formalization of bounded-length strings which are used for the identifiers of our language. Bounded-length identifiers are useful in some applications, for example the Differential_Dynamic_Logic article, where a Euclidean space indexed by identifiers demands that identifiers are finitely many. [Generalized_Counting_Sort] title = An Efficient Generalization of Counting Sort for Large, possibly Infinite Key Ranges author = Pasquale Noce topic = Computer science/Algorithms, Computer science/Functional programming date = 2019-12-04 notify = pasquale.noce.lavoro@gmail.com abstract = Counting sort is a well-known algorithm that sorts objects of any kind mapped to integer keys, or else to keys in one-to-one correspondence with some subset of the integers (e.g. alphabet letters). However, it is suitable for direct use, viz. not just as a subroutine of another sorting algorithm (e.g. radix sort), only if the key range is not significantly larger than the number of the objects to be sorted. This paper describes a tail-recursive generalization of counting sort making use of a bounded number of counters, suitable for direct use in case of a large, or even infinite key range of any kind, subject to the only constraint of being a subset of an arbitrary linear order. After performing a pen-and-paper analysis of how such algorithm has to be designed to maximize its efficiency, this paper formalizes the resulting generalized counting sort (GCsort) algorithm and then formally proves its correctness properties, namely that (a) the counters' number is maximized never exceeding the fixed upper bound, (b) objects are conserved, (c) objects get sorted, and (d) the algorithm is stable. [Poincare_Bendixson] title = The Poincaré-Bendixson Theorem author = Fabian Immler , Yong Kiam Tan topic = Mathematics/Analysis date = 2019-12-18 notify = fimmler@cs.cmu.edu, yongkiat@cs.cmu.edu abstract = The Poincaré-Bendixson theorem is a classical result in the study of (continuous) dynamical systems. Colloquially, it restricts the possible behaviors of planar dynamical systems: such systems cannot be chaotic. In practice, it is a useful tool for proving the existence of (limiting) periodic behavior in planar systems. The theorem is an interesting and challenging benchmark for formalized mathematics because proofs in the literature rely on geometric sketches and only hint at symmetric cases. It also requires a substantial background of mathematical theories, e.g., the Jordan curve theorem, real analysis, ordinary differential equations, and limiting (long-term) behavior of dynamical systems. 
[Isabelle_C] title = Isabelle/C author = Frédéric Tuong , Burkhart Wolff topic = Computer science/Programming languages/Language definitions, Computer science/Semantics, Tools date = 2019-10-22 notify = tuong@users.gforge.inria.fr, wolff@lri.fr abstract = We present a framework for C code in C11 syntax deeply integrated into the Isabelle/PIDE development environment. Our framework provides an abstract interface for verification back-ends to be plugged-in independently. Thus, various techniques such as deductive program verification or white-box testing can be applied to the same source, which is part of an integrated PIDE document model. Semantic back-ends are free to choose the supported C fragment and its semantics. In particular, they can differ on the chosen memory model or the specification mechanism for framing conditions. Our framework supports semantic annotations of C sources in the form of comments. Annotations serve to locally control back-end settings, and can express the term focus to which an annotation refers. Both the logical and the syntactic context are available when semantic annotations are evaluated. As a consequence, a formula in an annotation can refer both to HOL or C variables. Our approach demonstrates the degree of maturity and expressive power the Isabelle/PIDE sub-system has achieved in recent years. Our integration technique employs Lex and Yacc style grammars to ensure efficient deterministic parsing. This is the core-module of Isabelle/C; the AFP package for Clean and Clean_wrapper as well as AutoCorres and AutoCorres_wrapper (available via git) are applications of this front-end. [Zeta_3_Irrational] title = The Irrationality of ζ(3) author = Manuel Eberl topic = Mathematics/Number theory date = 2019-12-27 notify = manuel.eberl@tum.de abstract =

This article provides a formalisation of Beukers's straightforward analytic proof that ζ(3) is irrational. This was first proven by Apéry (which is why this result is also often called ‘Apéry's Theorem’) using a more algebraic approach. This formalisation follows Filaseta's presentation of Beukers's proof.
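Readers unfamiliar with Beukers's argument may find it helpful to see the kind of double integral it is built on; the most basic instance is the classical identity (stated here for orientation only, and not necessarily in the exact form used in the entry)
\[ \int_0^1 \int_0^1 \frac{-\ln(xy)}{1-xy} \, dx \, dy = 2\,\zeta(3), \]
obtained by expanding $1/(1-xy)$ as a geometric series and integrating term by term. Roughly, the proof weights such integrals with shifted Legendre polynomials to produce small but nonzero linear forms in $1$ and $\zeta(3)$ whose size and denominators are incompatible with $\zeta(3)$ being rational.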

[Hybrid_Logic] title = Formalizing a Seligman-Style Tableau System for Hybrid Logic author = Asta Halkjær From topic = Logic/General logic/Modal logic date = 2019-12-20 notify = ahfrom@dtu.dk abstract = This work is a formalization of soundness and completeness proofs for a Seligman-style tableau system for hybrid logic. The completeness result is obtained via a synthetic approach using maximally consistent sets of tableau blocks. The formalization differs from previous work in a few ways. First, to avoid the need to backtrack in the construction of a tableau, the formalized system has no unnamed initial segment, and therefore no Name rule. Second, I show that the full Bridge rule is admissible in the system. Third, I start from rules restricted to only extend the branch with new formulas, including only witnessing diamonds that are not already witnessed, and show that the unrestricted rules are admissible. Similarly, I start from simpler versions of the @-rules and show that these are sufficient. The GoTo rule is restricted using a notion of potential such that each application consumes potential and potential is earned through applications of the remaining rules. I show that if a branch can be closed then it can be closed starting from a single unit. Finally, Nom is restricted by a fixed set of allowed nominals. The resulting system should be terminating. extra-history = Change history: [2020-06-03]: The fully restricted system has been shown complete by updating the synthetic completeness proof. [Bicategory] title = Bicategories author = Eugene W. Stark topic = Mathematics/Category theory date = 2020-01-06 notify = stark@cs.stonybrook.edu abstract =

Taking as a starting point the author's previous work on developing aspects of category theory in Isabelle/HOL, this article gives a compatible formalization of the notion of "bicategory" and develops a framework within which formal proofs of facts about bicategories can be given. The framework includes a number of basic results, including the Coherence Theorem, the Strictness Theorem, pseudofunctors and biequivalence, and facts about internal equivalences and adjunctions in a bicategory. As a driving application and demonstration of the utility of the framework, it is used to give a formal proof of a theorem, due to Carboni, Kasangian, and Street, that characterizes up to biequivalence the bicategories of spans in a category with pullbacks. The formalization effort necessitated the filling-in of many details that were not evident from the brief presentation in the original paper, as well as identifying a few minor corrections along the way.

Revisions made subsequent to the first version of this article added additional material on pseudofunctors, pseudonatural transformations, modifications, and equivalence of bicategories; the main thrust being to give a proof that a pseudofunctor is a biequivalence if and only if it can be extended to an equivalence of bicategories.

extra-history = Change history: [2020-02-15]: Move ConcreteCategory.thy from Bicategory to Category3 and use it systematically. Make other minor improvements throughout. (revision a51840d36867)
[2020-11-04]: Added new material on equivalence of bicategories, with associated changes. (revision 472cb2268826)
[Subset_Boolean_Algebras] title = A Hierarchy of Algebras for Boolean Subsets author = Walter Guttmann , Bernhard Möller topic = Mathematics/Algebra date = 2020-01-31 notify = walter.guttmann@canterbury.ac.nz abstract = We present a collection of axiom systems for the construction of Boolean subalgebras of larger overall algebras. The subalgebras are defined as the range of a complement-like operation on a semilattice. This technique has been used, for example, with the antidomain operation, dynamic negation and Stone algebras. We present a common ground for these constructions based on a new equational axiomatisation of Boolean algebras. [Goodstein_Lambda] title = Implementing the Goodstein Function in λ-Calculus author = Bertram Felgenhauer topic = Logic/Rewriting date = 2020-02-21 notify = int-e@gmx.de abstract = In this formalization, we develop an implementation of the Goodstein function G in plain λ-calculus, linked to a concise, self-contained specification. The implementation works on a Church-encoded representation of countable ordinals. The initial conversion to hereditary base 2 is not covered, but the material is sufficient to compute the particular value G(16), and easily extends to other fixed arguments. [VeriComp] title = A Generic Framework for Verified Compilers author = Martin Desharnais topic = Computer science/Programming languages/Compiling date = 2020-02-10 notify = martin.desharnais@unibw.de abstract = This is a generic framework for formalizing compiler transformations. It leverages Isabelle/HOL’s locales to abstract over concrete languages and transformations. It states common definitions for language semantics, program behaviours, forward and backward simulations, and compilers. We provide generic operations, such as simulation and compiler composition, and prove general (partial) correctness theorems, resulting in reusable proof components. [Hello_World] title = Hello World author = Cornelius Diekmann , Lars Hupel topic = Computer science/Functional programming date = 2020-03-07 notify = diekmann@net.in.tum.de abstract = In this article, we present a formalization of the well-known "Hello, World!" code, including a formal framework for reasoning about IO. Our model is inspired by the handling of IO in Haskell. We start by formalizing the 🌍 and embrace the IO monad afterwards. Then we present a sample main :: IO (), followed by its proof of correctness. [WOOT_Strong_Eventual_Consistency] title = Strong Eventual Consistency of the Collaborative Editing Framework WOOT author = Emin Karayel , Edgar Gonzàlez topic = Computer science/Algorithms/Distributed date = 2020-03-25 notify = eminkarayel@google.com, edgargip@google.com, me@eminkarayel.de abstract = Commutative Replicated Data Types (CRDTs) are a promising new class of data structures for large-scale shared mutable content in applications that only require eventual consistency. The WithOut Operational Transforms (WOOT) framework is a CRDT for collaborative text editing introduced by Oster et al. (CSCW 2006) for which the eventual consistency property was verified only for a bounded model to date. We contribute a formal proof for WOOTs strong eventual consistency. [Furstenberg_Topology] title = Furstenberg's topology and his proof of the infinitude of primes author = Manuel Eberl topic = Mathematics/Number theory date = 2020-03-22 notify = manuel.eberl@tum.de abstract =

This article gives a formal version of Furstenberg's topological proof of the infinitude of primes. He defines a topology on the integers based on arithmetic progressions (or, equivalently, residue classes). Using some fairly obvious properties of this topology, the infinitude of primes is then easily obtained.
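For orientation, a standard way to set up this topology (the entry's precise definitions may be phrased differently) is: a set $U \subseteq \mathbb{Z}$ is open iff for every $a \in U$ there is some $b \geq 1$ with $a + b\mathbb{Z} \subseteq U$, so the arithmetic progressions $a + b\mathbb{Z}$ form a basis. Every such progression is also closed, since its complement is a finite union of progressions, and every nonempty open set is infinite. Because every integer other than $\pm 1$ has a prime divisor,
\[ \mathbb{Z} \setminus \{-1, 1\} = \bigcup_{p\ \mathrm{prime}} p\,\mathbb{Z}, \]
so if there were only finitely many primes the right-hand side would be closed, making the finite set $\{-1, 1\}$ open, a contradiction.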

Apart from this, the topology is also fairly ‘nice’ in general: it is second countable, metrizable, and perfect. All of these (well-known) facts are formally proven, including an explicit metric for the topology given by Zulfeqarr.

[Saturation_Framework] title = A Comprehensive Framework for Saturation Theorem Proving author = Sophie Tourret topic = Logic/General logic/Mechanization of proofs date = 2020-04-09 notify = stourret@mpi-inf.mpg.de abstract = This Isabelle/HOL formalization is the companion of the technical report “A comprehensive framework for saturation theorem proving”, itself companion of the eponym IJCAR 2020 paper, written by Uwe Waldmann, Sophie Tourret, Simon Robillard and Jasmin Blanchette. It verifies a framework for formal refutational completeness proofs of abstract provers that implement saturation calculi, such as ordered resolution or superposition, and allows to model entire prover architectures in such a way that the static refutational completeness of a calculus immediately implies the dynamic refutational completeness of a prover implementing the calculus using a variant of the given clause loop. The technical report “A comprehensive framework for saturation theorem proving” is available on the Matryoshka website. The names of the Isabelle lemmas and theorems corresponding to the results in the report are indicated in the margin of the report. [Saturation_Framework_Extensions] title = Extensions to the Comprehensive Framework for Saturation Theorem Proving author = Jasmin Blanchette , Sophie Tourret topic = Logic/General logic/Mechanization of proofs date = 2020-08-25 notify = jasmin.blanchette@gmail.com abstract = This Isabelle/HOL formalization extends the AFP entry Saturation_Framework with the following contributions:
  • an application of the framework to prove Bachmair and Ganzinger's resolution prover RP refutationally complete, which was formalized in a more ad hoc fashion by Schlichtkrull et al. in the AFP entry Ordered_Resolution_Prover;
  • generalizations of various basic concepts formalized by Schlichtkrull et al., which were needed to verify RP and could be useful to formalize other calculi, such as superposition;
  • alternative proofs of fairness (and hence saturation and ultimately refutational completeness) for the given clause procedures GC and LGC, based on invariance.
[MFODL_Monitor_Optimized] title = Formalization of an Optimized Monitoring Algorithm for Metric First-Order Dynamic Logic with Aggregations author = Thibault Dardinier<>, Lukas Heimes<>, Martin Raszyk , Joshua Schneider , Dmitriy Traytel topic = Computer science/Algorithms, Logic/General logic/Modal logic, Computer science/Automata and formal languages date = 2020-04-09 notify = martin.raszyk@inf.ethz.ch, joshua.schneider@inf.ethz.ch, traytel@inf.ethz.ch abstract = A monitor is a runtime verification tool that solves the following problem: Given a stream of time-stamped events and a policy formulated in a specification language, decide whether the policy is satisfied at every point in the stream. We verify the correctness of an executable monitor for specifications given as formulas in metric first-order dynamic logic (MFODL), which combines the features of metric first-order temporal logic (MFOTL) and metric dynamic logic. Thus, MFODL supports real-time constraints, first-order parameters, and regular expressions. Additionally, the monitor supports aggregation operations such as count and sum. This formalization, which is described in a forthcoming paper at IJCAR 2020, significantly extends previous work on a verified monitor for MFOTL. Apart from the addition of regular expressions and aggregations, we implemented multi-way joins and a specialized sliding window algorithm to further optimize the monitor. [Sliding_Window_Algorithm] title = Formalization of an Algorithm for Greedily Computing Associative Aggregations on Sliding Windows author = Lukas Heimes<>, Dmitriy Traytel , Joshua Schneider<> topic = Computer science/Algorithms date = 2020-04-10 notify = heimesl@student.ethz.ch, traytel@inf.ethz.ch, joshua.schneider@inf.ethz.ch abstract = Basin et al.'s sliding window algorithm (SWA) is an algorithm for combining the elements of subsequences of a sequence with an associative operator. It is greedy and minimizes the number of operator applications. We formalize the algorithm and verify its functional correctness. We extend the algorithm with additional operations and provide an alternative interface to the slide operation that does not require the entire input sequence. [Lucas_Theorem] title = Lucas's Theorem author = Chelsea Edmonds topic = Mathematics/Number theory date = 2020-04-07 notify = cle47@cam.ac.uk abstract = This work presents a formalisation of a generating function proof for Lucas's theorem. We first outline extensions to the existing Formal Power Series (FPS) library, including an equivalence relation for coefficients modulo n, an alternate binomial theorem statement, and a formalised proof of the Freshman's dream (mod p) lemma. The second part of the work presents the formal proof of Lucas's Theorem. Working backwards, the formalisation first proves a well known corollary of the theorem which is easier to formalise, and then applies induction to prove the original theorem statement. The proof of the corollary aims to provide a good example of a formalised generating function equivalence proof using the FPS library. The final theorem statement is intended to be integrated into the formalised proof of Hilbert's 10th Problem. 
[ADS_Functor] title = Authenticated Data Structures As Functors author = Andreas Lochbihler , Ognjen Marić topic = Computer science/Data structures date = 2020-04-16 notify = andreas.lochbihler@digitalasset.com, mail@andreas-lochbihler.de abstract = Authenticated data structures allow several systems to convince each other that they are referring to the same data structure, even if each of them knows only a part of the data structure. Using inclusion proofs, knowledgeable systems can selectively share their knowledge with other systems and the latter can verify the authenticity of what is being shared. In this article, we show how to modularly define authenticated data structures, their inclusion proofs, and operations thereon as datatypes in Isabelle/HOL, using a shallow embedding. Modularity allows us to construct complicated trees from reusable building blocks, which we call Merkle functors. Merkle functors include sums, products, and function spaces and are closed under composition and least fixpoints. As a practical application, we model the hierarchical transactions of Canton, a practical interoperability protocol for distributed ledgers, as authenticated data structures. This is a first step towards formalizing the Canton protocol and verifying its integrity and security guarantees. [Power_Sum_Polynomials] title = Power Sum Polynomials author = Manuel Eberl topic = Mathematics/Algebra date = 2020-04-24 notify = eberlm@in.tum.de abstract =

This article provides a formalisation of the symmetric multivariate polynomials known as power sum polynomials. These are of the form $p_n(X_1, \ldots, X_k) = X_1^n + \ldots + X_k^n$. A formal proof of the Girard–Newton Theorem is also given. This theorem relates the power sum polynomials to the elementary symmetric polynomials $s_k$ in the form of the recurrence relation \[ (-1)^k\, k\, s_k = \sum_{i \in [0,k)} (-1)^i\, s_i\, p_{k-i}. \]

As an application, this is then used to solve a generalised form of a puzzle given as an exercise in Dummit and Foote's Abstract Algebra: For $k$ complex unknowns $x_1, \ldots, x_k$, define $p_j := x_1^j + \ldots + x_k^j$. Then for each vector $a \in \mathbb{C}^k$, show that there is exactly one solution to the system $p_1 = a_1, \ldots, p_k = a_k$ up to permutation of the $x_i$, and determine the value of $p_i$ for $i > k$.
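To make the puzzle concrete, here is the case $k = 2$, worked out with the classical Newton identities in their usual sign convention (given purely as an illustration): from $p_1 = a_1$ and $p_2 = a_2$ one recovers the elementary symmetric functions
\[ e_1 = x_1 + x_2 = a_1, \qquad e_2 = x_1 x_2 = \tfrac{1}{2}\,(a_1^2 - a_2), \]
so $x_1, x_2$ are precisely the two roots of $X^2 - e_1 X + e_2$, unique up to order. Moreover each root satisfies $x_j^i = e_1 x_j^{i-1} - e_2 x_j^{i-2}$, so the higher power sums are determined by the recurrence $p_i = e_1 p_{i-1} - e_2 p_{i-2}$ for $i > 2$.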

[Gaussian_Integers] title = Gaussian Integers author = Manuel Eberl topic = Mathematics/Number theory date = 2020-04-24 notify = eberlm@in.tum.de abstract =

The Gaussian integers are the subring ℤ[i] of the complex numbers, i.e. the ring of all complex numbers with integral real and imaginary part. This article provides a definition of this ring along with proofs of various basic properties, such as the fact that they form a Euclidean ring, and a full classification of their primes. An executable (albeit not very efficient) factorisation algorithm is also provided.

Lastly, this Gaussian integer formalisation is used in two short applications:

  1. The characterisation of all positive integers that can be written as sums of two squares
  2. Euclid's formula for primitive Pythagorean triples

While elementary proofs for both of these are already available in the AFP, the theory of Gaussian integers provides more concise proofs and a more high-level view.
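For context, the two classical statements referred to above are as follows (well-known results, quoted here in a standard form rather than in the exact form proved in the entry). A positive integer $n$ is a sum of two squares iff every prime $p \equiv 3 \pmod{4}$ occurs in $n$ to an even power; and, up to swapping the two legs, the primitive Pythagorean triples are exactly
\[ (a, b, c) = (m^2 - n^2,\ 2mn,\ m^2 + n^2), \qquad m > n > 0,\ \gcd(m, n) = 1,\ m \not\equiv n \pmod{2}. \]
Both follow smoothly from the arithmetic of ℤ[i]: a prime is a sum of two squares iff it is not inert in ℤ[i], and writing $c^2 = (a + b\mathrm{i})(a - b\mathrm{i})$ exhibits $a + b\mathrm{i}$ as a unit times a square $(m + n\mathrm{i})^2$.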

[Forcing] title = Formalization of Forcing in Isabelle/ZF author = Emmanuel Gunther , Miguel Pagano , Pedro Sánchez Terraf topic = Logic/Set theory date = 2020-05-06 notify = gunther@famaf.unc.edu.ar, pagano@famaf.unc.edu.ar, sterraf@famaf.unc.edu.ar abstract = We formalize the theory of forcing in the set theory framework of Isabelle/ZF. Under the assumption of the existence of a countable transitive model of ZFC, we construct a proper generic extension and show that the latter also satisfies ZFC. [Delta_System_Lemma] title = Cofinality and the Delta System Lemma author = Pedro Sánchez Terraf topic = Mathematics/Combinatorics, Logic/Set theory date = 2020-12-27 notify = sterraf@famaf.unc.edu.ar abstract = We formalize the basic results on cofinality of linearly ordered sets and ordinals and Šanin’s Lemma for uncountable families of finite sets. This last result is used to prove the countable chain condition for Cohen posets. We work in the set theory framework of Isabelle/ZF, using the Axiom of Choice as needed. [Recursion-Addition] title = Recursion Theorem in ZF author = Georgy Dunaev topic = Logic/Set theory date = 2020-05-11 notify = georgedunaev@gmail.com abstract = This document contains a proof of the recursion theorem. This is a mechanization of the proof of the recursion theorem from the text Introduction to Set Theory, by Karel Hrbacek and Thomas Jech. This implementation may be used as the basis for a model of Peano arithmetic in ZF. While recursion and the natural numbers are already available in Isabelle/ZF, this clean development is much easier to follow. [LTL_Normal_Form] title = An Efficient Normalisation Procedure for Linear Temporal Logic: Isabelle/HOL Formalisation author = Salomon Sickert topic = Computer science/Automata and formal languages, Logic/General logic/Temporal logic date = 2020-05-08 notify = s.sickert@tum.de abstract = In the mid 80s, Lichtenstein, Pnueli, and Zuck proved a classical theorem stating that every formula of Past LTL (the extension of LTL with past operators) is equivalent to a formula of the form $\bigwedge_{i=1}^n \mathbf{G}\mathbf{F} \varphi_i \vee \mathbf{F}\mathbf{G} \psi_i$, where $\varphi_i$ and $\psi_i$ contain only past operators. Some years later, Chang, Manna, and Pnueli built on this result to derive a similar normal form for LTL. Both normalisation procedures have a non-elementary worst-case blow-up, and follow an involved path from formulas to counter-free automata to star-free regular expressions and back to formulas. We improve on both points. We present an executable formalisation of a direct and purely syntactic normalisation procedure for LTL yielding a normal form, comparable to the one by Chang, Manna, and Pnueli, that has only a single exponential blow-up. [Matrices_for_ODEs] title = Matrices for ODEs author = Jonathan Julian Huerta y Munive topic = Mathematics/Analysis, Mathematics/Algebra date = 2020-04-19 notify = jonjulian23@gmail.com abstract = Our theories formalise various matrix properties that serve to establish existence, uniqueness and characterisation of the solution to affine systems of ordinary differential equations (ODEs). In particular, we formalise the operator and maximum norm of matrices. Then we use them to prove that square matrices form a Banach space, and in this setting, we show an instance of Picard-Lindelöf’s theorem for affine systems of ODEs. Finally, we use this formalisation to verify three simple hybrid programs. 
[Irrational_Series_Erdos_Straus] title = Irrationality Criteria for Series by Erdős and Straus author = Angeliki Koutsoukou-Argyraki , Wenda Li topic = Mathematics/Number theory, Mathematics/Analysis date = 2020-05-12 notify = ak2110@cam.ac.uk, wl302@cam.ac.uk, liwenda1990@hotmail.com abstract = We formalise certain irrationality criteria for infinite series of the form: \[\sum_{n=1}^\infty \frac{b_n}{\prod_{i=1}^n a_i} \] where $\{b_n\}$ is a sequence of integers and $\{a_n\}$ a sequence of positive integers with $a_n >1$ for all large n. The results are due to P. Erdős and E. G. Straus [1]. In particular, we formalise Theorem 2.1, Corollary 2.10 and Theorem 3.1. The latter is an application of Theorem 2.1 involving the prime numbers. [Knuth_Bendix_Order] title = A Formalization of Knuth–Bendix Orders author = Christian Sternagel , René Thiemann topic = Logic/Rewriting date = 2020-05-13 notify = c.sternagel@gmail.com, rene.thiemann@uibk.ac.at abstract = We define a generalized version of Knuth–Bendix orders, including subterm coefficient functions. For these orders we formalize several properties such as strong normalization, the subterm property, closure properties under substitutions and contexts, as well as ground totality. [Stateful_Protocol_Composition_and_Typing] title = Stateful Protocol Composition and Typing author = Andreas V. Hess , Sebastian Mödersheim , Achim D. Brucker topic = Computer science/Security date = 2020-04-08 notify = avhe@dtu.dk, andreasvhess@gmail.com, samo@dtu.dk, brucker@spamfence.net, andschl@dtu.dk abstract = We provide in this AFP entry several relative soundness results for security protocols. In particular, we prove typing and compositionality results for stateful protocols (i.e., protocols with mutable state that may span several sessions), and that focuses on reachability properties. Such results are useful to simplify protocol verification by reducing it to a simpler problem: Typing results give conditions under which it is safe to verify a protocol in a typed model where only "well-typed" attacks can occur whereas compositionality results allow us to verify a composed protocol by only verifying the component protocols in isolation. The conditions on the protocols under which the results hold are furthermore syntactic in nature allowing for full automation. The foundation presented here is used in another entry to provide fully automated and formalized security proofs of stateful protocols. [Automated_Stateful_Protocol_Verification] title = Automated Stateful Protocol Verification author = Andreas V. Hess , Sebastian Mödersheim , Achim D. Brucker , Anders Schlichtkrull topic = Computer science/Security, Tools date = 2020-04-08 notify = avhe@dtu.dk, andreasvhess@gmail.com, samo@dtu.dk, brucker@spamfence.net, andschl@dtu.dk abstract = In protocol verification we observe a wide spectrum from fully automated methods to interactive theorem proving with proof assistants like Isabelle/HOL. In this AFP entry, we present a fully-automated approach for verifying stateful security protocols, i.e., protocols with mutable state that may span several sessions. The approach supports reachability goals like secrecy and authentication. We also include a simple user-friendly transaction-based protocol specification language that is embedded into Isabelle. 
[Smith_Normal_Form] title = A verified algorithm for computing the Smith normal form of a matrix author = Jose Divasón topic = Mathematics/Algebra, Computer science/Algorithms/Mathematical date = 2020-05-23 notify = jose.divason@unirioja.es abstract = This work presents a formal proof in Isabelle/HOL of an algorithm to transform a matrix into its Smith normal form, a canonical matrix form, in a general setting: the algorithm is parameterized by operations to prove its existence over elementary divisor rings, while execution is guaranteed over Euclidean domains. We also provide a formal proof on some results about the generality of this algorithm as well as the uniqueness of the Smith normal form. Since Isabelle/HOL does not feature dependent types, the development is carried out switching conveniently between two different existing libraries: the Hermite normal form (based on HOL Analysis) and the Jordan normal form AFP entries. This permits to reuse results from both developments and it is done by means of the lifting and transfer package together with the use of local type definitions. [Nash_Williams] title = The Nash-Williams Partition Theorem author = Lawrence C. Paulson topic = Mathematics/Combinatorics date = 2020-05-16 notify = lp15@cam.ac.uk abstract = In 1965, Nash-Williams discovered a generalisation of the infinite form of Ramsey's theorem. Where the latter concerns infinite sets of n-element sets for some fixed n, the Nash-Williams theorem concerns infinite sets of finite sets (or lists) subject to a “no initial segment” condition. The present formalisation follows a monograph on Ramsey Spaces by Todorčević. [Safe_Distance] title = A Formally Verified Checker of the Safe Distance Traffic Rules for Autonomous Vehicles author = Albert Rizaldi , Fabian Immler topic = Computer science/Algorithms/Mathematical, Mathematics/Physics date = 2020-06-01 notify = albert.rizaldi@ntu.edu.sg, fimmler@andrew.cmu.edu, martin.rau@tum.de abstract = The Vienna Convention on Road Traffic defines the safe distance traffic rules informally. This could make autonomous vehicle liable for safe-distance-related accidents because there is no clear definition of how large a safe distance is. We provide a formally proven prescriptive definition of a safe distance, and checkers which can decide whether an autonomous vehicle is obeying the safe distance rule. Not only does our work apply to the domain of law, but it also serves as a specification for autonomous vehicle manufacturers and for online verification of path planners. [Relational_Paths] title = Relational Characterisations of Paths author = Walter Guttmann , Peter Höfner topic = Mathematics/Graph theory date = 2020-07-13 notify = walter.guttmann@canterbury.ac.nz, peter@hoefner-online.de abstract = Binary relations are one of the standard ways to encode, characterise and reason about graphs. Relation algebras provide equational axioms for a large fragment of the calculus of binary relations. Although relations are standard tools in many areas of mathematics and computing, researchers usually fall back to point-wise reasoning when it comes to arguments about paths in a graph. We present a purely algebraic way to specify different kinds of paths in Kleene relation algebras, which are relation algebras equipped with an operation for reflexive transitive closure. We study the relationship between paths with a designated root vertex and paths without such a vertex. Since we stay in first-order logic this development helps with mechanising proofs. 
To demonstrate the applicability of the algebraic framework we verify the correctness of three basic graph algorithms. [Amicable_Numbers] title = Amicable Numbers author = Angeliki Koutsoukou-Argyraki topic = Mathematics/Number theory date = 2020-08-04 notify = ak2110@cam.ac.uk abstract = This is a formalisation of Amicable Numbers, involving some relevant material including Euler's sigma function, some relevant definitions, results and examples as well as rules such as Thābit ibn Qurra's Rule, Euler's Rule, te Riele's Rule and Borho's Rule with breeders. [Ordinal_Partitions] title = Ordinal Partitions author = Lawrence C. Paulson topic = Mathematics/Combinatorics, Logic/Set theory date = 2020-08-03 notify = lp15@cam.ac.uk abstract = The theory of partition relations concerns generalisations of Ramsey's theorem. For any ordinal $\alpha$, write $\alpha \to (\alpha, m)^2$ if for each function $f$ from unordered pairs of elements of $\alpha$ into $\{0,1\}$, either there is a subset $X\subseteq \alpha$ order-isomorphic to $\alpha$ such that $f\{x,y\}=0$ for all $\{x,y\}\subseteq X$, or there is an $m$ element set $Y\subseteq \alpha$ such that $f\{x,y\}=1$ for all $\{x,y\}\subseteq Y$. (In both cases, with $\{x,y\}$ we require $x\not=y$.) In particular, the infinite Ramsey theorem can be written in this notation as $\omega \to (\omega, \omega)^2$, or if we restrict $m$ to the positive integers as above, then $\omega \to (\omega, m)^2$ for all $m$. This entry formalises Larson's proof of $\omega^\omega \to (\omega^\omega, m)^2$ along with a similar proof of a result due to Specker: $\omega^2 \to (\omega^2, m)^2$. Also proved is a necessary result by Erdős and Milner: $\omega^{1+\alpha\cdot n} \to (\omega^{1+\alpha}, 2^n)^2$. [Relational_Disjoint_Set_Forests] title = Relational Disjoint-Set Forests author = Walter Guttmann topic = Computer science/Data structures date = 2020-08-26 notify = walter.guttmann@canterbury.ac.nz abstract = We give a simple relation-algebraic semantics of read and write operations on associative arrays. The array operations seamlessly integrate with assignments in the Hoare-logic library. Using relation algebras and Kleene algebras we verify the correctness of an array-based implementation of disjoint-set forests with a naive union operation and a find operation with path compression. [PAC_Checker] title = Practical Algebraic Calculus Checker author = Mathias Fleury , Daniela Kaufmann topic = Computer science/Algorithms date = 2020-08-31 notify = mathias.fleury@jku.at abstract = Generating and checking proof certificates is important to increase the trust in automated reasoning tools. In recent years formal verification using computer algebra became more important and is heavily used in automated circuit verification. An existing proof format which covers algebraic reasoning and allows efficient proof checking is the practical algebraic calculus (PAC). In this development, we present the verified checker Pastèque that is obtained by synthesis via the Refinement Framework. This is the formalization going with our FMCAD'20 tool presentation. [BirdKMP] title = Putting the `K' into Bird's derivation of Knuth-Morris-Pratt string matching author = Peter Gammie topic = Computer science/Functional programming date = 2020-08-25 notify = peteg42@gmail.com abstract = Richard Bird and collaborators have proposed a derivation of an intricate cyclic program that implements the Morris-Pratt string matching algorithm. 
Here we provide a proof of total correctness for Bird's derivation and complete it by adding Knuth's optimisation. [Extended_Finite_State_Machines] title = A Formal Model of Extended Finite State Machines author = Michael Foster , Achim D. Brucker , Ramsay G. Taylor , John Derrick topic = Computer science/Automata and formal languages date = 2020-09-07 notify = jmafoster1@sheffield.ac.uk, adbrucker@0x5f.org abstract = In this AFP entry, we provide a formalisation of extended finite state machines (EFSMs) where models are represented as finite sets of transitions between states. EFSMs execute traces to produce observable outputs. We also define various simulation and equality metrics for EFSMs in terms of traces and prove their strengths in relation to each other. Another key contribution is a framework of function definitions such that LTL properties can be phrased over EFSMs. Finally, we provide a simple example case study in the form of a drinks machine. [Extended_Finite_State_Machine_Inference] title = Inference of Extended Finite State Machines author = Michael Foster , Achim D. Brucker , Ramsay G. Taylor , John Derrick topic = Computer science/Automata and formal languages date = 2020-09-07 notify = jmafoster1@sheffield.ac.uk, adbrucker@0x5f.org abstract = In this AFP entry, we provide a formal implementation of a state-merging technique to infer extended finite state machines (EFSMs), complete with output and update functions, from black-box traces. In particular, we define the subsumption in context relation as a means of determining whether one transition is able to account for the behaviour of another. Building on this, we define the direct subsumption relation, which lifts the subsumption in context relation to EFSM level such that we can use it to determine whether it is safe to merge a given pair of transitions. Key proofs include the conditions necessary for subsumption to occur and that subsumption and direct subsumption are preorder relations. We also provide a number of different heuristics which can be used to abstract away concrete values into registers so that more states and transitions can be merged and provide proofs of the various conditions which must hold for these abstractions to subsume their ungeneralised counterparts. A Code Generator setup to create executable Scala code is also defined. [Physical_Quantities] title = A Sound Type System for Physical Quantities, Units, and Measurements author = Simon Foster , Burkhart Wolff topic = Mathematics/Physics, Computer science/Programming languages/Type systems date = 2020-10-20 notify = simon.foster@york.ac.uk, wolff@lri.fr abstract = The present Isabelle theory builds a formal model for both the International System of Quantities (ISQ) and the International System of Units (SI), which are both fundamental for physics and engineering. Both the ISQ and the SI are deeply integrated into Isabelle's type system. Quantities are parameterised by dimension types, which correspond to base vectors, and thus only quantities of the same dimension can be equated. Since the underlying "algebra of quantities" induces congruences on quantity and SI types, specific tactic support is developed to capture these. Our construction is validated by a test-set of known equivalences between both quantities and SI units. Moreover, the presented theory can be used for type-safe conversions between the SI system and others, like the British Imperial System (BIS). 
[Shadow_DOM] title = A Formal Model of the Document Object Model with Shadow Roots author = Achim D. Brucker , Michael Herzberg topic = Computer science/Data structures date = 2020-09-28 notify = adbrucker@0x5f.org, mail@michael-herzberg.de abstract = In this AFP entry, we extend our formalization of the core DOM with Shadow Roots. Shadow roots are a recent proposal of the web community to support a component-based development approach for client-side web applications. Shadow roots are a significant extension to the DOM standard and, as web standards are condemned to be backward compatible, such extensions often result in complex specification that may contain unwanted subtleties that can be detected by a formalization. Our Isabelle/HOL formalization is, in the sense of object-orientation, an extension of our formalization of the core DOM and enjoys the same basic properties, i.e., it is extensible, i.e., can be extended without the need of re-proving already proven properties and executable, i.e., we can generate executable code from our specification. We exploit the executability to show that our formalization complies to the official standard of the W3C, respectively, the WHATWG. [DOM_Components] title = A Formalization of Web Components author = Achim D. Brucker , Michael Herzberg topic = Computer science/Data structures date = 2020-09-28 notify = adbrucker@0x5f.org, mail@michael-herzberg.de abstract = While the DOM with shadow trees provide the technical basis for defining web components, the DOM standard neither defines the concept of web components nor specifies the safety properties that web components should guarantee. Consequently, the standard also does not discuss how or even if the methods for modifying the DOM respect component boundaries. In AFP entry, we present a formally verified model of web components and define safety properties which ensure that different web components can only interact with each other using well-defined interfaces. Moreover, our verification of the application programming interface (API) of the DOM revealed numerous invariants that implementations of the DOM API need to preserve to ensure the integrity of components. [Interpreter_Optimizations] title = Inline Caching and Unboxing Optimization for Interpreters author = Martin Desharnais topic = Computer science/Programming languages/Misc date = 2020-12-07 notify = martin.desharnais@unibw.de abstract = This Isabelle/HOL formalization builds on the VeriComp entry of the Archive of Formal Proofs to provide the following contributions:
  • an operational semantics for a realistic virtual machine (Std) for dynamically typed programming languages;
  • the formalization of an inline caching optimization (Inca), a proof of bisimulation with (Std), and a compilation function;
  • the formalization of an unboxing optimization (Ubx), a proof of bisimulation with (Inca), and a simple compilation function.
This formalization was described in the CPP 2021 paper Towards Efficient and Verified Virtual Machines for Dynamic Languages [Isabelle_Marries_Dirac] title = Isabelle Marries Dirac: a Library for Quantum Computation and Quantum Information author = Anthony Bordg , Hanna Lachnitt, Yijun He topic = Computer science/Algorithms/Quantum computing, Mathematics/Physics/Quantum information date = 2020-11-22 notify = apdb3@cam.ac.uk, lachnitt@stanford.edu abstract = This work is an effort to formalise some quantum algorithms and results in quantum information theory. Formal methods being critical for the safety and security of algorithms and protocols, we foresee their widespread use for quantum computing in the future. We have developed a large library for quantum computing in Isabelle based on a matrix representation for quantum circuits, successfully formalising the no-cloning theorem, quantum teleportation, Deutsch's algorithm, the Deutsch-Jozsa algorithm and the quantum Prisoner's Dilemma. [Finite-Map-Extras] title = Finite Map Extras author = Javier Díaz topic = Computer science/Data structures date = 2020-10-12 notify = javier.diaz.manzi@gmail.com abstract = This entry includes useful syntactic sugar, new operators and functions, and their associated lemmas for finite maps which currently are not present in the standard Finite_Map theory. [Relational_Minimum_Spanning_Trees] title = Relational Minimum Spanning Tree Algorithms author = Walter Guttmann , Nicolas Robinson-O'Brien<> topic = Computer science/Algorithms/Graph date = 2020-12-08 notify = walter.guttmann@canterbury.ac.nz abstract = We verify the correctness of Prim's, Kruskal's and Borůvka's minimum spanning tree algorithms based on algebras for aggregation and minimisation. [Topological_Semantics] title = Topological semantics for paraconsistent and paracomplete logics author = David Fuenmayor topic = Logic/General logic date = 2020-12-17 notify = davfuenmayor@gmail.com abstract = We introduce a generalized topological semantics for paraconsistent and paracomplete logics by drawing upon early works on topological Boolean algebras (cf. works by Kuratowski, Zarycki, McKinsey & Tarski, etc.). In particular, this work exemplarily illustrates the shallow semantical embeddings approach (SSE) employing the proof assistant Isabelle/HOL. By means of the SSE technique we can effectively harness theorem provers, model finders and 'hammers' for reasoning with quantified non-classical logics. - + [CSP_RefTK] title = The HOL-CSP Refinement Toolkit author = Safouan Taha , Burkhart Wolff , Lina Ye topic = Computer science/Concurrency/Process calculi, Computer science/Semantics date = 2020-11-19 notify = wolff@lri.fr -abstract = +abstract = We use a formal development for CSP, called HOL-CSP2.0, to analyse a family of refinement notions, comprising classic and new ones. This analysis enables to derive a number of properties that allow to deepen the understanding of these notions, in particular with respect to specification decomposition principles for the case of infinite sets of events. The established relations between the refinement relations help to clarify some obscure points in the CSP literature, but also provide a weapon for shorter refinement proofs. Furthermore, we provide a framework for state-normalisation allowing to formally reason on parameterised process architectures. 
As a result, we have a modern environment for formal proofs of concurrent systems that allow for the combination of general infinite processes with locally finite ones in a logically safe way. We demonstrate these verification-techniques for classical, generalised examples: The CopyBuffer for arbitrary data and the Dijkstra's Dining Philosopher Problem of arbitrary size. [Hood_Melville_Queue] title = Hood-Melville Queue author = Alejandro Gómez-Londoño topic = Computer science/Data structures date = 2021-01-18 notify = nipkow@in.tum.de abstract = This is a verified implementation of a constant time queue. The original design is due to Hood and Melville. This formalization follows the presentation in Purely Functional Data Structuresby Okasaki. [JinjaDCI] title = JinjaDCI: a Java semantics with dynamic class initialization author = Susannah Mansky topic = Computer science/Programming languages/Language definitions date = 2021-01-11 notify = sjohnsn2@illinois.edu, susannahej@gmail.com abstract = We extend Jinja to include static fields, methods, and instructions, and dynamic class initialization, based on the Java SE 8 specification. This includes extension of definitions and proofs. This work is partially described in Mansky and Gunter's paper at CPP 2019 and Mansky's doctoral thesis (UIUC, 2020). +[Blue_Eyes] +title = Solution to the xkcd Blue Eyes puzzle +author = Jakub Kądziołka +topic = Logic/General logic/Logics of knowledge and belief +date = 2021-01-30 +notify = kuba@kadziolka.net +abstract = + In a puzzle published by + Randall Munroe, perfect logicians forbidden + from communicating are stranded on an island, and may only leave once + they have figured out their own eye color. We present a method of + modeling the behavior of perfect logicians and formalize a solution of + the puzzle. + diff --git a/thys/Blue_Eyes/Blue_Eyes.thy b/thys/Blue_Eyes/Blue_Eyes.thy new file mode 100644 --- /dev/null +++ b/thys/Blue_Eyes/Blue_Eyes.thy @@ -0,0 +1,424 @@ +(*<*) +theory Blue_Eyes + imports Main +begin +(*>*) + +section \Introduction\ + +text \The original problem statement @{cite xkcd} explains the puzzle well: + +\begin{quotation} +A group of people with assorted eye colors live on an island. +They are all perfect logicians -- if a conclusion can be logically deduced, +they will do it instantly. +No one knows the color of their eyes. +Every night at midnight, a ferry stops at the island. +Any islanders who have figured out the color of their own eyes then leave the island, and the rest stay. +Everyone can see everyone else at all times +and keeps a count of the number of people they see with each eye color (excluding themselves), +but they cannot otherwise communicate. +Everyone on the island knows all the rules in this paragraph. + +On this island there are 100 blue-eyed people, +100 brown-eyed people, +and the Guru (she happens to have green eyes). +So any given blue-eyed person can see 100 people with brown eyes and 99 people with blue eyes (and one with green), +but that does not tell him his own eye color; +as far as he knows the totals could be 101 brown and 99 blue. +Or 100 brown, 99 blue, and he could have red eyes. + +The Guru is allowed to speak once (let's say at noon), +on one day in all their endless years on the island. +Standing before the islanders, she says the following: + +``I can see someone who has blue eyes.'' + +Who leaves the island, and on what night? +\end{quotation} + +It might seem weird that the Guru's declaration gives anyone any new information. 
+For an informal discussion, see \cite[Section~1.1]{fagin1995}.\ + +section \Modeling the world \label{sec:world}\ + +text \We begin by fixing two type variables: @{typ "'color"} and @{typ "'person"}. +The puzzle doesn't specify how many eye colors are possible, but four are mentioned. +Crucially, we must assume they are distinct. We specify the existence of colors other +than blue and brown, even though we don't mention them later, because when blue and brown +are the only possible colors, the puzzle has a different solution — the brown-eyed logicians +may leave one day after the blue-eyed ones. + +We refrain from specifying the exact population of the island, choosing to only assume +it is finite and denote a specific person as the Guru. + +We could also model the Guru as an outside entity instead of a participant. This doesn't change +the answer and results in a slightly simpler proof, but is less faithful to the problem statement.\ + +context + fixes blue brown green red :: 'color + assumes colors_distinct: "distinct [blue, brown, green, red]" + + fixes guru :: 'person + assumes "finite (UNIV :: 'person set)" +begin + +text \It's slightly tricky to formalize the behavior of perfect logicians. +The representation we use is centered around the type of a @{emph \world\}, +which describes the entire state of the environment. In our case, it's a function +@{typ "'person => 'color"} that assigns an eye color to everyone.@{footnote \We would introduce +a type synonym, but at the time of writing Isabelle doesn't support including type variables fixed +by a locale in a type synonym.\} + +The only condition known to everyone and not dependent on the observer is Guru's declaration:\ + +definition valid :: "('person \ 'color) \ bool" where + "valid w \ (\p. p \ guru \ w p = blue)" + +text \We then define the function @{term "possible n p w w'"}, which returns @{term True} +if on day \n\ the potential world \w'\ is plausible from the perspective of person \p\, +based on the observations they made in the actual world \w\. + +Then, @{term "leaves n p w"} is @{term True} if \p\ is able to unambiguously deduce +the color of their own eyes, i.e. if it is the same in all possible worlds. Note that if \p\ actually +left many moons ago, this function still returns @{term True}.\ + +fun leaves :: "nat \ 'person \ ('person \ 'color) \ bool" + and possible :: "nat \ 'person \ ('person \ 'color) \ ('person \ 'color) \ bool" + where + "leaves n p w = (\w'. possible n p w w' \ w' p = w p)" | + "possible n p w w' \ valid w \ valid w' + \ (\p' \ p. w p' = w' p') + \ (\n' < n. \p'. leaves n' p' w = leaves n' p' w')" + +text \Naturally, the act of someone leaving can be observed by others, thus the two definitions +are mutually recursive. As such, we need to instruct the simplifier to not unfold these definitions endlessly.\ +declare possible.simps[simp del] leaves.simps[simp del] + +text \A world is possible if + \<^enum> The Guru's declaration holds. + \<^enum> The eye color of everyone but the observer matches. + \<^enum> The same people left on each of the previous days. 
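+
+For instance, on day 0 the last condition is vacuous, so a potential world \w'\
+is plausible for \p\ exactly when both \w\ and \w'\ are \valid\ and they assign
+the same eye color to every person other than \p\.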
+ +Moreover, we require that the actual world \w\ is \valid\, so that the relation is symmetric:\ + +lemma possible_sym: "possible n p w w' = possible n p w' w" + by (auto simp: possible.simps) + +text \In fact, \possible n p\ is an equivalence relation:\ + +lemma possible_refl: "valid w \ possible n p w w" + by (auto simp: possible.simps) + +lemma possible_trans: "possible n p w1 w2 \ possible n p w2 w3 \ possible n p w1 w3" + by (auto simp: possible.simps) + +section \Eye colors other than blue\ + +text \Since there is no way to distinguish between the colors other than blue, +only the blue-eyed people will ever leave. To formalize this notion, we define +a function that takes a world and replaces the eye color of a specified person. +The original color is specified too, so that the transformation composes nicely +with the recursive hypothetical worlds of @{const possible}.\ + +definition try_swap :: "'person \ 'color \ 'color \ ('person \ 'color) \ ('person \ 'color)" where + "try_swap p c\<^sub>1 c\<^sub>2 w x = (if c\<^sub>1 = blue \ c\<^sub>2 = blue \ x \ p then w x else Fun.swap c\<^sub>1 c\<^sub>2 id (w x))" + +lemma try_swap_valid[simp]: "valid (try_swap p c\<^sub>1 c\<^sub>2 w) = valid w" + by (auto simp add: try_swap_def valid_def swap_def) + +lemma try_swap_eq[simp]: "try_swap p c\<^sub>1 c\<^sub>2 w x = try_swap p c\<^sub>1 c\<^sub>2 w' x \ w x = w' x" + by (auto simp add: try_swap_def swap_def) + +lemma try_swap_inv[simp]: "try_swap p c\<^sub>1 c\<^sub>2 (try_swap p c\<^sub>1 c\<^sub>2 w) = w" + by (rule ext) (auto simp add: try_swap_def swap_def) + +lemma leaves_try_swap[simp]: + assumes "valid w" + shows "leaves n p (try_swap p' c\<^sub>1 c\<^sub>2 w) = leaves n p w" + using assms +proof (induction n arbitrary: p w rule: less_induct) + case (less n) + have "leaves n p w" if "leaves n p (try_swap p' c\<^sub>1 c\<^sub>2 w)" for w + proof (unfold leaves.simps; rule+) + fix w' + assume "possible n p w w'" + then have "possible n p (try_swap p' c\<^sub>1 c\<^sub>2 w) (try_swap p' c\<^sub>1 c\<^sub>2 w')" + by (fastforce simp: possible.simps less.IH) + with `leaves n p (try_swap p' c\<^sub>1 c\<^sub>2 w)` have "try_swap p' c\<^sub>1 c\<^sub>2 w' p = try_swap p' c\<^sub>1 c\<^sub>2 w p" + unfolding leaves.simps + by simp + thus "w' p = w p" by simp + qed + + with try_swap_inv show ?case by auto +qed + +text \This lets us prove that only blue-eyed people will ever leave the island.\ + +proposition only_blue_eyes_leave: + assumes "leaves n p w" and "valid w" + shows "w p = blue" +proof (rule ccontr) + assume "w p \ blue" + then obtain c where c: "w p \ c" "c \ blue" + using colors_distinct + by (metis distinct_length_2_or_more) + + let ?w' = "try_swap p (w p) c w" + have "possible n p w ?w'" + using `valid w` apply (simp add: possible.simps) + by (auto simp: try_swap_def) + moreover have "?w' p \ w p" + using c `w p \ blue` by (auto simp: try_swap_def) + ultimately have "\ leaves n p w" + by (auto simp: leaves.simps) + with assms show False by simp +qed + +section "The blue-eyed logicians" + +text \We will now consider the behavior of the logicians with blue eyes. First, +some simple lemmas. Reasoning about set cardinalities often requires considering infinite +sets separately. Usefully, all sets of people are finite by assumption.\ + +lemma people_finite[simp]: "finite (S::'person set)" +proof (rule finite_subset) + show "S \ UNIV" by auto + show "finite (UNIV::'person set)" by fact +qed + +text \Secondly, we prove a destruction rule for @{const possible}. 
It is strictly weaker than +the definition, but thanks to the simpler form, it's easier to guide the automation with it.\ +lemma possibleD_colors: + assumes "possible n p w w'" and "p' \ p" + shows "w' p' = w p'" + using assms unfolding possible.simps by simp + +text \A central concept in the reasoning is the set of blue-eyed people someone can see.\ +definition blues_seen :: "('person \ 'color) \ 'person \ 'person set" where + "blues_seen w p = {p'. w p' = blue} - {p}" + +lemma blues_seen_others: + assumes "w p' = blue" and "p \ p'" + shows "w p = blue \ card (blues_seen w p) = card (blues_seen w p')" + and "w p \ blue \ card (blues_seen w p) = Suc (card (blues_seen w p'))" +proof - + assume "w p = blue" + then have "blues_seen w p' = blues_seen w p \ {p} - {p'}" + by (auto simp add: blues_seen_def) + moreover have "p \ blues_seen w p" + unfolding blues_seen_def by auto + moreover have "p' \ blues_seen w p \ {p}" + unfolding blues_seen_def using `p \ p'` `w p' = blue` by auto + ultimately show "card (blues_seen w p) = card (blues_seen w p')" + by simp +next + assume "w p \ blue" + then have "blues_seen w p' = blues_seen w p - {p'}" + by (auto simp add: blues_seen_def) + moreover have "p' \ blues_seen w p" + unfolding blues_seen_def using `p \ p'` `w p' = blue` by auto + ultimately show "card (blues_seen w p) = Suc (card (blues_seen w p'))" + by (simp only: card_Suc_Diff1 people_finite) +qed + +lemma blues_seen_same[simp]: + assumes "possible n p w w'" + shows "blues_seen w' p = blues_seen w p" + using assms + by (auto simp: blues_seen_def possible.simps) + +lemma possible_blues_seen: + assumes "possible n p w w'" + assumes "w p' = blue" and "p \ p'" + shows "w' p = blue \ card (blues_seen w p) = card (blues_seen w' p')" + and "w' p \ blue \ card (blues_seen w p) = Suc (card (blues_seen w' p'))" + using possibleD_colors[OF `possible n p w w'`] and blues_seen_others assms + by (auto simp flip: blues_seen_same) + +text \Finally, the crux of the solution. 
We proceed by strong induction.\ + +lemma blue_leaves: + assumes "w p = blue" and "valid w" + and guru: "w guru \ blue" + shows "leaves n p w \ n \ card (blues_seen w p)" + using assms +proof (induction n arbitrary: p w rule: less_induct) + case (less n) + show ?case + proof + \ \First, we show that day \n\ is sufficient to deduce that the eyes are blue.\ + assume "n \ card (blues_seen w p)" + have "w' p = blue" if "possible n p w w'" for w' + proof (cases "card (blues_seen w' p)") + case 0 + moreover from `possible n p w w'` have "valid w'" + by (simp add: possible.simps) + ultimately show "w' p = blue" + unfolding valid_def blues_seen_def by auto + next + case (Suc k) + \ \We consider the behavior of somebody else, who also has blue eyes.\ + then have "blues_seen w' p \ {}" + by auto + then obtain p' where "w' p' = blue" and "p \ p'" + unfolding blues_seen_def by auto + then have "w p' = blue" + using possibleD_colors[OF `possible n p w w'`] by simp + + have "p \ guru" + using `w p = blue` and `w guru \ blue` by auto + hence "w' guru \ blue" + using `w guru \ blue` and possibleD_colors[OF `possible n p w w'`] by simp + + have "valid w'" + using `possible n p w w'` unfolding possible.simps by simp + + show "w' p = blue" + proof (rule ccontr) + assume "w' p \ blue" + \ \If our eyes weren't blue, then \p'\ would see one blue-eyed person less than us.\ + with possible_blues_seen[OF `possible n p w w'` `w p' = blue` `p \ p'`] + have *: "card (blues_seen w p) = Suc (card (blues_seen w' p'))" + by simp + \ \By induction, they would've left on day \k = blues_seen w' p'\.\ + let ?k = "card (blues_seen w' p')" + have "?k < n" + using `n \ card (blues_seen w p)` and * by simp + hence "leaves ?k p' w'" + using `valid w'` `w' p' = blue` `w' guru \ blue` + by (intro less.IH[THEN iffD2]; auto) + \ \However, we know that actually, \p'\ didn't leave that day yet.\ + moreover have "\ leaves ?k p' w" + proof + assume "leaves ?k p' w" + then have "?k \ card (blues_seen w p')" + using `?k < n` `w p' = blue` `valid w` `w guru \ blue` + by (intro less.IH[THEN iffD1]; auto) + + have "card (blues_seen w p) = card (blues_seen w p')" + by (intro blues_seen_others; fact) + with * have "?k < card (blues_seen w p')" + by simp + with `?k \ card (blues_seen w p')` show False by simp + qed + moreover have "leaves ?k p' w' = leaves ?k p' w" + using `possible n p w w'` `?k < n` + unfolding possible.simps by simp + ultimately show False by simp + qed + qed + thus "leaves n p w" + unfolding leaves.simps using `w p = blue` by simp + next + \ \Then, we show that it's not possible to deduce the eye color any earlier.\ + { + assume "n < card (blues_seen w p)" + \ \Consider a hypothetical world where \p\ has brown eyes instead. 
We will prove that this + world is \possible\.\ + let ?w' = "w(p := brown)" + have "?w' guru \ blue" + using `w guru \ blue` `w p = blue` + by auto + have "valid ?w'" + proof - + from `n < card (blues_seen w p)` have "card (blues_seen w p) \ 0" by auto + hence "blues_seen w p \ {}" + by auto + then obtain p' where "p' \ blues_seen w p" + by auto + hence "p \ p'" and "w p' = blue" + by (auto simp: blues_seen_def) + hence "?w' p' = blue" by auto + with `?w' guru \ blue` show "valid ?w'" + unfolding valid_def by auto + qed + moreover have "leaves n' p' w = leaves n' p' ?w'" if "n' < n" for n' p' + proof - + have not_leavesI: "\leaves n' p' w'" + if "valid w'" "w' guru \ blue" and P: "w' p' = blue \ n' < card (blues_seen w' p')" for w' + proof (cases "w' p' = blue") + case True + then have "leaves n' p' w' \ n' \ card (blues_seen w' p')" + using less.IH `n' < n` `valid w'` `w' guru \ blue` + by simp + with P[OF `w' p' = blue`] show "\leaves n' p' w'" by simp + next + case False + then show "\ leaves n' p' w'" + using only_blue_eyes_leave `valid w'` by auto + qed + + have "\leaves n' p' w" + proof (intro not_leavesI) + assume "w p' = blue" + with `w p = blue` have "card (blues_seen w p) = card (blues_seen w p')" + apply (cases "p = p'", simp) + by (intro blues_seen_others; auto) + with `n' < n` and `n < card (blues_seen w p)` show "n' < card (blues_seen w p')" + by simp + qed fact+ + + moreover have "\ leaves n' p' ?w'" + proof (intro not_leavesI) + assume "?w' p' = blue" + with colors_distinct have "p \ p'" and "?w' p \ blue" by auto + hence "card (blues_seen ?w' p) = Suc (card (blues_seen ?w' p'))" + using `?w' p' = blue` + by (intro blues_seen_others; auto) + moreover have "blues_seen w p = blues_seen ?w' p" + unfolding blues_seen_def by auto + ultimately show "n' < card (blues_seen ?w' p')" + using `n' < n` and `n < card (blues_seen w p)` + by auto + qed fact+ + + ultimately show "leaves n' p' w = leaves n' p' ?w'" by simp + qed + ultimately have "possible n p w ?w'" + using `valid w` + by (auto simp: possible.simps) + moreover have "?w' p \ blue" + using colors_distinct by auto + ultimately have "\ leaves n p w" + unfolding leaves.simps + using `w p = blue` by blast + } + then show "leaves n p w \ n \ card (blues_seen w p)" + by fastforce + qed +qed + +text \This can be combined into a theorem that describes the behavior of the logicians based +on the objective count of blue-eyed people, and not the count by a specific person. The xkcd +puzzle is the instance where \n = 99\.\ + +theorem blue_eyes: + assumes "card {p. w p = blue} = Suc n" and "valid w" and "w guru \ blue" + shows "leaves k p w \ w p = blue \ k \ n" +proof (cases "w p = blue") + case True + with assms have "card (blues_seen w p) = n" + unfolding blues_seen_def by simp + then show ?thesis + using `w p = blue` `valid w` `w guru \ blue` blue_leaves + by simp +next + case False + then show ?thesis + using only_blue_eyes_leave `valid w` by auto +qed + +end + +(*<*) +end +(*>*) + +section \Future work\ + +text \After completing this formalization, I have been made aware of epistemic logic. +The @{emph \possible worlds\} model in \cref{sec:world} turns out to be quite similar +to the usual semantics of this logic. 
It might be interesting to solve this puzzle within +the axiom system of epistemic logic, without explicit reasoning about possible worlds.\ \ No newline at end of file diff --git a/thys/Blue_Eyes/ROOT b/thys/Blue_Eyes/ROOT new file mode 100644 --- /dev/null +++ b/thys/Blue_Eyes/ROOT @@ -0,0 +1,9 @@ +chapter AFP + +session Blue_Eyes (AFP) = HOL + + options [timeout = 300] + theories + Blue_Eyes + document_files + "root.tex" + "root.bib" diff --git a/thys/Blue_Eyes/document/root.bib b/thys/Blue_Eyes/document/root.bib new file mode 100644 --- /dev/null +++ b/thys/Blue_Eyes/document/root.bib @@ -0,0 +1,15 @@ +@online{xkcd, + author = {Randall Munroe}, + title = {Blue Eyes --- A Logic Puzzle}, + url = {https://xkcd.com/blue_eyes.html}, + urldate = {2021-01-26} +} + +@book{fagin1995, + title = {Reasoning About Knowledge}, + language = {eng}, + publisher = {MIT Press}, + year = {1995}, + isbn = {0262061627}, + author = {Fagin, R. and Halpern, J.Y. and Moses, Y. and Vardi, M.Y.} +} \ No newline at end of file diff --git a/thys/Blue_Eyes/document/root.tex b/thys/Blue_Eyes/document/root.tex new file mode 100644 --- /dev/null +++ b/thys/Blue_Eyes/document/root.tex @@ -0,0 +1,68 @@ +\documentclass[11pt,a4paper]{article} +\usepackage{isabelle,isabellesym} +\usepackage[T1]{fontenc} +\usepackage[margin=2.5cm]{geometry} + +% further packages required for unusual symbols (see also +% isabellesym.sty), use only when needed + +%\usepackage{amssymb} + %for \, \, \, \, \, \, + %\, \, \, \, \, + %\, \, \ + +%\usepackage{eurosym} + %for \ + +%\usepackage[only,bigsqcap]{stmaryrd} + %for \ + +%\usepackage{eufrak} + %for \ ... \, \ ... \ (also included in amssymb) + +%\usepackage{textcomp} + %for \, \, \, \, \, + %\ + +% this should be the last package used, but isn't, because cleveref complains when it is +\usepackage{pdfsetup} +\usepackage{cleveref} + +% urls in roman style, theory text in math-similar italics +\urlstyle{rm} +\isabellestyle{it} + +% for uniform font size +%\renewcommand{\isastyle}{\isastyleminor} + + +\begin{document} + +\title{Solution to the xkcd Blue Eyes puzzle} +\author{Jakub Kądziołka} +\maketitle + +\begin{abstract} + In a puzzle published by Randall Munroe~\cite{xkcd}, perfect logicians forbidden + from communicating are stranded on an island, and may only leave once they + have figured out their own eye color. We present a method of modeling the + behavior of perfect logicians and formalize a solution of the puzzle. 
+\end{abstract} + +\tableofcontents + +% sane default for proof documents +\parindent 0pt\parskip 0.5ex + +% generated text of all theories +\input{session} + +\bibliographystyle{plainurl} +\bibliography{root} + +\end{document} + +%%% Local Variables: +%%% mode: latex +%%% TeX-master: t +%%% End: diff --git a/thys/IsaGeoCoq/ROOT b/thys/IsaGeoCoq/ROOT new file mode 100644 --- /dev/null +++ b/thys/IsaGeoCoq/ROOT @@ -0,0 +1,10 @@ +chapter AFP + +session IsaGeoCoq (AFP) = HOL + + options [timeout = 300] + theories + Tarski_Neutral + + document_files + "root.bib" + "root.tex" diff --git a/thys/IsaGeoCoq/Tarski_Neutral.thy b/thys/IsaGeoCoq/Tarski_Neutral.thy new file mode 100644 --- /dev/null +++ b/thys/IsaGeoCoq/Tarski_Neutral.thy @@ -0,0 +1,28006 @@ +(* IsageoCoq + +Port part of GeoCoq 3.4.0 (https://geocoq.github.io/GeoCoq/) in Isabelle/Hol (Isabelle2021) + +Copyright (C) 2021 Roland Coghetto roland_coghetto (at) hotmail.com + +License: LGPL + +This library is free software; you can redistribute it and/or +modify it under the terms of the GNU Lesser General Public +License as published by the Free Software Foundation; either +version 2.1 of the License, or (at your option) any later version. + +This library is distributed in the hope that it will be useful, +but WITHOUT ANY WARRANTY; without even the implied warranty of +MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE. See the GNU +Lesser General Public License for more details. + +You should have received a copy of the GNU Lesser General Public +License along with this library; if not, write to the Free Software +Foundation, Inc., 51 Franklin Street, Fifth Floor, Boston, MA 02110-1301 USA +*) + +theory Tarski_Neutral + +imports + Main + +begin + +section "Tarski's axiom system for neutral geometry" + +subsection "Tarski's axiom system for neutral geometry: dimensionless" + +locale Tarski_neutral_dimensionless = + fixes Bet :: "'p \ 'p \ 'p \ bool" + fixes Cong :: "'p \ 'p \ 'p \ 'p \ bool" + assumes cong_pseudo_reflexivity: "\ a b. + Cong a b b a" + and cong_inner_transitivity: "\ a b p q r s. + Cong a b p q \ + Cong a b r s + \ + Cong p q r s" + and cong_identity: "\ a b c. + Cong a b c c + \ + a = b" + and segment_construction: "\ a b c q. + \x. (Bet q a x \ Cong a x b c)" + and five_segment: "\ a b c a' b' c'. + a \ b \ + Bet a b c \ + Bet a' b' c'\ + Cong a b a' b' \ + Cong b c b' c' \ + Cong a d a' d' \ + Cong b d b' d' + \ + Cong c d c' d'" + and between_identity: "\ a b. + Bet a b a + \ + a = b" + and inner_pasch: "\ a b c p q. + Bet a p c \ + Bet b q c + \ + (\ x. Bet p x b \ Bet q x a)" + and lower_dim: "\ a b c. (\ Bet a b c \ \ Bet b c a \ \ Bet c a b)" + +subsection "Tarski's axiom system for neutral geometry: 2D" + +locale Tarski_2D = Tarski_neutral_dimensionless + + assumes upper_dim: "\ a b c p q. 
+ p \ q \ + Cong a p a q \ + Cong b p b q \ + Cong c p c q + \ + (Bet a b c \ Bet b c a \ Bet c a b)" + +section "Definitions" +subsection "Tarski's axiom system for neutral geometry: dimensionless" +context Tarski_neutral_dimensionless +begin + +subsubsection "Congruence" +definition OFSC :: + "['p,'p,'p,'p,'p,'p,'p,'p] \ bool" + ("_ _ _ _ OFSC _ _ _ _" [99,99,99,99,99,99,99,99] 50) + where + "A B C D OFSC A' B' C' D' \ + + Bet A B C \ + Bet A' B' C' \ + Cong A B A' B' \ + Cong B C B' C' \ + Cong A D A' D' \ + Cong B D B' D'" + +definition Cong3 :: + "['p,'p,'p,'p,'p,'p] \ bool" + ("_ _ _ Cong3 _ _ _" [99,99,99,99,99,99] 50) + where + "A B C Cong3 A' B' C' \ + + Cong A B A' B' \ + Cong A C A' C' \ + Cong B C B' C'" + +subsubsection "Betweenness" + +definition Col :: + "['p,'p,'p] \ bool" + ("Col _ _ _" [99,99,99] 50) + where + "Col A B C \ + + Bet A B C \ Bet B C A \ Bet C A B" + +definition Bet4 :: + "['p,'p,'p,'p] \ bool" + ("Bet4 _ _ _ _" [99,99,99,99] 50) + where + "Bet4 A1 A2 A3 A4 \ + + Bet A1 A2 A3 \ + Bet A2 A3 A4 \ + Bet A1 A3 A4 \ + Bet A1 A2 A4" + +definition BetS :: + "['p,'p,'p] \ bool" ("BetS _ _ _" [99,99,99] 50) + where + "BetS A B C \ + + Bet A B C \ + A \ B \ + B \ C" + +subsubsection "Collinearity" + +definition FSC :: + "['p,'p,'p,'p,'p,'p,'p,'p] \ bool" + ("_ _ _ _ FSC _ _ _ _" [99,99,99,99,99,99,99,99] 50) + where + "A B C D FSC A' B' C' D' \ + + Col A B C \ + A B C Cong3 A' B' C' \ + Cong A D A' D' \ + Cong B D B' D'" + +subsubsection "Congruence and Betweenness" +definition IFSC :: + "['p,'p,'p,'p,'p,'p,'p,'p] \ bool" + ("_ _ _ _ IFSC _ _ _ _" [99,99,99,99,99,99,99,99] 50) + where + "A B C D IFSC A' B' C' D' \ + + Bet A B C \ + Bet A' B' C' \ + Cong A C A' C' \ + Cong B C B' C' \ + Cong A D A' D' \ + Cong C D C' D'" + +subsubsection "Between transivitity LE" + +definition Le :: + "['p,'p,'p,'p] \ bool" ("_ _ Le _ _" [99,99,99,99] 50) + where "A B Le C D \ + + \ E. (Bet C E D \ Cong A B C E)" + + +definition Lt :: + "['p,'p,'p,'p] \ bool" ("_ _ Lt _ _" [99,99,99,99] 50) + where "A B Lt C D \ + + A B Le C D \ \ Cong A B C D" + +definition Ge :: + "['p,'p,'p,'p] \ bool" ("_ _Ge _ _" [99,99,99,99] 50) + where "A B Ge C D \ + + C D Le A B" + +definition Gt :: + "['p,'p,'p,'p] \ bool" ("_ _ Gt _ _" [99,99,99,99] 50) + where "A B Gt C D \ + + C D Lt A B" + +subsubsection "Out lines" + +definition Out :: + "['p,'p,'p] \ bool" ("_ Out _ _" [99,99,99] 50) + where "P Out A B \ + + A \ P \ + B \ P \ + (Bet P A B \ Bet P B A)" + +subsubsection "Midpoint" + +definition Midpoint :: + "['p,'p,'p] \ bool" ("_ Midpoint _ _" [99,99,99] 50) + where "M Midpoint A B \ + + Bet A M B \ + Cong A M M B" + +subsubsection "Orthogonality" + +definition Per :: + "['p,'p,'p] \ bool" ("Per _ _ _" [99,99,99] 50) + where "Per A B C \ + + \ C'::'p. (B Midpoint C C' \ Cong A C A C')" + +definition PerpAt :: + "['p,'p,'p,'p,'p] \ bool" ("_ PerpAt _ _ _ _ " [99,99,99,99,99] 50) + where "X PerpAt A B C D \ + + A \ B \ + C \ D \ + Col X A B \ + Col X C D \ + (\ U V. ((Col U A B \ Col V C D) \ Per U X V))" + +definition Perp :: + "['p,'p,'p,'p] \ bool" ("_ _ Perp _ _" [99,99,99,99] 50) + where "A B Perp C D \ + + \ X::'p. X PerpAt A B C D" + +subsubsection "Coplanar" + +definition Coplanar :: + "['p,'p,'p,'p] \ bool" ("Coplanar _ _ _ _" [99,99,99,99] 50) + where "Coplanar A B C D \ + \ X. (Col A B X \ Col C D X) \ + (Col A C X \ Col B D X) \ + (Col A D X \ Col B C X)" + +definition TS :: + "['p,'p,'p,'p] \ bool" ("_ _ TS _ _" [99,99,99,99] 50) + where "A B TS P Q \ + \ Col P A B \ \ Col Q A B \ (\ T::'p. 
Col T A B \ Bet P T Q)" + +definition ReflectL :: + "['p,'p,'p,'p] \ bool" ("_ _ ReflectL _ _" [99,99,99,99] 50) + where "P' P ReflectL A B \ + (\ X. X Midpoint P P' \ Col A B X) \ (A B Perp P P' \ P = P')" + +definition Reflect :: + "['p,'p,'p,'p] \ bool" ("_ _ Reflect _ _" [99,99,99,99] 50) + where "P' P Reflect A B \ + (A \ B \ P' P ReflectL A B) \ (A = B \ A Midpoint P P')" + +definition InAngle :: + "['p,'p,'p,'p] \ bool" ("_ InAngle _ _ _" [99,99,99,99] 50) + where "P InAngle A B C \ + A \ B \ C \ B \ P \ B \ +(\ X. Bet A X C \ (X = B \ B Out X P))" + +definition ParStrict:: + "['p,'p,'p,'p] \ bool" ("_ _ ParStrict _ _" [99,99,99,99] 50) + where "A B ParStrict C D \ Coplanar A B C D \ \ (\ X. Col X A B \ Col X C D)" + +definition Par:: + "['p,'p,'p,'p] \ bool" ("_ _ Par _ _" [99,99,99,99] 50) + where "A B Par C D \ + A B ParStrict C D \ (A \ B \ C \ D \ Col A C D \ Col B C D)" + +definition Plg:: + "['p,'p,'p,'p] \ bool" ("Plg _ _ _ _" [99,99,99,99] 50) + where "Plg A B C D \ + (A \ C \ B \ D) \ (\ M. M Midpoint A C \ M Midpoint B D)" + +definition ParallelogramStrict:: + "['p,'p,'p,'p] \ bool" ("ParallelogramStrict _ _ _ _" [99,99,99,99] 50) + where "ParallelogramStrict A B A' B' \ + A A' TS B B' \ A B Par A' B' \ Cong A B A' B'" + +definition ParallelogramFlat:: + "['p,'p,'p,'p] \ bool" ("ParallelogramFlat _ _ _ _" [99,99,99,99] 50) + where "ParallelogramFlat A B A' B' \ + Col A B A' \ Col A B B' \ + Cong A B A' B' \ Cong A B' A' B \ + (A \ A' \ B \ B')" + +definition Parallelogram:: + "['p,'p,'p,'p] \ bool" ("Parallelogram _ _ _ _" [99,99,99,99] 50) + where "Parallelogram A B A' B' \ + ParallelogramStrict A B A' B' \ ParallelogramFlat A B A' B'" + +definition Rhombus:: + "['p,'p,'p,'p] \ bool" ("Rhombus _ _ _ _" [99,99,99,99] 50) + where "Rhombus A B C D \ Plg A B C D \ Cong A B B C" + +definition Rectangle:: + "['p,'p,'p,'p] \ bool" ("Rectangle _ _ _ _" [99,99,99,99] 50) + where "Rectangle A B C D \ Plg A B C D \ Cong A C B D" + +definition Square:: + "['p,'p,'p,'p] \ bool" ("Square _ _ _ _" [99,99,99,99] 50) + where "Square A B C D \ Rectangle A B C D \ Cong A B B C" + +definition Lambert:: + "['p,'p,'p,'p] \ bool" ("Lambert _ _ _ _" [99,99,99,99] 50) + where "Lambert A B C D \ + A \ B \ B \ C \ C \ D \ + A \ D \ Per B A D \ Per A D C \ Per A B C \ Coplanar A B C D" + +subsubsection "Plane" + +definition OS :: + "['p,'p,'p,'p] \ bool" ("_ _ OS _ _" [99,99,99,99] 50) + where "A B OS P Q \ +\ R::'p. A B TS P R \ A B TS Q R" + +definition TSP :: + "['p,'p,'p,'p,'p] \ bool" ("_ _ _TSP _ _" [99,99,99,99,99] 50) + where "A B C TSP P Q \ + (\ Coplanar A B C P) \ (\ Coplanar A B C Q) \ +(\ T. Coplanar A B C T \ Bet P T Q)" + +definition OSP :: + "['p,'p,'p,'p,'p] \ bool" ("_ _ _ OSP _ _" [99,99,99,99,99] 50) + where "A B C OSP P Q \ +\ R. 
((A B C TSP P R) \ (A B C TSP Q R))" + +definition Saccheri:: + "['p,'p,'p,'p] \ bool" ("Saccheri _ _ _ _" [99,99,99,99] 50) + where "Saccheri A B C D \ + Per B A D \ Per A D C \ Cong A B C D \ A D OS B C" + +subsubsection "Line reflexivity 2D" + +definition ReflectLAt :: + "['p,'p,'p,'p,'p] \ bool" ("_ ReflectLAt _ _ _ _" [99,99,99,99,99] 50) + where "M ReflectLAt P' P A B \ + (M Midpoint P P' \ Col A B M) \ (A B Perp P P' \ P = P')" + +definition ReflectAt :: + "['p,'p,'p,'p,'p] \ bool" ("_ ReflectAt _ _ _ _" [99,99,99,99,99] 50) + where "M ReflectAt P' P A B \ +(A \ B \ M ReflectLAt P' P A B) \ (A = B \ A = M \ M Midpoint P P')" + +subsubsection "Line reflexivity" + +definition upper_dim_axiom :: + "bool" ("UpperDimAxiom" [] 50) + where + "upper_dim_axiom \ + + \ A B C P Q. + P \ Q \ + Cong A P A Q \ + Cong B P B Q \ + Cong C P C Q + \ + (Bet A B C \ Bet B C A \ Bet C A B)" + +definition all_coplanar_axiom :: + "bool" ("AllCoplanarAxiom" [] 50) + where + "AllCoplanarAxiom \ + + \ A B C P Q. + P \ Q \ + Cong A P A Q \ + Cong B P B Q \ + Cong C P C Q + \ + (Bet A B C \ Bet B C A \ Bet C A B)" + +subsubsection "Angles" + +definition CongA :: + "['p,'p,'p,'p,'p,'p] \ bool" ("_ _ _ CongA _ _ _" [99,99,99,99,99,99] 50) + where "A B C CongA D E F \ + A \ B \ C \ B \ D \ E \ F \ E \ +(\ A' C' D' F'. Bet B A A' \ Cong A A' E D \ + Bet B C C' \ Cong C C' E F \ + Bet E D D' \ Cong D D' B A \ + Bet E F F' \ Cong F F' B C \ + Cong A' C' D' F')" + +definition LeA :: + "['p,'p,'p,'p,'p,'p] \ bool" ("_ _ _ LeA _ _ _" [99,99,99,99,99,99] 50) + where "A B C LeA D E F \ +\ P. (P InAngle D E F \ A B C CongA D E P)" + +definition LtA :: + "['p,'p,'p,'p,'p,'p] \ bool" ("_ _ _ LtA _ _ _" [99,99,99,99,99,99] 50) + where "A B C LtA D E F \ A B C LeA D E F \ \ A B C CongA D E F" + +definition GtA :: + "['p,'p,'p,'p,'p,'p] \ bool" ("_ _ _ GtA _ _ _" [99,99,99,99,99,99] 50) + where "A B C GtA D E F \ D E F LtA A B C" + +definition Acute :: + "['p,'p,'p] \ bool" ("Acute _ _ _" [99,99,99] 50) + where "Acute A B C \ +\ A' B' C'. (Per A' B' C' \ A B C LtA A' B' C')" + +definition Obtuse :: + "['p,'p,'p] \ bool" ("Obtuse _ _ _" [99,99,99] 50) + where "Obtuse A B C \ +\ A' B' C'. (Per A' B' C' \ A' B' C' LtA A B C)" + +definition OrthAt :: + "['p,'p,'p,'p,'p,'p] \ bool" ("_ OrthAt _ _ _ _ _" [99,99,99,99,99,99] 50) + where "X OrthAt A B C U V \ + \ Col A B C \ U \ V \ Coplanar A B C X \ Col U V X \ + (\ P Q. (Coplanar A B C P \ Col U V Q) \ Per P X Q)" + +definition Orth :: + "['p,'p,'p,'p,'p] \ bool" ("_ _ _ Orth _ _" [99,99,99,99,99] 50) + where "A B C Orth U V \ \ X. X OrthAt A B C U V" + +definition SuppA :: + "['p,'p,'p,'p,'p,'p] \ bool" + ("_ _ _ SuppA _ _ _ " [99,99,99,99,99,99] 50) + where + "A B C SuppA D E F \ + A \ B \ (\ A'. Bet A B A' \ D E F CongA C B A')" + +subsubsection "Sum of angles" + +definition SumA :: + "['p,'p,'p,'p,'p,'p,'p,'p,'p] \ bool" ("_ _ _ _ _ _ SumA _ _ _" [99,99,99,99,99,99,99,99,99] 50) + where + "A B C D E F SumA G H I \ + + \ J. (C B J CongA D E F \ \ B C OS A J \ Coplanar A B C J \ A B J CongA G H I)" + +definition TriSumA :: + "['p,'p,'p,'p,'p,'p] \ bool" ("_ _ _ TriSumA _ _ _" [99,99,99,99,99,99] 50) + where + "A B C TriSumA D E F \ + + \ G H I. (A B C B C A SumA G H I \ G H I C A B SumA D E F)" + +definition SAMS :: + "['p,'p,'p,'p,'p,'p] \ bool" ("SAMS _ _ _ _ _ _" [99,99,99,99,99,99] 50) + where + "SAMS A B C D E F \ + + (A \ B \ + (E Out D F \ \ Bet A B C)) \ + (\ J. 
(C B J CongA D E F \ \ (B C OS A J) \ \ (A B TS C J) \ Coplanar A B C J))" + +subsubsection "Parallelism" + +definition Inter :: + "['p,'p,'p,'p,'p] \ bool" ("_ Inter _ _ _ _" [99,99,99,99,99] 50) + where "X Inter A1 A2 B1 B2 \ + + B1 \ B2 \ + (\ P::'p. (Col P B1 B2 \ \ Col P A1 A2)) \ + Col A1 A2 X \ Col B1 B2 X" + +subsubsection "Perpendicularity" + +definition Perp2 :: + "['p,'p,'p,'p,'p] \ bool" ("_ Perp2 _ _ _ _" [99,99,99,99,99] 50) + where + "P Perp2 A B C D \ + + \ X Y. (Col P X Y \ X Y Perp A B \ X Y Perp C D)" + +subsubsection "Lentgh" + +definition QCong:: + "(['p,'p] \ bool) \ bool" ("QCong _" [99] 50) + where + "QCong l \ + + \ A B. (\ X Y. (Cong A B X Y \ l X Y))" + +definition TarskiLen:: + "['p,'p,(['p,'p] \ bool)] \ bool" ("TarskiLen _ _ _" [99,99,99] 50) + where + "TarskiLen A B l \ + + QCong l \ l A B" + +definition QCongNull :: + "(['p,'p] \ bool) \ bool" ("QCongNull _" [99] 50) + where + "QCongNull l \ + + QCong l \ (\ A. l A A)" + +subsubsection "Equivalence Class of Angles" + +definition QCongA :: + "(['p, 'p, 'p] \ bool) \ bool" ("QCongA _" [99] 50) + where + "QCongA a \ + + \ A B C. (A \ B \ C \ B \ (\ X Y Z. A B C CongA X Y Z \ a X Y Z))" + +definition Ang :: + "['p,'p,'p, (['p, 'p, 'p] \ bool) ] \ bool" ("_ _ _ Ang _" [99,99,99,99] 50) + where + "A B C Ang a \ + + QCongA a \ + a A B C" + +definition QCongAAcute :: + "(['p, 'p, 'p] \ bool) \ bool" ("QCongAACute _" [99] 50) + where + "QCongAAcute a \ + + \ A B C. (Acute A B C \ (\ X Y Z. (A B C CongA X Y Z \ a X Y Z)))" + +definition AngAcute :: + "['p,'p,'p, (['p,'p,'p] \ bool)] \ bool" ("_ _ _ AngAcute _" [99,99,99,99] 50) + where + "A B C AngAcute a \ + + ((QCongAAcute a) \ (a A B C))" + +definition QCongANullAcute :: + "(['p,'p,'p] \ bool) \ bool" ("QCongANullAcute _" [99] 50) + where + "QCongANullAcute a \ + + QCongAAcute a \ + (\ A B C. (a A B C \ B Out A C))" + +definition QCongAnNull :: + "(['p,'p,'p] \ bool) \ bool" ("QCongAnNull _" [99] 50) + where + "QCongAnNull a \ + + QCongA a \ + (\ A B C. (a A B C \ \ B Out A C))" + +definition QCongAnFlat :: + "(['p,'p,'p] \ bool) \ bool" ("QCongAnFlat _" [99] 50) + where + "QCongAnFlat a \ + + QCongA a \ + (\ A B C. (a A B C \ \ Bet A B C))" + +definition IsNullAngaP :: + "(['p,'p,'p] \ bool) \ bool" ("IsNullAngaP _" [99] 50) + where + "IsNullAngaP a\ + + QCongAAcute a \ + (\ A B C. (a A B C \ B Out A C))" + +definition QCongANull :: + "(['p,'p,'p] \ bool) \ bool" ("QCongANull _" [99] 50) + where + "QCongANull a \ + + QCongA a \ + (\ A B C. (a A B C \ B Out A C))" + +definition AngFlat :: + "(['p, 'p, 'p] \ bool) \ bool" ("AngFlat _" [99] 50) + where + "AngFlat a \ + + QCongA a \ + (\ A B C. (a A B C \ Bet A B C))" + +subsection "Parallel's definition Postulate" + +definition tarski_s_parallel_postulate :: + "bool" + ("TarskiSParallelPostulate") + where + "tarski_s_parallel_postulate \ +\ A B C D T. (Bet A D T \ Bet B D C \ A \ D) \ +(\ X Y. Bet A B X \ Bet A C Y \ Bet X T Y)" + +definition euclid_5 :: + "bool" ("Euclid5") + where + "euclid_5 \ + + \ P Q R S T U. + (BetS P T Q \ + BetS R T S \ + BetS Q U R \ + \ Col P Q S \ + Cong P T Q T \ + Cong R T S T) + \ + (\ I. BetS S Q I \ BetS P U I)" + +definition euclid_s_parallel_postulate :: + "bool" ("EuclidSParallelPostulate") + where + "euclid_s_parallel_postulate \ + + \ A B C D P Q R. + (B C OS A D \ + SAMS A B C B C D \ + A B C B C D SumA P Q R \ + \ Bet P Q R) + \ + (\ Y. 
B Out A Y \ C Out D Y)" + +definition playfair_s_postulate :: + "bool" + ("PlayfairSPostulate") + where + "playfair_s_postulate \ + + \ A1 A2 B1 B2 C1 C2 P. + (A1 A2 Par B1 B2 \ + Col P B1 B2 \ + A1 A2 Par C1 C2 \ + Col P C1 C2) + \ + (Col C1 B1 B2 \ Col C2 B1 B2)" + +section "Propositions" + +subsection "Congruence properties" + +lemma cong_reflexivity: + shows "Cong A B A B" + using cong_inner_transitivity cong_pseudo_reflexivity by blast + +lemma cong_symmetry: + assumes "Cong A B C D" + shows "Cong C D A B" + using assms cong_inner_transitivity cong_reflexivity by blast + +lemma cong_transitivity: + assumes "Cong A B C D" and "Cong C D E F" + shows "Cong A B E F" + by (meson assms(1) assms(2) cong_inner_transitivity cong_pseudo_reflexivity) + +lemma cong_left_commutativity: + assumes "Cong A B C D" + shows "Cong B A C D" + using assms cong_inner_transitivity cong_pseudo_reflexivity by blast + +lemma cong_right_commutativity: + assumes "Cong A B C D" + shows "Cong A B D C" + using assms cong_left_commutativity cong_symmetry by blast + +lemma cong_3421: + assumes "Cong A B C D" + shows "Cong C D B A" + using assms cong_left_commutativity cong_symmetry by blast + +lemma cong_4312: + assumes "Cong A B C D" + shows "Cong D C A B" + using assms cong_left_commutativity cong_symmetry by blast + +lemma cong_4321: + assumes "Cong A B C D" + shows "Cong D C B A" + using assms cong_3421 cong_left_commutativity by blast + +lemma cong_trivial_identity: + shows "Cong A A B B" + using cong_identity segment_construction by blast + +lemma cong_reverse_identity: + assumes "Cong A A C D" + shows "C = D" + using assms cong_3421 cong_identity by blast + +lemma cong_commutativity: + assumes "Cong A B C D" + shows "Cong B A D C" + using assms cong_3421 by blast + +lemma not_cong_2134: + assumes " \ Cong A B C D" + shows "\ Cong B A C D" + using assms cong_left_commutativity by blast + +lemma not_cong_1243: + assumes "\ Cong A B C D" + shows "\ Cong A B D C" + using assms cong_right_commutativity by blast + +lemma not_cong_2143: + assumes "\ Cong A B C D" + shows "\ Cong B A D C" + using assms cong_commutativity by blast + +lemma not_cong_3412: + assumes "\ Cong A B C D" + shows "\ Cong C D A B" + using assms cong_symmetry by blast + +lemma not_cong_4312: + assumes "\ Cong A B C D" + shows "\ Cong D C A B" + using assms cong_3421 by blast + +lemma not_cong_3421: + assumes "\ Cong A B C D" + shows "\ Cong C D B A" + using assms cong_4312 by blast + +lemma not_cong_4321: + assumes "\ Cong A B C D" + shows "\ Cong D C B A" + using assms cong_4321 by blast + +lemma five_segment_with_def: + assumes "A B C D OFSC A' B' C' D'" and "A \ B" + shows "Cong C D C' D'" + using assms(1) assms(2) OFSC_def five_segment by blast + +lemma cong_diff: + assumes "A \ B" and "Cong A B C D" + shows "C \ D" + using assms(1) assms(2) cong_identity by blast + +lemma cong_diff_2: + assumes "B \ A" and "Cong A B C D" + shows "C \ D" + using assms(1) assms(2) cong_identity by blast + +lemma cong_diff_3: + assumes "C \ D" and "Cong A B C D" + shows "A \ B" + using assms(1) assms(2) cong_reverse_identity by blast + +lemma cong_diff_4: + assumes "D \ C" and "Cong A B C D" + shows "A \ B" + using assms(1) assms(2) cong_reverse_identity by blast + +lemma cong_3_sym: + assumes "A B C Cong3 A' B' C'" + shows "A' B' C' Cong3 A B C" + using assms Cong3_def not_cong_3412 by blast + +lemma cong_3_swap: + assumes "A B C Cong3 A' B' C'" + shows "B A C Cong3 B' A' C'" + using assms Cong3_def cong_commutativity by blast + +lemma cong_3_swap_2: + assumes 
"A B C Cong3 A' B' C'" + shows "A C B Cong3 A' C' B'" + using assms Cong3_def cong_commutativity by blast + +lemma cong3_transitivity: + assumes "A0 B0 C0 Cong3 A1 B1 C1" and + "A1 B1 C1 Cong3 A2 B2 C2" + shows "A0 B0 C0 Cong3 A2 B2 C2" + by (meson assms(1) assms(2) Cong3_def cong_inner_transitivity not_cong_3412) + +lemma eq_dec_points: + shows "A = B \ \ A = B" + by simp + +lemma distinct: + assumes "P \ Q" + shows "R \ P \ R \ Q" + using assms by simp + +lemma l2_11: + assumes "Bet A B C" and + "Bet A' B' C'" and + "Cong A B A' B'" and + "Cong B C B' C'" + shows "Cong A C A' C'" + by (smt assms(1) assms(2) assms(3) assms(4) cong_right_commutativity cong_symmetry cong_trivial_identity five_segment) + +lemma bet_cong3: + assumes "Bet A B C" and + "Cong A B A' B'" + shows "\ C'. A B C Cong3 A' B' C'" + by (meson assms(1) assms(2) Cong3_def l2_11 not_cong_3412 segment_construction) + +lemma construction_uniqueness: + assumes "Q \ A" and + "Bet Q A X" and + "Cong A X B C" and + "Bet Q A Y" and + "Cong A Y B C" + shows "X = Y" + by (meson assms(1) assms(2) assms(3) assms(4) assms(5) cong_identity cong_inner_transitivity cong_reflexivity five_segment) + +lemma Cong_cases: + assumes "Cong A B C D \ Cong A B D C \ Cong B A C D \ Cong B A D C \ Cong C D A B \ Cong C D B A \ Cong D C A B \ Cong D C B A" + shows "Cong A B C D" + using assms not_cong_3421 not_cong_4321 by blast + +lemma Cong_perm : + assumes "Cong A B C D" + shows "Cong A B C D \ Cong A B D C \ Cong B A C D \ Cong B A D C \ Cong C D A B \ Cong C D B A \ Cong D C A B \ Cong D C B A" + using assms not_cong_1243 not_cong_3412 by blast + +subsection "Betweeness properties" + +lemma bet_col: + assumes "Bet A B C" + shows "Col A B C" + by (simp add: assms Col_def) + +lemma between_trivial: + shows "Bet A B B" + using cong_identity segment_construction by blast + +lemma between_symmetry: + assumes "Bet A B C" + shows "Bet C B A" + using assms between_identity between_trivial inner_pasch by blast + +lemma Bet_cases: + assumes "Bet A B C \ Bet C B A" + shows "Bet A B C" + using assms between_symmetry by blast + +lemma Bet_perm: + assumes "Bet A B C" + shows "Bet A B C \ Bet C B A" + using assms Bet_cases by blast + +lemma between_trivial2: + shows "Bet A A B" + using Bet_perm between_trivial by blast + +lemma between_equality: + assumes "Bet A B C" and "Bet B A C" + shows "A = B" + using assms(1) assms(2) between_identity inner_pasch by blast + +lemma between_equality_2: + assumes "Bet A B C" and + "Bet A C B" + shows "B = C" + using assms(1) assms(2) between_equality between_symmetry by blast + +lemma between_exchange3: + assumes "Bet A B C" and + "Bet A C D" + shows "Bet B C D" + by (metis Bet_perm assms(1) assms(2) between_identity inner_pasch) + +lemma bet_neq12__neq: + assumes "Bet A B C" and + "A \ B" + shows "A \ C" + using assms(1) assms(2) between_identity by blast + +lemma bet_neq21__neq: + assumes "Bet A B C" and + "B \ A" + shows "A \ C" + using assms(1) assms(2) between_identity by blast + +lemma bet_neq23__neq: + assumes "Bet A B C" and + "B \ C" + shows "A \ C" + using assms(1) assms(2) between_identity by blast + +lemma bet_neq32__neq: + assumes "Bet A B C" and + "C \ B" + shows "A \ C" + using assms(1) assms(2) between_identity by blast + +lemma not_bet_distincts: + assumes "\ Bet A B C" + shows "A \ B \ B \ C" + using assms between_trivial between_trivial2 by blast + +lemma between_inner_transitivity: + assumes "Bet A B D" and + "Bet B C D" + shows "Bet A B C" + using assms(1) assms(2) Bet_perm between_exchange3 by blast + 
+lemma outer_transitivity_between2: + assumes "Bet A B C" and + "Bet B C D" and + "B \ C" + shows "Bet A C D" +proof - + obtain X where "Bet A C X \ Cong C X C D" + using segment_construction by blast + thus ?thesis + using assms(1) assms(2) assms(3) between_exchange3 cong_inner_transitivity construction_uniqueness by blast +qed + +lemma between_exchange2: + assumes "Bet A B D" and + "Bet B C D" + shows "Bet A C D" + using assms(1) assms(2) between_inner_transitivity outer_transitivity_between2 by blast + +lemma outer_transitivity_between: + assumes "Bet A B C" and + "Bet B C D" and + "B \ C" + shows "Bet A B D" + using assms(1) assms(2) assms(3) between_symmetry outer_transitivity_between2 by blast + +lemma between_exchange4: + assumes "Bet A B C" and + "Bet A C D" + shows "Bet A B D" + using assms(1) assms(2) between_exchange2 between_symmetry by blast + +lemma l3_9_4: + assumes "Bet4 A1 A2 A3 A4" + shows "Bet4 A4 A3 A2 A1" + using assms Bet4_def Bet_cases by blast + +lemma l3_17: + assumes "Bet A B C" and + "Bet A' B' C" and + "Bet A P A'" + shows "(\ Q. Bet P Q C \ Bet B Q B')" +proof - + obtain X where "Bet B' X A \ Bet P X C" + using Bet_perm assms(2) assms(3) inner_pasch by blast + moreover + then obtain Y where "Bet X Y C \ Bet B Y B'" + using Bet_perm assms(1) inner_pasch by blast + ultimately show ?thesis + using between_exchange2 by blast +qed + +lemma lower_dim_ex: + "\ A B C. \ (Bet A B C \ Bet B C A \ Bet C A B)" + using lower_dim by auto + +lemma two_distinct_points: + "\ X::'p. \ Y::'p. X \ Y" + using lower_dim_ex not_bet_distincts by blast + +lemma point_construction_different: + "\ C. Bet A B C \ B \ C" + using Tarski_neutral_dimensionless.two_distinct_points Tarski_neutral_dimensionless_axioms cong_reverse_identity segment_construction by blast + +lemma another_point: + "\ B::'p. 
A \ B" + using point_construction_different by blast + +lemma Cong_stability: + assumes "\ \ Cong A B C D" + shows "Cong A B C D" + using assms by simp + +lemma l2_11_b: + assumes "Bet A B C" and + "Bet A' B' C'" and + "Cong A B A' B'" and + "Cong B C B' C'" + shows "Cong A C A' C'" + using assms(1) assms(2) assms(3) assms(4) l2_11 by auto + +lemma cong_dec_eq_dec_b: + assumes "\ A \ B" + shows "A = B" + using assms(1) by simp + +lemma BetSEq: + assumes "BetS A B C" + shows "Bet A B C \ A \ B \ A \ C \ B \ C" + using assms BetS_def between_identity by auto + +subsection "Collinearity" + +subsubsection "Collinearity and betweenness" + +lemma l4_2: + assumes "A B C D IFSC A' B' C' D'" + shows "Cong B D B' D'" +proof cases + assume "A = C" + thus ?thesis + by (metis IFSC_def Tarski_neutral_dimensionless.between_identity Tarski_neutral_dimensionless_axioms assms cong_diff_3) +next + assume H1: "A \ C" + have H2: "Bet A B C \ Bet A' B' C' \ + Cong A C A' C' \ Cong B C B' C' \ + Cong A D A' D' \ Cong C D C' D'" + using IFSC_def assms by auto + obtain E where P1: "Bet A C E \ Cong C E A C" + using segment_construction by blast + have P1A: "Bet A C E" + using P1 by simp + have P1B: "Cong C E A C" + using P1 by simp + obtain E' where P2: "Bet A' C' E' \ Cong C' E' C E" + using segment_construction by blast + have P2A: "Bet A' C' E'" + using P2 by simp + have P2B: "Cong C' E' C E" + using P2 by simp + then have "Cong C E C' E'" + using not_cong_3412 by blast + then have "Cong E D E' D'" + using H1 H2 P1 P2 five_segment by blast + thus ?thesis + by (smt H1 H2 P1A P1B P2A P2B Tarski_neutral_dimensionless.cong_commutativity Tarski_neutral_dimensionless.cong_diff_3 Tarski_neutral_dimensionless.cong_symmetry Tarski_neutral_dimensionless_axioms between_inner_transitivity between_symmetry five_segment) +qed + +lemma l4_3: + assumes "Bet A B C" and + "Bet A' B' C'" and + "Cong A C A' C'" + and "Cong B C B' C'" + shows "Cong A B A' B'" +proof - + have "A B C A IFSC A' B' C' A'" + using IFSC_def assms(1) assms(2) assms(3) assms(4) cong_trivial_identity not_cong_2143 by blast + thus ?thesis + using l4_2 not_cong_2143 by blast +qed + + +lemma l4_3_1: + assumes "Bet A B C" and + "Bet A' B' C'" and + "Cong A B A' B'" and + "Cong A C A' C'" + shows "Cong B C B' C'" + by (meson assms(1) assms(2) assms(3) assms(4) between_symmetry cong_4321 l4_3) + +lemma l4_5: + assumes "Bet A B C" and + "Cong A C A' C'" + shows "\ B'. 
(Bet A' B' C' \ A B C Cong3 A' B' C')" +proof - + obtain X' where P1: "Bet C' A' X' \ A' \ X'" + using point_construction_different by auto + obtain B' where P2: "Bet X' A' B' \ Cong A' B' A B" + using segment_construction by blast + obtain C'' where P3: "Bet X' B' C'' \ Cong B' C'' B C" + using segment_construction by blast + then have P4: "Bet A' B' C''" + using P2 between_exchange3 by blast + then have "C'' = C'" + by (smt P1 P2 P3 assms(1) assms(2) between_exchange4 between_symmetry cong_symmetry construction_uniqueness l2_11_b) + then show ?thesis + by (smt Cong3_def P1 P2 P3 Tarski_neutral_dimensionless.construction_uniqueness Tarski_neutral_dimensionless_axioms P4 assms(1) assms(2) between_exchange4 between_symmetry cong_commutativity cong_symmetry cong_trivial_identity five_segment not_bet_distincts) +qed + +lemma l4_6: + assumes "Bet A B C" and + "A B C Cong3 A' B' C'" + shows "Bet A' B' C'" +proof - + obtain x where P1: "Bet A' x C' \ A B C Cong3 A' x C'" + using Cong3_def assms(1) assms(2) l4_5 by blast + then have "A' x C' Cong3 A' B' C'" + using assms(2) cong3_transitivity cong_3_sym by blast + then have "A' x C' x IFSC A' x C' B'" + by (meson Cong3_def Cong_perm IFSC_def P1 cong_reflexivity) + then have "Cong x x x B'" + using l4_2 by auto + then show ?thesis + using P1 cong_reverse_identity by blast +qed + +lemma cong3_bet_eq: + assumes "Bet A B C" and + "A B C Cong3 A X C" + shows "X = B" +proof - + have "A B C B IFSC A B C X" + by (meson Cong3_def Cong_perm IFSC_def assms(1) assms(2) cong_reflexivity) + then show ?thesis + using cong_reverse_identity l4_2 by blast +qed + +subsubsection "Collinearity" + +lemma col_permutation_1: + assumes "Col A B C" + shows "Col B C A" + using assms(1) Col_def by blast + +lemma col_permutation_2: + assumes "Col A B C" + shows "Col C A B" + using assms(1) col_permutation_1 by blast + +lemma col_permutation_3: + assumes "Col A B C" + shows "Col C B A" + using assms(1) Bet_cases Col_def by auto + +lemma col_permutation_4: + assumes "Col A B C" + shows "Col B A C" + using assms(1) Bet_perm Col_def by blast + +lemma col_permutation_5: + assumes "Col A B C" + shows "Col A C B" + using assms(1) col_permutation_1 col_permutation_3 by blast + +lemma not_col_permutation_1: + assumes "\ Col A B C" + shows "\ Col B C A" + using assms col_permutation_2 by blast + +lemma not_col_permutation_2: + assumes "~ Col A B C" + shows "~ Col C A B" + using assms col_permutation_1 by blast + +lemma not_col_permutation_3: + assumes "\ Col A B C" + shows "\ Col C B A" + using assms col_permutation_3 by blast + +lemma not_col_permutation_4: + assumes "\ Col A B C" + shows "\ Col B A C" + using assms col_permutation_4 by blast + +lemma not_col_permutation_5: + assumes "\ Col A B C" + shows "\ Col A C B" + using assms col_permutation_5 by blast + +lemma Col_cases: + assumes "Col A B C \ Col A C B \ Col B A C \ Col B C A \ Col C A B \ Col C B A" + shows "Col A B C" + using assms not_col_permutation_4 not_col_permutation_5 by blast + +lemma Col_perm: + assumes "Col A B C" + shows "Col A B C \ Col A C B \ Col B A C \ Col B C A \ Col C A B \ Col C B A" + using Col_cases assms by blast + +lemma col_trivial_1: + "Col A A B" + using bet_col not_bet_distincts by blast + +lemma col_trivial_2: + "Col A B B" + by (simp add: Col_def between_trivial2) + +lemma col_trivial_3: + "Col A B A" + by (simp add: Col_def between_trivial2) + +lemma l4_13: + assumes "Col A B C" and + "A B C Cong3 A' B' C'" + shows "Col A' B' C'" + by (metis Tarski_neutral_dimensionless.Col_def 
Tarski_neutral_dimensionless.cong_3_swap Tarski_neutral_dimensionless.cong_3_swap_2 Tarski_neutral_dimensionless_axioms assms(1) assms(2) l4_6) + +lemma l4_14R1: + assumes "Bet A B C" and + "Cong A B A' B'" + shows "\ C'. A B C Cong3 A' B' C'" + by (simp add: assms(1) assms(2) bet_cong3) + +lemma l4_14R2: + assumes "Bet B C A" and + "Cong A B A' B'" + shows "\ C'. A B C Cong3 A' B' C'" + by (meson assms(1) assms(2) between_symmetry cong_3_swap_2 l4_5) + +lemma l4_14R3: + assumes "Bet C A B" and + "Cong A B A' B'" + shows "\ C'. A B C Cong3 A' B' C'" + by (meson assms(1) assms(2) between_symmetry cong_3_swap l4_14R1 not_cong_2143) + +lemma l4_14: + assumes "Col A B C" and + "Cong A B A' B'" + shows "\ C'. A B C Cong3 A' B' C'" + using Col_def assms(1) assms(2) l4_14R1 l4_14R2 l4_14R3 by blast + +lemma l4_16R1: + assumes "A B C D FSC A' B' C' D'" and + "A \ B" and + "Bet A B C" + shows "Cong C D C' D'" +proof - + have "A B C Cong3 A' B' C'" + using FSC_def assms(1) by blast + then have "Bet A' B' C'" + using assms(3) l4_6 by blast + then have "A B C D OFSC A' B' C' D'" + by (meson Cong3_def FSC_def OFSC_def assms(1) cong_3_sym l4_6) + thus ?thesis + using assms(2) five_segment_with_def by blast +qed + +lemma l4_16R2: + assumes "A B C D FSC A' B' C' D'" + and "Bet B C A" + shows "Cong C D C' D'" +proof - + have "A B C Cong3 A' B' C'" + using FSC_def assms(1) by blast + then have "Bet B' C' A'" + using Bet_perm assms(2) cong_3_swap_2 l4_6 by blast + then have "B C A D IFSC B' C' A' D'" + by (meson Cong3_def FSC_def IFSC_def assms(1) assms(2) not_cong_2143) + then show ?thesis + using l4_2 by auto +qed + +lemma l4_16R3: + assumes "A B C D FSC A' B' C' D'" and "A \ B" + and "Bet C A B" + shows "Cong C D C' D'" +proof - + have "A B C Cong3 A' B' C'" + using FSC_def assms(1) by blast + then have "Bet C' A' B'" + using assms(3) between_symmetry cong_3_swap l4_6 by blast + thus ?thesis + by (smt Cong3_def FSC_def assms(1) assms(2) assms(3) between_symmetry cong_commutativity five_segment) +qed + +lemma l4_16: + assumes "A B C D FSC A' B' C' D'" and + "A \ B" + shows "Cong C D C' D'" + by (meson Col_def FSC_def assms(1) assms(2) l4_16R1 l4_16R2 l4_16R3) + +lemma l4_17: + assumes "A \ B" and + "Col A B C" and + "Cong A P A Q" and + "Cong B P B Q" + shows "Cong C P C Q" +proof - + { + assume "\ Bet B C A" + then have "\p pa. 
Bet p pa C \ Cong pa P pa Q \ Cong p P p Q \ p \ pa" + using Col_def assms(1) assms(2) assms(3) assms(4) between_symmetry by blast + then have ?thesis + using cong_reflexivity five_segment by blast + } + then show ?thesis + by (meson IFSC_def assms(3) assms(4) cong_reflexivity l4_2) +qed + + +lemma l4_18: + assumes "A \ B" and + "Col A B C" and + "Cong A C A C'" and + "Cong B C B C'" + shows "C = C'" + using assms(1) assms(2) assms(3) assms(4) cong_diff_3 l4_17 by blast + +lemma l4_19: + assumes "Bet A C B" and + "Cong A C A C'" and + "Cong B C B C'" + shows "C = C'" + by (metis Col_def assms(1) assms(2) assms(3) between_equality between_trivial cong_identity l4_18 not_cong_3421) + +lemma not_col_distincts: + assumes "\ Col A B C" + shows "\ Col A B C \ A \ B \ B \ C \ A \ C" + using Col_def assms between_trivial by blast + +lemma NCol_cases: + assumes "\ Col A B C \ \ Col A C B \ \ Col B A C \ \ Col B C A \ \ Col C A B \ \ Col C B A" + shows "\ Col A B C" + using assms not_col_permutation_2 not_col_permutation_3 by blast + +lemma NCol_perm: + assumes "\ Col A B C" + shows "\ Col A B C \ ~ Col A C B \ ~ Col B A C \ ~ Col B C A \ ~ Col C A B \ ~ Col C B A" + using NCol_cases assms by blast + +lemma col_cong_3_cong_3_eq: + assumes "A \ B" + and "Col A B C" + and "A B C Cong3 A' B' C1" + and "A B C Cong3 A' B' C2" + shows "C1 = C2" + by (metis Tarski_neutral_dimensionless.Cong3_def Tarski_neutral_dimensionless.cong_diff Tarski_neutral_dimensionless.l4_18 Tarski_neutral_dimensionless_axioms assms(1) assms(2) assms(3) assms(4) cong_inner_transitivity l4_13) + +subsection "Between transitivity le" + +lemma l5_1: + assumes "A \ B" and + "Bet A B C" and + "Bet A B D" + shows "Bet A C D \ Bet A D C" +proof - + obtain C' where P1: "Bet A D C' \ Cong D C' C D" + using segment_construction by blast + obtain D' where P2: "Bet A C D' \ Cong C D' C D" + using segment_construction by blast + obtain B' where P3: "Bet A C' B' \ Cong C' B' C B" + using segment_construction by blast + obtain B'' where P4: "Bet A D' B'' \ Cong D' B'' D B" + using segment_construction by blast + then have P5: "Cong B C' B'' C" + by (smt P1 P2 assms(3) between_exchange3 between_symmetry cong_4312 cong_inner_transitivity l2_11_b) + then have "Cong B B' B'' B" + by (meson Bet_cases P1 P2 P3 P4 assms(2) assms(3) between_exchange4 between_inner_transitivity l2_11_b) + then have P6: "B'' = B'" + by (meson P1 P2 P3 P4 assms(1) assms(2) assms(3) between_exchange4 cong_inner_transitivity construction_uniqueness not_cong_2134) + have "Bet B C D'" + using P2 assms(2) between_exchange3 by blast + then have "B C D' C' FSC B' C' D C" + by (smt Cong3_def FSC_def P1 P2 P3 P5 P6 bet_col between_exchange3 between_symmetry cong_3421 cong_pseudo_reflexivity cong_transitivity l2_11_b) + then have P8: "Cong D' C' D C" + using P3 P4 P6 cong_identity l4_16 by blast + obtain E where P9: "Bet C E C' \ Bet D E D'" + using P1 P2 between_trivial2 l3_17 by blast + then have P10: "D E D' C IFSC D E D' C'" + by (smt IFSC_def P1 P2 P8 Tarski_neutral_dimensionless.cong_reflexivity Tarski_neutral_dimensionless_axioms cong_3421 cong_inner_transitivity) + then have "Cong E C E C'" + using l4_2 by auto + have P11: "C E C' D IFSC C E C' D'" + by (smt IFSC_def P1 P2 Tarski_neutral_dimensionless.cong_reflexivity Tarski_neutral_dimensionless_axioms P8 P9 cong_3421 cong_inner_transitivity) + then have "Cong E D E D'" + using l4_2 by auto + obtain P where "Bet C' C P \ Cong C P C D'" + using segment_construction by blast + obtain R where "Bet D' C R \ Cong C R C E" + 
using segment_construction by blast + obtain Q where "Bet P R Q \ Cong R Q R P" + using segment_construction by blast + have "D' C R P FSC P C E D'" + by (meson Bet_perm Cong3_def FSC_def \Bet C E C' \ Bet D E D'\ \Bet C' C P \ Cong C P C D'\ \Bet D' C R \ Cong C R C E\ bet_col between_exchange3 cong_pseudo_reflexivity l2_11_b not_cong_4321) + have "Cong R P E D'" + by (metis Cong_cases \D' C R P FSC P C E D'\ \Bet C' C P \ Cong C P C D'\ \Bet D' C R \ Cong C R C E\ cong_diff_2 l4_16) + have "Cong R Q E D" + by (metis Cong_cases \Cong E D E D'\ \Cong R P E D'\ \Bet P R Q \ Cong R Q R P\ cong_transitivity) + have "D' E D C FSC P R Q C" + by (meson Bet_perm Cong3_def FSC_def \Cong R P E D'\ \Cong R Q E D\ \Bet C E C' \ Bet D E D'\ \Bet C' C P \ Cong C P C D'\ \Bet D' C R \ Cong C R C E\ \Bet P R Q \ Cong R Q R P\ bet_col l2_11_b not_cong_2143 not_cong_4321) + have "Cong D C Q C" + using \D' E D C FSC P R Q C\ \Cong E D E D'\ \Bet C E C' \ Bet D E D'\ cong_identity l4_16 l4_16R2 by blast + have "Cong C P C Q" + using P2 \Cong D C Q C\ \Bet C' C P \ Cong C P C D'\ cong_right_commutativity cong_transitivity by blast + have "Bet A C D \ Bet A D C" + proof cases + assume "R = C" + then show ?thesis + by (metis P1 \Cong E C E C'\ \Bet D' C R \ Cong C R C E\ cong_diff_4) + next + assume "R \ C" + { + have "Cong D' P D' Q" + proof - + + have "Col R C D'" + by (simp add: \Bet D' C R \ Cong C R C E\ bet_col between_symmetry) + have "Cong R P R Q" + by (metis Tarski_neutral_dimensionless.Cong_cases Tarski_neutral_dimensionless_axioms \Bet P R Q \ Cong R Q R P\) + have "Cong C P C Q" + by (simp add: \Cong C P C Q\) + then show ?thesis + using \Col R C D'\ \Cong R P R Q\ \R \ C\ l4_17 by blast + qed + then have "Cong B P B Q" using \Cong C P C Q\ \Bet B C D'\ cong_diff_4 + by (metis Col_def \Bet C' C P \ Cong C P C D'\ cong_reflexivity l4_17 not_cong_3412) + have "Cong B' P B' Q" + by (metis P2 P4 \B'' = B'\ \Cong C P C Q\ \Cong D' P D' Q\ \Bet C' C P \ Cong C P C D'\ between_exchange3 cong_diff_4 cong_identity cong_reflexivity five_segment) + have "Cong C' P C' Q" + proof - + have "Bet B C' B'" + using P1 P3 assms(3) between_exchange3 between_exchange4 by blast + then show ?thesis + by (metis Col_def \Cong B P B Q\ \Cong B' P B' Q\ between_equality l4_17 not_bet_distincts) + qed + have "Cong P P P Q" + by (metis Tarski_neutral_dimensionless.cong_diff_2 Tarski_neutral_dimensionless_axioms \Cong C P C Q\ \Cong C' P C' Q\ \R \ C\ \Bet C E C' \ Bet D E D'\ \Bet C' C P \ Cong C P C D'\ \Bet D' C R \ Cong C R C E\ bet_col bet_neq12__neq l4_17) + thus ?thesis + by (metis P2 \Cong R P E D'\ \Cong R Q E D\ \Bet P R Q \ Cong R Q R P\ bet_neq12__neq cong_diff_4) + } + then have "R \ C \ Bet A C D \ Bet A D C" by blast + qed + thus ?thesis + by simp +qed + +lemma l5_2: + assumes "A \ B" and + "Bet A B C" and + "Bet A B D" + shows "Bet B C D \ Bet B D C" + using assms(1) assms(2) assms(3) between_exchange3 l5_1 by blast + +lemma segment_construction_2: + assumes "A \ Q" + shows "\ X. 
((Bet Q A X \ Bet Q X A) \ Cong Q X B C)" +proof - + obtain A' where P1: "Bet A Q A' \ Cong Q A' A Q" + using segment_construction by blast + obtain X where P2: "Bet A' Q X \ Cong Q X B C" + using segment_construction by blast + then show ?thesis + by (metis P1 Tarski_neutral_dimensionless.cong_diff_4 Tarski_neutral_dimensionless_axioms between_symmetry l5_2) +qed + +lemma l5_3: + assumes "Bet A B D" and + "Bet A C D" + shows "Bet A B C \ Bet A C B" + by (metis Bet_perm assms(1) assms(2) between_inner_transitivity l5_2 point_construction_different) + +lemma bet3__bet: + assumes "Bet A B E" and + "Bet A D E" and + "Bet B C D" + shows "Bet A C E" + by (meson assms(1) assms(2) assms(3) between_exchange2 between_symmetry l5_3) + +lemma le_bet: + assumes "C D Le A B" + shows "\ X. (Bet A X B \ Cong A X C D)" + by (meson Le_def assms cong_symmetry) + +lemma l5_5_1: + assumes "A B Le C D" + shows "\ X. (Bet A B X \ Cong A X C D)" +proof - + obtain P where P1: "Bet C P D \ Cong A B C P" + using Le_def assms by blast + obtain X where P2: "Bet A B X \ Cong B X P D" + using segment_construction by blast + then have "Cong A X C D" + using P1 l2_11_b by blast + then show ?thesis + using P2 by blast +qed + +lemma l5_5_2: + assumes "\ X. (Bet A B X \ Cong A X C D)" + shows "A B Le C D" +proof - + obtain P where P1: "Bet A B P \ Cong A P C D" + using assms by blast + obtain B' where P2: "Bet C B' D \ A B P Cong3 C B' D" + using P1 l4_5 by blast + then show ?thesis + using Cong3_def Le_def by blast +qed + +lemma l5_6: + assumes "A B Le C D" and + "Cong A B A' B'" and + "Cong C D C' D'" + shows "A' B' Le C' D'" + by (meson Cong3_def Le_def assms(1) assms(2) assms(3) cong_inner_transitivity l4_5) + +lemma le_reflexivity: + shows "A B Le A B" + using between_trivial cong_reflexivity l5_5_2 by blast + +lemma le_transitivity: + assumes "A B Le C D" and + "C D Le E F" + shows "A B Le E F" + by (meson assms(1) assms(2) between_exchange4 cong_reflexivity l5_5_1 l5_5_2 l5_6 le_bet) + +lemma between_cong: + assumes "Bet A C B" and + "Cong A C A B" + shows "C = B" + by (smt assms(1) assms(2) between_trivial cong_inner_transitivity five_segment l4_19 l4_3_1) + +lemma cong3_symmetry: + assumes "A B C Cong3 A' B' C'" + shows "A' B' C' Cong3 A B C" + by (simp add: assms cong_3_sym) + +lemma between_cong_2: + assumes "Bet A D B" and + "Bet A E B" + and "Cong A D A E" + shows "D = E" using l5_3 + by (smt Tarski_neutral_dimensionless_axioms assms(1) assms(2) assms(3) cong_diff cong_inner_transitivity l4_3_1) + +lemma between_cong_3: + assumes "A \ B" + and "Bet A B D" + and "Bet A B E" + and "Cong B D B E" + shows "D = E" + by (meson assms(1) assms(2) assms(3) assms(4) cong_reflexivity construction_uniqueness) + +lemma le_anti_symmetry: + assumes "A B Le C D" and + "C D Le A B" + shows "Cong A B C D" + by (smt Le_def Tarski_neutral_dimensionless.between_exchange4 Tarski_neutral_dimensionless_axioms assms(1) assms(2) bet_neq21__neq between_cong between_exchange3 cong_transitivity l5_5_1 not_cong_3421) + +lemma cong_dec: + shows "Cong A B C D \ \ Cong A B C D" + by simp + +lemma bet_dec: + shows "Bet A B C \ \ Bet A B C" + by simp + +lemma col_dec: + shows "Col A B C \ \ Col A B C" + by simp + +lemma le_trivial: + shows "A A Le C D" + using Le_def between_trivial2 cong_trivial_identity by blast + +lemma le_cases: + shows "A B Le C D \ C D Le A B" + by (metis (full_types) cong_reflexivity l5_5_2 l5_6 not_bet_distincts segment_construction_2) + +lemma le_zero: + assumes "A B Le C C" + shows "A = B" + by (metis assms cong_diff_4 
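+ (* le_zero: antisymmetry against the trivial bound C C Le A B yields Cong A B C C, and this degenerate congruence gives A = B. *)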
le_anti_symmetry le_trivial) + +lemma le_diff: + assumes "A \ B" and "A B Le C D" + shows "C \ D" + using assms(1) assms(2) le_zero by blast + +lemma lt_diff: + assumes "A B Lt C D" + shows "C \ D" + using Lt_def assms cong_trivial_identity le_zero by blast + +lemma bet_cong_eq: + assumes "Bet A B C" and + "Bet A C D" and + "Cong B C A D" + shows "C = D \ A = B" +proof - + have "Bet C B A" + using Bet_perm assms(1) by blast + then show ?thesis + by (metis (no_types) Cong_perm Le_def assms(2) assms(3) between_cong cong_pseudo_reflexivity le_anti_symmetry) +qed + +lemma cong__le: + assumes "Cong A B C D" + shows "A B Le C D" + using Le_def assms between_trivial by blast + +lemma cong__le3412: + assumes "Cong A B C D" + shows "C D Le A B" + using assms cong__le cong_symmetry by blast + +lemma le1221: + shows "A B Le B A" + by (simp add: cong__le cong_pseudo_reflexivity) + +lemma le_left_comm: + assumes "A B Le C D" + shows "B A Le C D" + using assms le1221 le_transitivity by blast + +lemma le_right_comm: + assumes "A B Le C D" + shows "A B Le D C" + by (meson assms cong_right_commutativity l5_5_1 l5_5_2) + +lemma le_comm: + assumes "A B Le C D" + shows "B A Le D C" + using assms le_left_comm le_right_comm by blast + +lemma ge_left_comm: + assumes "A B Ge C D" + shows "B A Ge C D" + by (meson Ge_def assms le_right_comm) + +lemma ge_right_comm: + assumes "A B Ge C D" + shows "A B Ge D C" + using Ge_def assms le_left_comm by presburger + +lemma ge_comm0: + assumes "A B Ge C D" + shows "B A Ge D C" + by (meson assms ge_left_comm ge_right_comm) + +lemma lt_right_comm: + assumes "A B Lt C D" + shows "A B Lt D C" + using Lt_def assms le_right_comm not_cong_1243 by blast + +lemma lt_left_comm: + assumes "A B Lt C D" + shows "B A Lt C D" + using Lt_def assms le_comm lt_right_comm not_cong_2143 by blast + +lemma lt_comm: + assumes "A B Lt C D" + shows "B A Lt D C" + using assms lt_left_comm lt_right_comm by blast + +lemma gt_left_comm0: + assumes "A B Gt C D" + shows "B A Gt C D" + by (meson Gt_def assms lt_right_comm) + +lemma gt_right_comm: + assumes "A B Gt C D" + shows "A B Gt D C" + using Gt_def assms lt_left_comm by presburger + +lemma gt_comm: + assumes "A B Gt C D" + shows "B A Gt D C" + by (meson assms gt_left_comm0 gt_right_comm) + +lemma cong2_lt__lt: + assumes "A B Lt C D" and + "Cong A B A' B'" and + "Cong C D C' D'" + shows "A' B' Lt C' D'" + by (meson Lt_def assms(1) assms(2) assms(3) l5_6 le_anti_symmetry not_cong_3412) + +lemma fourth_point: + assumes "A \ B" and + "B \ C" and + "Col A B P" and + "Bet A B C" + shows "Bet P A B \ Bet A P B \ Bet B P C \ Bet B C P" + by (metis Col_def Tarski_neutral_dimensionless.l5_2 Tarski_neutral_dimensionless_axioms assms(3) assms(4) between_symmetry) + +lemma third_point: + assumes "Col A B P" + shows "Bet P A B \ Bet A P B \ Bet A B P" + using Col_def assms between_symmetry by blast + +lemma l5_12_a: + assumes "Bet A B C" + shows "A B Le A C \ B C Le A C" + using assms between_symmetry cong_left_commutativity cong_reflexivity l5_5_2 le_left_comm by blast + +lemma bet__le1213: + assumes "Bet A B C" + shows "A B Le A C" + using assms l5_12_a by blast + +lemma bet__le2313: + assumes "Bet A B C" + shows "B C Le A C" + by (simp add: assms l5_12_a) + +lemma bet__lt1213: + assumes "B \ C" and + "Bet A B C" + shows "A B Lt A C" + using Lt_def assms(1) assms(2) bet__le1213 between_cong by blast + +lemma bet__lt2313: + assumes "A \ B" and + "Bet A B C" + shows "B C Lt A C" + using Lt_def assms(1) assms(2) bet__le2313 bet_cong_eq l5_1 by blast + +lemma 
l5_12_b: + assumes "Col A B C" and + "A B Le A C" and + "B C Le A C" + shows "Bet A B C" + by (metis assms(1) assms(2) assms(3) between_cong col_permutation_5 l5_12_a le_anti_symmetry not_cong_2143 third_point) + +lemma bet_le_eq: + assumes "Bet A B C" + and "A C Le B C" + shows "A = B" + by (meson assms(1) assms(2) bet__le2313 bet_cong_eq l5_1 le_anti_symmetry) + +lemma or_lt_cong_gt: + "A B Lt C D \ A B Gt C D \ Cong A B C D" + by (meson Gt_def Lt_def cong_symmetry local.le_cases) + +lemma lt__le: + assumes "A B Lt C D" + shows "A B Le C D" + using Lt_def assms by blast + +lemma le1234_lt__lt: + assumes "A B Le C D" and + "C D Lt E F" + shows "A B Lt E F" + by (meson Lt_def assms(1) assms(2) cong__le3412 le_anti_symmetry le_transitivity) + +lemma le3456_lt__lt: + assumes "A B Lt C D" and + "C D Le E F" + shows "A B Lt E F" + by (meson Lt_def assms(1) assms(2) cong2_lt__lt cong_reflexivity le1234_lt__lt) + +lemma lt_transitivity: + assumes "A B Lt C D" and + "C D Lt E F" + shows "A B Lt E F" + using Lt_def assms(1) assms(2) le1234_lt__lt by blast + +lemma not_and_lt: + "\ (A B Lt C D \ C D Lt A B)" + by (simp add: Lt_def le_anti_symmetry) + +lemma nlt: + "\ A B Lt A B" + using not_and_lt by blast + +lemma le__nlt: + assumes "A B Le C D" + shows "\ C D Lt A B" + using assms le3456_lt__lt nlt by blast + +lemma cong__nlt: + assumes "Cong A B C D" + shows "\ A B Lt C D" + by (simp add: Lt_def assms) + +lemma nlt__le: + assumes "\ A B Lt C D" + shows "C D Le A B" + using Lt_def assms cong__le3412 local.le_cases by blast + +lemma lt__nle: + assumes "A B Lt C D" + shows "\ C D Le A B" + using assms le__nlt by blast + +lemma nle__lt: + assumes "\ A B Le C D" + shows "C D Lt A B" + using assms nlt__le by blast + +lemma lt1123: + assumes "B \ C" + shows "A A Lt B C" + using assms le_diff nle__lt by blast + +lemma bet2_le2__le_R1: + assumes "Bet a P b" and + "Bet A Q B" and + "P a Le Q A" and + "P b Le Q B" and + "B = Q" + shows "a b Le A B" + by (metis assms(3) assms(4) assms(5) le_comm le_diff) + +lemma bet2_le2__le_R2: + assumes "Bet a Po b" and + "Bet A PO B" and + "Po a Le PO A" and + "Po b Le PO B" and + "A \ PO" and + "B \ PO" + shows "a b Le A B" +proof - + obtain b' where P1: "Bet A PO b' \ Cong PO b' b Po" + using segment_construction by blast + obtain a' where P2: "Bet B PO a' \ Cong PO a' a Po" + using segment_construction by blast + obtain a'' where P3: "Bet PO a'' A \ Cong Po a PO a''" + using Le_def assms(3) by blast + have P4: "a' = a''" + by (meson Bet_cases P2 P3 assms(2) assms(6) between_inner_transitivity cong_right_commutativity construction_uniqueness not_cong_3412) + have P5: "B a' Le B A" + using Bet_cases P3 P4 assms(2) bet__le1213 between_exchange2 by blast + obtain b'' where P6: "Bet PO b'' B \ Cong Po b PO b''" + using Le_def assms(4) by blast + then have "b' = b''" + using P1 assms(2) assms(5) between_inner_transitivity cong_right_commutativity construction_uniqueness not_cong_3412 by blast + then have "a' b' Le a' B" + using Bet_cases P2 P6 bet__le1213 between_exchange2 by blast + then have "a' b' Le A B" + using P5 le_comm le_transitivity by blast + thus ?thesis + by (smt Cong_cases P1 P3 P4 Tarski_neutral_dimensionless.l5_6 Tarski_neutral_dimensionless_axioms assms(1) between_exchange3 between_symmetry cong_reflexivity l2_11_b) +qed + +lemma bet2_le2__le: + assumes "Bet a P b" and + "Bet A Q B" and + "P a Le Q A" and + "P b Le Q B" + shows "a b Le A B" +proof cases + assume "A = Q" + thus ?thesis + using assms(3) assms(4) le_diff by force +next + assume "\ A = Q" + 
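+ (* In this branch A differs from Q: bet2_le2__le_R1 covers the degenerate subcase B = Q, and bet2_le2__le_R2 covers the subcase where both A and B differ from Q. *)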
thus ?thesis + using assms(1) assms(2) assms(3) assms(4) bet2_le2__le_R1 bet2_le2__le_R2 by blast +qed + +lemma Le_cases: + assumes "A B Le C D \ B A Le C D \ A B Le D C \ B A Le D C" + shows "A B Le C D" + using assms le_left_comm le_right_comm by blast + +lemma Lt_cases: + assumes "A B Lt C D \ B A Lt C D \ A B Lt D C \ B A Lt D C" + shows "A B Lt C D" + using assms lt_comm lt_left_comm by blast + +subsection "Out lines" + +lemma bet_out: + assumes "B \ A" and + "Bet A B C" + shows "A Out B C" + using Out_def assms(1) assms(2) bet_neq12__neq by fastforce + +lemma bet_out_1: + assumes "B \ A" and + "Bet C B A" + shows "A Out B C" + by (simp add: assms(1) assms(2) bet_out between_symmetry) + +lemma out_dec: + shows "P Out A B \ \ P Out A B" + by simp + +lemma out_diff1: + assumes "A Out B C" + shows "B \ A" + using Out_def assms by auto + +lemma out_diff2: + assumes "A Out B C" + shows "C \ A" + using Out_def assms by auto + +lemma out_distinct: + assumes "A Out B C" + shows "B \ A \ C \ A" + using assms out_diff1 out_diff2 by auto + +lemma out_col: + assumes "A Out B C" + shows "Col A B C" + using Col_def Out_def assms between_symmetry by auto + +lemma l6_2: + assumes "A \ P" and + "B \ P" and + "C \ P" and + "Bet A P C" + shows "Bet B P C \ P Out A B" + by (smt Bet_cases Out_def assms(1) assms(2) assms(3) assms(4) between_inner_transitivity l5_2 outer_transitivity_between) + +lemma bet_out__bet: + assumes "Bet A P C" and + "P Out A B" + shows "Bet B P C" + by (metis Tarski_neutral_dimensionless.l6_2 Tarski_neutral_dimensionless_axioms assms(1) assms(2) not_bet_distincts out_diff1) + +lemma l6_3_1: + assumes "P Out A B" + shows "A \ P \ B \ P \ (\ C. (C \ P \ Bet A P C \ Bet B P C))" + using assms bet_out__bet out_diff1 out_diff2 point_construction_different by fastforce + +lemma l6_3_2: + assumes "A \ P" and + "B \ P" and + "\ C. (C \ P \ Bet A P C \ Bet B P C)" + shows "P Out A B" + using assms(1) assms(2) assms(3) l6_2 by blast + +lemma l6_4_1: + assumes "P Out A B" and + "Col A P B" + shows "\ Bet A P B" + using Out_def assms(1) between_equality between_symmetry by fastforce + +lemma l6_4_2: + assumes "Col A P B" + and "\ Bet A P B" + shows "P Out A B" + by (metis Out_def assms(1) assms(2) bet_out col_permutation_1 third_point) + +lemma out_trivial: + assumes "A \ P" + shows "P Out A A" + by (simp add: assms bet_out_1 between_trivial2) + +lemma l6_6: + assumes "P Out A B" + shows "P Out B A" + using Out_def assms by auto + +lemma l6_7: + assumes "P Out A B" and + "P Out B C" + shows "P Out A C" + by (smt Out_def assms(1) assms(2) between_exchange4 l5_1 l5_3) + +lemma bet_out_out_bet: + assumes "Bet A B C" and + "B Out A A'" and + "B Out C C'" + shows "Bet A' B C'" + by (metis Out_def assms(1) assms(2) assms(3) bet_out__bet between_inner_transitivity outer_transitivity_between) + +lemma out2_bet_out: + assumes "B Out A C" and + "B Out X P" and + "Bet A X C" + shows "B Out A P \ B Out C P" + by (smt Out_def Tarski_neutral_dimensionless.l6_7 Tarski_neutral_dimensionless_axioms assms(1) assms(2) assms(3) between_exchange2 between_symmetry) + +lemma l6_11_uniqueness: + assumes "A Out X R" and + "Cong A X B C" and + "A Out Y R" and + "Cong A Y B C" + shows "X = Y" + by (metis Out_def assms(1) assms(2) assms(3) assms(4) between_cong cong_symmetry cong_transitivity l6_6 l6_7) + +lemma l6_11_existence: + assumes "R \ A" and + "B \ C" + shows "\ X. 
(A Out X R \ Cong A X B C)" + by (metis Out_def assms(1) assms(2) cong_reverse_identity segment_construction_2) + + +lemma segment_construction_3: + assumes "A \ B" and + "X \ Y" + shows "\ C. (A Out B C \ Cong A C X Y)" + by (metis assms(1) assms(2) l6_11_existence l6_6) + +lemma l6_13_1: + assumes "P Out A B" and + "P A Le P B" + shows "Bet P A B" + by (metis Out_def assms(1) assms(2) bet__lt1213 le__nlt) + +lemma l6_13_2: + assumes "P Out A B" and + "Bet P A B" + shows "P A Le P B" + by (simp add: assms(2) bet__le1213) + +lemma l6_16_1: + assumes "P \ Q" and + "Col S P Q" and + "Col X P Q" + shows "Col X P S" + by (smt Col_def assms(1) assms(2) assms(3) bet3__bet col_permutation_4 l5_1 l5_3 outer_transitivity_between third_point) + +lemma col_transitivity_1: + assumes "P \ Q" and + "Col P Q A" and + "Col P Q B" + shows "Col P A B" + by (meson Tarski_neutral_dimensionless.l6_16_1 Tarski_neutral_dimensionless_axioms assms(1) assms(2) assms(3) not_col_permutation_2) + +lemma col_transitivity_2: + assumes "P \ Q" and + "Col P Q A" and + "Col P Q B" + shows "Col Q A B" + by (metis Tarski_neutral_dimensionless.col_transitivity_1 Tarski_neutral_dimensionless_axioms assms(1) assms(2) assms(3) not_col_permutation_4) + +lemma l6_21: + assumes "\ Col A B C" and + "C \ D" and + "Col A B P" and + "Col A B Q" and + "Col C D P" and + "Col C D Q" + shows "P = Q" + by (metis assms(1) assms(2) assms(3) assms(4) assms(5) assms(6) col_transitivity_1 l6_16_1 not_col_distincts) + +lemma col2__eq: + assumes "Col A X Y" and + "Col B X Y" and + "\ Col A X B" + shows "X = Y" + using assms(1) assms(2) assms(3) l6_16_1 by blast + +lemma not_col_exists: + assumes "A \ B" + shows "\ C. \ Col A B C" + by (metis Col_def assms col_transitivity_2 lower_dim_ex) + +lemma col3: + assumes "X \ Y" and + "Col X Y A" and + "Col X Y B" and + "Col X Y C" + shows "Col A B C" + by (metis assms(1) assms(2) assms(3) assms(4) col_transitivity_2) + +lemma colx: + assumes "A \ B" and + "Col X Y A" and + "Col X Y B" and + "Col A B C" + shows "Col X Y C" + by (metis assms(1) assms(2) assms(3) assms(4) l6_21 not_col_distincts) + +lemma out2__bet: + assumes "A Out B C" and + "C Out A B" + shows "Bet A B C" + by (metis Out_def assms(1) assms(2) between_equality between_symmetry) + +lemma bet2_le2__le1346: + assumes "Bet A B C" and + "Bet A' B' C'" and + "A B Le A' B'" and + "B C Le B' C'" + shows "A C Le A' C'" + using Le_cases assms(1) assms(2) assms(3) assms(4) bet2_le2__le by blast + +lemma bet2_le2__le2356_R1: + assumes "Bet A A C" and + "Bet A' B' C'" and + "A A Le A' B'" and + "A' C' Le A C" + shows "B' C' Le A C" + using assms(2) assms(4) bet__le2313 le3456_lt__lt lt__nle nlt__le by blast + +lemma bet2_le2__le2356_R2: + assumes "A \ B" and + "Bet A B C" and + "Bet A' B' C'" and + "A B Le A' B'" and + "A' C' Le A C" + shows "B' C' Le B C" +proof - + have "A \ C" + using assms(1) assms(2) bet_neq12__neq by blast + obtain B0 where P1: "Bet A B B0 \ Cong A B0 A' B'" + using assms(4) l5_5_1 by blast + then have P2: "A \ B0" + using assms(1) bet_neq12__neq by blast + obtain C0 where P3: "Bet A C0 C \ Cong A' C' A C0" + using Le_def assms(5) by blast + then have "A \ C0" + using assms(1) assms(3) assms(4) bet_neq12__neq cong_diff le_diff by blast + then have P4: "Bet A B0 C0" + by (smt Out_def P1 P2 P3 assms(1) assms(2) assms(3) bet__le1213 between_exchange2 between_symmetry l5_1 l5_3 l5_6 l6_13_1 not_cong_3412) + have K1: "B0 C0 Le B C0" + using P1 P4 between_exchange3 l5_12_a by blast + have K2: "B C0 Le B C" + using P1 P3 P4 bet__le1213 
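+ (* K2: Bet A B B0 (P1) and Bet A B0 C0 (P4) give Bet A B C0; with Bet A C0 C (P3) the exchange lemmas yield Bet B C0 C, hence B C0 Le B C by bet__le1213. *)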
between_exchange3 between_exchange4 by blast + then have "Cong B0 C0 B' C'" + using P1 P3 P4 assms(3) l4_3_1 not_cong_3412 by blast + then show ?thesis + by (meson K1 K2 cong__nlt le_transitivity nlt__le) +qed + +lemma bet2_le2__le2356: + assumes "Bet A B C" and + "Bet A' B' C'" and + "A B Le A' B'" and + "A' C' Le A C" + shows "B' C' Le B C" +proof (cases) + assume "A = B" + then show ?thesis + using assms(1) assms(2) assms(3) assms(4) bet2_le2__le2356_R1 by blast +next + assume "\ A = B" + then show ?thesis + using assms(1) assms(2) assms(3) assms(4) bet2_le2__le2356_R2 by blast +qed + +lemma bet2_le2__le1245: + assumes "Bet A B C" and + "Bet A' B' C'" and + "B C Le B' C'" and + "A' C' Le A C" + shows "A' B' Le A B" + using assms(1) assms(2) assms(3) assms(4) bet2_le2__le2356 between_symmetry le_comm by blast + +lemma cong_preserves_bet: + assumes "Bet B A' A0" and + "Cong B A' E D'" and + "Cong B A0 E D0" and + "E Out D' D0" + shows "Bet E D' D0" + using Tarski_neutral_dimensionless.l6_13_1 Tarski_neutral_dimensionless_axioms assms(1) assms(2) assms(3) assms(4) bet__le1213 l5_6 by fastforce + +lemma out_cong_cong: + assumes "B Out A A0" and + "E Out D D0" and + "Cong B A E D" and + "Cong B A0 E D0" + shows "Cong A A0 D D0" + by (meson Out_def assms(1) assms(2) assms(3) assms(4) cong_4321 cong_symmetry l4_3_1 l5_6 l6_13_1 l6_13_2) + +lemma not_out_bet: + assumes "Col A B C" and + "\ B Out A C" + shows "Bet A B C" + using assms(1) assms(2) l6_4_2 by blast + +lemma or_bet_out: + shows "Bet A B C \ B Out A C \ \ Col A B C" + using not_out_bet by blast + +lemma not_bet_out: + assumes "Col A B C" and + "\ Bet A B C" + shows "B Out A C" + by (simp add: assms(1) assms(2) l6_4_2) + +lemma not_bet_and_out: + shows "\ (Bet A B C \ B Out A C)" + using bet_col l6_4_1 by blast + +lemma out_to_bet: + assumes "Col A' B' C'" and + "B Out A C \ B' Out A' C'" and + "Bet A B C" + shows "Bet A' B' C'" + using assms(1) assms(2) assms(3) not_bet_and_out or_bet_out by blast + +lemma col_out2_col: + assumes "Col A B C" and + "B Out A AA" and + "B Out C CC" + shows "Col AA B CC" using l6_21 + by (smt Tarski_neutral_dimensionless.out_col Tarski_neutral_dimensionless_axioms assms(1) assms(2) assms(3) col_trivial_2 not_col_permutation_1 out_diff1) + +lemma bet2_out_out: + assumes "B \ A" and + "B' \ A" and + "A Out C C'" and + "Bet A B C" and + "Bet A B' C'" + shows "A Out B B'" + by (meson assms(1) assms(2) assms(3) assms(4) assms(5) bet_out l6_6 l6_7) + +lemma bet2__out: + assumes "A \ B" and + "A \ B'" and + "Bet A B C" + and "Bet A B' C" + shows "A Out B B'" + using Out_def assms(1) assms(2) assms(3) assms(4) l5_3 by auto + +lemma out_bet_out_1: + assumes "P Out A C" and + "Bet A B C" + shows "P Out A B" + by (metis assms(1) assms(2) not_bet_and_out out2_bet_out out_trivial) + +lemma out_bet_out_2: + assumes "P Out A C" and + "Bet A B C" + shows "P Out B C" + using assms(1) assms(2) l6_6 l6_7 out_bet_out_1 by blast + +lemma out_bet__out: + assumes "Bet P Q A" and + "Q Out A B" + shows "P Out A B" + by (smt Out_def assms(1) assms(2) bet_out_1 bet_out__bet) + +lemma segment_reverse: + assumes "Bet A B C " + shows "\ B'. Bet A B' C \ Cong C B' A B" + by (metis Bet_perm Cong_perm assms bet_cong_eq cong_reflexivity segment_construction_2) + +lemma diff_col_ex: + shows "\ C. A \ C \ B \ C \ Col A B C" + by (metis bet_col bet_neq12__neq point_construction_different) + +lemma diff_bet_ex3: + assumes "Bet A B C" + shows "\ D. 
A \ D \ B \ D \ C \ D \ Col A B D" + by (metis (mono_tags, hide_lams) Col_def bet_out_1 between_trivial2 col_transitivity_1 l6_4_1 point_construction_different) + +lemma diff_col_ex3: + assumes "Col A B C" + shows "\ D. A \ D \ B \ D \ C \ D \ Col A B D" + by (metis Bet_perm Col_def between_equality between_trivial2 point_construction_different) + +lemma Out_cases: + assumes "A Out B C \ A Out C B" + shows "A Out B C" + using assms l6_6 by blast + +subsection "Midpoint" + +lemma midpoint_dec: + "I Midpoint A B \ \ I Midpoint A B" + by simp + +lemma is_midpoint_id: + assumes "A Midpoint A B" + shows "A = B" + using Midpoint_def assms between_cong by blast + +lemma is_midpoint_id_2: + assumes "A Midpoint B A" + shows "A = B" + using Midpoint_def assms cong_diff_2 by blast + +lemma l7_2: + assumes "M Midpoint A B" + shows "M Midpoint B A" + using Bet_perm Cong_perm Midpoint_def assms by blast + +lemma l7_3: + assumes "M Midpoint A A" + shows "M = A" + using Midpoint_def assms bet_neq23__neq by blast + +lemma l7_3_2: + "A Midpoint A A" + by (simp add: Midpoint_def between_trivial2 cong_reflexivity) + +lemma symmetric_point_construction: + "\ P'. A Midpoint P P'" + by (meson Midpoint_def cong__le cong__le3412 le_anti_symmetry segment_construction) + +lemma symmetric_point_uniqueness: + assumes "P Midpoint A P1" and + "P Midpoint A P2" + shows "P1 = P2" + by (metis Midpoint_def assms(1) assms(2) between_cong_3 cong_diff_4 cong_inner_transitivity) + +lemma l7_9: + assumes "A Midpoint P X" and + "A Midpoint Q X" + shows "P = Q" + using assms(1) assms(2) l7_2 symmetric_point_uniqueness by blast + +lemma l7_9_bis: + assumes "A Midpoint P X" and + "A Midpoint X Q" + shows "P = Q" + using assms(1) assms(2) l7_2 symmetric_point_uniqueness by blast + +lemma l7_13_R1: + assumes "A \ P" and + "A Midpoint P' P" and + "A Midpoint Q' Q" + shows "Cong P Q P' Q'" +proof - + obtain X where P1: "Bet P' P X \ Cong P X Q A" + using segment_construction by blast + obtain X' where P2: "Bet X P' X' \ Cong P' X' Q A" + using segment_construction by blast + obtain Y where P3: "Bet Q' Q Y \ Cong Q Y P A" + using segment_construction by blast + obtain Y' where P4: "Bet Y Q' Y' \ Cong Q' Y' P A" + using segment_construction by blast + have P5: "Bet Y A Q'" + by (meson Midpoint_def P3 P4 assms(3) bet3__bet between_symmetry l5_3) + have P6: "Bet P' A X" + using Midpoint_def P1 assms(2) between_exchange4 by blast + have P7: "Bet A P X" + using Midpoint_def P1 assms(2) between_exchange3 by blast + have P8: "Bet Y Q A" + using Midpoint_def P3 assms(3) between_exchange3 between_symmetry by blast + have P9: "Bet A Q' Y'" + using P4 P5 between_exchange3 by blast + have P10: "Bet X' P' A" + using P2 P6 between_exchange3 between_symmetry by blast + have P11: "Bet X A X'" + using P10 P2 P6 between_symmetry outer_transitivity_between2 by blast + have P12: "Bet Y A Y'" + using P4 P5 between_exchange4 by blast + have P13: "Cong A X Y A" + using P1 P3 P7 P8 l2_11_b not_cong_4321 by blast + have P14: "Cong A Y' X' A" + proof - + have Q1: "Cong Q' Y' P' A" + using Midpoint_def P4 assms(2) cong_transitivity not_cong_3421 by blast + have "Cong A Q' X' P'" + by (meson Midpoint_def P2 assms(3) cong_transitivity not_cong_3421) + then show ?thesis + using P10 P9 Q1 l2_11_b by blast + qed + have P15: "Cong A Y A Y'" + proof - + have "Cong Q Y Q' Y'" + using P3 P4 cong_transitivity not_cong_3412 by blast + then show ?thesis + using Bet_perm Cong_perm Midpoint_def P8 P9 assms(3) l2_11_b by blast + qed + have P16: "Cong X A Y' A" + using Cong_cases 
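+ (* P16 chains P13 (Cong A X Y A) with P15 (Cong A Y A Y') by transitivity, using Cong_cases for the necessary permutations. *)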
P13 P15 cong_transitivity by blast + have P17: "Cong A X' A Y" + using P14 P15 cong_transitivity not_cong_3421 by blast + have P18: "X A X' Y' FSC Y' A Y X" + proof - + have Q3: "Col X A X'" + by (simp add: Col_def P11) + have "Cong X X' Y' Y" + using Bet_cases P11 P12 P16 P17 l2_11_b by blast + then show ?thesis + by (simp add: Cong3_def FSC_def P16 P17 Q3 cong_4321 cong_pseudo_reflexivity) + qed + then have "Y Q A X IFSC Y' Q' A X'" + by (smt IFSC_def Midpoint_def P14 P15 P16 P7 P8 P9 assms(1) assms(3) bet_neq12__neq between_symmetry cong_4321 cong_inner_transitivity cong_right_commutativity l4_16) + then have "X P A Q IFSC X' P' A Q'" + by (meson IFSC_def Midpoint_def P10 P7 assms(2) between_symmetry cong_4312 l4_2) + then show ?thesis + using l4_2 by auto +qed + +lemma l7_13: + assumes "A Midpoint P' P" and + "A Midpoint Q' Q" + shows "Cong P Q P' Q'" +proof (cases) + assume "A = P" + then show ?thesis + using Midpoint_def assms(1) assms(2) cong_3421 is_midpoint_id_2 by blast +next + show ?thesis + by (metis Tarski_neutral_dimensionless.l7_13_R1 Tarski_neutral_dimensionless_axioms assms(1) assms(2) cong_trivial_identity is_midpoint_id_2 not_cong_2143) +qed + +lemma l7_15: + assumes "A Midpoint P P'" and + "A Midpoint Q Q'" and + "A Midpoint R R'" and + "Bet P Q R" + shows "Bet P' Q' R'" +proof - + have "P Q R Cong3 P' Q' R'" + using Cong3_def assms(1) assms(2) assms(3) l7_13 l7_2 by blast + then show ?thesis + using assms(4) l4_6 by blast +qed + +lemma l7_16: + assumes "A Midpoint P P'" and + "A Midpoint Q Q'" and + "A Midpoint R R'" and + "A Midpoint S S'" and + "Cong P Q R S" + shows "Cong P' Q' R' S'" + by (meson assms(1) assms(2) assms(3) assms(4) assms(5) cong_transitivity l7_13 not_cong_3412) + +lemma symmetry_preserves_midpoint: + assumes "Z Midpoint A D" and + "Z Midpoint B E" and + "Z Midpoint C F" and + "B Midpoint A C" + shows "E Midpoint D F" + by (meson Midpoint_def assms(1) assms(2) assms(3) assms(4) l7_15 l7_16) + +lemma Mid_cases: + assumes "A Midpoint B C \ A Midpoint C B" + shows "A Midpoint B C" + using assms l7_2 by blast + +lemma Mid_perm: + assumes "A Midpoint B C" + shows "A Midpoint B C \ A Midpoint C B" + by (simp add: assms l7_2) + +lemma l7_17: + assumes "A Midpoint P P'" and + "B Midpoint P P'" + shows "A = B" +proof - + obtain pp :: "'p \ 'p \ 'p" where + f1: "\p pa. p Midpoint pa (pp p pa)" + by (meson symmetric_point_construction) + then have "\p pa. Bet pa p (pp p pa)" + by (meson Midpoint_def) + then have f2: "\p. Bet p p p" + by (meson between_inner_transitivity) + have f3: "\p pa. Bet (pp pa p) pa p" + using f1 Mid_perm Midpoint_def by blast + have f4: "\p. 
pp p p = p" + using f2 f1 by (metis Midpoint_def bet_cong_eq) + have f5: "Bet (pp P P') P B" + using f3 by (meson Midpoint_def assms(2) between_inner_transitivity) + have f6: "A Midpoint P' P" + using Mid_perm assms(1) by blast + have f7: "Bet (pp P P') P A" + using f3 Midpoint_def assms(1) between_inner_transitivity by blast + have f8: "Bet P' A P" + using f6 by (simp add: Midpoint_def) + have "Cong P' A A P" + using f6 Midpoint_def by blast + then have "P' = P \ A = B" + using f8 by (metis (no_types) Midpoint_def assms(2) bet_cong_eq between_inner_transitivity l5_2) + then show ?thesis + using f7 f6 f5 f4 f1 by (metis (no_types) Col_perm Mid_perm assms(2) bet_col l4_18 l5_2 l7_13) +qed + +lemma l7_17_bis: + assumes "A Midpoint P P'" and + "B Midpoint P' P" + shows "A = B" + by (meson Tarski_neutral_dimensionless.l7_17 Tarski_neutral_dimensionless.l7_2 Tarski_neutral_dimensionless_axioms assms(1) assms(2)) + +lemma l7_20: + assumes "Col A M B" and + "Cong M A M B" + shows "A = B \ M Midpoint A B" + by (metis Bet_cases Col_def Midpoint_def assms(1) assms(2) between_cong cong_left_commutativity not_cong_3412) + +lemma l7_20_bis: + assumes "A \ B" and + "Col A M B" and + "Cong M A M B" + shows "M Midpoint A B" + using assms(1) assms(2) assms(3) l7_20 by blast + +lemma cong_col_mid: + assumes "A \ C" and + "Col A B C" and + "Cong A B B C" + shows "B Midpoint A C" + using assms(1) assms(2) assms(3) cong_left_commutativity l7_20 by blast + +lemma l7_21_R1: + assumes "\ Col A B C" and + "B \ D" and + "Cong A B C D" and + "Cong B C D A" and + "Col A P C" and + "Col B P D" + shows "P Midpoint A C" +proof - + obtain X where P1: "B D P Cong3 D B X" + using Col_perm assms(6) cong_pseudo_reflexivity l4_14 by blast + have P2: "Col D B X" + using P1 assms(6) l4_13 not_col_permutation_5 by blast + have P3: "B D P A FSC D B X C" + using FSC_def P1 assms(3) assms(4) assms(6) not_col_permutation_5 not_cong_2143 not_cong_3412 by blast + have P4: "B D P C FSC D B X A" + by (simp add: FSC_def P1 assms(3) assms(4) assms(6) col_permutation_5 cong_4321) + have "A P C Cong3 C X A" + using Cong3_def Cong_perm P3 P4 assms(2) cong_pseudo_reflexivity l4_16 by blast + then show ?thesis + by (smt Cong3_def NCol_cases P2 assms(1) assms(2) assms(5) assms(6) colx cong_col_mid l4_13 not_col_distincts not_col_permutation_1 not_cong_1243) +qed + +lemma l7_21: + assumes "\ Col A B C" and + "B \ D" and + "Cong A B C D" and + "Cong B C D A" and + "Col A P C" and + "Col B P D" + shows "P Midpoint A C \ P Midpoint B D" + by (smt assms(1) assms(2) assms(3) assms(4) assms(5) assms(6) col_transitivity_2 is_midpoint_id_2 l7_21_R1 not_col_distincts not_cong_3412) + +lemma l7_22_aux_R1: + assumes "Bet A1 C C" and + "Bet B1 C B2" and + "Cong C A1 C B1" and + "Cong C C C B2" and + "M1 Midpoint A1 B1" and + "M2 Midpoint A2 B2"and + "C A1 Le C C" + shows "Bet M1 C M2" + by (metis assms(3) assms(5) assms(7) cong_diff_3 l7_3 le_diff not_bet_distincts) + +lemma l7_22_aux_R2: + assumes "A2 \ C" and + "Bet A1 C A2" and + "Bet B1 C B2" and + "Cong C A1 C B1" and + "Cong C A2 C B2" and + "M1 Midpoint A1 B1" and + "M2 Midpoint A2 B2" and + "C A1 Le C A2" + shows "Bet M1 C M2" +proof - + obtain X where P1: "C Midpoint A2 X" + using symmetric_point_construction by blast + obtain X0 where P2: "C Midpoint B2 X0" + using symmetric_point_construction by blast + obtain X1 where P3: "C Midpoint M2 X1" + using symmetric_point_construction by blast + have P4: "X1 Midpoint X X0" + using P1 P2 P3 assms(7) symmetry_preserves_midpoint by blast + have P5: "C A1 
Le C X" + using Cong_perm Midpoint_def P1 assms(8) cong_reflexivity l5_6 by blast + have P6: "Bet C A1 X" + by (smt Midpoint_def P1 P5 assms(1) assms(2) bet2__out between_symmetry is_midpoint_id_2 l5_2 l6_13_1) + have P7: "C B1 Le C X0" + proof - + have Q1: "Cong C A1 C B1" + by (simp add: assms(4)) + have "Cong C X C X0" + using P1 P2 assms(5) l7_16 l7_3_2 by blast + then show ?thesis + using P5 Q1 l5_6 by blast + qed + have P8: "Bet C B1 X0" + by (smt Midpoint_def P2 P7 assms(1) assms(3) assms(5) bet2__out between_symmetry cong_identity l5_2 l6_13_1) + obtain Q where P9: "Bet X1 Q C \ Bet A1 Q B1" + by (meson Bet_perm Midpoint_def P4 P6 P8 l3_17) + have P10: "X A1 C X1 IFSC X0 B1 C X1" + by (smt Cong_perm IFSC_def Midpoint_def P1 P2 P4 P6 P8 assms(4) assms(5) between_symmetry cong_reflexivity l7_16 l7_3_2) + have P11: "Cong A1 X1 B1 X1" + using P10 l4_2 by blast + have P12: "Cong Q A1 Q B1" + proof (cases) + assume "C = X1" + then show ?thesis + using P9 assms(4) bet_neq12__neq by blast + next + assume Q1: "\ C = X1" + have Q2: "Col C X1 Q" + using Col_def P9 by blast + have Q3: "Cong C A1 C B1" + by (simp add: assms(4)) + have "Cong X1 A1 X1 B1" + using P11 not_cong_2143 by blast + then show ?thesis + using Q1 Q2 Q3 l4_17 by blast + qed + have P13: "Q Midpoint A1 B1" + by (simp add: Midpoint_def P12 P9 cong_left_commutativity) + then show ?thesis + using Midpoint_def P3 P9 assms(6) between_inner_transitivity between_symmetry l7_17 by blast +qed + +lemma l7_22_aux: + assumes "Bet A1 C A2" and + "Bet B1 C B2" and + "Cong C A1 C B1" and + "Cong C A2 C B2" and + "M1 Midpoint A1 B1" and + "M2 Midpoint A2 B2" and + "C A1 Le C A2" + shows "Bet M1 C M2" + by (smt assms(1) assms(2) assms(3) assms(4) assms(5) assms(6) assms(7) l7_22_aux_R1 l7_22_aux_R2) + +lemma l7_22: + assumes "Bet A1 C A2" and + "Bet B1 C B2" and + "Cong C A1 C B1" and + "Cong C A2 C B2" and + "M1 Midpoint A1 B1" and + "M2 Midpoint A2 B2" + shows "Bet M1 C M2" + by (meson assms(1) assms(2) assms(3) assms(4) assms(5) assms(6) between_symmetry l7_22_aux local.le_cases) + +lemma bet_col1: + assumes "Bet A B D" and + "Bet A C D" + shows "Col A B C" + using Bet_perm Col_def assms(1) assms(2) l5_3 by blast + +lemma l7_25_R1: + assumes "Cong C A C B" and + "Col A B C" + shows "\ X. X Midpoint A B" + using assms(1) assms(2) l7_20 l7_3_2 not_col_permutation_5 by blast + +lemma l7_25_R2: + assumes "Cong C A C B" and + "\ Col A B C" + shows "\ X. 
X Midpoint A B" +proof - + obtain P where P1: "Bet C A P \ A \ P" + using point_construction_different by auto + obtain Q where P2: "Bet C B Q \ Cong B Q A P" + using segment_construction by blast + obtain R where P3: "Bet A R Q \ Bet B R P" + using P1 P2 between_symmetry inner_pasch by blast + obtain X where P4: "Bet A X B \ Bet R X C" + using P1 P3 inner_pasch by blast + have "Cong X A X B" + proof - + have Q1: "Cong R A R B \ Cong X A X B" + proof (cases) + assume "R = C" + then show ?thesis + using P4 bet_neq12__neq by blast + next + assume Q2: "\ R = C" + have "Col R C X" + using Col_perm P4 bet_col by blast + then show ?thesis + using Q2 assms(1) l4_17 by blast + qed + have "Cong R A R B" + proof - + have Q3: "C A P B OFSC C B Q A" + by (simp add: OFSC_def P1 P2 assms(1) cong_pseudo_reflexivity cong_symmetry) + have Q4: "Cong P B Q A" + using Q3 assms(2) five_segment_with_def not_col_distincts by blast + obtain R' where Q5: "Bet A R' Q \ B R P Cong3 A R' Q" + using Cong_perm P3 Q4 l4_5 by blast + have Q6: "B R P A IFSC A R' Q B" + by (meson Cong3_def IFSC_def OFSC_def P3 Q3 Q5 not_cong_2143) + have Q7: "B R P Q IFSC A R' Q P" + using IFSC_def P2 Q6 cong_pseudo_reflexivity by auto + have Q8: "Cong R A R' B" + using Q6 l4_2 by auto + have Q9: "Cong R Q R' P" + using Q7 l4_2 by auto + have Q10: "A R Q Cong3 B R' P" + using Cong3_def Q4 Q8 Q9 cong_commutativity not_cong_4321 by blast + have Q11: "Col B R' P" + using P3 Q10 bet_col l4_13 by blast + have "R = R'" + proof - + have R1: "B \ P" + using P1 assms(1) between_cong by blast + then have R2: "A \ Q" + using Q4 cong_diff_2 by blast + have R3: "B \ Q" + using P1 P2 cong_diff_3 by blast + then have R4: "B \ R" + by (metis Cong3_def P1 Q11 Q5 assms(2) bet_col cong_diff_3 l6_21 not_col_distincts) + have R5: "\ Col A Q B" + by (metis P2 R3 assms(2) bet_col col_permutation_3 col_trivial_2 l6_21) + have R6: "B \ P" + by (simp add: R1) + have R7: "Col A Q R" + using NCol_cases P3 bet_col by blast + have R8: "Col A Q R'" + using Q5 bet_col col_permutation_5 by blast + have R9: "Col B P R" + using NCol_cases P3 bet_col by blast + have "Col B P R'" + using Col_perm Q11 by blast + then show ?thesis + using R5 R6 R7 R8 R9 l6_21 by blast + qed + then show ?thesis + using Q8 by blast + qed + then show ?thesis + using Q1 by blast + qed + then show ?thesis + using P4 assms(2) bet_col l7_20_bis not_col_distincts by blast +qed + +lemma l7_25: + assumes "Cong C A C B" + shows "\ X. 
X Midpoint A B" + using assms l7_25_R1 l7_25_R2 by blast + +lemma midpoint_distinct_1: + assumes "A \ B" and + "I Midpoint A B" + shows "I \ A \ I \ B" + using assms(1) assms(2) is_midpoint_id is_midpoint_id_2 by blast + +lemma midpoint_distinct_2: + assumes "I \ A" and + "I Midpoint A B" + shows "A \ B \ I \ B" + using assms(1) assms(2) is_midpoint_id_2 l7_3 by blast + +lemma midpoint_distinct_3: + assumes "I \ B" and + "I Midpoint A B" + shows "A \ B \ I \ A" + using assms(1) assms(2) is_midpoint_id l7_3 by blast + +lemma midpoint_def: + assumes "Bet A B C" and + "Cong A B B C" + shows "B Midpoint A C" + using Midpoint_def assms(1) assms(2) by blast + +lemma midpoint_bet: + assumes "B Midpoint A C" + shows "Bet A B C" + using Midpoint_def assms by blast + +lemma midpoint_col: + assumes "M Midpoint A B" + shows "Col M A B" + using assms bet_col col_permutation_4 midpoint_bet by blast + +lemma midpoint_cong: + assumes "B Midpoint A C" + shows "Cong A B B C" + using Midpoint_def assms by blast + +lemma midpoint_out: + assumes "A \ C" and + "B Midpoint A C" + shows "A Out B C" + using assms(1) assms(2) bet_out midpoint_bet midpoint_distinct_1 by blast + +lemma midpoint_out_1: + assumes "A \ C" and + "B Midpoint A C" + shows "C Out A B" + by (metis Tarski_neutral_dimensionless.midpoint_bet Tarski_neutral_dimensionless.midpoint_distinct_1 Tarski_neutral_dimensionless_axioms assms(1) assms(2) bet_out_1 l6_6) + +lemma midpoint_not_midpoint: + assumes "A \ B" and + "I Midpoint A B" + shows "\ B Midpoint A I" + using assms(1) assms(2) between_equality_2 midpoint_bet midpoint_distinct_1 by blast + +lemma swap_diff: + assumes "A \ B" + shows "B \ A" + using assms by auto + +lemma cong_cong_half_1: + assumes "M Midpoint A B" and + "M' Midpoint A' B'" and + "Cong A B A' B'" + shows "Cong A M A' M'" +proof - + obtain M'' where P1: "Bet A' M'' B' \ A M B Cong3 A' M'' B'" + using assms(1) assms(3) l4_5 midpoint_bet by blast + have P2: "M'' Midpoint A' B'" + by (meson Cong3_def P1 assms(1) cong_inner_transitivity midpoint_cong midpoint_def) + have P3: "M' = M''" + using P2 assms(2) l7_17 by auto + then show ?thesis + using Cong3_def P1 by blast +qed + +lemma cong_cong_half_2: + assumes "M Midpoint A B" and + "M' Midpoint A' B'" and + "Cong A B A' B'" + shows "Cong B M B' M'" + using assms(1) assms(2) assms(3) cong_cong_half_1 l7_2 not_cong_2143 by blast + +lemma cong_mid2__cong: + assumes "M Midpoint A B" and + "M' Midpoint A' B'" and + "Cong A M A' M'" + shows "Cong A B A' B'" + by (meson assms(1) assms(2) assms(3) cong_inner_transitivity l2_11_b midpoint_bet midpoint_cong) + +lemma mid__lt: + assumes "A \ B" and + "M Midpoint A B" + shows "A M Lt A B" + using assms(1) assms(2) bet__lt1213 midpoint_bet midpoint_distinct_1 by blast + +lemma le_mid2__le13: + assumes "M Midpoint A B" and + "M' Midpoint A' B'" and + "A M Le A' M'" + shows "A B Le A' B'" + by (smt Tarski_neutral_dimensionless.cong_mid2__cong Tarski_neutral_dimensionless.l7_13 Tarski_neutral_dimensionless_axioms assms(1) assms(2) assms(3) bet2_le2__le2356 l5_6 l7_3_2 le_anti_symmetry le_comm local.le_cases midpoint_bet) + +lemma le_mid2__le12: + assumes "M Midpoint A B" and + "M' Midpoint A' B'" + and "A B Le A' B'" + shows "A M Le A' M'" + by (meson assms(1) assms(2) assms(3) cong__le3412 cong_cong_half_1 le_anti_symmetry le_mid2__le13 local.le_cases) + +lemma lt_mid2__lt13: + assumes "M Midpoint A B" and + "M' Midpoint A' B'" and + "A M Lt A' M'" + shows "A B Lt A' B'" + by (meson Tarski_neutral_dimensionless.le_mid2__le12 
Tarski_neutral_dimensionless_axioms assms(1) assms(2) assms(3) lt__nle nlt__le) + +lemma lt_mid2__lt12: + assumes "M Midpoint A B" and + "M' Midpoint A' B'" and + "A B Lt A' B'" + shows "A M Lt A' M'" + by (meson Tarski_neutral_dimensionless.le_mid2__le13 Tarski_neutral_dimensionless_axioms assms(1) assms(2) assms(3) le__nlt nle__lt) + +lemma midpoint_preserves_out: + assumes "A Out B C" and + "M Midpoint A A'" and + "M Midpoint B B'" and + "M Midpoint C C'" + shows "A' Out B' C'" + by (smt Out_def assms(1) assms(2) assms(3) assms(4) l6_4_2 l7_15 l7_2 not_bet_and_out not_col_distincts) + +lemma col_cong_bet: + assumes "Col A B D" and + "Cong A B C D" and + "Bet A C B" + shows "Bet C A D \ Bet C B D" + by (smt Col_def assms(1) assms(2) assms(3) bet_cong_eq between_inner_transitivity col_transitivity_2 cong_4321 l6_2 not_bet_and_out not_cong_4312 third_point) + +lemma col_cong2_bet1: + assumes "Col A B D" and + "Bet A C B" and + "Cong A B C D" and + "Cong A C B D" + shows "Bet C B D" + by (metis assms(1) assms(2) assms(3) assms(4) bet__le1213 bet_cong_eq between_symmetry col_cong_bet cong__le cong_left_commutativity l5_12_b l5_6 outer_transitivity_between2) + +lemma col_cong2_bet2: + assumes "Col A B D" and + "Bet A C B" and + "Cong A B C D" and + "Cong A D B C" + shows "Bet C A D" + by (metis assms(1) assms(2) assms(3) assms(4) bet_cong_eq col_cong_bet cong_identity not_bet_distincts not_cong_3421 outer_transitivity_between2) + +lemma col_cong2_bet3: + assumes "Col A B D" and + "Bet A B C" and + "Cong A B C D" and + "Cong A C B D" + shows "Bet B C D" + by (metis assms(1) assms(2) assms(3) assms(4) bet__le1213 bet__le2313 bet_col col_transitivity_2 cong_diff_3 cong_reflexivity l5_12_b l5_6 not_bet_distincts) + +lemma col_cong2_bet4: + assumes "Col A B C" and + "Bet A B D" and + "Cong A B C D" and + "Cong A D B C" + shows "Bet B D C" + using assms(1) assms(2) assms(3) assms(4) col_cong2_bet3 cong_right_commutativity by blast + +lemma col_bet2_cong1: + assumes "Col A B D" and + "Bet A C B" and + "Cong A B C D" and + "Bet C B D" + shows "Cong A C D B" + by (meson assms(2) assms(3) assms(4) between_symmetry cong_pseudo_reflexivity cong_right_commutativity l4_3) + +lemma col_bet2_cong2: + assumes "Col A B D" and + "Bet A C B" and + "Cong A B C D" and + "Bet C A D" + shows "Cong D A B C" + by (meson assms(2) assms(3) assms(4) between_symmetry cong_commutativity cong_pseudo_reflexivity cong_symmetry l4_3) + +lemma bet2_lt2__lt: + assumes "Bet a Po b" and + "Bet A PO B" and + "Po a Lt PO A" and + "Po b Lt PO B" + shows "a b Lt A B" + by (metis Lt_cases Tarski_neutral_dimensionless.nle__lt Tarski_neutral_dimensionless_axioms assms(1) assms(2) assms(3) assms(4) bet2_le2__le1245 le__nlt lt__le) + +lemma bet2_lt_le__lt: + assumes "Bet a Po b" and + "Bet A PO B" and + "Cong Po a PO A" and + "Po b Lt PO B" + shows "a b Lt A B" + by (smt Lt_def Tarski_neutral_dimensionless.l4_3_1 Tarski_neutral_dimensionless_axioms assms(1) assms(2) assms(3) assms(4) bet2_le2__le cong__le not_cong_2143) + +subsection "Orthogonality" + +lemma per_dec: + "Per A B C \ \ Per A B C" + by simp + +lemma l8_2: + assumes "Per A B C" + shows "Per C B A" +proof - + obtain C' where P1: "B Midpoint C C' \ Cong A C A C'" + using Per_def assms by blast + obtain A' where P2: "B Midpoint A A'" + using symmetric_point_construction by blast + have "Cong C' A C A'" + using Mid_perm P1 P2 l7_13 by blast + thus ?thesis + using P1 P2 Per_def cong_4321 cong_inner_transitivity by blast +qed + +lemma Per_cases: + assumes "Per A B C \ Per C B A" + 
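+ (* Per_cases and Per_perm below record that Per is symmetric in its outer points (l8_2), so a hypothesis Per A B C may be flipped to Per C B A before applying lemmas such as l8_3 or per_col, which rewrite one arm of the right angle along its supporting line. *)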
shows "Per A B C" + using assms l8_2 by blast + +lemma Per_perm : + assumes "Per A B C" + shows "Per A B C \ Per C B A" + by (simp add: assms l8_2) + +lemma l8_3 : + assumes "Per A B C" and + "A \ B" and + "Col B A A'" + shows "Per A' B C" + by (smt Per_def assms(1) assms(2) assms(3) l4_17 l7_13 l7_2 l7_3_2) + +lemma l8_4: + assumes "Per A B C" and + "B Midpoint C C'" + shows "Per A B C'" + by (metis Tarski_neutral_dimensionless.l8_2 Tarski_neutral_dimensionless_axioms assms(1) assms(2) l8_3 midpoint_col midpoint_distinct_1) + +lemma l8_5: + "Per A B B" + using Per_def cong_reflexivity l7_3_2 by blast + +lemma l8_6: + assumes "Per A B C" and + "Per A' B C" and + "Bet A C A'" + shows "B = C" + by (metis Per_def assms(1) assms(2) assms(3) l4_19 midpoint_distinct_3 symmetric_point_uniqueness) + +lemma l8_7: + assumes "Per A B C" and + "Per A C B" + shows "B = C" +proof - + obtain C' where P1: "B Midpoint C C' \ Cong A C A C'" + using Per_def assms(1) by blast + obtain A' where P2: "C Midpoint A A'" + using Per_def assms(2) l8_2 by blast + have "Per C' C A" + by (metis P1 Tarski_neutral_dimensionless.l8_3 Tarski_neutral_dimensionless_axioms assms(2) bet_col l8_2 midpoint_bet midpoint_distinct_3) + then have "Cong A C' A' C'" + using Cong_perm P2 Per_def symmetric_point_uniqueness by blast + then have "Cong A' C A' C'" + using P1 P2 cong_inner_transitivity midpoint_cong not_cong_2134 by blast + then have Q4: "Per A' B C" + using P1 Per_def by blast + have "Bet A' C A" + using Mid_perm P2 midpoint_bet by blast + thus ?thesis + using Q4 assms(1) l8_6 by blast +qed + +lemma l8_8: + assumes "Per A B A" + shows "A = B" + using Tarski_neutral_dimensionless.l8_6 Tarski_neutral_dimensionless_axioms assms between_trivial2 by fastforce + +lemma per_distinct: + assumes "Per A B C" and + "A \ B" + shows "A \ C" + using assms(1) assms(2) l8_8 by blast + +lemma per_distinct_1: + assumes "Per A B C" and + "B \ C" + shows "A \ C" + using assms(1) assms(2) l8_8 by blast + +lemma l8_9: + assumes "Per A B C" and + "Col A B C" + shows "A = B \ C = B" + using Col_cases assms(1) assms(2) l8_3 l8_8 by blast + +lemma l8_10: + assumes "Per A B C" and + "A B C Cong3 A' B' C'" + shows "Per A' B' C'" +proof - + obtain D where P1: "B Midpoint C D \ Cong A C A D" + using Per_def assms(1) by blast + obtain D' where P2: "Bet C' B' D' \ Cong B' D' B' C'" + using segment_construction by blast + have P3: "B' Midpoint C' D'" + by (simp add: Midpoint_def P2 cong_4312) + have "Cong A' C' A' D'" + proof (cases) + assume "C = B" + thus ?thesis + by (metis Cong3_def P3 assms(2) cong_diff_4 cong_reflexivity is_midpoint_id) + next + assume Q1: "\ C = B" + have "C B D A OFSC C' B' D' A'" + by (metis Cong3_def OFSC_def P1 P3 Tarski_neutral_dimensionless.cong_mid2__cong Tarski_neutral_dimensionless_axioms assms(2) cong_commutativity l4_3_1 midpoint_bet) + thus ?thesis + by (meson OFSC_def P1 Q1 cong_4321 cong_inner_transitivity five_segment_with_def) + qed + thus ?thesis + using Per_def P3 by blast +qed + +lemma col_col_per_per: + assumes "A \ X" and + "C \ X" and + "Col U A X" and + "Col V C X" and + "Per A X C" + shows "Per U X V" + by (meson Tarski_neutral_dimensionless.l8_2 Tarski_neutral_dimensionless.l8_3 Tarski_neutral_dimensionless_axioms assms(1) assms(2) assms(3) assms(4) assms(5) not_col_permutation_3) + +lemma perp_in_dec: + "X PerpAt A B C D \ \ X PerpAt A B C D" + by simp + +lemma perp_distinct: + assumes "A B Perp C D" + shows "A \ B \ C \ D" + using PerpAt_def Perp_def assms by auto + +lemma l8_12: + assumes "X PerpAt A B C 
D" + shows "X PerpAt C D A B" + using Per_perm PerpAt_def assms by auto + +lemma per_col: + assumes "B \ C" and + "Per A B C" and + "Col B C D" + shows "Per A B D" + by (metis Tarski_neutral_dimensionless.l8_3 Tarski_neutral_dimensionless_axioms assms(1) assms(2) assms(3) l8_2) + +lemma l8_13_2: + assumes "A \ B" and + "C \ D" and + "Col X A B" and + "Col X C D" and + "\ U. \ V. Col U A B \ Col V C D \ U \ X \ V \ X \ Per U X V" + shows "X PerpAt A B C D" +proof - + obtain pp :: 'p and ppa :: 'p where + f1: "Col pp A B \ Col ppa C D \ pp \ X \ ppa \ X \ Per pp X ppa" + using assms(5) by blast + obtain ppb :: "'p \ 'p \ 'p \ 'p \ 'p \ 'p" and ppc :: "'p \ 'p \ 'p \ 'p \ 'p \ 'p" where + "\x0 x1 x2 x3 x4. (\v5 v6. (Col v5 x3 x2 \ Col v6 x1 x0) \ \ Per v5 x4 v6) = ((Col (ppb x0 x1 x2 x3 x4) x3 x2 \ Col (ppc x0 x1 x2 x3 x4) x1 x0) \ \ Per (ppb x0 x1 x2 x3 x4) x4 (ppc x0 x1 x2 x3 x4))" + by moura + then have f2: "\p pa pb pc pd. (\ p PerpAt pa pb pc pd \ pa \ pb \ pc \ pd \ Col p pa pb \ Col p pc pd \ (\pe pf. (\ Col pe pa pb \ \ Col pf pc pd) \ Per pe p pf)) \ (p PerpAt pa pb pc pd \ pa = pb \ pc = pd \ \ Col p pa pb \ \ Col p pc pd \ (Col (ppb pd pc pb pa p) pa pb \ Col (ppc pd pc pb pa p) pc pd) \ \ Per (ppb pd pc pb pa p) p (ppc pd pc pb pa p))" + using PerpAt_def by fastforce + { assume "\ Col (ppb D C B A X) pp X" + then have "\ Col (ppb D C B A X) A B \ \ Col (ppc D C B A X) C D \ Per (ppb D C B A X) X (ppc D C B A X)" + using f1 by (meson assms(1) assms(3) col3 not_col_permutation_2) } + moreover + { assume "\ Col (ppc D C B A X) ppa X" + then have "\ Col (ppb D C B A X) A B \ \ Col (ppc D C B A X) C D \ Per (ppb D C B A X) X (ppc D C B A X)" + using f1 by (meson assms(2) assms(4) col3 not_col_permutation_2) } + ultimately have "\ Col (ppb D C B A X) A B \ \ Col (ppc D C B A X) C D \ Per (ppb D C B A X) X (ppc D C B A X)" + using f1 by (meson Tarski_neutral_dimensionless.col_col_per_per Tarski_neutral_dimensionless_axioms) + then have "(X PerpAt A B C D \ A = B \ C = D \ \ Col X A B \ \ Col X C D \ Col (ppb D C B A X) A B \ Col (ppc D C B A X) C D \ \ Per (ppb D C B A X) X (ppc D C B A X)) \ (\ Col (ppb D C B A X) A B \ \ Col (ppc D C B A X) C D \ Per (ppb D C B A X) X (ppc D C B A X))" + using f2 by presburger + thus ?thesis + using assms(1) assms(2) assms(3) assms(4) by blast +qed + +lemma l8_14_1: + "\ A B Perp A B" + by (metis PerpAt_def Perp_def Tarski_neutral_dimensionless.col_trivial_1 Tarski_neutral_dimensionless.col_trivial_3 Tarski_neutral_dimensionless_axioms l8_8) + +lemma l8_14_2_1a: + assumes "X PerpAt A B C D" + shows "A B Perp C D" + using Perp_def assms by blast + +lemma perp_in_distinct: + assumes "X PerpAt A B C D" + shows "A \ B \ C \ D" + using PerpAt_def assms by blast + +lemma l8_14_2_1b: + assumes "X PerpAt A B C D" and + "Col Y A B" and + "Col Y C D" + shows "X = Y" + by (metis PerpAt_def assms(1) assms(2) assms(3) l8_13_2 l8_14_1 l8_14_2_1a) + +lemma l8_14_2_1b_bis: + assumes "A B Perp C D" and + "Col X A B" and + "Col X C D" + shows "X PerpAt A B C D" + using Perp_def assms(1) assms(2) assms(3) l8_14_2_1b by blast + +lemma l8_14_2_2: + assumes "A B Perp C D" and + "\ Y. 
(Col Y A B \ Col Y C D) \ X = Y" + shows "X PerpAt A B C D" + by (metis Tarski_neutral_dimensionless.PerpAt_def Tarski_neutral_dimensionless.Perp_def Tarski_neutral_dimensionless_axioms assms(1) assms(2)) + +lemma l8_14_3: + assumes "X PerpAt A B C D" and + "Y PerpAt A B C D" + shows "X = Y" + by (meson PerpAt_def assms(1) assms(2) l8_14_2_1b) + +lemma l8_15_1: + assumes "Col A B X" and + "A B Perp C X" + shows "X PerpAt A B C X" + using NCol_perm assms(1) assms(2) col_trivial_3 l8_14_2_1b_bis by blast + +lemma l8_15_2: + assumes "Col A B X" and + "X PerpAt A B C X" + shows "A B Perp C X" + using assms(2) l8_14_2_1a by blast + +lemma perp_in_per: + assumes "B PerpAt A B B C" + shows "Per A B C" + by (meson NCol_cases PerpAt_def assms col_trivial_3) + +lemma perp_sym: + assumes "A B Perp A B" + shows "C D Perp C D" + using assms l8_14_1 by auto + +lemma perp_col0: + assumes "A B Perp C D" and + "X \ Y" and + "Col A B X" and + "Col A B Y" + shows "C D Perp X Y" +proof - + obtain X0 where P1: "X0 PerpAt A B C D" + using Perp_def assms(1) by blast + then have P2: " A \ B \ C \ D \ Col X0 A B \ Col X0 C D \ + ((Col U A B \ Col V C D) \ Per U X0 V)" using PerpAt_def by blast + have Q1: "C \ D" using P2 by blast + have Q2: "X \ Y" using assms(2) by blast + have Q3: "Col X0 C D" using P2 by blast + have Q4: "Col X0 X Y" + proof - + have "\p pa. Col p pa Y \ Col p pa X \ Col p pa X0 \ p \ pa" + by (metis (no_types) Col_cases P2 assms(3) assms(4)) + thus ?thesis + using col3 by blast + qed + have "X0 PerpAt C D X Y" + proof - + have "\ U V. (Col U C D \ Col V X Y) \ Per U X0 V" + by (metis Col_perm P1 Per_perm Q2 Tarski_neutral_dimensionless.PerpAt_def Tarski_neutral_dimensionless_axioms assms(3) assms(4) colx) + thus ?thesis using Q1 Q2 Q3 Q4 PerpAt_def by blast + qed + thus ?thesis + using Perp_def by auto +qed + +lemma per_perp_in: + assumes "A \ B" and + "B \ C" and + "Per A B C" + shows "B PerpAt A B B C" + by (metis Col_def assms(1) assms(2) assms(3) between_trivial2 l8_13_2) + +lemma per_perp: + assumes "A \ B" and + "B \ C" and + "Per A B C" + shows "A B Perp B C" + using Perp_def assms(1) assms(2) assms(3) per_perp_in by blast + +lemma perp_left_comm: + assumes "A B Perp C D" + shows "B A Perp C D" +proof - + obtain X where "X PerpAt A B C D" + using Perp_def assms by blast + then have "X PerpAt B A C D" + using PerpAt_def col_permutation_5 by auto + thus ?thesis + using Perp_def by blast +qed + +lemma perp_right_comm: + assumes "A B Perp C D" + shows "A B Perp D C" + by (meson Perp_def assms l8_12 perp_left_comm) + +lemma perp_comm: + assumes "A B Perp C D" + shows "B A Perp D C" + by (simp add: assms perp_left_comm perp_right_comm) + +lemma perp_in_sym: + assumes "X PerpAt A B C D" + shows "X PerpAt C D A B" + by (simp add: assms l8_12) + +lemma perp_in_left_comm: + assumes "X PerpAt A B C D" + shows "X PerpAt B A C D" + by (metis Col_cases PerpAt_def assms) + +lemma perp_in_right_comm: + assumes "X PerpAt A B C D" + shows "X PerpAt A B D C" + using assms perp_in_left_comm perp_in_sym by blast + +lemma perp_in_comm: + assumes "X PerpAt A B C D" + shows "X PerpAt B A D C" + by (simp add: assms perp_in_left_comm perp_in_right_comm) + +lemma Perp_cases: + assumes "A B Perp C D \ B A Perp C D \ A B Perp D C \ B A Perp D C \ C D Perp A B \ C D Perp B A \ D C Perp A B \ D C Perp B A" + shows "A B Perp C D" + by (meson Perp_def assms perp_in_sym perp_left_comm) + +lemma Perp_perm : + assumes "A B Perp C D" + shows "A B Perp C D \ B A Perp C D \ A B Perp D C \ B A Perp D C \ C D Perp A B \ C D 
Perp B A \ D C Perp A B \ D C Perp B A" + by (meson Perp_def assms perp_in_sym perp_left_comm) + +lemma Perp_in_cases: + assumes "X PerpAt A B C D \ X PerpAt B A C D \ X PerpAt A B D C \ X PerpAt B A D C \ X PerpAt C D A B \ X PerpAt C D B A \ X PerpAt D C A B \ X PerpAt D C B A" + shows "X PerpAt A B C D" + using assms perp_in_left_comm perp_in_sym by blast + +lemma Perp_in_perm: + assumes "X PerpAt A B C D" + shows "X PerpAt A B C D \ X PerpAt B A C D \ X PerpAt A B D C \ X PerpAt B A D C \ X PerpAt C D A B \ X PerpAt C D B A \ X PerpAt D C A B \ X PerpAt D C B A" + using Perp_in_cases assms by blast + +lemma perp_in_col: + assumes "X PerpAt A B C D" + shows "Col A B X \ Col C D X" + using PerpAt_def assms col_permutation_2 by presburger + +lemma perp_perp_in: + assumes "A B Perp C A" + shows "A PerpAt A B C A" + using assms l8_15_1 not_col_distincts by blast + +lemma perp_per_1: + assumes "A B Perp C A" + shows "Per B A C" + using Perp_in_cases assms perp_in_per perp_perp_in by blast + +lemma perp_per_2: + assumes "A B Perp A C" + shows "Per B A C" + by (simp add: Perp_perm assms perp_per_1) + +lemma perp_col: + assumes "A \ E" and + "A B Perp C D" and + "Col A B E" + shows "A E Perp C D" + using Perp_perm assms(1) assms(2) assms(3) col_trivial_3 perp_col0 by blast + +lemma perp_col2: + assumes "A B Perp X Y" and + "C \ D" and + "Col A B C" and + "Col A B D" + shows "C D Perp X Y" + using Perp_perm assms(1) assms(2) assms(3) assms(4) perp_col0 by blast + +lemma perp_col4: + assumes "P \ Q" and + "R \ S" and + "Col A B P" and + "Col A B Q" and + "Col C D R" and + "Col C D S" and + "A B Perp C D" + shows "P Q Perp R S" + using assms(1) assms(2) assms(3) assms(4) assms(5) assms(6) assms(7) perp_col0 by blast + +lemma perp_not_eq_1: + assumes "A B Perp C D" + shows "A \ B" + using assms perp_distinct by auto + +lemma perp_not_eq_2: + assumes "A B Perp C D" + shows "C \ D" + using assms perp_distinct by auto + +lemma diff_per_diff: + assumes "A \ B" and + "Cong A P B R" and + "Per B A P" + and "Per A B R" + shows "P \ R" + using assms(1) assms(3) assms(4) l8_2 l8_7 by blast + +lemma per_not_colp: + assumes "A \ B" and + "A \ P" and + "B \ R" and + "Per B A P" + and "Per A B R" + shows "\ Col P A R" + by (metis Per_cases Tarski_neutral_dimensionless.col_permutation_4 Tarski_neutral_dimensionless_axioms assms(1) assms(2) assms(4) assms(5) l8_3 l8_7) + +lemma per_not_col: + assumes "A \ B" and + "B \ C" and + "Per A B C" + shows "\ Col A B C" + using assms(1) assms(2) assms(3) l8_9 by auto + +lemma perp_not_col2: + assumes "A B Perp C D" + shows "\ Col A B C \ \ Col A B D" + using assms l8_14_1 perp_col2 perp_distinct by blast + +lemma perp_not_col: + assumes "A B Perp P A" + shows "\ Col A B P" +proof - + have "A PerpAt A B P A" + using assms perp_perp_in by auto + then have "Per P A B" + by (simp add: perp_in_per perp_in_sym) + then have "\ Col B A P" + by (metis NCol_perm Tarski_neutral_dimensionless.perp_not_eq_1 Tarski_neutral_dimensionless.perp_not_eq_2 Tarski_neutral_dimensionless_axioms assms per_not_col) + thus ?thesis + using Col_perm by blast +qed + +lemma perp_in_col_perp_in: + assumes "C \ E" and + "Col C D E" and + "P PerpAt A B C D" + shows "P PerpAt A B C E" +proof - + have P2: "C \ D" + using assms(3) perp_in_distinct by blast + have P3: "Col P A B" + using PerpAt_def assms(3) by auto + have "Col P C D" + using PerpAt_def assms(3) by blast + then have "Col P C E" + using P2 assms(2) col_trivial_2 colx by blast + thus ?thesis + by (smt P3 Perp_perm 
Tarski_neutral_dimensionless.l8_14_2_1b_bis Tarski_neutral_dimensionless.perp_col Tarski_neutral_dimensionless_axioms assms(1) assms(2) assms(3) l8_14_2_1a) +qed + +lemma perp_col2_bis: + assumes "A B Perp C D" and + "Col C D P" and + "Col C D Q" and + "P \ Q" + shows "A B Perp P Q" + using Perp_cases assms(1) assms(2) assms(3) assms(4) perp_col0 by blast + +lemma perp_in_perp_bis_R1: + assumes "X \ A" and + "X PerpAt A B C D" + shows "X B Perp C D \ A X Perp C D" + by (metis assms(2) l8_14_2_1a perp_col perp_in_col) + +lemma perp_in_perp_bis: + assumes "X PerpAt A B C D" + shows "X B Perp C D \ A X Perp C D" + by (metis assms l8_14_2_1a perp_in_perp_bis_R1) + +lemma col_per_perp: + assumes "A \ B" and + "B \ C" and + (* "D \ B" and *) + "D \ C" and + "Col B C D" and + "Per A B C" + shows "C D Perp A B" + by (metis Perp_cases assms(1) assms(2) assms(3) assms(4) assms(5) col_trivial_2 per_perp perp_col2_bis) + +lemma per_cong_mid_R1: + assumes "B = H" and + (* "B \ C" and *) + "Bet A B C" and + "Cong A H C H" and + "Per H B C" + shows "B Midpoint A C" + using assms(1) assms(2) assms(3) midpoint_def not_cong_1243 by blast + +lemma per_cong_mid_R2: + assumes (*"B \ H" and *) + "B \ C" and + "Bet A B C" and + "Cong A H C H" and + "Per H B C" + shows "B Midpoint A C" +proof - + have P1: "Per C B H" + using Per_cases assms(4) by blast + have P2: "Per H B A" + using assms(1) assms(2) assms(4) bet_col col_permutation_1 per_col by blast + then have P3: "Per A B H" + using Per_cases by blast + obtain C' where P4: "B Midpoint C C' \ Cong H C H C'" + using Per_def assms(4) by blast + obtain H' where P5: "B Midpoint H H' \ Cong C H C H'" + using P1 Per_def by blast + obtain A' where P6: "B Midpoint A A' \ Cong H A H A'" + using P2 Per_def by blast + obtain H'' where P7: "B Midpoint H H'' \ Cong A H A H'" + using P3 P5 Tarski_neutral_dimensionless.Per_def Tarski_neutral_dimensionless_axioms symmetric_point_uniqueness by fastforce + then have P8: "H' = H''" + using P5 symmetric_point_uniqueness by blast + have "H B H' A IFSC H B H' C" + proof - + have Q1: "Bet H B H'" + by (simp add: P7 P8 midpoint_bet) + have Q2: "Cong H H' H H'" + by (simp add: cong_reflexivity) + have Q3: "Cong B H' B H'" + by (simp add: cong_reflexivity) + have Q4: "Cong H A H C" + using assms(3) not_cong_2143 by blast + have "Cong H' A H' C" + using P5 P7 assms(3) cong_commutativity cong_inner_transitivity by blast + thus ?thesis + by (simp add: IFSC_def Q1 Q2 Q3 Q4) + qed + thus ?thesis + using assms(1) assms(2) bet_col bet_neq23__neq l4_2 l7_20_bis by auto +qed + +lemma per_cong_mid: + assumes "B \ C" and + "Bet A B C" and + "Cong A H C H" and + "Per H B C" + shows "B Midpoint A C" + using assms(1) assms(2) assms(3) assms(4) per_cong_mid_R1 per_cong_mid_R2 by blast + +lemma per_double_cong: + assumes "Per A B C" and + "B Midpoint C C'" + shows "Cong A C A C'" + using Mid_cases Per_def assms(1) assms(2) l7_9_bis by blast + +lemma cong_perp_or_mid_R1: + assumes "Col A B X" and + "A \ B" and + "M Midpoint A B" and + "Cong A X B X" + shows "X = M \ \ Col A B X \ M PerpAt X M A B" + using assms(1) assms(2) assms(3) assms(4) col_permutation_5 cong_commutativity l7_17_bis l7_2 l7_20 by blast + +lemma cong_perp_or_mid_R2: + assumes "\ Col A B X" and + "A \ B" and + "M Midpoint A B" and + "Cong A X B X" + shows "X = M \ \ Col A B X \ M PerpAt X M A B" +proof - + have P1: "Col M A B" + by (simp add: assms(3) midpoint_col) + have "Per X M A" + using Per_def assms(3) assms(4) cong_commutativity by blast + thus ?thesis + by (metis P1 assms(1) 
assms(2) assms(3) midpoint_distinct_1 not_col_permutation_4 per_perp_in perp_in_col_perp_in perp_in_right_comm) +qed + +lemma cong_perp_or_mid: + assumes "A \ B" and + "M Midpoint A B" and + "Cong A X B X" + shows "X = M \ \ Col A B X \ M PerpAt X M A B" + using assms(1) assms(2) assms(3) cong_perp_or_mid_R1 cong_perp_or_mid_R2 by blast + +lemma col_per2_cases: + assumes "B \ C" and + "B' \ C" and + "C \ D" and + "Col B C D" and + "Per A B C" and + "Per A B' C" + shows "B = B' \ \ Col B' C D" + by (meson Tarski_neutral_dimensionless.l8_7 Tarski_neutral_dimensionless_axioms assms(1) assms(2) assms(3) assms(4) assms(5) assms(6) l6_16_1 per_col) + +lemma l8_16_1: + assumes "Col A B X" and + "Col A B U" and + "A B Perp C X" + shows "\ Col A B C \ Per C X U" + by (metis assms(1) assms(2) assms(3) l8_5 perp_col0 perp_left_comm perp_not_col2 perp_per_2) + +lemma l8_16_2: + assumes "Col A B X" and + "Col A B U" + and "U \ X" and + "\ Col A B C" and + "Per C X U" + shows "A B Perp C X" +proof - + obtain X where "X PerpAt A B C X" + by (metis (no_types) NCol_perm assms(1) assms(2) assms(3) assms(4) assms(5) l8_13_2 l8_2 not_col_distincts) + thus ?thesis + by (smt Perp_perm assms(1) assms(2) assms(3) assms(4) assms(5) col3 col_per_perp not_col_distincts per_col per_perp) +qed + +lemma l8_18_uniqueness: + assumes (*"\ Col A B C" and *) + "Col A B X" and + "A B Perp C X" and + "Col A B Y" and + "A B Perp C Y" + shows "X = Y" + using assms(1) assms(2) assms(3) assms(4) l8_16_1 l8_7 by blast + +lemma midpoint_distinct: + assumes "\ Col A B C" and + "Col A B X" and + "X Midpoint C C'" + shows "C \ C'" + using assms(1) assms(2) assms(3) l7_3 by auto + +lemma l8_20_1_R1: + assumes "A = B" + shows "Per B A P" + by (simp add: assms l8_2 l8_5) + +lemma l8_20_1_R2: + assumes "A \ B" and + "Per A B C" and + "P Midpoint C' D" and + "A Midpoint C' C" and + "B Midpoint D C" + shows "Per B A P" +proof - + obtain B' where P1: "A Midpoint B B'" + using symmetric_point_construction by blast + obtain D' where P2: "A Midpoint D D'" + using symmetric_point_construction by blast + obtain P' where P3: "A Midpoint P P'" + using symmetric_point_construction by blast + have P4: "Per B' B C" + by (metis P1 Tarski_neutral_dimensionless.Per_cases Tarski_neutral_dimensionless.per_col Tarski_neutral_dimensionless_axioms assms(1) assms(2) midpoint_col not_col_permutation_4) + have P5: "Per B B' C'" + proof - + have "Per B' B C" + by (simp add: P4) + have "B' B C Cong3 B B' C'" + by (meson Cong3_def P1 assms(4) l7_13 l7_2) + thus ?thesis + using P4 l8_10 by blast + qed + have P6: "B' Midpoint D' C'" + by (meson P1 P2 assms(4) assms(5) l7_15 l7_16 l7_2 midpoint_bet midpoint_cong midpoint_def) + have P7: "P' Midpoint C D'" + using P2 P3 assms(3) assms(4) symmetry_preserves_midpoint by blast + have P8: "A Midpoint P P'" + by (simp add: P3) + obtain D'' where P9: "B Midpoint C D'' \ Cong B' C B' D" + using P4 assms(5) l7_2 per_double_cong by blast + have P10: "D'' = D" + using P9 assms(5) l7_9_bis by blast + obtain D'' where P11: "B' Midpoint C' D'' \ Cong B C' B D''" + using P5 Per_def by blast + have P12: "D' = D''" + by (meson P11 P6 Tarski_neutral_dimensionless.l7_9_bis Tarski_neutral_dimensionless_axioms) + have P13: "P Midpoint C' D" + using assms(3) by blast + have P14: "Cong C D C' D'" + using P2 assms(4) l7_13 l7_2 by blast + have P15: "Cong C' D C D'" + using P2 assms(4) cong_4321 l7_13 by blast + have P16: "Cong P D P' D'" + using P2 P8 cong_symmetry l7_13 by blast + have P17: "Cong P D P' C" + using P16 P7 cong_3421 
cong_transitivity midpoint_cong by blast + have P18: "C' P D B IFSC D' P' C B" + by (metis Bet_cases IFSC_def P10 P11 P12 P13 P15 P17 P7 P9 cong_commutativity cong_right_commutativity l7_13 l7_3_2 midpoint_bet) + then have "Cong B P B P'" + using Tarski_neutral_dimensionless.l4_2 Tarski_neutral_dimensionless_axioms not_cong_2143 by fastforce + thus ?thesis + using P8 Per_def by blast +qed + +lemma l8_20_1: + assumes "Per A B C" and + "P Midpoint C' D" and + "A Midpoint C' C" and + "B Midpoint D C" + shows "Per B A P" + using assms(1) assms(2) assms(3) assms(4) l8_20_1_R1 l8_20_1_R2 by fastforce + +lemma l8_20_2: + assumes "P Midpoint C' D" and + "A Midpoint C' C" and + "B Midpoint D C" and + "B \ C" + shows "A \ P" + using assms(1) assms(2) assms(3) assms(4) l7_3 symmetric_point_uniqueness by blast + +lemma perp_col1: + assumes "C \ X" and + "A B Perp C D" and + "Col C D X" + shows "A B Perp C X" + using assms(1) assms(2) assms(3) col_trivial_3 perp_col2_bis by blast + +lemma l8_18_existence: + assumes "\ Col A B C" + shows "\ X. Col A B X \ A B Perp C X" +proof - + obtain Y where P1: "Bet B A Y \ Cong A Y A C" + using segment_construction by blast + then obtain P where P2: "P Midpoint C Y" + using Mid_cases l7_25 by blast + then have P3: "Per A P Y" + using P1 Per_def l7_2 by blast + obtain Z where P3: "Bet A Y Z \ Cong Y Z Y P" + using segment_construction by blast + obtain Q where P4: "Bet P Y Q \ Cong Y Q Y A" + using segment_construction by blast + obtain Q' where P5: "Bet Q Z Q' \ Cong Z Q' Q Z" + using segment_construction by blast + then have P6: "Z Midpoint Q Q'" + using midpoint_def not_cong_3412 by blast + obtain C' where P7: "Bet Q' Y C' \ Cong Y C' Y C" + using segment_construction by blast + obtain X where P8: "X Midpoint C C'" + using Mid_cases P7 l7_25 by blast + have P9: "A Y Z Q OFSC Q Y P A" + by (simp add: OFSC_def P3 P4 between_symmetry cong_4321 cong_pseudo_reflexivity) + have P10: "A \ Y" + using P1 assms cong_reverse_identity not_col_distincts by blast + then have P11: "Cong Z Q P A" + using P9 five_segment_with_def by blast + then have P12: "A P Y Cong3 Q Z Y" + using Cong3_def P3 P4 not_cong_4321 by blast + have P13: "Per Q Z Y" + using Cong_perm P1 P12 P2 Per_def l8_10 l8_4 by blast + then have P14: "Per Y Z Q" + by (simp add: l8_2) + have P15: "P \ Y" + using NCol_cases P1 P2 assms bet_col l7_3_2 l7_9_bis by blast + obtain Q'' where P16:"Z Midpoint Q Q'' \ Cong Y Q Y Q'" + using P14 P6 per_double_cong by blast + then have P17: "Q' = Q''" + using P6 symmetric_point_uniqueness by blast + have P18: "Bet Z Y X" + proof - + have "Bet Q Y C" + using P15 P2 P4 between_symmetry midpoint_bet outer_transitivity_between2 by blast + thus ?thesis + using P16 P6 P7 P8 l7_22 not_cong_3412 by blast + qed + have P19: "Q \ Y" + using P10 P4 cong_reverse_identity by blast + have P20: "Per Y X C" + proof - + have "Bet C P Y" + by (simp add: P2 midpoint_bet) + thus ?thesis + using P7 P8 Per_def not_cong_3412 by blast + qed + have P21: "Col P Y Q" + by (simp add: Col_def P4) + have P22: "Col P Y C" + using P2 midpoint_col not_col_permutation_5 by blast + have P23: "Col P Q C" + using P15 P21 P22 col_transitivity_1 by blast + have P24: "Col Y Q C" + using P15 P21 P22 col_transitivity_2 by auto + have P25: "Col A Y B" + by (simp add: Col_def P1) + have P26: "Col A Y Z" + using P3 bet_col by blast + have P27: "Col A B Z" + using P10 P25 P26 col_transitivity_1 by blast + have P28: "Col Y B Z" + using P10 P25 P26 col_transitivity_2 by blast + have P29: "Col Q Y P" + using P21 
not_col_permutation_3 by blast + have P30: "Q \ C" + using P15 P2 P4 between_equality_2 between_symmetry midpoint_bet by blast + have P31: "Col Y B Z" + using P28 by auto + have P32: "Col Y Q' C'" + by (simp add: P7 bet_col col_permutation_4) + have P33: "Q \ Q'" + using P11 P15 P22 P25 P5 assms bet_neq12__neq col_transitivity_1 cong_reverse_identity by blast + have P34: "C \ C'" + by (smt P15 P18 P3 P31 P8 assms bet_col col3 col_permutation_2 col_permutation_3 cong_3421 cong_diff midpoint_distinct_3) + have P35: "Q Y C Z OFSC Q' Y C' Z" + by (meson OFSC_def P15 P16 P2 P4 P5 P7 between_symmetry cong_3421 cong_reflexivity midpoint_bet not_cong_3412 outer_transitivity_between2) + then have P36: "Cong C Z C' Z" + using P19 five_segment_with_def by blast + have P37: "Col Z Y X" + by (simp add: P18 bet_col) + have P38: "Y \ Z" + using P15 P3 cong_reverse_identity by blast + then have P40: "X \ Y" + by (metis (mono_tags, hide_lams) Col_perm Cong_perm P14 P24 P25 P27 P36 P8 Per_def assms colx per_not_colp) + have "Col A B X" + using Col_perm P26 P31 P37 P38 col3 by blast + thus ?thesis + by (metis P18 P20 P27 P37 P40 Tarski_neutral_dimensionless.per_col Tarski_neutral_dimensionless_axioms assms between_equality col_permutation_3 l5_2 l8_16_2 l8_2) +qed + +lemma l8_21_aux: + assumes "\ Col A B C" + shows "\ P. \ T. (A B Perp P A \ Col A B T \ Bet C T P)" +proof - + obtain X where P1: "Col A B X \ A B Perp C X" + using assms l8_18_existence by blast + have P2: "X PerpAt A B C X" + by (simp add: P1 l8_15_1) + have P3: "Per A X C" + by (meson P1 Per_perm Tarski_neutral_dimensionless.l8_16_1 Tarski_neutral_dimensionless_axioms col_trivial_3) + obtain C' where P4: "X Midpoint C C' \ Cong A C A C'" + using P3 Per_def by blast + obtain C'' where P5: "A Midpoint C C''" + using symmetric_point_construction by blast + obtain P where P6: "P Midpoint C' C''" + by (metis Cong_perm P4 P5 Tarski_neutral_dimensionless.Midpoint_def Tarski_neutral_dimensionless_axioms cong_inner_transitivity l7_25) + have P7: "Per X A P" + by (smt P3 P4 P5 P6 l7_2 l8_20_1_R2 l8_4 midpoint_distinct_3 symmetric_point_uniqueness) + have P8: "X \ C" + using P1 assms by auto + have P9: "A \ P" + using P4 P5 P6 P8 l7_9 midpoint_distinct_2 by blast + obtain T where P10: "Bet P T C \ Bet A T X" + by (meson Mid_perm Midpoint_def P4 P5 P6 l3_17) + have "A B Perp P A \ Col A B T \ Bet C T P" + proof cases + assume "A = X" + thus ?thesis + by (metis Bet_perm Col_def P1 P10 P9 between_identity col_trivial_3 perp_col2_bis) + next + assume "A \ X" + thus ?thesis + by (metis Bet_perm Col_def P1 P10 P7 P9 Perp_perm col_transitivity_2 col_trivial_1 l8_3 per_perp perp_not_col2) + qed + thus ?thesis + by blast +qed + +lemma l8_21: + assumes "A \ B" + shows "\ P T. 
A B Perp P A \ Col A B T \ Bet C T P" + by (meson assms between_trivial2 l8_21_aux not_col_exists) + +lemma per_cong: + assumes "A \ B" and + "A \ P" and + "Per B A P" and + "Per A B R" and + "Cong A P B R" and + "Col A B X" and + "Bet P X R" + shows "Cong A R P B" +proof - + have P1: "Per P A B" + using Per_cases assms(3) by blast + obtain Q where P2: "R Midpoint B Q" + using symmetric_point_construction by auto + have P3: "B \ R" + using assms(2) assms(5) cong_identity by blast + have P4: "Per A B Q" + by (metis P2 P3 assms(1) assms(4) bet_neq23__neq col_permutation_4 midpoint_bet midpoint_col per_perp_in perp_in_col_perp_in perp_in_per) + have P5: "Per P A X" + using P1 assms(1) assms(6) per_col by blast + have P6: "B \ Q" + using P2 P3 l7_3 by blast + have P7: "Per R B X" + by (metis assms(1) assms(4) assms(6) l8_2 not_col_permutation_4 per_col) + have P8: "X \ A" + using P3 assms(1) assms(2) assms(3) assms(4) assms(7) bet_col per_not_colp by blast + obtain P' where P9: "A Midpoint P P'" + using Per_def assms(3) by blast + obtain R' where P10: "Bet P' X R' \ Cong X R' X R" + using segment_construction by blast + obtain M where P11: "M Midpoint R R'" + by (meson P10 Tarski_neutral_dimensionless.l7_2 Tarski_neutral_dimensionless_axioms l7_25) + have P12: "Per X M R" + using P10 P11 Per_def cong_symmetry by blast + have P13: "Cong X P X P'" + using P9 assms(1) assms(3) assms(6) cong_left_commutativity l4_17 midpoint_cong per_double_cong by blast + have P14: "X \ P'" + using P13 P8 P9 cong_identity l7_3 by blast + have P15: "P \ P'" + using P9 assms(2) midpoint_distinct_2 by blast + have P16: "\ Col X P P'" + using P13 P15 P8 P9 l7_17 l7_20 not_col_permutation_4 by blast + have P17: "Bet A X M" + using P10 P11 P13 P9 assms(7) cong_symmetry l7_22 by blast + have P18: "X \ R" + using P3 P7 per_distinct_1 by blast + have P19: "X \ R'" + using P10 P18 cong_diff_3 by blast + have P20: "X \ M" + by (metis Col_def P10 P11 P16 P18 P19 assms(7) col_transitivity_1 midpoint_col) + have P21: "M = B" + by (smt Col_def P12 P17 P20 P8 Per_perm assms(1) assms(4) assms(6) col_transitivity_2 l8_3 l8_7) + have "P X R P' OFSC P' X R' P" + by (simp add: OFSC_def P10 P13 assms(7) cong_commutativity cong_pseudo_reflexivity cong_symmetry) + then have "Cong R P' R' P" + using P13 P14 cong_diff_3 five_segment_with_def by blast + then have "P' A P R IFSC R' B R P" + by (metis Bet_perm Cong_perm Midpoint_def P11 P21 P9 Tarski_neutral_dimensionless.IFSC_def Tarski_neutral_dimensionless_axioms assms(5) cong_mid2__cong cong_pseudo_reflexivity) + thus ?thesis + using l4_2 not_cong_1243 by blast +qed + +lemma perp_cong: + assumes "A \ B" and + "A \ P" and + "A B Perp P A" and + "A B Perp R B" and + "Cong A P B R" and + "Col A B X" and + "Bet P X R" + shows "Cong A R P B" + using Perp_cases assms(1) assms(2) assms(3) assms(4) assms(5) assms(6) assms(7) per_cong perp_per_1 by blast + +lemma perp_exists: + assumes "A \ B" + shows "\ X. PO X Perp A B" +proof cases + assume "Col A B PO" + then obtain C where P1: "A \ C \ B \ C \ PO \ C \ Col A B C" + using diff_col_ex3 by blast + then obtain P T where P2: "PO C Perp P PO \ Col PO C T \ Bet PO T P" using l8_21 + by blast + then have "PO P Perp A B" + by (metis P1 Perp_perm \Col A B PO\ assms col3 col_trivial_2 col_trivial_3 perp_col2) + thus ?thesis + by blast +next + assume "\ Col A B PO" + thus ?thesis using l8_18_existence + using assms col_trivial_2 col_trivial_3 l8_18_existence perp_col0 by blast +qed + +lemma perp_vector: + assumes "A \ B" + shows "\ X Y. 
A B Perp X Y" + using assms l8_21 by blast + +lemma midpoint_existence_aux: + assumes "A \ B" and + "A B Perp Q B" and + "A B Perp P A" and + "Col A B T" and + "Bet Q T P" and + "A P Le B Q" + shows "\ X. X Midpoint A B" +proof - + obtain R where P1: "Bet B R Q \ Cong A P B R" + using Le_def assms(6) by blast + obtain X where P2: "Bet T X B \ Bet R X P" + using P1 assms(5) between_symmetry inner_pasch by blast + have P3: "Col A B X" + by (metis Col_def Out_cases P2 assms(4) between_equality l6_16_1 not_out_bet out_diff1) + have P4: "B \ R" + using P1 assms(3) cong_identity perp_not_eq_2 by blast + have P5: "\ Col A B Q" + using assms(2) col_trivial_2 l8_16_1 by blast + have P6: "\ Col A B R" + using Col_def P1 P4 P5 l6_16_1 by blast + have P7: "P \ R" + using P2 P3 P6 between_identity by blast + have "\ X. X Midpoint A B" + proof cases + assume "A = P" + thus ?thesis + using assms(3) col_trivial_3 perp_not_col2 by blast + next + assume Q1: "\ A = P" + have Q2: "A B Perp R B" + by (metis P1 P4 Perp_perm Tarski_neutral_dimensionless.bet_col1 Tarski_neutral_dimensionless_axioms assms(2) l5_1 perp_col1) + then have Q3: "Cong A R P B" + using P1 P2 P3 Q1 assms(1) assms(3) between_symmetry perp_cong by blast + then have "X Midpoint A B \ X Midpoint P R" + by (smt P1 P2 P3 P6 P7 bet_col cong_left_commutativity cong_symmetry l7_2 l7_21 not_col_permutation_1) + thus ?thesis + by blast + qed + thus ?thesis by blast +qed + +lemma midpoint_existence: + "\ X. X Midpoint A B" +proof cases + assume "A = B" + thus ?thesis + using l7_3_2 by blast +next + assume P1: "\ A = B" + obtain Q where P2: "A B Perp B Q" + by (metis P1 l8_21 perp_comm) + obtain P T where P3: "A B Perp P A \ Col A B T \ Bet Q T P" + using P2 l8_21_aux not_col_distincts perp_not_col2 by blast + have P4: "A P Le B Q \ B Q Le A P" + by (simp add: local.le_cases) + have P5: "A P Le B Q \ (\ X. X Midpoint A B)" + by (meson P1 P2 P3 Tarski_neutral_dimensionless.Perp_cases Tarski_neutral_dimensionless.midpoint_existence_aux Tarski_neutral_dimensionless_axioms) + have P6: "B Q Le A P \ (\ X. X Midpoint A B)" + proof - + { + assume H1: "B Q Le A P" + have Q6: "B \ A" + using P1 by auto + have Q2: "B A Perp P A" + by (simp add: P3 perp_left_comm) + have Q3: "B A Perp Q B" + using P2 Perp_perm by blast + have Q4: "Col B A T" + using Col_perm P3 by blast + have Q5: "Bet P T Q" + using Bet_perm P3 by blast + obtain X where "X Midpoint B A" + using H1 Q2 Q3 Q4 Q5 Q6 midpoint_existence_aux by blast + then have "\ X. 
X Midpoint A B" + using l7_2 by blast + } + thus ?thesis + by simp + qed + thus ?thesis + using P4 P5 by blast +qed + +lemma perp_in_id: + assumes "X PerpAt A B C A" + shows "X = A" + by (meson Col_cases assms col_trivial_3 l8_14_2_1b) + +lemma l8_22: + assumes "A \ B" and + "A \ P" and + "Per B A P" and + "Per A B R" and + "Cong A P B R" and + "Col A B X" and + "Bet P X R" and + "Cong A R P B" + shows "X Midpoint A B \ X Midpoint P R" + by (metis assms(1) assms(2) assms(3) assms(4) assms(5) assms(6) assms(7) assms(8) bet_col cong_commutativity cong_diff cong_right_commutativity l7_21 not_col_permutation_5 per_not_colp) + +lemma l8_22_bis: + assumes "A \ B" and + "A \ P" and + "A B Perp P A" and + "A B Perp R B" and + "Cong A P B R" and + "Col A B X" and + "Bet P X R" + shows "Cong A R P B \ X Midpoint A B \ X Midpoint P R" + by (metis l8_22 Perp_cases assms(1) assms(2) assms(3) assms(4) assms(5) assms(6) assms(7) perp_cong perp_per_2) + +lemma perp_in_perp: + assumes "X PerpAt A B C D" + shows "A B Perp C D" + using assms l8_14_2_1a by auto + +lemma perp_proj: + assumes "A B Perp C D" and + "\ Col A C D" + shows "\ X. Col A B X \ A X Perp C D" + using assms(1) not_col_distincts by auto + +lemma l8_24 : + assumes "P A Perp A B" and + "Q B Perp A B" and + "Col A B T" and + "Bet P T Q" and + "Bet B R Q" and + "Cong A P B R" + shows "\ X. X Midpoint A B \ X Midpoint P R" +proof - + obtain X where P1: "Bet T X B \ Bet R X P" + using assms(4) assms(5) inner_pasch by blast + have P2: "Col A B X" + by (metis Out_cases P1 assms(3) bet_out_1 col_out2_col not_col_distincts out_trivial) + have P3: "A \ B" + using assms(1) col_trivial_2 l8_16_1 by blast + have P4: "A \ P" + using assms(1) col_trivial_1 l8_16_1 by blast + have "\ X. X Midpoint A B \ X Midpoint P R" + proof cases + assume "Col A B P" + thus ?thesis + using Perp_perm assms(1) perp_not_col by blast + next + assume Q1: "\ Col A B P" + have Q2: "B \ R" + using P4 assms(6) cong_diff by blast + have Q3: "Q \ B" + using Q2 assms(5) between_identity by blast + have Q4: "\ Col A B Q" + by (metis assms(2) col_permutation_3 l8_14_1 perp_col1 perp_not_col) + have Q5: "\ Col A B R" + by (meson Q2 Q4 assms(5) bet_col col_transitivity_1 not_col_permutation_2) + have Q6: "P \ R" + using P1 P2 Q5 between_identity by blast + have "\ X. 
X Midpoint A B \ X Midpoint P R" + proof cases + assume "A = P" + thus ?thesis + using P4 by blast + next + assume R0: "\ A = P" + have R1: "A B Perp R B" + by (metis Perp_cases Q2 Tarski_neutral_dimensionless.bet_col1 Tarski_neutral_dimensionless_axioms assms(2) assms(5) bet_col col_transitivity_1 perp_col1) + have R2: "Cong A R P B" + using P1 P2 P3 Perp_perm R0 R1 assms(1) assms(6) between_symmetry perp_cong by blast + have R3: "\ Col A P B" + using Col_perm Q1 by blast + have R4: "P \ R" + by (simp add: Q6) + have R5: "Cong A P B R" + by (simp add: assms(6)) + have R6: "Cong P B R A" + using R2 not_cong_4312 by blast + have R7: "Col A X B" + using Col_perm P2 by blast + have R8: "Col P X R" + by (simp add: P1 bet_col between_symmetry) + thus ?thesis using l7_21 + using R3 R4 R5 R6 R7 by blast + qed + thus ?thesis by simp + qed + thus ?thesis + by simp +qed + +lemma col_per2__per: + assumes "A \ B" and + "Col A B C" and + "Per A X P" and + "Per B X P" + shows "Per C X P" + by (meson Per_def assms(1) assms(2) assms(3) assms(4) l4_17 per_double_cong) + +lemma perp_in_per_1: + assumes "X PerpAt A B C D" + shows "Per A X C" + using PerpAt_def assms col_trivial_1 by auto + +lemma perp_in_per_2: + assumes "X PerpAt A B C D" + shows "Per A X D" + using assms perp_in_per_1 perp_in_right_comm by blast + +lemma perp_in_per_3: + assumes "X PerpAt A B C D" + shows "Per B X C" + using assms perp_in_comm perp_in_per_2 by blast + +lemma perp_in_per_4: + assumes "X PerpAt A B C D" + shows "Per B X D" + using assms perp_in_per_3 perp_in_right_comm by blast + +subsection "Planes" + +subsubsection "Coplanar" + +lemma coplanar_perm_1: + assumes "Coplanar A B C D" + shows "Coplanar A B D C" +proof - + obtain X where P1: "(Col A B X \ Col C D X) \ (Col A C X \ Col B D X) \ (Col A D X \ Col B C X)" + using Coplanar_def assms by blast + then show ?thesis + using Coplanar_def col_permutation_4 by blast +qed + +lemma coplanar_perm_2: + assumes "Coplanar A B C D" + shows "Coplanar A C B D" +proof - + obtain X where P1: "(Col A B X \ Col C D X) \ (Col A C X \ Col B D X) \ (Col A D X \ Col B C X)" + using Coplanar_def assms by blast + then show ?thesis + using Coplanar_def col_permutation_4 by blast +qed + +lemma coplanar_perm_3: + assumes "Coplanar A B C D" + shows "Coplanar A C D B" +proof - + obtain X where P1: "(Col A B X \ Col C D X) \ (Col A C X \ Col B D X) \ (Col A D X \ Col B C X)" + using Coplanar_def assms by blast + then show ?thesis + using Coplanar_def col_permutation_4 by blast +qed + +lemma coplanar_perm_4: + assumes "Coplanar A B C D" + shows "Coplanar A D B C" +proof - + obtain X where P1: "(Col A B X \ Col C D X) \ (Col A C X \ Col B D X) \ (Col A D X \ Col B C X)" + using Coplanar_def assms by blast + then show ?thesis + using Coplanar_def col_permutation_4 by blast +qed + +lemma coplanar_perm_5: + assumes "Coplanar A B C D" + shows "Coplanar A D C B" +proof - + obtain X where P1: "(Col A B X \ Col C D X) \ (Col A C X \ Col B D X) \ (Col A D X \ Col B C X)" + using Coplanar_def assms by blast + then show ?thesis + using Coplanar_def col_permutation_4 by blast +qed + +lemma coplanar_perm_6: + assumes "Coplanar A B C D" + shows "Coplanar B A C D" +proof - + obtain X where P1: "(Col A B X \ Col C D X) \ (Col A C X \ Col B D X) \ (Col A D X \ Col B C X)" + using Coplanar_def assms by blast + then show ?thesis + using Coplanar_def col_permutation_4 by blast +qed + +lemma coplanar_perm_7: + assumes "Coplanar A B C D" + shows "Coplanar B A D C" +proof - + obtain X where P1: "(Col A B X \ Col C D X) \ 
(Col A C X \ Col B D X) \ (Col A D X \ Col B C X)" + using Coplanar_def assms by blast + then show ?thesis + using Coplanar_def col_permutation_4 by blast +qed + +lemma coplanar_perm_8: + assumes "Coplanar A B C D" + shows "Coplanar B C A D" +proof - + obtain X where P1: "(Col A B X \ Col C D X) \ (Col A C X \ Col B D X) \ (Col A D X \ Col B C X)" + using Coplanar_def assms by blast + then show ?thesis + using Coplanar_def col_permutation_4 by blast +qed + +lemma coplanar_perm_9: + assumes "Coplanar A B C D" + shows "Coplanar B C D A" +proof - + obtain X where P1: "(Col A B X \ Col C D X) \ (Col A C X \ Col B D X) \ (Col A D X \ Col B C X)" + using Coplanar_def assms by blast + then show ?thesis + using Coplanar_def col_permutation_4 by blast +qed + +lemma coplanar_perm_10: + assumes "Coplanar A B C D" + shows "Coplanar B D A C" +proof - + obtain X where P1: "(Col A B X \ Col C D X) \ (Col A C X \ Col B D X) \ (Col A D X \ Col B C X)" + using Coplanar_def assms by blast + then show ?thesis + using Coplanar_def col_permutation_4 by blast +qed + +lemma coplanar_perm_11: + assumes "Coplanar A B C D" + shows "Coplanar B D C A" +proof - + obtain X where P1: "(Col A B X \ Col C D X) \ (Col A C X \ Col B D X) \ (Col A D X \ Col B C X)" + using Coplanar_def assms by blast + then show ?thesis + using Coplanar_def col_permutation_4 by blast +qed + +lemma coplanar_perm_12: + assumes "Coplanar A B C D" + shows "Coplanar C A B D" +proof - + obtain X where P1: "(Col A B X \ Col C D X) \ (Col A C X \ Col B D X) \ (Col A D X \ Col B C X)" + using Coplanar_def assms by blast + then show ?thesis + using Coplanar_def col_permutation_4 by blast +qed + +lemma coplanar_perm_13: + assumes "Coplanar A B C D" + shows "Coplanar C A D B" +proof - + obtain X where P1: "(Col A B X \ Col C D X) \ (Col A C X \ Col B D X) \ (Col A D X \ Col B C X)" + using Coplanar_def assms by blast + then show ?thesis + using Coplanar_def col_permutation_4 by blast +qed + +lemma coplanar_perm_14: + assumes "Coplanar A B C D" + shows "Coplanar C B A D" +proof - + obtain X where P1: "(Col A B X \ Col C D X) \ (Col A C X \ Col B D X) \ (Col A D X \ Col B C X)" + using Coplanar_def assms by blast + then show ?thesis + using Coplanar_def col_permutation_4 by blast +qed + +lemma coplanar_perm_15: + assumes "Coplanar A B C D" + shows "Coplanar C B D A" +proof - + obtain X where P1: "(Col A B X \ Col C D X) \ (Col A C X \ Col B D X) \ (Col A D X \ Col B C X)" + using Coplanar_def assms by blast + then show ?thesis + using Coplanar_def col_permutation_4 by blast +qed + +lemma coplanar_perm_16: + assumes "Coplanar A B C D" + shows "Coplanar C D A B" +proof - + obtain X where P1: "(Col A B X \ Col C D X) \ (Col A C X \ Col B D X) \ (Col A D X \ Col B C X)" + using Coplanar_def assms by blast + then show ?thesis + using Coplanar_def col_permutation_4 by blast +qed + +lemma coplanar_perm_17: + assumes "Coplanar A B C D" + shows "Coplanar C D B A" +proof - + obtain X where P1: "(Col A B X \ Col C D X) \ (Col A C X \ Col B D X) \ (Col A D X \ Col B C X)" + using Coplanar_def assms by blast + then show ?thesis + using Coplanar_def col_permutation_4 by blast +qed + +lemma coplanar_perm_18: + assumes "Coplanar A B C D" + shows "Coplanar D A B C" +proof - + obtain X where P1: "(Col A B X \ Col C D X) \ (Col A C X \ Col B D X) \ (Col A D X \ Col B C X)" + using Coplanar_def assms by blast + then show ?thesis + using Coplanar_def col_permutation_4 by blast +qed + +lemma coplanar_perm_19: + assumes "Coplanar A B C D" + shows "Coplanar D A C B" +proof - + 
obtain X where P1: "(Col A B X \<and> Col C D X) \<or> (Col A C X \<and> Col B D X) \<or> (Col A D X \<and> Col B C X)"
+    using Coplanar_def assms by blast
+  then show ?thesis
+    using Coplanar_def col_permutation_4 by blast
+qed
+
+lemma coplanar_perm_20:
+  assumes "Coplanar A B C D"
+  shows "Coplanar D B A C"
+proof -
+  obtain X where P1: "(Col A B X \<and> Col C D X) \<or> (Col A C X \<and> Col B D X) \<or> (Col A D X \<and> Col B C X)"
+    using Coplanar_def assms by blast
+  then show ?thesis
+    using Coplanar_def col_permutation_4 by blast
+qed
+
+lemma coplanar_perm_21:
+  assumes "Coplanar A B C D"
+  shows "Coplanar D B C A"
+proof -
+  obtain X where P1: "(Col A B X \<and> Col C D X) \<or> (Col A C X \<and> Col B D X) \<or> (Col A D X \<and> Col B C X)"
+    using Coplanar_def assms by blast
+  then show ?thesis
+    using Coplanar_def col_permutation_4 by blast
+qed
+
+lemma coplanar_perm_22:
+  assumes "Coplanar A B C D"
+  shows "Coplanar D C A B"
+proof -
+  obtain X where P1: "(Col A B X \<and> Col C D X) \<or> (Col A C X \<and> Col B D X) \<or> (Col A D X \<and> Col B C X)"
+    using Coplanar_def assms by blast
+  then show ?thesis
+    using Coplanar_def col_permutation_4 by blast
+qed
+
+lemma coplanar_perm_23:
+  assumes "Coplanar A B C D"
+  shows "Coplanar D C B A"
+proof -
+  obtain X where P1: "(Col A B X \<and> Col C D X) \<or> (Col A C X \<and> Col B D X) \<or> (Col A D X \<and> Col B C X)"
+    using Coplanar_def assms by blast
+  then show ?thesis
+    using Coplanar_def col_permutation_4 by blast
+qed
+
+lemma ncoplanar_perm_1:
+  assumes "\<not> Coplanar A B C D"
+  shows "\<not> Coplanar A B D C"
+  using assms coplanar_perm_1 by blast
+
+lemma ncoplanar_perm_2:
+  assumes "\<not> Coplanar A B C D"
+  shows "\<not> Coplanar A C B D"
+  using assms coplanar_perm_2 by blast
+
+lemma ncoplanar_perm_3:
+  assumes "\<not> Coplanar A B C D"
+  shows "\<not> Coplanar A C D B"
+  using assms coplanar_perm_4 by blast
+
+lemma ncoplanar_perm_4:
+  assumes "\<not> Coplanar A B C D"
+  shows "\<not> Coplanar A D B C"
+  using assms coplanar_perm_3 by blast
+
+lemma ncoplanar_perm_5:
+  assumes "\<not> Coplanar A B C D"
+  shows "\<not> Coplanar A D C B"
+  using assms coplanar_perm_5 by blast
+
+lemma ncoplanar_perm_6:
+  assumes "\<not> Coplanar A B C D"
+  shows "\<not> Coplanar B A C D"
+  using assms coplanar_perm_6 by blast
+
+lemma ncoplanar_perm_7:
+  assumes "\<not> Coplanar A B C D"
+  shows "\<not> Coplanar B A D C"
+  using assms coplanar_perm_7 by blast
+
+lemma ncoplanar_perm_8:
+  assumes "\<not> Coplanar A B C D"
+  shows "\<not> Coplanar B C A D"
+  using assms coplanar_perm_12 by blast
+
+lemma ncoplanar_perm_9:
+  assumes "\<not> Coplanar A B C D"
+  shows "\<not> Coplanar B C D A"
+  using assms coplanar_perm_18 by blast
+
+lemma ncoplanar_perm_10:
+  assumes "\<not> Coplanar A B C D"
+  shows "\<not> Coplanar B D A C"
+  using assms coplanar_perm_13 by blast
+
+lemma ncoplanar_perm_11:
+  assumes "\<not> Coplanar A B C D"
+  shows "\<not> Coplanar B D C A"
+  using assms coplanar_perm_19 by blast
+
+lemma ncoplanar_perm_12:
+  assumes "\<not> Coplanar A B C D"
+  shows "\<not> Coplanar C A B D"
+  using assms coplanar_perm_8 by blast
+
+lemma ncoplanar_perm_13:
+  assumes "\<not> Coplanar A B C D"
+  shows "\<not> Coplanar C A D B"
+  using assms coplanar_perm_10 by blast
+
+lemma ncoplanar_perm_14:
+  assumes "\<not> Coplanar A B C D"
+  shows "\<not> Coplanar C B A D"
+  using assms coplanar_perm_14 by blast
+
+lemma ncoplanar_perm_15:
+  assumes "\<not> Coplanar A B C D"
+  shows "\<not> Coplanar C B D A"
+  using assms coplanar_perm_20 by blast
+
+lemma ncoplanar_perm_16:
+  assumes "\<not> Coplanar A B C D"
+  shows "\<not> Coplanar C D A B"
+  using assms coplanar_perm_16 by blast
+
+lemma ncoplanar_perm_17:
+  assumes "\<not> Coplanar A B C D"
+  shows "\<not> Coplanar C D B A"
+  using assms
    coplanar_perm_22 by blast
+
+lemma ncoplanar_perm_18:
+  assumes "\<not> Coplanar A B C D"
+  shows "\<not> Coplanar D A B C"
+  using assms coplanar_perm_9 by blast
+
+lemma ncoplanar_perm_19:
+  assumes "\<not> Coplanar A B C D"
+  shows "\<not> Coplanar D A C B"
+  using assms coplanar_perm_11 by blast
+
+lemma ncoplanar_perm_20:
+  assumes "\<not> Coplanar A B C D"
+  shows "\<not> Coplanar D B A C"
+  using assms coplanar_perm_15 by blast
+
+lemma ncoplanar_perm_21:
+  assumes "\<not> Coplanar A B C D"
+  shows "\<not> Coplanar D B C A"
+  using assms coplanar_perm_21 by blast
+
+lemma ncoplanar_perm_22:
+  assumes "\<not> Coplanar A B C D"
+  shows "\<not> Coplanar D C A B"
+  using assms coplanar_perm_17 by blast
+
+lemma ncoplanar_perm_23:
+  assumes "\<not> Coplanar A B C D"
+  shows "\<not> Coplanar D C B A"
+  using assms coplanar_perm_23 by blast
+
+lemma coplanar_trivial:
+  shows "Coplanar A A B C"
+  using Coplanar_def NCol_cases col_trivial_1 by blast
+
+lemma col__coplanar:
+  assumes "Col A B C"
+  shows "Coplanar A B C D"
+  using Coplanar_def assms not_col_distincts by blast
+
+lemma ncop__ncol:
+  assumes "\<not> Coplanar A B C D"
+  shows "\<not> Col A B C"
+  using assms col__coplanar by blast
+
+lemma ncop__ncols:
+  assumes "\<not> Coplanar A B C D"
+  shows "\<not> Col A B C \<and> \<not> Col A B D \<and> \<not> Col A C D \<and> \<not> Col B C D"
+  by (meson assms col__coplanar coplanar_perm_4 ncoplanar_perm_9)
+
+lemma bet__coplanar:
+  assumes "Bet A B C"
+  shows "Coplanar A B C D"
+  using assms bet_col ncop__ncol by blast
+
+lemma out__coplanar:
+  assumes "A Out B C"
+  shows "Coplanar A B C D"
+  using assms col__coplanar out_col by blast
+
+lemma midpoint__coplanar:
+  assumes "A Midpoint B C"
+  shows "Coplanar A B C D"
+  using assms midpoint_col ncop__ncol by blast
+
+lemma perp__coplanar:
+  assumes "A B Perp C D"
+  shows "Coplanar A B C D"
+proof -
+  obtain P where "P PerpAt A B C D"
+    using Perp_def assms by blast
+  then show ?thesis
+    using Coplanar_def perp_in_col by blast
+qed
+
+lemma ts__coplanar:
+  assumes "A B TS C D"
+  shows "Coplanar A B C D"
+  by (metis (full_types) Coplanar_def TS_def assms bet_col col_permutation_2 col_permutation_3)
+
+lemma reflectl__coplanar:
+  assumes "A B ReflectL C D"
+  shows "Coplanar A B C D"
+  by (metis (no_types) ReflectL_def Tarski_neutral_dimensionless.perp__coplanar Tarski_neutral_dimensionless_axioms assms col__coplanar col_trivial_1 ncoplanar_perm_17)
+
+lemma reflect__coplanar:
+  assumes "A B Reflect C D"
+  shows "Coplanar A B C D"
+  by (metis (no_types) Reflect_def Tarski_neutral_dimensionless.reflectl__coplanar Tarski_neutral_dimensionless_axioms assms col_trivial_2 ncop__ncols)
+
+lemma inangle__coplanar:
+  assumes "A InAngle B C D"
+  shows "Coplanar A B C D"
+proof -
+  obtain X where P1: "Bet B X D \<and> (X = C \<or> C Out X A)"
+    using InAngle_def assms by auto
+  then show ?thesis
+    by (meson Col_cases Coplanar_def bet_col ncop__ncols out_col)
+qed
+
+lemma pars__coplanar:
+  assumes "A B ParStrict C D"
+  shows "Coplanar A B C D"
+  using ParStrict_def assms by auto
+
+lemma par__coplanar:
+  assumes "A B Par C D"
+  shows "Coplanar A B C D"
+  using Par_def assms ncop__ncols pars__coplanar by blast
+
+lemma plg__coplanar:
+  assumes "Plg A B C D"
+  shows "Coplanar A B C D"
+proof -
+  obtain M where "Bet A M C \<and> Bet B M D"
+    by (meson Plg_def assms midpoint_bet)
+  then show ?thesis
+    by (metis InAngle_def bet_out_1 inangle__coplanar ncop__ncols not_col_distincts)
+qed
+
+lemma plgs__coplanar:
+  assumes "ParallelogramStrict A B C D"
+  shows "Coplanar A B C D"
+  using ParallelogramStrict_def assms par__coplanar by blast
+
+lemma plgf__coplanar:
+  assumes
"ParallelogramFlat A B C D" + shows "Coplanar A B C D" + using ParallelogramFlat_def assms col__coplanar by auto + +lemma parallelogram__coplanar: + assumes "Parallelogram A B C D" + shows "Coplanar A B C D" + using Parallelogram_def assms plgf__coplanar plgs__coplanar by auto + +lemma rhombus__coplanar: + assumes "Rhombus A B C D" + shows "Coplanar A B C D" + using Rhombus_def assms plg__coplanar by blast + +lemma rectangle__coplanar: + assumes "Rectangle A B C D" + shows "Coplanar A B C D" + using Rectangle_def assms plg__coplanar by blast + +lemma square__coplanar: + assumes "Square A B C D" + shows "Coplanar A B C D" + using Square_def assms rectangle__coplanar by blast + +lemma lambert__coplanar: + assumes "Lambert A B C D" + shows "Coplanar A B C D" + using Lambert_def assms by presburger + +subsubsection "Planes" + +lemma ts_distincts: + assumes "A B TS P Q" + shows "A \ B \ A \ P \ A \ Q \ B \ P \ B \ Q \ P \ Q" + using TS_def assms bet_neq12__neq not_col_distincts by blast + +lemma l9_2: + assumes "A B TS P Q" + shows "A B TS Q P" + using TS_def assms between_symmetry by blast + +lemma invert_two_sides: + assumes "A B TS P Q" + shows "B A TS P Q" + using TS_def assms not_col_permutation_5 by blast + +lemma l9_3: + assumes "P Q TS A C" and + "Col M P Q" and + "M Midpoint A C" and + "Col R P Q" and + "R Out A B" + shows "P Q TS B C" +proof - + have P1: "\ Col A P Q" + using TS_def assms(1) by blast + have P2: "P \ Q" + using P1 not_col_distincts by auto + obtain T where P3: "Col T P Q \ Bet A T C" + using assms(2) assms(3) midpoint_bet by blast + have P4: "A \ C" + using assms(1) ts_distincts by blast + have P5: "T = M" + by (smt P1 P3 Tarski_neutral_dimensionless.bet_col1 Tarski_neutral_dimensionless_axioms assms(2) assms(3) col_permutation_2 l6_21 midpoint_bet) + have "P Q TS B C" + proof cases + assume "C = M" + then show ?thesis + using P4 assms(3) midpoint_distinct_1 by blast + next + assume P6: "\ C = M" + have P7: "\ Col B P Q" + by (metis P1 assms(4) assms(5) col_permutation_1 colx l6_3_1 out_col) + have P97: "Bet R A B \ Bet R B A" + using Out_def assms(5) by auto + { + assume Q1: "Bet R A B" + obtain B' where Q2: "M Midpoint B B'" + using symmetric_point_construction by blast + obtain R' where Q3: "M Midpoint R R'" + using symmetric_point_construction by blast + have Q4: "Bet B' C R'" + using Q1 Q2 Q3 assms(3) between_symmetry l7_15 by blast + obtain X where Q5: "Bet M X R' \ Bet C X B" + using Bet_perm Midpoint_def Q2 Q4 between_trivial2 l3_17 by blast + have Q6: "Col X P Q" + proof - + have R1: "Col P M R" + using P2 assms(2) assms(4) col_permutation_4 l6_16_1 by blast + have R2: "Col Q M R" + by (metis R1 assms(2) assms(4) col_permutation_5 l6_16_1 not_col_permutation_3) + { + assume "M = X" + then have "Col X P Q" + using assms(2) by blast + } + then have R3: "M = X \ Col X P Q" by simp + { + assume "M \ X" + then have S1: "M \ R'" + using Q5 bet_neq12__neq by blast + have "M \ R" + using Q3 S1 midpoint_distinct_1 by blast + then have "Col X P Q" + by (smt Col_perm Q3 Q5 R1 R2 S1 bet_out col_transitivity_2 midpoint_col out_col) + } + then have "M \ X \ Col X P Q" by simp + then show ?thesis using R3 by blast + qed + have "Bet B X C" + using Q5 between_symmetry by blast + then have "P Q TS B C" using Q6 + using P7 TS_def assms(1) by blast + } + then have P98: "Bet R A B \ P Q TS B C" by simp + { + assume S2: "Bet R B A" + have S3: "Bet C M A" + using Bet_perm P3 P5 by blast + then obtain X where "Bet B X C \ Bet M X R" + using S2 inner_pasch by blast + then have "P Q TS 
B C" + by (metis Col_def P7 TS_def assms(1) assms(2) assms(4) between_inner_transitivity between_trivial l6_16_1 not_col_permutation_5) + } + then have "Bet R B A \ P Q TS B C" by simp + then show ?thesis using P97 P98 + by blast + qed + then show ?thesis by blast +qed + +lemma mid_preserves_col: + assumes "Col A B C" and + "M Midpoint A A'" and + "M Midpoint B B'" and + "M Midpoint C C'" + shows "Col A' B' C'" + using Col_def assms(1) assms(2) assms(3) assms(4) l7_15 by auto + +lemma per_mid_per: + assumes (*"A \ B" and*) + "Per X A B" and + "M Midpoint A B" and + "M Midpoint X Y" + shows "Cong A X B Y \ Per Y B A" + by (meson Cong3_def Mid_perm assms(1) assms(2) assms(3) l7_13 l8_10) + +lemma sym_preserve_diff: + assumes "A \ B" and + "M Midpoint A A'" and + "M Midpoint B B'" + shows "A'\ B'" + using assms(1) assms(2) assms(3) l7_9 by blast + +lemma l9_4_1_aux_R1: + assumes "R = S" and + "S C Le R A" and + "P Q TS A C" and + "Col R P Q" and + "P Q Perp A R" and + "Col S P Q" and + "P Q Perp C S" and + "M Midpoint R S" + shows "\ U C'. M Midpoint U C' \ (R Out U A \ S Out C C')" +proof - + have P1: "M = R" + using assms(1) assms(8) l7_3 by blast + have P2: "\ Col A P Q" + using TS_def assms(3) by auto + then have P3: "P \ Q" + using not_col_distincts by blast + obtain T where P4: "Col T P Q \ Bet A T C" + using TS_def assms(3) by blast + { + assume "\ M = T" + then have "M PerpAt M T A M" using perp_col2 + by (metis P1 P4 assms(4) assms(5) not_col_permutation_3 perp_left_comm perp_perp_in) + then have "M T Perp C M" + using P1 P4 \M \ T\ assms(1) assms(4) assms(7) col_permutation_1 perp_col2 by blast + then have "Per T M A" + using \M PerpAt M T A M\ perp_in_per_3 by blast + have "Per T M C" + by (simp add: \M T Perp C M\ perp_per_1) + have "M = T" + proof - + have "Per C M T" + by (simp add: \Per T M C\ l8_2) + then show ?thesis using l8_6 l8_2 + using P4 \Per T M A\ by blast + qed + then have "False" + using \M \ T\ by blast + } + then have Q0: "M = T" by blast + have R1: "\ U C'. ((M Midpoint U C' \ M Out U A) \ M Out C C')" + proof - + { + fix U C' + assume Q1: "M Midpoint U C' \ M Out U A" + have Q2: "C \ M" + using P1 assms(1) assms(7) perp_not_eq_2 by blast + have Q3: "C' \ M" + using Q1 midpoint_not_midpoint out_diff1 by blast + have Q4: "Bet U M C" + using P4 Q0 Q1 bet_out__bet l6_6 by blast + then have "M Out C C'" + by (metis (full_types) Out_def Q1 Q2 Q3 l5_2 midpoint_bet) + } + then show ?thesis by blast + qed + have R2: "\ U C'. ((M Midpoint U C' \ M Out C C') \ M Out U A)" + proof - + { + fix U C' + assume Q1: "M Midpoint U C' \ M Out C C'" + have Q2: "C \ M" + using P1 assms(1) assms(7) perp_not_eq_2 by blast + have Q3: "C' \ M" + using Q1 l6_3_1 by blast + have Q4: "Bet U M C" + by (metis Out_def Q1 between_inner_transitivity midpoint_bet outer_transitivity_between) + then have "M Out U A" + by (metis P2 P4 Q0 Q1 Q2 Q3 l6_2 midpoint_distinct_1) + } + then show ?thesis by blast + qed + then show ?thesis + using R1 P1 P2 assms by blast +qed + +lemma l9_4_1_aux_R21: + assumes "R \ S" and + "S C Le R A" and + "P Q TS A C" and + "Col R P Q" and + "P Q Perp A R" and + "Col S P Q" and + "P Q Perp C S" and + "M Midpoint R S" + shows "\ U C'. 
M Midpoint U C' \ (R Out U A \ S Out C C')" +proof - + obtain D where P1: "Bet R D A \ Cong S C R D" + using Le_def assms(2) by blast + have P2: "C \ S" + using assms(7) perp_not_eq_2 by auto + have P3: "R \ D" + using P1 P2 cong_identity by blast + have P4: "R S Perp A R" + using assms(1) assms(4) assms(5) assms(6) not_col_permutation_2 perp_col2 by blast + have "\ M. (M Midpoint S R \ M Midpoint C D)" + proof - + have Q1: "\ Col A P Q" + using TS_def assms(3) by blast + have Q2: "P \ Q" + using Q1 not_col_distincts by blast + obtain T where Q3: "Col T P Q \ Bet A T C" + using TS_def assms(3) by blast + have Q4: "C S Perp S R" + by (metis NCol_perm assms(1) assms(4) assms(6) assms(7) perp_col0) + have Q5: "A R Perp S R" + using P4 Perp_perm by blast + have Q6: "Col S R T" + using Col_cases Q2 Q3 assms(4) assms(6) col3 by blast + have Q7: "Bet C T A" + using Bet_perm Q3 by blast + have Q8: "Bet R D A" + by (simp add: P1) + have "Cong S C R D" + by (simp add: P1) + then show ?thesis using P1 Q4 Q5 Q6 Q7 l8_24 by blast + qed + then obtain M' where P5: "M' Midpoint S R \ M' Midpoint C D" by blast + have P6: "M = M'" + by (meson P5 assms(8) l7_17_bis) + have L1: "\ U C'. (M Midpoint U C' \ R Out U A) \ S Out C C'" + proof - + { + fix U C' + assume R1: "M Midpoint U C' \ R Out U A" + have R2: "C \ S" + using P2 by auto + have R3: "C' \ S" + using P5 R1 P6 l7_9_bis out_diff1 by blast + have R4: "Bet S C C' \ Bet S C' C" + proof - + have R5: "Bet R U A \ Bet R A U" + using Out_def R1 by auto + { + assume "Bet R U A" + then have "Bet R U D \ Bet R D U" + using P1 l5_3 by blast + then have "Bet S C C' \ Bet S C' C" + using P5 P6 R1 l7_15 l7_2 by blast + } + then have R6: "Bet R U A \ Bet S C C' \ Bet S C' C" by simp + have "Bet R A U \ Bet S C C' \ Bet S C' C" + using P1 P5 P6 R1 between_exchange4 l7_15 l7_2 by blast + then show ?thesis using R5 R6 by blast + qed + then have "S Out C C'" + by (simp add: Out_def R2 R3) + } + then show ?thesis by simp + qed + have "\ U C'. (M Midpoint U C' \ S Out C C') \ R Out U A" + proof - + { + fix U C' + assume Q1: "M Midpoint U C' \ S Out C C'" + then have Q2: "U \ R" + using P5 P6 l7_9_bis out_diff2 by blast + have Q3: "A \ R" + using assms(5) perp_not_eq_2 by auto + have Q4: "Bet S C C' \ Bet S C' C" + using Out_def Q1 by auto + { + assume V0: "Bet S C C'" + have V1: "R \ D" + by (simp add: P3) + then have V2: "Bet R D U" + proof - + have W1: "M Midpoint S R" + using P5 P6 by blast + have W2: "M Midpoint C D" + by (simp add: P5 P6) + have "M Midpoint C' U" + by (simp add: Q1 l7_2) + then show ?thesis + using V0 P5 P6 l7_15 by blast + qed + have "Bet R D A" + using P1 by auto + then have "Bet R U A \ Bet R A U" + using V1 V2 l5_1 by blast + } + then have Q5: "Bet S C C' \ Bet R U A \ Bet R A U" by simp + { + assume R1: "Bet S C' C" + have "Bet R U A" + using P1 P5 P6 Q1 R1 between_exchange4 l7_15 l7_2 by blast + } + then have "Bet S C' C \ Bet R U A \ Bet R A U" by simp + then have "Bet R U A \ Bet R A U" + using Q4 Q5 by blast + then have "R Out U A" + by (simp add: Out_def Q2 Q3) + } + then show ?thesis by simp + qed + then show ?thesis + using L1 by blast +qed + +lemma l9_4_1_aux: + assumes "S C Le R A" and + "P Q TS A C" and + "Col R P Q" and + "P Q Perp A R" and + "Col S P Q" and + "P Q Perp C S" and + "M Midpoint R S" + shows "\ U C'. 
(M Midpoint U C' \ (R Out U A \ S Out C C'))" + using l9_4_1_aux_R1 l9_4_1_aux_R21 assms by smt + +lemma per_col_eq: + assumes "Per A B C" and + "Col A B C" and + "B \ C" + shows "A = B" + using assms(1) assms(2) assms(3) l8_9 by blast + +lemma l9_4_1: + assumes "P Q TS A C" and + "Col R P Q" and + "P Q Perp A R" and + "Col S P Q" and + "P Q Perp C S" and + "M Midpoint R S" + shows "\ U C'. M Midpoint U C' \ (R Out U A \ S Out C C')" +proof - + have P1: "S C Le R A \ R A Le S C" + using local.le_cases by blast + { + assume Q1: "S C Le R A" + { + fix U C' + assume "M Midpoint U C'" + then have "(R Out U A \ S Out C C')" + using Q1 assms(1) assms(2) assms(3) assms(4) assms(5) assms(6) l9_4_1_aux by blast + } + then have "\ U C'. M Midpoint U C' \ (R Out U A \ S Out C C')" by simp + } + then have P2: "S C Le R A \ (\ U C'. M Midpoint U C' \ (R Out U A \ S Out C C'))" by simp + { + assume Q2: " R A Le S C" + { + fix U C' + assume "M Midpoint U C'" + then have "(R Out A U \ S Out C' C)" + using Q2 assms(1) assms(2) assms(3) assms(4) assms(5) assms(6) l7_2 l9_2 l9_4_1_aux by blast + then have "(R Out U A \ S Out C C')" + using l6_6 by blast + } + then have "\ U C'. M Midpoint U C' \ (R Out U A \ S Out C C')" by simp + } + then have P3: "R A Le S C \ (\ U C'. M Midpoint U C' \ (R Out U A \ S Out C C'))" by simp + + then show ?thesis + using P1 P2 by blast +qed + +lemma mid_two_sides: + assumes "M Midpoint A B" and + "\ Col A B X" and + "M Midpoint X Y" + shows "A B TS X Y" +proof - + have f1: "\ Col Y A B" + by (meson Mid_cases Tarski_neutral_dimensionless.mid_preserves_col Tarski_neutral_dimensionless_axioms assms(1) assms(2) assms(3) col_permutation_3) + have "Bet X M Y" + using assms(3) midpoint_bet by blast + then show ?thesis + using f1 by (metis (no_types) TS_def assms(1) assms(2) col_permutation_1 midpoint_col) +qed + + +lemma col_preserves_two_sides: + assumes "C \ D" and + "Col A B C" and + "Col A B D" and + "A B TS X Y" + shows "C D TS X Y" +proof - + have P1: "\ Col X A B" + using TS_def assms(4) by blast + then have P2: "A \ B" + using not_col_distincts by blast + have P3: "\ Col X C D" + by (metis Col_cases P1 Tarski_neutral_dimensionless.colx Tarski_neutral_dimensionless_axioms assms(1) assms(2) assms(3)) + have P4: "\ Col Y C D" + by (metis Col_cases TS_def Tarski_neutral_dimensionless.colx Tarski_neutral_dimensionless_axioms assms(1) assms(2) assms(3) assms(4)) + then show ?thesis + proof - + obtain pp :: "'p \ 'p \ 'p \ 'p \ 'p" where + "\x0 x1 x2 x3. (\v4. 
Col v4 x3 x2 \ Bet x1 v4 x0) = (Col (pp x0 x1 x2 x3) x3 x2 \ Bet x1 (pp x0 x1 x2 x3) x0)" + by moura + then have f1: "\ Col X A B \ \ Col Y A B \ Col (pp Y X B A) A B \ Bet X (pp Y X B A) Y" + using TS_def assms(4) by presburger + then have "Col (pp Y X B A) C D" + by (meson P2 assms(2) assms(3) col3 not_col_permutation_3 not_col_permutation_4) + then show ?thesis + using f1 TS_def P3 P4 by blast + qed +qed + +lemma out_out_two_sides: + assumes "A \ B" and + "A B TS X Y" and + "Col I A B" and + "Col I X Y" and + "I Out X U" and + "I Out Y V" + shows "A B TS U V" +proof - + have P0: "\ Col X A B" + using TS_def assms(2) by blast + then have P1: "\ Col V A B" + by (smt assms(2) assms(3) assms(4) assms(6) col_out2_col col_transitivity_1 not_col_permutation_3 not_col_permutation_4 out_diff2 out_trivial ts_distincts) + have P2: "\ Col U A B" + by (metis P0 assms(3) assms(5) col_permutation_2 colx out_col out_distinct) + obtain T where P3: "Col T A B \ Bet X T Y" + using TS_def assms(2) by blast + have "I = T" + proof - + have f1: "\p pa pb. \ Col p pa pb \ \ Col p pb pa \ \ Col pa p pb \ \ Col pa pb p \ \ Col pb p pa \ \ Col pb pa p \ Col p pa pb" + using Col_cases by blast + then have f2: "Col X Y I" + using assms(4) by blast + have f3: "Col B A I" + using f1 assms(3) by blast + have f4: "Col B A T" + using f1 P3 by blast + have f5: "\ Col X A B \ \ Col X B A \ \ Col A X B \ \ Col A B X \ \ Col B X A \ \ Col B A X" + using f1 \\ Col X A B\ by blast + have f6: "A \ B \ A \ X \ A \ Y \ B \ X \ B \ Y \ X \ Y" + using assms(2) ts_distincts by presburger + have "Col X Y T" + using f1 by (meson P3 bet_col) + then show ?thesis + using f6 f5 f4 f3 f2 by (meson Tarski_neutral_dimensionless.l6_21 Tarski_neutral_dimensionless_axioms) + qed + then have "Bet U T V" + using P3 assms(5) assms(6) bet_out_out_bet by blast + then show ?thesis + using P1 P2 P3 TS_def by blast +qed + +lemma l9_4_2_aux_R1: + assumes "R = S " and + "S C Le R A" and + "P Q TS A C" and + "Col R P Q" and + "P Q Perp A R" and + "Col S P Q" and + "P Q Perp C S" and + "R Out U A" and + "S Out V C" + shows "P Q TS U V" +proof - + have "\ Col A P Q" + using TS_def assms(3) by auto + then have P2: "P \ Q" + using not_col_distincts by blast + obtain T where P3: "Col T P Q \ Bet A T C" + using TS_def assms(3) by blast + have "R = T" using assms(1) assms(5) assms(6) assms(7) col_permutation_1 l8_16_1 l8_6 + by (meson P3) + then show ?thesis + by (smt P2 P3 assms(1) assms(3) assms(8) assms(9) bet_col col_transitivity_2 l6_6 not_col_distincts out_out_two_sides) +qed + +lemma l9_4_2_aux_R2: + assumes "R \ S" and + "S C Le R A" and + "P Q TS A C" and + "Col R P Q" and + "P Q Perp A R" and + "Col S P Q" and + "P Q Perp C S" and + "R Out U A" and + "S Out V C" + shows "P Q TS U V" +proof - + have P1: "P \ Q" + using assms(7) perp_distinct by auto + have P2: "R S TS A C" + using assms(1) assms(3) assms(4) assms(6) col_permutation_1 col_preserves_two_sides by blast + have P3: "Col R S P" + using P1 assms(4) assms(6) col2__eq not_col_permutation_1 by blast + have P4: "Col R S Q" + by (metis P3 Tarski_neutral_dimensionless.colx Tarski_neutral_dimensionless_axioms assms(4) assms(6) col_trivial_2) + have P5: "R S Perp A R" + using NCol_perm assms(1) assms(4) assms(5) assms(6) perp_col2 by blast + have P6: "R S Perp C S" + using assms(1) assms(4) assms(6) assms(7) col_permutation_1 perp_col2 by blast + have P7: "\ Col A R S" + using P2 TS_def by blast + obtain T where P8: "Col T R S \ Bet A T C" + using P2 TS_def by blast + obtain C' where P9: "Bet R C' A 
\ Cong S C R C'" + using Le_def assms(2) by blast + have "\ X. X Midpoint S R \ X Midpoint C C'" + proof - + have Q1: "C S Perp S R" + using P6 Perp_perm by blast + have Q2: "A R Perp S R" + using P5 Perp_perm by blast + have Q3: "Col S R T" + using Col_cases P8 by blast + have Q4: "Bet C T A" + using Bet_perm P8 by blast + have Q5: "Bet R C' A" + by (simp add: P9) + have "Cong S C R C'" + by (simp add: P9) + then show ?thesis using Q1 Q2 Q3 Q4 Q5 l8_24 + by blast + qed + then obtain M where P10: "M Midpoint S R \ M Midpoint C C'" by blast + obtain U' where P11: "M Midpoint U U'" + using symmetric_point_construction by blast + have P12: "R \ U" + using assms(8) out_diff1 by blast + have P13: "R S TS U U'" + by (smt P10 P11 P12 P7 assms(8) col_transitivity_2 invert_two_sides mid_two_sides not_col_permutation_3 not_col_permutation_4 out_col) + have P14: "R S TS V U" + proof - + have Q1: "Col M R S" + using P10 midpoint_col not_col_permutation_5 by blast + have Q2: "M Midpoint U' U" + by (meson P11 Tarski_neutral_dimensionless.Mid_cases Tarski_neutral_dimensionless_axioms) + have "S Out U' V" + by (meson P10 P11 P2 P5 P6 Tarski_neutral_dimensionless.l7_2 Tarski_neutral_dimensionless_axioms assms(1) assms(2) assms(8) assms(9) l6_6 l6_7 l9_4_1_aux_R21 not_col_distincts) + then show ?thesis + using P13 Q1 Q2 col_trivial_3 l9_2 l9_3 by blast + qed + then show ?thesis + using P1 P3 P4 col_preserves_two_sides l9_2 by blast +qed + +lemma l9_4_2_aux: + assumes "S C Le R A" and + "P Q TS A C" and + "Col R P Q" and + "P Q Perp A R" and + "Col S P Q" and + "P Q Perp C S" and + "R Out U A" and + "S Out V C" + shows "P Q TS U V" + using l9_4_2_aux_R1 l9_4_2_aux_R2 + by (metis assms(1) assms(2) assms(3) assms(4) assms(5) assms(6) assms(7) assms(8)) + +lemma l9_4_2: + assumes "P Q TS A C" and + "Col R P Q" and + "P Q Perp A R" and + "Col S P Q" and + "P Q Perp C S" and + "R Out U A" and + "S Out V C" + shows "P Q TS U V" +proof - + have P1: "S C Le R A \ R A Le S C" + by (simp add: local.le_cases) + have P2: "S C Le R A \ P Q TS U V" + by (simp add: assms(1) assms(2) assms(3) assms(4) assms(5) assms(6) assms(7) l9_4_2_aux) + have "R A Le S C \ P Q TS U V" + by (simp add: assms(1) assms(2) assms(3) assms(4) assms(5) assms(6) assms(7) l9_2 l9_4_2_aux) + then show ?thesis + using P1 P2 by blast +qed + +lemma l9_5: + assumes "P Q TS A C" and + "Col R P Q" and + "R Out A B" + shows "P Q TS B C" +proof - + have P1: "P \ Q" + using assms(1) ts_distincts by blast + obtain A' where P2: "Col P Q A' \ P Q Perp A A'" + by (metis NCol_perm Tarski_neutral_dimensionless.TS_def Tarski_neutral_dimensionless_axioms assms(1) l8_18_existence) + obtain C' where P3: "Col P Q C' \ P Q Perp C C'" + using Col_perm TS_def assms(1) l8_18_existence by blast + obtain M where P5: "M Midpoint A' C'" + using midpoint_existence by blast + obtain D where S2: "M Midpoint A D" + using symmetric_point_construction by auto + have "\ B0. 
Col P Q B0 \ P Q Perp B B0" + proof - + have S1: "\ Col P Q B" + by (metis P2 Tarski_neutral_dimensionless.colx Tarski_neutral_dimensionless.perp_not_col2 Tarski_neutral_dimensionless_axioms assms(2) assms(3) col_permutation_1 l6_3_1 out_col) + then show ?thesis + by (simp add: l8_18_existence) + qed + then obtain B' where P99: "Col P Q B' \ P Q Perp B B'" by blast + have "P Q TS B C" + proof - + have S3: "C' Out D C \ A' Out A A" + using Out_cases P2 P3 P5 S2 assms(1) l9_4_1 not_col_permutation_1 by blast + then have S4: "C' Out D C" + using P2 Tarski_neutral_dimensionless.perp_not_eq_2 Tarski_neutral_dimensionless_axioms out_trivial by fastforce + have S5: "P Q TS A D" + using P2 P3 S3 S4 assms(1) col_permutation_2 l9_4_2 by blast + { + assume "A' \ C'" + then have "Col M P Q" + by (smt P2 P3 P5 col_trivial_2 l6_21 midpoint_col not_col_permutation_1) + then have "P Q TS B D" + using S2 S5 assms(2) assms(3) l9_3 by blast + } + then have "A' \ C' \ P Q TS B D" by simp + then have S6: "P Q TS B D" + by (metis P3 P5 S2 S5 assms(2) assms(3) l9_3 midpoint_distinct_2 not_col_permutation_1) + have S7: "Col B' P Q" + using Col_perm P99 by blast + have S8: "P Q Perp B B'" + using P99 by blast + have S9: "Col C' P Q" + using Col_cases P3 by auto + have S10: "P Q Perp D C'" + by (metis Col_perm P3 S4 l6_3_1 out_col perp_col1 perp_right_comm) + have S11: "B' Out B B" + by (metis (no_types) P99 out_trivial perp_not_eq_2) + have "C' Out C D" + by (simp add: S4 l6_6) + then show ?thesis using S6 S7 S8 S9 S10 S11 l9_4_2 by blast + qed + then show ?thesis using l8_18_existence by blast +qed + +lemma outer_pasch_R1: + assumes "Col P Q C" and + "Bet A C P" and + "Bet B Q C" + shows "\ X. Bet A X B \ Bet P Q X" + by (smt Bet_perm Col_def assms(1) assms(2) assms(3) between_exchange3 between_trivial outer_transitivity_between2) + +lemma outer_pasch_R2: + assumes "\ Col P Q C" and + "Bet A C P" and + "Bet B Q C" + shows "\ X. Bet A X B \ Bet P Q X" +proof cases + assume "B = Q" + then show ?thesis + using between_trivial by blast +next + assume P1: "B \ Q" + have P2: "A \ P" + using assms(1) assms(2) between_identity col_trivial_3 by blast + have P3: "P \ Q" + using assms(1) col_trivial_1 by blast + have P4: "P \ B" + using assms(1) assms(3) bet_col by blast + have P5: "P Q TS C B" + proof - + have Q1: "\ Col C P Q" + using Col_cases assms(1) by blast + have Q2: "\ Col B P Q" + by (metis Col_cases P1 Tarski_neutral_dimensionless.colx Tarski_neutral_dimensionless_axioms assms(1) assms(3) bet_col col_trivial_2) + have "\ T. Col T P Q \ Bet C T B" + using Col_cases assms(3) between_symmetry col_trivial_2 by blast + then show ?thesis + by (simp add: Q1 Q2 TS_def) + qed + have P6: "P Q TS A B" + by (metis P5 assms(1) assms(2) bet_out_1 l9_5 not_col_distincts) + obtain X where P7: "Col X P Q \ Bet A X B" + using P6 TS_def by blast + have "Bet P Q X" + proof - + obtain T where P8: "Bet X T P \ Bet C T B" + using P7 assms(2) between_symmetry inner_pasch by blast + have P9: "B \ C" + using P1 assms(3) bet_neq12__neq by blast + have P10: "T = Q" + proof - + have f1: "\p pa pb. 
Col pb pa p \ \ Bet pb pa p" + by (meson bet_col1 between_trivial) + then have f2: "Col Q C B" + using NCol_cases assms(3) by blast + have "Col T C B" + using f1 NCol_cases P8 by blast + then show ?thesis + using f2 f1 by (metis (no_types) NCol_cases P7 P8 assms(1) between_trivial l6_16_1 l6_2 not_bet_and_out) + qed + then show ?thesis + using P8 between_symmetry by blast + qed + then show ?thesis using P7 by blast +qed + +lemma outer_pasch: + assumes "Bet A C P" and + "Bet B Q C" + shows "\ X. Bet A X B \ Bet P Q X" + using assms(1) assms(2) outer_pasch_R1 outer_pasch_R2 by blast + +lemma os_distincts: + assumes "A B OS X Y" + shows "A \ B \ A \ X \ A \ Y \ B \ X \ B \ Y" + using OS_def assms ts_distincts by blast + +lemma invert_one_side: + assumes "A B OS P Q" + shows "B A OS P Q" +proof - + obtain T where "A B TS P T \ A B TS Q T" + using OS_def assms by blast + then have "B A TS P T \ B A TS Q T" + using invert_two_sides by blast + thus ?thesis + using OS_def by blast +qed + +lemma l9_8_1: + assumes "P Q TS A C" and + "P Q TS B C" + shows "P Q OS A B" +proof - + have "\ R::'p. (P Q TS A R \ P Q TS B R)" + using assms(1) assms(2) by blast + then show ?thesis + using OS_def by blast +qed + +lemma not_two_sides_id: + shows "\ P Q TS A A" + using ts_distincts by blast + +lemma l9_8_2: + assumes "P Q TS A C" and + "P Q OS A B" + shows "P Q TS B C" +proof - + obtain D where P1: "P Q TS A D \ P Q TS B D" + using assms(2) OS_def by blast + then have "P \ Q" + using ts_distincts by blast + obtain T where P2: "Col T P Q \ Bet A T C" + using TS_def assms(1) by blast + obtain X where P3: "Col X P Q \ Bet A X D" + using TS_def P1 by blast + obtain Y where P4: "Col Y P Q \ Bet B Y D" + using TS_def P1 by blast + then obtain M where P5: "Bet Y M A \ Bet X M B" using P3 inner_pasch by blast + have P6: "A \ D" + using P1 ts_distincts by blast + have P7: "B \ D" + using P1 not_two_sides_id by blast + { + assume Q0: "Col A B D" + have "P Q TS B C" + proof cases + assume Q1: "M = Y" + have "X = Y" + proof - + have S1: "\ Col P Q A" + using TS_def assms(1) not_col_permutation_1 by blast + have S3: "Col P Q X" + using Col_perm P3 by blast + have S4: "Col P Q Y" + using Col_perm P4 by blast + have S5: "Col A D X" + by (simp add: P3 bet_col col_permutation_5) + have "Col A D Y" + by (metis Col_def P5 Q1 S5 Q0 between_equality between_trivial l6_16_1) + then show ?thesis using S1 S3 S4 S5 P6 l6_21 + by blast + qed + then have "X Out A B" + by (metis P1 P3 P4 TS_def l6_2) + then show ?thesis using assms(1) P3 l9_5 by blast + next + assume Z1: "\ M = Y" + have "X = Y" + proof - + have S1: "\ Col P Q A" + using TS_def assms(1) not_col_permutation_1 by blast + have S3: "Col P Q X" + using Col_perm P3 by blast + have S4: "Col P Q Y" + using Col_perm P4 by blast + have S5: "Col A D X" + by (simp add: P3 bet_col col_permutation_5) + have "Col A D Y" + by (metis Col_def P4 Q0 P7 l6_16_1) + then show ?thesis using S1 S3 S4 S5 P6 l6_21 + by blast + qed + then have Z3: "M \ X" using Z1 by blast + have Z4: "P Q TS M C" + by (meson Out_cases P4 P5 Tarski_neutral_dimensionless.l9_5 Tarski_neutral_dimensionless_axioms Z1 assms(1) bet_out) + have "X Out M B" + using P5 Z3 bet_out by auto + then show ?thesis using Z4 P3 l9_5 by blast + qed + } + then have Z99: "Col A B D \ P Q TS B C" by blast + { + assume Q0: "\ Col A B D" + have Q1: "P Q TS M C" + proof - + have S3: "Y Out A M" + proof - + have T1: "A \ Y" + using Col_def P4 Q0 col_permutation_4 by blast + have T2: "M \ Y" + proof - + { + assume T3: "M = Y" + have "Col B D 
X" + proof - + have U1: "B \ M" + using P1 P4 T3 TS_def by blast + have U2: "Col B M D" + by (simp add: P4 T3 bet_col) + have "Col B M X" + by (simp add: P5 bet_col between_symmetry) + then show ?thesis using U1 U2 + using col_transitivity_1 by blast + qed + have "False" + by (metis NCol_cases P1 P3 TS_def \Col B D X\ Q0 bet_col col_trivial_2 l6_21) + } + then show ?thesis by blast + qed + have "Bet Y A M \ Bet Y M A" using P5 by blast + then show ?thesis using T1 T2 + by (simp add: Out_def) + qed + then have "X Out M B" + by (metis P1 P3 P4 P5 TS_def bet_out l9_5) + then show ?thesis using assms(1) S3 l9_5 P3 P4 by blast + qed + have "X Out M B" + by (metis P3 P5 Q1 TS_def bet_out) + then have "P Q TS B C" using Q1 P3 l9_5 by blast + } + then have "\ Col A B D \ P Q TS B C" by blast + then show ?thesis using Z99 by blast +qed + +lemma l9_9: + assumes "P Q TS A B" + shows "\ P Q OS A B" + using assms l9_8_2 not_two_sides_id by blast + +lemma l9_9_bis: + assumes "P Q OS A B" + shows "\ P Q TS A B" + using assms l9_9 by blast + +lemma one_side_chara: + assumes "P Q OS A B" + shows "\ X. Col X P Q \ \ Bet A X B" +proof - + have "\ Col A P Q \ \ Col B P Q" + using OS_def TS_def assms by auto + then show ?thesis + using l9_9_bis TS_def assms by blast +qed + +lemma l9_10: + assumes "\ Col A P Q" + shows "\ C. P Q TS A C" + by (meson Col_perm assms mid_two_sides midpoint_existence symmetric_point_construction) + +lemma one_side_reflexivity: + assumes "\ Col A P Q" + shows "P Q OS A A" + using assms l9_10 l9_8_1 by blast + +lemma one_side_symmetry: + assumes "P Q OS A B" + shows "P Q OS B A" + by (meson Tarski_neutral_dimensionless.OS_def Tarski_neutral_dimensionless_axioms assms invert_two_sides) + +lemma one_side_transitivity: + assumes "P Q OS A B" and + "P Q OS B C" + shows "P Q OS A C" + by (meson Tarski_neutral_dimensionless.OS_def Tarski_neutral_dimensionless.l9_8_2 Tarski_neutral_dimensionless_axioms assms(1) assms(2)) + +lemma l9_17: + assumes "P Q OS A C" and + "Bet A B C" + shows "P Q OS A B" +proof cases + assume "A = C" + then show ?thesis + using assms(1) assms(2) between_identity by blast +next + assume P1: "\ A = C" + obtain D where P2: "P Q TS A D \ P Q TS C D" + using OS_def assms(1) by blast + then have P3: "P \ Q" + using ts_distincts by blast + obtain X where P4: "Col X P Q \ Bet A X D" + using P2 TS_def by blast + obtain Y where P5: "Col Y P Q \ Bet C Y D" + using P2 TS_def by blast + obtain T where P6: "Bet B T D \ Bet X T Y" + using P4 P5 assms(2) l3_17 by blast + have P7: "P Q TS A D" + by (simp add: P2) + have "P Q TS B D" + proof - + have Q1: "\ Col B P Q" + using assms(1) assms(2) one_side_chara by blast + have Q2: "\ Col D P Q" + using P2 TS_def by blast + obtain T0 where "Col T0 P Q \ Bet B T0 D" + proof - + assume a1: "\T0. Col T0 P Q \ Bet B T0 D \ thesis" + obtain pp :: 'p where + f2: "Bet B pp D \ Bet X pp Y" + using \\thesis. (\T. 
Bet B T D \ Bet X T Y \ thesis) \ thesis\ by blast + have "Col P Q Y" + using Col_def P5 by blast + then have "Y = X \ Col P Q pp" + using f2 Col_def P4 colx by blast + then show ?thesis + using f2 a1 by (metis BetSEq BetS_def Col_def P4) + qed + then show ?thesis using Q1 Q2 + using TS_def by blast + qed + then show ?thesis using P7 + using OS_def by blast +qed + +lemma l9_18_R1: + assumes "Col X Y P" and + "Col A B P" + and "X Y TS A B" + shows "Bet A P B \ \ Col X Y A \ \ Col X Y B" + by (meson TS_def assms(1) assms(2) assms(3) col_permutation_5 l9_5 not_col_permutation_1 not_out_bet not_two_sides_id) + +lemma l9_18_R2: + assumes "Col X Y P" and + "Col A B P" and + "Bet A P B" and + "\ Col X Y A" and + "\ Col X Y B" + shows "X Y TS A B" + using Col_perm TS_def assms(1) assms(3) assms(4) assms(5) by blast + +lemma l9_18: + assumes "Col X Y P" and + "Col A B P" + shows "X Y TS A B \ (Bet A P B \ \ Col X Y A \ \ Col X Y B)" + using l9_18_R1 l9_18_R2 assms(1) assms(2) by blast + +lemma l9_19_R1: + assumes "Col X Y P" and + "Col A B P" and + "X Y OS A B" + shows "P Out A B \ \ Col X Y A" + by (meson OS_def TS_def assms(1) assms(2) assms(3) col_permutation_5 not_col_permutation_1 not_out_bet one_side_chara) + +lemma l9_19_R2: + assumes "Col X Y P" and + (* "Col A B P" and *) + "P Out A B" and + "\ Col X Y A" + shows "X Y OS A B" +proof - + obtain D where "X Y TS A D" + using Col_perm assms(3) l9_10 by blast + then show ?thesis + using OS_def assms(1) assms(2) l9_5 not_col_permutation_1 by blast +qed + +lemma l9_19: + assumes "Col X Y P" and + "Col A B P" + shows "X Y OS A B \ (P Out A B \ \ Col X Y A)" + using l9_19_R1 l9_19_R2 assms(1) assms(2) by blast + +lemma one_side_not_col123: + assumes "A B OS X Y" + shows "\ Col A B X" + using assms col_trivial_3 l9_19 by blast + +lemma one_side_not_col124: + assumes "A B OS X Y" + shows "\ Col A B Y" + using assms one_side_not_col123 one_side_symmetry by blast + +lemma col_two_sides: + assumes "Col A B C" and + "A \ C" and + "A B TS P Q" + shows "A C TS P Q" + using assms(1) assms(2) assms(3) col_preserves_two_sides col_trivial_3 by blast + +lemma col_one_side: + assumes "Col A B C" and + "A \ C" and + "A B OS P Q" + shows "A C OS P Q" +proof - + obtain T where "A B TS P T \ A B TS Q T" using assms(1) assms(2) assms(3) OS_def by blast + then show ?thesis + using col_two_sides OS_def assms(1) assms(2) by blast +qed + + +lemma out_out_one_side: + assumes "A B OS X Y" and + "A Out Y Z" + shows "A B OS X Z" + by (meson Col_cases Tarski_neutral_dimensionless.OS_def Tarski_neutral_dimensionless_axioms assms(1) assms(2) col_trivial_3 l9_5) + +lemma out_one_side: + assumes "\ Col A B X \ \ Col A B Y" and + "A Out X Y" + shows "A B OS X Y" + using assms(1) assms(2) l6_6 not_col_permutation_2 one_side_reflexivity one_side_symmetry out_out_one_side by blast + +lemma bet__ts: + assumes "A \ Y" and + "\ Col A B X" and + "Bet X A Y" + shows "A B TS X Y" +proof - + have "\ Col Y A B" + using NCol_cases assms(1) assms(2) assms(3) bet_col col2__eq by blast + then show ?thesis + by (meson TS_def assms(2) assms(3) col_permutation_3 col_permutation_5 col_trivial_3) +qed + +lemma bet_ts__ts: + assumes "A B TS X Y" and + "Bet X Y Z" + shows "A B TS X Z" +proof - + have "\ Col Z A B" + using assms(1) assms(2) bet_col between_equality_2 col_permutation_1 l9_18 by blast + then show ?thesis + using TS_def assms(1) assms(2) between_exchange4 by blast +qed + +lemma bet_ts__os: + assumes "A B TS X Y" and + "Bet X Y Z" + shows "A B OS Y Z" + using OS_def assms(1) assms(2) 
bet_ts__ts l9_2 by blast + +lemma l9_31 : + assumes "A X OS Y Z" and + "A Z OS Y X" + shows "A Y TS X Z" +proof - + have P1: "A \ X \ A \ Z \ \ Col Y A X \ \ Col Z A X \ \ Col Y A Z" + using assms(1) assms(2) col_permutation_1 one_side_not_col123 one_side_not_col124 os_distincts by blast + obtain Z' where P2: "Bet Z A Z' \ Cong A Z' Z A" + using segment_construction by blast + have P3: "Z' \ A" + using P1 P2 cong_diff_4 by blast + have P4: "A X TS Y Z'" + by (metis (no_types) P2 P3 assms(1) bet__ts l9_8_2 one_side_not_col124 one_side_symmetry) + have P5: "\ Col Y A X" + using P1 by blast + obtain T where P6: "Col A T X \ Bet Y T Z'" + using P4 TS_def not_col_permutation_4 by blast + then have P7: "T \ A" + proof - + have "\ Col A Z Y" + by (simp add: P1 not_col_permutation_1) + then have f1: "\ A Out Z Y" + using out_col by blast + have "A \ Z'" + using P1 P2 cong_diff_4 by blast + then show ?thesis + using f1 by (metis (no_types) P1 P2 P6 l6_2) + qed + have P8: "Y A OS Z' T" + by (smt P1 P2 P3 P6 Tarski_neutral_dimensionless.l6_6 Tarski_neutral_dimensionless_axioms bet_col bet_out col_trivial_2 l6_21 not_col_permutation_1 out_one_side) + have P9: "A Y TS Z' Z" + using Col_perm P1 P2 P8 bet__ts between_symmetry one_side_not_col123 by blast + { + assume Q0: "Bet T A X" + have Q1: "Z' Z OS Y T" + by (metis BetSEq BetS_def P1 P2 P4 P6 TS_def Tarski_neutral_dimensionless.l6_6 Tarski_neutral_dimensionless_axioms bet_col bet_out_1 col_trivial_3 colx not_col_permutation_3 not_col_permutation_4 out_one_side) + then have Q2: "Z' Out T Y" + by (metis P6 bet_out_1 os_distincts) + then have Q3: "A Z OS Y T" + by (meson Out_cases P1 P2 P6 bet_col col_permutation_3 invert_one_side l9_19_R2) + have "A Z TS X T" + proof - + have R1: "\ Col X A Z" + using P1 col_permutation_3 by blast + have R2: "\ Col T A Z" + using Q3 between_trivial one_side_chara by blast + have "\ T0. 
Col T0 A Z \ Bet X T0 T" + proof - + have S1: "Col A A Z" + by (simp add: col_trivial_1) + have "Bet X A T" + by (simp add: Q0 between_symmetry) + then show ?thesis using S1 by blast + qed + then show ?thesis using R1 R2 + using TS_def by auto + qed + have "A Y TS X Z" + by (meson Q3 Tarski_neutral_dimensionless.l9_8_2 Tarski_neutral_dimensionless.one_side_symmetry Tarski_neutral_dimensionless_axioms \A Z TS X T\ assms(2) l9_9_bis) + } + then have P10: "Bet T A X \ A Y TS X Z" by blast + { + assume R1: "Bet A X T" + then have R3: "A Y OS Z' X" + by (meson Bet_cases P1 P6 P8 R1 between_equality invert_one_side not_col_permutation_4 not_out_bet out_out_one_side) + have "A Y TS X Z" + using R3 P9 l9_8_2 by blast + } + then have P11: "Bet A X T \ A Y TS X Z" by blast + { + assume R1: "Bet X T A" + then have R3: "A Y OS T X" + by (simp add: P5 P7 R1 bet_out_1 not_col_permutation_4 out_one_side) + then have "A Y TS X Z" + using P8 P9 invert_two_sides l9_8_2 by blast + } + then have "Bet X T A \ A Y TS X Z" by blast + then show ?thesis using P10 P11 + using P6 between_symmetry third_point by blast +qed + +lemma col123__nos: + assumes "Col P Q A" + shows "\ P Q OS A B" + using assms one_side_not_col123 by blast + +lemma col124__nos: + assumes "Col P Q B" + shows "\ P Q OS A B" + using assms one_side_not_col124 by blast + +lemma col2_os__os: + assumes "C \ D" and + "Col A B C" and + "Col A B D" and + "A B OS X Y" + shows "C D OS X Y" + by (metis assms(1) assms(2) assms(3) assms(4) col3 col_one_side col_trivial_3 invert_one_side os_distincts) + +lemma os_out_os: + assumes "Col A B P" and + "A B OS C D" and + "P Out C C'" + shows "A B OS C' D" + using OS_def assms(1) assms(2) assms(3) l9_5 not_col_permutation_1 by blast + +lemma ts_ts_os: + assumes "A B TS C D" and + "C D TS A B" + shows "A C OS B D" +proof - + obtain T1 where P1: "Col T1 A B \ Bet C T1 D" + using TS_def assms(1) by blast + obtain T where P2: "Col T C D \ Bet A T B" + using TS_def assms(2) by blast + have P3: "T1 = T" + proof - + have "A \ B" + using assms(2) ts_distincts by blast + then show ?thesis + proof - + have "Col T1 D C" + using Col_def P1 by blast + then have f1: "\p. (C = T1 \ Col C p T1) \ \ Col C T1 p" + by (metis assms(1) col_transitivity_1 l6_16_1 ts_distincts) + have f2: "\ Col C A B" + using TS_def assms(1) by presburger + have f3: "(Bet B T1 A \ Bet T1 A B) \ Bet A B T1" + using Col_def P1 by blast + { + assume "T1 \ B" + then have "C \ T1 \ \ Col C T1 B \ (\p. 
\ Col p T1 B \ Col p T1 T) \ T \ A \ T \ B" + using f3 f2 by (metis (no_types) Col_def col_transitivity_1 l6_16_1) + then have "T \ A \ T \ B \ C \ T1 \ \ Col C T1 T \ T1 = T" + using f3 by (meson Col_def l6_16_1) + } + moreover + { + assume "T \ A \ T \ B" + then have "C \ T1 \ \ Col C T1 T \ T1 = T" + using f2 by (metis (no_types) Col_def P1 P2 \A \ B\ col_transitivity_1 l6_16_1) + } + ultimately have "C \ T1 \ \ Col C T1 T \ T1 = T" + using f2 f1 assms(1) ts_distincts by blast + then show ?thesis + by (metis (no_types) Col_def P1 P2 assms(1) l6_16_1 ts_distincts) + qed + qed + have P4: "A C OS T B" + by (metis Col_cases P2 TS_def assms(1) assms(2) bet_out out_one_side) + then have "C A OS T D" + by (metis Col_cases P1 TS_def P3 assms(2) bet_out os_distincts out_one_side) + then show ?thesis + by (meson P4 Tarski_neutral_dimensionless.invert_one_side Tarski_neutral_dimensionless.one_side_symmetry Tarski_neutral_dimensionless_axioms one_side_transitivity) +qed + +lemma col_one_side_out: + assumes "Col A X Y" and + "A B OS X Y" + shows "A Out X Y" + by (meson assms(1) assms(2) l6_4_2 not_col_distincts not_col_permutation_4 one_side_chara) + +lemma col_two_sides_bet: + assumes "Col A X Y" and + "A B TS X Y" + shows "Bet X A Y" + using Col_cases assms(1) assms(2) l9_8_1 l9_9 or_bet_out out_out_one_side by blast + +lemma os_ts1324__os: + assumes "A X OS Y Z" and + "A Y TS X Z" + shows "A Z OS X Y" +proof - + obtain P where P1: "Col P A Y \ Bet X P Z" + using TS_def assms(2) by blast + have P2: "A Z OS X P" + by (metis Col_cases P1 TS_def assms(1) assms(2) bet_col bet_out_1 col124__nos col_trivial_2 l6_6 l9_19) + have "A Z OS P Y" + proof - + have "\ Col A Z P \ \ Col A Z Y" + using P2 col124__nos by blast + moreover have "A Out P Y" + proof - + have "X A OS P Z" + by (metis Col_cases P1 P2 assms(1) bet_out col123__nos out_one_side) + then have "A X OS P Y" + by (meson Tarski_neutral_dimensionless.invert_one_side Tarski_neutral_dimensionless.one_side_symmetry Tarski_neutral_dimensionless_axioms assms(1) one_side_transitivity) + then show ?thesis + using P1 col_one_side_out not_col_permutation_4 by blast + qed + ultimately show ?thesis + by (simp add: out_one_side) + qed + then show ?thesis + using P2 one_side_transitivity by blast +qed + +lemma ts2__ex_bet2: + assumes "A C TS B D" and + "B D TS A C" + shows "\ X. Bet A X C \ Bet B X D" + by (metis TS_def assms(1) assms(2) bet_col col_permutation_5 l9_18_R1 not_col_permutation_2) + +lemma out_one_side_1: + assumes "\ Col A B C" and + "Col A B X" and + "X Out C D" + shows "A B OS C D" + using assms(1) assms(2) assms(3) not_col_permutation_2 one_side_reflexivity one_side_symmetry os_out_os by blast + +lemma out_two_sides_two_sides: + assumes (*"A \ PX" and *) + "Col A B PX" and + "PX Out X P" and + "A B TS P Y" + shows "A B TS X Y" + using assms(1) assms(2) assms(3) l6_6 l9_5 not_col_permutation_1 by blast + +lemma l8_21_bis: + assumes "X \ Y" and + "\ Col C A B" + shows "\ P. Cong A P X Y \ A B Perp P A \ A B TS C P" +proof - + have P1: "A \ B" + using assms(2) not_col_distincts by blast + then have "\ P T. 
A B Perp P A \ Col A B T \ Bet C T P" + using l8_21 by auto + then obtain P T where P2: "A B Perp P A \ Col A B T \ Bet C T P" by blast + have P3: "A B TS C P" + proof - + have "\ Col P A B" + using P2 col_permutation_1 perp_not_col by blast + then show ?thesis + using P2 TS_def assms(2) not_col_permutation_1 by blast + qed + have P4: "P \ A" + using P3 ts_distincts by blast + obtain P' where P5: "(Bet A P P' \ Bet A P' P) \ Cong A P' X Y" + using segment_construction_2 P4 by blast + have P6: "A B Perp P' A" + by (smt P2 P5 Perp_perm assms(1) bet_col cong_identity cong_symmetry not_bet_distincts not_col_permutation_2 perp_col2) + have P7: "\ Col P' A B" + using NCol_perm P6 col_trivial_3 l8_16_1 by blast + then have P8: "A B OS P P'" + by (metis Out_def P4 P5 P6 col_permutation_2 out_one_side perp_not_eq_2) + then have P9: "A B TS C P'" + using P3 l9_2 l9_8_2 by blast + then show ?thesis + using P5 P6 by blast +qed + +lemma ts__ncol: + assumes "A B TS X Y" + shows "\ Col A X Y \ \ Col B X Y" + by (metis TS_def assms col_permutation_1 col_transitivity_2 ts_distincts) + +lemma one_or_two_sides_aux: + assumes "\ Col C A B" and + "\ Col D A B" and + "Col A C X" + and "Col B D X" + shows "A B TS C D \ A B OS C D" +proof - + have P1: "A \ X" + using assms(2) assms(4) col_permutation_2 by blast + have P2: "B \ X" + using assms(1) assms(3) col_permutation_4 by blast + have P3: "\ Col X A B" + using P1 assms(1) assms(3) col_permutation_5 col_transitivity_1 not_col_permutation_4 by blast + { + assume Q0: "Bet A C X \ Bet B D X" + then have Q1: "A B OS C X" + using assms(1) bet_out not_col_distincts not_col_permutation_1 out_one_side by blast + then have "A B OS X D" + by (metis Q0 assms(2) assms(4) bet_out_1 col_permutation_2 col_permutation_3 invert_one_side l6_4_2 not_bet_and_out not_col_distincts out_one_side) + then have "A B OS C D" + using Q1 one_side_transitivity by blast + } + then have P4: "Bet A C X \ Bet B D X \ A B OS C D" by blast + { + assume "Bet A C X \ Bet D X B" + then have "A B OS C D" + by (smt P2 assms(1) assms(4) bet_out between_equality_2 l9_10 l9_5 l9_8_1 not_bet_and_out not_col_distincts not_col_permutation_4 out_to_bet out_two_sides_two_sides) + } + then have P5: "Bet A C X \ Bet D X B \ A B OS C D " by blast + { + assume Q0: "Bet A C X \ Bet X B D" + have Q1: "A B TS X D" + using P3 Q0 TS_def assms(2) col_trivial_3 by blast + have "A B OS X C" + using Q0 assms(1) bet_out not_col_distincts one_side_reflexivity one_side_symmetry out_out_one_side by blast + then have "A B TS C D" + using Q1 l9_8_2 by blast + } + then have P6: "Bet A C X \ Bet X B D \ A B TS C D" by blast + { + assume Q1: "Bet C X A \ Bet B D X" + then have Q2: "A B OS C X" + using P1 assms(1) assms(3) between_equality_2 l6_4_2 not_col_permutation_1 not_col_permutation_4 out_one_side by blast + have "A B OS X D" + using Q1 assms(2) bet_out not_col_distincts one_side_reflexivity os_out_os by blast + then have "A B OS C D" using Q2 + using one_side_transitivity by blast + } + then have P7: "Bet C X A \ Bet B D X \ A B OS C D" by blast + { + assume "Bet C X A \ Bet D X B" + then have "A B OS C D" + by (smt \Bet A C X \ Bet D X B \ A B OS C D\ \Bet C X A \ Bet B D X \ A B OS C D\ assms(1) assms(2) assms(3) assms(4) between_symmetry l6_21 l9_18_R2 not_col_distincts ts_ts_os) + } + then have P8: "Bet C X A \ Bet D X B \ A B OS C D" by blast + { + assume Q1: "Bet C X A \ Bet X B D" + have Q2: "A B TS X D" + by (metis P3 Q1 assms(2) bet__ts invert_two_sides not_col_distincts not_col_permutation_3) + have Q3: "A B OS X 
C" + using P1 Q1 assms(1) bet_out_1 not_col_permutation_1 out_one_side by auto + then have "A B TS C D" + using Q2 l9_8_2 by blast + } + then have P9: "Bet C X A \ Bet X B D \ A B TS C D" by blast + { + assume Q0: "Bet X A C \ Bet B D X" + have Q1: "A B TS X C" + by (metis P3 Q0 assms(1) bet__ts col_permutation_2 not_col_distincts) + have "A B OS X D" + by (metis NCol_cases Q0 Tarski_neutral_dimensionless.out_one_side Tarski_neutral_dimensionless_axioms assms(2) assms(4) bet_out_1 invert_one_side l6_4_1 not_col_distincts not_out_bet) + then have "A B TS C D" + using Q1 l9_2 l9_8_2 by blast + } + then have P10: "Bet X A C \ Bet B D X \ A B TS C D" by blast + { + assume Q0: "Bet X A C \ Bet D X B" + have Q1: "A B TS X C" + by (metis NCol_cases P3 Q0 assms(1) bet__ts not_col_distincts) + have "A B OS X D" + by (metis P2 P3 Q0 bet_out_1 col_permutation_3 invert_one_side out_one_side) + then have "A B TS C D" + using Q1 l9_2 l9_8_2 by blast + } + then have P11: "Bet X A C \ Bet D X B \ A B TS C D" + by blast + { + assume Q0: "Bet X A C \ Bet X B D" + then have Q1: "A B TS C X" + by (simp add: P1 Q0 assms(1) bet__ts between_symmetry not_col_permutation_1) + have "A B TS D X" + by (simp add: P2 Q0 assms(2) bet__ts between_symmetry invert_two_sides not_col_permutation_3) + then have "A B OS C D" + using Q1 l9_8_1 by blast + } + then have P12: "Bet X A C \ Bet X B D \ A B OS C D" by blast + then show ?thesis using P4 P5 P6 P7 P8 P9 P10 P11 + using Col_def assms(3) assms(4) by auto +qed + +lemma cop__one_or_two_sides: + assumes "Coplanar A B C D" and + "\ Col C A B" and + "\ Col D A B" + shows "A B TS C D \ A B OS C D" +proof - + obtain X where P1: "Col A B X \ Col C D X \ Col A C X \ Col B D X \ Col A D X \ Col B C X" + using Coplanar_def assms(1) by auto + have P2: "Col A B X \ Col C D X \ A B TS C D \ A B OS C D" + by (metis TS_def Tarski_neutral_dimensionless.l9_19_R2 Tarski_neutral_dimensionless_axioms assms(2) assms(3) not_col_permutation_3 not_col_permutation_5 not_out_bet) + have P3: "Col A C X \ Col B D X \ A B TS C D \ A B OS C D" + using assms(2) assms(3) one_or_two_sides_aux by blast + have "Col A D X \ Col B C X \ A B TS C D \ A B OS C D" + using assms(2) assms(3) l9_2 one_or_two_sides_aux one_side_symmetry by blast + then show ?thesis + using P1 P2 P3 by blast +qed + +lemma os__coplanar: + assumes "A B OS C D" + shows "Coplanar A B C D" +proof - + have P1: "\ Col A B C" + using assms one_side_not_col123 by blast + obtain C' where P2: "Bet C B C' \ Cong B C' B C" + using segment_construction by presburger + have P3: "A B TS D C'" + by (metis (no_types) Cong_perm OS_def P2 TS_def assms bet__ts bet_cong_eq invert_one_side l9_10 l9_8_2 one_side_not_col123 ts_distincts) + obtain T where P4: "Col T A B \ Bet D T C'" + using P3 TS_def by blast + have P5: "C' \ T" + using P3 P4 TS_def by blast + have P6: "Col T B C \ Coplanar A B C D" + by (metis Col_def Coplanar_def P2 P4 P5 col_trivial_2 l6_16_1) + { + assume Q0: "\ Col T B C" + { + assume R0: "Bet T B A" + have S1: "B C TS T A" + by (metis P1 Q0 R0 bet__ts col_permutation_2 not_col_distincts) + have "C' Out T D" + using P4 P5 bet_out_1 by auto + then have "B C OS T D" + using P2 Q0 bet_col invert_one_side not_col_permutation_3 out_one_side_1 by blast + then have R1: "B C TS D A" + using S1 l9_8_2 by blast + then have "Coplanar A B C D" + using ncoplanar_perm_9 ts__coplanar by blast + } + then have Q1: "Bet T B A \ Coplanar A B C D" by blast + { + assume R0: "\ Bet T B A" + { + have R2: "B C OS D T" + proof - + have S1: "\ Col B C D" + by 
(metis Col_perm P2 P3 P4 Q0 bet_col colx ts_distincts) + have S2: "Col B C C'" + by (simp add: P2 bet_col col_permutation_4) + have S3: "C' Out D T" + using P4 P5 bet_out_1 l6_6 by auto + then show ?thesis + using S1 S2 out_one_side_1 by blast + qed + + have R3: "B C OS T A" + using P4 Q0 R0 col_permutation_2 col_permutation_5 not_bet_out out_one_side by blast + } + then have R1: "B C OS D A" + by (metis P2 P4 Q0 bet_col bet_out_1 col_permutation_2 col_permutation_5 os_out_os) + then have "Coplanar A B C D" + by (simp add: R1 assms coplanar_perm_19 invert_one_side l9_31 one_side_symmetry ts__coplanar) + } + then have "\ Bet T B A \ Coplanar A B C D" by blast + then have "Coplanar A B C D" using Q1 by blast + } + then have "\ Col T B C \ Coplanar A B C D" by blast + then show ?thesis using P6 by blast +qed + +lemma coplanar_trans_1: + assumes "\ Col P Q R" and + "Coplanar P Q R A" and + "Coplanar P Q R B" + shows "Coplanar Q R A B" +proof - + have P1: "Col Q R A \ Coplanar Q R A B" + by (simp add: col__coplanar) + { + assume T1: "\ Col Q R A" + { + assume T2: "\ Col Q R B" + { + have "Col Q A B \ Coplanar Q R A B" + using ncop__ncols by blast + { + assume S1: "\ Col Q A B" + have U1: "Q R TS P A \ Q R OS P A" + by (simp add: T1 assms(1) assms(2) cop__one_or_two_sides coplanar_perm_8 not_col_permutation_2) + have U2: "Q R TS P B \ Q R OS P B" + using T2 assms(1) assms(3) col_permutation_1 cop__one_or_two_sides coplanar_perm_8 by blast + have W1: "Q R TS P A \ Q R OS P A \ Q R TS A B \ Q R OS A B" + using l9_9 by blast + have W2: "Q R TS P A \ Q R OS P B \ Q R TS A B \ Q R OS A B" + using l9_2 l9_8_2 by blast + have W3: "Q R TS P B \ Q R OS P A \ Q R TS A B \ Q R OS A B" + using l9_8_2 by blast + have "Q R TS P B \ Q R OS P B \ Q R TS A B \ Q R OS A B" + using l9_9 by blast + then have S2: "Q R TS A B \ Q R OS A B" using U1 U2 W1 W2 W3 + using OS_def l9_2 one_side_transitivity by blast + have "Coplanar Q R A B" + using S2 os__coplanar ts__coplanar by blast + } + then have "\ Col Q A B \ Coplanar Q R A B" by blast + } + then have "Coplanar Q R A B" + using ncop__ncols by blast + } + then have "\ Col Q R B \ Coplanar Q R A B" + by blast + } + then have "\ Col Q R A \ Coplanar Q R A B" + using ncop__ncols by blast + then show ?thesis using P1 by blast +qed + +lemma col_cop__cop: + assumes "Coplanar A B C D" and + "C \ D" and + "Col C D E" + shows "Coplanar A B C E" +proof - + have "Col D A C \ Coplanar A B C E" + by (meson assms(2) assms(3) col_permutation_1 l6_16_1 ncop__ncols) + moreover + { + assume "\ Col D A C" + then have "Coplanar A C B E" + by (meson assms(1) assms(3) col__coplanar coplanar_trans_1 ncoplanar_perm_11 ncoplanar_perm_13) + then have "Coplanar A B C E" + using ncoplanar_perm_2 by blast + } + ultimately show ?thesis + by blast +qed + +lemma bet_cop__cop: + assumes "Coplanar A B C E" and + "Bet C D E" + shows "Coplanar A B C D" + by (metis NCol_perm Tarski_neutral_dimensionless.col_cop__cop Tarski_neutral_dimensionless_axioms assms(1) assms(2) bet_col bet_neq12__neq) + +lemma col2_cop__cop: + assumes "Coplanar A B C D" and + "C \ D" and + "Col C D E" and + "Col C D F" + shows "Coplanar A B E F" +proof cases + assume "C = E" + then show ?thesis + using assms(1) assms(2) assms(4) col_cop__cop by blast +next + assume "C \ E" + then show ?thesis + by (metis assms(1) assms(2) assms(3) assms(4) col_cop__cop col_transitivity_1 ncoplanar_perm_1 not_col_permutation_4) +qed + +lemma col_cop2__cop: + assumes "U \ V" and + "Coplanar A B C U" and + "Coplanar A B C V" and + "Col U V P" + 
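(* any point of the line through two points of the plane A B C stays in that plane; this closure fact is reused below, e.g. in bet_cop2__cop and coplanar_pseudo_trans *) +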
shows "Coplanar A B C P" +proof cases + assume "Col A B C" + then show ?thesis + using ncop__ncol by blast +next + assume "\ Col A B C" + then show ?thesis + by (smt Col_perm assms(1) assms(2) assms(3) assms(4) col_cop__cop coplanar_trans_1 ncoplanar_perm_1 ncoplanar_perm_14 ncoplanar_perm_15 ncoplanar_perm_23) +qed + +lemma bet_cop2__cop: + assumes "Coplanar A B C U" and + "Coplanar A B C W" and + "Bet U V W" + shows "Coplanar A B C V" +proof - + have "Col U V W" + using assms(3) bet_col by blast + then have "Col U W V" + by (meson not_col_permutation_5) + then show ?thesis + using assms(1) assms(2) assms(3) bet_neq23__neq col_cop2__cop by blast +qed + +lemma coplanar_pseudo_trans: + assumes "\ Col P Q R" and + "Coplanar P Q R A" and + "Coplanar P Q R B" and + "Coplanar P Q R C" and + "Coplanar P Q R D" + shows "Coplanar A B C D" +proof cases + have LEM1: "(\ Col P Q R \ Coplanar P Q R A \ Coplanar P Q R B \ Coplanar P Q R C) \ Coplanar A B C R" + by (smt col_transitivity_2 coplanar_trans_1 ncop__ncols ncoplanar_perm_19 ncoplanar_perm_21) + assume P2: "Col P Q D" + have P3: "P \ Q" + using assms(1) col_trivial_1 by blast + have P4: "Coplanar A B C Q" + by (smt assms(1) assms(2) assms(3) assms(4) col2_cop__cop coplanar_trans_1 ncoplanar_perm_9 not_col_distincts) + have P5: "\ Col Q R P" + using Col_cases assms(1) by blast + have P6: "Coplanar Q R P A" + using assms(2) ncoplanar_perm_12 by blast + have P7: "Coplanar Q R P B" + using assms(3) ncoplanar_perm_12 by blast + have P8: "Coplanar Q R P C" + using assms(4) ncoplanar_perm_12 by blast + then have "Coplanar A B C P" using LEM1 P5 P6 P7 + by (smt col_transitivity_2 coplanar_trans_1 ncop__ncols ncoplanar_perm_19) + then show ?thesis + using LEM1 P2 P3 P4 col_cop2__cop by blast +next + assume P9: "\ Col P Q D" + have P10: "Coplanar P Q D A" + using NCol_cases assms(1) assms(2) assms(5) coplanar_trans_1 ncoplanar_perm_8 by blast + have P11: "Coplanar P Q D B" + using assms(1) assms(3) assms(5) col_permutation_1 coplanar_perm_12 coplanar_trans_1 by blast + have "Coplanar P Q D C" + by (meson assms(1) assms(4) assms(5) coplanar_perm_7 coplanar_trans_1 ncoplanar_perm_14 not_col_permutation_3) + then show ?thesis using P9 P10 P11 + by (smt P10 P11 P9 col3 coplanar_trans_1 ncop__ncol ncoplanar_perm_20 ncoplanar_perm_23 not_col_distincts) +qed + +lemma l9_30: + assumes "\ Coplanar A B C P" and + "\ Col D E F" and + "Coplanar D E F P" and + "Coplanar A B C X" and + "Coplanar A B C Y" and + "Coplanar A B C Z" and + "Coplanar D E F X" and + "Coplanar D E F Y" and + "Coplanar D E F Z" + shows "Col X Y Z" +proof - + { + assume P1: "\ Col X Y Z" + have P2: "\ Col A B C" + using assms(1) col__coplanar by blast + have "Coplanar A B C P" + proof - + have Q2: "Coplanar X Y Z A" + by (smt P2 assms(4) assms(5) assms(6) col2_cop__cop coplanar_trans_1 ncoplanar_perm_18 not_col_distincts) + have Q3: "Coplanar X Y Z B" + using P2 assms(4) assms(5) assms(6) col_trivial_3 coplanar_pseudo_trans ncop__ncols by blast + have Q4: "Coplanar X Y Z C" + using P2 assms(4) assms(5) assms(6) col_trivial_2 coplanar_pseudo_trans ncop__ncols by blast + have "Coplanar X Y Z P" + using assms(2) assms(3) assms(7) assms(8) assms(9) coplanar_pseudo_trans by blast + then show ?thesis using P1 Q2 Q3 Q4 + using assms(2) assms(3) assms(7) assms(8) assms(9) coplanar_pseudo_trans by blast + qed + then have "False" using assms(1) by blast + } + then show ?thesis by blast +qed + +lemma cop_per2__col: + assumes "Coplanar A X Y Z" and + "A \ Z" and + "Per X Z A" and + "Per Y Z A" + shows 
"Col X Y Z" +proof cases + assume "X = Y \ X = Z \ Y = Z" + then show ?thesis + using not_col_distincts by blast +next + assume H1:"\ (X = Y \ X = Z \ Y = Z)" + obtain B where P1: "Cong X A X B \ Z Midpoint A B \ Cong Y A Y B" + using Per_def assms(3) assms(4) per_double_cong by blast + have P2: "X \ Y" + using H1 by blast + have P3: "X \ Z" + using H1 by blast + have P4: "Y \ Z" + using H1 by blast + obtain I where P5: " Col A X I \ Col Y Z I \ + Col A Y I \ Col X Z I \ Col A Z I \ Col X Y I" + using Coplanar_def assms(1) by auto + have P6: "Col A X I \ Col Y Z I \ Col X Y Z" + by (smt P1 P4 assms(2) l4_17 l4_18 l7_13 l7_2 l7_3_2 midpoint_distinct_2 not_col_permutation_1) + have P7: "Col A Y I \ Col X Z I \ Col X Y Z" + by (smt P1 P3 assms(2) col_permutation_1 col_permutation_5 l4_17 l4_18 l7_13 l7_2 l7_3_2 midpoint_distinct_2) + have "Col A Z I \ Col X Y I \ Col X Y Z" + by (metis P1 P2 assms(2) col_permutation_1 l4_17 l4_18 l7_13 l7_2 l7_3_2 midpoint_distinct_2) + then show ?thesis + using P5 P6 P7 by blast +qed + +lemma cop_perp2__col: + assumes "Coplanar A B Y Z" and + "X Y Perp A B" and + "X Z Perp A B" + shows "Col X Y Z" +proof cases + assume P1: "Col A B X" + { + assume Q0: "X = A" + then have Q1: "X \ B" + using assms(3) perp_not_eq_2 by blast + have Q2: "Coplanar B Y Z X" + by (simp add: Q0 assms(1) coplanar_perm_9) + have Q3: "Per Y X B" + using Q0 assms(2) perp_per_2 by blast + have "Per Z X B" + using Q0 assms(3) perp_per_2 by blast + then have "Col X Y Z" + using Q1 Q2 Q3 cop_per2__col not_col_permutation_1 by blast + } + then have P2: "X = A \ Col X Y Z" by blast + { + assume Q0: "X \ A" + have Q1: "A X Perp X Y" + by (metis P1 Perp_perm Q0 assms(2) perp_col1) + have Q2: "A X Perp X Z" + by (metis P1 Perp_perm Q0 assms(3) perp_col1) + have Q3: "Coplanar A Y Z X" + by (smt P1 assms(1) assms(2) col2_cop__cop col_trivial_3 coplanar_perm_12 coplanar_perm_16 perp_distinct) + have Q4: "Per Y X A" + using Perp_perm Q1 perp_per_2 by blast + have "Per Z X A" + using P1 Q0 assms(3) perp_col1 perp_per_1 by auto + then have "Col X Y Z" + using Q0 Q3 Q4 cop_per2__col not_col_permutation_1 by blast + } + then have "X \ A \ Col X Y Z" by blast + then show ?thesis + using P2 by blast +next + assume P1: "\ Col A B X" + obtain Y0 where P2: "Y0 PerpAt X Y A B" + using Perp_def assms(2) by blast + obtain Z0 where P3: "Z0 PerpAt X Z A B" + using Perp_def assms(3) by auto + have P4: "X Y0 Perp A B" + by (metis P1 P2 assms(2) perp_col perp_in_col) + have P5: "X Z0 Perp A B" + by (metis P1 P3 assms(3) perp_col perp_in_col) + have P6: "Y0 = Z0" + by (meson P1 P2 P3 P4 P5 Perp_perm l8_18_uniqueness perp_in_col) + have P7: "X \ Y0" + using P4 perp_not_eq_1 by blast + have P8: "Col X Y0 Y" + using P2 col_permutation_5 perp_in_col by blast + have "Col X Y0 Z" + using P3 P6 col_permutation_5 perp_in_col by blast + then show ?thesis + using P7 P8 col_transitivity_1 by blast +qed + +lemma two_sides_dec: + shows "A B TS C D \ \ A B TS C D" + by simp + +lemma cop_nts__os: + assumes "Coplanar A B C D" and + "\ Col C A B" and + "\ Col D A B" and + "\ A B TS C D" + shows "A B OS C D" + using assms(1) assms(2) assms(3) assms(4) cop__one_or_two_sides by blast + +lemma cop_nos__ts: + assumes "Coplanar A B C D" and + "\ Col C A B" and + "\ Col D A B" and + "\ A B OS C D" + shows "A B TS C D" + using assms(1) assms(2) assms(3) assms(4) cop_nts__os by blast + +lemma one_side_dec: + "A B OS C D \ \ A B OS C D" + by simp + +lemma cop_dec: + "Coplanar A B C D \ \ Coplanar A B C D" + by simp + +lemma ex_diff_cop: + "\ 
E. Coplanar A B C E \ D \ E" + by (metis col_trivial_2 diff_col_ex ncop__ncols) + +lemma ex_ncol_cop: + assumes "D \ E" + shows "\ F. Coplanar A B C F \ \ Col D E F" +proof cases + assume "Col A B C" + then show ?thesis + using assms ncop__ncols not_col_exists by blast +next + assume P1: "\ Col A B C" + then show ?thesis + proof - + have P2: "(Col D E A \ Col D E B) \ (\ F. Coplanar A B C F \ \ Col D E F)" + by (meson P1 assms col3 col_trivial_2 ncop__ncols) + have P3: "(\Col D E A \ Col D E B) \ (\ F. Coplanar A B C F \ \ Col D E F)" + using col_trivial_3 ncop__ncols by blast + have P4: "(Col D E A \ \Col D E B) \ (\ F. Coplanar A B C F \ \ Col D E F)" + using col_trivial_2 ncop__ncols by blast + have "(\Col D E A \ \Col D E B) \ (\ F. Coplanar A B C F \ \ Col D E F)" + using col_trivial_3 ncop__ncols by blast + then show ?thesis using P2 P3 P4 by blast + qed +qed + +lemma ex_ncol_cop2: + "\ E F. (Coplanar A B C E \ Coplanar A B C F \ \ Col D E F)" +proof - + have f1: "\p pa pb. Coplanar pb pa p pb" + by (meson col_trivial_3 ncop__ncols) + have f2: "\p pa pb. Coplanar pb pa p p" + by (meson Col_perm col_trivial_3 ncop__ncols) + obtain pp :: "'p \ 'p \ 'p" where + f3: "\p pa. p = pa \ \ Col p pa (pp p pa)" + using not_col_exists by moura + have f4: "\p pa pb. Coplanar pb pb pa p" + by (meson Col_perm col_trivial_3 ncop__ncols) + have "\p. A \ p" + by (meson col_trivial_3 diff_col_ex3) + moreover + { assume "B \ A" + then have "D = B \ (\p. \ Col D p A \ Coplanar A B C p)" + using f3 f2 by (metis (no_types) Col_perm ncop__ncols) + then have "D = B \ (\p pa. Coplanar A B C p \ Coplanar A B C pa \ \ Col D p pa)" + using f1 by blast } + moreover + { assume "D \ B" + moreover + { assume "\p. D \ B \ \ Coplanar A B C p" + then have "D \ B \ \ Col A B C" + using ncop__ncols by blast + then have "\p. 
\ Col D p B \ Coplanar A B C p" + using f2 f1 by (metis (no_types) Col_perm col_transitivity_1) } + ultimately have ?thesis + using f3 by (metis (no_types) col_trivial_3 ncop__ncols) } + ultimately show ?thesis + using f4 f3 by blast +qed + +lemma col2_cop2__eq: + assumes "\ Coplanar A B C U" and + "U \ V" and + "Coplanar A B C P" and + "Coplanar A B C Q" and + "Col U V P" and + "Col U V Q" + shows "P = Q" +proof - + have "Col U Q P" + by (meson assms(2) assms(5) assms(6) col_transitivity_1) + then have "Col P Q U" + using not_col_permutation_3 by blast + then show ?thesis + using assms(1) assms(3) assms(4) col_cop2__cop by blast +qed + +lemma cong3_cop2__col: + assumes "Coplanar A B C P" and + "Coplanar A B C Q" and + "P \ Q" and + "Cong A P A Q" and + "Cong B P B Q" and + "Cong C P C Q" + shows "Col A B C" +proof cases + assume "Col A B C" + then show ?thesis by blast +next + assume P1: "\ Col A B C" + obtain M where P2: "M Midpoint P Q" + using assms(6) l7_25 by blast + have P3: "Per A M P" + using P2 Per_def assms(4) by blast + have P4: "Per B M P" + using P2 Per_def assms(5) by blast + have P5: "Per C M P" + using P2 Per_def assms(6) by blast + have "False" + proof cases + assume Q1: "A = M" + have Q2: "Coplanar P B C A" + using assms(1) ncoplanar_perm_21 by blast + have Q3: "P \ A" + by (metis assms(3) assms(4) cong_diff_4) + have Q4: "Per B A P" + by (simp add: P4 Q1) + have Q5: "Per C A P" + by (simp add: P5 Q1) + then show ?thesis using Q1 Q2 Q3 Q4 cop_per2__col + using P1 not_col_permutation_1 by blast + next + assume Q0: "A \ M" + have Q1: "Col A B M" + proof - + have R1: "Coplanar A B P Q" + using P1 assms(1) assms(2) coplanar_trans_1 ncoplanar_perm_8 not_col_permutation_2 by blast + then have R2: "Coplanar P A B M" + using P2 bet_cop__cop coplanar_perm_14 midpoint_bet ncoplanar_perm_6 by blast + have R3: "P \ M" + using P2 assms(3) l7_3_2 l7_9_bis by blast + have R4: "Per A M P" + by (simp add: P3) + have R5: "Per B M P" + by (simp add: P4) + then show ?thesis + using R2 R3 R4 cop_per2__col by blast + qed + have "Col A C M" + proof - + have R1: "Coplanar P A C M" + using P1 Q1 assms(1) col2_cop__cop coplanar_perm_22 ncoplanar_perm_3 not_col_distincts by blast + have R2: "P \ M" + using P2 assms(3) l7_3_2 symmetric_point_uniqueness by blast + have R3: "Per A M P" + by (simp add: P3) + have "Per C M P" + by (simp add: P5) + then show ?thesis + using R1 R2 R3 cop_per2__col by blast + qed + then show ?thesis + using NCol_perm P1 Q0 Q1 col_trivial_3 colx by blast + qed + then show ?thesis by blast +qed + +lemma l9_38: + assumes "A B C TSP P Q" + shows "A B C TSP Q P" + using Bet_perm TSP_def assms by blast + +lemma l9_39: + assumes "A B C TSP P R" and + "Coplanar A B C D" and + "D Out P Q" + shows "A B C TSP Q R" +proof - + have P1: "\ Col A B C" + using TSP_def assms(1) ncop__ncol by blast + have P2: "\ Coplanar A B C Q" + by (metis TSP_def assms(1) assms(2) assms(3) col_cop2__cop l6_6 out_col out_diff2) + have P3: "\ Coplanar A B C R" + using TSP_def assms(1) by blast + obtain T where P3A: "Coplanar A B C T \ Bet P T R" + using TSP_def assms(1) by blast + have W1: "D = T \ A B C TSP Q R" + using P2 P3 P3A TSP_def assms(3) bet_out__bet by blast + { + assume V1: "D \ T" + have V1A: "\ Col P D T" using P3A col_cop2__cop + by (metis TSP_def V1 assms(1) assms(2) col2_cop2__eq col_trivial_2) + have V1B: "D T TS P R" + by (metis P3 P3A V1A bet__ts invert_two_sides not_col_permutation_3) + have "D T OS P Q" + using V1A assms(3) not_col_permutation_1 out_one_side by blast + then have V2: 
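(* P and Q lie on the same side of line D T, so l9_8_2 transfers the separation from P to Q *)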
"D T TS Q R" + using V1B l9_8_2 by blast + then obtain T' where V3: "Col T' D T \ Bet Q T' R" + using TS_def by blast + have V4: "Coplanar A B C T'" + using Col_cases P3A V1 V3 assms(2) col_cop2__cop by blast + then have "A B C TSP Q R" + using P2 P3 TSP_def V3 by blast + } + then have "D \ T \ A B C TSP Q R" by blast + then show ?thesis using W1 by blast +qed + +lemma l9_41_1: + assumes "A B C TSP P R" and + "A B C TSP Q R" + shows "A B C OSP P Q" + using OSP_def assms(1) assms(2) by blast + +lemma l9_41_2: + assumes "A B C TSP P R" and + "A B C OSP P Q" + shows "A B C TSP Q R" +proof - + have P1: "\ Coplanar A B C P" + using TSP_def assms(1) by blast + obtain S where P2: " A B C TSP P S \ A B C TSP Q S" + using OSP_def assms(2) by blast + obtain X where P3: "Coplanar A B C X \ Bet P X S" + using P2 TSP_def by blast + have P4: "\ Coplanar A B C P \ \ Coplanar A B C S" + using P2 TSP_def by blast + obtain Y where P5: "Coplanar A B C Y \ Bet Q Y S" + using P2 TSP_def by blast + have P6: "\ Coplanar A B C Q \ \ Coplanar A B C S" + using P2 TSP_def by blast + have P7: "X \ P \ S \ X \ Q \ Y \ S \ Y" + using P3 P4 P5 P6 by blast + { + assume Q1: "Col P Q S" + have Q2: "X = Y" + proof - + have R2: "Q \ S" + using P5 P6 bet_neq12__neq by blast + have R5: "Col Q S X" + by (smt Col_def P3 Q1 between_inner_transitivity between_symmetry col_transitivity_2) + have "Col Q S Y" + by (simp add: P5 bet_col col_permutation_5) + then show ?thesis + using P2 P3 P5 R2 R5 TSP_def col2_cop2__eq by blast + qed + then have "X Out P Q" + by (metis P3 P5 P7 l6_2) + then have "A B C TSP Q R" + using P3 assms(1) l9_39 by blast + } + then have P7: "Col P Q S \ A B C TSP Q R" by blast + { + assume Q1: "\ Col P Q S" + obtain Z where Q2: "Bet X Z Q \ Bet Y Z P" + using P3 P5 inner_pasch by blast + { + assume "X = Z" + then have "False" + by (metis P2 P3 P5 Q1 Q2 TSP_def bet_col col_cop2__cop l6_16_1 not_col_permutation_5) + } + then have Q3: "X \ Z" by blast + have "Y \ Z" + proof - + have "X \ Z" + by (meson \X = Z \ False\) + then have "Z \ Y" + by (metis (no_types) P2 P3 P5 Q2 TSP_def bet_col col_cop2__cop) + then show ?thesis + by meson + qed + then have "Y Out P Z" + using Q2 bet_out l6_6 by auto + then have Q4: "A B C TSP Z R" + using assms(1) P5 l9_39 by blast + have "X Out Z Q" + using Q2 Q3 bet_out by auto + then have "A B C TSP Q R" + using Q4 P3 l9_39 by blast + } + then have "\ Col P Q S \ A B C TSP Q R" by blast + then show ?thesis using P7 by blast +qed + +lemma tsp_exists: + assumes "\ Coplanar A B C P" + shows "\ Q. 
A B C TSP P Q" +proof - + obtain Q where P1: "Bet P A Q \ Cong A Q A P" + using segment_construction by blast + have P2: "Coplanar A B C A" + using coplanar_trivial ncoplanar_perm_5 by blast + have P3: "\ Coplanar A B C P" + by (simp add: assms) + have P4: "\ Coplanar A B C Q" + by (metis P1 P2 Tarski_neutral_dimensionless.col_cop2__cop Tarski_neutral_dimensionless_axioms assms bet_col cong_diff_4 not_col_permutation_2) + then show ?thesis + using P1 P2 TSP_def assms by blast +qed + + +lemma osp_reflexivity: + assumes "\ Coplanar A B C P" + shows "A B C OSP P P" + by (meson assms l9_41_1 tsp_exists) + +lemma osp_symmetry: + assumes "A B C OSP P Q" + shows "A B C OSP Q P" + using OSP_def assms by auto + +lemma osp_transitivity: + assumes "A B C OSP P Q" and + "A B C OSP Q R" + shows "A B C OSP P R" + using OSP_def assms(1) assms(2) l9_41_2 by blast + +lemma cop3_tsp__tsp: + assumes "\ Col D E F" and + "Coplanar A B C D" and + "Coplanar A B C E" and + "Coplanar A B C F" and + "A B C TSP P Q" + shows "D E F TSP P Q" +proof - + obtain T where P1: "Coplanar A B C T \ Bet P T Q" + using TSP_def assms(5) by blast + have P2: "\ Col A B C" + using TSP_def assms(5) ncop__ncols by blast + have P3: "Coplanar D E F A \ Coplanar D E F B \ Coplanar D E F C \ Coplanar D E F T" + proof - + have P3A: "Coplanar D E F A" + using P2 assms(2) assms(3) assms(4) col_trivial_3 coplanar_pseudo_trans ncop__ncols by blast + have P3B: "Coplanar D E F B" + using P2 assms(2) assms(3) assms(4) col_trivial_2 coplanar_pseudo_trans ncop__ncols by blast + have P3C: "Coplanar D E F C" + by (meson P2 assms(2) assms(3) assms(4) coplanar_perm_16 coplanar_pseudo_trans coplanar_trivial) + have "Coplanar D E F T" + using P1 P2 assms(2) assms(3) assms(4) coplanar_pseudo_trans by blast + then show ?thesis using P3A P3B P3C by simp + qed + have P4: "\ Coplanar D E F P" + using P3 TSP_def assms(1) assms(5) coplanar_pseudo_trans by auto + have P5: "\ Coplanar D E F Q" + by (metis P1 P3 P4 TSP_def assms(5) bet_col bet_col1 col2_cop2__eq) + have P6: "Coplanar D E F T" + by (simp add: P3) + have "Bet P T Q" + by (simp add: P1) + then show ?thesis + using P4 P5 P6 TSP_def by blast +qed + +lemma cop3_osp__osp: + assumes "\ Col D E F" and + "Coplanar A B C D" and + "Coplanar A B C E" and + "Coplanar A B C F" and + "A B C OSP P Q" + shows "D E F OSP P Q" +proof - + obtain R where P1: "A B C TSP P R \ A B C TSP Q R" + using OSP_def assms(5) by blast + then show ?thesis + using OSP_def assms(1) assms(2) assms(3) assms(4) cop3_tsp__tsp by blast +qed + +lemma ncop_distincts: + assumes "\ Coplanar A B C D" + shows "A \ B \ A \ C \ A \ D \ B \ C \ B \ D \ C \ D" + using Coplanar_def assms col_trivial_1 col_trivial_2 by blast + +lemma tsp_distincts: + assumes "A B C TSP P Q" + shows "A \ B \ A \ C \ B \ C \ A \ P \ B \ P \ C \ P \ A \ Q \ B \ Q \ C \ Q \ P \ Q" +proof - + obtain pp :: "'p \ 'p \ 'p \ 'p \ 'p \ 'p" where + "\x0 x1 x2 x3 x4. (\v5. 
Coplanar x4 x3 x2 v5 \ Bet x1 v5 x0) = (Coplanar x4 x3 x2 (pp x0 x1 x2 x3 x4) \ Bet x1 (pp x0 x1 x2 x3 x4) x0)" + by moura + then have f1: "\ Coplanar A B C P \ \ Coplanar A B C Q \ Coplanar A B C (pp Q P C B A) \ Bet P (pp Q P C B A) Q" + using TSP_def assms by presburger + then have "Q \ pp Q P C B A" + by force + then show ?thesis + using f1 by (meson bet_neq32__neq ncop_distincts) +qed + +lemma osp_distincts: + assumes "A B C OSP P Q" + shows "A \ B \ A \ C \ B \ C \ A \ P \ B \ P \ C \ P \ A \ Q \ B \ Q \ C \ Q" + using OSP_def assms tsp_distincts by blast + +lemma tsp__ncop1: + assumes "A B C TSP P Q" + shows "\ Coplanar A B C P" + using TSP_def assms by blast + +lemma tsp__ncop2: + assumes "A B C TSP P Q" + shows "\ Coplanar A B C Q" + using TSP_def assms by auto + +lemma osp__ncop1: + assumes "A B C OSP P Q" + shows "\ Coplanar A B C P" + using OSP_def TSP_def assms by blast + +lemma osp__ncop2: + assumes "A B C OSP P Q" + shows "\ Coplanar A B C Q" + using assms osp__ncop1 osp_symmetry by blast + +lemma tsp__nosp: + assumes "A B C TSP P Q" + shows "\ A B C OSP P Q" + using assms l9_41_2 tsp_distincts by blast + +lemma osp__ntsp: + assumes "A B C OSP P Q" + shows "\ A B C TSP P Q" + using assms tsp__nosp by blast + +lemma osp_bet__osp: + assumes "A B C OSP P R" and + "Bet P Q R" + shows "A B C OSP P Q" +proof - + obtain S where P1: "A B C TSP P S" + using OSP_def assms(1) by blast + then obtain Y where P2: "Coplanar A B C Y \ Bet R Y S" + using TSP_def assms(1) l9_41_2 by blast + obtain X where Q1: "Coplanar A B C X \ Bet P X S" + using P1 TSP_def by blast + have Q2: "P \ X \ S \ X \ R \ Y" + using P1 P2 Q1 TSP_def assms(1) osp__ncop2 by auto + { + assume P3: "Col P R S" + have P5: "A B C TSP Q S" + proof - + have Q3: "X = Y" + proof - + have R1: "\ Coplanar A B C R" + using assms(1) osp__ncop2 by blast + have R2: "R \ S" + using P1 assms(1) osp__ntsp by blast + have R5: "Col R S X" + by (smt Col_def P3 Q1 bet_col1 between_exchange4 between_symmetry) + have "Col R S Y" + using P2 bet_col col_permutation_5 by blast + then show ?thesis + using R1 R2 Q1 P2 R5 col2_cop2__eq by blast + qed + then have "Y Out P Q" + by (smt P2 P3 Q1 Q2 assms(2) bet_col1 between_exchange4 between_symmetry l6_3_2 l6_4_2 not_bet_and_out third_point) + then show ?thesis + using P1 P2 l9_39 by blast + qed + then have "A B C OSP P Q" + using OSP_def P1 P2 l9_39 by blast + } + then have H1: "Col P R S \ A B C OSP P Q" by blast + { + assume T1: "\ Col P R S" + have T2: "X Y OS P R" + proof - + have T3: "P \ X \ S \ X \ R \ Y \ S \ Y" + using P1 P2 Q2 TSP_def by auto + have T4: "\ Col S X Y" + by (metis P2 Q1 T1 T3 bet_out_1 col_out2_col col_permutation_5 not_col_permutation_4) + have T5: "X Y TS P S" + by (metis Col_perm Q1 Q2 T4 bet__ts bet_col col_transitivity_2) + have T6: "X Y TS R S" + by (metis P2 Q1 T4 assms(1) bet__ts col_cop2__cop invert_two_sides not_col_distincts osp__ncop2) + then show ?thesis + using T5 l9_8_1 by auto + qed + then have T7: "X Y OS P Q" + using assms(2) l9_17 by blast + then obtain S' where T7A: "X Y TS P S' \ X Y TS Q S'" + using OS_def by blast + have T7B: "\ Col P X Y \ \ Col S' X Y \ (\ T::'p. Col T X Y \ Bet P T S')" + using T7A TS_def by auto + have T7C: "\ Col Q X Y \ \ Col S' X Y \ (\ T::'p. 
Col T X Y \ Bet Q T S')" + using T7A TS_def by blast + obtain X' where T9: "Col X' X Y \ Bet P X' S' \ X Y TS Q S'" + using T7A T7B by blast + obtain Y' where T10: "Col Y' X Y \ Bet Q Y' S'" + using T7C by blast + have T11: "Coplanar A B C X'" + using Col_cases P2 Q1 T9 col_cop2__cop ts_distincts by blast + have T12: "Coplanar A B C Y'" + using Col_cases P2 Q1 T10 T9 col_cop2__cop ts_distincts by blast + have T13: "\ Coplanar A B C S'" + using T11 T7C T9 assms(1) bet_col bet_col1 col2_cop2__eq osp__ncop1 by fastforce + then have "A B C OSP P Q" + proof - + have R1: "A B C TSP P S'" + using P1 T11 T13 T9 TSP_def by blast + have "A B C TSP Q S'" + by (metis T10 T12 T13 T7C TSP_def bet_col col_cop2__cop) + then show ?thesis using R1 by (smt l9_41_1) + qed + } + then show ?thesis using H1 by blast +qed + +lemma l9_18_3: + assumes "Coplanar A B C P" and + "Col X Y P" + shows "A B C TSP X Y \ (Bet X P Y \ \ Coplanar A B C X \ \ Coplanar A B C Y)" + by (meson TSP_def assms(1) assms(2) l9_39 not_bet_out not_col_permutation_5 tsp_distincts) + +lemma bet_cop__tsp: + assumes "\ Coplanar A B C X" and + "P \ Y" and + "Coplanar A B C P" and + "Bet X P Y" + shows "A B C TSP X Y" + using TSP_def assms(1) assms(2) assms(3) assms(4) bet_col bet_col1 col2_cop2__eq by fastforce + +lemma cop_out__osp: + assumes "\ Coplanar A B C X" and + "Coplanar A B C P" and + "P Out X Y" + shows "A B C OSP X Y" + by (meson OSP_def assms(1) assms(2) assms(3) l9_39 tsp_exists) + +lemma l9_19_3: + assumes "Coplanar A B C P" and + "Col X Y P" + shows "A B C OSP X Y \ (P Out X Y \ \ Coplanar A B C X)" + by (meson assms(1) assms(2) cop_out__osp l6_4_2 l9_18_3 not_col_permutation_5 osp__ncop1 osp__ncop2 tsp__nosp) + +lemma cop2_ts__tsp: + assumes "\ Coplanar A B C X" and "Coplanar A B C D" and + "Coplanar A B C E" and "D E TS X Y" + shows "A B C TSP X Y" +proof - + obtain T where P1: "Col T D E \ Bet X T Y" + using TS_def assms(4) by blast + have P2: "Coplanar A B C T" + using P1 assms(2) assms(3) assms(4) col_cop2__cop not_col_permutation_2 ts_distincts by blast + then show ?thesis + by (metis P1 TS_def assms(1) assms(4) bet_cop__tsp) +qed + +lemma cop2_os__osp: + assumes "\ Coplanar A B C X" and + "Coplanar A B C D" and + "Coplanar A B C E" and + "D E OS X Y" + shows "A B C OSP X Y" +proof - + obtain Z where P1: "D E TS X Z \ D E TS Y Z" + using OS_def assms(4) by blast + then have P2: "A B C TSP X Z" + using assms(1) assms(2) assms(3) cop2_ts__tsp by blast + then have P3: "A B C TSP Y Z" + by (meson P1 assms(2) assms(3) cop2_ts__tsp l9_2 tsp__ncop2) + then show ?thesis + using P2 l9_41_1 by blast +qed + +lemma cop3_tsp__ts: + assumes "D \ E" and + "Coplanar A B C D" and + "Coplanar A B C E" and + "Coplanar D E X Y" and + "A B C TSP X Y" + shows "D E TS X Y" + by (meson assms(1) assms(2) assms(3) assms(4) assms(5) col_cop2__cop cop2_os__osp cop_nts__os not_col_permutation_2 tsp__ncop1 tsp__ncop2 tsp__nosp) + +lemma cop3_osp__os: + assumes "D \ E" and + "Coplanar A B C D" and + "Coplanar A B C E" and + "Coplanar D E X Y" and + "A B C OSP X Y" + shows "D E OS X Y" + by (meson assms(1) assms(2) assms(3) assms(4) assms(5) col_cop2__cop cop2_ts__tsp cop_nts__os not_col_permutation_2 osp__ncop1 osp__ncop2 tsp__nosp) + +lemma cop_tsp__ex_cop2: + assumes (*"Coplanar A B C P" and*) + "A B C TSP D E" + shows "\ Q. 
(Coplanar A B C Q \ Coplanar D E P Q \ P \ Q)" +proof cases + assume "Col D E P" + then show ?thesis + by (meson ex_diff_cop ncop__ncols) +next + assume "\ Col D E P" + then obtain Q where "Coplanar A B C Q \ Bet D Q E \ \ Col D E P" + using TSP_def assms(1) by blast + then show ?thesis + using Col_perm bet_col ncop__ncols by blast +qed + +lemma cop_osp__ex_cop2: + assumes "Coplanar A B C P" and + "A B C OSP D E" + shows "\ Q. Coplanar A B C Q \ Coplanar D E P Q \ P \ Q" +proof cases + assume "Col D E P" + then show ?thesis + by (metis col_trivial_3 diff_col_ex ncop__ncols) +next + assume P1: "\ Col D E P" + obtain E' where P2: "Bet E P E' \ Cong P E' P E" + using segment_construction by blast + have P3: "\ Col D E' P" + by (metis P1 P2 bet_col bet_cong_eq between_symmetry col_permutation_5 l5_2 l6_16_1) + have P4: "A B C TSP D E'" + by (metis P2 P3 assms(1) assms(2) bet_cop__tsp l9_41_2 not_col_distincts osp__ncop2 osp_symmetry) + then have "\ Coplanar A B C D \ \ Coplanar A B C E' \ (\ T. Coplanar A B C T \ Bet D T E')" + by (simp add: TSP_def) + then obtain Q where P7: "Coplanar A B C Q \ Bet D Q E'" + by blast + then have "Coplanar D E' P Q" + using bet_col ncop__ncols ncoplanar_perm_5 by blast + then have "Coplanar D E P Q" + using Col_perm P2 P3 bet_col col_cop__cop ncoplanar_perm_5 not_col_distincts by blast + then show ?thesis + using P3 P7 bet_col col_permutation_5 by blast +qed + +lemma sac__coplanar: + assumes "Saccheri A B C D" + shows "Coplanar A B C D" + using Saccheri_def assms ncoplanar_perm_4 os__coplanar by blast + +subsection "Line reflexivity" + +subsubsection "Dimensionless" + +lemma Ch10_Goal1: + assumes "\ Coplanar D C B A" + shows "\ Coplanar A B C D" + by (simp add: assms ncoplanar_perm_23) + +lemma ex_sym: + "\ Y. (A B Perp X Y \ X = Y) \ (\ M. Col A B M \ M Midpoint X Y)" +proof cases + assume "Col A B X" + thus ?thesis + using l7_3_2 by blast +next + assume "\ Col A B X" + then obtain M0 where P1: "Col A B M0 \ A B Perp X M0" + using l8_18_existence by blast + obtain Z where P2: "M0 Midpoint X Z" + using symmetric_point_construction by blast + thus ?thesis + by (metis (full_types) P1 Perp_cases bet_col midpoint_bet perp_col) +qed + +lemma is_image_is_image_spec: + assumes "A \ B" + shows "P' P Reflect A B \ P' P ReflectL A B" + by (simp add: Reflect_def assms) + +lemma ex_sym1: + assumes "A \ B" + shows "\ Y. (A B Perp X Y \ X = Y) \ (\ M. Col A B M \ M Midpoint X Y \ X Y Reflect A B)" +proof cases + assume "Col A B X" + thus ?thesis + by (meson ReflectL_def Reflect_def assms l7_3_2) +next + assume P0: "\ Col A B X" + then obtain M0 where P1: "Col A B M0 \ A B Perp X M0" + using l8_18_existence by blast + obtain Z where P2: "M0 Midpoint X Z" + using symmetric_point_construction by blast + have P3: "A B Perp X Z" + proof cases + assume "X = Z" + thus ?thesis + using P1 P2 P0 midpoint_distinct by blast + next + assume "X \ Z" + then have P2: "X Z Perp A B" + using P1 P2 Perp_cases bet_col midpoint_bet perp_col by blast + show ?thesis + by (simp add: Tarski_neutral_dimensionless.Perp_perm Tarski_neutral_dimensionless_axioms P2) + qed + have P10: "(A B Perp X Z \ X = Z)" + by (simp add: P3) + have "\ M. 
Col A B M \ M Midpoint X Z \ X Z Reflect A B" + using P1 P2 P3 ReflectL_def assms is_image_is_image_spec l7_2 perp_right_comm by blast + thus ?thesis + using P3 by blast +qed + +lemma l10_2_uniqueness: + assumes "P1 P Reflect A B" and + "P2 P Reflect A B" + shows "P1 = P2" +proof cases + assume "A = B" + thus ?thesis + using Reflect_def assms(1) assms(2) symmetric_point_uniqueness by auto +next + assume P1: "A \ B" + have P1A: "P1 P ReflectL A B" + using P1 assms(1) is_image_is_image_spec by auto + then have P1B: "A B Perp P P1 \ P = P1" + using ReflectL_def by blast + have P2A: "P2 P ReflectL A B" + using P1 assms(2) is_image_is_image_spec by auto + then have P2B: "A B Perp P P2 \ P = P2" + using ReflectL_def by blast + obtain X where R1: "X Midpoint P P1 \ Col A B X" + by (metis ReflectL_def assms(1) col_trivial_1 is_image_is_image_spec midpoint_existence) + obtain Y where R2: "Y Midpoint P P2 \ Col A B Y" + by (metis ReflectL_def assms(2) col_trivial_1 is_image_is_image_spec midpoint_existence) + { + assume Q1:"(A B Perp P P1 \ A B Perp P P2)" + have S1: "P \ X" + proof - + { + assume "P = X" + then have "P = P1" + using R1 is_midpoint_id by blast + then have "A B Perp P P" + using Q1 by blast + then have "False" + using perp_distinct by blast + } + thus ?thesis by blast + qed + then have "P1 = P2" + by (smt Perp_cases Q1 \\thesis. (\X. X Midpoint P P1 \ Col A B X \ thesis) \ thesis\ \\thesis. (\Y. Y Midpoint P P2 \ Col A B Y \ thesis) \ thesis\ col_permutation_1 l7_2 l7_9 l8_18_uniqueness midpoint_col perp_col perp_not_col2) + } + then have T1: "(A B Perp P P1 \ A B Perp P P2) \ P1 = P2" by blast + have T2: "(P = P1 \ A B Perp P P2) \ P1 = P2" + by (metis R1 R2 col3 col_trivial_2 col_trivial_3 midpoint_col midpoint_distinct_1 midpoint_distinct_2 perp_not_col2) + have T3: "(P = P2 \ A B Perp P P1) \ P1 = P2" + by (metis R1 R2 col_trivial_2 midpoint_col midpoint_distinct_3 perp_col2 perp_not_col2) + thus ?thesis + using T1 T2 T3 P1B P2B by blast +qed + +lemma l10_2_uniqueness_spec: + assumes "P1 P ReflectL A B" and + "P2 P ReflectL A B" + shows "P1 = P2" +proof - + have "A B Perp P P1 \ P = P1" + using ReflectL_def assms(1) by blast + moreover obtain X1 where "X1 Midpoint P P1 \ Col A B X1" + using ReflectL_def assms(1) by blast + moreover have "A B Perp P P2 \ P = P2" + using ReflectL_def assms(2) by blast + moreover obtain X2 where "X2 Midpoint P P2 \ Col A B X2" + using ReflectL_def assms(2) by blast + ultimately show ?thesis + by (smt col_permutation_1 l8_16_1 l8_18_uniqueness midpoint_col midpoint_distinct_3 perp_col1 symmetric_point_uniqueness) +qed + +lemma l10_2_existence_spec: + "\ P'. P' P ReflectL A B" +proof cases + assume "Col A B P" + thus ?thesis + using ReflectL_def l7_3_2 by blast +next + assume "\ Col A B P" + then obtain X where "Col A B X \ A B Perp P X" + using l8_18_existence by blast + moreover obtain P' where "X Midpoint P P'" + using symmetric_point_construction by blast + ultimately show ?thesis + using ReflectL_def bet_col midpoint_bet perp_col1 by blast +qed + +lemma l10_2_existence: + "\ P'. 
P' P Reflect A B" + by (metis Reflect_def l10_2_existence_spec symmetric_point_construction) + +lemma l10_4_spec: + assumes "P P' ReflectL A B" + shows "P' P ReflectL A B" +proof - + obtain X where "X Midpoint P P' \ Col A B X" + using ReflectL_def assms l7_2 by blast + thus ?thesis + using Perp_cases ReflectL_def assms by auto +qed + +lemma l10_4: + assumes "P P' Reflect A B" + shows "P' P Reflect A B" + using Reflect_def Tarski_neutral_dimensionless.l7_2 Tarski_neutral_dimensionless_axioms assms l10_4_spec by fastforce + +lemma l10_5: + assumes "P' P Reflect A B" and + "P'' P' Reflect A B" + shows "P = P''" + by (meson assms(1) assms(2) l10_2_uniqueness l10_4) + +lemma l10_6_uniqueness: + assumes "P P1 Reflect A B" and + "P P2 Reflect A B" + shows "P1 = P2" + using assms(1) assms(2) l10_4 l10_5 by blast + +lemma l10_6_uniqueness_spec: + assumes "P P1 ReflectL A B" and + "P P2 ReflectL A B" + shows "P1 = P2" + using assms(1) assms(2) l10_2_uniqueness_spec l10_4_spec by blast + +lemma l10_6_existence_spec: + assumes "A \ B" + shows "\ P. P' P ReflectL A B" + using l10_2_existence_spec l10_4_spec by blast + +lemma l10_6_existence: + "\ P. P' P Reflect A B" + using l10_2_existence l10_4 by blast + +lemma l10_7: + assumes "P' P Reflect A B" and + "Q' Q Reflect A B" and + "P' = Q'" + shows "P = Q" + using assms(1) assms(2) assms(3) l10_6_uniqueness by blast + +lemma l10_8: + assumes "P P Reflect A B" + shows "Col P A B" + by (metis Col_perm assms col_trivial_2 ex_sym1 l10_6_uniqueness l7_3) + +lemma col__refl: + assumes "Col P A B" + shows "P P ReflectL A B" + using ReflectL_def assms col_permutation_1 l7_3_2 by blast + +lemma is_image_col_cong: + assumes "A \ B" and + "P P' Reflect A B" and + "Col A B X" + shows "Cong P X P' X" +proof - + have P1: "P P' ReflectL A B" + using assms(1) assms(2) is_image_is_image_spec by blast + obtain M0 where P2: "M0 Midpoint P' P \ Col A B M0" + using P1 ReflectL_def by blast + have "A B Perp P' P \ P' = P" + using P1 ReflectL_def by auto + moreover + { + assume S1: "A B Perp P' P" + then have "A \ B \ P' \ P" + using perp_distinct by blast + have S2: "M0 = X \ Cong P X P' X" + using P2 cong_4312 midpoint_cong by blast + { + assume "M0 \ X" + then have "M0 X Perp P' P" + using P2 S1 assms(3) perp_col2 by blast + then have "\ Col A B P \ Per P M0 X" + by (metis Col_perm P2 S1 colx l8_2 midpoint_col midpoint_distinct_1 per_col perp_col1 perp_not_col2 perp_per_1) + then have "Cong P X P' X" + using P2 cong_commutativity l7_2 l8_2 per_double_cong by blast + } + then have "Cong P X P' X" + using S2 by blast + } + then have "A B Perp P' P \ Cong P X P' X" by blast + moreover + { + assume "P = P'" + then have "Cong P X P' X" + by (simp add: cong_reflexivity) + } + ultimately show ?thesis by blast +qed + +lemma is_image_spec_col_cong: + assumes "P P' ReflectL A B" and + "Col A B X" + shows "Cong P X P' X" + by (metis Col_def Reflect_def assms(1) assms(2) between_trivial col__refl cong_reflexivity is_image_col_cong l10_6_uniqueness_spec) + +lemma image_id: + assumes "A \ B" and + "Col A B T" and + "T T' Reflect A B" + shows "T = T'" + using assms(1) assms(2) assms(3) cong_diff_4 is_image_col_cong by blast + +lemma osym_not_col: + assumes "P P' Reflect A B" and + "\ Col A B P" + shows "\ Col A B P'" + using assms(1) assms(2) l10_4 local.image_id not_col_distincts by blast + +lemma midpoint_preserves_image: + assumes "A \ B" and + "Col A B M" and + "P P' Reflect A B" and + "M Midpoint P Q" and + "M Midpoint P' Q'" + shows "Q Q' Reflect A B" +proof - + obtain X where P1: 
"X Midpoint P' P \ Col A B X" + using ReflectL_def assms(1) assms(3) is_image_is_image_spec by blast + { + assume S1: "A B Perp P' P" + obtain Y where S2: "M Midpoint X Y" + using symmetric_point_construction by blast + have S3: "Y Midpoint Q Q'" + proof - + have R4: "X Midpoint P P'" + by (simp add: P1 l7_2) + thus ?thesis + using assms(4) assms(5) S2 symmetry_preserves_midpoint by blast + qed + have S4: "P \ P'" + using S1 perp_not_eq_2 by blast + then have S5: "Q \ Q'" + using Tarski_neutral_dimensionless.l7_9 Tarski_neutral_dimensionless_axioms assms(4) assms(5) by fastforce + have S6: "Y Midpoint Q' Q \ Col A B Y" + by (metis P1 S2 S3 assms(2) colx l7_2 midpoint_col midpoint_distinct_1) + have S7: "A B Perp Q' Q \ Q = Q'" + proof - + have R3: "Per M Y Q" + proof - + have S1: "Y Midpoint Q Q'" + using S3 by auto + have "Cong M Q M Q'" + using assms(1) assms(2) assms(3) assms(4) assms(5) cong_commutativity is_image_col_cong l7_16 l7_3_2 by blast + thus ?thesis + using Per_def S1 by blast + qed + { + have "X = Y \ (A B Perp Q' Q \ Q = Q')" + by (metis P1 Perp_cases S1 S2 S6 assms(5) l7_3 l7_9_bis) + { + assume "X \ Y" + then have "Y PerpAt M Y Y Q" + using R3 S2 S3 S5 midpoint_distinct_1 per_perp_in by blast + then have V1: "Y Y Perp Y Q \ M Y Perp Y Q" + by (simp add: perp_in_perp_bis) + { + have "Y Y Perp Y Q \ A B Perp Q' Q \ Q = Q'" + using perp_not_eq_1 by blast + { + assume T1: "M Y Perp Y Q" + have T2: "Y Q Perp A B" + proof cases + assume "A = M" + thus ?thesis + using Perp_cases S6 T1 assms(1) col_permutation_5 perp_col by blast + next + assume "A \ M" + thus ?thesis + by (smt S6 T1 assms(1) assms(2) col2__eq col_transitivity_2 perp_col0 perp_not_eq_1) + qed + have "A B Perp Q' Q \ Q = Q'" + by (metis S3 T2 midpoint_col not_col_distincts perp_col0) + } + then have "M Y Perp Y Q \ A B Perp Q' Q \ Q = Q'" by blast + } + then have "A B Perp Q' Q \ Q = Q'" + using V1 perp_distinct by blast + } + then have "X \ Y \ (A B Perp Q' Q \ Q = Q')" by blast + } + thus ?thesis + by (metis P1 Perp_cases S1 S2 S6 assms(5) l7_3 l7_9_bis) + qed + then have "Q Q' ReflectL A B" + using ReflectL_def S6 by blast + } + then have "A B Perp P' P \ Q Q' ReflectL A B" by blast + moreover + { + assume "P = P'" + then have "Q Q' ReflectL A B" + by (metis P1 assms(2) assms(4) assms(5) col__refl col_permutation_2 colx midpoint_col midpoint_distinct_3 symmetric_point_uniqueness) + } + ultimately show ?thesis + using ReflectL_def assms(1) assms(3) is_image_is_image_spec by auto +qed + +lemma image_in_is_image_spec: + assumes "M ReflectLAt P P' A B" + shows "P P' ReflectL A B" +proof - + have P1: "M Midpoint P' P" + using ReflectLAt_def assms by blast + have P2: "Col A B M" + using ReflectLAt_def assms by blast + have "A B Perp P' P \ P' = P" + using ReflectLAt_def assms by blast + thus ?thesis using P1 P2 + using ReflectL_def by blast +qed + +lemma image_in_gen_is_image: + assumes "M ReflectAt P P' A B" + shows "P P' Reflect A B" + using ReflectAt_def Reflect_def assms image_in_is_image_spec by auto + +lemma image_image_in: + assumes "P \ P'" and + "P P' ReflectL A B" and + "Col A B M" and + "Col P M P'" + shows "M ReflectLAt P P' A B" +proof - + obtain M' where P1: "M' Midpoint P' P \ Col A B M'" + using ReflectL_def assms(2) by blast + have Q1: "P M' Perp A B" + by (metis Col_cases P1 Perp_perm ReflectL_def assms(1) assms(2) bet_col cong_diff_3 midpoint_bet midpoint_cong not_cong_4321 perp_col1) + { + assume R1: "A B Perp P' P" + have R3: "P \ M'" + using Q1 perp_not_eq_1 by auto + have R4: "A B Perp P' P" + 
by (simp add: R1) + have R5: "Col P P' M'" + using P1 midpoint_col not_col_permutation_3 by blast + have R6: "M' Midpoint P' P" + by (simp add: P1) + have R7: "\ Col A B P" + using assms(1) assms(2) col__refl col_permutation_2 l10_2_uniqueness_spec l10_4_spec by blast + have R8: "P \ P'" + by (simp add: assms(1)) + have R9: "Col A B M'" + by (simp add: P1) + have R10: "Col A B M" + by (simp add: assms(3)) + have R11: "Col P P' M'" + by (simp add: R5) + have R12: "Col P P' M" + using Col_perm assms(4) by blast + have "M = M'" + proof cases + assume S1: "A = M'" + have "Per P M' A" + by (simp add: S1 l8_5) + thus ?thesis using l6_21 R8 R9 R10 R11 R12 + using R7 by blast + next + assume "A \ M'" + thus ?thesis + using R10 R12 R5 R7 R8 R9 l6_21 by blast + qed + then have "M Midpoint P' P" + using R6 by blast + } + then have Q2: "A B Perp P' P \ M Midpoint P' P" by blast + have Q3: "P' = P \ M Midpoint P' P" + using assms(1) by auto + have Q4: "A B Perp P' P \ P' = P" + using ReflectL_def assms(2) by auto + then have "M Midpoint P' P" + using Q2 Q3 by blast + thus ?thesis + by (simp add: ReflectLAt_def Q4 assms(3)) +qed + +lemma image_in_col: + assumes "Y ReflectLAt P P' A B" + shows "Col P P' Y" + using Col_perm ReflectLAt_def assms midpoint_col by blast + +lemma is_image_spec_rev: + assumes "P P' ReflectL A B" + shows "P P' ReflectL B A" +proof - + obtain M0 where P1: "M0 Midpoint P' P \ Col A B M0" + using ReflectL_def assms by blast + have P2: "Col B A M0" + using Col_cases P1 by blast + have "A B Perp P' P \ P' = P" + using ReflectL_def assms by blast + thus ?thesis + using P1 P2 Perp_cases ReflectL_def by auto +qed + +lemma is_image_rev: + assumes "P P' Reflect A B" + shows "P P' Reflect B A" + using Reflect_def assms is_image_spec_rev by auto + +lemma midpoint_preserves_per: + assumes "Per A B C" and + "M Midpoint A A1" and + "M Midpoint B B1" and + "M Midpoint C C1" + shows "Per A1 B1 C1" +proof - + obtain C' where P1: "B Midpoint C C' \ Cong A C A C'" + using Per_def assms(1) by blast + obtain C1' where P2: "M Midpoint C' C1'" + using symmetric_point_construction by blast + thus ?thesis + by (meson P1 Per_def assms(2) assms(3) assms(4) l7_16 symmetry_preserves_midpoint) +qed + +lemma col__image_spec: + assumes "Col A B X" + shows "X X ReflectL A B" + by (simp add: assms col__refl col_permutation_2) + +lemma image_triv: + "A A Reflect A B" + by (simp add: Reflect_def col__refl col_trivial_1 l7_3_2) + +lemma cong_midpoint__image: + assumes "Cong A X A Y" and + "B Midpoint X Y" + shows "Y X Reflect A B" +proof cases + assume "A = B" + thus ?thesis + by (simp add: Reflect_def assms(2)) +next + assume S0: "A \ B" + { + assume S1: "X \ Y" + then have "X Y Perp A B" + proof - + have T1: "B \ X" + using S1 assms(2) midpoint_distinct_1 by blast + have T2: "B \ Y" + using S1 assms(2) midpoint_not_midpoint by blast + have "Per A B X" + using Per_def assms(1) assms(2) by auto + thus ?thesis + using S0 S1 T1 T2 assms(2) col_per_perp midpoint_col by auto + qed + then have "A B Perp X Y \ X = Y" + using Perp_perm by blast + then have "Y X Reflect A B" + using ReflectL_def S0 assms(2) col_trivial_2 is_image_is_image_spec by blast + } + then have "X \ Y \ Y X Reflect A B" by blast + thus ?thesis + using assms(2) image_triv is_image_rev l7_3 by blast +qed + + +lemma col_image_spec__eq: + assumes "Col A B P" and + "P P' ReflectL A B" + shows "P = P'" + using assms(1) assms(2) col__image_spec l10_2_uniqueness_spec l10_4_spec by blast + +lemma image_spec_triv: + "A A ReflectL B B" + using col__image_spec 
not_col_distincts by blast
+
+lemma image_spec__eq:
+ assumes "P P' ReflectL A A"
+ shows "P = P'"
+ using assms col_image_spec__eq not_col_distincts by blast
+
+lemma image__midpoint:
+ assumes "P P' Reflect A A"
+ shows "A Midpoint P' P"
+ using Reflect_def assms by auto
+
+lemma is_image_spec_dec:
+ "A B ReflectL C D \<or> \<not> A B ReflectL C D"
+ by simp
+
+lemma l10_14:
+ assumes "P \<noteq> P'" and
+ "A \<noteq> B" and
+ "P P' Reflect A B"
+ shows "A B TS P P'"
+proof -
+ have P1: "P P' ReflectL A B"
+ using assms(2) assms(3) is_image_is_image_spec by blast
+ then obtain M0 where "M0 Midpoint P' P \<and> Col A B M0"
+ using ReflectL_def by blast
+ then have "A B Perp P' P \<longrightarrow> A B TS P P'"
+ by (meson TS_def assms(1) assms(2) assms(3) between_symmetry col_permutation_2 local.image_id midpoint_bet osym_not_col)
+ thus ?thesis
+ using assms(1) P1 ReflectL_def by blast
+qed
+
+lemma l10_15:
+ assumes "Col A B C" and
+ "\<not> Col A B P"
+ shows "\<exists> Q. A B Perp Q C \<and> A B OS P Q"
+proof -
+ have P1: "A \<noteq> B"
+ using assms(2) col_trivial_1 by auto
+ obtain X where P2: "A B TS P X"
+ using assms(2) col_permutation_1 l9_10 by blast
+ {
+ assume Q1: "A = C"
+ obtain Q where Q2: "\<exists> T. A B Perp Q A \<and> Col A B T \<and> Bet X T Q"
+ using P1 l8_21 by blast
+ then obtain T where "A B Perp Q A \<and> Col A B T \<and> Bet X T Q" by blast
+ then have "A B TS Q X"
+ by (meson P2 TS_def between_symmetry col_permutation_2 perp_not_col)
+ then have Q5: "A B OS P Q"
+ using P2 l9_8_1 by blast
+ then have "\<exists> Q. A B Perp Q C \<and> A B OS P Q"
+ using Q1 Q2 by blast
+ }
+ then have P3: "A = C \<longrightarrow> (\<exists> Q. A B Perp Q C \<and> A B OS P Q)" by blast
+ {
+ assume Q1: "A \<noteq> C"
+ then obtain Q where Q2: "\<exists> T. C A Perp Q C \<and> Col C A T \<and> Bet X T Q"
+ using l8_21 by presburger
+ then obtain T where Q3: "C A Perp Q C \<and> Col C A T \<and> Bet X T Q" by blast
+ have Q4: "A B Perp Q C"
+ using NCol_perm P1 Q2 assms(1) col_trivial_2 perp_col2 by blast
+ have "A B TS Q X"
+ proof -
+ have R1: "\<not> Col Q A B"
+ using Col_perm P1 Q2 assms(1) col_trivial_2 colx perp_not_col by blast
+ have R2: "\<not> Col X A B"
+ using P2 TS_def by auto
+ have R3: "Col T A B"
+ by (metis Q1 Q3 assms(1) col_trivial_2 colx not_col_permutation_1)
+ have "Bet Q T X"
+ using Bet_cases Q3 by blast
+ then have "\<exists> T. Col T A B \<and> Bet Q T X"
+ using R3 by blast
+ thus ?thesis using R1 R2
+ by (simp add: TS_def)
+ qed
+ then have "A B OS P Q"
+ using P2 l9_8_1 by blast
+ then have "\<exists> Q. A B Perp Q C \<and> A B OS P Q"
+ using Q4 by blast
+ }
+ thus ?thesis using P3 by blast
+qed
+
+lemma ex_per_cong:
+ assumes "A \<noteq> B" and
+ "X \<noteq> Y" and
+ "Col A B C" and
+ "\<not> Col A B D"
+ shows "\<exists> P. Per P C A \<and> Cong P C X Y \<and> A B OS P D"
+proof -
+ obtain Q where P1: "A B Perp Q C \<and> A B OS D Q"
+ using assms(3) assms(4) l10_15 by blast
+ obtain P where P2: "C Out Q P \<and> Cong C P X Y"
+ by (metis P1 assms(2) perp_not_eq_2 segment_construction_3)
+ have P3: "Per P C A"
+ using P1 P2 assms(3) col_trivial_3 l8_16_1 l8_3 out_col by blast
+ have "A B OS P D"
+ using P1 P2 assms(3) one_side_symmetry os_out_os by blast
+ thus ?thesis
+ using P2 P3 cong_left_commutativity by blast
+qed
+
+lemma exists_cong_per:
+ "\<exists> C. Per A B C \<and> Cong B C X Y"
+proof cases
+ assume "A = B"
+ thus ?thesis
+ by (meson Tarski_neutral_dimensionless.l8_5 Tarski_neutral_dimensionless_axioms l8_2 segment_construction)
+next
+ assume "A \<noteq> B"
+ thus ?thesis
+ by (metis Perp_perm bet_col between_trivial l8_16_1 l8_21 segment_construction)
+qed
+
+subsubsection "Upper dim 2"
+
+lemma upper_dim_implies_per2__col:
+ assumes "upper_dim_axiom"
+ shows "\<forall> A B C X. 
(Per A X C \ X \ C \ Per B X C) \ Col A B X" +proof - + { + fix A B C X + assume "Per A X C \ X \ C \ Per B X C" + moreover then obtain C' where "X Midpoint C C' \ Cong A C A C'" + using Per_def by blast + ultimately have "Col A B X" + by (smt Col_def assms midpoint_cong midpoint_distinct_2 not_cong_2134 per_double_cong upper_dim_axiom_def) + } + then show ?thesis by blast +qed + +lemma upper_dim_implies_col_perp2__col: + assumes "upper_dim_axiom" + shows "\ A B X Y P. (Col A B P \ A B Perp X P \ P A Perp Y P) \ Col Y X P" +proof - + { + fix A B X Y P + assume H1: "Col A B P \ A B Perp X P \ P A Perp Y P" + then have H2: "P \ A" + using perp_not_eq_1 by blast + have "Col Y X P" + proof - + have T1: "Per Y P A" + using H1 l8_2 perp_per_1 by blast + moreover have "Per X P A" + using H1 col_trivial_3 l8_16_1 by blast + then show ?thesis using T1 H2 + using assms upper_dim_implies_per2__col by blast + qed + } + then show ?thesis by blast +qed + +lemma upper_dim_implies_perp2__col: + assumes "upper_dim_axiom" + shows "\ X Y Z A B. (X Y Perp A B \ X Z Perp A B) \ Col X Y Z" +proof - + { + fix X Y Z A B + assume H1: "X Y Perp A B \ X Z Perp A B" + then have H1A: "X Y Perp A B" by blast + have H1B: "X Z Perp A B" using H1 by blast + obtain C where H2: "C PerpAt X Y A B" + using H1 Perp_def by blast + obtain C' where H3: "C' PerpAt X Z A B" + using H1 Perp_def by blast + have "Col X Y Z" + proof cases + assume H2: "Col A B X" + { + assume "X = A" + then have "Col X Y Z" using upper_dim_implies_col_perp2__col + by (metis H1 H2 Perp_cases assms col_permutation_1) + } + then have P1: "X = A \ Col X Y Z" by blast + { + assume P2: "X \ A" + then have P3: "A B Perp X Y" using perp_sym + using H1 Perp_perm by blast + have "Col A B X" + by (simp add: H2) + then have P4: "A X Perp X Y" using perp_col + using P2 P3 by auto + have P5: "A X Perp X Z" + by (metis H1 H2 P2 Perp_perm col_trivial_3 perp_col0) + have P6: "Col Y Z X" + proof - + have Q1: "upper_dim_axiom" + by (simp add: assms) + have Q2: "Per Y X A" + using P4 Perp_cases perp_per_2 by blast + have "Per Z X A" + by (meson P5 Tarski_neutral_dimensionless.Perp_cases Tarski_neutral_dimensionless_axioms perp_per_2) + then show ?thesis using Q1 Q2 P2 + using upper_dim_implies_per2__col by blast + qed + then have "Col X Y Z" + using Col_perm by blast + } + then show ?thesis + using P1 by blast + next + assume T1: "\ Col A B X" + obtain Y0 where Q3: "Y0 PerpAt X Y A B" + using H1 Perp_def by blast + obtain Z0 where Q4: "Z0 PerpAt X Z A B" + using Perp_def H1 by blast + have Q5: "X Y0 Perp A B" + proof - + have R1: "X \ Y0" + using Q3 T1 perp_in_col by blast + have R2: "X Y Perp A B" + by (simp add: H1A) + then show ?thesis using R1 + using Q3 perp_col perp_in_col by blast + qed + have "X Z0 Perp A B" + by (metis H1B Q4 T1 perp_col perp_in_col) + then have Q7: "Y0 = Z0" + by (meson Q3 Q4 Q5 T1 Tarski_neutral_dimensionless.Perp_perm Tarski_neutral_dimensionless_axioms l8_18_uniqueness perp_in_col) + have "Col X Y Z" + proof - + have "X \ Y0" + using Q5 perp_distinct by auto + moreover have "Col X Y0 Y" + using Q3 not_col_permutation_5 perp_in_col by blast + moreover have "Col X Y0 Z" + using Q4 Q7 col_permutation_5 perp_in_col by blast + ultimately show ?thesis + using col_transitivity_1 by blast + qed + then show ?thesis using l8_18_uniqueness + by (smt H1 H2 Perp_cases T1 col_trivial_3 perp_col1 perp_in_col perp_not_col) + qed + } + then show ?thesis by blast +qed + +lemma upper_dim_implies_not_two_sides_one_side_aux: + assumes "upper_dim_axiom" + shows "\ 
A B X Y PX. (A \ B \ PX \ A \ A B Perp X PX \ Col A B PX \ \ Col X A B \ \ Col Y A B \ \ A B TS X Y) \ A B OS X Y" +proof - + { + fix A B X Y PX + assume H1: "A \ B \ PX \ A \ A B Perp X PX \ Col A B PX \ \ Col X A B \ \ Col Y A B \ \ A B TS X Y" + have H1A: "A \ B" using H1 by simp + have H1B: "PX \ A" using H1 by simp + have H1C: "A B Perp X PX" using H1 by simp + have H1D: "Col A B PX" using H1 by simp + have H1E: "\ Col X A B" using H1 by simp + have H1F: "\ Col Y A B" using H1 by simp + have H1G: "\ A B TS X Y" using H1 by simp + have "\ P T. PX A Perp P PX \ Col PX A T \ Bet Y T P" + using H1B l8_21 by blast + then obtain P T where T1: "PX A Perp P PX \ Col PX A T \ Bet Y T P" + by blast + have J1: "PX A Perp P PX" using T1 by blast + have J2: "Col PX A T" using T1 by blast + have J3: "Bet Y T P" using T1 by blast + have P9: "Col P X PX" using upper_dim_implies_col_perp2__col + using H1C H1D J1 assms by blast + have J4: "\ Col P A B" + using H1A H1D T1 col_trivial_2 colx not_col_permutation_3 perp_not_col by blast + have J5: "PX A TS P Y" + proof - + have f1: "Col PX A B" + using H1D not_col_permutation_1 by blast + then have f2: "Col B PX A" + using not_col_permutation_1 by blast + have f3: "\p. (T = A \ Col p A PX) \ \ Col p A T" + by (metis J2 l6_16_1) + have f4: "Col T PX A" + using J2 not_col_permutation_1 by blast + have f5: "\p. Col p PX B \ \ Col p PX A" + using f2 by (meson H1B l6_16_1) + have f6: "\p. (B = PX \ Col p B A) \ \ Col p B PX" + using H1D l6_16_1 by blast + have f7: "\p pa. ((B = PX \ Col p PX pa) \ \ Col p PX B) \ \ Col pa PX A" + using f5 by (metis l6_16_1) + have f8: "\p. ((T = A \ B = PX) \ Col p A B) \ \ Col p A PX" + using f2 by (metis H1B l6_16_1 not_col_permutation_1) + have "Col B T PX" + using f5 f4 not_col_permutation_1 by blast + then have f9: "\p. (T = PX \ Col p T B) \ \ Col p T PX" + using l6_16_1 by blast + have "B = PX \ \ Col Y PX A \ \ Col P PX A" + using f1 by (metis (no_types) H1B H1F J4 l6_16_1 not_col_permutation_1) + then show ?thesis + using f9 f8 f7 f6 f5 f4 f3 by (metis (no_types) H1B H1F J3 J4 TS_def l9_2 not_col_permutation_1) + qed + have J6: "X \ PX" + using H1 perp_not_eq_2 by blast + have J7: "P \ X" + using H1A H1D H1G J5 col_preserves_two_sides col_trivial_2 not_col_permutation_1 by blast + have J8: "Bet X PX P \ PX Out X P \ \ Col X PX P" + using l6_4_2 by blast + have J9: "PX A TS P X" + by (metis H1A H1D H1G J5 J6 J8 Out_cases P9 TS_def bet__ts between_symmetry col_permutation_1 col_preserves_two_sides col_trivial_2 l9_5) + then have "A B OS X Y" + by (meson H1A H1D J5 col2_os__os col_trivial_2 l9_2 l9_8_1 not_col_permutation_1) + } + then show ?thesis by blast +qed + +lemma upper_dim_implies_not_two_sides_one_side: + assumes "upper_dim_axiom" + shows "\ A B X Y. 
(\<not> Col X A B \<and> \<not> Col Y A B \<and> \<not> A B TS X Y) \<longrightarrow> A B OS X Y"
+proof -
+ {
+ fix A B X Y
+ assume H1: "\<not> Col X A B \<and> \<not> Col Y A B \<and> \<not> A B TS X Y"
+ have H1A: "\<not> Col X A B" using H1 by simp
+ have H1B: "\<not> Col Y A B" using H1 by simp
+ have H1C: "\<not> A B TS X Y" using H1 by simp
+ have P1: "A \<noteq> B"
+ using H1A col_trivial_2 by blast
+ obtain PX where P2: "Col A B PX \<and> A B Perp X PX"
+ using Col_cases H1 l8_18_existence by blast
+ have "A B OS X Y"
+ proof cases
+ assume H5: "PX = A"
+ have "B A OS X Y"
+ proof -
+ have F1: "B A Perp X A"
+ using P2 Perp_perm H5 by blast
+ have F2: "Col B A A"
+ using not_col_distincts by blast
+ have F3: "\<not> Col X B A"
+ using Col_cases H1A by blast
+ have F4: "\<not> Col Y B A"
+ using Col_cases H1B by blast
+ have "\<not> B A TS X Y"
+ using H1C invert_two_sides by blast
+ then show ?thesis
+ by (metis F1 F3 F4 assms col_trivial_2 upper_dim_implies_not_two_sides_one_side_aux)
+ qed
+ then show ?thesis
+ by (simp add: invert_one_side)
+ next
+ assume "PX \<noteq> A"
+ then show ?thesis
+ using H1 P1 P2 assms upper_dim_implies_not_two_sides_one_side_aux by blast
+ qed
+ }
+ then show ?thesis by blast
+qed
+
+lemma upper_dim_implies_not_one_side_two_sides:
+ assumes "upper_dim_axiom"
+ shows "\<forall> A B X Y. (\<not> Col X A B \<and> \<not> Col Y A B \<and> \<not> A B OS X Y) \<longrightarrow> A B TS X Y"
+ using assms upper_dim_implies_not_two_sides_one_side by blast
+
+lemma upper_dim_implies_one_or_two_sides:
+ assumes "upper_dim_axiom"
+ shows "\<forall> A B X Y. (\<not> Col X A B \<and> \<not> Col Y A B) \<longrightarrow> (A B TS X Y \<or> A B OS X Y)"
+ using assms upper_dim_implies_not_two_sides_one_side by blast
+
+lemma upper_dim_implies_all_coplanar:
+ assumes "upper_dim_axiom"
+ shows "all_coplanar_axiom"
+ using all_coplanar_axiom_def assms upper_dim_axiom_def by auto
+
+lemma all_coplanar_implies_upper_dim:
+ assumes "all_coplanar_axiom"
+ shows "upper_dim_axiom"
+ using all_coplanar_axiom_def assms upper_dim_axiom_def by auto
+
+lemma all_coplanar_upper_dim:
+ shows "all_coplanar_axiom \<longleftrightarrow> upper_dim_axiom"
+ using all_coplanar_implies_upper_dim upper_dim_implies_all_coplanar by auto
+
+lemma upper_dim_stab:
+ shows "\<not> \<not> upper_dim_axiom \<longrightarrow> upper_dim_axiom" by blast
+
+lemma cop__cong_on_bissect:
+ assumes "Coplanar A B X P" and
+ "M Midpoint A B" and
+ "M PerpAt A B P M" and
+ "Cong X A X B"
+ shows "Col M P X"
+proof -
+ have P1: "X = M \<or> \<not> Col A B X \<and> M PerpAt X M A B"
+ using assms(2) assms(3) assms(4) cong_commutativity cong_perp_or_mid perp_in_distinct by blast
+ {
+ assume H1: "\<not> Col A B X \<and> M PerpAt X M A B"
+ then have Q1: "X M Perp A B"
+ using perp_in_perp by blast
+ have Q2: "A B Perp P M"
+ using assms(3) perp_in_perp by blast
+ have P2: "Col M A B"
+ by (simp add: assms(2) midpoint_col)
+ then have "Col M P X" using cop_perp2__col
+ by (meson Perp_perm Q1 Q2 assms(1) coplanar_perm_1)
+ }
+ then show ?thesis
+ using P1 not_col_distincts by blast
+qed
+
+lemma cong_cop_mid_perp__col:
+ assumes "Coplanar A B X P" and
+ "Cong A X B X" and
+ "M Midpoint A B" and
+ "A B Perp P M"
+ shows "Col M P X"
+proof -
+ have "M PerpAt A B P M"
+ using Col_perm assms(3) assms(4) bet_col l8_15_1 midpoint_bet by blast
+ then show ?thesis
+ using assms(1) assms(2) assms(3) cop__cong_on_bissect not_cong_2143 by blast
+qed
+
+lemma cop_image_in2__col:
+ assumes "Coplanar A B P Q" and
+ "M ReflectLAt P P' A B" and
+ "M ReflectLAt Q Q' A B"
+ shows "Col M P Q"
+proof -
+ have P1: "P P' ReflectL A B"
+ using assms(2) image_in_is_image_spec by auto
+ then have P2: "A B Perp P' P \<or> P' = P"
+ using ReflectL_def by auto
+ have P3: "Q Q' ReflectL A B"
+ using assms(3) 
image_in_is_image_spec by blast + then have P4: "A B Perp Q' Q \ Q' = Q" + using ReflectL_def by auto + { + assume S1: "A B Perp P' P \ A B Perp Q' Q" + { + assume T1: "A = M" + have T2: "Per B A P" + by (metis P1 Perp_perm S1 T1 assms(2) image_in_col is_image_is_image_spec l10_14 perp_col1 perp_distinct perp_per_1 ts_distincts) + have T3: "Per B A Q" + by (metis S1 T1 assms(3) image_in_col l8_5 perp_col1 perp_per_1 perp_right_comm) + have T4: "Coplanar B P Q A" + using assms(1) ncoplanar_perm_18 by blast + have T5: "B \ A" + using S1 perp_distinct by blast + have T6: "Per P A B" + by (simp add: T2 l8_2) + have T7: "Per Q A B" + using Per_cases T3 by blast + then have "Col P Q A" using T4 T5 T6 + using cop_per2__col by blast + then have "Col A P Q" + using not_col_permutation_1 by blast + then have "Col M P Q" + using T1 by blast + } + then have S2: "A = M \ Col M P Q" by blast + { + assume D0: "A \ M" + have D1: "Per A M P" + proof - + have E1: "M Midpoint P P'" + using ReflectLAt_def assms(2) l7_2 by blast + have "Cong P A P' A" + using P1 col_trivial_3 is_image_spec_col_cong by blast + then have "Cong A P A P'" + using Cong_perm by blast + then show ?thesis + using E1 Per_def by blast + qed + have D2: "Per A M Q" + proof - + have E2: "M Midpoint Q Q'" + using ReflectLAt_def assms(3) l7_2 by blast + have "Cong A Q A Q'" + using P3 col_trivial_3 cong_commutativity is_image_spec_col_cong by blast + then show ?thesis + using E2 Per_def by blast + qed + have "Col P Q M" + proof - + have W1: "Coplanar P Q A B" + using assms(1) ncoplanar_perm_16 by blast + have W2: "A \ B" + using S1 perp_distinct by blast + have "Col A B M" + using ReflectLAt_def assms(2) by blast + then have "Coplanar P Q A M" + using W1 W2 col2_cop__cop col_trivial_3 by blast + then have V1: "Coplanar A P Q M" + using ncoplanar_perm_8 by blast + have V3: "Per P M A" + by (simp add: D1 l8_2) + have "Per Q M A" + using D2 Per_perm by blast + then show ?thesis + using V1 D0 V3 cop_per2__col by blast + qed + then have "Col M P Q" + using Col_perm by blast + } + then have "A \ M \ Col M P Q" by blast + then have "Col M P Q" + using S2 by blast + } + then have P5: "(A B Perp P' P \ A B Perp Q' Q) \ Col M P Q" by blast + have P6: "(A B Perp P' P \ (Q' = Q)) \ Col M P Q" + using ReflectLAt_def assms(3) l7_3 not_col_distincts by blast + have P7: "(P' = P \ A B Perp Q' Q) \ Col M P Q" + using ReflectLAt_def assms(2) l7_3 not_col_distincts by blast + have "(P' = P \ Q' = Q) \ Col M P Q" + using ReflectLAt_def assms(3) col_trivial_3 l7_3 by blast + then show ?thesis + using P2 P4 P5 P6 P7 by blast +qed + +lemma l10_10_spec: + assumes "P' P ReflectL A B" and + "Q' Q ReflectL A B" + shows "Cong P Q P' Q'" +proof cases + assume "A = B" + then show ?thesis + using assms(1) assms(2) cong_reflexivity image_spec__eq by blast +next + assume H1: "A \ B" + obtain X where P1: "X Midpoint P P' \ Col A B X" + using ReflectL_def assms(1) by blast + obtain Y where P2: "Y Midpoint Q Q' \ Col A B Y" + using ReflectL_def assms(2) by blast + obtain Z where P3: "Z Midpoint X Y" + using midpoint_existence by blast + have P4: "Col A B Z" + proof cases + assume "X = Y" + then show ?thesis + by (metis P2 P3 midpoint_distinct_3) + next + assume "X \ Y" + then show ?thesis + by (metis P1 P2 P3 l6_21 midpoint_col not_col_distincts) + qed + obtain R where P5: "Z Midpoint P R" + using symmetric_point_construction by blast + obtain R' where P6: "Z Midpoint P' R'" + using symmetric_point_construction by blast + have P7: "A B Perp P P' \ P = P'" + using ReflectL_def 
assms(1) by auto + have P8: "A B Perp Q Q' \ Q = Q'" + using ReflectL_def assms(2) by blast + { + assume Q1: "A B Perp P P' \ A B Perp Q Q'" + have Q2: "R R' ReflectL A B" + proof - + have "P P' Reflect A B" + by (simp add: H1 assms(1) is_image_is_image_spec l10_4_spec) + then have "R R' Reflect A B" + using H1 P4 P5 P6 midpoint_preserves_image by blast + then show ?thesis + using H1 is_image_is_image_spec by blast + qed + have Q3: "R \ R'" + using P5 P6 Q1 l7_9 perp_not_eq_2 by blast + have Q4: "Y Midpoint R R'" + using P1 P3 P5 P6 symmetry_preserves_midpoint by blast + have Q5: "Cong Q' R' Q R" + using P2 Q4 l7_13 by blast + have Q6: "Cong P' Z P Z" + using P4 assms(1) is_image_spec_col_cong by auto + have Q7: "Cong Q' Z Q Z" + using P4 assms(2) is_image_spec_col_cong by blast + then have "Cong P Q P' Q'" + proof - + have S1: "Cong R Z R' Z" + using P5 P6 Q6 cong_symmetry l7_16 l7_3_2 by blast + have "R \ Z" + using Q3 S1 cong_reverse_identity by blast + then show ?thesis + by (meson P5 P6 Q5 Q6 Q7 S1 between_symmetry five_segment midpoint_bet not_cong_2143 not_cong_3412) + qed + } + then have P9: "(A B Perp P P' \ A B Perp Q Q') \ Cong P Q P' Q'" by blast + have P10: "(A B Perp P P' \ Q = Q') \ Cong P Q P' Q'" + using P2 Tarski_neutral_dimensionless.l7_3 Tarski_neutral_dimensionless_axioms assms(1) cong_symmetry is_image_spec_col_cong by fastforce + have P11: "(P = P' \ A B Perp Q Q') \ Cong P Q P' Q'" + using P1 Tarski_neutral_dimensionless.l7_3 Tarski_neutral_dimensionless.not_cong_4321 Tarski_neutral_dimensionless_axioms assms(2) is_image_spec_col_cong by fastforce + have "(P = P' \ Q = Q') \ Cong P Q P' Q'" + using cong_reflexivity by blast + then show ?thesis + using P10 P11 P7 P8 P9 by blast +qed + +lemma l10_10: + assumes "P' P Reflect A B" and + "Q' Q Reflect A B" + shows "Cong P Q P' Q'" + using Reflect_def assms(1) assms(2) cong_4321 l10_10_spec l7_13 by auto + +lemma image_preserves_bet: + assumes "A A' ReflectL X Y" and + "B B' ReflectL X Y" and + "C C' ReflectL X Y" and + "Bet A B C" + shows "Bet A' B' C'" +proof - + have P3: "A B C Cong3 A' B' C'" + using Cong3_def assms(1) assms(2) assms(3) l10_10_spec l10_4_spec by blast + then show ?thesis + using assms(4) l4_6 by blast +qed + +lemma image_gen_preserves_bet: + assumes "A A' Reflect X Y" and + "B B' Reflect X Y" and + "C C' Reflect X Y" and + "Bet A B C" + shows "Bet A' B' C'" +proof cases + assume "X = Y" + then show ?thesis + by (metis (full_types) assms(1) assms(2) assms(3) assms(4) image__midpoint l7_15 l7_2) +next + assume P1: "X \ Y" + then have P2: "A A' ReflectL X Y" + using assms(1) is_image_is_image_spec by blast + have P3: "B B' ReflectL X Y" + using P1 assms(2) is_image_is_image_spec by auto + have "C C' ReflectL X Y" + using P1 assms(3) is_image_is_image_spec by blast + then show ?thesis using image_preserves_bet + using assms(4) P2 P3 by blast +qed + +lemma image_preserves_col: + assumes "A A' ReflectL X Y" and + "B B' ReflectL X Y" and + "C C' ReflectL X Y" and + "Col A B C" + shows "Col A' B' C'" using image_preserves_bet + using Col_def assms(1) assms(2) assms(3) assms(4) by auto + +lemma image_gen_preserves_col: + assumes "A A' Reflect X Y" and + "B B' Reflect X Y" and + "C C' Reflect X Y" and + "Col A B C" + shows "Col A' B' C'" + using Col_def assms(1) assms(2) assms(3) assms(4) image_gen_preserves_bet by auto + +lemma image_gen_preserves_ncol: + assumes "A A' Reflect X Y" and + "B B' Reflect X Y" and + "C C' Reflect X Y" and + "\ Col A B C" + shows "\ Col A' B' C'" + using assms(1) assms(2) assms(3) 
assms(4)image_gen_preserves_col l10_4 by blast + +lemma image_gen_preserves_inter: + assumes "A A' Reflect X Y" and + "B B' Reflect X Y" and + "C C' Reflect X Y" and + "D D' Reflect X Y" and + "\ Col A B C" and + "C \ D" and + "Col A B I" and + "Col C D I" and + "Col A' B' I'" and + "Col C' D' I'" + shows "I I' Reflect X Y" +proof - + obtain I0 where P1: "I I0 Reflect X Y" + using l10_6_existence by blast + then show ?thesis + by (smt Tarski_neutral_dimensionless.image_gen_preserves_col Tarski_neutral_dimensionless_axioms assms(1) assms(10) assms(2) assms(3) assms(4) assms(5) assms(6) assms(7) assms(8) assms(9) l10_4 l10_7 l6_21) +qed + +lemma intersection_with_image_gen: + assumes "A A' Reflect X Y" and + "B B' Reflect X Y" and + "\ Col A B A'" and + "Col A B C" and + "Col A' B' C" + shows "Col C X Y" + by (smt assms(1) assms(2) assms(3) assms(4) assms(5) image_gen_preserves_inter l10_2_uniqueness l10_4 l10_8 not_col_distincts) + +lemma image_preserves_midpoint : + assumes "A A' ReflectL X Y" and + "B B' ReflectL X Y" and + "C C' ReflectL X Y" and + "A Midpoint B C" + shows "A' Midpoint B' C'" +proof - + have P1: "Bet B' A' C'" using image_preserves_bet + using assms(1) assms(2) assms(3) assms(4) midpoint_bet by auto + have "Cong B' A' A' C'" + by (metis Cong_perm Tarski_neutral_dimensionless.l10_10_spec Tarski_neutral_dimensionless.l7_13 Tarski_neutral_dimensionless_axioms assms(1) assms(2) assms(3) assms(4) cong_transitivity l7_3_2) + then show ?thesis + by (simp add: Midpoint_def P1) +qed + +lemma image_spec_preserves_per: + assumes "A A' ReflectL X Y" and + "B B' ReflectL X Y" and + "C C' ReflectL X Y" and + "Per A B C" + shows "Per A' B' C'" +proof cases + assume "X = Y" + then show ?thesis + using assms(1) assms(2) assms(3) assms(4) image_spec__eq by blast +next + assume P1: "X \ Y" + obtain C1 where P2: "B Midpoint C C1" + using symmetric_point_construction by blast + obtain C1' where P3: "C1 C1' ReflectL X Y" + by (meson P1 l10_6_existence_spec) + then have P4: "B' Midpoint C' C1'" + using P2 assms(2) assms(3) image_preserves_midpoint by blast + have "Cong A' C' A' C1'" + proof - + have Q1: "Cong A' C' A C" + using assms(1) assms(3) l10_10_spec by auto + have "Cong A C A' C1'" + by (metis P2 P3 Tarski_neutral_dimensionless.l10_10_spec Tarski_neutral_dimensionless_axioms assms(1) assms(4) cong_inner_transitivity cong_symmetry per_double_cong) + then show ?thesis + using Q1 cong_transitivity by blast + qed + then show ?thesis + using P4 Per_def by blast +qed + +lemma image_preserves_per: + assumes "A A' Reflect X Y" and + "B B' Reflect X Y"and + "C C' Reflect X Y" and + "Per A B C" + shows "Per A' B' C'" +proof cases + assume "X = Y" + then show ?thesis using midpoint_preserves_per + using assms(1) assms(2) assms(3) assms(4) image__midpoint l7_2 by blast +next + assume P1: "X \ Y" + have P2: "X \ Y \ A A' ReflectL X Y" + using P1 assms(1) is_image_is_image_spec by blast + have P3: "X \ Y \ B B' ReflectL X Y" + using P1 assms(2) is_image_is_image_spec by blast + have P4: "X \ Y \ C C' ReflectL X Y" + using P1 assms(3) is_image_is_image_spec by blast + then show ?thesis using image_spec_preserves_per + using P2 P3 assms(4) by blast +qed + +lemma l10_12: + assumes "Per A B C" and + "Per A' B' C'" and + "Cong A B A' B'" and + "Cong B C B' C'" + shows "Cong A C A' C'" +proof cases + assume P1: "B = C" + then have "B' = C'" + using assms(4) cong_reverse_identity by blast + then show ?thesis + using P1 assms(3) by blast +next + assume P2: "B \ C" + have "Cong A C A' C'" + proof cases + 
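+ (* The degenerate cases B = C and A = B are immediate. In the remaining case the right angle at B' is carried onto B by the point reflection through the midpoint X of B B' and then by two line reflections, and the reflection lemmas proved above (cong_midpoint__image, image_preserves_per, l10_10) yield Cong A C A' C'. *)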
assume "A = B" + then show ?thesis + using assms(3) assms(4) cong_diff_3 by force + next + assume P3: "A \ B" + obtain X where P4: "X Midpoint B B'" + using midpoint_existence by blast + obtain A1 where P5: "X Midpoint A' A1" + using Mid_perm symmetric_point_construction by blast + obtain C1 where P6: "X Midpoint C' C1" + using Mid_perm symmetric_point_construction by blast + have Q1: "A' B' C' Cong3 A1 B C1" + using Cong3_def P4 P5 P6 l7_13 l7_2 by blast + have Q2: "Per A1 B C1" + using assms(2)Q1 l8_10 by blast + have Q3: "Cong A B A1 B" + by (metis Cong3_def Q1 Tarski_neutral_dimensionless.cong_symmetry Tarski_neutral_dimensionless_axioms assms(3) cong_inner_transitivity) + have Q4: "Cong B C B C1" + by (metis Cong3_def Q1 Tarski_neutral_dimensionless.cong_symmetry Tarski_neutral_dimensionless_axioms assms(4) cong_inner_transitivity) + obtain Y where P7: "Y Midpoint C C1" + using midpoint_existence by auto + then have R1: "C1 C Reflect B Y" using cong_midpoint__image + using Q4 by blast + obtain A2 where R2: "A1 A2 Reflect B Y" + using l10_6_existence by blast + have R3: "Cong C A2 C1 A1" + using R1 R2 l10_10 by blast + have R5: "B B Reflect B Y" + using image_triv by blast + have R6: "Per A2 B C" using image_preserves_per + using Q2 R1 R2 image_triv by blast + have R7: "Cong A B A2 B" + using l10_10 Cong_perm Q3 R2 cong_transitivity image_triv by blast + obtain Z where R7A: "Z Midpoint A A2" + using midpoint_existence by blast + have "Cong B A B A2" + using Cong_perm R7 by blast + then have T1: "A2 A Reflect B Z" using R7A cong_midpoint__image + by blast + obtain C0 where T2: "B Midpoint C C0" + using symmetric_point_construction by blast + have T3: "Cong A C A C0" + using T2 assms(1) per_double_cong by blast + have T4: "Cong A2 C A2 C0" + using R6 T2 per_double_cong by blast + have T5: "C0 C Reflect B Z" + proof - + have "C0 C Reflect Z B" + proof cases + assume "A = A2" + then show ?thesis + by (metis R7A T2 T3 cong_midpoint__image midpoint_distinct_3) + next + assume "A \ A2" + then show ?thesis using l4_17 cong_midpoint__image + by (metis R7A T2 T3 T4 midpoint_col not_col_permutation_3) + qed + then show ?thesis + using is_image_rev by blast + qed + have T6: "Cong A C A2 C0" + using T1 T5 l10_10 by auto + have R4: "Cong A C A2 C" + by (metis T4 T6 Tarski_neutral_dimensionless.cong_symmetry Tarski_neutral_dimensionless_axioms cong_inner_transitivity) + then have Q5: "Cong A C A1 C1" + by (meson R3 cong_inner_transitivity not_cong_3421) + then show ?thesis + using Cong3_def Q1 Q5 cong_symmetry cong_transitivity by blast + qed + then show ?thesis by blast +qed + +lemma l10_16: + assumes "\ Col A B C" and + "\ Col A' B' P" and + "Cong A B A' B'" + shows "\ C'. 
A B C Cong3 A' B' C' \ A' B' OS P C'" +proof cases + assume "A = B" + then show ?thesis + using assms(1) not_col_distincts by auto +next + assume P1: "A \ B" + obtain X where P2: "Col A B X \ A B Perp C X" + using assms(1) l8_18_existence by blast + obtain X' where P3: "A B X Cong3 A' B' X'" + using P2 assms(3) l4_14 by blast + obtain Q where P4: "A' B' Perp Q X' \ A' B' OS P Q" + using P2 P3 assms(2) l10_15 l4_13 by blast + obtain C' where P5: "X' Out C' Q \ Cong X' C' X C" + by (metis P2 P4 l6_11_existence perp_distinct) + have P6: "Cong A C A' C'" + proof cases + assume "A = X" + then show ?thesis + by (metis Cong3_def P3 P5 cong_4321 cong_commutativity cong_diff_3) + next + assume "A \ X" + have P7: "Per A X C" + using P2 col_trivial_3 l8_16_1 l8_2 by blast + have P8: "Per A' X' C'" + proof - + have "X' PerpAt A' X' X' C'" + proof - + have Z1: "A' X' Perp X' C'" + proof - + have W1: "X' \ C'" + using P5 out_distinct by blast + have W2: "X' Q Perp A' B'" + using P4 Perp_perm by blast + then have "X' C' Perp A' B'" + by (metis P5 Perp_perm W1 col_trivial_3 not_col_permutation_5 out_col perp_col2_bis) + then show ?thesis + by (metis Cong3_def P2 P3 Perp_perm \A \ X\ col_trivial_3 cong_identity l4_13 perp_col2_bis) + qed + have Z2: "Col X' A' X'" + using not_col_distincts by blast + have "Col X' X' C'" + by (simp add: col_trivial_1) + then show ?thesis + by (simp add: Z1 Z2 l8_14_2_1b_bis) + qed + then show ?thesis + by (simp add: perp_in_per) + qed + have P9: "Cong A X A' X'" + using Cong3_def P3 by auto + have "Cong X C X' C'" + using Cong_perm P5 by blast + then show ?thesis using l10_12 + using P7 P8 P9 by blast + qed + have P10: "Cong B C B' C'" + proof cases + assume "B = X" + then show ?thesis + by (metis Cong3_def P3 P5 cong_4321 cong_commutativity cong_diff_3) + next + assume "B \ X" + have Q1: "Per B X C" + using P2 col_trivial_2 l8_16_1 l8_2 by blast + have "X' PerpAt B' X' X' C'" + proof - + have Q2: "B' X' Perp X' C'" + proof - + have R1: "B' \ X'" + using Cong3_def P3 \B \ X\ cong_identity by blast + have "X' C' Perp B' A'" + proof - + have S1: "X' \ C'" + using Out_def P5 by blast + have S2: "X' Q Perp B' A'" + using P4 Perp_perm by blast + have "Col X' Q C'" + using Col_perm P5 out_col by blast + then show ?thesis + using S1 S2 perp_col by blast + qed + then have R2: "B' A' Perp X' C'" + using Perp_perm by blast + have R3: "Col B' A' X'" + using Col_perm P2 P3 l4_13 by blast + then show ?thesis + using R1 R2 perp_col by blast + qed + have Q3: "Col X' B' X'" + by (simp add: col_trivial_3) + have "Col X' X' C'" + by (simp add: col_trivial_1) + then show ?thesis using l8_14_2_1b_bis + using Q2 Q3 by blast + qed + then have Q2: "Per B' X' C'" + by (simp add: perp_in_per) + have Q3: "Cong B X B' X'" + using Cong3_def P3 by blast + have Q4: "Cong X C X' C'" + using P5 not_cong_3412 by blast + then show ?thesis + using Q1 Q2 Q3 l10_12 by blast + qed + have P12: "A' B' OS C' Q \ X' Out C' Q \ \ Col A' B' C'" using l9_19 l4_13 + by (meson P2 P3 P5 one_side_not_col123 out_one_side_1) + then have P13: "A' B' OS C' Q" using l4_13 + by (meson P2 P3 P4 P5 l6_6 one_side_not_col124 out_one_side_1) + then show ?thesis + using Cong3_def P10 P4 P6 assms(3) one_side_symmetry one_side_transitivity by blast +qed + +lemma cong_cop_image__col: + assumes "P \ P'" and + "P P' Reflect A B" and + "Cong P X P' X" and + "Coplanar A B P X" + shows "Col A B X" +proof - + have P1: "(A \ B \ P P' ReflectL A B) \ (A = B \ A Midpoint P' P)" + by (metis assms(2) image__midpoint is_image_is_image_spec) + { + 
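+ (* Proper-axis case: the midpoint M of P P' lies on A B; Cong P X P' X gives the right angle X M P, which together with the right angle A M P (or the corresponding angles at A when A = M) lets cop_per2__col place X on the axis. *)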
assume Q1: "A \ B \ P P' ReflectL A B" + then obtain M where Q2: "M Midpoint P' P \ Col A B M" + using ReflectL_def by blast + have "Col A B X" + proof cases + assume R1: "A = M" + have R2: "P A Perp A B" + proof - + have S1: "P \ A" + using Q2 R1 assms(1) midpoint_distinct_2 by blast + have S2: "P P' Perp A B" + using Perp_perm Q1 ReflectL_def assms(1) by blast + have "Col P P' A" + using Q2 R1 midpoint_col not_col_permutation_3 by blast + then show ?thesis using perp_col + using S1 S2 by blast + qed + have R3: "Per P A B" + by (simp add: R2 perp_comm perp_per_1) + then have R3A: "Per B A P" using l8_2 + by blast + have "A Midpoint P P' \ Cong X P X P'" + using Cong_cases Q2 R1 assms(3) l7_2 by blast + then have R4: "Per X A P" + using Per_def by blast + have R5: "Coplanar P B X A" + using assms(4) ncoplanar_perm_20 by blast + have "P \ A" + using R2 perp_not_eq_1 by auto + then show ?thesis using R4 R5 R3A + using cop_per2__col not_col_permutation_1 by blast + next + assume T1: "A \ M" + have T3: "P \ M" + using Q2 assms(1) l7_3_2 sym_preserve_diff by blast + have T2: "P M Perp M A" + proof - + have T4: "P P' Perp M A" + using Perp_perm Q1 Q2 ReflectL_def T1 assms(1) col_trivial_3 perp_col0 by blast + have "Col P P' M" + by (simp add: Col_perm Q2 midpoint_col) + then show ?thesis using T3 T4 perp_col by blast + qed + then have "M P Perp A M" + using perp_comm by blast + then have "M PerpAt M P A M" + using perp_perp_in by blast + then have "M PerpAt P M M A" + by (simp add: perp_in_comm) + then have U1: "Per P M A" + by (simp add: perp_in_per) + have U2: "Per X M P" using l7_2 cong_commutativity + using Per_def Q2 assms(3) by blast + have "Col A X M" + proof - + have W2: "Coplanar P A X M" + by (meson Q1 Q2 assms(4) col_cop2__cop coplanar_perm_13 ncop_distincts) + have "Per A M P" + by (simp add: U1 l8_2) + then show ?thesis using cop_per2__col + using U2 T3 W2 by blast + qed + then show ?thesis + using Q2 T1 col2__eq not_col_permutation_4 by blast + qed + } + then have P2: "(A \ B \ P P' ReflectL A B) \ Col A B X" by blast + have "(A = B \ A Midpoint P' P) \ Col A B X" + using col_trivial_1 by blast + then show ?thesis using P1 P2 by blast +qed + +lemma cong_cop_per2_1: + assumes "A \ B" and + "Per A B X" and + "Per A B Y" and + "Cong B X B Y" and + "Coplanar A B X Y" + shows "X = Y \ B Midpoint X Y" + by (meson Per_cases assms(1) assms(2) assms(3) assms(4) assms(5) cop_per2__col coplanar_perm_3 l7_20_bis not_col_permutation_5) + +lemma cong_cop_per2: + assumes "A \ B" and + "Per A B X" and + "Per A B Y" and + "Cong B X B Y" and + "Coplanar A B X Y" + shows "X = Y \ X Y ReflectL A B" +proof - + have "X = Y \ B Midpoint X Y" + using assms(1) assms(2) assms(3) assms(4) assms(5) cong_cop_per2_1 by blast + then show ?thesis + by (metis Mid_perm Per_def Reflect_def assms(1) assms(3) cong_midpoint__image symmetric_point_uniqueness) +qed + +lemma cong_cop_per2_gen: + assumes "A \ B" and + "Per A B X" and + "Per A B Y" and + "Cong B X B Y" and + "Coplanar A B X Y" + shows "X = Y \ X Y Reflect A B" +proof - + have "X = Y \ B Midpoint X Y" + using assms(1) assms(2) assms(3) assms(4) assms(5) cong_cop_per2_1 by blast + then show ?thesis + using assms(2) cong_midpoint__image l10_4 per_double_cong by blast +qed + +lemma ex_perp_cop: + assumes "A \ B" + shows "\ Q. A B Perp Q C \ Coplanar A B P Q" +proof - + { + assume "Col A B C \ Col A B P" + then have "\ Q. A B Perp Q C \ Coplanar A B P Q" + using assms ex_ncol_cop l10_15 ncop__ncols by blast + } + then have T1: "(Col A B C \ Col A B P) \ + (\ Q. 
A B Perp Q C \ Coplanar A B P Q)" by blast + { + assume "\Col A B C \ Col A B P" + then have "\ Q. A B Perp Q C \ Coplanar A B P Q" + by (metis Perp_cases ncop__ncols not_col_distincts perp_exists) + } + then have T2: "(\Col A B C \ Col A B P) \ + (\ Q. A B Perp Q C \ Coplanar A B P Q)" by blast + + { + assume "Col A B C \ \Col A B P" + then have "\ Q. A B Perp Q C \ Coplanar A B P Q" + using l10_15 os__coplanar by blast + } + then have T3: "(Col A B C \ \Col A B P) \ + (\ Q. A B Perp Q C \ Coplanar A B P Q)" by blast + { + assume "\Col A B C \ \Col A B P" + then have "\ Q. A B Perp Q C \ Coplanar A B P Q" + using l8_18_existence ncop__ncols perp_right_comm by blast + } + then have "(\Col A B C \ \Col A B P) \ + (\ Q. A B Perp Q C \ Coplanar A B P Q)" by blast + then show ?thesis using T1 T2 T3 by blast +qed + +lemma hilbert_s_version_of_pasch_aux: + assumes "Coplanar A B C P" and + "\ Col A I P" and + "\ Col B C P" and + "Bet B I C" and + "B \ I" and + "I \ C" and + "B \ C" + shows "\ X. Col I P X \ ((Bet A X B \ A \ X \ X \ B \ A \ B) \ (Bet A X C \ A \ X \ X \ C \ A \ C))" +proof - + have T1: "I P TS B C" + using Col_perm assms(3) assms(4) assms(5) assms(6) bet__ts bet_col col_transitivity_1 by blast + have T2: "Coplanar A P B I" + using assms(1) assms(4) bet_cop__cop coplanar_perm_6 ncoplanar_perm_9 by blast + have T3: "I P TS A B \ I P TS A C" + by (meson T1 T2 TS_def assms(2) cop_nos__ts coplanar_perm_21 l9_2 l9_8_2) + have T4: "I P TS A B \ +(\ X. Col I P X \ + ((Bet A X B \ A \ X \ X \ B \ A \ B) \ + (Bet A X C \ A \ X \ X \ C \ A \ C)))" + by (metis TS_def not_col_permutation_2 ts_distincts) + have "I P TS A C \ +(\ X. Col I P X \ + ((Bet A X B \ A \ X \ X \ B \ A \ B) \ + (Bet A X C \ A \ X \ X \ C \ A \ C)))" + by (metis TS_def not_col_permutation_2 ts_distincts) + + then show ?thesis using T3 T4 by blast +qed + +lemma hilbert_s_version_of_pasch: + assumes "Coplanar A B C P" and + "\ Col C Q P" and + "\ Col A B P" and + "BetS A Q B" + shows "\ X. Col P Q X \ (BetS A X C \ BetS B X C)" +proof - + obtain X where "Col Q P X \ +(Bet C X A \ C \ X \ X \ A \ C \ A \ + Bet C X B \ C \ X \ X \ B \ C \ B)" + using BetSEq assms(1) assms(2) assms(3) assms(4) coplanar_perm_12 hilbert_s_version_of_pasch_aux by fastforce + then show ?thesis + by (metis BetS_def Bet_cases Col_perm) +qed + +lemma two_sides_cases: + assumes "\ Col PO A B" and + "PO P OS A B" + shows "PO A TS P B \ PO B TS P A" + by (meson assms(1) assms(2) cop_nts__os l9_31 ncoplanar_perm_3 not_col_permutation_4 one_side_not_col124 one_side_symmetry os__coplanar) + +lemma not_par_two_sides: + assumes "C \ D" and + "Col A B I" and + "Col C D I" and + "\ Col A B C" + shows "\ X Y. Col C D X \ Col C D Y \ A B TS X Y" +proof - + obtain pp :: "'p \ 'p \ 'p" where + f1: "\p pa. Bet p pa (pp p pa) \ pa \ (pp p pa)" + by (meson point_construction_different) + then have f2: "\p pa pb pc. (Col pc pb p \ \ Col pc pb (pp p pa)) \ \ Col pc pb pa" + by (meson Col_def colx) + have f3: "\p pa. Col pa p pa" + by (meson Col_def between_trivial) + have f4: "\p pa. Col pa p p" + by (meson Col_def between_trivial) + have f5: "Col I D C" + by (meson Col_perm assms(3)) + have f6: "\p pa. Col (pp pa p) p pa" + using f4 f3 f2 by blast + then have f7: "\p pa. Col pa (pp pa p) p" + by (meson Col_perm) + then have f8: "\p pa pb pc. 
(pc pb TS p (pp p pa) \ Col pc pb p) \ \ Col pc pb pa" + using f2 f1 by (meson l9_18) + have "I = D \ Col D (pp D I) C" + using f7 f5 f3 colx by blast + then have "I = D \ Col C D (pp D I)" + using Col_perm by blast + then show ?thesis + using f8 f6 f3 by (metis Col_perm assms(2) assms(4)) +qed + +lemma cop_not_par_other_side: + assumes "C \ D" and + "Col A B I" and + "Col C D I" and + "\ Col A B C" and + "\ Col A B P" and + "Coplanar A B C P" + shows "\ Q. Col C D Q \ A B TS P Q" +proof - + obtain X Y where P1:"Col C D X \ Col C D Y \ A B TS X Y" + using assms(1) assms(2) assms(3) assms(4) not_par_two_sides by blast + then have "Coplanar C A B X" + using Coplanar_def assms(1) assms(2) assms(3) col_transitivity_1 by blast + then have "Coplanar A B P X" + using assms(4) assms(6) col_permutation_3 coplanar_trans_1 ncoplanar_perm_2 ncoplanar_perm_6 by blast + then show ?thesis + by (meson P1 l9_8_2 TS_def assms(5) cop_nts__os not_col_permutation_2 one_side_symmetry) +qed + + +lemma cop_not_par_same_side: + assumes "C \ D" and + "Col A B I" and + "Col C D I" and + "\ Col A B C" and + "\ Col A B P" and + "Coplanar A B C P" + shows "\ Q. Col C D Q \ A B OS P Q" +proof - + obtain X Y where P1: "Col C D X \ Col C D Y \ A B TS X Y" + using assms(1) assms(2) assms(3) assms(4) not_par_two_sides by blast + then have "Coplanar C A B X" + using Coplanar_def assms(1) assms(2) assms(3) col_transitivity_1 by blast + then have "Coplanar A B P X" + using assms(4) assms(6) col_permutation_1 coplanar_perm_2 coplanar_trans_1 ncoplanar_perm_14 by blast + then show ?thesis + by (meson P1 TS_def assms(5) cop_nts__os l9_2 l9_8_1 not_col_permutation_2) +qed + +end + +subsubsection "Line reflexivity: 2D" + +context Tarski_2D + +begin + +lemma all_coplanar: + "Coplanar A B C D" +proof - + have "\ A B C P Q. P \ Q \ Cong A P A Q \ Cong B P B Q\ Cong C P C Q \ +(Bet A B C \ Bet B C A \ Bet C A B)" + using upper_dim by blast + then show ?thesis using upper_dim_implies_all_coplanar + by (smt Tarski_neutral_dimensionless.not_col_permutation_2 Tarski_neutral_dimensionless_axioms all_coplanar_axiom_def all_coplanar_implies_upper_dim coplanar_perm_9 ncop__ncol os__coplanar ts__coplanar upper_dim_implies_not_one_side_two_sides) +qed + +lemma per2__col: + assumes "Per A X C" and + "X \ C" and + "Per B X C" + shows "Col A B X" + using all_coplanar_axiom_def all_coplanar_upper_dim assms(1) assms(2) assms(3) upper_dim upper_dim_implies_per2__col by blast + +lemma perp2__col: + assumes "X Y Perp A B" and + "X Z Perp A B" + shows "Col X Y Z" + by (meson Tarski_neutral_dimensionless.cop_perp2__col Tarski_neutral_dimensionless_axioms all_coplanar assms(1) assms(2)) +end + +subsection "Angles" + +subsubsection "Some generalites" + +context Tarski_neutral_dimensionless + +begin + +lemma l11_3: + assumes "A B C CongA D E F" + shows "\ A' C' D' F'. 
B Out A' A \ B Out C C' \ E Out D' D \ E Out F F' \ A' B C' Cong3 D' E F'" +proof - + obtain A' C' D' F' where P1: "Bet B A A' \ Cong A A' E D \ Bet B C C' \ Cong C C' E F \ Bet E D D' \ Cong D D' B A \ Bet E F F' \ Cong F F' B C \ Cong A' C' D' F'" using CongA_def + using assms by auto + then have "A' B C' Cong3 D' E F'" + by (meson Cong3_def between_symmetry l2_11_b not_cong_1243 not_cong_4312) + thus ?thesis + by (metis CongA_def P1 assms bet_out l6_6) +qed + +lemma l11_aux: + assumes "B Out A A'" and + "E Out D D'" and + "Cong B A' E D'" and + "Bet B A A0" and + "Bet E D D0" and + "Cong A A0 E D" and + "Cong D D0 B A" + shows "Cong B A0 E D0 \ Cong A' A0 D' D0" +proof - + have P2: "Cong B A0 E D0" + by (meson Bet_cases assms(4) assms(5) assms(6) assms(7) l2_11_b not_cong_1243 not_cong_4312) + have P3: "Bet B A A' \ Bet B A' A" + using Out_def assms(1) by auto + have P4: "Bet E D D' \ Bet E D' D" + using Out_def assms(2) by auto + have P5: "Bet B A A' \ Cong A' A0 D' D0" + by (smt P2 assms(1) assms(2) assms(3) assms(4) assms(5) bet_out l6_6 l6_7 out_cong_cong out_diff1) + have P6: "Bet B A' A \ Cong A' A0 D' D0" + proof - + have "E Out D D0" + using assms(2) assms(5) bet_out out_diff1 by blast + thus ?thesis + by (meson P2 assms(2) assms(3) assms(4) between_exchange4 cong_preserves_bet l4_3_1 l6_6 l6_7) + qed + have P7: "Bet E D D' \ Cong A' A0 D' D0" + using P3 P5 P6 by blast + have "Bet E D' D \ Cong A' A0 D' D0" + using P3 P5 P6 by blast + thus ?thesis + using P2 P3 P4 P5 P6 P7 by blast +qed + +lemma l11_3_bis: + assumes "\ A' C' D' F'. (B Out A' A \ B Out C' C \ E Out D' D \ E Out F' F \ A' B C' Cong3 D' E F')" + shows "A B C CongA D E F" +proof - + obtain A' C' D' F' where P1: + "B Out A' A \ B Out C' C \ E Out D' D \ E Out F' F \ A' B C' Cong3 D' E F'" + using assms by blast + obtain A0 where P2: "Bet B A A0 \ Cong A A0 E D" + using segment_construction by presburger + obtain C0 where P3: "Bet B C C0 \ Cong C C0 E F" + using segment_construction by presburger + obtain D0 where P4: "Bet E D D0 \ Cong D D0 B A" + using segment_construction by presburger + obtain F0 where P5: "Bet E F F0 \ Cong F F0 B C" + using segment_construction by presburger + have P6: "A \ B \ C \ B \ D \ E \ F \ E" + using P1 out_diff2 by blast + have "Cong A0 C0 D0 F0" + proof - + have Q1: "Cong B A0 E D0 \ Cong A' A0 D' D0" + proof - + have R1: "B Out A A'" + by (simp add: P1 l6_6) + have R2: "E Out D D'" + by (simp add: P1 l6_6) + have "Cong B A' E D'" + using Cong3_def P1 cong_commutativity by blast + thus ?thesis using l11_aux + using P2 P4 R1 R2 by blast + qed + have Q2: "Cong B C0 E F0 \ Cong C' C0 F' F0" + by (smt Cong3_def Out_cases P1 P3 P5 Tarski_neutral_dimensionless.l11_aux Tarski_neutral_dimensionless_axioms) + have Q3: "B A' A0 Cong3 E D' D0" + by (meson Cong3_def P1 Q1 cong_3_swap) + have Q4: "B C' C0 Cong3 E F' F0" + using Cong3_def P1 Q2 by blast + have "Cong C0 A' F0 D'" + proof - + have R1: "B C' C0 A' FSC E F' F0 D'" + proof - + have S1: "Col B C' C0" + by (metis (no_types) Col_perm P1 P3 P6 bet_col col_transitivity_1 out_col) + have S3: "Cong B A' E D'" + using Cong3_def Q3 by blast + have "Cong C' A' F' D'" + using Cong3_def P1 cong_commutativity by blast + thus ?thesis + by (simp add: FSC_def S1 Q4 S3) + qed + have "B \ C'" + using P1 out_distinct by blast + thus ?thesis + using R1 l4_16 by blast + qed + then have Q6: "B A' A0 C0 FSC E D' D0 F0" + by (meson FSC_def P1 P2 P6 Q2 Q3 bet_out l6_7 not_cong_2143 out_col) + have "B \ A'" + using Out_def P1 by blast + thus ?thesis + using Q6 
l4_16 by blast + qed + thus ?thesis using P6 P2 P3 P4 P5 CongA_def by auto +qed + +lemma l11_4_1: + assumes "A B C CongA D E F" and + (*"A \ B" and "C \ B" and "D \ E" and "F \ E" and*) + "B Out A' A" and + "B Out C' C" and + "E Out D' D" and + "E Out F' F" and + "Cong B A' E D'" and "Cong B C' E F'" + shows "Cong A' C' D' F'" +proof - + obtain A0 C0 D0 F0 where P1: "B Out A0 A \ B Out C C0 \ E Out D0 D \ E Out F F0 \ A0 B C0 Cong3 D0 E F0" + using assms(1) l11_3 by blast + have P2: "B Out A' A0" + using P1 assms(2) l6_6 l6_7 by blast + have P3: "E Out D' D0" + by (meson P1 assms(4) l6_6 l6_7) + have P4: "Cong A' A0 D' D0" + proof - + have "Cong B A0 E D0" + using Cong3_def P1 cong_3_swap by blast + thus ?thesis using P2 P3 + using assms(6) out_cong_cong by blast + qed + have P5: "Cong A' C0 D' F0" + proof - + have P6: "B A0 A' C0 FSC E D0 D' F0" + by (meson Cong3_def Cong_perm FSC_def P1 P2 P4 assms(6) not_col_permutation_5 out_col) + thus ?thesis + using P2 Tarski_neutral_dimensionless.l4_16 Tarski_neutral_dimensionless_axioms out_diff2 by fastforce + qed + have P6: "B Out C' C0" + using P1 assms(3) l6_7 by blast + have "E Out F' F0" + using P1 assms(5) l6_7 by blast + then have "Cong C' C0 F' F0" + using Cong3_def P1 P6 assms(7) out_cong_cong by auto + then have P9: "B C0 C' A' FSC E F0 F' D'" + by (smt Cong3_def Cong_perm FSC_def P1 P5 P6 assms(6) assms(7) not_col_permutation_5 out_col) + then have "Cong C' A' F' D'" + using P6 Tarski_neutral_dimensionless.l4_16 Tarski_neutral_dimensionless_axioms out_diff2 by fastforce + thus ?thesis + using Tarski_neutral_dimensionless.not_cong_2143 Tarski_neutral_dimensionless_axioms by fastforce +qed + +lemma l11_4_2: + assumes "A \ B" and + "C \ B" and + "D \ E" and + "F \ E" and + "\ A' C' D' F'. (B Out A' A \ B Out C' C \ E Out D' D \ E Out F' F \ Cong B A' E D' \ Cong B C' E F' \ Cong A' C' D' F')" + shows "A B C CongA D E F" +proof - + obtain A' where P1: "Bet B A A' \ Cong A A' E D" + using segment_construction by fastforce + obtain C' where P2: "Bet B C C' \ Cong C C' E F" + using segment_construction by fastforce + obtain D' where P3: "Bet E D D' \ Cong D D' B A" + using segment_construction by fastforce + obtain F' where P4: "Bet E F F' \ Cong F F' B C" + using segment_construction by fastforce + have P5: "Cong A' B D' E" + by (meson Bet_cases P1 P3 l2_11_b not_cong_1243 not_cong_4312) + have P6: "Cong B C' E F'" + by (meson P2 P4 between_symmetry cong_3421 cong_right_commutativity l2_11_b) + have "B Out A' A \ B Out C' C \ E Out D' D \ E Out F' F \ A' B C' Cong3 D' E F'" + by (metis (no_types, lifting) Cong3_def P1 P2 P3 P4 P5 P6 Tarski_neutral_dimensionless.Out_def Tarski_neutral_dimensionless_axioms assms(1) assms(2) assms(3) assms(4) assms(5) bet_neq12__neq cong_commutativity) + thus ?thesis + using l11_3_bis by blast +qed + +lemma conga_refl: + assumes "A \ B" and + "C \ B" + shows "A B C CongA A B C" + by (meson CongA_def assms(1) assms(2) cong_reflexivity segment_construction) + +lemma conga_sym: + assumes "A B C CongA A' B' C'" + shows "A' B' C' CongA A B C" +proof - + obtain A0 C0 D0 F0 where + P1: "Bet B A A0 \ Cong A A0 B' A' \ Bet B C C0 \ Cong C C0 B' C' \ Bet B' A' D0 \ Cong A' D0 B A \ Bet B' C' F0 \ Cong C' F0 B C \ Cong A0 C0 D0 F0" + using CongA_def assms by auto + thus ?thesis + proof - + have "\p pa pb pc. 
Bet B' A' p \ Cong A' p B A \ Bet B' C' pa \ Cong C' pa B C \Bet B A pb \ Cong A pb B' A' \Bet B C pc \ Cong C pc B' C' \ Cong p pa pb pc" + by (metis (no_types) Tarski_neutral_dimensionless.cong_symmetry Tarski_neutral_dimensionless_axioms P1) + thus ?thesis + using CongA_def assms by auto + qed +qed + +lemma l11_10: + assumes "A B C CongA D E F" and + "B Out A' A" and + "B Out C' C" and + "E Out D' D" and + "E Out F' F" + shows "A' B C' CongA D' E F'" +proof - + have P1: "A' \ B" + using assms(2) out_distinct by auto + have P2: "C' \ B" + using Out_def assms(3) by force + have P3: "D' \ E" + using Out_def assms(4) by blast + have P4: "F' \ E" + using assms(5) out_diff1 by auto + have P5: "\ A'0 C'0 D'0 F'0. (B Out A'0 A' \ B Out C'0 C' \ E Out D'0 D' \ E Out F'0 F' \ Cong B A'0 E D'0 \ Cong B C'0 E F'0) \ Cong A'0 C'0 D'0 F'0" + by (meson assms(1) assms(2) assms(3) assms(4) assms(5) l11_4_1 l6_7) + thus ?thesis using P1 P2 P3 P4 P5 l11_4_2 by blast +qed + +lemma out2__conga: + assumes "B Out A' A" and + "B Out C' C" + shows "A B C CongA A' B C'" + by (smt assms(1) assms(2) between_trivial2 conga_refl l11_10 out2_bet_out out_distinct) + +lemma cong3_diff: + assumes "A \ B" and + "A B C Cong3 A' B' C'" + shows "A' \ B'" + using Cong3_def assms(1) assms(2) cong_diff by blast + +lemma cong3_diff2: + assumes "B \ C" and + "A B C Cong3 A' B' C'" + shows "B' \ C'" + using Cong3_def assms(1) assms(2) cong_diff by blast + +lemma cong3_conga: + assumes "A \ B" and + "C \ B" and + "A B C Cong3 A' B' C'" + shows "A B C CongA A' B' C'" + by (metis assms(1) assms(2) assms(3) cong3_diff cong3_diff2 l11_3_bis out_trivial) + +lemma cong3_conga2: + assumes "A B C Cong3 A' B' C'" and + "A B C CongA A'' B'' C''" + shows "A' B' C' CongA A'' B'' C''" +proof - + obtain A0 C0 A2 C2 where P1: "Bet B A A0 \ Cong A A0 B'' A'' \ Bet B C C0 \ Cong C C0 B'' C''\ Bet B'' A'' A2 \ Cong A'' A2 B A \ Bet B'' C'' C2 \ Cong C'' C2 B C \ Cong A0 C0 A2 C2" + using CongA_def assms(2) by auto + obtain A1 where P5: "Bet B' A' A1 \ Cong A' A1 B'' A''" + using segment_construction by blast + obtain C1 where P6: "Bet B' C' C1 \ Cong C' C1 B'' C''" + using segment_construction by blast + have P7: "Cong A A0 A' A1" + proof - + have "Cong B'' A'' A' A1" using P5 + using Cong_perm by blast + thus ?thesis + using Cong_perm P1 cong_inner_transitivity by blast + qed + have P8: "Cong B A0 B' A1" + using Cong3_def P1 P5 P7 assms(1) cong_commutativity l2_11_b by blast + have P9: "Cong C C0 C' C1" + using P1 P6 cong_inner_transitivity cong_symmetry by blast + have P10: "Cong B C0 B' C1" + using Cong3_def P1 P6 P9 assms(1) l2_11_b by blast + have "B A A0 C FSC B' A' A1 C'" + using FSC_def P1 P5 P7 P8 Tarski_neutral_dimensionless.Cong3_def Tarski_neutral_dimensionless_axioms assms(1) bet_col l4_3 by fastforce + then have P12: "Cong A0 C A1 C'" + using CongA_def assms(2) l4_16 by auto + then have "B C C0 A0 FSC B' C' C1 A1" + using Cong3_def FSC_def P1 P10 P8 P9 assms(1) bet_col cong_commutativity by auto + then have P13: "Cong C0 A0 C1 A1" + using l4_16 CongA_def assms(2) by blast + have Q2: "Cong A' A1 B'' A''" + using P1 P7 cong_inner_transitivity by blast + have Q5: "Bet B'' A'' A2" using P1 by blast + have Q6: "Cong A'' A2 B' A'" + proof - + have "Cong B A B' A'" + using P1 P7 P8 P5 l4_3 by blast + thus ?thesis + using P1 cong_transitivity by blast + qed + have Q7: "Bet B'' C'' C2" + using P1 by blast + have Q8: "Cong C'' C2 B' C'" + proof - + have "Cong B C B' C'" + using Cong3_def assms(1) by blast + thus ?thesis + using P1 
    cong_transitivity by blast
+  qed
+  have R2: "Cong C0 A0 C2 A2"
+    using Cong_cases P1 by blast
+  have "Cong C1 A1 A0 C0"
+    using Cong_cases P13 by blast
+  then have Q9: "Cong A1 C1 A2 C2"
+    using R2 P13 cong_inner_transitivity not_cong_4321 by blast
+  thus ?thesis
+    using CongA_def P5 Q2 P6 Q5 Q6 Q7 Q8
+    by (metis assms(1) assms(2) cong3_diff cong3_diff2)
+qed
+
+lemma conga_diff1:
+  assumes "A B C CongA A' B' C'"
+  shows "A \<noteq> B"
+  using CongA_def assms by blast
+
+lemma conga_diff2:
+  assumes "A B C CongA A' B' C'"
+  shows "C \<noteq> B"
+  using CongA_def assms by blast
+
+lemma conga_diff45:
+  assumes "A B C CongA A' B' C'"
+  shows "A' \<noteq> B'"
+  using CongA_def assms by blast
+
+lemma conga_diff56:
+  assumes "A B C CongA A' B' C'"
+  shows "C' \<noteq> B'"
+  using CongA_def assms by blast
+
+lemma conga_trans:
+  assumes "A B C CongA A' B' C'" and
+    "A' B' C' CongA A'' B'' C''"
+  shows "A B C CongA A'' B'' C''"
+proof -
+  obtain A0 C0 A1 C1 where P1: "Bet B A A0 \<and> Cong A A0 B' A' \<and>
+Bet B C C0 \<and> Cong C C0 B' C' \<and> Bet B' A' A1 \<and> Cong A' A1 B A \<and>
+Bet B' C' C1 \<and> Cong C' C1 B C \<and> Cong A0 C0 A1 C1"
+    using CongA_def assms(1) by auto
+  have P2: "A'' \<noteq> B'' \<and> C'' \<noteq> B''"
+    using CongA_def assms(2) by auto
+  have P3: "A1 B' C1 CongA A'' B'' C''"
+  proof -
+    have L2: "B' Out A1 A'" using P1
+      by (metis Out_def assms(2) bet_neq12__neq conga_diff1)
+    have L3: "B' Out C1 C'" using P1
+      by (metis Out_def assms(1) bet_neq12__neq conga_diff56)
+    have L4: "B'' Out A'' A''"
+      using P2 out_trivial by auto
+    have "B'' Out C'' C''"
+      by (simp add: P2 out_trivial)
+    thus ?thesis
+      using assms(2) L2 L3 L4 l11_10 by blast
+  qed
+  have L6: "A0 B C0 CongA A' B' C'"
+    by (smt Out_cases P1 Tarski_neutral_dimensionless.conga_diff1 Tarski_neutral_dimensionless.conga_diff2 Tarski_neutral_dimensionless.conga_diff45 Tarski_neutral_dimensionless_axioms assms(1) bet_out conga_diff56 l11_10 l5_3)
+  have L7: "Cong B A0 B' A1"
+    by (meson P1 between_symmetry cong_3421 l2_11_b not_cong_1243)
+  have L8: "Cong B C0 B' C1"
+    using P1 between_symmetry cong_3421 l2_11_b not_cong_1243 by blast
+  have L10: "A0 B C0 Cong3 A1 B' C1"
+    by (simp add: Cong3_def L7 L8 P1 cong_commutativity)
+  then have L11: "A0 B C0 CongA A'' B'' C''"
+    by (meson Tarski_neutral_dimensionless.cong3_conga2 Tarski_neutral_dimensionless_axioms P3 cong_3_sym)
+  thus ?thesis using l11_10
+  proof -
+    have D2: "B Out A A0" using P1
+      using CongA_def assms(1) bet_out by auto
+    have D3: "B Out C C0" using P1
+      using CongA_def assms(1) bet_out by auto
+    have D4: "B'' Out A'' A''"
+      using P2 out_trivial by blast
+    have "B'' Out C'' C''"
+      using P2 out_trivial by auto
+    thus ?thesis using l11_10 L11 D2 D3 D4
+      by blast
+  qed
+qed
+
+lemma conga_pseudo_refl:
+  assumes "A \<noteq> B" and
+    "C \<noteq> B"
+  shows "A B C CongA C B A"
+  by (meson CongA_def assms(1) assms(2) between_trivial cong_pseudo_reflexivity segment_construction)
+
+lemma conga_trivial_1:
+  assumes "A \<noteq> B" and
+    "C \<noteq> D"
+  shows "A B A CongA C D C"
+  by (meson CongA_def assms(1) assms(2) cong_trivial_identity segment_construction)
+
+lemma l11_13:
+  assumes "A B C CongA D E F" and
+    "Bet A B A'" and
+    "A' \<noteq> B" and
+    "Bet D E D'" and
+    "D' \<noteq> E"
+  shows "A' B C CongA D' E F"
+proof -
+  obtain A'' C'' D'' F'' where P1:
+    "Bet B A A'' \<and> Cong A A'' E D \<and>
+Bet B C C'' \<and> Cong C C'' E F \<and> Bet E D D'' \<and>
+Cong D D'' B A \<and>
+Bet E F F'' \<and> Cong F F'' B C \<and> Cong A'' C'' D'' F''"
+    using CongA_def assms(1) by auto
+  obtain A0 where P2: "Bet B A' A0 \<and> Cong A' A0 E D'"
+    using segment_construction by blast
+  obtain D0 where P3: "Bet E D' D0 \<and> Cong D' D0 B A'"
+    using segment_construction by blast
+  have "Cong A0 C'' D0 F''"
+  proof -
+    have L1: "A'' B A0 C'' OFSC D'' E D0 F''"
+    proof -
+      have L2: "Bet A'' B A0"
+      proof -
+        have M1: "Bet A'' A B"
+          using Bet_perm P1 by blast
+        have M2: "Bet A B A0"
+          using P2 assms(2) assms(3) outer_transitivity_between by blast
+        have "A \<noteq> B"
+          using CongA_def assms(1) by blast
+        thus ?thesis
+          using M1 M2 outer_transitivity_between2 by blast
+      qed
+      have L3: "Bet D'' E D0" using Bet_perm P1 P2 outer_transitivity_between CongA_def
+        by (metis P3 assms(1) assms(4) assms(5))
+      have L4: "Cong A'' B D'' E"
+        by (meson P1 between_symmetry cong_3421 cong_left_commutativity l2_11_b)
+      have L5: "Cong B A0 E D0"
+        by (meson P2 P3 between_symmetry cong_3421 cong_right_commutativity l2_11_b)
+      have "Cong B C'' E F''"
+        by (meson P1 between_symmetry cong_3421 cong_right_commutativity l2_11_b)
+      thus ?thesis using P1 L2 L3 L4 L5
+        by (simp add: OFSC_def)
+    qed
+    have "A'' \<noteq> B"
+      using CongA_def P1 assms(1) bet_neq12__neq by fastforce
+    thus ?thesis
+      using L1 five_segment_with_def by auto
+  qed
+  thus ?thesis
+    using CongA_def P1 P2 P3 assms(1) assms(3) assms(5) by auto
+qed
+
+lemma conga_right_comm:
+  assumes "A B C CongA D E F"
+  shows "A B C CongA F E D"
+  by (metis Tarski_neutral_dimensionless.conga_diff45 Tarski_neutral_dimensionless.conga_sym Tarski_neutral_dimensionless.conga_trans Tarski_neutral_dimensionless_axioms assms conga_diff56 conga_pseudo_refl)
+
+lemma conga_left_comm:
+  assumes "A B C CongA D E F"
+  shows "C B A CongA D E F"
+  by (meson assms conga_right_comm conga_sym)
+
+lemma conga_comm:
+  assumes "A B C CongA D E F"
+  shows "C B A CongA F E D"
+  by (meson Tarski_neutral_dimensionless.conga_left_comm Tarski_neutral_dimensionless.conga_right_comm Tarski_neutral_dimensionless_axioms assms)
+
+lemma conga_line:
+  assumes "A \<noteq> B" and
+    "B \<noteq> C" and
+    "A' \<noteq> B'" and
+    "B' \<noteq> C'" and
+    "Bet A B C" and
+    "Bet A' B' C'"
+  shows "A B C CongA A' B' C'"
+  by (metis Bet_cases assms(1) assms(2) assms(3) assms(4) assms(5) assms(6) conga_trivial_1 l11_13)
+
+lemma l11_14:
+  assumes "Bet A B A'" and
+    "A \<noteq> B" and
+    "A' \<noteq> B" and
+    "Bet C B C'" and
+    "B \<noteq> C" and
+    "B \<noteq> C'"
+  shows "A B C CongA A' B C'"
+  by (metis Bet_perm assms(1) assms(2) assms(3) assms(4) assms(5) assms(6) conga_pseudo_refl conga_right_comm l11_13)
+
+lemma l11_16:
+  assumes "Per A B C" and
+    "A \<noteq> B" and
+    "C \<noteq> B" and
+    "Per A' B' C'" and
+    "A' \<noteq> B'" and
+    "C' \<noteq> B'"
+  shows "A B C CongA A' B' C'"
+proof -
+  obtain C0 where P1: "Bet B C C0 \<and> Cong C C0 B' C'"
+    using segment_construction by blast
+  obtain C1 where P2: "Bet B' C' C1 \<and> Cong C' C1 B C"
+    using segment_construction by blast
+  obtain A0 where P3: "Bet B A A0 \<and> Cong A A0 B' A'"
+    using segment_construction by blast
+  obtain A1 where P4: "Bet B' A' A1 \<and> Cong A' A1 B A"
+    using segment_construction by blast
+  have "Cong A0 C0 A1 C1"
+  proof -
+    have Q1: "Per A0 B C0"
+      by (metis P1 P3 Tarski_neutral_dimensionless.l8_3 Tarski_neutral_dimensionless_axioms assms(1) assms(2) assms(3) bet_col per_col)
+    have Q2: "Per A1 B' C1"
+      by (metis P2 P4 Tarski_neutral_dimensionless.l8_3 Tarski_neutral_dimensionless_axioms assms(4) assms(5) assms(6) bet_col per_col)
+    have Q3: "Cong A0 B A1 B'"
+      by (meson P3 P4 between_symmetry cong_3421 cong_left_commutativity l2_11_b)
+    have "Cong B C0 B' C1"
+      using P1 P2 between_symmetry cong_3421 l2_11_b not_cong_1243 by blast
+    thus ?thesis
+      using Q1 Q2 Q3 l10_12 by blast
+  qed
+  thus ?thesis
+    using CongA_def P1 P2 P3 P4 assms(2) assms(3) assms(5) assms(6) by auto
+qed + +lemma l11_17: + assumes "Per A B C" and + "A B C CongA A' B' C'" + shows "Per A' B' C'" +proof - + obtain A0 C0 A1 C1 where P1: "Bet B A A0 \ Cong A A0 B' A' \ Bet B C C0 \ Cong C C0 B' C' \ Bet B' A' A1 \ Cong A' A1 B A \ Bet B' C' C1 \ Cong C' C1 B C \ Cong A0 C0 A1 C1" + using CongA_def assms(2) by auto + have P2: "Per A0 B C0" + proof - + have Q1: "B \ C" + using assms(2) conga_diff2 by blast + have Q2: "Per A0 B C" + by (metis P1 Tarski_neutral_dimensionless.l8_2 Tarski_neutral_dimensionless_axioms assms(1) assms(2) bet_col conga_diff1 per_col) + have "Col B C C0" + using P1 bet_col by blast + thus ?thesis + using Q1 Q2 per_col by blast + qed + have P3: "Per A1 B' C1" + proof - + have "A0 B C0 Cong3 A1 B' C1" + by (meson Bet_cases Cong3_def P1 l2_11_b not_cong_2134 not_cong_3421) + thus ?thesis + using P2 l8_10 by blast + qed + have P4: "B' \ C1" + using P1 assms(2) between_identity conga_diff56 by blast + have P5: "Per A' B' C1" + proof - + have P6: "B' \ A1" + using P1 assms(2) between_identity conga_diff45 by blast + have P7: "Per C1 B' A1" + by (simp add: P3 l8_2) + have "Col B' A1 A'" + using P1 NCol_cases bet_col by blast + thus ?thesis + using P3 P6 Tarski_neutral_dimensionless.l8_3 Tarski_neutral_dimensionless_axioms by fastforce + qed + have "Col B' C1 C'" + using P1 bet_col col_permutation_5 by blast + thus ?thesis + using P4 P5 per_col by blast +qed + +lemma l11_18_1: + assumes "Bet C B D" and + "B \ C" and + "B \ D" and + "A \ B" and + "Per A B C" + shows "A B C CongA A B D" + by (smt Tarski_neutral_dimensionless.l8_2 Tarski_neutral_dimensionless.l8_5 Tarski_neutral_dimensionless_axioms assms(1) assms(2) assms(3) assms(4) assms(5) bet_col col_per2__per l11_16) + +lemma l11_18_2: + assumes "Bet C B D" and + "A B C CongA A B D" + shows "Per A B C" +proof - + obtain A0 C0 A1 D0 where P1: "Bet B A A0 \ Cong A A0 B A \ Bet B C C0 \ +Cong C C0 B D \ Bet B A A1 \ Cong A A1 B A \ +Bet B D D0 \ Cong D D0 B C \ Cong A0 C0 A1 D0" + using CongA_def assms(2) by auto + have P2: "A0 = A1" + by (metis P1 assms(2) conga_diff45 construction_uniqueness) + have P3: "Per A0 B C0" + proof - + have Q1: "Bet C0 B D0" + proof - + have R1: "Bet C0 C B" + using P1 between_symmetry by blast + have R2: "Bet C B D0" + proof - + have S1: "Bet C B D" + by (simp add: assms(1)) + have S2: "Bet B D D0" + using P1 by blast + have "B \ D" + using assms(2) conga_diff56 by blast + thus ?thesis + using S1 S2 outer_transitivity_between by blast + qed + have "C \ B" + using assms(2) conga_diff2 by auto + thus ?thesis + using R1 R2 outer_transitivity_between2 by blast + qed + have Q2: "Cong C0 B B D0" + by (meson P1 between_symmetry cong_3421 l2_11_b not_cong_1243) + have "Cong A0 C0 A0 D0" + using P1 P2 by blast + thus ?thesis + using Per_def Q1 Q2 midpoint_def by blast + qed + have P4: "B \ C0" + using P1 assms(2) bet_neq12__neq conga_diff2 by blast + have P5: "Per A B C0" + by (metis P1 P3 Tarski_neutral_dimensionless.l8_3 Tarski_neutral_dimensionless_axioms assms(2) bet_col bet_col1 bet_neq21__neq col_transitivity_1 conga_diff45) + have "Col B C0 C" using P1 + using NCol_cases bet_col by blast + thus ?thesis + using P4 P5 per_col by blast +qed + +lemma cong3_preserves_out: + assumes "A Out B C" and + "A B C Cong3 A' B' C'" + shows "A' Out B' C'" + by (meson assms(1) assms(2) col_permutation_4 cong3_symmetry cong_3_swap l4_13 l4_6 not_bet_and_out or_bet_out out_col) + +lemma l11_21_a: + assumes "B Out A C" and + "A B C CongA A' B' C'" + shows "B' Out A' C'" +proof - + obtain A0 C0 A1 C1 where P1: "Bet B 
A A0 \ +Cong A A0 B' A' \ Bet B C C0 \ +Cong C C0 B' C' \ Bet B' A' A1 \ +Cong A' A1 B A \ Bet B' C' C1 \ +Cong C' C1 B C \ Cong A0 C0 A1 C1" + using CongA_def assms(2) by auto + have P2: "B Out A0 C0" + by (metis P1 assms(1) bet_out l6_6 l6_7 out_diff1) + have P3: "B' Out A1 C1" + proof - + have "B A0 C0 Cong3 B' A1 C1" + by (meson Cong3_def P1 between_symmetry cong_right_commutativity l2_11_b not_cong_4312) + thus ?thesis + using P2 cong3_preserves_out by blast + qed + thus ?thesis + by (metis P1 assms(2) bet_out conga_diff45 conga_diff56 l6_6 l6_7) +qed + +lemma l11_21_b: + assumes "B Out A C" and + "B' Out A' C'" + shows "A B C CongA A' B' C'" + by (smt assms(1) assms(2) between_trivial2 conga_trivial_1 l11_10 out2_bet_out out_distinct) + +lemma conga_cop__or_out_ts: + assumes "Coplanar A B C C'" and + "A B C CongA A B C'" + shows "B Out C C' \ A B TS C C'" +proof - + obtain A0 C0 A1 C1 where P1: "Bet B A A0 \ +Cong A A0 B A \Bet B C C0 \ +Cong C C0 B C' \Bet B A A1 \ +Cong A A1 B A \Bet B C' C1 \ +Cong C' C1 B C \ Cong A0 C0 A1 C1" + using CongA_def assms(2) by auto + have P2: "A0 = A1" using P1 + by (metis assms(2) conga_diff1 construction_uniqueness) + have "B Out C C' \ A B TS C C'" + proof cases + assume "C0 = C1" + thus ?thesis + by (metis P1 assms(2) bet2__out conga_diff2 conga_diff56) + next + assume R1: "C0 \ C1" + obtain M where R2: "M Midpoint C0 C1" + using midpoint_existence by blast + have R3: "Cong B C0 B C1" + by (meson Bet_cases P1 l2_11_b not_cong_2134 not_cong_3421) + have R3A: "Cong A0 C0 A0 C1" + using P1 P2 by blast + then have R4: "Per A0 M C0" using R2 + using Per_def by blast + have R5: "Per B M C0" + using Per_def R2 R3 by auto + then have R6: "Per B M C1" + using R2 l8_4 by blast + have R7: "B \ A0" + using P1 assms(2) bet_neq12__neq conga_diff45 by blast + then have "Cong A C0 A C1" + by (meson Col_perm P1 R3 R3A bet_col l4_17) + then have R9: "Per A M C0" + using Per_def R2 by blast + then have R10: "Per A M C1" + by (meson R2 Tarski_neutral_dimensionless.l8_4 Tarski_neutral_dimensionless_axioms) + have R11: "Col B A M" + proof - + have S1: "Coplanar C0 B A M" + proof - + have "Coplanar B A C0 M" + proof - + have T1: "Coplanar B A C0 C1" + proof - + have "Coplanar A C0 B C'" + proof - + have "Coplanar A C' B C0" + proof - + have U1: "Coplanar A C' B C" + by (simp add: assms(1) coplanar_perm_4) + have U2: "B \ C" + using assms(2) conga_diff2 by blast + have "Col B C C0" + by (simp add: P1 bet_col) + thus ?thesis + by (meson Tarski_neutral_dimensionless.col_cop__cop Tarski_neutral_dimensionless_axioms U1 U2) + qed + thus ?thesis + using ncoplanar_perm_5 by blast + qed + + thus ?thesis + by (metis P1 Tarski_neutral_dimensionless.col_cop__cop Tarski_neutral_dimensionless_axioms assms(2) bet_col conga_diff56 coplanar_perm_12) + qed + have "Col C0 C1 M" + using Col_perm R2 midpoint_col by blast + thus ?thesis + using T1 R1 col_cop__cop by blast + qed + thus ?thesis + using ncoplanar_perm_8 by blast + qed + have "C0 \ M" + using R1 R2 midpoint_distinct_1 by blast + thus ?thesis + using R5 R9 S1 cop_per2__col by blast + qed + have "B Out C C' \ A B TS C C'" + proof cases + assume Q1: "B = M" + have Q2: "\ Col A B C" + by (metis Col_def P1 Q1 R9 assms(2) conga_diff1 conga_diff2 l6_16_1 l8_9 not_bet_and_out out_trivial) + have Q3: "\ Col A B C'" + by (metis Col_def P1 Q1 R10 assms(2) conga_diff1 conga_diff56 l6_16_1 l8_9 not_bet_and_out out_trivial) + have Q4: "Col B A B" + by (simp add: col_trivial_3) + have "Bet C B C'" + proof - + have S1: "Bet C1 C' B" + using 
Bet_cases P1 by blast + have "Bet C1 B C" + proof - + have T1: "Bet C0 C B" + using Bet_cases P1 by blast + have "Bet C0 B C1" + by (simp add: Q1 R2 midpoint_bet) + thus ?thesis + using T1 between_exchange3 between_symmetry by blast + qed + thus ?thesis + using S1 between_exchange3 between_symmetry by blast + qed + thus ?thesis + by (metis Q2 Q3 Q4 bet__ts col_permutation_4 invert_two_sides) + next + assume L1: "B \ M" + have L2: "B M TS C0 C1" + proof - + have M1: "\ Col C0 B M" + by (metis (no_types) Col_perm L1 R1 R2 R5 is_midpoint_id l8_9) + have M2: "\ Col C1 B M" + using Col_perm L1 R1 R2 R6 l8_9 midpoint_not_midpoint by blast + have M3: "Col M B M" + using col_trivial_3 by auto + have "Bet C0 M C1" + by (simp add: R2 midpoint_bet) + thus ?thesis + using M1 M2 M3 TS_def by blast + qed + have "A B TS C C'" + proof - + have W2: "A B TS C C1" + proof - + have V1: "A B TS C0 C1" + using L2 P1 R11 R7 col_two_sides cong_diff invert_two_sides not_col_permutation_5 by blast + have "B Out C0 C" + using L2 Out_def P1 TS_def assms(2) col_trivial_1 conga_diff2 by auto + thus ?thesis + using V1 col_trivial_3 l9_5 by blast + qed + then have W1: "A B TS C' C" + proof - + have Z1: "A B TS C1 C" + by (simp add: W2 l9_2) + have Z2: "Col B A B" + using not_col_distincts by blast + have "B Out C1 C'" + using L2 Out_def P1 TS_def assms(2) col_trivial_1 conga_diff56 by auto + thus ?thesis + using Z1 Z2 l9_5 by blast + qed + thus ?thesis + by (simp add: l9_2) + qed + thus ?thesis by blast + qed + thus ?thesis by blast + qed + thus ?thesis by blast +qed + +lemma conga_os__out: + assumes "A B C CongA A B C'" and + "A B OS C C'" + shows "B Out C C'" + using assms(1) assms(2) conga_cop__or_out_ts l9_9 os__coplanar by blast + +lemma cong2_conga_cong: + assumes "A B C CongA A' B' C'" and + "Cong A B A' B'" and + "Cong B C B' C'" + shows "Cong A C A' C'" + by (smt assms(1) assms(2) assms(3) cong_4321 l11_3 l11_4_1 not_cong_3412 out_distinct out_trivial) + +lemma angle_construction_1: + assumes "\ Col A B C" and + "\ Col A' B' P" + shows "\ C'. (A B C CongA A' B' C' \ A' B' OS C' P)" +proof - + obtain C0 where P1: "Col B A C0 \ B A Perp C C0" + using assms(1) col_permutation_4 l8_18_existence by blast + have "\ C'. (A B C CongA A' B' C' \ A' B' OS C' P)" + proof cases + assume P1A: "B = C0" + obtain C' where P2: "Per C' B' A' \ Cong C' B' C B \ A' B' OS C' P" + by (metis assms(1) assms(2) col_trivial_1 col_trivial_2 ex_per_cong) + have P3: "A B C CongA A' B' C'" + by (metis P1 P2 Tarski_neutral_dimensionless.l8_2 Tarski_neutral_dimensionless.os_distincts Tarski_neutral_dimensionless_axioms P1A assms(1) l11_16 not_col_distincts perp_per_1) + thus ?thesis using P2 by blast + next + assume P4: "B \ C0" + have "\ C'. (A B C CongA A' B' C' \ A' B' OS C' P)" + proof cases + assume R1: "B Out A C0" + obtain C0' where R2: "B' Out A' C0' \ Cong B' C0' B C0" + by (metis P4 assms(2) col_trivial_1 segment_construction_3) + have "\ C'. Per C' C0' B' \ Cong C' C0' C C0 \ B' C0' OS C' P" + proof - + have R4: "B' \ C0'" + using Out_def R2 by auto + have R5: "C \ C0" + using P1 perp_distinct by blast + have R6: "Col B' C0' C0'" + by (simp add: col_trivial_2) + have "\ Col B' C0' P" + using NCol_cases R2 R4 assms(2) col_transitivity_1 out_col by blast + then have "\ C'. 
Per C' C0' B' \ +Cong C' C0' C C0 \ B' C0' OS C' P" using R4 R5 R6 ex_per_cong by blast + thus ?thesis by auto + qed + then obtain C' where R7: "Per C' C0' B' \ +Cong C' C0' C C0 \ B' C0' OS C' P" by auto + then have R8: "C0 B C Cong3 C0' B' C'" + by (meson Cong3_def P1 R2 col_trivial_2 l10_12 l8_16_1 not_col_permutation_2 not_cong_2143 not_cong_4321) + have R9: "A B C CongA A' B' C'" + proof - + have S1: "C0 B C CongA C0' B' C'" + by (metis P4 R8 assms(1) cong3_conga not_col_distincts) + have S3: "B Out C C" + using assms(1) not_col_distincts out_trivial by force + have "B' \ C'" + using R8 assms(1) cong3_diff2 not_col_distincts by blast + then have "B' Out C' C'" + using out_trivial by auto + thus ?thesis + using S1 R1 S3 R2 l11_10 by blast + qed + have "B' A' OS C' P" + proof - + have T1: "Col B' C0' A'" + by (meson NCol_cases R2 Tarski_neutral_dimensionless.out_col Tarski_neutral_dimensionless_axioms) + have "B' \ A'" + using assms(2) col_trivial_1 by auto + thus ?thesis + using T1 R7 col_one_side by blast + qed + then have "A' B' OS C' P" + by (simp add: invert_one_side) + thus ?thesis + using R9 by blast + next + assume U1: "\ B Out A C0" + then have U2: "Bet A B C0" + using NCol_perm P1 or_bet_out by blast + obtain C0' where U3: "Bet A' B' C0' \ Cong B' C0' B C0" + using segment_construction by blast + have U4: "\ C'. Per C' C0' B' \ Cong C' C0' C C0 \ B' C0' OS C' P" + proof - + have V2: "C \ C0" + using Col_cases P1 assms(1) by blast + have "B' \ C0'" + using P4 U3 cong_diff_3 by blast + then have "\ Col B' C0' P" + using Col_def U3 assms(2) col_transitivity_1 by blast + thus ?thesis using ex_per_cong + using V2 not_col_distincts by blast + qed + then obtain C' where U5: "Per C' C0' B' \ Cong C' C0' C C0 \ B' C0' OS C' P" + by blast + have U98: "A B C CongA A' B' C'" + proof - + have X1: "C0 B C Cong3 C0' B' C'" + proof - + have X2: "Cong C0 B C0' B'" + using Cong_cases U3 by blast + have X3: "Cong C0 C C0' C'" + using U5 not_cong_4321 by blast + have "Cong B C B' C'" + proof - + have Y1: "Per C C0 B" + using P1 col_trivial_3 l8_16_1 by blast + have "Cong C C0 C' C0'" + using U5 not_cong_3412 by blast + thus ?thesis + using Cong_perm Y1 U5 X2 l10_12 by blast + qed + thus ?thesis + by (simp add: Cong3_def X2 X3) + qed + have X22: "Bet C0 B A" + using U2 between_symmetry by blast + have X24: "Bet C0' B' A'" + using Bet_cases U3 by blast + have "A' \ B'" + using assms(2) not_col_distincts by blast + thus ?thesis + by (metis P4 X1 X22 X24 assms(1) cong3_conga l11_13 not_col_distincts) + qed + have "A' B' OS C' P" + proof - + have "B' A' OS C' P" + proof - + have W1: "Col B' C0' A'" + by (simp add: Col_def U3) + have "B' \ A'" + using assms(2) not_col_distincts by blast + thus ?thesis + using W1 U5 col_one_side by blast + qed + thus ?thesis + by (simp add: invert_one_side) + qed + thus ?thesis + using U98 by blast + qed + thus ?thesis by auto + qed + thus ?thesis by auto +qed + +lemma angle_construction_2: + assumes "A \ B" (*and + "A \ C"*) and + "B \ C" (*and + "A' \ B'"*) and + "\ Col A' B' P" + shows "\ C'. (A B C CongA A' B' C' \ (A' B' OS C' P \ Col A' B' C'))" + by (metis Col_def angle_construction_1 assms(1) assms(2) assms(3) col_trivial_3 conga_line l11_21_b or_bet_out out_trivial point_construction_different) + +lemma ex_conga_ts: + assumes "\ Col A B C" and + "\ Col A' B' P" + shows "\ C'. 
A B C CongA A' B' C' \ A' B' TS C' P" +proof - + obtain P' where P1: "A' Midpoint P P'" + using symmetric_point_construction by blast + have P2: "\ Col A' B' P'" + by (metis P1 assms(2) col_transitivity_1 midpoint_col midpoint_distinct_2 not_col_distincts) + obtain C' where P3: + "A B C CongA A' B' C' \ A' B' OS C' P'" + using P2 angle_construction_1 assms(1) by blast + have "A' B' TS P' P" + using P1 P2 assms(2) bet__ts l9_2 midpoint_bet not_col_distincts by auto + thus ?thesis + using P3 l9_8_2 one_side_symmetry by blast +qed + +lemma l11_15: + assumes "\ Col A B C" and + "\ Col D E P" + shows + "\ F. (A B C CongA D E F \ E D OS F P) \ + (\ F1 F2. ((A B C CongA D E F1 \ E D OS F1 P) \ + (A B C CongA D E F2 \ E D OS F2 P)) + \ E Out F1 F2)" +proof - + obtain F where P1: "A B C CongA D E F \ D E OS F P" + using angle_construction_1 assms(1) assms(2) by blast + then have P2: "A B C CongA D E F \ E D OS F P" + using invert_one_side by blast + have "(\ F1 F2. ((A B C CongA D E F1 \ E D OS F1 P) \ + (A B C CongA D E F2 \ E D OS F2 P)) \ E Out F1 F2)" + proof - + { + fix F1 F2 + assume P3: "((A B C CongA D E F1 \ E D OS F1 P) \ + (A B C CongA D E F2 \ E D OS F2 P))" + then have P4: "A B C CongA D E F1" by simp + have P5: "E D OS F1 P" using P3 by simp + have P6: "A B C CongA D E F2" using P3 by simp + have P7: "E D OS F2 P" using P3 by simp + have P8: "D E F1 CongA D E F2" + using P4 conga_sym P6 conga_trans by blast + have "D E OS F1 F2" + using P5 P7 invert_one_side one_side_symmetry one_side_transitivity by blast + then have "E Out F1 F2" using P8 conga_os__out + by (meson P3 conga_sym conga_trans) + } + thus ?thesis by auto + qed + thus ?thesis + using P2 by blast +qed + +lemma l11_19: + assumes "Per A B P1" and + "Per A B P2" and + "A B OS P1 P2" + shows "B Out P1 P2" +proof cases + assume "Col A B P1" + thus ?thesis + using assms(3) col123__nos by blast +next + assume P1: "\ Col A B P1" + have "B Out P1 P2" + proof cases + assume "Col A B P2" + thus ?thesis + using assms(3) one_side_not_col124 by blast + next + assume P2: "\ Col A B P2" + obtain x where "A B P1 CongA A B x \ B A OS x P2 \ + (\ F1 F2. 
((A B P1 CongA A B F1 \ B A OS F1 P2) \ + (A B P1 CongA A B F2 \ B A OS F2 P2))\ B Out F1 F2)" + using P1 P2 l11_15 by blast + thus ?thesis + by (metis P1 P2 assms(1) assms(2) assms(3) conga_os__out l11_16 not_col_distincts) + qed + thus ?thesis + by simp +qed + +lemma l11_22_bet: + assumes "Bet A B C" and + "P' B' TS A' C'" and + "A B P CongA A' B' P'" and + "P B C CongA P' B' C'" + shows "Bet A' B' C'" +proof - + obtain C'' where P1: "Bet A' B' C'' \ Cong B' C'' B C" + using segment_construction by blast + have P2: "C B P CongA C'' B' P'" + by (metis P1 assms(1) assms(3) assms(4) cong_diff_3 conga_diff2 l11_13) + have P3: "C'' B' P' CongA C' B' P'" + by (meson P2 Tarski_neutral_dimensionless.conga_sym Tarski_neutral_dimensionless_axioms assms(4) conga_comm conga_trans) + have P4: "B' Out C' C'' \ P' B' TS C' C''" + proof - + have P5: "Coplanar P' B' C' C''" + by (meson P1 TS_def assms(2) bet__coplanar coplanar_trans_1 ncoplanar_perm_1 ncoplanar_perm_8 ts__coplanar) + have "P' B' C' CongA P' B' C''" + using P3 conga_comm conga_sym by blast + thus ?thesis + by (simp add: P5 conga_cop__or_out_ts) + qed + have P6: "B' Out C' C'' \ Bet A' B' C'" + proof - + { + assume "B' Out C' C''" + then have "Bet A' B' C'" + using P1 bet_out_out_bet between_exchange4 between_trivial2 col_trivial_3 l6_6 not_bet_out by blast + } + thus ?thesis by simp + qed + have "P' B' TS C' C'' \ Bet A' B' C'" + proof - + { + assume P7: "P' B' TS C' C''" + then have "Bet A' B' C'" + proof cases + assume "Col C' B' P'" + thus ?thesis + using Col_perm TS_def assms(2) by blast + next + assume Q1: "\ Col C' B' P'" + then have Q2: "B' \ P'" + using not_col_distincts by blast + have Q3: "B' P' TS A' C''" + by (metis Col_perm P1 TS_def P7 assms(2) col_trivial_3) + have Q4: "B' P' OS C' C''" + using P7 Q3 assms(2) invert_two_sides l9_8_1 l9_9 by blast + thus ?thesis + using P7 invert_one_side l9_9 by blast + qed + } + thus ?thesis by simp + qed + thus ?thesis using P6 P4 by blast +qed + +lemma l11_22a: + assumes "B P TS A C" and + "B' P' TS A' C'" and + "A B P CongA A' B' P'" and + "P B C CongA P' B' C'" + shows "A B C CongA A' B' C'" +proof - + have P1: "A \ B \ A' \ B' \ P \ B \ P' \ B' \ C \ B \ C' \ B'" + using assms(3) assms(4) conga_diff1 conga_diff2 conga_diff45 conga_diff56 by auto + have P2: "A \ C \ A' \ C'" + using assms(1) assms(2) not_two_sides_id by blast + obtain A'' where P3: "B' Out A' A'' \ Cong B' A'' B A" + using P1 segment_construction_3 by force + have P4: "\ Col A B P" + using TS_def assms(1) by blast + obtain T where P5: "Col T B P \ Bet A T C" + using TS_def assms(1) by blast + have "A B C CongA A' B' C'" + proof cases + assume "B = T" + thus ?thesis + by (metis P1 P5 assms(2) assms(3) assms(4) conga_line invert_two_sides l11_22_bet) + next + assume P6: "B \ T" + have "A B C CongA A' B' C'" + proof cases + assume P7A: "Bet P B T" + obtain T'' where T1: "Bet P' B' T'' \ Cong B' T'' B T" + using segment_construction by blast + have "\ T''. 
+ Col B' P' T'' \ (B' Out P' T'' \ B Out P T) \ Cong B' T'' B T" + proof - + have T2: "Col B' P' T''" using T1 + by (simp add: bet_col col_permutation_4) + have "(B' Out P' T'' \ B Out P T) \ Cong B' T'' B T" + using P7A T1 not_bet_and_out by blast + thus ?thesis using T2 by blast + qed + then obtain T'' where T3: + "Col B' P' T'' \ (B' Out P' T'' \ B Out P T) \ Cong B' T'' B T" by blast + then have T4: "B' \ T''" + using P6 cong_diff_3 by blast + obtain C'' where T5: "Bet A'' T'' C'' \ Cong T'' C'' T C" + using segment_construction by blast + have T6: "A B T CongA A' B' T''" + by (smt Out_cases P5 P6 T3 T4 P7A assms(3) between_symmetry col_permutation_4 conga_comm l11_13 l6_4_1 or_bet_out) + then have T7: "A B T CongA A'' B' T''" + by (smt P3 P4 P6 T3 Tarski_neutral_dimensionless.l11_10 Tarski_neutral_dimensionless_axioms bet_out col_trivial_3 cong_diff_3 l5_2 l6_6 not_col_permutation_1 or_bet_out) + then have T8: "Cong A T A'' T''" + using P3 T3 cong2_conga_cong cong_4321 not_cong_3412 by blast + have T9: "Cong A C A'' C''" + using P5 T5 T8 cong_symmetry l2_11_b by blast + have T10: "Cong C B C'' B'" + by (smt P3 P4 P5 T3 T5 T8 cong_commutativity cong_symmetry five_segment) + have "A B C Cong3 A'' B' C''" + using Cong3_def P3 T10 T9 not_cong_2143 not_cong_4321 by blast + then have T11: "A B C CongA A'' B' C''" + by (simp add: Tarski_neutral_dimensionless.cong3_conga Tarski_neutral_dimensionless_axioms P1) + have "C B T Cong3 C'' B' T''" + by (simp add: Cong3_def T10 T3 T5 cong_4321 cong_symmetry) + then have T12: "C B T CongA C'' B' T''" + using P1 P6 cong3_conga by auto + have T13: "P B C CongA P' B' C''" + proof - + have K4: "Bet T B P" + using Bet_perm P7A by blast + have "Bet T'' B' P'" + using Col_perm P7A T3 l6_6 not_bet_and_out or_bet_out by blast + thus ?thesis + using K4 P1 T12 conga_comm l11_13 by blast + qed + have T14: "P' B' C' CongA P' B' C''" + proof - + have "P' B' C' CongA P B C" + by (simp add: assms(4) conga_sym) + thus ?thesis + using T13 conga_trans by blast + qed + have T15: "B' Out C' C'' \ P' B' TS C' C''" + proof - + have K7: "Coplanar P' B' C' C''" + proof - + have K8: "Coplanar A' P' B' C'" + using assms(2) coplanar_perm_14 ts__coplanar by blast + have K8A: "Coplanar P' C'' B' A''" + proof - + have "Col P' B' T'' \ Col C'' A'' T''" + using Col_def Col_perm T3 T5 by blast + then have "Col P' C'' T'' \ Col B' A'' T'' \ +Col P' B' T'' \ Col C'' A'' T'' \ Col P' A'' T'' \ Col C'' B' T''" by simp + thus ?thesis + using Coplanar_def by auto + qed + then have "Coplanar A' P' B' C''" + proof - + have K9: "B' \ A''" + using P3 out_distinct by blast + have "Col B' A'' A'" + using Col_perm P3 out_col by blast + thus ?thesis + using K8A K9 col_cop__cop coplanar_perm_19 by blast + qed + thus ?thesis + by (meson K8 TS_def assms(2) coplanar_perm_7 coplanar_trans_1 ncoplanar_perm_2) + qed + thus ?thesis + by (simp add: T14 K7 conga_cop__or_out_ts) + qed + have "A B C CongA A' B' C'" + proof cases + assume "B' Out C' C''" + thus ?thesis + using P1 P3 T11 l11_10 out_trivial by blast + next + assume W1: "\ B' Out C' C''" + then have W1A: " P' B' TS C' C''" using T15 by simp + have W2: "B' P' TS A'' C'" + using P3 assms(2) col_trivial_1 l9_5 by blast + then have W3: "B' P' OS A'' C''" + using T15 W1 invert_two_sides l9_2 l9_8_1 by blast + have W4: "B' P' TS A'' C''" + proof - + have "\ Col A' B' P'" + using TS_def assms(2) by auto + thus ?thesis + using Col_perm T3 T5 W3 one_side_chara by blast + qed + thus ?thesis + using W1A W2 invert_two_sides l9_8_1 l9_9 by blast + qed + thus 
?thesis by simp + next + assume R1: "\ Bet P B T" + then have R2: "B Out P T" + using Col_cases P5 l6_4_2 by blast + have R2A: "\ T''. Col B' P' T'' \ (B' Out P' T'' \ B Out P T) \ Cong B' T'' B T" + proof - + obtain T'' where R3: "B' Out P' T'' \ Cong B' T'' B T" + using P1 P6 segment_construction_3 by fastforce + thus ?thesis + using R2 out_col by blast + qed + then obtain T'' where R4: "Col B' P' T'' \ (B' Out P' T'' \ B Out P T) \ Cong B' T'' B T" by auto + have R5: "B' \ T''" + using P6 R4 cong_diff_3 by blast + obtain C'' where R6: "Bet A'' T'' C'' \ Cong T'' C'' T C" + using segment_construction by blast + have R7: "A B T CongA A' B' T''" + using P1 R2 R4 assms(3) l11_10 l6_6 out_trivial by auto + have R8: "A B T CongA A'' B' T''" + using P3 P4 R2 R4 assms(3) l11_10 l6_6 not_col_distincts out_trivial by blast + have R9: "Cong A T A'' T''" + using Cong_cases P3 R4 R8 cong2_conga_cong by blast + have R10: "Cong A C A'' C''" + using P5 R6 R9 l2_11_b not_cong_3412 by blast + have R11: "Cong C B C'' B'" + by (smt P3 P4 P5 R4 R6 R9 cong_commutativity cong_symmetry five_segment) + have "A B C Cong3 A'' B' C''" + by (simp add: Cong3_def P3 R10 R11 cong_4321 cong_commutativity) + then have R12: "A B C CongA A'' B' C''" + using P1 by (simp add: cong3_conga) + have "C B T Cong3 C'' B' T''" + using Cong3_def R11 R4 R6 not_cong_3412 not_cong_4321 by blast + then have R13: "C B T CongA C'' B' T''" + using P1 P6 Tarski_neutral_dimensionless.cong3_conga Tarski_neutral_dimensionless_axioms by fastforce + have R13A: "\ Col A' B' P'" + using TS_def assms(2) by blast + have R14: "B' Out C' C'' \ P' B' TS C' C''" + proof - + have S1: "Coplanar P' B' C' C''" + proof - + have T1: "Coplanar A' P' B' C'" + using assms(2) ncoplanar_perm_14 ts__coplanar by blast + have "Coplanar A' P' B' C''" + proof - + have U6: "B' \ A''" + using P3 out_diff2 by blast + have "Coplanar P' C'' B' A''" + proof - + have "Col P' B' T'' \ Col C'' A'' T''" + using Col_def Col_perm R4 R6 by blast + thus ?thesis using Coplanar_def by auto + qed + thus ?thesis + by (meson Col_cases P3 U6 col_cop__cop ncoplanar_perm_21 ncoplanar_perm_6 out_col) + qed + thus ?thesis + using NCol_cases R13A T1 coplanar_trans_1 by blast + qed + have "P' B' C' CongA P' B' C''" + proof - + have "C B P CongA C'' B' P'" + using P1 R12 R13 R2 R4 conga_diff56 l11_10 out_trivial by presburger + then have "C' B' P' CongA C'' B' P'" + by (meson Tarski_neutral_dimensionless.conga_trans Tarski_neutral_dimensionless_axioms assms(4) conga_comm conga_sym) + thus ?thesis + by (simp add: conga_comm) + qed + thus ?thesis + by (simp add: S1 conga_cop__or_out_ts) + qed + have S1: "B Out A A" + using P4 not_col_distincts out_trivial by blast + have S2: "B Out C C" + using TS_def assms(1) not_col_distincts out_trivial by auto + have S3: "B' Out A' A''" using P3 by simp + have "A B C CongA A' B' C'" + proof cases + assume "B' Out C' C''" + thus ?thesis using S1 S2 S3 + using R12 l11_10 by blast + next + assume "\ B' Out C' C''" + then have Z3: "P' B' TS C' C''" using R14 by simp + have Q1: "B' P' TS A'' C'" + using S3 assms(2) l9_5 not_col_distincts by blast + have Q2: "B' P' OS A'' C''" + proof - + have "B' P' TS C'' C'" + proof - + have "B' P' TS C' C''" using Z3 + using invert_two_sides by blast + thus ?thesis + by (simp add: l9_2) + qed + thus ?thesis + using Q1 l9_8_1 by blast + qed + have "B' P' TS A'' C''" + using Col_perm Q2 R4 R6 one_side_chara by blast + thus ?thesis + using Q2 l9_9 by blast + qed + thus ?thesis using S1 S2 S3 + using R12 l11_10 by blast + qed + 
thus ?thesis by simp + qed + thus ?thesis by simp +qed + +lemma l11_22b: + assumes "B P OS A C" and + "B' P' OS A' C'" and + "A B P CongA A' B' P'" and + "P B C CongA P' B' C'" + shows "A B C CongA A' B' C'" +proof - + obtain D where P1: "Bet A B D \ Cong B D A B" + using segment_construction by blast + obtain D' where P2: "Bet A' B' D' \ Cong B' D' A' B'" + using segment_construction by blast + have P3: "D B P CongA D' B' P'" + proof - + have Q3: "D \ B" + by (metis P1 assms(1) col_trivial_3 cong_diff_3 one_side_not_col124 one_side_symmetry) + have Q5: "D' \ B'" + by (metis P2 assms(2) col_trivial_3 cong_diff_3 one_side_not_col124 one_side_symmetry) + thus ?thesis + using assms(3) P1 Q3 P2 l11_13 by blast + qed + have P5: "D B C CongA D' B' C'" + proof - + have Q1: "B P TS D C" + by (metis P1 assms(1) bet__ts col_trivial_3 cong_diff_3 l9_2 l9_8_2 one_side_not_col124 one_side_symmetry) + have "B' P' TS D' C'" by (metis Cong_perm P2 assms(2) bet__ts between_cong between_trivial2 l9_2 l9_8_2 one_side_not_col123 point_construction_different ts_distincts) + thus ?thesis + using assms(4) Q1 P3 l11_22a by blast + qed + have P6: "Bet D B A" + using Bet_perm P1 by blast + have P7: "A \ B" + using assms(3) conga_diff1 by auto + have P8: "Bet D' B' A'" + using Bet_cases P2 by blast + have "A' \ B'" + using assms(3) conga_diff45 by blast + thus ?thesis + using P5 P6 P7 P8 l11_13 by blast +qed + +lemma l11_22: + assumes "((B P TS A C \ B' P' TS A' C')\(B P OS A C \ B' P' OS A' C'))" and + "A B P CongA A' B' P'" and + "P B C CongA P' B' C'" + shows "A B C CongA A' B' C'" + by (meson assms(1) assms(2) assms(3) l11_22a l11_22b) + +lemma l11_24: + assumes "P InAngle A B C" + shows "P InAngle C B A" +proof - + obtain pp :: "'p \ 'p \ 'p \ 'p \ 'p" where + "\x0 x1 x2 x3. (\v4. 
Bet x2 v4 x0 \ (v4 = x1 \ x1 Out v4 x3)) = (Bet x2 (pp x0 x1 x2 x3) x0 \ ((pp x0 x1 x2 x3) = x1 \ x1 Out (pp x0 x1 x2 x3) x3))" + by moura + then have "A \ B \ C \ B \ P \ B \Bet A (pp C B A P) C \ ((pp C B A P) = B \ B Out (pp C B A P) P)" + using InAngle_def assms by presburger + thus ?thesis + by (metis (no_types) InAngle_def between_symmetry) +qed + +lemma col_in_angle: + assumes "A \ B" and + "C \ B" and + "P \ B" and + "B Out A P \ B Out C P" + shows "P InAngle A B C" + by (meson InAngle_def assms(1) assms(2) assms(3) assms(4) between_trivial between_trivial2) + +lemma out321__inangle: + assumes "C \ B" and + "B Out A P" + shows "P InAngle A B C" + using assms(1) assms(2) col_in_angle out_distinct by auto + +lemma inangle1123: + assumes "A \ B" and + "C \ B" + shows "A InAngle A B C" + by (simp add: assms(1) assms(2) out321__inangle out_trivial) + +lemma out341__inangle: + assumes "A \ B" and + "B Out C P" + shows "P InAngle A B C" + using assms(1) assms(2) col_in_angle out_distinct by auto + +lemma inangle3123: + assumes "A \ B" and + "C \ B" + shows "C InAngle A B C" + by (simp add: assms(1) assms(2) inangle1123 l11_24) + +lemma in_angle_two_sides: + assumes "\ Col B A P" and + "\ Col B C P" and + "P InAngle A B C" + shows "P B TS A C" + by (metis InAngle_def TS_def assms(1) assms(2) assms(3) not_col_distincts not_col_permutation_1 out_col) + +lemma in_angle_out: + assumes "B Out A C" and + "P InAngle A B C" + shows "B Out A P" + by (metis InAngle_def assms(1) assms(2) not_bet_and_out out2_bet_out) + +lemma col_in_angle_out: + assumes "Col B A P" and + "\ Bet A B C" and + "P InAngle A B C" + shows "B Out A P" +proof - + obtain X where P1: "Bet A X C \ (X = B \ B Out X P)" + using InAngle_def assms(3) by auto + have "B Out A P" + proof cases + assume "X = B" + thus ?thesis + using P1 assms(2) by blast + next + assume P2: "X \ B" + thus ?thesis + proof - + have f1: "Bet B A P \ A Out B P" + by (meson assms(1) l6_4_2) + have f2: "B Out X P" + using P1 P2 by blast + have f3: "(Bet B P A \Bet B A P) \Bet P B A" + using f1 by (meson Bet_perm Out_def) + have f4: "Bet B P X \Bet P X B" + using f2 by (meson Bet_perm Out_def) + then have f5: "((Bet B P X \Bet X B A) \Bet B P A) \Bet B A P" + using f3 by (meson between_exchange3) + have "\p. 
Bet p X C \ \Bet A p X" + using P1 between_exchange3 by blast + then have f6: "(P = B \Bet B A P) \Bet B P A" + using f5 f3 by (meson Bet_perm P2 assms(2) outer_transitivity_between2) + have f7: "Bet C X A" + using Bet_perm P1 by blast + have "P \ B" + using f2 by (simp add: Out_def) + moreover + { assume "Bet B B C" + then have "A \ B" + using assms(2) by blast } + ultimately have "A \ B" + using f7 f4 f1 by (meson Bet_perm Out_def P2 between_exchange3 outer_transitivity_between2) + thus ?thesis + using f6 f2 by (simp add: Out_def) + qed + qed + thus ?thesis by blast +qed + +lemma l11_25_aux: + assumes "P InAngle A B C" and + "\ Bet A B C" and + "B Out A' A" + shows "P InAngle A' B C" +proof - + have P1: "Bet B A' A \ Bet B A A'" + using Out_def assms(3) by auto + { + assume P2: "Bet B A' A" + obtain X where P3: "Bet A X C \ (X = B \ B Out X P)" + using InAngle_def assms(1) by auto + obtain T where P4: "Bet A' T C \ Bet X T B" + using Bet_perm P2 P3 inner_pasch by blast + { + assume "X = B" + then have "P InAngle A' B C" + using P3 assms(2) by blast + } + { + assume "B Out X P" + then have "P InAngle A' B C" + by (metis InAngle_def P4 assms(1) assms(3) bet_out_1 l6_7 out_diff1) + } + then have "P InAngle A' B C" + using P3 \X = B \ P InAngle A' B C\ by blast + } + { + assume Q0: "Bet B A A'" + obtain X where Q1: "Bet A X C \ (X = B \ B Out X P)" + using InAngle_def assms(1) by auto + { + assume "X = B" + then have "P InAngle A' B C" + using Q1 assms(2) by blast + } + { + assume Q2: "B Out X P" + obtain T where Q3: "Bet A' T C \ Bet B X T" + using Bet_perm Q1 Q0 outer_pasch by blast + then have "P InAngle A' B C" + by (metis InAngle_def Q0 Q2 assms(1) bet_out l6_6 l6_7 out_diff1) + } + then have "P InAngle A' B C" + using \X = B \ P InAngle A' B C\ Q1 by blast + } + thus ?thesis + using P1 \Bet B A' A \ P InAngle A' B C\ by blast +qed + +lemma l11_25: + assumes "P InAngle A B C" and + "B Out A' A" and + "B Out C' C" and + "B Out P' P" + shows "P' InAngle A' B C'" +proof cases + assume "Bet A B C" + thus ?thesis + by (metis Bet_perm InAngle_def assms(2) assms(3) assms(4) bet_out__bet l6_6 out_distinct) +next + assume P1: "\ Bet A B C" + have P2: "P InAngle A' B C" + using P1 assms(1) assms(2) l11_25_aux by blast + have P3: "P InAngle A' B C'" + proof - + have "P InAngle C' B A'" using l11_25_aux + using Bet_perm P1 P2 assms(2) assms(3) bet_out__bet l11_24 by blast + thus ?thesis using l11_24 by blast + qed + obtain X where P4: "Bet A' X C' \ (X = B \ B Out X P)" + using InAngle_def P3 by auto + { + assume "X = B" + then have "P' InAngle A' B C'" + using InAngle_def P3 P4 assms(4) out_diff1 by auto + } + { + assume "B Out X P" + then have "P' InAngle A' B C'" + proof - + have "\p. B Out p P' \ \ B Out p P" + by (meson Out_cases assms(4) l6_7) + thus ?thesis + by (metis (no_types) InAngle_def P3 assms(4) out_diff1) + qed + } + thus ?thesis + using InAngle_def P4 assms(2) assms(3) assms(4) out_distinct by auto +qed + +lemma inangle_distincts: + assumes "P InAngle A B C" + shows "A \ B \ C \ B \ P \ B" + using InAngle_def assms by auto + +lemma segment_construction_0: + shows "\ B'. Cong A' B' A B" + using segment_construction by blast + +lemma angle_construction_3: + assumes "A \ B" and + "C \ B" and + "A' \ B'" + shows "\ C'. A B C CongA A' B' C'" + by (metis angle_construction_2 assms(1) assms(2) assms(3) not_col_exists) + +lemma l11_28: + assumes "A B C Cong3 A' B' C'" and + "Col A C D" + shows "\ D'. 
(Cong A D A' D' \ Cong B D B' D' \ Cong C D C' D')" +proof cases + assume P1: "A = C" + have "\ D'. (Cong A D A' D' \ Cong B D B' D' \ Cong C D C' D')" + proof cases + assume "A = B" + thus ?thesis + by (metis P1 assms(1) cong3_diff cong3_symmetry cong_3_swap_2 not_cong_3421 segment_construction_0) + next + assume "A \ B" + have "\ D'. (Cong A D A' D' \ Cong B D B' D' \ Cong C D C' D')" + proof cases + assume "A = D" + thus ?thesis + using Cong3_def P1 assms(1) cong_trivial_identity by blast + next + assume "A \ D" + have "\ D'. (Cong A D A' D' \ Cong B D B' D' \ Cong C D C' D')" + proof cases + assume "B = D" + thus ?thesis + using Cong3_def assms(1) cong_3_swap_2 cong_trivial_identity by blast + next + assume Q1: "B \ D" + obtain D'' where Q2: "B A D CongA B' A' D''" + by (metis \A \ B\ \A \ D\ angle_construction_3 assms(1) cong3_diff) + obtain D' where Q3: "A' Out D'' D' \ Cong A' D' A D" + by (metis Q2 \A \ D\ conga_diff56 segment_construction_3) + have Q5: "Cong A D A' D'" + using Q3 not_cong_3412 by blast + have "B A D CongA B' A' D'" + using Q2 Q3 \A \ B\ \A \ D\ conga_diff45 l11_10 l6_6 out_trivial by auto + then have "Cong B D B' D'" + using Cong3_def Cong_perm Q5 assms(1) cong2_conga_cong by blast + thus ?thesis + using Cong3_def P1 Q5 assms(1) cong_reverse_identity by blast + qed + thus ?thesis by simp + qed + thus ?thesis by simp + qed + thus ?thesis by simp +next + assume Z1: "A \ C" + have "\ D'. (Cong A D A' D' \ Cong B D B' D' \ Cong C D C' D')" + proof cases + assume "A = D" + thus ?thesis + using Cong3_def Cong_perm assms(1) cong_trivial_identity by blast + next + assume "A \ D" + { + assume "Bet A C D" + obtain D' where W1: "Bet A' C' D' \ Cong C' D' C D" + using segment_construction by blast + have W2: "Cong A D A' D'" + by (meson Cong3_def W1 \Bet A C D\ assms(1) cong_symmetry l2_11_b) + have W3: "Cong B D B' D'" + proof - + have X1: "Cong C D C' D'" + using W1 not_cong_3412 by blast + have "Cong C B C' B'" + using Cong3_def assms(1) cong_commutativity by presburger + then have W4: "A C D B OFSC A' C' D' B'" + using Cong3_def OFSC_def W1 X1 \Bet A C D\ assms(1) by blast + have "Cong D B D' B'" + using W4 \A \ C\ five_segment_with_def by blast + thus ?thesis + using Z1 not_cong_2143 by blast + qed + have "Cong C D C' D'" + by (simp add: W1 cong_symmetry) + then have "\ D'. (Cong A D A' D' \ Cong B D B' D' \ Cong C D C' D')" + using W2 W3 by blast + } + { + assume W3B: "Bet C D A" + then obtain D' where W4A: "Bet A' D' C' \ A D C Cong3 A' D' C'" + using Bet_perm Cong3_def assms(1) l4_5 by blast + have W5: "Cong A D A' D'" + using Cong3_def W4A by blast + have "A D C B IFSC A' D' C' B'" + by (meson Bet_perm Cong3_def Cong_perm IFSC_def W4A W3B assms(1)) + then have "Cong D B D' B'" + using l4_2 by blast + then have W6: "Cong B D B' D'" + using Cong_perm by blast + then have "Cong C D C' D'" + using Cong3_def W4A not_cong_2143 by blast + then have "\ D'. 
(Cong A D A' D' \ Cong B D B' D' \ Cong C D C' D')" + using W5 W6 by blast + } + { + assume W7: "Bet D A C" + obtain D' where W7A: "Bet C' A' D' \ Cong A' D' A D" + using segment_construction by blast + then have W8: "Cong A D A' D'" + using Cong_cases by blast + have "C A D B OFSC C' A' D' B'" + by (meson Bet_perm Cong3_def Cong_perm OFSC_def W7 W7A assms(1)) + then have "Cong D B D' B'" + using Z1 five_segment_with_def by auto + then have w9: "Cong B D B' D'" + using Cong_perm by blast + have "Cong C D C' D'" + proof - + have L1: "Bet C A D" + using Bet_perm W7 by blast + have L2: "Bet C' A' D'" + using Bet_perm W7 + using W7A by blast + have "Cong C A C' A'" using assms(1) + using Cong3_def assms(1) not_cong_2143 by blast + thus ?thesis using l2_11 + using L1 L2 W8 l2_11 by blast + qed + then have "\ D'. (Cong A D A' D' \ Cong B D B' D' \ Cong C D C' D')" + using W8 w9 by blast + } + thus ?thesis + using Bet_cases \Bet A C D \ \D'. Cong A D A' D' \ Cong B D B' D' \ Cong C D C' D'\ \Bet C D A \ \D'. Cong A D A' D' \ Cong B D B' D' \ Cong C D C' D'\ assms(2) third_point by blast + qed + thus ?thesis + by blast +qed + +lemma bet_conga__bet: + assumes "Bet A B C" and + "A B C CongA A' B' C'" + shows "Bet A' B' C'" +proof - + obtain A0 C0 A1 C1 where P1:" + Bet B A A0 \Cong A A0 B' A' \ + Bet B C C0 \Cong C C0 B' C' \ + Bet B' A' A1 \Cong A' A1 B A \ + Bet B' C' C1 \Cong C' C1 B C \ + Cong A0 C0 A1 C1" using CongA_def assms(2) + by auto + have "Bet C B A0" using P1 outer_transitivity_between + by (metis assms(1) assms(2) between_symmetry conga_diff1) + then have "Bet A0 B C" + using Bet_cases by blast + then have P2: "Bet A0 B C0" + using P1 assms(2) conga_diff2 outer_transitivity_between by blast + have P3: "A0 B C0 Cong3 A1 B' C1" + proof - + have Q1: "Cong A0 B A1 B'" + by (meson Bet_cases P1 l2_11_b not_cong_1243 not_cong_4312) + have Q3: "Cong B C0 B' C1" + using P1 between_symmetry cong_3421 l2_11_b not_cong_1243 by blast + thus ?thesis + by (simp add: Cong3_def Q1 P1) + qed + then have P4: "Bet A1 B' C1" using P2 l4_6 by blast + then have "Bet A' B' C1" + using P1 Bet_cases between_exchange3 by blast + thus ?thesis using between_inner_transitivity P1 by blast +qed + +lemma in_angle_one_side: + assumes "\ Col A B C" and + "\ Col B A P" and + "P InAngle A B C" + shows "A B OS P C" +proof - + obtain X where P1: "Bet A X C \ (X = B \ B Out X P)" + using InAngle_def assms(3) by auto + { + assume "X = B" + then have "A B OS P C" + using P1 assms(1) bet_col by blast + } + { + assume P2: "B Out X P" + obtain C' where P2A: "Bet C A C' \ Cong A C' C A" + using segment_construction by blast + have "A B TS X C'" + proof - + have Q1: "\ Col X A B" + by (metis Col_def P1 assms(1) assms(2) col_transitivity_2 out_col) + have Q2 :"\ Col C' A B" + by (metis Col_def Cong_perm P2A assms(1) cong_diff l6_16_1) + have "\ T. 
Col T A B \ Bet X T C'" + using Bet_cases P1 P2A between_exchange3 col_trivial_1 by blast + thus ?thesis + by (simp add: Q1 Q2 TS_def) + qed + then have P3: "A B TS P C'" + using P2 col_trivial_3 l9_5 by blast + then have "A B TS C C'" + by (smt P1 P2 bet_out bet_ts__os between_trivial col123__nos col_trivial_3 invert_two_sides l6_6 l9_2 l9_5) + then have "A B OS P C" + using OS_def P3 by blast + } + thus ?thesis + using P1 \X = B \ A B OS P C\ by blast +qed + +lemma inangle_one_side: + assumes "\ Col A B C" and + "\ Col A B P" and + "\ Col A B Q" and + "P InAngle A B C" and + "Q InAngle A B C" + shows "A B OS P Q" + by (meson assms(1) assms(2) assms(3) assms(4) assms(5) in_angle_one_side not_col_permutation_4 one_side_symmetry one_side_transitivity) + +lemma inangle_one_side2: + assumes "\ Col A B C" and + "\ Col A B P" and + "\ Col A B Q" and + "\ Col C B P" and + "\ Col C B Q" and + "P InAngle A B C" and + "Q InAngle A B C" + shows "A B OS P Q \ C B OS P Q" + by (meson assms(1) assms(2) assms(3) assms(4) assms(5) assms(6) assms(7) inangle_one_side l11_24 not_col_permutation_3) + +lemma col_conga_col: + assumes "Col A B C" and + "A B C CongA D E F" + shows "Col D E F" +proof - + { + assume "Bet A B C" + then have "Col D E F" + using Col_def assms(2) bet_conga__bet by blast + } + { + assume "Bet B C A" + then have "Col D E F" + by (meson Col_perm Tarski_neutral_dimensionless.l11_21_a Tarski_neutral_dimensionless_axioms \Bet A B C \ Col D E F\ assms(1) assms(2) or_bet_out out_col) + } + { + assume "Bet C A B" + then have "Col D E F" + by (meson Col_perm Tarski_neutral_dimensionless.l11_21_a Tarski_neutral_dimensionless_axioms \Bet A B C \ Col D E F\ assms(1) assms(2) or_bet_out out_col) + } + thus ?thesis + using Col_def \Bet A B C \ Col D E F\ \Bet B C A \ Col D E F\ assms(1) by blast +qed + +lemma ncol_conga_ncol: + assumes "\ Col A B C" and + "A B C CongA D E F" + shows "\ Col D E F" + using assms(1) assms(2) col_conga_col conga_sym by blast + +lemma angle_construction_4: + assumes "A \ B" and + "C \ B" and + "A' \ B'" + shows "\C'. (A B C CongA A' B' C' \ Coplanar A' B' C' P)" +proof cases + assume "Col A' B' P" + thus ?thesis + using angle_construction_3 assms(1) assms(2) assms(3) ncop__ncols by blast +next + assume "\ Col A' B' P" + { + assume "Col A B C" + then have "\C'. (A B C CongA A' B' C' \ Coplanar A' B' C' P)" + by (meson angle_construction_3 assms(1) assms(2) assms(3) col__coplanar col_conga_col) + } + { + assume "\ Col A B C" + then obtain C' where "A B C CongA A' B' C' \ A' B' OS C' P" + using \\ Col A' B' P\ angle_construction_1 by blast + then have "\C'. (A B C CongA A' B' C' \ Coplanar A' B' C' P)" + using os__coplanar by blast + } + thus ?thesis + using \Col A B C \ \C'. A B C CongA A' B' C' \ Coplanar A' B' C' P\ by blast +qed + +lemma lea_distincts: + assumes "A B C LeA D E F" + shows "A\B \ C\B \ D\E \ F\E" + by (metis (no_types) LeA_def Tarski_neutral_dimensionless.conga_diff1 Tarski_neutral_dimensionless.conga_diff2 Tarski_neutral_dimensionless_axioms assms inangle_distincts) + +lemma l11_29_a: + assumes "A B C LeA D E F" + shows "\ Q. 
(C InAngle A B Q \ A B Q CongA D E F)" +proof - + obtain P where P1: "P InAngle D E F \ A B C CongA D E P" + using LeA_def assms by blast + then have P2: "E \ D \ B \ A \ E \ F \ E \ P \ B \ C" + using conga_diff1 conga_diff2 inangle_distincts by blast + then have P3: "A \ B \ C \ B" by blast + { + assume Q1: "Bet A B C" + then have Q2: "Bet D E P" + by (meson P1 Tarski_neutral_dimensionless.bet_conga__bet Tarski_neutral_dimensionless_axioms) + have Q3: "C InAngle A B C" + by (simp add: P3 inangle3123) + obtain X where Q4: "Bet D X F \ (X = E \ E Out X P)" + using InAngle_def P1 by auto + have "A B C CongA D E F" + proof - + { + assume R1: "X = E" + have R2: "Bet E F P \ Bet E P F" + proof - + have R3: "D \ E" using P2 by blast + have "Bet D E F" + using Col_def Col_perm P1 Q2 col_in_angle_out not_bet_and_out by blast + have "Bet D E P" using Q2 by blast + thus ?thesis using l5_2 + using R3 \Bet D E F\ by blast + qed + then have "A B C CongA D E F" + by (smt P1 P2 bet_out l11_10 l6_6 out_trivial) + } + { + assume S1: "E Out X P" + + have S2: "E Out P F" + proof - + { + assume "Bet E X P" + then have "E Out P F" + proof - + have "Bet E X F" + by (meson Bet_perm Q2 Q4 \Bet E X P\ between_exchange3) + thus ?thesis + by (metis Bet_perm S1 bet2__out between_equality_2 between_trivial2 out2_bet_out out_diff1) + qed + } + { + assume "Bet E P X" + then have "E Out P F" + by (smt Bet_perm Q2 Q4 S1 bet_out_1 between_exchange3 not_bet_and_out outer_transitivity_between2) + } + thus ?thesis + using Out_def S1 \Bet E X P \ E Out P F\ by blast + qed + + then have "A B C CongA D E F" + by (metis Bet_perm P2 Q1 Q2 bet_out__bet conga_line) + } + thus ?thesis + using Q4 \X = E \ A B C CongA D E F\ by blast + qed + then have "\ Q. (C InAngle A B Q \ A B Q CongA D E F)" + using conga_diff1 conga_diff2 inangle3123 by blast + } + { + assume "B Out A C" + obtain Q where "D E F CongA A B Q" + by (metis P2 angle_construction_3) + + then have "\ Q. (C InAngle A B Q \ A B Q CongA D E F)" + by (metis Tarski_neutral_dimensionless.conga_comm Tarski_neutral_dimensionless_axioms \B Out A C\ conga_diff1 conga_sym out321__inangle) + } + { + assume ZZ: "\ Col A B C" + have Z1: "D \ E" + using P2 by blast + have Z2: "F \ E" + using P2 by blast + have Z3: "Bet D E F \ E Out D F \ \ Col D E F" + using not_bet_out by blast + { + assume "Bet D E F" + obtain Q where Z4: "Bet A B Q \ Cong B Q E F" + using segment_construction by blast + + then have "\ Q. (C InAngle A B Q \ A B Q CongA D E F)" + by (metis InAngle_def P3 Z1 Z2 \Bet D E F\ conga_line point_construction_different) + } + { + assume "E Out D F" + then have Z5: "E Out D P" + using P1 in_angle_out by blast + have "D E P CongA A B C" + by (simp add: P1 conga_sym) + then have Z6: "B Out A C" using l11_21_a Z5 + by blast + + then have "\ Q. (C InAngle A B Q \ A B Q CongA D E F)" + using \B Out A C \ \Q. 
C InAngle A B Q \ A B Q CongA D E F\ by blast + } + { + assume W1: "\ Col D E F" + obtain Q where W2: "D E F CongA A B Q \ A B OS Q C" + using W1 ZZ angle_construction_1 by moura + obtain DD where W3: "E Out D DD \ Cong E DD B A" + using P3 Z1 segment_construction_3 by force + obtain FF where W4: "E Out F FF \ Cong E FF B Q" + by (metis P2 W2 conga_diff56 segment_construction_3) + then have W5: "P InAngle DD E FF" + by (smt Out_cases P1 P2 W3 l11_25 out_trivial) + obtain X where W6: "Bet DD X FF \ (X = E \ E Out X P)" + using InAngle_def W5 by presburger + { + assume W7: "X = E" + have W8: "Bet D E F" + proof - + have W10: "E Out DD D" + by (simp add: W3 l6_6) + have "E Out FF F" + by (simp add: W4 l6_6) + thus ?thesis using W6 W7 W10 bet_out_out_bet by blast + qed + then have "\ Q. (C InAngle A B Q \ A B Q CongA D E F)" + using \Bet D E F \ \Q. C InAngle A B Q \ A B Q CongA D E F\ by blast + } + { + assume V1: "E Out X P" + have "B \ C \ E \ X" + using P3 V1 out_diff1 by blast + then obtain CC where V2: "B Out C CC \ Cong B CC E X" + using segment_construction_3 by blast + then have V3: "A B CC CongA DD E X" + by (smt P1 P2 V1 W3 l11_10 l6_6 out_trivial) + have V4: "Cong A CC DD X" + proof - + have "Cong A B DD E" + using W3 not_cong_4321 by blast + thus ?thesis + using V2 V3 cong2_conga_cong by blast + qed + + have V5: "A B Q CongA DD E FF" + proof - + have U1: "D E F CongA A B Q" + by (simp add: W2) + then have U1A: "A B Q CongA D E F" + by (simp add: conga_sym) + have U2: "B Out A A" + by (simp add: P3 out_trivial) + have U3: "B Out Q Q" + using W2 conga_diff56 out_trivial by blast + have U4: "E Out DD D" + using W3 l6_6 by blast + have "E Out FF F" + by (simp add: W4 l6_6) + + thus ?thesis using l11_10 + using U1A U2 U3 U4 by blast + qed + then have V6: "Cong A Q DD FF" + using Cong_perm W3 W4 cong2_conga_cong by blast + have "CC B Q CongA X E FF" + proof - + have U1: "B A OS CC Q" + by (metis (no_types) V2 W2 col124__nos invert_one_side one_side_symmetry one_side_transitivity out_one_side) + have U2: "E DD OS X FF" + proof - + have "\ Col DD E FF" + by (metis Col_perm OS_def TS_def U1 V5 ncol_conga_ncol) + then have "\ Col E DD X" + by (metis Col_def V2 V4 W6 ZZ cong_identity l6_16_1 os_distincts out_one_side) + then have "DD E OS X FF" + by (metis Col_perm W6 bet_out not_col_distincts one_side_reflexivity out_out_one_side) + thus ?thesis + by (simp add: invert_one_side) + qed + have "CC B A CongA X E DD" + by (simp add: V3 conga_comm) + thus ?thesis + using U1 U2 V5 l11_22b by blast + qed + then have V8: "Cong CC Q X FF" + using V2 W4 cong2_conga_cong cong_commutativity not_cong_3412 by blast + have V9: "CC InAngle A B Q" + proof - + have T2: "Q \ B" + using W2 conga_diff56 by blast + have T3: "CC \ B" + using V2 out_distinct by blast + have "Bet A CC Q" + proof - + have T4: "DD X FF Cong3 A CC Q" + using Cong3_def V4 V6 V8 not_cong_3412 by blast + thus ?thesis + using W6 l4_6 by blast + qed + then have "\ X0. Bet A X0 Q \ (X0 = B \ B Out X0 CC)" + using out_trivial by blast + thus ?thesis + by (simp add: InAngle_def P3 T2 T3) + qed + then have "C InAngle A B Q" + using V2 inangle_distincts l11_25 out_trivial by blast + then have "\ Q. (C InAngle A B Q \ A B Q CongA D E F)" + using W2 conga_sym by blast + } + then have "\ Q. (C InAngle A B Q \ A B Q CongA D E F)" + using W6 \X = E \ \Q. C InAngle A B Q \ A B Q CongA D E F\ by blast + } + then have "\ Q. (C InAngle A B Q \ A B Q CongA D E F)" + using Z3 \E Out D F \ \Q. C InAngle A B Q \ A B Q CongA D E F\ \Bet D E F \ \Q. 
C InAngle A B Q \ A B Q CongA D E F\ by blast + } + thus ?thesis + using \B Out A C \ \Q. C InAngle A B Q \ A B Q CongA D E F\ \Bet A B C \ \Q. C InAngle A B Q \ A B Q CongA D E F\ not_bet_out by blast +qed + +lemma in_angle_line: + assumes "P \ B" and + "A \ B" and + "C \ B" and + "Bet A B C" + shows "P InAngle A B C" + using InAngle_def assms(1) assms(2) assms(3) assms(4) by auto + +lemma l11_29_b: + assumes "\ Q. (C InAngle A B Q \ A B Q CongA D E F)" + shows "A B C LeA D E F" +proof - + obtain Q where P1: "C InAngle A B Q \ A B Q CongA D E F" + using assms by blast + obtain X where P2: "Bet A X Q \ (X = B \ B Out X C)" + using InAngle_def P1 by auto + { + assume P2A: "X = B" + obtain P where P3: "A B C CongA D E P" + using angle_construction_3 assms conga_diff45 inangle_distincts by fastforce + have "P InAngle D E F" + proof - + have O1: "Bet D E F" + by (metis (no_types) P1 P2 Tarski_neutral_dimensionless.bet_conga__bet Tarski_neutral_dimensionless_axioms P2A) + have O2: "P \ E" + using P3 conga_diff56 by auto + have O3: "D \ E" + using P3 conga_diff45 by auto + have "F \ E" + using P1 conga_diff56 by blast + thus ?thesis using in_angle_line + by (simp add: O1 O2 O3) + qed + then have "A B C LeA D E F" + using LeA_def P3 by blast + } + { + assume G1: "B Out X C" + obtain DD where G2: "E Out D DD \ Cong E DD B A" + by (metis assms conga_diff1 conga_diff45 segment_construction_3) + have G3: "D \ E \ DD \ E" + using G2 out_diff1 out_diff2 by blast + obtain FF where G3G: "E Out F FF \ Cong E FF B Q" + by (metis P1 conga_diff56 inangle_distincts segment_construction_3) + then have G3A: "F \ E" + using out_diff1 by blast + have G3B: "FF \ E" + using G3G out_distinct by blast + have G4: "Bet A B C \ B Out A C \ \ Col A B C" + using not_bet_out by blast + { + assume G5: "Bet A B C" + have G6: "F InAngle D E F" + by (simp add: G3 G3A inangle3123) + have "A B C CongA D E F" + by (smt Bet_perm G3 G3A G5 Out_def P1 P2 bet_conga__bet between_exchange3 conga_line inangle_distincts outer_transitivity_between2) + then have "A B C LeA D E F" + using G6 LeA_def by blast + } + { + assume G8: "B Out A C" + have G9: "D InAngle D E F" + by (simp add: G3 G3A inangle1123) + have "A B C CongA D E D" + by (simp add: G3 G8 l11_21_b out_trivial) + then have "A B C LeA D E F" using G9 LeA_def by blast + } + { + assume R1: "\ Col A B C" + have R2: "Bet A B Q \ B Out A Q \ \ Col A B Q" + using not_bet_out by blast + { + assume R3: "Bet A B Q" + obtain P where R4: "A B C CongA D E P" + by (metis G3 LeA_def \Bet A B C \ A B C LeA D E F\ angle_construction_3 not_bet_distincts) + have R5: "P InAngle D E F" + proof - + have R6: "P \ E" + using R4 conga_diff56 by auto + have "Bet D E F" + by (metis (no_types) P1 R3 Tarski_neutral_dimensionless.bet_conga__bet Tarski_neutral_dimensionless_axioms) + thus ?thesis + by (simp add: R6 G3 G3A in_angle_line) + qed + then have "A B C LeA D E F" using R4 R5 LeA_def by blast + } + { + assume S1: "B Out A Q" + have S2: "B Out A C" + using G1 P2 S1 l6_7 out_bet_out_1 by blast + have S3: "Col A B C" + by (simp add: Col_perm S2 out_col) + then have "A B C LeA D E F" + using R1 by blast + } + { + assume S3B: "\ Col A B Q" + obtain P where S4: "A B C CongA D E P \ D E OS P F" + by (meson P1 R1 Tarski_neutral_dimensionless.ncol_conga_ncol Tarski_neutral_dimensionless_axioms S3B angle_construction_1) + obtain PP where S4A: "E Out P PP \ Cong E PP B X" + by (metis G1 S4 os_distincts out_diff1 segment_construction_3) + have S5: "P InAngle D E F" + proof - + have "PP InAngle DD E FF" + proof - 
+ have Z3: "PP \ E" + using S4A l6_3_1 by blast + have Z4: "Bet DD PP FF" + proof - + have L1: "C B Q CongA P E F" + proof - + have K1: "B A OS C Q" + using Col_perm P1 R1 S3B in_angle_one_side invert_one_side by blast + have K2: "E D OS P F" + by (simp add: S4 invert_one_side) + have "C B A CongA P E D" + by (simp add: S4 conga_comm) + thus ?thesis + using K1 K2 P1 l11_22b by auto + qed + have L2: "Cong DD FF A Q" + proof - + have "DD E FF CongA A B Q" + proof - + have L3: "A B Q CongA D E F" + by (simp add: P1) + then have L3A: "D E F CongA A B Q" + using conga_sym by blast + have L4: "E Out DD D" + using G2 Out_cases by auto + have L5: "E Out FF F" + using G3G Out_cases by blast + have L6: "B Out A A" + using S3B not_col_distincts out_trivial by auto + have "B Out Q Q" + by (metis S3B not_col_distincts out_trivial) + thus ?thesis using L3A L4 L5 L6 l11_10 + by blast + qed + have L2B: "Cong DD E A B" + using Cong_perm G2 by blast + have "Cong E FF B Q" + by (simp add: G3G) + thus ?thesis + using L2B \DD E FF CongA A B Q\ cong2_conga_cong by auto + qed + have L8: "Cong A X DD PP" + proof - + have L9: "A B X CongA DD E PP" + proof - + have L9B: "B Out A A" + using S3B not_col_distincts out_trivial by blast + have L9D: "E Out D D " + using G3 out_trivial by auto + have "E Out PP P" + using Out_cases S4A by blast + thus ?thesis using l11_10 S4 L9B G1 L9D + using G2 Out_cases by blast + qed + have L10: "Cong A B DD E" + using G2 not_cong_4321 by blast + have "Cong B X E PP" + using Cong_perm S4A by blast + thus ?thesis + using L10 L9 cong2_conga_cong by blast + qed + have "A X Q Cong3 DD PP FF" + proof - + have L12B: "Cong A Q DD FF" + using L2 not_cong_3412 by blast + have "Cong X Q PP FF" + proof - + have L13A: "X B Q CongA PP E FF" + proof - + have L13AC: "B Out Q Q" + by (metis S3B col_trivial_2 out_trivial) + have L13AD: "E Out PP P" + by (simp add: S4A l6_6) + have "E Out FF F" + by (simp add: G3G l6_6) + thus ?thesis + using L1 G1 L13AC L13AD l11_10 by blast + qed + have L13B: "Cong X B PP E" + using S4A not_cong_4321 by blast + have "Cong B Q E FF" + using G3G not_cong_3412 by blast + thus ?thesis + using L13A L13B cong2_conga_cong by auto + qed + thus ?thesis + by (simp add: Cong3_def L12B L8) + qed + thus ?thesis using P2 l4_6 by blast + qed + have "PP = E \ E Out PP PP" + using out_trivial by auto + thus ?thesis + using InAngle_def G3 G3B Z3 Z4 by auto + qed + thus ?thesis + using G2 G3G S4A l11_25 by blast + qed + then have "A B C LeA D E F" + using S4 LeA_def by blast + } + then have "A B C LeA D E F" + using R2 \B Out A Q \ A B C LeA D E F\ \Bet A B Q \ A B C LeA D E F\ by blast + } + then have "A B C LeA D E F" + using G4 \B Out A C \ A B C LeA D E F\ \Bet A B C \ A B C LeA D E F\ by blast + } + thus ?thesis + using P2 \X = B \ A B C LeA D E F\ by blast +qed + +lemma bet_in_angle_bet: + assumes "Bet A B P" and + "P InAngle A B C" + shows "Bet A B C" + by (metis (no_types) Col_def Col_perm assms(1) assms(2) col_in_angle_out not_bet_and_out) + +lemma lea_line: + assumes "Bet A B P" and + "A B P LeA A B C" + shows "Bet A B C" + by (metis Tarski_neutral_dimensionless.bet_conga__bet Tarski_neutral_dimensionless.l11_29_a Tarski_neutral_dimensionless_axioms assms(1) assms(2) bet_in_angle_bet) + +lemma eq_conga_out: + assumes "A B A CongA D E F" + shows "E Out D F" + by (metis CongA_def assms l11_21_a out_trivial) + +lemma out_conga_out: + assumes "B Out A C" and + "A B C CongA D E F" + shows "E Out D F" + using assms(1) assms(2) l11_21_a by blast + +lemma conga_ex_cong3: + assumes "A 
B C CongA A' B' C'" + shows "\ AA CC. ((B Out A AA \ B Out C CC) \ AA B CC Cong3 A' B' C')" + using out_diff2 by blast + +lemma conga_preserves_in_angle: + assumes "A B C CongA A' B' C'" and + "A B I CongA A' B' I'" and + "I InAngle A B C" and "A' B' OS I' C'" + shows "I' InAngle A' B' C'" +proof - + have P1: "A \ B" + using assms(1) conga_diff1 by auto + have P2: "B \ C" + using assms(1) conga_diff2 by blast + have P3: "A' \ B'" + using assms(1) conga_diff45 by auto + have P4: "B' \ C'" + using assms(1) conga_diff56 by blast + have P5: "I \ B" + using assms(2) conga_diff2 by auto + have P6: "I' \ B'" + using assms(2) conga_diff56 by blast + have P7: "Bet A B C \ B Out A C \ \ Col A B C" + using l6_4_2 by blast + { + assume "Bet A B C" + have Q1: "Bet A' B' C'" + using \Bet A B C\ assms(1) assms(4) bet_col col124__nos col_conga_col by blast + then have "I' InAngle A' B' C'" + using assms(4) bet_col col124__nos by auto + } + { + assume "B Out A C" + then have "I' InAngle A' B' C'" + by (metis P4 assms(2) assms(3) in_angle_out l11_21_a out321__inangle) + } + { + assume Z1: "\ Col A B C" + have Q2: "Bet A B I \ B Out A I \ \ Col A B I" + by (simp add: or_bet_out) + { + assume "Bet A B I" + then have "I' InAngle A' B' C'" + using \Bet A B C \ I' InAngle A' B' C'\ assms(3) bet_in_angle_bet by blast + } + { + assume "B Out A I" + then have "I' InAngle A' B' C'" + using P4 assms(2) l11_21_a out321__inangle by auto + } + { + assume "\ Col A B I" + obtain AA' where Q3: "B' Out A' AA' \ Cong B' AA' B A" + using P1 P3 segment_construction_3 by presburger + obtain CC' where Q4: "B' Out C' CC' \ Cong B' CC' B C" + using P2 P4 segment_construction_3 by presburger + obtain J where Q5: "Bet A J C \ (J = B \ B Out J I)" + using InAngle_def assms(3) by auto + have Q6: "B \ J" + using Q5 Z1 bet_col by auto + have Q7: "\ Col A B J" + using Q5 Q6 \\ Col A B I\ col_permutation_2 col_transitivity_1 out_col by blast + have "\ Col A' B' I'" + by (metis assms(4) col123__nos) + then have "\ C'. (A B J CongA A' B' C' \ A' B' OS C' I')" + using Q7 angle_construction_1 by blast + then obtain J' where Q8: "A B J CongA A' B' J' \ A' B' OS J' I'" by blast + have Q9: "B' \ J'" + using Q8 conga_diff56 by blast + obtain JJ' where Q10: "B' Out J' JJ' \ Cong B' JJ' B J" + using segment_construction_3 Q6 Q9 by blast + have Q11: "\ Col A' B' J'" + using Q8 col123__nos by blast + have Q12: "A' \ JJ'" + by (metis Col_perm Q10 Q11 out_col) + have Q13: "B' \ JJ'" + using Q10 out_distinct by blast + have Q14: "\ Col A' B' JJ'" + using Col_perm Q10 Q11 Q13 l6_16_1 out_col by blast + have Q15: "A B C CongA AA' B' CC'" + proof - + have T2: "C \ B" using P2 by auto + have T3: "AA' \ B'" + using Out_def Q3 by blast + have T4: "CC' \ B'" + using Q4 out_distinct by blast + have T5: "\ A' C' D' F'. 
(B Out A' A \ B Out C' C \ B' Out D' AA' \ +B' Out F' CC' \Cong B A' B' D' \ Cong B C' B' F' \ Cong A' C' D' F')" + by (smt Q3 Q4 Tarski_neutral_dimensionless.l11_4_1 Tarski_neutral_dimensionless_axioms assms(1) l6_6 l6_7) + thus ?thesis using P1 T2 T3 T4 l11_4_2 by blast + qed + have Q16: "A' B' J' CongA A' B' JJ'" + proof - + have P9: "B' Out A' A'" + by (simp add: P3 out_trivial) + have "B' Out JJ' J'" + using Out_cases Q10 by auto + thus ?thesis + using l11_10 + by (simp add: P9 out2__conga) + qed + have Q17: "B' Out I' JJ' \ A' B' TS I' JJ'" + proof - + have "Coplanar A' I' B' J'" + by (metis (full_types) Q8 ncoplanar_perm_3 os__coplanar) + then have "Coplanar A' I' B' JJ'" + using Q10 Q9 col_cop__cop out_col by blast + then have R1: "Coplanar A' B' I' JJ'" using coplanar_perm_2 + by blast + have "A' B' I' CongA A' B' JJ'" + proof - + have R2: "A' B' I' CongA A B I" + by (simp add: assms(2) conga_sym) + have "A B I CongA A' B' JJ'" + proof - + have f1: "\p pa pb. \ p Out pa pb \ \ p Out pb pa \ p Out pa pb" + using Out_cases by blast + then have f2: "B' Out JJ' J'" + using Q10 by blast + have "B Out J I" + by (metis Q5 Q6) + thus ?thesis + using f2 f1 by (meson P3 Q8 Tarski_neutral_dimensionless.l11_10 Tarski_neutral_dimensionless_axioms \\ Col A B I\ col_one_side_out col_trivial_2 one_side_reflexivity out_trivial) + qed + thus ?thesis + using R2 conga_trans by blast + qed + thus ?thesis using R1 conga_cop__or_out_ts by blast + qed + { + assume Z2: "B' Out I' JJ'" + have Z3: "J B C CongA J' B' C'" + proof - + have R1: "B A OS J C" + by (metis Q5 Q7 Z1 bet_out invert_one_side not_col_distincts out_one_side) + have R2: "B' A' OS J' C'" + by (meson Q10 Z2 assms(4) invert_one_side l6_6 one_side_symmetry out_out_one_side) + have "J B A CongA J' B' A'" + using Q8 conga_comm by blast + thus ?thesis using assms(1) R1 R2 l11_22b by blast + qed + then have "I' InAngle A' B' C'" + proof - + have "A J C Cong3 AA' JJ' CC'" + proof - + have R8: "Cong A J AA' JJ'" + proof - + have R8A: "A B J CongA AA' B' JJ'" + proof - + have R8AB: "B Out A A" + by (simp add: P1 out_trivial) + have R8AC: "B Out J I" + using Q5 Q6 by auto + have R8AD: "B' Out AA' A'" + using Out_cases Q3 by auto + have "B' Out JJ' I'" + using Out_cases Z2 by blast + thus ?thesis + using assms(2) R8AB R8AC R8AD l11_10 by blast + qed + have R8B: "Cong A B AA' B'" + using Q3 not_cong_4321 by blast + have R8C: "Cong B J B' JJ'" + using Q10 not_cong_3412 by blast + thus ?thesis + using R8A R8B cong2_conga_cong by blast + qed + have LR8A: "Cong A C AA' CC'" + using Q15 Q3 Q4 cong2_conga_cong cong_4321 cong_symmetry by blast + have "Cong J C JJ' CC'" + proof - + have K1:"B' Out JJ' J'" + using Out_cases Q10 by auto + have "B' Out CC' C'" + using Out_cases Q4 by auto + then have "J' B' C' CongA JJ' B' CC'" using K1 + by (simp add: out2__conga) + then have LR9A: "J B C CongA JJ' B' CC'" + using Z3 conga_trans by blast have LR9B: "Cong J B JJ' B'" + using Q10 not_cong_4321 by blast + have "Cong B C B' CC'" + using Q4 not_cong_3412 by blast + thus ?thesis + using LR9A LR9B cong2_conga_cong by blast + qed + thus ?thesis using R8 LR8A + by (simp add: Cong3_def) + qed + then have R10: "Bet AA' JJ' CC'" using Q5 l4_6 by blast + have "JJ' InAngle AA' B' CC'" + proof - + have R11: "AA' \ B'" + using Out_def Q3 by auto + have R12: "CC' \ B'" + using Out_def Q4 by blast + have "Bet AA' JJ' CC' \ (JJ' = B' \ B' Out JJ' JJ')" + using R10 out_trivial by auto + thus ?thesis + using InAngle_def Q13 R11 R12 by auto + qed + thus ?thesis + using Z2 Q3 Q4 
l11_25 by blast + qed + } + { + assume X1: "A' B' TS I' JJ'" + have "A' B' OS I' J'" + by (simp add: Q8 one_side_symmetry) + then have X2: "B' A' OS I' JJ'" + using Q10 invert_one_side out_out_one_side by blast + then have "I' InAngle A' B' C'" + using X1 invert_one_side l9_9 by blast + } + then have "I' InAngle A' B' C'" + using Q17 \B' Out I' JJ' \ I' InAngle A' B' C'\ by blast + } + then have "I' InAngle A' B' C'" + using Q2 \B Out A I \ I' InAngle A' B' C'\ \Bet A B I \ I' InAngle A' B' C'\ by blast + } + thus ?thesis + using P7 \B Out A C \ I' InAngle A' B' C'\ \Bet A B C \ I' InAngle A' B' C'\ by blast +qed + +lemma l11_30: + assumes "A B C LeA D E F" and + "A B C CongA A' B' C'" and + "D E F CongA D' E' F'" + shows "A' B' C' LeA D' E' F'" +proof - + obtain Q where P1: "C InAngle A B Q \ A B Q CongA D E F" + using assms(1) l11_29_a by blast + have P1A: "C InAngle A B Q" using P1 by simp + have P1B: "A B Q CongA D E F" using P1 by simp + have P2: "A \ B" + using P1A inangle_distincts by auto + have P3: "C \ B" + using P1A inangle_distincts by blast + have P4: "A' \ B'" + using CongA_def assms(2) by blast + have P5: "C' \ B'" + using CongA_def assms(2) by auto + have P6: "D \ E" + using CongA_def P1B by blast + have P7: "F \ E" + using CongA_def P1B by blast + have P8: "D' \ E'" + using CongA_def assms(3) by blast + have P9: "F' \ E'" + using CongA_def assms(3) by blast + have P10: "Bet A' B' C' \ B' Out A' C' \ \ Col A' B' C'" + using or_bet_out by blast + { + assume "Bet A' B' C'" + then have "\ Q'. (C' InAngle A' B' Q' \ A' B' Q' CongA D' E' F')" + by (metis P1 P4 P5 P8 P9 assms(2) assms(3) bet_conga__bet bet_in_angle_bet conga_line conga_sym inangle3123) + } + { + assume R1: "B' Out A' C'" + obtain Q' where R2: "D' E' F' CongA A' B' Q'" + using P4 P8 P9 angle_construction_3 by blast + then have "C' InAngle A' B' Q'" + using col_in_angle P1 R1 conga_diff56 out321__inangle by auto + then have "\ Q'. (C' InAngle A' B' Q' \ A' B' Q' CongA D' E' F')" + using R2 conga_sym by blast + } + { + assume R3: "\ Col A' B' C'" + have R3A: "Bet D' E' F' \ E' Out D' F' \ \ Col D' E' F'" + using or_bet_out by blast + { + assume "Bet D' E' F'" + have "\ Q'. (C' InAngle A' B' Q' \ A' B' Q' CongA D' E' F')" + by (metis P4 P5 P8 P9 \Bet D' E' F'\ conga_line in_angle_line point_construction_different) + } + { + assume R4A: "E' Out D' F'" + obtain Q' where R4: "D' E' F' CongA A' B' Q'" + using P4 P8 P9 angle_construction_3 by blast + then have R5: "B' Out A' Q'" using out_conga_out R4A by blast + have R6: "A B Q CongA D' E' F'" + using P1 assms(3) conga_trans by blast + then have R7: "B Out A Q" using out_conga_out R4A R4 + using conga_sym by blast + have R8: "B Out A C" + using P1A R7 in_angle_out by blast + then have R9: "B' Out A' C'" using out_conga_out assms(2) + by blast + have "\ Q'. (C' InAngle A' B' Q' \ A' B' Q' CongA D' E' F')" + by (simp add: R9 \B' Out A' C' \ \Q'. C' InAngle A' B' Q' \ A' B' Q' CongA D' E' F'\) + } + { + assume "\ Col D' E' F'" + obtain QQ where S1: "D' E' F' CongA A' B' QQ \ A' B' OS QQ C'" + using R3 \\ Col D' E' F'\ angle_construction_1 by blast + have S1A: "A B Q CongA A' B' QQ" using S1 + using P1 assms(3) conga_trans by blast + have "A' B' OS C' QQ" using S1 + by (simp add: S1 one_side_symmetry) + then have S2: "C' InAngle A' B' QQ" using conga_preserves_in_angle S1A + using P1A assms(2) by blast + have S3: "A' B' QQ CongA D' E' F'" + by (simp add: S1 conga_sym) + then have "\ Q'. 
(C' InAngle A' B' Q' \ A' B' Q' CongA D' E' F')" + using S2 by auto + } + then have "\ Q'. (C' InAngle A' B' Q' \ A' B' Q' CongA D' E' F')" + using R3A \E' Out D' F' \ \Q'. C' InAngle A' B' Q' \ A' B' Q' CongA D' E' F'\ \Bet D' E' F' \ \Q'. C' InAngle A' B' Q' \ A' B' Q' CongA D' E' F'\ by blast + } + thus ?thesis using l11_29_b + using P10 \B' Out A' C' \ \Q'. C' InAngle A' B' Q' \ A' B' Q' CongA D' E' F'\ \Bet A' B' C' \ \Q'. C' InAngle A' B' Q' \ A' B' Q' CongA D' E' F'\ by blast +qed + +lemma l11_31_1: + assumes "B Out A C" and + "D \ E" and + "F \ E" + shows "A B C LeA D E F" + by (metis (full_types) LeA_def assms(1) assms(2) assms(3) l11_21_b out321__inangle segment_construction_3) + +lemma l11_31_2: + assumes "A \ B" and + "C \ B" and + "D \ E" and + "F \ E" and + "Bet D E F" + shows "A B C LeA D E F" + by (metis LeA_def angle_construction_3 assms(1) assms(2) assms(3) assms(4) assms(5) conga_diff56 in_angle_line) + +lemma lea_refl: + assumes "A \ B" and + "C \ B" + shows "A B C LeA A B C" + by (meson assms(1) assms(2) conga_refl l11_29_b out341__inangle out_trivial) + +lemma conga__lea: + assumes "A B C CongA D E F" + shows "A B C LeA D E F" + by (metis Tarski_neutral_dimensionless.conga_diff1 Tarski_neutral_dimensionless.conga_diff2 Tarski_neutral_dimensionless.l11_30 Tarski_neutral_dimensionless_axioms assms conga_refl lea_refl) + +lemma conga__lea456123: + assumes "A B C CongA D E F" + shows "D E F LeA A B C" + by (simp add: Tarski_neutral_dimensionless.conga__lea Tarski_neutral_dimensionless_axioms assms conga_sym) + +lemma lea_left_comm: + assumes "A B C LeA D E F" + shows "C B A LeA D E F" + by (metis assms conga_pseudo_refl conga_refl l11_30 lea_distincts) + +lemma lea_right_comm: + assumes "A B C LeA D E F" + shows "A B C LeA F E D" + by (meson assms conga_right_comm l11_29_a l11_29_b) + +lemma lea_comm: + assumes"A B C LeA D E F" + shows "C B A LeA F E D" + using assms lea_left_comm lea_right_comm by blast + +lemma lta_left_comm: + assumes "A B C LtA D E F" + shows "C B A LtA D E F" + by (meson LtA_def Tarski_neutral_dimensionless.conga_left_comm Tarski_neutral_dimensionless.lea_left_comm Tarski_neutral_dimensionless_axioms assms) + +lemma lta_right_comm: + assumes "A B C LtA D E F" + shows "A B C LtA F E D" + by (meson Tarski_neutral_dimensionless.LtA_def Tarski_neutral_dimensionless.conga_comm Tarski_neutral_dimensionless.lea_comm Tarski_neutral_dimensionless.lta_left_comm Tarski_neutral_dimensionless_axioms assms) + +lemma lta_comm: + assumes "A B C LtA D E F" + shows "C B A LtA F E D" + using assms lta_left_comm lta_right_comm by blast + +lemma lea_out4__lea: + assumes "A B C LeA D E F" and + "B Out A A'" and + "B Out C C'" and + "E Out D D'" and + "E Out F F'" + shows "A' B C' LeA D' E F'" + using assms(1) assms(2) assms(3) assms(4) assms(5) l11_30 l6_6 out2__conga by auto + + +lemma lea121345: + assumes "A \ B" and + "C \ D" and + "D \ E" + shows "A B A LeA C D E" + using assms(1) assms(2) assms(3) l11_31_1 out_trivial by auto + +lemma inangle__lea: + assumes "P InAngle A B C" + shows "A B P LeA A B C" + by (metis Tarski_neutral_dimensionless.l11_29_b Tarski_neutral_dimensionless_axioms assms conga_refl inangle_distincts) + +lemma inangle__lea_1: + assumes "P InAngle A B C" + shows "P B C LeA A B C" + by (simp add: Tarski_neutral_dimensionless.inangle__lea Tarski_neutral_dimensionless.lea_comm Tarski_neutral_dimensionless_axioms assms l11_24) + +lemma inangle__lta: + assumes "\ Col P B C" and + "P InAngle A B C" + shows "A B P LtA A B C" + by (metis LtA_def TS_def 
Tarski_neutral_dimensionless.conga_cop__or_out_ts Tarski_neutral_dimensionless.conga_os__out Tarski_neutral_dimensionless.inangle__lea Tarski_neutral_dimensionless.ncol_conga_ncol Tarski_neutral_dimensionless_axioms assms(1) assms(2) col_one_side_out col_trivial_3 in_angle_one_side inangle__coplanar invert_two_sides l11_21_b ncoplanar_perm_12 not_col_permutation_3 one_side_reflexivity) + +lemma in_angle_trans: + assumes "C InAngle A B D" and + "D InAngle A B E" + shows "C InAngle A B E" +proof - + obtain CC where P1: "Bet A CC D \ (CC = B \ B Out CC C)" + using InAngle_def assms(1) by auto + obtain DD where P2: "Bet A DD E \ (DD = B \ B Out DD D)" + using InAngle_def assms(2) by auto + then have P3: "Bet A DD E" by simp + have P4: "DD = B \ B Out DD D" using P2 by simp + { + assume "CC = B \ DD = B" + then have "C InAngle A B E" + using InAngle_def P2 assms(1) assms(2) by auto + } + { + assume "CC = B \ B Out DD D" + then have "C InAngle A B E" + by (metis InAngle_def P1 assms(1) assms(2) bet_in_angle_bet) + } + { + assume "B Out CC C \ DD = B" + then have "C InAngle A B E" + by (metis Out_def P2 assms(2) in_angle_line inangle_distincts) + } + { + assume P3: "B Out CC C \ B Out DD D" + then have P3A: "B Out CC C" by simp + have P3B: "B Out DD D" using P3 by simp + have "C InAngle A B DD" + using P3 assms(1) inangle_distincts l11_25 out_trivial by blast + then obtain CC' where T1: "Bet A CC' DD \ (CC' = B \ B Out CC' C)" + using InAngle_def by auto + { + assume "CC' = B" + then have "C InAngle A B E" + by (metis P2 P3 T1 assms(2) between_exchange4 in_angle_line inangle_distincts out_diff2) + } + { + assume "B Out CC' C" + then have "C InAngle A B E" + by (metis InAngle_def P2 T1 assms(1) assms(2) between_exchange4) + } + + then have "C InAngle A B E" + using T1 \CC' = B \ C InAngle A B E\ by blast + } + thus ?thesis + using P1 P2 \B Out CC C \ DD = B \ C InAngle A B E\ \CC = B \ B Out DD D \ C InAngle A B E\ \CC = B \ DD = B \ C InAngle A B E\ by blast +qed + +lemma lea_trans: + assumes "A B C LeA A1 B1 C1" and + "A1 B1 C1 LeA A2 B2 C2" + shows "A B C LeA A2 B2 C2" +proof - + obtain P1 where T1: "P1 InAngle A1 B1 C1 \ A B C CongA A1 B1 P1" + using LeA_def assms(1) by auto + obtain P2 where T2: "P2 InAngle A2 B2 C2 \ A1 B1 C1 CongA A2 B2 P2" + using LeA_def assms(2) by blast + have T3: "A \ B" + using CongA_def T1 by auto + have T4: "C \ B" + using CongA_def T1 by blast + have T5: "A1 \ B1" + using T1 inangle_distincts by blast + have T6: "C1 \ B1" + using T1 inangle_distincts by blast + have T7: "A2 \ B2" + using T2 inangle_distincts by blast + have T8: "C2 \ B2" + using T2 inangle_distincts by blast + have T9: "Bet A B C \ B Out A C \ \ Col A B C" + using not_out_bet by auto + { + assume "Bet A B C" + then have "A B C LeA A2 B2 C2" + by (metis T1 T2 T3 T4 T7 T8 bet_conga__bet bet_in_angle_bet l11_31_2) + } + { + assume "B Out A C" + then have "A B C LeA A2 B2 C2" + by (simp add: T7 T8 l11_31_1) + } + { + assume H1: "\ Col A B C" + have T10: "Bet A2 B2 C2 \ B2 Out A2 C2 \ \ Col A2 B2 C2" + using not_out_bet by auto + { + assume "Bet A2 B2 C2" + then have "A B C LeA A2 B2 C2" + by (simp add: T3 T4 T7 T8 l11_31_2) + } + { + assume T10A: "B2 Out A2 C2" + have "B Out A C" + proof - + have "B1 Out A1 P1" + proof - + have "B1 Out A1 C1" using T2 conga_sym T2 T10A in_angle_out out_conga_out by blast + thus ?thesis using T1 in_angle_out by blast + qed + thus ?thesis using T1 conga_sym l11_21_a by blast + qed + then have "A B C LeA A2 B2 C2" + using \B Out A C \ A B C LeA A2 B2 C2\ by blast + } + { 
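+      (* Main case: with A2 B2 C2 not collinear, angle_construction_1 yields P with
+         A B C CongA A2 B2 P on the C2-side of A2 B2; conga_preserves_in_angle places
+         P inside the angle A2 B2 P2, in_angle_trans then places it inside A2 B2 C2,
+         and A B C LeA A2 B2 C2 follows from the definition of LeA. *)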
+ assume T12: "\ Col A2 B2 C2" + obtain P where T13: "A B C CongA A2 B2 P \ A2 B2 OS P C2" + using T12 H1 angle_construction_1 by blast + have T14: "A2 B2 OS P2 C2" + proof - + have "\ Col B2 A2 P2" + proof - + have "B2 \ A2" + using T7 by auto + { + assume H2: "P2 = A2" + have "A2 B2 A2 CongA A1 B1 C1" + using T2 H2 conga_sym by blast + then have "B1 Out A1 C1" + using eq_conga_out by blast + then have "B1 Out A1 P1" + using T1 in_angle_out by blast + then have "B Out A C" + using T1 conga_sym out_conga_out by blast + then have False + using Col_cases H1 out_col by blast + } + then have "P2 \ A2" by blast + have "Bet A2 B2 P2 \ B2 Out A2 P2 \ \ Col A2 B2 P2" + using not_out_bet by auto + { + assume H4: "Bet A2 B2 P2" + then have "Bet A2 B2 C2" + using T2 bet_in_angle_bet by blast + then have "Col B2 A2 P2 \ False" + using Col_def T12 by blast + then have "\ Col B2 A2 P2" + using H4 bet_col not_col_permutation_4 by blast + } + { + assume H5: "B2 Out A2 P2" + then have "B1 Out A1 C1" + using T2 conga_sym out_conga_out by blast + then have "B1 Out A1 P1" + using T1 in_angle_out by blast + then have "B Out A C" + using H1 T1 ncol_conga_ncol not_col_permutation_4 out_col by blast + then have "\ Col B2 A2 P2" + using Col_perm H1 out_col by blast + } + { + assume "\ Col A2 B2 P2" + then have "\ Col B2 A2 P2" + using Col_perm by blast + } + thus ?thesis + using \B2 Out A2 P2 \ \ Col B2 A2 P2\ \Bet A2 B2 P2 \ \ Col B2 A2 P2\ \Bet A2 B2 P2 \ B2 Out A2 P2 \ \ Col A2 B2 P2\ by blast + qed + thus ?thesis + by (simp add: T2 T12 in_angle_one_side) + qed + have S1: "A2 B2 OS P P2" + using T13 T14 one_side_symmetry one_side_transitivity by blast + have "A1 B1 P1 CongA A2 B2 P" + using conga_trans conga_sym T1 T13 by blast + then have "P InAngle A2 B2 P2" + using conga_preserves_in_angle T2 T1 S1 by blast + then have "P InAngle A2 B2 C2" + using T2 in_angle_trans by blast + then have "A B C LeA A2 B2 C2" + using T13 LeA_def by blast + } + then have "A B C LeA A2 B2 C2" + using T10 \B2 Out A2 C2 \ A B C LeA A2 B2 C2\ \Bet A2 B2 C2 \ A B C LeA A2 B2 C2\ by blast + } + thus ?thesis + using T9 \B Out A C \ A B C LeA A2 B2 C2\ \Bet A B C \ A B C LeA A2 B2 C2\ by blast +qed + +lemma in_angle_asym: + assumes "D InAngle A B C" and + "C InAngle A B D" + shows "A B C CongA A B D" +proof - + obtain CC where P1: "Bet A CC D \ (CC = B \ B Out CC C)" + using InAngle_def assms(2) by auto + obtain DD where P2: "Bet A DD C \ (DD = B \ B Out DD D)" + using InAngle_def assms(1) by auto + { + assume "(CC = B) \ (DD = B)" + then have "A B C CongA A B D" + by (metis P1 P2 assms(2) conga_line inangle_distincts) + } + { + assume "(CC = B) \ (B Out DD D)" + then have "A B C CongA A B D" + by (metis P1 assms(1) bet_in_angle_bet conga_line inangle_distincts) + } + { + assume "(B Out CC C) \ (DD = B)" + then have "A B C CongA A B D" + by (metis P2 assms(2) bet_in_angle_bet conga_line inangle_distincts) + } + { + assume V1: "(B Out CC C) \ (B Out DD D)" + obtain X where P3: "Bet CC X C \ Bet DD X D" + using P1 P2 between_symmetry inner_pasch by blast + then have "B Out X D" + using V1 out_bet_out_2 by blast + then have "B Out C D" + using P3 V1 out2_bet_out by blast + then have "A B C CongA A B D" + using assms(2) inangle_distincts l6_6 out2__conga out_trivial by blast + } + thus ?thesis using P1 P2 + using \B Out CC C \ DD = B \ A B C CongA A B D\ \CC = B \ B Out DD D \ A B C CongA A B D\ \CC = B \ DD = B \ A B C CongA A B D\ by blast +qed + +lemma lea_asym: + assumes "A B C LeA D E F" and + "D E F LeA A B C" + shows "A B C CongA D E 
F" +proof cases + assume P1: "Col A B C" + { + assume P1A: "Bet A B C" + have P2: "D \ E" + using assms(1) lea_distincts by blast + have P3: "F \ E" + using assms(2) lea_distincts by auto + have P4: "A \ B" + using assms(1) lea_distincts by auto + have P5: "C \ B" + using assms(2) lea_distincts by blast + obtain P where P6: "P InAngle D E F \ A B C CongA D E P" + using LeA_def assms(1) by blast + then have "A B C CongA D E P" by simp + then have "Bet D E P" using P1 P1A bet_conga__bet + by blast + then have "Bet D E F" + using P6 bet_in_angle_bet by blast + then have "A B C CongA D E F" + by (metis Tarski_neutral_dimensionless.bet_conga__bet Tarski_neutral_dimensionless.conga_line Tarski_neutral_dimensionless.l11_29_a Tarski_neutral_dimensionless_axioms P2 P3 P4 P5 assms(2) bet_in_angle_bet) + } + { + assume T1: "\ Bet A B C" + then have T2: "B Out A C" + using P1 not_out_bet by auto + obtain P where T3: "P InAngle A B C \ D E F CongA A B P" + using LeA_def assms(2) by blast + then have T3A: "P InAngle A B C" by simp + have T3B: "D E F CongA A B P" using T3 by simp + have T4: "E Out D F" + proof - + have T4A: "B Out A P" + using T2 T3 in_angle_out by blast + have "A B P CongA D E F" + by (simp add: T3 conga_sym) + thus ?thesis + using T4A l11_21_a by blast + qed + then have "A B C CongA D E F" + by (simp add: T2 l11_21_b) + } + thus ?thesis + using \Bet A B C \ A B C CongA D E F\ by blast +next + assume T5: "\ Col A B C" + obtain Q where T6: "C InAngle A B Q \ A B Q CongA D E F" + using assms(1) l11_29_a by blast + then have T6A: "C InAngle A B Q" by simp + have T6B: "A B Q CongA D E F" by (simp add: T6) + obtain P where T7: "P InAngle A B C \ D E F CongA A B P" + using LeA_def assms(2) by blast + then have T7A: "P InAngle A B C" by simp + have T7B: "D E F CongA A B P" by (simp add: T7) + have T13: "A B Q CongA A B P" + using T6 T7 conga_trans by blast + have T14: "Bet A B Q \ B Out A Q \ \ Col A B Q" + using not_out_bet by auto + { + assume R1: "Bet A B Q" + then have "A B C CongA D E F" + using T13 T5 T7 bet_col bet_conga__bet bet_in_angle_bet by blast + } + { + assume R2: "B Out A Q" + then have "A B C CongA D E F" + using T6 in_angle_out l11_21_a l11_21_b by blast + } + { + assume R3: "\ Col A B Q" + have R3A: "Bet A B P \ B Out A P \ \ Col A B P" + using not_out_bet by blast + { + assume R3AA: "Bet A B P" + then have "A B C CongA D E F" + using T5 T7 bet_col bet_in_angle_bet by blast + } + { + assume R3AB: "B Out A P" + then have "A B C CongA D E F" + by (meson Col_cases R3 T13 ncol_conga_ncol out_col) + } + { + assume R3AC: "\ Col A B P" + have R3AD: "B Out P Q \ A B TS P Q" + proof - + have "Coplanar A B P Q" + using T6A T7A coplanar_perm_8 in_angle_trans inangle__coplanar by blast + thus ?thesis + by (simp add: T13 conga_sym conga_cop__or_out_ts) + qed + { + assume "B Out P Q" + then have "C InAngle A B P" + by (meson R3 T6A bet_col between_symmetry l11_24 l11_25_aux) + then have "A B C CongA A B P" + by (simp add: T7A in_angle_asym) + then have "A B C CongA D E F" + by (meson T7B Tarski_neutral_dimensionless.conga_sym Tarski_neutral_dimensionless.conga_trans Tarski_neutral_dimensionless_axioms) + } + { + assume W1: "A B TS P Q" + have "A B OS P Q" + using Col_perm R3 R3AC T6A T7A in_angle_one_side in_angle_trans by blast + then have "A B C CongA D E F" + using W1 l9_9 by blast + } + then have "A B C CongA D E F" + using R3AD \B Out P Q \ A B C CongA D E F\ by blast + } + then have "A B C CongA D E F" + using R3A \B Out A P \ A B C CongA D E F\ \Bet A B P \ A B C CongA D E F\ by 
blast + } + thus ?thesis + using T14 \B Out A Q \ A B C CongA D E F\ \Bet A B Q \ A B C CongA D E F\ by blast +qed + +lemma col_lta__bet: + assumes "Col X Y Z" and + "A B C LtA X Y Z" + shows "Bet X Y Z" +proof - + have "A B C LeA X Y Z \ \ A B C CongA X Y Z" + using LtA_def assms(2) by auto + then have "Y Out X Z \ False" + using Tarski_neutral_dimensionless.lea_asym Tarski_neutral_dimensionless.lea_distincts Tarski_neutral_dimensionless_axioms l11_31_1 + by fastforce + thus ?thesis using not_out_bet assms(1) + by blast +qed + +lemma col_lta__out: + assumes "Col A B C" and + "A B C LtA X Y Z" + shows "B Out A C" +proof - + have "A B C LeA X Y Z \ \ A B C CongA X Y Z" + using LtA_def assms(2) by auto + thus ?thesis + by (metis assms(1) l11_31_2 lea_asym lea_distincts or_bet_out) +qed + +lemma lta_distincts: + assumes "A B C LtA D E F" + shows "A\B \ C\B \ D\E \ F\E \ D \ F" + by (metis LtA_def assms bet_neq12__neq col_lta__bet lea_distincts not_col_distincts) + +lemma gta_distincts: + assumes "A B C GtA D E F" + shows "A\B \ C\B \ D\E \ F\E \ A \ C" + using GtA_def assms lta_distincts by presburger + +lemma acute_distincts: + assumes "Acute A B C" + shows "A\B \ C\B" + using Acute_def assms lta_distincts by blast + +lemma obtuse_distincts: + assumes "Obtuse A B C" + shows "A\B \ C\B \ A \ C" + using Obtuse_def assms lta_distincts by blast + +lemma two_sides_in_angle: + assumes "B \ P'" and + "B P TS A C" and + "Bet P B P'" + shows "P InAngle A B C \ P' InAngle A B C" +proof - + obtain T where P1: "Col T B P \ Bet A T C" + using TS_def assms(2) by auto + have P2: "A \ B" + using assms(2) ts_distincts by blast + have P3: "C \ B" + using assms(2) ts_distincts by blast + show ?thesis + proof cases + assume "B = T" + thus ?thesis + using P1 P2 P3 assms(1) in_angle_line by auto + next + assume "B \ T" + thus ?thesis + by (metis InAngle_def P1 assms(1) assms(2) assms(3) between_symmetry l6_3_2 or_bet_out ts_distincts) + qed +qed + +lemma in_angle_reverse: + assumes "A' \ B" and + "Bet A B A'" and + "C InAngle A B D" + shows "D InAngle A' B C" +proof - + have P1: "A \ B" + using assms(3) inangle_distincts by auto + have P2: "D \ B" + using assms(3) inangle_distincts by blast + have P3: "C \ B" + using assms(3) inangle_distincts by auto + show ?thesis + proof cases + assume "Col B A C" + thus ?thesis + by (smt P1 P2 P3 assms(1) assms(2) assms(3) bet_in_angle_bet between_inner_transitivity between_symmetry in_angle_line l6_3_2 out321__inangle outer_transitivity_between third_point) + next + assume P4: "\ Col B A C" + thus ?thesis + proof cases + assume "Col B D C" + thus ?thesis + by (smt P2 P4 assms(1) assms(2) assms(3) bet_col1 col2__eq col_permutation_2 in_angle_one_side l9_19_R1 out341__inangle) + next + assume P5: "\ Col B D C" + have P6: "C B TS A D" + using P4 P5 assms(3) in_angle_two_sides by auto + obtain X where P7: "Bet A X D \ (X = B \ B Out X C)" + using InAngle_def assms(3) by auto + have P8: "X = B \ D InAngle A' B C" + using Out_def P1 P2 P3 P7 assms(1) assms(2) l5_2 out321__inangle by auto + { + assume P9: "B Out X C" + have P10: "C \ B" + by (simp add: P3) + have P10A: "\ Col B A C" + by (simp add: P4) + have P10B: "\ Col B D C" + by (simp add: P5) + have P10C: "C InAngle D B A" + by (simp add: assms(3) l11_24) + { + assume "Col D B A" + have "Col B A C" + proof - + have "B \ X" + using P9 out_distinct by blast + have "Col B X A" + by (meson Bet_perm P10C P5 P7 \Col D B A\ bet_col1 col_permutation_3 in_angle_out or_bet_out out_col) + have "Col B X C" + by (simp add: P9 out_col) + 
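+        (* A and C are both collinear with B and X, and B ≠ X, so transitivity of
+           collinearity gives Col B A C, contradicting the assumption ¬ Col B A C. *)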
thus ?thesis + using \B \ X\ \Col B X A\ col_transitivity_1 by blast + qed + then have False + by (simp add: P4) + } + then have P10E: "\ Col D B A" by auto + have P11: "D B OS C A" + by (simp add: P10C P10E P5 in_angle_one_side) + have P12: "\ Col A D B" + using Col_cases P10E by auto + have P13: "\ Col A' D B" + by (metis Col_def \Col D B A \ False\ assms(1) assms(2) col_transitivity_1) + have P14: "D B TS A A'" + using P12 P13 TS_def assms(2) col_trivial_3 by blast + have P15: "D B TS C A'" + using P11 P14 l9_8_2 one_side_symmetry by blast + have P16: "\ Col C D B" + by (simp add: P5 not_col_permutation_3) + obtain Y where P17: "Col Y D B \ Bet C Y A'" + using P15 TS_def by auto + have P18: "Bet A' Y C" + using Bet_perm P17 by blast + { + assume S1: "Y \ B" + have S2: "Col D B Y" + using P17 not_col_permutation_2 by blast + then have S3: "Bet D B Y \ Bet B Y D \ Bet Y D B" + using Col_def S2 by auto + { + assume S4: "Bet D B Y" + have S5: "C B OS A' Y" + by (metis P17 P18 P5 S1 bet_out_1 col_transitivity_2 l6_6 not_col_permutation_3 not_col_permutation_5 out_one_side) + have S6: "C B TS Y D" + by (metis Bet_perm P16 P17 S1 S4 bet__ts col3 col_trivial_3 invert_two_sides not_col_permutation_1) + have "C B TS A A'" + by (metis (full_types) P4 assms(1) assms(2) bet__ts invert_two_sides not_col_permutation_5) + then have "C B TS Y A" + using S5 l9_2 l9_8_2 by blast + then have S9: "C B OS A D" + using P6 S6 l9_8_1 l9_9 by blast + then have "B Out Y D" + using P6 S9 l9_9 by auto + } + { + assume "Bet B Y D" + then have "B Out Y D" + by (simp add: S1 bet_out) + } + { + assume "Bet Y D B" + then have "B Out Y D" + by (simp add: P2 bet_out_1 l6_6) + } + have "B Out Y D" + using S3 \Bet B Y D \ B Out Y D\ \Bet D B Y \ B Out Y D\ \Bet Y D B \ B Out Y D\ by blast + } + then have P19: "(Y = B \ B Out Y D)" by auto + have "D InAngle A' B C" + using InAngle_def P18 P19 P2 P3 assms(1) by auto + } + thus ?thesis using P7 P8 by blast + qed + qed +qed + +lemma in_angle_trans2: + assumes "C InAngle A B D" and + "D InAngle A B E" + shows "D InAngle C B E" +proof - + obtain pp :: "'p \ 'p \ 'p" where + f1: "\p pa. Bet p pa (pp p pa) \ pa \ (pp p pa)" + using point_construction_different by moura + then have f2: "\p. 
C InAngle D B (pp p B) \ \ D InAngle p B A" + by (metis assms(1) in_angle_reverse in_angle_trans l11_24) + have f3: "D InAngle E B A" + using assms(2) l11_24 by blast + then have "E \ B" + by (simp add: inangle_distincts) + thus ?thesis + using f3 f2 f1 by (meson Bet_perm in_angle_reverse l11_24) +qed + +lemma l11_36_aux1: + assumes "A \ B" and + "A' \ B" and + "D \ E" and + "D' \ E" and + "Bet A B A'" and + "Bet D E D'" and + "A B C LeA D E F" + shows "D' E F LeA A' B C" +proof - + obtain P where P1: "C InAngle A B P \ +A B P CongA D E F" + using assms(7) l11_29_a by blast + thus ?thesis + by (metis LeA_def Tarski_neutral_dimensionless.l11_13 Tarski_neutral_dimensionless_axioms assms(2) assms(4) assms(5) assms(6) conga_sym in_angle_reverse) +qed + +lemma l11_36_aux2: + assumes "A \ B" and + "A' \ B" and + "D \ E" and + "D' \ E" and + "Bet A B A'" and + "Bet D E D'" and + "D' E F LeA A' B C" + shows "A B C LeA D E F" + by (metis Bet_cases assms(1) assms(3) assms(5) assms(6) assms(7) l11_36_aux1 lea_distincts) + +lemma l11_36: + assumes "A \ B" and + "A' \ B" and + "D \ E" and + "D' \ E" and + "Bet A B A'" and + "Bet D E D'" + shows "A B C LeA D E F \ D' E F LeA A' B C" + using assms(1) assms(2) assms(3) assms(4) assms(5) assms(6) l11_36_aux1 l11_36_aux2 by auto + +lemma l11_41_aux: + assumes "\ Col A B C" and + "Bet B A D" and + "A \ D" + shows "A C B LtA C A D" +proof - + obtain M where P1: "M Midpoint A C" + using midpoint_existence by auto + obtain P where P2: "M Midpoint B P" + using symmetric_point_construction by auto + have P3: "A C B Cong3 C A P" + by (smt Cong3_def P1 P2 assms(1) l7_13_R1 l7_2 midpoint_distinct_1 not_col_distincts) + have P4: "A \ C" + using assms(1) col_trivial_3 by blast + have P5: "B \ C" + using assms(1) col_trivial_2 by blast + have P7: "A \ M" + using P1 P4 is_midpoint_id by blast + have P8: "A C B CongA C A P" + by (simp add: P3 P4 P5 cong3_conga) + have P8A: "Bet D A B" + using Bet_perm assms(2) by blast + have P8B: "Bet P M B" + by (simp add: P2 between_symmetry midpoint_bet) + then obtain X where P9: "Bet A X P \ Bet M X D" using P8A inner_pasch by blast + have P9A: "Bet A X P" by (simp add: P9) + have P9B: "Bet M X D" by (simp add: P9) + have P10A: "P InAngle C A D" + proof - + have K1: "P InAngle M A D" + by (metis InAngle_def P3 P5 P7 P9 assms(3) bet_out cong3_diff2) + have K2: "A Out C M" + using Out_def P1 P4 P7 midpoint_bet by auto + have K3: "A Out D D" + using assms(3) out_trivial by auto + have "A Out P P" + using K1 inangle_distincts out_trivial by auto + thus ?thesis + using K1 K2 K3 l11_25 by blast + qed + then have P10: "A C B LeA C A D" + using LeA_def P8 by auto + { + assume K5: "A C B CongA C A D" + then have K6: "C A D CongA C A P" + using P8 conga_sym conga_trans by blast + have K7: "Coplanar C A D P" + using P10A inangle__coplanar ncoplanar_perm_18 by blast + then have K8: "A Out D P \ C A TS D P" + by (simp add: K6 conga_cop__or_out_ts) + { + assume "A Out D P" + + then have "Col M B A" + by (meson P8A P8B bet_col1 bet_out__bet between_symmetry not_col_permutation_4) + then have K8F: "Col A M B" + using not_col_permutation_1 by blast + have "Col A M C" + by (simp add: P1 bet_col midpoint_bet) + then have "False" + using K8F P7 assms(1) col_transitivity_1 by blast + } + then have K9: "\ A Out D P" by auto + { + assume V1: "C A TS D P" + then have V3: "A C TS B P" + by (metis P10A P8A assms(1) col_trivial_1 col_trivial_2 in_angle_reverse in_angle_two_sides invert_two_sides l11_24 l9_18 not_col_permutation_5) + have "A C TS B D" + by 
(simp add: assms(1) assms(2) assms(3) bet__ts not_col_permutation_5) + then have "A C OS D P" + using V1 V3 invert_two_sides l9_8_1 l9_9 by blast + then have "False" + using V1 invert_one_side l9_9 by blast + } + then have "\ C A TS D P" by auto + then have "False" using K8 K9 by auto + } + then have "\ A C B CongA C A D" by auto + thus ?thesis + by (simp add: LtA_def P10) +qed + +lemma l11_41: + assumes "\ Col A B C" and + "Bet B A D" and + "A \ D" + shows "A C B LtA C A D \ A B C LtA C A D" +proof - + have P1: "A C B LtA C A D" + using assms(1) assms(2) assms(3) l11_41_aux by auto + have "A B C LtA C A D" + proof - + obtain E where T1: "Bet C A E \ Cong A E C A" + using segment_construction by blast + have T1A: "Bet C A E" using T1 by simp + have T1B: "Cong A E C A" using T1 by simp + have T2: "A B C LtA B A E" + using T1 assms(1) cong_reverse_identity l11_41_aux not_col_distincts not_col_permutation_5 by blast + have T3: "B A C CongA C A B" + by (metis assms(1) conga_pseudo_refl not_col_distincts) + have T3A: "D A C CongA E A B" + by (metis CongA_def T1 T3 assms(2) assms(3) cong_reverse_identity l11_13) + then have T4: "B A E CongA C A D" + using conga_comm conga_sym by blast + have "A B C CongA A B C" + using T2 Tarski_neutral_dimensionless.conga_refl Tarski_neutral_dimensionless.lta_distincts Tarski_neutral_dimensionless_axioms by fastforce + then have T5: "A B C LeA C A D" + by (meson T2 T4 Tarski_neutral_dimensionless.LtA_def Tarski_neutral_dimensionless.l11_30 Tarski_neutral_dimensionless_axioms) + have "\ A B C CongA C A D" + by (meson T2 Tarski_neutral_dimensionless.LtA_def Tarski_neutral_dimensionless.conga_right_comm Tarski_neutral_dimensionless.conga_trans Tarski_neutral_dimensionless_axioms T3A) + thus ?thesis + by (simp add: LtA_def T5) + qed + thus ?thesis by (simp add: P1) +qed + +lemma not_conga: + assumes "A B C CongA A' B' C'" and + "\ A B C CongA D E F" + shows "\ A' B' C' CongA D E F" + by (meson assms(1) assms(2) conga_trans) + +lemma not_conga_sym: + assumes "\ A B C CongA D E F" + shows "\ D E F CongA A B C" + using assms conga_sym by blast + +lemma not_and_lta: + shows "\ (A B C LtA D E F \ D E F LtA A B C)" +proof - + { + assume P1: "A B C LtA D E F \ D E F LtA A B C" + then have "A B C CongA D E F" + using LtA_def lea_asym by blast + then have "False" + using LtA_def P1 by blast + } + thus ?thesis by auto +qed + +lemma conga_preserves_lta: + assumes "A B C CongA A' B' C'" and + "D E F CongA D' E' F'" and + "A B C LtA D E F" + shows "A' B' C' LtA D' E' F'" + by (meson Tarski_neutral_dimensionless.LtA_def Tarski_neutral_dimensionless.conga_trans Tarski_neutral_dimensionless.l11_30 Tarski_neutral_dimensionless.not_conga_sym Tarski_neutral_dimensionless_axioms assms(1) assms(2) assms(3)) + +lemma lta_trans: + assumes "A B C LtA A1 B1 C1" and + "A1 B1 C1 LtA A2 B2 C2" + shows "A B C LtA A2 B2 C2" +proof - + have P1: "A B C LeA A2 B2 C2" + by (meson LtA_def assms(1) assms(2) lea_trans) + { + assume "A B C CongA A2 B2 C2" + then have "False" + by (meson Tarski_neutral_dimensionless.LtA_def Tarski_neutral_dimensionless.lea_asym Tarski_neutral_dimensionless.lea_trans Tarski_neutral_dimensionless_axioms assms(1) assms(2) conga__lea456123) + } + thus ?thesis + using LtA_def P1 by blast +qed + +lemma obtuse_sym: + assumes "Obtuse A B C" + shows "Obtuse C B A" + by (meson Obtuse_def Tarski_neutral_dimensionless.lta_right_comm Tarski_neutral_dimensionless_axioms assms) + +lemma acute_sym: + assumes "Acute A B C" + shows "Acute C B A" + by (meson Acute_def 
Tarski_neutral_dimensionless.lta_left_comm Tarski_neutral_dimensionless_axioms assms) + +lemma acute_col__out: + assumes "Col A B C" and + "Acute A B C" + shows "B Out A C" + by (meson Tarski_neutral_dimensionless.Acute_def Tarski_neutral_dimensionless_axioms assms(1) assms(2) col_lta__out) + +lemma col_obtuse__bet: + assumes "Col A B C" and + "Obtuse A B C" + shows "Bet A B C" + using Obtuse_def assms(1) assms(2) col_lta__bet by blast + +lemma out__acute: + assumes "B Out A C" + shows "Acute A B C" +proof - + have P1: "A \ B" + using assms out_diff1 by auto + then obtain D where P3: "B D Perp A B" + using perp_exists by blast + then have P4: "B \ D" + using perp_distinct by auto + have P5: "Per A B D" + by (simp add: P3 l8_2 perp_per_1) + have P6: "A B C LeA A B D" + using P1 P4 assms l11_31_1 by auto + { + assume "A B C CongA A B D" + then have "False" + by (metis Col_cases P1 P4 P5 assms col_conga_col l8_9 out_col) + } + then have "A B C LtA A B D" + using LtA_def P6 by auto + thus ?thesis + using P5 Acute_def by auto +qed + +lemma bet__obtuse: + assumes "Bet A B C" and + "A \ B" and "B \ C" + shows "Obtuse A B C" +proof - + obtain D where P1: "B D Perp A B" + using assms(2) perp_exists by blast + have P5: "B \ D" + using P1 perp_not_eq_1 by auto + have P6: "Per A B D" + using P1 Perp_cases perp_per_1 by blast + have P7: "A B D LeA A B C" + using assms(2) assms(3) P5 assms(1) l11_31_2 by auto + { + assume "A B D CongA A B C" + then have "False" + using assms(2) P5 P6 assms(1) bet_col ncol_conga_ncol per_not_col by blast + } + then have "A B D LtA A B C" + using LtA_def P7 by blast + thus ?thesis + using Obtuse_def P6 by blast +qed + +lemma l11_43_aux: + assumes "A \ B" and + "A \ C" and + "Per B A C \ Obtuse B A C" + shows "Acute A B C" +proof cases + assume P1: "Col A B C" + { + assume "Per B A C" + then have "Acute A B C" + using Col_cases P1 assms(1) assms(2) per_col_eq by blast + } + { + assume "Obtuse B A C" + then have "Bet B A C" + using P1 col_obtuse__bet col_permutation_4 by blast + then have "Acute A B C" + by (simp add: assms(1) bet_out out__acute) + } + thus ?thesis + using \Per B A C \ Acute A B C\ assms(3) by blast +next + assume P2: "\ Col A B C" + then have P3: "B \ C" + using col_trivial_2 by auto + obtain B' where P4: "Bet B A B' \ Cong A B' B A" + using segment_construction by blast + have P5: "\ Col B' A C" + by (metis Col_def P2 P4 col_transitivity_2 cong_reverse_identity) + then have P6: "B' \ A \ B' \ C" + using not_col_distincts by blast + then have P7: "A C B LtA C A B' \ A B C LtA C A B'" + using P2 P4 l11_41 by auto + then have P7A: "A C B LtA C A B'" by simp + have P7B: "A B C LtA C A B'" by (simp add: P7) + { + assume "Per B A C" + have "Acute A B C" + by (metis Acute_def P4 P7B \Per B A C\ assms(1) bet_col col_per2__per col_trivial_3 l8_3 lta_right_comm) + } + { + assume T1: "Obtuse B A C" + then obtain a b c where T2: "Per a b c \ a b c LtA B A C" + using Obtuse_def by blast + then have T2A: "Per a b c" by simp + have T2B: "a b c LtA B A C" by (simp add: T2) + then have T3: "a b c LeA B A C \ \ a b c CongA B A C" + by (simp add: LtA_def) + then have T3A: "a b c LeA B A C" by simp + have T3B: "\ a b c CongA B A C" by (simp add: T3) + obtain P where T4: "P InAngle B A C \ a b c CongA B A P" + using LeA_def T3 by blast + then have T5: "Per B A P" using T4 T2 l11_17 by blast + then have T6: "Per P A B" + using l8_2 by blast + have "Col A B B'" + by (simp add: P4 bet_col col_permutation_4) + then have "Per P A B'" + using T6 assms(1) per_col by blast + then 
have S3: "B A P CongA B' A P" + using l8_2 P6 T5 T4 CongA_def assms(1) l11_16 by auto + have "C A B' LtA P A B" + proof - + have S4: "B A P LeA B A C \ B' A C LeA B' A P" + using P4 P6 assms(1) l11_36 by auto + have S5: "C A B' LeA P A B" + proof - + have S6: "B A P LeA B A C" + using T4 inangle__lea by auto + have "B' A P CongA P A B" + using S3 conga_left_comm not_conga_sym by blast + thus ?thesis + using P6 S4 S6 assms(2) conga_pseudo_refl l11_30 by auto + qed + { + assume T10: "C A B' CongA P A B" + have "Per B' A C" + proof - + have "B A P CongA B' A C" + using T10 conga_comm conga_sym by blast + thus ?thesis + using T5 l11_17 by blast + qed + then have "Per C A B" + using Col_cases P6 \Col A B B'\ l8_2 l8_3 by blast + have "a b c CongA B A C" + proof - + have "a \ b" + using T3A lea_distincts by auto + have "c \ b" + using T2B lta_distincts by blast + have "Per B A C" + using Per_cases \Per C A B\ by blast + thus ?thesis + using T2 \a \ b\ \c \ b\ assms(1) assms(2) l11_16 by auto + qed + then have "False" + using T3B by blast + } + then have "\ C A B' CongA P A B" by blast + thus ?thesis + by (simp add: LtA_def S5) + qed + then have "A B C LtA B A P" + by (meson P7 lta_right_comm lta_trans) + then have "Acute A B C" using T5 + using Acute_def by blast + } + thus ?thesis + using \Per B A C \ Acute A B C\ assms(3) by blast +qed + +lemma l11_43: + assumes "A \ B" and + "A \ C" and + "Per B A C \ Obtuse B A C" + shows "Acute A B C \ Acute A C B" + using Per_perm assms(1) assms(2) assms(3) l11_43_aux obtuse_sym by blast + +lemma acute_lea_acute: + assumes "Acute D E F" and + "A B C LeA D E F" + shows "Acute A B C" +proof - + obtain A' B' C' where P1: "Per A' B' C' \ D E F LtA A' B' C'" + using Acute_def assms(1) by auto + have P2: "A B C LeA A' B' C'" + using LtA_def P1 assms(2) lea_trans by blast + have "\ A B C CongA A' B' C'" + by (meson LtA_def P1 assms(2) conga__lea456123 lea_asym lea_trans) + then have "A B C LtA A' B' C'" + by (simp add: LtA_def P2) + thus ?thesis + using Acute_def P1 by auto +qed + +lemma lea_obtuse_obtuse: + assumes "Obtuse D E F" and + "D E F LeA A B C" + shows "Obtuse A B C" +proof - + obtain A' B' C' where P1: "Per A' B' C' \ A' B' C' LtA D E F" + using Obtuse_def assms(1) by auto + then have P2: "A' B' C' LeA A B C" + using LtA_def assms(2) lea_trans by blast + have "\ A' B' C' CongA A B C" + by (meson LtA_def P1 assms(2) conga__lea456123 lea_asym lea_trans) + then have "A' B' C' LtA A B C" + by (simp add: LtA_def P2) + thus ?thesis + using Obtuse_def P1 by auto +qed + +lemma l11_44_1_a: + assumes "A \ B" and + "A \ C" and + "Cong B A B C" + shows "B A C CongA B C A" + by (metis (no_types, hide_lams) Cong3_def assms(1) assms(2) assms(3) cong3_conga cong_inner_transitivity cong_pseudo_reflexivity) + +lemma l11_44_2_a: + assumes "\ Col A B C" and + "B A Lt B C" + shows "B C A LtA B A C" +proof - + have T1: "A \ B" + using assms(1) col_trivial_1 by auto + have T3: "A \ C" + using assms(1) col_trivial_3 by auto + have "B A Le B C" + by (simp add: assms(2) lt__le) + then obtain C' where P1: "Bet B C' C \ Cong B A B C'" + using assms(2) Le_def by blast + have T5: "C \ C'" + using P1 assms(2) cong__nlt by blast + have T5A: "C' \ A" + using Col_def Col_perm P1 assms(1) by blast + then have T6: "C' InAngle B A C" + using InAngle_def P1 T1 T3 out_trivial by auto + have T7: "C' A C LtA A C' B \ C' C A LtA A C' B" + proof - + have W1: "\ Col C' C A" + by (metis Col_def P1 T5 assms(1) col_transitivity_2) + have W2: "Bet C C' B" + using Bet_perm P1 by blast + have "C' \ B" 
+ using P1 T1 cong_identity by blast + thus ?thesis + using l11_41 W1 W2 by simp + qed + have T90: "B A C' LtA B A C" + proof - + have T90A: "B A C' LeA B A C" + by (simp add: T6 inangle__lea) + have "B A C' CongA B A C'" + using T1 T5A conga_refl by auto + { + assume "B A C' CongA B A C" + then have R1: "A Out C' C" + by (metis P1 T7 assms(1) bet_out conga_os__out lta_distincts not_col_permutation_4 out_one_side) + have "B A OS C' C" + by (metis Col_perm P1 T1 assms(1) bet_out cong_diff_2 out_one_side) + then have "False" + using Col_perm P1 T5 R1 bet_col col2__eq one_side_not_col123 out_col by blast + } + then have "\ B A C' CongA B A C" by blast + thus ?thesis + by (simp add: LtA_def T90A) + qed + have "B A C' CongA B C' A" + using P1 T1 T5A l11_44_1_a by auto + then have K2: "A C' B CongA B A C'" + using conga_left_comm not_conga_sym by blast + have "B C A LtA B A C'" + proof - + have K1: "B C A CongA B C A" + using assms(1) conga_refl not_col_distincts by blast + have "B C A LtA A C' B" + proof - + have "C' C A CongA B C A" + proof - + have K2: "C Out B C'" + using P1 T5 bet_out_1 l6_6 by auto + have "C Out A A" + by (simp add: T3 out_trivial) + thus ?thesis + by (simp add: K2 out2__conga) + qed + have "A C' B CongA A C' B" + using CongA_def K2 conga_refl by auto + thus ?thesis + using T7 \C' C A CongA B C A\ conga_preserves_lta by auto + qed + thus ?thesis + using K1 K2 conga_preserves_lta by auto + qed + thus ?thesis + using T90 lta_trans by blast +qed + +lemma not_lta_and_conga: + "\ ( A B C LtA D E F \ A B C CongA D E F)" + by (simp add: LtA_def) + +lemma conga_sym_equiv: + "A B C CongA A' B' C' \ A' B' C' CongA A B C" + using not_conga_sym by blast + +lemma conga_dec: + "A B C CongA D E F \ \ A B C CongA D E F" + by auto + +lemma lta_not_conga: + assumes "A B C LtA D E F" + shows "\ A B C CongA D E F" + using assms not_lta_and_conga by auto + +lemma lta__lea: + assumes "A B C LtA D E F" + shows "A B C LeA D E F" + using LtA_def assms by auto + +lemma nlta: + "\ A B C LtA A B C" + using not_and_lta by blast + +lemma lea__nlta: + assumes "A B C LeA D E F" + shows "\ D E F LtA A B C" + by (meson Tarski_neutral_dimensionless.lea_asym Tarski_neutral_dimensionless.not_lta_and_conga Tarski_neutral_dimensionless_axioms assms lta__lea) + +lemma lta__nlea: + assumes "A B C LtA D E F" + shows "\ D E F LeA A B C" + using assms lea__nlta by blast + +lemma l11_44_1_b: + assumes "\ Col A B C" and + "B A C CongA B C A" + shows "Cong B A B C" +proof - + have "B A Lt B C \ B A Gt B C \ Cong B A B C" + by (simp add: or_lt_cong_gt) + thus ?thesis + by (meson Gt_def assms(1) assms(2) conga_sym l11_44_2_a not_col_permutation_3 not_lta_and_conga) +qed + +lemma l11_44_2_b: + assumes "B A C LtA B C A" + shows "B C Lt B A" +proof cases + assume "Col A B C" + thus ?thesis + using Col_perm assms bet__lt1213 col_lta__bet lta_distincts by blast +next + assume P1: "\ Col A B C" + then have P2: "A \ B" + using col_trivial_1 by blast + have P3: "A \ C" + using P1 col_trivial_3 by auto + have "B A Lt B C \ B A Gt B C \ Cong B A B C" + by (simp add: or_lt_cong_gt) + { + assume "B A Lt B C" + then have "B C Lt B A" + using P1 assms l11_44_2_a not_and_lta by blast + } + { + assume "B A Gt B C" + then have "B C Lt B A" + using Gt_def P1 assms l11_44_2_a not_and_lta by blast + } + { + assume "Cong B A B C" + then have "B A C CongA B C A" + by (simp add: P2 P3 l11_44_1_a) + then have "B C Lt B A" + using assms not_lta_and_conga by blast + } + thus ?thesis + by (meson P1 Tarski_neutral_dimensionless.not_and_lta 
Tarski_neutral_dimensionless_axioms \B A Gt B C \ B C Lt B A\ \B A Lt B C \ B A Gt B C \ Cong B A B C\ assms l11_44_2_a) +qed + +lemma l11_44_1: + assumes "\ Col A B C" + shows "B A C CongA B C A \ Cong B A B C" + using assms l11_44_1_a l11_44_1_b not_col_distincts by blast + +lemma l11_44_2: + assumes "\ Col A B C" + shows "B A C LtA B C A \ B C Lt B A" + using assms l11_44_2_a l11_44_2_b not_col_permutation_3 by blast + +lemma l11_44_2bis: + assumes "\ Col A B C" + shows "B A C LeA B C A \ B C Le B A" +proof - + { + assume P1: "B A C LeA B C A" + { + assume "B A Lt B C" + then have "B C A LtA B A C" + by (simp add: assms l11_44_2_a) + then have "False" + using P1 lta__nlea by auto + } + then have "\ B A Lt B C" by blast + have "B C Le B A" + using \\ B A Lt B C\ nle__lt by blast + } + { + assume P2: "B C Le B A" + have "B A C LeA B C A" + proof cases + assume "Cong B C B A" + then have "B A C CongA B C A" + by (metis assms conga_sym l11_44_1_a not_col_distincts) + thus ?thesis + by (simp add: conga__lea) + next + assume "\ Cong B C B A" + then have "B A C LtA B C A" + by (simp add: l11_44_2 assms Lt_def P2) + thus ?thesis + by (simp add: lta__lea) + qed + } + thus ?thesis + using \B A C LeA B C A \ B C Le B A\ by blast +qed + +lemma l11_46: + assumes "A \ B" and + "B \ C" and + "Per A B C \ Obtuse A B C" + shows "B A Lt A C \ B C Lt A C" +proof cases + assume "Col A B C" + thus ?thesis + by (meson assms(1) assms(2) assms(3) bet__lt1213 bet__lt2313 col_obtuse__bet lt_left_comm per_not_col) +next + assume P1: "\ Col A B C" + have P2: "A \ C" + using P1 col_trivial_3 by auto + have P3: "Acute B A C \ Acute B C A" + using assms(1) assms(2) assms(3) l11_43 by auto + then obtain A' B' C' where P4: "Per A' B' C' \ B C A LtA A' B' C'" + using Acute_def P3 by auto + { + assume P5: "Per A B C" + have P5A: "A C B CongA A C B" + by (simp add: P2 assms(2) conga_refl) + have S1: "A \ B" + by (simp add: assms(1)) + have S2: "B \ C" + by (simp add: assms(2)) + have S3: "A' \ B'" + using P4 lta_distincts by blast + have S4: "B' \ C'" + using P4 lta_distincts by blast + then have "A' B' C' CongA A B C" using l11_16 + using S1 S2 S3 S4 P4 P5 by blast + then have "A C B LtA A B C" + using P5A P4 conga_preserves_lta lta_left_comm by blast + } + { + assume "Obtuse A B C" + obtain A'' B'' C'' where P6: "Per A'' B'' C'' \ A'' B'' C'' LtA A B C" + using Obtuse_def \Obtuse A B C\ by auto + have "B C A LtA A' B' C'" + by (simp add: P4) + then have P7: "A C B LtA A' B' C'" + by (simp add: lta_left_comm) + have "A' B' C' LtA A B C" + proof - + have U1: "A'' B'' C'' CongA A' B' C'" + proof - + have V2: "A'' \ B''" + using P6 lta_distincts by blast + have V3: "C'' \ B''" + using P6 lta_distincts by blast + have V5: "A' \ B'" + using P7 lta_distincts by blast + have "C' \ B'" + using P4 lta_distincts by blast + thus ?thesis using P6 V2 V3 P4 V5 + by (simp add: l11_16) + qed + have U2: "A B C CongA A B C" + using assms(1) assms(2) conga_refl by auto + have U3: "A'' B'' C'' LtA A B C" + by (simp add: P6) + thus ?thesis + using U1 U2 conga_preserves_lta by auto + qed + then have "A C B LtA A B C" + using P7 lta_trans by blast + } + then have "A C B LtA A B C" + using \Per A B C \ A C B LtA A B C\ assms(3) by blast + then have "A B Lt A C" + by (simp add: l11_44_2_b) + then have "B A Lt A C" + using Lt_cases by blast + have "C A B LtA C B A" + proof - + obtain A' B' C' where U4: "Per A' B' C' \ B A C LtA A' B' C'" + using Acute_def P3 by blast + { + assume "Per A B C" + then have W3: "A' B' C' CongA C B A" + using U4 
assms(2) l11_16 l8_2 lta_distincts by blast + have W2: "C A B CongA C A B" + using P2 assms(1) conga_refl by auto + have "C A B LtA A' B' C'" + by (simp add: U4 lta_left_comm) + then have "C A B LtA C B A" + using W2 W3 conga_preserves_lta by blast + } + { + assume "Obtuse A B C" + then obtain A'' B'' C'' where W4: "Per A'' B'' C'' \ A'' B'' C'' LtA A B C" + using Obtuse_def by auto + have W5: "C A B LtA A' B' C'" + by (simp add: U4 lta_left_comm) + have "A' B' C' LtA C B A" + proof - + have W6: "A'' B'' C'' CongA A' B' C'" using l11_16 W4 U4 + using lta_distincts by blast + have "C B A CongA C B A" + using assms(1) assms(2) conga_refl by auto + thus ?thesis + using W4 W6 conga_left_comm conga_preserves_lta by blast + qed + then have "C A B LtA C B A" + using W5 lta_trans by blast + } + thus ?thesis + using \Per A B C \ C A B LtA C B A\ assms(3) by blast + qed + then have "C B Lt C A" + by (simp add: l11_44_2_b) + then have "C B Lt A C" + using Lt_cases by auto + then have "B C Lt A C" + using Lt_cases by blast + thus ?thesis + by (simp add: \B A Lt A C\) +qed + +lemma l11_47: + assumes "Per A C B" and + "H PerpAt C H A B" + shows "Bet A H B \ A \ H \ B \ H" +proof - + have P1: "Per C H A" + using assms(2) perp_in_per_1 by auto + have P2: "C H Perp A B" + using assms(2) perp_in_perp by auto + thus ?thesis + proof cases + assume "Col A C B" + thus ?thesis + by (metis P1 assms(1) assms(2) per_distinct_1 per_not_col perp_in_distinct perp_in_id) + next + assume P3: "\ Col A C B" + have P4: "A \ H" + by (metis P2 Per_perm Tarski_neutral_dimensionless.l8_7 Tarski_neutral_dimensionless_axioms assms(1) assms(2) col_trivial_1 perp_in_per_2 perp_not_col2) + have P5: "Per C H B" + using assms(2) perp_in_per_2 by auto + have P6: "B \ H" + using P1 P2 assms(1) l8_2 l8_7 perp_not_eq_1 by blast + have P7: "H A Lt A C \ H C Lt A C" + by (metis P1 P2 P4 l11_46 l8_2 perp_distinct) + have P8: "C A Lt A B \ C B Lt A B" + using P3 assms(1) l11_46 not_col_distincts by blast + have P9: "H B Lt B C \ H C Lt B C" + by (metis P2 P5 P6 Per_cases l11_46 perp_not_eq_1) + have P10: "Bet A H B" + proof - + have T1: "Col A H B" + using assms(2) col_permutation_5 perp_in_col by blast + have T2: "A H Le A B" using P7 P8 + by (meson lt_comm lt_transitivity nlt__le not_and_lt) + have "H B Le A B" + by (meson Lt_cases P8 P9 le_transitivity local.le_cases lt__nle) + thus ?thesis + using T1 T2 l5_12_b by blast + qed + thus ?thesis + by (simp add: P4 P6) + qed +qed + +lemma l11_49: + assumes "A B C CongA A' B' C'" and + "Cong B A B' A'" and + "Cong B C B' C'" + shows "Cong A C A' C' \ (A \ C \ (B A C CongA B' A' C' \ B C A CongA B' C' A'))" +proof - + have T1:" Cong A C A' C'" + using assms(1) assms(2) assms(3) cong2_conga_cong not_cong_2143 by blast + { + assume P1: "A \ C" + have P2: "A \ B" + using CongA_def assms(1) by blast + have P3: "C \ B" + using CongA_def assms(1) by blast + have "B A C Cong3 B' A' C'" + by (simp add: Cong3_def T1 assms(2) assms(3)) + then have T2: "B A C CongA B' A' C'" + using P1 P2 cong3_conga by auto + have "B C A Cong3 B' C' A'" + using Cong3_def T1 assms(2) assms(3) cong_3_swap_2 by blast + then have "B C A CongA B' C' A'" + using P1 P3 cong3_conga by auto + then have "B A C CongA B' A' C' \ B C A CongA B' C' A'" using T2 by blast + } + thus ?thesis + by (simp add: T1) +qed + +lemma l11_50_1: + assumes "\ Col A B C" and + "B A C CongA B' A' C'" and + "A B C CongA A' B' C'" and + "Cong A B A' B'" + shows "Cong A C A' C' \ Cong B C B' C' \ A C B CongA A' C' B'" +proof - + obtain C'' where P1: "B' 
Out C'' C' \ Cong B' C'' B C" + by (metis Col_perm assms(1) assms(3) col_trivial_3 conga_diff56 l6_11_existence) + have P2: "B' \ C''" + using P1 out_diff1 by auto + have P3: "\ Col A' B' C'" + using assms(1) assms(3) ncol_conga_ncol by blast + have P4: "\ Col A' B' C''" + by (meson P1 P2 P3 col_transitivity_1 not_col_permutation_2 out_col) + have P5: "Cong A C A' C''" + proof - + have Q1: "B Out A A" + using assms(1) not_col_distincts out_trivial by auto + have Q2: "B Out C C" + using assms(1) col_trivial_2 out_trivial by force + have Q3: "B' Out A' A'" + using P3 not_col_distincts out_trivial by auto + have Q5: "Cong B A B' A'" + using assms(4) not_cong_2143 by blast + have "Cong B C B' C''" + using P1 not_cong_3412 by blast + thus ?thesis + using l11_4_1 P1 Q1 Q2 Q3 Q5 assms(3) by blast + qed + have P6: "B A C Cong3 B' A' C''" + using Cong3_def Cong_perm P1 P5 assms(4) by blast + have P7: "B A C CongA B' A' C''" + by (metis P6 assms(1) cong3_conga not_col_distincts) + have P8: "B' A' C' CongA B' A' C''" + by (meson P7 assms(2) conga_sym conga_trans) + have "B' A' OS C' C''" + using Col_perm Out_cases P1 P3 out_one_side by blast + then have "A' Out C' C''" + using P8 conga_os__out by auto + then have "Col A' C' C''" + using out_col by auto + then have P9: "C' = C''" + using Col_perm P1 out_col P3 col_transitivity_1 by blast + have T1: "Cong A C A' C'" + by (simp add: P5 P9) + have T2: "Cong B C B' C'" + using Cong_perm P1 P9 by blast + then have "A C B CongA A' C' B'" + using T1 assms(1) assms(2) assms(4) col_trivial_2 l11_49 by blast + thus ?thesis using T1 T2 by blast +qed + +lemma l11_50_2: + assumes "\ Col A B C" and + "B C A CongA B' C' A'" and + "A B C CongA A' B' C'" and + "Cong A B A' B'" + shows "Cong A C A' C' \ Cong B C B' C' \ C A B CongA C' A' B'" +proof - + have P1: "A \ B" + using assms(1) col_trivial_1 by auto + have P2: "B \ C" + using assms(1) col_trivial_2 by auto + have P3: "A' \ B'" + using P1 assms(4) cong_diff by blast + have P4: "B' \ C'" + using assms(2) conga_diff45 by auto + then obtain C'' where P5: "B' Out C'' C' \ Cong B' C'' B C" + using P2 l6_11_existence by presburger + have P5BIS: "B' \ C''" + using P5 out_diff1 by auto + have P5A: "Col B' C'' C'" + using P5 out_col by auto + have P6: "\ Col A' B' C'" + using assms(1) assms(3) ncol_conga_ncol by blast + { + assume "Col A' B' C''" + then have "Col B' C'' A'" + using not_col_permutation_2 by blast + then have "Col B' C' A'" using col_transitivity_1 P5BIS P5A by blast + then have "Col A' B' C'" + using Col_perm by blast + then have False + using P6 by auto + } + then have P7: "\ Col A' B' C''" by blast + have P8: "Cong A C A' C''" + proof - + have "B Out A A" + by (simp add: P1 out_trivial) + have K1: "B Out C C" + using P2 out_trivial by auto + have K2: "B' Out A' A'" + using P3 out_trivial by auto + have "Cong B A B' A'" + by (simp add: Cong_perm assms(4)) + have "Cong B C B' C''" + using Cong_perm P5 by blast + thus ?thesis + using P5 \Cong B A B' A'\ P1 out_trivial K1 K2 assms(3) l11_4_1 by blast + qed + have P9: "B C A Cong3 B' C'' A'" + using Cong3_def Cong_perm P5 P8 assms(4) by blast + then have P10: "B C A CongA B' C'' A'" + using assms(1) cong3_conga not_col_distincts by auto + have P11: "B' C' A' CongA B' C'' A'" + using P9 assms(2) cong3_conga2 conga_sym by blast + show ?thesis + proof cases + assume L1: "C' = C''" + then have L2: "Cong A C A' C'" + by (simp add: P8) + have L3: "Cong B C B' C'" + using Cong_perm L1 P5 by blast + have "C A B Cong3 C' A' B'" + by (simp add: L1 P9 cong_3_swap 
cong_3_swap_2) + then have "C A B CongA C' A' B'" + by (metis CongA_def P1 assms(2) cong3_conga) + thus ?thesis using L2 L3 by auto + next + assume R1: "C' \ C''" + have R1A: "\ Col C'' C' A'" + by (metis P5A P7 R1 col_permutation_2 col_trivial_2 colx) + have R1B: "Bet B' C'' C' \ Bet B' C' C''" + using Out_def P5 by auto + { + assume S1: "Bet B' C'' C'" + then have S2: "C'' A' C' LtA A' C'' B' \ C'' C' A' LtA A' C'' B'" + using P5BIS R1A between_symmetry l11_41 by blast + have "B' C' A' CongA C'' C' A'" + by (metis P11 R1 Tarski_neutral_dimensionless.conga_comm Tarski_neutral_dimensionless_axioms S1 bet_out_1 conga_diff45 not_conga_sym out2__conga out_trivial) + then have "B' C' A' LtA A' C'' B'" + by (meson P11 Tarski_neutral_dimensionless.conga_right_comm Tarski_neutral_dimensionless.not_conga Tarski_neutral_dimensionless.not_conga_sym Tarski_neutral_dimensionless_axioms S2 not_lta_and_conga) + then have "Cong A C A' C' \ Cong B C B' C'" + by (meson P11 Tarski_neutral_dimensionless.conga_right_comm Tarski_neutral_dimensionless_axioms not_lta_and_conga) + } + { + assume Z1: "Bet B' C' C''" + have Z2: "\ Col C' C'' A'" + by (simp add: R1A not_col_permutation_4) + have Z3: "C'' Out C' B'" + by (simp add: R1 Z1 bet_out_1) + have Z4: "C'' Out A' A'" + using P7 not_col_distincts out_trivial by blast + then have Z4A: "B' C'' A' CongA C' C'' A'" + by (simp add: Z3 out2__conga) + have Z4B: "B' C'' A' LtA A' C' B'" + proof - + have Z5: "C' C'' A' CongA B' C'' A'" + using Z4A not_conga_sym by blast + have Z6: "A' C' B' CongA A' C' B'" + using P11 P4 conga_diff2 conga_refl by blast + have "C' C'' A' LtA A' C' B'" + using P4 Z1 Z2 between_symmetry l11_41 by blast + thus ?thesis + using Z5 Z6 conga_preserves_lta by auto + qed + have "B' C'' A' CongA B' C' A'" + using P11 not_conga_sym by blast + then have "Cong A C A' C' \ Cong B C B' C'" + by (meson Tarski_neutral_dimensionless.conga_right_comm Tarski_neutral_dimensionless_axioms Z4B not_lta_and_conga) + } + then have R2: "Cong A C A' C' \ Cong B C B' C'" + using R1B \Bet B' C'' C' \ Cong A C A' C' \ Cong B C B' C'\ by blast + then have "C A B CongA C' A' B'" + using P1 assms(2) l11_49 not_cong_2143 by blast + thus ?thesis using R2 by auto + qed +qed + +lemma l11_51: + assumes "A \ B" and + "A \ C" and + "B \ C" and + "Cong A B A' B'" and + "Cong A C A' C'" and + "Cong B C B' C'" + shows + "B A C CongA B' A' C' \ A B C CongA A' B' C' \ B C A CongA B' C' A'" +proof - + have "B A C Cong3 B' A' C' \ A B C Cong3 A' B' C' \ B C A Cong3 B' C' A'" + using Cong3_def Cong_perm assms(4) assms(5) assms(6) by blast + thus ?thesis + using assms(1) assms(2) assms(3) cong3_conga by auto +qed + +lemma conga_distinct: + assumes "A B C CongA D E F" + shows "A \ B \ C \ B \ D \ E \ F \ E" + using CongA_def assms by auto + +lemma l11_52: + assumes "A B C CongA A' B' C'" and + "Cong A C A' C'" and + "Cong B C B' C'" and + "B C Le A C" + shows "Cong B A B' A' \ B A C CongA B' A' C' \ B C A CongA B' C' A'" +proof - + have P1: "A \ B" + using CongA_def assms(1) by blast + have P2: "C \ B" + using CongA_def assms(1) by blast + have P3: "A' \ B'" + using CongA_def assms(1) by blast + have P4: "C' \ B'" + using assms(1) conga_diff56 by auto + have P5: "Cong B A B' A'" + proof cases + assume P6: "Col A B C" + then have P7: "Bet A B C \ Bet B C A \ Bet C A B" + using Col_def by blast + { + assume P8: "Bet A B C" + then have "Bet A' B' C'" + using assms(1) bet_conga__bet by blast + then have "Cong B A B' A'" + using P8 assms(2) assms(3) l4_3 not_cong_2143 by blast + } + { + 
assume P9: "Bet B C A" + then have P10: "B' Out A' C'" + using Out_cases P2 assms(1) bet_out l11_21_a by blast + then have P11: "Bet B' A' C' \ Bet B' C' A'" + by (simp add: Out_def) + { + assume "Bet B' A' C'" + then have "Cong B A B' A'" + using P3 assms(2) assms(3) assms(4) bet_le_eq l5_6 by blast + } + { + assume "Bet B' C' A'" + then have "Cong B A B' A'" + using Cong_perm P9 assms(2) assms(3) l2_11_b by blast + } + then have "Cong B A B' A'" + using P11 \Bet B' A' C' \ Cong B A B' A'\ by blast + } + { + assume "Bet C A B" + then have "Cong B A B' A'" + using P1 assms(4) bet_le_eq between_symmetry by blast + } + thus ?thesis + using P7 \Bet A B C \ Cong B A B' A'\ \Bet B C A \ Cong B A B' A'\ by blast + next + assume Z1: "\ Col A B C" + obtain A'' where Z2: "B' Out A'' A' \ Cong B' A'' B A" + using P1 P3 l6_11_existence by force + then have Z3: "A' B' C' CongA A'' B' C'" + by (simp add: P4 out2__conga out_trivial) + have Z4: "A B C CongA A'' B' C'" + using Z3 assms(1) not_conga by blast + have Z5: "Cong A'' C' A C" + using Z2 Z4 assms(3) cong2_conga_cong cong_4321 cong_symmetry by blast + have Z6: "A'' B' C' Cong3 A B C" + using Cong3_def Cong_perm Z2 Z5 assms(3) by blast + have Z7: "Cong A'' C' A' C'" + using Z5 assms(2) cong_transitivity by blast + have Z8: "\ Col A' B' C'" + by (metis Z1 assms(1) ncol_conga_ncol) + then have Z9: "\ Col A'' B' C'" + by (metis Z2 col_transitivity_1 not_col_permutation_4 out_col out_diff1) + { + assume Z9A: "A'' \ A'" + have Z10: "Bet B' A'' A' \ Bet B' A' A''" + using Out_def Z2 by auto + { + assume Z11: "Bet B' A'' A'" + have Z12: "A'' C' B' LtA C' A'' A' \ A'' B' C' LtA C' A'' A'" + by (simp add: Z11 Z9 Z9A l11_41) + have Z13: "Cong A' C' A'' C'" + using Cong_perm Z7 by blast + have Z14: "\ Col A'' C' A'" + by (metis Col_def Z11 Z9 Z9A col_transitivity_1) + have Z15: "C' A'' A' CongA C' A' A'' \ Cong C' A'' C' A'" + by (simp add: Z14 l11_44_1) + have Z16: "Cong C' A' C' A''" + using Cong_perm Z7 by blast + then have Z17: "Cong C' A'' C' A'" + using Cong_perm by blast + then have Z18: "C' A'' A' CongA C' A' A''" + by (simp add: Z15) + have Z19: "\ Col B' C' A''" + using Col_perm Z9 by blast + have Z20: "B' A' C' CongA A'' A' C'" + by (metis Tarski_neutral_dimensionless.col_conga_col Tarski_neutral_dimensionless_axioms Z11 Z3 Z9 Z9A bet_out_1 col_trivial_3 out2__conga out_trivial) + have Z21: "\ Col B' C' A'" + using Col_perm Z8 by blast + then have Z22: "C' B' A' LtA C' A' B' \ C' A' Lt C' B'" + by (simp add: l11_44_2) + have "A'' B' C' CongA C' B' A'" + using Z3 conga_right_comm not_conga_sym by blast + then have U1: "C' B' A' LtA C' A' B'" + proof - + have f1: "\p pa pb pc pd pe pf pg ph pi pj pk pl pm. \ Tarski_neutral_dimensionless p pa \ \ Tarski_neutral_dimensionless.CongA p pa (pb::'p) pc pd pe pf pg \ \ Tarski_neutral_dimensionless.CongA p pa ph pi pj pk pl pm \ \ Tarski_neutral_dimensionless.LtA p pa pb pc pd ph pi pj \ Tarski_neutral_dimensionless.LtA p pa pe pf pg pk pl pm" + by (simp add: Tarski_neutral_dimensionless.conga_preserves_lta) + have f2: "C' A'' A' CongA C' A' A''" + by (metis Z15 Z17) + have f3: "\p pa pb pc pd pe pf pg. 
\ Tarski_neutral_dimensionless p pa \ \ Tarski_neutral_dimensionless.CongA p pa (pb::'p) pc pd pe pf pg \ Tarski_neutral_dimensionless.CongA p pa pe pf pg pb pc pd" + by (metis (no_types) Tarski_neutral_dimensionless.conga_sym) + then have "\ C' B' A' LtA C' A'' A' \ A'' B' C' LtA C' A' A''" + using f2 f1 by (meson Tarski_neutral_dimensionless_axioms \A'' B' C' CongA C' B' A'\) + then have "C' B' A' LtA C' A' B' \ A'' B' C' LtA A'' A' C' \ A'' = B'" + using f2 f1 by (metis (no_types) Tarski_neutral_dimensionless.conga_refl Tarski_neutral_dimensionless_axioms Z12 \A'' B' C' CongA C' B' A'\ lta_right_comm) + thus ?thesis + using f3 f2 f1 by (metis (no_types) Tarski_neutral_dimensionless_axioms Z12 Z20 \A'' B' C' CongA C' B' A'\ lta_right_comm) + qed + then have Z23: "C' A' Lt C' B'" + using Z22 by auto + have Z24: "C' A'' Lt C' B'" + using Z16 Z23 cong2_lt__lt cong_reflexivity by blast + have Z25: "C A Le C B" + proof - + have Z26: "Cong C' A'' C A" + using Z5 not_cong_2143 by blast + have "Cong C' B' C B" + using assms(3) not_cong_4321 by blast + thus ?thesis + using l5_6 Z24 Z26 lt__le by blast + qed + then have Z27: "Cong C A C B" + by (simp add: assms(4) le_anti_symmetry le_comm) + have "Cong C' A'' C' B'" + by (metis Cong_perm Z13 Z27 assms(2) assms(3) cong_transitivity) + then have "False" + using Z24 cong__nlt by blast + then have "Cong B A B' A'" by simp + } + { + assume W1: "Bet B' A' A''" + have W2: "A' \ A''" + using Z9A by auto + have W3: "A' C' B' LtA C' A' A'' \ A' B' C' LtA C' A' A''" + using W1 Z8 Z9A l11_41 by blast + have W4: "Cong A' C' A'' C'" + using Z7 not_cong_3412 by blast + have "\ Col A'' C' A'" + by (metis Col_def W1 Z8 Z9A col_transitivity_1) + then have W6: "C' A'' A' CongA C' A' A'' \ Cong C' A'' C' A'" + using l11_44_1 by auto + have W7: "Cong C' A' C' A''" + using Z7 not_cong_4321 by blast + then have W8: "Cong C' A'' C' A'" + using W4 not_cong_4321 by blast + have W9: "\ Col B' C' A''" + by (simp add: Z9 not_col_permutation_1) + have W10: "B' A'' C' CongA A' A'' C'" + by (metis Tarski_neutral_dimensionless.Out_def Tarski_neutral_dimensionless_axioms W1 Z9 Z9A bet_out_1 between_trivial not_col_distincts out2__conga) + have W12: "C' B' A'' LtA C' A'' B' \ C' A'' Lt C' B'" + by (simp add: W9 l11_44_2) + have W12A: "C' B' A'' LtA C' A'' B'" + proof - + have V1: "A' B' C' CongA C' B' A''" + by (simp add: Z3 conga_right_comm) + have "A' A'' C' CongA B' A'' C'" + by (metis Tarski_neutral_dimensionless.Out_def Tarski_neutral_dimensionless_axioms W1 \\ Col A'' C' A'\ between_equality_2 not_col_distincts or_bet_out out2__conga out_col) + then have "C' A' A'' CongA C' A'' B'" + by (meson W6 W8 conga_left_comm not_conga not_conga_sym) + thus ?thesis + using W3 V1 conga_preserves_lta by auto + qed + then have "C' A'' Lt C' B'" using W12 by auto + then have W14: "C' A' Lt C' B'" + using W8 cong2_lt__lt cong_reflexivity by blast + have W15: "C A Le C B" + proof - + have Q1: "C' A'' Le C' B'" + using W12 W12A lt__le by blast + have Q2: "Cong C' A'' C A" + using Z5 not_cong_2143 by blast + have "Cong C' B' C B" + using assms(3) not_cong_4321 by blast + thus ?thesis using Q1 Q2 l5_6 by blast + qed + have "C B Le C A" + by (simp add: assms(4) le_comm) + then have "Cong C A C B" + by (simp add: W15 le_anti_symmetry) + then have "Cong C' A' C' B'" + by (metis Cong_perm assms(2) assms(3) cong_inner_transitivity) + then have "False" + using W14 cong__nlt by blast + then have "Cong B A B' A'" by simp + } + then have "Cong B A B' A'" + using Z10 \Bet B' A'' A' \ Cong B A B' A'\ by 
blast + } + { + assume "A'' = A'" + then have "Cong B A B' A'" + using Z2 not_cong_3412 by blast + } + thus ?thesis + using \A'' \ A' \ Cong B A B' A'\ by blast + qed + have P6: "A B C Cong3 A' B' C'" + using Cong3_def Cong_perm P5 assms(2) assms(3) by blast + thus ?thesis + using P2 P5 assms(1) assms(3) assms(4) l11_49 le_zero by blast +qed + +lemma l11_53: + assumes "Per D C B" and + "C \ D" and + "A \ B" and + "B \ C" and + "Bet A B C" + shows "C A D LtA C B D \ B D Lt A D" +proof - + have P1: "C \ A" + using assms(3) assms(5) between_identity by blast + have P2: "\ Col B A D" + by (smt assms(1) assms(2) assms(3) assms(4) assms(5) bet_col bet_col1 col3 col_permutation_4 l8_9) + have P3: "A \ D" + using P2 col_trivial_2 by blast + have P4: "C A D LtA C B D" + proof - + have P4A: "B D A LtA D B C \ B A D LtA D B C" + by (simp add: P2 assms(4) assms(5) l11_41) + have P4AA:"A Out B C" + using assms(3) assms(5) bet_out by auto + have "A Out D D" + using P3 out_trivial by auto + then have P4B: "C A D CongA B A D" using P4AA + by (simp add: out2__conga) + then have P4C: "B A D CongA C A D" + by (simp add: P4B conga_sym) + have "D B C CongA C B D" + using assms(1) assms(4) conga_pseudo_refl per_distinct_1 by auto + thus ?thesis + using P4A P4C conga_preserves_lta by blast + qed + obtain B' where P5: "C Midpoint B B' \ Cong D B D B'" + using Per_def assms(1) by auto + have K2: "A \ B'" + using Bet_cases P5 assms(4) assms(5) between_equality_2 midpoint_bet by blast + { + assume "Col B D B'" + then have "Col B A D" + by (metis Col_cases P5 assms(1) assms(2) assms(4) col2__eq midpoint_col midpoint_distinct_2 per_not_col) + then have "False" + by (simp add: P2) + } + then have P6: "\ Col B D B'" by blast + then have "D B B' CongA D B' B \ Cong D B D B'" + by (simp add: l11_44_1) + then have "D B B' CongA D B' B" using P5 by simp + { + assume K1: "Col A D B'" + have "Col B' A B" + using Col_def P5 assms(4) assms(5) midpoint_bet outer_transitivity_between by blast + then have "Col B' B D" + using K1 K2 Col_perm col_transitivity_2 by blast + then have "Col B D B'" + using Col_perm by blast + then have "False" + by (simp add: P6) + } + then have K3B: "\ Col A D B'" by blast + then have K4: "D A B' LtA D B' A \ D B' Lt D A" + by (simp add: l11_44_2) + have K4A: "C A D LtA C B' D" + by (metis Midpoint_def P1 P3 P4 P5 P5 P6 assms(2) assms(4) col_trivial_1 cong_reflexivity conga_preserves_lta conga_refl l11_51 not_cong_2134) + have "D B' Lt D A" + proof - + have "D A B' LtA D B' A" + proof - + have K5A: "A Out D D" + using P3 out_trivial by auto + have K5AA: "A Out B' C" + by (smt K2 Out_def P1 P5 assms(4) assms(5) midpoint_bet outer_transitivity_between2) + then have K5: "D A C CongA D A B'" + by (simp add: K5A out2__conga) + have K6A: "B' Out D D" + using K3B not_col_distincts out_trivial by blast + have "B' Out A C" + by (smt P5 K5AA assms(4) assms(5) between_equality_2 l6_4_2 midpoint_bet midpoint_distinct_2 out_col outer_transitivity_between2) + then have K6: "D B' C CongA D B' A" + by (simp add: K6A out2__conga) + have "D A C LtA D B' C" + by (simp add: K4A lta_comm) + thus ?thesis + using K5 K6 conga_preserves_lta by auto + qed + thus ?thesis + by (simp add: K4) + qed + thus ?thesis + using P4 P5 cong2_lt__lt cong_pseudo_reflexivity not_cong_4312 by blast +qed + +lemma cong2_conga_obtuse__cong_conga2: + assumes "Obtuse A B C" and + "A B C CongA A' B' C'" and + "Cong A C A' C'" and + "Cong B C B' C'" + shows "Cong B A B' A' \ B A C CongA B' A' C' \ +B C A CongA B' C' A'" +proof - + have "B C Le A C" + 
proof cases + assume "Col A B C" + thus ?thesis + by (simp add: assms(1) col_obtuse__bet l5_12_a) + next + assume "\<not> Col A B C" + thus ?thesis + using l11_46 assms(1) lt__le not_col_distincts by auto + qed + thus ?thesis + using l11_52 assms(2) assms(3) assms(4) by blast +qed + +lemma cong2_per2__cong_conga2: + assumes "A \<noteq> B" and + "B \<noteq> C" and + "Per A B C" and + "Per A' B' C'" and + "Cong A C A' C'" and + "Cong B C B' C'" + shows "Cong B A B' A' \<and> B A C CongA B' A' C' \<and> +B C A CongA B' C' A'" +proof - + have P1: "B C Le A C \<and> \<not> Cong B C A C" + using assms(1) assms(2) assms(3) cong__nlt l11_46 lt__le by blast + then have "A B C CongA A' B' C'" + using assms(2) assms(3) assms(4) assms(5) assms(6) cong_diff cong_inner_transitivity cong_symmetry l11_16 by blast + thus ?thesis + using P1 assms(5) assms(6) l11_52 by blast +qed + +lemma cong2_per2__cong: + assumes "Per A B C" and + "Per A' B' C'" and + "Cong A C A' C'" and + "Cong B C B' C'" + shows "Cong B A B' A'" +proof cases + assume "B = C" + thus ?thesis + using assms(3) assms(4) cong_reverse_identity not_cong_2143 by blast +next + assume "B \<noteq> C" + show ?thesis + proof cases + assume "A = B" + thus ?thesis + proof - + have "Cong A C B' C'" + using \<open>A = B\<close> assms(4) by blast + then have "B' = A'" + by (meson Cong3_def Per_perm assms(2) assms(3) cong_inner_transitivity cong_pseudo_reflexivity l8_10 l8_7) + thus ?thesis + using \<open>A = B\<close> cong_trivial_identity by blast + qed + next + assume "A \<noteq> B" + show ?thesis + proof cases + assume "A' = B'" + thus ?thesis + by (metis Cong3_def Per_perm \<open>A \<noteq> B\<close> assms(1) assms(3) assms(4) cong_inner_transitivity cong_pseudo_reflexivity l8_10 l8_7) + next + assume "A' \<noteq> B'" + thus ?thesis + using cong2_per2__cong_conga2 \<open>A \<noteq> B\<close> \<open>B \<noteq> C\<close> assms(1) assms(2) assms(3) assms(4) by blast + qed + qed +qed + +lemma cong2_per2__cong_3: + assumes "Per A B C" + "Per A' B' C'" and + "Cong A C A' C'" and + "Cong B C B' C'" + shows "A B C Cong3 A' B' C'" + by (metis Tarski_neutral_dimensionless.Cong3_def Tarski_neutral_dimensionless_axioms assms(1) assms(2) assms(3) assms(4) cong2_per2__cong cong_3_swap) + +lemma cong_lt_per2__lt: + assumes "Per A B C" and + "Per A' B' C'" and + "Cong A B A' B'" and + "B C Lt B' C'" + shows "A C Lt A' C'" +proof cases + assume "A = B" + thus ?thesis + using assms(3) assms(4) cong_reverse_identity by blast +next + assume "A \<noteq> B" + show ?thesis + proof cases + assume "B = C" + thus ?thesis + by (smt assms(2) assms(3) assms(4) cong2_lt__lt cong_4312 cong_diff cong_reflexivity l11_46 lt_diff) + next + assume P0: "B \<noteq> C" + have "B C Lt B' C'" + by (simp add: assms(4)) + then have R1: "B C Le B' C' \<and> \<not> Cong B C B' C'" + by (simp add: Lt_def) + then obtain C0 where P1: "Bet B' C0 C' \<and> Cong B C B' C0" + using Le_def by auto + then have P2: "Per A' B' C0" + by (metis Col_def Per_cases assms(2) bet_out_1 col_col_per_per col_trivial_1 l8_5 out_diff2) + have "C0 A' Lt C' A'" using l11_53 + by (metis P1 P2 R1 P0 bet__lt2313 between_symmetry cong_diff) + then have P3: "A' C0 Lt A' C'" + using Lt_cases by blast + have P4: "Cong A' C0 A C" + using P1 P2 assms(1) assms(3) l10_12 not_cong_3412 by blast + have "Cong A' C' A' C'" + by (simp add: cong_reflexivity) + thus ?thesis + using cong2_lt__lt P3 P4 by blast + qed +qed + +lemma cong_le_per2__le: + assumes "Per A B C" and + "Per A' B' C'" and + "Cong A B A' B'" and + "B C Le B' C'" + shows "A C Le A' C'" +proof cases + assume "Cong B C B' C'" + thus ?thesis + using assms(1) assms(2) assms(3) cong__le l10_12 by blast +next + assume "\<not> Cong B C B' C'" + then 
have "B C Lt B' C'" + using Lt_def assms(4) by blast + thus ?thesis + using assms(1) assms(2) assms(3) cong_lt_per2__lt lt__le by auto +qed + +lemma lt2_per2__lt: + assumes "Per A B C" and + "Per A' B' C'" and + "A B Lt A' B'" and + "B C Lt B' C'" + shows "A C Lt A' C'" +proof - + have P2: "B A Lt B' A'" + by (simp add: assms(3) lt_comm) + have P3: "B C Le B' C' \<and> \<not> Cong B C B' C'" + using assms(4) cong__nlt lt__le by auto + then obtain C0 where P4: "Bet B' C0 C' \<and> Cong B C B' C0" + using Le_def by auto + have P4A: "B' \<noteq> C'" + using assms(4) lt_diff by auto + have "Col B' C' C0" + using P4 bet_col not_col_permutation_5 by blast + then have P5: "Per A' B' C0" + using assms(2) P4A per_col by blast + have P6: "A C Lt A' C0" + by (meson P2 P4 P5 assms(1) cong_lt_per2__lt l8_2 lt_comm not_cong_2143) + have "B' C0 Lt B' C'" + by (metis P4 assms(4) bet__lt1213 cong__nlt) + then have "A' C0 Lt A' C'" + using P5 assms(2) cong_lt_per2__lt cong_reflexivity by blast + thus ?thesis + using P6 lt_transitivity by blast +qed + +lemma le_lt_per2__lt: + assumes "Per A B C" and + "Per A' B' C'" and + "A B Le A' B'" and + "B C Lt B' C'" + shows "A C Lt A' C'" + using Lt_def assms(1) assms(2) assms(3) assms(4) cong_lt_per2__lt lt2_per2__lt by blast + +lemma le2_per2__le: + assumes "Per A B C" and + "Per A' B' C'" and + "A B Le A' B'" and + "B C Le B' C'" + shows "A C Le A' C'" +proof cases + assume "Cong B C B' C'" + thus ?thesis + by (meson Per_cases Tarski_neutral_dimensionless.cong_le_per2__le Tarski_neutral_dimensionless_axioms assms(1) assms(2) assms(3) le_comm not_cong_2143) +next + assume "\<not> Cong B C B' C'" + then have "B C Lt B' C'" + by (simp add: Lt_def assms(4)) + thus ?thesis + using assms(1) assms(2) assms(3) le_lt_per2__lt lt__le by blast +qed + +lemma cong_lt_per2__lt_1: + assumes "Per A B C" and + "Per A' B' C'" and + "A B Lt A' B'" and + "Cong A C A' C'" + shows "B' C' Lt B C" + by (meson Gt_def assms(1) assms(2) assms(3) assms(4) cong2_per2__cong cong_4321 cong__nlt cong_symmetry lt2_per2__lt or_lt_cong_gt) + +lemma symmetry_preserves_conga: + assumes "A \<noteq> B" and "C \<noteq> B" and + "M Midpoint A A'" and + "M Midpoint B B'" and + "M Midpoint C C'" + shows "A B C CongA A' B' C'" + by (metis Mid_perm assms(1) assms(2) assms(3) assms(4) assms(5) conga_trivial_1 l11_51 l7_13 symmetric_point_uniqueness) + +lemma l11_57: + assumes "A A' OS B B'" and + "Per B A A'" and + "Per B' A' A" and + "A A' OS C C'" and + "Per C A A'" and + "Per C' A' A" + shows "B A C CongA B' A' C'" +proof - + obtain M where P1: "M Midpoint A A'" + using midpoint_existence by auto + obtain B'' where P2: "M Midpoint B B''" + using symmetric_point_construction by auto + obtain C'' where P3: "M Midpoint C C''" + using symmetric_point_construction by auto + have P4: "\<not> Col A A' B" + using assms(1) col123__nos by auto + have P5: "\<not> Col A A' C" + using assms(4) col123__nos by auto + have P6: "B A C CongA B'' A' C''" + by (metis P1 P2 P3 assms(1) assms(4) os_distincts symmetry_preserves_conga) + have "B'' A' C'' CongA B' A' C'" + proof - + have "B \<noteq> M" + using P1 P4 midpoint_col not_col_permutation_2 by blast + then have P7: "\<not> Col B'' A A'" + using Mid_cases P1 P2 P4 mid_preserves_col not_col_permutation_3 by blast + have K3: "Bet B'' A' B'" + proof - + have "Per B'' A' A" + using P1 P2 assms(2) per_mid_per by blast + have "Col B B'' M \<and> Col A A' M" + using P1 P2 midpoint_col not_col_permutation_2 by blast + then have "Coplanar B A A' B''" + using Coplanar_def by auto + then have "Coplanar A B' B'' A'" + by (meson assms(1) 
between_trivial2 coplanar_trans_1 ncoplanar_perm_4 ncoplanar_perm_8 one_side_chara os__coplanar) + then have P8: "Col B' B'' A'" + using cop_per2__col P1 P2 P7 assms(2) assms(3) not_col_distincts per_mid_per by blast + have "A A' TS B B''" + using P1 P2 P4 mid_two_sides by auto + then have "A' A TS B'' B'" + using assms(1) invert_two_sides l9_2 l9_8_2 by blast + thus ?thesis + using Col_cases P8 col_two_sides_bet by blast + qed + have "\ Col C'' A A'" + by (smt Col_def P1 P3 P5 l7_15 l7_2 not_col_permutation_5) + have "Bet C'' A' C'" + proof - + have Z2: "Col C' C'' A'" + proof - + have "Col C C'' M \ Col A A' M" + using P1 P3 col_permutation_1 midpoint_col by blast + then have "Coplanar C A A' C''" + using Coplanar_def by blast + then have Z1: "Coplanar A C' C'' A'" + by (meson assms(4) between_trivial2 coplanar_trans_1 ncoplanar_perm_4 ncoplanar_perm_8 one_side_chara os__coplanar) + have "Per C'' A' A" + using P1 P3 assms(5) per_mid_per by blast + thus ?thesis + using Z1 P5 assms(6) col_trivial_1 cop_per2__col by blast + qed + have "A A' TS C C''" + using P1 P3 P5 mid_two_sides by auto + then have "A' A TS C'' C'" + using assms(4) invert_two_sides l9_2 l9_8_2 by blast + thus ?thesis + using Col_cases Z2 col_two_sides_bet by blast + qed + thus ?thesis + by (metis P6 K3 assms(1) assms(4) conga_diff45 conga_diff56 l11_14 os_distincts) + qed + thus ?thesis + using P6 conga_trans by blast +qed + +lemma cop3_orth_at__orth_at: + assumes "\ Col D E F" and + "Coplanar A B C D" and + "Coplanar A B C E" and + "Coplanar A B C F" and + "X OrthAt A B C U V" + shows "X OrthAt D E F U V" +proof - + have P1: "\ Col A B C \ Coplanar A B C X" + using OrthAt_def assms(5) by blast + then have P2: "Coplanar D E F X" + using assms(2) assms(3) assms(4) coplanar_pseudo_trans by blast + { + fix M + assume "Coplanar A B C M" + then have "Coplanar D E F M" + using P1 assms(2) assms(3) assms(4) coplanar_pseudo_trans by blast + } + have T1: "U \ V" + using OrthAt_def assms(5) by blast + have T2: "Col U V X" + using OrthAt_def assms(5) by auto + { + fix P Q + assume P7: "Coplanar D E F P \ Col U V Q" + then have "Coplanar A B C P" + by (meson \\M. 
Coplanar A B C M \ Coplanar D E F M\ assms(1) assms(2) assms(3) assms(4) l9_30) + then have "Per P X Q" using P7 OrthAt_def assms(5) by blast + } + thus ?thesis using assms(1) + by (simp add: OrthAt_def P2 T1 T2) +qed + +lemma col2_orth_at__orth_at: + assumes "U \ V" and + "Col P Q U" and + "Col P Q V" and + "X OrthAt A B C P Q" + shows "X OrthAt A B C U V" +proof - + have "Col P Q X" + using OrthAt_def assms(4) by auto + then have "Col U V X" + by (metis OrthAt_def assms(2) assms(3) assms(4) col3) + thus ?thesis + using OrthAt_def assms(1) assms(2) assms(3) assms(4) colx by presburger +qed + +lemma col_orth_at__orth_at: + assumes "U \ W" and + "Col U V W" and + "X OrthAt A B C U V" + shows "X OrthAt A B C U W" + using assms(1) assms(2) assms(3) col2_orth_at__orth_at col_trivial_3 by blast + +lemma orth_at_symmetry: + assumes "X OrthAt A B C U V" + shows "X OrthAt A B C V U" + by (metis assms col2_orth_at__orth_at col_trivial_2 col_trivial_3) + +lemma orth_at_distincts: + assumes "X OrthAt A B C U V" + shows "A \ B \ B \ C \ A \ C \ U \ V" + using OrthAt_def assms not_col_distincts by fastforce + +lemma orth_at_chara: + "X OrthAt A B C X P \ + (\ Col A B C \ X \ P \ Coplanar A B C X \ (\ D.(Coplanar A B C D \ Per D X P)))" +proof - + { + assume "X OrthAt A B C X P" + then have "\ Col A B C \ X \ P \ Coplanar A B C X \ (\ D.(Coplanar A B C D \ Per D X P))" + using OrthAt_def col_trivial_2 by auto + } + { + assume T1: "\ Col A B C \ X \ P \ Coplanar A B C X \ (\ D.(Coplanar A B C D \ Per D X P))" + { + fix P0 Q + assume "Coplanar A B C P0 \ Col X P Q" + then have "Per P0 X Q" using T1 OrthAt_def per_col by auto + } + then have "X OrthAt A B C X P" + by (simp add: T1 \\Q P0. Coplanar A B C P0 \ Col X P Q \ Per P0 X Q\ Tarski_neutral_dimensionless.OrthAt_def Tarski_neutral_dimensionless_axioms col_trivial_3) + } + thus ?thesis + using \X OrthAt A B C X P \ \ Col A B C \ X \ P \ Coplanar A B C X \ (\D. Coplanar A B C D \ Per D X P)\ by blast +qed + +lemma cop3_orth__orth: + assumes "\ Col D E F" and + "Coplanar A B C D" and + "Coplanar A B C E" and + "Coplanar A B C F" and + "A B C Orth U V" + shows "D E F Orth U V" + using Orth_def assms(1) assms(2) assms(3) assms(4) assms(5) cop3_orth_at__orth_at by blast + +lemma col2_orth__orth: + assumes "U \ V" and + "Col P Q U" and + "Col P Q V" and + "A B C Orth P Q" + shows "A B C Orth U V" + by (meson Orth_def Tarski_neutral_dimensionless.col2_orth_at__orth_at Tarski_neutral_dimensionless_axioms assms(1) assms(2) assms(3) assms(4)) + +lemma col_orth__orth: + assumes "U \ W" and + "Col U V W" and + "A B C Orth U V" + shows "A B C Orth U W" + by (meson assms(1) assms(2) assms(3) col2_orth__orth col_trivial_3) + +lemma orth_symmetry: + assumes "A B C Orth U V" + shows "A B C Orth V U" + by (meson Orth_def assms orth_at_symmetry) + +lemma orth_distincts: + assumes "A B C Orth U V" + shows "A \ B \ B \ C \ A \ C \ U \ V" + using Orth_def assms orth_at_distincts by blast + +lemma col_cop_orth__orth_at: + assumes "A B C Orth U V" and + "Coplanar A B C X" and + "Col U V X" + shows "X OrthAt A B C U V" +proof - + obtain Y where P1: + "\ Col A B C \ U \ V \ Coplanar A B C Y \ Col U V Y \ +(\ P Q. 
(Coplanar A B C P \ Col U V Q) \ Per P Y Q)" + by (metis OrthAt_def Tarski_neutral_dimensionless.Orth_def Tarski_neutral_dimensionless_axioms assms(1)) + then have P2: "X = Y" + using assms(2) assms(3) per_distinct_1 by blast + { + fix P Q + assume "Coplanar A B C P \ Col U V Q" + then have "Per P X Q" using P1 P2 by auto + } + thus ?thesis + using OrthAt_def Orth_def assms(1) assms(2) assms(3) by auto +qed + +lemma l11_60_aux: + assumes "\ Col A B C" and + "Cong A P A Q" and + "Cong B P B Q" and + "Cong C P C Q" and + "Coplanar A B C D" + shows "Cong D P D Q" +proof - + obtain M where P1: "Bet P M Q \ Cong P M M Q" + by (meson Midpoint_def Tarski_neutral_dimensionless.midpoint_existence Tarski_neutral_dimensionless_axioms) + obtain X where P2: " (Col A B X \ Col C D X) \ + (Col A C X \ Col B D X) \ + (Col A D X \ Col B C X)" + using assms(5) Coplanar_def by auto + { + assume "Col A B X \ Col C D X" + then have "Cong D P D Q" + by (metis (no_types, lifting) assms(1) assms(2) assms(3) assms(4) l4_17 not_col_distincts not_col_permutation_5) + } + { + assume "Col A C X \ Col B D X" + then have "Cong D P D Q" + by (metis (no_types, lifting) assms(1) assms(2) assms(3) assms(4) l4_17 not_col_distincts not_col_permutation_5) + } + { + assume "Col A D X \ Col B C X" + then have "Cong D P D Q" + by (smt assms(1) assms(2) assms(3) assms(4) l4_17 not_col_distincts not_col_permutation_1) + } + thus ?thesis + using P2 \Col A B X \ Col C D X \ Cong D P D Q\ \Col A C X \ Col B D X \ Cong D P D Q\ by blast +qed + +lemma l11_60: + assumes "\ Col A B C" and + "Per A D P" and + "Per B D P" and + "Per C D P" and + "Coplanar A B C E" + shows "Per E D P" + by (meson Per_def assms(1) assms(2) assms(3) assms(4) assms(5) l11_60_aux per_double_cong) + +lemma l11_60_bis: + assumes "\ Col A B C" and + "D \ P" and + "Coplanar A B C D" and + "Per A D P" and + "Per B D P" and + "Per C D P" + shows "D OrthAt A B C D P" + using assms(1) assms(2) assms(3) assms(4) assms(5) assms(6) l11_60 orth_at_chara by auto + +lemma l11_61: + assumes "A \ A'" and + "A \ B" and + "A \ C" and + "Coplanar A A' B B'" and + "Per B A A'" and + "Per B' A' A" and + "Coplanar A A' C C'" and + "Per C A A'" and + "Per B A C" + shows "Per B' A' C'" +proof - + have P1: "\ Col C A A'" + using assms(1) assms(3) assms(8) per_col_eq by blast + obtain C'' where P2: "A A' Perp C'' A' \ A A' OS C C''" using l10_15 + using Col_perm P1 col_trivial_2 by blast + have P6: "B' \ A" + using assms(1) assms(6) per_distinct by blast + have P8: "\ Col A' A C''" + using P2 not_col_permutation_4 one_side_not_col124 by blast + have P9: "Per A' A' B'" + by (simp add: l8_2 l8_5) + have P10: "Per A A' B'" + by (simp add: assms(6) l8_2) + { + fix B' + assume "A A' OS B B' \ Per B' A' A" + then have "B A C CongA B' A' C''" using l11_17 + by (meson P2 Perp_cases Tarski_neutral_dimensionless.l11_57 Tarski_neutral_dimensionless_axioms assms(5) assms(8) perp_per_1) + then have "Per B' A' C''" + using assms(9) l11_17 by blast + } + then have Q1: "\ B'. 
(A A' OS B B' \ Per B' A' A) \ Per B' A' C''" by simp + { + fix B' + assume P12: "Coplanar A A' B B' \ Per B' A' A \ B' \ A" + have "Per B' A' C''" + proof cases + assume "B' = A'" + thus ?thesis + by (simp add: Per_perm l8_5) + next + assume P13: "B' \ A'" + have P14: "\ Col B' A' A" + using P12 P13 assms(1) l8_9 by auto + have P15: "\ Col B A A'" + using assms(1) assms(2) assms(5) per_not_col by auto + then have Z1: "A A' TS B B' \ A A' OS B B'" + using P12 P14 cop__one_or_two_sides not_col_permutation_5 by blast + { + assume "A A' OS B B'" + then have "Per B' A' C''" + by (simp add: P12 \\B'a. A A' OS B B'a \ Per B'a A' A \ Per B'a A' C''\) + } + { + assume Q2: "A A' TS B B'" + obtain B'' where Z2: "Bet B' A' B'' \ Cong A' B'' A' B'" + using segment_construction by blast + have "B' \ B''" + using P13 Z2 bet_neq12__neq by blast + then have Z4: "A' \ B''" + using Z2 cong_diff_4 by blast + then have "A A' TS B'' B'" + by (meson TS_def Z2 Q2 bet__ts invert_two_sides l9_2 not_col_permutation_1) + then have Z5: "A A' OS B B''" + using Q2 l9_8_1 by auto + have "Per B'' A' A" + using P12 P13 Z2 bet_col col_per2__per l8_2 l8_5 by blast + then have "Per C'' A' B''" + using l8_2 Q1 Z5 by blast + then have "Per B' A' C''" + by (metis Col_def Per_perm Tarski_neutral_dimensionless.l8_3 Tarski_neutral_dimensionless_axioms Z2 Z4) + } + thus ?thesis using Z1 + using \A A' OS B B' \ Per B' A' C''\ by blast + qed + } + then have "\ B'. (Coplanar A A' B B' \ Per B' A' A \ B' \ A) \ Per B' A' C''" + by simp + then have "Per B' A' C''" + using P6 assms(4) assms(6) by blast + then have P11: "Per C'' A' B'" + using Per_cases by auto + have "Coplanar A' A C'' C'" + by (meson P1 P2 assms(7) coplanar_trans_1 ncoplanar_perm_6 ncoplanar_perm_8 os__coplanar) + thus ?thesis + using P8 P9 P10 P11 l8_2 l11_60 by blast +qed + +lemma l11_61_bis: + assumes "D OrthAt A B C D P" and + "D E Perp E Q" and + "Coplanar A B C E" and + "Coplanar D E P Q" + shows "E OrthAt A B C E Q" +proof - + have P4: "D \ E" + using assms(2) perp_not_eq_1 by auto + have P5: "E \ Q" + using assms(2) perp_not_eq_2 by auto + have "\ D'. 
(D E Perp D' D \ Coplanar A B C D')" + proof - + obtain F where T1: "Coplanar A B C F \ \ Col D E F" + using P4 ex_ncol_cop by blast + obtain D' where T2: "D E Perp D' D \ Coplanar D E F D'" + using P4 ex_perp_cop by blast + have "Coplanar A B C D'" + proof - + have T3A: "\ Col A B C" + using OrthAt_def assms(1) by auto + have T3B: "Coplanar A B C D" + using OrthAt_def assms(1) by blast + then have T4: "Coplanar D E F A" + by (meson T1 T3A assms(3) coplanar_pseudo_trans ncop_distincts) + have T5: "Coplanar D E F B" + using T1 T3A T3B assms(3) coplanar_pseudo_trans ncop_distincts by blast + have "Coplanar D E F C" + using T1 T3A T3B assms(3) coplanar_pseudo_trans ncop_distincts by blast + thus ?thesis + using T1 T2 T4 T5 coplanar_pseudo_trans by blast + qed + thus ?thesis + using T2 by auto + qed + then obtain D' where R1: "D E Perp D' D \ Coplanar A B C D'" by auto + then have R2: "D \ D'" + using perp_not_eq_2 by blast + { + fix M + assume R3: "Coplanar A B C M" + have "Col D P P" + by (simp add: col_trivial_2) + then have "Per E D P" + using assms(1) assms(3) orth_at_chara by auto + then have R4: "Per P D E" using l8_2 by auto + have R5: "Per Q E D" + using Perp_cases assms(2) perp_per_2 by blast + have R6: "Coplanar D E D' M" + proof - + have S1: "\ Col A B C" + using OrthAt_def assms(1) by auto + have "Coplanar A B C D" + using OrthAt_def assms(1) by auto + thus ?thesis + using S1 assms(3) R1 R3 coplanar_pseudo_trans by blast + qed + have R7: "Per D' D E" + using Perp_cases R1 perp_per_1 by blast + have "Per D' D P" + using R1 assms(1) orth_at_chara by blast + then have "Per P D D'" + using Per_cases by blast + then have "Per Q E M" + using l11_61 R4 R5 R6 R7 OrthAt_def P4 R2 assms(1) assms(4) by blast + then have "Per M E Q" using l8_2 by auto + } + { + fix P0 Q0 + assume "Coplanar A B C P0 \ Col E Q Q0" + then have "Per P0 E Q0" + using P5 \\M. Coplanar A B C M \ Per M E Q\ per_col by blast + } + thus ?thesis + using OrthAt_def P5 assms(1) assms(3) col_trivial_3 by auto +qed + +lemma l11_62_unicity: + assumes "Coplanar A B C D" and + "Coplanar A B C D'" and + "\ E. Coplanar A B C E \ Per E D P" and + "\ E. Coplanar A B C E \ Per E D' P" + shows "D = D'" + by (metis assms(1) assms(2) assms(3) assms(4) l8_8 not_col_distincts per_not_colp) + +lemma l11_62_unicity_bis: + assumes "X OrthAt A B C X U" and + "Y OrthAt A B C Y U" + shows "X = Y" +proof - + have P1: "Coplanar A B C X" + using assms(1) orth_at_chara by blast + have P2: "Coplanar A B C Y" + using assms(2) orth_at_chara by blast + { + fix E + assume "Coplanar A B C E" + then have "Per E X U" + using OrthAt_def assms(1) col_trivial_2 by auto + } + { + fix E + assume "Coplanar A B C E" + then have "Per E Y U" + using assms(2) orth_at_chara by auto + } + thus ?thesis + by (meson P1 P2 \\E. Coplanar A B C E \ Per E X U\ l8_2 l8_7) +qed + +lemma orth_at2__eq: + assumes "X OrthAt A B C U V" and + "Y OrthAt A B C U V" + shows "X = Y" +proof - + have P1: "Coplanar A B C X" + using assms(1) + by (simp add: OrthAt_def) + have P2: "Coplanar A B C Y" + using OrthAt_def assms(2) by auto + { + fix E + assume "Coplanar A B C E" + then have "Per E X U" + using OrthAt_def assms(1) col_trivial_3 by auto + } + { + fix E + assume "Coplanar A B C E" + then have "Per E Y U" + using OrthAt_def assms(2) col_trivial_3 by auto + } + thus ?thesis + by (meson P1 P2 Per_perm \\E. 
Coplanar A B C E \ Per E X U\ l8_7) +qed + +lemma col_cop_orth_at__eq: + assumes "X OrthAt A B C U V" and + "Coplanar A B C Y" and + "Col U V Y" + shows "X = Y" +proof - + have "Y OrthAt A B C U V" + using Orth_def assms(1) assms(2) assms(3) col_cop_orth__orth_at by blast + thus ?thesis + using assms(1) orth_at2__eq by auto +qed + +lemma orth_at__ncop1: + assumes "U \ X" and + "X OrthAt A B C U V" + shows "\ Coplanar A B C U" + using assms(1) assms(2) col_cop_orth_at__eq not_col_distincts by blast + +lemma orth_at__ncop2: + assumes "V \ X" and + "X OrthAt A B C U V" + shows "\ Coplanar A B C V" + using assms(1) assms(2) col_cop_orth_at__eq not_col_distincts by blast + +lemma orth_at__ncop: + assumes "X OrthAt A B C X P" + shows "\ Coplanar A B C P" + by (metis assms orth_at__ncop2 orth_at_distincts) + +lemma l11_62_existence: + "\ D. (Coplanar A B C D \ (\ E. (Coplanar A B C E \ Per E D P)))" +proof cases + assume "Coplanar A B C P" + thus ?thesis + using l8_5 by auto +next + assume P1: "\ Coplanar A B C P" + then have P2: "\ Col A B C" + using ncop__ncol by auto + have "\ Col A B P" + using P1 ncop__ncols by auto + then obtain D0 where P4: "Col A B D0 \ A B Perp P D0" using l8_18_existence by blast + have P5: "Coplanar A B C D0" + using P4 ncop__ncols by auto + have "A \ B" + using P2 not_col_distincts by auto + then obtain D1 where P10: "A B Perp D1 D0 \ Coplanar A B C D1" + using ex_perp_cop by blast + have P11: "\ Col A B D1" + using P10 P4 perp_not_col2 by blast + { + fix D + assume "Col D0 D1 D" + then have "Coplanar A B C D" + by (metis P10 P5 col_cop2__cop perp_not_eq_2) + } + obtain A0 where P11: "A \ A0 \ B \ A0 \ D0 \ A0 \ Col A B A0" + using P4 diff_col_ex3 by blast + have P12: "Coplanar A B C A0" + using P11 ncop__ncols by blast + have P13: "Per P D0 A0" + using l8_16_1 P11 P4 by blast + show ?thesis + proof cases + assume Z1: "Per P D0 D1" + { + fix E + assume R1: "Coplanar A B C E" + have R2: "\ Col A0 D1 D0" + by (metis P10 P11 P4 col_permutation_5 colx perp_not_col2) + have R3: "Per A0 D0 P" + by (simp add: P13 l8_2) + have R4: "Per D1 D0 P" + by (simp add: Z1 l8_2) + have R5: "Per D0 D0 P" + by (simp add: l8_2 l8_5) + have "Coplanar A0 D1 D0 E" + using R1 P2 P12 P10 P5 coplanar_pseudo_trans by blast + then have "Per E D0 P" + using l11_60 R2 R3 R4 R5 by blast + } + thus ?thesis using P5 by auto + next + assume S1: "\ Per P D0 D1" + { + assume S2: "Col D0 D1 P" + have "\ D. Col D0 D1 D \ Coplanar A B C D" + by (simp add: \\Da. Col D0 D1 Da \ Coplanar A B C Da\) + then have "False" + using P1 S2 by blast + } + then have S2A: "\ Col D0 D1 P" by blast + then obtain D where S3: "Col D0 D1 D \ D0 D1 Perp P D" + using l8_18_existence by blast + have S4: "Coplanar A B C D" + by (simp add: S3 \\Da. 
Col D0 D1 Da \ Coplanar A B C Da\) + { + fix E + assume S5: "Coplanar A B C E" + have S6: "D \ D0" + using S1 S3 l8_2 perp_per_1 by blast + have S7: "Per D0 D P" + by (metis Perp_cases S3 S6 perp_col perp_per_1) + have S8: "Per D D0 A0" + proof - + have V4: "D0 \ D1" + using P10 perp_not_eq_2 by blast + have V6: "Per A0 D0 D1" + using P10 P11 P4 l8_16_1 l8_2 by blast + thus ?thesis + using S3 V4 V6 l8_2 per_col by blast + qed + have S9: "Per A0 D P" + proof - + obtain A0' where W1: "D Midpoint A0 A0'" + using symmetric_point_construction by auto + obtain D0' where W2: "D Midpoint D0 D0'" + using symmetric_point_construction by auto + have "Cong P A0 P A0'" + proof - + have V3: "Cong P D0 P D0'" + using S7 W2 l8_2 per_double_cong by blast + have V4: "Cong D0 A0 D0' A0'" + using W1 W2 cong_4321 l7_13 by blast + have "Per P D0' A0'" + proof - + obtain P' where V5: "D Midpoint P P'" + using symmetric_point_construction by blast + have "Per P' D0 A0" + proof - + have "\ Col P D D0" + by (metis S2A S3 S6 col2__eq col_permutation_1) + thus ?thesis + by (metis (full_types) P13 S3 S8 V5 S2A col_per2__per midpoint_col) + qed + thus ?thesis + using midpoint_preserves_per V5 Mid_cases W1 W2 by blast + qed + thus ?thesis using l10_12 P13 V3 V4 by blast + qed + thus ?thesis + using Per_def Per_perm W1 by blast + qed + have S13: "Per D D P" + using Per_perm l8_5 by blast + have S14: "\ Col D0 A0 D" using P11 S7 S9 per_not_col Col_perm S6 S8 by blast + have "Coplanar A B C D" using S4 by auto + then have "Coplanar D0 A0 D E" + using P12 P2 P5 S5 coplanar_pseudo_trans by blast + then have "Per E D P" + using S13 S14 S7 S9 l11_60 by blast + } + thus ?thesis using S4 by blast + qed +qed + +lemma l11_62_existence_bis: + assumes "\ Coplanar A B C P" + shows "\ X. X OrthAt A B C X P" +proof - + obtain X where P1: "Coplanar A B C X \ (\ E. Coplanar A B C E \ Per E X P)" + using l11_62_existence by blast + then have P2: "X \ P" + using assms by auto + have P3: "\ Col A B C" + using assms ncop__ncol by auto + thus ?thesis + using P1 P2 P3 orth_at_chara by auto +qed + +lemma l11_63_aux: + assumes "Coplanar A B C D" and + "D \ E" and + "E OrthAt A B C E P" + shows "\ Q. (D E OS P Q \ A B C Orth D Q)" +proof - + have P1: "\ Col A B C" + using OrthAt_def assms(3) by blast + have P2: "E \ P" + using OrthAt_def assms(3) by blast + have P3: "Coplanar A B C E" + using OrthAt_def assms(3) by blast + have P4: "\ P0 Q. 
(Coplanar A B C P0 \ Col E P Q) \ Per P0 E Q" + using OrthAt_def assms(3) by blast + have P5: "\ Coplanar A B C P" + using assms(3) orth_at__ncop by auto + have P6: "Col D E D" + by (simp add: col_trivial_3) + have "\ Col D E P" + using P3 P5 assms(1) assms(2) col_cop2__cop by blast + then obtain Q where P6: "D E Perp Q D \ D E OS P Q" + using P6 l10_15 by blast + have "A B C Orth D Q" + proof - + obtain F where P7: "Coplanar A B C F \ \ Col D E F" + using assms(2) ex_ncol_cop by blast + obtain D' where P8: "D E Perp D' D \ Coplanar D E F D'" + using assms(2) ex_perp_cop by presburger + have P9: "\ Col D' D E" + using P8 col_permutation_1 perp_not_col by blast + have P10: "Coplanar D E F A" + by (meson P1 P3 P7 assms(1) coplanar_pseudo_trans ncop_distincts) + have P11: "Coplanar D E F B" + by (meson P1 P3 P7 assms(1) coplanar_pseudo_trans ncop_distincts) + have P12: "Coplanar D E F C" + by (meson P1 P3 P7 assms(1) coplanar_pseudo_trans ncop_distincts) + then have "D OrthAt A B C D Q" + proof - + have "Per D' D Q" + proof - + obtain E' where Y1: "D E Perp E' E \ Coplanar D E F E'" + using assms(2) ex_perp_cop by blast + have Y2: "E \ E'" + using Y1 perp_distinct by auto + have Y5: "Coplanar E D E' D'" + by (meson P7 P8 Y1 coplanar_perm_12 coplanar_perm_7 coplanar_trans_1 not_col_permutation_2) + have Y6: "Per E' E D" + by (simp add: Perp_perm Tarski_neutral_dimensionless.perp_per_2 Tarski_neutral_dimensionless_axioms Y1) + have Y7: "Per D' D E" + using P8 col_trivial_2 col_trivial_3 l8_16_1 by blast + have Y8: "Coplanar E D P Q" + using P6 ncoplanar_perm_6 os__coplanar by blast + have Y9: "Per P E D" using P4 + using assms(1) assms(3) l8_2 orth_at_chara by blast + have Y10: "Coplanar A B C E'" + using P10 P11 P12 P7 Y1 coplanar_pseudo_trans by blast + then have Y11: "Per E' E P" + using P4 Y10 col_trivial_2 by auto + have "E \ D" using assms(2) by blast + thus ?thesis + using l11_61 Y2 assms(2) P2 Y5 Y6 Y7 Y8 Y9 Y10 Y11 by blast + qed + then have X1: "D OrthAt D' D E D Q" using l11_60_bis + by (metis OS_def P6 P9 Per_perm TS_def Tarski_neutral_dimensionless.l8_5 Tarski_neutral_dimensionless_axioms col_trivial_3 invert_one_side ncop_distincts perp_per_1) + have X3: "Coplanar D' D E A" + using P10 P7 P8 coplanar_perm_14 coplanar_trans_1 not_col_permutation_3 by blast + have X4: "Coplanar D' D E B" + using P11 P7 P8 coplanar_perm_14 coplanar_trans_1 not_col_permutation_3 by blast + have "Coplanar D' D E C" + using P12 P7 P8 coplanar_perm_14 coplanar_trans_1 not_col_permutation_3 by blast + thus ?thesis + using X1 P1 X3 X4 cop3_orth_at__orth_at by blast + qed + thus ?thesis + using Orth_def by blast + qed + thus ?thesis using P6 by blast +qed + +lemma l11_63_existence: + assumes "Coplanar A B C D" and + "\ Coplanar A B C P" + shows "\ Q. A B C Orth D Q" + using Orth_def assms(1) assms(2) l11_62_existence_bis l11_63_aux by fastforce + +lemma l8_21_3: + assumes "Coplanar A B C D" and + "\ Coplanar A B C X" + shows + "\ P T. 
(A B C Orth D P \ Coplanar A B C T \ Bet X T P)" +proof - + obtain E where P1: "E OrthAt A B C E X" + using assms(2) l11_62_existence_bis by blast + thus ?thesis + proof cases + assume P2: "D = E" + obtain Y where P3: "Bet X D Y \ Cong D Y D X" + using segment_construction by blast + have P4: "D \ X" + using assms(1) assms(2) by auto + have P5: "A B C Orth D X" + using Orth_def P1 P2 by auto + have P6: "D \ Y" + using P3 P4 cong_reverse_identity by blast + have "Col D X Y" + using Col_def Col_perm P3 by blast + then have "A B C Orth D Y" + using P5 P6 col_orth__orth by auto + thus ?thesis + using P3 assms(1) by blast + next + assume K1: "D \ E" + then obtain P' where P7: "D E OS X P' \ A B C Orth D P'" + using P1 assms(1) l11_63_aux by blast + have P8: "\ Col A B C" + using assms(2) ncop__ncol by auto + have P9: "E \ X" + using P7 os_distincts by auto + have P10: "\ P Q. (Coplanar A B C P \ Col E X Q) \ Per P E Q" + using OrthAt_def P1 by auto + have P11: "D OrthAt A B C D P'" + by (simp add: P7 assms(1) col_cop_orth__orth_at col_trivial_3) + have P12: "D \ P'" + using P7 os_distincts by auto + have P13: "\ Coplanar A B C P'" + using P11 orth_at__ncop by auto + have P14: "\ P Q. (Coplanar A B C P \ Col D P' Q) \ Per P D Q" + using OrthAt_def P11 by auto + obtain P where P15: "Bet P' D P \ Cong D P D P'" + using segment_construction by blast + have P16: "D E TS X P" + proof - + have P16A: "D E OS P' X" using P7 one_side_symmetry by blast + then have "D E TS P' P" + by (metis P12 P15 Tarski_neutral_dimensionless.bet__ts Tarski_neutral_dimensionless_axioms cong_diff_3 one_side_not_col123) + thus ?thesis using l9_8_2 P16A by blast + qed + obtain T where P17: "Col T D E \ Bet X T P" + using P16 TS_def by blast + have P18: "D \ P" + using P16 ts_distincts by blast + have "Col D P' P" + using Col_def Col_perm P15 by blast + then have "A B C Orth D P" + using P18 P7 col_orth__orth by blast + thus ?thesis using col_cop2__cop + by (meson P1 P17 Tarski_neutral_dimensionless.orth_at_chara Tarski_neutral_dimensionless_axioms K1 assms(1) col_permutation_1) + qed +qed + +lemma mid2_orth_at2__cong: + assumes "X OrthAt A B C X P" and + "Y OrthAt A B C Y Q" and + "X Midpoint P P'" and + "Y Midpoint Q Q'" + shows "Cong P Q P' Q'" +proof - + have Q1: "\ Col A B C" + using assms(1) col__coplanar orth_at__ncop by blast + have Q2: "X \ P" + using assms(1) orth_at_distincts by auto + have Q3: "Coplanar A B C X" + using OrthAt_def assms(1) by auto + have Q4: "\ P0 Q. (Coplanar A B C P0 \ Col X P Q) \ Per P0 X Q" + using OrthAt_def assms(1) by blast + have Q5: "Y \ P" + by (metis assms(1) assms(2) orth_at__ncop2 orth_at_chara) + have Q6: "Coplanar A B C Y" + using OrthAt_def assms(2) by blast + have Q7: "\ P Q0. 
(Coplanar A B C P \ Col Y Q Q0) \ Per P Y Q0" + using OrthAt_def assms(2) by blast + obtain Z where P1: "Z Midpoint X Y" + using midpoint_existence by auto + obtain R where P2: "Z Midpoint P R" + using symmetric_point_construction by auto + obtain R' where P3: "Z Midpoint P' R'" + using symmetric_point_construction by auto + have T1: "Coplanar A B C Z" + using P1 Q3 Q6 bet_cop2__cop midpoint_bet by blast + then have "Per Z X P" + using Q4 assms(1) orth_at_chara by auto + then have T2: "Cong Z P Z P'" + using assms(3) per_double_cong by blast + have T3: "Cong R Z R' Z" + by (metis Cong_perm Midpoint_def P2 P3 T2 cong_transitivity) + have T4: "Cong R Q R' Q'" + by (meson P1 P2 P3 assms(3) assms(4) l7_13 not_cong_4321 symmetry_preserves_midpoint) + have "Per Z Y Q" + using Q7 T1 assms(2) orth_at_chara by auto + then have T5: "Cong Z Q Z Q'" + using assms(4) per_double_cong by auto + have "R \ Z" + by (metis P2 P3 Q2 T3 assms(3) cong_diff_3 l7_17_bis midpoint_not_midpoint) + thus ?thesis + using P2 P3 T2 T3 T4 T5 five_segment l7_2 midpoint_bet by blast +qed + +lemma orth_at2_tsp__ts: + assumes "P \ Q" and + "P OrthAt A B C P X" and + "Q OrthAt A B C Q Y" and + "A B C TSP X Y" + shows "P Q TS X Y" +proof - + obtain T where P0: "Coplanar A B C T \ Bet X T Y" + using TSP_def assms(4) by auto + have P1: "\ Col A B C" + using assms(4) ncop__ncol tsp__ncop1 by blast + have P2: "P \ X " + using assms(2) orth_at_distincts by auto + have P3: "Coplanar A B C P" + using OrthAt_def assms(2) by blast + have P4: "\ D. Coplanar A B C D \ Per D P X" + using assms(2) orth_at_chara by blast + have P5: "Q \ Y" + using assms(3) orth_at_distincts by auto + have P6: "Coplanar A B C Q" + using OrthAt_def assms(3) by blast + have P7: "\ D. Coplanar A B C D \ Per D Q Y" + using assms(3) orth_at_chara by blast + have P8: "\ Col X P Q" + using P3 P6 assms(1) assms(4) col_cop2__cop not_col_permutation_2 tsp__ncop1 by blast + have P9: "\ Col Y P Q" + using P3 P6 assms(1) assms(4) col_cop2__cop not_col_permutation_2 tsp__ncop2 by blast + have "Col T P Q" + proof - + obtain X' where Q1: "P Midpoint X X'" + using symmetric_point_construction by auto + obtain Y' where Q2: "Q Midpoint Y Y'" + using symmetric_point_construction by auto + have "Per T P X" + using P0 P4 by auto + then have Q3: "Cong T X T X'" + using Q1 per_double_cong by auto + have "Per T Q Y" + using P0 P7 by auto + then have Q4: "Cong T Y T Y'" using Q2 per_double_cong by auto + have "Cong X Y X' Y'" + using P1 Q1 Q2 assms(2) assms(3) mid2_orth_at2__cong by blast + then have "X T Y Cong3 X' T Y'" + using Cong3_def Q3 Q4 not_cong_2143 by blast + then have "Bet X' T Y'" + using l4_6 P0 by blast + thus ?thesis + using Q1 Q2 Q3 Q4 Col_def P0 between_symmetry l7_22 by blast + qed + thus ?thesis + using P0 P8 P9 TS_def by blast +qed + +lemma orth_dec: + shows "A B C Orth U V \ \ A B C Orth U V" by auto + +lemma orth_at_dec: + shows "X OrthAt A B C U V \ \ X OrthAt A B C U V" by auto + +lemma tsp_dec: + shows "A B C TSP X Y \ \ A B C TSP X Y" by auto + +lemma osp_dec: + shows "A B C OSP X Y \ \ A B C OSP X Y" by auto + +lemma ts2__inangle: + assumes "A C TS B P" and + "B P TS A C" + shows "P InAngle A B C" + by (metis InAngle_def assms(1) assms(2) bet_out ts2__ex_bet2 ts_distincts) + +lemma os_ts__inangle: + assumes "B P TS A C" and + "B A OS C P" + shows "P InAngle A B C" +proof - + have P1: "\ Col A B P" + using TS_def assms(1) by auto + have P2: "\ Col B A C" + using assms(2) col123__nos by blast + obtain P' where P3: "B Midpoint P P'" + using 
symmetric_point_construction by blast + then have P4: "\<not> Col B A P'" + by (metis assms(2) col_one_side col_permutation_5 midpoint_col midpoint_distinct_2 one_side_not_col124) + have P5: "(B \<noteq> P' \<and> B P TS A C \<and> Bet P B P') \<longrightarrow> (P InAngle A B C \<or> P' InAngle A B C)" + using two_sides_in_angle by auto + then have P6: "P InAngle A B C \<or> P' InAngle A B C" + using P3 P4 assms(1) midpoint_bet not_col_distincts by blast + { + assume "P' InAngle A B C" + then have P7: "A B OS P' C" + using Col_cases P2 P4 in_angle_one_side by blast + then have P8: "\<not> A B TS P' C" + using l9_9 by auto + have "B A TS P P'" + using P1 P3 P4 bet__ts midpoint_bet not_col_distincts not_col_permutation_4 by auto + then have "B A TS C P'" + using P7 assms(2) invert_one_side l9_2 l9_8_2 l9_9 by blast + then have "B A TS P' C" + using l9_2 by blast + then have "A B TS P' C" + by (simp add: invert_two_sides) + then have "P InAngle A B C" + using P8 by auto + } + thus ?thesis + using P6 by blast +qed + +lemma os2__inangle: + assumes "B A OS C P" and + "B C OS A P" + shows "P InAngle A B C" + using assms(1) assms(2) col124__nos l9_9_bis os_ts__inangle two_sides_cases by blast + +lemma acute_conga__acute: + assumes "Acute A B C" and + "A B C CongA D E F" + shows "Acute D E F" +proof - + have "D E F LeA A B C" + by (simp add: assms(2) conga__lea456123) + thus ?thesis + using acute_lea_acute assms(1) by blast +qed + +lemma acute_out2__acute: + assumes "B Out A' A" and + "B Out C' C" and + "Acute A B C" + shows "Acute A' B C'" + by (meson Tarski_neutral_dimensionless.out2__conga Tarski_neutral_dimensionless_axioms acute_conga__acute assms(1) assms(2) assms(3)) + +lemma conga_obtuse__obtuse: + assumes "Obtuse A B C" and + "A B C CongA D E F" + shows "Obtuse D E F" + using assms(1) assms(2) conga__lea lea_obtuse_obtuse by blast + +lemma obtuse_out2__obtuse: + assumes "B Out A' A" and + "B Out C' C" and + "Obtuse A B C" + shows "Obtuse A' B C'" + by (meson Tarski_neutral_dimensionless.out2__conga Tarski_neutral_dimensionless_axioms assms(1) assms(2) assms(3) conga_obtuse__obtuse) + +lemma bet_lea__bet: + assumes "Bet A B C" and + "A B C LeA D E F" + shows "Bet D E F" +proof - + have "A B C CongA D E F" + by (metis assms(1) assms(2) l11_31_2 lea_asym lea_distincts) + thus ?thesis + using assms(1) bet_conga__bet by blast +qed + +lemma out_lea__out: + assumes "E Out D F" and + "A B C LeA D E F" + shows "B Out A C" +proof - + have "D E F CongA A B C" + using Tarski_neutral_dimensionless.l11_31_1 Tarski_neutral_dimensionless.lea_asym Tarski_neutral_dimensionless.lea_distincts Tarski_neutral_dimensionless_axioms assms(1) assms(2) by fastforce + thus ?thesis + using assms(1) out_conga_out by blast +qed + +lemma bet2_lta__lta: + assumes "A B C LtA D E F" and + "Bet A B A'" and + "A' \<noteq> B" and + "Bet D E D'" and + "D' \<noteq> E" + shows "D' E F LtA A' B C" +proof - + have P1: "D' E F LeA A' B C" + by (metis Bet_cases assms(1) assms(2) assms(3) assms(4) assms(5) l11_36_aux2 lea_distincts lta__lea) + have "\<not> D' E F CongA A' B C" + by (metis assms(1) assms(2) assms(4) between_symmetry conga_sym l11_13 lta_distincts not_lta_and_conga) + thus ?thesis + by (simp add: LtA_def P1) +qed + +lemma lea123456_lta__lta: + assumes "A B C LeA D E F" and + "D E F LtA G H I" + shows "A B C LtA G H I" +proof - + have "\<not> G H I LeA F E D" + by (metis (no_types) Tarski_neutral_dimensionless.lea__nlta Tarski_neutral_dimensionless.lta_left_comm Tarski_neutral_dimensionless_axioms assms(2)) + then have "\<not> A B C CongA G H I" + by (metis Tarski_neutral_dimensionless.lta_distincts 
Tarski_neutral_dimensionless_axioms assms(1) assms(2) conga_pseudo_refl l11_30) + thus ?thesis + by (meson LtA_def Tarski_neutral_dimensionless.lea_trans Tarski_neutral_dimensionless_axioms assms(1) assms(2)) +qed + +lemma lea456789_lta__lta: + assumes "A B C LtA D E F" and + "D E F LeA G H I" + shows "A B C LtA G H I" + by (meson LtA_def assms(1) assms(2) conga__lea456123 lea_trans lta__nlea) + +lemma acute_per__lta: + assumes "Acute A B C" and + "D \<noteq> E" and + "E \<noteq> F" and + "Per D E F" + shows "A B C LtA D E F" +proof - + obtain G H I where P1: "Per G H I \<and> A B C LtA G H I" + using Acute_def assms(1) by auto + then have "G H I CongA D E F" + using assms(2) assms(3) assms(4) l11_16 lta_distincts by blast + thus ?thesis + by (metis P1 conga_preserves_lta conga_refl lta_distincts) +qed + +lemma obtuse_per__lta: + assumes "Obtuse A B C" and + "D \<noteq> E" and + "E \<noteq> F" and + "Per D E F" + shows "D E F LtA A B C" +proof - + obtain G H I where P1: "Per G H I \<and> G H I LtA A B C" + using Obtuse_def assms(1) by auto + then have "G H I CongA D E F" + using assms(2) assms(3) assms(4) l11_16 lta_distincts by blast + thus ?thesis + by (metis P1 Tarski_neutral_dimensionless.l11_51 Tarski_neutral_dimensionless_axioms assms(1) cong_reflexivity conga_preserves_lta obtuse_distincts) +qed + +lemma acute_obtuse__lta: + assumes "Acute A B C" and + "Obtuse D E F" + shows "A B C LtA D E F" + by (metis Acute_def assms(1) assms(2) lta_distincts lta_trans obtuse_per__lta) + +lemma lea_in_angle: + assumes "A B P LeA A B C" and + "A B OS C P" + shows "P InAngle A B C" +proof - + obtain T where P3: "T InAngle A B C \<and> A B P CongA A B T" + using LeA_def assms(1) by blast + thus ?thesis + by (metis assms(2) conga_preserves_in_angle conga_refl not_conga_sym one_side_symmetry os_distincts) +qed + +lemma acute_bet__obtuse: + assumes "Bet A B A'" and + "A' \<noteq> B" and + "Acute A B C" + shows "Obtuse A' B C" +proof cases + assume P1: "Col A B C" + show ?thesis + proof cases + assume "Bet A B C" + thus ?thesis + using P1 acute_col__out assms(3) not_bet_and_out by blast + next + assume "\<not> Bet A B C" + thus ?thesis + by (smt P1 assms(1) assms(2) bet__obtuse between_inner_transitivity between_symmetry outer_transitivity_between third_point) + qed +next + assume P2: "\<not> Col A B C" + then obtain D where P3: "A B Perp D B \<and> A B OS C D" + using col_trivial_2 l10_15 by blast + { + assume P4: "Col C B D" + then have "Per A B C" + proof - + have P5: "B \<noteq> D" + using P3 perp_not_eq_2 by auto + have "Per A B D" + using P3 Perp_perm perp_per_2 by blast + thus ?thesis + using P4 P5 not_col_permutation_2 per_col by blast + qed + then have "A B C LtA A B C" + by (metis Acute_def acute_per__lta assms(3) lta_distincts) + then have "False" + by (simp add: nlta) + } + then have P6: "\<not> Col C B D" by auto + have P7: "B A' OS C D" + by (metis P3 assms(1) assms(2) bet_col col2_os__os l5_3) + have T1: "Per A B D" + by (simp add: P3 perp_left_comm perp_per_1) + have Q1: "B C TS A' A" + using P2 assms(1) assms(2) bet__ts l9_2 not_col_permutation_1 by auto + have "A B C LtA A B D" + using P2 P6 T1 acute_per__lta assms(3) not_col_distincts by auto + then have "A B C LeA A B D" + by (simp add: lta__lea) + then have "C InAngle A B D" + by (simp add: P3 lea_in_angle one_side_symmetry) + then have "C InAngle D B A" + using l11_24 by blast + then have "C B TS D A" + by (simp add: P2 P6 in_angle_two_sides not_col_permutation_1 not_col_permutation_4) + then have "B C TS D A" + using invert_two_sides by blast + then have "B C OS A' D" + using Q1 l9_8_1 by auto + then have 
T1A: "D InAngle A' B C" + by (simp add: P7 os2__inangle) + then have "A B D CongA A' B D" + by (metis Per_cases T1 Tarski_neutral_dimensionless.conga_comm Tarski_neutral_dimensionless.l11_18_1 Tarski_neutral_dimensionless_axioms acute_distincts assms(1) assms(3) inangle_distincts) + then have T2: "A B D LeA A' B C" + using LeA_def T1A by auto + { + assume "A B D CongA A' B C" + then have "False" + by (metis OS_def P7 T1 TS_def Tarski_neutral_dimensionless.out2__conga Tarski_neutral_dimensionless_axioms \A B C LtA A B D\ \A B D CongA A' B D\ \\thesis. (\D. A B Perp D B \ A B OS C D \ thesis) \ thesis\ col_trivial_2 invert_one_side l11_17 l11_19 not_lta_and_conga out_trivial) + } + then have "\ A B D CongA A' B C" by auto + then have "A B D LtA A' B C" + using T2 LtA_def by auto + thus ?thesis + using T1 Obtuse_def by blast +qed + +lemma bet_obtuse__acute: + assumes "Bet A B A'" and + "A' \ B" and + "Obtuse A B C" + shows "Acute A' B C" +proof cases + assume P1: "Col A B C" + thus ?thesis + proof cases + assume "Bet A B C" + then have "B Out A' C" + by (smt Out_def assms(1) assms(2) assms(3) l5_2 obtuse_distincts) + thus ?thesis + by (simp add: out__acute) + next + assume "\ Bet A B C" + thus ?thesis + using P1 assms(3) col_obtuse__bet by blast + qed +next + assume P2: "\ Col A B C" + then obtain D where P3: "A B Perp D B \ A B OS C D" + using col_trivial_2 l10_15 by blast + { + assume P3A: "Col C B D" + have P3B: "B \ D" + using P3 perp_not_eq_2 by blast + have P3C: "Per A B D" + using P3 Perp_perm perp_per_2 by blast + then have "Per A B C" + using P3A P3B not_col_permutation_2 per_col by blast + then have "A B C LtA A B C" + using P2 assms(3) not_col_distincts obtuse_per__lta by auto + then have "False" + by (simp add: nlta) + } + then have P4: "\ Col C B D" by auto + have "Col B A A'" + using Col_def Col_perm assms(1) by blast + then have P5: "B A' OS C D" + using P3 assms(2) invert_one_side col2_os__os col_trivial_3 by blast + have P7: "Per A' B D" + by (meson Col_def P3 Tarski_neutral_dimensionless.Per_perm Tarski_neutral_dimensionless_axioms assms(1) col_trivial_2 l8_16_1) + have "A' B C LtA A' B D" + proof - + have P8: "A' B C LeA A' B D" + proof - + have P9: "C InAngle A' B D" + proof - + have P10: "B A' OS D C" + by (simp add: P5 one_side_symmetry) + have "B D OS A' C" + proof - + have P6: "\ Col A B D" + using P3 col124__nos by auto + then have P11: "B D TS A' A" + using Col_perm P5 assms(1) bet__ts l9_2 os_distincts by blast + have "A B D LtA A B C" + proof - + have P11A: "A \ B" + using P2 col_trivial_1 by auto + have P11B: "B \ D" + using P4 col_trivial_2 by blast + have "Per A B D" + using P3 Perp_cases perp_per_2 by blast + thus ?thesis + by (simp add: P11A P11B Tarski_neutral_dimensionless.obtuse_per__lta Tarski_neutral_dimensionless_axioms assms(3)) + qed + then have "A B D LeA A B C" + by (simp add: lta__lea) + then have "D InAngle A B C" + by (simp add: P3 lea_in_angle) + then have "D InAngle C B A" + using l11_24 by blast + then have "D B TS C A" + by (simp add: P4 P6 in_angle_two_sides not_col_permutation_4) + then have "B D TS C A" + by (simp add: invert_two_sides) + thus ?thesis + using OS_def P11 by blast + qed + thus ?thesis + by (simp add: P10 os2__inangle) + qed + have "A' B C CongA A' B C" + using assms(2) assms(3) conga_refl obtuse_distincts by blast + thus ?thesis + by (simp add: P9 inangle__lea) + qed + { + assume "A' B C CongA A' B D" + then have "B Out C D" + using P5 conga_os__out invert_one_side by blast + then have "False" + using P4 not_col_permutation_4 
out_col by blast + } + then have "\ A' B C CongA A' B D" by auto + thus ?thesis + by (simp add: LtA_def P8) + qed + thus ?thesis + using Acute_def P7 by blast +qed + +lemma inangle_dec: + "P InAngle A B C \ \ P InAngle A B C" by blast + +lemma lea_dec: + "A B C LeA D E F \ \ A B C LeA D E F" by blast + +lemma lta_dec: + "A B C LtA D E F \ \ A B C LtA D E F" by blast + +lemma lea_total: + assumes "A \ B" and + "B \ C" and + "D \ E" and + "E \ F" + shows "A B C LeA D E F \ D E F LeA A B C" +proof cases + assume P1: "Col A B C" + show ?thesis + proof cases + assume "B Out A C" + thus ?thesis + using assms(3) assms(4) l11_31_1 by auto + next + assume "\ B Out A C" + thus ?thesis + by (metis P1 assms(1) assms(2) assms(3) assms(4) l11_31_2 or_bet_out) + qed +next + assume P2: "\ Col A B C" + show ?thesis + proof cases + assume P3: "Col D E F" + show ?thesis + proof cases + assume "E Out D F" + thus ?thesis + using assms(1) assms(2) l11_31_1 by auto + next + assume "\ E Out D F" + thus ?thesis + by (metis P3 assms(1) assms(2) assms(3) assms(4) l11_31_2 l6_4_2) + qed + next + assume P4: "\ Col D E F" + show ?thesis + proof cases + assume "A B C LeA D E F" + thus ?thesis + by simp + next + assume P5: "\ A B C LeA D E F" + obtain P where P6: "D E F CongA A B P \ A B OS P C" + using P2 P4 angle_construction_1 by blast + then have P7: "B A OS C P" + using invert_one_side one_side_symmetry by blast + have "B C OS A P" + proof - + { + assume "Col P B C" + then have P7B: "B Out C P" + using Col_cases P7 col_one_side_out by blast + have "A B C CongA D E F" + proof - + have P7C: "A B P CongA D E F" + by (simp add: P6 conga_sym) + have P7D: "B Out A A" + by (simp add: assms(1) out_trivial) + have P7E: "E Out D D" + by (simp add: assms(3) out_trivial) + have "E Out F F" + using assms(4) out_trivial by auto + thus ?thesis + using P7B P7C P7D P7E l11_10 by blast + qed + then have "A B C LeA D E F" + by (simp add: conga__lea) + then have "False" + by (simp add: P5) + } + then have P8: "\ Col P B C" by auto + { + assume T0: "B C TS A P" + have "A B C CongA D E F" + proof - + have T1: "A B C LeA A B P" + proof - + have T1A: "C InAngle A B P" + by (simp add: P7 T0 one_side_symmetry os_ts__inangle) + have "A B C CongA A B C" + using assms(1) assms(2) conga_refl by auto + thus ?thesis + by (simp add: T1A inangle__lea) + qed + have T2: "A B C CongA A B C" + using assms(1) assms(2) conga_refl by auto + have "A B P CongA D E F" + by (simp add: P6 conga_sym) + thus ?thesis + using P5 T1 T2 l11_30 by blast + qed + then have "A B C LeA D E F" + by (simp add: conga__lea) + then have "False" + by (simp add: P5) + } + then have "\ B C TS A P" by auto + thus ?thesis + using Col_perm P7 P8 one_side_symmetry os_ts1324__os two_sides_cases by blast + qed + then have "P InAngle A B C" + using P7 os2__inangle by blast + then have "D E F LeA A B C" + using P6 LeA_def by blast + thus ?thesis + by simp + qed + qed +qed + +lemma or_lta2_conga: + assumes "A \ B" and + "C \ B" and + "D \ E" and + "F \ E" + shows "A B C LtA D E F \ D E F LtA A B C \ A B C CongA D E F" +proof - + have P1: "A B C LeA D E F \ D E F LeA A B C" + using assms(1) assms(2) assms(3) assms(4) lea_total by auto + { + assume "A B C LeA D E F" + then have "A B C LtA D E F \ D E F LtA A B C \ A B C CongA D E F" + using LtA_def by blast + } + { + assume "D E F LeA A B C" + then have "A B C LtA D E F \ D E F LtA A B C \ A B C CongA D E F" + using LtA_def conga_sym by blast + } + thus ?thesis + using P1 \A B C LeA D E F \ A B C LtA D E F \ D E F LtA A B C \ A B C CongA D 
E F\ by blast +qed + +lemma angle_partition: + assumes "A \ B" and + "B \ C" + shows "Acute A B C \ Per A B C \ Obtuse A B C" +proof - + obtain A' B' D' where P1: "\ (Bet A' B' D' \ Bet B' D' A' \ Bet D' A' B')" + using lower_dim by auto + then have "\ Col A' B' D'" + by (simp add: Col_def) + obtain C' where P3: "A' B' Perp C' B'" + by (metis P1 Perp_perm between_trivial2 perp_exists) + then have P4: "A B C LtA A' B' C' \ A' B' C' LtA A B C \ A B C CongA A' B' C'" + by (metis P1 assms(1) assms(2) between_trivial2 or_lta2_conga perp_not_eq_2) + { + assume "A B C LtA A' B' C'" + then have "Acute A B C \ Per A B C \ Obtuse A B C" + using Acute_def P3 Perp_cases perp_per_2 by blast + } + { + assume "A' B' C' LtA A B C" + then have "Acute A B C \ Per A B C \ Obtuse A B C" + using Obtuse_def P3 Perp_cases perp_per_2 by blast + } + { + assume "A B C CongA A' B' C'" + then have "Acute A B C \ Per A B C \ Obtuse A B C" + by (metis P3 Perp_cases Tarski_neutral_dimensionless.conga_sym Tarski_neutral_dimensionless.l11_17 Tarski_neutral_dimensionless_axioms perp_per_2) + } + thus ?thesis + using P4 \A B C LtA A' B' C' \ Acute A B C \ Per A B C \ Obtuse A B C\ \A' B' C' LtA A B C \ Acute A B C \ Per A B C \ Obtuse A B C\ by auto +qed + +lemma acute_chara_1: + assumes "Bet A B A'" and + "B \ A'" and + "Acute A B C" + shows "A B C LtA A' B C" +proof - + have "Obtuse A' B C" + using acute_bet__obtuse assms(1) assms(2) assms(3) by blast + thus ?thesis + by (simp add: acute_obtuse__lta assms(3)) +qed + +lemma acute_chara_2: + assumes "Bet A B A'" and + "A B C LtA A' B C" + shows "Acute A B C" + by (metis Tarski_neutral_dimensionless.Acute_def Tarski_neutral_dimensionless_axioms acute_chara_1 angle_partition assms(1) assms(2) bet_obtuse__acute between_symmetry lta_distincts not_and_lta) + +lemma acute_chara: + assumes "Bet A B A'" and + "B \ A'" + shows "Acute A B C \ A B C LtA A' B C" + using acute_chara_1 acute_chara_2 assms(1) assms(2) by blast + +lemma obtuse_chara: + assumes "Bet A B A'" and + "B \ A'" + shows "Obtuse A B C \ A' B C LtA A B C" + by (metis Tarski_neutral_dimensionless.Obtuse_def Tarski_neutral_dimensionless_axioms acute_bet__obtuse acute_chara assms(1) assms(2) bet_obtuse__acute between_symmetry lta_distincts) + +lemma conga__acute: + assumes "A B C CongA A C B" + shows "Acute A B C" + by (metis acute_sym angle_partition assms conga_distinct conga_obtuse__obtuse l11_17 l11_43) + +lemma cong__acute: + assumes "A \ B" and + "B \ C" and + "Cong A B A C" + shows "Acute A B C" + using angle_partition assms(1) assms(2) assms(3) cong__nlt l11_46 lt_left_comm by blast + +lemma nlta__lea: + assumes "\ A B C LtA D E F" and + "A \ B" and + "B \ C" and + "D \ E" and + "E \ F" + shows "D E F LeA A B C" + by (metis LtA_def assms(1) assms(2) assms(3) assms(4) assms(5) conga__lea456123 or_lta2_conga) + +lemma nlea__lta: + assumes "\ A B C LeA D E F" and + "A \ B" and + "B \ C" and + "D \ E" and + "E \ F" + shows "D E F LtA A B C" + using assms(1) assms(2) assms(3) assms(4) assms(5) nlta__lea by blast + +lemma triangle_strict_inequality: + assumes "Bet A B D" and + "Cong B C B D" and + "\ Bet A B C" + shows "A C Lt A D" +proof cases + assume P1: "Col A B C" + then have P2: "B Out A C" + using assms(3) not_out_bet by auto + { + assume "Bet B A C" + then have P3: "A C Le A D" + by (meson assms(1) assms(2) cong__le l5_12_a le_transitivity) + have "\ Cong A C A D" + by (metis Out_def P1 P2 assms(1) assms(2) assms(3) l4_18) + then have "A C Lt A D" + by (simp add: Lt_def P3) + } + { + assume "Bet A C B" + then 
have P5: "Bet A C D"
+      using assms(1) between_exchange4 by blast
+    then have P6: "A C Le A D"
+      by (simp add: bet__le1213)
+    have "\<not> Cong A C A D"
+      using P5 assms(1) assms(3) between_cong by blast
+    then have "A C Lt A D"
+      by (simp add: Lt_def P6)
+  }
+  thus ?thesis
+    using P1 \<open>Bet B A C \<Longrightarrow> A C Lt A D\<close> assms(3) between_symmetry third_point by blast
+next
+  assume T1: "\<not> Col A B C"
+  have T2: "A \<noteq> D"
+    using T1 assms(1) between_identity col_trivial_1 by auto
+  have T3: "\<not> Col A C D"
+    by (metis Col_def T1 T2 assms(1) col_transitivity_2)
+  have T4: "Bet D B A"
+    using Bet_perm assms(1) by blast
+  have T5: "C D A CongA D C B"
+  proof -
+    have T6: "C D B CongA D C B"
+      by (metis assms(1) assms(2) assms(3) between_trivial conga_comm l11_44_1_a not_conga_sym)
+    have T7: "D Out C C"
+      using T3 not_col_distincts out_trivial by blast
+    have T8: "D Out A B"
+      by (metis assms(1) assms(2) assms(3) bet_out_1 cong_diff l6_6)
+    have T9: "C Out D D"
+      using T7 out_trivial by force
+    have "C Out B B"
+      using T1 not_col_distincts out_trivial by auto
+    thus ?thesis
+      using T6 T7 T8 T9 l11_10 by blast
+  qed
+  have "A D C LtA A C D"
+  proof -
+    have "B InAngle D C A"
+      by (metis InAngle_def T1 T3 T4 not_col_distincts out_trivial)
+    then have "C D A LeA D C A"
+      using T5 LeA_def by auto
+    then have T10: "A D C LeA A C D"
+      by (simp add: lea_comm)
+    have "\<not> A D C CongA A C D"
+      by (metis Col_perm T3 assms(1) assms(2) assms(3) bet_col l11_44_1_b l4_18 not_bet_distincts not_cong_3412)
+    thus ?thesis
+      using LtA_def T10 by blast
+  qed
+  thus ?thesis
+    by (simp add: l11_44_2_b)
+qed
+
+lemma triangle_inequality:
+  assumes "Bet A B D" and
+    "Cong B C B D"
+  shows "A C Le A D"
+proof cases
+  assume "Bet A B C"
+  thus ?thesis
+    by (metis assms(1) assms(2) between_cong_3 cong__le le_reflexivity)
+next
+  assume "\<not> Bet A B C"
+  then have "A C Lt A D"
+    using assms(1) assms(2) triangle_strict_inequality by auto
+  thus ?thesis
+    by (simp add: Lt_def)
+qed
+
+lemma triangle_strict_inequality_2:
+  assumes "Bet A' B' C'" and
+    "Cong A B A' B'" and
+    "Cong B C B' C'" and
+    "\<not> Bet A B C"
+  shows "A C Lt A' C'"
+proof -
+  obtain D where P1: "Bet A B D \<and> Cong B D B C"
+    using segment_construction by blast
+  then have P2: "A C Lt A D"
+    using P1 assms(4) cong_symmetry triangle_strict_inequality by blast
+  have "Cong A D A' C'"
+    using P1 assms(1) assms(2) assms(3) cong_transitivity l2_11_b by blast
+  thus ?thesis
+    using P2 cong2_lt__lt cong_reflexivity by blast
+qed
+
+lemma triangle_inequality_2:
+  assumes "Bet A' B' C'" and
+    "Cong A B A' B'" and
+    "Cong B C B' C'"
+  shows "A C Le A' C'"
+proof -
+  obtain D where P1: "Bet A B D \<and> Cong B D B C"
+    using segment_construction by blast
+  have P2: "A C Le A D"
+    by (meson P1 Tarski_neutral_dimensionless.triangle_inequality Tarski_neutral_dimensionless_axioms not_cong_3412)
+  have "Cong A D A' C'"
+    using P1 assms(1) assms(2) assms(3) cong_transitivity l2_11_b by blast
+  thus ?thesis
+    using P2 cong__le le_transitivity by blast
+qed
+
+lemma triangle_strict_reverse_inequality:
+  assumes "A Out B D" and
+    "Cong A C A D" and
+    "\<not> A Out B C"
+  shows "B D Lt B C"
+proof cases
+  assume "Col A B C"
+  thus ?thesis
+    using assms(1) assms(2) assms(3) col_permutation_4 cong_symmetry not_bet_and_out or_bet_out triangle_strict_inequality by blast
+next
+  assume P1: "\<not> Col A B C"
+  show ?thesis
+  proof cases
+    assume "B = D"
+    thus ?thesis
+      using P1 lt1123 not_col_distincts by auto
+  next
+    assume P2: "B \<noteq> D"
+    then have P3: "\<not> Col B C D"
+      using P1 assms(1) col_trivial_2 colx not_col_permutation_5 out_col by blast
+    have P4: "\<not> Col A C D"
+      using P1 assms(1) col2__eq col_permutation_4 out_col out_distinct by blast
+    have P5: "C \<noteq> D"
+      using assms(1) assms(3) by auto
+    then have P6: "A C D CongA A D C"
+      by (metis P1 assms(2) col_trivial_3 l11_44_1_a)
+    {
+      assume T1: "Bet A B D"
+      then have T2: "Bet D B A"
+        using Bet_perm by blast
+      have "B C D LtA B D C"
+      proof -
+        have T3: "D C B CongA B C D"
+          by (metis P3 conga_pseudo_refl not_col_distincts)
+        have T3A: "D Out B A"
+          by (simp add: P2 T1 bet_out_1)
+        have T3B: "C Out D D"
+          using P5 out_trivial by auto
+        have T3C: "C Out A A"
+          using P1 not_col_distincts out_trivial by blast
+        have "D Out C C"
+          by (simp add: P5 out_trivial)
+        then have T4: "D C A CongA B D C" using T3A T3B T3C
+          by (meson Tarski_neutral_dimensionless.conga_comm Tarski_neutral_dimensionless.conga_right_comm Tarski_neutral_dimensionless.l11_10 Tarski_neutral_dimensionless_axioms P6)
+        have "D C B LtA D C A"
+        proof -
+          have T4A: "D C B LeA D C A"
+          proof -
+            have T4AA: "B InAngle D C A"
+              using InAngle_def P1 P5 T2 not_col_distincts out_trivial by auto
+            have "D C B CongA D C B"
+              using T3 conga_right_comm by blast
+            thus ?thesis
+              by (simp add: T4AA inangle__lea)
+          qed
+          {
+            assume T5: "D C B CongA D C A"
+            have "D C OS B A"
+              using Col_perm P3 T3A out_one_side by blast
+            then have "C Out B A"
+              using T5 conga_os__out by blast
+            then have "False"
+              using Col_cases P1 out_col by blast
+          }
+          then have "\<not> D C B CongA D C A" by auto
+          thus ?thesis
+            using LtA_def T4A by blast
+        qed
+        thus ?thesis
+          using T3 T4 conga_preserves_lta by auto
+      qed
+    }
+    {
+      assume Q1: "Bet B D A"
+      obtain E where Q2: "Bet B C E \<and> Cong B C C E"
+        using Cong_perm segment_construction by blast
+      have "A D C LtA E C D"
+      proof -
+        have Q3: "D C OS A E"
+        proof -
+          have Q4: "D C TS A B"
+            by (metis Col_perm P2 Q1 P4 bet__ts between_symmetry)
+          then have "D C TS E B"
+            by (metis Col_def Q1 Q2 TS_def bet__ts cong_identity invert_two_sides l9_2)
+          thus ?thesis
+            using OS_def Q4 by blast
+        qed
+        have Q5: "A C D LtA E C D"
+        proof -
+          have "D C A LeA D C E"
+          proof -
+            have "B Out D A"
+              using P2 Q1 bet_out by auto
+            then have "B C OS D A"
+              by (simp add: P3 out_one_side)
+            then have "C B OS D A"
+              using invert_one_side by blast
+            then have "C E OS D A"
+              by (metis Col_def Q2 Q3 col124__nos col_one_side diff_col_ex not_col_permutation_5)
+            then have Q5A: "A InAngle D C E"
+              by (simp add: \<open>C E OS D A\<close> Q3 invert_one_side one_side_symmetry os2__inangle)
+            have "D C A CongA D C A"
+              using CongA_def P6 conga_refl by auto
+            thus ?thesis
+              by (simp add: Q5A inangle__lea)
+          qed
+          then have Q6: "A C D LeA E C D"
+            using lea_comm by blast
+          {
+            assume "A C D CongA E C D"
+            then have "D C A CongA D C E"
+              by (simp add: conga_comm)
+            then have "C Out A E"
+              using Q3 conga_os__out by auto
+            then have "False"
+              by (meson Col_def Out_cases P1 Q2 not_col_permutation_3 one_side_not_col123 out_one_side)
+          }
+          then have "\<not> A C D CongA E C D" by auto
+          thus ?thesis
+            by (simp add: LtA_def Q6)
+        qed
+        have "E C D CongA E C D"
+          by (metis P1 P5 Q2 cong_diff conga_refl not_col_distincts)
+        thus ?thesis
+          using Q5 P6 conga_preserves_lta by auto
+      qed
+      then have "B C D LtA B D C"
+        using Bet_cases P1 P2 Q1 Q2 bet2_lta__lta not_col_distincts by blast
+    }
+    then have "B C D LtA B D C"
+      by (meson Out_def \<open>Bet A B D \<Longrightarrow> B C D LtA B D C\<close> assms(1) between_symmetry)
+    thus ?thesis
+      by (simp add: l11_44_2_b)
+  qed
+qed
+
+lemma triangle_reverse_inequality:
+  assumes "A Out B D" and
+    "Cong A C A D"
shows "B D Le B C" +proof cases + assume "A Out B C" + thus ?thesis + by (metis assms(1) assms(2) bet__le1213 cong_pseudo_reflexivity l6_11_uniqueness l6_6 not_bet_distincts not_cong_4312) +next + assume "\ A Out B C" + thus ?thesis + using triangle_strict_reverse_inequality assms(1) assms(2) lt__le by auto +qed + +lemma os3__lta: + assumes "A B OS C D" and + "B C OS A D" and + "A C OS B D" + shows "B A C LtA B D C" +proof - + have P1: "D InAngle A B C" + by (simp add: assms(1) assms(2) invert_one_side os2__inangle) + then obtain E where P2: "Bet A E C \ (E = B \ B Out E D)" + using InAngle_def by auto + have P3: "\ Col A B C" + using assms(1) one_side_not_col123 by auto + have P4: "\ Col A C D" + using assms(3) col124__nos by auto + have P5: "\ Col B C D" + using assms(2) one_side_not_col124 by auto + have P6: "\ Col A B D" + using assms(1) one_side_not_col124 by auto + { + assume "E = B" + then have "B A C LtA B D C" + using P2 P3 bet_col by blast + } + { + assume P7: "B Out E D" + have P8: "A \ E" + using P6 P7 not_col_permutation_4 out_col by blast + have P9: "C \ E" + using P5 P7 out_col by blast + have P10: "B A C LtA B E C" + proof - + have P10A: "\ Col E A B" + by (metis Col_def P2 P3 P8 col_transitivity_1) + then have P10B: "E B A LtA B E C" + using P2 P9 Tarski_neutral_dimensionless.l11_41_aux Tarski_neutral_dimensionless_axioms by fastforce + have P10C: "E A B LtA B E C" + using P2 P9 P10A l11_41 by auto + have P11: "E A B CongA B A C" + proof - + have P11A: "A Out B B" + using assms(2) os_distincts out_trivial by auto + have "A Out C E" + using P2 P8 bet_out l6_6 by auto + thus ?thesis + using P11A conga_right_comm out2__conga by blast + qed + have P12: "B E C CongA B E C" + by (metis Col_def P2 P3 P9 conga_refl) + thus ?thesis + using P11 P10C conga_preserves_lta by auto + qed + have "B E C LtA B D C" + proof - + have U1: "E Out D B" + proof - + obtain pp :: "'p \ 'p \ 'p" where + f1: "\p pa. p \ (pp p pa) \ pa \ (pp p pa) \ Col p pa (pp p pa)" + using diff_col_ex by moura + then have "\p pa pb. Col pb pa p \ \ Col pb pa (pp p pa)" + by (meson l6_16_1) + then have f2: "\p pa. Col pa p pa" + using f1 by metis + have f3: "(E = B \ D = E) \ Col D E B" + using f1 by (metis Col_def P2 col_out2_col l6_16_1 out_trivial) + have "\p. 
(A = E \ Col p A C) \ \ Col p A E" + using Col_def P2 l6_16_1 by blast + thus ?thesis + using f3 f2 by (metis (no_types) Col_def assms(3) not_out_bet one_side_chara one_side_symmetry) + qed + have U2: "D \ E" + using P2 P4 bet_col not_col_permutation_5 by blast + have U3: "\ Col D E C" + by (metis Col_def P2 P4 P9 col_transitivity_1) + have U4: "Bet E D B" + by (simp add: P7 U1 out2__bet) + have "D C E LtA C D B" + using P5 U3 U4 l11_41_aux not_col_distincts by blast + have U5: "D E C LtA C D B" + using P7 U4 U3 l11_41 out_diff2 by auto + have "D E C CongA B E C" + by (simp add: P9 U1 l6_6 out2__conga out_trivial) + thus ?thesis + by (metis U5 conga_preserves_lta conga_pseudo_refl lta_distincts) + qed + then have "B A C LtA B D C" + using P10 lta_trans by blast + } + thus ?thesis + using P2 \E = B \ B A C LtA B D C\ by blast +qed + +lemma bet_le__lt: + assumes "Bet A D B" and + "A \ D" and + "D \ B" and + "A C Le B C" + shows "D C Lt B C" +proof - + have P1: "A \ B" + using assms(1) assms(2) between_identity by blast + have "C D Lt C B" + proof cases + assume P3: "Col A B C" + thus ?thesis + proof cases + assume "Bet C D B" + thus ?thesis + by (simp add: assms(3) bet__lt1213) + next + assume "\ Bet C D B" + then have "D Out C B" + by (metis Out_def P1 P3 assms(1) col_transitivity_2 not_col_permutation_3 not_out_bet out_col) + thus ?thesis + by (smt Le_cases P3 assms(1) assms(2) assms(4) bet2_le2__le bet_le_eq bet_out_1 l6_6 l6_7 nle__lt or_bet_out out2__bet out_bet__out) + qed + next + assume Q0A: "\ Col A B C" + then have Q0B: "\ Col B C D" + by (meson Col_def assms(1) assms(3) col_transitivity_2) + have "C B D LtA C D B" + proof - + have Q1: "B Out C C" + using Q0A not_col_distincts out_trivial by force + have Q2: "B Out A D" + using Out_cases assms(1) assms(3) bet_out_1 by blast + have Q3: "A Out C C" + by (metis Q0A col_trivial_3 out_trivial) + have Q4: "A Out B B" + using P1 out_trivial by auto + have "C B A LeA C A B" + using Col_perm Le_cases Q0A assms(4) l11_44_2bis by blast + then have T9: "C B D LeA C A B" + using Q1 Q2 Q3 Q4 lea_out4__lea by blast + have "C A B LtA C D B" + proof - + have U2: "\ Col D A C" + using Q0B assms(1) assms(2) bet_col col_transitivity_2 not_col_permutation_3 not_col_permutation_4 by blast + have U3: "D \ C" + using Q0B col_trivial_2 by blast + have U4: "D C A LtA C D B" + using U2 assms(1) assms(3) l11_41_aux by auto + have U5: "D A C LtA C D B" + by (simp add: U2 assms(1) assms(3) l11_41) + have "A Out B D" + using Out_def P1 assms(1) assms(2) by auto + then have "D A C CongA C A B" + using Q3 conga_right_comm out2__conga by blast + thus ?thesis + by (metis U5 U3 assms(3) conga_preserves_lta conga_refl) + qed + thus ?thesis + using T9 lea123456_lta__lta by blast + qed + thus ?thesis + by (simp add: l11_44_2_b) + qed + thus ?thesis + using Lt_cases by auto +qed + +lemma cong2__ncol: + assumes "A \ B" and + "B \ C" and + "A \ C" and + "Cong A P B P" and + "Cong A P C P" + shows "\ Col A B C" +proof - + have "Cong B P C P" + using assms(4) assms(5) cong_inner_transitivity by blast + thus ?thesis using bet_le__lt + by (metis assms(1) assms(2) assms(3) assms(4) assms(5) cong__le cong__nlt lt__nle not_col_permutation_5 third_point) +qed + +lemma cong4_cop2__eq: + assumes "A \ B" and + "B \ C" and + "A \ C" and + "Cong A P B P" and + "Cong A P C P" and + "Coplanar A B C P" and + "Cong A Q B Q" and + "Cong A Q C Q" and + "Coplanar A B C Q" + shows "P = Q" +proof - + have P1: "\ Col A B C" + using assms(1) assms(2) assms(3) assms(4) assms(5) cong2__ncol by auto 
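+  (* Informal reading of this lemma: in the plane of three pairwise distinct,
+     non-collinear points A, B, C there is at most one point equidistant from
+     all three of them.  The rest of the proof assumes P \<noteq> Q, observes that every
+     point of the line P Q is then equidistant from A and B and from A and C,
+     and derives the contradiction Col A B C. *)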
+ { + assume P2: "P \ Q" + have P3: "\ R. Col P Q R \ (Cong A R B R \ Cong A R C R)" + using P2 assms(4) assms(5) assms(7) assms(8) l4_17 not_cong_4321 by blast + obtain D where P4: "D Midpoint A B" + using midpoint_existence by auto + have P5: "Coplanar A B C D" + using P4 coplanar_perm_9 midpoint__coplanar by blast + have P6: "Col P Q D" + proof - + have P6A: "Coplanar P Q D A" + using P1 P5 assms(6) assms(9) coplanar_pseudo_trans ncop_distincts by blast + then have P6B: "Coplanar P Q D B" + by (metis P4 col_cop__cop midpoint_col midpoint_distinct_1) + have P6D: "Cong P A P B" + using assms(4) not_cong_2143 by blast + have P6E: "Cong Q A Q B" + using assms(7) not_cong_2143 by blast + have "Cong D A D B" + using Midpoint_def P4 not_cong_2134 by blast + thus ?thesis using cong3_cop2__col P6A P6B assms(1) P6D P6E by blast + qed + obtain R1 where P7: "P \ R1 \ Q \ R1 \ D \ R1 \ Col P Q R1" + using P6 diff_col_ex3 by blast + obtain R2 where P8: "Bet R1 D R2 \ Cong D R2 R1 D" + using segment_construction by blast + have P9: "Col P Q R2" + by (metis P6 P7 P8 bet_col colx) + have P9A: "Cong R1 A R1 B" + using P3 P7 not_cong_2143 by blast + then have "Per R1 D A" + using P4 Per_def by auto + then have "Per A D R1" using l8_2 by blast + have P10: "Cong A R1 A R2" + proof - + have f1: "Bet R2 D R1 \Bet R1 R2 D" + by (metis (full_types) Col_def P7 P8 between_equality not_col_permutation_5) + have f2: "Cong B R1 A R1" + using Cong_perm \Cong R1 A R1 B\ by blast + have "Cong B R1 A R2 \ Bet R1 R2 D" + using f1 Cong_perm Midpoint_def P4 P8 l7_13 by blast + thus ?thesis + using f2 P8 bet_cong_eq cong_inner_transitivity by blast + qed + have "Col A B C" + proof - + have P11: "Cong B R1 B R2" + by (metis Cong_perm P10 P3 P9 P9A cong_inner_transitivity) + have P12: "Cong C R1 C R2" + using P10 P3 P7 P9 cong_inner_transitivity by blast + have P12A: "Coplanar A B C R1" + using P2 P7 assms(6) assms(9) col_cop2__cop by blast + have P12B: "Coplanar A B C R2" + using P2 P9 assms(6) assms(9) col_cop2__cop by blast + have "R1 \ R2" + using P7 P8 between_identity by blast + thus ?thesis + using P10 P11 P12A P12B P12 cong3_cop2__col by blast + qed + then have False + by (simp add: P1) + } + thus ?thesis by auto +qed + +lemma t18_18_aux: + assumes "Cong A B D E" and + "Cong A C D F" and + "F D E LtA C A B" and + "\ Col A B C" and + "\ Col D E F" and + "D F Le D E" + shows "E F Lt B C" +proof - + obtain G0 where P1: "C A B CongA F D G0 \ F D OS G0 E" + using angle_construction_1 assms(4) assms(5) not_col_permutation_2 by blast + then have P2: "\ Col F D G0" + using col123__nos by auto + then obtain G where P3: "D Out G0 G \ Cong D G A B" + by (metis assms(4) bet_col between_trivial2 col_trivial_2 segment_construction_3) + have P4: "C A B CongA F D G" + proof - + have P4B: "A Out C C" + by (metis assms(4) col_trivial_3 out_trivial) + have P4C: "A Out B B" + by (metis assms(4) col_trivial_1 out_trivial) + have P4D: "D Out F F" + using P2 not_col_distincts out_trivial by blast + have "D Out G G0" + by (simp add: P3 l6_6) + thus ?thesis using P1 P4B P4C P4D + using l11_10 by blast + qed + have "D Out G G0" + by (simp add: P3 l6_6) + then have "D F OS G G0" + using P2 not_col_permutation_4 out_one_side by blast + then have "F D OS G G0" + by (simp add: invert_one_side) + then have P5: "F D OS G E" + using P1 one_side_transitivity by blast + have P6: "\ Col F D G" + by (meson P5 one_side_not_col123) + have P7: "Cong C B F G" + using P3 P4 assms(2) cong2_conga_cong cong_commutativity cong_symmetry by blast + have P8: "F E Lt 
F G" + proof - + have P9: "F D E LtA F D G" + by (metis P4 assms(3) assms(5) col_trivial_1 col_trivial_3 conga_preserves_lta conga_refl) + have P10: "Cong D G D E" + using P3 assms(1) cong_transitivity by blast + { + assume P11: "Col E F G" + have P12: "F D E LeA F D G" + by (simp add: P9 lta__lea) + have P13: "\ F D E CongA F D G" + using P9 not_lta_and_conga by blast + have "F D E CongA F D G" + proof - + have "F Out E G" + using Col_cases P11 P5 col_one_side_out l6_6 by blast + then have "E F D CongA G F D" + by (metis assms(5) conga_os__out conga_refl l6_6 not_col_distincts one_side_reflexivity out2__conga) + thus ?thesis + by (meson P10 assms(2) assms(6) cong_4321 cong_inner_transitivity l11_52 le_comm) + qed + then have "False" + using P13 by blast + } + then have P15: "\ Col E F G" by auto + { + assume P20: "Col D E G" + have P21: "F D E LeA F D G" + by (simp add: P9 lta__lea) + have P22: "\ F D E CongA F D G" + using P9 not_lta_and_conga by blast + have "F D E CongA F D G" + proof - + have "D Out E G" + by (meson Out_cases P5 P20 col_one_side_out invert_one_side not_col_permutation_5) + thus ?thesis + using P10 P15 \D Out G G0\ cong_inner_transitivity l6_11_uniqueness l6_7 not_col_distincts by blast + qed + then have "False" + by (simp add: P22) + } + then have P16: "\ Col D E G" by auto + have P17: "E InAngle F D G" + using lea_in_angle by (simp add: P5 P9 lta__lea) + then obtain H where P18: "Bet F H G \ (H = D \ D Out H E)" + using InAngle_def by auto + { + assume "H = D" + then have "F G E LtA F E G" + using P18 P6 bet_col by blast + } + { + assume P19: "D Out H E" + have P20: "H \ F" + using Out_cases P19 assms(5) out_col by blast + have P21: "H \ G" + using P19 P16 l6_6 out_col by blast + have "F D Le G D" + using P10 assms(6) cong_pseudo_reflexivity l5_6 not_cong_4312 by blast + then have "H D Lt G D" + using P18 P20 P21 bet_le__lt by blast + then have P22: "D H Lt D E" + using Lt_cases P10 cong2_lt__lt cong_reflexivity by blast + then have P23: "D H Le D E \ \ Cong D H D E" + using Lt_def by blast + have P24: "H \ E" + using P23 cong_reflexivity by blast + have P25: "Bet D H E" + by (simp add: P19 P23 l6_13_1) + have P26: "E G OS F D" + by (metis InAngle_def P15 P16 P18 P25 bet_out_1 between_symmetry in_angle_one_side not_col_distincts not_col_permutation_1) + have "F G E LtA F E G" + proof - + have P27: "F G E LtA D E G" + proof - + have P28: "D G E CongA D E G" + by (metis P10 P16 l11_44_1_a not_col_distincts) + have "F G E LtA D G E" + proof - + have P29: "F G E LeA D G E" + by (metis OS_def P17 P26 P5 TS_def in_angle_one_side inangle__lea_1 invert_one_side l11_24 os2__inangle) + { + assume "F G E CongA D G E" + then have "E G F CongA E G D" + by (simp add: conga_comm) + then have "G Out F D" + using P26 conga_os__out by auto + then have "False" + using P6 not_col_permutation_2 out_col by blast + } + then have "\ F G E CongA D G E" by auto + thus ?thesis + by (simp add: LtA_def P29) + qed + thus ?thesis + by (metis P28 P6 col_trivial_3 conga_preserves_lta conga_refl) + qed + have "G E D LtA G E F" + proof - + have P30: "G E D LeA G E F" + proof - + have P31: "D InAngle G E F" + by (simp add: P16 P17 P26 assms(5) in_angle_two_sides l11_24 not_col_permutation_5 os_ts__inangle) + have "G E D CongA G E D" + by (metis P16 col_trivial_1 col_trivial_2 conga_refl) + thus ?thesis + using P31 inangle__lea by auto + qed + have "\ G E D CongA G E F" + by (metis OS_def P26 P5 TS_def conga_os__out invert_one_side out_col) + thus ?thesis + by (simp add: LtA_def P30) + qed + then have 
"D E G LtA F E G" + using lta_comm by blast + thus ?thesis + using P27 lta_trans by blast + qed + } + then have "F G E LtA F E G" + using P18 \H = D \ F G E LtA F E G\ by blast + thus ?thesis + by (simp add: l11_44_2_b) + qed + then have "E F Lt F G" + using lt_left_comm by blast + thus ?thesis + using P7 cong2_lt__lt cong_pseudo_reflexivity not_cong_4312 by blast +qed + +lemma t18_18: + assumes "Cong A B D E" and + "Cong A C D F" and + "F D E LtA C A B" + shows "E F Lt B C" +proof - + have P1: "F \ D" + using assms(3) lta_distincts by blast + have P2: "E \ D" + using assms(3) lta_distincts by blast + have P3: "C \ A" + using assms(3) lta_distincts by auto + have P4: "B \ A" + using assms(3) lta_distincts by blast + { + assume P6: "Col A B C" + { + assume P7: "Bet B A C" + obtain C' where P8:"Bet E D C' \ Cong D C' A C" + using segment_construction by blast + have P9: "Cong E F E F" + by (simp add: cong_reflexivity) + have P10: "Cong E C' B C" + using P7 P8 assms(1) l2_11_b not_cong_4321 by blast + have "E F Lt E C'" + proof - + have P11: "Cong D F D C'" + using P8 assms(2) cong_transitivity not_cong_3412 by blast + have "\ Bet E D F" + using Bet_perm Col_def assms(3) col_lta__out not_bet_and_out by blast + thus ?thesis + using P11 P8 triangle_strict_inequality by blast + qed + then have "E F Lt B C" + using P9 P10 cong2_lt__lt by blast + } + { + assume "\ Bet B A C" + then have "E F Lt B C" + using P6 assms(3) between_symmetry col_lta__bet col_permutation_2 by blast + } + then have "E F Lt B C" + using \Bet B A C \ E F Lt B C\ by auto + } + { + assume P12: "\ Col A B C" + { + assume P13: "Col D E F" + { + assume P14: "Bet F D E" + then have "C A B LeA F D E" + by (simp add: P1 P2 P3 P4 l11_31_2) + then have "F D E LtA F D E" + using assms(3) lea__nlta by auto + then have "False" + by (simp add: nlta) + then have "E F Lt B C" by auto + } + { + assume "\ Bet F D E" + then have P16: "D Out F E" + using P13 not_col_permutation_1 not_out_bet by blast + obtain F' where P17: "A Out B F' \ Cong A F' A C" + using P3 P4 segment_construction_3 by fastforce + then have P18: "B F' Lt B C" + by (meson P12 Tarski_neutral_dimensionless.triangle_strict_reverse_inequality Tarski_neutral_dimensionless_axioms not_cong_3412 out_col) + have "Cong B F' E F" + by (meson Out_cases P16 P17 assms(1) assms(2) cong_transitivity out_cong_cong) + then have "E F Lt B C" + using P18 cong2_lt__lt cong_reflexivity by blast + } + then have "E F Lt B C" + using \Bet F D E \ E F Lt B C\ by blast + } + { + assume P20: "\ Col D E F" + { + assume "D F Le D E" + then have "E F Lt B C" + by (meson P12 Tarski_neutral_dimensionless.t18_18_aux Tarski_neutral_dimensionless_axioms P20 assms(1) assms(2) assms(3)) + } + { + assume "D E Le D F" + then have "E F Lt B C" + by (meson P12 P20 Tarski_neutral_dimensionless.lta_comm Tarski_neutral_dimensionless.t18_18_aux Tarski_neutral_dimensionless_axioms assms(1) assms(2) assms(3) lt_comm not_col_permutation_5) + } + then have "E F Lt B C" + using \D F Le D E \ E F Lt B C\ local.le_cases by blast + } + then have "E F Lt B C" + using \Col D E F \ E F Lt B C\ by blast + } + thus ?thesis + using \Col A B C \ E F Lt B C\ by auto +qed + +lemma t18_19: + assumes "A \ B" and + "A \ C" and + "Cong A B D E" and + "Cong A C D F" and + "E F Lt B C" + shows "F D E LtA C A B" +proof - + { + assume P1: "C A B LeA F D E" + { + assume "C A B CongA F D E" + then have "False" + using Cong_perm assms(3) assms(4) assms(5) cong__nlt l11_49 by blast + } + { + assume P2: "\ C A B CongA F D E" + then have "C A B LtA 
F E D" + by (metis P1 assms(3) assms(4) assms(5) cong_symmetry lea_distincts lta__nlea not_and_lt or_lta2_conga t18_18) + then have "B C Lt E F" + by (metis P1 P2 assms(3) assms(4) cong_symmetry lta__nlea lta_distincts or_lta2_conga t18_18) + then have "False" + using assms(5) not_and_lt by auto + } + then have "False" + using \C A B CongA F D E \ False\ by auto + } + then have "\ C A B LeA F D E" by auto + thus ?thesis + using assms(1) assms(2) assms(3) assms(4) cong_identity nlea__lta by blast +qed + +lemma acute_trivial: + assumes "A \ B" + shows "Acute A B A" + by (metis Tarski_neutral_dimensionless.acute_distincts Tarski_neutral_dimensionless_axioms angle_partition assms l11_43) + +lemma acute_not_per: + assumes "Acute A B C" + shows "\ Per A B C" +proof - + obtain A' B' C' where P1: "Per A' B' C' \ A B C LtA A' B' C'" + using Acute_def assms by auto + thus ?thesis + using acute_distincts acute_per__lta assms nlta by fastforce +qed + +lemma angle_bisector: + assumes "A \ B" and + "C \ B" + shows "\ P. (P InAngle A B C \ P B A CongA P B C)" +proof cases + assume P1: "Col A B C" + thus ?thesis + proof cases + assume P2: "Bet A B C" + then obtain Q where P3: "\ Col A B Q" + using assms(1) not_col_exists by auto + then obtain P where P4: "A B Perp P B \ A B OS Q P" + using P1 l10_15 os_distincts by blast + then have P5: "P InAngle A B C" + by (metis P2 assms(2) in_angle_line os_distincts) + have "P B A CongA P B C" + proof - + have P9: "P \ B" + using P4 os_distincts by blast + have "Per P B A" + by (simp add: P4 Perp_perm Tarski_neutral_dimensionless.perp_per_2 Tarski_neutral_dimensionless_axioms) + thus ?thesis + using P2 assms(1) assms(2) P9 l11_18_1 by auto + qed + thus ?thesis + using P5 by auto + next + assume T1: "\ Bet A B C" + then have T2: "B Out A C" + by (simp add: P1 l6_4_2) + have T3: "C InAngle A B C" + by (simp add: assms(1) assms(2) inangle3123) + have "C B A CongA C B C" + using T2 between_trivial2 l6_6 out2__conga out2_bet_out by blast + thus ?thesis + using T3 by auto + qed +next + assume T4: "\ Col A B C" + obtain C0 where T5: "B Out C0 C \ Cong B C0 B A" + using assms(1) assms(2) l6_11_existence by fastforce + obtain P where T6: "P Midpoint A C0" + using midpoint_existence by auto + have T6A: "\ Col A B C0" + by (metis T4 T5 col3 l6_3_1 not_col_distincts out_col) + have T6B: "P \ B" + using Col_def Midpoint_def T6 T6A by auto + have T6D: "P \ A" + using T6 T6A is_midpoint_id not_col_distincts by blast + have "P InAngle A B C0" + using InAngle_def T5 T6 T6B assms(1) l6_3_1 midpoint_bet out_trivial by fastforce + then have T7: "P InAngle A B C" + using T5 T6B in_angle_trans2 l11_24 out341__inangle by blast + have T8: "(P = B) \ B Out P P" + using out_trivial by auto + have T9: "B Out A A" + by (simp add: assms(1) out_trivial) + { + assume T9A: "B Out P P" + have "P B A CongA P B C0 \ B P A CongA B P C0 \ P A B CongA P C0 B" + proof - + have T9B: "Cong B P B P" + by (simp add: cong_reflexivity) + have T9C: "Cong B A B C0" + using Cong_perm T5 by blast + have "Cong P A P C0" + using Midpoint_def T6 not_cong_2134 by blast + thus ?thesis using l11_51 T6B assms(1) T9B T9C T6D by presburger + qed + then have "P B A CongA P B C0" by auto + then have "P B A CongA P B C" using l11_10 T9A T9 + by (meson T5 l6_6) + then have "\ P. 
(P InAngle A B C \ P B A CongA P B C)" + using T7 by auto + } + thus ?thesis + using T6B T8 by blast +qed + +lemma reflectl__conga: + assumes "A \ B" and + "B \ P" and + "P P' ReflectL A B" + shows "A B P CongA A B P'" +proof - + obtain A' where P1: "A' Midpoint P' P \ Col A B A' \ (A B Perp P' P \ P = P')" + using ReflectL_def assms(3) by auto + { + assume P2: "A B Perp P' P" + then have P3: "P \ P'" + using perp_not_eq_2 by blast + then have P4: "A' \ P'" + using P1 is_midpoint_id by blast + have P5: "A' \ P" + using P1 P3 is_midpoint_id_2 by auto + have "A B P CongA A B P'" + proof cases + assume P6: "A' = B" + then have P8: "B \ P'" + using P4 by auto + have P9: "Per A B P" + by (smt P1 P3 P6 Perp_cases col_transitivity_2 midpoint_col midpoint_distinct_1 not_col_permutation_2 perp_col2_bis perp_per_2) + have "Per A B P'" + by (smt Mid_cases P1 P2 P6 P8 assms(1) col_trivial_3 midpoint_col not_col_permutation_3 perp_col4 perp_per_2) + thus ?thesis + using l11_16 P4 P5 P6 P9 assms(1) by auto + next + assume T1: "A' \ B" + have T2: "B A' P CongA B A' P'" + proof - + have T2A: "Cong B P B P'" + using assms(3) col_trivial_2 is_image_spec_col_cong l10_4_spec not_cong_4321 by blast + then have T2B: "A' B P CongA A' B P'" + by (metis Cong_perm Midpoint_def P1 P5 T1 Tarski_neutral_dimensionless.l11_51 Tarski_neutral_dimensionless_axioms assms(2) cong_reflexivity) + have "A' P B CongA A' P' B" + by (simp add: P5 T2A T2B cong_reflexivity conga_comm l11_49) + thus ?thesis + using P5 T2A T2B cong_reflexivity l11_49 by blast + qed + have T3: "Cong A' B A' B" + by (simp add: cong_reflexivity) + have "Cong A' P A' P'" + using Midpoint_def P1 not_cong_4312 by blast + then have T4: "A' B P CongA A' B P' \ A' P B CongA A' P' B" using l11_49 + using assms(2) T2 T3 by blast + show ?thesis + proof cases + assume "Bet A' B A" + thus ?thesis + using T4 assms(1) l11_13 by blast + next + assume "\ Bet A' B A" + then have T5: "B Out A' A" + using P1 not_col_permutation_3 or_bet_out by blast + have T6: "B \ P'" + using T4 conga_distinct by blast + have T8: "B Out A A'" + by (simp add: T5 l6_6) + have T9: "B Out P P" + using assms(2) out_trivial by auto + have "B Out P' P'" + using T6 out_trivial by auto + thus ?thesis + using l11_10 T4 T8 T9 by blast + qed + qed + } + { + assume "P = P'" + then have "A B P CongA A B P'" + using assms(1) assms(2) conga_refl by auto + } + thus ?thesis + using P1 \A B Perp P' P \ A B P CongA A B P'\ by blast +qed + +lemma conga_cop_out_reflectl__out: + assumes "\ B Out A C" and + "Coplanar A B C P" and + "P B A CongA P B C" and + "B Out A T" and + "T T' ReflectL B P" + shows "B Out C T'" +proof - + have P1: "P B T CongA P B T'" + by (metis assms(3) assms(4) assms(5) conga_distinct is_image_spec_rev out_distinct reflectl__conga) + have P2: "T T' Reflect B P" + by (metis P1 assms(5) conga_distinct is_image_is_image_spec) + have P3: "B \ T'" + using CongA_def P1 by blast + have P4: "P B C CongA P B T'" + proof - + have P5: "P B C CongA P B A" + by (simp add: assms(3) conga_sym) + have "P B A CongA P B T'" + proof - + have P7: "B Out P P" + using assms(3) conga_diff45 out_trivial by blast + have P8: "B Out A T" + by (simp add: assms(4)) + have "B Out T' T'" + using P3 out_trivial by auto + thus ?thesis + using P1 P7 P8 l11_10 by blast + qed + thus ?thesis + using P5 not_conga by blast + qed + have "P B OS C T'" + proof - + have P9: "P B TS A C" + using assms(1) assms(2) assms(3) conga_cop__or_out_ts coplanar_perm_20 by blast + then have "T \ T'" + by (metis Col_perm P2 P3 TS_def assms(4) 
col_transitivity_2 l10_8 out_col) + then have "P B TS T T'" + by (metis P2 P4 conga_diff45 invert_two_sides l10_14) + then have "P B TS A T'" + using assms(4) col_trivial_2 out_two_sides_two_sides by blast + thus ?thesis + using OS_def P9 l9_2 by blast + qed + thus ?thesis + using P4 conga_os__out by auto +qed + +lemma col_conga_cop_reflectl__col: + assumes "\ B Out A C" and + "Coplanar A B C P" and + "P B A CongA P B C" and + "Col B A T" and + "T T' ReflectL B P" + shows "Col B C T'" +proof cases + assume "B = T" + thus ?thesis + using assms(5) col_image_spec__eq not_col_distincts by blast +next + assume P1: "B \ T" + thus ?thesis + proof cases + assume "B Out A T" + thus ?thesis + using out_col conga_cop_out_reflectl__out assms(1) assms(2) assms(3) assms(5) by blast + next + assume P2: "\ B Out A T" + obtain A' where P3: "Bet A B A' \ Cong B A' A B" + using segment_construction by blast + obtain C' where P4: "Bet C B C' \ Cong B C' C B" + using segment_construction by blast + have P5: "B Out C' T'" + proof - + have P6: "\ B Out A' C'" + by (metis P3 P4 assms(1) between_symmetry cong_diff_2 l6_2 out_diff1 out_diff2) + have P7: "Coplanar A' B C' P" + proof cases + assume "Col A B C" + thus ?thesis + by (smt P3 P4 assms(1) assms(2) assms(3) bet_col bet_neq32__neq col2_cop__cop col_transitivity_1 colx conga_diff2 conga_diff56 l6_4_2 ncoplanar_perm_15 not_col_permutation_5) + next + assume P7B: "\ Col A B C" + have P7C: "Coplanar A B C A'" + using P3 bet_col ncop__ncols by blast + have P7D: "Coplanar A B C B" + using ncop_distincts by blast + have "Coplanar A B C C'" + using P4 bet__coplanar coplanar_perm_20 by blast + thus ?thesis + using P7B P7C P7D assms(2) coplanar_pseudo_trans by blast + qed + have P8: "P B A' CongA P B C'" + by (metis CongA_def P3 P4 assms(3) cong_reverse_identity conga_left_comm l11_13 not_conga_sym) + have P9: "B Out A' T" + by (smt Out_def P1 P2 P3 P8 assms(3) assms(4) conga_distinct l5_2 l6_4_2 not_col_permutation_4) + thus ?thesis + using P6 P7 P8 P9 assms(5) conga_cop_out_reflectl__out by blast + qed + thus ?thesis + by (metis Col_def P4 col_transitivity_1 out_col out_diff1) + qed +qed + +lemma conga2_cop2__col: + assumes "\ B Out A C" and + "P B A CongA P B C" and + "P' B A CongA P' B C" and + "Coplanar A B P P'" and + "Coplanar B C P P'" + shows "Col B P P'" +proof - + obtain C' where P1: "B Out C' C \ Cong B C' B A" + by (metis assms(2) conga_distinct l6_11_existence) + have P1A: "Cong P A P C' \ (P \ A \ (B P A CongA B P C' \ B A P CongA B C' P))" + proof - + have P2: "P B A CongA P B C'" + proof - + have P2A: "B Out P P" + using assms(2) conga_diff45 out_trivial by auto + have "B Out A A" + using assms(2) conga_distinct out_trivial by auto + thus ?thesis + using P1 P2A assms(2) l11_10 by blast + qed + have P3: "Cong B P B P" + by (simp add: cong_reflexivity) + have "Cong B A B C'" + using Cong_perm P1 by blast + thus ?thesis using l11_49 P2 cong_reflexivity by blast + qed + have P4: "P' B A CongA P' B C'" + proof - + have P4A: "B Out P' P'" + using assms(3) conga_diff1 out_trivial by auto + have "B Out A A" + using assms(2) conga_distinct out_trivial by auto + thus ?thesis + using P1 P4A assms(3) l11_10 by blast + qed + have P5: "Cong B P' B P'" + by (simp add: cong_reflexivity) + have P5A: "Cong B A B C'" + using Cong_perm P1 by blast + then have P6: "P' \ A \ (B P' A CongA B P' C' \ B A P' CongA B C' P')" + using P4 P5 l11_49 by blast + have P7: "Coplanar B P P' A" + using assms(4) ncoplanar_perm_18 by blast + have P8: "Coplanar B P P' C'" + by (smt 
Col_cases P1 assms(5) col_cop__cop ncoplanar_perm_16 ncoplanar_perm_8 out_col out_diff2) + have "A \ C'" + using P1 assms(1) by auto + thus ?thesis + using P4 P5 P7 P8 P5A P1A cong3_cop2__col l11_49 by blast +qed + +lemma conga2_cop2__col_1: + assumes "\ Col A B C" and + "P B A CongA P B C" and + "P' B A CongA P' B C" and + "Coplanar A B C P" and + "Coplanar A B C P'" + shows "Col B P P'" +proof - + have P1: "\ B Out A C" + using Col_cases assms(1) out_col by blast + have P2: "Coplanar A B P P'" + by (meson assms(1) assms(4) assms(5) coplanar_perm_12 coplanar_trans_1 not_col_permutation_2) + have "Coplanar B C P P'" + using assms(1) assms(4) assms(5) coplanar_trans_1 by auto + thus ?thesis using P1 P2 conga2_cop2__col assms(2) assms(3) conga2_cop2__col by auto +qed + +lemma col_conga__conga: + assumes "P B A CongA P B C" and + "Col B P P'" and + "B \ P'" + shows "P' B A CongA P' B C" +proof cases + assume "Bet P B P'" + thus ?thesis + using assms(1) assms(3) l11_13 by blast +next + assume "\ Bet P B P'" + then have P1: "B Out P P'" + using Col_cases assms(2) or_bet_out by blast + then have P2: "B Out P' P" + by (simp add: l6_6) + have P3: "B Out A A" + using CongA_def assms(1) out_trivial by auto + have "B Out C C" + using assms(1) conga_diff56 out_trivial by blast + thus ?thesis + using P2 P3 assms(1) l11_10 by blast +qed + +lemma cop_inangle__ex_col_inangle: + assumes "\ B Out A C" and + "P InAngle A B C" and + "Coplanar A B C Q" + shows "\ R. (R InAngle A B C \ P \ R \ Col P Q R)" +proof - + have P1: "A \ B" + using assms(2) inangle_distincts by blast + then have P4: "A \ C" + using assms(1) out_trivial by blast + have P2: "C \ B" + using assms(2) inangle_distincts by auto + have P3: "P \ B" + using InAngle_def assms(2) by auto + thus ?thesis + proof cases + assume "P = Q" + thus ?thesis + using P1 P2 P4 col_trivial_1 inangle1123 inangle3123 by blast + next + assume P5: "P \ Q" + thus ?thesis + proof cases + assume P6: "Col B P Q" + obtain R where P7: "Bet B P R \ Cong P R B P" + using segment_construction by blast + have P8: "R InAngle A B C" + using Out_cases P1 P2 P3 P7 assms(2) bet_out l11_25 out_trivial by blast + have "P \ R" + using P3 P7 cong_reverse_identity by blast + thus ?thesis + by (metis P3 P6 P7 P8 bet_col col_transitivity_2) + next + assume T1: "\ Col B P Q" + thus ?thesis + proof cases + assume T2: "Col A B C" + have T3: "Q InAngle A B C" + by (metis P1 P2 T1 T2 assms(1) in_angle_line l6_4_2 not_col_distincts) + thus ?thesis + using P5 col_trivial_2 by blast + next + assume Q1: "\ Col A B C" + thus ?thesis + proof cases + assume Q2: "Col B C P" + have Q3: "\ Col B A P" + using Col_perm P3 Q1 Q2 col_transitivity_2 by blast + have Q4: "Coplanar B P Q A" + using P2 Q2 assms(3) col2_cop__cop col_trivial_3 ncoplanar_perm_22 ncoplanar_perm_3 by blast + have Q5: "Q \ P" + using P5 by auto + have Q6: "Col B P P" + using not_col_distincts by blast + have Q7: "Col Q P P" + using not_col_distincts by auto + have "\ Col B P A" + using Col_cases Q3 by auto + then obtain Q0 where P10: "Col Q P Q0 \ B P OS A Q0" + using cop_not_par_same_side Q4 Q5 Q6 Q7 T1 by blast + have P13: "P \ Q0" + using P10 os_distincts by auto + { + assume "B A OS P Q0" + then have ?thesis + using P10 P13 assms(2) in_angle_trans not_col_permutation_4 os2__inangle by blast + } + { + assume V1: "\ B A OS P Q0" + have "\ R. 
Bet P R Q0 \ Col P Q R \ Col B A R" + proof cases + assume V3: "Col B A Q0" + have "Col P Q Q0" + using Col_cases P10 by auto + thus ?thesis + using V3 between_trivial by auto + next + assume V4: "\ Col B A Q0" + then have V5: "\ Col Q0 B A" + using Col_perm by blast + have "\ Col P B A" + using Col_cases Q3 by blast + then obtain R where V8: "Col R B A \ Bet P R Q0" + using cop_nos__ts V1 V5 + by (meson P10 TS_def ncoplanar_perm_2 os__coplanar) + thus ?thesis + by (metis Col_def P10 P13 col_transitivity_2) + qed + then obtain R where V9: "Bet P R Q0 \ Col P Q R \ Col B A R" by auto + have V10: "P \ R" + using Q3 V9 by blast + have "R InAngle A B C" + proof - + have W1: "\ Col B P Q0" + using P10 P13 T1 col2__eq by blast + have "P Out Q0 R" + using V10 V9 bet_out l6_6 by auto + then have "B P OS Q0 R" + using Q6 W1 out_one_side_1 by blast + then have "B P OS A R" + using P10 one_side_transitivity by blast + then have "B Out A R" + using V9 col_one_side_out by auto + thus ?thesis + by (simp add: P2 out321__inangle) + qed + then have ?thesis + using V10 V9 by blast + } + thus ?thesis + using \B A OS P Q0 \ \R. R InAngle A B C \ P \ R \ Col P Q R\ by blast + next + assume Z1: "\ Col B C P" + then have Z6: "\ Col B P C" + by (simp add: not_col_permutation_5) + have Z3: "Col B P P" + by (simp add: col_trivial_2) + have Z4: "Col Q P P" + by (simp add: col_trivial_2) + have "Coplanar A B C P" + using Q1 assms(2) inangle__coplanar ncoplanar_perm_18 by blast + then have "Coplanar B P Q C" + using Q1 assms(3) coplanar_trans_1 ncoplanar_perm_5 by blast + then obtain Q0 where Z5: "Col Q P Q0 \ B P OS C Q0" + using cop_not_par_same_side by (metis Z3 Z4 T1 Z6) + thus ?thesis + proof cases + assume "B C OS P Q0" + thus ?thesis + proof - + have "\p. p InAngle C B A \ \ p InAngle C B P" + using assms(2) in_angle_trans l11_24 by blast + then show ?thesis + by (metis Col_perm Z5 \B C OS P Q0\ l11_24 os2__inangle os_distincts) + qed + next + assume Z6: "\ B C OS P Q0" + have Z7: "\ R. Bet P R Q0 \ Col P Q R \ Col B C R" + proof cases + assume "Col B C Q0" + thus ?thesis + using Col_def Col_perm Z5 between_trivial by blast + next + assume Z8: "\ Col B C Q0" + have "\ R. 
Col R B C \ Bet P R Q0" + proof - + have Z10: "Coplanar B C P Q0" + using Z5 ncoplanar_perm_2 os__coplanar by blast + have Z11: "\ Col P B C" + using Col_cases Z1 by blast + have "\ Col Q0 B C" + using Col_perm Z8 by blast + thus ?thesis + using cop_nos__ts Z6 Z10 Z11 by (simp add: TS_def) + qed + then obtain R where "Col R B C \ Bet P R Q0" by blast + thus ?thesis + by (smt Z5 bet_col col2__eq col_permutation_1 os_distincts) + qed + then obtain R where Z12: "Bet P R Q0 \ Col P Q R \ Col B C R" by blast + have Z13: "P \ R" + using Z1 Z12 by auto + have Z14: "\ Col B P Q0" + using Z5 one_side_not_col124 by blast + have "P Out Q0 R" + using Z12 Z13 bet_out l6_6 by auto + then have "B P OS Q0 R" + using Z14 Z3 out_one_side_1 by blast + then have "B P OS C R" + using Z5 one_side_transitivity by blast + then have "B Out C R" + using Z12 col_one_side_out by blast + then have "R InAngle A B C" + using P1 out341__inangle by auto + thus ?thesis + using Z12 Z13 by auto + qed + qed + qed + qed + qed +qed + +lemma col_inangle2__out: + assumes "\ Bet A B C" and + "P InAngle A B C" and + "Q InAngle A B C" and + "Col B P Q" + shows "B Out P Q" +proof cases + assume "Col A B C" + thus ?thesis + by (meson assms(1) assms(2) assms(3) assms(4) bet_in_angle_bet bet_out__bet in_angle_out l6_6 not_col_permutation_4 or_bet_out) +next + assume P1: "\ Col A B C" + thus ?thesis + proof cases + assume "Col B A P" + thus ?thesis + by (meson assms(1) assms(2) assms(3) assms(4) bet_in_angle_bet bet_out__bet l6_6 not_col_permutation_4 or_bet_out) + next + assume P2: "\ Col B A P" + have "\ Col B A Q" + using P2 assms(3) assms(4) col2__eq col_permutation_4 inangle_distincts by blast + then have "B A OS P Q" + using P1 P2 assms(2) assms(3) inangle_one_side invert_one_side not_col_permutation_4 by auto + thus ?thesis + using assms(4) col_one_side_out by auto + qed +qed + +lemma inangle2__lea: + assumes "P InAngle A B C" and + "Q InAngle A B C" + shows "P B Q LeA A B C" +proof - + have P1: "P InAngle C B A" + by (simp add: assms(1) l11_24) + have P2: "Q InAngle C B A" + by (simp add: assms(2) l11_24) + have P3: "A \ B" + using assms(1) inangle_distincts by auto + have P4: "C \ B" + using assms(1) inangle_distincts by blast + have P5: "P \ B" + using assms(1) inangle_distincts by auto + have P6: "Q \ B" + using assms(2) inangle_distincts by auto + thus ?thesis + proof cases + assume P7: "Col A B C" + thus ?thesis + proof cases + assume "Bet A B C" + thus ?thesis + by (simp add: P3 P4 P5 P6 l11_31_2) + next + assume "\ Bet A B C" + then have "B Out A C" + using P7 not_out_bet by blast + then have "B Out P Q" + using Out_cases assms(1) assms(2) in_angle_out l6_7 by blast + thus ?thesis + by (simp add: P3 P4 l11_31_1) + qed + next + assume T1: "\ Col A B C" + thus ?thesis + proof cases + assume T2: "Col B P Q" + have "\ Bet A B C" + using T1 bet_col by auto + then have "B Out P Q" + using T2 assms(1) assms(2) col_inangle2__out by auto + thus ?thesis + by (simp add: P3 P4 l11_31_1) + next + assume T3: "\ Col B P Q" + thus ?thesis + proof cases + assume "Col B A P" + then have "B Out A P" + using Col_def T1 assms(1) col_in_angle_out by blast + then have "P B Q CongA A B Q" + using P6 out2__conga out_trivial by auto + thus ?thesis + using LeA_def assms(2) by blast + next + assume W0: "\ Col B A P" + show ?thesis + proof cases + assume "Col B C P" + then have "B Out C P" + by (metis P1 P3 T1 bet_out_1 col_in_angle_out out_col) + thus ?thesis + by (smt P3 P4 P6 Tarski_neutral_dimensionless.lea_left_comm 
Tarski_neutral_dimensionless.lea_out4__lea Tarski_neutral_dimensionless_axioms assms(2) inangle__lea_1 out_trivial) + next + assume W0A: "\ Col B C P" + show ?thesis + proof cases + assume "Col B A Q" + then have "B Out A Q" + using Col_def T1 assms(2) col_in_angle_out by blast + thus ?thesis + by (smt P3 P4 P5 Tarski_neutral_dimensionless.lea_left_comm Tarski_neutral_dimensionless.lea_out4__lea Tarski_neutral_dimensionless_axioms assms(1) inangle__lea out_trivial) + next + assume W0AA: "\ Col B A Q" + thus ?thesis + proof cases + assume "Col B C Q" + then have "B Out C Q" + using Bet_cases P2 T1 bet_col col_in_angle_out by blast + thus ?thesis + by (smt P1 P3 P4 P5 Tarski_neutral_dimensionless.lea_comm Tarski_neutral_dimensionless.lea_out4__lea Tarski_neutral_dimensionless_axioms inangle__lea out_trivial) + next + assume W0B: "\ Col B C Q" + have W1: "Coplanar B P A Q" + by (metis Col_perm T1 assms(1) assms(2) col__coplanar inangle_one_side ncoplanar_perm_13 os__coplanar) + have W2: "\ Col A B P" + by (simp add: W0 not_col_permutation_4) + have W3: "\ Col Q B P" + using Col_perm T3 by blast + then have W4: "B P TS A Q \ B P OS A Q" + using cop__one_or_two_sides + by (simp add: W1 W2) + { + assume W4A: "B P TS A Q" + have "Q InAngle P B C" + proof - + have W5: "P B OS C Q" + using OS_def P1 W0 W0A W4A in_angle_two_sides invert_two_sides l9_2 by blast + have "C B OS P Q" + by (meson P1 P2 T1 W0A W0B inangle_one_side not_col_permutation_3 not_col_permutation_4) + thus ?thesis + by (simp add: W5 invert_one_side os2__inangle) + qed + then have "P B Q LeA A B C" + by (meson assms(1) inangle__lea inangle__lea_1 lea_trans) + } + { + assume W6: "B P OS A Q" + have "B A OS P Q" + using Col_perm T1 W2 W0AA assms(1) assms(2) inangle_one_side invert_one_side by blast + then have "Q InAngle P B A" + by (simp add: W6 os2__inangle) + then have "P B Q LeA A B C" + by (meson P1 inangle__lea inangle__lea_1 lea_right_comm lea_trans) + } + thus ?thesis + using W4 \B P TS A Q \ P B Q LeA A B C\ by blast + qed + qed + qed + qed + qed + qed +qed + +lemma conga_inangle_per__acute: + assumes "Per A B C" and + "P InAngle A B C" and + "P B A CongA P B C" + shows "Acute A B P" +proof - + have P1: "\ Col A B C" + using assms(1) assms(3) conga_diff2 conga_diff56 l8_9 by blast + have P2: "A B P LeA A B C" + by (simp add: assms(2) inangle__lea) + { + assume "A B P CongA A B C" + then have P3: "Per A B P" + by (meson Tarski_neutral_dimensionless.l11_17 Tarski_neutral_dimensionless.not_conga_sym Tarski_neutral_dimensionless_axioms assms(1)) + have P4: "Coplanar P C A B" + using assms(2) inangle__coplanar ncoplanar_perm_3 by blast + have P5: "P \ B" + using assms(2) inangle_distincts by blast + have "Per C B P" + using P3 Per_cases assms(3) l11_17 by blast + then have "False" + using P1 P3 P4 P5 col_permutation_1 cop_per2__col by blast + } + then have "\ A B P CongA A B C" by auto + then have "A B P LtA A B C" + by (simp add: LtA_def P2) + thus ?thesis + using Acute_def assms(1) by blast +qed + +lemma conga_inangle2_per__acute: + assumes "Per A B C" and + "P InAngle A B C" and + "P B A CongA P B C" and + "Q InAngle A B C" + shows "Acute P B Q" +proof - + have P1: "P InAngle C B A" + using assms(2) l11_24 by auto + have P2: "Q InAngle C B A" + using assms(4) l11_24 by blast + have P3: "A \ B" + using assms(3) conga_diff2 by auto + have P5: "P \ B" + using assms(2) inangle_distincts by blast + have P7: "\ Col A B C" + using assms(1) assms(3) conga_distinct l8_9 by blast + have P8: "Acute A B P" + using assms(1) assms(2) 
assms(3) conga_inangle_per__acute by auto + { + assume "Col P B A" + then have "Col P B C" + using assms(3) col_conga_col by blast + then have "False" + using Col_perm P5 P7 \Col P B A\ col_transitivity_2 by blast + } + then have P9: "\ Col P B A" by auto + have P10: "\ Col P B C" + using \Col P B A \ False\ assms(3) ncol_conga_ncol by blast + have P11: "\ Bet A B C" + using P7 bet_col by blast + show ?thesis + proof cases + assume "Col B A Q" + then have "B Out A Q" + using P11 assms(4) col_in_angle_out by auto + thus ?thesis + using Out_cases P5 P8 acute_out2__acute acute_sym out_trivial by blast + next + assume S0: "\ Col B A Q" + show ?thesis + proof cases + assume S1: "Col B C Q" + then have "B Out C Q" + using P11 P2 between_symmetry col_in_angle_out by blast + then have S2: "B Out Q C" + using l6_6 by blast + have S3: "B Out P P" + by (simp add: P5 out_trivial) + have "B Out A A" + by (simp add: P3 out_trivial) + then have "A B P CongA P B Q" + using S2 conga_left_comm l11_10 S3 assms(3) by blast + thus ?thesis + using P8 acute_conga__acute by blast + next + assume S4: "\ Col B C Q" + show ?thesis + proof cases + assume "Col B P Q" + thus ?thesis + using out__acute col_inangle2__out P11 assms(2) assms(4) by blast + next + assume S5: "\ Col B P Q" + have S6: "Coplanar B P A Q" + by (metis Col_perm P7 assms(2) assms(4) coplanar_trans_1 inangle__coplanar ncoplanar_perm_12 ncoplanar_perm_21) + have S7: "\ Col A B P" + using Col_cases P9 by auto + have "\ Col Q B P" + using Col_perm S5 by blast + then have S8: "B P TS A Q \ B P OS A Q" + using cop__one_or_two_sides S6 S7 by blast + { + assume S9: "B P TS A Q" + have S10: "Acute P B C" + using P8 acute_conga__acute acute_sym assms(3) by blast + have "Q InAngle P B C" + proof - + have S11: "P B OS C Q" + by (metis Col_perm OS_def P1 P10 P9 S9 in_angle_two_sides invert_two_sides l9_2) + have "C B OS P Q" + by (meson P1 P10 P2 P7 S4 inangle_one_side not_col_permutation_3 not_col_permutation_4) + thus ?thesis + by (simp add: S11 invert_one_side os2__inangle) + qed + then have "P B Q LeA P B C" + by (simp add: inangle__lea) + then have "Acute P B Q" + using S10 acute_lea_acute by blast + } + { + assume S12: "B P OS A Q" + have "B A OS P Q" + using Col_perm P7 S7 S0 assms(2) assms(4) inangle_one_side invert_one_side by blast + then have "Q InAngle P B A" + by (simp add: S12 os2__inangle) + then have "Q B P LeA P B A" + by (simp add: P3 P5 inangle1123 inangle2__lea) + then have "P B Q LeA A B P" + by (simp add: lea_comm) + then have "Acute P B Q" + using P8 acute_lea_acute by blast + } + thus ?thesis + using \B P TS A Q \ Acute P B Q\ S8 by blast + qed + qed + qed +qed + +lemma lta_os__ts: + assumes (*"\ Col A O1 P" and*) + "A O1 P LtA A O1 B" and + "O1 A OS B P" + shows "O1 P TS A B" +proof - + have "A O1 P LeA A O1 B" + by (simp add: assms(1) lta__lea) + then have "\ P0. 
P0 InAngle A O1 B \ A O1 P CongA A O1 P0" + by (simp add: LeA_def) + then obtain P' where P1: "P' InAngle A O1 B \ A O1 P CongA A O1 P'" by blast + have P2: "\ Col A O1 B" + using assms(2) col123__nos not_col_permutation_4 by blast + obtain R where P3: "O1 A TS B R \ O1 A TS P R" + using OS_def assms(2) by blast + { + assume "Col B O1 P" + then have "Bet B O1 P" + by (metis Tarski_neutral_dimensionless.out2__conga Tarski_neutral_dimensionless_axioms assms(1) assms(2) between_trivial col_trivial_2 lta_not_conga one_side_chara or_bet_out out_trivial) + then have "O1 A TS B P" + using assms(2) col_trivial_1 one_side_chara by blast + then have P6: "\ O1 A OS B P" + using l9_9_bis by auto + then have "False" + using P6 assms(2) by auto + } + then have P4: "\ Col B O1 P" by auto + thus ?thesis + by (meson P3 assms(1) inangle__lta l9_8_1 not_and_lta not_col_permutation_4 os_ts__inangle two_sides_cases) +qed + +lemma bet__suppa: + assumes "A \ B" and + "B \ C" and + "B \ A'" and + "Bet A B A'" + shows "A B C SuppA C B A'" +proof - + have "C B A' CongA C B A'" + using assms(2) assms(3) conga_refl by auto + thus ?thesis using assms(4) assms(1) SuppA_def by auto +qed + +lemma ex_suppa: + assumes "A \ B" and + "B \ C" + shows "\ D E F. A B C SuppA D E F" +proof - + obtain A' where "Bet A B A' \ Cong B A' A B" + using segment_construction by blast + thus ?thesis + by (meson assms(1) assms(2) bet__suppa point_construction_different) +qed + +lemma suppa_distincts: + assumes "A B C SuppA D E F" + shows "A \ B \ B \ C \ D \ E \ E \ F" + using CongA_def SuppA_def assms by auto + +lemma suppa_right_comm: + assumes "A B C SuppA D E F" + shows "A B C SuppA F E D" + using SuppA_def assms conga_left_comm by auto + +lemma suppa_left_comm: + assumes "A B C SuppA D E F" + shows "C B A SuppA D E F" +proof - + obtain A' where P1: "Bet A B A' \ D E F CongA C B A'" + using SuppA_def assms by auto + obtain C' where P2: "Bet C B C' \ Cong B C' C B" + using segment_construction by blast + then have "C B A' CongA A B C'" + by (metis Bet_cases P1 SuppA_def assms cong_diff_3 conga_diff45 conga_diff56 conga_left_comm l11_14) + then have "D E F CongA A B C'" + using P1 conga_trans by blast + thus ?thesis + by (metis CongA_def P1 P2 SuppA_def) +qed + +lemma suppa_comm: + assumes "A B C SuppA D E F" + shows "C B A SuppA F E D" + using assms suppa_left_comm suppa_right_comm by blast + +lemma suppa_sym: + assumes "A B C SuppA D E F" + shows "D E F SuppA A B C" +proof - + obtain A' where P1: "Bet A B A' \ D E F CongA C B A'" + using SuppA_def assms by auto + obtain D' where P2: "Bet D E D' \ Cong E D' D E" + using segment_construction by blast + have "A' B C CongA D E F" + using P1 conga_right_comm not_conga_sym by blast + then have "A B C CongA F E D'" + by (metis P1 P2 Tarski_neutral_dimensionless.conga_right_comm Tarski_neutral_dimensionless.l11_13 Tarski_neutral_dimensionless.suppa_distincts Tarski_neutral_dimensionless_axioms assms between_symmetry cong_diff_3) + thus ?thesis + by (metis CongA_def P1 P2 SuppA_def) +qed + +lemma conga2_suppa__suppa: + assumes "A B C CongA A' B' C'" and + "D E F CongA D' E' F'" and + "A B C SuppA D E F" + shows "A' B' C' SuppA D' E' F'" +proof - + obtain A0 where P1: "Bet A B A0 \ D E F CongA C B A0" + using SuppA_def assms(3) by auto + then have "A B C SuppA D' E' F'" + by (metis Tarski_neutral_dimensionless.SuppA_def Tarski_neutral_dimensionless_axioms assms(2) assms(3) conga_sym conga_trans) + then have P2: "D' E' F' SuppA A B C" + by (simp add: suppa_sym) + then obtain D0 where P3: "Bet D' 
E' D0 \<and> A B C CongA F' E' D0"
+    using P2 SuppA_def by auto
+  have P5: "A' B' C' CongA F' E' D0"
+    using P3 assms(1) not_conga not_conga_sym by blast
+  then have "D' E' F' SuppA A' B' C'"
+    using P2 P3 SuppA_def by auto
+  thus ?thesis
+    by (simp add: suppa_sym)
+qed
+
+lemma conga2_suppa__suppa:
+  assumes "A B C CongA A' B' C'" and
+    "D E F CongA D' E' F'" and
+    "A B C SuppA D E F"
+  shows "A' B' C' SuppA D' E' F'"
+proof -
+  obtain A0 where P1: "Bet A B A0 \<and> D E F CongA C B A0"
+    using SuppA_def assms(3) by auto
+  then have "A B C SuppA D' E' F'"
+    by (metis Tarski_neutral_dimensionless.SuppA_def Tarski_neutral_dimensionless_axioms assms(2) assms(3) conga_sym conga_trans)
+  then have P2: "D' E' F' SuppA A B C"
+    by (simp add: suppa_sym)
+  then obtain D0 where P3: "Bet D' E' D0 \<and> A B C CongA F' E' D0"
+    using P2 SuppA_def by auto
+  have P5: "A' B' C' CongA F' E' D0"
+    using P3 assms(1) not_conga not_conga_sym by blast
+  then have "D' E' F' SuppA A' B' C'"
+    using P2 P3 SuppA_def by auto
+  thus ?thesis
+    by (simp add: suppa_sym)
+qed
+
+lemma suppa2__conga456:
+  assumes "A B C SuppA D E F" and
+    "A B C SuppA D' E' F'"
+  shows "D E F CongA D' E' F'"
+proof -
+  obtain A' where P1: "Bet A B A' \<and> D E F CongA C B A'"
+    using SuppA_def assms(1) by auto
+  obtain A'' where P2: "Bet A B A'' \<and> D' E' F' CongA C B A''"
+    using SuppA_def assms(2) by auto
+  have "C B A' CongA C B A''"
+  proof -
+    have P3: "B Out C C" using P1
+      by (simp add: CongA_def out_trivial)
+    have "B Out A'' A'" using P1 P2 l6_2
+      by (metis assms(1) between_symmetry conga_distinct suppa_distincts)
+    thus ?thesis
+      by (simp add: P3 out2__conga)
+  qed
+  then have "C B A' CongA D' E' F'"
+    using P2 not_conga not_conga_sym by blast
+  thus ?thesis
+    using P1 not_conga by blast
+qed
+
+lemma suppa2__conga123:
+  assumes "A B C SuppA D E F" and
+    "A' B' C' SuppA D E F"
+  shows "A B C CongA A' B' C'"
+  using assms(1) assms(2) suppa2__conga456 suppa_sym by blast
+
+lemma bet_out__suppa:
+  assumes "A \<noteq> B" and
+    "B \<noteq> C" and
+    "Bet A B C" and
+    "E Out D F"
+  shows "A B C SuppA D E F"
+proof -
+  have "D E F CongA C B C"
+    using assms(2) assms(4) l11_21_b out_trivial by auto
+  thus ?thesis
+    using SuppA_def assms(1) assms(3) by blast
+qed
+
+lemma bet_suppa__out:
+  assumes "Bet A B C" and
+    "A B C SuppA D E F"
+  shows "E Out D F"
+proof -
+  have "A B C SuppA C B C"
+    using assms(1) assms(2) bet__suppa suppa_distincts by auto
+  then have "C B C CongA D E F"
+    using assms(2) suppa2__conga456 by auto
+  thus ?thesis
+    using eq_conga_out by auto
+qed
+
+lemma out_suppa__bet:
+  assumes "B Out A C" and
+    "A B C SuppA D E F"
+  shows "Bet D E F"
+proof -
+  obtain B' where P1: "Bet A B B' \<and> Cong B B' A B"
+    using segment_construction by blast
+  have "A B C SuppA A B B'"
+    by (metis P1 assms(1) assms(2) bet__suppa bet_cong_eq bet_out__bet suppa_distincts suppa_left_comm)
+  then have "A B B' CongA D E F"
+    using assms(2) suppa2__conga456 by auto
+  thus ?thesis
+    using P1 bet_conga__bet by blast
+qed
+
+lemma per_suppa__per:
+  assumes "Per A B C" and
+    "A B C SuppA D E F"
+  shows "Per D E F"
+proof -
+  obtain A' where P1: "Bet A B A' \<and> D E F CongA C B A'"
+    using SuppA_def assms(2) by auto
+  have "Per C B A'"
+  proof -
+    have P2: "A \<noteq> B"
+      using assms(2) suppa_distincts by auto
+    have P3: "Per C B A"
+      by (simp add: assms(1) l8_2)
+    have "Col B A A'"
+      using P1 Col_cases Col_def by blast
+    thus ?thesis
+      by (metis P2 P3 per_col)
+  qed
+  thus ?thesis
+    using P1 l11_17 not_conga_sym by blast
+qed
+
+lemma per2__suppa:
+  assumes "A \<noteq> B" and
+    "B \<noteq> C" and
+    "D \<noteq> E" and
+    "E \<noteq> F" and
+    "Per A B C" and
+    "Per D E F"
+  shows "A B C SuppA D E F"
+proof -
+  obtain D' E' F' where P1: "A B C SuppA D' E' F'"
+    using assms(1) assms(2) ex_suppa by blast
+  have "D' E' F' CongA D E F"
+    using P1 assms(3) assms(4) assms(5) assms(6) l11_16 per_suppa__per suppa_distincts by blast
+  thus ?thesis
+    by (meson P1 conga2_suppa__suppa suppa2__conga123)
+qed
+
+lemma suppa__per:
+  assumes "A B C SuppA A B C"
+  shows "Per A B C"
+proof -
+  obtain A' where P1: "Bet A B A' \<and> A B C CongA C B A'"
+    using SuppA_def assms by auto
+  then have "C B A CongA C B A'"
+    by (simp add: conga_left_comm)
+  thus ?thesis
+    using P1 Per_perm l11_18_2 by blast
+qed
+
+lemma acute_suppa__obtuse:
+  assumes "Acute A B C" and
+    "A B C SuppA D E F"
+  shows "Obtuse D E F"
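+(* Proof idea: obtain A' with Bet A B A' and D E F CongA C B A'; the angle C B A'
+   is obtuse by acute_bet__obtuse, and conga_obtuse__obtuse carries obtuseness over
+   to D E F along that congruence. *)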
+proof - + obtain A' where P1: "Bet A B A' \ D E F CongA C B A'" + using SuppA_def assms(2) by auto + then have "Obtuse C B A'" + by (metis Tarski_neutral_dimensionless.obtuse_sym Tarski_neutral_dimensionless_axioms acute_bet__obtuse assms(1) conga_distinct) + thus ?thesis + by (meson P1 Tarski_neutral_dimensionless.conga_obtuse__obtuse Tarski_neutral_dimensionless.not_conga_sym Tarski_neutral_dimensionless_axioms) +qed + +lemma obtuse_suppa__acute: + assumes "Obtuse A B C" and + "A B C SuppA D E F" + shows "Acute D E F" +proof - + obtain A' where P1: "Bet A B A' \ D E F CongA C B A'" + using SuppA_def assms(2) by auto + then have "Acute C B A'" + using acute_sym assms(1) bet_obtuse__acute conga_distinct by blast + thus ?thesis + using P1 acute_conga__acute not_conga_sym by blast +qed + +lemma lea_suppa2__lea: + assumes "A B C SuppA A' B' C'" and + "D E F SuppA D' E' F'" + "A B C LeA D E F" + shows "D' E' F' LeA A' B' C'" +proof - + obtain A0 where P1: "Bet A B A0 \ A' B' C' CongA C B A0" + using SuppA_def assms(1) by auto + obtain D0 where P2: "Bet D E D0 \ D' E' F' CongA F E D0" + using SuppA_def assms(2) by auto + have "F E D0 LeA C B A0" + proof - + have P3: "D0 \ E" + using CongA_def P2 by auto + have P4: "A0 \ B" + using CongA_def P1 by blast + have P6: "Bet D0 E D" + by (simp add: P2 between_symmetry) + have "Bet A0 B A" + by (simp add: P1 between_symmetry) + thus ?thesis + by (metis P3 P4 P6 assms(3) l11_36_aux2 lea_comm lea_distincts) + qed + thus ?thesis + by (meson P1 P2 Tarski_neutral_dimensionless.l11_30 Tarski_neutral_dimensionless.not_conga_sym Tarski_neutral_dimensionless_axioms) +qed + +lemma lta_suppa2__lta: + assumes "A B C SuppA A' B' C'" + and "D E F SuppA D' E' F'" + and "A B C LtA D E F" + shows "D' E' F' LtA A' B' C'" +proof - + obtain A0 where P1: "Bet A B A0 \ A' B' C' CongA C B A0" + using SuppA_def assms(1) by auto + obtain D0 where P2: "Bet D E D0 \ D' E' F' CongA F E D0" + using SuppA_def assms(2) by auto + have "F E D0 LtA C B A0" + proof - + have P5: "A0 \ B" + using CongA_def P1 by blast + have "D0 \ E" + using CongA_def P2 by auto + thus ?thesis + using assms(3) P1 P5 P2 bet2_lta__lta lta_comm by blast + qed + thus ?thesis + using P1 P2 conga_preserves_lta not_conga_sym by blast +qed + +lemma suppa_dec: + "A B C SuppA D E F \ \ A B C SuppA D E F" + by simp + +lemma acute_one_side_aux: + assumes "C A OS P B" and + "Acute A C P" and + "C A Perp B C" + shows "C B OS A P" +proof - + obtain R where T1: "C A TS P R \ C A TS B R" + using OS_def assms(1) by blast + obtain A' B' C' where P1: "Per A' B' C' \ A C P LtA A' B' C'" + using Acute_def assms(2) by auto + have P2: "Per A C B" + by (simp add: assms(3) perp_per_1) + then have P3: "A' B' C' CongA A C B" + using P1 assms(1) l11_16 lta_distincts os_distincts by blast + have P4: "A C P LtA A C B" + by (metis P2 acute_per__lta assms(1) assms(2) os_distincts) + { + assume P4A: "Col P C B" + have "Per A C P" + proof - + have P4B: "C \ B" + using assms(1) os_distincts by blast + have P4C: "Per A C B" + by (simp add: P2) + have "Col C B P" + using P4A Col_cases by auto + thus ?thesis using per_col P4B P4C by blast + qed + then have "False" + using acute_not_per assms(2) by auto + } + then have P5: "\ Col P C B" by auto + have P6: "\ Col A C P" + using assms(1) col123__nos not_col_permutation_4 by blast + have P7: "C B TS A P \ C B OS A P" + using P5 assms(1) not_col_permutation_4 os_ts1324__os two_sides_cases by blast + { + assume P8: "C B TS A P" + then obtain T where P9: "Col T C B \ Bet A T P" + using TS_def by 
blast + then have P10: "C \ T" + using Col_def P6 P9 by auto + have "T InAngle A C P" + by (meson P4 P5 P8 Tarski_neutral_dimensionless.inangle__lta Tarski_neutral_dimensionless_axioms assms(1) not_and_lta not_col_permutation_3 os_ts__inangle) + then have "C A OS T P" + by (metis P10 P9 T1 TS_def col123__nos in_angle_one_side invert_one_side l6_16_1 one_side_reflexivity) + then have P13: "C A OS T B" + using assms(1) one_side_transitivity by blast + have "C B OS A P" + by (meson P4 Tarski_neutral_dimensionless.lta_os__ts Tarski_neutral_dimensionless_axioms assms(1) one_side_symmetry os_ts1324__os) + } + thus ?thesis + using P7 by blast +qed + +lemma acute_one_side_aux0: + assumes "Col A C P" and + "Acute A C P" and + "C A Perp B C" + shows "C B OS A P" +proof - + have "Per A C B" + by (simp add: assms(3) perp_per_1) + then have P1: "A C P LtA A C B" + using Tarski_neutral_dimensionless.acute_per__lta Tarski_neutral_dimensionless_axioms acute_distincts assms(2) assms(3) perp_not_eq_2 by fastforce + have P2: "C Out A P" + using acute_col__out assms(1) assms(2) by auto + thus ?thesis + using Perp_cases assms(3) out_one_side perp_not_col by blast +qed + +lemma acute_cop_perp__one_side: + assumes "Acute A C P" and + "C A Perp B C" and + "Coplanar A B C P" + shows "C B OS A P" +proof cases + assume "Col A C P" + thus ?thesis + by (simp add: acute_one_side_aux0 assms(1) assms(2)) +next + assume P1: "\ Col A C P" + have P2: "C A TS P B \ C A OS P B" + using Col_cases P1 assms(2) assms(3) cop_nos__ts coplanar_perm_13 perp_not_col by blast + { + assume P3: "C A TS P B" + obtain Bs where P4: "C Midpoint B Bs" + using symmetric_point_construction by auto + have "C A TS Bs B" + by (metis P3 P4 assms(2) bet__ts l9_2 midpoint_bet midpoint_distinct_2 perp_not_col ts_distincts) + then have P6: "C A OS P Bs" + using P3 l9_8_1 by auto + have "C Bs Perp A C" + proof - + have P6A: "C \ Bs" + using P6 os_distincts by blast + have "Col C B Bs" + using Bet_cases Col_def P4 midpoint_bet by blast + thus ?thesis + using Perp_cases P6A assms(2) perp_col by blast + qed + then have "Bs C Perp C A" + using Perp_perm by blast + then have "C A Perp Bs C" + using Perp_perm by blast + then have "C B OS A P" using acute_one_side_aux + by (metis P4 P6 assms(1) assms(2) col_one_side midpoint_col not_col_permutation_5 perp_distinct) + } + { + assume "C A OS P B" + then have "C B OS A P" using acute_one_side_aux + using assms(1) assms(2) by blast + } + thus ?thesis + using P2 \C A TS P B \ C B OS A P\ by auto +qed + +lemma acute__not_obtuse: + assumes "Acute A B C" + shows "\ Obtuse A B C" + using acute_obtuse__lta assms nlta by blast + +subsubsection "Sum of angles" + +lemma suma_distincts: + assumes "A B C D E F SumA G H I" + shows "A \ B \ B \C \ D \ E \ E \ F \ G \ H \ H \ I" +proof - + obtain J where "C B J CongA D E F \ \ B C OS A J \ Coplanar A B C J \ A B J CongA G H I" + using SumA_def assms by auto + thus ?thesis + using CongA_def by blast +qed + +lemma trisuma_distincts: + assumes "A B C TriSumA D E F" + shows "A \ B \ B \ C \ A \ C \ D \ E \ E \ F" +proof - + obtain G H I where "A B C B C A SumA G H I \ G H I C A B SumA D E F" + using TriSumA_def assms by auto + thus ?thesis + using suma_distincts by blast +qed + +lemma ex_suma: + assumes "A \ B" and + "B \ C" and + "D \ E" and + "E \ F" + shows "\ G H I. A B C D E F SumA G H I" +proof - + have "\ I. 
A B C D E F SumA A B I" + proof cases + assume P1: "Col A B C" + obtain J where P2: "D E F CongA C B J \ Coplanar C B J A" using angle_construction_4 + using assms(2) assms(3) assms(4) by presburger + have P3: "J \ B" + using CongA_def P2 by blast + have "\ B C OS A J" + by (metis P1 between_trivial2 one_side_chara) + then have "A B C D E F SumA A B J" + by (meson P2 P3 SumA_def assms(1) conga_refl ncoplanar_perm_15 not_conga_sym) + thus ?thesis by blast + next + assume T1: "\ Col A B C" + show ?thesis + proof cases + assume T2: "Col D E F" + show ?thesis + proof cases + assume T3: "Bet D E F" + obtain J where T4: "B Midpoint C J" + using symmetric_point_construction by blast + have "A B C D E F SumA A B J" + proof - + have "C B J CongA D E F" + by (metis T3 T4 assms(2) assms(3) assms(4) conga_line midpoint_bet midpoint_distinct_2) + moreover have "\ B C OS A J" + by (simp add: T4 col124__nos midpoint_col) + moreover have "Coplanar A B C J" + using T3 bet__coplanar bet_conga__bet calculation(1) conga_sym ncoplanar_perm_15 by blast + moreover have "A B J CongA A B J" + using CongA_def assms(1) calculation(1) conga_refl by auto + ultimately show ?thesis + using SumA_def by blast + qed + then show ?thesis + by auto + next + assume T5: "\ Bet D E F" + have "A B C D E F SumA A B C" + proof - + have "E Out D F" + using T2 T5 l6_4_2 by auto + then have "C B C CongA D E F" + using assms(2) l11_21_b out_trivial by auto + moreover have "\ B C OS A C" + using os_distincts by blast + moreover have "Coplanar A B C C" + using ncop_distincts by auto + moreover have "A B C CongA A B C" + using assms(1) assms(2) conga_refl by auto + ultimately show ?thesis + using SumA_def by blast + qed + then show ?thesis + by auto + qed + next + assume T6: "\ Col D E F" + then obtain J where T7: "D E F CongA C B J \ C B TS J A" + using T1 ex_conga_ts not_col_permutation_4 not_col_permutation_5 by presburger + then show ?thesis + proof - + have "C B J CongA D E F" + using T7 not_conga_sym by blast + moreover have "\ B C OS A J" + by (simp add: T7 invert_two_sides l9_2 l9_9) + moreover have "Coplanar A B C J" + using T7 ncoplanar_perm_15 ts__coplanar by blast + moreover have "A B J CongA A B J" + using T7 assms(1) conga_diff56 conga_refl by blast + ultimately show ?thesis + using SumA_def by blast + qed + qed + qed + then show ?thesis + by auto +qed + +lemma suma2__conga: + assumes "A B C D E F SumA G H I" and + "A B C D E F SumA G' H' I'" + shows "G H I CongA G' H' I'" +proof - + obtain J where P1: "C B J CongA D E F \ \ B C OS A J \ Coplanar A B C J \ A B J CongA G H I" + using SumA_def assms(1) by blast + obtain J' where P2: "C B J' CongA D E F \ \ B C OS A J' \ Coplanar A B C J' \ A B J' CongA G' H' I'" + using SumA_def assms(2) by blast + have P3: "C B J CongA C B J'" + proof - + have "C B J CongA D E F" + by (simp add: P1) + moreover have "D E F CongA C B J'" + by (simp add: P2 conga_sym) + ultimately show ?thesis + using not_conga by blast + qed + have P4: "A B J CongA A B J'" + proof cases + assume P5: "Col A B C" + then show ?thesis + proof cases + assume P6: "Bet A B C" + show ?thesis + proof - + have "C B J CongA C B J'" + by (simp add: P3) + moreover have "Bet C B A" + by (simp add: P6 between_symmetry) + moreover have "A \ B" + using assms(1) suma_distincts by blast + ultimately show ?thesis + using l11_13 by blast + qed + next + assume P7: "\ Bet A B C" + moreover have "B Out A C" + by (simp add: P5 calculation l6_4_2) + moreover have "B \ J" + using CongA_def P3 by blast + then moreover have "B Out J J" + 
using out_trivial by auto + moreover have "B \ J'" + using CongA_def P3 by blast + then moreover have "B Out J' J'" + using out_trivial by auto + ultimately show ?thesis + using P3 l11_10 by blast + qed + next + assume P8: "\ Col A B C" + show ?thesis + proof cases + assume P9: "Col D E F" + have "B Out J' J" + proof cases + assume P10: "Bet D E F" + show ?thesis + proof - + have "D E F CongA J' B C" + using P2 conga_right_comm not_conga_sym by blast + then have "Bet J' B C" + using P10 bet_conga__bet by blast + moreover have "D E F CongA J B C" + by (simp add: P1 conga_right_comm conga_sym) + then moreover have "Bet J B C" + using P10 bet_conga__bet by blast + ultimately show ?thesis + by (metis CongA_def P3 l6_2) + qed + next + assume P11: "\ Bet D E F" + have P12: "E Out D F" + by (simp add: P11 P9 l6_4_2) + show ?thesis + proof - + have "B Out J' C" + proof - + have "D E F CongA J' B C" + using P2 conga_right_comm conga_sym by blast + then show ?thesis + using l11_21_a P12 by blast + qed + moreover have "B Out C J" + by (metis P3 P8 bet_conga__bet calculation col_conga_col col_out2_col l6_4_2 l6_6 not_col_distincts not_conga_sym out_bet_out_1 out_trivial) + ultimately show ?thesis + using l6_7 by blast + qed + qed + then show ?thesis + using P8 not_col_distincts out2__conga out_trivial by blast + next + assume P13: "\ Col D E F" + show ?thesis + proof - + have "B C TS A J" + proof - + have "Coplanar B C A J" + using P1 coplanar_perm_8 by blast + moreover have "\ Col A B C" + by (simp add: P8) + moreover have "\ B C OS A J" + using P1 by simp + moreover have "\ Col J B C" + proof - + have "D E F CongA J B C" + using P1 conga_right_comm not_conga_sym by blast + then show ?thesis + using P13 ncol_conga_ncol by blast + qed + ultimately show ?thesis + using cop__one_or_two_sides by blast + qed + moreover have "B C TS A J'" + proof - + have "Coplanar B C A J'" + using P2 coplanar_perm_8 by blast + moreover have "\ Col A B C" + by (simp add: P8) + moreover have "\ B C OS A J'" + using P2 by simp + moreover have "\ Col J' B C" + proof - + have "D E F CongA J' B C" + using P2 conga_right_comm not_conga_sym by blast + then show ?thesis + using P13 ncol_conga_ncol by blast + qed + ultimately show ?thesis + using cop_nos__ts by blast + qed + moreover have "A B C CongA A B C" + by (metis P8 conga_pseudo_refl conga_right_comm not_col_distincts) + moreover have "C B J CongA C B J'" + by (simp add: P3) + ultimately show ?thesis + using l11_22a by blast + qed + qed + qed + then show ?thesis + by (meson P1 P2 not_conga not_conga_sym) +qed + +lemma suma_sym: + assumes "A B C D E F SumA G H I" + shows "D E F A B C SumA G H I" +proof - + obtain J where P1: "C B J CongA D E F \ \ B C OS A J \ Coplanar A B C J \ A B J CongA G H I" + using SumA_def assms(1) by blast + show ?thesis + proof cases + assume P2: "Col A B C" + then show ?thesis + proof cases + assume P3: "Bet A B C" + obtain K where P4: "Bet F E K \ Cong F E E K" + using Cong_perm segment_construction by blast + show ?thesis + proof - + have P5: "F E K CongA A B C" + by (metis CongA_def P1 P3 P4 cong_diff conga_line) + moreover have "\ E F OS D K" + using P4 bet_col col124__nos invert_one_side by blast + moreover have "Coplanar D E F K" + using P4 bet__coplanar ncoplanar_perm_15 by blast + moreover have "D E K CongA G H I" + proof - + have "D E K CongA A B J" + proof - + have "F E D CongA C B J" + by (simp add: P1 conga_left_comm conga_sym) + moreover have "Bet F E K" + by (simp add: P4) + moreover have "K \ E" + using P4 calculation(1) 
cong_identity conga_diff1 by blast + moreover have "Bet C B A" + by (simp add: Bet_perm P3) + moreover have "A \ B" + using CongA_def P5 by blast + ultimately show ?thesis + using conga_right_comm l11_13 not_conga_sym by blast + qed + then show ?thesis + using P1 not_conga by blast + qed + ultimately show ?thesis + using SumA_def by blast + qed + next + assume T1: "\ Bet A B C" + then have T2: "B Out A C" + by (simp add: P2 l6_4_2) + show ?thesis + proof - + have "F E F CongA A B C" + by (metis T2 assms l11_21_b out_trivial suma_distincts) + moreover have "\ E F OS D F" + using os_distincts by auto + moreover have "Coplanar D E F F" + using ncop_distincts by auto + moreover have "D E F CongA G H I" + proof - + have "A B J CongA D E F" + proof - + have "C B J CongA D E F" + by (simp add: P1) + moreover have "B Out A C" + by (simp add: T2) + moreover have "J \ B" + using calculation(1) conga_distinct by auto + moreover have "D \ E" + using calculation(1) conga_distinct by blast + moreover have "F \ E" + using calculation(1) conga_distinct by blast + ultimately show ?thesis + by (meson Out_cases not_conga out2__conga out_trivial) + qed + then have "D E F CongA A B J" + using not_conga_sym by blast + then show ?thesis + using P1 not_conga by blast + qed + ultimately show ?thesis + using SumA_def by blast + qed + qed + next + assume Q1: "\ Col A B C" + show ?thesis + proof cases + assume Q2: "Col D E F" + obtain K where Q3: "A B C CongA F E K" + using P1 angle_construction_3 conga_diff1 conga_diff56 by fastforce + show ?thesis + proof - + have "F E K CongA A B C" + by (simp add: Q3 conga_sym) + moreover have "\ E F OS D K" + using Col_cases Q2 one_side_not_col123 by blast + moreover have "Coplanar D E F K" + by (simp add: Q2 col__coplanar) + moreover have "D E K CongA G H I" + proof - + have "D E K CongA A B J" + proof cases + assume "Bet D E F" + then have "J B A CongA D E K" + by (metis P1 bet_conga__bet calculation(1) conga_diff45 conga_right_comm l11_13 not_conga_sym) + then show ?thesis + using conga_right_comm not_conga_sym by blast + next + assume "\ Bet D E F" + then have W2: "E Out D F" + using Q2 or_bet_out by blast + have "A B J CongA D E K" + proof - + have "A B C CongA F E K" + by (simp add: Q3) + moreover have "A \ B" + using Q1 col_trivial_1 by auto + moreover have "E Out D F" + by (simp add: W2) + moreover have "B Out J C" + proof - + have "D E F CongA J B C" + by (simp add: P1 conga_left_comm conga_sym) + then show ?thesis + using W2 out_conga_out by blast + qed + moreover have "K \ E" + using CongA_def Q3 by blast + ultimately show ?thesis + using l11_10 out_trivial by blast + qed + then show ?thesis + using not_conga_sym by blast + qed + then show ?thesis + using P1 not_conga by blast + qed + ultimately show ?thesis + using SumA_def by blast + qed + next + assume W3: "\ Col D E F" + then obtain K where W4: "A B C CongA F E K \ F E TS K D" + using Q1 ex_conga_ts not_col_permutation_3 by blast + show ?thesis + proof - + have "F E K CongA A B C" + using W4 not_conga_sym by blast + moreover have "\ E F OS D K" + proof - + have "E F TS D K" + using W4 invert_two_sides l9_2 by blast + then show ?thesis + using l9_9 by auto + qed + moreover have "Coplanar D E F K" + proof - + have "E F TS D K" + using W4 invert_two_sides l9_2 by blast + then show ?thesis + using ncoplanar_perm_8 ts__coplanar by blast + qed + moreover have "D E K CongA G H I" + proof - + have "A B J CongA K E D" + proof - + have "B C TS A J" + proof - + have "Coplanar B C A J" + using P1 ncoplanar_perm_12 by blast + 
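+                (* The two-sides relation B C TS A J is obtained from cop_nos__ts: J is
+                   coplanar with A, B, C, it is not on the same side of line B C as A,
+                   and neither A nor J lies on that line. *)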
moreover have "\ Col A B C" + by (simp add: Q1) + moreover have "\ B C OS A J" + using P1 by simp + moreover have "\ Col J B C" + proof - + { + assume "Col J B C" + have "Col D E F" + proof - + have "Col C B J" + using Col_perm \Col J B C\ by blast + moreover have "C B J CongA D E F" + by (simp add: P1) + ultimately show ?thesis + using col_conga_col by blast + qed + then have "False" + by (simp add: W3) + } + then show ?thesis by blast + qed + ultimately show ?thesis + using cop_nos__ts by blast + qed + moreover have "E F TS K D" + using W4 invert_two_sides by blast + moreover have "A B C CongA K E F" + by (simp add: W4 conga_right_comm) + moreover have "C B J CongA F E D" + by (simp add: P1 conga_right_comm) + ultimately show ?thesis + using l11_22a by auto + qed + then have "D E K CongA A B J" + using conga_left_comm not_conga_sym by blast + then show ?thesis + using P1 not_conga by blast + qed + ultimately show ?thesis + using SumA_def by blast + qed + qed + qed +qed + +lemma conga3_suma__suma: + assumes "A B C D E F SumA G H I" and + "A B C CongA A' B' C'" and + "D E F CongA D' E' F'" and + "G H I CongA G' H' I'" + shows "A' B' C' D' E' F' SumA G' H' I'" +proof - + have "D' E' F' A B C SumA G' H' I'" + proof - + obtain J where P1: "C B J CongA D E F \ \ B C OS A J \ Coplanar A B C J \ A B J CongA G H I" + using SumA_def assms(1) by blast + have "A B C D' E' F' SumA G' H' I'" + proof - + have "C B J CongA D' E' F'" + using P1 assms(3) not_conga by blast + moreover have "\ B C OS A J" + using P1 by simp + moreover have "Coplanar A B C J" + using P1 by simp + moreover have "A B J CongA G' H' I'" + using P1 assms(4) not_conga by blast + ultimately show ?thesis + using SumA_def by blast + qed + then show ?thesis + by (simp add: suma_sym) + qed + then obtain J where P2: "F' E' J CongA A B C \ \ E' F' OS D' J \ Coplanar D' E' F' J \ D' E' J CongA G' H' I'" + using SumA_def by blast + have "D' E' F' A' B' C' SumA G' H' I'" + proof - + have "F' E' J CongA A' B' C'" + proof - + have "F' E' J CongA A B C" + by (simp add: P2) + moreover have "A B C CongA A' B' C'" + by (simp add: assms(2)) + ultimately show ?thesis + using not_conga by blast + qed + moreover have "\ E' F' OS D' J" + using P2 by simp + moreover have "Coplanar D' E' F' J" + using P2 by simp + moreover have "D' E' J CongA G' H' I'" + by (simp add: P2) + ultimately show ?thesis + using SumA_def by blast + qed + then show ?thesis + by (simp add: suma_sym) +qed + +lemma out6_suma__suma: + assumes "A B C D E F SumA G H I" and + "B Out A A'" and + "B Out C C'" and + "E Out D D'" and + "E Out F F'" and + "H Out G G'" and + "H Out I I'" + shows "A' B C' D' E F' SumA G' H I'" +proof - + have "A B C CongA A' B C'" + using Out_cases assms(2) assms(3) out2__conga by blast + moreover have "D E F CongA D' E F'" + using Out_cases assms(4) assms(5) out2__conga by blast + moreover have "G H I CongA G' H I'" + by (simp add: assms(6) assms(7) l6_6 out2__conga) + ultimately show ?thesis + using assms(1) conga3_suma__suma by blast +qed + +lemma out546_suma__conga: + assumes "A B C D E F SumA G H I" and + "E Out D F" + shows "A B C CongA G H I" +proof - + have "A B C D E F SumA A B C" + proof - + have "C B C CongA D E F" + by (metis assms(1) assms(2) l11_21_b out_trivial suma_distincts) + moreover have "\ B C OS A C" + using os_distincts by auto + moreover have "Coplanar A B C C" + using ncop_distincts by auto + moreover have "A B C CongA A B C" + by (metis Tarski_neutral_dimensionless.suma_distincts Tarski_neutral_dimensionless_axioms assms(1) 
conga_pseudo_refl conga_right_comm) + ultimately show ?thesis + using SumA_def by blast + qed + then show ?thesis using suma2__conga assms(1) by blast +qed + +lemma out546__suma: + assumes "A \ B" and + "B \ C" and + "E Out D F" + shows "A B C D E F SumA A B C" +proof - + have P1: "D \ E" + using assms(3) out_diff1 by auto + have P2: "F \ E" + using Out_def assms(3) by auto + then obtain G H I where P3: "A B C D E F SumA G H I" + using P1 assms(1) assms(2) ex_suma by presburger + then have "G H I CongA A B C" + by (meson Tarski_neutral_dimensionless.conga_sym Tarski_neutral_dimensionless.out546_suma__conga Tarski_neutral_dimensionless_axioms assms(3)) + then show ?thesis + using P1 P2 P3 assms(1) assms(2) assms(3) conga3_suma__suma conga_refl out_diff1 by auto +qed + +lemma out213_suma__conga: + assumes "A B C D E F SumA G H I" and + "B Out A C" + shows "D E F CongA G H I" + using assms(1) assms(2) out546_suma__conga suma_sym by blast + +lemma out213__suma: + assumes "D \ E" and + "E \ F" and + "B Out A C" + shows "A B C D E F SumA D E F" + by (simp add: assms(1) assms(2) assms(3) out546__suma suma_sym) + +lemma suma_left_comm: + assumes "A B C D E F SumA G H I" + shows "C B A D E F SumA G H I" +proof - + have "A B C CongA C B A" + using assms conga_pseudo_refl suma_distincts by fastforce + moreover have "D E F CongA D E F" + by (metis assms conga_refl suma_distincts) + moreover have "G H I CongA G H I" + by (metis assms conga_refl suma_distincts) + ultimately show ?thesis + using assms conga3_suma__suma by blast +qed + +lemma suma_middle_comm: + assumes "A B C D E F SumA G H I" + shows "A B C F E D SumA G H I" + using assms suma_left_comm suma_sym by blast + +lemma suma_right_comm: + assumes "A B C D E F SumA G H I" + shows "A B C D E F SumA I H G" +proof - + have "A B C CongA A B C" + using assms conga_refl suma_distincts by fastforce + moreover have "D E F CongA D E F" + by (metis assms conga_refl suma_distincts) + moreover have "G H I CongA I H G" + by (meson Tarski_neutral_dimensionless.conga_right_comm Tarski_neutral_dimensionless.suma2__conga Tarski_neutral_dimensionless_axioms assms) + ultimately show ?thesis + using assms conga3_suma__suma by blast +qed + +lemma suma_comm: + assumes "A B C D E F SumA G H I" + shows "C B A F E D SumA I H G" + by (simp add: assms suma_left_comm suma_middle_comm suma_right_comm) + +lemma ts__suma: + assumes "A B TS C D" + shows "C B A A B D SumA C B D" +proof - + have "A B D CongA A B D" + by (metis Tarski_neutral_dimensionless.conga_right_comm Tarski_neutral_dimensionless_axioms assms conga_pseudo_refl ts_distincts) + moreover have "\ B A OS C D" + using assms invert_one_side l9_9 by blast + moreover have "Coplanar C B A D" + using assms ncoplanar_perm_14 ts__coplanar by blast + moreover have "C B D CongA C B D" + by (metis assms conga_refl ts_distincts) + ultimately show ?thesis + using SumA_def by blast +qed + +lemma ts__suma_1: + assumes "A B TS C D" + shows "C A B B A D SumA C A D" + by (simp add: assms invert_two_sides ts__suma) + +lemma inangle__suma: + assumes "P InAngle A B C" + shows "A B P P B C SumA A B C" +proof - + have "Coplanar A B P C" + by (simp add: assms coplanar_perm_8 inangle__coplanar) + moreover have "\ B P OS A C" + by (meson assms col123__nos col124__nos in_angle_two_sides invert_two_sides l9_9_bis not_col_permutation_5) + ultimately show ?thesis + using SumA_def assms conga_refl inangle_distincts by blast +qed + +lemma bet__suma: + assumes "A \ B" and + "B \ C" and + "P \ B" and "Bet A B C" + shows "A B P P B C SumA A B C" 
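+(* Proof idea: since Bet A B C, in_angle_line gives P InAngle A B C, and
+   inangle__suma then yields the angle sum A B P P B C SumA A B C. *)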
+proof - + have "P InAngle A B C" + using assms(1) assms(2) assms(3) assms(4) in_angle_line by auto + then show ?thesis + by (simp add: inangle__suma) +qed + +lemma sams_chara: + assumes "A \ B" and + "A' \ B" and + "Bet A B A'" + shows "SAMS A B C D E F \ D E F LeA C B A'" +proof - + { + assume T1: "SAMS A B C D E F" + obtain J where T2: "C B J CongA D E F \ \ B C OS A J \ \ A B TS C J \ Coplanar A B C J" + using SAMS_def T1 by auto + have T3: "A \ A'" + using assms(2) assms(3) between_identity by blast + have T4: "C \ B" + using T2 conga_distinct by blast + have T5: "J \ B" + using T2 conga_diff2 by blast + have T6: "D \ E" + using CongA_def T2 by auto + have T7: "F \ E" + using CongA_def T2 by blast + { + assume "E Out D F" + then have "D E F LeA C B A'" + by (simp add: T4 assms(2) l11_31_1) + } + { + assume T8: "\ Bet A B C" + have "D E F LeA C B A'" + proof cases + assume "Col A B C" + then have "Bet C B A'" + using T8 assms(1) assms(3) between_exchange3 outer_transitivity_between2 third_point by blast + then show ?thesis + by (simp add: T4 T6 T7 assms(2) l11_31_2) + next + assume T9: "\ Col A B C" + show ?thesis + proof cases + assume T10: "Col D E F" + show ?thesis + proof cases + assume T11: "Bet D E F" + have "D E F CongA C B J" + by (simp add: T2 conga_sym) + then have T12: "Bet C B J" + using T11 bet_conga__bet by blast + have "A B TS C J" + proof - + have "\ Col J A B" + using T5 T9 T12 bet_col col2__eq col_permutation_1 by blast + moreover have "\ T. Col T A B \ Bet C T J" + using T12 col_trivial_3 by blast + ultimately show ?thesis + using T9 TS_def col_permutation_1 by blast + qed + then have "False" + using T2 by simp + then show ?thesis by simp + next + assume "\ Bet D E F" + then show ?thesis + using T10 \E Out D F \ D E F LeA C B A'\ or_bet_out by auto + qed + next + assume T13: "\ Col D E F" + show ?thesis + proof - + have "C B J LeA C B A'" + proof - + have "J InAngle C B A'" + proof - + have "A' \ B" + by (simp add: assms(2)) + moreover have "Bet A B A'" + by (simp add: assms(3)) + moreover have "C InAngle A B J" + proof - + have "\ Col J B C" + proof - + have "\ Col D E F" + by (simp add: T13) + moreover have "D E F CongA J B C" + using T2 conga_left_comm not_conga_sym by blast + ultimately show ?thesis + using ncol_conga_ncol by blast + qed + then have "B C TS A J" + by (simp add: T2 T9 cop_nos__ts coplanar_perm_8) + then obtain X where T14: "Col X B C \ Bet A X J" + using TS_def by blast + { + assume T15: "X \ B" + have "B Out X C" + proof - + have "Col B X C" + by (simp add: Col_perm T14) + moreover have "B A OS X C" + proof - + have "A B OS X C" + proof - + have "A B OS X J" + by (smt T14 T9 T15 bet_out calculation col_transitivity_2 col_trivial_2 l6_21 out_one_side) + moreover have "A B OS J C" + by (metis T14 T2 T9 calculation cop_nts__os l5_2 not_col_permutation_2 one_side_chara one_side_symmetry) + ultimately show ?thesis + using one_side_transitivity by blast + qed + then show ?thesis + by (simp add: invert_one_side) + qed + ultimately show ?thesis + using col_one_side_out by auto + qed + } + then have "Bet A X J \ (X = B \ B Out X C)" + using T14 by blast + then show ?thesis + using InAngle_def T4 T5 assms(1) by auto + qed + ultimately show ?thesis + using in_angle_reverse l11_24 by blast + qed + moreover have "C B J CongA C B J" + by (simp add: T4 T5 conga_refl) + ultimately show ?thesis + by (simp add: inangle__lea) + qed + moreover have "D E F LeA C B J" + by (simp add: T2 conga__lea456123) + ultimately show ?thesis + using lea_trans by blast + qed + qed 
+ qed + } + then have "D E F LeA C B A'" + using SAMS_def T1 \E Out D F \ D E F LeA C B A'\ by blast + } + { + assume P1: "D E F LeA C B A'" + have P2: "A \ A'" + using assms(2) assms(3) between_identity by blast + have P3: "C \ B" + using P1 lea_distincts by auto + have P4: "D \ E" + using P1 lea_distincts by auto + have P5: "F \ E" + using P1 lea_distincts by auto + have "SAMS A B C D E F" + proof cases + assume P6: "Col A B C" + show ?thesis + proof cases + assume P7: "Bet A B C" + have "E Out D F" + proof - + have "B Out C A'" + by (meson Bet_perm P3 P7 assms(1) assms(2) assms(3) l6_2) + moreover have "C B A' CongA D E F" + using P1 calculation l11_21_b out_lea__out by blast + ultimately show ?thesis + using out_conga_out by blast + qed + moreover have "C B C CongA D E F" + using P3 calculation l11_21_b out_trivial by auto + moreover have "\ B C OS A C" + using os_distincts by auto + moreover have "\ A B TS C C" + by (simp add: not_two_sides_id) + moreover have "Coplanar A B C C" + using ncop_distincts by auto + ultimately show ?thesis + using SAMS_def assms(1) by blast + next + assume P8: "\ Bet A B C" + have P9: "B Out A C" + by (simp add: P6 P8 l6_4_2) + obtain J where P10: "D E F CongA C B J" + using P3 P4 P5 angle_construction_3 by blast + show ?thesis + proof - + have "C B J CongA D E F" + using P10 not_conga_sym by blast + moreover have "\ B C OS A J" + using Col_cases P6 one_side_not_col123 by blast + moreover have "\ A B TS C J" + using Col_cases P6 TS_def by blast + moreover have "Coplanar A B C J" + using P6 col__coplanar by auto + ultimately show ?thesis + using P8 SAMS_def assms(1) by blast + qed + qed + next + assume P11: "\ Col A B C" + have P12: "\ Col A' B C" + using P11 assms(2) assms(3) bet_col bet_col1 colx by blast + show ?thesis + proof cases + assume P13: "Col D E F" + have P14: "E Out D F" + proof - + { + assume P14: "Bet D E F" + have "D E F LeA C B A'" + by (simp add: P1) + then have "Bet C B A'" + using P14 bet_lea__bet by blast + then have "Col A' B C" + using Col_def Col_perm by blast + then have "False" + by (simp add: P12) + } + then have "\ Bet D E F" by auto + then show ?thesis + by (simp add: P13 l6_4_2) + qed + show ?thesis + proof - + have "C B C CongA D E F" + by (simp add: P3 P14 l11_21_b out_trivial) + moreover have "\ B C OS A C" + using os_distincts by auto + moreover have "\ A B TS C C" + by (simp add: not_two_sides_id) + moreover have "Coplanar A B C C" + using ncop_distincts by auto + ultimately show ?thesis + using P14 SAMS_def assms(1) by blast + qed + next + assume P15: "\ Col D E F" + obtain J where P16: "D E F CongA C B J \ C B TS J A" + using P11 P15 ex_conga_ts not_col_permutation_3 by presburger + show ?thesis + proof - + have "C B J CongA D E F" + by (simp add: P16 conga_sym) + moreover have "\ B C OS A J" + proof - + have "C B TS A J" + using P16 by (simp add: l9_2) + then show ?thesis + using invert_one_side l9_9 by blast + qed + moreover have "\ A B TS C J \ Coplanar A B C J" + proof cases + assume "Col A B J" + then show ?thesis + using TS_def ncop__ncols not_col_permutation_1 by blast + next + assume P17: "\ Col A B J" + have "\ A B TS C J" + proof - + have "A' B OS J C" + proof - + have "\ Col A' B C" + by (simp add: P12) + moreover have "\ Col B A' J" + proof - + { + assume "Col B A' J" + then have "False" + by (metis P17 assms(2) assms(3) bet_col col_trivial_2 colx) + } + then show ?thesis by auto + qed + moreover have "J InAngle A' B C" + proof - + obtain K where P20: "K InAngle C B A' \ D E F CongA C B K" + using LeA_def P1 
by blast + have "J InAngle C B A'" + proof - + have "C B A' CongA C B A'" + by (simp add: P3 assms(2) conga_pseudo_refl conga_right_comm) + moreover have "C B K CongA C B J" + proof - + have "C B K CongA D E F" + using P20 not_conga_sym by blast + moreover have "D E F CongA C B J" + by (simp add: P16) + ultimately show ?thesis + using not_conga by blast + qed + moreover have "K InAngle C B A'" + using P20 by simp + moreover have "C B OS J A'" + proof - + have "C B TS J A" using P16 + by simp + moreover have "C B TS A' A" + using Col_perm P12 assms(3) bet__ts between_symmetry calculation invert_two_sides ts_distincts by blast + ultimately show ?thesis + using OS_def by auto + qed + ultimately show ?thesis + using conga_preserves_in_angle by blast + qed + then show ?thesis + by (simp add: l11_24) + qed + ultimately show ?thesis + by (simp add: in_angle_one_side) + qed + then have "A' B OS C J" + by (simp add: one_side_symmetry) + then have "\ A' B TS C J" + by (simp add: l9_9_bis) + then show ?thesis + using assms(2) assms(3) bet_col bet_col1 col_preserves_two_sides by blast + qed + moreover have "Coplanar A B C J" + proof - + have "C B TS J A" + using P16 by simp + then show ?thesis + by (simp add: coplanar_perm_20 ts__coplanar) + qed + ultimately show ?thesis by auto + qed + ultimately show ?thesis + using P11 SAMS_def assms(1) bet_col by auto + qed + qed + qed + } + then show ?thesis + using \SAMS A B C D E F \ D E F LeA C B A'\ by blast +qed + +lemma sams_distincts: + assumes "SAMS A B C D E F" + shows "A \ B \ B \ C \ D \ E \ E \ F" +proof - + obtain J where P1: "C B J CongA D E F \ \ B C OS A J \ \ A B TS C J \ Coplanar A B C J" + using SAMS_def assms by auto + then show ?thesis + by (metis SAMS_def assms conga_distinct) +qed + +lemma sams_sym: + assumes "SAMS A B C D E F" + shows "SAMS D E F A B C" +proof - + have P1: "A \ B" + using assms sams_distincts by blast + have P3: "D \ E" + using assms sams_distincts by blast + obtain D' where P5: "E Midpoint D D'" + using symmetric_point_construction by blast + obtain A' where P6: "B Midpoint A A'" + using symmetric_point_construction by blast + have P8: "E \ D'" + using P3 P5 is_midpoint_id_2 by blast + have P9: "A \ A'" + using P1 P6 l7_3 by auto + then have P10: "B \ A'" + using P6 P9 midpoint_not_midpoint by auto + then have "D E F LeA C B A'" + using P1 P6 assms midpoint_bet sams_chara by fastforce + then have "D E F LeA A' B C" + by (simp add: lea_right_comm) + then have "A B C LeA D' E F" + by (metis Mid_cases P1 P10 P3 P5 P6 P8 l11_36 midpoint_bet) + then have "A B C LeA F E D'" + by (simp add: lea_right_comm) + moreover have "D \ E" + by (simp add: P3) + moreover have "D' \ E" + using P8 by auto + moreover have "Bet D E D'" + by (simp add: P5 midpoint_bet) + then show ?thesis + using P3 P8 calculation(1) sams_chara by fastforce +qed + +lemma sams_right_comm: + assumes "SAMS A B C D E F" + shows "SAMS A B C F E D" +proof - + have P1: "E Out D F \ \ Bet A B C" + using SAMS_def assms by blast + obtain J where P2: "C B J CongA D E F \ \ B C OS A J \ \ A B TS C J \ Coplanar A B C J" + using SAMS_def assms by auto + { + assume "E Out D F" + then have "E Out F D \ \ Bet A B C" + by (simp add: l6_6) + } + { + assume "\ Bet A B C" + then have "E Out F D \ \ Bet A B C" by auto + } + then have "E Out F D \ \ Bet A B C" + using \E Out D F \ E Out F D \ \ Bet A B C\ P1 by auto + moreover have "C B J CongA F E D" + proof - + have "C B J CongA D E F" + by (simp add: P2) + then show ?thesis + by (simp add: conga_right_comm) + qed + ultimately show 
?thesis + using P2 SAMS_def assms by auto +qed + +lemma sams_left_comm: + assumes "SAMS A B C D E F" + shows "SAMS C B A D E F" +proof - + have "SAMS D E F A B C" + by (simp add: assms sams_sym) + then have "SAMS D E F C B A" + using sams_right_comm by blast + then show ?thesis + using sams_sym by blast +qed + +lemma sams_comm: + assumes "SAMS A B C D E F" + shows "SAMS C B A F E D" + using assms sams_left_comm sams_right_comm by blast + +lemma conga2_sams__sams: + assumes "A B C CongA A' B' C'" and + "D E F CongA D' E' F'" and + "SAMS A B C D E F" + shows "SAMS A' B' C' D' E' F'" +proof - + obtain A0 where "B Midpoint A A0" + using symmetric_point_construction by auto + obtain A'0 where "B' Midpoint A' A'0" + using symmetric_point_construction by blast + have "D' E' F' LeA C' B' A'0" + proof - + have "D E F LeA C B A0" + by (metis \B Midpoint A A0\ assms(1) assms(3) conga_distinct midpoint_bet midpoint_distinct_2 sams_chara) + moreover have "D E F CongA D' E' F'" + by (simp add: assms(2)) + moreover have "C B A0 CongA C' B' A'0" + proof - + have "A0 B C CongA A'0 B' C'" + by (metis \B Midpoint A A0\ \B' Midpoint A' A'0\ assms(1) calculation(1) conga_diff45 l11_13 lea_distincts midpoint_bet midpoint_not_midpoint) + then show ?thesis + using conga_comm by blast + qed + ultimately show ?thesis + using l11_30 by blast + qed + then show ?thesis + by (metis \B' Midpoint A' A'0\ assms(1) conga_distinct lea_distincts midpoint_bet sams_chara) +qed + +lemma out546__sams: + assumes "A \ B" and + "B \ C" and + "E Out D F" + shows "SAMS A B C D E F" +proof - + obtain A' where "Bet A B A' \ Cong B A' A B" + using segment_construction by blast + then have "D E F LeA C B A'" + using assms(1) assms(2) assms(3) cong_diff_3 l11_31_1 by fastforce + then show ?thesis + using \Bet A B A' \ Cong B A' A B\ assms(1) lea_distincts sams_chara by blast +qed + +lemma out213__sams: + assumes "D \ E" and + "E \ F" and + "B Out A C" + shows "SAMS A B C D E F" + by (simp add: Tarski_neutral_dimensionless.sams_sym Tarski_neutral_dimensionless_axioms assms(1) assms(2) assms(3) out546__sams) + +lemma bet_suma__sams: + assumes "A B C D E F SumA G H I" and + "Bet G H I" + shows "SAMS A B C D E F" +proof - + obtain A' where P1: "C B A' CongA D E F \ \ B C OS A A' \ Coplanar A B C A' \ A B A' CongA G H I" + using SumA_def assms(1) by auto + then have "G H I CongA A B A'" + using not_conga_sym by blast + then have "Bet A B A'" + using assms(2) bet_conga__bet by blast + show ?thesis + proof - + have "E Out D F \ \ Bet A B C" + proof - + { + assume "Bet A B C" + then have "E Out D F" + proof - + have "B Out C A'" + proof - + have "C \ B" + using assms(1) suma_distincts by blast + moreover have "A' \ B" + using CongA_def \G H I CongA A B A'\ by blast + moreover have "A \ B" + using CongA_def \G H I CongA A B A'\ by blast + moreover have "Bet C B A" + by (simp add: Bet_perm \Bet A B C\) + ultimately show ?thesis + using Out_def \Bet A B A'\ \Bet A B C\ l5_2 by auto + qed + moreover have "C B A' CongA D E F" + using P1 by simp + ultimately show ?thesis + using l11_21_a by blast + qed + } + then show ?thesis + by blast + qed + moreover have "\ J. 
(C B J CongA D E F \ \ B C OS A J \ \ A B TS C J \ Coplanar A B C J)" + proof - + have "C B A' CongA D E F" + by (simp add: P1) + moreover have "\ B C OS A A'" + by (simp add: P1) + moreover have "\ A B TS C A'" + using Col_def TS_def \Bet A B A'\ by blast + moreover have "Coplanar A B C A'" + by (simp add: P1) + ultimately show ?thesis + by blast + qed + ultimately show ?thesis + using CongA_def SAMS_def \C B A' CongA D E F \ \ B C OS A A' \ Coplanar A B C A' \ A B A' CongA G H I\ by auto + qed +qed + +lemma bet__sams: + assumes "A \ B" and + "B \ C" and + "P \ B" and + "Bet A B C" + shows "SAMS A B P P B C" + by (meson assms(1) assms(2) assms(3) assms(4) bet__suma bet_suma__sams) + +lemma suppa__sams: + assumes "A B C SuppA D E F" + shows "SAMS A B C D E F" +proof - + obtain A' where P1: "Bet A B A' \ D E F CongA C B A'" + using SuppA_def assms by auto + then have "SAMS A B C C B A'" + by (metis assms bet__sams conga_diff45 conga_diff56 suppa2__conga123) + thus ?thesis + by (meson P1 assms conga2_sams__sams not_conga_sym suppa2__conga123) +qed + +lemma os_ts__sams: + assumes "B P TS A C" and + "A B OS P C" + shows "SAMS A B P P B C" +proof - + have "B Out P C \ \ Bet A B P" + using assms(2) bet_col col123__nos by blast + moreover have "\ J. (P B J CongA P B C \ \ B P OS A J \ \ A B TS P J \ Coplanar A B P J)" + by (metis assms(1) assms(2) conga_refl l9_9 os__coplanar os_distincts) + ultimately show ?thesis + using SAMS_def assms(2) os_distincts by auto +qed + +lemma os2__sams: + assumes "A B OS P C" and + "C B OS P A" + shows "SAMS A B P P B C" + by (simp add: Tarski_neutral_dimensionless.os_ts__sams Tarski_neutral_dimensionless_axioms assms(1) assms(2) invert_one_side l9_31) + +lemma inangle__sams: + assumes "P InAngle A B C" + shows "SAMS A B P P B C" +proof - + have "Bet A B C \ B Out A C \ \ Col A B C" + using l6_4_2 by blast + { + assume "Bet A B C" + then have "SAMS A B P P B C" + using assms bet__sams inangle_distincts by fastforce + } + { + assume "B Out A C" + then have "SAMS A B P P B C" + by (metis assms in_angle_out inangle_distincts out213__sams) + } + { + assume "\ Col A B C" + then have "\ Bet A B C" + using Col_def by auto + { + assume "Col B A P" + have "SAMS A B P P B C" + by (metis \Col B A P\ \\ Bet A B C\ assms col_in_angle_out inangle_distincts out213__sams) + } + { + assume "\ Col B A P" + { + assume "Col B C P" + have "SAMS A B P P B C" + by (metis Tarski_neutral_dimensionless.sams_comm Tarski_neutral_dimensionless_axioms \Col B C P\ \\ Bet A B C\ assms between_symmetry col_in_angle_out inangle_distincts l11_24 out546__sams) + } + { + assume "\ Col B C P" + have "SAMS A B P P B C" + proof - + have "B P TS A C" + by (simp add: \\ Col B A P\ \\ Col B C P\ assms in_angle_two_sides invert_two_sides) + moreover have "A B OS P C" + by (simp add: \\ Col A B C\ \\ Col B A P\ assms in_angle_one_side) + ultimately show ?thesis + by (simp add: os_ts__sams) + qed + } + then have "SAMS A B P P B C" + using \Col B C P \ SAMS A B P P B C\ by blast + } + then have "SAMS A B P P B C" + using \Col B A P \ SAMS A B P P B C\ by blast + } + thus ?thesis + using \B Out A C \ SAMS A B P P B C\ \Bet A B C \ SAMS A B P P B C\ \Bet A B C \ B Out A C \ \ Col A B C\ by blast +qed + +lemma sams_suma__lea123789: + assumes "A B C D E F SumA G H I" and + "SAMS A B C D E F" + shows "A B C LeA G H I" +proof cases + assume "Col A B C" + show ?thesis + proof cases + assume "Bet A B C" + have P1: "(A \ B \ (E Out D F \ \ Bet A B C)) \ (\ J. 
(C B J CongA D E F \ \ (B C OS A J) \ \ (A B TS C J) \ Coplanar A B C J))" + using SAMS_def assms(2) by auto + { + assume "E Out D F" + then have "A B C CongA G H I" + using assms(1) out546_suma__conga by auto + then have "A B C LeA G H I" + by (simp add: conga__lea) + } + thus ?thesis + using P1 \Bet A B C\ by blast + next + assume "\ Bet A B C" + then have "B Out A C" + using \Col A B C\ or_bet_out by auto + thus ?thesis + by (metis assms(1) l11_31_1 suma_distincts) + qed +next + assume "\ Col A B C" + show ?thesis + proof cases + assume "Col D E F" + show ?thesis + proof cases + assume "Bet D E F" + have "SAMS D E F A B C" + using assms(2) sams_sym by auto + then have "B Out A C" + using SAMS_def \Bet D E F\ by blast + thus ?thesis using l11_31_1 + by (metis assms(1) suma_distincts) + next + assume "\ Bet D E F" + have "A B C CongA G H I" + proof - + have "A B C D E F SumA G H I" + by (simp add: assms(1)) + moreover have "E Out D F" + using \Col D E F\ \\ Bet D E F\ l6_4_2 by auto + ultimately show ?thesis + using out546_suma__conga by auto + qed + show ?thesis + by (simp add: \A B C CongA G H I\ conga__lea) + qed + next + assume "\ Col D E F" + show ?thesis + proof cases + assume "Col G H I" + show ?thesis + proof cases + assume "Bet G H I" + thus ?thesis + by (metis assms(1) l11_31_2 suma_distincts) + next + assume "\ Bet G H I" + then have "H Out G I" + by (simp add: \Col G H I\ l6_4_2) + have "E Out D F \ \ Bet A B C" + using \\ Col A B C\ bet_col by auto + { + assume "\ Bet A B C" + then obtain J where P2: "C B J CongA D E F \ \ B C OS A J \ Coplanar A B C J \ A B J CongA G H I" + using SumA_def assms(1) by blast + have "G H I CongA A B J" + using P2 not_conga_sym by blast + then have "B Out A J" + using \H Out G I\ out_conga_out by blast + then have "B C OS A J" + using Col_perm \\ Col A B C\ out_one_side by blast + then have "False" + using \C B J CongA D E F \ \ B C OS A J \ Coplanar A B C J \ A B J CongA G H I\ by linarith + } + then have "False" + using Col_def \\ Col A B C\ by blast + thus ?thesis by blast + qed + next + assume "\ Col G H I" + obtain J where P4: "C B J CongA D E F \ \ B C OS A J \ \ A B TS C J \ Coplanar A B C J" + using SAMS_def assms(2) by auto + { + assume "Col J B C" + have "J B C CongA D E F" + by (simp add: P4 conga_left_comm) + then have "Col D E F" + using col_conga_col \Col J B C\ by blast + then have "False" + using \\ Col D E F\ by blast + } + then have "\ Col J B C" by blast + have "A B J CongA G H I" + proof - + have "A B C D E F SumA A B J" + proof - + have "C B J CongA D E F" + using P4 by simp + moreover have "\ B C OS A J" + by (simp add: P4) + moreover have "Coplanar A B C J" + by (simp add: P4) + moreover have "A B J CongA A B J" + by (metis \\ Col A B C\ \\ Col J B C\ col_trivial_1 conga_refl) + ultimately show ?thesis + using SumA_def by blast + qed + then show ?thesis + using assms(1) suma2__conga by auto + qed + have "\ Col J B A" + using \A B J CongA G H I\ \\ Col G H I\ col_conga_col not_col_permutation_3 by blast + have "A B C LeA A B J" + proof - + have "C InAngle A B J" + by (metis Col_perm P4 \\ Col A B C\ \\ Col J B A\ \\ Col J B C\ cop_nos__ts coplanar_perm_7 coplanar_perm_8 invert_two_sides l9_2 os_ts__inangle) + moreover have "A B C CongA A B C" + using calculation in_angle_asym inangle3123 inangle_distincts by auto + ultimately show ?thesis + using inangle__lea by blast + qed + thus ?thesis + using \A B J CongA G H I\ conga__lea lea_trans by blast + qed + qed +qed + +lemma sams_suma__lea456789: + assumes "A B C D E F SumA G H 
I" and + "SAMS A B C D E F" + shows "D E F LeA G H I" +proof - + have "D E F A B C SumA G H I" + by (simp add: assms(1) suma_sym) + moreover have "SAMS D E F A B C" + using assms(2) sams_sym by blast + ultimately show ?thesis + using sams_suma__lea123789 by auto +qed + +lemma sams_lea2__sams: + assumes "SAMS A' B' C' D' E' F'" and + "A B C LeA A' B' C'" and + "D E F LeA D' E' F'" + shows "SAMS A B C D E F" +proof - + obtain A0 where "B Midpoint A A0" + using symmetric_point_construction by auto + obtain A'0 where "B' Midpoint A' A'0" + using symmetric_point_construction by auto + have "D E F LeA C B A0" + proof - + have "D' E' F' LeA C B A0" + proof - + have "D' E' F' LeA C' B' A'0" + by (metis \B' Midpoint A' A'0\ assms(1) assms(2) lea_distincts midpoint_bet midpoint_distinct_2 sams_chara) + moreover have "C' B' A'0 LeA C B A0" + by (metis Mid_cases \B Midpoint A A0\ \B' Midpoint A' A'0\ assms(2) l11_36_aux2 l7_3_2 lea_comm lea_distincts midpoint_bet sym_preserve_diff) + ultimately show ?thesis + using lea_trans by blast + qed + moreover have "D E F LeA D' E' F'" + using assms(3) by auto + ultimately show ?thesis + using \D' E' F' LeA C B A0\ assms(3) lea_trans by blast + qed + then show ?thesis + by (metis \B Midpoint A A0\ assms(2) lea_distincts midpoint_bet sams_chara) +qed + +lemma sams_lea456_suma2__lea: + assumes "D E F LeA D' E' F'" and + "SAMS A B C D' E' F'" and + "A B C D E F SumA G H I" and + "A B C D' E' F' SumA G' H' I'" + shows "G H I LeA G' H' I'" +proof cases + assume "E' Out D' F'" + have "G H I CongA G' H' I'" + proof - + have "G H I CongA A B C" + proof - + have "A B C D E F SumA G H I" + by (simp add: assms(3)) + moreover have "E Out D F" + using \E' Out D' F'\ assms(1) out_lea__out by blast + ultimately show ?thesis + using conga_sym out546_suma__conga by blast + qed + moreover have "A B C CongA G' H' I'" + using \E' Out D' F'\ assms(4) out546_suma__conga by blast + ultimately show ?thesis + using conga_trans by blast + qed + thus ?thesis + by (simp add: conga__lea) +next + assume T1: "\ E' Out D' F'" + show ?thesis + proof cases + assume T2: "Col A B C" + have "E' Out D' F' \ \ Bet A B C" + using assms(2) SAMS_def by simp + { + assume "\ Bet A B C" + then have "B Out A C" + by (simp add: T2 l6_4_2) + have "G H I LeA G' H' I'" + proof - + have "D E F LeA D' E' F'" + by (simp add: assms(1)) + moreover have "D E F CongA G H I" + using \B Out A C\ assms(3) out213_suma__conga by auto + moreover have "D' E' F' CongA G' H' I'" + using \B Out A C\ assms(4) out213_suma__conga by auto + ultimately show ?thesis + using l11_30 by blast + qed + } + thus ?thesis + using T1 \E' Out D' F' \ \ Bet A B C\ by auto + next + assume "\ Col A B C" + show ?thesis + proof cases + assume "Col D' E' F'" + have "SAMS D' E' F' A B C" + by (simp add: assms(2) sams_sym) + { + assume "\ Bet D' E' F'" + then have "G H I LeA G' H' I'" + using not_bet_out T1 \Col D' E' F'\ by auto + } + thus ?thesis + by (metis assms(2) assms(3) assms(4) bet_lea__bet l11_31_2 sams_suma__lea456789 suma_distincts) + next + assume "\ Col D' E' F'" + show ?thesis + proof cases + assume "Col D E F" + have "\ Bet D E F" + using bet_lea__bet Col_def \\ Col D' E' F'\ assms(1) by blast + thus ?thesis + proof - + have "A B C LeA G' H' I'" + using assms(2) assms(4) sams_suma__lea123789 by auto + moreover have "A B C CongA G H I" + by (meson \Col D E F\ \\ Bet D E F\ assms(3) or_bet_out out213_suma__conga suma_sym) + moreover have "G' H' I' CongA G' H' I'" + proof - + have "G' \ H'" + using calculation(1) lea_distincts by blast + 
moreover have "H' \ I'" + using \A B C LeA G' H' I'\ lea_distincts by blast + ultimately show ?thesis + using conga_refl by auto + qed + ultimately show ?thesis + using l11_30 by blast + qed + next + assume "\ Col D E F" + show ?thesis + proof cases + assume "Col G' H' I'" + show ?thesis + proof cases + assume "Bet G' H' I'" + show ?thesis + proof - + have "G \ H" + using assms(3) suma_distincts by auto + moreover have "I \ H" + using assms(3) suma_distincts by blast + moreover have "G' \ H'" + using assms(4) suma_distincts by auto + moreover have "I' \ H'" + using assms(4) suma_distincts by blast + ultimately show ?thesis + by (simp add: \Bet G' H' I'\ l11_31_2) + qed + next + assume "\ Bet G' H' I'" + have "B Out A C" + proof - + have "H' Out G' I'" + using \Col G' H' I'\ l6_4_2 by (simp add: \\ Bet G' H' I'\) + moreover have "A B C LeA G' H' I'" using sams_suma__lea123789 + using assms(2) assms(4) by auto + ultimately show ?thesis + using out_lea__out by blast + qed + then have "Col A B C" + using Col_perm out_col by blast + then have "False" + using \\ Col A B C\ by auto + thus ?thesis by blast + qed + next + assume "\ Col G' H' I'" + obtain F'1 where P5: "C B F'1 CongA D' E' F' \ \ B C OS A F'1 \ \ A B TS C F'1 \ Coplanar A B C F'1" + using SAMS_def assms(2) by auto + have P6: "D E F LeA C B F'1" + proof - + have "D E F CongA D E F" + using \\ Col D E F\ conga_refl not_col_distincts by fastforce + moreover have "D' E' F' CongA C B F'1" + by (simp add: P5 conga_sym) + ultimately show ?thesis + using assms(1) l11_30 by blast + qed + then obtain F1 where P6: "F1 InAngle C B F'1 \ D E F CongA C B F1" + using LeA_def by auto + have "A B F'1 CongA G' H' I'" + proof - + have "A B C D' E' F' SumA A B F'1" + proof - + have "C B F'1 CongA D' E' F'" + using P5 by blast + moreover have "\ B C OS A F'1" + using P5 by auto + moreover have "Coplanar A B C F'1" + by (simp add: P5) + moreover have "A B F'1 CongA A B F'1" + proof - + have "A \ B" + using \\ Col A B C\ col_trivial_1 by blast + moreover have "B \ F'1" + using P6 inangle_distincts by auto + ultimately show ?thesis + using conga_refl by auto + qed + ultimately show ?thesis + using SumA_def by blast + qed + moreover have "A B C D' E' F' SumA G' H' I'" + by (simp add: assms(4)) + ultimately show ?thesis + using suma2__conga by auto + qed + have "\ Col A B F'1" + using \A B F'1 CongA G' H' I'\ \\ Col G' H' I'\ col_conga_col by blast + have "\ Col C B F'1" + proof - + have "\ Col D' E' F'" + by (simp add: \\ Col D' E' F'\) + moreover have "D' E' F' CongA C B F'1" + using P5 not_conga_sym by blast + ultimately show ?thesis + using ncol_conga_ncol by blast + qed + show ?thesis + proof - + have "A B F1 LeA A B F'1" + proof - + have "F1 InAngle A B F'1" + proof - + have "F1 InAngle F'1 B A" + proof - + have "F1 InAngle F'1 B C" + by (simp add: P6 l11_24) + moreover have "C InAngle F'1 B A" + proof - + have "B C TS A F'1" + using Col_perm P5 \\ Col A B C\ \\ Col C B F'1\ cop_nos__ts ncoplanar_perm_12 by blast + obtain X where "Col X B C \ Bet A X F'1" + using TS_def \B C TS A F'1\ by auto + have "Bet F'1 X A" + using Bet_perm \Col X B C \ Bet A X F'1\ by blast + moreover have "(X = B) \ (B Out X C)" + proof - + have "B A OS X C" + proof - + have "A B OS X F'1" + by (metis \Col X B C \ Bet A X F'1\ \\ Col A B C\ \\ Col A B F'1\ bet_out_1 calculation out_one_side) + moreover have "A B OS F'1 C" + using Col_perm P5 \\ Col A B C\ \\ Col A B F'1\ cop_nos__ts one_side_symmetry by blast + ultimately show ?thesis + using invert_one_side 
one_side_transitivity by blast + qed + thus ?thesis + using Col_cases \Col X B C \ Bet A X F'1\ col_one_side_out by blast + qed + ultimately show ?thesis + by (metis InAngle_def \\ Col A B C\ \\ Col A B F'1\ not_col_distincts) + qed + ultimately show ?thesis + using in_angle_trans by blast + qed + then show ?thesis + using l11_24 by blast + qed + moreover have "A B F1 CongA A B F1" + proof - + have "A \ B" + using \\ Col A B C\ col_trivial_1 by blast + moreover have "B \ F1" + using P6 conga_diff56 by blast + ultimately show ?thesis + using conga_refl by auto + qed + ultimately show ?thesis + by (simp add: inangle__lea) + qed + moreover have "A B F1 CongA G H I" + proof - + have "A B C D E F SumA A B F1" + proof - + have "B C TS F1 A" + proof - + have "B C TS F'1 A" + proof - + have "B C TS A F'1" + using Col_perm P5 \\ Col A B C\ \\ Col C B F'1\ cop_nos__ts ncoplanar_perm_12 by blast + thus ?thesis + using l9_2 by blast + qed + moreover have "B C OS F'1 F1" + proof - + have "\ Col C B F'1" + by (simp add: \\ Col C B F'1\) + moreover have "\ Col B C F1" + proof - + have "\ Col D E F" + using \\ Col D E F\ by auto + moreover have "D E F CongA C B F1" + by (simp add: P6) + ultimately show ?thesis + using ncol_conga_ncol not_col_permutation_4 by blast + qed + moreover have "F1 InAngle C B F'1" using P6 by blast + ultimately show ?thesis + using in_angle_one_side invert_one_side one_side_symmetry by blast + qed + ultimately show ?thesis + using l9_8_2 by blast + qed + thus ?thesis + proof - + have "C B F1 CongA D E F" + using P6 not_conga_sym by blast + moreover have "\ B C OS A F1" + using \B C TS F1 A\ l9_9 one_side_symmetry by blast + moreover have "Coplanar A B C F1" + using \B C TS F1 A\ ncoplanar_perm_9 ts__coplanar by blast + moreover have "A B F1 CongA A B F1" + proof - + have "A \ B" + using \\ Col A B C\ col_trivial_1 by blast + moreover have "B \ F1" + using P6 conga_diff56 by blast + ultimately show ?thesis + using conga_refl by auto + qed + ultimately show ?thesis + using SumA_def by blast + qed + qed + moreover have "A B C D E F SumA G H I" + by (simp add: assms(3)) + ultimately show ?thesis + using suma2__conga by auto + qed + ultimately show ?thesis + using \A B F'1 CongA G' H' I'\ l11_30 by blast + qed + qed + qed + qed + qed +qed + +lemma sams_lea123_suma2__lea: + assumes "A B C LeA A' B' C'" and + "SAMS A' B' C' D E F" and + "A B C D E F SumA G H I" and + "A' B' C' D E F SumA G' H' I'" + shows "G H I LeA G' H' I'" + by (meson assms(1) assms(2) assms(3) assms(4) sams_lea456_suma2__lea sams_sym suma_sym) + +lemma sams_lea2_suma2__lea: + assumes "A B C LeA A' B' C'" and + "D E F LeA D' E' F'" and + "SAMS A' B' C' D' E' F'" and + "A B C D E F SumA G H I" and + "A' B' C' D' E' F' SumA G' H' I'" + shows "G H I LeA G' H' I'" +proof - + obtain G'' H'' I'' where "A B C D' E' F' SumA G'' H'' I''" + using assms(4) assms(5) ex_suma suma_distincts by presburger + have "G H I LeA G'' H'' I''" + proof - + have "D E F LeA D' E' F'" + by (simp add: assms(2)) + moreover have "SAMS A B C D' E' F'" + proof - + have "SAMS A' B' C' D' E' F'" + by (simp add: assms(3)) + moreover have "A B C LeA A' B' C'" + using assms(1) by auto + moreover have "D' E' F' LeA D' E' F'" + using assms(2) lea_distincts lea_refl by blast + ultimately show ?thesis + using sams_lea2__sams by blast + qed + moreover have "A B C D E F SumA G H I" + by (simp add: assms(4)) + moreover have "A B C D' E' F' SumA G'' H'' I''" + by (simp add: \A B C D' E' F' SumA G'' H'' I''\) + ultimately show ?thesis + using 
sams_lea456_suma2__lea by blast + qed + moreover have "G'' H'' I'' LeA G' H' I'" + using sams_lea123_suma2__lea assms(3) assms(5) \A B C D' E' F' SumA G'' H'' I''\ assms(1) by blast + ultimately show ?thesis + using lea_trans by blast +qed + +lemma sams2_suma2__conga456: + assumes "SAMS A B C D E F" and + "SAMS A B C D' E' F'" and + "A B C D E F SumA G H I" and + "A B C D' E' F' SumA G H I" + shows "D E F CongA D' E' F'" +proof cases + assume "Col A B C" + show ?thesis + proof cases + assume P2: "Bet A B C" + then have "E Out D F" + using assms(1) SAMS_def by blast + moreover have "E' Out D' F'" + using P2 assms(2) SAMS_def by blast + ultimately show ?thesis + by (simp add: l11_21_b) + next + assume "\ Bet A B C" + then have "B Out A C" + using \Col A B C\ or_bet_out by blast + show ?thesis + proof - + have "D E F CongA G H I" + proof - + have "A B C D E F SumA G H I" + by (simp add: assms(3)) + thus ?thesis + using \B Out A C\ out213_suma__conga by auto + qed + moreover have "G H I CongA D' E' F'" + proof - + have "A B C D' E' F' SumA G H I" + by (simp add: assms(4)) + then have "D' E' F' CongA G H I" + using \B Out A C\ out213_suma__conga by auto + thus ?thesis + using not_conga_sym by blast + qed + ultimately show ?thesis + using not_conga by blast + qed + qed +next + assume "\ Col A B C" + obtain J where T1: "C B J CongA D E F \ \ B C OS A J \ \ A B TS C J \ Coplanar A B C J" + using assms(1) SAMS_def by blast + have T1A: "C B J CongA D E F" + using T1 by simp + have T1B: "\ B C OS A J" + using T1 by simp + have T1C: "\ A B TS C J" + using T1 by simp + have T1D: "Coplanar A B C J" + using T1 by simp + obtain J' where T2: "C B J' CongA D' E' F' \ \ B C OS A J' \ \ A B TS C J' \ Coplanar A B C J'" + using assms(2) SAMS_def by blast + have T2A: "C B J' CongA D' E' F'" + using T2 by simp + have T2B: "\ B C OS A J'" + using T2 by simp + have T2C: "\ A B TS C J'" + using T2 by simp + have T2D: "Coplanar A B C J'" + using T2 by simp + have T3: "C B J CongA C B J'" + proof - + have "A B J CongA A B J'" + proof - + have "A B J CongA G H I" + proof - + have "A B C D E F SumA A B J" + using SumA_def T1A T1B T1D \\ Col A B C\ conga_distinct conga_refl not_col_distincts by auto + thus ?thesis + using assms(3) suma2__conga by blast + qed + moreover have "G H I CongA A B J'" + proof - + have "A B C D' E' F' SumA G H I" + by (simp add: assms(4)) + moreover have "A B C D' E' F' SumA A B J'" + using SumA_def T2A T2B T2D \\ Col A B C\ conga_distinct conga_refl not_col_distincts by auto + ultimately show ?thesis + using suma2__conga by auto + qed + ultimately show ?thesis + using conga_trans by blast + qed + have "B Out J J' \ A B TS J J'" + proof - + have "Coplanar A B J J'" + using T1D T2D \\ Col A B C\ coplanar_trans_1 ncoplanar_perm_8 not_col_permutation_2 by blast + moreover have "A B J CongA A B J'" + by (simp add: \A B J CongA A B J'\) + ultimately show ?thesis + by (simp add: conga_cop__or_out_ts) + qed + { + assume "B Out J J'" + then have "C B J CongA C B J'" + by (metis Out_cases \\ Col A B C\ bet_out between_trivial not_col_distincts out2__conga) + } + { + assume "A B TS J J'" + then have "A B OS J C" + by (meson T1C T1D TS_def \\ Col A B C\ cop_nts__os not_col_permutation_2 one_side_symmetry) + then have "A B TS C J'" + using \A B TS J J'\ l9_8_2 by blast + then have "False" + by (simp add: T2C) + then have "C B J CongA C B J'" + by blast + } + thus ?thesis + using \B Out J J' \ C B J CongA C B J'\ \B Out J J' \ A B TS J J'\ by blast + qed + then have "C B J CongA D' E' F'" + using T2A 
not_conga by blast + thus ?thesis + using T1A not_conga not_conga_sym by blast +qed + +lemma sams2_suma2__conga123: + assumes "SAMS A B C D E F" and + "SAMS A' B' C' D E F" and + "A B C D E F SumA G H I" and + "A' B' C' D E F SumA G H I" + shows "A B C CongA A' B' C'" +proof - + have "SAMS D E F A B C" + by (simp add: assms(1) sams_sym) + moreover have "SAMS D E F A' B' C'" + by (simp add: assms(2) sams_sym) + moreover have "D E F A B C SumA G H I" + by (simp add: assms(3) suma_sym) + moreover have "D E F A' B' C' SumA G H I" + using assms(4) suma_sym by blast + ultimately show ?thesis + using sams2_suma2__conga456 by auto +qed + +lemma suma_assoc_1: + assumes "SAMS A B C D E F" and + "SAMS D E F G H I" and + "A B C D E F SumA A' B' C'" and + "D E F G H I SumA D' E' F'" and + "A' B' C' G H I SumA K L M" + shows "A B C D' E' F' SumA K L M" +proof - + obtain A0 where P1: "Bet A B A0 \ Cong A B B A0" + using Cong_perm segment_construction by blast + obtain D0 where P2: "Bet D E D0 \ Cong D E E D0" + using Cong_perm segment_construction by blast + show ?thesis + proof cases + assume "Col A B C" + show ?thesis + proof cases + assume "Bet A B C" + then have "E Out D F" + using SAMS_def assms(1) by simp + show ?thesis + proof - + have "A' B' C' CongA A B C" + using assms(3) \E Out D F\ conga_sym out546_suma__conga by blast + moreover have "G H I CongA D' E' F'" + using assms(4) \E Out D F\ out213_suma__conga by auto + ultimately show ?thesis + by (meson Tarski_neutral_dimensionless.conga3_suma__suma Tarski_neutral_dimensionless.suma2__conga Tarski_neutral_dimensionless_axioms assms(5)) + qed + next + assume "\ Bet A B C" + then have "B Out A C" + using \Col A B C\ l6_4_2 by auto + have "A \ B" + using \B Out A C\ out_distinct by auto + have "B \ C" + using \\ Bet A B C\ between_trivial by auto + have "D' \ E'" + using assms(4) suma_distincts by blast + have "E' \ F'" + using assms(4) suma_distincts by auto + show ?thesis + proof - + obtain K0 L0 M0 where P3:"A B C D' E' F' SumA K0 L0 M0" + using ex_suma \A \ B\ \B \ C\ \D' \ E'\ \E' \ F'\ by presburger + moreover have "A B C CongA A B C" + using \A \ B\ \B \ C\ conga_refl by auto + moreover have "D' E' F' CongA D' E' F'" + using \D' \ E'\ \E' \ F'\ conga_refl by auto + moreover have "K0 L0 M0 CongA K L M" + proof - + have "K0 L0 M0 CongA D' E' F'" + using P3 \B Out A C\ conga_sym out213_suma__conga by blast + moreover have "D' E' F' CongA K L M" + proof - + have "D E F G H I SumA D' E' F'" + by (simp add: assms(4)) + moreover have "D E F G H I SumA K L M" + by (meson Tarski_neutral_dimensionless.conga3_suma__suma Tarski_neutral_dimensionless.out213_suma__conga Tarski_neutral_dimensionless.sams2_suma2__conga456 Tarski_neutral_dimensionless.suma2__conga Tarski_neutral_dimensionless_axioms \B Out A C\ assms(2) assms(3) assms(5) calculation not_conga_sym) + ultimately show ?thesis + using suma2__conga by auto + qed + ultimately show ?thesis + using not_conga by blast + qed + ultimately show ?thesis + using conga3_suma__suma by blast + qed + qed + next + assume T1: "\ Col A B C" + have "\ Col C B A0" + by (metis Col_def P1 \\ Col A B C\ cong_diff l6_16_1) + show ?thesis + proof cases + assume "Col D E F" + show ?thesis + proof cases + assume "Bet D E F" + have "H Out G I" using SAMS_def assms(2) \Bet D E F\ by blast + have "A B C D E F SumA A' B' C'" + by (simp add: assms(3)) + moreover have "A B C CongA A B C" + by (metis \\ Col A B C\ conga_pseudo_refl conga_right_comm not_col_distincts) + moreover have "D E F CongA D' E' F'" + using \H Out G I\ 
assms(4) out546_suma__conga by auto + moreover have "A' B' C' CongA K L M" + using \H Out G I\ assms(5) out546_suma__conga by auto + ultimately show ?thesis + using conga3_suma__suma by blast + next + assume "\ Bet D E F" + then have "E Out D F" + using not_bet_out by (simp add: \Col D E F\) + show ?thesis + proof - + have "A' B' C' CongA A B C" + using assms(3) \E Out D F\ conga_sym out546_suma__conga by blast + moreover have "G H I CongA D' E' F'" + using out546_suma__conga \E Out D F\ assms(4) out213_suma__conga by auto + moreover have "K L M CongA K L M" + proof - + have "K \ L \ L \ M" + using assms(5) suma_distincts by blast + thus ?thesis + using conga_refl by auto + qed + ultimately show ?thesis + using assms(5) conga3_suma__suma by blast + qed + qed + next + assume "\ Col D E F" + then have "\ Col F E D0" + by (metis Col_def P2 cong_diff l6_16_1 not_col_distincts) + show ?thesis + proof cases + assume "Col G H I" + show ?thesis + proof cases + assume "Bet G H I" + have "SAMS G H I D E F" + by (simp add: assms(2) sams_sym) + then have "E Out D F" + using SAMS_def \Bet G H I\ by blast + then have "Col D E F" + using Col_perm out_col by blast + then have "False" + using \\ Col D E F\ by auto + thus ?thesis by simp + next + assume "\ Bet G H I" + then have "H Out G I" + using SAMS_def by (simp add: \Col G H I\ l6_4_2) + show ?thesis + proof - + have "A B C CongA A B C" + by (metis \\ Col A B C\ conga_refl not_col_distincts) + moreover have "D E F CongA D' E' F'" + using assms(4) out546_suma__conga \H Out G I\ by auto + moreover have "A' B' C' CongA K L M" + using \H Out G I\ assms(5) out546_suma__conga by auto + ultimately show ?thesis + using assms(3) conga3_suma__suma by blast + qed + qed + next + assume "\ Col G H I" + have "\ B C OS A A0" + using P1 col_trivial_1 one_side_chara by blast + have "E F TS D D0" + by (metis P2 \\ Col D E F\ \\ Col F E D0\ bet__ts bet_col between_trivial not_col_permutation_1) + show ?thesis + proof cases + assume "Col A' B' C'" + show ?thesis + proof cases + assume "Bet A' B' C'" + show ?thesis + proof cases + assume "Col D' E' F'" + show ?thesis + proof cases + assume "Bet D' E' F'" + have "A B C CongA G H I" + proof - + have "A B C CongA D0 E F" + proof - + have "SAMS A B C D E F" + by (simp add: assms(1)) + moreover have "SAMS D0 E F D E F" + by (metis P2 \\ Col D E F\ \\ Col F E D0\ bet__sams between_symmetry not_col_distincts sams_right_comm) + moreover have "A B C D E F SumA A' B' C'" + by (simp add: assms(3)) + moreover have "D0 E F D E F SumA A' B' C'" + proof - + have "D E F D0 E F SumA A' B' C'" + proof - + have "F E D0 CongA D0 E F" + by (metis \\ Col F E D0\ col_trivial_1 col_trivial_2 conga_pseudo_refl) + moreover have "\ E F OS D D0" + using P2 col_trivial_1 one_side_chara by blast + moreover have "Coplanar D E F D0" + by (meson P2 bet__coplanar ncoplanar_perm_1) + moreover have "D E D0 CongA A' B' C'" + using assms(3) P2 \Bet A' B' C'\ \\ Col F E D0\ conga_line not_col_distincts suma_distincts by auto + ultimately show ?thesis + using SumA_def by blast + qed + thus ?thesis + by (simp add: \D E F D0 E F SumA A' B' C'\ suma_sym) + qed + ultimately show ?thesis + using sams2_suma2__conga123 by blast + qed + moreover have "D0 E F CongA G H I" + proof - + have "SAMS D E F D0 E F" + using P2 \\ Col D E F\ \\ Col F E D0\ bet__sams not_col_distincts sams_right_comm by auto + moreover have "D E F D0 E F SumA D' E' F'" + proof - + have "F E D0 CongA D0 E F" + by (metis (no_types) \\ Col F E D0\ col_trivial_1 col_trivial_2 conga_pseudo_refl) + 
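(* the remaining SumA_def witnesses follow: the non-one-side and coplanarity conditions for D0, and the flat angle D E D0 congruent to D' E' F' *) + 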
moreover have "\ E F OS D D0" + using P2 col_trivial_1 one_side_chara by blast + moreover have "Coplanar D E F D0" + using P2 bet__coplanar ncoplanar_perm_1 by blast + moreover have "D E D0 CongA D' E' F'" + using assms(3) P2 \Bet D' E' F'\ \\ Col F E D0\ assms(4) conga_line not_col_distincts suma_distincts by auto + ultimately show ?thesis + using SumA_def by blast + qed + ultimately show ?thesis + using assms(2) assms(4) sams2_suma2__conga456 by auto + qed + ultimately show ?thesis + using conga_trans by blast + qed + then have "G H I CongA A B C" + using not_conga_sym by blast + have "D' E' F' A B C SumA K L M" + proof - + have "A' B' C' CongA D' E' F'" + by (metis Tarski_neutral_dimensionless.suma_distincts Tarski_neutral_dimensionless_axioms \Bet A' B' C'\ \Bet D' E' F'\ assms(4) assms(5) conga_line) + then show ?thesis + by (meson Tarski_neutral_dimensionless.conga3_suma__suma Tarski_neutral_dimensionless.suma2__conga Tarski_neutral_dimensionless_axioms \G H I CongA A B C\ assms(5)) + qed + thus ?thesis + by (simp add: suma_sym) + next + assume "\ Bet D' E' F'" + then have "E' Out D' F'" + by (simp add: \Col D' E' F'\ l6_4_2) + have "D E F LeA D' E' F'" + using assms(2) assms(4) sams_suma__lea123789 by blast + then have "E Out D F" + using \E' Out D' F'\ out_lea__out by blast + then have "Col D E F" + using Col_perm out_col by blast + then have "False" + using \\ Col D E F\ by auto + thus ?thesis by simp + qed + next + assume "\ Col D' E' F'" + have "D E F CongA C B A0" + proof - + have "SAMS A B C D E F" + by (simp add: assms(1)) + moreover have "SAMS A B C C B A0" + using P1 \\ Col A B C\ \\ Col C B A0\ bet__sams not_col_distincts by auto + moreover have "A B C D E F SumA A' B' C'" + by (simp add: assms(3)) + moreover have "A B C C B A0 SumA A' B' C'" + proof - + have "A B C C B A0 SumA A B A0" + by (metis P1 \\ Col A B C\ \\ Col C B A0\ bet__suma col_trivial_1 col_trivial_2) + moreover have "A B C CongA A B C" + using \SAMS A B C C B A0\ calculation sams2_suma2__conga123 by auto + moreover have "C B A0 CongA C B A0" + using \SAMS A B C C B A0\ calculation(1) sams2_suma2__conga456 by auto + moreover have "A B A0 CongA A' B' C'" + using P1 \Bet A' B' C'\ \\ Col C B A0\ assms(3) conga_line not_col_distincts suma_distincts by auto + ultimately show ?thesis + using conga3_suma__suma by blast + qed + ultimately show ?thesis + using sams2_suma2__conga456 by blast + qed + have "SAMS C B A0 G H I" + proof - + have "D E F CongA C B A0" + by (simp add: \D E F CongA C B A0\) + moreover have "G H I CongA G H I" + using \\ Col G H I\ conga_refl not_col_distincts by fastforce + moreover have "SAMS D E F G H I" + by (simp add: assms(2)) + ultimately show ?thesis + using conga2_sams__sams by blast + qed + then obtain J where P3: "A0 B J CongA G H I \ \ B A0 OS C J \ \ C B TS A0 J \ Coplanar C B A0 J" + using SAMS_def by blast + obtain F1 where P4: "F E F1 CongA G H I \ \ E F OS D F1 \ \ D E TS F F1 \ Coplanar D E F F1" + using SAMS_def assms(2) by auto + have "C B J CongA D' E' F'" + proof - + have "C B J CongA D E F1" + proof - + have "(B A0 TS C J \ E F TS D F1) \ (B A0 OS C J \ E F OS D F1)" + proof - + have "B A0 TS C J" + proof - + have "Coplanar B A0 C J" + using P3 ncoplanar_perm_12 by blast + moreover have "\ Col C B A0" + by (simp add: \\ Col C B A0\) + moreover have "\ Col J B A0" + using P3 \\ Col G H I\ col_conga_col not_col_permutation_3 by blast + moreover have "\ B A0 OS C J" + using P3 by simp + ultimately show ?thesis + by (simp add: cop_nos__ts) + qed + moreover have "E F TS D 
F1" + proof - + have "Coplanar E F D F1" + using P4 ncoplanar_perm_12 by blast + moreover have "\ Col D E F" + by (simp add: \\ Col D E F\) + moreover have "\ Col F1 E F" + using P4 \\ Col G H I\ col_conga_col col_permutation_3 by blast + moreover have "\ E F OS D F1" + using P4 by auto + ultimately show ?thesis + by (simp add: cop_nos__ts) + qed + ultimately show ?thesis + by simp + qed + moreover have "C B A0 CongA D E F" + using \D E F CongA C B A0\ not_conga_sym by blast + moreover have "A0 B J CongA F E F1" + proof - + have "A0 B J CongA G H I" + by (simp add: P3) + moreover have "G H I CongA F E F1" + using P4 not_conga_sym by blast + ultimately show ?thesis + using conga_trans by blast + qed + ultimately show ?thesis + using l11_22 by auto + qed + moreover have "D E F1 CongA D' E' F'" + proof - + have "D E F G H I SumA D E F1" + using P4 SumA_def \\ Col D E F\ conga_distinct conga_refl not_col_distincts by auto + moreover have "D E F G H I SumA D' E' F'" + by (simp add: assms(4)) + ultimately show ?thesis + using suma2__conga by auto + qed + ultimately show ?thesis + using conga_trans by blast + qed + show ?thesis + proof - + have "A B C D' E' F' SumA A B J" + proof - + have "C B TS J A" + proof - + have "C B TS A0 A" + proof - + have "B \ A0" + using \\ Col C B A0\ not_col_distincts by blast + moreover have "\ Col B C A" + using Col_cases \\ Col A B C\ by auto + moreover have "Bet A B A0" + by (simp add: P1) + ultimately show ?thesis + by (metis Bet_cases Col_cases \\ Col C B A0\ bet__ts invert_two_sides not_col_distincts) + qed + moreover have "C B OS A0 J" + proof - + have "\ Col J C B" + using \C B J CongA D' E' F'\ \\ Col D' E' F'\ col_conga_col not_col_permutation_2 by blast + moreover have "\ Col A0 C B" + using Col_cases \\ Col C B A0\ by blast + ultimately show ?thesis + using P3 cop_nos__ts by blast + qed + ultimately show ?thesis + using l9_8_2 by blast + qed + moreover have "C B J CongA D' E' F'" + by (simp add: \C B J CongA D' E' F'\) + moreover have "\ B C OS A J" + using calculation(1) invert_one_side l9_9_bis one_side_symmetry by blast + moreover have "Coplanar A B C J" + using calculation(1) ncoplanar_perm_15 ts__coplanar by blast + moreover have "A B J CongA A B J" + proof - + have "A \ B" + using \\ Col A B C\ col_trivial_1 by auto + moreover have "B \ J" + using \C B TS J A\ ts_distincts by blast + ultimately show ?thesis + using conga_refl by auto + qed + ultimately show ?thesis + using SumA_def by blast + qed + moreover have "A B J CongA K L M" + proof - + have "A' B' C' G H I SumA A B J" + proof - + have "A B A0 CongA A' B' C'" + using P1 \Bet A' B' C'\ \\ Col A B C\ \\ Col C B A0\ assms(5) conga_line not_col_distincts suma_distincts by auto + moreover have "A0 B J CongA G H I" + by (simp add: P3) + moreover have "A B A0 A0 B J SumA A B J" + proof - + have "A0 B J CongA A0 B J" + proof - + have "A0 \ B" + using \\ Col C B A0\ col_trivial_2 by auto + moreover have "B \ J" + using CongA_def \A0 B J CongA G H I\ by blast + ultimately show ?thesis + using conga_refl by auto + qed + moreover have "\ B A0 OS A J" + by (simp add: Col_def P1 col123__nos) + moreover have "Coplanar A B A0 J" + using P1 bet__coplanar by auto + moreover have "A B J CongA A B J" + using P1 \\ Col A B C\ between_symmetry calculation(1) l11_13 not_col_distincts by blast + ultimately show ?thesis + using SumA_def by blast + qed + ultimately show ?thesis + by (meson conga3_suma__suma suma2__conga) + qed + moreover have "A' B' C' G H I SumA K L M" + by (simp add: assms(5)) + ultimately show 
?thesis + using suma2__conga by auto + qed + ultimately show ?thesis + proof - + have "A B C CongA A B C \ D' E' F' CongA D' E' F'" + using CongA_def \A B J CongA K L M\ \C B J CongA D' E' F'\ conga_refl by presburger + then show ?thesis + using \A B C D' E' F' SumA A B J\ \A B J CongA K L M\ conga3_suma__suma by blast + qed + qed + qed + next + assume "\ Bet A' B' C'" + have "B Out A C" + proof - + have "A B C LeA A' B' C'" using assms(1) assms(3) sams_suma__lea123789 by auto + moreover have "B' Out A' C'" + using \Col A' B' C'\ \\ Bet A' B' C'\ or_bet_out by blast + ultimately show ?thesis + using out_lea__out by blast + qed + then have "Col A B C" + using Col_perm out_col by blast + then have "False" + using \\ Col A B C\ by auto + thus ?thesis by simp + qed + next + assume "\ Col A' B' C'" + obtain C1 where P6: "C B C1 CongA D E F \ \ B C OS A C1 \ \ A B TS C C1 \ Coplanar A B C C1" + using SAMS_def assms(1) by auto + have P6A: "C B C1 CongA D E F" + using P6 by simp + have P6B: "\ B C OS A C1" + using P6 by simp + have P6C: "\ A B TS C C1" + using P6 by simp + have P6D: "Coplanar A B C C1" + using P6 by simp + have "A' B' C' CongA A B C1" + proof - + have "A B C D E F SumA A B C1" + using P6A P6B P6D SumA_def \\ Col A B C\ conga_distinct conga_refl not_col_distincts by auto + moreover have "A B C D E F SumA A' B' C'" + by (simp add: assms(3)) + ultimately show ?thesis + using suma2__conga by auto + qed + have "B C1 OS C A" + proof - + have "B A OS C C1" + proof - + have "A B OS C C1" + proof - + have "\ Col C A B" + using Col_perm \\ Col A B C\ by blast + moreover have "\ Col C1 A B" + using \\ Col A' B' C'\ \A' B' C' CongA A B C1\ col_permutation_1 ncol_conga_ncol by blast + ultimately show ?thesis + using P6C P6D cop_nos__ts by blast + qed + thus ?thesis + by (simp add: invert_one_side) + qed + moreover have "B C TS A C1" + proof - + have "Coplanar B C A C1" + using P6D ncoplanar_perm_12 by blast + moreover have "\ Col C1 B C" + proof - + have "D E F CongA C1 B C" + using P6A conga_left_comm not_conga_sym by blast + thus ?thesis + using \\ Col D E F\ ncol_conga_ncol by blast + qed + ultimately show ?thesis + using T1 P6B cop_nos__ts by blast + qed + ultimately show ?thesis + using os_ts1324__os one_side_symmetry by blast + qed + show ?thesis + proof cases + assume "Col D' E' F'" + show ?thesis + proof cases + assume "Bet D' E' F'" + obtain C0 where P7: "Bet C B C0 \ Cong C B B C0" + using Cong_perm segment_construction by blast + have "B C1 TS C C0" + by (metis P7 \B C1 OS C A\ bet__ts cong_diff_2 not_col_distincts one_side_not_col123) + show ?thesis + proof - + have "A B C C B C0 SumA A B C0" + proof - + have "C B C0 CongA C B C0" + by (metis P7 T1 cong_diff conga_line not_col_distincts) + moreover have "\ B C OS A C0" + using P7 bet_col col124__nos invert_one_side by blast + moreover have "Coplanar A B C C0" + using P7 bet__coplanar ncoplanar_perm_15 by blast + moreover have "A B C0 CongA A B C0" + by (metis P7 T1 cong_diff conga_refl not_col_distincts) + ultimately show ?thesis + using SumA_def by blast + qed + moreover have "A B C0 CongA K L M" + proof - + have "A' B' C' G H I SumA A B C0" + proof - + have "A B C1 C1 B C0 SumA A B C0" + using \B C1 TS C C0\ \B C1 OS C A\ l9_8_2 ts__suma_1 by blast + moreover have "A B C1 CongA A' B' C'" + by (simp add: P6 \A' B' C' CongA A B C1\ conga_sym) + moreover have "C1 B C0 CongA G H I" + proof - + have "SAMS C B C1 C1 B C0" + by (metis P7 \B C1 TS C C0\ bet__sams ts_distincts) + moreover have "SAMS C B C1 G H I" + proof - + have "D E F 
CongA C B C1" + using P6A not_conga_sym by blast + moreover have "G H I CongA G H I" + using \\ Col G H I\ conga_refl not_col_distincts by fastforce + moreover have "SAMS D E F G H I" + by (simp add: assms(2)) + ultimately show ?thesis + using conga2_sams__sams by blast + qed + moreover have "C B C1 C1 B C0 SumA C B C0" + by (simp add: \B C1 TS C C0\ ts__suma_1) + moreover have "C B C1 G H I SumA C B C0" + proof - + have "D E F G H I SumA D' E' F'" + by (simp add: assms(4)) + moreover have "D E F CongA C B C1" + using P6A not_conga_sym by blast + moreover have "G H I CongA G H I" + using \\ Col G H I\ conga_refl not_col_distincts by fastforce + moreover have "D' E' F' CongA C B C0" using P7 assms(4) + by (metis P6A Tarski_neutral_dimensionless.suma_distincts Tarski_neutral_dimensionless_axioms \Bet D' E' F'\ cong_diff conga_diff1 conga_line) + ultimately show ?thesis + using conga3_suma__suma by blast + qed + ultimately show ?thesis + using sams2_suma2__conga456 by auto + qed + moreover have "A B C0 CongA A B C0" + by (metis P7 T1 cong_diff conga_refl not_col_distincts) + ultimately show ?thesis + using conga3_suma__suma by blast + qed + thus ?thesis + using assms(5) suma2__conga by auto + qed + moreover have "A B C CongA A B C" + proof - + have "A \ B \ B \ C" + using T1 col_trivial_1 col_trivial_2 by auto + thus ?thesis + using conga_refl by auto + qed + moreover have "C B C0 CongA D' E' F'" + proof - + have "C \ B" + using T1 col_trivial_2 by blast + moreover have "B \ C0" + using \B C1 TS C C0\ ts_distincts by blast + moreover have "D' \ E'" + using assms(4) suma_distincts by blast + moreover have "E' \ F'" + using assms(4) suma_distincts by auto + ultimately show ?thesis + by (simp add: P7 \Bet D' E' F'\ conga_line) + qed + ultimately show ?thesis + using conga3_suma__suma by blast + qed + next + assume "\ Bet D' E' F'" + then have "E' Out D' F'" + by (simp add: \Col D' E' F'\ l6_4_2) + have "D E F LeA D' E' F'" + using sams_suma__lea123789 assms(2) assms(4) by auto + then have "E Out D F" + using \E' Out D' F'\ out_lea__out by blast + then have "False" + using Col_cases \\ Col D E F\ out_col by blast + thus ?thesis by simp + qed + next + assume "\ Col D' E' F'" + have "SAMS C B C1 G H I" + proof - + have "D E F CongA C B C1" + using P6A not_conga_sym by blast + moreover have "G H I CongA G H I" + using \\ Col G H I\ conga_refl not_col_distincts by fastforce + ultimately show ?thesis + using assms(2) conga2_sams__sams by blast + qed + then obtain J where P7: "C1 B J CongA G H I \ \ B C1 OS C J \ \ C B TS C1 J \ Coplanar C B C1 J" + using SAMS_def by blast + have P7A: "C1 B J CongA G H I" + using P7 by simp + have P7B: "\ B C1 OS C J" + using P7 by simp + have P7C: "\ C B TS C1 J" + using P7 by simp + have P7D: "Coplanar C B C1 J" + using P7 by simp + obtain F1 where P8: "F E F1 CongA G H I \ \ E F OS D F1 \ \ D E TS F F1 \ Coplanar D E F F1" + using SAMS_def assms(2) by auto + have P8A: "F E F1 CongA G H I" + using P8 by simp + have P8B: "\ E F OS D F1" + using P8 by simp + have P8C: "\ D E TS F F1" + using P8 by simp + have P8D: "Coplanar D E F F1" + using P8 by simp + have "C B J CongA D' E' F'" + proof - + have "C B J CongA D E F1" + proof - + have "B C1 TS C J" + proof - + have "Coplanar B C1 C J" + using P7D ncoplanar_perm_12 by blast + moreover have "\ Col C B C1" + using \B C1 OS C A\ not_col_permutation_2 one_side_not_col123 by blast + moreover have "\ Col J B C1" + using P7 \\ Col G H I\ col_conga_col not_col_permutation_3 by blast + moreover have "\ B C1 OS C J" + by (simp 
add: P7B) + ultimately show ?thesis + by (simp add: cop_nos__ts) + qed + moreover have "E F TS D F1" + proof - + have "Coplanar E F D F1" + using P8D ncoplanar_perm_12 by blast + moreover have "\ Col F1 E F" + using P8 \\ Col G H I\ col_conga_col not_col_permutation_3 by blast + ultimately show ?thesis + using P8B \\ Col D E F\ cop_nos__ts by blast + qed + moreover have "C B C1 CongA D E F" + using P6A by blast + moreover have "C1 B J CongA F E F1" + using P8 by (meson P7A not_conga not_conga_sym) + ultimately show ?thesis + using l11_22a by blast + qed + moreover have "D E F1 CongA D' E' F'" + proof - + have "D E F G H I SumA D E F1" + using P8A P8B P8D SumA_def \\ Col D E F\ conga_distinct conga_refl not_col_distincts by auto + moreover have "D E F G H I SumA D' E' F'" + by (simp add: assms(4)) + ultimately show ?thesis + using suma2__conga by auto + qed + ultimately show ?thesis + using conga_trans by blast + qed + have "\ Col C B C1" + using \B C1 OS C A\ col123__nos col_permutation_1 by blast + show ?thesis + proof - + have "A B C C B J SumA A B J" + proof - + have "B C TS J A" + proof - + have "B C TS C1 A" using cop_nos__ts + using P6B P6D T1 \\ Col C B C1\ l9_2 ncoplanar_perm_12 not_col_permutation_3 by blast + moreover have "B C OS C1 J" + proof - + have "\ Col C1 C B" + using Col_perm \\ Col C B C1\ by blast + moreover have "\ Col J C B" + using \C B J CongA D' E' F'\ \\ Col D' E' F'\ col_conga_col col_permutation_1 by blast + ultimately show ?thesis + using P7C P7D cop_nos__ts invert_one_side by blast + qed + ultimately show ?thesis + using l9_8_2 by blast + qed + thus ?thesis + by (simp add: l9_2 ts__suma_1) + qed + moreover have "A B C CongA A B C" + using T1 conga_refl not_col_distincts by fastforce + moreover have "A B J CongA K L M" + proof - + have "A' B' C' G H I SumA A B J" + proof - + have "A B C1 C1 B J SumA A B J" + proof - + have "B C1 TS A J" + proof - + have "B C1 TS C J" + proof - + have "Coplanar B C1 C J" + using P7D ncoplanar_perm_12 by blast + moreover have "\ Col J B C1" + using P7 \\ Col G H I\ col_conga_col not_col_permutation_3 by blast + ultimately show ?thesis + by (simp add: \\ Col C B C1\ P7B cop_nos__ts) + qed + moreover have "B C1 OS C A" + by (simp add: \B C1 OS C A\) + ultimately show ?thesis + using l9_8_2 by blast + qed + thus ?thesis + by (simp add: ts__suma_1) + qed + moreover have "A B C1 CongA A' B' C'" + using \A' B' C' CongA A B C1\ not_conga_sym by blast + moreover have "C1 B J CongA G H I" + by (simp add: P7A) + moreover have "A B J CongA A B J" + using \A B C C B J SumA A B J\ suma2__conga by auto + ultimately show ?thesis + using conga3_suma__suma by blast + qed + moreover have "A' B' C' G H I SumA K L M" + using assms(5) by simp + ultimately show ?thesis + using suma2__conga by auto + qed + ultimately show ?thesis + using \C B J CongA D' E' F'\ conga3_suma__suma by blast + qed + qed + qed + qed + qed + qed +qed + +lemma suma_assoc_2: + assumes "SAMS A B C D E F" and + "SAMS D E F G H I" and + "A B C D E F SumA A' B' C'" and + "D E F G H I SumA D' E' F'" and + "A B C D' E' F' SumA K L M" + shows "A' B' C' G H I SumA K L M" + by (meson assms(1) assms(2) assms(3) assms(4) assms(5) sams_sym suma_assoc_1 suma_sym) + +lemma suma_assoc: + assumes "SAMS A B C D E F" and + "SAMS D E F G H I" and + "A B C D E F SumA A' B' C'" and + "D E F G H I SumA D' E' F'" + shows + "A' B' C' G H I SumA K L M \ A B C D' E' F' SumA K L M" + by (meson assms(1) assms(2) assms(3) assms(4) suma_assoc_1 suma_assoc_2) + +lemma sams_assoc_1: + assumes "SAMS A B C D E 
F" and + "SAMS D E F G H I" and + "A B C D E F SumA A' B' C'" and + "D E F G H I SumA D' E' F'" and + "SAMS A' B' C' G H I" + shows "SAMS A B C D' E' F'" +proof cases + assume "Col A B C" + { + assume "E Out D F" + have "SAMS A B C D' E' F'" + proof - + have "A' B' C' CongA A B C" + using assms(3) \E Out D F\ conga_sym out546_suma__conga by blast + moreover have "G H I CongA D' E' F'" + using \E Out D F\ assms(4) out213_suma__conga by blast + ultimately show ?thesis + using assms(5) conga2_sams__sams by blast + qed + } + { + assume "\ Bet A B C" + then have P1: "B Out A C" + using \Col A B C\ or_bet_out by blast + have "SAMS D' E' F' A B C" + proof - + have "D' \ E'" + using assms(4) suma_distincts by auto + moreover have "F' E' F' CongA A B C" + proof - + have "E' \ F'" + using assms(4) suma_distincts by blast + then have "E' Out F' F'" + using out_trivial by auto + thus ?thesis + using P1 l11_21_b by blast + qed + moreover have "\ E' F' OS D' F'" + using os_distincts by blast + moreover have "\ D' E' TS F' F'" + by (simp add: not_two_sides_id) + moreover have "Coplanar D' E' F' F'" + using ncop_distincts by blast + ultimately show ?thesis using SAMS_def P1 by blast + qed + then have "SAMS A B C D' E' F'" + using sams_sym by blast + } + thus ?thesis + using SAMS_def assms(1) \E Out D F \ SAMS A B C D' E' F'\ by blast +next + assume "\ Col A B C" + show ?thesis + proof cases + assume "Col D E F" + have "H Out G I \ \ Bet D E F" + using SAMS_def assms(2) by blast + { + assume "H Out G I" + have "SAMS A B C D' E' F'" + proof - + have "A B C CongA A B C" + using \\ Col A B C\ conga_refl not_col_distincts by fastforce + moreover have "D E F CongA D' E' F'" + using \H Out G I\ assms(4) out546_suma__conga by blast + ultimately show ?thesis + using assms(1) conga2_sams__sams by blast + qed + } + { + assume "\ Bet D E F" + then have "E Out D F" + using \Col D E F\ l6_4_2 by blast + have "SAMS A B C D' E' F'" + proof - + have "A' B' C' CongA A B C" + using out546_suma__conga \E Out D F\ assms(3) not_conga_sym by blast + moreover have "G H I CongA D' E' F'" + using out213_suma__conga \E Out D F\ assms(4) by auto + ultimately show ?thesis + using assms(5) conga2_sams__sams by auto + qed + } + thus ?thesis + using \H Out G I \ SAMS A B C D' E' F'\ \H Out G I \ \ Bet D E F\ by blast + next + assume "\ Col D E F" + show ?thesis + proof cases + assume "Col G H I" + have "SAMS G H I D E F" + by (simp add: assms(2) sams_sym) + then have "E Out D F \ \ Bet G H I" + using SAMS_def by blast + { + assume "E Out D F" + then have "False" + using Col_cases \\ Col D E F\ out_col by blast + then have "SAMS A B C D' E' F'" + by simp + } + { + assume "\ Bet G H I" + then have "H Out G I" + by (simp add: \Col G H I\ l6_4_2) + have "SAMS A B C D' E' F'" + proof - + have "A B C CongA A B C" + by (metis \\ Col A B C\ col_trivial_1 col_trivial_2 conga_refl) + moreover have "D E F CongA D' E' F'" + using out546_suma__conga \H Out G I\ assms(4) by blast + moreover have "SAMS A B C D E F" + using assms(1) by auto + ultimately show ?thesis + using conga2_sams__sams by auto + qed + } + thus ?thesis + using \E Out D F \ \ Bet G H I\ \E Out D F \ SAMS A B C D' E' F'\ by blast + next + assume "\ Col G H I" + show ?thesis + proof - + have "\ Bet A B C" + using Col_def \\ Col A B C\ by auto + moreover have "\ J. 
(C B J CongA D' E' F' \ \ B C OS A J \ \ A B TS C J \ Coplanar A B C J)" + proof - + have "\ Col A' B' C'" + proof - + { + assume "Col A' B' C'" + have "H Out G I \ \ Bet A' B' C'" + using SAMS_def assms(5) by blast + { + assume "H Out G I" + then have "False" + using Col_cases \\ Col G H I\ out_col by blast + } + { + assume "\ Bet A' B' C'" + then have "B' Out A' C'" + using not_bet_out \Col A' B' C'\ by blast + have "A B C LeA A' B' C'" + using assms(1) assms(3) sams_suma__lea123789 by auto + then have "B Out A C" + using \B' Out A' C'\ out_lea__out by blast + then have "Col A B C" + using Col_perm out_col by blast + then have "False" + using \\ Col A B C\ by auto + } + then have "False" + using \H Out G I \ False\ \H Out G I \ \ Bet A' B' C'\ by blast + } + thus ?thesis by blast + qed + obtain C1 where P1: "C B C1 CongA D E F \ \ B C OS A C1 \ \ A B TS C C1 \ Coplanar A B C C1" + using SAMS_def assms(1) by auto + have P1A: "C B C1 CongA D E F" + using P1 by simp + have P1B: "\ B C OS A C1" + using P1 by simp + have P1C: "\ A B TS C C1" + using P1 by simp + have P1D: "Coplanar A B C C1" + using P1 by simp + have "A B C1 CongA A' B' C'" + proof - + have "A B C D E F SumA A B C1" + using P1A P1B P1D SumA_def \\ Col A B C\ conga_distinct conga_refl not_col_distincts by auto + thus ?thesis + using assms(3) suma2__conga by auto + qed + have "SAMS C B C1 G H I" + proof - + have "D E F CongA C B C1" + using P1A not_conga_sym by blast + moreover have "G H I CongA G H I" + using \\ Col G H I\ conga_refl not_col_distincts by fastforce + ultimately show ?thesis using conga2_sams__sams + using assms(2) by blast + qed + then obtain J where T1: "C1 B J CongA G H I \ \ B C1 OS C J \ \ C B TS C1 J \ Coplanar C B C1 J" + using SAMS_def by auto + have T1A: "C1 B J CongA G H I" + using T1 by simp + have T1B: "\ B C1 OS C J" + using T1 by simp + have T1C: "\ C B TS C1 J" + using T1 by simp + have T1D: "Coplanar C B C1 J" + using T1 by simp + have "SAMS A B C1 C1 B J" + proof - + have "A' B' C' CongA A B C1" + using \A B C1 CongA A' B' C'\ not_conga_sym by blast + moreover have "G H I CongA C1 B J" + using T1A not_conga_sym by blast + ultimately show ?thesis + using assms(5) conga2_sams__sams by auto + qed + then obtain J' where T2: "C1 B J' CongA C1 B J \ \ B C1 OS A J' \ \ A B TS C1 J' \ Coplanar A B C1 J'" + using SAMS_def by auto + have T2A: "C1 B J' CongA C1 B J" + using T2 by simp + have T2B: "\ B C1 OS A J'" + using T2 by simp + have T2C: "\ A B TS C1 J'" + using T2 by simp + have T2D: "Coplanar A B C1 J'" + using T2 by simp + have "A' B' C' CongA A B C1" + using \A B C1 CongA A' B' C'\ not_conga_sym by blast + then have "\ Col A B C1" + using ncol_conga_ncol \\ Col A' B' C'\ by blast + have "D E F CongA C B C1" + using P1A not_conga_sym by blast + then have "\ Col C B C1" + using ncol_conga_ncol \\ Col D E F\ by blast + then have "Coplanar C1 B A J" + using coplanar_trans_1 P1D T1D coplanar_perm_15 coplanar_perm_6 by blast + have "Coplanar C1 B J' J" + using T2D \Coplanar C1 B A J\ \\ Col A B C1\ coplanar_perm_14 coplanar_perm_6 coplanar_trans_1 by blast + have "B Out J' J \ C1 B TS J' J" + by (meson T2 \Coplanar C1 B A J\ \\ Col A B C1\ conga_cop__or_out_ts coplanar_trans_1 ncoplanar_perm_14 ncoplanar_perm_6) + { + assume "B Out J' J" + have "\ J. 
(C B J CongA D' E' F' \ \ B C OS A J \ \ A B TS C J \ Coplanar A B C J)" + proof - + have "C B C1 C1 B J SumA C B J" + proof - + have "C1 B J CongA C1 B J" + using CongA_def T2A conga_refl by auto + moreover have "C B J CongA C B J" + using \\ Col C B C1\ calculation conga_diff56 conga_pseudo_refl conga_right_comm not_col_distincts by blast + ultimately show ?thesis + using T1D T1B SumA_def by blast + qed + then have "D E F G H I SumA C B J" + using conga3_suma__suma by (meson P1A T1A suma2__conga) + then have "C B J CongA D' E' F'" + using assms(4) suma2__conga by auto + moreover have "\ B C OS A J" + by (metis (no_types, lifting) Col_perm P1B P1D T1C \\ Col A B C\ \\ Col C B C1\ cop_nos__ts coplanar_perm_8 invert_two_sides l9_2 l9_8_2) + moreover have "\ A B TS C J" + proof cases + assume "Col A B J" + thus ?thesis + using TS_def invert_two_sides not_col_permutation_3 by blast + next + assume "\ Col A B J" + have "A B OS C J" + proof - + have "A B OS C C1" + by (simp add: P1C P1D \\ Col A B C1\ \\ Col A B C\ cop_nts__os not_col_permutation_2) + moreover have "A B OS C1 J" + proof - + have "A B OS C1 J'" + by (metis T2C T2D \B Out J' J\ \\ Col A B C1\ \\ Col A B J\ col_trivial_2 colx cop_nts__os not_col_permutation_2 out_col out_distinct) + thus ?thesis + using \B Out J' J\ invert_one_side out_out_one_side by blast + qed + ultimately show ?thesis + using one_side_transitivity by blast + qed + thus ?thesis + using l9_9 by blast + qed + moreover have "Coplanar A B C J" + by (meson P1D \Coplanar C1 B A J\ \\ Col A B C1\ coplanar_perm_18 coplanar_perm_2 coplanar_trans_1 not_col_permutation_2) + ultimately show ?thesis + by blast + qed + } + { + assume "C1 B TS J' J" + have "B C1 OS C J" + proof - + have "B C1 TS C J'" + proof - + have "B C1 TS A J'" + by (meson T2B T2D TS_def \C1 B TS J' J\ \\ Col A B C1\ cop_nts__os invert_two_sides ncoplanar_perm_12) + moreover have "B C1 OS A C" + by (meson P1B P1C P1D \\ Col A B C1\ \\ Col A B C\ \\ Col C B C1\ cop_nts__os invert_one_side ncoplanar_perm_12 not_col_permutation_2 not_col_permutation_3 os_ts1324__os) + ultimately show ?thesis + using l9_8_2 by blast + qed + moreover have "B C1 TS J J'" + using \C1 B TS J' J\ invert_two_sides l9_2 by blast + ultimately show ?thesis + using OS_def by blast + qed + then have "False" + by (simp add: T1B) + then have "\ J. (C B J CongA D' E' F' \ \ B C OS A J \ \ A B TS C J \ Coplanar A B C J)" + by auto + } + thus ?thesis + using \B Out J' J \ \J. 
C B J CongA D' E' F' \ \ B C OS A J \ \ A B TS C J \ Coplanar A B C J\ \B Out J' J \ C1 B TS J' J\ by blast + qed + ultimately show ?thesis + using SAMS_def not_bet_distincts by auto + qed + qed + qed +qed + +lemma sams_assoc_2: + assumes "SAMS A B C D E F" and + "SAMS D E F G H I" and + "A B C D E F SumA A' B' C'" and + "D E F G H I SumA D' E' F'" and + "SAMS A B C D' E' F'" + shows "SAMS A' B' C' G H I" +proof - + have "SAMS G H I A' B' C'" + proof - + have "SAMS G H I D E F" + by (simp add: assms(2) sams_sym) + moreover have "SAMS D E F A B C" + by (simp add: assms(1) sams_sym) + moreover have "G H I D E F SumA D' E' F'" + by (simp add: assms(4) suma_sym) + moreover have "D E F A B C SumA A' B' C'" + by (simp add: assms(3) suma_sym) + moreover have "SAMS D' E' F' A B C" + by (simp add: assms(5) sams_sym) + ultimately show ?thesis + using sams_assoc_1 by blast + qed + thus ?thesis + using sams_sym by blast +qed + +lemma sams_assoc: + assumes "SAMS A B C D E F" and + "SAMS D E F G H I" and + "A B C D E F SumA A' B' C'" and + "D E F G H I SumA D' E' F'" + shows "(SAMS A' B' C' G H I) \ (SAMS A B C D' E' F')" + using sams_assoc_1 sams_assoc_2 + by (meson assms(1) assms(2) assms(3) assms(4)) + +lemma sams_nos__nts: + assumes "SAMS A B C C B J" and + "\ B C OS A J" + shows "\ A B TS C J" +proof - + obtain J' where P1: "C B J' CongA C B J \ \ B C OS A J' \ \ A B TS C J' \ Coplanar A B C J'" + using SAMS_def assms(1) by blast + have P1A: "C B J' CongA C B J" + using P1 by simp + have P1B: "\ B C OS A J'" + using P1 by simp + have P1C: "\ A B TS C J'" + using P1 by simp + have P1D: "Coplanar A B C J'" + using P1 by simp + have P2: "B Out C J \ \ Bet A B C" + using SAMS_def assms(1) by blast + { + assume "A B TS C J" + have "Coplanar C B J J'" + proof - + have "\ Col A C B" + using TS_def \A B TS C J\ not_col_permutation_4 by blast + moreover have "Coplanar A C B J" + using \A B TS C J\ ncoplanar_perm_2 ts__coplanar by blast + moreover have "Coplanar A C B J'" + using P1D ncoplanar_perm_2 by blast + ultimately show ?thesis + using coplanar_trans_1 by blast + qed + have "B Out J J' \ C B TS J J'" + by (metis P1 \A B TS C J\ \Coplanar C B J J'\ bet_conga__bet bet_out col_conga_col col_two_sides_bet conga_distinct conga_os__out conga_sym cop_nts__os invert_two_sides l5_2 l6_6 not_col_permutation_3 not_col_permutation_4) + { + assume "B Out J J'" + have "\ Col B A J \ \ Col B A J'" + using TS_def \A B TS C J\ not_col_permutation_3 by blast + then have "A B OS C J'" + by (metis (full_types) \B Out J J'\ Col_cases P1C P1D TS_def \A B TS C J\ col2__eq cop_nts__os l6_3_1 out_col) + then have "A B TS C J'" + by (meson \A B TS C J\ \B Out J J'\ l6_6 l9_2 not_col_distincts out_two_sides_two_sides) + then have "False" + by (simp add: P1C) + } + { + assume "C B TS J J'" + then have "\ Col C A B \ \ Col J A B" + using TS_def \A B TS C J\ by blast + then have "False" + by (metis P1B P1D TS_def \C B TS J J'\ assms(2) cop_nts__os invert_two_sides l9_8_1 ncoplanar_perm_12 not_col_permutation_1) + } + then have "False" + using \B Out J J' \ False\ \B Out J J' \ C B TS J J'\ by blast + } + thus ?thesis by auto +qed + +lemma conga_sams_nos__nts: + assumes "SAMS A B C D E F" and + "C B J CongA D E F" and + "\ B C OS A J" + shows "\ A B TS C J" +proof - + have "SAMS A B C C B J" + proof - + have "A B C CongA A B C" + using assms(1) conga_refl sams_distincts by fastforce + moreover have "D E F CongA C B J" + using assms(2) not_conga_sym by blast + ultimately show ?thesis + using assms(1) conga2_sams__sams by auto + qed 
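+ (* with SAMS A B C C B J obtained via conga2_sams__sams, sams_nos__nts together with assms(3) closes the goal *)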
+ thus ?thesis + by (simp add: assms(3) sams_nos__nts) +qed + +lemma sams_lea2_suma2__conga123: + assumes "A B C LeA A' B' C'" and + "D E F LeA D' E' F'" and + "SAMS A' B' C' D' E' F'" and + "A B C D E F SumA G H I" and + "A' B' C' D' E' F' SumA G H I" + shows "A B C CongA A' B' C'" +proof - + have "SAMS A B C D E F" + using assms(1) assms(2) assms(3) sams_lea2__sams by blast + moreover have "SAMS A' B' C' D E F" + by (metis assms(2) assms(3) lea_refl sams_distincts sams_lea2__sams) + moreover have "A' B' C' D E F SumA G H I" + proof - + obtain G' H' I' where P1: "A' B' C' D E F SumA G' H' I'" + using calculation(2) ex_suma sams_distincts by blast + show ?thesis + proof - + have "A' \ B' \ B' \ C'" + using assms(1) lea_distincts by blast + then have "A' B' C' CongA A' B' C'" + using conga_refl by auto + moreover + have "D \ E \ E \ F" + using \SAMS A B C D E F\ sams_distincts by blast + then have "D E F CongA D E F" + using conga_refl by auto + moreover have "G' H' I' CongA G H I" + proof - + have "G' H' I' LeA G H I" + using P1 assms(2) assms(3) assms(5) sams_lea456_suma2__lea by blast + moreover have "G H I LeA G' H' I'" + proof - + have "SAMS A' B' C' D E F" + using \SAMS A' B' C' D E F\ by auto + thus ?thesis + using P1 assms(1) assms(4) sams_lea123_suma2__lea by blast + qed + ultimately show ?thesis + by (simp add: lea_asym) + qed + ultimately show ?thesis + using P1 conga3_suma__suma by blast + qed + qed + ultimately show ?thesis + using assms(4) sams2_suma2__conga123 by blast +qed + +lemma sams_lea2_suma2__conga456: + assumes "A B C LeA A' B' C'" and + "D E F LeA D' E' F'" and + "SAMS A' B' C' D' E' F'" and + "A B C D E F SumA G H I" and + "A' B' C' D' E' F' SumA G H I" + shows "D E F CongA D' E' F'" +proof - + have "SAMS D' E' F' A' B' C'" + by (simp add: assms(3) sams_sym) + moreover have "D E F A B C SumA G H I" + by (simp add: assms(4) suma_sym) + moreover have "D' E' F' A' B' C' SumA G H I" + by (simp add: assms(5) suma_sym) + ultimately show ?thesis + using assms(1) assms(2) sams_lea2_suma2__conga123 by auto +qed + +lemma sams_suma__out213: + assumes "A B C D E F SumA D E F" and + "SAMS A B C D E F" + shows "B Out A C" +proof - + have "E \ D" + using assms(2) sams_distincts by blast + then have "E Out D D" + using out_trivial by auto + moreover have "D E D CongA A B C" + proof - + have "D E D LeA A B C" + using assms(1) lea121345 suma_distincts by auto + moreover + have "E \ D \ E \ F" + using assms(2) sams_distincts by blast + then have "D E F LeA D E F" + using lea_refl by auto + moreover have "D E D D E F SumA D E F" + proof - + have "\ E D OS D F" + using os_distincts by auto + moreover have "Coplanar D E D F" + using ncop_distincts by auto + ultimately show ?thesis + using SumA_def \D E F LeA D E F\ lea_asym by blast + qed + ultimately show ?thesis + using assms(1) assms(2) sams_lea2_suma2__conga123 by auto + qed + ultimately show ?thesis + using eq_conga_out by blast +qed + +lemma sams_suma__out546: + assumes "A B C D E F SumA A B C" and + "SAMS A B C D E F" + shows "E Out D F" +proof - + have "D E F A B C SumA A B C" + using assms(1) suma_sym by blast + moreover have "SAMS D E F A B C" + using assms(2) sams_sym by blast + ultimately show ?thesis + using sams_suma__out213 by blast +qed + +lemma sams_lea_lta123_suma2__lta: + assumes "A B C LtA A' B' C'" and + "D E F LeA D' E' F'" and + "SAMS A' B' C' D' E' F'" and + "A B C D E F SumA G H I" and + "A' B' C' D' E' F' SumA G' H' I'" + shows "G H I LtA G' H' I'" +proof - + have "G H I LeA G' H' I'" + proof - + have "A B C LeA 
A' B' C'" + by (simp add: assms(1) lta__lea) + thus ?thesis + using assms(2) assms(3) assms(4) assms(5) sams_lea2_suma2__lea by blast + qed + moreover have "\ G H I CongA G' H' I'" + proof - + { + assume "G H I CongA G' H' I'" + have "A B C CongA A' B' C'" + proof - + have "A B C LeA A' B' C'" + by (simp add: assms(1) lta__lea) + moreover have "A' B' C' D' E' F' SumA G H I" + by (meson \G H I CongA G' H' I'\ assms(3) assms(5) conga3_suma__suma conga_sym sams2_suma2__conga123 sams2_suma2__conga456) + ultimately show ?thesis + using assms(2) assms(3) assms(4) sams_lea2_suma2__conga123 by blast + qed + then have "False" + using assms(1) lta_not_conga by auto + } + thus ?thesis + by auto + qed + ultimately show ?thesis + using LtA_def by blast +qed + +lemma sams_lea_lta456_suma2__lta: + assumes "A B C LeA A' B' C'" and + "D E F LtA D' E' F'" and + "SAMS A' B' C' D' E' F'" and + "A B C D E F SumA G H I" and + "A' B' C' D' E' F' SumA G' H' I'" + shows "G H I LtA G' H' I'" + using sams_lea_lta123_suma2__lta + by (meson assms(1) assms(2) assms(3) assms(4) assms(5) sams_sym suma_sym) + +lemma sams_lta2_suma2__lta: + assumes "A B C LtA A' B' C'" and + "D E F LtA D' E' F'" and + "SAMS A' B' C' D' E' F'" and + "A B C D E F SumA G H I" and + "A' B' C' D' E' F' SumA G' H' I'" + shows "G H I LtA G' H' I'" + using sams_lea_lta123_suma2__lta + by (meson LtA_def assms(1) assms(2) assms(3) assms(4) assms(5)) + +lemma sams_lea2_suma2__lea123: + assumes "D' E' F' LeA D E F" and + "G H I LeA G' H' I'" and + "SAMS A B C D E F" and + "A B C D E F SumA G H I" and + "A' B' C' D' E' F' SumA G' H' I'" + shows "A B C LeA A' B' C'" +proof cases + assume "A' B' C' LtA A B C" + then have "G' H' I' LtA G H I" using sams_lea_lta123_suma2__lta + using assms(1) assms(3) assms(4) assms(5) by blast + then have "\ G H I LeA G' H' I'" + using lea__nlta by blast + then have "False" + using assms(2) by auto + thus ?thesis by auto +next + assume "\ A' B' C' LtA A B C" + have "A' \ B' \ B' \ C' \ A \ B \ B \ C" + using assms(4) assms(5) suma_distincts by auto + thus ?thesis + by (simp add: \\ A' B' C' LtA A B C\ nlta__lea) +qed + +lemma sams_lea2_suma2__lea456: + assumes "A' B' C' LeA A B C" and + "G H I LeA G' H' I'" and + "SAMS A B C D E F" and + "A B C D E F SumA G H I" and + "A' B' C' D' E' F' SumA G' H' I'" + shows "D E F LeA D' E' F'" +proof - + have "SAMS D E F A B C" + by (simp add: assms(3) sams_sym) + moreover have "D E F A B C SumA G H I" + by (simp add: assms(4) suma_sym) + moreover have "D' E' F' A' B' C' SumA G' H' I'" + by (simp add: assms(5) suma_sym) + ultimately show ?thesis + using assms(1) assms(2) sams_lea2_suma2__lea123 by blast +qed + +lemma sams_lea_lta456_suma2__lta123: + assumes "D' E' F' LtA D E F" and + "G H I LeA G' H' I'" and + "SAMS A B C D E F" and + "A B C D E F SumA G H I" and + "A' B' C' D' E' F' SumA G' H' I'" + shows "A B C LtA A' B' C'" +proof cases + assume "A' B' C' LeA A B C" + then have "G' H' I' LtA G H I" + using sams_lea_lta456_suma2__lta assms(1) assms(3) assms(4) assms(5) by blast + then have "\ G H I LeA G' H' I'" + using lea__nlta by blast + then have "False" + using assms(2) by blast + thus ?thesis by blast +next + assume "\ A' B' C' LeA A B C" + have "A' \ B' \ B' \ C' \ A \ B \ B \ C" + using assms(4) assms(5) suma_distincts by auto + thus ?thesis using nlea__lta + by (simp add: \\ A' B' C' LeA A B C\) +qed + +lemma sams_lea_lta123_suma2__lta456: + assumes "A' B' C' LtA A B C" and + "G H I LeA G' H' I'" and + "SAMS A B C D E F" and + "A B C D E F SumA G H I" and + "A' B' C' D' E' 
F' SumA G' H' I'" + shows "D E F LtA D' E' F'" +proof - + have "SAMS D E F A B C" + by (simp add: assms(3) sams_sym) + moreover have "D E F A B C SumA G H I" + by (simp add: assms(4) suma_sym) + moreover have "D' E' F' A' B' C' SumA G' H' I'" + by (simp add: assms(5) suma_sym) + ultimately show ?thesis + using assms(1) assms(2) sams_lea_lta456_suma2__lta123 by blast +qed + +lemma sams_lea_lta789_suma2__lta123: + assumes "D' E' F' LeA D E F" and + "G H I LtA G' H' I'" and + "SAMS A B C D E F" and + "A B C D E F SumA G H I" and + "A' B' C' D' E' F' SumA G' H' I'" + shows "A B C LtA A' B' C'" +proof cases + assume "A' B' C' LeA A B C" + then have "G' H' I' LeA G H I" + using assms(1) assms(3) assms(4) assms(5) sams_lea2_suma2__lea by blast + then have "\ G H I LtA G' H' I'" + by (simp add: lea__nlta) + then have "False" + using assms(2) by blast + thus ?thesis by auto +next + assume "\ A' B' C' LeA A B C" + have "A' \ B' \ B' \ C' \ A \ B \ B \ C" + using assms(4) assms(5) suma_distincts by auto + thus ?thesis + using nlea__lta by (simp add: \\ A' B' C' LeA A B C\) +qed + +lemma sams_lea_lta789_suma2__lta456: + assumes "A' B' C' LeA A B C" and + "G H I LtA G' H' I'" and + "SAMS A B C D E F" and + "A B C D E F SumA G H I" and + "A' B' C' D' E' F' SumA G' H' I'" + shows "D E F LtA D' E' F'" +proof - + have "SAMS D E F A B C" + by (simp add: assms(3) sams_sym) + moreover have "D E F A B C SumA G H I" + by (simp add: assms(4) suma_sym) + moreover have "D' E' F' A' B' C' SumA G' H' I'" + using assms(5) suma_sym by blast + ultimately show ?thesis + using assms(1) assms(2) sams_lea_lta789_suma2__lta123 by blast +qed + +lemma sams_lta2_suma2__lta123: + assumes "D' E' F' LtA D E F" and + "G H I LtA G' H' I'" and + "SAMS A B C D E F" and + "A B C D E F SumA G H I" and + "A' B' C' D' E' F' SumA G' H' I'" + shows "A B C LtA A' B' C'" +proof - + have "D' E' F' LeA D E F" + by (simp add: assms(1) lta__lea) + thus ?thesis + using assms(2) assms(3) assms(4) assms(5) sams_lea_lta789_suma2__lta123 by blast +qed + +lemma sams_lta2_suma2__lta456: + assumes "A' B' C' LtA A B C" and + "G H I LtA G' H' I'" and + "SAMS A B C D E F" and + "A B C D E F SumA G H I" and + "A' B' C' D' E' F' SumA G' H' I'" + shows "D E F LtA D' E' F'" +proof - + have "A' B' C' LeA A B C" + by (simp add: assms(1) lta__lea) + thus ?thesis + using assms(2) assms(3) assms(4) assms(5) sams_lea_lta789_suma2__lta456 by blast +qed + +lemma sams123231: + assumes "A \ B" and + "A \ C" and + "B \ C" + shows "SAMS A B C B C A" +proof - + obtain A' where "B Midpoint A A'" + using symmetric_point_construction by auto + then have "A' \ B" + using assms(1) midpoint_not_midpoint by blast + moreover have "Bet A B A'" + by (simp add: \B Midpoint A A'\ midpoint_bet) + moreover have "B C A LeA C B A'" + proof cases + assume "Col A B C" + show ?thesis + proof cases + assume "Bet A C B" + thus ?thesis + by (metis assms(2) assms(3) between_exchange3 calculation(1) calculation(2) l11_31_2) + next + assume "\ Bet A C B" + then have "C Out B A" + using Col_cases \Col A B C\ l6_6 or_bet_out by blast + thus ?thesis + using assms(3) calculation(1) l11_31_1 by auto + qed + next + assume "\ Col A B C" + thus ?thesis + using l11_41_aux \B Midpoint A A'\ calculation(1) lta__lea midpoint_bet not_col_permutation_4 by blast + qed + ultimately show ?thesis + using assms(1) sams_chara by blast +qed + +lemma col_suma__col: + assumes "Col D E F" and + "A B C B C A SumA D E F" + shows "Col A B C" +proof - + { + assume "\ Col A B C" + have "False" + proof cases + assume "Bet D E 
F" + obtain P where P1: "Bet A B P \ Cong A B B P" + using Cong_perm segment_construction by blast + have "D E F LtA A B P" + proof - + have "A B C LeA A B C" + using \\ Col A B C\ lea_total not_col_distincts by blast + moreover + have "B C TS A P" + by (metis Cong_perm P1 \\ Col A B C\ bet__ts bet_col between_trivial2 cong_reverse_identity not_col_permutation_1) + then have "B C A LtA C B P" + using Col_perm P1 \\ Col A B C\ calculation l11_41_aux ts_distincts by blast + moreover have "A B C C B P SumA A B P" + by (simp add: \B C TS A P\ ts__suma_1) + ultimately show ?thesis + by (meson P1 Tarski_neutral_dimensionless.sams_lea_lta456_suma2__lta Tarski_neutral_dimensionless_axioms assms(2) bet_suma__sams) + qed + thus ?thesis + using P1 \Bet D E F\ bet2_lta__lta lta_distincts by blast + next + assume "\ Bet D E F" + have "C Out B A" + proof - + have "E Out D F" + by (simp add: \\ Bet D E F\ assms(1) l6_4_2) + moreover have "B C A LeA D E F" + using sams_suma__lea456789 + by (metis assms(2) sams123231 suma_distincts) + ultimately show ?thesis + using out_lea__out by blast + qed + thus ?thesis + using Col_cases \\ Col A B C\ out_col by blast + qed + } + thus ?thesis by auto +qed + +lemma ncol_suma__ncol: + assumes "\ Col A B C" and + "A B C B C A SumA D E F" + shows "\ Col D E F" + using col_suma__col assms(1) assms(2) by blast + +lemma per2_suma__bet: + assumes "Per A B C" and + "Per D E F" and + "A B C D E F SumA G H I" + shows "Bet G H I" +proof - + obtain A1 where P1: "C B A1 CongA D E F \ \ B C OS A A1 \ Coplanar A B C A1 \ A B A1 CongA G H I" + using SumA_def assms(3) by blast + then have "D E F CongA A1 B C" + using conga_right_comm conga_sym by blast + then have "Per A1 B C" + using assms(2) l11_17 by blast + have "Bet A B A1" + proof - + have "Col B A A1" + proof - + have "Coplanar C A A1 B" + using P1 ncoplanar_perm_10 by blast + moreover have "C \ B" + using \D E F CongA A1 B C\ conga_distinct by auto + ultimately show ?thesis + using assms(1) \Per A1 B C\ col_permutation_2 cop_per2__col by blast + qed + moreover have "B C TS A A1" + proof - + have "Coplanar B C A A1" + using calculation ncop__ncols by blast + moreover + have "A \ B \ B \ C" + using CongA_def P1 by blast + then have "\ Col A B C" + by (simp add: assms(1) per_not_col) + moreover + have "A1 \ B \ B \ C" + using \D E F CongA A1 B C\ conga_distinct by blast + then have "\ Col A1 B C" + using \Per A1 B C\ by (simp add: per_not_col) + ultimately show ?thesis + by (simp add: P1 cop_nos__ts) + qed + ultimately show ?thesis + using col_two_sides_bet by blast + qed + thus ?thesis + using P1 bet_conga__bet by blast +qed + +lemma bet_per2__suma: + assumes "A \ B" and + "B \ C" and + "D \ E" and + "E \ F" and + "G \ H" and + "H \ I" and + "Per A B C" and + "Per D E F" and + "Bet G H I" + shows "A B C D E F SumA G H I" +proof - + obtain G' H' I' where "A B C D E F SumA G' H' I'" + using assms(1) assms(2) assms(3) assms(4) ex_suma by blast + moreover have "A B C CongA A B C" + using assms(1) assms(2) conga_refl by auto + moreover have "D E F CongA D E F" + using assms(3) assms(4) conga_refl by auto + moreover have "G' H' I' CongA G H I" + proof - + have "G' \ H'" + using calculation(1) suma_distincts by auto + moreover have "H' \ I'" + using \A B C D E F SumA G' H' I'\ suma_distincts by blast + moreover have "Bet G' H' I'" + using \A B C D E F SumA G' H' I'\ assms(7) assms(8) per2_suma__bet by auto + ultimately show ?thesis + using conga_line by (simp add: assms(5) assms(6) assms(9)) + qed + ultimately show ?thesis + using 
conga3_suma__suma by blast +qed + +lemma per2__sams: + assumes "A \ B" and + "B \ C" and + "D \ E" and + "E \ F" and + "Per A B C" and + "Per D E F" + shows "SAMS A B C D E F" +proof - + obtain G H I where "A B C D E F SumA G H I" + using assms(1) assms(2) assms(3) assms(4) ex_suma by blast + moreover then have "Bet G H I" + using assms(5) assms(6) per2_suma__bet by auto + ultimately show ?thesis + using bet_suma__sams by blast +qed + +lemma bet_per_suma__per456: + assumes "Per A B C" and + "Bet G H I" and + "A B C D E F SumA G H I" + shows "Per D E F" +proof - + obtain A1 where "B Midpoint A A1" + using symmetric_point_construction by auto + have "\ Col A B C" + using assms(1) assms(3) per_col_eq suma_distincts by blast + have "A B C CongA D E F" + proof - + have "SAMS A B C A B C" + using \\ Col A B C\ assms(1) not_col_distincts per2__sams by auto + moreover have "SAMS A B C D E F" + using assms(2) assms(3) bet_suma__sams by blast + moreover have "A B C A B C SumA G H I" + using assms(1) assms(2) assms(3) bet_per2__suma suma_distincts by blast + ultimately show ?thesis + using assms(3) sams2_suma2__conga456 by auto + qed + thus ?thesis + using assms(1) l11_17 by blast +qed + +lemma bet_per_suma__per123: + assumes "Per D E F" and + "Bet G H I" and + "A B C D E F SumA G H I" + shows "Per A B C" + using bet_per_suma__per456 + by (meson assms(1) assms(2) assms(3) suma_sym) + +lemma bet_suma__per: + assumes "Bet D E F" and + "A B C A B C SumA D E F" + shows "Per A B C" +proof - + obtain A' where "C B A' CongA A B C \ A B A' CongA D E F" + using SumA_def assms(2) by blast + have "Per C B A" + proof - + have "Bet A B A'" + proof - + have "D E F CongA A B A'" + using \C B A' CongA A B C \ A B A' CongA D E F\ not_conga_sym by blast + thus ?thesis + using assms(1) bet_conga__bet by blast + qed + moreover have "C B A CongA C B A'" + using conga_left_comm not_conga_sym \C B A' CongA A B C \ A B A' CongA D E F\ by blast + ultimately show ?thesis + using l11_18_2 by auto + qed + thus ?thesis + using Per_cases by auto +qed + +lemma acute__sams: + assumes "Acute A B C" + shows "SAMS A B C A B C" +proof - + obtain A' where "B Midpoint A A'" + using symmetric_point_construction by auto + show ?thesis + proof - + have "A \ B \ A' \ B" + using \B Midpoint A A'\ acute_distincts assms is_midpoint_id_2 by blast + moreover have "Bet A B A'" + by (simp add: \B Midpoint A A'\ midpoint_bet) + moreover + have "Obtuse C B A'" + using acute_bet__obtuse assms calculation(1) calculation(2) obtuse_sym by blast + then have "A B C LeA C B A'" + by (metis acute__not_obtuse assms calculation(1) lea_obtuse_obtuse lea_total obtuse_distincts) + ultimately show ?thesis + using sams_chara by blast + qed +qed + +lemma acute_suma__nbet: + assumes "Acute A B C" and + "A B C A B C SumA D E F" + shows "\ Bet D E F" +proof - + { + assume "Bet D E F" + then have "Per A B C" + using assms(2) bet_suma__per by auto + then have "A B C LtA A B C" + using acute_not_per assms(1) by auto + then have "False" + by (simp add: nlta) + } + thus ?thesis by blast +qed + +lemma acute2__sams: + assumes "Acute A B C" and + "Acute D E F" + shows "SAMS A B C D E F" + by (metis acute__sams acute_distincts assms(1) assms(2) lea_total sams_lea2__sams) + +lemma acute2_suma__nbet_a: + assumes "Acute A B C" and + "D E F LeA A B C" and + "A B C D E F SumA G H I" + shows "\ Bet G H I" +proof - + { + assume "Bet G H I" + obtain A' B' C' where "A B C A B C SumA A' B' C'" + using acute_distincts assms(1) ex_suma by moura + have "G H I LeA A' B' C'" + proof - + have 
"A B C LeA A B C" + using acute_distincts assms(1) lea_refl by blast + moreover have "SAMS A B C A B C" + by (simp add: acute__sams assms(1)) + ultimately show ?thesis + using \A B C A B C SumA A' B' C'\ assms(1) assms(2) assms(3) sams_lea456_suma2__lea by blast + qed + then have "Bet A' B' C'" + using \Bet G H I\ bet_lea__bet by blast + then have "False" + using acute_suma__nbet assms(1) assms(3) \A B C A B C SumA A' B' C'\ by blast + } + thus ?thesis by blast +qed + +lemma acute2_suma__nbet: + assumes "Acute A B C" and + "Acute D E F" and + "A B C D E F SumA G H I" + shows "\ Bet G H I" +proof - + have "A \ B \ B \ C \ D \ E \ E \ F" + using assms(3) suma_distincts by auto + then have "A B C LeA D E F \ D E F LeA A B C" + by (simp add: lea_total) + moreover + { + assume P3: "A B C LeA D E F" + have "D E F A B C SumA G H I" + by (simp add: assms(3) suma_sym) + then have "\ Bet G H I" + using P3 assms(2) acute2_suma__nbet_a by auto + } + { + assume "D E F LeA A B C" + then have "\ Bet G H I" + using acute2_suma__nbet_a assms(1) assms(3) by auto + } + thus ?thesis + using \A B C LeA D E F \ \ Bet G H I\ calculation by blast +qed + +lemma acute_per__sams: + assumes "A \ B" and + "B \ C" and + "Per A B C" and + "Acute D E F" + shows "SAMS A B C D E F" +proof - + have "SAMS A B C A B C" + by (simp add: assms(1) assms(2) assms(3) per2__sams) + moreover have "A B C LeA A B C" + using assms(1) assms(2) lea_refl by auto + moreover have "D E F LeA A B C" + by (metis acute_distincts acute_lea_acute acute_not_per assms(1) assms(2) assms(3) assms(4) lea_total) + ultimately show ?thesis + using sams_lea2__sams by blast +qed + +lemma acute_per_suma__nbet: + assumes "A \ B" and + "B \ C" and + "Per A B C" and + "Acute D E F" and + "A B C D E F SumA G H I" + shows "\ Bet G H I" +proof - + { + assume "Bet G H I" + have "G H I LtA G H I" + proof - + have "A B C LeA A B C" + using assms(1) assms(2) lea_refl by auto + moreover have "D E F LtA A B C" + by (simp add: acute_per__lta assms(1) assms(2) assms(3) assms(4)) + moreover have "SAMS A B C A B C" + by (simp add: assms(1) assms(2) assms(3) per2__sams) + moreover have "A B C D E F SumA G H I" + by (simp add: assms(5)) + moreover have "A B C A B C SumA G H I" + by (meson Tarski_neutral_dimensionless.bet_per_suma__per456 Tarski_neutral_dimensionless_axioms \Bet G H I\ acute_not_per assms(3) assms(4) calculation(4)) + ultimately show ?thesis + using sams_lea_lta456_suma2__lta by blast + qed + then have "False" + by (simp add: nlta) + } + thus ?thesis by blast +qed + +lemma obtuse__nsams: + assumes "Obtuse A B C" + shows "\ SAMS A B C A B C" +proof - + { + assume "SAMS A B C A B C" + obtain A' where "B Midpoint A A'" + using symmetric_point_construction by auto + have "A B C LtA A B C" + proof - + have "A B C LeA A' B C" + by (metis \B Midpoint A A'\ \SAMS A B C A B C\ lea_right_comm midpoint_bet midpoint_distinct_2 sams_chara sams_distincts) + moreover have "A' B C LtA A B C" + using \B Midpoint A A'\ assms calculation lea_distincts midpoint_bet obtuse_chara by blast + ultimately show ?thesis + using lea__nlta by blast + qed + then have "False" + by (simp add: nlta) + } + thus ?thesis by blast +qed + +lemma nbet_sams_suma__acute: + assumes "\ Bet D E F" and + "SAMS A B C A B C" and + "A B C A B C SumA D E F" + shows "Acute A B C" +proof - + have "Acute A B C \ Per A B C \ Obtuse A B C" + by (metis angle_partition l8_20_1_R1 l8_5) + { + assume "Per A B C" + then have "Bet D E F" + using assms(3) per2_suma__bet by auto + then have "False" + using assms(1) by 
auto + } + { + assume "Obtuse A B C" + then have "\ SAMS A B C A B C" + by (simp add: obtuse__nsams) + then have "False" + using assms(2) by auto + } + thus ?thesis + using \Acute A B C \ Per A B C \ Obtuse A B C\ \Per A B C \ False\ by auto +qed + +lemma nsams__obtuse: + assumes "A \ B" and + "B \ C" and + "\ SAMS A B C A B C" + shows "Obtuse A B C" +proof - + { + assume "Per A B C" + obtain A' where "B Midpoint A A'" + using symmetric_point_construction by blast + have "\ Col A B C" + using \Per A B C\ assms(1) assms(2) per_col_eq by blast + have "SAMS A B C A B C" + proof - + have "C B A' CongA A B C" + using \Per A B C\ assms(1) assms(2) assms(3) per2__sams by blast + moreover have "\ B C OS A A'" + by (meson Col_cases \B Midpoint A A'\ col_one_side_out l6_4_1 midpoint_bet midpoint_col) + moreover have "\ A B TS C A'" + using Col_def Midpoint_def TS_def \B Midpoint A A'\ by blast + moreover have "Coplanar A B C A'" + using \Per A B C\ assms(1) assms(2) assms(3) per2__sams by blast + ultimately show ?thesis + using SAMS_def \\ Col A B C\ assms(1) bet_col by auto + qed + then have "False" + using assms(3) by auto + } + { + assume "Acute A B C" + then have "SAMS A B C A B C" + by (simp add: acute__sams) + then have "False" + using assms(3) by auto + } + thus ?thesis + using \Per A B C \ False\ angle_partition assms(1) assms(2) by auto +qed + +lemma sams2_suma2__conga: + assumes "SAMS A B C A B C" and + "A B C A B C SumA D E F" and + "SAMS A' B' C' A' B' C'" and + "A' B' C' A' B' C' SumA D E F" + shows "A B C CongA A' B' C'" +proof - + have "A B C LeA A' B' C' \ A' B' C' LeA A B C" + using assms(1) assms(3) lea_total sams_distincts by auto + moreover + have "A B C LeA A' B' C' \ A B C CongA A' B' C'" + using assms(2) assms(3) assms(4) sams_lea2_suma2__conga123 by auto + ultimately show ?thesis + by (meson Tarski_neutral_dimensionless.conga_sym Tarski_neutral_dimensionless.sams_lea2_suma2__conga123 Tarski_neutral_dimensionless_axioms assms(1) assms(2) assms(4)) +qed + +lemma acute2_suma2__conga: + assumes "Acute A B C" and + "A B C A B C SumA D E F" and + "Acute A' B' C'" and + "A' B' C' A' B' C' SumA D E F" + shows "A B C CongA A' B' C'" +proof - + have "SAMS A B C A B C" + by (simp add: acute__sams assms(1)) + moreover have "SAMS A' B' C' A' B' C'" + by (simp add: acute__sams assms(3)) + ultimately show ?thesis + using assms(2) assms(4) sams2_suma2__conga by auto +qed + +lemma bet2_suma__out: + assumes "Bet A B C" and + "Bet D E F" and + "A B C D E F SumA G H I" + shows "H Out G I" +proof - + have "A B C D E F SumA A B A" + proof - + have "C B A CongA D E F" + by (metis Bet_cases assms(1) assms(2) assms(3) conga_line suma_distincts) + moreover have "\ B C OS A A" + by (simp add: Col_def assms(1) col124__nos) + moreover have "Coplanar A B C A" + using ncop_distincts by blast + moreover have "A B A CongA A B A" + using calculation(1) conga_diff2 conga_trivial_1 by auto + ultimately show ?thesis + using SumA_def by blast + qed + then have "A B A CongA G H I" + using assms(3) suma2__conga by auto + thus ?thesis + using eq_conga_out by auto +qed + +lemma col2_suma__col: + assumes "Col A B C" and + "Col D E F" and + "A B C D E F SumA G H I" + shows "Col G H I" +proof cases + assume "Bet A B C" + show ?thesis + proof cases + assume "Bet D E F" + thus ?thesis using bet2_suma__out + by (meson \Bet A B C\ assms(3) not_col_permutation_4 out_col) + next + assume "\ Bet D E F" + show ?thesis + proof - + have "E Out D F" + using \\ Bet D E F\ assms(2) or_bet_out by auto + then have "A B C CongA G H I" 
+ using assms(3) out546_suma__conga by auto + thus ?thesis + using assms(1) col_conga_col by blast + qed + qed +next + assume "\ Bet A B C" + have "D E F CongA G H I" + proof - + have "B Out A C" + by (simp add: \\ Bet A B C\ assms(1) l6_4_2) + thus ?thesis + using assms(3) out213_suma__conga by auto + qed + thus ?thesis + using assms(2) col_conga_col by blast +qed + +lemma suma_suppa__bet: + assumes "A B C SuppA D E F" and + "A B C D E F SumA G H I" + shows "Bet G H I" +proof - + obtain A' where P1: "Bet A B A' \ D E F CongA C B A'" + using SuppA_def assms(1) by auto + obtain A'' where P2: "C B A'' CongA D E F \ \ B C OS A A'' \ Coplanar A B C A'' \ A B A'' CongA G H I" + using SumA_def assms(2) by auto + have "B Out A' A'' \ C B TS A' A''" + proof - + have "Coplanar C B A' A''" + proof - + have "Coplanar C A'' B A" + using P2 coplanar_perm_17 by blast + moreover have "B \ A" + using SuppA_def assms(1) by auto + moreover have "Col B A A'" using P1 + by (simp add: bet_col col_permutation_4) + ultimately show ?thesis + using col_cop__cop coplanar_perm_3 by blast + qed + moreover have "C B A' CongA C B A''" + proof - + have "C B A' CongA D E F" + using P1 not_conga_sym by blast + moreover have "D E F CongA C B A''" + using P2 not_conga_sym by blast + ultimately show ?thesis + using not_conga by blast + qed + ultimately show ?thesis + using conga_cop__or_out_ts by simp + qed + have "Bet A B A''" + proof - + have "\ C B TS A' A''" + proof - + { + assume "C B TS A' A''" + have "B C TS A A'" + proof - + { + assume "Col A B C" + then have "Col A' C B" + using P1 assms(1) bet_col bet_col1 col3 suppa_distincts by blast + then have "False" + using TS_def \C B TS A' A''\ by auto + } + then have "\ Col A B C" by auto + moreover have "\ Col A' B C" + using TS_def \C B TS A' A''\ not_col_permutation_5 by blast + moreover + have "\ T. 
(Col T B C \ Bet A T A')" + using P1 not_col_distincts by blast + ultimately show ?thesis + by (simp add: TS_def) + qed + then have "B C OS A A''" + using OS_def \C B TS A' A''\ invert_two_sides l9_2 by blast + then have "False" + using P2 by simp + } + thus ?thesis by blast + qed + then have "B Out A' A''" + using \B Out A' A'' \ C B TS A' A''\ by auto + moreover have "A' \ B \ A'' \ B \ A \ B" + using P2 calculation conga_diff1 out_diff1 out_diff2 by blast + moreover have "Bet A' B A" + using P1 Bet_perm by blast + ultimately show ?thesis + using bet_out__bet between_symmetry by blast + qed + moreover have "A B A'' CongA G H I" + using P2 by blast + ultimately show ?thesis + using bet_conga__bet by blast +qed + +lemma bet_suppa__suma: + assumes "G \ H" and + "H \ I" and + "A B C SuppA D E F" and + "Bet G H I" + shows "A B C D E F SumA G H I" +proof - + obtain G' H' I' where "A B C D E F SumA G' H' I'" + using assms(3) ex_suma suppa_distincts by blast + moreover have "A B C CongA A B C" + using calculation conga_refl suma_distincts by fastforce + moreover + have "D \ E \ E \ F" + using assms(3) suppa_distincts by auto + then have "D E F CongA D E F" + using conga_refl by auto + moreover + have "G' H' I' CongA G H I" + proof - + have "G' \ H' \ H' \ I'" + using calculation(1) suma_distincts by auto + moreover have "Bet G' H' I'" + using \A B C D E F SumA G' H' I'\ assms(3) suma_suppa__bet by blast + ultimately show ?thesis + using assms(1) assms(2) assms(4) conga_line by auto + qed + ultimately show ?thesis + using conga3_suma__suma by blast +qed + +lemma bet_suma__suppa: + assumes "A B C D E F SumA G H I" and + "Bet G H I" + shows "A B C SuppA D E F" +proof - + obtain A' where "C B A' CongA D E F \ A B A' CongA G H I" + using SumA_def assms(1) by blast + moreover + have "G H I CongA A B A'" + using calculation not_conga_sym by blast + then have "Bet A B A'" + using assms(2) bet_conga__bet by blast + moreover have "D E F CongA C B A'" + using calculation(1) not_conga_sym by blast + ultimately show ?thesis + by (metis SuppA_def conga_diff1) +qed + +lemma bet2_suma__suma: + assumes "A' \ B" and + "D' \ E" and + "Bet A B A'" and + "Bet D E D'" and + "A B C D E F SumA G H I" + shows "A' B C D' E F SumA G H I" +proof - + obtain J where P1: "C B J CongA D E F \ \ B C OS A J \ Coplanar A B C J \ A B J CongA G H I" + using SumA_def assms(5) by auto + moreover + obtain C' where P2: "Bet C B C' \ Cong B C' B C" + using segment_construction by blast + moreover + have "A B C' D' E F SumA G H I" + proof - + have "C' B J CongA D' E F" + by (metis assms(2) assms(4) calculation(1) calculation(2) cong_diff_3 conga_diff1 l11_13) + moreover have "\ B C' OS A J" + by (metis Col_cases P1 P2 bet_col col_one_side cong_diff) + moreover have "Coplanar A B C' J" + by (smt P1 P2 bet_col bet_col1 col2_cop__cop cong_diff ncoplanar_perm_5) + ultimately show ?thesis + using P1 SumA_def by blast + qed + moreover have "A B C' CongA A' B C" + using assms(1) assms(3) assms(5) between_symmetry calculation(2) calculation(3) l11_14 suma_distincts by auto + moreover + have "D' \ E \ E \ F" + using assms(2) calculation(1) conga_distinct by blast + then have "D' E F CongA D' E F" + using conga_refl by auto + moreover + have "G \ H \ H \ I" + using assms(5) suma_distincts by blast + then have "G H I CongA G H I" + using conga_refl by auto + ultimately show ?thesis + using conga3_suma__suma by blast +qed + +lemma suma_suppa2__suma: + assumes "A B C SuppA A' B' C'" and + "D E F SuppA D' E' F'" and + "A B C D E F SumA G H I" + shows 
"A' B' C' D' E' F' SumA G H I" +proof - + obtain A0 where P1: "Bet A B A0 \ A' B' C' CongA C B A0" + using SuppA_def assms(1) by auto + obtain D0 where P2: "Bet D E D0 \ D' E' F' CongA F E D0" + using SuppA_def assms(2) by auto + show ?thesis + proof - + have "A0 B C D0 E F SumA G H I" + proof - + have "A0 \ B" + using CongA_def P1 by auto + moreover have "D0 \ E" + using CongA_def P2 by blast + ultimately show ?thesis + using P1 P2 assms(3) bet2_suma__suma by auto + qed + moreover have "A0 B C CongA A' B' C'" + using P1 conga_left_comm not_conga_sym by blast + moreover have "D0 E F CongA D' E' F'" + using P2 conga_left_comm not_conga_sym by blast + moreover + have "G \ H \ H \ I" + using assms(3) suma_distincts by blast + then have "G H I CongA G H I" + using conga_refl by auto + ultimately show ?thesis + using conga3_suma__suma by blast + qed +qed + +lemma suma2_obtuse2__conga: + assumes "Obtuse A B C" and + "A B C A B C SumA D E F" and + "Obtuse A' B' C'" and + "A' B' C' A' B' C' SumA D E F" + shows "A B C CongA A' B' C'" +proof - + obtain A0 where P1: "Bet A B A0 \ Cong B A0 A B" + using segment_construction by blast + moreover + obtain A0' where P2: "Bet A' B' A0' \ Cong B' A0' A' B'" + using segment_construction by blast + moreover + have "A0 B C CongA A0' B' C'" + proof - + have "Acute A0 B C" + using assms(1) bet_obtuse__acute P1 cong_diff_3 obtuse_distincts by blast + moreover have "A0 B C A0 B C SumA D E F" + using P1 acute_distincts assms(2) bet2_suma__suma calculation by blast + moreover have "Acute A0' B' C'" + using P2 assms(3) bet_obtuse__acute cong_diff_3 obtuse_distincts by blast + moreover have "A0' B' C' A0' B' C' SumA D E F" + by (metis P2 assms(4) bet2_suma__suma cong_diff_3) + ultimately show ?thesis + using acute2_suma2__conga by blast + qed + moreover have "Bet A0 B A" + using Bet_perm calculation(1) by blast + moreover have "Bet A0' B' A'" + using Bet_perm calculation(2) by blast + moreover have "A \ B" + using assms(1) obtuse_distincts by blast + moreover have "A' \ B'" + using assms(3) obtuse_distincts by blast + ultimately show ?thesis + using l11_13 by blast +qed + +lemma bet_suma2__or_conga: + assumes "A0 \ B" and + "Bet A B A0" and + "A B C A B C SumA D E F" and + "A' B' C' A' B' C' SumA D E F" + shows "A B C CongA A' B' C' \ A0 B C CongA A' B' C'" +proof - + { + fix A' B' C' + assume"Acute A' B' C' \ A' B' C' A' B' C' SumA D E F" + have "Per A B C \ Acute A B C \ Obtuse A B C" + by (metis angle_partition l8_20_1_R1 l8_5) + { + assume "Per A B C" + then have "Bet D E F" + using per2_suma__bet assms(3) by auto + then have "False" + using \Acute A' B' C' \ A' B' C' A' B' C' SumA D E F\ acute_suma__nbet by blast + then have "A B C CongA A' B' C' \ A0 B C CongA A' B' C'" by blast + } + { + assume "Acute A B C" + have "Acute A' B' C'" + by (simp add: \Acute A' B' C' \ A' B' C' A' B' C' SumA D E F\) + moreover have "A' B' C' A' B' C' SumA D E F" + by (simp add: \Acute A' B' C' \ A' B' C' A' B' C' SumA D E F\) + ultimately + have "A B C CongA A' B' C' \ A0 B C CongA A' B' C'" + using assms(3) \Acute A B C\ acute2_suma2__conga by auto + } + { + assume "Obtuse A B C" + have "Acute A0 B C" + using \Obtuse A B C\ assms(1) assms(2) bet_obtuse__acute by auto + moreover have "A0 B C A0 B C SumA D E F" + using assms(1) assms(2) assms(3) bet2_suma__suma by auto + ultimately have "A0 B C CongA A' B' C'" + using \Acute A' B' C' \ A' B' C' A' B' C' SumA D E F\ acute2_suma2__conga by auto + then have "A B C CongA A' B' C' \ A0 B C CongA A' B' C'" by blast + } + then have "A B C 
CongA A' B' C' \ A0 B C CongA A' B' C'" + using \Acute A B C \ A B C CongA A' B' C' \ A0 B C CongA A' B' C'\ \Per A B C \ A B C CongA A' B' C' \ A0 B C CongA A' B' C'\ \Per A B C \ Acute A B C \ Obtuse A B C\ by blast + } + then have P1: "\ A' B' C'. (Acute A' B' C' \ A' B' C' A' B' C' SumA D E F) \ (A B C CongA A' B' C' \ A0 B C CongA A' B' C')" by blast + have "Per A' B' C' \ Acute A' B' C' \ Obtuse A' B' C'" + by (metis angle_partition l8_20_1_R1 l8_5) + { + assume P2: "Per A' B' C'" + have "A B C CongA A' B' C'" + proof - + have "A \ B \ B \ C" + using assms(3) suma_distincts by blast + moreover have "A' \ B' \ B' \ C'" + using assms(4) suma_distincts by auto + moreover have "Per A B C" + proof - + have "Bet D E F" + using P2 assms(4) per2_suma__bet by blast + moreover have "A B C A B C SumA D E F" + by (simp add: assms(3)) + ultimately show ?thesis + using assms(3) bet_suma__per by blast + qed + ultimately show ?thesis + using P2 l11_16 by blast + qed + then have "A B C CongA A' B' C' \ A0 B C CongA A' B' C'" by blast + } + { + assume "Acute A' B' C'" + then have "A B C CongA A' B' C' \ A0 B C CongA A' B' C'" + using P1 assms(4) by blast + } + { + assume "Obtuse A' B' C'" + obtain A0' where "Bet A' B' A0' \ Cong B' A0' A' B'" + using segment_construction by blast + moreover + have "Acute A0' B' C'" + using \Obtuse A' B' C'\ bet_obtuse__acute calculation cong_diff_3 obtuse_distincts by blast + moreover have "A0' B' C' A0' B' C' SumA D E F" + using acute_distincts assms(4) bet2_suma__suma calculation(1) calculation(2) by blast + ultimately + have "A B C CongA A' B' C' \ A0 B C CongA A' B' C'" + using P1 by (metis assms(1) assms(2) assms(3) assms(4) between_symmetry l11_13 suma_distincts) + } + thus ?thesis + using \Acute A' B' C' \ A B C CongA A' B' C' \ A0 B C CongA A' B' C'\ \Per A' B' C' \ A B C CongA A' B' C' \ A0 B C CongA A' B' C'\ \Per A' B' C' \ Acute A' B' C' \ Obtuse A' B' C'\ by blast +qed + +lemma suma2__or_conga_suppa: + assumes "A B C A B C SumA D E F" and + "A' B' C' A' B' C' SumA D E F" + shows "A B C CongA A' B' C' \ A B C SuppA A' B' C'" +proof - + obtain A0 where P1: "Bet A B A0 \ Cong B A0 A B" + using segment_construction by blast + then have "A0 \ B" + using assms(1) bet_cong_eq suma_distincts by blast + then have "A B C CongA A' B' C' \ A0 B C CongA A' B' C'" + using assms(1) assms(2) P1 bet_suma2__or_conga by blast + thus ?thesis + by (metis P1 SuppA_def cong_diff conga_right_comm conga_sym) +qed + +lemma ex_trisuma: + assumes "A \ B" and + "B \ C" and + "A \ C" + shows "\ D E F. 
A B C TriSumA D E F" +proof - + obtain G H I where "A B C B C A SumA G H I" + using assms(1) assms(2) assms(3) ex_suma by presburger + moreover + then obtain D E F where "G H I C A B SumA D E F" + using ex_suma suma_distincts by presburger + ultimately show ?thesis + using TriSumA_def by blast +qed + +lemma trisuma_perm_231: + assumes "A B C TriSumA D E F" + shows "B C A TriSumA D E F" +proof - + obtain A1 B1 C1 where P1: "A B C B C A SumA A1 B1 C1 \ A1 B1 C1 C A B SumA D E F" + using TriSumA_def assms(1) by auto + obtain A2 B2 C2 where P2: "B C A C A B SumA B2 C2 A2" + using P1 ex_suma suma_distincts by fastforce + have "A B C B2 C2 A2 SumA D E F" + proof - + have "SAMS A B C B C A" + using assms sams123231 trisuma_distincts by auto + moreover have "SAMS B C A C A B" + using P2 sams123231 suma_distincts by fastforce + ultimately show ?thesis + using P1 P2 suma_assoc by blast + qed + thus ?thesis + using P2 TriSumA_def suma_sym by blast +qed + +lemma trisuma_perm_312: + assumes "A B C TriSumA D E F" + shows "C A B TriSumA D E F" + by (simp add: assms trisuma_perm_231) + +lemma trisuma_perm_321: + assumes "A B C TriSumA D E F" + shows "C B A TriSumA D E F" +proof - + obtain G H I where "A B C B C A SumA G H I \ G H I C A B SumA D E F" + using TriSumA_def assms(1) by auto + thus ?thesis + by (meson TriSumA_def suma_comm suma_right_comm suma_sym trisuma_perm_231) +qed + +lemma trisuma_perm_213: + assumes "A B C TriSumA D E F" + shows "B A C TriSumA D E F" + using assms trisuma_perm_231 trisuma_perm_321 by blast + +lemma trisuma_perm_132: + assumes "A B C TriSumA D E F" + shows "A C B TriSumA D E F" + using assms trisuma_perm_312 trisuma_perm_321 by blast + +lemma conga_trisuma__trisuma: + assumes "A B C TriSumA D E F" and + "D E F CongA D' E' F'" + shows "A B C TriSumA D' E' F'" +proof - + obtain G H I where P1: "A B C B C A SumA G H I \ G H I C A B SumA D E F" + using TriSumA_def assms(1) by auto + have "G H I C A B SumA D' E' F'" + proof - + have f1: "B \ A" + by (metis P1 suma_distincts) + have f2: "C \ A" + using P1 suma_distincts by blast + have "G H I CongA G H I" + by (metis (full_types) P1 conga_refl suma_distincts) + then show ?thesis + using f2 f1 by (meson P1 assms(2) conga3_suma__suma conga_refl) + qed + thus ?thesis using P1 TriSumA_def by blast +qed + +lemma trisuma2__conga: + assumes "A B C TriSumA D E F" and + "A B C TriSumA D' E' F'" + shows "D E F CongA D' E' F'" +proof - + obtain G H I where P1: "A B C B C A SumA G H I \ G H I C A B SumA D E F" + using TriSumA_def assms(1) by auto + then have P1A: "G H I C A B SumA D E F" by simp + obtain G' H' I' where P2: "A B C B C A SumA G' H' I' \ G' H' I' C A B SumA D' E' F'" + using TriSumA_def assms(2) by auto + have "G' H' I' C A B SumA D E F" + proof - + have "G H I CongA G' H' I'" using P1 P2 suma2__conga by blast + moreover have "D E F CongA D E F \ C A B CongA C A B" + by (metis assms(1) conga_refl trisuma_distincts) + ultimately show ?thesis + by (meson P1 conga3_suma__suma) + qed + thus ?thesis + using P2 suma2__conga by auto +qed + +lemma conga3_trisuma__trisuma: + assumes "A B C TriSumA D E F" and + "A B C CongA A' B' C'" and + "B C A CongA B' C' A'" and + "C A B CongA C' A' B'" + shows "A' B' C' TriSumA D E F" +proof - + obtain G H I where P1: "A B C B C A SumA G H I \ G H I C A B SumA D E F" + using TriSumA_def assms(1) by auto + thus ?thesis + proof - + have "A' B' C' B' C' A' SumA G H I" + using conga3_suma__suma P1 by (meson assms(2) assms(3) suma2__conga) + moreover have "G H I C' A' B' SumA D E F" + using 
conga3_suma__suma P1 by (meson P1 assms(4) suma2__conga) + ultimately show ?thesis + using TriSumA_def by blast + qed +qed + +lemma col_trisuma__bet: + assumes "Col A B C" and + "A B C TriSumA P Q R" + shows "Bet P Q R" +proof - + obtain D E F where P1: "A B C B C A SumA D E F \ D E F C A B SumA P Q R" + using TriSumA_def assms(2) by auto + { + assume "Bet A B C" + have "A B C CongA P Q R" + proof - + have "A B C CongA D E F" + proof - + have "C \ A \ C \ B" + using assms(2) trisuma_distincts by blast + then have "C Out B A" + using \ Bet A B C\ bet_out_1 by fastforce + thus ?thesis + using P1 out546_suma__conga by auto + qed + moreover have "D E F CongA P Q R" + proof - + have "A \ C \ A \ B" + using assms(2) trisuma_distincts by blast + then have "A Out C B" + using Out_def \Bet A B C\ by auto + thus ?thesis + using P1 out546_suma__conga by auto + qed + ultimately show ?thesis + using conga_trans by blast + qed + then have "Bet P Q R" + using \Bet A B C\ bet_conga__bet by blast + } + { + assume "Bet B C A" + have "B C A CongA P Q R" + proof - + have "B C A CongA D E F" + proof - + have "B \ A \ B \ C" + using assms(2) trisuma_distincts by blast + then have "B Out A C" + using Out_def \Bet B C A\ by auto + thus ?thesis + using P1 out213_suma__conga by blast + qed + moreover have "D E F CongA P Q R" + proof - + have "A \ C \ A \ B" + using assms(2) trisuma_distincts by auto + then have "A Out C B" + using \Bet B C A\ bet_out_1 by auto + thus ?thesis + using P1 out546_suma__conga by blast + qed + ultimately show ?thesis + using not_conga by blast + qed + then have "Bet P Q R" + using \Bet B C A\ bet_conga__bet by blast + } + { + assume "Bet C A B" + have "E Out D F" + proof - + have "C Out B A" + using \Bet C A B\ assms(2) bet_out l6_6 trisuma_distincts by blast + moreover have "B C A CongA D E F" + proof - + have "B \ A \ B \ C" + using P1 suma_distincts by blast + then have "B Out A C" + using \Bet C A B\ bet_out_1 by auto + thus ?thesis using out213_suma__conga P1 by blast + qed + ultimately show ?thesis + using l11_21_a by blast + qed + + then have "C A B CongA P Q R" + using P1 out213_suma__conga by blast + then have "Bet P Q R" + using \Bet C A B\ bet_conga__bet by blast + } + thus ?thesis + using Col_def \Bet A B C \ Bet P Q R\ \Bet B C A \ Bet P Q R\ assms(1) by blast +qed + +lemma suma_dec: + "A B C D E F SumA G H I \ \ A B C D E F SumA G H I" by simp + +lemma sams_dec: + "SAMS A B C D E F \ \ SAMS A B C D E F" by simp + +lemma trisuma_dec: + "A B C TriSumA P Q R \ \ A B C TriSumA P Q R" + by simp + +subsection "Parallelism" + +lemma par_reflexivity: + assumes "A \ B" + shows "A B Par A B" + using Par_def assms not_col_distincts by blast + +lemma par_strict_irreflexivity: + "\ A B ParStrict A B" + using ParStrict_def col_trivial_3 by blast + +lemma not_par_strict_id: + "\ A B ParStrict A C" + using ParStrict_def col_trivial_1 by blast + +lemma par_id: + assumes "A B Par A C" + shows "Col A B C" + using Col_cases Par_def assms not_par_strict_id by auto + +lemma par_strict_not_col_1: + assumes "A B ParStrict C D" + shows "\ Col A B C" + using Col_perm ParStrict_def assms col_trivial_1 by blast + +lemma par_strict_not_col_2: + assumes "A B ParStrict C D" + shows "\ Col B C D" + using ParStrict_def assms col_trivial_3 by auto + +lemma par_strict_not_col_3: + assumes "A B ParStrict C D" + shows "\ Col C D A" + using Col_perm ParStrict_def assms col_trivial_1 by blast + +lemma par_strict_not_col_4: + assumes "A B ParStrict C D" + shows "\ Col A B D" + using Col_perm ParStrict_def assms 
col_trivial_3 by blast + +lemma par_id_1: + assumes "A B Par A C" + shows "Col B A C" + using Par_def assms not_par_strict_id by auto + +lemma par_id_2: + assumes "A B Par A C" + shows "Col B C A" + using Col_perm assms par_id_1 by blast + +lemma par_id_3: + assumes "A B Par A C" + shows "Col A C B" + using Col_perm assms par_id_2 by blast + +lemma par_id_4: + assumes "A B Par A C" + shows "Col C B A" + using Col_perm assms par_id_2 by blast + +lemma par_id_5: + assumes "A B Par A C" + shows "Col C A B" + using assms col_permutation_2 par_id by blast + +lemma par_strict_symmetry: + assumes "A B ParStrict C D" + shows "C D ParStrict A B" + using ParStrict_def assms coplanar_perm_16 by blast + +lemma par_symmetry: + assumes "A B Par C D" + shows "C D Par A B" + by (smt NCol_perm Par_def assms l6_16_1 par_strict_symmetry) + +lemma par_left_comm: + assumes "A B Par C D" + shows "B A Par C D" + by (metis (mono_tags, lifting) ParStrict_def Par_def assms ncoplanar_perm_6 not_col_permutation_5) + +lemma par_right_comm: + assumes "A B Par C D" + shows "A B Par D C" + using assms par_left_comm par_symmetry by blast + +lemma par_comm: + assumes "A B Par C D" + shows "B A Par D C" + using assms par_left_comm par_right_comm by blast + +lemma par_strict_left_comm: + assumes "A B ParStrict C D" + shows "B A ParStrict C D" + using ParStrict_def assms ncoplanar_perm_6 not_col_permutation_5 by blast + +lemma par_strict_right_comm: + assumes "A B ParStrict C D" + shows "A B ParStrict D C" + using assms par_strict_left_comm par_strict_symmetry by blast + +lemma par_strict_comm: + assumes "A B ParStrict C D" + shows "B A ParStrict D C" + by (simp add: assms par_strict_left_comm par_strict_right_comm) + +lemma par_strict_neq1: + assumes "A B ParStrict C D" + shows "A \<noteq> B" + using assms col_trivial_1 par_strict_not_col_4 by blast + +lemma par_strict_neq2: + assumes "A B ParStrict C D" + shows "C \<noteq> D" + using assms col_trivial_2 par_strict_not_col_2 by blast + +lemma par_neq1: + assumes "A B Par C D" + shows "A \<noteq> B" + using Par_def assms par_strict_neq1 by blast + +lemma par_neq2: + assumes "A B Par C D" + shows "C \<noteq> D" + using assms par_neq1 par_symmetry by blast + +lemma Par_cases: + assumes "A B Par C D \<or> B A Par C D \<or> A B Par D C \<or> B A Par D C \<or> C D Par A B \<or> C D Par B A \<or> D C Par A B \<or> D C Par B A" + shows "A B Par C D" + using assms par_right_comm par_symmetry by blast + +lemma Par_perm: + assumes "A B Par C D" + shows "A B Par C D \<and> B A Par C D \<and> A B Par D C \<and> B A Par D C \<and> C D Par A B \<and> C D Par B A \<and> D C Par A B \<and> D C Par B A" + using Par_cases assms by blast + +lemma Par_strict_cases: + assumes "A B ParStrict C D \<or> B A ParStrict C D \<or> A B ParStrict D C \<or> B A ParStrict D C \<or> C D ParStrict A B \<or> C D ParStrict B A \<or> D C ParStrict A B \<or> D C ParStrict B A" + shows "A B ParStrict C D" + using assms par_strict_right_comm par_strict_symmetry by blast + +lemma Par_strict_perm: + assumes "A B ParStrict C D" + shows "A B ParStrict C D \<and> B A ParStrict C D \<and> A B ParStrict D C \<and> B A ParStrict D C \<and> C D ParStrict A B \<and> C D ParStrict B A \<and> D C ParStrict A B \<and> D C ParStrict B A" + using Par_strict_cases assms by blast + +lemma l12_6: + assumes "A B ParStrict C D" + shows "A B OS C D" + by (metis Col_def ParStrict_def Par_strict_perm TS_def assms cop_nts__os par_strict_not_col_2) + +lemma pars__os3412: + assumes "A B ParStrict C D" + shows "C D OS A B" + by (simp add: assms l12_6 par_strict_symmetry) + +lemma perp_dec: + "A B Perp C D \<or> \<not> A B Perp C D" + by simp + +lemma col_cop2_perp2__col: + assumes "X1 X2 Perp A B" and
+ "Y1 Y2 Perp A B" and + "Col X1 Y1 Y2" and + "Coplanar A B X2 Y1" and + "Coplanar A B X2 Y2" + shows "Col X2 Y1 Y2" +proof cases + assume "X1 = Y2" + thus ?thesis + using assms(1) assms(2) assms(4) cop_perp2__col not_col_permutation_2 perp_left_comm by blast +next + assume "X1 \ Y2" + then have "Y2 X1 Perp A B" + by (metis Col_cases assms(2) assms(3) perp_col perp_comm perp_right_comm) + then have P1: "X1 Y2 Perp A B" + using Perp_perm by blast + thus ?thesis + proof cases + assume "X1 = Y1" + thus ?thesis + using assms(1) assms(2) assms(5) cop_perp2__col not_col_permutation_4 by blast + next + assume "X1 \ Y1" + then have "X1 Y1 Perp A B" + using Col_cases P1 assms(3) perp_col by blast + thus ?thesis + using P1 assms(1) assms(4) assms(5) col_transitivity_2 cop_perp2__col perp_not_eq_1 by blast + qed +qed + +lemma col_perp2_ncol_col: + assumes "X1 X2 Perp A B" and + "Y1 Y2 Perp A B" and + "Col X1 Y1 Y2" and + "\ Col X1 A B" + shows "Col X2 Y1 Y2" +proof - + have "Coplanar A B X2 Y1" + proof cases + assume "X1 = Y1" + thus ?thesis + using assms(1) ncoplanar_perm_22 perp__coplanar by blast + next + assume "X1 \ Y1" + then have "Y1 X1 Perp A B" + by (metis Col_cases assms(2) assms(3) perp_col) + thus ?thesis + by (meson assms(1) assms(4) coplanar_trans_1 ncoplanar_perm_18 ncoplanar_perm_4 perp__coplanar) + qed + then moreover have "Coplanar A B X2 Y2" + by (smt assms(1) assms(2) assms(3) assms(4) col_cop2__cop coplanar_perm_17 coplanar_perm_18 coplanar_trans_1 perp__coplanar) + ultimately show ?thesis + using assms(1) assms(2) assms(3) col_cop2_perp2__col by blast +qed + +lemma l12_9: + assumes + "Coplanar C1 C2 A1 B1" and + "Coplanar C1 C2 A1 B2" and + "Coplanar C1 C2 A2 B1" and + "Coplanar C1 C2 A2 B2" and + "A1 A2 Perp C1 C2" and + "B1 B2 Perp C1 C2" + shows "A1 A2 Par B1 B2" +proof - + have P1: "A1 \ A2 \ C1 \ C2" + using assms(5) perp_distinct by auto + have P2: "B1 \ B2" + using assms(6) perp_distinct by auto + show ?thesis + proof cases + assume "Col A1 B1 B2" + then show ?thesis + using P1 P2 Par_def assms(3) assms(4) assms(5) assms(6) col_cop2_perp2__col by blast + next + assume P3: "\ Col A1 B1 B2" + { + assume "\ Col C1 C2 A1" + then have "Coplanar A1 A2 B1 B2" + by (smt assms(1) assms(2) assms(5) coplanar_perm_22 coplanar_perm_8 coplanar_pseudo_trans ncop_distincts perp__coplanar) + } + have "C1 C2 Perp A1 A2" + using Perp_cases assms(5) by blast + then have "Coplanar A1 A2 B1 B2" + by (smt \\ Col C1 C2 A1 \ Coplanar A1 A2 B1 B2\ assms(3) assms(4) coplanar_perm_1 coplanar_pseudo_trans ncop_distincts perp__coplanar perp_not_col2) + { + assume "\ X. Col X A1 A2 \ Col X B1 B2" + then obtain AB where P4: "Col AB A1 A2 \ Col AB B1 B2" by auto + then have "False" + proof cases + assume "AB = A1" + thus ?thesis + using P3 P4 by blast + next + assume "AB \ A1" + then have "A1 AB Perp C1 C2" + by (metis P4 assms(5) not_col_permutation_2 perp_col) + then have "AB A1 Perp C1 C2" + by (simp add: perp_left_comm) + thus ?thesis + using P3 P4 assms(1) assms(2) assms(6) col_cop2_perp2__col by blast + qed + } + then show ?thesis + using ParStrict_def Par_def \Coplanar A1 A2 B1 B2\ by blast + qed +qed + +lemma parallel_existence: + assumes "A \ B" + shows "\ C D. 
C \ D \ A B Par C D \ Col P C D" +proof cases + assume "Col A B P" + then show ?thesis + using Col_perm assms par_reflexivity by blast +next + assume P1: "\ Col A B P" + then obtain P' where P2: "Col A B P' \ A B Perp P P'" + using l8_18_existence by blast + then have P3: "P \ P'" + using P1 by blast + show ?thesis + proof cases + assume P4: "P' = A" + have "\ Q. Per Q P A \ Cong Q P A B \ A P OS Q B" + proof - + have "Col A P P" + using not_col_distincts by auto + moreover have "\ Col A P B" + by (simp add: P1 not_col_permutation_5) + ultimately show ?thesis + using P3 P4 assms ex_per_cong by simp + qed + then obtain Q where T1: "Per Q P A \ Cong Q P A B \ A P OS Q B" by auto + then have T2: "P \ Q" + using os_distincts by auto + have T3: "A B Par P Q" + proof - + have "P Q Perp P A" + proof - + have "P \ A" + using P3 P4 by auto + moreover have "Col P P Q" + by (simp add: col_trivial_1) + ultimately show ?thesis + by (metis T1 T2 Tarski_neutral_dimensionless.Perp_perm Tarski_neutral_dimensionless_axioms per_perp) + qed + moreover have "Coplanar P A A P" + using ncop_distincts by auto + moreover have "Coplanar P A B P" + using ncop_distincts by auto + moreover have "Coplanar P A B Q" + by (metis (no_types) T1 ncoplanar_perm_7 os__coplanar) + moreover have "A B Perp P A" + using P2 P4 by auto + ultimately show ?thesis using l12_9 ncop_distincts by blast + qed + thus ?thesis + using T2 col_trivial_1 by auto + next + assume T4: "P' \ A" + have "\ Q. Per Q P P' \ Cong Q P A B \ P' P OS Q A" + proof - + have "P' \ P" + using P3 by auto + moreover have "A \ B" + by (simp add: assms) + moreover have "Col P' P P" + using not_col_distincts by blast + moreover have "\ Col P' P A" + by (metis P1 P2 T4 col2__eq col_permutation_1) + ultimately show ?thesis + using ex_per_cong by blast + qed + then obtain Q where T5: "Per Q P P' \ Cong Q P A B \ P' P OS Q A" by blast + then have T6: "P \ Q" + using os_distincts by blast + moreover have "A B Par P Q" + proof - + have "Coplanar P P' A P" + using ncop_distincts by auto + moreover have "Coplanar P P' A Q" + by (meson T5 ncoplanar_perm_7 os__coplanar) + then moreover have "Coplanar P P' B Q" + by (smt P2 T4 col2_cop__cop col_permutation_5 col_transitivity_1 coplanar_perm_5) + moreover have "Coplanar P P' B P" + using ncop_distincts by auto + moreover have "A B Perp P P'" + by (simp add: P2) + moreover have "P Q Perp P P'" + by (metis P3 T5 T6 Tarski_neutral_dimensionless.Perp_perm Tarski_neutral_dimensionless_axioms per_perp) + ultimately show ?thesis + using l12_9 by blast + qed + moreover have "Col P P Q" + by (simp add: col_trivial_1) + ultimately show ?thesis + by blast + qed +qed + +lemma par_col_par: + assumes "C \ D'" and + "A B Par C D" and + "Col C D D'" + shows "A B Par C D'" +proof - + { + assume P1: "A B ParStrict C D" + have "Coplanar A B C D'" + using assms(2) assms(3) col2__eq col2_cop__cop par__coplanar par_neq2 by blast + then have "A B Par C D'" + by (smt ParStrict_def Par_def P1 assms(1) assms(3) colx not_col_distincts not_col_permutation_5) + } + { + assume "A \ B \ C \ D \ Col A C D \ Col B C D" + then have "A B Par C D'" + using Par_def assms(1) assms(3) col2__eq col_permutation_2 by blast + } + thus ?thesis + using Par_def \A B ParStrict C D \ A B Par C D'\ assms(2) by auto +qed + +lemma parallel_existence1: + assumes "A \ B" + shows "\ Q. 
A B Par P Q" +proof - + obtain C D where "C \ D \ A B Par C D \ Col P C D" + using assms parallel_existence by blast + then show ?thesis + by (metis Col_cases Par_cases par_col_par) +qed + +lemma par_not_col: + assumes "A B ParStrict C D" and + "Col X A B" + shows "\ Col X C D" + using ParStrict_def assms(1) assms(2) by blast + +lemma not_strict_par1: + assumes "A B Par C D" and + "Col A B X" and + "Col C D X" + shows "Col A B C" + by (smt Par_def assms(1) assms(2) assms(3) col2__eq col_permutation_2 par_not_col) + +lemma not_strict_par2: + assumes "A B Par C D" and + "Col A B X" and + "Col C D X" + shows "Col A B D" + using Par_cases assms(1) assms(2) assms(3) not_col_permutation_4 not_strict_par1 by blast + +lemma not_strict_par: + assumes "A B Par C D" and + "Col A B X" and + "Col C D X" + shows "Col A B C \ Col A B D" + using assms(1) assms(2) assms(3) not_strict_par1 not_strict_par2 by blast + +lemma not_par_not_col: + assumes "A \ B" and + "A \ C" and + "\ A B Par A C" + shows "\ Col A B C" + using Par_def assms(1) assms(2) assms(3) not_col_distincts not_col_permutation_4 by blast + +lemma not_par_inter_uniqueness: + assumes "A \ B" and + "C \ D" and + "\ A B Par C D" and + "Col A B X" and + "Col C D X" and + "Col A B Y" and + "Col C D Y" + shows "X = Y" +proof cases + assume P1: "C = Y" + thus ?thesis + proof cases + assume P2: "C = X" + thus ?thesis + using P1 by auto + next + assume "C \ X" + thus ?thesis + by (smt Par_def assms(1) assms(2) assms(3) assms(4) assms(5) assms(6) assms(7) col3 col_permutation_5 l6_21) + qed +next + assume "C \ Y" + thus ?thesis + by (smt Par_def assms(1) assms(2) assms(3) assms(4) assms(5) assms(6) assms(7) col_permutation_2 col_permutation_4 l6_21) +qed + +lemma inter_uniqueness_not_par: + assumes "\ Col A B C" and + "Col A B P" and + "Col C D P" + shows "\ A B Par C D" + using assms(1) assms(2) assms(3) not_strict_par1 by blast + +lemma col_not_col_not_par: + assumes "\ P. Col A B P \ Col C D P" and + "\ Q. 
Col C D Q \ \Col A B Q" + shows "\ A B Par C D" + using assms(1) assms(2) colx not_strict_par par_neq2 by blast + +lemma par_distincts: + assumes "A B Par C D" + shows "A B Par C D \ A \ B \ C \ D" + using assms par_neq1 par_neq2 by blast + +lemma par_not_col_strict: + assumes "A B Par C D" and + "Col C D P" and + "\ Col A B P" + shows "A B ParStrict C D" + using Col_cases Par_def assms(1) assms(2) assms(3) col3 by blast + +lemma col_cop_perp2_pars: + assumes "\ Col A B P" and + "Col C D P" and + "Coplanar A B C D" and + "A B Perp P Q" and + "C D Perp P Q" + shows "A B ParStrict C D" +proof - + have P1: "C \ D" + using assms(5) perp_not_eq_1 by auto + then have P2: "Coplanar A B C P" + using col_cop__cop assms(2) assms(3) by blast + moreover have P3: "Coplanar A B D P" using col_cop__cop + using P1 assms(2) assms(3) col2_cop__cop col_trivial_2 by blast + have "A B Par C D" + proof - + have "Coplanar P A Q C" + proof - + have "\ Col B P A" + by (simp add: assms(1) not_col_permutation_1) + moreover have "Coplanar B P A Q" + by (meson assms(4) ncoplanar_perm_12 perp__coplanar) + moreover have "Coplanar B P A C" + using P2 ncoplanar_perm_13 by blast + ultimately show ?thesis + using coplanar_trans_1 by auto + qed + then have P4: "Coplanar P Q A C" + using ncoplanar_perm_2 by blast + have "Coplanar P A Q D" + proof - + have "\ Col B P A" + by (simp add: assms(1) not_col_permutation_1) + moreover have "Coplanar B P A Q" + by (meson assms(4) ncoplanar_perm_12 perp__coplanar) + moreover have "Coplanar B P A D" + using P3 ncoplanar_perm_13 by blast + ultimately show ?thesis + using coplanar_trans_1 by blast + qed + then moreover have "Coplanar P Q A D" + using ncoplanar_perm_2 by blast + moreover have "Coplanar P Q B C" + using P2 assms(1) assms(4) coplanar_perm_1 coplanar_perm_10 coplanar_trans_1 perp__coplanar by blast + moreover have "Coplanar P Q B D" + by (meson P3 assms(1) assms(4) coplanar_trans_1 ncoplanar_perm_1 ncoplanar_perm_13 perp__coplanar) + ultimately show ?thesis + using assms(4) assms(5) l12_9 P4 by auto + qed + thus ?thesis + using assms(1) assms(2) par_not_col_strict by auto +qed + +lemma all_one_side_par_strict: + assumes "C \ D" and + "\ P. Col C D P \ A B OS C P" + shows "A B ParStrict C D" +proof - + have P1: "Coplanar A B C D" + by (simp add: assms(2) col_trivial_2 os__coplanar) + { + assume "\ X. 
Col X A B \ Col X C D" + then obtain X where P2: "Col X A B \ Col X C D" by blast + have "A B OS C X" + by (simp add: P2 Col_perm assms(2)) + then obtain M where "A B TS C M \ A B TS X M" + by (meson Col_cases P2 col124__nos) + then have "False" + using P2 TS_def by blast + } + thus ?thesis + using P1 ParStrict_def by auto +qed + +lemma par_col_par_2: + assumes "A \ P" and + "Col A B P" and + "A B Par C D" + shows "A P Par C D" + using assms(1) assms(2) assms(3) par_col_par par_symmetry by blast + +lemma par_col2_par: + assumes "E \ F" and + "A B Par C D" and + "Col C D E" and + "Col C D F" + shows "A B Par E F" + by (metis assms(1) assms(2) assms(3) assms(4) col_transitivity_2 not_col_permutation_4 par_col_par par_distincts par_right_comm) + +lemma par_col2_par_bis: + assumes "E \ F" and + "A B Par C D" and + "Col E F C" and + "Col E F D" + shows "A B Par E F" + by (metis assms(1) assms(2) assms(3) assms(4) col_transitivity_1 not_col_permutation_2 par_col2_par) + +lemma par_strict_col_par_strict: + assumes "C \ E" and + "A B ParStrict C D" and + "Col C D E" + shows "A B ParStrict C E" +proof - + have P1: "C E Par A B" + using Par_def Par_perm assms(1) assms(2) assms(3) par_col_par_2 by blast + { + assume "C E ParStrict A B" + then have "A B ParStrict C E" + by (metis par_strict_symmetry) + } + thus ?thesis + using Col_cases Par_def P1 assms(2) par_strict_not_col_1 by blast +qed + +lemma par_strict_col2_par_strict: + assumes "E \ F" and + "A B ParStrict C D" and + "Col C D E" and + "Col C D F" + shows "A B ParStrict E F" + by (smt ParStrict_def assms(1) assms(2) assms(3) assms(4) col2_cop__cop colx not_col_permutation_1 par_strict_neq1 par_strict_symmetry) + +lemma line_dec: + "(Col C1 B1 B2 \ Col C2 B1 B2) \ \ (Col C1 B1 B2 \ Col C2 B1 B2)" + by simp + +lemma par_distinct: + assumes "A B Par C D" + shows "A \ B \ C \ D" + using assms par_neq1 par_neq2 by auto + +lemma par_col4__par: + assumes "E \ F" and + "G \ H" and + "A B Par C D" and + "Col A B E" and + "Col A B F" and + "Col C D G" and + "Col C D H" + shows "E F Par G H" +proof - + have "C D Par E F" + using Par_cases assms(1) assms(3) assms(4) assms(5) par_col2_par by blast + then have "E F Par C D" + by (simp add: \C D Par E F\ par_symmetry) + thus ?thesis + using assms(2) assms(6) assms(7) par_col2_par by blast +qed + +lemma par_strict_col4__par_strict: + assumes "E \ F" and + "G \ H" and + "A B ParStrict C D" and + "Col A B E" and + "Col A B F" and + "Col C D G" and + "Col C D H" + shows "E F ParStrict G H" +proof - + have "C D ParStrict E F" + using Par_strict_cases assms(1) assms(3) assms(4) assms(5) par_strict_col2_par_strict by blast + then have "E F ParStrict C D" + by (simp add: \C D ParStrict E F\ par_strict_symmetry) + thus ?thesis + using assms(2) assms(6) assms(7) par_strict_col2_par_strict by blast +qed + +lemma par_strict_one_side: + assumes "A B ParStrict C D" and + "Col C D P" + shows "A B OS C P" +proof cases + assume "C = P" + thus ?thesis + using assms(1) assms(2) not_col_permutation_5 one_side_reflexivity par_not_col by blast +next + assume "C \ P" + thus ?thesis + using assms(1) assms(2) l12_6 par_strict_col_par_strict by blast +qed + +lemma par_strict_all_one_side: + assumes "A B ParStrict C D" + shows "\ P. 
Col C D P \ A B OS C P" + using assms par_strict_one_side by blast + +lemma inter_trivial: + assumes "\ Col A B X" + shows "X Inter A X B X" + by (metis Col_perm Inter_def assms col_trivial_1) + +lemma inter_sym: + assumes "X Inter A B C D" + shows "X Inter C D A B" +proof - + obtain P where P1: "Col P C D \ \ Col P A B" + using Inter_def assms by auto + have P2: "A \ B" + using P1 col_trivial_2 by blast + then show ?thesis + proof cases + assume "A = X" + have "Col B A B" + by (simp add: col_trivial_3) + { + assume P3: "Col B C D" + have "Col P A B" + proof - + have "C \ D" + using Inter_def assms by blast + moreover have "Col C D P" + using P1 not_col_permutation_2 by blast + moreover have "Col C D A" + using Inter_def \A = X\ assms by auto + moreover have "Col C D B" + using P3 not_col_permutation_2 by blast + ultimately show ?thesis + using col3 by blast + qed + then have "False" + by (simp add: P1) + } + then have "\ Col B C D" by auto + then show ?thesis + using Inter_def P2 assms by (meson col_trivial_3) + next + assume P5: "A \ X" + have P6: "Col A A B" + using not_col_distincts by blast + { + assume P7: "Col A C D" + have "Col A P X" + proof - + have "C \ D" + using Inter_def assms by auto + moreover have "Col C D A" + using Col_cases P7 by blast + moreover have "Col C D P" + using Col_cases P1 by auto + moreover have "Col C D X" + using Inter_def assms by auto + ultimately show ?thesis + using col3 by blast + qed + then have "Col P A B" + by (metis (full_types) Col_perm Inter_def P5 assms col_transitivity_2) + then have "False" + by (simp add: P1) + } + then have "\ Col A C D" by auto + then show ?thesis + by (meson Inter_def P2 assms col_trivial_1) + qed +qed + +lemma inter_left_comm: + assumes "X Inter A B C D" + shows "X Inter B A C D" + using Col_cases Inter_def assms by auto + +lemma inter_right_comm: + assumes "X Inter A B C D" + shows "X Inter A B D C" + by (metis assms inter_left_comm inter_sym) + +lemma inter_comm: + assumes "X Inter A B C D" + shows "X Inter B A D C" + using assms inter_left_comm inter_right_comm by blast + +lemma l12_17: + assumes "A \ B" and + "P Midpoint A C" and + "P Midpoint B D" + shows "A B Par C D" +proof cases + assume P1: "Col A B P" + thus ?thesis + proof cases + assume "A = P" + thus ?thesis + using assms(1) assms(2) assms(3) cong_diff_2 is_midpoint_id midpoint_col midpoint_cong not_par_not_col by blast + next + assume P2: "A \ P" + thus ?thesis + proof cases + assume "B = P" + thus ?thesis + by (metis assms(1) assms(2) assms(3) midpoint_col midpoint_distinct_2 midpoint_distinct_3 not_par_not_col par_comm) + next + assume P3: "B \ P" + have P4: "Col B P D" + using assms(3) midpoint_col not_col_permutation_4 by blast + have P5: "Col A P C" + using assms(2) midpoint_col not_col_permutation_4 by blast + then have P6: "Col B C P" + using P1 P2 col_transitivity_2 not_col_permutation_3 not_col_permutation_5 by blast + have "C \ D" + using assms(1) assms(2) assms(3) l7_9 by blast + moreover have "Col A C D" + using P1 P3 P4 P6 col3 not_col_permutation_3 not_col_permutation_5 by blast + moreover have "Col B C D" + using P3 P4 P6 col_trivial_3 colx by blast + ultimately show ?thesis + by (simp add: Par_def assms(1)) + qed + qed +next + assume T1: "\ Col A B P" + then obtain E where T2: "Col A B E \ A B Perp P E" + using l8_18_existence by blast + have T3: "A \ P" + using T1 col_trivial_3 by blast + then show ?thesis + proof cases + assume T4: "A = E" + then have T5: "Per P A B" + using T2 l8_2 perp_per_1 by blast + obtain B' where T6: "Bet B A B' \ 
Cong A B' B A" + using segment_construction by blast + obtain D' where T7: "Bet B' P D' \ Cong P D' B' P" + using segment_construction by blast + have T8: "C Midpoint D D'" + using T6 T7 assms(2) assms(3) midpoint_def not_cong_3412 symmetry_preserves_midpoint by blast + have "Col A B B'" + using Col_cases Col_def T6 by blast + then have T9: "Per P A B'" + using per_col T5 assms(1) by blast + obtain B'' where T10: "A Midpoint B B'' \ Cong P B P B''" + using Per_def T5 by auto + then have "B' = B''" + using T6 cong_symmetry midpoint_def symmetric_point_uniqueness by blast + then have "Cong P D P D'" + by (metis Cong_perm Midpoint_def T10 T7 assms(3) cong_inner_transitivity) + then have T12: "Per P C D" + using Per_def T8 by auto + then have T13: "C PerpAt P C C D" + by (metis T3 assms(1) assms(2) assms(3) l7_3_2 per_perp_in sym_preserve_diff) + have T14: "P \ C" + using T3 assms(2) is_midpoint_id_2 by auto + have T15: "C \ D" + using assms(1) assms(2) assms(3) l7_9 by auto + have T15A: "C C Perp C D \ P C Perp C D" + using T12 T14 T15 per_perp by auto + { + assume "C C Perp C D" + then have "A B Par C D" + using perp_distinct by auto + } + { + assume "P C Perp C D" + have "A B Par C D" + proof - + have "Coplanar P A A C" + using ncop_distincts by blast + moreover have "Coplanar P A A D" + using ncop_distincts by blast + moreover have "Coplanar P A B C" + by (simp add: assms(2) coplanar_perm_1 midpoint__coplanar) + moreover have "Coplanar P A B D" + using assms(3) midpoint_col ncop__ncols by blast + moreover have "A B Perp P A" + using T2 T4 by auto + moreover have "C D Perp P A" + proof - + have "P A Perp C D" + proof - + have "P \ A" + using T3 by auto + moreover have "P C Perp C D" + using T14 T15 T12 per_perp by blast + moreover have "Col P C A" + by (simp add: assms(2) l7_2 midpoint_col) + ultimately show ?thesis + using perp_col by blast + qed + then show ?thesis + using Perp_perm by blast + qed + ultimately show ?thesis using l12_9 by blast + qed + } + then show ?thesis using T15A + using \C C Perp C D \ A B Par C D\ by blast + next + assume S1B: "A \ E" + obtain F where S2: "Bet E P F \ Cong P F E P" + using segment_construction by blast + then have S2A: "P Midpoint E F" + using midpoint_def not_cong_3412 by blast + then have S3: "Col C D F" + using T2 assms(2) assms(3) mid_preserves_col by blast + obtain A' where S4: "Bet A E A' \ Cong E A' A E" + using segment_construction by blast + obtain C' where S5: "Bet A' P C' \ Cong P C' A' P" + using segment_construction by blast + have S6: "F Midpoint C C'" + using S4 S5 S2A assms(2) midpoint_def not_cong_3412 symmetry_preserves_midpoint by blast + have S7: "Per P E A" + using T2 col_trivial_3 l8_16_1 by blast + have S8: "Cong P C P C'" + proof - + have "Cong P C P A" + using Cong_perm Midpoint_def assms(2) by blast + moreover have "Cong P A P C'" + proof - + obtain A'' where S9: "E Midpoint A A'' \ Cong P A P A''" + using Per_def S7 by blast + have S10: "A' = A''" + using Cong_perm Midpoint_def S4 S9 symmetric_point_uniqueness by blast + then have "Cong P A P A'" using S9 by auto + moreover have "Cong P A' P C'" + using Cong_perm S5 by blast + ultimately show ?thesis + using cong_transitivity by blast + qed + ultimately show ?thesis + using cong_transitivity by blast + qed + then have S9: "Per P F C" + using S6 Per_def by blast + then have "F PerpAt P F F C" + by (metis S2 S2A T1 T2 S1B assms(2) cong_diff_3 l7_9 per_perp_in) + then have "F PerpAt F P C F" + using Perp_in_perm by blast + then have S10: "F P Perp C F \ F F Perp C F" + using 
l8_15_2 perp_in_col by blast + { + assume S11: "F P Perp C F" + have "Coplanar P E A C" + proof - + have "Col P E P \ Col A C P" + using assms(2) col_trivial_3 midpoint_col not_col_permutation_2 by blast + then show ?thesis + using Coplanar_def by blast + qed + moreover have "Coplanar P E A D" + proof - + have "Col P D B \ Col E A B" + using Mid_cases T2 assms(3) midpoint_col not_col_permutation_1 by blast + then show ?thesis + using Coplanar_def by blast + qed + moreover have "Coplanar P E B C" + by (metis S1B T2 calculation(1) col2_cop__cop col_transitivity_1 ncoplanar_perm_5 not_col_permutation_5) + moreover have "Coplanar P E B D" + by (metis S1B T2 calculation(2) col2_cop__cop col_transitivity_1 ncoplanar_perm_5 not_col_permutation_5) + moreover have "C D Perp P E" + proof - + have "C \ D" + using assms(1) assms(2) assms(3) sym_preserve_diff by blast + moreover have "P F Perp C F" + using Perp_perm S11 by blast + moreover have "Col P F E" + by (simp add: Col_def S2) + moreover have "Col C F D" + using Col_perm S3 by blast + ultimately show ?thesis using per_col + by (smt Perp_cases S2 col_trivial_3 cong_diff perp_col4 perp_not_eq_1) + qed + ultimately have "A B Par C D" + using T2 l12_9 by blast + } + { + assume "F F Perp C F" + then have "A B Par C D" + using perp_distinct by blast + } + thus ?thesis + using S10 \F P Perp C F \ A B Par C D\ by blast + qed +qed + +lemma l12_18_a: + assumes "Cong A B C D" and + "Cong B C D A" and + "\ Col A B C" and + "B \ D" and + "Col A P C" and + "Col B P D" + shows "A B Par C D" +proof - + have "P Midpoint A C \ P Midpoint B D" + using assms(1) assms(2) assms(3) assms(4) assms(5) assms(6) l7_21 by blast + then show ?thesis + using assms(3) l12_17 not_col_distincts by blast +qed + +lemma l12_18_b: + assumes "Cong A B C D" and + "Cong B C D A" and + "\ Col A B C" and + "B \ D" and + "Col A P C" and + "Col B P D" + shows "B C Par D A" + by (smt assms(1) assms(2) assms(3) assms(4) assms(5) assms(6) cong_symmetry inter_uniqueness_not_par l12_18_a l6_21 not_col_distincts) + +lemma l12_18_c: + assumes "Cong A B C D" and + "Cong B C D A" and + "\ Col A B C" and + "B \ D" and + "Col A P C" and + "Col B P D" + shows "B D TS A C" +proof - + have "P Midpoint A C \ P Midpoint B D" + using assms(1) assms(2) assms(3) assms(4) assms(5) assms(6) l7_21 by blast + then show ?thesis + proof - + have "A C TS B D" + by (metis Col_cases Tarski_neutral_dimensionless.mid_two_sides Tarski_neutral_dimensionless_axioms \P Midpoint A C \ P Midpoint B D\ assms(3)) + then have "\ Col B D A" + by (meson Col_cases Tarski_neutral_dimensionless.mid_preserves_col Tarski_neutral_dimensionless.ts__ncol Tarski_neutral_dimensionless_axioms \P Midpoint A C \ P Midpoint B D\ l7_2) + then show ?thesis + by (meson Tarski_neutral_dimensionless.mid_two_sides Tarski_neutral_dimensionless_axioms \P Midpoint A C \ P Midpoint B D\) + qed +qed + +lemma l12_18_d: + assumes "Cong A B C D" and + "Cong B C D A" and + "\ Col A B C" and + "B \ D" and + "Col A P C" and + "Col B P D" + shows "A C TS B D" + by (metis (no_types, lifting) Col_cases TS_def assms(1) assms(2) assms(3) assms(4) assms(5) assms(6) l12_18_c not_col_distincts not_cong_2143 not_cong_4321) + +lemma l12_18: + assumes "Cong A B C D" and + "Cong B C D A" and + "\ Col A B C" and + "B \ D" and + "Col A P C" and + "Col B P D" + shows "A B Par C D \ B C Par D A \ B D TS A C \ A C TS B D" + using assms(1) assms(2) assms(3) assms(4) assms(5) assms(6) l12_18_a l12_18_b l12_18_c l12_18_d by auto + +lemma par_two_sides_two_sides: + assumes "A B 
Par C D" and + "B D TS A C" + shows "A C TS B D" + by (metis Par_def TS_def assms(1) assms(2) invert_one_side invert_two_sides l12_6 l9_31 not_col_permutation_4 one_side_symmetry os_ts1324__os pars__os3412) + +lemma par_one_or_two_sides: + assumes "A B ParStrict C D" + shows "(A C TS B D \ B D TS A C) \ (A C OS B D \ B D OS A C)" + by (smt Par_def assms invert_one_side l12_6 l9_31 not_col_permutation_3 os_ts1324__os par_strict_not_col_1 par_strict_not_col_2 par_two_sides_two_sides pars__os3412 two_sides_cases) + +lemma l12_21_b: + assumes "A C TS B D" and + "B A C CongA D C A" + shows "A B Par C D" +proof - + have P1: "\ Col A B C" + using TS_def assms(1) not_col_permutation_4 by blast + then have P2: "A \ B" + using col_trivial_1 by auto + have P3: "C \ D" + using assms(1) ts_distincts by blast + then obtain D' where P4: "C Out D D' \ Cong C D' A B" + using P2 segment_construction_3 by blast + have P5: "B A C CongA D' C A" + proof - + have "A Out B B" + using P2 out_trivial by auto + moreover have "A Out C C" + using P1 col_trivial_3 out_trivial by force + moreover have "C Out D' D" + by (simp add: P4 l6_6) + moreover have "C Out A A" + using P1 not_col_distincts out_trivial by auto + ultimately show ?thesis + using assms(2) l11_10 by blast + qed + then have P6: "Cong D' A B C" + using Cong_perm P4 cong_pseudo_reflexivity l11_49 by blast + have P7: "A C TS D' B" + proof - + have "A C TS D B" + by (simp add: assms(1) l9_2) + moreover have "Col C A C" + using col_trivial_3 by auto + ultimately show ?thesis + using P4 l9_5 by blast + qed + then obtain M where P8: "Col M A C \ Bet D' M B" + using TS_def by blast + have "B \ D'" + using P7 not_two_sides_id by blast + then have "M Midpoint A C \ M Midpoint B D'" + by (metis Col_cases P1 P4 P6 P8 bet_col l7_21 not_cong_3412) + then have "A B Par C D'" + using P2 l12_17 by blast + thus ?thesis + by (meson P3 P4 Tarski_neutral_dimensionless.par_col_par Tarski_neutral_dimensionless_axioms l6_6 out_col) +qed + +lemma l12_22_aux: + assumes "P \ A" and + "A \ C" and + "Bet P A C" and + "P A OS B D" and + "B A P CongA D C P" + shows "A B Par C D" +proof - + have P1: "P \ C" + using CongA_def assms(5) by blast + obtain B' where P2: "Bet B A B' \ Cong A B' B A" + using segment_construction by blast + have P3: "P A B CongA C A B'" + by (metis CongA_def P2 assms(2) assms(3) assms(5) cong_reverse_identity l11_14) + have P4: "D C A CongA D C P" + by (metis Col_def assms(2) assms(3) assms(4) bet_out_1 col124__nos l6_6 out2__conga out_trivial) + have P5: "A B' Par C D" + proof - + have "\ Col B P A" + using assms(4) col123__nos not_col_permutation_2 by blast + then have "P A TS B B'" + by (metis P2 assms(4) bet__ts cong_reverse_identity invert_two_sides not_col_permutation_3 os_distincts) + then have "A C TS B' D" + by (meson assms(2) assms(3) assms(4) bet_col bet_col1 col_preserves_two_sides l9_2 l9_8_2) + moreover have "B' A C CongA D C A" + proof - + have "B' A C CongA B A P" + by (simp add: P3 conga_comm conga_sym) + moreover have "B A P CongA D C A" + using P4 assms(5) not_conga not_conga_sym by blast + ultimately show ?thesis + using not_conga by blast + qed + ultimately show ?thesis + using l12_21_b by blast + qed + have "C D Par A B" + proof - + have "A \ B" + using assms(4) os_distincts by blast + moreover have "C D Par A B'" + using P5 par_symmetry by blast + moreover have "Col A B' B" + by (simp add: Col_def P2) + ultimately show ?thesis + using par_col_par by blast + qed + thus ?thesis + using Par_cases by blast +qed + +lemma l12_22_b: + assumes "P 
Out A C" and + "P A OS B D" and + "B A P CongA D C P" + shows "A B Par C D" +proof cases + assume "A = C" + then show ?thesis + using assms(2) assms(3) conga_comm conga_os__out not_par_not_col os_distincts out_col by blast +next + assume P1: "A \ C" + { + assume "Bet P A C" + then have "A B Par C D" + using P1 assms(2) assms(3) conga_diff2 l12_22_aux by blast + } + { + assume P2: "Bet P C A" + have "C D Par A B" + proof - + have "P C OS D B" + using assms(1) assms(2) col_one_side one_side_symmetry out_col out_diff2 by blast + moreover have "D C P CongA B A P" + using assms(3) not_conga_sym by blast + then show ?thesis + by (metis P1 P2 assms(1) calculation l12_22_aux out_distinct) + qed + then have "A B Par C D" + using Par_cases by auto + } + then show ?thesis + using Out_def \Bet P A C \ A B Par C D\ assms(1) by blast +qed + +lemma par_strict_par: + assumes "A B ParStrict C D" + shows "A B Par C D" + using Par_def assms by auto + +lemma par_strict_distinct: + assumes "A B ParStrict C D" + shows " A \ B \ C \ D" + using assms par_strict_neq1 par_strict_neq2 by auto + +lemma col_par: + assumes "A \ B" and + "B \ C" and + "Col A B C" + shows "A B Par B C" + by (simp add: Par_def assms(1) assms(2) assms(3) col_trivial_1) + +lemma acute_col_perp__out: + assumes "Acute A B C" and + "Col B C A'" and + "B C Perp A A'" + shows "B Out A' C" +proof - + { + assume P1: "\ Col B C A" + then obtain B' where P2: "B C Perp B' B \ B C OS A B'" + using assms(2) l10_15 os_distincts by blast + have P3: "\ Col B' B C" + using P2 col124__nos col_permutation_1 by blast + { + assume "Col B B' A" + then have "A B C LtA A B C" + using P2 acute_one_side_aux acute_sym assms(1) one_side_not_col124 by blast + then have "False" + by (simp add: nlta) + } + then have P4: "\ Col B B' A" by auto + have P5: "B B' ParStrict A A'" + proof - + have "B B' Par A A'" + proof - + have "Coplanar B C B A" + using ncop_distincts by blast + moreover have "Coplanar B C B A'" + using ncop_distincts by blast + moreover have "Coplanar B C B' A" + using P2 coplanar_perm_1 os__coplanar by blast + moreover have "Coplanar B C B' A'" + using assms(2) ncop__ncols by auto + moreover have "B B' Perp B C" + using P2 Perp_perm by blast + moreover have "A A' Perp B C" + using Perp_perm assms(3) by blast + ultimately show ?thesis + using l12_9 by auto + qed + moreover have "Col A A' A" + by (simp add: col_trivial_3) + moreover have "\ Col B B' A" + by (simp add: P4) + ultimately show ?thesis + using par_not_col_strict by auto + qed + then have P6: "\ Col B B' A'" + using P5 par_strict_not_col_4 by auto + then have "B B' OS A' C" + proof - + have "B B' OS A' A" + using P5 l12_6 one_side_symmetry by blast + moreover have "B B' OS A C" + using P2 acute_one_side_aux acute_sym assms(1) one_side_symmetry by blast + ultimately show ?thesis + using one_side_transitivity by blast + qed + then have "B Out A' C" + using Col_cases assms(2) col_one_side_out by blast + } + then show ?thesis + using assms(2) assms(3) perp_not_col2 by blast +qed + +lemma acute_col_perp__out_1: + assumes "Acute A B C" and + "Col B C A'" and + "B A Perp A A'" + shows "B Out A' C" +proof - + obtain A0 where P1: "Bet A B A0 \ Cong B A0 A B" + using segment_construction by blast + obtain C0 where P2: "Bet C B C0 \ Cong B C0 C B" + using segment_construction by blast + have P3: "\ Col B A A'" + using assms(3) col_trivial_2 perp_not_col2 by blast + have "Bet A' B C0" + proof - + have P4: "Col A' B C0" + using P2 acute_distincts assms(1) assms(2) bet_col col_transitivity_2 
not_col_permutation_4 by blast + { + assume P5: "B Out A' C0" + have "B Out A A0" + proof - + have "Bet C B A'" + by (smt Bet_perm Col_def P2 P5 assms(2) between_exchange3 not_bet_and_out outer_transitivity_between2) + then have "A B C CongA A0 B A'" + using P1 P3 acute_distincts assms(1) cong_diff_4 l11_14 not_col_distincts by blast + then have "Acute A' B A0" + using acute_conga__acute acute_sym assms(1) by blast + moreover have "B A0 Perp A' A" + proof - + have "B \ A0" + using P1 P3 col_trivial_1 cong_reverse_identity by blast + moreover have "B A Perp A' A" + using Perp_perm assms(3) by blast + moreover have "Col B A A0" + using P1 bet_col not_col_permutation_4 by blast + ultimately show ?thesis + using perp_col by blast + qed + ultimately show ?thesis + using Col_cases P1 acute_col_perp__out bet_col by blast + qed + then have "False" + using P1 not_bet_and_out by blast + } + moreover then have "\ B Out A' C0" by auto + ultimately show ?thesis + using l6_4_2 P4 by blast + qed + then show ?thesis + by (metis P2 P3 acute_distincts assms(1) cong_diff_3 l6_2 not_col_distincts) +qed + +lemma conga_inangle_per2__inangle: + assumes "Per A B C" and + "T InAngle A B C" and + "P B A CongA P B C" and + "Per B P T" and + "Coplanar A B C P" + shows "P InAngle A B C" +proof cases + assume "P = T" + then show ?thesis + by (simp add: assms(2)) +next + assume P1: "P \ T" + obtain P' where P2: "P' InAngle A B C \ P' B A CongA P' B C" + using CongA_def angle_bisector assms(3) by presburger + have P3: "Acute P' B A" + using P2 acute_sym assms(1) conga_inangle_per__acute by blast + have P4: "\ Col A B C" + using assms(1) assms(3) conga_diff2 conga_diff56 l8_9 by blast + have P5: "Col B P P'" + proof - + have "\ B Out A C" + using Col_cases P4 out_col by blast + moreover have "Coplanar A B P P'" + proof - + have T1: "\ Col C A B" + using Col_perm P4 by blast + moreover have "Coplanar C A B P" + using assms(5) ncoplanar_perm_8 by blast + moreover have "Coplanar C A B P'" + using P2 inangle__coplanar ncoplanar_perm_21 by blast + ultimately show ?thesis + using coplanar_trans_1 by blast + qed + moreover have "Coplanar B C P P'" + proof - + have "Coplanar A B C P" + by (meson P2 bet__coplanar calculation(1) calculation(2) col_in_angle_out coplanar_perm_18 coplanar_trans_1 inangle__coplanar l11_21_a l6_6 l6_7 not_col_permutation_4 not_col_permutation_5) + have "Coplanar A B C P'" + using P2 inangle__coplanar ncoplanar_perm_18 by blast + then show ?thesis + using P4 \Coplanar A B C P\ coplanar_trans_1 by blast + qed + ultimately show ?thesis using conga2_cop2__col P2 assms(3) by blast + qed + have "B Out P P'" + proof - + have "Acute T B P'" + using P2 acute_sym assms(1) assms(2) conga_inangle2_per__acute by blast + moreover have "B P' Perp T P" + by (metis P1 P5 acute_distincts assms(3) assms(4) calculation col_per_perp conga_distinct l8_2 not_col_permutation_4) + ultimately show ?thesis + using Col_cases P5 acute_col_perp__out by blast + qed + then show ?thesis + using Out_cases P2 in_angle_trans inangle_distincts out341__inangle by blast +qed + +lemma perp_not_par: + assumes "A B Perp X Y" + shows "\ A B Par X Y" +proof - + obtain P where P1: "P PerpAt A B X Y" + using Perp_def assms by blast + { + assume P2: "A B Par X Y" + { + assume P3: "A B ParStrict X Y" + then have "False" + proof - + have "Col P A B" + using Col_perm P1 perp_in_col by blast + moreover have "Col P X Y" + using P1 col_permutation_2 perp_in_col by blast + ultimately show ?thesis + using P3 par_not_col by blast + qed + } + { + assume P4: 
"A \ B \ X \ Y \ Col A X Y \ Col B X Y" + then have "False" + proof cases + assume "A = Y" + thus ?thesis + using P4 assms not_col_permutation_1 perp_not_col by blast + next + assume "A \ Y" + thus ?thesis + using Col_perm P4 Perp_perm assms perp_not_col2 by blast + qed + } + then have "False" + using Par_def P2 \A B ParStrict X Y \ False\ by auto + } + thus ?thesis by auto +qed + +lemma cong_conga_perp: + assumes "B P TS A C" and + "Cong A B C B" and + "A B P CongA C B P" + shows "A C Perp B P" +proof - + have P1: " \ Col A B P" + using TS_def assms(1) by blast + then have P2: "B \ P" + using col_trivial_2 by blast + have P3: "A \ B" + using assms(1) ts_distincts by blast + have P4: "C \ B" + using assms(1) ts_distincts by auto + have P5: "A \ C" + using assms(1) not_two_sides_id by auto + show ?thesis + proof cases + assume P6: "Bet A B C" + then have "Per P B A" + by (meson Tarski_neutral_dimensionless.conga_comm Tarski_neutral_dimensionless_axioms assms(3) l11_18_2) + then show ?thesis + using P2 P3 P5 Per_perm P6 bet_col per_perp perp_col by blast + next + assume P7: "\ Bet A B C" + obtain T where P7A: "Col T B P \ Bet A T C" + using TS_def assms(1) by auto + then have P8: "B \ T" + using P7 by blast + then have P9: "T B A CongA T B C" + by (meson Col_cases P7A Tarski_neutral_dimensionless.col_conga__conga Tarski_neutral_dimensionless.conga_comm Tarski_neutral_dimensionless_axioms assms(3)) + then have P10: "Cong T A T C" + using assms(2) cong2_conga_cong cong_reflexivity not_cong_2143 by blast + then have P11: "T Midpoint A C" + using P7A midpoint_def not_cong_2134 by blast + have P12: "Per B T A" + using P11 Per_def assms(2) not_cong_2143 by blast + then show ?thesis + proof - + have "A C Perp B T" + by (metis P11 P12 P5 P8 col_per_perp midpoint_col midpoint_distinct_1) + moreover have "B \ T" + by (simp add: P8) + moreover have "T \ A" + using P1 P7A by blast + moreover have "C \ T" + using P10 P5 cong_identity by blast + moreover have "C \ A" + using P5 by auto + moreover have "Col T A C" + by (meson P7A bet_col not_col_permutation_4) + ultimately show ?thesis + using P2 P7A not_col_permutation_4 perp_col1 by blast + qed + qed +qed + +lemma perp_inter_exists: + assumes "A B Perp C D" + shows "\ P. Col A B P \ Col C D P" +proof - + obtain P where "P PerpAt A B C D" + using Perp_def assms by auto + then show ?thesis + using perp_in_col by blast +qed + +lemma perp_inter_perp_in: + assumes "A B Perp C D" + shows "\ P. 
Col A B P \ Col C D P \ P PerpAt A B C D" + by (meson Perp_def Tarski_neutral_dimensionless.perp_in_col Tarski_neutral_dimensionless_axioms assms) + +end + +context Tarski_2D + +begin + +lemma l12_9_2D: + assumes "A1 A2 Perp C1 C2" and + "B1 B2 Perp C1 C2" + shows "A1 A2 Par B1 B2" + using l12_9 all_coplanar assms(1) assms(2) by auto + +end + +context Tarski_neutral_dimensionless + +begin + +subsection "Tarski: Chapter 13" + +subsubsection "Introduction" + +lemma per2_col_eq: + assumes "A \ P" and + "A \ P'" and + "Per A P B" and + "Per A P' B" and + "Col P A P'" + shows "P = P'" + by (metis assms(1) assms(2) assms(3) assms(4) assms(5) col_per2_cases l6_16_1 l8_2 not_col_permutation_3) + +lemma per2_preserves_diff: + assumes "PO \ A'" and + "PO \ B'" and + "Col PO A' B'" and + "Per PO A' A" and + "Per PO B' B" and + "A' \ B'" + shows "A \ B" + using assms(1) assms(2) assms(3) assms(4) assms(5) assms(6) not_col_permutation_4 per2_col_eq by blast + +lemma per23_preserves_bet: + assumes "Bet A B C" and + "A \ B'" and "A \ C'" and + "Col A B' C'" and + "Per A B' B" and + "Per A C' C" + shows "Bet A B' C'" +proof - + have P1: "Col A B C" + by (simp add: assms(1) bet_col) + show ?thesis + proof cases + assume P2: "B = B'" + then have "Col A C' C" + using P1 assms(2) assms(4) col_transitivity_1 by blast + then have P4: "A = C' \ C = C'" + by (simp add: assms(6) l8_9) + { + assume "A = C'" + then have "Bet A B' C'" + using assms(3) by auto + } + { + assume "C = C'" + then have "Bet A B' C'" + using P2 assms(1) by auto + } + then show ?thesis + using P4 assms(3) by auto + next + assume T1: "B \ B'" + have T2: "A \ C" + using assms(3) assms(6) l8_8 by auto + have T3: "C \ C'" + using P1 T1 assms(2) assms(3) assms(4) assms(5) col_trivial_3 colx l8_9 not_col_permutation_5 by blast + have T3A: "A B' Perp B' B" + using T1 assms(2) assms(5) per_perp by auto + have T3B: "A C' Perp C' C" + using T3 assms(3) assms(6) per_perp by auto + have T4: "B B' Par C C'" + proof - + have "Coplanar A B' B C" + using P1 ncop__ncols by blast + moreover have "Coplanar A B' B C'" + using assms(4) ncop__ncols by blast + moreover have "Coplanar A B' B' C" + using ncop_distincts by blast + moreover have "B B' Perp A B'" + using Perp_perm \A B' Perp B' B\ by blast + moreover have "C C' Perp A B'" + using Col_cases Perp_cases T3B assms(2) assms(4) perp_col1 by blast + ultimately show ?thesis + using l12_9 bet__coplanar between_trivial by auto + qed + moreover have "Bet A B' C'" + proof cases + assume "B = C" + then show ?thesis + by (metis T1 Tarski_neutral_dimensionless.per_col_eq Tarski_neutral_dimensionless_axioms assms(4) assms(5) calculation l6_16_1 l6_6 or_bet_out out_diff1 par_id) + next + assume T6: "B \ C" + have T7: "\ Col A B' B" + using T1 assms(2) assms(5) l8_9 by blast + have T8: "\ Col A C' C" + using T3 assms(3) assms(6) l8_9 by blast + have T9: "B' \ C'" + using P1 T6 assms(2) assms(5) assms(6) col_per2__per col_permutation_1 l8_2 l8_8 by blast + have T10: "B B' ParStrict C C' \ (B \ B' \ C \ C' \ Col B C C' \ Col B' C C')" + using Par_def calculation by blast + { + assume T11: "B B' ParStrict C C'" + then have T12: "B B' OS C' C" + using l12_6 one_side_symmetry by blast + have "B B' TS A C" + using Col_cases T6 T7 assms(1) bet__ts by blast + then have "Bet A B' C'" + using T12 assms(4) l9_5 l9_9 not_col_distincts or_bet_out by blast + } + { + assume "B \ B' \ C \ C' \ Col B C C' \ Col B' C C'" + then have "Bet A B' C'" + using Col_def T6 T8 assms(1) col_transitivity_2 by blast + } + then show ?thesis + using 
T10 \B B' ParStrict C C' \ Bet A B' C'\ by blast + qed + ultimately show ?thesis + by (smt P1 Par_def T1 T2 assms(4) col_transitivity_2 not_col_permutation_1 par_strict_not_col_2) + qed +qed + +lemma per23_preserves_bet_inv: + assumes "Bet A B' C'" and + "A \ B'" and + "Col A B C" and + "Per A B' B" and + "Per A C' C" + shows "Bet A B C" +proof cases + assume T1: "B = B'" + then have "Col A C' C" + using Col_def assms(1) assms(2) assms(3) col_transitivity_1 by blast + then have T2: "A = C' \ C = C'" + by (simp add: assms(5) l8_9) + { + assume "A = C'" + then have "Bet A B C" + using assms(1) assms(2) between_identity by blast + } + { + assume "C = C'" + then have "Bet A B C" + by (simp add: T1 assms(1)) + } + then show ?thesis + using T2 \A = C' \ Bet A B C\ by auto +next + assume P1: "B \ B'" + then have P2: "A B' Perp B' B" + using assms(2) assms(4) per_perp by auto + have "Per A C' C" + by (simp add: assms(5)) + then have P2: "C' PerpAt A C' C' C" + by (metis (mono_tags, lifting) Col_cases P1 assms(1) assms(2) assms(3) assms(4) bet_col bet_neq12__neq col_transitivity_1 l8_9 per_perp_in) + then have P3: "A C' Perp C' C" + using perp_in_perp by auto + then have "C' \ C" + using \A C' Perp C' C\ perp_not_eq_2 by auto + have "C' PerpAt C' A C C'" + by (simp add: Perp_in_perm P2) + then have "(C' A Perp C C') \ (C' C' Perp C C')" + using Perp_def by blast + have "A \ C'" + using assms(1) assms(2) between_identity by blast + { + assume "C' A Perp C C'" + have "Col A B' C'" using assms(1) + by (simp add: Col_def) + have "A B' Perp C' C" + using Col_cases \A C' Perp C' C\ \Col A B' C'\ assms(2) perp_col by blast + have P7: "B' B Par C' C" + proof - + have "Coplanar A B' B' C'" + using ncop_distincts by blast + moreover have "Coplanar A B' B' C" + using ncop_distincts by auto + moreover have "Coplanar A B' B C'" + using Bet_perm assms(1) bet__coplanar ncoplanar_perm_20 by blast + moreover have "Coplanar A B' B C" + using assms(3) ncop__ncols by auto + moreover have "B' B Perp A B'" + by (metis P1 Perp_perm assms(2) assms(4) per_perp) + moreover have "C' C Perp A B'" + using Perp_cases \A B' Perp C' C\ by auto + ultimately show ?thesis using l12_9 by blast + qed + have "Bet A B C" + proof cases + assume "B = C" + then show ?thesis + by (simp add: between_trivial) + next + assume T1: "B \ C" + have T2: "B' B ParStrict C' C \ (B' \ B \ C' \ C \ Col B' C' C \ Col B C' C)" + using P7 Par_def by auto + { + assume T3: "B' B ParStrict C' C" + then have "B' \ C'" + using not_par_strict_id by auto + have "\ X. 
Col X B' B \ Col X B' C" + using col_trivial_1 by blast + have "B' B OS C' C" + by (simp add: T3 l12_6) + have "B' B TS A C'" + by (metis Bet_cases T3 assms(1) assms(2) bet__ts l9_2 par_strict_not_col_1) + then have T8: "B' B TS C A" + using \B' B OS C' C\ l9_2 l9_8_2 by blast + then obtain T where T9: "Col T B' B \ Bet C T A" + using TS_def by auto + have "\ Col A C B'" + using T8 assms(3) not_col_permutation_2 not_col_permutation_3 ts__ncol by blast + then have "T = B" + by (metis Col_def Col_perm T9 assms(3) colx) + then have "Bet A B C" + using Bet_cases T9 by auto + } + { + assume "B' \ B \ C' \ C \ Col B' C' C \ Col B C' C" + then have "Col A B' B" + by (metis Col_perm T1 assms(3) l6_16_1) + then have "A = B' \ B = B'" + using assms(4) l8_9 by auto + then have "Bet A B C" + by (simp add: P1 assms(2)) + } + then show ?thesis + using T2 \B' B ParStrict C' C \ Bet A B C\ by auto + qed + } + then show ?thesis + by (simp add: P3 perp_comm) +qed + +lemma per13_preserves_bet: + assumes "Bet A B C" and + "B \ A'" and + "B \ C'" and + "Col A' B C'" and + "Per B A' A" and + "Per B C' C" + shows "Bet A' B C'" + by (smt Col_cases Tarski_neutral_dimensionless.per23_preserves_bet_inv Tarski_neutral_dimensionless_axioms assms(1) assms(4) assms(5) assms(6) bet_col between_equality between_symmetry per_distinct third_point) + +lemma per13_preserves_bet_inv: + assumes "Bet A' B C'" and + "B \ A'" and + "B \ C'" and + "Col A B C" and + "Per B A' A" and + "Per B C' C" + shows "Bet A B C" +proof - + have P1: "Col A' B C'" + by (simp add: Col_def assms(1)) + show ?thesis + proof cases + assume "A = A'" + then show ?thesis + using P1 assms(1) assms(3) assms(4) assms(6) col_transitivity_2 l8_9 not_bet_distincts by blast + next + assume "A \ A'" + show ?thesis + by (metis Col_cases P1 Tarski_neutral_dimensionless.per23_preserves_bet Tarski_neutral_dimensionless_axioms assms(1) assms(2) assms(3) assms(4) assms(5) assms(6) between_equality between_symmetry third_point) + qed +qed + +lemma per3_preserves_bet1: + assumes "Col PO A B" and + "Bet A B C" and + "PO \ A'" and + "PO \ B'" and + "PO \ C'" and + "Per PO A' A" and + "Per PO B' B" and + "Per PO C' C" and + "Col A' B' C'" and + "Col PO A' B'" + shows "Bet A' B' C'" +proof cases + assume "A = B" + then show ?thesis + using assms(10) assms(3) assms(4) assms(6) assms(7) between_trivial2 per2_preserves_diff by blast +next + assume P1: "A \ B" + show ?thesis + proof cases + assume P2: "A = A'" + show ?thesis + proof cases + assume P3: "B = B'" + then have "Col PO C C'" + by (metis (no_types, hide_lams) Col_def P1 P2 assms(1) assms(2) assms(9) col_transitivity_1) + then have "C = C'" + using assms(5) assms(8) l8_9 not_col_permutation_5 by blast + then show ?thesis + using P2 P3 assms(2) by blast + next + assume P4: "B \ B'" + show ?thesis + proof cases + assume "A = B'" + then show ?thesis + using P2 between_trivial2 by auto + next + assume "A \ B'" + have "A \ C" + using P1 assms(2) between_identity by blast + have P7: "\ Col PO B' B" + using P4 assms(4) assms(7) l8_9 by blast + show ?thesis + using P2 P7 assms(1) assms(10) assms(3) col_transitivity_1 by blast + qed + qed + next + assume R1: "A \ A'" + show ?thesis + proof cases + assume R2: "A' = B'" + then show ?thesis + by (simp add: between_trivial2) + next + assume R3: "A' \ B'" + show ?thesis + proof cases + assume "B = C" + have "B' = C'" + by (metis Tarski_neutral_dimensionless.per2_col_eq Tarski_neutral_dimensionless_axioms \A' \ B'\ \B = C\ assms(10) assms(4) assms(5) assms(7) assms(8) assms(9) 
col_transitivity_2 not_col_permutation_2) + then show ?thesis + by (simp add: between_trivial) + next + assume R4: "B \ C" + show ?thesis + proof cases + assume "B = B'" + then show ?thesis + by (metis R1 assms(1) assms(10) assms(3) assms(4) assms(6) l6_16_1 l8_9 not_col_permutation_2) + next + assume R5: "B \ B'" + show ?thesis + proof cases + assume "A' = B" + then show ?thesis + using R5 assms(10) assms(4) assms(7) col_permutation_5 l8_9 by blast + next + assume R5A: "A' \ B" + have R6: "C \ C'" + by (metis P1 R1 R3 assms(1) assms(10) assms(2) assms(3) assms(5) assms(6) assms(9) bet_col col_permutation_1 col_trivial_2 l6_21 l8_9) + have R7: "A A' Perp PO A'" + by (metis Perp_cases R1 assms(3) assms(6) per_perp) + have R8: "C C' Perp PO A'" + by (smt Perp_cases R3 R6 assms(10) assms(3) assms(5) assms(8) assms(9) col2__eq col3 col_per_perp col_trivial_2 l8_2 per_perp) + have "A A' Par C C'" + proof - + have "Coplanar PO A' A C" + using P1 assms(1) assms(2) bet_col col_trivial_2 colx ncop__ncols by blast + moreover have "Coplanar PO A' A C'" + using R3 assms(10) assms(9) col_trivial_2 colx ncop__ncols by blast + moreover have "Coplanar PO A' A' C" + using ncop_distincts by blast + moreover have "Coplanar PO A' A' C'" + using ncop_distincts by blast + ultimately show ?thesis using l12_9 R7 R8 by blast + qed + have S1: "B B' Perp PO A'" + by (metis Col_cases Per_cases Perp_perm R5 assms(10) assms(3) assms(4) assms(7) col_per_perp) + have "A A' Par B B'" + proof - + have "Coplanar PO A' A B" + using assms(1) ncop__ncols by auto + moreover have "Coplanar PO A' A B'" + using assms(10) ncop__ncols by auto + moreover have "Coplanar PO A' A' B" + using ncop_distincts by auto + moreover have "Coplanar PO A' A' B'" + using ncop_distincts by auto + moreover have "A A' Perp PO A'" + by (simp add: R7) + moreover have "B B' Perp PO A'" + by (simp add: S1) + ultimately show ?thesis + using l12_9 by blast + qed + { + assume "A A' ParStrict B B'" + then have "A A' OS B B'" + by (simp add: l12_6) + have "B B' TS A C" + using R4 \A A' ParStrict B B'\ assms(2) bet__ts par_strict_not_col_3 by auto + have "B B' OS A A'" + using \A A' ParStrict B B'\ pars__os3412 by auto + have "B B' TS A' C" + using \B B' OS A A'\ \B B' TS A C\ l9_8_2 by blast + have "Bet A' B' C'" + proof cases + assume "C = C'" + then show ?thesis + using R6 by auto + next + assume "C \ C'" + have "C C' Perp PO A'" + by (simp add: R8) + have Q2: "B B' Par C C'" + proof - + have "Coplanar PO A' B C" + by (metis P1 assms(1) assms(2) bet_col col_transitivity_1 colx ncop__ncols not_col_permutation_5) + moreover have "Coplanar PO A' B C'" + using R3 assms(10) assms(9) col_trivial_2 colx ncop__ncols by blast + moreover have "Coplanar PO A' B' C" + by (simp add: assms(10) col__coplanar) + moreover have "Coplanar PO A' B' C'" + using assms(10) col__coplanar by auto + moreover have "B B' Perp PO A'" + by (simp add: S1) + moreover have "C C' Perp PO A'" + by (simp add: R8) + ultimately show ?thesis + using l12_9 by auto + qed + then have Q3: "(B B' ParStrict C C') \ (B \ B' \ C \ C' \ Col B C C' \ Col B' C C')" + by (simp add: Par_def) + { + assume "B B' ParStrict C C'" + then have "B B' OS C C'" + using l12_6 by auto + then have "B B' TS C' A'" + using \B B' TS A' C\ l9_2 l9_8_2 by blast + then obtain T where Q4: "Col T B B' \ Bet C' T A'" + using TS_def by blast + have "T = B'" + proof - + have "\ Col B B' A'" + using \B B' OS A A'\ col124__nos by auto + moreover have "A' \ C'" + using \B B' TS C' A'\ not_two_sides_id by auto + moreover have "Col B 
B' T" + using Col_cases Q4 by auto + moreover have "Col B B' B'" + using not_col_distincts by blast + moreover have "Col A' C' T" + by (simp add: Col_def Q4) + ultimately show ?thesis + by (meson assms(9) col_permutation_5 l6_21) + qed + then have "Bet A' B' C'" + using Q4 between_symmetry by blast + } + { + assume "B \ B' \ C \ C' \ Col B C C' \ Col B' C C'" + then have "Bet A' B' C'" + using TS_def \B B' TS A C\ l6_16_1 not_col_permutation_2 by blast + } + then show ?thesis + using Q3 \B B' ParStrict C C' \ Bet A' B' C'\ by blast + qed + } + { + assume R8: "A \ A' \ B \ B' \ Col A B B' \ Col A' B B'" + have "A' A Perp PO A'" + by (simp add: R7 perp_left_comm) + have "\ Col A' A PO" + using Col_cases R8 assms(3) assms(6) l8_9 by blast + then have "Bet A' B' C'" + using Col_perm P1 R8 assms(1) l6_16_1 by blast + } + then show ?thesis + using Par_def \A A' Par B B'\ \A A' ParStrict B B' \ Bet A' B' C'\ by auto + qed + qed + qed + qed + qed +qed + +lemma per3_preserves_bet2_aux: + assumes "Col PO A C" and + "A \ C'" and + "Bet A B' C'" and + "PO \ A" and + "PO \ B'" and + "PO \ C'" and + "Per PO B' B" and + "Per PO C' C" and + "Col A B C" and + "Col PO A C'" + shows "Bet A B C" +proof cases + assume "A = B" + then show ?thesis + by (simp add: between_trivial2) +next + assume P1: "A \ B" + show ?thesis + proof cases + assume "B = C" + then show ?thesis + by (simp add: between_trivial) + next + assume P2: "B \ C" + have P3: "Col PO A B'" + by (metis Col_def assms(10) assms(2) assms(3) l6_16_1) + then have P4: "Col PO B' C'" + using assms(10) assms(4) col_transitivity_1 by blast + show ?thesis + proof cases + assume "B = B'" + thus ?thesis + by (metis Tarski_neutral_dimensionless.per_col_eq Tarski_neutral_dimensionless_axioms assms(1) assms(10) assms(3) assms(4) assms(6) assms(8) col_transitivity_1) + next + assume P5: "B \ B'" + have P6: "C = C'" + using assms(1) assms(10) assms(4) assms(6) assms(8) col_transitivity_1 l8_9 by blast + then have "False" + by (metis P3 P5 P6 Tarski_neutral_dimensionless.per_col_eq Tarski_neutral_dimensionless_axioms assms(1) assms(2) assms(4) assms(5) assms(7) assms(9) col_transitivity_1 l6_16_1 not_col_permutation_4) + then show ?thesis by blast + qed + qed +qed + +lemma per3_preserves_bet2: + assumes "Col PO A C" and + "A' \ C'" and + "Bet A' B' C'" and + "PO \ A'" and + "PO \ B'" and + "PO \ C'" and + "Per PO A' A" and + "Per PO B' B" and + "Per PO C' C" and + "Col A B C" and + "Col PO A' C'" + shows "Bet A B C" +proof cases + assume "A = A'" + then show ?thesis + using assms(1) assms(10) assms(11) assms(2) assms(3) assms(4) assms(5) assms(6) assms(8) assms(9) per3_preserves_bet2_aux by blast +next + assume P1: "A \ A'" + show ?thesis + proof cases + assume "C = C'" + thus ?thesis + by (metis P1 assms(1) assms(11) assms(4) assms(6) assms(7) col_trivial_3 l6_21 l8_9 not_col_permutation_2) + next + assume P2: "C \ C'" + then have P3: "PO A' Perp C C'" + by (metis assms(11) assms(4) assms(6) assms(9) col_per_perp l8_2 not_col_permutation_1) + have P4: "PO A' Perp A A'" + using P1 assms(4) assms(7) per_perp perp_right_comm by auto + have "A A' Par C C'" + proof - + have "Coplanar PO A' A C" + using assms(1) ncop__ncols by blast + moreover have "Coplanar PO A' A C'" + by (meson assms(11) ncop__ncols) + moreover have "Coplanar PO A' A' C" + using ncop_distincts by blast + moreover have "Coplanar PO A' A' C'" + using ncop_distincts by blast + moreover have "A A' Perp PO A'" + using P4 Perp_cases by blast + moreover have "C C' Perp PO A'" + using P3 Perp_cases by auto 
+ ultimately show ?thesis + using l12_9 by blast + qed + { + assume P5: "A A' ParStrict C C'" + then have P6: "A A' OS C C'" + by (simp add: l12_6) + have P7: "C C' OS A A'" + by (simp add: P5 pars__os3412) + + have "Bet A B C" + proof cases + assume P8: "B = B'" + then have "A' A OS B C'" + by (metis P6 assms(10) assms(3) bet_out col123__nos col124__nos invert_one_side out_one_side) + then have "A A' OS B C'" + by (simp add: invert_one_side) + then have "A A' OS B C" + using P6 one_side_symmetry one_side_transitivity by blast + then have P12: "A Out B C" + using assms(10) col_one_side_out by blast + have "C' C OS B A'" + by (metis Col_perm P5 P7 P8 assms(10) assms(3) bet_out_1 col123__nos out_one_side par_strict_not_col_2) + then have "C C' OS B A" + by (meson P7 invert_one_side one_side_symmetry one_side_transitivity) + then have "C C' OS A B" + using one_side_symmetry by blast + then have "C Out A B" + using assms(10) col_one_side_out col_permutation_2 by blast + then show ?thesis + by (simp add: P12 out2__bet) + next + assume T1: "B \ B'" + have T2: "PO A' Perp B B'" + proof - + have "Per PO B' B" + by (simp add: assms(8)) + then have "B' PerpAt PO B' B' B" + using T1 assms(5) per_perp_in by auto + then have "B' PerpAt B' PO B B'" + by (simp add: perp_in_comm) + then have T4: "B' PO Perp B B' \ B' B' Perp B B'" + using Perp_def by auto + { + assume T5: "B' PO Perp B B'" + have "Col A' B' C'" + by (simp add: assms(3) bet_col) + then have "Col PO B' A'" + using assms(11) assms(2) col2__eq col_permutation_4 col_permutation_5 by blast + then have "PO A' Perp B B'" + by (metis T5 assms(4) col_trivial_3 perp_col2 perp_comm) + } + { + assume "B' B' Perp B B'" + then have "PO A' Perp B B'" + using perp_distinct by auto + } + then show ?thesis + using T4 \B' PO Perp B B' \ PO A' Perp B B'\ by linarith + qed + have T6: "B B' Par A A'" + proof - + have "Coplanar PO A' B A" + by (metis Col_cases P7 assms(1) assms(10) col_transitivity_2 ncop__ncols os_distincts) + moreover have "Coplanar PO A' B A'" + using ncop_distincts by blast + moreover have "Coplanar PO A' B' A" + proof - + have "(Bet PO A' C' \ Bet PO C' A') \ Bet C' PO A'" + by (meson assms(11) third_point) + then show ?thesis + by (meson Bet_perm assms(3) bet__coplanar between_exchange2 l5_3 ncoplanar_perm_8) + qed + moreover have "Coplanar PO A' B' A'" + using ncop_distincts by auto + moreover have "B B' Perp PO A'" + using Perp_cases T2 by blast + moreover have "A A' Perp PO A'" + using P4 Perp_cases by blast + ultimately show ?thesis + using l12_9 by blast + qed + { + assume "B B' ParStrict A A'" + then have "B B' OS A A'" + by (simp add: l12_6) + have "B B' Par C C'" + proof - + have "Coplanar PO A' B C" + by (metis Col_cases P7 assms(1) assms(10) col2__eq ncop__ncols os_distincts) + moreover have "Coplanar PO A' B C'" + using assms(11) ncop__ncols by auto + moreover have "Coplanar PO A' B' C" + by (metis Out_def assms(11) assms(2) assms(3) col_trivial_2 l6_16_1 ncop__ncols not_col_permutation_1 out_col) + moreover have "Coplanar PO A' B' C'" + using assms(11) ncop__ncols by blast + moreover have "B B' Perp PO A'" + using Perp_cases T2 by blast + moreover have "C C' Perp PO A'" + using P3 Perp_cases by auto + ultimately show ?thesis + using l12_9 by blast + qed + { + assume T9: "B B' ParStrict C C'" + then have T10: "B B' OS C C'" + by (simp add: l12_6) + have T11: "B B' TS A' C'" + by (metis Col_cases T10 \B B' ParStrict A A'\ assms(3) bet__ts invert_two_sides os_distincts par_strict_not_col_4) + have T12: "B B' TS A C'" + using \B B' 
OS A A'\ \B B' TS A' C'\ l9_8_2 one_side_symmetry by blast + then have T12A: "B B' TS C A" + using T10 l9_2 l9_8_2 one_side_symmetry by blast + then obtain T where T13: "Col T B B' \ Bet C T A" + using TS_def by auto + then have "B = T" + by (metis Col_perm TS_def T12A assms(10) bet_col1 col_transitivity_2 col_two_sides_bet) + then have "Bet A B C" + using Bet_perm T13 by blast + } + { + assume "B \ B' \ C \ C' \ Col B C C' \ Col B' C C'" + then have "Bet A B C" + by (metis Col_cases P5 assms(10) col3 col_trivial_2 not_bet_distincts par_strict_not_col_3) + } + then have "Bet A B C" + using Par_def \B B' Par C C'\ \B B' ParStrict C C' \ Bet A B C\ by auto + } + { + assume "B \ B' \ A \ A' \ Col B A A' \ Col B' A A'" + then have "Bet A B C" + by (smt P6 assms(10) col123__nos l6_16_1 not_bet_distincts not_col_permutation_1) + } + then show ?thesis + using Par_def T6 \B B' ParStrict A A' \ Bet A B C\ by auto + qed + } + { + assume "A \ A' \ C \ C' \ Col A C C' \ Col A' C C'" + then have "Bet A B C" + by (metis Col_perm P3 Par_def assms(11) assms(2) assms(4) col_transitivity_1 perp_not_par) + } + thus ?thesis + using Par_def \A A' Par C C'\ \A A' ParStrict C C' \ Bet A B C\ by auto + qed +qed + +lemma symmetry_preserves_per: + assumes "Per B P A" and + "B Midpoint A A'" and + "B Midpoint P P'" + shows "Per B P' A'" +proof - + obtain C where P1: "P Midpoint A C" + using symmetric_point_construction by blast + obtain C' where P2: "B Midpoint C C'" + using symmetric_point_construction by blast + have P3: "P' Midpoint A' C'" + using P1 P2 assms(2) assms(3) symmetry_preserves_midpoint by blast + have "Cong B A' B C'" + by (meson P1 P2 assms(1) assms(2) l7_16 l7_3_2 per_double_cong) + then show ?thesis + using P3 Per_def by blast +qed + +lemma l13_1_aux: + assumes "\ Col A B C" and + "P Midpoint B C" and + "Q Midpoint A C" and + "R Midpoint A B" + shows + "\ X Y. 
(R PerpAt X Y A B \ X Y Perp P Q \ Coplanar A B C X \ Coplanar A B C Y)" +proof - + have P1: "Q \ C" + using assms(1) assms(3) midpoint_not_midpoint not_col_distincts by blast + have P2: "P \ C" + using assms(1) assms(2) is_midpoint_id_2 not_col_distincts by blast + then have "Q \ R" + using assms(2) assms(3) assms(4) l7_3 symmetric_point_uniqueness by blast + have "R \ B" + using assms(1) assms(4) midpoint_not_midpoint not_col_distincts by blast + { + assume V1: "Col P Q C" + have V2: "Col B P C" + by (simp add: assms(2) bet_col midpoint_bet) + have V3: "Col A Q C" + by (simp add: assms(3) bet_col midpoint_bet) + have "Col A R B" + using assms(4) midpoint_col not_col_permutation_4 by blast + then have "Col A B C" using V1 V2 V3 + by (metis P1 P2 col2__eq col_permutation_5) + then have "False" + using assms(1) by auto + } + then have P2A: "\ Col P Q C" by auto + then obtain C' where P3: "Col P Q C' \ P Q Perp C C'" + using l8_18_existence by blast + obtain A' where P4: "Q Midpoint C' A'" + using symmetric_point_construction by auto + obtain B' where P5: "P Midpoint C' B'" + using symmetric_point_construction by auto + have P6: "Cong C C' B B'" + using Mid_cases P5 assms(2) l7_13 by blast + have P7: "Cong C C' A A'" + using P4 assms(3) l7_13 l7_2 by blast + have P8: "Per P B' B" + proof cases + assume "P = C'" + then show ?thesis + using P5 Per_cases is_midpoint_id l8_5 by blast + next + assume "P \ C'" + then have "P C' Perp C C'" + using P3 perp_col by blast + then have "Per P C' C" + using Perp_perm perp_per_2 by blast + then show ?thesis + using symmetry_preserves_per Mid_perm P5 assms(2) by blast + qed + have P8A: "Per Q A' A" + proof cases + assume "Q = C'" + then show ?thesis + using P4 Per_cases is_midpoint_id l8_5 by blast + next + assume "Q \ C'" + then have "C' Q Perp C C'" + using P3 col_trivial_2 perp_col2 by auto + then have "Per Q C' C" + by (simp add: perp_per_1) + then show ?thesis + by (meson Mid_cases P4 assms(3) l7_3_2 midpoint_preserves_per) + qed + have P9: "Col A' C' Q" + using P4 midpoint_col not_col_permutation_3 by blast + have P10: "Col B' C' P" + using P5 midpoint_col not_col_permutation_3 by blast + have P11: "P \ Q" + using P2A col_trivial_1 by auto + then have P12: "A' \ B'" + using P4 P5 l7_17 by blast + have P13: "Col A' B' P" + by (metis P10 P3 P4 P5 P9 col2__eq col_permutation_5 midpoint_distinct_1 not_col_distincts) + have P14: "Col A' B' Q" + by (smt P10 P3 P4 P5 P9 col3 col_permutation_1 midpoint_distinct_1 not_col_distincts) + have P15: "Col A' B' C'" + using P11 P13 P14 P3 colx by blast + have P16: "C \ C'" + using P2A P3 by blast + then have P17: "A \ A'" + using P7 cong_diff by blast + have P18: "B \ B'" + using P16 P6 cong_diff by blast + have P19: "Per P A' A" + proof cases + assume P20: "A' = Q" + then have "A' P Perp C A'" + by (metis P3 P4 Perp_cases midpoint_not_midpoint) + then have "Per P A' C" + by (simp add: perp_per_1) + then show ?thesis + using P20 assms(3) l7_2 l8_4 by blast + next + assume "A' \ Q" + then show ?thesis + by (meson P12 P13 P14 P8A col_transitivity_1 l8_2 per_col) + qed + have "Per Q B' B" + proof cases + assume P21: "P = B'" + then have P22: "C' = B'" + using P5 is_midpoint_id_2 by auto + then have "Per Q B' C" + using P3 P21 perp_per_1 by auto + thus ?thesis + by (metis Col_perm P16 P21 P22 assms(2) midpoint_col per_col) + next + assume P23: "P \ B'" + have "Col B' P Q" + using P12 P13 P14 col_transitivity_2 by blast + then have "Per B B' Q" + using P8 P23 l8_2 l8_3 by blast + thus ?thesis + using Per_perm by blast + 
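+      (* B', P and Q are collinear and P differs from B', so the right angle P B' B (P8) transfers to Q B' B via l8_2 and l8_3. *)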
qed + then have P24: "Per A' B' B" + using P11 P13 P14 P8 l8_3 not_col_permutation_2 by blast + have P25: "Per A A' B'" + using P11 P13 P14 P19 P8A l8_2 l8_3 not_col_permutation_5 by blast + then have "Per B' A' A" + using Per_perm by blast + then have "\ Col B' A' A" + using P12 P17 P25 per_not_col by auto + then have P26: "\ Col A' B' A" + using Col_cases by auto + have "\ Col A' B' B" + using P12 P18 P24 l8_9 by auto + obtain X where P28: "X Midpoint A' B'" + using midpoint_existence by blast + then have P28A: "Col A' B' X" + using midpoint_col not_col_permutation_2 by blast + then have "\ Q. A' B' Perp Q X \ A' B' OS A Q" + by (simp add: P26 l10_15) + then obtain y where P29: "A' B' Perp y X \ A' B' OS A y" by blast + then obtain B'' where P30: "(X y Perp A B'' \ A = B'') \ (\ M. (Col X y M \ M Midpoint A B''))" + using ex_sym by blast + then have P31: "B'' A ReflectL X y" + using P30 ReflectL_def by blast + have P32: "X \ y" + using P29 P28A col124__nos by blast + then have "X \ y \ B'' A ReflectL X y \ X = y \ X Midpoint A B''" + using P31 by auto + then have P33: "B'' A Reflect X y" + by (simp add: Reflect_def) + have P33A: "X \ y \ A' B' ReflectL X y" + using P28 P29 Perp_cases ReflectL_def P32 col_trivial_3 l10_4_spec by blast + then have P34: "A' B' Reflect X y" + using Reflect_def by blast + have P34A: "A B'' Reflect X y" + using P33 l10_4 by blast + then have P35: "Cong B'' B' A A'" + using P34 l10_10 by auto + have "Per A' B' B''" + proof - + have R1: "X \ y \ A B'' ReflectL X y \ X = y \ X Midpoint B'' A" + by (simp add: P31 P32 l10_4_spec) + have R2: "X \ y \ A' B' ReflectL X y \ X = y \ X Midpoint B' A'" + using P33A by linarith + { + assume "X \ y \ A B'' ReflectL X y \ X \ y \ A' B' ReflectL X y" + then have "Per A' B' B''" + using \Per B' A' A\ image_spec_preserves_per l10_4_spec by blast + } + { + assume "X \ y \ A B'' ReflectL X y \ X = y \ X Midpoint B' A'" + then have "Per A' B' B''" by blast + } + { + assume "X = y \ X Midpoint B'' A \ X \ y \ A' B' ReflectL X y" + then have "Per A' B' B''" by blast + } + { + assume "X = y \ X Midpoint B'' A \ X = y \ X Midpoint B' A'" + then have "Per A' B' B''" + using P32 by blast + } + then show ?thesis using R1 R2 + using \X \ y \ A B'' ReflectL X y \ X \ y \ A' B' ReflectL X y \ Per A' B' B''\ by auto + qed + have "A' B' OS A B''" + proof - + { + assume S1: "X y Perp A B''" + have "Coplanar A y A' X" + by (metis P28A P29 col_one_side coplanar_perm_16 ncop_distincts os__coplanar) + have "Coplanar A y B' X" + by (smt P12 P28A P29 col2_cop__cop col_transitivity_1 ncoplanar_perm_22 not_col_permutation_5 os__coplanar) + have S2: "\ Col A X y" + using Col_perm P34A S1 local.image_id perp_distinct by blast + + have "A' B' Par A B''" + proof - + have "Coplanar X y A' A" + using \Coplanar A y A' X\ ncoplanar_perm_21 by blast + moreover have "Coplanar X y A' B''" + proof - + have "Coplanar A X y A'" + using \Coplanar X y A' A\ ncoplanar_perm_9 by blast + moreover have "Coplanar A X y B''" + using Coplanar_def S1 perp_inter_exists by blast + ultimately show ?thesis + using S2 coplanar_trans_1 by auto + qed + moreover have "Coplanar X y B' A" + proof - + have "\ Col A X y" + by (simp add: S2) + moreover have "Coplanar A X y B'" + using \Coplanar A y B' X\ ncoplanar_perm_3 by blast + moreover have "Coplanar A X y B''" + using Coplanar_def S1 perp_inter_exists by blast + ultimately show ?thesis + using ncoplanar_perm_18 by blast + qed + moreover have "Coplanar X y B' B''" + proof - + have "\ Col A X y" + by (simp add: S2) + moreover have 
"Coplanar A X y B'" + using \Coplanar X y B' A\ ncoplanar_perm_9 by blast + moreover have "Coplanar A X y B''" + using Coplanar_def S1 perp_inter_exists by blast + ultimately show ?thesis + using coplanar_trans_1 by blast + qed + ultimately show ?thesis using l12_9 + using P29 Perp_cases S1 by blast + qed + have "A' B' OS A B''" + proof - + { + assume "A' B' ParStrict A B''" + have "A' B' OS A B''" using l12_6 + using \A' B' ParStrict A B''\ by blast + } + { + assume "A' \ B' \ A \ B'' \ Col A' A B'' \ Col B' A B''" + have "A' B' OS A B''" + using P26 \A' B' Par A B''\ \A' B' ParStrict A B'' \ A' B' OS A B''\ col_trivial_3 par_not_col_strict by blast + } + then show ?thesis + using Par_def \A' B' Par A B''\ \A' B' ParStrict A B'' \ A' B' OS A B''\ by auto + qed + } + { + assume "A = B''" + then have "A' B' OS A B''" + using P12 P25 \Per A' B' B''\ l8_2 l8_7 by blast + } + then show ?thesis + using P30 \X y Perp A B'' \ A' B' OS A B''\ by blast + qed + have "A' B' OS A B" + proof - + have "A' B' TS A C" + proof - + have "\ Col A A' B'" + using Col_perm \\ Col B' A' A\ by blast + moreover have "\ Col C A' B'" + by (metis P13 P14 P2A \\ Col B' A' A\ col3 not_col_distincts not_col_permutation_3 not_col_permutation_4) + moreover have "\ T. Col T A' B' \ Bet A T C" + using P14 assms(3) midpoint_bet not_col_permutation_1 by blast + ultimately show ?thesis + by (simp add: TS_def) + qed + moreover have "A' B' TS B C" + by (metis Col_cases P13 TS_def \\ Col A' B' B\ assms(2) calculation midpoint_bet) + ultimately show ?thesis + using OS_def by blast + qed + have "Col B B'' B'" + proof - + have "Coplanar A' B B'' B'" + proof - + have "Coplanar A' B' B B''" + proof - + have "\ Col A A' B'" + using Col_perm \\ Col B' A' A\ by blast + moreover have "Coplanar A A' B' B" + using \A' B' OS A B\ ncoplanar_perm_8 os__coplanar by blast + moreover have "Coplanar A A' B' B''" + using \A' B' OS A B''\ ncoplanar_perm_8 os__coplanar by blast + ultimately show ?thesis + using coplanar_trans_1 by blast + qed + then show ?thesis + using ncoplanar_perm_4 by blast + qed + moreover have "A' \ B'" + by (simp add: P12) + moreover have "Per B B' A'" + by (simp add: P24 l8_2) + moreover have "Per B'' B' A'" + using Per_cases \Per A' B' B''\ by auto + ultimately show ?thesis + using cop_per2__col by blast + qed + have "Cong B B' A A'" + using P6 P7 cong_inner_transitivity by blast + have "B = B'' \ B' Midpoint B B''" + proof - + have "Col B B' B''" + using \Col B B'' B'\ not_col_permutation_5 by blast + moreover have "Cong B' B B' B''" + by (metis Cong_perm P35 P6 P7 cong_inner_transitivity) + ultimately show ?thesis + using l7_20 by simp + qed + { + assume "B = B''" + then obtain M where S1: "Col X y M \ M Midpoint A B" + using P30 by blast + then have "R = M" + using assms(4) l7_17 by auto + have "A \ B" + using assms(1) col_trivial_1 by auto + have "Col R A B" + by (simp add: assms(4) midpoint_col) + have "X \ R" + using Midpoint_def P28 \A' B' OS A B''\ \B = B''\ assms(4) midpoint_col one_side_chara by auto + then have "\ X Y. 
(R PerpAt X Y A B \ X Y Perp P Q \ Coplanar A B C X \ Coplanar A B C Y)" + proof - + have "R PerpAt R X A B" + proof - + have "R X Perp A B" + using P30 S1 \A \ B\ \B = B''\ \R = M\ \X \ R\ perp_col perp_left_comm by blast + then show ?thesis + using \Col R A B\ l8_14_2_1b_bis not_col_distincts by blast + qed + moreover have "R X Perp P Q" + proof - + have "X R Perp P Q" + proof - + have "X y Perp P Q" + proof - + have "P Q Perp X y" + using P11 P13 P14 P29 P33A col_trivial_2 col_trivial_3 perp_col4 by blast + then show ?thesis + using Perp_perm by blast + qed + moreover have "Col X y R" + by (simp add: S1 \R = M\) + ultimately show ?thesis + using \X \ R\ perp_col by blast + qed + then show ?thesis + using Perp_perm by blast + qed + moreover have "Coplanar A B C R" + using \Col R A B\ ncop__ncols not_col_permutation_2 by blast + moreover have "Coplanar A B C X" + proof - + have "Col P Q X" + using P12 P13 P14 P28A col3 by blast + moreover have "\ Col P Q C" + by (simp add: P2A) + moreover have "Coplanar P Q C A" + using assms(3) coplanar_perm_19 midpoint__coplanar by blast + moreover have "Coplanar P Q C B" + using assms(2) midpoint_col ncop__ncols not_col_permutation_5 by blast + moreover have "Coplanar P Q C C" + using ncop_distincts by auto + moreover have "Coplanar P Q C X" + using calculation(1) ncop__ncols by blast + ultimately show ?thesis + using coplanar_pseudo_trans by blast + qed + ultimately show ?thesis by blast + qed + } + { + assume "B' Midpoint B B''" + have "A' B' TS B B''" + proof - + have "\ Col B A' B'" + using Col_perm \\ Col A' B' B\ by blast + moreover have "\ Col B'' A' B'" + using \A' B' OS A B''\ col124__nos not_col_permutation_2 by blast + moreover have "\ T. Col T A' B' \ Bet B T B''" + using \B' Midpoint B B''\ col_trivial_3 midpoint_bet by blast + ultimately show ?thesis + by (simp add: TS_def) + qed + have "A' B' OS B B''" + using \A' B' OS A B''\ \A' B' OS A B\ one_side_symmetry one_side_transitivity by blast + have "\ A' B' OS B B''" + using \A' B' TS B B''\ l9_9_bis by blast + then have "False" + by (simp add: \A' B' OS B B''\) + then have "\ X Y. (R PerpAt X Y A B \ X Y Perp P Q \ Coplanar A B C X \ Coplanar A B C Y)" + by auto + } + then show ?thesis + using \B = B'' \ \X Y. 
R PerpAt X Y A B \ X Y Perp P Q \ Coplanar A B C X \ Coplanar A B C Y\ \B = B'' \ B' Midpoint B B''\ by blast +qed + +lemma l13_1: + assumes "\ Col A B C" and + "P Midpoint B C" and + "Q Midpoint A C" and + "R Midpoint A B" + shows + "\ X Y.(R PerpAt X Y A B \ X Y Perp P Q)" +proof - + obtain X Y where "R PerpAt X Y A B \ X Y Perp P Q \ Coplanar A B C X \ Coplanar A B C Y" + using l13_1_aux assms(1) assms(2) assms(3) assms(4) by blast + then show ?thesis by blast +qed + +lemma per_lt: + assumes "A \ B" and + "C \ B" and + "Per A B C" + shows "A B Lt A C \ C B Lt A C" +proof - + have "B A Lt A C \ B C Lt A C" + using assms(1) assms(2) assms(3) l11_46 by auto + then show ?thesis + using lt_left_comm by blast +qed + +lemma cong_perp_conga: + assumes "Cong A B C B" and + "A C Perp B P" + shows "A B P CongA C B P \ B P TS A C" +proof - + have P1: "A \ C" + using assms(2) perp_distinct by auto + have P2: "B \ P" + using assms(2) perp_distinct by auto + have P3: "A \ B" + by (metis P1 assms(1) cong_diff_3) + have P4: "C \ B" + using P3 assms(1) cong_diff by blast + show ?thesis + proof cases + assume P5: "Col A B C" + have P6: "\ Col B A P" + using P3 P5 assms(2) col_transitivity_1 not_col_permutation_4 not_col_permutation_5 perp_not_col2 by blast + have "Per P B A" + using P3 P5 Perp_perm assms(2) not_col_permutation_5 perp_col1 perp_per_1 by blast + then have P8: "Per A B P" + using Per_cases by blast + have "Per P B C" + using P3 P5 P8 col_per2__per l8_2 l8_5 by blast + then have P10: "Per C B P" + using Per_perm by blast + show ?thesis + proof - + have "A B P CongA C B P" + using P2 P3 P4 P8 P10 l11_16 by auto + moreover have "B P TS A C" + by (metis Col_cases P1 P5 P6 assms(1) bet__ts between_cong not_cong_2143 not_cong_4321 third_point) + ultimately show ?thesis + by simp + qed + next + assume T1: "\ Col A B C" + obtain T where T2: "T PerpAt A C B P" + using assms(2) perp_inter_perp_in by blast + then have T3: "Col A C T \ Col B P T" + using perp_in_col by auto + have T4: "B \ T" + using Col_perm T1 T3 by blast + have T5: "B T Perp A C" + using Perp_cases T3 T4 assms(2) perp_col1 by blast + { + assume T5_1: "A = T" + have "B A Lt B C \ C A Lt B C" + proof - + have "B \ A" + using P3 by auto + moreover have "C \ A" + using P1 by auto + moreover have "Per B A C" + using T5 T5_1 perp_comm perp_per_1 by blast + ultimately show ?thesis + by (simp add: per_lt) + qed + then have "False" + using Cong_perm assms(1) cong__nlt by blast + } + then have T6: "A \ T" by auto + { + assume T6_1: "C = T" + have "B C Lt B A \ A C Lt B A" + proof - + have "B \ C" + using P4 by auto + moreover have "A \ C" + by (simp add: P1) + moreover have "Per B C A" + using T5 T6_1 perp_left_comm perp_per_1 by blast + ultimately show ?thesis + by (simp add: per_lt) + qed + then have "False" + using Cong_perm assms(1) cong__nlt by blast + } + then have T7: "C \ T" by auto + have T8: "T PerpAt B T T A" + by (metis Perp_in_cases T2 T3 T4 T6 perp_in_col_perp_in) + have T9: "T PerpAt B T T C" + by (metis Col_cases T3 T7 T8 perp_in_col_perp_in) + have T10: "Cong T A T C \ T A B CongA T C B \ T B A CongA T B C" + proof - + have "A T B CongA C T B" + proof - + have "Per A T B" + using T2 perp_in_per_1 by auto + moreover have "Per C T B" + using T2 perp_in_per_3 by auto + ultimately show ?thesis + by (simp add: T4 T6 T7 l11_16) + qed + moreover have "Cong A B C B" + by (simp add: assms(1)) + moreover have "Cong T B T B" + by (simp add: cong_reflexivity) + moreover have "T B Le A B" + proof - + have "Per B T A" + using T8 perp_in_per 
by auto + then have "B T Lt B A \ A T Lt B A" + using T4 T6 per_lt by blast + then show ?thesis + using Le_cases Lt_def by blast + qed + ultimately show ?thesis + using l11_52 by blast + qed + show ?thesis + proof - + have T11: "A B P CongA C B P" + proof - + have "P B A CongA P B C" + using Col_cases P2 T10 T3 col_conga__conga by blast + thus ?thesis + using conga_comm by blast + qed + moreover have "B P TS A C" + proof - + have T12: "A = C \ T Midpoint A C" + using T10 T3 l7_20_bis not_col_permutation_5 by blast + { + assume "T Midpoint A C" + then have "B P TS A C" + by (smt Col_perm P2 T1 T3 \A = T \ False\ \C = T \ False\ col2__eq l9_18 midpoint_bet) + } + then show ?thesis + using P1 T12 by auto + qed + ultimately show ?thesis + by simp + qed + qed +qed + +lemma perp_per_bet: + assumes "\ Col A B C" and + (* "Col A P C" and *) + "Per A B C" and + "P PerpAt P B A C" + shows "Bet A P C" +proof - + have "A \ C" + using assms(1) col_trivial_3 by auto + then show ?thesis + using assms(2) assms(3) l11_47 perp_in_left_comm by blast +qed + +lemma ts_per_per_ts: + assumes "A B TS C D" and + "Per B C A" and + "Per B D A" + shows "C D TS A B" +proof - + have P1: "\ Col C A B" + using TS_def assms(1) by blast + have P2: "A \ B" + using P1 col_trivial_2 by auto + obtain P where P3: "Col P A B \ Bet C P D" + using TS_def assms(1) by blast + have P4: "C \ D" + using assms(1) not_two_sides_id by auto + show ?thesis + proof - + { + assume "Col A C D" + then have "C = D" + by (metis assms(1) assms(2) assms(3) col_per2_cases col_permutation_2 not_col_distincts ts_distincts) + then have "False" + using P4 by auto + } + then have "\ Col A C D" by auto + moreover have "\ Col B C D" + using assms(1) assms(2) assms(3) per2_preserves_diff ts_distincts by blast + moreover have "\ T. Col T C D \ Bet A T B" + proof - + have "Col P C D" + using Col_def Col_perm P3 by blast + moreover have "Bet A P B" + proof - + have "\ X. Col A B X \ A B Perp C X" + using Col_perm P1 l8_18_existence by blast + then obtain C' where P5: "Col A B C' \ A B Perp C C'" by blast + have "\ X. 
Col A B X \ A B Perp D X" + by (metis (no_types) Col_perm TS_def assms(1) l8_18_existence) + then obtain D' where P6: "Col A B D' \ A B Perp D D'" by blast + have P7: "A \ C'" + using P5 assms(2) l8_7 perp_not_eq_2 perp_per_1 by blast + have P8: "A \ D'" + using P6 assms(3) l8_7 perp_not_eq_2 perp_per_1 by blast + have P9: "Bet A C' B" + proof - + have "\ Col A C B" + using Col_cases P1 by blast + moreover have "Per A C B" + by (simp add: assms(2) l8_2) + moreover have "C' PerpAt C' C A B" + using P5 Perp_in_perm l8_15_1 by blast + ultimately show ?thesis + using perp_per_bet by blast + qed + have P10: "Bet A D' B" + proof - + have "\ Col A D B" + using P6 col_permutation_5 perp_not_col2 by blast + moreover have "Per A D B" + by (simp add: assms(3) l8_2) + moreover have "D' PerpAt D' D A B" + using P6 Perp_in_perm l8_15_1 by blast + ultimately show ?thesis + using perp_per_bet by blast + qed + show ?thesis + proof cases + assume "P = C'" + then show ?thesis + by (simp add: P9) + next + assume "P \ C'" + show ?thesis + proof cases + assume "P = D'" + then show ?thesis + by (simp add: P10) + next + assume "P \ D'" + show ?thesis + proof cases + assume "A = P" + then show ?thesis + by (simp add: between_trivial2) + next + assume "A \ P" + show ?thesis + proof cases + assume "B = P" + then show ?thesis + using between_trivial by auto + next + assume "B \ P" + have "Bet C' P D'" + proof - + have "Bet C P D" + by (simp add: P3) + moreover have "P \ C'" + by (simp add: \P \ C'\) + moreover have "P \ D'" + by (simp add: \P \ D'\) + moreover have "Col C' P D'" + by (meson P2 P3 P5 P6 col3 col_permutation_2) + moreover have "Per P C' C" + using P3 P5 l8_16_1 l8_2 not_col_permutation_3 not_col_permutation_4 by blast + moreover have "Per P D' D" + by (metis P3 P6 calculation(3) not_col_permutation_2 perp_col2 perp_per_1) + ultimately show ?thesis + using per13_preserves_bet by blast + qed + then show ?thesis + using P10 P9 bet3__bet by blast + qed + qed + qed + qed + qed + ultimately show ?thesis + by auto + qed + ultimately show ?thesis + by (simp add: TS_def) + qed +qed + +lemma l13_2_1: + assumes "A B TS C D" and + "Per B C A" and + "Per B D A" and + "Col C D E" and + "A E Perp C D" and + "C A B CongA D A B" + shows "B A C CongA D A E \ B A D CongA C A E \ Bet C E D" +proof - + have P1: "\ Col C A B" + using TS_def assms(1) by auto + have P2: "A \ C" + using P1 col_trivial_1 by blast + have P3: "A \ B" + using P1 col_trivial_2 by auto + have P4: "A \ D" + using assms(1) ts_distincts by auto + have P5: "Cong B C B D \ Cong A C A D \ C B A CongA D B A" + proof - + have "\ Col B A C" + by (simp add: P1 not_col_permutation_3) + moreover have "A C B CongA A D B" + using assms(1) assms(2) assms(3) l11_16 l8_2 ts_distincts by blast + moreover have "B A C CongA B A D" + by (simp add: assms(6) conga_comm) + moreover have "Cong B A B A" + by (simp add: cong_reflexivity) + ultimately show ?thesis + using l11_50_2 by blast + qed + then have P6: "C D Perp A B" + using assms(1) assms(6) cong_conga_perp not_cong_2143 by blast + then have P7: "C D TS A B" + by (simp add: assms(1) assms(2) assms(3) ts_per_per_ts) + obtain T1 where P8: "Col T1 C D \ Bet A T1 B" + using P7 TS_def by auto + obtain T where P9: "Col T A B \ Bet C T D" + using TS_def assms(1) by blast + have P10: "T1 = T" + by (metis (no_types) Col_def P1 P3 P8 P9 between_equality_2 between_trivial2 l6_16_1) + have P11: "T = E" + proof - + have "\ Col A B C" + using Col_perm P1 by blast + moreover have "C \ D" + using assms(1) ts_distincts by blast + 
moreover have "Col A B T" + using Col_cases P9 by auto + moreover have "Col A B E" + by (metis P7 Perp_cases P6 assms(1) assms(5) col_perp2_ncol_col col_trivial_3 not_col_permutation_3 one_side_not_col123 os_ts1324__os ts_ts_os) + moreover have "Col C D T" + using NCol_cases P9 bet_col by blast + moreover have "Col C D E" + by (simp add: assms(4)) + ultimately show ?thesis + using l6_21 by blast + qed + show ?thesis + proof - + have "B A C CongA D A E" + proof - + have "A Out C C" + using P2 out_trivial by auto + moreover have "A Out B B" + using P3 out_trivial by auto + moreover have "A Out D D" + using P4 out_trivial by auto + moreover have "A Out E B" + by (metis P10 P11 P7 P8 TS_def bet_out) + ultimately show ?thesis + by (meson assms(6) conga_comm conga_right_comm l11_10) + qed + moreover have "B A D CongA C A E" + proof - + have "C A E CongA D A B" + by (meson Perp_cases P5 assms(5) assms(6) calculation cong_perp_conga conga_right_comm conga_trans not_cong_2143 not_conga_sym) + then have "C A E CongA B A D" + by (simp add: conga_right_comm) + then show ?thesis + by (simp add: conga_sym) + qed + moreover have "Bet C E D" + using P11 P9 by auto + ultimately show ?thesis by simp + qed +qed + +lemma triangle_mid_par: + assumes "\ Col A B C" and + "P Midpoint B C" and + "Q Midpoint A C" + shows "A B ParStrict Q P" +proof - + obtain R where P1: "R Midpoint A B" + using midpoint_existence by auto + then obtain X Y where P2: "R PerpAt X Y A B \ X Y Perp P Q \ Coplanar A B C X \ Coplanar A B C Y" + using l13_1_aux assms(1) assms(2) assms(3) by blast + have P3: "Coplanar X Y A P \ Coplanar X Y A Q \ Coplanar X Y B P \ Coplanar X Y B Q" + proof - + have "Coplanar A B C A" + using ncop_distincts by auto + moreover have "Coplanar A B C B" + using ncop_distincts by auto + moreover have "Coplanar A B C P" + using assms(2) coplanar_perm_21 midpoint__coplanar by blast + moreover have "Coplanar A B C Q" + using assms(3) coplanar_perm_11 midpoint__coplanar by blast + ultimately show ?thesis + using P2 assms(1) coplanar_pseudo_trans by blast + qed + have P4: "Col X Y R \ Col A B R" + using P2 perp_in_col by blast + have P5: "R Y Perp A B \ X R Perp A B" + using P2 perp_in_perp_bis by auto + have P6: "Col A R B" + using Col_perm P4 by blast + have P7: "X \ Y" + using P2 perp_not_eq_1 by auto + { + assume P8: "R Y Perp A B" + have "Col Y R X" + using P4 not_col_permutation_2 by blast + then have "Y X Perp A B" + using P2 Perp_cases perp_in_perp by blast + then have P10: "X Y Perp A B" + using Perp_cases by blast + have "A B Par P Q" + proof - + have "Coplanar X Y A P" + by (simp add: P3) + moreover have "Coplanar X Y A Q" + by (simp add: P3) + moreover have "Coplanar X Y B P" + by (simp add: P3) + moreover have "Coplanar X Y B Q" + by (simp add: P3) + moreover have "A B Perp X Y" + using P10 Perp_cases by auto + moreover have "P Q Perp X Y" + using P2 Perp_cases by auto + ultimately show ?thesis + using l12_9 by blast + qed + { + assume "A B ParStrict P Q" + then have "A B ParStrict Q P" + using Par_strict_perm by blast + } + { + assume "A \ B \ P \ Q \ Col A P Q \ Col B P Q" + then have "Col A B P" + using l6_16_1 not_col_permutation_1 by blast + then have "P = B" + by (metis Col_perm assms(1) assms(2) l6_16_1 midpoint_col) + then have "A B ParStrict Q P" + using assms(1) assms(2) col_trivial_2 is_midpoint_id by blast + } + then have "A B ParStrict Q P" + using Par_def \A B Par P Q\ \A B ParStrict P Q \ A B ParStrict Q P\ by auto + } + { + assume P10: "X R Perp A B" + have "Col X R Y" + by (simp add: 
Col_perm P4) + then have P11: "X Y Perp A B" + using P7 P10 perp_col by blast + have "A B Par P Q" + proof - + have "A B Perp X Y" + using P11 Perp_perm by blast + moreover have "P Q Perp X Y" + using P2 Perp_perm by blast + ultimately show ?thesis + using P3 l12_9 by blast + qed + { + assume "A B ParStrict P Q" + then have "A B ParStrict Q P" + by (simp add: par_strict_right_comm) + } + { + assume "A \ B \ P \ Q \ Col A P Q \ Col B P Q" + then have "Col A B P" + using Col_perm l6_16_1 by blast + then have "P = B" + by (metis Col_perm assms(1) assms(2) l6_16_1 midpoint_col) + then have "A B ParStrict Q P" + using assms(1) assms(2) col_trivial_2 is_midpoint_id by blast + } + then have "A B ParStrict Q P" + using Par_def \A B Par P Q\ \A B ParStrict P Q \ A B ParStrict Q P\ by auto + } + then show ?thesis + using P5 \R Y Perp A B \ A B ParStrict Q P\ by blast +qed + +lemma cop4_perp_in2__col: + assumes "Coplanar X Y A A'" and + "Coplanar X Y A B'" and + "Coplanar X Y B A'" and + "Coplanar X Y B B'" and + "P PerpAt A B X Y" and + "P PerpAt A' B' X Y" + shows "Col A B A'" +proof - + have P1: "Col A B P \ Col X Y P" + using assms(5) perp_in_col by auto + show ?thesis + proof cases + assume P2: "A = P" + show ?thesis + proof cases + assume P3: "P = X" + have "Col B A' P" + proof - + have "Coplanar Y B A' P" + using P3 assms(3) ncoplanar_perm_18 by blast + moreover have "Y \ P" + using P3 assms(6) perp_in_distinct by blast + moreover have "Per B P Y" + using assms(5) perp_in_per_4 by auto + moreover have "Per A' P Y" + using assms(6) perp_in_per_2 by auto + ultimately show ?thesis + using cop_per2__col by auto + qed + then show ?thesis + using Col_perm P2 by blast + next + assume P4: "P \ X" + have "Col B A' P" + proof - + have "Coplanar X B A' P" + by (metis P1 assms(3) assms(6) col2_cop__cop col_trivial_3 ncoplanar_perm_9 perp_in_distinct) + moreover have "Per B P X" + using assms(5) perp_in_per_3 by auto + moreover have "Per A' P X" + using assms(6) perp_in_per_1 by auto + ultimately show ?thesis + using cop_per2__col P4 by auto + qed + then show ?thesis + using Col_perm P2 by blast + qed + next + assume P5: "A \ P" + have P6: "Per A P Y" + using assms(5) perp_in_per_2 by auto + show ?thesis + proof cases + assume P7: "P = A'" + have P8: "Per B' P Y" + using assms(6) perp_in_per_4 by auto + have "Col A B' P" + proof - + have "Coplanar Y A B' P" + using assms(2) by (metis P1 assms(6) col_transitivity_2 coplanar_trans_1 ncop__ncols perp_in_distinct) + then show ?thesis using P6 P8 cop_per2__col + by (metis assms(2) assms(5) assms(6) col_permutation_4 coplanar_perm_5 perp_in_distinct perp_in_per_1 perp_in_per_3) + qed + then show ?thesis + using P1 P7 by auto + next + assume T1: "P \ A'" + show ?thesis + proof cases + assume T2: "Y = P" + { + assume R1: "Coplanar X P A A' \ P PerpAt A B X P \ P PerpAt A' B' X P \ A \ P" + then have R2: "Per A P X" + using perp_in_per_1 by auto + have "Per A' P X" + using R1 perp_in_per_1 by auto + then have "Col A B A'" + by (metis R1 R2 PerpAt_def col_permutation_3 col_transitivity_2 cop_per2__col ncoplanar_perm_5) + } + then show ?thesis + using P5 T1 T2 assms(1) assms(2) assms(3) assms(4) assms(5) assms(6) by blast + next + assume P10: "Y \ P" + have "Col A A' P" + proof - + have "Coplanar Y A A' P" + by (metis P1 assms(1) assms(6) col2_cop__cop col_trivial_2 ncoplanar_perm_9 perp_in_distinct) + moreover have "Per A P Y" + by (simp add: P6) + moreover have "Per A' P Y" + using assms(6) perp_in_per_2 by auto + ultimately show ?thesis + using cop_per2__col P10 
by auto + qed + then show ?thesis + using P1 P5 col2__eq col_permutation_4 by blast + qed + qed + qed +qed + +lemma l13_2: + assumes "A B TS C D" and + "Per B C A" and + "Per B D A" and + "Col C D E" and + "A E Perp C D" + shows "B A C CongA D A E \ B A D CongA C A E \ Bet C E D" +proof - + have P2: "\ Col C A B" + using TS_def assms(1) by auto + have P3: "C \ D" + using assms(1) not_two_sides_id by blast + have P4: "\ C'. B A C CongA D A C' \ D A OS C' B" + proof - + have "\ Col B A C" + using Col_cases P2 by auto + moreover have "\ Col D A B" + using TS_def assms(1) by blast + ultimately show ?thesis + by (simp add: angle_construction_1) + qed + then obtain E' where P5: "B A C CongA D A E' \ D A OS E' B" by blast + have P6: "A \ B" + using P2 not_col_distincts by blast + have P7: "A \ C" + using P2 not_col_distincts by blast + have P8: "A \ D" + using P5 os_distincts by blast + have P9: "((A B TS C E' \ A E' TS D B) \ (A B OS C E' \ A E' OS D B \ C A B CongA D A E' \ B A E' CongA E' A B)) \ C A E' CongA D A B" + by (metis P5 P6 conga_diff56 conga_left_comm conga_pseudo_refl l11_22) + have P10: "C D TS A B" + by (simp add: assms(1) assms(2) assms(3) ts_per_per_ts) + have P11: "\ Col A C D" + using P10 TS_def by auto + obtain T where P12: "Col T A B \ Bet C T D" + using TS_def assms(1) by blast + obtain T2 where P13: "Col T2 C D \ Bet A T2 B" + using P10 TS_def by auto + then have P14: "T = T2" + by (metis Col_def Col_perm P12 P2 P3 P6 l6_16_1) + have P15: "B InAngle D A C" + using P10 assms(1) l11_24 ts2__inangle by blast + have P16: "C A B LeA C A D" + by (simp add: P10 assms(1) inangle__lea ts2__inangle) + have P17: "E' InAngle D A C" + proof - + have "D A E' LeA D A C" + using P16 P5 P7 P8 conga_left_comm conga_pseudo_refl l11_30 by presburger + moreover have "D A OS C E'" + by (meson P11 P15 P5 col124__nos in_angle_one_side invert_one_side not_col_permutation_2 one_side_symmetry one_side_transitivity) + ultimately show ?thesis + by (simp add: lea_in_angle) + qed + obtain E'' where P18: "Bet D E'' C \ (E'' = A \ A Out E'' E')" + using InAngle_def P17 by auto + { + assume "E'' = A" + then have "B A C CongA D A E \ B A D CongA C A E \ Bet C E D" + using Col_def P11 P18 by auto + } + { + assume P19: "A Out E'' E'" + then have P20: "B A C CongA D A E''" + by (meson OS_def P5 Tarski_neutral_dimensionless.out2__conga Tarski_neutral_dimensionless_axioms col_one_side_out col_trivial_2 l9_18_R1 not_conga one_side_reflexivity) + have P21: "A \ T" + using P11 P13 P14 by auto + have "B A C CongA D A E \ B A D CongA C A E \ Bet C E D" + proof cases + assume P22: "E'' = T" + have P23: "C A B CongA D A B" + proof - + have "C A B CongA D A T" + using P22 P20 conga_left_comm by blast + moreover have "A Out C C" + using P7 out_trivial by presburger + moreover have "A Out B B" + using P6 out_trivial by auto + moreover have "A Out D D" + using P8 out_trivial by auto + moreover have "A Out B T" + using Out_def P13 P14 P6 P21 by blast + ultimately show ?thesis + using l11_10 by blast + qed + then show ?thesis + using assms(1) assms(2) assms(3) assms(4) assms(5) l13_2_1 by blast + next + assume P23A: "E'' \ T" + have P24: "D \ E''" + using P2 P20 col_trivial_3 ncol_conga_ncol not_col_permutation_3 by blast + { + assume P24A: "C = E''" + have P24B: "C A OS B D" + by (meson P10 assms(1) invert_one_side ts_ts_os) + have P24C: "A Out B D" + proof - + have "C A B CongA C A D" + using P20 P24A conga_comm by blast + moreover have "C A OS B D" + by (simp add: P24B) + ultimately show ?thesis + using conga_os__out by 
blast + qed + then have "False" + using Col_def P5 one_side_not_col124 out_col by blast + } + then have P25: "C \ E''" by auto + have P26: "A \ E''" + using P19 out_diff1 by auto + { + assume "Col E'' A B" + then have "E'' = T" + by (smt P13 P14 P18 P2 P3 bet_col l6_21 not_col_permutation_2 not_col_permutation_3) + then have "False" + using P23A by auto + } + then have P27: "\ Col E'' A B" by auto + have "(A B TS C E'' \ A E'' TS D B) \ (A B OS C E'' \ A E'' OS D B \ C A B CongA D A E'' \ B A E'' CongA E'' A B)" + proof cases + assume P27_0: "A B OS C E''" + have "A E'' OS D B" + proof - + have P27_1: "A E'' TS D C" + by (metis Col_def P10 P18 P24 TS_def P25 bet__ts invert_two_sides l6_16_1) + moreover have "A E'' TS B C" + proof - + have "A E'' TS T C" + proof - + have "\ Col T A E''" + by (metis NCol_cases P13 P14 P21 P27 bet_col col3 col_trivial_2) + moreover have "\ Col C A E''" + using P27_1 TS_def by auto + moreover have "\ T0. (Col T0 A E'' \ Bet T T0 C)" + by (meson P12 P18 P27_0 between_symmetry col_trivial_3 l5_3 one_side_chara) + ultimately show ?thesis + by (simp add: TS_def) + qed + moreover have "A Out T B" + using Out_def P13 P14 P21 P6 by auto + ultimately show ?thesis + using col_trivial_1 l9_5 by blast + qed + ultimately show ?thesis + using OS_def by auto + qed + thus ?thesis + using P20 P27_0 conga_distinct conga_left_comm conga_pseudo_refl by blast + next + assume P27_2: "\ A B OS C E''" + show ?thesis + proof - + have P27_3: "A B TS C E''" + using P18 P2 P27_2 P27 assms(1) bet_cop__cop between_symmetry cop_nos__ts ts__coplanar by blast + moreover have "A E'' TS D B" + proof - + have P27_3: "A B OS D E''" + using P18 bet_ts__os between_symmetry calculation one_side_symmetry by blast + have P27_4: "A E'' TS T D" + proof - + have "\ Col T A E''" + by (metis NCol_cases P13 P14 P21 P27 bet_col col3 col_trivial_2) + moreover have "\ Col D A E''" + by (smt Col_def P11 P18 P24 P27_3 bet3__bet bet_col1 col3 col_permutation_5 col_two_sides_bet l5_1) + moreover have "\ T0. 
(Col T0 A E'' \ Bet T T0 D)" + by (meson Bet_perm P12 P18 P27_3 bet_col1 bet_out__bet between_exchange3 col_trivial_3 not_bet_out one_side_chara) + ultimately show ?thesis + by (simp add: TS_def) + qed + have "A E'' TS B D" + proof - + have "A E'' TS T D" + using P27_4 by simp + moreover have "Col A A E''" + using col_trivial_1 by auto + moreover have "A Out T B" + using P13 P14 P21 bet_out by auto + ultimately show ?thesis + using l9_5 by blast + qed + thus ?thesis + by (simp add: l9_2) + qed + ultimately show ?thesis + by simp + qed + qed + then have P28: "C A E'' CongA D A B" using l11_22 + by (metis P20 P26 P6 conga_left_comm conga_pseudo_refl) + obtain C' where P29: "Bet B C C' \ Cong C C' B C" + using segment_construction by blast + obtain D' where P30: "Bet B D D' \ Cong D D' B D" + using segment_construction by blast + have P31: "B A D Cong3 D' A D" + proof - + have "Per A D B" + by (simp add: assms(3) l8_2) + then obtain D'' where P31_2: "D Midpoint B D'' \ Cong A B A D''" + using Per_def by auto + have "D Midpoint B D'" + using Cong_perm Midpoint_def P30 by blast + then have "D' = D''" + using P31_2 symmetric_point_uniqueness by auto + thus ?thesis + using Cong3_def Cong_perm P30 P31_2 cong_reflexivity by blast + qed + then have P32: "B A D CongA D' A D" + using P6 P8 cong3_conga by auto + have "B A C Cong3 C' A C" + proof - + obtain C'' where P33_1: "C Midpoint B C'' \ Cong A B A C''" + using Per_def assms(2) l8_2 by blast + have "C Midpoint B C'" + using Cong_perm Midpoint_def P29 by blast + then have "C' = C''" + using P33_1 symmetric_point_uniqueness by auto + thus ?thesis + using Cong3_def Cong_perm P29 P33_1 cong_reflexivity by blast + qed + then have P34: "B A C CongA C' A C" + using P6 P7 cong3_conga by auto + have P35: "E'' A C' CongA D' A E''" + proof - + have "(A C TS E'' C' \ A D TS D' E'') \ (A C OS E'' C' \ A D OS D' E'')" + proof - + have P35_1: "C A OS D E''" + by (metis Col_perm P11 P18 P25 bet_out between_symmetry one_side_symmetry out_one_side) + have P35_2: "C A OS B D" + using P10 assms(1) one_side_symmetry ts_ts_os by blast + have P35_3: "C A TS B C'" + by (metis P2 P29 bet__ts cong_diff_4 not_col_distincts) + have P35_4: "C A OS B E''" + using P35_1 P35_2 one_side_transitivity by blast + have P35_5: "D A OS C E''" + by (metis Col_perm P18 P24 P35_1 bet2__out l5_1 one_side_not_col123 out_one_side) + have P35_6: "D A OS B C" + by (simp add: P10 assms(1) invert_two_sides l9_2 one_side_symmetry ts_ts_os) + have P35_7: "D A TS B D'" + by (metis P30 TS_def assms(1) bet__ts cong_diff_3 ts_distincts) + have P35_8: "D A OS B E''" + using P35_5 P35_6 one_side_transitivity by blast + have P35_9: "A C TS E'' C'" + using P35_3 P35_4 invert_two_sides l9_8_2 by blast + have "A D TS D' E''" + using P35_7 P35_8 invert_two_sides l9_2 l9_8_2 by blast + thus ?thesis + using P35_9 by simp + qed + moreover have "E'' A C CongA D' A D" + proof - + have "E'' A C CongA B A D" + by (simp add: P28 conga_comm) + moreover have "B A D CongA D' A D" + by (simp add: P32) + ultimately show ?thesis + using conga_trans by blast + qed + moreover have "C A C' CongA D A E''" + proof - + have "D A E'' CongA C A C'" + proof - + have "D A E'' CongA B A C" + by (simp add: P20 conga_sym) + moreover have "B A C CongA C A C'" + by (simp add: P34 conga_right_comm) + ultimately show ?thesis + using conga_trans by blast + qed + thus ?thesis + using not_conga_sym by blast + qed + ultimately show ?thesis + using l11_22 by auto + qed + have P36: "D' \ B" + using P30 assms(1) bet_neq32__neq ts_distincts by 
blast + have P37: "C' \ B" + using P29 assms(1) bet_neq32__neq ts_distincts by blast + then have P38: "\ Col C' D' B" + by (metis Col_def P10 P29 P30 P36 TS_def col_transitivity_2) + have P39: "C' D' ParStrict C D" + proof - + have "\ Col C' D' B" + by (simp add: P38) + moreover have "D Midpoint D' B" + using P30 l7_2 midpoint_def not_cong_3412 by blast + moreover have "C Midpoint C' B" + using P29 l7_2 midpoint_def not_cong_3412 by blast + ultimately show ?thesis + using triangle_mid_par by auto + qed + have P40: "A E'' TS C D" + by (metis Bet_perm Col_def P10 P18 P24 TS_def \C = E'' \ False\ bet__ts col_transitivity_2 invert_two_sides) + have P41: "B A TS C D" + by (simp add: assms(1) invert_two_sides) + have P42: "A B OS C C'" + proof - + have "\ Col A B C" + by (simp add: P2 not_col_permutation_1) + moreover have "Col A B B" + by (simp add: col_trivial_2) + moreover have "B Out C C'" + by (metis P29 P37 bet_out cong_identity) + ultimately show ?thesis + using out_one_side_1 by blast + qed + have P43: "A B OS D D'" using out_one_side_1 + by (metis Col_perm P30 TS_def assms(1) bet_out col_trivial_1) + then have P44: "A B OS D D'" using invert_two_sides by blast + have P45: "A B TS C' D" + using P42 assms(1) l9_8_2 by blast + then have P46: "A B TS C' D'" + using P44 l9_2 l9_8_2 by blast + have P47: "C' D' Perp A E''" + proof - + have "A E'' TS C' D'" + proof - + have "A Out C' D' \ E'' A TS C' D'" + proof - + have "E'' A C' CongA E'' A D'" + by (simp add: P35 conga_right_comm) + moreover have "Coplanar E'' A C' D'" + proof - + have f1: "B A OS C C'" + by (metis P42 invert_one_side) + have f2: "Coplanar B A C' C" + by (meson P42 ncoplanar_perm_7 os__coplanar) + have f3: "Coplanar D' A C' D" + by (meson P44 P46 col124__nos coplanar_trans_1 invert_one_side ncoplanar_perm_7 os__coplanar ts__coplanar) + have "Coplanar D' A C' C" + using f2 f1 by (meson P46 col124__nos coplanar_trans_1 ncoplanar_perm_6 ncoplanar_perm_8 ts__coplanar) + then show ?thesis + using f3 by (meson P18 bet_cop2__cop ncoplanar_perm_6 ncoplanar_perm_7 ncoplanar_perm_8) + qed + ultimately show ?thesis using conga_cop__or_out_ts + by simp + qed + then show ?thesis + using P46 col_two_sides_bet invert_two_sides not_bet_and_out out_col by blast + qed + moreover have "Cong C' A D' A" + using Cong3_def P31 \B A C Cong3 C' A C\ cong_inner_transitivity by blast + moreover have "C' A E'' CongA D' A E''" + by (simp add: P35 conga_left_comm) + ultimately show ?thesis + by (simp add: cong_conga_perp) + qed + have T1: "Cong A C' A D'" + proof - + have "Cong A C' A B" + using Cong3_def Cong_perm \B A C Cong3 C' A C\ by blast + moreover have "Cong A D' A B" + using Cong3_def P31 not_cong_4321 by blast + ultimately show ?thesis + using Cong_perm \Cong A C' A B\ \Cong A D' A B\ cong_inner_transitivity by blast + qed + obtain R where T2: "R Midpoint C' D'" + using midpoint_existence by auto + have "\ X Y. 
(R PerpAt X Y C' D' \ X Y Perp D C \ Coplanar C' D' B X \ Coplanar C' D' B Y)" + proof - + have "\ Col C' D' B" + by (simp add: P38) + moreover have "D Midpoint D' B" + using P30 l7_2 midpoint_def not_cong_3412 by blast + moreover have "C Midpoint C' B" + using Cong_perm Mid_perm Midpoint_def P29 by blast + moreover have "R Midpoint C' D'" + by (simp add: T2) + ultimately show ?thesis using l13_1_aux by blast + qed + then obtain X Y where T3: "R PerpAt X Y C' D' \ X Y Perp D C \ Coplanar C' D' B X \ Coplanar C' D' B Y" + by blast + then have "X \ Y" + using perp_not_eq_1 by blast + have "C D Perp A E''" + proof cases + assume "A = R" + then have W1: "A PerpAt C' D' A E''" + using Col_def P47 T2 between_trivial2 l8_14_2_1b_bis midpoint_col by blast + have "Coplanar B C' D' E''" + proof - + have "\ Col B C D" + using P10 TS_def by auto + moreover have "Coplanar B C D B" + using ncop_distincts by auto + moreover have "Coplanar B C D C'" + using P29 bet_col ncop__ncols by blast + moreover have "Coplanar B C D D'" + using P30 bet_col ncop__ncols by blast + moreover have "Coplanar B C D E''" + by (simp add: P18 bet__coplanar coplanar_perm_22) + ultimately show ?thesis + using coplanar_pseudo_trans by blast + qed + have "Coplanar C' D' X E''" + proof - + have "\ Col B C' D'" + by (simp add: P38 not_col_permutation_2) + moreover have "Coplanar B C' D' X" + using T3 ncoplanar_perm_8 by blast + moreover have "Coplanar B C' D' E''" + by (simp add: \Coplanar B C' D' E''\) + ultimately show ?thesis + using coplanar_trans_1 by blast + qed + have "Coplanar C' D' Y E''" + proof - + have "\ Col B C' D'" + by (simp add: P38 not_col_permutation_2) + moreover have "Coplanar B C' D' Y" + by (simp add: T3 coplanar_perm_12) + moreover have "Coplanar B C' D' E''" + by (simp add: \Coplanar B C' D' E''\) + ultimately show ?thesis + using coplanar_trans_1 by blast + qed + have "Coplanar C' D' X A" + proof - + have "Col C' D' A" + using T2 \A = R\ midpoint_col not_col_permutation_2 by blast + moreover have "Col X A A" + by (simp add: col_trivial_2) + ultimately show ?thesis + using ncop__ncols by blast + qed + have "Coplanar C' D' Y A" + proof - + have "Col C' D' A" + using T2 \A = R\ midpoint_col not_col_permutation_2 by blast + moreover have "Col Y A A" + by (simp add: col_trivial_2) + ultimately show ?thesis + using ncop__ncols by blast + qed + have "Col X Y A" + proof - + have "Coplanar C' D' X A" + by (simp add: \Coplanar C' D' X A\) + moreover have "Coplanar C' D' X E''" + by (simp add: \Coplanar C' D' X E''\) + moreover have "Coplanar C' D' Y A" + by (simp add: \Coplanar C' D' Y A\) + moreover have "Coplanar C' D' Y E''" + by (simp add: \Coplanar C' D' Y E''\) + moreover have "A PerpAt X Y C' D'" + using T3 \A = R\ Perp_in_cases by auto + moreover have "A PerpAt A E'' C' D'" + using Perp_in_cases \A PerpAt C' D' A E''\ by blast + ultimately show ?thesis + using cop4_perp_in2__col by blast + qed + have "Col X Y E''" + proof - + have "Coplanar C' D' X E''" + using \Coplanar C' D' X E''\ by auto + moreover have "Coplanar C' D' X A" + by (simp add: \Coplanar C' D' X A\) + moreover have "Coplanar C' D' Y E''" + by (simp add: \Coplanar C' D' Y E''\) + moreover have "Coplanar C' D' Y A" + using \Coplanar C' D' Y A\ by auto + moreover have "A PerpAt X Y C' D'" + using T3 \A = R\ Perp_in_cases by auto + moreover have "A PerpAt E'' A C' D'" + using Perp_in_perm W1 by blast + ultimately show ?thesis + using cop4_perp_in2__col by blast + qed + have "A E'' Perp C D" + proof cases + assume "Y = A" + show ?thesis + proof - + 
have "A \ E''" + by (simp add: P26) + moreover have "A X Perp C D" + using T3 Perp_cases \Y = A\ by blast + moreover have "Col A X E''" + using Col_perm \Col X Y E''\ \Y = A\ by blast + ultimately show ?thesis + using perp_col by blast + qed + next + assume "Y \ A" + show ?thesis + proof - + have "A \ E''" + by (simp add: P26) + moreover have "A Y Perp C D" + proof - + have "Y X Perp C D" + using T3 by (simp add: perp_comm) + then have "Y A Perp C D" + using \Col X Y A\ \Y \ A\ col_trivial_2 perp_col2 perp_left_comm by blast + then show ?thesis + using Perp_cases by blast + qed + moreover have "Col A Y E''" + using Col_perm \Col X Y A\ \Col X Y E''\ \X \ Y\ col_transitivity_2 by blast + ultimately show ?thesis + using perp_col by blast + qed + qed + thus ?thesis + using Perp_perm by blast + next + assume "A \ R" + have "R \ C'" + using P46 T2 is_midpoint_id ts_distincts by blast + have "Per A R C'" using T1 T2 Per_def by blast + then have "R PerpAt A R R C'" + by (simp add: \A \ R\ \R \ C'\ per_perp_in) + then have "R PerpAt R C' A R" + using Perp_in_perm by blast + then have "R C' Perp A R \ R R Perp A R" + using perp_in_perp by auto + { + assume "R C' Perp A R" + then have "C' R Perp A R" + by (simp add: \R C' Perp A R\ Perp_perm) + have "C' D' Perp R A" + by (metis P47 T2 \A \ R\ \Per A R C'\ \R \ C'\ col_per_perp midpoint_col perp_distinct perp_right_comm) + then have "R PerpAt C' D' R A" + using T2 l8_14_2_1b_bis midpoint_col not_col_distincts by blast + have "Col B D D'" + by (simp add: Col_def P30) + have "Col B C C'" + using Col_def P29 by auto + have "Col D E'' C" + using P18 bet_col by auto + have "Col R C' D'" + using \R PerpAt C' D' R A\ by (simp add: T2 midpoint_col) + have "Col A E'' E'" + by (simp add: P19 out_col) + have "Coplanar C' D' X A" + proof - + have "\ Col B C' D'" + using Col_perm P38 by blast + moreover have "Coplanar B C' D' X" + using T3 ncoplanar_perm_8 by blast + moreover have "Coplanar B C' D' A" + using P46 ncoplanar_perm_18 ts__coplanar by blast + ultimately show ?thesis + using coplanar_trans_1 by auto + qed + have "Coplanar C' D' Y A" + proof - + have "\ Col B C' D'" + using Col_perm P38 by blast + moreover have "Coplanar B C' D' Y" + using T3 ncoplanar_perm_8 by blast + moreover have "Coplanar B C' D' A" + using P46 ncoplanar_perm_18 ts__coplanar by blast + ultimately show ?thesis + using coplanar_trans_1 by auto + qed + have "Coplanar C' D' X R" + proof - + have "Col C' D' R" + using Col_perm \Col R C' D'\ by blast + moreover have "Col X R R" + by (simp add: col_trivial_2) + ultimately show ?thesis + using ncop__ncols by blast + qed + have "Coplanar C' D' Y R" + using Col_perm T2 midpoint_col ncop__ncols by blast + have "Col X Y A" + proof - + have "R PerpAt X Y C' D'" + using T3 by simp + moreover have "R PerpAt A R C' D'" + using Perp_in_perm \R PerpAt C' D' R A\ by blast + ultimately show ?thesis + using \Coplanar C' D' Y R\ \Coplanar C' D' X R\ cop4_perp_in2__col \Coplanar C' D' X A\ \Coplanar C' D' Y A\ by blast + qed + have Z1: "Col X Y R" + using T3 perp_in_col by blast + have "Col A E'' R" + proof - + have "Coplanar C' D' E'' R" + using Col_cases \Col R C' D'\ ncop__ncols by blast + moreover have "A E'' Perp C' D'" + using P47 Perp_perm by blast + moreover have "A R Perp C' D'" + using Perp_perm \C' D' Perp R A\ by blast + ultimately show ?thesis + using cop_perp2__col by blast + qed + then have "Col X Y E''" using Z1 + by (metis (full_types) \A \ R\ \Col X Y A\ col_permutation_4 col_trivial_2 l6_21) + have "Col A E'' R" + proof - + have 
"Coplanar C' D' E'' R" + using Col_cases \Col R C' D'\ ncop__ncols by blast + moreover have "A E'' Perp C' D'" + using P47 Perp_perm by blast + moreover have "A R Perp C' D'" + using Perp_perm \C' D' Perp R A\ by blast + ultimately show ?thesis + using cop_perp2__col by blast + qed + have "Col A R X" + using \Col X Y A\ \Col X Y R\ \X \ Y\ col_transitivity_1 not_col_permutation_3 by blast + have "Col A R Y" + using \Col X Y A\ \Col X Y R\ \X \ Y\ col_transitivity_2 not_col_permutation_3 by blast + have "A E'' Perp C D" + proof cases + assume "X = A" + show ?thesis + proof - + have "A \ E''" + by (simp add: P26) + moreover have "A Y Perp C D" + using T3 \X = A\ perp_right_comm by blast + moreover have "Col A Y E''" + using Col_perm \A \ R\ \Col A E'' R\ \Col A R Y\ col_transitivity_1 by blast + ultimately show ?thesis + using perp_col by auto + qed + next + assume "X \ A" + show ?thesis + proof - + have "A X Perp C D" + by (smt P3 T3 \Col X Y A\ \X \ A\ col_trivial_2 col_trivial_3 perp_col4) + moreover have "Col A X E''" + using Col_perm \A \ R\ \Col A E'' R\ \Col A R X\ col_transitivity_1 by blast + ultimately show ?thesis + using P26 perp_col by blast + qed + qed + } + { + assume "R R Perp A R" + then have "A E'' Perp C D" + using perp_distinct by blast + } + then have "A E'' Perp C D" + using Perp_cases \R C' Perp A R \ A E'' Perp C D\ \R C' Perp A R \ R R Perp A R\ by auto + then show ?thesis + using Perp_perm by blast + qed + show ?thesis + proof - + have "Col A E E''" + proof - + have "Coplanar C D E E'" + using assms(4) col__coplanar by auto + moreover have "A E Perp C D" + using assms(5) by auto + moreover have "A E'' Perp C D" + using Perp_perm \C D Perp A E''\ by blast + ultimately show ?thesis + by (meson P11 col_perp2_ncol_col col_trivial_3 not_col_permutation_2) + qed + moreover have "E'' = E" + proof - + have f1: "C = E'' \ Col C E'' D" + by (metis P18 bet_out_1 out_col) + then have f2: "C = E'' \ Col C E'' E" + using Col_perm P3 assms(4) col_transitivity_1 by blast + have "\p. (C = E'' \ Col C p D) \ \ Col C E'' p" + using f1 by (meson col_transitivity_1) + then have "\p. 
\ Col E'' p A \ Col E'' E p" + using f2 by (metis (no_types) Col_perm P11 assms(4)) + then show ?thesis + using Col_perm calculation col_transitivity_1 by blast + qed + ultimately show ?thesis + by (metis Bet_perm P18 P20 P28 Tarski_neutral_dimensionless.conga_left_comm Tarski_neutral_dimensionless_axioms not_conga_sym) + qed + qed + then have "B A C CongA D A E \ B A D CongA C A E \ Bet C E D" + by blast + } + thus ?thesis + using P18 \E'' = A \ B A C CongA D A E \ B A D CongA C A E \ Bet C E D\ by blast +qed + +lemma perp2_refl: + assumes "A \ B" + shows "P Perp2 A B A B" +proof cases + assume "Col A B P" + obtain X where "\ Col A B X" + using assms not_col_exists by blast + then obtain Q where "A B Perp Q P \ A B OS X Q" + using \Col A B P\ l10_15 by blast + thus ?thesis + using Perp2_def Perp_cases col_trivial_3 by blast +next + assume "\ Col A B P" + then obtain Q where "Col A B Q \ A B Perp P Q" + using l8_18_existence by blast + thus ?thesis + using Perp2_def Perp_cases col_trivial_3 by blast +qed + +lemma perp2_sym: + assumes "P Perp2 A B C D" + shows "P Perp2 C D A B" +proof - + obtain X Y where "Col P X Y \ X Y Perp A B \ X Y Perp C D" + using Perp2_def assms by auto + thus ?thesis + using Perp2_def by blast +qed + +lemma perp2_left_comm: + assumes "P Perp2 A B C D" + shows "P Perp2 B A C D" +proof - + obtain X Y where "Col P X Y \ X Y Perp A B \ X Y Perp C D" + using Perp2_def assms by auto + thus ?thesis + using Perp2_def perp_right_comm by blast +qed + +lemma perp2_right_comm: + assumes "P Perp2 A B C D" + shows "P Perp2 A B D C" +proof - + obtain X Y where "Col P X Y \ X Y Perp A B \ X Y Perp C D" + using Perp2_def assms by auto + thus ?thesis + using Perp2_def perp_right_comm by blast +qed + +lemma perp2_comm: + assumes "P Perp2 A B C D" + shows "P Perp2 B A D C" +proof - + obtain X Y where "Col P X Y \ X Y Perp A B \ X Y Perp C D" + using Perp2_def assms by auto + thus ?thesis + using assms perp2_left_comm perp2_right_comm by blast +qed + +lemma perp2_pseudo_trans: + assumes "P Perp2 A B C D" and + "P Perp2 C D E F" and + "\ Col C D P" + shows "P Perp2 A B E F" +proof - + obtain X Y where P1: "Col P X Y \ X Y Perp A B \ X Y Perp C D" + using Perp2_def assms(1) by auto + obtain X' Y' where P2: "Col P X' Y' \ X' Y' Perp C D \ X' Y' Perp E F" + using Perp2_def assms(2) by auto + have "X Y Par X' Y'" + proof - + have "Coplanar P C D X" + proof cases + assume "X = P" + thus ?thesis + using ncop_distincts by blast + next + assume "X \ P" + then have "X P Perp C D" + using Col_cases P1 perp_col by blast + then have "Coplanar X P C D" + by (simp add: perp__coplanar) + thus ?thesis + using ncoplanar_perm_18 by blast + qed + have "Coplanar P C D Y" + proof cases + assume "Y = P" + thus ?thesis + using ncop_distincts by blast + next + assume "Y \ P" + then have "Y P Perp C D" + by (metis (full_types) Col_cases P1 Perp_cases col_transitivity_2 perp_col2) + then have "Coplanar Y P C D" + by (simp add: perp__coplanar) + thus ?thesis + using ncoplanar_perm_18 by blast + qed + have "Coplanar P C D X'" + proof cases + assume "X' = P" + thus ?thesis + using ncop_distincts by blast + next + assume "X' \ P" + then have "X' P Perp C D" + using Col_cases P2 perp_col by blast + then have "Coplanar X' P C D" + by (simp add: perp__coplanar) + thus ?thesis + using ncoplanar_perm_18 by blast + qed + have "Coplanar P C D Y'" + proof cases + assume "Y' = P" + thus ?thesis + using ncop_distincts by blast + next + assume "Y' \ P" + then have "Y' P Perp C D" + by (metis (full_types) Col_cases P2 
Perp_cases col_transitivity_2 perp_col2) + then have "Coplanar Y' P C D" + by (simp add: perp__coplanar) + thus ?thesis + using ncoplanar_perm_18 by blast + qed + show ?thesis + proof - + have "Coplanar C D X X'" + using Col_cases \Coplanar P C D X'\ \Coplanar P C D X\ assms(3) coplanar_trans_1 by blast + moreover have "Coplanar C D X Y'" + using Col_cases \Coplanar P C D X\ \Coplanar P C D Y'\ assms(3) coplanar_trans_1 by blast + moreover have "Coplanar C D Y X'" + using Col_cases \Coplanar P C D X'\ \Coplanar P C D Y\ assms(3) coplanar_trans_1 by blast + moreover have "Coplanar C D Y Y'" + using Col_cases \Coplanar P C D Y'\ \Coplanar P C D Y\ assms(3) coplanar_trans_1 by blast + ultimately show ?thesis + using l12_9 P1 P2 by blast + qed + qed + thus ?thesis + proof - + { + assume "X Y ParStrict X' Y'" + then have "Col X X' Y'" + using P1 P2 \X Y ParStrict X' Y'\ par_not_col by blast + } + then have "Col X X' Y'" + using Par_def \X Y Par X' Y'\ by blast + moreover have "Col Y X' Y'" + proof - + { + assume "X Y ParStrict X' Y'" + then have "Col Y X' Y'" + using P1 P2 \X Y ParStrict X' Y'\ par_not_col by blast + } + thus ?thesis + using Par_def \X Y Par X' Y'\ by blast + qed + moreover have "X \ Y" + using P1 perp_not_eq_1 by auto + ultimately show ?thesis + by (meson Perp2_def P1 P2 col_permutation_1 perp_col2) + qed +qed + +lemma col_cop_perp2__pars_bis: + assumes "\ Col A B P" and + "Col C D P" and + "Coplanar A B C D" and + "P Perp2 A B C D" + shows "A B ParStrict C D" +proof - + obtain X Y where P1: "Col P X Y \ X Y Perp A B \ X Y Perp C D" + using Perp2_def assms(4) by auto + then have "Col X Y P" + using Col_perm by blast + obtain Q where "X \ Q \ Y \ Q \ P \ Q \ Col X Y Q" + using \Col X Y P\ diff_col_ex3 by blast + thus ?thesis + by (smt P1 Perp_perm assms(1) assms(2) assms(3) col_cop_perp2_pars col_permutation_1 col_transitivity_2 not_col_distincts perp_col4 perp_distinct) +qed + +lemma perp2_preserves_bet23: + assumes "Bet PO A B" and + "Col PO A' B'" and + "\ Col PO A A'" and + "PO Perp2 A A' B B'" + shows "Bet PO A' B'" +proof - + have "A \ A'" + using assms(3) not_col_distincts by auto + show ?thesis + proof cases + assume "A' = B'" + thus ?thesis + using between_trivial by auto + next + assume "A' \ B'" + { + assume "A = B" + then obtain X Y where P1: "Col PO X Y \ X Y Perp A A' \ X Y Perp A B'" + using Perp2_def assms(4) by blast + have "Col A A' B'" + proof - + have "Coplanar X Y A' B'" + using Col_cases Coplanar_def P1 assms(2) by auto + moreover have "A A' Perp X Y" + using P1 Perp_perm by blast + moreover have "A B' Perp X Y" + using P1 Perp_perm by blast + ultimately show ?thesis + using cop_perp2__col by blast + qed + then have "False" + using Col_perm \A' \ B'\ assms(2) assms(3) l6_16_1 by blast + } + then have "A \ B" by auto + have "A A' Par B B'" + proof - + obtain X Y where P2: "Col PO X Y \ X Y Perp A A' \ X Y Perp B B'" + using Perp2_def assms(4) by auto + then have "Coplanar X Y A B" + using Coplanar_def assms(1) bet_col not_col_permutation_2 by blast + show ?thesis + proof - + have "Coplanar X Y A B'" + by (metis (full_types) Col_cases P2 assms(2) assms(3) col_cop2__cop col_trivial_3 ncop__ncols perp__coplanar) + moreover have "Coplanar X Y A' B" + proof cases + assume "Col A X Y" + then have "Col Y X A" + by (metis (no_types) Col_cases) + then show ?thesis + by (metis Col_cases P2 assms(1) assms(3) bet_col colx ncop__ncols not_col_distincts) + next + assume "\ Col A X Y" + moreover have "Coplanar A X Y A'" + using Coplanar_def P2 perp_inter_exists by blast + 
moreover have "Coplanar A X Y B" + using \Coplanar X Y A B\ ncoplanar_perm_8 by blast + ultimately show ?thesis + using coplanar_trans_1 by auto + qed + moreover have "Coplanar X Y A' B'" + using Col_cases Coplanar_def P2 assms(2) by auto + moreover have "A A' Perp X Y" + using P2 Perp_perm by blast + moreover have "B B' Perp X Y" + using P2 Perp_perm by blast + ultimately show ?thesis + using \Coplanar X Y A B\ l12_9 by auto + qed + qed + { + assume "A A' ParStrict B B'" + then have "A A' OS B B'" + by (simp add: l12_6) + have "A A' TS PO B" + using Col_cases \A \ B\ assms(1) assms(3) bet__ts by blast + then have "A A' TS B' PO" + using \A A' OS B B'\ l9_2 l9_8_2 by blast + then have "Bet PO A' B'" + using Col_cases assms(2) between_symmetry col_two_sides_bet invert_two_sides by blast + } + thus ?thesis + by (metis Col_cases Par_def \A A' Par B B'\ \A \ B\ assms(1) assms(3) bet_col col_trivial_3 l6_21) + qed +qed + +lemma perp2_preserves_bet13: + assumes "Bet B PO C" and + "Col PO B' C'" and + "\ Col PO B B'" and + "PO Perp2 B C' C B'" + shows "Bet B' PO C'" +proof cases + assume "C' = PO" + thus ?thesis + using not_bet_distincts by blast +next + assume "C' \ PO" + show ?thesis + proof cases + assume "B' = PO" + thus ?thesis + using between_trivial2 by auto + next + assume "B' \ PO" + have "B \ PO" + using assms(3) col_trivial_1 by auto + have "Col B PO C" + by (simp add: Col_def assms(1)) + show ?thesis + proof cases + assume "B = C" + thus ?thesis + using \B = C\ \B \ PO\ assms(1) between_identity by blast + next + assume "B \ C" + have "B C' Par C B'" + proof - + obtain X Y where P1: "Col PO X Y \ X Y Perp B C' \ X Y Perp C B'" + using Perp2_def assms(4) by auto + have "Coplanar X Y B C" + by (meson P1 \Col B PO C\ assms(1) l9_18_R2 ncop__ncols not_col_permutation_2 not_col_permutation_5 ts__coplanar) + have "Coplanar X Y C' B'" + using Col_cases Coplanar_def P1 assms(2) by auto + show ?thesis + proof - + have "Coplanar X Y B C" + by (simp add: \Coplanar X Y B C\) + moreover have "Coplanar X Y B B'" + by (metis P1 \C' \ PO\ assms(1) assms(2) bet_cop__cop calculation col_cop2__cop not_col_permutation_5 perp__coplanar) + moreover have "Coplanar X Y C' C" + by (smt P1 \B \ PO\ \Col B PO C\ \Coplanar X Y C' B'\ assms(2) col2_cop__cop col_cop2__cop col_permutation_1 col_transitivity_2 coplanar_perm_1 perp__coplanar) + moreover have "Coplanar X Y C' B'" + by (simp add: \Coplanar X Y C' B'\) + moreover have "B C' Perp X Y" + using P1 Perp_perm by blast + moreover have "C B' Perp X Y" + by (simp add: P1 Perp_perm) + ultimately show ?thesis + using l12_9 by blast + qed + qed + have "B C' ParStrict C B'" + by (metis Out_def Par_def \B C' Par C B'\ \B \ C\ \B \ PO\ assms(1) assms(3) col_transitivity_1 not_col_permutation_4 out_col) + have "B' \ PO" + by (simp add: \B' \ PO\) + obtain X Y where P5: "Col PO X Y \ X Y Perp B C' \ X Y Perp C B'" + using Perp2_def assms(4) by auto + have "X \ Y" + using P5 perp_not_eq_1 by auto + show ?thesis + proof cases + assume "Col X Y B" + have "Col X Y C" + using P5 \B \ PO\ \Col B PO C\ \Col X Y B\ col_permutation_1 colx by blast + show ?thesis + proof - + have "Col B' PO C'" + using Col_cases assms(2) by auto + moreover have "Per PO C B'" + by (metis P5 \B C' ParStrict C B'\ \Col X Y C\ assms(2) col_permutation_2 par_strict_not_col_2 perp_col2 perp_per_2) + moreover have "Per PO B C'" + using P5 \B \ PO\ \Col X Y B\ col_permutation_1 perp_col2 perp_per_2 by blast + ultimately show ?thesis + by (metis Tarski_neutral_dimensionless.per13_preserves_bet_inv 
Tarski_neutral_dimensionless_axioms \B C' ParStrict C B'\ assms(1) assms(3) between_symmetry not_col_distincts not_col_permutation_3 par_strict_not_col_2) + qed + next + assume "\ Col X Y B" + then obtain B0 where U1: "Col X Y B0 \ X Y Perp B B0" + using l8_18_existence by blast + have "\ Col X Y C" + by (smt P5 \B C' ParStrict C B'\ \Col B PO C\ \\ Col X Y B\ assms(2) col_permutation_2 colx par_strict_not_col_2) + then obtain C0 where U2: "Col X Y C0 \ X Y Perp C C0" + using l8_18_existence by blast + have "B0 \ PO" + by (metis P5 Perp_perm \Col B PO C\ \Col X Y B0 \ X Y Perp B B0\ \\ Col X Y C\ assms(3) col_permutation_2 col_permutation_3 col_perp2_ncol_col) + { + assume "C0 = PO" + then have "C PO Par C B'" + by (metis P5 Par_def Perp_cases \Col X Y C0 \ X Y Perp C C0\ \\ Col X Y C\ col_perp2_ncol_col not_col_distincts not_col_permutation_3 perp_distinct) + then have "False" + by (metis \B C' ParStrict C B'\ assms(2) assms(3) col3 not_col_distincts par_id_2 par_strict_not_col_2) + } + then have "C0 \ PO" by auto + have "Bet B0 PO C0" + proof - + have "Bet B PO C" + by (simp add: assms(1)) + moreover have "PO \ B0" + using \B0 \ PO\ by auto + moreover have "PO \ C0" + using \C0 \ PO\ by auto + moreover have "Col B0 PO C0" + using U1 U2 P5 \X \ Y\ col3 not_col_permutation_2 by blast + moreover have "Per PO B0 B" + proof - + have "B0 PerpAt PO B0 B0 B" + proof cases + assume "X = B0" + have "B0 PO Perp B B0" + by (metis P5 U1 calculation(2) col3 col_trivial_2 col_trivial_3 perp_col2) + show ?thesis + proof - + have "B0 \ PO" + using calculation(2) by auto + moreover have "B0 Y Perp B B0" + using U1 \X = B0\ by auto + moreover have "Col B0 Y PO" + using Col_perm P5 \X = B0\ by blast + ultimately show ?thesis + using \B0 PO Perp B B0\ perp_in_comm perp_perp_in by blast + qed + next + assume "X \ B0" + have "X B0 Perp B B0" + using U1 \X \ B0\ perp_col by blast + have "B0 PO Perp B B0" + by (metis P5 U1 calculation(2) not_col_permutation_2 perp_col2) + then have "B0 PerpAt B0 PO B B0" + by (simp add: perp_perp_in) + thus ?thesis + using Perp_in_perm by blast + qed + then show ?thesis + by (simp add: perp_in_per) + qed + moreover have "Per PO C0 C" + proof - + have "C0 PO Perp C C0" + by (metis P5 U2 calculation(3) col3 col_trivial_2 col_trivial_3 perp_col2) + then have "C0 PerpAt PO C0 C0 C" + by (simp add: perp_in_comm perp_perp_in) + thus ?thesis + using perp_in_per_2 by auto + qed + ultimately show ?thesis + using per13_preserves_bet by blast + qed + show ?thesis + proof cases + assume "C' = B0" + have "B' = C0" + proof - + have "\ Col C' PO C" + using P5 U1 \B0 \ PO\ \C' = B0\ \\ Col X Y C\ colx not_col_permutation_3 not_col_permutation_4 by blast + moreover have "C \ C0" + using U2 \\ Col X Y C\ by auto + moreover have "Col C C0 B'" + proof - + have "Coplanar X Y C0 B'" + proof - + have "Col X Y C0" + by (simp add: U2) + moreover have "Col C0 B' C0" + by (simp add: col_trivial_3) + ultimately show ?thesis + using ncop__ncols by blast + qed + moreover have "C C0 Perp X Y" + using Perp_perm U2 by blast + moreover have "C B' Perp X Y" + using P5 Perp_perm by blast + ultimately show ?thesis + using cop_perp2__col by auto + qed + ultimately show ?thesis + by (metis Col_def \C' = B0\ \Bet B0 PO C0\ assms(2) colx) + qed + show ?thesis + using Bet_cases \B' = C0\ \C' = B0\ \Bet B0 PO C0\ by blast + next + assume "C' \ B0" + then have "B' \ C0" + by (metis P5 U1 U2 \C0 \ PO\ assms(2) col_permutation_1 colx l8_18_uniqueness) + have "B C' Par B B0" + proof - + have "Coplanar X Y B B" + using 
ncop_distincts by auto + moreover have "Coplanar X Y B B0" + using U1 ncop__ncols by blast + moreover have "Coplanar X Y C' B" + using P5 ncoplanar_perm_1 perp__coplanar by blast + moreover have "Coplanar X Y C' B0" + using \\ Col X Y B\ calculation(2) calculation(3) col_permutation_1 coplanar_perm_12 coplanar_perm_18 coplanar_trans_1 by blast + moreover have "B C' Perp X Y" + using P5 Perp_perm by blast + moreover have "B B0 Perp X Y" + using Perp_perm U1 by blast + ultimately show ?thesis + using l12_9 by blast + qed + { + assume "B C' ParStrict B B0" + have "Col B B0 C'" + by (simp add: \B C' Par B B0\ par_id_3) + } + then have "Col B B0 C'" + using \B C' Par B B0\ par_id_3 by blast + have "Col C C0 B'" + proof - + have "Coplanar X Y C0 B'" + by (simp add: U2 col__coplanar) + moreover have "C C0 Perp X Y" + by (simp add: Perp_perm U2) + moreover have "C B' Perp X Y" + using P5 Perp_perm by blast + ultimately show ?thesis + using cop_perp2__col by auto + qed + show ?thesis + proof - + have "Col B' PO C'" + using assms(2) not_col_permutation_4 by blast + moreover have "Per PO C0 B'" + proof - + have "C0 PerpAt PO C0 C0 B'" + proof cases + assume "X = C0" + have "C0 PO Perp C B'" + proof - + have "C0 \ PO" + by (simp add: \C0 \ PO\) + moreover have "C0 Y Perp C B'" + using P5 \X = C0\ by auto + moreover have "Col C0 Y PO" + using Col_perm P5 \X = C0\ by blast + ultimately show ?thesis + using perp_col by blast + qed + then have "B' C0 Perp C0 PO" + using Perp_perm \B' \ C0\ \Col C C0 B'\ not_col_permutation_1 perp_col1 by blast + then have "C0 PerpAt C0 B' PO C0" + using Perp_perm perp_perp_in by blast + thus ?thesis + using Perp_in_perm by blast + next + assume "X \ C0" + then have "X C0 Perp C B'" + using P5 U2 perp_col by blast + have "C0 PO Perp C B'" + using Col_cases P5 U2 \C0 \ PO\ perp_col2 by blast + then have "B' C0 Perp C0 PO" + using Perp_cases \B' \ C0\ \Col C C0 B'\ col_permutation_2 perp_col by blast + thus ?thesis + using Perp_in_perm Perp_perm perp_perp_in by blast + qed + then show ?thesis + using perp_in_per_2 by auto + qed + moreover have "Per PO B0 C'" + proof - + have "B0 PerpAt PO B0 B0 C'" + proof - + have "Col C' B B0" + using Col_cases \Col B B0 C'\ by blast + then have "C' B0 Perp X Y" using perp_col P5 Perp_cases \C' \ B0\ by blast + show ?thesis + proof - + have "PO B0 Perp B0 C'" + by (smt P5 U1 \B0 \ PO\ \C' \ B0\ \Col B B0 C'\ col_trivial_2 not_col_permutation_2 perp_col4) + then show ?thesis + using Perp_in_cases Perp_perm perp_perp_in by blast + qed + qed + thus ?thesis + by (simp add: perp_in_per) + qed + ultimately show ?thesis + using \B0 \ PO\ \C0 \ PO\ \Bet B0 PO C0\ between_symmetry per13_preserves_bet_inv by blast + qed + qed + qed + qed + qed +qed + +lemma is_image_perp_in: + assumes "A \ A'" and + "X \ Y" and + "A A' Reflect X Y" + shows "\ P. P PerpAt A A' X Y" + by (metis Perp_def Tarski_neutral_dimensionless.Perp_perm Tarski_neutral_dimensionless_axioms assms(1) assms(2) assms(3) ex_sym1 l10_6_uniqueness) + +lemma perp_inter_perp_in_n: + assumes "A B Perp C D" + shows "\ P. Col A B P \ Col C D P \ P PerpAt A B C D" + by (simp add: assms perp_inter_perp_in) + +lemma perp2_perp_in: + assumes "PO Perp2 A B C D" and + "\ Col PO A B" and + "\ Col PO C D" + shows "\ P Q. 
Col A B P \ Col C D Q \ Col PO P Q \ P PerpAt PO P A B \ Q PerpAt PO Q C D" +proof - + obtain X Y where P1: "Col PO X Y \ X Y Perp A B \ X Y Perp C D" + using Perp2_def assms(1) by blast + have "X \ Y" + using P1 perp_not_eq_1 by auto + obtain P where P2: "Col X Y P \ Col A B P \ P PerpAt X Y A B" + using P1 perp_inter_perp_in_n by blast + obtain Q where P3: "Col X Y Q \ Col C D Q \ Q PerpAt X Y C D" + using P1 perp_inter_perp_in_n by blast + have "Col A B P" + using P2 by simp + moreover have "Col C D Q" + using P3 by simp + moreover have "Col PO P Q" + using P2 P3 P1 \X \ Y\ col3 not_col_permutation_2 by blast + moreover have "P PerpAt PO P A B" + proof cases + assume "X = PO" + thus ?thesis + by (metis P2 assms(2) not_col_permutation_3 not_col_permutation_4 perp_in_col_perp_in perp_in_sym) + next + assume "X \ PO" + then have "P PerpAt A B X PO" + by (meson Col_cases P1 P2 perp_in_col_perp_in perp_in_sym) + then have "P PerpAt A B PO X" + using Perp_in_perm by blast + then have "P PerpAt A B PO P" + by (metis Col_cases assms(2) perp_in_col perp_in_col_perp_in) + thus ?thesis + by (simp add: perp_in_sym) + qed + moreover have "Q PerpAt PO Q C D" + by (metis P1 P3 \X \ Y\ assms(3) col_trivial_2 colx not_col_permutation_3 not_col_permutation_4 perp_in_col_perp_in perp_in_right_comm perp_in_sym) + ultimately show ?thesis + by blast +qed + +lemma l13_8: + assumes "U \ PO" and + "V \ PO" and + "Col PO P Q" and + "Col PO U V" and + "Per P U PO" and + "Per Q V PO" + shows "PO Out P Q \ PO Out U V" + by (smt Out_def assms(1) assms(2) assms(3) assms(4) assms(5) assms(6) l8_2 not_col_permutation_5 per23_preserves_bet per23_preserves_bet_inv per_distinct_1) + +lemma perp_in_rewrite: + assumes "P PerpAt A B C D" + shows "P PerpAt A P P C \ P PerpAt A P P D \ P PerpAt B P P C \ P PerpAt B P P D" + by (metis assms per_perp_in perp_in_distinct perp_in_per_1 perp_in_per_3 perp_in_per_4) + +lemma perp_out_acute: + assumes "B Out A C'" and + "A B Perp C C'" + shows "Acute A B C" +proof - + have "A \ B" + using assms(1) out_diff1 by auto + have "C' \ B" + using Out_def assms(1) by auto + then have "B C' Perp C C'" + by (metis assms(1) assms(2) out_col perp_col perp_comm perp_right_comm) + then have "Per C C' B" + using Perp_cases perp_per_2 by blast + then have "Acute C' C B \ Acute C' B C" + by (metis \C' \ B\ assms(2) l11_43 perp_not_eq_2) + have "C \ B" + using \B C' Perp C C'\ l8_14_1 by auto + show ?thesis + proof - + have "B Out A C'" + by (simp add: assms(1)) + moreover have "B Out C C" + by (simp add: \C \ B\ out_trivial) + moreover have "Acute C' B C" + by (simp add: \Acute C' C B \ Acute C' B C\) + ultimately show ?thesis + using acute_out2__acute by auto + qed +qed + +lemma perp_bet_obtuse: + assumes "B \ C'" and + "A B Perp C C'" and + "Bet A B C'" + shows "Obtuse A B C" +proof - + have "Acute C' B C" + proof - + have "B Out C' C'" + using assms(1) out_trivial by auto + moreover have "Col A B C'" + by (simp add: Col_def assms(3)) + then have "C' B Perp C C'" + using Out_def assms(2) assms(3) bet_col1 calculation perp_col2 by auto + ultimately show ?thesis + using perp_out_acute by blast + qed + thus ?thesis + using acute_bet__obtuse assms(2) assms(3) between_symmetry perp_not_eq_1 by blast +qed + +end + +subsubsection "Part 1: 2D" + +context Tarski_2D +begin + +lemma perp_in2__col: + assumes "P PerpAt A B X Y" and + "P PerpAt A' B' X Y" + shows "Col A B A'" + using cop4_perp_in2__col all_coplanar assms by blast + +lemma perp2_trans: + assumes "P Perp2 A B C D" and + "P Perp2 C D E F" + shows "P 
Perp2 A B E F" +proof - + obtain X Y where P1: "Col P X Y \ X Y Perp A B \ X Y Perp C D" + using Perp2_def assms(1) by blast + obtain X' Y' where P2: "Col P X' Y' \ X' Y' Perp C D \ X' Y' Perp E F" + using Perp2_def assms(2) by blast + { + assume "X Y Par X' Y'" + then have P3: "X Y ParStrict X' Y' \ (X \ Y \ X' \ Y' \ Col X X' Y' \ Col Y X' Y')" + using Par_def by blast + { + assume "X Y ParStrict X' Y'" + then have "P Perp2 A B E F" + using P1 P2 par_not_col by auto + } + { + assume "X \ Y \ X' \ Y' \ Col X X' Y' \ Col Y X' Y'" + then have "P Perp2 A B E F" + by (meson P1 P2 Perp2_def col_permutation_1 perp_col2) + } + then have "P Perp2 A B E F" + using P3 \X Y ParStrict X' Y' \ P Perp2 A B E F\ by blast + } + { + assume "\ X Y Par X' Y'" + then have "P Perp2 A B E F" + using P1 P2 l12_9_2D by blast + } + thus ?thesis + using \X Y Par X' Y' \ P Perp2 A B E F\ by blast +qed + +lemma perp2_par: + assumes "PO Perp2 A B C D" + shows "A B Par C D" + using Perp2_def l12_9_2D Perp_perm assms by blast + +end + +subsubsection "Part 2: length" + +context Tarski_neutral_dimensionless + +begin + +lemma lg_exists: + "\ l. (QCong l \ l A B)" + using QCong_def cong_pseudo_reflexivity by blast + +lemma lg_cong: + assumes "QCong l" and + "l A B" and + "l C D" + shows "Cong A B C D" + by (metis QCong_def assms(1) assms(2) assms(3) cong_inner_transitivity) + +lemma lg_cong_lg: + assumes "QCong l" and + "l A B" and + "Cong A B C D" + shows "l C D" + by (metis QCong_def assms(1) assms(2) assms(3) cong_transitivity) + +lemma lg_sym: + assumes "QCong l" + and "l A B" + shows "l B A" + using assms(1) assms(2) cong_pseudo_reflexivity lg_cong_lg by blast + +lemma ex_points_lg: + assumes "QCong l" + shows "\ A B. l A B" + using QCong_def assms cong_pseudo_reflexivity by fastforce + +lemma is_len_cong: + assumes "TarskiLen A B l" and + "TarskiLen C D l" + shows "Cong A B C D" + using TarskiLen_def assms(1) assms(2) lg_cong by auto + +lemma is_len_cong_is_len: + assumes "TarskiLen A B l" and + "Cong A B C D" + shows "TarskiLen C D l" + using TarskiLen_def assms(1) assms(2) lg_cong_lg by fastforce + +lemma not_cong_is_len: + assumes "\ Cong A B C D" and + "TarskiLen A B l" + shows "\ l C D" + using TarskiLen_def assms(1) assms(2) lg_cong by auto + +lemma not_cong_is_len1: + assumes "\ Cong A B C D" + and "TarskiLen A B l" + shows "\ TarskiLen C D l" + using assms(1) assms(2) is_len_cong by blast + +lemma lg_null_instance: + assumes "QCongNull l" + shows "l A A" + by (metis QCongNull_def QCong_def assms cong_diff cong_trivial_identity) + +lemma lg_null_trivial: + assumes "QCong l" + and "l A A" + shows "QCongNull l" + using QCongNull_def assms(1) assms(2) by auto + +lemma lg_null_dec: + (*assumes "QCong l" *) + shows "QCongNull l \ \ QCongNull l" + by simp + +lemma ex_point_lg: + assumes "QCong l" + shows "\ B. l A B" + by (metis QCong_def assms not_cong_3412 segment_construction) + +lemma ex_point_lg_out: + assumes "A \ P" and + "QCong l" and + "\ QCongNull l" + shows "\ B. (l A B \ A Out B P)" +proof - + obtain X Y where P1: "\ X0 Y0. (Cong X Y X0 Y0 \ l X0 Y0)" + using QCong_def assms(2) by auto + then have "l X Y" + using cong_reflexivity by auto + then have "X \ Y" + using assms(2) assms(3) lg_null_trivial by auto + then obtain B where "A Out P B \ Cong A B X Y" + using assms(1) segment_construction_3 by blast + thus ?thesis + using Cong_perm Out_cases P1 by blast +qed + +lemma ex_point_lg_bet: + assumes "QCong l" + shows "\ B. (l M B \ Bet A M B)" +proof - + obtain X Y where P1: "\ X0 Y0. 
(Cong X Y X0 Y0 \ l X0 Y0)" + using QCong_def assms by auto + then have "l X Y" + using cong_reflexivity by blast + obtain B where "Bet A M B \ Cong M B X Y" + using segment_construction by blast + thus ?thesis + using Cong_perm P1 by blast +qed + +lemma ex_points_lg_not_col: + assumes "QCong l" + and "\ QCongNull l" + shows "\ A B. (l A B \ \ Col A B P)" +proof - + have "\ B::'p. A \ B" + using another_point by blast + then obtain A::'p where "P \ A" + by metis + then obtain Q where "\ Col P A Q" + using not_col_exists by auto + then have "A \ Q" + using col_trivial_2 by auto + then obtain B where "l A B \ A Out B Q" + using assms(1) assms(2) ex_point_lg_out by blast + thus ?thesis + by (metis \\ Col P A Q\ col_transitivity_1 not_col_permutation_1 out_col out_diff1) +qed + +lemma ex_eql: + assumes "\ A B. (TarskiLen A B l1 \ TarskiLen A B l2)" + shows "l1 = l2" +proof - + obtain A B where P1: "TarskiLen A B l1 \ TarskiLen A B l2" + using assms by auto + have "\ A0 B0. (l1 A0 B0 \ l2 A0 B0)" + by (metis TarskiLen_def \TarskiLen A B l1 \ TarskiLen A B l2\ lg_cong lg_cong_lg) + have "\ A0 B0. (l1 A0 B0 \ l2 A0 B0)" + proof - + have "\ A0 B0. (l1 A0 B0 \ l2 A0 B0)" + by (metis TarskiLen_def \TarskiLen A B l1 \ TarskiLen A B l2\ lg_cong lg_cong_lg) + moreover have "\ A0 B0. (l2 A0 B0 \ l1 A0 B0)" + by (metis TarskiLen_def \TarskiLen A B l1 \ TarskiLen A B l2\ lg_cong lg_cong_lg) + ultimately show ?thesis by blast + qed + thus ?thesis by blast +qed + +lemma all_eql: + assumes "TarskiLen A B l1" and + "TarskiLen A B l2" + shows "l1 = l2" + using assms(1) assms(2) ex_eql by auto + +lemma null_len: + assumes "TarskiLen A A la" and + "TarskiLen B B lb" + shows "la = lb" + by (metis TarskiLen_def all_eql assms(1) assms(2) lg_null_instance lg_null_trivial) + +lemma eqL_equivalence: + assumes "QCong la" and + "QCong lb" and + "QCong lc" + shows "la = la \ (la = lb \ lb = la) \ (la = lb \ lb = lc \ la = lc)" + by simp + +lemma ex_lg: + "\ l. (QCong l \ l A B)" + by (simp add: lg_exists) + +lemma lg_eql_lg: + assumes "QCong l1" and + "l1 = l2" + shows "QCong l2" + using assms(1) assms(2) by auto + +lemma ex_eqL: + assumes "QCong l1" and + "QCong l2" and + "\ A B. (l1 A B \ l2 A B)" + shows "l1 = l2" + using TarskiLen_def all_eql assms(1) assms(2) assms(3) by auto + +subsubsection "Part 3 : angles" + +lemma ang_exists: + assumes "A \ B" and + "C \ B" + shows "\ a. (QCongA a \ a A B C)" +proof - + have "A B C CongA A B C" + by (simp add: assms(1) assms(2) conga_refl) + thus ?thesis + using QCongA_def assms(1) assms(2) by auto +qed + +lemma ex_points_eng: + assumes "QCongA a" + shows "\ A B C. (a A B C)" +proof - + obtain A B C where "A \ B \ C \ B \ (\ X Y Z. (A B C CongA X Y Z \ a X Y Z))" + using QCongA_def assms by auto + thus ?thesis + using conga_pseudo_refl by blast +qed + +lemma ang_conga: + assumes "QCongA a" and + "a A B C" and + "a A' B' C'" + shows "A B C CongA A' B' C'" +proof - + obtain A0 B0 C0 where "A0 \ B0 \ C0 \ B0 \ (\ X Y Z. (A0 B0 C0 CongA X Y Z \ a X Y Z))" + using QCongA_def assms(1) by auto + thus ?thesis + by (meson assms(2) assms(3) not_conga not_conga_sym) +qed + +lemma is_ang_conga: + assumes "A B C Ang a" and + "A' B' C' Ang a" + shows "A B C CongA A' B' C'" + using Ang_def ang_conga assms(1) assms(2) by auto + +lemma is_ang_conga_is_ang: + assumes "A B C Ang a" and + "A B C CongA A' B' C'" + shows "A' B' C' Ang a" +proof - + have "QCongA a" + using Ang_def assms(1) by auto + then obtain A0 B0 C0 where "A0 \ B0 \ C0 \ B0 \ (\ X Y Z. 
(A0 B0 C0 CongA X Y Z \ a X Y Z))" + using QCongA_def by auto + thus ?thesis + by (metis Ang_def assms(1) assms(2) not_conga) +qed + +lemma not_conga_not_ang: + assumes "QCongA a" and + "\ A B C CongA A' B' C'" and + "a A B C" + shows "\ a A' B' C'" + using ang_conga assms(1) assms(2) assms(3) by auto + +lemma not_conga_is_ang: + assumes "\ A B C CongA A' B' C'" and + "A B C Ang a" + shows "\ a A' B' C'" + using Ang_def ang_conga assms(1) assms(2) by auto + +lemma not_cong_is_ang1: + assumes "\ A B C CongA A' B' C'" and + "A B C Ang a" + shows "\ A' B' C' Ang a" + using assms(1) assms(2) is_ang_conga by blast + +lemma ex_eqa: + assumes "\ A B C.(A B C Ang a1 \ A B C Ang a2)" + shows "a1 = a2" +proof - + obtain A B C where P1: "A B C Ang a1 \ A B C Ang a2" + using assms by auto + { + fix x y z + assume "a1 x y z" + then have "x y z Ang a1" + using Ang_def assms by auto + then have "x y z CongA A B C" + using P1 not_cong_is_ang1 by blast + then have "x y z Ang a2" + using P1 is_ang_conga_is_ang not_conga_sym by blast + then have "a2 x y z" + using Ang_def assms by auto + } + { + fix x y z + assume "a2 x y z" + then have "x y z Ang a2" + using Ang_def assms by auto + then have "x y z CongA A B C" + using P1 not_cong_is_ang1 by blast + then have "x y z Ang a1" + using P1 is_ang_conga_is_ang not_conga_sym by blast + then have "a1 x y z" + using Ang_def assms by auto + } + then have "\ x y z. (a1 x y z) \ (a2 x y z)" + using \\z y x. a1 x y z \ a2 x y z\ by blast + then have "\x y. (\ z. (a1 x y) z = (a2 x y) z)" + by simp + then have "\ x y. (a1 x y) = (a2 x y)" using fun_eq_iff by auto + thus ?thesis using fun_eq_iff by auto +qed + +lemma all_eqa: + assumes "A B C Ang a1" and + "A B C Ang a2" + shows "a1 = a2" + using assms(1) assms(2) ex_eqa by blast + +lemma is_ang_distinct: + assumes "A B C Ang a" + shows "A \ B \ C \ B" + using assms conga_diff1 conga_diff2 is_ang_conga by blast + +lemma null_ang: + assumes "A B A Ang a1" and + "C D C Ang a2" + shows "a1 = a2" + using all_eqa assms(1) assms(2) conga_trivial_1 is_ang_conga_is_ang is_ang_distinct by auto + +lemma flat_ang: + assumes "Bet A B C" and + "Bet A' B' C'" and + "A B C Ang a1" and + "A' B' C' Ang a2" + shows "a1 = a2" +proof - + have "A B C Ang a2" + proof - + have "A' B' C' Ang a2" + by (simp add: assms(4)) + moreover have "A' B' C' CongA A B C" + by (metis assms(1) assms(2) assms(3) calculation conga_line is_ang_distinct) + ultimately show ?thesis + using is_ang_conga_is_ang by blast + qed + then show ?thesis + using assms(3) all_eqa by auto +qed + +lemma ang_distinct: + assumes "QCongA a" and + "a A B C" + shows "A \ B \ C \ B" +proof - + have "A B C Ang a" + by (simp add: Ang_def assms(1) assms(2)) + thus ?thesis + using is_ang_distinct by auto +qed + +lemma ex_ang: + assumes "B \ A" and + "B \ C" + shows "\ a. (QCongA a \ a A B C)" + using ang_exists assms(1) assms(2) by auto + +lemma anga_exists: + assumes "A \ B" and + "C \ B" and + "Acute A B C" + shows "\ a. (QCongAAcute a \ a A B C)" +proof - + have "A B C CongA A B C" + by (simp add: assms(1) assms(2) conga_refl) + thus ?thesis + using assms(1) QCongAAcute_def assms(3) by blast +qed + +lemma anga_is_ang: + assumes "QCongAAcute a" + shows "QCongA a" +proof - + obtain A0 B0 C0 where P1: "Acute A0 B0 C0 \ (\ X Y Z.(A0 B0 C0 CongA X Y Z \ a X Y Z))" + using QCongAAcute_def assms by auto + thus ?thesis + using QCongA_def by (metis acute_distincts) +qed + +lemma ex_points_anga: + assumes "QCongAAcute a" + shows "\ A B C. 
a A B C" + by (simp add: anga_is_ang assms ex_points_eng) + +lemma anga_conga: + assumes "QCongAAcute a" and + "a A B C" and + "a A' B' C'" + shows "A B C CongA A' B' C'" + by (meson Tarski_neutral_dimensionless.ang_conga Tarski_neutral_dimensionless_axioms anga_is_ang assms(1) assms(2) assms(3)) + +lemma is_anga_to_is_ang: + assumes "A B C AngAcute a" + shows "A B C Ang a" + using AngAcute_def Ang_def anga_is_ang assms by auto + +lemma is_anga_conga: + assumes "A B C AngAcute a" and + "A' B' C' AngAcute a" + shows "A B C CongA A' B' C'" + using AngAcute_def anga_conga assms(1) assms(2) by auto + +lemma is_anga_conga_is_anga: + assumes "A B C AngAcute a" and + "A B C CongA A' B' C'" + shows "A' B' C' AngAcute a" + using Tarski_neutral_dimensionless.AngAcute_def Tarski_neutral_dimensionless.Ang_def Tarski_neutral_dimensionless.is_ang_conga_is_ang Tarski_neutral_dimensionless_axioms assms(1) assms(2) is_anga_to_is_ang by fastforce + +lemma not_conga_is_anga: + assumes "\ A B C CongA A' B' C'" and + "A B C AngAcute a" + shows "\ a A' B' C'" + using AngAcute_def anga_conga assms(1) assms(2) by auto + +lemma not_cong_is_anga1: + assumes "\ A B C CongA A' B' C'" and + "A B C AngAcute a" + shows "\ A' B' C' AngAcute a" + using assms(1) assms(2) is_anga_conga by auto + +lemma ex_eqaa: + assumes "\ A B C. (A B C AngAcute a1 \ A B C AngAcute a2)" + shows "a1 = a2" + using all_eqa assms is_anga_to_is_ang by blast + +lemma all_eqaa: + assumes "A B C AngAcute a1" and + "A B C AngAcute a2" + shows "a1 = a2" + using assms(1) assms(2) ex_eqaa by blast + +lemma is_anga_distinct: + assumes "A B C AngAcute a" + shows "A \ B \ C \ B" + using assms is_ang_distinct is_anga_to_is_ang by blast + +lemma null_anga: + assumes "A B A AngAcute a1" and + "C D C AngAcute a2" + shows "a1 = a2" + using assms(1) assms(2) is_anga_to_is_ang null_ang by blast + +lemma anga_distinct: + assumes "QCongAAcute a" and + "a A B C" + shows "A \ B \ C \ B" + using ang_distinct anga_is_ang assms(1) assms(2) by blast + +lemma out_is_len_eq: + assumes "A Out B C" and + "TarskiLen A B l" and + "TarskiLen A C l" + shows "B = C" + using Out_def assms(1) assms(2) assms(3) between_cong not_cong_is_len1 by fastforce + +lemma out_len_eq: + assumes "QCong l" and + "A Out B C" and + "l A B" and + "l A C" + shows "B = C" using out_is_len_eq + using TarskiLen_def assms(1) assms(2) assms(3) assms(4) by auto + +lemma ex_anga: + assumes "Acute A B C" + shows "\ a. (QCongAAcute a \ a A B C)" + using acute_distincts anga_exists assms by blast + +lemma not_null_ang_ang: + assumes "QCongAnNull a" + shows "QCongA a" + using QCongAnNull_def assms by blast + +lemma not_null_ang_def_equiv: + "QCongAnNull a \ (QCongA a \ (\ A B C. (a A B C \ \ B Out A C)))" +proof - + { + assume "QCongAnNull a" + have "QCongA a \ (\ A B C. (a A B C \ \ B Out A C))" + using QCongAnNull_def \QCongAnNull a\ ex_points_eng by fastforce + } + { + assume "QCongA a \ (\ A B C. (a A B C \ \ B Out A C))" + have "QCongAnNull a" + by (metis Ang_def QCongAnNull_def Tarski_neutral_dimensionless.l11_21_a Tarski_neutral_dimensionless_axioms \QCongA a \ (\A B C. a A B C \ \ B Out A C)\ not_conga_is_ang) + } + thus ?thesis + using \QCongAnNull a \ QCongA a \ (\A B C. a A B C \ \ B Out A C)\ by blast +qed + +lemma not_flat_ang_def_equiv: + "QCongAnFlat a \ (QCongA a \ (\ A B C. (a A B C \ \ Bet A B C)))" +proof - + { + assume "QCongAnFlat a" + then have "QCongA a \ (\ A B C. (a A B C \ \ Bet A B C))" + using QCongAnFlat_def ex_points_eng by fastforce + } + { + assume "QCongA a \ (\ A B C. 
(a A B C \ \ Bet A B C))" + have "QCongAnFlat a" + proof - + obtain pp :: 'p and ppa :: 'p and ppb :: 'p where + f1: "QCongA a \ a pp ppa ppb \ \ Bet pp ppa ppb" + using \QCongA a \ (\A B C. a A B C \ \ Bet A B C)\ by moura + then have f2: "\p pa pb. pp ppa ppb CongA pb pa p \ \ a pb pa p" + by (metis (no_types) Ang_def Tarski_neutral_dimensionless.not_cong_is_ang1 Tarski_neutral_dimensionless_axioms) + then have f3: "\p pa pb. (Col pp ppa ppb \ \ a pb pa p) \ \ Bet pb pa p" + by (metis (no_types) Col_def Tarski_neutral_dimensionless.ncol_conga_ncol Tarski_neutral_dimensionless_axioms) + have f4: "\p pa pb. (\ Bet ppa ppb pp \ \ Bet pb pa p) \ \ a pb pa p" + using f2 f1 by (metis Col_def Tarski_neutral_dimensionless.l11_21_a Tarski_neutral_dimensionless_axioms not_bet_and_out not_out_bet) + have f5: "\p pa pb. (\ Bet ppb pp ppa \ \ Bet pb pa p) \ \ a pb pa p" + using f2 f1 by (metis Col_def Tarski_neutral_dimensionless.l11_21_a Tarski_neutral_dimensionless_axioms not_bet_and_out not_out_bet) + { assume "Bet ppa ppb pp" + then have ?thesis + using f4 f1 QCongAnFlat_def by blast } + moreover + { assume "Bet ppb pp ppa" + then have ?thesis + using f5 f1 QCongAnFlat_def by blast } + ultimately show ?thesis + using f3 f1 Col_def QCongAnFlat_def by blast + qed + } + thus ?thesis + using \QCongAnFlat a \ QCongA a \ (\A B C. a A B C \ \ Bet A B C)\ by blast +qed + +lemma ang_const: + assumes "QCongA a" and + "A \ B" + shows "\ C. a A B C" +proof - + obtain A0 B0 C0 where "A0 \ B0 \ C0 \ B0 \ (\ X Y Z. (A0 B0 C0 CongA X Y Z \ a X Y Z))" + by (metis QCongA_def assms(1)) + then have "(A0 B0 C0 CongA A0 B0 C0) \ a A0 B0 C0" + by (simp add: conga_refl) + then have "a A0 B0 C0" + using \A0 \ B0 \ C0 \ B0 \ (\X Y Z. A0 B0 C0 CongA X Y Z \ a X Y Z)\ conga_refl by blast + then show ?thesis + using \A0 \ B0 \ C0 \ B0 \ (\X Y Z. A0 B0 C0 CongA X Y Z \ a X Y Z)\ angle_construction_3 assms(2) by blast +qed + +lemma ang_sym: + assumes "QCongA a" and + "a A B C" + shows "a C B A" +proof - + obtain A0 B0 C0 where "A0 \ B0 \ C0 \ B0 \ (\ X Y Z. (A0 B0 C0 CongA X Y Z \ a X Y Z))" + by (metis QCongA_def assms(1)) + then show ?thesis + by (metis Tarski_neutral_dimensionless.ang_conga Tarski_neutral_dimensionless_axioms assms(1) assms(2) conga_left_comm conga_refl not_conga_sym) +qed + +lemma ang_not_null_lg: + assumes "QCongA a" and + "QCong l" and + "a A B C" and + "l A B" + shows "\ QCongNull l" + by (metis QCongNull_def TarskiLen_def ang_distinct assms(1) assms(3) assms(4) cong_reverse_identity not_cong_is_len) + +lemma ang_distincts: + assumes "QCongA a" and + "a A B C" + shows "A \ B \ C \ B" + using ang_distinct assms(1) assms(2) by auto + +lemma anga_sym: + assumes "QCongAAcute a" and + "a A B C" + shows "a C B A" + by (simp add: ang_sym anga_is_ang assms(1) assms(2)) + +lemma anga_not_null_lg: + assumes "QCongAAcute a" and + "QCong l" and + "a A B C" and + "l A B" + shows "\ QCongNull l" + using ang_not_null_lg anga_is_ang assms(1) assms(2) assms(3) assms(4) by blast + +lemma anga_distincts: + assumes "QCongAAcute a" and + "a A B C" + shows "A \ B \ C \ B" + using anga_distinct assms(1) assms(2) by blast + +lemma ang_const_o: + assumes "\ Col A B P" and + "QCongA a" and + "QCongAnNull a" and + "QCongAnFlat a" + shows "\ C. a A B C \ A B OS C P" +proof - + obtain A0 B0 C0 where P1: "A0 \ B0 \ C0 \ B0 \ (\ X Y Z. 
(A0 B0 C0 CongA X Y Z \ a X Y Z))" + by (metis QCongA_def assms(2)) + then have "a A0 B0 C0" + by (simp add: conga_refl) + then have T1: "A0 \ C0" + using P1 Tarski_neutral_dimensionless.QCongAnNull_def Tarski_neutral_dimensionless_axioms assms(3) out_trivial by fastforce + have "A \ B" + using assms(1) col_trivial_1 by blast + have "A0 \ B0 \ B0 \ C0" + using P1 by auto + then obtain C where P2: "A0 B0 C0 CongA A B C \ (A B OS C P \ Col A B C)" + using angle_construction_2 assms(1) by blast + then have "a A B C" + by (simp add: P1) + have P3: "A B OS C P \ Col A B C" + using P2 by simp + have P4: "\ A B C. (a A B C \ \ B Out A C)" + using assms(3) by (simp add: QCongAnNull_def) + have P5: "\ A B C. (a A B C \ \ Bet A B C)" + using assms(4) QCongAnFlat_def by auto + { + assume "Col A B C" + have "\ B Out A C" + using P4 by (simp add: \a A B C\) + have "\ Bet A B C" + using P5 by (simp add: \a A B C\) + then have "A B OS C P" + using \Col A B C\ \\ B Out A C\ l6_4_2 by blast + then have "\ C1. (a A B C1 \ A B OS C1 P)" + using \a A B C\ by blast + } + then have "\ C1. (a A B C1 \ A B OS C1 P)" + using P3 \a A B C\ by blast + then show ?thesis + by simp +qed + +lemma anga_const: + assumes "QCongAAcute a" and + "A \ B" + shows "\ C. a A B C" + using Tarski_neutral_dimensionless.ang_const Tarski_neutral_dimensionless_axioms anga_is_ang assms(1) assms(2) by fastforce + +lemma null_anga_null_angaP: + "QCongANullAcute a \ IsNullAngaP a" +proof - + have "QCongANullAcute a \ IsNullAngaP a" + using IsNullAngaP_def QCongANullAcute_def ex_points_anga by fastforce + moreover have "IsNullAngaP a \ QCongANullAcute a" + by (metis IsNullAngaP_def QCongAnNull_def Tarski_neutral_dimensionless.QCongANullAcute_def Tarski_neutral_dimensionless_axioms anga_is_ang not_null_ang_def_equiv) + ultimately show ?thesis + by blast +qed + +lemma is_null_anga_out: + assumes (*"QCongAAcute a" and *) + "a A B C" and + "QCongANullAcute a" + shows "B Out A C" + using QCongANullAcute_def assms(1) assms(2) by auto + +lemma acute_not_bet: + assumes "Acute A B C" + shows "\ Bet A B C" + using acute_col__out assms bet_col not_bet_and_out by blast + +lemma anga_acute: + assumes "QCongAAcute a" and + "a A B C" + shows "Acute A B C" + by (smt Tarski_neutral_dimensionless.QCongAAcute_def Tarski_neutral_dimensionless_axioms acute_conga__acute assms(1) assms(2)) + +lemma not_null_not_col: + assumes "QCongAAcute a" and + "\ QCongANullAcute a" and + "a A B C" + shows "\ Col A B C" +proof - + have "Acute A B C" + using anga_acute assms(1) assms(3) by blast + then show ?thesis + using Tarski_neutral_dimensionless.IsNullAngaP_def Tarski_neutral_dimensionless_axioms acute_col__out assms(1) assms(2) assms(3) null_anga_null_angaP by blast +qed + +lemma ang_cong_ang: + assumes "QCongA a" and + "a A B C" and + "A B C CongA A' B' C'" + shows "a A' B' C'" + by (metis QCongA_def assms(1) assms(2) assms(3) not_conga) + +lemma is_null_ang_out: + assumes (*"QCongA a" and *) + "a A B C" and + "QCongANull a" + shows "B Out A C" +proof - + have "a A B C \ B Out A C" + using QCongANull_def assms(2) by auto + then show ?thesis + by (simp add: assms(1)) +qed + +lemma out_null_ang: + assumes "QCongA a" and + "a A B C" and + "B Out A C" + shows "QCongANull a" + by (metis QCongANull_def QCongAnNull_def assms(1) assms(2) assms(3) not_null_ang_def_equiv) + +lemma bet_flat_ang: + assumes "QCongA a" and + "a A B C" and + "Bet A B C" + shows "AngFlat a" + by (metis AngFlat_def QCongAnFlat_def assms(1) assms(2) assms(3) not_flat_ang_def_equiv) + +lemma out_null_anga: 
+ assumes "QCongAAcute a" and + "a A B C" and + "B Out A C" + shows "QCongANullAcute a" + using IsNullAngaP_def assms(1) assms(2) assms(3) null_anga_null_angaP by auto + +lemma anga_not_flat: + assumes "QCongAAcute a" + shows "QCongAnFlat a" + by (metis (no_types, lifting) Tarski_neutral_dimensionless.QCongAnFlat_def Tarski_neutral_dimensionless.anga_is_ang Tarski_neutral_dimensionless_axioms assms bet_col is_null_anga_out not_bet_and_out not_null_not_col) + +lemma anga_const_o: + assumes "\ Col A B P" and + "\ QCongANullAcute a" and + "QCongAAcute a" + shows "\ C. (a A B C \ A B OS C P)" +proof - + have "QCongA a" + by (simp add: anga_is_ang assms(3)) + moreover have "QCongAnNull a" + using QCongANullAcute_def assms(2) assms(3) calculation not_null_ang_def_equiv by auto + moreover have "QCongAnFlat a" + by (simp add: anga_not_flat assms(3)) + ultimately show ?thesis + by (simp add: ang_const_o assms(1)) +qed + +lemma anga_conga_anga: + assumes "QCongAAcute a" and + "a A B C" and + "A B C CongA A' B' C'" + shows "a A' B' C'" + using ang_cong_ang anga_is_ang assms(1) assms(2) assms(3) by blast + +lemma anga_out_anga: + assumes "QCongAAcute a" and + "a A B C" and + "B Out A A'" and + "B Out C C'" + shows "a A' B C'" +proof - + have "A B C CongA A' B C'" + by (simp add: assms(3) assms(4) l6_6 out2__conga) + thus ?thesis + using anga_conga_anga assms(1) assms(2) by blast +qed + +lemma out_out_anga: + assumes "QCongAAcute a" and + "B Out A C" and + "B' Out A' C'" and + "a A B C" + shows "a A' B' C'" +proof - + have "A B C CongA A' B' C'" + by (simp add: assms(2) assms(3) l11_21_b) + thus ?thesis + using anga_conga_anga assms(1) assms(4) by blast +qed + +lemma is_null_all: + assumes "A \ B" and + "QCongANullAcute a" + shows "a A B A" +proof - + obtain A0 B0 C0 where "Acute A0 B0 C0 \ (\ X Y Z. 
(A0 B0 C0 CongA X Y Z \ a X Y Z))" + using QCongAAcute_def QCongANullAcute_def assms(2) by auto + then have "a A0 B0 C0" + using acute_distincts conga_refl by blast + thus ?thesis + by (smt QCongANullAcute_def assms(1) assms(2) out_out_anga out_trivial) +qed + +lemma anga_col_out: + assumes "QCongAAcute a" and + "a A B C" and + "Col A B C" + shows "B Out A C" +proof - + have "Acute A B C" + using anga_acute assms(1) assms(2) by auto + then have P1: "Bet A B C \ B Out A C" + using acute_not_bet by auto + then have "Bet C A B \ B Out A C" + using assms(3) l6_4_2 by auto + thus ?thesis + using P1 assms(3) l6_4_2 by blast +qed + +lemma ang_not_lg_null: + assumes "QCong la" and + "QCong lc" and + "QCongA a" and + "la A B" and + "lc C B" and + "a A B C" + shows "\ QCongNull la \ \ QCongNull lc" + by (metis ang_not_null_lg ang_sym assms(1) assms(2) assms(3) assms(4) assms(5) assms(6)) + +lemma anga_not_lg_null: + assumes (*"QCong la" and + "QCong lc" and*) + "QCongAAcute a" and + "la A B" and + "lc C B" and + "a A B C" + shows "\ QCongNull la \ \ QCongNull lc" + by (metis QCongNull_def anga_not_null_lg anga_sym assms(1) assms(2) assms(3) assms(4)) + +lemma anga_col_null: + assumes "QCongAAcute a" and + "a A B C" and + "Col A B C" + shows "B Out A C \ QCongANullAcute a" + using anga_col_out assms(1) assms(2) assms(3) out_null_anga by blast + +lemma eqA_preserves_ang: + assumes "QCongA a" and + "a = b" + shows "QCongA b" + using assms(1) assms(2) by auto + +lemma eqA_preserves_anga: + assumes "QCongAAcute a" and + (* "QCongA b" and*) + "a = b" + shows "QCongAAcute b" + using assms(1) assms(2) by auto + +section "Some postulates of the parallels" + +lemma euclid_5__original_euclid: + assumes "Euclid5" + shows "EuclidSParallelPostulate" +proof - + { + fix A B C D P Q R + assume P1: "B C OS A D \ SAMS A B C B C D \ A B C B C D SumA P Q R \ \ Bet P Q R" + obtain M where P2: "M Midpoint B C" + using midpoint_existence by auto + obtain D' where P3: "C Midpoint D D'" + using symmetric_point_construction by auto + obtain E where P4: "M Midpoint D' E" + using symmetric_point_construction by auto + have P5: "A \ B" + using P1 os_distincts by blast + have P6: "B \ C" + using P1 os_distincts by blast + have P7: "C \ D" + using P1 os_distincts by blast + have P10: "M \ B" + using P2 P6 is_midpoint_id by auto + have P11: "M \ C" + using P2 P6 is_midpoint_id_2 by auto + have P13: "C \ D'" + using P3 P7 is_midpoint_id_2 by blast + have P16: "\ Col B C A" + using one_side_not_col123 P1 by blast + have "B C OS D A" + using P1 one_side_symmetry by blast + then have P17: "\ Col B C D" + using one_side_not_col123 P1 by blast + then have P18: "\ Col M C D" + using P2 Col_perm P11 col_transitivity_2 midpoint_col by blast + then have P19: "\ Col M C D'" + by (metis P13 P3 Col_perm col_transitivity_2 midpoint_col) + then have P20: "\ Col D' C B" + by (metis Col_perm P13 P17 P3 col_transitivity_2 midpoint_col) + then have P21: "\ Col M C E" + by (metis P19 P4 bet_col col2__eq col_permutation_4 midpoint_bet midpoint_distinct_2) + have P22: "M C D' CongA M B E \ M D' C CongA M E B" using P13 l11_49 + by (metis Cong_cases P19 P2 P4 l11_51 l7_13_R1 l7_2 midpoint_cong not_col_distincts) + have P23: "Cong C D' B E" + using P11 P2 P4 l7_13_R1 l7_2 by blast + have P27: "C B TS D D'" + by (simp add: P13 P17 P3 bet__ts midpoint_bet not_col_permutation_4) + have P28: "A InAngle C B E" + proof - + have "C B A LeA C B E" + proof - + have "A B C LeA B C D'" + proof - + have "Bet D C D'" + by (simp add: P3 midpoint_bet) + then show 
?thesis using P1 P7 P13 sams_chara + by (metis sams_left_comm sams_sym) + qed + moreover have "A B C CongA C B A" + using P5 P6 conga_pseudo_refl by auto + moreover have "B C D' CongA C B E" + by (metis CongA_def Mid_cases P2 P22 P4 P6 symmetry_preserves_conga) + ultimately show ?thesis + using l11_30 by blast + qed + moreover have "C B OS E A" + proof - + have "C B TS E D'" + using P2 P20 P4 l7_2 l9_2 mid_two_sides not_col_permutation_1 by blast + moreover have "C B TS A D'" + using P27 \B C OS D A\ invert_two_sides l9_8_2 by blast + ultimately show ?thesis + using OS_def by blast + qed + ultimately show ?thesis + using lea_in_angle by simp + qed + obtain A' where P30: "Bet C A' E \ (A' = B \ B Out A' A)" using P28 InAngle_def by auto + { + assume "A' = B" + then have "Col D' C B" + by (metis Col_def P2 P21 P30 P6 col_transitivity_1 midpoint_col) + then have "False" + by (simp add: P20) + then have "\ Y. B Out A Y \ C Out D Y" by auto + } + { + assume P31: "B Out A' A" + have "\ I. BetS D' C I \ BetS B A' I" + proof - + have P32: "BetS B M C" + using BetS_def Midpoint_def P10 P11 P2 by auto + moreover have "BetS E M D'" + using BetS_def Bet_cases P19 P21 P4 midpoint_bet not_col_distincts by fastforce + moreover have "BetS C A' E" + proof - + have P32A: "C \ A'" + using P16 P31 out_col by auto + { + assume "A' = E" + then have P33: "B Out A E" + using P31 l6_6 by blast + then have "A B C B C D SumA D' C D" + proof - + have "D' C B CongA A B C" + proof - + have "D' C M CongA E B M" + by (simp add: P22 conga_comm) + moreover have "C Out D' D'" + using P13 out_trivial by auto + moreover have "C Out B M" + using BetSEq Out_cases P32 bet_out_1 by blast + moreover have "B Out A E" + using P33 by auto + moreover have "B Out C M" + using BetSEq Out_def P32 by blast + ultimately show ?thesis + using l11_10 by blast + qed + moreover have "D' C B B C D SumA D' C D" + by (simp add: P27 l9_2 ts__suma_1) + moreover have "B C D CongA B C D" + using P6 P7 conga_refl by auto + moreover have "D' C D CongA D' C D" + using P13 P7 conga_refl by presburger + ultimately show ?thesis + using conga3_suma__suma by blast + qed + then have "D' C D CongA P Q R" + using P1 suma2__conga by auto + then have "Bet P Q R" + using Bet_cases P3 bet_conga__bet midpoint_bet by blast + then have "False" using P1 by simp + } + then have "A' \ E" by auto + then show ?thesis + by (simp add: BetS_def P30 P32A) + qed + moreover have "\ Col B C D'" + by (simp add: P20 not_col_permutation_3) + moreover have "Cong B M C M" + using Midpoint_def P2 not_cong_1243 by blast + moreover have "Cong E M D' M" + using Cong_perm Midpoint_def P4 by blast + ultimately show ?thesis + using euclid_5_def assms by blast + qed + then obtain Y where P34: "Bet D' C Y \ BetS B A' Y" using BetSEq by blast + then have "\ Y. B Out A Y \ C Out D Y" + proof - + have P35: "B Out A Y" + by (metis BetSEq Out_def P31 P34 l6_7) + moreover have "C Out D Y" + proof - + have "D \ C" + using P7 by auto + moreover have "Y \ C" + using P16 P35 l6_6 out_col by blast + moreover have "D' \ C" + using P13 by auto + moreover have "Bet D C D'" + by (simp add: P3 midpoint_bet) + moreover have "Bet Y C D'" + by (simp add: Bet_perm P34) + ultimately show ?thesis + using l6_2 by blast + qed + ultimately show ?thesis by auto + qed + } + then have "\ Y. B Out A Y \ C Out D Y" + using P30 \A' = B \ \Y. 
B Out A Y \ C Out D Y\ by blast + } + then show ?thesis using euclid_s_parallel_postulate_def by blast +qed + +lemma tarski_s_euclid_implies_euclid_5: + assumes "TarskiSParallelPostulate" + shows "Euclid5" +proof - + { + fix P Q R S T U + assume + P1: "BetS P T Q \ BetS R T S \ BetS Q U R \ \ Col P Q S \ Cong P T Q T \ Cong R T S T" + have P1A: "BetS P T Q" using P1 by simp + have P1B: "BetS R T S" using P1 by simp + have P1C: "BetS Q U R" using P1 by simp + have P1D: "\ Col P Q S" using P1 by simp + have P1E: "Cong P T Q T" using P1 by simp + have P1F: "Cong R T S T" using P1 by simp + obtain V where P2: "P Midpoint R V" + using symmetric_point_construction by auto + have P3: "Bet V P R" + using Mid_cases P2 midpoint_bet by blast + then obtain W where P4: "Bet P W Q \ Bet U W V" using inner_pasch + using BetSEq P1C by blast + { + assume "P = W" + have "P \ V" + by (metis BetSEq Bet_perm Col_def Cong_perm Midpoint_def P1A P1B P1D P1E P1F P2 between_trivial is_midpoint_id_2 l7_9) + have "Col P Q S" + proof - + have f1: "Col V P R" + by (meson Col_def P3) + have f2: "Col U R Q" + by (simp add: BetSEq Col_def P1) + have f3: "Bet P T Q" + using BetSEq P1 by fastforce + have f4: "R = P \ Col V P U" + by (metis (no_types) Col_def P4 \P = W\ \P \ V\ l6_16_1) + have f5: "Col Q P T" + using f3 by (meson Col_def) + have f6: "Col T Q P" + using f3 by (meson Col_def) + have f7: "Col P T Q" + using f3 by (meson Col_def) + have f8: "Col P Q P" + using Col_def P4 \P = W\ by blast + have "Col R T S" + by (meson BetSEq Col_def P1) + then have "T = P \ Q = P" + using f8 f7 f6 f5 f4 f2 f1 by (metis (no_types) BetSEq P1 \P \ V\ colx l6_16_1) + then show ?thesis + by (metis BetSEq P1) + qed + then have "False" + by (simp add: P1D) + } + then have P5: "P \ W" by auto + have "Bet V W U" + using Bet_cases P4 by auto + then obtain X Y where P7: "Bet P V X \ Bet P U Y \ Bet X Q Y" + using assms(1) P1 P4 P5 tarski_s_parallel_postulate_def by blast + have "Q S Par P R" + proof - + have "Q \ S" + using P1D col_trivial_2 by auto + moreover have "T Midpoint Q P" + using BetSEq P1A P1E l7_2 midpoint_def not_cong_1243 by blast + moreover have "T Midpoint S R" + using BetSEq P1B P1F l7_2 midpoint_def not_cong_1243 by blast + ultimately show ?thesis + using l12_17 by auto + qed + then have P9: "Q S ParStrict P R" + using P1D Par_def par_strict_symmetry par_symmetry by blast + have P10: "Q S TS P Y" + proof - + have P10A: "P \ R" + using P9 par_strict_distinct by auto + then have P11: "P \ X" + by (metis P2 P7 bet_neq12__neq midpoint_not_midpoint) + have P12: "\ Col X Q S" + proof - + have "Q S ParStrict P R" + by (simp add: P9) + then have "Col P R X" + by (metis P2 P3 P7 bet_col between_symmetry midpoint_not_midpoint not_col_permutation_4 outer_transitivity_between) + then have "P X ParStrict Q S" + using P9 Par_strict_perm P11 par_strict_col_par_strict by blast + then show ?thesis + using par_strict_not_col_2 by auto + qed + { + assume W1: "Col Y Q S" + have W2: "Q = Y" + by (metis P12 P7 W1 bet_col bet_col1 colx) + then have "\ Col Q P R" + using P9 W1 par_not_col by auto + then have W3: "Q = U" + by (smt BetS_def Col_def P1C P7 W2 col_transitivity_2) + then have "False" + using BetS_def P1C by auto + } + then have "\ Col Y Q S" by auto + then have "Q S TS X Y" + by (metis P7 P12 bet__ts not_col_distincts not_col_permutation_1) + moreover have "Q S OS X P" + proof - + have "P \ V" + using P10A P2 is_midpoint_id_2 by blast + then have "Q S ParStrict P X" + by (meson Bet_perm P3 P7 P9 P11 bet_col not_col_permutation_4 
par_strict_col_par_strict) + then have "Q S ParStrict X P" + by (simp add: par_strict_right_comm) + then show ?thesis + by (simp add: l12_6) + qed + ultimately show ?thesis + using l9_8_2 by auto + qed + then obtain I where W4: "Col I Q S \ Bet P I Y" + using TS_def by blast + have "\ I. (BetS S Q I \ BetS P U I)" + proof - + have "BetS P U I" + proof - + have "P \ Y" + using P10 not_two_sides_id by auto + have W4A: "Bet P U I" + proof - + have W5: "Col P U I" + using P7 W4 bet_col1 by auto + { + assume W6: "Bet U I P" + have W7: "Q S OS P U" + proof - + have "Q S OS R U" + proof - + have "\ Col Q S R" + using P9 par_strict_not_col_4 by auto + moreover have "Q Out R U" + using BetSEq Out_def P1C by blast + ultimately show ?thesis + by (simp add: out_one_side) + qed + moreover have "Q S OS P R" + by (simp add: P9 l12_6) + ultimately show ?thesis + using one_side_transitivity by blast + qed + have W8: "I Out P U \ \ Col Q S P" + by (simp add: P1D not_col_permutation_1) + have "False" + proof - + have "I Out U P" + using W4 W6 W7 between_symmetry one_side_chara by blast + then show ?thesis + using W6 not_bet_and_out by blast + qed + } + { + assume V1: "Bet I P U" + have "P R OS I U" + proof - + have "P R OS I Q" + proof - + { + assume "Q = I" + then have "Col P Q S" + by (metis BetSEq Col_def P1C P7 P9 V1 W4 between_equality outer_transitivity_between par_not_col) + then have "False" + using P1D by blast + } + then have "Q \ I" by blast + moreover have "P R ParStrict Q S" + using P9 par_strict_symmetry by blast + moreover have "Col Q S I" + using Col_cases W4 by blast + ultimately show ?thesis + using one_side_symmetry par_strict_all_one_side by blast + qed + moreover have "P R OS Q U" + proof - + have "Q S ParStrict P R" + using P9 by blast + have "R Out Q U \ \ Col P R Q" + by (metis BetSEq Bet_cases Out_def P1C calculation col124__nos) + then show ?thesis + by (metis P7 V1 W4 \Bet U I P \ False\ between_equality col_permutation_2 not_bet_distincts out_col outer_transitivity_between) + qed + ultimately show ?thesis + using one_side_transitivity by blast + qed + then have V2: "P Out I U" + using P7 W4 bet2__out os_distincts by blast + then have "Col P I U" + using V1 not_bet_and_out by blast + then have "False" + using V1 V2 not_bet_and_out by blast + } + then moreover have "\ (Bet U I P \ Bet I P U)" + using \Bet U I P \ False\ by auto + ultimately show ?thesis + using Col_def W5 by blast + qed + { + assume "P = U" + then have "Col P R Q" + using BetSEq Col_def P1C by blast + then have "False" + using P9 par_strict_not_col_3 by blast + } + then have V6: "P \ U" by auto + { + assume "U = I" + have "Q = U" + proof - + have f1: "BetS Q I R" + using P1C \U = I\ by blast + then have f2: "Col Q I R" + using BetSEq Col_def by blast + have f3: "Col I R Q" + using f1 by (simp add: BetSEq Col_def) + { assume "R \ Q" + moreover + { assume "(R \ Q \ R \ I) \ \ Col I Q R" + moreover + { assume "\p. (R \ Q \ \ Col I p I) \ Col Q I p" + then have "I = Q" + using f1 by (metis (no_types) BetSEq Col_def col_transitivity_2) } + ultimately have "(\p pa. ((pa \ I \ \ Col pa p R) \ Col Q I pa) \ Col I pa p) \ I = Q" + using f3 f2 by (metis (no_types) col_transitivity_2) } + ultimately have "(\p pa. 
((pa \ I \ \ Col pa p R) \ Col Q I pa) \ Col I pa p) \ I = Q" + using f1 by (metis (no_types) BetSEq P9 W4 col_transitivity_2 par_strict_not_col_4) } + then show ?thesis + using f2 by (metis P9 W4 \U = I\ col_transitivity_2 par_strict_not_col_4) + qed + then have "False" + using BetSEq P1C by blast + } + then have "U \ I" by auto + then show ?thesis + by (simp add: W4A V6 BetS_def) + qed + moreover have "BetS S Q I" + proof - + have "Q R TS S I" + proof - + have "Q R TS P I" + proof - + have "\ Col P Q R" + using P9 col_permutation_5 par_strict_not_col_3 by blast + moreover have "\ Col I Q R" + proof - + { + assume "Col I Q R" + then have "Col Q S R" + proof - + have f1: "\p pa pb. Col p pa pb \ \ BetS pb p pa" + by (meson BetSEq Col_def) + then have f2: "Col U I P" + using \BetS P U I\ by blast + have f3: "Col I P U" + by (simp add: BetSEq Col_def \BetS P U I\) + have f4: "\p. (U = Q \ Col Q p R) \ \ Col Q U p" + by (metis BetSEq Col_def P1C col_transitivity_1) + { assume "P \ Q" + moreover + { assume "(P \ Q \ U \ Q) \ Col Q P Q" + then have "(P \ Q \ U \ Q) \ \ Col Q P R" + using Col_cases \\ Col P Q R\ by blast + moreover + { assume "\p. ((U \ Q \ P \ Q) \ \ Col Q p P) \ Col Q P p" + then have "U \ Q \ \ Col Q P P" + by (metis col_transitivity_1) + then have "\ Col U Q P" + using col_transitivity_2 by blast } + ultimately have "\ Col U Q P \ I \ Q" + using f4 f3 by blast } + ultimately have "I \ Q" + using f2 f1 by (metis BetSEq P1C col_transitivity_1 col_transitivity_2) } + then have "I \ Q" + using BetSEq \BetS P U I\ by blast + then show ?thesis + by (simp add: W4 \Col I Q R\ col_transitivity_2) + qed + then have "False" + using P9 par_strict_not_col_4 by blast + } + then show ?thesis by blast + qed + moreover have "Col U Q R" + using BetSEq Bet_cases Col_def P1C by blast + moreover have "Bet P U I" + by (simp add: BetSEq \BetS P U I\) + ultimately show ?thesis + using TS_def by blast + qed + moreover have "Q R OS P S" + proof - + have "Q R Par P S" + proof - + have "Q \ R" + using BetSEq P1 by blast + moreover have "T Midpoint Q P" + using BetSEq Bet_cases P1A P1E cong_3421 midpoint_def by blast + moreover have "T Midpoint R S" + using BetSEq P1B P1F midpoint_def not_cong_1243 by blast + ultimately show ?thesis + using l12_17 by blast + qed + then have "Q R ParStrict P S" + by (simp add: P1D Par_def not_col_permutation_4) + then show ?thesis + using l12_6 by blast + qed + ultimately show ?thesis + using l9_8_2 by blast + qed + then show ?thesis + by (metis BetS_def W4 col_two_sides_bet not_col_permutation_2 ts_distincts) + qed + ultimately show ?thesis + by auto + qed + } + then show ?thesis using euclid_5_def by blast +qed + +lemma tarski_s_implies_euclid_s_parallel_postulate: + assumes "TarskiSParallelPostulate" + shows "EuclidSParallelPostulate" + by (simp add: assms euclid_5__original_euclid tarski_s_euclid_implies_euclid_5) + +theorem tarski_s_euclid_implies_playfair_s_postulate: + assumes "TarskiSParallelPostulate" + shows "PlayfairSPostulate" +proof - + { + fix A1 A2 B1 B2 P C1 C2 + assume P1: "\ Col P A1 A2 \ A1 A2 Par B1 B2 \ Col P B1 B2 \ A1 A2 Par C1 C2 \ Col P C1 C2" + have P1A: "\ Col P A1 A2" + by (simp add: P1) + have P2: "A1 A2 Par B1 B2" + by (simp add: P1) + have P3: "Col P B1 B2" + by (simp add: P1) + have P4: "A1 A2 Par C1 C2" + by (simp add: P1) + have P5: "Col P C1 C2" + by (simp add: P1) + have P6: "A1 A2 ParStrict B1 B2" + proof - + have "A1 A2 Par B1 B2" + by (simp add: P1) + moreover have "Col B1 B2 P" + using P3 not_col_permutation_2 by blast + moreover 
have "\ Col A1 A2 P" + by (simp add: P1A not_col_permutation_1) + ultimately show ?thesis + using par_not_col_strict by auto + qed + have P7: "A1 A2 ParStrict C1 C2" + proof - + have "A1 A2 Par C1 C2" + by (simp add: P1) + moreover have "Col C1 C2 P" + using Col_cases P1 by blast + moreover have "\ Col A1 A2 P" + by (simp add: P1A not_col_permutation_1) + ultimately show ?thesis + using par_not_col_strict by auto + qed + { + assume "\ Col C1 B1 B2 \ \ Col C2 B1 B2" + have "\ C'. Col C1 C2 C' \ B1 B2 TS A1 C'" + proof - + have T2: "Coplanar A1 A2 P A1" + using ncop_distincts by auto + have T3: "Coplanar A1 A2 B1 B2" + by (simp add: P1 par__coplanar) + have T4: "Coplanar A1 A2 C1 C2" + by (simp add: P7 pars__coplanar) + have T5: "Coplanar A1 A2 P B1" + using P1 col_trivial_2 ncop_distincts par__coplanar par_col2_par_bis by blast + then have T6: "Coplanar A1 A2 P B2" + using P3 T3 col_cop__cop by blast + have T7: "Coplanar A1 A2 P C1" + using P1 T4 col_cop__cop coplanar_perm_1 not_col_permutation_2 par_distincts by blast + then have T8: "Coplanar A1 A2 P C2" + using P5 T4 col_cop__cop by blast + { + assume "\ Col C1 B1 B2" + moreover have "C1 \ C2" + using P1 par_neq2 by auto + moreover have "Col B1 B2 P" + using P1 not_col_permutation_2 by blast + moreover have "Col C1 C2 P" + using Col_cases P5 by auto + moreover have "\ Col B1 B2 C1" + using Col_cases calculation(1) by auto + moreover have "\ Col B1 B2 A1" + using P6 par_strict_not_col_3 by auto + moreover have "Coplanar B1 B2 C1 A1" + using Col_cases P1A T5 T2 T6 T7 coplanar_pseudo_trans by blast + ultimately have "\ C'. Col C1 C2 C' \ B1 B2 TS A1 C'" + using cop_not_par_other_side by blast + } + { + assume "\ Col C2 B1 B2" + moreover have "C2 \ C1" + using P1 par_neq2 by blast + moreover have "Col B1 B2 P" + using Col_cases P3 by auto + moreover have "Col C2 C1 P" + using Col_cases P5 by auto + moreover have "\ Col B1 B2 C2" + by (simp add: calculation(1) not_col_permutation_1) + moreover have "\ Col B1 B2 A1" + using P6 par_strict_not_col_3 by auto + moreover have "Coplanar B1 B2 C2 A1" + using Col_cases P1A T2 T5 T6 T8 coplanar_pseudo_trans by blast + ultimately have "\ C'. Col C1 C2 C' \ B1 B2 TS A1 C'" using cop_not_par_other_side + by (meson not_col_permutation_4) + } + then show ?thesis + using \\ Col C1 B1 B2 \ \C'. 
Col C1 C2 C' \ B1 B2 TS A1 C'\ \\ Col C1 B1 B2 \ \ Col C2 B1 B2\ by blast + qed + then obtain C' where W1: "Col C1 C2 C' \ B1 B2 TS A1 C'" by auto + then have W2: "\ Col A1 B1 B2" + using TS_def by blast + obtain B where W3: "Col B B1 B2 \ Bet A1 B C'" + using TS_def W1 by blast + obtain C where W4: "P Midpoint C' C" + using symmetric_point_construction by blast + then have W4A: "Bet A1 B C' \ Bet C P C'" + using Mid_cases W3 midpoint_bet by blast + then obtain D where W5: "Bet B D C \ Bet P D A1" using inner_pasch by blast + have W6: "C' \ P" + using P3 TS_def W1 by blast + then have "A1 A2 Par C' P" + by (meson P1 W1 not_col_permutation_2 par_col2_par) + have W9: "A1 A2 ParStrict C' P" + using Col_cases P5 P7 W1 W6 par_strict_col2_par_strict by blast + then have W10: "B \ P" + by (metis W6 W4A bet_out_1 out_col par_strict_not_col_3) + have W11: "P \ C" + using W6 W4 is_midpoint_id_2 by blast + { + assume "P = D" + then have "False" + by (metis Col_def P3 W1 W3 W4A W5 W10 W11 col_trivial_2 colx l9_18_R1) + } + then have "P \ D" by auto + then obtain X Y where W12: "Bet P B X \ Bet P C Y \ Bet X A1 Y" + using W5 assms tarski_s_parallel_postulate_def by blast + then have "P \ X" + using W10 bet_neq12__neq by auto + then have "A1 A2 ParStrict P X" + by (metis Col_cases P3 P6 W10 W12 W3 bet_col colx par_strict_col2_par_strict) + then have W15: "A1 A2 OS P X" + by (simp add: l12_6) + have "P \ Y" + using W11 W12 between_identity by blast + then have "A1 A2 ParStrict P Y" + by (metis Col_def W11 W12 W4A W9 col_trivial_2 par_strict_col2_par_strict) + then have W16: "A1 A2 OS P Y" + using l12_6 by auto + have "Col A1 X Y" + by (simp add: W12 bet_col col_permutation_4) + then have "A1 Out X Y" using col_one_side_out W15 W16 + using one_side_symmetry one_side_transitivity by blast + then have "False" + using W12 not_bet_and_out by blast + } + then have "Col C1 B1 B2 \ Col C2 B1 B2" + by auto + } + { + fix A1 A2 B1 B2 P C1 C2 + assume P1: "Col P A1 A2 \ A1 A2 Par B1 B2 \ Col P B1 B2 \ A1 A2 Par C1 C2 \ Col P C1 C2" + have "Col C1 B1 B2" + by (smt P1 l9_10 not_col_permutation_3 not_strict_par2 par_col2_par par_comm par_id_5 par_symmetry ts_distincts) + moreover have "Col C2 B1 B2" + by (smt P1 l9_10 not_col_permutation_3 not_strict_par2 par_col2_par par_id_5 par_left_comm par_symmetry ts_distincts) + ultimately have "Col C1 B1 B2 \ Col C2 B1 B2" by auto + } + then show ?thesis + using playfair_s_postulate_def + by (metis \\P C2 C1 B2 B1 A2 A1. \ Col P A1 A2 \ A1 A2 Par B1 B2 \ Col P B1 B2 \ A1 A2 Par C1 C2 \ Col P C1 C2 \ Col C1 B1 B2 \ Col C2 B1 B2\) +qed + +end +end diff --git a/thys/IsaGeoCoq/document/root.bib b/thys/IsaGeoCoq/document/root.bib new file mode 100644 --- /dev/null +++ b/thys/IsaGeoCoq/document/root.bib @@ -0,0 +1,323 @@ +%% This BibTeX bibliography file was created using BibDesk. +%% http://bibdesk.sourceforge.net/ + + +%% Created for Larry Paulson at 2021-02-01 11:31:03 +0000 + + +%% Saved with string encoding Unicode (UTF-8) + + + +@article{Poincare_Disc-AFP, + author = {Danijela Simi{\'c} and Filip Mari{\'c} and Pierre Boutry}, + date-added = {2021-02-01 11:30:15 +0000}, + date-modified = {2021-02-01 11:30:15 +0000}, + issn = {2150-914x}, + journal = {Archive of Formal Proofs}, + month = dec, + note = {\url{https://isa-afp.org/entries/Poincare_Disc.html}, Formal proof development}, + title = {Poincar{\'e} Disc Model}, + year = 2019} + +@article{Tarskis_Geometry-AFP, + author = {T. J. M. 
Makarios}, + date-added = {2021-02-01 11:28:01 +0000}, + date-modified = {2021-02-01 11:28:28 +0000}, + issn = {2150-914x}, + journal = {Archive of Formal Proofs}, + month = oct, + note = {\url{https://isa-afp.org/entries/Tarskis_Geometry.html}, Formal proof development}, + title = {The Independence of {Tarski's Euclidean} Axiom}, + year = 2012} + +@book{tarski, + address = {Berlin}, + author = {Wolfram Schwabh{\"a}user and Wanda Szmielew and Alfred Tarski}, + publisher = {Springer-Verlag}, + title = {{Metamathematische Methoden in der Geometrie}}, + year = {1983}} + +@mastersthesis{makarios, + author = {Makarios, Timothy James McKenzie}, + note = {Master Thesis}, + school = {Victoria University of Wellington}, + title = {{A Mechanical Verification of the Independence of Tarski's Euclidean Axiom}}, + year = {2012}} + +@misc{beeson:hal-01912024, + author = {Beeson, Michael and Boutry, Pierre and Braun, Gabriel and Gries, Charly and Narboux, Julien}, + hal_id = {hal-01912024}, + hal_version = {v1}, + keywords = {coordinates ; Triangle centers ; parallel postulates ; Euclidean geometry ; Formal proofs ; Coq Proof Assistant ; Foundations of geometry ; arithmetization ; Thales theorem ; pythagoras theorem ; neutral geometry ; Tarski's geometry ; Hilbert's geometry}, + month = Jun, + title = {{GeoCoq}}, + url = {https://hal.inria.fr/hal-01912024}, + year = {2018}, + Bdsk-Url-1 = {https://hal.inria.fr/hal-01912024}} + +@article{wiedijk2012synthesis, + author = {Wiedijk, Freek}, + journal = {Logical Methods in Computer Science}, + publisher = {Episciences. org}, + title = {A Synthesis of the Procedural and Declarative Styles of Interactive Theorem Proving}, + volume = {8}, + year = {2012}} + +@article{boutry:hal-01178236, + author = {Boutry, Pierre and Gries, Charly and Narboux, Julien and Schreck, Pascal}, + doi = {10.1007/s10817-017-9422-8}, + hal_id = {hal-01178236}, + hal_version = {v2}, + journal = {{Journal of Automated Reasoning}}, + keywords = {Archimedes' axiom ; axiom ; Aristotle's ; Coq ; neutral geometry ; formalization ; decidability of intersection ; classification ; foundations of geometry ; Euclid ; parallel postulate ; Saccheri-Legendre theorem ; sum of angles}, + note = {online first}, + pages = {68}, + pdf = {https://hal.inria.fr/hal-01178236/file/parallel_postulates_revised.pdf}, + publisher = {{Springer Verlag}}, + title = {{Parallel postulates and continuity axioms: a mechanized study in intuitionistic logic using Coq}}, + url = {https://hal.inria.fr/hal-01178236}, + year = {2017}, + Bdsk-Url-1 = {https://hal.inria.fr/hal-01178236}, + Bdsk-Url-2 = {https://doi.org/10.1007/s10817-017-9422-8}} + +@inproceedings{gries:hal-01228612, + address = {Saint Malo, France}, + author = {Gries, Charly and Boutry, Pierre and Narboux, Julien}, + booktitle = {{Les vingt-septi{\`e}mes Journ{\'e}es Francophones des Langages Applicatifs (JFLA 2016)}}, + hal_id = {hal-01228612}, + hal_version = {v2}, + keywords = {Euclid ; formalization ; Coq ; parallel postulate ; geometry ; formalisation}, + month = Jan, + organization = {{Jade Algave and Julien Signoles}}, + pages = {15}, + pdf = {https://hal.inria.fr/hal-01228612/file/jfla2016-gries-boutry-narboux.pdf}, + series = {Actes des Vingt-septi{\`e}mes Journ{\'e}es Francophones des Langages Applicatifs (JFLA 2016)}, + title = {{Somme des angles d'un triangle et unicit{\'e} de la parall{\`e}le : une preuve d'{\'e}quivalence formalis{\'e}e en Coq}}, + url = {https://hal.inria.fr/hal-01228612}, + year = {2016}, + Bdsk-Url-1 = 
{https://hal.inria.fr/hal-01228612}} + +@inproceedings{narboux:inria-00118812, + address = {Pontevedra, Spain}, + author = {Narboux, Julien}, + booktitle = {{Automated Deduction in Geometry 2006}}, + doi = {10.1007/978-3-540-77356-6}, + editor = {Eugenio Roanes Lozano, Francisco Botana}, + hal_id = {inria-00118812}, + hal_version = {v1}, + keywords = {Tarski's axioms ; Coq ; Formal Proof ; geometry}, + month = Aug, + organization = {{Francisco Botana}}, + pages = {139-156}, + pdf = {https://hal.inria.fr/inria-00118812/file/adg06-narboux.pdf}, + publisher = {{Springer}}, + series = {LNCS}, + title = {{Mechanical Theorem Proving in Tarski's geometry.}}, + url = {https://hal.inria.fr/inria-00118812}, + volume = {4869}, + year = {2006}, + Bdsk-Url-1 = {https://hal.inria.fr/inria-00118812}, + Bdsk-Url-2 = {https://doi.org/10.1007/978-3-540-77356-6}} + +@article{boutry:hal-01483457, + author = {Boutry, Pierre and Braun, Gabriel and Narboux, Julien}, + doi = {10.1016/j.jsc.2018.04.007}, + hal_id = {hal-01483457}, + hal_version = {v1}, + journal = {{Journal of Symbolic Computation}}, + keywords = {Pythagoras' theorem ; intercept theorem ; arithmetization ; Coq ; geometry ; Formalization ; area method}, + pages = {149-168}, + pdf = {https://hal.inria.fr/hal-01483457/file/extended-arithmetization.pdf}, + publisher = {{Elsevier}}, + series = {Special Issue on Symbolic Computation in Software Science}, + title = {{Formalization of the Arithmetization of Euclidean Plane Geometry and Applications}}, + url = {https://hal.inria.fr/hal-01483457}, + volume = {90}, + year = {2019}, + Bdsk-Url-1 = {https://hal.inria.fr/hal-01483457}, + Bdsk-Url-2 = {https://doi.org/10.1016/j.jsc.2018.04.007}} + +@incollection{narboux:hal-01779452, + author = {Narboux, Julien and Janicic, Predrag and Fleuriot, Jacques}, + booktitle = {{Handbook of Geometric Constraint Systems Principles}}, + editor = {Meera Sitharam and Audrey St. John and Jessica Sidman}, + hal_id = {hal-01779452}, + hal_version = {v1}, + publisher = {{Chapman and Hall/CRC }}, + series = {Discrete Mathematics and Its Applications}, + title = {{Computer-assisted Theorem Proving in Synthetic Geometry}}, + url = {https://hal.inria.fr/hal-01779452}, + year = {2018}, + Bdsk-Url-1 = {https://hal.inria.fr/hal-01779452}} + +@article{dhurdjevic2015automated, + author = {{\DH}ur{\dj}evi{\'c}, Sana Stojanovi{\'c} and Narboux, Julien and Jani{\v{c}}i{\'c}, Predrag}, + journal = {Annals of Mathematics and Artificial Intelligence}, + number = {3-4}, + pages = {249--269}, + publisher = {Springer}, + title = {Automated generation of machine verifiable and readable proofs: a case study of Tarski's geometry}, + volume = {74}, + year = {2015}} + +@inproceedings{beeson2014otter, + author = {Beeson, Michael and Wos, Larry}, + booktitle = {International Joint Conference on Automated Reasoning}, + organization = {Springer}, + pages = {495--510}, + title = {OTTER proofs in Tarskian geometry}, + year = {2014}} + +@inproceedings{DBLP:conf/csedu/DoreB18a, + author = {Maximilian Dor{\'{e}} and Krysia Broda}, + bibsource = {dblp computer science bibliography, https://dblp.org}, + biburl = {https://dblp.org/rec/conf/csedu/DoreB18a.bib}, + booktitle = {Computer Supported Education - 10th International Conference, {CSEDU} 2018, Funchal, Madeira, Portugal, March 15-17, 2018, Revised Selected Papers}, + doi = {10.1007/978-3-030-21151-6\_26}, + editor = {Bruce M. 
McLaren and Rob Reilly and Susan Zvacek and James Uhomoibhi}, + pages = {549--571}, + publisher = {Springer}, + series = {Communications in Computer and Information Science}, + timestamp = {Tue, 25 Jun 2019 19:08:13 +0200}, + title = {Intuitive Reasoning in Formalized Mathematics with Elfe}, + url = {https://doi.org/10.1007/978-3-030-21151-6\_26}, + volume = {1022}, + year = {2018}, + Bdsk-Url-1 = {https://doi.org/10.1007/978-3-030-21151-6%5C_26}} + +@article{DBLP:journals/fm/CoghettoG19, + author = {Roland Coghetto and Adam Grabowski}, + bibsource = {dblp computer science bibliography, https://dblp.org}, + biburl = {https://dblp.org/rec/journals/fm/CoghettoG19.bib}, + doi = {10.2478/forma-2019-0008}, + journal = {Formalized Mathematics}, + number = {1}, + pages = {75--85}, + timestamp = {Mon, 17 Jun 2019 17:02:00 +0200}, + title = {Tarski Geometry Axioms. Part {IV} - Right Angle}, + url = {https://doi.org/10.2478/forma-2019-0008}, + volume = {27}, + year = {2019}, + Bdsk-Url-1 = {https://doi.org/10.2478/forma-2019-0008}} + +@article{DBLP:journals/fm/CoghettoG17, + author = {Roland Coghetto and Adam Grabowski}, + bibsource = {dblp computer science bibliography, https://dblp.org}, + biburl = {https://dblp.org/rec/journals/fm/CoghettoG17.bib}, + doi = {10.1515/forma-2017-0028}, + journal = {Formalized Mathematics}, + number = {4}, + pages = {289--313}, + timestamp = {Sat, 19 Oct 2019 19:33:25 +0200}, + title = {Tarski Geometry Axioms. Part {III}}, + url = {https://doi.org/10.1515/forma-2017-0028}, + volume = {25}, + year = {2017}, + Bdsk-Url-1 = {https://doi.org/10.1515/forma-2017-0028}} + +@article{DBLP:journals/fm/CoghettoG16, + author = {Roland Coghetto and Adam Grabowski}, + bibsource = {dblp computer science bibliography, https://dblp.org}, + biburl = {https://dblp.org/rec/journals/fm/CoghettoG16.bib}, + doi = {10.1515/forma-2016-0012}, + journal = {Formalized Mathematics}, + number = {2}, + pages = {157--166}, + timestamp = {Sat, 19 Oct 2019 19:33:25 +0200}, + title = {Tarski Geometry Axioms - Part {II}}, + url = {https://doi.org/10.1515/forma-2016-0012}, + volume = {24}, + year = {2016}, + Bdsk-Url-1 = {https://doi.org/10.1515/forma-2016-0012}} + +@article{DBLP:journals/fm/RichterGA14, + author = {William Richter and Adam Grabowski and Jesse Alama}, + bibsource = {dblp computer science bibliography, https://dblp.org}, + biburl = {https://dblp.org/rec/journals/fm/RichterGA14.bib}, + journal = {Formalized Mathematics}, + number = {2}, + pages = {167--176}, + timestamp = {Thu, 19 Feb 2015 15:03:13 +0100}, + title = {Tarski Geometry Axioms}, + url = {http://www.degruyter.com/view/j/forma.2014.22.issue-2/forma-2014-0017/forma-2014-0017.xml}, + volume = {22}, + year = {2014}, + Bdsk-Url-1 = {http://www.degruyter.com/view/j/forma.2014.22.issue-2/forma-2014-0017/forma-2014-0017.xml}} + +@article{sutcliffe1998tptp, + author = {Sutcliffe, Geoff and Suttner, Christian}, + journal = {Journal of Automated Reasoning}, + number = {2}, + pages = {177--203}, + publisher = {Springer}, + title = {The TPTP problem library}, + volume = {21}, + year = {1998}} + +@book{martin2012foundations, + author = {Martin, George Edward}, + publisher = {Springer Science \& Business Media}, + title = {The foundations of geometry and the non-Euclidean plane}, + year = {2012}} + +@inproceedings{nipkow2002structured, + author = {Nipkow, Tobias}, + booktitle = {International Workshop on Types for Proofs and Programs}, + organization = {Springer}, + pages = {259--278}, + title = {Structured proofs in Isar/HOL}, + year = 
{2002}} + +@article{PoincareDisc, + author = {Simic, Danijela and Maric, Filip and Boutry, Pierre}, + doi = {10.1007/s10817-020-09551-2}, + journal = {J. Autom. Reasoning}, + number = {4}, + title = {Formalization of the Poincar{\'e} Disc Model of Hyperbolic Geometry}, + url = {https://doi.org/10.1007/s10817-020-09551-2}, + volume = {64}, + year = {2020}, + Bdsk-Url-1 = {https://doi.org/10.1007/s10817-020-09551-2}} + +@inproceedings{stojanovic2010coherent, + author = {Stojanovi{\'c}, Sana and Pavlovi{\'c}, Vesna and Jani{\v{c}}i{\'c}, Predrag}, + booktitle = {International Workshop on Automated Deduction in Geometry}, + organization = {Springer}, + pages = {201--220}, + title = {A coherent logic based geometry theorem prover capable of producing formal and readable proofs}, + year = {2010}} + +@article{beeson2017finding, + author = {Beeson, Michael and Wos, Larry}, + journal = {Journal of Automated Reasoning}, + number = {1}, + pages = {181--207}, + publisher = {Springer}, + title = {Finding proofs in Tarskian geometry}, + volume = {58}, + year = {2017}} + +@article{beeson2019proof, + author = {Beeson, Michael and Narboux, Julien and Wiedijk, Freek}, + journal = {Annals of Mathematics and Artificial Intelligence}, + number = {2}, + pages = {213--257}, + publisher = {Springer}, + title = {Proof-checking Euclid}, + volume = {85}, + year = {2019}} + +@article{narboux2018computer, + author = {Narboux, Julien and Janicic, Predrag and Fleuriot, Jacques}, + journal = {Handbook of Geometric Constraint Systems Principles}, + pages = {25--73}, + publisher = {Chapman and Hall/CRC}, + title = {Computer-assisted theorem proving in synthetic geometry}, + year = {2018}} + +@phdthesis{boutry2018formalization, + author = {Boutry, Pierre}, + school = {Universit{\'e} de Strasbourg}, + title = {On the formalization of foundations of geometry}, + year = {2018}} diff --git a/thys/IsaGeoCoq/document/root.tex b/thys/IsaGeoCoq/document/root.tex new file mode 100644 --- /dev/null +++ b/thys/IsaGeoCoq/document/root.tex @@ -0,0 +1,136 @@ +\documentclass[8pt,a4paper]{article} +\usepackage[T1]{fontenc} +\usepackage[margin=2cm]{geometry} +\usepackage{isabelle,isabellesym} + +% further packages required for unusual symbols (see also +% isabellesym.sty), use only when needed + +\usepackage{amssymb} + %for \, \, \, \, \, \, + %\, \, \, \, \, + %\, \, \ + +%\usepackage{eurosym} + %for \ + +%\usepackage[only,bigsqcap]{stmaryrd} + %for \ + +%\usepackage{eufrak} + %for \ ... \, \ ... \ (also included in amssymb) + +%\usepackage{textcomp} + %for \, \, \, \, \, + %\ + +% this should be the last package used +\usepackage{pdfsetup} + +% urls in roman style, theory text in math-similar italics +\urlstyle{rm} +\isabellestyle{it} + +% for uniform font size +%\renewcommand{\isastyle}{\isastyleminor} + +\usepackage{amsmath} + +\begin{document} + +\title{IsaGeoCoq: Partial porting of GeoCoq 2.4.0. Case studies: +Tarski's postulate of parallels implies the 5th postulate of Euclid, +the postulate of Playfair and +the original postulate of Euclid.} +\author{Roland Coghetto} +\maketitle + +\begin{abstract} +The GeoCoq library contains a formalization of geometry using +the Coq proof assistant. +It contains both proofs about the foundations of geometry +\cite{tarski,narboux:inria-00118812,boutry:hal-01483457,narboux:hal-01779452} +and high-level proofs in the same style as in high-school. +\cite{beeson:hal-01912024}(Code Repository https://github.com/GeoCoq/GeoCoq). 
+ +Some theorems, also inspired by \cite{tarski}, have also been formalized with other ITPs (Metamath, Mizar) or ATPs +\cite{sutcliffe1998tptp,dhurdjevic2015automated,beeson2014otter, +stojanovic2010coherent,beeson2017finding,beeson2019proof, +narboux2018computer,boutry2018formalization, +{DBLP:conf/csedu/DoreB18a}, +{DBLP:journals/fm/RichterGA14},{DBLP:journals/fm/CoghettoG16}, +{DBLP:journals/fm/CoghettoG17},{DBLP:journals/fm/CoghettoG19}}. + +We port a part of the GeoCoq 2.4.0 library to the Isabelle/HOL proof +assistant: more precisely, the files Chap02.v to Chap13$\_$3.v and suma.v, as +well as the associated definitions and some files useful for +the proofs of certain parallel postulates. + +While the demonstrations in Coq are written in a procedural style +\cite{wiedijk2012synthesis}, the transcription +is done in the declarative language Isar \cite{nipkow2002structured}. + +The synthetic approach of the demonstrations is directly inspired by +those contained in GeoCoq. +Some demonstrations are credited to G.E. Martin (<> in Ch11$\_$angles.thy, proved by Martin as Theorem 18.17 in +\cite{martin2012foundations}) or to +H.N. Gupta (the Krippen Lemma, proved by Gupta in his PhD thesis in 1965 as Theorem 3.45); +see \cite{gries:hal-01228612}. + +In this work, the proofs are not constructive; +the Sledgehammer tool was used to find some of the demonstrations. + +The names of the lemmas and theorems, as well as the definitions, are kept as far as possible. +A different translation has been proposed when a name was already used in +Isabelle/HOL ("Len" is translated as "TarskiLen") or when some characters were not +allowed in Isabelle/HOL ("anga'" in Ch13$\_$angles.v is translated as "angaP"). +For some definitions, the emphasis placed on a particular variable has changed the order or +the position of the variables (Midpoint, Out, Inter, ...). + +All the lemmas are valid in the absolute/neutral space defined by Tarski's axioms. + +It should be noted that T.J.M. Makarios \cite{Tarskis_Geometry-AFP} has +already begun proving some of these propositions, +mainly those corresponding to SST chapters 2 and 3. +He uses a definition that does not quite coincide with the one +used in GeoCoq and here. For example, Makarios includes +axiom A11 (the axiom of continuity) in the definition of the locale +"Tarski$\_$absolute$\_$space". + +Furthermore, the definition of the locale "TarskiAbsolute" \cite{PoincareDisc,Poincare_Disc-AFP} is +not identical to the one of the "Tarski$\_$neutral$\_$dimensionless" +class of GeoCoq: +the latter does not contain the axiom "upper$\_$dimension". In some particular cases +it is nevertheless necessary to use the axiom "upper$\_$dimension"; +the suffix "$\_$2D" in a file name indicates its presence. + +In the last part, it is formalized that, in neutral/absolute space, the parallel postulate of Tarski's system +implies Playfair's postulate, the 5th postulate of Euclid and the +original postulate of Euclid. +These proofs, which are not constructive, are directly inspired by +\cite{gries:hal-01228612,boutry:hal-01178236}.
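+ +Concretely, the last part culminates in the theorem +"tarski$\_$s$\_$euclid$\_$implies$\_$playfair$\_$s$\_$postulate", which assumes +"TarskiSParallelPostulate" and shows "PlayfairSPostulate"; the 5th postulate of Euclid is obtained +as the lemma "tarski$\_$s$\_$euclid$\_$implies$\_$euclid$\_$5", and the original postulate of Euclid +then follows via "euclid$\_$5$\_$$\_$original$\_$euclid" and +"tarski$\_$s$\_$implies$\_$euclid$\_$s$\_$parallel$\_$postulate".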
+ +\end{abstract} + +\tableofcontents + +% sane default for proof documents +\parindent 0pt\parskip 0.5ex + +\clearpage +% generated text of all theories +\input{session} + +% optional bibliography +\clearpage +\bibliographystyle{abbrv} +\bibliography{root} + +\end{document} + +%%% Local Variables: +%%% mode: latex +%%% TeX-master: t +%%% End: diff --git a/thys/ROOTS b/thys/ROOTS --- a/thys/ROOTS +++ b/thys/ROOTS @@ -1,581 +1,583 @@ ADS_Functor AI_Planning_Languages_Semantics AODV AVL-Trees AWN Abortable_Linearizable_Modules Abs_Int_ITP2012 Abstract-Hoare-Logics Abstract-Rewriting Abstract_Completeness Abstract_Soundness Adaptive_State_Counting Affine_Arithmetic Aggregation_Algebras Akra_Bazzi Algebraic_Numbers Algebraic_VCs Allen_Calculus Amicable_Numbers Amortized_Complexity AnselmGod Applicative_Lifting Approximation_Algorithms Architectural_Design_Patterns Aristotles_Assertoric_Syllogistic Arith_Prog_Rel_Primes ArrowImpossibilityGS Attack_Trees Auto2_HOL Auto2_Imperative_HOL AutoFocus-Stream Automated_Stateful_Protocol_Verification Automatic_Refinement AxiomaticCategoryTheory BDD BNF_CC BNF_Operations Banach_Steinhaus Bell_Numbers_Spivey Berlekamp_Zassenhaus Bernoulli Bertrands_Postulate Bicategory BinarySearchTree Binding_Syntax_Theory Binomial-Heaps Binomial-Queues BirdKMP +Blue_Eyes Bondy Boolean_Expression_Checkers Bounded_Deducibility_Security Buchi_Complementation Budan_Fourier Buffons_Needle Buildings BytecodeLogicJmlTypes C2KA_DistributedSystems CAVA_Automata CAVA_LTL_Modelchecker CCS CISC-Kernel CRDT CYK CakeML CakeML_Codegen Call_Arity Card_Equiv_Relations Card_Multisets Card_Number_Partitions Card_Partitions Cartan_FP Case_Labeling Catalan_Numbers Category Category2 Category3 Cauchy Cayley_Hamilton Certification_Monads Chandy_Lamport Chord_Segments Circus Clean ClockSynchInst Closest_Pair_Points CofGroups Coinductive Coinductive_Languages Collections Comparison_Sort_Lower_Bound Compiling-Exceptions-Correctly Complete_Non_Orders Completeness Complex_Geometry Complx ComponentDependencies ConcurrentGC ConcurrentIMP Concurrent_Ref_Alg Concurrent_Revisions Consensus_Refined Constructive_Cryptography Constructor_Funs Containers CoreC++ Core_DOM Core_SC_DOM Count_Complex_Roots CryptHOL CryptoBasedCompositionalProperties CSP_RefTK DFS_Framework DPT-SAT-Solver DataRefinementIBP Datatype_Order_Generator Decl_Sem_Fun_PL Decreasing-Diagrams Decreasing-Diagrams-II Deep_Learning Delta_System_Lemma Density_Compiler Dependent_SIFUM_Refinement Dependent_SIFUM_Type_Systems Depth-First-Search Derangements Deriving Descartes_Sign_Rule Dict_Construction Differential_Dynamic_Logic Differential_Game_Logic Dijkstra_Shortest_Path Diophantine_Eqns_Lin_Hom Dirichlet_L Dirichlet_Series DiscretePricing Discrete_Summation DiskPaxos DOM_Components DynamicArchitectures Dynamic_Tables E_Transcendental Echelon_Form EdmondsKarp_Maxflow Efficient-Mergesort Elliptic_Curves_Group_Law Encodability_Process_Calculi Epistemic_Logic Ergodic_Theory Error_Function Euler_MacLaurin Euler_Partition Example-Submission Extended_Finite_State_Machine_Inference Extended_Finite_State_Machines FFT FLP FOL-Fitting FOL_Harrison FOL_Seq_Calc1 Factored_Transition_System_Bounding Falling_Factorial_Sum Farkas FeatherweightJava Featherweight_OCL Fermat3_4 FileRefinement FinFun Finger-Trees Finite-Map-Extras Finite_Automata_HF First_Order_Terms First_Welfare_Theorem Fishburn_Impossibility Fisher_Yates Flow_Networks Floyd_Warshall Flyspeck-Tame FocusStreamsCaseStudies Forcing Formal_SSA Formula_Derivatives Fourier Free-Boolean-Algebra 
Free-Groups FunWithFunctions FunWithTilings Functional-Automata Functional_Ordered_Resolution_Prover Furstenberg_Topology GPU_Kernel_PL Gabow_SCC Game_Based_Crypto Gauss-Jordan-Elim-Fun Gauss_Jordan Gauss_Sums Gaussian_Integers GenClock General-Triangle Generalized_Counting_Sort Generic_Deriving Generic_Join GewirthPGCProof Girth_Chromatic GoedelGod Goedel_HFSet_Semantic Goedel_HFSet_Semanticless Goedel_Incompleteness Goodstein_Lambda GraphMarkingIBP Graph_Saturation Graph_Theory Green Groebner_Bases Groebner_Macaulay Gromov_Hyperbolicity Group-Ring-Module HOL-CSP HOLCF-Prelude HRB-Slicing Heard_Of Hello_World HereditarilyFinite Hermite Hidden_Markov_Models Higher_Order_Terms Hoare_Time Hood_Melville_Queue HotelKeyCards Huffman Hybrid_Logic Hybrid_Multi_Lane_Spatial_Logic Hybrid_Systems_VCs HyperCTL IEEE_Floating_Point IMAP-CRDT IMO2019 IMP2 IMP2_Binary_Heap IP_Addresses Imperative_Insertion_Sort Impossible_Geometry Incompleteness Incredible_Proof_Machine Inductive_Confidentiality Inductive_Inference InfPathElimination InformationFlowSlicing InformationFlowSlicing_Inter Integration Interpreter_Optimizations Interval_Arithmetic_Word32 Iptables_Semantics Irrational_Series_Erdos_Straus Irrationality_J_Hancl Isabelle_C Isabelle_Marries_Dirac Isabelle_Meta_Model +IsaGeoCoq Jacobson_Basic_Algebra Jinja JinjaDCI JinjaThreads JiveDataStoreModel Jordan_Hoelder Jordan_Normal_Form KAD KAT_and_DRA KBPs KD_Tree Key_Agreement_Strong_Adversaries Kleene_Algebra Knuth_Bendix_Order Knot_Theory Knuth_Bendix_Order Knuth_Morris_Pratt Koenigsberg_Friendship Kruskal Kuratowski_Closure_Complement LLL_Basis_Reduction LLL_Factorization LOFT LTL LTL_Master_Theorem LTL_Normal_Form LTL_to_DRA LTL_to_GBA Lam-ml-Normalization LambdaAuth LambdaMu Lambda_Free_EPO Lambda_Free_KBOs Lambda_Free_RPOs Lambert_W Landau_Symbols Laplace_Transform Latin_Square LatticeProperties Launchbury Lazy-Lists-II Lazy_Case Lehmer Lifting_Definition_Option LightweightJava LinearQuantifierElim Linear_Inequalities Linear_Programming Linear_Recurrences Liouville_Numbers List-Index List-Infinite List_Interleaving List_Inversions List_Update LocalLexing Localization_Ring Locally-Nameless-Sigma Lowe_Ontological_Argument Lower_Semicontinuous Lp Lucas_Theorem MFMC_Countable MFODL_Monitor_Optimized MFOTL_Monitor MSO_Regex_Equivalence Markov_Models Marriage Mason_Stothers Matrices_for_ODEs Matrix Matrix_Tensor Matroids Max-Card-Matching Median_Of_Medians_Selection Menger Mersenne_Primes MiniML Minimal_SSA Minkowskis_Theorem Minsky_Machines Modal_Logics_for_NTS Modular_Assembly_Kit_Security Monad_Memo_DP Monad_Normalisation MonoBoolTranAlgebra MonoidalCategory Monomorphic_Monad MuchAdoAboutTwo Multi_Party_Computation Multirelations Myhill-Nerode Name_Carrying_Type_Inference Nash_Williams Nat-Interval-Logic Native_Word Nested_Multisets_Ordinals Network_Security_Policy_Verification Neumann_Morgenstern_Utility No_FTL_observers Nominal2 Noninterference_CSP Noninterference_Concurrent_Composition Noninterference_Generic_Unwinding Noninterference_Inductive_Unwinding Noninterference_Ipurge_Unwinding Noninterference_Sequential_Composition NormByEval Nullstellensatz Octonions OpSets Open_Induction Optics Optimal_BST Orbit_Stabiliser Order_Lattice_Props Ordered_Resolution_Prover Ordinal Ordinal_Partitions Ordinals_and_Cardinals Ordinary_Differential_Equations PAC_Checker PCF PLM POPLmark-deBruijn PSemigroupsConvolution Pairing_Heap Paraconsistency Parity_Game Partial_Function_MR Partial_Order_Reduction Password_Authentication_Protocol Pell Perfect-Number-Thm 
Perron_Frobenius Physical_Quantities Pi_Calculus Pi_Transcendental Planarity_Certificates Poincare_Bendixson Poincare_Disc Polynomial_Factorization Polynomial_Interpolation Polynomials Pop_Refinement Posix-Lexing Possibilistic_Noninterference Power_Sum_Polynomials Pratt_Certificate Presburger-Automata Prim_Dijkstra_Simple Prime_Distribution_Elementary Prime_Harmonic_Series Prime_Number_Theorem Priority_Queue_Braun Priority_Search_Trees Probabilistic_Noninterference Probabilistic_Prime_Tests Probabilistic_System_Zoo Probabilistic_Timed_Automata Probabilistic_While Program-Conflict-Analysis Projective_Geometry Promela Proof_Strategy_Language PropResPI Propositional_Proof_Systems Prpu_Maxflow PseudoHoops Psi_Calculi Ptolemys_Theorem QHLProver QR_Decomposition Quantales Quaternions Quick_Sort_Cost RIPEMD-160-SPARK ROBDD RSAPSS Ramsey-Infinite Random_BSTs Random_Graph_Subgraph_Threshold Randomised_BSTs Randomised_Social_Choice Rank_Nullity_Theorem Real_Impl Recursion-Addition Recursion-Theory-I Refine_Imperative_HOL Refine_Monadic RefinementReactive Regex_Equivalence Regular-Sets Regular_Algebras Relation_Algebra Relational-Incorrectness-Logic Relational_Disjoint_Set_Forests Relational_Method Relational_Minimum_Spanning_Trees Relational_Paths Rep_Fin_Groups Residuated_Lattices Resolution_FOL Rewriting_Z Ribbon_Proofs Robbins-Conjecture Robinson_Arithmetic Root_Balanced_Tree Routing Roy_Floyd_Warshall SATSolverVerification SC_DOM_Components SDS_Impossibility SIFPL SIFUM_Type_Systems SPARCv8 Safe_Distance Safe_OCL Saturation_Framework Saturation_Framework_Extensions Shadow_DOM Secondary_Sylow Security_Protocol_Refinement Selection_Heap_Sort SenSocialChoice Separata Separation_Algebra Separation_Logic_Imperative_HOL SequentInvertibility Shadow_SC_DOM Shivers-CFA ShortestPath Show Sigma_Commit_Crypto Signature_Groebner Simpl Simple_Firewall Simplex Skew_Heap Skip_Lists Slicing Sliding_Window_Algorithm Smith_Normal_Form Smooth_Manifolds Sort_Encodings Source_Coding_Theorem Special_Function_Bounds Splay_Tree Sqrt_Babylonian Stable_Matching Statecharts Stateful_Protocol_Composition_and_Typing Stellar_Quorums Stern_Brocot Stewart_Apollonius Stirling_Formula Stochastic_Matrices Stone_Algebras Stone_Kleene_Relation_Algebras Stone_Relation_Algebras Store_Buffer_Reduction Stream-Fusion Stream_Fusion_Code Strong_Security Sturm_Sequences Sturm_Tarski Stuttering_Equivalence Subresultants Subset_Boolean_Algebras SumSquares SuperCalc Surprise_Paradox Symmetric_Polynomials Syntax_Independent_Logic Szpilrajn TESL_Language TLA Tail_Recursive_Functions Tarskis_Geometry Taylor_Models Timed_Automata Topological_Semantics Topology TortoiseHare Transcendence_Series_Hancl_Rucki Transformer_Semantics Transition_Systems_and_Automata Transitive-Closure Transitive-Closure-II Treaps Tree-Automata Tree_Decomposition Triangle Trie Twelvefold_Way Tycon Types_Tableaus_and_Goedels_God UPF UPF_Firewall UTP Universal_Turing_Machine UpDown_Scheme Valuation VectorSpace VeriComp Verified-Prover Verified_SAT_Based_AI_Planning VerifyThis2018 VerifyThis2019 Vickrey_Clarke_Groves VolpanoSmith WHATandWHERE_Security WOOT_Strong_Eventual_Consistency WebAssembly Weight_Balanced_Trees Well_Quasi_Orders Winding_Number_Eval Word_Lib WorkerWrapper XML ZFC_in_HOL Zeta_3_Irrational Zeta_Function pGCL diff --git a/web/entries/Blue_Eyes.html b/web/entries/Blue_Eyes.html new file mode 100644 --- /dev/null +++ b/web/entries/Blue_Eyes.html @@ -0,0 +1,195 @@ + + + + +Solution to the xkcd Blue Eyes puzzle - Archive of Formal Proofs + + + + + + + + + + + + 
Solution to the xkcd Blue Eyes puzzle
Title:Solution to the xkcd Blue Eyes puzzle
+ Author: + + Jakub Kądziołka (kuba /at/ kadziolka /dot/ net) +
Submission date:2021-01-30
Abstract: +In a puzzle published by +Randall Munroe, perfect logicians forbidden +from communicating are stranded on an island, and may only leave once +they have figured out their own eye color. We present a method of +modeling the behavior of perfect logicians and formalize a solution of +the puzzle.
BibTeX: +
@article{Blue_Eyes-AFP,
+  author  = {Jakub Kądziołka},
+  title   = {Solution to the xkcd Blue Eyes puzzle},
+  journal = {Archive of Formal Proofs},
+  month   = jan,
+  year    = 2021,
+  note    = {\url{https://isa-afp.org/entries/Blue_Eyes.html},
+            Formal proof development},
+  ISSN    = {2150-914x},
+}
+
License:BSD License
+ + + + + + \ No newline at end of file diff --git a/web/entries/IsaGeoCoq.html b/web/entries/IsaGeoCoq.html new file mode 100644 --- /dev/null +++ b/web/entries/IsaGeoCoq.html @@ -0,0 +1,239 @@ + + + + +Tarski's Parallel Postulate implies the 5th Postulate of Euclid, the Postulate of Playfair and the original Parallel Postulate of Euclid - Archive of Formal Proofs + + + + + + + + + + + + + + + + + + + + + + + + +
Tarski's Parallel Postulate implies the 5th Postulate of Euclid, the Postulate of Playfair and the original Parallel Postulate of Euclid
Title:Tarski's Parallel Postulate implies the 5th Postulate of Euclid, the Postulate of Playfair and the original Parallel Postulate of Euclid
+ Author: + + Roland Coghetto (roland_coghetto /at/ hotmail /dot/ com) +
Submission date:2021-01-31
Abstract: +

The GeoCoq library contains a formalization +of geometry using the Coq proof assistant. It contains both proofs +about the foundations of geometry and high-level proofs in the same +style as in high school. We port a part of the GeoCoq +2.4.0 library to Isabelle/HOL: more precisely, +the files Chap02.v to Chap13_3.v and suma.v, as well as the associated +definitions and some files useful for the proofs of certain +parallel postulates. The synthetic approach of the proofs is directly +inspired by those contained in GeoCoq. The names of the lemmas and +theorems, as well as the definitions, are kept as far as possible.

+

It should be noted that T.J.M. Makarios has already done +some proofs in Tarski's Geometry. He uses a definition that does not quite +coincide with the definition used in GeoCoq and here. +Furthermore, the corresponding definitions in the Poincaré Disc Model +development are not identical to those defined in GeoCoq.

+

In the last part, it is +formalized that, in neutral/absolute space, the axiom of the +parallels of Tarski's system implies the Playfair axiom, the 5th +postulate of Euclid and Euclid's original parallel postulate. These +proofs, which are not constructive, are directly inspired by the work of Pierre +Boutry, Charly Gries, Julien Narboux and Pascal Schreck.

BibTeX: +
@article{IsaGeoCoq-AFP,
+  author  = {Roland Coghetto},
+  title   = {Tarski's Parallel Postulate implies the 5th Postulate of Euclid, the Postulate of Playfair and the original Parallel Postulate of Euclid},
+  journal = {Archive of Formal Proofs},
+  month   = jan,
+  year    = 2021,
+  note    = {\url{https://isa-afp.org/entries/IsaGeoCoq.html},
+            Formal proof development},
+  ISSN    = {2150-914x},
+}
+
License:GNU Lesser General Public License (LGPL)
+ + + + + + \ No newline at end of file diff --git a/web/entries/Optics.html b/web/entries/Optics.html --- a/web/entries/Optics.html +++ b/web/entries/Optics.html @@ -1,221 +1,227 @@ Optics - Archive of Formal Proofs

 

 

 

 

 

 

Optics

 

+(revision 44e2e5c) +[2021-01-27] +Addition of new theorems throughout, particularly for prisms. +New "chantype" command allows the definition of an algebraic datatype with generated prisms. +New "dataspace" command allows the definition of a locale-based state space, including lenses and prisms. +Addition of various examples for the above. +(revision 89cf045a)
Title: Optics
Authors: Simon Foster and Frank Zeyda
Submission date: 2017-05-25
Abstract: Lenses provide an abstract interface for manipulating data types through spatially-separated views. They are defined abstractly in terms of two functions, get, which returns a value from the source type, and put, which updates the value. We mechanise the underlying theory of lenses, in terms of an algebraic hierarchy of lenses, including well-behaved and very well-behaved lenses, each lens class being characterised by a set of lens laws. We also mechanise a lens algebra in Isabelle that enables their composition and comparison, so as to allow construction of complex lenses. This is accompanied by a large library of algebraic laws. Moreover, we also show how the lens classes can be applied by instantiating them with a number of Isabelle data types. (A brief illustrative sketch of the get/put interface follows this entry.)
Change history: [2020-03-02]: Added partial bijective and symmetric lenses. Improved alphabet command generating additional lenses and results. Several additional lens relations, including observational equivalence. Additional theorems throughout. Adaptations for Isabelle 2020. -(revision 44e2e5c)
BibTeX:
@article{Optics-AFP,
   author  = {Simon Foster and Frank Zeyda},
   title   = {Optics},
   journal = {Archive of Formal Proofs},
   month   = may,
   year    = 2017,
   note    = {\url{https://isa-afp.org/entries/Optics.html},
             Formal proof development},
   ISSN    = {2150-914x},
 }
License: BSD License
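The abstract above characterises a lens by a get and a put function subject to lens laws. The following is a minimal sketch in Haskell of that interface and of lens composition, assuming only standard Prelude functions; it is not the Isabelle/HOL development of this entry, and Lens, fstLens and composeLens are invented names used purely for illustration.

-- A lens bundles a 'get' and a 'put', in the spirit of the abstract above.
data Lens s v = Lens { get :: s -> v, put :: s -> v -> s }

-- The lens laws mentioned in the abstract, stated informally:
--   get (put s v)    == v          (PutGet)
--   put s (get s)    == s          (GetPut)
--   put (put s v) v' == put s v'   (PutPut)

-- A lens focusing on the first component of a pair.
fstLens :: Lens (a, b) a
fstLens = Lens { get = fst, put = \(_, b) a -> (a, b) }

-- Composition of lenses, allowing complex lenses to be built from simple ones.
composeLens :: Lens s v -> Lens v w -> Lens s w
composeLens outer inner = Lens
  { get = get inner . get outer
  , put = \s w -> put outer s (put inner (get outer s) w)
  }

main :: IO ()
main = do
  print (get fstLens ((1 :: Int), "x"))      -- prints 1
  print (put fstLens ((1 :: Int), "x") 42)   -- prints (42,"x")

In the AFP entry itself, this algebraic hierarchy of lenses and its laws are mechanised in Isabelle/HOL rather than Haskell, together with a large library of algebraic laws.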

\ No newline at end of file diff --git a/web/index.html b/web/index.html --- a/web/index.html +++ b/web/index.html @@ -1,5316 +1,5332 @@ Archive of Formal Proofs

 

 

 

 

 

 

Archive of Formal Proofs

 

The Archive of Formal Proofs is a collection of proof libraries, examples, and larger scientific developments, mechanically checked in the theorem prover Isabelle. It is organized in the way of a scientific journal, is indexed by dblp and has an ISSN: 2150-914x. Submissions are refereed. The preferred citation style is available [here]. We encourage companion AFP submissions to conference and journal publications.

A development version of the archive is available as well.

 

 

+ + + + + +
2021
+ 2021-01-31: Tarski's Parallel Postulate implies the 5th Postulate of Euclid, the Postulate of Playfair and the original Parallel Postulate of Euclid +
+ Author: + Roland Coghetto +
+ 2021-01-30: Solution to the xkcd Blue Eyes puzzle +
+ Author: + Jakub Kądziołka +
2021-01-18: Hood-Melville Queue
Author: Alejandro Gómez-Londoño
2021-01-11: JinjaDCI: a Java semantics with dynamic class initialization
Author: Susannah Mansky

 

2020
2020-12-27: Cofinality and the Delta System Lemma
Author: Pedro Sánchez Terraf
2020-12-17: Topological semantics for paraconsistent and paracomplete logics
Author: David Fuenmayor
2020-12-08: Relational Minimum Spanning Tree Algorithms
Authors: Walter Guttmann and Nicolas Robinson-O'Brien
2020-12-07: Inline Caching and Unboxing Optimization for Interpreters
Author: Martin Desharnais
2020-12-05: The Relational Method with Message Anonymity for the Verification of Cryptographic Protocols
Author: Pasquale Noce
2020-11-22: Isabelle Marries Dirac: a Library for Quantum Computation and Quantum Information
Authors: Anthony Bordg, Hanna Lachnitt and Yijun He
2020-11-19: The HOL-CSP Refinement Toolkit
Authors: Safouan Taha, Burkhart Wolff and Lina Ye
2020-10-29: Verified SAT-Based AI Planning
Authors: Mohammad Abdulaziz and Friedrich Kurz
2020-10-29: AI Planning Languages Semantics
Authors: Mohammad Abdulaziz and Peter Lammich
2020-10-20: A Sound Type System for Physical Quantities, Units, and Measurements
Authors: Simon Foster and Burkhart Wolff
2020-10-12: Finite Map Extras
Author: Javier Díaz
2020-09-28: A Formal Model of the Safely Composable Document Object Model with Shadow Roots
Authors: Achim D. Brucker and Michael Herzberg
2020-09-28: A Formal Model of the Document Object Model with Shadow Roots
Authors: Achim D. Brucker and Michael Herzberg
2020-09-28: A Formalization of Safely Composable Web Components
Authors: Achim D. Brucker and Michael Herzberg
2020-09-28: A Formalization of Web Components
Authors: Achim D. Brucker and Michael Herzberg
2020-09-28: The Safely Composable DOM
Authors: Achim D. Brucker and Michael Herzberg
2020-09-16: Syntax-Independent Logic Infrastructure
Authors: Andrei Popescu and Dmitriy Traytel
2020-09-16: Robinson Arithmetic
Authors: Andrei Popescu and Dmitriy Traytel
2020-09-16: An Abstract Formalization of Gödel's Incompleteness Theorems
Authors: Andrei Popescu and Dmitriy Traytel
2020-09-16: From Abstract to Concrete Gödel's Incompleteness Theorems—Part II
Authors: Andrei Popescu and Dmitriy Traytel
2020-09-16: From Abstract to Concrete Gödel's Incompleteness Theorems—Part I
Authors: Andrei Popescu and Dmitriy Traytel
2020-09-07: A Formal Model of Extended Finite State Machines
Authors: Michael Foster, Achim D. Brucker, Ramsay G. Taylor and John Derrick
2020-09-07: Inference of Extended Finite State Machines
Authors: Michael Foster, Achim D. Brucker, Ramsay G. Taylor and John Derrick
2020-08-31: Practical Algebraic Calculus Checker
Authors: Mathias Fleury and Daniela Kaufmann
2020-08-31: Some classical results in inductive inference of recursive functions
Author: Frank J. Balbach
2020-08-26: Relational Disjoint-Set Forests
Author: Walter Guttmann
2020-08-25: Extensions to the Comprehensive Framework for Saturation Theorem Proving
Authors: Jasmin Blanchette and Sophie Tourret
2020-08-25: Putting the `K' into Bird's derivation of Knuth-Morris-Pratt string matching
Author: Peter Gammie
2020-08-04: Amicable Numbers
Author: Angeliki Koutsoukou-Argyraki
2020-08-03: Ordinal Partitions
Author: Lawrence C. Paulson
2020-07-21: A Formal Proof of The Chandy--Lamport Distributed Snapshot Algorithm
Authors: Ben Fiedler and Dmitriy Traytel
2020-07-13: Relational Characterisations of Paths
Authors: Walter Guttmann and Peter Höfner
2020-06-01: A Formally Verified Checker of the Safe Distance Traffic Rules for Autonomous Vehicles
Authors: Albert Rizaldi and Fabian Immler
2020-05-23: A verified algorithm for computing the Smith normal form of a matrix
Author: Jose Divasón
2020-05-16: The Nash-Williams Partition Theorem
Author: Lawrence C. Paulson
2020-05-13: A Formalization of Knuth–Bendix Orders
Authors: Christian Sternagel and René Thiemann
2020-05-12: Irrationality Criteria for Series by Erdős and Straus
Authors: Angeliki Koutsoukou-Argyraki and Wenda Li
2020-05-11: Recursion Theorem in ZF
Author: Georgy Dunaev
2020-05-08: An Efficient Normalisation Procedure for Linear Temporal Logic: Isabelle/HOL Formalisation
Author: Salomon Sickert
2020-05-06: Formalization of Forcing in Isabelle/ZF
Authors: Emmanuel Gunther, Miguel Pagano and Pedro Sánchez Terraf
2020-05-02: Banach-Steinhaus Theorem
Authors: Dominique Unruh and Jose Manuel Rodriguez Caballero
2020-04-27: Attack Trees in Isabelle for GDPR compliance of IoT healthcare systems
Author: Florian Kammueller
2020-04-24: Power Sum Polynomials
Author: Manuel Eberl
2020-04-24: The Lambert W Function on the Reals
Author: Manuel Eberl
2020-04-24: Gaussian Integers
Author: Manuel Eberl
2020-04-19: Matrices for ODEs
Author: Jonathan Julian Huerta y Munive
2020-04-16: Authenticated Data Structures As Functors
Authors: Andreas Lochbihler and Ognjen Marić
2020-04-10: Formalization of an Algorithm for Greedily Computing Associative Aggregations on Sliding Windows
Authors: Lukas Heimes, Dmitriy Traytel and Joshua Schneider
2020-04-09: A Comprehensive Framework for Saturation Theorem Proving
Author: Sophie Tourret
2020-04-09: Formalization of an Optimized Monitoring Algorithm for Metric First-Order Dynamic Logic with Aggregations
Authors: Thibault Dardinier, Lukas Heimes, Martin Raszyk, Joshua Schneider and Dmitriy Traytel
2020-04-08: Stateful Protocol Composition and Typing
Authors: Andreas V. Hess, Sebastian Mödersheim and Achim D. Brucker
2020-04-08: Automated Stateful Protocol Verification
Authors: Andreas V. Hess, Sebastian Mödersheim, Achim D. Brucker and Anders Schlichtkrull
2020-04-07: Lucas's Theorem
Author: Chelsea Edmonds
2020-03-25: Strong Eventual Consistency of the Collaborative Editing Framework WOOT
Authors: Emin Karayel and Edgar Gonzàlez
2020-03-22: Furstenberg's topology and his proof of the infinitude of primes
Author: Manuel Eberl
2020-03-12: An Under-Approximate Relational Logic
Author: Toby Murray
2020-03-07: Hello World
Authors: Cornelius Diekmann and Lars Hupel
2020-02-21: Implementing the Goodstein Function in λ-Calculus
Author: Bertram Felgenhauer
2020-02-10: A Generic Framework for Verified Compilers
Author: Martin Desharnais
2020-02-01: Arithmetic progressions and relative primes
Author: José Manuel Rodríguez Caballero
2020-01-31: A Hierarchy of Algebras for Boolean Subsets
Authors: Walter Guttmann and Bernhard Möller
2020-01-17: Mersenne primes and the Lucas–Lehmer test
Author: Manuel Eberl
2020-01-16: Verified Approximation Algorithms
Authors: Robin Eßmann, Tobias Nipkow and Simon Robillard
2020-01-13: Closest Pair of Points Algorithms
Authors: Martin Rau and Tobias Nipkow
2020-01-09: Skip Lists
Authors: Max W. Haslbeck and Manuel Eberl
2020-01-06: Bicategories
Author: Eugene W. Stark

 

2019
2019-12-27: The Irrationality of ζ(3)
Author: Manuel Eberl
2019-12-20: Formalizing a Seligman-Style Tableau System for Hybrid Logic
Author: Asta Halkjær From
2019-12-18: The Poincaré-Bendixson Theorem
Authors: Fabian Immler and Yong Kiam Tan
2019-12-16: Poincaré Disc Model
Authors: Danijela Simić, Filip Marić and Pierre Boutry
2019-12-16: Complex Geometry
Authors: Filip Marić and Danijela Simić
2019-12-10: Gauss Sums and the Pólya–Vinogradov Inequality
Authors: Rodrigo Raya and Manuel Eberl
2019-12-04: An Efficient Generalization of Counting Sort for Large, possibly Infinite Key Ranges
Author: Pasquale Noce
2019-11-27: Interval Arithmetic on 32-bit Words
Author: Brandon Bohrer
2019-10-24: Zermelo Fraenkel Set Theory in Higher-Order Logic
Author: Lawrence C. Paulson
2019-10-22: Isabelle/C
Authors: Frédéric Tuong and Burkhart Wolff
2019-10-16: VerifyThis 2019 -- Polished Isabelle Solutions
Authors: Peter Lammich and Simon Wimmer
2019-10-08: Aristotle's Assertoric Syllogistic
Author: Angeliki Koutsoukou-Argyraki
2019-10-07: Sigma Protocols and Commitment Schemes
Authors: David Butler and Andreas Lochbihler
2019-10-04: Clean - An Abstract Imperative Programming Language and its Theory
Authors: Frédéric Tuong and Burkhart Wolff
2019-09-16: Formalization of Multiway-Join Algorithms
Author: Thibault Dardinier
2019-09-10: Verification Components for Hybrid Systems
Author: Jonathan Julian Huerta y Munive
2019-09-06: Fourier Series
Author: Lawrence C Paulson
2019-08-30: A Case Study in Basic Algebra
Author: Clemens Ballarin
2019-08-16: Formalisation of an Adaptive State Counting Algorithm
Author: Robert Sachtleben
2019-08-14: Laplace Transform
Author: Fabian Immler
2019-08-06: Linear Programming
Authors: Julian Parsert and Cezary Kaliszyk
2019-08-06: Communicating Concurrent Kleene Algebra for Distributed Systems Specification
Authors: Maxime Buyse and Jason Jaskolka
2019-08-05: Selected Problems from the International Mathematical Olympiad 2019
Author: Manuel Eberl
2019-08-01: Stellar Quorum Systems
Author: Giuliano Losa
2019-07-30: A Formal Development of a Polychronous Polytimed Coordination Language
Authors: Hai Nguyen Van, Frédéric Boulanger and Burkhart Wolff
2019-07-27: Szpilrajn Extension Theorem
Author: Peter Zeller
2019-07-18: A Sequent Calculus for First-Order Logic
Author: Asta Halkjær From
2019-07-08: A Verified Code Generator from Isabelle/HOL to CakeML
Author: Lars Hupel
2019-07-04: Formalization of a Monitoring Algorithm for Metric First-Order Temporal Logic
Authors: Joshua Schneider and Dmitriy Traytel
2019-06-27: Complete Non-Orders and Fixed Points
Authors: Akihisa Yamada and Jérémy Dubut
2019-06-25: Priority Search Trees
Authors: Peter Lammich and Tobias Nipkow
2019-06-25: Purely Functional, Simple, and Efficient Implementation of Prim and Dijkstra
Authors: Peter Lammich and Tobias Nipkow
2019-06-21: Linear Inequalities
Authors: Ralph Bottesch, Alban Reynaud and René Thiemann
2019-06-16: Hilbert's Nullstellensatz
Author: Alexander Maletzky
2019-06-15: Gröbner Bases, Macaulay Matrices and Dubé's Degree Bounds
Author: Alexander Maletzky
2019-06-13: Binary Heaps for IMP2
Author: Simon Griebel
2019-06-03: Differential Game Logic
Author: André Platzer
2019-05-30: Multidimensional Binary Search Trees
Author: Martin Rau
2019-05-14: Formalization of Generic Authenticated Data Structures
Authors: Matthias Brun and Dmitriy Traytel
2019-05-09: Multi-Party Computation
Authors: David Aspinall and David Butler
2019-04-26: HOL-CSP Version 2.0
Authors: Safouan Taha, Lina Ye and Burkhart Wolff
2019-04-16: A Compositional and Unified Translation of LTL into ω-Automata
Authors: Benedikt Seidl and Salomon Sickert
2019-04-06: A General Theory of Syntax with Bindings
Authors: Lorenzo Gheri and Andrei Popescu
2019-03-27: The Transcendence of Certain Infinite Series
Authors: Angeliki Koutsoukou-Argyraki and Wenda Li
2019-03-24: Quantum Hoare Logic
Authors: Junyi Liu, Bohua Zhan, Shuling Wang, Shenggang Ying, Tao Liu, Yangjia Li, Mingsheng Ying and Naijun Zhan
2019-03-09: Safe OCL
Author: Denis Nikiforov
2019-02-21: Elementary Facts About the Distribution of Primes
Author: Manuel Eberl
2019-02-14: Kruskal's Algorithm for Minimum Spanning Forest
Authors: Maximilian P.L. Haslbeck, Peter Lammich and Julian Biendarra
2019-02-11: Probabilistic Primality Testing
Authors: Daniel Stüwe and Manuel Eberl
2019-02-08: Universal Turing Machine
Authors: Jian Xu, Xingyuan Zhang, Christian Urban and Sebastiaan J. C. Joosten
2019-02-01: Isabelle/UTP: Mechanised Theory Engineering for Unifying Theories of Programming
Authors: Simon Foster, Frank Zeyda, Yakoub Nemouchi, Pedro Ribeiro and Burkhart Wolff
2019-02-01: The Inversions of a List
Author: Manuel Eberl
2019-01-17: Farkas' Lemma and Motzkin's Transposition Theorem
Authors: Ralph Bottesch, Max W. Haslbeck and René Thiemann
2019-01-15: IMP2 – Simple Program Verification in Isabelle/HOL
Authors: Peter Lammich and Simon Wimmer
2019-01-15: An Algebra for Higher-Order Terms
Author: Lars Hupel
2019-01-07: A Reduction Theorem for Store Buffers
Authors: Ernie Cohen and Norbert Schirmer

 

2018
2018-12-26: A Formal Model of the Document Object Model
Authors: Achim D. Brucker and Michael Herzberg
2018-12-25: Formalization of Concurrent Revisions
Author: Roy Overbeek
2018-12-21: Verifying Imperative Programs using Auto2
Author: Bohua Zhan
2018-12-17: Constructive Cryptography in HOL
Authors: Andreas Lochbihler and S. Reza Sefidgar
2018-12-11: Transformer Semantics
Author: Georg Struth
2018-12-11: Quantales
Author: Georg Struth
2018-12-11: Properties of Orderings and Lattices
Author: Georg Struth
2018-11-23: Graph Saturation
Author: Sebastiaan J. C. Joosten
2018-11-23: A Verified Functional Implementation of Bachmair and Ganzinger's Ordered Resolution Prover
Authors: Anders Schlichtkrull, Jasmin Christian Blanchette and Dmitriy Traytel
2018-11-20: Auto2 Prover
Author: Bohua Zhan
2018-11-16: Matroids
Author: Jonas Keinholz
2018-11-06: Deriving generic class instances for datatypes
Authors: Jonas Rädle and Lars Hupel
2018-10-30: Formalisation and Evaluation of Alan Gewirth's Proof for the Principle of Generic Consistency in Isabelle/HOL
Authors: David Fuenmayor and Christoph Benzmüller
2018-10-29: Epistemic Logic
Author: Asta Halkjær From
2018-10-22: Smooth Manifolds
Authors: Fabian Immler and Bohua Zhan
2018-10-19: Randomised Binary Search Trees
Author: Manuel Eberl
2018-10-19: Formalization of the Embedding Path Order for Lambda-Free Higher-Order Terms
Author: Alexander Bentkamp
2018-10-12: Upper Bounding Diameters of State Spaces of Factored Transition Systems
Authors: Friedrich Kurz and Mohammad Abdulaziz
2018-09-28: The Transcendence of π
Author: Manuel Eberl
2018-09-25: Symmetric Polynomials
Author: Manuel Eberl
2018-09-20: Signature-Based Gröbner Basis Algorithms
Author: Alexander Maletzky
2018-09-19: The Prime Number Theorem
Authors: Manuel Eberl and Lawrence C. Paulson
2018-09-15: Aggregation Algebras
Author: Walter Guttmann
2018-09-14: Octonions
Author: Angeliki Koutsoukou-Argyraki
2018-09-05: Quaternions
Author: Lawrence C. Paulson
2018-09-02: The Budan-Fourier Theorem and Counting Real Roots with Multiplicity
Author: Wenda Li
2018-08-24: An Incremental Simplex Algorithm with Unsatisfiable Core Generation
Authors: Filip Marić, Mirko Spasić and René Thiemann
2018-08-14: Minsky Machines
Author: Bertram Felgenhauer
2018-07-16: Pricing in discrete financial models
Author: Mnacho Echenim
2018-07-04: Von-Neumann-Morgenstern Utility Theorem
Authors: Julian Parsert and Cezary Kaliszyk
2018-06-23: Pell's Equation
Author: Manuel Eberl
2018-06-14: Projective Geometry
Author: Anthony Bordg
2018-06-14: The Localization of a Commutative Ring
Author: Anthony Bordg
2018-06-05: Partial Order Reduction
Author: Julian Brunner
2018-05-27: Optimal Binary Search Trees
Authors: Tobias Nipkow and Dániel Somogyi
2018-05-25: Hidden Markov Models
Author: Simon Wimmer
2018-05-24: Probabilistic Timed Automata
Authors: Simon Wimmer and Johannes Hölzl
2018-05-23: Irrational Rapidly Convergent Series
Authors: Angeliki Koutsoukou-Argyraki and Wenda Li
2018-05-23: Axiom Systems for Category Theory in Free Logic
Authors: Christoph Benzmüller and Dana Scott
2018-05-22: Monadification, Memoization and Dynamic Programming
Authors: Simon Wimmer, Shuwei Hu and Tobias Nipkow
2018-05-10: OpSets: Sequential Specifications for Replicated Datatypes
Authors: Martin Kleppmann, Victor B. F. Gomes, Dominic P. Mulligan and Alastair R. Beresford
2018-05-07: An Isabelle/HOL Formalization of the Modular Assembly Kit for Security Properties
Authors: Oliver Bračevac, Richard Gay, Sylvia Grewe, Heiko Mantel, Henning Sudbrock and Markus Tasch
2018-04-29: WebAssembly
Author: Conrad Watt
2018-04-27: VerifyThis 2018 - Polished Isabelle Solutions
Authors: Peter Lammich and Simon Wimmer
2018-04-24: Bounded Natural Functors with Covariance and Contravariance
Authors: Andreas Lochbihler and Joshua Schneider
2018-03-22: The Incompatibility of Fishburn-Strategyproofness and Pareto-Efficiency
Authors: Felix Brandt, Manuel Eberl, Christian Saile and Christian Stricker
2018-03-13: Weight-Balanced Trees
Authors: Tobias Nipkow and Stefan Dirix
2018-03-12: CakeML
Authors: Lars Hupel and Yu Zhang
2018-03-01: A Theory of Architectural Design Patterns
Author: Diego Marmsoler
2018-02-26: Hoare Logics for Time Bounds
Authors: Maximilian P. L. Haslbeck and Tobias Nipkow
2018-02-06: Treaps
Authors: Maximilian Haslbeck, Manuel Eberl and Tobias Nipkow
2018-02-06: A verified factorization algorithm for integer polynomials with polynomial complexity
Authors: Jose Divasón, Sebastiaan Joosten, René Thiemann and Akihisa Yamada
2018-02-06: First-Order Terms
Authors: Christian Sternagel and René Thiemann
2018-02-06: The Error Function
Author: Manuel Eberl
2018-02-02: A verified LLL algorithm
Authors: Ralph Bottesch, Jose Divasón, Maximilian Haslbeck, Sebastiaan Joosten, René Thiemann and Akihisa Yamada
2018-01-18: Formalization of Bachmair and Ganzinger's Ordered Resolution Prover
Authors: Anders Schlichtkrull, Jasmin Christian Blanchette, Dmitriy Traytel and Uwe Waldmann
2018-01-16: Gromov Hyperbolicity
Author: Sebastien Gouezel
2018-01-11: An Isabelle/HOL formalisation of Green's Theorem
Authors: Mohammad Abdulaziz and Lawrence C. Paulson
2018-01-08: Taylor Models
Authors: Christoph Traut and Fabian Immler

 

2017
2017-12-22: The Falling Factorial of a Sum
Author: Lukas Bulwahn
2017-12-21: The Median-of-Medians Selection Algorithm
Author: Manuel Eberl
2017-12-21: The Mason–Stothers Theorem
Author: Manuel Eberl
2017-12-21: Dirichlet L-Functions and Dirichlet's Theorem
Author: Manuel Eberl
2017-12-19: Operations on Bounded Natural Functors
Authors: Jasmin Christian Blanchette, Andrei Popescu and Dmitriy Traytel
2017-12-18: The string search algorithm by Knuth, Morris and Pratt
Authors: Fabian Hellauer and Peter Lammich
2017-11-22: Stochastic Matrices and the Perron-Frobenius Theorem
Author: René Thiemann
2017-11-09: The IMAP CmRDT
Authors: Tim Jungnickel, Lennart Oldenburg and Matthias Loibl
2017-11-06: Hybrid Multi-Lane Spatial Logic
Author: Sven Linker
2017-10-26: The Kuratowski Closure-Complement Theorem
Authors: Peter Gammie and Gianpaolo Gioiosa
2017-10-19: Transition Systems and Automata
Author: Julian Brunner
2017-10-19: Büchi Complementation
Author: Julian Brunner
2017-10-17: Evaluate Winding Numbers through Cauchy Indices
Author: Wenda Li
2017-10-17: Count the Number of Complex Roots
Author: Wenda Li
2017-10-14: Homogeneous Linear Diophantine Equations
Authors: Florian Messner, Julian Parsert, Jonas Schöpf and Christian Sternagel
2017-10-12: The Hurwitz and Riemann ζ Functions
Author: Manuel Eberl
2017-10-12: Linear Recurrences
Author: Manuel Eberl
2017-10-12: Dirichlet Series
Author: Manuel Eberl
2017-09-21: Computer-assisted Reconstruction and Assessment of E. J. Lowe's Modal Ontological Argument
Authors: David Fuenmayor and Christoph Benzmüller
2017-09-17: Representation and Partial Automation of the Principia Logico-Metaphysica in Isabelle/HOL
Author: Daniel Kirchner
2017-09-06: Anselm's God in Isabelle/HOL
Author: Ben Blumson
2017-09-01: Microeconomics and the First Welfare Theorem
Authors: Julian Parsert and Cezary Kaliszyk
2017-08-20: Root-Balanced Tree
Author: Tobias Nipkow
2017-08-20: Orbit-Stabiliser Theorem with Application to Rotational Symmetries
Author: Jonas Rädle
2017-08-16: The LambdaMu-calculus
Authors: Cristina Matache, Victor B. F. Gomes and Dominic P. Mulligan
2017-07-31: Stewart's Theorem and Apollonius' Theorem
Author: Lukas Bulwahn
2017-07-28: Dynamic Architectures
Author: Diego Marmsoler
2017-07-21: Declarative Semantics for Functional Languages
Author: Jeremy Siek
2017-07-15: HOLCF-Prelude
Authors: Joachim Breitner, Brian Huffman, Neil Mitchell and Christian Sternagel
2017-07-13: Minkowski's Theorem
Author: Manuel Eberl
2017-07-09: Verified Metatheory and Type Inference for a Name-Carrying Simply-Typed Lambda Calculus
Author: Michael Rawson
2017-07-07: A framework for establishing Strong Eventual Consistency for Conflict-free Replicated Datatypes
Authors: Victor B. F. Gomes, Martin Kleppmann, Dominic P. Mulligan and Alastair R. Beresford
2017-07-06: Stone-Kleene Relation Algebras
Author: Walter Guttmann
2017-06-21: Propositional Proof Systems
Authors: Julius Michaelis and Tobias Nipkow
2017-06-13: Partial Semigroups and Convolution Algebras
Authors: Brijesh Dongol, Victor B. F. Gomes, Ian J. Hayes and Georg Struth
2017-06-06: Buffon's Needle Problem
Author: Manuel Eberl
2017-06-01: Formalizing Push-Relabel Algorithms
Authors: Peter Lammich and S. Reza Sefidgar
2017-06-01: Flow Networks and the Min-Cut-Max-Flow Theorem
Authors: Peter Lammich and S. Reza Sefidgar
2017-05-25: Optics
Authors: Simon Foster and Frank Zeyda
2017-05-24: Developing Security Protocols by Refinement
Authors: Christoph Sprenger and Ivano Somaini
2017-05-24: Dictionary Construction
Author: Lars Hupel
2017-05-08: The Floyd-Warshall Algorithm for Shortest Paths
Authors: Simon Wimmer and Peter Lammich
2017-05-05: Probabilistic while loop
Author: Andreas Lochbihler
2017-05-05: Effect polymorphism in higher-order logic
Author: Andreas Lochbihler
2017-05-05: Monad normalisation
Authors: Joshua Schneider, Manuel Eberl and Andreas Lochbihler
2017-05-05: Game-based cryptography in HOL
Authors: Andreas Lochbihler, S. Reza Sefidgar and Bhargav Bhatt
2017-05-05: CryptHOL
Author: Andreas Lochbihler
2017-05-04: Monoidal Categories
Author: Eugene W. Stark
2017-05-01: Types, Tableaus and Gödel’s God in Isabelle/HOL
Authors: David Fuenmayor and Christoph Benzmüller
2017-04-28: Local Lexing
Author: Steven Obua
2017-04-19: Constructor Functions
Author: Lars Hupel
2017-04-18: Lazifying case constants
Author: Lars Hupel
2017-04-06: Subresultants
Authors: Sebastiaan Joosten, René Thiemann and Akihisa Yamada
2017-04-04: Expected Shape of Random Binary Search Trees
Author: Manuel Eberl
2017-03-15: The number of comparisons in QuickSort
Author: Manuel Eberl
2017-03-15: Lower bound on comparison-based sorting algorithms
Author: Manuel Eberl
2017-03-10: The Euler–MacLaurin Formula
Author: Manuel Eberl
2017-02-28: The Group Law for Elliptic Curves
Author: Stefan Berghofer
2017-02-26: Menger's Theorem
Author: Christoph Dittmann
2017-02-13: Differential Dynamic Logic
Author: Brandon Bohrer
2017-02-10: Abstract Soundness
Authors: Jasmin Christian Blanchette, Andrei Popescu and Dmitriy Traytel
2017-02-07: Stone Relation Algebras
Author: Walter Guttmann
2017-01-31: Refining Authenticated Key Agreement with Strong Adversaries
Authors: Joseph Lallemand and Christoph Sprenger
2017-01-24: Bernoulli Numbers
Authors: Lukas Bulwahn and Manuel Eberl
2017-01-17: Minimal Static Single Assignment Form
Authors: Max Wagner and Denis Lohner
2017-01-17: Bertrand's postulate
Authors: Julian Biendarra and Manuel Eberl
2017-01-12: The Transcendence of e
Author: Manuel Eberl
2017-01-08: Formal Network Models and Their Application to Firewall Policies
Authors: Achim D. Brucker, Lukas Brügger and Burkhart Wolff
2017-01-03: Verification of a Diffie-Hellman Password-based Authentication Protocol by Extending the Inductive Method
Author: Pasquale Noce
2017-01-01: First-Order Logic According to Harrison
Authors: Alexander Birch Jensen, Anders Schlichtkrull and Jørgen Villadsen

 

2016
2016-12-30: Concurrent Refinement Algebra and Rely Quotients
Authors: Julian Fell, Ian J. Hayes and Andrius Velykis
2016-12-29: The Twelvefold Way
Author: Lukas Bulwahn
2016-12-20: Proof Strategy Language
Author: Yutaka Nagashima
2016-12-07: Paraconsistency
Authors: Anders Schlichtkrull and Jørgen Villadsen
2016-11-29: COMPLX: A Verification Framework for Concurrent Imperative Programs
Authors: Sidney Amani, June Andronick, Maksym Bortin, Corey Lewis, Christine Rizkallah and Joseph Tuong
2016-11-23: Abstract Interpretation of Annotated Commands
Author: Tobias Nipkow
2016-11-16: Separata: Isabelle tactics for Separation Algebra
Authors: Zhe Hou, David Sanan, Alwen Tiu, Rajeev Gore and Ranald Clouston
2016-11-12: Formalization of Nested Multisets, Hereditary Multisets, and Syntactic Ordinals
Authors: Jasmin Christian Blanchette, Mathias Fleury and Dmitriy Traytel
2016-11-12: Formalization of Knuth–Bendix Orders for Lambda-Free Higher-Order Terms
Authors: Heiko Becker, Jasmin Christian Blanchette, Uwe Waldmann and Daniel Wand
2016-11-10: Expressiveness of Deep Learning
Author: Alexander Bentkamp
2016-10-25: Modal Logics for Nominal Transition Systems
Authors: Tjark Weber, Lars-Henrik Eriksson, Joachim Parrow, Johannes Borgström and Ramunas Gutkovas
2016-10-24: Stable Matching
Author: Peter Gammie
2016-10-21: LOFT — Verified Migration of Linux Firewalls to SDN
Authors: Julius Michaelis and Cornelius Diekmann
2016-10-19: Source Coding Theorem
Authors: Quentin Hibon and Lawrence C. Paulson
2016-10-19: A formal model for the SPARCv8 ISA and a proof of non-interference for the LEON3 processor
Authors: Zhe Hou, David Sanan, Alwen Tiu and Yang Liu
2016-10-14: The Factorization Algorithm of Berlekamp and Zassenhaus
Authors: Jose Divasón, Sebastiaan Joosten, René Thiemann and Akihisa Yamada
2016-10-11: Intersecting Chords Theorem
Author: Lukas Bulwahn
2016-10-05: Lp spaces
Author: Sebastien Gouezel
2016-09-30: Fisher–Yates shuffle
Author: Manuel Eberl
2016-09-29: Allen's Interval Calculus
Author: Fadoua Ghourabi
2016-09-23: Formalization of Recursive Path Orders for Lambda-Free Higher-Order Terms
Authors: Jasmin Christian Blanchette, Uwe Waldmann and Daniel Wand
2016-09-09: Iptables Semantics
Authors: Cornelius Diekmann and Lars Hupel
2016-09-06: A Variant of the Superposition Calculus
Author: Nicolas Peltier
2016-09-06: Stone Algebras
Author: Walter Guttmann
2016-09-01: Stirling's formula
Author: Manuel Eberl
2016-08-31: Routing
Authors: Julius Michaelis and Cornelius Diekmann
2016-08-24: Simple Firewall
Authors: Cornelius Diekmann, Julius Michaelis and Maximilian Haslbeck
2016-08-18: Infeasible Paths Elimination by Symbolic Execution Techniques: Proof of Correctness and Preservation of Paths
Authors: Romain Aissat, Frederic Voisin and Burkhart Wolff
2016-08-12: Formalizing the Edmonds-Karp Algorithm
Authors: Peter Lammich and S. Reza Sefidgar
2016-08-08: The Imperative Refinement Framework
Author: Peter Lammich
2016-08-07: Ptolemy's Theorem
Author: Lukas Bulwahn
2016-07-17: Surprise Paradox
Author: Joachim Breitner
2016-07-14: Pairing Heap
Authors: Hauke Brinkop and Tobias Nipkow
2016-07-05: A Framework for Verifying Depth-First Search Algorithms
Authors: Peter Lammich and René Neumann
2016-07-01: Chamber Complexes, Coxeter Systems, and Buildings
Author: Jeremy Sylvestre
2016-06-30: The Z Property
Authors: Bertram Felgenhauer, Julian Nagele, Vincent van Oostrom and Christian Sternagel
2016-06-30: The Resolution Calculus for First-Order Logic
Author: Anders Schlichtkrull
2016-06-28: IP Addresses
Authors: Cornelius Diekmann, Julius Michaelis and Lars Hupel
2016-06-28: Compositional Security-Preserving Refinement for Concurrent Imperative Programs
Authors: Toby Murray, Robert Sison, Edward Pierzchalski and Christine Rizkallah
2016-06-26: Category Theory with Adjunctions and Limits
Author: Eugene W. Stark
2016-06-26: Cardinality of Multisets
Author: Lukas Bulwahn
2016-06-25: A Dependent Security Type System for Concurrent Imperative Programs
Authors: Toby Murray, Robert Sison, Edward Pierzchalski and Christine Rizkallah
2016-06-21: Catalan Numbers
Author: Manuel Eberl
2016-06-18: Program Construction and Verification Components Based on Kleene Algebra
Authors: Victor B. F. Gomes and Georg Struth
2016-06-13: Conservation of CSP Noninterference Security under Concurrent Composition
Author: Pasquale Noce
2016-06-09: Finite Machine Word Library
Authors: Joel Beeren, Matthew Fernandez, Xin Gao, Gerwin Klein, Rafal Kolanski, Japheth Lim, Corey Lewis, Daniel Matichuk and Thomas Sewell
2016-05-31: Tree Decomposition
Author: Christoph Dittmann
2016-05-24: POSIX Lexing with Derivatives of Regular Expressions
Authors: Fahad Ausaf, Roy Dyckhoff and Christian Urban
2016-05-24: Cardinality of Equivalence Relations
Author: Lukas Bulwahn
2016-05-20: Perron-Frobenius Theorem for Spectral Radius Analysis
Authors: Jose Divasón, Ondřej Kunčar, René Thiemann and Akihisa Yamada
2016-05-20: The meta theory of the Incredible Proof Machine
Authors: Joachim Breitner and Denis Lohner
2016-05-18: A Constructive Proof for FLP
Authors: Benjamin Bisping, Paul-David Brodmann, Tim Jungnickel, Christina Rickmann, Henning Seidler, Anke Stüber, Arno Wilhelm-Weidner, Kirstin Peters and Uwe Nestmann
2016-05-09: A Formal Proof of the Max-Flow Min-Cut Theorem for Countable Networks
Author: Andreas Lochbihler
2016-05-05: Randomised Social Choice Theory
Author: Manuel Eberl
2016-05-04: The Incompatibility of SD-Efficiency and SD-Strategy-Proofness
Author: Manuel Eberl
2016-05-04: Spivey's Generalized Recurrence for Bell Numbers
Author: Lukas Bulwahn
2016-05-02: Gröbner Bases Theory
Authors: Fabian Immler and Alexander Maletzky
2016-04-28: No Faster-Than-Light Observers
Authors: Mike Stannett and István Németi
2016-04-27: Algorithms for Reduced Ordered Binary Decision Diagrams
Authors: Julius Michaelis, Maximilian Haslbeck, Peter Lammich and Lars Hupel
2016-04-27: A formalisation of the Cocke-Younger-Kasami algorithm
Author: Maksym Bortin
2016-04-26: Conservation of CSP Noninterference Security under Sequential Composition
Author: Pasquale Noce
2016-04-12: Kleene Algebras with Domain
Authors: Victor B. F. Gomes, Walter Guttmann, Peter Höfner, Georg Struth and Tjark Weber
2016-03-11: Propositional Resolution and Prime Implicates Generation
Author: Nicolas Peltier
2016-03-08: Timed Automata
Author: Simon Wimmer
2016-03-08: The Cartan Fixed Point Theorems
Author: Lawrence C. Paulson
2016-03-01: Linear Temporal Logic
Author: Salomon Sickert
2016-02-17: Analysis of List Update Algorithms
Authors: Maximilian P.L. Haslbeck and Tobias Nipkow
2016-02-05: Verified Construction of Static Single Assignment Form
Authors: Sebastian Ullrich and Denis Lohner
2016-01-29: Polynomial Interpolation
Authors: René Thiemann and Akihisa Yamada
2016-01-29: Polynomial Factorization
Authors: René Thiemann and Akihisa Yamada
2016-01-20: Knot Theory
Author: T.V.H. Prathamesh
2016-01-18: Tensor Product of Matrices
Author: T.V.H. Prathamesh
2016-01-14: Cardinality of Number Partitions
Author: Lukas Bulwahn

 

2015
2015-12-28: Basic Geometric Properties of Triangles
Author: Manuel Eberl
2015-12-28: The Divergence of the Prime Harmonic Series
Author: Manuel Eberl
2015-12-28: Liouville numbers
Author: Manuel Eberl
2015-12-28: Descartes' Rule of Signs
Author: Manuel Eberl
2015-12-22: The Stern-Brocot Tree
Authors: Peter Gammie and Andreas Lochbihler
2015-12-22: Applicative Lifting
Authors: Andreas Lochbihler and Joshua Schneider
2015-12-22: Algebraic Numbers in Isabelle/HOL
Authors: René Thiemann, Akihisa Yamada and Sebastiaan Joosten
2015-12-12: Cardinality of Set Partitions
Author: Lukas Bulwahn
2015-12-02: Latin Square
Author: Alexander Bentkamp
2015-12-01: Ergodic Theory
Author: Sebastien Gouezel
2015-11-19: Euler's Partition Theorem
Author: Lukas Bulwahn
2015-11-18: The Tortoise and Hare Algorithm
Author: Peter Gammie
2015-11-11: Planarity Certificates
Author: Lars Noschinski
2015-11-02: Positional Determinacy of Parity Games
Author: Christoph Dittmann
2015-09-16: A Meta-Model for the Isabelle API
Authors: Frédéric Tuong and Burkhart Wolff
2015-09-04: Converting Linear Temporal Logic to Deterministic (Generalized) Rabin Automata
Author: Salomon Sickert
2015-08-21: Matrices, Jordan Normal Forms, and Spectral Radius Theory
Authors: René Thiemann and Akihisa Yamada
2015-08-20: Decreasing Diagrams II
Author: Bertram Felgenhauer
2015-08-18: The Inductive Unwinding Theorem for CSP Noninterference Security
Author: Pasquale Noce
2015-08-12: Representations of Finite Groups
Author: Jeremy Sylvestre
2015-08-10: Analysing and Comparing Encodability Criteria for Process Calculi
Authors: Kirstin Peters and Rob van Glabbeek
2015-07-21: Generating Cases from Labeled Subgoals
Author: Lars Noschinski
2015-07-14: Landau Symbols
Author: Manuel Eberl
2015-07-14: The Akra-Bazzi theorem and the Master theorem
Author: Manuel Eberl
2015-07-07: Hermite Normal Form
Authors: Jose Divasón and Jesús Aransay
2015-06-27: Derangements Formula
Author: Lukas Bulwahn
2015-06-11: The Ipurge Unwinding Theorem for CSP Noninterference Security
Author: Pasquale Noce
2015-06-11: The Generic Unwinding Theorem for CSP Noninterference Security
Author: Pasquale Noce
2015-06-11: Binary Multirelations
Authors: Hitoshi Furusawa and Georg Struth
2015-06-11: Reasoning about Lists via List Interleaving
Author: Pasquale Noce
2015-06-07: Parameterized Dynamic Tables
Author: Tobias Nipkow
2015-05-28: Derivatives of Logical Formulas
Author: Dmitriy Traytel
2015-05-27: A Zoo of Probabilistic Systems
Authors: Johannes Hölzl, Andreas Lochbihler and Dmitriy Traytel
2015-04-30: VCG - Combinatorial Vickrey-Clarke-Groves Auctions
Authors: Marco B. Caminati, Manfred Kerber, Christoph Lange and Colin Rowat
2015-04-15: Residuated Lattices
Authors: Victor B. F. Gomes and Georg Struth
2015-04-13: Concurrent IMP
Author: Peter Gammie
2015-04-13: Relaxing Safely: Verified On-the-Fly Garbage Collection for x86-TSO
Authors: Peter Gammie, Tony Hosking and Kai Engelhardt
2015-03-30: Trie
Authors: Andreas Lochbihler and Tobias Nipkow
2015-03-18: Consensus Refined
Authors: Ognjen Maric and Christoph Sprenger
2015-03-11: Deriving class instances for datatypes
Authors: Christian Sternagel and René Thiemann
2015-02-20: The Safety of Call Arity
Author: Joachim Breitner
2015-02-12: QR Decomposition
Authors: Jose Divasón and Jesús Aransay
2015-02-12: Echelon Form
Authors: Jose Divasón and Jesús Aransay
2015-02-05: Finite Automata in Hereditarily Finite Set Theory
Author: Lawrence C. Paulson
2015-01-28: Verification of the UpDown Scheme
Author: Johannes Hölzl

 

2014
2014-11-28: The Unified Policy Framework (UPF)
Authors: Achim D. Brucker, Lukas Brügger and Burkhart Wolff
2014-10-23: Loop freedom of the (untimed) AODV routing protocol
Authors: Timothy Bourke and Peter Höfner
2014-10-13: Lifting Definition Option
Author: René Thiemann
2014-10-10: Stream Fusion in HOL with Code Generation
Authors: Andreas Lochbihler and Alexandra Maximova
2014-10-09: A Verified Compiler for Probability Density Functions
Authors: Manuel Eberl, Johannes Hölzl and Tobias Nipkow
2014-10-08: Formalization of Refinement Calculus for Reactive Systems
Author: Viorel Preoteasa
2014-10-03: XML
Authors: Christian Sternagel and René Thiemann
2014-10-03: Certification Monads
Authors: Christian Sternagel and René Thiemann
2014-09-25: Imperative Insertion Sort
Author: Christian Sternagel
2014-09-19: The Sturm-Tarski Theorem
Author: Wenda Li
2014-09-15: The Cayley-Hamilton Theorem
Authors: Stephan Adelsberger, Stefan Hetzl and Florian Pollak
2014-09-09: The Jordan-Hölder Theorem
Author: Jakob von Raumer
2014-09-04: Priority Queues Based on Braun Trees
Author: Tobias Nipkow
2014-09-03: Gauss-Jordan Algorithm and Its Applications
Authors: Jose Divasón and Jesús Aransay
2014-08-29: Vector Spaces
Author: Holden Lee
2014-08-29: Real-Valued Special Functions: Upper and Lower Bounds
Author: Lawrence C. Paulson
2014-08-13: Skew Heap
Author: Tobias Nipkow
2014-08-12: Splay Tree
Author: Tobias Nipkow
2014-07-29: Haskell's Show Class in Isabelle/HOL
Authors: Christian Sternagel and René Thiemann
2014-07-18: Formal Specification of a Generic Separation Kernel
Authors: Freek Verbeek, Sergey Tverdyshev, Oto Havle, Holger Blasum, Bruno Langenstein, Werner Stephan, Yakoub Nemouchi, Abderrahmane Feliachi, Burkhart Wolff and Julien Schmaltz
2014-07-13: pGCL for Isabelle
Author: David Cock
2014-07-07: Amortized Complexity Verified
Author: Tobias Nipkow
2014-07-04: Network Security Policy Verification
Author: Cornelius Diekmann
2014-07-03: Pop-Refinement
Author: Alessandro Coglio
2014-06-12: Decision Procedures for MSO on Words Based on Derivatives of Regular Expressions
Authors: Dmitriy Traytel and Tobias Nipkow
2014-06-08: Boolean Expression Checkers
Author: Tobias Nipkow
2014-05-28: Promela Formalization
Author: René Neumann
2014-05-28: Converting Linear-Time Temporal Logic to Generalized Büchi Automata
Authors: Alexander Schimpf and Peter Lammich
2014-05-28: Verified Efficient Implementation of Gabow's Strongly Connected Components Algorithm
Author: Peter Lammich
2014-05-28: A Fully Verified Executable LTL Model Checker
Authors: Javier Esparza, Peter Lammich, René Neumann, Tobias Nipkow, Alexander Schimpf and Jan-Georg Smaus
2014-05-28: The CAVA Automata Library
Author: Peter Lammich
2014-05-23: Transitive closure according to Roy-Floyd-Warshall
Author: Makarius Wenzel
2014-05-23: Noninterference Security in Communicating Sequential Processes
Author: Pasquale Noce
2014-05-21: Regular Algebras
Authors: Simon Foster and Georg Struth
2014-04-28: Formalisation and Analysis of Component Dependencies
Author: Maria Spichkova
2014-04-23: A Formalization of Declassification with WHAT-and-WHERE-Security
Authors: Sylvia Grewe, Alexander Lux, Heiko Mantel and Jens Sauer
2014-04-23: A Formalization of Strong Security
Authors: Sylvia Grewe, Alexander Lux, Heiko Mantel and Jens Sauer
2014-04-23: A Formalization of Assumptions and Guarantees for Compositional Noninterference
Authors: Sylvia Grewe, Heiko Mantel and Daniel Schoepe
2014-04-22: Bounded-Deducibility Security
Authors: Andrei Popescu and Peter Lammich
2014-04-16: A shallow embedding of HyperCTL*
Authors: Markus N. Rabe, Peter Lammich and Andrei Popescu
2014-04-16: Abstract Completeness
Authors: Jasmin Christian Blanchette, Andrei Popescu and Dmitriy Traytel
2014-04-13: Discrete Summation
Author: Florian Haftmann
2014-04-03: Syntax and semantics of a GPU kernel programming language
Author: John Wickerson
2014-03-11: Probabilistic Noninterference
Authors: Andrei Popescu and Johannes Hölzl
2014-03-08: Mechanization of the Algebra for Wireless Networks (AWN)
Author: Timothy Bourke
2014-02-18: Mutually Recursive Partial Functions
Author: René Thiemann
2014-02-13: Properties of Random Graphs -- Subgraph Containment
Author: Lars Hupel
2014-02-11: Verification of Selection and Heap Sort Using Locales
Author: Danijela Petrovic
2014-02-07: Affine Arithmetic
Author: Fabian Immler
2014-02-06: Implementing field extensions of the form Q[sqrt(b)]
Author: René Thiemann
2014-01-30: Unified Decision Procedures for Regular Expression Equivalence
Authors: Tobias Nipkow and Dmitriy Traytel
2014-01-28: Secondary Sylow Theorems
Author: Jakob von Raumer
2014-01-25: Relation Algebra
Authors: Alasdair Armstrong, Simon Foster, Georg Struth and Tjark Weber
2014-01-23: Kleene Algebra with Tests and Demonic Refinement Algebras
Authors: Alasdair Armstrong, Victor B. F. Gomes and Georg Struth
2014-01-16: Featherweight OCL: A Proposal for a Machine-Checked Formal Semantics for OCL 2.5
Authors: Achim D. Brucker, Frédéric Tuong and Burkhart Wolff
2014-01-11: Sturm's Theorem
Author: Manuel Eberl
2014-01-11: Compositional Properties of Crypto-Based Components
Author: Maria Spichkova

 

2013
2013-12-01: A General Method for the Proof of Theorems on Tail-recursive Functions
Author: Pasquale Noce
2013-11-17: Gödel's Incompleteness Theorems
Author: Lawrence C. Paulson
2013-11-17: The Hereditarily Finite Sets
Author: Lawrence C. Paulson
2013-11-15: A Codatatype of Formal Languages
Author: Dmitriy Traytel
2013-11-14: Stream Processing Components: Isabelle/HOL Formalisation and Case Studies
Author: Maria Spichkova
2013-11-12: Gödel's God in Isabelle/HOL
Authors: Christoph Benzmüller and Bruno Woltzenlogel Paleo
2013-11-01: Decreasing Diagrams
Author: Harald Zankl
2013-10-02: Automatic Data Refinement
Author: Peter Lammich
2013-09-17: Native Word
Author: Andreas Lochbihler
2013-07-27: A Formal Model of IEEE Floating Point Arithmetic
Author: Lei Yu
2013-07-22: Pratt's Primality Certificates
Authors: Simon Wimmer and Lars Noschinski
2013-07-22: Lehmer's Theorem
Authors: Simon Wimmer and Lars Noschinski
2013-07-19: The Königsberg Bridge Problem and the Friendship Theorem
Author: Wenda Li
2013-06-27: Sound and Complete Sort Encodings for First-Order Logic
Authors: Jasmin Christian Blanchette and Andrei Popescu
2013-05-22: An Axiomatic Characterization of the Single-Source Shortest Path Problem
Author: Christine Rizkallah
2013-04-28: Graph Theory
Author: Lars Noschinski
2013-04-15: Light-weight Containers
Author: Andreas Lochbihler
2013-02-21: Nominal 2
Authors: Christian Urban, Stefan Berghofer and Cezary Kaliszyk
2013-01-31: The Correctness of Launchbury's Natural Semantics for Lazy Evaluation
Author: Joachim Breitner
2013-01-19: Ribbon Proofs
Author: John Wickerson
2013-01-16: Rank-Nullity Theorem in Linear Algebra
Authors: Jose Divasón and Jesús Aransay
2013-01-15: Kleene Algebra
Authors: Alasdair Armstrong, Georg Struth and Tjark Weber
2013-01-03: Computing N-th Roots using the Babylonian Method
Author: René Thiemann

 

2012
2012-11-14: A Separation Logic Framework for Imperative HOL
Authors: Peter Lammich and Rene Meis
2012-11-02: Open Induction
Authors: Mizuhito Ogawa and Christian Sternagel
2012-10-30: The independence of Tarski's Euclidean axiom
Author: T. J. M. Makarios
2012-10-27: Bondy's Theorem
Authors: Jeremy Avigad and Stefan Hetzl
2012-09-10: Possibilistic Noninterference
Authors: Andrei Popescu and Johannes Hölzl
2012-08-07: Generating linear orders for datatypes
Author: René Thiemann
2012-08-05: Proving the Impossibility of Trisecting an Angle and Doubling the Cube
Authors: Ralph Romanos and Lawrence C. Paulson
2012-07-27: Verifying Fault-Tolerant Distributed Algorithms in the Heard-Of Model
Authors: Henri Debrat and Stephan Merz
2012-07-01: Logical Relations for PCF
Author: Peter Gammie
2012-06-26: Type Constructor Classes and Monad Transformers
Author: Brian Huffman
2012-05-29: Psi-calculi in Isabelle
Author: Jesper Bengtson
2012-05-29: The pi-calculus in nominal logic
Author: Jesper Bengtson
2012-05-29: CCS in nominal logic
Author: Jesper Bengtson
2012-05-27: Isabelle/Circus
Authors: Abderrahmane Feliachi, Burkhart Wolff and Marie-Claude Gaudel
2012-05-11: Separation Algebra
Authors: Gerwin Klein, Rafal Kolanski and Andrew Boyton
2012-05-07: Stuttering Equivalence
Author: Stephan Merz
2012-05-02: Inductive Study of Confidentiality
Author: Giampaolo Bella
2012-04-26: Ordinary Differential Equations
Authors: Fabian Immler and Johannes Hölzl
2012-04-13: Well-Quasi-Orders
Author: Christian Sternagel
2012-03-01: Abortable Linearizable Modules
Authors: Rachid Guerraoui, Viktor Kuncak and Giuliano Losa
2012-02-29: Executable Transitive Closures
Author: René Thiemann
2012-02-06: A Probabilistic Proof of the Girth-Chromatic Number Theorem
Author: Lars Noschinski
2012-01-30: Refinement for Monadic Programs
Author: Peter Lammich
2012-01-30: Dijkstra's Shortest Path Algorithm
Authors: Benedikt Nordhoff and Peter Lammich
2012-01-03: Markov Models
Authors: Johannes Hölzl and Tobias Nipkow

 

2011
2011-11-19: A Definitional Encoding of TLA* in Isabelle/HOL
Authors: Gudmund Grov and Stephan Merz
2011-11-09: Efficient Mergesort
Author: Christian Sternagel
2011-09-22: Pseudo Hoops
Authors: George Georgescu, Laurentiu Leustean and Viorel Preoteasa
2011-09-22: Algebra of Monotonic Boolean Transformers
Author: Viorel Preoteasa
2011-09-22: Lattice Properties
Author: Viorel Preoteasa
2011-08-26: The Myhill-Nerode Theorem Based on Regular Expressions
Authors: Chunhan Wu, Xingyuan Zhang and Christian Urban
2011-08-19: Gauss-Jordan Elimination for Matrices Represented as Functions
Author: Tobias Nipkow
2011-07-21: Maximum Cardinality Matching
Author: Christine Rizkallah
2011-05-17: Knowledge-based programs
Author: Peter Gammie
2011-04-01: The General Triangle Is Unique
Author: Joachim Breitner
2011-03-14: Executable Transitive Closures of Finite Relations
Authors: Christian Sternagel and René Thiemann
2011-02-23: Interval Temporal Logic on Natural Numbers
Author: David Trachtenherz
2011-02-23: Infinite Lists
Author: David Trachtenherz
2011-02-23: AutoFocus Stream Processing for Single-Clocking and Multi-Clocking Semantics
Author: David Trachtenherz
2011-02-07: Lightweight Java
Authors: Rok Strniša and Matthew Parkinson
2011-01-10: RIPEMD-160
Author: Fabian Immler
2011-01-08: Lower Semicontinuous Functions
Author: Bogdan Grechuk

 

2010
2010-12-17: Hall's Marriage Theorem
Authors: Dongchen Jiang and Tobias Nipkow
2010-11-16: Shivers' Control Flow Analysis
Author: Joachim Breitner
2010-10-28: Finger Trees
Authors: Benedikt Nordhoff, Stefan Körner and Peter Lammich
2010-10-28: Functional Binomial Queues
Author: René Neumann
2010-10-28: Binomial Heaps and Skew Binomial Heaps
Authors: Rene Meis, Finn Nielsen and Peter Lammich
2010-08-29: Strong Normalization of Moggi's Computational Metalanguage
Author: Christian Doczkal
2010-08-10: Executable Multivariate Polynomials
Authors: Christian Sternagel, René Thiemann, Alexander Maletzky, Fabian Immler, Florian Haftmann, Andreas Lochbihler and Alexander Bentkamp
2010-08-08: Formalizing Statecharts using Hierarchical Automata
Authors: Steffen Helke and Florian Kammüller
2010-06-24: Free Groups
Author: Joachim Breitner
2010-06-20: Category Theory
Author: Alexander Katovsky
2010-06-17: Executable Matrix Operations on Matrices of Arbitrary Dimensions
Authors: Christian Sternagel and René Thiemann
2010-06-14: Abstract Rewriting
Authors: Christian Sternagel and René Thiemann
2010-05-28: Verification of the Deutsch-Schorr-Waite Graph Marking Algorithm using Data Refinement
Authors: Viorel Preoteasa and Ralph-Johan Back
2010-05-28: Semantics and Data Refinement of Invariant Based Programs
Authors: Viorel Preoteasa and Ralph-Johan Back
2010-05-22: A Complete Proof of the Robbins Conjecture
Author: Matthew Wampler-Doty
2010-05-12: Regular Sets and Expressions
Authors: Alexander Krauss and Tobias Nipkow
2010-04-30: Locally Nameless Sigma Calculus
Authors: Ludovic Henrio, Florian Kammüller, Bianca Lutz and Henry Sudhof
2010-03-29: Free Boolean Algebra
Author: Brian Huffman
2010-03-23: Inter-Procedural Information Flow Noninterference via Slicing
Author: Daniel Wasserrab
2010-03-23: Information Flow Noninterference via Slicing
Author: Daniel Wasserrab
2010-02-20: List Index
Author: Tobias Nipkow
2010-02-12: Coinductive
Author: Andreas Lochbihler

 

2009
2009-12-09: A Fast SAT Solver for Isabelle in Standard ML
Author: Armin Heller
2009-12-03: Formalizing the Logic-Automaton Connection
Authors: Stefan Berghofer and Markus Reiter
2009-11-25: Tree Automata
Author: Peter Lammich
2009-11-25: Collections Framework
Author: Peter Lammich
2009-11-22: Perfect Number Theorem
Author: Mark Ijbema
2009-11-13: Backing up Slicing: Verifying the Interprocedural Two-Phase Horwitz-Reps-Binkley Slicer
Author: Daniel Wasserrab
2009-10-30: The Worker/Wrapper Transformation
Author: Peter Gammie
2009-09-01: Ordinals and Cardinals
Author: Andrei Popescu
2009-08-28: Invertibility in Sequent Calculi
Author: Peter Chapman
2009-08-04: An Example of a Cofinitary Group in Isabelle/HOL
Author: Bart Kastermans
2009-05-06: Code Generation for Functions as Data
Author: Andreas Lochbihler
2009-04-29: Stream Fusion
Author: Brian Huffman

 

2008
2008-12-12: A Bytecode Logic for JML and Types
Authors: Lennart Beringer and Martin Hofmann
2008-11-10: Secure information flow and program logics
Authors: Lennart Beringer and Martin Hofmann
2008-11-09: Some classical results in Social Choice Theory
Author: Peter Gammie
2008-11-07: Fun With Tilings
Authors: Tobias Nipkow and Lawrence C. Paulson
2008-10-15: The Textbook Proof of Huffman's Algorithm
Author: Jasmin Christian Blanchette
2008-09-16: Towards Certified Slicing
Author: Daniel Wasserrab
2008-09-02: A Correctness Proof for the Volpano/Smith Security Typing System
Authors: Gregor Snelting and Daniel Wasserrab
2008-09-01: Arrow and Gibbard-Satterthwaite
Author: Tobias Nipkow
2008-08-26: Fun With Functions
Author: Tobias Nipkow
2008-07-23: Formal Verification of Modern SAT Solvers
Author: Filip Marić
2008-04-05: Recursion Theory I
Author: Michael Nedzelsky
2008-02-29: A Sequential Imperative Programming Language Syntax, Semantics, Hoare Logics and Verification Environment
Author: Norbert Schirmer
2008-02-29: BDD Normalisation
Authors: Veronika Ortner and Norbert Schirmer
2008-02-18: Normalization by Evaluation
Authors: Klaus Aehlig and Tobias Nipkow
2008-01-11: Quantifier Elimination for Linear Arithmetic
Author: Tobias Nipkow

 

2007
2007-12-14: Formalization of Conflict Analysis of Programs with Procedures, Thread Creation, and Monitors
Authors: Peter Lammich and Markus Müller-Olm
2007-12-03: Jinja with Threads
Author: Andreas Lochbihler
2007-11-06: Much Ado About Two
Author: Sascha Böhme
2007-08-12: Sums of Two and Four Squares
Author: Roelof Oosterhuis
2007-08-12: Fermat's Last Theorem for Exponents 3 and 4 and the Parametrisation of Pythagorean Triples
Author: Roelof Oosterhuis
2007-08-08: Fundamental Properties of Valuation Theory and Hensel's Lemma
Author: Hidetsune Kobayashi
2007-08-02: POPLmark Challenge Via de Bruijn Indices
Author: Stefan Berghofer
2007-08-02: First-Order Logic According to Fitting
Author: Stefan Berghofer

 

2006
2006-09-09: Hotel Key Card System
Author: Tobias Nipkow
2006-08-08: Abstract Hoare Logics
Author: Tobias Nipkow
2006-05-22: Flyspeck I: Tame Graphs
Authors: Gertrud Bauer and Tobias Nipkow
2006-05-15: CoreC++
Author: Daniel Wasserrab
2006-03-31: A Theory of Featherweight Java in Isabelle/HOL
Authors: J. Nathan Foster and Dimitrios Vytiniotis
2006-03-15: Instances of Schneider's generalized protocol of clock synchronization
Author: Damián Barsotti
2006-03-14: Cauchy's Mean Theorem and the Cauchy-Schwarz Inequality
Author: Benjamin Porter

 

2005
2005-11-11: Countable Ordinals
Author: Brian Huffman
2005-10-12: Fast Fourier Transform
Author: Clemens Ballarin
2005-06-24: Formalization of a Generalized Protocol for Clock Synchronization
Author: Alwen Tiu
2005-06-22: Proving the Correctness of Disk Paxos
Authors: Mauro Jaskelioff and Stephan Merz
2005-06-20: Jive Data and Store Model
Authors: Nicole Rauch and Norbert Schirmer
2005-06-01: Jinja is not Java
Authors: Gerwin Klein and Tobias Nipkow
2005-05-02: SHA1, RSA, PSS and more
Authors: Christina Lindenberg and Kai Wirt
2005-04-21: Category Theory to Yoneda's Lemma
Author: Greg O'Keefe

 

2004
2004-12-09: File Refinement
Authors: Karen Zee and Viktor Kuncak
2004-11-19: Integration theory and random variables
Author: Stefan Richter
2004-09-28: A Mechanically Verified, Efficient, Sound and Complete Theorem Prover For First Order Logic
Author: Tom Ridge
2004-09-20: Ramsey's theorem, infinitary version
Author: Tom Ridge
2004-09-20: Completeness theorem
Authors: James Margetson and Tom Ridge
2004-07-09: Compiling Exceptions Correctly
Author: Tobias Nipkow
2004-06-24: Depth First Search
Authors: Toshiaki Nishihara and Yasuhiko Minamide
2004-05-18: Groups, Rings and Modules
Authors: Hidetsune Kobayashi, L. Chen and H. Murao
2004-04-26: Topology
Author: Stefan Friedrich
2004-04-26: Lazy Lists II
Author: Stefan Friedrich
2004-04-05: Binary Search Trees
Author: Viktor Kuncak
2004-03-30: Functional Automata
Author: Tobias Nipkow
2004-03-19: Mini ML
Authors: Wolfgang Naraschewski and Tobias Nipkow
2004-03-19: AVL Trees
Authors: Tobias Nipkow and Cornelia Pusch
\ No newline at end of file diff --git a/web/rss.xml b/web/rss.xml --- a/web/rss.xml +++ b/web/rss.xml @@ -1,607 +1,619 @@ Archive of Formal Proofs https://www.isa-afp.org The Archive of Formal Proofs is a collection of proof libraries, examples, and larger scientific developments, mechanically checked in the theorem prover Isabelle. - 18 Jan 2021 00:00:00 +0000 + 31 Jan 2021 00:00:00 +0000 + + Tarski's Parallel Postulate implies the 5th Postulate of Euclid, the Postulate of Playfair and the original Parallel Postulate of Euclid + https://www.isa-afp.org/entries/IsaGeoCoq.html + https://www.isa-afp.org/entries/IsaGeoCoq.html + Roland Coghetto + 31 Jan 2021 00:00:00 +0000 + +<p>The <a href="https://geocoq.github.io/GeoCoq/">GeoCoq library</a> contains a formalization +of geometry using the Coq proof assistant. It contains both proofs +about the foundations of geometry and high-level proofs in the same +style as in high school. We port a part of the GeoCoq +2.4.0 library to Isabelle/HOL: more precisely, +the files Chap02.v to Chap13_3.v, suma.v as well as the associated +definitions and some useful files for the demonstration of certain +parallel postulates. The synthetic approach of the demonstrations is directly +inspired by those contained in GeoCoq. The names of the lemmas and +theorems used are kept as far as possible as well as the definitions. +</p> +<p>It should be noted that T.J.M. Makarios has done +<a href="https://www.isa-afp.org/entries/Tarskis_Geometry.html">some proofs in Tarski's Geometry</a>. It uses a definition that does not quite +coincide with the definition used in Geocoq and here. +Furthermore, corresponding definitions in the <a href="https://www.isa-afp.org/entries/Poincare_Disc.html">Poincaré Disc Model +development</a> are not identical to those defined in GeoCoq. +</p> +<p>In the last part, it is +formalized that, in the neutral/absolute space, the axiom of the +parallels of Tarski's system implies the Playfair axiom, the 5th +postulate of Euclid and Euclid's original parallel postulate. These +proofs, which are not constructive, are directly inspired by Pierre +Boutry, Charly Gries, Julien Narboux and Pascal Schreck. +</p> + + + Solution to the xkcd Blue Eyes puzzle + https://www.isa-afp.org/entries/Blue_Eyes.html + https://www.isa-afp.org/entries/Blue_Eyes.html + Jakub Kądziołka + 30 Jan 2021 00:00:00 +0000 + +In a <a href="https://xkcd.com/blue_eyes.html">puzzle published by +Randall Munroe</a>, perfect logicians forbidden +from communicating are stranded on an island, and may only leave once +they have figured out their own eye color. We present a method of +modeling the behavior of perfect logicians and formalize a solution of +the puzzle. + Hood-Melville Queue https://www.isa-afp.org/entries/Hood_Melville_Queue.html https://www.isa-afp.org/entries/Hood_Melville_Queue.html Alejandro Gómez-Londoño 18 Jan 2021 00:00:00 +0000 This is a verified implementation of a constant time queue. The original design is due to <a href="https://doi.org/10.1016/0020-0190(81)90030-2">Hood and Melville</a>. This formalization follows the presentation in <em>Purely Functional Data Structures</em>by Okasaki. JinjaDCI: a Java semantics with dynamic class initialization https://www.isa-afp.org/entries/JinjaDCI.html https://www.isa-afp.org/entries/JinjaDCI.html Susannah Mansky 11 Jan 2021 00:00:00 +0000 We extend Jinja to include static fields, methods, and instructions, and dynamic class initialization, based on the Java SE 8 specification. 
This includes extension of definitions and proofs. This work is partially described in Mansky and Gunter's paper at CPP 2019 and Mansky's doctoral thesis (UIUC, 2020). Cofinality and the Delta System Lemma https://www.isa-afp.org/entries/Delta_System_Lemma.html https://www.isa-afp.org/entries/Delta_System_Lemma.html Pedro Sánchez Terraf 27 Dec 2020 00:00:00 +0000 We formalize the basic results on cofinality of linearly ordered sets and ordinals and Šanin’s Lemma for uncountable families of finite sets. This last result is used to prove the countable chain condition for Cohen posets. We work in the set theory framework of Isabelle/ZF, using the Axiom of Choice as needed. Topological semantics for paraconsistent and paracomplete logics https://www.isa-afp.org/entries/Topological_Semantics.html https://www.isa-afp.org/entries/Topological_Semantics.html David Fuenmayor 17 Dec 2020 00:00:00 +0000 We introduce a generalized topological semantics for paraconsistent and paracomplete logics by drawing upon early works on topological Boolean algebras (cf. works by Kuratowski, Zarycki, McKinsey & Tarski, etc.). In particular, this work exemplarily illustrates the shallow semantical embeddings approach (<a href="http://dx.doi.org/10.1007/s11787-012-0052-y">SSE</a>) employing the proof assistant Isabelle/HOL. By means of the SSE technique we can effectively harness theorem provers, model finders and 'hammers' for reasoning with quantified non-classical logics. Relational Minimum Spanning Tree Algorithms https://www.isa-afp.org/entries/Relational_Minimum_Spanning_Trees.html https://www.isa-afp.org/entries/Relational_Minimum_Spanning_Trees.html Walter Guttmann, Nicolas Robinson-O'Brien 08 Dec 2020 00:00:00 +0000 We verify the correctness of Prim's, Kruskal's and Borůvka's minimum spanning tree algorithms based on algebras for aggregation and minimisation. Inline Caching and Unboxing Optimization for Interpreters https://www.isa-afp.org/entries/Interpreter_Optimizations.html https://www.isa-afp.org/entries/Interpreter_Optimizations.html Martin Desharnais 07 Dec 2020 00:00:00 +0000 This Isabelle/HOL formalization builds on the <em>VeriComp</em> entry of the <em>Archive of Formal Proofs</em> to provide the following contributions: <ul> <li>an operational semantics for a realistic virtual machine (Std) for dynamically typed programming languages;</li> <li>the formalization of an inline caching optimization (Inca), a proof of bisimulation with (Std), and a compilation function;</li> <li>the formalization of an unboxing optimization (Ubx), a proof of bisimulation with (Inca), and a simple compilation function.</li> </ul> This formalization was described in the CPP 2021 paper <em>Towards Efficient and Verified Virtual Machines for Dynamic Languages</em> The Relational Method with Message Anonymity for the Verification of Cryptographic Protocols https://www.isa-afp.org/entries/Relational_Method.html https://www.isa-afp.org/entries/Relational_Method.html Pasquale Noce 05 Dec 2020 00:00:00 +0000 This paper introduces a new method for the formal verification of cryptographic protocols, the relational method, derived from Paulson's inductive method by means of some enhancements aimed at streamlining formal definitions and proofs, specially for protocols using public key cryptography. Moreover, this paper proposes a method to formalize a further security property, message anonymity, in addition to message confidentiality and authenticity. 
The relational method, including message anonymity, is then applied to the verification of a sample authentication protocol, comprising Password Authenticated Connection Establishment (PACE) with Chip Authentication Mapping followed by the explicit verification of an additional password over the PACE secure channel. Isabelle Marries Dirac: a Library for Quantum Computation and Quantum Information https://www.isa-afp.org/entries/Isabelle_Marries_Dirac.html https://www.isa-afp.org/entries/Isabelle_Marries_Dirac.html Anthony Bordg, Hanna Lachnitt, Yijun He 22 Nov 2020 00:00:00 +0000 This work is an effort to formalise some quantum algorithms and results in quantum information theory. Formal methods being critical for the safety and security of algorithms and protocols, we foresee their widespread use for quantum computing in the future. We have developed a large library for quantum computing in Isabelle based on a matrix representation for quantum circuits, successfully formalising the no-cloning theorem, quantum teleportation, Deutsch's algorithm, the Deutsch-Jozsa algorithm and the quantum Prisoner's Dilemma. The HOL-CSP Refinement Toolkit https://www.isa-afp.org/entries/CSP_RefTK.html https://www.isa-afp.org/entries/CSP_RefTK.html Safouan Taha, Burkhart Wolff, Lina Ye 19 Nov 2020 00:00:00 +0000 We use a formal development for CSP, called HOL-CSP2.0, to analyse a family of refinement notions, comprising classic and new ones. This analysis enables to derive a number of properties that allow to deepen the understanding of these notions, in particular with respect to specification decomposition principles for the case of infinite sets of events. The established relations between the refinement relations help to clarify some obscure points in the CSP literature, but also provide a weapon for shorter refinement proofs. Furthermore, we provide a framework for state-normalisation allowing to formally reason on parameterised process architectures. As a result, we have a modern environment for formal proofs of concurrent systems that allow for the combination of general infinite processes with locally finite ones in a logically safe way. We demonstrate these verification-techniques for classical, generalised examples: The CopyBuffer for arbitrary data and the Dijkstra's Dining Philosopher Problem of arbitrary size. Verified SAT-Based AI Planning https://www.isa-afp.org/entries/Verified_SAT_Based_AI_Planning.html https://www.isa-afp.org/entries/Verified_SAT_Based_AI_Planning.html Mohammad Abdulaziz, Friedrich Kurz 29 Oct 2020 00:00:00 +0000 We present an executable formally verified SAT encoding of classical AI planning that is based on the encodings by Kautz and Selman and the one by Rintanen et al. The encoding was experimentally tested and shown to be usable for reasonably sized standard AI planning benchmarks. We also use it as a reference to test a state-of-the-art SAT-based planner, showing that it sometimes falsely claims that problems have no solutions of certain lengths. The formalisation in this submission was described in an independent publication. 
AI Planning Languages Semantics https://www.isa-afp.org/entries/AI_Planning_Languages_Semantics.html https://www.isa-afp.org/entries/AI_Planning_Languages_Semantics.html Mohammad Abdulaziz, Peter Lammich 29 Oct 2020 00:00:00 +0000 This is an Isabelle/HOL formalisation of the semantics of the multi-valued planning tasks language that is used by the planning system Fast-Downward, the STRIPS fragment of the Planning Domain Definition Language (PDDL), and the STRIPS soundness meta-theory developed by Vladimir Lifschitz. It also contains formally verified checkers for checking the well-formedness of problems specified in either language as well the correctness of potential solutions. The formalisation in this entry was described in an earlier publication. A Sound Type System for Physical Quantities, Units, and Measurements https://www.isa-afp.org/entries/Physical_Quantities.html https://www.isa-afp.org/entries/Physical_Quantities.html Simon Foster, Burkhart Wolff 20 Oct 2020 00:00:00 +0000 The present Isabelle theory builds a formal model for both the International System of Quantities (ISQ) and the International System of Units (SI), which are both fundamental for physics and engineering. Both the ISQ and the SI are deeply integrated into Isabelle's type system. Quantities are parameterised by dimension types, which correspond to base vectors, and thus only quantities of the same dimension can be equated. Since the underlying "algebra of quantities" induces congruences on quantity and SI types, specific tactic support is developed to capture these. Our construction is validated by a test-set of known equivalences between both quantities and SI units. Moreover, the presented theory can be used for type-safe conversions between the SI system and others, like the British Imperial System (BIS). Finite Map Extras https://www.isa-afp.org/entries/Finite-Map-Extras.html https://www.isa-afp.org/entries/Finite-Map-Extras.html Javier Díaz 12 Oct 2020 00:00:00 +0000 This entry includes useful syntactic sugar, new operators and functions, and their associated lemmas for finite maps which currently are not present in the standard Finite_Map theory. A Formal Model of the Safely Composable Document Object Model with Shadow Roots https://www.isa-afp.org/entries/Shadow_SC_DOM.html https://www.isa-afp.org/entries/Shadow_SC_DOM.html Achim D. Brucker, Michael Herzberg 28 Sep 2020 00:00:00 +0000 In this AFP entry, we extend our formalization of the safely composable DOM with Shadow Roots. This is a proposal for Shadow Roots with stricter safety guarantess than the standard compliant formalization (see "Shadow DOM"). Shadow Roots are a recent proposal of the web community to support a component-based development approach for client-side web applications. Shadow roots are a significant extension to the DOM standard and, as web standards are condemned to be backward compatible, such extensions often result in complex specification that may contain unwanted subtleties that can be detected by a formalization. Our Isabelle/HOL formalization is, in the sense of object-orientation, an extension of our formalization of the core DOM and enjoys the same basic properties, i.e., it is extensible, i.e., can be extended without the need of re-proving already proven properties and executable, i.e., we can generate executable code from our specification. We exploit the executability to show that our formalization complies to the official standard of the W3C, respectively, the WHATWG. 
A Formal Model of the Document Object Model with Shadow Roots https://www.isa-afp.org/entries/Shadow_DOM.html https://www.isa-afp.org/entries/Shadow_DOM.html Achim D. Brucker, Michael Herzberg 28 Sep 2020 00:00:00 +0000 In this AFP entry, we extend our formalization of the core DOM with Shadow Roots. Shadow roots are a recent proposal of the web community to support a component-based development approach for client-side web applications. Shadow roots are a significant extension to the DOM standard and, as web standards are condemned to be backward compatible, such extensions often result in complex specification that may contain unwanted subtleties that can be detected by a formalization. Our Isabelle/HOL formalization is, in the sense of object-orientation, an extension of our formalization of the core DOM and enjoys the same basic properties, i.e., it is extensible, i.e., can be extended without the need of re-proving already proven properties and executable, i.e., we can generate executable code from our specification. We exploit the executability to show that our formalization complies to the official standard of the W3C, respectively, the WHATWG. A Formalization of Safely Composable Web Components https://www.isa-afp.org/entries/SC_DOM_Components.html https://www.isa-afp.org/entries/SC_DOM_Components.html Achim D. Brucker, Michael Herzberg 28 Sep 2020 00:00:00 +0000 While the (safely composable) DOM with shadow trees provide the technical basis for defining web components, it does neither defines the concept of web components nor specifies the safety properties that web components should guarantee. Consequently, the standard also does not discuss how or even if the methods for modifying the DOM respect component boundaries. In AFP entry, we present a formally verified model of safely composable web components and define safety properties which ensure that different web components can only interact with each other using well-defined interfaces. Moreover, our verification of the application programming interface (API) of the DOM revealed numerous invariants that implementations of the DOM API need to preserve to ensure the integrity of components. In comparison to the strict standard compliance formalization of Web Components in the AFP entry "DOM_Components", the notion of components in this entry (based on "SC_DOM" and "Shadow_SC_DOM") provides much stronger safety guarantees. A Formalization of Web Components https://www.isa-afp.org/entries/DOM_Components.html https://www.isa-afp.org/entries/DOM_Components.html Achim D. Brucker, Michael Herzberg 28 Sep 2020 00:00:00 +0000 While the DOM with shadow trees provide the technical basis for defining web components, the DOM standard neither defines the concept of web components nor specifies the safety properties that web components should guarantee. Consequently, the standard also does not discuss how or even if the methods for modifying the DOM respect component boundaries. In AFP entry, we present a formally verified model of web components and define safety properties which ensure that different web components can only interact with each other using well-defined interfaces. Moreover, our verification of the application programming interface (API) of the DOM revealed numerous invariants that implementations of the DOM API need to preserve to ensure the integrity of components. The Safely Composable DOM https://www.isa-afp.org/entries/Core_SC_DOM.html https://www.isa-afp.org/entries/Core_SC_DOM.html Achim D. 
Brucker, Michael Herzberg 28 Sep 2020 00:00:00 +0000 In this AFP entry, we formalize the core of the Safely Composable Document Object Model (SC DOM). The SC DOM improve the standard DOM (as formalized in the AFP entry "Core DOM") by strengthening the tree boundaries set by shadow roots: in the SC DOM, the shadow root is a sub-class of the document class (instead of a base class). This modifications also results in changes to some API methods (e.g., getOwnerDocument) to return the nearest shadow root rather than the document root. As a result, many API methods that, when called on a node inside a shadow tree, would previously ``break out'' and return or modify nodes that are possibly outside the shadow tree, now stay within its boundaries. This change in behavior makes programs that operate on shadow trees more predictable for the developer and allows them to make more assumptions about other code accessing the DOM. Syntax-Independent Logic Infrastructure https://www.isa-afp.org/entries/Syntax_Independent_Logic.html https://www.isa-afp.org/entries/Syntax_Independent_Logic.html Andrei Popescu, Dmitriy Traytel 16 Sep 2020 00:00:00 +0000 We formalize a notion of logic whose terms and formulas are kept abstract. In particular, logical connectives, substitution, free variables, and provability are not defined, but characterized by their general properties as locale assumptions. Based on this abstract characterization, we develop further reusable reasoning infrastructure. For example, we define parallel substitution (along with proving its characterizing theorems) from single-point substitution. Similarly, we develop a natural deduction style proof system starting from the abstract Hilbert-style one. These one-time efforts benefit different concrete logics satisfying our locales' assumptions. We instantiate the syntax-independent logic infrastructure to Robinson arithmetic (also known as Q) in the AFP entry <a href="https://www.isa-afp.org/entries/Robinson_Arithmetic.html">Robinson_Arithmetic</a> and to hereditarily finite set theory in the AFP entries <a href="https://www.isa-afp.org/entries/Goedel_HFSet_Semantic.html">Goedel_HFSet_Semantic</a> and <a href="https://www.isa-afp.org/entries/Goedel_HFSet_Semanticless.html">Goedel_HFSet_Semanticless</a>, which are part of our formalization of G&ouml;del's Incompleteness Theorems described in our CADE-27 paper <a href="https://dx.doi.org/10.1007/978-3-030-29436-6_26">A Formally Verified Abstract Account of Gödel's Incompleteness Theorems</a>. Robinson Arithmetic https://www.isa-afp.org/entries/Robinson_Arithmetic.html https://www.isa-afp.org/entries/Robinson_Arithmetic.html Andrei Popescu, Dmitriy Traytel 16 Sep 2020 00:00:00 +0000 We instantiate our syntax-independent logic infrastructure developed in <a href="https://www.isa-afp.org/entries/Syntax_Independent_Logic.html">a separate AFP entry</a> to the FOL theory of Robinson arithmetic (also known as Q). The latter was formalised using Nominal Isabelle by adapting <a href="https://www.isa-afp.org/entries/Incompleteness.html">Larry Paulson’s formalization of the Hereditarily Finite Set theory</a>. An Abstract Formalization of Gödel's Incompleteness Theorems https://www.isa-afp.org/entries/Goedel_Incompleteness.html https://www.isa-afp.org/entries/Goedel_Incompleteness.html Andrei Popescu, Dmitriy Traytel 16 Sep 2020 00:00:00 +0000 We present an abstract formalization of G&ouml;del's incompleteness theorems. 
We analyze sufficient conditions for the theorems' applicability to a partially specified logic. Our abstract perspective enables a comparison between alternative approaches from the literature. These include Rosser's variation of the first theorem, Jeroslow's variation of the second theorem, and the Swierczkowski&ndash;Paulson semantics-based approach. This AFP entry is the main entry point to the results described in our CADE-27 paper <a href="https://dx.doi.org/10.1007/978-3-030-29436-6_26">A Formally Verified Abstract Account of Gödel's Incompleteness Theorems</a>. As part of our abstract formalization's validation, we instantiate our locales twice in the separate AFP entries <a href="https://www.isa-afp.org/entries/Goedel_HFSet_Semantic.html">Goedel_HFSet_Semantic</a> and <a href="https://www.isa-afp.org/entries/Goedel_HFSet_Semanticless.html">Goedel_HFSet_Semanticless</a>. From Abstract to Concrete Gödel's Incompleteness Theorems—Part II https://www.isa-afp.org/entries/Goedel_HFSet_Semanticless.html https://www.isa-afp.org/entries/Goedel_HFSet_Semanticless.html Andrei Popescu, Dmitriy Traytel 16 Sep 2020 00:00:00 +0000 We validate an abstract formulation of G&ouml;del's Second Incompleteness Theorem from a <a href="https://www.isa-afp.org/entries/Goedel_Incompleteness.html">separate AFP entry</a> by instantiating it to the case of <i>finite consistent extensions of the Hereditarily Finite (HF) Set theory</i>, i.e., consistent FOL theories extending the HF Set theory with a finite set of axioms. The instantiation draws heavily on infrastructure previously developed by Larry Paulson in his <a href="https://www.isa-afp.org/entries/Incompleteness.html">direct formalisation of the concrete result</a>. It strengthens Paulson's formalization of G&ouml;del's Second from that entry by <i>not</i> assuming soundness, and in fact not relying on any notion of model or semantic interpretation. The strengthening was obtained by first replacing some of Paulson’s semantic arguments with proofs within his HF calculus, and then plugging in some of Paulson's (modified) lemmas to instantiate our soundness-free G&ouml;del's Second locale. From Abstract to Concrete Gödel's Incompleteness Theorems—Part I https://www.isa-afp.org/entries/Goedel_HFSet_Semantic.html https://www.isa-afp.org/entries/Goedel_HFSet_Semantic.html Andrei Popescu, Dmitriy Traytel 16 Sep 2020 00:00:00 +0000 We validate an abstract formulation of G&ouml;del's First and Second Incompleteness Theorems from a <a href="https://www.isa-afp.org/entries/Goedel_Incompleteness.html">separate AFP entry</a> by instantiating them to the case of <i>finite sound extensions of the Hereditarily Finite (HF) Set theory</i>, i.e., FOL theories extending the HF Set theory with a finite set of axioms that are sound in the standard model. The concrete results had been previously formalised in an <a href="https://www.isa-afp.org/entries/Incompleteness.html">AFP entry by Larry Paulson</a>; our instantiation reuses the infrastructure developed in that entry. A Formal Model of Extended Finite State Machines https://www.isa-afp.org/entries/Extended_Finite_State_Machines.html https://www.isa-afp.org/entries/Extended_Finite_State_Machines.html Michael Foster, Achim D. Brucker, Ramsay G. Taylor, John Derrick 07 Sep 2020 00:00:00 +0000 In this AFP entry, we provide a formalisation of extended finite state machines (EFSMs) where models are represented as finite sets of transitions between states. EFSMs execute traces to produce observable outputs. 
We also define various simulation and equality metrics for EFSMs in terms of traces and prove their strengths in relation to each other. Another key contribution is a framework of function definitions such that LTL properties can be phrased over EFSMs. Finally, we provide a simple example case study in the form of a drinks machine. Inference of Extended Finite State Machines https://www.isa-afp.org/entries/Extended_Finite_State_Machine_Inference.html https://www.isa-afp.org/entries/Extended_Finite_State_Machine_Inference.html Michael Foster, Achim D. Brucker, Ramsay G. Taylor, John Derrick 07 Sep 2020 00:00:00 +0000 In this AFP entry, we provide a formal implementation of a state-merging technique to infer extended finite state machines (EFSMs), complete with output and update functions, from black-box traces. In particular, we define the subsumption in context relation as a means of determining whether one transition is able to account for the behaviour of another. Building on this, we define the direct subsumption relation, which lifts the subsumption in context relation to EFSM level such that we can use it to determine whether it is safe to merge a given pair of transitions. Key proofs include the conditions necessary for subsumption to occur and that subsumption and direct subsumption are preorder relations. We also provide a number of different heuristics which can be used to abstract away concrete values into registers so that more states and transitions can be merged and provide proofs of the various conditions which must hold for these abstractions to subsume their ungeneralised counterparts. A Code Generator setup to create executable Scala code is also defined. Practical Algebraic Calculus Checker https://www.isa-afp.org/entries/PAC_Checker.html https://www.isa-afp.org/entries/PAC_Checker.html Mathias Fleury, Daniela Kaufmann 31 Aug 2020 00:00:00 +0000 Generating and checking proof certificates is important to increase the trust in automated reasoning tools. In recent years formal verification using computer algebra became more important and is heavily used in automated circuit verification. An existing proof format which covers algebraic reasoning and allows efficient proof checking is the practical algebraic calculus (PAC). In this development, we present the verified checker Pastèque that is obtained by synthesis via the Refinement Framework. This is the formalization going with our FMCAD'20 tool presentation. Some classical results in inductive inference of recursive functions https://www.isa-afp.org/entries/Inductive_Inference.html https://www.isa-afp.org/entries/Inductive_Inference.html Frank J. Balbach 31 Aug 2020 00:00:00 +0000 <p> This entry formalizes some classical concepts and results from inductive inference of recursive functions. In the basic setting a partial recursive function ("strategy") must identify ("learn") all functions from a set ("class") of recursive functions. To that end the strategy receives more and more values $f(0), f(1), f(2), \ldots$ of some function $f$ from the given class and in turn outputs descriptions of partial recursive functions, for example, Gödel numbers. The strategy is considered successful if the sequence of outputs ("hypotheses") converges to a description of $f$. A class of functions learnable in this sense is called "learnable in the limit". The set of all these classes is denoted by LIM. 
</p> <p> Other types of inference considered are finite learning (FIN), behaviorally correct learning in the limit (BC), and some variants of LIM with restrictions on the hypotheses: total learning (TOTAL), consistent learning (CONS), and class-preserving learning (CP). The main results formalized are the proper inclusions $\mathrm{FIN} \subset \mathrm{CP} \subset \mathrm{TOTAL} \subset \mathrm{CONS} \subset \mathrm{LIM} \subset \mathrm{BC} \subset 2^{\mathcal{R}}$, where $\mathcal{R}$ is the set of all total recursive functions. Further results show that for all these inference types except CONS, strategies can be assumed to be total recursive functions; that all inference types but CP are closed under the subset relation between classes; and that no inference type is closed under the union of classes. </p> <p> The above is based on a formalization of recursive functions heavily inspired by the <a href="https://www.isa-afp.org/entries/Universal_Turing_Machine.html">Universal Turing Machine</a> entry by Xu et al., but different in that it models partial functions with codomain <em>nat option</em>. The formalization contains a construction of a universal partial recursive function, without resorting to Turing machines, introduces decidability and recursive enumerability, and proves some standard results: existence of a Kleene normal form, the <em>s-m-n</em> theorem, Rice's theorem, and assorted fixed-point theorems (recursion theorems) by Kleene, Rogers, and Smullyan. </p> Relational Disjoint-Set Forests https://www.isa-afp.org/entries/Relational_Disjoint_Set_Forests.html https://www.isa-afp.org/entries/Relational_Disjoint_Set_Forests.html Walter Guttmann 26 Aug 2020 00:00:00 +0000 We give a simple relation-algebraic semantics of read and write operations on associative arrays. The array operations seamlessly integrate with assignments in the Hoare-logic library. Using relation algebras and Kleene algebras we verify the correctness of an array-based implementation of disjoint-set forests with a naive union operation and a find operation with path compression. - - Extensions to the Comprehensive Framework for Saturation Theorem Proving - https://www.isa-afp.org/entries/Saturation_Framework_Extensions.html - https://www.isa-afp.org/entries/Saturation_Framework_Extensions.html - Jasmin Blanchette, Sophie Tourret - 25 Aug 2020 00:00:00 +0000 - -This Isabelle/HOL formalization extends the AFP entry -<em>Saturation_Framework</em> with the following -contributions: <ul> <li>an application of the framework -to prove Bachmair and Ganzinger's resolution prover RP -refutationally complete, which was formalized in a more ad hoc fashion -by Schlichtkrull et al. in the AFP entry -<em>Ordered_Resultion_Prover</em>;</li> -<li>generalizations of various basic concepts formalized by -Schlichtkrull et al., which were needed to verify RP and could be -useful to formalize other calculi, such as superposition;</li> -<li>alternative proofs of fairness (and hence saturation and -ultimately refutational completeness) for the given clause procedures -GC and LGC, based on invariance.</li> </ul> - - - Putting the `K' into Bird's derivation of Knuth-Morris-Pratt string matching - https://www.isa-afp.org/entries/BirdKMP.html - https://www.isa-afp.org/entries/BirdKMP.html - Peter Gammie - 25 Aug 2020 00:00:00 +0000 - -Richard Bird and collaborators have proposed a derivation of an -intricate cyclic program that implements the Morris-Pratt string -matching algorithm. 
Here we provide a proof of total correctness for -Bird's derivation and complete it by adding Knuth's -optimisation. - diff --git a/web/statistics.html b/web/statistics.html --- a/web/statistics.html +++ b/web/statistics.html @@ -1,302 +1,302 @@ Archive of Formal Proofs

 

 

 

 

 

 

Statistics

- Number of Articles: 579
- Number of Authors: 372
- Number of lemmas: ~162,200
- Lines of Code: ~2,826,300
+ Number of Articles: 581
+ Number of Authors: 374
+ Number of lemmas: ~163,500
+ Lines of Code: ~2,853,500

Most used AFP articles:

Name    Used by ? articles
1. List-Index 17
2. Coinductive 12
Collections 12
Regular-Sets 12
3. Landau_Symbols 11
Show 11
4. Polynomial_Factorization 10
5. Abstract-Rewriting 9
Automatic_Refinement 9
Deriving 9
Jordan_Normal_Form 9

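The dependency counts in the table above are produced by the AFP's own build tooling. Purely as an illustration, the rough Python sketch below shows one way such "used by ? articles" figures could be approximated from a local AFP checkout, assuming a hypothetical afp/thys/<Entry>/ROOT layout in which each entry declares its dependencies via a parent session and a `sessions` block. It is an unofficial approximation, not the script that generated these statistics.

```python
# Rough, unofficial sketch: approximate "used by N articles" counts by
# scanning ROOT files of a local AFP checkout. Assumes entries live under
# afp/thys/<Entry>/ROOT (hypothetical path); the official statistics come
# from the AFP's own tooling.
import re
from collections import Counter
from pathlib import Path

AFP_THYS = Path("afp/thys")  # hypothetical checkout location

def sessions_defined(root_text):
    # 'session <Name>' headers declare the sessions an entry provides
    return set(re.findall(r'^session\s+"?([\w-]+)"?', root_text, re.M))

def sessions_referenced(root_text):
    # parent session after '=' plus names listed under a 'sessions' block
    refs = set(re.findall(r'=\s*"?([\w-]+)"?\s*\+', root_text))
    for block in re.findall(r'^\s*sessions\s*\n((?:[ \t]+\S+\n)+)', root_text, re.M):
        refs.update(re.findall(r'[\w-]+', block))
    return refs

defined_by = {}   # session name -> entry directory that defines it
referenced = {}   # entry directory -> session names it refers to
for root in AFP_THYS.glob("*/ROOT"):
    text = root.read_text(encoding="utf-8")
    entry = root.parent.name
    for name in sessions_defined(text):
        defined_by[name] = entry
    referenced[entry] = sessions_referenced(text)

usage = Counter()
for entry, refs in referenced.items():
    used = {defined_by[name] for name in refs if name in defined_by}
    used.discard(entry)  # an entry does not count as using itself
    for other in used:
        usage[other] += 1

for other, count in usage.most_common(10):
    print(f"{other:30} used by {count} articles")
```

Counting distinct citing entries, rather than individual session references, mirrors the "used by ? articles" phrasing of the table.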
Growth in number of articles:

Growth in lines of code:

Growth in number of authors:

Size of articles:

\ No newline at end of file diff --git a/web/topics.html b/web/topics.html --- a/web/topics.html +++ b/web/topics.html @@ -1,937 +1,939 @@ Archive of Formal Proofs

 

 

 

 

 

 

Index by Topic

 

Computer science

Artificial intelligence

Automata and formal languages

Algorithms

Knuth_Morris_Pratt   Probabilistic_While   Comparison_Sort_Lower_Bound   Quick_Sort_Cost   TortoiseHare   Selection_Heap_Sort   VerifyThis2018   CYK   Boolean_Expression_Checkers   Efficient-Mergesort   SATSolverVerification   MuchAdoAboutTwo   First_Order_Terms   Monad_Memo_DP   Hidden_Markov_Models   Imperative_Insertion_Sort   Formal_SSA   ROBDD   Median_Of_Medians_Selection   Fisher_Yates   Optimal_BST   IMP2   Auto2_Imperative_HOL   List_Inversions   IMP2_Binary_Heap   MFOTL_Monitor   Adaptive_State_Counting   Generic_Join   VerifyThis2019   Generalized_Counting_Sort   MFODL_Monitor_Optimized   Sliding_Window_Algorithm   PAC_Checker   Graph: DFS_Framework   Prpu_Maxflow   Floyd_Warshall   Roy_Floyd_Warshall   Dijkstra_Shortest_Path   EdmondsKarp_Maxflow   Depth-First-Search   GraphMarkingIBP   Transitive-Closure   Transitive-Closure-II   Gabow_SCC   Kruskal   Prim_Dijkstra_Simple   Relational_Minimum_Spanning_Trees   Distributed: DiskPaxos   GenClock   ClockSynchInst   Heard_Of   Consensus_Refined   Abortable_Linearizable_Modules   IMAP-CRDT   CRDT   Chandy_Lamport   OpSets   Stellar_Quorums   WOOT_Strong_Eventual_Consistency   Concurrent: ConcurrentGC   Online: List_Update   Geometry: Closest_Pair_Points   Approximation: Approximation_Algorithms   Mathematical: FFT   Gauss-Jordan-Elim-Fun   UpDown_Scheme   Polynomials   Gauss_Jordan   Echelon_Form   QR_Decomposition   Hermite   Groebner_Bases   Diophantine_Eqns_Lin_Hom   Taylor_Models   LLL_Basis_Reduction   Signature_Groebner   Smith_Normal_Form   Safe_Distance   Optimization: Simplex   Quantum computing: Isabelle_Marries_Dirac  

Concurrency

Data structures

Functional programming

Hardware

SPARCv8  

Machine learning

Networks

Programming languages

Clean   Decl_Sem_Fun_PL   Language definitions: CakeML   WebAssembly   pGCL   GPU_Kernel_PL   LightweightJava   CoreC++   FeatherweightJava   Jinja   JinjaThreads   Locally-Nameless-Sigma   AutoFocus-Stream   FocusStreamsCaseStudies   Isabelle_Meta_Model   Simpl   Complx   Safe_OCL   Isabelle_C   JinjaDCI   Lambda calculi: Higher_Order_Terms   Launchbury   PCF   POPLmark-deBruijn   Lam-ml-Normalization   LambdaMu   Binding_Syntax_Theory   LambdaAuth   Type systems: Name_Carrying_Type_Inference   MiniML   Possibilistic_Noninterference   SIFUM_Type_Systems   Dependent_SIFUM_Type_Systems   Strong_Security   WHATandWHERE_Security   VolpanoSmith   Physical_Quantities   Logics: ConcurrentIMP   Refine_Monadic   Automatic_Refinement   MonoBoolTranAlgebra   Simpl   Separation_Algebra   Separation_Logic_Imperative_HOL   Relational-Incorrectness-Logic   Abstract-Hoare-Logics   Kleene_Algebra   KAT_and_DRA   KAD   BytecodeLogicJmlTypes   DataRefinementIBP   RefinementReactive   SIFPL   TLA   Ribbon_Proofs   Separata   Complx   Differential_Dynamic_Logic   Hoare_Time   IMP2   UTP   QHLProver   Differential_Game_Logic   Compiling: CakeML_Codegen   Compiling-Exceptions-Correctly   NormByEval   Density_Compiler   VeriComp   Static analysis: RIPEMD-160-SPARK   Program-Conflict-Analysis   Shivers-CFA   Slicing   HRB-Slicing   InfPathElimination   Abs_Int_ITP2012   Transformations: Call_Arity   Refine_Imperative_HOL   WorkerWrapper   Monad_Memo_DP   Formal_SSA   Minimal_SSA   Misc: JiveDataStoreModel   Pop_Refinement   Case_Labeling   Interpreter_Optimizations  

Security

Semantics

System description languages

Logic

Philosophical aspects

General logic

Computability

Set theory

Proof theory

Rewriting

Mathematics

Order

Algebra

Analysis

Probability theory

Number theory

Games and economics

Geometry

Topology

Graph theory

Combinatorics

Category theory

Physics

Misc

Tools

\ No newline at end of file