diff --git a/src/HOL/Statespace/StateSpaceEx.thy b/src/HOL/Statespace/StateSpaceEx.thy --- a/src/HOL/Statespace/StateSpaceEx.thy +++ b/src/HOL/Statespace/StateSpaceEx.thy @@ -1,511 +1,511 @@ (* Title: HOL/Statespace/StateSpaceEx.thy Author: Norbert Schirmer, TU Muenchen, 2007 Author: Norbert Schirmer, Apple, 2021 *) section \Examples \label{sec:Examples}\ theory StateSpaceEx imports StateSpaceLocale StateSpaceSyntax begin (*<*) syntax "_statespace_updates" :: "('a \ 'b) \ updbinds \ ('a \ 'b)" ("_\_\" [900,0] 900) (*>*) text \Did you ever dream about records with multiple inheritance? Then you should definitely have a look at statespaces. They may be what you are dreaming of. Or at least almost \dots\ text \Isabelle allows to add new top-level commands to the system. Building on the locale infrastructure, we provide a command \<^theory_text>\statespace\ like this:\ statespace vars = n::nat b::bool print_locale vars_namespace print_locale vars_valuetypes print_locale vars text \\noindent This resembles a \<^theory_text>\record\ definition, but introduces sophisticated locale infrastructure instead of HOL type schemes. The resulting context postulates two distinct names \<^term>\n\ and \<^term>\b\ and projection~/ injection functions that convert from abstract values to \<^typ>\nat\ and \bool\. The logical content of the locale is:\ locale vars' = fixes n::'name and b::'name assumes "distinct [n, b]" fixes project_nat::"'value \ nat" and inject_nat::"nat \ 'value" assumes "\n. project_nat (inject_nat n) = n" fixes project_bool::"'value \ bool" and inject_bool::"bool \ 'value" assumes "\b. project_bool (inject_bool b) = b" text \\noindent The HOL predicate \<^const>\distinct\ describes distinctness of all names in the context. Locale \vars'\ defines the raw logical content that is defined in the state space locale. We also maintain non-logical context information to support the user: \<^item> Syntax for state lookup and updates that automatically inserts the corresponding projection and injection functions. \<^item> Setup for the proof tools that exploit the distinctness information and the cancellation of projections and injections in deductions and simplifications. This extra-logical information is added to the locale in form of declarations, which associate the name of a variable to the corresponding projection and injection functions to handle the syntax transformations, and a link from the variable name to the corresponding distinctness theorem. As state spaces are merged or extended there are multiple distinctness theorems in the context. Our declarations take care that the link always points to the strongest distinctness assumption. With these declarations in place, a lookup can be written as \s\n\, which is translated to \project_nat (s n)\, and an update as \s\n := 2\\, which is translated to \s(n := inject_nat 2)\. We can now establish the following lemma:\ lemma (in vars) foo: "s\b = s\b" by simp text \\noindent Here the simplifier was able to refer to distinctness of \<^term>\b\ and \<^term>\n\ to solve the equation. The resulting lemma is also recorded in locale \vars\ for later use and is automatically propagated to all its interpretations. Here is another example:\ statespace 'a varsX = NB: vars [n=N, b=B] + vars + x::'a text \\noindent The state space \varsX\ imports two copies of the state space \vars\, where one has the variables renamed to upper-case letters, and adds another variable \<^term>\x\ of type \<^typ>\'a\. 
This type is fixed inside the state space but may get instantiated later on, analogous to type parameters of an ML-functor. The distinctness assumption is now \distinct [N, B, n, b, x]\, from this we can derive both \<^term>\distinct [N,B]\ and \<^term>\distinct [n,b]\, the distinction assumptions for the two versions of locale \vars\ above. Moreover we have all necessary projection and injection assumptions available. These assumptions together allow us to establish state space \<^term>\varsX\ as an interpretation of both instances of locale \<^term>\vars\. Hence we inherit both variants of theorem \foo\: \s\N := 2\\B = s\B\ as well as \s\n := 2\\b = s\b\. These are immediate consequences of the locale interpretation action. The declarations for syntax and the distinctness theorems also observe the morphisms generated by the locale package due to the renaming \<^term>\n = N\:\ lemma (in varsX) foo: "s\N := 2\\x = s\x" by simp text \To assure scalability towards many distinct names, the distinctness predicate is refined to operate on balanced trees. Thus we get logarithmic certificates for the distinctness of two names by the distinctness of the paths in the tree. Asked for the distinctness of two names, our tool produces the paths of the variables in the tree (this is implemented in Isabelle/ML, outside the logic) and returns a certificate corresponding to the different paths. Merging state spaces requires to prove that the combined distinctness assumption implies the distinctness assumptions of the components. Such a proof is of the order $m \cdot \log n$, where $n$ and $m$ are the number of nodes in the larger and smaller tree, respectively.\ text \We continue with more examples.\ statespace 'a foo = f::"nat\nat" a::int b::nat c::'a lemma (in foo) foo1: shows "s\a := i\\a = i" by simp lemma (in foo) foo2: shows "(s\a:=i\)\a = i" by simp lemma (in foo) foo3: shows "(s\a:=i\)\b = s\b" by simp lemma (in foo) foo4: shows "(s\a:=i,b:=j,c:=k,a:=x\) = (s\b:=j,c:=k,a:=x\)" by simp statespace bar = b::bool c::string lemma (in bar) bar1: shows "(s\b:=True\)\c = s\c" by simp text \You can define a derived state space by inheriting existing state spaces, renaming of components if you like, and by declaring new components. \ statespace ('a,'b) loo = 'a foo + bar [b=B,c=C] + X::'b lemma (in loo) loo1: shows "s\a:=i\\B = s\B" proof - thm foo1 txt \The Lemma @{thm [source] foo1} from the parent state space is also available here: \begin{center}@{thm foo1}\end{center}\ have "s\a = i" by (rule foo1) thm bar1 txt \Note the renaming of the parameters in Lemma @{thm [source] bar1}: \begin{center}@{thm bar1}\end{center}\ have "s\C = s\C" by (rule bar1) show ?thesis by simp qed statespace 'a dup = FA: 'a foo [f=F, a=A] + 'a foo + x::int lemma (in dup) shows "s\x = s\x" by simp lemma (in dup) shows "s\a = s\a" by simp lemma (in dup) shows "s\x = s\x" by simp text \There were known problems with syntax-declarations. They only worked when the context is already completely built. This is now overcome. e.g.:\ locale fooX = foo + assumes "s\b = k" text \ We can also put statespaces side-by-side by using ordinary @{command locale} expressions (instead of the @{command statespace}). 
\ locale side_by_side = foo + bar where b="B::'a" and c=C for B C context side_by_side begin text \Simplification within one of the statespaces works as expected.\ lemma "s\C = s\C" by simp lemma "s\b = s\b" by simp text \In contrast to the statespace @{locale loo} there is no 'inter' statespace distinctness between the names of @{locale foo} and @{locale bar}.\ end text \Sharing of names in side-by-side statespaces is also possible as long as they are mapped -to the same type.}\ +to the same type.\ statespace vars1 = n::nat m::nat statespace vars2 = n::nat k::nat locale vars1_vars2 = vars1 + vars2 context vars1_vars2 begin text \Note that the distinctness theorem for @{locale vars1} is selected here to do the proof.\ lemma "s\m = s\m" by simp text \Note that the distinctness theorem for @{locale vars2} is selected here to do the proof.\ lemma "s\k = s\k" by simp text \Still there is no inter-statespace distinctness.\ lemma "s\m = s\m" (* apply simp *) oops end statespace merge_vars1_vars2 = vars1 + vars2 context merge_vars1_vars2 begin text \When defining a statespace instead of a side-by-side locale we get the distinctness of all variables.\ lemma "s\m = s\m" by simp end subsection \Benchmarks\ text \Here are some bigger examples for benchmarking.\ ML \ fun make_benchmark n = writeln (Active.sendback_markup_command ("statespace benchmark" ^ string_of_int n ^ " =\n" ^ (cat_lines (map (fn i => "A" ^ string_of_int i ^ "::nat") (1 upto n))))); \ text "0.2s" statespace benchmark100 = A1::nat A2::nat A3::nat A4::nat A5::nat A6::nat A7::nat A8::nat A9::nat A10::nat A11::nat A12::nat A13::nat A14::nat A15::nat A16::nat A17::nat A18::nat A19::nat A20::nat A21::nat A22::nat A23::nat A24::nat A25::nat A26::nat A27::nat A28::nat A29::nat A30::nat A31::nat A32::nat A33::nat A34::nat A35::nat A36::nat A37::nat A38::nat A39::nat A40::nat A41::nat A42::nat A43::nat A44::nat A45::nat A46::nat A47::nat A48::nat A49::nat A50::nat A51::nat A52::nat A53::nat A54::nat A55::nat A56::nat A57::nat A58::nat A59::nat A60::nat A61::nat A62::nat A63::nat A64::nat A65::nat A66::nat A67::nat A68::nat A69::nat A70::nat A71::nat A72::nat A73::nat A74::nat A75::nat A76::nat A77::nat A78::nat A79::nat A80::nat A81::nat A82::nat A83::nat A84::nat A85::nat A86::nat A87::nat A88::nat A89::nat A90::nat A91::nat A92::nat A93::nat A94::nat A95::nat A96::nat A97::nat A98::nat A99::nat A100::nat text "2.4s" statespace benchmark500 = A1::nat A2::nat A3::nat A4::nat A5::nat A6::nat A7::nat A8::nat A9::nat A10::nat A11::nat A12::nat A13::nat A14::nat A15::nat A16::nat A17::nat A18::nat A19::nat A20::nat A21::nat A22::nat A23::nat A24::nat A25::nat A26::nat A27::nat A28::nat A29::nat A30::nat A31::nat A32::nat A33::nat A34::nat A35::nat A36::nat A37::nat A38::nat A39::nat A40::nat A41::nat A42::nat A43::nat A44::nat A45::nat A46::nat A47::nat A48::nat A49::nat A50::nat A51::nat A52::nat A53::nat A54::nat A55::nat A56::nat A57::nat A58::nat A59::nat A60::nat A61::nat A62::nat A63::nat A64::nat A65::nat A66::nat A67::nat A68::nat A69::nat A70::nat A71::nat A72::nat A73::nat A74::nat A75::nat A76::nat A77::nat A78::nat A79::nat A80::nat A81::nat A82::nat A83::nat A84::nat A85::nat A86::nat A87::nat A88::nat A89::nat A90::nat A91::nat A92::nat A93::nat A94::nat A95::nat A96::nat A97::nat A98::nat A99::nat A100::nat A101::nat A102::nat A103::nat A104::nat A105::nat A106::nat A107::nat A108::nat A109::nat A110::nat A111::nat A112::nat A113::nat A114::nat A115::nat A116::nat A117::nat A118::nat A119::nat A120::nat A121::nat A122::nat 
A123::nat A124::nat A125::nat A126::nat A127::nat A128::nat A129::nat A130::nat A131::nat A132::nat A133::nat A134::nat A135::nat A136::nat A137::nat A138::nat A139::nat A140::nat A141::nat A142::nat A143::nat A144::nat A145::nat A146::nat A147::nat A148::nat A149::nat A150::nat A151::nat A152::nat A153::nat A154::nat A155::nat A156::nat A157::nat A158::nat A159::nat A160::nat A161::nat A162::nat A163::nat A164::nat A165::nat A166::nat A167::nat A168::nat A169::nat A170::nat A171::nat A172::nat A173::nat A174::nat A175::nat A176::nat A177::nat A178::nat A179::nat A180::nat A181::nat A182::nat A183::nat A184::nat A185::nat A186::nat A187::nat A188::nat A189::nat A190::nat A191::nat A192::nat A193::nat A194::nat A195::nat A196::nat A197::nat A198::nat A199::nat A200::nat A201::nat A202::nat A203::nat A204::nat A205::nat A206::nat A207::nat A208::nat A209::nat A210::nat A211::nat A212::nat A213::nat A214::nat A215::nat A216::nat A217::nat A218::nat A219::nat A220::nat A221::nat A222::nat A223::nat A224::nat A225::nat A226::nat A227::nat A228::nat A229::nat A230::nat A231::nat A232::nat A233::nat A234::nat A235::nat A236::nat A237::nat A238::nat A239::nat A240::nat A241::nat A242::nat A243::nat A244::nat A245::nat A246::nat A247::nat A248::nat A249::nat A250::nat A251::nat A252::nat A253::nat A254::nat A255::nat A256::nat A257::nat A258::nat A259::nat A260::nat A261::nat A262::nat A263::nat A264::nat A265::nat A266::nat A267::nat A268::nat A269::nat A270::nat A271::nat A272::nat A273::nat A274::nat A275::nat A276::nat A277::nat A278::nat A279::nat A280::nat A281::nat A282::nat A283::nat A284::nat A285::nat A286::nat A287::nat A288::nat A289::nat A290::nat A291::nat A292::nat A293::nat A294::nat A295::nat A296::nat A297::nat A298::nat A299::nat A300::nat A301::nat A302::nat A303::nat A304::nat A305::nat A306::nat A307::nat A308::nat A309::nat A310::nat A311::nat A312::nat A313::nat A314::nat A315::nat A316::nat A317::nat A318::nat A319::nat A320::nat A321::nat A322::nat A323::nat A324::nat A325::nat A326::nat A327::nat A328::nat A329::nat A330::nat A331::nat A332::nat A333::nat A334::nat A335::nat A336::nat A337::nat A338::nat A339::nat A340::nat A341::nat A342::nat A343::nat A344::nat A345::nat A346::nat A347::nat A348::nat A349::nat A350::nat A351::nat A352::nat A353::nat A354::nat A355::nat A356::nat A357::nat A358::nat A359::nat A360::nat A361::nat A362::nat A363::nat A364::nat A365::nat A366::nat A367::nat A368::nat A369::nat A370::nat A371::nat A372::nat A373::nat A374::nat A375::nat A376::nat A377::nat A378::nat A379::nat A380::nat A381::nat A382::nat A383::nat A384::nat A385::nat A386::nat A387::nat A388::nat A389::nat A390::nat A391::nat A392::nat A393::nat A394::nat A395::nat A396::nat A397::nat A398::nat A399::nat A400::nat A401::nat A402::nat A403::nat A404::nat A405::nat A406::nat A407::nat A408::nat A409::nat A410::nat A411::nat A412::nat A413::nat A414::nat A415::nat A416::nat A417::nat A418::nat A419::nat A420::nat A421::nat A422::nat A423::nat A424::nat A425::nat A426::nat A427::nat A428::nat A429::nat A430::nat A431::nat A432::nat A433::nat A434::nat A435::nat A436::nat A437::nat A438::nat A439::nat A440::nat A441::nat A442::nat A443::nat A444::nat A445::nat A446::nat A447::nat A448::nat A449::nat A450::nat A451::nat A452::nat A453::nat A454::nat A455::nat A456::nat A457::nat A458::nat A459::nat A460::nat A461::nat A462::nat A463::nat A464::nat A465::nat A466::nat A467::nat A468::nat A469::nat A470::nat A471::nat A472::nat A473::nat A474::nat A475::nat A476::nat A477::nat 
A478::nat A479::nat A480::nat A481::nat A482::nat A483::nat A484::nat A485::nat A486::nat A487::nat A488::nat A489::nat A490::nat A491::nat A492::nat A493::nat A494::nat A495::nat A496::nat A497::nat A498::nat A499::nat A500::nat text "9.0s" statespace benchmark1000 = A1::nat A2::nat A3::nat A4::nat A5::nat A6::nat A7::nat A8::nat A9::nat A10::nat A11::nat A12::nat A13::nat A14::nat A15::nat A16::nat A17::nat A18::nat A19::nat A20::nat A21::nat A22::nat A23::nat A24::nat A25::nat A26::nat A27::nat A28::nat A29::nat A30::nat A31::nat A32::nat A33::nat A34::nat A35::nat A36::nat A37::nat A38::nat A39::nat A40::nat A41::nat A42::nat A43::nat A44::nat A45::nat A46::nat A47::nat A48::nat A49::nat A50::nat A51::nat A52::nat A53::nat A54::nat A55::nat A56::nat A57::nat A58::nat A59::nat A60::nat A61::nat A62::nat A63::nat A64::nat A65::nat A66::nat A67::nat A68::nat A69::nat A70::nat A71::nat A72::nat A73::nat A74::nat A75::nat A76::nat A77::nat A78::nat A79::nat A80::nat A81::nat A82::nat A83::nat A84::nat A85::nat A86::nat A87::nat A88::nat A89::nat A90::nat A91::nat A92::nat A93::nat A94::nat A95::nat A96::nat A97::nat A98::nat A99::nat A100::nat A101::nat A102::nat A103::nat A104::nat A105::nat A106::nat A107::nat A108::nat A109::nat A110::nat A111::nat A112::nat A113::nat A114::nat A115::nat A116::nat A117::nat A118::nat A119::nat A120::nat A121::nat A122::nat A123::nat A124::nat A125::nat A126::nat A127::nat A128::nat A129::nat A130::nat A131::nat A132::nat A133::nat A134::nat A135::nat A136::nat A137::nat A138::nat A139::nat A140::nat A141::nat A142::nat A143::nat A144::nat A145::nat A146::nat A147::nat A148::nat A149::nat A150::nat A151::nat A152::nat A153::nat A154::nat A155::nat A156::nat A157::nat A158::nat A159::nat A160::nat A161::nat A162::nat A163::nat A164::nat A165::nat A166::nat A167::nat A168::nat A169::nat A170::nat A171::nat A172::nat A173::nat A174::nat A175::nat A176::nat A177::nat A178::nat A179::nat A180::nat A181::nat A182::nat A183::nat A184::nat A185::nat A186::nat A187::nat A188::nat A189::nat A190::nat A191::nat A192::nat A193::nat A194::nat A195::nat A196::nat A197::nat A198::nat A199::nat A200::nat A201::nat A202::nat A203::nat A204::nat A205::nat A206::nat A207::nat A208::nat A209::nat A210::nat A211::nat A212::nat A213::nat A214::nat A215::nat A216::nat A217::nat A218::nat A219::nat A220::nat A221::nat A222::nat A223::nat A224::nat A225::nat A226::nat A227::nat A228::nat A229::nat A230::nat A231::nat A232::nat A233::nat A234::nat A235::nat A236::nat A237::nat A238::nat A239::nat A240::nat A241::nat A242::nat A243::nat A244::nat A245::nat A246::nat A247::nat A248::nat A249::nat A250::nat A251::nat A252::nat A253::nat A254::nat A255::nat A256::nat A257::nat A258::nat A259::nat A260::nat A261::nat A262::nat A263::nat A264::nat A265::nat A266::nat A267::nat A268::nat A269::nat A270::nat A271::nat A272::nat A273::nat A274::nat A275::nat A276::nat A277::nat A278::nat A279::nat A280::nat A281::nat A282::nat A283::nat A284::nat A285::nat A286::nat A287::nat A288::nat A289::nat A290::nat A291::nat A292::nat A293::nat A294::nat A295::nat A296::nat A297::nat A298::nat A299::nat A300::nat A301::nat A302::nat A303::nat A304::nat A305::nat A306::nat A307::nat A308::nat A309::nat A310::nat A311::nat A312::nat A313::nat A314::nat A315::nat A316::nat A317::nat A318::nat A319::nat A320::nat A321::nat A322::nat A323::nat A324::nat A325::nat A326::nat A327::nat A328::nat A329::nat A330::nat A331::nat A332::nat A333::nat A334::nat A335::nat A336::nat A337::nat A338::nat A339::nat 
A340::nat A341::nat A342::nat A343::nat A344::nat A345::nat A346::nat A347::nat A348::nat A349::nat A350::nat A351::nat A352::nat A353::nat A354::nat A355::nat A356::nat A357::nat A358::nat A359::nat A360::nat A361::nat A362::nat A363::nat A364::nat A365::nat A366::nat A367::nat A368::nat A369::nat A370::nat A371::nat A372::nat A373::nat A374::nat A375::nat A376::nat A377::nat A378::nat A379::nat A380::nat A381::nat A382::nat A383::nat A384::nat A385::nat A386::nat A387::nat A388::nat A389::nat A390::nat A391::nat A392::nat A393::nat A394::nat A395::nat A396::nat A397::nat A398::nat A399::nat A400::nat A401::nat A402::nat A403::nat A404::nat A405::nat A406::nat A407::nat A408::nat A409::nat A410::nat A411::nat A412::nat A413::nat A414::nat A415::nat A416::nat A417::nat A418::nat A419::nat A420::nat A421::nat A422::nat A423::nat A424::nat A425::nat A426::nat A427::nat A428::nat A429::nat A430::nat A431::nat A432::nat A433::nat A434::nat A435::nat A436::nat A437::nat A438::nat A439::nat A440::nat A441::nat A442::nat A443::nat A444::nat A445::nat A446::nat A447::nat A448::nat A449::nat A450::nat A451::nat A452::nat A453::nat A454::nat A455::nat A456::nat A457::nat A458::nat A459::nat A460::nat A461::nat A462::nat A463::nat A464::nat A465::nat A466::nat A467::nat A468::nat A469::nat A470::nat A471::nat A472::nat A473::nat A474::nat A475::nat A476::nat A477::nat A478::nat A479::nat A480::nat A481::nat A482::nat A483::nat A484::nat A485::nat A486::nat A487::nat A488::nat A489::nat A490::nat A491::nat A492::nat A493::nat A494::nat A495::nat A496::nat A497::nat A498::nat A499::nat A500::nat A501::nat A502::nat A503::nat A504::nat A505::nat A506::nat A507::nat A508::nat A509::nat A510::nat A511::nat A512::nat A513::nat A514::nat A515::nat A516::nat A517::nat A518::nat A519::nat A520::nat A521::nat A522::nat A523::nat A524::nat A525::nat A526::nat A527::nat A528::nat A529::nat A530::nat A531::nat A532::nat A533::nat A534::nat A535::nat A536::nat A537::nat A538::nat A539::nat A540::nat A541::nat A542::nat A543::nat A544::nat A545::nat A546::nat A547::nat A548::nat A549::nat A550::nat A551::nat A552::nat A553::nat A554::nat A555::nat A556::nat A557::nat A558::nat A559::nat A560::nat A561::nat A562::nat A563::nat A564::nat A565::nat A566::nat A567::nat A568::nat A569::nat A570::nat A571::nat A572::nat A573::nat A574::nat A575::nat A576::nat A577::nat A578::nat A579::nat A580::nat A581::nat A582::nat A583::nat A584::nat A585::nat A586::nat A587::nat A588::nat A589::nat A590::nat A591::nat A592::nat A593::nat A594::nat A595::nat A596::nat A597::nat A598::nat A599::nat A600::nat A601::nat A602::nat A603::nat A604::nat A605::nat A606::nat A607::nat A608::nat A609::nat A610::nat A611::nat A612::nat A613::nat A614::nat A615::nat A616::nat A617::nat A618::nat A619::nat A620::nat A621::nat A622::nat A623::nat A624::nat A625::nat A626::nat A627::nat A628::nat A629::nat A630::nat A631::nat A632::nat A633::nat A634::nat A635::nat A636::nat A637::nat A638::nat A639::nat A640::nat A641::nat A642::nat A643::nat A644::nat A645::nat A646::nat A647::nat A648::nat A649::nat A650::nat A651::nat A652::nat A653::nat A654::nat A655::nat A656::nat A657::nat A658::nat A659::nat A660::nat A661::nat A662::nat A663::nat A664::nat A665::nat A666::nat A667::nat A668::nat A669::nat A670::nat A671::nat A672::nat A673::nat A674::nat A675::nat A676::nat A677::nat A678::nat A679::nat A680::nat A681::nat A682::nat A683::nat A684::nat A685::nat A686::nat A687::nat A688::nat A689::nat A690::nat A691::nat A692::nat A693::nat A694::nat 
A695::nat A696::nat A697::nat A698::nat A699::nat A700::nat A701::nat A702::nat A703::nat A704::nat A705::nat A706::nat A707::nat A708::nat A709::nat A710::nat A711::nat A712::nat A713::nat A714::nat A715::nat A716::nat A717::nat A718::nat A719::nat A720::nat A721::nat A722::nat A723::nat A724::nat A725::nat A726::nat A727::nat A728::nat A729::nat A730::nat A731::nat A732::nat A733::nat A734::nat A735::nat A736::nat A737::nat A738::nat A739::nat A740::nat A741::nat A742::nat A743::nat A744::nat A745::nat A746::nat A747::nat A748::nat A749::nat A750::nat A751::nat A752::nat A753::nat A754::nat A755::nat A756::nat A757::nat A758::nat A759::nat A760::nat A761::nat A762::nat A763::nat A764::nat A765::nat A766::nat A767::nat A768::nat A769::nat A770::nat A771::nat A772::nat A773::nat A774::nat A775::nat A776::nat A777::nat A778::nat A779::nat A780::nat A781::nat A782::nat A783::nat A784::nat A785::nat A786::nat A787::nat A788::nat A789::nat A790::nat A791::nat A792::nat A793::nat A794::nat A795::nat A796::nat A797::nat A798::nat A799::nat A800::nat A801::nat A802::nat A803::nat A804::nat A805::nat A806::nat A807::nat A808::nat A809::nat A810::nat A811::nat A812::nat A813::nat A814::nat A815::nat A816::nat A817::nat A818::nat A819::nat A820::nat A821::nat A822::nat A823::nat A824::nat A825::nat A826::nat A827::nat A828::nat A829::nat A830::nat A831::nat A832::nat A833::nat A834::nat A835::nat A836::nat A837::nat A838::nat A839::nat A840::nat A841::nat A842::nat A843::nat A844::nat A845::nat A846::nat A847::nat A848::nat A849::nat A850::nat A851::nat A852::nat A853::nat A854::nat A855::nat A856::nat A857::nat A858::nat A859::nat A860::nat A861::nat A862::nat A863::nat A864::nat A865::nat A866::nat A867::nat A868::nat A869::nat A870::nat A871::nat A872::nat A873::nat A874::nat A875::nat A876::nat A877::nat A878::nat A879::nat A880::nat A881::nat A882::nat A883::nat A884::nat A885::nat A886::nat A887::nat A888::nat A889::nat A890::nat A891::nat A892::nat A893::nat A894::nat A895::nat A896::nat A897::nat A898::nat A899::nat A900::nat A901::nat A902::nat A903::nat A904::nat A905::nat A906::nat A907::nat A908::nat A909::nat A910::nat A911::nat A912::nat A913::nat A914::nat A915::nat A916::nat A917::nat A918::nat A919::nat A920::nat A921::nat A922::nat A923::nat A924::nat A925::nat A926::nat A927::nat A928::nat A929::nat A930::nat A931::nat A932::nat A933::nat A934::nat A935::nat A936::nat A937::nat A938::nat A939::nat A940::nat A941::nat A942::nat A943::nat A944::nat A945::nat A946::nat A947::nat A948::nat A949::nat A950::nat A951::nat A952::nat A953::nat A954::nat A955::nat A956::nat A957::nat A958::nat A959::nat A960::nat A961::nat A962::nat A963::nat A964::nat A965::nat A966::nat A967::nat A968::nat A969::nat A970::nat A971::nat A972::nat A973::nat A974::nat A975::nat A976::nat A977::nat A978::nat A979::nat A980::nat A981::nat A982::nat A983::nat A984::nat A985::nat A986::nat A987::nat A988::nat A989::nat A990::nat A991::nat A992::nat A993::nat A994::nat A995::nat A996::nat A997::nat A998::nat A999::nat A1000::nat lemma (in benchmark100) test: "s\A100 = s\A100" by simp lemma (in benchmark500) test: "s\A100 = s\A100" by simp lemma (in benchmark1000) test: "s\A100 = s\A100" by simp end diff --git a/src/HOL/Tools/record.ML b/src/HOL/Tools/record.ML --- a/src/HOL/Tools/record.ML +++ b/src/HOL/Tools/record.ML @@ -1,2424 +1,2425 @@ (* Title: HOL/Tools/record.ML Author: Wolfgang Naraschewski, TU Muenchen Author: Markus Wenzel, TU Muenchen Author: Norbert Schirmer, TU Muenchen Author: Thomas Sewell, NICTA 
Extensible records with structural subtyping. *) signature RECORD = sig val type_abbr: bool Config.T val type_as_fields: bool Config.T val timing: bool Config.T type info = {args: (string * sort) list, parent: (typ list * string) option, fields: (string * typ) list, extension: (string * typ list), ext_induct: thm, ext_inject: thm, ext_surjective: thm, ext_split: thm, ext_def: thm, select_convs: thm list, update_convs: thm list, select_defs: thm list, update_defs: thm list, fold_congs: thm list, unfold_congs: thm list, splits: thm list, defs: thm list, surjective: thm, equality: thm, induct_scheme: thm, induct: thm, cases_scheme: thm, cases: thm, simps: thm list, iffs: thm list} val get_info: theory -> string -> info option val the_info: theory -> string -> info val get_hierarchy: theory -> (string * typ list) -> (string * ((string * sort) * typ) list) list val add_record: {overloaded: bool} -> (string * sort) list * binding -> (typ list * string) option -> (binding * typ * mixfix) list -> theory -> theory val last_extT: typ -> (string * typ list) option val dest_recTs: typ -> (string * typ list) list val get_extT_fields: theory -> typ -> (string * typ) list * (string * typ) val get_recT_fields: theory -> typ -> (string * typ) list * (string * typ) val get_parent: theory -> string -> (typ list * string) option val get_extension: theory -> string -> (string * typ list) option val get_extinjects: theory -> thm list val get_simpset: theory -> simpset val simproc: simproc val eq_simproc: simproc val upd_simproc: simproc val split_simproc: (term -> int) -> simproc val ex_sel_eq_simproc: simproc val split_tac: Proof.context -> int -> tactic val split_simp_tac: Proof.context -> thm list -> (term -> int) -> int -> tactic val split_wrapper: string * (Proof.context -> wrapper) val pretty_recT: Proof.context -> typ -> Pretty.T val string_of_record: Proof.context -> string -> string val codegen: bool Config.T val updateN: string val ext_typeN: string val extN: string end; signature ISO_TUPLE_SUPPORT = sig val add_iso_tuple_type: {overloaded: bool} -> binding * (string * sort) list -> typ * typ -> theory -> (term * term) * theory val mk_cons_tuple: term * term -> term val dest_cons_tuple: term -> term * term val iso_tuple_intros_tac: Proof.context -> int -> tactic end; structure Iso_Tuple_Support: ISO_TUPLE_SUPPORT = struct val isoN = "_Tuple_Iso"; val iso_tuple_intro = @{thm isomorphic_tuple_intro}; val iso_tuple_intros = Tactic.build_net @{thms isomorphic_tuple.intros}; val tuple_iso_tuple = (\<^const_name>\Record.tuple_iso_tuple\, @{thm tuple_iso_tuple}); structure Iso_Tuple_Thms = Theory_Data ( type T = thm Symtab.table; val empty = Symtab.make [tuple_iso_tuple]; fun merge data = Symtab.merge Thm.eq_thm_prop data; (* FIXME handle Symtab.DUP ?? *) ); fun get_typedef_info tyco vs (({rep_type, Abs_name, ...}, {Rep_inject, Abs_inverse, ... 
}) : Typedef.info) thy = let val exists_thm = UNIV_I |> Thm.instantiate' [SOME (Thm.global_ctyp_of thy (Logic.varifyT_global rep_type))] []; val proj_constr = Abs_inverse OF [exists_thm]; val absT = Type (tyco, map TFree vs); in thy |> pair (tyco, ((Rep_inject, proj_constr), Const (Abs_name, rep_type --> absT), absT)) end fun do_typedef overloaded raw_tyco repT raw_vs thy = let val ctxt = Proof_Context.init_global thy |> Variable.declare_typ repT; val vs = map (Proof_Context.check_tfree ctxt) raw_vs; in thy |> Named_Target.theory_map_result (apsnd o Typedef.transform_info) (Typedef.add_typedef overloaded (raw_tyco, vs, NoSyn) (HOLogic.mk_UNIV repT) NONE (fn ctxt' => resolve_tac ctxt' [UNIV_witness] 1)) |-> (fn (tyco, info) => get_typedef_info tyco vs info) end; fun mk_cons_tuple (left, right) = let val (leftT, rightT) = (fastype_of left, fastype_of right); val prodT = HOLogic.mk_prodT (leftT, rightT); val isomT = Type (\<^type_name>\tuple_isomorphism\, [prodT, leftT, rightT]); in Const (\<^const_name>\Record.iso_tuple_cons\, isomT --> leftT --> rightT --> prodT) $ Const (fst tuple_iso_tuple, isomT) $ left $ right end; fun dest_cons_tuple (Const (\<^const_name>\Record.iso_tuple_cons\, _) $ Const _ $ t $ u) = (t, u) | dest_cons_tuple t = raise TERM ("dest_cons_tuple", [t]); fun add_iso_tuple_type overloaded (b, alphas) (leftT, rightT) thy = let val repT = HOLogic.mk_prodT (leftT, rightT); val ((_, ((rep_inject, abs_inverse), absC, absT)), typ_thy) = thy |> do_typedef overloaded b repT alphas ||> Sign.add_path (Binding.name_of b); (*FIXME proper prefixing instead*) val typ_ctxt = Proof_Context.init_global typ_thy; (*construct a type and body for the isomorphism constant by instantiating the theorem to which the definition will be applied*) val intro_inst = rep_inject RS infer_instantiate typ_ctxt [(("abst", 0), Thm.cterm_of typ_ctxt absC)] iso_tuple_intro; val (_, body) = Logic.dest_equals (List.last (Thm.prems_of intro_inst)); val isomT = fastype_of body; val isom_binding = Binding.suffix_name isoN b; val isom_name = Sign.full_name typ_thy isom_binding; val isom = Const (isom_name, isomT); val ([isom_def], cdef_thy) = typ_thy |> Sign.declare_const_global ((isom_binding, isomT), NoSyn) |> snd |> Global_Theory.add_defs false [((Binding.concealed (Thm.def_binding isom_binding), Logic.mk_equals (isom, body)), [])]; val iso_tuple = isom_def RS (abs_inverse RS (rep_inject RS iso_tuple_intro)); val cons = Const (\<^const_name>\Record.iso_tuple_cons\, isomT --> leftT --> rightT --> absT); val thm_thy = cdef_thy |> Iso_Tuple_Thms.map (Symtab.insert Thm.eq_thm_prop (isom_name, iso_tuple)) |> Sign.restore_naming thy in ((isom, cons $ isom), thm_thy) end; fun iso_tuple_intros_tac ctxt = resolve_from_net_tac ctxt iso_tuple_intros THEN' CSUBGOAL (fn (cgoal, i) => let val goal = Thm.term_of cgoal; val isthms = Iso_Tuple_Thms.get (Proof_Context.theory_of ctxt); fun err s t = raise TERM ("iso_tuple_intros_tac: " ^ s, [t]); val goal' = Envir.beta_eta_contract goal; val is = (case goal' of Const (\<^const_name>\Trueprop\, _) $ (Const (\<^const_name>\isomorphic_tuple\, _) $ Const is) => is | _ => err "unexpected goal format" goal'); val isthm = (case Symtab.lookup isthms (#1 is) of SOME isthm => isthm | NONE => err "no thm found for constant" (Const is)); in resolve_tac ctxt [isthm] i end); end; structure Record: RECORD = struct val surject_assistI = @{thm iso_tuple_surjective_proof_assistI}; val surject_assist_idE = @{thm iso_tuple_surjective_proof_assist_idE}; val updacc_accessor_eqE = @{thm 
update_accessor_accessor_eqE}; val updacc_updator_eqE = @{thm update_accessor_updator_eqE}; val updacc_eq_idI = @{thm iso_tuple_update_accessor_eq_assist_idI}; val updacc_eq_triv = @{thm iso_tuple_update_accessor_eq_assist_triv}; val updacc_foldE = @{thm update_accessor_congruence_foldE}; val updacc_unfoldE = @{thm update_accessor_congruence_unfoldE}; val updacc_noopE = @{thm update_accessor_noopE}; val updacc_noop_compE = @{thm update_accessor_noop_compE}; val updacc_cong_idI = @{thm update_accessor_cong_assist_idI}; val updacc_cong_triv = @{thm update_accessor_cong_assist_triv}; val updacc_cong_from_eq = @{thm iso_tuple_update_accessor_cong_from_eq}; val codegen = Attrib.setup_config_bool \<^binding>\record_codegen\ (K true); (** name components **) val rN = "r"; val wN = "w"; val moreN = "more"; val schemeN = "_scheme"; val ext_typeN = "_ext"; val inner_typeN = "_inner"; val extN ="_ext"; val updateN = "_update"; val makeN = "make"; val fields_selN = "fields"; val extendN = "extend"; val truncateN = "truncate"; (*** utilities ***) fun varifyT idx = map_type_tfree (fn (a, S) => TVar ((a, idx), S)); (* timing *) val timing = Attrib.setup_config_bool \<^binding>\record_timing\ (K false); fun timeit_msg ctxt s x = if Config.get ctxt timing then (warning s; timeit x) else x (); fun timing_msg ctxt s = if Config.get ctxt timing then warning s else (); (* syntax *) val Trueprop = HOLogic.mk_Trueprop; infix 0 :== ===; infixr 0 ==>; val op :== = Misc_Legacy.mk_defpair; val op === = Trueprop o HOLogic.mk_eq; val op ==> = Logic.mk_implies; (* constructor *) fun mk_ext (name, T) ts = let val Ts = map fastype_of ts in list_comb (Const (suffix extN name, Ts ---> T), ts) end; (* selector *) fun mk_selC sT (c, T) = (c, sT --> T); fun mk_sel s (c, T) = let val sT = fastype_of s in Const (mk_selC sT (c, T)) $ s end; (* updates *) fun mk_updC sfx sT (c, T) = (suffix sfx c, (T --> T) --> sT --> sT); fun mk_upd' sfx c v sT = let val vT = domain_type (fastype_of v); in Const (mk_updC sfx sT (c, vT)) $ v end; fun mk_upd sfx c v s = mk_upd' sfx c v (fastype_of s) $ s; (* types *) fun dest_recT (typ as Type (c_ext_type, Ts as (_ :: _))) = (case try (unsuffix ext_typeN) c_ext_type of NONE => raise TYPE ("Record.dest_recT", [typ], []) | SOME c => ((c, Ts), List.last Ts)) | dest_recT typ = raise TYPE ("Record.dest_recT", [typ], []); val is_recT = can dest_recT; fun dest_recTs T = let val ((c, Ts), U) = dest_recT T in (c, Ts) :: dest_recTs U end handle TYPE _ => []; fun last_extT T = let val ((c, Ts), U) = dest_recT T in (case last_extT U of NONE => SOME (c, Ts) | SOME l => SOME l) end handle TYPE _ => NONE; fun rec_id i T = let val rTs = dest_recTs T; val rTs' = if i < 0 then rTs else take i rTs; in implode (map #1 rTs') end; (*** extend theory by record definition ***) (** record info **) (* type info and parent_info *) type info = {args: (string * sort) list, parent: (typ list * string) option, fields: (string * typ) list, extension: (string * typ list), ext_induct: thm, ext_inject: thm, ext_surjective: thm, ext_split: thm, ext_def: thm, select_convs: thm list, update_convs: thm list, select_defs: thm list, update_defs: thm list, fold_congs: thm list, (* potentially used in L4.verified *) unfold_congs: thm list, (* potentially used in L4.verified *) splits: thm list, defs: thm list, surjective: thm, equality: thm, induct_scheme: thm, induct: thm, cases_scheme: thm, cases: thm, simps: thm list, iffs: thm list}; fun make_info args parent fields extension ext_induct ext_inject ext_surjective ext_split ext_def 
select_convs update_convs select_defs update_defs fold_congs unfold_congs splits defs surjective equality induct_scheme induct cases_scheme cases simps iffs : info = {args = args, parent = parent, fields = fields, extension = extension, ext_induct = ext_induct, ext_inject = ext_inject, ext_surjective = ext_surjective, ext_split = ext_split, ext_def = ext_def, select_convs = select_convs, update_convs = update_convs, select_defs = select_defs, update_defs = update_defs, fold_congs = fold_congs, unfold_congs = unfold_congs, splits = splits, defs = defs, surjective = surjective, equality = equality, induct_scheme = induct_scheme, induct = induct, cases_scheme = cases_scheme, cases = cases, simps = simps, iffs = iffs}; type parent_info = {name: string, fields: (string * typ) list, extension: (string * typ list), induct_scheme: thm, ext_def: thm}; fun make_parent_info name fields extension ext_def induct_scheme : parent_info = {name = name, fields = fields, extension = extension, ext_def = ext_def, induct_scheme = induct_scheme}; (* theory data *) type data = {records: info Symtab.table, sel_upd: {selectors: (int * bool) Symtab.table, updates: string Symtab.table, simpset: simpset, defset: simpset}, equalities: thm Symtab.table, extinjects: thm list, extsplit: thm Symtab.table, (*maps extension name to split rule*) splits: (thm * thm * thm * thm) Symtab.table, (*!!, ALL, EX - split-equalities, induct rule*) extfields: (string * typ) list Symtab.table, (*maps extension to its fields*) fieldext: (string * typ list) Symtab.table}; (*maps field to its extension*) fun make_data records sel_upd equalities extinjects extsplit splits extfields fieldext = {records = records, sel_upd = sel_upd, equalities = equalities, extinjects=extinjects, extsplit = extsplit, splits = splits, extfields = extfields, fieldext = fieldext }: data; structure Data = Theory_Data ( type T = data; val empty = make_data Symtab.empty {selectors = Symtab.empty, updates = Symtab.empty, simpset = HOL_basic_ss, defset = HOL_basic_ss} Symtab.empty [] Symtab.empty Symtab.empty Symtab.empty Symtab.empty; fun merge ({records = recs1, sel_upd = {selectors = sels1, updates = upds1, simpset = ss1, defset = ds1}, equalities = equalities1, extinjects = extinjects1, extsplit = extsplit1, splits = splits1, extfields = extfields1, fieldext = fieldext1}, {records = recs2, sel_upd = {selectors = sels2, updates = upds2, simpset = ss2, defset = ds2}, equalities = equalities2, extinjects = extinjects2, extsplit = extsplit2, splits = splits2, extfields = extfields2, fieldext = fieldext2}) = make_data (Symtab.merge (K true) (recs1, recs2)) {selectors = Symtab.merge (K true) (sels1, sels2), updates = Symtab.merge (K true) (upds1, upds2), simpset = Simplifier.merge_ss (ss1, ss2), defset = Simplifier.merge_ss (ds1, ds2)} (Symtab.merge Thm.eq_thm_prop (equalities1, equalities2)) (Thm.merge_thms (extinjects1, extinjects2)) (Symtab.merge Thm.eq_thm_prop (extsplit1, extsplit2)) (Symtab.merge (fn ((a, b, c, d), (w, x, y, z)) => Thm.eq_thm (a, w) andalso Thm.eq_thm (b, x) andalso Thm.eq_thm (c, y) andalso Thm.eq_thm (d, z)) (splits1, splits2)) (Symtab.merge (K true) (extfields1, extfields2)) (Symtab.merge (K true) (fieldext1, fieldext2)); ); (* access 'records' *) val get_info = Symtab.lookup o #records o Data.get; fun the_info thy name = (case get_info thy name of SOME info => info | NONE => error ("Unknown record type " ^ quote name)); fun put_record name info = Data.map (fn {records, sel_upd, equalities, extinjects, extsplit, splits, extfields, fieldext} => 
make_data (Symtab.update (name, info) records) sel_upd equalities extinjects extsplit splits extfields fieldext); (* access 'sel_upd' *) val get_sel_upd = #sel_upd o Data.get; val is_selector = Symtab.defined o #selectors o get_sel_upd; val get_updates = Symtab.lookup o #updates o get_sel_upd; val get_simpset = #simpset o get_sel_upd; val get_sel_upd_defs = #defset o get_sel_upd; fun get_update_details u thy = let val sel_upd = get_sel_upd thy in (case Symtab.lookup (#updates sel_upd) u of SOME s => let val SOME (dep, ismore) = Symtab.lookup (#selectors sel_upd) s in SOME (s, dep, ismore) end | NONE => NONE) end; fun put_sel_upd names more depth simps defs thy = let val ctxt0 = Proof_Context.init_global thy; val all = names @ [more]; val sels = map (rpair (depth, false)) names @ [(more, (depth, true))]; val upds = map (suffix updateN) all ~~ all; val {records, sel_upd = {selectors, updates, simpset, defset}, equalities, extinjects, extsplit, splits, extfields, fieldext} = Data.get thy; val data = make_data records {selectors = fold Symtab.update_new sels selectors, updates = fold Symtab.update_new upds updates, simpset = simpset_map ctxt0 (fn ctxt => ctxt addsimps simps) simpset, defset = simpset_map ctxt0 (fn ctxt => ctxt addsimps defs) defset} equalities extinjects extsplit splits extfields fieldext; in Data.put data thy end; (* access 'equalities' *) fun add_equalities name thm = Data.map (fn {records, sel_upd, equalities, extinjects, extsplit, splits, extfields, fieldext} => make_data records sel_upd (Symtab.update_new (name, thm) equalities) extinjects extsplit splits extfields fieldext); val get_equalities = Symtab.lookup o #equalities o Data.get; (* access 'extinjects' *) fun add_extinjects thm = Data.map (fn {records, sel_upd, equalities, extinjects, extsplit, splits, extfields, fieldext} => make_data records sel_upd equalities (insert Thm.eq_thm_prop thm extinjects) extsplit splits extfields fieldext); val get_extinjects = rev o #extinjects o Data.get; (* access 'extsplit' *) fun add_extsplit name thm = Data.map (fn {records, sel_upd, equalities, extinjects, extsplit, splits, extfields, fieldext} => make_data records sel_upd equalities extinjects (Symtab.update_new (name, thm) extsplit) splits extfields fieldext); (* access 'splits' *) fun add_splits name thmP = Data.map (fn {records, sel_upd, equalities, extinjects, extsplit, splits, extfields, fieldext} => make_data records sel_upd equalities extinjects extsplit (Symtab.update_new (name, thmP) splits) extfields fieldext); val get_splits = Symtab.lookup o #splits o Data.get; (* parent/extension of named record *) val get_parent = (Option.join o Option.map #parent) oo (Symtab.lookup o #records o Data.get); val get_extension = Option.map #extension oo (Symtab.lookup o #records o Data.get); (* access 'extfields' *) fun add_extfields name fields = Data.map (fn {records, sel_upd, equalities, extinjects, extsplit, splits, extfields, fieldext} => make_data records sel_upd equalities extinjects extsplit splits (Symtab.update_new (name, fields) extfields) fieldext); val get_extfields = Symtab.lookup o #extfields o Data.get; fun get_extT_fields thy T = let val ((name, Ts), moreT) = dest_recT T; val recname = let val (nm :: _ :: rst) = rev (Long_Name.explode name) (* FIXME !? 
*) in Long_Name.implode (rev (nm :: rst)) end; val varifyT = varifyT (maxidx_of_typs (moreT :: Ts) + 1); val {records, extfields, ...} = Data.get thy; val (fields, (more, _)) = split_last (Symtab.lookup_list extfields name); val args = map varifyT (snd (#extension (the (Symtab.lookup records recname)))); val subst = fold (Sign.typ_match thy) (#1 (split_last args) ~~ #1 (split_last Ts)) Vartab.empty; val fields' = map (apsnd (Envir.norm_type subst o varifyT)) fields; in (fields', (more, moreT)) end; fun get_recT_fields thy T = let val (root_fields, (root_more, root_moreT)) = get_extT_fields thy T; val (rest_fields, rest_more) = if is_recT root_moreT then get_recT_fields thy root_moreT else ([], (root_more, root_moreT)); in (root_fields @ rest_fields, rest_more) end; (* access 'fieldext' *) fun add_fieldext extname_types fields = Data.map (fn {records, sel_upd, equalities, extinjects, extsplit, splits, extfields, fieldext} => let val fieldext' = fold (fn field => Symtab.update_new (field, extname_types)) fields fieldext; in make_data records sel_upd equalities extinjects extsplit splits extfields fieldext' end); val get_fieldext = Symtab.lookup o #fieldext o Data.get; (* parent records *) local fun add_parents _ NONE = I | add_parents thy (SOME (types, name)) = let fun err msg = error (msg ^ " parent record " ^ quote name); val {args, parent, ...} = (case get_info thy name of SOME info => info | NONE => err "Unknown"); val _ = if length types <> length args then err "Bad number of arguments for" else (); fun bad_inst ((x, S), T) = if Sign.of_sort thy (T, S) then NONE else SOME x val bads = map_filter bad_inst (args ~~ types); val _ = null bads orelse err ("Ill-sorted instantiation of " ^ commas bads ^ " in"); val inst = args ~~ types; val subst = Term.map_type_tfree (the o AList.lookup (op =) inst); val parent' = Option.map (apfst (map subst)) parent; in cons (name, inst) #> add_parents thy parent' end; in fun get_hierarchy thy (name, types) = add_parents thy (SOME (types, name)) []; fun get_parent_info thy parent = add_parents thy parent [] |> map (fn (name, inst) => let val subst = Term.map_type_tfree (the o AList.lookup (op =) inst); val {fields, extension, induct_scheme, ext_def, ...} = the_info thy name; val fields' = map (apsnd subst) fields; val extension' = apsnd (map subst) extension; in make_parent_info name fields' extension' ext_def induct_scheme end); end; (** concrete syntax for records **) (* parse translations *) local fun split_args (field :: fields) ((name, arg) :: fargs) = if can (unsuffix name) field then let val (args, rest) = split_args fields fargs in (arg :: args, rest) end else raise Fail ("expecting field " ^ quote field ^ " but got " ^ quote name) | split_args [] (fargs as (_ :: _)) = ([], fargs) | split_args (_ :: _) [] = raise Fail "expecting more fields" | split_args _ _ = ([], []); fun field_type_tr ((Const (\<^syntax_const>\_field_type\, _) $ Const (name, _) $ arg)) = (name, arg) | field_type_tr t = raise TERM ("field_type_tr", [t]); fun field_types_tr (Const (\<^syntax_const>\_field_types\, _) $ t $ u) = field_type_tr t :: field_types_tr u | field_types_tr t = [field_type_tr t]; fun record_field_types_tr more ctxt t = let val thy = Proof_Context.theory_of ctxt; fun err msg = raise TERM ("Error in record-type input: " ^ msg, [t]); fun mk_ext (fargs as (name, _) :: _) = (case get_fieldext thy (Proof_Context.intern_const ctxt name) of SOME (ext, alphas) => (case get_extfields thy ext of SOME fields => let val (fields', _) = split_last fields; val types = map snd 
fields'; val (args, rest) = split_args (map fst fields') fargs handle Fail msg => err msg; val argtypes = Syntax.check_typs ctxt (map Syntax_Phases.decode_typ args); val varifyT = varifyT (fold Term.maxidx_typ argtypes ~1 + 1); val vartypes = map varifyT types; val subst = Type.raw_matches (vartypes, argtypes) Vartab.empty handle Type.TYPE_MATCH => err "type is no proper record (extension)"; val alphas' = map (Syntax_Phases.term_of_typ ctxt o Envir.norm_type subst o varifyT) (#1 (split_last alphas)); val more' = mk_ext rest; in list_comb (Syntax.const (Lexicon.mark_type (suffix ext_typeN ext)), alphas' @ [more']) end | NONE => err ("no fields defined for " ^ quote ext)) | NONE => err (quote name ^ " is no proper field")) | mk_ext [] = more; in mk_ext (field_types_tr t) end; fun record_type_tr ctxt [t] = record_field_types_tr (Syntax.const \<^type_syntax>\unit\) ctxt t | record_type_tr _ ts = raise TERM ("record_type_tr", ts); fun record_type_scheme_tr ctxt [t, more] = record_field_types_tr more ctxt t | record_type_scheme_tr _ ts = raise TERM ("record_type_scheme_tr", ts); fun field_tr ((Const (\<^syntax_const>\_field\, _) $ Const (name, _) $ arg)) = (name, arg) | field_tr t = raise TERM ("field_tr", [t]); fun fields_tr (Const (\<^syntax_const>\_fields\, _) $ t $ u) = field_tr t :: fields_tr u | fields_tr t = [field_tr t]; fun record_fields_tr more ctxt t = let val thy = Proof_Context.theory_of ctxt; fun err msg = raise TERM ("Error in record input: " ^ msg, [t]); fun mk_ext (fargs as (name, _) :: _) = (case get_fieldext thy (Proof_Context.intern_const ctxt name) of SOME (ext, _) => (case get_extfields thy ext of SOME fields => let val (args, rest) = split_args (map fst (fst (split_last fields))) fargs handle Fail msg => err msg; val more' = mk_ext rest; in list_comb (Syntax.const (Lexicon.mark_const (ext ^ extN)), args @ [more']) end | NONE => err ("no fields defined for " ^ quote ext)) | NONE => err (quote name ^ " is no proper field")) | mk_ext [] = more; in mk_ext (fields_tr t) end; fun record_tr ctxt [t] = record_fields_tr (Syntax.const \<^const_syntax>\Unity\) ctxt t | record_tr _ ts = raise TERM ("record_tr", ts); fun record_scheme_tr ctxt [t, more] = record_fields_tr more ctxt t | record_scheme_tr _ ts = raise TERM ("record_scheme_tr", ts); fun field_update_tr (Const (\<^syntax_const>\_field_update\, _) $ Const (name, _) $ arg) = Syntax.const (suffix updateN name) $ Abs (Name.uu_, dummyT, arg) | field_update_tr t = raise TERM ("field_update_tr", [t]); fun field_updates_tr (Const (\<^syntax_const>\_field_updates\, _) $ t $ u) = field_update_tr t :: field_updates_tr u | field_updates_tr t = [field_update_tr t]; fun record_update_tr [t, u] = fold (curry op $) (field_updates_tr u) t | record_update_tr ts = raise TERM ("record_update_tr", ts); in val _ = Theory.setup (Sign.parse_translation [(\<^syntax_const>\_record_update\, K record_update_tr), (\<^syntax_const>\_record\, record_tr), (\<^syntax_const>\_record_scheme\, record_scheme_tr), (\<^syntax_const>\_record_type\, record_type_tr), (\<^syntax_const>\_record_type_scheme\, record_type_scheme_tr)]); end; (* print translations *) val type_abbr = Attrib.setup_config_bool \<^binding>\record_type_abbr\ (K true); val type_as_fields = Attrib.setup_config_bool \<^binding>\record_type_as_fields\ (K true); local (* FIXME early extern (!??) *) (* FIXME Syntax.free (??) 
*) fun field_type_tr' (c, t) = Syntax.const \<^syntax_const>\_field_type\ $ Syntax.const c $ t; fun field_types_tr' (t, u) = Syntax.const \<^syntax_const>\_field_types\ $ t $ u; fun record_type_tr' ctxt t = let val thy = Proof_Context.theory_of ctxt; val T = Syntax_Phases.decode_typ t; val varifyT = varifyT (Term.maxidx_of_typ T + 1); fun strip_fields T = (case T of Type (ext, args as _ :: _) => (case try (unsuffix ext_typeN) ext of SOME ext' => (case get_extfields thy ext' of SOME (fields as (x, _) :: _) => (case get_fieldext thy x of SOME (_, alphas) => (let val (f :: fs, _) = split_last fields; val fields' = apfst (Proof_Context.extern_const ctxt) f :: map (apfst Long_Name.base_name) fs; val (args', more) = split_last args; val alphavars = map varifyT (#1 (split_last alphas)); val subst = Type.raw_matches (alphavars, args') Vartab.empty; val fields'' = (map o apsnd) (Envir.norm_type subst o varifyT) fields'; in fields'' @ strip_fields more end handle Type.TYPE_MATCH => [("", T)]) | _ => [("", T)]) | _ => [("", T)]) | _ => [("", T)]) | _ => [("", T)]); val (fields, (_, moreT)) = split_last (strip_fields T); val _ = null fields andalso raise Match; val u = foldr1 field_types_tr' (map (field_type_tr' o apsnd (Syntax_Phases.term_of_typ ctxt)) fields); in if not (Config.get ctxt type_as_fields) orelse null fields then raise Match else if moreT = HOLogic.unitT then Syntax.const \<^syntax_const>\_record_type\ $ u else Syntax.const \<^syntax_const>\_record_type_scheme\ $ u $ Syntax_Phases.term_of_typ ctxt moreT end; (*try to reconstruct the record name type abbreviation from the (nested) extension types*) fun record_type_abbr_tr' abbr alphas zeta last_ext schemeT ctxt tm = let val T = Syntax_Phases.decode_typ tm; val varifyT = varifyT (maxidx_of_typ T + 1); fun mk_type_abbr subst name args = let val abbrT = Type (name, map (varifyT o TFree) args) in Syntax_Phases.term_of_typ ctxt (Envir.norm_type subst abbrT) end; fun match rT T = Type.raw_match (varifyT rT, T) Vartab.empty; in if Config.get ctxt type_abbr then (case last_extT T of SOME (name, _) => if name = last_ext then let val subst = match schemeT T in if HOLogic.is_unitT (Envir.norm_type subst (varifyT (TFree zeta))) then mk_type_abbr subst abbr alphas else mk_type_abbr subst (suffix schemeN abbr) (alphas @ [zeta]) end handle Type.TYPE_MATCH => record_type_tr' ctxt tm else raise Match (*give print translation of specialised record a chance*) | _ => raise Match) else record_type_tr' ctxt tm end; in fun record_ext_type_tr' name = let val ext_type_name = Lexicon.mark_type (suffix ext_typeN name); fun tr' ctxt ts = record_type_tr' ctxt (list_comb (Syntax.const ext_type_name, ts)); in (ext_type_name, tr') end; fun record_ext_type_abbr_tr' abbr alphas zeta last_ext schemeT name = let val ext_type_name = Lexicon.mark_type (suffix ext_typeN name); fun tr' ctxt ts = record_type_abbr_tr' abbr alphas zeta last_ext schemeT ctxt (list_comb (Syntax.const ext_type_name, ts)); in (ext_type_name, tr') end; end; local (* FIXME Syntax.free (??) 
*) fun field_tr' (c, t) = Syntax.const \<^syntax_const>\_field\ $ Syntax.const c $ t; fun fields_tr' (t, u) = Syntax.const \<^syntax_const>\_fields\ $ t $ u; fun record_tr' ctxt t = let val thy = Proof_Context.theory_of ctxt; fun strip_fields t = (case strip_comb t of (Const (ext, _), args as (_ :: _)) => (case try (Lexicon.unmark_const o unsuffix extN) ext of SOME ext' => (case get_extfields thy ext' of SOME fields => (let val (f :: fs, _) = split_last (map fst fields); val fields' = Proof_Context.extern_const ctxt f :: map Long_Name.base_name fs; val (args', more) = split_last args; in (fields' ~~ args') @ strip_fields more end handle ListPair.UnequalLengths => [("", t)]) | NONE => [("", t)]) | NONE => [("", t)]) | _ => [("", t)]); val (fields, (_, more)) = split_last (strip_fields t); val _ = null fields andalso raise Match; val u = foldr1 fields_tr' (map field_tr' fields); in (case more of Const (\<^const_syntax>\Unity\, _) => Syntax.const \<^syntax_const>\_record\ $ u | _ => Syntax.const \<^syntax_const>\_record_scheme\ $ u $ more) end; in fun record_ext_tr' name = let val ext_name = Lexicon.mark_const (name ^ extN); fun tr' ctxt ts = record_tr' ctxt (list_comb (Syntax.const ext_name, ts)); in (ext_name, tr') end; end; local fun dest_update ctxt c = (case try Lexicon.unmark_const c of SOME d => try (unsuffix updateN) (Proof_Context.extern_const ctxt d) | NONE => NONE); fun field_updates_tr' ctxt (tm as Const (c, _) $ k $ u) = (case dest_update ctxt c of SOME name => (case try Syntax_Trans.const_abs_tr' k of SOME t => apfst (cons (Syntax.const \<^syntax_const>\_field_update\ $ Syntax.free name $ t)) (field_updates_tr' ctxt u) | NONE => ([], tm)) | NONE => ([], tm)) | field_updates_tr' _ tm = ([], tm); fun record_update_tr' ctxt tm = (case field_updates_tr' ctxt tm of ([], _) => raise Match | (ts, u) => Syntax.const \<^syntax_const>\_record_update\ $ u $ foldr1 (fn (v, w) => Syntax.const \<^syntax_const>\_field_updates\ $ v $ w) (rev ts)); in fun field_update_tr' name = let val update_name = Lexicon.mark_const (name ^ updateN); fun tr' ctxt [t, u] = record_update_tr' ctxt (Syntax.const update_name $ t $ u) | tr' _ _ = raise Match; in (update_name, tr') end; end; (** record simprocs **) fun is_sel_upd_pair thy (Const (s, _)) (Const (u, t')) = (case get_updates thy u of SOME u_name => u_name = s | NONE => raise TERM ("is_sel_upd_pair: not update", [Const (u, t')])); fun mk_comp_id f = let val T = range_type (fastype_of f) in HOLogic.mk_comp (Const (\<^const_name>\Fun.id\, T --> T), f) end; fun get_upd_funs (upd $ _ $ t) = upd :: get_upd_funs t | get_upd_funs _ = []; fun get_accupd_simps ctxt term defset = let val thy = Proof_Context.theory_of ctxt; val (acc, [body]) = strip_comb term; val upd_funs = sort_distinct Term_Ord.fast_term_ord (get_upd_funs body); fun get_simp upd = let (* FIXME fresh "f" (!?) 
*) val T = domain_type (fastype_of upd); val lhs = HOLogic.mk_comp (acc, upd $ Free ("f", T)); val rhs = if is_sel_upd_pair thy acc upd then HOLogic.mk_comp (Free ("f", T), acc) else mk_comp_id acc; val prop = lhs === rhs; val othm = Goal.prove ctxt [] [] prop (fn {context = ctxt', ...} => simp_tac (put_simpset defset ctxt') 1 THEN REPEAT_DETERM (Iso_Tuple_Support.iso_tuple_intros_tac ctxt' 1) THEN TRY (simp_tac (put_simpset HOL_ss ctxt' addsimps @{thms id_apply id_o o_id}) 1)); val dest = if is_sel_upd_pair thy acc upd then @{thm o_eq_dest} else @{thm o_eq_id_dest}; in Drule.export_without_context (othm RS dest) end; in map get_simp upd_funs end; fun get_updupd_simp ctxt defset u u' comp = let (* FIXME fresh "f" (!?) *) val f = Free ("f", domain_type (fastype_of u)); val f' = Free ("f'", domain_type (fastype_of u')); val lhs = HOLogic.mk_comp (u $ f, u' $ f'); val rhs = if comp then u $ HOLogic.mk_comp (f, f') else HOLogic.mk_comp (u' $ f', u $ f); val prop = lhs === rhs; val othm = Goal.prove ctxt [] [] prop (fn {context = ctxt', ...} => simp_tac (put_simpset defset ctxt') 1 THEN REPEAT_DETERM (Iso_Tuple_Support.iso_tuple_intros_tac ctxt' 1) THEN TRY (simp_tac (put_simpset HOL_ss ctxt' addsimps @{thms id_apply}) 1)); val dest = if comp then @{thm o_eq_dest_lhs} else @{thm o_eq_dest}; in Drule.export_without_context (othm RS dest) end; fun get_updupd_simps ctxt term defset = let val upd_funs = get_upd_funs term; val cname = fst o dest_Const; fun getswap u u' = get_updupd_simp ctxt defset u u' (cname u = cname u'); fun build_swaps_to_eq _ [] swaps = swaps | build_swaps_to_eq upd (u :: us) swaps = let val key = (cname u, cname upd); val newswaps = if Symreltab.defined swaps key then swaps else Symreltab.insert (K true) (key, getswap u upd) swaps; in if cname u = cname upd then newswaps else build_swaps_to_eq upd us newswaps end; fun swaps_needed [] _ _ swaps = map snd (Symreltab.dest swaps) | swaps_needed (u :: us) prev seen swaps = if Symtab.defined seen (cname u) then swaps_needed us prev seen (build_swaps_to_eq u prev swaps) else swaps_needed us (u :: prev) (Symtab.insert (K true) (cname u, ()) seen) swaps; in swaps_needed upd_funs [] Symtab.empty Symreltab.empty end; fun prove_unfold_defs thy ex_simps ex_simprs prop = let val ctxt = Proof_Context.init_global thy; val defset = get_sel_upd_defs thy; val prop' = Envir.beta_eta_contract prop; val (lhs, _) = Logic.dest_equals (Logic.strip_assums_concl prop'); val (_, args) = strip_comb lhs; val simps = (if length args = 1 then get_accupd_simps else get_updupd_simps) ctxt lhs defset; in Goal.prove ctxt [] [] prop' (fn {context = ctxt', ...} => simp_tac (put_simpset HOL_basic_ss ctxt' addsimps (simps @ @{thms K_record_comp})) 1 THEN TRY (simp_tac (put_simpset HOL_basic_ss ctxt' addsimps ex_simps addsimprocs ex_simprs) 1)) end; local fun eq (s1: string) (s2: string) = (s1 = s2); fun has_field extfields f T = exists (fn (eN, _) => exists (eq f o fst) (Symtab.lookup_list extfields eN)) (dest_recTs T); fun K_skeleton n (T as Type (_, [_, kT])) (b as Bound i) (Abs (x, xT, t)) = if null (loose_bnos t) then ((n, kT), (Abs (x, xT, Bound (i + 1)))) else ((n, T), b) | K_skeleton n T b _ = ((n, T), b); in (* simproc *) (* Simplify selections of an record update: (1) S (S_update k r) = k (S r) (2) S (X_update k r) = S r The simproc skips multiple updates at once, eg: S (X_update x (Y_update y (S_update k r))) = k (S r) But be careful in (2) because of the extensibility of records. 
- If S is a more-selector we have to make sure that the update on component X does not affect the selected subrecord. - If X is a more-selector we have to make sure that S is not in the updated subrecord. *) val simproc = Simplifier.make_simproc \<^context> "record" {lhss = [\<^term>\x::'a::{}\], proc = fn _ => fn ctxt => fn ct => let val thy = Proof_Context.theory_of ctxt; val t = Thm.term_of ct; in (case t of (sel as Const (s, Type (_, [_, rangeS]))) $ ((upd as Const (u, Type (_, [_, Type (_, [rT, _])]))) $ k $ r) => if is_selector thy s andalso is_some (get_updates thy u) then let val {sel_upd = {updates, ...}, extfields, ...} = Data.get thy; fun mk_eq_terms ((upd as Const (u, Type(_, [kT, _]))) $ k $ r) = (case Symtab.lookup updates u of NONE => NONE | SOME u_name => if u_name = s then (case mk_eq_terms r of NONE => let val rv = ("r", rT); val rb = Bound 0; val (kv, kb) = K_skeleton "k" kT (Bound 1) k; in SOME (upd $ kb $ rb, kb $ (sel $ rb), [kv, rv]) end | SOME (trm, trm', vars) => let val (kv, kb) = K_skeleton "k" kT (Bound (length vars)) k; in SOME (upd $ kb $ trm, kb $ trm', kv :: vars) end) else if has_field extfields u_name rangeS orelse has_field extfields s (domain_type kT) then NONE else (case mk_eq_terms r of SOME (trm, trm', vars) => let val (kv, kb) = K_skeleton "k" kT (Bound (length vars)) k in SOME (upd $ kb $ trm, trm', kv :: vars) end | NONE => let val rv = ("r", rT); val rb = Bound 0; val (kv, kb) = K_skeleton "k" kT (Bound 1) k; in SOME (upd $ kb $ rb, sel $ rb, [kv, rv]) end)) | mk_eq_terms _ = NONE; in (case mk_eq_terms (upd $ k $ r) of SOME (trm, trm', vars) => SOME (prove_unfold_defs thy [] [] (Logic.list_all (vars, Logic.mk_equals (sel $ trm, trm')))) | NONE => NONE) end else NONE | _ => NONE) end}; fun get_upd_acc_cong_thm upd acc thy ss = let val ctxt = Proof_Context.init_global thy; val prop = infer_instantiate ctxt [(("upd", 0), Thm.cterm_of ctxt upd), (("ac", 0), Thm.cterm_of ctxt acc)] updacc_cong_triv |> Thm.concl_of; in Goal.prove ctxt [] [] prop (fn {context = ctxt', ...} => simp_tac (put_simpset ss ctxt') 1 THEN REPEAT_DETERM (Iso_Tuple_Support.iso_tuple_intros_tac ctxt' 1) THEN TRY (resolve_tac ctxt' [updacc_cong_idI] 1)) end; (* upd_simproc *) (*Simplify multiple updates: (1) "N_update y (M_update g (N_update x (M_update f r))) = (N_update (y o x) (M_update (g o f) r))" (2) "r(|M:= M r|) = r" In both cases "more" updates complicate matters: for this reason we omit considering further updates if doing so would introduce both a more update and an update to a field within it.*) val upd_simproc = Simplifier.make_simproc \<^context> "record_upd" {lhss = [\<^term>\x::'a::{}\], proc = fn _ => fn ctxt => fn ct => let val thy = Proof_Context.theory_of ctxt; val t = Thm.term_of ct; (*We can use more-updators with other updators as long as none of the other updators go deeper than any more updator. 
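(To illustrate the two cases above on a hypothetical record r with fields x and y:
 by (1) the nested updates in r(|x := a, y := b, x := c|) are composed so that the
 whole term collapses to r(|y := b, x := c|); by (2) the trivial update
 r(|x := x r|) collapses to r.)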
min here is the depth of the deepest other updator, max the depth of the shallowest more updator.*) fun include_depth (dep, true) (min, max) = if min <= dep then SOME (min, if dep <= max orelse max = ~1 then dep else max) else NONE | include_depth (dep, false) (min, max) = if dep <= max orelse max = ~1 then SOME (if min <= dep then dep else min, max) else NONE; fun getupdseq (term as (upd as Const (u, _)) $ f $ tm) min max = (case get_update_details u thy of SOME (s, dep, ismore) => (case include_depth (dep, ismore) (min, max) of SOME (min', max') => let val (us, bs, _) = getupdseq tm min' max' in ((upd, s, f) :: us, bs, fastype_of term) end | NONE => ([], term, HOLogic.unitT)) | NONE => ([], term, HOLogic.unitT)) | getupdseq term _ _ = ([], term, HOLogic.unitT); val (upds, base, baseT) = getupdseq t 0 ~1; fun is_upd_noop s (Abs (n, T, Const (s', T') $ tm')) tm = if s = s' andalso null (loose_bnos tm') andalso subst_bound (HOLogic.unit, tm') = tm then (true, Abs (n, T, Const (s', T') $ Bound 1)) else (false, HOLogic.unit) | is_upd_noop _ _ _ = (false, HOLogic.unit); fun get_noop_simps (upd as Const _) (Abs (_, _, (acc as Const _) $ _)) = let val ss = get_sel_upd_defs thy; val uathm = get_upd_acc_cong_thm upd acc thy ss; in [Drule.export_without_context (uathm RS updacc_noopE), Drule.export_without_context (uathm RS updacc_noop_compE)] end; (*If f is constant then (f o g) = f. We know that K_skeleton only returns constant abstractions thus when we see an abstraction we can discard inner updates.*) fun add_upd (f as Abs _) _ = [f] | add_upd f fs = (f :: fs); (*mk_updterm returns (orig-term-skeleton, simplified-skeleton, variables, duplicate-updates, simp-flag, noop-simps) where duplicate-updates is a table used to pass upward the list of update functions which can be composed into an update above them, simp-flag indicates whether any simplification was achieved, and noop-simps are used for eliminating case (2) defined above*) fun mk_updterm ((upd as Const (u, T), s, f) :: upds) above term = let val (lhs, rhs, vars, dups, simp, noops) = mk_updterm upds (Symtab.update (u, ()) above) term; val (fvar, skelf) = K_skeleton (Long_Name.base_name s) (domain_type T) (Bound (length vars)) f; val (isnoop, skelf') = is_upd_noop s f term; val funT = domain_type T; fun mk_comp_local (f, f') = Const (\<^const_name>\Fun.comp\, funT --> funT --> funT) $ f $ f'; in if isnoop then (upd $ skelf' $ lhs, rhs, vars, Symtab.update (u, []) dups, true, if Symtab.defined noops u then noops else Symtab.update (u, get_noop_simps upd skelf') noops) else if Symtab.defined above u then (upd $ skelf $ lhs, rhs, fvar :: vars, Symtab.map_default (u, []) (add_upd skelf) dups, true, noops) else (case Symtab.lookup dups u of SOME fs => (upd $ skelf $ lhs, upd $ foldr1 mk_comp_local (add_upd skelf fs) $ rhs, fvar :: vars, dups, true, noops) | NONE => (upd $ skelf $ lhs, upd $ skelf $ rhs, fvar :: vars, dups, simp, noops)) end | mk_updterm [] _ _ = (Bound 0, Bound 0, [("r", baseT)], Symtab.empty, false, Symtab.empty) | mk_updterm us _ _ = raise TERM ("mk_updterm match", map (fn (x, _, _) => x) us); val (lhs, rhs, vars, _, simp, noops) = mk_updterm upds Symtab.empty base; val noops' = maps snd (Symtab.dest noops); in if simp then SOME (prove_unfold_defs thy noops' [simproc] (Logic.list_all (vars, Logic.mk_equals (lhs, rhs)))) else NONE end}; end; (* eq_simproc *) (*Look up the most specific record-equality. Note on efficiency: Testing equality of records boils down to the test of equality of all components. 
Therefore the complexity is: #components * complexity for single component. Especially if a record has a lot of components it may be better to split up the record first and do simplification on that (split_simp_tac). e.g. r(|lots of updates|) = x eq_simproc split_simp_tac Complexity: #components * #updates #updates *) val eq_simproc = Simplifier.make_simproc \<^context> "record_eq" {lhss = [\<^term>\r = s\], proc = fn _ => fn ctxt => fn ct => (case Thm.term_of ct of Const (\<^const_name>\HOL.eq\, Type (_, [T, _])) $ _ $ _ => (case rec_id ~1 T of "" => NONE | name => (case get_equalities (Proof_Context.theory_of ctxt) name of NONE => NONE | SOME thm => SOME (thm RS @{thm Eq_TrueI}))) | _ => NONE)}; (* split_simproc *) (*Split quantified occurrences of records, for which P holds. P can peek on the subterm starting at the quantified occurrence of the record (including the quantifier): P t = 0: do not split P t = ~1: completely split P t > 0: split up to given bound of record extensions.*) fun split_simproc P = Simplifier.make_simproc \<^context> "record_split" {lhss = [\<^term>\x::'a::{}\], proc = fn _ => fn ctxt => fn ct => (case Thm.term_of ct of Const (quantifier, Type (_, [Type (_, [T, _]), _])) $ _ => if quantifier = \<^const_name>\Pure.all\ orelse quantifier = \<^const_name>\All\ orelse quantifier = \<^const_name>\Ex\ then (case rec_id ~1 T of "" => NONE | _ => let val split = P (Thm.term_of ct) in if split <> 0 then (case get_splits (Proof_Context.theory_of ctxt) (rec_id split T) of NONE => NONE | SOME (all_thm, All_thm, Ex_thm, _) => SOME (case quantifier of \<^const_name>\Pure.all\ => all_thm | \<^const_name>\All\ => All_thm RS @{thm eq_reflection} | \<^const_name>\Ex\ => Ex_thm RS @{thm eq_reflection} | _ => raise Fail "split_simproc")) else NONE end) else NONE | _ => NONE)}; val ex_sel_eq_simproc = Simplifier.make_simproc \<^context> "ex_sel_eq" {lhss = [\<^term>\Ex t\], proc = fn _ => fn ctxt => fn ct => let val thy = Proof_Context.theory_of ctxt; val t = Thm.term_of ct; fun mkeq (lr, Teq, (sel, Tsel), x) i = if is_selector thy sel then let val x' = if not (Term.is_dependent x) then Free ("x" ^ string_of_int i, range_type Tsel) else raise TERM ("", [x]); val sel' = Const (sel, Tsel) $ Bound 0; val (l, r) = if lr then (sel', x') else (x', sel'); in Const (\<^const_name>\HOL.eq\, Teq) $ l $ r end else raise TERM ("", [Const (sel, Tsel)]); fun dest_sel_eq (Const (\<^const_name>\HOL.eq\, Teq) $ (Const (sel, Tsel) $ Bound 0) $ X) = (true, Teq, (sel, Tsel), X) | dest_sel_eq (Const (\<^const_name>\HOL.eq\, Teq) $ X $ (Const (sel, Tsel) $ Bound 0)) = (false, Teq, (sel, Tsel), X) | dest_sel_eq _ = raise TERM ("", []); in (case t of Const (\<^const_name>\Ex\, Tex) $ Abs (s, T, t) => (let val eq = mkeq (dest_sel_eq t) 0; val prop = Logic.list_all ([("r", T)], Logic.mk_equals (Const (\<^const_name>\Ex\, Tex) $ Abs (s, T, eq), \<^term>\True\)); in SOME (Goal.prove_sorry_global thy [] [] prop (fn {context = ctxt', ...} => simp_tac (put_simpset (get_simpset thy) ctxt' addsimps @{thms simp_thms} addsimprocs [split_simproc (K ~1)]) 1)) end handle TERM _ => NONE) | _ => NONE) end}; (* split_simp_tac *) (*Split (and simplify) all records in the goal for which P holds. For quantified occurrences of a record P can peek on the whole subterm (including the quantifier); for free variables P can only peek on the variable itself. 
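For instance (a typical use, not the only one), split_simp_tac ctxt [] (K ~1)
splits every record in the goal completely before simplifying, whereas with (K 0)
no splitting takes place at all. In general: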
P t = 0: do not split P t = ~1: completely split P t > 0: split up to given bound of record extensions.*) fun split_simp_tac ctxt thms P = CSUBGOAL (fn (cgoal, i) => let val thy = Proof_Context.theory_of ctxt; val goal = Thm.term_of cgoal; val frees = filter (is_recT o #2) (Term.add_frees goal []); val has_rec = exists_Const (fn (s, Type (_, [Type (_, [T, _]), _])) => (s = \<^const_name>\Pure.all\ orelse s = \<^const_name>\All\ orelse s = \<^const_name>\Ex\) andalso is_recT T | _ => false); fun mk_split_free_tac free induct_thm i = let val _ $ (_ $ Var (r, _)) = Thm.concl_of induct_thm; val thm = infer_instantiate ctxt [(r, Thm.cterm_of ctxt free)] induct_thm; in simp_tac (put_simpset HOL_basic_ss ctxt addsimps @{thms induct_atomize}) i THEN resolve_tac ctxt [thm] i THEN simp_tac (put_simpset HOL_basic_ss ctxt addsimps @{thms induct_rulify}) i end; val split_frees_tacs = frees |> map_filter (fn (x, T) => (case rec_id ~1 T of "" => NONE | _ => let val free = Free (x, T); val split = P free; in if split <> 0 then (case get_splits thy (rec_id split T) of NONE => NONE | SOME (_, _, _, induct_thm) => SOME (mk_split_free_tac free induct_thm i)) else NONE end)); val simprocs = if has_rec goal then [split_simproc P] else []; val thms' = @{thms o_apply K_record_comp} @ thms; in EVERY split_frees_tacs THEN full_simp_tac (put_simpset (get_simpset thy) ctxt addsimps thms' addsimprocs simprocs) i end); (* split_tac *) (*Split all records in the goal, which are quantified by !! or ALL.*) fun split_tac ctxt = CSUBGOAL (fn (cgoal, i) => let val goal = Thm.term_of cgoal; val has_rec = exists_Const (fn (s, Type (_, [Type (_, [T, _]), _])) => (s = \<^const_name>\Pure.all\ orelse s = \<^const_name>\All\) andalso is_recT T | _ => false); fun is_all (Const (\<^const_name>\Pure.all\, _) $ _) = ~1 | is_all (Const (\<^const_name>\All\, _) $ _) = ~1 | is_all _ = 0; in if has_rec goal then full_simp_tac (put_simpset HOL_basic_ss ctxt addsimprocs [split_simproc is_all]) i else no_tac end); val _ = Theory.setup (map_theory_simpset (fn ctxt => ctxt addsimprocs [simproc, upd_simproc, eq_simproc])); (* wrapper *) val split_name = "record_split_tac"; val split_wrapper = (split_name, fn ctxt => fn tac => split_tac ctxt ORELSE' tac); (** theory extender interface **) (* attributes *) fun case_names_fields x = Rule_Cases.case_names ["fields"] x; fun induct_type_global name = [case_names_fields, Induct.induct_type name]; fun cases_type_global name = [case_names_fields, Induct.cases_type name]; (* tactics *) (*Do case analysis / induction according to rule on last parameter of ith subgoal (or on s if there are no parameters). 
Instatiation of record variable (and predicate) in rule is calculated to avoid problems with higher order unification.*) fun try_param_tac ctxt s rule = CSUBGOAL (fn (cgoal, i) => let val g = Thm.term_of cgoal; val params = Logic.strip_params g; val concl = HOLogic.dest_Trueprop (Logic.strip_assums_concl g); val rule' = Thm.lift_rule cgoal rule; val (P, ys) = strip_comb (HOLogic.dest_Trueprop (Logic.strip_assums_concl (Thm.prop_of rule'))); (*ca indicates if rule is a case analysis or induction rule*) val (x, ca) = (case rev (drop (length params) ys) of [] => (head_of (fst (HOLogic.dest_eq (HOLogic.dest_Trueprop (hd (rev (Logic.strip_assums_hyp (hd (Thm.prems_of rule')))))))), true) | [x] => (head_of x, false)); val rule'' = infer_instantiate ctxt (map (apsnd (Thm.cterm_of ctxt)) (case rev params of [] => (case AList.lookup (op =) (Term.add_frees g []) s of NONE => error "try_param_tac: no such variable" | SOME T => [(#1 (dest_Var P), if ca then concl else lambda (Free (s, T)) concl), (#1 (dest_Var x), Free (s, T))]) | (_, T) :: _ => [(#1 (dest_Var P), fold_rev Term.abs params (if ca then concl else incr_boundvars 1 (Abs (s, T, concl)))), (#1 (dest_Var x), fold_rev Term.abs params (Bound 0))])) rule'; in compose_tac ctxt (false, rule'', Thm.nprems_of rule) i end); fun extension_definition overloaded name fields alphas zeta moreT more vars thy = let val ctxt = Proof_Context.init_global thy; val base_name = Long_Name.base_name name; val fieldTs = map snd fields; val fields_moreTs = fieldTs @ [moreT]; val alphas_zeta = alphas @ [zeta]; val ext_binding = Binding.name (suffix extN base_name); val ext_name = suffix extN name; val ext_tyco = suffix ext_typeN name; val extT = Type (ext_tyco, map TFree alphas_zeta); val ext_type = fields_moreTs ---> extT; (* the tree of new types that will back the record extension *) val mktreeV = Balanced_Tree.make Iso_Tuple_Support.mk_cons_tuple; fun mk_iso_tuple (left, right) (thy, i) = let val suff = if i = 0 then ext_typeN else inner_typeN ^ string_of_int i; val ((_, cons), thy') = thy |> Iso_Tuple_Support.add_iso_tuple_type overloaded (Binding.suffix_name suff (Binding.name base_name), alphas_zeta) (fastype_of left, fastype_of right); in (cons $ left $ right, (thy', i + 1)) end; (*trying to create a 1-element iso_tuple will fail, and is pointless anyway*) fun mk_even_iso_tuple [arg] = pair arg | mk_even_iso_tuple args = mk_iso_tuple (Iso_Tuple_Support.dest_cons_tuple (mktreeV args)); fun build_meta_tree_type i thy vars more = let val len = length vars in if len < 1 then raise TYPE ("meta_tree_type args too short", [], vars) else if len > 16 then let fun group16 [] = [] | group16 xs = take 16 xs :: group16 (drop 16 xs); val vars' = group16 vars; val (composites, (thy', i')) = fold_map mk_even_iso_tuple vars' (thy, i); in build_meta_tree_type i' thy' composites more end else let val (term, (thy', _)) = mk_iso_tuple (mktreeV vars, more) (thy, 0) in (term, thy') end end; val _ = timing_msg ctxt "record extension preparing definitions"; (* 1st stage part 1: introduce the tree of new types *) val (ext_body, typ_thy) = timeit_msg ctxt "record extension nested type def:" (fn () => build_meta_tree_type 1 thy vars more); (* prepare declarations and definitions *) (* 1st stage part 2: define the ext constant *) fun mk_ext args = list_comb (Const (ext_name, ext_type), args); val ext_spec = Logic.mk_equals (mk_ext (vars @ [more]), ext_body); val ([ext_def], defs_thy) = timeit_msg ctxt "record extension constructor def:" (fn () => typ_thy |> Sign.declare_const_global 
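(*Sketch of the nesting performed by build_meta_tree_type above: an extension with,
  say, 20 fields is first chopped into chunks of at most 16 (group16), each chunk is
  packed into its own iso_tuple, and the resulting composites are combined once more
  together with the "more" slot; within a chunk the fields are arranged as a balanced
  tree (mktreeV), so the nesting depth grows only logarithmically in the number of
  fields.*)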
((ext_binding, ext_type), NoSyn) |> snd |> Global_Theory.add_defs false [((Thm.def_binding ext_binding, ext_spec), [])]); val defs_ctxt = Proof_Context.init_global defs_thy; (* prepare propositions *) val _ = timing_msg ctxt "record extension preparing propositions"; val vars_more = vars @ [more]; val variants = map (fn Free (x, _) => x) vars_more; val ext = mk_ext vars_more; val s = Free (rN, extT); val P = Free (singleton (Name.variant_list variants) "P", extT --> HOLogic.boolT); val inject_prop = (* FIXME local x x' *) let val vars_more' = map (fn (Free (x, T)) => Free (x ^ "'", T)) vars_more in HOLogic.mk_conj (HOLogic.eq_const extT $ mk_ext vars_more $ mk_ext vars_more', \<^term>\True\) === foldr1 HOLogic.mk_conj (map HOLogic.mk_eq (vars_more ~~ vars_more') @ [\<^term>\True\]) end; val induct_prop = (fold_rev Logic.all vars_more (Trueprop (P $ ext)), Trueprop (P $ s)); val split_meta_prop = (* FIXME local P *) let val P = Free (singleton (Name.variant_list variants) "P", extT --> propT) in Logic.mk_equals (Logic.all s (P $ s), fold_rev Logic.all vars_more (P $ ext)) end; val inject = timeit_msg ctxt "record extension inject proof:" (fn () => simplify (put_simpset HOL_ss defs_ctxt) (Goal.prove_sorry_global defs_thy [] [] inject_prop (fn {context = ctxt', ...} => simp_tac (put_simpset HOL_basic_ss ctxt' addsimps [ext_def]) 1 THEN REPEAT_DETERM (resolve_tac ctxt' @{thms refl_conj_eq} 1 ORELSE Iso_Tuple_Support.iso_tuple_intros_tac ctxt' 1 ORELSE resolve_tac ctxt' [refl] 1)))); (*We need a surjection property r = (| f = f r, g = g r ... |) to prove other theorems. We haven't given names to the accessors f, g etc yet however, so we generate an ext structure with free variables as all arguments and allow the introduction tactic to operate on it as far as it can. 
We then use Drule.export_without_context to convert the free variables into unifiable variables and unify them with (roughly) the definition of the accessor.*) val surject = timeit_msg ctxt "record extension surjective proof:" (fn () => let val start = infer_instantiate defs_ctxt [(("y", 0), Thm.cterm_of defs_ctxt ext)] surject_assist_idE; val tactic1 = simp_tac (put_simpset HOL_basic_ss defs_ctxt addsimps [ext_def]) 1 THEN REPEAT_ALL_NEW (Iso_Tuple_Support.iso_tuple_intros_tac defs_ctxt) 1; val tactic2 = REPEAT (resolve_tac defs_ctxt [surject_assistI] 1 THEN resolve_tac defs_ctxt [refl] 1); val [halfway] = Seq.list_of (tactic1 start); val [surject] = Seq.list_of (tactic2 (Drule.export_without_context halfway)); in surject end); val split_meta = timeit_msg ctxt "record extension split_meta proof:" (fn () => Goal.prove_sorry_global defs_thy [] [] split_meta_prop (fn {context = ctxt', ...} => EVERY1 [resolve_tac ctxt' @{thms equal_intr_rule}, Goal.norm_hhf_tac ctxt', eresolve_tac ctxt' @{thms meta_allE}, assume_tac ctxt', resolve_tac ctxt' [@{thm prop_subst} OF [surject]], REPEAT o eresolve_tac ctxt' @{thms meta_allE}, assume_tac ctxt'])); val induct = timeit_msg ctxt "record extension induct proof:" (fn () => let val (assm, concl) = induct_prop in Goal.prove_sorry_global defs_thy [] [assm] concl (fn {context = ctxt', prems, ...} => cut_tac (split_meta RS Drule.equal_elim_rule2) 1 THEN resolve_tac ctxt' prems 2 THEN asm_simp_tac (put_simpset HOL_ss ctxt') 1) end); val ([(_, [induct']), (_, [inject']), (_, [surjective']), (_, [split_meta'])], thm_thy) = defs_thy |> Global_Theory.note_thmss "" [((Binding.name "ext_induct", []), [([induct], [])]), ((Binding.name "ext_inject", []), [([inject], [])]), ((Binding.name "ext_surjective", []), [([surject], [])]), ((Binding.name "ext_split", []), [([split_meta], [])])]; in (((ext_name, ext_type), (ext_tyco, alphas_zeta), extT, induct', inject', surjective', split_meta', ext_def), thm_thy) end; fun chunks [] [] = [] | chunks [] xs = [xs] | chunks (l :: ls) xs = take l xs :: chunks ls (drop l xs); fun chop_last [] = error "chop_last: list should not be empty" | chop_last [x] = ([], x) | chop_last (x :: xs) = let val (tl, l) = chop_last xs in (x :: tl, l) end; fun subst_last _ [] = error "subst_last: list should not be empty" | subst_last s [_] = [s] | subst_last s (x :: xs) = x :: subst_last s xs; (* mk_recordT *) (*build up the record type from the current extension tpye extT and a list of parent extensions, starting with the root of the record hierarchy*) fun mk_recordT extT = fold_rev (fn (parent, Ts) => fn T => Type (parent, subst_last T Ts)) extT; (* code generation *) fun mk_random_eq tyco vs extN Ts = let (* FIXME local i etc. *) val size = \<^term>\i::natural\; fun termifyT T = HOLogic.mk_prodT (T, \<^typ>\unit \ term\); val T = Type (tyco, map TFree vs); val Tm = termifyT T; val params = Name.invent_names Name.context "x" Ts; val lhs = HOLogic.mk_random T size; val tc = HOLogic.mk_return Tm \<^typ>\Random.seed\ (HOLogic.mk_valtermify_app extN params T); val rhs = HOLogic.mk_ST (map (fn (v, T') => ((HOLogic.mk_random T' size, \<^typ>\Random.seed\), SOME (v, termifyT T'))) params) tc \<^typ>\Random.seed\ (SOME Tm, \<^typ>\Random.seed\); in (lhs, rhs) end fun mk_full_exhaustive_eq tyco vs extN Ts = let (* FIXME local i etc. 
*) val size = \<^term>\i::natural\; fun termifyT T = HOLogic.mk_prodT (T, \<^typ>\unit \ term\); val T = Type (tyco, map TFree vs); val test_function = Free ("f", termifyT T --> \<^typ>\(bool \ term list) option\); val params = Name.invent_names Name.context "x" Ts; fun full_exhaustiveT T = (termifyT T --> \<^typ>\(bool \ Code_Evaluation.term list) option\) --> \<^typ>\natural\ --> \<^typ>\(bool \ Code_Evaluation.term list) option\; fun mk_full_exhaustive T = Const (\<^const_name>\Quickcheck_Exhaustive.full_exhaustive_class.full_exhaustive\, full_exhaustiveT T); val lhs = mk_full_exhaustive T $ test_function $ size; val tc = test_function $ (HOLogic.mk_valtermify_app extN params T); val rhs = fold_rev (fn (v, T) => fn cont => mk_full_exhaustive T $ (lambda (Free (v, termifyT T)) cont) $ size) params tc; in (lhs, rhs) end; fun instantiate_sort_record (sort, mk_eq) tyco vs extN Ts thy = let val eq = HOLogic.mk_Trueprop (HOLogic.mk_eq (mk_eq tyco vs extN Ts)); in thy |> Class.instantiation ([tyco], vs, sort) |> `(fn lthy => Syntax.check_term lthy eq) |-> (fn eq => Specification.definition NONE [] [] ((Binding.concealed Binding.empty, []), eq)) |> snd |> Class.prove_instantiation_exit (fn ctxt => Class.intro_classes_tac ctxt []) end; fun ensure_sort_record (sort, mk_eq) ext_tyco vs extN Ts thy = let val algebra = Sign.classes_of thy; val has_inst = Sorts.has_instance algebra ext_tyco sort; in if has_inst then thy else (case Quickcheck_Common.perhaps_constrain thy (map (rpair sort) Ts) vs of SOME constrain => instantiate_sort_record (sort, mk_eq) ext_tyco (map constrain vs) extN ((map o map_atyps) (fn TFree v => TFree (constrain v)) Ts) thy | NONE => thy) end; fun add_code ext_tyco vs extT ext simps inject thy = if Config.get_global thy codegen then let val eq = HOLogic.mk_Trueprop (HOLogic.mk_eq (Const (\<^const_name>\HOL.equal\, extT --> extT --> HOLogic.boolT), Const (\<^const_name>\HOL.eq\, extT --> extT --> HOLogic.boolT))); fun tac ctxt eq_def = Class.intro_classes_tac ctxt [] THEN rewrite_goals_tac ctxt [Simpdata.mk_eq eq_def] THEN ALLGOALS (resolve_tac ctxt @{thms refl}); fun mk_eq ctxt eq_def = rewrite_rule ctxt [Axclass.unoverload ctxt (Thm.symmetric (Simpdata.mk_eq eq_def))] inject; fun mk_eq_refl ctxt = @{thm equal_refl} |> Thm.instantiate (TVars.make [(\<^tvar>\?'a::equal\, Thm.ctyp_of ctxt (Logic.varifyT_global extT))], Vars.empty) |> Axclass.unoverload ctxt; val ensure_random_record = ensure_sort_record (\<^sort>\random\, mk_random_eq); val ensure_exhaustive_record = ensure_sort_record (\<^sort>\full_exhaustive\, mk_full_exhaustive_eq); fun add_code eq_def thy = let val ctxt = Proof_Context.init_global thy; in thy |> Code.declare_default_eqns_global [(mk_eq ctxt eq_def, true), (mk_eq_refl ctxt, false)] end; in thy |> Code.declare_datatype_global [ext] |> Code.declare_default_eqns_global (map (rpair true) simps) |> Class.instantiation ([ext_tyco], vs, [HOLogic.class_equal]) |> `(fn lthy => Syntax.check_term lthy eq) |-> (fn eq => Specification.definition NONE [] [] (Binding.empty_atts, eq)) |-> (fn (_, (_, eq_def)) => Class.prove_instantiation_exit_result Morphism.thm tac eq_def) |-> add_code |> ensure_random_record ext_tyco vs (fst ext) (binder_types (snd ext)) |> ensure_exhaustive_record ext_tyco vs (fst ext) (binder_types (snd ext)) end else thy; fun add_ctr_sugar ctr exhaust inject sel_thms = Ctr_Sugar.default_register_ctr_sugar_global (K true) {kind = Ctr_Sugar.Record, T = body_type (fastype_of ctr), ctrs = [ctr], casex = Term.dummy, discs = [], selss = [], exhaust = exhaust, 
nchotomy = Drule.dummy_thm, injects = [inject], distincts = [], case_thms = [], case_cong = Drule.dummy_thm, case_cong_weak = Drule.dummy_thm, case_distribs = [], split = Drule.dummy_thm, split_asm = Drule.dummy_thm, disc_defs = [], disc_thmss = [], discIs = [], disc_eq_cases = [], sel_defs = [], sel_thmss = [sel_thms], distinct_discsss = [], exhaust_discs = [], exhaust_sels = [], collapses = [], expands = [], split_sels = [], split_sel_asms = [], case_eq_ifs = []}; fun lhs_of_equation \<^Const_>\Pure.eq _ for t _\ = t | lhs_of_equation \<^Const_>\Trueprop for \<^Const_>\HOL.eq _ for t _\\ = t; fun add_spec_rule rule = let val head = head_of (lhs_of_equation (Thm.prop_of rule)) in Spec_Rules.add_global Binding.empty Spec_Rules.equational [head] [rule] end; (* definition *) fun definition overloaded (alphas, binding) parent (parents: parent_info list) raw_fields thy0 = let val ctxt0 = Proof_Context.init_global thy0; val prefix = Binding.name_of binding; val name = Sign.full_name thy0 binding; val full = Sign.full_name_path thy0 prefix; val bfields = map (fn (x, T, _) => (x, T)) raw_fields; val field_syntax = map #3 raw_fields; val parent_fields = maps #fields parents; val parent_chunks = map (length o #fields) parents; val parent_names = map fst parent_fields; val parent_types = map snd parent_fields; val parent_fields_len = length parent_fields; val parent_variants = Name.variant_list [moreN, rN, rN ^ "'", wN] (map Long_Name.base_name parent_names); val parent_vars = map2 (curry Free) parent_variants parent_types; val parent_len = length parents; val fields = map (apfst full) bfields; val names = map fst fields; val types = map snd fields; val alphas_fields = fold Term.add_tfreesT types []; val alphas_ext = inter (op =) alphas_fields alphas; val len = length fields; val variants = Name.variant_list (moreN :: rN :: (rN ^ "'") :: wN :: parent_variants) (map (Binding.name_of o fst) bfields); val vars = map2 (curry Free) variants types; val named_vars = names ~~ vars; val idxms = 0 upto len; val all_fields = parent_fields @ fields; val all_types = parent_types @ types; val all_variants = parent_variants @ variants; val all_vars = parent_vars @ vars; val all_named_vars = (parent_names ~~ parent_vars) @ named_vars; val zeta = (singleton (Name.variant_list (map #1 alphas)) "'z", \<^sort>\type\); val moreT = TFree zeta; val more = Free (moreN, moreT); val full_moreN = full (Binding.name moreN); val bfields_more = bfields @ [(Binding.name moreN, moreT)]; val fields_more = fields @ [(full_moreN, moreT)]; val named_vars_more = named_vars @ [(full_moreN, more)]; val all_vars_more = all_vars @ [more]; val all_named_vars_more = all_named_vars @ [(full_moreN, more)]; (* 1st stage: ext_thy *) val extension_name = full binding; val ((ext, (ext_tyco, vs), extT, ext_induct, ext_inject, ext_surjective, ext_split, ext_def), ext_thy) = thy0 |> Sign.qualified_path false binding |> extension_definition overloaded extension_name fields alphas_ext zeta moreT more vars; val ext_ctxt = Proof_Context.init_global ext_thy; val _ = timing_msg ext_ctxt "record preparing definitions"; val Type extension_scheme = extT; val extension_name = unsuffix ext_typeN (fst extension_scheme); val extension = let val (n, Ts) = extension_scheme in (n, subst_last HOLogic.unitT Ts) end; val extension_names = map (unsuffix ext_typeN o fst o #extension) parents @ [extension_name]; val extension_id = implode extension_names; fun rec_schemeT n = mk_recordT (map #extension (drop n parents)) extT; val rec_schemeT0 = rec_schemeT 0; fun recT n = 
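(*For a record that extends a parent, mk_recordT above rebuilds the full type by
  plugging the current extension type into the last ("more") argument of each parent
  extension in turn, so a chain of extensions is represented as
  root_ext (..., child_ext (..., 'z)); recT and rec_schemeT differ only in whether
  that innermost slot is unit or the free 'z.*)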
let val (c, Ts) = extension in mk_recordT (map #extension (drop n parents)) (Type (c, subst_last HOLogic.unitT Ts)) end; val recT0 = recT 0; fun mk_rec args n = let val (args', more) = chop_last args; fun mk_ext' ((name, T), args) more = mk_ext (name, T) (args @ [more]); fun build Ts = fold_rev mk_ext' (drop n ((extension_names ~~ Ts) ~~ chunks parent_chunks args')) more; in if more = HOLogic.unit then build (map_range recT (parent_len + 1)) else build (map_range rec_schemeT (parent_len + 1)) end; val r_rec0 = mk_rec all_vars_more 0; val r_rec_unit0 = mk_rec (all_vars @ [HOLogic.unit]) 0; fun r n = Free (rN, rec_schemeT n); val r0 = r 0; fun r_unit n = Free (rN, recT n); val r_unit0 = r_unit 0; (* print translations *) val record_ext_type_abbr_tr's = let val trname = hd extension_names; val last_ext = unsuffix ext_typeN (fst extension); in [record_ext_type_abbr_tr' name alphas zeta last_ext rec_schemeT0 trname] end; val record_ext_type_tr's = let (*avoid conflict with record_type_abbr_tr's*) val trnames = if parent_len > 0 then [extension_name] else []; in map record_ext_type_tr' trnames end; val print_translation = map field_update_tr' (full_moreN :: names) @ [record_ext_tr' extension_name] @ record_ext_type_tr's @ record_ext_type_abbr_tr's; (* prepare declarations *) val sel_decls = map (mk_selC rec_schemeT0 o apfst Binding.name_of) bfields_more; val upd_decls = map (mk_updC updateN rec_schemeT0 o apfst Binding.name_of) bfields_more; val make_decl = (makeN, all_types ---> recT0); val fields_decl = (fields_selN, types ---> Type extension); val extend_decl = (extendN, recT0 --> moreT --> rec_schemeT0); val truncate_decl = (truncateN, rec_schemeT0 --> recT0); (* prepare definitions *) val ext_defs = ext_def :: map #ext_def parents; (*Theorems from the iso_tuple intros. By unfolding ext_defs from r_rec0 we create a tree of constructor calls (many of them Pair, but others as well). The introduction rules for update_accessor_eq_assist can unify two different ways on these constructors. 
If we take the complete result sequence of running a the introduction tactic, we get one theorem for each upd/acc pair, from which we can derive the bodies of our selector and updator and their convs.*) val (accessor_thms, updator_thms, upd_acc_cong_assists) = timeit_msg ext_ctxt "record getting tree access/updates:" (fn () => let val r_rec0_Vars = let (*pick variable indices of 1 to avoid possible variable collisions with existing variables in updacc_eq_triv*) fun to_Var (Free (c, T)) = Var ((c, 1), T); in mk_rec (map to_Var all_vars_more) 0 end; val init_thm = infer_instantiate ext_ctxt [(("v", 0), Thm.cterm_of ext_ctxt r_rec0), (("v'", 0), Thm.cterm_of ext_ctxt r_rec0_Vars)] updacc_eq_triv; val terminal = resolve_tac ext_ctxt [updacc_eq_idI] 1 THEN resolve_tac ext_ctxt [refl] 1; val tactic = simp_tac (put_simpset HOL_basic_ss ext_ctxt addsimps ext_defs) 1 THEN REPEAT (Iso_Tuple_Support.iso_tuple_intros_tac ext_ctxt 1 ORELSE terminal); val updaccs = Seq.list_of (tactic init_thm); in (updaccs RL [updacc_accessor_eqE], updaccs RL [updacc_updator_eqE], updaccs RL [updacc_cong_from_eq]) end); fun lastN xs = drop parent_fields_len xs; (*selectors*) fun mk_sel_spec ((c, T), thm) = let val (acc $ arg, _) = HOLogic.dest_eq (HOLogic.dest_Trueprop (Envir.beta_eta_contract (Thm.concl_of thm))); val _ = if arg aconv r_rec0 then () else raise TERM ("mk_sel_spec: different arg", [arg]); in Const (mk_selC rec_schemeT0 (c, T)) :== acc end; val sel_specs = map mk_sel_spec (fields_more ~~ lastN accessor_thms); (*updates*) fun mk_upd_spec ((c, T), thm) = let val (upd $ _ $ arg, _) = HOLogic.dest_eq (HOLogic.dest_Trueprop (Envir.beta_eta_contract (Thm.concl_of thm))); val _ = if arg aconv r_rec0 then () else raise TERM ("mk_sel_spec: different arg", [arg]); in Const (mk_updC updateN rec_schemeT0 (c, T)) :== upd end; val upd_specs = map mk_upd_spec (fields_more ~~ lastN updator_thms); (*derived operations*) val make_spec = list_comb (Const (full (Binding.name makeN), all_types ---> recT0), all_vars) :== mk_rec (all_vars @ [HOLogic.unit]) 0; val fields_spec = list_comb (Const (full (Binding.name fields_selN), types ---> Type extension), vars) :== mk_rec (all_vars @ [HOLogic.unit]) parent_len; val extend_spec = Const (full (Binding.name extendN), recT0 --> moreT --> rec_schemeT0) $ r_unit0 $ more :== mk_rec ((map (mk_sel r_unit0) all_fields) @ [more]) 0; val truncate_spec = Const (full (Binding.name truncateN), rec_schemeT0 --> recT0) $ r0 :== mk_rec ((map (mk_sel r0) all_fields) @ [HOLogic.unit]) 0; (* 2st stage: defs_thy *) val (((sel_defs, upd_defs), derived_defs), defs_thy) = timeit_msg ext_ctxt "record trfuns/tyabbrs/selectors/updates/make/fields/extend/truncate defs:" (fn () => ext_thy |> Sign.print_translation print_translation |> Sign.restore_naming thy0 |> Typedecl.abbrev_global (binding, map #1 alphas, NoSyn) recT0 |> snd |> Typedecl.abbrev_global (Binding.suffix_name schemeN binding, map #1 (alphas @ [zeta]), NoSyn) rec_schemeT0 |> snd |> Sign.qualified_path false binding |> fold (fn ((x, T), mx) => snd o Sign.declare_const_global ((Binding.name x, T), mx)) (sel_decls ~~ (field_syntax @ [NoSyn])) |> fold (fn (x, T) => snd o Sign.declare_const_global ((Binding.name x, T), NoSyn)) (upd_decls @ [make_decl, fields_decl, extend_decl, truncate_decl]) |> (Global_Theory.add_defs false o map (Thm.no_attributes o apfst (Binding.concealed o Binding.name))) sel_specs ||>> (Global_Theory.add_defs false o map (Thm.no_attributes o apfst (Binding.concealed o Binding.name))) upd_specs ||>> (Global_Theory.add_defs 
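(*For a hypothetical record point with fields x and y, the declarations assembled
  above amount to: the type abbreviations point and 'z point_scheme, selectors x, y
  (plus the implicit more selector) with updates x_update, y_update, and the derived
  operations point.make, point.fields, point.extend and point.truncate; their
  defining equations are the specs passed to the surrounding Global_Theory.add_defs
  calls.*)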
false o map (Thm.no_attributes o apfst (Binding.concealed o Binding.name))) [make_spec, fields_spec, extend_spec, truncate_spec]); val defs_ctxt = Proof_Context.init_global defs_thy; (* prepare propositions *) val _ = timing_msg defs_ctxt "record preparing propositions"; val P = Free (singleton (Name.variant_list all_variants) "P", rec_schemeT0 --> HOLogic.boolT); val C = Free (singleton (Name.variant_list all_variants) "C", HOLogic.boolT); val P_unit = Free (singleton (Name.variant_list all_variants) "P", recT0 --> HOLogic.boolT); (*selectors*) val sel_conv_props = map (fn (c, x as Free (_, T)) => mk_sel r_rec0 (c, T) === x) named_vars_more; (*updates*) fun mk_upd_prop i (c, T) = let val x' = Free (singleton (Name.variant_list all_variants) (Long_Name.base_name c ^ "'"), T --> T); val n = parent_fields_len + i; val args' = nth_map n (K (x' $ nth all_vars_more n)) all_vars_more; in mk_upd updateN c x' r_rec0 === mk_rec args' 0 end; val upd_conv_props = map2 mk_upd_prop idxms fields_more; (*induct*) val induct_scheme_prop = fold_rev Logic.all all_vars_more (Trueprop (P $ r_rec0)) ==> Trueprop (P $ r0); val induct_prop = (fold_rev Logic.all all_vars (Trueprop (P_unit $ r_rec_unit0)), Trueprop (P_unit $ r_unit0)); (*surjective*) val surjective_prop = let val args = map (fn (c, Free (_, T)) => mk_sel r0 (c, T)) all_named_vars_more in r0 === mk_rec args 0 end; (*cases*) val cases_scheme_prop = (fold_rev Logic.all all_vars_more ((r0 === r_rec0) ==> Trueprop C), Trueprop C); val cases_prop = fold_rev Logic.all all_vars ((r_unit0 === r_rec_unit0) ==> Trueprop C) ==> Trueprop C; (*split*) val split_meta_prop = let val P = Free (singleton (Name.variant_list all_variants) "P", rec_schemeT0 --> propT); in Logic.mk_equals (Logic.all r0 (P $ r0), fold_rev Logic.all all_vars_more (P $ r_rec0)) end; val split_object_prop = let val ALL = fold_rev (fn (v, T) => fn t => HOLogic.mk_all (v, T, t)) in ALL [dest_Free r0] (P $ r0) === ALL (map dest_Free all_vars_more) (P $ r_rec0) end; val split_ex_prop = let val EX = fold_rev (fn (v, T) => fn t => HOLogic.mk_exists (v, T, t)) in EX [dest_Free r0] (P $ r0) === EX (map dest_Free all_vars_more) (P $ r_rec0) end; (*equality*) val equality_prop = let val s' = Free (rN ^ "'", rec_schemeT0); fun mk_sel_eq (c, Free (_, T)) = mk_sel r0 (c, T) === mk_sel s' (c, T); val seleqs = map mk_sel_eq all_named_vars_more; in Logic.all r0 (Logic.all s' (Logic.list_implies (seleqs, r0 === s'))) end; (* 3rd stage: thms_thy *) val record_ss = get_simpset defs_thy; - fun sel_upd_ctxt ctxt' = - put_simpset record_ss ctxt' - addsimps (sel_defs @ accessor_thms @ upd_defs @ updator_thms); + val sel_upd_ss = + simpset_of + (put_simpset record_ss defs_ctxt + addsimps (sel_defs @ accessor_thms @ upd_defs @ updator_thms)); val (sel_convs, upd_convs) = timeit_msg defs_ctxt "record sel_convs/upd_convs proof:" (fn () => grouped 10 Par_List.map (fn prop => Goal.prove_sorry_global defs_thy [] [] prop (fn {context = ctxt', ...} => - ALLGOALS (asm_full_simp_tac (sel_upd_ctxt ctxt')))) + ALLGOALS (asm_full_simp_tac (put_simpset sel_upd_ss ctxt')))) (sel_conv_props @ upd_conv_props)) |> chop (length sel_conv_props); val (fold_congs, unfold_congs) = timeit_msg defs_ctxt "record upd fold/unfold congs:" (fn () => let val symdefs = map Thm.symmetric (sel_defs @ upd_defs); val fold_ctxt = put_simpset HOL_basic_ss defs_ctxt addsimps symdefs; val ua_congs = map (Drule.export_without_context o simplify fold_ctxt) upd_acc_cong_assists; in (ua_congs RL [updacc_foldE], ua_congs RL [updacc_unfoldE]) end); val 
parent_induct = Option.map #induct_scheme (try List.last parents); val induct_scheme = timeit_msg defs_ctxt "record induct_scheme proof:" (fn () => Goal.prove_sorry_global defs_thy [] [] induct_scheme_prop (fn {context = ctxt', ...} => EVERY [case parent_induct of NONE => all_tac | SOME ind => try_param_tac ctxt' rN ind 1, try_param_tac ctxt' rN ext_induct 1, asm_simp_tac (put_simpset HOL_basic_ss ctxt') 1])); val induct = timeit_msg defs_ctxt "record induct proof:" (fn () => Goal.prove_sorry_global defs_thy [] [#1 induct_prop] (#2 induct_prop) (fn {context = ctxt', prems, ...} => try_param_tac ctxt' rN induct_scheme 1 THEN try_param_tac ctxt' "more" @{thm unit.induct} 1 THEN resolve_tac ctxt' prems 1)); val surjective = timeit_msg defs_ctxt "record surjective proof:" (fn () => Goal.prove_sorry_global defs_thy [] [] surjective_prop (fn {context = ctxt', ...} => EVERY [resolve_tac ctxt' [surject_assist_idE] 1, simp_tac (put_simpset HOL_basic_ss ctxt' addsimps ext_defs) 1, REPEAT (Iso_Tuple_Support.iso_tuple_intros_tac ctxt' 1 ORELSE (resolve_tac ctxt' [surject_assistI] 1 THEN simp_tac (put_simpset (get_sel_upd_defs defs_thy) ctxt' addsimps (sel_defs @ @{thms o_assoc id_apply id_o o_id})) 1))])); val cases_scheme = timeit_msg defs_ctxt "record cases_scheme proof:" (fn () => Goal.prove_sorry_global defs_thy [] [#1 cases_scheme_prop] (#2 cases_scheme_prop) (fn {context = ctxt', prems, ...} => resolve_tac ctxt' prems 1 THEN resolve_tac ctxt' [surjective] 1)); val cases = timeit_msg defs_ctxt "record cases proof:" (fn () => Goal.prove_sorry_global defs_thy [] [] cases_prop (fn {context = ctxt', ...} => try_param_tac ctxt' rN cases_scheme 1 THEN ALLGOALS (asm_full_simp_tac (put_simpset HOL_basic_ss ctxt' addsimps @{thms unit_all_eq1})))); val split_meta = timeit_msg defs_ctxt "record split_meta proof:" (fn () => Goal.prove_sorry_global defs_thy [] [] split_meta_prop (fn {context = ctxt', ...} => EVERY1 [resolve_tac ctxt' @{thms equal_intr_rule}, Goal.norm_hhf_tac ctxt', eresolve_tac ctxt' @{thms meta_allE}, assume_tac ctxt', resolve_tac ctxt' [@{thm prop_subst} OF [surjective]], REPEAT o eresolve_tac ctxt' @{thms meta_allE}, assume_tac ctxt'])); val split_object = timeit_msg defs_ctxt "record split_object proof:" (fn () => Goal.prove_sorry_global defs_thy [] [] split_object_prop (fn {context = ctxt', ...} => resolve_tac ctxt' [@{lemma "Trueprop A \ Trueprop B \ A = B" by (rule iffI) unfold}] 1 THEN rewrite_goals_tac ctxt' @{thms atomize_all [symmetric]} THEN resolve_tac ctxt' [split_meta] 1)); val split_ex = timeit_msg defs_ctxt "record split_ex proof:" (fn () => Goal.prove_sorry_global defs_thy [] [] split_ex_prop (fn {context = ctxt', ...} => simp_tac (put_simpset HOL_basic_ss ctxt' addsimps (@{lemma "\x. P x \ \ (\x. 
\ P x)" by simp} :: @{thms not_not Not_eq_iff})) 1 THEN resolve_tac ctxt' [split_object] 1)); val equality = timeit_msg defs_ctxt "record equality proof:" (fn () => Goal.prove_sorry_global defs_thy [] [] equality_prop (fn {context = ctxt', ...} => asm_full_simp_tac (put_simpset record_ss ctxt' addsimps (split_meta :: sel_convs)) 1)); val ([(_, sel_convs'), (_, upd_convs'), (_, sel_defs'), (_, upd_defs'), (_, fold_congs'), (_, unfold_congs'), (_, splits' as [split_meta', split_object', split_ex']), (_, derived_defs'), (_, [surjective']), (_, [equality']), (_, [induct_scheme']), (_, [induct']), (_, [cases_scheme']), (_, [cases'])], thms_thy) = defs_thy |> Code.declare_default_eqns_global (map (rpair true) derived_defs) |> Global_Theory.note_thmss "" [((Binding.name "select_convs", []), [(sel_convs, [])]), ((Binding.name "update_convs", []), [(upd_convs, [])]), ((Binding.name "select_defs", []), [(sel_defs, [])]), ((Binding.name "update_defs", []), [(upd_defs, [])]), ((Binding.name "fold_congs", []), [(fold_congs, [])]), ((Binding.name "unfold_congs", []), [(unfold_congs, [])]), ((Binding.name "splits", []), [([split_meta, split_object, split_ex], [])]), ((Binding.name "defs", []), [(derived_defs, [])]), ((Binding.name "surjective", []), [([surjective], [])]), ((Binding.name "equality", []), [([equality], [])]), ((Binding.name "induct_scheme", induct_type_global (suffix schemeN name)), [([induct_scheme], [])]), ((Binding.name "induct", induct_type_global name), [([induct], [])]), ((Binding.name "cases_scheme", cases_type_global (suffix schemeN name)), [([cases_scheme], [])]), ((Binding.name "cases", cases_type_global name), [([cases], [])])]; val sel_upd_simps = sel_convs' @ upd_convs'; val sel_upd_defs = sel_defs' @ upd_defs'; val depth = parent_len + 1; val ([(_, simps'), (_, iffs')], thms_thy') = thms_thy |> Global_Theory.note_thmss "" [((Binding.name "simps", [Simplifier.simp_add]), [(sel_upd_simps, [])]), ((Binding.name "iffs", [iff_add]), [([ext_inject], [])])]; val info = make_info alphas parent fields extension ext_induct ext_inject ext_surjective ext_split ext_def sel_convs' upd_convs' sel_defs' upd_defs' fold_congs' unfold_congs' splits' derived_defs' surjective' equality' induct_scheme' induct' cases_scheme' cases' simps' iffs'; val final_thy = thms_thy' |> put_record name info |> put_sel_upd names full_moreN depth sel_upd_simps sel_upd_defs |> add_equalities extension_id equality' |> add_extinjects ext_inject |> add_extsplit extension_name ext_split |> add_splits extension_id (split_meta', split_object', split_ex', induct_scheme') |> add_extfields extension_name (fields @ [(full_moreN, moreT)]) |> add_fieldext (extension_name, snd extension) names |> add_code ext_tyco vs extT ext simps' ext_inject |> add_ctr_sugar (Const ext) cases_scheme' ext_inject sel_convs' |> fold add_spec_rule (sel_convs' @ upd_convs' @ derived_defs') |> Sign.restore_naming thy0; in final_thy end; (* add_record *) local fun read_parent NONE ctxt = (NONE, ctxt) | read_parent (SOME raw_T) ctxt = (case Proof_Context.read_typ_abbrev ctxt raw_T of Type (name, Ts) => (SOME (Ts, name), fold Variable.declare_typ Ts ctxt) | T => error ("Bad parent record specification: " ^ Syntax.string_of_typ ctxt T)); fun read_fields raw_fields ctxt = let val Ts = Syntax.read_typs ctxt (map (fn (_, raw_T, _) => raw_T) raw_fields); val fields = map2 (fn (x, _, mx) => fn T => (x, T, mx)) raw_fields Ts; val ctxt' = fold Variable.declare_typ Ts ctxt; in (fields, ctxt') end; in fun add_record overloaded (params, binding) raw_parent 
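(*For a hypothetical record point, the collections noted above become facts such as
  point.select_convs, point.update_convs, point.defs, point.surjective,
  point.equality, point.induct, point.cases, point.splits, point.simps and
  point.iffs, the last two being declared as default simp rules and iff rules
  respectively.*)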
raw_fields thy = let val ctxt = Proof_Context.init_global thy; fun cert_typ T = Type.no_tvars (Proof_Context.cert_typ ctxt T) handle TYPE (msg, _, _) => error msg; (* specification *) val parent = Option.map (apfst (map cert_typ)) raw_parent handle ERROR msg => cat_error msg ("The error(s) above occurred in parent record specification"); val parent_args = (case parent of SOME (Ts, _) => Ts | NONE => []); val parents = get_parent_info thy parent; val bfields = raw_fields |> map (fn (x, raw_T, mx) => (x, cert_typ raw_T, mx) handle ERROR msg => cat_error msg ("The error(s) above occurred in record field " ^ Binding.print x)); (* errors *) val name = Sign.full_name thy binding; val err_dup_record = if is_none (get_info thy name) then [] else ["Duplicate definition of record " ^ quote name]; val spec_frees = fold Term.add_tfreesT (parent_args @ map #2 bfields) []; val err_extra_frees = (case subtract (op =) params spec_frees of [] => [] | extras => ["Extra free type variable(s) " ^ commas (map (Syntax.string_of_typ ctxt o TFree) extras)]); val err_no_fields = if null bfields then ["No fields present"] else []; val err_dup_fields = (case duplicates Binding.eq_name (map #1 bfields) of [] => [] | dups => ["Duplicate field(s) " ^ commas (map Binding.print dups)]); val err_bad_fields = if forall (not_equal moreN o Binding.name_of o #1) bfields then [] else ["Illegal field name " ^ quote moreN]; val errs = err_dup_record @ err_extra_frees @ err_no_fields @ err_dup_fields @ err_bad_fields; val _ = if null errs then () else error (cat_lines errs); in thy |> definition overloaded (params, binding) parent parents bfields end handle ERROR msg => cat_error msg ("Failed to define record " ^ Binding.print binding); fun add_record_cmd overloaded (raw_params, binding) raw_parent raw_fields thy = let val ctxt = Proof_Context.init_global thy; val params = map (apsnd (Typedecl.read_constraint ctxt)) raw_params; val ctxt1 = fold (Variable.declare_typ o TFree) params ctxt; val (parent, ctxt2) = read_parent raw_parent ctxt1; val (fields, ctxt3) = read_fields raw_fields ctxt2; val params' = map (Proof_Context.check_tfree ctxt3) params; in thy |> add_record overloaded (params', binding) parent fields end; end; (* printing *) local fun the_parent_recT (Type (parent, [Type (_, [unit as Type (_,[])])])) = Type (parent, [unit]) | the_parent_recT (Type (extT, [T])) = Type (extT, [the_parent_recT T]) | the_parent_recT T = raise TYPE ("Not a unit record scheme with parent: ", [T], []) in fun pretty_recT ctxt typ = let val thy = Proof_Context.theory_of ctxt val (fs, (_, moreT)) = get_recT_fields thy typ val _ = if moreT = HOLogic.unitT then () else raise TYPE ("Not a unit record scheme: ", [typ], []) val parent = if length (dest_recTs typ) >= 2 then SOME (the_parent_recT typ) else NONE val pfs = case parent of SOME p => fst (get_recT_fields thy p) | NONE => [] val fs' = drop (length pfs) fs fun pretty_field (name, typ) = Pretty.block [ Syntax.pretty_term ctxt (Const (name, typ)), Pretty.brk 1, Pretty.str "::", Pretty.brk 1, Syntax.pretty_typ ctxt typ ] in Pretty.block (Library.separate (Pretty.brk 1) ([Pretty.keyword1 "record", Syntax.pretty_typ ctxt typ, Pretty.str "="] @ (case parent of SOME p => [Syntax.pretty_typ ctxt p, Pretty.str "+"] | NONE => [])) @ Pretty.fbrk :: Pretty.fbreaks (map pretty_field fs')) end end fun string_of_record ctxt s = let val T = Syntax.read_typ ctxt s in Pretty.string_of (pretty_recT ctxt T) handle TYPE _ => error ("Unknown record: " ^ Syntax.string_of_typ ctxt T) end val print_record = let fun 
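(*Concrete syntax handled by the record and print_record commands set up below
  (hypothetical names): record node = val :: nat  nxt :: nat, then
  record dnode = node + prv :: nat declares an extension with parent node, and
  print_record dnode prints the new fields together with the parent type.*)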
print_item string_of (modes, arg) = Toplevel.keep (fn state => Print_Mode.with_modes modes (fn () => Output.writeln (string_of state arg)) ()); in print_item (string_of_record o Toplevel.context_of) end (* outer syntax *) val _ = Outer_Syntax.command \<^command_keyword>\record\ "define extensible record" (Parse_Spec.overloaded -- (Parse.type_args_constrained -- Parse.binding) -- (\<^keyword>\=\ |-- Scan.option (Parse.typ --| \<^keyword>\+\) -- Scan.repeat1 Parse.const_binding) >> (fn ((overloaded, x), (y, z)) => Toplevel.theory (add_record_cmd {overloaded = overloaded} x y z))); val opt_modes = Scan.optional (\<^keyword>\(\ |-- Parse.!!! (Scan.repeat1 Parse.name --| \<^keyword>\)\)) [] val _ = Outer_Syntax.command \<^command_keyword>\print_record\ "print record definiton" (opt_modes -- Parse.typ >> print_record); end diff --git a/src/Pure/ML/ml_antiquotations.ML b/src/Pure/ML/ml_antiquotations.ML --- a/src/Pure/ML/ml_antiquotations.ML +++ b/src/Pure/ML/ml_antiquotations.ML @@ -1,648 +1,467 @@ (* Title: Pure/ML/ml_antiquotations.ML Author: Makarius Miscellaneous ML antiquotations. *) signature ML_ANTIQUOTATIONS = sig val make_judgment: Proof.context -> term -> term val dest_judgment: Proof.context -> term -> term - val make_ctyp: Proof.context -> typ -> ctyp - val make_cterm: Proof.context -> term -> cterm - type insts = ((indexname * sort) * typ) list * ((indexname * typ) * term) list - type cinsts = ((indexname * sort) * ctyp) list * ((indexname * typ) * cterm) list - val instantiate_typ: insts -> typ -> typ - val instantiate_ctyp: Position.T -> cinsts -> ctyp -> ctyp - val instantiate_term: insts -> term -> term - val instantiate_cterm: Position.T -> cinsts -> cterm -> cterm end; structure ML_Antiquotations: ML_ANTIQUOTATIONS = struct (* ML support *) val _ = Theory.setup (ML_Antiquotation.inline \<^binding>\undefined\ (Scan.succeed "(raise General.Match)") #> ML_Antiquotation.inline \<^binding>\assert\ (Scan.succeed "(fn b => if b then () else raise General.Fail \"Assertion failed\")") #> ML_Antiquotation.declaration_embedded \<^binding>\print\ (Scan.lift (Scan.optional Parse.embedded "Output.writeln")) (fn src => fn output => fn ctxt => let val struct_name = ML_Context.struct_name ctxt; val (_, pos) = Token.name_of_src src; val (a, ctxt') = ML_Context.variant "output" ctxt; val env = "val " ^ a ^ ": string -> unit =\n\ \ (" ^ output ^ ") o (fn s => s ^ Position.here (" ^ ML_Syntax.print_position pos ^ "));\n"; val body = "(fn x => (" ^ struct_name ^ "." 
^ a ^ " (" ^ ML_Pretty.make_string_fn ^ " x); x))"; in (K (env, body), ctxt') end) #> ML_Antiquotation.value \<^binding>\rat\ (Scan.lift (Scan.optional (Args.$$$ "~" >> K ~1) 1 -- Parse.nat -- Scan.optional (Args.$$$ "/" |-- Parse.nat) 1) >> (fn ((sign, a), b) => "Rat.make " ^ ML_Syntax.print_pair ML_Syntax.print_int ML_Syntax.print_int (sign * a, b))) #> ML_Antiquotation.conditional \<^binding>\if_linux\ (fn _ => ML_System.platform_is_linux) #> ML_Antiquotation.conditional \<^binding>\if_macos\ (fn _ => ML_System.platform_is_macos) #> ML_Antiquotation.conditional \<^binding>\if_windows\ (fn _ => ML_System.platform_is_windows) #> ML_Antiquotation.conditional \<^binding>\if_unix\ (fn _ => ML_System.platform_is_unix)); (* formal entities *) val _ = Theory.setup (ML_Antiquotation.value_embedded \<^binding>\system_option\ (Args.context -- Scan.lift Parse.embedded_position >> (fn (ctxt, (name, pos)) => (Completion.check_option (Options.default ()) ctxt (name, pos) |> ML_Syntax.print_string))) #> ML_Antiquotation.value_embedded \<^binding>\theory\ (Args.context -- Scan.lift Parse.embedded_position >> (fn (ctxt, (name, pos)) => (Theory.check {long = false} ctxt (name, pos); "Context.get_theory {long = false} (Proof_Context.theory_of ML_context) " ^ ML_Syntax.print_string name)) || Scan.succeed "Proof_Context.theory_of ML_context") #> ML_Antiquotation.value_embedded \<^binding>\theory_context\ (Args.context -- Scan.lift Parse.embedded_position >> (fn (ctxt, (name, pos)) => (Theory.check {long = false} ctxt (name, pos); "Proof_Context.get_global (Proof_Context.theory_of ML_context) " ^ ML_Syntax.print_string name))) #> ML_Antiquotation.inline \<^binding>\context\ (Args.context >> (fn ctxt => ML_Context.struct_name ctxt ^ ".ML_context")) #> ML_Antiquotation.inline_embedded \<^binding>\typ\ (Args.typ >> (ML_Syntax.atomic o ML_Syntax.print_typ)) #> ML_Antiquotation.inline_embedded \<^binding>\term\ (Args.term >> (ML_Syntax.atomic o ML_Syntax.print_term)) #> ML_Antiquotation.inline_embedded \<^binding>\prop\ (Args.prop >> (ML_Syntax.atomic o ML_Syntax.print_term)) #> ML_Antiquotation.value_embedded \<^binding>\ctyp\ (Args.typ >> (fn T => "Thm.ctyp_of ML_context " ^ ML_Syntax.atomic (ML_Syntax.print_typ T))) #> ML_Antiquotation.value_embedded \<^binding>\cterm\ (Args.term >> (fn t => "Thm.cterm_of ML_context " ^ ML_Syntax.atomic (ML_Syntax.print_term t))) #> ML_Antiquotation.value_embedded \<^binding>\cprop\ (Args.prop >> (fn t => "Thm.cterm_of ML_context " ^ ML_Syntax.atomic (ML_Syntax.print_term t))) #> ML_Antiquotation.inline_embedded \<^binding>\oracle_name\ (Args.context -- Scan.lift Parse.embedded_position >> (fn (ctxt, (name, pos)) => ML_Syntax.print_string (Thm.check_oracle ctxt (name, pos))))); (* schematic variables *) val schematic_input = Args.context -- Scan.lift Parse.embedded_input >> (fn (ctxt, src) => (Proof_Context.set_mode Proof_Context.mode_schematic ctxt, (Syntax.implode_input src, Input.pos_of src))); val _ = Theory.setup (ML_Antiquotation.inline_embedded \<^binding>\tvar\ (schematic_input >> (fn (ctxt, (s, pos)) => (case Syntax.read_typ ctxt s of TVar v => ML_Syntax.print_pair ML_Syntax.print_indexname ML_Syntax.print_sort v | _ => error ("Schematic type variable expected" ^ Position.here pos)))) #> ML_Antiquotation.inline_embedded \<^binding>\var\ (schematic_input >> (fn (ctxt, (s, pos)) => (case Syntax.read_term ctxt s of Var v => ML_Syntax.print_pair ML_Syntax.print_indexname ML_Syntax.print_typ v | _ => error ("Schematic variable expected" ^ Position.here pos))))); (* type 
classes *) fun class syn = Args.context -- Scan.lift Parse.embedded_inner_syntax >> (fn (ctxt, s) => Proof_Context.read_class ctxt s |> syn ? Lexicon.mark_class |> ML_Syntax.print_string); val _ = Theory.setup (ML_Antiquotation.inline_embedded \<^binding>\class\ (class false) #> ML_Antiquotation.inline_embedded \<^binding>\class_syntax\ (class true) #> ML_Antiquotation.inline_embedded \<^binding>\sort\ (Args.context -- Scan.lift Parse.embedded_inner_syntax >> (fn (ctxt, s) => ML_Syntax.atomic (ML_Syntax.print_sort (Syntax.read_sort ctxt s))))); (* type constructors *) fun type_name kind check = Args.context -- Scan.lift (Parse.token Parse.embedded) >> (fn (ctxt, tok) => let val s = Token.inner_syntax_of tok; val (_, pos) = Input.source_content (Token.input_of tok); val Type (c, _) = Proof_Context.read_type_name {proper = true, strict = false} ctxt s; val decl = Type.the_decl (Proof_Context.tsig_of ctxt) (c, pos); val res = (case try check (c, decl) of SOME res => res | NONE => error ("Not a " ^ kind ^ ": " ^ quote c ^ Position.here pos)); in ML_Syntax.print_string res end); val _ = Theory.setup (ML_Antiquotation.inline_embedded \<^binding>\type_name\ (type_name "logical type" (fn (c, Type.LogicalType _) => c)) #> ML_Antiquotation.inline_embedded \<^binding>\type_abbrev\ (type_name "type abbreviation" (fn (c, Type.Abbreviation _) => c)) #> ML_Antiquotation.inline_embedded \<^binding>\nonterminal\ (type_name "nonterminal" (fn (c, Type.Nonterminal) => c)) #> ML_Antiquotation.inline_embedded \<^binding>\type_syntax\ (type_name "type" (fn (c, _) => Lexicon.mark_type c))); (* constants *) fun const_name check = Args.context -- Scan.lift (Parse.token Parse.embedded) >> (fn (ctxt, tok) => let val s = Token.inner_syntax_of tok; val (_, pos) = Input.source_content (Token.input_of tok); val Const (c, _) = Proof_Context.read_const {proper = true, strict = false} ctxt s; val res = check (Proof_Context.consts_of ctxt, c) handle TYPE (msg, _, _) => error (msg ^ Position.here pos); in ML_Syntax.print_string res end); val _ = Theory.setup (ML_Antiquotation.inline_embedded \<^binding>\const_name\ (const_name (fn (consts, c) => (Consts.the_const consts c; c))) #> ML_Antiquotation.inline_embedded \<^binding>\const_abbrev\ (const_name (fn (consts, c) => (Consts.the_abbreviation consts c; c))) #> ML_Antiquotation.inline_embedded \<^binding>\const_syntax\ (const_name (fn (_, c) => Lexicon.mark_const c)) #> ML_Antiquotation.inline_embedded \<^binding>\syntax_const\ (Args.context -- Scan.lift Parse.embedded_position >> (fn (ctxt, arg) => ML_Syntax.print_string (Proof_Context.check_syntax_const ctxt arg))) #> ML_Antiquotation.inline_embedded \<^binding>\const\ (Args.context -- Scan.lift (Parse.position Parse.embedded_inner_syntax) -- Scan.optional (Scan.lift (Args.$$$ "(") |-- Parse.enum1' "," Args.typ --| Scan.lift (Args.$$$ ")")) [] >> (fn ((ctxt, (raw_c, pos)), Ts) => let val Const (c, _) = Proof_Context.read_const {proper = true, strict = true} ctxt raw_c; val consts = Proof_Context.consts_of ctxt; val n = length (Consts.typargs consts (c, Consts.type_scheme consts c)); val _ = length Ts <> n andalso error ("Constant requires " ^ string_of_int n ^ " type argument(s): " ^ quote c ^ enclose "(" ")" (commas (replicate n "_")) ^ Position.here pos); val const = Const (c, Consts.instance consts (c, Ts)); in ML_Syntax.atomic (ML_Syntax.print_term const) end))); (* object-logic judgment *) fun make_judgment ctxt = let val const = Object_Logic.judgment_const ctxt in fn t => Const const $ t end; fun dest_judgment ctxt = 
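(*Usage sketch for the antiquotations set up in this file, as they would appear
  inside an Isabelle/ML block (illustrative only):
    val T = @{typ "nat => bool"};
    val t = @{term "x + (1::nat)"};
    val c = @{const_name HOL.conj};
    val judge = ML_Antiquotations.make_judgment @{context};
*)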
let val is_judgment = Object_Logic.is_judgment ctxt; val drop_judgment = Object_Logic.drop_judgment ctxt; in fn t => if is_judgment t then drop_judgment t else raise TERM ("dest_judgment", [t]) end; val _ = Theory.setup (ML_Antiquotation.value \<^binding>\make_judgment\ (Scan.succeed "ML_Antiquotations.make_judgment ML_context") #> ML_Antiquotation.value \<^binding>\dest_judgment\ (Scan.succeed "ML_Antiquotations.dest_judgment ML_context")); -(* type/term instantiations *) - -fun make_ctyp ctxt T = Thm.ctyp_of ctxt T |> Thm.trim_context_ctyp; -fun make_cterm ctxt t = Thm.cterm_of ctxt t |> Thm.trim_context_cterm; - -type insts = ((indexname * sort) * typ) list * ((indexname * typ) * term) list -type cinsts = ((indexname * sort) * ctyp) list * ((indexname * typ) * cterm) list - -fun instantiate_typ (insts: insts) = - Term_Subst.instantiateT (TVars.make (#1 insts)); - -fun instantiate_ctyp pos (cinsts: cinsts) cT = - Thm.instantiate_ctyp (TVars.make (#1 cinsts)) cT - handle CTERM (msg, args) => Exn.reraise (CTERM (msg ^ Position.here pos, args)); - -fun instantiate_term (insts: insts) = - let - val instT = TVars.make (#1 insts); - val instantiateT = Term_Subst.instantiateT instT; - val inst = Vars.make ((map o apfst o apsnd) instantiateT (#2 insts)); - in Term_Subst.instantiate_beta (instT, inst) end; - -fun instantiate_cterm pos (cinsts: cinsts) ct = - let - val cinstT = TVars.make (#1 cinsts); - val instantiateT = Term_Subst.instantiateT (TVars.map (K Thm.typ_of) cinstT); - val cinst = Vars.make ((map o apfst o apsnd) instantiateT (#2 cinsts)); - in Thm.instantiate_beta_cterm (cinstT, cinst) ct end - handle CTERM (msg, args) => Exn.reraise (CTERM (msg ^ Position.here pos, args)); - - -local - -fun make_keywords ctxt = - Thy_Header.get_keywords' ctxt - |> Keyword.no_major_keywords - |> Keyword.add_major_keywords ["typ", "term", "prop", "ctyp", "cterm", "cprop"]; - -val parse_inst_name = Parse.position (Parse.type_ident >> pair true || Parse.name >> pair false); - -val parse_inst = - (parse_inst_name -- (Parse.$$$ "=" |-- Parse.!!! 
Parse.embedded_ml) || - Scan.ahead parse_inst_name -- Parse.embedded_ml) - >> (fn (((b, a), pos), ml) => (b, ((a, pos), ml))); - -val parse_insts = - Parse.and_list1 parse_inst >> (List.partition #1 #> apply2 (map #2)); - -val ml = ML_Lex.tokenize_no_range; -val ml_range = ML_Lex.tokenize_range; -fun ml_parens x = ml "(" @ x @ ml ")"; -fun ml_bracks x = ml "[" @ x @ ml "]"; -fun ml_commas xs = flat (separate (ml ", ") xs); -val ml_list = ml_bracks o ml_commas; -fun ml_pair (x, y) = ml_parens (ml_commas [x, y]); -val ml_list_pair = ml_list o ListPair.map ml_pair; - -fun get_tfree envT (a, pos) = - (case AList.lookup (op =) envT a of - SOME S => (a, S) - | NONE => error ("No occurrence of type variable " ^ quote a ^ Position.here pos)); - -fun check_free ctxt env (x, pos) = - (case AList.lookup (op =) env x of - SOME T => - (Context_Position.reports ctxt (map (pair pos) (Syntax_Phases.markup_free ctxt x)); (x, T)) - | NONE => error ("No occurrence of variable " ^ quote x ^ Position.here pos)); - -fun missing_instT envT instT = - (case filter_out (fn (a, _) => exists (fn (b, _) => a = b) instT) envT of - [] => () - | bad => error ("No instantiation for free type variable(s) " ^ commas_quote (map #1 bad))); - -fun missing_inst env inst = - (case filter_out (fn (a, _) => exists (fn (b, _) => a = b) inst) env of - [] => () - | bad => error ("No instantiation for free variable(s) " ^ commas_quote (map #1 bad))); - -fun make_instT (a, pos) T = - (case try (Term.dest_TVar o Logic.dest_type) T of - NONE => error ("Not a free type variable " ^ quote a ^ Position.here pos) - | SOME v => ml (ML_Syntax.print_pair ML_Syntax.print_indexname ML_Syntax.print_sort v)); - -fun make_inst (a, pos) t = - (case try Term.dest_Var t of - NONE => error ("Not a free variable " ^ quote a ^ Position.here pos) - | SOME v => ml (ML_Syntax.print_pair ML_Syntax.print_indexname ML_Syntax.print_typ v)); - -fun term_env t = (Term.add_tfrees t [], Term.add_frees t []); - -fun prepare_insts ctxt1 ctxt0 (instT, inst) t = - let - val (envT, env) = term_env t; - val freesT = map (Logic.mk_type o TFree o get_tfree envT) instT; - val frees = map (Free o check_free ctxt1 env) inst; - val (t' :: varsT, vars) = - Variable.export_terms ctxt1 ctxt0 (t :: freesT @ frees) - |> chop (1 + length freesT); - - val (envT', env') = term_env t'; - val _ = missing_instT (subtract (eq_fst op =) envT' envT) instT; - val _ = missing_inst (subtract (eq_fst op =) env' env) inst; - - val ml_insts = (map2 make_instT instT varsT, map2 make_inst inst vars); - in (ml_insts, t') end; - -fun prepare_ml range (kind, ml1, ml2) ml_val (ml_instT, ml_inst) ctxt = - let - val (ml_name, ctxt') = ML_Context.variant kind ctxt; - val ml_env = ml ("val " ^ ml_name ^ " = ") @ ml ml1 @ ml_parens (ml ml_val) @ ml ";\n"; - fun ml_body (ml_argsT, ml_args) = - ml_parens (ml ml2 @ - ml_pair (ml_list_pair (ml_instT, ml_argsT), ml_list_pair (ml_inst, ml_args)) @ - ml_range range (ML_Context.struct_name ctxt ^ "." 
^ ml_name)); - in ((ml_env, ml_body), ctxt') end; - -fun prepare_type range (arg, s) insts ctxt = - let - val T = Syntax.read_typ ctxt s; - val t = Logic.mk_type T; - val ctxt1 = Proof_Context.augment t ctxt; - val (ml_insts, T') = prepare_insts ctxt1 ctxt insts t ||> Logic.dest_type; - in prepare_ml range arg (ML_Syntax.print_typ T') ml_insts ctxt end; - -fun prepare_term read range (arg, (s, fixes)) insts ctxt = - let - val ctxt' = #2 (Proof_Context.add_fixes_cmd fixes ctxt); - val t = read ctxt' s; - val ctxt1 = Proof_Context.augment t ctxt'; - val (ml_insts, t') = prepare_insts ctxt1 ctxt insts t; - in prepare_ml range arg (ML_Syntax.print_term t') ml_insts ctxt end; - -val ml_here = ML_Syntax.atomic o ML_Syntax.print_position; - -fun typ_ml (kind, _: Position.T) = (kind, "", "ML_Antiquotations.instantiate_typ "); -fun term_ml (kind, _: Position.T) = (kind, "", "ML_Antiquotations.instantiate_term "); -fun ctyp_ml (kind, pos) = - (kind, "ML_Antiquotations.make_ctyp ML_context", - "ML_Antiquotations.instantiate_ctyp " ^ ml_here pos); -fun cterm_ml (kind, pos) = - (kind, "ML_Antiquotations.make_cterm ML_context", - "ML_Antiquotations.instantiate_cterm " ^ ml_here pos); - -val command_name = Parse.position o Parse.command_name; - -fun parse_body range = - (command_name "typ" >> typ_ml || command_name "ctyp" >> ctyp_ml) -- - Parse.!!! Parse.typ >> prepare_type range || - (command_name "term" >> term_ml || command_name "cterm" >> cterm_ml) - -- Parse.!!! (Parse.term -- Parse.for_fixes) >> prepare_term Syntax.read_term range || - (command_name "prop" >> term_ml || command_name "cprop" >> cterm_ml) - -- Parse.!!! (Parse.term -- Parse.for_fixes) >> prepare_term Syntax.read_prop range; - -val _ = Theory.setup - (ML_Context.add_antiquotation \<^binding>\instantiate\ true - (fn range => fn src => fn ctxt => - let - val (insts, prepare_val) = src - |> Parse.read_embedded_src ctxt (make_keywords ctxt) - ((parse_insts --| Parse.$$$ "in") -- parse_body range); - - val (((ml_env, ml_body), decls), ctxt1) = ctxt - |> prepare_val (apply2 (map #1) insts) - ||>> ML_Context.expand_antiquotes_list (op @ (apply2 (map #2) insts)); - - fun decl' ctxt' = - let val (ml_args_env, ml_args) = split_list (decls ctxt') - in (ml_env @ flat ml_args_env, ml_body (chop (length (#1 insts)) ml_args)) end; - in (decl', ctxt1) end)); - -in end; - - (* type/term constructors *) local val keywords = Keyword.add_minor_keywords ["for", "=>"] Keyword.empty_keywords; val parse_name = Parse.input Parse.name; val parse_args = Scan.repeat Parse.embedded_ml_underscore; val parse_for_args = Scan.optional (Parse.$$$ "for" |-- Parse.!!! parse_args) []; fun parse_body b = if b then Parse.$$$ "=>" |-- Parse.!!! 
Parse.embedded_input >> (ML_Lex.read_source #> single) else Scan.succeed []; fun is_dummy [Antiquote.Text tok] = ML_Lex.content_of tok = "_" | is_dummy _ = false; val ml = ML_Lex.tokenize_no_range; val ml_range = ML_Lex.tokenize_range; val ml_dummy = ml "_"; fun ml_enclose range x = ml "(" @ x @ ml_range range ")"; fun ml_parens x = ml "(" @ x @ ml ")"; fun ml_bracks x = ml "[" @ x @ ml "]"; fun ml_commas xs = flat (separate (ml ", ") xs); val ml_list = ml_bracks o ml_commas; val ml_string = ml o ML_Syntax.print_string; fun ml_pair (x, y) = ml_parens (ml_commas [x, y]); fun type_antiquotation binding {function} = ML_Context.add_antiquotation binding true (fn range => fn src => fn ctxt => let val ((s, type_args), fn_body) = src |> Parse.read_embedded_src ctxt keywords (parse_name -- parse_args -- parse_body function); val pos = Input.pos_of s; val Type (c, Ts) = Proof_Context.read_type_name {proper = true, strict = true} ctxt (Syntax.implode_input s); val n = length Ts; val _ = length type_args = n orelse error ("Type constructor " ^ quote (Proof_Context.markup_type ctxt c) ^ " takes " ^ string_of_int n ^ " argument(s)" ^ Position.here pos); val (decls1, ctxt1) = ML_Context.expand_antiquotes_list type_args ctxt; val (decls2, ctxt2) = ML_Context.expand_antiquotes_list fn_body ctxt1; fun decl' ctxt' = let val (ml_args_env, ml_args_body) = split_list (decls1 ctxt'); val (ml_fn_env, ml_fn_body) = split_list (decls2 ctxt'); val ml1 = ml_enclose range (ml "Term.Type " @ ml_pair (ml_string c, ml_list ml_args_body)); val ml2 = if function then ml_enclose range (ml_range range "fn " @ ml1 @ ml "=> " @ flat ml_fn_body @ ml "| T => " @ ml_range range "raise" @ ml " Term.TYPE (" @ ml_string ("Type_fn " ^ quote c) @ ml ", [T], [])") else ml1; in (flat (ml_args_env @ ml_fn_env), ml2) end; in (decl', ctxt2) end); fun const_antiquotation binding {pattern, function} = ML_Context.add_antiquotation binding true (fn range => fn src => fn ctxt => let val (((s, type_args), term_args), fn_body) = src |> Parse.read_embedded_src ctxt keywords (parse_name -- parse_args -- parse_for_args -- parse_body function); val Const (c, T) = Proof_Context.read_const {proper = true, strict = true} ctxt (Syntax.implode_input s); val consts = Proof_Context.consts_of ctxt; val type_paths = Consts.type_arguments consts c; val type_params = map Term.dest_TVar (Consts.typargs consts (c, T)); val n = length type_params; val m = length (Term.binder_types T); fun err msg = error ("Constant " ^ quote (Proof_Context.markup_const ctxt c) ^ msg ^ Position.here (Input.pos_of s)); val _ = length type_args <> n andalso err (" takes " ^ string_of_int n ^ " type argument(s)"); val _ = length term_args > m andalso Term.is_Type (Term.body_type T) andalso err (" cannot have more than " ^ string_of_int m ^ " argument(s)"); val (decls1, ctxt1) = ML_Context.expand_antiquotes_list type_args ctxt; val (decls2, ctxt2) = ML_Context.expand_antiquotes_list term_args ctxt1; val (decls3, ctxt3) = ML_Context.expand_antiquotes_list fn_body ctxt2; fun decl' ctxt' = let val (ml_args_env1, ml_args_body1) = split_list (decls1 ctxt'); val (ml_args_env2, ml_args_body2) = split_list (decls2 ctxt'); val (ml_fn_env, ml_fn_body) = split_list (decls3 ctxt'); val relevant = map is_dummy type_args ~~ type_paths; fun relevant_path is = not pattern orelse let val p = rev is in relevant |> exists (fn (u, q) => not u andalso is_prefix (op =) p q) end; val ml_typarg = the o AList.lookup (op =) (type_params ~~ ml_args_body1); fun ml_typ is (Type (d, Us)) = if relevant_path is then 
ml "Term.Type " @ ml_pair (ml_string d, ml_list (map_index (fn (i, U) => ml_typ (i :: is) U) Us)) else ml_dummy | ml_typ is (TVar arg) = if relevant_path is then ml_typarg arg else ml_dummy | ml_typ _ (TFree _) = raise Match; fun ml_app [] = ml "Term.Const " @ ml_pair (ml_string c, ml_typ [] T) | ml_app (u :: us) = ml "Term.$ " @ ml_pair (ml_app us, u); val ml_env = flat (ml_args_env1 @ ml_args_env2 @ ml_fn_env); val ml1 = ml_enclose range (ml_app (rev ml_args_body2)); val ml2 = if function then ml_enclose range (ml_range range "fn " @ ml1 @ ml "=> " @ flat ml_fn_body @ ml "| t => " @ ml_range range "raise" @ ml " Term.TERM (" @ ml_string ("Const_fn " ^ quote c) @ ml ", [t])") else ml1; in (ml_env, ml2) end; in (decl', ctxt3) end); val _ = Theory.setup (type_antiquotation \<^binding>\Type\ {function = false} #> type_antiquotation \<^binding>\Type_fn\ {function = true} #> const_antiquotation \<^binding>\Const\ {pattern = false, function = false} #> const_antiquotation \<^binding>\Const_\ {pattern = true, function = false} #> const_antiquotation \<^binding>\Const_fn\ {pattern = true, function = true}); in end; (* special forms *) val _ = Theory.setup (ML_Antiquotation.special_form \<^binding>\try\ "() |> Basics.try" #> ML_Antiquotation.special_form \<^binding>\can\ "() |> Basics.can"); (* basic combinators *) local val parameter = Parse.position Parse.nat >> (fn (n, pos) => if n > 1 then n else error ("Bad parameter: " ^ string_of_int n ^ Position.here pos)); fun indices n = map string_of_int (1 upto n); fun empty n = replicate_string n " []"; fun dummy n = replicate_string n " _"; fun vars x n = implode (map (fn a => " " ^ x ^ a) (indices n)); fun cons n = implode (map (fn a => " (x" ^ a ^ " :: xs" ^ a ^ ")") (indices n)); val tuple = enclose "(" ")" o commas; fun tuple_empty n = tuple (replicate n "[]"); fun tuple_vars x n = tuple (map (fn a => x ^ a) (indices n)); fun tuple_cons n = "(" ^ tuple_vars "x" n ^ " :: xs)" fun cons_tuple n = tuple (map (fn a => "x" ^ a ^ " :: xs" ^ a) (indices n)); in val _ = Theory.setup (ML_Antiquotation.value \<^binding>\map\ (Scan.lift parameter >> (fn n => "fn f =>\n\ \ let\n\ \ fun map _" ^ empty n ^ " = []\n\ \ | map f" ^ cons n ^ " = f" ^ vars "x" n ^ " :: map f" ^ vars "xs" n ^ "\n\ \ | map _" ^ dummy n ^ " = raise ListPair.UnequalLengths\n" ^ " in map f end")) #> ML_Antiquotation.value \<^binding>\fold\ (Scan.lift parameter >> (fn n => "fn f =>\n\ \ let\n\ \ fun fold _" ^ empty n ^ " a = a\n\ \ | fold f" ^ cons n ^ " a = fold f" ^ vars "xs" n ^ " (f" ^ vars "x" n ^ " a)\n\ \ | fold _" ^ dummy n ^ " _ = raise ListPair.UnequalLengths\n" ^ " in fold f end")) #> ML_Antiquotation.value \<^binding>\fold_map\ (Scan.lift parameter >> (fn n => "fn f =>\n\ \ let\n\ \ fun fold_map _" ^ empty n ^ " a = ([], a)\n\ \ | fold_map f" ^ cons n ^ " a =\n\ \ let\n\ \ val (x, a') = f" ^ vars "x" n ^ " a\n\ \ val (xs, a'') = fold_map f" ^ vars "xs" n ^ " a'\n\ \ in (x :: xs, a'') end\n\ \ | fold_map _" ^ dummy n ^ " _ = raise ListPair.UnequalLengths\n" ^ " in fold_map f end")) #> ML_Antiquotation.value \<^binding>\split_list\ (Scan.lift parameter >> (fn n => "fn list =>\n\ \ let\n\ \ fun split_list [] =" ^ tuple_empty n ^ "\n\ \ | split_list" ^ tuple_cons n ^ " =\n\ \ let val" ^ tuple_vars "xs" n ^ " = split_list xs\n\ \ in " ^ cons_tuple n ^ "end\n\ \ in split_list list end")) #> ML_Antiquotation.value \<^binding>\apply\ (Scan.lift (parameter -- Scan.option (Args.parens (Parse.position Parse.nat))) >> (fn (n, opt_index) => let val cond = (case opt_index of NONE => K true 
| SOME (index, index_pos) => if 1 <= index andalso index <= n then equal (string_of_int index) else error ("Bad index: " ^ string_of_int index ^ Position.here index_pos)); in "fn f => fn " ^ tuple_vars "x" n ^ " => " ^ tuple (map (fn a => (if cond a then "f x" else "x") ^ a) (indices n)) end))); end; (* outer syntax *) val _ = Theory.setup (ML_Antiquotation.value_embedded \<^binding>\keyword\ (Args.context -- Scan.lift (Parse.embedded_position || Parse.position (Parse.keyword_with (K true))) >> (fn (ctxt, (name, pos)) => if Keyword.is_keyword (Thy_Header.get_keywords' ctxt) name then (Context_Position.report ctxt pos (Token.keyword_markup (true, Markup.keyword2) name); "Parse.$$$ " ^ ML_Syntax.print_string name) else error ("Bad outer syntax keyword " ^ quote name ^ Position.here pos))) #> ML_Antiquotation.value_embedded \<^binding>\command_keyword\ (Args.context -- Scan.lift Parse.embedded_position >> (fn (ctxt, (name, pos)) => (case Keyword.command_markup (Thy_Header.get_keywords' ctxt) name of SOME markup => (Context_Position.reports ctxt [(pos, markup), (pos, Markup.keyword1)]; ML_Syntax.print_pair ML_Syntax.print_string ML_Syntax.print_position (name, pos)) | NONE => error ("Bad outer syntax command " ^ quote name ^ Position.here pos))))); end; diff --git a/src/Pure/ML/ml_instantiate.ML b/src/Pure/ML/ml_instantiate.ML new file mode 100644 --- /dev/null +++ b/src/Pure/ML/ml_instantiate.ML @@ -0,0 +1,194 @@ +(* Title: Pure/ML/ml_instantiate.ML + Author: Makarius + +ML antiquotation to instantiate logical entities. +*) + +signature ML_INSTANTIATE = +sig + val make_ctyp: Proof.context -> typ -> ctyp + val make_cterm: Proof.context -> term -> cterm + type insts = ((indexname * sort) * typ) list * ((indexname * typ) * term) list + type cinsts = ((indexname * sort) * ctyp) list * ((indexname * typ) * cterm) list + val instantiate_typ: insts -> typ -> typ + val instantiate_ctyp: Position.T -> cinsts -> ctyp -> ctyp + val instantiate_term: insts -> term -> term + val instantiate_cterm: Position.T -> cinsts -> cterm -> cterm +end; + +structure ML_Instantiate: ML_INSTANTIATE = +struct + +(* exported operations *) + +fun make_ctyp ctxt T = Thm.ctyp_of ctxt T |> Thm.trim_context_ctyp; +fun make_cterm ctxt t = Thm.cterm_of ctxt t |> Thm.trim_context_cterm; + +type insts = ((indexname * sort) * typ) list * ((indexname * typ) * term) list +type cinsts = ((indexname * sort) * ctyp) list * ((indexname * typ) * cterm) list + +fun instantiate_typ (insts: insts) = + Term_Subst.instantiateT (TVars.make (#1 insts)); + +fun instantiate_ctyp pos (cinsts: cinsts) cT = + Thm.instantiate_ctyp (TVars.make (#1 cinsts)) cT + handle CTERM (msg, args) => Exn.reraise (CTERM (msg ^ Position.here pos, args)); + +fun instantiate_term (insts: insts) = + let + val instT = TVars.make (#1 insts); + val instantiateT = Term_Subst.instantiateT instT; + val inst = Vars.make ((map o apfst o apsnd) instantiateT (#2 insts)); + in Term_Subst.instantiate_beta (instT, inst) end; + +fun instantiate_cterm pos (cinsts: cinsts) ct = + let + val cinstT = TVars.make (#1 cinsts); + val instantiateT = Term_Subst.instantiateT (TVars.map (K Thm.typ_of) cinstT); + val cinst = Vars.make ((map o apfst o apsnd) instantiateT (#2 cinsts)); + in Thm.instantiate_beta_cterm (cinstT, cinst) ct end + handle CTERM (msg, args) => Exn.reraise (CTERM (msg ^ Position.here pos, args)); + + +(* ML antiquotation *) + +local + +fun make_keywords ctxt = + Thy_Header.get_keywords' ctxt + |> Keyword.no_major_keywords + |> Keyword.add_major_keywords ["typ", "term", 
"prop", "ctyp", "cterm", "cprop"]; + +val parse_inst_name = Parse.position (Parse.type_ident >> pair true || Parse.name >> pair false); + +val parse_inst = + (parse_inst_name -- (Parse.$$$ "=" |-- Parse.!!! Parse.embedded_ml) || + Scan.ahead parse_inst_name -- Parse.embedded_ml) + >> (fn (((b, a), pos), ml) => (b, ((a, pos), ml))); + +val parse_insts = + Parse.and_list1 parse_inst >> (List.partition #1 #> apply2 (map #2)); + +val ml = ML_Lex.tokenize_no_range; +val ml_range = ML_Lex.tokenize_range; +fun ml_parens x = ml "(" @ x @ ml ")"; +fun ml_bracks x = ml "[" @ x @ ml "]"; +fun ml_commas xs = flat (separate (ml ", ") xs); +val ml_list = ml_bracks o ml_commas; +fun ml_pair (x, y) = ml_parens (ml_commas [x, y]); +val ml_list_pair = ml_list o ListPair.map ml_pair; + +fun get_tfree envT (a, pos) = + (case AList.lookup (op =) envT a of + SOME S => (a, S) + | NONE => error ("No occurrence of type variable " ^ quote a ^ Position.here pos)); + +fun check_free ctxt env (x, pos) = + (case AList.lookup (op =) env x of + SOME T => + (Context_Position.reports ctxt (map (pair pos) (Syntax_Phases.markup_free ctxt x)); (x, T)) + | NONE => error ("No occurrence of variable " ^ quote x ^ Position.here pos)); + +fun missing_instT envT instT = + (case filter_out (fn (a, _) => exists (fn (b, _) => a = b) instT) envT of + [] => () + | bad => error ("No instantiation for free type variable(s) " ^ commas_quote (map #1 bad))); + +fun missing_inst env inst = + (case filter_out (fn (a, _) => exists (fn (b, _) => a = b) inst) env of + [] => () + | bad => error ("No instantiation for free variable(s) " ^ commas_quote (map #1 bad))); + +fun make_instT (a, pos) T = + (case try (Term.dest_TVar o Logic.dest_type) T of + NONE => error ("Not a free type variable " ^ quote a ^ Position.here pos) + | SOME v => ml (ML_Syntax.print_pair ML_Syntax.print_indexname ML_Syntax.print_sort v)); + +fun make_inst (a, pos) t = + (case try Term.dest_Var t of + NONE => error ("Not a free variable " ^ quote a ^ Position.here pos) + | SOME v => ml (ML_Syntax.print_pair ML_Syntax.print_indexname ML_Syntax.print_typ v)); + +fun term_env t = (Term.add_tfrees t [], Term.add_frees t []); + +fun prepare_insts ctxt1 ctxt0 (instT, inst) t = + let + val (envT, env) = term_env t; + val freesT = map (Logic.mk_type o TFree o get_tfree envT) instT; + val frees = map (Free o check_free ctxt1 env) inst; + val (t' :: varsT, vars) = + Variable.export_terms ctxt1 ctxt0 (t :: freesT @ frees) + |> chop (1 + length freesT); + + val (envT', env') = term_env t'; + val _ = missing_instT (subtract (eq_fst op =) envT' envT) instT; + val _ = missing_inst (subtract (eq_fst op =) env' env) inst; + + val ml_insts = (map2 make_instT instT varsT, map2 make_inst inst vars); + in (ml_insts, t') end; + +fun prepare_ml range (kind, ml1, ml2) ml_val (ml_instT, ml_inst) ctxt = + let + val (ml_name, ctxt') = ML_Context.variant kind ctxt; + val ml_env = ml ("val " ^ ml_name ^ " = ") @ ml ml1 @ ml_parens (ml ml_val) @ ml ";\n"; + fun ml_body (ml_argsT, ml_args) = + ml_parens (ml ml2 @ + ml_pair (ml_list_pair (ml_instT, ml_argsT), ml_list_pair (ml_inst, ml_args)) @ + ml_range range (ML_Context.struct_name ctxt ^ "." 
^ ml_name)); + in ((ml_env, ml_body), ctxt') end; + +fun prepare_type range (arg, s) insts ctxt = + let + val T = Syntax.read_typ ctxt s; + val t = Logic.mk_type T; + val ctxt1 = Proof_Context.augment t ctxt; + val (ml_insts, T') = prepare_insts ctxt1 ctxt insts t ||> Logic.dest_type; + in prepare_ml range arg (ML_Syntax.print_typ T') ml_insts ctxt end; + +fun prepare_term read range (arg, (s, fixes)) insts ctxt = + let + val ctxt' = #2 (Proof_Context.add_fixes_cmd fixes ctxt); + val t = read ctxt' s; + val ctxt1 = Proof_Context.augment t ctxt'; + val (ml_insts, t') = prepare_insts ctxt1 ctxt insts t; + in prepare_ml range arg (ML_Syntax.print_term t') ml_insts ctxt end; + +val ml_here = ML_Syntax.atomic o ML_Syntax.print_position; + +fun typ_ml (kind, _: Position.T) = (kind, "", "ML_Instantiate.instantiate_typ "); +fun term_ml (kind, _: Position.T) = (kind, "", "ML_Instantiate.instantiate_term "); +fun ctyp_ml (kind, pos) = + (kind, "ML_Instantiate.make_ctyp ML_context", "ML_Instantiate.instantiate_ctyp " ^ ml_here pos); +fun cterm_ml (kind, pos) = + (kind, "ML_Instantiate.make_cterm ML_context", "ML_Instantiate.instantiate_cterm " ^ ml_here pos); + +val command_name = Parse.position o Parse.command_name; + +fun parse_body range = + (command_name "typ" >> typ_ml || command_name "ctyp" >> ctyp_ml) -- + Parse.!!! Parse.typ >> prepare_type range || + (command_name "term" >> term_ml || command_name "cterm" >> cterm_ml) + -- Parse.!!! (Parse.term -- Parse.for_fixes) >> prepare_term Syntax.read_term range || + (command_name "prop" >> term_ml || command_name "cprop" >> cterm_ml) + -- Parse.!!! (Parse.term -- Parse.for_fixes) >> prepare_term Syntax.read_prop range; + +val _ = Theory.setup + (ML_Context.add_antiquotation \<^binding>\instantiate\ true + (fn range => fn src => fn ctxt => + let + val (insts, prepare_val) = src + |> Parse.read_embedded_src ctxt (make_keywords ctxt) + ((parse_insts --| Parse.$$$ "in") -- parse_body range); + + val (((ml_env, ml_body), decls), ctxt1) = ctxt + |> prepare_val (apply2 (map #1) insts) + ||>> ML_Context.expand_antiquotes_list (op @ (apply2 (map #2) insts)); + + fun decl' ctxt' = + let val (ml_args_env, ml_args) = split_list (decls ctxt') + in (ml_env @ flat ml_args_env, ml_body (chop (length (#1 insts)) ml_args)) end; + in (decl', ctxt1) end)); + +in end; + +end; diff --git a/src/Pure/ML/ml_thms.ML b/src/Pure/ML/ml_thms.ML --- a/src/Pure/ML/ml_thms.ML +++ b/src/Pure/ML/ml_thms.ML @@ -1,142 +1,142 @@ (* Title: Pure/ML/ml_thms.ML Author: Makarius Attribute source and theorem values within ML. 
*) signature ML_THMS = sig val the_attributes: Proof.context -> int -> Token.src list val the_thmss: Proof.context -> thm list list val get_stored_thms: unit -> thm list val get_stored_thm: unit -> thm val store_thms: string * thm list -> unit val store_thm: string * thm -> unit val bind_thm: string * thm -> unit val bind_thms: string * thm list -> unit end; structure ML_Thms: ML_THMS = struct (* auxiliary data *) type thms = (string * bool) * thm list; (*name, single, value*) structure Data = Proof_Data ( type T = Token.src list Inttab.table * thms list; fun init _ = (Inttab.empty, []); ); val put_attributes = Data.map o apfst o Inttab.update; fun the_attributes ctxt name = the (Inttab.lookup (fst (Data.get ctxt)) name); val get_thmss = snd o Data.get; val the_thmss = map snd o get_thmss; val cons_thms = Data.map o apsnd o cons; (* attribute source *) val _ = Theory.setup (ML_Antiquotation.declaration \<^binding>\attributes\ Attrib.attribs (fn _ => fn srcs => fn ctxt => let val i = serial () in ctxt |> put_attributes (i, srcs) |> ML_Antiquotation.value_decl "attributes" ("ML_Thms.the_attributes ML_context " ^ string_of_int i) end)); (* fact references *) fun thm_binding kind is_single thms ctxt = let val initial = null (get_thmss ctxt); val (name, ctxt') = ML_Context.variant kind ctxt; val ctxt'' = cons_thms ((name, is_single), thms) ctxt'; val ml_body = ML_Context.struct_name ctxt ^ "." ^ name; fun decl final_ctxt = if initial then let val binds = get_thmss final_ctxt |> map (fn ((a, b), _) => (b ? enclose "[" "]") a); val ml_env = "val [" ^ commas binds ^ "] = ML_Thms.the_thmss ML_context;\n"; in (ml_env, ml_body) end else ("", ml_body); in (decl, ctxt'') end; val _ = Theory.setup (ML_Antiquotation.declaration \<^binding>\thm\ (Attrib.thm >> single) (K (thm_binding "thm" true)) #> ML_Antiquotation.declaration \<^binding>\thms\ Attrib.thms (K (thm_binding "thms" false))); (* ad-hoc goals *) val and_ = Args.$$$ "and"; val by = Parse.reserved "by"; val goal = Scan.unless (by || and_) Parse.embedded_inner_syntax; val _ = Theory.setup (ML_Antiquotation.declaration \<^binding>\lemma\ (Scan.lift (Args.mode "open" -- Parse.enum1 "and" (Scan.repeat1 goal) -- (Parse.position by -- Method.parse -- Scan.option Method.parse))) (fn _ => fn ((is_open, raw_propss), (((_, by_pos), m1), m2)) => fn ctxt => let val reports = (by_pos, Markup.keyword1 |> Markup.keyword_properties) :: maps Method.reports_of (m1 :: the_list m2); val _ = Context_Position.reports ctxt reports; val propss = burrow (map (rpair []) o Syntax.read_props ctxt) raw_propss; val prep_result = Goal.norm_result ctxt #> not is_open ? 
Thm.close_derivation \<^here>; fun after_qed res goal_ctxt = Proof_Context.put_thms false (Auto_Bind.thisN, SOME (map prep_result (Proof_Context.export goal_ctxt ctxt (flat res)))) goal_ctxt; val ctxt' = ctxt |> Proof.theorem NONE after_qed propss |> Proof.global_terminal_proof (m1, m2); val thms = Proof_Context.get_fact ctxt' (Facts.named (Proof_Context.full_name ctxt' (Binding.name Auto_Bind.thisN))); - in thm_binding "lemma" (length (flat propss) = 1) thms ctxt end)); + in thm_binding "lemma" (length thms = 1) thms ctxt end)); (* old-style theorem bindings *) structure Data = Theory_Data ( type T = thm list; val empty = []; val merge = #1; ); fun get_stored_thms () = Data.get (Context.the_global_context ()); val get_stored_thm = hd o get_stored_thms; fun ml_store get (name, ths) = let val (_, ths') = Context.>>> (Context.map_theory_result (Global_Theory.note_thms "" ((Binding.name name, []), [(ths, [])]))); val _ = Theory.setup (Data.put ths'); val _ = if name = "" then () else if not (ML_Syntax.is_identifier name) then error ("Cannot bind theorem(s) " ^ quote name ^ " as ML value") else ML_Compiler.eval (ML_Compiler.verbose true ML_Compiler.flags) Position.none (ML_Lex.tokenize ("val " ^ name ^ " = " ^ get ^ " ();")); val _ = Theory.setup (Data.put []); in () end; val store_thms = ml_store "ML_Thms.get_stored_thms"; fun store_thm (name, th) = ml_store "ML_Thms.get_stored_thm" (name, [th]); fun bind_thm (name, thm) = store_thm (name, Drule.export_without_context thm); fun bind_thms (name, thms) = store_thms (name, map Drule.export_without_context thms); end; diff --git a/src/Pure/ROOT.ML b/src/Pure/ROOT.ML --- a/src/Pure/ROOT.ML +++ b/src/Pure/ROOT.ML @@ -1,359 +1,360 @@ (* Title: Pure/ROOT.ML Author: Makarius Main entry point for the Isabelle/Pure bootstrap process. Note: When this file is open in the Prover IDE, the ML files of Isabelle/Pure can be explored interactively. This is a separate copy of Pure within Pure: it does not affect the running logic session. 
*) chapter "Isabelle/Pure bootstrap"; ML_file "ML/ml_name_space.ML"; section "Bootstrap phase 0: Poly/ML setup"; ML_file "ML/ml_init.ML"; ML_file "ML/ml_system.ML"; ML_file "General/basics.ML"; ML_file "General/symbol_explode.ML"; ML_file "Concurrent/multithreading.ML"; ML_file "Concurrent/unsynchronized.ML"; ML_file "Concurrent/synchronized.ML"; ML_file "Concurrent/counter.ML"; ML_file "ML/ml_heap.ML"; ML_file "ML/ml_print_depth0.ML"; ML_file "ML/ml_pretty.ML"; ML_file "ML/ml_compiler0.ML"; section "Bootstrap phase 1: towards ML within position context"; subsection "Library of general tools"; ML_file "library.ML"; ML_file "General/print_mode.ML"; ML_file "General/alist.ML"; ML_file "General/table.ML"; ML_file "General/random.ML"; ML_file "General/value.ML"; ML_file "General/properties.ML"; ML_file "General/output.ML"; ML_file "PIDE/markup.ML"; ML_file "General/utf8.ML"; ML_file "General/scan.ML"; ML_file "General/source.ML"; ML_file "General/symbol.ML"; ML_file "General/position.ML"; ML_file "General/symbol_pos.ML"; ML_file "General/input.ML"; ML_file "General/comment.ML"; ML_file "General/antiquote.ML"; ML_file "ML/ml_lex.ML"; ML_file "ML/ml_compiler1.ML"; section "Bootstrap phase 2: towards ML within Isar context"; subsection "Library of general tools"; ML_file "General/integer.ML"; ML_file "General/stack.ML"; ML_file "General/queue.ML"; ML_file "General/heap.ML"; ML_file "General/same.ML"; ML_file "General/ord_list.ML"; ML_file "General/balanced_tree.ML"; ML_file "General/linear_set.ML"; ML_file "General/buffer.ML"; ML_file "General/pretty.ML"; ML_file "General/rat.ML"; ML_file "PIDE/xml.ML"; ML_file "General/path.ML"; ML_file "General/url.ML"; ML_file "System/bash.ML"; ML_file "General/file.ML"; ML_file "General/long_name.ML"; ML_file "General/binding.ML"; ML_file "General/seq.ML"; ML_file "General/time.ML"; ML_file "General/timing.ML"; ML_file "General/sha1.ML"; ML_file "PIDE/yxml.ML"; ML_file "ML/ml_profiling.ML"; ML_file "PIDE/byte_message.ML"; ML_file "PIDE/protocol_message.ML"; ML_file "PIDE/document_id.ML"; ML_file "General/socket_io.ML"; ML_file "General/change_table.ML"; ML_file "General/graph.ML"; ML_file "System/options.ML"; subsection "Fundamental structures"; ML_file "name.ML"; ML_file "term.ML"; ML_file "context.ML"; ML_file "config.ML"; ML_file "context_position.ML"; ML_file "soft_type_system.ML"; subsection "Concurrency within the ML runtime"; ML_file "ML/exn_properties.ML"; ML_file_no_debug "ML/exn_debugger.ML"; ML_file "Concurrent/thread_data_virtual.ML"; ML_file "Concurrent/isabelle_thread.ML"; ML_file "Concurrent/single_assignment.ML"; ML_file "Concurrent/par_exn.ML"; ML_file "Concurrent/task_queue.ML"; ML_file "Concurrent/future.ML"; ML_file "Concurrent/event_timer.ML"; ML_file "Concurrent/timeout.ML"; ML_file "Concurrent/lazy.ML"; ML_file "Concurrent/par_list.ML"; ML_file "Concurrent/mailbox.ML"; ML_file "Concurrent/cache.ML"; ML_file "PIDE/active.ML"; ML_file "Thy/export.ML"; subsection "Inner syntax"; ML_file "Syntax/type_annotation.ML"; ML_file "Syntax/term_position.ML"; ML_file "Syntax/lexicon.ML"; ML_file "Syntax/ast.ML"; ML_file "Syntax/syntax_ext.ML"; ML_file "Syntax/parser.ML"; ML_file "Syntax/syntax_trans.ML"; ML_file "Syntax/mixfix.ML"; ML_file "Syntax/printer.ML"; ML_file "Syntax/syntax.ML"; subsection "Core of tactical proof system"; ML_file "term_ord.ML"; ML_file "term_items.ML"; ML_file "term_subst.ML"; ML_file "General/completion.ML"; ML_file "General/name_space.ML"; ML_file "sorts.ML"; ML_file "type.ML"; ML_file "logic.ML"; ML_file 
"Syntax/simple_syntax.ML"; ML_file "net.ML"; ML_file "item_net.ML"; ML_file "envir.ML"; ML_file "consts.ML"; ML_file "term_xml.ML"; ML_file "primitive_defs.ML"; ML_file "sign.ML"; ML_file "defs.ML"; ML_file "term_sharing.ML"; ML_file "pattern.ML"; ML_file "unify.ML"; ML_file "theory.ML"; ML_file "proofterm.ML"; ML_file "thm.ML"; ML_file "cterm_items.ML"; ML_file "more_pattern.ML"; ML_file "more_unify.ML"; ML_file "more_thm.ML"; ML_file "facts.ML"; ML_file "thm_name.ML"; ML_file "global_theory.ML"; ML_file "pure_thy.ML"; ML_file "drule.ML"; ML_file "morphism.ML"; ML_file "variable.ML"; ML_file "conv.ML"; ML_file "goal_display.ML"; ML_file "tactical.ML"; ML_file "search.ML"; ML_file "tactic.ML"; ML_file "raw_simplifier.ML"; ML_file "conjunction.ML"; ML_file "assumption.ML"; subsection "Isar -- Intelligible Semi-Automated Reasoning"; (*ML support and global execution*) ML_file "ML/ml_syntax.ML"; ML_file "ML/ml_env.ML"; ML_file "ML/ml_options.ML"; ML_file "ML/ml_print_depth.ML"; ML_file_no_debug "Isar/runtime.ML"; ML_file "PIDE/execution.ML"; ML_file "ML/ml_compiler.ML"; ML_file "skip_proof.ML"; ML_file "goal.ML"; (*outer syntax*) ML_file "Isar/keyword.ML"; ML_file "Isar/token.ML"; ML_file "Isar/parse.ML"; ML_file "Thy/document_source.ML"; ML_file "Thy/thy_header.ML"; ML_file "Thy/document_marker.ML"; (*proof context*) ML_file "Isar/object_logic.ML"; ML_file "Isar/rule_cases.ML"; ML_file "Isar/auto_bind.ML"; ML_file "type_infer.ML"; ML_file "Syntax/local_syntax.ML"; ML_file "Isar/proof_context.ML"; ML_file "type_infer_context.ML"; ML_file "Syntax/syntax_phases.ML"; ML_file "Isar/args.ML"; (*theory specifications*) ML_file "Isar/local_defs.ML"; ML_file "Isar/local_theory.ML"; ML_file "Isar/entity.ML"; ML_file "PIDE/command_span.ML"; ML_file "Thy/thy_element.ML"; ML_file "Thy/markdown.ML"; ML_file "Thy/latex.ML"; (*ML with context and antiquotations*) ML_file "ML/ml_context.ML"; ML_file "ML/ml_antiquotation.ML"; ML_file "ML/ml_compiler2.ML"; ML_file "ML/ml_antiquotations.ML"; section "Bootstrap phase 3: towards theory Pure and final ML toplevel setup"; (*basic proof engine*) ML_file "par_tactical.ML"; ML_file "context_tactic.ML"; ML_file "Isar/proof_display.ML"; ML_file "Isar/attrib.ML"; ML_file "Isar/context_rules.ML"; ML_file "Isar/method.ML"; ML_file "Isar/proof.ML"; ML_file "Isar/element.ML"; ML_file "Isar/obtain.ML"; ML_file "Isar/subgoal.ML"; ML_file "Isar/calculation.ML"; (*local theories and targets*) ML_file "Isar/locale.ML"; ML_file "Isar/generic_target.ML"; ML_file "Isar/bundle.ML"; ML_file "Isar/overloading.ML"; ML_file "axclass.ML"; ML_file "Isar/class.ML"; ML_file "Isar/named_target.ML"; ML_file "Isar/expression.ML"; ML_file "Isar/interpretation.ML"; ML_file "Isar/class_declaration.ML"; ML_file "Isar/target_context.ML"; ML_file "Isar/experiment.ML"; ML_file "simplifier.ML"; ML_file "Tools/plugin.ML"; (*executable theory content*) ML_file "Isar/code.ML"; (*specifications*) ML_file "Isar/spec_rules.ML"; ML_file "Isar/specification.ML"; ML_file "Isar/parse_spec.ML"; ML_file "Isar/typedecl.ML"; (*toplevel transactions*) ML_file "Isar/proof_node.ML"; ML_file "Isar/toplevel.ML"; (*proof term operations*) ML_file "Proof/proof_rewrite_rules.ML"; ML_file "Proof/proof_syntax.ML"; ML_file "Proof/proof_checker.ML"; ML_file "Proof/extraction.ML"; (*Isabelle system*) ML_file "PIDE/protocol_command.ML"; ML_file "System/scala.ML"; ML_file "System/process_result.ML"; ML_file "System/isabelle_system.ML"; (*theory documents*) ML_file "Thy/term_style.ML"; ML_file "Isar/outer_syntax.ML"; ML_file 
"ML/ml_pid.ML"; ML_file "Thy/document_antiquotation.ML"; ML_file "Thy/document_output.ML"; ML_file "Thy/document_antiquotations.ML"; ML_file "General/graph_display.ML"; ML_file "pure_syn.ML"; ML_file "PIDE/command.ML"; ML_file "PIDE/query_operation.ML"; ML_file "PIDE/resources.ML"; ML_file "Thy/thy_info.ML"; ML_file "thm_deps.ML"; ML_file "Thy/export_theory.ML"; ML_file "Thy/sessions.ML"; ML_file "PIDE/session.ML"; ML_file "PIDE/document.ML"; (*theory and proof operations*) ML_file "Isar/isar_cmd.ML"; subsection "Isabelle/Isar system"; ML_file "System/command_line.ML"; ML_file "System/message_channel.ML"; ML_file "System/isabelle_process.ML"; ML_file "System/scala_compiler.ML"; ML_file "System/isabelle_tool.ML"; ML_file "Thy/bibtex.ML"; ML_file "PIDE/protocol.ML"; ML_file "General/output_primitives_virtual.ML"; subsection "Miscellaneous tools and packages for Pure Isabelle"; ML_file "ML/ml_pp.ML"; ML_file "ML/ml_thms.ML"; +ML_file "ML/ml_instantiate.ML"; ML_file "ML/ml_file.ML"; ML_file "Tools/build.ML"; ML_file "Tools/named_thms.ML"; ML_file "Tools/print_operation.ML"; ML_file "Tools/rail.ML"; ML_file "Tools/rule_insts.ML"; ML_file "Tools/thy_deps.ML"; ML_file "Tools/class_deps.ML"; ML_file "Tools/find_theorems.ML"; ML_file "Tools/find_consts.ML"; ML_file "Tools/simplifier_trace.ML"; ML_file_no_debug "Tools/debugger.ML"; ML_file "Tools/named_theorems.ML"; ML_file "Tools/doc.ML"; ML_file "Tools/jedit.ML"; ML_file "Tools/ghc.ML"; ML_file "Tools/generated_files.ML"