From 7bc1a651f3b2ae36a4cf37f0c9c722fe7dbb3d95 Mon Sep 17 00:00:00 2001
From: Tim Daly
Date: Mon, 30 Oct 2017 20:28:50 -0400
Subject: [PATCH] books/bookheader.tex add tab characters in credits
Goal: Axiom Maintenance

 books/bookheader.tex           |  202 +++
 changelog                      |    2 +
 patch                          | 1187 +
 src/axiom-website/patches.html |    4 +
 4 files changed, 108 insertions(+), 1287 deletions(-)
diff --git a/books/bookheader.tex b/books/bookheader.tex
index 1b5e831..2eeb37d 100644
--- a/books/bookheader.tex
+++ b/books/bookheader.tex
@@ -118,111 +118,111 @@ does not in any way imply an endorsement but represents historical
influence on Axiom development.
\begin{tabular}{lll}
-Michael Albaugh Cyril Alberga Roy Adler\\
-Christian Aistleitner Richard Anderson George Andrews\\
-Jerry Archibald S.J. Atkins Jeremy Avigad\\
-Henry Baker Martin Baker Stephen Balzac\\
-Yurij Baransky David R. Barton Thomas Baruchel\\
-Gerald Baumgartner Gilbert Baumslag Michael Becker\\
-Nelson H. F. Beebe Jay Belanger David Bindel\\
-Fred Blair Vladimir Bondarenko Mark Botch\\
-Raoul Bourquin Alexandre Bouyer Karen Braman\\
-Wolfgang Brehm Peter A. Broadbery Martin Brock\\
-Manuel Bronstein Christopher Brown Stephen Buchwald\\
-Florian Bundschuh Luanne Burns William Burge\\
-Ralph Byers Quentin Carpent Pierre Casteran\\
-Robert Cavines Bruce Char Ondrej Certik\\
-Tzu-Yi Chen Bobby Cheng Cheekai Chin\\
-David V. Chudnovsky Gregory V. Chudnovsky Mark Clements\\
-James Cloos Jia Zhao Cong Josh Cohen\\
-Christophe Conil Don Coppersmith George Corliss\\
-Robert Corless Gary Cornell Meino Cramer\\
-Karl Crary Jeremy Du Croz David Cyganski\\
-Nathaniel Daly Timothy Daly Sr. Timothy Daly Jr.\\
-James H. Davenport David Day James Demmel\\
-Didier Deshommes Michael Dewar Inderjit Dhillon\\
-Jack Dongarra Jean Della Dora Gabriel Dos Reis\\
-Claire DiCrescendo Sam Dooley Nicolas James Doye\\
-Zlatko Drmac Lionel Ducos Iain Duff\\
-Lee Duhem Martin Dunstan Brian Dupee\\
-Dominique Duval Robert Edwards Hans-Dieter Ehrich\\
-Heow Eide-Goodman Lars Erickson Mark Fahey\\
-Richard Fateman Bertfried Fauser Stuart Feldman\\
-John Fletcher Brian Ford Albrecht Fortenbacher\\
-George Frances Constantine Frangos Timothy Freeman\\
-Korrinn Fu Marc Gaetano Rudiger Gebauer\\
-Van de Geijn Kathy Gerber Patricia Gianni\\
-Gustavo Goertkin Samantha Goldrich Holger Gollan\\
-Teresa Gomez-Diaz Laureano Gonzalez-Vega Stephen Gortler\\
-Johannes Grabmeier Matt Grayson Klaus Ebbe Grue\\
-James Griesmer Vladimir Grinberg Oswald Gschnitzer\\
-Ming Gu Jocelyn Guidry Gaetan Hache\\
-Steve Hague Satoshi Hamaguchi Sven Hammarling\\
-Mike Hansen Richard Hanson Richard Harke\\
-Bill Hart Vilya Harvey Martin Hassner\\
-Arthur S. Hathaway Dan Hatton Waldek Hebisch\\
-Karl Hegbloom Ralf Hemmecke Henderson\\
-Antoine Hersen Nicholas J. Higham Hoon Hong\\
-Roger House Gernot Hueber Pietro Iglio\\
-Alejandro Jakubi Richard Jenks Bo Kagstrom\\
-William Kahan Kyriakos Kalorkoti Kai Kaminski\\
-Grant Keady Wilfrid Kendall Tony Kennedy\\
-David Kincaid Keshav Kini Ted Kosan\\
+Michael Albaugh & Cyril Alberga & Roy Adler\\
+Christian Aistleitner & Richard Anderson & George Andrews\\
+Jerry Archibald & S.J. Atkins & Jeremy Avigad\\
+Henry Baker & Martin Baker & Stephen Balzac\\
+Yurij Baransky & David R. Barton & Thomas Baruchel\\
+Gerald Baumgartner & Gilbert Baumslag & Michael Becker\\
+Nelson H. F. Beebe & Jay Belanger & David Bindel\\
+Fred Blair & Vladimir Bondarenko & Mark Botch\\
+Raoul Bourquin & Alexandre Bouyer & Karen Braman\\
+Wolfgang Brehm & Peter A. Broadbery & Martin Brock\\
+Manuel Bronstein & Christopher Brown & Stephen Buchwald\\
+Florian Bundschuh & Luanne Burns & William Burge\\
+Ralph Byers & Quentin Carpent & Pierre Casteran\\
+Robert Cavines & Bruce Char & Ondrej Certik\\
+Tzu-Yi Chen & Bobby Cheng & Cheekai Chin\\
+David V. Chudnovsky & Gregory V. Chudnovsky & Mark Clements\\
+James Cloos & Jia Zhao Cong & Josh Cohen\\
+Christophe Conil & Don Coppersmith & George Corliss\\
+Robert Corless & Gary Cornell & Meino Cramer\\
+Karl Crary & Jeremy Du Croz & David Cyganski\\
+Nathaniel Daly & Timothy Daly Sr. & Timothy Daly Jr.\\
+James H. Davenport & David Day & James Demmel\\
+Didier Deshommes & Michael Dewar & Inderjit Dhillon\\
+Jack Dongarra & Jean Della Dora & Gabriel Dos Reis\\
+Claire DiCrescendo & Sam Dooley & Nicolas James Doye\\
+Zlatko Drmac & Lionel Ducos & Iain Duff\\
+Lee Duhem & Martin Dunstan & Brian Dupee\\
+Dominique Duval & Robert Edwards & Hans-Dieter Ehrich\\
+Heow Eide-Goodman & Lars Erickson & Mark Fahey\\
+Richard Fateman & Bertfried Fauser & Stuart Feldman\\
+John Fletcher & Brian Ford & Albrecht Fortenbacher\\
+George Frances & Constantine Frangos & Timothy Freeman\\
+Korrinn Fu & Marc Gaetano & Rudiger Gebauer\\
+Van de Geijn & Kathy Gerber & Patricia Gianni\\
+Gustavo Goertkin & Samantha Goldrich & Holger Gollan\\
+Teresa Gomez-Diaz & Laureano Gonzalez-Vega & Stephen Gortler\\
+Johannes Grabmeier & Matt Grayson & Klaus Ebbe Grue\\
+James Griesmer & Vladimir Grinberg & Oswald Gschnitzer\\
+Ming Gu & Jocelyn Guidry & Gaetan Hache\\
+Steve Hague & Satoshi Hamaguchi & Sven Hammarling\\
+Mike Hansen & Richard Hanson & Richard Harke\\
+Bill Hart & Vilya Harvey & Martin Hassner\\
+Arthur S. Hathaway & Dan Hatton & Waldek Hebisch\\
+Karl Hegbloom & Ralf Hemmecke & Henderson\\
+Antoine Hersen & Nicholas J. Higham & Hoon Hong\\
+Roger House & Gernot Hueber & Pietro Iglio\\
+Alejandro Jakubi & Richard Jenks & Bo Kagstrom\\
+William Kahan & Kyriakos Kalorkoti & Kai Kaminski\\
+Grant Keady & Wilfrid Kendall & Tony Kennedy\\
+David Kincaid & Keshav Kini & Ted Kosan\\
\end{tabular}
\vfill
\newpage
\begin{tabular}{lll}
-Paul Kosinski Igor Kozachenko Fred Krogh\\
-Klaus Kusche Bernhard Kutzler Tim Lahey\\
-Larry Lambe Kaj Laurson Charles Lawson\\
-George L. Legendre Franz Lehner Frederic Lehobey\\
-Michel Levaud Howard Levy J. Lewis\\
-Ren-Cang Li Rudiger Loos Craig Lucas\\
-Michael Lucks Richard Luczak Camm Maguire\\
-Francois Maltey Osni Marques Alasdair McAndrew\\
-Bob McElrath Michael McGettrick Edi Meier\\
-Ian Meikle David Mentre Victor S. Miller\\
-Gerard Milmeister Mohammed Mobarak H. Michael Moeller\\
-Michael Monagan Marc Moreno-Maza Scott Morrison\\
-Joel Moses Mark Murray William Naylor\\
-Patrice Naudin C. Andrew Neff John Nelder\\
-Godfrey Nolan Arthur Norman Jinzhong Niu\\
-Michael O'Connor Summat Oemrawsingh Kostas Oikonomou\\
-Humberto Ortiz-Zuazaga Julian A. Padget Bill Page\\
-David Parnas Susan Pelzel Michel Petitot\\
-Didier Pinchon Ayal Pinkus Frederick H. Pitts\\
-Frank Pfenning Jose Alfredo Portes E. Quintana-Orti\\
-Gregorio Quintana-Orti Beresford Parlett A. Petitet\\
-Andre Platzer Peter Poromaas Claude Quitte\\
-Arthur C. Ralfs Norman Ramsey Anatoly Raportirenko\\
-Guilherme Reis Huan Ren Albert D. Rich\\
-Michael Richardson Jason Riedy Renaud Rioboo\\
-Jean Rivlin Nicolas Robidoux Simon Robinson\\
-Raymond Rogers Michael Rothstein Martin Rubey\\
-Jeff Rutter Philip Santas David Saunders\\
-Alfred Scheerhorn William Schelter Gerhard Schneider\\
-Martin Schoenert Marshall Schor Frithjof Schulze\\
-Fritz Schwarz Steven Segletes V. Sima\\
-Nick Simicich William Sit Elena Smirnova\\
-Jacob Nyffeler Smith Matthieu Sozeau Ken Stanley\\
-Jonathan Steinbach Fabio Stumbo Christine Sundaresan\\
-Klaus Sutner Robert Sutor Moss E. Sweedler\\
-Eugene Surowitz Yong Kiam Tan Max Tegmark\\
-T. Doug Telford James Thatcher Laurent Thery\\
-Balbir Thomas Mike Thomas Dylan Thurston\\
-Francoise Tisseur Steve Toleque Raymond Toy\\
-Barry Trager Themos T. Tsikas Gregory Vanuxem\\
-Kresimir Veselic Christof Voemel Bernhard Wall\\
-Stephen Watt Andreas Weber Jaap Weel\\
-Juergen Weiss M. Weller Mark Wegman\\
-James Wen Thorsten Werther Michael Wester\\
-R. Clint Whaley James T. Wheeler John M. Wiley\\
-Berhard Will Clifton J. Williamson Stephen Wilson\\
-Shmuel Winograd Robert Wisbauer Sandra Wityak\\
-Waldemar Wiwianka Knut Wolf Yanyang Xiao\\
-Liu Xiaojun Clifford Yapp David Yun\\
-Qian Yun Vadim Zhytnikov Richard Zippel\\
-Evelyn Zoernack Bruno Zuercher Dan Zwillinger\\
+Paul Kosinski & Igor Kozachenko & Fred Krogh\\
+Klaus Kusche & Bernhard Kutzler & Tim Lahey\\
+Larry Lambe & Kaj Laurson & Charles Lawson\\
+George L. Legendre & Franz Lehner & Frederic Lehobey\\
+Michel Levaud & Howard Levy & J. Lewis\\
+Ren-Cang Li & Rudiger Loos & Craig Lucas\\
+Michael Lucks & Richard Luczak & Camm Maguire\\
+Francois Maltey & Osni Marques & Alasdair McAndrew\\
+Bob McElrath & Michael McGettrick & Edi Meier\\
+Ian Meikle & David Mentre & Victor S. Miller\\
+Gerard Milmeister & Mohammed Mobarak & H. Michael Moeller\\
+Michael Monagan & Marc Moreno-Maza & Scott Morrison\\
+Joel Moses & Mark Murray & William Naylor\\
+Patrice Naudin & C. Andrew Neff & John Nelder\\
+Godfrey Nolan & Arthur Norman & Jinzhong Niu\\
+Michael O'Connor & Summat Oemrawsingh & Kostas Oikonomou\\
+Humberto Ortiz-Zuazaga & Julian A. Padget & Bill Page\\
+David Parnas & Susan Pelzel & Michel Petitot\\
+Didier Pinchon & Ayal Pinkus & Frederick H. Pitts\\
+Frank Pfenning & Jose Alfredo Portes & E. Quintana-Orti\\
+Gregorio Quintana-Orti & Beresford Parlett & A. Petitet\\
+Andre Platzer & Peter Poromaas & Claude Quitte\\
+Arthur C. Ralfs & Norman Ramsey & Anatoly Raportirenko\\
+Guilherme Reis & Huan Ren & Albert D. Rich\\
+Michael Richardson & Jason Riedy & Renaud Rioboo\\
+Jean Rivlin & Nicolas Robidoux & Simon Robinson\\
+Raymond Rogers & Michael Rothstein & Martin Rubey\\
+Jeff Rutter & Philip Santas & David Saunders\\
+Alfred Scheerhorn & William Schelter & Gerhard Schneider\\
+Martin Schoenert & Marshall Schor & Frithjof Schulze\\
+Fritz Schwarz & Steven Segletes & V. Sima\\
+Nick Simicich & William Sit & Elena Smirnova\\
+Jacob Nyffeler Smith & Matthieu Sozeau & Ken Stanley\\
+Jonathan Steinbach & Fabio Stumbo & Christine Sundaresan\\
+Klaus Sutner & Robert Sutor & Moss E. Sweedler\\
+Eugene Surowitz & Yong Kiam Tan & Max Tegmark\\
+T. Doug Telford & James Thatcher & Laurent Thery\\
+Balbir Thomas & Mike Thomas & Dylan Thurston\\
+Francoise Tisseur & Steve Toleque & Raymond Toy\\
+Barry Trager & Themos T. Tsikas & Gregory Vanuxem\\
+Kresimir Veselic & Christof Voemel & Bernhard Wall\\
+Stephen Watt & Andreas Weber & Jaap Weel\\
+Juergen Weiss & M. Weller & Mark Wegman\\
+James Wen & Thorsten Werther & Michael Wester\\
+R. Clint Whaley & James T. Wheeler & John M. Wiley\\
+Berhard Will & Clifton J. Williamson & Stephen Wilson\\
+Shmuel Winograd & Robert Wisbauer & Sandra Wityak\\
+Waldemar Wiwianka & Knut Wolf & Yanyang Xiao\\
+Liu Xiaojun & Clifford Yapp & David Yun\\
+Qian Yun & Vadim Zhytnikov & Richard Zippel\\
+Evelyn Zoernack & Bruno Zuercher & Dan Zwillinger\\
\end{tabular}
\newpage
diff --git a/changelog b/changelog
index df6b3eb..42df9ca 100644
--- a/changelog
+++ b/changelog
@@ -1,3 +1,5 @@
+20171030 tpd src/axiom-website/patches.html 20171030.01.tpd.patch
+20171030 tpd books/bookheader.tex add tab characters in credits
20171029 tpd src/axiom-website/patches.html 20171029.01.tpd.patch
20171029 tpd books/bookvolbib add references
20171017 tpd src/axiom-website/patches.html 20171017.01.tpd.patch
diff --git a/patch b/patch
index ccbebae..5ffb360 100644
--- a/patch
+++ b/patch
@@ -1,1186 +1,3 @@
-books/bookvolbib add References
+books/bookheader.tex add tab characters in credits
-Goal: Proving Axiom Correct

\index{Paoli, Francesco}
\begin{chunk}{axiom.bib}
@book{Paol02,
 author = "Paoli, Francesco",
 title = "Substructural Logics: A Primer",
 publisher = "Springer",
 isbn = "9789048160143",
 year = "2002",
 abstract =
 "Substructural logics are by now one of the most prominent branches of
 the research field usually labelled as "nonclassical logics"  and
 perhaps of logic tout court. Over the last few decades a vast amount
 of research papers and even some books have been devoted to this
 subject. The aim of the present book is to give a comprehensive
 account of the "state of the art" of substructural logics, focusing
 both on their proof theory (especially on sequent calculi and their
 generalizations) and on their semantics (both algebraic and relational).",
 paper = "Paol02.pdf"
}

\end{chunk}

\index{Kowalski, Robert}
\begin{chunk}{axiom.bib}
@article{Kowa79,
 author = "Kowalski, Robert",
 title = "Algorithm = Logic + Control",
 journal = "CACM",
 volume = "22",
 number = "7",
 year = "1979",
 paper = "Kowa79.pdf"
}

\end{chunk}

\index{Wadler, Philip}
\begin{chunk}{axiom.bib}
@inproceedings{Wadl89,
 author = "Wadler, Philip",
 title = "Theorems for free!",
 booktitle = "4th Intl Conf. on Functional Programming",
 pages = "347359",
 year = "1989",
 abstract =
 "From the type of a polymorphic function we can derive a theorem that
 it satisfies. Every function of the same type satisfies the same
 theorem. This provides a free source of useful theorems, courtesy of
 Reynolds' abstraction theorem for the polymorphic lambda calculus.",
 paper = "Wadl89.pdf"
}

\end{chunk}

\index{Collins, George E.}
\begin{chunk}{axiom.bib}
@inproceedings{Coll85,
 author = "Collins, George E.",
 title = "The SAC2 Computer Algebra System",
 booktitle = "Proc. Europena Conf. on Computer Algebra",
 year = "1985",
 pages = "3435",
 paper = "Coll85.pdf"
}

\end{chunk}

\index{Colmerauer, Alain}
\begin{chunk}{axiom.bib}
@article{Colm90,
 author = "Colmerauer, Alain",
 title = "An Introduction to Prolog III",
 journal = "CACM",
 volume = "33",
 number = "7",
 year = "1990",
 pages = "6990",
 abstract =
 "The Prolog III programming language extends Prolog by redefining the
 fundamental process at its heart: unification. This article presents
 the specifications of this new language and illustrates its capabilities.",
 paper = "Colm90.pdf"
}

\end{chunk}

\index{Harrison, John Robert}
\begin{chunk}{axiom.bib}
@phdthesis{Harr96,
 author = "Harrison, John Robert",
 title = "Theorem Proving with the Real Numbers",
 school = "Churchhill College",
 comment = "U. Cambrige Computer Lab Tech Report 408,
 also SpringerVerlag 1988 ISBN 35487625666",
 year = "1996",
 abstract =
 "This thesis discusses the use of the real numbers in theorem
 proving. Typically, theorem provers only support a few `discrete'
 datatypes such as the natural numbers. However the availability of the
 real numbers opens up many interesting and important application
 areas, such as the verification of floating point hardware and hybrid
 systems. It also allows the formalization of many more branches of
 classical mathematics, which is particularly relevant for attempts to
 inject more rigour into computer algebra systems.

 Our work is conducted in a version of the HOL theorem prover. We
 describe the rigorous definitional construction of the real numbers,
 using a new version of Cantor's method, and the formalization of a
 significant portion of real analysis. We also describe an advanced
 derived decision procedure for the `Tarski subset' of real algebra as
 well as some more modest but practically useful tools for automating
 explicit calculations and routine linear arithmetic reasoning.

 Finally, we consider in more detail two interesting application
 areas. We discuss the desirability of combining the rigour of theorem
 provers with the power and convenience of computer algebra systems,
 and explain a method we have used in practice to achieve this. We then
 move on to the verification of floating point hardware. After a
 careful discussion of possible correctness specifications, we report
 on two case studies, one involving a transcendental function.

 We aim to show that a theory of real numbers is useful in practice and
 interesting in theory, and that the `LCF style' of theorem proving is
 well suited to the kind of work we describe. We hope also to convince
 the reader that the kind of mathematics needed for applications is
 well within the abilities of current theorem proving technology.",
 paper = "Harr96.pdf"
}

\end{chunk}

\index{Bronstein, Manuel}
\begin{chunk}{axiom.bib}
@misc{Bron96,
 author = "Bronstein, Manuel",
 title =
 "$\sum^{IT}$  A stronglytyped embeddable computer algebra library",
 link = "\url{http://wwwsop.inria.fr/cafe/Manuel.Bronstein/publications/mb_papers.html}",
 abstract =
 "We describe the new computer algebra library $\sum^{IT}$ and its
 underlying design. The development of $\sum^{IT}$ is motivated by the
 need to provide highly efficient implementations of key algorithms for
 linear ordinary differential and ($q$)-difference equations to
 scientific programmers and to computer algebra users, regardless of
 the programming language or interactive system they use. As such,
 $\sum^{IT}$ is not a computer algebra system per se, but a library (or
 substrate) which is designed to be ``plugged'' with minimal efforts
 into different types of client applications.",
 paper = "Bron96.pdf"
}

\end{chunk}

\index{Corless, Robert M.}
\index{Jeffrey, David J.}
\begin{chunk}{axiom.bib}
@article{Corl97a,
 author = "Corless, Robert M. and Jeffrey, David J.",
 title = "The Turing Factorization of a Rectangular Matrix",
 journal = "ACM SIGSAM Bulletin",
 volume = "31",
 number = "3",
 pages = "2835",
 year = "1997",
 abstract =
 "The Turing factorization is a generalization of the standard LU
 factoring of a square matrix. Among other advantages, it allows us to
 meet demands that arise in a symbolic context. For a rectangular
 matrix A, the generalized factors are written PA = LDU R, where R is
 the row-echelon form of A. For matrices with symbolic entries, the LDU
 R factoring is superior to the standard reduction to row-echelon form,
 because special case information can be recorded in a natural
 way. Special interest attaches to the continuity properties of the
 factors, and it is shown that conditions for discontinuous behaviour
 can be given using the factor D. We show that this is important, for
 example, in computing the Moore-Penrose inverse of a matrix containing
 symbolic entries. We also give a separate generalization of LU
 factoring to fraction-free Gaussian elimination.",
 paper = "Corl97a.pdf"
}

\end{chunk}

\index{Clarke, Edmund}
\index{Zhao, Xudong}
\begin{chunk}{axiom.bib}
@techreport{Clar92,
 author = "Clarke, Edmund and Zhao, Xudong",
 title = "Analytica  An Experiment in Combining Theorem Proving and
 Symbolic Computation",
 type = "technical report",
 institution = "Carnegie Mellon University",
 number = "CMUCS92147",
 year = "1992",
 abstract =
 "Analytica is an automatic theorem prover for theorems in elementary
 analysis. The prover is written in Mathematica language and runs in
 the Mathematica environment. The goal of the project is to use a
 powerful symbolic computation system to prove theorems that are beyond
 the scope of previous automatic theorem provers. The theorem prover is
 also able to guarantee the correctness of certain steps that are made
 by the symbolic computation system and therefore prevent common errors
 like division by a symbolic expression that could be zero. In this
 paper we describe the structure of Analytica and explain the main
 techniques that it uses to construct proofs. Analytica has been able
 to prove several nontrivial examples including the basic properties
 of the stereographic projection and a series of three lemmas that lead
 to a proof of Weierstrass's example of a continuous nowhere
 differentiable function. Each of the lemmas in the latter example is
 proved completely automatically.",
 keywords = "CASProof"
}

\end{chunk}

\index{Mines, Ray}
\index{Richman, Fred}
\index{Ruitenburg, Wim}
\begin{chunk}{axiom.bib}
@book{Mine88,
 author = "Mines, Ray and Richman, Fred and Ruitenburg, Wim",
 title = "A Course in Constructive Algebra",
 year = "1988",
 publisher = "Universitext",
 abstract =
 "The constructive approach to mathematics has enjoyed a renaissance,
 caused in large part by the appearance of Errett Bishop's book
 Foundations of constructiue analysis in 1967, and by the subtle
 influences of the proliferation of powerful computers. Bishop
 demonstrated that pure mathematics can be developed from a
 constructive point of view while maintaining a continuity with
 classical terminology and spirit; much more of classical mathematics
 was preserved than had been thought possible, and no classically false
 theorems resulted, as had been the case in other constructive schools
 such as intuitionism and Russian constructivism. The computers created
 a widespread awareness of the intuitive notion of an effective
 procedure, and of computation in principle, in addition to
 stimulating the study of constructive algebra for actual
 implementation, and from the point of view of recursive function
 theory. In analysis, constructive problems arise instantly because we
 must start with the real numbers, and there is no finite procedure for
 deciding whether two given real numbers are equal or not (the real
 numbers are not discrete). The main thrust of constructive
 mathematics was in the direction of analysis, although several
 mathematicians, including Kronecker and van der Waerden, made
 important contributions to constructive algebra. Heyting, working in
 intuitionistic algebra, concentrated on issues raised by considering
 algebraic structures over the real numbers, and so developed a
 'handmaiden' of analysis rather than a theory of discrete algebraic
 structures."
}

\end{chunk}

\index{Guillaume, Alexandre}
\begin{chunk}{axiom.bib}
@phdthesis{Guil98,
 author = "Guillaume, Alexandre",
 title = "De Aldor A Zermelo",
 year = "1998",
 school = "Universite Pierre et Marie Curie (Paris)"
}

\end{chunk}

\index{Augustsson, Lennart}
\begin{chunk}{axiom.bib}
@inproceedings{Augu98,
 author = "Augustsson, Lennart",
 title = "Cayenne  a language with dependent types",
 booktitle = "ICFP 98",
 year = "1998",
 pages = "239250",
 isbn = "1581130244",
 link = "\url{http://fsl.cs.illinois.edu/images/5/5e/Cayenne.pdf}",
 abstract =
 "Cayenne is a Haskelllike language. The main difference between
 Haskell and Cayenne is that Cayenne has dependent types i.e.
 the result type of a function may depend on the argument value
 and types of record components which can be types or values
 may depend on other components. Cayenne also combines the
 syntactic categories for value expressions and type expressions thus
 reducing the number of language concepts.

 Having dependent types
 and combined type and value expressions makes the language very
 powerful. It is powerful enough that a special module concept
 is unnecessary; ordinary records suffice. It is also powerful enough to
 encode predicate logic at the type level allowing types to be
 used as specifications of programs. However this power comes at a
 cost: type checking of Cayenne is undecidable. While this may
 appear to be a steep price to pay, it seems to work well in practice.",
 paper = "Augu98.pdf"
}

\end{chunk}

\index{Pfenning, Frank}
\begin{chunk}{axiom.bib}
@inproceedings{Pfen89,
 author = "Pfenning, Frank",
 title = "Elf: A Language for Logic Definition and Verified
 Metaprogramming",
 booktitle = "4th Symp. on Logic in Computer Science",
 pages = "313322",
 isbn = "0818619546",
 year = "1989",
 abstract =
 "We describe Elf, a metalanguage for proof manipulation environments
 that are independent of any particular logic system. Elf is
 intended for metaprograms such as theorem provers, proof
 transformers, or type inference programs for programming languages
 with complex type systems. Elf unifies logic definition (in the
 style of LF, the Edinburgh Logical Framework) with logic
 programming (in the style of $\lambda$Prolog). It achieves this
 unification by giving {\sl types} an operational interpretation,
 much the same way that Prolog gives certain formulas
 (Horn-clauses) an operational interpretation. Novel features of
 Elf include: (1) the Elf search process automatically constructs
 terms that can represent object-logic proofs, and thus a program
 need not construct them explicitly, (2) the partial correctness
 of metaprograms with respect to a given logic can be expressed
 and proved in Elf itself, and (3) Elf exploits Elliott's
 unification algorithm for a $\lambda$-calculus with dependent
 types.",
 paper = "Pfen89.pdf"
}

\end{chunk}

\index{Aransay, Jesus}
\index{Ballarin, Clemens}
\index{Rubio, Julio}
\begin{chunk}{axiom.bib}
@inproceedings{Aran05,
 author = "Aransay, Jesus and Ballarin, Clemens and Rubio, Julio",
 title = "Extracting Computer Algebra Programs from Statements",
 booktitle = "Int. Conf. on Computer Aided System Theory",
 pages = "159168",
 year = "2005",
 abstract =
 "In this paper, an approach to synthesize correct programs from
 specifications is presented. The idea is to extract code from
 definitions appearing in statements which have been mechanically
 proved with the help of a proof assistant. This approach has been
 found when proving the correctness of certain Computer Algebra
 programs (for Algebraic Topology) by using the Isabelle proof
 assistant. To ease the understanding of our techniques, they are
 illustrated by means of examples in elementary arithmetic.",
 paper = "Aran05.pdf"
}

\end{chunk}

\index{Calmet, Jacques}
\index{Homann, Karsten}
\begin{chunk}{axiom.bib}
@inproceedings{Calm96a,
 author = "Calmet, Jacques and Homann, Karsten",
 title = "Proofs in Computational Algebra: An Interface between DTP
 and Magma",
 booktitle = "Proc. 2nd Magma Conf. on Computational Algebra",
 year = "1996",
 comment = "Extended Abstract",
 link = "\url{https://pdfs.semanticscholar.org/c709/338dfde245e638690bc7414d8a191eae3a82.pdf}",
 paper = "Calm96a.pdf"
}

\end{chunk}

\index{Homann, Karsten}
\begin{chunk}{axiom.bib}
@phdthesis{Homa96,
 author = "Homann, Karsten",
 title = {Symbolisches L\"osen mathematischer Probleme durch
 Kooperation algorithmischer und logischer Systeme},
 year = "1996",
 school = {Universit\"at Karlsruhe},
 paper = "Homa96.pdf"
}

\end{chunk}

\index{McAllester, D.}
\index{Arkoudas, K.}
\begin{chunk}{axiom.bib}
@inproceedings{Mcal96,
 author = "McAllester, D. and Arkoudas, K.",
 title = "Walther Recursion",
 booktitle = "CADE 13",
 publisher = "Springer-Verlag",
 year = "1996",
 abstract =
 "Primitive recursion is a well known syntactic restriction on
 recursion definitions which guarantees termination. Unfortunately
 many natural definitions, such as the most common definition of
 Euclid's GCD algorithm, are not primitive recursive. Walther has
 recently given a proof system for verifying termination of a
 broader class of definitions. Although Walther's system is highly
 automatible, the class of acceptable definitions remains only
 semi-decidable. Here we simplify Walther's calculus and give a
 syntactic criterion which generalizes primitive recursion and handles
 most of the examples given by Walther. We call the corresponding
 class of acceptable definitions ``Walther recursive''.",
 paper = "Mcal96.pdf"
}

\end{chunk}

\index{Mitchell, John C.}
\index{Plotkin, Gordon D.}
\begin{chunk}{axiom.bib}
@article{Mitc88,
 author = "Mitchell, John C. and Plotkin, Gordon D.",
 title = "Abstract types have existential type",
 journal = "ACM TOPLAS",
 volume = "10",
 number = "3",
 year = "1988",
 pages = "470502",
 abstract =
 "Abstract data type declarations appear in typed programming languages
 like Ada, Alphard, CLU and ML. This form of declaration binds a list
 of identifiers to a type with associated operations, a composite
 “value” we call a data algebra. We use a secondorder typed lambda
 calculus SOL to show how data algebras may be given types, passed as
 parameters, and returned as results of function calls. In the process,
 we discuss the semantics of abstract data type declarations and review
 a connection between typed programming languages and constructive
 logic.",
 paper = "Mitc88.pdf"
}

\end{chunk}

\index{Nordstr\"om, Bengt}
\index{Petersson, Kent}
\index{Smith, Jan M.}
\begin{chunk}{axiom.bib}
@book{Nord90,
 author = {Nordstr\"om, Bengt and Petersson, Kent and Smith, Jan M.},
 title = {Programming in Martin-L\"of's Type Theory},
 year = "1990",
 publisher = "Oxford University Press",
 paper = "Nord90.pdf"
}

\end{chunk}

\index{PaulinMohring, Christine}
\begin{chunk}{axiom.bib}
@inproceedings{Paul93,
 author = "PaulinMohring, Christine",
 title = "Inductive Definitions in the system Coq  Rules and
 Properties",
 booktitle = "Proc '93 Int. Conf. on Typed Lambda Calculi and
 Applications",
 year = "1993",
 pages = "328345",
 isbn = "3540565175",
 abstract =
 "In the pure Calculus of Constructions, it is possible to represent
 data structures and predicates using higher-order
 quantification. However, this representation is not satisfactory, from
 the point of view of both the efficiency of the underlying programs
 and the power of the logical system. For these reasons, the calculus
 was extended with a primitive notion of inductive definitions
 [8]. This paper describes the rules for inductive definitions in the
 system Coq. They are general enough to be seen as one formulation of
 adding inductive definitions to a typed lambda-calculus. We prove
 strong normalization for a subsystem of Coq corresponding to the pure
 Calculus of Constructions plus Inductive Definitions with only weak
 eliminations"
}

\end{chunk}

\index{Cohen, Arjeh M.}
\index{Davenport, James H.}
\index{Heck, J.P.}
\begin{chunk}{axiom.bib}
@misc{Cohe93,
 author = "Cohen, Arjeh M. and Davenport, James H. and Heck, J.P.",
 title = "An overview of computer algebra",
 year = "1993",
 paper = "Cohe93.pdf"
}

\end{chunk}

\index{Caprotti, Olga}
\index{Oostdijk, Martijn}
\begin{chunk}{axiom.bib}
@article{Capr01,
 author = "Caprotti, Olga and Oostdijk, Martijn",
 title = "Formal and Efficient Primality Proofs by Use of Computer
 Algebra Oracles",
 journal = "J. Symbolic Computation",
 volume = "32",
 pages = "5570",
 year = "2001",
 abstract =
 "This paper focuses on how to use Pocklington's criterion to
 produce efficient formal proofobjects for showing primality of
 large positive numbers. First, we describe a formal development of
 Pocklington's criterion, done using the proof assistant Coq. Then
 we present an algorithm in which computer algebra software is
 employed as oracle to the proof assistant to generate the
 necessary witnesses for applying the criterion. Finally, we
 discuss the implementation of this approach and tackle the proof
 of primality for some of the largest numbers expressible in Coq.",
 paper = "Capr01.pdf"
}

\end{chunk}

\index{Delahaye, David}
\index{Mayero, Micaela}
\begin{chunk}{axiom.bib}
@article{Dela05,
 author = "Delahaye, David and Mayero, Micaela",
 title = "Dealing with algebraic expressions over a field in Coq
 using Maple",
 journal = "J. Symbolic Computation",
 volume = "39",
 pages = "569592",
 year = "2005",
 abstract =
 "We describe an interface between the Coq proof assistant and the
 Maple symbolic computations system, which mainly consists in
 importing, in Coq. Maple computations regarding algebraic
 expressions over fields. These can either be pure computations,
 which do not require any validation, or computations used during
 proofs, which must be proved (to be correct) within Coq. These
 correctneess proofs are completed automatically thanks to the
 tactic Field, which deals with equalities over fields. This
 tactic, which may generate side conditions (regarding the
 denominators) that must be proved by the user, has been
 implemented in a reflxive way, which ensures both efficiency and
 certification. The implementation of this interface is quite light
 and can be very easily extended to get other Maple functions (in
 addition to the four functions we have imported and used in the
 examples given here).",
 paper = "Dela05.pdf"
}

\end{chunk}

\index{Turner, D.A.}
\begin{chunk}{axiom.bib}
@article{Turn95,
 author = "Turner, D.A.",
 title = "Elemntary Strong Functional Programming",
 journal = "Lecture Notes in Computer Science",
 volume = "1022",
 pages = "113",
 year = "1995",
 abstract =
 "Functional programming is a good idea, but we haven’t got it quite
 right yet. What we have been doing up to now is weak (or partial)
 functional programming. What we should be doing is strong (or
 total) functional programming - in which all computations terminate.
 We propose an elementary discipline of strong functional programming.
 A key feature of the discipline is that we introduce a type
 distinction between data, which is known to be finite, and codata,
 which is (potentially) infinite.",
 paper = "Turn95.pdf"
}

\end{chunk}

\index{Thompson, Simon}
\begin{chunk}{axiom.bib}
@book{Thom91,
 author = "Thompson, Simon",
 title = "Type Theory and Functional Programming",
 year = "1991",
 publisher = "AddisonWesley",
 abstract =
 "This book explores the role of MartinLof's constructive type
 theory in computer programming. Th emain focus of the book is how
 the theory can be successfully applied in practice. Introductory
 sections provide the necessary background in logic, lambda
 calculus and constructive mathematics, and exercises and chapter
 summaries are included to reinforce understanding."
}

\end{chunk}

\index{Denzinger, J\"org}
\index{Fuchs, Matthias}
\begin{chunk}{axiom.bib}
@article{Denz94,
 author = "Denzinger, Jorg and Fuchs, Matthias",
 title = "Goal oriented equational theorem proving using team work",
 journal = "Lecture Notes in Computer Science",
 volume = "861",
 pages = "343354",
 year = "1994",
 abstract =
 "The team work method is a concept for distributing automated theorem
 provers by activating several experts to work on a problem. We have
 implemented this for pure equational logic using the unfailing
 Knuth-Bendix completion procedure as the basic prover. In this paper we
 present three classes of experts working in a goal oriented
 fashion. In general, goal oriented experts perform their job “unfairly”
 and so are often unable to solve a given problem alone. However, as
 team members in the team work method they perform highly efficiently,
 as we demonstrate by examples, some of which can only be proved using
 team work.",
 paper = "Denz94.pdf"
}

\end{chunk}

\index{Calmet, Jacques}
\index{Homann, Karsten}
\begin{chunk}{axiom.bib}
@article{Calm97a,
 author = "Calmet, Jacques and Homann, Karsten",
 title = "Towards the Mathematics Software Bus",
 journal = "Theoretical Computer Science",
 volume = "197",
 number = "12",
 year = "1997",
 pages = "221230",
 abstract =
 "The Mathematics Software Bus is a software environment for combining
 heterogeneous systems performing any kind of mathematical
 computation. Such an environment will provide combinations of
 graphics, editing and computation tools through interfaces to already
 existing powerful software by flexible and powerful semantic
 integration.

 Communication and cooperation mechanisms for logical and symbolic
 computation systems make it possible to study and solve new classes
 of problems
 and to perform efficient computation in mathematics through
 cooperating specialized packages.

 We give an overview on the need for cooperation in solving
 mathematical problems and illustrate the advantages by several
 well-known examples. The needs and requirements for the Mathematics
 Software Bus and its architecture are demonstrated through some
 implementations of powerful interfaces between mathematical services.",
 paper = "Calm97a.pdf"
}

\end{chunk}

\index{Abbott, John}
\index{van Leeuwen, Andre}
\index{Strotmann, Andreas}
\begin{chunk}{axiom.bib}
@article{Abbo95,
 author = "Abbott, John and van Leeuwen, Andre and Strotmann, Andreas",
 title = "Objectives of OpenMath",
 journal = "J. Symbolic Computation",
 volume = "11",
 year = "1995",
 abstract = "
 "OpenMath aims at providing a universal means of communicating
 mathematical information between applications. In this paper we set
 out the objectives and design goals of OpenMath, and sketch the
 framework of a model that meets these requirements. Based upon this
 model, we propose a structured approach for further development and
 implementation of OpenMath. Throughout, emphasis is on
 extensibility and flexibility, so that OpenMath is not confined to any
 particular area of mathematics, nor to any particular
 implementation. We give some example scenarios to motivate and clarify
 the objectives, and include a brief discussion of the parallels
 between this model and the theory of human language perception.",
 paper = "Abbo95.pdf"
}

\end{chunk}

\index{Davenport, James H.}
\begin{chunk}{axiom.bib}
@article{Dave16,
 author = "Davenport, James H.",
 title = "Complexity of Integration, Special Values, and Recent Developments",
 journal = "LNCS",
 volume = "9725",
 pages = "485491",
 abstract =
 "Two questions often come up when the author discusses integration:
 what is the complexity of the integration process, and for what
 special values of parameters is an unintegrable function actually
 integrable. These questions have not been much considered in the
 formal literature, and where they have been, there is one recent
 development indicating that the question is more delicate than had
 been supposed.",
 paper = "Dave16.pdf"
}

\end{chunk}

\index{Dongarra, Jack}
\begin{chunk}{axiom.bib}
@article{Dong16,
 author = "Dongarra, Jack",
 title = "With Extreme Scale Computing, the Rules Have Changed",
 journal = "LNCS",
 volume = "9725",
 pages = "36",
 paper = "Dong16.pdf"
}

\end{chunk}

\index{Maletzky, Alexander}
\begin{chunk}{axiom.bib}
@article{Male16,
 author = "Maletzky, Alexander",
 title = "Interactive Proving, HigherOrder Rewriting, and Theory Analysis
 in Theorema 2.0",
 journal = "LNCS",
 volume = "9725",
 pages = "5966",
 year = "2016",
 abstract =
 "In this talk we will report on three useful tools recently
 implemented in the frame of the Theorema project: a graphical user
 interface for interactive proof development, a higher-order
 rewriting mechanism, and a tool for automatically analyzing the
 logical structure of Theorema theories. Each of these three tools
 already proved extremely useful in the extensive formal exploration of
 a non-trivial mathematical theory, namely the theory of Gr\"obner
 bases and reduction rings, in Theorema 2.0.",
 paper = "Male16.pdf"
}

\end{chunk}

\index{Buchberger, Bruno}
\begin{chunk}{axiom.bib}
@article{Buch14,
 author = "Buchberger, Bruno",
 title = "Soft Math Math Soft",
 journal = "LNCS",
 volume = "8592",
 year = "2014",
 pages = "915",
 abstract =
 "In this talk we argue that mathematics is essentially software. In
 fact, from the beginning of mathematics, it was the goal of
 mathematics to automate problem solving. By systematic and deep
 thinking, for problems whose solution was difficult in each
 individual instance, systematic procedures were found that allow one
 to solve each instance without further thinking. In each round of
 automation in mathematics, the deep thinking on spectra of problem
 instances is reflected by deep theorems with deep proofs.",
 paper = "Buch14.pdf"
}

\end{chunk}

\index{Pollack, Robert}
\begin{chunk}{axiom.bib}
@techreport{Poll97,
 author = "Pollack, Robert",
 title = "How to Believe a MachineChecked Proof",
 type = "technical report",
 institution = "Univ. of Aarhus, Basic Research in Computer Science",
 number = "BRICS RS9718",
 year = "1997",
 abstract =
 "Suppose I say ``Here is a machinechecked proof of Fermat's last
 theorem (FLT)''. How can you use my putative machinechecked proof
 as evidence for belief in FLT? I start from the position that you
 must have some personal experience of understanding to attain
 belief, and to have this experience you must engage your intuition
 and other mental processes which are impossible to formalise.",
 paper = "Poll97.pdf"
}

\end{chunk}

\index{Vaninwegen, Myra}
\begin{chunk}{axiom.bib}
@phdthesis{Vani96,
 author = "Vaninwegen, Myra",
 title = "The MachineAssisted Proof of Programming Language
 Properties",
 year = "1996",
 school = "University of Pennsylvania",
 link = "\url{https://pdfs.semanticscholar.org/4ba2/6cbd3d1600aa82d556d76ce5531db9e8e940.pdf}",
 abstract =
 "The goals of thie project described in this thesis are
 twofold. First, we wanted to demonstrate that if a programming
 language has a semantics that is complete and rigorous
 (mathematical), but not too complex, then substantial theorems can
 be proved about it. Second, we wanted to assess the utility of
 using an automated theorem prover to aid in such proofs. We chose
 SML as the language about which to prove theorems: it has a
 published semantics that is complete and rigorous, and while not
 exactly simple, is comprehensible. We encoded the semantics of
 Core SML into the theorem prover HOL (creating new definitional
 packages for HOL in the process). We proved important theorems
 about evaluation and about the type system. We also proved the
 type preservation theorem, which relates evaluation and typing,
 for a good portion of the language. We were not able to complete
 the proof of type preservation because it is not true: we found
 counterexamples. These proofs demonstrate that a good semantics
 will allow the proof of programming language properties and allow
 the identification of trouble spots in the language. The use of
 HOL had its plusses and minuses. On the whole the benefits greatly
 outweigh the drawbacks, enough so that we believe that these
 theorems could not have been proved in the amount of time taken by
 this project had we not used automated help.",
 paper = "Vani96.pdf"
}

\end{chunk}

\index{Milner, R.}
\index{Tofte, M.}
\index{Harper, R.}
\index{MacQueen, David}
\begin{chunk}{axiom.bib}
@book{Miln97,
 author = "Milner, Robin and Torte, Mads and Harper, Robert and
 MacQueen, David",
 title = "The Definition of Standard ML",
 publisher = "Lab for Foundations of Computer Science, Univ. Edinburgh",
 link = "\url{http://smlfamily.org/sml97defn.pdf}",
 year = "1997",
 paper = "Miln97.pdf"
}

\end{chunk}

\index{Windsteiger, Wolfgang}
\begin{chunk}{axiom.bib}
@article{Wind14,
 author = "Windsteiger, Wolfgang",
 title = "Theorema 2.0: A System for Mathematical Theory
 Exploration",
 journal = "LNCS",
 volume = "8592",
 pages = "4952",
 year = "2014",
 abstract =
 "Theorema 2.0 stands for a redesign including a complete
 reimplementation of the Theorema system, which was originally designed,
 developed, and implemented by Bruno Buchberger and his Theorema group
 at RISC. In this talk, we want to present the current status of the
 new implementation, in particular the new user interface of the
 system.",
 paper = "Wind14.pdf"
}

\end{chunk}

\index{Lamport, Leslie}
\begin{chunk}{axiom.bib}
@misc{Lamp95,
 author = "Lamport, Leslie",
 title = "Types Are Not Harmless",
 year = "1995",
 paper = "Lamp95.pdf"
}

\end{chunk}

\index{Kerber, Manfred}
\index{Pollet, Martin}
\begin{chunk}{axiom.bib}
@article{Kerb07,
 author = "Kerber, Manfred and Pollet, Martin",
 title = "Informal and Formal Representations in Mathewmatics",
 journal = "Studies in Logic, Grammar and Rhetoric 10",
 volume = "23",
 year = "2007",
 isbn = "9788374311281",
 abstract =
 "In this paper we discuss the importance of good representations in
 mathematics and relate them to general design issues. Good design
 makes life easy, bad design difficult. For this reason experienced
 mathematicians spend a significant amount of their time on the design
 of their concepts. While many formal systems try to support this by
 providing a highlevel language, we argue that more should be learned
 from the mathematical practice in order to improve the applicability
 of formal systems.",
 paper = "Kerb07.pdf"
}

\end{chunk}

\index{Bittencourt, G.}
\index{Calmet, Jacques}
\index{Homann, K.}
\index{Lulay, A.}
\begin{chunk}{axiom.bib}
@inproceedings{Bitt94,
 author = "Bittencourt, G. and Calmet, Jacques and Homann, K. and
 Lulay, A.",
 title = "MANTRA: A MultiLevel Hybrid Knowledge Representation System",
 booktitle = "Proc. XI Brazilian Symp. on Artificial Intelligence",
 pages = "493506",
 year = "1994",
 abstract =
 "The intelligent behavior of a system is based upon its represented
 knowledge and inference capabilities. In this paper we report on a
 knowledge representation and reasoning system, developed at the
 University of Karlsruhe, called Mantra. The system provides four
 different knowledge representation methods, first-order logic,
 terminological language, semantic networks, and production rules,
 distributed into a three-level architecture. The first three methods
 form the lowest level of the architecture, the epistemological
 level. The supported hybrid inferences as well as the management of
 knowledge bases form the second level, called logical level. Finally,
 the third level, the heuristic level, provides representation of
 procedural knowledge of a domain, and the introduction of ad hoc
 rules. This knowledge is represented in terms of production rules
 which are processed by an OPS5-like rule interpreter. This paper mainly
 describes the introduction of this level into the hybrid system. The
 semantics of the knowledge representation methods of the
 epistemological level is defined according to a four-valued logic
 approach. This definition ensures that all inference algorithms are
 sound, complete and decidable. The system has been implemented in
 Common Lisp using the object-oriented extension CLOS, and the
 graphical user interface was implemented in C with the X Toolkit.",
 paper = "Bitt94.pdf"
}

\end{chunk}

\index{Homann, Karsten}
\begin{chunk}{axiom.bib}
@inproceedings{Homa94a,
 author = "Homann, Karsten",
 title = "Integrating ExplanationBased Learning in Symbolic
 Computing",
 booktitle = "Advances in Artificial Intelligence  Theory and
 Application II, Volume II",
 pages = "130135",
 year = "1994"
}

\end{chunk}

\index{B\"undgen, Reinhard}
\begin{chunk}{axiom.bib}
@article{Bund94,
 author = "Bundgen, Reinhard",
 title = "Combining Computer Algebra and Rule Based Reasoning",
 journal = "LNCS",
 volume = "958",
 pages = "209223",
 abstract =
 "We present extended term rewriting systems as a means to describe a
 simplification relation for an equational specification with a
 built-in domain of external objects. Even if the extended term
 rewriting system is canonical, the combined relation including
 built-in computations of 'ground terms' need neither be terminating
 nor confluent. We investigate restrictions on the extended term
 rewriting systems and the built-in domains under which these
 properties hold. A very important property of extended term rewriting
 systems is decomposition freedom. Among other things, decomposition-free
 extended term rewriting systems allow for efficient simplifications.
 Some interesting algebraic applications of canonical simplification
 relations are presented.",
 paper = "Bund94.pdf"
}

\end{chunk}

\index{Calmet, J.}
\index{Campbell, J.A.}
\begin{chunk}{axiom.bib}
@article{Calm92,
 author = "Calmet, J. and Campbell, J.A.",
 title = "Artificial Intelligence and Symbolic Mathematical
 Computations",
 journal = "LNCS",
 volume = "737",
 pages = "119",
 year = "1992",
 abstract =
 "This introductory paper summarizes the picture of the territory
 common to AI and SMC that has evolved from discussions following the
 presentation of papers given at the 1992 Karlsruhe conference. Its
 main objective is to highlight some patterns that can be used to guide
 both sketches of the state of the art in this territory and
 suggestions for future research activities.",
 paper = "Calm92.pdf"
}

\end{chunk}

\index{Miola, Alfonso}
\begin{chunk}{axiom.bib}
@article{Miol91,
 author = "Miola, Alfonso",
 title = "Symbolic Computation and Artificial Intelligence",
 journal = "LNCS",
 volume = ="535",
 pages = "243255",
 abstract =
 "The paper presents an overview of the research achievements on issues
 of common interest for Symbolic Computation and Artificial
 Intelligence. Common methods and techniques of non-numerical
 information processing and of automated problem solving are underlined
 together with specific applications. A qualitative analysis of the
 symbolic computation systems currently available is presented in
 view of the design and implementation of a new system. This system
 allows both formal algebraic and analytical computations and automated
 deduction to prove properties of the computation.",
 paper = "Miol91.pdf"
}

\end{chunk}

\index{Audemard, Gilles}
\index{Bertoli, Piergiorgio}
\index{Cimatti, Alessandro}
\index{Kornilowicz, Artur}
\index{Sebastiani, Roberto}
\begin{chunk}{axiom.bib}
@article{Aude02,
 author = "Audemard, Gilles and Bertoli, Piergiorgio and Cimatti,
 Alessandro and Kornilowicz, Artur and Sebastiani,
 Roberto",
 title = "Integrating Boolean and Mathematical Solving: Foundations,
 Basic Algorithms, and Requirements",
 journal = "LNCS",
 volume = "2385",
 pages = "231245",
 year = "2002",
 abstract =
 "In the last years we have witnessed an impressive advance in the
 efficiency of boolean solving techniques, which has brought large
 previously intractable problems at the reach of stateoftheart
 solvers. Unfortunately, simple boolean expressions are not expressive
 enough for representing many realworld problems, which require
 handling also integer or real values and operators. On the other
 hand, mathematical solvers, like computer-algebra systems or
 constraint solvers, cannot handle efficiently problems involving
 heavy boolean search, or do not handle them at all. In this paper we
 present the foundations and the basic algorithms for a new class of
 procedures for solving boolean combinations of mathematical
 propositions, which combine boolean and mathematical solvers, and we
 highlight the main requirements that boolean and mathematical solvers
 must fulfill in order to achieve the maximum benefits from their
 integration. Finally we show how existing systems are captured by our
 framework.",
 paper = "Aude02.pdf"
}

\end{chunk}

\index{Clement, Dominique}
\index{Prunet, Vincent}
\index{Montagnac, Francis}
\begin{chunk}{axiom.bib}
@article{Clem91,
 author = "Clement, Dominique and Prunet, Vincent and Montagnac, Francis",
 title = "Integrated software components: A Paradigm for Control Integration",
 journal = "LNCS",
 volume = "509",
 pages = "167177",
 year = "1991",
 abstract =
 "This report describes how control integration between software
 components may be organised using an encapsulation technique combined
 with broadcast message passing: each software component, which is
 encapsulated within an integrated software component (IC),
 communicates by sending and receiving events. Events are emitted
 without the emitter knowing whether there are any receivers. The
 proposed mechanism can be used for inter-tool communication as well as
 for communication within a single tool.

 This programming architecture frees the code from dependencies upon
 the actual software component environments, and simplifies its
 extension.",
 paper = "Clem91.pdf"
}

\end{chunk}

\index{Farmer, William M.}
\index{Guttman, Joshua D.}
\index{Thayer, Javier}
\begin{chunk}{axiom.bib}
@article{Farm93,
 author = "Farmer, William M. and Guttman, Joshua D. and Thayer, Javier",
 title = "Reasoning with contexts",
 journal = "LNCS",
 volume = "722",
 year = "1993",
 abstract =
 "Contexts are sets of formulas used to manage the assumptions that
 arise in the course of a mathematical deduction or calculation. This
 paper describes some techniques for symbolic computation that are
 dependent on using contexts, and are implemented in IMPS, an
 Interactive Mathematical Proof System.",
 paper = "Farm93.pdf"
}

\end{chunk}

\index{Freyd, Peter}
\index{Scedrov, Andre}
\begin{chunk}{axiom.bib}
@book{Frey90,
 author = "Freyd, Peter and Scedrov, Andre",
 title = "Categories, Allegories",
 year = "1990",
 publisher = "NorthHolland",
 comment = "Mathematical Library Vol 39",
 isbn = "97804447033682",
 abstract =
 "On the Categories side, the book centers on that part of categorical
 algebra that studies exactness properties, or other properties enjoyed
 by nice or convenient categories such as toposes, and their
 relationship to logic (for example, geometric logic). A major theme
 throughout is the possibility of representation theorems (aka
 completeness theorems or embedding theorems) for various categorical
 structures, spanning back now about five decades (as of this writing)
 to the original embedding theorems for abelian categories, such as the
 Freyd-Mitchell embedding theorem.

 On the Allegories side: it may be said they were first widely
 publicized in this book. They comprise many aspects of relational
 algebra corresponding to the categorical algebra studied in the first
 part of the book"
}

\end{chunk}

\index{Harrison, John}
\begin{chunk}{axiom.bib}
@techreport{Harr95,
 author = "Harrison, John",
 title = "Methatheory and Reflection in Theorem Proving: A Survey
 and Critique",
 institution = "SRI Cambridge",
 year = "1995",
 type = "technical report",
 number = "CRC053",
 abstract =
 "One way to ensure correctness of the inference performed by computer
 theorem provers is to force all proofs to be done step by step in a
 simple, more or less traditional, deductive system. Using techniques
 pioneered in Edinburgh LCF, this can be made palatable. However, some
 believe such an approach will never be efficient enough for large,
 complex proofs. One alternative, commonly called reflection, is to
 analyze proofs using a second layer of logic, a metalogic, and so
 justify abbreviating or simplifying proofs, making the kinds of
 shortcuts humans often do or appealing to specialized decision
 algorithms. In this paper we contrast the fully-expansive LCF approach
 with the use of reflection. We put forward arguments to suggest that
 the inadequacy of the LCF approach has not been adequately
 demonstrated, and neither has the practical utility of reflection
 (notwithstanding its undoubted intellectual interest). The LCF system
 with which we are most concerned is the HOL proof assistant.

 The plan of the paper is as follows. We examine ways of providing user
 extensibility for theorem provers, which naturally places the LCF and
 reflective approaches in opposition. A detailed introduction to LCF is
 provided, emphasizing ways in which it can be made efficient. Next, we
 present a short introduction to metatheory and its usefulness, and,
 starting from Gödel's proofs and Feferman's transfinite progressions
 of theories, look at logical `reflection principles'. We show how to
 introduce computational `reflection principles' which do not extend
 the power of the logic, but may make deductions in it more efficient,
 and speculate about their practical usefulness. Applications or
 proposed applications of computational reflection in theorem proving
 are surveyed, following which we draw some conclusions. In an
 appendix, we attempt to clarify a couple of other notions of
 `reflection' often encountered in the literature.

 The paper questions the too-easy acceptance of reflection principles
 as a practical necessity. However I hope it also serves as an adequate
 introduction to the concepts involved in reflection and a survey of
 relevant work. To this end, a rather extensive bibliography is
 provided.",
 paper = "Harr95.pdf"
}

\end{chunk}
diff --git a/src/axiomwebsite/patches.html b/src/axiomwebsite/patches.html
index 048681f..9e3d01c 100644
--- a/src/axiomwebsite/patches.html
+++ b/src/axiomwebsite/patches.html
@@ -5843,7 +5843,9 @@ books/bookvolbib Proving Axiom Correct references
 20171017.01.tpd.patch
 books/bookvol13 add Homann database reference
 20171029.01.tpd.patch
-books/bookvolbib add references
+books/bookvolbib add references
+20171030.01.tpd.patch
+books/bookheader.tex add tab characters in credits

1.9.1