Search Results

  • Verification of human-level proof steps in mathematics education
    Pages: 345-362 | Views: 11
    Automated mathematics tutorial systems need support from a reasoning module which can verify the correctness of students' contributions. However, current systems typically do not reason at a level similar to the student's, and do not fully account for underspecified or ambiguous inputs. We present a domain-independent method for automatically verifying correct proof steps and detecting standard reasoning errors. By using a depth-limited BFS proof search to determine and maintain multiple possible interpretations consistent with the given proof step, we are able to resolve, or otherwise propagate, the underspecification and ambiguity that arise from unrestricted user input. Our approach has been implemented in ΩmegaCoRe. (A schematic sketch of such a depth-limited search is given after this results list.)
  • An e-learning environment for elementary analysis: combining computer algebra, graphics and automated reasoning
    Pages: 13-34 | Views: 34
    CreaComp is a project at the University of Linz that aims at producing computer-supported interactive learning units for several mathematical topics at introductory university level. The units are available as Mathematica notebooks. For student experimentation, we also provide computational, graphical, and reasoning tools. This paper focuses on the elementary analysis units.
    The computational and graphical tools of the CreaComp learning environment facilitate the exploration of new mathematical objects and their properties (e.g., boundedness, continuity, limits of real-valued functions). Using the provided tools, students should be able to collect empirical data systematically and come up with conjectures. A CreaComp component allows the formulation of precise conjectures and the investigation of their validity (an illustrative conjecture of this kind is shown after this results list). The Theorema system, which has been integrated into the CreaComp learning environment, provides full predicate logic with a user-friendly two-dimensional syntax and several automated reasoners that produce proofs in an easy-to-read, natural presentation. We demonstrate the learning situations and the provided tools through several examples.
  • Proof step analysis for proof tutoring - a learning approach to granularity
    Pages: 325-343 | Views: 31
    We present a proof step diagnosis module based on the mathematical assistant system Ωmega. The task of this module is to evaluate proof steps as typically uttered by students in tutoring sessions on mathematical proofs. In particular, we categorise the step size of proof steps performed by the student, in order to recognise whether they are appropriate with respect to the student model. We propose an approach that builds on reconstructions of the proof in question via automated proof search using a cognitively motivated proof calculus. Our approach employs learning techniques and incorporates a student model, and our diagnosis module can be adjusted to different domains and users. We present a first evaluation based on empirical data. (A toy sketch of such a granularity classification is given after this results list.)
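
The first result above ("Verification of human-level proof steps in mathematics education") describes a depth-limited BFS proof search that keeps every interpretation consistent with an underspecified student step. The following is a minimal, self-contained sketch of that idea only; it is not the ΩmegaCoRe implementation, and ProofState, Rule and the toy rule set are invented stand-ins.

    # Minimal sketch: depth-limited BFS over proof states that keeps *all*
    # interpretations consistent with a student's underspecified assertion.
    from collections import deque
    from dataclasses import dataclass

    @dataclass(frozen=True)
    class ProofState:
        facts: frozenset           # facts derived so far
        trace: tuple = ()          # names of the rules applied to reach this state

    @dataclass(frozen=True)
    class Rule:
        name: str
        premise: str
        conclusion: str
        def applicable(self, state):
            return self.premise in state.facts and self.conclusion not in state.facts
        def apply(self, state):
            return ProofState(state.facts | {self.conclusion}, state.trace + (self.name,))

    def interpretations(start, rules, asserted_fact, depth_limit=3):
        """All derivations within depth_limit that yield the fact the student asserted."""
        found, queue = [], deque([(start, 0)])
        while queue:
            state, depth = queue.popleft()
            if asserted_fact in state.facts and state.trace:
                found.append(state)              # one consistent interpretation
                continue                         # stop expanding a matched reading
            if depth < depth_limit:              # depth-limited expansion
                for rule in rules:
                    if rule.applicable(state):
                        queue.append((rule.apply(state), depth + 1))
        return found     # none -> reasoning error; several -> ambiguity is propagated

    # The student asserts "C" without saying how it was obtained.
    rules = [Rule("r1", "A", "B"), Rule("r2", "B", "C"), Rule("r3", "A", "C")]
    for s in interpretations(ProofState(frozenset({"A"})), rules, "C"):
        print(s.trace)   # -> ('r3',), ('r1', 'r2'), ('r1', 'r3'): all readings kept

Because every consistent reconstruction is kept rather than committing to one, a later step that fits only one of them can resolve the ambiguity, while an empty result signals a reasoning error; this is the behaviour the abstract describes.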
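
The second result ("An e-learning environment for elementary analysis") centres on students formulating precise conjectures in full predicate logic before handing them to Theorema's automated reasoners. Purely as an illustration (not taken from the CreaComp notebooks), a conjecture about boundedness of a real-valued function might be written out as:

    % Illustration only: "f(x) = x^2 is bounded on [0,1]" as a fully quantified statement,
    % followed by the general predicate-logic definition it instantiates.
    \[
      \exists M \in \mathbb{R}\ \forall x \in [0,1] :\ |x^{2}| \le M
    \]
    \[
      \mathrm{bounded}(f, D) \;:\Longleftrightarrow\; \exists M \in \mathbb{R}\ \forall x \in D :\ |f(x)| \le M
    \]

Empirical exploration with the graphical tools suggests a candidate bound (here M = 1), and the reasoning component can then check whether the fully quantified statement actually holds.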
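
The third result ("Proof step analysis for proof tutoring") categorises the step size (granularity) of a student's proof step using features of an automatic reconstruction together with a student model. The toy sketch below only illustrates the shape of such a classifier; the features, labels, training data and the 1-nearest-neighbour rule are invented stand-ins for the paper's learning techniques.

    # Illustrative sketch only (not the Omega diagnosis module): classify the
    # granularity of a student's proof step from features of its reconstruction.
    from dataclasses import dataclass
    import math

    @dataclass
    class StepFeatures:
        n_rule_applications: int      # calculus steps the reconstruction needs
        n_new_hypotheses: int         # auxiliary facts introduced along the way
        uses_unfamiliar_concept: int  # 1 if the student model marks a concept unknown

    LABELS = ("appropriate", "too-big", "too-small")  # granularity categories

    # Tiny hand-made training set standing in for annotated tutoring data.
    TRAIN = [
        (StepFeatures(1, 0, 0), "too-small"),
        (StepFeatures(2, 0, 0), "appropriate"),
        (StepFeatures(3, 1, 0), "appropriate"),
        (StepFeatures(5, 2, 1), "too-big"),
        (StepFeatures(7, 3, 1), "too-big"),
    ]

    def _dist(a: StepFeatures, b: StepFeatures) -> float:
        return math.dist(
            (a.n_rule_applications, a.n_new_hypotheses, a.uses_unfamiliar_concept),
            (b.n_rule_applications, b.n_new_hypotheses, b.uses_unfamiliar_concept),
        )

    def classify(step: StepFeatures) -> str:
        """1-nearest-neighbour stand-in for the learned granularity classifier."""
        return min(TRAIN, key=lambda ex: _dist(step, ex[0]))[1]

    print(classify(StepFeatures(6, 2, 1)))   # -> "too-big"

Retraining on data annotated for a different domain or user group, or swapping in a different student model, is what lets such a module be adjusted to different domains and users, as the abstract states.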