Department of Computer Science

Machine Learning Research Group

University of Texas at Austin Artificial Intelligence Lab

Publications: Neural-Symbolic Learning

Neural networks and symbolic learning techniques can be seen as operating at different levels of abstraction. Our work focuses on understanding the differences between their capabilities and on combining their strengths.
  1. Combining Symbolic and Connectionist Learning Methods to Refine Certainty-Factor Rule-Bases
    [Details] [PDF]
    J. Jeffrey Mahoney
    PhD Thesis, Department of Computer Sciences, University of Texas at Austin, May 1996. 113 pages.
  2. Revising Bayesian Network Parameters Using Backpropagation
    [Details] [PDF]
    Sowmya Ramachandran and Raymond J. Mooney
    In Proceedings of the International Conference on Neural Networks (ICNN-96), Special Session on Knowledge-Based Artificial Neural Networks, 82--87, Washington, DC, June 1996.
  3. Refinement of Bayesian Networks by Combining Connectionist and Symbolic Techniques
    [Details] [PDF]
    Sowmya Ramachandran
    Unpublished Ph.D. Thesis Proposal, 1995.
  4. Comparing Methods For Refining Certainty Factor Rule-Bases
    [Details] [PDF]
    J. Jeffrey Mahoney and Raymond J. Mooney
    In Proceedings of the Eleventh International Workshop on Machine Learning (ML-94), 173--180, Rutgers, NJ, July 1994.
  5. Modifying Network Architectures For Certainty-Factor Rule-Base Revision
    [Details] [PDF]
    J. Jeffrey Mahoney and Raymond J. Mooney
    In Proceedings of the International Symposium on Integrating Knowledge and Neural Heuristics (ISIKNH-94), 75--85, Pensacola, FL, May 1994.
  6. Combining Connectionist and Symbolic Learning to Refine Certainty-Factor Rule-Bases
    [Details] [PDF]
    J. Jeffrey Mahoney and Raymond J. Mooney
    Connection Science, 339--364, 1993.
  7. Combining Symbolic and Neural Learning to Revise Probabilistic Theories
    [Details] [PDF]
    J. Jeffrey Mahoney and Raymond J. Mooney
    In Proceedings of the ML92 Workshop on Integrated Learning in Real Domains, Aberdeen, Scotland, July 1992.
  8. Growing Layers of Perceptrons: Introducing the Extentron Algorithm
    [Details] [PDF]
    Paul T. Baffes and John M. Zelle
    In Proceedings of the 1992 International Joint Conference on Neural Networks, 392--397, Baltimore, MD, June 1992.
  9. Symbolic and Neural Learning Algorithms: An Experimental Comparison
    [Details] [PDF]
    Jude W. Shavlik, Raymond J. Mooney, and Geoffrey Towell
    Machine Learning, 6:111--143, 1991. Reprinted in "Readings in Knowledge Acquisition and Learning", Bruce G. Buchanan and David C. Wilkins (eds.), Morgan Kaufmann, San Mateo, CA, 1993.
  10. Processing Issues in Comparisons of Symbolic and Connectionist Learning Systems
    [Details] [PDF]
    Douglas Fisher, Kathleen McKusick, Raymond J. Mooney, Jude W. Shavlik, and Geoffrey Towell
    In Proceedings of the Sixth International Workshop on Machine Learning, 169--173, Ithaca, New York, 1989.
  11. An Experimental Comparison of Symbolic and Connectionist Learning Algorithms
    [Details] [PDF]
    Raymond J. Mooney, Jude W. Shavlik, Geoffrey Towell, and A. Gove
    In Proceedings of the Eleventh International Joint Conference on Artificial Intelligence (IJCAI-89), 775--780, Detroit, MI, August 1989. Reprinted in "Readings in Machine Learning", Jude W. Shavlik and T. G. Dietterich (eds.), Morgan Kaufmann, San Mateo, CA, 1990.