DOI: 10.5555/3540261.3540451

A Bayesian-symbolic approach to reasoning and learning in intuitive physics

Published: 10 June 2024
    Abstract

    Humans can reason about intuitive physics in fully or partially observed environments, even after being exposed to a very limited set of observations. This sample-efficient intuitive physical reasoning is considered a core domain of human common-sense knowledge. One hypothesis to explain this remarkable capacity posits that humans quickly learn approximations to the laws of physics that govern the dynamics of the environment. In this paper, we propose a Bayesian-symbolic framework (BSP) for physical reasoning and learning that approaches human-level sample efficiency and accuracy. In BSP, the environment is represented by a top-down generative model of entities, which are assumed to interact with each other under unknown force laws over their latent and observed properties. BSP models each of these entities as random variables and uses Bayesian inference to estimate their unknown properties. To learn the unknown forces, BSP leverages symbolic regression on a novel grammar of Newtonian physics in a bilevel optimization setup. These inference and regression steps are performed iteratively using expectation-maximization, allowing BSP to learn force laws while simultaneously maintaining uncertainty over entity properties. We show that BSP is more sample-efficient than neural alternatives on controlled synthetic datasets, demonstrate its applicability to real-world common-sense scenes, and study its performance on tasks previously used to study human physical reasoning.
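
    To make the pipeline described above concrete, the following is a minimal, illustrative Python sketch of the expectation-maximization alternation mentioned in the abstract: an E-step that performs crude Bayesian inference over a latent entity property (here, a single unknown mass) under the current force law, and an M-step that searches a toy grammar of candidate force laws and fits their constants by least squares. The synthetic data, the three-rule grammar, the noise scale, and the helper names (fit_constant, e_step, m_step) are all assumptions made for illustration; this is not the authors' implementation, nor their grammar of Newtonian physics.

        # Illustrative sketch only (not the BSP implementation): an EM-style loop that
        # alternates Bayesian inference over a latent mass with a search over a tiny,
        # assumed grammar of force laws, in the spirit of the framework described above.
        import numpy as np

        rng = np.random.default_rng(0)

        # Toy "observations": accelerations of body 1 caused by body 2 at various
        # separations, generated from a = G * m2 / r^2 with small Gaussian noise.
        true_m2 = 3.0                        # latent property we pretend not to know
        G = 1.0                              # absorbed into the fitted constant below
        r = rng.uniform(1.0, 5.0, size=50)
        a_obs = G * true_m2 / r**2 + rng.normal(0.0, 0.01, size=50)

        # A tiny, assumed grammar of candidate force-law forms, each up to a constant c.
        grammar = {
            "c*m2/r":   lambda m2, r: m2 / r,
            "c*m2/r^2": lambda m2, r: m2 / r**2,
            "c*m2*r":   lambda m2, r: m2 * r,
        }

        def fit_constant(feat, a):
            """Least-squares fit of the single constant c and the resulting mean error."""
            c = float(feat @ a / (feat @ feat))
            return c, float(np.mean((a - c * feat) ** 2))

        def e_step(law, c, n_grid=2000, sigma=0.01):
            """Grid-based posterior mean of the latent mass m2 under the current law."""
            m2_grid = np.linspace(0.5, 6.0, n_grid)
            log_w = np.array([-np.sum((a_obs - c * law(m2, r)) ** 2) / (2 * sigma**2)
                              for m2 in m2_grid])
            w = np.exp(log_w - log_w.max())
            w /= w.sum()
            return float(np.sum(w * m2_grid))

        def m_step(m2_hat):
            """Pick the grammar rule (and constant) that best explains the observations."""
            return min(((name, *fit_constant(law(m2_hat, r), a_obs))
                        for name, law in grammar.items()),
                       key=lambda t: t[2])

        # EM-style alternation: start from a rough mass guess, refine law and mass jointly.
        m2_hat = 1.0
        for it in range(5):
            name, c, mse = m_step(m2_hat)
            m2_hat = e_step(grammar[name], c)
            print(f"iter {it}: law={name}, c={c:.3f}, m2_hat={m2_hat:.3f}, mse={mse:.5f}")

    In this toy setting the fitted constant and the latent mass are only identified up to their product, so the loop settles on a pair whose product matches the generating value; the point is only the shape of the alternation between probabilistic inference over entity properties and symbolic search over force laws, not a faithful reproduction of BSP's bilevel optimization.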

    Supplementary Material

    Supplemental material: 3540261.3540451_supp.pdf



    Published In

    NIPS '21: Proceedings of the 35th International Conference on Neural Information Processing Systems
    December 2021
    30517 pages

    Publisher

    Curran Associates Inc.

    Red Hook, NY, United States


    Qualifiers

    • Research-article
    • Research
    • Refereed limited
