AI-trends.today

Reexamining the foundations of intelligence: Maybe physics-based AI is the right approach

Tech · By Gavin Wallace · 20/07/2025 · 7 Mins Read

Deep learning revolutionized artificial intelligence over the last decade, producing breakthroughs in image recognition, game playing, language modelling, and beyond. Yet persistent limits remain: data inefficiency, lacklustre robustness, high power consumption, and a superficial grasp of physical law. As AI adoption deepens into critical sectors, from climate forecasting to medicine, these constraints are becoming untenable.

The emergence of physics-based AI, in which learning is guided and constrained by the laws of nature, is a promising trend. Inspired by centuries of scientific progress, this hybrid approach embeds physical principles into machine-learning algorithms, offering new paths to interpretability and reliability. The question is no longer whether we will move past black-box machine learning, but how quickly we can make the change.

A Case for Physics-Based AI

Why Physics Now?

Contemporary AI, especially LLMs and vision models, relies on extracting correlations from massive, often unstructured, datasets. This data-driven approach is less effective where data are scarce, safety is critical, or behaviour is physically governed. Physics-based AI, by contrast, leverages:

  • Inductive bias from physical constraints: Integrating symmetries, conservation laws, and invariances shrinks the hypothesis space, making solutions more tractable.
  • Sample efficiency: Models with physical priors achieve more with less data, a crucial advantage in fields like computational science and healthcare.
  • Robust generalization: Unlike black boxes, physics-based models are less likely to fail in unpredictable ways when extrapolating beyond the training distribution.
  • Interpretability and trust: Predictions that respect laws such as energy conservation are easier to trust and reason about.
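One concrete way to bake such an inductive bias into a network is a Hamiltonian neural network (Greydanus et al., listed in the references): the model outputs a scalar energy H(q, p), and the dynamics come from its symplectic gradient, so energy is conserved by construction rather than learned from data. A minimal PyTorch sketch; the architecture and sizes here are illustrative choices, not taken from the article:

```python
import torch

# The network outputs a scalar energy H(q, p); the induced dynamics
# dq/dt = dH/dp, dp/dt = -dH/dq conserve H along trajectories by
# construction, shrinking the hypothesis space to energy-conserving flows.
H = torch.nn.Sequential(
    torch.nn.Linear(2, 32), torch.nn.Tanh(), torch.nn.Linear(32, 1))

def dynamics(state):                         # state: (batch, 2) = (q, p)
    state = state.requires_grad_(True)
    grads = torch.autograd.grad(H(state).sum(), state, create_graph=True)[0]
    dH_dq, dH_dp = grads[:, :1], grads[:, 1:]
    return torch.cat([dH_dp, -dH_dq], dim=1)  # symplectic gradient
```

The directional derivative of H along this flow is dH_dq·dH_dp + dH_dp·(−dH_dq) = 0, so the conservation law holds exactly whatever the weights are.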

What is the landscape of physics-based AI?

The Workhorse: Physics-Informed Neural Networks

Physics-informed neural networks (PINNs) integrate physical knowledge by penalizing violations of the governing equations in the loss function. Over the years this ecosystem has grown rich:

  • In geosciences and climate, PINNs predict free-surface flows even over complex topography.
  • In materials science and fluid dynamics, PINNs efficiently model stress distributions, nonlinear wave propagation, and turbulence.
  • PINNs are a powerful tool for biomedical simulation, accurately reproducing cardiac dynamics and tumour development from sparse observations.
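The core PINN recipe, penalizing the PDE residual computed via automatic differentiation, can be sketched for a one-dimensional Poisson problem. Everything below (network size, optimizer, collocation sampling) is an illustrative choice, not a prescription from the works cited here:

```python
# Minimal PINN sketch: solve u''(x) = -pi^2 sin(pi x) on [0, 1] with
# u(0) = u(1) = 0, whose exact solution is u(x) = sin(pi x). The physics
# enters the loss as a penalty on the PDE residual, computed with autograd.
import torch

torch.manual_seed(0)

net = torch.nn.Sequential(
    torch.nn.Linear(1, 32), torch.nn.Tanh(),
    torch.nn.Linear(32, 32), torch.nn.Tanh(),
    torch.nn.Linear(32, 1),
)
opt = torch.optim.Adam(net.parameters(), lr=1e-3)

def pde_residual(x):
    x = x.requires_grad_(True)
    u = net(x)
    du = torch.autograd.grad(u.sum(), x, create_graph=True)[0]
    d2u = torch.autograd.grad(du.sum(), x, create_graph=True)[0]
    f = -torch.pi**2 * torch.sin(torch.pi * x)   # source term
    return d2u - f                                # should vanish everywhere

bc = torch.tensor([[0.0], [1.0]])                 # boundary points
for step in range(2000):
    opt.zero_grad()
    x = torch.rand(64, 1)                         # random collocation points
    loss = pde_residual(x).pow(2).mean() + net(bc).pow(2).mean()
    loss.backward()
    opt.step()
```

No solution data is used at all: the residual and boundary penalties alone drive the network toward the solution, which is exactly the sample-efficiency argument made above.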

Latest Developments (2024–2025):

  • Error analysis for PINNs is now far more rigorous, enabling better training methods.
  • Physics-informed PointNet extends PINN solutions to irregular geometries without per-geometry retraining.
  • Next-generation PINNs use multimodal architectures that combine data-driven and physics-guided components to cope with partial observability.

Neural Operators: Learning Physics Across Infinite Domains

Classical machine-learning models are tied to a single discretization and struggle when the equation parameters or boundary conditions change. Fourier neural operators (FNOs) instead learn mappings between function spaces, so one trained model can cover an entire family of problems.

  • FNOs outperform CNNs at weather forecasting because they capture the nonlinear dynamics of the atmosphere and oceans.
  • Ensemble and multiscale operator techniques have addressed earlier limitations, improving the accuracy of high-frequency forecasts.
  • Multigrid neural operators now play a leading role in global weather prediction.
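The heart of an FNO is a spectral convolution: transform the input to Fourier space, multiply a fixed number of low-frequency modes by learned complex weights, and transform back. A minimal 1D sketch in PyTorch; the channel and mode counts are illustrative:

```python
import torch

class SpectralConv1d(torch.nn.Module):
    """Multiply the lowest Fourier modes of the input by learned complex weights."""
    def __init__(self, channels, modes):
        super().__init__()
        self.modes = modes
        self.weight = torch.nn.Parameter(
            torch.randn(channels, channels, modes, dtype=torch.cfloat) / channels)

    def forward(self, x):                        # x: (batch, channels, grid)
        x_ft = torch.fft.rfft(x)                 # to Fourier space
        out_ft = torch.zeros_like(x_ft)
        out_ft[:, :, :self.modes] = torch.einsum(
            "bim,iom->bom", x_ft[:, :, :self.modes], self.weight)
        return torch.fft.irfft(out_ft, n=x.size(-1))  # back to grid space
```

Because the weights act on Fourier modes rather than grid points, the same trained layer can be evaluated on a coarser or finer grid. This discretization invariance is what lets neural operators learn the mapping itself rather than one fixed-resolution instance of it.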

Differentiable Simulation: A Backbone Linking Data and Physics

Differentiable simulators let gradients flow through physical predictions, enabling end-to-end optimization alongside learning.

  • Differentiable tactile and contact physics lets agents learn through contact-rich manipulation in soft-body and rigid-body scenarios.
  • In neuroscience, differentiable simulation brings large-scale gradient-based optimization to detailed biophysical models of neural dynamics.
  • Genesis, a new generation of physics engine, simulates at unprecedented speed and scale for both robotics and education.

Recent work identifies several principal approaches to differentiable contact: LCP-based, convex-optimization-based, compliant, and position-based dynamics models.
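The end-to-end optimization these simulators enable can be shown with a toy differentiable integrator: every simulation step is a tensor operation, so gradient descent can tune an initial condition to hit a target. This hypothetical example is far simpler than the contact models above, but the mechanism is the same:

```python
import torch

# A 20-step explicit-Euler simulator for a ball under gravity; because every
# step is a differentiable torch op, gradients flow from the final position
# back to the initial velocity.
def simulate(v0, dt=0.1, steps=20, g=9.81):
    pos, vel = torch.tensor(0.0), v0
    for _ in range(steps):
        pos = pos + vel * dt
        vel = vel - g * dt
    return pos

target = torch.tensor(5.0)            # desired final height
v0 = torch.tensor(0.0, requires_grad=True)
opt = torch.optim.SGD([v0], lr=0.05)
for _ in range(200):
    opt.zero_grad()
    loss = (simulate(v0) - target) ** 2
    loss.backward()                   # gradient through the whole rollout
    opt.step()
```

In a real engine the rollout involves contacts and constraints, which is why the choice among LCP-based, compliant, and position-based formulations matters: each yields differently behaved gradients.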

The Best of Both Worlds with Hybrid Physics and ML Models

  • Hybrid neural-physical models for tropical cyclone forecasting combine explicit physics code with data-driven components, pushing forecast horizons well beyond what was previously possible.
  • In manufacturing and engineering, hybrids combine empirical and physical constraints, overcoming the fragility of models built solely on first principles or on black-box data.
  • In climate science, hybrid methods enable physically plausible downscaling and uncertainty-aware forecasts.
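A common hybrid pattern is residual learning: keep the first-principles model and train a small network only on what the physics misses. A toy sketch with synthetic data; both force laws below are invented purely for illustration:

```python
import torch

torch.manual_seed(0)

# "True" process: an idealized drag law plus an unmodelled term the physics misses.
def true_force(v):
    return 0.5 * v**2 + 0.3 * torch.sin(3 * v)

def physics_force(v):                 # known first-principles part
    return 0.5 * v**2

residual = torch.nn.Sequential(       # the NN learns only what physics misses
    torch.nn.Linear(1, 16), torch.nn.Tanh(), torch.nn.Linear(16, 1))
opt = torch.optim.Adam(residual.parameters(), lr=1e-2)

v = torch.linspace(-2, 2, 200).unsqueeze(1)
y = true_force(v)
for _ in range(1000):
    opt.zero_grad()
    pred = physics_force(v) + residual(v)   # hybrid prediction
    loss = (pred - y).pow(2).mean()
    loss.backward()
    opt.step()
```

Because the network only has to fit a small correction, it needs far less capacity and data than a black-box model of the whole process, and predictions degrade gracefully toward the physics baseline outside the training range.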

Research Frontiers and Challenges

  1. Scalability: Training physics-constrained models at scale remains challenging, even as meshless operators improve and simulation speeds increase.
  2. Partial Observability & Noise: Handling noisy, partial data is still an open problem; hybrid and multimodal models are being developed to tackle it.
  3. Integrating Foundation Models: Researchers are working out how to equip general-purpose AI models with explicit physical priors.
  4. Verification & Validation: Ensuring that models respect the physical laws of each regime is a hard technical challenge.
  5. Automated Law Discovery: PINN-inspired approaches are making data-driven discovery of governing laws increasingly practical.

The Future: Moving Toward a Physics-First AI Paradigm

If AI is to extrapolate, reason, and perhaps discover new scientific laws, models must shift from purely data-driven toward hybrid and physics-based approaches. Some promising directions are:

  • Neural-symbolic learning that integrates deep networks with physical understanding.
  • Mechanism-aware AI that makes confident decisions in real time, with applications in robotics, digital twins, and beyond.
  • Machine learning combined with advanced causal analysis for automated scientific discovery.

This breakthrough depends on close collaboration among machine-learning researchers, domain experts, and physicists. Rapid advances in this area are combining domain knowledge with computation and data, promising a new generation of AI capabilities that will benefit science and society.


References

  • Physics-Informed Neural Networks: A Deep Learning Framework for Solving Forward and Inverse Problems Involving Nonlinear Partial Differential Equations, Raissi et al. (2019)
  • Lagrangian Neural Networks, Cranmer et al. (2020)
  • Hamiltonian Neural Networks, Greydanus et al. (2019)
  • Fourier Neural Operator for Parametric Partial Differential Equations, Li et al. (2021)
  • Neural Operator: Learning Maps Between Function Spaces, Kovachki et al. (2021)
  • Scientific Machine Learning Through Physics–Informed Neural Networks: Where We Are and What’s Next, Cuomo et al. (2022)
  • Numerical Analysis of Physics-Informed Neural Networks and Related Models in Physics-Informed Machine Learning, De Ryck et al. (2024)
  • Physics-Informed Neural Networks and Extensions, Raissi et al. (2024)
  • Spherical Multigrid Neural Operator for Improving Autoregressive Global Weather Forecasting, Hu et al. (2025)
  • Applications of the Fourier Neural Operator in a Regional Ocean Modeling and Prediction, Choi et al. (2024)
  • Physics‐Informed Neural Networks for the Augmented System of Shallow Water Equations with Topography, Dazzi et al. (2024)
  • DiffTaichi: Differentiable Programming for Physical Simulation, Hu et al. (2020)
  • DIFFTACTILE: A Physics-Based Differentiable Tactile Simulator for Contact-Rich Robotic Manipulation, Si et al. (2024)
  • A Review of Differentiable Simulators, Newbury et al. (2024)
  • Differentiable Physics Simulations with Contacts: Do They Have Correct Gradients w.r.t. Position, Velocity and Control?, Zhong et al. (2022)
  • A Hybrid Machine Learning/Physics‐Based Modeling Framework for 2‐Week Extended Prediction of Tropical Cyclones, Liu et al. (2024)
  • Jaxley: Differentiable Simulation Enables Large-Scale Training of Detailed Biophysical Models of Neural Dynamics, Deistler et al. (2024)
  • Revolutionizing Physics: A Comprehensive Survey of Machine Learning Applications, Suresh et al. (2024)
  • A Library for Learning Neural Operators, Kossaifi et al. (2024); GitHub
  • Genesis: Universal Physics Platform for Robotics and Embodied AI, Genesis Embodied AI Team (2024)
  • Enforcing Analytic Constraints in Neural Networks Emulating Physical Systems, Beucler et al. (2021)


