Physics Informed Neural Networks

Introduction

Neural networks have an extraordinary ability to learn highly non-linear solutions from data. However, the amount of data needed for a neural net to learn a good, generalizable solution can be vast. Modern LLMs, for example, are pre-trained on over 15 trillion tokens. When such vast datasets are out of reach, there are still options. Architectures can bake in known structural properties of the problem – convolutional nets exploit translational invariance, for instance – but this only helps when the constraint is something that can be encoded in the architecture itself. ...
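The translational-invariance point can be made concrete: because a convolution applies one shared kernel at every position, shifting the input simply shifts the output. A minimal sketch of this equivariance property, using a hypothetical 1-D circular convolution in NumPy (names and shapes are illustrative, not from the text):

```python
import numpy as np

def circular_conv1d(x, kernel):
    # Apply the same kernel at every position (weight sharing),
    # with circular padding so shifts wrap around cleanly.
    n, k = len(x), len(kernel)
    return np.array([np.dot(np.roll(x, -i)[:k], kernel) for i in range(n)])

x = np.random.default_rng(0).standard_normal(8)
kernel = np.array([0.5, -1.0, 0.25])

shift = 3
lhs = circular_conv1d(np.roll(x, shift), kernel)  # shift input, then convolve
rhs = np.roll(circular_conv1d(x, kernel), shift)  # convolve, then shift output
print(np.allclose(lhs, rhs))  # → True: both orders agree
```

A fully connected layer has no such guarantee: it would need separate training data for every shifted copy of a pattern, which is exactly the kind of data cost that built-in structure avoids.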

March 28, 2026