Associate Professor, Pennsylvania State University, Pennsylvania, United States
Automatic differentiation (AD) and differentiable programming offer a promising approach to integrating physics-based and data-driven modeling in computational hydrodynamics. Hydrograd is a prime example: a differentiable solver for the two-dimensional (2D) shallow water equations (SWEs) designed for scientific machine learning. It shares similarities with other 2D SWE solvers such as SRH-2D, HEC-RAS, and AdH, but its key distinction is that it is automatically differentiable with respect to any model parameter, which makes it seamlessly integrable with artificial intelligence (AI) and machine learning (ML) techniques. Built on Julia and the SciML ecosystem, Hydrograd delivers accurate solutions for forward simulation, sensitivity analysis, parameter inversion, and physics discovery through Universal Differential Equations (UDEs). In the UDE framework, selected components of the governing partial differential equations (PDEs), such as Manning's n, the flow resistance term, or the riverbed elevation, are replaced with a neural network. Hydrograd solves the 2D SWEs on unstructured meshes similar to those used by SRH-2D and HEC-RAS 2D, making it well suited for real-world applications. This talk will showcase Hydrograd's capabilities through examples including sensitivity analysis of Manning's roughness coefficient, parameter inversion, and the discovery of flow resistance physics using UDEs.
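To make the UDE idea concrete, consider the Manning-type friction slope that appears in the depth-averaged momentum equations, S_f = n^2 u |u| / h^(4/3), where h is the water depth and u the velocity. In a UDE, this closure can be swapped for a small neural network whose parameters are trained jointly with the differentiable solver. The Julia sketch below, written with Lux from the SciML ecosystem, is a minimal conceptual illustration only; the names resistance_nn and friction_slope are hypothetical and do not reflect Hydrograd's actual API.

```julia
# Conceptual sketch only (not Hydrograd's API): a small neural network stands
# in for the Manning-type flow resistance closure S_f = n^2 * u * |u| / h^(4/3)
# inside a differentiable shallow water solver.
using Lux, Random

rng = Random.default_rng()

# Network mapping (depth, speed) to a scalar resistance coefficient that
# replaces n^2 / h^(4/3) in the friction slope.
resistance_nn = Chain(Dense(2 => 16, tanh), Dense(16 => 1))
ps, st = Lux.setup(rng, resistance_nn)

# Hypothetical closure: returns the friction slope used in the momentum
# equations, with the learned coefficient in place of the Manning formula.
function friction_slope(h, u, ps, st)
    x = Float32[h, abs(u)]                 # network input: depth and speed
    coeff, _ = resistance_nn(x, ps, st)    # learned resistance coefficient
    return coeff[1] * u * abs(u)           # keep the sign of the velocity
end

# Example call with a depth of 1.2 m and a velocity of 0.8 m/s.
friction_slope(1.2, 0.8, ps, st)
```

In a setting like this, gradients of a misfit between simulated and observed quantities can propagate through the solver to the network parameters, which is what allows a flow resistance closure to be learned from data rather than prescribed.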