This repository contains a comprehensive analysis of Neural Tangent Kernel (NTK) behavior in both finite and infinite width regimes, with a focus on Physics-Informed Neural Networks (PINNs). This project was completed for the course WI4450: Special Topics in Computational Science and Engineering (2024/2025 Q3–Q4).
This project investigates the scaling properties of Neural Tangent Kernels in finite-width and finite-depth neural networks, inspired by the paper "Finite Depth and Width Corrections to the Neural Tangent Kernel". The research explores:
- NTK scaling behavior with respect to network depth and width
- Activation function effects on NTK properties (ReLU, GELU, Sigmoid)
- Physics-Informed Neural Networks (PINNs) in both infinite and finite width regimes
- Training dynamics and convergence properties
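The finite-width concentration behavior studied here can be illustrated with a minimal NumPy sketch (this is not the repository's JAX implementation; the network, widths, and function names below are illustrative): the empirical NTK of a one-hidden-layer ReLU network fluctuates across random initializations, and the fluctuation shrinks as the width grows.

```python
import numpy as np

def empirical_ntk(xs, W, v):
    """Empirical NTK of a one-hidden-layer ReLU net
    f(x) = v^T relu(W x) / sqrt(m), built from the analytic
    parameter gradients (outer product of per-input Jacobians)."""
    m = W.shape[0]
    pre = np.outer(W[:, 0], xs)            # (m, n) pre-activations w_j * x_i (1-D inputs)
    act = np.maximum(pre, 0.0)             # relu(w_j * x_i)
    mask = (pre > 0).astype(float)         # relu'(w_j * x_i)
    Jv = act / np.sqrt(m)                  # df/dv_j = relu(w_j x) / sqrt(m)
    Jw = (v[:, None] * mask * xs[None, :]) / np.sqrt(m)  # df/dw_j
    return Jv.T @ Jv + Jw.T @ Jw           # (n, n) kernel matrix

rng = np.random.default_rng(0)
xs = np.linspace(-1.0, 1.0, 5)
spreads = {}
for m in (10, 10_000):
    # measure init-to-init fluctuation of the kernel at this width
    kernels = [empirical_ntk(xs, rng.standard_normal((m, 1)),
                             rng.standard_normal(m)) for _ in range(20)]
    spreads[m] = np.std(kernels, axis=0).mean()
    print(f"width {m:>6}: mean init-to-init std of NTK entries = {spreads[m]:.3f}")
```

The narrow network's kernel varies noticeably between initializations, while the wide network's kernel is nearly deterministic, which is the qualitative effect the finite-width corrections quantify.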
The main implementation directory containing all computational experiments and analysis.
Key Components:
- `experiments/` - Jupyter notebooks with all numerical experiments:
  - `infinite_ntk.ipynb` - NTK analysis in the infinite-width regime
  - `finite_width_analysis.ipynb` - NTK behavior in finite-width networks
  - `pinn_infinite.ipynb` - PINN analysis in the infinite-width regime
  - `pinn_finite_width_analysis.ipynb` - PINN performance with finite-width networks
  - `supplementary/full_training_analysis.ipynb` - Comprehensive training dynamics analysis
- `util/` - Utility functions and helper modules
- `requirements.txt` - Python dependencies
- `README.md` - Detailed setup and usage instructions
Quick Start:
```bash
cd code
conda create -n ntk_pinn python=3.10
conda activate ntk_pinn
pip install -r requirements.txt
```

Contains all experimental data, including:
- Pre-computed NTK matrices for different activation functions
- Training data and results
- PINN-specific datasets
- `report.pdf` - Complete project report with findings and analysis
- LaTeX source files for the report
Contains materials used for the final project presentation, including slides and supporting documents.
- `draft.md` - Original project proposal with research objectives
- `feedback.md` - Feedback received on the proposal
- `README.md` - Proposal documentation
The foundational paper that inspired this research project.
- Read the report (`report/report.pdf`) to understand the findings
- Explore the code in `code/experiments/` to see the implementation
- Run experiments by following the setup instructions in `code/README.md`
- Start with `code/experiments/finite_width_analysis.ipynb` for core NTK analysis
- Check `code/experiments/pinn_finite_width_analysis.ipynb` for PINN-specific results
- Review `code/util/` for implementation details
- Check `presentation/` for slides and materials
- Review the main findings in `report/report.pdf`
The project provides insights into:
- NTK scaling laws in practical finite-width settings
- Activation function impact on NTK behavior
- PINN performance across different network architectures
- Training dynamics and convergence properties
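For readers unfamiliar with the PINN setup the experiments use, here is a minimal NumPy sketch of a physics-informed loss for the 1-D Poisson equation u''(x) = f(x) with u(0) = u(1) = 0. It is a hypothetical illustration, not the repository's code: the repo uses JAX autodiff, whereas here the input derivatives of a one-hidden-layer tanh network are written out analytically.

```python
import numpy as np

def pinn_loss(params, xs, f, bc_weight=1.0):
    """Physics-informed loss for u''(x) = f(x), u(0) = u(1) = 0,
    with a one-hidden-layer tanh network u(x) = sum_j v_j tanh(w_j x + b_j)."""
    w, b, v = params
    def u(x):
        return np.tanh(np.outer(x, w) + b) @ v
    def u_xx(x):
        t = np.tanh(np.outer(x, w) + b)
        # d^2/dz^2 tanh(z) = -2 tanh(z) (1 - tanh(z)^2), chain rule gives w_j^2
        return (-2.0 * t * (1.0 - t**2) * w**2) @ v
    residual = u_xx(xs) - f(xs)                 # PDE residual at collocation points
    boundary = u(np.array([0.0, 1.0]))          # soft enforcement of u(0) = u(1) = 0
    return np.mean(residual**2) + bc_weight * np.sum(boundary**2)

rng = np.random.default_rng(1)
m = 32
params = (rng.standard_normal(m), rng.standard_normal(m), rng.standard_normal(m) / m)
xs = np.linspace(0.0, 1.0, 64)
f = lambda x: -np.pi**2 * np.sin(np.pi * x)     # exact solution would be sin(pi x)
loss = pinn_loss(params, xs, f)
```

Training a PINN means minimizing this loss over the parameters; the NTK of the residual map is what governs the convergence behavior analyzed in the notebooks.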
- Python 3.10
- JAX and Flax for neural network implementation
- Neural Tangents for NTK computations
- See `code/requirements.txt` for complete dependencies
- Finite Depth and Width Corrections to the Neural Tangent Kernel - Primary reference paper
- Additional references available in the final report
This is a completed academic project. For questions or discussions about the methodology or results, please refer to the report and code documentation.
Note: This project was completed as part of WI4450: Special Topics in Computational Science and Engineering (2024/2025 Q3–Q4). All code, analysis, and findings are documented for reproducibility and educational purposes.