The final module of the ISEPS programme bridges the gap between semiconductor engineering and modern computational intelligence. As device complexity and design-space dimensionality grow beyond what traditional simulation can efficiently explore, machine learning and AI-driven optimisation are rapidly becoming indispensable tools in the engineer's toolkit.
The module opens with a practical grounding in Python for scientific and engineering workflows: NumPy and SciPy for numerical modelling, Matplotlib for visualisation, and Pandas for data handling. You will then progress to machine learning fundamentals — regression, classification, model validation — before tackling the neural network architectures most relevant to device modelling: fully connected networks, convolutional networks for image-like data, and graph neural networks for structured physical systems.
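To give a flavour of the workflow described above, here is a minimal sketch (not course material) of fitting and validating a simple regression model with NumPy. The "device" data is synthetic, a quadratic response with added noise; the split-and-validate pattern is the point, not the physics.

```python
import numpy as np

rng = np.random.default_rng(0)

# Synthetic measurements: y = 2x^2 - 3x + 1 plus Gaussian noise
x = np.linspace(-2, 2, 80)
y = 2 * x**2 - 3 * x + 1 + rng.normal(scale=0.2, size=x.size)

# Hold out every fourth point for validation
train = np.ones(x.size, dtype=bool)
train[::4] = False

# Least-squares fit of a degree-2 polynomial on the training split
coeffs = np.polyfit(x[train], y[train], deg=2)
model = np.poly1d(coeffs)

# Score the model on the held-out points with root-mean-square error
rmse = np.sqrt(np.mean((model(x[~train]) - y[~train]) ** 2))
```

The same fit/hold-out/score loop generalises directly to the classification and neural-network models covered later in the module; only the model and the metric change.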
Surrogate modelling is a central theme: you will learn to replace expensive TCAD or FDTD simulations with fast, trained approximators, and to use these surrogates in gradient-based and evolutionary optimisation loops. The module concludes with case studies drawn directly from semiconductor and photonics research — demonstrating how AI is already accelerating innovation in device design, process optimisation, and yield improvement.
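The surrogate-in-the-loop idea can be sketched in a few lines of SciPy. In this hypothetical example, an analytic function stands in for an expensive TCAD or FDTD run: the design space is sampled sparsely, a radial-basis-function surrogate is trained on those samples, and a gradient-based optimiser then searches the cheap surrogate instead of the simulator. The function, sample counts, and optimiser choice here are illustrative assumptions, not the module's prescribed recipe.

```python
import numpy as np
from scipy.interpolate import RBFInterpolator
from scipy.optimize import minimize

def expensive_simulation(x):
    # Stand-in for a costly simulator call; true optimum at x = (1, -0.5)
    return (x[0] - 1.0) ** 2 + 2.0 * (x[1] + 0.5) ** 2

# Sample the design space sparsely (each sample = one "simulation")
rng = np.random.default_rng(1)
samples = rng.uniform(-2, 2, size=(60, 2))
values = np.array([expensive_simulation(s) for s in samples])

# Train the surrogate once; afterwards it can be queried essentially for free
surrogate = RBFInterpolator(samples, values)

# Gradient-based search on the surrogate (finite-difference gradients)
result = minimize(lambda x: surrogate(x[None, :])[0], x0=np.zeros(2))
```

Because each surrogate query costs microseconds rather than hours, the same trained model can also drive evolutionary searches or sweep thousands of candidate designs before any further simulator runs are spent.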
Hour breakdown for this module across online and onsite delivery formats.
20 places available · Hybrid format · Warsaw University of Technology