Convolutional neural networks for the identification of filaments from fast visual imaging cameras in tokamak reactors
Cannas, Barbara; Carcangiu, Sara; Fanni, Alessandra; Montisci, Augusto; Pisano, Fabio; Sias, Giuliana
2019-01-01
Abstract
The paper proposes a region-based deep learning convolutional neural network for object detection in images, able to identify the filamentary plasma structures that arise in the boundary region of the plasma in toroidal nuclear fusion reactors. The images required to train and test the neural model have been synthetically generated from statistical distributions that reproduce the statistical properties, in terms of position and intensity, of experimental filaments. The recently proposed Faster Region-based Convolutional Network (Faster R-CNN) algorithm has been customized to the problem of identifying the filaments in both location and size, together with an associated confidence score. The results demonstrate the suitability of the deep learning approach for filament detection.
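To illustrate the kind of pipeline the abstract describes, the sketch below shows a generic way to (a) synthesize a toy image of elongated, blob-like "filaments" drawn from simple placeholder distributions of position and intensity, and (b) adapt an off-the-shelf Faster R-CNN from torchvision to a single "filament" class, so that each detection returns a bounding box (location and size) with a confidence score. This is not the authors' implementation: the generator `synthetic_filament_image`, its distributions, and the choice of the torchvision backbone are assumptions made for illustration only.

```python
# Hypothetical sketch (not the authors' code): toy synthetic filament images
# plus a single-class Faster R-CNN customization, assuming torchvision >= 0.13.
import numpy as np
import torch
import torchvision
from torchvision.models.detection.faster_rcnn import FastRCNNPredictor

def synthetic_filament_image(h=256, w=256, n_filaments=5, rng=None):
    """Toy generator: elongated Gaussian blobs with random position and
    intensity. The distributions here are placeholders, not the fitted
    statistics of experimental filaments used in the paper."""
    rng = rng or np.random.default_rng()
    img = np.zeros((h, w), dtype=np.float32)
    boxes = []
    yy, xx = np.mgrid[0:h, 0:w]
    for _ in range(n_filaments):
        cy, cx = rng.uniform(0, h), rng.uniform(0, w)   # random position
        amp = rng.uniform(0.3, 1.0)                     # random peak intensity
        sy, sx = rng.uniform(2, 5), rng.uniform(8, 20)  # elongated shape
        img += amp * np.exp(-((yy - cy) ** 2 / (2 * sy ** 2)
                              + (xx - cx) ** 2 / (2 * sx ** 2)))
        boxes.append([cx - 2 * sx, cy - 2 * sy, cx + 2 * sx, cy + 2 * sy])
    return img, np.array(boxes, dtype=np.float32)

# Two classes: background + "filament". Replace the COCO box predictor so the
# pretrained detector can be fine-tuned on the synthetic filament dataset.
model = torchvision.models.detection.fasterrcnn_resnet50_fpn(weights="DEFAULT")
in_features = model.roi_heads.box_predictor.cls_score.in_features
model.roi_heads.box_predictor = FastRCNNPredictor(in_features, num_classes=2)

# Inference on one synthetic frame: each detection is a bounding box
# (location and size) with an associated score, as in the abstract.
model.eval()
img, _ = synthetic_filament_image()
x = torch.from_numpy(img).unsqueeze(0).repeat(3, 1, 1)  # 3-channel tensor
with torch.no_grad():
    pred = model([x])[0]
print(pred["boxes"], pred["scores"])
```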
File: 464090_1_En_15_Chapter_Author_uncorrected.pdf (Adobe PDF, 537.62 kB)
Type: post-print version (AAM)
Access: archive managers only (copy available on request)
Documents in IRIS are protected by copyright and all rights are reserved, unless otherwise indicated.