Probing Convolutional Neural Networks for Event Reconstruction in γ-Ray Astronomy with Cherenkov Telescopes

Tim Lukas Holch, Idan Shilon, Matthias Büchele, Tobias Fischer, Stefan Funk, Nils Groeger, David Jankowsky, Thomas Lohse, Ullrich Schwanke, Philipp Wagner

Research output: Contribution to journal › Conference article › peer-review

1 Scopus citation

Abstract

Dramatic progress has been made in the field of computer vision in recent years by applying deep learning techniques. State-of-the-art performance in image recognition is achieved with Convolutional Neural Networks (CNNs), a powerful class of artificial neural networks characterized by requiring fewer connections and free parameters than traditional neural networks and by exploiting spatial symmetries in the input data. Moreover, CNNs automatically extract general characteristic features from data sets and create abstract data representations that enable very robust predictions. This suggests that experiments using Cherenkov telescopes could harness these powerful machine learning algorithms to improve the analysis of particle-induced air showers, where the properties of the primary shower particles are reconstructed from shower images recorded by the telescopes. In this work, we present initial results of a CNN-based analysis for background rejection and shower reconstruction, utilizing simulation data from the H.E.S.S. experiment. We concentrate on supervised training methods and outline the influence of image sampling on the performance of the CNN model predictions.
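The abstract's point that CNNs require fewer free parameters than traditional (fully connected) networks follows from weight sharing: one small kernel is reused across every image position. A minimal sketch of the parameter-count comparison, with entirely hypothetical layer sizes (the image resolution and unit counts below are illustrative assumptions, not values from the paper):

```python
# Hypothetical parameter-count comparison for one network layer processing
# a sampled Cherenkov camera image. All sizes here are assumptions chosen
# for illustration only.

def dense_params(in_pixels: int, out_units: int) -> int:
    """Weights + biases of a fully connected layer:
    every output unit has one weight per input pixel."""
    return in_pixels * out_units + out_units

def conv_params(kernel: int, in_channels: int, out_channels: int) -> int:
    """Weights + biases of a 2-D convolutional layer:
    each output channel learns a single shared kernel,
    reused at every image position (weight sharing)."""
    return kernel * kernel * in_channels * out_channels + out_channels

# Assume a 48x48-pixel resampled camera image with a single channel.
pixels = 48 * 48
print(dense_params(pixels, 64))   # 64 hidden units, fully connected -> 147520
print(conv_params(3, 1, 64))      # 64 feature maps from shared 3x3 kernels -> 640
```

The convolutional layer here uses over two orders of magnitude fewer parameters than the fully connected one while still producing 64 feature maps, which is the structural advantage the abstract refers to.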

Original language: English
Journal: Proceedings of Science
State: Published - 1 Jan 2017
Externally published: Yes
Event: 35th International Cosmic Ray Conference, ICRC 2017 - Bexco, Busan, Korea, Republic of
Duration: 10 Jul 2017 - 20 Jul 2017

ASJC Scopus subject areas

  • General
