Venturelli, Francesco Aldo (2023) Quantum neural networks for data-efficient image classification. [Laurea magistrale], Università di Bologna, Corso di Studio in Physics [LM-DM270]
Full-text documents available:
PDF document (Thesis)
Available under licence: unless broader permissions have been granted by the author, the thesis may be freely consulted, and a copy may be saved and printed strictly for personal purposes of study, research, and teaching; any direct or indirect commercial use is expressly forbidden. All other rights to the material are reserved.
Download (1MB)
Abstract
In our constantly evolving world, an overwhelming influx of data permeates every moment, by the day, the hour, even the second. We communicate and share links, images, and opinions, leaving behind a trail of traces that reflect not only the vastness of our natural surroundings but also our thoughts, preferences, and sentiments. Recognizing the significance of these data, the field of Data Science has emerged, dedicated to unveiling the insights concealed within them.
Machine Learning (ML) has become a captivating realm of research, gaining prominence for its capacity to extract knowledge from extensive datasets, and it has played a pivotal role in bridging the gap between our understanding of nature and its intricacies. Deep Learning (DL), and in particular Neural Networks (NNs), has revolutionized classical ML, providing non-linear structures for modeling statistical data. NNs, and notably Convolutional Neural Networks (CNNs), which are inspired by the structure of the visual cortex, model intricate relationships between inputs and outputs and excel at tasks such as image-based pattern recognition.
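As a concrete, simplified illustration of the convolutional models mentioned above, the sketch below defines a small image classifier. It assumes PyTorch, and the layer sizes, the 28x28 grayscale input, and the class count are illustrative choices rather than the architecture used in the thesis.

import torch
import torch.nn as nn

class SmallCNN(nn.Module):
    """Minimal convolutional classifier for 28x28 single-channel images (illustrative only)."""
    def __init__(self, n_classes: int = 10):
        super().__init__()
        self.features = nn.Sequential(
            nn.Conv2d(1, 8, kernel_size=3, padding=1),   # learn local edge/texture filters
            nn.ReLU(),
            nn.MaxPool2d(2),                             # 28x28 -> 14x14
            nn.Conv2d(8, 16, kernel_size=3, padding=1),
            nn.ReLU(),
            nn.MaxPool2d(2),                             # 14x14 -> 7x7
        )
        self.classifier = nn.Linear(16 * 7 * 7, n_classes)

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        x = self.features(x)
        return self.classifier(torch.flatten(x, 1))

# Forward pass on a batch of four random 28x28 grayscale images.
logits = SmallCNN()(torch.randn(4, 1, 28, 28))   # shape: (4, 10)

The convolution-pooling stack extracts translation-tolerant local features, and a single linear layer maps them to class scores; this is the pattern the quantum models discussed below are compared against.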
While NNs, especially multilayered ones, have demonstrated remarkable power, their trainability posed challenges. The advent of back-propagation mitigated this issue, but training difficulties persisted, motivating solutions such as rectifier (ReLU) activation functions and layer-wise training. Quantum Machine Learning (QML) has opened new avenues, leveraging noisy intermediate-scale quantum (NISQ) computers for computational problems involving quantum data. Variational quantum algorithms (VQAs) and quantum neural networks (QNNs) offer promising applications, using a classical optimizer to train the parameters of a quantum circuit.
QNNs present a distinctive potential advantage over classical models: they can analyze quantum systems with resources that scale polynomially in the system size, where a classical ML treatment would scale exponentially, providing a computational edge. This thesis investigates these capabilities in the context of data-efficient image classification.
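To make the hybrid training loop described above concrete, the following is a minimal sketch of a variational quantum classifier. It assumes the PennyLane library; the angle embedding, the two entangling layers, the toy four-feature inputs, and the optimizer settings are illustrative assumptions rather than the model studied in the thesis.

import pennylane as qml
from pennylane import numpy as np

n_qubits = 4
dev = qml.device("default.qubit", wires=n_qubits)

@qml.qnode(dev)
def circuit(weights, x):
    # Encode a 4-feature input (e.g. a heavily downscaled image) into rotation angles.
    qml.AngleEmbedding(x, wires=range(n_qubits))
    # Trainable entangling layers act as the "quantum neural network".
    qml.StronglyEntanglingLayers(weights, wires=range(n_qubits))
    # A single expectation value in [-1, 1] serves as the binary classification score.
    return qml.expval(qml.PauliZ(0))

def cost(weights, X, y):
    # Mean squared error between circuit outputs and labels in {-1, +1}.
    loss = 0.0
    for x, target in zip(X, y):
        loss = loss + (circuit(weights, x) - target) ** 2
    return loss / len(X)

# Toy data: two 4-dimensional samples with labels -1 and +1 (placeholders only).
X = np.array([[0.1, 0.2, 0.3, 0.4], [0.9, 0.8, 0.7, 0.6]], requires_grad=False)
y = np.array([-1.0, 1.0], requires_grad=False)

# Classical gradient descent updates the quantum circuit parameters.
shape = qml.StronglyEntanglingLayers.shape(n_layers=2, n_wires=n_qubits)
weights = 0.1 * np.random.random(size=shape)
opt = qml.GradientDescentOptimizer(stepsize=0.2)
for _ in range(25):
    weights = opt.step(lambda w: cost(w, X, y), weights)

In such a setup, the parameterized circuit plays the role of the network's forward pass, while gradient evaluation and parameter updates remain entirely classical.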
Document type
Degree thesis (Laurea magistrale)
Thesis author
Venturelli, Francesco Aldo
Thesis supervisor
Thesis co-supervisor
School
Degree programme
Physics [LM-DM270]
Curriculum
THEORETICAL PHYSICS
Degree programme regulations
DM270
Keywords
Quantum computing, Quantum Machine Learning, Machine Learning, Neural Networks, Quantum Circuits, Quantum Neural Networks, Quantum Mechanics, Image classification
Thesis defence date
15 December 2023
URI