TECHNOLOGY FOCUS

As network science evolves, AI plays an increasingly visible role in designing, deploying, and operating complex networks. In this area, researchers are also exploring how results from neuroscience and studies of brain function can improve the efficiency of artificial neural networks. In practical implementations, this work is often combined with advanced technologies based on quantum computing.

To present these efforts in an integrated form, the course starts with an introduction to quantum (q-) computing. Then, before discussing the artificial quantum neuron, we describe some physical processes in the brain and examine in depth the interdisciplinary research spanning neuroscience, network science, and dynamic systems, with emphasis on the emergence of brain‐inspired intelligence.

A practical way to replicate brain intelligence is to reconstruct cortical networks together with the dynamic activities that underpin brain functions, rather than relying on artificial computing networks alone. The course therefore adopts a complex-network and spatiotemporal-dynamics (network dynamics) perspective for understanding the brain and cortical networks, and develops integrated approaches from neuroscience and network dynamics toward building brain‐inspired intelligence with learning and resilience functions. This requires covering the fundamental concepts and principles of complex networks, neuroscience, and hybrid dynamic systems, as well as relevant studies of the brain and intelligence. Other promising research directions, such as brain science, data science, quantum information science, and machine behavior, are also briefly discussed with a view to future applications.


COURSE CONTENT


WHO SHOULD ATTEND

Participants with a background in quantum physics, in network planning, design, deployment, and control, or in network/internet economics should benefit from attending. This includes researchers, students, and professors in academia and industry, as well as network operators, regulators, and managers in this field.

Monday

1 INTRODUCTION
Qubit
Entanglement
Quantum Gates and Quantum Computing
Quantum Teleportation and Quantum Information Theory
Quantum Algorithms
Quantum Parallelism
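For orientation, a compact worked example of the notation the introduction builds on (the standard qubit superposition and the Bell state produced by a Hadamard followed by a CNOT; the symbols are the usual textbook ones, not notation specific to this course):

\[
|\psi\rangle = \alpha|0\rangle + \beta|1\rangle, \qquad |\alpha|^2 + |\beta|^2 = 1,
\]
\[
\mathrm{CNOT}\,(H\otimes I)\,|00\rangle
  = \mathrm{CNOT}\,\tfrac{1}{\sqrt{2}}\bigl(|0\rangle + |1\rangle\bigr)|0\rangle
  = \tfrac{1}{\sqrt{2}}\bigl(|00\rangle + |11\rangle\bigr).
\]

The resulting Bell pair is maximally entangled and is the shared resource consumed in quantum teleportation.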

2 NEUROSCIENCE AND AI
2.1 Preliminaries
2.2 Neuroscience and Network Dynamics
2.3 Electrophysiological Connectivity Patterns in Cortex
2.4 Spiking Neuron Models
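As a pointer to what Section 2.4 covers, the sketch below implements a minimal leaky integrate-and-fire neuron (forward-Euler integration; the parameter values are illustrative assumptions, not figures taken from the course material):

import numpy as np

def lif_simulate(current, dt=1e-3, tau_m=20e-3, v_rest=-70e-3,
                 v_reset=-75e-3, v_thresh=-55e-3, r_m=10e6):
    """Leaky integrate-and-fire neuron driven by an input current array (amperes).

    Forward-Euler integration of tau_m * dV/dt = -(V - v_rest) + r_m * I,
    with a hard reset to v_reset whenever V crosses v_thresh.
    Returns the membrane-potential trace and a list of spike times (seconds).
    """
    v = v_rest
    trace, spikes = [], []
    for step, i_in in enumerate(current):
        v += (-(v - v_rest) + r_m * i_in) * dt / tau_m
        if v >= v_thresh:            # threshold crossing: emit a spike
            spikes.append(step * dt)
            v = v_reset              # hard reset after the spike
        trace.append(v)
    return np.array(trace), spikes

# Example: 200 ms of constant 2 nA input produces a regular spike train.
trace, spikes = lif_simulate(np.full(200, 2e-9))
print(f"{len(spikes)} spikes, first at {spikes[0] * 1e3:.0f} ms" if spikes else "no spikes")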

3 SPIKING NEURON TIMING
3.1 Models of synaptic plasticity based on spike timing
3.2 Reinforcement Learning and STDP
3.3 The brain as an adaptive learner
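Section 3.1 builds on the pair-based STDP rule; in its common textbook form (the amplitudes A_± and time constants τ_± are free model parameters, not values fixed by the course), the weight change for a pre/post spike pair separated by Δt = t_post − t_pre is:

\[
\Delta w =
\begin{cases}
A_+ \, e^{-\Delta t / \tau_+}, & \Delta t > 0 \quad \text{(pre before post: potentiation)}\\[2pt]
-A_- \, e^{\Delta t / \tau_-}, & \Delta t < 0 \quad \text{(post before pre: depression)}
\end{cases}
\]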

4 SPIKING NEURAL NETWORKS
4.1 Artificial Neural Networks
4.2 Spiking neurons and synaptic plasticity
4.3 The Neural Code
4.4 Training Spiking Neural Networks
4.5 Online Learning
4.6 Further Research in SNN
Ex. 1: From Artificial to Spiking Neural Networks
Ex. 2: Spike Encoding
Ex. 3: Training Spiking Neural Networks

Tuesday

5 DEEP LEARNING AND NEUROSCIENCE
5.1 Preliminaries
5.2 Can the brain optimize cost functions?
5.3 Analytical Models for Credit Assignment in Neural Networks
5.4 Derivation of the Key Theorems
5.4.1 Stability analysis with instantaneous system dynamics

6 NEUROSCIENCE AND NETWORK SYNCHRONIZATION
6.1 Synchronization of neural networks with stochastic perturbation
6.2 Stability of spiking NN synchronization under stochastic perturbations
6.3 Feedback Control of NN Synchronization
6.4 Exponential Synchronization of NNs Under Time‐Varying Sampling
6.5 Synchronizing cortical oscillations in the human brain
6.6 Complex Network Synchronization
Ex. A1: Large Deviation Principle
Ex. A2: Derivation of
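As a lightweight illustration of the complex-network synchronization theme closing this chapter (item 6.6), here is a minimal Kuramoto phase-oscillator sketch on an arbitrary coupling graph; it is an assumed toy model for intuition, not the spiking, stochastically perturbed networks analyzed in Sections 6.1 to 6.4:

import numpy as np

def kuramoto_order_parameter(adj, omega, k=1.0, dt=0.01, steps=5000, seed=0):
    """Simulate Kuramoto phase oscillators coupled through adjacency matrix `adj`
    and return the final order parameter r in [0, 1] (r -> 1 means synchrony)."""
    rng = np.random.default_rng(seed)
    n = len(omega)
    theta = rng.uniform(0, 2 * np.pi, n)          # random initial phases
    for _ in range(steps):
        # d(theta_i)/dt = omega_i + (k/n) * sum_j A_ij * sin(theta_j - theta_i)
        coupling = (adj * np.sin(theta[None, :] - theta[:, None])).sum(axis=1)
        theta += dt * (omega + k / n * coupling)
    return np.abs(np.exp(1j * theta).mean())

# Example: 50 all-to-all oscillators with narrowly spread natural frequencies
# synchronize strongly for coupling well above the critical value.
n = 50
adj = np.ones((n, n)) - np.eye(n)
omega = np.random.default_rng(1).normal(0.0, 0.1, n)
print(f"order parameter r = {kuramoto_order_parameter(adj, omega, k=2.0):.2f}")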

7 ARTIFICIAL QUANTUM NEURON
7.1 Modeling Quantum Perceptron
7.2 Quantum Perceptron Model Complexity
7.3 Hybrid Quantum‐Classical Perceptron Algorithm
7.4 Quantum Activation Functions for QNN
7.5 Quantum Neuron
7.5.1 Feedforward neural network
7.5.2 Hopfield network
Ex. 7.1: Perceptron Algorithm (Primal Form)
Ex. 7.2: Proofs of Theorem 10.2, Lemma 10.1 and Lemma 10.2
Ex. 7.3: Convergence analysis of the nonlinear map
Ex. 7.4: Runtime analysis of RUS circuits
Ex. 7.5: Weights and bias setting
Ex. 7.6: General property of quantum neuron
Ex. 7.7: Feedforward networks of quantum neurons
Ex. 7.8: Hopfield networks of quantum neurons
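Ex. 7.1 revisits the classical perceptron in primal form before its quantum counterparts are introduced; a minimal sketch of that classical baseline (standard mistake-driven updates on an illustrative toy dataset) is:

import numpy as np

def perceptron_primal(x, y, epochs=100, eta=1.0):
    """Primal-form perceptron: on each mistake, update w <- w + eta*y_i*x_i and
    b <- b + eta*y_i. `x` is an (n, d) array, `y` holds labels in {-1, +1}."""
    w = np.zeros(x.shape[1])
    b = 0.0
    for _ in range(epochs):
        mistakes = 0
        for xi, yi in zip(x, y):
            if yi * (w @ xi + b) <= 0:       # misclassified (or on the boundary)
                w += eta * yi * xi
                b += eta * yi
                mistakes += 1
        if mistakes == 0:                    # converged: data linearly separated
            break
    return w, b

# Example: a linearly separable toy set (AND-like labels on {0,1}^2 inputs).
x = np.array([[0, 0], [0, 1], [1, 0], [1, 1]], dtype=float)
y = np.array([-1, -1, -1, 1])
w, b = perceptron_primal(x, y)
print("predictions:", np.sign(x @ w + b))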

8 QUANTUM NEURAL NETWORKS
8.1 Network Architecture
8.2 Performance Limits of QNN
8.3 Continuous-variable QNN
8.3.1 Preliminaries: The CV model
8.3.2 Continuous‐Variable QNN
8.3.3 Embedding classical neural networks
8.3.4 Convolutional, Recurrent and Residual CV QNN

Wednesday

9 QUANTUM MACHINE LEARNING
9.1 Methods of machine learning
9.2 Learning Theory
9.3 Machine Learning in Quantum Physics
9.4 Group‐theoretic approach to QML
DESIGN EXAMPLE: Ancilla‐based models for the purity dataset
DESIGN EXAMPLE: Concentration results for time-reversal datasets
DESIGN EXAMPLE: Quantum Graph Convolutional Neural Networks

10 TENSOR NETWORKS FOR QML
10.1 Tensor Networks
10.2 Tensor-network Based Machine Learning
10.3 ML by Quantum Tensor Network
Ex. 1: Tensors and Tensor Products
Ex. 2: Training Algorithm
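To accompany Ex. 1 (Tensors and Tensor Products), the short numpy sketch below shows the two primitives tensor-network methods are built from, the outer (tensor) product and index contraction; the example arrays are arbitrary and not tied to a particular network layout from the course:

import numpy as np

rng = np.random.default_rng(0)
a = rng.standard_normal((2, 3))      # order-2 tensor (a matrix)
b = rng.standard_normal((3, 4))

# Tensor (outer) product: T[i, j, k, l] = a[i, j] * b[k, l] -> an order-4 tensor.
t = np.einsum('ij,kl->ijkl', a, b)
print(t.shape)                       # (2, 3, 3, 4)

# Contracting the shared index reduces the order by two; for two matrices this
# contraction is exactly the matrix product, which a tensor-network diagram
# draws as two nodes joined by a single edge.
c = np.einsum('ij,jk->ik', a, b)
print(np.allclose(c, a @ b))         # True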

11 QUANTUM SIMULATIONS AND MACHINE LEARNING
11.1 Preliminaries on QS
11.2 Experimental Results
11.3 Simulation of QML
11.4 A quantum feed‐forward neural network

12 TENSOR NETWORKS FOR COMPLEX SYSTEMS OPTIMIZATION
12.1 Preliminaries
12.2 Low‐Rank Tensor Approximations via Tensor Networks
12.3 Analytical Representation of Tensor Trains
12.4 Large‐Scale Optimization Problems
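As a concrete companion to Section 12.2, here is a minimal tensor-train (TT) decomposition sketch via sequential truncated SVDs; the function name, the fixed maximal rank, and the absence of error control are simplifying assumptions for illustration:

import numpy as np

def tt_svd(tensor, max_rank):
    """Decompose `tensor` into tensor-train cores by sequential truncated SVDs.
    Each core has shape (r_prev, n_k, r_next); illustrative, no error control."""
    dims = tensor.shape
    cores, r_prev = [], 1
    mat = tensor.reshape(r_prev * dims[0], -1)
    for k, n in enumerate(dims[:-1]):
        u, s, vt = np.linalg.svd(mat, full_matrices=False)
        r = min(max_rank, len(s))
        cores.append(u[:, :r].reshape(r_prev, n, r))          # k-th TT core
        mat = (np.diag(s[:r]) @ vt[:r]).reshape(r * dims[k + 1], -1)
        r_prev = r
    cores.append(mat.reshape(r_prev, dims[-1], 1))            # last TT core
    return cores

# Example: an outer-product (TT-rank-1) 4-way tensor, so truncation loses nothing.
x = np.einsum('i,j,k,l->ijkl', *[np.arange(1, 4.0)] * 4)
cores = tt_svd(x, max_rank=2)
print([c.shape for c in cores])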