
What is On the Horizon for Processing Units in 2020?

The Possibility of Photon-based Tensor Processing Units Mid-2020

Alex Moltzau


What is on the horizon for processing units? You might have heard of the CPU, maybe the GPU or even the TPU. I recently saw an article about the development of a new Tensor Processing Unit (TPU), and it pointed to an interesting possibility: photon-based tensor processing units.

TPU is an acronym for Tensor Processing Unit. According to Wikipedia, the definition is the following:

“A tensor processing unit (TPU) is an AI accelerator application-specific integrated circuit (ASIC) developed by Google specifically for neural network machine learning, particularly using Google’s own TensorFlow software. Google began using TPUs internally in 2015, and in 2018 made them available for third party use, both as part of its cloud infrastructure and by offering a smaller version of the chip for sale.”

I have written more about these previously.

In June 2020, a paper was published that proposes a new approach to performing computations.

It proposes to use light instead of electricity to perform the computations required by a neural network.

Within this approach, a photonic tensor core performs matrix multiplications in parallel.

This is said to improve the speed and efficiency of current deep learning paradigms.
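To see why a faster matrix multiplier matters, here is a minimal sketch (my own illustration, not from the paper) of how a dense neural-network layer boils down to one matrix multiplication, which is the operation a photonic tensor core would carry out in parallel:

```python
import numpy as np

# A dense layer is, at its core, a matrix multiplication: every output
# neuron is a weighted sum of all inputs. The shapes below are
# illustrative assumptions, not values from the paper.

rng = np.random.default_rng(0)

batch = rng.normal(size=(4, 8))    # 4 input samples, 8 features each
weights = rng.normal(size=(8, 3))  # layer mapping 8 features to 3 outputs

# One matrix multiply computes all 4 x 3 weighted sums at once; this is
# the step a photonic tensor core would accelerate.
outputs = batch @ weights
print(outputs.shape)  # (4, 3)
```

Because every entry of the result is an independent sum of products, the whole multiplication can in principle be performed in parallel, whether by a GPU, an electronic TPU, or light.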

There is an article discussing this research in ScienceDaily.

In some machine learning techniques, neural networks are trained to make decisions and classify data they have not seen before.

Trained on data → to make an inference → recognise and/or classify objects.

This is of course a gross simplification of a complex process that does not necessarily happen in this sequence or specific manner.

One attempts to find a signature or pattern in the data.

The photonic TPU is said to store data in parallel, featuring an electro-optical interconnect. This supposedly allows the optical memory to be efficiently read and written, and the photonic TPU to interface with other architectures.

These are claimed to consume a fraction of the power of their electronic counterparts while offering higher throughput.

Some of these neural networks consist of multiple layers of interconnected ‘neurons’.

“Each neuron is a mathematical operation that takes its input, multiplies it by its weights and then passes the sum through the activation function to the other neurons.”
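The quote above can be sketched in a few lines of Python. This is a generic illustration of a single neuron (the input values, weights and choice of sigmoid activation are my own assumptions, not from the paper):

```python
import math

def sigmoid(x):
    # A common activation function, squashing any sum into (0, 1).
    return 1.0 / (1.0 + math.exp(-x))

def neuron(inputs, weights, bias=0.0):
    # Multiply each input by its weight, sum, then apply the activation,
    # exactly as described in the quote above.
    total = sum(i * w for i, w in zip(inputs, weights)) + bias
    return sigmoid(total)

# Illustrative values only.
activation = neuron([0.5, -1.0, 2.0], [0.8, 0.2, 0.1])
print(activation)  # ~0.599: the value passed on to the next layer's neurons
```

A layer simply runs many such neurons over the same inputs, which is why the whole computation collapses into the matrix multiplications that photonic hardware aims to accelerate.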

These networks can become incredibly complex.

They demand vast amounts of data, and vast amounts of power to process that data.

Graphics Processing Units (GPUs) and Tensor Processing Units (TPUs) are limited by the transmission of electronic data between the processor and the memory.

The researchers showed that the performance of their TPU could be 2–3 orders of magnitude higher than that of an electrical TPU.

The article in ScienceDaily argues that it could be of interest for 5G networks and, in that sense, edge computing:

“Photons may also be an ideal match for computing node-distributed networks and engines performing intelligent tasks with high throughput at the edge of a network, such as 5G. At network edges, data signals may already exist in the form of photons from surveillance cameras, optical sensors and other sources.”

If this is done, there is a possibility of reducing data centre traffic for certain processing tasks.

Data could be processed faster because it is preprocessed at the edge, meaning only a portion of it needs to be sent to the cloud or a data centre.
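As a purely hypothetical sketch of that idea (the function names, scores and threshold are my own illustrative assumptions, not from the article), an edge device could score incoming frames locally and forward only the interesting fraction upstream:

```python
def interesting(frame, threshold=0.8):
    # Stand-in for a local model's confidence score on this frame.
    return frame["score"] >= threshold

# Five frames with mock confidence scores from an imagined edge model.
frames = [{"id": i, "score": s}
          for i, s in enumerate([0.1, 0.95, 0.4, 0.85, 0.2])]

# Only frames above the threshold are sent to the cloud or data centre;
# the rest are handled (or discarded) at the edge.
to_cloud = [f for f in frames if interesting(f)]
print(len(to_cloud), "of", len(frames), "frames sent upstream")
```

In this toy run, only two of five frames would travel upstream, which is the kind of traffic reduction the article hints at.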

On the one hand this is of course highly possible, yet on the other hand it is still in the research phase, so we must show some patience and think of this with some moderation.

This is #500daysofAI and you are reading article 418. I am writing one new article about or related to artificial intelligence every day for 500 days.


Written by Alex Moltzau

Policy Officer at the European AI Office in the European Commission. This is a personal Blog and not the views of the European Commission.
