At the current stage of development, training artificial intelligence systems runs into a natural technological limitation: the finite computing power of servers and even supercomputers.

 

 

Datasets loaded for processing go through several "circles of hell" in the search for an optimal training algorithm, and not always successfully: it often turns out that to obtain any useful result, additional data is needed, either to speed the process up significantly or because without it no result is possible at all.

The worst-case scenario is when building a reliable predictive model from the available data turns out to be practically impossible with the existing processing capacity, because the volume of source data and the relationships between them are too large. Strictly speaking, the problem is still solvable, but the result would be useless: the computation would simply take too long.

 

Quantum technologies have produced a large-scale breakthrough in how artificial intelligence is trained: on the one hand, they have reduced data processing time by several orders of magnitude; on the other hand, the technology has its own drawbacks, some of them quite critical.


 

It should be noted right away that this approach has not yet reached production use: only experimental implementations exist, and they have revealed both the positive aspects and the range of problems that must be solved in the near future before full deployment is possible.

 

First among these is the availability of sufficiently powerful quantum computers in the required quantity: for now they are clearly too few, they are expensive, and they still need refinement in how they interact with the rest of the equipment, at both the hardware and the software level.

 

The second is a drawback of most existing quantum algorithms used in these calculations: unacceptably often the computation reaches a dead end known as a "barren plateau". It arises because variational algorithms typically start from randomly initialized parameters and optimize a cost function that involves all the qubits globally; the optimization landscape then flattens out, the gradients become vanishingly small, and the search never converges to an answer.
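To make the effect concrete, here is a minimal numerical sketch, not taken from any of the research described here: it simulates a small variational circuit of RY rotations and CNOT gates with a global cost function and estimates the gradient of the first parameter over many random initializations. The ansatz, layer count, and cost function are illustrative assumptions; the shrinking gradient variance it prints is the characteristic signature of a barren plateau.

```python
import numpy as np

def apply_single_qubit(state, gate, qubit, n):
    """Apply a 2x2 gate to one qubit of an n-qubit state vector."""
    psi = state.reshape([2] * n)
    psi = np.tensordot(gate, psi, axes=([1], [qubit]))
    psi = np.moveaxis(psi, 0, qubit)
    return psi.reshape(-1)

def apply_cnot(state, control, target, n):
    """Apply a CNOT by flipping the target axis where the control is |1>."""
    psi = state.reshape([2] * n).copy()
    idx = [slice(None)] * n
    idx[control] = 1
    # After fixing the control axis, later axes shift down by one.
    target_axis = target if target < control else target - 1
    psi[tuple(idx)] = np.flip(psi[tuple(idx)], axis=target_axis).copy()
    return psi.reshape(-1)

def ry(theta):
    """Single-qubit Y-rotation matrix."""
    c, s = np.cos(theta / 2), np.sin(theta / 2)
    return np.array([[c, -s], [s, c]])

def circuit_state(params, n, layers):
    """Hardware-efficient ansatz: layers of RY rotations plus a ring of CNOTs."""
    psi = np.zeros(2 ** n, dtype=complex)
    psi[0] = 1.0
    k = 0
    for _ in range(layers):
        for q in range(n):
            psi = apply_single_qubit(psi, ry(params[k]), q, n)
            k += 1
        for q in range(n):
            psi = apply_cnot(psi, q, (q + 1) % n, n)
    return psi

def global_cost(params, n, layers):
    """Global cost function: 1 - probability of measuring |00...0>."""
    psi = circuit_state(params, n, layers)
    return 1.0 - abs(psi[0]) ** 2

def grad_first_param(params, n, layers):
    """Exact gradient of the cost w.r.t. the first angle (parameter-shift rule)."""
    plus, minus = params.copy(), params.copy()
    plus[0] += np.pi / 2
    minus[0] -= np.pi / 2
    return 0.5 * (global_cost(plus, n, layers) - global_cost(minus, n, layers))

if __name__ == "__main__":
    rng = np.random.default_rng(0)
    layers, samples = 4, 200
    for n in (2, 4, 6, 8):
        grads = [grad_first_param(rng.uniform(0, 2 * np.pi, layers * n), n, layers)
                 for _ in range(samples)]
        # The gradient variance shrinks rapidly as qubits are added:
        # the flattening landscape is the barren plateau.
        print(f"{n} qubits: Var[dC/dtheta_0] = {np.var(grads):.2e}")
```

Running this sketch should print gradient variances that fall steeply as the number of qubits grows, which is exactly the flat, uninformative landscape described above: with random global initialization, the optimizer has almost no signal to follow.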

 

However, it turned out that the second problem is tied to the first: a group of researchers at Los Alamos National Laboratory showed mathematically that with sufficient scaling of the quantum system, that is, an increase in the number of qubits (to at least 100), the probability of hitting such a plateau drops to quite acceptable values.
