DARPA’s UPSIDE Seeks to Rethink Computing

The Defense Advanced Research Projects Agency’s (DARPA) Unconventional Processing of Signals for Intelligent Data Exploitation (UPSIDE) program is looking to develop a computing infrastructure that is three orders of magnitude faster and four orders of magnitude more energy efficient, or roughly 1,000 and 10,000 times respectively, than current digital processors. To do so, the Broad Agency Announcement for UPSIDE, issued August 14, seeks a solution so revolutionary that it specifically states it will disqualify any “evolutionary improvements to the existing state of practice.” That’s because UPSIDE isn’t simply seeking faster, smaller, and more efficient processors; it seeks to escape the current limits of Moore’s Law.

The Department of Defense relies heavily on intelligence, surveillance, and reconnaissance (ISR) gathered through sensors that guide our missiles, inform our military, allow us to fly drones and even manned aircraft, and generally act as one of our principal force multipliers. But as with everything in the age of Big Data, we now pull in more information than we can analyze and leverage. Military sensors face additional limitations. We can’t simply throw supercomputers at the problem: many sensors are deployed, portable, and operate in the tactical arena, meaning they must be small and energy efficient as well as powerful. Moore’s Law, which observes that the number of transistors on an integrated circuit doubles every 18 months to two years, yielding processors with twice the power, half the size, or half the price, has driven the evolution of small, low-cost sensors. But we are reaching the limits of traditional complementary metal–oxide–semiconductor (CMOS) technology, and processor improvements are beginning to stall, especially with regard to energy, limiting portable defense technology.

As a result, DARPA is looking for a breakthrough in non-CMOS, non-Boolean computing. Current digital technology is built on complementary metal–oxide–semiconductors and, as a result, on a Boolean computational model in which expressions must be true or false, 1 or 0, on or off, which is inherently power inefficient for “noisy,” real-time, analog sensors. Instead, DARPA is looking at nanoscale devices whose physical properties allow for a Bayesian model. Bayesian mathematics deals with probabilistic inference based on Bayes’ law, a theorem of mathematical statistics that relates conditional probabilities. In other words, given that condition B is true, Bayes’ law helps you determine whether condition A is also true, based on the likelihoods of A, of B, and of B given A. This is a good approximation of how we make inferences ourselves. For example, if we are trying to decide whether an animal is a duck given that it’s quacking, we would consider the likelihood that it’s a duck (high near a pond but low in the desert), the likelihood that a duck would be quacking (do all ducks quack?), and the likelihood that whatever quacks is a duck (certain parrots also make similar noises).
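To make that concrete, here is a minimal sketch of Bayes’ law applied to the duck example, written in Python; every probability below is invented purely for illustration.

```python
# A minimal sketch of Bayes' law using the duck example.
# All probabilities here are made-up illustrative values.

p_duck = 0.3                   # prior: probability a nearby animal is a duck (near a pond)
p_quack_given_duck = 0.9       # likelihood: probability that a duck quacks
p_quack_given_not_duck = 0.05  # certain parrots also make similar noises

# Total probability of hearing a quack from any animal
p_quack = (p_quack_given_duck * p_duck
           + p_quack_given_not_duck * (1 - p_duck))

# Bayes' law: P(duck | quack) = P(quack | duck) * P(duck) / P(quack)
p_duck_given_quack = p_quack_given_duck * p_duck / p_quack

print(f"P(duck | quack) = {p_duck_given_quack:.3f}")  # ~0.885
```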

The main advantage of Bayes’ law, and the cornerstone of Bayesian probability, is iterative updating. Once you determine the probability of A given B, another condition, C, can be observed, and you can determine how likely A is given both B and C, continuing to add conditions. Returning to our duck example, if we notice that the quacking animal is also walking like a duck, we can use Bayes’ law to determine the likelihood that an animal that quacks is a duck given that it also waddles.
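Continuing the earlier sketch, the second observation is folded in the same way; this assumes, for simplicity, that quacking and waddling are independent given whether the animal is a duck, and the numbers remain illustrative.

```python
# Iterative update: fold in a second observation (waddling).
# The posterior from the quack step becomes the new prior.

prior = 0.885                  # P(duck | quack) from the previous sketch
p_waddle_given_duck = 0.8      # most ducks waddle
p_waddle_given_not_duck = 0.1  # few quacking non-ducks also waddle

# Total probability of waddling, then Bayes' law a second time
p_waddle = (p_waddle_given_duck * prior
            + p_waddle_given_not_duck * (1 - prior))
p_duck_given_both = p_waddle_given_duck * prior / p_waddle

print(f"P(duck | quack and waddle) = {p_duck_given_both:.3f}")  # ~0.984
```

Each new observation sharpens the belief, which is exactly the behavior UPSIDE wants from a sensor that accumulates evidence over time.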

UPSIDE is less concerned with identifying ducks and more interested in identifying insurgents, IEDs, friendly forces, and intruders. Under the solicitation, the computational framework would be implemented as an “Inference Module” (IM) that performs image analysis based on Bayesian networks. A Bayesian network is a web of conditions, each node connected to the others by probabilities, that together support a decision. The IM would produce the most likely interpretation of its input by moving through this network of inferences.
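In software, inference over such a network can be sketched with a toy example; the node names, probabilities, and the `posterior_threat` helper below are invented for illustration and are not drawn from the solicitation.

```python
# A toy three-node Bayesian network, solved by enumeration:
# a hidden "threat" node with two evidence nodes hanging off it.

P_THREAT = 0.01                     # prior on "object is a threat"
P_HOT = {True: 0.7, False: 0.1}     # P(thermal signature | threat?)
P_MOVING = {True: 0.6, False: 0.2}  # P(suspicious movement | threat?)

def posterior_threat(hot: bool, moving: bool) -> float:
    """P(threat | evidence), enumerating both values of the threat node."""
    def joint(threat: bool) -> float:
        prior = P_THREAT if threat else 1 - P_THREAT
        p_hot = P_HOT[threat] if hot else 1 - P_HOT[threat]
        p_mov = P_MOVING[threat] if moving else 1 - P_MOVING[threat]
        return prior * p_hot * p_mov

    num = joint(True)
    return num / (num + joint(False))

print(f"P(threat | hot, moving) = {posterior_threat(True, True):.3f}")
```

A digital processor must grind through this kind of enumeration arithmetically; UPSIDE’s bet is that the right physical device could settle into the answer directly.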

What’s so revolutionary about UPSIDE is the way it would implement the IM. As mentioned earlier, the computations would not be Boolean true or false but rather a series of Bayesian probability distributions. Traditional computing is Boolean in part because CMOS devices amount to billions of on-off switches, giving us the 1s and 0s of bits and bytes. UPSIDE would sidestep this by adopting different nanoscale devices with physical properties that can represent a weighted probability distribution directly.

This is where UPSIDE’s dramatic gains in speed and energy efficiency come from. It takes many bits and bytes to represent a probability distribution, and a probability distribution sits at each node of a Bayesian network. Representing each with a single nanoscale device would drastically reduce the number of calculations needed to reach a judgment, increasing speed by several orders of magnitude while shrinking device size and the power needed per operation. DARPA hopes to cut energy requirements further by building energy minimization into the decision-making process: the most likely path through the Bayesian network would be the one that uses the least energy. This, again, is easier to represent via the physical properties of nanoscale devices, much like current seeking the path of least resistance.
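The least-energy intuition has a simple mathematical counterpart: if each inference step is assigned an “energy” of -log(p), then maximizing a product of probabilities along a path becomes minimizing a sum of energies. A toy sketch, with an invented two-path graph:

```python
# Least energy = most likely: E = -log(p) turns the most probable
# path into the minimum-energy path. Graph and values are invented.
import math

# Each edge carries the probability of one inference step.
edges = {
    ("start", "a"): 0.9, ("start", "b"): 0.4,
    ("a", "end"): 0.5,   ("b", "end"): 0.8,
}

def energy(p: float) -> float:
    return -math.log(p)  # low energy corresponds to high probability

paths = [["start", "a", "end"], ["start", "b", "end"]]
for path in paths:
    e = sum(energy(edges[(u, v)]) for u, v in zip(path, path[1:]))
    print(path, f"energy={e:.3f}", f"probability={math.exp(-e):.3f}")
# The path with the minimum total energy is the most probable interpretation.
```

A physical device relaxing to its lowest-energy state would, in effect, perform this minimization for free, which is where the efficiency gains would come from.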

Another interesting element of UPSIDE’s IM is that, rather than being programmed, each module should self-organize in response to its inputs. The IM would learn in real-time based on an evolving series of “beliefs” that influence probabilities and pattern recognition. All of this would occur while the system is running.
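One standard software idiom for this kind of always-on learning is conjugate Bayesian updating, in which each new input nudges a belief with no offline retraining step. The sketch below uses a Beta-Bernoulli model as a stand-in for illustration; it is not the actual UPSIDE mechanism.

```python
# A minimal sketch of a "belief" that updates itself while the system
# runs: a Beta-Bernoulli model whose estimate sharpens with each input.

class Belief:
    """Tracks P(pattern present) as a Beta(alpha, beta) distribution."""
    def __init__(self, alpha: float = 1.0, beta: float = 1.0):
        self.alpha, self.beta = alpha, beta  # start from a uniform prior

    def observe(self, detected: bool) -> None:
        # Each streaming observation adjusts the belief in place.
        if detected:
            self.alpha += 1
        else:
            self.beta += 1

    @property
    def estimate(self) -> float:
        return self.alpha / (self.alpha + self.beta)

belief = Belief()
for detection in [True, True, False, True]:  # streaming sensor inputs
    belief.observe(detection)
    print(f"P(pattern) ~= {belief.estimate:.2f}")
```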

Submissions to UPSIDE will be evaluated against current solutions on speed, power efficiency, and accuracy: the first two should be dramatically improved without any critical loss in the third. Proposals for UPSIDE are due October 12, and performance is estimated to start in March 2013.

This post was first published at CTOvision.com.


