Data-flow vs control-flow for extreme level computing
Publisher: Institute of Electrical and Electronics Engineers Inc.
Source: Proceedings - 2013 3rd Workshop on Data-Flow Execution Models for Extreme Scale Computing, DFM 2013
This paper challenges the current thinking behind building High Performance Computing (HPC) systems, which is based on sequential computing (the von Neumann model), by proposing novel systems based on the Dynamic Data-Flow model of computation. The switch to multi-core chips brought parallel processing into the mainstream; the computing industry and research community were forced to make this switch because they hit the Power and Memory walls. Will the same happen with HPC? In 2007 the United States, through its DARPA agency, commissioned a study to determine what kind of technologies would be needed to build an Exaflop computer. The head of the study was very pessimistic about the possibility of having an Exaflop computer in the foreseeable future. We believe that many of the findings that caused this pessimistic outlook were due to the limitations of the sequential model, and that a paradigm shift may be needed to achieve affordable Exascale-class supercomputers. © 2013 IEEE.
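To make the contrast with sequential execution concrete, the following is a minimal, illustrative sketch of the dynamic data-flow firing rule: an instruction fires as soon as all of its operands have arrived, not when the program counter reaches it. This sketch and all of its names (`Node`, `run`, etc.) are our own illustration, not code or an API from the paper.

```python
# Minimal sketch of dynamic data-flow scheduling (illustrative only).
# A node "fires" once all of its input operands have arrived,
# independent of any sequential program order.
from collections import deque

class Node:
    def __init__(self, name, op, n_inputs):
        self.name = name
        self.op = op              # function applied to the operands
        self.n_inputs = n_inputs  # operands needed before firing
        self.inputs = {}          # slot -> value, filled as tokens arrive
        self.consumers = []       # (target node, target slot) pairs

def run(nodes, initial_tokens):
    """Deliver tokens and fire any node whose operand set is complete."""
    ready = deque()

    def deliver(node, slot, value):
        node.inputs[slot] = value
        if len(node.inputs) == node.n_inputs:
            ready.append(node)

    for node, slot, value in initial_tokens:
        deliver(node, slot, value)

    results = {}
    while ready:
        node = ready.popleft()
        value = node.op(*(node.inputs[i] for i in range(node.n_inputs)))
        results[node.name] = value
        for target, slot in node.consumers:
            deliver(target, slot, value)
    return results

# (a + b) * (c - d): add and sub are independent, so a data-flow machine
# may fire them in any order (or in parallel); mul fires only once both
# of its operand tokens have arrived.
add = Node("add", lambda x, y: x + y, 2)
sub = Node("sub", lambda x, y: x - y, 2)
mul = Node("mul", lambda x, y: x * y, 2)
add.consumers.append((mul, 0))
sub.consumers.append((mul, 1))

results = run([add, sub, mul],
              [(add, 0, 2), (add, 1, 3), (sub, 0, 7), (sub, 1, 4)])
print(results["mul"])  # → 15
```

Note that no program counter appears anywhere: the only ordering constraint is operand availability, which is what allows a data-flow machine to expose all the parallelism in the graph.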