DENVER — At Supercomputing 2013, Micron unveiled what it claims is a fundamentally new processor architecture that speeds up the search and analysis of complex, unstructured data streams. The sneak peek of its Automata Processor (AP) architecture was accompanied by the establishment of a Center for Automata Computing at the University of Virginia.
In a telephone interview from the conference floor, Micron’s director of Automata Processor technology development, Paul Dlugosch, told EE Times that Automata differs from conventional CPUs in that its computing fabric is made up of tens of thousands to millions of interconnected processing elements. Its design is an adaptation of memory array architecture, exploiting the inherent bit-parallelism of traditional SDRAM.
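To make that concrete, here is a minimal Python sketch (illustrative only, and not Micron’s implementation or API) of the automata model: a nondeterministic finite automaton keeps a set of active states, and each incoming symbol advances all of them at once. In the AP’s fabric that step would happen in hardware in a single cycle; a CPU has to serialize it.

# Minimal sketch of the automata computing model (illustrative only;
# not Micron's implementation or API).

def run_nfa(transitions, start_states, accept_states, stream):
    """Step a nondeterministic finite automaton over a symbol stream.

    transitions maps (state, symbol) -> set of next states. Yields
    each stream position at which an accept state is reached.
    """
    active = set(start_states)
    for i, symbol in enumerate(stream):
        # In the AP's fabric every active element would take this
        # step in the same cycle; software has to loop over the set.
        active = set().union(*[transitions.get((s, symbol), set())
                               for s in active])
        if active & accept_states:
            yield i

# Example: report every occurrence of "ab". State 0 self-loops so
# the search restarts anywhere; state 1 means "just saw 'a'";
# state 2 accepts.
trans = {(0, 'a'): {0, 1}, (0, 'b'): {0}, (1, 'b'): {2}}
print(list(run_nfa(trans, {0}, {2}, "aababb")))  # -> [2, 4]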
“Many of the most complex computational problems that face our industry today require a substantial amount of parallelism in order to increase the performance of the computing system,” said Dlugosch.
Conventional SDRAM is organized into a two-dimensional array of rows and columns and accesses a memory cell for any read or write operation. In the AP, said Dlugosch, the memory is not used to store data; it is used to stream back analysis of data. The AP architecture uses a DDR3-like memory interface and will be made available as single components or as DIMM modules.
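That use of the array has a well-known software analogue, sketched below: the bit-parallel Shift-And string matcher (a comparison chosen for this article, not Micron’s design). A 256-entry table indexed by the incoming byte plays the role of the row read, answering in one lookup which pattern positions recognize that symbol.

# Bit-parallel Shift-And matcher (a standard algorithm, used here
# only as an analogy for the row-read scheme; not Micron's design).

def shift_and_search(pattern: bytes, stream: bytes):
    """Yield each position in `stream` where `pattern` ends.

    mask[b] has bit j set when pattern[j] == b, so one table lookup
    per input byte answers "which pattern positions recognize this
    symbol?" for all positions at once.
    """
    m = len(pattern)
    mask = [0] * 256                   # one "row" per possible input byte
    for j, b in enumerate(pattern):
        mask[b] |= 1 << j
    state = 0                          # bit j = "first j+1 chars matched"
    accept = 1 << (m - 1)
    for i, b in enumerate(stream):
        state = ((state << 1) | 1) & mask[b]
        if state & accept:
            yield i

# Example: find the DNA motif GATC in a stream of bases.
print(list(shift_and_search(b"GATC", b"AAGATCGGATCA")))  # -> [5, 10]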
Micron will also make available graphical design and simulation tools and a software development kit (SDK) to help developers design, compile, test, and deploy their own applications. A PCIe board populated with AP DIMMs will be available to early-access application developers so they can begin plug-in development of AP applications. Samples of the AP and the SDK will be available in 2014.
Automata has been in development for seven years, spurred by customer requests for even faster speeds. Dlugosch said Micron decided it was time to take a different approach to a problem consistently voiced by CPU vendors and OEMs: Memory is the bottleneck. This problem has been exacerbated since the early days of big data in 2007, he said.
The architecture is aimed specifically at advanced computing capabilities, particularly analytics, where high-performance computing meets big data to solve problems in the areas of bioinformatics and network security analysis. The applications require deep analysis of data streams that contain spatial and temporal information and are often challenged by memory constraints.
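For a sense of the workload, network security and bioinformatics pipelines typically check a stream against many signatures in a single pass. On a CPU the classic tool for that is an Aho-Corasick automaton, a minimal version of which is sketched below (again purely illustrative, not Micron code); the AP’s pitch is to evaluate large numbers of such patterns simultaneously in the memory fabric.

# Minimal Aho-Corasick automaton: the standard CPU approach to
# matching many patterns in one pass over a stream (illustrative
# only; not Micron code).
from collections import deque

def build(patterns):
    # goto[s] maps a symbol to the next state; fail[s] is the
    # longest proper-suffix state; out[s] holds patterns ending at s.
    goto, fail, out = [{}], [0], [set()]
    for p in patterns:
        s = 0
        for c in p:
            if c not in goto[s]:
                goto.append({}); fail.append(0); out.append(set())
                goto[s][c] = len(goto) - 1
            s = goto[s][c]
        out[s].add(p)
    q = deque(goto[0].values())        # depth-1 states fail to the root
    while q:
        s = q.popleft()
        for c, t in goto[s].items():
            q.append(t)
            v = fail[s]
            while v and c not in goto[v]:
                v = fail[v]
            fail[t] = goto[v].get(c, 0)
            out[t] |= out[fail[t]]     # inherit matches from the suffix
    return goto, fail, out

def scan(stream, goto, fail, out):
    s = 0
    for i, c in enumerate(stream):
        while s and c not in goto[s]:  # follow failure links on mismatch
            s = fail[s]
        s = goto[s].get(c, 0)
        for p in out[s]:
            yield i, p                 # pattern p ends at position i

g, f, o = build(["he", "she", "his", "hers"])
print(sorted(scan("ushers", g, f, o)))
# -> [(3, 'he'), (3, 'she'), (5, 'hers')]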
Micron has partnered with a number of research institutions to foster adoption of the technology, including the Georgia Institute of Technology and the University of Missouri, while the University of Virginia has established the Center for Automata Computing with the help of seed funding from Micron.
Stu Wolf, a professor of materials science and physics at the university who was instrumental in setting up the center and also attended the conference, said problems are already being solved using the technology. One early project is designed to help researchers in biomedical engineering analyze huge volumes of DNA data and cellular imagery. Other potential applications include interpretation of social science datasets, data analysis for personal and national security, and design verification in engineering.
“This particular architecture and the way it has been configured allows a problem size to be much larger than in any conventional computer,” said Wolf, adding that researchers who have started using it have found Micron’s SDK easy to work with.
Chirag Dekate, IDC’s research manager for HPC/Data Analysis, said Micron’s AP architecture was one of the highlights for him at Supercomputing 2013. “It is an extremely innovative architecture. It is the first processor of its kind.”
Because the AP architecture is data-flow based, it is well suited for certain problems, such as the emerging area of graph analytics, said Dekate, and having an SDK to program the processor is a compelling feature.
He said it’s important to note that the AP architecture is a first-generation technology and it is not suitable for all computing problems.
Micron’s choice to go beyond releasing an incremental update is a high-risk proposition, Dekate said. “They’re willing to innovate in what is quite a complex and dynamic ecosystem.”