Dongarra’s algorithms and software fueled the growth of high-performance computing and had a significant impact on many areas of computer science, from AI to computer graphics.
New York, NY, March 30, 2022 – ACM, the Association for Computing Machinery, today named Jack J. Dongarra recipient of the 2021 ACM A.M. Turing Award for his pioneering contributions to numerical algorithms and libraries that enabled high-performance computing software to keep pace with exponential improvements in hardware for more than four decades. Dongarra is a Distinguished Professor of Computer Science in the Department of Electrical Engineering and Computer Science at the University of Tennessee. He also holds appointments with Oak Ridge National Laboratory and the University of Manchester.
The ACM A.M. Turing Award, often referred to as the “Nobel Prize of Computing,” carries a $1 million prize with financial support provided by Google, Inc. It is named for Alan M. Turing, the British mathematician who articulated the mathematical foundations and limits of computing.
Dongarra has led the world of high-performance computing through his contributions to efficient numerical algorithms for linear algebra operations, programming mechanisms for parallel computing, and performance evaluation tools. For nearly forty years, Moore’s Law produced exponential growth in hardware performance. During that same time, while most software failed to keep pace with these hardware advances, high-performance numerical software did, thanks in large part to Dongarra’s algorithms, optimization techniques, and production-quality software implementations.
These contributions established a framework from which scientists and engineers made important discoveries and game-changing innovations in areas such as big data analytics, healthcare, renewable energy, weather forecasting, genomics, and economics, to name a few. Dongarra’s work also helped facilitate advances in computer architecture and supported revolutions in computer graphics and deep learning.
Dongarra’s major contribution was the creation of open source software libraries and standards that employ linear algebra as an intermediate language usable by a wide variety of applications. These libraries have been written for single processors, parallel computers, multicore nodes, and multiple GPUs per node. Dongarra’s libraries also introduced many important innovations, such as autotuning, mixed-precision arithmetic, and batch computations.
As a leading ambassador of high-performance computing, Dongarra led the field in persuading hardware vendors to optimize these methods and software developers to target his open source libraries in their work. Ultimately, these efforts resulted in linear algebra-based software libraries achieving nearly universal adoption for high-performance engineering and scientific computing on machines ranging from laptops to the world’s fastest supercomputers. These libraries were essential to the growth of the field, allowing progressively more powerful computers to solve computationally challenging problems.
“Today’s fastest supercomputers grab media headlines and pique public interest by performing mind-boggling feats of a quadrillion calculations in a second,” explained ACM President Gabriele Kotsis. “But beyond the understandable interest in breaking new records, high-performance computing has been a major tool of scientific discovery, and HPC innovations have also spilled over into many different areas of computing and advanced our entire field. Jack Dongarra played a central part in directing the successful trajectory of this field. His trailblazing work stretches back to 1979, and he remains one of the foremost and most committed leaders in the HPC community. His career arguably exemplifies the Turing Award’s recognition of ‘major contributions of lasting importance.’”
“Jack Dongarra’s work has fundamentally changed and advanced scientific computing,” said Jeff Dean, Google Senior Fellow and SVP of Google Research and Google Health. “His profound and important work at the core of the world’s most widely used numerical libraries underlies every area of scientific computing, helping to advance everything from drug discovery to weather forecasting to aerospace engineering and dozens of other fields, and his deep focus on characterizing the performance of a wide range of computers has led to major advances in computer architectures that are well suited for numerical computations.”
Dongarra will formally receive the ACM A.M. Turing Award at the annual ACM Awards Banquet, to be held this year on Saturday, June 11 at the Palace Hotel in San Francisco.
SELECTED TECHNICAL CONTRIBUTIONS
For over four decades, Dongarra has been the primary implementer or principal investigator of many libraries, including LINPACK, BLAS, LAPACK, ScaLAPACK, PLASMA, MAGMA, and SLATE. These libraries have been written for single processors, parallel computers, multicore nodes, and multiple GPUs per node. His software libraries are used almost universally for high-performance engineering and scientific computing on machines ranging from laptops to the world’s fastest supercomputers.
These libraries incorporate numerous far-reaching technical innovations, such as:

Autotuning: Through his ATLAS project, which received the Test of Time Award at the 2016 Supercomputing Conference, Dongarra pioneered methods for automatically finding algorithmic parameters that produce linear algebra kernels of near-optimal efficiency, often outperforming vendor-supplied codes.
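The flavor of that parameter search can be sketched in a few lines. The snippet below is a toy illustration, not ATLAS itself (which searches a far richer space of blockings, unrollings, and instruction schedules over compiled C kernels): it times a blocked matrix multiply for several candidate block sizes and keeps whichever is fastest on the machine at hand. The matrix size and candidate list are arbitrary choices.

```python
# Toy autotuner: try several blocking factors for a blocked matrix multiply
# and keep whichever runs fastest on this machine. Purely illustrative.
import time
import numpy as np

def blocked_matmul(A, B, bs):
    """Compute A @ B by accumulating products of bs-by-bs blocks."""
    n = A.shape[0]
    C = np.zeros((n, n))
    for i in range(0, n, bs):
        for k in range(0, n, bs):
            for j in range(0, n, bs):
                C[i:i+bs, j:j+bs] += A[i:i+bs, k:k+bs] @ B[k:k+bs, j:j+bs]
    return C

n = 512                                   # matrix size (divisible by all bs)
A, B = np.random.rand(n, n), np.random.rand(n, n)
timings = {}
for bs in (16, 32, 64, 128):              # candidate algorithmic parameters
    start = time.perf_counter()
    blocked_matmul(A, B, bs)
    timings[bs] = time.perf_counter() - start
best = min(timings, key=timings.get)      # the "autotuned" choice
print("fastest block size on this machine:", best)
```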
Mixed-precision arithmetic: In his 2006 Supercomputing Conference paper, “Exploiting the Performance of 32-bit Floating Point Arithmetic in Obtaining 64-bit Accuracy,” Dongarra pioneered the exploitation of multiple precisions of floating-point arithmetic to deliver accurate solutions more quickly. This work has become instrumental in machine learning applications, as recently demonstrated in the HPL-AI benchmark, which achieved unprecedented levels of performance on the world’s top supercomputers.
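A minimal sketch of the idea, assuming a well-conditioned system: solve cheaply in 32-bit arithmetic, then correct the answer with 64-bit residuals until it reaches double accuracy. (Production codes such as LAPACK’s DSGESV factor the matrix once in low precision and reuse the factors; the repeated solves below simply keep the sketch short.)

```python
# Sketch of mixed-precision iterative refinement: solve A x = b in float32,
# then refine x with float64 residuals. Matrix and sizes are illustrative.
import numpy as np

rng = np.random.default_rng(0)
n = 500
A = rng.standard_normal((n, n)) + n * np.eye(n)   # well-conditioned by design
b = rng.standard_normal(n)

A32 = A.astype(np.float32)
x = np.linalg.solve(A32, b.astype(np.float32)).astype(np.float64)

for _ in range(5):                                  # refinement loop
    r = b - A @ x                                   # residual in full precision
    d = np.linalg.solve(A32, r.astype(np.float32))  # cheap low-precision solve
    x += d.astype(np.float64)
    if np.linalg.norm(r) < 1e-12 * np.linalg.norm(b):
        break

print("relative residual:", np.linalg.norm(b - A @ x) / np.linalg.norm(b))
```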
Batch computations: Dongarra pioneered the paradigm of dividing computations on large dense matrices, which are commonly used in simulations, modeling, and data analysis, into many computations on smaller blocks that can be calculated independently and concurrently. Building on his 2016 paper, “Performance, design, and autotuning of batched GEMM for GPUs,” Dongarra led the development of the Batched BLAS standard for such computations, which also appear in the MAGMA and SLATE software libraries.
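The paradigm is easy to illustrate. In the sketch below, NumPy’s matmul, which broadcasts over a leading batch dimension, stands in for a Batched BLAS or MAGMA routine; the batch count and block size are illustrative assumptions.

```python
# Sketch of a batched GEMM: thousands of small independent matrix products
# issued as a single call rather than one call per tiny matrix.
import numpy as np

batch, m = 10_000, 8                  # many tiny 8x8 multiplications
A = np.random.rand(batch, m, m)
B = np.random.rand(batch, m, m)

C = A @ B                             # one batched call covers all blocks
# equivalent to: for i in range(batch): C[i] = A[i] @ B[i]
print(C.shape)                        # (10000, 8, 8)
```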
Dongarra has collaborated internationally with many people on the above efforts, always serving as the driving force for innovation, continually developing new techniques to maximize performance and portability while maintaining numerically reliable results using state-of-the-art methods. Other examples of his leadership include the Message Passing Interface (MPI), the de facto standard for portable message passing on parallel computing architectures, and the Performance API (PAPI), which provides an interface that allows the collection and synthesis of performance data from the components of a heterogeneous system. The standards he helped create, such as MPI, the LINPACK Benchmark, and the Top500 list of supercomputers, underpin computational tasks ranging from weather prediction to climate change to data analysis from large-scale physics experiments.
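For a concrete taste of the message-passing model that MPI standardized, here is a minimal two-process exchange using mpi4py, a Python binding to MPI. It assumes an MPI runtime is installed; the script name, payload, and tag are arbitrary.

```python
# Minimal point-to-point message passing with mpi4py.
# Launch with, e.g.: mpirun -n 2 python demo.py
from mpi4py import MPI

comm = MPI.COMM_WORLD
rank = comm.Get_rank()                # this process's identity

if rank == 0:
    comm.send({"payload": [1, 2, 3]}, dest=1, tag=0)   # rank 0 sends
elif rank == 1:
    data = comm.recv(source=0, tag=0)                  # rank 1 receives
    print("rank 1 received:", data)
```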
BIOGRAPHICAL BACKGROUND
Jack J. Dongarra is a Distinguished Professor at the University of Tennessee and has been a Distinguished Member of the Research Staff at Oak Ridge National Laboratory since 1989. He has also been a Turing Fellow at the University of Manchester, UK, since 2007. Dongarra received a bachelor’s degree in Mathematics from Chicago State University, a master’s degree in Computer Science from the Illinois Institute of Technology, and a PhD in Applied Mathematics from the University of New Mexico.
Dongarra’s honors include the IEEE Computer Pioneer Award, the SIAM/ACM Prize in Computational Science and Engineering, and the ACM/IEEE Ken Kennedy Award. He is a Fellow of ACM, the Institute of Electrical and Electronics Engineers (IEEE), the Society for Industrial and Applied Mathematics (SIAM), the American Association for the Advancement of Science (AAAS), the International Supercomputing Conference (ISC), and the International Engineering and Technology Institute (IETI). He is a member of the National Academy of Engineering and a foreign member of the British Royal Society.
About the ACM A.M. Turing Award
The A.M. Turing Award is named for Alan M. Turing, the British mathematician who articulated the mathematical foundations and limits of computing, and who was a key contributor to the Allied cryptanalysis of the Enigma cipher during World War II. Since its inception in 1966, the Turing Award has honored the computer scientists and engineers who created the systems and underlying theoretical foundations that have propelled the information technology industry.
About ACM
ACM, the Association for Computing Machinery, is the world’s largest educational and scientific computing society, uniting educators, researchers, and professionals to inspire dialogue, share resources, and address the field’s challenges. ACM strengthens the computing profession’s collective voice through strong leadership, promotion of the highest standards, and recognition of technical excellence. ACM supports the professional growth of its members by providing opportunities for lifelong learning, career development, and professional networking.
Science seeks the basic laws of nature. Mathematics seeks new theorems built upon old ones. Engineering builds systems to meet human needs. The three disciplines are interdependent but distinct. It is very rare for one person to simultaneously make fundamental contributions to all three, but Claude Shannon was one of those rare people.
Despite being the subject of the recent documentary The Bit Player – and his research work and philosophy having inspired my own career – Shannon is not exactly a household name. He never won a Nobel Prize and was not a celebrity like Albert Einstein or Richard Feynman, neither before nor after his death in 2001. But more than 70 years ago, in a single groundbreaking paper, he laid the foundation for the entire communication infrastructure underlying the modern information age.
Shannon was born in Gaylord, Michigan, in 1916, the son of a local businessman and a teacher. After receiving bachelor’s degrees in electrical engineering and mathematics from the University of Michigan, he wrote a master’s thesis at the Massachusetts Institute of Technology that applied a mathematical discipline called Boolean algebra to the analysis and synthesis of switching circuits. It was a transformative work, turning circuit design from an art to a science, and is now considered to be the starting point of digital circuit design.
Next, Shannon set his sights on an even bigger target: communication.
Claude Shannon wrote a master’s thesis that launched digital circuit design, and a decade later he wrote his seminal paper on information theory, “A Mathematical Theory of Communication.”
Communication is one of the most basic human needs. From smoke signals to carrier pigeons to telephones and televisions, humans have always sought ways to communicate further, faster and more reliably. But the engineering of communication systems has always been linked to the source and the specific physical medium. Shannon instead wondered, “Is there a grand unified theory for communication?” In a 1939 letter to his mentor, Vannevar Bush, Shannon outlined some of his early ideas on “the fundamental properties of general systems for the transmission of intelligence.” After working on the problem for a decade, Shannon finally published his masterpiece in 1948: “A Mathematical Theory of Communication.”
The core of his theory is a simple but very general model of communication: A transmitter encodes information into a signal, which is corrupted by noise and decoded by the receiver. Despite its simplicity, Shannon’s model incorporates two key ideas: isolate the sources of information and noise from the communication system to be designed, and model both sources probabilistically. He imagined that the information source generated one of many possible messages to communicate, each of which had a certain probability. Probabilistic noise added more randomness for the receiver to unravel.
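Shannon’s noise source is easy to make concrete. A standard textbook instance of his probabilistic model (not a construction from the 1948 paper itself) is the binary symmetric channel, which flips each transmitted bit independently with some probability p; the sketch below uses an illustrative p = 0.1.

```python
# A binary symmetric channel: each transmitted bit is flipped independently
# with probability p, modeling Shannon's probabilistic noise source.
import numpy as np

def bsc(bits, p, rng):
    """Corrupt a bit array, flipping each bit with probability p."""
    flips = (rng.random(bits.shape) < p).astype(bits.dtype)
    return bits ^ flips

rng = np.random.default_rng(0)
message = rng.integers(0, 2, size=20)     # the source's random message
received = bsc(message, 0.1, rng)         # what the receiver must unravel
print("sent:    ", message)
print("received:", received)
```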
Before Shannon, the problem of communication was seen primarily as a problem of deterministic signal reconstruction: how to transform a received signal, distorted by the physical medium, to reconstruct the original as accurately as possible. Shannon’s genius lies in his observation that the key to communication is uncertainty. After all, if you knew in advance what I was going to tell you in this column, what would be the point of writing it?
Schematic diagram of Shannon’s communication model, excerpted from his paper.
This single observation moved the communication problem from the physical to the abstract, allowing Shannon to model uncertainty using probability. This was a complete shock to the communication engineers of the time.
In this framework of uncertainty and probability, Shannon set out to systematically determine the fundamental limit of communication. His answer is divided into three parts. The concept of the “bit” of information, used by Shannon as the basic unit of uncertainty, plays a fundamental role in all three. A bit, which is short for “binary digit,” can be either a 1 or a 0, and Shannon’s paper is the first to use the word (although he said mathematician John Tukey used it first in a memo).
First, Shannon devised a formula for the minimum number of bits per second required to represent the information, a number he called the entropy rate, H. This number quantifies the uncertainty involved in determining which message the source will generate. The lower the entropy rate, the lower the uncertainty, and therefore the easier it is to compress the message into something shorter. For example, texting at a rate of 100 English letters per minute means sending one of 26^100 possible messages each minute, each represented by a sequence of 100 letters. All of these possibilities could be encoded into 470 bits, since 2^470 ≈ 26^100. If the sequences were equally likely, Shannon’s formula would say that the entropy rate is indeed 470 bits per minute. In reality, some sequences are much more likely than others, and the entropy rate is much lower, allowing for greater compression.
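The arithmetic above is easy to verify, and a small computation also shows how skewed probabilities lower the entropy. The skewed distribution below is an invented toy, not actual English letter statistics.

```python
# Check the numbers above, then compute Shannon entropy for a skewed
# (invented) letter distribution to show how non-uniformity lowers H.
import math

print(100 * math.log2(26))        # ~470.0: bits for 100 equally likely letters

def entropy(probs):
    """Shannon entropy H = -sum(p * log2 p), in bits per symbol."""
    return -sum(p * math.log2(p) for p in probs if p > 0)

skewed = [0.5, 0.25] + [0.25 / 24] * 24   # two very common letters, 24 rare
print(entropy(skewed))            # ~2.65 bits/letter, well below log2(26)=4.70
```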
Second, he provided a formula for the maximum number of bits per second that can be reliably communicated in the face of noise, which he called the system’s capacity, C. This is the maximum rate at which the receiver can resolve the message’s uncertainty, making it, in effect, the speed limit of communication.
Finally, he showed that reliable communication of the source’s information in the face of noise is possible if and only if H < C. Thus, information is like water: if the flow rate is less than the capacity of the pipe, the stream gets through reliably.
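For the binary symmetric channel sketched earlier, the capacity even has a closed form, C = 1 − H₂(p), where H₂ is the binary entropy function; the sketch below evaluates it and applies Shannon’s H < C criterion. The flip probability and test rates are arbitrary.

```python
# Capacity of the binary symmetric channel, C = 1 - H2(p), and Shannon's
# criterion that reliable communication requires H < C.
import math

def h2(p):
    """Binary entropy function H2(p), in bits."""
    return -p * math.log2(p) - (1 - p) * math.log2(1 - p)

p = 0.1                                    # illustrative flip probability
C = 1 - h2(p)
print(f"capacity at p={p}: {C:.3f} bits per channel use")   # ~0.531

for H in (0.4, 0.6):                       # source entropy rates to test
    verdict = "achievable" if H < C else "not achievable"
    print(f"H = {H}: {verdict}")
```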
Although it is a theory of communication, it is at the same time a theory of how information is produced and transferred: an information theory. That is why Shannon is regarded today as “the father of information theory.”
His theorems led to some counterintuitive conclusions. Suppose you are talking in a very noisy place. What’s the best way to make sure your message gets through? Maybe repeating it many times? That’s certainly anyone’s first instinct in a noisy restaurant, but it turns out it’s not very efficient. Sure, the more times you repeat yourself, the more reliable the communication becomes, but you have sacrificed speed for reliability. Shannon showed us that we can do far better. Repeating a message is an example of using a code to communicate it, and by using different, more sophisticated codes, one can communicate fast, all the way up to the speed limit C, while maintaining any given degree of reliability.
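This trade-off is simple to simulate. The sketch below sends each bit through the binary symmetric channel from earlier, repeated n times with majority-vote decoding: the error probability falls as n grows, but the rate collapses as 1/n, which is exactly the sacrifice Shannon showed that better codes avoid. All parameters are illustrative.

```python
# The repetition-code trade-off: majority voting over n copies cuts the error
# probability, but the rate drops to 1/n.
import numpy as np

rng = np.random.default_rng(1)
p, trials = 0.1, 100_000                   # flip probability, Monte Carlo size

for n in (1, 3, 5, 9):                     # repeat each bit n times (n odd)
    flips = rng.random((trials, n)) < p    # which copies the channel corrupts
    wrong = flips.sum(axis=1) > n / 2      # majority vote decodes incorrectly
    print(f"n={n}: rate={1/n:.2f} bits/use, error={wrong.mean():.4f}")
```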
Another unexpected conclusion emerging from Shannon’s theory is that, whatever the nature of the information (a Shakespeare sonnet, a recording of Beethoven’s Fifth Symphony, or a Kurosawa film), it is always most efficient to encode it into bits before transmitting it. So in a radio system, for example, even though both the initial sound and the electromagnetic signal sent through the air are analog waveforms, Shannon’s theorems imply that it is optimal to first digitize the sound wave into bits, and then map those bits onto the electromagnetic wave. This surprising result is a cornerstone of the modern digital information age, in which the bit reigns as the universal currency of information.
Shannon also had a playful side, which he often brought to his work. Here, he poses with a maze he built for an electronic mouse named Theseus.
Shannon’s general theory of communication is so natural that it’s as if he discovered the communication laws of the universe, instead of inventing them. His theory is as fundamental as the physical laws of nature. In that sense, he was a scientist.
Shannon invented new mathematics to describe the laws of communication. He introduced new ideas, such as the entropy rate of a probabilistic model, which have been applied in far-reaching mathematical branches, such as ergodic theory, the study of the long-term behavior of dynamical systems. In that sense, Shannon was a mathematician.
But above all, Shannon was an engineer. His theory was motivated by practical engineering problems. And while it seemed esoteric to the engineers of his day, Shannon’s theory has become the standard framework on which all modern communication systems are based: optical, underwater, even interplanetary. Personally, I have been fortunate to be part of a worldwide effort to apply and extend Shannon’s theory to wireless communication, increasing communication speeds by two orders of magnitude over multiple generations of standards. In fact, the 5G standard currently being rolled out uses not one but two codes proven to approach the Shannon speed limit.
Although Shannon died in 2001, his legacy lives on in the technology that makes up our modern world and in the devices he created, like this remote-controlled bus.
Shannon discovered the foundation for all of this more than 70 years ago. How did he do it? By relentlessly focusing on the essential feature of a problem while ignoring all other aspects. The simplicity of his communication model is a good illustration of this style. He also knew to focus on what is possible, rather than what is immediately practical.
Shannon’s work illustrates the true role of top-level science. When I started college, my advisor told me that the best work is to prune the tree of knowledge, rather than to grow it. At the time, I didn’t know what to make of this advice; I had always thought my job as a researcher was to add my own twigs. But over the course of my career, whenever I had the opportunity to apply this philosophy in my own work, I came to understand it.
When Shannon began studying communication, engineers already had a large collection of techniques. It was his unifying work that pruned all these twigs of knowledge into a single coherent and elegant tree, one that has borne fruit for generations of scientists, mathematicians, and engineers.