Engineers to study best way to maximize computer’s power

Benjamin Moseley, a computer scientist at Washington University in St. Louis, has received two multi-year grants from the National Science Foundation (NSF) totaling $900,000.

When you type a word or phrase into a search engine, the search goes out to many processors that look for the answer simultaneously, an approach called parallel computing. A team of computer engineers at Washington University in St. Louis is seeking the best way to take advantage of parallel computing to maximize its power and potential.

In the first grant, Moseley, Kunal Agrawal and I-Ting Angelina Lee, all assistant professors of computer science & engineering in the School of Engineering & Applied Science, received four years of funding totaling $650,000 to find a way to schedule jobs so that the parallel computing process runs fairly and efficiently.

Parallel computing, which is now the standard in computing, uses at least two processors, or cores, simultaneously to solve a single problem. A desktop computer generally has eight cores, while a data center would have computers with 128 or 256 cores on one chip, said Moseley, the principal investigator.
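Those core counts can be checked on any machine; as a small illustration, Python's standard library reports the number of logical cores available to the interpreter:

```python
import os

# Report how many logical cores this machine exposes; on a typical
# desktop this is around eight, while server-class chips expose
# far more.
print(os.cpu_count())
```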

“We want to use the processing power and deliver good quality of service,” said Moseley, an applied mathematician. “That would mean users could get more results and do more work, with less variance in the time it takes to receive those results.”

For example, if a user opens a word-processing program, the job will be assigned to a core to begin the program. Moseley and his team will study what happens when the same user decides to watch a video while keeping the word-processing program open.

“How does the computer decide whether it processes the word-processing program or runs the video, and how does it allocate the processors in such a way that it delivers whatever is important to you: that the video keeps running and the word processor stays open?” Moseley said.
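One simple way to make that kind of decision, shown here purely as a hypothetical sketch and not the team's actual scheduler, is a priority queue that always runs the most important ready job next, so an interactive task like a playing video can come ahead of a background one:

```python
import heapq

def run_schedule(jobs):
    """jobs: list of (priority, name); a lower number means more
    important. Returns job names in the order a simple priority
    scheduler would run them."""
    heapq.heapify(jobs)        # build a min-heap keyed on priority
    order = []
    while jobs:
        _, name = heapq.heappop(jobs)   # most important ready job
        order.append(name)
    return order

# The interactive video job (priority 1) runs before the background
# word-processor job (priority 2).
print(run_schedule([(2, "word processor"), (1, "video frame")]))
# ['video frame', 'word processor']
```

Real operating-system and parallel schedulers are far more sophisticated (time slicing, fairness guarantees, work stealing), which is exactly the design space the grant addresses.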

Prior to parallel computing, a search request went to just one machine and one processor, where it was handled in the order it was received, or sequentially.
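The contrast can be sketched in a few lines of Python; `handle_request` here is a hypothetical stand-in for real request processing:

```python
import concurrent.futures

def handle_request(query):
    # Toy stand-in for processing one search request.
    return query.upper()

requests = ["apple", "banana", "cherry", "dates"]

# Sequential: one processor handles requests in arrival order.
sequential_results = [handle_request(q) for q in requests]

# Parallel: a pool of workers (standing in for cores) handles
# requests simultaneously; map() still returns results in order.
with concurrent.futures.ThreadPoolExecutor(max_workers=4) as pool:
    parallel_results = list(pool.map(handle_request, requests))

# Both approaches compute the same answers; parallelism changes how
# the work is spread across processors, not what is computed.
print(sequential_results == parallel_results)  # True
```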

Moseley said the team will release its code to the public so that it could potentially be adopted into computer systems.

As part of the funding, Moseley, Agrawal and Lee will teach parallelism to both undergraduate and graduate students and include several students in the research each summer. Moseley also is developing a distributed computing course.

In addition, Moseley has received a four-year, $250,000 grant from the NSF to develop an algorithmic foundation for using a new kind of memory called high-bandwidth memory (HBM).

This project stems from work Moseley did while a consultant for Sandia National Laboratories in Albuquerque, N.M., where he looked into a new type of memory that has much higher bandwidth than any previous memory and can be used in supercomputers or large-scale processors.

“Now that we have this high-bandwidth memory, how do we use it and how do we redesign our parallel programs to leverage this memory,” Moseley said. “Most of our interesting applications are big and have large memory requirements, and data is only getting bigger.”
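One flavor of the algorithmic question is deciding which data to keep in a small, fast HBM tier versus a large, slower DRAM tier. The following is a hypothetical greedy heuristic for illustration only, not the project's actual algorithms: place the data with the highest accesses-per-byte into HBM first.

```python
def place_in_hbm(arrays, hbm_capacity):
    """arrays: list of (name, size_bytes, access_count) tuples.
    Returns the set of array names assigned to the HBM tier."""
    # Rank by access density (accesses per byte), highest first.
    ranked = sorted(arrays, key=lambda a: a[2] / a[1], reverse=True)
    placed, used = set(), 0
    for name, size, _ in ranked:
        if used + size <= hbm_capacity:   # greedily fill fast memory
            placed.add(name)
            used += size
    return placed

arrays = [
    ("index", 100, 5000),   # small and hot: 50 accesses per byte
    ("matrix", 800, 8000),  # large and warm: 10 accesses per byte
    ("log", 400, 400),      # cold: 1 access per byte
]
# With 512 bytes of HBM, "index" fits, "matrix" is too large once
# "index" is placed, and "log" fills the remaining space.
print(place_in_hbm(arrays, hbm_capacity=512))
```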

Moseley said HBM redefines the classic memory hierarchy and the way computer scientists view memory and teach it to students.

“The research into high-bandwidth memory (HBM) has the potential for economic, technological and scientific impact because industry has an investment in this technology, and many of the nation’s strategic codes are run on HBM machines,” Moseley said.

This project seeks to develop an understanding of how to algorithmically design codes for HBM-enhanced architectures. The work will require new algorithms, models and abstractions designed by researchers who study hardware issues, high-performance computing (HPC) challenges, and theoretical modeling and analysis.


The School of Engineering & Applied Science at Washington University in St. Louis focuses intellectual efforts through a new convergence paradigm and builds on strengths, particularly as applied to medicine and health, energy and environment, entrepreneurship and security. With 94 tenured/tenure-track and 28 additional full-time faculty, 1,300 undergraduate students, 1,200 graduate students and 20,000 alumni, we are working to leverage our partnerships with academic and industry partners — across disciplines and across the world — to contribute to solving the greatest global challenges of the 21st century.