(This profile appeared in the spring 2023 issue of InSiDE, the magazine of the Gauss Centre for Supercomputing.)
When Dennis Hoppe was around six years old, his father brought home an IBM i386, a relatively dated computer at the time. To free up enough memory to play his favorite games, Hoppe had to learn how to enter “secret” MS-DOS commands. That early tinkering set off a long journey through generations of computer hardware.
“While growing up, I gradually went through every CPU generation: i486, Pentium, and the AMD Athlon are some processor generations that come to my mind. I was specifically fascinated by the AMD Athlon Thunderbird because one could ‘overclock’ them massively by using water cooling,” he said.
Experimenting with new generations of technology as a child was more important for Hoppe’s future than he realized at the time. Since 2019, he has been the Head of Service Management and Business Processes at the High-Performance Computing Center Stuttgart (HLRS). In that role, Hoppe identifies where users’ technology needs are going, helps ensure HLRS is addressing those needs through hardware and software investments, and then closely interacts with users to put them in the best position to succeed when using HLRS’s suite of computational resources.
That suite includes two GPU-accelerated cabinets that were added after the 2020 installation of HLRS’s flagship supercomputer, Hawk, to better serve the increased interest in artificial intelligence (AI) workflows. Between the rise of AI, growing interest in quantum computing technologies, and an increased emphasis on cloud-based high-performance computing (HPC), Hoppe is embracing a continuation of his childhood passion by learning everything he can about new technologies as they come online.
While all of these technologies have generated increased interest in recent years, perhaps none has captured society’s collective imagination and stoked its collective worries like AI. Since 2016, Hoppe has seen three distinct phases of AI adoption at HLRS. In the first, users were primarily interested in using classical HPC resources for large-scale data processing tasks. In the second, users’ interest in GPUs multiplied with the rise of deep learning applications. Now, in the third, contemporary phase, Hoppe sees an increasing number of users interested in combining AI with traditional HPC, developing hybrid workflows that can take advantage of each type of hardware. “This era is the most interesting one for us as an HPC center,” he explained, “because AI enables our experienced user base to experiment with these new methods and enrich their existing workflows on our classical infrastructure.”
Hoppe sees this hybrid workflow as the future of scientific computing, and with HLRS as part of the Gauss Centre for Supercomputing (GCS), he feels well-positioned to help guide old and new users alike in taking advantage of these methods. “GCS is in a comfortable position regarding these changes because we can offer our user bases unique, heterogeneous infrastructure,” he said. “Each of the three centers has its own research focuses, allowing us to address the needs of different research communities and industries.”
The staff diversity, spirit of collaboration, and dynamic work environment are Hoppe’s favorite aspects of working for HLRS. “For me, the biggest benefit, which is simultaneously the biggest challenge, is that every day is different,” he said. “This comes naturally, because we participate in many national and international research projects and interact with our users and stakeholders in many ways. Every day holds a new surprise—both welcome and occasionally unwelcome ones.”
That rapid pace of change both challenges and excites Hoppe. While his interest in the frontier of computer science was born from a youthful passion for manipulating hardware, his current fascination comes from the rapid advancements in software. He pointed to the first quarter of 2023 as an example: The seemingly overnight success of ChatGPT and its underlying GPT-4 large language model solved a problem that just a few years ago was among the hardest challenges facing AI researchers—generating human-like chat interactions with impressive accuracy and fluency. With researchers’ eyes now set on improving text-to-image models and even developing text-to-video models, among myriad other hopes and dreams for AI applications, Hoppe envisions more rapid growth in the years to come. “The current disruptive technologies in the relatively young research domain of AI are actually foundational models for the future,” he said.
Luckily for Hoppe, his lifelong experience has prepared him for this technological moment, and his position at HLRS gives him a front-row seat.
-Eric Gedenk