Researchers Use Computation to Scale Up Biochemical Processes


High-performance computing resources at HLRS help bioengineers to predict how laboratory results can be transferred to industrial conditions without loss of performance.

In the multidisciplinary field of bioengineering, scientists use principles from engineering to understand, utilize, and optimize biological processes and systems. This can include, for example, designing biological systems to synthesize new products, developing processes for large-scale applications of biotechnology, or even using biological systems as models for addressing complex human problems. Whether we know it or not, all of us use biotechnology products daily. Methods and technologies based in bioengineering could, for example, help to develop natural alternatives for wasteful or dirty industrial processes, including manufacturing plastics, creating food additives, making technical enzymes for washing powders, or producing pharmaceuticals, among others.

Biotechnology now makes it possible to alter certain parts of a bacterium’s DNA—its genetic code—to imbue it with traits that can address a given task. Biotech processes utilize renewable resources such as sugar for carbon supply, and specially engineered microbes often convert that sugar into products of interest. As living cells, they need optimal cultivation conditions to perform as they should.

How microbes, sugars, and other compounds react in a laboratory, though, can be much different than how they behave in industrial-scale bioreactors, which often exceed 500,000 liters.

A research team led by Professor Ralf Takors in the University of Stuttgart’s Institute for Biochemical Engineering (IBVT) has been tackling this problem. “As a society we want to develop sustainable processes for the production of biology-based goods, be they commodities, chemicals, or pharmaceuticals,” he said. “For this to be practical, however, you have to scale up the processes you develop in the lab to large-scale reactors. This is no easy task.”

Takors' multidisciplinary team has partnered with other academic and industrial researchers to investigate how methods from a field called computational fluid dynamics—a discipline often used to model aerodynamics or combustion processes, for example—could be used to simulate how biochemical reactions studied in the lab will occur at industrial scales. By developing detailed computational models, this approach could avoid the need to conduct expensive and time-consuming trial-and-error experiments that would otherwise be impractical.

Recently, the group used high-performance computing (HPC) resources at the High-Performance Computing Center Stuttgart (HLRS), one of the three centers comprising the Gauss Centre for Supercomputing (GCS), to model the behavior of Corynebacterium glutamicum, a widely used and well-understood bacterium employed in producing amino acids and food additives. The work, which demonstrates close collaboration between computational and experimental research methods, was published in the journal Biotechnology and Bioengineering.

Bringing small-scale reactions to the industrial scale

From selective breeding of wheat and corn plants to modern-day genetic engineering, humanity has long sought to improve upon naturally occurring biological processes to address its needs. Using bioengineering to increase production of useful materials is another stage in this technological evolution, though it presents some difficult challenges. “When things get into these large machines, we often observe non-optimal mixing conditions,” Takors explained. “Bacteria are living organisms, so in addition to reacting in ways that might be useful in an industrial process, they also initiate processes that are natural but are of no use to the application. This can mean that they underperform at an industrial scale when compared to the ideal conditions in the lab.”

In large reactors, bacteria are cultivated in a variety of substrates, or materials used to help grow microorganisms. Among other possible challenges, insufficient mixing creates substrate gradients that lead to situations where some reactants are used up too quickly, or reactions occur in certain areas of the reactor but not in others. In other cases, materials don't retain the same mixing characteristics—how soluble a substance is, for instance—when they are brought together in larger quantities. Even the shape of a reactor can influence the efficiency of chemical reactions.
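Why such gradients matter can be sketched with the standard Monod relation from bioprocess engineering, which links a cell's substrate uptake rate to the local substrate concentration. The function name and parameter values below are illustrative, not taken from the team's paper:

```python
def monod_uptake(s, q_max=0.5, k_s=0.1):
    """Specific glucose uptake rate (g/g/h) at local glucose
    concentration s (g/L), per the Monod relation.
    q_max and k_s are illustrative placeholder values."""
    return q_max * s / (k_s + s)

# Cells near the feed point see abundant glucose; cells in poorly
# mixed zones see almost none, so their uptake rate collapses.
for s in [5.0, 0.5, 0.05, 0.005]:
    print(f"glucose {s:>6.3f} g/L -> uptake {monod_uptake(s):.3f} g/g/h")
```

In a perfectly mixed lab flask every cell experiences the same concentration; in a large tank with gradients, the population averages over very different local conditions, which is one way scale-up performance degrades.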

Ultimately, this means that processes can produce less material than intended, make the quality of resulting materials worse than predicted, or in the worst cases, prevent the desired reaction from occurring at all. This loss of efficiency and quality at large scale results in higher expenses and can make it impractical to produce materials commercially, even when they hold the potential to address human needs or deliver environmental gains through cleaner manufacturing processes. Researchers even fear a ‘death of innovation’ in some research sectors if lab-scale results cannot be transferred to industrial scale after considerable effort, money, and time have been spent.

A feedback loop of simulation and experiment

In order to optimize these industrial processes, investigators in the Institute of Biochemical Engineering at the University of Stuttgart have had to optimize their respective research and development processes. Often, such studies are motivated by national and international industrial partners focused on chemical and biopharmaceutical products.

In their recent paper, Takors and his collaborators use C. glutamicum, an industrially important microbe used in manufacturing a variety of common food additives, such as the flavor enhancer monosodium glutamate (MSG). Because the bacterium has been so comprehensively studied, it is also a good model system for exploring how computational approaches could help ensure that reactions studied in the lab more reliably function at scale.

The team used data gathered from small-scale laboratory experiments involving C. glutamicum as the basis for computational fluid dynamics (CFD) simulations to predict how the bacterium performs in stirred tanks. The CFD simulations let the team model the mixing of glucose and oxygen throughout the tank: the reactor volume is divided into a fine-grained computational mesh, and the governing flow and transport equations are solved in each cell of the grid while the solution is advanced in small time steps.
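The grid-and-time-step idea can be sketched in one dimension, as a deliberately tiny stand-in for the team's actual three-dimensional CFD simulations; all variable names and parameter values here are assumptions for illustration:

```python
import numpy as np

# Divide the tank into a 1-D row of grid cells and track the glucose
# concentration in each one (a toy stand-in for a full 3-D CFD mesh).
n_cells = 50
glucose = np.zeros(n_cells)
glucose[0] = 10.0   # substrate arrives at one end, e.g. a feed pulse

D = 0.1             # effective mixing (diffusion) coefficient, illustrative
dt = 0.1            # small time step: the solution advances gradually

for step in range(2000):
    g = np.pad(glucose, 1, mode="edge")   # no-flux tank walls
    # Each cell exchanges material only with its immediate neighbors,
    # so the feed pulse spreads out gradually instead of mixing instantly.
    glucose += dt * D * (g[:-2] - 2.0 * g[1:-1] + g[2:])

print(f"near feed: {glucose[0]:.3f} g/L, far wall: {glucose[-1]:.6f} g/L")
```

Even after many time steps, cells far from the feed point still see much less glucose than cells near it, which is exactly the kind of gradient the simulations aim to predict. A finer grid resolves such gradients more sharply, at proportionally higher computational cost.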

Based on these simulations, engineers and biologists can then make slight modifications to the bacterium’s genetic code, the ratio of ingredients in the mixture, or even the geometry of the bioreactor. By alternating simulations and experiments in an iterative fashion, they gain a better understanding of how the materials will react before testing them in an expensive experiment in a half-million-liter bioreactor.

This process only works if there can actually be efficient iteration, though, and it is here that access to HLRS's HPC systems has played an important role. “One of these simulations takes our team almost a week with our institution’s computational resources, but we can do the same simulation in less than 24 hours at HLRS,” Takors said. Furthermore, the greater computing power means that the team is able to create more accurate simulations at higher resolution, an approach that minimizes the number of iterations necessary to optimize a process. “The way we do the simulation, we create computational meshes, or grids, to solve equations, and the finer the grid, the better the prediction,” Takors said.

The team’s recent paper demonstrated that its computational approach was capable of simulating large-scale industrial processes at the accuracy necessary to avoid messy trial-and-error work inside a large bioreactor. The team was able to simulate the interaction of 120,000 Corynebacterium glutamicum cells in realistic industrial mixer conditions. Takors and his collaborators are now exploring other CFD methods and plan to run them on larger numbers of compute cores in parallel to further optimize this approach.

Takors also indicated that his team has been an early adopter of this model of close collaboration and feedback between laboratory scientists, engineers, and computational scientists when it comes to bioeconomy applications. “We are at the forefront of this specific application, and are working to get our approach accepted and adopted by the broader community,” he said.

Eric Gedenk

(Article republished with permission of the Gauss Centre for Supercomputing.)

Related publication

Kuschel M, Takors R. 2020. Simulated oxygen and glucose gradients as a prerequisite for predicting industrial scale performance a priori. Biotechnol Bioeng. 117:2760–2770.