How Computational Modeling is Accelerating Industrial Innovation
Budgets and timelines once slowed innovation. Advanced computational modeling is removing barriers.
Historically, the high costs and time demands of industrial technology development have hindered innovation. Thankfully, such hurdles are being lowered—and quickly—due to progress in advanced computational and modeling tools.
This shift, years in the making and still picking up pace, is empowering engineers to test, modify, and commercialize important technologies far more efficiently, to the benefit of their employers and society at large. In this piece, we take a closer look at the historical arc of this evolution, examples of it at work, and the promise such change holds for our profession and some of the world's most pressing problems.
From hardware to modeling

The fundamental processes of engineering and problem-solving have always been consistent: design, validate, test, and improve. What has changed is the manner, speed, and expense at which those processes unfold.

A brief look back at the 1980s, when computational tools and supercomputers became broadly accessible, shows how this period of immense change and innovation found its footing. In 1981, the IBM PC debuted, and three years later Apple released its first Macintosh. At the same time, ARPANET, the precursor to the internet, began transitioning into a global network. These tools allowed engineers to begin modeling and testing designs without consuming physical materials.
Significant developments in compute, data, and algorithms have since opened an even brighter path. Advances in compute, the processing power of the physical hardware that performs calculations and processes data, are overcoming limits imposed by the end of Dennard scaling and the slowing of Moore's law. These innovations have directly driven the recent, and immense, progress of AI: the transition from general-purpose CPUs to more specialized processors, for example, has helped remove barriers to training AI models.
Greater compute power lets models execute algorithms and process data far faster. As that power increases, engineers can train models to process more information and perform more tasks with increasing efficiency, leading to faster innovation.
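To make the point concrete, here is a minimal Python sketch, purely illustrative and tied to no particular processor, of why hardware built for parallel arithmetic changes the economics of model workloads: the same dot product runs orders of magnitude faster when dispatched to vectorized routines than when executed one operation at a time.

```python
# Illustrative timing only: the gap between scalar and vectorized execution
# is the same gap that specialized processors exploit at far larger scale.
import time
import numpy as np

n = 1_000_000
a = np.random.rand(n)
b = np.random.rand(n)

# One multiply-add at a time, as a plain interpreted loop.
t0 = time.perf_counter()
total = 0.0
for i in range(n):
    total += a[i] * b[i]
loop_s = time.perf_counter() - t0

# The same arithmetic dispatched to optimized, parallel routines.
t0 = time.perf_counter()
total_vec = float(a @ b)
vec_s = time.perf_counter() - t0

print(f"loop: {loop_s:.3f} s  vectorized: {vec_s:.5f} s  speedup: ~{loop_s / vec_s:.0f}x")
```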
Semiconductors, energy, and desalination

These advancements are already transforming industries. The semiconductor industry is a prime example of how computational tools have let a field evolve more efficiently and with less strain on the environment. Each new generation of chips demands experimentation throughout the R&D process, and fabricating physical test devices consumes substantial resources; simulation and virtualization allow much of that experimentation to happen in software instead.

While companies traditionally concentrated testing in the early stages of the product development cycle, testing throughout the entire process yields a significantly better final product. Virtual experimentation has reduced emissions by minimizing the resources needed to develop and test chips, and it has uncovered manufacturing approaches that reduce emissions from the output itself.
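As a hedged illustration of what such virtual experimentation looks like (the surrogate model and all parameter values below are invented for this sketch, not drawn from any fab's data), an engineer can sweep a design space in software and screen candidates long before committing anything to silicon:

```python
# Hypothetical design sweep: screening candidates in software before any
# physical build. simulate() is a toy surrogate, not a real process model.
import itertools

def simulate(line_width_nm: float, supply_v: float) -> dict:
    """Toy surrogate: estimate switching delay and power for a candidate design."""
    delay_ps = 50.0 * line_width_nm / supply_v**2   # faster at higher voltage
    power_mw = 0.8 * supply_v**2 / line_width_nm    # but power rises with voltage squared
    return {"delay_ps": delay_ps, "power_mw": power_mw}

line_widths = [3, 5, 7]        # nm, candidate geometries
voltages = [0.65, 0.75, 0.85]  # V, candidate supply levels

# Nine virtual "experiments" run in milliseconds; fabricating nine physical
# variants could take months and considerable material.
results = []
for w, v in itertools.product(line_widths, voltages):
    r = simulate(w, v)
    results.append((w, v, r["delay_ps"], r["power_mw"]))

# Select the fastest design that stays within a power budget.
budget_mw = 0.15
feasible = [x for x in results if x[3] <= budget_mw]
best = min(feasible, key=lambda x: x[2])
print(f"best design under budget: width={best[0]} nm, V={best[1]} V, delay={best[2]:.0f} ps")
```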
At companies like Energy Recovery and Bloom Energy, it’s clear how these developments have enabled engineers to usher in a new era of innovation, ensuring efficiency, reliability, and safety in our products.
Bloom Energy designs and manufactures solid oxide fuel cells that enable the conversion of fuels like natural gas, biogas, and hydrogen into electricity without combustion. These cells are critical to the function of the Bloom Energy Server, a 24/7 source of onsite power used by tech companies, retailers, manufacturers, and more.

At Bloom Energy, computational modeling accelerated the product development cycle and provided insights that lowered costs while rapidly advancing the next generation of the Energy Server. In a high-temperature fuel cell, it is extremely challenging to field reliable instrumentation that measures internal temperatures and fuel distribution. With detailed 3D computational fluid dynamics (CFD) modeling and engineering insight, we were able to evaluate novel designs that pushed electrical conversion efficiency to world-class levels, perform root cause analysis (RCA) of failure modes, and enhance the reliability of the Energy Server. Real-time data collected through a private cloud helped reduce maintenance costs and greatly improved the customer experience through increased uptime.
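Where instrumentation cannot reach, even a simple model reveals behavior. The sketch below is a deliberately tiny 1D analogue of the kind of thermal insight full 3D CFD provides, with every parameter assumed for illustration rather than taken from Bloom's stack: it locates an interior hot spot that boundary thermocouples alone would miss.

```python
# A deliberately tiny 1D analogue of the thermal insight 3D CFD provides.
# All parameters are assumptions for illustration, not Bloom stack data.
import numpy as np

L = 0.10      # m, channel length (assumed)
n = 101       # grid points
k = 2.0       # W/(m*K), effective conductivity (assumed)
q = 5.0e4     # W/m^3, uniform volumetric heat release (assumed)
T_in, T_out = 900.0, 950.0  # deg C, boundary temperatures (assumed)

dx = L / (n - 1)
# Steady 1D conduction with generation: k * d2T/dx2 + q = 0, discretized as
# T[i-1] - 2*T[i] + T[i+1] = -q * dx^2 / k on interior points.
A = np.zeros((n, n))
b = np.full(n, -q * dx**2 / k)
A[0, 0] = A[-1, -1] = 1.0
b[0], b[-1] = T_in, T_out
for i in range(1, n - 1):
    A[i, i - 1], A[i, i], A[i, i + 1] = 1.0, -2.0, 1.0

T = np.linalg.solve(A, b)
# The hot spot sits inside the domain, where a thermocouple may never be placed.
print(f"peak temperature: {T.max():.1f} C at x = {np.argmax(T) * dx:.3f} m")
```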
This continuous improvement through modeling, together with machine learning on field data, led to the 2023 launch of a combined heat and power (CHP) solution, the Bloom Energy Server, which uses a high-temperature exhaust stream for industrial steam production and absorption chilling. At the time of the announcement, about 50 percent of global industrial energy use went to steam generation, underscoring how valuable it is to supply that energy while reducing the carbon emissions of producing it.
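The underlying arithmetic of exhaust-heat recovery is straightforward. The figures in this back-of-envelope Python sketch are assumptions chosen for illustration, not Bloom's published performance data:

```python
# Back-of-envelope energy balance for recovering exhaust heat as process steam.
# Every figure below is an assumption for illustration, not published data.
m_dot = 1.2        # kg/s, exhaust mass flow (assumed)
cp = 1100.0        # J/(kg*K), exhaust specific heat (assumed)
T_exhaust = 400.0  # deg C, exhaust entering the heat-recovery unit (assumed)
T_stack = 150.0    # deg C, exhaust leaving the unit (assumed)
eff = 0.85         # heat-exchanger effectiveness (assumed)

q_recovered = eff * m_dot * cp * (T_exhaust - T_stack)  # W
h_fg = 2.1e6       # J/kg, approximate latent heat of steam at process pressure

steam_kg_per_h = q_recovered / h_fg * 3600
print(f"recovered heat: {q_recovered / 1e3:.0f} kW -> roughly {steam_kg_per_h:.0f} kg/h of steam")
```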
These same principles are being applied at Energy Recovery, which designs and manufactures high-performance solutions that generate cost savings, increase energy efficiency, and reduce carbon emissions across the desalination, carbon dioxide refrigeration, and wastewater industries. The company's core technology, the Pressure Exchanger (PX), is at the heart of many of its products; it efficiently captures and transfers pressure energy, making commercial and industrial processes more efficient.
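The value of that pressure transfer can be estimated from hydraulic power alone. In this sketch the flow, pressure, and efficiency figures are illustrative assumptions, not Energy Recovery's specifications:

```python
# Rough estimate of the pumping energy a pressure exchanger can offset in
# seawater reverse osmosis. Flow, pressure, and efficiency are assumptions.
brine_flow = 0.05       # m^3/s of high-pressure reject brine (assumed)
brine_pressure = 60e5   # Pa (~60 bar) in the reject stream (assumed)
px_efficiency = 0.97    # fraction of pressure energy transferred (assumed)

# Hydraulic power the PX returns to the incoming seawater instead of wasting:
recovered_kw = px_efficiency * brine_flow * brine_pressure / 1e3
hours_per_year = 8760   # continuous operation
print(f"~{recovered_kw:.0f} kW recovered -> "
      f"~{recovered_kw * hours_per_year / 1e3:.0f} MWh/yr less high-pressure pumping")
```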
Over the last several years, Energy Recovery has made significant strides in advancing this core technology, in large part thanks to computational modeling, which allows the company to produce, maintain the quality of, and test new iterations of the PX. Because Energy Recovery operates in the water business (a resource vital for health, sanitation, industry, and recreation), unexpected downtime in the desalination industry is highly disruptive and treated as an emergency by those who rely on a plant's water production. That makes the ability to test and model products at every phase of R&D invaluable: it helps the plant avoid downtime while maximizing energy savings.
The use of the PX in seawater reverse osmosis (SWRO) desalination is particularly challenging, as the device operates continuously in high-pressure, highly corrosive seawater and brine. Through modeling and testing, Energy Recovery's engineers identified materials and a design that deliver world-class pressure-energy conversion efficiency in SWRO while ensuring reliability and an extended design life. For example, PX devices have a single moving part, a ceramic rotor, surrounded by a close-fitting ceramic sleeve and ceramic end covers. The company can now measure rotor/stator gaps, roundness, parallelism, and perpendicularity on every ceramic cartridge to a precision of fractions of a micron.
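Such form measurements reduce to well-known geometry. The following sketch shows one generic way to compute roundness from probe points via a least-squares circle fit (the Kasa method); both the method choice and the simulated trace are illustrative, not Energy Recovery's metrology procedure.

```python
# Generic roundness check from probe points: fit a least-squares circle,
# then report the width of the radial deviation band.
import numpy as np

def roundness(points: np.ndarray) -> float:
    """Least-squares circle fit (Kasa method); roundness = max radial deviation band."""
    x, y = points[:, 0], points[:, 1]
    # Solve x^2 + y^2 = 2*a*x + 2*b*y + c for the center (a, b).
    A = np.column_stack([2 * x, 2 * y, np.ones_like(x)])
    (a, b, c), *_ = np.linalg.lstsq(A, x**2 + y**2, rcond=None)
    r = np.hypot(x - a, y - b)         # radial distance of each probe point
    return float(r.max() - r.min())    # width of the deviation band

# Simulated probe trace: nominal 50 mm radius with a 0.2 um three-lobe form error.
theta = np.linspace(0, 2 * np.pi, 360, endpoint=False)
r_true = 0.050 + 2e-7 * np.sin(3 * theta)
pts = np.column_stack([r_true * np.cos(theta), r_true * np.sin(theta)])

print(f"roundness: {roundness(pts) * 1e6:.2f} um")   # ~0.40 um band
```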
These efforts have paid off. A recent Energy Recovery test found that the PX used in SWRO applications has a longer design life than previously thought, extending it from 25 to 30 years. The PX is not just an example of cutting-edge technology; it is a standard for reliability.
The AI-powered future of engineering

Computational modeling is faster and more capital-efficient than physical prototyping, accelerating the leap from concept to reality. Today, AI has brought computational modeling much closer to reproducing the conditions of real-life testing in a lab.
AI-powered simulations can now replicate complex physical processes with high fidelity, reducing the need for costly and time-consuming physical prototypes. Whether modeling fluid dynamics in pipelines, thermal behavior in energy systems, or structural integrity in manufacturing components, engineers can run thousands of digital experiments rapidly and in parallel. This increased testing frequency helps identify weaknesses, optimize performance, and shorten development cycles—all before a single physical component is built.
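The "thousands of digital experiments in parallel" pattern is simple to express in code. In this Python sketch, run_experiment() is a stand-in function invented for illustration; a real workflow would invoke a solver in its place.

```python
# Running many independent simulations in parallel, the pattern behind the
# "thousands of digital experiments" described above.
from concurrent.futures import ProcessPoolExecutor
import random

def run_experiment(seed: int) -> float:
    """Stand-in for one expensive simulation run; returns a performance score."""
    rng = random.Random(seed)
    blade_angle = rng.uniform(10.0, 40.0)   # degrees, a made-up design variable
    clearance = rng.uniform(0.1, 0.5)       # mm, another made-up variable
    # ... a real solver (CFD, FEA, system model) would be invoked here ...
    return 100.0 - abs(blade_angle - 27.0) - 20.0 * clearance

if __name__ == "__main__":
    with ProcessPoolExecutor() as pool:
        scores = list(pool.map(run_experiment, range(1000)))
    print(f"best of 1000 virtual experiments: {max(scores):.2f}")
```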
AI can then quickly interpret the resulting findings, allowing engineers to troubleshoot issues more rapidly by detecting patterns in data in real time. For systems expected to run continuously, such as equipment in desalination plants or wastewater treatment facilities, this is invaluable.
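A minimal example of that kind of pattern detection, assuming a single pressure-like sensor stream and a simple rolling statistic (production systems would use far richer models):

```python
# One simple way to flag anomalies in a continuous sensor stream. Real
# deployments use richer models, but the pattern-detection idea is the same.
from collections import deque
import math

def make_detector(window: int = 200, threshold: float = 4.0):
    """Return a function that flags readings far outside the recent operating band."""
    history = deque(maxlen=window)

    def check(x: float) -> bool:
        flagged = False
        if len(history) >= 30:  # wait for a baseline before judging
            mean = sum(history) / len(history)
            var = sum((v - mean) ** 2 for v in history) / len(history)
            flagged = abs(x - mean) / (math.sqrt(var) + 1e-9) > threshold
        history.append(x)
        return flagged

    return check

check = make_detector()
# Steady pressure-like readings, then a sudden excursion at the end:
readings = [60.0 + 0.1 * math.sin(i / 5) for i in range(300)] + [68.0]
flags = [check(r) for r in readings]
print("first anomaly at reading index:", flags.index(True))  # -> 300
```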
Moreover, machine learning models trained on historical performance data can predict system behavior under a wide range of conditions, including rare or extreme scenarios. Engineers can now use real-time sensor inputs and predictive analytics to virtually stress-test equipment, anticipate failures, and refine designs with greater accuracy. This not only improves reliability and safety but also enables iterative improvements throughout a project’s life cycle.
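As a sketch of that idea, the model below learns from synthetic "historical" sensor records and is then queried at a rare corner of the operating envelope. The variables, units, and underlying relationship are all invented for illustration, and a real effort would validate any such surrogate against physical tests.

```python
# Sketch of a data-driven surrogate: learn system behavior from historical
# operating records, then query it at rarely seen conditions.
# All variables, units, and the underlying relationship are invented here.
import numpy as np
from sklearn.ensemble import RandomForestRegressor

rng = np.random.default_rng(0)

# Synthetic "history": (pressure bar, temperature C, flow m3/h) -> vibration mm/s
X = rng.uniform([40, 15, 100], [70, 35, 300], size=(5000, 3))
y = 0.5 + 0.02 * X[:, 0] + 0.01 * X[:, 1] + 0.003 * X[:, 2] + rng.normal(0, 0.05, 5000)

model = RandomForestRegressor(n_estimators=100, random_state=0).fit(X, y)

# Virtual stress test at a rare corner of the operating envelope. (Tree models
# extrapolate poorly, so a real workflow would validate beyond this envelope.)
extreme = np.array([[69.5, 34.5, 295.0]])
print(f"predicted vibration under extreme load: {model.predict(extreme)[0]:.2f} mm/s")
```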
Nvidia’s Jensen Huang believes that AI could drive a “million-fold” increase in computing power over the next decade. As computing power grows and AI models become more sophisticated, simulation and modeling are evolving from passive design tools into proactive, adaptive systems that continually learn and improve over time.
As engineers leverage these tools to improve and innovate products, and as the tools themselves grow more powerful, we will reach discoveries we might never have made without them.
Ram Ramanan is chief technology officer of Energy Recovery, an energy efficiency technology company headquartered in the San Francisco Bay Area.