
Metrics for Sustainable Technology Innovation

In today's rapidly evolving technological landscape, measuring performance is essential for sustainable technology innovation. Applying science and engineering principles to new problems drives progress, but that progress must also deliver technical and economic viability along with more sustainable solutions to today's problems. Metrics provide a structured way to track these criteria, validate new technology against the competition, and report to key stakeholders. They help us evaluate reaction and separation tradeoffs, optimize reactor systems, and ensure that our processes are both economical and sustainable. Some examples of key metrics are discussed below.

Reactors are the heart of any chemical, biological, or electrochemical process, and scale-up of new reactor systems has its own subset of critical metrics. By rapidly transitioning from small-scale to pilot-scale reactors, we can gather valuable data that can be used to scale directly to commercial production. Key design metrics such as the volumetric mass transfer coefficient (kLa) and weight hourly space velocity (WHSV) are essential for optimizing reactor performance and ensuring economic viability, and they can be used as scale-independent parameters to track performance at any scale. For instance, in many gas/liquid reaction systems, including gas fermentation systems, kLa is evaluated to set minimum targets for commercial design, ensuring that the rate of mass transfer is sufficient for economic viability. Similarly, WHSV, the mass of feed processed per unit mass of catalyst per hour, is a key measure for sizing the catalyst bed and tracking reactor performance.
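As a minimal illustration of how these scale-independent metrics can be tracked, the Python sketch below computes WHSV from feed rate and catalyst mass and checks a measured kLa against a minimum target; all numerical values are illustrative assumptions, not design data.

```python
# Minimal sketch: tracking scale-independent reactor metrics.
# All numbers below are illustrative assumptions, not design data.

def whsv(feed_mass_rate_kg_per_h: float, catalyst_mass_kg: float) -> float:
    """Weight hourly space velocity: mass feed rate per mass of catalyst (1/h)."""
    return feed_mass_rate_kg_per_h / catalyst_mass_kg

def meets_kla_target(kla_per_h: float, target_kla_per_h: float) -> bool:
    """Check a measured volumetric mass transfer coefficient against a minimum target."""
    return kla_per_h >= target_kla_per_h

# The same WHSV can be compared directly across lab and pilot scales.
lab_whsv = whsv(feed_mass_rate_kg_per_h=0.05, catalyst_mass_kg=0.025)   # 2.0 1/h
pilot_whsv = whsv(feed_mass_rate_kg_per_h=1.5, catalyst_mass_kg=0.75)   # 2.0 1/h
print(f"Lab WHSV = {lab_whsv:.2f} 1/h, pilot WHSV = {pilot_whsv:.2f} 1/h")
print("Pilot kLa meets target:", meets_kla_target(kla_per_h=250.0, target_kla_per_h=200.0))
```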

Metrics used for design, scale-up, and operation of reactor systems in thermochemical and bioprocessing systems can include parameters such as kLa, WHSV, and the Reynolds number, as illustrated in the case studies below.

Case Studies and Examples

To illustrate the practical application of these concepts, let's look at a few case studies:

  1. Sustainable bio-based base oils: Base oils are typically produced as a petroleum fraction and are used to make lubricants, greases, and other heavy oils for industrial use. In this example, metrics for reactor scale-up were applied to develop sustainable bio-based replacements for petrochemicals. The key scale-up metric was the Reynolds number, a dimensionless parameter that characterizes the flow of both the liquid phase and the gas phase over the solid catalyst. Standard guidelines for Reynolds number correlations in trickle bed reactor systems allowed us to identify reactor conditions in the regime known to promote sufficient mass transfer for good performance.
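As a simple sketch of the kind of calculation involved, the snippet below evaluates particle Reynolds numbers (Re = rho * u * dp / mu) for the liquid and gas phases in a trickle bed; the property values, velocities, and particle size are illustrative assumptions, and the actual regime limits come from the design correlations referenced above.

```python
# Minimal sketch: particle Reynolds numbers for the liquid and gas phases of a
# trickle bed reactor.  Property values are illustrative assumptions; the target
# flow regime comes from the trickle bed design correlations being used.

def particle_reynolds(density_kg_m3: float, superficial_velocity_m_s: float,
                      particle_diameter_m: float, viscosity_pa_s: float) -> float:
    """Re = rho * u * d_p / mu, based on superficial velocity and particle diameter."""
    return density_kg_m3 * superficial_velocity_m_s * particle_diameter_m / viscosity_pa_s

re_liquid = particle_reynolds(density_kg_m3=850.0, superficial_velocity_m_s=0.005,
                              particle_diameter_m=0.003, viscosity_pa_s=2.0e-3)
re_gas = particle_reynolds(density_kg_m3=8.0, superficial_velocity_m_s=0.05,
                           particle_diameter_m=0.003, viscosity_pa_s=1.5e-5)
print(f"Liquid-phase Re = {re_liquid:.1f}, gas-phase Re = {re_gas:.1f}")
```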



2. Biobased monomer production: Another example involves the production of a biobased monomer. The process rapidly transitioned from a 50 mL lab reactor to a 1500 mL pilot reactor, a 30x scale-up, and the pilot results could then be used to scale directly to commercial production. In this case, the key scale-up parameter was weight hourly space velocity (WHSV), a parameter that determines the catalyst quantity needed for a given reactor performance. A typical scale-up plan for this type of problem lays out the target WHSV and the corresponding catalyst quantity and feed rate at each scale.
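The sketch below illustrates constant-WHSV sizing across the lab and pilot scales from this example; the target WHSV and the catalyst bulk density are assumed values for illustration only.

```python
# Minimal sketch: holding WHSV constant across a 50 mL -> 1500 mL scale-up.
# The target WHSV and catalyst bulk density are illustrative assumptions.

whsv_target_per_h = 2.0     # 1/h, held constant across scales (assumed)
bulk_density_kg_m3 = 700.0  # catalyst bulk density (assumed)

for name, bed_volume_ml in [("lab (50 mL)", 50.0), ("pilot (1500 mL)", 1500.0)]:
    catalyst_mass_kg = bulk_density_kg_m3 * bed_volume_ml * 1e-6
    feed_rate_kg_per_h = whsv_target_per_h * catalyst_mass_kg  # feed rate at constant WHSV
    print(f"{name}: {catalyst_mass_kg*1000:.0f} g catalyst, "
          f"feed rate {feed_rate_kg_per_h:.2f} kg/h")
```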

3. Novel bioreactor design: In this case, a novel bioreactor was designed for gas fermentation. The key design metric, the volumetric mass transfer coefficient (kLa), was used to set performance targets and design the equipment. Published correlations were used to evaluate mass transfer performance versus the superficial velocity of the gas phase through the reactor system. The minimum mass transfer performance required for an economic process then determined the reactor design parameters.
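As a sketch of that evaluation, the snippet below uses a generic power-law form, kLa = a * u_g^b, to screen superficial gas velocities against a minimum kLa target; the coefficients and the target are placeholders rather than values from any specific published correlation.

```python
# Minimal sketch: screening superficial gas velocity against a minimum kLa target
# using a generic power-law correlation kLa = a * u_g**b.  The coefficients a, b
# and the target value are placeholders, not values from a published correlation.

def kla_from_superficial_velocity(u_g_m_s: float, a: float = 0.3, b: float = 0.7) -> float:
    """Volumetric mass transfer coefficient (1/s) vs superficial gas velocity (m/s)."""
    return a * u_g_m_s ** b

kla_target_per_s = 0.05  # assumed minimum kLa for an economic design

for u_g in (0.02, 0.05, 0.10, 0.20):
    kla = kla_from_superficial_velocity(u_g)
    status = "meets target" if kla >= kla_target_per_s else "below target"
    print(f"u_g = {u_g:.2f} m/s -> kLa = {kla:.3f} 1/s ({status})")
```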

In conclusion, metrics play a pivotal role in the successful design, scale-up, and optimization of sustainable technologies. The case studies presented in this document illustrate the practical application of these concepts, demonstrating how metrics can guide decision-making and ensure that our processes are efficient, economical and sustainable.

Navigating the Bumpy Road of Industrial Biotechnology Scale-up

We have seen a growth in products from Industrial Biotechnology, with commercial technologies emerging in areas such as:

  • Fuels. Sustainable Aviation Fuel, Green Diesel, and Ethanol from low-cost feedstocks

  • Chemicals. Drop-in replacements for industrial chemicals such as propanediol and butanediol, made through biological routes instead of conventional petroleum-based options

  • Alternative routes for proteins, fats, and meat

  • Materials for building projects, fabrics, electronics

The drivers for this growth include a focus on sustainability, and a drive to enable circularity through reuse of carbon and carbon-based products. Some technologies offer the potential to make use of a lower cost source of carbon, through use of waste feedstocks such as industrial gaseous emissions, biogas, end of life plastic, and waste biomass. In addition, in some cases the bioproduct is a better product than the petroleum-based version, coming with a cheaper, safer processing route and performance advantages over traditional materials.

The road we travel while commercializing new technologies like these is often bumpy, with many challenges along the way. In order to be successful, we must address these challenges while also: 1) reducing technology risk, 2) reducing time to market, 3) optimizing/minimizing cost, and 4) maximizing value.

These are often competing objectives, and usually reducing time to market and reducing risk win out. Of course, if the capital and operating cost are too high, then a new technology will not be successful, so these criteria cannot be ignored.

We can follow guidelines and best practices for the scale-up and design of industrial bioprocessing technology to effectively de-risk and optimize a new industrial biotechnology along the way. These guidelines include elements such as:

Creative Process Engineering: The flow scheme is developed, the material balance is estimated, and key process design decisions are identified to establish the best process flowsheet for the technology.

Modeling & Analysis: A good model can save time and resources in the lab. Coupled with the right analysis, the scale-up team can prioritize objectives in the lab, pilot, and demo units.

Experimental Data: The right data is needed to prove out breakthrough ideas, secure partners and investors, and develop engineering data for equipment design. Multi-scale data is critical to this effort, and with good planning, multiple assets and external resources can be leveraged.

The key benefits of this approach are:

  • Prioritization of R&D to de-risk and optimize the new technology.

  • Identification of cost reduction opportunities throughout the scale-up effort.

  • Anticipation of process design needs as early as possible.

While scale-up of new sustainable technology, in particular industrial biotechnology, is hard, it is not impossible! The opportunities are great, and with the right approach we will see many more success stories in the future.

Growing the Bioeconomy with Gas Fermentation

Gas fermentation is a novel industrial biotechnology that can contribute to the growth of the bioeconomy by using low-cost, readily available carbon sources such as methane, carbon monoxide, and carbon dioxide to produce a variety of fuel, chemical, and food products, such as ethanol, ethylene, triglycerides, proteins, and polyesters.

 

Figure 1:  Gas Fermentation Landscape

 

Gas fermentation has advantages over conventional processing routes, including:

·       lower cost operating conditions

·       robustness to fluctuations in feed rate and composition

·       tolerance to contaminants in the gaseous feeds. 

 

We can look at two classes of gas fermentation. The first involves direct conversion of CO2 through gas fermentation, typically with hydrogen and/or oxygen as co-feeds. A diverse array of products can be produced through these routes, including triglycerides, which can be used for food, materials, and fuel applications. Chemicals such as acetic acid and ethylene are also viable through these routes, along with single cell proteins for animal feed or other alternative protein applications.

 

We can also consider gas fermentation routes that convert CO2 precursors, such as carbon monoxide or methane, into useful products. In this case, possible products include chemicals such as ethanol, methanol, or isopropanol, and polymers such as polyhydroxyalkanoate (PHA).

 

However, a key challenge with gas fermentation is the design of a cost-effective reactor system with high mass transfer coefficients for the gaseous feedstocks into an aqueous medium. A number of reactor types have been proposed to overcome this challenge, from simple bubble columns to more sophisticated airlift and external loop reactors. These reactor types involve tradeoffs between mass transfer and design complexity, and it is important to identify the best option for a particular gas fermentation application.
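To make the mass transfer challenge concrete, the sketch below evaluates the volumetric gas-to-liquid transfer rate the reactor must deliver, R = kLa * (C* - C_L); the kLa and concentration values are illustrative assumptions for a sparingly soluble feed gas such as CO or H2.

```python
# Minimal sketch: the gas-to-liquid transfer rate per unit liquid volume,
# R = kLa * (C* - C_L), where C* is the saturation concentration of the feed gas.
# All numbers are illustrative assumptions for a sparingly soluble gas.

def transfer_rate(kla_per_s: float, c_star_mol_m3: float, c_liquid_mol_m3: float) -> float:
    """Volumetric gas-liquid transfer rate in mol/(m3*s)."""
    return kla_per_s * (c_star_mol_m3 - c_liquid_mol_m3)

# Low gas solubility means a small driving force, so a high kLa is needed to
# sustain a useful uptake rate by the microbes.
r = transfer_rate(kla_per_s=0.10, c_star_mol_m3=0.8, c_liquid_mol_m3=0.1)
print(f"Transfer rate = {r:.3f} mol/(m3*s) = {r*3600:.0f} mol/(m3*h)")
```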

Figure 2:  Mass Transfer Challenge

 

In addition, with any biological or chemical process it is important to look beyond the reactor system and consider the integration of unit operations both upstream and downstream of the reactor in order to optimize the process as a whole. For gas fermentation, we may need to weigh the cost of compression or gas cleanup against potential performance benefits in the reactor system. Similarly, we need to consider the design and performance of the product recovery section. Ultimately, we want to optimize the process, not just the reactor system.
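As a sketch of one side of that tradeoff, the snippet below estimates the ideal-gas adiabatic power needed to compress the feed gas to higher reactor pressures (higher pressure improves gas solubility and the mass transfer driving force); the flow rate, temperature, heat capacity ratio, and efficiency are illustrative assumptions.

```python
# Minimal sketch: single-stage ideal-gas adiabatic compression power for the feed
# gas, one side of the compression-cost vs reactor-performance tradeoff.
# All parameter values are illustrative assumptions.

R = 8.314          # J/(mol*K)
GAMMA = 1.4        # heat capacity ratio (assumed)
EFFICIENCY = 0.75  # isentropic efficiency (assumed)

def compression_power_kw(molar_flow_mol_s: float, t_in_k: float,
                         p_in_bar: float, p_out_bar: float) -> float:
    """Ideal-gas adiabatic compression power (kW) for a single stage."""
    exponent = (GAMMA - 1.0) / GAMMA
    work_per_mol = (GAMMA / (GAMMA - 1.0)) * R * t_in_k * (
        (p_out_bar / p_in_bar) ** exponent - 1.0)
    return molar_flow_mol_s * work_per_mol / EFFICIENCY / 1000.0

for p_out in (2.0, 4.0, 8.0):
    power = compression_power_kw(molar_flow_mol_s=100.0, t_in_k=300.0,
                                 p_in_bar=1.0, p_out_bar=p_out)
    print(f"1 bar -> {p_out:.0f} bar: approx. {power:.0f} kW")
```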

Figure 3: Process Integration Challenge

Additional challenges must be addressed when scaling and commercializing gas fermentation technology.  These include:

·       Lack of established data and models.  Compared to petrochemical reaction chemistry, the availability of data and reactor design models is quite limited. 

·       New equipment to be designed and constructed, such as custom fermenters.

·       New separation challenges.  Recovery of extracellular products such as ethanol or acetic acid from the fermentation broth, or recovery of intracellular products.

·       New optimization criteria.  Carbon footprint and ESG/LCA metrics in addition to traditional optimization metrics such as operating cost and capital cost.

·       New microbial catalysts.  As gas fermentation becomes a more mature and broadly deployed technology, methods for manufacturing and distribution of commercial scale quantities of these catalysts will be required. 

 

As gas fermentation becomes more mature and we see more commercial applications, future developments will enable greater scale, reduced production costs, and new products. Key opportunities include:

·       Microbial modeling, including bacterial growth kinetics and flux models. By bringing a more analytical approach to our gas fermentation systems, we can enhance understanding of the biological reactor systems and develop custom reactor designs for specific microbial systems (a minimal growth-kinetics sketch follows this list).

·       Strain development to reduce byproduct formation, increase contaminant tolerance, and enable more extreme operating conditions (higher temperature, for example).

·       Reactor design and scaleup, to develop reactor systems that can enhance mass transfer while balancing constraints around capital and operating costs. 

·       New or improved approaches for product recovery to reduce the cost and complexity of product separation and purification. 
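As referenced in the first bullet above, here is a minimal sketch of a simple growth-kinetics building block (a Monod rate expression) of the kind a microbial or flux model might start from; the parameter values are illustrative assumptions, not data for any specific gas-fermenting organism.

```python
# Minimal sketch: Monod growth kinetics, a simple building block for microbial
# modeling.  Parameter values are illustrative assumptions only.

def monod_growth_rate(substrate_mol_m3: float, mu_max_per_h: float = 0.25,
                      k_s_mol_m3: float = 0.05) -> float:
    """Specific growth rate (1/h) as a function of the limiting substrate concentration."""
    return mu_max_per_h * substrate_mol_m3 / (k_s_mol_m3 + substrate_mol_m3)

# In gas fermentation the dissolved gas set by mass transfer is often the limiting
# substrate, which couples the biology back to the kLa of the reactor design.
for s in (0.01, 0.05, 0.2, 1.0):
    print(f"S = {s:.2f} mol/m3 -> mu = {monod_growth_rate(s):.3f} 1/h")
```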

 

The future is bright for this exciting technology area. Gas fermentation will play a key role in the growth of the industrial bioeconomy in the coming decades. 

Puzzled about scaleup? Multi-scale data is the key

 

Experimental data is[1] clearly the lifeblood of any new technology.  Getting data to prove out an invention can be the key to obtaining an important patent, generating early stage investment, and securing key partnerships.  Earlier postings established the links between experimental data and creative process engineering as well as robust, useful models.  However, generating data is expensive and time consuming, particularly as scale increases, making it critical to ensure that the right data is generated to make the best use of available resources. 

I like to start by looking at the scale-up effort as one integrated data gathering exercise, with the overall goal of generating the necessary data to define the commercial process design. Along the way, data is also needed to demonstrate a reduction in technical risk and allow optimization of the process economics. This is a bit of a different mindset from trying to prove out a ‘result’ at each scale (e.g. proving conversion of raw materials A and B into product C with desired efficiency X in the lab, then the lab-pilot, then the pilot, and finally the demo). So rather than charging ahead in result-proving mode, some up-front planning can ensure that the right data is gathered. After all, all data are equal, but some are more equal than others (with apologies to George Orwell…)[2]

This planning effort will yield a scale-up plan with experiments designed to generate the necessary design data and identify the parameters that have the greatest impact on economics and technical risk.  In fact, the product of this effort is data, more than a physical fuel, chemical, or nutrition product. 

A key part of this early-stage planning is decoupling these parameters, understanding that ‘science parameters’ such as reaction kinetics and separation factors can, and should, be explored at the lab stage. Conversely, a lab-scale effort to evaluate issues related to heat and mass transfer or pressure drop will be futile at best, leading to inconclusive or even incorrect results, and such work is best done at a larger scale. This decoupling is illustrated in the following table:

Table: scale-up parameters and where to evaluate them

·       Chemical/biological (‘science’) parameters (reaction kinetics, separation factors): best explored at lab scale

·       Engineering parameters (heat and mass transfer, pressure drop): best evaluated at pilot and demo scale

Multi-scale data is beneficial for many additional reasons:

·       Model development. Data at multiple scales enables generation of robust models for process development and equipment design.  

·       Troubleshooting.  The smaller lab and pilot rigs can be instrumental in troubleshooting challenges in the larger units.  If possible, it is worth the investment to keep these smaller units operating in support of the larger scale operations.

·       Continuous improvement.  Continuous improvement is often needed while scaling a new technology to meet aggressive timelines and cost targets.  These improvements can be identified and scaled in parallel, so that the first commercial unit has the benefit of learnings from several generations of technology improvements identified and de-risked in multi-scale operations.

By bringing Experimental Data together with Modeling and Analysis and Creative Process Engineering, we develop a process concept and an overall approach to reduce the time, cost, and risk of scale-up.

Process concept to reduce the time, cost and risk of scale-up

[1] I used to make sure I strictly used ‘data’ as a plural noun as the OED intended, but decided a while ago that this is somewhat cumbersome, and perhaps even a bit pretentious.  I don’t think I am alone in this shift but am not sure the official definitions have caught up yet. 

[2] Original Quote: “All animals are equal, but some are more equal than others”, George Orwell, Animal Farm

Creative Process Engineering


In my introductory article on this topic (Practical Technology Scaleup) I wrote about the benefit of drawing on Creative Process Engineering, Modeling & Analysis, and Experimental Data to develop a solid Process Concept to drive the scale-up effort--reducing risk and optimizing the economics of a sustainable technology.  

Creative Engineering, like Creative Accounting, may be an oxymoron or have negative connotations, but in my experience it is critical for first-of-its-kind technology.  Creative process engineers understand commercial plant design and can also deal with the ambiguity that is common with any new technology.  This creativity enables the engineers, working closely with the science experts, to develop the process concept, establish the material balance, and make key process design decisions to set the framework for the evolving novel technology.  These decisions fall into categories such as:

·       Product Requirements.  Product quality.  Waste vs Byproduct.  Batch vs Continuous.

·       Catalyst:  Composition.  Biological vs Thermochemical.  Size/shape.  Heterogeneous vs Homogeneous. 

·       Major Unit Operations.  Reactor concept.   Feedstock processing.  Separation processes.

·       Major Equipment.  Standard or Custom.  Pump/Exchanger/Compressor type.

·       Design Conditions.  Temperature.  Pressure.  Product Specifications. 

The challenge of translating discoveries from the lab into viable process flowsheets has been described by Douglas[i] as requiring assumptions 1) that fix parts of the process flowsheet, 2) that fix some of the design variables, and 3) that fix the connections to the environment.  Douglas estimates that more than one million process flowsheets can be generated just from the varied assumptions associated with the first process flowsheet.  Clearly it is not feasible to evaluate all of these alternatives.  The good news is that we can just as quickly reduce the number of alternatives to a more manageable number, but we need good engineering judgement to make decisions with relatively little information.  This is where the Creative aspect of Process Engineering is critical. 
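To make the combinatorial point concrete, here is a toy enumeration in the spirit of the decision categories listed above; the specific decisions and option counts are hypothetical placeholders, and adding a few more decisions or alternatives quickly pushes the count past the million flowsheets Douglas describes.

```python
# Toy illustration of Douglas's point: a handful of early design decisions, each
# with a few alternatives, multiplies into a huge number of candidate flowsheets.
# The decisions and option counts below are hypothetical placeholders.
from math import prod

design_decisions = {
    "batch vs continuous": 2,
    "reactor concept": 5,
    "catalyst type": 4,
    "feedstock processing": 3,
    "separation scheme": 6,
    "product specification": 3,
    "operating pressure level": 4,
    "heat integration option": 4,
}

n_flowsheets = prod(design_decisions.values())
print(f"{len(design_decisions)} decisions -> {n_flowsheets:,} candidate flowsheets")
# A few more decisions (recycle structure, solvent choice, ...) push this well past
# one million, which is why engineering judgement is needed to screen alternatives early.
```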

In practice, I find it is best to identify the reactor concept and separation scheme that are the best options, and then build the flow scheme around these.  Often this is a case of screening out the ‘bad options’, resulting in several process concepts that make sense.  We can then define the data needed from Experimentation and Modeling & Analysis to refine our choices to the best option.  These areas will be explored in future articles. 

[i] Douglas, J.M. “A Hierarchical Decision Procedure for Process Synthesis”, AIChE Journal, March 1985, Vol. 31. No 3, pp 353-362. 

Practical Technology Scaleup

The Key to Launching Sustainable Technology

Sustainable Technology Scaleup Concept

We are in the midst of a global crisis, with the need to reduce carbon across all industries in order to limit global warming to 1.5 °C above pre-industrial levels, as established in The Paris Agreement[1].  This drives a need for breakthrough technologies that can both reduce carbon and create value.  We can draw on past experience to reduce the time, cost, and risk of technology scale-up, through guidelines and practices that are the key to Practical Technology Scaleup.  This increases the chance of success for individual technologies and will enable us as a society to meet these aggressive climate targets. 

I have had the chance to scale up and launch new products and technologies across a range of industries including sustainable fuels, renewable chemicals, bioprocessing, petrochemicals, specialty chemicals, distillation, and catalysis, and over my 23 years of industrial experience I have developed a series of rules and guidelines for scaling and launching new technology.  The challenge with each has been to:

·       reduce technology risk

·       reduce time to market

·       reduce cost

·       maximize value

These are often competing objectives, and usually reducing time to market and reducing risk win out.  Of course, if the capital and operating cost are too high then we will not be successful, so we cannot ignore these criteria either.   

It is critical to ‘start with the end in mind’, using a Technology Concept (or Process Concept) as a framework to drive new technology development, scale-up, and commercialization.  This technology concept is not set in stone and, in fact, should be reviewed and updated as we progress through the scale-up effort.  We establish the technology concept to drive the scale-up effort, not just inform it.  This then enables us to direct innovation to create the greatest value from breakthrough and disruptive ideas, as we identify challenges early, fail fast when it is cheaper and quicker to do so, and make sure our efforts are focused on solving commercially relevant problems.

The technology concept is developed, and iteratively revised, through a combination of Creative Process Engineering, Multi-scale Experimental Data, and Modeling and Analysis. 

Creative Process Engineering:  The flow scheme is developed, the material balance is estimated, and key process design decisions are identified so that we can establish the best process flowsheet for the technology.  

Modeling & Analysis:  A good model can save time and $$ in the lab.  Coupled with the right analysis, this can be used to prioritize objectives in the lab, pilot, and demo units.   Cautionary note--useful models are more important than perfect models!

Experimental Data:  We need the right data to prove out breakthrough ideas, secure partners and investors, and develop engineering data for equipment design.  Multi-scale data is critical to this effort, and with good planning, multiple assets and external resources can be leveraged.

The key benefits to this approach are:

•               Prioritization of R&D to de-risk and optimize a new technology.

•               Identification of cost reduction opportunities throughout the scale-up effort.

•               Anticipation of engineering needs as early as possible. 

In this way we can reduce risk and optimize economics of our new design, while efficiently managing the time and cost of our efforts.

In future posts I will elaborate on the key concepts presented in this introduction. 



[1] https://unfccc.int/process-and-meetings/the-paris-agreement/the-paris-agreement