
LP Modelling

The New LP
New technology is changing the face of Linear Programming. The New LP is non-linear without compromise, reliably convergent, predictable, and no longer an isolated application but an integral part of the business process. True cut-point optimisation is a convenient reality. With modern software tools, the New LP is easier to learn and the results applicable to a wider audience. It is available to and used by more people within the refinery. In this paper, the virtues of the 21st century LP application will be described, and its benefits discussed.

Historical Perspective

To appreciate the significant technical advances now available in 21st century LP systems, it is important to understand how refinery LP technology has evolved. In 1947, George Dantzig, later a professor at Stanford University, developed the first linear programming algorithm. It was used on the computers of that era to help solve military troop supply and logistics problems. Since then, LP technology has been used in many different industries to solve myriad problems. Its use in the petroleum refining arena, however, is arguably responsible for the greatest advancements in this extraordinary technology. Early LPs were very difficult to create and interpret. Their input decks were literally boxes of punch cards, each card specifying a particular matrix element. LP execution times typically ran into hours. In the early 1960s, Larry Haverly developed a concept that revolutionised LP modelling; shortly thereafter, John Bonner, Joe Moore, and others followed with similar approaches. Haverly developed a computer language, appropriately named MaGen, which allowed for the automatic generation of an LP matrix from a logical array of recognisable data tables. MaGen was perhaps the first application of database technology in the computer industry.
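To illustrate the idea of table-driven matrix generation, the short Python sketch below expands a recognisable yield table into sparse LP matrix coefficients, one column per unit operating mode and one row per stream balance. It shows the concept only; it is not MaGen syntax, and the unit names and yields are invented for illustration.

```python
# A toy sketch of table-driven matrix generation: a recognisable yield table
# is expanded into sparse LP matrix coefficients (one column per unit
# operating mode, one row per material balance).  Illustrative data only.
yield_table = {                      # mode -> {stream: yield fraction}
    "CDU.ArabLight": {"ArabLight": -1.00, "Naphtha": 0.22,
                      "Kerosene": 0.16, "Diesel": 0.27, "Residue": 0.35},
    "FCC.HighConv":  {"VGO": -1.00, "FCCNaphtha": 0.59, "LCO": 0.15,
                      "Slurry": 0.06, "LPG": 0.20},
}

matrix = {}                          # (row, column) -> coefficient
for mode, yields in yield_table.items():
    for stream, coeff in yields.items():
        matrix[(f"BAL.{stream}", mode)] = coeff   # stream balance rows

for (row, col), coeff in sorted(matrix.items()):
    print(f"{row:15s} {col:15s} {coeff:+.2f}")
```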

A problem recognised in the infancy of LP was that LP models were indeed linear, while the operations they were meant to represent were often non-linear. Nowhere was this more evident than in a situation commonly referred to as the Pooling Problem. Since it was not possible to specify the quality of unfinished streams resulting from the joining, or pooling, of two or more other unfinished streams, models were developed that allowed every unfinished stream to blend directly to every finished product for which it was a potential component. This led to a situation called overoptimisation, where LP results were often too optimistic and entirely unrealistic. In the late 1970s, the LP community overcame the Pooling Problem by developing advanced recursive techniques. Programmers could now construct models with the qualities of these intermediate pools specified as variables, to be updated through successive solutions of the LP model in a manner that drives the problem to convergence. As a result of this advancement, LP models could finally be constructed to represent the realistic behaviour of refinery operations. Even so, LP models required exorbitant amounts of computer resources and long execution times.
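As a concrete illustration of why recursion is needed, the sketch below applies a simple fixed-point version of the idea to Haverly's classic textbook pooling problem using SciPy: the pool quality is guessed, the now-linear LP is solved, the quality is re-estimated from the chosen component flows, and the loop repeats until guess and result agree. The data are the textbook values, and the naive update rule is only a stand-in for the distributive and adherent recursion techniques discussed in this paper.

```python
# Minimal sketch of pool-quality recursion on Haverly's classic pooling
# problem.  Feeds A (3.0% S, $6) and B (1.0% S, $16) reach products only
# through a pool; feed C (2.0% S, $10) blends directly.  Product X: <=2.5% S,
# $9, demand 100.  Product Y: <=1.5% S, $15, demand 200.
import numpy as np
from scipy.optimize import linprog

# Decision variables: x = [a, b, px, py, cx, cy]
def solve_lp(pool_s):
    """Solve the blending LP with the pool sulphur *guessed* at pool_s."""
    cost = np.array([6.0, 16.0, -9.0, -15.0, 1.0, -5.0])  # minimise -profit
    A_ub = np.array([
        [0, 0, 1, 0, 1, 0],                                # X demand
        [0, 0, 0, 1, 0, 1],                                # Y demand
        [0, 0, pool_s - 2.5, 0, 2.0 - 2.5, 0],             # X sulphur spec
        [0, 0, 0, pool_s - 1.5, 0, 2.0 - 1.5],             # Y sulphur spec
    ])
    b_ub = np.array([100.0, 200.0, 0.0, 0.0])
    A_eq = np.array([[1, 1, -1, -1, 0, 0]])                # pool material balance
    res = linprog(cost, A_ub=A_ub, b_ub=b_ub, A_eq=A_eq, b_eq=[0.0],
                  bounds=[(0, None)] * 6)
    return res.x, -res.fun

pool_s = 2.0                                               # initial quality guess
for it in range(20):
    x, profit = solve_lp(pool_s)
    a, b = x[0], x[1]
    new_s = (3.0 * a + 1.0 * b) / (a + b) if a + b > 1e-9 else pool_s
    print(f"iter {it}: guessed S={pool_s:.3f}, actual S={new_s:.3f}, "
          f"profit={profit:.1f}")
    if abs(new_s - pool_s) < 1e-6:                         # converged
        break
    pool_s = new_s
```

Run as written, the loop settles on the well-known local optimum of this textbook problem (a profit of 100 rather than the global 400), a useful reminder of the local-optimum caution raised later in this paper.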

With the development of powerful personal computers in the 1980s, LP technology truly became the planning tool of choice throughout the refining industry. No longer was linear programming restricted to companies with access to ample mainframe computer resources. Convenient PC spreadsheet technology became the preferred environment for the maintenance and storage of LP model data, as well as the recipient and reporter of solution results. The PC environment also opened the door to the development of friendly user interfaces, which made LPs much easier to construct, maintain, execute, interpret, and manage. Throughout the 1980s, researchers continued to develop techniques that would allow LP models to better meet other nonlinear demands of real refining operations. With the introduction of the US Reformulated Gasoline (RFG) regulations in the early 1990s, the desire to develop such new techniques became a requirement. In 1994, Dean Trierwiler adopted a technique earlier developed by BP's Dr. Roger Main, which allowed for the determination of pooled stream properties that were themselves functions of other stream properties. With this technique, which Trierwiler called Adherent Recursion, the complex RFG emission properties could be modelled as nonlinear functions of linearly determined gasoline properties.


While LP technology was advancing, process and distillation simulation technology also progressed. Such simulators were being used with increasing frequency to prepare LP model data, but while the simulators were able to consider the real, nonlinear thermodynamic relationships involved in the processing of hydrocarbons, only linearized approximations could be passed on to an LP. The LP modelling community sought to integrate these technologies better and further improve the accuracy and application of their models. In the late 1990s, the Adherent Recursion technique was further refined and applied to process unit modelling (which was Dr. Roger Main's originally intended use for the technique). Links were developed to embed simulators into the recursive life of the LP. These developments have led to the direct LP optimisation of such nonlinear operating variables as unit severities and distillation cut-points.
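The essence of the Adherent Recursion idea can be sketched in a few lines: a property of the blend that is itself a nonlinear function of other pooled properties is evaluated at the current recursion point and then locally linearised, and it is these tangent coefficients that are written back into the matrix for the next pass. In the sketch below the emissions correlation is a stand-in quadratic, not the real RFG complex model, and the component names and numbers are invented for illustration.

```python
# Sketch of the adherent-recursion idea: linearise a nonlinear property of
# the blend around the current recursion point and feed the tangent
# coefficients back into the LP matrix.  Illustrative data and correlation.
import numpy as np

# Blend components: volumes come from the LP solution of the previous pass.
components = {            # name -> (vol, RVP psi, aromatics vol%)
    "reformate":   (40.0, 4.5, 55.0),
    "FCC naphtha": (35.0, 6.0, 28.0),
    "alkylate":    (25.0, 5.5,  0.5),
}

def pooled_properties(comps):
    vols = np.array([v for v, _, _ in comps.values()])
    rvp  = np.array([r for _, r, _ in comps.values()])
    arom = np.array([a for _, _, a in comps.values()])
    total = vols.sum()
    return (vols @ rvp) / total, (vols @ arom) / total

def emissions_index(rvp, arom):
    """Stand-in nonlinear emissions correlation (illustrative only)."""
    return 0.8 * rvp + 0.02 * arom + 0.05 * rvp * arom

rvp0, arom0 = pooled_properties(components)
e0 = emissions_index(rvp0, arom0)

# Linearise the nonlinear property around the current point: these tangent
# coefficients are what gets written back into the LP matrix for the next
# recursion pass, so the LP "sees" a locally linear emissions property.
h = 1e-4
d_rvp  = (emissions_index(rvp0 + h, arom0) - e0) / h
d_arom = (emissions_index(rvp0, arom0 + h) - e0) / h
print(f"pool RVP={rvp0:.2f}, aromatics={arom0:.1f}, emissions={e0:.2f}")
print(f"next-pass row: {d_rvp:.3f}*RVP + {d_arom:.3f}*AROM + constant")
```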

The 21st Century LP

While the benefits of LP simulations to the refining industry were unquestioned, the LP systems in use at the end of the 1990s were still very cumbersome. Although vastly improved over systems of only a few years earlier, they still required dedicated, well-trained individuals to manage their data, control their operation, and interpret their results. Most development efforts in recent years have focused on reducing the skill levels required to operate LPs through still further simplification of procedures, while providing tools and software integration avenues that increase personal productivity. Perhaps the most significant of these undertakings is Haverly Systems' development of its G4 Enterprise Planning System. G4 has fully embraced modern relational database technology to provide a complete LP system that is both easy to operate and technically rich.

Figure 1 depicts the typical refinery LP user's activities prior to the advent of new LP systems such as G4. Crude oil and processing data typically first had to be analysed and linearized through the use of offline simulators before they could be structured into a form acceptable to an LP. Other data, such as information related to purchases, sales, transports, and inventory, had to be restructured into LP-acceptable forms as well. The user then had to separately specify the parameters of the LP: its scope (periods and locations), blended product specifications, reporting and LP execution settings, etc. Once all parameters were set, the user could then launch a run and subsequently interrogate its results. Finally, the user was responsible for separately managing his LP cases in a manner that best provided him with all the information he needed to complete his study.

Figure 1

Figure 2


Figure 2 shows the refinery LP user's activities in the new LP system environment. The functions previously mentioned still largely exist, but they are performed away from the user's direct area of responsibility. The user interacts only with friendly interface software, which resides atop a highly sophisticated relational database. In addition, the relational database provides several more functions useful in LP modelling, such as intrinsic data checking, ready linkage to other software systems (information systems, short-term schedulers, etc.), and vivid graphical utilities.

Example Case Study


The New LP allows the user to analyse refinery operations in more detail than ever before. One example is the use of a process simulator interface. Historically, process yields were represented using linear base yields and shift vectors. The base yields were calibrated to a specific base feed. For an FCC unit, the LP model might incorporate two or more base yield vectors, say a high conversion yield and a low conversion yield. A shift vector is used to reflect yield differences when the actual feed differs from the base feed. There are many types of shift vector applications; in FCC operations, some examples of feed shift vectors might be UOP K-factor, nitrogen, refractive index, density, hydrotreated feed, and conversion.

As a simple example, say the base yields are calibrated to a feed that has a UOP K-factor of 12.0. If the LP model runs a feed with a UOP K-factor of 11.5 (and therefore less crackable), one would expect less gasoline production. The shift vector captures the product yield difference between the 12.0 base feed and the 11.5 actual feed. In the historical sense, both base yields and shift vectors were linear representations of process yields. In an operating refinery, process operations are not always linear, but linear vectors have historically been used to approximate them. In the example below, the model has only two base yield vectors, a high and a low conversion vector, and for simplicity excludes quality shift vectors. If the model runs 63% in the high conversion mode at 1000°F riser temperature and 37% in the low conversion mode at 960°F riser temperature, the resulting linear solution is an approximation of the FCC yields at a riser temperature of 985°F. Although this methodology can be used, it oversimplifies the true FCC behaviour because of its linear representation. For example, the yield of FCC naphtha is not linear; at some point along the curve, the production of naphtha actually decreases, marking the point of overcracking.

                         High Conv       Low Conv        Linear          Simulator       Linear vs
                         Linear Yield    Linear Yield    Solution Vol%   Solution Vol%   Simulator Delta
Activity                 63%             37%             100%
C1-C2s                   0.0388          0.0265          0.0342          0.0331          -0.0011
C3s                      0.1265          0.0984          0.1160          0.1154          -0.0006
C4s                      0.1956          0.1671          0.1849          0.1849           0.0000
FCCU Naphtha             0.5905          0.5875          0.5894          0.5928           0.0034
FCC Lt Cycle Oil         0.1478          0.1908          0.1639          0.1632          -0.0007
FCC Slurry               0.0527          0.0684          0.0586          0.0583          -0.0003
Riser Temperature (°F)   1000            960             985             985              0
Conversion               80.0%           74.0%           77.8%           77.8%            0.0%
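To make the arithmetic in the table explicit, the short sketch below recomputes the 100% linear solution column as the activity-weighted blend of the two base yield vectors and compares it with the published simulator values. All numbers are taken from the table above; no simulator is involved in the code itself.

```python
# Check how the "Linear Solution" column is built: it is simply the
# activity-weighted blend of the two base yield vectors.  The simulator
# values are the published ones, not output of any model run here.
yields_high = {  # 1000 F riser, 63% activity
    "C1-C2s": 0.0388, "C3s": 0.1265, "C4s": 0.1956, "FCCU Naphtha": 0.5905,
    "FCC Lt Cycle Oil": 0.1478, "FCC Slurry": 0.0527,
}
yields_low = {   # 960 F riser, 37% activity
    "C1-C2s": 0.0265, "C3s": 0.0984, "C4s": 0.1671, "FCCU Naphtha": 0.5875,
    "FCC Lt Cycle Oil": 0.1908, "FCC Slurry": 0.0684,
}
simulator = {    # rigorous FCC simulator at 985 F (values from the table)
    "C1-C2s": 0.0331, "C3s": 0.1154, "C4s": 0.1849, "FCCU Naphtha": 0.5928,
    "FCC Lt Cycle Oil": 0.1632, "FCC Slurry": 0.0583,
}

w_high, w_low = 0.63, 0.37
riser_t = w_high * 1000 + w_low * 960          # interpolated riser severity
for cut in yields_high:
    linear = w_high * yields_high[cut] + w_low * yields_low[cut]
    delta = simulator[cut] - linear            # + means the LP under-predicts
    print(f"{cut:18s} linear {linear:.4f}  simulator {simulator[cut]:.4f}  "
          f"delta {delta:+.4f}")
print(f"interpolated riser temperature: {riser_t:.0f} F")
```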

The graph in Figure 3 clearly shows that this is a non-linear relationship. The linear approximation predicts an FCC naphtha yield of 58.94 percent at a riser temperature of 985°F, which is the linear interpolation between 960°F and 1000°F. This non-linear curve was developed with an FCC simulator that has been calibrated to a generic operation. At the riser temperature of 985°F, the FCC simulator predicts a naphtha yield of 59.28 percent.

Simulating FCC operations is critical to the overall success of accurately predicting refinery operations. With the new LP technology available, an FCC process simulator was developed in a spreadsheet format to link directly to the LP model. The goal of the simulator is to characterise the feed accurately and to predict the yields under various operating conditions. Some significant input parameters to the FCCU simulator include: feed stream rates; unit contact times; unit technology options; straight run feed quality data; feed injection technology; hydrotreated feed quality data; reaction system efficiencies; feed hydrotreater reactor data; catalyst kinetic and selectivity factors; catalyst addition rates, quality and costs; unit kinetic options; unit operating conditions; unit kinetic variables; and miscellaneous feed quality data.



Figure 3

The FCC simulator performs detailed heat and material balance calculations and sizing calculations. This tool can be used effectively to optimise existing operations as well as to analyse capital improvement ideas. In the above FCC example, the difference in naphtha production between the simulator and the linear representation is 0.34 percent. The question to ask is whether or not the linear representation is close enough. The answer is: it depends. As with most LP work, the model is geared to the level of detail sufficient to answer the question at hand. A tolerance of 0.34% naphtha yield is certainly acceptable for a regional supply/demand model; however, if considering replacement of old FCC nozzles with advanced nozzle technology, a 0.34% difference might not be acceptable. This small tolerance could translate to millions of dollars per year, and could be the difference between accepting or rejecting a proposed project. The accuracy provided by the FCC simulator coupled to the LP model is far superior to the old linear approximation.
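As a rough sense of scale for that 0.34% figure, the back-of-envelope sketch below annualises it against an assumed FCC feed rate and an assumed naphtha-versus-heavier-product price differential. Both inputs are illustrative round numbers, not data from this paper.

```python
# Back-of-envelope value of a 0.34 vol% naphtha yield difference.  The feed
# rate and the price differential are assumed round numbers for illustration.
feed_rate_bpd = 100_000      # assumed FCC fresh feed rate, barrels per day
uplift_usd_per_bbl = 10.0    # assumed naphtha value over the LCO/slurry it displaces
yield_delta = 0.0034         # 0.34 vol% naphtha difference from the example

extra_naphtha_bpd = yield_delta * feed_rate_bpd
annual_value = extra_naphtha_bpd * uplift_usd_per_bbl * 365
print(f"{extra_naphtha_bpd:.0f} bpd of additional naphtha "
      f"~ ${annual_value / 1e6:.1f} million per year")
```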

Additionally, accurate simulation of FCC operations within the context of refinery LP optimisation allows the user to understand the impact of the FCC on the overall refinery complex. For example, revamping an FCC will likely impact refinery feedstock selections and refinery production. With successive LP runs integrated with a process simulator, the ability to analyse the complex interactions between FCC operations and overall refinery economics is greatly enhanced: changing nozzle technologies, catalyst impacts, feed preheat impact, riser temperature impact, hydrotreated versus straight run feed, recycle streams, and the list goes on.

The Ten Commandments of LP


In conclusion, users of the New LP must continue to recognise the abilities and limitations of LP, as well as the responsibilities they have to apply this technology properly. These responsibilities are best summed up in the following Ten Commandments of LP:

Thou shalt consider the LP model an extension of thyself. The LP is only a tool. The responsibility for how the tool is used resides solely with its operator.

Thou shalt accept that all LP models are premise laden and assumption driven. All LP models are built on the biases and assumptions of their creators. Users must recognise this fact, and ensure that these conditions do not interfere with, but indeed support, the use of the LP for each study.

Thou shalt accept that an LP model is only as good as the data on which it is based. Nowhere is the old adage "Garbage In ... Garbage Out" more applicable than in LP modelling.

Thou shalt create the LP model to be an accurate representation of refinery operations. To best optimise a refinery's operation, the LP must be given a complete, accurate description of the refinery's processing capabilities, and these capabilities must be modelled in a manner consistent with their realistic impact on operations.

Thou shalt assure that the LP model is weight balanced. Mass can neither be created nor destroyed. Given the opportunity, an LP will always try to make something out of nothing, or to destroy what it does not like.

Thou shalt examine the LP model for local optima. Most new LP systems have mechanisms that work to avoid local optima. Nevertheless, encountering a local optimum is always a theoretical possibility in nonlinear models. Users must be familiar enough with their models to recognise possible local optima and test for their existence.

Thou shalt accept the sensitivity of an LP solution. An LP solution is based upon incremental economics, and is by nature driven to extremes. A small change in a model constraint or cost can have a substantial effect on its solution.

Thou shalt not expect that the LP will improve its data's accuracy. It is not reasonable to force a model to solve to tolerances tighter than those upon which its data are defined.


Thou shalt accept that an LP does not consider time. All values reported by an LP are averages over the time period(s) upon which it is defined. In all likelihood, products will never be blended to the recipes it predicts, and feedstocks will never be charged in the ratios it assumes.

Thou shalt never base a decision solely on an LP solution. One should never answer a question with a statement beginning with "because the LP says ...". LPs say nothing. They are only one of the tools a planner employs to assist him in making a decision. LP solutions must be corroborated by wisdom and experience.

Hydrocarbon Asia thanks L. Dean Trierwiler of Haverly Systems Inc. and Vince B. DiVita of Jacobs Consultancy Inc. for contributing this paper, which was presented at the ARTC Computing Conference in Singapore on 20-22 October 2003.

L. Dean Trierwiler is the Business and Technical Manager at Haverly Systems, Inc., Houston, Texas. He joined Haverly in 1990, and has since worked in the management, support, and application of Haverly's planning, scheduling, and crude assay software tools. For the 16 years prior to joining Haverly, he held various planning, economic, engineering, and software development positions within the refining industry (with CITGO, Chevron, and UNOCAL). Mr Trierwiler is a recognised expert in the area of Linear Programming recursion, and was instrumental in the development of both the Distributive and Adherent Recursion techniques. He holds a B.S. degree in Mechanical Engineering from Washington State University.

Vincent B. DiVita has 15 years' experience in the chemical and petroleum industries with emphasis on LP refinery simulations and economic analysis. As a Group Manager at Jacobs Consultancy, his responsibilities include refinery optimisation, project management, strategic planning, and feasibility studies. He is also responsible for maintaining, developing and managing the LP model that is used to analyse a broad range of complex subjects for single-client and multi-client studies. Before joining Jacobs Consultancy, Mr. DiVita was a Senior Process Engineer with Rhône-Poulenc at their Houston, Texas sulphuric acid/aluminium sulphate plant. He has also been employed as an Associate Engineer for Shell Oil Company and as a consultant with Purvin & Gertz. Mr. DiVita holds a B.S. degree in Chemical Engineering from Texas A&M University and an M.B.A. from Houston Baptist University. He is a registered Professional Engineer (Texas).
