Industrial process data validation and reconciliation
From Wikipedia, the free encyclopedia

Industrial process data validation and reconciliation, or more briefly, process data reconciliation (PDR), is a technology that uses process information and mathematical methods in order to automatically ensure data validation and reconciliation by correcting measurements in industrial processes. The use of PDR makes it possible to extract accurate and reliable information about the state of industrial processes from raw measurement data and produces a single consistent set of data representing the most likely process operation.

Models, data and measurement errors


Industrial processes, for example chemical or thermodynamic processes in chemical plants, refineries, oil or gas production sites, or power plants, are often represented by two fundamental means:

  1. Models that express the general structure of the processes,
  2. Data that reflects the state of the processes at a given point in time.

Models can have different levels of detail: for example, one can incorporate simple mass or compound conservation balances, or more advanced thermodynamic models including energy conservation laws. Mathematically, the model can be expressed by a nonlinear system of equations F(y) = 0 in the variables y = (y_1, ..., y_n), which incorporates all the above-mentioned system constraints (for example the mass or heat balances around a unit). A variable could be the temperature or the pressure at a certain place in the plant.

Error types


Data typically originate from measurements taken at different places throughout the industrial site, for example temperature, pressure and volumetric flow rate measurements. To understand the basic principles of PDR, it is important to first recognize that plant measurements are never 100% correct, i.e. the raw measurement y is not a solution of the nonlinear system F(y) = 0. When measurements are used without correction to generate plant balances, inconsistencies are common. Measurement errors can be categorized into two basic types:

  1. random errors due to intrinsic sensor accuracy and
  2. systematic errors (or gross errors) due to sensor calibration or faulty data transmission.

A random error means that the measurement y is a random variable with mean y*, where y* is the true value, which is typically not known. A systematic error, on the other hand, is characterized by a measurement y which is a random variable whose mean is not equal to the true value y*. For ease in deriving and implementing an optimal estimation solution, and based on the argument that errors are the sum of many factors (so that the central limit theorem has some effect), data reconciliation assumes these errors are normally distributed.
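The distinction between the two error types can be illustrated with a small simulation (a sketch only; the true value, noise level and bias below are invented numbers): a purely random error leaves the sample mean near the true value, while a systematic error shifts the mean away from it.

```python
import random

# Illustrative only: true value, noise level and bias are made-up numbers.
random.seed(0)
true_value = 100.0
noise = lambda: random.gauss(0.0, 1.0)                       # random error, mean 0
random_only = [true_value + noise() for _ in range(10000)]
biased = [true_value + 2.5 + noise() for _ in range(10000)]  # systematic error: +2.5 bias

mean = lambda xs: sum(xs) / len(xs)
print(mean(random_only))   # close to the true value 100.0
print(mean(biased))        # close to 102.5, i.e. offset by the bias
```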

Other sources of errors when calculating plant balances include process faults such as leaks, unmodeled heat losses, incorrect physical properties or other physical parameters used in equations, and incorrect structure such as unmodeled bypass lines. Other errors include unmodeled plant dynamics such as holdup changes, and other instabilities in plant operations that violate steady state (algebraic) models. Additional dynamic errors arise when measurements and samples are not taken at the same time, especially lab analyses.

The normal practice of using time averages for the data input partly reduces the dynamic problems. However, that does not completely resolve timing inconsistencies for infrequently-sampled data like lab analyses.

This use of average values, like a moving average, acts as a low-pass filter, so high frequency noise is mostly eliminated. The result is that, in practice, data reconciliation is mainly making adjustments to correct systematic errors like biases.

Necessity of removing measurement errors


ISA-95 is the international standard for the integration of enterprise and control systems.[1] It asserts that:

Data reconciliation is a serious issue for enterprise-control integration. The data have to be valid to be useful for the enterprise system. The data must often be determined from physical measurements that have associated error factors. This must usually be converted into exact values for the enterprise system. This conversion may require manual, or intelligent reconciliation of the converted values [...]. Systems must be set up to ensure that accurate data are sent to production and from production. Inadvertent operator or clerical errors may result in too much production, too little production, the wrong production, incorrect inventory, or missing inventory.

History


PDR has become increasingly important as industrial processes have grown more complex. PDR started in the early 1960s with applications aiming at closing material balances in production processes where raw measurements were available for all variables.[2] At the same time, the problem of gross error identification and elimination was presented.[3] In the late 1960s and 1970s, unmeasured variables were taken into account in the data reconciliation process,[4][5] and PDR matured further by considering general nonlinear equation systems coming from thermodynamic models.[6][7][8] Quasi-steady-state dynamics for filtering and simultaneous parameter estimation over time were introduced in 1977 by Stanley and Mah.[7] Dynamic PDR was formulated as a nonlinear optimization problem by Liebman et al. in 1992.[9]

Data reconciliation


Data reconciliation is a technique that aims to correct measurement errors that are due to measurement noise, i.e. random errors. From a statistical point of view, the main assumption is that no systematic errors exist in the set of measurements, since they may bias the reconciliation results and reduce its robustness.

Given n measurements y_1, ..., y_n, data reconciliation can mathematically be expressed as an optimization problem of the following form:

  min over y_1*, ..., y_n*, x_1*, ..., x_m* of  Σ_{i=1}^{n} ( (y_i* - y_i) / σ_i )²
  subject to  F(x*, y*) = 0,   y_i,min ≤ y_i* ≤ y_i,max,   x_j,min ≤ x_j* ≤ x_j,max,

where y_i* is the reconciled value of the i-th measurement (i = 1, ..., n), y_i is the measured value of the i-th measurement, x_j* is the j-th unmeasured variable (j = 1, ..., m), and σ_i is the standard deviation of the i-th measurement. F(x*, y*) = 0 are the process equality constraints, and the inequalities are the bounds on the measured and unmeasured variables.

The term ( (y_i* - y_i) / σ_i )² is called the penalty of measurement i. The objective function is the sum of the penalties, which will be denoted in the following by f(y*) = Σ_{i=1}^{n} ( (y_i* - y_i) / σ_i )².

In other words, one wants to minimize the overall correction (measured in the least-squares sense) that is needed in order to satisfy the system constraints. Additionally, each least-squares term is weighted by the standard deviation of the corresponding measurement, which reflects the accuracy of that measurement. For example, at a 95% confidence level, the standard deviation is about half the stated accuracy, since the 95% interval of a normal distribution spans roughly two standard deviations on either side of the mean.
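For a single linear balance, the optimization problem above can even be solved in closed form. The following sketch reconciles three flow measurements against an assumed balance a + b = c (the measured values and standard deviations are invented for illustration): for a linear constraint A y* = 0, the weighted-least-squares solution is the classical projection y* = y - Q Aᵀ (A Q Aᵀ)⁻¹ A y, with Q = diag(σ_i²).

```python
# Reconciliation of a single mass balance a + b = c, all three flows measured,
# minimizing sum(((y_rec_i - y_i)/sigma_i)^2) subject to the balance.
def reconcile_balance(y, sigma):
    """Reconcile measurements y = (a, b, c) against a + b - c = 0."""
    A = (1.0, 1.0, -1.0)                       # constraint row: a + b - c
    Q = [s * s for s in sigma]                 # measurement variances
    residual = sum(a_i * y_i for a_i, y_i in zip(A, y))   # imbalance A @ y
    S = sum(a_i * a_i * q for a_i, q in zip(A, Q))        # scalar A Q A^T
    correction = [q * a_i * residual / S for a_i, q in zip(A, Q)]
    return [y_i - c_i for y_i, c_i in zip(y, correction)]

y = [101.0, 48.0, 152.0]    # measured flows a, b, c (inconsistent: 101 + 48 != 152)
sigma = [1.0, 1.0, 2.0]
y_rec = reconcile_balance(y, sigma)
print(y_rec)                # reconciled flows; now a + b == c exactly
```

Note how the least accurate measurement (c, with σ = 2) absorbs the largest correction, exactly as the weighting in the objective function prescribes.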

Redundancy


Data reconciliation relies strongly on the concept of redundancy to correct the measurements as little as possible in order to satisfy the process constraints. Here, redundancy is defined differently from redundancy in information theory. Instead, redundancy arises from combining sensor data with the model (algebraic constraints), sometimes more specifically called "spatial redundancy",[7] "analytical redundancy", or "topological redundancy".

Redundancy can be due to sensor redundancy, where sensors are duplicated in order to have more than one measurement of the same quantity. Redundancy also arises when a single variable can be estimated in several independent ways from separate sets of measurements at a given time or time averaging period, using the algebraic constraints.

Redundancy is linked to the concept of observability. A variable (or system) is observable if the models and sensor measurements can be used to uniquely determine its value (system state). A sensor is redundant if its removal causes no loss of observability. Rigorous definitions of observability, calculability, and redundancy, along with criteria for determining them, were established by Stanley and Mah[10] for cases with set constraints such as algebraic equations and inequalities. Next, we illustrate some special cases:

Topological redundancy is intimately linked with the degrees of freedom (dof) of a mathematical system,[11] i.e. the minimum number of pieces of information (i.e. measurements) that are required in order to calculate all of the system variables. For instance, in a single-node example with inflows a and b and outflow c, flow conservation requires that a + b = c. One needs to know the value of two of the three variables in order to calculate the third one. The degrees of freedom for the model in that case are equal to 2: at least 2 measurements are needed to estimate all the variables, and 3 would be needed for redundancy.

When speaking about topological redundancy we have to distinguish between measured and unmeasured variables. In the following, let us denote by x the unmeasured variables and by y the measured variables. The system of process constraints then becomes F(x, y) = 0, a nonlinear system in x and y. If the system is calculable with the n measurements given, then the level of topological redundancy is defined as red = n - n_min, i.e. the number of additional measurements that are at hand on top of the minimum number n_min required in order to just calculate the system. Another way of viewing the level of redundancy is to use the definition of the degrees of freedom, dof = (n + m) - p, which is the difference between the number of variables (n measured plus m unmeasured) and the number p of equations. Then one gets

  red = n - dof = n - (n + m - p) = p - m,

i.e. the redundancy is the difference between the number of equations p and the number of unmeasured variables m. The level of total redundancy is the sum of sensor redundancy and topological redundancy. We speak of positive redundancy if the system is calculable and the total redundancy is positive. One can see that the level of topological redundancy depends only on the number of equations (the more equations, the higher the redundancy) and the number of unmeasured variables (the more unmeasured variables, the lower the redundancy), and not on the number of measured variables.
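The counting rules above can be written down directly. A trivial sketch using the definitions dof = (n + m) - p and red = p - m = n - dof, applied to a single balance a + b = c with all three flows measured:

```python
# Degrees of freedom and topological redundancy from simple counts:
# n measured variables, m unmeasured variables, p equations.
def dof_and_redundancy(n_measured, n_unmeasured, n_equations):
    dof = n_measured + n_unmeasured - n_equations
    red = n_equations - n_unmeasured          # equals n_measured - dof
    return dof, red

# single balance a + b = c, all three flows measured:
print(dof_and_redundancy(3, 0, 1))   # dof = 2, redundancy = 1
```

As the next section shows, such counts are only a first check: a non-negative redundancy count does not by itself guarantee that the system is calculable.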

Simple counts of variables, equations, and measurements are inadequate for many systems, breaking down for several reasons: (a) Portions of a system might have redundancy, while others do not, and some portions might not even be possible to calculate, and (b) Nonlinearities can lead to different conclusions at different operating points. As an example, consider the following system with 4 streams and 2 units.

Example of calculable and non-calculable systems


We incorporate only flow conservation constraints and obtain a + b = c (first unit) and c = d (second unit). It is possible that the system is not calculable, even though p - m ≥ 0.

If we have measurements for c and d, but not for a and b, then the system cannot be calculated (knowing c = a + b does not give information about a and b individually). On the other hand, if a and b are known, but not c and d, then the system can be calculated (c = a + b and d = c).

In 1981, observability and redundancy criteria were proven for these sorts of flow networks involving only mass and energy balance constraints.[12] After combining all the plant inputs and outputs into an "environment node", loss of observability corresponds to cycles of unmeasured streams. That is seen in the first case above, where streams a and b form a cycle of unmeasured streams. Redundancy classification follows, by testing for a path of unmeasured streams, since that would lead to an unmeasured cycle if the measurement were removed. Measurements c and d are redundant in the first case above, even though part of the system is unobservable.
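For linear constraints, calculability can also be checked mechanically: given the measured values, the unmeasured variables are uniquely determined exactly when the constraint submatrix formed from the unmeasured columns has full column rank. A sketch for the two-unit example, assuming the flow-conservation constraints a + b = c and c = d over the four streams:

```python
import numpy as np

# Constraint matrix over streams (a, b, c, d), assuming the unit balances
# a + b = c and c = d from the example above.
A = np.array([[1.0, 1.0, -1.0,  0.0],   # a + b - c = 0
              [0.0, 0.0,  1.0, -1.0]])  # c - d = 0

def calculable(unmeasured_cols):
    """Unmeasured variables are uniquely determined iff their columns have full rank."""
    sub = A[:, unmeasured_cols]
    return np.linalg.matrix_rank(sub) == sub.shape[1]

print(calculable([0, 1]))   # a, b unmeasured -> False: only their sum a + b is fixed
print(calculable([2, 3]))   # c, d unmeasured -> True: c = a + b, then d = c
```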

Benefits


Redundancy can be used as a source of information to cross-check and correct the measurements and to increase their accuracy and precision. Further, the data reconciliation problem presented above also includes the unmeasured variables x; based on this redundancy, estimates for the unmeasured variables can be calculated along with their accuracies. In industrial processes, these unmeasured variables provided by data reconciliation are referred to as soft sensors or virtual sensors, used where hardware sensors are not installed.

Data validation


Data validation denotes all validation and verification actions before and after the reconciliation step.

Data filtering


Data filtering denotes the process of treating measured data such that the values become meaningful and lie within the range of expected values. Data filtering is necessary before the reconciliation process in order to increase robustness of the reconciliation step. There are several ways of data filtering, for example taking the average of several measured values over a well-defined time period.
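One common filtering step is exactly the time averaging mentioned above. A minimal sketch (window length and readings are invented): a simple moving average over a fixed window damps short spikes before the data reach the reconciliation step.

```python
# Moving average over a fixed window, a common pre-reconciliation filtering step.
def moving_average(samples, window):
    return [sum(samples[i:i + window]) / window
            for i in range(len(samples) - window + 1)]

raw = [10.2, 9.8, 10.1, 15.0, 10.0, 9.9]   # illustrative readings with one spike
smoothed = moving_average(raw, 3)
print(smoothed)                            # the spike at 15.0 is damped by the averaging
```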

Result validation


Result validation is the set of validation or verification actions taken after the reconciliation process and it takes into account measured and unmeasured variables as well as reconciled values. Result validation covers, but is not limited to, penalty analysis for determining the reliability of the reconciliation, or bound checks to ensure that the reconciled values lie in a certain range, e.g. the temperature has to be within some reasonable bounds.

Gross error detection


Result validation may include statistical tests to validate the reliability of the reconciled values, by checking whether gross errors exist in the set of measured values. These tests can be for example

  • the chi square test (global test)
  • the individual test.

If no gross errors exist in the set of measured values, then each standardized adjustment (y_i* - y_i)/σ_i is a random variable that is normally distributed with mean 0 and variance 1, and each penalty term is its square. Consequently, the objective function f is a random variable that follows a chi-square distribution, since it is the sum of squares of standard normally distributed random variables. Comparing the value of f with a given percentile P_95 of the chi-square distribution (e.g. the 95th percentile for a 95% confidence level) gives an indication of whether a gross error exists: if f ≤ P_95, then no gross errors exist with 95% probability. The chi-square test gives only a rough indication of the existence of gross errors, but it is easy to conduct: one only has to compare the value of the objective function with the critical value of the chi-square distribution.

The individual test compares each standardized adjustment (y_i* - y_i)/σ_i with the critical values of the standard normal distribution. If the i-th adjustment falls outside the 95% confidence interval (i.e. its absolute value exceeds about 1.96), then there is reason to believe that this measurement carries a gross error.
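Both tests can be sketched in a few lines. The numbers below are invented: three flow measurements reconciled against one balance give one degree of redundancy, for which the 95% chi-square critical value is about 3.84.

```python
# Global chi-square test and individual test on standardized adjustments.
def penalties(y, y_rec, sigma):
    return [((yr - ym) / s) ** 2 for ym, yr, s in zip(y, y_rec, sigma)]

def global_chi2_test(y, y_rec, sigma, chi2_crit):
    """True if no gross error is indicated: sum of penalties below the critical value."""
    return sum(penalties(y, y_rec, sigma)) <= chi2_crit

def individual_test(y, y_rec, sigma, z_crit=1.96):
    """Flags each measurement whose standardized adjustment leaves the 95% interval."""
    return [abs((yr - ym) / s) > z_crit for ym, yr, s in zip(y, y_rec, sigma)]

y     = [101.0, 48.0, 152.0]   # measured values (illustrative)
y_rec = [101.5, 48.5, 150.0]   # reconciled against a + b = c
sigma = [1.0, 1.0, 2.0]
print(global_chi2_test(y, y_rec, sigma, chi2_crit=3.84))  # True: no gross error indicated
print(individual_test(y, y_rec, sigma))                   # no individual flag either
```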

Advanced process data reconciliation


Advanced process data reconciliation (PDR) is an integrated approach of combining data reconciliation and data validation techniques, which is characterized by

  • complex models incorporating, besides mass balances, also thermodynamics, momentum balances, equilibrium constraints, hydrodynamics, etc.,
  • gross error remediation techniques to ensure meaningfulness of the reconciled values,
  • robust algorithms for solving the reconciliation problem.

Thermodynamic models


Simple models include mass balances only. When thermodynamic constraints such as energy balances are added to the model, its scope and its level of redundancy increase. Indeed, as we have seen above, the level of redundancy is defined as red = p - m, where p is the number of equations. Including energy balances means adding equations to the system, which results in a higher level of redundancy (provided that enough measurements are available, or, equivalently, not too many variables are unmeasured).

Gross error remediation

(Figure: The workflow of an advanced data validation and reconciliation process.)

Gross errors are systematic measurement errors that may bias the reconciliation results. Therefore, it is important to identify and eliminate them from the reconciliation process. After the reconciliation, statistical tests can be applied that indicate whether or not a gross error exists somewhere in the set of measurements. These techniques of gross error remediation are based on two concepts:

  • gross error elimination
  • gross error relaxation.

Gross error elimination determines one measurement that is biased by a systematic error and discards this measurement from the data set. The determination of the measurement to be discarded is based on different kinds of penalty terms that express how much the measured values deviate from the reconciled values. Once a gross error is detected, it is discarded from the measurements and the reconciliation is repeated without the faulty measurement. If needed, the elimination is repeated until no gross error remains in the set of measurements.
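The elimination loop can be sketched as follows. To keep the sketch self-contained and runnable, the reconciliation step is a stand-in: duplicate sensors of one quantity are reconciled to their weighted mean (a real implementation would solve the constrained least-squares problem above); the readings and critical value are invented.

```python
# Iterative gross error elimination: reconcile, run the global test, and while it
# fails drop the measurement with the largest penalty, then reconcile again.
def reconcile(meas):
    """Stand-in reconciliation: weighted mean of duplicate sensors (y, sigma)."""
    w = [1.0 / (s * s) for _, s in meas]
    return sum(wi * y for wi, (y, _) in zip(w, meas)) / sum(w)

def eliminate_gross_errors(meas, chi2_crit):
    meas = list(meas)
    while len(meas) > 1:
        y_rec = reconcile(meas)
        pen = [((y - y_rec) / s) ** 2 for y, s in meas]
        if sum(pen) <= chi2_crit:           # global test passes -> done
            break
        meas.pop(pen.index(max(pen)))       # discard the worst offender, re-reconcile
    return reconcile(meas), meas

# three sensors on one flow; the third carries an obvious bias
est, kept = eliminate_gross_errors([(10.1, 0.1), (9.9, 0.1), (13.0, 0.1)], chi2_crit=5.99)
print(est, len(kept))   # biased sensor removed, estimate from the remaining two
```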

Gross error relaxation aims to relax the uncertainty estimate for suspicious measurements so that the reconciled value lies within the 95% confidence interval. Relaxation typically finds application when it is not possible to determine which measurement around one unit is responsible for the gross error (equivalence of gross errors); the measurement uncertainties of the measurements involved are then increased.

It is important to note that the remediation of gross errors reduces the quality of the reconciliation: either the redundancy decreases (elimination) or the uncertainty of the measured data increases (relaxation). Therefore, it can only be applied when the initial level of redundancy is high enough to ensure that the data reconciliation can still be done (see Section 2 of VDI 2048[11]).

Workflow


Advanced PDR solutions offer an integration of the techniques mentioned above:

  1. data acquisition from a data historian, database or manual inputs
  2. data validation and filtering of raw measurements
  3. data reconciliation of filtered measurements
  4. result verification
    • range check
    • gross error remediation (and go back to step 3)
  5. result storage (raw measurements together with reconciled values)

The result of an advanced PDR procedure is a coherent set of validated and reconciled process data.

Applications


PDR finds application mainly in industry sectors where measurements are either inaccurate or nonexistent, for example in the upstream sector, where flow meters are difficult or expensive to position (see [13]), or where accurate data is of high importance, for example for safety reasons in nuclear power plants (see [14]). Another field of application is performance and process monitoring (see [15]) in oil refining or in the chemical industry.

As PDR makes it possible to calculate estimates even for unmeasured variables in a reliable way, the German engineering society VDI (VDI-Gesellschaft Energie und Umwelt) has accepted the technology of PDR as a means to replace expensive sensors in the nuclear power industry (see VDI norm 2048[11]).


References

  1. ^ "ISA-95: the international standard for the integration of enterprise and control systems". isa-95.com.
  2. ^ D.R. Kuehn, H. Davidson, Computer Control II. Mathematics of Control, Chem. Eng. Progress 57: 44–47, 1961.
  3. ^ V. Vaclavek, Studies on System Engineering I. On the Application of the Calculus of the Observations of Calculations of Chemical Engineering Balances, Coll. Czech Chem. Commun. 34: 3653, 1968.
  4. ^ V. Vaclavek, M. Loucka, Selection of Measurements Necessary to Achieve Multicomponent Mass Balances in Chemical Plant, Chem. Eng. Sci. 31: 1199–1205, 1976.
  5. ^ R.S.H. Mah, G.M. Stanley, D.W. Downing, Reconciliation and Rectification of Process Flow and Inventory Data, Ind. & Eng. Chem. Proc. Des. Dev. 15: 175–183, 1976.
  6. ^ J.C. Knepper, J.W. Gorman, Statistical Analysis of Constrained Data Sets, AIChE Journal 26: 260–264, 1980.
  7. ^ a b c G.M. Stanley and R.S.H. Mah, Estimation of Flows and Temperatures in Process Networks, AIChE Journal 23: 642–650, 1977.
  8. ^ P. Joris, B. Kalitventzeff, Process measurements analysis and validation, Proc. CEF’87: Use Comput. Chem. Eng., Italy, 41–46, 1987.
  9. ^ M.J. Liebman, T.F. Edgar, L.S. Lasdon, Efficient Data Reconciliation and Estimation for Dynamic Processes Using Nonlinear Programming Techniques, Computers Chem. Eng. 16: 963–986, 1992.
  10. ^ Stanley G.M. and Mah, R.S.H., "Observability and Redundancy in Process Data Estimation", Chem. Engng. Sci. 36, 259 (1981).
  11. ^ a b c VDI-Gesellschaft Energie und Umwelt, "Guidelines - VDI 2048 Blatt 1 - “Control and quality improvement of process data and their uncertainties by means of correction calculation for operation and acceptance tests”; VDI 2048 Part 1; September 2017", Association of German Engineers, 2017.
  12. ^ Stanley G.M., and Mah R.S.H., "Observability and Redundancy Classification in Process Networks", Chem. Engng. Sci. 36, 1941 (1981)
  13. ^ P. Delava, E. Maréchal, B. Vrielynck, B. Kalitventzeff (1999), Modelling of a Crude Oil Distillation Unit in Term of Data Reconciliation with ASTM or TBP Curves as Direct Input – Application : Crude Oil Preheating Train, Proceedings of ESCAPE-9 conference, Budapest, May 31-June 2, 1999, supplementary volume, p. 17-20.
  14. ^ M. Langenstein, J. Jansky, B. Laipple (2004), Finding Megawatts in nuclear power plants with process data validation, Proceedings of ICONE12, Arlington, USA, April 25–29, 2004.
  15. ^ Th. Amand, G. Heyen, B. Kalitventzeff, Plant Monitoring and Fault Detection: Synergy between Data Reconciliation and Principal Component Analysis, Comp. and Chem, Eng. 25, p. 501-507, 2001.
  • Alexander, Dave, Tannar, Dave & Wasik, Larry, "Mill Information System uses Dynamic Data Reconciliation for Accurate Energy Accounting", TAPPI Fall Conference 2007.
  • Rankin, J. & Wasik, L. "Dynamic Data Reconciliation of Batch Pulping Processes (for On-Line Prediction)" PAPTAC Spring Conference 2009.
  • S. Narasimhan, C. Jordache, Data Reconciliation and Gross Error Detection: An Intelligent Use of Process Data, Gulf Publishing Company, Houston, 2000.
  • V. Veverka, F. Madron, Material and Energy Balancing in the Process Industries, Elsevier Science BV, Amsterdam, 1997.
  • J. Romagnoli, M.C. Sanchez, Data processing and reconciliation for chemical process operations, Academic Press, 2000.