Book Details
Modern Compiler Implementation in C (English Edition)
Authors: Andrew W. Appel and Maia Ginsburg (USA)
Publisher: Posts & Telecom Press (人民邮电出版社)
Publication date: 2005-09-01
ISBN: 9787115137715
List price: ¥59.00
Synopsis
This is a renowned textbook for compiler courses, adopted at many leading universities around the world, including MIT, the University of California at Berkeley, Princeton University, and the University of Cambridge. Known internationally as the "Tiger Book," it ranks alongside the "Dragon Book," Compilers by Alfred Aho et al. Compared with the other classic texts on compilers, it was published later and its content is more up to date.
The book provides students with a course project written in C, covering both front-end and back-end design, from which students can build a fully functional compiler within one semester. It covers every component of a modern compiler: lexical analysis, parsing, abstract syntax, semantic checking, intermediate representations, instruction selection, dataflow analysis, register allocation, and runtime systems. Unlike most compiler textbooks, it treats code generation and register allocation for functional and object-oriented languages as well, and it gives actual C header files for the interfaces between the compiler's modules.
The book is in two parts. Part I covers the fundamentals of compilation and is suitable for a first, one-semester compiler course. Part II covers advanced topics, including object-oriented and functional languages, garbage collection, loop optimizations, SSA (static single-assignment) form, loop scheduling, and optimizations for memory hierarchies; it is suitable for topical lectures, follow-on courses, or graduate teaching. The book is intended for undergraduate and graduate students in computer science and related fields, and will also be useful to researchers and practitioners.
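As an illustration of the module-interface style the synopsis describes, here is a minimal sketch of what one such C header might look like at the boundary between a scanner and the rest of a compiler. The names used here (TokenKind, Token, lex_init, lex_next) are illustrative assumptions, not the actual headers from the book's course project.

    /* lexer.h -- hypothetical scanner interface, sketched to illustrate the
     * per-module C header files the synopsis mentions; not the book's own. */
    #ifndef LEXER_H
    #define LEXER_H

    /* The kinds of token the scanner can return. */
    typedef enum { TK_ID, TK_INT, TK_PLUS, TK_LPAREN, TK_RPAREN, TK_EOF } TokenKind;

    /* One token, with its source position for error reporting. */
    typedef struct {
        TokenKind kind;
        int line;
        union {
            int num;        /* value when kind == TK_INT */
            const char *id; /* name when kind == TK_ID   */
        } u;
    } Token;

    /* Open a source file for scanning; returns nonzero on success. */
    int lex_init(const char *filename);

    /* Return the next token from the input stream. */
    Token lex_next(void);

    #endif /* LEXER_H */

A parser module would depend only on this header, so the scanner's implementation can change freely; this is the kind of decoupling that per-module header files provide.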
About the Authors
No author biography is available for Modern Compiler Implementation in C (English Edition).
Table of Contents
1 INTRODUCTION 1
1.1 Four Important Practical Problems 2
1.1.1 Forecasting Time Series, 2
1.1.2 Estimation of Transfer Functions, 3
1.1.3 Analysis of Effects of Unusual Intervention Events to a System, 4
1.1.4 Discrete Control Systems, 5
1.2 Stochastic and Deterministic Dynamic Mathematical Models 7
1.2.1 Stationary and Nonstationary Stochastic Models for Forecasting and Control, 7
1.2.2 Transfer Function Models, 12
1.2.3 Models for Discrete Control Systems, 14
1.3 Basic Ideas in Model Building 16
1.3.1 Parsimony, 16
1.3.2 Iterative Stages in the Selection of a Model, 16
Part I Stochastic Models and Their Forecasting 19
2 AUTOCORRELATION FUNCTION AND SPECTRUM OF STATIONARY PROCESSES 21
2.1 Autocorrelation Properties of Stationary Models 21
2.1.1 Time Series and Stochastic Processes, 21
2.1.2 Stationary Stochastic Processes, 23
2.1.3 Positive Definiteness and the Autocovariance Matrix, 26
2.1.4 Autocovariance and Autocorrelation Functions, 29
2.1.5 Estimation of Autocovariance and Autocorrelation Functions, 30
2.1.6 Standard Error of Autocorrelation Estimates, 32
2.2 Spectral Properties of Stationary Models 35
2.2.1 Periodogram of a Time Series, 35
2.2.2 Analysis of Variance, 36
2.2.3 Spectrum and Spectral Density Function, 37
2.2.4 Simple Examples of Autocorrelation and Spectral Density Functions, 41
2.2.5 Advantages and Disadvantages of the Autocorrelation and Spectral Density Functions, 43
A2.1 Link Between the Sample Spectrum and Autocovariance Function Estimate 44
3 LINEAR STATIONARY MODELS 46
3.1 General Linear Process 46
3.1.1 Two Equivalent Forms for the Linear Process, 46
3.1.2 Autocovariance Generating Function of a Linear Process, 49
3.1.3 Stationarity and Invertibility Conditions for a Linear Process, 50
3.1.4 Autoregressive and Moving Average Processes, 52
3.2 Autoregressive Processes 54
3.2.1 Stationarity Conditions for Autoregressive Processes, 54
3.2.2 Autocorrelation Function and Spectrum of Autoregressive Processes, 55
3.2.3 First-Order Autoregressive (Markov) Process, 58
3.2.4 Second-Order Autoregressive Process, 60
3.2.5 Partial Autocorrelation Function, 64
3.2.6 Estimation of the Partial Autocorrelation Function, 67
3.2.7 Standard Errors of Partial Autocorrelation Estimates, 68
3.3 Moving Average Processes 69
3.3.1 Invertibility Conditions for Moving Average Processes, 69
3.3.2 Autocorrelation Function and Spectrum of Moving Average Processes, 70
3.3.3 First-Order Moving Average Process, 72
3.3.4 Second-Order Moving Average Process, 73
3.3.5 Duality Between Autoregressive and Moving Average Processes, 75
3.4 Mixed Autoregressive-Moving Average Processes 77
3.4.1 Stationarity and Invertibility Properties, 77
3.4.2 Autocorrelation Function and Spectrum of Mixed Processes, 78
3.4.3 First-Order Autoregressive-First-Order Moving Average Process, 80
3.4.4 Summary, 83
A3.1 Autocovariances, Autocovariance Generating Function, and Stationarity Conditions for a General Linear Process 85
A3.2 Recursive Method for Calculating Estimates of Autoregressive Parameters 87
4 LINEAR NONSTATIONARY MODELS 89
4.1 Autoregressive Integrated Moving Average Processes 89
4.1.1 Nonstationary First-Order Autoregressive Process, 89
4.1.2 General Model for a Nonstationary Process Exhibiting Homogeneity, 92
4.1.3 General Form of the Autoregressive Integrated Moving Average Process, 96
4.2 Three Explicit Forms for the Autoregressive Integrated Moving Average Model 99
4.2.1 Difference Equation Form of the Model, 99
4.2.2 Random Shock Form of the Model, 100
4.2.3 Inverted Form of the Model, 106
4.3 Integrated Moving Average Processes 109
4.3.1 Integrated Moving Average Process of Order (0, 1, 1), 110
4.3.2 Integrated Moving Average Process of Order (0, 2, 2), 114
4.3.3 General Integrated Moving Average Process of Order (0, d, q), 118
A4.1 Linear Difference Equations 120
A4.2 IMA(0, 1, 1) Process With Deterministic Drift 125
A4.3 ARIMA Processes With Added Noise 126
A4.3.1 Sum of Two Independent Moving Average Processes, 126
A4.3.2 Effect of Added Noise on the General Model, 127
A4.3.3 Example for an IMA(0, 1, 1) Process with Added White Noise, 128
A4.3.4 Relation Between the IMA(0, 1, 1) Process and a Random Walk, 129
A4.3.5 Autocovariance Function of the General Model with Added Correlated Noise, 129
5 FORECASTING 131
5.1 Minimum Mean Square Error Forecasts and Their Properties 131
5.1.1 Derivation of the Minimum Mean Square Error Forecasts, 133
5.1.2 Three Basic Forms for the Forecast, 135
5.2 Calculating and Updating Forecasts 139
5.2.1 Convenient Format for the Forecasts, 139
5.2.2 Calculation of the ψ Weights, 139
5.2.3 Use of the ψ Weights in Updating the Forecasts, 141
5.2.4 Calculation of the Probability Limits of the Forecasts at Any Lead Time, 142
5.3 Forecast Function and Forecast Weights 145
5.3.1 Eventual Forecast Function Determined by the Autoregressive Operator, 146
5.3.2 Role of the Moving Average Operator in Fixing the Initial Values, 147
5.3.3 Lead l Forecast Weights, 148
5.4 Examples of Forecast Functions and Their Updating 151
5.4.1 Forecasting an IMA(0, 1, 1) Process, 151
5.4.2 Forecasting an IMA(0, 2, 2) Process, 154
5.4.3 Forecasting a General IMA(0, d, q) Process, 156
5.4.4 Forecasting Autoregressive Processes, 157
5.4.5 Forecasting a (1, 0, 1) Process, 160
5.4.6 Forecasting a (1, 1, 1) Process, 162
5.5 Use of State Space Model Formulation for Exact Forecasting 163
5.5.1 State Space Model Representation for the ARIMA Process, 163
5.5.2 Kalman Filtering Relations for Use in Prediction, 164
5.6 Summary 166
A5.1 Correlations Between Forecast Errors 169
A5.1.1 Autocorrelation Function of Forecast Errors at Different Origins, 169
A5.1.2 Correlation Between Forecast Errors at the Same Origin with Different Lead Times, 170
A5.2 Forecast Weights for Any Lead Time 172
A5.3 Forecasting in Terms of the General Integrated Form 174
A5.3.1 General Method of Obtaining the Integrated Form, 174
A5.3.2 Updating the General Integrated Form, 176
A5.3.3 Comparison with the Discounted Least Squares Method, 176
Part II Stochastic Model Building 181
6 MODEL IDENTIFICATION 183
6.1 Objectives of Identification 183
6.1.1 Stages in the Identification Procedure, 184
6.2 Identification Techniques 184
6.2.1 Use of the Autocorrelation and Partial Autocorrelation Functions in Identification, 184
6.2.2 Standard Errors for Estimated Autocorrelations and Partial Autocorrelations, 188
6.2.3 Identification of Some Actual Time Series, 188
6.2.4 Some Additional Model Identification Tools, 197
6.3 Initial Estimates for the Parameters 202
6.3.1 Uniqueness of Estimates Obtained from the Autocovariance Function, 202
6.3.2 Initial Estimates for Moving Average Processes, 202
6.3.3 Initial Estimates for Autoregressive Processes, 204
6.3.4 Initial Estimates for Mixed Autoregressive-Moving Average Processes, 206
6.3.5 Choice Between Stationary and Nonstationary Models in Doubtful Cases, 207
6.3.6 More Formal Tests for Unit Roots in ARIMA Models, 208
6.3.7 Initial Estimate of Residual Variance, 211
6.3.8 Approximate Standard Error for w̄, 212
6.4 Model Multiplicity 214
6.4.1 Multiplicity of Autoregressive-Moving Average Models, 214
6.4.2 Multiple Moment Solutions for Moving Average Parameters, 216
6.4.3 Use of the Backward Process to Determine Starting Values, 218
A6.1 Expected Behavior of the Estimated Autocorrelation Function for a Nonstationary Process 218
A6.2 General Method for Obtaining Initial Estimates of the Parameters of a Mixed Autoregressive-Moving Average Process 220
7 MODEL ESTIMATION 224
7.1 Study of the Likelihood and Sum of Squares Functions 224
7.1.1 Likelihood Function, 224
7.1.2 Conditional Likelihood for an ARIMA Process, 226
7.1.3 Choice of Starting Values for Conditional Calculation, 227
7.1.4 Unconditional Likelihood; Sum of Squares Function; Least Squares Estimates, 228
7.1.5 General Procedure for Calculating the Unconditional Sum of Squares, 233
7.1.6 Graphical Study of the Sum of Squares Function, 238
7.1.7 Description of "Well-Behaved" Estimation Situations; Confidence Regions, 241
7.2 Nonlinear Estimation 248
7.2.1 General Method of Approach, 248
7.2.2 Numerical Estimates of the Derivatives, 249
7.2.3 Direct Evaluation of the Derivatives, 251
7.2.4 General Least Squares Algorithm for the Conditional Model, 252
7.2.5 Summary of Models Fitted to Series A to F, 255
7.2.6 Large-Sample Information Matrices and Covariance Estimates, 256
7.3 Some Estimation Results for Specific Models 259
7.3.1 Autoregressive Processes, 260
7.3.2 Moving Average Processes, 262
7.3.3 Mixed Processes, 262
7.3.4 Separation of Linear and Nonlinear Components in Estimation, 263
7.3.5 Parameter Redundancy, 264
7.4 Estimation Using Bayes' Theorem 267
7.4.1 Bayes' Theorem, 267
7.4.2 Bayesian Estimation of Parameters, 269
7.4.3 Autoregressive Processes, 270
7.4.4 Moving Average Processes, 272
7.4.5 Mixed Processes, 274
7.5 Likelihood Function Based on the State Space Model 275
A7.1 Review of Normal Distribution Theory 279
A7.1.1 Partitioning of a Positive-Definite Quadratic Form, 279
A7.1.2 Two Useful Integrals, 280
A7.1.3 Normal Distribution, 281
A7.1.4 Student's t-Distribution, 283
A7.2 Review of Linear Least Squares Theory 286
A7.2.1 Normal Equations, 286
A7.2.2 Estimation of Residual Variance, 287
A7.2.3 Covariance Matrix of Estimates, 288
A7.2.4 Confidence Regions, 288
A7.2.5 Correlated Errors, 288
A7.3 Exact Likelihood Function for Moving Average and Mixed Processes 289
A7.4 Exact Likelihood Function for an Autoregressive Process 296
A7.5 Examples of the Effect of Parameter Estimation Errors on Probability Limits for Forecasts 304
A7.6 Special Note on Estimation of Moving Average Parameters 307
8 MODEL DIAGNOSTIC CHECKING 308
8.1 Checking the Stochastic Model 308
8.1.1 General Philosophy, 308
8.1.2 Overfitting, 309
8.2 Diagnostic Checks Applied to Residuals 312
8.2.1 Autocorrelation Check, 312
8.2.2 Portmanteau Lack-of-Fit Test, 314
8.2.3 Model Inadequacy Arising from Changes in Parameter Values, 317
8.2.4 Score Tests for Model Checking, 318
8.2.5 Cumulative Periodogram Check, 321
8.3 Use of Residuals to Modify the Model 324
8.3.1 Nature of the Correlations in the Residuals When an Incorrect Model Is Used, 324
8.3.2 Use of Residuals to Modify the Model, 325
9 SEASONAL MODELS 327
9.1 Parsimonious Models for Seasonal Time Series 327
9.1.1 Fitting versus Forecasting, 328
9.1.2 Seasonal Models Involving Adaptive Sines and Cosines, 329
9.1.3 General Multiplicative Seasonal Model, 330
9.2 Representation of the Airline Data by a Multiplicative (0, 1, 1) × (0, 1, 1)₁₂ Seasonal Model 333
9.2.1 Multiplicative (0, 1, 1) × (0, 1, 1)₁₂ Model, 333
9.2.2 Forecasting, 334
9.2.3 Identification, 341
9.2.4 Estimation, 344
9.2.5 Diagnostic Checking, 349
9.3 Some Aspects of More General Seasonal Models 351
9.3.1 Multiplicative and Nonmultiplicative Models, 351
9.3.2 Identification, 353
9.3.3 Estimation, 355
9.3.4 Eventual Forecast Functions for Various Seasonal Models, 355
9.3.5 Choice of Transformation, 358
9.4 Structural Component Models and Deterministic Seasonal Components 359
9.4.1 Deterministic Seasonal and Trend Components and Common Factors, 360
9.4.2 Models with Regression Terms and Time Series Error Terms, 361
A9.1 Autocovariances for Some Seasonal Models 366
Part III Transfer Function Model Building 371
10 TRANSFER FUNCTION MODELS 373
10.1 Linear Transfer Function Models 373
10.1.1 Discrete Transfer Function, 374
10.1.2 Continuous Dynamic Models Represented by Differential Equations, 376
10.2 Discrete Dynamic Models Represented by Difference Equations 381
10.2.1 General Form of the Difference Equation, 381
10.2.2 Nature of the Transfer Function, 383
10.2.3 First- and Second-Order Discrete Transfer Function Models, 384
10.2.4 Recursive Computation of Output for Any Input, 390
10.2.5 Transfer Function Models with Added Noise, 392
10.3 Relation Between Discrete and Continuous Models 392
10.3.1 Response to a Pulsed Input, 393
10.3.2 Relationships for First- and Second-Order Coincident Systems, 395
10.3.3 Approximating General Continuous Models by Discrete Models, 398
A10.1 Continuous Models With Pulsed Inputs 399
A10.2 Nonlinear Transfer Functions and Linearization 404
11 IDENTIFICATION, FITTING, AND CHECKING OF TRANSFER FUNCTION MODELS 407
11.1 Cross Correlation Function 408
11.1.1 Properties of the Cross Covariance and Cross Correlation Functions, 408
11.1.2 Estimation of the Cross Covariance and Cross Correlation Functions, 411
11.1.3 Approximate Standard Errors of Cross Correlation Estimates, 413
11.2 Identification of Transfer Function Models 415
11.2.1 Identification of Transfer Function Models by Prewhitening the Input, 417
11.2.2 Example of the Identification of a Transfer Function Model, 419
11.2.3 Identification of the Noise Model, 422
11.2.4 Some General Considerations in Identifying Transfer Function Models, 424
11.3 Fitting and Checking Transfer Function Models 426
11.3.1 Conditional Sum of Squares Function, 426
11.3.2 Nonlinear Estimation, 429
11.3.3 Use of Residuals for Diagnostic Checking, 431
11.3.4 Specific Checks Applied to the Residuals, 432
11.4 Some Examples of Fitting and Checking Transfer Function Models 435
11.4.1 Fitting and Checking of the Gas Furnace Model, 435
11.4.2 Simulated Example with Two Inputs, 441
11.5 Forecasting Using Leading Indicators 444
11.5.1 Minimum Mean Square Error Forecast, 444
11.5.2 Forecast of CO2 Output from Gas Furnace, 448
11.5.3 Forecast of Nonstationary Sales Data Using a Leading Indicator, 451
11.6 Some Aspects of the Design of Experiments to Estimate Transfer Functions 453
A11.1 Use of Cross Spectral Analysis for Transfer Function Model Identification 455
A11.1.1 Identification of Single Input Transfer Function Models, 455
A11.1.2 Identification of Multiple Input Transfer Function Models, 456
A11.2 Choice of Input to Provide Optimal Parameter Estimates 457
A11.2.1 Design of Optimal Inputs for a Simple System, 457
A11.2.2 Numerical Example, 460
12 INTERVENTION ANALYSIS MODELS AND OUTLIER DETECTION 462
12.1 Intervention Analysis Methods 462
12.1.1 Models for Intervention Analysis, 462
12.1.2 Example of Intervention Analysis, 465
12.1.3 Nature of the MLE for a Simple Level Change Parameter Model, 466
12.2 Outlier Analysis for Time Series 469
12.2.1 Models for Additive and Innovational Outliers, 469
12.2.2 Estimation of Outlier Effect for Known Timing of the Outlier, 470
12.2.3 Iterative Procedure for Outlier Detection, 471
12.2.4 Examples of Analysis of Outliers, 473
12.3 Estimation for ARMA Models With Missing Values 474
Part IV Design of Discrete Control Schemes 481
13 ASPECTS OF PROCESS CONTROL 483
13.1 Process Monitoring and Process Adjustment 484
13.1.1 Process Monitoring, 484
13.1.2 Process Adjustment, 487
13.2 Process Adjustment Using Feedback Control 488
13.2.1 Feedback Adjustment Chart, 489
13.2.2 Modeling the Feedback Loop, 492
13.2.3 Simple Models for Disturbances and Dynamics, 493
13.2.4 General Minimum Mean Square Error Feedback Control Schemes, 497
13.2.5 Manual Adjustment for Discrete Proportional-Integral Schemes, 499
13.2.6 Complementary Roles of Monitoring and Adjustment, 503
13.3 Excessive Adjustment Sometimes Required by MMSE Control 505
13.3.1 Constrained Control, 506
13.4 Minimum Cost Control With Fixed Costs of Adjustment And Monitoring 508
13.4.1 Bounded Adjustment Scheme for Fixed Adjustment Cost, 508
13.4.2 Indirect Approach for Obtaining a Bounded Adjustment Scheme, 510
13.4.3 Inclusion of the Cost of Monitoring, 511
13.5 Monitoring Values of Parameters of Forecasting and Feedback Adjustment Schemes 514
A13.1 Feedback Control Schemes Where the Adjustment Variance Is Restricted 516
A13.1.1 Derivation of Optimal Adjustment, 517
A13.2 Choice of the Sampling Interval 526
A13.2.1 Illustration of the Effect of Reducing Sampling Frequency, 526
A13.2.2 Sampling an IMA(0, 1, 1) Process, 526
Part V Charts and Tables 531
COLLECTION OF TABLES AND CHARTS 533
COLLECTION OF TIME SERIES USED FOR EXAMPLES IN THE TEXT AND IN EXERCISES 540
REFERENCES 556
Part VI EXERCISES AND PROBLEMS 569
INDEX 589