Syllabus

Prof. Sonia Petrone

Tutors: Marta Crispino & Lorenzo Cappello

Course contents (May 2015)

General notions.

Aims of time series analysis.

What is a time series? Kinds of data (quantitative and qualitative time series; regular or irregular frequency; high frequency; univariate and multivariate).

(Chatfield, Ch. 1)

Part I. Introduction to time series analysis: descriptive techniques.

Classical time series decomposition. (Chatfield, Ch. 2, Sections 2.1-2.6. Subsection 2.5.2: only the basic moving average. Section 2.7 will be useful later.)

Forecasting. Simple exponential smoothing. Holt and Holt-Winters forecasting procedures (exponential smoothing with trend and seasonality: only the general idea).

Some references: Chatfield (2004), Ch. 5, Section 5.2.2; Petris, Petrone and Campagnoli (2009), Chapter 3, Section 3.3.1.
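As an illustration of the recursion behind simple exponential smoothing (the course's computing is in R; this short Python sketch is only for intuition, and the series and smoothing weight below are made up):

```python
def ses_forecast(y, alpha):
    """Simple exponential smoothing: the level is an exponentially weighted
    average of past observations; alpha in (0, 1) is the smoothing weight.
    forecasts[i] is the level after observing y[i], i.e. the one-step-ahead
    forecast of y[i+1]."""
    level = y[0]                # initialize the level at the first observation
    forecasts = [level]
    for obs in y[1:]:
        level = alpha * obs + (1 - alpha) * level
        forecasts.append(level)
    return forecasts

series = [10.0, 12.0, 11.0, 13.0]
print(ses_forecast(series, alpha=0.5))  # [10.0, 11.0, 11.0, 12.0]
```

A larger alpha makes the forecast react faster to recent observations; alpha near 0 gives a very smooth, slowly adapting level.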

Part II. Probabilistic models for time series analysis.

Time series as a discrete time stochastic process.

Mean and variance function. Autocovariance and autocorrelation functions.

Strong and weak stationarity.

Estimation of the autocorrelation and partial autocorrelation functions. The correlogram.

Examples: White noise. Gaussian processes.

Random walk.

(reference: Chatfield, Ch. 3, Sections 3.1-3.4)

Markov processes. Inference for Markov chains, with examples.

(Reference: lecture notes.)
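As a concrete example of inference for a Markov chain, the maximum likelihood estimate of the transition probabilities is simply the row-normalized matrix of observed transition counts. A minimal Python sketch (the chain and state labels below are invented for illustration):

```python
from collections import Counter

def estimate_transition_matrix(chain, states):
    """MLE of the transition matrix of a finite-state Markov chain:
    count the observed i -> j moves, then normalize each row."""
    counts = Counter(zip(chain, chain[1:]))   # pairs of consecutive states
    P = {}
    for i in states:
        row_total = sum(counts[(i, j)] for j in states)
        P[i] = {j: counts[(i, j)] / row_total if row_total else 0.0
                for j in states}
    return P

chain = ["a", "a", "b", "a", "b", "b", "a"]
P = estimate_transition_matrix(chain, ["a", "b"])
print(P)  # e.g. P["a"]["b"] = 2/3: of the 3 moves out of "a", 2 went to "b"
```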

Stationary time series: ARMA models.

Autoregressive processes.

Moving average processes.

ARMA models.

Basic properties of ARMA processes.

Inference for ARMA models (basic notions for AR processes; how inference can be carried out by exploiting the connection with DLMs).

Box-Jenkins strategy to fit ARMA models.

Model checking (residual analysis; forecasting performance, etc. ).

A basic reference is Chatfield (2004), Ch. 3 (pp. 39-49), Ch. 4, and Ch. 5, Section 5.2.4.
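To connect the AR model with the correlogram discussed in Part II: for a stationary AR(1) the theoretical autocorrelation at lag k is phi**k, and the sample ACF of a simulated path should be close to it. A self-contained Python sketch (parameter values and sample size are arbitrary choices for illustration):

```python
import random

def simulate_ar1(phi, n, sigma=1.0, seed=42):
    """Simulate Y_t = phi * Y_{t-1} + e_t, with e_t ~ N(0, sigma^2)."""
    rng = random.Random(seed)
    y = [0.0]
    for _ in range(n - 1):
        y.append(phi * y[-1] + rng.gauss(0.0, sigma))
    return y

def sample_acf(y, lag):
    """Sample autocorrelation r(lag) = c(lag) / c(0), as in the correlogram."""
    n = len(y)
    mean = sum(y) / n
    c0 = sum((v - mean) ** 2 for v in y) / n
    ck = sum((y[t] - mean) * (y[t + lag] - mean) for t in range(n - lag)) / n
    return ck / c0

y = simulate_ar1(phi=0.8, n=5000)
# For phi = 0.8, r(1) should be close to 0.8 and r(2) close to 0.64.
print(sample_acf(y, 1), sample_acf(y, 2))
```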


Part III. Inference for non-stationary time series: state-space models.

Preliminary notions: elements of Bayesian inference.

Bayesian inference for the mean of a Gaussian population with known variance. Predictive distribution. (Petris, Petrone, Campagnoli (2009), Chapter 1: the Gaussian example with known variance is on page 7.)

State-space models for time series analysis.

(reference: Petris, Petrone, Campagnoli, Ch 2)

Main assumptions.

Examples:

Random walk plus noise.

Hidden Markov models.

Dynamic linear models (DLMs).

Examples of non-linear, non-Gaussian models.

Basic problems: filtering, smoothing, forecasting.

Filtering for DLMs: Kalman filter.

Examples: Random walk plus noise. Local trend model.

Forecasting.

One-step-ahead and k-steps-ahead forecasting.

The innovation process. Model checking.

Smoothing: the Kalman smoother.

(Petris, Petrone, Campagnoli (2009), Chapter 3.)
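For intuition on the Kalman filter in the simplest case, here is a sketch for the random walk plus noise (local level) model, in the a_t, R_t, m_t, C_t notation of Petris, Petrone and Campagnoli. The course's implementations use R (the dlm package); this Python version, with invented data and variances, is only illustrative:

```python
def kalman_local_level(y, V, W, m0=0.0, C0=1e7):
    """Kalman filter for the random walk plus noise (local level) model:
        Y_t  = mu_t + v_t,        v_t ~ N(0, V)
        mu_t = mu_{t-1} + w_t,    w_t ~ N(0, W)
    Returns the filtered means m_t = E(mu_t | y_1, ..., y_t).
    A large C0 expresses vague initial information about the level."""
    m, C = m0, C0
    filtered = []
    for obs in y:
        a, R = m, C + W          # prediction step: a_t, R_t
        K = R / (R + V)          # Kalman gain
        m = a + K * (obs - a)    # update using the forecast error obs - a
        C = (1 - K) * R          # filtered variance C_t
        filtered.append(m)
    return filtered

print(kalman_local_level([1.0, 2.0, 1.5], V=1.0, W=0.5))
```

Note how the gain K = R/(R + V) weighs the new observation against the prediction: with vague initial information the first filtered mean essentially equals the first observation.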

Maximum likelihood estimation of unknown parameters. (Petris, Petrone, Campagnoli (2009), Chapter 4, section 4.1).

Model specification: DLMs for time series analysis.

Combining dynamic linear models.

Structural time series DLMs:

Trend models.

Random walk plus noise. Derivation of its forecast function and main properties.

Local trend (or linear growth) model and its main properties.

DLMs for seasonality (Petris, Petrone, Campagnoli, Chapter 3; only local seasonal factors DLMs; no Fourier-based DLMs).

DLMs for dynamic regression.

Example: dynamic CAPM models.

DLM representations of ARMA models (Chapter 3. Details only for the AR(2) model).

Notion of conditionally Gaussian state space models.

Example: estimating the output gap.

DLMs for multivariate time series.

Vector autoregressive models as DLMs (only the general idea).


Seemingly unrelated time series equation (SUTSE) models.

Factor models (only the general idea).

Bayesian inference for unknown parameters in a DLM.

Basic notions of Bayesian inference vs MLE for unknown parameters in a DLM.

(part of Petris, Petrone, Campagnoli, Chapter 4).

Bayesian inference by stochastic simulation: only the general idea of Markov chain Monte Carlo (MCMC).

(Further reading, not mandatory: Petris, Petrone, Campagnoli, Chapter 1 and Chapter 4.)

TEXTBOOKS:

PART I: Classical time series analysis.

Chatfield, C. (2004). The Analysis of Time Series (6th edition). Chapman and Hall/CRC.

A classical, popular textbook, simple but quite clear and generally well written. We mainly study Chapters 1-5.

PART II: State-space models.

Lecture notes: Introduction to Markov chains posted on learning space.

Petris, G., Petrone, S. and Campagnoli, P. (2009). Dynamic Linear Models with R. Springer, New York.

These are the course lecture notes (and more). Chapters 1-3 and part of Chapter 4.

Further references (NOT exhaustive, and NOT textbooks for this course):

Makridakis, Wheelwright and Hyndman (1998). Forecasting: Methods and Applications (3rd edition). Wiley.

A simple book, but Chapter 3 is a reasonable reference on classical time series decomposition.

Tsay, R.S. (2010). Analysis of Financial Time Series, Third Edition. Wiley-Interscience.

Hamilton (1994). Time Series Analysis. Princeton University Press.

Durbin and Koopman (2001). Time Series Analysis by State Space Methods. Oxford University Press.

Further reading on Kalman filtering and smoothing.

Petris, G. and Petrone, S. (2011). State Space Models in R. Journal of Statistical Software, 41(4), http://www.jstatsoft.org/v41/i04

This special issue of the Journal of Statistical Software, http://www.jstatsoft.org/v41 , presents implementations of state-space models in different software frameworks.

Assignments: The assignments during the course are not compulsory. They can be done individually or in groups. Their aim is to encourage you to practise, try your own data analysis, follow the lectures, and ask questions if anything is unclear.

These assignments are not formally evaluated for the final exam, but students who deliver the periodic assignments can skip the questions on DLMs with R in the written exam.

The final assignment (or final project) consists of a guided analysis of a problem with real data. It is mandatory and counts for 20% of the final grade (see below).

Exam.

Final project: it can be done individually or in groups. The final assignment is due by the day of the written exam (if the members of a group take the written exam on different dates, it has to be delivered on the first date). The final project counts for 20% of the final grade.

Written exam: The written exam includes general questions on the course contents and questions on the implementation of time series analysis with R; the latter are not compulsory for students who delivered the periodic assignments.

The written exam counts for 80% of the final grade. Because the final project is not necessarily done individually, the written exam may count for 100% if it is done badly.

If a student fails the written exam, the final project grade will be kept.

The 80% and 20% weights may change slightly according to the teacher's overall evaluation, which takes into account the work done during the course, the assignments, etc.

Some non-exhaustive examples of possible questions for the written exam (more on the learning space):

A time series is a (discrete time) stochastic process. Explain.

Describe simple exponential smoothing techniques for prediction.

When is a time series (Yt) weakly (strongly) stationary? When and how can you estimate the autocorrelation function?

When is (Yt) a Markov chain? Questions on its properties.

Define an AR(p) model.

Define an AR(1) process and discuss its main properties.

Define a general state space model and provide some examples...

Explain, giving at least an idea of the proofs, the Kalman filter for DLMs.

Compute the forecast function for the random walk plus noise model and give the expression of the credibility intervals for the h-steps-ahead prediction. Comment briefly.

Explain SUTSE models.

Compare MLE and Bayesian estimation of unknown parameters of a DLM.
