- Learning Quantitative Finance with R
- Dr. Param Jeet, Prashant Vats
Parameter estimates
In this section, we are going to discuss some of the algorithms used for parameter estimation.
Maximum likelihood estimation
Maximum likelihood estimation (MLE) is a method for estimating model parameters on a given dataset.
Now let us try to find the parameter estimates of the probability density function of the normal distribution.
Let us first generate a series of random variables, which can be done by executing the following code:
> set.seed(100)
> NO_values <- 100
> Y <- rnorm(NO_values, mean = 5, sd = 1)
> mean(Y)
This gives 5.002913.
> sd(Y)
This gives 1.02071.
Now let us write a function that returns the negative log-likelihood, which mle will minimize:
> LogL <- function(mu, sigma) {
+   A = dnorm(Y, mu, sigma)
+   -sum(log(A))
+ }
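For reference, the quantity this function returns is the negative log-likelihood of the normal density:

\[
-\log L(\mu,\sigma) \;=\; \frac{n}{2}\log\!\left(2\pi\sigma^{2}\right) \;+\; \frac{1}{2\sigma^{2}}\sum_{i=1}^{n}\left(y_i-\mu\right)^{2}
\]

Minimizing this quantity over mu and sigma is equivalent to maximizing the likelihood.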
Now let us apply the function mle to estimate the mean and standard deviation:
> library(stats4)
> mle(LogL, start = list(mu = 2, sigma = 2))
Here mu and sigma have been given initial values of 2.
This gives the output as follows:
Figure 2.13: Output for MLE estimation
NaNs are produced when the optimizer tries negative values for the standard deviation. This can be prevented by constraining the search region, as shown here, which also eliminates the warning messages seen in the output in Figure 2.13:
> mle(LogL, start = list(mu = 2, sigma = 2), method = "L-BFGS-B",
+     lower = c(-Inf, 0),
+     upper = c(Inf, Inf))
This, upon execution, gives the best possible fit, as shown here:
Figure 2.14: Revised output for MLE estimation
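As a cross-check (a sketch not taken from the book): for a normal sample, the maximum likelihood estimates also have closed forms, namely the sample mean and the standard deviation computed with a 1/n (rather than 1/(n-1)) denominator. The numerical estimates from mle should be close to these values:

```r
# Closed-form MLE for a normal sample: sample mean and 1/n standard deviation
set.seed(100)
Y <- rnorm(100, mean = 5, sd = 1)

mu_hat    <- mean(Y)                       # ML estimate of mu
sigma_hat <- sqrt(mean((Y - mean(Y))^2))   # ML estimate of sigma (1/n, not 1/(n-1))

mu_hat     # close to 5
sigma_hat  # slightly below sd(Y), which uses the 1/(n-1) denominator
```

This also explains why the sigma reported by mle is a little smaller than the value returned by sd(Y) earlier.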
Linear model
In the linear regression model, we try to predict the dependent/response variable in terms of the independent/predictor variables. We fit the best possible line, known as the regression line, through the given points; the coefficients of the regression line are estimated using statistical software. The intercept of the regression line represents the mean value of the dependent variable when the predictor variable is zero, and the response variable changes by the estimated slope coefficient for each unit change in the predictor variable. Now let us estimate the parameters of a linear regression model where the dependent variable is Adj.Close and the independent variable is Volume of Sampledata. We can fit the linear model as follows:
> Y <- Sampledata$Adj.Close
> X <- Sampledata$Volume
> fit <- lm(Y ~ X)
> summary(fit)
Upon executing the preceding code, the output is generated as given here:
Figure 2.15: Output for linear model estimation
The summary display shows the parameter estimates of the linear regression model. Similarly, we can estimate parameters for other regression models, such as multiple regression or other forms of regression.
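To illustrate the multiple regression case mentioned above, here is a hedged sketch using simulated data (the variable names x1, x2, and y are hypothetical and not part of Sampledata); additional predictors are simply added to the model formula with +:

```r
# Hypothetical multiple regression on simulated data (not from Sampledata)
set.seed(1)
x1 <- rnorm(50)
x2 <- rnorm(50)
y  <- 2 + 1.5 * x1 - 0.5 * x2 + rnorm(50, sd = 0.3)

fit_multi <- lm(y ~ x1 + x2)        # two predictors joined with '+'
summary(fit_multi)$coefficients     # estimates, std. errors, t and p values
```

The coefficient table contains one row for the intercept and one for each predictor, interpreted just as in the single-predictor case.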