
Practice from Introducing Monte Carlo Methods with R

YIK LUN, KEI

allen29@ucla.edu

This paper is a set of practice exercises from the book Introducing Monte Carlo
Methods with R by Christian P. Robert and George Casella. All R code and comments
below belong to the book and its authors.

1. Chi-squared with df = 6
U=runif(3*10^4)
U=matrix(data=U,nrow=3)  #matrix with 3 uniforms per column, one column per draw
X=-log(U)                #uniform to exponential: -log(U) ~ Exp(1)
X=2*apply(X,2,sum)       #2*(sum of 3 Exp(1)) ~ chi-squared with df = 6
Y=rchisq(10^4,df=6)      #direct chi-squared(6) sample for comparison
par(mfrow=c(1,2))
hist(X,breaks=30);hist(Y,breaks=30)
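A minimal extra check, not in the book: the simulated X can be compared against the
chi-squared(6) reference with a Kolmogorov-Smirnov test.
ks.test(X,"pchisq",df=6)  #a small p-value would signal a mismatch with the chi-squared(6) law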

[Figure: two side-by-side histograms, "Histogram of X" and "Histogram of Y", with Frequency on the vertical axis.]

2. Monte Carlo Integration

Example 1
f<-function(x){(x/sqrt(2*pi))*exp(-x^2/2)}  #integrand x*dnorm(x): the mean of N(0,1)
integrate(f,-Inf,Inf)                       #deterministic numerical integration for reference
## 0 with absolute error < 0
n<-10e6
h<-function(x) {x}
x<-rnorm(n,0,1)                             #sample from the N(0,1) density
I<-sum(h(x))/n                              #Monte Carlo estimate of E[h(X)] = E[X] = 0
I
## [1] -8.954987e-05
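A hedged add-on, not from the book: a Monte Carlo standard error can be attached to the
estimate from the sample standard deviation of h(X) over the n draws.
se<-sd(h(x))/sqrt(n)          #standard error of the Monte Carlo estimate
c(estimate=I, std.error=se)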

Example 2
f<-function(x){(sqrt(abs(x))/sqrt(2*pi))*exp(-x^2/2)}  #integrand sqrt(|x|)*dnorm(x)
integrate(f,-Inf,Inf)                                  #deterministic numerical integration for reference
## 0.822179 with absolute error < 9.5e-05
n<-10e6
h<-function(x) {sqrt(abs(x))}
x<-rnorm(n,0,1)                                        #sample from the N(0,1) density
I<-sum(h(x))/n                                         #Monte Carlo estimate of E[sqrt(|X|)]
I
## [1] 0.8222552

3. E(X^2)
For X ~ N(0,1), the true value is E(X^2) = Var(X) + E(X)^2 = 1.
n<-1e4
iter<-1e3
I_hat<-c()
for (i in 1:iter){
I_hat[i] = mean(rnorm(n)^2)  #one Monte Carlo estimate of E(X^2) per replication
}
E<-mean(I_hat)
E
## [1] 1.000269

E = function(n,iter) {
counter = 0
for (i in 1:iter){
x = rnorm(n)
I_hat = mean(x^2)
v = var(x^2)
sd = sqrt(v/n)               # standard error of I_hat
l = I_hat + qnorm(0.025)*sd  # lower 95% confidence bound
u = I_hat + qnorm(0.975)*sd  # upper 95% confidence bound
if (l<=1 & u>=1){ # checks whether the confidence interval contains 1
counter = counter+1
}
}
fraction = counter/iter
return(fraction)
}
cat("The fraction of time that CIs capture the true value =", E(1e4,1e3))

## The fraction of time that CIs capture the true value = 0.959

4. Variance Reduction
(1) The variance of the estimator determines the efficiency of a Monte Carlo method.
(2) That variance depends only on the sample size, not on the dimensionality of X; a quick check of the Var(h(X))/n rate is sketched below.
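A minimal sketch, not from the book, assuming h(x) = sqrt(1 - x^2) and U ~ unif(0,1): the
variance of the basic Monte Carlo estimator is Var(h(U))/n, so it shrinks like 1/n whatever
the dimension of the underlying problem.
h<-function(x) sqrt(1-x^2)
n<-1e4
empirical<-var(replicate(1e3, mean(h(runif(n)))))  #variance across repeated estimates
theoretical<-var(h(runif(1e6)))/n                  #Var(h(U))/n
c(empirical=empirical, theoretical=theoretical)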
1. Original method: with X = R cos(theta), Y = R sin(theta), pi is the area of the disc X^2 + Y^2 <= 1; here pi is estimated as 4 times the fraction of uniform points on the unit square that land in the quarter disc.
n<- 1e4;m<- 1e3
pi1<- function(n){
X<- cbind(runif(n), runif(n))
in_circle <- which(X[,1]^2 + X[,2]^2 <= 1)
fraction <- length(in_circle)/n
return(4*fraction)
}
original <- replicate(m,pi1(n)) #Instead of for-loop, use replicate

2. Conditioning method: P(X^2 + Y^2 <= 1 | X = x) = sqrt(1 - x^2), since given X = x the uniform Y lands inside the circle exactly when Y <= sqrt(1 - x^2).

pi2 <- function(n){
return (4 * mean(sqrt(1-runif(n)^2)))
}
conditioning = replicate(m,pi2(n))
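A quick sanity check, not from the book: four times the integral of sqrt(1 - x^2) over [0, 1]
is pi, which is what pi2 estimates by Monte Carlo.
4*integrate(function(x) sqrt(1-x^2),0,1)$value  #equals pi up to numerical error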

3. Antithetic method: draw U ~ unif(0,1) and reuse 1 - U, estimating I = E[(h(U) + h(1 - U))/2]; the negative correlation between h(U) and h(1 - U) lowers the variance.
pi3 <- function(n){
x<- runif(n)
return(4*mean((sqrt(1-x^2)+sqrt(1-(1-x)^2))/2))
}
antithetic <- replicate(m,pi3(n))

4. Control variate: E(1 - U) = 1/2 exactly, so there is no fluctuation in the 1/2 term; subtracting (1 - U) and adding back its known mean 1/2 removes part of the variability.

pi4 <- function(n){
x<- runif(n)
average <- mean(sqrt(1-x^2)-(1-x))
return(4*(average+1/2))
}
control <- replicate(m,pi4(n))
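A quick check, not from the book: sqrt(1 - U^2) and 1 - U are strongly positively correlated,
which is what makes the control variate effective.
u<-runif(1e5)
cor(sqrt(1-u^2),1-u)  #strong positive correlation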

5. Stratified sampling:
pi5 <- function(n){
x<- 0.1*runif(n/10) + seq(0,0.9,by=0.1) # n/10 points spread evenly over the 10 strata [0,0.1),...,[0.9,1)
return (4*mean(sqrt(1-x^2)))
}
stratified <- replicate(m,pi5(n))

6. Results
mean(original)
## [1] 3.142217
mean(conditioning)
## [1] 3.1414
mean(antithetic)
## [1] 3.141789
mean(control)
## [1] 3.141543
mean(stratified)
## [1] 3.141916
var(original)
## [1] 0.0002748538
var(conditioning)
## [1] 7.908459e-05
var(antithetic)
## [1] 1.039312e-05
var(control)
## [1] 2.316289e-05
var(stratified)
## [1] 2.458501e-05
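A short follow-up, not from the book: the variance-reduction factor of each method relative
to the original hit-or-miss estimator can be read off directly.
round(var(original)/c(conditioning=var(conditioning), antithetic=var(antithetic),
control=var(control), stratified=var(stratified)),1)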

Reference:
Robert, Christian, and George Casella. Introducing Monte Carlo
Methods with R. Springer Science & Business Media, 2009.
