
mvrandn                                                        R Documentation

Truncated multivariate normal generator

Description

Simulates n independent and identically distributed random vectors from the d-dimensional N(0,Σ) distribution (zero-mean normal with covariance matrix Σ), conditional on l < X < u. Infinite values for l and u are accepted.

Usage

mvrandn(l, u, Sig, n)

Arguments

l

lower truncation limit

u

upper truncation limit

Sig

covariance matrix of N(0,Σ)

n

number of simulated vectors

Details

Bivariate normal:

Suppose we wish to simulate a bivariate X from N(μ,Σ), conditional on X_1 - X_2 < -6. We can recast this as the problem of simulating Y from N(0, A Σ A^T) for an appropriate matrix A, conditional on l - Aμ < Y < u - Aμ, and then setting X = μ + A^{-1} Y. See the sketch below and the full example code in the Examples section.
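
A minimal sketch of this recasting, using the same μ, Σ, and constraint as in the Examples section (the number of draws here is only illustrative):

# sketch only: A maps X to (X_1 - X_2, X_2), so the constraint becomes (A X)_1 < -6
A   = matrix(c(1, 0, -1, 1), 2, 2)            # rows (1, -1) and (0, 1)
mu  = c(-3, 0); Sig = matrix(c(1, 0.9, 0.9, 1), 2, 2)
l   = c(-Inf, -Inf); u = c(-6, Inf)           # only the first component is bounded above
Y   = mvrandn(l - A %*% mu, u - A %*% mu, A %*% Sig %*% t(A), 100)
X   = rep(mu, 100) + solve(A) %*% Y           # map back: X = mu + A^{-1} Y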

Exact posterior simulation for Probit regression:

Consider the Bayesian probit regression model applied to the lupus dataset, with prior N(0, ν^2 I) on the regression coefficients β. To simulate from the Bayesian posterior exactly, we first simulate Z from N(0,Σ), where Σ = I + ν^2 X X^T, conditional on Z ≥ 0. We then draw the posterior regression coefficients β of the probit regression as (β | Z) from N(C X^T Z, C), where C^{-1} = I/ν^2 + X^T X. A minimal sketch follows; the full example code is given in the Examples section.
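
A minimal sketch of the two steps, assuming the design matrix X already has its rows multiplied by 2y - 1 and that the prior scale nu and the number of draws n are defined (as in the Examples section):

# sketch only; assumes X (rows already multiplied by 2*y - 1), nu, and n are defined
m = nrow(X); d = ncol(X)
Sig = diag(m) + nu^2 * X %*% t(X)                # Sigma = I + nu^2 X X^T
Z = mvrandn(rep(0, m), rep(Inf, m), Sig, n)      # step 1: draw Z ~ N(0, Sigma) given Z >= 0
C = solve(diag(d) / nu^2 + t(X) %*% X)           # C^{-1} = I/nu^2 + X^T X
beta = C %*% t(X) %*% Z + t(chol(C)) %*% matrix(rnorm(d * n), d, n)  # step 2: beta | Z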

Value

The output is a d-by-n matrix whose columns are the random vectors X drawn from N(0,Σ), conditional on l < X < u.
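
For example, a small illustrative call (not taken from the package examples) to check the output shape:

X = mvrandn(rep(-1, 3), rep(1, 3), diag(3), 500)  # illustrative call: d = 3, n = 500
dim(X)                                            # c(3, 500): one column per simulated vector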

Note

The algorithm may fail or become very inefficient if Σ is close to being rank-deficient.

Author(s)

Z. I. Botev, email: botev@unsw.edu.au and web page: http://web.maths.unsw.edu.au/~zdravkobotev/

References

Z. I. Botev (2015), The Normal Law Under Linear Restrictions: Simulation and Estimation via Minimax Tilting, submitted to JRSS(B)

See Also

mvNqmc and mvNcdf

Examples

# Bivariate example.

Sig = matrix(c(1, 0.9, 0.9, 1), 2, 2)     # covariance of X
mu  = c(-3, 0)                            # mean of X
l = c(-Inf, -Inf); u = c(-6, Inf)         # truncation limits on A %*% X
A = matrix(c(1, 0, -1, 1), 2, 2)          # A %*% X = (X_1 - X_2, X_2)
n = 10^3                                  # number of sampled vectors
Y = mvrandn(l - A %*% mu, u - A %*% mu, A %*% Sig %*% t(A), n)
X = rep(mu, n) + solve(A, diag(2)) %*% Y  # apply the inverse map as explained above
plot(X[1, ], X[2, ])                      # scatterplot of exactly simulated points

# Exact Bayesian Posterior Simulation Example.

data("lupus")                       # load the lupus data
Y = lupus[, 1]                      # response data
X = lupus[, -1]                     # construct the design matrix
m = dim(X)[1]; d = dim(X)[2]        # dimensions of the problem
X = diag(2 * Y - 1) %*% X           # incorporate the responses into the design matrix
nu = sqrt(10000)                    # prior scale parameter
C = solve(diag(d) / nu^2 + t(X) %*% X)
L = t(chol(t(C)))                   # lower-triangular Cholesky factor of C
Sig = diag(m) + nu^2 * X %*% t(X)   # marginal covariance of Z (with beta integrated out)
l = rep(0, m); u = rep(Inf, m)      # the constraint is Z >= 0
est = mvNcdf(l, u, Sig, 10^3)       # estimate the acceptance probability of crude Monte Carlo
print(est$upbnd / est$prob)         # reciprocal of the acceptance rate of the accept-reject sampler
n = 10^4                            # number of iid draws
z = mvrandn(l, u, Sig, n)           # sample exactly from the auxiliary distribution of Z
beta = L %*% matrix(rnorm(d * n), d, n) + C %*% t(X) %*% z  # draw beta | Z ~ N(C X^T z, C)
boxplot(t(beta))                    # boxplots of the marginal posteriors of the coefficients
print(rowMeans(beta))               # posterior means
