rnmixGibbs              package:bayesm              R Documentation

_G_i_b_b_s _S_a_m_p_l_e_r _f_o_r _N_o_r_m_a_l _M_i_x_t_u_r_e_s

_D_e_s_c_r_i_p_t_i_o_n:

     'rnmixGibbs' implements a Gibbs Sampler for normal mixtures.

_U_s_a_g_e:

     rnmixGibbs(Data, Prior, Mcmc)

_A_r_g_u_m_e_n_t_s:

    Data: list(y) 

   Prior: list(Mubar,A,nu,V,a,ncomp) (only ncomp required)

    Mcmc: list(R,keep) (R required) 

_D_e_t_a_i_l_s:

     Model: 
      y_i ~ N(mu_{ind_i},Sigma_{ind_i}). 
      ind ~ iid multinomial(p).  p is a ncomp x 1 vector of probs. 

     Priors:
      mu_j ~ N(mubar,Sigma_j (x) A^{-1}). mubar=vec(Mubar). 
      Sigma_j ~ IW(nu,V).
      note: this is the natural conjugate prior - a special case of
     multivariate regression.
      p ~ Dirichlet(a).
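The priors above can be sampled directly in base R. A minimal sketch with hypothetical values (k = 2, ncomp = 3; none of these numbers come from this help page):

```r
## Sketch: one draw from the priors above, base R only.
set.seed(1)
k <- 2; ncomp <- 3
nu <- k + 3; V <- nu * diag(k)       # hypothetical, matching the defaults below
A <- 0.01; mubar <- rep(0, k)
a <- rep(5, ncomp)

## Sigma_j ~ IW(nu,V): invert a Wishart(nu, V^{-1}) draw
Sigma <- solve(rWishart(1, nu, solve(V))[, , 1])
## mu_j | Sigma_j ~ N(mubar, Sigma_j/A)  (Sigma_j (x) A^{-1} with scalar A)
mu <- mubar + t(chol(Sigma / A)) %*% rnorm(k)
## p ~ Dirichlet(a) via normalized gamma draws
g <- rgamma(ncomp, shape = a)
p <- g / sum(g)
```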

     Output of the components is in the form of a list of lists. 
      compdraw[[i]] is the ith draw - a list of ncomp lists. 
      compdraw[[i]][[j]] is the list of parms for the jth normal component. 
      jcomp=compdraw[[i]][[j]]. Then the jth comp ~ N(jcomp[[1]],Sigma), 
     Sigma = t(R)%*%R, R^{-1} = jcomp[[2]].
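The covariance matrix can be recovered from a stored draw by inverting the root. A base-R sketch, where jcomp is a hypothetical draw built to mimic the compdraw[[i]][[j]] layout described above:

```r
## Sketch: recover Sigma from a stored component draw (base R only).
Sigma_true <- matrix(c(2, 0.5, 0.5, 1), nrow = 2)
Rchol <- chol(Sigma_true)               # upper-tri root: Sigma = t(R) %*% R
jcomp <- list(c(0, 1), solve(Rchol))    # [[1]] = mean, [[2]] = R^{-1}
Sigma_rec <- crossprod(solve(jcomp[[2]]))   # t(R) %*% R recovers Sigma
all.equal(Sigma_rec, Sigma_true)        # TRUE
```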

     List arguments contain:

   _y n x k array of data (rows are obs) 

   _M_u_b_a_r 1 x k array with prior mean of normal comp means (def: 0)

   _A 1 x 1 precision parameter for prior on mean of normal comp (def:
        .01)

   _n_u d.f. parameter for prior on Sigma (normal comp cov matrix) (def:
        k+3)

   _V k x k location matrix of IW prior on Sigma (def: nuI)

   _a ncomp x 1 vector of Dirichlet prior parms (def: rep(5,ncomp))

   _n_c_o_m_p number of normal components to be included 

   _R number of MCMC draws 

   _k_e_e_p MCMC thinning parm: keep every keepth draw (def: 1)

_V_a_l_u_e:

    nmix: 

     a list containing: probdraw, zdraw, compdraw

_N_o_t_e:

     more details on contents of nmix: 

_p_r_o_b_d_r_a_w R/keep x ncomp array of mixture prob draws

_z_d_r_a_w R/keep x nobs array of indicators of mixture comp identity for
     each obs

_c_o_m_p_d_r_a_w R/keep lists of lists of comp parm draws

     In this model, the component normal parameters are not identified
     due to label-switching. However, the fitted mixture-of-normals
     density is identified, as it is invariant to label-switching.  See
     Allenby et al., Chapter 5 for details. Use 'eMixMargDen' or
     'momMix' to compute the posterior expectation or distribution of
     various identified parameters.
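The invariance claim can be checked directly: permuting the component labels leaves the mixture density unchanged. A univariate base-R sketch with hypothetical parameter values (not drawn from any rnmixGibbs output):

```r
## Sketch: label-switching invariance of the mixture density (base R only).
p  <- c(0.3, 0.7); mu <- c(-1, 2); sd <- c(1, 0.5)   # hypothetical values
x  <- seq(-4, 4, length.out = 9)
dens <- p[1] * dnorm(x, mu[1], sd[1]) + p[2] * dnorm(x, mu[2], sd[2])
perm <- c(2, 1)                          # relabel the components
dens_perm <- p[perm][1] * dnorm(x, mu[perm][1], sd[perm][1]) +
             p[perm][2] * dnorm(x, mu[perm][2], sd[perm][2])
all.equal(dens, dens_perm)               # TRUE: density unchanged
```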

_A_u_t_h_o_r(_s):

     Peter Rossi, Graduate School of Business, University of Chicago,
     Peter.Rossi@ChicagoGsb.edu.

_R_e_f_e_r_e_n_c_e_s:

     For further discussion, see _Bayesian Statistics and Marketing_ by
     Rossi, Allenby and McCulloch, Chapter 3. 
      <URL:
     http://faculty.chicagogsb.edu/peter.rossi/research/bsm.html>

_S_e_e _A_l_s_o:

     'rmixture', 'rmixGibbs', 'eMixMargDen', 'momMix', 'mixDen',
     'mixDenBi'

_E_x_a_m_p_l_e_s:

     ##
     if(nchar(Sys.getenv("LONG_TEST")) != 0) {R=2000} else {R=10}

     set.seed(66)
     dim=5;  k=3   # dimension of simulated data and number of "true" components
     sigma = matrix(rep(0.5,dim^2),nrow=dim);diag(sigma)=1
     sigfac = c(1,1,1);mufac=c(1,2,3); compsmv=list()
     for(i in 1:k) compsmv[[i]] = list(mu=mufac[i]*1:dim,sigma=sigfac[i]*sigma)
     comps = list() # change to "rooti" scale
     for(i in 1:k) comps[[i]] = list(mu=compsmv[[i]][[1]],rooti=solve(chol(compsmv[[i]][[2]])))
     pvec=(1:k)/sum(1:k)

     nobs=500
     dm = rmixture(nobs,pvec,comps)

     Data1=list(y=dm$x)
     ncomp=9
     Prior1=list(ncomp=ncomp)
     Mcmc1=list(R=R,keep=1)
     out=rnmixGibbs(Data=Data1,Prior=Prior1,Mcmc=Mcmc1)

     cat("Summary of Normal Mixture Distribution",fill=TRUE)
     summary(out)
     tmom=momMix(matrix(pvec,nrow=1),list(comps))
     mat=rbind(tmom$mu,tmom$sd)
     cat(" True Mean/Std Dev",fill=TRUE)
     print(mat)

     if(0){
     ##
     ## plotting examples
     ##
     plot(out,Data=dm$x)
     }
      

