Combines 'bma' objects (results from bms). Can be used to split estimation over several machines, or to combine MCMC results obtained from different starting points.
Usage
combine_chains(...)
## S3 method for class 'bma'
c(..., recursive = FALSE)
Arguments
...
At least two 'bma' objects (cf. bms)
recursive
retained for compatibility with the generic c method
Details
Aggregates the information obtained from several chains. The result is a 'bma' object (cf. 'Values' in bms) that can be used just like a standard 'bma' object.
Note that combine_chains is particularly helpful for parallelizing the enumeration of the total model space:
A setup with K regressors has 2^K potential covariate combinations; with K large (more than 25), full enumeration can be very time-intensive.
With the bms arguments start.value and iter, sampling can be done in steps: cf. example 'enumeration' below.
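For instance, the (start.value, iter) pairs for splitting an enumeration over several machines can be computed up front. A minimal sketch follows; the helper enum_chunks is illustrative and not part of BMS, and it assumes models are indexed 0 to 2^K - 1 with iter counting the iterations after the first model, as in the 'enumeration' example below:

```r
# sketch: split the 2^K model indices into roughly equal (start.value, iter)
# pairs for parallel enumeration; enum_chunks is a hypothetical helper
enum_chunks <- function(K, nchunk) {
  nmodels <- 2^K                             # models are indexed 0 .. 2^K - 1
  breaks  <- floor(seq(0, nmodels, length.out = nchunk + 1))
  data.frame(start.value = breaks[-(nchunk + 1)],
             iter        = diff(breaks) - 1) # iterations after the first model
}
enum_chunks(12, 4)  # four chunks of 1024 models each, covering all 4096 models
```

Each row can then be passed to bms(..., mcmc="enumerate") on a separate machine, and the results recombined with c().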
Examples
data(datafls)
#MCMC case ############################
model1=bms(datafls,burn=1000,iter=4000,mcmc="bd",start.value=c(20,30,35))
model2=bms(datafls,burn=1500,iter=7000,mcmc="bd",start.value=c(1,10,15))
model_all=c(model1,model2)
coef(model_all)
plot(model_all)
#splitting enumeration ########################
#standard case with 12 covariates (4096 different combinations):
enum0=bms(datafls[,1:13],mcmc="enumerate")
# now split the task:
# enum1 does everything from model zero (the first model) to model 1999
enum1=bms(datafls[,1:13],mcmc="enumerate",start.value=0,iter=1999)
# enum2 does models from index 2000 to index 3000 (in total 1001 models)
enum2=bms(datafls[,1:13],mcmc="enumerate",start.value=2000,iter=1000)
# enum3 does models from index 3001 to the end
enum3=bms(datafls[,1:13],mcmc="enumerate",start.value=3001)
enum_combi=c(enum1,enum2,enum3)
coef(enum_combi)
coef(enum0)
#both enum_combi and enum0 have exactly the same results
#(one difference: enum_combi has more 'top models' (1500 instead of 500))
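For MCMC chains, the combined statistics appear to be weighted by each chain's number of posterior draws. As a hand check under that assumption, using the LifeExp PIPs reported in the 'Results' below (0.98075 from model1 with 4000 draws, 0.96957143 from model2 with 7000 draws):

```r
# check (assumption: MCMC chains are combined by weighting each chain's
# statistics with its number of posterior draws)
(0.98075 * 4000 + 0.96957143 * 7000) / (4000 + 7000)
# matches the combined LifeExp PIP of 0.97363636 reported by coef(model_all)
```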
Results
R version 3.3.1 (2016-06-21) -- "Bug in Your Hair"
Copyright (C) 2016 The R Foundation for Statistical Computing
Platform: x86_64-pc-linux-gnu (64-bit)
> library(BMS)
> ### Name: c.bma
> ### Title: Concatenate bma objects
> ### Aliases: combine_chains c.bma
> ### Keywords: models
>
> ### ** Examples
>
> data(datafls)
>
> #MCMC case ############################
> model1=bms(datafls,burn=1000,iter=4000,mcmc="bd",start.value=c(20,30,35))
PIP Post Mean Post SD Cond.Pos.Sign Idx
GDP60 1.00000 -1.608178e-02 3.032965e-03 0.00000000 12
Confucian 1.00000 6.089174e-02 1.390316e-02 1.00000000 19
LifeExp 0.98075 8.729354e-04 2.706093e-04 1.00000000 11
EquipInv 0.91975 1.322197e-01 6.427800e-02 1.00000000 38
SubSahara 0.91575 -1.679530e-02 7.396973e-03 0.00000000 7
Mining 0.82600 3.331080e-02 2.049371e-02 1.00000000 13
NequipInv 0.68400 3.488031e-02 2.939511e-02 1.00000000 39
Hindu 0.67500 -4.314928e-02 3.813677e-02 0.00407407 21
EcoOrg 0.65125 1.403726e-03 1.278067e-03 1.00000000 14
Protestants 0.60425 -6.382624e-03 6.639024e-03 0.00000000 25
RuleofLaw 0.59175 7.116776e-03 7.013043e-03 1.00000000 26
LatAmerica 0.58875 -6.017545e-03 6.221842e-03 0.00424628 6
LabForce 0.58875 1.448485e-07 1.435372e-07 0.99490446 29
Muslim 0.52800 5.910678e-03 6.813488e-03 0.99810606 23
BlMktPm 0.47575 -3.563679e-03 4.320894e-03 0.00000000 41
HighEnroll 0.45975 -3.879307e-02 4.923375e-02 0.00000000 30
EthnoL 0.43300 4.743771e-03 6.443617e-03 1.00000000 20
Buddha 0.38725 3.768500e-03 5.949566e-03 1.00000000 17
CivlLib 0.35175 -8.142978e-04 1.433562e-03 0.03624733 34
YrsOpen 0.34475 2.859814e-03 5.749789e-03 0.89412618 15
PrScEnroll 0.32275 5.940581e-03 1.028403e-02 0.97598761 10
Spanish 0.29925 2.663423e-03 5.159970e-03 0.97744361 2
English 0.29650 -1.879917e-03 3.724483e-03 0.00000000 35
WarDummy 0.23000 -8.088676e-04 1.902465e-03 0.00434783 5
French 0.22550 1.645017e-03 3.564330e-03 1.00000000 3
Catholic 0.22425 -4.253123e-04 2.839036e-03 0.35785953 18
Age 0.22375 -7.888752e-06 2.027023e-05 0.01675978 16
PolRights 0.21850 -1.721269e-04 8.508871e-04 0.25629291 33
Abslat 0.21725 -4.176041e-06 6.006711e-05 0.39585731 1
OutwarOr 0.18825 -5.549710e-04 1.437109e-03 0.00531208 8
stdBMP 0.18250 -5.845187e-07 5.268857e-06 0.29863014 40
RFEXDist 0.15950 -5.207810e-06 1.683545e-05 0.04545455 37
Popg 0.15025 8.209603e-03 8.337574e-02 0.60898502 27
Brit 0.14100 5.931238e-04 2.217540e-03 0.78900709 4
PublEdupct 0.12750 2.461075e-02 7.679309e-02 0.99019608 31
PrExports 0.11825 -4.921237e-04 2.449831e-03 0.13530655 24
Jewish 0.10900 -6.875219e-04 4.412089e-03 0.17889908 22
Foreign 0.10175 2.719025e-04 1.487504e-03 0.77886978 36
Area 0.09075 6.283143e-10 1.805018e-07 0.61983471 9
RevnCoup 0.06950 -2.529618e-05 1.277271e-03 0.45323741 32
WorkPop 0.04600 -1.339035e-04 1.686857e-03 0.22826087 28
Mean no. regressors Draws Burnins Time
"16.7482" "4000" "1000" "0.758888 secs"
No. models visited Modelspace 2^K % visited % Topmodels
"1442" "2.2e+12" "6.6e-08" "56"
Corr PMP No. Obs. Model Prior g-Prior
"0.2069" "72" "random / 20.5" "UIP"
Shrinkage-Stats
"Av=0.9863"
Time difference of 0.758888 secs
> model2=bms(datafls,burn=1500,iter=7000,mcmc="bd",start.value=c(1,10,15))
PIP Post Mean Post SD Cond.Pos.Sign Idx
GDP60 1.00000000 -1.590151e-02 2.914313e-03 0.00000000 12
Confucian 1.00000000 5.977769e-02 1.367557e-02 1.00000000 19
SubSahara 0.98842857 -1.982018e-02 5.889752e-03 0.00000000 7
LifeExp 0.96957143 8.480364e-04 2.928248e-04 1.00000000 11
Mining 0.83442857 3.506547e-02 2.071799e-02 1.00000000 13
EquipInv 0.81828571 1.080049e-01 6.783300e-02 1.00000000 38
RuleofLaw 0.80000000 1.026599e-02 6.730391e-03 1.00000000 26
Hindu 0.78885714 -5.037836e-02 3.925688e-02 0.00434625 21
LatAmerica 0.76157143 -7.930144e-03 6.062410e-03 0.00018758 6
EcoOrg 0.76114286 1.750638e-03 1.241146e-03 0.99943694 14
NequipInv 0.72771429 3.828002e-02 2.933261e-02 1.00000000 39
Protestants 0.64000000 -7.448722e-03 6.793022e-03 0.00000000 25
LabForce 0.60471429 1.596409e-07 1.534547e-07 0.98866052 29
BlMktPm 0.60471429 -4.628262e-03 4.568936e-03 0.00070872 41
HighEnroll 0.54414286 -4.997965e-02 5.476083e-02 0.00945130 30
EthnoL 0.44271429 4.937741e-03 6.651597e-03 0.99548241 20
Spanish 0.33614286 3.107325e-03 5.450555e-03 0.98682533 2
Muslim 0.32100000 3.610762e-03 6.297493e-03 0.99910992 23
Age 0.29785714 -1.319350e-05 2.485153e-05 0.00000000 16
PublEdupct 0.29671429 6.173146e-02 1.140685e-01 0.99374097 31
Buddha 0.29328571 2.851567e-03 5.547070e-03 1.00000000 17
PolRights 0.27957143 -3.521612e-04 9.001595e-04 0.07256004 33
PrScEnroll 0.26271429 4.420366e-03 9.262979e-03 0.96193583 10
OutwarOr 0.25714286 -7.832003e-04 1.703333e-03 0.01222222 8
English 0.25700000 -1.731715e-03 3.685041e-03 0.00000000 35
French 0.25057143 1.882825e-03 3.982697e-03 0.97377423 3
Catholic 0.24571429 -6.004306e-04 3.299952e-03 0.30290698 18
CivlLib 0.21185714 -4.574767e-04 1.127430e-03 0.03304113 34
Brit 0.20742857 9.788368e-04 2.810361e-03 0.84366391 4
Abslat 0.19028571 -1.239869e-05 5.581074e-05 0.17192192 1
YrsOpen 0.17528571 1.155690e-03 3.987637e-03 0.81907090 15
PrExports 0.16657143 -9.295676e-04 3.370946e-03 0.15608919 24
WarDummy 0.15957143 -4.848127e-04 1.454471e-03 0.00000000 5
stdBMP 0.12271429 -7.150856e-07 4.417257e-06 0.14668219 40
Jewish 0.11914286 -1.826904e-04 3.843626e-03 0.42685851 22
Popg 0.11357143 1.545391e-02 7.515625e-02 0.84905660 27
WorkPop 0.11200000 -4.740670e-04 2.863695e-03 0.17346939 28
RFEXDist 0.10428571 -2.520389e-06 1.243424e-05 0.08356164 37
Foreign 0.08485714 9.068024e-05 1.193260e-03 0.55892256 36
Area 0.07514286 -2.320578e-08 2.099224e-07 0.23193916 9
RevnCoup 0.06271429 5.962213e-06 1.032998e-03 0.62642369 32
Mean no. regressors Draws Burnins Time
"17.2894" "7000" "1500" "0.7232931 secs"
No. models visited Modelspace 2^K % visited % Topmodels
"2227" "2.2e+12" "1e-07" "37"
Corr PMP No. Obs. Model Prior g-Prior
"0.0106" "72" "random / 20.5" "UIP"
Shrinkage-Stats
"Av=0.9863"
Time difference of 0.7232931 secs
>
> model_all=c(model1,model2)
> coef(model_all)
PIP Post Mean Post SD Cond.Pos.Sign Idx
GDP60 1.00000000 -1.596706e-02 2.959281e-03 0.00000000 12
Confucian 1.00000000 6.018280e-02 1.376920e-02 1.00000000 19
LifeExp 0.97363636 8.570906e-04 2.851985e-04 1.00000000 11
SubSahara 0.96200000 -1.872022e-02 6.639932e-03 0.00000000 7
EquipInv 0.85518182 1.168103e-01 6.757380e-02 1.00000000 38
Mining 0.83136364 3.442741e-02 2.065397e-02 1.00000000 13
Hindu 0.74745455 -4.774960e-02 3.900862e-02 0.00425687 21
RuleofLaw 0.72427273 9.120822e-03 7.000409e-03 1.00000000 26
EcoOrg 0.72118182 1.624488e-03 1.265747e-03 0.99962183 14
NequipInv 0.71181818 3.704376e-02 2.940087e-02 1.00000000 39
LatAmerica 0.69872727 -7.234653e-03 6.189627e-03 0.00143117 6
Protestants 0.62700000 -7.061050e-03 6.756920e-03 0.00000000 25
LabForce 0.59890909 1.542619e-07 1.500930e-07 0.99089253 29
BlMktPm 0.55781818 -4.241141e-03 4.509501e-03 0.00048892 41
HighEnroll 0.51345455 -4.591181e-02 5.309137e-02 0.00637394 30
EthnoL 0.43918182 4.867206e-03 6.577391e-03 0.99710205 20
Muslim 0.39627273 4.447095e-03 6.583505e-03 0.99862354 23
Buddha 0.32745455 3.184998e-03 5.713774e-03 1.00000000 17
Spanish 0.32272727 2.945906e-03 5.350978e-03 0.98366197 2
PrScEnroll 0.28454545 4.973171e-03 9.674462e-03 0.96773163 10
English 0.27136364 -1.785607e-03 3.700119e-03 0.00000000 35
Age 0.27090909 -1.126450e-05 2.342949e-05 0.00503356 16
CivlLib 0.26272727 -5.872298e-04 1.259227e-03 0.03460208 34
PolRights 0.25736364 -2.866942e-04 8.867997e-04 0.12928294 33
French 0.24145455 1.796350e-03 3.837552e-03 0.98268072 3
Catholic 0.23790909 -5.367512e-04 3.141313e-03 0.32174245 18
YrsOpen 0.23690909 1.775372e-03 4.776278e-03 0.85878741 15
PublEdupct 0.23518182 4.823302e-02 1.036504e-01 0.99304213 31
OutwarOr 0.23209091 -7.002078e-04 1.615356e-03 0.01018410 8
Abslat 0.20009091 -9.408634e-06 5.753118e-05 0.26033621 1
WarDummy 0.18518182 -6.026508e-04 1.639102e-03 0.00196367 5
Brit 0.18327273 8.385775e-04 2.616999e-03 0.82837302 4
PrExports 0.14900000 -7.704971e-04 3.075367e-03 0.15009152 24
stdBMP 0.14445455 -6.676067e-07 4.745064e-06 0.21648836 40
Popg 0.12690909 1.281962e-02 7.832268e-02 0.74570201 27
RFEXDist 0.12436364 -3.497633e-06 1.425223e-05 0.06578947 37
Jewish 0.11545455 -3.662655e-04 4.066818e-03 0.34173228 22
Foreign 0.09100000 1.565792e-04 1.310841e-03 0.64835165 36
WorkPop 0.08800000 -3.503712e-04 2.506024e-03 0.18388430 28
Area 0.08081818 -1.453883e-08 2.000549e-07 0.39032621 9
RevnCoup 0.06518182 -5.404477e-06 1.128063e-03 0.55927476 32
> plot(model_all)
>
>
>
> #splitting enumeration ########################
>
> #standard case with 12 covariates (4096 different combinations):
> enum0=bms(datafls[,1:13],mcmc="enumerate")
PIP Post Mean Post SD Cond.Pos.Sign Idx
GDP60 0.9999661 -1.948009e-02 3.201047e-03 0.00000000 12
SubSahara 0.9999333 -2.857041e-02 4.991078e-03 0.00000000 7
LifeExp 0.9912818 1.168628e-03 3.052639e-04 0.99999977 11
WarDummy 0.9870859 -1.106952e-02 3.222471e-03 0.00000000 5
LatAmerica 0.9855809 -1.565560e-02 4.496320e-03 0.00000000 6
PrScEnroll 0.2630332 3.671428e-03 8.137266e-03 0.99999771 10
Brit 0.1862789 5.704196e-04 1.826635e-03 0.99926156 4
Abslat 0.1771891 -1.981004e-05 6.971902e-05 0.00542193 1
Spanish 0.1511549 2.446922e-04 2.926712e-03 0.91084062 2
OutwarOr 0.1382977 1.529135e-04 1.187186e-03 0.92267652 8
French 0.1375714 -2.272925e-04 1.791920e-03 0.06822726 3
Area 0.1330138 2.641379e-08 2.549633e-07 0.98159721 9
Mean no. regressors Draws Burnins Time
"6.1504" "4096" "0" "0.3733745 secs"
No. models visited Modelspace 2^K % visited % Topmodels
"4096" "4096" "100" "12"
Corr PMP No. Obs. Model Prior g-Prior
"NA" "72" "random / 6" "UIP"
Shrinkage-Stats
"Av=0.9863"
Time difference of 0.3733745 secs
>
> # now split the task:
> # enum1 does everything from model zero (the first model) to model 1999
> enum1=bms(datafls[,1:13],mcmc="enumerate",start.value=0,iter=1999)
PIP Post Mean Post SD Cond.Pos.Sign Idx
GDP60 0.9999761 -1.966784e-02 3.137932e-03 0.00000000 12
SubSahara 0.9999322 -2.831658e-02 4.920806e-03 0.00000000 7
LifeExp 0.9906952 1.167709e-03 3.065897e-04 0.99999972 11
WarDummy 0.9860885 -1.104015e-02 3.241361e-03 0.00000000 5
LatAmerica 0.9848062 -1.532753e-02 4.361645e-03 0.00000000 6
PrScEnroll 0.2574544 3.676725e-03 8.166954e-03 0.99999715 10
Brit 0.1811703 5.776004e-04 1.820420e-03 0.99999167 4
Spanish 0.1437823 2.286690e-04 2.892770e-03 0.90036166 2
OutwarOr 0.1311783 1.683224e-04 1.163028e-03 0.99197928 8
French 0.1295152 -2.309937e-04 1.750807e-03 0.05971735 3
Area 0.1248893 2.720280e-08 2.483040e-07 0.98117034 9
Abslat 0.0000000 0.000000e+00 0.000000e+00 NA 1
Mean no. regressors Draws Burnins Time
"5.9295" "2000" "0" "0.2333915 secs"
No. models visited Modelspace 2^K % visited % Topmodels
"2000" "4096" "49" "25"
Corr PMP No. Obs. Model Prior g-Prior
"NA" "72" "random / 6" "UIP"
Shrinkage-Stats
"Av=0.9863"
Time difference of 0.2333915 secs
>
> # enum2 does models from index 2000 to index 3000 (in total 1001 models)
> enum2=bms(datafls[,1:13],mcmc="enumerate",start.value=2000,iter=1000)
PIP Post Mean Post SD Cond.Pos.Sign Idx
Spanish 1.0000000 1.720055e-03 6.977639e-03 0.9485941 2
GDP60 0.9999489 -1.885964e-02 3.405390e-03 0.0000000 12
SubSahara 0.9998723 -2.943026e-02 5.218073e-03 0.0000000 7
Abslat 0.9987746 -9.618432e-05 1.342822e-04 0.0159139 1
LifeExp 0.9945354 1.162997e-03 3.075459e-04 1.0000000 11
WarDummy 0.9929087 -1.120479e-02 3.137409e-03 0.0000000 5
LatAmerica 0.9499824 -1.784212e-02 7.239832e-03 0.0000000 6
PrScEnroll 0.3742007 4.790402e-03 8.862937e-03 1.0000000 10
Brit 0.2898725 8.159065e-04 2.236539e-03 0.9983797 4
French 0.2362095 -1.587092e-04 2.442589e-03 0.3663766 3
OutwarOr 0.2361064 1.033924e-04 1.530843e-03 0.5914303 8
Area 0.2358730 3.930291e-08 3.434966e-07 0.9659913 9
Mean no. regressors Draws Burnins Time
"8.3083" "1001" "0" "0.1447463 secs"
No. models visited Modelspace 2^K % visited % Topmodels
"1001" "4096" "24" "50"
Corr PMP No. Obs. Model Prior g-Prior
"NA" "72" "random / 6" "UIP"
Shrinkage-Stats
"Av=0.9863"
Time difference of 0.1447463 secs
>
> # enum3 does models from index 3001 to the end
> enum3=bms(datafls[,1:13],mcmc="enumerate",start.value=3001)
PIP Post Mean Post SD Cond.Pos.Sign Idx
Abslat 1.0000000000 -1.153223e-04 1.299134e-04 0.00303879 1
SubSahara 0.9999532601 -2.982125e-02 5.123757e-03 0.00000000 7
GDP60 0.9999128136 -1.855141e-02 3.326949e-03 0.00000000 12
LatAmerica 0.9980913973 -1.702774e-02 4.013571e-03 0.00000000 6
LifeExp 0.9938844956 1.175146e-03 2.969702e-04 1.00000000 11
WarDummy 0.9914452244 -1.120613e-02 3.127916e-03 0.00000000 5
PrScEnroll 0.2695405978 3.386772e-03 7.764374e-03 1.00000000 10
Brit 0.1918295650 4.736739e-04 1.750320e-03 0.99563545 4
French 0.1610445098 -2.217996e-04 1.847473e-03 0.00777334 3
OutwarOr 0.1566219163 7.637330e-05 1.229934e-03 0.70549111 8
Area 0.1559166186 1.898662e-08 2.682484e-07 0.98891462 9
Spanish 0.0001222297 4.705967e-07 1.030573e-04 0.90716254 2
Mean no. regressors Draws Burnins Time
"6.9184" "1095" "0" "0.1339428 secs"
No. models visited Modelspace 2^K % visited % Topmodels
"1095" "4096" "27" "46"
Corr PMP No. Obs. Model Prior g-Prior
"NA" "72" "random / 6" "UIP"
Shrinkage-Stats
"Av=0.9863"
Time difference of 0.1339428 secs
>
> enum_combi=c(enum1,enum2,enum3)
> coef(enum_combi)
PIP Post Mean Post SD Cond.Pos.Sign Idx
GDP60 0.9999661 -1.948009e-02 3.201047e-03 0.00000000 12
SubSahara 0.9999333 -2.857041e-02 4.991078e-03 0.00000000 7
LifeExp 0.9912818 1.168628e-03 3.052639e-04 0.99999977 11
WarDummy 0.9870859 -1.106952e-02 3.222471e-03 0.00000000 5
LatAmerica 0.9855809 -1.565560e-02 4.496320e-03 0.00000000 6
PrScEnroll 0.2630332 3.671428e-03 8.137266e-03 0.99999771 10
Brit 0.1862789 5.704196e-04 1.826635e-03 0.99926156 4
Abslat 0.1771891 -1.981004e-05 6.971902e-05 0.00542193 1
Spanish 0.1511549 2.446922e-04 2.926712e-03 0.91084062 2
OutwarOr 0.1382977 1.529135e-04 1.187186e-03 0.92267652 8
French 0.1375714 -2.272925e-04 1.791920e-03 0.06822726 3
Area 0.1330138 2.641379e-08 2.549633e-07 0.98159721 9
> coef(enum0)
PIP Post Mean Post SD Cond.Pos.Sign Idx
GDP60 0.9999661 -1.948009e-02 3.201047e-03 0.00000000 12
SubSahara 0.9999333 -2.857041e-02 4.991078e-03 0.00000000 7
LifeExp 0.9912818 1.168628e-03 3.052639e-04 0.99999977 11
WarDummy 0.9870859 -1.106952e-02 3.222471e-03 0.00000000 5
LatAmerica 0.9855809 -1.565560e-02 4.496320e-03 0.00000000 6
PrScEnroll 0.2630332 3.671428e-03 8.137266e-03 0.99999771 10
Brit 0.1862789 5.704196e-04 1.826635e-03 0.99926156 4
Abslat 0.1771891 -1.981004e-05 6.971902e-05 0.00542193 1
Spanish 0.1511549 2.446922e-04 2.926712e-03 0.91084062 2
OutwarOr 0.1382977 1.529135e-04 1.187186e-03 0.92267652 8
French 0.1375714 -2.272925e-04 1.791920e-03 0.06822726 3
Area 0.1330138 2.641379e-08 2.549633e-07 0.98159721 9
> #both enum_combi and enum0 have exactly the same results
> #(one difference: enum_combi has more 'top models' (1500 instead of 500))
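As a quick consistency check (assuming each chunk's reported 'Draws' equals the number of models it enumerated), the three chunks together cover the whole model space exactly once:

```r
# draws reported by enum1, enum2 and enum3 above sum to the full model space
2000 + 1001 + 1095 == 2^12   # TRUE: all 4096 models were enumerated
```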