
shic's People

Contributors: dschride
shic's Issues

error trainClassifier.py

Dear Daniel,

Sorry to bother you with this, but I'm trying to run your pipeline via the shIC_pipelin.sh script, and I'm running into the error below. Does this look familiar to you?

I'm hoping to use your pipeline to classify regions near color pattern genes in >30 Heliconius butterfly populations.

Many thanks for any help!

Steven

error:

python trainClassifier.py combinedTrainingSetsTennessenEuro/ classifiers/tennessenEuro/tennessenEuroAutosomal.p all
/home/sv378/anaconda2/lib/python2.7/site-packages/sklearn/cross_validation.py:41: DeprecationWarning: This module was deprecated in version 0.18 in favor of the model_selection module into which all the refactored classes and functions are moved. Also note that the interface of the new CV iterators are different from that of this module. This module will be removed in 0.20.
"This module will be removed in 0.20.", DeprecationWarning)
/home/sv378/anaconda2/lib/python2.7/site-packages/sklearn/grid_search.py:42: DeprecationWarning: This module was deprecated in version 0.18 in favor of the model_selection module into which all the refactored classes and functions are moved. This module will be removed in 0.20.
DeprecationWarning)
using these features: ['all'] (indices: [1, 2, 3, 4, 5, 6, 7, 8, 9, 10, 11, 12, 13, 14, 15, 16, 17, 18, 19, 20, 21, 22, 23, 24, 25, 26, 27, 28, 29, 30, 31, 32, 33, 34, 35, 36, 37, 38, 39, 40, 41, 42, 43, 44, 45, 46, 47, 48, 49, 50, 51, 52, 53, 54, 55, 56, 57, 58, 59, 60, 61, 62, 63, 64, 65, 66, 67, 68, 69, 70, 71, 72, 73, 74, 75, 76, 77, 78, 79, 80, 81, 82, 83, 84, 85, 86, 87, 88, 89, 90, 91, 92, 93, 94, 95, 96, 97, 98, 99, 100, 101, 102, 103, 104, 105, 106, 107, 108, 109, 110, 111, 112, 113, 114, 115, 116, 117, 118, 119, 120, 121])
training set size after balancing: 5000
Checking accuracy when distinguishing among all 5 classes
Training extraTreesClassifier
Traceback (most recent call last):
  File "trainClassifier.py", line 102, in <module>
    grid_search.fit(X, y)
  File "/home/sv378/anaconda2/lib/python2.7/site-packages/sklearn/grid_search.py", line 838, in fit
    return self._fit(X, y, ParameterGrid(self.param_grid))
  File "/home/sv378/anaconda2/lib/python2.7/site-packages/sklearn/grid_search.py", line 574, in _fit
    for parameters in parameter_iterable
  File "/home/sv378/anaconda2/lib/python2.7/site-packages/sklearn/externals/joblib/parallel.py", line 789, in __call__
    self.retrieve()
  File "/home/sv378/anaconda2/lib/python2.7/site-packages/sklearn/externals/joblib/parallel.py", line 740, in retrieve
    raise exception
sklearn.externals.joblib.my_exceptions.JoblibValueError: JoblibValueError


Multiprocessing exception:
...........................................................................
/rds/user/sv378/hpc-work/Programs/shIC/trainClassifier.py in <module>()
97
98 heatmap = []
99 sys.stderr.write("Training %s\n" %(mlType))
100 grid_search = GridSearchCV(clf,param_grid=param_grid_forest,cv=10,n_jobs=20)
101 start = time()
--> 102 grid_search.fit(X, y)
103 sys.stderr.write("GridSearchCV took %.2f seconds for %d candidate parameter settings.\n"
104 % (time() - start, len(grid_search.grid_scores_)))
105 print "Results for %s" %(mlType)
106 report(grid_search.grid_scores_)

...........................................................................
/home/sv378/anaconda2/lib/python2.7/site-packages/sklearn/grid_search.py in fit(self=GridSearchCV(cv=10, error_score='raise',
...='2*n_jobs', refit=True, scoring=None, verbose=0), X=[[0.0785765713796, 0.0711224669757, 0.0534461846636, ...], ...], y=['0', '0', '0', '0', '0', ...])
833 y : array-like, shape = [n_samples] or [n_samples, n_output], optional
834 Target relative to X for classification or regression;
835 None for unsupervised learning.
836
837 """
--> 838 return self._fit(X, y, ParameterGrid(self.param_grid))
self._fit = <bound method GridSearchCV._fit of GridSearchCV(...='2*n_jobs', refit=True, scoring=None, verbose=0)>
X = [[0.0785765713796, 0.0711224669757, 0.0534461846636, ...], ...]
y = ['0', '0', '0', '0', '0', '0', '0', '0', '0', '0', '0', '0', '0', '0', '0', '0', '0', '0', '0', '0', ...]
self.param_grid = {'bootstrap': [True, False], 'criterion': ['gini', 'entropy'], 'max_depth': [3, 10, None], 'max_features': [1, 3, 11, 121], 'min_samples_leaf': [1, 3, 10], 'min_samples_split': [1, 3, 10]}
839
840
841 class RandomizedSearchCV(BaseSearchCV):
842 """Randomized search on hyper parameters.

...........................................................................
/home/sv378/anaconda2/lib/python2.7/site-packages/sklearn/grid_search.py in _fit(self=GridSearchCV(cv=10, error_score='raise',
...='2*n_jobs', refit=True, scoring=None, verbose=0), X=[[0.0785765713796, 0.0711224669757, 0.0534461846636, ...], ...], y=['0', '0', '0', '0', '0', ...], parameter_iterable=<sklearn.grid_search.ParameterGrid object>)
569 )(
570 delayed(_fit_and_score)(clone(base_estimator), X, y, self.scorer_,
571 train, test, self.verbose, parameters,
572 self.fit_params, return_parameters=True,
573 error_score=self.error_score)
--> 574 for parameters in parameter_iterable
parameters = undefined
parameter_iterable = <sklearn.grid_search.ParameterGrid object>
575 for train, test in cv)
576
577 # Out is a list of triplet: score, estimator, n_test_samples
578 n_fits = len(out)

...........................................................................
/home/sv378/anaconda2/lib/python2.7/site-packages/sklearn/externals/joblib/parallel.py in __call__(self=Parallel(n_jobs=20), iterable=<generator object >)
784 if pre_dispatch == "all" or n_jobs == 1:
785 # The iterable was consumed all at once by the above for loop.
786 # No need to wait for async callbacks to trigger to
787 # consumption.
788 self._iterating = False
--> 789 self.retrieve()
self.retrieve = <bound method Parallel.retrieve of Parallel(n_jobs=20)>
790 # Make sure that we get a last message telling us we are done
791 elapsed_time = time.time() - self._start_time
792 self._print('Done %3i out of %3i | elapsed: %s finished',
793 (len(self._output), len(self._output),


Sub-process traceback:

ValueError Mon Feb 19 15:00:55 2018
PID: 233604 Python 2.7.14: /home/sv378/anaconda2/bin/python
...........................................................................
/home/sv378/anaconda2/lib/python2.7/site-packages/sklearn/externals/joblib/parallel.py in __call__(self=<sklearn.externals.joblib.parallel.BatchedCalls object>)
126 def __init__(self, iterator_slice):
127 self.items = list(iterator_slice)
128 self._size = len(self.items)
129
130 def __call__(self):
--> 131 return [func(*args, **kwargs) for func, args, kwargs in self.items]
func =
args = (ExtraTreesClassifier(bootstrap=True, class_weigh..., random_state=None, verbose=0, warm_start=False), [[0.0785765713796, 0.0711224669757, 0.0534461846636, ...], ...], ['0', '0', '0', '0', '0', ...], , array([ 100, 101, 102, ..., 4997, 4998, 4999]), array([ 0, 1, 2, 3, 4, 5, 6,...4093, 4094,
4095, 4096, 4097, 4098, 4099]), 0, {'bootstrap': True, 'criterion': 'gini', 'max_depth': 3, 'max_features': 1, 'min_samples_leaf': 1, 'min_samples_split': 1}, {})
kwargs = {'error_score': 'raise', 'return_parameters': True}
self.items = [(, (ExtraTreesClassifier(bootstrap=True, class_weigh..., random_state=None, verbose=0, warm_start=False), [[0.0785765713796, 0.0711224669757, 0.0534461846636, 0.11298676124, 0.107917943941, 0.101909772587, 0.0901051202811, 0.0651605926738, 0.107920962989, 0.10713828059, 0.10371534268, 0.0868028279654, 0.0738413197172, 0.0652003142184, 0.106834249804, 0.106441476826, 0.0856245090338, 0.0934799685782, 0.0777690494894, 0.0926944226237, ...], [0.0973934121939, 0.103414595071, 0.0582038016651, 0.0722162925787, 0.101465160151, 0.096078447283, 0.0956982685818, 0.124639279001, 0.0882124475236, 0.0822871796707, 0.08039111628, 0.0980584910297, 0.11231260752, 0.0646350454657, 0.078397640698, 0.0857704595724, 0.0916687146719, 0.0948636028508, 0.107151634308, 0.0941263209634, ...], [0.0946038612071, 0.0628274293759, 0.0635114153467, 0.0821922661427, 0.117514683178, 0.0857874299633, 0.116179998728, 0.034174352587, 0.148609061614, 0.0954130467951, 0.0991864550625, 0.0895716140199, 0.0876244050195, 0.0748593682389, 0.111639982691, 0.115750757248, 0.0822154911294, 0.0993076590221, 0.0577672003462, 0.11813067936, ...], [0.0763012937959, 0.0822720272276, 0.0875805475829, 0.151841581049, 0.105441813407, 0.125292033663, 0.0557865414393, 0.129616497127, 0.0623685496556, 0.0592295925092, 0.0642695225445, 0.0799347471452, 0.0872756933116, 0.10277324633, 0.142740619902, 0.119086460033, 0.07911908646, 0.0668841761827, 0.108482871126, 0.0807504078303, ...], [0.0780317791643, 0.088050455936, 0.112721027998, 0.0815720859807, 0.0932859144498, 0.120855127775, 0.0889598922566, 0.0732143452073, 0.0712786931909, 0.097072670274, 0.094958007768, 0.0838767042087, 0.0853586247777, 0.10106698281, 0.0820983995258, 0.092768227623, 0.109069353883, 0.0856550088915, 0.081802015412, 0.0755779490219, ...], [0.0460941718576, 0.0463637200459, 0.0447542856168, 0.0554310536586, 0.0860398274601, 0.0982856948517, 0.0724855559341, 0.0878971694772, 0.127892013132, 0.18569688579, 0.149059622177, 
0.0534351145038, 0.0490730643402, 0.0610687022901, 0.062159214831, 0.0741548527808, 0.0905125408942, 0.0959651035987, 0.100327153762, 0.121046892039, ...], [0.0422460985659, 0.0712603176271, 0.0667774319623, 0.0680146080701, 0.0772870708642, 0.09369433366, 0.100985932355, 0.0810389503239, 0.144340071979, 0.18983764455, 0.0645175400425, 0.0465949820789, 0.0716845878136, 0.0878136200717, 0.078853046595, 0.078853046595, 0.094982078853, 0.0913978494624, 0.0931899641577, 0.123655913978, ...], [0.0747003438046, 0.0709113975128, 0.0702564509068, 0.125195449018, 0.148396843503, 0.133330655143, 0.0947256169447, 0.0949676302822, 0.0777737196361, 0.0431480248604, 0.0665938683875, 0.0801005747126, 0.0757902298851, 0.0765086206897, 0.122126436782, 0.136494252874, 0.123204022989, 0.0908764367816, 0.0897988505747, 0.0818965517241, ...], [0.0935852681206, 0.0799515986482, 0.084809124865, 0.104350286807, 0.0861143667554, 0.075640690841, 0.0950895431933, 0.114404264643, 0.0916468738272, 0.0717119718634, 0.102696010436, 0.0822147651007, 0.0922818791946, 0.0939597315436, 0.0939597315436, 0.0989932885906, 0.0687919463087, 0.0838926174497, 0.110738255034, 0.0922818791946, ...], [0.143762118287, 0.0752115450379, 0.0759428491362, 0.0611558822459, 0.0635945153654, 0.083381655377, 0.0903466057737, 0.131923852967, 0.091660407882, 0.11440327318, 0.0686172947486, 0.129200312581, 0.0854389163845, 0.0705912998177, 0.065121125293, 0.0742380828341, 0.0854389163845, 0.0914300599114, 0.116697056525, 0.0901276374056, ...], [0.0842953852787, 0.0518660645361, 0.0688425675496, 0.0916964785564, 0.128847625977, 0.0910237753652, 0.0844987920911, 0.105917309246, 0.0900695619193, 0.0900814476277, 0.112860991853, 0.0926732673267, 0.0756435643564, 0.0784158415842, 0.0906930693069, 0.107722772277, 0.0807920792079, 0.0879207920792, 0.090297029703, 0.0974257425743, ...], [0.0995419487358, 0.0566932747575, 0.103139830659, 0.0873756781862, 0.0832312774641, 0.106317109993, 0.0877466909196, 0.120839247202, 
0.0976638256343, 0.0766085628385, 0.0808425536102, 0.126052441665, 0.0726485446235, 0.0959826798172, 0.0849170074573, 0.0897281693529, 0.0995910512389, 0.078662496993, 0.0969449121963, 0.0793841712774, ...], [0.117801081687, 0.0943130009469, 0.0515135902269, 0.10545249186, 0.0944253408009, 0.105271822002, 0.0954933114009, 0.0651543340649, 0.0383074807107, 0.0792520463759, 0.153015499925, 0.113686051596, 0.0944468736336, 0.0592479230433, 0.0988194140796, 0.101005684303, 0.096195889812, 0.096195889812, 0.0686488850022, 0.0623087013555, ...], [0.0731776490501, 0.0712100001219, 0.109737568554, 0.0595921444108, 0.0638344453193, 0.0762462053669, 0.0733110101932, 0.115271113309, 0.13139444359, 0.0774666998622, 0.148758720222, 0.0855430020752, 0.0680193682269, 0.0973022826839, 0.0924602259626, 0.0880793175006, 0.0855430020752, 0.101913765276, 0.0949965413881, 0.104680654831, ...], [0.117620328396, 0.0842790397159, 0.0983482855473, 0.117148215074, 0.115752347151, 0.0485812707264, 0.0839024378383, 0.11263994789, 0.0678342126373, 0.108253114853, 0.0456408001713, 0.105845181675, 0.0789889415482, 0.0932069510269, 0.102685624013, 0.11532385466, 0.0789889415482, 0.0932069510269, 0.0821484992101, 0.0947867298578, ...], [0.0918138876293, 0.0812197731926, 0.113558463822, 0.0960908501524, 0.0587890624809, 0.0860386597311, 0.0703068770504, 0.0406429035279, 0.111694359724, 0.0943436315527, 0.155501531136, 0.0813758389262, 0.101510067114, 0.109060402685, 0.0813758389262, 0.0654362416107, 0.0788590604027, 0.0864093959732, 0.0755033557047, 0.0922818791946, ...], [0.104746288118, 0.0533569318189, 0.0523897695476, 0.0583000851884, 0.0787301255592, 0.138440837761, 0.152733227314, 0.0913592568827, 0.0881518274414, 0.0870679262584, 0.0947237241109, 0.0914583333333, 0.0704166666667, 0.075, 0.0670833333333, 0.0875, 0.126041666667, 0.132708333333, 0.079375, 0.0945833333333, ...], [0.160915056666, 0.0832245998739, 0.0506360528275, 0.12658458999, 0.057762563961, 0.0887716391952, 0.0760406217988, 
0.0659714677172, 0.107550168555, 0.0871408097759, 0.0954024296403, 0.143191116306, 0.0870835768556, 0.0686732904734, 0.112507305669, 0.0660432495617, 0.0812390414962, 0.0683810637054, 0.0698421975453, 0.107831677382, ...], [0.0924159560849, 0.10902799519, 0.109129648161, 0.0700583522686, 0.112748816779, 0.0828274141212, 0.0620371134711, 0.0644140134551, 0.0817596239787, 0.0777387537625, 0.137842312728, 0.0886105860113, 0.0827032136106, 0.102788279773, 0.086011342155, 0.110349716446, 0.112712665406, 0.0824669187146, 0.0637996219282, 0.0718336483932, ...], [0.0232773173295, 0.102554932777, 0.0548014820573, 0.0231733059686, 0.0752955163473, 0.0949925915621, 0.0783216012158, 0.140890408259, 0.15839854271, 0.152253506105, 0.0960407956685, 0.0374826469227, 0.0869967607589, 0.0684868116613, 0.047200370199, 0.0763535400278, 0.0795927811199, 0.0758907913003, 0.142989356779, 0.167977788061, ...], ...], ['0', '0', '0', '0', '0', '0', '0', '0', '0', '0', '0', '0', '0', '0', '0', '0', '0', '0', '0', '0', ...], , array([ 100, 101, 102, ..., 4997, 4998, 4999]), array([ 0, 1, 2, 3, 4, 5, 6,...4093, 4094,
4095, 4096, 4097, 4098, 4099]), 0, {'bootstrap': True, 'criterion': 'gini', 'max_depth': 3, 'max_features': 1, 'min_samples_leaf': 1, 'min_samples_split': 1}, {}), {'error_score': 'raise', 'return_parameters': True})]
132
133 def len(self):
134 return self._size
135

...........................................................................
/home/sv378/anaconda2/lib/python2.7/site-packages/sklearn/cross_validation.py in _fit_and_score(estimator=ExtraTreesClassifier(bootstrap=True, class_weigh..., random_state=None, verbose=0, warm_start=False), X=[[0.0785765713796, 0.0711224669757, 0.0534461846636, 0.11298676124, 0.107917943941, 0.101909772587, 0.0901051202811, 0.0651605926738, 0.107920962989, 0.10713828059, 0.10371534268, 0.0868028279654, 0.0738413197172, 0.0652003142184, 0.106834249804, 0.106441476826, 0.0856245090338, 0.0934799685782, 0.0777690494894, 0.0926944226237, ...], [0.0973934121939, 0.103414595071, 0.0582038016651, 0.0722162925787, 0.101465160151, 0.096078447283, 0.0956982685818, 0.124639279001, 0.0882124475236, 0.0822871796707, 0.08039111628, 0.0980584910297, 0.11231260752, 0.0646350454657, 0.078397640698, 0.0857704595724, 0.0916687146719, 0.0948636028508, 0.107151634308, 0.0941263209634, ...], [0.0946038612071, 0.0628274293759, 0.0635114153467, 0.0821922661427, 0.117514683178, 0.0857874299633, 0.116179998728, 0.034174352587, 0.148609061614, 0.0954130467951, 0.0991864550625, 0.0895716140199, 0.0876244050195, 0.0748593682389, 0.111639982691, 0.115750757248, 0.0822154911294, 0.0993076590221, 0.0577672003462, 0.11813067936, ...], [0.0763012937959, 0.0822720272276, 0.0875805475829, 0.151841581049, 0.105441813407, 0.125292033663, 0.0557865414393, 0.129616497127, 0.0623685496556, 0.0592295925092, 0.0642695225445, 0.0799347471452, 0.0872756933116, 0.10277324633, 0.142740619902, 0.119086460033, 0.07911908646, 0.0668841761827, 0.108482871126, 0.0807504078303, ...], [0.0780317791643, 0.088050455936, 0.112721027998, 0.0815720859807, 0.0932859144498, 0.120855127775, 0.0889598922566, 0.0732143452073, 0.0712786931909, 0.097072670274, 0.094958007768, 0.0838767042087, 0.0853586247777, 0.10106698281, 0.0820983995258, 0.092768227623, 0.109069353883, 0.0856550088915, 0.081802015412, 0.0755779490219, ...], [0.0460941718576, 0.0463637200459, 0.0447542856168, 0.0554310536586, 0.0860398274601, 0.0982856948517, 
0.0724855559341, 0.0878971694772, 0.127892013132, 0.18569688579, 0.149059622177, 0.0534351145038, 0.0490730643402, 0.0610687022901, 0.062159214831, 0.0741548527808, 0.0905125408942, 0.0959651035987, 0.100327153762, 0.121046892039, ...], [0.0422460985659, 0.0712603176271, 0.0667774319623, 0.0680146080701, 0.0772870708642, 0.09369433366, 0.100985932355, 0.0810389503239, 0.144340071979, 0.18983764455, 0.0645175400425, 0.0465949820789, 0.0716845878136, 0.0878136200717, 0.078853046595, 0.078853046595, 0.094982078853, 0.0913978494624, 0.0931899641577, 0.123655913978, ...], [0.0747003438046, 0.0709113975128, 0.0702564509068, 0.125195449018, 0.148396843503, 0.133330655143, 0.0947256169447, 0.0949676302822, 0.0777737196361, 0.0431480248604, 0.0665938683875, 0.0801005747126, 0.0757902298851, 0.0765086206897, 0.122126436782, 0.136494252874, 0.123204022989, 0.0908764367816, 0.0897988505747, 0.0818965517241, ...], [0.0935852681206, 0.0799515986482, 0.084809124865, 0.104350286807, 0.0861143667554, 0.075640690841, 0.0950895431933, 0.114404264643, 0.0916468738272, 0.0717119718634, 0.102696010436, 0.0822147651007, 0.0922818791946, 0.0939597315436, 0.0939597315436, 0.0989932885906, 0.0687919463087, 0.0838926174497, 0.110738255034, 0.0922818791946, ...], [0.143762118287, 0.0752115450379, 0.0759428491362, 0.0611558822459, 0.0635945153654, 0.083381655377, 0.0903466057737, 0.131923852967, 0.091660407882, 0.11440327318, 0.0686172947486, 0.129200312581, 0.0854389163845, 0.0705912998177, 0.065121125293, 0.0742380828341, 0.0854389163845, 0.0914300599114, 0.116697056525, 0.0901276374056, ...], [0.0842953852787, 0.0518660645361, 0.0688425675496, 0.0916964785564, 0.128847625977, 0.0910237753652, 0.0844987920911, 0.105917309246, 0.0900695619193, 0.0900814476277, 0.112860991853, 0.0926732673267, 0.0756435643564, 0.0784158415842, 0.0906930693069, 0.107722772277, 0.0807920792079, 0.0879207920792, 0.090297029703, 0.0974257425743, ...], [0.0995419487358, 0.0566932747575, 0.103139830659, 
0.0873756781862, 0.0832312774641, 0.106317109993, 0.0877466909196, 0.120839247202, 0.0976638256343, 0.0766085628385, 0.0808425536102, 0.126052441665, 0.0726485446235, 0.0959826798172, 0.0849170074573, 0.0897281693529, 0.0995910512389, 0.078662496993, 0.0969449121963, 0.0793841712774, ...], [0.117801081687, 0.0943130009469, 0.0515135902269, 0.10545249186, 0.0944253408009, 0.105271822002, 0.0954933114009, 0.0651543340649, 0.0383074807107, 0.0792520463759, 0.153015499925, 0.113686051596, 0.0944468736336, 0.0592479230433, 0.0988194140796, 0.101005684303, 0.096195889812, 0.096195889812, 0.0686488850022, 0.0623087013555, ...], [0.0731776490501, 0.0712100001219, 0.109737568554, 0.0595921444108, 0.0638344453193, 0.0762462053669, 0.0733110101932, 0.115271113309, 0.13139444359, 0.0774666998622, 0.148758720222, 0.0855430020752, 0.0680193682269, 0.0973022826839, 0.0924602259626, 0.0880793175006, 0.0855430020752, 0.101913765276, 0.0949965413881, 0.104680654831, ...], [0.117620328396, 0.0842790397159, 0.0983482855473, 0.117148215074, 0.115752347151, 0.0485812707264, 0.0839024378383, 0.11263994789, 0.0678342126373, 0.108253114853, 0.0456408001713, 0.105845181675, 0.0789889415482, 0.0932069510269, 0.102685624013, 0.11532385466, 0.0789889415482, 0.0932069510269, 0.0821484992101, 0.0947867298578, ...], [0.0918138876293, 0.0812197731926, 0.113558463822, 0.0960908501524, 0.0587890624809, 0.0860386597311, 0.0703068770504, 0.0406429035279, 0.111694359724, 0.0943436315527, 0.155501531136, 0.0813758389262, 0.101510067114, 0.109060402685, 0.0813758389262, 0.0654362416107, 0.0788590604027, 0.0864093959732, 0.0755033557047, 0.0922818791946, ...], [0.104746288118, 0.0533569318189, 0.0523897695476, 0.0583000851884, 0.0787301255592, 0.138440837761, 0.152733227314, 0.0913592568827, 0.0881518274414, 0.0870679262584, 0.0947237241109, 0.0914583333333, 0.0704166666667, 0.075, 0.0670833333333, 0.0875, 0.126041666667, 0.132708333333, 0.079375, 0.0945833333333, ...], [0.160915056666, 0.0832245998739, 
0.0506360528275, 0.12658458999, 0.057762563961, 0.0887716391952, 0.0760406217988, 0.0659714677172, 0.107550168555, 0.0871408097759, 0.0954024296403, 0.143191116306, 0.0870835768556, 0.0686732904734, 0.112507305669, 0.0660432495617, 0.0812390414962, 0.0683810637054, 0.0698421975453, 0.107831677382, ...], [0.0924159560849, 0.10902799519, 0.109129648161, 0.0700583522686, 0.112748816779, 0.0828274141212, 0.0620371134711, 0.0644140134551, 0.0817596239787, 0.0777387537625, 0.137842312728, 0.0886105860113, 0.0827032136106, 0.102788279773, 0.086011342155, 0.110349716446, 0.112712665406, 0.0824669187146, 0.0637996219282, 0.0718336483932, ...], [0.0232773173295, 0.102554932777, 0.0548014820573, 0.0231733059686, 0.0752955163473, 0.0949925915621, 0.0783216012158, 0.140890408259, 0.15839854271, 0.152253506105, 0.0960407956685, 0.0374826469227, 0.0869967607589, 0.0684868116613, 0.047200370199, 0.0763535400278, 0.0795927811199, 0.0758907913003, 0.142989356779, 0.167977788061, ...], ...], y=['0', '0', '0', '0', '0', '0', '0', '0', '0', '0', '0', '0', '0', '0', '0', '0', '0', '0', '0', '0', ...], scorer=, train=array([ 100, 101, 102, ..., 4997, 4998, 4999]), test=array([ 0, 1, 2, 3, 4, 5, 6,...4093, 4094,
4095, 4096, 4097, 4098, 4099]), verbose=0, parameters={'bootstrap': True, 'criterion': 'gini', 'max_depth': 3, 'max_features': 1, 'min_samples_leaf': 1, 'min_samples_split': 1}, fit_params={}, return_train_score=False, return_parameters=True, error_score='raise')
1670
1671 try:
1672 if y_train is None:
1673 estimator.fit(X_train, **fit_params)
1674 else:
-> 1675 estimator.fit(X_train, y_train, **fit_params)
estimator.fit =
X_train = [[0.108771151419, 0.105614197748, 0.0966494789241, 0.0866170007457, 0.0964047576502, 0.11930815248, 0.0665383993874, 0.0841112601666, 0.051475049271, 0.068146231684, 0.116364320525, 0.105090670745, 0.093292549705, 0.106620056806, 0.0873934891851, 0.0941664845969, 0.117325759231, 0.0694778239021, 0.0793095914354, 0.0747214332532, ...], [0.106617909874, 0.106357049024, 0.0765518196138, 0.147184749999, 0.0858705240986, 0.0847635617398, 0.0850937259626, 0.111091967911, 0.0773122555785, 0.0487853243806, 0.0703711118193, 0.0942956926659, 0.108265424913, 0.0791618160652, 0.136204889406, 0.0977881257276, 0.0849825378347, 0.0861466821886, 0.0966239813737, 0.0791618160652, ...], [0.185416168109, 0.100101261529, 0.133528111996, 0.101465662428, 0.0703716746745, 0.102758077764, 0.0812441134372, 0.0988210729473, 0.0435087624933, 0.0382121103195, 0.0445729843021, 0.141348497157, 0.103980503656, 0.10804224208, 0.0909829406986, 0.0601137286759, 0.0999187652315, 0.072298943948, 0.116978066613, 0.0674248578392, ...], [0.0963637088631, 0.121625351917, 0.0956768527573, 0.189534202703, 0.10684804642, 0.0715540664797, 0.123586319996, 0.0487971734638, 0.0425811912908, 0.0475143202527, 0.055918765857, 0.106206438765, 0.108861599734, 0.0932625290408, 0.138400265516, 0.0945901095254, 0.081978094922, 0.103883172917, 0.0640557583804, 0.0640557583804, ...], [0.0801096133575, 0.071895449447, 0.0685534174348, 0.0896877135845, 0.108408700517, 0.0759894373889, 0.111760084293, 0.114561218587, 0.0846313318665, 0.0917518547716, 0.102651178752, 0.0948412698413, 0.0845238095238, 0.068253968254, 0.0936507936508, 0.1, 0.0845238095238, 0.109920634921, 0.1, 0.0809523809524, ...], [0.0382561737825, 0.0315757277579, 0.0376576988282, 0.0861285452859, 0.143679577183, 0.0924431361215, 0.086804250971, 0.105428327609, 0.117157492386, 0.134981244139, 0.125887825935, 0.0738862730895, 0.0590365809489, 0.0684534588917, 0.102136906918, 0.114813473379, 0.0804056501268, 0.0843897138718, 0.0876494023904, 
0.105396595436, ...], [0.0784746826308, 0.1006413305, 0.105294687454, 0.0729785868519, 0.0657225365574, 0.110274033891, 0.109100079816, 0.0715028124597, 0.116002698888, 0.0796512495891, 0.0903573013613, 0.0795935647756, 0.0897544453853, 0.106689246401, 0.0838272650296, 0.0804403048264, 0.104149026249, 0.106689246401, 0.0762066045724, 0.0973751058425, ...], [0.0920064369969, 0.0767766409783, 0.088577875781, 0.103323636964, 0.0708319990492, 0.102821897842, 0.0771736055954, 0.0911898776385, 0.115921754556, 0.113968901938, 0.0674073726602, 0.10502283105, 0.0896118721461, 0.0816210045662, 0.087899543379, 0.0810502283105, 0.0958904109589, 0.074200913242, 0.0816210045662, 0.101598173516, ...], [0.113920327585, 0.104225241968, 0.121645767516, 0.072213302433, 0.130757972587, 0.089292624932, 0.0656439314145, 0.101428447263, 0.0763251337636, 0.0631624875531, 0.0613847629852, 0.0898410504492, 0.109882515549, 0.112646855563, 0.0822391154112, 0.108500345543, 0.0746371803732, 0.0711817553559, 0.0981340704907, 0.0988251554941, ...], [0.10219007675, 0.0723602246041, 0.0961970951707, 0.116814267704, 0.148062298638, 0.0827725938389, 0.125494140016, 0.0514917907593, 0.0651889033509, 0.05285931811, 0.0865692910589, 0.0786445012788, 0.0620204603581, 0.0895140664962, 0.105498721228, 0.128516624041, 0.0773657289003, 0.106138107417, 0.0818414322251, 0.0914322250639, ...], [0.080177548404, 0.0931795868809, 0.0999111444437, 0.118483345352, 0.091940873078, 0.083312789863, 0.0522401079175, 0.106566192871, 0.0585826455211, 0.107876963363, 0.107728802306, 0.0743801652893, 0.0914256198347, 0.107438016529, 0.114669421488, 0.0847107438017, 0.0888429752066, 0.0712809917355, 0.0976239669421, 0.0686983471074, ...], [0.113891144206, 0.0866807008977, 0.0548496521406, 0.0900785333581, 0.0950678237314, 0.0814450026369, 0.0838627610583, 0.098519913327, 0.140039732117, 0.0894909660789, 0.0660737704478, 0.117040630685, 0.0824742268041, 0.0651910248636, 0.0779260157671, 0.0921770770164, 0.089144936325, 
0.0876288659794, 0.0885385081868, 0.11522134627, ...], [0.0381077645694, 0.0974157216961, 0.0950881655299, 0.092574894648, 0.130234317646, 0.108731554031, 0.105156453529, 0.0993667783061, 0.0648689912665, 0.106401315243, 0.0620540435341, 0.076511861009, 0.0935516204477, 0.0898763782158, 0.0968927497494, 0.11693952556, 0.0992315402606, 0.0928833945874, 0.0838623454728, 0.0825258937521, ...], [0.0805225473236, 0.0825788214013, 0.0774580674909, 0.0505955135195, 0.0852763226211, 0.093171130758, 0.14737804487, 0.0900577192716, 0.0697143905999, 0.108670714035, 0.114576728109, 0.0880964866282, 0.0980597797588, 0.0880964866282, 0.0608285264814, 0.0844257996854, 0.102254850551, 0.126900891453, 0.0859989512323, 0.0676455165181, ...], [0.0545096490945, 0.0854380096443, 0.0590723053012, 0.11268847762, 0.11237593458, 0.148140575669, 0.0777969797628, 0.0578052877306, 0.114498669096, 0.0769023412346, 0.100771770267, 0.100886162236, 0.0804362644853, 0.0688479890934, 0.091342876619, 0.0940695296524, 0.12406271302, 0.105657805044, 0.0743012951602, 0.0879345603272, ...], [0.0793137451288, 0.0971837591699, 0.112239651189, 0.0960395146984, 0.107311937923, 0.12176252675, 0.0855251726555, 0.0610570054797, 0.0502425432012, 0.0688468299037, 0.120477313901, 0.0880999342538, 0.0878807801885, 0.104098181021, 0.0973044049967, 0.10650887574, 0.115713346483, 0.0913872452334, 0.0721016874863, 0.0600482138944, ...], [0.098928448476, 0.137305549245, 0.0753791252916, 0.0990566805978, 0.0822124178289, 0.0910533865423, 0.0892351234243, 0.1018692375, 0.107928239779, 0.0444253340597, 0.0726064572547, 0.108493310064, 0.112274578243, 0.086678301338, 0.0933682373473, 0.0796974985457, 0.087550901687, 0.0767888307155, 0.0983129726585, 0.108493310064, ...], [0.123070473061, 0.0578929142681, 0.0464503220138, 0.201737887342, 0.0855492777249, 0.0536180031335, 0.111344533931, 0.10607667333, 0.0519348097508, 0.0775904494795, 0.084734655966, 0.118683901293, 0.0708969839405, 0.0642381511947, 0.142969056013, 
0.087348217783, 0.0708969839405, 0.104582843713, 0.0967489228359, 0.0771641206424, ...], [0.0908112608184, 0.086731479801, 0.106727761089, 0.116224958469, 0.0746198778934, 0.100064341346, 0.0903571333521, 0.0540778404843, 0.0883506852995, 0.0824550664772, 0.109579594971, 0.0942684766214, 0.10407239819, 0.0942684766214, 0.0972850678733, 0.077677224736, 0.0852187028658, 0.0980392156863, 0.0761689291101, 0.0897435897436, ...], [0.0946723497621, 0.102614036567, 0.104068459189, 0.104233398688, 0.0692101472815, 0.10136195871, 0.0758868780611, 0.100250906356, 0.0760499812908, 0.0793837453837, 0.092268138711, 0.0808823529412, 0.104411764706, 0.105882352941, 0.107352941176, 0.0801470588235, 0.0911764705882, 0.0727941176471, 0.0948529411765, 0.0860294117647, ...], ...]
y_train = ['0', '0', '0', '0', '0', '0', '0', '0', '0', '0', '0', '0', '0', '0', '0', '0', '0', '0', '0', '0', ...]
fit_params = {}
1676
1677 except Exception as e:
1678 if error_score == 'raise':
1679 raise

...........................................................................
/home/sv378/anaconda2/lib/python2.7/site-packages/sklearn/ensemble/forest.py in fit(self=ExtraTreesClassifier(bootstrap=True, class_weigh..., random_state=None, verbose=0, warm_start=False), X=array([[ 0.10877115, 0.1056142 , 0.09664948, .... 0.26129046, 0.23792571]], dtype=float32), y=array([[ 0.],
[ 0.],
[ 0.],
...,
[ 4.],
[ 4.],
[ 4.]]), sample_weight=None)
323 trees = Parallel(n_jobs=self.n_jobs, verbose=self.verbose,
324 backend="threading")(
325 delayed(parallel_build_trees)(
326 t, self, X, y, sample_weight, i, len(trees),
327 verbose=self.verbose, class_weight=self.class_weight)
--> 328 for i, t in enumerate(trees))
i = 99
329
330 # Collect newly grown trees
331 self.estimators
.extend(trees)
332

...........................................................................
/home/sv378/anaconda2/lib/python2.7/site-packages/sklearn/externals/joblib/parallel.py in call(self=Parallel(n_jobs=1), iterable=<generator object >)
774 self.n_completed_tasks = 0
775 try:
776 # Only set self._iterating to True if at least a batch
777 # was dispatched. In particular this covers the edge
778 # case of Parallel used with an exhausted iterator.
--> 779 while self.dispatch_one_batch(iterator):
self.dispatch_one_batch = <bound method Parallel.dispatch_one_batch of Parallel(n_jobs=1)>
iterator = <generator object >
780 self._iterating = True
781 else:
782 self._iterating = False
783

...........................................................................
/home/sv378/anaconda2/lib/python2.7/site-packages/sklearn/externals/joblib/parallel.py in dispatch_one_batch(self=Parallel(n_jobs=1), iterator=<generator object >)
620 tasks = BatchedCalls(itertools.islice(iterator, batch_size))
621 if len(tasks) == 0:
622 # No more tasks available in the iterator: tell caller to stop.
623 return False
624 else:
--> 625 self._dispatch(tasks)
self._dispatch = <bound method Parallel._dispatch of Parallel(n_jobs=1)>
tasks = <sklearn.externals.joblib.parallel.BatchedCalls object>
626 return True
627
628 def _print(self, msg, msg_args):
629 """Display the message on stout or stderr depending on verbosity"""

...........................................................................
/home/sv378/anaconda2/lib/python2.7/site-packages/sklearn/externals/joblib/parallel.py in _dispatch(self=Parallel(n_jobs=1), batch=<sklearn.externals.joblib.parallel.BatchedCalls object>)
583 self.n_dispatched_tasks += len(batch)
584 self.n_dispatched_batches += 1
585
586 dispatch_timestamp = time.time()
587 cb = BatchCompletionCallBack(dispatch_timestamp, len(batch), self)
--> 588 job = self._backend.apply_async(batch, callback=cb)
job = undefined
self._backend.apply_async = <bound method SequentialBackend.apply_async of <...lib._parallel_backends.SequentialBackend object>>
batch = <sklearn.externals.joblib.parallel.BatchedCalls object>
cb = <sklearn.externals.joblib.parallel.BatchCompletionCallBack object>
589 self._jobs.append(job)
590
591 def dispatch_next(self):
592 """Dispatch more data for parallel processing

...........................................................................
/home/sv378/anaconda2/lib/python2.7/site-packages/sklearn/externals/joblib/_parallel_backends.py in apply_async(self=<sklearn.externals.joblib._parallel_backends.SequentialBackend object>, func=<sklearn.externals.joblib.parallel.BatchedCalls object>, callback=<sklearn.externals.joblib.parallel.BatchCompletionCallBack object>)
106 raise ValueError('n_jobs == 0 in Parallel has no meaning')
107 return 1
108
109 def apply_async(self, func, callback=None):
110 """Schedule a func to be run"""
--> 111 result = ImmediateResult(func)
result = undefined
func = <sklearn.externals.joblib.parallel.BatchedCalls object>
112 if callback:
113 callback(result)
114 return result
115

...........................................................................
/home/sv378/anaconda2/lib/python2.7/site-packages/sklearn/externals/joblib/_parallel_backends.py in init(self=<sklearn.externals.joblib._parallel_backends.ImmediateResult object>, batch=<sklearn.externals.joblib.parallel.BatchedCalls object>)
327
328 class ImmediateResult(object):
329 def init(self, batch):
330 # Don't delay the application, to avoid keeping the input
331 # arguments in memory
--> 332 self.results = batch()
self.results = undefined
batch = <sklearn.externals.joblib.parallel.BatchedCalls object>
333
334 def get(self):
335 return self.results
336

...........................................................................
/home/sv378/anaconda2/lib/python2.7/site-packages/sklearn/externals/joblib/parallel.py in call(self=<sklearn.externals.joblib.parallel.BatchedCalls object>)
126 def init(self, iterator_slice):
127 self.items = list(iterator_slice)
128 self._size = len(self.items)
129
130 def call(self):
--> 131 return [func(*args, **kwargs) for func, args, kwargs in self.items]
func =
args = (ExtraTreeClassifier(class_weight=None, criterion...dom_state=820678124,
splitter='random'), ExtraTreesClassifier(bootstrap=True, class_weigh..., random_state=None, verbose=0, warm_start=False), array([[ 0.10877115, 0.1056142 , 0.09664948, .... 0.26129046, 0.23792571]], dtype=float32), array([[ 0.],
[ 0.],
[ 0.],
...,
[ 4.],
[ 4.],
[ 4.]]), None, 0, 100)
kwargs = {'class_weight': None, 'verbose': 0}
self.items = [(, (ExtraTreeClassifier(class_weight=None, criterion...dom_state=820678124,
splitter='random'), ExtraTreesClassifier(bootstrap=True, class_weigh..., random_state=None, verbose=0, warm_start=False), array([[ 0.10877115, 0.1056142 , 0.09664948, .... 0.26129046, 0.23792571]], dtype=float32), array([[ 0.],
[ 0.],
[ 0.],
...,
[ 4.],
[ 4.],
[ 4.]]), None, 0, 100), {'class_weight': None, 'verbose': 0})]
132
133 def len(self):
134 return self._size
135

...........................................................................
/home/sv378/anaconda2/lib/python2.7/site-packages/sklearn/ensemble/forest.py in _parallel_build_trees(tree=ExtraTreeClassifier(class_weight=None, criterion...dom_state=820678124,
splitter='random'), forest=ExtraTreesClassifier(bootstrap=True, class_weigh..., random_state=None, verbose=0, warm_start=False), X=array([[ 0.10877115, 0.1056142 , 0.09664948, .... 0.26129046, 0.23792571]], dtype=float32), y=array([[ 0.],
[ 0.],
[ 0.],
...,
[ 4.],
[ 4.],
[ 4.]]), sample_weight=None, tree_idx=0, n_trees=100, verbose=0, class_weight=None)
116 warnings.simplefilter('ignore', DeprecationWarning)
117 curr_sample_weight *= compute_sample_weight('auto', y, indices)
118 elif class_weight == 'balanced_subsample':
119 curr_sample_weight *= compute_sample_weight('balanced', y, indices)
120
--> 121 tree.fit(X, y, sample_weight=curr_sample_weight, check_input=False)
tree.fit = <bound method ExtraTreeClassifier.fit of ExtraTr...om_state=820678124,
splitter='random')>
X = array([[ 0.10877115, 0.1056142 , 0.09664948, .... 0.26129046, 0.23792571]], dtype=float32)
y = array([[ 0.],
[ 0.],
[ 0.],
...,
[ 4.],
[ 4.],
[ 4.]])
sample_weight = None
curr_sample_weight = array([ 2., 0., 2., ..., 1., 1., 2.])
122 else:
123 tree.fit(X, y, sample_weight=sample_weight, check_input=False)
124
125 return tree

...........................................................................
/home/sv378/anaconda2/lib/python2.7/site-packages/sklearn/tree/tree.py in fit(self=ExtraTreeClassifier(class_weight=None, criterion...dom_state=820678124,
splitter='random'), X=array([[ 0.10877115, 0.1056142 , 0.09664948, .... 0.26129046, 0.23792571]], dtype=float32), y=array([[ 0.],
[ 0.],
[ 0.],
...,
[ 4.],
[ 4.],
[ 4.]]), sample_weight=array([ 2., 0., 2., ..., 1., 1., 2.]), check_input=False, X_idx_sorted=None)
785
786 super(DecisionTreeClassifier, self).fit(
787 X, y,
788 sample_weight=sample_weight,
789 check_input=check_input,
--> 790 X_idx_sorted=X_idx_sorted)
X_idx_sorted = None
791 return self
792
793 def predict_proba(self, X, check_input=True):
794 """Predict class probabilities of the input samples X.

...........................................................................
/home/sv378/anaconda2/lib/python2.7/site-packages/sklearn/tree/tree.py in fit(self=ExtraTreeClassifier(class_weight=None, criterion...dom_state=820678124,
splitter='random'), X=array([[ 0.10877115, 0.1056142 , 0.09664948, .... 0.26129046, 0.23792571]], dtype=float32), y=array([[ 0.],
[ 0.],
[ 0.],
...,
[ 4.],
[ 4.],
[ 4.]]), sample_weight=array([ 2., 0., 2., ..., 1., 1., 2.]), check_input=False, X_idx_sorted=None)
189 if isinstance(self.min_samples_split, (numbers.Integral, np.integer)):
190 if not 2 <= self.min_samples_split:
191 raise ValueError("min_samples_split must be an integer "
192 "greater than 1 or a float in (0.0, 1.0]; "
193 "got the integer %s"
--> 194 % self.min_samples_split)
self.min_samples_split = 1
195 min_samples_split = self.min_samples_split
196 else: # float
197 if not 0. < self.min_samples_split <= 1.:
198 raise ValueError("min_samples_split must be an integer "

ValueError: min_samples_split must be an integer greater than 1 or a float in (0.0, 1.0]; got the integer 1
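This ValueError is a scikit-learn version issue rather than a problem with the training data: since scikit-learn 0.18, `min_samples_split` must be an integer >= 2 (or a float in (0.0, 1.0]), but the hyperparameter grid in trainClassifier.py (visible as `self.param_grid` further down in the traceback) still includes the value 1. A minimal sketch of a workaround, assuming the grid is the `param_grid_forest` dict that the traceback shows being passed to GridSearchCV:

```python
# Hypothetical patch for the hyperparameter grid in trainClassifier.py.
# scikit-learn >= 0.18 rejects min_samples_split=1, so replace it with 2
# (the smallest integer that still allows a node to be split).
param_grid_forest = {
    "bootstrap": [True, False],
    "criterion": ["gini", "entropy"],
    "max_depth": [3, 10, None],
    "max_features": [1, 3, 11, 121],
    "min_samples_leaf": [1, 3, 10],
    "min_samples_split": [2, 3, 10],  # was [1, 3, 10]
}

# Every candidate value is now valid for modern scikit-learn.
assert all(v >= 2 for v in param_grid_forest["min_samples_split"])
```

Alternatively, pinning scikit-learn to a version before 0.18 (which, as far as I can tell, still accepted `min_samples_split=1`) should let the unmodified script run.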


python trainClassifier.py combinedTrainingSetsTennessenEuro/ classifiers/tennessenEuro/tennessenEuroAutosomal.p all
using these features: ['all'] (indices: [1, 2, 3, 4, 5, 6, 7, 8, 9, 10, 11, 12, 13, 14, 15, 16, 17, 18, 19, 20, 21, 22, 23, 24, 25, 26, 27, 28, 29, 30, 31, 32, 33, 34, 35, 36, 37, 38, 39, 40, 41, 42, 43, 44, 45, 46, 47, 48, 49, 50, 51, 52, 53, 54, 55, 56, 57, 58, 59, 60, 61, 62, 63, 64, 65, 66, 67, 68, 69, 70, 71, 72, 73, 74, 75, 76, 77, 78, 79, 80, 81, 82, 83, 84, 85, 86, 87, 88, 89, 90, 91, 92, 93, 94, 95, 96, 97, 98, 99, 100, 101, 102, 103, 104, 105, 106, 107, 108, 109, 110, 111, 112, 113, 114, 115, 116, 117, 118, 119, 120, 121])
training set size after balancing: 5000
Checking accuracy when distinguishing among all 5 classes
Training extraTreesClassifier
Traceback (most recent call last):
File "trainClassifier.py", line 102, in <module>
grid_search.fit(X, y)
File "/home/sv378/anaconda2/lib/python2.7/site-packages/sklearn/grid_search.py", line 838, in fit
return self._fit(X, y, ParameterGrid(self.param_grid))
File "/home/sv378/anaconda2/lib/python2.7/site-packages/sklearn/grid_search.py", line 574, in _fit
for parameters in parameter_iterable
File "/home/sv378/anaconda2/lib/python2.7/site-packages/sklearn/externals/joblib/parallel.py", line 789, in __call__
self.retrieve()
File "/home/sv378/anaconda2/lib/python2.7/site-packages/sklearn/externals/joblib/parallel.py", line 740, in retrieve
raise exception
sklearn.externals.joblib.my_exceptions.JoblibValueError: JoblibValueError


Multiprocessing exception:
...........................................................................
/rds/user/sv378/hpc-work/Programs/shIC/trainClassifier.py in <module>()
97
98 heatmap = []
99 sys.stderr.write("Training %s\n" %(mlType))
100 grid_search = GridSearchCV(clf,param_grid=param_grid_forest,cv=10,n_jobs=20)
101 start = time()
--> 102 grid_search.fit(X, y)
103 sys.stderr.write("GridSearchCV took %.2f seconds for %d candidate parameter settings.\n"
104 % (time() - start, len(grid_search.grid_scores_)))
105 print "Results for %s" %(mlType)
106 report(grid_search.grid_scores_)

...........................................................................
/home/sv378/anaconda2/lib/python2.7/site-packages/sklearn/grid_search.py in fit(self=GridSearchCV(cv=10, error_score='raise',
...='2*n_jobs', refit=True, scoring=None, verbose=0), X=[[0.0785765713796, 0.0711224669757, 0.0534461846636, ...], ...], y=['0', '0', '0', ...])
833 y : array-like, shape = [n_samples] or [n_samples, n_output], optional
834 Target relative to X for classification or regression;
835 None for unsupervised learning.
836
837 """
--> 838 return self._fit(X, y, ParameterGrid(self.param_grid))
self._fit = <bound method GridSearchCV._fit of GridSearchCV(...'2*n_jobs', refit=True, scoring=None, verbose=0)>
X = [[0.0785765713796, 0.0711224669757, 0.0534461846636, ...], ...]
y = ['0', '0', '0', ...]
self.param_grid = {'bootstrap': [True, False], 'criterion': ['gini', 'entropy'], 'max_depth': [3, 10, None], 'max_features': [1, 3, 11, 121], 'min_samples_leaf': [1, 3, 10], 'min_samples_split': [1, 3, 10]}
839
840
841 class RandomizedSearchCV(BaseSearchCV):
842 """Randomized search on hyper parameters.

...........................................................................
/home/sv378/anaconda2/lib/python2.7/site-packages/sklearn/grid_search.py in _fit(self=GridSearchCV(cv=10, error_score='raise',
...='2*n_jobs', refit=True, scoring=None, verbose=0), X=[[0.0785765713796, 0.0711224669757, 0.0534461846636, ...], ...], y=['0', '0', '0', ...], parameter_iterable=<sklearn.grid_search.ParameterGrid object>)
569 )(
570 delayed(_fit_and_score)(clone(base_estimator), X, y, self.scorer_,
571 train, test, self.verbose, parameters,
572 self.fit_params, return_parameters=True,
573 error_score=self.error_score)
--> 574 for parameters in parameter_iterable
parameters = undefined
parameter_iterable = <sklearn.grid_search.ParameterGrid object>
575 for train, test in cv)
576
577 # Out is a list of triplet: score, estimator, n_test_samples
578 n_fits = len(out)

...........................................................................
/home/sv378/anaconda2/lib/python2.7/site-packages/sklearn/externals/joblib/parallel.py in __call__(self=Parallel(n_jobs=20), iterable=<generator object >)
784 if pre_dispatch == "all" or n_jobs == 1:
785 # The iterable was consumed all at once by the above for loop.
786 # No need to wait for async callbacks to trigger to
787 # consumption.
788 self._iterating = False
--> 789 self.retrieve()
self.retrieve = <bound method Parallel.retrieve of Parallel(n_jobs=20)>
790 # Make sure that we get a last message telling us we are done
791 elapsed_time = time.time() - self._start_time
792 self._print('Done %3i out of %3i | elapsed: %s finished',
793 (len(self._output), len(self._output),


Sub-process traceback:

ValueError Mon Feb 19 15:00:55 2018
PID: 233604 Python 2.7.14: /home/sv378/anaconda2/bin/python
...........................................................................
/home/sv378/anaconda2/lib/python2.7/site-packages/sklearn/externals/joblib/parallel.py in __call__(self=<sklearn.externals.joblib.parallel.BatchedCalls object>)
126 def __init__(self, iterator_slice):
127 self.items = list(iterator_slice)
128 self._size = len(self.items)
129
130 def __call__(self):
--> 131 return [func(*args, **kwargs) for func, args, kwargs in self.items]
func =
args = (ExtraTreesClassifier(bootstrap=True, class_weigh..., random_state=None, verbose=0, warm_start=False), [[0.0785765713796, 0.0711224669757, 0.0534461846636, ...], ...], ['0', '0', '0', ...], , array([ 100, 101, 102, ..., 4997, 4998, 4999]), array([ 0, 1, 2, 3, 4, 5, 6, ..., 4097, 4098, 4099]), 0, {'bootstrap': True, 'criterion': 'gini', 'max_depth': 3, 'max_features': 1, 'min_samples_leaf': 1, 'min_samples_split': 1}, {})
kwargs = {'error_score': 'raise', 'return_parameters': True}
self.items = [(, (ExtraTreesClassifier(bootstrap=True, class_weigh..., random_state=None, verbose=0, warm_start=False), [[0.0785765713796, 0.0711224669757, 0.0534461846636, 0.11298676124, 0.107917943941, 0.101909772587, 0.0901051202811, 0.0651605926738, 0.107920962989, 0.10713828059, 0.10371534268, 0.0868028279654, 0.0738413197172, 0.0652003142184, 0.106834249804, 0.106441476826, 0.0856245090338, 0.0934799685782, 0.0777690494894, 0.0926944226237, ...], [0.0973934121939, 0.103414595071, 0.0582038016651, 0.0722162925787, 0.101465160151, 0.096078447283, 0.0956982685818, 0.124639279001, 0.0882124475236, 0.0822871796707, 0.08039111628, 0.0980584910297, 0.11231260752, 0.0646350454657, 0.078397640698, 0.0857704595724, 0.0916687146719, 0.0948636028508, 0.107151634308, 0.0941263209634, ...], [0.0946038612071, 0.0628274293759, 0.0635114153467, 0.0821922661427, 0.117514683178, 0.0857874299633, 0.116179998728, 0.034174352587, 0.148609061614, 0.0954130467951, 0.0991864550625, 0.0895716140199, 0.0876244050195, 0.0748593682389, 0.111639982691, 0.115750757248, 0.0822154911294, 0.0993076590221, 0.0577672003462, 0.11813067936, ...], [0.0763012937959, 0.0822720272276, 0.0875805475829, 0.151841581049, 0.105441813407, 0.125292033663, 0.0557865414393, 0.129616497127, 0.0623685496556, 0.0592295925092, 0.0642695225445, 0.0799347471452, 0.0872756933116, 0.10277324633, 0.142740619902, 0.119086460033, 0.07911908646, 0.0668841761827, 0.108482871126, 0.0807504078303, ...], [0.0780317791643, 0.088050455936, 0.112721027998, 0.0815720859807, 0.0932859144498, 0.120855127775, 0.0889598922566, 0.0732143452073, 0.0712786931909, 0.097072670274, 0.094958007768, 0.0838767042087, 0.0853586247777, 0.10106698281, 0.0820983995258, 0.092768227623, 0.109069353883, 0.0856550088915, 0.081802015412, 0.0755779490219, ...], [0.0460941718576, 0.0463637200459, 0.0447542856168, 0.0554310536586, 0.0860398274601, 0.0982856948517, 0.0724855559341, 0.0878971694772, 0.127892013132, 0.18569688579, 0.149059622177, 
0.0534351145038, 0.0490730643402, 0.0610687022901, 0.062159214831, 0.0741548527808, 0.0905125408942, 0.0959651035987, 0.100327153762, 0.121046892039, ...], [0.0422460985659, 0.0712603176271, 0.0667774319623, 0.0680146080701, 0.0772870708642, 0.09369433366, 0.100985932355, 0.0810389503239, 0.144340071979, 0.18983764455, 0.0645175400425, 0.0465949820789, 0.0716845878136, 0.0878136200717, 0.078853046595, 0.078853046595, 0.094982078853, 0.0913978494624, 0.0931899641577, 0.123655913978, ...], [0.0747003438046, 0.0709113975128, 0.0702564509068, 0.125195449018, 0.148396843503, 0.133330655143, 0.0947256169447, 0.0949676302822, 0.0777737196361, 0.0431480248604, 0.0665938683875, 0.0801005747126, 0.0757902298851, 0.0765086206897, 0.122126436782, 0.136494252874, 0.123204022989, 0.0908764367816, 0.0897988505747, 0.0818965517241, ...], [0.0935852681206, 0.0799515986482, 0.084809124865, 0.104350286807, 0.0861143667554, 0.075640690841, 0.0950895431933, 0.114404264643, 0.0916468738272, 0.0717119718634, 0.102696010436, 0.0822147651007, 0.0922818791946, 0.0939597315436, 0.0939597315436, 0.0989932885906, 0.0687919463087, 0.0838926174497, 0.110738255034, 0.0922818791946, ...], [0.143762118287, 0.0752115450379, 0.0759428491362, 0.0611558822459, 0.0635945153654, 0.083381655377, 0.0903466057737, 0.131923852967, 0.091660407882, 0.11440327318, 0.0686172947486, 0.129200312581, 0.0854389163845, 0.0705912998177, 0.065121125293, 0.0742380828341, 0.0854389163845, 0.0914300599114, 0.116697056525, 0.0901276374056, ...], [0.0842953852787, 0.0518660645361, 0.0688425675496, 0.0916964785564, 0.128847625977, 0.0910237753652, 0.0844987920911, 0.105917309246, 0.0900695619193, 0.0900814476277, 0.112860991853, 0.0926732673267, 0.0756435643564, 0.0784158415842, 0.0906930693069, 0.107722772277, 0.0807920792079, 0.0879207920792, 0.090297029703, 0.0974257425743, ...], [0.0995419487358, 0.0566932747575, 0.103139830659, 0.0873756781862, 0.0832312774641, 0.106317109993, 0.0877466909196, 0.120839247202, 
0.0976638256343, 0.0766085628385, 0.0808425536102, 0.126052441665, 0.0726485446235, 0.0959826798172, 0.0849170074573, 0.0897281693529, 0.0995910512389, 0.078662496993, 0.0969449121963, 0.0793841712774, ...], [0.117801081687, 0.0943130009469, 0.0515135902269, 0.10545249186, 0.0944253408009, 0.105271822002, 0.0954933114009, 0.0651543340649, 0.0383074807107, 0.0792520463759, 0.153015499925, 0.113686051596, 0.0944468736336, 0.0592479230433, 0.0988194140796, 0.101005684303, 0.096195889812, 0.096195889812, 0.0686488850022, 0.0623087013555, ...], [0.0731776490501, 0.0712100001219, 0.109737568554, 0.0595921444108, 0.0638344453193, 0.0762462053669, 0.0733110101932, 0.115271113309, 0.13139444359, 0.0774666998622, 0.148758720222, 0.0855430020752, 0.0680193682269, 0.0973022826839, 0.0924602259626, 0.0880793175006, 0.0855430020752, 0.101913765276, 0.0949965413881, 0.104680654831, ...], [0.117620328396, 0.0842790397159, 0.0983482855473, 0.117148215074, 0.115752347151, 0.0485812707264, 0.0839024378383, 0.11263994789, 0.0678342126373, 0.108253114853, 0.0456408001713, 0.105845181675, 0.0789889415482, 0.0932069510269, 0.102685624013, 0.11532385466, 0.0789889415482, 0.0932069510269, 0.0821484992101, 0.0947867298578, ...], [0.0918138876293, 0.0812197731926, 0.113558463822, 0.0960908501524, 0.0587890624809, 0.0860386597311, 0.0703068770504, 0.0406429035279, 0.111694359724, 0.0943436315527, 0.155501531136, 0.0813758389262, 0.101510067114, 0.109060402685, 0.0813758389262, 0.0654362416107, 0.0788590604027, 0.0864093959732, 0.0755033557047, 0.0922818791946, ...], [0.104746288118, 0.0533569318189, 0.0523897695476, 0.0583000851884, 0.0787301255592, 0.138440837761, 0.152733227314, 0.0913592568827, 0.0881518274414, 0.0870679262584, 0.0947237241109, 0.0914583333333, 0.0704166666667, 0.075, 0.0670833333333, 0.0875, 0.126041666667, 0.132708333333, 0.079375, 0.0945833333333, ...], [0.160915056666, 0.0832245998739, 0.0506360528275, 0.12658458999, 0.057762563961, 0.0887716391952, 0.0760406217988, 
0.0659714677172, 0.107550168555, 0.0871408097759, 0.0954024296403, 0.143191116306, 0.0870835768556, 0.0686732904734, 0.112507305669, 0.0660432495617, 0.0812390414962, 0.0683810637054, 0.0698421975453, 0.107831677382, ...], [0.0924159560849, 0.10902799519, 0.109129648161, 0.0700583522686, 0.112748816779, 0.0828274141212, 0.0620371134711, 0.0644140134551, 0.0817596239787, 0.0777387537625, 0.137842312728, 0.0886105860113, 0.0827032136106, 0.102788279773, 0.086011342155, 0.110349716446, 0.112712665406, 0.0824669187146, 0.0637996219282, 0.0718336483932, ...], [0.0232773173295, 0.102554932777, 0.0548014820573, 0.0231733059686, 0.0752955163473, 0.0949925915621, 0.0783216012158, 0.140890408259, 0.15839854271, 0.152253506105, 0.0960407956685, 0.0374826469227, 0.0869967607589, 0.0684868116613, 0.047200370199, 0.0763535400278, 0.0795927811199, 0.0758907913003, 0.142989356779, 0.167977788061, ...], ...], ['0', '0', '0', '0', '0', '0', '0', '0', '0', '0', '0', '0', '0', '0', '0', '0', '0', '0', '0', '0', ...], , array([ 100, 101, 102, ..., 4997, 4998, 4999]), array([ 0, 1, 2, 3, 4, 5, 6,...4093, 4094,
4095, 4096, 4097, 4098, 4099]), 0, {'bootstrap': True, 'criterion': 'gini', 'max_depth': 3, 'max_features': 1, 'min_samples_leaf': 1, 'min_samples_split': 1}, {}), {'error_score': 'raise', 'return_parameters': True})]
132
133 def len(self):
134 return self._size
135

...........................................................................
/home/sv378/anaconda2/lib/python2.7/site-packages/sklearn/cross_validation.py in _fit_and_score(estimator=ExtraTreesClassifier(bootstrap=True, class_weigh..., random_state=None, verbose=0, warm_start=False), X=[[0.0785765713796, 0.0711224669757, 0.0534461846636, 0.11298676124, 0.107917943941, 0.101909772587, 0.0901051202811, 0.0651605926738, 0.107920962989, 0.10713828059, 0.10371534268, 0.0868028279654, 0.0738413197172, 0.0652003142184, 0.106834249804, 0.106441476826, 0.0856245090338, 0.0934799685782, 0.0777690494894, 0.0926944226237, ...], [0.0973934121939, 0.103414595071, 0.0582038016651, 0.0722162925787, 0.101465160151, 0.096078447283, 0.0956982685818, 0.124639279001, 0.0882124475236, 0.0822871796707, 0.08039111628, 0.0980584910297, 0.11231260752, 0.0646350454657, 0.078397640698, 0.0857704595724, 0.0916687146719, 0.0948636028508, 0.107151634308, 0.0941263209634, ...], [0.0946038612071, 0.0628274293759, 0.0635114153467, 0.0821922661427, 0.117514683178, 0.0857874299633, 0.116179998728, 0.034174352587, 0.148609061614, 0.0954130467951, 0.0991864550625, 0.0895716140199, 0.0876244050195, 0.0748593682389, 0.111639982691, 0.115750757248, 0.0822154911294, 0.0993076590221, 0.0577672003462, 0.11813067936, ...], [0.0763012937959, 0.0822720272276, 0.0875805475829, 0.151841581049, 0.105441813407, 0.125292033663, 0.0557865414393, 0.129616497127, 0.0623685496556, 0.0592295925092, 0.0642695225445, 0.0799347471452, 0.0872756933116, 0.10277324633, 0.142740619902, 0.119086460033, 0.07911908646, 0.0668841761827, 0.108482871126, 0.0807504078303, ...], [0.0780317791643, 0.088050455936, 0.112721027998, 0.0815720859807, 0.0932859144498, 0.120855127775, 0.0889598922566, 0.0732143452073, 0.0712786931909, 0.097072670274, 0.094958007768, 0.0838767042087, 0.0853586247777, 0.10106698281, 0.0820983995258, 0.092768227623, 0.109069353883, 0.0856550088915, 0.081802015412, 0.0755779490219, ...], [0.0460941718576, 0.0463637200459, 0.0447542856168, 0.0554310536586, 0.0860398274601, 0.0982856948517, 
0.0724855559341, 0.0878971694772, 0.127892013132, 0.18569688579, 0.149059622177, 0.0534351145038, 0.0490730643402, 0.0610687022901, 0.062159214831, 0.0741548527808, 0.0905125408942, 0.0959651035987, 0.100327153762, 0.121046892039, ...], [0.0422460985659, 0.0712603176271, 0.0667774319623, 0.0680146080701, 0.0772870708642, 0.09369433366, 0.100985932355, 0.0810389503239, 0.144340071979, 0.18983764455, 0.0645175400425, 0.0465949820789, 0.0716845878136, 0.0878136200717, 0.078853046595, 0.078853046595, 0.094982078853, 0.0913978494624, 0.0931899641577, 0.123655913978, ...], [0.0747003438046, 0.0709113975128, 0.0702564509068, 0.125195449018, 0.148396843503, 0.133330655143, 0.0947256169447, 0.0949676302822, 0.0777737196361, 0.0431480248604, 0.0665938683875, 0.0801005747126, 0.0757902298851, 0.0765086206897, 0.122126436782, 0.136494252874, 0.123204022989, 0.0908764367816, 0.0897988505747, 0.0818965517241, ...], [0.0935852681206, 0.0799515986482, 0.084809124865, 0.104350286807, 0.0861143667554, 0.075640690841, 0.0950895431933, 0.114404264643, 0.0916468738272, 0.0717119718634, 0.102696010436, 0.0822147651007, 0.0922818791946, 0.0939597315436, 0.0939597315436, 0.0989932885906, 0.0687919463087, 0.0838926174497, 0.110738255034, 0.0922818791946, ...], [0.143762118287, 0.0752115450379, 0.0759428491362, 0.0611558822459, 0.0635945153654, 0.083381655377, 0.0903466057737, 0.131923852967, 0.091660407882, 0.11440327318, 0.0686172947486, 0.129200312581, 0.0854389163845, 0.0705912998177, 0.065121125293, 0.0742380828341, 0.0854389163845, 0.0914300599114, 0.116697056525, 0.0901276374056, ...], [0.0842953852787, 0.0518660645361, 0.0688425675496, 0.0916964785564, 0.128847625977, 0.0910237753652, 0.0844987920911, 0.105917309246, 0.0900695619193, 0.0900814476277, 0.112860991853, 0.0926732673267, 0.0756435643564, 0.0784158415842, 0.0906930693069, 0.107722772277, 0.0807920792079, 0.0879207920792, 0.090297029703, 0.0974257425743, ...], [0.0995419487358, 0.0566932747575, 0.103139830659, 
0.0873756781862, 0.0832312774641, 0.106317109993, 0.0877466909196, 0.120839247202, 0.0976638256343, 0.0766085628385, 0.0808425536102, 0.126052441665, 0.0726485446235, 0.0959826798172, 0.0849170074573, 0.0897281693529, 0.0995910512389, 0.078662496993, 0.0969449121963, 0.0793841712774, ...], [0.117801081687, 0.0943130009469, 0.0515135902269, 0.10545249186, 0.0944253408009, 0.105271822002, 0.0954933114009, 0.0651543340649, 0.0383074807107, 0.0792520463759, 0.153015499925, 0.113686051596, 0.0944468736336, 0.0592479230433, 0.0988194140796, 0.101005684303, 0.096195889812, 0.096195889812, 0.0686488850022, 0.0623087013555, ...], [0.0731776490501, 0.0712100001219, 0.109737568554, 0.0595921444108, 0.0638344453193, 0.0762462053669, 0.0733110101932, 0.115271113309, 0.13139444359, 0.0774666998622, 0.148758720222, 0.0855430020752, 0.0680193682269, 0.0973022826839, 0.0924602259626, 0.0880793175006, 0.0855430020752, 0.101913765276, 0.0949965413881, 0.104680654831, ...], [0.117620328396, 0.0842790397159, 0.0983482855473, 0.117148215074, 0.115752347151, 0.0485812707264, 0.0839024378383, 0.11263994789, 0.0678342126373, 0.108253114853, 0.0456408001713, 0.105845181675, 0.0789889415482, 0.0932069510269, 0.102685624013, 0.11532385466, 0.0789889415482, 0.0932069510269, 0.0821484992101, 0.0947867298578, ...], [0.0918138876293, 0.0812197731926, 0.113558463822, 0.0960908501524, 0.0587890624809, 0.0860386597311, 0.0703068770504, 0.0406429035279, 0.111694359724, 0.0943436315527, 0.155501531136, 0.0813758389262, 0.101510067114, 0.109060402685, 0.0813758389262, 0.0654362416107, 0.0788590604027, 0.0864093959732, 0.0755033557047, 0.0922818791946, ...], [0.104746288118, 0.0533569318189, 0.0523897695476, 0.0583000851884, 0.0787301255592, 0.138440837761, 0.152733227314, 0.0913592568827, 0.0881518274414, 0.0870679262584, 0.0947237241109, 0.0914583333333, 0.0704166666667, 0.075, 0.0670833333333, 0.0875, 0.126041666667, 0.132708333333, 0.079375, 0.0945833333333, ...], [0.160915056666, 0.0832245998739, 
0.0506360528275, 0.12658458999, 0.057762563961, 0.0887716391952, 0.0760406217988, 0.0659714677172, 0.107550168555, 0.0871408097759, 0.0954024296403, 0.143191116306, 0.0870835768556, 0.0686732904734, 0.112507305669, 0.0660432495617, 0.0812390414962, 0.0683810637054, 0.0698421975453, 0.107831677382, ...], [0.0924159560849, 0.10902799519, 0.109129648161, 0.0700583522686, 0.112748816779, 0.0828274141212, 0.0620371134711, 0.0644140134551, 0.0817596239787, 0.0777387537625, 0.137842312728, 0.0886105860113, 0.0827032136106, 0.102788279773, 0.086011342155, 0.110349716446, 0.112712665406, 0.0824669187146, 0.0637996219282, 0.0718336483932, ...], [0.0232773173295, 0.102554932777, 0.0548014820573, 0.0231733059686, 0.0752955163473, 0.0949925915621, 0.0783216012158, 0.140890408259, 0.15839854271, 0.152253506105, 0.0960407956685, 0.0374826469227, 0.0869967607589, 0.0684868116613, 0.047200370199, 0.0763535400278, 0.0795927811199, 0.0758907913003, 0.142989356779, 0.167977788061, ...], ...], y=['0', '0', '0', '0', '0', '0', '0', '0', '0', '0', '0', '0', '0', '0', '0', '0', '0', '0', '0', '0', ...], scorer=, train=array([ 100, 101, 102, ..., 4997, 4998, 4999]), test=array([ 0, 1, 2, 3, 4, 5, 6,...4093, 4094,
4095, 4096, 4097, 4098, 4099]), verbose=0, parameters={'bootstrap': True, 'criterion': 'gini', 'max_depth': 3, 'max_features': 1, 'min_samples_leaf': 1, 'min_samples_split': 1}, fit_params={}, return_train_score=False, return_parameters=True, error_score='raise')
1670
1671 try:
1672 if y_train is None:
1673 estimator.fit(X_train, **fit_params)
1674 else:
-> 1675 estimator.fit(X_train, y_train, **fit_params)
estimator.fit =
X_train = [[0.108771151419, 0.105614197748, 0.0966494789241, 0.0866170007457, 0.0964047576502, 0.11930815248, 0.0665383993874, 0.0841112601666, 0.051475049271, 0.068146231684, 0.116364320525, 0.105090670745, 0.093292549705, 0.106620056806, 0.0873934891851, 0.0941664845969, 0.117325759231, 0.0694778239021, 0.0793095914354, 0.0747214332532, ...], [0.106617909874, 0.106357049024, 0.0765518196138, 0.147184749999, 0.0858705240986, 0.0847635617398, 0.0850937259626, 0.111091967911, 0.0773122555785, 0.0487853243806, 0.0703711118193, 0.0942956926659, 0.108265424913, 0.0791618160652, 0.136204889406, 0.0977881257276, 0.0849825378347, 0.0861466821886, 0.0966239813737, 0.0791618160652, ...], [0.185416168109, 0.100101261529, 0.133528111996, 0.101465662428, 0.0703716746745, 0.102758077764, 0.0812441134372, 0.0988210729473, 0.0435087624933, 0.0382121103195, 0.0445729843021, 0.141348497157, 0.103980503656, 0.10804224208, 0.0909829406986, 0.0601137286759, 0.0999187652315, 0.072298943948, 0.116978066613, 0.0674248578392, ...], [0.0963637088631, 0.121625351917, 0.0956768527573, 0.189534202703, 0.10684804642, 0.0715540664797, 0.123586319996, 0.0487971734638, 0.0425811912908, 0.0475143202527, 0.055918765857, 0.106206438765, 0.108861599734, 0.0932625290408, 0.138400265516, 0.0945901095254, 0.081978094922, 0.103883172917, 0.0640557583804, 0.0640557583804, ...], [0.0801096133575, 0.071895449447, 0.0685534174348, 0.0896877135845, 0.108408700517, 0.0759894373889, 0.111760084293, 0.114561218587, 0.0846313318665, 0.0917518547716, 0.102651178752, 0.0948412698413, 0.0845238095238, 0.068253968254, 0.0936507936508, 0.1, 0.0845238095238, 0.109920634921, 0.1, 0.0809523809524, ...], [0.0382561737825, 0.0315757277579, 0.0376576988282, 0.0861285452859, 0.143679577183, 0.0924431361215, 0.086804250971, 0.105428327609, 0.117157492386, 0.134981244139, 0.125887825935, 0.0738862730895, 0.0590365809489, 0.0684534588917, 0.102136906918, 0.114813473379, 0.0804056501268, 0.0843897138718, 0.0876494023904, 
0.105396595436, ...], [0.0784746826308, 0.1006413305, 0.105294687454, 0.0729785868519, 0.0657225365574, 0.110274033891, 0.109100079816, 0.0715028124597, 0.116002698888, 0.0796512495891, 0.0903573013613, 0.0795935647756, 0.0897544453853, 0.106689246401, 0.0838272650296, 0.0804403048264, 0.104149026249, 0.106689246401, 0.0762066045724, 0.0973751058425, ...], [0.0920064369969, 0.0767766409783, 0.088577875781, 0.103323636964, 0.0708319990492, 0.102821897842, 0.0771736055954, 0.0911898776385, 0.115921754556, 0.113968901938, 0.0674073726602, 0.10502283105, 0.0896118721461, 0.0816210045662, 0.087899543379, 0.0810502283105, 0.0958904109589, 0.074200913242, 0.0816210045662, 0.101598173516, ...], [0.113920327585, 0.104225241968, 0.121645767516, 0.072213302433, 0.130757972587, 0.089292624932, 0.0656439314145, 0.101428447263, 0.0763251337636, 0.0631624875531, 0.0613847629852, 0.0898410504492, 0.109882515549, 0.112646855563, 0.0822391154112, 0.108500345543, 0.0746371803732, 0.0711817553559, 0.0981340704907, 0.0988251554941, ...], [0.10219007675, 0.0723602246041, 0.0961970951707, 0.116814267704, 0.148062298638, 0.0827725938389, 0.125494140016, 0.0514917907593, 0.0651889033509, 0.05285931811, 0.0865692910589, 0.0786445012788, 0.0620204603581, 0.0895140664962, 0.105498721228, 0.128516624041, 0.0773657289003, 0.106138107417, 0.0818414322251, 0.0914322250639, ...], [0.080177548404, 0.0931795868809, 0.0999111444437, 0.118483345352, 0.091940873078, 0.083312789863, 0.0522401079175, 0.106566192871, 0.0585826455211, 0.107876963363, 0.107728802306, 0.0743801652893, 0.0914256198347, 0.107438016529, 0.114669421488, 0.0847107438017, 0.0888429752066, 0.0712809917355, 0.0976239669421, 0.0686983471074, ...], [0.113891144206, 0.0866807008977, 0.0548496521406, 0.0900785333581, 0.0950678237314, 0.0814450026369, 0.0838627610583, 0.098519913327, 0.140039732117, 0.0894909660789, 0.0660737704478, 0.117040630685, 0.0824742268041, 0.0651910248636, 0.0779260157671, 0.0921770770164, 0.089144936325, 
0.0876288659794, 0.0885385081868, 0.11522134627, ...], [0.0381077645694, 0.0974157216961, 0.0950881655299, 0.092574894648, 0.130234317646, 0.108731554031, 0.105156453529, 0.0993667783061, 0.0648689912665, 0.106401315243, 0.0620540435341, 0.076511861009, 0.0935516204477, 0.0898763782158, 0.0968927497494, 0.11693952556, 0.0992315402606, 0.0928833945874, 0.0838623454728, 0.0825258937521, ...], [0.0805225473236, 0.0825788214013, 0.0774580674909, 0.0505955135195, 0.0852763226211, 0.093171130758, 0.14737804487, 0.0900577192716, 0.0697143905999, 0.108670714035, 0.114576728109, 0.0880964866282, 0.0980597797588, 0.0880964866282, 0.0608285264814, 0.0844257996854, 0.102254850551, 0.126900891453, 0.0859989512323, 0.0676455165181, ...], [0.0545096490945, 0.0854380096443, 0.0590723053012, 0.11268847762, 0.11237593458, 0.148140575669, 0.0777969797628, 0.0578052877306, 0.114498669096, 0.0769023412346, 0.100771770267, 0.100886162236, 0.0804362644853, 0.0688479890934, 0.091342876619, 0.0940695296524, 0.12406271302, 0.105657805044, 0.0743012951602, 0.0879345603272, ...], [0.0793137451288, 0.0971837591699, 0.112239651189, 0.0960395146984, 0.107311937923, 0.12176252675, 0.0855251726555, 0.0610570054797, 0.0502425432012, 0.0688468299037, 0.120477313901, 0.0880999342538, 0.0878807801885, 0.104098181021, 0.0973044049967, 0.10650887574, 0.115713346483, 0.0913872452334, 0.0721016874863, 0.0600482138944, ...], [0.098928448476, 0.137305549245, 0.0753791252916, 0.0990566805978, 0.0822124178289, 0.0910533865423, 0.0892351234243, 0.1018692375, 0.107928239779, 0.0444253340597, 0.0726064572547, 0.108493310064, 0.112274578243, 0.086678301338, 0.0933682373473, 0.0796974985457, 0.087550901687, 0.0767888307155, 0.0983129726585, 0.108493310064, ...], [0.123070473061, 0.0578929142681, 0.0464503220138, 0.201737887342, 0.0855492777249, 0.0536180031335, 0.111344533931, 0.10607667333, 0.0519348097508, 0.0775904494795, 0.084734655966, 0.118683901293, 0.0708969839405, 0.0642381511947, 0.142969056013, 
0.087348217783, 0.0708969839405, 0.104582843713, 0.0967489228359, 0.0771641206424, ...], [0.0908112608184, 0.086731479801, 0.106727761089, 0.116224958469, 0.0746198778934, 0.100064341346, 0.0903571333521, 0.0540778404843, 0.0883506852995, 0.0824550664772, 0.109579594971, 0.0942684766214, 0.10407239819, 0.0942684766214, 0.0972850678733, 0.077677224736, 0.0852187028658, 0.0980392156863, 0.0761689291101, 0.0897435897436, ...], [0.0946723497621, 0.102614036567, 0.104068459189, 0.104233398688, 0.0692101472815, 0.10136195871, 0.0758868780611, 0.100250906356, 0.0760499812908, 0.0793837453837, 0.092268138711, 0.0808823529412, 0.104411764706, 0.105882352941, 0.107352941176, 0.0801470588235, 0.0911764705882, 0.0727941176471, 0.0948529411765, 0.0860294117647, ...], ...]
y_train = ['0', '0', '0', '0', '0', '0', '0', '0', '0', '0', '0', '0', '0', '0', '0', '0', '0', '0', '0', '0', ...]
fit_params = {}
1676
1677 except Exception as e:
1678 if error_score == 'raise':
1679 raise

...........................................................................
/home/sv378/anaconda2/lib/python2.7/site-packages/sklearn/ensemble/forest.py in fit(self=ExtraTreesClassifier(bootstrap=True, class_weigh..., random_state=None, verbose=0, warm_start=False), X=array([[ 0.10877115, 0.1056142 , 0.09664948, .... 0.26129046, 0.23792571]], dtype=float32), y=array([[ 0.],
[ 0.],
[ 0.],
...,
[ 4.],
[ 4.],
[ 4.]]), sample_weight=None)
323 trees = Parallel(n_jobs=self.n_jobs, verbose=self.verbose,
324 backend="threading")(
325 delayed(parallel_build_trees)(
326 t, self, X, y, sample_weight, i, len(trees),
327 verbose=self.verbose, class_weight=self.class_weight)
--> 328 for i, t in enumerate(trees))
i = 99
329
330 # Collect newly grown trees
331 self.estimators
.extend(trees)
332

...........................................................................
/home/sv378/anaconda2/lib/python2.7/site-packages/sklearn/externals/joblib/parallel.py in call(self=Parallel(n_jobs=1), iterable=<generator object >)
774 self.n_completed_tasks = 0
775 try:
776 # Only set self._iterating to True if at least a batch
777 # was dispatched. In particular this covers the edge
778 # case of Parallel used with an exhausted iterator.
--> 779 while self.dispatch_one_batch(iterator):
self.dispatch_one_batch = <bound method Parallel.dispatch_one_batch of Parallel(n_jobs=1)>
iterator = <generator object >
780 self._iterating = True
781 else:
782 self._iterating = False
783

...........................................................................
/home/sv378/anaconda2/lib/python2.7/site-packages/sklearn/externals/joblib/parallel.py in dispatch_one_batch(self=Parallel(n_jobs=1), iterator=<generator object >)
620 tasks = BatchedCalls(itertools.islice(iterator, batch_size))
621 if len(tasks) == 0:
622 # No more tasks available in the iterator: tell caller to stop.
623 return False
624 else:
--> 625 self._dispatch(tasks)
self._dispatch = <bound method Parallel._dispatch of Parallel(n_jobs=1)>
tasks = <sklearn.externals.joblib.parallel.BatchedCalls object>
626 return True
627
628 def _print(self, msg, msg_args):
629 """Display the message on stout or stderr depending on verbosity"""

...........................................................................
/home/sv378/anaconda2/lib/python2.7/site-packages/sklearn/externals/joblib/parallel.py in _dispatch(self=Parallel(n_jobs=1), batch=<sklearn.externals.joblib.parallel.BatchedCalls object>)
583 self.n_dispatched_tasks += len(batch)
584 self.n_dispatched_batches += 1
585
586 dispatch_timestamp = time.time()
587 cb = BatchCompletionCallBack(dispatch_timestamp, len(batch), self)
--> 588 job = self._backend.apply_async(batch, callback=cb)
job = undefined
self._backend.apply_async = <bound method SequentialBackend.apply_async of <...lib._parallel_backends.SequentialBackend object>>
batch = <sklearn.externals.joblib.parallel.BatchedCalls object>
cb = <sklearn.externals.joblib.parallel.BatchCompletionCallBack object>
589 self._jobs.append(job)
590
591 def dispatch_next(self):
592 """Dispatch more data for parallel processing

...........................................................................
/home/sv378/anaconda2/lib/python2.7/site-packages/sklearn/externals/joblib/_parallel_backends.py in apply_async(self=<sklearn.externals.joblib._parallel_backends.SequentialBackend object>, func=<sklearn.externals.joblib.parallel.BatchedCalls object>, callback=<sklearn.externals.joblib.parallel.BatchCompletionCallBack object>)
106 raise ValueError('n_jobs == 0 in Parallel has no meaning')
107 return 1
108
109 def apply_async(self, func, callback=None):
110 """Schedule a func to be run"""
--> 111 result = ImmediateResult(func)
result = undefined
func = <sklearn.externals.joblib.parallel.BatchedCalls object>
112 if callback:
113 callback(result)
114 return result
115

...........................................................................
/home/sv378/anaconda2/lib/python2.7/site-packages/sklearn/externals/joblib/_parallel_backends.py in init(self=<sklearn.externals.joblib._parallel_backends.ImmediateResult object>, batch=<sklearn.externals.joblib.parallel.BatchedCalls object>)
327
328 class ImmediateResult(object):
329 def init(self, batch):
330 # Don't delay the application, to avoid keeping the input
331 # arguments in memory
--> 332 self.results = batch()
self.results = undefined
batch = <sklearn.externals.joblib.parallel.BatchedCalls object>
333
334 def get(self):
335 return self.results
336

...........................................................................
/home/sv378/anaconda2/lib/python2.7/site-packages/sklearn/externals/joblib/parallel.py in call(self=<sklearn.externals.joblib.parallel.BatchedCalls object>)
126 def init(self, iterator_slice):
127 self.items = list(iterator_slice)
128 self._size = len(self.items)
129
130 def call(self):
--> 131 return [func(*args, **kwargs) for func, args, kwargs in self.items]
func =
args = (ExtraTreeClassifier(class_weight=None, criterion...dom_state=820678124,
splitter='random'), ExtraTreesClassifier(bootstrap=True, class_weigh..., random_state=None, verbose=0, warm_start=False), array([[ 0.10877115, 0.1056142 , 0.09664948, .... 0.26129046, 0.23792571]], dtype=float32), array([[ 0.],
[ 0.],
[ 0.],
...,
[ 4.],
[ 4.],
[ 4.]]), None, 0, 100)
kwargs = {'class_weight': None, 'verbose': 0}
self.items = [(, (ExtraTreeClassifier(class_weight=None, criterion...dom_state=820678124,
splitter='random'), ExtraTreesClassifier(bootstrap=True, class_weigh..., random_state=None, verbose=0, warm_start=False), array([[ 0.10877115, 0.1056142 , 0.09664948, .... 0.26129046, 0.23792571]], dtype=float32), array([[ 0.],
[ 0.],
[ 0.],
...,
[ 4.],
[ 4.],
[ 4.]]), None, 0, 100), {'class_weight': None, 'verbose': 0})]
132
133 def len(self):
134 return self._size
135

...........................................................................
/home/sv378/anaconda2/lib/python2.7/site-packages/sklearn/ensemble/forest.py in _parallel_build_trees(tree=ExtraTreeClassifier(class_weight=None, criterion...dom_state=820678124,
splitter='random'), forest=ExtraTreesClassifier(bootstrap=True, class_weigh..., random_state=None, verbose=0, warm_start=False), X=array([[ 0.10877115, 0.1056142 , 0.09664948, .... 0.26129046, 0.23792571]], dtype=float32), y=array([[ 0.],
[ 0.],
[ 0.],
...,
[ 4.],
[ 4.],
[ 4.]]), sample_weight=None, tree_idx=0, n_trees=100, verbose=0, class_weight=None)
116 warnings.simplefilter('ignore', DeprecationWarning)
117 curr_sample_weight *= compute_sample_weight('auto', y, indices)
118 elif class_weight == 'balanced_subsample':
119 curr_sample_weight *= compute_sample_weight('balanced', y, indices)
120
--> 121 tree.fit(X, y, sample_weight=curr_sample_weight, check_input=False)
tree.fit = <bound method ExtraTreeClassifier.fit of ExtraTr...om_state=820678124,
splitter='random')>
X = array([[ 0.10877115, 0.1056142 , 0.09664948, .... 0.26129046, 0.23792571]], dtype=float32)
y = array([[ 0.],
[ 0.],
[ 0.],
...,
[ 4.],
[ 4.],
[ 4.]])
sample_weight = None
curr_sample_weight = array([ 2., 0., 2., ..., 1., 1., 2.])
122 else:
123 tree.fit(X, y, sample_weight=sample_weight, check_input=False)
124
125 return tree

...........................................................................
/home/sv378/anaconda2/lib/python2.7/site-packages/sklearn/tree/tree.py in fit(self=ExtraTreeClassifier(class_weight=None, criterion...dom_state=820678124,
splitter='random'), X=array([[ 0.10877115, 0.1056142 , 0.09664948, .... 0.26129046, 0.23792571]], dtype=float32), y=array([[ 0.],
[ 0.],
[ 0.],
...,
[ 4.],
[ 4.],
[ 4.]]), sample_weight=array([ 2., 0., 2., ..., 1., 1., 2.]), check_input=False, X_idx_sorted=None)
785
786 super(DecisionTreeClassifier, self).fit(
787 X, y,
788 sample_weight=sample_weight,
789 check_input=check_input,
--> 790 X_idx_sorted=X_idx_sorted)
X_idx_sorted = None
791 return self
792
793 def predict_proba(self, X, check_input=True):
794 """Predict class probabilities of the input samples X.

...........................................................................
/home/sv378/anaconda2/lib/python2.7/site-packages/sklearn/tree/tree.py in fit(self=ExtraTreeClassifier(class_weight=None, criterion...dom_state=820678124,
splitter='random'), X=array([[ 0.10877115, 0.1056142 , 0.09664948, .... 0.26129046, 0.23792571]], dtype=float32), y=array([[ 0.],
[ 0.],
[ 0.],
...,
[ 4.],
[ 4.],
[ 4.]]), sample_weight=array([ 2., 0., 2., ..., 1., 1., 2.]), check_input=False, X_idx_sorted=None)
189 if isinstance(self.min_samples_split, (numbers.Integral, np.integer)):
190 if not 2 <= self.min_samples_split:
191 raise ValueError("min_samples_split must be an integer "
192 "greater than 1 or a float in (0.0, 1.0]; "
193 "got the integer %s"
--> 194 % self.min_samples_split)
self.min_samples_split = 1
195 min_samples_split = self.min_samples_split
196 else: # float
197 if not 0. < self.min_samples_split <= 1.:
198 raise ValueError("min_samples_split must be an integer "

ValueError: min_samples_split must be an integer greater than 1 or a float in (0.0, 1.0]; got the integer 1
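If I'm reading the last frame correctly, this looks like a scikit-learn version incompatibility rather than a problem with my data: the hyperparameter grid being searched includes `'min_samples_split': 1`, which older scikit-learn accepted but versions from 0.18 onward reject (the smallest legal integer is now 2). A minimal sketch of the check, assuming only a stock recent scikit-learn install (the toy X/y arrays below are made up and unrelated to the shIC feature vectors):

```python
# Sketch: scikit-learn >= 0.18 rejects min_samples_split=1 at fit time.
# Assumptions: recent scikit-learn installed; toy data is illustrative only.
from sklearn.ensemble import ExtraTreesClassifier

X = [[0.1, 0.2], [0.3, 0.4], [0.5, 0.6], [0.7, 0.8]]
y = [0, 0, 1, 1]

# min_samples_split=1 was legal in old releases but now raises a ValueError
try:
    ExtraTreesClassifier(n_estimators=10, min_samples_split=1).fit(X, y)
    raised = False
except ValueError:
    raised = True
print("min_samples_split=1 rejected:", raised)

# min_samples_split=2 (the current minimum, and the default) trains fine
clf = ExtraTreesClassifier(n_estimators=10, min_samples_split=2).fit(X, y)
print("trained", len(clf.estimators_), "trees")
```

If that's the cause here, then either changing the `min_samples_split` value in the grid in trainClassifier.py from 1 to 2, or pinning scikit-learn to the older release the pipeline was written against, should get past this error — but I may be misreading it, so any guidance would be appreciated.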

