guillermoblasco / gmx

License: Other
Use the Scala style from apache/spark:
https://github.com/apache/spark/blob/master/scalastyle-config.xml
Vertex: initial potential and updated potential
Edge: deltas in both directions
Message aggregation: d1 * d2
Message generation: updated potential marginalized to the sepset, divided by the incoming delta, then normalized
Message processing: updated potential := initial potential * deltas
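The vertex/edge scheme above can be sketched in Python for brevity (the project itself is Scala on GraphX); the dict-based factor representation and all helper names here are illustrative assumptions, not the gmx API:

```python
from itertools import product

def multiply(f, g):
    """Pointwise product of two discrete factors over their merged scopes."""
    scope = f["vars"] + [v for v in g["vars"] if v not in f["vars"]]
    cards = {**f["cards"], **g["cards"]}
    values = {}
    for assignment in product(*(range(cards[v]) for v in scope)):
        a = dict(zip(scope, assignment))
        fv = f["values"][tuple(a[v] for v in f["vars"])]
        gv = g["values"][tuple(a[v] for v in g["vars"])]
        values[assignment] = fv * gv
    return {"vars": scope, "cards": cards, "values": values}

def marginalize(f, keep):
    """Sum out every variable not in `keep` (here, the sepset)."""
    values = {}
    for assignment, v in f["values"].items():
        key = tuple(x for x, var in zip(assignment, f["vars"]) if var in keep)
        values[key] = values.get(key, 0.0) + v
    return {"vars": [v for v in f["vars"] if v in keep],
            "cards": {v: f["cards"][v] for v in keep},
            "values": values}

def message(updated, sepset, incoming):
    """Message generation: marginalize the updated potential to the sepset,
    divide by the incoming delta, and normalize."""
    marg = marginalize(updated, sepset)
    values = {k: v / incoming["values"][k] for k, v in marg["values"].items()}
    total = sum(values.values())
    return {**marg, "values": {k: v / total for k, v in values.items()}}
```

Message processing is then just `multiply(initial_potential, delta)` applied over all incoming deltas.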
Schedule messages when the graph structure allows:
We can schedule messages in a way that accounts for their potential usefulness; for example, we can pass a message between clusters where the beliefs disagree most strongly on the sepset. This approach, called residual belief propagation, is convenient, since it is fully general and does not require a deep understanding of the properties of the network.
- Probabilistic Graphical Models, Box 11.B
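A minimal sketch of that residual scheduling idea, assuming messages are plain lists of floats and edges are recomputed by a `compute_message` callback; none of these names come from gmx:

```python
import heapq

def residual(new_msg, old_msg):
    """Disagreement between the candidate message and the stored one."""
    return max(abs(n - o) for n, o in zip(new_msg, old_msg))

def schedule(edges, messages, compute_message, threshold=1e-6):
    """Pop the edge whose sepset beliefs disagree most and send it first."""
    heap = [(-residual(compute_message(e), messages[e]), e) for e in edges]
    heapq.heapify(heap)
    while heap:
        neg_r, e = heapq.heappop(heap)
        if -neg_r < threshold:
            break
        messages[e] = compute_message(e)
        # A full implementation would re-score the edges affected by this
        # update and push them back onto the heap here.
    return messages
```

The heap orders pending messages by residual, so the largest disagreements are resolved first, which is what makes the schedule "account for usefulness".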
Tree Reparametrization (TRP) selects a set of trees, each of which spans a large number of clusters, and whose union covers all of the edges in the network. The TRP algorithm then iteratively selects a tree and does an upward-downward calibration of the tree, keeping all other messages fixed. Of course, calibrating this tree has the effect of "uncalibrating" other trees, and so this process repeats.
- Probabilistic Graphical Models, Box 11.B
A second observation is that nonconvergence is often due to oscillations in the beliefs. This observation suggests that we dampen the oscillations by reducing the difference between two subsequent updates. [...] We can replace this line by a damped (or smoothed) version that averages the update delta i --> j with the previous message between two cliques.
- Probabilistic Graphical Models, Box 11.B - Skill: Making loopy belief propagation work in practice
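The damped update from the quote can be sketched as follows; the damping weight `lam` is an assumed parameter (with `lam = 1.0` recovering plain belief propagation), not a gmx setting:

```python
def damp(new_msg, old_msg, lam=0.5):
    """Average the freshly computed message with the previous one to
    break the oscillations that cause nonconvergence."""
    return [lam * n + (1.0 - lam) * o for n, o in zip(new_msg, old_msg)]
```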
Same build:
$SPARK_HOME/bin/spark-submit --class "com.blackbox.gmx.example.Student" --master local[4] target/scala-2.10/gmx_2.10-1.0.jar
Cluster Graph built
Cluster with 8 clusters
error: 0.0654881172839506 at 1412700248917
error: 0.08341474233689133 at 1412700249673
error: 0.07509828700926946 at 1412700250405
error: 0.04373545719848248 at 1412700251057
error: 0.06800458034992825 at 1412700251701
error: 0.05593939274381361 at 1412700252295
error: 0.02856050623243219 at 1412700252730
error: 0.10687774530501093 at 1412700253164
error: 0.01726229748329744 at 1412700253578
error: 0.009527547159413133 at 1412700253898
error: 0.01543337151929534 at 1412700254288
error: 0.01295290471366622 at 1412700254613
error: 0.004791139421929123 at 1412700254942
error: 0.013548817400343896 at 1412700255300
error: 0.0033719088173315157 at 1412700255672
error: 0.00160183920248516 at 1412700256043
error: 0.0030624380147187703 at 1412700256503
error: 0.0024814727596545387 at 1412700256912
error: 7.25498646743708E-4 at 1412700257311
error: 0.0017878726548758066 at 1412700257779
error: 6.002763397696782E-4 at 1412700258166
error: 2.668207097736408E-4 at 1412700258569
error: 5.450913105309635E-4 at 1412700258972
error: 4.360996485491701E-4 at 1412700259372
error: 1.1588967799769118E-4 at 1412700259837
error: 2.680201310171418E-4 at 1412700260290
error: 1.0197383024739067E-4 at 1412700260744
error: 4.4090801951427004E-5 at 1412700261210
error: 9.248972086044061E-5 at 1412700261749
error: 7.367926909925668E-5 at 1412700262295
error: 1.88277013243677E-5 at 1412700262825
error: 4.246672212695515E-5 at 1412700263396
error: 1.6966558968674908E-5 at 1412700263957
error: 7.244358197173231E-6 at 1412700264565
Calibrated
error: 0.08788149145538879 at 1412700265878
error: 0.15997087527286608 at 1412700266068
error: 0.03452498478996678 at 1412700266276
error: 0.031508382715638814 at 1412700266481
error: 0.13015097389951538 at 1412700266684
error: 0.08289713261795872 at 1412700266914
error: 0.03943056472399397 at 1412700267123
error: 0.08586561341656541 at 1412700267377
error: 0.01791185360445833 at 1412700267593
error: 0.018261565839008037 at 1412700267819
error: 0.05140561893189333 at 1412700268087
error: 0.015751244254026783 at 1412700268325
error: 0.0024108536283455953 at 1412700268581
error: 0.00869843094817362 at 1412700268884
error: 0.0023468709244813153 at 1412700269164
error: 8.455094195599025E-5 at 1412700269443
error: 8.455094195599009E-5 at 1412700269750
error: 2.407412430484045E-34 at 1412700270032
Map calibrated
marginals
ArrayFactor [AbstractArrayFactor with scope {INTELLIGENCE (card=2)} and values {0.734806629834254,0.2651933701657459}]
ArrayFactor [AbstractArrayFactor with scope {LETTER (card=2),GRADE (card=3)} and values {0.09263094874226793,0.035973441454726696,0.013597429051248602,0.8336785386804113,0.023982294303151134,1.373477681944303E-4}]
ArrayFactor [AbstractArrayFactor with scope {SAT (card=2)} and values {0.5,0.5}]
ArrayFactor [AbstractArrayFactor with scope {DIFFICULTY (card=2)} and values {0.845179020095127,0.15482097990487298}]
ArrayFactor [AbstractArrayFactor with scope {GRADE (card=3)} and values {0.9263094874226792,0.05995573575787783,0.013734776819443031}]
ArrayFactor [AbstractArrayFactor with scope {GRADE (card=3),DIFFICULTY (card=2),INTELLIGENCE (card=2)} and values {0.39865800117270195,0.002255059827917999,0.006395331268974536,0.057694865905484616,0.034626521958369336,3.70219866551525E-4,0.39865800117270195,0.002255059827917999,0.006395331268974536,0.057694865905484616,0.034626521958369336,3.70219866551525E-4}]
ArrayFactor [AbstractArrayFactor with scope {SAT (card=2),INTELLIGENCE (card=2)} and values {0.3674033149171271,0.13259668508287292,0.3674033149171271,0.13259668508287292}]
ArrayFactor [AbstractArrayFactor with scope {LETTER (card=2)} and values {0.142865151373684,0.8571348486263161}]
maps
ArrayFactor [AbstractArrayFactor with scope {LETTER (card=2)} and values {0.1,0.9}]
ArrayFactor [AbstractArrayFactor with scope {LETTER (card=2),GRADE (card=3)} and values {0.09774214090968877,0.01228305692524889,0.0020857610674204835,0.879679268187199,0.008188704616832593,2.1068293610307917E-5}]
ArrayFactor [AbstractArrayFactor with scope {GRADE (card=3)} and values {0.9830811897654246,0.014587878330324043,0.0023309319042514754}]
ArrayFactor [AbstractArrayFactor with scope {DIFFICULTY (card=2)} and values {0.8735740036545014,0.1264259963454986}]
ArrayFactor [AbstractArrayFactor with scope {GRADE (card=3),DIFFICULTY (card=2),INTELLIGENCE (card=2)} and values {0.4307738616467811,3.4819948613079154E-4,0.0011235236752486866,0.06234276023606536,0.005346615199798237,6.503975597572816E-5,0.4307738616467811,3.4819948613079154E-4,0.0011235236752486866,0.06234276023606536,0.005346615199798237,6.503975597572816E-5}]
ArrayFactor [AbstractArrayFactor with scope {INTELLIGENCE (card=2)} and values {0.734806629834254,0.2651933701657459}]
ArrayFactor [AbstractArrayFactor with scope {SAT (card=2)} and values {0.5,0.5}]
ArrayFactor [AbstractArrayFactor with scope {SAT (card=2),INTELLIGENCE (card=2)} and values {0.3674033149171271,0.13259668508287292,0.3674033149171271,0.13259668508287292}]
$SPARK_HOME/bin/spark-submit --class "com.blackbox.gmx.example.Student" --master local[4] target/scala-2.10/gmx_2.10-1.0.jar
Cluster Graph built
Cluster with 8 clusters
error: 0.0613490866717057 at 1412700442030
error: 0.03034749482764568 at 1412700442854
error: 0.01679456833820894 at 1412700443599
error: 0.04103024342460169 at 1412700444242
error: 0.02697099805839067 at 1412700444874
error: 0.014550803216732154 at 1412700445467
error: 0.0350738165469253 at 1412700445931
error: 0.02338162548708511 at 1412700446391
error: 0.012297222121285718 at 1412700446791
error: 0.02914123208605337 at 1412700447123
error: 0.019729279843793468 at 1412700447482
error: 0.010128862471221314 at 1412700447807
error: 0.023502899948822917 at 1412700448148
error: 0.01617730597520799 at 1412700448507
error: 0.00812940446284152 at 1412700448883
error: 0.01839076122016555 at 1412700449248
error: 0.012879421157519572 at 1412700449726
error: 0.006360901288843781 at 1412700450124
error: 0.01396655222350034 at 1412700450528
error: 0.009956514315467974 at 1412700451010
error: 0.004857991090787082 at 1412700451410
error: 0.010306263988590842 at 1412700451788
error: 0.007480908166309211 at 1412700452181
error: 0.00362782329142685 at 1412700452618
error: 0.007403998994120884 at 1412700453034
error: 0.00547262558221468 at 1412700453460
error: 0.002654915907621452 at 1412700453908
error: 0.005191112934769255 at 1412700454394
error: 0.00390709870927531 at 1412700454933
error: 0.0019087456135176125 at 1412700455471
error: 0.003562205705455709 at 1412700456037
error: 0.0027297617853948857 at 1412700456627
error: 0.001351618067410742 at 1412700457237
error: 0.0023996751967033285 at 1412700457894
error: 0.0018718999396941825 at 1412700458567
error: 9.450612333670903E-4 at 1412700459218
error: 0.001591749550692266 at 1412700459887
error: 0.001263596309067183 at 1412700460575
error: 6.540120516298157E-4 at 1412700461342
error: 0.0010426735254852127 at 1412700462111
error: 8.42032392455015E-4 at 1412700462969
error: 4.4890014175408407E-4 at 1412700463820
error: 6.763131263132559E-4 at 1412700464815
error: 5.553653937040373E-4 at 1412700465783
error: 3.061637696053671E-4 at 1412700466761
error: 4.3545039606555187E-4 at 1412700467800
error: 3.6339537355794654E-4 at 1412700468915
error: 2.078164451033659E-4 at 1412700470054
error: 2.789129759484473E-4 at 1412700471275
error: 2.3638994804346805E-4 at 1412700472529
error: 1.405707250516063E-4 at 1412700473951
error: 1.7806044538865907E-4 at 1412700475296
error: 1.5314542944478354E-4 at 1412700476670
error: 9.485494206989771E-5 at 1412700478155
error: 1.1348826418573315E-4 at 1412700479748
error: 9.896121689948542E-5 at 1412700481285
error: 6.39062077746496E-5 at 1412700482919
error: 7.23151181865718E-5 at 1412700484597
error: 6.386545019112331E-5 at 1412700486713
error: 4.3016168418758604E-5 at 1412700488571
error: 4.6122607517393423E-5 at 1412700490476
error: 4.120643424280222E-5 at 1412700492652
error: 2.8943163706122562E-5 at 1412700494700
error: 2.947320718562405E-5 at 1412700497232
error: 2.6603284406378926E-5 at 1412700499377
error: 1.9473771977773015E-5 at 1412700501682
error: 1.8884687336247628E-5 at 1412700504167
error: 1.7197879064427714E-5 at 1412700506893
error: 1.3105778985360542E-5 at 1412700509882
error: 1.2140205738990239E-5 at 1412700512555
error: 1.113824629741423E-5 at 1412700515861
error: 8.82402865317684E-6 at 1412700518945
Calibrated
error: 0.09993894444203227 at 1412700526040
error: 0.05781491295981514 at 1412700526405
error: 0.016629558115686276 at 1412700526609
error: 0.04371193761895504 at 1412700526780
error: 0.041499210240868334 at 1412700526975
error: 0.008174255807986914 at 1412700527167
error: 0.02967628296739122 at 1412700527381
error: 0.034465123203038524 at 1412700527574
error: 0.004428036568707985 at 1412700527770
error: 0.020573569702940053 at 1412700527988
error: 0.0250075136361148 at 1412700528220
error: 0.002670770841552942 at 1412700528464
error: 0.013599545654512563 at 1412700528706
error: 0.016117789742760374 at 1412700528949
error: 0.0016294745899280028 at 1412700529221
error: 0.008314636704889692 at 1412700529592
error: 0.009460614347695491 at 1412700529960
error: 9.484376522221207E-4 at 1412700530327
error: 0.004721260997133544 at 1412700530647
error: 0.0051802720061431915 at 1412700530965
error: 5.214651210163904E-4 at 1412700531289
error: 0.0025284629496722783 at 1412700531846
error: 0.0026984067494074393 at 1412700532273
error: 2.73234097262464E-4 at 1412700532667
error: 0.0012970977603062035 at 1412700533119
error: 0.0013570181601332025 at 1412700533546
error: 1.3807999259979217E-4 at 1412700534051
error: 6.453900523249521E-4 at 1412700534626
error: 6.65914682319315E-4 at 1412700535387
error: 6.799529394909593E-5 at 1412700535962
error: 3.143514976762522E-4 at 1412700536491
error: 3.212804424970773E-4 at 1412700537037
error: 3.288219845624559E-5 at 1412700537629
error: 1.5087340549206887E-4 at 1412700538222
error: 1.532052878931821E-4 at 1412700538848
error: 1.570391833495863E-5 at 1412700539497
error: 7.16828094367051E-5 at 1412700540139
error: 7.247290989386099E-5 at 1412700541152
error: 7.435764918620324E-6 at 1412700542227
Map calibrated
marginals
ArrayFactor [AbstractArrayFactor with scope {LETTER (card=2)} and values {0.989946905093049,0.010053094906951091}]
ArrayFactor [AbstractArrayFactor with scope {INTELLIGENCE (card=2)} and values {0.734806629834254,0.2651933701657459}]
ArrayFactor [AbstractArrayFactor with scope {SAT (card=2)} and values {0.5,0.5}]
ArrayFactor [AbstractArrayFactor with scope {DIFFICULTY (card=2)} and values {0.9452766346827345,0.05472336531726544}]
ArrayFactor [AbstractArrayFactor with scope {LETTER (card=2),GRADE (card=3)} and values {7.749501127436174E-7,6.171488165746366E-5,0.9898904984391491,6.974551014692557E-6,4.114325443830911E-5,0.009998893923627768}]
ArrayFactor [AbstractArrayFactor with scope {GRADE (card=3)} and values {7.749501127436173E-6,1.0285813609577276E-4,0.9998893923627767}]
ArrayFactor [AbstractArrayFactor with scope {SAT (card=2),INTELLIGENCE (card=2)} and values {0.3674033149171271,0.13259668508287292,0.3674033149171271,0.13259668508287292}]
ArrayFactor [AbstractArrayFactor with scope {GRADE (card=3),DIFFICULTY (card=2),INTELLIGENCE (card=2)} and values {1.498991587743437E-4,2.5272176106920663E-5,0.059380930077625314,4.2120293511534443E-5,0.07917457343683375,0.3612272048571481,1.498991587743437E-4,2.5272176106920663E-5,0.059380930077625314,4.2120293511534443E-5,0.07917457343683375,0.3612272048571481}]
maps
ArrayFactor [AbstractArrayFactor with scope {GRADE (card=3),DIFFICULTY (card=2),INTELLIGENCE (card=2)} and values {0.0011978685983252944,2.375353377851629E-7,2.318812171319791E-4,3.958922296419382E-7,3.0917495617597217E-4,0.4982604418007993,0.0011978685983252944,2.375353377851629E-7,2.318812171319791E-4,3.958922296419382E-7,3.0917495617597217E-4,0.4982604418007993}]
ArrayFactor [AbstractArrayFactor with scope {SAT (card=2)} and values {0.5,0.5}]
ArrayFactor [AbstractArrayFactor with scope {GRADE (card=3)} and values {4.735463212997269E-5,1.2502294700068801E-9,0.9999526441176406}]
ArrayFactor [AbstractArrayFactor with scope {LETTER (card=2),GRADE (card=3)} and values {4.7354632129972735E-6,7.501376820041283E-10,0.9899531176764642,4.261916891697546E-5,5.000917880027522E-10,0.009999526441176406}]
ArrayFactor [AbstractArrayFactor with scope {INTELLIGENCE (card=2)} and values {0.734806629834254,0.2651933701657459}]
ArrayFactor [AbstractArrayFactor with scope {SAT (card=2),INTELLIGENCE (card=2)} and values {0.3674033149171271,0.13259668508287292,0.3674033149171271,0.13259668508287292}]
ArrayFactor [AbstractArrayFactor with scope {DIFFICULTY (card=2)} and values {0.9452786832683161,0.054721316731683865}]
ArrayFactor [AbstractArrayFactor with scope {LETTER (card=2)} and values {0.99,0.01}]
Each line above shows the calibration error and the system time in milliseconds.
Guillermos-MacBook-Air:gmX guillermoblascojimenez$ $SPARK_HOME/bin/spark-submit --class "com.blackbox.gmx.example.Student" --master local[4] target/scala-2.10/gmx_2.10-1.0.jar
14/10/06 11:30:00 WARN SparkConf:
SPARK_JAVA_OPTS was detected (set to '
-Dspark.serializer=org.apache.spark.serializer.KryoSerializer
-Dspark.kryo.registrator=org.apache.spark.graphx.GraphKryoRegistrator ').
This is deprecated in Spark 1.0+.
Please instead use:
14/10/06 11:30:00 WARN SparkConf: Setting 'spark.executor.extraJavaOptions' to '
-Dspark.serializer=org.apache.spark.serializer.KryoSerializer
-Dspark.kryo.registrator=org.apache.spark.graphx.GraphKryoRegistrator ' as a work-around.
14/10/06 11:30:00 WARN SparkConf: Setting 'spark.driver.extraJavaOptions' to '
-Dspark.serializer=org.apache.spark.serializer.KryoSerializer
-Dspark.kryo.registrator=org.apache.spark.graphx.GraphKryoRegistrator ' as a work-around.
Cluster Graph built
Cluster with 8 clusters
error: 0.12519177621599598 at 1412587808098
error: 0.0372313362941709 at 1412587809062
error: 0.06300571554158453 at 1412587809978
error: 0.00161212143632129 at 1412587810791
error: 0.030923425188297375 at 1412587811548
error: 0.054924511754993896 at 1412587812446
error: 0.00179216227496699 at 1412587813092
error: 0.026453902880653097 at 1412587813676
error: 0.04633838053832956 at 1412587814227
error: 0.0019397498390229576 at 1412587814756
error: 0.021979344335079948 at 1412587815193
error: 0.037895937786007076 at 1412587815570
error: 0.0020447103515861204 at 1412587815960
error: 0.017800920542429398 at 1412587816370
error: 0.030117038195337953 at 1412587816803
error: 0.0021008989559054185 at 1412587817336
error: 0.014122736707891216 at 1412587817809
error: 0.023341273358217927 at 1412587818336
error: 0.0021067342348015045 at 1412587818805
error: 0.011044122733230993 at 1412587819282
error: 0.017722184448606144 at 1412587819783
error: 0.002064930329880744 at 1412587820272
error: 0.008574193904316924 at 1412587820734
error: 0.013257235603442087 at 1412587821255
error: 0.001981613649191978 at 1412587821748
error: 0.006658818172795672 at 1412587822271
error: 0.009836377066600989 at 1412587822832
error: 0.0018651133405955605 at 1412587823427
error: 0.005209808956860952 at 1412587824059
error: 0.007292594390134931 at 1412587824722
error: 0.001724712954617012 at 1412587825355
error: 0.004129219619470362 at 1412587826043
error: 0.00544330835573178 at 1412587826734
error: 0.0015695776468153963 at 1412587827453
error: 0.0033256497207461593 at 1412587828197
error: 0.004118071455019422 at 1412587828965
error: 0.001407971738427338 at 1412587829776
error: 0.0027226952114126094 at 1412587830663
error: 0.003173069003239549 at 1412587831625
error: 0.001246791472562367 at 1412587832493
error: 0.002261411859234501 at 1412587833425
error: 0.002495589621008917 at 1412587834445
error: 0.0010913759412709664 at 1412587835472
error: 0.0018990935164854544 at 1412587836638
error: 0.0020022487072031683 at 1412587837774
error: 9.455283197521791E-4 at 1412587838964
error: 0.0016063215660626236 at 1412587840208
error: 0.0016341730447027739 at 1412587841567
error: 8.116733361830232E-4 at 1412587842921
error: 0.0013636151992905824 at 1412587844582
error: 0.0013513514689513575 at 1412587846075
error: 6.9108589586185E-4 at 1412587847592
error: 0.0011584153631472333 at 1412587849197
error: 0.0011274015433960385 at 1412587850789
error: 5.841414925669533E-4 at 1412587852685
error: 9.826983437212836E-4 at 1412587854452
error: 9.452949549260945E-4 at 1412587856312
error: 4.905556028834128E-4 at 1412587858777
error: 8.312518158165064E-4 at 1412587860722
error: 7.941479333414236E-4 at 1412587862791
error: 4.0959345835718075E-4 at 1412587865178
error: 7.005185711507701E-4 at 1412587867307
error: 6.66960775445E-4 at 1412587869639
error: 3.402422239565994E-4 at 1412587872412
error: 5.878721798916212E-4 at 1412587875131
error: 5.591084200205985E-4 at 1412587877876
error: 2.8134470297948023E-4 at 1412587880482
error: 4.911941301720584E-4 at 1412587883528
error: 4.673795723369543E-4 at 1412587886294
error: 2.316978488437011E-4 at 1412587889442
error: 4.08646498442017E-4 at 1412587892523
error: 3.8939242588711693E-4 at 1412587895810
error: 1.9012137346270506E-4 at 1412587898961
error: 3.3856260888576244E-4 at 1412587902726
error: 3.2325556409674224E-4 at 1412587906439
error: 1.5550232517912334E-4 at 1412587910030
error: 2.7940323852191685E-4 at 1412587913892
error: 2.673806875499431E-4 at 1412587917908
error: 1.2682124391892125E-4 at 1412587921811
error: 2.2974531031923326E-4 at 1412587926020
error: 2.203845929616536E-4 at 1412587931047
error: 1.0316480149530829E-4 at 1412587936686
error: 1.8828371244058144E-4 at 1412587943826
error: 1.810405714242484E-4 at 1412587949923
error: 8.372897270365223E-5 at 1412587955998
error: 1.5383589335611405E-4 at 1412587962615
error: 1.4825513370506739E-4 at 1412587970265
error: 6.78159206745734E-5 at 1412587976948
error: 1.2534443071747666E-4 at 1412587983611
error: 1.210562958977216E-4 at 1412587990095
error: 5.482700395990345E-5 at 1412587997221
error: 1.0187594691457726E-4 at 1412588004837
error: 9.858610511237596E-5 at 1412588012438
error: 4.425365933934775E-5 at 1412588020448
error: 8.261639376011903E-5 at 1412588030284
error: 8.009390808734914E-5 at 1412588040433
error: 3.5667389872503174E-5 at 1412588049176
error: 6.686345694268794E-5 at 1412588057937
error: 6.492900014604511E-5 at 1412588067124
error: 2.8709682698803304E-5 at 1412588076591
Calibrated
error: 0.12283588778298232 at 1412588095804
error: 0.06734689401269783 at 1412588096008
error: 0.08202427250069447 at 1412588096210
error: 0.009125868090623329 at 1412588096437
error: 0.02370906736888987 at 1412588096708
error: 0.061729871366266405 at 1412588097013
error: 0.005303302363843675 at 1412588097318
error: 0.0132107241097436 at 1412588097586
error: 0.04947564926318184 at 1412588097884
error: 0.0027130234216898195 at 1412588098134
error: 0.00777109400968799 at 1412588098444
error: 0.03557238065169925 at 1412588098725
error: 0.0012684945620320513 at 1412588099028
error: 0.004707108373747947 at 1412588099362
error: 0.02296395393643073 at 1412588099722
error: 5.628445113616308E-4 at 1412588100087
error: 0.0027959802023483143 at 1412588100402
error: 0.013543437291183753 at 1412588100735
error: 2.4419009522826145E-4 at 1412588101171
error: 0.0015876828700689084 at 1412588101513
error: 0.007451464602969695 at 1412588101932
error: 1.0561038573350697E-4 at 1412588102317
error: 8.595522937915052E-4 at 1412588102785
error: 0.00389643723865832 at 1412588103344
error: 4.59926284801572E-5 at 1412588103875
error: 4.468585100072085E-4 at 1412588104572
error: 0.001965049017029566 at 1412588105137
error: 2.024341879241442E-5 at 1412588105706
error: 2.251356055541293E-4 at 1412588106258
error: 9.662161650241316E-4 at 1412588106926
error: 9.007157354916817E-6 at 1412588107478
Map calibrated
If you plot the error differences on a graph, the result is actually scary:
https://drive.google.com/file/d/0BzzgOwBUb-lyaE5SaEk1ZkhSYVE/view?usp=sharing
https://drive.google.com/file/d/0BzzgOwBUb-lyLTNPeTVad1FpVGs/view?usp=sharing
It looks like there is no established modern schema or syntax for serializing factor graphs. There are many ways to encode them, but since a factor represents a function we have to deal with polymorphism: a factor may be a discrete function (with sparse values or not), a Gaussian, etc. A flexible schema is therefore required for this to scale. I've made a workaround with JSON, but I would like to check some existing factor representations and compare them.
{
  "variables" : {
    "cloudy" : 2,
    "sprinkler" : 2,
    "rain" : 2,
    "wet_grass" : 2
  },
  "factors" : [
    {
      "variables" : ["cloudy"],
      "values" : [
        {"assignment" : {"cloudy" : 0}, "value" : 0.5},
        {"assignment" : {"cloudy" : 1}, "value" : 0.5}
      ]
    },
    {
      "variables" : ["sprinkler", "cloudy"],
      "values" : [
        {"assignment" : {"sprinkler" : 0, "cloudy" : 0}, "value" : 0.5},
        {"assignment" : {"sprinkler" : 1, "cloudy" : 0}, "value" : 0.5},
        {"assignment" : {"sprinkler" : 0, "cloudy" : 1}, "value" : 0.9},
        {"assignment" : {"sprinkler" : 1, "cloudy" : 1}, "value" : 0.1}
      ]
    },
    {
      "variables" : ["rain", "cloudy"],
      "values" : [
        {"assignment" : {"rain" : 0, "cloudy" : 0}, "value" : 0.8},
        {"assignment" : {"rain" : 1, "cloudy" : 0}, "value" : 0.2},
        {"assignment" : {"rain" : 0, "cloudy" : 1}, "value" : 0.2},
        {"assignment" : {"rain" : 1, "cloudy" : 1}, "value" : 0.8}
      ]
    },
    {
      "variables" : ["wet_grass", "sprinkler", "rain"],
      "values" : [
        {"assignment" : {"wet_grass" : 0, "sprinkler" : 0, "rain" : 0}, "value" : 1.0},
        {"assignment" : {"wet_grass" : 1, "sprinkler" : 0, "rain" : 0}, "value" : 0.0},
        {"assignment" : {"wet_grass" : 0, "sprinkler" : 1, "rain" : 0}, "value" : 0.1},
        {"assignment" : {"wet_grass" : 1, "sprinkler" : 1, "rain" : 0}, "value" : 0.9},
        {"assignment" : {"wet_grass" : 0, "sprinkler" : 0, "rain" : 1}, "value" : 0.1},
        {"assignment" : {"wet_grass" : 1, "sprinkler" : 0, "rain" : 1}, "value" : 0.9},
        {"assignment" : {"wet_grass" : 0, "sprinkler" : 1, "rain" : 1}, "value" : 0.01},
        {"assignment" : {"wet_grass" : 1, "sprinkler" : 1, "rain" : 1}, "value" : 0.99}
      ]
    }
  ]
}
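A sketch of loading and sanity-checking this JSON layout (a hypothetical helper, not part of gmx): every factor must reference only declared variables and enumerate one value per joint assignment.

```python
import json
from math import prod

def load_factor_graph(text):
    """Parse the JSON format above and validate its factor tables."""
    graph = json.loads(text)
    cards = graph["variables"]  # variable name -> cardinality
    for factor in graph["factors"]:
        for v in factor["variables"]:
            assert v in cards, f"undeclared variable {v}"
        # A dense discrete factor needs one entry per joint assignment.
        expected = prod(cards[v] for v in factor["variables"])
        assert len(factor["values"]) == expected, "missing assignments"
    return graph
```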
As described in the paper available at http://ttic.uchicago.edu/~tamir/papers/norm-prod-ieee-final-corr.pdf
Sparse matrices (http://en.wikipedia.org/wiki/Sparse_matrix) are those that take nonzero values in only a small number of cells, so storing every value is a waste of memory. There are two cases:
This concept can be applied to discrete factors, which are in fact multidimensional matrices.
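For illustration, a sparse discrete factor can be stored as a dict from assignment tuples to nonzero values, with lookups defaulting to zero (an assumed representation, not the gmx one):

```python
def sparse_get(values, assignment, default=0.0):
    """Read a cell of a sparse factor; absent entries are zero."""
    return values.get(assignment, default)

# A 3-variable factor with cardinalities 2 * 3 * 2 has 12 dense cells,
# but only the 2 nonzero entries need to be stored.
dense_cells = 2 * 3 * 2
sparse = {(0, 1, 0): 0.7, (1, 2, 1): 0.3}
```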
Use LogFactor instead of Factor.
Take care with the marginalization of LogFactor: summing values stored in log space requires the log-sum-exp trick. Read http://www.mathworks.es/matlabcentral/fileexchange/25273-methods-for-calculating-precise-logarithm-of-a-sum-and-subtraction
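The trick the linked article describes can be sketched as follows: factor out the maximum before exponentiating, so the sum of probabilities never overflows or underflows even when the logs are large in magnitude.

```python
import math

def log_sum_exp(log_values):
    """Compute log(sum(exp(x) for x in log_values)) stably."""
    m = max(log_values)
    if m == float("-inf"):  # every term is log(0)
        return float("-inf")
    # exp(x - m) is at most 1, so the sum cannot overflow.
    return m + math.log(sum(math.exp(x - m) for x in log_values))
```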