paddlepaddle / book
Deep Learning 101 with PaddlePaddle (『飞桨』深度学习框架入门教程)
Home Page: http://www.paddlepaddle.org/documentation/docs/zh/1.2/beginners_guide/quick_start/index.html
These problems were found by Sudnya (@sudnya). I am posting them here to remind us to fix them:
Figure 2 says the weights are shown in black, but the image contains no black.
Figure 3 has the same black vs. blue issue.
The references include names with special characters (e.g. é, ş, ö) that render strangely in the HTML file.
The following reference link is broken:
10. Bishop, Christopher M. “Pattern recognition.” Machine Learning 128 (2006): 1-58.
I see this in my browser:
This XML file does not appear to have any style information associated with it. The document tree is shown below.
Code: AccessDenied
Message: Request has expired
Expires: 2017-01-19T09:04:00Z
ServerTime: 2017-03-03T02:39:45Z
RequestId: CC773176D6EE0ED3
HostId: lhWBsQrDQ7Y6aWcbYllrQtPicY3CtbAG0feHgYLH1l03Z3TBHfp2fD1MeAnkiqtWgTVUaRzhO+Y=
This is just an example that I think is very well written; sharing it with everyone.
For instance, for LSTM, https://deeplearning4j.org/lstm
covers the key points: how RNNs relate to feedforward networks; the adaptation of backpropagation, BPTT; why simple RNNs hit a wall (the vanishing and exploding gradient problems); how LSTM solves them; and propagation over long time sequences.
Likewise for CNNs, https://deeplearning4j.org/convolutionalnets explains things very well.
Also, everyone can browse the table of contents in the left sidebar and draw on the material and structure of the other chapters, so we can improve our tutorial together.
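The vanishing/exploding gradient point above can be made concrete with a small sketch (my own illustration, not from the tutorial; `bptt_gradient_norm` is a hypothetical helper). BPTT multiplies the error signal by the recurrent weight matrix once per time step, so its norm scales roughly like the spectral radius raised to the number of steps:

```python
import numpy as np

np.random.seed(0)

def bptt_gradient_norm(scale, steps=100, dim=8):
    # Random recurrent matrix rescaled so its spectral radius equals `scale`.
    W = np.random.randn(dim, dim)
    W *= scale / max(abs(np.linalg.eigvals(W)))
    grad = np.ones(dim)
    for _ in range(steps):
        grad = W.T @ grad  # one BPTT step (linear activation, for clarity)
    return np.linalg.norm(grad)

print(bptt_gradient_norm(0.9))  # spectral radius < 1: the gradient shrinks away
print(bptt_gradient_norm(1.1))  # spectral radius > 1: the gradient blows up
```

LSTM's gating sidesteps this by giving the cell state an additive path through time instead of a purely multiplicative one.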
[V2.SimpleCode] Chapter 8, Personalized Recommendation: recommender_system
Note: there is an error in the original figure. The locations for 源语言词序列 and 源语编码状态 should be switched.
This seems to be because macOS Sierra does not support matplotlib out of the box.
The fix is as follows:
Install matplotlib:
pip install matplotlib
If that fails with
* The following required packages can not be built: Command "python setup.py egg_info" failed with error code 1 in /tmp/pip-build-8UvQcW/matplotlib
first run
sudo apt-get install libfreetype6-dev pkg-config
then install matplotlib again:
pip install matplotlib
and finally run
cd data && python prepare_data.py
which prints
housing.data housing.test.npy housing.train.npy prepare_data.py test.list train.lis
Done.
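For scripts that only save figures to disk (as the chapter's plotting code does), a common hedge against the macOS backend problem is to force the non-interactive Agg backend before pyplot is imported. A minimal check, assuming matplotlib is installed; the output filename is my own placeholder:

```python
# Force the Agg backend; this must run before `import matplotlib.pyplot`,
# otherwise the default (possibly broken) macosx backend is already loaded.
import matplotlib
matplotlib.use("Agg")
import matplotlib.pyplot as plt

plt.plot([1, 2, 3], [1, 4, 9])
plt.savefig("fit_a_line_check.png")  # writes a file instead of opening a window
```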
[V2.SimpleCode] Chapter 3, Image Classification: image_classification
Please first read the unified standard for writing the Chinese tutorials.
Requirements:
This makes the translation extremely difficult.
Please first read the unified standard for writing the Chinese tutorials.
Requirements:
Please first read the unified standard for writing the Chinese tutorials.
Requirements:
https://github.com/PaddlePaddle/book/blob/develop/word2vec/image/ngram.png
Please first read the unified standard for writing the Chinese tutorials.
Requirements:
[V2.SimpleCode] Chapter 6, Semantic Role Labeling: label_semantic_roles
Please first read the unified standard for writing the Chinese tutorials.
Requirements:
Please first read the unified standard for writing the Chinese tutorials.
Requirements:
After #166 is resolved.
Please first read the unified standard for writing the Chinese tutorials.
Requirements:
We need the translation of Figure 3 only, as Figures 1 and 2 came from the Google YouTube paper.
Please first read the unified standard for writing the Chinese tutorials.
Requirements:
[V2.SimpleCode] Chapter 2, Recognize Digits: recognize_digits
Please first read the unified standard for writing the Chinese tutorials.
Requirements:
I ran sentiment analysis (understand_sentiment) in Docker without error, but running semantic role labeling (label_semantic_roles) fails as follows:
Traceback (most recent call last):
File "/usr/local/lib/python2.7/dist-packages/paddle/trainer/config_parser.py", line 3406, in parse_config_and_serialize
config = parse_config(config_file, config_arg_str)
File "/usr/local/lib/python2.7/dist-packages/paddle/trainer/config_parser.py", line 3382, in parse_config
execfile(config_file, make_config_environment(config_file, config_args))
File "./db_lstm.py", line 79, in
average_window=0.5, max_average_window=10000), )
File "/usr/local/lib/python2.7/dist-packages/paddle/trainer_config_helpers/default_decorators.py", line 53, in wrapper
return func(*args, **kwargs)
File "/usr/local/lib/python2.7/dist-packages/paddle/trainer_config_helpers/default_decorators.py", line 53, in wrapper
return func(*args, **kwargs)
File "/usr/local/lib/python2.7/dist-packages/paddle/trainer_config_helpers/optimizers.py", line 441, in settings
kwargs = extends(kwargs, each.to_setting_kwargs())
File "/usr/local/lib/python2.7/dist-packages/paddle/trainer_config_helpers/optimizers.py", line 349, in extends
assert key not in dict1
AssertionError
F0227 09:10:32.580476 869 PythonUtil.cpp:134] Check failed: (ret) != nullptr Current PYTHONPATH: ['/usr/local/opt/paddle/bin', '/home/code/label_semantic_roles', '/usr/lib/python2.7', '/usr/lib/python2.7/plat-x86_64-linux-gnu', '/usr/lib/python2.7/lib-tk', '/usr/lib/python2.7/lib-old', '/usr/lib/python2.7/lib-dynload', '/usr/local/lib/python2.7/dist-packages', '/usr/lib/python2.7/dist-packages']
Python Error: <type 'exceptions.AssertionError'> :
Python Callstack:
/usr/local/lib/python2.7/dist-packages/paddle/trainer/config_parser.py : 3406
/usr/local/lib/python2.7/dist-packages/paddle/trainer/config_parser.py : 3382
./db_lstm.py : 79
/usr/local/lib/python2.7/dist-packages/paddle/trainer_config_helpers/default_decorators.py : 53
/usr/local/lib/python2.7/dist-packages/paddle/trainer_config_helpers/default_decorators.py : 53
/usr/local/lib/python2.7/dist-packages/paddle/trainer_config_helpers/optimizers.py : 441
/usr/local/lib/python2.7/dist-packages/paddle/trainer_config_helpers/optimizers.py : 349
Call Object failed.
*** Check failure stack trace: ***
@ 0x7fac2f86edaa (unknown)
@ 0x7fac2f86ece4 (unknown)
@ 0x7fac2f86e6e6 (unknown)
@ 0x7fac2f871687 (unknown)
@ 0x813eba paddle::callPythonFuncRetPyObj()
@ 0x81409c paddle::callPythonFunc()
@ 0x6b1f73 paddle::TrainerConfigHelper::TrainerConfigHelper()
@ 0x6b25b4 paddle::TrainerConfigHelper::createFromFlags()
@ 0x52b2f7 main
@ 0x7fac2ea7af45 (unknown)
@ 0x540c05 (unknown)
@ (nil) (unknown)
/usr/local/bin/paddle: line 109: 869 Aborted (core dumped) ${DEBUGGER}$MYDIR/../opt/paddle/bin/paddle_trainer $ {@:2}
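For what it's worth, the trace ends in a duplicate-key guard at optimizers.py line 349 (`assert key not in dict1`) while merging optimizer keyword arguments. A minimal sketch of that failure mode — my own reimplementation for illustration, not the library source:

```python
def extends(dict1, dict2):
    # Merge dict2 into dict1, refusing to define any key twice;
    # this mirrors the assertion reported in the traceback above.
    for key in dict2:
        assert key not in dict1
        dict1[key] = dict2[key]
    return dict1

base = {"average_window": 0.5}  # key already present in the settings
try:
    extends(base, {"average_window": 0.5, "max_average_window": 10000})
except AssertionError:
    print("duplicate optimizer kwarg")  # the same class of failure as the report
```

So the AssertionError suggests the ModelAverage settings collide with keys the configuration already carries, rather than a problem in db_lstm.py itself.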
Line 79 reads:
model_average=ModelAverage(
average_window=0.5, max_average_window=10000), )
After commenting out that line, line 96 errors instead:
default_std = 1 / math.sqrt(hidden_dim) / 3.0
with sqrt not found, even though the same line runs fine in a standalone Python session.
Any guidance would be much appreciated, thanks!
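A note on the second symptom: the expression itself is valid Python once `math` is imported, so "sqrt not found" points at the environment the config is evaluated in rather than at the formula. A quick standalone check (the `hidden_dim` value is a placeholder of my choosing):

```python
import math  # if this import is missing in the config environment,
             # math.sqrt is undefined and the line below fails

hidden_dim = 512  # placeholder; the chapter's config sets its own value
default_std = 1 / math.sqrt(hidden_dim) / 3.0
print(default_std)
```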
[V2.SimpleCode] Chapter 7, Machine Translation: machine_translation
[V2.SimpleCode] Chapter 1, Getting Started: fit_a_line
@Zrachel reminded me a few days ago that Jupyter might already provide everything we want. Andrew Ng showed me today the Google Trends for Jupyter and Zeppelin notebooks:
So I think I need to try Jupyter at least on my laptop.
Please first read the unified standard for writing the Chinese tutorials.
Requirements:
This would prevent it from being converted automatically into Jupyter Notebooks.
I0222 07:10:44.978152 120 Trainer.cpp:170] trainer mode: Normal
F0222 07:10:44.979224 120 ClassRegistrar.h:66] Check failed: mapGet(type, creatorMap_, &creator) Unknown class type: cudnn_conv
*** Check failure stack trace: ***
@ 0x7fe16de95daa (unknown)
@ 0x7fe16de95ce4 (unknown)
@ 0x7fe16de956e6 (unknown)
@ 0x7fe16de98687 (unknown)
@ 0x600835 paddle::Layer::create()
@ 0x538940 ZZN6paddle13NeuralNetwork4initERKNS_11ModelConfigESt8functionIFviPNS_9ParameterEEERKSt6vectorINS_19enumeration_wrapper13ParameterTypeESaISB_EEbENKUlRKNS_11LayerConfigEE_clESI
@ 0x539d5b paddle::NeuralNetwork::init()
@ 0x5479d2 paddle::MultiGradientMachine::MultiGradientMachine()
@ 0x53efde paddle::GradientMachine::create()
@ 0x67bc98 paddle::TrainerInternal::init()
@ 0x6783ee paddle::Trainer::init()
@ 0x5132a9 main
@ 0x7fe16d0a1f45 (unknown)
@ 0x51f2a5 (unknown)
@ (nil) (unknown)
n_in != n_out
n_in != n_out
/usr/local/bin/paddle: line 109: 120 Aborted ${DEBUGGER}
The homepage currently shows:
Getting Started [fit_a_line] [html]
After the English pages are added, how should this be displayed?
[V2.SimpleCode] Chapter 4, Word Vectors: word2vec
[V2.SimpleCode] Chapter 5, Sentiment Analysis: understand_sentiment
In book/label_semantic_roles/index.en.html, the URL to label_semantic_roles/image/db_lstm_en.png has a typo: db.
Please first read the unified standard for writing the Chinese tutorials.
Requirements:
For example: the text says Figure 4 shows two convolution kernels, $Filter W_0$ and $Filter W_1$,
but they are not labeled in Figure 4.
Please first read the unified standard for writing the Chinese tutorials.
Requirements:
Please first read the unified standard for writing the Chinese tutorials.
Requirements:
(1) For this image:
https://github.com/PaddlePaddle/book/blob/develop/machine_translation/image/encoder_decoder.png
the locations of "源语言编码状态" and "源语言词序列" should be switched.
(2) For this image
https://github.com/PaddlePaddle/book/blob/develop/machine_translation/image/encoder_decoder_en.png
Similarly, the locations of "Word Embedding Sequence for the Source Language" and "Word Sequence for the Source Sequence" should be switched.
I installed via Docker, using the CPU version; training VGG or ResNet fails with the following error:
I0307 09:38:06.377203 355 Util.cpp:130] Calling runInitFunctions
I0307 09:38:06.377521 355 Util.cpp:143] Call runInitFunctions done.
[INFO 2017-03-07 09:38:06,464 layers.py:1890] channels=3 size=3072
[INFO 2017-03-07 09:38:06,464 layers.py:1890] output size for conv_0 is 32
[INFO 2017-03-07 09:38:06,466 layers.py:1890] channels=64 size=65536
[INFO 2017-03-07 09:38:06,466 layers.py:1890] output size for conv_1 is 32
[INFO 2017-03-07 09:38:06,468 layers.py:1985] output size for pool_0 is 16*16
[INFO 2017-03-07 09:38:06,469 layers.py:1890] channels=64 size=16384
[INFO 2017-03-07 09:38:06,469 layers.py:1890] output size for conv_2 is 16
[INFO 2017-03-07 09:38:06,471 layers.py:1890] channels=128 size=32768
[INFO 2017-03-07 09:38:06,471 layers.py:1890] output size for conv_3 is 16
[INFO 2017-03-07 09:38:06,472 layers.py:1985] output size for pool_1 is 8*8
[INFO 2017-03-07 09:38:06,473 layers.py:1890] channels=128 size=8192
[INFO 2017-03-07 09:38:06,473 layers.py:1890] output size for conv_4 is 8
[INFO 2017-03-07 09:38:06,475 layers.py:1890] channels=256 size=16384
[INFO 2017-03-07 09:38:06,475 layers.py:1890] output size for conv_5 is 8
[INFO 2017-03-07 09:38:06,477 layers.py:1890] channels=256 size=16384
[INFO 2017-03-07 09:38:06,478 layers.py:1890] output size for conv_6 is 8
[INFO 2017-03-07 09:38:06,479 layers.py:1985] output size for pool_2 is 4*4
[INFO 2017-03-07 09:38:06,480 layers.py:1890] channels=256 size=4096
[INFO 2017-03-07 09:38:06,481 layers.py:1890] output size for conv_7 is 4
[INFO 2017-03-07 09:38:06,483 layers.py:1890] channels=512 size=8192
[INFO 2017-03-07 09:38:06,483 layers.py:1890] output size for conv_8 is 4
[INFO 2017-03-07 09:38:06,485 layers.py:1890] channels=512 size=8192
[INFO 2017-03-07 09:38:06,485 layers.py:1890] output size for conv_9 is 4
[INFO 2017-03-07 09:38:06,487 layers.py:1985] output size for pool_3 is 2*2
[INFO 2017-03-07 09:38:06,488 layers.py:1890] channels=512 size=2048
[INFO 2017-03-07 09:38:06,488 layers.py:1890] output size for conv_10 is 2
[INFO 2017-03-07 09:38:06,490 layers.py:1890] channels=512 size=2048
[INFO 2017-03-07 09:38:06,490 layers.py:1890] output size for conv_11 is 2
[INFO 2017-03-07 09:38:06,491 layers.py:1890] channels=512 size=2048
[INFO 2017-03-07 09:38:06,491 layers.py:1890] output size for conv_12 is 2
[INFO 2017-03-07 09:38:06,493 layers.py:1985] output size for pool_4 is 1*1
[INFO 2017-03-07 09:38:06,495 networks.py:1466] The input order is [image, label]
[INFO 2017-03-07 09:38:06,495 networks.py:1472] The output order is [cost_0]
I0307 09:38:06.502377 355 Trainer.cpp:170] trainer mode: Normal
I0307 09:38:06.710746 355 PyDataProvider2.cpp:257] loading dataprovider dataprovider::process
I0307 09:38:06.712720 355 PyDataProvider2.cpp:257] loading dataprovider dataprovider::process
I0307 09:38:06.713508 355 GradientMachine.cpp:134] Initing parameters..
I0307 09:38:07.561144 355 GradientMachine.cpp:141] Init parameters done.
I0307 09:38:10.089026 364 ThreadLocal.cpp:37] thread use undeterministic rand seed:365
I0307 09:38:10.251857 357 ThreadLocal.cpp:37] thread use undeterministic rand seed:358
I0307 09:38:10.296422 356 ThreadLocal.cpp:37] thread use undeterministic rand seed:357
I0307 09:38:10.305033 359 ThreadLocal.cpp:37] thread use undeterministic rand seed:360
I0307 09:38:10.348325 358 ThreadLocal.cpp:37] thread use undeterministic rand seed:359
/usr/local/bin/paddle: line 109: 355 Killed ${DEBUGGER}
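A possible reading of the `Killed` line (a sketch with numbers read off the log, not a definitive diagnosis): `Killed`, as opposed to a crash, typically means the kernel's OOM killer fired, which is common when a Docker VM's memory cap is too small for CPU training. The sizes the log prints are channels × height × width, which we can sanity-check, and the forward activations alone already add up per image:

```python
# Sanity-check the log's bookkeeping: size = channels * height * width.
checks = [
    (3, 32, 3072),    # conv_0 input: 3 channels, 32x32
    (64, 32, 65536),  # conv_1 input
    (64, 16, 16384),  # conv_2 input, after 2x2 pooling
    (512, 2, 2048),   # conv_12 input
]
for channels, side, size in checks:
    assert channels * side * side == size

# Output feature maps of the 13 conv layers, read off the log (fp32 assumed).
conv_outputs = [(64, 32)] * 2 + [(128, 16)] * 2 + [(256, 8)] * 3 \
             + [(512, 4)] * 3 + [(512, 2)] * 3
bytes_per_image = sum(c * s * s * 4 for c, s in conv_outputs)
print(bytes_per_image / 1e6, "MB of forward activations per image")  # ~1.1 MB
```

Multiplied by the batch size, gradients, and the per-thread copies the multi-threaded trainer keeps, the real footprint is many times this, so raising the Docker memory limit is worth trying.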
Follow-up to #137.
Please first read the unified standard for writing the Chinese tutorials.
Requirements:
Please first read the unified standard for writing the Chinese tutorials.
Requirements: