
paddletest's Introduction

PaddleTest

Directory layout

.    -------------------------------------> project root
├── README.md
│
├── distributed/ -------------------------> distributed test cases
│
├── framework/   -------------------------> Paddle core framework test cases
│
├── inference/   -------------------------> inference-related test cases
│
├── lite/        -------------------------> Paddle Lite test cases
│
└── models/      -------------------------> model-related test cases

How to Contribute

Code Review

Code review consists of two parts: checks against established open-source style conventions, enforced through pre-commit, and checks against our own custom rules, which live under tools/codestyle.

The pre-commit checks consist mainly of three tools: black, flake8, and pylint, which run during the code-review stage of CI. To run them locally, install pre-commit; the simplest way is to use Python 3 and run pip install pre-commit. Then execute pre-commit run --files [your code file]. black will automatically reformat the code and fix some simple style errors. See the .pre-commit-config.yaml file in the repository root for the exact rule configuration.

Merge Requirements

A change may only be merged after all CI checks pass; force-merging is forbidden in principle. If a Pylint style rule blocks the merge, disabling that specific rule can be discussed. Every change requires at least one QA reviewer, and sensitive code is prohibited.

CI 触发规则

  • By default, all CI tasks are triggered.
  • To trigger only one specific CI task, add the corresponding keyword to the commit message:

CI task            | Keyword                        | Effect
linux-ci           | notest,test=linux_ci           | Only the linux-ci task runs
linux-inference-ci | notest,test=linux_inference_ci | Only the linux-inference-ci task runs
CodeStyle          | notest,test=codestyle          | Only the CodeStyle task runs
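For illustration, a commit carrying one of these keywords could look like the following sketch (demonstrated in a throwaway repository; the message text is a made-up example):

```shell
set -e
# Create a throwaway repository so the demo commit is harmless
demo=$(mktemp -d)
cd "$demo"
git init -q
# The keyword "notest,test=linux_ci" asks CI to run only the linux-ci task
git -c user.name=demo -c user.email=demo@example.com \
    commit -q --allow-empty -m "fix dist case, notest,test=linux_ci"
git log -1 --pretty=%s   # prints: fix dist case, notest,test=linux_ci
```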

Per-category CI triggering

  • Framework changes do not trigger the linux-inference-ci task.
  • Inference changes do not trigger the linux-ci task.
  • Model changes trigger neither the linux-ci task nor the linux-inference-ci task.

Task type       | Keyword            | Effect
framework tasks | run_mode=framework | The linux-inference-ci task is skipped
inference tasks | run_mode=inference | The linux-ci task is skipped
model tasks     | run_mode=model     | The linux-ci and linux-inference-ci tasks are skipped
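The skip rules above can be sketched as a small shell dispatch (a hypothetical illustration; the real CI scripts are not part of this README):

```shell
# Given a commit message, decide which CI tasks would be skipped
msg="add resnet case, run_mode=model"
case "$msg" in
  *run_mode=framework*) skip="linux-inference-ci" ;;
  *run_mode=inference*) skip="linux-ci" ;;
  *run_mode=model*)     skip="linux-ci linux-inference-ci" ;;
  *)                    skip="" ;;  # no keyword: nothing is skipped
esac
echo "skipped: $skip"   # prints: skipped: linux-ci linux-inference-ci
```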

paddletest's People

Contributors

chalsliu, dddivano, emmonscurse, fightfat, ggbond8488, gzxl, iamwhtwd, jiamengsi, jiaxiao243, liujie0926, mmglove, oliverlph, pcjmmc, plusnew001, pollyyan, quanxiang-liu, shjnt, veyron95, wanghuancoder, wangye707, xiegegege, xieyunshen, yanmeng1019, yghstill, zeref996, zhangyulongg, zhengya01, zhoutianzi666, zhwesky2010, zjjlivein


paddletest's Issues

1

CLA assistant check
Thank you for your submission! We really appreciate it. Like many open source projects, we ask that you all sign our Contributor License Agreement before we can accept your contribution.
1 out of 2 committers have signed the CLA.

✅ sunhon
❌ sunhongjiang


sunhongjiang does not seem to be a GitHub user. You need a GitHub account to be able to sign the CLA. If you already have a GitHub account, please add the email address used for this commit to your account.
You have signed the CLA already but the status is still pending? Let us recheck it.

Originally posted by @CLAassistant in #65 (comment)

error in test_segmentation_infer.py with Debug build of Paddle

Hi,

I have stumbled upon an error when running test_segmentation_infer.py with PaddlePaddle built in the Debug configuration. The problem doesn't occur when PaddlePaddle is built as RelWithDebInfo.

Command:

python3.10 test_segmentation_infer.py --model_path=models/human_pp_humansegv2_mobile_192x192_inference_model_with_softmax/ --dataset='human' --dataset_config=configs/humanseg_dataset.yaml --device=CPU --use_mkldnn=True --cpu_threads=10 --model_name=PP-HumanSegV2-Lite

Error message:

python3.10: /mnt/drive/PaddlePaddle/Paddle/build/third_party/eigen3/src/extern_eigen3/unsupported/Eigen/CXX11/src/Tensor/TensorAssign.h:146: bool Eigen::TensorEvaluator<const Eigen::TensorAssignOp<LhsXprType, RhsXprType>, Device>::evalSubExprsIfNeeded(Eigen::TensorEvaluator<const Eigen::TensorAssignOp<LhsXprType, RhsXprType>, Device>::EvaluatorPointerType) [with LeftArgType = Eigen::TensorMap<Eigen::Tensor<int, 4, 1, long int>, 0, Eigen::MakePointer>; RightArgType = const Eigen::TensorConversionOp<int, const Eigen::TensorTupleReducerOp<Eigen::internal::ArgMaxTupleReducer<Eigen::Tuple<long int, float> >, const std::array<long int, 1>, const Eigen::TensorMap<Eigen::Tensor<const float, 4, 1, long int>, 0, Eigen::MakePointer> > >; Device = Eigen::DefaultDevice; Eigen::TensorEvaluator<const Eigen::TensorAssignOp<LhsXprType, RhsXprType>, Device>::EvaluatorPointerType = int*]: Assertion `dimensions_match(m_leftImpl.dimensions(), m_rightImpl.dimensions())' failed.
Aborted (core dumped)
sfraczek@sfraczek-X299:/mnt/drive/PaddlePaddle/PaddleTest/inference/python_api_test/test_int8_model$ Process Process-1:
Traceback (most recent call last):
  File "/home/sfraczek/.local/lib/python3.10/site-packages/psutil/_common.py", line 443, in wrapper
    ret = self._cache[fun]
AttributeError: 'Process' object has no attribute '_cache'
 
During handling of the above exception, another exception occurred:
 
Traceback (most recent call last):
  File "/home/sfraczek/.local/lib/python3.10/site-packages/psutil/_pslinux.py", line 1645, in wrapper
    return fun(self, *args, **kwargs)
  File "/home/sfraczek/.local/lib/python3.10/site-packages/psutil/_common.py", line 446, in wrapper
    return fun(self)
  File "/home/sfraczek/.local/lib/python3.10/site-packages/psutil/_pslinux.py", line 1687, in _parse_stat_file
    data = bcat("%s/%s/stat" % (self._procfs_path, self.pid))
  File "/home/sfraczek/.local/lib/python3.10/site-packages/psutil/_common.py", line 776, in bcat
    return cat(fname, fallback=fallback, _open=open_binary)
  File "/home/sfraczek/.local/lib/python3.10/site-packages/psutil/_common.py", line 764, in cat
    with _open(fname) as f:
  File "/home/sfraczek/.local/lib/python3.10/site-packages/psutil/_common.py", line 728, in open_binary
    return open(fname, "rb", buffering=FILE_READ_BUFFER_SIZE)
FileNotFoundError: [Errno 2] No such file or directory: '/proc/228972/stat'
 
During handling of the above exception, another exception occurred:
 
Traceback (most recent call last):
  File "/usr/lib/python3.10/multiprocessing/process.py", line 314, in _bootstrap
    self.run()
  File "/usr/lib/python3.10/multiprocessing/process.py", line 108, in run
    self._target(*self._args, **self._kwargs)
  File "/mnt/drive/PaddlePaddle/PaddleTest/inference/python_api_test/test_int8_model/backend/monitor.py", line 135, in cpu_stat_func
    stat_info.cpu_percent(),
  File "/home/sfraczek/.local/lib/python3.10/site-packages/psutil/__init__.py", line 999, in cpu_percent
    pt2 = self._proc.cpu_times()
  File "/home/sfraczek/.local/lib/python3.10/site-packages/psutil/_pslinux.py", line 1645, in wrapper
    return fun(self, *args, **kwargs)
  File "/home/sfraczek/.local/lib/python3.10/site-packages/psutil/_pslinux.py", line 1836, in cpu_times
    values = self._parse_stat_file()
  File "/home/sfraczek/.local/lib/python3.10/site-packages/psutil/_pslinux.py", line 1652, in wrapper
    raise NoSuchProcess(self.pid, self._name)
psutil.NoSuchProcess: process no longer exists (pid=228972)

PaddlePaddle commit

commit 3c14b38e7309e661d13bab89dcdf667fa37d39f5 (HEAD -> develop)
Author: duanyanhui <[email protected]>
Date:   Wed Feb 15 20:24:33 2023 +0800

    fix npu save_combine (#50496)

PaddleTest commit

commit f5f80e56f361d0678f23abd52b43ff69e2174a41 (HEAD -> develop, origin/develop, origin/HEAD)
Author: YuBaoku <[email protected]>
Date:   Wed Feb 15 17:12:58 2023 +0800

    Remove delete_pass_list of ViT cases (#1952)

pre-commit error

Created a new file a.txt containing 123; pre-commit fails during installation when run:

(py310) root@iZt4n49r14byh758lhmkk2Z:/nfs/github/paddle/PaddleTest# git commit -m 'Fix'
[INFO] Initializing environment for https://github.com/pre-commit/pre-commit-hooks.
[INFO] Initializing environment for https://gitee.com/mirrors/black.
[INFO] Initializing environment for https://gitee.com/mirrors_PyCQA/flake8.git.
[INFO] Initializing environment for https://gitee.com/mirrors_PyCQA/flake8.git:pep8-naming.
[INFO] Initializing environment for https://github.com/PyCQA/pylint.
[INFO] Installing environment for https://github.com/pre-commit/pre-commit-hooks.
[INFO] Once installed this environment will be reused.
[INFO] This may take a few minutes...
[INFO] Installing environment for https://gitee.com/mirrors/black.
[INFO] Once installed this environment will be reused.
[INFO] This may take a few minutes...


[INFO] Installing environment for https://gitee.com/mirrors_PyCQA/flake8.git.
[INFO] Once installed this environment will be reused.
[INFO] This may take a few minutes...


[INFO] Installing environment for https://github.com/PyCQA/pylint.
[INFO] Once installed this environment will be reused.
[INFO] This may take a few minutes...
An unexpected error has occurred: CalledProcessError: command: ('/nfs/home/.cache/pre-commit/repoy41wkzu7/py_env-python3.10/bin/python', '-mpip', 'install', '.')
return code: 1
stdout:
    Looking in indexes: http://mirrors.cloud.aliyuncs.com/pypi/simple/
    Processing /nfs/home/.cache/pre-commit/repoy41wkzu7
      Preparing metadata (setup.py): started
      Preparing metadata (setup.py): finished with status 'error'
stderr:
      error: subprocess-exited-with-error
      
      × python setup.py egg_info did not run successfully.
      │ exit code: 1
      ╰─> [63 lines of output]
          /nfs/home/.cache/pre-commit/repoy41wkzu7/py_env-python3.10/lib/python3.10/site-packages/setuptools/dist.py:472: SetuptoolsDeprecationWarning: Invalid dash-separated options
          !!
          
                  ********************************************************************************
                  Usage of dash-separated 'index-url' will not be supported in future
                  versions. Please use the underscore name 'index_url' instead.
          
                  By 2024-Sep-26, you need to update your project and remove deprecated calls
                  or your builds will no longer be supported.
          
                  See https://setuptools.pypa.io/en/latest/userguide/declarative_config.html for details.
                  ********************************************************************************
          
          !!
            opt = self.warn_dash_deprecation(opt, section)
          /nfs/home/.cache/pre-commit/repoy41wkzu7/py_env-python3.10/lib/python3.10/site-packages/setuptools/__init__.py:80: _DeprecatedInstaller: setuptools.installer and fetch_build_eggs are deprecated.
          !!
          
                  ********************************************************************************
                  Requirements should be satisfied by a PEP 517 installer.
                  If you are using pip, you can try `pip install --use-pep517`.
                  ********************************************************************************
          
          !!
            dist.fetch_build_eggs(dist.setup_requires)
          WARNING: The repository located at mirrors.cloud.aliyuncs.com is not a trusted or secure host and is being ignored. If this repository is available via HTTPS we recommend you use HTTPS instead, otherwise you may silence this warning and allow it anyway with '--trusted-host mirrors.cloud.aliyuncs.com'.
          ERROR: Could not find a version that satisfies the requirement setuptools_scm (from versions: none)
          ERROR: No matching distribution found for setuptools_scm
          Traceback (most recent call last):
            File "/nfs/home/.cache/pre-commit/repoy41wkzu7/py_env-python3.10/lib/python3.10/site-packages/setuptools/installer.py", line 101, in _fetch_build_egg_no_warn
              subprocess.check_call(cmd)
            File "/data/anaconda3/envs/py310/lib/python3.10/subprocess.py", line 369, in check_call
              raise CalledProcessError(retcode, cmd)
          subprocess.CalledProcessError: Command '['/nfs/home/.cache/pre-commit/repoy41wkzu7/py_env-python3.10/bin/python', '-m', 'pip', '--disable-pip-version-check', 'wheel', '--no-deps', '-w', '/tmp/tmpb4eabi61', '--quiet', '--index-url', 'http://mirrors.cloud.aliyuncs.com/pypi/simple/', 'setuptools_scm']' returned non-zero exit status 1.
          
          The above exception was the direct cause of the following exception:
          
          Traceback (most recent call last):
            File "<string>", line 2, in <module>
            File "<pip-setuptools-caller>", line 34, in <module>
            File "/nfs/home/.cache/pre-commit/repoy41wkzu7/setup.py", line 3, in <module>
              setup(use_scm_version=True)
            File "/nfs/home/.cache/pre-commit/repoy41wkzu7/py_env-python3.10/lib/python3.10/site-packages/setuptools/__init__.py", line 102, in setup
              _install_setup_requires(attrs)
            File "/nfs/home/.cache/pre-commit/repoy41wkzu7/py_env-python3.10/lib/python3.10/site-packages/setuptools/__init__.py", line 75, in _install_setup_requires
              _fetch_build_eggs(dist)
            File "/nfs/home/.cache/pre-commit/repoy41wkzu7/py_env-python3.10/lib/python3.10/site-packages/setuptools/__init__.py", line 80, in _fetch_build_eggs
              dist.fetch_build_eggs(dist.setup_requires)
            File "/nfs/home/.cache/pre-commit/repoy41wkzu7/py_env-python3.10/lib/python3.10/site-packages/setuptools/dist.py", line 636, in fetch_build_eggs
              return _fetch_build_eggs(self, requires)
            File "/nfs/home/.cache/pre-commit/repoy41wkzu7/py_env-python3.10/lib/python3.10/site-packages/setuptools/installer.py", line 38, in _fetch_build_eggs
              resolved_dists = pkg_resources.working_set.resolve(
            File "/nfs/home/.cache/pre-commit/repoy41wkzu7/py_env-python3.10/lib/python3.10/site-packages/pkg_resources/__init__.py", line 829, in resolve
              dist = self._resolve_dist(
            File "/nfs/home/.cache/pre-commit/repoy41wkzu7/py_env-python3.10/lib/python3.10/site-packages/pkg_resources/__init__.py", line 865, in _resolve_dist
              dist = best[req.key] = env.best_match(
            File "/nfs/home/.cache/pre-commit/repoy41wkzu7/py_env-python3.10/lib/python3.10/site-packages/pkg_resources/__init__.py", line 1135, in best_match
              return self.obtain(req, installer)
            File "/nfs/home/.cache/pre-commit/repoy41wkzu7/py_env-python3.10/lib/python3.10/site-packages/pkg_resources/__init__.py", line 1147, in obtain
              return installer(requirement)
            File "/nfs/home/.cache/pre-commit/repoy41wkzu7/py_env-python3.10/lib/python3.10/site-packages/setuptools/installer.py", line 103, in _fetch_build_egg_no_warn
              raise DistutilsError(str(e)) from e
          distutils.errors.DistutilsError: Command '['/nfs/home/.cache/pre-commit/repoy41wkzu7/py_env-python3.10/bin/python', '-m', 'pip', '--disable-pip-version-check', 'wheel', '--no-deps', '-w', '/tmp/tmpb4eabi61', '--quiet', '--index-url', 'http://mirrors.cloud.aliyuncs.com/pypi/simple/', 'setuptools_scm']' returned non-zero exit status 1.
          [end of output]
      
      note: This error originates from a subprocess, and is likely not a problem with pip.
    error: metadata-generation-failed
    
    × Encountered error while generating package metadata.
    ╰─> See above for output.
    
    note: This is an issue with the package mentioned above, not pip.
    hint: See above for details.
Check the log at /nfs/home/.cache/pre-commit/pre-commit.log

Problem launching test of ppseg_lite

Hi,

I'm running the test:
python test_segmentation_infer.py --model_path=models/ppseg_lite_portrait_398x224_with_softmax --dataset='human' --dataset_config=configs/humanseg_dataset.yaml --device=CPU --use_mkldnn=True --cpu_threads=10

I got this error:

Traceback (most recent call last):
  File "/mnt/drive/PaddlePaddle/PaddleTest/inference/python_api_test/test_int8_model/test_segmentation_infer.py", line 262, in <module>
    eval(args)
  File "/mnt/drive/PaddlePaddle/PaddleTest/inference/python_api_test/test_int8_model/test_segmentation_infer.py", line 172, in eval
    image = np.array(data[0])
KeyError: 0

This error means that the dictionary named data doesn't contain anything under the key 0. The dictionary has 4 elements with the following keys:

(Pdb) p data.keys()
dict_keys(['trans_info', 'img', 'label', 'gt_fields'])

It probably means that the data loader is broken in the PaddleTest repo, because return_list=True doesn't result in a list being produced.
loader = paddle.io.DataLoader(eval_dataset, batch_sampler=batch_sampler, num_workers=0, return_list=True)
After working around that by replacing data[0] and data[1] with data["img"] and data["label"], a new error appeared.

Traceback (most recent call last):
  File "/mnt/drive/PaddlePaddle/PaddleTest/inference/python_api_test/test_int8_model/test_segmentation_infer.py", line 262, in <module>
    eval(args)
  File "/mnt/drive/PaddlePaddle/PaddleTest/inference/python_api_test/test_int8_model/test_segmentation_infer.py", line 189, in eval
    logit = reverse_transform(
TypeError: reverse_transform() got multiple values for argument 'mode'

I'm using latest develop of repositories.
@yeliang2258
@jiangjiajun

Model tests stopped working on machines without a GPU

After 97a182a, the tests from this repository stopped working on our validation machines. This is because 97a182a enables monitor.py, which requires nvidia-smi, which our validation machines don't have.
Humansegv2 and ppyoloe are high-priority models that we were told to optimize by @yaomichael, but we can't do this if this repository keeps adding changes that make it impossible to run the models on machines without a GPU. Most of our machines don't have GPUs, because oneDNN only runs on CPU.
Please decide whether we should keep trying to optimize the oneDNN versions of these models, and if so, please make it possible to run the tests of these models on machines without a GPU.
