
yileitu avatar yileitu commented on June 3, 2024

I forgot to attach ./outputs/default/20240411_091943/logs/infer/llama-2-7b/tydiqa-goldp_arabic.out. It reads:

Error: mkl-service + Intel(R) MKL: MKL_THREADING_LAYER=INTEL is incompatible with libgomp.so.1 library.
        Try to import numpy first or set the threading layer accordingly. Set MKL_SERVICE_FORCE_INTEL to force it.

All the other tydiqa-goldp_xx.out files show the same error.
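The error message itself names two workarounds: import numpy before any library that drags in libgomp (e.g. PyTorch), or set the threading-layer variables up front. A minimal sketch of an entry-point guard, assuming the variable values given in the error message and in this thread; whether the clash occurs at all depends on how your numpy/PyTorch builds link MKL and OpenMP:

```python
import os

# Must be set before numpy/MKL is first imported; values follow the
# error message and the suggestions in this thread.
os.environ["MKL_SERVICE_FORCE_INTEL"] = "1"
os.environ["MKL_THREADING_LAYER"] = "GNU"

# Importing numpy first lets mkl-service initialize its threading layer
# before libgomp is pulled in by torch/transformers.
import numpy as np

print("numpy", np.__version__, "imported")
```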

from opencompass.

Ethan-9606 avatar Ethan-9606 commented on June 3, 2024

I got the same error.

kkkparty avatar kkkparty commented on June 3, 2024

I got the same error.

Please let me know when this is fixed.

IcyFeather233 avatar IcyFeather233 commented on June 3, 2024

try export MKL_SERVICE_FORCE_INTEL=1 and run again

kkkparty avatar kkkparty commented on June 3, 2024

> try export MKL_SERVICE_FORCE_INTEL=1 and run again

It doesn't work.

seanxuu avatar seanxuu commented on June 3, 2024

I see the same error:

Error: mkl-service + Intel(R) MKL: MKL_THREADING_LAYER=INTEL is incompatible with libgomp.so.1 library.
        Try to import numpy first or set the threading layer accordingly. Set MKL_SERVICE_FORCE_INTEL to force it.

seanxuu avatar seanxuu commented on June 3, 2024

> try export MKL_SERVICE_FORCE_INTEL=1 and run again
>
> it doesn't work

See pytorch/pytorch#37377 (comment).

yileitu avatar yileitu commented on June 3, 2024

Dear team, any updates? This bug seems to be associated exclusively with certain datasets such as TyDiQA and XCOPA. In the same setting and environment, I can run the example script below successfully and get meaningful outputs:

python run.py --models hf_opt_125m hf_opt_350m --datasets siqa_gen winograd_ppl

bittersweet1999 avatar bittersweet1999 commented on June 3, 2024

How about

export MKL_THREADING_LAYER=GNU
export MKL_SERVICE_FORCE_INTEL=1

bittersweet1999 avatar bittersweet1999 commented on June 3, 2024

Also, please check your environment: whether PyTorch and transformers are up to date, and whether you are running on Linux.

yileitu avatar yileitu commented on June 3, 2024

> How about export MKL_THREADING_LAYER=GNU export MKL_SERVICE_FORCE_INTEL=1

I tried it; it doesn't work.

> And please check your environment, whether updated Pytorch, transformers, and whether running on Linux

The libraries are up to date, and yes, it is indeed running on Linux. Is there anything in particular I should watch out for on Linux?

Or could you provide a script that you/admins have verified runs the TyDiQA evaluation successfully? (Any model would be fine.) I can try to reproduce it in my environment and find the differences; I think that is the fastest way to solve this issue.

bittersweet1999 avatar bittersweet1999 commented on June 3, 2024
from mmengine.config import read_base

from opencompass.models import HuggingFaceCausalLM
from opencompass.partitioners import NaivePartitioner
from opencompass.runners import LocalRunner
from opencompass.tasks import OpenICLInferTask


with read_base():
    from .datasets.tydiqa.tydiqa_gen import tydiqa_datasets
    from .models.hf_internlm.hf_internlm2_chat_7b import models  # overridden by the models list below
datasets = [*tydiqa_datasets]


_meta_template = dict(
    round=[
        dict(role='HUMAN', begin='<|im_start|>user\n', end='<|im_end|>\n'),
        dict(role='BOT', begin='<|im_start|>assistant\n', end='<|im_end|>\n', generate=True),
    ],
)

models = [
    dict(
        type=HuggingFaceCausalLM,
        abbr='internlm2-chat-7b-hf',
        path="internlm/internlm2-chat-7b",
        tokenizer_path='internlm/internlm2-chat-7b',
        model_kwargs=dict(
            trust_remote_code=True,
            device_map='auto',
        ),
        tokenizer_kwargs=dict(
            padding_side='left',
            truncation_side='left',
            use_fast=False,
            trust_remote_code=True,
        ),
        max_out_len=2048,
        max_seq_len=2048,
        batch_size=8,
        meta_template=_meta_template,
        run_cfg=dict(num_gpus=1, num_procs=1),
        end_str='<|im_end|>',
        generation_kwargs={"eos_token_id": [2, 92542], "do_sample": True},
        batch_padding=True,
    )
]


infer = dict(
    partitioner=dict(type=NaivePartitioner),
    runner=dict(
        type=LocalRunner,
        max_num_workers=256,
        task=dict(type=OpenICLInferTask)),
)

work_dir = 'outputs/test/'

Hi, here is my config; I ran it with the script below:

conda activate opencompass
export MKL_SERVICE_FORCE_INTEL=1
export HF_EVALUATE_OFFLINE=1
export HF_DATASETS_OFFLINE=1
export TRANSFORMERS_OFFLINE=1
export HF_ENDPOINT=https://hf-mirror.com
export TRANSFORMERS_CACHE='my cache dir'
python run.py configs/eval_my_config.py --mode all --reuse latest 

yileitu avatar yileitu commented on June 3, 2024

The same error happens with your config:

04/28 19:40:37 - OpenCompass - ERROR - /cluster/project/sachan/yilei/projects/opencompass/opencompass/runners/local.py - _launch - 192 - task OpenICLInfer[internlm2-chat-7b-hf/tydiqa-goldp_arabic] fail, see
outputs/test/20240428_193926/logs/infer/internlm2-chat-7b-hf/tydiqa-goldp_arabic.out

It seems to be a Linux-specific problem.

bittersweet1999 avatar bittersweet1999 commented on June 3, 2024

I am also running on a Linux platform. After checking the environment (PyTorch, transformers), our only difference is the GPU: I used an A100 80G.

yileitu avatar yileitu commented on June 3, 2024

Most probably not a GPU problem. I tested it on an A100 80G and still got the same error.

yileitu avatar yileitu commented on June 3, 2024

For those who encounter the same OpenICLInfer error: reinstalling numpy in the opencompass conda env worked for me! In addition, I set two env vars:

export MKL_SERVICE_FORCE_INTEL=1
export MKL_THREADING_LAYER=GNU
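If the reinstall appears to change nothing, it may help to confirm which numpy build the env actually resolved and what BLAS backend it links, since the conflict only arises with an MKL-linked build. A small diagnostic sketch (output varies by build; running it inside the opencompass conda env is an assumption):

```python
# Run inside the opencompass conda env after reinstalling numpy.
import numpy as np

print("numpy version:", np.__version__)

# Prints the BLAS/LAPACK build configuration numpy was compiled against;
# look for "mkl" entries to see whether this build links Intel MKL.
np.show_config()
```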
