Comments (8)
Hey @haohenggang, thanks for reporting. We fixed this issue in #2614. Could you update to the latest main branch and try again?
Thank you @AkulRT for reporting. It seems those macros may not work directly on Windows. We will dig into this; please give us some time to find a solution. Meanwhile, a quick workaround you can do locally is to replace the includes in mlc-llm/android/mlc4j/src/cpp/tvm_runtime.h (lines 15 to 44 at commit 0575b92) with absolute paths. For example, use
- #include CONCAT(TVM_SOURCE_DIR,/src/runtime/c_runtime_api.cc)
+ #include "C:/Users/akula/Desktop/Akul/School_Stuff/Internship/mlc-llm/3rdparty/tvm/src/runtime/c_runtime_api.cc"
to include c_runtime_api.cc, and do the same for the other includes in the file.
@AkulRT Ah wait, the absolute path may not work either. Instead we can do
#include "../../../../3rdparty/tvm/src/runtime/c_runtime_api.cc"
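For reference, the relative path works because tvm_runtime.h sits four directories below the checkout root. A minimal sketch of the edit (the full list of runtime includes is in the header itself):

// tvm_runtime.h lives at <repo>/android/mlc4j/src/cpp/tvm_runtime.h, so the
// four ".." components climb from cpp/ back to the mlc-llm checkout root:
//   cpp -> src -> mlc4j -> android -> <repo root>
// From there the TVM submodule is reachable without the CONCAT macro:
#include "../../../../3rdparty/tvm/src/runtime/c_runtime_api.cc"
// Apply the same "../../../../3rdparty/tvm" prefix to each of the other
// runtime sources included in tvm_runtime.h (lines 15 to 44).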
@AkulRT We just merged a fix in #2616; you can check out the latest main branch to try it.
Given that the fix for the original Gemma config problem has been confirmed, I'm going to close this issue now. You are more than welcome to open new issues for further problems :-)
Managed to get the chat app compiled. Thank you for your help, @MasterJH5574!
I tried compiling Gemma for chat again after the update, and it gives the following error:
[151/153] Building CXX object CMakeFiles/tvm4j_runtime_pac...vm/native/src/main/native/org_apache_tvm_native_c_api.cc.o
FAILED: CMakeFiles/tvm4j_runtime_packed.dir/368b657d31d5c8d946e3ffa48aa52ef0/Internship/mlc-llm/3rdparty/tvm/jvm/native/src/main/native/org_apache_tvm_native_c_api.cc.o
C:\Users\akula\AppData\Local\Android\Sdk\ndk\27.0.11902837\toolchains\llvm\prebuilt\windows-x86_64\bin\clang++.exe --target=aarch64-none-linux-android24 --sysroot=C:/Users/akula/AppData/Local/Android/Sdk/ndk/27.0.11902837/toolchains/llvm/prebuilt/windows-x86_64/sysroot -DTVM4J_ANDROID -DTVM_LOG_CUSTOMIZE=1 -DTVM_RELAX_VM_ENABLE_PROFILER=0 -DTVM_SOURCE_DIR=C:/Users/akula/Desktop/Akul/School_Stuff/Internship/mlc-llm/android/mlc4j/../../3rdparty/tvm -Dtvm4j_runtime_packed_EXPORTS -IC:/Users/akula/Desktop/Akul/School_Stuff/Internship/mlc-llm/android/MLCChat/build/jni_header -IC:/Users/akula/Desktop/Akul/School_Stuff/Internship/mlc-llm/android/mlc4j/src/cpp -IC:/Users/akula/Desktop/Akul/School_Stuff/Internship/mlc-llm/android/mlc4j/../../3rdparty/tvm/3rdparty/dlpack/include -IC:/Users/akula/Desktop/Akul/School_Stuff/Internship/mlc-llm/android/mlc4j/../../3rdparty/tvm/3rdparty/dmlc-core/include -IC:/Users/akula/Desktop/Akul/School_Stuff/Internship/mlc-llm/android/mlc4j/../../3rdparty/tvm/3rdparty/OpenCL-Headers -IC:/Users/akula/Desktop/Akul/School_Stuff/Internship/mlc-llm/android/mlc4j/../../3rdparty/tvm/3rdparty/picojson -IC:/Users/akula/Desktop/Akul/School_Stuff/Internship/mlc-llm/android/mlc4j/../../3rdparty/tvm/include -IC:/Users/akula/Desktop/Akul/School_Stuff/Internship/mlc-llm/3rdparty/tokenizers-cpp/include -g -DANDROID -fdata-sections -ffunction-sections -funwind-tables -fstack-protector-strong -no-canonical-prefixes -D_FORTIFY_SOURCE=2 -Wformat -Werror=format-security "-O3" -O3 -DNDEBUG -fPIC -MD -MT CMakeFiles/tvm4j_runtime_packed.dir/368b657d31d5c8d946e3ffa48aa52ef0/Internship/mlc-llm/3rdparty/tvm/jvm/native/src/main/native/org_apache_tvm_native_c_api.cc.o -MF CMakeFiles\tvm4j_runtime_packed.dir\368b657d31d5c8d946e3ffa48aa52ef0\Internship\mlc-llm\3rdparty\tvm\jvm\native\src\main\native\org_apache_tvm_native_c_api.cc.o.d -o CMakeFiles/tvm4j_runtime_packed.dir/368b657d31d5c8d946e3ffa48aa52ef0/Internship/mlc-llm/3rdparty/tvm/jvm/native/src/main/native/org_apache_tvm_native_c_api.cc.o -c C:/Users/akula/Desktop/Akul/School_Stuff/Internship/mlc-llm/3rdparty/tvm/jvm/native/src/main/native/org_apache_tvm_native_c_api.cc
In file included from C:/Users/akula/Desktop/Akul/School_Stuff/Internship/mlc-llm/3rdparty/tvm/jvm/native/src/main/native/org_apache_tvm_native_c_api.cc:25:
C:/Users/akula/Desktop/Akul/School_Stuff/Internship/mlc-llm/android/mlc4j/src/cpp/tvm_runtime.h:15:10: fatal error: 'C:/Users/akula/Desktop/Akul/School_Stuff/Internship/mlc-llm/android/mlc4j/../../3rdparty/tvm /src/runtime/c_runtime_api.cc' file not found
15 | #include CONCAT(TVM_SOURCE_DIR,/src/runtime/c_runtime_api.cc)
| ^~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~
C:/Users/akula/Desktop/Akul/School_Stuff/Internship/mlc-llm/android/mlc4j/src/cpp/tvm_runtime.h:12:24: note: expanded from macro 'CONCAT'
12 | #define CONCAT(n1, n2) STRINGIFY_MACRO(EXPAND(n1) EXPAND(n2))
| ^~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~
C:/Users/akula/Desktop/Akul/School_Stuff/Internship/mlc-llm/android/mlc4j/src/cpp/tvm_runtime.h:9:28: note: expanded from macro 'STRINGIFY_MACRO'
9 | #define STRINGIFY_MACRO(x) STR(x)
| ^~~~~~
C:/Users/akula/Desktop/Akul/School_Stuff/Internship/mlc-llm/android/mlc4j/src/cpp/tvm_runtime.h:10:16: note: expanded from macro 'STR'
10 | #define STR(x) #x
| ^~
<scratch space>:329:1: note: expanded from here
329 | "C:/Users/akula/Desktop/Akul/School_Stuff/Internship/mlc-llm/android/mlc4j/../../3rdparty/tvm /src/runtime/c_runtime_api.cc"
| ^~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~
1 error generated.
ninja: build stopped: subcommand failed.
Traceback (most recent call last):
File "C:\Users\akula\Desktop\Akul\School_Stuff\Internship\mlc-llm\android\mlc4j\prepare_libs.py", line 120, in
main(parsed.mlc_llm_source_dir)
File "C:\Users\akula\Desktop\Akul\School_Stuff\Internship\mlc-llm\android\mlc4j\prepare_libs.py", line 103, in main
run_cmake_build()
File "C:\Users\akula\Desktop\Akul\School_Stuff\Internship\mlc-llm\android\mlc4j\prepare_libs.py", line 66, in run_cmake_build
subprocess.run(cmd, check=True, env=os.environ)
File "C:\Users\akula\miniconda3\envs\mlc-chat-venv\Lib\subprocess.py", line 571, in run
raise CalledProcessError(retcode, process.args,
subprocess.CalledProcessError: Command '['cmake', '--build', '.', '--target', 'tvm4j_runtime_packed', '--config', 'release', '-j16']' returned non-zero exit status 1.
Traceback (most recent call last):
File "", line 198, in run_module_as_main
File "", line 88, in run_code
File "C:\Users\akula\miniconda3\envs\mlc-chat-venv\Scripts\mlc_llm.exe_main.py", line 7, in
File "C:\Users\akula\miniconda3\envs\mlc-chat-venv\Lib\site-packages\mlc_llm_main.py", line 53, in main
cli.main(sys.argv[2:])
File "C:\Users\akula\miniconda3\envs\mlc-chat-venv\Lib\site-packages\mlc_llm\cli\package.py", line 64, in main
package(
File "C:\Users\akula\miniconda3\envs\mlc-chat-venv\Lib\site-packages\mlc_llm\interface\package.py", line 361, in package
build_android_binding(mlc_llm_source_dir, output)
File "C:\Users\akula\miniconda3\envs\mlc-chat-venv\Lib\site-packages\mlc_llm\interface\package.py", line 275, in build_android_binding
subprocess.run([sys.executable, mlc4j_path / "prepare_libs.py"], check=True, env=os.environ)
File "C:\Users\akula\miniconda3\envs\mlc-chat-venv\Lib\subprocess.py", line 571, in run
raise CalledProcessError(retcode, process.args,
subprocess.CalledProcessError: Command '['C:\Users\akula\miniconda3\envs\mlc-chat-venv\python.exe', WindowsPath('C:/Users/akula/Desktop/Akul/School_Stuff/Internship/mlc-llm/android/mlc4j/prepare_libs.py')]' returned non-zero exit status 1.
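For context on where the stray space in "...3rdparty/tvm /src/runtime/c_runtime_api.cc" comes from: the # operator in STR preserves exactly one space between the two expanded arguments of CONCAT, splitting the include path in two. Below is a minimal, self-contained sketch with a made-up path; the STR/STRINGIFY_MACRO/CONCAT definitions are copied from the clang notes above, and EXPAND is assumed to be the usual identity macro.

#include <cstdio>

#define STR(x) #x
#define STRINGIFY_MACRO(x) STR(x)
#define EXPAND(x) x
#define CONCAT(n1, n2) STRINGIFY_MACRO(EXPAND(n1) EXPAND(n2))

// Stand-in for the -DTVM_SOURCE_DIR=... definition on the compiler command line.
#define TVM_SOURCE_DIR C:/example/mlc-llm/3rdparty/tvm

int main() {
  // Prints: C:/example/mlc-llm/3rdparty/tvm /src/runtime/c_runtime_api.cc
  // Note the space before /src -- the same space that makes the
  // #include in tvm_runtime.h point at a nonexistent file.
  std::puts(CONCAT(TVM_SOURCE_DIR, /src/runtime/c_runtime_api.cc));
  return 0;
}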
Doing so gave the following error:
C:\WINDOWS\system32\cmd.exe /C "cd . && C:\Users\akula\AppData\Local\Android\Sdk\ndk\27.0.11902837\toolchains\llvm\prebuilt\windows-x86_64\bin\clang++.exe --target=aarch64-none-linux-android24 --sysroot=C:/Users/akula/AppData/Local/Android/Sdk/ndk/27.0.11902837/toolchains/llvm/prebuilt/windows-x86_64/sysroot -fPIC -g -DANDROID -fdata-sections -ffunction-sections -funwind-tables -fstack-protector-strong -no-canonical-prefixes -D_FORTIFY_SOURCE=2 -Wformat -Werror=format-security "-O3" -O3 -DNDEBUG -static-libstdc++ -Wl,--build-id=sha1 -Wl,--no-rosegment -Wl,--no-undefined-version -Wl,--fatal-warnings -Wl,--no-undefined -Qunused-arguments -Wl,--gc-sections -shared -Wl,-soname,libtvm4j_runtime_packed.so -o libtvm4j_runtime_packed.so CMakeFiles/tvm4j_runtime_packed.dir/368b657d31d5c8d946e3ffa48aa52ef0/Internship/mlc-llm/3rdparty/tvm/jvm/native/src/main/native/org_apache_tvm_native_c_api.cc.o mlc_llm/tokenizers/libtokenizers_cpp.a -llog -Wl,--whole-archive mlc_llm/libmlc_llm.a lib/libmodel_android.a -Wl,--no-whole-archive mlc_llm/tokenizers/aarch64-linux-android/release/libtokenizers_c.a mlc_llm/tokenizers/sentencepiece/src/libsentencepiece.a -pthread -latomic -lm && cd ."
ld.lld: error: undefined symbol: tvm::runtime::ModuleNode::GetFunction(tvm::runtime::String const&, bool)
referenced by packed_func.h:2136 (C:/Users/akula/Desktop/Akul/School_Stuff/Internship/mlc-llm/3rdparty/tvm/include/tvm/runtime/packed_func.h:2136)
model.cc.o:(mlc::llm::ModelMetadata::FromModule(tvm::runtime::Module, picojson::object_with_ordered_keys const&)) in archive mlc_llm/libmlc_llm.a
referenced by engine.cc:647 (C:/Users/akula/Desktop/Akul/School_Stuff/Internship/mlc-llm/cpp/serve/engine.cc:647)
engine.cc.o:(mlc::llm::serve::EngineImpl::CreateDiscoSession(std::__ndk1::vector<std::__ndk1::basic_string<char, std::__ndk1::char_traits<char>, std::__ndk1::allocator<char>>, std::__ndk1::allocator<std::__ndk1::basic_string<char, std::__ndk1::char_traits<char>, std::__ndk1::allocator<char>>>> const&, std::__ndk1::vector<picojson::object_with_ordered_keys, std::__ndk1::allocator<picojson::object_with_ordered_keys>> const&, DLDevice)::'lambda'(std::__ndk1::basic_string<char, std::__ndk1::char_traits<char>, std::__ndk1::allocator<char>> const&, picojson::object_with_ordered_keys const&)::operator()(std::__ndk1::basic_string<char, std::__ndk1::char_traits<char>, std::__ndk1::allocator<char>> const&, picojson::object_with_ordered_keys const&) const) in archive mlc_llm/libmlc_llm.a
referenced by engine.cc:650 (C:/Users/akula/Desktop/Akul/School_Stuff/Internship/mlc-llm/cpp/serve/engine.cc:650)
engine.cc.o:(mlc::llm::serve::EngineImpl::CreateDiscoSession(std::__ndk1::vector<std::__ndk1::basic_string<char, std::__ndk1::char_traits<char>, std::__ndk1::allocator<char>>, std::__ndk1::allocator<std::__ndk1::basic_string<char, std::__ndk1::char_traits<char>, std::__ndk1::allocator<char>>>> const&, std::__ndk1::vector<picojson::object_with_ordered_keys, std::__ndk1::allocator<picojson::object_with_ordered_keys>> const&, DLDevice)::'lambda'(std::__ndk1::basic_string<char, std::__ndk1::char_traits<char>, std::__ndk1::allocator<char>> const&, picojson::object_with_ordered_keys const&)::operator()(std::__ndk1::basic_string<char, std::__ndk1::char_traits<char>, std::__ndk1::allocator<char>> const&, picojson::object_with_ordered_keys const&) const) in archive mlc_llm/libmlc_llm.a
referenced 21 more times
ld.lld: error: undefined symbol: tvm::runtime::Module::LoadFromFile(tvm::runtime::String const&, tvm::runtime::String const&)
referenced by engine.cc:646 (C:/Users/akula/Desktop/Akul/School_Stuff/Internship/mlc-llm/cpp/serve/engine.cc:646)
engine.cc.o:(mlc::llm::serve::EngineImpl::CreateDiscoSession(std::__ndk1::vector<std::__ndk1::basic_string<char, std::__ndk1::char_traits<char>, std::__ndk1::allocator<char>>, std::__ndk1::allocator<std::__ndk1::basic_string<char, std::__ndk1::char_traits<char>, std::__ndk1::allocator<char>>>> const&, std::__ndk1::vector<picojson::object_with_ordered_keys, std::__ndk1::allocator<picojson::object_with_ordered_keys>> const&, DLDevice)::'lambda'(std::__ndk1::basic_string<char, std::__ndk1::char_traits<char>, std::__ndk1::allocator<char>> const&, picojson::object_with_ordered_keys const&)::operator()(std::__ndk1::basic_string<char, std::__ndk1::char_traits<char>, std::__ndk1::allocator<char>> const&, picojson::object_with_ordered_keys const&) const) in archive mlc_llm/libmlc_llm.a
referenced by function_table.cc:120 (C:/Users/akula/Desktop/Akul/School_Stuff/Internship/mlc-llm/cpp/serve/function_table.cc:120)
function_table.cc.o:(mlc::llm::serve::FunctionTable::Init(tvm::runtime::String, DLDevice, picojson::object_with_ordered_keys, tvm::runtime::Optional<tvm::runtime::Session>, int)) in archive mlc_llm/libmlc_llm.a
referenced by c_runtime_api.cc:489 (C:/Users/akula/Desktop/Akul/School_Stuff/Internship/mlc-llm/android/mlc4j/src/cpp/../../../../3rdparty/tvm/src/runtime/c_runtime_api.cc:489)
CMakeFiles/tvm4j_runtime_packed.dir/368b657d31d5c8d946e3ffa48aa52ef0/Internship/mlc-llm/3rdparty/tvm/jvm/native/src/main/native/org_apache_tvm_native_c_api.cc.o:(TVMModLoadFromFile)
ld.lld: error: undefined symbol: tvm::runtime::ModuleNode::SaveToFile(tvm::runtime::String const&, tvm::runtime::String const&)
referenced by json_ffi_engine.cc
json_ffi_engine.cc.o:(vtable for mlc::llm::json_ffi::JSONFFIEngineImpl) in archive mlc_llm/libmlc_llm.a
referenced by engine.cc
engine.cc.o:(vtable for mlc::llm::serve::EngineModule) in archive mlc_llm/libmlc_llm.a
referenced by threaded_engine.cc
threaded_engine.cc.o:(vtable for mlc::llm::serve::ThreadedEngineModule) in archive mlc_llm/libmlc_llm.a
referenced 5 more times
ld.lld: error: undefined symbol: tvm::runtime::ModuleNode::SaveToBinary(dmlc::Stream*)
referenced by json_ffi_engine.cc
json_ffi_engine.cc.o:(vtable for mlc::llm::json_ffi::JSONFFIEngineImpl) in archive mlc_llm/libmlc_llm.a
referenced by engine.cc
engine.cc.o:(vtable for mlc::llm::serve::EngineModule) in archive mlc_llm/libmlc_llm.a
referenced by threaded_engine.cc
threaded_engine.cc.o:(vtable for mlc::llm::serve::ThreadedEngineModule) in archive mlc_llm/libmlc_llm.a
referenced 5 more times
ld.lld: error: undefined symbol: tvm::runtime::ModuleNode::GetSource(tvm::runtime::String const&)
referenced by json_ffi_engine.cc
json_ffi_engine.cc.o:(vtable for mlc::llm::json_ffi::JSONFFIEngineImpl) in archive mlc_llm/libmlc_llm.a
referenced by engine.cc
engine.cc.o:(vtable for mlc::llm::serve::EngineModule) in archive mlc_llm/libmlc_llm.a
referenced by threaded_engine.cc
threaded_engine.cc.o:(vtable for mlc::llm::serve::ThreadedEngineModule) in archive mlc_llm/libmlc_llm.a
referenced 6 more times
ld.lld: error: undefined symbol: tvm::runtime::ModuleNode::GetFormat()
referenced by json_ffi_engine.cc
json_ffi_engine.cc.o:(vtable for mlc::llm::json_ffi::JSONFFIEngineImpl) in archive mlc_llm/libmlc_llm.a
referenced by engine.cc
engine.cc.o:(vtable for mlc::llm::serve::EngineModule) in archive mlc_llm/libmlc_llm.a
referenced by threaded_engine.cc
threaded_engine.cc.o:(vtable for mlc::llm::serve::ThreadedEngineModule) in archive mlc_llm/libmlc_llm.a
referenced 7 more times
ld.lld: error: undefined symbol: tvm::runtime::ModuleNode::ImplementsFunction(tvm::runtime::String const&, bool)
referenced by json_ffi_engine.cc
json_ffi_engine.cc.o:(vtable for mlc::llm::json_ffi::JSONFFIEngineImpl) in archive mlc_llm/libmlc_llm.a
referenced by engine.cc
engine.cc.o:(vtable for mlc::llm::serve::EngineModule) in archive mlc_llm/libmlc_llm.a
referenced by threaded_engine.cc
threaded_engine.cc.o:(vtable for mlc::llm::serve::ThreadedEngineModule) in archive mlc_llm/libmlc_llm.a
referenced 7 more times
ld.lld: error: undefined symbol: typeinfo for tvm::runtime::ModuleNode
referenced by json_ffi_engine.cc
json_ffi_engine.cc.o:(typeinfo for mlc::llm::json_ffi::JSONFFIEngineImpl) in archive mlc_llm/libmlc_llm.a
referenced by engine.cc
engine.cc.o:(typeinfo for mlc::llm::serve::EngineModule) in archive mlc_llm/libmlc_llm.a
referenced by threaded_engine.cc
threaded_engine.cc.o:(typeinfo for mlc::llm::serve::ThreadedEngineModule) in archive mlc_llm/libmlc_llm.a
referenced 5 more times
ld.lld: error: undefined symbol: tvm::runtime::ModuleNode::Import(tvm::runtime::Module)
referenced by c_runtime_api.cc:499 (C:/Users/akula/Desktop/Akul/School_Stuff/Internship/mlc-llm/android/mlc4j/src/cpp/../../../../3rdparty/tvm/src/runtime/c_runtime_api.cc:499)
CMakeFiles/tvm4j_runtime_packed.dir/368b657d31d5c8d946e3ffa48aa52ef0/Internship/mlc-llm/3rdparty/tvm/jvm/native/src/main/native/org_apache_tvm_native_c_api.cc.o:(TVMModImport)
ld.lld: error: undefined symbol: tvm::runtime::ModuleNode::GetFuncFromEnv(tvm::runtime::String const&)
referenced by c_runtime_api.cc:524 (C:/Users/akula/Desktop/Akul/School_Stuff/Internship/mlc-llm/android/mlc4j/src/cpp/../../../../3rdparty/tvm/src/runtime/c_runtime_api.cc:524)
CMakeFiles/tvm4j_runtime_packed.dir/368b657d31d5c8d946e3ffa48aa52ef0/Internship/mlc-llm/3rdparty/tvm/jvm/native/src/main/native/org_apache_tvm_native_c_api.cc.o:(TVMBackendGetFuncFromEnv)
ld.lld: error: undefined symbol: vtable for tvm::runtime::ModuleNode
referenced by module.h:145 (C:/Users/akula/Desktop/Akul/School_Stuff/Internship/mlc-llm/android/mlc4j/../../3rdparty/tvm/include/tvm/runtime/module.h:145)
CMakeFiles/tvm4j_runtime_packed.dir/368b657d31d5c8d946e3ffa48aa52ef0/Internship/mlc-llm/3rdparty/tvm/jvm/native/src/main/native/org_apache_tvm_native_c_api.cc.o:(tvm::runtime::ModuleNode::~ModuleNode())
referenced by module.h:145 (C:/Users/akula/Desktop/Akul/School_Stuff/Internship/mlc-llm/android/mlc4j/../../3rdparty/tvm/include/tvm/runtime/module.h:145)
CMakeFiles/tvm4j_runtime_packed.dir/368b657d31d5c8d946e3ffa48aa52ef0/Internship/mlc-llm/3rdparty/tvm/jvm/native/src/main/native/org_apache_tvm_native_c_api.cc.o:(tvm::runtime::ModuleNode::~ModuleNode())
the vtable symbol may be undefined because the class is missing its key function (see https://lld.llvm.org/missingkeyfunction)
clang++: error: linker command failed with exit code 1 (use -v to see invocation)
ninja: build stopped: subcommand failed.
Traceback (most recent call last):
File "C:\Users\akula\Desktop\Akul\School_Stuff\Internship\mlc-llm\android\mlc4j\prepare_libs.py", line 120, in
main(parsed.mlc_llm_source_dir)
File "C:\Users\akula\Desktop\Akul\School_Stuff\Internship\mlc-llm\android\mlc4j\prepare_libs.py", line 103, in main
run_cmake_build()
File "C:\Users\akula\Desktop\Akul\School_Stuff\Internship\mlc-llm\android\mlc4j\prepare_libs.py", line 66, in run_cmake_build
subprocess.run(cmd, check=True, env=os.environ)
File "C:\Users\akula\miniconda3\envs\mlc-chat-venv\Lib\subprocess.py", line 571, in run
raise CalledProcessError(retcode, process.args,
subprocess.CalledProcessError: Command '['cmake', '--build', '.', '--target', 'tvm4j_runtime_packed', '--config', 'release', '-j16']' returned non-zero exit status 1.
Traceback (most recent call last):
File "", line 198, in run_module_as_main
File "", line 88, in run_code
File "C:\Users\akula\miniconda3\envs\mlc-chat-venv\Scripts\mlc_llm.exe_main.py", line 7, in
File "C:\Users\akula\miniconda3\envs\mlc-chat-venv\Lib\site-packages\mlc_llm_main.py", line 53, in main
cli.main(sys.argv[2:])
File "C:\Users\akula\miniconda3\envs\mlc-chat-venv\Lib\site-packages\mlc_llm\cli\package.py", line 64, in main
package(
File "C:\Users\akula\miniconda3\envs\mlc-chat-venv\Lib\site-packages\mlc_llm\interface\package.py", line 361, in package
build_android_binding(mlc_llm_source_dir, output)
File "C:\Users\akula\miniconda3\envs\mlc-chat-venv\Lib\site-packages\mlc_llm\interface\package.py", line 275, in build_android_binding
subprocess.run([sys.executable, mlc4j_path / "prepare_libs.py"], check=True, env=os.environ)
File "C:\Users\akula\miniconda3\envs\mlc-chat-venv\Lib\subprocess.py", line 571, in run
raise CalledProcessError(retcode, process.args,
subprocess.CalledProcessError: Command '['C:\Users\akula\miniconda3\envs\mlc-chat-venv\python.exe', WindowsPath('C:/Users/akula/Desktop/Akul/School_Stuff/Internship/mlc-llm/android/mlc4j/prepare_libs.py')]' returned non-zero exit status 1.
I think I will try using an older wheel and see if that allows compiling. Thank you @MasterJH5574 for your prompt response, I really appreciate it!
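For background on the final ld.lld note above: with the Itanium C++ ABI, the vtable and typeinfo of a class like tvm::runtime::ModuleNode are emitted in the translation unit that defines its key function, so the undefined vtable, typeinfo, and out-of-line method symbols all suggest the same thing: the TVM runtime source that defines them never made it into the objects being linked. A minimal sketch of this error class, with hypothetical names:

// module_like.h (hypothetical): virtual methods declared here but defined
// in module_like.cc. The first out-of-line virtual method is the "key
// function"; its translation unit is where the vtable gets emitted.
struct ModuleNodeLike {
  virtual ~ModuleNodeLike();
  virtual int GetFormat();
};

// main.cc
int main() {
  ModuleNodeLike m;      // constructing m needs the vtable...
  return m.GetFormat();  // ...and the out-of-line definitions
}
// If module_like.cc is never compiled into the link, ld.lld reports the
// same shape of error as in the log above:
//   undefined symbol: vtable for ModuleNodeLike
//   undefined symbol: ModuleNodeLike::GetFormat()
//   undefined symbol: ModuleNodeLike::~ModuleNodeLike()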
@AkulRT Thanks for the swift response. Yeah, you can just try checking out an older commit prior to fbb6a48 (since that commit introduced the issue). Meanwhile, we will try to look into it.