
midori-ai's Issues

SSL - issue when installing InvokeAI

Hi,

Thanks for creating a great installer!

When trying to install InvokeAI in the subsystem, I get the error below after running ./install.sh from the subsystem command line. It downloads all the wheel files (from InvokeAI) with no problem, then asks me if I want an automatic install (yes, I press a), runs for a bit, but then crashes out as shown below:

Checking Name: anto-midori_ai_subsystem-1, ID: [REDACTED]
Found subsystem, logging into: anto-midori_ai_subsystem-1 / [REDACTED]
Entering subsystem shell! Type ``Exit`` to exit...
------------------------------------------
Press enter to start the shell...
               ^^^^^^^^^^^^^^^^^^
  File "/usr/lib/python3.11/urllib/request.py", line 557, in error
    result = self._call_chain(*args)
             ^^^^^^^^^^^^^^^^^^^^^^^
  File "/usr/lib/python3.11/urllib/request.py", line 496, in _call_chain
    result = func(*args)
             ^^^^^^^^^^^
  File "/usr/lib/python3.11/urllib/request.py", line 749, in http_error_302
    return self.parent.open(new, timeout=req.timeout)
           ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "/usr/lib/python3.11/urllib/request.py", line 519, in open
    response = self._open(req, data)
               ^^^^^^^^^^^^^^^^^^^^^
  File "/usr/lib/python3.11/urllib/request.py", line 536, in _open
    result = self._call_chain(self.handle_open, protocol, protocol +
             ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "/usr/lib/python3.11/urllib/request.py", line 496, in _call_chain
    result = func(*args)
             ^^^^^^^^^^^
  File "/usr/lib/python3.11/urllib/request.py", line 1391, in https_open
    return self.do_open(http.client.HTTPSConnection, req,
           ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "/usr/lib/python3.11/urllib/request.py", line 1351, in do_open
    raise URLError(err)
urllib.error.URLError: <urlopen error [SSL: CERTIFICATE_VERIFY_FAILED] certificate verify failed: unable to get local issuer certificate (_ssl.c:992)>

[2024-03-11 18:18:02,434]::[InvokeAI]::INFO --> Downloading core tokenizers and text encoders
[2024-03-11 18:18:02,818]::[InvokeAI]::ERROR --> (MaxRetryError("HTTPSConnectionPool(host='huggingface.co', port=443): Max retries exceeded with url: /bert-base-uncased/resolve/main/tokenizer_config.json (Caused by SSLError(SSLCertVerificationError(1, '[SSL: CERTIFICATE_VERIFY_FAILED] certificate verify failed: unable to get local issuer certificate (_ssl.c:992)')))"), '(Request ID: [REDACTED])')
[2024-03-11 18:18:03,092]::[InvokeAI]::INFO --> Scanning /root/invokeai/data/models for new models
[2024-03-11 18:18:03,173]::[InvokeAI]::INFO --> Scanned 9 files and directories, imported 0 models
[2024-03-11 18:18:03,234]::[InvokeAI]::INFO --> Installing InvokeAI/ip_adapter_sd15 [1/10]

A network error was encountered during configuration and download: (MaxRetryError("HTTPSConnectionPool(host='huggingface.co', port=443): Max retries exceeded with url: /api/models/InvokeAI/ip_adapter_sd15 (Caused by SSLError(SSLCertVerificationError(1, '[SSL: CERTIFICATE_VERIFY_FAILED] certificate verify failed: unable to get local issuer certificate (_ssl.c:992)')))"), '(Request ID: [REDACTED])')
To try again, find the "invokeai" directory, run the script "invoke.sh" or "invoke.bat"
and choose option 7 to fix a broken install, optionally followed by option 5 to install models.
Alternatively you can relaunch the installer.
root@ae600451a535:/app/files/invokeai/InvokeAI-Installer#
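The `CERTIFICATE_VERIFY_FAILED: unable to get local issuer certificate` errors above usually mean Python inside the container can't find a system CA bundle. A minimal diagnostic sketch (standard library only; this is not part of the installer itself):

```python
import ssl

# Where this Python build expects to find CA certificates.
paths = ssl.get_default_verify_paths()
print("cafile:", paths.cafile)   # e.g. /etc/ssl/certs/ca-certificates.crt
print("capath:", paths.capath)

# If both are None/missing inside the container, installing the distro's
# ca-certificates package (and running update-ca-certificates), or exporting
# SSL_CERT_FILE to point at a valid bundle, should clear the
# CERTIFICATE_VERIFY_FAILED errors shown above.
```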

Use Midori AI Manager with glibc 2.28

I have a machine that must stay on glibc 2.28 (Rocky Linux 8.x) for a while, to remain compatible with other software already on it. This is what happens if you try to install the AI Manager on Rocky 8.x:

curl -sSL https://raw.githubusercontent.com/lunamidori5/Midori-AI/master/other_files/model_installer/model_installer.sh | sh

[3500081] Error loading Python lib '/tmp/_MEIVh5PmX/libpython3.9.so.1.0': dlopen: /lib64/libm.so.6: version `GLIBC_2.29' not found (required by /tmp/_MEIVh5PmX/libpython3.9.so.1.0)

With that said, is there any way to use the AI Manager on that system? (The manual route seems to work, but I really want the manager.)
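For context, a quick way to check which glibc the host provides before running the installer (a diagnostic sketch, not part of the installer):

```python
import platform

# Report the C library the running interpreter was linked against.
# On Rocky Linux 8.x, libc_ver() returns ('glibc', '2.28'), which is
# below the 2.29 the prebuilt manager binary requires.
lib, version = platform.libc_ver()
print(lib, version)
```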

Thanks for your time.

Net Nut

[BUG] missed mounts

Describe the bug
Running the Midori Subsystem installer and following the guide, the LocalAI backend install fails.

Screenshots
[screenshot attached in original issue]

for Unraid the mounts should be:

/mnt/user/appdata/Midori/audio
/mnt/user/appdata/Midori/images
/mnt/user/appdata/Midori/models
/mnt/user/appdata/Midori/files

Host Info (please complete the following information):

  • OS: Unraid using Docker

Additional context

2024-06-02 05:06:55 (21.3 MB/s) - '.env' saved [3463/3463]
Running mv docker-compose.yaml /app/files/localai/docker-compose.yaml
Running mv .env /app/files/localai/.env
Running docker compose -f ./files/localai/docker-compose.yaml down --rmi all
Running echo "LocalAI GPU is installed and running... Please wait 30mins before using or restarting"
LocalAI GPU is installed and running... Please wait 30mins before using or restarting
Running: docker compose -f ./files/localai/docker-compose.yaml up -d
stat /app/files/localai/docker-compose.yaml: no such file or directory
We are running localai on 38080
Please press enter to go back to the main menu: 
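The `no such file or directory` failure above suggests the compose file never landed where the install step expected it. A small sketch for checking that the Unraid appdata paths listed above exist before starting the backend (the helper name is made up for illustration; it is not part of the subsystem):

```python
from pathlib import Path

def missing_mounts(base):
    """Return the expected Midori subdirectories missing under *base*."""
    expected = ("audio", "images", "models", "files")
    base = Path(base)
    return [d for d in expected if not (base / d).is_dir()]

# On Unraid, per the list above:
print(missing_mounts("/mnt/user/appdata/Midori"))
```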

[BUG] Error: No module named 'support'

Describe the bug
I just run quick install script:
curl -sSL https://raw.githubusercontent.com/lunamidori5/Midori-AI/master/other_files/model_installer/shell_files/model_installer.sh > model_installer.sh && bash ./model_installer.sh

And got the following output:
Traceback (most recent call last):
  File "subsystem_manager.py", line 4, in <module>
ModuleNotFoundError: No module named 'support'
[290617] Failed to execute script 'subsystem_manager' due to unhandled exception!

To Reproduce
Steps to reproduce the behavior:

  • Run the script
  • Get the error

Expected behavior
It should install without errors.

Host Info (please complete the following information):

  • OS: Ubuntu
  • Version: 24.04

Doesn't work with podman.

I can't use podman instead of docker. Despite having set up the docker compatibility options and environment variables to redirect docker calls to podman, whenever I run the installer script I get an error saying it can't find docker.

To Reproduce
Running either the quick script or the manual setup as instructed for Linux from the subsystem website

Expected behavior
The installer would send a command to docker which podman should receive and perform.

Host Info:

  • OS: Garuda-Linux
  • Version: newest, rolling release
  • Subsystem Manager Version: attempted just yesterday as of writing this so I'm assuming latest.

Additional context
Not sure if I've set up the docker CLI compatibility 100% correctly; however, when I've run other docker commands to install an image (the one for the standalone local-ai, for example), it works just fine using podman as if it were docker.
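The installer presumably looks for a `docker` binary directly rather than going through the shell aliases podman-docker provides. A hedged sketch of detecting either engine on PATH (the fallback logic here is illustrative, not what the installer actually does):

```python
import shutil

def find_container_engine():
    """Prefer docker; fall back to podman if only that is on PATH."""
    for name in ("docker", "podman"):
        path = shutil.which(name)
        if path:
            return name, path
    return None, None

engine, path = find_container_engine()
print(engine or "no container engine found")
```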

[FEATURE] Incorporate Fabric from Daniel Miessler into Subsystem

Is your feature request related to a problem? Please describe.

Add Daniel Miessler's Fabric AI project into the subsystem: https://github.com/danielmiessler/fabric

Describe the solution you'd like

Have the subsystem handle the installation of Fabric and incorporate into the front end interfaces (big-agi, anythingllm, etc)

Describe alternatives you've considered

I've installed so many I can't keep track. Luna's subsystem is awesome.


Unexpected exception when launching subsystem

Describe the bug
On running the subsystem manager to check the version, I got this error.

To Reproduce
D:\Local AI>subsystem_manager.bat
Could Not Find D:\Local AI\model_installer.zip
Could Not Find D:\Local AI\model_installer.bat
Could Not Find D:\Local AI\model_installer.exe
Could Not Find D:\Local AI\restart.bat
Traceback (most recent call last):
  File "subsystem_manager.py", line 109, in
  File "support.py", line 330, in data_helper_python
FileNotFoundError: [Errno 2] No such file or directory: '192.168.1.54_log_05222024.txt'
[31376] Failed to execute script 'subsystem_manager' due to unhandled exception!
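The traceback suggests `data_helper_python` in support.py opens a per-host log file that was never created. A defensive sketch of opening such a file (the function name below is made up for illustration; the body is an assumption, not the subsystem's actual implementation):

```python
from pathlib import Path

def open_log(log_name):
    """Open a log file for appending, creating it (and any parent dirs) if absent."""
    path = Path(log_name)
    if path.parent != Path("."):
        path.parent.mkdir(parents=True, exist_ok=True)
    path.touch(exist_ok=True)
    return path.open("a", encoding="utf-8")

# e.g. open_log("192.168.1.54_log_05222024.txt") would no longer raise
# FileNotFoundError when the file does not exist yet.
```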

Host Info (please complete the following information):

  • OS: Windows
  • Version: 11
  • Subsystem Manager Version: [e.g. 24.2.24.1]


[BUG] Subsystem_manager can't be found

Subsystem_manager disappears from the folder during installation, and the installer then reports that it can't find it.

To Reproduce
I extracted the model installer for Windows, opened the new folder, and ran subsystem_manager. A SmartScreen warning pops up saying the app is unrecognized and it isn't sure the app is safe to run. I select run anyway. Subsystem_manager then disappears from the folder, and it says it cannot find the file.

Expected behavior
The program to install.

Host Info (please complete the following information):

  • OS: Windows 11
  • Version: not sure
  • Subsystem Manager Version: whatever the newest one is today

AutoGPT with LocalAI

Describe the bug
Connecting AutoGPT to the LocalAI midori-ai-backend-gpu-1 container works, and when using the serve function the UI spins up, but there is an issue with creating the agent. I have also posted this on the AutoGPT issue tracker.

To Reproduce
Steps to reproduce the behavior:
With the .env file for AutoGPT pointed at the local docker instance of LocalAI (/v1), the system connects but then doesn't seem to initialise the agent, possibly because of the model naming convention; I can't tell.
D:\Alex\Auto-GPT\autogpts\autogpt>poetry run python -m autogpt serve
2024-05-22 13:31:05,301 INFO HTTP Request: GET http://localhost:38080/v1/models "HTTP/1.1 200 OK"
2024-05-22 13:31:05,305 WARNING You don't have access to gpt-3.5-turbo. Setting fast_llm to OpenAIModelName.GPT3_ROLLING.
2024-05-22 13:31:05,548 INFO HTTP Request: GET http://localhost:38080/v1/models "HTTP/1.1 200 OK"
2024-05-22 13:31:05,552 WARNING You don't have access to gpt-4-turbo. Setting smart_llm to OpenAIModelName.GPT3_ROLLING.
2024-05-22 13:31:05,837 INFO AutoGPT server starting on http://localhost:8000
[2024-05-22 13:32:07,032] [forge.sdk.routes.agent_protocol] [ERROR] ❌ Error whilst trying to execute a task step: 7b3e59fa-52c1-41a7-9c55-adf1a410ae30
Traceback (most recent call last):
  File "D:\Alex\Auto-GPT\autogpts\forge\forge\sdk\routes\agent_protocol.py", line 398, in execute_agent_task_step
    step = await agent.execute_step(task_id, step)
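AutoGPT discovers models through the OpenAI-compatible `/v1/models` route, which the log above shows returning 200. A standalone sketch of that same request against the LocalAI port from the log (standard library only; not AutoGPT's actual client code):

```python
import json
import urllib.request

def list_models(base_url="http://localhost:38080/v1"):
    """Return the model ids reported by an OpenAI-compatible endpoint."""
    with urllib.request.urlopen(f"{base_url}/models") as resp:
        payload = json.load(resp)
    return [m["id"] for m in payload.get("data", [])]

# If none of the returned ids match the gpt-3.5-turbo / gpt-4-turbo names
# AutoGPT looks for, it falls back as the warnings above show, and agent
# creation can then fail on the unknown model name.
```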

Expected behavior
The AutoGPT comments and threads say that after changing the endpoint URL, chat should work as normal.


Host Info (please complete the following information):

  • OS: Windows (Docker)
  • Version: 11 (Docker)
  • Subsystem Manager Version:

