
Comments (4)

MBaldo83 commented on June 29, 2024

Hey @jtresko, did you ever find a fix for this? I'm just trying to get the migration up and running and I'm hitting the same error. Is there a known workaround?
Thanks in advance :-)

from gpt-migrate.

jtresko commented on June 29, 2024

No, I've sort of abandoned this one; I was just curious about the capability. A critical piece seems to be making sure Tree-sitter has definitions for both languages, though I'm not sure whether that's where this error comes from.

It worked when I tested something like Python to Node.js, I believe.

mjroeleveld commented on June 29, 2024

Also getting this error. This repo seems useless now. @joshpxyne FYI

windowshopr commented on June 29, 2024

Yeah, I've got the same issue trying to run the ollama/llama3 model on Windows 10 with Python 3.11:

│ I:\nasty\Python_Projects\LLM\gpt_migrate\gpt_migrate\main.py:100 in main                         │
│                                                                                                  │
│    97 │   │   │   file_name = write_migration(sourcefile, external_deps_list, target_deps_per_   │
│    98 │   │   │   target_deps_per_file[parent_file].append(file_name)                            │
│    99 │   │                                                                                      │
│ > 100 │   │   migrate(sourceentry, globals)                                                      │
│   101 │   │   add_env_files(globals)                                                             │
│   102 │                                                                                          │
│   103 │   ''' 3. Testing '''                                                                     │
│                                                                                                  │
│ ┌─────────────────────────────────────────── locals ───────────────────────────────────────────┐ │
│ │                         ai = <ai.AI object at 0x00000210E5A7CD10>                            │ │
│ │          detected_language = None                                                            │ │
│ │                    globals = <__main__.Globals object at 0x00000210E5976BD0>                 │ │
│ │                 guidelines = ''                                                              │ │
│ │                    migrate = <function main.<locals>.migrate at 0x00000210E3311BC0>          │ │
│ │                      model = 'ollama/llama3'                                                 │ │
│ │           operating_system = 'linux'                                                         │ │
│ │ source_directory_structure = '        ├── .github/\n            │   ├── ISSUE_TEMPLATE/\n    │ │
│ │                              │   │  '+2224752                                                │ │
│ │                  sourcedir = 'C:\\Users\\chalu\\Desktop\\myapp-17.0'                        │ │
│ │                sourceentry = 'C:\\Users\\chalu\\Desktop\\myapp-17.0\\myapp\\__main__.py'       │ │
│ │                 sourcelang = 'python'                                                        │ │
│ │                 sourceport = None                                                            │ │
│ │                       step = 'all'                                                           │ │
│ │       target_deps_per_file = defaultdict(<class 'list'>, {})                                 │ │
│ │                  targetdir = 'C:\\Users\\chalu\\Desktop\\myappGolang'                         │ │
│ │                 targetlang = 'golang'                                                        │ │
│ │                 targetport = 8080                                                            │ │
│ │                temperature = 0.0                                                             │ │
│ │                  testfiles = 'app.py'                                                        │ │
│ └──────────────────────────────────────────────────────────────────────────────────────────────┘ │
│                                                                                                  │
│ I:\nasty\Python_Projects\LLM\gpt_migrate\gpt_migrate\main.py:94 in migrate                       │
│                                                                                                  │
│    91 │   │   target_deps_per_file = defaultdict(list)                                           │
│    92 │   │   def migrate(sourcefile, globals, parent_file=None):                                │
│    93 │   │   │   # recursively work through each of the files in the source directory, starti   │
│ >  94 │   │   │   internal_deps_list, external_deps_list = get_dependencies(sourcefile=sourcef   │
│    95 │   │   │   for dependency in internal_deps_list:                                          │
│    96 │   │   │   │   migrate(dependency, globals, parent_file=sourcefile)                       │
│    97 │   │   │   file_name = write_migration(sourcefile, external_deps_list, target_deps_per_   │
│                                                                                                  │
│ ┌───────────────────────────────────── locals ─────────────────────────────────────┐             │
│ │              globals = <__main__.Globals object at 0x00000210E5976BD0>           │             │
│ │              migrate = <function main.<locals>.migrate at 0x00000210E3311BC0>    │             │
│ │          parent_file = None                                                      │             │
│ │           sourcefile = 'C:\\Users\\chalu\\Desktop\\myapp-17.0\\myapp\\__main__.py' │             │
│ │ target_deps_per_file = defaultdict(<class 'list'>, {})                           │             │
│ └──────────────────────────────────────────────────────────────────────────────────┘             │
│                                                                                                  │
│ I:\nasty\Python_Projects\LLM\gpt_migrate\gpt_migrate\steps\migrate.py:58 in get_dependencies     │
│                                                                                                  │
│    55 │   │   │   │   │   │   │   │   │   │   │   │   │   sourcelang=globals.sourcelang,         │
│    56 │   │   │   │   │   │   │   │   │   │   │   │   │   sourcefile_content=sourcefile_conten   │
│    57 │                                                                                          │
│ >  58 │   external_dependencies = llm_run(prompt,                                                │
│    59 │   │   │   │   │   │   │   waiting_message=f"Identifying external dependencies for {sou   │
│    60 │   │   │   │   │   │   │   success_message=None,                                          │
│    61 │   │   │   │   │   │   │   globals=globals)                                               │
│                                                                                                  │
│ ┌─────────────────────────────────────────── locals ───────────────────────────────────────────┐ │
│ │ external_deps_prompt_template = 'The following prompt is a composition of prompt sections,   │ │
│ │                                 each with different pr'+1799                                 │ │
│ │                          file = <_io.TextIOWrapper                                           │ │
│ │                                 name='C:\\Users\\chalu\\Desktop\\myapp-17.0\\myapp\\__main__.… │ │
│ │                                 mode='r' encoding='cp1252'>                                  │ │
│ │                       globals = <__main__.Globals object at 0x00000210E5976BD0>              │ │
│ │ internal_deps_prompt_template = 'The following prompt is a composition of prompt sections,   │ │
│ │                                 each with different pr'+2207                                 │ │
│ │                        prompt = 'The following prompt is a composition of prompt sections,   │ │
│ │                                 each with different pr'+1775                                 │ │
│ │                    sourcefile = 'C:\\Users\\chalu\\Desktop\\myapp-17.0\\myapp\\__main__.py'    │ │
│ │            sourcefile_content = 'from .cli.command import main\n\nmain()\n'                  │ │
│ └──────────────────────────────────────────────────────────────────────────────────────────────┘ │
│                                                                                                  │
│ I:\nasty\Python_Projects\LLM\gpt_migrate\gpt_migrate\utils.py:39 in llm_run                      │
│                                                                                                  │
│    36 │                                                                                          │
│    37 │   output = ""                                                                            │
│    38 │   with yaspin(text=waiting_message, spinner="dots") as spinner:                          │
│ >  39 │   │   output = globals.ai.run(prompt)                                                    │
│    40 │   │   spinner.ok("✅ ")                                                                  │
│    41 │                                                                                          │
│    42 │   if success_message:                                                                    │
│                                                                                                  │
│ ┌─────────────────────────────────────────── locals ───────────────────────────────────────────┐ │
│ │         globals = <__main__.Globals object at 0x00000210E5976BD0>                            │ │
│ │          output = ''                                                                         │ │
│ │          prompt = 'The following prompt is a composition of prompt sections, each with       │ │
│ │                   different pr'+1775                                                         │ │
│ │         spinner = <Yaspin frames=⠋⠙⠹⠸⠼⠴⠦⠧⠇⠏>                                                 │ │
│ │ success_message = None                                                                       │ │
│ │ waiting_message = 'Identifying external dependencies for                                     │ │
│ │                   C:\\Users\\chalu\\Desktop\\myapp-17.0\\myapp\\__ma'+10                       │ │
│ └──────────────────────────────────────────────────────────────────────────────────────────────┘ │
│                                                                                                  │
│ I:\nasty\Python_Projects\LLM\gpt_migrate\gpt_migrate\ai.py:49 in run                             │
│                                                                                                  │
│   46 │   │   for chunk in response:                                                              │
│   47 │   │   │   delta = chunk["choices"][0]["delta"]                                            │
│   48 │   │   │   msg = delta.get("content", "")                                                  │
│ > 49 │   │   │   chat += msg                                                                     │
│   50 │   │   return chat                                                                         │
│   51                                                                                             │
│   52                                                                                             │
│                                                                                                  │
│ ┌─────────────────────────────────────────── locals ───────────────────────────────────────────┐ │
│ │     chat = 'logrus,github.com/spf13/cobra,github.com/google/go-cmp/matchers,github.com/goog… │ │
│ │    chunk = ModelResponse(                                                                    │ │
│ │            │   id='chatcmpl-0022f59d-3cf6-4202-8205-3dfbe33133a1',                           │ │
│ │            │   choices=[                                                                     │ │
│ │            │   │   StreamingChoices(                                                         │ │
│ │            │   │   │   finish_reason='stop',                                                 │ │
│ │            │   │   │   index=0,                                                              │ │
│ │            │   │   │   delta=Delta(                                                          │ │
│ │            │   │   │   │   content=None,                                                     │ │
│ │            │   │   │   │   role=None,                                                        │ │
│ │            │   │   │   │   function_call=None,                                               │ │
│ │            │   │   │   │   tool_calls=None                                                   │ │
│ │            │   │   │   ),                                                                    │ │
│ │            │   │   │   logprobs=None                                                         │ │
│ │            │   │   )                                                                         │ │
│ │            │   ],                                                                            │ │
│ │            │   created=1715994803,                                                           │ │
│ │            │   model='llama3',                                                               │ │
│ │            │   object='chat.completion.chunk',                                               │ │
│ │            │   system_fingerprint=None                                                       │ │
│ │            )                                                                                 │ │
│ │    delta = Delta(content=None, role=None, function_call=None, tool_calls=None)               │ │
│ │  message = [                                                                                 │ │
│ │            │   {                                                                             │ │
│ │            │   │   'role': 'user',                                                           │ │
│ │            │   │   'content': 'The following prompt is a composition of prompt sections,     │ │
│ │            each with different pr'+1775                                                      │ │
│ │            │   }                                                                             │ │
│ │            ]                                                                                 │ │
│ │      msg = None                                                                              │ │
│ │   prompt = 'The following prompt is a composition of prompt sections, each with different    │ │
│ │            pr'+1775                                                                          │ │
│ │ response = <generator object ollama_completion_stream at 0x00000210E5A38460>                 │ │
│ │     self = <ai.AI object at 0x00000210E5A7CD10>                                              │ │
│ └──────────────────────────────────────────────────────────────────────────────────────────────┘ │
└──────────────────────────────────────────────────────────────────────────────────────────────────┘
TypeError: can only concatenate str (not "NoneType") to str
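Looking at the locals in the last frame, the failure seems clear: the final streaming chunk (the one with `finish_reason='stop'`) carries `delta` with `content=None`, so `delta.get("content", "")` returns `None` rather than `""` (the key exists, its value is just `None`), and `chat += msg` then raises. A minimal sketch of a guard, using plain dicts shaped like the chunks above (the `accumulate` helper is hypothetical, not the repo's API):

```python
def accumulate(chunks):
    """Concatenate streamed content, tolerating chunks where content is None."""
    chat = ""
    for chunk in chunks:
        delta = chunk["choices"][0]["delta"]
        # `get` with a default does NOT help when the key is present with
        # value None, so coerce explicitly: None -> "".
        msg = delta.get("content") or ""
        chat += msg
    return chat
```

The same one-line change (`delta.get("content") or ""` instead of `delta.get("content", "")`) in `ai.py:48` should let the stream finish cleanly.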
