
marvin's People

Contributors

aaazzam, abrookins, chrisguidry, cicdw, dependabot[bot], dev-khant, discdiver, geminixiang, jamesflores, jamiezieziula, jeremy-feng, jlowin, ksallee, lmmx, lucemia, maccam912, maisim, oaustegard, phodaie, polacekpavel, richardscottoz, roansong, roboojack, salman1993, sarahmk125, thanseefpp, vickyliin, wybryan, yasyf, zzstoatzz


marvin's Issues

Add optional memoization for AI Functions

Add optional memoization for AI functions to avoid repeated calls with the same inputs. This won't always be desirable, but for well-structured functions with more deterministic outputs it could be.
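A minimal sketch of what opt-in memoization could look like (the `memoize` helper and its JSON-based cache key are hypothetical, not Marvin's API):

```python
import functools
import json

def memoize(fn):
    """Cache results keyed by the JSON-serialized arguments, so repeated
    calls with identical inputs skip the (expensive) LLM round trip."""
    cache = {}

    @functools.wraps(fn)
    def wrapper(*args, **kwargs):
        key = json.dumps([args, kwargs], sort_keys=True, default=str)
        if key not in cache:
            cache[key] = fn(*args, **kwargs)
        return cache[key]

    return wrapper

calls = []

@memoize
def fake_ai_fn(text: str) -> str:
    calls.append(text)  # stand-in for an LLM call
    return text.upper()

fake_ai_fn("hello")
fake_ai_fn("hello")  # second call is served from the cache
print(len(calls))    # -> 1
```

Since the same inputs can legitimately produce different outputs from an LLM, this would need to stay opt-in, as the issue notes.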

Print the model used in all cases, not just with 3.5

First check

  • I added a descriptive title to this issue.
  • I used the GitHub search to look for a similar issue and didn't find it.
  • I searched the Marvin documentation for this feature.

Describe the current behavior

When GPT-3.5 is used, that information is printed. If GPT-4 is used, that info isn't printed.

Describe the proposed behavior

Print the model in all cases, as it's helpful for the user and we plan to add more models.

Example Use

No response

Additional context

No response

Prompting for "True" and "False" (capitalized) can break bool returns

@ai_fn
def classify_sentiment(tweets: list[str]) -> list[bool]:
    """
    Given a list of tweets, classifies each one as 
    positive (True) or negative (False) and returns 
    a corresponding list
    """

classify_sentiment(['i love pizza', 'i hate pizza'])

This errors, but changing the prompts to "true" and "false" (lowercase) works.
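The failure is presumably the JSON parse: `True`/`False` are Python literals, not valid JSON. One workaround sketch is to normalize such tokens before parsing (the `parse_llm_json` helper is hypothetical, not part of Marvin):

```python
import json
import re

_TOKEN_MAP = {"True": "true", "False": "false", "None": "null"}

def parse_llm_json(raw: str):
    """Rewrite bare Python-style literals into their JSON equivalents
    before parsing. Note this naive substitution can also touch tokens
    inside quoted strings, so it is fine for lists of booleans but not
    a general-purpose fix."""
    fixed = re.sub(r"\b(True|False|None)\b",
                   lambda m: _TOKEN_MAP[m.group(1)], raw)
    return json.loads(fixed)

print(parse_llm_json("[True, False]"))  # -> [True, False]
```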

OSError: could not get source code

I am getting this error:

root@2f43e901cf78:/# python
Python 3.10.10 (main, Mar 23 2023, 14:25:51) [GCC 10.2.1 20210110] on linux
Type "help", "copyright", "credits" or "license" for more information.
>>> from marvin import ai_fn
[03/31/23 08:59:34] INFO     marvin.marvin: Using OpenAI model "gpt-3.5-turbo"                                                                             logging.py:50
>>> @ai_fn
... def list_fruits(n):
...     "generate a list of n fruits"
...
>>> list_fruits(5)
Traceback (most recent call last):
  File "<stdin>", line 1, in <module>
  File "/usr/local/lib/python3.10/site-packages/marvin/bots/ai_functions.py", line 141, in ai_fn_wrapper
    function_def = inspect.cleandoc(inspect.getsource(fn))
  File "/usr/local/lib/python3.10/inspect.py", line 1139, in getsource
    lines, lnum = getsourcelines(object)
  File "/usr/local/lib/python3.10/inspect.py", line 1121, in getsourcelines
    lines, lnum = findsource(object)
  File "/usr/local/lib/python3.10/inspect.py", line 958, in findsource
    raise OSError('could not get source code')
OSError: could not get source code

Upgrading to 0.7.4 leads to database error when using `marvin chat`

context:

Name Version Build Channel

marvin 0.7.1 pypi_0 pypi

same with 0.7.4

marvin chat

results in the TUI displaying, and then a very long error message that concludes with the following:

...
OperationalError: (sqlite3.OperationalError) no such column: bot_config.created_at
[SQL: SELECT bot_config.created_at, bot_config.updated_at, bot_config.plugins, bot_config.input_transformers, 
bot_config.id, bot_config.name, bot_config.personality, bot_config.instructions, 
bot_config.instructions_template, bot_config.description, bot_config.profile_picture 
FROM bot_config 
WHERE bot_config.name = ?
 LIMIT ? OFFSET ?]
[parameters: ('Marvin', 1, 0)]
(Background on this error at: https://sqlalche.me/e/14/e3q8)

Running a script with an AI function completes with the correct output, but also throws the error:

sqlalchemy.exc.OperationalError: (sqlite3.OperationalError) no such column: bot_config.created_at
[SQL: SELECT bot_config.created_at, bot_config.updated_at, bot_config.plugins, bot_config.input_transformers, bot_config.id, bot_config.name, bot_config.personality, bot_config.instructions, bot_config.instructions_template, bot_config.description, bot_config.profile_picture 
FROM bot_config 
WHERE bot_config.name = ?
 LIMIT ? OFFSET ?]
[parameters: ('DuetBot', 1, 0)]
(Background on this error at: https://sqlalche.me/e/14/e3q8)

Include some plugin output in memory

To help avoid hallucinated links, in situations like:

user: common pitfalls running prefect on ECS?

marvin: there's a discourse post about this! (not lying here, I can see it in the DEBUG: Chroma plugin output)

user: cool, please provide a link

marvin: sure! here's a totally fake link that looks real 

we may want to include plugin output in bot.history. However, to avoid eating too much of our context window, we should probably summarize the body of the plugin output while carrying forward any links present in the raw plugin output.
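The summarize-but-keep-links idea could look something like this sketch (`condense_plugin_output` is a hypothetical helper; real summarization would presumably use the LLM rather than simple truncation, and the example URL is made up):

```python
import re
import textwrap

URL_RE = re.compile(r"https?://\S+")

def condense_plugin_output(output: str, max_chars: int = 200) -> str:
    """Truncate a plugin result's body while carrying forward any links
    found in the raw text, so the bot can still cite them later."""
    links = URL_RE.findall(output)
    summary = textwrap.shorten(output, width=max_chars, placeholder=" ...")
    if links:
        # de-duplicate while preserving order
        summary += "\nLinks: " + ", ".join(dict.fromkeys(links))
    return summary

raw = ("There is a discourse post covering common pitfalls when running "
       "Prefect on ECS; see https://discourse.prefect.io/t/example-topic "
       "for the full discussion.")
print(condense_plugin_output(raw, max_chars=60))
```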

Non-descriptive error when SSL certificate not configured correctly

After upgrading to Python 3.10, certifi was not set up correctly. After installing marvin, I could run openai-setup and set my API key, but upon running marvin chat the request would retry ~6 times and surface the error APIConnectionError: Error communicating with OpenAI.

The full stacktrace revealed several SSL-related errors. I ended up running certificates.command within my root Python folder, after which SSL was set up correctly.

Consider creating more robust error messaging around APIConnectionError: Error communicating with OpenAI, as this may be a common new user failure point.

large memory footprint introduced by `chromadb` default embeddings

ChromaDB uses sentence-transformers by default for embeddings, which requires torch; altogether this makes the footprint of chroma something like ~5 GB.

We use OpenAI's text-embedding-ada-002 model offered via chromadb.utils.embedding_functions, but
chromadb enforces the sentence-transformers dependency at this time.

We should find a way around this if chroma doesn't make sentence-transformers an optional dependency.

context_aware_fillna_df doesn't fill data in example

First check

  • I added a descriptive title to this issue.
  • I used the GitHub search to try to find a similar issue and didn't find one.
  • I searched the Marvin documentation for this issue.

Bug summary

context_aware_fillna works fine in brief manual test.
context_aware_fillna_df turns an integer column into a float and doesn't fill missing values.

Reproduction

import pandas as pd
from marvin.ai_functions.data import context_aware_fillna_df

data = pd.DataFrame(
    [
        ["The Terminator", 1984, None],
        ["Minority Report", None, "Steven Spielberg"],
        ["Wall-E", None, "andrew Stanton"],
    ],
    columns=["title", "release_year", "director"],
)


df_filled = context_aware_fillna_df(data)
print(df_filled)
print(type(df_filled))

Error

(Screenshot of the error output: https://user-images.githubusercontent.com/7703961/233764622-6ea8dc33-78b5-4d08-979c-588c057e90c6.png)

Versions

0.7.7.dev100+ge191037

Additional context

No response

Add outputtransformer

List of hooks that take strings and return (optional) structured outputs. Could be used to call other bots to evaluate the response.

("ResponseTransformer"?)
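A sketch of what such a hook chain could look like (all names here are hypothetical, not Marvin's API):

```python
import json
from typing import Any, Callable, Optional

# A transformer takes the current response and may return a structured
# replacement; returning None leaves the response unchanged.
ResponseTransformer = Callable[[Any], Optional[Any]]

def apply_transformers(response: str, hooks: list) -> Any:
    result: Any = response
    for hook in hooks:
        transformed = hook(result)
        if transformed is not None:
            result = transformed
    return result

hooks = [
    lambda s: s.strip() if isinstance(s, str) else None,
    lambda s: json.loads(s)
    if isinstance(s, str) and s.startswith(("{", "[")) else None,
]
print(apply_transformers('  {"score": 0.9}  ', hooks))  # -> {'score': 0.9}
```

A hook in this shape could just as well call another bot to evaluate or rewrite the response before it is returned.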

Support sets in ResponseFormatters

This currently errors:

@ai_fn
def extract_animals(text: str) -> set[str]:
    """Returns a set of all animals mentioned in the text"""

extract_animals('the fox and the dog')
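Since JSON has no set literal, one plausible fix is to request an array from the model and coerce it afterward based on the return annotation; a sketch (the `coerce_return` helper is hypothetical):

```python
import json
import typing

def coerce_return(raw_json: str, annotation):
    """JSON has no set type, so the model can only return an array;
    coerce the parsed value into the annotated container afterward."""
    value = json.loads(raw_json)
    origin = typing.get_origin(annotation)
    if origin in (set, frozenset) and isinstance(value, list):
        return origin(value)
    return value

print(coerce_return('["fox", "dog", "fox"]', set[str]))  # duplicates collapse
```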

Support Python 3.8

Great question from the livestream - can we support 3.9?

For posterity - the decision to support 3.10 is a legacy of this being an internal project and wanting to use the best and brightest :)

Now that it's being used so widely, I think 3.10 is too high a hurdle.

Use generators as AI functions

A generator would allow for multi-step execution and the potential for side effects, as user code would be running in sequence with AI execution.

@ai_fn
def my_fn():
    x = do_stuff()
    ai_result = yield x
    y = do_stuff_with_ai_result(ai_result)
    another_result = yield y

If I already have the default bots installed, the TUI should not have an install button

First check

  • I added a descriptive title to this issue.
  • I used the GitHub search to look for a similar issue and didn't find it.
  • I searched the Marvin documentation for this feature.

Describe the current behavior

I already have the default bots installed. I click on "Bots". The TUI has an "Install default bots" button, which is confusing. A user might think the bots are not installed and that they need to install each bot individually.

(Screenshot attached in the original issue.)

Describe the proposed behavior

The "Install Default Bots" button should only show if the default bots are not installed.

Example Use

No response

Additional context

No response

weird recursion issue with `Document.to_excerpts` with `GitHubRepoLoader`

Only appears to happen when no globs are passed in; will need to investigate.

Seems related to `MarvinBaseModel.copy_with_updates`.

Example

Works:

    prefect_source_code = GitHubRepoLoader(  # gimme da source
        repo="prefecthq/prefect",
        include_globs=["**/*.py"],
        exclude_globs=["tests/**/*", "docs/**/*", "**/migrations/**/*"],
    )

Doesn't:

    prefect_source_code = GitHubRepoLoader(  # gimme da source
        repo="prefecthq/prefect",
    #    include_globs=["**/*.py"],
    #    exclude_globs=["tests/**/*", "docs/**/*", "**/migrations/**/*"],
    )

fix `plugins` for `ai_fn`

Currently, `bot_kwargs` is set explicitly to `{}` in the `ai_fn_wrapper` scope, erasing any plugins set via the `ai_fn` decorator kwargs.
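A sketch of the kind of fix, with the wrapper structure paraphrased rather than taken from Marvin's source (`run_with_bot` is a stand-in that just echoes the kwargs):

```python
def run_with_bot(func, bot_kwargs, args, kwargs):
    # Stand-in for Marvin's actual bot construction / LLM call; it
    # returns the kwargs so we can see whether plugins survived.
    return bot_kwargs

def ai_fn(fn=None, *, plugins=None, **bot_kwargs):
    def decorator(func):
        # Merge decorator kwargs instead of resetting bot_kwargs to {}
        # inside the wrapper, so plugins are not erased.
        merged = dict(bot_kwargs)
        if plugins is not None:
            merged["plugins"] = plugins

        def wrapper(*args, **kwargs):
            return run_with_bot(func, bot_kwargs=merged, args=args, kwargs=kwargs)

        return wrapper

    return decorator(fn) if fn is not None else decorator

@ai_fn(plugins=["my_plugin"])
def greet(name: str) -> str:
    """Say hello to name."""

print(greet("world"))  # -> {'plugins': ['my_plugin']}
```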

Add temporary loaders

Add a way to temporarily and conveniently add simple knowledge to a bot, including via plugin. E.g. point a bot at a website, document, or other information source, parse it with an appropriate loader, store that in a temporary vector store, and make it available to the bot.

"Hey bot, index example.com"
"Done! what do you want to know?"

Would provide a much lower-lift interface to exploration than having to go through formal knowledge processing.

Prompt Engineering: how to get better results

I love the idea of AI functions!

Can you share some tips on how I can get more reliable results, though?

@ai_fn
def top3_words_in_text(text: str, criteria: str=None) -> str:
    """
    give me the top 3 most often used words in the text in this format:
    [
        { word: "milk", count: 10},
        { word: "fish", count: 5}
    ]
    make sure to sort by count
    """

text = "I really really really love pancakes and cherry from cherry trees"
print(top3_words_in_text(text))
print(top3_words_in_text(text))
print(top3_words_in_text(text))

Gives me back:

[{"word": "cherry", "count": 2}, {"word": "really", "count": 3}, {"word": "love", "count": 1}]
[{"word": "really", "count": 3}, {"word": "cherry", "count": 2}, {"word": "and", "count": 1}]
[{"word": "cherry", "count": 2}, {"word": "really", "count": 3}, {"word": "love", "count": 1}]
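For a task this deterministic, a plain-Python baseline is useful both as a sanity check on the AI output and as a reminder that an exact word count may not need an LLM at all:

```python
from collections import Counter

text = "I really really really love pancakes and cherry from cherry trees"

# Counter.most_common sorts by count and is deterministic (ties keep
# first-seen order), unlike repeated LLM calls.
top3 = [
    {"word": word, "count": count}
    for word, count in Counter(text.lower().split()).most_common(3)
]
print(top3)
# -> [{'word': 'really', 'count': 3}, {'word': 'cherry', 'count': 2},
#     {'word': 'i', 'count': 1}]
```

A hybrid pattern some people use is to do the counting in Python and reserve the AI function for the genuinely fuzzy part of the task.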

allow JSON-style examples in plugin docstring

LLMs seem to do best with examples, and if a plugin should accept JSON, it seems easiest to just show that in the docstring

currently this errors:

In [22]: @plugin
    ...: def foo():
    ...:     """ {"key": "value"} """
    ...:     pass
    ...:

In [23]: foo.get_full_description()
---------------------------------------------------------------------------
KeyError                                  Traceback (most recent call last)
Cell In[23], line 1
----> 1 foo.get_full_description()

File ~/src/open-source/marvin/src/marvin/plugins/base.py:48, in Plugin.get_full_description(self)
     47 def get_full_description(self) -> str:
---> 48     description = self.description.format(**self.dict())
     49     docstring = self.run.__doc__
     51     result = inspect.cleandoc(
     52         f"""
     53         Name: {self.name}
     54         Signature: {self._signature}
     55         """
     56     )

KeyError: '"key"'

and this works fine

In [25]: @plugin
    ...: def foo():
    ...:     """ "key" -> "value" """
    ...:     pass
    ...:

In [26]: foo.get_full_description()
Out[26]: 'Name: foo\nSignature: ()\n "key" -> "value" '
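A likely fix is to escape literal braces before the `str.format` call in `get_full_description`, since `format` treats `{...}` as a replacement field; a sketch (the `safe_format` helper is hypothetical, not Marvin's code):

```python
def safe_format(template: str, **values) -> str:
    """Format only the known placeholder names, treating every other
    brace in the template (e.g. JSON examples) as a literal."""
    # Double every brace, turning all of them into literals...
    escaped = template.replace("{", "{{").replace("}", "}}")
    # ...then restore the known placeholders so only those substitute.
    for name in values:
        escaped = escaped.replace("{{" + name + "}}", "{" + name + "}")
    return escaped.format(**values)

doc = '{"key": "value"} -- plugin named {name}'
print(safe_format(doc, name="foo"))
# -> {"key": "value"} -- plugin named foo
```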

Observation about LLM Response

I noticed that in a number of cases, these lines in your prompt:

"A valid JSON object that satisfies this OpenAPI schema:"
f" ```{json.dumps(schema)}```. The JSON object will be coerced to"

have an outsized effect on the response from the LLM.

When the response type of an ai_fn wrapped function was str, I got this from gpt-3.5: {"title": "Model", "type": "string"}

And when the response type was dict, I got: {"title": "Model", "type": "object"}.

Failure seems more likely when the return type is str. For context, I captured these outputs directly after the response = await self._call_llm(messages=messages) line.

Perhaps this issue might only be faced with 3.5 since it doesn't always understand highly detailed instructions, but I just wanted to make note of it anyway.


Great job on the code and the tool! Pleasure to read through!

Enable user customization of instruction templates

Thank you for making this awesome tool available for everyone!

I noticed that the instruction templates of the bots (and the corresponding parameters in the Jinja template) are hardcoded in bots/base.py (defined in INSTRUCTIONS_TEMPLATE). I also noticed a similar pattern in ai_fn defined in ai_functions.py. The instruction template for the user message (defined in the AI_FN_MESSAGE constant) is hardcoded, giving users little control over what user message is sent to the OpenAI model.

I was wondering if there is any plan to allow users to configure their own instruction templates. I feel that the default template can be a bit restrictive sometimes, as its longer length does not always go well with the shorter context length of GPT-3.5. Besides, I feel having a different bot instruction template for different use cases may result in more efficient prompts and better performance.

Thanks :)

Issue with greenlet library on M1 Mac

Having trouble with the greenlet library on an M1 Mac. (Searching around, I see this issue with M1 Macs for other libraries that use greenlet, but haven't found a good solution yet.) The specific error is:

ValueError: the greenlet library is required to use this function. dlopen(/Users/eball/miniconda3/envs/marvin/lib/python3.10/site-packages/greenlet/_greenlet.cpython-310-darwin.so, 0x0002): tried: '/Users/eball/miniconda3/envs/marvin/lib/python3.10/site-packages/greenlet/_greenlet.cpython-310-darwin.so' (mach-o file, but is an incompatible architecture (have (x86_64), need (arm64e)))

Ideas for workarounds welcome.

create context preprocessor

the list of messages provided to the LLM for a given generation should only contain the most relevant messages to producing that generation:

  • most recent messages
  • system messages from top of the thread dictating bot attrs
  • select previous plugin outputs and other messages deemed "relevant" to the actor the generation is for

primarily this would solve the problem of overflowing the model-specific context window limit, but in general
a context preprocessor could enable arbitrary actors to "hook into" thread content that is relevant to them

this could potentially involve embedding the thread content so we could get top k messages in a thread
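The recency/system-message part of that selection can be sketched in plain Python (the message shape and `preprocess_context` name are hypothetical; relevance scoring via embeddings is omitted):

```python
def preprocess_context(messages: list, max_messages: int = 6) -> list:
    """Keep system messages from the top of the thread plus the most
    recent other messages, preserving chronological order."""
    system = [m for m in messages if m["role"] == "system"]
    rest = [m for m in messages if m["role"] != "system"]
    budget = max_messages - len(system)
    if budget <= 0:
        return system[:max_messages]
    return system + rest[-budget:]

thread = [{"role": "system", "content": "you are marvin"}] + [
    {"role": "user", "content": f"msg {i}"} for i in range(10)
]
print([m["content"] for m in preprocess_context(thread, max_messages=4)])
# -> ['you are marvin', 'msg 7', 'msg 8', 'msg 9']
```

An embedding-based "relevant messages" pass would slot in as a third bucket, scored against the pending generation, before trimming to the token budget.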

AttributeError: 'str' object has no attribute 'match'

First check

  • I added a descriptive title to this issue.
  • I used the GitHub search to try to find a similar issue and didn't find one.
  • I searched the Marvin documentation for this issue.

Bug summary

When trying to use marvin chat, I get an exception that 'str' doesn't have 'match'.

Reproduction

python3 -m pip install marvin;
marvin chat;

Error

(base) kevin@Kevins-MacBook-Pro ~ % marvin chat
│ │ data = {                                                                                                                       │                                      │
│ │        │   'id': 'bot_01GY0MH0MTVYS7P8K6SHJSYDK6',                                                                             │                                      │
│ │        │   'name': 'Marvin',                                                                                                   │                                      │
│ │        │   'personality': 'Marvin is characterized by its immense intelligence, constant sense of depressio'+439,              │                                      │
│ │        │   'instructions': 'You are part of a library for building AI-powered software called "Marvin". One '+907,             │                                      │
│ │        │   'instructions_template': 'Your name is: {{ name }}\n\nYour instructions tell you how to respond to a message'+1036, │                                      │
│ │        │   'plugins': [                                                                                                        │                                      │
│ │        │   │   {                                                                                                               │                                      │
│ │        │   │   │   'name': 'list_all_bots',                                                                                    │                                      │
│ │        │   │   │   'description': '\n    This plugin lets you look up the names and descriptions of all bots.\n    ',          │                                      │
│ │        │   │   │   'discriminator': 'list_all_bots'                                                                            │                                      │
│ │        │   │   },                                                                                                              │                                      │
│ │        │   │   {                                                                                                               │                                      │
│ │        │   │   │   'name': 'get_bot_details',                                                                                  │                                      │
│ │        │   │   │   'description': '\n    This plugin lets you look up the details of a single bot, including its\n   '+55,     │                                      │
│ │        │   │   │   'discriminator': 'get_bot_details'                                                                          │                                      │
│ │        │   │   },                                                                                                              │                                      │
│ │        │   │   {                                                                                                               │                                      │
│ │        │   │   │   'name': 'create_bot',                                                                                       │                                      │
│ │        │   │   │   'description': '\n    Creates a bot with the given name, description, personality, and\n    instru'+226,    │                                      │
│ │        │   │   │   'discriminator': 'create_bot'                                                                               │                                      │
│ │        │   │   },                                                                                                              │                                      │
│ │        │   │   {                                                                                                               │                                      │
│ │        │   │   │   'name': 'update_bot',                                                                                       │                                      │
│ │        │   │   │   'description': "\n    This plugin can be used to update a bot's description, personality, or\n    "+109,    │                                      │
│ │        │   │   │   'discriminator': 'update_bot'                                                                               │                                      │
│ │        │   │   },                                                                                                              │                                      │
│ │        │   │   {                                                                                                               │                                      │
│ │        │   │   │   'name': 'delete_bot',                                                                                       │                                      │
│ │        │   │   │   'description': '\n    Deletes the bot with the given name. Names must be an exact match, includin'+208,     │                                      │
│ │        │   │   │   'discriminator': 'delete_bot'                                                                               │                                      │
│ │        │   │   }                                                                                                               │                                      │
│ │        │   ],                                                                                                                  │                                      │
│ │        │   'input_transformers': [],                                                                                           │                                      │
│ │        │   'description': 'The Genuine People Personality you know and love.\n\nMarvin can also help you crea'+34              │                                      │
│ │        }                                                                                                                       │                                      │
│ ╰────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────╯                                      │
│                                                                                                                                                                         │
│ in pydantic.main.validate_model:1076                                                                                                                                    │
│                                                                                                                                                                         │
│ in pydantic.fields.ModelField.validate:884                                                                                                                              │
│                                                                                                                                                                         │
│ in pydantic.fields.ModelField._validate_singleton:1101                                                                                                                  │
│                                                                                                                                                                         │
│ in pydantic.fields.ModelField._apply_validators:1148                                                                                                                    │
│                                                                                                                                                                         │
│ in pydantic.class_validators._generic_validator_basic.lambda13:318                                                                                                      │
│                                                                                                                                                                         │
│ in pydantic.types.ConstrainedStr.validate:433                                                                                                                           │
╰─────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────╯
AttributeError: 'str' object has no attribute 'match'

Versions

(base) kevin@Kevins-MacBook-Pro ~ % marvin version
WARNING: `openai_api_key` is not set. Some features may not work.
[04/14/23 12:45:45] INFO     marvin.marvin: Using OpenAI model "gpt-3.5-turbo"                                                                                logging.py:50
0.7.5

Additional context

.

Installation failing

I ran pip3 install marvin. The installation failed. Here is the stack trace:

pip3 install marvin
Collecting marvin
Using cached marvin-0.6.1-py3-none-any.whl (78 kB)
Collecting aiofiles~=23.1.0
Using cached aiofiles-23.1.0-py3-none-any.whl (14 kB)
Collecting aiosqlite~=0.18.0
Using cached aiosqlite-0.18.0-py3-none-any.whl (15 kB)
Collecting asyncpg~=0.27.0
Using cached asyncpg-0.27.0-cp311-cp311-macosx_11_0_arm64.whl (608 kB)
Collecting cloudpickle~=2.2.1
Using cached cloudpickle-2.2.1-py3-none-any.whl (25 kB)
Collecting datamodel-code-generator~=0.17.1
Using cached datamodel_code_generator-0.17.2-py3-none-any.whl (75 kB)
Collecting fastapi~=0.89.1
Using cached fastapi-0.89.1-py3-none-any.whl (55 kB)
Collecting httpx~=0.23.3
Using cached httpx-0.23.3-py3-none-any.whl (71 kB)
Collecting jinja2~=3.1.2
Using cached Jinja2-3.1.2-py3-none-any.whl (133 kB)
Collecting langchain>=0.0.103
Using cached langchain-0.0.128-py3-none-any.whl (465 kB)
Collecting nest-asyncio~=1.5.6
Using cached nest_asyncio-1.5.6-py3-none-any.whl (5.2 kB)
Collecting openai~=0.27.0
Using cached openai-0.27.2-py3-none-any.whl (70 kB)
Collecting pendulum~=2.1.2
Using cached pendulum-2.1.2.tar.gz (81 kB)
Installing build dependencies ... done
Getting requirements to build wheel ... done
ERROR: Exception:
Traceback (most recent call last):
File "/opt/homebrew/lib/python3.11/site-packages/pip/_internal/cli/base_command.py", line 160, in exc_logging_wrapper
status = run_func(*args)
^^^^^^^^^^^^^^^
File "/opt/homebrew/lib/python3.11/site-packages/pip/_internal/cli/req_command.py", line 247, in wrapper
return func(self, options, args)
^^^^^^^^^^^^^^^^^^^^^^^^^
File "/opt/homebrew/lib/python3.11/site-packages/pip/_internal/commands/install.py", line 400, in run
requirement_set = resolver.resolve(
^^^^^^^^^^^^^^^^^
File "/opt/homebrew/lib/python3.11/site-packages/pip/_internal/resolution/resolvelib/resolver.py", line 92, in resolve
result = self._result = resolver.resolve(
^^^^^^^^^^^^^^^^^
File "/opt/homebrew/lib/python3.11/site-packages/pip/_vendor/resolvelib/resolvers.py", line 481, in resolve
state = resolution.resolve(requirements, max_rounds=max_rounds)
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
File "/opt/homebrew/lib/python3.11/site-packages/pip/_vendor/resolvelib/resolvers.py", line 373, in resolve
failure_causes = self._attempt_to_pin_criterion(name)
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
File "/opt/homebrew/lib/python3.11/site-packages/pip/_vendor/resolvelib/resolvers.py", line 213, in _attempt_to_pin_criterion
criteria = self._get_updated_criteria(candidate)
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
File "/opt/homebrew/lib/python3.11/site-packages/pip/_vendor/resolvelib/resolvers.py", line 204, in _get_updated_criteria
self._add_to_criteria(criteria, requirement, parent=candidate)
File "/opt/homebrew/lib/python3.11/site-packages/pip/_vendor/resolvelib/resolvers.py", line 172, in _add_to_criteria
if not criterion.candidates:
File "/opt/homebrew/lib/python3.11/site-packages/pip/_vendor/resolvelib/structs.py", line 151, in __bool__
return bool(self._sequence)
^^^^^^^^^^^^^^^^^^^^
File "/opt/homebrew/lib/python3.11/site-packages/pip/_internal/resolution/resolvelib/found_candidates.py", line 155, in __bool__
return any(self)
^^^^^^^^^
File "/opt/homebrew/lib/python3.11/site-packages/pip/_internal/resolution/resolvelib/found_candidates.py", line 143, in <genexpr>
return (c for c in iterator if id(c) not in self._incompatible_ids)
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
File "/opt/homebrew/lib/python3.11/site-packages/pip/_internal/resolution/resolvelib/found_candidates.py", line 47, in _iter_built
candidate = func()
^^^^^^
File "/opt/homebrew/lib/python3.11/site-packages/pip/_internal/resolution/resolvelib/factory.py", line 206, in _make_candidate_from_link
self._link_candidate_cache[link] = LinkCandidate(
^^^^^^^^^^^^^^
File "/opt/homebrew/lib/python3.11/site-packages/pip/_internal/resolution/resolvelib/candidates.py", line 297, in __init__
super().__init__(
File "/opt/homebrew/lib/python3.11/site-packages/pip/_internal/resolution/resolvelib/candidates.py", line 162, in __init__
self.dist = self._prepare()
^^^^^^^^^^^^^^^
File "/opt/homebrew/lib/python3.11/site-packages/pip/_internal/resolution/resolvelib/candidates.py", line 231, in _prepare
dist = self._prepare_distribution()
^^^^^^^^^^^^^^^^^^^^^^^^^^^^
File "/opt/homebrew/lib/python3.11/site-packages/pip/_internal/resolution/resolvelib/candidates.py", line 308, in _prepare_distribution
return preparer.prepare_linked_requirement(self._ireq, parallel_builds=True)
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
File "/opt/homebrew/lib/python3.11/site-packages/pip/_internal/operations/prepare.py", line 491, in prepare_linked_requirement
return self._prepare_linked_requirement(req, parallel_builds)
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
File "/opt/homebrew/lib/python3.11/site-packages/pip/_internal/operations/prepare.py", line 577, in _prepare_linked_requirement
dist = _get_prepared_distribution(
^^^^^^^^^^^^^^^^^^^^^^^^^^^
File "/opt/homebrew/lib/python3.11/site-packages/pip/_internal/operations/prepare.py", line 69, in _get_prepared_distribution
abstract_dist.prepare_distribution_metadata(
File "/opt/homebrew/lib/python3.11/site-packages/pip/_internal/distributions/sdist.py", line 48, in prepare_distribution_metadata
self._install_build_reqs(finder)
File "/opt/homebrew/lib/python3.11/site-packages/pip/_internal/distributions/sdist.py", line 118, in _install_build_reqs
build_reqs = self._get_build_requires_wheel()
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
File "/opt/homebrew/lib/python3.11/site-packages/pip/_internal/distributions/sdist.py", line 95, in _get_build_requires_wheel
return backend.get_requires_for_build_wheel()
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
File "/opt/homebrew/lib/python3.11/site-packages/pip/_internal/utils/misc.py", line 685, in get_requires_for_build_wheel
return super().get_requires_for_build_wheel(config_settings=cs)
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
File "/opt/homebrew/lib/python3.11/site-packages/pip/_vendor/pep517/wrappers.py", line 173, in get_requires_for_build_wheel
return self._call_hook('get_requires_for_build_wheel', {
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
File "/opt/homebrew/lib/python3.11/site-packages/pip/_vendor/pep517/wrappers.py", line 319, in _call_hook
raise BackendUnavailable(data.get('traceback', ''))
pip._vendor.pep517.wrappers.BackendUnavailable: Traceback (most recent call last):
File "/opt/homebrew/lib/python3.11/site-packages/pip/_vendor/pep517/in_process/_in_process.py", line 77, in _build_backend
obj = import_module(mod_path)
^^^^^^^^^^^^^^^^^^^^^^^
File "/opt/homebrew/Cellar/[email protected]/3.11.2_1/Frameworks/Python.framework/Versions/3.11/lib/python3.11/importlib/__init__.py", line 126, in import_module
return _bootstrap._gcd_import(name[level:], package, level)
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
File "<frozen importlib._bootstrap>", line 1206, in _gcd_import
File "<frozen importlib._bootstrap>", line 1178, in _find_and_load
File "<frozen importlib._bootstrap>", line 1128, in _find_and_load_unlocked
File "<frozen importlib._bootstrap>", line 241, in _call_with_frames_removed
File "<frozen importlib._bootstrap>", line 1206, in _gcd_import
File "<frozen importlib._bootstrap>", line 1178, in _find_and_load
File "<frozen importlib._bootstrap>", line 1128, in _find_and_load_unlocked
File "<frozen importlib._bootstrap>", line 241, in _call_with_frames_removed
File "<frozen importlib._bootstrap>", line 1206, in _gcd_import
File "<frozen importlib._bootstrap>", line 1178, in _find_and_load
File "<frozen importlib._bootstrap>", line 1128, in _find_and_load_unlocked
File "<frozen importlib._bootstrap>", line 241, in _call_with_frames_removed
File "<frozen importlib._bootstrap>", line 1206, in _gcd_import
File "<frozen importlib._bootstrap>", line 1178, in _find_and_load
File "<frozen importlib._bootstrap>", line 1142, in _find_and_load_unlocked
ModuleNotFoundError: No module named 'poetry'
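The traceback above bottoms out in ModuleNotFoundError: No module named 'poetry': pip is trying to build a source distribution whose PEP 517 build backend lives in the poetry package, but that package is not importable in the interpreter pip is using. A sketch of the usual remedy, with the caveat that the exact backend package is an assumption (poetry-core provides the modern poetry.core backend; very old sdists may need the full poetry package instead):

```python
import sys

def backend_fix_commands(python: str = sys.executable) -> list[list[str]]:
    """Commands that typically resolve BackendUnavailable for a poetry-built
    sdist: upgrade pip (newer releases handle build backends more robustly),
    then install the backend so `import poetry...` succeeds."""
    return [
        [python, "-m", "pip", "install", "--upgrade", "pip"],
        [python, "-m", "pip", "install", "poetry-core"],
    ]

# Run each with subprocess.check_call(cmd), or copy them into a shell.
```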

NoMatches: No nodes match

Hi,
thanks for this project. I ran marvin chat and entered my API key. However, subsequent requests seem to error out.

Here is an example:

Prompt = "What is something only a bot would say?"

I get the exception NoMatches: No nodes match. Here is a fuller view:

│ ╭───────────────────────────────────────────────────────── locals ──────────────────────────────────────────────────────────╮                                            │
│ │           bot = Bot(                                                                                                      │                                            │
│ │                 │   id='bot_01GXJJX4DND5CZWTN1G10NDB22',                                                                  │                                            │
│ │                 │   name='Marvin',                                                                                        │                                            │
│ │                 │   description='The Genuine People Personality you know and love.\n\nMarvin can also help you crea'+34,  │                                            │
│ │                 │   plugins=[                                                                                             │                                            │
│ │                 │   │   list_all_bots(name='list_all_bots', discriminator='list_all_bots'),                               │                                            │
│ │                 │   │   get_bot_details(name='get_bot_details', discriminator='get_bot_details'),                         │                                            │
│ │                 │   │   create_bot(name='create_bot', discriminator='create_bot'),                                        │                                            │
│ │                 │   │   update_bot(name='update_bot', discriminator='update_bot'),                                        │                                            │
│ │                 │   │   delete_bot(name='delete_bot', discriminator='delete_bot')                                         │                                            │
│ │                 │   ]                                                                                                     │                                            │
│ │                 )                                                                                                         │                                            │
│ │  bot_response = BotResponse(classes={'response', 'bot-response'}, pseudo_classes={'enabled'})                             │                                            │
│ │ bot_responses = <DOMQuery MainScreen(pseudo_classes={'focus-within', 'enabled'}) filter='BotResponse'>                    │                                            │
│ │         event = Submitted()                                                                                               │                                            │
│ │      response = BotResponse(                                                                                              │                                            │
│ │                 │   id='msg_01GXJK6ZJSE5VQFXEETABGWP2F',                                                                  │                                            │
│ │                 │   role='bot',                                                                                           │                                            │
│ │                 │   content="Well, that's an interesting question. As an AI language model, I can tell you th"+240,       │                                            │
│ │                 │   name='Marvin',                                                                                        │                                            │
│ │                 │   timestamp=DateTime(2023, 4, 9, 8, 48, 6, 233439, tzinfo=Timezone('UTC')),                             │                                            │
│ │                 │   bot_id='bot_01GXJJX4DND5CZWTN1G10NDB22',                                                              │                                            │
│ │                 │   data={},                                                                                              │                                            │
│ │                 │   parsed_content="Well, that's an interesting question. As an AI language model, I can tell you th"+240 │                                            │
│ │                 )                                                                                                         │                                            │
│ │          self = MainScreen(pseudo_classes={'focus-within', 'enabled'})                                                    │                                            │
│ ╰───────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────╯                                            │
│                                                                                                                                                                          │
│ /home/lab/env/lib/python3.9/site-packages/textual/css/query.py:285 in last                                                                                               │
│                                                                                                                                                                          │
│   282 │   │   │   The matching Widget.                                                                                                                                   │
│   283 │   │   """                                                                                                                                                        │
│   284 │   │   if not self.nodes:                                                                                                                                         │
│ ❱ 285 │   │   │   raise NoMatches(f"No nodes match {self!r}")                                                                                                            │
│   286 │   │   last = self.nodes[-1]                                                                                                                                      │
│   287 │   │   if expect_type is not None and not isinstance(last, expect_type):                                                                                          │
│   288 │   │   │   raise WrongType(                                                                                                                                       │
│                                                                                                                                                                          │
│ ╭─────────────────────────────────────────────── locals ───────────────────────────────────────────────╮                                                                 │
│ │ expect_type = None                                                                                   │                                                                 │
│ │        self = <DOMQuery MainScreen(pseudo_classes={'focus-within', 'enabled'}) filter='BotResponse'> │                                                                 │
│ ╰──────────────────────────────────────────────────────────────────────────────────────────────────────╯                                                                 │
╰──────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────╯
NoMatches: No nodes match <DOMQuery MainScreen(pseudo_classes={'focus-within', 'enabled'}) filter='BotResponse'>

So the API connection and most of the pipeline appear to work, but there may be a small bug in how the response is rendered.

Running this on Debian 11, Python 3.9.2, and using the latest marvin version from pip.

Thanks.
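From the traceback, the screen queries for the last BotResponse widget at a moment when no widget matches, and Textual's DOMQuery.last() raises NoMatches rather than returning None. A minimal defensive pattern is sketched below; the fallback exception class only exists so the sketch runs without textual installed, and none of this is Marvin's actual code:

```python
try:
    from textual.css.query import NoMatches
except ImportError:  # allow the sketch to run without textual installed
    class NoMatches(Exception):
        pass

def last_or_none(query):
    """Return query.last() if any node matches, otherwise None
    instead of letting NoMatches crash the event handler."""
    try:
        return query.last()
    except NoMatches:
        return None
```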

Allow AI functions to generate source code

If AI functions could generate source code and execute it, they could do significantly more on the local machine. However, this would constitute untrusted code and would require controls to avoid unintended execution.
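As a sketch of the kind of control that would be needed, generated source could be screened with the ast module before execution and run in a namespace with no builtins. The function names and the crude allow-list below are illustrative only, not a proposal for Marvin's API, and a real sandbox would need much stronger isolation than this:

```python
import ast

def screen(source: str) -> None:
    """Reject generated code that imports modules or touches attributes."""
    for node in ast.walk(ast.parse(source)):
        if isinstance(node, (ast.Import, ast.ImportFrom, ast.Attribute)):
            raise PermissionError(f"disallowed node: {type(node).__name__}")

def run_generated(source: str) -> dict:
    """Screen the source, then execute it with no builtins available,
    returning the namespace it produced."""
    screen(source)
    namespace = {"__builtins__": {}}
    exec(source, namespace)
    return namespace
```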

An incorrect API key is being delivered to the OpenAI server.

  1. Given the following key: sk-wJOK3***************************************fjYz
  2. I get:

AuthenticationError: Incorrect API key provided: sk-nuLlk***************************************dhN2. You can find your API key at https://platform.openai.com/account/api-keys.

This is probably because a stale API key from a previous configuration is still set in my environment variables.
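When a different key than the one you entered reaches OpenAI, an environment variable left over from another project is the usual culprit. A quick way to see which variables are supplying a key — note that MARVIN_OPENAI_API_KEY is a guess at Marvin's prefixed setting name, not confirmed from the source:

```python
import os

def candidate_keys(names=("OPENAI_API_KEY", "MARVIN_OPENAI_API_KEY")):
    """Return the masked tail of each candidate key variable that is set,
    so you can spot a stale value overriding the key you entered."""
    return {
        name: "..." + os.environ[name][-4:]
        for name in names
        if os.environ.get(name)
    }
```

Comparing the masked tails against the key in the error message shows which variable is being picked up.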

OperationalError on Linux

First check

  • I added a descriptive title to this issue.
  • I used the GitHub search to try to find a similar issue and didn't find one.
  • I searched the Marvin documentation for this issue.

Bug summary

Hello!

I installed marvin with pip install marvin (without a conda environment) on Pop!_OS 22.04. I'm getting the following error when running marvin chat:

OperationalError: (sqlite3.OperationalError) no such column: bot_config.created_at
[SQL: SELECT bot_config.created_at, bot_config.updated_at, bot_config.plugins, 
bot_config.input_transformers, bot_config.id, bot_config.name, bot_config.personality, 
bot_config.instructions, bot_config.instructions_template, bot_config.description, 
bot_config.profile_picture 
FROM bot_config 
WHERE bot_config.name = ?
 LIMIT ? OFFSET ?]
[parameters: ('Marvin', 1, 0)]
(Background on this error at: https://sqlalche.me/e/14/e3q8)

Any idea how to fix this?

Cheers
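The no such column: bot_config.created_at error usually means the on-disk SQLite file was created by an older release whose bot_config table predates the created_at column. You can confirm the mismatch with the standard library — the database path below is an assumption, so check Marvin's settings for the real location — and if the column really is missing, deleting the file so it is recreated (or running Marvin's migrations, if available) should fix it:

```python
import sqlite3

def table_columns(db_path: str, table: str) -> list[str]:
    """Return the column names SQLite actually has for `table`."""
    with sqlite3.connect(db_path) as conn:
        rows = conn.execute(f"PRAGMA table_info({table})").fetchall()
    return [row[1] for row in rows]  # row[1] is the column name

# e.g. "created_at" in table_columns("/path/to/marvin.sqlite", "bot_config")
```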

Reproduction

In Pop!_OS 22.04, run:

```
pip install marvin
marvin chat
```


Error

```python3
│ ╰───────────────────────────────────────────────────────────────────────────────────────╯ │
│                                                                                           │
│ /home/arnle/miniconda3/lib/python3.9/site-packages/sqlalchemy/dialects/sqlite/aiosqlite.p │
│ y:100 in execute                                                                          │
│                                                                                           │
│    97 │   │   │   else:                                                                   │
│    98 │   │   │   │   self._cursor = _cursor                                              │
│    99 │   │   except Exception as error:                                                  │
│ ❱ 100 │   │   │   self._adapt_connection._handle_exception(error)                         │
│   101 │                                                                                   │
│   102 │   def executemany(self, operation, seq_of_parameters):                            │
│   103 │   │   try:                                                                        │
│                                                                                           │
│ ╭─────────────────────────────────────── locals ────────────────────────────────────────╮ │
│ │    _cursor = <aiosqlite.cursor.Cursor object at 0x7fc0812df730>                       │ │
│ │  operation = 'SELECT bot_config.created_at, bot_config.updated_at,                    │ │
│ │              bot_config.plugins, bot_con'+251                                         │ │
│ │ parameters = ('Marvin', 1, 0)                                                         │ │
│ │       self = <sqlalchemy.dialects.sqlite.aiosqlite.AsyncAdapt_aiosqlite_cursor object │ │
│ │              at 0x7fc081312dc0>                                                       │ │
│ ╰───────────────────────────────────────────────────────────────────────────────────────╯ │
│                                                                                           │
│ /home/arnle/miniconda3/lib/python3.9/site-packages/sqlalchemy/dialects/sqlite/aiosqlite.p │
│ y:228 in _handle_exception                                                                │
│                                                                                           │
│   225 │   │   │   │   from_=error,                                                        │
│   226 │   │   │   )                                                                       │
│   227 │   │   else:                                                                       │
│ ❱ 228 │   │   │   raise error                                                             │
│   229                                                                                     │
│   230                                                                                     │
│   231 class AsyncAdaptFallback_aiosqlite_connection(AsyncAdapt_aiosqlite_connection):     │
│                                                                                           │
│ ╭────────────────────────────────────── locals ──────────────────────────────────────╮    │
│ │ error = OperationalError('no such column: bot_config.created_at')                  │    │
│ │  self = <AdaptedConnection <Connection(Thread-6, started daemon 140464776869440)>> │    │
│ ╰────────────────────────────────────────────────────────────────────────────────────╯    │
│                                                                                           │
│ /home/arnle/miniconda3/lib/python3.9/site-packages/sqlalchemy/dialects/sqlite/aiosqlite.p │
│ y:82 in execute                                                                           │
│                                                                                           │
│    79 │   │   │   if parameters is None:                                                  │
│    80 │   │   │   │   self.await_(_cursor.execute(operation))                             │
│    81 │   │   │   else:                                                                   │
│ ❱  82 │   │   │   │   self.await_(_cursor.execute(operation, parameters))                 │
│    83 │   │   │                                                                           │
│    84 │   │   │   if _cursor.description:                                                 │
│    85 │   │   │   │   self.description = _cursor.description                              │
│                                                                                           │
│ ╭─────────────────────────────────────── locals ────────────────────────────────────────╮ │
│ │    _cursor = <aiosqlite.cursor.Cursor object at 0x7fc0812df730>                       │ │
│ │  operation = 'SELECT bot_config.created_at, bot_config.updated_at,                    │ │
│ │              bot_config.plugins, bot_con'+251                                         │ │
│ │ parameters = ('Marvin', 1, 0)                                                         │ │
│ │       self = <sqlalchemy.dialects.sqlite.aiosqlite.AsyncAdapt_aiosqlite_cursor object │ │
│ │              at 0x7fc081312dc0>                                                       │ │
│ ╰───────────────────────────────────────────────────────────────────────────────────────╯ │
│                                                                                           │
│ /home/arnle/miniconda3/lib/python3.9/site-packages/sqlalchemy/util/_concurrency_py3k.py:6 │
│ 8 in await_only                                                                           │
│                                                                                           │
│    65 │   # a coroutine to run. Once the awaitable is done, the driver greenlet           │
│    66 │   # switches back to this greenlet with the result of awaitable that is           │
│    67 │   # then returned to the caller (or raised as error)                              │
│ ❱  68 │   return current.driver.switch(awaitable)                                         │
│    69                                                                                     │
│    70                                                                                     │
│    71 def await_fallback(awaitable: Coroutine) -> Any:                                    │
│                                                                                           │
│ ╭────────────────────────────────────── locals ──────────────────────────────────────╮    │
│ │ awaitable = <coroutine object Cursor.execute at 0x7fc08138ad40>                    │    │
│ │   current = <_AsyncIoGreenlet object at 0x7fc08126d2c0 (otid=0x7fc0a6d45330) dead> │    │
│ ╰────────────────────────────────────────────────────────────────────────────────────╯    │
│                                                                                           │
│ /home/arnle/miniconda3/lib/python3.9/site-packages/sqlalchemy/util/_concurrency_py3k.py:1 │
│ 21 in greenlet_spawn                                                                      │
│                                                                                           │
│   118 │   │   │   try:                                                                    │
│   119 │   │   │   │   # wait for a coroutine from await_only and then return its          │
│   120 │   │   │   │   # result back to it.                                                │
│ ❱ 121 │   │   │   │   value = await result                                                │
│   122 │   │   │   except BaseException:                                                   │
│   123 │   │   │   │   # this allows an exception to be raised within                      │
│   124 │   │   │   │   # the moderated greenlet so that it can continue                    │
│                                                                                           │
│ ╭─────────────────────────────────────── locals ────────────────────────────────────────╮ │
│ │  _require_await = False                                                               │ │
│ │            args = (<sqlalchemy.sql.selectable.Select object at 0x7fc081bf5c70>,)      │ │
│ │         context = <_AsyncIoGreenlet object at 0x7fc08126d2c0 (otid=0x7fc0a6d45330)    │ │
│ │                   dead>                                                               │ │
│ │              fn = <bound method Session.execute of <sqlmodel.orm.session.Session      │ │
│ │                   object at 0x7fc081c7fa90>>                                          │ │
│ │          kwargs = {                                                                   │ │
│ │                   │   'params': None,                                                 │ │
│ │                   │   'execution_options': immutabledict({'prebuffer_rows': True}),   │ │
│ │                   │   'bind_arguments': None                                          │ │
│ │                   }                                                                   │ │
│ │          result = <coroutine object Cursor.execute at 0x7fc08138ad40>                 │ │
│ │ switch_occurred = True                                                                │ │
│ │           value = <aiosqlite.cursor.Cursor object at 0x7fc0812df730>                  │ │
│ ╰───────────────────────────────────────────────────────────────────────────────────────╯ │
│                                                                                           │
│ /home/arnle/miniconda3/lib/python3.9/site-packages/aiosqlite/cursor.py:37 in execute      │
│                                                                                           │
│    34 │   │   """Execute the given query."""                                              │
│    35 │   │   if parameters is None:                                                      │
│    36 │   │   │   parameters = []                                                         │
│ ❱  37 │   │   await self._execute(self._cursor.execute, sql, parameters)                  │
│    38 │   │   return self                                                                 │
│    39 │                                                                                   │
│    40 │   async def executemany(                                                          │
│                                                                                           │
│ ╭─────────────────────────────────────── locals ────────────────────────────────────────╮ │
│ │ parameters = ('Marvin', 1, 0)                                                         │ │
│ │       self = <aiosqlite.cursor.Cursor object at 0x7fc0812df730>                       │ │
│ │        sql = 'SELECT bot_config.created_at, bot_config.updated_at,                    │ │
│ │              bot_config.plugins, bot_con'+251                                         │ │
│ ╰───────────────────────────────────────────────────────────────────────────────────────╯ │
│                                                                                           │
│ /home/arnle/miniconda3/lib/python3.9/site-packages/aiosqlite/cursor.py:31 in _execute     │
│                                                                                           │
│    28 │                                                                                   │
│    29 │   async def _execute(self, fn, *args, **kwargs):                                  │
│    30 │   │   """Execute the given function on the shared connection's thread."""         │
│ ❱  31 │   │   return await self._conn._execute(fn, *args, **kwargs)                       │
│    32 │                                                                                   │
│    33 │   async def execute(self, sql: str, parameters: Iterable[Any] = None) -> "Cursor" │
│    34 │   │   """Execute the given query."""                                              │
│                                                                                           │
│ ╭─────────────────────────────────────── locals ────────────────────────────────────────╮ │
│ │   args = (                                                                            │ │
│ │          │   'SELECT bot_config.created_at, bot_config.updated_at,                    │ │
│ │          bot_config.plugins, bot_con'+251,                                            │ │
│ │          │   ('Marvin', 1, 0)                                                         │ │
│ │          )                                                                            │ │
│ │     fn = <built-in method execute of sqlite3.Cursor object at 0x7fc08137d180>         │ │
│ │ kwargs = {}                                                                           │ │
│ │   self = <aiosqlite.cursor.Cursor object at 0x7fc0812df730>                           │ │
│ ╰───────────────────────────────────────────────────────────────────────────────────────╯ │
│                                                                                           │
│ /home/arnle/miniconda3/lib/python3.9/site-packages/aiosqlite/core.py:137 in _execute      │
│                                                                                           │
│   134 │   │                                                                               │
│   135 │   │   self._tx.put_nowait((future, function))                                     │
│   136 │   │                                                                               │
│ ❱ 137 │   │   return await future                                                         │
│   138 │                                                                                   │
│   139 │   async def _connect(self) -> "Connection":                                       │
│   140 │   │   """Connect to the actual sqlite database."""                                │
│                                                                                           │
│ ╭─────────────────────────────────────── locals ────────────────────────────────────────╮ │
│ │     args = (                                                                          │ │
│ │            │   'SELECT bot_config.created_at, bot_config.updated_at,                  │ │
│ │            bot_config.plugins, bot_con'+251,                                          │ │
│ │            │   ('Marvin', 1, 0)                                                       │ │
│ │            )                                                                          │ │
│ │       fn = <built-in method execute of sqlite3.Cursor object at 0x7fc08137d180>       │ │
│ │ function = functools.partial(<built-in method execute of sqlite3.Cursor object at     │ │
│ │            0x7fc08137d180>, 'SELECT bot_config.created_at, bot_config.updated_at,     │ │
│ │            bot_config.plugins, bot_config.input_transformers, bot_config.id,          │ │
│ │            bot_config.name, bot_config.personality, bot_config.instructions,          │ │
│ │            bot_config.instructions_template, bot_config.description,                  │ │
│ │            bot_config.profile_picture \nFROM bot_config \nWHERE bot_config.name = ?\n │ │
│ │            LIMIT ? OFFSET ?', ('Marvin', 1, 0))                                       │ │
│ │   future = <Future finished exception=OperationalError('no such column:               │ │
│ │            bot_config.created_at')>                                                   │ │
│ │   kwargs = {}                                                                         │ │
│ │     self = <Connection(Thread-6, started daemon 140464776869440)>                     │ │
│ ╰───────────────────────────────────────────────────────────────────────────────────────╯ │
│                                                                                           │
│ /home/arnle/miniconda3/lib/python3.9/asyncio/futures.py:284 in __await__                  │
│                                                                                           │
│   281 │   def __await__(self):                                                            │
│   282 │   │   if not self.done():                                                         │
│   283 │   │   │   self._asyncio_future_blocking = True                                    │
│ ❱ 284 │   │   │   yield self  # This tells Task to wait for completion.                   │
│   285 │   │   if not self.done():                                                         │
│   286 │   │   │   raise RuntimeError("await wasn't used with future")                     │
│   287 │   │   return self.result()  # May raise too.                                      │
│                                                                                           │
│ ╭─────────────────────────────────────── locals ────────────────────────────────────────╮ │
│ │ self = <Future finished exception=OperationalError('no such column:                   │ │
│ │        bot_config.created_at')>                                                       │ │
│ ╰───────────────────────────────────────────────────────────────────────────────────────╯ │
│                                                                                           │
│ /home/arnle/miniconda3/lib/python3.9/asyncio/tasks.py:328 in __wakeup                     │
│                                                                                           │
│   325 │                                                                                   │
│   326 │   def __wakeup(self, future):                                                     │
│   327 │   │   try:                                                                        │
│ ❱ 328 │   │   │   future.result()                                                         │
│   329 │   │   except BaseException as exc:                                                │
│   330 │   │   │   # This may also be a cancellation.                                      │
│   331 │   │   │   self.__step(exc)                                                        │
│                                                                                           │
│ ╭─────────────────────────────────────── locals ────────────────────────────────────────╮ │
│ │ future = <Future finished exception=OperationalError('no such column:                 │ │
│ │          bot_config.created_at')>                                                     │ │
│ │   self = None                                                                         │ │
│ ╰───────────────────────────────────────────────────────────────────────────────────────╯ │
│                                                                                           │
│ /home/arnle/miniconda3/lib/python3.9/asyncio/futures.py:201 in result                     │
│                                                                                           │
│   198 │   │   │   raise exceptions.InvalidStateError('Result is not ready.')              │
│   199 │   │   self.__log_traceback = False                                                │
│   200 │   │   if self._exception is not None:                                             │
│ ❱ 201 │   │   │   raise self._exception                                                   │
│   202 │   │   return self._result                                                         │
│   203 │                                                                                   │
│   204 │   def exception(self):                                                            │
│                                                                                           │
│ ╭─────────────────────────────────────── locals ────────────────────────────────────────╮ │
│ │ self = <Future finished exception=OperationalError('no such column:                   │ │
│ │        bot_config.created_at')>                                                       │ │
│ ╰───────────────────────────────────────────────────────────────────────────────────────╯ │
│                                                                                           │
│ /home/arnle/miniconda3/lib/python3.9/site-packages/aiosqlite/core.py:110 in run           │
│                                                                                           │
│   107 │   │   │   │   break                                                               │
│   108 │   │   │   try:                                                                    │
│   109 │   │   │   │   LOG.debug("executing %s", function)                                 │
│ ❱ 110 │   │   │   │   result = function()                                                 │
│   111 │   │   │   │   LOG.debug("operation %s completed", function)                       │
│   112 │   │   │   │                                                                       │
│   113 │   │   │   │   def set_result(fut, result):                                        │
│                                                                                           │
│ ╭─────────────────────────────────────── locals ────────────────────────────────────────╮ │
│ │      function = functools.partial(<built-in method close of sqlite3.Connection object │ │
│ │                 at 0x7fc081ca56c0>)                                                   │ │
│ │        future = <Future finished result=None>                                         │ │
│ │        result = None                                                                  │ │
│ │          self = <Connection(Thread-6, started daemon 140464776869440)>                │ │
│ │ set_exception = <function Connection.run.<locals>.set_exception at 0x7fc08128b430>    │ │
│ │    set_result = <function Connection.run.<locals>.set_result at 0x7fc08128b0d0>       │ │
│ ╰───────────────────────────────────────────────────────────────────────────────────────╯ │
╰───────────────────────────────────────────────────────────────────────────────────────────╯
OperationalError: (sqlite3.OperationalError) no such column: bot_config.created_at
[SQL: SELECT bot_config.created_at, bot_config.updated_at, bot_config.plugins, 
bot_config.input_transformers, bot_config.id, bot_config.name, bot_config.personality, 
bot_config.instructions, bot_config.instructions_template, bot_config.description, 
bot_config.profile_picture 
FROM bot_config 
WHERE bot_config.name = ?
 LIMIT ? OFFSET ?]
[parameters: ('Marvin', 1, 0)]
(Background on this error at: https://sqlalche.me/e/14/e3q8)



### Versions

```Text
Same error when running `marvin version`
```

Additional context

Pop OS 22.04
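Not part of the original report, but a quick way to confirm the diagnosis above is to inspect the `bot_config` table directly and check whether the `created_at` column exists. The database path below is a guess, not a documented Marvin location; point it at wherever your install keeps its sqlite file.

```python
import os
import sqlite3

# Hypothetical location of Marvin's sqlite database -- adjust for your install.
db_path = os.path.expanduser("~/.marvin/marvin.sqlite")

columns = []
if os.path.exists(db_path):
    with sqlite3.connect(db_path) as db:
        # PRAGMA table_info returns one row per column; index 1 is the name.
        columns = [row[1] for row in db.execute("PRAGMA table_info(bot_config)")]
print(columns)
```

If `created_at` is absent from the printed list, the on-disk schema predates the column and needs to be migrated or recreated.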

Store plugin outputs in a loader

First check

  • I added a descriptive title to this issue.
  • I used the GitHub search to look for a similar issue and didn't find it.
  • I searched the Marvin documentation for this feature.

Describe the current behavior

When you create a plugin and its output is too large for the model's context window, the call simply fails.
It would be nice if the output were stored as vectors instead, so it could be used in subsequent steps rather than failing.
I searched for similar issues; I found related ones, but none that describe exactly this.

A similar issue is #162

Describe the proposed behavior

When you declare a plugin, provide a way to store its result rather than feeding the whole output into a message, so the result can serve as a knowledge base instead of being consumed directly and potentially failing because of its length.

Example Use

No response

Additional context

No response
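A rough sketch of the proposed behavior. A keyword match stands in for real vector similarity search, and none of these names (`KnowledgeBase`, `add`, `search`) are Marvin APIs -- they only illustrate the idea of returning a short handle instead of the full plugin output.

```python
# Hypothetical sketch: store oversized plugin output as retrievable chunks
# instead of feeding it straight to the model.

class KnowledgeBase:
    def __init__(self, chunk_size: int = 1000):
        self.chunk_size = chunk_size
        self.chunks: list[str] = []

    def add(self, plugin_output: str) -> str:
        # Split the output and keep it for later retrieval.
        new = [
            plugin_output[i : i + self.chunk_size]
            for i in range(0, len(plugin_output), self.chunk_size)
        ]
        self.chunks.extend(new)
        # Return a short handle instead of the full text.
        return f"Stored {len(new)} chunks; search the knowledge base for details."

    def search(self, keyword: str) -> list[str]:
        # Stand-in for vector similarity search over embeddings.
        return [c for c in self.chunks if keyword.lower() in c.lower()]
```

The bot would then see only the short confirmation string, and later steps could query the stored chunks as needed.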

Error while opening the server.

error on commit ba4fe3b

(base) PS C:\Users\jalba\Sage> marvin server start
[04/07/23 16:33:58] INFO     marvin.marvin: Using OpenAI model "gpt-3.5-turbo"                                                                                    logging.py:50
INFO:marvin.marvin:[default on default]Using OpenAI model "gpt-3.5-turbo"[/]
INFO:     Started server process [1952]
INFO:     Waiting for application startup.
INFO:     Application startup complete.
INFO:     Uvicorn running on http://127.0.0.1:4200 (Press CTRL+C to quit)
INFO:     127.0.0.1:56318 - "GET / HTTP/1.1" 307 Temporary Redirect
INFO:     127.0.0.1:56318 - "GET /docs HTTP/1.1" 200 OK
INFO:     127.0.0.1:56318 - "GET /openapi.json HTTP/1.1" 500 Internal Server Error
ERROR:    Exception in ASGI application
Traceback (most recent call last):
  File "C:\Users\jalba\miniconda3\lib\site-packages\uvicorn\protocols\http\httptools_impl.py", line 436, in run_asgi
    result = await app(  # type: ignore[func-returns-value]
  File "C:\Users\jalba\miniconda3\lib\site-packages\uvicorn\middleware\proxy_headers.py", line 78, in __call__
    return await self.app(scope, receive, send)
  File "C:\Users\jalba\miniconda3\lib\site-packages\fastapi\applications.py", line 276, in __call__
    await super().__call__(scope, receive, send)
  File "C:\Users\jalba\miniconda3\lib\site-packages\starlette\applications.py", line 122, in __call__
    await self.middleware_stack(scope, receive, send)
  File "C:\Users\jalba\miniconda3\lib\site-packages\starlette\middleware\errors.py", line 184, in __call__
    raise exc
  File "C:\Users\jalba\miniconda3\lib\site-packages\starlette\middleware\errors.py", line 162, in __call__
    await self.app(scope, receive, _send)
  File "C:\Users\jalba\miniconda3\lib\site-packages\starlette\middleware\cors.py", line 84, in __call__
    await self.app(scope, receive, send)
  File "C:\Users\jalba\miniconda3\lib\site-packages\starlette\middleware\exceptions.py", line 79, in __call__
    raise exc
  File "C:\Users\jalba\miniconda3\lib\site-packages\starlette\middleware\exceptions.py", line 68, in __call__
    await self.app(scope, receive, sender)
  File "C:\Users\jalba\miniconda3\lib\site-packages\fastapi\middleware\asyncexitstack.py", line 21, in __call__
    raise e
  File "C:\Users\jalba\miniconda3\lib\site-packages\fastapi\middleware\asyncexitstack.py", line 18, in __call__
    await self.app(scope, receive, send)
  File "C:\Users\jalba\miniconda3\lib\site-packages\starlette\routing.py", line 718, in __call__
    await route.handle(scope, receive, send)
  File "C:\Users\jalba\miniconda3\lib\site-packages\starlette\routing.py", line 276, in handle
    await self.app(scope, receive, send)
  File "C:\Users\jalba\miniconda3\lib\site-packages\starlette\routing.py", line 66, in app
    response = await func(request)
  File "C:\Users\jalba\miniconda3\lib\site-packages\fastapi\applications.py", line 231, in openapi
    return JSONResponse(self.openapi())
  File "C:\Users\jalba\miniconda3\lib\site-packages\fastapi\applications.py", line 206, in openapi
    self.openapi_schema = get_openapi(
  File "C:\Users\jalba\miniconda3\lib\site-packages\fastapi\openapi\utils.py", line 423, in get_openapi
    definitions = get_model_definitions(
  File "C:\Users\jalba\miniconda3\lib\site-packages\fastapi\utils.py", line 44, in get_model_definitions
    m_schema, m_definitions, m_nested_models = model_process_schema(
  File "pydantic\schema.py", line 581, in pydantic.schema.model_process_schema
  File "pydantic\schema.py", line 622, in pydantic.schema.model_type_schema
  File "pydantic\schema.py", line 255, in pydantic.schema.field_schema
  File "pydantic\schema.py", line 462, in pydantic.schema.field_type_schema
  File "pydantic\schema.py", line 850, in pydantic.schema.field_singleton_schema
  File "pydantic\schema.py", line 701, in pydantic.schema.field_singleton_sub_fields_schema
  File "pydantic\schema.py", line 527, in pydantic.schema.field_type_schema
  File "pydantic\schema.py", line 924, in pydantic.schema.field_singleton_schema
  File "C:\Users\jalba\miniconda3\lib\abc.py", line 123, in __subclasscheck__
    return _abc_subclasscheck(cls, subclass)
TypeError: issubclass() arg 1 must be a class

Add pre/post message formatting hooks

In order to avoid situations where bots (such as utility bots) misinterpret a message as an instruction rather than as text to process, introduce processing hooks that manipulate messages. For example, "The text to process is: {message}" could be the transformation applied to utility bot inputs.
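A minimal sketch of what such hooks might look like. The names `pre_hook`, `post_hook`, and the toy `bot` callable are illustrative only, not Marvin's API:

```python
# Illustrative pre/post message hooks around a bot call; names are hypothetical.

def pre_hook(message: str) -> str:
    # Frame user text as data to process, not as an instruction.
    return f"The text to process is: {message}"

def post_hook(response: str) -> str:
    # Clean up the bot's raw response before returning it.
    return response.strip()

def handle(message: str, bot=str.upper) -> str:
    # A real implementation would call the bot's LLM here;
    # str.upper stands in so the flow is runnable.
    return post_hook(bot(pre_hook(message)))
```

With this shape, a utility bot's input transformer is just a `pre_hook` that wraps the message in a fixed template.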

I don't have the ability to visit external websites

The homepage says the following about bot plugins:
Plugins
Plugins allow bots to access new information and functionality. By default, bots have plugins that let them browse the internet, visit URLs, and run simple calculations.

When I try to ask the bot to access a website and answer questions about it, it tells me the following:
"
│ I'm sorry for the confusion earlier. As an AI language model, I don't have the ability to visit external websites. │
│ However, I can still provide some feedback based on what you tell me about your website or any other website you're │
│ interested in.
"

Are bots allowed to browse the internet and visit URLs?

Creating a custom bot and then using in the TUI fails

Context:
marvin version 0.7.4

Created bot based on example in the docs.

Script:

from marvin import Bot
import asyncio

async def create_ford_bot():
    ford_bot = Bot(
        name="Ford",
        personality="Can't get the hang of Thursdays",
        instructions="Always responds as if researching an article for the Hitchhiker's Guide to the Galaxy",
    )

    await ford_bot.save()

asyncio.run(create_ford_bot())

Run the file.

Bot is found in list.

Then: marvin chat

Results in error:

                                                                                                             │
│    19 │   │   RenderError: If the object can not be rendered.                                               │
│    20 │   """                                                                                               │
│    21 │   if not is_renderable(renderable):                                                                 │
│ ❱  22 │   │   raise RenderError(                                                                            │
│    23 │   │   │   f"unable to render {renderable!r}; a string, Text, or other Rich renderable               │
│    24 │   │   )                                                                                             │
│    25                                                                                                       │
│                                                                                                             │
│ ╭───── locals ──────╮                                                                                       │
│ │ renderable = None │                                                                                       │
│ ╰───────────────────╯                                                                                       │
╰─────────────────────────────────────────────────────────────────────────────────────────────────────────────╯
RenderError: unable to render None; a string, Text, or other Rich renderable is required

Marvin pip install issues in windows 11

I tried installing marvin per the instructions (in a new virtual environment). During installation I got some errors before the pip install completed, somewhat successfully:

[screenshot of pip install errors]

But when I run the `marvin chat` command in the VS Code terminal, nothing happens and I don't receive an error either.

I know this is not much information but this is all I have.

Thank you in advance for any assistance :)

marvin setup-openai fails with "cannot find context for fork"

Traceback (most recent call last):
  File "C:\Users\ccrow\AppData\Local\Programs\Python\Python310\lib\runpy.py", line 196, in _run_module_as_main
    return _run_code(code, main_globals, None,
  File "C:\Users\ccrow\AppData\Local\Programs\Python\Python310\lib\runpy.py", line 86, in _run_code
    exec(code, run_globals)
  File "C:\Users\ccrow\AppData\Local\Programs\Python\Python310\Scripts\marvin.exe\__main__.py", line 4, in <module>
  File "C:\Users\ccrow\AppData\Local\Programs\Python\Python310\lib\site-packages\marvin\__init__.py", line 18, in <module>
    from marvin.utilities.logging import get_logger
  File "C:\Users\ccrow\AppData\Local\Programs\Python\Python310\lib\site-packages\marvin\utilities\__init__.py", line 1, in <module>
    from . import logging, async_utils, types, strings, collections, models
  File "C:\Users\ccrow\AppData\Local\Programs\Python\Python310\lib\site-packages\marvin\utilities\async_utils.py", line 10, in <module>
    process_pool = concurrent.futures.ProcessPoolExecutor(mp_context=mp.get_context("fork"))
  File "C:\Users\ccrow\AppData\Local\Programs\Python\Python310\lib\multiprocessing\context.py", line 243, in get_context
    return super().get_context(method)
  File "C:\Users\ccrow\AppData\Local\Programs\Python\Python310\lib\multiprocessing\context.py", line 193, in get_context
    raise ValueError('cannot find context for %r' % method) from None
ValueError: cannot find context for 'fork'
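The traceback points at a hard-coded `"fork"` start method, which only exists on POSIX systems; Windows supports only `"spawn"`. A portable version of that line could fall back to the platform default. This is a hedged sketch of the idea, not Marvin's actual code:

```python
import concurrent.futures
import multiprocessing as mp

# "fork" is POSIX-only; on Windows, mp.get_context("fork") raises ValueError,
# so fall back to the platform default start method ("spawn" on Windows).
try:
    _ctx = mp.get_context("fork")
except ValueError:
    _ctx = mp.get_context()

process_pool = concurrent.futures.ProcessPoolExecutor(mp_context=_ctx)
```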

JS/TS version & a few questions

This is phenomenal 🔥

I'd love to help add TS support for the core ai_fn decorator.

  1. What are your internal plans for JS/TS support?
  2. How much is tied to Marvin the company & platform vs being an agnostic tool like LangChain that can call into any hosted or local LLM provider?
  3. Any advice on where to start either in this monorepo or separately on a JS/TS port of the ai_fn functionality in particular?

Thanks && congrats on your public launch! 🙏

Get marvin to import correctly when running other python processes on multiple threads

When importing marvin into a Streamlit app process, it kept conflicting with the main thread running the Streamlit app. I worked around it by adding thread checks within marvin. I'd love for this to be the default.

# __init__.py
from importlib.metadata import version as _get_version

import asyncio as _asyncio
import threading

# load nest_asyncio
import nest_asyncio as _nest_asyncio

if threading.current_thread() == threading.main_thread():
    _nest_asyncio.apply()
else:
    # off the main thread there is no running event loop, so create one first
    _asyncio.set_event_loop(_asyncio.new_event_loop())
    _nest_asyncio.apply()
