
Comments (11)

nickjohn1912 commented on July 4, 2024

Sorry for the late reply @chrisrude, and yeah, that was exactly what I meant. I think that compromise would work well. For me it gets a bit confusing to see who the bot responded to if there are more than several people, but I think that idea would work. Thanks for the response, and hope all goes well. It's a great bot nonetheless, and I have enjoyed using it.

from oobabot.

nickjohn1912 commented on July 4, 2024

Alright, I gave it a try in my group and it works wonderfully. Thank you very much, and for sharing this kind of thing.


chrisrude commented on July 4, 2024

By reply, do you mean having the bot's responses posted as a "reply" in Discord (with the little link at the top to what it is responding to?)

If so, I actually had it do that initially, but it had some challenges. Curious what you think about the tradeoffs...

On slower channels, it would work fine. The bot would be summoned by a message, it would read and respond to it, and tag the message that it was responding to as a reply. Everything worked fine.

But on busier channels, what can happen is that after the bot is summoned, but before it can reply, other messages are posted. The bot then has a choice... does it "see" those newer messages when building its reply?

If it doesn't see those other messages, the response it writes is relevant to the original request, but might be out-of-context relative to the overall flow of the conversation. For instance someone might have asked a different question in the meantime, and it could be unclear to the participants who the bot was referring to.

If it does "see" those other messages, it's the same problem but the other way around. With the additional context it may no longer decide to reply to the message that summoned it, but rather a later request.

Having tried it both ways, the latter feels more like a natural participant, but of course YMMV based on your server.

Thinking out loud, one compromise I could think of would be to have the bot respond differently based on whether or not it was hard-summoned (i.e. referred to by its name or an @-mention). In that case, we could follow the first path: don't include any context after the @-mention, and then post its response as a reply.

But then when sending an "unsolicited" response it would always include the full context, and not explicitly mark any message as one it is replying to.
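As a rough sketch of that compromise (pure Python, with hypothetical names — oobabot's actual internals differ), the branching might look like this: on a hard summon, truncate the context at the summoning message and reply-tag it; otherwise use the full history and send a plain message.

```python
from dataclasses import dataclass

@dataclass
class ChannelMessage:
    author: str
    content: str
    message_id: int  # Discord message IDs (snowflakes) increase over time

def build_prompt_context(history, summon_id, hard_summoned):
    """Hypothetical sketch of the compromise described above.

    - hard summon (name or @-mention): drop any context posted after the
      summoning message, and post the response as a Discord reply to it.
    - unsolicited response: use the full channel history, and don't
      reply-tag any particular message.

    Returns (messages_to_include_in_the_prompt, reply_to_id_or_None).
    """
    if hard_summoned:
        # ignore anything posted after the @-mention, so the response
        # stays relevant to the message that summoned the bot
        context = [m for m in history if m.message_id <= summon_id]
        return context, summon_id
    # unsolicited: full context, no explicit reply target
    return list(history), None
```

The snowflake comparison stands in for "posted before the summon"; a real implementation could equally filter on message timestamps.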

I think I might play around with this behavior to see how it feels, but other ideas are welcome!


nickjohn1912 commented on July 4, 2024

I'll close this since I got the answer I was looking for.


chrisrude commented on July 4, 2024

Glad you got the answer you needed! I'd like to keep this open until I can get the change you want done. (It helps me track the work to do.)


chrisrude commented on July 4, 2024

This is committed as 6e2ebcf

@nickjohn1912 if you're able to run from source, do you want to test this out? It should just magically work.

Otherwise it should be included in the next release (0.1.8).


nickjohn1912 commented on July 4, 2024

@chrisrude Glad I came to check here. Yeah, I will check this out for you. Apologies for my erratic timing; I've got some projects going, which makes my communication random.


nickjohn1912 commented on July 4, 2024

But I can wait as well if needed. I understand that, being a dev, it's a time-consuming task. I'll give it a try when I can.


rebek43 commented on July 4, 2024

@chrisrude Something I've noticed with this is that with response splitting on, every single split part of the response replies to the same message, which can look a little messy. Off the top of my head there are two obvious solutions: either have only the first split part reply to the request message (which I guess would sometimes make it unclear where the response ended), or have the whole response be a single message (which is very humanlike but could be slow).
I think the second solution could be more elegant, and it could also be used to revive the deprecated response streaming feature: instead of dumping the entire response at once, the bot would split it up as usual, but instead of sending the parts as different messages, it could edit the original message split by split. (My impression is that this would call the Discord API far fewer times than the original text streaming implementation and work much more smoothly.)
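A minimal sketch of that "half-streaming" idea (hypothetical helper, not oobabot's actual code): send the first chunk as one message, optionally reply-tagged, then grow it by editing the same message once per additional chunk — one API call per chunk rather than one message per chunk. discord.py's `Messageable.send` and `Message.edit` are the real calls this shape is modeled on; here `channel_send` is assumed to be any async callable returning an object with an async `.edit(content=...)` method.

```python
import asyncio

async def post_with_chunked_edits(channel_send, split_responses, reply_to=None):
    """Post a multi-chunk response as a single, progressively edited message.

    channel_send: async callable(content=..., reference=...) returning a
        message object with an async .edit(content=...) method.
    split_responses: iterable of text chunks (e.g. sentence splits).
    reply_to: optional message reference for the initial reply tag.
    """
    it = iter(split_responses)
    try:
        first = next(it)
    except StopIteration:
        return None  # nothing to say
    # one send for the first chunk (carries the reply tag, if any)...
    message = await channel_send(content=first, reference=reply_to)
    accumulated = first
    # ...then one edit per remaining chunk, growing the same message
    for chunk in it:
        accumulated += "\n" + chunk
        await message.edit(content=accumulated)
    return message
```

For a response split into N chunks this makes N API calls total (1 send + N-1 edits), versus one call per token for the old streaming approach.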


chrisrude commented on July 4, 2024

Yeah, I noticed the same thing. I think keeping the reply tagging is important, since otherwise it could be hard to understand what the bot is doing.

The half-streaming idea is interesting! I'll play around with that and see how it feels.


chrisrude commented on July 4, 2024

Split this into a separate open issue, so that I don't miss the convo on the closed one.

