
lsp-devtools's People

Contributors

alcarney, dependabot[bot], github-actions[bot], laurencewarne, pre-commit-ci[bot], thofrank, tobb10001, tombh


lsp-devtools's Issues

Format string fallback option

Currently, any message that does not match a format string given to lsp-devtools record is not shown. This is useful; however, there are situations where you still want the message to be shown in its raw form.

There should be a CLI option to lsp-devtools record to control this.

lsp-devtools integration

It would be great to be able to run a set of tests with pytest-lsp and have it forward the traffic to the lsp-devtools set of tools.

It should be possible to request other pytest fixtures from `@pytest_lsp.fixture`

Consider the following fixture definition

@pytest_lsp.fixture(
    scope="session",
    params=["visual_studio_code", "neovim"],
    config=ClientServerConfig(
        server_command=[sys.executable, *SERVER_CMD],
    ),
)
async def client(request, uri_for, lsp_client: LanguageClient):
    workspace = uri_for("sphinx-default", "workspace")

    await lsp_client.initialize_session(
        types.InitializeParams(
            capabilities=client_capabilities(request.param),
            workspace_folders=[
                types.WorkspaceFolder(uri=str(workspace), name="sphinx-default"),
            ],
        )
    )

    yield

    # Teardown
    await lsp_client.shutdown_session()

This results in the following error:

    @pytest_asyncio.fixture(**kwargs)
    async def the_fixture(request):
        client_server = make_client_server(config)
        await client_server.start()
    
        kwargs = get_fixture_arguments(fn, client_server.client, request)
>       result = fn(**kwargs)
E       TypeError: client() missing 1 required positional argument: 'uri_for'
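
For what it's worth, a rough sketch of one way the fixture wrapper could resolve the extra arguments, assuming it has access to pytest's request object (this is not pytest-lsp's actual implementation; the names below are stand-ins):

import inspect

def get_fixture_arguments(fn, client, request):
    """Build the kwargs for the user's fixture function (illustrative only)."""
    kwargs = {}
    for name in inspect.signature(fn).parameters:
        if name == "lsp_client":
            kwargs[name] = client  # the client managed by pytest-lsp
        elif name == "request":
            kwargs[name] = request  # pytest's own request object
        else:
            # Anything else (e.g. uri_for) is resolved as a regular fixture.
            kwargs[name] = request.getfixturevalue(name)
    return kwargs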

Record and playback?

It might be useful to use lsp-devtools record to capture an LSP session, then use pytest-lsp to replay that session and make sure that the server behaves as expected.

Test-client hangs and integrating with Pygls

Continuing on from the discussion in the Pygls v1 thread about hanging CI.

Basically, yes absolutely, I think it'd be great to formally have pytest-lsp as part of the Pygls ecosystem 🙇

So let's see here what would be involved to address test-client hangs, which we've identified as a shared problem. Maybe even using pytest-lsp as the solution for Pygls' CI hangs, as well as being the excuse to wholly rely on it in Pygls for all testing. There are even some ongoing internal discussions about creating a pygls-org GitHub organisation, for which pytest-lsp would be an obvious candidate member.

Fun fact, the test client in pytest-lsp was originally based on the LSP client in pygls' test suite, it's just been adapted to talk to a subprocess rather than an os.pipe() and I'm facing the exact same issue.

What are the advantages of subprocess over os.pipe()? I assume the former is async?

I've been playing around with a few ideas (the most promising having a background thread cancel all pending tasks when the subprocess exits), but nothing that works in all cases...

So I assume that the exiting of the subprocess (namely the tested LSP server instance) isn't a problem? At the very least, just prepending POSIX's timeout reliably protects against hangs, right? E.g.: timeout 5 python custom-pygls-server.py.

The exiting of client "processes" (or rather async futures, yes?) is the issue at hand, right? I was thinking that maybe a dedicated test client could have its own request method that all tests used, giving a single site for setting and guaranteeing a timeout. How does your background thread relate to that idea? Am I mixing up two different things?
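
For reference, a minimal sketch of that idea, assuming the client exposes an awaitable send_request_async coroutine (the names and the default timeout are placeholders):

import asyncio

DEFAULT_TIMEOUT = 10.0  # seconds; placeholder value

async def request_with_timeout(client, method, params, timeout=DEFAULT_TIMEOUT):
    """Send a request to the server, but never wait forever for the reply."""
    try:
        return await asyncio.wait_for(client.send_request_async(method, params), timeout)
    except asyncio.TimeoutError:
        raise RuntimeError(f"No response to {method!r} within {timeout}s") from None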

Also, changing the subject a little ... looking through the code here, I briefly noticed those JSON files that seem to have serialised editor requests? What do you use them for? Is it because there is such divergence in how each editor sends LSP client requests?

`workspace/configuration` support

It should be possible to provide some settings on the test language client and have it respond accordingly to workspace/configuration requests.
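
A minimal sketch of what that could look like, assuming the test client exposes pygls' feature() decorator and that the settings dict is supplied by the test author:

from lsprotocol import types

def add_configuration_support(client, settings: dict):
    """Answer workspace/configuration requests from a plain dict of settings."""

    @client.feature(types.WORKSPACE_CONFIGURATION)
    def on_configuration(params: types.ConfigurationParams):
        # The server asks for one or more configuration items; return one
        # result per item, falling back to an empty dict for unknown sections.
        return [settings.get(item.section or "", {}) for item in params.items]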

Warning when running `lsp_devtools record`

/usr/lib64/python3.11/asyncio/base_events.py:692: ResourceWarning: unclosed event loop <_UnixSelectorEventLoop running=False closed=False debug=False>
  _warn(f"unclosed event loop {self!r}", ResourceWarning, source=self)
ResourceWarning: Enable tracemalloc to get the object allocation traceback

Generally this happens when hitting Ctrl-C, but I have seen it show up a few times mid-session.

Not really sure where this is coming from... especially since nothing seems broken...

Deprecation warning

============================================================================================================= test session starts =============================================================================================================
platform linux -- Python 3.7.16, pytest-7.2.2, pluggy-1.0.0
rootdir: /tmp/pytest-of-alex/pytest-9/test_detect_invalid_json0, configfile: tox.ini
plugins: typeguard-2.13.3, asyncio-0.21.0, lsp-0.2.1
asyncio: mode=auto
collected 1 item                                                                                                                                                                                                                              

test_detect_invalid_json.py FE                                                                                                                                                                                                          [100%]

=================================================================================================================== ERRORS ====================================================================================================================
___________________________________________________________________________________________________ ERROR at teardown of test_capabilities ____________________________________________________________________________________________________

    def finalizer() -> None:
        """Yield again, to finalize."""
    
        async def async_finalizer() -> None:
            try:
                await gen_obj.__anext__()
            except StopAsyncIteration:
                pass
            else:
                msg = "Async generator fixture didn't stop."
                msg += "Yield only once."
                raise ValueError(msg)
    
>       event_loop.run_until_complete(async_finalizer())

/var/home/alex/Projects/lsp-devtools/lib/pytest-lsp/.py37/lib64/python3.7/site-packages/pytest_asyncio/plugin.py:296: 
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _

self = <_UnixSelectorEventLoop running=False closed=True debug=False>
future = <Task cancelled coro=<_wrap_asyncgen_fixture.<locals>._asyncgen_fixture_wrapper.<locals>.finalizer.<locals>.async_fina... /var/home/alex/Projects/lsp-devtools/lib/pytest-lsp/.py37/lib64/python3.7/site-packages/pytest_asyncio/plugin.py:286>>

    def run_until_complete(self, future):
        """Run until the Future is done.
    
        If the argument is a coroutine, it is wrapped in a Task.
    
        WARNING: It would be disastrous to call run_until_complete()
        with the same coroutine twice -- it would wrap it in two
        different Tasks and that can't be good.
    
        Return the Future's result, or raise its exception.
        """
        self._check_closed()
        self._check_runnung()
    
        new_task = not futures.isfuture(future)
        future = tasks.ensure_future(future, loop=self)
        if new_task:
            # An exception is raised if the future didn't complete, so there
            # is no need to log the "destroy pending task" message
            future._log_destroy_pending = False
    
        future.add_done_callback(_run_until_complete_cb)
        try:
            self.run_forever()
        except:
            if new_task and future.done() and not future.cancelled():
                # The coroutine raised a BaseException. Consume the exception
                # to not log a warning, the caller doesn't have access to the
                # local task.
                future.exception()
            raise
        finally:
            future.remove_done_callback(_run_until_complete_cb)
        if not future.done():
            raise RuntimeError('Event loop stopped before Future completed.')
    
>       return future.result()
E       concurrent.futures._base.CancelledError

/usr/lib64/python3.7/asyncio/base_events.py:587: CancelledError
------------------------------------------------------------------------------------------------------------ Captured stdout setup ------------------------------------------------------------------------------------------------------------
--------------------------------------------------------------
task: <Task pending coro=<_wrap_asyncgen_fixture.<locals>._asyncgen_fixture_wrapper.<locals>.setup() running at /var/home/alex/Projects/lsp-devtools/lib/pytest-lsp/.py37/lib64/python3.7/site-packages/pytest_asyncio/plugin.py:280> wait_for=<Future pending cb=[_chain_future.<locals>._call_check_cancel() at /usr/lib64/python3.7/asyncio/futures.py:351, <TaskWakeupMethWrapper object at 0x7f06d8bba4d0>()]> cb=[_run_until_complete_cb() at /usr/lib64/python3.7/asyncio/base_events.py:157]>
task: <Task pending coro=<check_server_process() running at /var/home/alex/Projects/lsp-devtools/lib/pytest-lsp/.py37/lib64/python3.7/site-packages/pytest_lsp/plugin.py:33>>
task: <Task pending coro=<aio_readline() running at /var/home/alex/Projects/lsp-devtools/lib/pytest-lsp/.py37/lib64/python3.7/site-packages/pygls/server.py:61> wait_for=<Future pending cb=[_chain_future.<locals>._call_check_cancel() at /usr/lib64/python3.7/asyncio/futures.py:351, <TaskWakeupMethWrapper object at 0x7f06d8b9dd10>()]>>
--------------------------------------------------------------
task: <Task pending coro=<_wrap_asyncgen_fixture.<locals>._asyncgen_fixture_wrapper.<locals>.setup() running at /var/home/alex/Projects/lsp-devtools/lib/pytest-lsp/.py37/lib64/python3.7/site-packages/pytest_asyncio/plugin.py:280> wait_for=<Future pending cb=[_chain_future.<locals>._call_check_cancel() at /usr/lib64/python3.7/asyncio/futures.py:351, <TaskWakeupMethWrapper object at 0x7f06d8bba4d0>()]> cb=[_run_until_complete_cb() at /usr/lib64/python3.7/asyncio/base_events.py:157]>
task: <Task pending coro=<check_server_process() running at /var/home/alex/Projects/lsp-devtools/lib/pytest-lsp/.py37/lib64/python3.7/site-packages/pytest_lsp/plugin.py:33>>
task: <Task pending coro=<aio_readline() running at /var/home/alex/Projects/lsp-devtools/lib/pytest-lsp/.py37/lib64/python3.7/site-packages/pygls/server.py:61> wait_for=<Future pending cb=[_chain_future.<locals>._call_check_cancel() at /usr/lib64/python3.7/asyncio/futures.py:351, <TaskWakeupMethWrapper object at 0x7f06d8b9dd10>()]>>
--------------------------------------------------------------
task: <Task pending coro=<_wrap_asyncgen_fixture.<locals>._asyncgen_fixture_wrapper.<locals>.setup() running at /var/home/alex/Projects/lsp-devtools/lib/pytest-lsp/.py37/lib64/python3.7/site-packages/pytest_asyncio/plugin.py:280> wait_for=<Future pending cb=[_chain_future.<locals>._call_check_cancel() at /usr/lib64/python3.7/asyncio/futures.py:351, <TaskWakeupMethWrapper object at 0x7f06d8bba4d0>()]> cb=[_run_until_complete_cb() at /usr/lib64/python3.7/asyncio/base_events.py:157]>
task: <Task pending coro=<check_server_process() running at /var/home/alex/Projects/lsp-devtools/lib/pytest-lsp/.py37/lib64/python3.7/site-packages/pytest_lsp/plugin.py:33>>
task: <Task pending coro=<aio_readline() running at /var/home/alex/Projects/lsp-devtools/lib/pytest-lsp/.py37/lib64/python3.7/site-packages/pygls/server.py:61> wait_for=<Future pending cb=[_chain_future.<locals>._call_check_cancel() at /usr/lib64/python3.7/asyncio/futures.py:351, <TaskWakeupMethWrapper object at 0x7f06d8b9dd10>()]>>
--------------------------------------------------------------
task: <Task pending coro=<_wrap_asyncgen_fixture.<locals>._asyncgen_fixture_wrapper.<locals>.setup() running at /var/home/alex/Projects/lsp-devtools/lib/pytest-lsp/.py37/lib64/python3.7/site-packages/pytest_asyncio/plugin.py:280> wait_for=<Future pending cb=[_chain_future.<locals>._call_check_cancel() at /usr/lib64/python3.7/asyncio/futures.py:351, <TaskWakeupMethWrapper object at 0x7f06d8bba4d0>()]> cb=[_run_until_complete_cb() at /usr/lib64/python3.7/asyncio/base_events.py:157]>
task: <Task pending coro=<check_server_process() running at /var/home/alex/Projects/lsp-devtools/lib/pytest-lsp/.py37/lib64/python3.7/site-packages/pytest_lsp/plugin.py:33>>
task: <Task pending coro=<aio_readline() running at /var/home/alex/Projects/lsp-devtools/lib/pytest-lsp/.py37/lib64/python3.7/site-packages/pygls/server.py:61> wait_for=<Future pending cb=[_chain_future.<locals>._call_check_cancel() at /usr/lib64/python3.7/asyncio/futures.py:351, <TaskWakeupMethWrapper object at 0x7f06d8b9dd10>()]>>
--------------------------------------------------------------
task: <Task pending coro=<_wrap_asyncgen_fixture.<locals>._asyncgen_fixture_wrapper.<locals>.setup() running at /var/home/alex/Projects/lsp-devtools/lib/pytest-lsp/.py37/lib64/python3.7/site-packages/pytest_asyncio/plugin.py:280> wait_for=<Future pending cb=[_chain_future.<locals>._call_check_cancel() at /usr/lib64/python3.7/asyncio/futures.py:351, <TaskWakeupMethWrapper object at 0x7f06d8bba4d0>()]> cb=[_run_until_complete_cb() at /usr/lib64/python3.7/asyncio/base_events.py:157]>
task: <Task pending coro=<check_server_process() running at /var/home/alex/Projects/lsp-devtools/lib/pytest-lsp/.py37/lib64/python3.7/site-packages/pytest_lsp/plugin.py:33>>
task: <Task pending coro=<aio_readline() running at /var/home/alex/Projects/lsp-devtools/lib/pytest-lsp/.py37/lib64/python3.7/site-packages/pygls/server.py:61> wait_for=<Future pending cb=[_chain_future.<locals>._call_check_cancel() at /usr/lib64/python3.7/asyncio/futures.py:351, <TaskWakeupMethWrapper object at 0x7f06d8b9dd10>()]>>
--------------------------------------------------------------
task: <Task pending coro=<_wrap_asyncgen_fixture.<locals>._asyncgen_fixture_wrapper.<locals>.setup() running at /var/home/alex/Projects/lsp-devtools/lib/pytest-lsp/.py37/lib64/python3.7/site-packages/pytest_asyncio/plugin.py:280> wait_for=<Future pending cb=[_chain_future.<locals>._call_check_cancel() at /usr/lib64/python3.7/asyncio/futures.py:351, <TaskWakeupMethWrapper object at 0x7f06d8bba4d0>()]> cb=[_run_until_complete_cb() at /usr/lib64/python3.7/asyncio/base_events.py:157]>
task: <Task pending coro=<check_server_process() running at /var/home/alex/Projects/lsp-devtools/lib/pytest-lsp/.py37/lib64/python3.7/site-packages/pytest_lsp/plugin.py:33>>
task: <Task pending coro=<aio_readline() running at /var/home/alex/Projects/lsp-devtools/lib/pytest-lsp/.py37/lib64/python3.7/site-packages/pygls/server.py:61> wait_for=<Future pending cb=[_chain_future.<locals>._call_check_cancel() at /usr/lib64/python3.7/asyncio/futures.py:351, <TaskWakeupMethWrapper object at 0x7f06d8b9dd10>()]>>
--------------------------------------------------------------
task: <Task pending coro=<_wrap_asyncgen_fixture.<locals>._asyncgen_fixture_wrapper.<locals>.setup() running at /var/home/alex/Projects/lsp-devtools/lib/pytest-lsp/.py37/lib64/python3.7/site-packages/pytest_asyncio/plugin.py:280> wait_for=<Future pending cb=[_chain_future.<locals>._call_check_cancel() at /usr/lib64/python3.7/asyncio/futures.py:351, <TaskWakeupMethWrapper object at 0x7f06d8bba4d0>()]> cb=[_run_until_complete_cb() at /usr/lib64/python3.7/asyncio/base_events.py:157]>
task: <Task pending coro=<check_server_process() running at /var/home/alex/Projects/lsp-devtools/lib/pytest-lsp/.py37/lib64/python3.7/site-packages/pytest_lsp/plugin.py:33>>
task: <Task pending coro=<aio_readline() running at /var/home/alex/Projects/lsp-devtools/lib/pytest-lsp/.py37/lib64/python3.7/site-packages/pygls/server.py:61> wait_for=<Future pending cb=[_chain_future.<locals>._call_check_cancel() at /usr/lib64/python3.7/asyncio/futures.py:351, <TaskWakeupMethWrapper object at 0x7f06d8b9dd10>()]>>
------------------------------------------------------------------------------------------------------------ Captured stdout call -------------------------------------------------------------------------------------------------------------
cancel: <Task pending coro=<test_capabilities() running at /tmp/pytest-of-alex/pytest-9/test_detect_invalid_json0/test_detect_invalid_json.py:15> wait_for=<Future pending cb=[_chain_future.<locals>._call_check_cancel() at /usr/lib64/python3.7/asyncio/futures.py:351, <TaskWakeupMethWrapper object at 0x7f06d8bc1550>()]> cb=[_run_until_complete_cb() at /usr/lib64/python3.7/asyncio/base_events.py:157]>
cancel: <Task pending coro=<check_server_process() running at /var/home/alex/Projects/lsp-devtools/lib/pytest-lsp/.py37/lib64/python3.7/site-packages/pytest_lsp/plugin.py:45> wait_for=<Future pending cb=[<TaskWakeupMethWrapper object at 0x7f06d8b9dd50>()]>>
-------------------------------------------------------------------------------------------------------------- Captured log call --------------------------------------------------------------------------------------------------------------
ERROR    pygls.protocol:protocol.py:512 Error receiving data
Traceback (most recent call last):
  File "/var/home/alex/Projects/lsp-devtools/lib/pytest-lsp/.py37/lib64/python3.7/site-packages/pygls/protocol.py", line 510, in data_received
    self._data_received(data)
  File "/var/home/alex/Projects/lsp-devtools/lib/pytest-lsp/.py37/lib64/python3.7/site-packages/pygls/protocol.py", line 543, in _data_received
    object_hook=self._deserialize_message))
  File "/usr/lib64/python3.7/json/__init__.py", line 361, in loads
    return cls(**kw).decode(s)
  File "/usr/lib64/python3.7/json/decoder.py", line 337, in decode
    obj, end = self.raw_decode(s, idx=_w(s, 0).end())
  File "/usr/lib64/python3.7/json/decoder.py", line 353, in raw_decode
    obj, end = self.scan_once(s, idx)
json.decoder.JSONDecodeError: Expecting property name enclosed in double quotes: line 1 column 2 (char 1)
================================================================================================================== FAILURES ===================================================================================================================
______________________________________________________________________________________________________________ test_capabilities ______________________________________________________________________________________________________________

args = (), kwargs = {'client': <pytest_lsp.client.LanguageClient object at 0x7f06d8c18050>}, coro = <coroutine object test_capabilities at 0x7f06d8b23e60>
task = <Task cancelled coro=<test_capabilities() done, defined at /tmp/pytest-of-alex/pytest-9/test_detect_invalid_json0/test_detect_invalid_json.py:7>>

    @functools.wraps(func)
    def inner(*args, **kwargs):
        coro = func(*args, **kwargs)
        if not inspect.isawaitable(coro):
            pyfuncitem.warn(
                pytest.PytestWarning(
                    f"The test {pyfuncitem} is marked with '@pytest.mark.asyncio' "
                    "but it is not an async function. "
                    "Please remove asyncio marker. "
                    "If the test is not marked explicitly, "
                    "check for global markers applied via 'pytestmark'."
                )
            )
            return
        task = asyncio.ensure_future(coro, loop=_loop)
        try:
>           _loop.run_until_complete(task)

/var/home/alex/Projects/lsp-devtools/lib/pytest-lsp/.py37/lib64/python3.7/site-packages/pytest_asyncio/plugin.py:525: 
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _

self = <_UnixSelectorEventLoop running=False closed=False debug=False>, future = <Task cancelled coro=<test_capabilities() done, defined at /tmp/pytest-of-alex/pytest-9/test_detect_invalid_json0/test_detect_invalid_json.py:7>>

    def run_until_complete(self, future):
        """Run until the Future is done.
    
        If the argument is a coroutine, it is wrapped in a Task.
    
        WARNING: It would be disastrous to call run_until_complete()
        with the same coroutine twice -- it would wrap it in two
        different Tasks and that can't be good.
    
        Return the Future's result, or raise its exception.
        """
        self._check_closed()
        self._check_runnung()
    
        new_task = not futures.isfuture(future)
        future = tasks.ensure_future(future, loop=self)
        if new_task:
            # An exception is raised if the future didn't complete, so there
            # is no need to log the "destroy pending task" message
            future._log_destroy_pending = False
    
        future.add_done_callback(_run_until_complete_cb)
        try:
            self.run_forever()
        except:
            if new_task and future.done() and not future.cancelled():
                # The coroutine raised a BaseException. Consume the exception
                # to not log a warning, the caller doesn't have access to the
                # local task.
                future.exception()
            raise
        finally:
            future.remove_done_callback(_run_until_complete_cb)
        if not future.done():
            raise RuntimeError('Event loop stopped before Future completed.')
    
>       return future.result()
E       concurrent.futures._base.CancelledError

/usr/lib64/python3.7/asyncio/base_events.py:587: CancelledError
------------------------------------------------------------------------------------------------------------ Captured stdout setup ------------------------------------------------------------------------------------------------------------
--------------------------------------------------------------
task: <Task pending coro=<_wrap_asyncgen_fixture.<locals>._asyncgen_fixture_wrapper.<locals>.setup() running at /var/home/alex/Projects/lsp-devtools/lib/pytest-lsp/.py37/lib64/python3.7/site-packages/pytest_asyncio/plugin.py:280> wait_for=<Future pending cb=[_chain_future.<locals>._call_check_cancel() at /usr/lib64/python3.7/asyncio/futures.py:351, <TaskWakeupMethWrapper object at 0x7f06d8bba4d0>()]> cb=[_run_until_complete_cb() at /usr/lib64/python3.7/asyncio/base_events.py:157]>
task: <Task pending coro=<check_server_process() running at /var/home/alex/Projects/lsp-devtools/lib/pytest-lsp/.py37/lib64/python3.7/site-packages/pytest_lsp/plugin.py:33>>
task: <Task pending coro=<aio_readline() running at /var/home/alex/Projects/lsp-devtools/lib/pytest-lsp/.py37/lib64/python3.7/site-packages/pygls/server.py:61> wait_for=<Future pending cb=[_chain_future.<locals>._call_check_cancel() at /usr/lib64/python3.7/asyncio/futures.py:351, <TaskWakeupMethWrapper object at 0x7f06d8b9dd10>()]>>
--------------------------------------------------------------
task: <Task pending coro=<_wrap_asyncgen_fixture.<locals>._asyncgen_fixture_wrapper.<locals>.setup() running at /var/home/alex/Projects/lsp-devtools/lib/pytest-lsp/.py37/lib64/python3.7/site-packages/pytest_asyncio/plugin.py:280> wait_for=<Future pending cb=[_chain_future.<locals>._call_check_cancel() at /usr/lib64/python3.7/asyncio/futures.py:351, <TaskWakeupMethWrapper object at 0x7f06d8bba4d0>()]> cb=[_run_until_complete_cb() at /usr/lib64/python3.7/asyncio/base_events.py:157]>
task: <Task pending coro=<check_server_process() running at /var/home/alex/Projects/lsp-devtools/lib/pytest-lsp/.py37/lib64/python3.7/site-packages/pytest_lsp/plugin.py:33>>
task: <Task pending coro=<aio_readline() running at /var/home/alex/Projects/lsp-devtools/lib/pytest-lsp/.py37/lib64/python3.7/site-packages/pygls/server.py:61> wait_for=<Future pending cb=[_chain_future.<locals>._call_check_cancel() at /usr/lib64/python3.7/asyncio/futures.py:351, <TaskWakeupMethWrapper object at 0x7f06d8b9dd10>()]>>
--------------------------------------------------------------
task: <Task pending coro=<_wrap_asyncgen_fixture.<locals>._asyncgen_fixture_wrapper.<locals>.setup() running at /var/home/alex/Projects/lsp-devtools/lib/pytest-lsp/.py37/lib64/python3.7/site-packages/pytest_asyncio/plugin.py:280> wait_for=<Future pending cb=[_chain_future.<locals>._call_check_cancel() at /usr/lib64/python3.7/asyncio/futures.py:351, <TaskWakeupMethWrapper object at 0x7f06d8bba4d0>()]> cb=[_run_until_complete_cb() at /usr/lib64/python3.7/asyncio/base_events.py:157]>
task: <Task pending coro=<check_server_process() running at /var/home/alex/Projects/lsp-devtools/lib/pytest-lsp/.py37/lib64/python3.7/site-packages/pytest_lsp/plugin.py:33>>
task: <Task pending coro=<aio_readline() running at /var/home/alex/Projects/lsp-devtools/lib/pytest-lsp/.py37/lib64/python3.7/site-packages/pygls/server.py:61> wait_for=<Future pending cb=[_chain_future.<locals>._call_check_cancel() at /usr/lib64/python3.7/asyncio/futures.py:351, <TaskWakeupMethWrapper object at 0x7f06d8b9dd10>()]>>
--------------------------------------------------------------
task: <Task pending coro=<_wrap_asyncgen_fixture.<locals>._asyncgen_fixture_wrapper.<locals>.setup() running at /var/home/alex/Projects/lsp-devtools/lib/pytest-lsp/.py37/lib64/python3.7/site-packages/pytest_asyncio/plugin.py:280> wait_for=<Future pending cb=[_chain_future.<locals>._call_check_cancel() at /usr/lib64/python3.7/asyncio/futures.py:351, <TaskWakeupMethWrapper object at 0x7f06d8bba4d0>()]> cb=[_run_until_complete_cb() at /usr/lib64/python3.7/asyncio/base_events.py:157]>
task: <Task pending coro=<check_server_process() running at /var/home/alex/Projects/lsp-devtools/lib/pytest-lsp/.py37/lib64/python3.7/site-packages/pytest_lsp/plugin.py:33>>
task: <Task pending coro=<aio_readline() running at /var/home/alex/Projects/lsp-devtools/lib/pytest-lsp/.py37/lib64/python3.7/site-packages/pygls/server.py:61> wait_for=<Future pending cb=[_chain_future.<locals>._call_check_cancel() at /usr/lib64/python3.7/asyncio/futures.py:351, <TaskWakeupMethWrapper object at 0x7f06d8b9dd10>()]>>
--------------------------------------------------------------
task: <Task pending coro=<_wrap_asyncgen_fixture.<locals>._asyncgen_fixture_wrapper.<locals>.setup() running at /var/home/alex/Projects/lsp-devtools/lib/pytest-lsp/.py37/lib64/python3.7/site-packages/pytest_asyncio/plugin.py:280> wait_for=<Future pending cb=[_chain_future.<locals>._call_check_cancel() at /usr/lib64/python3.7/asyncio/futures.py:351, <TaskWakeupMethWrapper object at 0x7f06d8bba4d0>()]> cb=[_run_until_complete_cb() at /usr/lib64/python3.7/asyncio/base_events.py:157]>
task: <Task pending coro=<check_server_process() running at /var/home/alex/Projects/lsp-devtools/lib/pytest-lsp/.py37/lib64/python3.7/site-packages/pytest_lsp/plugin.py:33>>
task: <Task pending coro=<aio_readline() running at /var/home/alex/Projects/lsp-devtools/lib/pytest-lsp/.py37/lib64/python3.7/site-packages/pygls/server.py:61> wait_for=<Future pending cb=[_chain_future.<locals>._call_check_cancel() at /usr/lib64/python3.7/asyncio/futures.py:351, <TaskWakeupMethWrapper object at 0x7f06d8b9dd10>()]>>
--------------------------------------------------------------
task: <Task pending coro=<_wrap_asyncgen_fixture.<locals>._asyncgen_fixture_wrapper.<locals>.setup() running at /var/home/alex/Projects/lsp-devtools/lib/pytest-lsp/.py37/lib64/python3.7/site-packages/pytest_asyncio/plugin.py:280> wait_for=<Future pending cb=[_chain_future.<locals>._call_check_cancel() at /usr/lib64/python3.7/asyncio/futures.py:351, <TaskWakeupMethWrapper object at 0x7f06d8bba4d0>()]> cb=[_run_until_complete_cb() at /usr/lib64/python3.7/asyncio/base_events.py:157]>
task: <Task pending coro=<check_server_process() running at /var/home/alex/Projects/lsp-devtools/lib/pytest-lsp/.py37/lib64/python3.7/site-packages/pytest_lsp/plugin.py:33>>
task: <Task pending coro=<aio_readline() running at /var/home/alex/Projects/lsp-devtools/lib/pytest-lsp/.py37/lib64/python3.7/site-packages/pygls/server.py:61> wait_for=<Future pending cb=[_chain_future.<locals>._call_check_cancel() at /usr/lib64/python3.7/asyncio/futures.py:351, <TaskWakeupMethWrapper object at 0x7f06d8b9dd10>()]>>
--------------------------------------------------------------
task: <Task pending coro=<_wrap_asyncgen_fixture.<locals>._asyncgen_fixture_wrapper.<locals>.setup() running at /var/home/alex/Projects/lsp-devtools/lib/pytest-lsp/.py37/lib64/python3.7/site-packages/pytest_asyncio/plugin.py:280> wait_for=<Future pending cb=[_chain_future.<locals>._call_check_cancel() at /usr/lib64/python3.7/asyncio/futures.py:351, <TaskWakeupMethWrapper object at 0x7f06d8bba4d0>()]> cb=[_run_until_complete_cb() at /usr/lib64/python3.7/asyncio/base_events.py:157]>
task: <Task pending coro=<check_server_process() running at /var/home/alex/Projects/lsp-devtools/lib/pytest-lsp/.py37/lib64/python3.7/site-packages/pytest_lsp/plugin.py:33>>
task: <Task pending coro=<aio_readline() running at /var/home/alex/Projects/lsp-devtools/lib/pytest-lsp/.py37/lib64/python3.7/site-packages/pygls/server.py:61> wait_for=<Future pending cb=[_chain_future.<locals>._call_check_cancel() at /usr/lib64/python3.7/asyncio/futures.py:351, <TaskWakeupMethWrapper object at 0x7f06d8b9dd10>()]>>
------------------------------------------------------------------------------------------------------------ Captured stdout call -------------------------------------------------------------------------------------------------------------
cancel: <Task pending coro=<test_capabilities() running at /tmp/pytest-of-alex/pytest-9/test_detect_invalid_json0/test_detect_invalid_json.py:15> wait_for=<Future pending cb=[_chain_future.<locals>._call_check_cancel() at /usr/lib64/python3.7/asyncio/futures.py:351, <TaskWakeupMethWrapper object at 0x7f06d8bc1550>()]> cb=[_run_until_complete_cb() at /usr/lib64/python3.7/asyncio/base_events.py:157]>
cancel: <Task pending coro=<check_server_process() running at /var/home/alex/Projects/lsp-devtools/lib/pytest-lsp/.py37/lib64/python3.7/site-packages/pytest_lsp/plugin.py:45> wait_for=<Future pending cb=[<TaskWakeupMethWrapper object at 0x7f06d8b9dd50>()]>>
-------------------------------------------------------------------------------------------------------------- Captured log call --------------------------------------------------------------------------------------------------------------
ERROR    pygls.protocol:protocol.py:512 Error receiving data
Traceback (most recent call last):
  File "/var/home/alex/Projects/lsp-devtools/lib/pytest-lsp/.py37/lib64/python3.7/site-packages/pygls/protocol.py", line 510, in data_received
    self._data_received(data)
  File "/var/home/alex/Projects/lsp-devtools/lib/pytest-lsp/.py37/lib64/python3.7/site-packages/pygls/protocol.py", line 543, in _data_received
    object_hook=self._deserialize_message))
  File "/usr/lib64/python3.7/json/__init__.py", line 361, in loads
    return cls(**kw).decode(s)
  File "/usr/lib64/python3.7/json/decoder.py", line 337, in decode
    obj, end = self.raw_decode(s, idx=_w(s, 0).end())
  File "/usr/lib64/python3.7/json/decoder.py", line 353, in raw_decode
    obj, end = self.scan_once(s, idx)
json.decoder.JSONDecodeError: Expecting property name enclosed in double quotes: line 1 column 2 (char 1)
============================================================================================================== warnings summary ===============================================================================================================
test_detect_invalid_json.py::test_capabilities
  /var/home/alex/Projects/lsp-devtools/lib/pytest-lsp/.py37/lib64/python3.7/site-packages/pytest_asyncio/plugin.py:446: DeprecationWarning: pytest-asyncio detected an unclosed event loop when tearing down the event_loop
  fixture: <_UnixSelectorEventLoop running=False closed=False debug=False>
  pytest-asyncio will close the event loop for you, but future versions of the
  library will no longer do so. In order to ensure compatibility with future
  versions, please make sure that:
      1. Any custom "event_loop" fixture properly closes the loop after yielding it
      2. Your code does not modify the event loop in async fixtures or tests
  
    DeprecationWarning,

-- Docs: https://docs.pytest.org/en/stable/how-to/capture-warnings.html
=========================================================================================================== short test summary info ===========================================================================================================
FAILED test_detect_invalid_json.py::test_capabilities - concurrent.futures._base.CancelledError
ERROR test_detect_invalid_json.py::test_capabilities - concurrent.futures._base.CancelledError
==================================================================================================== 1 failed, 1 warning, 1 error in 0.97s ====================================================================================================

Not entirely sure why the event loop is not being closed by this test case....
I'm assuming it has something to do with cancelling tasks, but as far as I can tell it's not cancelling anything internal to pytest-asyncio?

pytest-asyncio v0.23 support

This release includes some breaking changes that at the very least will require an update to the documentation.

Something else to investigate is that esbonio's test suite will hang (on my machine at least...) if a test requests a fixture with a scope and that scope is not also explicitly requested in the test's mark:

@pytest_lsp.fixture(
    scope="session",
    config=ClientServerConfig(
        server_command=[sys.executable, *SERVER_CMD],
    ),
)
async def client(lsp_client: LanguageClient, uri_for, tmp_path_factory):
    ...

@pytest.mark.asyncio(scope="session")  # <-- hangs if not set!
async def test_workspace_symbols(client: LanguageClient, ...):
    ...

Interestingly, however, the following tests appear to run fine without issue:

import asyncio

import pytest
import pytest_asyncio

@pytest_asyncio.fixture(scope="session")
async def slow_start():
    await asyncio.sleep(1)
    yield True

@pytest.mark.asyncio
async def test_one(slow_start):
    await asyncio.sleep(1)
    assert slow_start is True

@pytest.mark.asyncio
async def test_two(slow_start):
    await asyncio.sleep(1)
    assert slow_start is False

This suggests that pytest-lsp is perhaps doing something that doesn't quite agree with pytest-asyncio's new setup.

Investigate TCP based agent

Websockets are useful, especially when connecting to the browser; however, TCP is supported by the stdlib (and pygls!) and (I think) would allow us to depend on pure Python wheels, making distribution just that little bit easier.
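
For context, a stdlib-only sketch of the kind of transport involved; the host, port and handler below are placeholders rather than lsp-devtools' actual defaults:

import asyncio

async def handle_agent(reader: asyncio.StreamReader, writer: asyncio.StreamWriter):
    """Receive raw LSP traffic from a connected agent and echo it locally."""
    while not reader.at_eof():
        data = await reader.read(4096)
        if not data:
            break
        print(data.decode("utf-8", errors="replace"), end="")
    writer.close()
    await writer.wait_closed()

async def main(host="localhost", port=8765):
    server = await asyncio.start_server(handle_agent, host, port)
    async with server:
        await server.serve_forever()

if __name__ == "__main__":
    asyncio.run(main())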

Cannot start agent

I'm pretty sure I was able to do this a short while ago and I'm not sure what's changed. When I try to start the agent I get multiple stamina.retry_scheduled messages and when I issue a keyboard interrupt I get the following output:

➜  lsp-devtools agent -- /Users/XXXX/delme/.venv/bin/ngnklsp
stamina.retry_scheduled
stamina.retry_scheduled
stamina.retry_scheduled
^CTraceback (most recent call last):
  File "/opt/local/Library/Frameworks/Python.framework/Versions/3.11/lib/python3.11/asyncio/runners.py", line 118, in run
    return self._loop.run_until_complete(task)
           ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "/opt/local/Library/Frameworks/Python.framework/Versions/3.11/lib/python3.11/asyncio/base_events.py", line 653, in run_until_complete
    return future.result()
           ^^^^^^^^^^^^^^^
  File "/Users/XXXX/.local/pipx/venvs/lsp-devtools/lib/python3.11/site-packages/lsp_devtools/agent/__init__.py", line 80, in main
    await asyncio.gather(
  File "/Users/XXXX/.local/pipx/venvs/lsp-devtools/lib/python3.11/site-packages/lsp_devtools/agent/client.py", line 63, in start_tcp
    async for attempt in retries:
  File "/Users/XXXX/.local/pipx/venvs/lsp-devtools/lib/python3.11/site-packages/stamina/_core.py", line 462, in __anext__
    return Attempt(await self._t_a_retrying.__anext__())
                   ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "/Users/XXXX/.local/pipx/venvs/lsp-devtools/lib/python3.11/site-packages/tenacity/_asyncio.py", line 78, in __anext__
    await self.sleep(do)
  File "/Users/XXXX/.local/pipx/venvs/lsp-devtools/lib/python3.11/site-packages/stamina/_core.py", line 42, in _smart_sleep
    await asyncio.sleep(delay)
  File "/opt/local/Library/Frameworks/Python.framework/Versions/3.11/lib/python3.11/asyncio/tasks.py", line 649, in sleep
    return await future
           ^^^^^^^^^^^^
asyncio.exceptions.CancelledError

During handling of the above exception, another exception occurred:

Traceback (most recent call last):
  File "/Users/XXXX/.local/bin/lsp-devtools", line 8, in <module>
    sys.exit(main())
             ^^^^^^
  File "/Users/XXXX/.local/pipx/venvs/lsp-devtools/lib/python3.11/site-packages/lsp_devtools/cli.py", line 57, in main
    return parsed_args.run(parsed_args, extra)
           ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "/Users/XXXX/.local/pipx/venvs/lsp-devtools/lib/python3.11/site-packages/lsp_devtools/agent/__init__.py", line 87, in run_agent
    asyncio.run(main(args, extra))
  File "/opt/local/Library/Frameworks/Python.framework/Versions/3.11/lib/python3.11/asyncio/runners.py", line 190, in run
    return runner.run(main)
           ^^^^^^^^^^^^^^^^
  File "/opt/local/Library/Frameworks/Python.framework/Versions/3.11/lib/python3.11/asyncio/runners.py", line 123, in run
    raise KeyboardInterrupt()
KeyboardInterrupt

➜

Here's the version of lsp-devtools I'm running:

➜  lsp-devtools --version
lsp-devtools v0.2.2
➜

And my OS: macOS Sonoma Version 14.2.1 (23C71)

I have no idea what's going on. When I run without lsp-devtools agent -- it seems to be fine.

Capabilities override

Provide a cli option to override the set of capabilities a set of tests are run with.

VSCode extension?

It should be possible to host the inspector app in a webview provided by a VSCode extension...

record --to-file not dumping conformant json.

When using lsp-devtools record --to-file example.json, the output seems to be formatted as Python objects rather than proper JSON, making it hard to use third-party tools like jq to examine the output.

Is there any reason not to use json.dumps() here? Or, if it's optional, would it make sense to make JSON the default?

I see this in the code, but looking at the docs it's not clear to me how to simply format the output as JSON. Perhaps this line should be higher up in this function?

Here's a typical line of output I'm seeing:

{'jsonrpc': '2.0', 'method': 'initialize', 'params': {'processId': 41917, 'rootPath': '/Users/XXXX/K/ngnk/lsp', 'clientInfo': {'name': 'emacs', 'version': 'GNU Emacs 29.1 (build 1, aarch64-apple-darwin21.6.0, NS appkit-2113.60 Version 12.6.6 (Build 21G646))\n of 2023-08-17'}, 'rootUri': 'file:///Users/XXXX/K/ngnk/lsp', 'capabilities': {'general': {'positionEncodings': ['utf-32', 'utf-16']}, 'workspace': {'workspaceEdit': {'documentChanges': True, 'resourceOperations': ['create', 'rename', 'delete']}, 'applyEdit': True, 'symbol': {'symbolKind': {'valueSet': [1, 2, 3, 4, 5, 6, 7, 8, 9, 10, 11, 12, 13, 14, 15, 16, 17, 18, 19, 20, 21, 22, 23, 24, 25, 26]}}, 'executeCommand': {'dynamicRegistration': False}, 'didChangeWatchedFiles': {'dynamicRegistration': True}, 'workspaceFolders': True, 'configuration': True, 'semanticTokens': {'refreshSupport': True}, 'codeLens': {'refreshSupport': True}, 'fileOperations': {'didCreate': False, 'willCreate': False, 'didRename': True, 'willRename': True, 'didDelete': False, 'willDelete': False}}, 'textDocument': {'declaration': {'dynamicRegistration': True, 'linkSupport': True}, 'definition': {'dynamicRegistration': True, 'linkSupport': True}, 'references': {'dynamicRegistration': True}, 'implementation': {'dynamicRegistration': True, 'linkSupport': True}, 'typeDefinition': {'dynamicRegistration': True, 'linkSupport': True}, 'synchronization': {'willSave': True, 'didSave': True, 'willSaveWaitUntil': True}, 'documentSymbol': {'symbolKind': {'valueSet': [1, 2, 3, 4, 5, 6, 7, 8, 9, 10, 11, 12, 13, 14, 15, 16, 17, 18, 19, 20, 21, 22, 23, 24, 25, 26]}, 'hierarchicalDocumentSymbolSupport': True}, 'formatting': {'dynamicRegistration': True}, 'rangeFormatting': {'dynamicRegistration': True}, 'onTypeFormatting': {'dynamicRegistration': True}, 'semanticTokens': {'dynamicRegistration': True, 'requests': {'range': True, 'full': True}, 'tokenModifiers': ['declaration', 'definition', 'implementation', 'readonly', 'static', 'deprecated', 'abstract', 'async', 'modification', 'documentation', 'defaultLibrary'], 'overlappingTokenSupport': True, 'multilineTokenSupport': True, 'tokenTypes': ['comment', 'keyword', 'string', 'number', 'regexp', 'operator', 'namespace', 'type', 'struct', 'class', 'interface', 'enum', 'typeParameter', 'function', 'method', 'member', 'property', 'event', 'macro', 'variable', 'parameter', 'label', 'enumConstant', 'enumMember', 'dependent', 'concept'], 'formats': ['relative']}, 'rename': {'dynamicRegistration': True, 'prepareSupport': True}, 'codeAction': {'dynamicRegistration': True, 'isPreferredSupport': True, 'codeActionLiteralSupport': {'codeActionKind': {'valueSet': ['', 'quickfix', 'refactor', 'refactor.extract', 'refactor.inline', 'refactor.rewrite', 'source', 'source.organizeImports']}}, 'resolveSupport': {'properties': ['edit', 'command']}, 'dataSupport': True}, 'completion': {'completionItem': {'snippetSupport': True, 'documentationFormat': ['markdown', 'plaintext'], 'resolveAdditionalTextEditsSupport': True, 'insertReplaceSupport': True, 'deprecatedSupport': True, 'resolveSupport': {'properties': ['documentation', 'detail', 'additionalTextEdits', 'command']}, 'insertTextModeSupport': {'valueSet': [1, 2]}}, 'contextSupport': True, 'dynamicRegistration': True}, 'signatureHelp': {'signatureInformation': {'parameterInformation': {'labelOffsetSupport': True}}, 'dynamicRegistration': True}, 'documentLink': {'dynamicRegistration': True, 'tooltipSupport': True}, 'hover': {'contentFormat': ['markdown', 'plaintext'], 'dynamicRegistration': True}, 
'foldingRange': {'dynamicRegistration': True}, 'selectionRange': {'dynamicRegistration': True}, 'callHierarchy': {'dynamicRegistration': False}, 'typeHierarchy': {'dynamicRegistration': True}, 'publishDiagnostics': {'relatedInformation': True, 'tagSupport': {'valueSet': [1, 2]}, 'versionSupport': True}, 'linkedEditingRange': {'dynamicRegistration': True}}, 'window': {'workDoneProgress': True, 'showDocument': {'support': True}}}, 'initializationOptions': None, 'workDoneToken': '1'}, 'id': 232}
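
For comparison, the difference being described, reproduced with a stripped-down message:

import json

message = {"jsonrpc": "2.0", "method": "initialize", "params": {"processId": 41917}}

# Writing the dict's repr produces Python syntax: single quotes, True/None, etc.
print(str(message))         # {'jsonrpc': '2.0', 'method': 'initialize', ...}

# json.dumps() produces output that jq and other third-party tools can consume.
print(json.dumps(message))  # {"jsonrpc": "2.0", "method": "initialize", ...}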

Investigate multiple client connections

Still a valid issue, but the architecture is now different - can we have multiple agents report into a single server command?

Since the agent is wrapped in a websocket server, there shouldn't be a reason why we can't have multiple clients connected to the same agent.

Yet when I try running two separate lsp-devtools record sessions, the first mysteriously stops receiving messages once the second client connects...

Config file and "profiles"

It would be useful for lsp-devtools to be able to read configuration from, say, pyproject.toml.

This would open the possibility of introducing named profiles as shortcuts for common use cases, e.g.

  • lsp-devtools record --profile log => lsp-devtools record -f "{.params.type|MessageType}: {.params.message}"
  • lsp-devtools record --profile registration => lsp-devtools record --include-method initialize --include-method client/registerCapability --include-method client/unregisterCapability
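
As a sketch, reading such profiles could be as simple as the following; the [tool.lsp-devtools.profiles] table shown in the comment is hypothetical, not something lsp-devtools defines today:

import tomllib  # Python 3.11+; older versions can use the tomli package

# Hypothetical layout:
#
#   [tool.lsp-devtools.profiles.log]
#   format = "{.params.type|MessageType}: {.params.message}"

def load_profile(name: str, path: str = "pyproject.toml") -> dict:
    """Look up a named profile from pyproject.toml, returning {} if absent."""
    with open(path, "rb") as f:
        config = tomllib.load(f)
    profiles = config.get("tool", {}).get("lsp-devtools", {}).get("profiles", {})
    return profiles.get(name, {})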

Client selection syntax

Currently, we only accept a client's name and return the first file we find that matches.
However, now that we'll soon have multiple versions of a client we need a syntax to select between them e.g.

pytest skipping "getting started" test with async error

I'm seeing an error trying to run the sample server + test from the getting started docs. Specifically, pytest skips the test instead of running it.

Re-creating

  1. Create new venv & activate
  2. Install pytest v7.3.2, pygls v1.0.2 and pytest-lsp v0.3.0
  3. Create server.py and test_server.py from the docs
  4. Run pytest

Result:

================================ test session starts =================================
platform linux -- Python 3.10.6, pytest-7.3.2, pluggy-1.1.0
rootdir: /home/sfinnie/projects/lsp/pytest-lsp
plugins: lsp-0.3.0, typeguard-3.0.2, asyncio-0.21.0
asyncio: mode=strict
collected 1 item                                                                     

test_server.py s                                                               [100%]

================================== warnings summary ==================================
test_server.py::test_completions
  /home/sfinnie/projects/lsp/pytest-lsp/venv/lib/python3.10/site-packages/_pytest/python.py:183: PytestUnhandledCoroutineWarning: async def functions are not natively supported and have been skipped.
  You need to install a suitable plugin for your async framework, for example:
    - anyio
    - pytest-asyncio
    - pytest-tornasync
    - pytest-trio
    - pytest-twisted
    warnings.warn(PytestUnhandledCoroutineWarning(msg.format(nodeid)))

test_server.py::test_completions
  /home/sfinnie/projects/lsp/pytest-lsp/venv/lib/python3.10/site-packages/pytest_asyncio/plugin.py:444: DeprecationWarning: pytest-asyncio detected an unclosed event loop when tearing down the event_loop
  fixture: <_UnixSelectorEventLoop running=False closed=False debug=False>
  pytest-asyncio will close the event loop for you, but future versions of the
  library will no longer do so. In order to ensure compatibility with future
  versions, please make sure that:
      1. Any custom "event_loop" fixture properly closes the loop after yielding it
      2. Your code does not modify the event loop in async fixtures or tests
  
    warnings.warn(

-- Docs: https://docs.pytest.org/en/stable/how-to/capture-warnings.html
=========================== 1 skipped, 2 warnings in 0.64s ===========================

Resolution (partial)

Marking the test as async means pytest runs it successfully:

import pytest
from pytest_lsp import LanguageClient

@pytest.mark.asyncio
async def test_completions(client: LanguageClient):
    # test body unchanged

It still generates a warning though:

================================ test session starts =================================
platform linux -- Python 3.10.6, pytest-7.3.2, pluggy-1.1.0
rootdir: /home/sfinnie/projects/lsp/pytest-lsp
plugins: lsp-0.3.0, typeguard-3.0.2, asyncio-0.21.0
asyncio: mode=strict
collected 1 item                                                                     

test_server.py .                                                               [100%]

================================== warnings summary ==================================
test_server.py::test_completions
  /home/sfinnie/projects/lsp/pytest-lsp/venv/lib/python3.10/site-packages/pytest_asyncio/plugin.py:444: DeprecationWarning: pytest-asyncio detected an unclosed event loop when tearing down the event_loop
  fixture: <_UnixSelectorEventLoop running=False closed=False debug=False>
  pytest-asyncio will close the event loop for you, but future versions of the
  library will no longer do so. In order to ensure compatibility with future
  versions, please make sure that:
      1. Any custom "event_loop" fixture properly closes the loop after yielding it
      2. Your code does not modify the event loop in async fixtures or tests
  
    warnings.warn(

-- Docs: https://docs.pytest.org/en/stable/how-to/capture-warnings.html
============================ 1 passed, 1 warning in 0.65s ============================

I think that's covered in a separate issue though.

Alternative Resolution

Rather than having to decorate every test as async, set asyncio_mode in pyproject.toml:

[tool.pytest.ini_options]
asyncio_mode = "auto"

The test will then run as per the docs (though it still generates the warning).

HTH.

Fix `lsp-devtools` packaging

For some reason half the files are missing from the v0.1.0 release on PyPI...

This is why we have tests!!! 😬

Windows support

The lsp-devtools agent does not work on Windows, due to the way it currently tries to obtain async readers/writers for stdin/stdout.
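
For context, a sketch of the two approaches usually weighed up here; this is not lsp-devtools' actual code:

import asyncio
import sys

async def connect_stdin_posix() -> asyncio.StreamReader:
    """The usual POSIX trick: wrap stdin in a StreamReader via connect_read_pipe.
    This is the part that tends to fail on Windows, where pipes are not selectable."""
    loop = asyncio.get_running_loop()
    reader = asyncio.StreamReader()
    await loop.connect_read_pipe(lambda: asyncio.StreamReaderProtocol(reader), sys.stdin)
    return reader

async def read_stdin_portable() -> bytes:
    """A portable fallback: perform the blocking read in a worker thread."""
    loop = asyncio.get_running_loop()
    return await loop.run_in_executor(None, sys.stdin.buffer.readline)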

Persistence

Currently, the lsp-devtools package supports two modes of operation:

  1. Recording messages to a SQLite db to be inspected at a later time
  2. Real time streaming of messages over web sockets to the proof of concept UI

The UI would need to support both modes; I'm not entirely sure at the moment how best to unify the two approaches...

(pytest-lsp) Handling null Results

Hi, pytest-lsp has made my life a lot easier recently, so thanks for putting it together 🙂

For some responses like HoverResponse and CompletionResponse, a server can return null (link), and (AFAICS) the pygls LanguageServer.send_request_async returns None in such circumstances, which causes pytest_lsp.Client to throw an exception when it tries to stick the result into, for example, a Hover. So, for example, when I call Client.hover_request at a position where there is no hover, the result is:

TypeError: pygls.lsp.types.language_features.hover.Hover() argument after ** must be a mapping, not NoneType

Perhaps None could be returned?

Let me know if that makes sense, thanks!
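
A minimal sketch of the suggested behaviour, using lsprotocol-style types purely for illustration:

from typing import Optional

from lsprotocol import types

def to_hover(result: Optional[dict]) -> Optional[types.Hover]:
    """Deserialise a textDocument/hover result, passing a null response through."""
    if result is None:
        return None  # the server legitimately has nothing to show here
    return types.Hover(**result)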

Commands do not exit cleanly

$ .env/bin/lsp-devtools record -f "{.params}" --to-file logs.json
^C^CException in callback StreamReaderProtocol.connection_made.<locals>.callback(<Task cancell...server.py:43>>) at /usr/lib64/python3.12/asyncio/streams.py:248
handle: <Handle StreamReaderProtocol.connection_made.<locals>.callback(<Task cancell...server.py:43>>) at /usr/lib64/python3.12/asyncio/streams.py:248>
Traceback (most recent call last):
  File "/usr/lib64/python3.12/asyncio/runners.py", line 194, in run
    return runner.run(main)
           ^^^^^^^^^^^^^^^^
  File "/usr/lib64/python3.12/asyncio/runners.py", line 118, in run
    return self._loop.run_until_complete(task)
           ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "/usr/lib64/python3.12/asyncio/base_events.py", line 671, in run_until_complete
    self.run_forever()
  File "/usr/lib64/python3.12/asyncio/base_events.py", line 638, in run_forever
    self._run_once()
  File "/usr/lib64/python3.12/asyncio/base_events.py", line 1933, in _run_once
    event_list = self._selector.select(timeout)
                 ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "/usr/lib64/python3.12/selectors.py", line 468, in select
    fd_event_list = self._selector.poll(timeout, max_ev)
                    ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "/usr/lib64/python3.12/asyncio/runners.py", line 157, in _on_sigint
    raise KeyboardInterrupt()
KeyboardInterrupt

During handling of the above exception, another exception occurred:

Traceback (most recent call last):
  File "/usr/lib64/python3.12/asyncio/events.py", line 84, in _run
    self._context.run(self._callback, *self._args)
  File "/usr/lib64/python3.12/asyncio/streams.py", line 249, in callback
    exc = task.exception()
          ^^^^^^^^^^^^^^^^
  File "/var/home/alex/Projects/lsp-devtools/lib/lsp-devtools/lsp_devtools/agent/server.py", line 45, in handle_client
    await aio_readline(self._stop_event, reader, self.lsp.data_received)
  File "/var/home/alex/Projects/lsp-devtools/.env/lib64/python3.12/site-packages/pygls/client.py", line 48, in aio_readline
    header = await reader.readline()
             ^^^^^^^^^^^^^^^^^^^^^^^
  File "/usr/lib64/python3.12/asyncio/streams.py", line 565, in readline
    line = await self.readuntil(sep)
           ^^^^^^^^^^^^^^^^^^^^^^^^^
  File "/usr/lib64/python3.12/asyncio/streams.py", line 657, in readuntil
    await self._wait_for_data('readuntil')
  File "/usr/lib64/python3.12/asyncio/streams.py", line 542, in _wait_for_data
    await self._waiter
asyncio.exceptions.CancelledError
^CTask was destroyed but it is pending!
task: <Task cancelling name='Task-1' coro=<AgentServer.start_tcp() done, defined at /var/home/alex/Projects/lsp-devtools/lib/lsp-devtools/lsp_devtools/agent/server.py:42> wait_for=<Future pending cb=[Task.task_wakeup()]> cb=[gather.<locals>._done_callback() at /usr/lib64/python3.12/asyncio/tasks.py:767]>

I think this is loosely related to #131

Server process does not exit

This may ultimately be an issue with esbonio (or even pygls?), but it is not guaranteed that the server process exits when the client disappears 😅

`pytest-lsp`: See server's STDERR in pytest output

Or: How to debug the server behind pytest?

For debugging, I usually

  • set up a test to reproduce the behaviour
  • run the test case
  • use the logging from my system to trace what's happening

Unfortunately I haven't found a way to access the logs of my server with pytest-lsp in a nice way. I think it would be great to see the server's STDERR output in the console right next to the test results (or have an option for this behaviour), just as pytest does by default when working with pure Python systems. I think this could work by just forwarding the server subprocess' STDERR to the test process' STDERR.

My current workaround is to have my server log to a file and check that in another terminal, but that's kind of tedious.
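
A minimal sketch of the forwarding idea, assuming the server is started with asyncio's subprocess API (pytest-lsp's actual startup code may differ):

import asyncio
import sys

async def start_server(cmd):
    """Start the language server, letting its stderr flow straight through to pytest."""
    return await asyncio.create_subprocess_exec(
        *cmd,
        stdin=asyncio.subprocess.PIPE,
        stdout=asyncio.subprocess.PIPE,
        stderr=sys.stderr,  # or None to simply inherit the parent's stderr
    )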

record --to-file having trouble with unicode?

I'll try to debug further, but when I try to use lsp-devtools record --to-file to record a session connecting my LSP server to a client, with a file which contains Unicode, I get a bunch of errors and the record session seems to effectively bail. The server itself seems to keep running. I've attached a script session of attempting to record, the output and the file. It does use my LSP server but I suspect that's not relevant.

The file simply contains (∊). Note that the output (test.json) doesn't record past the initialized message, though I interacted with it further before shutting it down.

I'm happy to poke around further, but it might be easier for you to try to reproduce with whatever test server you use than for me to explain mine.
test.json
test.script.txt
unitest.k.txt

LSP agent does not exit cleanly

Even though the editor running the wrapped LSP server has closed, the process is still there in the background, holding onto the bound port.
