
Comments (5)

welcome commented on August 21, 2024

Thank you for opening your first issue in this project! Engagement like this is essential for open source projects! 🤗

If you haven't done so already, check out Jupyter's Code of Conduct. Also, please try to follow the issue template, as it helps other community members contribute more effectively.
You can meet the other Jovyans by joining our Discourse forum. There is also an intro thread there where you can stop by and say Hi! 👋

Welcome to the Jupyter community! 🎉


bollwyvl commented on August 21, 2024


trungleduc commented on August 21, 2024

Thanks @bollwyvl for clarifying the situation! I'm quite new to the subject, so maybe I missed previous discussions about LSP and JupyterLab; please correct me if I'm wrong.
The LSP kernel idea looks promising, and I agree with most of your points, but I still have some questions about abandoning the WebSocket approach.

  • On multiple WebSocket connections: by using another frontend extension as the entry point for the LSP service, I think we only need one WebSocket connection for all the language servers (see the sketch after this list). So, performance-wise, are the other potential advantages worth the added overhead of being a kernel?
  • Since it's a kernel, it would live outside the core of JupyterLab? Does it defeat the purpose of having an IDE-like experience out of the box for JupyterLab?
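
To make the single-connection point concrete, here is a minimal sketch of the kind of envelope a frontend and server could agree on to multiplex several language servers over one WebSocket. Everything in it (the `language_server` field, the `wrap`/`route` helpers, the `servers` registry) is hypothetical rather than part of any existing jupyterlab-lsp API:

```python
import json

# Hypothetical envelope for carrying several language servers' JSON-RPC
# traffic over a single WebSocket. Nothing below is an existing API.

def wrap(language_server: str, lsp_message: dict) -> str:
    """Tag a raw LSP JSON-RPC message with the server it is destined for."""
    return json.dumps({"language_server": language_server, "message": lsp_message})

def route(raw: str, servers: dict) -> None:
    """Unwrap an envelope and forward the payload to the named server.

    `servers` maps a server id to anything with a `send(dict)` method
    (again, purely illustrative).
    """
    envelope = json.loads(raw)
    servers[envelope["language_server"]].send(envelope["message"])

# e.g. a didOpen notification destined for a Python language server:
wrap("pylsp", {"jsonrpc": "2.0", "method": "textDocument/didOpen", "params": {}})
```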


SylvainCorlay commented on August 21, 2024

I really see the kernel and LSP protocol as orthogonal:

  • We have always strived to make the kernel unaware of the "source" of execution requests. Exposing whether a request came from a console, a notebook, or somewhere else was always considered an abstraction leak. Kernels are mostly about execution, and not so much about source code.
  • LSP, on the other hand, is almost entirely about the source code of your workspace. Language servers exist completely independently of the ability to execute code (e.g., for compiled languages).

Hence, shoehorning the LSP protocol inside of the kernel protocol seems unintuitive, and it will be hard to reconcile the two, even from a UX perspective:

  • there can be many running Python kernels in one project, but it really makes sense to have a single language server running for Python
  • the special "LSP" kernels would appear in the list of running kernels even though they are not really about executing code. I would strongly prefer having another category in the side panel with the currently running language servers:

[screenshot: mock-up of a "Language Servers" category in the running sessions side panel]


bollwyvl commented on August 21, 2024

outside the core of JupyterLab? Does it defeat the purpose

A huge amount of the complexity in LSP is also in the language servers, much like the complexity of Jupyter kernel messaging is also in... the kernels. At this point, to parallel ipykernel, I can't even confidently name a Python language server I'd want everyone to have to use.

But, as I suggest, having in-browser language servers would allow us to ship no-foolin' features, without a new server dependency, for kernel-less things like Markdown, CSS, JSON Schema, etc. And if we just happened to get in-browser kernels, I wouldn't be sad either...

unaware of the "source" of execution requests

Be that as it may, the kernel still has knowledge of things very important to the user that might be outside the remit of the source document (e.g. dynamically defined or side-effect variables), and some of those have useful LSP features associated with them, much as was already demonstrated with DAP (JEP 47).

shoehorning the LSP protocol inside of the kernel protocol

As has been raised in a few places, continuing to evolve the existing Jupyter kernel message spec to "catch up" with what is already defined in LSP is a mug's game: if, instead, we embrace and extend LSP, we can get a lot of stuff for "free", and can define more of the integration on our own terms.

Being able to plug into existing LSP features in this way would require maybe two JEPs:

  • formalizing a list of well-known comm targets
    • widgets, bokeh, etc. would get grandfathered in
  • reserving a comm target for LSP and defining its schema (a rough sketch follows this list)
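
A rough kernel-side sketch of that second JEP, using ipykernel's documented comm API; the target name `jupyter.lsp` and the payload shape are placeholders that the JEP would have to pin down:

```python
# Sketch only: the "jupyter.lsp" target name and the payload schema are
# hypothetical; only the comm registration API itself (ipykernel) is real.
# This runs inside an IPython kernel, where get_ipython() is available.

def lsp_comm_target(comm, open_msg):
    """Handle the frontend opening the (hypothetical) LSP comm."""

    @comm.on_msg
    def _on_msg(msg):
        request = msg["content"]["data"]  # assumed to be an LSP JSON-RPC request
        # ...hand the request to an embedded or proxied language server...
        comm.send({"jsonrpc": "2.0", "id": request.get("id"), "result": None})

get_ipython().kernel.comm_manager.register_target("jupyter.lsp", lsp_comm_target)
```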

This sounds better than nickel-and-diming JEPs for each new field/message, which is no doubt what it would take. And encouraging comm implementations would open up more kernels to other schema-constrained, language-agnostic components... like Jupyter Widgets, bokeh documents, etc.

not really about executing code

I see interactive control of a language server as an extension of get_ipython().set_hook, etc. We don't even know what people could build on top of this. Over on jupyterlab-lsp, as soon as we support TCP servers, I'm for sure excited to try live-coding a language server with e.g. pygls. But I am already steeped in all this stuff. What would an even more interactive flavor of this be like?
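
As a taste of that, a minimal pygls server is only a few lines. This sketch assumes pygls 1.x (where the protocol types come from lsprotocol) and an arbitrary host/port:

```python
# Minimal pygls 1.x language server started over TCP, so a client
# (e.g. jupyterlab-lsp, once TCP servers are supported) could attach to it.
from lsprotocol.types import (
    TEXT_DOCUMENT_COMPLETION,
    CompletionItem,
    CompletionList,
    CompletionParams,
)
from pygls.server import LanguageServer

server = LanguageServer("live-coded-server", "v0.1")

@server.feature(TEXT_DOCUMENT_COMPLETION)
def completions(params: CompletionParams) -> CompletionList:
    """Offer one canned completion, just to prove the wiring works."""
    return CompletionList(
        is_incomplete=False,
        items=[CompletionItem(label="hello-from-a-live-coded-server")],
    )

if __name__ == "__main__":
    server.start_tcp("127.0.0.1", 8080)  # host/port are arbitrary here
```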

Not having to implement a JSON-RPC wire protocol from first principles is a big win, as we already have a life-cycled data object we can own. Indeed, this was what lintotype did: don't like your black line length? Move it with a slider.
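
For reference, the wire format a comm would let us skip is LSP's base-protocol framing (a Content-Length header, a blank line, then the JSON-RPC body); a sketch of just that framing, to show what we would not have to own:

```python
import json

def frame(message: dict) -> bytes:
    """LSP base-protocol framing: Content-Length header, CRLF CRLF, JSON body."""
    body = json.dumps(message).encode("utf-8")
    return b"Content-Length: " + str(len(body)).encode("ascii") + b"\r\n\r\n" + body

frame({"jsonrpc": "2.0", "method": "initialized", "params": {}})
```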

And if we did do that once, we could imagine going the other way and offering LSP+Jupyter with a single bridge in the opposite direction.

reconcile the two, even from a UX perspective:

Yep, there is certainly work to be done. We're already having to reconcile data from multiple sources on e.g. completion, and it's harsh. Basically every Jupyter document is polyglot on multiple axes (code/narrative, input/output, natural languages, semantic types), something a traditional source code document doesn't have to deal with.

But the high road is being able to bring in as many sources as a user wants, such that annotations of code, and eventually outputs, could come from:

  • the kernel
    • a vestigial language server inside the host kernel
  • a dedicated language server (whether inside a proxy kernel or not)
  • static LSIF data
  • software forge (gitlab/github) annotations (e.g. jupyterlab/pull-requests)

