
jupyter-dash's Introduction

NOTICE: as of Dash v2.11, Jupyter support is built into the main Dash package.

The jupyter-dash package is no longer necessary; all of its functionality has been merged into dash. See https://dash.plotly.com/dash-in-jupyter for usage details, and if you have any questions please join the discussion at https://community.plotly.com/

The old readme is below for those still using the package, but jupyter-dash will not receive any further updates.


Jupyter Dash

Binder

This library makes it easy to develop Plotly Dash apps interactively from within Jupyter environments (e.g. classic Notebook, JupyterLab, Visual Studio Code notebooks, nteract, PyCharm notebooks, etc.).

(Screenshot: JupyterDash running in JupyterLab.)

See notebooks/getting_started.ipynb for more information and example usage.
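For a quick feel of the workflow, here is a minimal sketch (the layout and figure are illustrative; it follows the same pattern as the getting started notebook):

import plotly.express as px
from jupyter_dash import JupyterDash
from dash import dcc, html

# Build a small app in a notebook cell
app = JupyterDash(__name__)
app.layout = html.Div([
    html.H1("JupyterDash Demo"),
    dcc.Graph(figure=px.scatter(px.data.tips(), x="total_bill", y="tip")),
])

# Display the app inline in the notebook output; mode can also be
# "external" (open in a browser tab) or "jupyterlab" (open in a JupyterLab tab).
app.run_server(mode="inline")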

Installation

You can install the JupyterDash Python package using pip:

$ pip install jupyter-dash

or conda:

$ conda install -c conda-forge -c plotly jupyter-dash

JupyterLab support

When used in JupyterLab, JupyterDash depends on the jupyterlab-dash JupyterLab extension, which requires JupyterLab version 2.0 or above.

This extension is included with the Python package, but in order to activate it JupyterLab must be rebuilt. JupyterLab should automatically produce a popup dialog asking for permission to rebuild, but the rebuild can also be performed manually from the command line using:

$ jupyter lab build

To check that the extension is installed properly, call jupyter labextension list.

Colab support

As of version 0.3.0, JupyterDash works in Colab with no additional configuration. Just install jupyter-dash using pip in a Colab notebook cell:

!pip install jupyter-dash
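After installation, the same pattern used in other notebook environments should work unchanged in a Colab cell; a minimal sketch (the layout is illustrative):

from jupyter_dash import JupyterDash
from dash import html

app = JupyterDash(__name__)
app.layout = html.Div("Hello from Colab")

# Embed the app in the Colab output cell.
app.run_server(mode="inline")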

Features

To learn more about the features of JupyterDash, check out the announcement post.

Development

To develop JupyterDash, first create and activate a virtual environment using virtualenv or conda.

Then clone the repository and change directory to the repository root:

$ git clone https://github.com/plotly/jupyter-dash.git
$ cd jupyter-dash

Then install the dependencies:

$ pip install -r requirements.txt -r requirements-dev.txt 

Then install the Python package in editable mode. Note: this will require nodejs to be installed.

$ pip install -e .

Then install the classic notebook extension in development mode:

$ jupyter nbextension install --sys-prefix --symlink --py jupyter_dash
$ jupyter nbextension enable --py jupyter_dash

Then install the JupyterLab extension in development mode:

$ jupyter labextension link extensions/jupyterlab

For release, build the JupyterLab extension to bundle with the Python package (see RELEASE.md for the full process):

$ python setup.py build_js

jupyter-dash's People

Contributors

aiqc, alexcjohnson, chriddyp, deepyaman, jbampton, jonmmease, renovate-bot, renovate[bot], t4rk1n


jupyter-dash's Issues

Modify host variable from 127.0.0.1

Hi,

I found this in the code:

    host = kwargs.get("host", os.getenv("HOST", "127.0.0.1"))
    port = kwargs.get("port", os.getenv("PORT", "8050"))

Is it possible to modify these variables to fit my own environment? Accessing the app via 127.0.0.1 does not work in my setup; the server would need to bind to 0.0.0.0.

Thanks
Petter
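For reference, both values can also be overridden per call, since run_server forwards host and port keyword arguments and these take precedence over the HOST/PORT environment variables read in the snippet above (a minimal sketch; bind to 0.0.0.0 only if the app is meant to be reachable from other machines):

from jupyter_dash import JupyterDash
from dash import html

app = JupyterDash(__name__)
app.layout = html.Div("Hello")

# Explicit kwargs win over os.getenv("HOST") / os.getenv("PORT") shown above.
app.run_server(mode="external", host="0.0.0.0", port=8050)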

The 'environ['werkzeug.server.shutdown']' function is deprecated and will be removed in Werkzeug 2.1.

Hi. First let me thank you for this amazing library! It's been really useful for me. I just wanted to let you know that I'm getting this warning when using jupyter_dash on Python 3.8. Not sure what it means.

/opt/conda/lib/python3.8/site-packages/jupyter_dash/jupyter_app.py:139: UserWarning:

The 'environ['werkzeug.server.shutdown']' function is deprecated and will be removed in Werkzeug 2.1.

Thanks!
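The warning appears to come from the internal server-shutdown call in jupyter_app.py; per the warning text itself, behavior only changes once Werkzeug 2.1 removes that function. If the message is noisy, one option is to filter just that warning (a generic Python sketch, not an official fix):

import warnings

# Ignore only the Werkzeug shutdown deprecation warning quoted above.
warnings.filterwarnings("ignore", message=".*werkzeug.server.shutdown.*")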

Unable to communicate with the jupyter_dash notebook or JupyterLab extension required to infer Jupyter configuration.

Hello. I have set up a JupyterLab instance used by many people in my firm, but when I install Dash for Jupyter it doesn't work. I have looked for similar cases on the web and found a few, but the solutions were either unresolved, resolved without an explanation, or in a different environment from mine.

I have installed jupyter-dash and jupyter-server-proxy; jupyter-dash installed jupyterlab-dash for me. They are installed through pip, if that is relevant.

The error occurs when JupyterDash.infer_jupyter_proxy_config() is called; for some reason it can't communicate with my jupyter-server-proxy or notebook, producing the error message at the bottom of this report.

JupyterLab v3.1.7
/opt/anaconda3/share/jupyter/labextensions
        nbdime-jupyterlab v2.1.0 enabled OK
        jupyterlab-topbar-extension v0.6.1 enabled OK (python, jupyterlab-topbar)
        jupyterlab-system-monitor v0.8.0 enabled OK (python, jupyterlab-system-monitor)
        jupyter-matplotlib v0.10.0 enabled OK
        jupyterlab-plotly v5.4.0 enabled OK
        @jupyter-widgets/jupyterlab-manager v3.0.0 enabled OK (python, jupyterlab_widgets)
        @jupyterlab/git v0.33.0 enabled OK (python, jupyterlab-git)
        @jupyterlab/server-proxy v3.1.0 enabled OK
        @jupyter-server/resource-usage v0.5.0 enabled OK (python, jupyter-resource-usage)
        @lckr/jupyterlab_variableinspector v3.0.9 enabled OK (python, lckr_jupyterlab_variableinspector)
        @krassowski/jupyterlab-lsp v3.9.0 enabled OK (python, jupyterlab-lsp)
        @telamonian/theme-darcula v3.1.1 enabled OK (python, theme-darcula)

Other labextensions (built into JupyterLab)
   app dir: /opt/anaconda3/share/jupyter/lab
        @techrah/text-shortcuts v1.0.3 enabled OK
        jupyterlab-chart-editor v4.14.3 enabled OK
        jupyterlab-dash v0.4.0 enabled OK
        jupyterlab-spreadsheet v0.4.1 enabled OK
        jupyterlab-theme-solarized-dark v2.0.1 enabled OK
        jupyterlab-topbar-text v0.6.1 enabled OK

Jupyter nbextension list:

Known nbextensions:
  config dir: /root/.jupyter/nbconfig
    notebook section
      jupyter_dash/main  enabled 
      - Validating: OK
  config dir: /opt/anaconda3/etc/jupyter/nbconfig
    notebook section
      bamboolib/extension  enabled 
      - Validating: OK
      jupyter-matplotlib/extension  enabled 
      - Validating: OK
      jupyter_dash/main  enabled 
      - Validating: OK
      jupyter_resource_usage/main  enabled 
      - Validating: OK
      jupyterlab-plotly/extension  enabled 
      - Validating: OK
      nbdime/index  enabled 
      - Validating: OK
      rise/main  enabled 
      - Validating: OK
      jupyter-js-widgets/extension  enabled 
      - Validating: OK
    tree section
      jupyter_server_proxy/tree  enabled 
      - Validating: OK
  config dir: /usr/local/etc/jupyter/nbconfig
    notebook section
      jupyterlab-plotly/extension  enabled 
      - Validating: OK

Test code:

import plotly.express as px
from jupyter_dash import JupyterDash
from dash import dcc, html
from dash.dependencies import Input, Output


JupyterDash.infer_jupyter_proxy_config()

# Load Data
df = px.data.tips()
# Build App
app = JupyterDash(__name__)
app.layout = html.Div([
    html.H1("JupyterDash Demo"),
    dcc.Graph(id='graph'),
    html.Label([
        "colorscale",
        dcc.Dropdown(
            id='colorscale-dropdown', clearable=False,
            value='plasma', options=[
                {'label': c, 'value': c}
                for c in px.colors.named_colorscales()
            ])
    ]),
])
# Define callback to update graph
@app.callback(
    Output('graph', 'figure'),
    [Input("colorscale-dropdown", "value")]
)
def update_figure(colorscale):
    return px.scatter(
        df, x="total_bill", y="tip", color="size",
        color_continuous_scale=colorscale,
        render_mode="webgl", title="Tips"
    )


if __name__ == '__main__':
    app.run_server(mode="inline")

Error:

`/opt/anaconda3/lib/python3.8/site-packages/jupyter_dash/comms.py:69: RuntimeWarning: coroutine 'Kernel.do_one_iteration' was never awaited
  kernel.do_one_iteration()
RuntimeWarning: Enable tracemalloc to get the object allocation traceback
---------------------------------------------------------------------------
OSError                                   Traceback (most recent call last)
/tmp/ipykernel_155541/584202169.py in <module>
      5 
      6 
----> 7 JupyterDash.infer_jupyter_proxy_config()
      8 
      9 # Load Data

/opt/anaconda3/lib/python3.8/site-packages/jupyter_dash/jupyter_app.py in infer_jupyter_proxy_config(cls)
     71         else:
     72             # Assume classic notebook or JupyterLab
---> 73             _request_jupyter_config()
     74 
     75     def __init__(self, name=None, server_url=None, **kwargs):

/opt/anaconda3/lib/python3.8/site-packages/jupyter_dash/comms.py in _request_jupyter_config(timeout)
     60         if (time.time() - t0) > timeout:
     61             # give up
---> 62             raise EnvironmentError(
     63                 "Unable to communicate with the jupyter_dash notebook or JupyterLab \n"
     64                 "extension required to infer Jupyter configuration."

OSError: Unable to communicate with the jupyter_dash notebook or JupyterLab 
extension required to infer Jupyter configuration.`

Deploying JupyterDash in Google Collab, callbacks not functioning correctly

!pip install jupyter-dash
!pip install dash_uploader

import uuid

from jupyter_dash import JupyterDash
from dash import dcc, html
from dash.dependencies import Input, Output, State
import dash_uploader as du

app = JupyterDash(__name__)

app.config.suppress_callback_exceptions = True

du.configure_upload(app, r"C:\tmp\Uploads")

app.layout = html.Div([
                       
  html.Div(
    dcc.Graph(id="output-data-upload",
    config = {'toImageButtonOptions':
    {'width': None,
    'height': None,
    'format': 'png',
    'filename': 'Image_Graph'}
    })
  ),

  html.Div(
      du.Upload(
          id='dash-uploader',
          text='Drag or Drop file here',
          text_completed='Completed: ',
          pause_button=True,
          cancel_button=True,
          max_file_size=1e+9,  # 1gb
          filetypes=["mp4","mkv","avi"],
          upload_id=uuid.uuid1()
      ),
    id="upload-container"),

  html.Div(
      html.Button('Continue', id='continue', n_clicks=0,style={"display":"None"})
  ,style={"margin":"auto","position":"absolute","right":10,"bottom":10})

])

@app.callback(
    [Output('output-data-upload', 'figure'), Output('continue', 'style'), Output('upload-container', 'style')],
    [Input('dash-uploader', 'isCompleted'), Input('continue', 'n_clicks')],
    [State('dash-uploader', 'fileNames'), State('dash-uploader', 'upload_id')]
)
def update_output(iscompleted, conbut, list_of_contents, upload_id):
    print(upload_id)

app.run_server(mode="external", port=port, debug=True)

I am running a Jupyter notebook file in Google Colab, and regardless of what I run inside the callback function, I get a callback error:
Callback error updating output-data-upload.figure, continue.style, upload-container.style
This leads me to believe the callback function is not registering properly. I have tried reducing the callback function's inputs, outputs, and states to a minimum, but the error persists. I can see my graph online, but once I upload something, I get that error.

I am unsure if this is a bug or an error on my part, but from what I have found online this functionality should be supported.
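One detail worth checking (an observation about the snippet above, not a confirmed diagnosis): the callback declares three Outputs but update_output returns nothing, and Dash expects one return value per declared Output, so even a working upload would likely produce a callback error here. A sketch of the expected shape, assuming the app object from the snippet above:

from dash.dependencies import Input, Output, State

@app.callback(
    [Output('output-data-upload', 'figure'),
     Output('continue', 'style'),
     Output('upload-container', 'style')],
    [Input('dash-uploader', 'isCompleted'),
     Input('continue', 'n_clicks')],
    [State('dash-uploader', 'fileNames'),
     State('dash-uploader', 'upload_id')]
)
def update_output(iscompleted, conbut, list_of_contents, upload_id):
    print(upload_id)
    # Return one value per Output, in order:
    # (figure, style for the continue button, style for the upload container).
    return {}, {"display": "block"}, {"display": "block"}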

Multiple instances of JupyterDash

Hello,
I'm trying to leverage Dash to create interactive plots of mathematical functions inside a Jupyter Notebook, so I've embedded the Dash app logic in MyClass. Whenever I need to plot a mathematical function, I create a Python function that updates the traces of the figure and pass it to the constructor of MyClass.

This is my code:

import numpy as np
import plotly.graph_objects as go
from jupyter_dash import JupyterDash
import dash_core_components as dcc
import dash_html_components as html
import dash_bootstrap_components as dbc
from dash.dependencies import Input, Output, State, ALL
from dash.development.base_component import Component
from datetime import datetime

class MyClass:

    def __init__(self, user_func):
        
        timestamp = int(datetime.now().timestamp() * 1000)
        self.unique_id = str(timestamp)

        self.fig = go.Figure()

        childrens = []
        childrens.append(
            dbc.Container([
                self._create_slider(0, "Par a", 1, 5, 2.5),
                self._create_slider(1, "Par b", 0, 1, 0.15),
            ])
        )

        # Add a Plotly Graph Object
        childrens.append(
            dcc.Graph(id=self.unique_id + "_my_plot", figure={})
        )
        
        # create the Dash app applying the specified CSS and JS
        self.app = JupyterDash(
            __name__,
            external_stylesheets=[dbc.themes.SLATE],
        )
        # add the controls
        self.app.layout = html.Div(id=self.unique_id, children=childrens)

        # Create the callback, specifying the input/output controls
        @self.app.callback(
            [Output(self.unique_id + "_my_plot", "figure"),
             *[Output(self.unique_id + '_value_slider_{}'.format(j), 'children') for j in range(2)]],
            [Input({'type': self.unique_id + "_", 'index': ALL}, 'value')]
        )
        def func(values):
            user_func(self.fig, values)
            return [self.fig, *values]
        
        # execute the app in the Jupyter environment
#         self.app.run_server()
        self.app.run_server(mode="inline")

    def _create_slider(self, i, _name, _min, _max, _val):
        return dbc.Row(
                children = [
                    # this shows the label
                    dbc.Col(html.Div(_name), 
                            width=1, style = {"text-align": "right"}),
                    # this will show the actual value of the slider
                    dbc.Col(html.Div(id=self.unique_id + "_value_slider_" + str(i), children={}), 
                            width=1, style = {"text-align": "left"}),
                    dbc.Col(dcc.Slider(
                        id = {
                            "type": self.unique_id + "_",
                            "index": i
                        },
                        min = _min,
                        max = _max,
                        step = (_max - _min) / 50,
                        value = _val,
                    )),
                ]
            )

Now, let's suppose I want to plot two different mathematical functions. I create the first instance, and everything works fine in the first app:

def my_sin_func(fig, values):
    a, b = values
    fig.data = []
    xx = np.linspace(0, 20, 200)
    yy = a * np.sin(xx) * np.exp(-xx * b)
    fig.add_trace(
        go.Scatter(x=xx, y=yy, mode="lines")
    )
    fig.add_trace(
        go.Scatter(x=xx, y=np.sin(xx), mode="lines")
    )

a = MyClass(my_sin_func)

Then I create the second instance, and everything works fine in the second app:

def my_cos_func(fig, values):
    a, b = values
    fig.data = []
    xx = np.linspace(0, 20, 200)
    yy = a * np.cos(xx) * np.exp(-xx * b)
    fig.add_trace(
        go.Scatter(x=xx, y=yy, mode="lines")
    )
    fig.add_trace(
        go.Scatter(x=xx, y=np.cos(xx), mode="lines")
    )

b = MyClass(my_cos_func)

But as soon as I move back to the first app and change a slider, an error happens in the second app: KeyError: '..1603462668915_my_plot.figure...1603462668915_value_slider_0.children...1603462668915_value_slider_1.children..'

As I understand it, JupyterDash runs both apps on the same server, http://127.0.0.1:8050/. I tried to change the server_url option to use a different port, but the server always starts at http://127.0.0.1:8050/. What can I do to have two instances working in parallel?
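One workaround sketch (an assumption, following the "Try passing a different port to run_server" hint that jupyter-dash itself prints in other situations): give each instance its own port so the two underlying servers do not collide. In MyClass this would mean accepting a port and forwarding it to self.app.run_server(...). Reduced to its essentials:

from jupyter_dash import JupyterDash
from dash import html

# Two independent apps, each bound to a different port.
app1 = JupyterDash(__name__)
app1.layout = html.Div("first app")
app1.run_server(mode="inline", port=8050)

app2 = JupyterDash(__name__)
app2.layout = html.Div("second app")
app2.run_server(mode="inline", port=8051)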

Cannot load CSS style sheets

By default, Dash should use CSS and JS from the assets folder, but I cannot get it to work in Jupyter-Dash.

It works for external stylesheets, so I'm not sure why it doesn't work for local CSS.

I've tried setting the assets path manually, but it did not work.
app = JupyterDash(__name__, assets_url_path='assets')

Also, this is my first time using Dash, so perhaps I am just missing something.

Thank you
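For what it is worth, Dash looks for local CSS/JS in an assets folder resolved relative to the app, and in a notebook that location can be ambiguous. One thing to try (a sketch, assuming a ./assets/style.css next to the notebook; assets_folder is a standard Dash constructor argument passed through by JupyterDash) is to point at the folder explicitly:

import os
from jupyter_dash import JupyterDash
from dash import html

# Point Dash at the assets directory next to the notebook.
app = JupyterDash(__name__, assets_folder=os.path.join(os.getcwd(), "assets"))
app.layout = html.Div("Styled by ./assets/style.css", className="my-style")
app.run_server(mode="inline")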

jupyter-dash in a container

Hi,

I'm trying to prototype a plain-vanilla frontend in Dash for testing my model scoring in a jupyter docker container.

Below is my code:


from jupyter_dash import JupyterDash
import dash
import dash_core_components as component
import dash_html_components as html
app = JupyterDash(__name__)
app.layout = html.Div(children=[
    html.H1(children='Shopping App'),
    html.Div(children= 'A Scoring Interactive Web Service for testing')
])

app.run_server(mode="jupyterlab", host='172.23.0.6', port='8050', debug=True)

Everything starts fine but I get a page with this error

172.23.0.6 took too long to respond.

Any suggestion?

Thanks
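One thing to check (an assumption, not a confirmed fix): inside a container the server usually needs to bind to 0.0.0.0 so that it is reachable through the container's published port, and the page is then opened via the host-mapped address rather than the container IP. A minimal sketch:

from jupyter_dash import JupyterDash
from dash import html

app = JupyterDash(__name__)
app.layout = html.Div("Hello from the container")

# Bind to all interfaces inside the container; access the app from the host
# through whatever port mapping the container was started with.
app.run_server(mode="external", host="0.0.0.0", port=8050)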

call-backs from Dash-plotly to JupyterLab

(Screenshot attached.)
Hi,
I am trying to incorporate ArcGIS WebScenes into my Dash-Plotly workflow.
I think I have two main options:

  1. Use an iframe, in which case (as far as I know) I lose the callback. It looks great, but I am very limited in what I can modify and in callbacks.
  2. Incorporate Dash-Plotly callbacks into JupyterLab, as in the example in the attached image. I was able, one time, to get a callback from the map to the output view on the left side.

Obviously I prefer the Dash-Plotly to JupyterLab callback option.
I have much more control modifying the webscene in the JupyterLab embedded cell (moving the camera, changing the visualization). Even though I can modify the webscene and update the iframe, its performance is inferior to that of an embedded webscene as in JupyterLab.

In short, I am looking to send a callback from Dash-Plotly back to JupyterLab, and I want it to work more than once (it currently works only once). I tried both the 'inline' and 'jupyterlab' options.

Thanks in advance.

Prebuilt JupyterLab 3.0 extension ?

As described in the Jupyter 3.0 blog post:

JupyterLab extensions can now be distributed as prebuilt extensions, which do not require a user to rebuild JupyterLab or have Node.js installed. Prebuilt extensions can be distributed as Python packages using familiar package managers such as pip...

https://blog.jupyter.org/jupyterlab-3-0-is-out-4f58385e25bb

Are there plans to prebuild the jupyterlab-dash JupyterLab extension?

In my experience, it requires a lot of troubleshooting. This could help reduce friction.

Switching kernel breaks the extensions

The JupyterLab extension registers the comm target at notebook creation time (see

notebooks.widgetAdded.connect((sender, nbPanel: NotebookPanel) => {

in the extension source). This means that changing the kernel will not work: in practice, all calls to JupyterDash.infer_jupyter_proxy_config() will fail after the kernel is changed (for a good reason).

A potential solution would be to listen to the ISessionContext.kernelChanged signal and register the comm target against the updated kernel.

I have only tested it with Jupyterlab but I suspect the nbextension has the same problem.

Export to HTML

Hi, I've seen that this tool does not have an option to export a report as HTML, PDF, or another format. Have you considered this feature for Dash?

FileNotFoundError for conda skeleton

I'm trying to build a conda-forge package via conda skeleton pypi jupyter_dash, but it returns the following error:

Traceback (most recent call last):
  File "setup.py", line 108, in <module>
    with open(os.path.join(here, 'requirements.txt')) as f:
FileNotFoundError: [Errno 2] No such file or directory: '/tmp/tmp4lephpktconda_skeleton_jupyter-dash-0.2.1.tar.gz/jupyter-dash-0.2.1/requirements.txt'
$PYTHONPATH = /tmp/tmp4lephpktconda_skeleton_jupyter-dash-0.2.1.tar.gz/jupyter-dash-0.2.1

Cannot stop server in JupyterLab using CTRL+C

I'm running the following code:

Cell [2]

app = JupyterDash(__name__)
app.layout = html.Div('Minimal')

app.run_server(mode="external", debug=False)

Cell [2] Output


 * Running on http://127.0.0.1:8050/ (Press CTRL+C to quit)
127.0.0.1 - - [10/Jun/2020 14:17:49] "GET /_alive_a6195403-4d44-472b-8ea6-99427edf92c3 HTTP/1.1" 200 -
Dash app running on http://127.0.0.1:8050/
127.0.0.1 - - [10/Jun/2020 14:18:07] "GET / HTTP/1.1" 200 -
127.0.0.1 - - [10/Jun/2020 14:18:08] "GET /_dash-dependencies HTTP/1.1" 200 -
127.0.0.1 - - [10/Jun/2020 14:18:08] "GET /_dash-layout HTTP/1.1" 200 -

I cannot stop the server using (Press CTRL+C to quit) as indicated in the output log. This message is probably only applicable to execution in the terminal? Is there an alternate way of stopping the server in JupyterLab?

Environment setup:

  • conda install
  • Windows 10
  • JupyterLab 2.1
  • localhost

StreamHandler added every time to logger on JupyterDash()/Dash()

If you run a code block like the one below multiple times in JupyterLab, you will see the log message printed once on your first run, twice on your second run, and so on. Eventually the message can print many times, depending on how many times you have created the JupyterDash app.

app = JupyterDash(__name__)
app.logger.disabled = False
app.logger.info("some message")

I believe it's actually an issue in the Dash class: inspecting app.logger.handlers shows that every time Dash initializes, one more StreamHandler is added. But it is most noticeable with JupyterDash, since in JupyterLab a person may create the Dash app object many times. I will also create a ticket in the Dash repo.
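Until this is fixed upstream, a workaround sketch (an assumption: app.logger is a standard logging.Logger, so the duplicated handlers can simply be dropped):

from jupyter_dash import JupyterDash

app = JupyterDash(__name__)

# Keep at most one handler, so repeated constructions in the same kernel
# do not multiply the output of each log record.
del app.logger.handlers[1:]

app.logger.disabled = False
app.logger.info("some message")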

Google Colab 403 Unauthorized response

Rather than provide my own example, here's a broken example available from the dash-sample-apps repository.

https://github.com/plotly/dash-sample-apps/tree/master/apps/dash-image-enhancing

The prototypical example in the docs also fails.

Searching online, I found some reports of Google putting new restrictions in place on Colab compute instances.

(Screenshot of the 403 response attached.)

!pip list
...
jupyter                       1.0.0          
jupyter-client                5.3.5          
jupyter-console               5.2.0          
jupyter-core                  4.7.0          
jupyter-dash                  0.3.1
..

url_base_pathname broken

The vanilla Dash class can take url_base_pathname and set both requests_pathname_prefix and routes_pathname_prefix [1][2]. The JupyterDash class, however, behaves differently: it sets its own requests_pathname_prefix attribute and defaults it to "/" regardless of whether the user specifies url_base_pathname. So passing url_base_pathname by itself as an init parameter results in a broken JupyterDash app.

[1] https://github.com/plotly/dash/blob/e8ac94919105a91c76a966c21aca2ec7b0297e22/dash/_configs.py#L97
[2] https://github.com/plotly/dash/blob/e8ac94919105a91c76a966c21aca2ec7b0297e22/dash/_configs.py#L111
[3]

requests_pathname_prefix = '/'
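For comparison, this is the vanilla Dash behavior the report refers to (a sketch; the pathname is illustrative): url_base_pathname populates both derived prefixes, which is what the JupyterDash override of requests_pathname_prefix breaks.

import dash

app = dash.Dash(__name__, url_base_pathname="/myapp/")

# In plain Dash both prefixes are derived from url_base_pathname:
print(app.config.requests_pathname_prefix)  # "/myapp/"
print(app.config.routes_pathname_prefix)    # "/myapp/"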

Unable to run Jupyter-dash with JupyterHub | Unable to communicate with the jupyter_dash notebook or JupyterLab

When I try to run jupyter-dash in jupyterhub I get the following error:

from jupyter_dash import JupyterDash
JupyterDash.infer_jupyter_proxy_config()

OSError Traceback (most recent call last)
in
----> 1 JupyterDash.infer_jupyter_proxy_config()

/opt/conda/lib/python3.7/site-packages/jupyter_dash/jupyter_app.py in infer_jupyter_proxy_config(cls)
71 else:
72 # Assume classic notebook or JupyterLab
---> 73 _request_jupyter_config()
74
75 def init(self, name=None, server_url=None, **kwargs):

/opt/conda/lib/python3.7/site-packages/jupyter_dash/comms.py in _request_jupyter_config(timeout)
61 # give up
62 raise EnvironmentError(
---> 63 "Unable to communicate with the jupyter_dash notebook or JupyterLab \n"
64 "extension required to infer Jupyter configuration."
65 )

OSError: Unable to communicate with the jupyter_dash notebook or JupyterLab
extension required to infer Jupyter configuration.

The version of JupyterLab is v2.2.9 and I am using jupyterlab-dash v0.3.0.

When I try to use the archived version (jupyterlab_dash) it works fine, example:

import jupyterlab_dash
import dash
import dash_html_components as html

viewer = jupyterlab_dash.AppViewer()

app = dash.Dash(__name__)

app.layout = html.Div('Hello World')

viewer.show(app)


Given the error above when using the new jupyter_dash version, if I run app.run_server("jupyterlab") nothing shows.

jupyter-dash R library

Hello,

Is there a way to create an R library so that I could run r-Dash in jupyter lab?

Thanks!

Elze

Unable to run jupyter dash in Google Colab

I would like to run jupyter-dash in Google Colab. However, it is not possible to do so based on the example notebook.

Running the notebook, I get a blank iframe (screenshot attached).

When I followed the steps for embedding an iframe of a Flask app in a Colab notebook, described at https://stackoverflow.com/questions/54465816/how-to-use-flask-in-google-colaboratory-python-notebook, I got empty charts (screenshot attached).

If I rerun the app without mode="inline", I get the following error messages (screenshot attached).

import IPython.display

def display_iframe(port, height):
    shell = """
        (async () => {
            const url = await google.colab.kernel.proxyPort(%PORT%, {"cache": true});
            console.log(`Adding ifram from URL:${url}`);
            const iframe = document.createElement('iframe');
            iframe.src = url;
            iframe.setAttribute('width', '100%');
            iframe.setAttribute('height', '%HEIGHT%');
            iframe.setAttribute('frameborder', 0);
            document.body.appendChild(iframe);
        })();
    """
    replacements = [
        ("%PORT%", "%d" % port),
        ("%HEIGHT%", "%d" % height),
    ]
    for (k, v) in replacements:
        shell = shell.replace(k, v)

    script = IPython.display.Javascript(shell)
    IPython.display.display(script)

Run jupyter-dash with Jupyterlab2.x + JupyterHub + Kubernetes

Regarding #2, not sure which server proxy it is referring to. Can you point me to the link to the documentation?

Hi @rdelubi,
This issue is specifically concerning Colab, because Colab isn't a standard Jupyter front-end. For regular JupyterHub, it should be enough to:

  1. Make sure the jupyterlab-dash extension is built (https://github.com/plotly/jupyter-dash#jupyterlab-support)
  2. Make sure jupyter_server_proxy is installed in the environment running the Jupyter server.
  3. Call JupyterDash.infer_jupyter_proxy_config() at the top of the notebook. This uses the JupyterLab extension to detect the URL that JupyterLab is being accessed at (This is what server_url is), passes that info back to the Python library for use in configuring Dash.

If that doesn't work for you, could you open a new issue? Thanks!

Originally posted by @jonmmease in #10 (comment)

Can't run JupyterDash or Plotly from JupyterLab (3.0.10) with a docker container.

When I tried to use JupyterDash from the JupyterLab (3.0.10) docker container, I got the error that the webserver is not responding:

This page isn't working
127.0.0.1 didn't send any data.
ERR_EMPTY_RESPONSE

My docker file:

FROM continuumio/miniconda3:latest   

RUN apt update && apt upgrade -y
RUN apt install curl -y

RUN curl -fsSL https://deb.nodesource.com/setup_15.x | bash -
RUN apt-get install -y nodejs

RUN conda update conda
RUN conda install -c conda-forge jupyterlab
RUN conda install -c conda-forge -c plotly jupyter-dash
RUN jupyter lab build

RUN mkdir /opt/notebooks
WORKDIR  /opt/notebooks

EXPOSE 8888 8050

CMD [ "jupyter", "lab", "--notebook-dir=/opt/notebooks", "--ip='*'", "--port=8888", "--no-browser", "--allow-root" ]

Docker build command:

docker build -t demo/dash .

Docker run command:

docker run -p 8888:8888 -p 8050:8050 -v c:\\docker\\data:/opt/notebooks demo/dash

I got the error described above, trying to run getting_started.ipynb from this repo.

Should I call JupyterDash.infer_jupyter_proxy_config() when I run notebook in container?

Another issue with this setup: if I try to use Plotly, it plots nothing; no error, just a blank area after running the cell.
My code snippets:

import plotly.graph_objects as go
fig = go.Figure(data=go.Bar(y=[2, 3, 1]))
fig.show()

or

import plotly.express as px
df = px.data.iris()
fig = px.scatter(df, x="sepal_width", y="sepal_length", color="species")
fig.show()

OSError: Address "" already in use.

Hi all!

I'm running into an error when trying to run app.run_server(mode='external')

Even if I kill all ports, and even if I change the port to a random number (for instance: app.run_server(mode='external', port=2000)), the error still persists. I don't know how to resolve this. Help would be appreciated! Thanks!

Full error:

---------------------------------------------------------------------------
OSError                                   Traceback (most recent call last)
<ipython-input-5-0a3a9f2580ba> in <module>
----> 1 app.run_server(mode='external', port=2000)

/opt/anaconda3/lib/python3.7/site-packages/jupyter_dash/jupyter_app.py in run_server(self, mode, width, height, inline_exceptions, **kwargs)
    317                 )
    318 
--> 319         wait_for_app()
    320 
    321         if JupyterDash._in_colab:

/opt/anaconda3/lib/python3.7/site-packages/retrying.py in wrapped_f(*args, **kw)
     47             @six.wraps(f)
     48             def wrapped_f(*args, **kw):
---> 49                 return Retrying(*dargs, **dkw).call(f, *args, **kw)
     50 
     51             return wrapped_f

/opt/anaconda3/lib/python3.7/site-packages/retrying.py in call(self, fn, *args, **kwargs)
    210                 if not self._wrap_exception and attempt.has_exception:
    211                     # get() on an attempt with an exception should cause it to be raised, but raise just in case
--> 212                     raise attempt.get()
    213                 else:
    214                     raise RetryError(attempt)

/opt/anaconda3/lib/python3.7/site-packages/retrying.py in get(self, wrap_exception)
    245                 raise RetryError(self)
    246             else:
--> 247                 six.reraise(self.value[0], self.value[1], self.value[2])
    248         else:
    249             return self.value

/opt/anaconda3/lib/python3.7/site-packages/six.py in reraise(tp, value, tb)
    701             if value.__traceback__ is not tb:
    702                 raise value.with_traceback(tb)
--> 703             raise value
    704         finally:
    705             value = None

/opt/anaconda3/lib/python3.7/site-packages/retrying.py in call(self, fn, *args, **kwargs)
    198         while True:
    199             try:
--> 200                 attempt = Attempt(fn(*args, **kwargs), attempt_number, False)
    201             except:
    202                 tb = sys.exc_info()

/opt/anaconda3/lib/python3.7/site-packages/jupyter_dash/jupyter_app.py in wait_for_app()
    313                     "Address '{url}' already in use.\n"
    314                     "    Try passing a different port to run_server.".format(
--> 315                         url=url
    316                     )
    317                 )

OSError: Address 'http://127.0.0.1:2000' already in use.
    Try passing a different port to run_server.

Unable to use infer_jupyter_proxy_config with jupyterhub

Hi!

I'm having trouble getting jupyter-dash to work with jupyterhub.

The base environment that jupyterhub is running in has jupyter_server_proxy enabled (shows up when the jupyter serverextensions are listed). The kernel has jupyter-dash installed. I'm primarily interested in getting it to work with classic notebooks, but I tried it in lab and it didn't work either (same error).

Running:
JupyterDash.infer_jupyter_proxy_config()
results in the following error:

OSError: Unable to communicate with the jupyter_dash notebook or JupyterLab 
extension required to infer Jupyter configuration.

I tried adding the

from jupyter_dash.comms import _send_jupyter_config_comm_request
_send_jupyter_config_comm_request()

from #17, but no luck there.

Any advice for how to debug further?

Cannot render app when in remote Jupyter Lab

I am running JupyterLab on a remote server, and accessing it via the browser on my machine. When I try to run a JupyterDash app, I get the following error:

OSError: Address 'http://127.0.0.1:8050' already in use.
    Try passing a different port to run_server.

I tried passing different ports (unused ones) and still could not get it to run in inline or external modes.

app.run_server(mode="jupyterlab") does not open new tab for me

Hi,

First of all, thank you so much for making this great feature available.

I tried to follow your demo* from end to end, but app.run_server(mode="jupyterlab") does not open a new tab for me, even though the Jupyter cells ran through.

May I know how to solve this?

*https://medium.com/plotly/introducing-jupyterdash-811f1f57c02e

-------------- jupyter version --------------
jupyter core : 4.5.0
jupyter-notebook : 6.0.1
qtconsole : 4.5.5
ipython : 7.8.0
ipykernel : 5.1.2
jupyter client : 5.3.3
jupyter lab : 2.2.9
nbconvert : 5.6.0
ipywidgets : 7.5.1
nbformat : 4.4.0
traitlets : 4.3.3

Thank you for your time in advance,
Bill

Support dev tools stack traces

Stack-traces in the dev UI don't work in the notebook when there is no source code file to pull from. Look into providing this information from the notebook.

Server starting twice on duplicate element

Hello, if in the layout an element is present twice with the same id, then app.run_server() will fail saying that someone else is already using the port. The error message seems a bit misleading and I was wondering if this is a known issue.

To replicate, just use the getting started notebook and copy the scatter plot in another div so that you will have two

dcc.Graph(
            id='crossfilter-indicator-scatter',
            hoverData={'points': [{'customdata': 'Japan'}]}
        )

404 error running in JupyterLab + JupyterHub

Hi @jonmmease, I've been trying out jupyter-dash in JupyterLab 2.1.1, many thanks for the great work here! I'm currently stuck at a 404 error trying to open the dash app. Additionally I ran into an issue with infer_jupyter_proxy_config() that I seem to have worked around successfully.


I followed the example getting_started notebook, but ran into an issue running JupyterDash.infer_jupyter_proxy_config(), where cell execution continues indefinitely (and the kernel crashes if I interrupt it). Somehow this is resolved by running the following in a separate cell just before infer_jupyter_proxy_config():

from jupyter_dash.comms import _send_jupyter_config_comm_request
_send_jupyter_config_comm_request()

The top of my notebook looks like this (cells run in order, with a restarted kernel; screenshot attached).

The current issue I'm running into, though, is that the dash URL returns a 404 (screenshot attached). I also get the same 404 with mode='external' or 'jupyterlab'.

We're running JupyterLab 2.1.1 with JupyterHub 1.1 on a k8s deployment. Our jupyter environment as follows:

pip list | grep jupyter
jupyter-client         6.1.3
jupyter-console        6.1.0
jupyter-core           4.6.3
jupyter-dash           0.2.1.post1
jupyter-server-proxy   1.5.0
jupyter-telemetry      0.0.5
jupyterhub             1.1.0
jupyterlab             2.1.1
jupyterlab-server      1.1.3
jupyterlab-templates   0.2.3

jupyter labextension list
JupyterLab v2.1.1
Known labextensions:
   app dir: /opt/conda/share/jupyter/lab
        @jupyter-widgets/jupyterlab-manager v2.0.0  enabled  OK
        @jupyterlab/geojson-extension v2.0.1  enabled  OK
        @jupyterlab/toc v3.0.0  enabled  OK
        jupyter-leaflet v0.12.4  enabled  OK
        jupyterlab-dash v0.2.0  enabled  OK
        jupyterlab-plotly v4.7.1  enabled  OK
        jupyterlab_templates v0.2.3  enabled  OK
        plotlywidget v4.7.1  enabled  OK

Following your suggested diagnosis checks from #14 :

  • Able to import jupyter_server_proxy successfully
  • jupyter_dash.comms._jupyter_config returns the following:
{'type': 'base_url_response',
 'server_url': 'http://go-orbitalinsight.com',
 'base_subpath': '/user/[email protected]/',
 'frontend': 'jupyterlab'}

The URL printed when I run_server(mode='external') is as follows, which looks right:
http://go-orbitalinsight.com/user/[email protected]/proxy/8050/

Would be grateful for any suggestions on resolving!

Not working properly when executing as python script on Jupyter terminal

Hi, thanks for the great work. It works perfectly when executed in a JupyterLab notebook. However, when I tried to execute it as a Python script from the JupyterLab terminal, the Dash app page deployed on 'http://localhost:8888/proxy/8050' was stuck on 'Loading...' (screenshot attached).

I know this is an issue between Dash and jupyter-server-proxy, not related to jupyter-dash. But it seems like it would be a small fix to make the jupyter-dash magic work with a Python script as well? This would be very useful when we want to use Jupyter Server Proxy to run a Dash app alongside our notebooks and provide authenticated web access to them with JupyterHub: https://jupyter-server-proxy.readthedocs.io/en/latest/server-process.html

Thanks

AWS SageMaker

Has anyone been able to get Dash working in an AWS SageMaker Jupyter notebook or lab environment?
I've been unsuccessful with all three modes of inline, external, jupyterlab.

To access a basic Flask app, the web address in SageMaker is ...aws/proxy/<port>/ which I think may be causing a compatibility issue with the host/port settings of Dash.

The closest I was able to get was with the external mode, where the sample got stuck displaying "Loading..."

I also created an issue on SM repo: aws/amazon-sagemaker-examples#1595

HOST env var is clobbered by conda and prevents using run_server without overiding host explicitly

I noticed a peculiar issue when trying to use jupyter-dash: it would try to connect to x86_64-conda_cos6-linux-gnu when executing app.run_server.

It seems that when a conda environment is activated, it modifies the HOST variable (a very strange idea, but one that has been noted and not fixed for a number of years).

As I imagine jupyter-dash is used very often with conda, could I suggest using HOSTNAME instead of HOST (if it exists), since HOSTNAME is not clobbered by conda activation? Or is there any reason that host shouldn't just default to 127.0.0.1 unless explicitly set, given how the plugin operates?
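Until that changes, two workarounds seem possible (sketches based on the behavior described above, since an explicit keyword argument takes precedence over the environment variable): remove the conda-set HOST variable for the process, or always pass host explicitly.

import os
from jupyter_dash import JupyterDash
from dash import html

# Option 1: drop the HOST variable that conda activation sets.
os.environ.pop("HOST", None)

app = JupyterDash(__name__)
app.layout = html.Div("Hello")

# Option 2: pass host explicitly so the environment variable is never consulted.
app.run_server(mode="external", host="127.0.0.1", port=8050)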

Using on SageMaker Studio

I'd like to use this on AWS Sagemaker Studio, but I think Sagemaker Studio is JupyterLab 1.

Has anyone been able to get either the inline or jupyterlab modes working on Sagemaker Studio?

ConnectionError: HTTPConnectionPool: Max retries exceeded

ConnectionError: HTTPConnectionPool(host='x86_64-conda_cos6-linux-gnu', port=8050): Max retries exceeded with url: /_alive_a3a79a8c-7956-4602-96cc-f8f88ec33619 (Caused by NewConnectionError('<urllib3.connection.HTTPConnection object at 0x7f4dd6f53a58>: Failed to establish a new connection: [Errno -2] Name or service not known'))

I'm trying to run the demo code as shown here

can't run JupyterDash.infer_jupyter_proxy_config() in Jupyterhub

I am not able to run JupyterDash.infer_jupyter_proxy_config() in JupyterHub. There is no error message; the cell remains in a busy state without any output. Do you have any idea?

  • app.run_server() returns 127.0.0.1:8005 which will not work from outside
  • app.run_server("mode=jupyterlab") also won't work
  • I also see in the browser console: blocked mixed content: the page was loaded over HTTPS, but requested an insecure frame 'http://ip:8050/'. This request has been blocked; the content must be served over HTTPS.

The only thing that works is opening ip:8050 in another tab, where I can see the dashboard.

Dependency Dashboard

This issue lists Renovate updates and detected dependencies. Read the Dependency Dashboard docs to learn more.

Open

These updates have all been created already. Click a checkbox below to force a retry/rebase of any.

Ignored or Blocked

These are blocked by an existing closed PR and will not be recreated unless you click a checkbox below.

Detected dependencies

npm
extensions/jupyterlab/package.json
  • @jupyterlab/application ^2.0.0 || ^3.0.0
  • @jupyterlab/notebook ^2.0.0 || ^3.0.0
  • @jupyterlab/console ^2.0.0 || ^3.0.0
  • prettier 2.0.5
  • rimraf 3.0.2
  • typescript 3.9.3
jupyter_dash/labextension/package.json
  • @jupyterlab/application ^2.0.0 || ^3.0.0
  • @jupyterlab/notebook ^2.0.0 || ^3.0.0
  • @jupyterlab/console ^2.0.0 || ^3.0.0
  • prettier 2.0.5
  • rimraf 3.0.2
  • typescript 3.9.3
pip_requirements
binder/requirements.txt
  • notebook ==6.0.3
requirements-dev.txt
  • jupyterlab >=2.0
  • notebook >=6.0
requirements.txt

  • Check this box to trigger a request for Renovate to run again on this repository

Show output in full height without scroll bar

Hi there,

I am facing an issue where the JupyterDash output cell will have the vertical scroll bar whenever the output is too long.

Is there a way for me to code it such that the output cell will always be at maximum height according to what graphs are generated, so that my users won't need to scroll through two vertical bars?

I have attached a screenshot here to show the two vertical bars I am referring to.

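One knob that may help (an assumption based on run_server's signature, which takes width and height parameters for the embedded iframe, as visible in tracebacks elsewhere on this page): pass a larger height so the inline output matches the rendered content.

from jupyter_dash import JupyterDash
from dash import html

app = JupyterDash(__name__)
app.layout = html.Div("tall content here")

# Height of the inline iframe (assumed to be in pixels); tune it to fit the graphs.
app.run_server(mode="inline", height=1000)

Whether this removes the inner scroll bar entirely depends on how tall the generated graphs are.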

infer_jupyter_proxy_config() misbehaviour in Binder notebook with "Run all cells"

I am trying to run a Dash web app in a Notebook in BinderHub.

If I execute cells one by one (by repeatedly hitting Shift-Enter) from top to bottom, everything works (apart from a minor quirk of skipping a number when executing the cell containing the call to JupyterDash.infer_jupyter_proxy_config()). The app is eventually displayed and everything works.

However, if I execute the notebook with "Run all cells", the cell containing the call to JupyterDash.infer_jupyter_proxy_config() appears to be skipped only to be executed after all the other cells have executed.
This is reflected in its sequence number and in the fact that any other code in the same cell seems not to have taken effect.

I am also trying to use Voila, so that only the dashboard is shown.
When I run the notebook under Voila, the execution appears to stop at the cell containing JupyterDash.infer_jupyter_proxy_config().

I see that in _request_jupyter_config() messages are exchanged with the IPython kernel to glean the information, but I am not competent enough to see the cause of the misbehaviour.
I was wondering if one could not get the same info in a safer (?) way from notebook.notebookapp.list_running_servers.

Server does not start in JupyterLab when debug=True

Server does not start if debug=True in JupyterLab.
This minimal code does run when:

  • debug=False in JupyterLab, or
  • debug=True if I export it as an executable script and run it in a terminal window.

Environment setup:

  • conda install
  • Windows 10
  • JupyterLab 2.1
  • localhost

I'm running the following code:

Cell [2]:

app = JupyterDash(__name__)
app.layout = html.Div('Minimal')

app.run_server(mode="external", debug=True)

Cell [2] Output:

The lengthy error log:
Exception in thread Thread-28:
Traceback (most recent call last):
  File "C:\Users\spikes\Miniconda3\envs\jlab2\lib\threading.py", line 926, in _bootstrap_inner
    self.run()
  File "C:\Users\spikes\Miniconda3\envs\jlab2\lib\threading.py", line 870, in run
    self._target(*self._args, **self._kwargs)
  File "C:\Users\spikes\Miniconda3\envs\jlab2\lib\site-packages\retrying.py", line 49, in wrapped_f
    return Retrying(*dargs, **dkw).call(f, *args, **kw)
  File "C:\Users\spikes\Miniconda3\envs\jlab2\lib\site-packages\retrying.py", line 212, in call
    raise attempt.get()
  File "C:\Users\spikes\Miniconda3\envs\jlab2\lib\site-packages\retrying.py", line 247, in get
    six.reraise(self.value[0], self.value[1], self.value[2])
  File "C:\Users\spikes\AppData\Roaming\Python\Python37\site-packages\six.py", line 693, in reraise
    raise value
  File "C:\Users\spikes\Miniconda3\envs\jlab2\lib\site-packages\retrying.py", line 200, in call
    attempt = Attempt(fn(*args, **kwargs), attempt_number, False)
  File "C:\Users\spikes\Miniconda3\envs\jlab2\lib\site-packages\jupyter_dash\jupyter_app.py", line 264, in run
    super_run_server(**kwargs)
  File "C:\Users\spikes\Miniconda3\envs\jlab2\lib\site-packages\dash\dash.py", line 1411, in run_server
    **flask_run_options)
  File "C:\Users\spikes\Miniconda3\envs\jlab2\lib\site-packages\flask\app.py", line 990, in run
    run_simple(host, port, self, **options)
TypeError: run_simple() got an unexpected keyword argument 'dev_tools_props_check'

---------------------------------------------------------------------------
ConnectionRefusedError                    Traceback (most recent call last)
~\Miniconda3\envs\jlab2\lib\site-packages\urllib3\connection.py in _new_conn(self)
    159             conn = connection.create_connection(
--> 160                 (self._dns_host, self.port), self.timeout, **extra_kw
    161             )

~\Miniconda3\envs\jlab2\lib\site-packages\urllib3\util\connection.py in create_connection(address, timeout, source_address, socket_options)
     83     if err is not None:
---> 84         raise err
     85 

~\Miniconda3\envs\jlab2\lib\site-packages\urllib3\util\connection.py in create_connection(address, timeout, source_address, socket_options)
     73                 sock.bind(source_address)
---> 74             sock.connect(sa)
     75             return sock

ConnectionRefusedError: [WinError 10061] No connection could be made because the target machine actively refused it

During handling of the above exception, another exception occurred:

NewConnectionError                        Traceback (most recent call last)
~\Miniconda3\envs\jlab2\lib\site-packages\urllib3\connectionpool.py in urlopen(self, method, url, body, headers, retries, redirect, assert_same_host, timeout, pool_timeout, release_conn, chunked, body_pos, **response_kw)
    676                 headers=headers,
--> 677                 chunked=chunked,
    678             )

~\Miniconda3\envs\jlab2\lib\site-packages\urllib3\connectionpool.py in _make_request(self, conn, method, url, timeout, chunked, **httplib_request_kw)
    391         else:
--> 392             conn.request(method, url, **httplib_request_kw)
    393 

~\Miniconda3\envs\jlab2\lib\http\client.py in request(self, method, url, body, headers, encode_chunked)
   1251         """Send a complete request to the server."""
-> 1252         self._send_request(method, url, body, headers, encode_chunked)
   1253 

~\Miniconda3\envs\jlab2\lib\http\client.py in _send_request(self, method, url, body, headers, encode_chunked)
   1297             body = _encode(body, 'body')
-> 1298         self.endheaders(body, encode_chunked=encode_chunked)
   1299 

~\Miniconda3\envs\jlab2\lib\http\client.py in endheaders(self, message_body, encode_chunked)
   1246             raise CannotSendHeader()
-> 1247         self._send_output(message_body, encode_chunked=encode_chunked)
   1248 

~\Miniconda3\envs\jlab2\lib\http\client.py in _send_output(self, message_body, encode_chunked)
   1025         del self._buffer[:]
-> 1026         self.send(msg)
   1027 

~\Miniconda3\envs\jlab2\lib\http\client.py in send(self, data)
    965             if self.auto_open:
--> 966                 self.connect()
    967             else:

~\Miniconda3\envs\jlab2\lib\site-packages\urllib3\connection.py in connect(self)
    186     def connect(self):
--> 187         conn = self._new_conn()
    188         self._prepare_conn(conn)

~\Miniconda3\envs\jlab2\lib\site-packages\urllib3\connection.py in _new_conn(self)
    171             raise NewConnectionError(
--> 172                 self, "Failed to establish a new connection: %s" % e
    173             )

NewConnectionError: <urllib3.connection.HTTPConnection object at 0x0000020565163988>: Failed to establish a new connection: [WinError 10061] No connection could be made because the target machine actively refused it

During handling of the above exception, another exception occurred:

MaxRetryError                             Traceback (most recent call last)
~\Miniconda3\envs\jlab2\lib\site-packages\requests\adapters.py in send(self, request, stream, timeout, verify, cert, proxies)
    448                     retries=self.max_retries,
--> 449                     timeout=timeout
    450                 )

~\Miniconda3\envs\jlab2\lib\site-packages\urllib3\connectionpool.py in urlopen(self, method, url, body, headers, retries, redirect, assert_same_host, timeout, pool_timeout, release_conn, chunked, body_pos, **response_kw)
    724             retries = retries.increment(
--> 725                 method, url, error=e, _pool=self, _stacktrace=sys.exc_info()[2]
    726             )

~\Miniconda3\envs\jlab2\lib\site-packages\urllib3\util\retry.py in increment(self, method, url, response, error, _pool, _stacktrace)
    438         if new_retry.is_exhausted():
--> 439             raise MaxRetryError(_pool, url, error or ResponseError(cause))
    440 

MaxRetryError: HTTPConnectionPool(host='127.0.0.1', port=8050): Max retries exceeded with url: /_alive_a6195403-4d44-472b-8ea6-99427edf92c3 (Caused by NewConnectionError('<urllib3.connection.HTTPConnection object at 0x0000020565163988>: Failed to establish a new connection: [WinError 10061] No connection could be made because the target machine actively refused it'))

During handling of the above exception, another exception occurred:

ConnectionError                           Traceback (most recent call last)
<ipython-input-4-6c821b38cf6f> in <module>
      2 app.layout = html.Div('Minimal')
      3 
----> 4 app.run_server(mode="external", debug=True)

~\Miniconda3\envs\jlab2\lib\site-packages\jupyter_dash\jupyter_app.py in run_server(self, mode, width, height, inline_exceptions, **kwargs)
    292                 )
    293 
--> 294         wait_for_app()
    295 
    296         if mode == 'inline':

~\Miniconda3\envs\jlab2\lib\site-packages\retrying.py in wrapped_f(*args, **kw)
     47             @six.wraps(f)
     48             def wrapped_f(*args, **kw):
---> 49                 return Retrying(*dargs, **dkw).call(f, *args, **kw)
     50 
     51             return wrapped_f

~\Miniconda3\envs\jlab2\lib\site-packages\retrying.py in call(self, fn, *args, **kwargs)
    210                 if not self._wrap_exception and attempt.has_exception:
    211                     # get() on an attempt with an exception should cause it to be raised, but raise just in case
--> 212                     raise attempt.get()
    213                 else:
    214                     raise RetryError(attempt)

~\Miniconda3\envs\jlab2\lib\site-packages\retrying.py in get(self, wrap_exception)
    245                 raise RetryError(self)
    246             else:
--> 247                 six.reraise(self.value[0], self.value[1], self.value[2])
    248         else:
    249             return self.value

~\AppData\Roaming\Python\Python37\site-packages\six.py in reraise(tp, value, tb)
    691             if value.__traceback__ is not tb:
    692                 raise value.with_traceback(tb)
--> 693             raise value
    694         finally:
    695             value = None

~\Miniconda3\envs\jlab2\lib\site-packages\retrying.py in call(self, fn, *args, **kwargs)
    198         while True:
    199             try:
--> 200                 attempt = Attempt(fn(*args, **kwargs), attempt_number, False)
    201             except:
    202                 tb = sys.exc_info()

~\Miniconda3\envs\jlab2\lib\site-packages\jupyter_dash\jupyter_app.py in wait_for_app()
    280         )
    281         def wait_for_app():
--> 282             res = requests.get(alive_url).content.decode()
    283             if res != "Alive":
    284                 url = "http://{host}:{port}".format(

~\Miniconda3\envs\jlab2\lib\site-packages\requests\api.py in get(url, params, **kwargs)
     74 
     75     kwargs.setdefault('allow_redirects', True)
---> 76     return request('get', url, params=params, **kwargs)
     77 
     78 

~\Miniconda3\envs\jlab2\lib\site-packages\requests\api.py in request(method, url, **kwargs)
     59     # cases, and look like a memory leak in others.
     60     with sessions.Session() as session:
---> 61         return session.request(method=method, url=url, **kwargs)
     62 
     63 

~\Miniconda3\envs\jlab2\lib\site-packages\requests\sessions.py in request(self, method, url, params, data, headers, cookies, files, auth, timeout, allow_redirects, proxies, hooks, stream, verify, cert, json)
    528         }
    529         send_kwargs.update(settings)
--> 530         resp = self.send(prep, **send_kwargs)
    531 
    532         return resp

~\Miniconda3\envs\jlab2\lib\site-packages\requests\sessions.py in send(self, request, **kwargs)
    641 
    642         # Send the request
--> 643         r = adapter.send(request, **kwargs)
    644 
    645         # Total elapsed time of the request (approximately)

~\Miniconda3\envs\jlab2\lib\site-packages\requests\adapters.py in send(self, request, stream, timeout, verify, cert, proxies)
    514                 raise SSLError(e, request=request)
    515 
--> 516             raise ConnectionError(e, request=request)
    517 
    518         except ClosedPoolError as e:

ConnectionError: HTTPConnectionPool(host='127.0.0.1', port=8050): Max retries exceeded with url: /_alive_a6195403-4d44-472b-8ea6-99427edf92c3 (Caused by NewConnectionError('<urllib3.connection.HTTPConnection object at 0x0000020565163988>: Failed to establish a new connection: [WinError 10061] No connection could be made because the target machine actively refused it'))

Inline dash does not work with Jupyter behind an NGINX proxy

We have an NGINX proxy that does SSL termination and reroutes traffic between ports. It is fine if the server is started on localhost port 8050 over HTTP, but we would need to inject a different URL, for example https://hostname:8051. We found the code (around line 217 in jupyter_app.py) that we would need to manually edit to fix this.

Is it OK if I submit a PR that adds an extra argument to app.run_server(...), perhaps called proxy_server_url, which lets you specify the host to use in JavaScript without changing anything else? If left out, the behavior would continue to work as it does now.
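For reference, the constructor already accepts a server_url argument (visible in the __init__ signature quoted in tracebacks elsewhere on this page); whether it covers this exact NGINX setup is an open question, but a sketch of using it would be:

from jupyter_dash import JupyterDash
from dash import html

# server_url tells JupyterDash the address at which the browser can reach the app,
# e.g. behind a proxy doing SSL termination (hostname and port are illustrative).
app = JupyterDash(__name__, server_url="https://hostname:8051")
app.layout = html.Div("Hello behind the proxy")
app.run_server(mode="external")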

Jupyter-Dash no longer working in Jupyterlab 3.1.1?

Are there issues with Jupyter-Dash in Jupyterlab 3.1.1?

I've just created a new venv and installed the latest jupyterlab, dash, Jupyter-dash libraries. I'm trying to run the code from the example notebook in the repo.

When I run the cell for JupyterDash.infer_jupyter_proxy_config() I get the error:

c:\jupyter\venv\lib\site-packages\jupyter_dash\comms.py:69: RuntimeWarning: coroutine 'Kernel.do_one_iteration' was never awaited
kernel.do_one_iteration()
RuntimeWarning: Enable tracemalloc to get the object allocation traceback

Then the notebook becomes unresponsive until I restart the kernel (screenshot attached).

Package versions are shown in the attached screenshot.

The extension looks like it was installed correctly; I installed jupyter-dash and then did a lab rebuild (screenshot attached).

I've tried running this on my PC as well as in a docker container on my NAS. Both have the same issues.

Project abandoned?

Is this project abandoned by Plotly? I have identified a couple of issues and have fixes (e.g. #60), but there haven't been any commits since January 22, 2021. There also don't appear to have been any responses to issues. I don't mind maintaining a fork with fixes, but I would prefer to move good changes upstream if possible.

Incompatibility with Dash v2.1.0

Quick-Fix on User-Level

If you are a user and looking for a quick fix do this:

$ python -m pip uninstall dash
$ python -m pip install 'dash==2.0.0'

Cause

I spent the past 30 minutes diving through the source code of both projects and I would assume it has to do with:
plotly/dash#1876

Problem

Running the following lines always leads to the AttributeError below:

from jupyter_dash import JupyterDash
app = JupyterDash(__name__)
app.run_server(mode='inline')
---------------------------------------------------------------------------
AttributeError                            Traceback (most recent call last)
Input In [12], in <module>
      1 from jupyter_dash import JupyterDash
      2 app = JupyterDash()
----> 3 app.run_server(mode='inline')

File ~\.pyenv\pyenv-win\versions\3.10.2\lib\site-packages\jupyter_dash\jupyter_app.py:231, in JupyterDash.run_server(self, mode, width, height, inline_exceptions, **kwargs)
    229 else:
    230     requests_pathname_prefix = '/'
--> 231 self.config.update({'requests_pathname_prefix': requests_pathname_prefix})
    233 # Compute server_url url
    234 if self.server_url is None:

File ~\.pyenv\pyenv-win\versions\3.10.2\lib\site-packages\dash\_utils.py:169, in AttributeDict.update(self, other)
    166 def update(self, other):
    167     # Overrides dict.update() to use __setitem__ above
    168     for k, v in other.items():
--> 169         self[k] = v

File ~\.pyenv\pyenv-win\versions\3.10.2\lib\site-packages\dash\_utils.py:158, in AttributeDict.__setitem__(self, key, val)
    156 def __setitem__(self, key, val):
    157     if key in self.__dict__.get("_read_only", {}):
--> 158         raise AttributeError(self._read_only[key], key)
    160     final_msg = self.__dict__.get("_final")
    161     if final_msg and key not in self:

AttributeError: ('Read-only: can only be set in the Dash constructor or during init_app()', 'requests_pathname_prefix')
