
adding support for async callbacks and page layouts #3089


Merged: 69 commits, Jun 18, 2025

Commits
e1002d5
adding support for async callbacks and page layouts
BSd3v Nov 21, 2024
e24e094
adding new `use_async` attribute to `Dash` and having callbacks and l…
BSd3v Nov 21, 2024
5d492dc
fixing issue with indentations of layouts
BSd3v Nov 21, 2024
69ee069
removing needing to classify the function as async, instead I am look…
BSd3v Nov 21, 2024
950888c
fixing for lint
BSd3v Nov 22, 2024
916efdc
adjustments for the test for the debugger, making test more robust fo…
BSd3v Nov 22, 2024
0dd778e
fixing lint issues
BSd3v Nov 22, 2024
2d9a930
Adjustments for `use_async` and determining whether the app can be us…
BSd3v Nov 22, 2024
b2f9cd6
disable lint for unused import on `asgiref`
BSd3v Nov 22, 2024
f1ac667
adjustments for formatting
BSd3v Nov 22, 2024
378099f
Merge remote-tracking branch 'dash/dev' into support-async-callbacks
BSd3v Nov 22, 2024
52de22e
adding additional ignore for `flake8`
BSd3v Nov 22, 2024
4189ed6
more adjustments for lint
BSd3v Nov 22, 2024
0ddf254
adding `dash[async]`
BSd3v Nov 22, 2024
76a9a56
Merge branch 'plotly:dev' into support-async-callbacks
BSd3v Nov 25, 2024
337ec26
attempt no 1 for refactoring dash for `dispatch`
BSd3v Dec 4, 2024
a730b4b
fixing for lint
BSd3v Dec 4, 2024
538515d
fixing no outputs
BSd3v Dec 4, 2024
4f14a1a
attempt no 1 for refactoring callbacks
BSd3v Dec 4, 2024
96df44e
fixing for multi outputs
BSd3v Dec 5, 2024
91400a3
attempt no 1 refactoring background callbacks for async functions
BSd3v Dec 5, 2024
1856f5c
Merge remote-tracking branch 'dash/dev' into support-async-callbacks
BSd3v Dec 5, 2024
671cb2b
fixing for lint and progress outputs
BSd3v Dec 5, 2024
f19fe23
lint adjustments
BSd3v Dec 6, 2024
17c824d
adding async tests
Dec 6, 2024
fdfd058
bypassing `test_rdrh003_refresh_jwt` as this fails with 3 failed requ…
Dec 7, 2024
fb691c3
removing `__init__` from `async` directory
Dec 7, 2024
2473546
adjusting `jwt` test to adjust value in the MultiProcessing and remov…
Dec 9, 2024
b2e8884
removing flaky for lint
Dec 9, 2024
df29ee6
adjustments for formatting
Dec 9, 2024
5342573
simplifying `jwt` test by using `before_request` removing needs to ch…
Dec 9, 2024
c79debf
attempting to allow for failed tests to be rerun to see if the cache …
Dec 9, 2024
37d649c
moving retry to `integration-dash`
Dec 9, 2024
eaec04f
adjusting browser for percy snapshot to append `async` to the snapsho…
Dec 9, 2024
68320f4
fixing `browser` for lint
Dec 9, 2024
5b201a1
Merge branch 'dev' into support-async-callbacks
BSd3v Feb 3, 2025
1aea686
Merge remote-tracking branch 'remote/dev' into support-async-callbacks
BSd3v Mar 25, 2025
fc6c6bc
updating missing `long` -> `background` adjustments and fixing issue …
BSd3v Mar 25, 2025
503f7c0
fixing for lint
BSd3v Mar 25, 2025
c26671e
adding tests for async background callbacks
BSd3v Mar 25, 2025
20dcd67
updating to bypass test if not async
BSd3v Mar 25, 2025
bb3d881
fixing for lint
BSd3v Mar 25, 2025
c6940c7
update for celery app process
BSd3v Mar 25, 2025
b7a8c4a
Merge branch 'dev' into support-async-callbacks
BSd3v Mar 26, 2025
51c1689
Merge branch 'dev' into support-async-callbacks
BSd3v Mar 27, 2025
eb96c5d
Merge branch 'dev' into support-async-callbacks
BSd3v Apr 9, 2025
832ab4d
Merge remote-tracking branch 'remote/dev' into support-async-callbacks
BSd3v Jun 11, 2025
d0b2dd3
fixing tyop
BSd3v Jun 11, 2025
e88bc3c
fixing for issues with callback structure
BSd3v Jun 11, 2025
a1bd54f
Update dash/_callback.py
BSd3v Jun 11, 2025
7f35ecd
adjusting issue with try
BSd3v Jun 11, 2025
28b0693
Merge branch 'support-async-callbacks' of github.com:BSd3v/dash into …
BSd3v Jun 11, 2025
443f785
removing `test_async` from the utils to keep from confusing `pytest`
BSd3v Jun 11, 2025
07bcd38
fixing for lint
BSd3v Jun 11, 2025
e82f561
updates for new `background_callbacks` test path
BSd3v Jun 11, 2025
43fe153
adjustment for GHA to run the async test
BSd3v Jun 11, 2025
6fd556d
adding step for creating the test components
BSd3v Jun 11, 2025
abbfcd2
altering path for the get background callback manager
BSd3v Jun 11, 2025
13a33be
verifying redis is still running
BSd3v Jun 11, 2025
e3a1496
issue with celery
BSd3v Jun 11, 2025
e8bcb22
adjusting order
BSd3v Jun 11, 2025
43d10c4
attempt to fix failing test
BSd3v Jun 11, 2025
4b3c011
moving async install
BSd3v Jun 11, 2025
777a12b
adding test components
BSd3v Jun 11, 2025
769fc7e
remove async test from circleci
T4rk1n Jun 13, 2025
12574d1
move async tests
T4rk1n Jun 16, 2025
3c610b3
remove integrations tests from async_tests
T4rk1n Jun 16, 2025
d955ce1
fix async_tests
T4rk1n Jun 16, 2025
513408d
fix async tests
T4rk1n Jun 17, 2025
12 changes: 8 additions & 4 deletions .github/workflows/testing.yml
@@ -135,9 +135,9 @@ jobs:
run: |
cd tests
pytest compliance/test_typing.py
background-callbacks:
name: Run Background Callback Tests (Python ${{ matrix.python-version }})
name: Run Background & Async Callback Tests (Python ${{ matrix.python-version }})
needs: [build, changes_filter]
if: |
(github.event_name == 'push' && (github.ref == 'refs/heads/master' || github.ref == 'refs/heads/dev')) ||
@@ -195,7 +195,7 @@ jobs:
python -m pip install --upgrade pip wheel
python -m pip install "setuptools<78.0.0"
python -m pip install "selenium==4.32.0"
find packages -name dash-*.whl -print -exec sh -c 'pip install "{}[ci,testing,dev,celery,diskcache]"' \;
find packages -name dash-*.whl -print -exec sh -c 'pip install "{}[async,ci,testing,dev,celery,diskcache]"' \;
- name: Install Google Chrome
run: |
@@ -253,13 +253,17 @@ jobs:
run: |
python -c "import redis; r = redis.Redis(host='localhost', port=6379, db=0); r.ping(); print('Successfully connected to Redis!')"
- name: Run Background Callback Tests
- name: Build/Setup test components
run: npm run setup-tests.py

- name: Run Background & Async Callback Tests
run: |
mkdir bgtests
cp -r tests bgtests/tests
cd bgtests
touch __init__.py
pytest --headless --nopercyfinalize tests/background_callback -v -s
pytest --headless --nopercyfinalize tests/async_tests -v -s
table-unit:
name: Table Unit/Lint Tests (Python ${{ matrix.python-version }})
603 changes: 396 additions & 207 deletions dash/_callback.py

Large diffs are not rendered by default.

66 changes: 61 additions & 5 deletions dash/background_callback/managers/celery_manager.py
@@ -1,6 +1,8 @@
import json
import traceback
from contextvars import copy_context
import asyncio
from functools import partial

from _plotly_utils.utils import PlotlyJSONEncoder

@@ -16,7 +18,7 @@ class CeleryManager(BaseBackgroundCallbackManager):

def __init__(self, celery_app, cache_by=None, expire=None):
"""
Long callback manager that runs callback logic on a celery task queue,
Background callback manager that runs callback logic on a celery task queue,
and stores results using a celery result backend.
:param celery_app:
@@ -40,7 +42,7 @@ def __init__(self, celery_app, cache_by=None, expire=None):
except ImportError as missing_imports:
raise ImportError(
"""\
CeleryLongCallbackManager requires extra dependencies which can be installed doing
CeleryManager requires extra dependencies which can be installed doing
$ pip install "dash[celery]"\n"""
) from missing_imports
@@ -135,11 +137,13 @@ def get_updated_props(self, key):
return json.loads(updated_props)


def _make_job_fn(fn, celery_app, progress, key):
def _make_job_fn(fn, celery_app, progress, key): # pylint: disable=too-many-statements
cache = celery_app.backend

@celery_app.task(name=f"background_callback_{key}")
def job_fn(result_key, progress_key, user_callback_args, context=None):
def job_fn(
result_key, progress_key, user_callback_args, context=None
): # pylint: disable=too-many-statements
def _set_progress(progress_value):
if not isinstance(progress_value, (list, tuple)):
progress_value = [progress_value]
@@ -198,7 +202,59 @@ def run():
result_key, json.dumps(user_callback_output, cls=PlotlyJSONEncoder)
)

ctx.run(run)
async def async_run():
c = AttributeDict(**context)
c.ignore_register_page = False
c.updated_props = ProxySetProps(_set_props)
context_value.set(c)
errored = False
try:
if isinstance(user_callback_args, dict):
user_callback_output = await fn(
*maybe_progress, **user_callback_args
)
elif isinstance(user_callback_args, (list, tuple)):
user_callback_output = await fn(
*maybe_progress, *user_callback_args
)
else:
user_callback_output = await fn(*maybe_progress, user_callback_args)
except PreventUpdate:
# Put NoUpdate dict directly to avoid circular imports.
errored = True
cache.set(
result_key,
json.dumps(
{"_dash_no_update": "_dash_no_update"}, cls=PlotlyJSONEncoder
),
)
except Exception as err: # pylint: disable=broad-except
errored = True
cache.set(
result_key,
json.dumps(
{
"background_callback_error": {
"msg": str(err),
"tb": traceback.format_exc(),
}
},
),
)

if asyncio.iscoroutine(user_callback_output):
user_callback_output = await user_callback_output

if not errored:
cache.set(
result_key, json.dumps(user_callback_output, cls=PlotlyJSONEncoder)
)

if asyncio.iscoroutinefunction(fn):
func = partial(ctx.run, async_run)
asyncio.run(func())
else:
ctx.run(run)
Review comment on lines +253 to +257 (Contributor):

Are we guaranteed to be out of the asyncio loop? In which case, the run might fail. Otherwise it might be better to only have the async_run definition so there is less code repetition.


return job_fn
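The reviewer's concern above — that `asyncio.run` fails when called from a thread that already has a running event loop — can be guarded against explicitly. A hedged sketch (the `run_maybe_async` helper and its names are hypothetical, not part of Dash) that falls back to a dedicated thread when a loop is already live:

```python
import asyncio
import threading


def run_maybe_async(sync_run, async_run, fn_is_async):
    """Dispatch to the sync or async job body, tolerating a live event loop."""
    if not fn_is_async:
        sync_run()
        return
    try:
        # asyncio.run raises RuntimeError if called from inside a running loop,
        # so probe for one first.
        asyncio.get_running_loop()
    except RuntimeError:
        # No loop in this thread: asyncio.run is safe.
        asyncio.run(async_run())
    else:
        # A loop is already running here (e.g. an async worker); run the
        # coroutine to completion on a fresh loop in a separate thread.
        t = threading.Thread(target=asyncio.run, args=(async_run(),))
        t.start()
        t.join()
```

With this shape, the sync branch is untouched and the async branch never re-enters an active loop.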

92 changes: 86 additions & 6 deletions dash/background_callback/managers/diskcache_manager.py
@@ -1,5 +1,7 @@
import traceback
from contextvars import copy_context
import asyncio
from functools import partial


from . import BaseBackgroundCallbackManager
@@ -16,7 +18,7 @@ class DiskcacheManager(BaseBackgroundCallbackManager):

def __init__(self, cache=None, cache_by=None, expire=None):
"""
Long callback manager that runs callback logic in a subprocess and stores
Background callback manager that runs callback logic in a subprocess and stores
results on disk using diskcache
:param cache:
@@ -39,7 +41,7 @@ def __init__(self, cache=None, cache_by=None, expire=None):
except ImportError as missing_imports:
raise ImportError(
"""\
DiskcacheLongCallbackManager requires extra dependencies which can be installed doing
DiskcacheManager requires extra dependencies which can be installed doing
$ pip install "dash[diskcache]"\n"""
) from missing_imports
@@ -117,16 +119,52 @@ def clear_cache_entry(self, key):

# noinspection PyUnresolvedReferences
def call_job_fn(self, key, job_fn, args, context):
"""
Call the job function, supporting both sync and async jobs.
Args:
key: Cache key for the job.
job_fn: The job function to execute.
args: Arguments for the job function.
context: Context for the job.
Returns:
The PID of the spawned process or None for async execution.
"""
# pylint: disable-next=import-outside-toplevel,no-name-in-module,import-error
from multiprocess import Process # type: ignore

# pylint: disable-next=not-callable
proc = Process(
process = Process(
target=job_fn,
args=(key, self._make_progress_key(key), args, context),
)
proc.start()
return proc.pid
process.start()
return process.pid

@staticmethod
def _run_async_in_process(job_fn, key, args, context):
"""
Helper function to run an async job in a new process.
Args:
job_fn: The async job function.
key: Cache key for the job.
args: Arguments for the job function.
context: Context for the job.
"""
# Create a new event loop for the process
loop = asyncio.new_event_loop()
asyncio.set_event_loop(loop)

# Wrap the job function to include key and progress
async_job = partial(job_fn, key, args, context)

try:
# Run the async job and wait for completion
loop.run_until_complete(async_job())
except Exception as e:
# Handle errors, log them, and cache if necessary
raise Exception(str(e)) from e
finally:
loop.close()

def get_progress(self, key):
progress_key = self._make_progress_key(key)
@@ -169,7 +207,9 @@ def get_updated_props(self, key):
return result


# pylint: disable-next=too-many-statements
def _make_job_fn(fn, cache, progress):
# pylint: disable-next=too-many-statements
def job_fn(result_key, progress_key, user_callback_args, context):
def _set_progress(progress_value):
if not isinstance(progress_value, (list, tuple)):
@@ -216,7 +256,47 @@ def run():
if not errored:
cache.set(result_key, user_callback_output)

ctx.run(run)
async def async_run():
c = AttributeDict(**context)
c.ignore_register_page = False
c.updated_props = ProxySetProps(_set_props)
context_value.set(c)
errored = False
try:
if isinstance(user_callback_args, dict):
user_callback_output = await fn(
*maybe_progress, **user_callback_args
)
elif isinstance(user_callback_args, (list, tuple)):
user_callback_output = await fn(
*maybe_progress, *user_callback_args
)
else:
user_callback_output = await fn(*maybe_progress, user_callback_args)
except PreventUpdate:
errored = True
cache.set(result_key, {"_dash_no_update": "_dash_no_update"})
except Exception as err: # pylint: disable=broad-except
errored = True
cache.set(
result_key,
{
"background_callback_error": {
"msg": str(err),
"tb": traceback.format_exc(),
}
},
)
if asyncio.iscoroutine(user_callback_output):
user_callback_output = await user_callback_output
if not errored:
cache.set(result_key, user_callback_output)

if asyncio.iscoroutinefunction(fn):
func = partial(ctx.run, async_run)
asyncio.run(func())
else:
ctx.run(run)
Review comment on lines +295 to +299 (Contributor):

Same comment as for celery_manager, but for this one I am pretty sure there is no loop running.


return job_fn
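The `_run_async_in_process` helper above boils down to a small pattern: create a loop for the process, run the coroutine to completion, and always close the loop. A standalone sketch of that pattern (names here are illustrative, not the Dash API):

```python
import asyncio
from functools import partial


def run_async_job(job_fn, *args):
    # Each spawned process owns a fresh event loop; bind the job's arguments,
    # run the coroutine to completion, and close the loop even if it raises.
    loop = asyncio.new_event_loop()
    asyncio.set_event_loop(loop)
    async_job = partial(job_fn, *args)
    try:
        return loop.run_until_complete(async_job())
    finally:
        loop.close()
```

Because the loop is created inside the child process, there is no interaction with any loop the parent process may be running.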

412 changes: 270 additions & 142 deletions dash/dash.py

Large diffs are not rendered by default.

6 changes: 6 additions & 0 deletions dash/testing/browser.py
@@ -163,6 +163,12 @@ def percy_snapshot(
"""
if widths is None:
widths = [1280]
try:
import asgiref # pylint: disable=unused-import, import-outside-toplevel # noqa: F401, C0415

name += "_async"
except ImportError:
pass
Review comment on lines +166 to +171 (Contributor):

We don't really need the Percy integration if we run the tests separately in a new GitHub action.


logger.info("taking snapshot name => %s", name)
try:
2 changes: 1 addition & 1 deletion package.json
@@ -31,7 +31,7 @@
"private::test.unit-dash": "pytest tests/unit",
"private::test.unit-renderer": "cd dash/dash-renderer && npm run test",
"private::test.unit-generation": "cd @plotly/dash-generator-test-component-typescript && npm ci && npm test",
"private::test.integration-dash": "TESTFILES=$(circleci tests glob \"tests/integration/**/test_*.py\" | circleci tests split --split-by=timings) && pytest --headless --nopercyfinalize --junitxml=test-reports/junit_intg.xml ${TESTFILES}",
"private::test.integration-dash": "TESTFILES=$(circleci tests glob \"tests/integration/**/test_*.py\" | circleci tests split --split-by=timings) && pytest --headless --nopercyfinalize --junitxml=test-reports/junit_intg.xml ${TESTFILES} && python rerun_failed_tests.py",
Review comment (Contributor):

We have a package, pytest-rerunfailures, that we used to use when the tests were very flaky. We could re-enable it for the time being.

"private::test.integration-dash-import": "cd tests/integration/dash && python dash_import_test.py",
"cibuild": "run-s private::cibuild.*",
"build": "run-p private::build.*",
1 change: 1 addition & 0 deletions requirements/async.txt
@@ -0,0 +1 @@
flask[async]
25 changes: 25 additions & 0 deletions rerun_failed_tests.py
@@ -0,0 +1,25 @@
import subprocess
import xml.etree.ElementTree as ET


def parse_test_results(file_path):
    tree = ET.parse(file_path)
    root = tree.getroot()
    failed_tests = []
    for testcase in root.iter("testcase"):
        if testcase.find("failure") is not None:
            failed_tests.append(testcase.get("name"))
    return failed_tests


def rerun_failed_tests(failed_tests):
    if failed_tests:
        print("Initial failed tests:", failed_tests)
        failed_test_names = " ".join(failed_tests)
        result = subprocess.run(
            f"pytest --headless {failed_test_names}",
            shell=True,
            capture_output=True,
            text=True,
        )
        print(result.stdout)
        print(result.stderr)
    else:
        print("All tests passed.")


if __name__ == "__main__":
    failed_tests = parse_test_results("test-reports/junit_intg.xml")
    rerun_failed_tests(failed_tests)
1 change: 1 addition & 0 deletions setup.py
Original file line number Diff line number Diff line change
@@ -29,6 +29,7 @@ def read_req_file(req_type):
install_requires=read_req_file("install"),
python_requires=">=3.8",
extras_require={
"async": read_req_file("async"),
"ci": read_req_file("ci"),
"dev": read_req_file("dev"),
"testing": read_req_file("testing"),
Empty file added tests/async_tests/__init__.py
Empty file.
32 changes: 32 additions & 0 deletions tests/async_tests/app1_async.py
@@ -0,0 +1,32 @@
import time

from dash import Dash, Input, Output, dcc, html

from .utils import get_background_callback_manager

background_callback_manager = get_background_callback_manager()
handle = background_callback_manager.handle

app = Dash(__name__)
app.layout = html.Div(
    [
        dcc.Input(id="input", value="initial value"),
        html.Div(html.Div([1.5, None, "string", html.Div(id="output-1")])),
    ]
)


@app.callback(
    Output("output-1", "children"),
    [Input("input", "value")],
    interval=500,
    manager=background_callback_manager,
    background=True,
)
async def update_output(value):
    time.sleep(0.1)
    return value


if __name__ == "__main__":
    app.run(debug=True)
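One note on the test app above: inside an `async def` callback, `time.sleep` blocks the event loop for the duration; `asyncio.sleep` is the non-blocking equivalent. A hedged sketch of the same callback body using it (illustrative only, not a change the PR makes):

```python
import asyncio


async def update_output(value):
    # Awaiting instead of blocking lets other tasks on the loop make
    # progress while this callback waits.
    await asyncio.sleep(0.1)
    return value
```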
50 changes: 50 additions & 0 deletions tests/async_tests/app_arbitrary_async.py
@@ -0,0 +1,50 @@
import time

from dash import Dash, Input, Output, html, callback, set_props

from .utils import get_background_callback_manager

background_callback_manager = get_background_callback_manager()
handle = background_callback_manager.handle

app = Dash(__name__, background_callback_manager=background_callback_manager)

app.layout = html.Div(
    [
        html.Button("start", id="start"),
        html.Div(id="secondary"),
        html.Div(id="no-output"),
        html.Div("initial", id="output"),
        html.Button("start-no-output", id="start-no-output"),
    ]
)


@callback(
    Output("output", "children"),
    Input("start", "n_clicks"),
    prevent_initial_call=True,
    background=True,
    interval=500,
)
async def on_click(_):
    set_props("secondary", {"children": "first"})
    set_props("secondary", {"style": {"background": "red"}})
    time.sleep(2)
    set_props("secondary", {"children": "second"})
    return "completed"


@callback(
    Input("start-no-output", "n_clicks"),
    prevent_initial_call=True,
    background=True,
)
async def on_click2(_):
    set_props("no-output", {"children": "started"})
    time.sleep(2)
    set_props("no-output", {"children": "completed"})


if __name__ == "__main__":
    app.run(debug=True)
15 changes: 15 additions & 0 deletions tests/async_tests/conftest.py
@@ -0,0 +1,15 @@
import os

import pytest


if "REDIS_URL" in os.environ:
    managers = ["celery", "diskcache"]
else:
    print("Skipping celery tests because REDIS_URL is not defined")
    managers = ["diskcache"]


@pytest.fixture(params=managers)
def manager(request):
    return request.param
63 changes: 63 additions & 0 deletions tests/async_tests/test_async_background_callbacks.py
@@ -0,0 +1,63 @@
import sys
import time
from multiprocessing import Lock

import pytest
from flaky import flaky

from tests.utils import is_dash_async
from .utils import setup_background_callback_app


def test_001ab_arbitrary(dash_duo, manager):
    if not is_dash_async():
        return
    with setup_background_callback_app(manager, "app_arbitrary_async") as app:
        dash_duo.start_server(app)

        dash_duo.wait_for_text_to_equal("#output", "initial")
        # pause for sync
        time.sleep(0.2)
        dash_duo.find_element("#start").click()

        dash_duo.wait_for_text_to_equal("#secondary", "first")
        dash_duo.wait_for_style_to_equal(
            "#secondary", "background-color", "rgba(255, 0, 0, 1)"
        )
        dash_duo.wait_for_text_to_equal("#output", "initial")
        dash_duo.wait_for_text_to_equal("#secondary", "second")
        dash_duo.wait_for_text_to_equal("#output", "completed")

        dash_duo.find_element("#start-no-output").click()

        dash_duo.wait_for_text_to_equal("#no-output", "started")
        dash_duo.wait_for_text_to_equal("#no-output", "completed")


@pytest.mark.skipif(
    sys.version_info < (3, 7), reason="Python 3.6 long callbacks tests hangs up"
)
@flaky(max_runs=3)
def test_002ab_basic(dash_duo, manager):
    """
    Make sure that we settle to the correct final value when handling rapid inputs
    """
    if not is_dash_async():
        return
    lock = Lock()
    with setup_background_callback_app(manager, "app1_async") as app:
        dash_duo.start_server(app)
        dash_duo.wait_for_text_to_equal("#output-1", "initial value", 15)
        input_ = dash_duo.find_element("#input")
        # pause for sync
        time.sleep(0.2)
        dash_duo.clear_input(input_)

        for key in "hello world":
            with lock:
                input_.send_keys(key)

        dash_duo.wait_for_text_to_equal("#output-1", "hello world", 8)

        assert not dash_duo.redux_state_is_loading
        assert dash_duo.get_logs() == []
881 changes: 881 additions & 0 deletions tests/async_tests/test_async_callbacks.py

Large diffs are not rendered by default.

155 changes: 155 additions & 0 deletions tests/async_tests/utils.py
@@ -0,0 +1,155 @@
# pylint: disable=import-outside-toplevel,global-statement,subprocess-popen-preexec-fn,W0201

import os
import shutil
import subprocess
import tempfile
import time
from contextlib import contextmanager

import psutil
import redis

from dash.background_callback import DiskcacheManager

manager = None


class TestDiskCacheManager(DiskcacheManager):
    def __init__(self, cache=None, cache_by=None, expire=None):
        super().__init__(cache=cache, cache_by=cache_by, expire=expire)
        self.running_jobs = []

    def call_job_fn(
        self,
        key,
        job_fn,
        args,
        context,
    ):
        pid = super().call_job_fn(key, job_fn, args, context)
        self.running_jobs.append(pid)
        return pid


def get_background_callback_manager():
    """
    Get the long callback manager configured by environment variables.
    """
    if os.environ.get("LONG_CALLBACK_MANAGER", None) == "celery":
        from dash.background_callback import CeleryManager
        from celery import Celery

        celery_app = Celery(
            __name__,
            broker=os.environ.get("CELERY_BROKER"),
            backend=os.environ.get("CELERY_BACKEND"),
        )
        background_callback_manager = CeleryManager(celery_app)
        redis_conn = redis.Redis(host="localhost", port=6379, db=1)
        background_callback_manager.test_lock = redis_conn.lock("test-lock")
    elif os.environ.get("LONG_CALLBACK_MANAGER", None) == "diskcache":
        import diskcache

        cache = diskcache.Cache(os.environ.get("DISKCACHE_DIR"))
        background_callback_manager = TestDiskCacheManager(cache)
        background_callback_manager.test_lock = diskcache.Lock(cache, "test-lock")
    else:
        raise ValueError(
            "Invalid long callback manager specified as LONG_CALLBACK_MANAGER "
            "environment variable"
        )

    global manager
    manager = background_callback_manager

    return background_callback_manager


def kill(proc_pid):
    process = psutil.Process(proc_pid)
    for proc in process.children(recursive=True):
        proc.kill()
    process.kill()


@contextmanager
def setup_background_callback_app(manager_name, app_name):
    from dash.testing.application_runners import import_app

    if manager_name == "celery":
        os.environ["LONG_CALLBACK_MANAGER"] = "celery"
        redis_url = os.environ["REDIS_URL"].rstrip("/")
        os.environ["CELERY_BROKER"] = f"{redis_url}/0"
        os.environ["CELERY_BACKEND"] = f"{redis_url}/1"

        # Clear redis of cached values
        redis_conn = redis.Redis(host="localhost", port=6379, db=1)
        cache_keys = redis_conn.keys()
        if cache_keys:
            redis_conn.delete(*cache_keys)

        worker = subprocess.Popen(
            [
                "celery",
                "-A",
                f"tests.async_tests.{app_name}:handle",
                "worker",
                "-P",
                "prefork",
                "--concurrency",
                "2",
                "--loglevel=info",
            ],
            encoding="utf8",
            preexec_fn=os.setpgrp,
            stderr=subprocess.PIPE,
        )
        # Wait for the worker to be ready; if you cancel before it is ready,
        # the job will still be queued.
        lines = []
        for line in iter(worker.stderr.readline, ""):
            if "ready" in line:
                break
            lines.append(line)
        else:
            error = "\n".join(lines)
            raise RuntimeError(f"celery failed to start: {error}")

        try:
            yield import_app(f"tests.async_tests.{app_name}")
        finally:
            # Interval may run one more time after settling on final app state.
            # Sleep for 1 interval of time.
            time.sleep(0.5)
            os.environ.pop("LONG_CALLBACK_MANAGER")
            os.environ.pop("CELERY_BROKER")
            os.environ.pop("CELERY_BACKEND")
            kill(worker.pid)
            from dash import page_registry

            page_registry.clear()

    elif manager_name == "diskcache":
        os.environ["LONG_CALLBACK_MANAGER"] = "diskcache"
        cache_directory = tempfile.mkdtemp(prefix="lc-diskcache-")
        print(cache_directory)
        os.environ["DISKCACHE_DIR"] = cache_directory
        try:
            app = import_app(f"tests.async_tests.{app_name}")
            yield app
        finally:
            # Interval may run one more time after settling on final app state.
            # Sleep for a couple of intervals.
            time.sleep(2.0)

            if hasattr(manager, "running_jobs"):
                for job in manager.running_jobs:
                    manager.terminate_job(job)

            shutil.rmtree(cache_directory, ignore_errors=True)
            os.environ.pop("LONG_CALLBACK_MANAGER")
            os.environ.pop("DISKCACHE_DIR")
            from dash import page_registry

            page_registry.clear()
8 changes: 4 additions & 4 deletions tests/integration/devtools/test_devtools_error_handling.py
@@ -73,14 +73,14 @@ def test_dveh001_python_errors(dash_duo):
assert "Special 2 clicks exception" in error0
assert "in bad_sub" not in error0
# dash and flask part of the traceback not included
assert "%% callback invoked %%" not in error0
assert "dash.py" not in error0
assert "self.wsgi_app" not in error0

error1 = get_error_html(dash_duo, 1)
assert "in update_output" in error1
assert "in bad_sub" in error1
assert "ZeroDivisionError" in error1
assert "%% callback invoked %%" not in error1
assert "dash.py" not in error1
assert "self.wsgi_app" not in error1


@@ -109,14 +109,14 @@ def test_dveh006_long_python_errors(dash_duo):
assert "in bad_sub" not in error0
# dash and flask part of the traceback ARE included
# since we set dev_tools_prune_errors=False
assert "%% callback invoked %%" in error0
assert "dash.py" in error0
assert "self.wsgi_app" in error0

error1 = get_error_html(dash_duo, 1)
assert "in update_output" in error1
assert "in bad_sub" in error1
assert "ZeroDivisionError" in error1
assert "%% callback invoked %%" in error1
assert "dash.py" in error1
assert "self.wsgi_app" in error1


73 changes: 29 additions & 44 deletions tests/integration/renderer/test_request_hooks.py
@@ -1,13 +1,10 @@
import json
import functools
import flask
import pytest

from flaky import flaky
from multiprocessing import Value

from dash import Dash, Output, Input, html, dcc
from dash.types import RendererHooks
from werkzeug.exceptions import HTTPException


def test_rdrh001_request_hooks(dash_duo):
@@ -200,7 +197,7 @@ def update_output(value):
assert dash_duo.get_logs() == []


@flaky(max_runs=3)
# @flaky(max_runs=3)
@pytest.mark.parametrize("expiry_code", [401, 400])
def test_rdrh003_refresh_jwt(expiry_code, dash_duo):
app = Dash(__name__)
@@ -244,61 +241,49 @@ def test_rdrh003_refresh_jwt(expiry_code, dash_duo):
]
)

@app.callback(Output("output-1", "children"), [Input("input", "value")])
@app.callback(
Output("output-1", "children"),
[Input("input", "value")],
prevent_initial_call=True,
)
def update_output(value):
jwt_token.value = len(value) + 1
return value

required_jwt_len = 0
jwt_token = Value("i", 0)

# test with an auth layer that requires a JWT with a certain length
def protect_route(func):
@functools.wraps(func)
def wrap(*args, **kwargs):
try:
if flask.request.method == "OPTIONS":
return func(*args, **kwargs)
token = flask.request.headers.environ.get("HTTP_AUTHORIZATION")
if required_jwt_len and (
not token or len(token) != required_jwt_len + len("Bearer ")
):
# Read the data to prevent bug with base http server.
flask.request.get_json(silent=True)
flask.abort(expiry_code, description="JWT Expired " + str(token))
except HTTPException as e:
return e
return func(*args, **kwargs)

return wrap

# wrap all API calls with auth.
for name, method in (
(x, app.server.view_functions[x])
for x in app.routes
if x in app.server.view_functions
):
app.server.view_functions[name] = protect_route(method)
@app.server.before_request
def add_auth():
if flask.request.method != "OPTIONS":
token = flask.request.headers.environ.get("HTTP_AUTHORIZATION")
if jwt_token.value and (
not token or len(token) != jwt_token.value + len("Bearer ")
):
# Read the data to prevent bug with base http server.
flask.request.get_json(silent=True)
flask.abort(expiry_code, description="JWT Expired " + str(token))

dash_duo.start_server(app)

_in = dash_duo.find_element("#input")
dash_duo.clear_input(_in)

required_jwt_len = 1

_in.send_keys("fired request")
dash_duo.wait_for_text_to_equal("#output-1", "")

dash_duo.wait_for_text_to_equal("#output-1", "fired request")
_in.send_keys(".")
dash_duo.wait_for_text_to_equal("#output-1", ".")
dash_duo.wait_for_text_to_equal("#output-token", ".")

required_jwt_len = 2

dash_duo.clear_input(_in)
_in.send_keys("fired request again")

dash_duo.wait_for_text_to_equal("#output-1", "fired request again")
_in.send_keys(".")
dash_duo.wait_for_text_to_equal("#output-1", "..")
dash_duo.wait_for_text_to_equal("#output-token", "..")

assert len(dash_duo.get_logs()) == 2
_in.send_keys(".")
dash_duo.wait_for_text_to_equal("#output-1", "...")
dash_duo.wait_for_text_to_equal("#output-token", "...")

assert len(dash_duo.get_logs()) == 3


def test_rdrh004_layout_hooks(dash_duo):
8 changes: 8 additions & 0 deletions tests/utils.py
@@ -0,0 +1,8 @@
def is_dash_async():
    try:
        import asgiref  # pylint: disable=unused-import  # noqa: F401

        return True
    except ImportError:
        return False
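An equivalent availability probe, for comparison, uses `importlib.util.find_spec` and avoids importing the module at all. This is a sketch of the same check, not the code the PR adds:

```python
import importlib.util


def is_dash_async():
    # asgiref is the marker dependency pulled in by `pip install "dash[async]"`;
    # find_spec reports whether it is importable without importing it.
    return importlib.util.find_spec("asgiref") is not None
```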