Greetings, I'm having an issue with an Airflow instance where a task fails and its logs cannot be read...
Executor: CeleryExecutor
OS: Ubuntu 24.04 LTS
Airflow Version: 2.10.1
Deployment: Docker Swarm
Deployment details: Ran docker-compose on docker swarm setup on 2 VMs.
Logs:
*** Could not read served logs: Invalid URL 'http://:8793/log/dag_id=my_dag/run_id=dynamic__apple_3_my_dag_cb353081__2024-09-09T14:41:22.596199__f73c5571719e4f35bf195ded40e5e25b/task_id=cleanup_temporary_directory/attempt=1.log': No host supplied
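The telling part of that error is the missing host: there is nothing between `http://` and `:8793`. The webserver builds the served-logs URL from the hostname recorded on the task instance, so an empty host likely means no worker hostname was ever recorded. A minimal stdlib sketch (the path is shortened for illustration):

```python
from urllib.parse import urlparse

# The served-log URL from the error, path shortened for illustration.
# Note there is nothing between "http://" and ":8793" — no worker hostname.
url = "http://:8793/log/dag_id=my_dag/attempt=1.log"

parsed = urlparse(url)
print(parsed.hostname)  # None — no host could be extracted from the URL
print(parsed.port)      # 8793 — the worker log server port
```

Any HTTP client given this URL will refuse it for the same reason ("No host supplied").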
Event logs:
Executor CeleryExecutor(parallelism=128) reported that the task instance <TaskInstance: my_dag.cleanup_temporary_directory dynamic__apple_3_my_dag_cb353081__2024-09-09T14:41:22.596199__f73c5571719e4f35bf195ded40e5e25b [queued]> finished with state failed, but the task instance's state attribute is queued. Learn more: https://airflow.apache.org/docs/apache-airflow/stable/troubleshooting.html#task-state-changed-externally
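This event suggests the task failed while still queued, i.e. before it ever started on a worker, which would also explain why no hostname made it into the log URL. One configuration knob sometimes checked in multi-VM Swarm setups is how workers report their hostname; the sketch below is an assumption to investigate, not a confirmed fix:

```ini
# airflow.cfg — a possible knob to check (an assumption, not a confirmed fix):
# have workers register a routable IP instead of a container hostname that
# other nodes in the Swarm cannot resolve.
[core]
hostname_callable = airflow.utils.net.get_host_ip_address

[logging]
worker_log_server_port = 8793  ; default served-logs port, matches the URL above
```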