
SUPERSET-CELERY-REDIS: Configuring Celery with a remote Redis server throws an error #32558

@rabindragogoi

Description


Bug description

We configured Celery with a remote Redis server. The Superset instance is up and running, but when I run a query in SQL Lab it returns this error:
Failed to start remote query on a worker

In celery.log:
Cannot connect to redis://localhost:6367. (So it is still trying to connect to the localhost Redis.)

When I configure Redis on the same local instance, it works perfectly.
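As a side note (not part of the original report): the broker URL used in the config below decomposes to a remote host and port, which the standard library can show directly. Here `user`/`secret` stand in for the real `<username>`/`<password>` placeholders. Seeing `localhost:6379` in the logs therefore suggests the configured URL was never picked up at all.

```python
from urllib.parse import urlsplit

# Decompose the remote broker URL from the config in this report.
# "user"/"secret" are stand-ins for the redacted credentials.
url = urlsplit("redis://user:secret@redis-server.uat.dbs.com:1200/0")
print(url.hostname, url.port)  # → redis-server.uat.dbs.com 1200
```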

I have made all the configurations for the remote Redis server.
Below is the error from gunicorn.log:

```
  File "/opt/app-root/lib64/python3.11/site-packages/kombu/utils/functional.py", line 318, in retry_over_time
    return fun(*args, **kwargs)
           ^^^^^^^^^^^^^^^^^^^^
  File "/opt/app-root/lib64/python3.11/site-packages/celery/backends/redis.py", line 106, in _reconnect_pubsub
    metas = self.backend.client.mget(self.subscribed_to)
            ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "/opt/app-root/lib64/python3.11/site-packages/redis/commands/core.py", line 2002, in mget
    return self.execute_command("MGET", *args, **options)
           ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "/opt/app-root/lib64/python3.11/site-packages/redis/client.py", line 1266, in execute_command
    conn = self.connection or pool.get_connection(command_name, **options)
                              ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "/opt/app-root/lib64/python3.11/site-packages/redis/connection.py", line 1461, in get_connection
    connection.connect()
  File "/opt/app-root/lib64/python3.11/site-packages/redis/connection.py", line 713, in connect
    raise ConnectionError(self._error_message(e))
redis.exceptions.ConnectionError: Error 111 connecting to localhost:6379. Connection refused.

The above exception was the direct cause of the following exception:

Traceback (most recent call last):
  File "/opt/app-root/lib64/python3.11/site-packages/superset/sqllab/sql_json_executer.py", line 170, in execute
    task = self._get_sql_results_task.delay(  # type: ignore
           ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "/opt/app-root/lib64/python3.11/site-packages/celery/app/task.py", line 444, in delay
    return self.apply_async(args, kwargs)
           ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "/opt/app-root/lib64/python3.11/site-packages/celery/app/task.py", line 594, in apply_async
    return app.send_task(
           ^^^^^^^^^^^^^^
  File "/opt/app-root/lib64/python3.11/site-packages/celery/app/base.py", line 796, in send_task
    with P.connection._reraise_as_library_errors():
  File "/usr/lib64/python3.11/contextlib.py", line 158, in __exit__
    self.gen.throw(typ, value, traceback)
  File "/opt/app-root/lib64/python3.11/site-packages/kombu/connection.py", line 476, in _reraise_as_library_errors
    raise ConnectionError(str(exc)) from exc
kombu.exceptions.OperationalError: Error 111 connecting to localhost:6379. Connection refused.

The above exception was the direct cause of the following exception:
```
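Not part of the original report: before digging into Celery settings, a plain TCP reachability check run from the worker host can rule out firewall or DNS issues. This is a generic stdlib sketch; the host and port are the remote Redis values from the config below.

```python
import socket

def can_reach(host: str, port: int, timeout: float = 3.0) -> bool:
    """Return True if a TCP connection to host:port succeeds within timeout."""
    try:
        with socket.create_connection((host, port), timeout=timeout):
            return True
    except OSError:  # covers refused, timed out, and DNS failures
        return False

# Remote Redis host/port taken from the config in this report.
print(can_reach("redis-server.uat.dbs.com", 1200))
```

If this prints `False` from the worker host, the problem is network-level rather than anything in Superset or Celery.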

Below is my config.py file:

Screenshots/recordings

[screenshot attached]

CONFIG.PY FILE:

```python
from celery.schedules import crontab
from cachelib.redis import RedisCache


class CeleryConfig:  # pylint: disable=too-few-public-methods
    # BROKER_URL = 'redis://localhost:6379/0'
    CELERY_BROKER_URL = 'redis://<username>:<password>@redis-server.uat.dbs.com:1200/0'
    CELERY_IMPORTS = (
        'superset.sql_lab',
        'superset.tasks',
    )
    # CELERY_RESULT_BACKEND = 'redis://localhost:6379/0'
    CELERY_RESULT_BACKEND = 'redis://<username>:<password>@redis-server.uat.dbs.com:1200/0'
    CELERYD_LOG_LEVEL = 'DEBUG'
    CELERY_TASK_SERIALIZER = "json"
    CELERY_RESULT_SERIALIZER = "json"
    CELERYD_PREFETCH_MULTIPLIER = 10
    CELERY_ACKS_LATE = True
    CELERY_ANNOTATIONS = {
        'sql_lab.get_sql_results': {
            'rate_limit': '100/s',
        },
        'email_reports.send': {
            'rate_limit': '1/s',
            'time_limit': 120,
            'soft_time_limit': 150,
            'ignore_result': True,
        },
    }
    CELERYBEAT_SCHEDULE = {
        'email_reports.schedule_hourly': {
            'task': 'email_reports.schedule_hourly',
            'schedule': crontab(minute=1, hour='*'),
        },
    }


CELERY_CONFIG = CeleryConfig  # pylint: disable=invalid-name

RESULTS_BACKEND = RedisCache(
    host='redis-server.uat.dbs.com',
    port=1200,
    key_prefix='superset_results',
    username='',
    password='',
)
```
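One thing worth checking, not claimed anywhere in the original report: Celery 5 renamed its settings to lowercase (`BROKER_URL` became `broker_url`, `CELERY_RESULT_BACKEND` became `result_backend`), and `CELERY_BROKER_URL` was never one of the old-style Celery names (the old name was plain `BROKER_URL`). A setting name Celery does not recognize may simply be ignored, in which case the broker falls back to the default `redis://localhost:6379`, matching the host and port in the traceback above. A minimal sketch of the same config using new-style names (URLs and placeholders copied from the report; annotations and beat schedule omitted for brevity):

```python
# Hedged sketch using Celery 5's lowercase setting names; the URLs keep the
# report's <username>/<password> placeholders and are not real credentials.
class CeleryConfig:
    broker_url = "redis://<username>:<password>@redis-server.uat.dbs.com:1200/0"
    imports = ("superset.sql_lab", "superset.tasks")
    result_backend = "redis://<username>:<password>@redis-server.uat.dbs.com:1200/0"
    worker_prefetch_multiplier = 10
    task_acks_late = True
    # task_annotations and beat_schedule omitted for brevity


CELERY_CONFIG = CeleryConfig
```

Restarting both the web server and the Celery workers after the change matters here, since the workers read the same `superset_config.py`.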

Superset version

master / latest-dev

Python version

3.9

Node version

16

Browser

Chrome

Additional context

No response

Checklist

  • I have searched Superset docs and Slack and didn't find a solution to my problem.
  • I have searched the GitHub issue tracker and didn't find a similar bug report.
  • I have checked Superset's logs for errors and if I found a relevant Python stacktrace, I included it here as text in the "additional context" section.

Metadata


Assignees

No one assigned

    Labels

    infra:caching (Infra setup and configuration related to caching) · sqllab (Namespace | Anything related to the SQL Lab)

    Type

    No type

    Projects

    No projects

    Milestone

    No milestone

    Relationships

    None yet

    Development

    No branches or pull requests
