Datashare writes are not authorized by producer or associated by consumer #247

MinuraPunchihewa opened this issue Dec 3, 2024
Driver version

2.1.1

Redshift version

PostgreSQL 8.0.2 on i686-pc-linux-gnu, compiled by GCC gcc (GCC) 3.4.2 20041017 (Red Hat 3.4.2-6.fc3), Redshift 1.0.79372

Client Operating System

Ubuntu 24.04 LTS

Python version

3.8

Table schema

tpcds100

Problem description

I am not able to successfully query a Datashare (this Datashare, to be precise) using the package. Although the error message implies that I am attempting a write operation, I am only executing a simple SELECT query.

I can also confirm that querying this Datashare via the psql client works without a problem.

  1. Expected behaviour:
    Queries against Datashares should execute successfully.

  2. Actual behaviour:
    Queries against Datashares fail with the error below.

  3. Error message/stack trace:

---------------------------------------------------------------------------
KeyError                                  Traceback (most recent call last)
File ~/anaconda3/envs/mindsdb_handlers/lib/python3.8/site-packages/redshift_connector/core.py:1821, in Connection.execute(self, cursor, operation, vals)
   1820 try:
-> 1821     ps = cache["ps"][key]
   1822     _logger.debug("Using cached prepared statement")

KeyError: ('SELECT * FROM store_sales s LIMIT 10;', ())

During handling of the above exception, another exception occurred:

ProgrammingError                          Traceback (most recent call last)
Cell In[58], line 3
      1 cursor: redshift_connector.Cursor = connection.cursor()
----> 3 cursor.execute(query)
      4 result: tuple = cursor.fetchall()

File ~/anaconda3/envs/mindsdb_handlers/lib/python3.8/site-packages/redshift_connector/cursor.py:248, in Cursor.execute(self, operation, args, stream, merge_socket_read)
    246     except:
    247         pass
--> 248     raise e
    249 return self

File ~/anaconda3/envs/mindsdb_handlers/lib/python3.8/site-packages/redshift_connector/cursor.py:241, in Cursor.execute(self, operation, args, stream, merge_socket_read)
    239         self._c.execute(self, "begin transaction", None)
    240     self._c.merge_socket_read = merge_socket_read
--> 241     self._c.execute(self, operation, args)
    242 except Exception as e:
    243     try:

File ~/anaconda3/envs/mindsdb_handlers/lib/python3.8/site-packages/redshift_connector/core.py:1902, in Connection.execute(self, cursor, operation, vals)
   1899     else:
   1900         raise e
-> 1902 self.handle_messages(cursor)
   1904 # We've got row_desc that allows us to identify what we're
   1905 # going to get back from this statement.
   1906 output_fc = tuple(self.redshift_types[f["type_oid"]][0] for f in ps["row_desc"])

File ~/anaconda3/envs/mindsdb_handlers/lib/python3.8/site-packages/redshift_connector/core.py:2194, in Connection.handle_messages(self, cursor)
   2191     self.message_types[code](self._read(data_len - 4), cursor)
   2193 if self.error is not None:
-> 2194     raise self.error

ProgrammingError: {'S': 'ERROR', 'C': 'XX000', 'M': 'Datashare writes are not authorized by producer or associated by consumer.', 'D': '\n  -----------------------------------------------\n  error:  Datashare writes are not authorized by producer or associated by consumer.\n  code:      35002\n  context:   \n  query:     -1[child_sequence:1]\n  location:  redcat_rpc_client.cpp:2112\n  process:   padbmaster [pid=1073979618]\n  -----------------------------------------------\n', 'F': '/home/ec2-user/padb/src/sys/xen_execute.cpp', 'L': '13364', 'R': 'pg_throw'}
  1. Any other details that can be helpful:
    As I've mentioned above, querying the Datashare via other means such as the psql client works well, but I have also noticed that querying it with the Python driver for PostgreSQL (psycopg) does not work either. Given below is the code that I tried to execute:
import os
import psycopg


os.environ["PGCLIENTENCODING"] = "utf-8"

connection = psycopg.connect(
    ...
)

with connection.cursor() as cur:
    cur.execute("SELECT * FROM store s LIMIT 10;")
    result = cur.fetchall()

And this is the error that is raised:

---------------------------------------------------------------------------
InternalError_                            Traceback (most recent call last)
Cell In[55], line 2
      1 with connection.cursor() as cur:
----> 2     cur.execute(query)
      3     result = cur.fetchall()

File ~/anaconda3/envs/mindsdb_handlers/lib/python3.8/site-packages/psycopg/cursor.py:737, in Cursor.execute(self, query, params, prepare, binary)
    733         self._conn.wait(
    734             self._execute_gen(query, params, prepare=prepare, binary=binary)
    735         )
    736 except e._NO_TRACEBACK as ex:
--> 737     raise ex.with_traceback(None)
    738 return self

InternalError_: Datashare writes are not authorized by producer or associated by consumer.
DETAIL:  
  -----------------------------------------------
  error:  Datashare writes are not authorized by producer or associated by consumer.
  code:      35002
  context:   
  query:     -1[child_sequence:1]
  location:  redcat_rpc_client.cpp:2112
  process:   padbmaster [pid=1073996227]
  -----------------------------------------------

Python Driver trace logs

I believe the stack traces above cover the necessary logs.

Reproduction code

import redshift_connector


connection = redshift_connector.connect(
    ...
)

cursor = connection.cursor()

cursor.execute("SELECT * FROM store s LIMIT 10;")
result: tuple = cursor.fetchall()
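One experiment that may help isolate the issue (this is an assumption on my part, not a confirmed fix): the stack trace shows redshift_connector issuing an implicit "begin transaction" before the SELECT (cursor.py:239), so the datashare query runs inside an explicit transaction block, whereas psql by default runs each statement in autocommit mode. A sketch that forces autocommit before querying:

```python
import redshift_connector

# Connection parameters elided, as in the reproduction code above.
connection = redshift_connector.connect(
    ...
)

# Assumption: enabling autocommit skips the implicit "begin transaction"
# that the driver otherwise issues before the SELECT (visible in the
# traceback above), in case Redshift rejects datashare access inside an
# explicit transaction block.
connection.autocommit = True

cursor = connection.cursor()
cursor.execute("SELECT * FROM store s LIMIT 10;")
result: tuple = cursor.fetchall()
```

If the SELECT succeeds with autocommit enabled, that would point at the implicit transaction rather than the driver's wire protocol as the trigger for error 35002.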