
There was a problem retrieving table information for the source table. Specified table does not exist: updated_rows. (errorCode=410012)

Open JMXAAS opened this issue 2 years ago • 11 comments

Using the clouddb-extractor for BigQuery, everything works fine except for `export_load`:

```
2023-12-04 13:59:46,171 - TSC wait_for_job - INFO - Job e0f073a1-0e0d-48d4-a1ba-a90266d1250f Completed: Finish Code: 1 - Notes:['com.tableausoftware.server.status.reporting.TableauRuntimeException: There was a problem retrieving table information for the source table. Specified table does not exist: updated_rows. (errorCode=410012)']
Traceback (most recent call last):
  File "/hyper-api-samples/Community-Supported/clouddb-extractor/extractor_cli.py", line 354, in <module>
    main()
  File "/hyper-api-samples/Community-Supported/clouddb-extractor/extractor_cli.py", line 296, in main
    extractor.export_load(
  File "/hyper-api-samples/Community-Supported/clouddb-extractor/base_extractor.py", line 182, in execution_timer
    result = func(*args, **kw)
  File "/hyper-api-samples/Community-Supported/clouddb-extractor/base_extractor.py", line 778, in export_load
    self.update_datasource_from_hyper_file(
  File "/hyper-api-samples/Community-Supported/clouddb-extractor/base_extractor.py", line 668, in update_datasource_from_hyper_file
    self.tableau_server.jobs.wait_for_job(async_job)
  File "/usr/local/lib/python3.10/dist-packages/tableauserverclient/server/endpoint/jobs_endpoint.py", line 72, in wait_for_job
    raise JobFailedException(job)
tableauserverclient.server.endpoint.exceptions.JobFailedException: Job e0f073a1-0e0d-48d4-a1ba-a90266d1250f failed with notes ['com.tableausoftware.server.status.reporting.TableauRuntimeException: There was a problem retrieving table information for the source table. Specified table does not exist: updated_rows. (errorCode=410012)']
```

Please help!

JMXAAS avatar Dec 04 '23 13:12 JMXAAS

@FrancoisZim any ideas?

vogelsgesang avatar Dec 04 '23 15:12 vogelsgesang

FYI, for smaller tables it actually works fine, but not when you reach 10M rows...

JMXAAS avatar Dec 04 '23 15:12 JMXAAS

@JMXAAS - will try to recreate this but I may need you to provide part of your debug.log file to understand the defect. Will allocate some time to this in the next three days.

FrancoisZim avatar Dec 04 '23 15:12 FrancoisZim

debug_20231204.txt

Thanks a lot, @FrancoisZim. Here's the debug log of my latest run.

BigQuery to Tableau Online, 10.9M rows

JMXAAS avatar Dec 04 '23 16:12 JMXAAS

I've created a branch with the fix for this bug: https://github.com/FrancoisZim/hyper-api-samples/tree/clouddb-extractor-bugfix Can you let me know if this resolves the issue, and I will commit it to the main repo.

FrancoisZim avatar Dec 05 '23 12:12 FrancoisZim

Awesome! It worked @FrancoisZim!

Are there any settings for the number of rows per upload that can be tweaked? At the moment it only uploads ~400k rows in each chunk to the extract.

JMXAAS avatar Dec 05 '23 19:12 JMXAAS

Thanks for testing. There is a constant you can change to include more chunks in each extract. Odd that you are only getting 400K rows, as I thought the default BigQuery behaviour was 1GB per chunk, so it should be 5GB of CSV per hyper file?

```python
BLOBS_PER_HYPER_FILE: int = 5
```
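For context, a minimal sketch of how such a blobs-per-file limit partitions exported blobs into hyper files. Only the `BLOBS_PER_HYPER_FILE` constant comes from `base_extractor.py`; the `group_blobs` helper and the blob names are hypothetical illustrations, not the sample's actual code:

```python
# Sketch only: group_blobs and the blob names are hypothetical;
# BLOBS_PER_HYPER_FILE is the constant defined in base_extractor.py.
BLOBS_PER_HYPER_FILE: int = 5

def group_blobs(blobs, blobs_per_file=BLOBS_PER_HYPER_FILE):
    """Partition exported CSV blobs so each hyper file ingests at most N blobs."""
    return [blobs[i:i + blobs_per_file] for i in range(0, len(blobs), blobs_per_file)]

# e.g. 12 exported chunks with the default of 5 -> hyper files of 5, 5 and 2 blobs
chunks = [f"extract_{i:03d}.csv.gz" for i in range(12)]
groups = group_blobs(chunks)
```

Raising the constant therefore means fewer, larger hyper files per run, which is why the published file size grows with it.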

FrancoisZim avatar Dec 06 '23 14:12 FrancoisZim

That's what I read as well, but we get ~6MB chunks of gzipped CSV. Yes, I played around with BLOBS_PER_HYPER_FILE and you can clearly see it loads several chunks into the hyper file. What I noticed when increasing the constant, though, was that if the hyper file exceeded the 64MB limit, it started to split the upload into multiple parts but failed every time. Running the script with a lower constant, so that the hyper files stayed below 64MB, worked fine. I'll see if I can recreate the issue again when exceeding 64MB.

JMXAAS avatar Dec 06 '23 14:12 JMXAAS

It would be really useful if you could send me one of the debug logs for the multi-part >64MB bug. That should be handled by TSC, so I will include a fix for this as well.
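For reference, a hedged sketch of the decision involved: files at or below Tableau's 64MB limit can be published in a single request, while larger files must go through the multi-part file-upload endpoint in chunks. The `upload_plan` helper and the chunk size are illustrative assumptions, not TSC's actual implementation:

```python
import math

# 64MB is Tableau's documented single-request publish limit;
# the chunk size below is an assumed value for illustration only.
FILESIZE_LIMIT = 64 * 1024 * 1024   # single-request publish limit (bytes)
CHUNK_SIZE = 50 * 1024 * 1024       # assumed multi-part chunk size (bytes)

def upload_plan(file_size: int):
    """Return ('single', 1) below the limit, else ('chunked', n_chunks)."""
    if file_size <= FILESIZE_LIMIT:
        return ("single", 1)
    return ("chunked", math.ceil(file_size / CHUNK_SIZE))
```

This matches the behaviour described above: runs that keep the hyper file under 64MB take the single-request path and succeed, while larger files hit the multi-part path where the bug shows up.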

FrancoisZim avatar Dec 06 '23 14:12 FrancoisZim

I'll probably manage to try it out again tomorrow.

JMXAAS avatar Dec 06 '23 15:12 JMXAAS

debug_20231207.log

@FrancoisZim attaching debug.log from today's test, where I increased BLOBS_PER_HYPER_FILE to create a larger hyper file.

What's not in the log file is the error message. No extract is published:

```
Traceback (most recent call last):
  File "/hyper-api-samples/Community-Supported/clouddb-extractor/extractor_cli.py", line 354, in <module>
    main()
  File "/hyper-api-samples/Community-Supported/clouddb-extractor/extractor_cli.py", line 296, in main
    extractor.export_load(
  File "/hyper-api-samples/Community-Supported/clouddb-extractor/base_extractor.py", line 182, in execution_timer
    result = func(*args, **kw)
  File "/hyper-api-samples/Community-Supported/clouddb-extractor/base_extractor.py", line 776, in export_load
    self.publish_hyper_file(path_to_database, tab_ds_name, publish_mode)
  File "/hyper-api-samples/Community-Supported/clouddb-extractor/base_extractor.py", line 527, in publish_hyper_file
    datasource = self.tableau_server.datasources.publish(datasource, path_to_database, publish_mode)
  File "/usr/local/lib/python3.10/dist-packages/tableauserverclient/server/endpoint/endpoint.py", line 292, in wrapper
    return func(self, *args, **kwargs)
  File "/usr/local/lib/python3.10/dist-packages/tableauserverclient/server/endpoint/endpoint.py", line 334, in wrapper
    return func(self, *args, **kwargs)
  File "/usr/local/lib/python3.10/dist-packages/tableauserverclient/server/endpoint/endpoint.py", line 334, in wrapper
    return func(self, *args, **kwargs)
  File "/usr/local/lib/python3.10/dist-packages/tableauserverclient/server/endpoint/datasources_endpoint.py", line 298, in publish
    server_response = self.post_request(url, xml_request, content_type)
  File "/usr/local/lib/python3.10/dist-packages/tableauserverclient/server/endpoint/endpoint.py", line 249, in post_request
    return self._make_request(
  File "/usr/local/lib/python3.10/dist-packages/tableauserverclient/server/endpoint/endpoint.py", line 166, in _make_request
    self._check_status(server_response, url)
  File "/usr/local/lib/python3.10/dist-packages/tableauserverclient/server/endpoint/endpoint.py", line 189, in _check_status
    raise ServerResponseError.from_response(server_response.content, self.parent_srv.namespace, url)
tableauserverclient.server.endpoint.exceptions.ServerResponseError:

    400011: Bad Request
            There was a problem publishing the file '26261:5027d532c029418681b095ae9b659165-0:0'.
```

JMXAAS avatar Dec 07 '23 09:12 JMXAAS