Details
- Type: Task
- Status: Open
- Priority: Major
- Resolution: Unresolved
- Affects Versions: 1.1.7, 1.2.4, 1.2.5
- Environment: Linux and Windows, when using the bulk data adapter to export data from Spark to ColumnStore
Description
I have been referring to the following links to enable export to ColumnStore tables:
- https://mariadb.com/kb/en/library/mariadb-columnstore-with-spark/
- https://mariadb.com/kb/en/library/installing-mariadb-ax-mariadb-columnstore-from-the-package-repositories-11x/#mariadb-columnstore-api-bulk-write-sdk-package
I have been able to export data to a ColumnStore table using a Spark DataFrame as input. However, there currently appears to be a limitation: ColumnStore must be present on the same node where the Spark code is running. There are many scenarios where the application and the database run on separate nodes, and in those cases the ColumnStore exporter throws errors.
An enhancement is needed so that the ColumnStore exporter can take an input URL specifying where the ColumnStore instance is running.
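For context on why a URL parameter would help: the Bulk Write SDK (mcsapi) that the exporter builds on does not take a connection URL today; it discovers the PM nodes from a local Columnstore.xml configuration file. A minimal sketch of the kind of section involved is below — the element names, host address, and port are illustrative assumptions, and the exact layout varies by ColumnStore version:

```xml
<!-- Columnstore.xml (excerpt; hypothetical address and port) -->
<Columnstore Version="V1.0.0">
  <!-- The bulk write path connects to the PM node(s) listed in the config,
       not to a URL supplied by the calling application -->
  <PMS1>
    <IPAddr>10.0.0.11</IPAddr>  <!-- hypothetical remote PM node -->
    <Port>8620</Port>           <!-- illustrative port value -->
  </PMS1>
</Columnstore>
```

This is why the exporter currently only works when a ColumnStore installation (and hence its configuration) is present on the Spark node; accepting a URL, or a path to such a configuration file, would let the application and database live on separate nodes.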