[MCOL-1612] python spark connector - broken Created: 2018-07-30  Updated: 2023-10-26  Resolved: 2018-09-19

Status: Closed
Project: MariaDB ColumnStore
Component/s: None
Affects Version/s: 1.1.5
Fix Version/s: 1.1.6

Type: Bug Priority: Critical
Reporter: Jens Röwekamp (Inactive) Assignee: David Thompson (Inactive)
Resolution: Fixed Votes: 0
Labels: None

Sprint: 2018-15, 2018-16, 2018-17

 Description   

The Python Spark connector is broken and unusable: no data can be injected.

Unfortunately, the tests didn't cover this type of error. They assume that data is written via the connector and only check whether the injected data matches the expected data, not the other way around.



 Comments   
Comment by Jens Röwekamp (Inactive) [ 2018-07-30 ]

Fixed the Python Spark connector and adjusted the test cases so that this error will be caught if it appears again.

Comment by Jens Röwekamp (Inactive) [ 2018-07-30 ]

For QA:
Execute the regression test suite and verify that all tests pass.

Comment by Jens Röwekamp (Inactive) [ 2018-08-01 ]

Just discovered that one regression test suite test fails on CentOS 7 due to the old Swig version CentOS installs by default. With a newer Swig (3.0.12) the test passes as well. Will update the documentation accordingly to build Swig from source instead of installing it via CentOS's package manager.
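The build-from-source steps would look roughly like the sketch below. The Swig version (3.0.12) comes from the comment above; the download URL, install prefix, and prerequisite package names are assumptions and should be adjusted to match the documentation's actual instructions.

```shell
# Build and install Swig 3.0.12 from source (sketch).
# Build prerequisites on CentOS 7 would be something like:
#   sudo yum install -y gcc-c++ pcre-devel make
curl -LO https://downloads.sourceforge.net/swig/swig-3.0.12.tar.gz
tar xzf swig-3.0.12.tar.gz
cd swig-3.0.12
./configure --prefix=/usr/local
make
sudo make install
swig -version   # should now report SWIG Version 3.0.12
```

Note that /usr/local/bin must precede /usr/bin in PATH, or the distribution's Swig 2 binary will still be picked up.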

Comment by Jens Röwekamp (Inactive) [ 2018-08-01 ]

Documentation updated with instructions to build and install Swig from source.

For QA:
After executing the regression test suite and verifying that it passes, our build pipeline also needs to be adapted to use the new Swig version.

Comment by Jens Röwekamp (Inactive) [ 2018-08-02 ]

Debian 8's package manager also installs Swig 2, so Swig needs to be built from source on Debian 8 as well. Made the necessary documentation changes in Readme.md.

Generated at Thu Feb 08 02:30:03 UTC 2024 using Jira 8.20.16#820016-sha1:9d11dbea5f4be3d4cc21f03a88dd11d8c8687422.