[Abandoned] [Java] Spark integration failure due to Netty version #36493
danepitkin wants to merge 1 commit into apache:main
Conversation
Hmm, does this really work? Just because there's no reference to it in the source doesn't mean there isn't a reference in the bytecode.

@github-actions crossbow submit test-conda-python--spark-
Revision: 2176273 Submitted crossbow builds: ursacomputing/crossbow @ actions-4f9b35019b
No, I also don't think this would work. I was curious what CI would produce (is there a way to run CI without creating a diff?). This should be an object mismatch, probably even the same error we saw before. I'll try the reflection approach instead.
If you push a branch, it should run on your fork, too. e.g. see https://github.com/danepitkin/arrow/actions/runs/5469006949 |
Thank you! This probably isn't a good first Java issue, so I might pass it off to David S. tomorrow and play around on my own branch in the meantime. Shading actually seems like the nicest approach. With reflection, it's not yet clear to me how we would resolve it. I think we would need to reflect all usage of
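For reference, the shading approach would mean relocating Netty's packages in the consuming build so Arrow's Netty and Spark's Netty cannot collide. A minimal sketch using the Maven Shade Plugin's relocation feature; the `shadedPattern` package name here is hypothetical, and the exact placement in Spark's POM would differ:

```xml
<!-- Sketch only: relocate io.netty classes into a private namespace
     so two incompatible Netty versions can coexist on the classpath. -->
<plugin>
  <groupId>org.apache.maven.plugins</groupId>
  <artifactId>maven-shade-plugin</artifactId>
  <executions>
    <execution>
      <phase>package</phase>
      <goals>
        <goal>shade</goal>
      </goals>
      <configuration>
        <relocations>
          <relocation>
            <pattern>io.netty</pattern>
            <!-- Hypothetical target package -->
            <shadedPattern>org.example.shaded.io.netty</shadedPattern>
          </relocation>
        </relocations>
      </configuration>
    </execution>
  </executions>
</plugin>
```

Relocation rewrites both the class files and the bytecode references, which is why it sidesteps the "reference in the bytecode" problem mentioned above.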
Closing for now to reflect that I might not be the one who merges a fix. |
My thought with reflection would be to detect which version of the netty method is available and dispatch to it, yes. But the call overhead is not great. Shading (or using arrow-memory-unsafe) basically means just modifying Spark's POM, in which case there's maybe not much for us to do. |
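A minimal sketch of that reflection-dispatch idea. To stay self-contained it probes overloads of `java.lang.String` rather than the real Netty internal class, and the `resolve` helper is hypothetical; caching the resolved `Method` in a static field is one way to amortize the call overhead:

```java
import java.lang.reflect.Method;

public class NettyDispatch {
    // Hypothetical helper: return the first overload of `name` on `cls`
    // whose parameter list exists at runtime. In the real fix, `cls`
    // would be the Netty class whose signature changed in 4.1.94.
    static Method resolve(Class<?> cls, String name, Class<?>[]... candidates) {
        for (Class<?>[] params : candidates) {
            try {
                return cls.getMethod(name, params);
            } catch (NoSuchMethodException e) {
                // Signature not present in this Netty version; try the next.
            }
        }
        throw new IllegalStateException("No compatible overload of " + name);
    }

    public static void main(String[] args) throws Exception {
        // Prefer substring(int, int); fall back to substring(int).
        // Resolve once and cache the Method to keep per-call overhead low.
        Method m = resolve(String.class, "substring",
                new Class<?>[] {int.class, int.class},
                new Class<?>[] {int.class});
        System.out.println(m.invoke("netty", 0, 3)); // prints "net"
    }
}
```

The tradeoff noted above still applies: even with caching, `Method.invoke` is slower than a direct call, which is why shading or switching to arrow-memory-unsafe may be preferable.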
The change in netty was reverted in 4.1.96. See #36926 |
Rationale for this change
Fix broken CI with Spark due to Arrow upgrading Netty to v4.1.94.
Are these changes tested?
Unit tests pass, but need to run integration tests (via GHA).
Are there any user-facing changes?
No