chore: Pass Spark configs to native createPlan #2180
Conversation
Codecov Report

❌ Patch coverage is

Additional details and impacted files

```diff
@@            Coverage Diff             @@
##             main    #2180      +/-   ##
============================================
+ Coverage     56.12%   58.41%   +2.29%
- Complexity      976     1262     +286
============================================
  Files           119      143      +24
  Lines         11743    13191    +1448
  Branches       2251     2357     +106
============================================
+ Hits           6591     7706    +1115
- Misses         4012     4260     +248
- Partials       1140     1225      +85
```

View full report in Codecov by Sentry.
native/proto/src/proto/config.proto (Outdated)

```proto
  repeated ConfigEntry entry = 1;
}

message ConfigEntry {
```
We already had something similar for the DataFusion scan. Maybe we can consolidate the two approaches?
object_store_options does follow a similar approach, except that its proto uses a map type directly, which I did not know was supported. I will update this PR to use the same.
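For illustration, a consolidated message using a proto map field might look like the following sketch. The message name, field name, and field number here are assumptions for illustration, not taken from the actual PR:

```proto
syntax = "proto3";

message SparkConfigs {
  // Spark session configs carried as a native protobuf map, mirroring the
  // map-based approach already used for object_store_options.
  map<string, string> entries = 1;
}
```

With a `map` field, the generated code on both the JVM and Rust sides exposes the configs as a map directly, so no hand-rolled `ConfigEntry` repeated field is needed.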
native/core/src/execution/jni_api.rs (Outdated)

```rust
let spark_configs = serde::deserialize_config(bytes.as_slice())?;

// Convert Spark configs to HashMap
let _spark_config_map: HashMap<String, String> = spark_configs
```
Yes, this is the code that deserializes the protobuf map type. The hash map is not used yet, hence the leading underscore in the variable name.
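As a hedged sketch of that conversion step: the `SparkConfigs` struct and `to_config_map` helper below are hypothetical stand-ins for the prost-generated types (with a proto `map` field, the generated struct would already hold a `HashMap`); only the collect-into-`HashMap` idiom matches the snippet above.

```rust
use std::collections::HashMap;

// Hypothetical stand-in for the prost-generated message type; the real
// struct would be generated from config.proto.
struct SparkConfigs {
    entries: Vec<(String, String)>,
}

// Collect the deserialized key/value pairs into a HashMap. If a key appears
// more than once, the last entry wins, per HashMap insertion semantics.
fn to_config_map(configs: SparkConfigs) -> HashMap<String, String> {
    configs.entries.into_iter().collect()
}

fn main() {
    let configs = SparkConfigs {
        entries: vec![
            ("spark.comet.enabled".to_owned(), "true".to_owned()),
            ("spark.sql.shuffle.partitions".to_owned(), "200".to_owned()),
        ],
    };
    let map = to_config_map(configs);
    assert_eq!(map.get("spark.comet.enabled").map(String::as_str), Some("true"));
    assert_eq!(map.len(), 2);
}
```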
@parthchandra could you take another look?
Which issue does this PR close?
N/A
Rationale for this change
I removed this functionality in #1101 and am now re-implementing it, but using protobuf to serialize the map rather than passing a reference to a Java HashMap into native code and then making a JNI call for each entry to fetch its value.
What changes are included in this PR?
How are these changes tested?