
Commit 263f3b4

[spark] Pin Scala to 2.13.17 under spark4 profile
Spark 4.1.1 is built against Scala 2.13.17 and depends on APIs added in that patch (e.g. MurmurHash3.caseClassHash$default$2). Paimon's global scala213.version is still 2.13.16 for the spark3 / Flink lines, so override scala.version directly inside the spark4 profile to avoid NoSuchMethodError during Spark 4.1 Catalyst execution.
1 parent: 341fdfa

File tree: 1 file changed (+3 additions, −1 deletion)


pom.xml (3 additions, 1 deletion)

```diff
@@ -433,7 +433,9 @@ under the License.
     <target.java.version>17</target.java.version>
     <antlr4.version>4.13.1</antlr4.version>
     <scala.binary.version>2.13</scala.binary.version>
-    <scala.version>${scala213.version}</scala.version>
+    <!-- Spark 4.1.1 is built against Scala 2.13.17; force the same stdlib here so that
+         `MurmurHash3.caseClassHash$default$2` and related APIs resolve at runtime. -->
+    <scala.version>2.13.17</scala.version>
     <paimon-spark-common.spark.version>4.1.1</paimon-spark-common.spark.version>
     <paimon-sparkx-common>paimon-spark4-common_2.13</paimon-sparkx-common>
     <arrow.version>18.1.0</arrow.version>
```
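A hypothetical way to double-check that the pinned version actually reaches the runtime classpath (this diagnostic is not part of the commit; `ScalaVersionCheck` is an illustrative name): `scala.util.Properties.versionNumberString` reports the version of the `scala-library` jar that was loaded, so it reflects the runtime artifact rather than the compile-time one.

```scala
// Hypothetical diagnostic, not part of this commit: print the Scala stdlib
// version actually present on the classpath. In a Spark 4.1.1 job built
// with the spark4 profile after this change, this should print 2.13.17.
object ScalaVersionCheck {
  def main(args: Array[String]): Unit = {
    // Reads the version baked into the loaded scala-library jar.
    println(scala.util.Properties.versionNumberString)
  }
}
```

At build time, the resolved property can likewise be inspected (assuming a local Paimon checkout) with something like `mvn -Pspark4 help:evaluate -Dexpression=scala.version -q -DforceStdout`.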
