Conversation
@vikrantpuppala vikrantpuppala commented Jan 13, 2026

Summary

Fixes type inference bugs for numeric parameters:

Problems Fixed

1. int64/uint64 → BIGINT

When inserting int64/uint64 values into BIGINT columns, the driver was sending them with type INTEGER instead of BIGINT, causing the server to reject large values with the error:

[INVALID_PARAMETER_MARKER_VALUE.INVALID_VALUE_FOR_DATA_TYPE] An invalid parameter mapping was provided: 
the value '1311768467463790320' for parameter 'null' cannot be cast to INT because it is malformed.

Additionally, int64 values were formatted with strconv.Itoa(int(value)), which truncates values outside the int32 range on platforms where int is 32 bits wide.

2. float64 → DOUBLE

When inserting float64 values into DOUBLE columns, the driver was sending them with type FLOAT (32-bit) instead of DOUBLE (64-bit), causing:

  • Precision loss for high-precision float64 values
  • Potential overflow for values beyond float32 range (~3.4e38)

3. Panic with explicit Parameter type

When using Parameter{Type: SqlBigInt, Value: int64(...)} with a non-string value, the driver panicked in convertNamedValuesToSparkParams due to an unsafe type assertion.

Changes

  • parameters.go:
    • int64 now uses strconv.FormatInt() and maps to SqlBigInt
    • uint64 now maps to SqlBigInt
    • float64 now maps to SqlDouble instead of SqlFloat
    • Added safe type assertion with fallback in convertNamedValuesToSparkParams

Test plan

  • Added unit tests for int64/uint64 type inference (TestParameter_BigInt)
  • Added unit tests for float64/float32 type inference (TestParameter_Float)
  • Verified large int64 values are correctly inserted and retrieved from BIGINT columns
  • Verified float64 values with high precision are correctly inserted and retrieved
  • All existing parameter tests pass

🤖 Generated with Claude Code

This fixes issue databricks#314 where float64 values were incorrectly mapped to
SqlFloat instead of SqlDouble, causing precision loss and potential
overflow issues for values beyond float32 range.

Changes:
- float64 now maps to SqlDouble instead of SqlFloat

Co-Authored-By: Claude Opus 4.5 <[email protected]>
Copilot AI review requested due to automatic review settings January 13, 2026 04:19
@vikrantpuppala vikrantpuppala changed the title Fix float64 type inference to use DOUBLE instead of FLOAT Fix type inference for int64/uint64 (BIGINT) and float64 (DOUBLE) Jan 13, 2026
Copilot AI left a comment

Pull request overview

This PR fixes type inference for 64-bit numeric types in the Databricks SQL Go driver. Previously, float64 was incorrectly mapped to SQL FLOAT (32-bit) and int64/uint64 were mapped to SQL INTEGER (32-bit), causing precision loss and potential overflow issues.

Changes:

  • Fixed float64 to map to SqlDouble instead of SqlFloat for correct 64-bit floating-point precision
  • Fixed int64 and uint64 to map to SqlBigInt instead of SqlInteger to handle full 64-bit integer range
  • Enhanced convertNamedValuesToSparkParams to handle explicit Parameter objects with non-string values

Reviewed changes

Copilot reviewed 2 out of 2 changed files in this pull request and generated 1 comment.

  • parameters.go: Updated type inference for float64, int64, and uint64; added a type switch to handle non-string Parameter values
  • parameter_test.go: Added comprehensive tests for BIGINT (int64/uint64) and DOUBLE (float64) type inference, including edge cases


…ter values

Use strconv.FormatFloat for float32/float64 and strconv.FormatInt/FormatUint
for int64/uint64 in convertNamedValuesToSparkParams to ensure consistent
formatting (decimal notation) instead of fmt.Sprintf which uses scientific
notation for large floats.

Added test cases to verify large float64 values use decimal notation.

Co-Authored-By: Claude Opus 4.5 <[email protected]>
