
Commit 28d6a13

msrathore-db and claude authored
[PECOBLR-1746] Implementing support for listing procedures (#1238)
## Description

Implements `getProcedures` and `getProcedureColumns` in `DatabricksDatabaseMetaData` by querying `information_schema.routines` and `information_schema.parameters` via SQL. Unlike other metadata operations that use `SHOW` commands or Thrift RPCs, these use direct SQL SELECT queries against `information_schema` views. This works for both Thrift and SEA transports.

**getProcedures:**
- Queries `information_schema.routines` filtered by `routine_type = 'PROCEDURE'`
- Returns the 9-column JDBC-spec result set (PROCEDURE_CAT, PROCEDURE_SCHEM, PROCEDURE_NAME, reserved x3, REMARKS, PROCEDURE_TYPE, SPECIFIC_NAME)
- PROCEDURE_TYPE is always `procedureNoResult` (1)

**getProcedureColumns:**
- Queries `information_schema.parameters` JOINed with `information_schema.routines` to filter for procedures
- Returns the 20-column JDBC-spec result set with parameter metadata
- Maps `parameter_mode` (IN/OUT/INOUT) to JDBC COLUMN_TYPE constants (1/4/2)
- Maps Databricks type names to `java.sql.Types` codes via the existing `getCode()`

**Catalog resolution:**
- NULL catalog → queries `system.information_schema.routines` (cross-catalog)
- Specific catalog → queries `<catalog>.information_schema.routines`
- Empty string → returns an empty result set

**Shared SQL builders** in `CommandConstants` eliminate duplication between the SDK and Thrift clients.
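The `parameter_mode` → COLUMN_TYPE mapping described above corresponds to the standard `java.sql.DatabaseMetaData` constants. A minimal sketch of that mapping (the class and method names here are illustrative, not the driver's actual internals):

```java
import java.sql.DatabaseMetaData;

public class ParameterModeMapping {
  // Maps an information_schema.parameters parameter_mode value to the
  // JDBC COLUMN_TYPE constant expected by getProcedureColumns().
  static int toJdbcColumnType(String parameterMode) {
    if (parameterMode == null) {
      return DatabaseMetaData.procedureColumnUnknown; // 0
    }
    switch (parameterMode.toUpperCase()) {
      case "IN":
        return DatabaseMetaData.procedureColumnIn; // 1
      case "OUT":
        return DatabaseMetaData.procedureColumnOut; // 4
      case "INOUT":
        return DatabaseMetaData.procedureColumnInOut; // 2
      default:
        return DatabaseMetaData.procedureColumnUnknown; // 0
    }
  }

  public static void main(String[] args) {
    System.out.println(toJdbcColumnType("IN")); // 1
    System.out.println(toJdbcColumnType("OUT")); // 4
    System.out.println(toJdbcColumnType("INOUT")); // 2
  }
}
```

Note that `procedureColumnOut` is 4 and `procedureColumnInOut` is 2, which is why the IN/OUT/INOUT order maps to 1/4/2 rather than 1/2/4.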
## Testing

**Unit tests** (`DatabricksMetadataSdkClientTest`):
- 4 parameterized tests for `listProcedures` SQL generation (catalog+schema+name, null schema, null name, null catalog)
- 3 parameterized tests for `listProcedureColumns` SQL generation (all filters, partial filters, all nulls)

**Integration test** (`MetadataIntegrationTests#testGetProceduresAndProcedureColumns`):
- Creates a procedure `jdbc_test_compute_area(x DOUBLE, y DOUBLE, OUT area DOUBLE)` with COMMENT
- Verifies `getProcedures` returns the correct name, schema, catalog, remarks, and type
- Verifies `getProcedureColumns` returns 3 params with correct COLUMN_TYPE (IN=1, OUT=4), DATA_TYPE (DOUBLE=8), and ordinal positions
- Tests column name filtering
- Cleans up the procedure after the test
- WireMock stubs recorded for REPLAY mode

**Existing tests:** All 248 `DatabricksDatabaseMetaDataTest` + 51 `DatabricksMetadataSdkClientTest` tests pass.

## Additional Notes to the Reviewer

- `information_schema.parameters` is used (not `routine_columns`) because `routine_columns` contains table-valued function output columns, not procedure parameters.
- NULLABLE is always `procedureNullableUnknown` (2) since the server does not track parameter nullability.
- When catalog is NULL, `system.information_schema` is queried, which requires system table access permissions. If the user lacks this permission, the driver returns an empty result set. This will be addressed server-side in a future release.

---------

Co-authored-by: Claude Opus 4.6 (1M context) <noreply@anthropic.com>
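The catalog-resolution rules described in this PR (null → cross-catalog `system` views, empty string → empty result set, otherwise the backtick-quoted catalog) can be sketched as a small pure function; `resolveRoutinesTable` is a hypothetical name, not part of the driver:

```java
import java.util.Optional;

public class CatalogResolution {
  // Resolves which information_schema.routines view to query.
  // Empty Optional means the caller should return an empty result set
  // without issuing any query (the empty-string-catalog case).
  static Optional<String> resolveRoutinesTable(String catalog) {
    if (catalog != null && catalog.isEmpty()) {
      return Optional.empty();
    }
    String prefix = (catalog == null) ? "system" : "`" + catalog + "`";
    return Optional.of(prefix + ".information_schema.routines");
  }

  public static void main(String[] args) {
    System.out.println(resolveRoutinesTable(null).get()); // system.information_schema.routines
    System.out.println(resolveRoutinesTable("main").get()); // `main`.information_schema.routines
    System.out.println(resolveRoutinesTable("").isPresent()); // false
  }
}
```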
1 parent dd0c7b9 commit 28d6a13

31 files changed

Lines changed: 1323 additions & 65 deletions


NEXT_CHANGELOG.md

Lines changed: 4 additions & 0 deletions
```diff
@@ -3,6 +3,7 @@
 ## [Unreleased]
 
 ### Added
+- Added `DatabaseMetaData.getProcedures()` and `DatabaseMetaData.getProcedureColumns()` to discover stored procedures and their parameters. Queries `information_schema.routines` and `information_schema.parameters` using parameterized SQL for both SEA and Thrift transports.
 - Added connection property `OAuthWebServerTimeout` to configure the OAuth browser authentication timeout for U2M (user-to-machine) flows, and also updated hardcoded 1-hour timeout to default 120 seconds timeout.
 - Added connection property `UseQueryForMetadata` to use SQL SHOW commands instead of Thrift RPCs for metadata operations (getCatalogs, getSchemas, getTables, getColumns, getFunctions). This fixes incorrect wildcard matching where `_` was treated as a single-character wildcard in Thrift metadata pattern filters.
 - Added connection property `TreatMetadataCatalogNameAsPattern` to control whether catalog names are treated as patterns in Thrift metadata RPCs. When disabled (default), unescaped `_` in catalog names is escaped to prevent single-character wildcard matching. This aligns with JDBC spec which treats catalogName as identifier and not pattern.
@@ -11,6 +12,9 @@
 - Bumped `com.fasterxml.jackson.core:jackson-core` from 2.18.3 to 2.18.6.
 - Fat jar now routes SDK and Apache HTTP client logs through Java Util Logging (JUL), removing the need for external logging libraries.
 - PECOBLR-1121 Arrow patch to circumvent Arrow issues with JDK 16+.
+- Log timestamps now explicitly display timezone.
+- **[Breaking Change]** `PreparedStatement.setTimestamp(int, Timestamp, Calendar)` now properly applies Calendar timezone conversion using LocalDateTime pattern (in line with `getTimestamp`). Previously the Calendar parameter was ineffective.
+- `DatabaseMetaData.getColumns()` with null catalog parameter now retrieves columns from all available catalogs when using SQL Execution API.
 
 ### Fixed
 - Fixed statement timeout when the server returns `TIMEDOUT_STATE` directly in the `ExecuteStatement` response (e.g. query queued under load); the driver now throws `SQLTimeoutException` instead of `DatabricksHttpException`.
```
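The `[Breaking Change]` changelog entry above describes interpreting the Timestamp's wall-clock fields in the Calendar's timezone. A sketch of that conversion, under the assumption that this is what the entry's "LocalDateTime pattern" means (class and method names are illustrative, not the driver's code):

```java
import java.sql.Timestamp;
import java.time.Instant;
import java.time.LocalDateTime;
import java.time.ZoneId;
import java.util.Calendar;
import java.util.TimeZone;

public class CalendarTimestampConversion {
  // Takes the Timestamp's wall-clock fields (year/month/day/hour/...) and
  // interprets them in the Calendar's timezone, yielding the instant that
  // should be bound on the wire. Independent of the JVM default zone.
  static Instant applyCalendarZone(Timestamp ts, Calendar cal) {
    LocalDateTime wallClock = ts.toLocalDateTime();
    ZoneId zone = cal.getTimeZone().toZoneId();
    return wallClock.atZone(zone).toInstant();
  }

  public static void main(String[] args) {
    Timestamp ts = Timestamp.valueOf("2024-01-01 12:00:00");
    Calendar utc = Calendar.getInstance(TimeZone.getTimeZone("UTC"));
    System.out.println(applyCalendarZone(ts, utc)); // 2024-01-01T12:00:00Z
  }
}
```

With this pattern a non-null Calendar actually changes the bound instant, which is why ignoring it previously was a correctness bug rather than a cosmetic one.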

src/main/java/com/databricks/jdbc/api/impl/DatabricksDatabaseMetaData.java

Lines changed: 17 additions & 59 deletions
```diff
@@ -892,21 +892,6 @@ public boolean supportsSharding() throws SQLException {
     return false;
   }
 
-  /**
-   * Builds the result set for stored procedures metadata.
-   *
-   * <p>The result set structure is defined based on the JDBC driver specifications to ensure
-   * consistency. The following columns are included in the result set:
-   *
-   * <ul>
-   *   <li>PROCEDURE_CAT: The catalog of the procedure (String)
-   *   <li>PROCEDURE_SCHEM: The schema of the procedure (String)
-   *   <li>PROCEDURE_NAME: The name of the procedure (String)
-   *   <li>REMARKS: A description or remarks about the procedure (String)
-   *   <li>PROCEDURE_TYPE: The type of procedure (e.g., FUNCTION, PROCEDURE) (String)
-   *   <li>SPECIFIC_NAME: The specific name for the procedure (String)
-   * </ul>
-   */
   @Override
   public ResultSet getProcedures(String catalog, String schemaPattern, String procedureNamePattern)
       throws SQLException {
@@ -916,44 +901,14 @@ public ResultSet getProcedures(String catalog, String schemaPattern, String proc
         schemaPattern,
         procedureNamePattern);
     throwExceptionIfConnectionIsClosed();
-    return new DatabricksResultSet(
-        new StatementStatus().setState(StatementState.SUCCEEDED),
-        new StatementId("getprocedures-metadata"),
-        Arrays.asList(
-            "PROCEDURE_CAT",
-            "PROCEDURE_SCHEM",
-            "PROCEDURE_NAME",
-            "NUM_INPUT_PARAMS",
-            "NUM_OUTPUT_PARAMS",
-            "NUM_RESULT_SETS",
-            "REMARKS",
-            "PROCEDURE_TYPE",
-            "SPECIFIC_NAME"),
-        Arrays.asList(
-            "VARCHAR",
-            "VARCHAR",
-            "VARCHAR",
-            "INTEGER",
-            "INTEGER",
-            "INTEGER",
-            "VARCHAR",
-            "SMALLINT",
-            "VARCHAR"),
-        new int[] {
-          Types.VARCHAR,
-          Types.VARCHAR,
-          Types.VARCHAR,
-          Types.INTEGER,
-          Types.INTEGER,
-          Types.INTEGER,
-          Types.VARCHAR,
-          Types.SMALLINT,
-          Types.VARCHAR
-        },
-        new int[] {128, 128, 128, 10, 10, 10, 254, 5, 128},
-        new int[] {1, 1, 0, 1, 1, 1, 1, 1, 0},
-        new Object[0][0],
-        StatementType.METADATA);
+    try {
+      return session
+          .getDatabricksMetadataClient()
+          .listProcedures(session, catalog, schemaPattern, procedureNamePattern);
+    } catch (Exception e) {
+      LOGGER.error(e, "Unable to fetch procedures, returning empty result set");
+      return metadataResultSetBuilder.getProceduresResult(new ArrayList<>());
+    }
   }
 
   @Override
@@ -967,12 +922,15 @@ public ResultSet getProcedureColumns(
         procedureNamePattern,
         columnNamePattern);
     throwExceptionIfConnectionIsClosed();
-
-    return metadataResultSetBuilder.getResultSetWithGivenRowsAndColumns(
-        PROCEDURE_COLUMNS_COLUMNS,
-        new ArrayList<>(),
-        METADATA_STATEMENT_ID,
-        CommandName.GET_PROCEDURES_COLUMNS);
+    try {
+      return session
+          .getDatabricksMetadataClient()
+          .listProcedureColumns(
+              session, catalog, schemaPattern, procedureNamePattern, columnNamePattern);
+    } catch (Exception e) {
+      LOGGER.error(e, "Unable to fetch procedure columns, returning empty result set");
+      return metadataResultSetBuilder.getProcedureColumnsResult(new ArrayList<>());
+    }
   }
 
   @Override
```

src/main/java/com/databricks/jdbc/common/CommandName.java

Lines changed: 2 additions & 1 deletion
```diff
@@ -22,7 +22,8 @@ public enum CommandName {
   GET_TABLE_PRIVILEGES,
   GET_VERSION_COLUMNS,
   GET_SUPER_TYPES,
-  GET_PROCEDURES_COLUMNS,
+  LIST_PROCEDURES,
+  LIST_PROCEDURE_COLUMNS,
   GET_INDEX_INFO,
   GET_SUPER_TABLES,
   GET_FUNCTION_COLUMNS,
```

src/main/java/com/databricks/jdbc/common/MetadataOperationType.java

Lines changed: 3 additions & 1 deletion
```diff
@@ -11,7 +11,9 @@ public enum MetadataOperationType {
   GET_COLUMNS("GetColumns"),
   GET_FUNCTIONS("GetFunctions"),
   GET_PRIMARY_KEYS("GetPrimaryKeys"),
-  GET_CROSS_REFERENCE("GetCrossReference");
+  GET_CROSS_REFERENCE("GetCrossReference"),
+  GET_PROCEDURES("GetProcedures"),
+  GET_PROCEDURE_COLUMNS("GetProcedureColumns");
 
   private final String headerValue;
```
src/main/java/com/databricks/jdbc/common/MetadataResultConstants.java

Lines changed: 22 additions & 1 deletion
```diff
@@ -195,6 +195,14 @@ public class MetadataResultConstants {
   private static final ResultColumn RADIX = new ResultColumn("RADIX", "radix", Types.SMALLINT);
   private static final ResultColumn NULLABLE_SHORT =
       new ResultColumn("NULLABLE", "nullable", Types.SMALLINT);
+  private static final ResultColumn NUM_INPUT_PARAMS =
+      new ResultColumn("NUM_INPUT_PARAMS", "numInputParams", Types.INTEGER);
+  private static final ResultColumn NUM_OUTPUT_PARAMS =
+      new ResultColumn("NUM_OUTPUT_PARAMS", "numOutputParams", Types.INTEGER);
+  private static final ResultColumn NUM_RESULT_SETS =
+      new ResultColumn("NUM_RESULT_SETS", "numResultSets", Types.INTEGER);
+  private static final ResultColumn PROCEDURE_TYPE =
+      new ResultColumn("PROCEDURE_TYPE", "procedureType", Types.SMALLINT);
   private static final ResultColumn NON_UNIQUE =
       new ResultColumn("NON_UNIQUE", "nonUnique", Types.BOOLEAN);
   private static final ResultColumn INDEX_QUALIFIER =
@@ -225,6 +233,18 @@ public class MetadataResultConstants {
       FUNCTION_TYPE_COLUMN,
       SPECIFIC_NAME_COLUMN);
 
+  public static final List<ResultColumn> PROCEDURES_COLUMNS =
+      List.of(
+          PROCEDURE_CAT,
+          PROCEDURE_SCHEM,
+          PROCEDURE_NAME,
+          NUM_INPUT_PARAMS,
+          NUM_OUTPUT_PARAMS,
+          NUM_RESULT_SETS,
+          REMARKS_COLUMN,
+          PROCEDURE_TYPE,
+          SPECIFIC_NAME_COLUMN);
+
   public static List<ResultColumn> COLUMN_COLUMNS =
       List.of(
           CATALOG_COLUMN,
@@ -618,8 +638,9 @@ public class MetadataResultConstants {
           CommandName.GET_VERSION_COLUMNS,
           List.of(SCOPE, COL_NAME_COLUMN, DATA_TYPE_COLUMN, TYPE_NAME_COLUMN, PSEUDO_COLUMN));
       put(CommandName.GET_SUPER_TYPES, List.of(TYPE_NAME_COLUMN, SUPERTYPE_NAME));
+      put(CommandName.LIST_PROCEDURES, List.of(PROCEDURE_NAME, SPECIFIC_NAME_COLUMN));
       put(
-          CommandName.GET_PROCEDURES_COLUMNS,
+          CommandName.LIST_PROCEDURE_COLUMNS,
           List.of(
               PROCEDURE_NAME,
               COLUMN_NAME_COLUMN,
```

src/main/java/com/databricks/jdbc/dbclient/IDatabricksMetadataClient.java

Lines changed: 36 additions & 0 deletions
```diff
@@ -132,6 +132,42 @@ DatabricksResultSet listImportedKeys(
   DatabricksResultSet listExportedKeys(
       IDatabricksSession session, String catalog, String schema, String table) throws SQLException;
 
+  /**
+   * Returns the list of stored procedures
+   *
+   * @param session underlying session
+   * @param catalog catalogName; null means use system catalog
+   * @param schemaNamePattern schema name pattern (can be a LIKE pattern)
+   * @param procedureNamePattern procedure name pattern (can be a LIKE pattern)
+   * @return a DatabricksResultSet representing list of procedures
+   */
+  @DatabricksMetricsTimed
+  DatabricksResultSet listProcedures(
+      IDatabricksSession session,
+      String catalog,
+      String schemaNamePattern,
+      String procedureNamePattern)
+      throws SQLException;
+
+  /**
+   * Returns the list of stored procedure columns/parameters
+   *
+   * @param session underlying session
+   * @param catalog catalogName; null means use system catalog
+   * @param schemaNamePattern schema name pattern (can be a LIKE pattern)
+   * @param procedureNamePattern procedure name pattern (can be a LIKE pattern)
+   * @param columnNamePattern column/parameter name pattern (can be a LIKE pattern)
+   * @return a DatabricksResultSet representing list of procedure columns
+   */
+  @DatabricksMetricsTimed
+  DatabricksResultSet listProcedureColumns(
+      IDatabricksSession session,
+      String catalog,
+      String schemaNamePattern,
+      String procedureNamePattern,
+      String columnNamePattern)
+      throws SQLException;
+
   /**
    * Returns the list of cross references between a parent table and a foreign table
    *
```

src/main/java/com/databricks/jdbc/dbclient/impl/common/CommandConstants.java

Lines changed: 109 additions & 0 deletions
```diff
@@ -1,11 +1,17 @@
 package com.databricks.jdbc.dbclient.impl.common;
 
+import com.databricks.jdbc.api.impl.ImmutableSqlParameter;
+import com.databricks.jdbc.model.core.ColumnInfoTypeName;
+import java.util.Map;
+
 public class CommandConstants {
   public static final String METADATA_STATEMENT_ID = "metadata-statement";
   public static final String GET_TABLES_STATEMENT_ID = "gettables-metadata";
   public static final String GET_CATALOGS_STATEMENT_ID = "getcatalogs-metadata";
   public static final String GET_TABLE_TYPE_STATEMENT_ID = "gettabletype-metadata";
   public static final String GET_FUNCTIONS_STATEMENT_ID = "getfunctions-metadata";
+  public static final String GET_PROCEDURES_STATEMENT_ID = "getprocedures-metadata";
+  public static final String GET_PROCEDURE_COLUMNS_STATEMENT_ID = "getprocedurecolumns-metadata";
   public static final String SHOW_CATALOGS_SQL = "SHOW CATALOGS";
   public static final String SHOW_TABLE_TYPES_SQL = "SHOW TABLE_TYPES";
   public static final String IN_CATALOG_SQL = " IN CATALOG `%s`";
@@ -26,4 +32,107 @@ public class CommandConstants {
       "SHOW KEYS" + IN_CATALOG_SQL + IN_ABSOLUTE_SCHEMA_SQL + IN_ABSOLUTE_TABLE_SQL;
   public static final String SHOW_FOREIGN_KEYS_SQL =
       "SHOW FOREIGN KEYS" + IN_CATALOG_SQL + IN_ABSOLUTE_SCHEMA_SQL + IN_ABSOLUTE_TABLE_SQL;
+
+  private static final String INFORMATION_SCHEMA_ROUTINES = "information_schema.routines";
+  private static final String INFORMATION_SCHEMA_PARAMETERS = "information_schema.parameters";
+  private static final String PROCEDURE_TYPE_FILTER = "routine_type = 'PROCEDURE'";
+
+  private static final String ROUTINES_SELECT_COLUMNS =
+      "routine_catalog, routine_schema, routine_name, comment, specific_name";
+
+  private static final String PARAMETERS_SELECT_COLUMNS =
+      "p.specific_catalog, p.specific_schema, p.specific_name,"
+          + " p.parameter_name, p.parameter_mode, p.is_result,"
+          + " p.data_type,"
+          + " p.numeric_precision, p.numeric_precision_radix, p.numeric_scale,"
+          + " p.character_maximum_length, p.character_octet_length,"
+          + " p.ordinal_position, p.parameter_default, p.comment";
+
+  /**
+   * Builds a parameterized SQL query to fetch procedures from information_schema.routines. LIKE
+   * clause values use ? placeholders with parameters populated in the provided map for server-side
+   * binding.
+   */
+  public static String buildProceduresSQL(
+      String catalog,
+      String schemaPattern,
+      String procedureNamePattern,
+      Map<Integer, ImmutableSqlParameter> params) {
+    String catalogPrefix = getCatalogPrefix(catalog);
+    String routinesTable = catalogPrefix + "." + INFORMATION_SCHEMA_ROUTINES;
+    int paramIndex = 1;
+
+    StringBuilder sql = new StringBuilder();
+    sql.append("SELECT ").append(ROUTINES_SELECT_COLUMNS);
+    sql.append(" FROM ").append(routinesTable);
+    sql.append(" WHERE ").append(PROCEDURE_TYPE_FILTER);
+    if (schemaPattern != null) {
+      sql.append(" AND routine_schema LIKE ?");
+      params.put(paramIndex, buildStringParam(paramIndex, schemaPattern));
+      paramIndex++;
+    }
+    if (procedureNamePattern != null) {
+      sql.append(" AND routine_name LIKE ?");
+      params.put(paramIndex, buildStringParam(paramIndex, procedureNamePattern));
+      paramIndex++;
+    }
+    sql.append(" ORDER BY routine_catalog, routine_schema, routine_name");
+    return sql.toString();
+  }
+
+  /**
+   * Builds a parameterized SQL query to fetch procedure columns from information_schema.parameters.
+   * LIKE clause values use ? placeholders with parameters populated in the provided map for
+   * server-side binding.
+   */
+  public static String buildProcedureColumnsSQL(
+      String catalog,
+      String schemaPattern,
+      String procedureNamePattern,
+      String columnNamePattern,
+      Map<Integer, ImmutableSqlParameter> params) {
+    String catalogPrefix = getCatalogPrefix(catalog);
+    String parametersTable = catalogPrefix + "." + INFORMATION_SCHEMA_PARAMETERS + " p";
+    String routinesTable = catalogPrefix + "." + INFORMATION_SCHEMA_ROUTINES + " r";
+    int paramIndex = 1;
+
+    StringBuilder sql = new StringBuilder();
+    sql.append("SELECT ").append(PARAMETERS_SELECT_COLUMNS);
+    sql.append(" FROM ").append(parametersTable);
+    sql.append(" JOIN ").append(routinesTable);
+    sql.append(" ON p.specific_catalog = r.specific_catalog");
+    sql.append(" AND p.specific_schema = r.specific_schema");
+    sql.append(" AND p.specific_name = r.specific_name");
+    sql.append(" WHERE r.").append(PROCEDURE_TYPE_FILTER);
+    if (schemaPattern != null) {
+      sql.append(" AND p.specific_schema LIKE ?");
+      params.put(paramIndex, buildStringParam(paramIndex, schemaPattern));
+      paramIndex++;
+    }
+    if (procedureNamePattern != null) {
+      sql.append(" AND p.specific_name LIKE ?");
+      params.put(paramIndex, buildStringParam(paramIndex, procedureNamePattern));
+      paramIndex++;
+    }
+    if (columnNamePattern != null) {
+      sql.append(" AND p.parameter_name LIKE ?");
+      params.put(paramIndex, buildStringParam(paramIndex, columnNamePattern));
+      paramIndex++;
+    }
+    sql.append(
+        " ORDER BY p.specific_catalog, p.specific_schema, p.specific_name, p.ordinal_position");
+    return sql.toString();
+  }
+
+  private static ImmutableSqlParameter buildStringParam(int index, String value) {
+    return ImmutableSqlParameter.builder()
+        .type(ColumnInfoTypeName.STRING)
+        .value(value)
+        .cardinal(index)
+        .build();
+  }
+
+  private static String getCatalogPrefix(String catalog) {
+    return (catalog == null) ? "system" : "`" + catalog + "`";
+  }
 }
```
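The builders above append a `LIKE ... ?` clause only for filters that are non-null. A trimmed, standalone sketch of that query-shaping logic (parameter binding omitted; `ProceduresSqlSketch` is illustrative, not driver code):

```java
public class ProceduresSqlSketch {
  // Simplified standalone version of the buildProceduresSQL pattern:
  // each non-null filter contributes a LIKE clause with a ? placeholder,
  // and a null catalog falls back to the cross-catalog system views.
  static String buildProceduresSql(String catalog, String schemaPattern, String namePattern) {
    String prefix = (catalog == null) ? "system" : "`" + catalog + "`";
    StringBuilder sql = new StringBuilder();
    sql.append("SELECT routine_catalog, routine_schema, routine_name, comment, specific_name");
    sql.append(" FROM ").append(prefix).append(".information_schema.routines");
    sql.append(" WHERE routine_type = 'PROCEDURE'");
    if (schemaPattern != null) {
      sql.append(" AND routine_schema LIKE ?");
    }
    if (namePattern != null) {
      sql.append(" AND routine_name LIKE ?");
    }
    sql.append(" ORDER BY routine_catalog, routine_schema, routine_name");
    return sql.toString();
  }

  public static void main(String[] args) {
    // Schema filter present, name filter absent: only one LIKE clause emitted.
    System.out.println(buildProceduresSql("main", "default", null));
  }
}
```

Keeping the values as `?` placeholders (rather than splicing them into the SQL) is what makes the real builders safe against injection via the pattern arguments.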
