Support connecting to HiveServer2 through database connection pools other than HikariCP #33762

Merged · 1 commit · Nov 22, 2024
1 change: 1 addition & 0 deletions RELEASE-NOTES.md
@@ -22,6 +22,7 @@
1. Doc: Adds documentation for HiveServer2 support - [#33717](https://github.com/apache/shardingsphere/pull/33717)
1. DistSQL: Check inline expression when create sharding table rule with inline sharding algorithm - [#33735](https://github.com/apache/shardingsphere/pull/33735)
1. Infra: Support setting `hive_conf_list`, `hive_var_list` and `sess_var_list` for jdbcURL when connecting to HiveServer2 - [#33749](https://github.com/apache/shardingsphere/pull/33749)
1. Infra: Support connecting to HiveServer2 through database connection pools other than HikariCP - [#33762](https://github.com/apache/shardingsphere/pull/33762)

### Bug Fixes

@@ -128,6 +128,17 @@ cd ./shardingsphere/
COPY --from=ghcr.io/apache/shardingsphere-agent:latest /usr/agent/ /shardingsphere-agent/
```

#### Community Build

Since ShardingSphere 5.5.2, ShardingSphere Agent publishes community builds at https://github.com/apache/shardingsphere/pkgs/container/shardingsphere-agent .
This Docker Image is not part of the ASF distribution; it is provided only for convenience.

If you add the following statement to a custom `Dockerfile`, it will copy the ShardingSphere Agent directory to `/shardingsphere-agent/`.

```dockerfile
COPY --from=ghcr.io/apache/shardingsphere-agent:5.5.2 /usr/agent/ /shardingsphere-agent/
```

#### Nightly Build

ShardingSphere Agent has a nightly built Docker Image at https://github.com/apache/shardingsphere/pkgs/container/shardingsphere-agent .
@@ -130,6 +130,17 @@ If you add the following statement in your custom `Dockerfile`, it will copy the
COPY --from=ghcr.io/apache/shardingsphere-agent:latest /usr/agent/ /shardingsphere-agent/
```

#### Community Build

Since ShardingSphere 5.5.2, ShardingSphere Agent has released community builds at https://github.com/apache/shardingsphere/pkgs/container/shardingsphere-agent .
This Docker Image is not part of the ASF distribution, but is provided for convenience.

If you add the following statement in a custom `Dockerfile`, it will copy the ShardingSphere Agent directory to `/shardingsphere-agent/`.

```dockerfile
COPY --from=ghcr.io/apache/shardingsphere-agent:5.5.2 /usr/agent/ /shardingsphere-agent/
```

#### Nightly Build

ShardingSphere Agent has a nightly built Docker Image at https://github.com/apache/shardingsphere/pkgs/container/shardingsphere-agent .
@@ -233,16 +233,6 @@ ShardingSphere only runs integration tests against HiveServer2 `4.0.1`.
HiveServer2 JDBC Driver `4.0.1` does not support Hadoop `3.4.1`,
see https://github.com/apache/hive/pull/5500 .

### Database connection pool limitation

Since `org.apache.hive.jdbc.DatabaseMetaData` does not implement `java.sql.DatabaseMetaData#getURL()`,
ShardingSphere applies fuzzy matching in `org.apache.shardingsphere.infra.database.DatabaseTypeEngine#getStorageType(javax.sql.DataSource)`,
so for the time being users can only connect to HiveServer2 through the `com.zaxxer.hikari.HikariDataSource` database connection pool.

If users need to connect to HiveServer2 through the `com.alibaba.druid.pool.DruidDataSource` database connection pool,
they should consider implementing `java.sql.DatabaseMetaData#getURL()` on Hive's main branch
rather than trying to modify ShardingSphere's internal classes.

### SQL Limitations

ShardingSphere JDBC DataSource does not yet support executing HiveServer2's `SET`, `CREATE TABLE`, and `TRUNCATE TABLE` statements.
@@ -237,16 +237,6 @@ Users can only use Hadoop `3.3.6` as the underlying Hadoop dependency of HiveSer
HiveServer2 JDBC Driver `4.0.1` does not support Hadoop `3.4.1`,
see https://github.com/apache/hive/pull/5500 .

### Database connection pool limitation

Since `org.apache.hive.jdbc.DatabaseMetaData` does not implement `java.sql.DatabaseMetaData#getURL()`,
ShardingSphere applies fuzzy matching in `org.apache.shardingsphere.infra.database.DatabaseTypeEngine#getStorageType(javax.sql.DataSource)`,
so for the time being users can only connect to HiveServer2 through the `com.zaxxer.hikari.HikariDataSource` database connection pool.

If users need to connect to HiveServer2 through the `com.alibaba.druid.pool.DruidDataSource` database connection pool,
they should consider implementing `java.sql.DatabaseMetaData#getURL()` in the main branch of Hive
rather than trying to modify the internal classes of ShardingSphere.
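
This pull request lifts the restriction described above, so a pool other than HikariCP can back a HiveServer2 data node. The following is a minimal sketch, not taken from the ShardingSphere docs, assuming `com.alibaba:druid` and the HiveServer2 JDBC driver are on the classpath; host, port, database name and credentials are placeholders.

```java
import com.alibaba.druid.pool.DruidDataSource;

import javax.sql.DataSource;

public final class HiveServer2DruidExample {
    
    /**
     * Builds a Druid connection pool against a HiveServer2 instance.
     * The URL, user and password below are illustrative placeholders.
     */
    public static DataSource createDataSource() {
        DruidDataSource dataSource = new DruidDataSource();
        dataSource.setDriverClassName("org.apache.hive.jdbc.HiveDriver");
        dataSource.setUrl("jdbc:hive2://localhost:10000/demo_ds_0");
        dataSource.setUsername("hive");
        dataSource.setPassword("");
        return dataSource;
    }
}
```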

### SQL Limitations

ShardingSphere JDBC DataSource does not yet support executing HiveServer2's `SET` statement,
@@ -3,8 +3,13 @@ title = "Testcontainers"
weight = 6
+++

## Background Information

ShardingSphere does not provide support for the `driverClassName` of `org.testcontainers.jdbc.ContainerDatabaseDriver` by default.
To use a `jdbcUrl` like `jdbc:tc:postgresql:17.1-bookworm://test-databases-postgres/demo_ds_0` for data nodes in ShardingSphere's configuration file,

## Prerequisites

To use a `jdbcUrl` like `jdbc:tc:postgresql:17.1-bookworm://test/demo_ds_0` for data nodes in ShardingSphere's configuration file,
the possible Maven dependencies are as follows,

```xml
@@ -28,6 +33,8 @@
</dependencies>
```

## Configuration Example

To use the `org.apache.shardingsphere:shardingsphere-infra-database-testcontainers` module,
the user machine always needs to have Docker Engine or an alternative container runtime that complies with https://java.testcontainers.org/supported_docker_environment/ installed.
At this time, a jdbcURL with the `jdbc:tc:postgresql:` prefix can be used normally in ShardingSphere's YAML configuration file.
@@ -37,15 +44,15 @@ dataSources:
  ds_0:
    dataSourceClassName: com.zaxxer.hikari.HikariDataSource
    driverClassName: org.testcontainers.jdbc.ContainerDatabaseDriver
    jdbcUrl: jdbc:tc:postgresql:17.1-bookworm://test-databases-postgres/demo_ds_0
    jdbcUrl: jdbc:tc:postgresql:17.1-bookworm://test/demo_ds_0
  ds_1:
    dataSourceClassName: com.zaxxer.hikari.HikariDataSource
    driverClassName: org.testcontainers.jdbc.ContainerDatabaseDriver
    jdbcUrl: jdbc:tc:postgresql:17.1-bookworm://test-databases-postgres/demo_ds_1
    jdbcUrl: jdbc:tc:postgresql:17.1-bookworm://test/demo_ds_1
  ds_2:
    dataSourceClassName: com.zaxxer.hikari.HikariDataSource
    driverClassName: org.testcontainers.jdbc.ContainerDatabaseDriver
    jdbcUrl: jdbc:tc:postgresql:17.1-bookworm://test-databases-postgres/demo_ds_2
    jdbcUrl: jdbc:tc:postgresql:17.1-bookworm://test/demo_ds_2
```

`org.apache.shardingsphere:shardingsphere-infra-database-testcontainers` provides support for testcontainers-java style jdbcURLs,
@@ -58,3 +65,34 @@
5. Maven module `org.testcontainers:mysql:1.20.3`, which provides support for the `jdbc:tc:mysql:` jdbcURL prefix
6. Maven modules `org.testcontainers:oracle-xe:1.20.3` and `org.testcontainers:oracle-free:1.20.3`, which provide support for the `jdbc:tc:oracle:` jdbcURL prefix
7. Maven module `org.testcontainers:tidb:1.20.3`, which provides support for the `jdbc:tc:tidb:` jdbcURL prefix

## Usage Restrictions

### Lifecycle Restrictions

If the logic of creating a Docker Container through testcontainers-java is defined in the ShardingSphere configuration file as shown below,

```yaml
dataSources:
  ds_0:
    dataSourceClassName: com.zaxxer.hikari.HikariDataSource
    driverClassName: org.testcontainers.jdbc.ContainerDatabaseDriver
    jdbcUrl: jdbc:tc:postgresql:17.1-bookworm://test/demo_ds_0
```

By default, testcontainers stops the Docker Container created by `jdbc:tc:postgresql:17.1-bookworm://test/demo_ds_0`
only after the last `java.sql.Connection` to `jdbc:tc:postgresql:17.1-bookworm://test/demo_ds_0` is closed.
However, ShardingSphere's internal classes cache `java.sql.Connection`,
so the Docker Container created by `jdbc:tc:postgresql:17.1-bookworm://test/demo_ds_0` is not closed until the JVM shuts down.
If it is necessary to prevent the Container from staying open for a long time,
`org.testcontainers.jdbc.ContainerDatabaseDriver` provides a method to quickly shut down the relevant Containers in unit tests,
for example,

```java
import org.testcontainers.jdbc.ContainerDatabaseDriver;

public class ExampleUtils {
    
    void test() {
        ContainerDatabaseDriver.killContainers();
    }
}
```
@@ -3,8 +3,13 @@ title = "Testcontainers"
weight = 6
+++

## Background Information

ShardingSphere does not provide support for `driverClassName` of `org.testcontainers.jdbc.ContainerDatabaseDriver` by default.
To use `jdbcUrl` like `jdbc:tc:postgresql:17.1-bookworm://test-databases-postgres/demo_ds_0` for data nodes in ShardingSphere's configuration file,

## Prerequisites

To use `jdbcUrl` like `jdbc:tc:postgresql:17.1-bookworm://test/demo_ds_0` for data nodes in ShardingSphere's configuration file,
the possible Maven dependencies are as follows,

@@ -30,31 +35,63 @@ the possible Maven dependencies are as follows,

## Configuration Example

To use the `org.apache.shardingsphere:shardingsphere-infra-database-testcontainers` module,
the user machine always needs to have Docker Engine or alternative container runtimes that comply with https://java.testcontainers.org/supported_docker_environment/ installed.
At this time, you can use the jdbcURL with the prefix `jdbc:tc:postgresql:` normally in the YAML configuration file of ShardingSphere.

```yaml
dataSources:
  ds_0:
    dataSourceClassName: com.zaxxer.hikari.HikariDataSource
    driverClassName: org.testcontainers.jdbc.ContainerDatabaseDriver
    jdbcUrl: jdbc:tc:postgresql:17.1-bookworm://test-databases-postgres/demo_ds_0
    jdbcUrl: jdbc:tc:postgresql:17.1-bookworm://test/demo_ds_0
  ds_1:
    dataSourceClassName: com.zaxxer.hikari.HikariDataSource
    driverClassName: org.testcontainers.jdbc.ContainerDatabaseDriver
    jdbcUrl: jdbc:tc:postgresql:17.1-bookworm://test-databases-postgres/demo_ds_1
    jdbcUrl: jdbc:tc:postgresql:17.1-bookworm://test/demo_ds_1
  ds_2:
    dataSourceClassName: com.zaxxer.hikari.HikariDataSource
    driverClassName: org.testcontainers.jdbc.ContainerDatabaseDriver
    jdbcUrl: jdbc:tc:postgresql:17.1-bookworm://test-databases-postgres/demo_ds_2
    jdbcUrl: jdbc:tc:postgresql:17.1-bookworm://test/demo_ds_2
```

`org.apache.shardingsphere:shardingsphere-infra-database-testcontainers` provides support for testcontainers-java style jdbcURLs,
including but not limited to,

1. Maven module `org.testcontainers:clickhouse:1.20.3`, which provides support for the `jdbc:tc:clickhouse:` jdbcURL prefix
2. Maven module `org.testcontainers:postgresql:1.20.3`, which provides support for the `jdbc:tc:postgresql:` jdbcURL prefix
3. Maven module `org.testcontainers:mssqlserver:1.20.3`, which provides support for the `jdbc:tc:sqlserver:` jdbcURL prefix
4. Maven module `org.testcontainers:mariadb:1.20.3`, which provides support for the `jdbc:tc:mariadb:` jdbcURL prefix
5. Maven module `org.testcontainers:mysql:1.20.3`, which provides support for the `jdbc:tc:mysql:` jdbcURL prefix
6. Maven modules `org.testcontainers:oracle-xe:1.20.3` and `org.testcontainers:oracle-free:1.20.3`, which provide support for the `jdbc:tc:oracle:` jdbcURL prefix
7. Maven module `org.testcontainers:tidb:1.20.3`, which provides support for the `jdbc:tc:tidb:` jdbcURL prefix
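
For reference, the following sketch is not taken from the ShardingSphere docs; it shows how such a testcontainers-java jdbcURL behaves on its own, assuming `org.testcontainers:postgresql` and a PostgreSQL JDBC driver are on the classpath.

```java
import java.sql.Connection;
import java.sql.DriverManager;
import java.sql.Statement;

public final class TestcontainersJdbcUrlExample {
    
    public static void main(final String[] args) throws Exception {
        // The jdbc:tc: prefix is handled by org.testcontainers.jdbc.ContainerDatabaseDriver,
        // which starts a disposable PostgreSQL 17.1 container the first time this URL is used.
        try (Connection connection = DriverManager.getConnection("jdbc:tc:postgresql:17.1-bookworm://test/demo_ds_0");
             Statement statement = connection.createStatement()) {
            statement.execute("SELECT 1");
        }
        // By default the container is stopped once the last connection to this URL is closed
        // (see the lifecycle notes in the Usage Restrictions section below).
    }
}
```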

## Usage Restrictions

### Lifecycle Restrictions

If the logic of creating a Docker Container through testcontainers-java is defined in the ShardingSphere configuration file as shown below,

```yaml
dataSources:
ds_0:
dataSourceClassName: com.zaxxer.hikari.HikariDataSource
driverClassName: org.testcontainers.jdbc.ContainerDatabaseDriver
jdbcUrl: jdbc:tc:postgresql:17.1-bookworm://test/demo_ds_0
```

By default, testcontainers stops the Docker Container created by `jdbc:tc:postgresql:17.1-bookworm://test/demo_ds_0`
only after the last `java.sql.Connection` to `jdbc:tc:postgresql:17.1-bookworm://test/demo_ds_0` is closed.
However, ShardingSphere's internal classes cache `java.sql.Connection`,
so the Docker Container created by `jdbc:tc:postgresql:17.1-bookworm://test/demo_ds_0` is not closed until the JVM shuts down.
If it is necessary to prevent the Container from staying open for a long time, `org.testcontainers.jdbc.ContainerDatabaseDriver` provides a method to quickly shut down the relevant Containers in unit tests.
An example is as follows,

```java
import org.testcontainers.jdbc.ContainerDatabaseDriver;

public class ExampleUtils {
    
    void test() {
        ContainerDatabaseDriver.killContainers();
    }
}
```
@@ -24,15 +24,13 @@
import org.apache.shardingsphere.infra.config.props.ConfigurationPropertyKey;
import org.apache.shardingsphere.infra.database.core.type.DatabaseType;
import org.apache.shardingsphere.infra.database.core.type.DatabaseTypeFactory;
import org.apache.shardingsphere.infra.datasource.pool.CatalogSwitchableDataSource;
import org.apache.shardingsphere.infra.datasource.pool.hikari.metadata.HikariDataSourcePoolFieldMetaData;
import org.apache.shardingsphere.infra.datasource.pool.hikari.metadata.HikariDataSourcePoolMetaData;
import org.apache.shardingsphere.infra.exception.core.external.sql.type.wrapper.SQLWrapperException;
import org.apache.shardingsphere.infra.spi.type.typed.TypedSPILoader;
import org.apache.shardingsphere.infra.util.reflection.ReflectionUtils;

import javax.sql.DataSource;
import java.lang.reflect.InvocationTargetException;
import java.sql.Connection;
import java.sql.DatabaseMetaData;
import java.sql.SQLException;
import java.sql.SQLFeatureNotSupportedException;
import java.util.Collection;
@@ -117,27 +115,30 @@ public static Map<String, DatabaseType> getStorageTypes(final DatabaseConfigurat

/**
* Get storage type.
* Similar to apache/hive 4.0.0's `org.apache.hive.jdbc.HiveDatabaseMetaData`, it does not implement {@link java.sql.DatabaseMetaData#getURL()}.
* So use {@link CatalogSwitchableDataSource#getUrl()} and {@link ReflectionUtils#getFieldValue(Object, String)} to try fuzzy matching.
* Similar to <a href="https://github.com/apache/hive/pull/5554">apache/hive#5554</a>,
* apache/hive 4.0.1's `org.apache.hive.jdbc.HiveDatabaseMetaData` does not implement {@link DatabaseMetaData#getURL()}.
* So use {@link java.sql.Wrapper#isWrapperFor(Class)} to try fuzzy matching.
*
* @param dataSource data source
* @return storage type
* @throws SQLWrapperException SQL wrapper exception
* @throws RuntimeException Runtime exception
*/
public static DatabaseType getStorageType(final DataSource dataSource) {
try (Connection connection = dataSource.getConnection()) {
return DatabaseTypeFactory.get(connection.getMetaData().getURL());
} catch (final SQLFeatureNotSupportedException sqlFeatureNotSupportedException) {
if (dataSource instanceof CatalogSwitchableDataSource) {
return DatabaseTypeFactory.get(((CatalogSwitchableDataSource) dataSource).getUrl());
try (Connection connection = dataSource.getConnection()) {
Class<?> hiveConnectionClass = Class.forName("org.apache.hive.jdbc.HiveConnection");
if (connection.isWrapperFor(hiveConnectionClass)) {
Object hiveConnection = connection.unwrap(hiveConnectionClass);
String connectedUrl = (String) hiveConnectionClass.getMethod("getConnectedUrl").invoke(hiveConnection);
return DatabaseTypeFactory.get(connectedUrl);
}
throw new SQLWrapperException(sqlFeatureNotSupportedException);
} catch (final SQLException | ClassNotFoundException | NoSuchMethodException | InvocationTargetException | IllegalAccessException exception) {
throw new RuntimeException(exception);
}
if (dataSource.getClass().getName().equals(new HikariDataSourcePoolMetaData().getType())) {
HikariDataSourcePoolFieldMetaData dataSourcePoolFieldMetaData = new HikariDataSourcePoolFieldMetaData();
String jdbcUrlFieldName = ReflectionUtils.<String>getFieldValue(dataSource, dataSourcePoolFieldMetaData.getJdbcUrlFieldName())
.orElseThrow(() -> new SQLWrapperException(sqlFeatureNotSupportedException));
return DatabaseTypeFactory.get(jdbcUrlFieldName);
}
throw new SQLWrapperException(sqlFeatureNotSupportedException);
} catch (final SQLException ex) {
throw new SQLWrapperException(ex);
}
@@ -692,11 +692,6 @@
"queryAllPublicConstructors":true,
"methods":[{"name":"<init>","parameterTypes":[] }, {"name":"add","parameterTypes":["long"] }, {"name":"sum","parameterTypes":[] }]
},
{
"condition":{"typeReachable":"org.apache.shardingsphere.proxy.frontend.command.CommandExecutorTask"},
"name":"java.util.concurrent.atomic.Striped64$Cell",
"fields":[{"name":"value"}]
},
{
"condition":{"typeReachable":"org.apache.shardingsphere.infra.expr.groovy.GroovyInlineExpressionParser"},
"name":"java.util.function.DoubleFunction",
@@ -2075,7 +2070,7 @@
"queryAllDeclaredMethods":true
},
{
"condition":{"typeReachable":"org.apache.shardingsphere.mode.manager.cluster.listener.DatabaseMetaDataChangedListener$$Lambda/0x00007f401fb273a8"},
"condition":{"typeReachable":"org.apache.shardingsphere.mode.manager.cluster.listener.DatabaseMetaDataChangedListener$$Lambda/0x00007f47f3b26e30"},
"name":"org.apache.shardingsphere.mode.manager.cluster.event.subscriber.dispatch.MetaDataChangedSubscriber"
},
{
@@ -20,8 +20,10 @@
import com.zaxxer.hikari.HikariConfig;
import com.zaxxer.hikari.HikariDataSource;
import org.apache.shardingsphere.test.natived.commons.TestShardingService;
import org.junit.jupiter.api.AfterAll;
import org.junit.jupiter.api.Test;
import org.junit.jupiter.api.condition.EnabledInNativeImage;
import org.testcontainers.jdbc.ContainerDatabaseDriver;

import javax.sql.DataSource;
import java.sql.SQLException;
@@ -31,6 +33,11 @@ class PostgresTest {

private TestShardingService testShardingService;

@AfterAll
static void afterAll() {
ContainerDatabaseDriver.killContainers();
}

@Test
void assertShardingInLocalTransactions() throws SQLException {
HikariConfig config = new HikariConfig();
@@ -20,8 +20,10 @@
import com.zaxxer.hikari.HikariConfig;
import com.zaxxer.hikari.HikariDataSource;
import org.apache.shardingsphere.test.natived.commons.TestShardingService;
import org.junit.jupiter.api.AfterAll;
import org.junit.jupiter.api.Test;
import org.junit.jupiter.api.condition.EnabledInNativeImage;
import org.testcontainers.jdbc.ContainerDatabaseDriver;

import javax.sql.DataSource;
import java.sql.SQLException;
@@ -31,6 +33,11 @@ class SQLServerTest {

private TestShardingService testShardingService;

@AfterAll
static void afterAll() {
ContainerDatabaseDriver.killContainers();
}

@Test
void assertShardingInLocalTransactions() throws SQLException {
HikariConfig config = new HikariConfig();