Merged
2 changes: 2 additions & 0 deletions hazelcast-integration/README.md
@@ -25,3 +25,5 @@
Implementation of integration of MongoDB with hazelcast.
- <h3>openshift</h3>
A guideline to start using Hazelcast on the Red Hat OpenShift platform.
- <h3>hikari-connection-pool</h3>
A sample demo integrating Hazelcast with the Hikari connection pool.
Comment on lines +28 to +29
Contributor
We provide an integration with Hikari via JdbcDataConnection.

https://docs.hazelcast.com/hazelcast/5.5/data-connections/data-connections-configuration#JDBC

It integrates nicely with the Hazelcast configuration, and resource cleanup is tied to the data connection (handled dynamically when used from SQL, or bound to the cluster lifecycle when configured in XML).

Would you like to rework the sample with the JdbcDataConnection instead?
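For reference, a JDBC data connection of this kind might be declared in hazelcast.xml roughly as follows. This is a sketch only; the element and property names should be verified against the linked documentation:

```xml
<hazelcast>
    <data-connection name="my-jdbc-connection">
        <type>Jdbc</type>
        <properties>
            <property name="jdbcUrl">jdbc:h2:mem:testdb</property>
            <property name="user">sa</property>
        </properties>
        <shared>true</shared>
    </data-connection>
</hazelcast>
```

A shared JDBC data connection lets Hazelcast manage the underlying pool, so the sample would not need its own HikariDataSourcePool class.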

21 changes: 21 additions & 0 deletions hazelcast-integration/hikari-connection-pool/README.md
@@ -0,0 +1,21 @@
<h1>Hazelcast Integration with Hikari Connection Pool</h1>
To create the Hikari connection pool, first create a database table mapping for a Hazelcast in-memory map.
In this demo, an EMPLOYEE table created in an H2 database is mapped to employee_map.

- Create a Hikari data source from a HikariConfig, supplying the database credentials and connection pool settings such as pool size and connection timeout.
- Obtain connections from the Hikari data source.
- Implement the map store from MapLoader<K, V> and MapStore<K, V> and override all the necessary methods.
- Register the MapStore in the hazelcast.xml config.


<hazelcast>
...
<map name="employee_map">
<map-store enabled="true">
<class-name>com.demo.maps.EmployeeMapStore</class-name>
</map-store>
</map>
...
</hazelcast>

Database connection information can be externalized from hazelcast.xml. Package the classes above into a jar and copy it into the user-lib directory under the Hazelcast installation.
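One way to externalize the connection information is through `<properties>` on the map store, which Hazelcast hands to a MapLoaderLifecycleSupport implementation's init method. A sketch, with illustrative property names:

```xml
<map name="employee_map">
    <map-store enabled="true">
        <class-name>com.demo.maps.EmployeeMapStore</class-name>
        <properties>
            <property name="jdbcUrl">jdbc:h2:mem:testdb</property>
            <property name="username">sa</property>
        </properties>
    </map-store>
</map>
```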
40 changes: 40 additions & 0 deletions hazelcast-integration/hikari-connection-pool/pom.xml
@@ -0,0 +1,40 @@
<?xml version="1.0" encoding="UTF-8"?>
<project xmlns="http://maven.apache.org/POM/4.0.0"
xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance"
xsi:schemaLocation="http://maven.apache.org/POM/4.0.0 http://maven.apache.org/xsd/maven-4.0.0.xsd">
<modelVersion>4.0.0</modelVersion>
<packaging>jar</packaging>
<parent>
<groupId>com.hazelcast.samples</groupId>
<artifactId>hazelcast-integration</artifactId>
<version>0.1-SNAPSHOT</version>
</parent>

<artifactId>hikari-connection-pool</artifactId>
<name>Hazelcast Hikari Connection Pool</name>
<description>Sample code integrating Hazelcast with the Hikari connection pool</description>

<properties>
<main.basedir>${project.parent.parent.basedir}</main.basedir>

<maven.compiler.source>17</maven.compiler.source>
<maven.compiler.target>17</maven.compiler.target>
<project.build.sourceEncoding>UTF-8</project.build.sourceEncoding>
</properties>

<dependencies>
<dependency>
<groupId>com.hazelcast</groupId>
<artifactId>hazelcast</artifactId>
<version>${hazelcast.version}</version>
<scope>provided</scope>
</dependency>

<dependency>
<groupId>com.h2database</groupId>
<artifactId>h2</artifactId>
<version>2.2.224</version>
<scope>test</scope>
</dependency>
</dependencies>
</project>
@@ -0,0 +1,15 @@
package com.hazelcast.samples.connection;

import java.sql.DriverManager;
import java.sql.SQLException;

public class JDBCBasicConnection {
Contributor
Is this used anywhere?

public static void getConnection() {
    try {
        // The H2 driver self-registers on JDBC 4+; Class.forName is kept for illustration.
        Class.forName("org.h2.Driver");
        // Close the connection we opened instead of leaking it.
        DriverManager.getConnection("jdbc:h2:~/test", "sa", "").close();
    } catch (ClassNotFoundException | SQLException e) {
        throw new RuntimeException(e);
    }
}
}
@@ -0,0 +1,58 @@
package com.hazelcast.samples.connection.pool;

import com.hazelcast.shaded.com.zaxxer.hikari.HikariConfig;
import com.hazelcast.shaded.com.zaxxer.hikari.HikariDataSource;

import java.sql.Connection;
import java.sql.SQLException;

/**
 * Connection pool data source class.
 * This class must be made thread-safe for use in a multithreaded environment.
 */
public class HikariDataSourcePool {
private static HikariDataSource hikariDataSource = null;
Contributor
Using a static in this way makes it difficult to test. We don't want to give our users wrong advice.

This should be tied to the life of a MapStore instance.

private static final HikariDataSourcePool hikariDataSourcePool = new HikariDataSourcePool();

private HikariDataSourcePool() {
if (null != hikariDataSource) {
Contributor
HikariDataSourcePool is not thread-safe due to hikariDataSource. It would be ideal to make it thread-safe (e.g., using double checked locking), or mark it thread unsafe.
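A double-checked-locking variant, as suggested above, could look like this sketch. The names are illustrative; `DataSourceStub` stands in for HikariDataSource and is not part of the PR:

```java
import java.util.concurrent.atomic.AtomicInteger;

// Sketch of double-checked locking for lazy, thread-safe initialization.
class LazyDataSourceHolder {
    static final AtomicInteger creations = new AtomicInteger();

    // Stand-in for HikariDataSource, so the sketch is self-contained.
    static class DataSourceStub {
        DataSourceStub() {
            creations.incrementAndGet(); // count real initializations
        }
    }

    // volatile is required so no thread observes a half-constructed instance
    private static volatile DataSourceStub instance;

    static DataSourceStub getInstance() {
        DataSourceStub local = instance;       // first (unsynchronized) check
        if (local == null) {
            synchronized (LazyDataSourceHolder.class) {
                local = instance;              // second check, under the lock
                if (local == null) {
                    local = new DataSourceStub();
                    instance = local;
                }
            }
        }
        return local;
    }
}
```

With this shape the lock is only taken on first use, and every later call returns the same instance without synchronization.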

System.out.println("Hikari data source already created; the existing pool can be reused.");
} else {
createHikariDataSource();
}
}

private void createHikariDataSource() {
HikariConfig hikariConfig = new HikariConfig();

hikariConfig.setJdbcUrl("jdbc:h2:mem:testdb");
hikariConfig.setUsername("sa");
hikariConfig.setPassword("");
hikariConfig.setMaximumPoolSize(5);
hikariConfig.setIdleTimeout(30000);
hikariConfig.setConnectionTimeout(30000);
hikariConfig.setPoolName("Demo-POOL");
hikariConfig.setDriverClassName("org.h2.Driver");
Comment on lines +28 to +35
Contributor
Ideally, these should be parameterized.
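As a sketch of such parameterization, the settings could be read from java.util.Properties instead of being hard-coded. The key names below are illustrative, not an official HikariCP contract (HikariConfig itself also accepts a Properties object or property file path):

```java
import java.io.IOException;
import java.io.StringReader;
import java.util.Properties;

// Sketch: pool settings read from a properties source, with defaults as fallback.
class PoolSettings {
    final String jdbcUrl;
    final String username;
    final int maximumPoolSize;

    PoolSettings(Properties props) {
        this.jdbcUrl = props.getProperty("jdbcUrl", "jdbc:h2:mem:testdb");
        this.username = props.getProperty("username", "sa");
        this.maximumPoolSize = Integer.parseInt(props.getProperty("maximumPoolSize", "5"));
    }

    // Parse properties from text; in practice this would read a file or classpath resource.
    static PoolSettings load(String propertiesText) throws IOException {
        Properties props = new Properties();
        props.load(new StringReader(propertiesText));
        return new PoolSettings(props);
    }
}
```

createHikariDataSource could then copy these values into the HikariConfig rather than embedding literals.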


hikariDataSource = new HikariDataSource(hikariConfig);
Contributor
Where is this dataSource closed?


System.out.println("Data source created.");
}

/**
 * Returns a connection from the pool; synchronized so that access stays thread-safe.
 */
public static synchronized Connection getConnection() {
    try {
        if (null != hikariDataSource) {
            System.out.println("Getting SQL connection from the Hikari pool.");
            return hikariDataSource.getConnection();
        } else {
            throw new RuntimeException("Hikari data source not available.");
        }
    } catch (SQLException exception) {
        throw new RuntimeException("Exception while getting a database connection.", exception);
    }
}
}
@@ -0,0 +1,136 @@
package com.hazelcast.samples.map;

import com.hazelcast.map.MapLoader;
import com.hazelcast.map.MapStore;
import com.hazelcast.samples.model.Employee;

import java.sql.*;
import java.util.ArrayList;
import java.util.Collection;
import java.util.List;
import java.util.Map;
import java.util.stream.Collectors;

import static com.hazelcast.samples.connection.pool.HikariDataSourcePool.getConnection;

public class EmployeeMapStore implements MapLoader<Integer, Employee>, MapStore<Integer, Employee> {

public EmployeeMapStore() {
}

@Override
public Iterable<Integer> loadAllKeys() {
String query = "SELECT EMPID FROM EMPLOYEE";

List<Integer> empIds = new ArrayList<>();
try (Connection connection = getConnection();
PreparedStatement preparedStatement = connection.prepareStatement(query);
ResultSet resultSet = preparedStatement.executeQuery()) {
while (resultSet.next()) {
empIds.add(resultSet.getInt(1));
}
} catch (SQLException exception) {
    throw new RuntimeException("Error loading all keys.", exception);
}

return empIds;
}

@Override
public Employee load(Integer empId) {
    String query = "SELECT EMPID, NAME, SALARY FROM EMPLOYEE WHERE EMPID=?";
    Employee employee = null;
    try (Connection connection = getConnection();
         PreparedStatement preparedStatement = connection.prepareStatement(query)) {
        preparedStatement.setInt(1, empId);
        try (ResultSet resultSet = preparedStatement.executeQuery()) {
            if (resultSet.next()) {
                employee = new Employee(resultSet.getInt(1), resultSet.getString(2), resultSet.getDouble(3));
            }
        }
    } catch (SQLException exception) {
        throw new RuntimeException("Error loading key " + empId, exception);
    }

    return employee;
}

@Override
public Map<Integer, Employee> loadAll(Collection<Integer> keys) {
    System.out.println("Loading all requested employees..");

    return keys.stream().collect(Collectors.toMap(id -> id, this::load));
}

@Override
public void store(Integer integer, Employee employee) {
String storeQuery = "INSERT INTO EMPLOYEE(EMPID, NAME, SALARY) VALUES(?, ?, ?)";
try (Connection connection = getConnection();
PreparedStatement preparedStatement = connection.prepareStatement(storeQuery)) {
preparedStatement.setInt(1, employee.empId());
preparedStatement.setString(2, employee.name());
preparedStatement.setDouble(3, employee.salary());

preparedStatement.executeUpdate();
} catch (SQLException exception) {
    throw new RuntimeException("Error storing employee " + integer, exception);
}
}


@Override
public void storeAll(Map<Integer, Employee> map) {
String storeQuery = "INSERT INTO EMPLOYEE(EMPID, NAME, SALARY) VALUES(?, ?, ?)";
try (Connection connection = getConnection();
PreparedStatement preparedStatement = connection.prepareStatement(storeQuery)) {
map.forEach((identity, employee) -> {

try {
preparedStatement.setInt(1, employee.empId());
preparedStatement.setString(2, employee.name());
preparedStatement.setDouble(3, employee.salary());
preparedStatement.addBatch();
} catch (SQLException e) {
throw new RuntimeException(e);
}
});

    preparedStatement.executeBatch();
} catch (SQLException exception) {
    throw new RuntimeException("Error storing employees.", exception);
}
}

@Override
public void delete(Integer empId) {
String deleteQuery = "DELETE FROM EMPLOYEE WHERE EMPID=?";
try (Connection connection = getConnection();
PreparedStatement preparedStatement = connection.prepareStatement(deleteQuery)) {
preparedStatement.setInt(1, empId);

preparedStatement.executeUpdate();
} catch (SQLException exception) {
    throw new RuntimeException("Error deleting employee " + empId, exception);
}
}

@Override
public void deleteAll(Collection<Integer> empIds) {
    if (empIds.isEmpty()) {
        return;
    }
    // "IN (?)" cannot be bound to a whole collection, so build one placeholder per id.
    String placeholders = String.join(",", java.util.Collections.nCopies(empIds.size(), "?"));
    String deleteQuery = "DELETE FROM EMPLOYEE WHERE EMPID IN (" + placeholders + ")";
    try (Connection connection = getConnection();
         PreparedStatement preparedStatement = connection.prepareStatement(deleteQuery)) {
        int index = 1;
        for (Integer empId : empIds) {
            preparedStatement.setInt(index++, empId);
        }

        preparedStatement.executeUpdate();
    } catch (SQLException exception) {
        throw new RuntimeException("Error deleting employees.", exception);
    }
}
}
@@ -0,0 +1,5 @@
package com.hazelcast.samples.model;

public record Employee(Integer empId, String name, Double salary) {

}
@@ -0,0 +1,12 @@
<?xml version="1.0" encoding="UTF-8"?>
<hazelcast>

<!--Sample map config XML-->
<map name="employee_map">
<map-store enabled="true">
<class-name>com.demo.maps.EmployeeMapStore</class-name>
</map-store>
</map>

</hazelcast>

1 change: 1 addition & 0 deletions hazelcast-integration/pom.xml
@@ -46,6 +46,7 @@
<module>manager-based-session-replication</module>
<module>openshift</module>
<module>eureka</module>
<module>hikari-connection-pool</module>
</modules>

<dependencies>