Commit ce8dfb4

Merged branch v118. Following changes:

* First-class support for namespaces
* Upgraded dependent libraries and plugins
* Added Jackson datatype support for JDK 8 types
* Fixed multi custom codec issue
* Error handling, Javadoc and README improvements
1 parent a34e489 commit ce8dfb4

31 files changed: +348 −175 lines

README.md

Lines changed: 44 additions & 18 deletions
@@ -3,22 +3,24 @@
  [![Build Status](https://api.travis-ci.org/flipkart-incubator/hbase-orm.svg?branch=master&status=passed)](https://travis-ci.org/github/flipkart-incubator/hbase-orm)
  [![Language grade: Java](https://img.shields.io/lgtm/grade/java/g/flipkart-incubator/hbase-orm.svg?logo=lgtm&logoWidth=18)](https://lgtm.com/projects/g/flipkart-incubator/hbase-orm/context:java)
  [![Coverage Status](https://coveralls.io/repos/github/flipkart-incubator/hbase-orm/badge.svg?branch=master)](https://coveralls.io/github/flipkart-incubator/hbase-orm?branch=master)
- [![Maven Central](https://img.shields.io/badge/sonatype-1.16-orange.svg)](https://oss.sonatype.org/content/repositories/releases/com/flipkart/hbase-object-mapper/1.16/)
+ [![Maven Central](https://img.shields.io/badge/sonatype-1.18-blue.svg)](https://oss.sonatype.org/content/repositories/releases/com/flipkart/hbase-object-mapper/1.18/)
  [![License](https://img.shields.io/badge/License-Apache%202-blue.svg)](./LICENSE.txt)

  ## Introduction
  HBase ORM is a light-weight, thread-safe and performant library that enables:

- 1. object-oriented access of HBase rows (Data Access Object) with minimal code and good testability
- 2. reading from and/or writing to HBase tables in Hadoop MapReduce jobs
+ 1. object-oriented access of HBase rows (Data Access Object) with minimal code and good testability.
+ 2. reading from and/or writing to HBase tables in Hadoop MapReduce jobs.
+
+ This can also be used as an ORM for Bigtable. Scroll down to the relevant section to know how.

  ## Usage
  Let's say you have an HBase table `citizens` with a row-key format of `country_code#UID`. Now, let's say this table is created with three column families `main`, `optional` and `tracked`, which may have columns (qualifiers) `uid`, `name`, `salary` etc.

  This library enables you to represent your HBase table as a *bean-like class*, as below:

  ```java
- @HBTable(name = "citizens",
+ @HBTable(namespace = "govt", name = "citizens",
          families = {
                  @Family(name = "main"),
                  @Family(name = "optional", versions = 3),
@@ -37,8 +39,11 @@ public class Citizen implements HBRecord<String> {
      @HBColumn(family = "optional", column = "age")
      private Short age;

-     @HBColumn(family = "optional", column = "salary")
-     private Integer sal;
+     @HBColumn(family = "optional", column = "income")
+     private Integer annualIncome;
+
+     @HBColumn(family = "optional", column = "registration_date")
+     private LocalDateTime registrationDate;

      @HBColumn(family = "optional", column = "counter")
      private Long counter;
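One of the fields added above, `registrationDate`, is a JDK 8 `java.time` type — the kind of field the commit's new `jackson-datatype-jsr310` dependency enables. As a minimal, self-contained sketch of how an epoch-millisecond timestamp (the unit HBase uses for cell versions) relates to a `LocalDateTime` (illustrative only — the library itself serializes such fields through its Jackson-based codec, not necessarily via epoch millis):

```java
import java.time.Instant;
import java.time.LocalDateTime;
import java.time.ZoneOffset;

public class RegistrationDateDemo {
    // Convert an epoch-millis value (as HBase stores cell timestamps) to a LocalDateTime in UTC.
    static LocalDateTime fromEpochMillis(long millis) {
        return LocalDateTime.ofInstant(Instant.ofEpochMilli(millis), ZoneOffset.UTC);
    }

    public static void main(String[] args) {
        System.out.println(fromEpochMillis(0L)); // epoch start in UTC
    }
}
```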
@@ -74,18 +79,17 @@ public class Citizen implements HBRecord<String> {
  ```
  That is,

- * The above class `Citizen` represents the HBase table `citizens`, using the `@HBTable` annotation.
+ * The above class `Citizen` represents the HBase table `citizens` in namespace `govt`, using the `@HBTable` annotation.
  * Logic for conversion of the HBase row key to member variables of `Citizen` objects and vice-versa is implemented using the `parseRowKey` and `composeRowKey` methods respectively.
  * The data type representing the row key is the type parameter to the `HBRecord` generic interface (in the above case, `String`).
- * Note that `String` is both `Comparable` and `Serializable`.
+ * Note that `String` is both '`Comparable` with itself' and `Serializable`.
  * Names of columns and their column families are specified using `@HBColumn` or `@HBColumnMultiVersion` annotations.
  * The class may contain fields of simple data types (e.g. `String`, `Integer`), generic data types (e.g. `Map`, `List`), custom classes (e.g. `Dependents`) or even generics of custom classes (e.g. `List<Dependent>`).
  * The `@HBColumnMultiVersion` annotation allows you to map multiple versions of a column in a `NavigableMap<Long, ?>`. In the above example, field `phoneNumber` is mapped to column `phone_number` within the column family `tracked` (which is configured for multiple versions).

  Alternatively, you can model your class as below:

  ```java
- ...
  class CitizenKey implements Serializable, Comparable<CitizenKey> {
      String countryCode;
      Integer uid;
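Given the `country_code#UID` row-key format described in this README, the `parseRowKey`/`composeRowKey` pair usually reduces to string joining and splitting. A self-contained sketch — the method names come from the README, but the bodies and the static-field shape here are purely illustrative, not the project's actual `Citizen` code:

```java
public class RowKeyDemo {
    static String countryCode;
    static Integer uid;

    // composeRowKey: serialize the key fields into the "country_code#UID" format.
    static String composeRowKey(String countryCode, Integer uid) {
        return countryCode + "#" + uid;
    }

    // parseRowKey: split a row key back into its constituent fields.
    static void parseRowKey(String rowKey) {
        String[] parts = rowKey.split("#", 2);
        countryCode = parts[0];
        uid = Integer.parseInt(parts[1]);
    }

    public static void main(String[] args) {
        String key = composeRowKey("IND", 101);
        parseRowKey(key);
        System.out.println(key + " -> " + countryCode + ", " + uid);
    }
}
```

In the library, the same logic lives on the entity itself, since `HBRecord<String>` requires both methods.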
@@ -148,7 +152,11 @@ Once defined, you can instantiate your *data access object* as below:
  ```java
  CitizenDAO citizenDao = new CitizenDAO(connection);
  ```
- You can access, manipulate and persist records of `citizens` table as shown in below examples:
+ **Side note**: As you'd know, HBase's `Connection` creation is a heavy-weight operation
+ (Details: [Connection](https://hbase.apache.org/2.0/apidocs/org/apache/hadoop/hbase/client/Connection.html)).
+ So, it is recommended that you create the `Connection` instance once and use it for the entire life cycle of your program, across all the DAO classes that you create (such as above).
+
+ Now, you can access, manipulate and persist records of the `citizens` table as shown in the below examples:

  Create new record:

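The side note added in this hunk recommends creating the heavy-weight `Connection` once and sharing it across all DAOs. That advice is the classic memoized-singleton pattern; here is a generic stdlib sketch, with `Object::new` standing in for the real `ConnectionFactory.createConnection(conf)` call:

```java
import java.util.function.Supplier;

public class SharedResourceDemo {
    // Memoize an expensive factory call: the first get() creates the
    // resource, every later get() returns the same instance.
    static <T> Supplier<T> memoize(Supplier<T> factory) {
        return new Supplier<T>() {
            private T instance;
            @Override public synchronized T get() {
                if (instance == null) {
                    instance = factory.get();
                }
                return instance;
            }
        };
    }

    public static void main(String[] args) {
        // Stand-in for the heavy ConnectionFactory.createConnection(conf) call.
        Supplier<Object> connection = memoize(Object::new);
        System.out.println(connection.get() == connection.get()); // same instance reused
    }
}
```

Every DAO (CitizenDAO, etc.) would then be constructed from that single shared instance.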
@@ -290,7 +298,7 @@ citizenDao.getHBaseTable() // returns HTable instance (in case you want to direc

  (see [TestsAbstractHBDAO.java](./src/test/java/com/flipkart/hbaseobjectmapper/testcases/TestsAbstractHBDAO.java) for more detailed examples)

- **Please note:** Since we're dealing with HBase (and not an OLTP data store), fitting a classical (Hibernate-like) ORM paradigm may not make sense. So this library doesn't intend to evolve as a full-fledged ORM. However, if that's your intent, I suggest you use [Apache Phoenix](https://phoenix.apache.org/).
+ **Please note:** Since we're dealing with HBase (and not a classical RDBMS), fitting a Hibernate-like ORM may not make sense. So, this library does **not** intend to evolve as a full-fledged ORM. However, if that's your intent, I suggest you use [Apache Phoenix](https://phoenix.apache.org/).


  ## Using this library for DDL operations
@@ -418,30 +426,32 @@ Being an *object mapper*, this library works for pre-defined columns only. For e
  * columns matching a pattern or a regular expression
  * unmapped columns of a column family

- ## Maven
- Add below entry within the `dependencies` section of your `pom.xml`:
+ ## Adding to your build
+ If you are using Maven, add the below entry within the `dependencies` section of your `pom.xml`:

  ```xml
  <dependency>
      <groupId>com.flipkart</groupId>
      <artifactId>hbase-object-mapper</artifactId>
-     <version>1.16</version>
+     <version>1.18</version>
  </dependency>
  ```

- See artifact details: [com.flipkart:hbase-object-mapper on **Maven Central**](https://search.maven.org/search?q=g:com.flipkart%20AND%20a:hbase-object-mapper&core=gav) or
- [com.flipkart:hbase-object-mapper on **MVN Repository**](https://mvnrepository.com/artifact/com.flipkart/hbase-object-mapper).
+ See artifact details: [com.flipkart:hbase-object-mapper on **Maven Central**](https://search.maven.org/search?q=g:com.flipkart%20AND%20a:hbase-object-mapper&core=gav).
+
+ If you're using Gradle or Ivy or SBT, see how to include this library in your build:
+ [com.flipkart:hbase-object-mapper:1.18](https://mvnrepository.com/artifact/com.flipkart/hbase-object-mapper/1.18).

  ## How to build?
  To build this project, follow the below simple steps:

  1. Do a `git clone` of this repository
- 2. Checkout the latest stable version: `git checkout v1.16`
+ 2. Checkout the latest stable version: `git checkout v1.18`
  3. Execute `mvn clean install` from shell

  ### Please note:

- * Currently, projects that use this library are running on [Hortonworks Data Platform v3.1](https://docs.cloudera.com/HDPDocuments/HDP3/HDP-3.1.0/index.html) (corresponds to Hadoop 3.1 and HBase 2.0). However, if you are using a different version of Hadoop/HBase, you may change the versions in [pom.xml](./pom.xml) to the desired ones and build the project.
+ * Currently, systems that use this library are running on Hadoop 3.1 and HBase 2.0. However, if you are using a different version of Hadoop/HBase, you may change the versions in [pom.xml](./pom.xml) to the desired ones and build the project.
  * Test cases are **very comprehensive**. So, `mvn` build times can sometimes be longer, depending on your machine configuration.
  * By default, test cases spin up an [in-memory HBase test cluster](https://github.com/apache/hbase/blob/master/hbase-server/src/test/java/org/apache/hadoop/hbase/HBaseTestingUtility.java) to run data access related test cases (a near-real-world scenario).
  * If test cases are failing with timeout errors, you may increase the timeout by setting the environment variable `INMEMORY_CLUSTER_START_TIMEOUT` (seconds). For example, on Linux you may run the command `export INMEMORY_CLUSTER_START_TIMEOUT=8` on the terminal, before running the aforementioned `mvn` command.
@@ -457,6 +467,22 @@ The change log can be found in the [releases](//github.com/flipkart-incubator/hb

  If you intend to request a feature or report a bug, you may use [Github Issues for hbase-orm](//github.com/flipkart-incubator/hbase-orm/issues).

+ ## Bigtable ORM
+ Google's [Cloud Bigtable](https://cloud.google.com/bigtable) provides first-class support for [accessing Bigtable using the HBase client](https://cloud.google.com/bigtable/docs/reference/libraries#client-libraries-usage-hbase-java).
+
+ This library can be used as a **Bigtable ORM** in 3 simple steps:
+ 1. Add the following to your dependencies:
+    * [bigtable-hbase-2.x](https://mvnrepository.com/artifact/com.google.cloud.bigtable/bigtable-hbase-2.x) or [bigtable-hbase-2.x-shaded](https://mvnrepository.com/artifact/com.google.cloud.bigtable/bigtable-hbase-2.x-shaded)
+    * This library
+ 2. Instantiate the `Connection` class as below:
+ ```java
+ import com.google.cloud.bigtable.hbase.BigtableConfiguration;
+ // some code
+ Connection connection = BigtableConfiguration.connect(projectId, instanceId);
+ // some code
+ ```
+ 3. Use the `Connection` instance as mentioned earlier
+
  ## License

  Copyright 2020 Flipkart Internet Pvt Ltd.

pom.xml

Lines changed: 33 additions & 19 deletions
@@ -5,13 +5,14 @@
      <name>HBase ORM</name>
      <description>
          HBase ORM is a light-weight, thread-safe and performant library that enables:
-         [1] object-oriented access of HBase rows (Data Access Object) with minimal code and good testability
-         [2] reading from and/or writing to HBase tables in Hadoop MapReduce jobs
+         [1] object-oriented access of HBase rows (Data Access Object) with minimal code and good testability.
+         [2] reading from and/or writing to HBase tables in Hadoop MapReduce jobs.
+         This can also be used as an ORM for Bigtable.
      </description>
      <modelVersion>4.0.0</modelVersion>
      <groupId>com.flipkart</groupId>
      <artifactId>hbase-object-mapper</artifactId>
-     <version>1.16</version>
+     <version>1.18</version>
      <url>https://github.com/flipkart-incubator/hbase-orm</url>
      <scm>
          <url>https://github.com/flipkart-incubator/hbase-orm</url>
@@ -35,18 +36,11 @@
          <organizationUrl>https://www.flipkart.com</organizationUrl>
      </developer>
  </developers>
- <repositories>
-     <repository>
-         <id>HDPReleases</id>
-         <name>HDP Releases</name>
-         <url>https://repo.hortonworks.com/content/repositories/releases/</url>
-     </repository>
- </repositories>
  <properties>
      <project.build.sourceEncoding>UTF-8</project.build.sourceEncoding>
      <project.reporting.outputEncoding>UTF-8</project.reporting.outputEncoding>
-     <version.hadoop>3.1.1.3.1.0.0-78</version.hadoop>
-     <version.hbase>2.0.2.3.1.0.0-78</version.hbase>
+     <version.hbase>2.0.6</version.hbase>
+     <jackson-version>2.10.4</jackson-version>
  </properties>
  <distributionManagement>
      <repository>
@@ -56,20 +50,30 @@
  </distributionManagement>
  <dependencies>
      <!-- Application dependencies -->
-     <dependency>
-         <groupId>org.apache.hbase</groupId>
-         <artifactId>hbase-client</artifactId>
-         <version>${version.hbase}</version>
-     </dependency>
      <dependency>
          <groupId>com.fasterxml.jackson.core</groupId>
          <artifactId>jackson-databind</artifactId>
-         <version>2.9.10.4</version>
+         <version>${jackson-version}</version>
+     </dependency>
+     <dependency>
+         <groupId>com.fasterxml.jackson.datatype</groupId>
+         <artifactId>jackson-datatype-jdk8</artifactId>
+         <version>${jackson-version}</version>
+     </dependency>
+     <dependency>
+         <groupId>com.fasterxml.jackson.datatype</groupId>
+         <artifactId>jackson-datatype-jsr310</artifactId>
+         <version>${jackson-version}</version>
      </dependency>
      <dependency>
          <groupId>com.google.guava</groupId>
          <artifactId>guava</artifactId>
-         <version>19.0</version>
+         <version>20.0</version>
+     </dependency>
+     <dependency>
+         <groupId>org.apache.hbase</groupId>
+         <artifactId>hbase-client</artifactId>
+         <version>${version.hbase}</version>
      </dependency>
      <!-- test dependencies -->
      <dependency>
@@ -194,6 +198,16 @@
              <artifactId>maven-surefire-plugin</artifactId>
              <version>2.22.2</version>
          </plugin>
+         <plugin>
+             <groupId>org.apache.maven.plugins</groupId>
+             <artifactId>maven-help-plugin</artifactId>
+             <version>3.2.0</version>
+         </plugin>
+         <plugin>
+             <groupId>org.apache.maven.plugins</groupId>
+             <artifactId>maven-dependency-plugin</artifactId>
+             <version>3.1.2</version>
+         </plugin>
      </plugins>
  </build>
  </project>

src/main/java/com/flipkart/hbaseobjectmapper/AbstractHBDAO.java

Lines changed: 8 additions & 12 deletions
@@ -114,7 +114,7 @@ protected AbstractHBDAO(Configuration configuration, Codec codec) throws IOExcep
   * @throws IllegalStateException Annotation(s) on base entity may be incorrect
   */
  protected AbstractHBDAO(Connection connection) {
-     this(connection, (Codec) null);
+     this(connection, HBObjectMapperFactory.construct());
  }

  /**
@@ -125,7 +125,7 @@ protected AbstractHBDAO(Connection connection) {
   * @throws IllegalStateException Annotation(s) on base entity may be incorrect
   */
  protected AbstractHBDAO(Configuration configuration) throws IOException {
-     this(configuration, (Codec) null);
+     this(configuration, HBObjectMapperFactory.construct());
  }

  /**
@@ -281,11 +281,7 @@ public List<T> get(List<R> rowKeys) throws IOException {
   * @throws IOException When HBase call fails
   */
  public List<T> get(R startRowKey, R endRowKey, int numVersionsToFetch) throws IOException {
-     Scan scan = new Scan()
-             .withStartRow(toBytes(startRowKey))
-             .withStopRow(toBytes(endRowKey))
-             .readVersions(numVersionsToFetch);
-     return get(scan);
+     return get(startRowKey, true, endRowKey, false, numVersionsToFetch);
  }

  /**
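The refactored `get(startRowKey, endRowKey, numVersionsToFetch)` in the hunk above delegates to an overload taking inclusivity flags; the `true`/`false` arguments presumably select start-inclusive, end-exclusive bounds, matching the removed `withStartRow`/`withStopRow` scan (whose HBase 2.x defaults are inclusive start, exclusive stop). The same half-open range semantics can be sketched with a stdlib `TreeMap`:

```java
import java.util.Set;
import java.util.TreeMap;

public class RangeSemanticsDemo {
    // Keys in the half-open range [start, end): start inclusive, end exclusive,
    // mirroring an HBase scan's withStartRow/withStopRow defaults.
    static Set<String> rowsInRange(TreeMap<String, Integer> rows, String start, String end) {
        return rows.subMap(start, true, end, false).keySet();
    }

    public static void main(String[] args) {
        TreeMap<String, Integer> rows = new TreeMap<>();
        rows.put("IND#101", 1);
        rows.put("IND#102", 2);
        rows.put("USA#101", 3);
        // "USA#101" is excluded because the stop row is exclusive.
        System.out.println(rowsInRange(rows, "IND#101", "USA#101")); // [IND#101, IND#102]
    }
}
```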
@@ -457,7 +453,7 @@ private WrappedHBColumn validateAndGetLongColumn(String fieldName) {
      if (!Long.class.equals(field.getType())) {
          throw new IllegalArgumentException(String.format("Invalid attempt to increment a non-Long field (%s.%s)", hbRecordClass.getName(), fieldName));
      }
-     return new WrappedHBColumn(field, true);
+     return new WrappedHBColumn(field);
  }

  /**
@@ -558,11 +554,11 @@ public T append(R rowKey, Map<String, Object> valuesToAppend) throws IOException
  for (Map.Entry<String, Object> e : valuesToAppend.entrySet()) {
      String fieldName = e.getKey();
      Field field = getField(fieldName);
-     WrappedHBColumn hbColumn = new WrappedHBColumn(field, true);
      Object value = e.getValue();
      if (!field.getType().isAssignableFrom(value.getClass())) {
          throw new IllegalArgumentException(String.format("An attempt was made to append a value of type '%s' to field '%s', which is of type '%s' (incompatible)", value.getClass(), fieldName, field.getType()));
      }
+     WrappedHBColumn hbColumn = new WrappedHBColumn(field);
      append.addColumn(hbColumn.familyBytes(), hbColumn.columnBytes(),
              hbObjectMapper.valueToByteArray((Serializable) value, hbColumn.codecFlags())
      );
@@ -753,7 +749,7 @@ private void populateFieldValuesToMap(Field field, Result result, Map<R, Navigab
  if (result.isEmpty()) {
      return;
  }
- WrappedHBColumn hbColumn = new WrappedHBColumn(field, true);
+ WrappedHBColumn hbColumn = new WrappedHBColumn(field);
  List<Cell> cells = result.getColumnCells(hbColumn.familyBytes(), hbColumn.columnBytes());
  for (Cell cell : cells) {
      Type fieldType = hbObjectMapper.getFieldType(field, hbColumn.isMultiVersioned());
@@ -833,7 +829,7 @@ private Map<R, Object> toSingleVersioned(Map<R, NavigableMap<Long, Object>> mult
   */
  public NavigableMap<R, NavigableMap<Long, Object>> fetchFieldValues(R startRowKey, R endRowKey, String fieldName, int numVersionsToFetch) throws IOException {
      Field field = getField(fieldName);
-     WrappedHBColumn hbColumn = new WrappedHBColumn(field, true);
+     WrappedHBColumn hbColumn = new WrappedHBColumn(field);
      Scan scan = new Scan().withStartRow(toBytes(startRowKey)).withStopRow(toBytes(endRowKey));
      scan.addColumn(hbColumn.familyBytes(), hbColumn.columnBytes());
      scan.readVersions(numVersionsToFetch);
@@ -871,7 +867,7 @@ public Map<R, Object> fetchFieldValues(R[] rowKeys, String fieldName) throws IOE
   */
  public Map<R, NavigableMap<Long, Object>> fetchFieldValues(R[] rowKeys, String fieldName, int numVersionsToFetch) throws IOException {
      Field field = getField(fieldName);
-     WrappedHBColumn hbColumn = new WrappedHBColumn(field, true);
+     WrappedHBColumn hbColumn = new WrappedHBColumn(field);
      List<Get> gets = new ArrayList<>(rowKeys.length);
      for (R rowKey : rowKeys) {
          Get get = new Get(toBytes(rowKey));

src/main/java/com/flipkart/hbaseobjectmapper/HBAdmin.java

Lines changed: 8 additions & 0 deletions
@@ -1,5 +1,6 @@
  package com.flipkart.hbaseobjectmapper;

+ import org.apache.hadoop.hbase.NamespaceDescriptor;
  import org.apache.hadoop.hbase.TableName;
  import org.apache.hadoop.hbase.client.*;
  import org.apache.hadoop.hbase.util.Bytes;
@@ -23,6 +24,13 @@ public HBAdmin(Connection connection) {
      this.connection = connection;
  }

+ public void createNamespace(String namespace) throws IOException {
+     try (Admin admin = connection.getAdmin()) {
+         admin.createNamespace(
+                 NamespaceDescriptor.create(namespace).build());
+     }
+ }
+
  /**
   * Create table represented by the class
   *
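The `createNamespace` method added above backs the commit's first-class namespace support. In HBase, a table inside a namespace is addressed by the fully qualified name `namespace:qualifier` (e.g. `govt:citizens`; the real client composes this via `TableName.valueOf(...)`). A trivial stdlib sketch of that naming convention:

```java
public class NamespaceNameDemo {
    // Compose a fully qualified HBase table name: "namespace:qualifier".
    static String fullyQualified(String namespace, String qualifier) {
        return namespace + ":" + qualifier;
    }

    public static void main(String[] args) {
        System.out.println(fullyQualified("govt", "citizens")); // govt:citizens
    }
}
```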
