Chapter 3 Client API: Basic Features - Batch Operations
Ah, there is so much material in this book ㅠ_ㅠ I have to study on weekends too.
2018/08/10 - [2018 H2/HBASE] - Chapter 3 Client API: Basic Features - Put Method
2018/08/17 - [2018 H2/HBASE] - Chapter 3 Client API: Basic Features - Get Method
2018/08/17 - [2018 H2/HBASE] - Chapter 3 Client API: Basic Features - Delete Method
Batch Operations
So far we have only worked with a single row at a time, or at most packed several operations of the same kind into a List. Here we will apply a mix of different operations across multiple rows in a single batch.
If you look at the Table class, you will find a method called batch for this.
*You can pack different operations into one batch, but putting a Put and a Delete for the same row in the same batch is bad practice. It creates a race condition, and the book warns the result may be unstable.
Straight to the code!
public static void main(String[] args) throws IOException {
Configuration conf = HBaseConfiguration.create();
Connection connection = ConnectionFactory.createConnection(conf);
Admin admin = connection.getAdmin();
HTableDescriptor tableDescriptor = new HTableDescriptor(TableName.valueOf("testtable"));
HColumnDescriptor cd=new HColumnDescriptor("colfam1");
HColumnDescriptor cd2=new HColumnDescriptor("colfam2");
cd.setMaxVersions(10);
cd2.setMaxVersions(10);
tableDescriptor.addFamily(cd);
tableDescriptor.addFamily(cd2);
admin.createTable(tableDescriptor);
System.out.println("create table testTable.."+tableDescriptor.getNameAsString());
Table table = connection.getTable(TableName.valueOf("testtable"));
Put put = new Put(Bytes.toBytes("row1"));
put.addColumn(Bytes.toBytes("colfam1"),Bytes.toBytes("qual1"),1,Bytes.toBytes("val1"));
put.addColumn(Bytes.toBytes("colfam1"),Bytes.toBytes("qual2"),2,Bytes.toBytes("val2"));
put.addColumn(Bytes.toBytes("colfam1"),Bytes.toBytes("qual3"),3,Bytes.toBytes("val3"));
table.put(put); // seed the table with a few values in advance
List<Row> batch = new ArrayList<Row>();
Put put2 = new Put(Bytes.toBytes("row2"));
put2.addColumn(Bytes.toBytes("colfam2"), Bytes.toBytes("qual1"), 4, Bytes.toBytes("val5"));
batch.add(put2);
Get get1 = new Get(Bytes.toBytes("row1"));
get1.addColumn(Bytes.toBytes("colfam1"),Bytes.toBytes("qual1"));
batch.add(get1);
Delete delete = new Delete(Bytes.toBytes("row1"));
delete.addColumns(Bytes.toBytes("colfam2"), Bytes.toBytes("qual2"));
batch.add(delete);
Get get2 = new Get(Bytes.toBytes("row2"));
get2.addFamily(Bytes.toBytes("BOGUS")); // this column family does not exist, so an error occurs
batch.add(get2);
// the Get, Put, and Delete operations have all been added to the batch list
Object[] results = new Object[batch.size()];
try {
table.batch(batch, results); // execute the batch
} catch (Exception e) {
System.err.println("Error: " + e);
}
for (int i = 0; i < results.length; i++) {
System.out.println("Result[" + i + "]: type = " +
results[i].getClass().getSimpleName() + "; " + results[i]);
}
table.close();
connection.close();
}
Then, as expected, the error message below is printed. In the middle of the output there is a section printing the Result values, and from it you can see that the Get operation was still executed.
Error: org.apache.hadoop.hbase.client.RetriesExhaustedWithDetailsException: Failed 1 action: org.apache.hadoop.hbase.regionserver.NoSuchColumnFamilyException: Column family BOGUS does not exist in region testtable,,1534506668895.c97d2fa45b3cbf98834d020a52b09d85. in table 'testtable', {NAME => 'colfam1', VERSIONS => '10', EVICT_BLOCKS_ON_CLOSE => 'false', NEW_VERSION_BEHAVIOR => 'false', KEEP_DELETED_CELLS => 'FALSE', CACHE_DATA_ON_WRITE => 'false', DATA_BLOCK_ENCODING => 'NONE', TTL => 'FOREVER', MIN_VERSIONS => '0', REPLICATION_SCOPE => '0', BLOOMFILTER => 'ROW', CACHE_INDEX_ON_WRITE => 'false', IN_MEMORY => 'false', CACHE_BLOOMS_ON_WRITE => 'false', PREFETCH_BLOCKS_ON_OPEN => 'false', COMPRESSION => 'NONE', BLOCKCACHE => 'true', BLOCKSIZE => '65536'}, {NAME => 'colfam2', VERSIONS => '10', EVICT_BLOCKS_ON_CLOSE => 'false', NEW_VERSION_BEHAVIOR => 'false', KEEP_DELETED_CELLS => 'FALSE', CACHE_DATA_ON_WRITE => 'false', DATA_BLOCK_ENCODING => 'NONE', TTL => 'FOREVER', MIN_VERSIONS => '0', REPLICATION_SCOPE => '0', BLOOMFILTER => 'ROW', CACHE_INDEX_ON_WRITE => 'false', IN_MEMORY => 'false', CACHE_BLOOMS_ON_WRITE => 'false', PREFETCH_BLOCKS_ON_OPEN => 'false', COMPRESSION => 'NONE', BLOCKCACHE => 'true', BLOCKSIZE => '65536'}
at org.apache.hadoop.hbase.regionserver.HRegion.checkFamily(HRegion.java:7982)
at org.apache.hadoop.hbase.regionserver.HRegion.prepareGet(HRegion.java:7269)
at org.apache.hadoop.hbase.regionserver.RSRpcServices.get(RSRpcServices.java:2512)
at org.apache.hadoop.hbase.regionserver.RSRpcServices.doNonAtomicRegionMutation(RSRpcServices.java:834)
at org.apache.hadoop.hbase.regionserver.RSRpcServices.multi(RSRpcServices.java:2673)
at org.apache.hadoop.hbase.shaded.protobuf.generated.ClientProtos$ClientService$2.callBlockingMethod(ClientProtos.java:42014)
at org.apache.hadoop.hbase.ipc.RpcServer.call(RpcServer.java:409)
at org.apache.hadoop.hbase.ipc.CallRunner.run(CallRunner.java:130)
at org.apache.hadoop.hbase.ipc.RpcExecutor$Handler.run(RpcExecutor.java:324)
at org.apache.hadoop.hbase.ipc.RpcExecutor$Handler.run(RpcExecutor.java:304)
: 1 time, servers with issues: netdb.slave1.com,16020,1534483984776
Result[0]: type = Result; keyvalues=NONE
Result[1]: type = Result; keyvalues={row1/colfam1:qual1/1/Put/vlen=4/seqid=0}
Result[2]: type = Result; keyvalues=NONE
Result[3]: type = NoSuchColumnFamilyException; org.apache.hadoop.hbase.regionserver.NoSuchColumnFamilyException: org.apache.hadoop.hbase.regionserver.NoSuchColumnFamilyException: Column family BOGUS does not exist in region testtable,,1534506668895.c97d2fa45b3cbf98834d020a52b09d85. in table 'testtable', {NAME => 'colfam1', VERSIONS => '10', EVICT_BLOCKS_ON_CLOSE => 'false', NEW_VERSION_BEHAVIOR => 'false', KEEP_DELETED_CELLS => 'FALSE', CACHE_DATA_ON_WRITE => 'false', DATA_BLOCK_ENCODING => 'NONE', TTL => 'FOREVER', MIN_VERSIONS => '0', REPLICATION_SCOPE => '0', BLOOMFILTER => 'ROW', CACHE_INDEX_ON_WRITE => 'false', IN_MEMORY => 'false', CACHE_BLOOMS_ON_WRITE => 'false', PREFETCH_BLOCKS_ON_OPEN => 'false', COMPRESSION => 'NONE', BLOCKCACHE => 'true', BLOCKSIZE => '65536'}, {NAME => 'colfam2', VERSIONS => '10', EVICT_BLOCKS_ON_CLOSE => 'false', NEW_VERSION_BEHAVIOR => 'false', KEEP_DELETED_CELLS => 'FALSE', CACHE_DATA_ON_WRITE => 'false', DATA_BLOCK_ENCODING => 'NONE', TTL => 'FOREVER', MIN_VERSIONS => '0', REPLICATION_SCOPE => '0', BLOOMFILTER => 'ROW', CACHE_INDEX_ON_WRITE => 'false', IN_MEMORY => 'false', CACHE_BLOOMS_ON_WRITE => 'false', PREFETCH_BLOCKS_ON_OPEN => 'false', COMPRESSION => 'NONE', BLOCKCACHE => 'true', BLOCKSIZE => '65536'}
at org.apache.hadoop.hbase.regionserver.HRegion.checkFamily(HRegion.java:7982)
at org.apache.hadoop.hbase.regionserver.HRegion.prepareGet(HRegion.java:7269)
at org.apache.hadoop.hbase.regionserver.RSRpcServices.get(RSRpcServices.java:2512)
at org.apache.hadoop.hbase.regionserver.RSRpcServices.doNonAtomicRegionMutation(RSRpcServices.java:834)
at org.apache.hadoop.hbase.regionserver.RSRpcServices.multi(RSRpcServices.java:2673)
at org.apache.hadoop.hbase.shaded.protobuf.generated.ClientProtos$ClientService$2.callBlockingMethod(ClientProtos.java:42014)
at org.apache.hadoop.hbase.ipc.RpcServer.call(RpcServer.java:409)
at org.apache.hadoop.hbase.ipc.CallRunner.run(CallRunner.java:130)
at org.apache.hadoop.hbase.ipc.RpcExecutor$Handler.run(RpcExecutor.java:324)
at org.apache.hadoop.hbase.ipc.RpcExecutor$Handler.run(RpcExecutor.java:304)
What about the Put? Looking at the scan below, you can confirm the Put succeeded: row2 was inserted.
hbase(main):004:0> scan 'testtable'
ROW COLUMN+CELL
row1 column=colfam1:qual1, timestamp=1, value=val1
row1 column=colfam1:qual2, timestamp=2, value=val2
row1 column=colfam1:qual3, timestamp=3, value=val3
row2 column=colfam2:qual1, timestamp=4, value=val5
2 row(s)
Took 0.3051 seconds
Let's look at what the returned values mean. A batch operation can put the following types into the results array:
Result value | Description
null | The operation failed to communicate with the remote server.
Empty Result | The Put or Delete operation succeeded.
Result | The Get operation succeeded; if no matching value exists, an empty Result is returned.
Throwable | The server threw an exception for that operation.
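In code, the table above boils down to branching on the type of each slot in the results array. Here is a minimal plain-Java sketch of those checks; the class, the classify() helper, and the stand-in objects are my own (no HBase dependency), only the null / Throwable / other-object distinction mirrors the table:

```java
public class BatchResultDemo {
    // Classify one slot of the results array the way the table above describes:
    // null -> no response, Throwable -> server-side error, anything else -> success.
    static String classify(Object slot) {
        if (slot == null) {
            return "communication failure";   // no response from the remote server
        } else if (slot instanceof Throwable) {
            return "server error: " + ((Throwable) slot).getClass().getSimpleName();
        } else {
            return "success: " + slot;        // a Result (possibly empty)
        }
    }

    public static void main(String[] args) {
        // Stand-ins for the cases: no response, empty Result,
        // non-empty Result, and a server-side exception.
        Object[] results = {
            null,
            "keyvalues=NONE",
            "keyvalues={row1/colfam1:qual1/1/Put}",
            new IllegalArgumentException("BOGUS")
        };
        for (int i = 0; i < results.length; i++) {
            System.out.println("Result[" + i + "]: " + classify(results[i]));
        }
    }
}
```

With real HBase results you would check `instanceof Result` instead of the catch-all else branch, but the shape of the loop is the same.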
The book introduces two batch methods with the same name that differ only in return type; in the current API, I believe the second one corresponds to batchCallback. The API docs describe it as "Same as batch(List, Object[]), but with a callback."
The code is below. As the quoted description says, the difference from batch is the callback. I looked up what a callback is: A calls B to perform some work, and at some point B calls back into A, executing code that A registered in advance. I think I saw this pattern in Android Studio back in undergrad =_=;
Source: https://brunch.co.kr/@kimkm4726/1
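The idea can be shown without HBase at all. In this toy sketch (all names are mine), process() plays the role of B: it runs through a list and invokes the function the caller handed it as each item finishes, which is the same shape as batchCallback firing once per completed operation:

```java
import java.util.ArrayList;
import java.util.List;
import java.util.function.BiConsumer;

public class CallbackDemo {
    // "B": does the per-row work, then calls back into the caller ("A")
    // with each result as it completes.
    static void process(List<String> rows, BiConsumer<String, String> onDone) {
        for (String row : rows) {
            String result = row.toUpperCase(); // pretend this is the per-row work
            onDone.accept(row, result);        // invoke the caller's registered code
        }
    }

    public static void main(String[] args) {
        List<String> log = new ArrayList<>();
        process(List.of("row1", "row2"),
                (row, result) -> log.add(row + " -> " + result));
        System.out.println(log); // [row1 -> ROW1, row2 -> ROW2]
    }
}
```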
List<Row> batch = new ArrayList<Row>();
Put put2 = new Put(Bytes.toBytes("row2"));
put2.addColumn(Bytes.toBytes("colfam2"), Bytes.toBytes("qual1"), 4, Bytes.toBytes("val5"));
batch.add(put2);
Get get1 = new Get(Bytes.toBytes("row1"));
get1.addColumn(Bytes.toBytes("colfam1"),Bytes.toBytes("qual1"));
batch.add(get1);
Delete delete = new Delete(Bytes.toBytes("row1"));
delete.addColumns(Bytes.toBytes("colfam2"), Bytes.toBytes("qual2"));
batch.add(delete);
Get get2 = new Get(Bytes.toBytes("row2"));
get2.addFamily(Bytes.toBytes("BOGUS")); // nonexistent column family, fails again
batch.add(get2);
Object[] results = new Object[batch.size()];
try {
table.batchCallback(batch, results, new Batch.Callback<Result>() {
@Override
public void update(byte[] region, byte[] row, Result result) {
System.out.println("Received callback for row[" +
Bytes.toString(row) + "] -> " + result);
}
});
} catch (Exception e) {
System.err.println("Error: " + e);
}
for (int i = 0; i < results.length; i++) {
System.out.println("Result[" + i + "]: type = " +
results[i].getClass().getSimpleName() + "; " + results[i]);
}
Error: org.apache.hadoop.hbase.client.RetriesExhaustedWithDetailsException: Failed 1 action: org.apache.hadoop.hbase.regionserver.NoSuchColumnFamilyException: Column family BOGUS does not exist in region testtable,,1534509253439.6cae61149e758c73491c6e1f1c49d8a0. in table 'testtable', {NAME => 'colfam1', VERSIONS => '10', EVICT_BLOCKS_ON_CLOSE => 'false', NEW_VERSION_BEHAVIOR => 'false', KEEP_DELETED_CELLS => 'FALSE', CACHE_DATA_ON_WRITE => 'false', DATA_BLOCK_ENCODING => 'NONE', TTL => 'FOREVER', MIN_VERSIONS => '0', REPLICATION_SCOPE => '0', BLOOMFILTER => 'ROW', CACHE_INDEX_ON_WRITE => 'false', IN_MEMORY => 'false', CACHE_BLOOMS_ON_WRITE => 'false', PREFETCH_BLOCKS_ON_OPEN => 'false', COMPRESSION => 'NONE', BLOCKCACHE => 'true', BLOCKSIZE => '65536'}, {NAME => 'colfam2', VERSIONS => '10', EVICT_BLOCKS_ON_CLOSE => 'false', NEW_VERSION_BEHAVIOR => 'false', KEEP_DELETED_CELLS => 'FALSE', CACHE_DATA_ON_WRITE => 'false', DATA_BLOCK_ENCODING => 'NONE', TTL => 'FOREVER', MIN_VERSIONS => '0', REPLICATION_SCOPE => '0', BLOOMFILTER => 'ROW', CACHE_INDEX_ON_WRITE => 'false', IN_MEMORY => 'false', CACHE_BLOOMS_ON_WRITE => 'false', PREFETCH_BLOCKS_ON_OPEN => 'false', COMPRESSION => 'NONE', BLOCKCACHE => 'true', BLOCKSIZE => '65536'}
at org.apache.hadoop.hbase.regionserver.HRegion.checkFamily(HRegion.java:7982)
at org.apache.hadoop.hbase.regionserver.HRegion.prepareGet(HRegion.java:7269)
at org.apache.hadoop.hbase.regionserver.RSRpcServices.get(RSRpcServices.java:2512)
at org.apache.hadoop.hbase.regionserver.RSRpcServices.doNonAtomicRegionMutation(RSRpcServices.java:834)
at org.apache.hadoop.hbase.regionserver.RSRpcServices.multi(RSRpcServices.java:2673)
at org.apache.hadoop.hbase.shaded.protobuf.generated.ClientProtos$ClientService$2.callBlockingMethod(ClientProtos.java:42014)
at org.apache.hadoop.hbase.ipc.RpcServer.call(RpcServer.java:409)
at org.apache.hadoop.hbase.ipc.CallRunner.run(CallRunner.java:130)
at org.apache.hadoop.hbase.ipc.RpcExecutor$Handler.run(RpcExecutor.java:324)
at org.apache.hadoop.hbase.ipc.RpcExecutor$Handler.run(RpcExecutor.java:304)
: 1 time, servers with issues: netdb.slave3.com,16020,1534483988616
Result[0]: type = Result; keyvalues=NONE
Result[1]: type = Result; keyvalues={row1/colfam1:qual1/1/Put/vlen=4/seqid=0}
Result[2]: type = Result; keyvalues=NONE
Result[3]: type = NoSuchColumnFamilyException; org.apache.hadoop.hbase.regionserver.NoSuchColumnFamilyException: org.apache.hadoop.hbase.regionserver.NoSuchColumnFamilyException: Column family BOGUS does not exist in region testtable,,1534509253439.6cae61149e758c73491c6e1f1c49d8a0. in table 'testtable', {NAME => 'colfam1', VERSIONS => '10', EVICT_BLOCKS_ON_CLOSE => 'false', NEW_VERSION_BEHAVIOR => 'false', KEEP_DELETED_CELLS => 'FALSE', CACHE_DATA_ON_WRITE => 'false', DATA_BLOCK_ENCODING => 'NONE', TTL => 'FOREVER', MIN_VERSIONS => '0', REPLICATION_SCOPE => '0', BLOOMFILTER => 'ROW', CACHE_INDEX_ON_WRITE => 'false', IN_MEMORY => 'false', CACHE_BLOOMS_ON_WRITE => 'false', PREFETCH_BLOCKS_ON_OPEN => 'false', COMPRESSION => 'NONE', BLOCKCACHE => 'true', BLOCKSIZE => '65536'}, {NAME => 'colfam2', VERSIONS => '10', EVICT_BLOCKS_ON_CLOSE => 'false', NEW_VERSION_BEHAVIOR => 'false', KEEP_DELETED_CELLS => 'FALSE', CACHE_DATA_ON_WRITE => 'false', DATA_BLOCK_ENCODING => 'NONE', TTL => 'FOREVER', MIN_VERSIONS => '0', REPLICATION_SCOPE => '0', BLOOMFILTER => 'ROW', CACHE_INDEX_ON_WRITE => 'false', IN_MEMORY => 'false', CACHE_BLOOMS_ON_WRITE => 'false', PREFETCH_BLOCKS_ON_OPEN => 'false', COMPRESSION => 'NONE', BLOCKCACHE => 'true', BLOCKSIZE => '65536'}
at org.apache.hadoop.hbase.regionserver.HRegion.checkFamily(HRegion.java:7982)
at org.apache.hadoop.hbase.regionserver.HRegion.prepareGet(HRegion.java:7269)
at org.apache.hadoop.hbase.regionserver.RSRpcServices.get(RSRpcServices.java:2512)
at org.apache.hadoop.hbase.regionserver.RSRpcServices.doNonAtomicRegionMutation(RSRpcServices.java:834)
at org.apache.hadoop.hbase.regionserver.RSRpcServices.multi(RSRpcServices.java:2673)
at org.apache.hadoop.hbase.shaded.protobuf.generated.ClientProtos$ClientService$2.callBlockingMethod(ClientProtos.java:42014)
at org.apache.hadoop.hbase.ipc.RpcServer.call(RpcServer.java:409)
at org.apache.hadoop.hbase.ipc.CallRunner.run(CallRunner.java:130)
at org.apache.hadoop.hbase.ipc.RpcExecutor$Handler.run(RpcExecutor.java:324)
at org.apache.hadoop.hbase.ipc.RpcExecutor$Handler.run(RpcExecutor.java:304)
* Code to test why the book says not to mix operations on a single row
public static void main(String[] args) throws IOException {
Configuration conf = HBaseConfiguration.create();
Connection connection = ConnectionFactory.createConnection(conf);
Admin admin = connection.getAdmin();
HTableDescriptor tableDescriptor = new HTableDescriptor(TableName.valueOf("testtable"));
HColumnDescriptor cd=new HColumnDescriptor("colfam1");
HColumnDescriptor cd2=new HColumnDescriptor("colfam2");
cd.setMaxVersions(10);
cd2.setMaxVersions(10);
tableDescriptor.addFamily(cd);
tableDescriptor.addFamily(cd2);
admin.createTable(tableDescriptor);
System.out.println("create table testTable.."+tableDescriptor.getNameAsString());
Table table = connection.getTable(TableName.valueOf("testtable"));
Put put = new Put(Bytes.toBytes("row1"));
put.addColumn(Bytes.toBytes("colfam1"),Bytes.toBytes("qual1"),1L,Bytes.toBytes("val1"));
table.put(put);
List<Row> batch = new ArrayList<Row>();
Put put2 = new Put(Bytes.toBytes("row1"));
put2.addColumn(Bytes.toBytes("colfam1"), Bytes.toBytes("qual1"), 2L, Bytes.toBytes("val2"));
batch.add(put2);
Get get1 = new Get(Bytes.toBytes("row1"));
get1.addColumn(Bytes.toBytes("colfam1"),Bytes.toBytes("qual1"));
batch.add(get1);
Delete delete = new Delete(Bytes.toBytes("row1"));
delete.addColumns(Bytes.toBytes("colfam1"), Bytes.toBytes("qual1"),3L);
batch.add(delete);
Get get2 = new Get(Bytes.toBytes("row1"));
get2.addColumn(Bytes.toBytes("colfam1"),Bytes.toBytes("qual1"));
batch.add(get2);
Object[] results = new Object[batch.size()];
try {
table.batch(batch, results);
} catch (Exception e) {
System.err.println("Error: " + e);
}
for (int i = 0; i < results.length; i++) {
System.out.println("Result[" + i + "]: type = " +
results[i].getClass().getSimpleName() + "; " + results[i]);
}
table.close();
connection.close();
}
hbase(main):013:0> scan 'testtable'
ROW COLUMN+CELL
row1 column=colfam1:qual1, timestamp=1, value=val1
row1 column=colfam1:qual2, timestamp=2, value=val2
row1 column=colfam1:qual3, timestamp=3, value=val3
row2 column=colfam2:qual1, timestamp=4, value=val5
2 row(s)
Took 0.0435 seconds
create table testTable..testtable
Received callback for row[row2] -> keyvalues=NONE
Received callback for row[row1] -> keyvalues={row1/colfam1:qual1/1/Put/vlen=4/seqid=0}
Received callback for row[row1] -> keyvalues=NONE
Error: org.apache.hadoop.hbase.client.RetriesExhaustedWithDetailsException: Failed 1 action: org.apache.hadoop.hbase.regionserver.NoSuchColumnFamilyException: Column family BOGUS does not exist in region testtable,,1534509855860.29ca3a407eb659d7914476a1dcb6ec58. in table 'testtable', {NAME => 'colfam1', VERSIONS => '10', EVICT_BLOCKS_ON_CLOSE => 'false', NEW_VERSION_BEHAVIOR => 'false', KEEP_DELETED_CELLS => 'FALSE', CACHE_DATA_ON_WRITE => 'false', DATA_BLOCK_ENCODING => 'NONE', TTL => 'FOREVER', MIN_VERSIONS => '0', REPLICATION_SCOPE => '0', BLOOMFILTER => 'ROW', CACHE_INDEX_ON_WRITE => 'false', IN_MEMORY => 'false', CACHE_BLOOMS_ON_WRITE => 'false', PREFETCH_BLOCKS_ON_OPEN => 'false', COMPRESSION => 'NONE', BLOCKCACHE => 'true', BLOCKSIZE => '65536'}, {NAME => 'colfam2', VERSIONS => '10', EVICT_BLOCKS_ON_CLOSE => 'false', NEW_VERSION_BEHAVIOR => 'false', KEEP_DELETED_CELLS => 'FALSE', CACHE_DATA_ON_WRITE => 'false', DATA_BLOCK_ENCODING => 'NONE', TTL => 'FOREVER', MIN_VERSIONS => '0', REPLICATION_SCOPE => '0', BLOOMFILTER => 'ROW', CACHE_INDEX_ON_WRITE => 'false', IN_MEMORY => 'false', CACHE_BLOOMS_ON_WRITE => 'false', PREFETCH_BLOCKS_ON_OPEN => 'false', COMPRESSION => 'NONE', BLOCKCACHE => 'true', BLOCKSIZE => '65536'}
at org.apache.hadoop.hbase.regionserver.HRegion.checkFamily(HRegion.java:7982)
at org.apache.hadoop.hbase.regionserver.HRegion.prepareGet(HRegion.java:7269)
at org.apache.hadoop.hbase.regionserver.RSRpcServices.get(RSRpcServices.java:2512)
at org.apache.hadoop.hbase.regionserver.RSRpcServices.doNonAtomicRegionMutation(RSRpcServices.java:834)
at org.apache.hadoop.hbase.regionserver.RSRpcServices.multi(RSRpcServices.java:2673)
at org.apache.hadoop.hbase.shaded.protobuf.generated.ClientProtos$ClientService$2.callBlockingMethod(ClientProtos.java:42014)
at org.apache.hadoop.hbase.ipc.RpcServer.call(RpcServer.java:409)
at org.apache.hadoop.hbase.ipc.CallRunner.run(CallRunner.java:130)
at org.apache.hadoop.hbase.ipc.RpcExecutor$Handler.run(RpcExecutor.java:324)
at org.apache.hadoop.hbase.ipc.RpcExecutor$Handler.run(RpcExecutor.java:304)
: 1 time, servers with issues: netdb.master.com,16020,1534483987681
Result[0]: type = Result; keyvalues=NONE
Result[1]: type = Result; keyvalues={row1/colfam1:qual1/1/Put/vlen=4/seqid=0}
Result[2]: type = Result; keyvalues=NONE
Result[3]: type = NoSuchColumnFamilyException; org.apache.hadoop.hbase.regionserver.NoSuchColumnFamilyException: org.apache.hadoop.hbase.regionserver.NoSuchColumnFamilyException: Column family BOGUS does not exist in region testtable,,1534509855860.29ca3a407eb659d7914476a1dcb6ec58. in table 'testtable', {NAME => 'colfam1', VERSIONS => '10', EVICT_BLOCKS_ON_CLOSE => 'false', NEW_VERSION_BEHAVIOR => 'false', KEEP_DELETED_CELLS => 'FALSE', CACHE_DATA_ON_WRITE => 'false', DATA_BLOCK_ENCODING => 'NONE', TTL => 'FOREVER', MIN_VERSIONS => '0', REPLICATION_SCOPE => '0', BLOOMFILTER => 'ROW', CACHE_INDEX_ON_WRITE => 'false', IN_MEMORY => 'false', CACHE_BLOOMS_ON_WRITE => 'false', PREFETCH_BLOCKS_ON_OPEN => 'false', COMPRESSION => 'NONE', BLOCKCACHE => 'true', BLOCKSIZE => '65536'}, {NAME => 'colfam2', VERSIONS => '10', EVICT_BLOCKS_ON_CLOSE => 'false', NEW_VERSION_BEHAVIOR => 'false', KEEP_DELETED_CELLS => 'FALSE', CACHE_DATA_ON_WRITE => 'false', DATA_BLOCK_ENCODING => 'NONE', TTL => 'FOREVER', MIN_VERSIONS => '0', REPLICATION_SCOPE => '0', BLOOMFILTER => 'ROW', CACHE_INDEX_ON_WRITE => 'false', IN_MEMORY => 'false', CACHE_BLOOMS_ON_WRITE => 'false', PREFETCH_BLOCKS_ON_OPEN => 'false', COMPRESSION => 'NONE', BLOCKCACHE => 'true', BLOCKSIZE => '65536'}
at org.apache.hadoop.hbase.regionserver.HRegion.checkFamily(HRegion.java:7982)
at org.apache.hadoop.hbase.regionserver.HRegion.prepareGet(HRegion.java:7269)
at org.apache.hadoop.hbase.regionserver.RSRpcServices.get(RSRpcServices.java:2512)
at org.apache.hadoop.hbase.regionserver.RSRpcServices.doNonAtomicRegionMutation(RSRpcServices.java:834)
at org.apache.hadoop.hbase.regionserver.RSRpcServices.multi(RSRpcServices.java:2673)
at org.apache.hadoop.hbase.shaded.protobuf.generated.ClientProtos$ClientService$2.callBlockingMethod(ClientProtos.java:42014)
at org.apache.hadoop.hbase.ipc.RpcServer.call(RpcServer.java:409)
at org.apache.hadoop.hbase.ipc.CallRunner.run(CallRunner.java:130)
at org.apache.hadoop.hbase.ipc.RpcExecutor$Handler.run(RpcExecutor.java:324)
at org.apache.hadoop.hbase.ipc.RpcExecutor$Handler.run(RpcExecutor.java:304)