Stargate Scanner Filter Examples
=================================
## Introduction
So yeah... there is no documentation for the HBase REST API regarding what a filter should look like.
So I installed Eclipse, grabbed the library, and took some time to find some of the (seemingly) most useful filters you could use. I'm very green at anything regarding HBase, and I hope this will help anyone trying to get started with it.
What I discovered is that, basically, the attributes of the filter object follow the same naming as in the documentation. For this reason, each heading below is a clickable link to the corresponding HBase class documentation; check the constructor argument names, and you will have your attribute list (more or less).
Don't forget: values are Base64-encoded.
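For example, the `value` attributes in the filters below are Base64 strings. A quick way to produce and check them from the command line (assuming a `base64` utility such as the one in GNU coreutils):

```shell
# Filter values are Base64-encoded; encode a row prefix before
# placing it in a filter's "value" attribute:
printf 'rowprefix' | base64
# → cm93cHJlZml4

# And decode a value coming back from the API:
printf 'cm93cHJlZml4' | base64 --decode
# → rowprefix
```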
## References:
* [HBase REST Filter (SingleColumnValueFilter)](http://stackoverflow.com/questions/9302097/hbase-rest-filter-singlecolumnvaluefilter)
* [HBase Intra-row scanning](http://stackoverflow.com/questions/13119369/hbase-intra-row-scanning)
* [HBase Book / Chapter on Client Filter](http://hbase.apache.org/book/client.filter.html)
### [ColumnPrefixFilter](http://hbase.apache.org/apidocs/org/apache/hadoop/hbase/filter/ColumnPrefixFilter.html)
```
{
  "type": "ColumnPrefixFilter",
  "value": "cHJlZml4"
}
```
### [ColumnRangeFilter](http://hbase.apache.org/apidocs/org/apache/hadoop/hbase/filter/ColumnRangeFilter.html)
```
{
  "type": "ColumnRangeFilter",
  "minColumn": "Zmx1ZmZ5",
  "minColumnInclusive": true,
  "maxColumn": "Zmx1ZmZ6",
  "maxColumnInclusive": false
}
```
### [ColumnPaginationFilter](http://hbase.apache.org/apidocs/org/apache/hadoop/hbase/filter/ColumnPaginationFilter.html)
I could not generate an example for this one, but judging by the other filters it should be simple enough to work out by plugging the constructor arguments (`limit` and `offset`) in the usual way...
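Going by the `ColumnPaginationFilter(int limit, int offset)` constructor and the naming pattern of the other filters, my untested guess is that the spec would look something like this (both attribute names are assumptions, not verified against a running cluster):

```
{
  "type": "ColumnPaginationFilter",
  "limit": 10,
  "offset": 0
}
```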
### [DependentColumnFilter](http://hbase.apache.org/apidocs/org/apache/hadoop/hbase/filter/DependentColumnFilter.html)
No working example for this one either.
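Based on the `DependentColumnFilter(byte[] family, byte[] qualifier, boolean dropDependentColumn, CompareOp valueCompareOp, ... valueComparator)` constructor and the conventions of the other filters here, an untested guess would be (every attribute name below is an assumption):

```
{
  "type": "DependentColumnFilter",
  "op": "EQUAL",
  "family": "ZmFtaWx5",
  "qualifier": "Y29sMQ==",
  "dropDependentColumn": true,
  "comparator": {
    "type": "BinaryComparator",
    "value": "MQ=="
  }
}
```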
### [FamilyFilter](http://hbase.apache.org/apidocs/org/apache/hadoop/hbase/filter/FamilyFilter.html)
```
{
  "type": "FamilyFilter",
  "op": "EQUAL",
  "comparator": {
    "type": "BinaryComparator",
    "value": "dGVzdHJvdw=="
  }
}
```
### [FilterList with RowFilter and ColumnRangeFilter](http://hbase.apache.org/apidocs/org/apache/hadoop/hbase/filter/FilterList.html)
```
{
  "type": "FilterList",
  "op": "MUST_PASS_ALL",
  "filters": [
    {
      "type": "RowFilter",
      "op": "EQUAL",
      "comparator": {
        "type": "BinaryComparator",
        "value": "dGVzdHJvdw=="
      }
    },
    {
      "type": "ColumnRangeFilter",
      "minColumn": "Zmx1ZmZ5",
      "minColumnInclusive": true,
      "maxColumn": "Zmx1ZmZ6",
      "maxColumnInclusive": false
    }
  ]
}
```
### [FirstKeyOnlyFilter (can be used to perform row-count operations more efficiently)](http://hbase.apache.org/apidocs/org/apache/hadoop/hbase/filter/FirstKeyOnlyFilter.html)
```
{
  "type": "FirstKeyOnlyFilter"
}
```
### [InclusiveStopFilter](http://hbase.apache.org/apidocs/org/apache/hadoop/hbase/filter/InclusiveStopFilter.html)
```
{
  "type": "InclusiveStopFilter",
  "value": "cm93a2V5"
}
```
### [MultipleColumnPrefixFilter](http://hbase.apache.org/apidocs/org/apache/hadoop/hbase/filter/MultipleColumnPrefixFilter.html)
```
{
  "type": "MultipleColumnPrefixFilter",
  "prefixes": [
    "YWxwaGE=",
    "YnJhdm8=",
    "Y2hhcmxpZQ=="
  ]
}
```
### [PageFilter](http://hbase.apache.org/apidocs/org/apache/hadoop/hbase/filter/PageFilter.html)
```
{
  "type": "PageFilter",
  "value": "10"
}
```
### [PrefixFilter](http://hbase.apache.org/apidocs/org/apache/hadoop/hbase/filter/PrefixFilter.html)
```
{
  "type": "PrefixFilter",
  "value": "cm93cHJlZml4"
}
```
### [QualifierFilter](http://hbase.apache.org/apidocs/org/apache/hadoop/hbase/filter/QualifierFilter.html)
```
{
  "type": "QualifierFilter",
  "op": "GREATER",
  "comparator": {
    "type": "BinaryComparator",
    "value": "cXVhbGlmaWVycHJlZml4"
  }
}
```
### [RowFilter](http://hbase.apache.org/apidocs/org/apache/hadoop/hbase/filter/RowFilter.html)
```
{
  "type": "RowFilter",
  "op": "EQUAL",
  "comparator": {
    "type": "BinaryComparator",
    "value": "dGVzdHJvdw=="
  }
}
```
### [SingleColumnValueFilter](http://hbase.apache.org/apidocs/org/apache/hadoop/hbase/filter/SingleColumnValueFilter.html)
```
{
  "type": "SingleColumnValueFilter",
  "op": "EQUAL",
  "family": "ZmFtaWx5",
  "qualifier": "Y29sMQ==",
  "latestVersion": true,
  "comparator": {
    "type": "BinaryComparator",
    "value": "MQ=="
  }
}
```
### [TimestampsFilter](http://hbase.apache.org/apidocs/org/apache/hadoop/hbase/filter/TimestampsFilter.html)
```
{
  "type": "TimestampsFilter",
  "timestamps": [
    "1351586939"
  ]
}
```
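To actually use one of these, the filter JSON gets embedded as a string inside the scanner spec when you create a scanner. A rough sketch of the request, following the pattern from the Stargate docs and the Stack Overflow answers linked above (the table name `mytable` and the `localhost:8080` endpoint are assumptions for illustration):

```
PUT /mytable/scanner HTTP/1.1
Host: localhost:8080
Content-Type: text/xml

<Scanner batch="100"><filter>{"type": "PrefixFilter", "value": "cm93cHJlZml4"}</filter></Scanner>
```

On success, Stargate responds with a `Location` header pointing at the new scanner resource, which you then `GET` repeatedly to page through results and `DELETE` when done.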