A GenericUDF Function to Extract a Field From an Array of Structs
I am trying to write a GenericUDF function to collect all values of a specific struct field within an array, for each record, and return them in an array as well.
I wrote the GenericUDF (as below), and it seems to work, but:
1) It does not work when I run it on an external table; it works fine on a managed table. Any idea why?
2) I am having a tough time writing a test for this. I have attached the test I have so far, and it does not work; I always get 'java.util.ArrayList cannot be cast to org.apache.hadoop.hive.serde2.objectinspector.ObjectInspector' or 'cannot cast String to LazyString'. My question is: how do I supply a list of structs to the evaluate method?
Any help will be greatly appreciated.
The table:
CREATE EXTERNAL TABLE FOO (
TS string,
customerId string,
products array< struct<productCategory:string> >
)
PARTITIONED BY (ds string)
ROW FORMAT SERDE 'some.serde'
WITH SERDEPROPERTIES ('error.ignore'='true')
LOCATION 'some_locations'
;
A row of data holds:
1340321132000, 'some_company', [{"productCategory":"footwear"},{"productCategory":"eyewear"}]
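For illustration, the intended usage is something like the following (the jar path is a placeholder, and the function name is whatever you register):
ADD JAR /path/to/my-udfs.jar;
CREATE TEMPORARY FUNCTION extract_product_category AS 'GenericUDFExtractProductCategory';
-- for the sample record above this should return ["footwear","eyewear"]
SELECT extract_product_category(products) FROM FOO;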
This is my code:
import org.apache.hadoop.hive.ql.exec.Description;
import org.apache.hadoop.hive.ql.exec.UDFArgumentException;
import org.apache.hadoop.hive.ql.exec.UDFArgumentLengthException;
import org.apache.hadoop.hive.ql.exec.UDFArgumentTypeException;
import org.apache.hadoop.hive.ql.metadata.HiveException;
import org.apache.hadoop.hive.ql.udf.generic.GenericUDF;
import org.apache.hadoop.hive.serde2.lazy.LazyString;
import org.apache.hadoop.hive.serde2.objectinspector.ListObjectInspector;
import org.apache.hadoop.hive.serde2.objectinspector.ObjectInspector;
import org.apache.hadoop.hive.serde2.objectinspector.ObjectInspector.Category;
import org.apache.hadoop.hive.serde2.objectinspector.ObjectInspectorFactory;
import org.apache.hadoop.hive.serde2.objectinspector.PrimitiveObjectInspector;
import org.apache.hadoop.hive.serde2.objectinspector.StructField;
import org.apache.hadoop.hive.serde2.objectinspector.StructObjectInspector;
import org.apache.hadoop.hive.serde2.objectinspector.primitive.PrimitiveObjectInspectorFactory;
import org.apache.hadoop.hive.serde2.objectinspector.primitive.StringObjectInspector;
import org.apache.hadoop.io.Text;
import java.util.ArrayList;
@Description(name = "extract_product_category",
value = "_FUNC_( array< struct<productcategory:string> > ) - Collect all product category field values inside an array of struct(s), and return the results in an array<string>",
extended = "Example:\n SELECT _FUNC_(array_of_structs_with_product_category_field)")
public class GenericUDFExtractProductCategory
extends GenericUDF
{
private ArrayList ret;
private ListObjectInspector listOI;
private StructObjectInspector structOI;
private ObjectInspector prodCatOI;
@Override
public ObjectInspector initialize(ObjectInspector[] args)
throws UDFArgumentException
{
if (args.length != 1) {
throw new UDFArgumentLengthException("The function extract_product_category() requires exactly one argument.");
}
if (args[0].getCategory() != Category.LIST) {
throw new UDFArgumentTypeException(0, "Type array<struct> is expected to be the argument for extract_product_category but " + args[0].getTypeName() + " is found instead");
}
listOI = ((ListObjectInspector) args[0]);
structOI = ((StructObjectInspector) listOI.getListElementObjectInspector());
if (structOI.getAllStructFieldRefs().size() != 1) {
throw new UDFArgumentTypeException(0, "Incorrect number of fields in the struct, should be one");
}
StructField productCategoryField = structOI.getStructFieldRef("productCategory");
//If not, throw exception
if (productCategoryField == null) {
throw new UDFArgumentTypeException(0, "NO \"productCategory\" field in input structure");
}
//Are they of the correct types?
//We store these object inspectors for use in the evaluate() method
prodCatOI = productCategoryField.getFieldObjectInspector();
//First are they primitives
if (prodCatOI.getCategory() != Category.PRIMITIVE) {
throw new UDFArgumentTypeException(0, "productCategory field must be of string type");
}
//Are they of the correct primitives?
if (((PrimitiveObjectInspector)prodCatOI).getPrimitiveCategory() != PrimitiveObjectInspector.PrimitiveCategory.STRING) {
throw new UDFArgumentTypeException(0, "productCategory field must be of string type");
}
ret = new ArrayList();
return ObjectInspectorFactory.getStandardListObjectInspector(PrimitiveObjectInspectorFactory.writableStringObjectInspector);
}
@Override
public ArrayList evaluate(DeferredObject[] arguments)
throws HiveException
{
ret.clear();
if (arguments.length != 1) {
return null;
}
if (arguments[0].get() == null) {
return null;
}
int numElements = listOI.getListLength(arguments[0].get());
for (int i = 0; i < numElements; i++) {
LazyString prodCatDataObject = (LazyString) (structOI.getStructFieldData(listOI.getListElement(arguments[0].get(), i), structOI.getStructFieldRef("productCategory")));
Text productCategoryValue = ((StringObjectInspector) prodCatOI).getPrimitiveWritableObject(prodCatDataObject);
ret.add(productCategoryValue);
}
return ret;
}
@Override
public String getDisplayString(String[] strings)
{
assert (strings.length > 0);
StringBuilder sb = new StringBuilder();
sb.append("extract_product_category(");
sb.append(strings[0]);
sb.append(")");
return sb.toString();
}
}
My test:
import org.apache.hadoop.hive.ql.metadata.HiveException;
import org.apache.hadoop.hive.ql.udf.generic.GenericUDF;
import org.apache.hadoop.hive.ql.udf.generic.GenericUDF.DeferredObject;
import org.apache.hadoop.hive.serde2.objectinspector.ListObjectInspector;
import org.apache.hadoop.hive.serde2.objectinspector.ObjectInspector;
import org.apache.hadoop.hive.serde2.objectinspector.ObjectInspectorFactory;
import org.apache.hadoop.hive.serde2.objectinspector.StructObjectInspector;
import org.apache.hadoop.hive.serde2.objectinspector.primitive.PrimitiveObjectInspectorFactory;
import org.testng.annotations.Test;
import java.util.ArrayList;
import java.util.List;
public class TestGenericUDFExtractShas
{
ArrayList<String> fieldNames = new ArrayList<String>();
ArrayList<ObjectInspector> fieldObjectInspectors = new ArrayList<ObjectInspector>();
@Test
public void simpleTest()
throws Exception
{
ListObjectInspector firstInspector = new MyListObjectInspector();
ArrayList test = new ArrayList();
test.add("test");
ArrayList test2 = new ArrayList();
test2.add(test);
StructObjectInspector soi = ObjectInspectorFactory.getStandardStructObjectInspector(test, test2);
fieldNames.add("productCategory");
fieldObjectInspectors.add(PrimitiveObjectInspectorFactory.writableStringObjectInspector);
GenericUDF.DeferredObject firstDeferredObject = new MyDeferredObject(test2);
GenericUDF extract_product_category = new GenericUDFExtractProductCategory();
extract_product_category.initialize(new ObjectInspector[]{firstInspector});
extract_product_category.evaluate(new DeferredObject[]{firstDeferredObject});
}
public class MyDeferredObject implements DeferredObject
{
private Object value;
public MyDeferredObject(Object value) {
this.value = value;
}
@Override
public Object get() throws HiveException
{
return value;
}
}
private class MyListObjectInspector implements ListObjectInspector
{
@Override
public ObjectInspector getListElementObjectInspector()
{
return ObjectInspectorFactory.getStandardStructObjectInspector(fieldNames, fieldObjectInspectors);
}
@Override
public Object getListElement(Object data, int index)
{
List myList = (List) data;
if (myList == null || index > myList.size()) {
return null;
}
return myList.get(index);
}
@Override
public int getListLength(Object data)
{
if (data == null) {
return -1;
}
return ((List) data).size();
}
@Override
public List<?> getList(Object data)
{
return (List) data;
}
@Override
public String getTypeName()
{
return null; //To change body of implemented methods use File | Settings | File Templates.
}
@Override
public Category getCategory()
{
return Category.LIST;
}
}
}
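For reference, here is a sketch of how I believe the inputs could be built with Hive's standard object inspectors instead of hand-rolled ones (illustrative only, and it assumes an additional import of java.util.Arrays; note that getStandardStructObjectInspector wants a list of field names and a list of ObjectInspectors, and that with plain Java Strings as field values the LazyString cast in evaluate() still fails, which matches the second error above):
List<String> names = Arrays.asList("productCategory");
List<ObjectInspector> inspectors =
        Arrays.<ObjectInspector>asList(PrimitiveObjectInspectorFactory.javaStringObjectInspector);
StructObjectInspector structOI =
        ObjectInspectorFactory.getStandardStructObjectInspector(names, inspectors);
ListObjectInspector listOI =
        ObjectInspectorFactory.getStandardListObjectInspector(structOI);
// In the standard representation a struct is a List of its field values,
// and the array column is a List of such Lists.
List<Object> row = Arrays.<Object>asList(
        Arrays.<Object>asList("footwear"),
        Arrays.<Object>asList("eyewear"));
GenericUDF udf = new GenericUDFExtractProductCategory();
udf.initialize(new ObjectInspector[]{listOI});
Object result = udf.evaluate(new DeferredObject[]{new MyDeferredObject(row)});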
I can't speak to the testing, but, with a caveat discussed below, I think I have a solution for the issue with external tables.
In adapting your code to my needs, I changed string to long in the evaluate method.
Your code:
LazyString prodCatDataObject = (LazyString) (structOI.getStructFieldData(listOI.getListElement(arguments[0].get(), i), structOI.getStructFieldRef("productCategory")));
Text productCategoryValue = ((StringObjectInspector) prodCatOI).getPrimitiveWritableObject(prodCatDataObject);
My old code:
LazyLong indDataObject = (LazyLong) (structOI.getStructFieldData(listOI.getListElement(arguments[0].get(), i), structOI.getStructFieldRef(indexName)));
LongWritable indValue = ((LazyLongObjectInspector) indOI).getPrimitiveWritableObject(indDataObject);
You can see they use the same logic with different data types. This worked for me on a non-external (managed) table, but it did not work on an external table.
I was able to resolve this by replacing my old code with the following:
long indValue = (Long) (structOI.getStructFieldData(listOI.getListElement(arguments[0].get(), i), structOI.getStructFieldRef(indexName)));
In another version, where I was returning text, you can probably do something similar, namely by casting to text / string in the first step. You may also have to change public Text evaluate(DeferredObject[] arguments) to public Object evaluate(DeferredObject[] arguments).
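Translated back to the productCategory example above, the equivalent change would look roughly like this (a sketch of the substitution just described; whether the deserialized field value is a java.lang.String, a Text, or a lazy object depends on the table's SerDe, so the cast is an assumption you will want to verify):
Object fieldData = structOI.getStructFieldData(
        listOI.getListElement(arguments[0].get(), i),
        structOI.getStructFieldRef("productCategory"));
// direct cast instead of the LazyString route, mirroring the (Long) cast above
String productCategoryValue = (String) fieldData;
ret.add(new Text(productCategoryValue));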
Source code for some working UDFs that handle arrays is available here.
Now for the caveat: this does not appear to work with tables stored as ORC (neither does the original code, mind you). I am not sure what the issue is; I will probably create a question about it.