
How to convert a nested Avro GenericRecord to a Row


I have some code that converts my Avro records to Rows using the function avroToRowConverter():

directKafkaStream.foreachRDD(rdd -> {
    JavaRDD<Row> newRDD = rdd.map(x -> {
        // Decode the Kafka message value (bytes) back into an Avro
        // GenericRecord, then convert it to a Spark SQL Row.
        Injection<GenericRecord, byte[]> recordInjection =
                GenericAvroCodecs.toBinary(SchemaRegstryClient.getLatestSchema("poc2"));
        return avroToRowConverter(recordInjection.invert(x._2).get());
    });
});

This function does not work for nested schemas (TYPE = UNION); an example of a schema that trips it up is shown after the function below.

private static Row avroToRowConverter(GenericRecord avroRecord) {
    if (avroRecord == null) {
        return null;
    }
    Object[] objectArray = new Object[avroRecord.getSchema().getFields().size()];
    StructType structType = (StructType) SchemaConverters.toSqlType(avroRecord.getSchema()).dataType();
    for (Schema.Field field : avroRecord.getSchema().getFields()) {
        if (field.schema().getType() == Schema.Type.STRING
                || field.schema().getType() == Schema.Type.ENUM) {
            // Avro returns Utf8/EnumSymbol here; Spark expects java.lang.String.
            objectArray[field.pos()] = String.valueOf(avroRecord.get(field.pos()));
        } else {
            // Everything else is copied through flat -- this is where nested
            // records and unions fall through untranslated.
            objectArray[field.pos()] = avroRecord.get(field.pos());
        }
    }
    return new GenericRowWithSchema(objectArray, structType);
}
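
For context, a hypothetical schema of the kind that breaks it: a nested record wrapped in the usual ["null", ...] union (the field names here are made up for illustration):

// Hypothetical schema: "address" is a nested record inside a nullable union,
// which the flat converter above copies through as a raw GenericRecord
// instead of a nested Row.
Schema schema = new Schema.Parser().parse(
    "{\"type\":\"record\",\"name\":\"User\",\"fields\":["
    + "{\"name\":\"name\",\"type\":\"string\"},"
    + "{\"name\":\"address\",\"type\":[\"null\","
    + "{\"type\":\"record\",\"name\":\"Address\",\"fields\":["
    + "{\"name\":\"city\",\"type\":\"string\"}]}]}]}");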

Can anyone suggest how I can convert a complex schema to a Row?

Solution:

There is SchemaConverters.createConverterToSQL, but unfortunately it is private.
There were PRs to make it public, but they were never merged:

> https://github.com/databricks/spark-avro/pull/89
> https://github.com/databricks/spark-avro/pull/132

We used a workaround.

You can expose it by creating a class inside the com.databricks.spark.avro package; since the method is package-private, code living in that package can call it:

package com.databricks.spark.avro

import org.apache.avro.Schema
import org.apache.avro.generic.GenericRecord
import org.apache.spark.sql.Row
import org.apache.spark.sql.types.DataType

object MySchemaConversions {
  def createConverterToSQL(avroSchema: Schema, sparkSchema: DataType): (GenericRecord) => Row =
    SchemaConverters.createConverterToSQL(avroSchema, sparkSchema).asInstanceOf[(GenericRecord) => Row]
}

Then you can use it in your Java code like this:

final DataType myAvroType = SchemaConverters.toSqlType(MyAvroRecord.getClassSchema()).dataType();

final Function1<GenericRecord, Row> myAvroRecordConverter =
        MySchemaConversions.createConverterToSQL(MyAvroRecord.getClassSchema(), myAvroType);

Row[] convertAvroRecordsToRows(List<GenericRecord> records) {
    return records.stream().map(myAvroRecordConverter::apply).toArray(Row[]::new);
}
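
Tying it back to the original streaming job, the converter can take the place of avroToRowConverter inside the map. A sketch, reusing the asker's SchemaRegstryClient helper and assuming myAvroRecordConverter is serializable (otherwise create it inside the lambda):

directKafkaStream.foreachRDD(rdd -> {
    JavaRDD<Row> rowRDD = rdd.map(x -> {
        // Same decoding step as before, but the Row conversion is now done
        // by the converter obtained from MySchemaConversions.
        Injection<GenericRecord, byte[]> recordInjection =
                GenericAvroCodecs.toBinary(SchemaRegstryClient.getLatestSchema("poc2"));
        return myAvroRecordConverter.apply(recordInjection.invert(x._2).get());
    });
});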

For a single record, you can invoke it like this:

final Row row = myAvroRecordConverter.apply(record);
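
If you would rather not depend on spark-avro internals at all, a hand-rolled recursive converter is another way out. A minimal sketch (hypothetical method names), assuming only the common ["null", T] nullable-union pattern; arrays, maps, fixed and multi-branch unions are left out:

// Recursive converter sketch: nested records become nested Rows, and
// nullable unions are unwrapped to their non-null branch.
private static Row avroRecordToRow(GenericRecord avroRecord) {
    if (avroRecord == null) {
        return null;
    }
    Schema schema = avroRecord.getSchema();
    StructType structType = (StructType) SchemaConverters.toSqlType(schema).dataType();
    Object[] values = new Object[schema.getFields().size()];
    for (Schema.Field field : schema.getFields()) {
        values[field.pos()] = avroValueToSql(avroRecord.get(field.pos()), field.schema());
    }
    return new GenericRowWithSchema(values, structType);
}

private static Object avroValueToSql(Object value, Schema schema) {
    switch (schema.getType()) {
        case UNION:
            if (value == null) {
                return null;
            }
            // Assumes the nullable pattern: convert with the first non-null branch.
            for (Schema branch : schema.getTypes()) {
                if (branch.getType() != Schema.Type.NULL) {
                    return avroValueToSql(value, branch);
                }
            }
            return null;
        case RECORD:
            // Nested record becomes a nested Row.
            return avroRecordToRow((GenericRecord) value);
        case STRING:
        case ENUM:
            // Utf8/EnumSymbol -> java.lang.String, as Spark expects.
            return value.toString();
        default:
            return value;
    }
}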

Tags: avro, java, apache-spark, spark-avro
Source: https://codeday.me/bug/20191002/1840746.html