I need to serialize BigDecimal values in Hadoop. I'm currently using Apache Pig's BigDecimalWritable, but Pig seems to be completely outdated:
<dependency>
    <groupId>org.apache.pig</groupId>
    <artifactId>pig</artifactId>
    <version>0.17.0</version>
</dependency>
This version is five years old!
Instead, I now use:
<dependency>
    <groupId>org.apache.parquet</groupId>
    <artifactId>parquet-hive-storage-handler</artifactId>
    <version>1.11.2</version>
</dependency>
<dependency>
    <groupId>org.apache.hive</groupId>
    <artifactId>hive-serde</artifactId>
    <version>4.0.0-alpha-1</version>
    <exclusions>
        <exclusion>
            <groupId>org.apache.logging.log4j</groupId>
            <artifactId>log4j-slf4j-impl</artifactId>
        </exclusion>
    </exclusions>
</dependency>
These artifacts also provide a BigDecimalWritable and are more up to date.
Is there a better solution?
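One alternative I've considered is avoiding the heavyweight Pig/Hive dependencies entirely and writing a small custom Writable, since the pattern is just "serialize scale + unscaled bytes". Below is a minimal sketch (my own, not from Pig or Hive); it uses only the JDK so it is self-contained here, but in a real MapReduce job the class would additionally `implements org.apache.hadoop.io.WritableComparable<BigDecimalWritable>`, whose `write`/`readFields` methods have exactly the signatures shown:

```java
import java.io.DataInput;
import java.io.DataOutput;
import java.io.IOException;
import java.math.BigDecimal;
import java.math.BigInteger;

// Minimal stand-in for Hadoop's Writable pattern. In a Hadoop job, add:
//   implements org.apache.hadoop.io.WritableComparable<BigDecimalWritable>
public class BigDecimalWritable implements Comparable<BigDecimalWritable> {
    private BigDecimal value = BigDecimal.ZERO;

    public BigDecimalWritable() {}                      // required no-arg ctor
    public BigDecimalWritable(BigDecimal value) { this.value = value; }

    public void set(BigDecimal value) { this.value = value; }
    public BigDecimal get() { return value; }

    // Wire format: scale (int), unscaled byte length (int), unscaled bytes.
    public void write(DataOutput out) throws IOException {
        byte[] unscaled = value.unscaledValue().toByteArray();
        out.writeInt(value.scale());
        out.writeInt(unscaled.length);
        out.write(unscaled);
    }

    public void readFields(DataInput in) throws IOException {
        int scale = in.readInt();
        byte[] unscaled = new byte[in.readInt()];
        in.readFully(unscaled);
        value = new BigDecimal(new BigInteger(unscaled), scale);
    }

    @Override
    public int compareTo(BigDecimalWritable other) {
        return value.compareTo(other.value);
    }

    @Override
    public String toString() {
        return value.toPlainString();
    }
}
```

This keeps the job's classpath free of Pig or Hive, at the cost of maintaining ~40 lines yourself; whether that beats pulling in `hive-serde` depends on how much of the rest of those libraries you actually use.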