Mapping a list of objects using parallelStream with DozerMapper gives StackOverflowError
I have the following utility method to map a List of domain objects to DTOs, resulting in a list of the mapped objects.
public static <Z, T> List<T> mapList(Mapper mapper, List<Z> source, Class<T> type) {
    List<T> result = new ArrayList<T>();
    int listSize = source.size();
    for (int i = 0; i < listSize; i++) {
        result.add(mapper.map(source.get(i), type));
    }
    return result;
}
As the mapper I pass a singleton instance of DozerBeanMapper (the instance is managed by Spring). The List source is the result of a Hibernate query. The above code works just fine.
Now, I've changed the code to make use of the Stream API (I wanted to parallelize the mapping):
public static <Z, T> List<T> mapList(Mapper mapper, List<Z> source, Class<T> type) {
    return source.parallelStream()
                 .map((s) -> mapper.map(s, type))
                 .collect(Collectors.toList());
}
And I get the following stack trace:
Caused by: java.lang.NullPointerException
at org.hibernate.engine.internal.StatefulPersistenceContext.getLoadedCollectionOwnerOrNull(StatefulPersistenceContext.java:755)
at org.hibernate.event.spi.AbstractCollectionEvent.getLoadedOwnerOrNull(AbstractCollectionEvent.java:75)
at org.hibernate.event.spi.InitializeCollectionEvent.<init>(InitializeCollectionEvent.java:36)
at org.hibernate.internal.SessionImpl.initializeCollection(SessionImpl.java:1895)
at org.hibernate.collection.internal.AbstractPersistentCollection$4.doWork(AbstractPersistentCollection.java:558)
at org.hibernate.collection.internal.AbstractPersistentCollection.withTemporarySessionIfNeeded(AbstractPersistentCollection.java:260)
at org.hibernate.collection.internal.AbstractPersistentCollection.initialize(AbstractPersistentCollection.java:554)
at org.hibernate.collection.internal.AbstractPersistentCollection.read(AbstractPersistentCollection.java:142)
at org.hibernate.collection.internal.PersistentSet.iterator(PersistentSet.java:180)
at org.dozer.MappingProcessor.addOrUpdateToList(MappingProcessor.java:766)
at org.dozer.MappingProcessor.addOrUpdateToList(MappingProcessor.java:850)
at org.dozer.MappingProcessor.mapListToList(MappingProcessor.java:686)
at org.dozer.MappingProcessor.mapCollection(MappingProcessor.java:553)
at org.dozer.MappingProcessor.mapOrRecurseObject(MappingProcessor.java:434)
at org.dozer.MappingProcessor.mapFromFieldMap(MappingProcessor.java:342)
at org.dozer.MappingProcessor.mapField(MappingProcessor.java:288)
at org.dozer.MappingProcessor.map(MappingProcessor.java:248)
at org.dozer.MappingProcessor.map(MappingProcessor.java:197)
at org.dozer.MappingProcessor.mapCustomObject(MappingProcessor.java:495)
at org.dozer.MappingProcessor.mapOrRecurseObject(MappingProcessor.java:446)
at org.dozer.MappingProcessor.mapFromFieldMap(MappingProcessor.java:342)
at org.dozer.MappingProcessor.mapField(MappingProcessor.java:288)
at org.dozer.MappingProcessor.map(MappingProcessor.java:248)
at org.dozer.MappingProcessor.map(MappingProcessor.java:197)
at org.dozer.MappingProcessor.map(MappingProcessor.java:187)
at org.dozer.MappingProcessor.map(MappingProcessor.java:124)
at org.dozer.MappingProcessor.map(MappingProcessor.java:119)
at org.dozer.DozerBeanMapper.map(DozerBeanMapper.java:120)
at org.mycompany.myproject.utils.BeanMapperUtil.lambda$0(BeanMapperUtil.java:30)
The exception repeats itself and finally turns into a StackOverflowError.
If I use source.stream() instead of source.parallelStream(), I don't get any errors.
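The difference between the two variants can be illustrated on its own: a sequential stream runs every element on the calling thread, while a parallel stream may fan work out to common ForkJoinPool worker threads. This is a minimal, self-contained sketch (not using Dozer or Hibernate) that just records which threads execute the pipeline:

```java
import java.util.Set;
import java.util.concurrent.ConcurrentHashMap;
import java.util.stream.IntStream;

public class StreamThreads {

    // Collect the names of all threads that execute the stream pipeline.
    static Set<String> threadsUsed(boolean parallel) {
        Set<String> names = ConcurrentHashMap.newKeySet();
        IntStream range = IntStream.range(0, 10_000);
        (parallel ? range.parallel() : range)
            .forEach(i -> names.add(Thread.currentThread().getName()));
        return names;
    }

    public static void main(String[] args) {
        // A sequential stream executes entirely on the calling thread.
        System.out.println("sequential: " + threadsUsed(false));
        // A parallel stream may additionally use ForkJoinPool.commonPool()
        // workers, so anything thread-bound (like a Hibernate Session that
        // lazily initializes collections) can be touched from threads it
        // was never attached to.
        System.out.println("parallel:   " + threadsUsed(true));
    }
}
```

In my case the mapped entities still contain uninitialized lazy collections, which Dozer triggers during mapping, so I suspect this thread fan-out is where the Hibernate machinery gets involved.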
Any ideas?