When trying to store


private ZoneId zone;


I get the following error:


one.microstream.persistence.exceptions.PersistenceExceptionTypeHandlerConsistencyUnhandledTypeId: No type handler found for type id "1000124".


Looking in PersistenceTypeDictionary.ptd, it points to this:


0000000000001000124 java.time.ZoneRegion{
   java.lang.String java.time.ZoneRegion#id,
}


Please add support for java.time.ZoneId in MicroStream.

It should be possible to persist a ZoneId implemented by the ZoneRegion class.

I was not able to reproduce that problem in a short test; there must be some other cause for this issue.

The entry in the typeDictionary indicates that a ZoneRegion instance has been persisted successfully at least once.


Could you provide me some additional information like the stack trace of the exception or a code snippet?

The User class has this property


private ZoneId zone;


and it's stored with


user.setZone(ZoneId.of("Europe/Berlin"));


No error is shown when storing it. However, after restarting, opening the storage.restclient.app, connecting to the root instance, and clicking the user instance, the following stack trace is shown:


13:29:16,574 ERROR [spark.http.matching.GeneralError] : one.microstream.persistence.exceptions.PersistenceExceptionTypeHandlerConsistencyUnhandledTypeId: No type handler found for type id "1000124".
   at test.war//one.microstream.persistence.binary.types.BinaryLoader$Default.lookupTypeHandler(BinaryLoader.java:190)
   at test.war//one.microstream.persistence.binary.types.BinaryLoader$Default.createBuildItem(BinaryLoader.java:198)
   at test.war//one.microstream.persistence.binary.types.BinaryLoader$Default.internalReadBinaryEntities(BinaryLoader.java:154)
   at test.war//one.microstream.persistence.binary.types.BinaryLoader$Default.readBinaryEntities(BinaryLoader.java:142)
   at test.war//one.microstream.persistence.binary.types.ChunksBuffer.iterateEntityDataLocal(ChunksBuffer.java:266)
   at test.war//one.microstream.persistence.binary.types.ChunksBuffer.iterateEntityData(ChunksBuffer.java:275)
   at test.war//one.microstream.persistence.binary.types.BinaryLoader$Default.addChunks(BinaryLoader.java:801)
   at test.war//one.microstream.persistence.binary.types.BinaryLoader$Default.readLoadOidData(BinaryLoader.java:771)
   at test.war//one.microstream.persistence.binary.types.BinaryLoader$Default.getObject(BinaryLoader.java:839)
   at test.war//one.microstream.storage.restadapter.ViewerBinaryPersistenceManager$Default.getStorageObject(ViewerBinaryPersistenceManager.java:291)
   at test.war//one.microstream.storage.restadapter.EmbeddedStorageRestAdapter$Default.getStorageObject(EmbeddedStorageRestAdapter.java:89)
   at test.war//one.microstream.storage.restadapter.ObjectDescription.resolveReference(ObjectDescription.java:190)
   at test.war//one.microstream.storage.restadapter.ObjectDescription.lambda$2(ObjectDescription.java:149)
   at java.base/java.util.stream.ReferencePipeline$3$1.accept(ReferencePipeline.java:195)
   at java.base/java.util.stream.ReferencePipeline$3$1.accept(ReferencePipeline.java:195)
   at java.base/java.util.stream.ReferencePipeline$2$1.accept(ReferencePipeline.java:177)
   at java.base/java.util.stream.SliceOps$1$1.accept(SliceOps.java:199)
   at java.base/java.util.Spliterators$ArraySpliterator.tryAdvance(Spliterators.java:958)
   at java.base/java.util.stream.ReferencePipeline.forEachWithCancel(ReferencePipeline.java:127)
   at java.base/java.util.stream.AbstractPipeline.copyIntoWithCancel(AbstractPipeline.java:502)
   at java.base/java.util.stream.AbstractPipeline.copyInto(AbstractPipeline.java:488)
   at java.base/java.util.stream.AbstractPipeline.wrapAndCopyInto(AbstractPipeline.java:474)
   at java.base/java.util.stream.ForEachOps$ForEachOp.evaluateSequential(ForEachOps.java:150)
   at java.base/java.util.stream.ForEachOps$ForEachOp$OfRef.evaluateSequential(ForEachOps.java:173)
   at java.base/java.util.stream.AbstractPipeline.evaluate(AbstractPipeline.java:234)
   at java.base/java.util.stream.ReferencePipeline.forEach(ReferencePipeline.java:497)
   at test.war//one.microstream.storage.restadapter.ObjectDescription.resolveReferences(ObjectDescription.java:150)
   at test.war//one.microstream.storage.restadapter.StorageRestAdapter$Default.getObject(StorageRestAdapter.java:97)
   at test.war//one.microstream.storage.restservice.sparkjava.RouteGetObject.handle(RouteGetObject.java:36)
   at test.war//one.microstream.storage.restservice.sparkjava.RouteGetObject.handle(RouteGetObject.java:1)
   at test.war//spark.RouteImpl$1.handle(RouteImpl.java:72)
   at test.war//spark.http.matching.Routes.execute(Routes.java:61)
   at test.war//spark.http.matching.MatcherFilter.doFilter(MatcherFilter.java:134)
   at test.war//spark.embeddedserver.jetty.JettyHandler.doHandle(JettyHandler.java:50)
   at test.war//org.eclipse.jetty.server.session.SessionHandler.doScope(SessionHandler.java:1568)
   at test.war//org.eclipse.jetty.server.handler.ScopedHandler.handle(ScopedHandler.java:144)
   at test.war//org.eclipse.jetty.server.handler.HandlerWrapper.handle(HandlerWrapper.java:132)
   at test.war//org.eclipse.jetty.server.Server.handle(Server.java:503)
   at test.war//org.eclipse.jetty.server.HttpChannel.handle(HttpChannel.java:364)
   at test.war//org.eclipse.jetty.server.HttpConnection.onFillable(HttpConnection.java:260)
   at test.war//org.eclipse.jetty.io.AbstractConnection$ReadCallback.succeeded(AbstractConnection.java:305)
   at test.war//org.eclipse.jetty.io.FillInterest.fillable(FillInterest.java:103)
   at test.war//org.eclipse.jetty.io.ChannelEndPoint$2.run(ChannelEndPoint.java:118)
   at test.war//org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.runTask(EatWhatYouKill.java:333)
   at test.war//org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.doProduce(EatWhatYouKill.java:310)
   at test.war//org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.tryProduce(EatWhatYouKill.java:168)
   at test.war//org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.run(EatWhatYouKill.java:126)
   at test.war//org.eclipse.jetty.util.thread.ReservedThreadExecutor$ReservedThread.run(ReservedThreadExecutor.java:366)
   at test.war//org.eclipse.jetty.util.thread.QueuedThreadPool.runJob(QueuedThreadPool.java:765)
   at test.war//org.eclipse.jetty.util.thread.QueuedThreadPool$2.run(QueuedThreadPool.java:683)
   at java.base/java.lang.Thread.run(Thread.java:835)



Re-creating the database without storing a ZoneId and opening the REST client again does not throw any error, so the error must be related to ZoneId.
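
For reference, here is a condensed sketch of my setup (the Reproducer/DataRoot class names and the wiring are simplified placeholders, not the real production code, and the MicroStream package names may differ slightly between versions):


import java.time.ZoneId;

import one.microstream.storage.restservice.StorageRestService;
import one.microstream.storage.restservice.StorageRestServiceResolver;
import one.microstream.storage.types.EmbeddedStorage;
import one.microstream.storage.types.EmbeddedStorageManager;

public class Reproducer
{
	static class User
	{
		private ZoneId zone;

		void setZone(final ZoneId zone) { this.zone = zone; }
		ZoneId getZone()                { return this.zone;  }
	}

	static class DataRoot
	{
		final User user = new User();
	}

	public static void main(final String[] args)
	{
		final DataRoot root = new DataRoot();
		final EmbeddedStorageManager storage = EmbeddedStorage.start(root);
		storage.store(root); // persist the initial graph (root and user), without a zone yet

		// the REST service for storage.restclient.app is started right away,
		// i.e. before a ZoneId (java.time.ZoneRegion) has ever been persisted
		final StorageRestService restService = StorageRestServiceResolver.resolve(storage);
		restService.start();

		// later the first ZoneRegion instance is stored; no error at this point
		root.user.setZone(ZoneId.of("Europe/Berlin"));
		storage.store(root.user);

		// after a restart, clicking the user instance in the REST client fails
	}
}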



Thanks for the stack trace, it is a great help. With it I was able to create a scenario that produces the error you described.

As the trace shows, the exception does not come from the storage's persistence layer; it is thrown by the Viewer (ViewerBinaryPersistenceManager.java:291). With that information I can say that you discovered a bug in the Viewer, but the storage itself is not affected. The data should be loaded correctly on startup; if not, please let me know.
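
If you want to double-check that on your side, something along these lines should read the value back after a restart; this is only a minimal sketch reusing the illustrative classes from your snippet above, and root() is the root accessor of the current API, which may be named differently in older versions:


import one.microstream.storage.types.EmbeddedStorage;
import one.microstream.storage.types.EmbeddedStorageManager;

public class StartupCheck
{
	public static void main(final String[] args)
	{
		// restart against the existing storage directory written by the reproducer above
		final EmbeddedStorageManager storage = EmbeddedStorage.start();
		final Reproducer.DataRoot root = (Reproducer.DataRoot)storage.root();

		// if the storage itself is intact this prints "Europe/Berlin"
		System.out.println(root.user.getZone());

		storage.shutdown();
	}
}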


What happens is that the restservice reads the persisted data chunk of the ZoneRegion but does not find a handler for it. The restservice uses different handlers than the storage because it does not re-create the persisted objects. Instead, it creates an object that holds all the information about the persisted object to be displayed by the Viewer.

Those handlers are created during the restservice initialization for all types already persisted by the storage, but there seems to be a problem if new types are added at runtime.

This is an issue we must fix.

As a workaround, you may move the restservice initialization to a point where the storage has already persisted all types at least once.
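
In code, the workaround could look roughly like this, again reusing the illustrative classes from the sketch above; the essential part is only that the REST service is resolved and started after the relevant types have been persisted:


import java.time.ZoneId;

import one.microstream.storage.restservice.StorageRestService;
import one.microstream.storage.restservice.StorageRestServiceResolver;
import one.microstream.storage.types.EmbeddedStorage;
import one.microstream.storage.types.EmbeddedStorageManager;

public class WorkaroundOrdering
{
	public static void main(final String[] args)
	{
		final Reproducer.DataRoot root = new Reproducer.DataRoot();
		final EmbeddedStorageManager storage = EmbeddedStorage.start(root);
		storage.store(root); // persist the initial graph

		// first make sure every type the Viewer will encounter has been persisted at least once
		root.user.setZone(ZoneId.of("Europe/Berlin"));
		storage.store(root.user);

		// only then initialize the REST service, so its handlers are created for all persisted types
		final StorageRestService restService = StorageRestServiceResolver.resolve(storage);
		restService.start();
	}
}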

Ok, thanks for your help. I hope it will be fixed in the near future.