Radix Scrypto transaction manifest multiple transfer from account - smartcontracts

How do we do multiple transfers of tokens from an account in one manifest? For example:
Account A -- Token A --> Account B
Account A -- Token B --> Account B
Account A -- Token A --> Account C
Account A -- Token B --> Account C
I did this:
CLONE_BUCKET_REF BucketRef(1u32) BucketRef("badge1");
CALL_METHOD Address("0293c502780e23621475989d707cd8128e4506362e5fed6ac0c00a") "withdraw" Decimal("2000") Address("03bcc1960b6f99bae8614c3bf276ee3217f800f5cc7bdc48db9a5f") BucketRef("badge1");
CALL_METHOD_WITH_ALL_RESOURCES Address("02a2a79aa613da237bcda37fd79af36e09eadd195976092cb24696") "deposit_batch";
CLONE_BUCKET_REF BucketRef(1u32) BucketRef("badge2");
CALL_METHOD Address("0293c502780e23621475989d707cd8128e4506362e5fed6ac0c00a") "withdraw" Decimal("2000") Address("031773788de8e4d2947d6592605302d4820ad060ceab06eb2d4711") BucketRef("badge2");
CALL_METHOD_WITH_ALL_RESOURCES Address("02a2a79aa613da237bcda37fd79af36e09eadd195976092cb24696") "deposit_batch";
CLONE_BUCKET_REF BucketRef(1u32) BucketRef("badge3");
CALL_METHOD Address("0293c502780e23621475989d707cd8128e4506362e5fed6ac0c00a") "withdraw" Decimal("2000") Address("03bcc1960b6f99bae8614c3bf276ee3217f800f5cc7bdc48db9a5f") BucketRef("badge3");
CALL_METHOD_WITH_ALL_RESOURCES Address("0236ca00316c8eb5ad51b0cb5e3f232cb871803a85ec3847b36bb4") "deposit_batch";
CLONE_BUCKET_REF BucketRef(1u32) BucketRef("badge4");
CALL_METHOD Address("0293c502780e23621475989d707cd8128e4506362e5fed6ac0c00a") "withdraw" Decimal("2000") Address("031773788de8e4d2947d6592605302d4820ad060ceab06eb2d4711") BucketRef("badge4");
CALL_METHOD_WITH_ALL_RESOURCES Address("0236ca00316c8eb5ad51b0cb5e3f232cb871803a85ec3847b36bb4") "deposit_batch";
But I get this error:
Error: CompileError(GeneratorError(IdValidatorError(BucketRefNotFound(Rid(1)))))
It looks like we lose all BucketRef references when we call CALL_METHOD_WITH_ALL_RESOURCES.

You're indeed correct: the CALL_METHOD_WITH_ALL_RESOURCES instruction drops all of the BucketRefs in the transaction. The specific line where this happens is: https://github.com/radixdlt/radixdlt-scrypto/blob/7cb4af0b35b8462f214e839590234602a11281d0/radix-engine/src/engine/process.rs#L367
One way you could work around this is to avoid using CALL_METHOD_WITH_ALL_RESOURCES before the end of the rtm file: replace each CALL_METHOD_WITH_ALL_RESOURCES in your rtm file with a TAKE_ALL_FROM_WORKTOP instruction followed by a regular deposit method call.
So, as a high-level view, what we're trying to do for each of the transfers is:
1- Clone the badge.
2- Withdraw the tokens using the cloned badge.
3- Create a bucket out of the withdrawn tokens.
4- Deposit the bucket we just created into the receiver's account.
I have made the modifications described above to your rtm file:
CLONE_BUCKET_REF BucketRef(1u32) BucketRef("badge1");
CALL_METHOD Address("0293c502780e23621475989d707cd8128e4506362e5fed6ac0c00a") "withdraw" Decimal("2000") Address("03bcc1960b6f99bae8614c3bf276ee3217f800f5cc7bdc48db9a5f") BucketRef("badge1");
TAKE_ALL_FROM_WORKTOP Address("03bcc1960b6f99bae8614c3bf276ee3217f800f5cc7bdc48db9a5f") Bucket("transfer1_bucket");
CALL_METHOD Address("02a2a79aa613da237bcda37fd79af36e09eadd195976092cb24696") "deposit" Bucket("transfer1_bucket");
CLONE_BUCKET_REF BucketRef(1u32) BucketRef("badge2");
CALL_METHOD Address("0293c502780e23621475989d707cd8128e4506362e5fed6ac0c00a") "withdraw" Decimal("2000") Address("031773788de8e4d2947d6592605302d4820ad060ceab06eb2d4711") BucketRef("badge2");
TAKE_ALL_FROM_WORKTOP Address("031773788de8e4d2947d6592605302d4820ad060ceab06eb2d4711") Bucket("transfer2_bucket");
CALL_METHOD Address("02a2a79aa613da237bcda37fd79af36e09eadd195976092cb24696") "deposit" Bucket("transfer2_bucket");
CLONE_BUCKET_REF BucketRef(1u32) BucketRef("badge3");
CALL_METHOD Address("0293c502780e23621475989d707cd8128e4506362e5fed6ac0c00a") "withdraw" Decimal("2000") Address("03bcc1960b6f99bae8614c3bf276ee3217f800f5cc7bdc48db9a5f") BucketRef("badge3");
TAKE_ALL_FROM_WORKTOP Address("03bcc1960b6f99bae8614c3bf276ee3217f800f5cc7bdc48db9a5f") Bucket("transfer3_bucket");
CALL_METHOD Address("0236ca00316c8eb5ad51b0cb5e3f232cb871803a85ec3847b36bb4") "deposit" Bucket("transfer3_bucket");
CLONE_BUCKET_REF BucketRef(1u32) BucketRef("badge4");
CALL_METHOD Address("0293c502780e23621475989d707cd8128e4506362e5fed6ac0c00a") "withdraw" Decimal("2000") Address("031773788de8e4d2947d6592605302d4820ad060ceab06eb2d4711") BucketRef("badge4");
TAKE_ALL_FROM_WORKTOP Address("031773788de8e4d2947d6592605302d4820ad060ceab06eb2d4711") Bucket("transfer4_bucket");
CALL_METHOD Address("0236ca00316c8eb5ad51b0cb5e3f232cb871803a85ec3847b36bb4") "deposit" Bucket("transfer4_bucket");
Edit: I just want to highlight that this answer is for Scrypto v0.3.0.

Related

Geoserver 2.19 ImagePyramid processing error

I have more than 500 Gb orthophoto GTiff images with LZW compression, the task is to operate them using the geoserver's power.
The main idea is to use pyramids for much better data mobility in the future. For my tests, I use 137 Gb GTiff images with LZW compression.
Firstly, I compressed my files via the GDAL util gdal_translate, which got me down to 25 Gb of GTiff images:
gdal_translate -of GTiff -co COMPRESS=JPEG input_file output_file
Secondly, I used the GDAL util gdalbuildvrt to build a Virtual Dataset (VRT):
gdalbuildvrt -te xmin_vrt ymin_vrt xmax_vrt ymax_vrt -srcnodata "0 0 0" output_file.vrt input_gtiff_file.tif
Thirdly, I used GDAL util gdal_retile for external pyramids building:
gdal_retile -of GTiff -v -r bilinear -levels 4 -ps 2048 2048 -co "TILED=YES" -co "COMPRESS=JPEG" -targetDir C:\...\out input_file.vrt
All levels 1-4 have been built into subdirectories 1-4 containing the cut GTiff files.
The next step was to use the GeoServer ImagePyramid extension for the 25 Gb GTiff pyramids. For correct usage, I created a new ImagePyramid data store in GeoServer (ImagePyramid pyramidal plugin). The zero subdirectory was created correctly, with a ShapeFile inside it.
The last step is to publish the newly generated store as a layer, but it leads to the error "An error occurred while loading the page" with "Failed to load granule file" and "java.lang.NullPointerException".
org.apache.wicket.WicketRuntimeException: Method onRequest of interface org.apache.wicket.behavior.IBehaviorListener targeted at org.apache.wicket.ajax.markup.html.AjaxLink$1#3a4a35fe on component [AjaxLink [Component id = link]] threw an exception
at org.apache.wicket.RequestListenerInterface.internalInvoke(RequestListenerInterface.java:268)
at org.apache.wicket.RequestListenerInterface.invoke(RequestListenerInterface.java:241)
at org.apache.wicket.core.request.handler.ListenerInterfaceRequestHandler.invokeListener(ListenerInterfaceRequestHandler.java:248)
at org.apache.wicket.core.request.handler.ListenerInterfaceRequestHandler.respond(ListenerInterfaceRequestHandler.java:234)
at org.apache.wicket.request.cycle.RequestCycle$HandlerExecutor.respond(RequestCycle.java:895)
at org.apache.wicket.request.RequestHandlerStack.execute(RequestHandlerStack.java:64)
at org.apache.wicket.request.cycle.RequestCycle.execute(RequestCycle.java:265)
at org.apache.wicket.request.cycle.RequestCycle.processRequest(RequestCycle.java:222)
at org.apache.wicket.request.cycle.RequestCycle.processRequestAndDetach(RequestCycle.java:293)
at org.apache.wicket.protocol.http.WicketFilter.processRequestCycle(WicketFilter.java:261)
at org.apache.wicket.protocol.http.WicketFilter.processRequest(WicketFilter.java:203)
at org.apache.wicket.protocol.http.WicketServlet.doGet(WicketServlet.java:137)
at javax.servlet.http.HttpServlet.service(HttpServlet.java:687)
at javax.servlet.http.HttpServlet.service(HttpServlet.java:790)
at org.springframework.web.servlet.mvc.ServletWrappingController.handleRequestInternal(ServletWrappingController.java:166)
at org.springframework.web.servlet.mvc.AbstractController.handleRequest(AbstractController.java:177)
at org.springframework.web.servlet.mvc.SimpleControllerHandlerAdapter.handle(SimpleControllerHandlerAdapter.java:52)
at org.springframework.web.servlet.DispatcherServlet.doDispatch(DispatcherServlet.java:1040)
at org.springframework.web.servlet.DispatcherServlet.doService(DispatcherServlet.java:943)
at org.springframework.web.servlet.FrameworkServlet.processRequest(FrameworkServlet.java:1006)
at org.springframework.web.servlet.FrameworkServlet.doGet(FrameworkServlet.java:898)
at javax.servlet.http.HttpServlet.service(HttpServlet.java:687)
at org.springframework.web.servlet.FrameworkServlet.service(FrameworkServlet.java:883)
at javax.servlet.http.HttpServlet.service(HttpServlet.java:790)
at org.eclipse.jetty.servlet.ServletHolder$NotAsync.service(ServletHolder.java:1452)
at org.eclipse.jetty.servlet.ServletHolder.handle(ServletHolder.java:791)
at org.eclipse.jetty.servlet.ServletHandler$ChainEnd.doFilter(ServletHandler.java:1626)
at org.geoserver.filters.ThreadLocalsCleanupFilter.doFilter(ThreadLocalsCleanupFilter.java:26)
at org.eclipse.jetty.servlet.FilterHolder.doFilter(FilterHolder.java:193)
at org.eclipse.jetty.servlet.ServletHandler$Chain.doFilter(ServletHandler.java:1601)
at org.geoserver.filters.SpringDelegatingFilter$Chain.doFilter(SpringDelegatingFilter.java:69)
at org.geoserver.wms.animate.AnimatorFilter.doFilter(AnimatorFilter.java:70)
at org.geoserver.filters.SpringDelegatingFilter$Chain.doFilter(SpringDelegatingFilter.java:66)
at org.geoserver.filters.SpringDelegatingFilter.doFilter(SpringDelegatingFilter.java:41)
at org.eclipse.jetty.servlet.FilterHolder.doFilter(FilterHolder.java:193)
at org.eclipse.jetty.servlet.ServletHandler$Chain.doFilter(ServletHandler.java:1601)
at org.geoserver.platform.AdvancedDispatchFilter.doFilter(AdvancedDispatchFilter.java:37)
at org.eclipse.jetty.servlet.FilterHolder.doFilter(FilterHolder.java:193)
at org.eclipse.jetty.servlet.ServletHandler$Chain.doFilter(ServletHandler.java:1601)
at org.springframework.security.web.FilterChainProxy$VirtualFilterChain.doFilter(FilterChainProxy.java:320)
at org.geoserver.security.filter.GeoServerCompositeFilter$NestedFilterChain.doFilter(GeoServerCompositeFilter.java:70)
at org.springframework.security.web.access.intercept.FilterSecurityInterceptor.invoke(FilterSecurityInterceptor.java:127)
at org.springframework.security.web.access.intercept.FilterSecurityInterceptor.doFilter(FilterSecurityInterceptor.java:91)
at org.geoserver.security.filter.GeoServerCompositeFilter$NestedFilterChain.doFilter(GeoServerCompositeFilter.java:74)
at org.geoserver.security.filter.GeoServerCompositeFilter.doFilter(GeoServerCompositeFilter.java:91)
at org.springframework.security.web.FilterChainProxy$VirtualFilterChain.doFilter(FilterChainProxy.java:334)
at org.geoserver.security.filter.GeoServerCompositeFilter$NestedFilterChain.doFilter(GeoServerCompositeFilter.java:70)
at org.springframework.security.web.access.ExceptionTranslationFilter.doFilter(ExceptionTranslationFilter.java:119)
at org.geoserver.security.filter.GeoServerCompositeFilter$NestedFilterChain.doFilter(GeoServerCompositeFilter.java:74)
at org.geoserver.security.filter.GeoServerCompositeFilter.doFilter(GeoServerCompositeFilter.java:91)
at org.springframework.security.web.FilterChainProxy$VirtualFilterChain.doFilter(FilterChainProxy.java:334)
at org.geoserver.security.filter.GeoServerAnonymousAuthenticationFilter.doFilter(GeoServerAnonymousAuthenticationFilter.java:51)
at org.springframework.security.web.FilterChainProxy$VirtualFilterChain.doFilter(FilterChainProxy.java:334)
at org.geoserver.security.filter.GeoServerCompositeFilter$NestedFilterChain.doFilter(GeoServerCompositeFilter.java:70)
at org.springframework.security.web.authentication.AbstractAuthenticationProcessingFilter.doFilter(AbstractAuthenticationProcessingFilter.java:200)
at org.geoserver.security.filter.GeoServerCompositeFilter$NestedFilterChain.doFilter(GeoServerCompositeFilter.java:74)
at org.geoserver.security.filter.GeoServerCompositeFilter.doFilter(GeoServerCompositeFilter.java:91)
at org.geoserver.security.filter.GeoServerUserNamePasswordAuthenticationFilter.doFilter(GeoServerUserNamePasswordAuthenticationFilter.java:122)
at org.springframework.security.web.FilterChainProxy$VirtualFilterChain.doFilter(FilterChainProxy.java:334)
at org.geoserver.security.filter.GeoServerCompositeFilter$NestedFilterChain.doFilter(GeoServerCompositeFilter.java:70)
at org.springframework.security.web.authentication.rememberme.RememberMeAuthenticationFilter.doFilter(RememberMeAuthenticationFilter.java:158)
at org.geoserver.security.filter.GeoServerCompositeFilter$NestedFilterChain.doFilter(GeoServerCompositeFilter.java:74)
at org.geoserver.security.filter.GeoServerCompositeFilter.doFilter(GeoServerCompositeFilter.java:91)
at org.springframework.security.web.FilterChainProxy$VirtualFilterChain.doFilter(FilterChainProxy.java:334)
at org.geoserver.security.filter.GeoServerCompositeFilter$NestedFilterChain.doFilter(GeoServerCompositeFilter.java:70)
at org.springframework.security.web.context.SecurityContextPersistenceFilter.doFilter(SecurityContextPersistenceFilter.java:105)
at org.geoserver.security.filter.GeoServerSecurityContextPersistenceFilter$1.doFilter(GeoServerSecurityContextPersistenceFilter.java:52)
at org.geoserver.security.filter.GeoServerCompositeFilter$NestedFilterChain.doFilter(GeoServerCompositeFilter.java:74)
at org.geoserver.security.filter.GeoServerCompositeFilter.doFilter(GeoServerCompositeFilter.java:91)
at org.springframework.security.web.FilterChainProxy$VirtualFilterChain.doFilter(FilterChainProxy.java:334)
at org.springframework.security.web.FilterChainProxy.doFilterInternal(FilterChainProxy.java:215)
at org.springframework.security.web.FilterChainProxy.doFilter(FilterChainProxy.java:178)
at org.geoserver.security.GeoServerSecurityFilterChainProxy.doFilter(GeoServerSecurityFilterChainProxy.java:142)
at org.springframework.web.filter.DelegatingFilterProxy.invokeDelegate(DelegatingFilterProxy.java:358)
at org.springframework.web.filter.DelegatingFilterProxy.doFilter(DelegatingFilterProxy.java:271)
at org.eclipse.jetty.servlet.FilterHolder.doFilter(FilterHolder.java:193)
at org.eclipse.jetty.servlet.ServletHandler$Chain.doFilter(ServletHandler.java:1601)
at org.geoserver.filters.LoggingFilter.doFilter(LoggingFilter.java:101)
at org.eclipse.jetty.servlet.FilterHolder.doFilter(FilterHolder.java:193)
at org.eclipse.jetty.servlet.ServletHandler$Chain.doFilter(ServletHandler.java:1601)
at org.geoserver.filters.XFrameOptionsFilter.doFilter(XFrameOptionsFilter.java:77)
at org.eclipse.jetty.servlet.FilterHolder.doFilter(FilterHolder.java:193)
at org.eclipse.jetty.servlet.ServletHandler$Chain.doFilter(ServletHandler.java:1601)
at org.geoserver.filters.GZIPFilter.doFilter(GZIPFilter.java:47)
at org.eclipse.jetty.servlet.FilterHolder.doFilter(FilterHolder.java:193)
at org.eclipse.jetty.servlet.ServletHandler$Chain.doFilter(ServletHandler.java:1601)
at org.geoserver.filters.SessionDebugFilter.doFilter(SessionDebugFilter.java:46)
at org.eclipse.jetty.servlet.FilterHolder.doFilter(FilterHolder.java:193)
at org.eclipse.jetty.servlet.ServletHandler$Chain.doFilter(ServletHandler.java:1601)
at org.geoserver.filters.FlushSafeFilter.doFilter(FlushSafeFilter.java:42)
at org.eclipse.jetty.servlet.FilterHolder.doFilter(FilterHolder.java:193)
at org.eclipse.jetty.servlet.ServletHandler$Chain.doFilter(ServletHandler.java:1601)
at org.springframework.web.filter.CharacterEncodingFilter.doFilterInternal(CharacterEncodingFilter.java:201)
at org.springframework.web.filter.OncePerRequestFilter.doFilter(OncePerRequestFilter.java:119)
at org.eclipse.jetty.servlet.FilterHolder.doFilter(FilterHolder.java:201)
at org.eclipse.jetty.servlet.ServletHandler$Chain.doFilter(ServletHandler.java:1601)
at org.eclipse.jetty.servlet.ServletHandler.doHandle(ServletHandler.java:548)
at org.eclipse.jetty.server.handler.ScopedHandler.handle(ScopedHandler.java:143)
at org.eclipse.jetty.security.SecurityHandler.handle(SecurityHandler.java:602)
at org.eclipse.jetty.server.handler.HandlerWrapper.handle(HandlerWrapper.java:127)
at org.eclipse.jetty.server.handler.ScopedHandler.nextHandle(ScopedHandler.java:235)
at org.eclipse.jetty.server.session.SessionHandler.doHandle(SessionHandler.java:1624)
at org.eclipse.jetty.server.handler.ScopedHandler.nextHandle(ScopedHandler.java:233)
at org.eclipse.jetty.server.handler.ContextHandler.doHandle(ContextHandler.java:1435)
at org.eclipse.jetty.server.handler.ScopedHandler.nextScope(ScopedHandler.java:188)
at org.eclipse.jetty.servlet.ServletHandler.doScope(ServletHandler.java:501)
at org.eclipse.jetty.server.session.SessionHandler.doScope(SessionHandler.java:1594)
at org.eclipse.jetty.server.handler.ScopedHandler.nextScope(ScopedHandler.java:186)
at org.eclipse.jetty.server.handler.ContextHandler.doScope(ContextHandler.java:1350)
at org.eclipse.jetty.server.handler.ScopedHandler.handle(ScopedHandler.java:141)
at org.eclipse.jetty.server.handler.ContextHandlerCollection.handle(ContextHandlerCollection.java:191)
at org.eclipse.jetty.server.handler.HandlerCollection.handle(HandlerCollection.java:146)
at org.eclipse.jetty.server.handler.HandlerWrapper.handle(HandlerWrapper.java:127)
at org.eclipse.jetty.server.Server.handle(Server.java:516)
at org.eclipse.jetty.server.HttpChannel.lambda$handle$1(HttpChannel.java:388)
at org.eclipse.jetty.server.HttpChannel.dispatch(HttpChannel.java:633)
at org.eclipse.jetty.server.HttpChannel.handle(HttpChannel.java:380)
at org.eclipse.jetty.server.HttpConnection.onFillable(HttpConnection.java:273)
at org.eclipse.jetty.io.AbstractConnection$ReadCallback.succeeded(AbstractConnection.java:311)
at org.eclipse.jetty.io.FillInterest.fillable(FillInterest.java:105)
at org.eclipse.jetty.io.ChannelEndPoint$1.run(ChannelEndPoint.java:104)
at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.runTask(EatWhatYouKill.java:336)
at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.doProduce(EatWhatYouKill.java:313)
at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.tryProduce(EatWhatYouKill.java:171)
at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.run(EatWhatYouKill.java:129)
at org.eclipse.jetty.util.thread.ReservedThreadExecutor$ReservedThread.run(ReservedThreadExecutor.java:375)
at org.eclipse.jetty.util.thread.QueuedThreadPool.runJob(QueuedThreadPool.java:773)
at org.eclipse.jetty.util.thread.QueuedThreadPool$Runner.run(QueuedThreadPool.java:905)
at java.base/java.lang.Thread.run(Thread.java:834)
Caused by: java.lang.reflect.InvocationTargetException
at jdk.internal.reflect.GeneratedMethodAccessor302.invoke(Unknown Source)
at java.base/jdk.internal.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
at java.base/java.lang.reflect.Method.invoke(Method.java:566)
at org.apache.wicket.RequestListenerInterface.internalInvoke(RequestListenerInterface.java:258)
... 128 more
Caused by: java.lang.RuntimeException: Error occurred while building the resources for the configuration page
at org.geoserver.web.data.layer.NewLayerPage.buildLayerInfo(NewLayerPage.java:431)
at org.geoserver.web.data.layer.NewLayerPage$9.onClick(NewLayerPage.java:324)
at org.geoserver.web.wicket.SimpleAjaxLink$1.onClick(SimpleAjaxLink.java:47)
at org.apache.wicket.ajax.markup.html.AjaxLink$1.onEvent(AjaxLink.java:85)
at org.apache.wicket.ajax.AjaxEventBehavior.respond(AjaxEventBehavior.java:155)
at org.apache.wicket.ajax.AbstractDefaultAjaxBehavior.onRequest(AbstractDefaultAjaxBehavior.java:601)
... 132 more
Caused by: org.geotools.data.DataSourceException: Unable to create this mosaic
at org.geotools.gce.imagemosaic.RasterLayerResponse.prepareResponse(RasterLayerResponse.java:757)
at org.geotools.gce.imagemosaic.RasterLayerResponse.processRequest(RasterLayerResponse.java:605)
at org.geotools.gce.imagemosaic.RasterLayerResponse.createResponse(RasterLayerResponse.java:573)
at org.geotools.gce.imagemosaic.RasterManager.read(RasterManager.java:1321)
at org.geotools.gce.imagemosaic.ImageMosaicReader.read(ImageMosaicReader.java:652)
at org.geotools.gce.imagemosaic.ImageMosaicReader.read(ImageMosaicReader.java:633)
at org.geotools.gce.imagepyramid.ImagePyramidReader.loadRequestedTiles(ImagePyramidReader.java:402)
at org.geotools.gce.imagepyramid.ImagePyramidReader.read(ImagePyramidReader.java:360)
at org.geoserver.catalog.CoverageDimensionCustomizerReader.read(CoverageDimensionCustomizerReader.java:234)
at org.geoserver.catalog.SingleGridCoverage2DReader.read(SingleGridCoverage2DReader.java:126)
at org.geoserver.catalog.CatalogBuilder.getCoverageSampleDimensions(CatalogBuilder.java:1188)
at org.geoserver.catalog.CatalogBuilder.buildCoverageInternal(CatalogBuilder.java:1064)
at org.geoserver.catalog.CatalogBuilder.buildCoverage(CatalogBuilder.java:985)
at org.geoserver.catalog.CatalogBuilder.buildCoverage(CatalogBuilder.java:939)
at org.geoserver.web.data.layer.NewLayerPage.buildLayerInfo(NewLayerPage.java:418)
... 137 more
Caused by: java.io.IOException: java.util.concurrent.ExecutionException: org.geotools.gce.imagemosaic.GranuleLoadingException: Failed to load granule file:/C:/AlidadA/3_software/geoserver_2_19_0/data_dir/data/1_drn_data/pyramids/out/0/8-50-0-5_epsg_4326_01_01.tif
at org.geotools.gce.imagemosaic.granulecollector.BaseSubmosaicProducer.collectGranules(BaseSubmosaicProducer.java:225)
at org.geotools.gce.imagemosaic.granulecollector.BaseSubmosaicProducer.createMosaic(BaseSubmosaicProducer.java:398)
at org.geotools.gce.imagemosaic.RasterLayerResponse$MosaicProducer.produce(RasterLayerResponse.java:420)
at org.geotools.gce.imagemosaic.RasterLayerResponse$MosaicProducer.access$600(RasterLayerResponse.java:276)
at org.geotools.gce.imagemosaic.RasterLayerResponse.prepareResponse(RasterLayerResponse.java:676)
... 151 more
Caused by: java.util.concurrent.ExecutionException: org.geotools.gce.imagemosaic.GranuleLoadingException: Failed to load granule file:/C:/AlidadA/3_software/geoserver_2_19_0/data_dir/data/1_drn_data/pyramids/out/0/8-50-0-5_epsg_4326_01_01.tif
at java.base/java.util.concurrent.FutureTask.report(FutureTask.java:122)
at java.base/java.util.concurrent.FutureTask.get(FutureTask.java:191)
at org.geotools.gce.imagemosaic.granulecollector.BaseSubmosaicProducer.collectGranules(BaseSubmosaicProducer.java:121)
... 155 more
Caused by: org.geotools.gce.imagemosaic.GranuleLoadingException: Failed to load granule file:/C:/AlidadA/3_software/geoserver_2_19_0/data_dir/data/1_drn_data/pyramids/out/0/8-50-0-5_epsg_4326_01_01.tif
at org.geotools.gce.imagemosaic.GranuleLoader.call(GranuleLoader.java:112)
at org.geotools.gce.imagemosaic.GranuleLoader.call(GranuleLoader.java:38)
at java.base/java.util.concurrent.FutureTask.run(FutureTask.java:264)
at org.geotools.gce.imagemosaic.granulecollector.BaseSubmosaicProducer.acceptGranule(BaseSubmosaicProducer.java:445)
at org.geotools.gce.imagemosaic.granulecollector.DefaultSubmosaicProducer.accept(DefaultSubmosaicProducer.java:70)
at org.geotools.gce.imagemosaic.RasterLayerResponse$MosaicProducer.visit(RasterLayerResponse.java:360)
at org.geotools.gce.imagemosaic.catalog.CachingDataStoreGranuleCatalog.getGranuleDescriptors(CachingDataStoreGranuleCatalog.java:180)
at org.geotools.gce.imagemosaic.catalog.LockingGranuleCatalog.lambda$getGranuleDescriptors$7(LockingGranuleCatalog.java:195)
at org.geotools.gce.imagemosaic.catalog.LockingGranuleCatalog.guardIO(LockingGranuleCatalog.java:93)
at org.geotools.gce.imagemosaic.catalog.LockingGranuleCatalog.getGranuleDescriptors(LockingGranuleCatalog.java:195)
at org.geotools.gce.imagemosaic.RasterManager.getGranuleDescriptors(RasterManager.java:1330)
at org.geotools.gce.imagemosaic.RasterLayerResponse.prepareResponse(RasterLayerResponse.java:672)
... 151 more
Caused by: java.lang.NullPointerException
at org.geotools.gce.imagemosaic.GranuleDescriptor.loadRaster(GranuleDescriptor.java:1318)
at org.geotools.gce.imagemosaic.GranuleLoader.call(GranuleLoader.java:108)
... 162 more
I found myself in a similar situation, and I succeeded by using GeoTIFFs without any compression (in particular, without JPEG compression). That is, I executed only your third command:
gdal_retile -of GTiff -v -r bilinear -levels 4 -ps 2048 2048 -co "TILED=YES" \
 -targetDir C:\...\out input_file.vrt
i.e. the same command but without the -co "COMPRESS=JPEG" option. And it worked! I think it's a problem with JPEG compression, but I didn't test other codecs, so I can't be sure.
I was having the same issue, and it worked once I omitted the COMPRESS option. I then tried using
-co "COMPRESS=LZW"
and it also worked, almost halving the space used by the uncompressed tiles.

PySpark pandas_udfs java.lang.IllegalArgumentException error

Does anyone have experience using pandas UDFs on a local pyspark session running on Windows? I've used them on linux with good results, but I've been unsuccessful on my Windows machine.
Environment:
python==3.7
pyarrow==0.15
pyspark==2.3.4
pandas==0.24
java version "1.8.0_74"
Sample script:
from pyspark.sql.functions import pandas_udf, PandasUDFType
from pyspark.sql import SparkSession

spark = SparkSession.builder.master("local").getOrCreate()
spark.conf.set("spark.sql.execution.arrow.enabled", "true")
spark.conf.set("spark.sql.execution.arrow.fallback.enabled", "false")

df = spark.createDataFrame(
    [(1, 1.0), (1, 2.0), (2, 3.0), (2, 5.0), (2, 10.0)],
    ("id", "v"))

@pandas_udf("id long, v double", PandasUDFType.GROUPED_MAP)
def subtract_mean(pdf):
    # pdf is a pandas.DataFrame
    v = pdf.v
    return pdf.assign(v=v - v.mean())

out_df = df.groupby("id").apply(subtract_mean).toPandas()
print(out_df.head())
# +---+----+
# | id| v|
# +---+----+
# | 1|-0.5|
# | 1| 0.5|
# | 2|-3.0|
# | 2|-1.0|
# | 2| 4.0|
# +---+----+
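The group-wise demeaning itself can be sanity-checked with plain pandas, without Spark involved; a minimal sketch using the same data as the example above:

```python
import pandas as pd

# Same five rows as the Spark DataFrame in the question
df = pd.DataFrame({"id": [1, 1, 2, 2, 2],
                   "v": [1.0, 2.0, 3.0, 5.0, 10.0]})

def subtract_mean(pdf):
    # Subtract each group's mean from its values
    return pdf.assign(v=pdf.v - pdf.v.mean())

out_df = df.groupby("id", group_keys=False).apply(subtract_mean)
print(out_df)
# v column: -0.5, 0.5, -3.0, -1.0, 4.0
```

This matches the expected output shown above, so any failure in the Spark run points at the Arrow serialization layer rather than the UDF logic.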
After running for a long time (it splits the toPandas stage into 200 tasks, each taking over a second), it returns an error like this:
Traceback (most recent call last):
File "C:\miniconda3\envs\pandas_udf\lib\site-packages\pyspark\sql\dataframe.py", line 1953, in toPandas
tables = self._collectAsArrow()
File "C:\miniconda3\envs\pandas_udf\lib\site-packages\pyspark\sql\dataframe.py", line 2004, in _collectAsArrow
sock_info = self._jdf.collectAsArrowToPython()
File "C:\miniconda3\envs\pandas_udf\lib\site-packages\py4j\java_gateway.py", line 1257, in __call__
answer, self.gateway_client, self.target_id, self.name)
File "C:\miniconda3\envs\pandas_udf\lib\site-packages\pyspark\sql\utils.py", line 63, in deco
return f(*a, **kw)
File "C:\miniconda3\envs\pandas_udf\lib\site-packages\py4j\protocol.py", line 328, in get_return_value
format(target_id, ".", name), value)
py4j.protocol.Py4JJavaError: An error occurred while calling o62.collectAsArrowToPython.
: org.apache.spark.SparkException: Job aborted due to stage failure: Task 69 in stage 3.0 failed 1 times, most recent failure: Lost task 69.0 in stage 3.0 (TID 201, localhost, executor driver): java.lang.IllegalArgumentException
at java.nio.ByteBuffer.allocate(Unknown Source)
at org.apache.arrow.vector.ipc.message.MessageChannelReader.readNextMessage(MessageChannelReader.java:64)
at org.apache.arrow.vector.ipc.message.MessageSerializer.deserializeSchema(MessageSerializer.java:104)
at org.apache.arrow.vector.ipc.ArrowStreamReader.readSchema(ArrowStreamReader.java:128)
at org.apache.arrow.vector.ipc.ArrowReader.initialize(ArrowReader.java:181)
at org.apache.arrow.vector.ipc.ArrowReader.ensureInitialized(ArrowReader.java:172)
at org.apache.arrow.vector.ipc.ArrowReader.getVectorSchemaRoot(ArrowReader.java:65)
at org.apache.spark.sql.execution.python.ArrowPythonRunner$$anon$1.read(ArrowPythonRunner.scala:161)
at org.apache.spark.sql.execution.python.ArrowPythonRunner$$anon$1.read(ArrowPythonRunner.scala:121)
at org.apache.spark.api.python.BasePythonRunner$ReaderIterator.hasNext(PythonRunner.scala:290)
at org.apache.spark.InterruptibleIterator.hasNext(InterruptibleIterator.scala:37)
at scala.collection.Iterator$$anon$12.hasNext(Iterator.scala:439)
at scala.collection.Iterator$$anon$11.hasNext(Iterator.scala:408)
at org.apache.spark.sql.execution.arrow.ArrowConverters$$anon$2.hasNext(ArrowConverters.scala:96)
at scala.collection.Iterator$class.foreach(Iterator.scala:893)
at org.apache.spark.sql.execution.arrow.ArrowConverters$$anon$2.foreach(ArrowConverters.scala:94)
at scala.collection.generic.Growable$class.$plus$plus$eq(Growable.scala:59)
at scala.collection.mutable.ArrayBuffer.$plus$plus$eq(ArrayBuffer.scala:104)
at scala.collection.mutable.ArrayBuffer.$plus$plus$eq(ArrayBuffer.scala:48)
at scala.collection.TraversableOnce$class.to(TraversableOnce.scala:310)
at org.apache.spark.sql.execution.arrow.ArrowConverters$$anon$2.to(ArrowConverters.scala:94)
at scala.collection.TraversableOnce$class.toBuffer(TraversableOnce.scala:302)
at org.apache.spark.sql.execution.arrow.ArrowConverters$$anon$2.toBuffer(ArrowConverters.scala:94)
at scala.collection.TraversableOnce$class.toArray(TraversableOnce.scala:289)
at org.apache.spark.sql.execution.arrow.ArrowConverters$$anon$2.toArray(ArrowConverters.scala:94)
at org.apache.spark.rdd.RDD$$anonfun$collect$1$$anonfun$12.apply(RDD.scala:945)
at org.apache.spark.rdd.RDD$$anonfun$collect$1$$anonfun$12.apply(RDD.scala:945)
at org.apache.spark.SparkContext$$anonfun$runJob$5.apply(SparkContext.scala:2074)
at org.apache.spark.SparkContext$$anonfun$runJob$5.apply(SparkContext.scala:2074)
at org.apache.spark.scheduler.ResultTask.runTask(ResultTask.scala:87)
at org.apache.spark.scheduler.Task.run(Task.scala:109)
at org.apache.spark.executor.Executor$TaskRunner.run(Executor.scala:345)
at java.util.concurrent.ThreadPoolExecutor.runWorker(Unknown Source)
at java.util.concurrent.ThreadPoolExecutor$Worker.run(Unknown Source)
at java.lang.Thread.run(Unknown Source)
Your java.lang.IllegalArgumentException in pandas_udf has to do with the pyarrow version, not with the OS environment. See this issue for details.
You have two routes of action:
Downgrade pyarrow to v0.14, or
Add the environment variable ARROW_PRE_0_15_IPC_FORMAT=1 to SPARK_HOME/conf/spark-env.sh
On Windows, you'll need a spark-env.cmd file in the conf directory containing set ARROW_PRE_0_15_IPC_FORMAT=1, as suggested by Jonathan Taws.
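For a local session (no YARN), a quick way to apply the same workaround without touching any config files is to set the variable from Python before the session is created; this is a sketch, assuming the Python workers inherit the driver process's environment:

```python
import os

# Must be set before the SparkSession (and its Python workers) is created;
# it tells pyarrow >= 0.15 to emit the legacy IPC format that Spark 2.x expects.
os.environ["ARROW_PRE_0_15_IPC_FORMAT"] = "1"

# ...then build the session as usual, e.g.:
# from pyspark.sql import SparkSession
# spark = SparkSession.builder.master("local").getOrCreate()
```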
Addendum to Sergey's answer:
If you prefer to build your own SparkSession in Python rather than changing your config files, you'll need to set both the application-master variable spark.yarn.appMasterEnv.ARROW_PRE_0_15_IPC_FORMAT and the executor environment variable spark.executorEnv.ARROW_PRE_0_15_IPC_FORMAT:
spark_session = SparkSession.builder \
    .master("yarn") \
    .config('spark.yarn.appMasterEnv.ARROW_PRE_0_15_IPC_FORMAT', 1) \
    .config('spark.executorEnv.ARROW_PRE_0_15_IPC_FORMAT', 1)
spark = spark_session.getOrCreate()
Hope this helps!

Latex tick label overlapping axis in graph generated by matplotlib

I am using the following code from the standard beginner's matplotlib tutorial.
from pylab import *

figure(figsize=(10, 6), dpi=80)
subplot(1, 1, 1)

X = np.linspace(-np.pi, np.pi, 256, endpoint=True)
C, S = np.cos(X), np.sin(X)

plot(X, C, color="blue", linewidth=2.5, linestyle="-", label=r'Cosine')
plot(X, S, color="red", linewidth=2.5, linestyle="-", label=r"Sine")

xlim(X.min()*1.1, X.max()*1.1)
ylim(C.min()*1.1, C.max()*1.1)

ax = gca()
ax.xaxis.set_ticks_position('bottom')
ax.spines['bottom'].set_position('center')
ax.yaxis.set_ticks_position('left')
ax.spines['left'].set_position(('data', 0))
ax.spines['right'].set_color('none')
ax.spines['top'].set_color('none')

xticks([-np.pi, -np.pi/2, 0, np.pi/2, np.pi],
       [r'$-\pi$', r'$-\frac{\pi}{2}$', r'$0$', r'$\frac{\pi}{2}$', r'$+\pi$'])
yticks([-1, +1])
legend(loc='upper left')

for label in ax.get_xticklabels() + ax.get_yticklabels():
    label.set_fontsize(20)
    label.set_bbox(dict(facecolor='white', edgecolor='None', alpha=0.65))

show(block=False)
which results in the following graph
I've tried to figure out why the LaTeX tick labels for pi/2 and -pi/2 are intersecting the x-axis, but I cannot find anything on Google, SO, or in the matplotlib documentation. Is this potentially a bug? I'm on OS X Mountain Lion, Python 2.7.6, matplotlib 1.3.1, ipython 2.1.0.
Thanks to the comment by @Schorsch, I was able to narrow down the problem. It had to do with an incompatibility between two versions of libpng I had installed with Homebrew.
$ brew info libpng
libpng: stable 1.6.12 (bottled)
http://www.libpng.org/pub/png/libpng.html
/usr/local/Cellar/libpng/1.5.17 (15 files, 1.3M)
Built from source with: --universal
/usr/local/Cellar/libpng/1.6.12 (17 files, 1.3M) *
It seems that when I installed matplotlib with
pip install matplotlib
it used libpng 1.5.17, but when running ipython --pylab it was using 1.6.12. To force pip to build against the appropriate version of libpng, I set the following environment variables:
export LDFLAGS="-L/usr/local/Cellar/libpng/1.6.12/lib/ -L/usr/X11/lib"
export CFLAGS="-I/usr/local/Cellar/libpng/1.6.12/include/ -I/usr/X11/include -I/usr/X11/include/freetype2"
Then reinstall matplotlib:
pip install --upgrade --force-reinstall matplotlib
Then, to ensure that LaTeX rendering is used:
from matplotlib import rc
rc('text', usetex=True)
This results in the proper figure.

ifort optimization O3 leads to inconsistency

I have a problem with my Fortran code when using O3 optimization: the values calculated for the norm of an array change with and without -O3, and are incorrect with -O3. The following is a minimal example of my code:
program main
  use wavefunction
  implicit none
  integer(I4B) :: Na, Nb, Npes
  complex(DPC), ALLOCATABLE, DIMENSION(:,:,:) :: phi
  real(DP), ALLOCATABLE, DIMENSION(:) :: normPerPes1
  real(DP) :: sigma
  integer(I4B) :: i, j
  Na = 100
  Nb = 100
  Npes = 4
  ALLOCATE(phi(Na,Nb,Npes), normPerPes1(Npes))
  sigma = 10
  phi = (0.D0, 0.D0)
  do i = 1, Na
    do j = 1, Nb
      ! gaussian on pes 1
      phi(i,j,1) = 1.D0/(sigma**2*2.D0*pi)*exp(-(dble(i-50)**2+dble(j-50)**2)/(2.D0*sigma**2))
    end do
  end do
  ! total norm
  write(*,*) norm(Na,Nb,Npes,phi)
  ! norm on each pes
  CALL normPerPes(Na,Nb,Npes,phi,normPerPes1)
  write(*,*) normPerPes1
end program main
which uses the following module
module wavefunction
  use nrtype
  implicit none
contains
  function norm(Na, Nb, Npes, phi)
    implicit none
    INTEGER(I4B), INTENT(IN) :: Na, Nb, Npes
    COMPLEX(DPC), INTENT(IN) :: phi(Na,Nb,Npes)
    REAL(DP) :: norm
    INTEGER(I4B) :: i, j, pesNr
    norm = 0.D0
    do i = 1, Na
      do j = 1, Nb
        do pesNr = 1, Npes
          norm = norm + abs(phi(i,j,pesNr))**2
        end do
      end do
    end do
  end function norm
  !----------------------------------------------------------
  subroutine normPerPes(Na, Nb, Npes, phi, normPerPes1)
    IMPLICIT none
    REAL(DP) :: normPerPes1(Npes)
    INTEGER(I4B), INTENT(IN) :: Na, Nb, Npes
    COMPLEX(DPC), INTENT(IN) :: phi(Na,Nb,Npes)
    INTEGER(I4B) :: i, j, pesNr
    normPerPes1 = 0.0d0
    do i = 1, Na
      do j = 1, Nb
        do pesNr = 1, Npes
          normPerPes1(pesNr) = normPerPes1(pesNr) + abs(phi(i,j,pesNr))**2
        end do
      end do
    end do
    return
  end subroutine normPerPes
  !-----------------------------------------------------------
end module wavefunction
If I compile with the following Makefile
# compiler
FC = ifort
# flags
FFLAGS = # -O3

main: main.o nrtype.o wavefunction.o
main.o: main.f90 nrtype.o wavefunction.o
wavefunction.o: wavefunction.f90 nrtype.o
nrtype.o: nrtype.f90

%: %.o
	$(FC) $(FFLAGS) -o dynamic $^ $(LDFLAGS)
%.o: %.f90
	$(FC) $(FFLAGS) -c $<

clean:
	rm -f *.o *.mod *_genmod.f90
I get the following correct output:
7.957747154568253E-004
7.957747154568242E-004 0.000000000000000E+000 0.000000000000000E+000
0.000000000000000E+000
However, if I use -O3 then I obtain the following incorrect result:
7.957747154568253E-004
1.989436788642788E-004 0.000000000000000E+000 0.000000000000000E+000
0.000000000000000E+000
This looks to me as if there were something seriously fishy in my code, but I can't seem to find the problem. Thank you for your help!
As confirmed by Intel (see https://software.intel.com/en-us/forums/topic/516819), this is a problem with the compiler version used (Composer XE 2013 SP1 initial release, pkg. 080).
They claim that upgrading to Update 2 or 3 helps; I have not been able to try it yet.
In the meantime, a workaround is to forget about -O3 and use -O2 optimization.
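As a sanity check on the expected numbers (a plain-Python sketch, not the original Fortran): phi is a 2-D Gaussian placed on pes 1 only, so the total norm and the pes-1 norm should both come out near 1/(4*pi*sigma**2), matching the correct output above.

```python
import math

# Mirror the Fortran loops: phi(i,j,1) is a normalized 2-D Gaussian,
# all other pes entries stay zero.
Na, Nb, Npes = 100, 100, 4
sigma = 10.0

norms = [0.0] * Npes  # norm accumulated per pes
for i in range(1, Na + 1):
    for j in range(1, Nb + 1):
        phi = 1.0 / (sigma**2 * 2.0 * math.pi) * math.exp(
            -((i - 50)**2 + (j - 50)**2) / (2.0 * sigma**2))
        norms[0] += abs(phi)**2  # only pes 1 is populated

total = sum(norms)
print(total)   # ~7.9577e-04, i.e. 1/(4*pi*sigma**2)
print(norms)   # the entire norm sits on pes 1; pes 2-4 are exactly zero
```

Any output where the per-pes norms no longer sum to the total (as in the -O3 run above) therefore points at the compiler, not the algorithm.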

knitr pdflatex output discrepancy between console and PDF

I am running RStudio Version 0.98.484 and R version 3.0.2 on OS X Mavericks.
While using knitr I noticed a discrepancy between the console output from a summary() command and that generated in the PDF (via pdflatex). Here is the example.
\documentclass[11pt]{article}
\usepackage{MinionPro}
\usepackage{MnSymbol}
\usepackage[margin = 1 in]{geometry}
\geometry{verbose,tmargin=2.5cm,bmargin=2.5cm,lmargin=2.5cm,rmargin=2.5cm}
\setcounter{secnumdepth}{2}
\setcounter{tocdepth}{2}
\usepackage{url}
\usepackage[unicode=true,pdfusetitle,
bookmarks=true,bookmarksnumbered=true,bookmarksopen=true,bookmarksopenlevel=2,
breaklinks=false,pdfborder={0 0 1},backref=false,colorlinks=false]
{hyperref}
\hypersetup{
pdfstartview={XYZ null null 1}}
\usepackage{breakurl}
\usepackage{color}
\usepackage{graphicx}
\usepackage{fancyhdr}
\definecolor{darkred}{rgb}{0.5,0,0}
\definecolor{darkgreen}{rgb}{0,0.5,0}
\definecolor{darkblue}{rgb}{0,0,0.5}
\hypersetup{ colorlinks,
linkcolor=darkblue,
filecolor=darkgreen,
urlcolor=darkred,
citecolor=darkblue }
\definecolor{keywordcolor}{rgb}{0,0.6,0.6}
\definecolor{delimcolor}{rgb}{0.461,0.039,0.102}
\definecolor{Rcommentcolor}{rgb}{0.101,0.043,0.432}
\usepackage{booktabs}
\usepackage{listings}
\lstset{breaklines=true,showstringspaces=false}
\makeatletter
\newcommand\gobblepars{%
  \@ifnextchar\par%
  {\expandafter\gobblepars\@gobble}%
  {}}
\makeatother
\newcommand{\R}{R}
\title{\textsc{Laboratory Session 1}}
\author{Ani}
\begin{document}
<<setup, include=FALSE, cache=FALSE>>=
library(knitr)
# set global chunk options
opts_chunk$set(fig.path='figure/minimal-', fig.align='center', fig.show='hold')
options(replace.assign=FALSE, width=90, tidy=TRUE)
render_listings()
@
\maketitle
<<chunk26>>=
require(rpart)
data(car90)
summary(car90$Price)
@
Hello!
\end{document}
The console shows:
> summary(car90$Price)
Min. 1st Qu. Median Mean 3rd Qu. Max. NA's
5866 9995 13070 15810 19940 41990 6
The PDF shows:
Min. 1st Qu. Median Mean 3rd Qu. Max. NA 's
5870 10000 13100 15800 19900 42000 6
Why would this be happening? There are no decimals to round up. Any clues would be much appreciated.
Thanks!!
Ani
That is because knitr sets options(digits = 4), while the default value of the digits option in R is 7. You can reproduce it with:
library(rpart)
options(digits=4)
summary(car90$Price)
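Conversely, if you want the PDF output to match the console, you can restore R's default precision inside the setup chunk — a sketch (the chunk label is arbitrary; 7 is R's documented default for digits):

```
<<setup-digits, include=FALSE>>=
options(digits = 7)
@
```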