Build/check report for BioC 3.18

This page was generated on 2023-05-10 10:04:39 -0000 (Wed, 10 May 2023).

Hostname: kunpeng1
OS: Linux (Ubuntu 22.04.1 LTS)
Arch (*): aarch64
R version: R Under development (unstable) (2023-03-12 r83975) -- "Unsuffered Consequences"
Installed pkgs: 6211

(*) as reported by 'uname -p', except on Windows and Mac OS X

BUILD results for RGMQL on kunpeng1


To the developers/maintainers of the RGMQL package:
- Allow up to 24 hours (and sometimes 48 hours) for your latest push to git@git.bioconductor.org:packages/RGMQL.git to be reflected on this report. See "Troubleshooting Build Report" for more information.

- Use the Renviron.bioc settings from the build machines to reproduce errors and warnings.

Note: If "R CMD check" recently failed on the Linux builder because of a missing dependency, add the missing dependency to "Suggests" in your DESCRIPTION file. See the Renviron.bioc file for details.
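
For example, a package used only by the vignette or the tests would be declared in the DESCRIPTION file like this (a minimal sketch; the package names below are placeholders, not RGMQL's actual dependency list):

    Suggests:
        BiocStyle,
        knitr,
        rmarkdown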

raw results

Package 1669/2194: RGMQL 1.21.0 (landing page)
Maintainer: Simone Pallotta
Snapshot Date: 2023-05-08 19:11:19 -0000 (Mon, 08 May 2023)
git_url: https://git.bioconductor.org/packages/RGMQL
git_branch: devel
git_last_commit: 66ad989
git_last_commit_date: 2023-04-25 14:58:44 -0000 (Tue, 25 Apr 2023)

Results on kunpeng1 (Linux (Ubuntu 22.04.1 LTS) / aarch64):  INSTALL: OK    BUILD: ERROR    CHECK: skipped

Summary

Package: RGMQL
Version: 1.21.0
Command: /home/biocbuild/bbs-3.17-bioc/R/bin/R CMD build --keep-empty-dirs --no-resave-data RGMQL
StartedAt: 2023-05-09 05:20:04 -0000 (Tue, 09 May 2023)
EndedAt: 2023-05-09 05:21:09 -0000 (Tue, 09 May 2023)
ElapsedTime: 65.1 seconds
RetCode: 1
Status:   ERROR  
PackageFile: None
PackageFileSize: NA

Command output

##############################################################################
##############################################################################
###
### Running command:
###
###   /home/biocbuild/bbs-3.17-bioc/R/bin/R CMD build --keep-empty-dirs --no-resave-data RGMQL
###
##############################################################################
##############################################################################


* checking for file ‘RGMQL/DESCRIPTION’ ... OK
* preparing ‘RGMQL’:
* checking DESCRIPTION meta-information ... OK
* installing the package to build vignettes
* creating vignettes ... ERROR
--- re-building ‘RGMQL-vignette.Rmd’ using rmarkdown
Using Spark's default log4j profile: org/apache/spark/log4j-defaults.properties
23/05/09 05:21:07 INFO SparkContext: Running Spark version 2.2.0
WARNING: An illegal reflective access operation has occurred
WARNING: Illegal reflective access by org.apache.hadoop.security.authentication.util.KerberosUtil (file:/home/biocbuild/bbs-3.17-bioc/R-devel_2023-03-12_r83975-bin/lib/R/library/RGMQLlib/extdata/java/GMQL.jar) to method sun.security.krb5.Config.getInstance()
WARNING: Please consider reporting this to the maintainers of org.apache.hadoop.security.authentication.util.KerberosUtil
WARNING: Use --illegal-access=warn to enable warnings of further illegal reflective access operations
WARNING: All illegal access operations will be denied in a future release
23/05/09 05:21:07 WARN NativeCodeLoader: Unable to load native-hadoop library for your platform... using builtin-java classes where applicable
23/05/09 05:21:08 INFO SparkContext: Submitted application: GMQL-R
23/05/09 05:21:08 INFO SecurityManager: Changing view acls to: biocbuild
23/05/09 05:21:08 INFO SecurityManager: Changing modify acls to: biocbuild
23/05/09 05:21:08 INFO SecurityManager: Changing view acls groups to: 
23/05/09 05:21:08 INFO SecurityManager: Changing modify acls groups to: 
23/05/09 05:21:08 INFO SecurityManager: SecurityManager: authentication disabled; ui acls disabled; users  with view permissions: Set(biocbuild); groups with view permissions: Set(); users  with modify permissions: Set(biocbuild); groups with modify permissions: Set()
23/05/09 05:21:08 WARN Utils: Service 'sparkDriver' could not bind on a random free port. You may check whether configuring an appropriate binding address.
23/05/09 05:21:08 WARN Utils: Service 'sparkDriver' could not bind on a random free port. You may check whether configuring an appropriate binding address.
23/05/09 05:21:08 WARN Utils: Service 'sparkDriver' could not bind on a random free port. You may check whether configuring an appropriate binding address.
23/05/09 05:21:08 WARN Utils: Service 'sparkDriver' could not bind on a random free port. You may check whether configuring an appropriate binding address.
23/05/09 05:21:08 WARN Utils: Service 'sparkDriver' could not bind on a random free port. You may check whether configuring an appropriate binding address.
23/05/09 05:21:08 WARN Utils: Service 'sparkDriver' could not bind on a random free port. You may check whether configuring an appropriate binding address.
23/05/09 05:21:08 WARN Utils: Service 'sparkDriver' could not bind on a random free port. You may check whether configuring an appropriate binding address.
23/05/09 05:21:08 WARN Utils: Service 'sparkDriver' could not bind on a random free port. You may check whether configuring an appropriate binding address.
23/05/09 05:21:08 WARN Utils: Service 'sparkDriver' could not bind on a random free port. You may check whether configuring an appropriate binding address.
23/05/09 05:21:08 WARN Utils: Service 'sparkDriver' could not bind on a random free port. You may check whether configuring an appropriate binding address.
23/05/09 05:21:08 WARN Utils: Service 'sparkDriver' could not bind on a random free port. You may check whether configuring an appropriate binding address.
23/05/09 05:21:08 WARN Utils: Service 'sparkDriver' could not bind on a random free port. You may check whether configuring an appropriate binding address.
23/05/09 05:21:08 WARN Utils: Service 'sparkDriver' could not bind on a random free port. You may check whether configuring an appropriate binding address.
23/05/09 05:21:08 WARN Utils: Service 'sparkDriver' could not bind on a random free port. You may check whether configuring an appropriate binding address.
23/05/09 05:21:08 WARN Utils: Service 'sparkDriver' could not bind on a random free port. You may check whether configuring an appropriate binding address.
23/05/09 05:21:08 WARN Utils: Service 'sparkDriver' could not bind on a random free port. You may check whether configuring an appropriate binding address.
23/05/09 05:21:08 ERROR SparkContext: Error initializing SparkContext.
java.net.BindException: Cannot assign requested address: Service 'sparkDriver' failed after 16 retries (on a random free port)! Consider explicitly setting the appropriate binding address for the service 'sparkDriver' (for example spark.driver.bindAddress for SparkDriver) to the correct binding address.
	at java.base/sun.nio.ch.Net.bind0(Native Method)
	at java.base/sun.nio.ch.Net.bind(Net.java:459)
	at java.base/sun.nio.ch.Net.bind(Net.java:448)
	at java.base/sun.nio.ch.ServerSocketChannelImpl.bind(ServerSocketChannelImpl.java:227)
	at io.netty.channel.socket.nio.NioServerSocketChannel.doBind(NioServerSocketChannel.java:127)
	at io.netty.channel.AbstractChannel$AbstractUnsafe.bind(AbstractChannel.java:501)
	at io.netty.channel.DefaultChannelPipeline$HeadContext.bind(DefaultChannelPipeline.java:1218)
	at io.netty.channel.AbstractChannelHandlerContext.invokeBind(AbstractChannelHandlerContext.java:496)
	at io.netty.channel.AbstractChannelHandlerContext.bind(AbstractChannelHandlerContext.java:481)
	at io.netty.channel.DefaultChannelPipeline.bind(DefaultChannelPipeline.java:965)
	at io.netty.channel.AbstractChannel.bind(AbstractChannel.java:210)
	at io.netty.bootstrap.AbstractBootstrap$2.run(AbstractBootstrap.java:353)
	at io.netty.util.concurrent.SingleThreadEventExecutor.runAllTasks(SingleThreadEventExecutor.java:399)
	at io.netty.channel.nio.NioEventLoop.run(NioEventLoop.java:446)
	at io.netty.util.concurrent.SingleThreadEventExecutor$2.run(SingleThreadEventExecutor.java:131)
	at io.netty.util.concurrent.DefaultThreadFactory$DefaultRunnableDecorator.run(DefaultThreadFactory.java:144)
	at java.base/java.lang.Thread.run(Thread.java:829)
23/05/09 05:21:08 INFO SparkContext: Successfully stopped SparkContext
Quitting from lines 250-251 (RGMQL-vignette.Rmd) 
Error: processing vignette 'RGMQL-vignette.Rmd' failed with diagnostics:
java.net.BindException: Cannot assign requested address: Service 'sparkDriver' failed after 16 retries (on a random free port)! Consider explicitly setting the appropriate binding address for the service 'sparkDriver' (for example spark.driver.bindAddress for SparkDriver) to the correct binding address.
--- failed re-building ‘RGMQL-vignette.Rmd’

SUMMARY: processing the following file failed:
  ‘RGMQL-vignette.Rmd’

Error: Vignette re-building failed.
Execution halted
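
The vignette fails while starting a local Spark session (Spark 2.2.0, bundled in RGMQLlib's GMQL.jar): the driver cannot bind a listening port on kunpeng1, so SparkContext initialization aborts and vignette building stops. As the exception message suggests, giving the Spark driver an explicit bind address is the usual remedy. Below is a minimal sketch of one possible workaround, assuming the vignette initializes local processing through RGMQL's init_gmql() and that the bundled engine honors the standard SPARK_LOCAL_IP environment variable (neither assumption is verified on this builder):

    # Sketch: point the Spark driver at the loopback address before any
    # RGMQL/GMQL call starts the JVM-side Spark engine.
    Sys.setenv(SPARK_LOCAL_IP = "127.0.0.1")

    library(RGMQL)
    init_gmql()   # local processing; starts the JVM-side GMQL/Spark engine

Alternatively, since the failure looks like a hostname-resolution problem on the builder, making the machine's hostname resolve to a bindable address (e.g. via /etc/hosts) is a system-level fix outside the package; setting spark.driver.bindAddress directly, as the error message suggests, would have to be done on the Spark/Java side rather than from R.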