I'm trying to deploy my Lambda functions using the Serverless Framework.
I'm pretty sure it worked in the past; now I get the error below when calling sls deploy.
The Lambda runtime is Python. Here is the latest version of my serverless.yaml.
I'm running the deploy from macOS Big Sur, but I get the same error from CentOS.
Any clue?
Error ---------------------------------------------------
Error: EMFILE: too many open files, open '/Users/antoniodalessio/personal/dynamoplus-project/dynamoplus/serverless/.serverless/get.zip'
For debugging logs, run again after setting the "SLS_DEBUG=*" environment variable.
Get Support --------------------------------------------
Docs: docs.serverless.com
Bugs: github.com/serverless/serverless/issues
Issues: forum.serverless.com
Your Environment Information ---------------------------
Operating System: darwin
Node Version: 17.3.0
Framework Version: 2.69.1 (local)
Plugin Version: 5.5.1
SDK Version: 4.3.0
Components Version: 3.18.1
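For what it's worth, EMFILE means the deploy process is hitting the per-process open-file limit (macOS defaults to 256 descriptors). One workaround I can try is raising the limit for the shell session before deploying; a sketch (the 10240 value is arbitrary):
# Check the current per-process open-file limit (often 256 on macOS)
ulimit -n
# Raise it for the current shell session, then retry the deploy
ulimit -n 10240
sls deploy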
I have updated my CUDA toolkit version to 11.4. When I try to migrate any CUDA code to DPC++ using the DPCT tool, I get the following error:
dpct exited with code: -5 (Error: Path for CUDA header files is invalid or not available. Specify with --cuda-include-path)
I have even specified the --cuda-include-path flag, but the error still persists. Am I missing something?
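For reference, my invocation looks roughly like this (the CUDA install path and source file name below are illustrative):
# Hypothetical example; /usr/local/cuda-11.4/include is an assumed install path
dpct --cuda-include-path=/usr/local/cuda-11.4/include vector_add.cu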
Environment:
OS: CentOS 8
oneAPI Base Toolkit version: 2021.3
The DPCT tool does not support CentOS 8. You can try using one of the supported operating systems for DPCT migration.
Refer to the link below for the supported OSes and their versions.
https://software.intel.com/content/www/us/en/develop/articles/intel-dpc-compatibility-tool-system-requirements.html
This has been happening ever since I updated IntelliJ (IDEA CE 2020.3) to a newer version (today). I am getting this exception from the plugin when running the Develop on Kubernetes Run Configuration that I usually use with my local Minikube instance to get all of the services in the cluster up and running, and to debug them in Debug mode.
My local Minikube instance is fine, as shown by the following:
(Dev) $ minikube status
minikube
type: Control Plane
host: Running
kubelet: Running
apiserver: Running
kubeconfig: Configured
I've tried checking for updates and restarting IntelliJ, and I am still getting the same thing. It must be something related to my IntelliJ update, but we'll have to see...
The full stack trace is:
java.util.ServiceConfigurationError: io.grpc.ManagedChannelProvider: io.grpc.netty.shaded.io.grpc.netty.NettyChannelProvider not a subtype
at java.base/java.util.ServiceLoader.fail(ServiceLoader.java:588)
at java.base/java.util.ServiceLoader$LazyClassPathLookupIterator.hasNextService(ServiceLoader.java:1236)
at java.base/java.util.ServiceLoader$LazyClassPathLookupIterator.hasNext(ServiceLoader.java:1264)
at java.base/java.util.ServiceLoader$2.hasNext(ServiceLoader.java:1299)
at java.base/java.util.ServiceLoader$3.hasNext(ServiceLoader.java:1384)
at io.grpc.ServiceProviders.loadAll(ServiceProviders.java:67)
at io.grpc.ServiceProviders.load(ServiceProviders.java:42)
at io.grpc.ManagedChannelProvider.<clinit>(ManagedChannelProvider.java:37)
at io.grpc.ManagedChannelBuilder.forAddress(ManagedChannelBuilder.java:37)
at com.google.cloud.tools.intellij.kubernetes.skaffold.events.SkaffoldEventHandler.newManagedChannel(SkaffoldEventHandler.kt:319)
at com.google.cloud.tools.intellij.kubernetes.skaffold.events.SkaffoldEventHandler.listenEvents(SkaffoldEventHandler.kt:75)
at com.google.cloud.tools.intellij.kubernetes.skaffold.run.SkaffoldCommandLineState$startProcess$1.invokeSuspend(SkaffoldCommandLineState.kt:189)
at kotlin.coroutines.jvm.internal.BaseContinuationImpl.resumeWith(ContinuationImpl.kt:33)
at kotlinx.coroutines.DispatchedTask.run(Dispatched.kt:241)
at kotlinx.coroutines.scheduling.CoroutineScheduler.runSafely(CoroutineScheduler.kt:594)
at kotlinx.coroutines.scheduling.CoroutineScheduler.access$runSafely(CoroutineScheduler.kt:60)
at kotlinx.coroutines.scheduling.CoroutineScheduler$Worker.run(CoroutineScheduler.kt:740)
I am getting the same behaviour in both DEBUG mode and RUN mode.
Environment Info
IDE type: IntelliJ
IDE version: Community Edition 2020.3
Cloud Code version: 20.10.1-202
Skaffold version: v1.14.0
Operating System: Windows 10 Pro 64-bit
Any help, suggestions, or resolutions would be really appreciated. Thanks in advance!
This issue was fixed with patch release 20.12.1 that was put out shortly after the EAP release. Please try it out and if you run into any other issues feel free to post on our GitHub. – eshaul
I was trying to install the ZAP proxy on my Parrot Home OS machine, but I'm unable to install it. The error that I'm receiving in the terminal is as follows:
A fatal error has been detected by the Java Runtime Environment:
SIGBUS (0x7) at pc=0x00007f904544b12f, pid=6446, tid=6447
JRE version: OpenJDK Runtime Environment (11.0.5+10) (build 11.0.5+10-post-Debian-2)
Java VM: OpenJDK 64-Bit Server VM (11.0.5+10-post-Debian-2, mixed mode, sharing, tiered, compressed oops, g1 gc, linux-amd64)
Problematic frame:
V  [libjvm.so+0xcce12f]
No core dump will be written. Core dumps have been disabled. To enable core dumping, try "ulimit -c unlimited" before starting Java again
If you would like to submit a bug report, please visit:
https://bugs.debian.org/openjdk-11
Aborted
This error message...
A fatal error has been detected by the Java Runtime Environment:
SIGBUS (0x7) at pc=0x00007f904544b12f, pid=6446, tid=6447
JRE version: OpenJDK Runtime Environment (11.0.5+10) (build 11.0.5+10-post-Debian-2)
Java VM: OpenJDK 64-Bit Server VM (11.0.5+10-post-Debian-2, mixed mode, sharing, tiered, compressed oops, g1 gc, linux-amd64)
Problematic frame:
V  [libjvm.so+0xcce12f]
...implies that the problematic frame was a VM-internal (V) frame, i.e., the crash happened inside libjvm.so, within the JVM itself, rather than in ZAP's own code.
The crash you observed is therefore possibly not a ZAP or Java issue, but a Debian issue with the packaged OpenJDK build.
Deep Dive
However, as per the documentation in the What versions of Java are supported? page, ZAP should be able to run with all newer Java versions, but might require a minimum Java version for certain ZAP versions:
ZAP 2.7.0 and later requires a minimum of Java 1.8
ZAP 2.0.0 and later requires a minimum of Java 1.7
Previous versions of ZAP also support Java 1.6, the last of those being 1.4.1
Additionally, as per the documentation on the Download ZAP page:
The Windows and Linux versions require Java 8 or higher to run.
The macOS version includes Java 8 - you can use the Linux or Cross Platform versions if you do not want to download this.
Finally, as per the documentation on the Release 2.9.0 page:
This is a bug fix and enhancement release, which requires a minimum of Java 8. Note that a minimum of Java 11 is recommended, especially for high DPI displays.
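Given the above, a quick sanity check is to confirm which Java build ZAP picks up and, if it matches the crashing Debian build, try a different JDK. A sketch, assuming the Linux package layout and an alternative JDK install path (both paths are assumptions):
# Confirm which Java is on the PATH (the crash log shows 11.0.5+10-post-Debian-2)
java -version
# Put a different JDK build first on the PATH for this session; the install path is an assumption
export PATH=/usr/lib/jvm/java-11-openjdk-amd64/bin:$PATH
./zap.sh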
References
You can find a couple of relevant discussions in:
JVM crash: fatal error has been detected by the Java Runtime Environment
“A fatal error has been detected by the Java Runtime Environment” when running java project on another computer
A fatal error has been detected by the Java Runtime Environment (SIGBUS (0x7))
JVM crashes with problematic frame [libjvm.so+0x7f582e] PerfLongVariant::sample()+0x1e
I am attempting to build the Mule ESB 3.5.0 Community Runtime found at https://github.com/mulesoft/mule/releases/tag/mule-3.5.0
If I build using Maven and skip the tests, everything is fine.
However, if I leave the tests enabled, a few fail and the rest are skipped.
I have tried building each Maven module individually, in the order they are listed in the parent POM, using the setups listed below.
Could someone please advise on any additional steps or environment setup required in order to successfully build the source?
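For reference, the build that succeeds for me is just the standard skip-tests invocation (a sketch; I'm assuming the usual Maven flags):
# Builds cleanly from the source root when tests are skipped
mvn clean install -DskipTests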
Setup #1
Ubuntu 14.04 LTS Desktop (64bit)
java-7-oracle JDK (64bit)
Maven 3.0.5
The results are as follows:
buildtools - ALL TESTS PASS
core - ALL TESTS PASS
distributions - ALL TESTS PASS
examples - ALL TESTS PASS
modules - failed on Management Extensions:
testDefaultJmxAgent(org.mule.management.JmxAgentEmptyConfigurationTestCase)
testDefaultJmxAgent(org.mule.management.JmxAgentDefaultConfigurationWithRMITestCase)
patterns - ALL TESTS PASS
tools - ALL TESTS PASS
transports - failed on HTTP Transport:
createHttpServerConnectionWithHttpConnectorProperties(org.mule.transport.http.HttpServerConnectionTestCase)
tests - failed on Integration Tests:
testOutboundInMiddleOfFlow(org.mule.test.construct.FlowOutboundInMiddleOfFlowTestCase)
validatesDbConnectorGenericMySqlOverriddenTemplateResolution(org.mule.spring.config.NewDatabaseMuleArtifactTestCase)
verifiesDerby(org.mule.spring.config.DatabaseMuleArtifactTestCase)
Setup #2
Windows 7 Pro (64bit)
Oracle Java JDK 1.6.0_31 (64bit)
Maven 3.2.1
The results are as follows:
buildtools - ALL TESTS PASS
core - Failed
testFullStackTraceWithoutMessage(org.mule.util.ExceptionUtilsTestCase)
dateTimeIsAfter[0](org.mule.el.context.ServerContextTestCase)
testIsSupportedJdkVersion(org.mule.util.JdkVersionUtilsTestCase)
testRecommendedJdkVersion(org.mule.util.JdkVersionUtilsTestCase)
testValidateJdk5(org.mule.util.JdkVersionUtilsTestCase)
Setup #3
Windows 7 Pro (64bit)
Oracle Java JDK 1.7.0_51 (64bit)
Maven 3.2.1
The results are as follows:
buildtools - ALL TESTS PASS
core - Failed
testFullStackTraceWithoutMessage(org.mule.util.ExceptionUtilsTestCase)
dateTimeIsAfter[0](org.mule.el.context.ServerContextTestCase)
testIsSupportedJdkVersion(org.mule.util.JdkVersionUtilsTestCase)
testRecommendedJdkVersion(org.mule.util.JdkVersionUtilsTestCase)
testValidateJdk5(org.mule.util.JdkVersionUtilsTestCase)
Regards
Kumaran
Mule 3.5 has to be compiled against Java 6.
Theoretically it can be compiled on Windows, but I would rather try with Linux/Mac.
Make sure you allocate enough RAM: export MAVEN_OPTS="-Xmx4000m"
Read the BUILD instructions carefully. A sketch of the environment setup follows.
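A minimal sketch of that setup, assuming an Oracle JDK 6 installed under /usr/lib/jvm/java-6-oracle (adjust the path to your installation):
# Point the build at JDK 6 and give Maven enough heap
export JAVA_HOME=/usr/lib/jvm/java-6-oracle
export PATH="$JAVA_HOME/bin:$PATH"
export MAVEN_OPTS="-Xmx4000m"
# Full build with tests, from the source root
mvn clean install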
I'm trying to configure Hadoop 2.3.0 on Windows, but it gives me
Error: Could not find or load main class org.apache.hadoop.hdfs.tools.GetConf
and lots of other errors like "winutils.exe" and "hadoop.dll" missing, and "load main class" errors.
Please help me configure Hadoop on a Windows 8 machine.
I have installed
JDK 1.7
cygwin64
If we directly take the binary distribution of the Apache Hadoop 2.3 release and try to run it on Microsoft Windows, then we'll encounter ERROR util.Shell: Failed to locate the winutils binary in the hadoop binary path.
The binary distribution of the Apache Hadoop 2.3 release does not contain some Windows native components (like winutils.exe, hadoop.dll, etc.). These are required (not optional) to run Hadoop on Windows.
So you need to build a Windows-native binary distribution of Hadoop from the source code, following the "BUILD.txt" file located inside the source distribution of Hadoop. You can follow the posts below (applicable to Hadoop 2.3 as well) for a step-by-step guide with screenshots; a sketch of the core build command follows the links.
Build, Install, Configure and Run Apache Hadoop 2.2.0 in Microsoft Windows OS
ERROR util.Shell: Failed to locate the winutils binary in the hadoop binary path
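As a rough sketch of that build step from BUILD.txt (run from the Hadoop source root in a Windows SDK command prompt, with Maven, protoc, and cmake already set up as BUILD.txt describes):
# Build the Windows-native binary distribution, skipping tests
mvn package -Pdist,native-win -DskipTests -Dtar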
Follow the steps described in the link below, which covers the installation of Hadoop 2.3.0 on a Windows 8 machine.
http://www.srccodes.com/p/article/38/build-install-configure-run-apache-hadoop-2.2.0-microsoft-windows-os
This works perfectly.
Thanks
Installing on Windows:
http://docs.hortonworks.com/HDPDocuments/HDP2/HDP-2.0.6.0-Win/bk_installing_hdp_for_windows/content/win-chap2-singlenode.html
Prerequisites:
http://docs.hortonworks.com/HDPDocuments/HDP2/HDP-2.0.6.0-Win/bk_installing_hdp_for_windows/content/win-getting-ready-2-3-2.HTML
In the tutorial suggested above, the build instructions do not work for the most recent versions of Visual Studio, the .NET Framework, and Windows.
http://www.srccodes.com/p/article/38/build-install-configure-run-apache-hadoop-2.2.0-microsoft-windows-os
First download the desired source version and ...
You do not need the Windows 7 SDK (with it, the build will fail to create the binary). For Windows 8, you can build the hadoop-2.5.2-src\hadoop-common-project\hadoop-common\src\main\winutils solution and the C:\hfds\hadoop-2.5.2-src\hadoop-common-project\hadoop-common\src\main\native solution in Visual Studio.
Then download the binary version...
and place the output files from hadoop-2.5.2-src\hadoop-common-project\hadoop-common\target\bin into the bin folder of the downloaded Hadoop binary.
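A sketch of that copy step, assuming a Cygwin shell (the question mentions cygwin64 is installed) and the paths used in this answer; the C:\hadoop-2.5.2 download location is an assumption, so adjust to your layout:
# Copy the freshly built native binaries into the downloaded binary's bin folder
cp /cygdrive/c/hfds/hadoop-2.5.2-src/hadoop-common-project/hadoop-common/target/bin/* /cygdrive/c/hadoop-2.5.2/bin/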
Then follow the remaining steps of the tutorial...