I have an issue when I try to run Pentaho Data Integration on macOS Big Sur (M1).
The error message is below:
I'm sorry, this Mac platform [arm64] is not yet supported! Please try starting using 'Data Integration 32-bit' or 'Data Integration 64-bit' as appropriate.
Java version:
> java version "1.8.0_291"
Java(TM) SE Runtime Environment (build 1.8.0_291-b10)
Java HotSpot(TM) 64-Bit Server VM (build 25.291-b10, mixed mode)
Can anyone help me with this issue?
Thanks
Try this guide from Reddit:
Here is how you can force the shell to run in Intel mode so that you
can continue working in this little command-line Rosetta Island while
waiting for native ARM64 support.
Open the Terminal app.
Open the Terminal app’s Preferences.
Click on the Profiles tab.
Select a profile, click on the ellipsis at the bottom of the profile list and then select Duplicate Profile.
Click on the new profile and give it a good name. I named mine "Rosetta Shell".
Also in the new profile, click on the Window tab. In the Title field, put a name that indicates this is for running Intel-based apps. I put "Terminal (Intel)" on mine.
Click on the Shell tab and use the following as its Run Command to force the shell to run under Rosetta: env /usr/bin/arch -x86_64 /bin/zsh --login
Untick the Run inside shell checkbox. Clearing it prevents the shell from being run twice, which could bloat your environment variables since ~/.zshrc would be sourced twice.
Optionally set this profile as the Default.
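To confirm the new profile really runs under Rosetta, check the reported machine architecture in that terminal window; it should print x86_64 instead of arm64:
# Run this in the new "Rosetta Shell" profile; a native Apple Silicon shell prints arm64
uname -m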
That is the first step. After that, you have to replace swt.jar in the data-integration folder: /path_to_your_data-integration/libswt/osx64/
Otherwise it won't start.
You can download the jar here
Important: you don't have to rename this file, but you do have to remove the original swt.jar.
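A minimal sketch of the swap, assuming the default libswt path mentioned above and that the downloaded jar landed in ~/Downloads (the placeholder file name is an assumption; keep whatever name the download has, no rename needed):
cd /path_to_your_data-integration/libswt/osx64/
mv swt.jar ~/swt.jar.orig               # move the original swt.jar out of this folder entirely
cp ~/Downloads/<downloaded-swt.jar> .   # the jar linked above; renaming it is not required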
I just want to add that you need to have an x86 Java version installed for Tufan Atak's solution to work.
So if you've installed an M1-compatible (ARM64) Java version and try to start Spoon with it, it will throw the following error:
java.lang.UnsatisfiedLinkError: Could not load SWT library. Reasons:
no swt-cocoa-4944r26 in java.library.path
no swt-cocoa in java.library.path
no swt in java.library.path
Can't load library: /<user-home-path>/.swt/lib/macosx/aarch64/libswt-cocoa-4944r26.jnilib
Can't load library: /<user-home-path>/.swt/lib/macosx/aarch64/libswt-cocoa.jnilib
Can't load library: /<user-home-path>/.swt/lib/macosx/aarch64/libswt.jnilib
/<user-home-path>/.swt/lib/macosx/aarch64/libswt-cocoa-4944r26.jnilib: dlopen(/<user-home-path>/.swt/lib/macosx/aarch64/libswt-cocoa-4944r26.jnilib, 0x0001): tried: '/<user-home-path>/.swt/lib/macosx/aarch64/libswt-cocoa-4944r26.jnilib' (mach-o file, but is an incompatible architecture (have (x86_64), need (arm64e)))
at org.eclipse.swt.internal.Library.loadLibrary(Library.java:348)
at org.eclipse.swt.internal.Library.loadLibrary(Library.java:257)
at org.eclipse.swt.internal.C.<clinit>(C.java:19)
at org.eclipse.swt.widgets.Display.<clinit>(Display.java:107)
at org.pentaho.di.ui.core.widget.OsHelper.setAppName(OsHelper.java:106)
at org.pentaho.di.ui.spoon.Spoon.main(Spoon.java:652)
at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
at java.lang.reflect.Method.invoke(Method.java:498)
at org.pentaho.commons.launcher.Launcher.main(Launcher.java:92)
So you need to install Java again from the Intel terminal profile so that an x86 version is installed.
You can use SDKMAN and then execute this command (for Java 8 Temurin):
> sdk install java 8.0.345-tem
.
.
.
> Do you want java 8.0.345-tem to be set as default? (Y/n): n
The n answer is because you don't want to run every other Java program with the x86 version.
After that you can tell SDKMAN to use this new version in the current terminal shell:
> sdk use java 8.0.345-tem
Then verify that the current version is the one you just told SDKMAN to use:
> java -version
you should see something like this:
openjdk version "1.8.0_345"
OpenJDK Runtime Environment (Temurin)(build 1.8.0_345-b01)
OpenJDK 64-Bit Server VM (Temurin)(build 25.345-b01, mixed mode)
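To double-check that this JVM really is the x86 build (the version banner alone does not show the architecture), you can print its os.arch property; it should report x86_64, not aarch64:
# Prints the JVM system properties along with the version banner
java -XshowSettings:properties -version 2>&1 | grep os.arch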
After that you can finally start Spoon:
> ./spoon.sh
Related
I am developing a RHEL 7 Qt application and need to connect to an Oracle database. When calling QSqlDatabase::addDatabase("QOCI"), I get the following output:
QSqlDatabase: QOCI driver not loaded
QSqlDatabase: available drivers: QSQLITE
I have the Oracle Instant Client v11.2 installed, but I'm not sure where to go from here. I've done extensive research and cannot find a solution.
Based on what I saw online, I tried creating an oci directory within my Qt dir (/usr/lib64/qt5/plugins/sqldrivers) and then created an oci.pro file. Its contents are below:
INCLUDEPATH+=/usr/include/oracle/11.2/client
LIBS+=-L/usr/lib/oracle/11.2/client/lib -lclntsh
TEMPLATE = subdirs
I ran qmake-qt5 on this to generate a Makefile, but when I run make, the necessary QOCI .so file is not generated.
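Note that a TEMPLATE = subdirs project has no compile targets, so make has nothing to build; the QOCI plugin normally has to be compiled from the Qt source tree that matches the installed Qt 5 version. A rough sketch under that assumption, reusing the Instant Client paths from the question (the source-tree layout, archive name, and output location below are assumptions for a pre-5.12 Qt 5 and will need adapting):
# Build the QOCI plugin from the Qt sources matching the installed Qt version
cd qt-everywhere-src-5.x/qtbase/src/plugins/sqldrivers/oci
qmake-qt5 "INCLUDEPATH+=/usr/include/oracle/11.2/client" \
          "LIBS+=-L/usr/lib/oracle/11.2/client/lib -lclntsh" oci.pro
make
# Copy the resulting plugin next to the other SQL drivers
sudo cp ../../../../plugins/sqldrivers/libqsqloci.so /usr/lib64/qt5/plugins/sqldrivers/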
I'm having issues getting the ODBC driver for Snowflake to work on an M1 Apple Silicon Mac running Big Sur.
Successfully following the instructions on Snowflake's website gets me to the point where testing the driver from the command line (using iodbctest) using the DSN results in the following error:
1: SQLDriverConnect = [iODBC][Driver Manager]dlopen(/opt/snowflake/snowflakeodbc/lib/universal/libSnowflake.dylib, 6): no suitable image found. Did find:
/opt/snowflake/snowflakeodbc/lib/universal/libSnowflake.dylib: no matching architecture in universal wrapper
/opt/snowfl (0) SQLSTATE=00000
2: SQLDriverConnect = [iODBC][Driver Manager]Specified driver could not be loaded (0) SQLSTATE=IM003
My Snowflake driver is installed to /opt/snowflake/snowflakeodbc, so that is correct -- I suspect this is specifically an M1 problem. I'm using version 2.24.1 of the driver available from the download mirror here, and the path to the driver in /etc/odbcinst.ini is /opt/snowflake/snowflakeodbc/lib/universal/libSnowflake.dylib (which exists and, from all my research, seems to be right).
When I run a connection via DBI in R, I get a completely different error:
Error: nanodbc/nanodbc.cpp:1021: 00000:
[Snowflake][ODBC] (11560) Unable to locate SQLGetPrivateProfileString function.
In other Stack Overflow posts, people have said the above error means there is a missing library of some kind (iODBC isn't configured correctly?), but I've tried quite a few things to no avail. Any guidance would be great.
Tinkered with this a bit more and realized it's an artifact of the installation paths of the .dmgs and the preset paths in simba.snowflake.ini.
You need to point the Snowflake driver at the iODBC dylib (as per a passing remark in the docs) -- the driver is originally configured to look for the ODBC dylib (not iODBC) in a folder that's on the path.
When you install the iODBC driver, verify that it is installed to /usr/local/iODBC (this is where my Silicon Mac installed it) and that /usr/local/iODBC/lib contains libiodbc.dylib. If so, navigate to your installed Snowflake driver directory (should be /etc/snowflake) and edit the simba.snowflake.ini file (/etc/snowflake/snowflake/snowflakeodbc/universal/simba.snowflake.ini). You want to uncomment the last line and change it to point, with a full path, at the iODBC dylib (instead of the default, which is the ODBC dylib).
# Darwin specific ODBCInstLib
# iODBC
ODBCInstLib=/usr/local/iODBC/lib/libiodbcinst.dylib
Make sure to comment out any other ODBCInstLib line so that only one is configured. That should get your connection to Snowflake up and running on an M1 Mac.
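If it still fails, a quick way to confirm which CPU architectures each dylib actually contains, and to re-test the DSN, is something like the following (the DSN name is a placeholder; use whatever you configured in odbc.ini):
# Show the architectures baked into the Snowflake and iODBC dylibs
file /opt/snowflake/snowflakeodbc/lib/universal/libSnowflake.dylib
file /usr/local/iODBC/lib/libiodbcinst.dylib
# Re-test the DSN with iODBC's test tool
iodbctest "DSN=<your_dsn>"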
Big Sur is macOS v11.n.
Snowflake supports macOS 10.14 and 10.15 (see Supported OSs).
So what you are trying to do is not supported and is unlikely to work.
None of the other solutions worked for me, but kiran-kumawat's answer set me down a path that worked.
It seems the core of the issue is that the ODBC code is looking for arm64 drivers but Snowflake provides them as x86_64. By installing an x86_64 version of ODBC we can have it talk to the driver successfully.
First I uninstalled R and RStudio. (It may be possible to symlink or change things behind the scenes to make this work with existing installs, but I am not sure.)
Then install Rosetta (Apple's software for translating between architectures) and a version of Homebrew built with it. I am leaving my main version of Homebrew in place.
softwareupdate --install-rosetta
arch -x86_64 /bin/bash -c "$(curl -fsSL https://raw.githubusercontent.com/Homebrew/install/master/install.sh)"
Then use that version of Homebrew to install unixODBC, R, and RStudio.
arch -x86_64 /usr/local/Homebrew/bin/brew install unixodbc
arch -x86_64 /usr/local/Homebrew/bin/brew install --cask rstudio
arch -x86_64 /usr/local/Homebrew/bin/brew install --cask r
We then need to install the snowflake driver: https://sfc-repo.snowflakecomputing.com/odbc/mac64/latest/index.html
Click through all the install prompts.
Modify your files
/usr/local/etc/odbcinst.ini:
[Snowflake Driver]
Driver = /opt/snowflake/snowflakeodbc/lib/universal/libSnowflake.dylib
/usr/local/etc/odbc.ini
[Snowflake]
Driver = Snowflake Driver
uid = <uid>
server = <server>
role = <role>
warehouse = <warehouse>
authenticator = externalbrowser
We also need to modify the simba.snowflake.ini file.
It is somewhat locked down so run:
sudo chmod 646 /opt/snowflake/snowflakeodbc/lib/universal/simba.snowflake.ini
Then
vim /opt/snowflake/snowflakeodbc/lib/universal/simba.snowflake.ini
And find the ODBCInstLib line that is uncommented and change it to:
ODBCInstLib=/usr/local/Cellar/unixodbc/2.3.9_1/lib/libodbcinst.dylib
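The Cellar path above is specific to unixodbc 2.3.9_1; you can look up the path that matches your own install with the x86_64 Homebrew, for example:
# Ask the Intel Homebrew where unixodbc lives, then confirm the dylib is there
arch -x86_64 /usr/local/Homebrew/bin/brew --prefix unixodbc
ls "$(arch -x86_64 /usr/local/Homebrew/bin/brew --prefix unixodbc)/lib/libodbcinst.dylib"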
After setting this up I was able to use this connection successfully:
install.packages("DBI")
install.packages("odbc")
con <- DBI::dbConnect(odbc::odbc(), "Snowflake")
One of our team members suggested the steps below, and they worked for us on the Apple M1 series.
Install the latest Snowflake driver.
Uninstall the M1-based Homebrew using:
/bin/bash -c "$(curl -fsSL https://raw.githubusercontent.com/Homebrew/install/master/uninstall.sh)"
Install Intel-based Homebrew (restart the terminal when done):
arch -x86_64 /bin/bash -c "$(curl -fsSL https://raw.githubusercontent.com/Homebrew/install/master/install.sh)"
Re-install unixodbc
arch -x86_64 brew install unixodbc
Test
isql -v Pattern
In your database.yml file, for the connection to Snowflake, make the following change:
Replace "dsn: <DSN_NAME>" with the following:
conn_str: "Driver={PATH};Locale=en-US;uid={USER_NAME};pwd={PASSWORD};server=<yours>.snowflakecomputing.com;role=<ROLE>;charset=UTF-8;warehouse=<WAREHOUSE>;database=<DATABASE>;schema=<SCHEMA>;"
Has anyone gotten this to work? I use Excel with ODBC to refresh Snowflake files and have tried multiple ways to move the drivers, etc., and followed Snowflake's instructions, but it never works. I did get Parallels to work running Windows ARM, but would prefer to just do this in macOS.
I also have an M1 (Monterey 12.0) and ran into similar issues when I tested the driver. Nevertheless, when I tried the "real connection" it worked like a charm. So maybe it would be good for you to test the real connection instead, to avoid wasting time on such driver tests. Hope you find this useful.
I'm learning Jackrabbit and following the documentation to run a standalone server. When I run the command java -jar jackrabbit-standalone-2.16.2.jar and access localhost:8080 on my browser, I get a 500 error saying:
org.apache.jasper.JasperException: PWC6345: There is an error in invoking javac. A full JDK (not just JRE) is required
What am I doing wrong?
Note: I have set my jdk/bin path in my environment variables. Also, my javac command is working properly. I have JDK version 1.8.0_74 and Jackrabbit version 2.16.2.
Edit: According to this answer, I tried adding my JDK to the installed JREs in Eclipse, but that didn't solve my problem.
Running the latest Jackrabbit standalone jar (2.17.3) on my machine (Windows 10, with JAVA_HOME pointing to a Java 8 JDK) produced the same errors.
I then executed the jar with java -Djava.home="%JAVA_HOME%" -jar jackrabbit-standalone-2.17.3.jar. Although I got the same error in the browser, I was able to see errors in the console where I invoked the command.
One of these errors was:
can't open C:\Progra~1\Java\jdk1.8.0_144\lib\tzmappings.
Searching my Java installation, I found that the missing files are located under the JRE's installation folder.
So I eventually made the standalone jar work with:
java -Djava.home="%JAVA_HOME%\jre" -jar jackrabbit-standalone-2.17.3.jar
The initial error is a bit misleading as it refers to javac and not to the missing files.
The whole thing seems like a bug to me. Please give my workaround a try, and if it works for you, consider filing a bug in Jackrabbit's issue tracker.
jackrabbit-standalone uses JSP. JSP needs compilation. Compilation needs JDK.
Before running java -jar jackrabbit-standalone-2.16.2.jar, did you check your JAVA_HOME and make sure it refers to a fully-fledged JDK? In short, its bin directory should contain javac.
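A quick sanity check on Windows (adapt for your shell) might look like this:
REM Confirm JAVA_HOME points at a full JDK: its bin must contain javac
echo %JAVA_HOME%
"%JAVA_HOME%\bin\javac" -version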
I found that there was another entry in the Path System environment variable preceding my %JAVA_HOME%\bin entry.
You don't have to delete the other entry; just move it down (or move %JAVA_HOME%\bin up) to correct the load order.
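To confirm the order is now right, you can list which executables win on the Path; where prints matches in Path order, so the JDK entries should come first:
REM The first hit for each should live under %JAVA_HOME%\bin
where java
where javac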
What happens:
I execute the following command.
java -jar sat4j-sat.jar -remote
No window opens, and I get the same console output as without the -remote flag, which begins:
c SAT4J: a SATisfiability library for Java (c) 2004-2013 Artois (...)
c This is free software under the dual EPL/GNU LGPL licenses.
c See www.sat4j.org for details.
c version 2.3.4.v20130419
c java.runtime.name OpenJDK Runtime Environment
c java.vm.name OpenJDK Client VM
c java.vm.version 24.65-b04
c java.vm.vendor Oracle Corporation
c sun.arch.data.model 32
c java.version 1.7.0_65
c os.name Linux
c os.version 3.2.0-4-686-pae
(...)
What is expected:
From readme.txt:
To run sat4j with on the fly configuration:
java -jar sat4j-sat.jar -remote
These instructions should open a java window named Remote Control. We
assume that the 1.5 version of the java command is in your path. If
it isn’t, then you should either specify the complete path to the java
command or update your PATH environment variable as described in the
installation instructions for the Java 2 SDK.
Other details
I have tried multiple versions of the library, up to 2.3.4.
My system is Debian 7 with Gnome 2.
My default Java installation is OpenJDK 1.7.0_65.
My secondary Java installation is Oracle Java 1.8.0_45 (with the same issue).
Gnuplot 4.6 is installed.
My first machine has a 32 bit dual core CPU with 2GB of RAM.
My second machine has a 64 bit quad core CPU with 8GB of RAM with nearly identical software.
Question
Has anyone used SAT4J's remote control feature? What is the problem with my method?
Update
On another machine (64-bit Debian 7) the window opens. After the start, .dat files are created, but plotting does not start.
Update 2
I ran the generated instance.dimacs-gnuplot.gnuplot file manually from a gnuplot terminal, and I got the message unknown or ambiguous terminal type for the x11 type. I installed the gnuplot-x11 package, and now it works on the workplace machine: I can see the diagrams (wow!). Unfortunately on my home machines the Remote Control window still doesn't open.
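For reference, the fix on the workplace machine amounted to installing gnuplot's x11 terminal and re-running the generated script (Debian package name as mentioned above; -persist keeps the plot window open):
sudo apt-get install gnuplot-x11
gnuplot -persist instance.dimacs-gnuplot.gnuplot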
The -remote parameter is used to display the remote control, i.e. to set up the various parameters of the solver.
If you also want to monitor what the solver is doing, you need to use the -r parameter in conjunction with it.
So the complete command line should be:
java -jar sat4j-sat.jar -r -remote file.cnf
You can get a fresh snapshot of Sat4j Sat on our continuous integration server:
http://bamboo.ow2.org/browse/SAT4J-DEF2-41/artifact/JOB1/nightly_build/
This might solve the issue you hit with the 2.3.4 release.
Cheers,
Daniel
I have struggled with this issue for a few days but haven't found the right answer yet.
Here is the Problem Description:
I wrote a normal Java program (Program-A), and a Windows native agent (*.dll, written in C/C++) with Agent_OnLoad, Agent_OnAttach, and Agent_OnUnload methods, which works fine when loaded with the Java command-line flag (-agentlib). Then I wrote another Java program to attach the native agent to the running Program-A (see the code below for VM attach and loadAgentPath); however, I got the exception:
com.sun.tools.attach.AgentLoadException: Failed to load agent library
I tried changing the agentPath (absolute or relative file path) in various ways; none of them works. Should I try some other way to make this work? What I need is to attach a native agent to a running Java program rather than using the command-line flag.
Does anyone know the root cause or have a clue to the solution?
BTW, the command line to run the attaching Java code is:
java -Djava.library.path=D:\DevTools\Java7\jre7\bin -classpath .;./tools.jar com.xxx.TestAgentVMAttacher
...
VirtualMachine virtualMachine = com.sun.tools.attach.VirtualMachine.attach(pid); // Note: this code line is executed normally, I am sure the pid is correct
...
agentPath = theFilePath + "/myagent.dll"; // Note: I am sure the dll file path is correct
virtualMachine.loadAgentPath(agentPath, null); // Note: this line causes the exception (AgentLoadException) mentioned above, no matter how I set agentPath; even if I set it to null, the same exception happens.
Environment related info:
- OS: Windows XP
- Java Version: Java(TM) SE Runtime Environment (build 1.7.0-b147)
Eventually I found the answer to my question: I had a wrong method name ('Agent_Attach') in the Agent.cpp file. The correct one is 'Agent_OnAttach'. With this fix, my agent lib (.dll) can be loaded into a running Java program.
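If you hit something like this again, one way to confirm the DLL really exports the expected entry points is to dump its export table with dumpbin (ships with Visual Studio; the DLL name follows the question above):
REM List the exported symbols and look for the JVMTI entry points
dumpbin /exports myagent.dll | findstr /i "Agent_OnLoad Agent_OnAttach Agent_OnUnload"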