Connecting to the built-in AnyLogic DB - HSQLDB

I've been using AnyLogic's feature to collect data into the database provided by the program.
However, I have trouble extracting the data again. I've tried using the export-to-Excel function, but the program gets stuck (probably because of the large amount of data).
Now I'm looking at connecting directly to the database. AnyLogic provides me with a connection string: jdbc:hsqldb:hsql://localhost:9001/nau_sterilcentral;file:C:\Users\nbn\Models\NAU sterilcentral\database\db
I only use the first part (up to the file:C:\...) since DBeaver doesn't ask for more. My connection string in DBeaver looks like this: jdbc:hsqldb:hsql://localhost:9001/nau_sterilcentral
I expect the DB to be running, since I can query it from AnyLogic, but I'm not sure. When I try to connect I keep getting an error:
connection exception: connection failure: java.io.EOFException
org.hsqldb.HsqlException: connection exception: connection failure: java.io.EOFException

I don't know AnyLogic, but:
HSQLDB needs to be started in server mode to allow a connection from a different process. If AnyLogic starts HSQLDB in embedded mode, you can't access the database as long as AnyLogic is running.
Also, the JDBC URL needs to be either a server URL using localhost:9001 or a "local" URL containing a file name; you cannot mix the two (and the syntax of the file-based part was wrong as well).
Assuming AnyLogic starts HSQLDB in embedded mode, you have to stop AnyLogic first; then you can connect using a file-based URL. The syntax for that would be:
jdbc:hsqldb:C:\Users\nbn\Models\NAU sterilcentral\database\db
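For example, once AnyLogic is stopped, a plain JDBC program can open the database with that file-based URL. This is a minimal sketch assuming the HSQLDB driver jar is on the classpath and the default HSQLDB credentials (user SA, empty password); your model may use different credentials, and the table name is a placeholder:

import java.sql.Connection;
import java.sql.DriverManager;
import java.sql.ResultSet;
import java.sql.Statement;

public class HsqldbFileConnect {
    public static void main(String[] args) throws Exception {
        // The file-based URL from above; backslashes must be escaped in Java
        String url = "jdbc:hsqldb:C:\\Users\\nbn\\Models\\NAU sterilcentral\\database\\db";
        // SA with an empty password is the HSQLDB default
        try (Connection conn = DriverManager.getConnection(url, "SA", "");
             Statement st = conn.createStatement();
             // my_table is a placeholder for one of the model's log tables
             ResultSet rs = st.executeQuery("SELECT COUNT(*) FROM my_table")) {
            rs.next();
            System.out.println("rows: " + rs.getLong(1));
        }
    }
}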

Related

LinqPad DB connections - get it to NOT connect when I open a file?

I have many LinqPad scripts I use to query databases. These files are all in Dropbox, so I have all my queries available across computers. Very handy.
However, each script is tied to a database connection, and that connection is not valid across computers. I'm guessing it has to do with how the passwords are stored. And that's OK.
What is annoying is that when I open a script, LinqPad attempts to connect to the invalid database and freezes while this happens. Once it's done, all I have to do is change it to the new DB and it's fine, but the freeze interrupts my flow.
Is there a way to tell LinqPad NOT to try to connect when I open a file?
You can use a connection created from a connection string, so you can add a ReadKey prompt or any other mechanism before the connection is actually opened:
// Build the typed context explicitly instead of relying on the stored connection
using (TypedDataContext context = new TypedDataContext(Connection.ConnectionString))
{
    context.table.Where(w => w.ID == 123);
}
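Presumably the point is that the script then no longer depends on a stored LinqPad connection, so there is nothing for LinqPad to auto-open when the file loads; TypedDataContext is the context LinqPad generates for the query, and Connection.ConnectionString stands for whatever string you supply at run time.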

Getting frequent "connection timeout" error on Databricks Job while connecting to Azure SQL Database

We have Clojure code that runs on Databricks and fetches a large amount of data from an Azure SQL Database.
Recently, we have been getting frequent connection timeout errors.
I am new to Clojure, so I don't understand why this error occurs. Sometimes the code runs perfectly, and sometimes it fails.
We have tried different connection parameters like "connectionretry" and "logintimeout", but they didn't help.
com.microsoft.sqlserver.jdbc.SQLServerException: Connection timed out (Read failed)
at com.microsoft.sqlserver.jdbc.SQLServerException.makeFromDriverError(SQLServerException.java:234)
at com.microsoft.sqlserver.jdbc.SimpleInputStream.getBytes(SimpleInputStream.java:352)
at com.microsoft.sqlserver.jdbc.DDC.convertStreamToObject(DDC.java:796)
at com.microsoft.sqlserver.jdbc.ServerDTVImpl.getValue(dtv.java:3777)
at com.microsoft.sqlserver.jdbc.DTV.getValue(dtv.java:247)
at com.microsoft.sqlserver.jdbc.Column.getValue(Column.java:190)
at com.microsoft.sqlserver.jdbc.SQLServerResultSet.getValue(SQLServerResultSet.java:2054)
at com.microsoft.sqlserver.jdbc.SQLServerResultSet.getValue(SQLServerResultSet.java:2040)
at com.microsoft.sqlserver.jdbc.SQLServerResultSet.getObject(SQLServerResultSet.java:2372)
at clojure.java.jdbc$dft_read_columns$fn__226.invoke(jdbc.clj:457)
at clojure.core$mapv$fn__6953.invoke(core.clj:6627)
at clojure.lang.LongRange.reduce(LongRange.java:233)
at clojure.core$reduce.invokeStatic(core.clj:6544)
at clojure.core$mapv.invokeStatic(core.clj:6618)
at clojure.core$mapv.invoke(core.clj:6618)
at clojure.java.jdbc$dft_read_columns.invokeStatic(jdbc.clj:457)
at clojure.java.jdbc$dft_read_columns.invoke(jdbc.clj:453)
at clojure.java.jdbc$result_set_seq$row_values__233.invoke(jdbc.clj:483)
at clojure.java.jdbc$result_set_seq$thisfn__235.invoke(jdbc.clj:493)
at clojure.java.jdbc$result_set_seq$thisfn__235$fn__236.invoke(jdbc.clj:493)
at clojure.lang.LazySeq.sval(LazySeq.java:40)
at clojure.lang.LazySeq.seq(LazySeq.java:49)
at clojure.lang.RT.seq(RT.java:521)
at clojure.core$seq__4357.invokeStatic(core.clj:137)
at clojure.core$map$fn__4785.invoke(core.clj:2637)
at clojure.lang.LazySeq.sval(LazySeq.java:40)
at clojure.lang.LazySeq.seq(LazySeq.java:49)
at clojure.lang.Cons.next(Cons.java:39)
at clojure.lang.RT.next(RT.java:688)
at clojure.core$next__4341.invokeStatic(core.clj:64)
at clojure.core$dorun.invokeStatic(core.clj:3033)
at clojure.core$doall.invokeStatic(core.clj:3039)
at clojure.core$doall.invoke(core.clj:3039)
at clojure.java.jdbc$query$fn__340$fn__341.invoke(jdbc.clj:1007)
at clojure.java.jdbc$query$fn__340.invoke(jdbc.clj:1006)
The issue isn't related to the Azure SQL Database JDBC driver but to the way you are establishing the connection and running queries.
Make sure you are closing the connection properly after every commit, and that you have sufficient sleep time between connections.
Also, make sure the large amount of data you are fetching has the required bandwidth available and is queried properly.
You can refer to the official documentation for troubleshooting:
Troubleshooting connectivity issues and other errors with Azure SQL Database and Azure SQL Managed Instance
Resolving connectivity errors to SQL Server
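As an illustration of closing the connection properly, here is a minimal Java sketch of the same JDBC access (the question's code goes through clojure.java.jdbc, but the underlying driver calls are the same; server, database, credentials, and table names are placeholders):

import java.sql.Connection;
import java.sql.DriverManager;
import java.sql.ResultSet;
import java.sql.Statement;

public class FetchWithCleanClose {
    public static void main(String[] args) throws Exception {
        // loginTimeout (seconds) is a standard mssql-jdbc connection property
        String url = "jdbc:sqlserver://myserver.database.windows.net:1433;"
                + "databaseName=mydb;loginTimeout=30;";
        // try-with-resources guarantees the ResultSet, Statement and Connection
        // are closed even if the fetch fails mid-stream
        try (Connection conn = DriverManager.getConnection(url, "myuser", "mypassword");
             Statement st = conn.createStatement();
             ResultSet rs = st.executeQuery("SELECT id, payload FROM big_table")) {
            while (rs.next()) {
                // consume each row fully while the connection is still open
                System.out.println(rs.getLong("id") + " " + rs.getString("payload"));
            }
        }
    }
}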

Pentaho Connection to MultiSubnet SQL Server environment

In Hitachi Pentaho 8.2 Data Integration (Spoon) we are trying to configure "MultisubnetFailover=True" for a SQL Server database connection.
For database connections we are using "MS-SQL Server(Native)/Native JDBC". The problem is that I can't find where to set this property in the Database Connection component, and it doesn't look like I can specify the complete connection string for the driver to use. Where can I set this property?
It doesn't look like the Options tab is the correct place for setting this property, and I don't see a way to just specify the entire connection string myself. Also, ODBC won't work for us, because the ETL changes the server and database dynamically (we use Pentaho variables to vary them), while an ODBC connection is a single hardcoded one.
Any help will be appreciated.
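For reference, in a raw Microsoft JDBC driver URL the property in question looks like this (listener and database names are placeholders); the open question is where Spoon lets you inject it:

jdbc:sqlserver://myAgListener:1433;databaseName=MyDb;multiSubnetFailover=true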

SSAS Tabular expression tables fail to load from SQL Server data source

We have been developing a new SSAS Tabular (1400) model, initially doing the PoC by loading the data directly from SQL Server (not using expressions). Once we had an idea of what we wanted, we refactored and did the data imports with expressions to allow further Power Query (M) processing. At first, all seemed fine. Along the way we started to mash up other data from an ODBC data source and didn't experience any issues during the process.
This morning we opened the project to make some modifications before running a new load of data and clicking "process tables" triggered an authentication dialog for the SQL Server data source again and then produced this error for all tables we were importing.
We have tried removing the data source connection string and adding it again. We removed all expressions and tried to start from scratch. No matter what we do, the Expression preview window WILL show the data preview, but attempting to import it into the model then fails with this error.
We've tried this on multiple workstations to rule out something in one developer's settings that might have changed.
Any thoughts?
After reading the error message again, it struck me that it mentions an ODBC error. The connection to the SQL Server database shouldn't be ODBC, so that got me wondering whether something was wrong with the other connection instead. Sure enough, once I re-authenticated the ODBC data source, the data load worked as expected.
Note that we hadn't actually created a table in the model from the ODBC source yet. Regardless, processing the tables appears to validate each connection, and rather than clearly saying that something was wrong with this one data source (even though no data was being pulled into tables from it), the error dialog made it appear that the import was failing for the SQL Server connection, which is simply wrong.

Extending the connection time limit in Oracle via SQL Developer?

I am trying to dump all of the data from the development DB server to my local machine using SQL Developer, but whenever I try to export, the connection is halted before the dump/export finishes.
Is there a way to adjust that db connection timeout?
There is no connection time limit imposed by SQL Developer. If your connection is getting lost, that implies either that something in the network (a firewall, for example) limits the length of a connection, or that something configured in the database (a profile, Resource Manager, etc.) is causing the connection to be terminated. Since you haven't told us what error you get, it is impossible to guess which of these is the most likely source of your problem.
Of course, it would probably be more effective to use the proper tools (the Data Pump versions of the export and import utilities) for this sort of thing.
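A minimal sketch of such an export and import with Data Pump, assuming a schema named MYSCHEMA and an existing DATA_PUMP_DIR directory object (all names here are placeholders):

expdp myuser/mypassword@devdb schemas=MYSCHEMA directory=DATA_PUMP_DIR dumpfile=myschema.dmp logfile=myschema_exp.log
impdp myuser/mypassword@localdb schemas=MYSCHEMA directory=DATA_PUMP_DIR dumpfile=myschema.dmp logfile=myschema_imp.log

Note that Data Pump writes the dump file on the database server, so it still has to be copied to the local machine between the two steps.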