How to load / mount an existing file-based database in HSQLDB - hsqldb

Good day all !
I have HSQLDB 2.5 running as a server on my Windows machine via:
#java -classpath ./lib/hsqldb.jar org.hsqldb.server.Server -database.0 file:.\data -dbname.0 foo
I left the files of my foo database in the data folder:
foo.data
foo.log (empty BTW)
foo.properties
foo.script
... and I see this line:
[Server#74cd4d]: Database [index=0, id=0, db=file:.\data, alias=foo] opened successfully in 335ms.
I open the manager via:
#java -classpath .\lib\hsqldb.jar org.hsqldb.util.DatabaseManagerSwing
connection properties:
Recent Settings: foo
Setting Name: foo
Type: HSQL Database Engine Server
Driver: org.hsqldb.jdbc.JDBCDriver
URL: jdbc:hsqldb:hsql://localhost/foo
I run the query to show all tables (which I know for a fact exist):
SELECT * FROM INFORMATION_SCHEMA.SYSTEM_TABLES where TABLE_TYPE='TABLE'
... but I get zero results.
Any ideas?
Thanks!
PS: I had to spend an hour figuring out why, when I hit the post button, it complained that I did not indent my code correctly; I indented the whole thing and was then forced to type this long diatribe in order to get my post accepted ...
... God bless the pedantic "Knights who say Ni"!

The file: URL among the server properties indicates the path and file name of the database files (but without the file extensions such as .script, .data).
The URL you used, db=file:.\data, refers to database files named data.script, data.properties, etc. in the parent directory of lib. If you check that directory, you will probably find a set of database files with those names that was created when you started the server.
In order to access your actual database, the URL should be db=file:.\data\foo
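For example, a corrected server start command based on the one above (a sketch; adjust the relative paths to your own layout) would be:
#java -classpath ./lib/hsqldb.jar org.hsqldb.server.Server -database.0 file:.\data\foo -dbname.0 foo
The JDBC URL in DatabaseManagerSwing (jdbc:hsqldb:hsql://localhost/foo) can stay the same, since the alias foo is unchanged.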

Related

DBT - [WARNING]: Did not find matching node for patch

I keep getting the error below when I use dbt run - I can't find anything on why this error occurs or how to fix it within the dbt documentation.
[WARNING]: Did not find matching node for patch with name 'vGenericView' in the 'models' section of file 'models\generic_schema\schema.sql'
Did you by chance recently upgrade to dbt 1.0.0? If so, this means that you have a model, vGenericView, defined in a schema.yml but no vGenericView.sql model file to which it corresponds.
If all views and tables defined in the schema are one-to-one with model files, then try running dbt clean and then dbt test or dbt run afterward.
Not sure what happened to my project, but I ran into frustration looking for missing and/or misspelled files when the cause was just leftovers from compiled files that had not been cleaned out. I had previously moved views to different schemas and renamed others.
So the mistake is here in the naming:
The model name in the models.yml file should for example be: employees
And the SQL file should be named: employees.sql
So your models.yml will look like:
version: 2
models:
  - name: employees
    description: "View of employees"
And there must be a model with file name: employees.sql
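For completeness, a minimal sketch of what the matching model file could contain (the column names and the raw_schema.employees relation are placeholders, not taken from the original question):
-- models/employees.sql (hypothetical example)
select
    employee_id,
    first_name,
    last_name
from raw_schema.employees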
One case where this will happen is if you have the same data source defined in two different schema.yml files (or whatever you call them).

DBD::Oracle, Cursors and Environment under mod_perl

Need some help, because I can't find any solution for my problems with DBD::Oracle.
So first, this is the current situation:
We are running Apache2 with mod_perl 2.0.4 at our company
Apache web server was set up with a startup script which is setting some environment variables (LD_LIBRARY_PATH, ORACLE_HOME, NLS_LANG)
In httpd.conf there are also environment variables for LD_LIBRARY_PATH and ORACLE_HOME (via SetEnv)
We are generally using the perl module DBI with driver DBD::Oracle to connect to our main database
Before we create a new instance of DBI we also set some Perl environment variables (%ENV): ORACLE_HOME and NLS_LANG.
So far, this works fine. But now we are extending our system and need to connect to a remote database. Again, we are using DBI and DBD::Oracle. But for now there are some new conditions:
New connection must run in parallel to the existing one
TNSNAMES.ORA for the new connection is placed at a different location (not at $ORACLE_HOME.'/network/admin')
New database contents are provided by stored procedures, which we are fetching with DBD::Oracle and cursors (like explained here: https://metacpan.org/pod/DBD::Oracle#Binding-Cursors)
The stored procedures return object types and collection types containing attributes of Oracle type DATE
To get these dates in a readable format, we set a new env variable $ENV{NLS_DATE_FORMAT}
To ensure the date format we additionally alter the session by alter session set nls_date_format ...
Okay, this works fine, too, but only if we make a new connection on the console. The new TNS location is found by the script, the connection can be established, and fetching data from the procedures by cursor also works. All DATE types are formatted as specified.
Now, if we try to make this connection in the Apache environment, it fails. First, the data source name cannot be resolved by DBI/DBD::Oracle. I think this is because our new TNSNAMES.ORA file, or rather its location (published via $ENV{TNS_ADMIN}), is not found by DBI/DBD::Oracle in the Apache context. But I don't know why.
The second problem is (if I create a dirty workaround for the first one) that the date format, published by $ENV{NLS_DATE_FORMAT}, only works on the first level of our cursor select.
BEGIN OPEN :cursor FOR SELECT * FROM TABLE(stored_procedure); END;
The example above returns collection types of objects which contain date attributes. In the Apache context, the format published by NLS_DATE_FORMAT is not recognized. If I use a simple form of the example, like this
BEGIN OPEN :cursor FOR SELECT SYSDATE FROM TABLE(stored_procedure); END;
the result (a single date field) is formatted well. So I think the subordinate structures are not formatted because $ENV{NLS_DATE_FORMAT} only takes effect in the console context and not in the Apache context.
So there must be a problem with the Perl environment variables (%ENV) running under Apache and mod_perl. Maybe a problem with mod_perl?
I am at my wit's end. Maybe someone in the whole wide world has a solution ... and excuse my English :-) If you need further explanation, I will try to describe it more precisely.
If your problem is that changes to %ENV made while processing a request don't seem to be honoured, this is because mod_perl assumes you might be running multiple threads and doesn't actually change the process environment when you change %ENV, so external libraries (like the Oracle client) or child processes don't see the change.
You can work around it by first using the prefork MPM, so there aren't any threading issues, and then making changes to the environment using Env::C instead of the %ENV hash.
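A minimal sketch of that workaround (the TNS_ADMIN path, the date format, and the connection details below are placeholders, not taken from the question):
use DBI;
use Env::C ();

# Placeholder credentials for illustration only.
my ($db_user, $db_pass) = ('my_user', 'my_password');

# Set the variables at the C level so the Oracle client library, which reads
# the real process environment rather than Perl's %ENV copy, actually sees
# them under mod_perl.
Env::C::setenv('TNS_ADMIN',       '/path/to/alternate/tns_admin', 1);
Env::C::setenv('NLS_DATE_FORMAT', 'YYYY-MM-DD HH24:MI:SS',        1);

my $dbh = DBI->connect('dbi:Oracle:REMOTE_DB', $db_user, $db_pass)
    or die $DBI::errstr;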

How to set up FULL-TEXT Lucene SEARCH in H2 Database without errors?

I followed the H2 tutorial on setting up full-text search (FTL) using Lucene; however, I'm experiencing unknown exceptions.
This is how I did it:
Using SQuirreL SQL Client, I added the lucene-core-3.0.3.jar library to the Additional classpath (otherwise it complains that it cannot import classes).
Then I called this:
CREATE ALIAS IF NOT EXISTS FTL_INIT FOR "org.h2.fulltext.FullTextLucene.init";
CALL FTL_INIT();
Afterwards, the *.trace.db log said that {db.name} exists but is not a directory.
I just fixed it by renaming the database file to something else, then making a directory named after the db.
Now running this:
CREATE ALIAS IF NOT EXISTS FTL_INIT FOR "org.h2.fulltext.FullTextLucene.init";
CALL FTL_INIT();
works.
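Once FTL_INIT() succeeds, the usual next steps per the H2 full-text documentation are to index a table and query it; a sketch, assuming a hypothetical table TEST in schema PUBLIC:
CREATE TABLE TEST(ID INT PRIMARY KEY, NAME VARCHAR);
INSERT INTO TEST VALUES(1, 'Hello World');
CALL FTL_CREATE_INDEX('PUBLIC', 'TEST', NULL);   -- NULL means index all columns
SELECT * FROM FTL_SEARCH('Hello', 0, 0);         -- limit 0, offset 0 = no limit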

How to use a config file to connect to database set by user

I have a program that runs a query and returns results in a report viewer. The issue is that we have 10 locations, each with its own local database. What I'd like to do is have each location use the same program and utilize the App.config file to specify which database to connect to, depending on the location. This will prevent me from having to create 10 individual programs with separate database connections. I was thinking I could have three values in the app.config file: "Database", "Login", "Password". Generally speaking the databases are on the .30 address... so it would be nice to be able to have them set the config file to the database server IP...
For example:
Location: 1
DatabaseIP: 10.0.1.30
Login: sa
Password: databasepassword
Is it possible to set something like this up using the app.config file?
You should take a look at resource files.
Originally they are intended for localization, but they should work for you as well.
Go to your project Properties and set up an Application Setting with Type (Connection String) from the drop-down. This will result in an XML config file in your output directory in which you can modify the connection string post-compile.
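For illustration, a minimal sketch of a per-location entry in App.config and how to read it at runtime (the name LocationDb and all values are placeholders; reading it requires a reference to System.Configuration):
<configuration>
  <connectionStrings>
    <add name="LocationDb"
         connectionString="Data Source=10.0.1.30;Initial Catalog=Reports;User ID=sa;Password=databasepassword"
         providerName="System.Data.SqlClient" />
  </connectionStrings>
</configuration>
// C#: read the per-location connection string at startup
string connStr = ConfigurationManager.ConnectionStrings["LocationDb"].ConnectionString;
Each location can then edit the deployed .exe.config file to point at its own server IP without recompiling.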
I ended up using a simple XML file to do this. I used this site to accomplish it. I first wrote the XML on form load, then switched it to reading it.

MsTest, DataSourceAttribute - how to get it working with a runtime generated file?

For one of my tests I need to run a data-driven test with a configuration that is generated (via reflection) in the ClassInitialize method. I have tried everything, but I just cannot get the data source properly set up.
The test takes a list of classes in a CSV file (one line per class) and then tests that the mappings to the database work out well (i.e. it tries to get one item from the database for every entity, which will throw an exception when the table structure does not match).
The test method is:
[DataSource(
    "Microsoft.VisualStudio.TestTools.DataSource.CSV",
    "|DataDirectory|\\EntityMappingsTests.Types.csv",
    "EntityMappingsTests.Types#csv",
    DataAccessMethod.Sequential)]
[TestMethod()]
public void TestMappings() {
Obviously the file is EntityMappingsTests.Types.csv. It should be in the DataDirectory.
Now, in the Initialize method (marked with ClassInitialize) I put that together and then try to write it.
WHERE should I write it to? WHERE IS THE DataDirectory?
I tried:
File.WriteAllText(context.TestDeploymentDir + "\\EntityMappingsTests.Types.csv", types.ToString());
File.WriteAllText("EntityMappingsTests.Types.csv", types.ToString());
Both result in "the unit test adapter failed to connect to the data source or read the data". More exactly:
Error details: The Microsoft Jet database engine could not find the
object 'EntityMappingsTests.Types.csv'. Make sure the object exists
and that you spell its name and the path name correctly.
So where should I put that file?
I also tried just writing it to the current directory and taking out the DataDirectory part - same result. Sadly, there is limited debugging support here.
Please use the Process Monitor tool from technet.microsoft.com/en-us/sysinternals/bb896645. Put a filter on MSTest.exe or the associated qtagent32.exe and find out what locations it is trying to load from and at what point in the test loading process. Then please provide an update with those details here.
After you add the CSV file to your VS project, you need to open its properties. Set the property "Copy To Output Directory" to "Copy Always". The DataDirectory defaults to the location of the compiled executable, which runs from the output directory, so the test will find the file there.
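For reference, the same setting expressed directly in the test project's .csproj looks roughly like this (the ItemGroup placement may differ in your project):
<ItemGroup>
  <None Include="EntityMappingsTests.Types.csv">
    <CopyToOutputDirectory>Always</CopyToOutputDirectory>
  </None>
</ItemGroup>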