SQLite COALESCE problem when running Django test cases

I am using Django for my project, with the postgresql_psycopg2 engine for my production database, but the test runner uses sqlite3 for running the tests. With my production database (PostgreSQL) in mind, I built a query that uses COALESCE, but sqlite3 doesn't recognize it. I could use postgresql_psycopg2 for the test runs as well, but that is too slow. How do I get past this?

SQLite does support COALESCE, but it requires at least two arguments, while the PostgreSQL implementation accepts just one. Perhaps you are calling COALESCE with only one argument in PostgreSQL, and that breaks when moving to SQLite?
Could you post the code that is failing?
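The argument-count difference can be demonstrated directly with Python's built-in sqlite3 module (the values below are illustrative):

```python
import sqlite3

conn = sqlite3.connect(":memory:")

# Two arguments: accepted by both SQLite and PostgreSQL.
result = conn.execute("SELECT COALESCE(NULL, 42)").fetchone()[0]
print(result)  # 42

# A single argument is valid in PostgreSQL but rejected by SQLite.
try:
    conn.execute("SELECT COALESCE(42)")
except sqlite3.OperationalError as exc:
    print("SQLite error:", exc)
```

If the query sticks to two or more arguments, it should behave the same on both engines.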

Related

How to supply SQL functions and views required for testing Django app

I've created a file <APP>/<MODEL>.sql according to the Django docs in order to use the hook for running arbitrary SQL after syncdb. Inside this file are two function declarations for PostgreSQL and a statement that creates a database view. This runs fine in production, but -- just as the docs say -- the code is not run for the test database, because the use of fixtures is suggested instead. Now my unit tests are missing the crucial database views and functions, and thus they fail.
How do I test code that relies on raw SQL functions / views?
UPDATE
I dug up this ticket which concerns this question directly and also presents a small workaround.
I found the best way to handle this is to put the custom SQL code into Django's migrations.
Both Django and South (the predecessor of Django's own migration framework) provide commands to create custom (i.e. empty) migrations. The code for creating database views or functions can be put into an empty migration, and it will be run whenever a new installation of the project is migrated or the test suite is run.
A tutorial on how to use custom migrations for database views with South can be found here. The syntax is slightly different in Django's own migration framework, but the documentation for RunSQL explains it all.
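With Django's built-in framework, that approach can be sketched as an empty migration carrying the view definition in a RunSQL operation. The app name, file name, view, and SQL below are all made up for illustration:

```python
# reports/migrations/0002_create_view.py -- hypothetical app and file name.
from django.db import migrations

CREATE_VIEW = """
CREATE VIEW order_totals AS
    SELECT customer_id, SUM(amount) AS total
    FROM orders
    GROUP BY customer_id;
"""

class Migration(migrations.Migration):

    dependencies = [
        ("reports", "0001_initial"),  # assumed previous migration
    ]

    operations = [
        # RunSQL executes on every fresh migrate, including when the
        # test runner creates the test database.
        migrations.RunSQL(CREATE_VIEW, reverse_sql="DROP VIEW order_totals;"),
    ]
```

Because the test runner builds the test database by applying all migrations, the view is then available to the unit tests as well.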
Just run them natively, like the SQL that they are.
or
Use sqlcustom.
or
Don't bother with them; you might find yourself swimming upstream trying to make good use of these functions and views via the ORM.
or
Consider another Python framework (dare I say it) that is more attuned to using native SQL.

When will I ever need more than what MySQLi can give me?

I use mysqli for everything. I'm adding features to a small system I built, and some of the examples I'm following use PDO. I was about to convert them to mysqli to match my system, but then I realized it might be less work to change what I've already built to PDO. I've been reading about PDO vs. mysqli all over the web.
But here is my question: when will I ever need more than mysqli? PDO offers three major things that mysqli doesn't: twelve different drivers, named parameters, and prepared statements.
For a web dev, I don't really need the ability to run my web application on 18 database types.
The only major advantage I see is prepared statements.
What are the major reasons to switch when the only language I am using is PHP? Does PHP even support named parameters now?
Is there even a way to get the number of results from a SELECT using PDO?
You managed to confuse everything.
On your statement:
mysqli does offer native prepared statement support as well.
Named parameters mostly just bloat your code to no purpose. Yet they are very easy to implement manually. There are some people called programmers who can actually program and implement whatever feature they need.
Twelve different drivers are actually a bubble. Only a few of them are actually workable, and you cannot switch databases by changing only a DSN; you need a higher level of abstraction, such as an ORM.
On what you have missed:
mysqli's way of handling prepared statements is indeed too complex for a newcomer.
As a matter of fact, mysqli is just an API that should never be used as is, but only as source material for a higher-level driver.
Yet it has a lot of MySQL-specific nitty-gritty that is obviously absent in a generalized driver like PDO.
So, in brief:
If you are going to create a database abstraction library for use with MySQL, go for mysqli.
If your only idea of using the API is to call its methods directly in your code, then PDO is the only choice, as it is already a semi-DAL and its support for prepared statements is far easier to use.
On your other questions
Does PHP even support named parameters now?
PHP itself has nothing to do with named parameters; they are a feature of the database API.
Is there even a way to get the number of results from a SELECT using PDO?
You don't need it. Once you have the results, you can simply count them.

In-memory SQL DB options with support for analytic functions (partition over)?

I'm working on adding unit tests around the database code of a legacy Java project. Some of the SQL (Oracle) contains analytic functions, i.e. the OVER (PARTITION BY ...) syntax. Are there any in-memory SQL DB options out there (preferably open source) that support these functions?
Any other solutions? I would prefer not to hit the real database (even if I roll back the data).
PostgreSQL supports window and analytic functions.
Check the docs here.
You need to think about whether it's smart to test the data on a different platform, though.
If you are using Oracle, I would just spin up a local copy of Oracle XE, because it is free. This ensures that your test environment mirrors your production environment. While it may be tempting to use something like HyperSonic (HSQLDB), it is not a true representation of your production environment, and as such some issues may be missed that could have been caught in testing.

Using JUnit to perform an insert

I need to insert data into a table using JUnit tests. Is this possible? If so, how? Thanks!
Check out DBUnit.
DbUnit is a JUnit extension (also usable with Ant) targeted at database-driven projects that, among other things, puts your database into a known state between test runs. This is an excellent way to avoid the myriad of problems that can occur when one test case corrupts the database and causes subsequent tests to fail or exacerbate the damage.
DbUnit has the ability to export and import your database data to and from XML datasets. Since version 2.0, DbUnit can also work with very large datasets when used in streaming mode. DbUnit can also help you to verify that your database data match an expected set of values.
You can use JDBC and regular insert statements to do this.
See the Java JDBC tutorial to get started: http://download.oracle.com/javase/tutorial/jdbc/
I don't have time to write a sample for you.
You're new to this, so it's likely to be over your head, but I'd recommend that you study some Spring 3 examples even if you don't use Spring. The ideas will help you write better tests.
The key is to make your tests transactional: do your INSERT, check the result, and roll the transaction back when you're done. It should be as if your test had never run.
http://static.springsource.org/spring/docs/3.0.x/spring-framework-reference/html/testing.html

Django: run queries depending on SQL engine

In Django 1.2.3 I need to perform some queries that are not feasible with pure Django ORM functions, e.g.:
result = MyModel.objects.extra(select={'stddev': 'STDDEV_SAMP(value)'}).values()
But I need to run this code on several SQL engines (SQLite, MySQL and MSSQL). So I would have to test settings.DATABASES['default']['ENGINE'] and run engine-specific code.
Is there a more Django-like approach to this problem? (E.g. a user-defined function to put somewhere so that Django runs it according to the default database engine.)
Thank you
The proper place to store the code for accessing data is in a method in the model layer. That way, the model can:
be environment-aware
construct custom queries
use built-in ORM functions
These can be swapped around, optimized, and tweaked without the rest of your application having to change at all, because the rest of your application only manipulates data through your model.
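As a sketch of what such an environment-aware model method could delegate to, the engine dispatch can be isolated in one small helper. The engine strings below mirror Django's ENGINE setting values and the STDEV spelling is T-SQL's, but the helper itself is hypothetical:

```python
# Hypothetical helper: map the configured engine to the right SQL expression.
def stddev_expression(engine: str) -> str:
    """Return the sample-standard-deviation SQL snippet for a given engine
    string (as found in settings.DATABASES['default']['ENGINE'])."""
    if "postgresql" in engine or "mysql" in engine:
        return "STDDEV_SAMP(value)"  # shared by PostgreSQL and MySQL
    if "mssql" in engine:
        return "STDEV(value)"  # T-SQL's sample standard deviation
    raise NotImplementedError("no stddev expression for %s" % engine)

print(stddev_expression("django.db.backends.postgresql_psycopg2"))
# STDDEV_SAMP(value)
```

A model method can then call MyModel.objects.extra(select={'stddev': stddev_expression(engine)}), so the rest of the application never inspects the engine itself.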