Thank you for checking my question out!
I'm trying to write a query for a very specific problem we're having at my workplace and I can't seem to get my head around it.
Short version: I need to be able to target columns by name - more specifically, by a part of the name that is consistent across all the columns I need to combine or compare.
More details:
We have, for example, 5 different surveys. They each have many questions, but SOME of the questions are part of the same metric, and we need to create a generic field that captures it. There's more background to the "why" of that, but it's pretty important for us at this point.
We were able to kind of solve this with either COALESCE() or CASE statements, but the challenge is that, as more surveys/survey versions are added, our vendor inevitably generates new columns for each survey and its questions.
Take this example, which is what we do currently and works well enough:
CASE
WHEN SURVEY_NAME = 'Service1' THEN SERV1_REC
WHEN SURVEY_NAME = 'Notice1' THEN FNOL1_REC
WHEN SURVEY_NAME = 'Status1' THEN STAT1_REC
WHEN SURVEY_NAME = 'Sales1' THEN SALE1_REC
WHEN SURVEY_NAME = 'Transfer1' THEN Null
ELSE Null
END REC
And also this alternative which works well:
COALESCE(SERV1_REC, FNOL1_REC, STAT1_REC, SALE1_REC) as REC
But as I mentioned, eventually we will have a "SALE2_REC", for example, and we'll need them BOTH in this same statement. I want to set this up so that coming into the SQL to make changes isn't needed. Given that the columns will ALWAYS be named "something#_REC" for this specific metric, is there any way to achieve something like:
COALESCE(all columns named LIKE '%_REC') as REC
Bonus! Related, might be another way around this same problem:
Would there also be a way to achieve this?
SELECT (columns named LIKE '%_REC') FROM ...
Thank you very much in advance for all your time and attention.
-Kendall
Table and column information in Db2 are managed in the system catalog. The relevant views are SYSCAT.TABLES and SYSCAT.COLUMNS. You could write:
select colname, tabname
from syscat.columns
where colname like some_expression
  and tabname = 'MYTABLE'
Note that the LIKE predicate supports expressions based on a variable or the result of a scalar function. So you could match it against some dynamic input.
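For the original COALESCE question, one hedged sketch (assuming Db2 9.7 or later; MYSCHEMA and MYTABLE are placeholders) is to let the catalog build the column list and then run the generated text as dynamic SQL:
SELECT 'COALESCE(' || LISTAGG(COLNAME, ', ') || ') AS REC'
FROM SYSCAT.COLUMNS
WHERE TABSCHEMA = 'MYSCHEMA'
  AND TABNAME = 'MYTABLE'
  AND COLNAME LIKE '%\_REC' ESCAPE '\'
This returns a string such as COALESCE(SERV1_REC, FNOL1_REC, STAT1_REC, SALE1_REC) AS REC, which you can splice into the full query and execute with EXECUTE IMMEDIATE in an SQL PL routine or from your application; new *_REC columns are then picked up without editing the SQL.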
Have you considered storing the more complicated properties in JSON or XML values? Db2 supports both and you can query those values with regular SQL statements.
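For illustration, a hedged sketch of what that could look like with Db2's SQL/JSON functions (assuming Db2 11.1 or later; SURVEY_RESPONSES and ANSWER_DOC are hypothetical names):
SELECT SURVEY_NAME,
       JSON_VALUE(ANSWER_DOC, '$.rec' RETURNING INTEGER) AS REC
FROM SURVEY_RESPONSES
Because the metric lives under a single JSON key, new survey versions don't add new columns.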
I am trying to extract data via a unique identifier from two separate schemas in TOAD for Oracle using SQL. Normally, I would export to Python and work the CSV from there; however, in this case, I must compare in TOAD before export, as one data field is WAY too large to export without filtering it down using the ticket identifier. So, I have tried:
SELECT LDKEY, LDOWNERTABLE, LDTEXT, LDOWNERCOL
FROM DB.SCHEMA1
WHERE LDKEY = DB.SCHEMA2.TICKETUID
and...
SELECT LDKEY, LDOWNERTABLE, LDTEXT, LDOWNERCOL
FROM DB.SCHEMA1
WHERE (
SELECT TICKETUID
FROM DB.SCHEMA2
WHERE LDKEY = TICKETUID
)
How can I compare the LDKEY from one schema to the TICKETUID of another? I only want to extract data where the key and ticket are equal - all other data I do not need. I can get data out using only one schema successfully, but one dataset, DB.SCHEMA1, is too large to export, so it must be filtered first. How do comparisons across schemas work in SQL/TOAD? How do I reference other schemas from SQL queries?
I have seen similar questions out there, but none seemed to get at this directly.
Your question can be answered using SQL, regardless of whether you use TOAD or not. Add a WHERE clause (or join, depending on the data model) to compare the two values.
SELECT s1.LDKEY, s1.LDOWNERTABLE, s1.LDTEXT, s1.LDOWNERCOL
FROM DB.SCHEMA1 s1
INNER JOIN DB.SCHEMA2 s2 ON s1.LDKEY = s2.TICKETUID
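If DB.SCHEMA2 can contain the same TICKETUID more than once and you only want each DB.SCHEMA1 row once, a hedged alternative is a semi-join, which filters without duplicating rows:
SELECT s1.LDKEY, s1.LDOWNERTABLE, s1.LDTEXT, s1.LDOWNERCOL
FROM DB.SCHEMA1 s1
WHERE EXISTS (
    SELECT 1
    FROM DB.SCHEMA2 s2
    WHERE s2.TICKETUID = s1.LDKEY
)
Either way you will need SELECT privileges on the objects in the other schema.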
I want to create a database that starts with data already in it. I'm creating an app that has a long list of items.
I have used:
String item1 = "my item";
ContentValues cvItems = new ContentValues();
cvItems.put(KEY_NAME, item1); // one ContentValues per row to insert
db.insert(DATABASE_TABLE, null, cvItems);
but I think that having to use a string for each element is too much, and there has to be a shorter way. I want to know if it's possible to start with a database that has all the values predefined. Also, I can't reference strings from strings.xml in SQL. Thanks in advance.
Your best approach is probably to create a template database that contains all the data you need and restore a full copy of the DB from the template, rather than building it incrementally. Then just tweak whatever items you need to in code to suit the particular usage of that instance of the DB.
This will make it easier to change the default data without updating your code.
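As a minimal sketch, the template database could be built from a seed script along these lines (the table and column names here are made up):
-- schema for the prepopulated template database
CREATE TABLE items (
    _id  INTEGER PRIMARY KEY AUTOINCREMENT,
    name TEXT NOT NULL
);
-- default rows shipped with the app
INSERT INTO items (name) VALUES ('my item');
INSERT INTO items (name) VALUES ('another item');
INSERT INTO items (name) VALUES ('a third item');
You then copy the resulting database file into place on first run instead of inserting each string from code.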
I have a data driven site with many stored procedures. What I want to eventually be able to do is to say something like:
For Each @variable in sproc inputs
UPDATE @TableName SET @variable.toString = @variable
Next
I would like it to be able to accept any number of arguments.
It will basically loop through all of the inputs and update the column with the name of the variable with the value of that variable - for example, the column "Name" would be updated with the value of @Name. I would like to basically have one stored procedure for updating and one for creating. However, to do this I will need to be able to convert the actual name of a variable, not its value, to a string.
Question 1: Is it possible to do this in T-SQL, and if so how?
Question 2: Are there any major drawbacks to using something like this (like performance or CPU usage)?
I know that if a value is not valid it will only prevent the update involving that variable and any subsequent ones, but all the data is validated in the VB.NET code anyway, so it will always be valid when submitted to the database, and I will ensure that only variables whose columns exist can be submitted.
Many thanks in advance,
Regards,
Richard Clarke
Edit:
I know about using SQL strings and the risk of SQL injection attacks - I studied this a bit in my dissertation a few weeks ago.
Basically the website uses an object-oriented architecture. There are many classes - for example, Product - which have many "Attributes" (I created my own class called Attribute, which has properties such as DataField, Name and Value, where DataField is used to get or update data, Name is displayed on the administration frontend when creating or updating a Product, and Value, which may be displayed on the customer frontend, is set by the administrator). DataField is the field I will be using in the "UPDATE Blah SET @Field = @Value".
I know this is probably confusing, but it's really complicated to explain - I have a really good understanding of the entire system in my head, but I can't put it into words easily.
Basically the structure is set up such that no user will be able to change the value of DataField or Name, but they can change Value. I think that if I were to use dynamic parameterised SQL strings there would therefore be no risk of SQL injection attacks.
I mean basically loop through all the attributes so that it ends up like:
UPDATE Products SET [Name] = '@Name', Description = '@Description', Display = @Display
Then loop through all the attributes again and add the parameter values - this will have the same effect as using stored procedures, right?
I don't mind adding to the page load time, since this is mainly going to affect the administration frontend and will only marginally affect the customer frontend.
Question 1: You must use dynamic SQL - construct your UPDATE statement as a string, and run it with the EXEC command.
Question 2: Yes, there are - SQL injection attacks, the risk of malformed queries, and the added overhead of having to compile a separate SQL statement.
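As a rough illustration (the table, column and key names are hypothetical), the EXEC route ends up splicing the value into the string, which is exactly where the injection and malformed-query risks come from:
DECLARE @ColumnName sysname = N'Name';
DECLARE @NewValue   nvarchar(100) = N'Widget';
DECLARE @sql        nvarchar(max);
-- QUOTENAME protects the identifier; the value itself is embedded as an escaped literal
SET @sql = N'UPDATE Products SET ' + QUOTENAME(@ColumnName)
         + N' = ''' + REPLACE(@NewValue, N'''', N'''''') + N''' WHERE ProductID = 1;';
EXEC (@sql);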
Your example is very inefficient: if I pass in 10 columns, you will update the same table 10 times?
The better way is to do one update by using sp_executesql and building the statement dynamically; take a look at The Curse and Blessings of Dynamic SQL to see how you have to do it.
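A hedged sketch of that approach, again with hypothetical names: only the column name is spliced in (ideally after checking it against sys.columns), while the values travel as real parameters:
DECLARE @ColumnName sysname = N'Name';
DECLARE @sql        nvarchar(max);
SET @sql = N'UPDATE Products SET ' + QUOTENAME(@ColumnName)
         + N' = @NewValue WHERE ProductID = @ID;';
-- values are bound as parameters, so they are escaped for you and the plan can be reused
EXEC sp_executesql @sql,
                   N'@NewValue nvarchar(100), @ID int',
                   @NewValue = N'Widget',
                   @ID = 1;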
Is this a new system where you have the freedom to design as necessary, or are you stuck with an existing DB design?
You might consider representing the attributes not as columns, but as rows in a child table.
In the parent MyObject you'd just have header-level data, things that are common to all objects in the system (maybe just an identifier). In the child table MyObjectAttribute you'd have a primary key of (objectID, attrName) with another column attrValue. This way you can do an UPDATE like so:
UPDATE MyObjectAttribute
SET attrValue = @myValue
WHERE objectID = @myID
AND attrName = @myAttrName
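A minimal sketch of the child table this implies (the column types are assumptions, and it presumes MyObject is keyed by objectID):
CREATE TABLE MyObjectAttribute (
    objectID  int           NOT NULL,
    attrName  nvarchar(100) NOT NULL,
    attrValue nvarchar(max) NULL,
    PRIMARY KEY (objectID, attrName),
    FOREIGN KEY (objectID) REFERENCES MyObject (objectID)
);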
I have this idea that using SQL VIEWS to abstract simple database computations (such as a count on a relation) is sufficient, and that you don't need procedures (== procedural code).
A simple SQL view + a WHERE clause >> a stored procedure with parameters, sometimes.
While making this point I imagined a way of retrieving table/view data without writing SQL and without writing the WHERE clause...
But, to my surprise, there does not seem a way to accomplish this in ADO.NET 2.0 or later.
Let me tell you what I tried:
SqlDataAdapter + SqlCommandBuilder still requires you to write "SELECT ... FROM" and the WHERE clause in strings (plus, if you put the 'where' in, you don't have much use for Update/Insert/DeleteCommand)
typed DataSets only allow you to retrieve entire DataTables and then apply filters to them. Filters are strings, without any escaping aid... (must double the single quote!)
SQL to Entities looked promising, but it seems to: be limited to MSSQL, generate bloated SQL queries, generate a whole new stack of DAOs (besides the existing Domain Model classes), require .NET 3.5+ for all this, etc. (that is, all of these are disadvantages for me)
Other ORMs have similar problems as SQL to Entities.
What I'm looking for is a strong-typed method of accessing database tables/views that:
doesn't come with another set of DAOs (K.I.S.S)
allows me to query a table without writing "SELECTs" in strings (strong-typed)
allows me to filter(WHERE) a table with properly-escaped parameters (and without retrieving the whole data beforehand)
can later issue updates/inserts/deletes
I am fairly new to .Net but not stupid: does this exist?
Thanks.
FluentADO
Subsonic has a fairly lightweight query tool that you can use to directly query against the database with a Query object that abstracts the SQL. If you want to, you can also use its code generation facility to map your database tables to POCOs, or to only create a strongly typed schema (for column/table names and so on).
I don't really believe that what you want to do is achievable without using some sort of ORM, or a specialized DSL with a compiler that somehow knows about your database schema, type/column information, etc.
Take into account that C# is a general-purpose language, and its compiler is totally unaware of your database types; that's why you cannot bind them without using some abstraction layer, which usually involves ad hoc SQL queries (strings), NHibernate or similar mapping files (more strings) and/or DAOs.
If you don't want to write the WHERE clause, one way is to use a Filter object and add the conditions you want. For example:
var sc = new Filter();
sc.Add("Contacttitle", "Sales Agent");
sc.Add("city", "london", Logical.Or);
var customers = D2Bk.Fetch(sc, new Customers());
But you don't want to use DAOs (Customers above is such), so you would have to write the SQL statement and specify the where clause:
DataSet ds = D2Bk.Fetch("SELECT * FROM Customers WHERE Contacttitle=#PAR1 OR City=#PAR2", "Sales Agent", "london");
I would assume that you've looked at LINQ and ADO.Net Data Services and these do not meet some of your requirements?
Native ADO.Net is a database provider, and therefore it provides a direct SQL interface into the underlying data sources. There are various CRUD-based solutions out there that simulate what you suggest to various degrees.
We have a strong internal trend to leave the database to the Database team and use Webservices as a main interface to our databases, mainly because of a Delphi codebase that still needs to be supported.
Actually, I can't believe I forgot to add the ADO.Net Entity Framework, which is used by ADO.Net Data Services among others. There is also a LINQ to Entities provider.
I did something like this with a stored procedure once. Basically, I wanted to specify any permutation of fields to match on in my WHERE clause, but I didn't want to write 100 sprocs with slightly different param lists and where clauses.
So, I did something like this:
CREATE PROCEDURE [GetSimpleCustomers]
(
@ID varchar(50) = null,
@Name varchar(50) = null,
@IsActive bit = null,
@Address1 varchar(50) = null,
@Address2 varchar(50) = null,
@City varchar(50) = null,
@State varchar(50) = null,
@Zip varchar(50) = null
)
AS
SELECT ID, Name, IsActive, Address1, Address2, City, State, Zip
FROM SimpleCustomerExample
WHERE (ID = @ID OR @ID is NULL)
AND (Name = @Name OR @Name is NULL)
AND (IsActive = @IsActive or @IsActive is NULL)
AND (Address1 = @Address1 or @Address1 is NULL)
AND (Address2 = @Address2 or @Address2 is NULL)
AND (City = @City or @City is NULL)
AND (State = @State or @State is NULL)
AND (Zip = @Zip or @Zip is NULL)
This will let you call the sproc in your code and only pass the params you are interested in filtering on, and the rest will not be factored in if you leave them null.
So, you can do something like
public List<SimpleCustomer> GetAllCustomersFromOhio()
{
List<SimpleCustomer> list = new List<SimpleCustomer>();
using (SqlCommand cmd = new SqlCommand(blah blah))
{
cmd.Parameters.AddWithValue("State", "Ohio");//or "OH" depending on your convention
using(IDataReader oDR = cmd.ExecuteReader())
{
//hydrate your list of SimpleCustomers from the record set.
}
}
return list;
}
EDIT:
In reply to comment:
You could easily enough alter the GetSimpleCustomers to be DeleteSimpleCustomers by changing the
SELECT <columns> FROM SimpleCustomers
to
DELETE FROM SimpleCustomers
and keep the same logic. The same is true for an Update.
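For the update case, a hedged sketch of the same optional-parameter idea (the procedure name is made up and the column list is trimmed): COALESCE keeps the existing value whenever a parameter is left null, so callers only pass the columns they want to change.
CREATE PROCEDURE [UpdateSimpleCustomers]
(
    @ID varchar(50),
    @Name varchar(50) = null,
    @IsActive bit = null,
    @City varchar(50) = null
)
AS
UPDATE SimpleCustomerExample
SET Name     = COALESCE(@Name, Name),
    IsActive = COALESCE(@IsActive, IsActive),
    City     = COALESCE(@City, City)
WHERE ID = @ID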
Also, I'll answer a question with a question: How many tables do you have that actually need this level of custom filtering? The syntax would be so similar you could pound it all out in a day (or less if you hammered together a simple script to write it for you).
If you are using strongly-typed DataSets, you can create parameterized queries in the Visual Studio editor by adding identifiers prefixed with @ in the query. Create a DataSet XSD file in Visual Studio, create a new table called Products, then add a new query to it.
For example:
select * from Products where Category = @category;
This will autogenerate methods for filling DataSets or getting DataTables that take the extra parameter. It will also handle properly escaping the strings (it uses parameters on the command objects). I used this to create some super simple prototype web apps fairly quickly.
I recently wrote a query 'framework' for generating SQL where clauses.
The primary component is a BaseQueryArgs class with a ToWhereClause() function that uses reflection to convert properties into string sections. This needs to handle the work of escaping and properly formatting values.
Any class inheriting BaseQueryArgs simply needs to declare public properties, and you end up with a strongly typed query object. For optional properties, you make the value nullable (ref type or Nullable<>) and the SQL generator filters out null values.
You can use custom attributes to define extra features for each property:
custom column name that differs from the property name
custom value handling (such as a date value being used as a BETWEEN test expression)
This can be used to create queries with a strongly-typed query object like so:
MyCustomQueryArgs args = new MyCustomQueryArgs
{
ProductFamilyID = 17,
Region = Regions.Northwest,
Active = true
};
List<Product> product = QueryProcessor.GetProductsWhere(args);
GetProductsWhere() would obviously call some data method that accesses the view with the generated SQL.
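For the MyCustomQueryArgs example above, the generated SQL might end up looking something like this (the view name and value formatting are assumptions):
SELECT *
FROM ProductsView
WHERE ProductFamilyID = 17
  AND Region = 'Northwest'
  AND Active = 1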
I don't have a solution for updates/deletes, but it doesn't seem that hard to write a method that converts an object instance into a SQL statement using a switch or attribute to determine table name.
This is very "roll your own", but that gives you the freedom to customize it for your needs, and doesn't include lots of heavy ORM/DAO wrapping.
Have a look at Mindscape's LightSpeed products.
It builds strongly typed, LINQ-queryable models that result in efficient SQL across a variety of database engines, and it includes Memcached and Lucene support.
I have used XPO on several projects and their newer version has better support for queries.
http://www.devexpress.com/Products/NET/ORM/
The implementation, like all of them is not without its drawbacks however.
I use Data Abstract for my projects.