using sql reader to check for nulls when setting multiple values - sql

I have a SQL command that is picking up a row in my DB, but sometimes one of the DateTime values may be null.
example:
var reader = command.ExecuteReader();
if (reader.HasRows)
{
    List<AdmissionsVm> appListOut = new List<AdmissionsVm>();
    while (reader.Read())
    {
        appListOut.Add(new AdmissionsVm
        {
            Parish = Convert.ToString(reader.GetValue(40)),
            CofE = Convert.ToBoolean(reader.GetValue(41)),
            OtherFaith = Convert.ToString(reader.GetValue(42)),
            PrefSiblingName1 = Convert.ToString(reader.GetValue(43)),
            // this is the part I cannot get to work:
            if (!reader.GetValue(44).IsDbNull){SiblingDateOfBirth = Convert.ToDateTime(reader.GetValue(44))}
            SiblingGender = Convert.ToString(reader.GetValue(45))
        });
    }
}
I am actually bringing back a lot of details, but when SiblingDateOfBirth is null I can't seem to check it; I get errors on the fields that are set afterwards.
Any help would be appreciated.

It's often better to specify the column name instead of the column position, because if the query for some reason changes the order in which it returns columns, you would otherwise have to change the parameters of all the GetValue calls.
To check for null, try something like this:
if (!reader.IsDBNull(reader.GetOrdinal("YourColumnNameForPosition44")))
{
    SiblingDateOfBirth = reader.GetDateTime(reader.GetOrdinal("YourColumnNameForPosition44"));
}
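Since an if statement cannot appear inside an object initializer, a conditional expression can do the same job in place. A minimal sketch, assuming the column at position 44 is actually named SiblingDateOfBirth and the SiblingDateOfBirth property on AdmissionsVm is a nullable DateTime:
while (reader.Read())
{
    // Column name is an assumption; use whatever position 44 is really called.
    int dobOrdinal = reader.GetOrdinal("SiblingDateOfBirth");

    appListOut.Add(new AdmissionsVm
    {
        PrefSiblingName1 = Convert.ToString(reader.GetValue(43)),
        // Leave the property null (DateTime?) when the column holds DBNull.
        SiblingDateOfBirth = reader.IsDBNull(dobOrdinal)
            ? (DateTime?)null
            : reader.GetDateTime(dobOrdinal),
        SiblingGender = Convert.ToString(reader.GetValue(45))
    });
}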


SQLSTATE[22007]: Invalid datetime format

I am trying to save some data that comes from my view, which is a table, but I don't know why the insert throws this error.
result of insert
This is my view:
table of view
This is my controller:
$checked_array = $_POST['id_version'];
foreach ($request['id_version'] as $key => $value) {
    if (in_array($request['id_version'][$key], $checked_array))
    {
        $soft_instal = new Software_instalacion;
        $soft_instal->id_instalacion = $instalaciones->id;
        $soft_instal->id_historial = $historial->id;
        $soft_instal->id_usuario = $request->id_usuario;
        $soft_instal->id_version = $_POST['id_version'][$key];
        $soft_instal->obs_software = $_POST['obs_software'][$key];
        $soft_instal->id_tipo_venta = $_POST['id_tipo_venta'][$key];
        $soft_instal->save();
    }
}
id_tipo_venta seems to be an empty string, which is apparently not valid.
You can try debugging what you get with:
var_dump($_POST['id_tipo_venta'][$key]);
die;
Your database field expects an integer, so using the intval() function can solve your problem.
Indeed, I think your code is currently passing in an alphanumeric string.
The code below will therefore return 0 in all cases where no usable value comes in (not set, a non-numeric string, or simply null):
$soft_instal->id_tipo_venta = intval($_POST['id_tipo_venta'][$key]);
On the other hand, intval() always converts to int, so a decimal will be truncated, for example:
intval("1.1") // returns 1
intval("v1.1") // returns 0
If this is not the desired behavior, maybe you should think about changing your database type.
EDIT:
Of course, you can also set the value to null if you prefer that to 0. The column must then allow null values in your database.
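A minimal sketch of that null fallback, assuming the posted value can be missing or an empty string ($rawVenta is just an illustrative variable name):
// Treat a missing or empty value as NULL, otherwise cast to an integer.
$rawVenta = isset($_POST['id_tipo_venta'][$key]) ? $_POST['id_tipo_venta'][$key] : null;
$soft_instal->id_tipo_venta = ($rawVenta === null || $rawVenta === '') ? null : intval($rawVenta);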
id_tipo_venta cannot be empty; try it with some number, or change the column type to varchar in the database.

columnSummary is not added

I am trying to add columnSummary to my table using Handsontable, but it seems that the function does not fire. The stretchH value gets set properly, but the table does not react to the columnSummary option:
this.$refs.hot.hotInstance.updateSettings({
    stretchH: 'all',
    columnSummary: [{
        destinationRow: 0,
        destinationColumn: 2,
        reversedRowCoords: true,
        type: 'custom',
        customFunction: function(endpoint) {
            console.log("TEST");
        }
    }]
}, false);
I have also tried with type:'sum' without any luck.
Thanks for all help and guidance!
columnSummary cannot be changed with updateSettings: GH #3597
You can set columnSummary settings at the initialization of Handsontable.
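For example, a minimal sketch of passing the same settings at construction time (the container element and the data source are assumptions):
var hot = new Handsontable(document.getElementById('hot'), { // assumed container element
    data: myData, // assumed data source
    stretchH: 'all',
    columnSummary: [{
        destinationRow: 0,
        destinationColumn: 2,
        reversedRowCoords: true,
        type: 'sum'
    }]
});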
One workaround would be to manage your own column summary, since the Handsontable one can give you some headaches. You could try adding one additional row to hold your arithmetic, but it is messy (it needs a fixed number of rows and does not play well with filtering and sorting operations). Still, it could work well under some circumstances.
In my humble opinion, though, a column summary has to be fully functional, so the summary row needs to live outside the table data. What comes to mind is to take the above-mentioned additional row out of the table data "area", but that would force us to make this out-of-table row always look as if it were still part of the table.
So I thought that instead of adding a new row, we could simply put our column summary inside the column headers:
Here is a working JSFiddle example.
Once the Handsontable table is rendered, we need to iterate through the columns and set our column summary right in the table cell HTML content:
for (var i = 0; i < tableConfig.columns.length; i++) {
    var columnHeader = document.querySelectorAll('.ht_clone_top th')[i];
    if (columnHeader) { // Just to be sure the column header exists
        var summaryColumnHeader = document.createElement('div');
        summaryColumnHeader.className = 'custom-column-summary';
        columnHeader.appendChild(summaryColumnHeader);
    }
}
Now that our placeholders are set, we have to update them with some arithmetic results:
var printedData = hotInstance.getData();
for (var i = 0; i < tableConfig.columns.length; i++) {
    // Get back our column summary placeholder for each column
    var summaryColumnHeader = document.querySelectorAll('.ht_clone_top th')[i].querySelector('.custom-column-summary');
    if (summaryColumnHeader) {
        var res = 0;
        printedData.forEach(function(row) { res += row[i]; }); // Sum all data stored under that column
        summaryColumnHeader.innerText = '= ' + res;
    }
}
This piece of code can be wrapped in a function and called whenever it is needed:
var hotInstance = new Handsontable(/* ... */);

setMySummaryHeaderCalc(); // When the Handsontable table is printed

Handsontable.hooks.add('afterFilter', function(conditionsStack) { // When the table is filtered
    setMySummaryHeaderCalc();
}, hotInstance);
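Putting the two snippets above together, setMySummaryHeaderCalc could look something like this (a sketch; tableConfig and hotInstance are assumed to be in scope, as above):
function setMySummaryHeaderCalc() {
    var printedData = hotInstance.getData();
    for (var i = 0; i < tableConfig.columns.length; i++) {
        var columnHeader = document.querySelectorAll('.ht_clone_top th')[i];
        if (!columnHeader) continue;

        // Create the placeholder once, reuse it on later calls.
        var summary = columnHeader.querySelector('.custom-column-summary');
        if (!summary) {
            summary = document.createElement('div');
            summary.className = 'custom-column-summary';
            columnHeader.appendChild(summary);
        }

        // Sum the values currently displayed under this column.
        var res = 0;
        printedData.forEach(function (row) { res += row[i]; });
        summary.innerText = '= ' + res;
    }
}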
Feel free to comment, I could improve my answer.

Sales Order Confirmation Report - SalesConfirmDP

I am modifying the SalesConfirmDP class and trying to add the CustVendExternalItem.ExternalItemTxt field into a new field I have created.
I have tried a couple of things, but I do not think my syntax was correct, i.e. I declared the CustVendExternalItem table in the class declaration. But when I try to insert CustVendExternalItem.ExternalItemTxt into my new field, it does not populate; I guess there must be a method I need to include?
If anyone has any suggestion it would be highly appreciated.
Thank you in advance.
private void setSalesConfirmDetailsTmp(NoYes _confirmTransOrTaxTrans)
{
    DocuRefSearch docuRefSearch;

    // Body
    salesConfirmTmp.JournalRecId = custConfirmJour.RecId;
    if (_confirmTransOrTaxTrans == NoYes::Yes)
    {
        if (printLineHeader)
        {
            salesConfirmTmp.LineHeader = custConfirmTrans.LineHeader;
        }
        else
        {
            salesConfirmTmp.LineHeader = '';
        }

        salesConfirmTmp.ItemId = this.itemId();
        salesConfirmTmp.Name = custConfirmTrans.Name;
        salesConfirmTmp.Qty = custConfirmTrans.Qty;
        salesConfirmTmp.SalesUnitTxt = custConfirmTrans.salesUnitTxt();
        salesConfirmTmp.SalesPrice = custConfirmTrans.SalesPrice;
        salesConfirmTmp.DlvDate = custConfirmTrans.DlvDate;
        salesConfirmTmp.DiscPercent = custConfirmTrans.DiscPercent;
        salesConfirmTmp.DiscAmount = custConfirmTrans.DiscAmount;
        salesConfirmTmp.LineAmount = custConfirmTrans.LineAmount;
        salesConfirmTmp.CurrencyCode = custConfirmJour.CurrencyCode;
        salesConfirmTmp.PrintCode = custConfirmTrans.TaxWriteCode;

        if (pdsCWEnabled)
        {
            salesConfirmTmp.PdsCWUnitId = custConfirmTrans.pdsCWUnitId();
            salesConfirmTmp.PdsCWQty = custConfirmTrans.PdsCWQty;
        }

        salesConfirmTmp.ExternalItemText = CustVendExternalItem.ExternalItemTxt; // <-- the new field that does not populate

        if ((custFormletterDocument.DocuOnConfirm == DocuOnFormular::Line)
            || (custFormletterDocument.DocuOnConfirm == DocuOnFormular::All))
        {
            docuRefSearch = DocuRefSearch::newTypeIdAndRestriction(custConfirmTrans,
                                                                   custFormletterDocument.DocuTypeConfirm,
                                                                   DocuRestriction::External);
            salesConfirmTmp.Notes = Docu::concatDocuRefNotes(docuRefSearch);
        }
        salesConfirmTmp.InventDimPrint = this.printDimHistory();
Well, AX cannot guess which record you need; there is a helper class, CustVendExternalItemDescription, to deal with it:
boolean found;
str externalItemId;
...
[found, externalItemId, salesConfirmTmp.ExternalItemText] = CustVendExternalItemDescription::findExternalItemDescription(
    ModuleCustVend::Cust,
    custConfirmTrans.ItemId,
    custConfirmTrans.inventDim(),
    custConfirmJour.OrderAccount,
    CustTable::find(custConfirmJour.OrderAccount).CustItemGroupId);
The findExternalItemDescription method returns more information than you need here, but you have to define variables to store it anyway.
Well, the steps to solve this problem are fairly easy, and I will try to give you a step-by-step approach.
1) Are you initialising CustVendExternalItem properly? Make a record of it and initialise it as Jan has shown above, then debug your code and see whether the value is being initialised in your DP class.
2) If your value is being initialised correctly but it is not showing up in the report design, there can be multiple issues, such as:
Overlapping text boxes.
Insufficient space for the given field.
Some report parameter/property not being set correctly, which causes your value not to show up on the report.
Check these one by one and you should arrive at a solution.

"update" query - error invalid input synatx for integer: "{39}" - postgresql

I'm using Node.js 0.10.12 to perform queries against PostgreSQL 9.1.
I get the error invalid input syntax for integer: "{39}" (39 is an example number) when I try to perform an update query.
I cannot see what is going wrong. Any advice?
Here is my code (snippets) on the front end:
// this is global
var gid = 0;

// set websockets to search - works fine
var sd = new WebSocket("ws://localhost:0000");
sd.onmessage = function (evt)
{
    // get data, parse it (there is more than one var), pass the id to gid
    var received_msg = evt.data;
    var packet = JSON.parse(received_msg);
    var tid = packet['tid'];
    gid = tid;
}
// when the user clicks the button, set websockets to send the id and other data, to perform the update query
var sa = new WebSocket("ws://localhost:0000");
sa.onopen = function() {
    sa.send(JSON.stringify({
        command: 'typesave',
        indi: gid,
        name: document.getElementById("typename").value,
    }));
    sa.onmessage = function (evt) {
        alert("Saved");
        sa.close;
        gid = 0; // make gid 0 again, for re-use
    }
};
And the back end (query):
var query = client.query("UPDATE type SET t_name=$1, t_color=$2 WHERE t_id = $3", [name, color, indi]);
query.on("row", function (row, result) {
    result.addRow(row);
});
query.on("end", function (result) {
    connection.send("o");
    client.end();
});
Why does this not work, and why does the number not get recognized?
Thanks in advance.
As one would expect from the initial problem, your database driver is sending a one-member integer array into a field that expects an integer. PostgreSQL rightly rejects the data and returns an error. '{39}' in PostgreSQL terms is exactly equivalent to ARRAY[39] using an array constructor, and to [39] in JSON.
Now, obviously you can just change your query call to pull the first item out of the JSON array and send that instead of the whole array, but I would be worried about what happens if things change and you get multiple values. You may want to look at separating that logic out for this data structure.
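For instance, a minimal sketch of that unwrapping on the back end, assuming indi arrives as a one-element array from the parsed message:
// If indi comes in as something like [39], unwrap the single element
// before handing it to the parameterized query.
var id = Array.isArray(indi) ? indi[0] : indi;

var query = client.query("UPDATE type SET t_name=$1, t_color=$2 WHERE t_id = $3", [name, color, id]);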

ResultSet coming back empty after executing query

I have a query
SELECT instance_guid FROM service_instances WHERE service_template_guid='E578F99360A86E4EE043C28DE50A1D84' AND service_family_name='TEST'
Directly executing this returns me
4FEFDE7671A760A8DC8FC63CFBFC8316
F2F9DF641D8E2CACC03175A7A628D51D
Now I am trying the same query from JDBC.
PreparedStatement ps = null;
ResultSet rs = null;
try {
    conn = executionContext.getConnection();
    if (conn != null) {
        ps = (PreparedStatement) conn.prepareStatement(query);
        if (params == null) params = new Object[0];
        for (int i = 0; i < params.length; i++) {
            if (params[i] instanceof Integer) {
                ps.setInt(i + 1, ((Integer) params[i]).intValue());
            } else if (params[i] instanceof java.util.Date) {
                ((PreparedStatement) ps).setDATE(i + 1, new oracle.sql.DATE((new java.sql.Timestamp(((Date) params[i]).getTime()))));
                //ps.setObject(i+1, new oracle.sql.DATE(new Time(((Date)params[i]).getTime())));
            } else {
                if (params[i] == null) params[i] = "";
                ps.setString(i + 1, params[i].toString());
            }
        }
        rs = ps.executeQuery();
I see params[0] = E578F99360A86E4EE043C28DE50A1D84 and params[1] = TEST, but the ResultSet is empty and I am not getting the result. I debugged, but it did not help much.
Can you please let me know whether I am doing this right?
In Java it is defined as below:
final static private String INSTANCE_GUID_BY_TEMPLATE_GUID =
    "SELECT instance_guid FROM service_instances WHERE service_template_guid=? AND service_family_name=? ";
SERVICE_FAMILY_NAME NOT NULL VARCHAR2(256)
SERVICE_TEMPLATE_GUID NOT NULL RAW(16 BYTE)
First and foremost, this breaks every SQL mapping pattern I have ever seen.
String sql = "SELECT instance_guid FROM service_instances WHERE service_template_guid=? AND service_family_name=?";
PreparedStatement ps = null;
ResultSet rs = null;
try {
    conn = executionContext.getConnection();
    ps = conn.prepareStatement(sql);
    ps.setString(1, guid);
    ps.setString(2, family);
    rs = ps.executeQuery();
    while (rs.next()) { ... }
    ...
}
You should not be dynamically figuring out the data types as they come in, unless you are trying to write some code to port from database X to database Y.
UPDATE
I see you are using RAW as a datatype, from this post:
As described in the Oracle JDBC Developer's Guide and Reference 11g, when using a RAW column you can treat it as a BINARY or VARBINARY JDBC type, which means you can use the JDBC standard methods getBytes() and setBytes(), which return or accept a byte[]. The other option is to use the Oracle driver specific extensions getRAW() and setRAW(), which return or accept an oracle.sql.RAW. Using these two will require you to unwrap and/or cast to the specific Oracle implementation class.
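Building on that, here is a sketch of two ways the GUID parameter could be bound against the RAW(16) column; hexToBytes is a hypothetical helper, and HEXTORAW is Oracle's SQL-side conversion function:
// Option A: let Oracle convert the hex string to RAW in the SQL itself.
String sqlA = "SELECT instance_guid FROM service_instances "
            + "WHERE service_template_guid = HEXTORAW(?) AND service_family_name = ?";
PreparedStatement psA = conn.prepareStatement(sqlA);
psA.setString(1, "E578F99360A86E4EE043C28DE50A1D84");
psA.setString(2, "TEST");

// Option B: keep the original SQL and bind the RAW column as bytes,
// as the quoted guide suggests.
PreparedStatement psB = conn.prepareStatement(INSTANCE_GUID_BY_TEMPLATE_GUID);
psB.setBytes(1, hexToBytes("E578F99360A86E4EE043C28DE50A1D84"));
psB.setString(2, "TEST");

// Hypothetical helper: decode a hex string into the byte[] the RAW column stores.
static byte[] hexToBytes(String hex) {
    byte[] out = new byte[hex.length() / 2];
    for (int i = 0; i < out.length; i++) {
        out[i] = (byte) Integer.parseInt(hex.substring(2 * i, 2 * i + 2), 16);
    }
    return out;
}
Either way, the point is that the bound parameter has to match the RAW column rather than relying on an implicit string comparison.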
Further, from a code readability standpoint, your solution makes it painful for a new developer to take over. Far too often I see people making SQL "dynamic" when in reality 99% of the time you don't need this level of dynamic query building. It sounds good in most people's heads, but it just causes pain and suffering in the SDLC.