I am writing a Perl script which copies data from a table in one DB to the same table in another DB. I am using DBI to connect to the databases.
I noticed that when I copy the data, dates are not copied properly.
In source table if date is like this-'04/22/1996 13:51:15 PM'
In destination table it's appearing like this-'22 APR 1996'.
Can anyone help me copy the exact date?
Thanks in advance
#!/usr/bin/env perl
use strict;
use warnings;
use DateTime::Format::Strptime;

my $datetimestr = '04/22/1996 13:51:15 PM';

# 1) parse the datetime string
my $parser = DateTime::Format::Strptime->new(
    pattern  => '%m/%d/%Y %H:%M:%S %p',
    on_error => 'croak',
);
my $dt = $parser->parse_datetime($datetimestr);

# 2) format the datetime for your destination database
warn $dt->strftime('%Y-%m-%d %H:%M:%S');
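For comparison, the same parse-then-reformat approach can be sketched in Python. Note the input string mixes a 24-hour hour with an AM/PM marker; strptime tolerates this because %p is ignored when the hour comes from %H:

```python
from datetime import datetime

raw = '04/22/1996 13:51:15 PM'

# Parse the source string; %p is effectively ignored since the hour
# is already given in 24-hour form via %H.
dt = datetime.strptime(raw, '%m/%d/%Y %H:%M:%S %p')

# Re-format into an ISO-style string the destination column can accept.
formatted = dt.strftime('%Y-%m-%d %H:%M:%S')
print(formatted)  # 1996-04-22 13:51:15
```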
How to write SAS dates to Microsoft SQL Server 2016 Date data type in database?
I have SAS data with a SAS date DataEndDay that I want to write to a database. The following bit is in use (the buffer is just to speed up the testing):
libname valu oledb provider=sqloledb schema="dbo" INSERTBUFF=100
properties=("User ID"="&username." Password="&pw."
"data source" = &database.
"initial catalog"=&catalog.);
proc sql noprint;
insert into valu.Data_upload_from_me
( <some_columns...>,
<more-columns...>
,DataEndDay
)
select
<some_columns_source...>,
<more-columns_source...>
,DataEndDay
from work.SAS_data_to_publish
;quit;
Of course, because SAS dates are numbers, direct writing is going to fail. It works if I hard-code the date as:
select
<some_columns_source...>,
<more-columns_source...>
,'2018-12-12'
from work.SAS_data_to_publish
;quit;
But if I convert the SAS date to a string in a SAS data step:
data SAS_data_to_publish ;
set SAS_data_to_publish ;
dataEndday0 = put(DataEndDay, yymmddd10.);
DataEndDay1 = quote(dataEndday0, "'") ;
run;
and try to write either of these, I get a conversion error:
ERROR: ICommand::Execute failed. : Conversion failed when converting date and/or time from character string.
When I select the string it looks pretty ok:
proc sql; select DataEndDay1 from SAS_data_to_publish; quit;
'2018-12-12'
Previously I've managed to write datetimes with a similar trick, which works:
proc format;
picture sjm
. = .
other='%Y-%0m-%0d %0H:%0M:%0S:000' (datatype=datetime)
;run;
data to_be_written;
set save.raw_data_to_be_written;
DataEndDay0 = put(dhms(DataEndDay,0,0,0), sjm. -L);
run;
Has anyone run into similar issues? How can I write the dates?
I could ask them to change the column to datetime, maybe...
Thank you in advance.
Edit:
I managed to develop a workaround, which works but is ugly and, frankly, I don't like it. It so happens that my date is the same for all rows, so I can assign it to a macro variable and then use it when writing to the database.
data _NULL_;
set SAS_data_to_publish;
call symput('foobar', quote( put (DataEndDay , yymmddd10. -L), "'") ) ;
run;
....
select
<some_columns_source...>,
<more-columns_source...>
,&foobar.
from work.SAS_data_to_publish
;quit;
Of course this would fail immediately should DataEndDay vary, but maybe it demonstrates that something is off in PROC SQL's SELECT clause...
Edit 2: Posted the question to the SAS forums.
I finally managed to crack the issue, which was with missing values. Since I am passing the values as strings to the database, the parser interpreted missing values as literal dots instead of empty strings. The following works:
data upload;
set upload;
CreatedReportdate2 = PUT(CreatedReportdate , yymmddn8.);
run;
libname uplad_db odbc noprompt =
"DRIVER=SQL Server; server=&server.; Uid=&user.;Pwd=&pw.; DATABASE=&db.;"
INSERTBUFF=32767;
proc sql;
insert into uplad_db.upload_table
(.... )
select
case when CreatedReportdate2 ='.' then '' else CreatedReportdate2 end,
...
from upload;
quit;
SAS does not properly support the SQL Server DATE data type. I imagine this is because it's newer, but for whatever reason you have to pass the data as strings.
For missing values, it's important to have a blank string, not a . character. The easiest workaround here is to set:
options missing=' ';
That will allow you to insert data properly. You can then return it to . if you wish. In a production application that might be used by others, I'd consider storing the option value aside temporarily and then restoring it, in order to do no harm.
Normally I just use PROC APPEND to insert observations into a remote database.
proc append base=valu.Data_upload_from_me force
data=work.SAS_data_to_publish
;
run;
Make sure the date variables in your SAS dataset use the same data types as the corresponding variables in your target database table. So if your MS SQL database uses TIMESTAMP fields for date values, make sure your SAS dataset uses DATETIME values.
If you want to use constants, make sure to use SAS syntax in your SAS code and MS SQL syntax in any pass-through code.
data test;
date = '01JAN2017'd ;
datetime = '01JAN2017:00:00'dt ;
run;
proc sql ;
connect to oledb .... ;
execute ( ... date = '2017-01-01' .... datetime='2017-01-01 00:00' ...)
by oledb;
quit;
I have a txt file containing numerous items in the following format
DBSERVER: HKSER
DBREPLICAID: 51376694590
DBPATH: redirect.nsf
DBTITLE: Redirect AP
DATETIME: 09.03.2015 09:44:21 AM
READS: 1
Adds: 0
Updates: 0
Deletes: 0
DBSERVER: HKSER
DBREPLICAID: 21425584590
DBPATH: redirect.nsf
DBTITLE: Redirect AP
DATETIME: 08.03.2015 09:50:20 PM
READS: 2
Adds: 0
Updates: 0
Deletes: 0
...
I would like to import the txt file into SQL in the following layout:
DBSERVER  DBREPLICAID  DBPATH        DBTITLE      DATETIME                ...
HKSER     51376694590  redirect.nsf  Redirect AP  09.03.2015 09:44:21 AM
HKSER     21425584590  redirect.nsf  Redirect AP  08.03.2015 01:08:07 AM
Thanks a lot!
You can dump that file into a temporary table with just a single text column. Once imported, loop through that table with a cursor, storing the content into variables, and insert a new row into the real target table after each complete record (the sample shows nine lines per record).
Not the most elegant solution, but it's simple and it will do the job.
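The same record-assembly idea can be sketched outside the database, e.g. in Python: read the key/value lines, detect the start of each new record, and emit one row per record for a bulk insert. The field names are taken from the sample file; adjust them to the real export:

```python
# Field order for the output rows, as seen in the sample file.
FIELDS = ['DBSERVER', 'DBREPLICAID', 'DBPATH', 'DBTITLE',
          'DATETIME', 'READS', 'Adds', 'Updates', 'Deletes']

def parse_records(lines):
    """Turn 'KEY: value' lines into one row per record."""
    rows, current = [], {}
    for line in lines:
        line = line.strip()
        if not line or ':' not in line:
            continue  # skip blanks and filler lines
        key, value = line.split(':', 1)
        key = key.strip()
        if key in current:  # a repeated key means a new record begins
            rows.append([current.get(f, '') for f in FIELDS])
            current = {}
        current[key] = value.strip()
    if current:  # flush the last record
        rows.append([current.get(f, '') for f in FIELDS])
    return rows
```

Note the DATETIME values themselves contain colons, which is why the split is limited to the first colon on each line.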
Using BULK INSERT you can load these headers and data into two different columns, and then, using a dynamic SQL query, you can create a table and insert the data as required.
For something like this I'd probably use SSIS.
The idea is to create a Script Component (as a Transformation).
You'll need to manually define your output columns (e.g. DBSERVER String(100)).
The source is your file (read normally).
The idea is that you build your rows line by line, then add the full row to the output buffer, e.g.:
Output0Buffer.AddRow();
Then write the rows to your destination.
If all files have a common format, you can wrap the whole thing in a Foreach Loop.
I need to use R to write a query against a database my R environment is connected to. The structure of the query looks like this:
ALTER TABLE cph.table_id ADD PARTITION (event_date = 'YYYY-MM-DD')
LOCATION 's3://external-dwh-company-com/id-to-idl/YYYYMMDD'
So, for example, today's addition would look like this:
ALTER TABLE cph.table_id ADD PARTITION (event_date = '2018-08-02')
LOCATION 's3://external-dwh-company-com/id-to-idl/20180802'
The issue is, I need to do this for every date going back to 03/01/2018.
So the steps would look like:
initial_query <- paste(#however the above query would be formatted with the dates)
results_query <- dbGetQuery(conn, initial_query)
The biggest hurdles for me are 1) figuring out the paste formatting for the first part, and 2) creating a loop that will run the above steps up to the current date.
Consider looping through the range of days since a targeted begin date with lapply and seq, concatenating strings with sprintf and corresponding date format:
# GET DIFFERENCE BETWEEN TODAY AND DESIRED BEGIN DATE
date_diff <- Sys.Date() - as.Date("2018-03-01")
date_diff[[1]]
# 155
sql <- "ALTER TABLE cph.table_id ADD PARTITION (event_date = '%s')
LOCATION 's3://external-dwh-company-com/id-to-idl/%s'"
# RUN QUERY SEQUENTIALLY, ADDING TO THE BEGIN DATE
# (seq starts at 1, so the begin date itself is excluded; use seq(0, ...) to include it)
output <- lapply(seq(1, date_diff[[1]]), function(i)
    dbGetQuery(conn, sprintf(sql,
        strftime(as.Date("2018-03-01") + i, "%Y-%m-%d"),
        strftime(as.Date("2018-03-01") + i, "%Y%m%d"))))
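The same loop can be sketched in Python to show the string templating on its own, separate from the DB call. This version includes both endpoints of the date range; each generated statement would then be handed to the database driver:

```python
from datetime import date, timedelta

# Statement template with two slots: ISO date and compact yyyymmdd date.
TEMPLATE = ("ALTER TABLE cph.table_id ADD PARTITION (event_date = '{iso}') "
            "LOCATION 's3://external-dwh-company-com/id-to-idl/{compact}'")

def partition_statements(start, end):
    """Yield one ALTER TABLE statement per day, inclusive of both endpoints."""
    for i in range((end - start).days + 1):
        d = start + timedelta(days=i)
        yield TEMPLATE.format(iso=d.isoformat(), compact=d.strftime('%Y%m%d'))

stmts = list(partition_statements(date(2018, 3, 1), date(2018, 8, 2)))
```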
I have a table with an interval column, something like this.
CREATE TABLE validity (
window INTERVAL NOT NULL
);
Assume the stored value is 'P3DT1H', which is in ISO 8601 format.
When I read the value back, it comes out in the regular Postgres format:
3 days 01:00:00
However I want the value in iso_8601 format. How can I achieve it?
so=# CREATE TABLE validity (
w INTERVAL NOT NULL
);
CREATE TABLE
so=# insert into validity values ('3 days 01:00:00');
INSERT 0 1
You are probably looking for intervalstyle:
so=# set intervalstyle to iso_8601;
SET
so=# select w From validity;
w
--------
P3DT1H
(1 row)
It can be set per transaction, session, role, database, or cluster.
You can use SET intervalstyle query and set the style to iso_8601. Then, when you output the results, they will be in ISO 8601 format.
_, err := s.db.Exec("SET intervalstyle='iso_8601'")
res, err := s.db.Query("select interval '1d1m'")
// res contains a row with P1DT1M
If you are looking for a way to change intervalstyle for all sessions on a server level, you can update it in your configuration file:
-- connect to your psql using whatever client, e.g. cli and run
SHOW config_file;
-- in my case: /usr/local/var/postgres/postgresql.conf
Edit this file and add the following line:
intervalstyle = 'iso_8601'
In my case the file already had a commented-out line with intervalstyle, and its value was postgres. Change it and restart the service.
That way you won't have to change the style from golang each time you run a query.
The error that I found in the log is the one below.
'Illuminate\Database\QueryException' with message 'SQLSTATE[22007]:
[Microsoft][ODBC Driver 11 for SQL Server][SQL Server]Conversion
failed when converting date and/or time from character string. (SQL:
SELECT COUNT(*) AS aggregate
FROM [mytable]
WHERE [mytable].[deleted_at] IS NULL
AND [created_at] BETWEEN '2015-09-30T00:00:00' AND '2015-09-30T23:59:59'
AND ((SELECT COUNT(*) FROM [mytable_translation]
WHERE [mytable_translation].[item_id] = [mytable].[id]) >= 1)
)'
in
wwwroot\myproject\vendor\laravel\framework\src\Illuminate\Database\Connection.php:625
On the database, the DataType is datetime and is not null
Based on marc_s's answer I tried to change the format that I'm sending to the database. I tried without the T: and [created_at] between '2015-09-30 00:00:00' and '2015-09-30 23:59:59'.
Locally I'm using MySQL, and the code works just fine. If I run the query above in the SQL Server client, both versions (with and without the T) work too.
How can I fix this problem without making any changes to the database itself?
The PHP/Laravel code:
$items = $items->whereBetween($key, ["'".$value_aux."T00:00:00'", "'".$value_aux."T23:59:59'"]);
With #lad2025's help, I got it to work.
Based on his comments on my question, I changed the format I was passing in the code (Laravel/PHP in this case). In reality, I "removed" the formatting itself and just assigned the fields to variables before passing them to the query. This way, I let the database decide the format it wants.
Instead of
$itens = $itens->whereBetween($key, ["'".$value_aux."T00:00:00'", "'".$value_aux."T23:59:59'"]);
I changed the code to this:
$sta = $value_aux."T00:00:00";
$end = $value_aux."T23:59:59";
$itens = $itens->whereBetween($key, [$sta, $end]);
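The underlying mistake is easy to reproduce outside Laravel: when a bound parameter already contains quote characters, those quotes become part of the value itself, so the comparison silently matches nothing (or, as here, trips the server's string-to-date conversion). A small illustration using SQLite's parameter binding as a stand-in for the SQL Server driver:

```python
import sqlite3

conn = sqlite3.connect(':memory:')
conn.execute('CREATE TABLE mytable (created_at TEXT)')
conn.execute("INSERT INTO mytable VALUES ('2015-09-30T12:00:00')")

value_aux = '2015-09-30'

# Wrong: the quotes are embedded in the bound values, so nothing matches.
bad = conn.execute(
    'SELECT COUNT(*) FROM mytable WHERE created_at BETWEEN ? AND ?',
    ("'" + value_aux + "T00:00:00'", "'" + value_aux + "T23:59:59'")).fetchone()[0]

# Right: pass the bare strings and let the driver handle quoting.
good = conn.execute(
    'SELECT COUNT(*) FROM mytable WHERE created_at BETWEEN ? AND ?',
    (value_aux + 'T00:00:00', value_aux + 'T23:59:59')).fetchone()[0]

print(bad, good)  # 0 1
```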