Get Current Date from TXT/LOG File - sql

Just started using LogParser recently, and I'm new to SQL queries.
I'm trying to get LogParser to find a log file with today's date, then output specific contents of the file into a text document. For example:
Select Text INTO D:\LogParser\output\BlahYYYY-MM-DD.txt
From 'C:\Logs\BlahYYYY-MM-DDBlah.log'
where Text like '%exception%'
How do I get my query to search for today's date in the file name, while keeping the YYYY-MM-DD format, and output to a text file with the same date and format?

Unfortunately, there is no way to use functions in the FROM clause (nor in the INTO clause).
If you can get the date by some other means, you can save the query to a file, use a placeholder such as %MYDATE% instead of the date, and then run the query passing the parameter value as follows:
LogParser file:myQuery.sql?MYDATE=...
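For example, the saved query file might look like this (a sketch, reusing the paths from the question and the %MYDATE% placeholder name from above):
SELECT Text INTO D:\LogParser\output\Blah%MYDATE%.txt
FROM 'C:\Logs\Blah%MYDATE%Blah.log'
WHERE Text LIKE '%exception%'
The calling script would then compute today's date in YYYY-MM-DD form and pass it as the MYDATE value on the command line shown above.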

Related

Importing CSV file but getting timestamp error

I'm trying to import CSV files into BigQuery, and on any of the hourly reports I attempt to upload it gives the error
Error while reading data, error message: Could not parse 4/12/2016 12:00:00 AM as TIMESTAMP for field SleepDay (position 1) starting at location 65 with message Invalid time zone: AM
I get that it is trying to interpret AM as a time zone, which causes the error, but I'm not sure how best to work around it. All of the hourly entries will have AM or PM after the date-time, and there will be thousands of entries.
I'm using schema auto-detect, and I believe that's where the issue comes from, but I'm not sure what to put in the "edit as text" schema option to fix it.
To successfully parse an imported string as a timestamp in BigQuery, the string must be in ISO 8601 format:
YYYY-MM-DDThh:mm:ss.sss
If your source data is not in this format, try the approach below.
1. Import the CSV into a temporary table, providing an explicit schema in which the timestamp fields are strings.
2. Select the data from the temporary table, use the BigQuery PARSE_TIMESTAMP function as shown below, and write the result to the permanent table.
INSERT INTO `example_project.example_dataset.permanent_table`
SELECT
PARSE_TIMESTAMP('%m/%d/%Y %I:%M:%S %p', time_stamp) AS time_stamp,
value
FROM `example_project.example_dataset.temporary_table`;
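The %I element is used here (rather than %H) because the hour is on a 12-hour clock accompanied by AM/PM. As a quick sanity check of the format string, a sketch using the sample value from the error message:
SELECT PARSE_TIMESTAMP('%m/%d/%Y %I:%M:%S %p', '4/12/2016 12:00:00 AM') AS parsed;
-- expected result: 2016-04-12 00:00:00 UTC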

How to solve error gdk-05030 when importing csv data in SQL Developer

I have a problem importing a CSV file using SQL Developer. I created a table to import 'date' data using the code below:
CREATE TABLE DEPARTMENT (
DATECOLUMN DATE
);
I imported the CSV by right-clicking the table and so on.
Interestingly, the CSV 'date' data has the format 'YYYY-MM-DD', but when loaded into SQL Developer (in the preview, before importing), it took the format 'MM/DD/YYYY'.
I believe this is the problem, because when I try to import, it returns the error "GDK-05030", meaning the data contains more characters than the format expects.
To me, it looks as follows:
- the date format in the CSV file is yyyy-mm-dd (as you said)
- when you queried the table after the insert, you found that the values look different - mm/dd/yyyy
That is because your session's current date format setting is mm/dd/yyyy. If you want to change it, run
alter session set nls_date_format = 'yyyy-mm-dd';
and then select the rows from the table again.
Alternatively, go to SQL Developer's Preferences and set the desired date format in its [Database - NLS] section.
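As a minimal sketch (using the table and column names from the question), you can check the current session setting and then change it:
SELECT value FROM nls_session_parameters WHERE parameter = 'NLS_DATE_FORMAT';
ALTER SESSION SET nls_date_format = 'YYYY-MM-DD';
SELECT datecolumn FROM department;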

In PostgreSQL, what data type do you pass to a CREATE TABLE call when dealing with timestamp values?

When creating a table, how do you deal with a timestamp in a CSV file that has the following format: MM/DD/YY HH:MI? Here's an example: 1/1/16 19:00
I have tried the following script in PostgreSQL:
create table timetable (
time timestamp
);
copy timetable from '<path>' delimiter ',' CSV;
But, I receive an error message saying:
ERROR: ERROR: invalid input syntax for type timestamp:
"visit_datetime" Where: COPY air_reserve, line 16, column
visit_datetime: "visit_datetime"
One solution I have considered is first creating the column as char, then running a separate query that converts it to the appropriate timestamp data type using to_timestamp(time, 'MM/DD/YY HH:MI'). But I'm looking for a solution that would let me load the data into the correct data type in a single step.
You may find a datestyle that enables you to load the data you have, but sooner or later someone will deliver to you something that doesn't fit.
The solution you have considered is probably the best.
We use this as a standard pattern for loading data warehouses. We take today's data and load it into a staging table, using varchar columns for any data that will not load directly into its target data type. We then run whatever scripts we need to get the data into a good state, raising warnings for anything that is broken in a way we haven't seen before. Then we add the cleaned version of today's data into the table containing cleaned data for all previous days.
We don't mind if this takes several steps; we put them all in a script and run it as an automated job.
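A minimal sketch of that staging pattern for the timestamp in the question (table and column names are illustrative):
-- stage anything that will not load cleanly as plain text
CREATE TABLE timetable_stage (
    time_raw varchar
);
COPY timetable_stage FROM '<path>' DELIMITER ',' CSV HEADER;
-- target table with the real data type
CREATE TABLE timetable (
    time timestamp
);
-- 1/1/16 19:00 is month/day/two-digit-year with a 24-hour clock
INSERT INTO timetable (time)
SELECT to_timestamp(time_raw, 'MM/DD/YY HH24:MI')
FROM timetable_stage;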
I'm working on documenting the techniques we use. You can see the beginnings of this at http://www.thedatastudio.net.

Date format not consistent for a particular column in .csv file using Application Engine in PeopleSoft

I am generating a report in .csv format. The file has date fields. In those fields, the date comes through in both "dd/mm/yyyy" and "dd-mm-yyyy" formats.
I have used to_char(date, 'DD-MM-YYYY') for all date fields, but I am still not getting a consistent date format. Please help me resolve this issue.
You can retrieve a date type from the DB and use a string field in the output file. In PeopleCode, use DateToLocalizedString.
PeopleSoft Documentation
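A rough PeopleCode sketch (the variable names and the pattern argument are assumptions; check the documentation linked above for the exact format codes DateToLocalizedString accepts):
Local date &dt;
Local string &out;
&dt = %Date;
/* write &out, not the raw date, to the CSV so every row uses the same format */
&out = DateToLocalizedString(&dt, "dd-MM-yyyy");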

How to load a file with different names with SQL Loader?

My problem is that I would like to load a data file (.csv) into an Oracle table with SQL*Loader, but the file name changes every day.
e.g.: today's file is TEST06062014.csv, and here is the relevant line from the control file:
INFILE 'TEST06062014.csv'
It works well.
But I'm going to use this every day, and the file name changes (tomorrow, the file name will be TEST07062014.csv).
Could I replace the file name with a "*" wildcard, or is there another solution? Do you have any ideas?
I would suggest writing a batch script and scheduling it to run every day, with the date as a parameter. If you are not sure how to get the date in DDMMYYYY format, have a look at this link:
http://www.tech-recipes.com/rx/956/windows-batch-file-bat-to-get-current-date-in-mmddyyyy-format/
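A minimal sketch of such a batch script (credentials, the control file name, and the %date% substring offsets are illustrative and locale-dependent; the file name is passed through sqlldr's data= command-line parameter instead of being hard-coded in INFILE):
@echo off
rem build today's date as DDMMYYYY - adjust the offsets if your locale's
rem %date% output differs (e.g. it includes the day name or is MM/DD/YYYY)
set TODAY=%date:~0,2%%date:~3,2%%date:~6,4%
sqlldr userid=scott/tiger control=load.ctl data=TEST%TODAY%.csv log=TEST%TODAY%.log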