I wish to create a table using the "Edit as text" schema option. The reason I don't use auto-detect is that my date format is not in the US format and BigQuery doesn't understand it (I'm not sure if it is a bug or I'm missing something). When I use auto-detect, BigQuery reads the date as mm/dd/yyyy while my date format is dd/mm/yyyy.
I've tried the below, but I'm getting an error:
State:string,
New_Existing:string,
PARSE_DATE('%d/%m/%Y', Prd_Dt):DATE
You can use JSON to write your schema when using "Edit as text"; refer to this GCP Documentation. Applied to your scenario, the schema would look something like this in JSON (field names and types taken from your example):
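[
  {"name": "State", "type": "STRING"},
  {"name": "New_Existing", "type": "STRING"},
  {"name": "Prd_Dt", "type": "DATE"}
]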
However, you can only use the PARSE_DATE function in BigQuery in queries and DML statements (for example, inside a SELECT), as per this GCP Documentation.
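Since the parsing can't happen in the schema itself, a common workaround is to load the date column as STRING and convert it at query time. A minimal sketch, assuming a hypothetical table mydataset.mytable where Prd_Dt was loaded as STRING:

SELECT
  State,
  New_Existing,
  PARSE_DATE('%d/%m/%Y', Prd_Dt) AS Prd_Dt  -- dd/mm/yyyy string to DATE
FROM mydataset.mytable;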
I am attempting to load a database from a CSV file using AsterixDB. Currently, it works using only string, int, and double fields. However, I have a column in the CSV file that is in DateTime format. Currently I am importing them as strings, which works fine, but I would like to import them as the SQL DateTime data type. When I try changing my schema and reimporting I get the following error:
ERROR: Code: 1 "org.apache.hyracks.algebricks.common.exceptions.NotImplementedException: No value parser factory for fields of type datetime"
All entries are in this format: 02/20/2010 12:00:00 AM.
I know this isn't exactly in line with the format specified by the Asterix Data Model; however, I tried a test line with the proper format and the error persisted.
Does this mean AsterixDB can't parse DateTime when doing mass imports? If so, how can I get around this issue?
Any help is much appreciated.
Alright, after discussing with some colleagues, we believe that AsterixDB does not currently support DateTime parsing when mass importing. Our solution was to upsert every entry in the dataset with the parsing built into the query.
We used the following query:
upsert into csv_set (
SELECT parse_datetime(c.Date_Rptd, "M/D/Y h:m:s a") as Datetime_Rptd,
parse_datetime(c.Date_OCC, "M/D/Y h:m:s a") as Datetime_OCC,
c.*
FROM csv_set c
);
As you can see, we parse the strings using the parse_datetime function from the AsterixDB temporal functions library. This query intentionally doesn't erase the columns with the DateTimes in string format, although that would be very simple to do if your application requires it (see the sketch below). If anyone has a better or more elegant solution, please feel free to add to this thread!
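If you do want to drop the original string fields afterwards, something like the following might work; this is only a sketch, assuming a recent AsterixDB release that includes the object_remove builtin (check the object functions documentation for your version):

upsert into csv_set (
  -- keep every field except the now-redundant string dates
  SELECT VALUE object_remove(object_remove(c, "Date_Rptd"), "Date_OCC")
  FROM csv_set c
);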
On loading a date column from Salesforce, I receive it in this format: 2017-01-31 22:00:00. I want to convert this to the German format and load it into a SQL table without the time.
Also, which datatype should I use when creating the column in the table?
Try using the CONVERT function if you're using MSSQL:
SELECT CONVERT(nvarchar(10), GETDATE(), 101)
This will produce mm/dd/yyyy.
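Since you asked for the German format specifically, style 104 should be closer, and a DATE column is the usual choice for storing the value without the time portion. A minimal sketch (GETDATE() stands in for your Salesforce value):

SELECT CONVERT(nvarchar(10), GETDATE(), 104);  -- style 104 = German dd.mm.yyyy
SELECT CAST(GETDATE() AS date);                -- drops the time for a DATE column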
If you need more samples, please look at W3Schools' page on converting times into different formats:
https://www.w3schools.com/sql/func_sqlserver_convert.asp
Updated question:
I am working with a scenario where the source Oracle schema has a field, say "Date of Birth", that is not saved in encrypted format, but when using a SELECT statement I want the output to be encrypted. I do not know the version of Oracle, so I can't look up the appropriate function in the documentation.
I have worked with MySQL and I am familiar with its PASSWORD() function; I am looking for a similar function for Oracle, in SQL rather than PL/SQL, as the application I am using only accepts a one-line query to fetch the results. I tried using DBMS_CRYPTO as per the documentation https://docs.oracle.com/cd/B19306_01/appdev.102/b14258/d_crypto.htm#i1004271 but I am getting an error fetching data in my application; it could be that the DB version does not support DBMS_CRYPTO.
Any other suggestions on which function can be used to display a non-encrypted field in encrypted form when using a SELECT over an Oracle (thin) connection?
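If a one-way hash (rather than reversible encryption) would satisfy the requirement, one candidate is STANDARD_HASH, which works in plain SQL with no PL/SQL. A sketch, assuming Oracle 12c or later and hypothetical table/column names:

SELECT STANDARD_HASH(date_of_birth, 'SHA256') AS dob_hashed  -- one-way hash, not encryption
FROM people;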
I have data in a BigQuery instance with some date fields in epoch/timestamp format. I'm trying to convert them to a YYYYMMDD format or similar in order to create a report in Data Studio. I have tried the following solutions so far:
Changing the format of the field to Date in the Edit Connection menu when creating the Data Source in Data Studio. Not working: I get configuration errors when I add the field to the Data Studio report.
Creating a new field using the TODATE() function. I always get an invalid formula error (even when I follow the documentation for this function). I have tried changing the field type prior to using the TODATE() function. Not working in any case.
Am I doing something wrong? Why do I always get errors?
Thanks!
The function for TODATE() is actually CURRENT_DATE(). Change a timestamp to a DATE using EXTRACT(DATE FROM variableName).
Make sure not to use Legacy SQL!
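A minimal sketch of that conversion, assuming a hypothetical table mydataset.mytable (standard SQL, not Legacy SQL):

SELECT EXTRACT(DATE FROM actual_delivery_date) AS delivery_date
FROM mydataset.mytable;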
The issue persisted, but changing the name of the variable from actual_delivery_date to ADelDate made it work, so I presume there's a bug and short(er) names may help to avoid it.
As commented by Elliott Brossard, the solution would be, instead of using Data Studio for the conversion, to use PARSE_DATE or PARSE_TIMESTAMP in BigQuery and convert it there.
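For reference, that conversion could look like the sketch below; table and column names are hypothetical, and which function applies depends on how the field is stored:

-- epoch seconds stored as an integer
SELECT FORMAT_DATE('%Y%m%d', DATE(TIMESTAMP_SECONDS(event_ts))) AS event_date
FROM mydataset.mytable;

-- timestamp stored as a string
SELECT FORMAT_DATE('%Y%m%d',
       DATE(PARSE_TIMESTAMP('%Y-%m-%d %H:%M:%S', event_ts_str))) AS event_date
FROM mydataset.mytable;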
I need to update a FileMaker timestamp field with a timestamp taken from PHP and put into a script using the PHP API and the executeSQL API and plugin,
so:
UPDATE table SET time ='2011-05-27 11:28:57'
My question is as follows: how do I utilise the available scripting functions within FileMaker Pro 11 to convert the string supplied in the SQL statement into a timestamp format FileMaker accepts? Or is it possible, using the executeSQL plugin for FileMaker, to do the conversion within the ExecuteSQL() function of the Execute SQL plugin?
I haven't tried it out, but it should work using CAST:
CAST( expression AS type [ (length) ] )
so, it should read:
UPDATE table SET time = CAST('2011-05-27 11:28:57' AS TIMESTAMP)
However, please be aware that FileMaker's own ExecuteSQL() function doesn't support UPDATE or INSERT INTO statements. You need to get a free extension from Dracoventions called epSQLExecute() in order to do this.
Hope this helps (someone).
Gary
You haven't given us much to go on, but my guess would be that you are updating a timestamp column with a string that does not match the required format.
You should convert your string to the appropriate object and then the update should work.