Phalcon: Convert MySQL data types to PHP data types and vice versa

What is the best way to handle data type conversion between MySQL and PHP while using Phalcon models? When a datetime field is retrieved from MySQL, it is converted to a string, which I want to automatically convert to a DateTime object. Similarly, for MySQL decimal fields, I want to convert the value to a custom Decimal type.
So, where exactly does this data type conversion happen? Or, if it does not happen, what's the best way to achieve this kind of conversion? I went through the documentation but couldn't find anything relevant to this.
Any help is highly appreciated.

There are two ways to handle this that I know of.
One is using model annotations to describe metadata:
http://docs.phalconphp.com/en/latest/reference/models.html#annotations-strategy
It sounds like this will solve your issue with decimals, but not with datetimes.
The other is by using an afterFetch hook to mutate the model:
http://docs.phalconphp.com/en/latest/reference/models.html#initializing-preparing-fetched-records
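As a rough sketch of the afterFetch route (the model and column names below are made up; I'm assuming a DATETIME column called created_at and a DECIMAL column called amount), something along these lines should work:

<?php

use Phalcon\Mvc\Model;

class Invoice extends Model
{
    // Column names here are invented for the example
    public $created_at; // DATETIME in MySQL
    public $amount;     // DECIMAL in MySQL

    // Runs automatically after a row is hydrated from MySQL
    public function afterFetch()
    {
        if (is_string($this->created_at)) {
            $this->created_at = new \DateTime($this->created_at);
        }
        // Swap the cast for your own Decimal value object if you have one
        $this->amount = (float) $this->amount;
    }

    // Convert back before the row is written
    public function beforeSave()
    {
        if ($this->created_at instanceof \DateTime) {
            $this->created_at = $this->created_at->format('Y-m-d H:i:s');
        }
    }
}

The beforeSave hook is there so the mutated values go back to MySQL as plain strings rather than objects.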

Related

Why are all my variables objects instead of numerical values (int,float) when uploaded?

I just started, so this might be a stupid question, but I have the following problem:
I created a .csv file for some basic data description. However, although the variables are all numerical values without any missing values, when using df.dtypes I see all variables as objects, with only some being int64 or float64. Do I have to manually convert all object variables to numerical ones with code?
Or is there anything I did wrong when creating my csv?
Also, the date, which I saved in the format yyyy-mm-dd, is shown as object instead of a date format.
The numbers in the data range from [0, 2] for some variables to [0, 2000000] for others.
Could the formatting in Excel be a problem?
Is there any "How to build your csv" documentation? So that I don't have to ask stupid beginner questions like this.
Additionally, I was told that for a model to work properly I need to do some scaling/normalization of my data, as the value ranges differ a lot. Where can I find more information on that?
I would suggest you just do the data type conversion before saving the CSV file. You can use the functions below for conversion (a short sketch follows the list):
astype()
to_numeric()
convert_dtypes()
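As a rough sketch of those three in practice (the DataFrame and column names here are invented for illustration):

import pandas as pd

# Toy frame standing in for the CSV data; column names are made up
df = pd.DataFrame({
    "count": ["1", "2", "3"],
    "price": ["1,5", "2,0", "bad"],   # German decimal commas plus one bad value
    "as_of": ["2023-01-01", "2023-01-02", "2023-01-03"],
})

df["count"] = df["count"].astype(int)  # strict cast, raises on bad values
df["price"] = pd.to_numeric(df["price"].str.replace(",", "."), errors="coerce")  # bad values become NaN
df["as_of"] = pd.to_datetime(df["as_of"])  # yyyy-mm-dd strings to datetime64
df = df.convert_dtypes()  # infer the best nullable dtypes for what is left

print(df.dtypes)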
You can use the linked article for information on scaling: https://www.analyticsvidhya.com/blog/2020/07/types-of-feature-transformation-and-scaling/
pd.read_csv already has an option to specify the type, so if you want you can pass the dtype to read_csv. For the date, you always have to change the format to datetime (for example with parse_dates or pd.to_datetime).
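For instance, a minimal sketch (the file name and column names are placeholders for your own data):

import pandas as pd

# "data.csv" and the column names are placeholders
df = pd.read_csv(
    "data.csv",
    dtype={"count": "int64", "price": "float64"},  # force numeric types at load time
    parse_dates=["as_of"],                         # parse yyyy-mm-dd strings into datetime64
)
print(df.dtypes)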
Whether to scale or normalize your data is also going to depend on which machine learning model you are going to use.
For example: if you compare a random forest and a KNN, the KNN will need feature scaling since it works with distances, whereas the random forest does not care about the scale.
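As a minimal illustration with scikit-learn (the numbers below are invented to mimic very different value ranges):

import numpy as np
from sklearn.preprocessing import StandardScaler

# Toy matrix with very different column ranges, like [0, 2] vs [0, 2000000]
X = np.array([[0.5, 1000000.0],
              [1.5, 250000.0],
              [2.0, 2000000.0]])

# StandardScaler rescales each column to zero mean and unit variance,
# so a distance-based model such as KNN treats both features comparably
X_scaled = StandardScaler().fit_transform(X)
print(X_scaled)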
Hands-On Machine Learning with Scikit-Learn, Keras, and TensorFlow: Concepts, Tools, and Techniques to Build Intelligent Systems is a good book to start with, in my personal opinion.
Thanks for the ideas.
In the end, pd.read_csv(title, decimal=',') helped to create them as floats, since I used German number formatting.
But conversion with to_numeric() also worked.

How can I change a date field from String to Date or DateTime?

I am using Google BigQuery and I have a field named 'AsOfDate' which is set as a string data type. I have a bunch of data in this field, which I really want to set as DateTime or just Date; either is fine. I Googled for a solution, and I thought this would be pretty easy to do, but I can't seem to get the data type updated. I don't want to run a simple select statement; I want to permanently change the schema. Has anyone run into this and figured out how to do this kind of thing? If so, please share your insights. Thanks!
To quote directly from the official documentation: 'Changing a column's data type is not supported by the BigQuery web UI, the command-line tool, or the API.'
https://cloud.google.com/bigquery/docs/manually-changing-schemas#changing_a_columns_data_type
There are two ways to manually change a column's data type:
Using a SQL query: choose this option if you are more concerned about simplicity and ease of use, and you are less concerned about costs.
Recreating the table: choose this option if you are more concerned about costs, and you are less concerned about simplicity and ease of use.
You could use either of the approaches above along with the PARSE_DATE() function to transform your string into a date field.
https://cloud.google.com/bigquery/docs/reference/standard-sql/functions-and-operators#parse_date
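For example, the query-based route might look roughly like this (the dataset and table names are placeholders, and the '%Y-%m-%d' format assumes your strings look like 2020-01-31):

-- Recreates the table with AsOfDate parsed as a DATE column
CREATE OR REPLACE TABLE `mydataset.mytable` AS
SELECT
  * EXCEPT (AsOfDate),
  PARSE_DATE('%Y-%m-%d', AsOfDate) AS AsOfDate
FROM
  `mydataset.mytable`;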

Apache NiFi: InferAvroSchema infers signed values as string

I'm setting up a pipeline in NiFi where I get JSON records which I then use to make a request to an API. The response I get has both numeric and textual data, which I then have to write to Hive. I use InferAvroSchema to infer the schema. Some numeric values are signed values like -2.46 and -0.1. While inferring the type, the processor considers them as string instead of a double, float or decimal type.
I know we can hard-code our Avro schema in the processors, but I thought making it more dynamic by utilizing InferAvroSchema would be even better. Is there any other way we can overcome/resolve this?
InferAvroSchema is good for guessing an initial schema, but once you need something more specific it is better to remove InferAvroSchema and provide the exact schema you need.
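If you do go that route, the hand-written schema is just an Avro record along these lines (the record and field names here are made up); declaring the numeric field as double keeps signed values like -2.46 out of the string type:

{
  "type": "record",
  "name": "ApiResponse",
  "fields": [
    { "name": "sensor_id", "type": "string" },
    { "name": "reading", "type": "double" },
    { "name": "comment", "type": ["null", "string"], "default": null }
  ]
}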

SQL Server Decimals have changed

I have just uploaded an Access database to SQL Server 2008 and the numeric fields have been changed to things like:
2.5364E-05,
2.5364E-05,
2.7598E-05,
2.8425E-05,
2.7598E-05,
2.5364E-05,
2.5364E-05,
I have seen this happen before, but now I need to know how to resolve it.
Is there any way to convert the numbers back, or avoid the problem in the first place?
Thanks all!
C
It's very unlikely that your data has actually been changed. This is a presentational effect that happens with floating-point types: small values are displayed in scientific notation, but the stored value is unchanged.
Have you tried formatting the data using CONVERT(VarChar, ...)? What is the format you're expecting?
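As a rough sketch of that kind of formatting (table and column names are placeholders, and DECIMAL(18, 8) assumes eight decimal places are enough for your values):

-- Pin the scale first, then convert to text so the value is rendered
-- as a plain decimal instead of scientific notation
SELECT
    my_value,
    CONVERT(VARCHAR(30), CAST(my_value AS DECIMAL(18, 8))) AS my_value_text
FROM dbo.MyTable;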

Preserve leading zeros when mapping NUMERIC(5,0) to string

Hopefully this is an easy question. I have a legacy DB2 database on an AS/400 where fields like Zip Code are stored as NUMERIC. When mapping them in NHibernate they are treated as integers and the leading zeros are lost. Typically I would use
SELECT DIGITS(field) FROM TABLE
to preserve leading zeros, but it's my understanding that if I start creating formulas to correct the formatting, I force the field to become read-only.
What is the correct way to map the NUMERIC type (with its leading digits) to a string type and back again?
You are trying to deal with it at the wrong level.
Leading zeroes are a display concern; just use "00000" as your format string.
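For instance, in C# (zipCode here stands in for whatever property NHibernate hydrates from the NUMERIC(5,0) column):

using System;

class Program
{
    static void Main()
    {
        int zipCode = 392; // value as it comes back from the NUMERIC(5,0) column

        // "00000" pads with leading zeros at display time, e.g. "00392"
        Console.WriteLine(zipCode.ToString("00000"));
    }
}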
Diego is right, but if you really need to represent the zip codes as strings in your domain you could implement your own IUserType and use it in your mappings. However, I'm not sure that you'll be able to handle getting the leading zeroes into the numeric database column (if this is something you need). I think it would be easy enough if you stored your zip code as '392' in the database and then presented it as '00392' though.
For a quick overview of what's involved in creating an IUserType, take a look at this blog post: Mapping Strings to Booleans using NHibernate's IUserType