String type versus char in ABAP

What are the drawbacks of the String type in ABAP? When should it be used, and when not?
An example: I have a text field that should hold values ranging from 0 to 12 characters. Is it better to use a STRING or a CHAR(12)?
Thanks!

A string is stored as a dynamic array of characters, while a char is statically allocated.
Some of the downsides of strings include:
Overhead: because they are dynamic, the length must be stored in addition to the actual character data.
The substring and offset operators don't work with strings.
Strings cannot be turned into translatable text elements.
So to answer your question: strings should only be used for fairly long values with a wide range of lengths, where the additional overhead is negligible compared to the space a static char(x) variable would waste.
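As a rough illustration of the allocation difference (variable names made up, not from the question):

DATA lv_char TYPE c LENGTH 12. " fixed size: always occupies 12 characters
DATA lv_str  TYPE string.      " dynamic: allocated to fit the current content

lv_char = 'AB'.                " stored as 'AB' plus 10 trailing blanks
lv_str  = `AB`.                " stored as exactly 2 characters plus string metadata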

I think CHAR is best here, because you know for certain that the field only ever has to hold 0-12 characters.

STRING is a variable-length data type, while for CHAR you have to define the length.
A fixed-length type C field (text field, alphanumeric characters) is padded to its declared length with blanks as its initial value, just as a type X field (hexadecimal) is padded with X'00 ... 00'.
To avoid that padding and work with the actual length of the content, a STRING is used.

Strings are good when:
The text length will be variable.
Spaces are part of the value (trailing spaces in CHAR fields are lost).
You pass them around a lot (a STRING is copied by reference, so only its metadata is copied, which is less than copying a large CHAR field).
You need the STRING length often; this is faster than with CHAR fields, where trailing blanks have to be scanned.
CHAR fields are good:
If they are small, they are fast (less than around 32 chars on Unicode systems).
CHAR field literals using (') quotes instead of (`) can be made into translatable texts.
Things to remember:
All variables have metadata, but a string also has an internal pointer to the string data, which can add up to 64 bytes to memory consumption. Something to keep in mind.
When assigning a literal text to a variable, try to match the literal type to the variable type: use 'test' for CHAR and `test` for STRING. This is usually slightly faster. (See the sketch below.)
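A small sketch of the trailing-space and literal points above (illustrative, not from the original answer):

DATA lv_str TYPE string.

lv_str = 'AB  '. " text field literal: trailing blanks are cut on conversion
" strlen( lv_str ) = 2
lv_str = `AB  `. " string literal: trailing blanks are preserved
" strlen( lv_str ) = 4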

String Variable:
A STRING is a variable-length data type used to store data of any length. Variable-length fields are used because they save space.
A STRING can store any number of characters; the memory is allocated at runtime (dynamic memory allocation) to match the actual size of the content. STRING variables cannot be declared as PARAMETERS, because the allocation is dynamic.
But in your case you already know the maximum length of the field (0-12 characters), so the CHAR type is the best choice here. The STRING type is generally used for variable-length or long values.

Related

How to query for a zero-byte char?

According to the documentation, pg_attribute.attgenerated is typed as char and has a value of "a zero byte" if the column is not generated, and there is at least one other possible value, with potentially more in the future.
I want to query for all non-generated columns. Since I would prefer to not be tripped up by additions in future versions, the query predicate needs to be WHERE attgenerated = ZERO BYTE rather than an inequality, but I have no idea how to represent that value correctly in SQL.
What's the correct way to write this? In most programming languages you'd say '\0', and you can use escape sequences by prepending an e to the string literal, but if I say e'\0' it errors out with "invalid byte sequence for encoding "UTF8": 0x00". So I'm not quite sure what the right way to do this is.
It's simply an empty string:
WHERE attgenerated = ''
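For context: the zero byte in a "char" column compares equal to the empty string. A minimal sketch of the full query (my_table is a placeholder name; the extra filters just skip system and dropped columns):

SELECT attname
FROM pg_attribute
WHERE attrelid = 'my_table'::regclass
  AND attnum > 0
  AND NOT attisdropped
  AND attgenerated = ''; -- the zero byte compares equal to '' in "char"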

Encoding numbers

I am a developer using high level languages. I usually take the lower level details for granted.
I read that standards such as ASCII and Unicode are for character encodings. A character has to be stored as a number. Is this the same for numbers? For example, if I declare a variable in .NET like this:
Dim test As Integer = 5
In this case the value of test (5) will be represented as decimal: 49 according to this table. Is that correct?
If you code Dim test As String = "5", the value will be stored using the Unicode encoding of the character "5". However, Integers (and other numeric types) are not strings and are not encoded that way; they are represented internally by their numeric value. An Integer is stored as a 32-bit value.
What you are asking about is data representation in memory. The way integers are represented depends on whether they are signed or unsigned. If they are signed (usually the case, unless you specify the type as unsigned int or something equivalent), they are represented in binary in two's complement form: http://en.wikipedia.org/wiki/Two%27s_complement
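A short sketch contrasting the two representations (illustrative only):

Imports System
Imports System.Text

Module Demo
    Sub Main()
        Dim test As Integer = 5
        ' The Integer is stored as a 32-bit binary value, not as a character:
        Console.WriteLine(BitConverter.ToString(BitConverter.GetBytes(test))) ' 05-00-00-00
        ' A String is stored as UTF-16 code units; "5" is U+0035:
        Console.WriteLine(BitConverter.ToString(Encoding.Unicode.GetBytes("5"))) ' 35-00
    End Sub
End Module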

Convert an alphanumeric string to integer format

I need to store an alphanumeric string in an integer column on one of my models.
I have tried:
@result.each do |i|
  hex_id = []
  i["id"].split(//).each { |c| hex_id.push(c.hex) }
  hex_id = hex_id.join
  ...
  Model.create(:origin_id => hex_id)
  ...
end
When I run this in the console with puts hex_id in place of the create line, it prints the correct values; however, the above code results in origin_id being set to "2147483647" for every instance. An example input string is "t6gnk3pp86gg4sboh5oin5vr40", so that doesn't make any sense to me.
Can anyone tell me what is going wrong here or suggest a better way to store a string like the aforementioned example as a unique integer?
Thanks.
Answering by request from the OP.
It seems the hex_id.join call does concatenate the hex values into one long digit string, but that string (e.g. 060003008600401100500050040) is far larger than a 4-byte signed integer column can hold, so the database clamps the value to the maximum, 2147483647. That is why every record ends up with the same origin_id.
Since the desired result is too large to be recorded as an integer either way, a better approach would be to keep it as a string, or to use a different algorithm for producing a number from the original string. Perhaps aggregating the hex values with an arithmetic operation would do better than join?
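If the IDs are guaranteed to be base-36 alphanumerics (digits and lowercase letters), one hedged alternative is Ruby's built-in base conversion, which is reversible; note the column would then have to be a string or NUMERIC, not a 4-byte integer:

s = "t6gnk3pp86gg4sboh5oin5vr40"
n = s.to_i(36)  # => a very large Integer; Ruby integers are arbitrary precision
n.to_s(36)      # => "t6gnk3pp86gg4sboh5oin5vr40" (round-trips losslessly)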

Objective C: Parsing JSON string

I have a string data which I need to parse into a dictionary object. Here is my code:
NSString *barcode = [NSString stringWithString:@"{\"OTP\": 24923313, \"Person\": 100000000000112, \"Coupons\": [ 54900012445, 499030000003, 00000005662 ] }"];
NSLog(@"%@", [barcode objectFromJSONString]);
In this log, I get NULL result. But if I pass only one value in Coupons, I get the results. How to get all three values ?
00000005662 might not be a proper integer number as it's prefixed by zeroes (which means it's octal, IIRC). Try removing them.
Cyrille is right; here is the authoritative answer:
The application/json Media Type for JavaScript Object Notation (JSON): 2.4 Numbers
The representation of numbers is similar to that used in most programming languages. A number contains an integer component that may be prefixed with an optional minus sign, which may be followed by a fraction part and/or an exponent part.
Octal and hex forms are not allowed. Leading zeros are not allowed.
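One hedged fix, reusing the objectFromJSONString call from the question (JSONKit): quote the coupon codes as strings so the leading zeros survive and the JSON stays valid:

NSString *barcode = @"{\"OTP\": 24923313, \"Person\": 100000000000112, \"Coupons\": [\"54900012445\", \"499030000003\", \"00000005662\"]}";
NSLog(@"%@", [barcode objectFromJSONString]);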

Enumerating Strings as bytes?

I was looking for a way to enumerate String types in (vb).NET, but .NET enums only accept numeric type values.
The first alternative I came across was to create a dictionary of my enum values and the string I want to return. This worked, but was hard to maintain because if you changed the enum you would have to remember to also change the dictionary.
The second alternative was to set field attributes on each enum member and retrieve them using reflection. Sure enough this worked as well, and it also solved the maintenance problem, but it uses reflection, and I've always read that reflection should be a last resort.
So I started thinking and I came up with this: every ASCII character can be represented as a hexadecimal value, and you can assign hexadecimal values to enum members.
You could get rid of the attributes and assign the hexadecimal values to the enum members. Then, when you need the text value, convert the value to a byte array and use System.Text.Encoding.ASCII.GetString(enumMemberBytes) to get the string back.
Now speaking out of experience, anything I come up with is usually either flawed or just plain wrong. What do you guys think about this approach? Is there any reason not to do it like that?
Thanks.
EDIT
As pointed out by David W, enum member values are limited in length, depending on the underlying type (Integer by default). So yes, I believe my method works, but you are limited to characters in the ASCII table, with a maximum length of 4 characters (Integer) or 8 characters (Long).
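To make the idea concrete, here is a hedged sketch with a made-up Fruit enum; each member's value is the little-endian ASCII encoding of its own 4-character name:

Imports System
Imports System.Text

Module EnumDemo
    Enum Fruit As Integer
        Pear = &H52414550 ' bytes &H50 &H45 &H41 &H52 = "PEAR" (little-endian)
        Plum = &H4D554C50 ' "PLUM"
    End Enum

    Sub Main()
        Dim bytes As Byte() = BitConverter.GetBytes(CInt(Fruit.Pear))
        Console.WriteLine(Encoding.ASCII.GetString(bytes)) ' prints PEAR
    End Sub
End Module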
The easiest way I have found to dynamically parse a String representation of an Enumeration into the actual Enumeration type was to do the following:
Private Enum EnumObject
    [Undefined]
    ValueA
    ValueB
End Enum

Dim enumVal As EnumObject = DirectCast([Enum].Parse(GetType(EnumObject), "ValueA"), EnumObject)
This removes the need to maintain a dictionary and allows you to handle strings directly instead of converting to an Int or a Long. This does use reflection, but I have not come across any issues as long as you catch and handle any exceptions from the string Parse.