Most Compact File Storage of Time Stamp and String Pairs - vb.net

I would like to write a time stamp and string pair to a file in the most compact way possible. I started out writing the string representation of Ticks, then ASCII 31 as a separator, then the string, then a CR.
Then I realised that since Ticks is a Long it can be stored in only 8 bytes, so I should convert it to bytes and write those bytes to the file. That's fine, except those timestamp bytes might contain a byte whose value is 31, so my ASCII 31 delimiter is no longer unique.
What is the most compact way to store a timestamp and string pair to file?
Thanks.

Since Ticks has a fixed length, you can avoid the separator entirely: read the first 8 bytes as the Ticks value, then read the remaining bytes as the string.
:)
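A minimal sketch of that layout in Java (class and method names are illustrative): the 8-byte timestamp needs no delimiter because its length is fixed, and the string gets a length prefix instead of a separator, so a byte with value 31 inside the data can never be mistaken for a delimiter.

```java
import java.io.*;

// Sketch of a delimiter-free record layout (names are illustrative):
// each record is an 8-byte long followed by a length-prefixed string,
// so neither field needs a separator byte.
public class RecordDemo {

    // Encode one record: 8 bytes of ticks, then a 2-byte length prefix
    // and the string bytes (DataOutputStream.writeUTF adds the prefix).
    static byte[] encode(long ticks, String text) {
        try {
            ByteArrayOutputStream buf = new ByteArrayOutputStream();
            DataOutputStream out = new DataOutputStream(buf);
            out.writeLong(ticks);
            out.writeUTF(text);
            return buf.toByteArray();
        } catch (IOException e) {
            throw new UncheckedIOException(e);
        }
    }

    // Decode the record written by encode().
    static Object[] decode(byte[] record) {
        try {
            DataInputStream in = new DataInputStream(new ByteArrayInputStream(record));
            return new Object[] { in.readLong(), in.readUTF() };
        } catch (IOException e) {
            throw new UncheckedIOException(e);
        }
    }

    public static void main(String[] args) {
        byte[] record = encode(637500000000000000L, "hello");
        // 8 bytes of ticks + 2-byte length prefix + 5 string bytes = 15 bytes
        System.out.println(record.length);
        Object[] fields = decode(record);
        System.out.println(fields[0] + " " + fields[1]);
    }
}
```

Because every record is self-describing (fixed header, then length prefix), records can be appended back to back and read sequentially without any sentinel byte.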

Related

What costs more data, ASCII or HEX?

I'm dealing with a device that has both options to send data through UDP connection. As I couldn't find any comparison or something, could someone explain the difference in processing both?
Hex data transfers each byte as two hex characters, so each transmitted character carries only 4 bits of information even though it occupies a full 8-bit byte. Raw ASCII/binary data transfers 7 or 8 bits at a time, using the full 0..255 range, while a hex character only represents 0..15.
For example, the byte value 18 is transferred as the two hex characters "12" (taking up two bytes), but as raw binary it takes up one byte (00010010).
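A small Java sketch of the size difference (helper names are illustrative): hex encoding always doubles the byte count, since each raw byte becomes two characters.

```java
// Compare the on-the-wire size of raw bytes vs. their hex encoding.
public class HexVsRaw {

    // Hex-encode a byte array: each byte becomes two characters.
    static String toHex(byte[] data) {
        StringBuilder sb = new StringBuilder();
        for (byte b : data) {
            sb.append(String.format("%02x", b & 0xff)); // mask to 0..255
        }
        return sb.toString();
    }

    public static void main(String[] args) {
        byte[] raw = { 18 };               // one byte, value 18 (0x12)
        String hex = toHex(raw);           // "12": two characters, two bytes
        System.out.println(raw.length);    // 1
        System.out.println(hex);
        System.out.println(hex.length());  // 2: hex doubles the size
    }
}
```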

How to execute query longer than 32767 characters on Firebird?

I'm developing a Java web application that deals with large amounts of text (HTML code strings encoded using base64), which I need to save in my database. I'm using Firebird 2.0, and every time I try to insert a new record with strings longer than 32767 characters, I receive the following error:
GDS Exception. 335544726. Error reading data from the connection.
I have done some research about it, and apparently this is the character limit for Firebird, both for query strings and records in the database. I have tried a couple of things, like splitting the string in the query and then concatenating the parts, but it didn't work. Does anyone know any workarounds for this issue?
If you need to save large amounts of text data in the database, just use BLOB fields; a VARCHAR field's size is limited to 32 KB.
For better performance you can use binary BLOBs and store zipped data in them.
Firebird query strings are limited to 64 kilobytes in Firebird 2.5 and earlier. The maximum length of a varchar field is 32766 bytes (which means it can only store 8191 characters when using UTF-8!). The maximum size of a row (with blobs counting for 8 bytes) is 64 kilobytes as well.
If you want to store values longer than 32 kilobytes, you need to use a BLOB SUB_TYPE TEXT, and you need to use a prepared statement to set the value.
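A sketch of why the prepared statement sidesteps the limit, assuming a hypothetical pages(html) table with a BLOB SUB_TYPE TEXT column; the JDBC calls that need a live Firebird connection are shown as comments. The limit applies to the query text, and a bound parameter keeps the value out of the query text entirely.

```java
import java.util.Arrays;

// Sketch: why a parameterized INSERT avoids the query-length limit.
// The table and column names (pages, html) are hypothetical.
public class FirebirdBlobSketch {
    static final String SQL = "INSERT INTO pages (html) VALUES (?)";

    public static void main(String[] args) {
        // A value far beyond the 32767-character literal limit...
        char[] big = new char[100_000];
        Arrays.fill(big, 'a');
        String html = new String(big);

        // ...but the query text itself stays tiny, because the value is
        // sent as a bound parameter, not spliced into the SQL string:
        //   PreparedStatement ps = connection.prepareStatement(SQL);
        //   ps.setString(1, html);   // written to the BLOB SUB_TYPE TEXT column
        //   ps.executeUpdate();
        System.out.println(SQL.length());   // well under any query-length limit
        System.out.println(html.length());  // 100000
    }
}
```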

Number of real characters allowed in SQL Server varchar(max) or text data type

I need to know the maximum number of characters I can put into a varchar(max) or text field using SQL Server. In this page I have found that the maximum number of bytes for storage is 2GB (2^31 - 1). Since I suppose, according to this page and others I've searched, that a Unicode character is 2 bytes in size, I conclude that I have to divide the total byte size by the Unicode character size, which does not give an integer result. Any suggestions as to where I am going wrong? Why does the page say the maximum string length is 2^31 - 1 instead of 2^31?
From SQL Server 2012 Help:
Variable-length, non-Unicode string data. n defines the string length and can be a value from 1 through 8,000. max indicates that the maximum storage size is 2^31-1 bytes (2 GB). The storage size is the actual length of the data entered + 2 bytes. The ISO synonyms for varchar are char varying or character varying.
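The arithmetic behind those figures can be checked directly (a Java sketch; names are illustrative). The documented cap is a byte count, not a character count, so for a 2-bytes-per-character type the integer division simply drops the leftover odd byte.

```java
// Worked arithmetic behind the 2^31 - 1 figure: the maximum storage size
// is 2 GB minus one byte; dividing by the per-character size gives the
// character counts (integer division truncates the leftover odd byte).
public class MaxSizeMath {
    static final long MAX_BYTES = (1L << 31) - 1;  // 2147483647 bytes (2 GB - 1)

    public static void main(String[] args) {
        long varcharChars  = MAX_BYTES;       // varchar(max): 1 byte per character
        long nvarcharChars = MAX_BYTES / 2;   // nvarchar(max): 2 bytes per character
        System.out.println(MAX_BYTES);        // 2147483647
        System.out.println(varcharChars);     // 2147483647
        System.out.println(nvarcharChars);    // 1073741823; the non-integer
                                              // remainder is the odd byte
    }
}
```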

I have loaded a 1.5GB csv file, but my table size is only 250MB. Why is this so?

In Google BigQuery, I have loaded a 1.5GB CSV file from Google Storage. After it loaded successfully, my table size is only 250MB. Why is this so?
Likely because the binary encoding of numbers is more efficient than encoding them as strings. For example, the string "1234567890" takes 10 bytes (at least, or 20 bytes if it is UTF-16 encoded), but it can be represented by a 4 byte integer which only takes, ehm, 4 bytes.
Furthermore, the table in BigQuery can also leave out the separators, because it knows how many bytes wide each field is. That's another byte saved for every comma.
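The string-versus-binary saving can be demonstrated directly (a Java sketch with illustrative helper names): the same number costs 10 bytes as ASCII text but only 4 as a fixed-width 32-bit integer.

```java
import java.nio.ByteBuffer;
import java.nio.charset.StandardCharsets;

// Size of the same number as text vs. as a fixed-width binary integer.
public class BinaryVsText {

    // Bytes needed to store the number as an ASCII decimal string.
    static int textBytes(int n) {
        return Integer.toString(n).getBytes(StandardCharsets.US_ASCII).length;
    }

    // Bytes needed to store the number as a 32-bit binary integer.
    static int binaryBytes(int n) {
        return ByteBuffer.allocate(4).putInt(n).array().length;
    }

    public static void main(String[] args) {
        int n = 1234567890;
        System.out.println(textBytes(n));    // 10 bytes as an ASCII string
        System.out.println(binaryBytes(n));  // 4 bytes as a 32-bit integer
    }
}
```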

Does nvarchar always take twice as much space as varchar?

Nvarchar is used to store Unicode data, which is used to store multilingual data. If you don't end up storing Unicode data, does it still take up the same space?
YES.
See MSDN Books Online on NCHAR and NVARCHAR.
NCHAR:
The storage size is two times n bytes.
NVARCHAR
The storage size, in bytes, is two
times the number of characters entered
+ 2 bytes
Sort of. Not all Unicode encodings use two bytes per character. UTF-8, for example, is still just one byte per character a lot of the time, though some characters need up to 4 bytes. What nvarchar will do is allocate two bytes per character regardless.
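A quick Java check of those byte counts (helper names are illustrative): nvarchar stores UTF-16 code units, i.e. 2 bytes per character for most characters, while UTF-8 stores ASCII in a single byte.

```java
import java.nio.charset.StandardCharsets;

// Byte counts for the same text in UTF-8 vs. UTF-16 (nvarchar stores
// UTF-16 code units, so 2 bytes per character for most characters).
public class EncodingSizes {
    static int utf8Len(String s)  { return s.getBytes(StandardCharsets.UTF_8).length; }
    static int utf16Len(String s) { return s.getBytes(StandardCharsets.UTF_16LE).length; }

    public static void main(String[] args) {
        System.out.println(utf8Len("abc"));   // 3: one byte per ASCII character
        System.out.println(utf16Len("abc"));  // 6: two bytes per character
        System.out.println(utf8Len("\u00e9"));   // 2: non-ASCII costs more in UTF-8
        System.out.println(utf16Len("\u00e9"));  // 2
    }
}
```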