List of Cryptocurrency version bytes (address prefix) - bitcoin

Where can I find the list of version bytes (address prefixes) for each currency that follows the Bitcoin implementation (P2PKH address encoding)?
I browsed the official Bitcoin GitHub and the BIPs but was not able to find anything about it; only currency IDs are listed there.
I found such a list on the WalletGenerator.net GitHub, in the code of index.html.
For example:
//name, networkVersion, privateKeyPrefix, WIF_Start, CWIF_Start
("Bitcoin", 0x00, 0x80, "5", "[LK]" )
("BitcoinCash", 0x00, 0x80, "5", "[LK]" )
("Blackcoin", 0x19, 0x99, "6", "P" )
("Litecoin", 0x30, 0xb0, "6", "T" )
...
Is there any kind of official or updated source listing the address prefixes (version bytes) of all cryptocurrencies?

The only good way to check this is to look at the sources. These prefixes are usually defined in chainparams.cpp. I don't believe there is an up-to-date table anywhere with all prefixes for all Bitcoin-based cryptocurrencies. Examples below:
Bitcoin:
base58Prefixes[PUBKEY_ADDRESS] = std::vector<unsigned char>(1,0);
base58Prefixes[SCRIPT_ADDRESS] = std::vector<unsigned char>(1,5);
base58Prefixes[SECRET_KEY] = std::vector<unsigned char>(1,128);
base58Prefixes[EXT_PUBLIC_KEY] = {0x04, 0x88, 0xB2, 0x1E};
base58Prefixes[EXT_SECRET_KEY] = {0x04, 0x88, 0xAD, 0xE4};
Litecoin:
base58Prefixes[PUBKEY_ADDRESS] = std::vector<unsigned char>(1,48);
base58Prefixes[SCRIPT_ADDRESS] = std::vector<unsigned char>(1,5);
base58Prefixes[SCRIPT_ADDRESS2] = std::vector<unsigned char>(1,50);
base58Prefixes[SECRET_KEY] = std::vector<unsigned char>(1,176);
base58Prefixes[EXT_PUBLIC_KEY] = {0x04, 0x88, 0xB2, 0x1E};
base58Prefixes[EXT_SECRET_KEY] = {0x04, 0x88, 0xAD, 0xE4};
Dash:
// Dash addresses start with 'X'
base58Prefixes[PUBKEY_ADDRESS] = std::vector<unsigned char>(1,76);
// Dash script addresses start with '7'
base58Prefixes[SCRIPT_ADDRESS] = std::vector<unsigned char>(1,16);
// Dash private keys start with '7' or 'X'
base58Prefixes[SECRET_KEY] = std::vector<unsigned char>(1,204);
// Dash BIP32 pubkeys start with 'xpub' (Bitcoin defaults)
base58Prefixes[EXT_PUBLIC_KEY] = boost::assign::list_of(0x04)(0x88)(0xB2)(0x1E).convert_to_container<std::vector<unsigned char> >();
// Dash BIP32 prvkeys start with 'xprv' (Bitcoin defaults)
base58Prefixes[EXT_SECRET_KEY] = boost::assign::list_of(0x04)(0x88)(0xAD)(0xE4).convert_to_container<std::vector<unsigned char> >();
ZCash:
// These prefixes are the same as the testnet prefixes
base58Prefixes[PUBKEY_ADDRESS] = {0x1D,0x25};
base58Prefixes[SCRIPT_ADDRESS] = {0x1C,0xBA};
base58Prefixes[SECRET_KEY] = {0xEF};
// do not rely on these BIP32 prefixes; they are not specified and may change
base58Prefixes[EXT_PUBLIC_KEY] = {0x04,0x35,0x87,0xCF};
base58Prefixes[EXT_SECRET_KEY] = {0x04,0x35,0x83,0x94};
base58Prefixes[ZCPAYMENT_ADDRRESS] = {0x16,0xB6};
base58Prefixes[ZCVIEWING_KEY] = {0xA8,0xAC,0x0C};
base58Prefixes[ZCSPENDING_KEY] = {0xAC,0x08};
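
For completeness, here is a minimal sketch (not production code) of how the PUBKEY_ADDRESS version byte becomes the leading character of a P2PKH address: the byte is prepended to the 20-byte HASH160 of the public key, a 4-byte double-SHA256 checksum is appended, and the result is Base58-encoded. It assumes OpenSSL's SHA256() is available (link with -lcrypto); the all-zero hash160 is just a placeholder.

#include <openssl/sha.h>
#include <cstdint>
#include <iostream>
#include <string>
#include <vector>

static const char* B58 = "123456789ABCDEFGHJKLMNPQRSTUVWXYZabcdefghijkmnopqrstuvwxyz";

// Base58Check: encode version || payload || first 4 bytes of SHA256(SHA256(version || payload))
std::string Base58Check(uint8_t version, const std::vector<uint8_t>& payload) {
    std::vector<uint8_t> data;
    data.push_back(version);
    data.insert(data.end(), payload.begin(), payload.end());

    uint8_t h1[SHA256_DIGEST_LENGTH], h2[SHA256_DIGEST_LENGTH];
    SHA256(data.data(), data.size(), h1);
    SHA256(h1, sizeof(h1), h2);
    data.insert(data.end(), h2, h2 + 4);                  // append 4-byte checksum

    size_t zeros = 0;                                     // each leading 0x00 byte encodes as '1'
    while (zeros < data.size() && data[zeros] == 0) ++zeros;

    std::string out;
    std::vector<uint8_t> num(data.begin() + zeros, data.end());
    while (!num.empty()) {                                // repeated long division by 58
        int rem = 0;
        std::vector<uint8_t> quot;
        for (uint8_t b : num) {
            int cur = rem * 256 + b;
            rem = cur % 58;
            uint8_t q = static_cast<uint8_t>(cur / 58);
            if (!quot.empty() || q != 0) quot.push_back(q);
        }
        out.insert(out.begin(), B58[rem]);
        num.swap(quot);
    }
    return std::string(zeros, '1') + out;
}

int main() {
    std::vector<uint8_t> hash160(20, 0x00);               // placeholder for HASH160(pubkey)
    std::cout << Base58Check(0x00, hash160) << "\n";      // Bitcoin  (0x00) -> starts with '1'
    std::cout << Base58Check(0x30, hash160) << "\n";      // Litecoin (0x30) -> starts with 'L'
    std::cout << Base58Check(0x4C, hash160) << "\n";      // Dash     (0x4C) -> starts with 'X'
}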

Related

I can't read key data (button press values) over SPI between XMC4800 and STLED316

I use SPI communication between an XMC4800 and an STLED316 LED controller. I can write data to the display on the STLED316; however, I can't read the key data (the keyscan feature of the STLED316, as described in its datasheet). I have some code for writing and reading. I suspect the command address or register address I send to the STLED316 is wrong, but I couldn't figure out whether that is the problem or something else. I have simplified the code; I use the SPI_MASTER API for communication. write_7segg works, but I can't use the key data's specific bits to read the button press values (0, 1). I need the key press (key data) values to change something on the display. If someone has worked with the STLED316 before, this may be more understandable. Finally, the STLED316 has a 5-digit display and 5 buttons with a keyscan feature. read[0] = STLED316_DATA_RD = 0x40, read[1] = STLED316_READ_PAGE = 0x08, keydata[0] = STLED316_ADDR_KEY_DATA1 = 0x01. Here is the code:
keydata = read_keyscan();
if ((((keydata) & 0x01) == 1) && (current_mode == 0)) {
    write_7segg(4, arServo_Numbers[0]);
    write_7segg(3, arServo_Numbers[0]);
    write_7segg(2, arServo_Characters[23]);
    write_7segg(1, arServo_Numbers[0]);
    write_7segg(0, arServo_Characters[15]);
}

void write_7segg(unsigned char DisplayNumber, unsigned char Segments) {
    // DIGITAL_IO_SetOutputLow(&DIGITAL_IO_0);
    Buffer[0] = DisplayNumber;
    Buffer[1] = Segments;
    DIGITAL_IO_SetOutputLow(&DIGITAL_IO_8);               // assert chip select
    SPI_MASTER_Transmit(&SPI_MASTER_01, Buffer, 2);
    DIGITAL_IO_SetOutputHigh(&DIGITAL_IO_8);              // release chip select
}

uint8_t readData(uint8_t address) {
    uint8_t sendByte = read[0] | read[1] | address;       // 0x40 | 0x08 | register address
    uint8_t readByte = 0;
    DIGITAL_IO_SetOutputLow(&DIGITAL_IO_8);
    SPI_MASTER_Transfer(&SPI_MASTER_01, &sendByte, &readByte, 1);
    DIGITAL_IO_SetOutputHigh(&DIGITAL_IO_8);
    return readByte;
}

uint16_t read_keyscan(void) {
    uint16_t keyState = 0;
    keyState = readData(keys[1]) << 8;
    keyState = readData(keys[0]);
    return keyState;
}

Can Objective-C enum members have the same value?

I'm a beginner in Objective-C. I've encountered the following typedef enum in Apple's docs.
typedef enum NSUnderlineStyle : NSInteger {
NSUnderlineStyleNone = 0x00,
NSUnderlineStyleSingle = 0x01,
NSUnderlineStyleThick = 0x02,
NSUnderlineStyleDouble = 0x09,
NSUnderlinePatternSolid = 0x0000,
NSUnderlinePatternDot = 0x0100,
NSUnderlinePatternDash = 0x0200,
NSUnderlinePatternDashDot = 0x0300,
NSUnderlinePatternDashDotDot = 0x0400,
NSUnderlineByWord = 0x8000
} NSUnderlineStyle;
Aren't the values for
NSUnderlineStyleNone = 0x00,
NSUnderlinePatternSolid = 0x0000,
the same hex 0? How is it possible to differentiate the two values?
Thanks in advance.
While Apple included them in the same enum definition, there are three distinct sets of values being defined: the first is the line style, the second is the pattern, and the third is a standalone option (ByWord).
When you define your options, you choose at most one value from each set and OR them together. Defining a style and a pattern with the same value (0) simply means that the default, a value of 0 in the result, is no underline, and that if a style is chosen, the default pattern is a solid line.
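
To make the bit layout concrete, here is a small plain-C++ illustration using the values from the enum above. The masks are illustrative, not Apple-defined constants.

#include <cstdio>

enum UnderlineBits : int {
    StyleNone    = 0x00,   // also PatternSolid: both defaults are 0
    StyleSingle  = 0x01,
    StyleDouble  = 0x09,
    PatternDot   = 0x0100,
    PatternDash  = 0x0200,
    ByWord       = 0x8000
};

int main() {
    int underline = StyleSingle | PatternDot | ByWord;   // 0x8101: one value from each set

    int lineStyle = underline & 0x000F;                  // low bits carry the line style
    int pattern   = underline & 0x0F00;                  // next bits carry the pattern
    bool byWord   = (underline & ByWord) != 0;

    std::printf("style=0x%X pattern=0x%X byWord=%d\n",
                (unsigned)lineStyle, (unsigned)pattern, byWord);
    // Leaving the style bits at 0 means "no underline"; leaving the pattern
    // bits at 0 means "solid", which is why both constants can be 0.
}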

C# HMACSHA1 Hash is not Right

I'm trying to decrypt a file. The first two steps are:
Copy the first 16 bytes of the file to a buffer. This is the HMAC-SHA1 hash of the file, which is made using one of the keys above.
Use HMAC-SHA1 on that buffer with a key from above to create the RC4 key, which is 0x10 bytes.
My code is:
OpenFileDialog ofd = new OpenFileDialog();
ofd.ShowDialog();
BinaryReader binread = new BinaryReader(File.Open(ofd.FileName, FileMode.Open));
byte[] RetailKey = { 0xE1, 0xBC, 0x15, 0x9C, 0x73, 0xB1, 0xEA, 0xE9, 0xAB, 0x31, 0x70, 0xF3, 0xAD, 0x47, 0xEB, 0xF3 };
HMACSHA1 SHA = new HMACSHA1(RetailKey); //Initalize HMAC w/ retail or development key
byte[] buffer = binread.ReadBytes(16);
buffer = SHA.ComputeHash(buffer);
MessageBox.Show(buffer.Length.ToString());
As you can see, it says the buffer has to be 10 bytes, but the message box says it is 20 bytes. Where is my mistake?
SHA-1, and thus HMAC-SHA-1, outputs 20 bytes.
You only need 16 (0x10) of them, so you need to truncate, for example with byte[] key = hmacSha1.ComputeHash(input).Take(16).ToArray().
The 0x in 0x10 is a prefix denoting hexadecimal numbers in C (and derived languages), so 0x10 means 16, not 10.
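
For comparison, a C++ sketch of the same two steps with OpenSSL (link with -lcrypto), just to make the sizes explicit. The 16-byte header is a placeholder for the first 16 bytes read from the file, and the key is the RetailKey from the question.

#include <openssl/evp.h>
#include <openssl/hmac.h>
#include <cstdint>
#include <cstdio>
#include <vector>

int main() {
    const uint8_t retailKey[16] = { 0xE1,0xBC,0x15,0x9C,0x73,0xB1,0xEA,0xE9,
                                    0xAB,0x31,0x70,0xF3,0xAD,0x47,0xEB,0xF3 };
    uint8_t header[16] = { 0 };                      // placeholder: first 16 bytes of the file

    uint8_t mac[EVP_MAX_MD_SIZE];
    unsigned int macLen = 0;
    HMAC(EVP_sha1(), retailKey, sizeof(retailKey),
         header, sizeof(header), mac, &macLen);      // HMAC-SHA1 always yields macLen == 20

    std::vector<uint8_t> rc4Key(mac, mac + 16);      // keep only the first 0x10 = 16 bytes
    std::printf("HMAC output: %u bytes, RC4 key: %zu bytes\n", macLen, rc4Key.size());
}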

iOS: send customized uint8 array

I'm trying to make an app that lets the iPhone communicate with other hardware over a dock-to-RS232 cable (which I bought from Redpark). I'm also using the library provided by Redpark. I made some simple code at the beginning, and it worked fine.
UInt8 infoCmd[5] = {0x3E,0x3E,0x05,0x80,0xff};
[rscMgr write:infoCmd Length:5];
Then I wanted to add more commands, so I created a method that returns the different command combinations I need.
- (UInt8 *)requestCommand:(int)commandName {
    UInt8 *command;
    if (commandName == DATADUMP) {
        command = [Communication buildDataDump];
    }
    if (commandName == GETSERIALINFO) {
        command = [Communication buildGetSerailInfo];
    }
    return command;
}
+ (UInt8 *)buildGetSerailInfo {
    UInt8 *command = malloc(sizeof(UInt8) * 5);
    command[0] = SYN;
    command[1] = SYN;
    command[2] = ENQ;
    command[3] = GETSERIALINFO;
    //command[4] = {SYN, SYN, ENQ, GETSERIALINFO};
    return command;
}
The thing is, some of my commands include data that can be 200 bytes long. How can I create a UInt8 array that makes it easier for me to add bytes?
I'm new to programming, so please explain in detail. Thank you a lot in advance.
Actually, you will just send raw bytes over the wire. I do something similar in one project (not a serial wire, but RS232 commands over TCP/IP), and it becomes quite simple if you use an NSMutableData instance.
A snippet from my code:
static u_int8_t codeTable[] = { 0x1b, 0x74, 0x10 };
static u_int8_t charSet[] = { 0x1b, 0x52, 0x10 };
static u_int8_t formatOff[] = { 0x1b, 0x21, 0x00 };
static u_int8_t reverseOn[] = { 0x1d, 0x42, 0x01 };
static u_int8_t reverseOff[]= { 0x1d, 0x42, 0x00 };
static u_int8_t paperCut[] = { 0x1d, 0x56, 0x0 };
NSMutableData *mdata = [NSMutableData dataWithBytes:&formatOff length:sizeof(formatOff)];
[mdata appendBytes:&formatOff length:sizeof(formatOff)];
[mdata appendBytes:&reverseOff length:sizeof(reverseOff)];
[mdata appendData: [NSData dataWithBytes: &codeTable length:sizeof(codeTable)]];
[mdata appendData: [NSData dataWithBytes: &charSet length:sizeof(charSet)]];
As you can see, I am just appending the data piece by piece.
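
If it helps to see the same idea outside of Cocoa, here is a plain C++ sketch of the pattern the answer describes: keep the command in a growable byte buffer, append pieces as they come, then hand a raw pointer and length to the write call at the end. The bytes are the info command from the question; sendOverSerial() is a hypothetical stand-in for the Redpark write:Length: call.

#include <cstdint>
#include <cstdio>
#include <vector>

// Stand-in for [rscMgr write:infoCmd Length:5] -- prints instead of writing to the port.
void sendOverSerial(const uint8_t *bytes, size_t length) {
    std::printf("sending %zu bytes\n", length);
}

int main() {
    std::vector<uint8_t> command = { 0x3E, 0x3E, 0x05, 0x80, 0xFF };   // the info command from the question

    std::vector<uint8_t> payload(200, 0x00);                           // e.g. a 200-byte data block
    command.insert(command.end(), payload.begin(), payload.end());     // append as many bytes as needed

    sendOverSerial(command.data(), command.size());                    // raw pointer + length, no manual malloc
}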

NSTextMovement values

A notification message contains values called NSTextMovement. Is there a list somewhere that tells what the different values are?
Thanks
Movement Codes
enum {
NSIllegalTextMovement = 0,
NSReturnTextMovement = 0x10,
NSTabTextMovement = 0x11,
NSBacktabTextMovement = 0x12,
NSLeftTextMovement = 0x13,
NSRightTextMovement = 0x14,
NSUpTextMovement = 0x15,
NSDownTextMovement = 0x16,
NSCancelTextMovement = 0x17,
NSOtherTextMovement = 0
};
Since macOS 10.13, the movement codes have been collected in an enum: NSTextMovement.
At least in Swift, it gives you better pattern matching and much shorter names – .tab instead of NSTextMovementTab etc.