Simple joystick hid report descriptor doesn't work - usb

Using an ATmega8 chip and the V-USB library, I made a little bridge to connect my NES gamepad to USB. At first I used one of the examples as my HID descriptor.
My device was correctly recognized in Windows when I set it to handshake with this
HID report descriptor:
PROGMEM const char usbHidReportDescriptor[100] = {
0x05, 0x01, // USAGE_PAGE (Generic Desktop)
0x15, 0x00, // LOGICAL_MINIMUM (0)
0x09, 0x04, // USAGE (Joystick)
0xa1, 0x01, // COLLECTION (Application)
0x05, 0x02, // USAGE_PAGE (Simulation Controls)
0x09, 0xbb, // USAGE (Throttle)
0x15, 0x81, // LOGICAL_MINIMUM (-127)
0x25, 0x7f, // LOGICAL_MAXIMUM (127)
0x75, 0x08, // REPORT_SIZE (8)
0x95, 0x01, // REPORT_COUNT (1)
0x81, 0x02, // INPUT (Data,Var,Abs)
0x05, 0x01, // USAGE_PAGE (Generic Desktop)
0x09, 0x01, // USAGE (Pointer)
0xa1, 0x00, // COLLECTION (Physical)
0x09, 0x30, // USAGE (X)
0x09, 0x31, // USAGE (Y)
0x95, 0x02, // REPORT_COUNT (2)
0x81, 0x02, // INPUT (Data,Var,Abs)
0xc0, // END_COLLECTION
0x09, 0x39, // USAGE (Hat switch)
0x15, 0x00, // LOGICAL_MINIMUM (0)
0x25, 0x03, // LOGICAL_MAXIMUM (3)
0x35, 0x00, // PHYSICAL_MINIMUM (0)
0x46, 0x0e, 0x01, // PHYSICAL_MAXIMUM (270)
0x65, 0x14, // UNIT (Eng Rot:Angular Pos)
0x75, 0x04, // REPORT_SIZE (4)
0x95, 0x01, // REPORT_COUNT (1)
0x81, 0x02, // INPUT (Data,Var,Abs)
0x05, 0x09, // USAGE_PAGE (Button)
0x19, 0x01, // USAGE_MINIMUM (Button 1)
0x29, 0x04, // USAGE_MAXIMUM (Button 4)
0x15, 0x00, // LOGICAL_MINIMUM (0)
0x25, 0x01, // LOGICAL_MAXIMUM (1)
0x75, 0x01, // REPORT_SIZE (1)
0x95, 0x04, // REPORT_COUNT (4)
0x55, 0x00, // UNIT_EXPONENT (0)
0x65, 0x00, // UNIT (None)
0x81, 0x02, // INPUT (Data,Var,Abs)
0xc0 // END_COLLECTION
};
and this report type
typedef struct{
char throttle;
char x;
char y;
uchar hatSwitchAndButtons;
}report_t;
But when I made it simpler:
PROGMEM const char usbHidReportDescriptor[48] = {
0x05, 0x01, // USAGE_PAGE (Generic Desktop)
0x09, 0x04, // USAGE (Joystick)
0xa1, 0x01, // COLLECTION (Application)
0x15, 0x81, // LOGICAL_MINIMUM (-127)
0x25, 0x7f, // LOGICAL_MAXIMUM (127)
0x05, 0x01, // USAGE_PAGE (Generic Desktop)
0x09, 0x01, // USAGE (Pointer)
0xa1, 0x00, // COLLECTION (Physical)
0x09, 0x30, // USAGE (X)
0x09, 0x31, // USAGE (Y)
0x75, 0x08, // REPORT_SIZE (8)
0x95, 0x02, // REPORT_COUNT (2)
0x81, 0x02, // INPUT (Data,Var,Abs)
0xc0, // END_COLLECTION
0x05, 0x09, // USAGE_PAGE (Button)
0x19, 0x01, // USAGE_MINIMUM (Button 1)
0x29, 0x04, // USAGE_MAXIMUM (Button 8)
0x15, 0x00, // LOGICAL_MINIMUM (0)
0x25, 0x01, // LOGICAL_MAXIMUM (1)
0x75, 0x01, // REPORT_SIZE (1)
0x95, 0x04, // REPORT_COUNT (8)
0x55, 0x00, // UNIT_EXPONENT (0)
0x65, 0x00, // UNIT (None)
0x81, 0x02, // INPUT (Data,Var,Abs)
0xc0 // END_COLLECTION
};
typedef struct{
char x;
char y;
uchar buttons;
}report_t;
Windows has not been able to install my joystick. It will still recognize its name but fails to finish the installation.
I'm confused because the USB HID tool validates this report descriptor.
Any ideas, please?

0x29, 0x04, // USAGE_MAXIMUM (Button 8)
0x95, 0x04, // REPORT_COUNT (8)
Oops, looks like someone programs with comments. :)
Change 0x04 to 0x08 in both places, and make sure the total report size is divisible by 8 bits (8, 16, 24, 32, etc.).
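For reference, a sketch of the button block after that change (only the data bytes change, so the declared 48-byte descriptor length stays the same); together with the two 8-bit axes the report is then 3 bytes (24 bits), which matches the report_t struct above:
0x05, 0x09, // USAGE_PAGE (Button)
0x19, 0x01, // USAGE_MINIMUM (Button 1)
0x29, 0x08, // USAGE_MAXIMUM (Button 8)
0x15, 0x00, // LOGICAL_MINIMUM (0)
0x25, 0x01, // LOGICAL_MAXIMUM (1)
0x75, 0x01, // REPORT_SIZE (1)
0x95, 0x08, // REPORT_COUNT (8)
0x55, 0x00, // UNIT_EXPONENT (0)
0x65, 0x00, // UNIT (None)
0x81, 0x02, // INPUT (Data,Var,Abs)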

Related

How to parse hid report descriptor?

All the tutorials I've read about USB HID report descriptors use numbers to represent data. Where do these numbers come from?
Example:
0x05, 0x01, // USAGE_PAGE (Generic Desktop)
0x09, 0x02, // USAGE (Mouse)
0xa1, 0x01, // COLLECTION (Application)
0x09, 0x01, // USAGE (Pointer)
0xa1, 0x00, // COLLECTION (Physical)
0x05, 0x09, // USAGE_PAGE (Button)
0x19, 0x01, // USAGE_MINIMUM (Button 1)
0x29, 0x03, // USAGE_MAXIMUM (Button 3)
0x15, 0x00, // LOGICAL_MINIMUM (0)
0x25, 0x01, // LOGICAL_MAXIMUM (1)
0x95, 0x03, // REPORT_COUNT (3)
0x75, 0x01, // REPORT_SIZE (1)
0x81, 0x02, // INPUT (Data,Var,Abs)
0x95, 0x01, // REPORT_COUNT (1)
0x75, 0x05, // REPORT_SIZE (5)
0x81, 0x03, // INPUT (Cnst,Var,Abs)
0x05, 0x01, // USAGE_PAGE (Generic Desktop)
0x09, 0x30, // USAGE (X)
0x09, 0x31, // USAGE (Y)
0x15, 0x81, // LOGICAL_MINIMUM (-127)
0x25, 0x7f, // LOGICAL_MAXIMUM (127)
0x75, 0x08, // REPORT_SIZE (8)
0x95, 0x02, // REPORT_COUNT (2)
0x81, 0x06, // INPUT (Data,Var,Rel)
0xc0, // END_COLLECTION
0xc0 // END_COLLECTION
For example, where do the 0x05 and 0x01 in the first line come from? How does that translate to // USAGE_PAGE (Generic Desktop)? What are valid values? How are they ordered? How is nesting defined?
See my answer to a similar question at Custom HID device HID report descriptor (extract follows):
To understand HID Report Descriptors you need to read some of the documents on the HID Information page. In particular, you should try to understand:
The "Device Class Definition for HID 1.11" document - which describes the Human Interface Device report format
The "HID Usage Tables 1.3" document - which describes the values of many Usage Pages and Usages within those pages that can appear in a Report Descriptor
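As a quick intuition before diving into those documents: every short item in a report descriptor starts with a prefix byte whose bits encode the item tag (bits 7-4), the item type (bits 3-2: Main, Global, or Local), and how many data bytes follow (bits 1-0). A minimal C sketch of decoding that prefix (the helper name is just for illustration):
#include <stdio.h>
#include <stdint.h>

/* Decode the prefix byte of a HID "short item":
   bits 7..4 = tag, bits 3..2 = type (0 Main, 1 Global, 2 Local), bits 1..0 = size. */
static void decode_prefix(uint8_t prefix)
{
    static const char *types[] = { "Main", "Global", "Local", "Reserved" };
    unsigned tag  = prefix >> 4;
    unsigned type = (prefix >> 2) & 0x3u;
    unsigned size = prefix & 0x3u;        /* 3 actually means 4 data bytes */
    printf("tag=%u type=%s size=%u\n", tag, types[type], size == 3 ? 4 : size);
}

int main(void)
{
    decode_prefix(0x05); /* Global tag 0 = Usage Page; the next byte 0x01 = Generic Desktop */
    decode_prefix(0x09); /* Local tag 0 = Usage; the next byte 0x02 = Mouse */
    decode_prefix(0xa1); /* Main tag 0xA = Collection; the next byte 0x01 = Application */
    return 0;
}
Nesting comes purely from COLLECTION / END_COLLECTION pairs, and the valid data values for each Usage Page come from the usage tables document above.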

How to make QT application use native Windows cursors instead of qt specific?

In my application I want to use native Windows cursor icons instead of ones loaded/built from Qt resources.
For example Qt::SplitVCursor/Qt::SplitHCursor, which differ from the standard Windows cursors.
Looking at the sources of qwindowscursor.cpp:
switch (cursorShape) {
case Qt::SplitVCursor:
return createPixmapCursorFromData(systemCursorSize(), standardCursorSize(), 32, vsplit_bits, vsplitm_bits);
case Qt::SplitHCursor:
return createPixmapCursorFromData(systemCursorSize(), standardCursorSize(), 32, hsplit_bits, hsplitm_bits);
case Qt::OpenHandCursor:
return createPixmapCursorFromData(systemCursorSize(), standardCursorSize(), 16, openhand_bits, openhandm_bits);
case Qt::ClosedHandCursor:
return createPixmapCursorFromData(systemCursorSize(), standardCursorSize(), 16, closedhand_bits, closedhandm_bits);
case Qt::DragCopyCursor:
return QCursor(QPixmap(copyDragCursorXpmC), 0, 0);
case Qt::DragMoveCursor:
return QCursor(QPixmap(moveDragCursorXpmC), 0, 0);
case Qt::DragLinkCursor:
return QCursor(QPixmap(linkDragCursorXpmC), 0, 0);
}
and then
// Non-standard Windows cursors are created from bitmaps
static const uchar vsplit_bits[] = {
0x00, 0x00, 0x00, 0x00, 0x00, 0x00, 0x00, 0x00, 0x00, 0x00, 0x00, 0x00,
0x00, 0x00, 0x00, 0x00, 0x00, 0x00, 0x00, 0x00, 0x00, 0x00, 0x00, 0x00,
0x00, 0x80, 0x00, 0x00, 0x00, 0xc0, 0x01, 0x00, 0x00, 0xe0, 0x03, 0x00,
0x00, 0x80, 0x00, 0x00, 0x00, 0x80, 0x00, 0x00, 0x00, 0x80, 0x00, 0x00,
0x00, 0x80, 0x00, 0x00, 0x00, 0x80, 0x00, 0x00, 0x00, 0xff, 0x7f, 0x00,
0x00, 0x00, 0x00, 0x00, 0x00, 0x00, 0x00, 0x00, 0x00, 0xff, 0x7f, 0x00,
0x00, 0x80, 0x00, 0x00, 0x00, 0x80, 0x00, 0x00, 0x00, 0x80, 0x00, 0x00,
0x00, 0x80, 0x00, 0x00, 0x00, 0x80, 0x00, 0x00, 0x00, 0xe0, 0x03, 0x00,
0x00, 0xc0, 0x01, 0x00, 0x00, 0x80, 0x00, 0x00, 0x00, 0x00, 0x00, 0x00,
0x00, 0x00, 0x00, 0x00, 0x00, 0x00, 0x00, 0x00, 0x00, 0x00, 0x00, 0x00,
0x00, 0x00, 0x00, 0x00, 0x00, 0x00, 0x00, 0x00 };
Looks like Qt uses its own bitmaps for the V/HSplit cursors. So the short answer is "impossible". The long answer is that you have to modify the mentioned lines and recompile Qt. I doubt, though, that you can get the cursor you've mentioned from Windows, as it seems non-standard and the bitmap in Qt is there for a reason. Good luck, and commit a patch if you make it. :)
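For what it's worth, if you do patch those lines, the nearest stock Win32 cursors are the resize arrows, fetched with plain LoadCursor; a minimal sketch (Win32 C, not Qt, and only an approximation of the splitter cursors):
#include <windows.h>

/* Fetch the standard Windows resize cursors, the closest built-in equivalents
   to Qt's SplitVCursor / SplitHCursor. Windows ships no dedicated splitter
   cursor, which is presumably why Qt draws its own bitmaps. */
static void loadNativeResizeCursors(HCURSOR *vertical, HCURSOR *horizontal)
{
    *vertical   = LoadCursor(NULL, IDC_SIZENS); /* up/down double arrow */
    *horizontal = LoadCursor(NULL, IDC_SIZEWE); /* left/right double arrow */
}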

Telnet over a socket with GCDAsyncSocket

I'm trying to connect to a Cisco C40 codec via telnet from Objective-C. When using the terminal on my computer I get:
Password:
However, when making a socket connection there are telnet negotiations that need to be handled. I am handling them, but for some reason I cannot get to the "Password:" prompt above.
- (void)socket:(GCDAsyncSocket *)sock didReadData:(NSData *)data withTag:(long)tag
{
NSLog(@"RECEIVE BUFFER %@", data);
//store read bytes to rawData
self.rawData = [[NSMutableData alloc] initWithData:data];
//cast bytes
const uint8_t *bytes = [self.rawData bytes];
//go through rawdata format and save it to networkbuffer
for (int i =0; i < [self.rawData length]; i++)
{
if (![[NSString stringWithFormat:@"%02X", bytes[i]] isEqual:@"0D"])
{
[self.networkBuffer addObject:[NSString stringWithFormat:@"%02X", bytes[i]]];
}
}
//negotiate any telnet protocol stuff (just accept options)
//example:
//FF:FD:18 returns FF:FB:18
while([[self.networkBuffer objectAtIndex:0] isEqualToString:@"FF"] && [[self.networkBuffer objectAtIndex:1] isEqualToString:@"FD"]) {
// NSLog(@"HERE");
NSData * tempData =[data subdataWithRange:NSMakeRange(0, 3)];
NSMutableData * tempMutData = [NSMutableData dataWithData:tempData];
const unsigned char replacement[] = {
0xFC
};
[tempMutData replaceBytesInRange:NSMakeRange(1, 1) withBytes:replacement];
[self sendCommand:tempMutData];
data = [data subdataWithRange:NSMakeRange(3, [data length]-3)];
self.networkBuffer = [NSMutableArray arrayWithArray:[self.networkBuffer subarrayWithRange:NSMakeRange(3, [self.networkBuffer count]-3)]];
// NSLog(@"network buffer after removal: %@", data);
if ([self.networkBuffer count]<3) {
[self.networkBuffer insertObject:@" " atIndex:0];
}
}
//decode from bytes to text
for ( NSString * component in self.networkBuffer)
{
int value = 0;
sscanf([component cStringUsingEncoding:NSASCIIStringEncoding], "%x", &value);
[self.dataString appendFormat:@"%c", (char)value];
NSLog(@"data byte: %c", (char)value);
}
[self telnetResponse:self.dataString];
[self.networkBuffer removeAllObjects];
[self.socket readDataToData:[GCDAsyncSocket CRLFData] withTimeout:-1 tag:0];//CRLFData
}
Here is a breakdown of the telnet negotiation options I'm receiving and sending:
server sending me:
FF,FD,18 (0x18 = 24dec) (Do terminal type)
FF,FD,20 (0x20 = 32dec) (Do terminal speed)
FF,FD,23 (0x23 = 35dec) (Do X Display Location)
FF,FD,27 (0x27 = 39dec) (Do New Environment Option)
My attempt at a response that doesn't work (no prompt for further input):
FF,FC,18 (0x18 = 24dec) (Wont terminal type)
FF,FC,20 (0x20 = 32dec) (Wont terminal speed)
FF,FC,23 (0x23 = 35dec) (Wont X Display Location)
FF,FC,27 (0x27 = 39dec) (Wont New Environment Option)
If you look at the code you will see that I am checking for FF and, if found, responding with similar bytes (replacing FD with FC), in the hope that this refuses the options, but that does not seem to be working.
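For clarity, the refusal rule being attempted here is the standard telnet one (RFC 854): answer IAC DO <opt> with IAC WONT <opt> and IAC WILL <opt> with IAC DONT <opt>. A minimal C sketch of that mapping, with a hypothetical send callback standing in for sendCommand:
#include <stdint.h>
#include <stddef.h>

#define IAC  0xFF
#define WILL 0xFB
#define WONT 0xFC
#define DO   0xFD
#define DONT 0xFE

/* Scan a received buffer and refuse every 3-byte option negotiation. */
static void refuse_all(const uint8_t *buf, size_t len,
                       void (*send)(const uint8_t *reply, size_t n))
{
    for (size_t i = 0; i + 2 < len; i++) {
        if (buf[i] != IAC)
            continue;
        uint8_t reply[3] = { IAC, 0, buf[i + 2] };
        if (buf[i + 1] == DO)        reply[1] = WONT; /* "please do X" -> "I won't X" */
        else if (buf[i + 1] == WILL) reply[1] = DONT; /* "I will X"    -> "don't X"   */
        else continue;                                /* subnegotiations etc. are skipped */
        send(reply, sizeof reply);
        i += 2;                                       /* jump past this command */
    }
}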
Links that helped me:
https://stackoverflow.com/a/2913991/530933
Telnet IAC commands (NSStream socket)
http://www.iprodeveloper.com/forums/aft/52910
UPDATE
I did a Wireshark capture of a command shell talking to the Cisco codec, after which I duplicated those negotiation settings/packets. Now the only problem is that I'm only getting the echo. I get nothing, send a command, then get back a prompt plus my text (example: get nothing, send the username "admin", get back "login: admin"). That is what I mean by only getting the echo. I should get a "login:" prompt, then send "admin", and then it should prompt me for the password.
Here are the negotiation options I'm sending on connect:
//will terminal type
//will negotiate about window size
const unsigned char nineteen[] = {
0xFF, 0xFB, 0x18, 0xFF, 0xFB, 0x1F
};
self.dataToBeSent = [[NSData alloc]initWithBytes:nineteen length:sizeof(nineteen)];
[self sendCommand:self.dataToBeSent];
//wont terminal speed
//wont X display location
//will new environment option
const unsigned char twenty[] = {
0xFF, 0xFC, 0x20, 0xFF, 0xFC, 0x23, 0xFF, 0xFB, 0x27
};
self.dataToBeSent = [[NSData alloc]initWithBytes:twenty length:sizeof(twenty)];
[self sendCommand:self.dataToBeSent];
//Suboption being: negotiate about window size
//end
const unsigned char twentyOne[] = {
//0xFF,0xFC, 0x18
0xFF, 0xFA, 0x1F, 0x00, 0x50, 0x00, 0x19, 0xFF, 0xF0
};
self.dataToBeSent = [[NSData alloc]initWithBytes:twentyOne length:sizeof(twentyOne)];
[self sendCommand:self.dataToBeSent];
//new environment option
//end
const unsigned char twentyThree[] = {
0xFF,0xFA, 0x27, 0x00, 0xFF, 0xF0
};
self.dataToBeSent = [[NSData alloc]initWithBytes:twentyThree length:sizeof(twentyThree)];
[self sendCommand:self.dataToBeSent];
//Terminal Type (ANSI)
//end
const unsigned char twentySeven[] = {
0xFF,0xFA, 0x18, 0x00, 0x41, 0x4E, 0x53, 0x49, 0xFF, 0xF0
};
self.dataToBeSent = [[NSData alloc]initWithBytes:twentySeven length:sizeof(twentySeven)];
[self sendCommand:self.dataToBeSent];
//do suppress go ahead
const unsigned char twentyEight[] = {
0xFF, 0xFD, 0x03
};
self.dataToBeSent = [[NSData alloc]initWithBytes:twentyEight length:sizeof(twentyEight)];
[self sendCommand:self.dataToBeSent];
//will echo
//dont status
//wont remote flow control
const unsigned char twentyFour[] = {
0xFF, 0xFB, 0x01, 0xFF, 0xFE, 0x05, 0xFF,0xFC, 0x21
};
self.dataToBeSent = [[NSData alloc]initWithBytes:twentyFour length:sizeof(twentyFour)];
[self sendCommand:self.dataToBeSent];
//wont echo
const unsigned char twentyFive[] = {
0xFF, 0xFC, 0x01
};
self.dataToBeSent = [[NSData alloc]initWithBytes:twentyFive length:sizeof(twentyFive)];
[self sendCommand:self.dataToBeSent];
//Do echo
const unsigned char twentySix[] = {
0xFF,0xFD, 0x01
};
self.dataToBeSent = [[NSData alloc]initWithBytes:twentySix length:sizeof(twentySix)];
[self sendCommand:self.dataToBeSent];
So a big problem came from the fact that the prompts (login: or password:) do not end the line with CR LF (0D 0A), and I was doing
[self.socket readDataToData:[GCDAsyncSocket CRLFData] withTimeout:-1 tag:0];
so I was never reading the data that held the prompt (another big problem was that Wireshark wasn't working; I fixed that myself too). Once I figured this out I changed the line above to:
[self.socket readDataWithTimeout:-1 tag:0];
This successfully gave me my prompt. Below are the negotiations I'm sending to get to this point and what the original question entailed (same as above in the update):
will terminal type - 0xFF, 0xFB, 0x18
will negotiate about window size - 0xFF, 0xFB, 0x1F
wont terminal speed - 0xFF, 0xFC, 0x20
wont X display location - 0xFF, 0xFC, 0x23
will new environment option - 0xFF, 0xFB, 0x27
Suboptions
negotiate about window size - 0xFF, 0xFA, 0x1F, 0x00, 0x50, 0x00, 0x19
end - 0xFF, 0xF0
new environment option - 0xFF, 0xFA, 0x27, 0x00,
end - 0xFF, 0xF0
Terminal Type (ANSI) - 0xFF,0xFA, 0x18, 0x00, 0x41, 0x4E, 0x53, 0x49,
end - 0xFF, 0xF0
do suppress go ahead - 0xFF, 0xFD, 0x03
will echo - 0xFF, 0xFB, 0x01
dont status - 0xFF, 0xFE, 0x05
wont remote flow control - 0xFF,0xFC, 0x21
wont echo - 0xFF, 0xFC, 0x01
Do echo - 0xFF,0xFD, 0x01
This might also help. It removes the negotiation bytes from the stream, so when you decode the bytes to build the string it doesn't include negotiation bytes.
while([[self.networkBuffer objectAtIndex:0] isEqualToString:@"FF"])
{
if ([[self.networkBuffer objectAtIndex:1] isEqualToString:@"FD"] || [[self.networkBuffer objectAtIndex:1] isEqualToString:@"FB"] || [[self.networkBuffer objectAtIndex:1] isEqualToString:@"FE"] || [[self.networkBuffer objectAtIndex:1] isEqualToString:@"FA"]) {
//most negotiation options are 3 bytes long
int indexToRemoveFromBuffer = 3;
//if FA then they are longer than 3 bytes
if ([[self.networkBuffer objectAtIndex:1] isEqualToString:@"FA"]) {
//look for indicator of END (F0)
indexToRemoveFromBuffer = [self.networkBuffer indexOfObject:@"F0"]+1;
}
//remove these bytes from networkbuffer
self.networkBuffer = [NSMutableArray arrayWithArray:[self.networkBuffer subarrayWithRange:NSMakeRange(indexToRemoveFromBuffer, [self.networkBuffer count]-indexToRemoveFromBuffer)]];
if ([self.networkBuffer count] == 0) {
if (self.isLoggedIn) {
[self.socket readDataToData:[GCDAsyncSocket CRLFData] withTimeout:-1 tag:0];//CRLFData
}else{
[self.socket readDataWithTimeout:-1 tag:0];
}
return;
}
}else{
break;
}
}

Change endianess of bluetooth transfer

I am using Bluetooth 4 (Low Energy) and need to transfer an 8-bit slider value to my slave device. The receiving end should get something like this: 000000A3, but right now I am stuck with A3000000.
I have tried different solutions:
int value = ((int)slider.value >> 24) ;
NSData *dataToWrite = [NSData dataWithBytes:&value length:4]; //data to be sent has to be of type NSData
and
int value[] = {0x00, 0x00, 0x00, slider.value};
and the only working one
char value[4] = {0x00, 0x00, 0x00, slider.value};
but I think this looks a bit ugly. Any other ideas on how to do this?
Core Foundation has functions for handling byte-order conversions: Byte-Order Utilities Reference
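A minimal sketch of the byte packing those utilities produce, assuming the peripheral really wants the 8-bit value in the last of four bytes (i.e. a big-endian 32-bit word); pack_be32 is an illustrative helper, not an API:
#include <stdint.h>

/* Pack an 8-bit slider value into a 4-byte big-endian word (00 00 00 A3).
   CFSwapInt32HostToBig from <CoreFoundation/CFByteOrder.h> performs the same
   swap on a uint32_t; the bytes would then be wrapped with
   [NSData dataWithBytes:buf length:4] before writing the characteristic. */
static void pack_be32(uint8_t buf[4], uint8_t sliderValue)
{
    uint32_t value = sliderValue;     /* 0x000000A3 in host order */
    buf[0] = (uint8_t)(value >> 24);  /* most significant byte first */
    buf[1] = (uint8_t)(value >> 16);
    buf[2] = (uint8_t)(value >> 8);
    buf[3] = (uint8_t)(value);        /* 0xA3 ends up last on the wire */
}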

bitwise operations

- (void)pushDigitalJoin: (NSString*)joinNumber
{
char joinByteArray[] = {
0x05, 0x00, 0x06, 0x00, 0x00, 0x03, 0x27
};
int joinIntNumber = ([joinNumber intValue] - 1);
char *upperByte;
char *lowerByte;
NSString *decimalString = [NSString stringWithFormat:@"%i", 0xff];
*upperByte = joinIntNumber & [decimalString intValue];
*lowerByte = joinIntNumber >> 8;
joinByteArray[7]= *upperByte;
joinByteArray[8] = *lowerByte;
int i;
for (i = 0; i < sizeof(joinByteArray); i++) {
NSLog(@"joinByteArray: position-%i | value-%i", i, joinByteArray[i]);
}
}
Basically I have the byte array above.
I need to change the last 2 bytes based on the "joinNumber"
and then add those 2 bytes to the array.
However, I get compile errors on joinIntNumber >> 8, and the operation above it that uses the AND operator doesn't seem to work (the output always shows 39).
So how do I correctly use these bitwise operators and get my 2 bytes added to the array?
CHANGES MADE TO REFLECT COMMENTS AND SHOW OUTPUT (ANSWER):
- (void)pushDigitalJoin: (NSString*)joinNumber
{
char joinByteArray[] = {
0x05, 0x00, 0x06, 0x00, 0x00, 0x03, 0x27, 0x00, 0x00
};
int joinIntNumber = ([joinNumber intValue] - 1);
char upperByte = nil;
char lowerByte = nil;
// NSString *decimalString = [NSString stringWithFormat:@"%i", 0xff];
upperByte = joinIntNumber & 0xff;//[decimalString intValue];
lowerByte = joinIntNumber >> 8;
joinByteArray[7]= upperByte;
joinByteArray[8] = lowerByte;
int i;
for (i = 0; i < sizeof(joinByteArray); i++) {
NSLog(@"joinByteArray: position-%i | value-%x", i, joinByteArray[i]);
}
}
OUTPUT (joinnumber = 5):
2011-08-26 11:06:07.554 Cameleon[2213:40b] joinByteArray: position-0 | value-5
2011-08-26 11:06:07.555 Cameleon[2213:40b] joinByteArray: position-1 | value-0
2011-08-26 11:06:07.557 Cameleon[2213:40b] joinByteArray: position-2 | value-6
2011-08-26 11:06:07.558 Cameleon[2213:40b] joinByteArray: position-3 | value-0
2011-08-26 11:06:07.559 Cameleon[2213:40b] joinByteArray: position-4 | value-0
2011-08-26 11:06:07.561 Cameleon[2213:40b] joinByteArray: position-5 | value-3
2011-08-26 11:06:07.562 Cameleon[2213:40b] joinByteArray: position-6 | value-27
2011-08-26 11:06:07.563 Cameleon[2213:40b] joinByteArray: position-7 | value-4
2011-08-26 11:06:07.564 Cameleon[2213:40b] joinByteArray: position-8 | value-0
so how do i correctly use these bitwise operators and get my 2bytes added to the array?
You don't. The array is declared on the stack and has fixed size (7 bytes). If you try to add values onto the end, you'll wind up stomping on other values on the stack and probably corrupting the stack.
Unrelated, but also problematic is this:
NSString *decimalString = [NSString stringWithFormat:#"%i", 0xff];
*upperByte = joinIntNumber & [decimalString intValue];
That really doesn't make any sense... why are you creating a string from an int only to take its intValue? It'd be better to write:
*upperByte = joinIntNumber & 0xff;
And another thing... you're declaring upperByte and lowerByte as character pointers (char*), but you don't set them to point at anything in particular. So when you try to set the characters they point to, as in the above line, you're going to end up writing the values to random places.
If you want a C array that you can modify, you should declare one that's large enough to hold any values that you're going to add, in this case:
char joinByteArray[] = {
0x05, 0x00, 0x06, 0x00, 0x00, 0x03, 0x27, 0x00, 0x00
};
You could also create it on the heap with malloc() and friends, but again you'd need to make it large enough at the outset to hold your extra values, or else grow the array as necessary with realloc() before adding your new values. In any case, don't write past the end of your array.
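A minimal sketch of that heap-based alternative with realloc (the helper name and the error handling are illustrative only):
#include <stdlib.h>

/* Grow a heap-allocated byte buffer by two bytes and append upper/lower. */
static unsigned char *append_two_bytes(unsigned char *buf, size_t *len,
                                       unsigned char upper, unsigned char lower)
{
    unsigned char *grown = realloc(buf, *len + 2); /* make room for the new bytes */
    if (grown == NULL)
        return buf;                                /* keep the old buffer on failure */
    grown[*len]     = upper;
    grown[*len + 1] = lower;
    *len += 2;
    return grown;
}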
First problem: joinByteArray is only 7 elements long, so you can't assign to indices 7 and 8. I'm not sure what you're trying to do here. Is this array supposed to grow over time as you receive more numbers? If so, you're better off using an NSMutableArray or NSMutableData and storing that in an instance variable:
char initialBytes[] = {
0x05, 0x00, 0x06, 0x00, 0x00, 0x03, 0x27
};
NSMutableData *joinBytes = [[NSMutableData alloc] initWithBytes:initialBytes length:7];
Second problem: you declare upperByte and lowerByte as pointers, but they should just be stack-allocated variables. Or better yet, use an array for this to make appending the data easier later on:
char newBytes[2];
newBytes[0] = joinIntNumber & 0xff;
newBytes[1] = joinIntNumber >> 8;
Once you've got that, you can append to the data:
[joinBytes appendBytes:newBytes length:2];
I'm not sure what it is you're trying to do, but you may want to consider endianness (see, for example, NSSwapHostIntToBig).
This may be closer to what you are looking for.
- (void)pushDigitalJoin: (NSString*)joinNumber
{
//You are appending 2 more values so you need to specify
//that joinByteArray is 9
unsigned char joinByteArray[9] = {
0x05, 0x00, 0x06, 0x00, 0x00, 0x03, 0x27
};
int joinIntNumber = ([joinNumber intValue] - 1);
//Upper and lower byte do not need to be char*
//unless you want to needlessly malloc memory for them
char upperByte;
char lowerByte;
//0xff is an int (unsigned) so this is useless
//NSString *decimalString = [NSString stringWithFormat:@"%i", 0xff];
//To get upper byte you need to know the size of int
//use int32_t to specify >> 24 so you don't need to use sizeof
upperByte = joinIntNumber >> ((sizeof(joinIntNumber) - 1) * 8);
lowerByte = joinIntNumber & 0xFF;
joinByteArray[7] = upperByte;
joinByteArray[8] = lowerByte;
int i;
for (i = 0; i < sizeof(joinByteArray); i++) {
//Should log hex since you are manipulating bytes
NSLog(@"joinByteArray: position-%X | value-%X", i, joinByteArray[i]);
}
}