I am trying to turn on the LED (LD2 in the schematic) on the Nucleo board using only registers, in STM32CubeIDE.
The user manual states the following addresses for the clock, mode and data registers:
Led pin: PA5
Address of the Clock control register: RCC_AHBENR
[base address] + [offset] ===> [Result]
0x4002 1000 + 0x14 ===> 0x40021014
Address of the GPIOA mode register
0x4800 0000 + 0x00 ===> 0x48000000
Address of the GPIOA output data register
0x4800 0000 + 0x14 ===> 0x48000014
I am using the following code to set/clear the registers in the board:
#include <stdint.h>
int main(void)
{
uint32_t *pClkCtrlReg = (uint32_t*)0x40021014;
uint32_t *pPortAModeReg = (uint32_t*)0x48000000;
uint32_t *pPortAOutReg = (uint32_t*)0x48000014;
//1. enable the clock for GPIOA peripheral in the AHBENR
*pClkCtrlReg |= 0x00020000;
//2. configure the mode of the IO pin as output
//a. clear the 24th and 25th bit positions
*pPortAModeReg &= 0xFCFFFFFF;
//b set 24th bit position as 1
*pPortAModeReg |= 0x01000000;
//3. SET 12th bit of the output data register to make I/O pin-12 as HIGH
*pPortAOutReg |= 0x20;
while(1);
}
Using the register viewer in the IDE, I can see that PA5 is set as an output, but physically my LED is not turning on.
I do not know what I am doing wrong. I suspect that pin PA5 is wrong, but I tried PA12 too and it does not work either. Can someone please help me out here?
I went through your code with the reference manual in hand (RM0316, the STM32F303 reference manual).
You activate the clock to GPIO port A correctly (also, the GPIOA registers would read all 0x00 if it hadn't been activated).
Then you set the GPIO mode as follows, quoting you:
//2. configure the mode of the IO pin as output
//a. clear the 24th and 25th bit positions
*pPortAModeReg &= 0xFCFFFFFF;
//b set 24th bit position as 1
*pPortAModeReg |= 0x01000000;
You work with bits 24 and 25, which are the MODER bits for pin 12.
So you set the mode for pin A12, not A5. For GPIOA Pin 5 you need to manipulate bits 10 and 11.
//clear pin5 bits
*pPortAModeReg &= ~(0x03 << 10); //take 0b11, shift it to position 10, flip all bits, AND with original state of register
Then follow that by setting those bits to "General purpose output mode" (01), which you currently do on the wrong bits:
//b set 10th bit position as 1
*pPortAModeReg |= (0x01 << 10);
I checked all the registers GPIO has; there shouldn't be anything else you need to set. If things still don't work, please post the contents of all GPIOA registers.
EDIT: also, try using the bit set/reset register. Note that it is write-only, so no "|=" for it, only "=". Writing 0 to it doesn't do anything, so you need to write just (0x01<<5) straight to the GPIOA BSRR register.
GPIOA->BSRR = (1U<<5); //or however you want to address that register
A few comments on your code:
Get in the habit of always using the volatile keyword when accessing peripheral registers. Otherwise the compiler may end up optimizing out your code. I usually make a typedef for something like a reg32_t type in order not to forget.
As the OP pointed out, use code of the form (1 << b) to make it obvious which bit(s) you are referencing. Your compiler will optimize this, so the end result in the compiled code will be the same. You could even consider making this into a couple of macros or inline functions (see the sketch after these notes).
When you make several changes to a peripheral register, it's a good idea to copy it; make the changes to the copy; then store the copy back to the register. This way, you don't temporarily put the register in an unwanted state.
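For the second point, a minimal sketch of such helpers (the names are illustrative, not from any library):

#define BIT(n)                 (1UL << (n))
#define REG_SET_BITS(reg, m)   ((reg) |=  (m))   // read-modify-write set
#define REG_CLR_BITS(reg, m)   ((reg) &= ~(m))   // read-modify-write clear
// e.g. REG_SET_BITS(*pClkCtrlReg, BIT(17));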
Here's your code with these changes:
#include <stdint.h>
typedef volatile uint32_t reg32_t;
int main(void)
{
reg32_t *pClkCtrlReg = (reg32_t*)0x40021014;
reg32_t *pPortAModeReg = (reg32_t*)0x48000000;
reg32_t *pPortABsrReg = (reg32_t*)0x48000018;
//1. enable the clock for GPIOA peripheral in the AHBENR
*pClkCtrlReg |= (1<<17);
//2. configure the mode of the IO pin as output
uint32_t modereg = *pPortAModeReg & ~(3<<(2*5));
modereg |= (1<<(2*5));
*pPortAModeReg = modereg;
//3. SET 5th bit of the bit set/reset register to make I/O pin 5 HIGH
*pPortABsrReg = (1<<5);
// If you want to turn the LED back off at a later time, do this:
// *pPortABsrReg = (1<<(5+16));
for (;;) ;
}
Good luck with your blinky!
//3. SET 12th bit of the output data register to make I/O pin-12 as HIGH
*pPortAOutReg |= 0x20;
For pin 12, the value should be *pPortAOutReg |= 0x1000;
Or you could do the following: *pPortAOutReg |= (1 << 12); With this you don't need to work out the exact value.
I have LPCXpresso OM13058 board with LPC11U68 MCU.
Board schematic: https://www.nxp.com/downloads/en/schematics/LPC11U68_Xpresso_v2_Schematic_RevC_1.pdf
I am using this board standalone and compile my program with the IAR IDE. For MCU programming I use Flash Magic.
Configuration files I used, including SystemInit:
https://github.com/NordicPlayground/mbed/tree/master/libraries/mbed/targets/cmsis/TARGET_NXP/TARGET_LPC11U6X/system_LPC11U6x.h
https://github.com/NordicPlayground/mbed/tree/master/libraries/mbed/targets/cmsis/TARGET_NXP/TARGET_LPC11U6X/system_LPC11U6x.c
https://github.com/NordicPlayground/mbed/tree/master/libraries/mbed/targets/cmsis/TARGET_NXP/TARGET_LPC11U6X/LPC11U6x.h
My program runs at 48MHz. I am trying to generate 200kHz PWM for a GPIO pin. I init my CT32B0 timer:
LPC_SYSCON->SYSAHBCLKCTRL |= (1<<9); // enable clock to CT32B0
LPC_CT32B0->CTCR = 0x0;              // timer mode
LPC_CT32B0->PR = 48-1;               // prescale 48MHz down to 1MHz
LPC_CT32B0->TCR = 0x02;              // hold the counter in reset
Then I have a timer counter based delay function:
void delay(unsigned int ms)
{
LPC_CT32B0->TCR = 0x02;         // reset the counter
LPC_CT32B0->TCR = 0x01;         // start the counter
while(LPC_CT32B0->TC < ms);     // busy-wait; one TC tick = 1us with PR = 48-1
LPC_CT32B0->TCR = 0x00;         // stop the counter
}
Simple GPIO toggling to get 1kHz (works well, oscilloscope shows 1kHz):
LPC_GPIO_PORT->NOT[0] |= (1<<17);
delay(500);
Then I changed a value from 500 to 50, but instead of getting 10kHz, my oscilloscope shows 9.88kHz.
LPC_GPIO_PORT->NOT[0] |= (1<<17);
delay(50);
By changing the value to 5, the result gets much more inaccurate: I get 87.7kHz instead of 100kHz:
LPC_GPIO_PORT->NOT[0] |= (1<<17);
delay(5);
As you can see, the frequency is not so high, but the error is not acceptable. I have implemented a USART running at a 115200 baud rate without any issues, so I think the main clock really is running at 48MHz (SystemCoreClock also returns 48000000).
I have tried different ports and pins, but the result is the same. Also, I have tried to run the same code on another board with a 72MHz LPC1343 MCU, and GPIO switching at 200kHz was perfect and without any errors.
Could you please help me find the problem? Is any additional configuration needed, or is the issue in hardware?
How could I create an interrupt for a blue pill from scratch?
I do not want to use any sort of special library. Also, I use the Keil IDE; thus, by "building from scratch" I mean not using any extra library, rather than assembling the project without the help of an IDE.
I tried to find resources, but with no success. Could anybody help me, or at least provide some information/bibliography? I would be grateful.
Moreover, by "special library" I mean any library other than the stm32f1xx.h header. I would like to fire an interrupt when one of the pins' input value toggles. On AVR MCUs this was very simple, since only a few register values had to be changed. Unfortunately, I don't know how interrupts work on an ARM MCU, nor which values I should write into which registers.
Also, a better understanding of the ARM MCU's interrupt mechanism would make me more prepared for tackling debouncing issues.
I am not going to take you entirely literally when you mandate "no libraries", because no one who wants to get work done and knows what they are doing on Cortex-M would do that. I will assume at least that you will use the CMSIS - a common API provided for all ARM Cortex-M devices, which makes your code more, not less, portable.
All the CMSIS code is provided as source rather than as a static library, so there is nothing hidden, and if you choose not to use it, you can see how it works and replicate that functionality (needlessly) if you wish.
In the CMSIS, default implementations are provided as "weak" symbols that can be overridden by user code simply by defining a function with the pre-defined name. The default implementation is generally an infinite loop, so that unhandled interrupts are "trapped" and you can intervene with your debugger or wait for a watchdog reset, for example.
The Cortex-M core interrupt handlers and exception handlers have common names across all Cortex-M parts:
Reset_Handler
NMI_Handler
HardFault_Handler
MemManage_Handler
BusFault_Handler
UsageFault_Handler
SVC_Handler
DebugMon_Handler
PendSV_Handler
SysTick_Handler
Peripheral interrupt handlers have names defined by the vendor, but the naming convention is <interrupt_source>_IRQHandler. For example, on STM32F1xx, EXTI0_IRQHandler is the shared external interrupt assigned to pin 0 of the GPIO ports.
To implement a CMSIS interrupt handler, all you need do is:
Implement the interrupt handler function using the CMSIS handler function name
Enable the interrupt in the NVIC (interrupt controller).
There are other things you might do, such as assigning the interrupt priority scheme (the split between preempt priorities and subpriorities), but let's keep it simple for the time being.
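For reference, that split is configured with a single CMSIS call. As a sketch for an STM32F1 (which implements 4 priority bits), a 2-bit preempt / 2-bit subpriority split would be:

NVIC_SetPriorityGrouping( 5 ) ; // PRIGROUP 5: 2-bit preempt, 2-bit subpriority

The PRIGROUP encoding is described in the CMSIS documentation; none of this is needed for the examples below.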
Because it is common to all Cortex-M parts, and useful in almost any non-trivial application, the SYSTICK interrupt makes a good starting illustration.
#include "stm32f1xx.h"
volatile uint32_t msTicks = 0 ;
void SysTick_Handler(void)
{
msTicks++ ;
}
int main (void)
{
if( SysTick_Config( SystemCoreClock / 1000 ) != 0 ) // 1ms tick
{
// Error Handling
}
...
}
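As a usage sketch built on that 1ms tick (delay_ms() is my own illustrative helper, not a CMSIS function):

void delay_ms( uint32_t ms )
{
    uint32_t start = msTicks ;
    while( (msTicks - start) < ms )
    {
        // busy-wait; the unsigned subtraction is safe across counter wrap
    }
}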
SysTick_Config() is another CMSIS function. In core_cm3.h it looks like this:
__STATIC_INLINE uint32_t SysTick_Config(uint32_t ticks)
{
if ((ticks - 1UL) > SysTick_LOAD_RELOAD_Msk)
{
return (1UL); /* Reload value impossible */
}
SysTick->LOAD = (uint32_t)(ticks - 1UL); /* set reload register */
NVIC_SetPriority (SysTick_IRQn, (1UL << __NVIC_PRIO_BITS) - 1UL); /* set Priority for Systick Interrupt */
SysTick->VAL = 0UL; /* Load the SysTick Counter Value */
SysTick->CTRL = SysTick_CTRL_CLKSOURCE_Msk |
SysTick_CTRL_TICKINT_Msk |
SysTick_CTRL_ENABLE_Msk; /* Enable SysTick IRQ and SysTick Timer */
return (0UL); /* Function successful */
}
So let's say you have an external interrupt source on the falling edge of GPIOA pin 0; then you would use the STM32 EXTI0 interrupt. The minimal handler would look like:
void EXTI0_IRQHandler(void)
{
EXTI->PR = EXTI_PR_PR0 ; // clear pending interrupt (write 1 to clear; use "=", not "|=")
// Handle interrupt...
}
Setting up the EXTI requires enabling the GPIO and the EXTI itself as well as the NVIC:
RCC->APB2ENR |= RCC_APB2ENR_IOPAEN ; // enable clock for GPIOA
RCC->APB2ENR |= RCC_APB2ENR_AFIOEN ; // enable clock for Alternate Function
AFIO->EXTICR[0] &= ~AFIO_EXTICR1_EXTI0 ; // select port A (field = 0) for EXTI0
EXTI->IMR = EXTI_IMR_MR0 ; // unmask interrupt
EXTI->EMR = EXTI_EMR_MR0 ; // unmask event
EXTI->FTSR = EXTI_FTSR_TR0 ; // set falling edge
NVIC_EnableIRQ( EXTI0_IRQn ) ; // enable interrupt EXTI0 in the NVIC
The peripheral registers and structures are defined in stm32f1xx.h, and the "weak" default peripheral handlers to be overridden are in the startup file for your specific part (for example, startup_stm32f10x_md.s for the medium-density STM32F103C8 on a blue pill). Any handlers you override must match these symbol names exactly.
All the peripheral interrupt sources and how to configure them are described in the ST reference manual RM0008.
All the Cortex-M core specific stuff - SysTick, NVIC, exception handlers etc. - is provided by ARM at https://developer.arm.com/ip-products/processors/cortex-m/cortex-m3
CMSIS for CM3 is documented at https://developer.arm.com/documentation/dui0552/a/
The project I'm desperately trying to finish is rather simple in terms of microcontroller programming. The dsPIC33FJ128MC802 I use basically has to do 3 things:
receive data via UART and convert it into PWM signals for servomotors
regularly wake up its ADC to check a battery level
switch its baud rate between two working values when fed an external interrupt.
It's the last point that causes issues. In my circuit I have a switch: one position corresponds to one baud rate value, the other to a second value. I didn't find any documentation on how to trigger an interrupt on any voltage level change, so I use rising-edge and falling-edge triggering in combination with checking the current state of the pin I chose for my interrupt.
Furthermore, I have two other interrupt functions in my code: one for UART reception, and a timer interrupt to wake up the ADC periodically. Interrupt priorities are the following: UART -> 1, Timer -> 2, External Interrupt -> 6 (any number above 2, really).
Here is my interrupt code:
void __attribute__((interrupt, auto_psv)) _INT0Interrupt( void )
{
IEC0bits.INT0IE = 0; // disable INT0 interrupt
if(IFS0bits.INT0IF){
if (PORTBbits.RB7 == 1){ //if pin at high logic level
INTCON2bits.INT0EP = 1; //falling edge trigger
LATAbits.LATA0 = 1;
U1BRG = 23;
}
else{ //if pin at low logic level
INTCON2bits.INT0EP = 0; //rising edge trigger
LATAbits.LATA0 = 0;
U1BRG = 1;
}
}
IFS0bits.INT0IF = 0; //clear INT0 flag
IEC0bits.INT0IE = 1; // enable INT0 interrupt
}
Now the weird behaviour: when the pin is pulled low, the baud rate is set to the right value and the UART communication works perfectly. When it is pulled high, the previous communication doesn't work anymore (proof that the baud rate has changed), but communication at the new baud rate doesn't work either. The LED status change works fine in both cases.
It should be noted that all the different aspects of this project have been tested multiple times; each section works well, and only adding this external interrupt made the whole thing crash. The microcontroller works fine, my baud rate values are good, my circuit has been tested and has no issues... I just think I don't know how to properly use an external interrupt.
I'm just trying to learn to use an external ADC and DAC (PT8211) with my PIC32MX534F06H.
So far, my code just samples a signal with my ADC every time a timer interrupt is triggered, then sends the same signal out to the DAC.
The interrupt and ADC part work fine and have been tested independently, but the voltages that my DAC outputs don't make much sense to me: they stay at 2.5V (it's powered from 0 to 5V).
I've tried to feed the DAC various values ranging from 0 to 65535 (it's a 16-bit DAC, so I guess that should be the expected input range, right?); the voltage stays at 2.5V.
I've tried changing the SPI configuration, using different SPIs (3 and 4) and DACs (I have one soldered to my PCB on SPI3, and one on a breadboard linked to SPI4, in case the one soldered on my board was defective).
I made sure that the chip select line works as expected.
I couldn't see the data and clock being transmitted, since I don't have a scope yet.
I'm a bit out of ideas now.
Chip selection and SPI configuration settings
signed short adc_value;
signed short DAC_output_value;
int Empty_SPI3_buffer;
#define Chip_Select_DAC_Set() {LATDSET=_LATE_LATE0_MASK;}
#define Chip_Select_DAC_Clr() {LATDCLR=_LATE_LATE0_MASK;}
#define SPI4_CONF 0b1000010100100000 // SPI on, 16-bit master,CKE=1,CKP=0
#define SPI4_BAUD 100 // clock divider
DAC output function
//output to external DAC
void DAC_Output(signed int valueDAC) {
INTDisableInterrupts();
Chip_Select_DAC_Clr();
while(!SPI4STATbits.SPITBE); // wait for TX buffer to empty
SPI4BUF=valueDAC; // write byte to TX buffer
while(!SPI4STATbits.SPIRBF); // wait for RX buffer to fill
Empty_SPI3_buffer=SPI4BUF; // read RX buffer
Chip_Select_DAC_Set();
INTEnableInterrupts();
}
ISR sampling the data, triggered by Timer 1 (this works fine). ADC_Input() stores the sample in the global variable adc_value (12 bits, signed):
//ISR to sample data
void __ISR( _TIMER_1_VECTOR, IPL7SRS) Test_data_sampling_in( void)
{
IFS0bits.T1IF = 0;
ADC_Input();
//rescale the signed 12 bit audio values to unsigned 16 bits wide values
DAC_output_value = adc_value + 2048; //first unsign the signed 12 bit values (0 to 4095, center 2048)
DAC_output_value = DAC_output_value *16; // the scale between 12 and 16 bits is actually 16=65536/4096
DAC_Output(DAC_output_value);
}
main function with SPI, IO, Timer configuration
void main() {
SPI4CON = SPI4_CONF;
SPI4BRG = SPI4_BAUD;
TRISE = 0b00100000;
TRISD = 0b000000110100;
TRISG = 0b0010000000;
LATD = 0x0;
SYSTEMConfigPerformance(80000000L); //
INTCONSET = _INTCON_MVEC_MASK; /* Set the interrupt controller for multi-vector mode */
//
T1CONbits.TON = 0; /* turn off Timer 1 */
T1CONbits.TCKPS = 0b11; /* pre-scale = 1:256 (T1CLKIN = 80MHz (?) ) */
PR1 = 1816; /* T1 period ~ ? */
TMR1 = 0; /* clear Timer 1 counter */
//
IPC1bits.T1IP = 7; /* Set Timer 1 interrupt priority to 7 */
IFS0bits.T1IF = 0; /* Reset the Timer 1 interrupt flag */
IEC0bits.T1IE = 1; /* Enable interrupts from Timer 1 */
T1CONbits.TON = 1; /* Enable Timer 1 peripheral */
INTEnableInterrupts();
while (1){
}
}
I would expect the voltage at the output of my DAC to mimic the one I put at the input of my ADC; instead, the DAC output is always constant, no matter what I input to the ADC.
What am I missing?
Also, when turning the SPIs on, should I still manually manage the IO configuration of the SDI/SDO/SCK pins using TRIS, or is that automatically taken care of?
First of all, I agree that the documentation I first found for the PT8211 is rather poor. I found extended documentation here. Your DAC (PT8211) is actually an I2S device, not SPI. WS is not chip select, it is word select (left/right channel). In I2S, setting WS to 0 normally means the left channel; however, it looks like in the extended datasheet I found that WS 0 is actually the right channel (go figure).
The PIC you've chosen doesn't seem to have any I2S hardware, so you might have to bit-bash it. There is a lot of info on I2S though; see the I2S bus specification.
There are some slight differences between SPI and I2S. Notice that the first bit after WS transitions from high to low is the LSB of the right channel, while the bit after WS transitions from low to high is not the LSB of the left channel. Note that the output should be between 0.4V and 2.4V (I2S standard), not between 0V and 5V. (The maximum is about 2.5V, which is what you've been seeing.)
(I2S timing diagram)
Basically, I'd first try the proper protocol, with a bit-bashing algorithm that continuously flip-flops between the left and right channels.
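To make that concrete, here is a rough, untested sketch of the idea in C. The pin macros (BCK_LO/BCK_HI, DATA_OUT, WS_LO/WS_HI) are placeholders you would map onto real LATxSET/LATxCLR bits, and the exact edge timing and bit justification must be checked against the PT8211 datasheet:

// Shift one 16-bit word out MSB-first, toggling the bit clock.
static void i2s_write_word(unsigned short word)
{
    int i;
    for (i = 15; i >= 0; i--)
    {
        BCK_LO();                    // data changes while BCK is low
        DATA_OUT((word >> i) & 1u);  // present the next data bit
        BCK_HI();                    // DAC samples the bit on the rising edge
    }
}

// One stereo frame: WS low = right channel, WS high = left
// (per the extended-datasheet note above).
static void i2s_write_stereo(unsigned short left, unsigned short right)
{
    WS_LO();
    i2s_write_word(right);
    WS_HI();
    i2s_write_word(left);
}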
First of all, thanks a lot for your comment. It helps a lot to know that I'm not looking at an SPI transmission, and it explains why it's not working.
A few reflections on it:
I googled bit bashing (banging?) and it seems to be CPU intensive, which I would definitely try to avoid.
I have seen a (successful) project (in MikroC) where someone transmits data from this exact same PIC to the same DAC using SPI, with apparently no problems whatsoever. So I guess it SHOULD work, somehow?
Maybe he's transforming the data so that it works? I'm not sure what happens with the F15 bit toggle; I was thinking it was done to manage the LSB shift problem. Here is the piece of (working) MikroC code I'm talking about:
valueDAC = valueDAC + 32768;
valueDAC.F15 =~ valueDAC.F15;
Chip_Select_DAC = 0;
SPI3_Write(valueDAC);
Chip_Select_DAC = 1;
From my understanding, the two biggest differences between SPI and I2S are that SPI sends "bursts" of data whereas I2S sends data continuously, and that the data sent after the word select change is the LSB of the last word.
So I was thinking: my SPI is triggered by a timer, which is always the same, so even if the data is not sent continuously, it will just make the sound wave a bit more 'aliased'; and if it's triggered regularly enough (say at 44kHz), it should not be SO different from sending I2S data at the same frequency, right?
If that is so, and I understand correctly, the "only" problem left is to manage the LSB-next-word-MSB placement problem, but I thought that the LSB is virtually negligible over 16-bit values, so if I could just bit-shift my value to the right and then fix the LSB to 0 or 1, the error would be small and the format would be right.
Does it sound like I have a valid 'MacGyver-I2S-from-my-SPI', or am I forgetting something important?
I have tried to implement it, so far without success, but I need to check my SPI configuration since I'm not sure it's configured correctly.
Here is the code so far
SPI config
#define Chip_Select_DAC_Set() {LATDSET=_LATE_LATE0_MASK;}
#define Chip_Select_DAC_Clr() {LATDCLR=_LATE_LATE0_MASK;}
#define SPI4_CONF 0b1000010100100000
#define SPI4_BAUD 20
DAC output function
//output audio to external DAC
void DAC_Output(signed int valueDAC) {
INTDisableInterrupts();
valueDAC = valueDAC >> 1; // put the MSB of valueDAC 1 bit to the right (because the MSB of what is transmitted will be seen by the DAC as the LSB of the last value, after a word select change)
//Left channel
Chip_Select_DAC_Set(); // Select left channel
while(!SPI4STATbits.SPITBE); // wait for TX buffer to empty
SPI4BUF=valueDAC; // write 16-bits word to TX buffer
while(!SPI4STATbits.SPIRBF); // wait for RX buffer to fill
Empty_SPI3_buffer=SPI4BUF; // read RX buffer (don't know why we need to do this here, but we do)
//SPI3_Write(valueDAC); MikroC option
// Right channel
Chip_Select_DAC_Clr();
while(!SPI4STATbits.SPITBE); // wait for TX buffer to empty
SPI4BUF=valueDAC; // write 16-bits word to TX buffer
while(!SPI4STATbits.SPIRBF); // wait for RX buffer to fill
Empty_SPI3_buffer=SPI4BUF;
INTEnableInterrupts();
}
The data I send here is signed, 16-bit range; I think you said that's alright with this DAC, right?
Or maybe I could use framed SPI? The clock seems to be continuous in that mode, but I would still have the LSB/MSB shifting problem to solve.
I'm a bit lost here, so any help would be cool.
I've been pulling my hair out lately trying to get an ATmega162 on my STK200 to talk to my computer over RS232. I checked and made sure that the STK200 contains a MAX202CPE chip.
I've configured the chip to use its internal 8MHz clock and divided it by 8.
I've tried to copy the code out of the data sheet (and made changes where the compiler complained), but to no avail.
My code is below, could someone please help me fix the problems that I'm having?
I've confirmed that my serial port works on other devices and is not faulty.
Thanks!
#include <avr/io.h>
#include <avr/iom162.h>
#define BAUDRATE 4800
void USART_Init(unsigned int baud)
{
UBRR0H = (unsigned char)(baud >> 8);
UBRR0L = (unsigned char)baud;
UCSR0B = (1 << RXEN0) | (1 << TXEN0);
UCSR0C = (1 << URSEL0) | (1 << USBS0) | (3 << UCSZ00);
}
void USART_Transmit(unsigned char data)
{
while(!(UCSR0A & (1 << UDRE0)));
UDR0 = data;
}
unsigned char USART_Receive()
{
while(!(UCSR0A & (1 << RXC0)));
return UDR0;
}
int main()
{
USART_Init(BAUDRATE);
unsigned char data;
// all are 1, all as output
DDRB = 0xFF;
while(1)
{
data = USART_Receive();
PORTB = data;
USART_Transmit(data);
}
}
I have commented on Greg's answer, but would like to add one more thing. For this sort of problem, the gold-standard debugging method is to first understand asynchronous serial communications, then get an oscilloscope and see what's happening on the line. If characters are being exchanged and it's just a baud rate problem, this is particularly helpful, because you can calculate the baud rate you are seeing and adjust the divisor accordingly.
Here is a super quick primer, no doubt you can find something much more comprehensive on Wikipedia or elsewhere.
Let's assume 8 bits, no parity, 1 stop bit (the most common setup). Then if the character being transmitted is, say, 0x3f (= ASCII '?'), the line looks like this:
...--+   +---+---+---+---+---+---+       +---+--...
     | S | 1   1   1   1   1   1 | 0   0 | E
     +---+                       +---+---+
The high (1) level is +5V at the chip and -12V after conversion to RS232 levels.
The low (0) level is 0V at the chip and +12V after conversion to RS232 levels.
S is the start bit.
Then we have 8 data bits, least significant first, so here 00111111 = 0x3f = '?'.
E is the stop (e for end) bit.
Time is advancing from left to right, just like an oscilloscope display. If the baud rate is 4800, then each bit spans (1/4800) seconds = 0.21 milliseconds (approx).
The receiver works by sampling the line and looking for a falling edge (a quiescent line is simply logical '1' all the time). The receiver knows the baud rate and the number of start bits (1), so it measures one half bit time from the falling edge to find the middle of the start bit, then samples the line 8 bit times in succession after that to collect the data bits. The receiver then waits one more bit time (until half way through the stop bit) and starts looking for another start bit (i.e. falling edge). Meanwhile the character read is made available to the rest of the system. The transmitter guarantees that the next falling edge won't begin until the stop bit is complete. The transmitter can be programmed to always wait longer (with additional stop bits), but that is a legacy issue; extra stop bits were only required with very slow hardware and/or software setups.
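As a sketch of that sampling scheme in C (illustrative only: read_line() returns the current line level and wait_bit_times() busy-waits for a multiple of the bit period, both hypothetical helpers):

unsigned char soft_uart_receive(void)
{
    unsigned char data = 0;
    int i;

    while (read_line() == 1);  /* quiescent line is '1'; wait for the falling edge */
    wait_bit_times(1.5);       /* half a start bit plus one bit: middle of data bit 0 */
    for (i = 0; i < 8; i++)
    {
        data >>= 1;
        if (read_line())
            data |= 0x80;      /* least significant bit arrives first */
        wait_bit_times(1.0);   /* advance to the middle of the next bit */
    }
    /* now halfway through the stop bit; resume hunting for a falling edge */
    return data;
}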
I don't have reference material handy, but the baud rate register UBRR usually contains a divisor value, rather than the desired baud rate itself. A quick google search indicates that the correct divisor value for 4800 baud may be 239. So try:
divisor = 239;
UBRR0H = (unsigned char)(divisor >> 8);
UBRR0L = (unsigned char)divisor;
If this doesn't work, check with the reference docs for your particular chip for the correct divisor calculation formula.
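For reference, the usual formula in the AVR datasheets for asynchronous normal mode is UBRR = F_CPU / (16 * baud) - 1. A compile-time version, assuming the 1MHz effective clock mentioned elsewhere in this thread:

#define F_CPU    1000000UL  /* 8MHz internal RC divided by 8 */
#define BAUD     4800UL
/* rounded to the nearest integer; evaluates to 12 for these values */
#define UBRR_VAL ((F_CPU + 8UL * BAUD) / (16UL * BAUD) - 1UL)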
For debugging UART communication, there are two useful things to do:
1) Do a loop-back at the connector and make sure you can read back what you write. If you send a character and get it back exactly, you know that the hardware is wired correctly, and that at least the basic set of UART register configuration is correct.
2) Repeatedly send the character 0x55 ("U") - the binary bit pattern 01010101 will allow you to quickly see the bit width on the oscilloscope, which will let you verify that the speed setting is correct.
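With the USART_Transmit() from the question, that scope test is just:

// Stream 'U' (01010101) forever so every bit period shows as an edge.
while (1)
{
    USART_Transmit(0x55);
}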
After reading the data sheet a little more thoroughly, I found I was setting the baud rate incorrectly. The ATmega162 data sheet has a chart of clock frequencies plotted against baud rates, with the corresponding error.
For a 4800 baud rate and a 1MHz clock frequency, the error was 0.2%, which was acceptable to me. The trick was passing 12 to the USART_Init() function instead of 4800, since UBRR holds the divisor: UBRR = 1000000 / (16 * 4800) - 1 ≈ 12.
Hope this helps someone else out!