Writing to serial port COM1 in a kernel from scratch - C

I'm writing an i386 ELF kernel from scratch. I need to be able to write to the serial port COM1.
I have written two functions: serial_init() is called every time I call printk(char* str), and printk() calls serial_putc(char c) for each character of the string.
#define SERIAL_COM1 (0x03f8)

void serial_putc(char c)
{
    char *serial = (char *)SERIAL_COM1;

    while ((serial[5] & 0x20) == 0);
    serial[0] = c;
}

void serial_init()
{
    char *serial = (char *)SERIAL_COM1;

    serial[1] = 0x00;
    serial[3] = 0x80;
    serial[0] = 0x03;
    serial[1] = 0x00;
    serial[3] = 0x03;
    serial[2] = 0xc7;
    serial[4] = 0x0b;
}
The line protocol is:
38400 baud
8 bits per word
No parity check
1 stop bit
I'm using qemu-system-i386 -serial stdio -kernel ./kernel to test my kernel, but nothing gets printed on the serial port output.
Since I needed to write outb and inb, here is the code:
inline void outb(unsigned int port, unsigned char val)
{
    asm volatile ("outb %%al,%%dx" : : "d" (port), "a" (val));
}

inline unsigned char inb(unsigned int port)
{
    unsigned char ret;

    asm volatile ("inb %%dx,%%al" : "=a" (ret) : "d" (port));
    return (ret);
}
I still can't get any output.
What am I doing wrong?

You should look at outb() and inb(): 0x3F8 is an I/O port address, not a memory address, so you can't write to COM1 through a pointer like that.
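A minimal sketch of what that could look like, reusing your inb()/outb() helpers and the same register values as your serial_init() (the divisor 3 corresponds to 115200 / 38400 on a standard 16550-compatible UART):

#define SERIAL_COM1 (0x03f8)

void serial_init(void)
{
    outb(SERIAL_COM1 + 1, 0x00); /* disable interrupts                         */
    outb(SERIAL_COM1 + 3, 0x80); /* set DLAB to program the baud rate divisor  */
    outb(SERIAL_COM1 + 0, 0x03); /* divisor low byte: 115200 / 38400 = 3       */
    outb(SERIAL_COM1 + 1, 0x00); /* divisor high byte                          */
    outb(SERIAL_COM1 + 3, 0x03); /* clear DLAB; 8 bits, no parity, 1 stop bit  */
    outb(SERIAL_COM1 + 2, 0xc7); /* enable and clear the FIFOs                 */
    outb(SERIAL_COM1 + 4, 0x0b); /* assert DTR, RTS, OUT2                      */
}

void serial_putc(char c)
{
    /* wait until bit 5 of the Line Status Register (transmit holding register empty) is set */
    while ((inb(SERIAL_COM1 + 5) & 0x20) == 0)
        ;
    outb(SERIAL_COM1, (unsigned char)c);
}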

Related

qemu - pci_dma_read and pci_dma_write do not work

So I'm trying to learn how PCI devices and drivers work using the edu device and an educational driver,
and it seems like the pci_dma_write function fails to actually write the information from the DMA buffer into the allocated address in RAM.
Here is how I tested it:
1. I initialized the first byte of the DMA buffer to 0x12 in the realize function of the edu device:
edu->dma_buf[0] = 0x12;
2. In the driver I defined a struct to hold both the major number and the destination address allocated with dma_alloc_coherent():
struct test {
    int major;
    void *vaddr_to;
};
3. I allocated it in the probe function and commanded the device to transfer 4 bytes:
dev_info(&(dev->dev),"Hello from Shaked\n");
dma_addr_t dma_handle_from, dma_handle_to;
void *vaddr_from, *vaddr_to;
enum { SIZE = 4 };
/* device -> RAM. */
vaddr_to = dma_alloc_coherent(&(dev->dev), 4, &dma_handle_to, GFP_ATOMIC);
*((volatile int*)vaddr_to) = 0xff;
test->vaddr_to = vaddr_to;
dev_info(&(dev->dev), "vaddr_to = %px\n", vaddr_to);
dev_info(&(dev->dev), "dma_handle_to = %llx\n", (unsigned long long)dma_handle_to);
/* write source, dest, number of bytes to transfer and activate the DMA timer */
iowrite32(DMA_BASE, mmio + IO_DMA_SRC);
iowrite32((u32)dma_handle_to, mmio + IO_DMA_DST);
iowrite32(SIZE, mmio + IO_DMA_CNT);
iowrite32(DMA_CMD | DMA_FROM_DEV | DMA_IRQ, mmio + IO_DMA_CMD);
4. I changed the interrupt handler a little and added a print to check the value I just read:
static irqreturn_t irq_handler(int irq, void *dev)
{
    int devi;
    irqreturn_t ret;
    u32 irq_status;
    struct test *test;

    test = (struct test *)dev;
    if (test->major == major) {
        irq_status = ioread32(mmio + IO_IRQ_STATUS);
        pr_info("irq_handler irq = %d dev = %d irq_status = %llx\n",
                irq, devi, (unsigned long long)irq_status);
        /* Must do this ACK, or else the interrupts just keep firing. */
        iowrite32(irq_status, mmio + IO_IRQ_ACK);
        pr_info("*vaddr_to new_value = %u\n", (*((u8 *)test->vaddr_to)));
        ret = IRQ_HANDLED;
    } else {
        ret = IRQ_NONE;
    }
    return ret;
}
However, the value I get is still 255 (0xff) and not 18 (0x12).
I've probably missed something, but by the time the interrupt is raised the transfer to the RAM address should be complete; it seems like the DMA write from the device does not perform any transfer. What am I missing?
Technical details:
edu_device: https://github.com/qemu/qemu/blob/master/hw/misc/edu.c
original edu_driver: https://github.com/cirosantilli/linux-kernel-module-cheat/blob/master/kernel_modules/qemu_edu.c
I run this on an x86-64 qemu machine with the following configuration:
$repo_loc/build/qemu-system-x86_64 \
-no-kvm \
-kernel $repo_loc/linux-5.8.5/arch/x86/boot/bzImage \
-boot c -m 2049M \
-hda $repo_loc/buildroot-2020.02.6/output/images/rootfs.ext4 \
-append "root=/dev/sda rw console=ttyS0,115200 acpi=off nokaslr" \
-serial stdio -display none \
-virtfs local,path=$repo_loc/shared,mount_tag=host0,security_model=passthrough,id=host0 \
-device edu
(where $repo_loc is the path to my build dir)
Add pci_set_master() after the Linux driver has invoked pci_enable_device() in the probe function; then everything works fine.
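A minimal sketch of where the call would sit in the probe function (the function name pci_probe and the surrounding steps, indicated only by comments, are assumptions about the question's driver, not part of the fix):

static int pci_probe(struct pci_dev *dev, const struct pci_device_id *id)
{
    if (pci_enable_device(dev) < 0) {
        dev_err(&dev->dev, "pci_enable_device failed\n");
        return -EINVAL;
    }
    /* Enable bus mastering so the edu device is allowed to issue DMA writes to RAM. */
    pci_set_master(dev);

    /* ... request regions, ioremap BAR0, allocate the coherent DMA buffer, request the IRQ ... */
    return 0;
}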

Zynq7000 I2C doesn't work, but registers are set properly

I'm trying to write a piece of code to send data via I2C on my Zynq7020 device. There are 11 registers associated with I2C and I'm pretty sure I have set them properly. I also double-checked the registers associated with the CPU_1X clock enable and the I2C reset, but they are set properly by default. When I set everything up with the code below, the status register reads 0x00000040 and the interrupt status register reads 0x00000000 all the time. I think there must be some enable register, but I can't find anything in the datasheet. Thanks for all replies.
//I2C registers
#define XIICPS_CR 0xE0004000 //Control register
#define XIICPS_SR 0xE0004004 //Status register
#define XIICPS_ADDR 0xE0004008 //IIC Address register
#define XIICPS_DATA 0xE000400C //IIC data register
#define XIICPS_ISR 0xE0004010 //IIC interrupt status register
#define XIICPS_TRANS_SIZE 0xE0004014 //Transfer Size Register
#define XIICPS_SLV_PAUSE 0xE0004018 //Slave Monitor Pause Register
#define XIICPS_TIME_OUT 0xE000401C //Time out register
#define XIICPS_IMR 0xE0004020 //Interrupt mask register
#define XIICPS_IER 0xE0004024 //Interrupt Enable Register
#define XIICPS_IDR 0xE0004028 //Interrupt Disable Register
#include <stdio.h>
#include "platform.h"
#include "xil_printf.h"
int main()
{
    init_platform();
    initI2C();
    while(1){
        printf("Write start\n\r");
        writeI2C(0x6C, 0x00); //I have a device with address 0x6C connected
        printf("Write done\n\r");
    }
    cleanup_platform();
    return 0;
}
void initI2C(){
    *((unsigned int*)XIICPS_CR) = 0x0000905E; //((15:14)CLK_A = 2, (13:8)CLK_B = 16, (6)CLR_FIFO=1, (5)MONITOR_MODE=0, (4)HOLD=1, (3)1, (2)1, (1)MASTER=1, (0)rv = 0)
    *((unsigned int*)XIICPS_TIME_OUT) = 0x000000FF; //set timeout to 255
    *((unsigned int*)XIICPS_IER) = 0x00000000; //no interrupts (I'm pretty sure this line can't do anything, but it's in the datasheet...)
    *((unsigned int*)XIICPS_IDR) = 0x000002FF; //no interrupts
    return;
}

void writeI2C(unsigned addr, unsigned data){
    *((unsigned int*)XIICPS_CR) = (*((unsigned int*)XIICPS_CR)|0x00000040)&0xFFFFFFFE; //CLR_FIFO and WRITE_MODE
    *((unsigned int*)XIICPS_DATA) = data; //When I debug this with JTAG, I can see XIICPS_TRANS_SIZE increment on these lines. So data goes into the FIFO properly, right?
    *((unsigned int*)XIICPS_DATA) = data;
    *((unsigned int*)XIICPS_DATA) = data;
    *((unsigned int*)XIICPS_DATA) = data;
    *((unsigned int*)XIICPS_ADDR) = addr;
    while(*((unsigned int*)XIICPS_SR) != 0){ //this loop never exits. Data sits in the FIFO forever. No errors in the status or interrupt status registers; everything looks fine.
        print("Wait1...\n\r");
    }
    while(*((unsigned int*)XIICPS_ISR)&0x00000001 == 0){
        print("Wait2...\n\r");
    }
    return;
}

UART overrun error when attempting to write to transmit holding register (U0THR)

So, I have a terminal connected to the TxD0 and RxD0 pins. Let's say that I just want to test whether writing to it works at all.
I wrote a few functions to let the UART read and write chars and strings. However, when I try to run it in the simulator, it gives me an overrun error.
Here are the functions in uart.c file:
void uart0_write(unsigned char reg_data)
{
    while((U0LSR & (0x20)) != 0x20); /*wait until the transmit holding register is empty*/
    U0THR = (int) reg_data; /*write to the holding register*/
}

void uart0_write_str(char str[])
{
    while(*str != '\0') /*stop at the terminating NUL*/
    {
        uart0_write(*str); /*write a char*/
        str++;
    }
}
UART0 initialization function:
void uart0_init(void)
{
    PINSEL0 = 0x05; /*set pin P0.0 to TxD0 and P0.1 to RxD0 (TxD0 - 01; RxD0 - 01; 0101 = 0x05)*/
    U0LCR = 0x83; /*set length for 8-bit word, set the stop bit, enable DLAB*/
    U0IER = (1<<0) | (1<<1); /*enable RBR and THR interrupts*/
    U0FCR = 0xC7; /*enable FIFO; reset Tx FIFO; set interrupt after 14 characters*/
    /*Baud rate configured to 9600 baud (from lecture notes)*/
    U0DLL = 0x9D;
    U0DLM = 0x0;
    U0LCR = 0x03; /*8-bit character selection; disable DLAB*/
}
Exemplary use in main:
int main(void)
{
    char *introMsg;

    introMsg = "Hello World\n";
    systemInit();
    ADC_init();
    timer0_init();
    uart0_init();
    uart0_write_str(introMsg);
    /*or: */
    while(1)
    {
        uart0_write('c');
    }
    return 0;
}
From what I've seen elsewhere on the web, the UART should work properly with these code snippets.
But when I attempt to run it, nothing is printed and the OE (overrun error) flag pops up.
What am I doing wrong? I'm only starting to dive into the depths of bare-metal programming, so there might be some bug that I didn't notice.
I'd welcome any insights!
Stay home,
Jacob

UARTs & Registers

So I am new to this and trying to learn about registers and UARTs and have been given the following code to study.
#include <stdint.h>
typedef volatile struct {
    uint32_t DR;
    uint32_t RSR_ECR;
    uint8_t reserved1[0x10];
    const uint32_t FR;
    uint8_t reserved2[0x4];
    uint32_t LPR;
    uint32_t IBRD;
    uint32_t FBRD;
    uint32_t LCR_H;
    uint32_t CR;
    uint32_t IFLS;
    uint32_t IMSC;
    const uint32_t RIS;
    const uint32_t MIS;
    uint32_t ICR;
    uint32_t DMACR;
} pl011_T;

enum {
    RXFE = 0x10,
    TXFF = 0x20,
};

pl011_T * const UART0 = (pl011_T *)0x101f1000;
pl011_T * const UART1 = (pl011_T *)0x101f2000;
pl011_T * const UART2 = (pl011_T *)0x101f3000;

static inline char upperchar(char c) {
    if((c >= 'a') && (c <= 'z')) {
        return c - 'a' + 'A';
    } else {
        return c;
    }
}

static void uart_echo(pl011_T *uart) {
    if ((uart->FR & RXFE) == 0) {
        while(uart->FR & TXFF);
        uart->DR = upperchar(uart->DR);
    }
}

void c_entry() {
    for(;;) {
        uart_echo(UART0);
        uart_echo(UART1);
        uart_echo(UART2);
    }
}
I am just wondering if someone could explain how the pl011 DR and FR registers transmit and receive data over the associated UART peripheral.
Any help at all would be much appreciated.
There's some nice documentation on this UART here - http://infocenter.arm.com/help/topic/com.arm.doc.ddi0183g/DDI0183G_uart_pl011_r1p5_trm.pdf
The way this program works is influenced by whether the UART is in FIFO mode or not. I have not read enough of the documentation to know which is the default state. The operation of the Tx and Rx sides differs slightly depending on this mode. The code works only on single words, so possibly it's not in FIFO mode (or it doesn't matter for this code).
FR is the UART Flag Register (AKA UARTFR). This contains a bunch of bits that can be queried to see what the state of the UART is. Two important ones for this question are:
TXFF is a bit in FR; it becomes 1 when the transmit buffer is full.
RXFE is a bit in FR; it becomes 1 when the receive buffer is empty.
DR is the UART Data Register (AKA UARTDR). It holds the data to be transmitted, and data that has been received.
So, looking at the main working part of the code:
static void uart_echo( pl011_T *uart )
{
    if ( ( uart->FR & RXFE ) == 0 )  // While the receive buffer is NOT empty
    {
        while( uart->FR & TXFF );    // Do <nothing> while the Tx buffer is full
        // Read the content of the Data Register, convert it to uppercase,
        // then write it back to DR, which initiates a transmit
        uart->DR = upperchar(uart->DR);
    }
}
So this function echoes back whatever it reads, but in uppercase. The program calls it for each of the three UARTs in turn.
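To make the roles of FR and DR more concrete, here is a minimal sketch that splits the echo into a blocking transmit and a blocking receive, reusing the pl011_T type and flag bits from the code above (the names uart_putc and uart_getc are just illustrative):

/* Block until there is room to transmit, then write one character to DR. */
static void uart_putc(pl011_T *uart, char c)
{
    while (uart->FR & TXFF);   /* wait while the Tx buffer/FIFO is full */
    uart->DR = c;              /* writing DR starts the transmission */
}

/* Block until a character has been received, then read it out of DR. */
static char uart_getc(pl011_T *uart)
{
    while (uart->FR & RXFE);   /* wait while the Rx buffer/FIFO is empty */
    return (char)uart->DR;     /* reading DR pops the received character */
}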

need help configuring port to input in 8051

The connection is as follows: an infrared sensor circuit, which outputs 0 V or 5 V depending on whether the circuit is closed or open, is wired to pin P2.0 of a Philips 8051 microcontroller. The problem is that when I do this, the value from the circuit is overridden by the current value on P2.0 and the LED always stays on. Here is my code (in Keil C); I guess I have not configured P2.0 as an input properly.
void MSDelay(unsigned int);

sbit led=P1^0;

void main()
{
    unsigned int var;

    P2=0xFF;
    TMOD=0x20;
    TH1=0xFD;
    SCON =0x50;
    TR1=1;
    while(1)
    {
        var=P2^0;
        if(var==0)
        {
            led=1;
            SBUF='0';
            while(TI==0);
            TI=0;
            MSDelay(250);
        }
        else
        {
            led=0;
            SBUF='9';
            while(TI==0);
            TI=0;
            MSDelay(100);
        }
    }
}
EDIT: I was facing this problem because the 8086 processor I was using had a fault in it. I would recommend that anyone trying this get a few spares when programming.
jschmier has a good point. The port may also not be configured correctly, or something in the circuit may be causing the LED to toggle off and on so quickly that it looks like it is on all the time.
You typically use the sbit data type for P2_0 to define a bit within a special function register (SFR).
From C51: READING FROM AN INPUT PORT (modified)
sfr P2 = 0xA0;
sbit P2_0 = P2^0;
...
P2_0 = 1; /* set port for input */
var = P2_0; /* read P2_0 into var */
It is important to note that sbit variables may not be declared inside a function. They must be declared outside of the function body.
Another option may be to read all 8 pins of P2 and then mask off the unwanted bits.
char var; /* define 8 bit variable */
P2 = 0xFF; /* set P2 for input */
var = P2; /* read P2 into var */
var &= 0x01; /* mask off unwanted bits */
Rather than read P2 or the P2_0 pin into an unsigned int (16 bits), you could use a char (8 bits) or single bit to save on memory.
char var;
...
var = P2;
or
bit var;
...
var = P2_0;
Another option may be to make the char bit-addressable.
char bdata var; /* bit-addressable char */
sbit var_0 = var^0; /* bit 0 of var */
...
var = P2; /* read P2 into var */
if(var_0 == 0) /* test var_0 (bit 0 of var char) */
{
...
}
You can find additional useful information in the Keil Cx51 Compiler User's Guide and related links.
Note: Most of my 8051 experience is in assembly. The C examples above may not be 100% correct.
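Putting those snippets together, here is a minimal sketch of how the original loop could read P2.0 directly as a bit (sbit declarations outside the function body and the pin written high before it is read, as described above; the name ir_sensor is just illustrative):

#include <reg51.h>

sbit led       = P1^0;  /* LED output on P1.0 */
sbit ir_sensor = P2^0;  /* infrared sensor input on P2.0 */

void main(void)
{
    ir_sensor = 1;          /* write 1 to the pin so it can be used as an input */
    while (1)
    {
        if (ir_sensor == 0) /* read the sensor pin directly as a single bit */
            led = 1;
        else
            led = 0;
    }
}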
Thank you so much... my code works now.
And I learned how to define an input port and read the data:
#include <reg51.h>

#define opp P1
#define ipp P0

sbit op =P1^0;
sbit ip =P0^0;

main()
{
    unsigned int value;

    P0=0xFF;
    value=P0;
    value &=0x01;
    if(value==0)
    {
        P1=0x01;
    }
    else
    {
        P1=0x00;
    }
}
