I would like to create a PWM signal, and I want the frequency to be close to 38 kHz. My theoretical calculation for the period is 26.3 microseconds, so I chose 26 microseconds, and I can observe my signal.
But I don't understand why my code works properly :)
(My clock frequency is 1 MHz, so one clock cycle is 1 microsecond.)
if((P1IN & BIT3) != BIT3) {     // if button is pressed
    for(i = 0; i < 692; i++){   // pwm signal's duration is 9 ms
        P2OUT ^= 0x01;          // switch from 1 to 0 or vice versa
        __delay_cycles(4);
    }
    P2OUT = 0x00;
}
My calculation is:
i<692, i++, P2OUT^=0x01;  // total 3 cycles
__delay_cycles(4);        // total 4 cycles
So 4 + 3 = 7, but I'm confused because I think it should be 13, not 7.
(here is my signal)
https://e2e.ti.com/cfs-file/__key/communityserver-discussions-components-files/166/f0fd36b0_2D00_bebd_2D00_4a31_2D00_b564_2D00_98962cf4749e-_2800_1_2900_.jpg
You cannot calculate cycles based on C or C++ code; you need to check the assembly file(s) generated during compilation of the program. Depending on your compiler (which you did not mention), you can pass compiler parameters/switches to ask it to leave the generated assembly file(s) in place so you can check the generated instructions. But basically, the for loop will contain a jump instruction, which may take 2-3 cycles, and you did not count that.
I recommend that you then check the number of cycles of each instruction in the microcontroller's datasheet.
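For example, assuming a GCC-based MSP430 toolchain (other compilers have their own switches for this):
msp430-elf-gcc -O2 -S main.c -o main.s   # writes the generated assembly to main.s instead of an object file
Then count the cycles of each instruction in main.s against the device's instruction-set documentation.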
The posted code (per your calculations) switches the output every ~7 cycles and does this 692 times, for a total of 346 on/off periods; however, each pulse's ON time is only ~7 cycles. Suggestion:
if((P1IN & BIT3) != BIT3)
{   // if button is pressed
    // start pwm signal
    P2OUT = 0x01;
    for(int i = 0; i < (9*1000); i++)  // may need to be adjusted
    {   // so pwm signal's duration is 9 ms
        _delay( 1 );
    }
    // stop pwm signal
    P2OUT = 0x00;
    // wait for button to be released
    while( (P1IN & BIT3) != BIT3 ){;}
}
I'm not familiar with your microcontroller's PWM details. However, most have an initialization that sets how fast the PWM timer counts, its start/termination count, whether it repeats, whether the output is a square wave or a step-up or step-down signal, and the percentage of ON vs. OFF time.
However, the posted code indicates the PWM is only a regular GPIO bit.
The posted code indicates the PWM ON percentage is to be 50 percent. Is this what you want?
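If your part has a Timer_A output unit routed to a pin, the whole 38 kHz signal can come from hardware with no software toggling at all. A minimal sketch, assuming an MSP430G2553 with SMCLK at the same 1 MHz and TA0.1 routed to P1.2 (adjust the pin and register names to your device):
#include <msp430.h>

int main(void)
{
    WDTCTL = WDTPW | WDTHOLD;       // stop the watchdog
    P1DIR |= BIT2;                  // P1.2 as output
    P1SEL |= BIT2;                  // P1.2 driven by TA0.1
    TA0CCR0 = 26 - 1;               // period = 26 us -> ~38.5 kHz
    TA0CCR1 = 13;                   // ~50% duty cycle
    TA0CCTL1 = OUTMOD_7;            // reset/set output mode
    TA0CTL = TASSEL_2 | MC_1;       // SMCLK source, up mode
    __bis_SR_register(LPM0_bits);   // CPU can sleep; the timer keeps toggling the pin
}
The button-press / 9 ms burst logic from your code would then just turn the timer on and off (the MC bits in TA0CTL) instead of bit-banging P2OUT.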
Related
I'm trying to generate a note, for example Do; Do's frequency is 523 Hz.
I wrote some code, but it did not work.
SysTick is at 8 MHz.
void note1(void){ // Note Do
    for (int i = 0; i < 523; i++){
        GPIOE->ODR = 0x4000;
        delay_ms(1);
        GPIOE->ODR = 0x0000;
        delay_ms(1);
    }
}
How can we solve this problem?
Board: EasyMx Pro v7
I'm calling the function like this:
void button_handler(void)
{
    note1();
    // Clear pending bit depending on which one is pending
    if (EXTI->PR & (1 << 0)){
        EXTI->PR = (1 << 0);
    }
    else if (EXTI->PR & (1 << 1)){
        EXTI->PR = (1 << 1);
    }
}
My reasoning: sending 1 then 0, 523 times, with delay_ms(1) = 1 ms each;
1000 ms = 1 sec.
On the STM32 (which I can see you have) there are timers that can be configured as PWM outputs.
So use a timer: set the period and prescaler values according to the frequency you need, and set the duty cycle on the channel to 50%.
If you need a 523 Hz PWM output, set your timer's PWM to 523 Hz using the prescaler and period values:
timer_overflow_frequency = timer_input_clock /
(prescaler_value + 1) /
(period_value + 1) ;
Then, for your output channel, set the compare value to half of the timer period value.
For the Standard Peripheral Library, this tutorial can be used:
https://stm32f4-discovery.net/2014/05/stm32f4-stm32f429-discovery-pwm-tutorial/
Link from unwind for Cube https://electronics.stackexchange.com/questions/179546/getting-pwm-to-work-on-stm32f4-using-sts-hal-libraries
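Putting numbers on that formula for 523 Hz, here is a rough register-level (CMSIS-style) sketch. It assumes the timer really is clocked at the 8 MHz you mention for SysTick and uses TIM3 channel 1 as a placeholder; the GPIO alternate-function setup and the actual timer/pin you can use depend on your board, so check the datasheet:
RCC->APB1ENR |= RCC_APB1ENR_TIM3EN;                 // enable the timer clock
TIM3->PSC = 8 - 1;                                  // 8 MHz / 8 = 1 MHz timer tick
TIM3->ARR = 1911;                                   // 1 MHz / (1911 + 1) ~= 523 Hz
TIM3->CCR1 = (1911 + 1) / 2;                        // compare at half the period -> 50% duty
TIM3->CCMR1 |= TIM_CCMR1_OC1M_2 | TIM_CCMR1_OC1M_1  // PWM mode 1 on channel 1
             | TIM_CCMR1_OC1PE;                     // enable compare preload
TIM3->CCER |= TIM_CCER_CC1E;                        // enable the channel 1 output
TIM3->CR1 |= TIM_CR1_CEN;                           // start the timer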
You appear to have a fundamental misunderstanding. In your note1() code, the value 523 affects only the duration of the note, not its frequency. With 1 ms high and 1 ms low repeated 523 times you will generate a tone of approximately 500 Hz for approximately 1.05 seconds. I say "approximately" because there will be some small overhead in the loop other than the time delays.
A time delay resolution of 1 ms is insufficient to generate an accurate tone in that manner. To do it the way you have, each delay would need to be 1/2f seconds, so for 523 Hz approximately 956 µs. The loop iteration count would need to be f×t, so for, say, 0.25 seconds, 131 iterations.
However, if button_handler() is, as it appears to be, an interrupt handler, you really should not be spending a second inside an interrupt handler!
In any event, this is an extraordinarily laborious, CPU-intensive and inaccurate method of generating a specific frequency. The STM32 on your board is well endowed with hardware timers with direct GPIO output that will generate the frequency you need accurately with zero software overhead. Even if none of the timers maps to a suitable GPIO output that you need to use, you can still get one to generate an interrupt every 1/2f seconds and toggle the pin in the interrupt handler. Either way, that leaves the processor free to do useful work while the tone is being output.
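If you take the interrupt route, a hedged sketch of the toggle-in-the-handler idea (again assuming an 8 MHz timer clock, TIM3 as a placeholder, and the same GPIOE pin your note1() code drives):
void tone_init(void)
{
    RCC->APB1ENR |= RCC_APB1ENR_TIM3EN;   // enable the timer clock
    TIM3->PSC = 8 - 1;                    // 1 MHz timer tick
    TIM3->ARR = 956 - 1;                  // ~1046 Hz update rate = 2 x 523 Hz
    TIM3->DIER |= TIM_DIER_UIE;           // update interrupt enable
    NVIC_EnableIRQ(TIM3_IRQn);
    TIM3->CR1 |= TIM_CR1_CEN;             // start the timer
}

void TIM3_IRQHandler(void)
{
    if (TIM3->SR & TIM_SR_UIF) {
        TIM3->SR &= ~TIM_SR_UIF;          // clear the update flag
        GPIOE->ODR ^= 0x4000;             // toggle the pin every half period -> 523 Hz
    }
}
With either approach, button_handler() only needs to start or stop the timer, so the interrupt handler returns immediately.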
I am using an ultrasonic range/sonar sensor (HC-SR04) with an AVR ATmega32A. I have written the code based on my understanding of how the sensor works, but although the program compiles correctly, the device doesn't give any result.
The LCD is showing "out of range", which means PINC1 isn't going high. Is my method for checking whether a bit is high or low correct? Should I use the bit_is_set(PORTC,PINC1) macro? I am using the default 1 MHz clock. PINC0 is connected to trig and PINC1 is the echo input. Here is the code:
#include <avr/io.h>
#include <util/delay.h>
#include "lcd.h"

int main(void){
    ini_lcd();
    TCCR1B |= 1<<CS10;
    DDRC |= 1<<PINC0;
    int a = 0, b = 0;
    while(1){
        TCNT1 = 0;
        PORTC |= 1<<PINC0;
        while(TCNT1 < 100);
        PORTC &= ~(1<<PINC0);
        TCNT1 = 0;
        while(!(PORTC&(1<<PINC1)) && TCNT1<30000); // checking if echo has become high and not exceeding the time for max range i.e. 5 m
        ///send_string("out");
        b = TCNT1;
        if(b < 30000){
            TCNT1 = 0;
            while(PORTC&(1<<PINC1)); // waiting until the echo become low again
            a = TCNT1;
            go_to_pos(1,5);
            send_int(a/58);
        } else {
            go_to_pos(1,5);
            send_string("out of range");
        }
        _delay_ms(20);
    }
}
The main problem I see in your code, and it is repeated a few times, is how you're trying to read the input I/O line. For example:
while(PORTC&(1<<PINC1)); // waiting until the echo become low again
Should become:
while(PINC&(1<<PINC1)); // waiting until the echo become low again
The PORT registers are the output latches and will either contain the last value written to them or the power-on default value of zero. You need to use the PIN registers to read the current state of the I/O pin.
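Applied to the posted code, the two wait loops would become something like this (a sketch only, keeping the original 30000-tick limit and adding the same timeout to the second loop so it cannot hang):
TCNT1 = 0;
while(!(PINC & (1<<PINC1)) && TCNT1 < 30000);    // wait for echo to go high, with timeout
b = TCNT1;
if(b < 30000){
    TCNT1 = 0;
    while((PINC & (1<<PINC1)) && TCNT1 < 30000); // measure how long echo stays high
    a = TCNT1;
}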
Also double-check the timeout you compare TCNT1 against: with a 1 MHz clock and no prescaler, TCNT1 is incremented once every 1 µs, not every 0.1 µs, so 30000 ticks is 30000 µs (30 ms), which is roughly the round-trip time for the HC-SR04's maximum range of about 5 m. (A 16-bit TCNT1 cannot hold 300000 anyway, so the limit has to stay below 65535.)
Hello folks,
After ~9 hours of racking my brains, I can't find an answer or find where my calculations are going wrong... but they are.
I have a circuit built using a Microchip PIC18F2550 microcontroller.
I am using this circuit to measure the delay between 2 signals and am using the 2 CCP registers in capture mode.
This all works and the result is sent to the PC (over USB serial) all dandy, but the results are wrong.
I have to apply a gain of ~16000 to any results to get somewhere near the delay presented to the pins.
My timer setup is as follows:
Timer1 is set as an internal timer
Timer3 is disabled
the relevant interrupts are enabled
and the main routine runs continuously.
When I get a rising-edge detection on the CCP1 pin, the interrupt is configured to reset Timer1 to zero as well as the overflow counter:
#INT_CCP1
void ccp1_isr() // Captures the rising edge of CCP1 pin.
{
    if(timing==FALSE){    // only do this on the edge, any bouncing will reset timers etc.
        set_timer1(0);
        T1_Overflow = 0;
        Pulse_Time = 0;
        timing = 1;       // Set flag to indicate timing.
        output_high(BLUE_LED);
    }
}
The timing flag ensures the timers cannot be reset by another pulse on the CCP1 pin.
Timer1 is then reset and starts counting as normal. Every time it rolls over at 65535 (it's a 16-bit timer), another interrupt is fired, in which the overflow count is incremented.
#INT_TIMER1
void isr()
{
    T1_Overflow++;
}
Finally, when the input pin on CCP2 goes high, the CCP2 interrupt is triggered. This captures the value of the CCP register (which is the value of Timer1 at the time the interrupt fired) and the overflow count.
#INT_CCP2
void ccp2_isr()
{
    if(timing == TRUE){          // only output this when preceded by CCP1
        if(Count_Done == FALSE)  // do this once only
        {
            Count_Done = TRUE;   // and also flag to the main routine to output data to the terminal.
            Pulse_time = CCP_2;
            Pulse_Overflow = T1_Overflow;
            measureCount++;      // increment the number of measures.
        }
        output_low(BLUE_LED);
        timing = FALSE;
    }
}
CCP1 can now start responding to the inputs again.
The idea is that every time I get a pulse on CCP1 followed by one on CCP2, a string is sent to the terminal with a counter, the number of overflows, and the captured timer value.
while(TRUE) // do forever while connected
{
    usb_task(); // keep usb alive
    if(Count_Done == TRUE)
    {
        printf(usb_cdc_putc, "%lu , %lu , %lu \r\n", measureCount, pulse_time, pulse_overflow);
        Count_Done = FALSE;
    }
}
So I should get an output on the terminal of something like "1,61553,35" for a ~12 ms delay between CCP1 and CCP2.
The problem is that these are the results I am getting for a 200 ms pulse provided to the circuit (verified twice).
So where am I going wrong?
I have a 48 MHz clock with no prescaler, which implies a cycle every ~20.8 ns.
Divide by 4 for 4 instructions per cycle for the clock, which implies 5.2 ns every cycle.
A 16-bit timer implies a rollover every 65535 × 5.2 ns = 341 µs per rollover.
When you do the calculation, (0.000341 × pulse_overflow) + pulse_time × (5.2 × 10^-9),
the above data gives 12.27 ms and not the 200 ms provided.
Can anyone point out where I am going wrong with these calculations???
Your error is in "Divide by 4 for 4 instructions per cycle for the clock, which implies 5.2 ns every cycle".
The counter ticks once every 4 cycles, not 4 times per cycle. So the correct calculations are:
2.08333E-08 s per cycle of the oscillator
8.33333E-08 s per tick of the timer
0.005461333 s per rollover
You are off by a factor of 16.
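As a check, plugging those corrected numbers back into the sample output "1 , 61553 , 35":
(35 rollovers × 0.005461333 s) + (61553 ticks × 8.33333E-08 s) ≈ 0.1911 s + 0.0051 s ≈ 196 ms
which agrees with the ~200 ms pulse that was actually applied.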
I am trying to use a PIC18F45K22 to flash an LED on for a second, then off for a second, repeating indefinitely. The code incorporates a delay function that keeps the PIC busy for one second, but I am not sure how it works (it was arrived at through trial and error). To achieve a 1-second delay, tWait should equal 125,000. I am using an 8 MHz MCU clock frequency with the PLL disabled. I know that 8 MHz corresponds to a 125 ns period, but after that I am stuck. Can anyone please explain to me how it works? I've been at it for a few hours and it's driving me bonkers.
#define SYSTEM_FREQUENCY (8000000) // 8MHz
// Prototype Function
void delayS(unsigned int sec);
void main()
{
    // Sets RA0 to output...
    TRISA.RA0 = 0;

    do
    {
        // Turns RA0 off...
        LATA.RA0 = 0;
        delayS(1);

        // Turns RA0 on...
        LATA.RA0 = 1;
        delayS(1);
    } while(1); // Repeat...
}

// My attempt at writing the function...
void delayS(unsigned int sec)
{
    unsigned long tWait, tStart;

    /*
       To achieve a 1 Second On 1 Second Off cycle, tWait should be 125,000.
       How? Why? 64 is arbitrary to achieve 125,000.
    */
    tWait = sec*(SYSTEM_FREQUENCY/64);
    for(tStart = 0; tStart < tWait; tStart++);
}
The clock frequency is 8 MHz, so there are 2×10^6 instruction cycles per second, since a cycle takes 4 clock ticks. Any assembly operation of the microcontroller takes at least one cycle. There is still a factor of 16 to explain.
The factor of 16 must correspond to one pass of the loop. Since unsigned longs are used in this loop, it is not surprising that an iteration takes a few operations. You will need to look at the assembly code to understand what is happening and get a precise count, like here.
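Putting the numbers together (assuming roughly 16 instruction cycles per pass of the empty loop, as above): tWait = 8,000,000 / 64 = 125,000 iterations, and 125,000 × 16 cycles / 2,000,000 cycles per second = 1 second. That is where the otherwise arbitrary divisor of 64 comes from: 16 cycles per iteration × 4 clock ticks per cycle.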
If you are working with MPLAB and the XC8 compiler, there are library delay functions you can use, such as Delay10KTCYx(unsigned char);.
The microcontroller spends 99.99% of its time counting, which is not critical if you are just flashing LEDs. But to use a PIC efficiently, you will need to avoid these busy-wait functions and use timers and interrupts instead. Have fun!
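As a sketch of the timer approach (hypothetical code, not from the question; it assumes a PIC18F45K22 at Fosc = 8 MHz and XC8-style register names, so check them against your compiler's device header):
#include <xc.h>

// Polled 1-second delay: Fosc/4 = 2 MHz instruction clock, 1:32 prescale
// gives 62,500 Timer0 ticks per second; preload 65536 - 62500 = 3036 (0x0BDC)
// so the 16-bit timer overflows exactly one second later.
void delay_1s_timer0(void)
{
    T0CON = 0x84;                // Timer0 on, 16-bit mode, internal clock, prescaler 1:32
    TMR0H = 0x0B;                // write the high byte first...
    TMR0L = 0xDC;                // ...writing the low byte latches both
    INTCONbits.TMR0IF = 0;       // clear the overflow flag
    while(!INTCONbits.TMR0IF);   // wait for overflow (or do useful work and poll the flag)
}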
I want to know more about Arduino Nano timers.
What timers are there?
Do they produce interrupts?
What code would attach an interrupt handler to them?
How are delay() and delayMicroseconds() implemented?
Do they use timer interrupts? (If so, how can I have other code execute during this?)
Or do they repeatedly poll until a timer reaches a certain value?
Or do they increment a value X number of times?
Or do they do it another way?
The best way to think about the Arduino Nano timers is to think about the timers in the underlying chip: the ATmega328. It has three timers:
Timer 0: 8-bit, PWM on chip pins 11 and 12
Timer 1: 16-bit, PWM on chip pins 15 and 16
Timer 2: 8-bit, PWM on chip pins 17 and 5
All of these timers can produce two kinds of interrupts:
The "value matched" interrupt occurs when the timer value, which is added to every tick of the timer reaches a comparison value in the timer register.
The timer overflow interrupt occurs when the timer value reaches its maximum value
Unfortunately, there is no Arduino function to attach interrupts to timers. To use timer interrupts you will need to write slightly more low-level code. Basically, you will need to declare an interrupt routine something like this:
ISR(TIMER1_OVF_vect) {
...
}
This will declare a function to service timer1 overflow interrupt. Then you will need to enable the timer overflow interrupt using the TIMSK1 register. In the above example case this might look like this:
TIMSK1 |= (1<<TOIE1);
or
TIMSK1 |= _BV(TOIE1);
This sets the TOIE1 (generate timer1 overflow interrupt, please) flag in the TIMSK1 register. Assuming that your interrupts are enabled, your ISR(TIMER1_OVF_vect) will get called every time timer1 overflows.
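A minimal sketch tying those pieces together (assumptions: an ATmega328-based Nano at 16 MHz with the on-board LED on pin 13; with a 1:256 prescaler, timer1 overflows every 65536 × 256 / 16 MHz ≈ 1.05 s):
#include <avr/interrupt.h>

ISR(TIMER1_OVF_vect) {
    PORTB ^= _BV(PORTB5);     // toggle pin 13 (PB5) on every overflow
}

void setup() {
    DDRB |= _BV(DDB5);        // pin 13 as output
    TCCR1A = 0;               // normal mode, no PWM
    TCCR1B = _BV(CS12);       // clock/256 prescaler, timer1 running
    TIMSK1 |= _BV(TOIE1);     // enable the timer1 overflow interrupt
    sei();                    // make sure global interrupts are on
}

void loop() {
    // nothing to do here; the ISR blinks the LED in the background
}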
The Arduino delay() function looks as follows in the source code (wiring.c):
void delay(unsigned long ms)
{
    uint16_t start = (uint16_t)micros();

    while (ms > 0) {
        if (((uint16_t)micros() - start) >= 1000) {
            ms--;
            start += 1000;
        }
    }
}
So internally it uses the micros() function, which indeed relies on the timer0 count. The Arduino framework uses timer0 to count milliseconds; indeed, the timer0 count is where the millis() function gets its value.
The delayMicroseconds() function, on the other hand, uses certain well-timed processor operations to create the delay; which code is used depends on the processor and the clock speed, the most common building block being a nop (no operation), which takes exactly one clock cycle. The Arduino Nano uses a 16 MHz clock, and here's what the source code looks like for that case:
// For a one-microsecond delay, simply return. The overhead
// of the function call yields a delay of approximately 1 1/8 µs.
if (--us == 0)
return;
// The following loop takes a quarter of a microsecond (4 cycles)
// per iteration, so execute it four times for each microsecond of
// delay requested.
us <<= 2;
// Account for the time taken in the preceding commands.
us -= 2;
What we learn from this:
1 µs delay does nothing (the function call is the delay)
Longer delays use the left shift (multiply by four) to convert the requested microseconds into iterations of the four-cycle busy loop described in the comments above.