I am programming a Texas Instruments TMS320F28335 Digital Signal Controller (DSC) and I am writing the communication with a resolver-to-digital converter, model AD2S1205 (datasheet: https://www.analog.com/media/en/technical-documentation/data-sheets/AD2S1205.pdf). I have to implement the "Supply sequencing and reset" procedure and the procedure for reading the speed and position values and sending them back to the DSC through the SPI serial interface. I'm new to firmware and there are several things I don't know how to do:
In the AD2S1205 datasheet (page 16) you are asked to wait for the supply voltage Vdd to reach its operating range before releasing the reset signal. How can I know when this has happened?
To ask the converter to read and transmit the position and speed information, I must observe the timing diagram (page 15); e.g. before pulling /RD low I have to wait for /RDVEL to remain stable for t4 = 5 ns. What should I insert in the code before the instruction that lowers /RD to make sure that 5 ns have passed? I could pass 0.005 to the DELAY_US(A) function available on the DSC (which delays for A microseconds), but I don't know if that will actually work, or if this is the right way to satisfy device timing diagrams.
In the "/RD input" section (page 14) it is specified that the high-to-low transition of /RD must occur while the clock is high. How can I be sure that the line of code that lowers /RD runs while the clock is high?
Connect the chip's Vdd to one of the DSC's ADC inputs via a voltage divider. Release the chip's reset once the measured Vdd is within its operating range.
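A minimal sketch of that approach, assuming a hypothetical adc_read_vdd_mv() helper that returns the divided-down Vdd reading scaled back to millivolts, and a hypothetical gpio_set() helper driving the /RESET line (replace both with your DSC's actual ADC and GPIO drivers):

#include <stdint.h>
#include <stdbool.h>

/* hypothetical platform helpers -- replace with the F28335 ADC/GPIO drivers */
uint16_t adc_read_vdd_mv(void);   /* divided Vdd, scaled back to millivolts */
void gpio_set(int pin, bool level);

#define RESET_PIN  0              /* DSC pin wired to the AD2S1205 /RESET (assumption) */
#define VDD_MIN_MV 4750U          /* lower edge of the 5 V +/- 5% operating range */

void release_reset_when_vdd_ok(void)
{
    gpio_set(RESET_PIN, false);            /* hold the AD2S1205 in reset */
    while (adc_read_vdd_mv() < VDD_MIN_MV)
        ;                                  /* wait until Vdd has reached its range */
    gpio_set(RESET_PIN, true);             /* Vdd is valid: release /RESET */
}

Note that the F28335's ADC input range tops out around 3 V, which is why the answer routes Vdd through a divider; the helper above is assumed to undo that scaling.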
Your uC runs at 150 MHz, so one clock period is 6.67 ns, which is already longer than the 5 ns setup time required. Whatever you do, you can't change the pin faster than that, so this problem does not exist for you.
Connect CLKIN to a DSC input pin and poll it. Change /RD only while the clock is high.
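A minimal sketch of that polling loop, again with hypothetical gpio_read()/gpio_set() helpers and assumed pin numbers:

#include <stdbool.h>

/* hypothetical platform helpers -- replace with the F28335 GPIO driver */
bool gpio_read(int pin);
void gpio_set(int pin, bool level);

#define CLKIN_PIN 1   /* wired to the AD2S1205 CLKIN (assumption) */
#define RD_PIN    2   /* wired to the AD2S1205 /RD (assumption) */

void lower_rd_while_clock_high(void)
{
    while (gpio_read(CLKIN_PIN))
        ;                          /* if the clock is already high, wait it out */
    while (!gpio_read(CLKIN_PIN))
        ;                          /* wait for the next rising edge */
    gpio_set(RD_PIN, false);       /* clock has just gone high: drop /RD */
}

Waiting for a fresh rising edge, rather than acting on an already-high level, leaves the largest possible part of the high phase for the /RD transition.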
I'm a student and I'm currently learning the basics of microprocessors.
I've been assigned a task with no prior knowledge of the subject, and I've been struggling to find the answer to it, so I wanted to give it a shot on Stack Overflow.
The task is to write a simple program in the C programming language to test the Arduino board: it first initializes pin 12, and afterwards continually drives that pin high (5 V) for 5 seconds and low (0 V) for 1 second.
I would be very happy with a humble solution or an explanation.
Thank you in advance.
Edit:
I'm referring to the Arduino hardware.
Since the task is to set a certain pin as an output, the first thing you need to do is check the board schematics to find out which microcontroller port and pin "pin 12" corresponds to. Microcontroller ports often have one-letter names such as PORTA, PORTB, etc. This is also the case for AVR.
Once you have found the correct port, you also have to figure out which bit in that port register to set. There will be 8 pins per port, since each port on this MCU corresponds to an 8-bit register.
Since you want this pin to be an output, you have to configure a "data direction register" to make it one. On AVR (and most Motorola-flavoured MCUs), these registers are called DDRx, where x is the port letter. See the GPIO section of the AVR manual. You set the DDR register for output by writing a one to the bit corresponding to the pin.
Once that is done, you can set the relevant bit in the actual port register to 1 or 0, depending on whether you want a high or low signal. These registers are called PORTx on AVR.
To create a 5-second delay, it is likely enough for hobbyist/student purposes to call a "busy wait" function. The Arduino libs have the delay() function, which should suffice: simply wait 5000 ms. In professional/real-world applications, busy-wait delays should be avoided; there you would rather use the on-chip hardware peripheral timers to set a flag when the time has elapsed.
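A minimal bare-metal sketch of the steps above, assuming an Uno-style board where Arduino pin 12 maps to port B, bit 4 (PB4), and a 16 MHz clock; avr-libc's _delay_ms() plays the role of the busy wait here:

#include <stdint.h>
#define F_CPU 16000000UL           /* 16 MHz, typical for Uno-style boards */
#include <avr/io.h>
#include <util/delay.h>

static void delay_seconds(uint8_t s)
{
    while (s--)
        for (uint8_t i = 0; i < 10; i++)
            _delay_ms(100);        /* busy-wait in short, precise chunks */
}

int main(void)
{
    DDRB |= (1 << DDB4);           /* data direction: PB4 (pin 12) as output */
    for (;;)
    {
        PORTB |= (1 << PORTB4);    /* pin high: 5 V */
        delay_seconds(5);
        PORTB &= ~(1 << PORTB4);   /* pin low: 0 V */
        delay_seconds(1);
    }
}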
From your post I take it that you have an Arduino board.
It is unlikely that someone told you to program it in C when you don't know any C, and programming the Arduino's AVR microcontroller bare-metal in C is not feasible for someone at your skill level.
So let's assume you're supposed to complete the (indeed very simple) task of programming an Arduino with the Arduino IDE in C++ to do what is asked.
All you have to do is follow this link:
https://www.arduino.cc/en/Guide
I won't give you any code as this would take away the learning experience from you.
You will have to configure pin 12 as a digital output.
You will have to find out how to set a digital output LOW and HIGH.
You will have to find out how to pause/delay your program for a number of seconds.
I have an Arduino Nano with an ATmega328P. I'm using Atmel Studio to program it in C and I need to have the following program:
I have 5 inputs (PC3-PC7); each should have its own separate timer and each drives two LEDs (one red, one green).
Each HIGH-level on an input pin (PC3-PC7) triggers a separate timer, which should be 10 minutes long.
The HIGH-level on the input pins should last over the course of these 10 minutes. If it changes to a LOW-level while running, something happens (LEDs blink, buzzer on).
If the timer has reached the 10-minute mark, something happens (red LED off, green LED on, buzzer on).
I think the time.h library is needed for this, but I have no idea how I could program this. Any help is appreciated.
There is probably no pre-made library for this.
What you need is to program some manner of hardware abstraction layer (HAL) on top of your hardware peripheral timers. One hardware timer/RTC should be enough. Have it trigger an interrupt every x time units, depending on what prescaler clock is available.
From this timer interrupt, step through the individual ongoing tasks and see whether it is time for them to execute. It could look something like this:
#include <stddef.h>
#include <stdbool.h>
#include <avr/interrupt.h>

typedef struct {
    bool enabled;               /* true while this software timer is armed */
    unsigned long deadline;     /* tick count at which to fire */
    void (*execute)(void);      /* callback to run at the deadline */
} soft_timer_t;

#define NUM_TIMERS 5
volatile soft_timer_t timer[NUM_TIMERS];
volatile unsigned long counter;        /* internal tick counter */

ISR(TIMER1_COMPA_vect)                 /* triggers once every x time units */
{
    /* on AVR the interrupt flag is cleared by hardware; re-arm the timer
       here if it is not running in CTC (auto-reload) mode */
    counter++;
    /* optionally allow nested interrupts here with sei() */
    for (size_t i = 0; i < NUM_TIMERS; i++)
    {
        if (timer[i].enabled && counter == timer[i].deadline)
        {
            timer[i].execute();        /* fire the callback */
            timer[i].enabled = false;  /* one-shot: disarm after firing */
        }
    }
}
where execute is a function pointer to a callback function.
The first thing you should do is set up a timer interrupt; read up on how timer interrupts work on the AVR.
After setting up the timer, you should use one or more variables to reach 10 minutes.
For example: the timer interrupt should increment var1 every time the ISR runs. Every time var1 overflows, increment var2; when var2 overflows, increment var3, and so on.
The reason you need this is that a hardware timer by itself can only hold really small time spans, on the order of milliseconds or microseconds.
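A minimal sketch of that cascade, assuming a timer already configured to fire this ISR once per millisecond (the TIMER0_COMPA_vect vector here is just one possible choice on the ATmega328P):

#include <stdint.h>
#include <stdbool.h>
#include <avr/interrupt.h>

volatile uint16_t ms_in_second;    /* var1: rolls over every 1000 ticks */
volatile uint8_t  seconds;         /* var2: rolls over every 60 s */
volatile uint8_t  minutes;         /* var3: counts whole minutes */
volatile bool     ten_minutes_done;

ISR(TIMER0_COMPA_vect)             /* assumed: fires once every 1 ms */
{
    if (++ms_in_second >= 1000)    /* var1 "overflowed": one second passed */
    {
        ms_in_second = 0;
        if (++seconds >= 60)       /* var2 "overflowed": one minute passed */
        {
            seconds = 0;
            if (++minutes >= 10)   /* the 10-minute mark: main loop reacts */
                ten_minutes_done = true;
        }
    }
}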
On an STM32F101, after some mishap with a reset, the I2C BUSY flag is stuck high. To recover from it, I followed the steps given on page 26 of the following errata document from ST:
https://www.st.com/content/ccc/resource/technical/document/errata_sheet/7d/02/75/64/17/fc/4d/fd/CD00190234.pdf/files/CD00190234.pdf/jcr:content/translations/en.CD00190234.pdf
I was able to complete the first step. In the second step ("Set SCL and SDA as open-drain outputs with their values high"), I have set both SCL and SDA as open-drain outputs, but when I set the pins high, only SCL's IDR bit reads high; SDA's IDR bit cannot be set (although its ODR bit is set). Because of this I cannot continue with the further steps. Please help me through this.
That is not the issue in this case. The slave device is holding SDA low. To get out of this deadlock you need to provide between 8 and 12 clock pulses: toggle the SCL pin, and after every clock check whether the SDA line has been released by the slave device. This has nothing in common with the errata.
After that, it is a good idea to reset the I2C peripheral.
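A minimal sketch of that clock-out procedure using the STM32 HAL; SCL_PORT/SCL_PIN and SDA_PORT/SDA_PIN are assumptions for wherever your I2C pins sit, and both pins are assumed to already be configured as open-drain GPIO outputs:

#include "stm32f1xx_hal.h"

/* SCL_PORT/SCL_PIN, SDA_PORT/SDA_PIN: fill in for your board (assumptions) */
void i2c_bus_recover(void)
{
    for (int i = 0; i < 12; i++)   /* at most 12 clock pulses */
    {
        if (HAL_GPIO_ReadPin(SDA_PORT, SDA_PIN) == GPIO_PIN_SET)
            break;                 /* slave has released SDA: bus is free */
        HAL_GPIO_WritePin(SCL_PORT, SCL_PIN, GPIO_PIN_RESET);
        HAL_Delay(1);              /* coarse half-period; fine for recovery */
        HAL_GPIO_WritePin(SCL_PORT, SCL_PIN, GPIO_PIN_SET);
        HAL_Delay(1);
    }
    /* after the bus is free, reset and re-initialize the I2C peripheral */
}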
I am new to microprocessor programming and currently have an RGB sensor which reads an RGB value and increments a variable by an arbitrary number. I want the sensor to turn off for 0.3 seconds when I reach a certain value. Is there a way to do this, or will I have to figure out a different way to throw out all the values the RGB sensor receives during that 0.3-second time span? I am writing in C.
Note: The sensor I am currently using is a TCS230.
According to the datasheet, pin 3 is Output Enable (/OE, active low). So if you drive this pin high, it should cut off the chip's output.
Or, closer to your question, it looks like if you drive pins S0 and S1 both low, the chip is placed in a power-down state.
Which option you choose depends on what's more important: do you want the quickest reaction time, or do you want to conserve power? If you want the quickest reaction time, use /OE; there is a typical 100 ns delay between asserting this signal and the chip responding. The downside is that the chip is still running the whole time. If you choose the power-down state, you will save energy compared with the Output Enable option, but the photodiodes have a typical 100 µs "recovery from power down" delay. Obviously that's a factor of 1000, so if you're doing time-critical work, it's probably not the best option.
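A minimal sketch of the /OE approach, with hypothetical gpio_write() and delay_ms() helpers standing in for your platform's GPIO and delay calls (the pin number is also an assumption):

/* hypothetical helpers -- replace with your platform's GPIO/delay calls */
void gpio_write(int pin, int level);
void delay_ms(unsigned int ms);

#define OE_PIN 3    /* MCU pin wired to the sensor's /OE input (assumption) */

void blank_sensor_300ms(void)
{
    gpio_write(OE_PIN, 1);    /* /OE high: output disabled, chip keeps running */
    delay_ms(300);            /* the 0.3 s blackout */
    gpio_write(OE_PIN, 0);    /* /OE low: output enabled again */
}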
Keep in mind, I have never used this chip in my life; I'm just basing my answer on a quick read of the datasheet.
I am currently working with an Atmel microcontroller, on the EVK1104S board, which houses an AVR32 UC3 MCU (see its datasheet). We have actually placed this chip on a custom PCB and are in the process of writing more firmware.
Currently, I need to tell the ADC on the microcontroller unit (MCU) to sample at 8 k samples/second. In reality this is for sampling a microphone. Either way, the documentation is quite unclear and I was looking for some clarification.
I know that to change the sampling rate I need to change what is called the Mode Register, the register used to configure the ADC (page 799 in the datasheet linked above). This is the register which allows me to change the sample/hold time, the start-up time, and the ADC clock.
Example (from page 799):
Sample & Hold Time = (SHTIM + 3) / ADCClock
ADCClock = CLK_ADC / ((PRESCAL + 1) * 2)
From what I gather, I will only need to change PRESCAL to make ADCClock operate at 8 kHz. The problem is that PRESCAL is limited to 8 bits of resolution.
For example, if the controller is running at 12 MHz, then 12 MHz / x = 8 kHz means x would need to be 1500. Because x is limited by the 8-bit PRESCAL as I said before, this appears to be impossible.
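To make that concrete: even with the maximum PRESCAL of 255, the divider is only (255 + 1) * 2 = 512, so the slowest ADCClock I can reach is 12 MHz / 512 ≈ 23.4 kHz, still nowhere near 8 kHz.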
I feel that I am doing something wrong here, or not understanding what the datasheet wants me to do. Can anyone confirm what I have just described, or help point me in the right direction?
You are confusing the sampling rate with the ADC clock rate.
The registers you refer to in the manual only control the taking of a single sample: they let you set how long to sample the voltage for. This may make a difference depending on the circuitry involved; that is, you don't want to take the sample too fast for your circuit. (I didn't look closely at the datasheet, but some microcontrollers take several samples and average them. That behaviour is controlled by registers, too.)
But the 8 kHz sampling rate refers to how often you want to sample, i.e. the frequency at which you trigger the individual samples. The registers you mention don't address this. You need a clock and an interrupt handler to move the data out of the result register into storage somewhere (or do something with it) and then trigger the next sample. There is also an interrupt that can fire as soon as a sample is ready. In that scheme, you use two handlers: one to trigger the samples, another to deal with the samples when they are ready.
Edit:
To explain further why you don't want such a slow ADC clock, consider how the ADC generates its data: it samples for the first bit, waits a cycle, samples for the second bit, and so on for 10 cycles. The accuracy of the result depends on the signal staying stable over all of these samples; if the signal is changing, the bits of the result are meaningless. You need to set the prescaler and ADC clock fast enough that the signal does not change, but slow enough for the signal to settle.
So yes, you want to use a clock and an interrupt handler to read the data and then trigger the next reading. The ADC runs independently of the processor and will be ready by the time the interrupt runs again. (The first reading will be garbage, but you can set a flag or something to guard against that.) In sketch form, with placeholder helper names:
/* sketch: adc_trigger(), adc_result(), process_sample() and output_compare
   stand in for the UC3's actual ADC and timer-compare interfaces */
volatile int running = 0;

void handler(void)                       /* output-compare interrupt, fires at 8 kHz */
{
    if (running)
        process_sample(adc_result());    /* previous conversion is done by now */
    running = 1;                         /* first reading is garbage: skip it */
    adc_trigger();                       /* start the next conversion */
    output_compare += TICKS_PER_SAMPLE;  /* schedule the next interrupt 1/8000 s ahead */
}