Closed. This question does not meet Stack Overflow guidelines. It is not currently accepting answers.
This question does not appear to be about programming within the scope defined in the help center.
Closed 8 years ago.
I am trying to implement bit-banged I2C to communicate between an ATmega128A's GPIO pins and an SHT21 (the hardware I2C bus is used by other devices). The first task is to send a write sequence to the SHT21. On an oscilloscope I can see that the sequence sent out from the ATmega has the correct start condition, the correct order of bits, the correct stop condition, and the signal levels look correct. The scope's serial analyzer decodes a correct I2C message: S80W~A. Yet there is no ACK response from the SHT21. SDA and SCL are both pulled up to 3.3 V via 6.7K resistors.
I really need help finding out what is wrong.
Thank you so much.
Closed. This question needs details or clarity. It is not currently accepting answers.
Closed 2 years ago.
I'm a student and I'm currently learning the basics of microprocessors.
I've been assigned a task with no prior knowledge of the subject, and I've been struggling to find the answer, so I wanted to give it a shot on Stack Overflow.
The task is to write a simple program in the programming language C to test the Arduino board: it should first initialize pin 12, and afterwards continually give that pin 5 V for 5 seconds and then a low voltage of 0 V for 1 second.
I would be very happy with a humble solution or an explanation of it.
Thank you in advance.
Edit:
Im referring to the Arduino Hardware.
Since the task is to set a certain pin as an output, the first thing you need to do is check the board schematics to find out which microcontroller port and pin "pin 12" corresponds to. Microcontroller ports often have one-letter names such as PORTA, PORTB etc. This is also the case for AVR.
Once you have found the correct port, you also have to figure out which bit in that port register to set. There will be 8 pins per port, since each port on this MCU corresponds to an 8-bit register.
Since you want this pin to be an output, you have to configure a "data direction register" to make it such. On AVR (and most Motorola-flavoured MCUs), these registers are called DDRx, where x is the port letter. See the AVR manual, GPIO section. You set the DDR register as output by writing a one to the bit corresponding to the pin.
Once that is done, you can set the relevant bit in actual port register to 1 or 0, depending on if you want a high or low signal. These are called PORTx on AVR.
To create a 5 second delay, it is likely enough for hobbyist/student purposes to call a "busy wait" function. The Arduino libs have the delay() function, which should suffice: simply wait 5000 ms. In professional/real-world applications, busy-wait delays should be avoided; there you would rather use the on-chip hardware peripheral timers, which set a flag when the timer has elapsed.
From your post I take that you have an Arduino board.
It is unlikely that someone told you to program it in C if you don't know any C, and programming that Arduino's AVR microcontroller bare-metal in C is impossible for someone at your skill level.
So let's assume you're supposed to complete the indeed very simple task of programming an Arduino with the Arduino IDE in C++ to do what is asked.
All you have to do is follow this link:
https://www.arduino.cc/en/Guide
I won't give you any code as this would take away the learning experience from you.
You will have to configure pin 12 as a digital output.
You will have to find out how to set a digital output LOW and HIGH.
You will find out how to pause/delay your program for a number of seconds.
Closed 2 years ago.
I am programming a Texas Instruments TMS320F28335 Digital Signal Controller (DSC), and I am implementing the communication with a resolver-to-digital converter, the AD2S1205 (datasheet: https://www.analog.com/media/en/technical-documentation/data-sheets/AD2S1205.pdf). I have to implement the "Supply sequencing and reset" procedure, and the procedure for reading the speed and position values and sending them back to the DSC through the SPI serial interface. I'm new to firmware and there are several things I don't know how to do:
In the resolver datasheet (page 16) you are asked to wait for the supply voltage Vdd to reach its operating range before moving the reset signal. How can I know when this happened?
To ask the resolver to read and transmit the position and speed information, I must observe the timing diagram (page 15); e.g. before pulling /RD low I have to wait for /RDVEL to remain stable for t4 = 5 ns. What should I insert in the code before the instruction that lowers /RD to make sure that 5 ns have passed? I can pass 0.005 to the DELAY_US(A) function available on the DSC (which delays for A microseconds), but I don't know whether that will actually work, or whether this is the right way to satisfy a device's timing diagram.
In the “/RD input” section (page 14) it is specified that the high-to-low transition of /RD must occur when the clock is high. How can I be sure that the line of code that lowers /RD is running when the clock is high?
Connect the chip's Vdd to an ADC input of your DSC via a voltage divider. Release the chip from reset once the measured Vdd is within its operating range.
Your uC runs at 150 MHz. The clock period is 6.67 ns, which is already longer than the 5 ns required. Whatever you do, you cannot change the pin faster than that, so this problem does not exist for you.
Connect CLKIN to an input pin and poll it. Change /RD while the clock is high.
Closed. This question needs debugging details. It is not currently accepting answers.
Closed 4 years ago.
On an STM32F101, after some mishap with reset, the I2C BUSY flag is stuck high. To get out of it, I followed the steps given on page 26 of the following errata document from ST:
https://www.st.com/content/ccc/resource/technical/document/errata_sheet/7d/02/75/64/17/fc/4d/fd/CD00190234.pdf/files/CD00190234.pdf/jcr:content/translations/en.CD00190234.pdf
Of those steps, I could complete the first one. In the second step (set SCL and SDA as open-drain outputs with their values high), I set both SCL and SDA as open-drain outputs, but when I set the pins high, only SCL's bit in the IDR register reads high; SDA's IDR bit cannot be set (although its ODR bit is set). Because of this I cannot continue with the further steps. Please help me through this.
That is not the issue in this case. The slave device is holding SDA low. To exit this deadlock you need to provide between 8 and 12 clock pulses: toggle the SCL pin, and after every clock check whether the SDA line has been released by the slave device. This has nothing in common with that errata.
After that, it is good practice to reset the I2C peripheral.
Closed. This question needs details or clarity. It is not currently accepting answers.
Closed 7 years ago.
We are going to assign priority 5 to Port F, so we write the following code:
int main() {
    NVIC_PRI7_R |= 0x00A00000;
    NVIC_EN0_R = 0x40000000;
My question is: where does the value 0x00A00000 come from?
The Nested Vectored Interrupt Controller (NVIC) registers are present in the ARM Cortex M architecture, so I am supposing that you are programming for that kind of target.
According to the ARM Cortex M manual:
The NVIC supports up to 240 interrupts, each with up to 256 levels of
priority that can be changed dynamically. The processor and NVIC can be
put into a very low-power sleep mode, leaving the Wakeup Interrupt
Controller (WIC) to identify and prioritize interrupts.
NVIC_PRI7_R is the NVIC priority register that holds the 3-bit priority fields for interrupts 28 to 31. These register names come from TI's TM4C123/Stellaris headers, where GPIO Port F is interrupt number 30, and interrupt 30's priority field occupies bits 23:21 of this register.
About the 0x00A00000: it is a mask that places the value 5 in exactly those bits. Precisely, 5 is 0b101, and 0b101 shifted left by 21 gives 0x00A00000, so the |= sets bits 23 and 21 while leaving the other interrupts' priority fields untouched. The second line, NVIC_EN0_R = 0x40000000, sets bit 30 of the interrupt enable register, which enables interrupt 30 (Port F). See the NVIC register descriptions in the device datasheet for the exact field layout.
Hope this helps you understand where the constant comes from.
Closed. This question needs details or clarity. It is not currently accepting answers.
Closed 9 years ago.
I am doing a home automation project, but I don't know how to enable a particular pin, i.e. P1.0, when an 8-bit data byte is received.
Description:
For example, when I send 10011101 from the hyper terminal, only pin P1.0 should be enabled;
when I send 10111100 from the hyper terminal, only pin P1.1 should be enabled.
Please send me C code.
The first thing you should understand is that you need to program your 89C51 controller to receive UART data. You can do that as follows:
Initialize the UART with the baud rate at which your computer is operating.
Write a UART receive function which receives UART data and writes it into a buffer.
Compare the received data with the data that is expected.
If it matches, turn on the corresponding pin of the microcontroller.
You can find related information here and here