Noisy ADC on a dsPIC33F

I've got a dsPIC33F collecting from two ADC channels simultaneously, at 10-bit resolution. I'm using a timer to sample at 64 Hz and have the ADC set to auto-sampling but manual conversion. Every time the timer interrupt fires I clear the sample bit, and the DMA buffer is filled with my ADC data. Plotting this data shows it's giving the right values, but I've noticed it's very noisy!
Ignore the green line. The red line is correctly plotting my ADC results (the peaks are intentional), but as you can see it's got an awful lot of noise throughout.
Any ideas on what can be done to reduce this? When plotting simultaneously with a DAQ (using the same power source and linked grounds) the signal is much, much smoother, so I know this noise isn't always present. Decoupling capacitors on the PIC, maybe? I'm using a breadboard and through-hole components, and the analogue sensor is placed as close to the PIC pin as possible. I'm under the impression this is a hardware issue, but let me know if something can be done on the software side of things.

This could be due to the source impedance driving the ADC, i.e. your analogue sensor. It might need a buffer amplifier to drive the ADC better; an op-amp in unity-gain configuration should help. Another way to achieve some improvement is a small capacitor from the ADC input to ground, but you would need to choose its value carefully to avoid filtering those peaks too much.
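For sizing that capacitor, the usual starting point is the RC corner frequency f_c = 1/(2πRC): with, say, 1 kΩ of source impedance, 100 nF puts the corner near 1.6 kHz, far above a 64 Hz sample rate. On the software side, a cheap complement is oversampling and averaging. A minimal sketch in C, assuming a hypothetical adcBuffer that your DMA channel fills with raw 10-bit results:

```c
#include <stdint.h>

#define OVERSAMPLE 16  /* raw readings averaged per output sample */

/* Hypothetical DMA buffer filled by the ADC; adapt to your layout. */
extern volatile uint16_t adcBuffer[OVERSAMPLE];

/* Boxcar average of one buffer of 10-bit samples. Averaging N
 * uncorrelated readings reduces the noise by roughly sqrt(N). */
uint16_t adc_filtered(void)
{
    uint32_t sum = 0;
    for (unsigned i = 0; i < OVERSAMPLE; i++) {
        sum += adcBuffer[i];
    }
    return (uint16_t)(sum / OVERSAMPLE);
}
```

Since you only need 64 Hz output, you can afford to sample much faster and average heavily without losing your peaks.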

Related

System architecture for a high-speed microcontroller test stand controller/DAQ

I am designing the controller and data acquisition unit for a rocket engine test stand. This system needs to control a number of actuators on the test stand and also be able to transmit collected data back to the host computer, where the team will be watching live data/camera feeds from a safe location.
The overall design requirements are as follows:
Acquire data from ~15 analog sensors at 1 kHz
Control the actuators on the test stand including valves and ignition switches
Transmit data back to the host computer in our shelter in real time
Accept control from the host computer for things like manual valve actuation, test sequence modification, sequence abort, etc.
I am not exactly sure where to begin when laying out the software for this system. I am considering an STM32 ARM Cortex-M4 running at 180 MHz, but I am having trouble figuring out how to approach the problem. I have considered using an RTOS, but from what I have seen the scheduler overhead grows with the tick rate, since the scheduler has to run on every tick. The other idea I'm bouncing around is a state machine combined with timer-based interrupts for reading sensors and then sending data back out to the PC. Any advice on how to approach this problem while minimizing code complexity would be greatly appreciated. Thanks.
EDIT:
I have been told to clarify a number of things concerning the technical specs of the system.
My actuators consist of:
6 solenoids (controlled digitally through relays/MOSFET, and switched around once a second)
2 DC motors (driven with PWM outputs in a PID loop, need to be able to ramp position controllably)
One igniter, again controlled through a relay/MOSFET
My sensors consist of:
8 pressure transducers (analog voltages)
4 thermocouples (analog voltages)
2 motor encoders (quadrature encoders)
1 light sensor (analog voltage)
1 Load cell (analog voltage)
Ideally all of the collected data (all of the above sensors) plus some additional data (timestamps, motor set positions, solenoid positions) is streamed back to the host computer in real time.
Given the motor control with PWM and PID, you need to specify a desired resolution, either in PWM timer ticks or ADC counts. This is the most critical part. It doesn't hurt if the ADC has greater resolution than your specified requirement. The PCB has to be designed accordingly, with resistor tolerances tight enough to support that resolution.
After you've done this, find an MCU with a sufficiently accurate ADC. I would imagine that 12-bit resolution is enough for most applications, but I don't know your specific case.
Next, you need to decide how fast you want the PID to be. Should an output on the PWM result in a read on the ADC in the next cycle, or could you settle for a slower response? The real-time bottleneck here will be the ADC conversion clock, not the CPU.
The rest of the system doesn't seem time-critical at all; you just have to ensure that everything is read/set synchronously. The data transmission to/from the host should preferably be done over CAN, since it comes with hard real-time characteristics. It doesn't seem that you need a whole lot of bandwidth.
I have designed systems very similar to this using bare-metal 16-bit MCUs running at 16 MHz. Processing speed is really not a big concern, but meeting real-time deadlines is. That means you can forget about Linux toys like the Raspberry Pi; it's completely out of the question. And an RTOS is likely overkill, since it mostly adds additional complexity.
A bare-metal Cortex-M with sufficient ADC resolution and CAN seems like a good choice. If you can stay away from floating point, that's nice too; it depends on how advanced the math you need is. If you need nothing more advanced than PID, it can be implemented in fixed point just fine. (Or PI rather, since that usually works best for fast motor control systems.)
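To illustrate the fixed-point PI point, here is a minimal sketch in C; the Q15 scaling, gains, and limits are placeholders, not tuned values:

```c
#include <stdint.h>

/* Fixed-point PI controller with Q15-scaled gains (placeholders). */
typedef struct {
    int32_t kp;        /* proportional gain, scaled by 2^15 */
    int32_t ki;        /* integral gain, scaled by 2^15 */
    int32_t integral;  /* accumulated integral term */
    int32_t out_min;   /* output clamps, also used for anti-windup */
    int32_t out_max;
} pi_t;

int32_t pi_update(pi_t *pi, int32_t setpoint, int32_t measured)
{
    int32_t error = setpoint - measured;

    /* 64-bit intermediate avoids overflow in the Q15 multiply. */
    pi->integral += (int32_t)(((int64_t)pi->ki * error) >> 15);

    /* Anti-windup: keep the integral within the output range. */
    if (pi->integral > pi->out_max) pi->integral = pi->out_max;
    if (pi->integral < pi->out_min) pi->integral = pi->out_min;

    int32_t out = (int32_t)(((int64_t)pi->kp * error) >> 15) + pi->integral;

    if (out > pi->out_max) out = pi->out_max;
    if (out < pi->out_min) out = pi->out_min;
    return out;  /* e.g. written to the PWM duty register */
}
```

Call pi_update() from the same timer interrupt that reads the ADC, so the control period stays fixed.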

Which MCU (Cortex-M) for a time-critical GPIO application?

We have an application which runs on a PIC24H, and we would like to port it to another MCU, preferably an ARM Cortex. The application is extremely time-critical, meaning that we need extremely deterministic code behaviour. In short, pulses arrive from special hardware on GPIO pins, and the data is analyzed right away. Processing the data is not complex (we don't need a beefy CPU/MCU for it). After analyzing the data, GPIO output pins are written with their values.
App in 3 short lines:
process input pins
determine pattern within processing of input pins
based on the received pattern write output pins
The PIC24H works at 40 MHz and can toggle a pin in 25 ns; we would be happy with at least 2x that speed for future upgrades. So an MCU which can run deterministic code and toggle pins at 80 MHz (12.5 ns) would be just fine. We don't need to toggle the pins at a constant fast rate; we need an MCU which can toggle a pin in less than 25 ns. We can't waste cycles while toggling: if one cycle is off, we lose synchronization. Everything must be done with one-cycle precision (or two, but a constant two cycles), so the code must be 100% deterministic.
Please let me know if I'm missing something, or if what we need can be done using some other method on a Cortex-M. Just keep in mind that if one cycle is lost (due to cache or similar) we lose signal sync and the app will not do its work correctly, or at all.
Thanks!
According to this blog post, the interrupt latency for Cortex-M ranges from 12 to 16 cycles (assuming you are not using FPU registers) with best-case memories. M0 and M0+ are slower than M3/M4/M7. On top of this, you need to add the GPIO access times (and watch out for different clock frequencies between the core and the peripherals). A Cortex-M7 will support higher clock speeds than an M3/M4.
It still isn't clear how many cycles are consumed in recognising a pattern, or how an interrupt helps in doing this. Generally a low-latency interface function like this would be an obvious target for dedicated hardware, but since you have an existing software solution it seems the problem is mis-specified.
Providing you avoid accessing any 'slow' peripherals which might stall the bus, the interrupt latency should be deterministic - any specific device should have documentation which covers this.
NXP have an application note which describes in some detail how to measure what is going on.
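On M3/M4/M7 parts you can also measure it yourself with the DWT cycle counter. A quick sketch: the CMSIS register names are standard, but the device header and the pin being toggled are placeholders for your part:

```c
#include <stdint.h>
#include "stm32f4xx.h"  /* placeholder: any CMSIS device header */

/* Enable the DWT cycle counter (Cortex-M3/M4/M7 only, not M0/M0+). */
static void cyccnt_init(void)
{
    CoreDebug->DEMCR |= CoreDebug_DEMCR_TRCENA_Msk;  /* enable trace block */
    DWT->CYCCNT = 0;
    DWT->CTRL  |= DWT_CTRL_CYCCNTENA_Msk;            /* start counting */
}

/* Count the core cycles a pin toggle actually takes. */
static uint32_t measure_toggle(void)
{
    uint32_t start = DWT->CYCCNT;
    GPIOA->ODR ^= (1u << 5);     /* placeholder pin */
    return DWT->CYCCNT - start;
}
```

Wrapping the same measurement around your ISR entry gives you the real, device-specific latency rather than the datasheet figure.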

Generating a tone with a PWM signal to a speaker on a PIC32 microcontroller

I'm currently working on generating a tone on a PIC32 device. The information I've found has not been enough to give me a complete understanding of how to achieve this. As I understand it, a PWM signal sends 1's and 0's with a specified duty cycle and frequency, such that it's possible, for example, to make something rotate at a certain speed. But to generate a tone, this alone is not enough. I'm primarily focusing on the following two links to create the code:
http://umassamherstm5.org/tech-tutorials/pic32-tutorials/pic32mx220-tutorials/pwm
http://www.mikroe.com/chapters/view/54/chapter-6-output-compare-module/#ch6.4
And also the relevant parts in the reference manual.
One of the links states that to play audio it's necessary to use timer interrupts. How should these be used? Is it necessary to compute the value of the wave, with for example a sine function, and then use the timer interrupt to set a new duty cycle each time the interrupt fires?
The end result will be a program that responds to button presses and plays sounds. If a low pass filter is necessary this will be implemented as well.
If you're using PWM to simulate a DAC and output arbitrary audio (for a simple and dirty tone of a given frequency you don't need this complexity), you want to take audio samples (PCM) and convert each one into the corresponding duty cycle.
Reasonable audio begins at sample rates of 8 kHz (POTS). So for every sample (every 1/8000th of a second) you'll need to change the duty cycle. And you want these changes to be regular, as irregularities will contribute to audible distortion. So you can program a timer to generate interrupts at an 8 kHz rate and, in the ISR, change the duty cycle according to the new audio sample value (the ISR has to read the samples from memory, unless they form a simple pattern and can be computed on the fly).
When you change the duty cycle at a rate of 8 kHz you generate a periodic wave at 4 kHz, which is well within the audible range. Filtering it out in analogue circuitry without affecting the sound that you want to hear is not easy (sharp LPFs are tricky/expensive, cheap filters are poor). Instead you can raise the sample rate, either to above twice what the speaker can produce (or the human ear can hear), or at least well above the maximum frequency that you want to produce (in the latter case a cheap analogue filter can remove the unwanted periodic wave without much effect on what you want to hear, since you don't need as much sharpness there).
Be warned: if the sample rate is higher than that of your audio file, you'll need a proper upsampler/sample-rate converter. Also remember that raising the sample rate raises CPU utilization (the ISR is invoked more times per second, plus sample-rate conversion, unless your audio is pre-converted) and power consumption.
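To make the ISR idea concrete, here is a sketch for a PIC32 with XC32; the timer/OC assignments, vector name, and sine table are assumptions to adapt to your part. Timer 2 is assumed to provide the fast PWM timebase (configured elsewhere), while Timer 3 fires at the 8 kHz sample rate:

```c
#include <xc.h>
#include <sys/attribs.h>
#include <stdint.h>

/* 32-entry, 8-bit sine table (placeholder waveform data). */
static const uint8_t sine[32] = {
    128,152,176,198,218,234,245,253,255,253,245,234,
    218,198,176,152,128,103, 79, 57, 37, 21, 10,  2,
      0,  2, 10, 21, 37, 57, 79,103
};
static volatile uint8_t idx = 0;

/* Timer 3 ISR at the sample rate (e.g. 8 kHz): each tick loads the
 * next sample as the new PWM duty cycle on OC1. */
void __ISR(_TIMER_3_VECTOR, IPL3SOFT) Timer3Handler(void)
{
    OC1RS = sine[idx];      /* assumes the PWM period is scaled 0..255 */
    idx = (idx + 1) & 31;   /* wrap around the table */
    IFS0bits.T3IF = 0;      /* clear the interrupt flag */
}
```

With a 32-entry table stepped at 8 kHz this produces a 250 Hz sine; stepping through the table at different strides changes the pitch.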
[I've done this before on my PC's speaker, but it's now ruined, thanks to SMM/SMIs used by the BIOS and the chipset.]
For playing simple tones through PWM you first need a driver circuit, since the PIC cannot drive a speaker directly. Typically a push-pull stage is used, as actively driving both high and low gives better speaker response. It also allows for a series capacitor, acting as a simple high-pass filter to protect the speaker from long DC periods.
This, for example, should work: http://3.bp.blogspot.com/-FFBftqQ0o8c/Tb3x2ouLV1I/AAAAAAAABIA/FFmW9Xdwzec/s400/sound.png
(source: http://electro-mcu-stuff.blogspot.be/ )
The PIC32 has hardware PWM that you can program to generate PWM at a specific frequency and duty cycle. The PWM frequency controls the tone, so by changing the PWM frequency at intervals you can play simple music. The duty cycle affects the volume, but not linearly: high duty cycles come very close to pure DC and will be cut off by the capacitor, while low duty cycles may be inaudible. Some experimentation is in order.
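The arithmetic for retuning is just the timer period: with a peripheral bus clock PBCLK and prescaler N, the period register for a tone of frequency f is PBCLK / (N * f) - 1. A hedged sketch, assuming a PIC32 Timer 2 / OC1 PWM setup and placeholder clock values:

```c
#include <xc.h>
#include <stdint.h>

#define PBCLK    40000000u  /* placeholder peripheral bus clock, Hz */
#define PRESCALE 8u         /* must match the timer's configured prescaler */

/* Retune the PWM to a new tone frequency at ~50% duty. */
void play_note(uint32_t freq_hz)
{
    uint32_t period = PBCLK / (PRESCALE * freq_hz) - 1u;
    PR2   = (uint16_t)period;         /* Timer 2 period sets the frequency */
    OC1RS = (uint16_t)(period / 2u);  /* 50% duty */
}

/* e.g. play_note(440) for A4; silence by setting OC1RS = 0 */
```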
The link mentions timer interrupts because it is not talking about playing simple notes, but about using PWM plus a low-pass filter as a simple DAC to play real audio. In that case timer interrupts are used to update the duty cycle with the next PCM sample at regular intervals (the sampling rate).

One wire protocol in SIM800 through bit banging

Is it possible to implement the 1-Wire protocol on a SIM800 through bit banging? The time required to change the direction of a pin (input or output) is 1.5 µs, and the time required to change the state of a pin (low or high) is 1.5 µs.
The Dallas/Maxim 1-Wire protocol is self-clocking; it is deliberately designed to make it easy to implement this way. Using a hardware timer would be a good idea, removing a good deal of software overhead, but even a low-precision RC oscillator is likely to be accurate enough.
1-Wire is self-clocking, so I assume the timings you quote are minimum timings, not required timings; the protocol has wide tolerances for the actual bit timing, and the inter-bit timing merely requires that the line is high for more than 1 µs, but it may be any length. It only needs to be long enough to detect a definite edge, on an input-capture timer or edge-triggered interrupt for example. You could poll the line in software, but if your application needs to get other work done at the same time, a 1 µs pulse may be missed.
It is not clear to me how the timings you quote are defined, but if they simply refer to the duration of the edge, the 1.5 µs you mention is not a software issue: that is down to the slew rate of the I/O pin, which is largely a function of the line characteristics. For short-distance communication, you'd have to really mess up the hardware design to get switching that slow.
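For reference, the two bit slots then look something like this; ow_drive_low(), ow_release(), ow_read_pin(), and delay_us() are placeholders for the SIM800 GPIO and timing primitives, and the delays are the standard-speed figures from the Maxim application notes, which leave plenty of slack for your 1.5 µs direction changes:

```c
#include <stdint.h>
#include <stdbool.h>

/* Platform hooks: placeholders for the SIM800 GPIO/timer API. */
extern void ow_drive_low(void);  /* pin as output, driven low */
extern void ow_release(void);    /* pin as input, pull-up raises the line */
extern bool ow_read_pin(void);
extern void delay_us(uint32_t us);

/* Write slot: both cases start with a low pulse; a '1' releases
 * early (~6 us), a '0' holds the line low for the whole ~60 us. */
static void ow_write_bit(bool bit)
{
    ow_drive_low();
    delay_us(bit ? 6 : 60);
    ow_release();
    delay_us(bit ? 64 : 10);  /* pad to a full slot plus recovery */
}

/* Read slot: short low pulse, release, then sample ~9 us later. */
static bool ow_read_bit(void)
{
    ow_drive_low();
    delay_us(6);
    ow_release();
    delay_us(9);
    bool bit = ow_read_pin();
    delay_us(55);             /* let the slot finish */
    return bit;
}
```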

DAC signal generator on an STM32

I am programming the DAC peripheral of an STM32F2xx. I have an array of bytes (sound data) and I would like to generate a signal with a sample rate of 8 kHz.
Now my question is:
How do I specify sample rate?
Note:
I have googled a lot, but I only find triangle-wave and sine-wave generation using DMA. I don't want to use DMA.
Thanks in advance for any help.
It's not practical to play waveforms out of the DAC without using DMA. You set up the DMA with your samples, and you set up the DAC to use a timer as the trigger. Then you set up your timer to trigger at your desired sample rate.
I would agree with TJD that in general it is not practical to do so without DMA; however, it is not impossible, particularly at a low sample rate.
One could use a timer set to trigger every 1/8000th of a second as the fixed time base. From there, the interrupt routine would need to load up the next sample into the DAC. The sample rate could be varied by changing the timer's time base.
It would be a similar effort to configure the DMA controller compared to writing the code that moves each sample into the DAC register. However, the DMA approach would be more reliable, would likely exhibit less jitter in the sample rate, and frees up the core to execute other code. In fact, with the TIM/DMA/DAC setup, you may be able to halt the core or enter a sleep mode that keeps the peripheral clocks running.
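A sketch of that interrupt-driven approach with the STM32 HAL; the handles, the sample buffer, and the timer configured to overflow at 8 kHz are placeholders set up elsewhere (e.g. by CubeMX):

```c
#include "stm32f2xx_hal.h"  /* assumes an STM32F2 HAL project */
#include <stdint.h>

extern DAC_HandleTypeDef hdac;    /* DAC configured elsewhere */
extern const uint8_t  sound[];    /* your 8-bit sample array */
extern const uint32_t sound_len;

static volatile uint32_t pos = 0;

/* HAL calls this when the timer (set to overflow at the sample
 * rate, here 8 kHz) expires; push the next sample into the DAC. */
void HAL_TIM_PeriodElapsedCallback(TIM_HandleTypeDef *htim)
{
    if (htim->Instance != TIM6)   /* assumed sample-rate timer */
        return;
    HAL_DAC_SetValue(&hdac, DAC_CHANNEL_1, DAC_ALIGN_8B_R, sound[pos]);
    if (++pos >= sound_len)
        pos = 0;  /* loop the waveform */
}
```

The sample rate is therefore whatever you program into the timer's prescaler and period: for 8 kHz from a 60 MHz timer clock, a period of 7500 - 1 with no prescaler.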
Yes, I agree with TJD too.
Using DMA is efficient and frees up the CPU for other tasks [good].
Managing the timing in software (the core in a busy loop) will not produce good results [bad], so use a timer for the timing [good].
For the copying, you then have to dedicate the CPU to move data into the DAC register after each interval (from the busy loop or a timer timeout) [bad].
In the end I recommend connecting the DMA to the timer so that, on each timeout, the DMA copies the next sample to the DAC register [good]. This solution only appears hard; it is actually much easier to work with once set up.
[Note: written from the point of view of someone who is trying to understand/start on something like this.]
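For comparison, the recommended timer-plus-DMA setup is only a couple of HAL calls once the DAC is configured with a timer trigger; the handles and buffer below are placeholders (e.g. from CubeMX), with TIM6 assumed as the trigger source:

```c
#include "stm32f2xx_hal.h"
#include <stdint.h>

extern DAC_HandleTypeDef hdac;   /* DAC configured with TIM6 TRGO trigger */
extern TIM_HandleTypeDef htim6;  /* update rate = desired sample rate */
extern uint8_t  sound[];         /* 8-bit samples */
extern uint32_t sound_len;

void start_playback(void)
{
    /* The timer paces the DAC; DMA feeds it without CPU involvement. */
    HAL_TIM_Base_Start(&htim6);
    HAL_DAC_Start_DMA(&hdac, DAC_CHANNEL_1,
                      (uint32_t *)sound, sound_len, DAC_ALIGN_8B_R);
}
```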
