Hi,
I want to produce an analog output voltage that increases over time with a set slope, which I will feed into a potentiostat. At the same time I want to read back the actual voltage and current (both represented by voltage signals), log them, and eventually plot them against each other. For this I have a USB-6008 DAQ device at my disposal.
Creating the analog output signal as a linear ramp was something I managed to do with a while loop and a Time Delay (see attachment). What matters here is that I can set the slope of the linear ramp (e.g. 10 mV/s) and the step size, to get a smooth increment. However, when I also try to measure an analog input signal, it goes wrong.
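To show what I mean by slope and step size, here is the ramp logic in Python-style pseudocode (just to illustrate the intended timing, not actual LabVIEW or DAQ code; the function name is my own invention):

```python
def ramp_setpoints(slope_mv_per_s, step_mv, v_max_mv):
    """Yield (setpoint_mV, dwell_s) pairs: each step lasts step/slope seconds."""
    dwell = step_mv / slope_mv_per_s  # seconds the output holds each value
    v = 0.0
    while v <= v_max_mv:
        yield v, dwell
        v += step_mv

# 10 mV/s slope with 1 mV steps -> a new setpoint every 0.1 s
steps = list(ramp_setpoints(10.0, 1.0, 5.0))
```

In the real VI each setpoint would be written to the analog output and held for the dwell time.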
To reduce the influence of noise I want to measure, for instance, 10 values within 0.1 s and average them. This read-out should be as fast as or faster than the increment set by the slope and step size of the linear ramp. Example: a slope of 10 mV/s is set along with a step size of 10, so every 0.1 s the analog output rises by 1 mV; within that 0.1 s I want to read 10 values at the analog input.
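The averaging I have in mind would look like this in Python-style pseudocode (again only an illustration; `read_sample` stands in for one software-timed analog input read, and the 10 reads must all complete within the 0.1 s dwell of one ramp step):

```python
def averaged_read(read_sample, n=10):
    """Take n quick readings and return their mean to suppress noise."""
    return sum(read_sample() for _ in range(n)) / n

# With simulated noisy readings, the average recovers the underlying value:
fake_readings = iter([1.0, 1.2, 0.8, 1.1, 0.9, 1.0, 1.05, 0.95, 1.02, 0.98])
mean_v = averaged_read(lambda: next(fake_readings), n=10)
```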
Because I use a Time Delay to create the linear ramp, and the analog input is in the same while loop, the delay also affects the analog input and I get an error every time. In separate VIs the analog input and output each work fine, but combined they do not. I have searched this forum for another way to create the ramp, but as I am not an experienced LabVIEW user I have not been able to find one.
To make it a bit more complicated: as I said, I want to measure two analog input signals (one for the voltage of the potentiostat and one for the current, also represented by a voltage signal), and they should be sampled faster than the increment of the analog output. I have not even started on this, because I could not get the read-out of a single channel working.
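For completeness, the two-channel read-out I am aiming for, sketched the same way (`read_ai0`/`read_ai1` are hypothetical stand-ins for reads on the two input channels; this is not working DAQ code):

```python
def averaged_read_two_channels(read_ai0, read_ai1, n=10):
    """Alternate reads on both inputs and return the per-channel means."""
    total0 = total1 = 0.0
    for _ in range(n):
        total0 += read_ai0()  # potentiostat voltage
        total1 += read_ai1()  # current, represented as a voltage signal
    return total0 / n, total1 / n

# Constant fake inputs, just to show the shape of the result:
v_mean, i_mean = averaged_read_two_channels(lambda: 2.0, lambda: 0.5, n=10)
```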
Hopefully someone can help me with this problem.