Does changing the oscilloscope timebase also change the sample frequency? How?
I'm trying to measure a signal that I know should be 10 µs pulses, about 33 ms apart. Or, in other words, a square wave with a duty cycle of roughly 0.03%.
I can't manage to get this to display. I'm now wondering whether adjusting the timebase to something I thought would be reasonable for this signal (somewhere between 10 and 100 ms per division) also changes the sample interval of the Pokit Pro, so that it becomes hit or miss whether the Pokit Pro catches the pulse at all.
Is this a reasonable analysis?
Can anyone tell me the relationship between timebase and sample interval?
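For reference, here is the back-of-the-envelope calculation behind my worry. It assumes a fixed record length that gets spread across the whole screen, which is how many scopes work; the 8192-sample buffer and 10 divisions are just guesses on my part, not the Pokit Pro's actual figures:

```python
# Rough sketch of my reasoning (assumed numbers, not Pokit Pro specs).
record_samples = 8192          # assumed total samples per capture
divisions = 10                 # assumed horizontal divisions on screen
timebase_s_per_div = 50e-3     # e.g. 50 ms/div
pulse_width_s = 10e-6          # the 10 us pulse I'm trying to see

capture_window_s = divisions * timebase_s_per_div      # 0.5 s across the screen
sample_interval_s = capture_window_s / record_samples  # ~61 us between samples
samples_in_pulse = pulse_width_s / sample_interval_s   # ~0.16 samples per pulse

print(f"sample interval: {sample_interval_s * 1e6:.1f} us")
print(f"samples landing inside one 10 us pulse: {samples_in_pulse:.2f}")
```

If that is roughly how it works, then at 50 ms/div only about one pulse in six would even be hit by a sample, which would explain why the trace looks empty most of the time.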