Agilent 34420A transfer accuracy

Started by acbern, 05-06-2014 -- 11:07:22


acbern

The Agilent 34420A, unlike other nanovolt meters such as the Keithley ones, does not have a specified transfer accuracy. In its absence, one unfortunately needs to come up with a value based on one's own assessment when doing error propagation calculations during calibration.
I am mostly interested in the 10mV range, as I am using it for thermal converter calibration. I am fine with a conservative value rather than an aggressive one, which should then be validated by measurements (these should actually come out much better). Measurements alone, as some have done e.g. for the 3458A resistance transfer accuracy, are not what I would like to rely on: they are based on typical data only, not on any manufacturer specs, and may be artificially good at the specific test values by coincidence. But again, measurements should be done, just to validate the calculation (this could e.g. be done with a stable and specified Fluke 5440/5720/4808 or the like).

The 34420A manual specifies some core values applicable to assessing the transfer accuracy. These are (+/- ppm of value + ppm of range; the 10mV range values are listed in brackets below):
- thermal drift: (4 + 2)/K
- linearity: (0.8 + 0.5)
- general measurement uncertainty: (2 ppm of range, equal to 20nV basic uncertainty in the 10mV range)
For details see the 34420A data sheet/manual.

This would lead to a transfer uncertainty of a conservative and rounded +/- 3 ppm of value + 4.5 ppm of range (all added linearly; the RSS method was not applied, to be conservative, although some values are not correlated) over a +/- 0.5K temperature window (holding +/- 0.5K is well doable for a 10 minute period while doing the transfer measurement in a somewhat controlled environment). This is actually not a very exciting result for a transfer accuracy if one has the 3458A in mind, especially if the applied voltage is not full scale. But at the same time, we are talking about measurements below 10mV at nV resolution, which should not be forgotten. At 10mV that equals only +/- 75nV. The Keithley 182 has a specified 5+9 ppm in the 3mV range and 3+2 ppm in the 30mV range (over +/- 1K though), so the calculation above is pretty conservative, I would think. The question is whether there is any justification based on the 34420A data sheet values to derive less conservative numbers. The only option that comes to mind is applying the RSS method, as sketched below.
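For illustration, here is a minimal Python sketch of that budget, using the data sheet components listed above and an assumed +/-0.5K window. The component values are the ones quoted in this post, not taken from any further Agilent document, and the straight linear sum comes out a bit below the rounded 3 + 4.5 ppm figure, which therefore carries some extra margin.

import math

# 34420A 10 mV range components quoted above (ppm of value, ppm of range).
# These are assumptions taken from this post -- check them against your own data sheet.
delta_T = 0.5                          # assumed temperature window, +/- K
thermal = (4 * delta_T, 2 * delta_T)   # thermal drift spec scaled by the temperature window
linearity = (0.8, 0.5)
general = (0.0, 2.0)                   # 2 ppm of range basic measurement uncertainty

components = [thermal, linearity, general]

# Worst-case linear sum (the conservative approach used above)
lin_value = sum(c[0] for c in components)
lin_range = sum(c[1] for c in components)

# RSS combination, which could be argued for the uncorrelated components
rss_value = math.sqrt(sum(c[0] ** 2 for c in components))
rss_range = math.sqrt(sum(c[1] ** 2 for c in components))

reading_mV, range_mV = 10.0, 10.0      # full-scale example: 1 ppm of 10 mV = 10 nV

def total_nV(ppm_value, ppm_range):
    return ppm_value * reading_mV + ppm_range * range_mV

print(f"linear: {lin_value:.1f} + {lin_range:.1f} ppm -> {total_nV(lin_value, lin_range):.0f} nV")
print(f"RSS:    {rss_value:.1f} + {rss_range:.1f} ppm -> {total_nV(rss_value, rss_range):.0f} nV")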

Has anyone done a similar assessement, or what value are you guys using for the 34420A transfer accuracy or is there any reference somewhere?

Hawaii596

Maybe try contacting Agilent to see if they have any numbers they could provide.
"I often say that when you can measure what you are speaking about, and express it in numbers, you know something about it; but when you cannot measure it, when you cannot express it in numbers, your knowledge is of a meagre and unsatisfactory kind."
Lord Kelvin (1824-1907)
from lecture to the Institute of Civil Engineers, 3 May 1883

acbern

I have asked Agilent that question but have not received a reply yet. I will let you know, should there ever be an answer other than "it is not specified in the data sheet".

N79

I realize this isn't what you're asking, but you may want to avoid the 34420A for thermal converter output monitoring and stick to something with better common-mode rejection (like your K182 or K2182).

acbern

Thanks for your comment! Can you please clarify why you think this is critical when comparing one to the other?
Both the 2182 and the 34420A have a 70dB AC CMRR; what am I missing?
By the way: no outcome yet from Agilent regarding the topic of this post, and I am not sure there will be any.
NIST in any case uses the 34420A and the 182 within their TVC calibrators.

N79

[check your PMs, acbern]

I'd say you could ignore the thermal drift component of the specification, as your transfer measurements are taken back to back and any temperature drift effect on the result would be captured in the repeatability component of the overall uncertainty.

I would stick with just the linearity component. I didn't see any information in the data sheet on the statistical distribution of those specs, so I would assume a rectangular distribution. You would need to divide your calculated result by sqrt(3) to normalize it so it can be combined with the other components of your uncertainty (see the sketch below).
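A quick sketch of that normalization, assuming the 10mV range linearity figures quoted earlier in the thread taken as the half-width of a rectangular distribution (an assumption, not an Agilent statement), evaluated at a full-scale reading:

import math

# Linearity spec for the 10 mV range as quoted above: 0.8 ppm of value + 0.5 ppm of range,
# treated as the half-width of a rectangular distribution.
reading_mV, range_mV = 10.0, 10.0
half_width_nV = 0.8 * reading_mV + 0.5 * range_mV   # ppm * mV gives nV

u_std_nV = half_width_nV / math.sqrt(3)             # standard uncertainty, k=1
print(f"linearity half-width: {half_width_nV:.1f} nV, standard uncertainty: {u_std_nV:.1f} nV")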

acbern

First of all, there was no response from Agilent. It's not in the manual, and so nobody can answer, kind of.
Given this, and the fact that using the data sheet alone would mean a lot of guesswork without knowing exactly what each parameter means (what mechanism, e.g., is behind the range-related drift), I decided instead to do a type A uncertainty characterization of my meter. This obviously applies to my meter only, although others are probably similar. So I ran it in sessions of 20 minutes, with measurements taken every 2 minutes (I am only interested in short-term, relative deviations), hooked up to a 5440B. Temperature drift during that time was no more than 0.2K. The assumption was that all voltage drifts are attributable to the 34420A, the 5440B being ideally stable (it is not, so the results are conservative). The digital filter was on, although this did not make much difference, as shown in one trial test loop with the filter off (which was surprising; the internal noise seems pretty good). A sketch of the per-session evaluation follows the results below.

I ended up with the following:
- 6mV input: 1ppm short-term drift (2 min), 2ppm drift over 20 minutes, i.e. standard deviation, k=2
- 2mV input: 3ppm and 3ppm, same parameters
(I assume the short-term noise, e.g. from air movement causing thermal EMF voltages, contributes to the short-term drift at that low voltage.)
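For anyone wanting to reproduce this kind of evaluation, here is a minimal Python sketch. The readings are made-up placeholders, not the actual logged values, and the successive-difference treatment of the 2-minute figure is only one possible way to compute it.

import statistics

# Sketch of the type A evaluation described above: one 20-minute session at a nominal
# 6 mV input, one reading every 2 minutes. The readings below are placeholders for
# illustration only -- substitute your own logged 34420A values.
readings_V = [6.000000e-3, 6.000006e-3, 5.999998e-3, 6.000004e-3, 6.000010e-3,
              6.000002e-3, 5.999996e-3, 6.000008e-3, 6.000003e-3, 6.000005e-3,
              6.000001e-3]

mean_V = statistics.mean(readings_V)

# 20-minute relative standard deviation in ppm, expanded with k=2 as reported above
s20_ppm = statistics.stdev(readings_V) / mean_V * 1e6

# one possible reading of the 2-minute figure: scatter of successive differences
diffs = [b - a for a, b in zip(readings_V, readings_V[1:])]
s2_ppm = statistics.stdev(diffs) / mean_V * 1e6

print(f"20 min scatter: {2 * s20_ppm:.1f} ppm (k=2)")
print(f"2 min successive-difference scatter: {2 * s2_ppm:.1f} ppm (k=2)")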

The results are in line with the fact that this meter is used in thermocouple characterizations with reported uncertainties down to around 5ppm, of which the 34420A is one of the contributors.
Comments welcome.