<!DOCTYPE HTML PUBLIC "-//W3C//DTD HTML 4.0 Transitional//EN">
<HTML><HEAD>
<META http-equiv=Content-Type content="text/html; charset=iso-8859-1">
<META content="MSHTML 6.00.2900.2523" name=GENERATOR></HEAD>
<BODY style="MARGIN-TOP: 2px; FONT: 8pt Tahoma; MARGIN-LEFT: 2px">
<DIV><FONT size=2>I suspect others will give technical responses about the
ability of these temperature controllers to function at the +/- 0.1 C
level. My take is slightly different. </FONT></DIV>
<DIV><FONT size=2></FONT> </DIV>
<DIV><FONT size=2>What matters is not the set point but knowing the actual
temperature.</FONT></DIV>
<DIV><FONT size=2></FONT> </DIV>
<DIV><FONT size=2>To my knowledge most users never use the Stafford & Liu
method to calibrate the temperature on their machine. (Can we vote on the
RASMB in some way?) It is no more tedious than waiting three hours before
starting a run. I have checked the calibration on my machine twice, once
in '93 when I got it, and again in 1999 when a result caused me to question
the temperature accuracy. A series of measurements up to 40 C and
back down to 4 C (with parallel measurements in a spectrophotometer) takes about two
days. It does not take three hours to go from 20 to 25 C, as verified by
the stability of the area under the CoCl2/ethanol curve. At a setting of 4
C I get 3.6, at 20 I get 19.7, at 40 I get 39.8. The values drifted by 0.1
C in 7 years. So if by accuracy we mean +/- 0.5 C of the setting, my machine
is within spec. When I fit data I use the actual temperature, not the set
point. & I trust the reading on the screen at low vacuum, hitting start when
it reports the set temperature.</FONT></DIV>
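<DIV><FONT size=2></FONT> </DIV>
<DIV><FONT size=2>For anyone who wants to do the same bookkeeping, here is a minimal
Python sketch (purely illustrative, with made-up helper names; it is not part of the
Stafford & Liu procedure or any Beckman software) that turns a few calibration points
like those above into a set-point correction:</FONT></DIV>
<PRE>
# Minimal sketch: fit a linear correction from console set point to the
# Stafford & Liu (CoCl2/ethanol) calibrated temperature, using the three
# example pairs quoted above. Illustrative only.
import numpy as np

set_point = np.array([4.0, 20.0, 40.0])   # console setting, C
measured  = np.array([3.6, 19.7, 39.8])   # calibrated actual temperature, C

slope, intercept = np.polyfit(set_point, measured, 1)

def actual_temperature(setting):
    """Estimate the true rotor temperature for a given console setting."""
    return slope * setting + intercept

print(actual_temperature(25.0))  # temperature to use when fitting data from a 25 C run
</PRE>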
<DIV><FONT size=2></FONT> </DIV>
<DIV><FONT size=2>Comparisons between uncalibrated machines honestly make no
sense to me. Calibrate & be done with it. (Arthur, if you
do that in Nottingham & still get different values of S between machines I
would be concerned, & amazed.) </FONT></DIV>
<DIV><FONT size=2></FONT> </DIV>
<DIV><FONT size=2>I also use the calibrated temperature values when I measure
density in an Anton Paar DMA 5000. It has a Peltier cell good between 0 and
80 C, so I dial in 19.7 instead of 20 C and measure away. I suspect the
error from a calculated density (or viscosity & vbar) is larger than,
albeit coupled to, the error from an assumed temperature. </FONT></DIV>
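<DIV><FONT size=2></FONT> </DIV>
<DIV><FONT size=2>To put a rough number on that coupling, here is a small illustrative
Python sketch (my own numbers, using a common empirical correlation for water viscosity;
nothing here comes from Anton Paar or Beckman) of how a 0.3 C offset moves the viscosity,
and hence an s value, which scales as 1/viscosity:</FONT></DIV>
<PRE>
# Illustrative only: sensitivity of water viscosity (Pa*s) to a 0.3 C
# temperature error, via a common empirical correlation (T in kelvin).
def water_viscosity(t_celsius):
    T = t_celsius + 273.15
    return 2.414e-5 * 10 ** (247.8 / (T - 140.0))

eta_assumed = water_viscosity(20.0)   # the temperature you thought you had
eta_actual  = water_viscosity(19.7)   # the calibrated temperature

# About a 0.7% shift in viscosity, and therefore about 0.7% in s.
print((eta_actual / eta_assumed - 1.0) * 100.0)
</PRE>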
<DIV><FONT size=2></FONT> </DIV>
<DIV><FONT size=2>At this point the inquiring "student" should assume
errors in various parameters and propagate them into S or MW by the appropriate
equations - my favorite book for teaching this is Bevington, "Data Reduction and
Error Analysis for the Physical Sciences".</FONT></DIV>
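<DIV><FONT size=2></FONT> </DIV>
<DIV><FONT size=2>As a sketch of the kind of exercise I mean (with made-up numbers, and
treating the errors as independent, which understates the coupling noted above), here is
Bevington-style first-order propagation through the Svedberg equation:</FONT></DIV>
<PRE>
# Illustrative only: propagate errors in s, D, T, vbar and rho into the molar
# mass from the Svedberg equation M = s*R*T / (D*(1 - vbar*rho)).
import math

R = 8.314  # J / (mol K)

def svedberg_mass(s, D, T, vbar, rho):
    return s * R * T / (D * (1.0 - vbar * rho))

# Hypothetical values in SI units: s in s, D in m^2/s, T in K,
# vbar in m^3/kg, rho in kg/m^3.
s, ds       = 4.5e-13, 0.05e-13
D, dD       = 6.0e-11, 0.15e-11
T, dT       = 293.15, 0.3           # a 0.3 K temperature uncertainty
vbar, dvbar = 0.73e-3, 0.01e-3
rho, drho   = 1.000e3, 0.5

M = svedberg_mass(s, D, T, vbar, rho)

# Analytic partial derivatives, contributions added in quadrature.
dM_ds    = M / s
dM_dD    = -M / D
dM_dT    = M / T
dM_dvbar = M * rho / (1.0 - vbar * rho)
dM_drho  = M * vbar / (1.0 - vbar * rho)

sigma_M = math.sqrt((dM_ds * ds) ** 2 + (dM_dD * dD) ** 2 + (dM_dT * dT) ** 2
                    + (dM_dvbar * dvbar) ** 2 + (dM_drho * drho) ** 2)

print(M, sigma_M)  # molar mass (kg/mol) and its propagated uncertainty
</PRE>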
<DIV><FONT size=2></FONT> </DIV>
<DIV><FONT size=2>PS - it amazes me that Beckman has never joined up with Anton Paar
to bundle density meters into XLA/XLI quotes.</FONT></DIV>
<DIV><FONT size=2></FONT> </DIV>
<DIV><FONT size=2>PPS - Walter claims to keep the original solution from the CoCl2
calibration work around, in a cell, and years later it still
gives the same results. Quick checking may not be as difficult as one
might think.</FONT></DIV>
<DIV> </DIV>
<DIV> </DIV>
<DIV> </DIV>
<DIV>-------------------------------------------------------------------<BR>Dr. John J. "Jack" Correia<BR>Department of Biochemistry<BR>University of Mississippi Medical Center<BR>2500 North State Street<BR>Jackson, MS 39216<BR>(601) 984-1522<BR>fax (601) 984-1501<BR>email address: <A href="mailto:jcorreia@biochem.umsmed.edu">jcorreia@biochem.umsmed.edu</A><BR>homepage location: <A href="http://biochemistry.umc.edu/correia.html">http://biochemistry.umc.edu/correia.html</A><BR>dept homepage location: <A href="http://biochemistry.umc.edu/">http://biochemistry.umc.edu/</A><BR>-------------------------------------------------------------------<BR><BR><BR>>>>
Arthur Rowe <arthur.rowe@nottingham.ac.uk> 12/10/04 08:17AM
>>><BR>Hi Everyone <FONT color=#000080>{this is a second (</FONT><FONT
color=#ff00ff>now 3rd!</FONT><FONT color=#000080>) try at getting this mail out
- first attempt got lost in cyber-space, it seems}<BR></FONT><BR>Mei-Ling Chien
gives us a very useful review of the nature of the temperature measurement and
control system in the XL-I/A instrument. However, I do not think that this fully
addresses the problems which one has in determining what the absolute
temperature of one's sample actually <I>is</I> when it is going round in the
rotor at speed.<BR><BR>It is, of course, only a worry to the (very limited)
number of people for whom an absolute s value is of importance, normally for
hydrodynamic modelling purposes (although formulation issues should not be
forgotten). When I raised this issue on RASMB a week or so back, my concern was
not "<FONT color=#ff0000><TT>to ensure their operation within the
published specification</TT></FONT>". I am trying to get the
<U>accuracy</U> of the temperature read-out to be close to the <U>precision</U>
of which the system is capable. I have no evidence at all to suggest that the
accuracy is outside the quoted spec of 0.5°. It is just that I - in my greedy
way - want 0.1°.<BR><BR>Even the method mentioned (equilibrate for 3 hours -
under vacuum - and then check "<FONT color=#ff0000><TT>with a calibrated
external temperature sensing device to verify accuracy</TT></FONT>")
is not unambiguous in what it will yield. Quite apart from matters
such as adiabatic effects when one releases the vacuum to use an "external
temperature sensing device", can one be sure that the thermal emissivity of a
spinning rotor surface, averaged over everything that is passing by, is equal to
that of a piece of the rotor surface 'seen' in a stationary rotor? <BR><BR>None
of these are new concerns, and I certainly lay no claim to the IPRs! I imagine,
from what Mei-Ling Chien has communicated, that we at least know clearly that
the ±0.5° refers to the accuracy of the temperature <U>as measured by the
defined procedure</U>. Walter Stafford's colorimetric method (Stafford &
Liu) did not suggest the presence of errors outside the stated accuracy limit,
and is surely a valid way to approach the absolute temperature issue. But it is
pretty tedious to use as a procedure, and as a routine QA method it is certainly
not feasible. <BR><BR><FONT color=#008000><I>As an approach to the size of the
problem, would there be support for Borries Demeler's suggestion (a single
sample to be circulated and multiple users on multiple machines to report an s
value under defined conditions)? After all, the NCMH + Borries's Lab gives us 6
machines for starters.<BR></I></FONT><BR>Anyway, we keep trying here to
locate the holy grail - a simple, cheap, effective method for determining the
in-cell temperature to ±0.1°.<BR><BR>Regards to all (and many thanks to Mei-Ling
Chien)<BR><BR>Arthur<BR><BR>-- <BR>*************************<BR>Arthur
Rowe<BR>Lab at Sutton Bonington<BR>tel: +44 115 951 6156<BR>fax: +44 115 951
6157<BR>*************************<BR></DIV>
<BLOCKQUOTE><BR><B>From: </B>mchien@beckman.com<BR><B>Date: </B>Fri, 3 Dec
2004 10:11:46 -0800<BR><B>To: </B>"'rasmb@rasmb-email.bbri.org'"
<rasmb@server1.bbri.org><BR><B>Subject: </B>[RASMB] Re: XL-A/I temp
control<BR><BR></BLOCKQUOTE><BR>
<BLOCKQUOTE><TT>Hi
All,<BR><BR>Below is a response regarding XL-A/I temperature control from our
Technical<BR>Support
Department.<BR><BR>******************************************************<BR>Mei-Ling
Chien PhD<BR>Staff Development Scientist, Centrifugation<BR>Platform &
Automation Business Center<BR>Beckman Coulter
Inc.<BR><BR>mchien@beckman.com<BR>(650)
859-1948<BR>******************************************************<BR><BR><BR>The
temperature control specifications were based on the instrument
design<BR>specifications for temperature control and on dynamic system testing
during the<BR>prototype phase of the product.<BR><BR>If there is a discrepancy
in temperature control and measurement between<BR>instruments of the same
design then a dynamic calibration check should be<BR>performed on both
instruments <FONT color=#ff0000>to ensure their operation within the
published<BR>specification.<BR></FONT><BR>First the physical condition of
components within the temperature control and<BR>vacuum system should be
verified through inspection. Then an electronic<BR>calibration for
temperature control and vacuum can be performed. Lastly a<BR>dynamic
test or rotor dunk test is performed (rotor should be precooled
or<BR>preheated to avoid testing delay). The rotor and its contents must
be allowed<BR>to equilibrate for 3 hours or more. When set
temperature equals indicated<BR>temperature at the instrument interface, the
rotor temperature is then checked<BR><FONT color=#ff0000>with a calibrated
external temperature sensing device to verify accuracy</FONT>.<BR><BR>If the
checks fall out of specification then appropriate troubleshooting
is<BR>required to isolate the electronic or mechanical fault in the
temperature<BR>control or vacuum system. Once the fault is corrected the
temperature control<BR>checks are performed again.<BR><BR>Quote from Bob
Giebeler, Analytical Ultracentrifugation in Biochemistry and
Polymer Science, 1992, 16-25, for the Optima XLA/I:<BR>"Temperature
control is considerably more stable, provides more rapid cool-down
and heat-up rates, is thermally more uniform, and has equivalent accuracy as
compared to previous models including the Model E. This control system uses an
isothermal radiometer temperature-sensing system to sense the temperature of the
rotor that is emissivity-independent and view factor-corrected in software.
Heating and cooling of the rotor are accomplished by the refrigeration can that
surrounds the rotor, which is in turn heated and cooled by thermoelectric
modules. This environment is very isothermal, and at equilibrium, irrespective
of speed or temperature, rotor temperature is within about one degree of the
refrigeration can temperature.<BR><BR>The control system that regulates rotor
temperature, as monitored by the radiometer, is highly software-intensive.
This software encompasses triple proportional-integral-differential control
algorithms and proportional-integral smoothing algorithms. In addition,
radiometer view factors are measured during rotor cool-down to allow more rapid
rotor cool-down and more accurate temperature monitoring during cool-down.
While at equilibrium, refrigeration can temperature fluctuation does not
typically exceed ±0.5 C, and the corresponding rotor temperature fluctuation is
less than ±0.2 C."<BR>
</TT></BLOCKQUOTE><BR>
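<DIV><FONT size=2>A generic proportional-integral-derivative (PID) loop, of the kind the
quote above refers to, looks roughly like the sketch below. It is illustrative only and
is in no way Beckman's actual control software.</FONT></DIV>
<PRE>
# Generic PID controller sketch (illustrative only): drives a measured rotor
# temperature toward the set point by combining proportional, integral and
# derivative terms of the error signal.
class PID:
    def __init__(self, kp, ki, kd, dt):
        self.kp, self.ki, self.kd, self.dt = kp, ki, kd, dt
        self.integral = 0.0
        self.prev_error = 0.0

    def update(self, set_point, measurement):
        error = set_point - measurement
        self.integral += error * self.dt
        derivative = (error - self.prev_error) / self.dt
        self.prev_error = error
        # Output would be, e.g., drive to the thermoelectric modules
        # (positive = heat, negative = cool).
        return self.kp * error + self.ki * self.integral + self.kd * derivative
</PRE>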
</BODY></HTML>