[RASMB] Loading concentration

Borries Demeler demeler at biochem.uthscsa.edu
Wed Dec 17 07:18:01 PST 2003


> Borries (and anyone else),
>  
> For a global analysis of sedimentation equilibrium data obtained over a range of wavelengths, do you use extinction coefficients calculated using the XLA absorbance optics, or do you use values obtained with more precise instrumentation?  Often, when one sets a scan at 230 nm, the value written to the data file is off +/- 1 nm.  At 230 nm, this can change the absorbance by a significant amount.  Which value is correct - the one entered or the one written to the data file?  Finally, you indicated you don't trust data below 0.1 OD.  If you have collected an absorbance scan that ranges from below 0.1 to a value around, say, 0.9, do you exclude data below 0.1 in the analysis?  I agree one should not trust data if the entire scan is below 0.1 OD, but if the data spans a wide range of ODs, what is the most appropriate data to include?
>  
> Thanks,
>  
> N. Karl Maluf

Hello Karl,
Yes, you are correct: it is important to account for the difference in
extinction coefficient when globally fitting data taken at different
wavelengths. Add to that the problem that the monochromator often does
not reset to the same lambda if you change it during the run, especially
around 230 nm, where the extinction shoulder is steep and a few nm make
a big difference in absorbance.

So here is what I do: let's say I measure 3 loading concentrations at 230 nm
and 3 loading concentrations at 280 nm. Then I do a wavelength scan
from 220 - 340 nm for each concentration, giving me 6 scans. The 280 nm
scans will generally be off scale at 220 nm, but there is always overlap.
I then fit the wavelength scans with a sum of Gaussians (usually 5 terms
is enough), requiring that the peak positions, widths, and relative peak
heights are global, while the amplitude of the entire sum can vary for
each scan (to account for the different concentrations). This results in
an intrinsic extinction curve from 220 to 340 nm.
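The global Gaussian fit described above can be sketched as follows. This is a minimal illustration, not UltraScan's actual implementation; it assumes scipy is available, and the function names, demo scans, and all numbers are invented for the example. Three Gaussian terms are used here because the synthetic data only needs three; as noted above, around five is usually enough for real data.

```python
import numpy as np
from scipy.optimize import least_squares

N_TERMS = 3  # 3 terms for this synthetic demo; ~5 is typical for real data

def gaussian_sum(wl, centers, widths, heights):
    # Intrinsic extinction shape: a sum of Gaussian terms.
    return sum(h * np.exp(-((wl - c) ** 2) / (2.0 * w ** 2))
               for c, w, h in zip(centers, widths, heights))

def residuals(params, scans):
    # Shared shape parameters come first, then one amplitude per scan,
    # so peak positions, widths, and relative heights are global while
    # each scan's overall amplitude floats with its concentration.
    centers = params[0:N_TERMS]
    widths = params[N_TERMS:2 * N_TERMS]
    heights = params[2 * N_TERMS:3 * N_TERMS]
    amps = params[3 * N_TERMS:]
    return np.concatenate([
        amp * gaussian_sum(wl, centers, widths, heights) - od
        for amp, (wl, od) in zip(amps, scans)
    ])

# Synthetic demo: two scans of the same intrinsic shape at different
# loading concentrations (all numbers made up for illustration).
wl = np.linspace(220.0, 340.0, 121)
shape = gaussian_sum(wl, [228.0, 260.0, 280.0], [6.0, 15.0, 10.0],
                     [1.0, 0.3, 0.6])
scans = [(wl, 0.5 * shape), (wl, 1.5 * shape)]

p0 = np.concatenate([[225.0, 255.0, 285.0],  # center guesses
                     [8.0, 8.0, 8.0],        # width guesses
                     [0.5, 0.5, 0.5],        # relative-height guesses
                     [1.0, 1.0]])            # one amplitude per scan
fit = least_squares(residuals, p0, args=(scans,))
amp_ratio = fit.x[-1] / fit.x[-2]  # recovers the 3:1 concentration ratio
```

Note that the overall scale is shared between the relative heights and the amplitudes, so only the amplitude ratios are meaningful; the ratios are what encode the relative loading concentrations.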

You also have to correct for pathlength (1.2 cm for standard 6-channel
cells) and exclude data above 0.9 OD, since the optics become nonlinear
and noisy above that. Then I take the known molar extinction coefficient
(usually at 280 nm) and normalize the entire curve. Now I have the
correct molar extinction coefficient for *every* wavelength, so I simply
look up the wavelength of each scan and apply the appropriate extinction
coefficient. This method works very well even for scans where the XLA
didn't set the wavelength correctly to the requested value - for example,
228 or 232 nm instead of 230 nm, which happens all the time on my
machine. Beckman told me that it is normal for the monochromator not to
step exactly to the requested value, but that the value reported in the
data file can be relied upon. All of this is implemented in UltraScan,
and the wavelength fitting etc. is applied automatically when doing
global fits of data at different lambdas.
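The pathlength correction, saturation cutoff, and normalization steps can be sketched like this. Again, this is only an illustration of the procedure, not UltraScan code; the function names and the example extinction value are invented, and the demo spectrum is synthetic.

```python
import numpy as np

PATHLENGTH_CM = 1.2   # standard 6-channel equilibrium cell
OD_MAX = 0.9          # absorbance above this is nonlinear and noisy
EPS_280 = 43824.0     # example known molar extinction at 280 nm (made up)

def build_extinction_curve(wavelengths, od, eps_ref, ref_wl=280.0):
    # Drop saturated points, convert to per-cm absorbance, then scale the
    # whole curve so it matches the known molar extinction at ref_wl.
    keep = od <= OD_MAX
    wl, a = wavelengths[keep], od[keep]
    shape = a / PATHLENGTH_CM
    scale = eps_ref / np.interp(ref_wl, wl, shape)
    return wl, shape * scale

def epsilon_at(wl_curve, eps_curve, actual_wl):
    # Look up epsilon at the wavelength actually written to the data file
    # (e.g. 228 or 232 nm when 230 nm was requested).
    return float(np.interp(actual_wl, wl_curve, eps_curve))

# Synthetic demo spectrum: a 280 nm peak plus a steep 230-region shoulder
# that exceeds 0.9 OD and is therefore excluded from the curve.
wl = np.linspace(220.0, 340.0, 121)
od = (0.85 * np.exp(-((wl - 280.0) ** 2) / (2.0 * 12.0 ** 2))
      + 2.0 * np.exp(-((wl - 225.0) ** 2) / (2.0 * 8.0 ** 2)))

wl_curve, eps_curve = build_extinction_curve(wl, od, EPS_280)
eps_228 = epsilon_at(wl_curve, eps_curve, 228.0)
```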

I have written a short tutorial on doing equilibrium experiments that
lists the settings that tend to give good results with the XLA. It has
some practical suggestions for designing a good equilibrium experiment.
You can find it at:
can find it at:

http://www.ultrascan.uthscsa.edu/tutorial/equil_tutorial.html

Sorry for the lengthy message. -Borries
