[3dem] [ccpem] Which resolution?

Matthews-Palmer, Teige t.matthews-palmer14 at imperial.ac.uk
Thu Feb 13 03:12:08 PST 2020


Dear All,
To Daniel’s comment that we want something like an Abbe criterion - well, we have a point resolution for the microscope itself, but it is so much finer than what our reconstructions reach, and the SNR of our single images is so low, that it is irrelevant for the reconstructions - the sampling rate (Nyquist) is more relevant. Can there be an Abbe-like point resolution for the reconstruction…? Others have pointed out that the sampling rate, filtering and sharpening change how the map looks. Does low-pass filtering the map at a very conservative resolution, and over-sampling to have lots of map voxels, give you something like a point resolution in your map? :-/
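To make that last question concrete, here is roughly what I have in mind (a minimal numpy sketch, assuming a cubic map with pixel size apix in Angstrom and a sharp cutoff; the function is my own illustration, nothing standard):

    import numpy as np

    def lowpass_and_oversample(vol, apix, cutoff_res, pad_factor=2):
        """Sharp low-pass filter a cubic map at cutoff_res (Angstrom), then
        oversample it by zero-padding in Fourier space (pad_factor x finer grid)."""
        n = vol.shape[0]
        nyquist_res = 2.0 * apix                        # Nyquist limit in Angstrom
        assert cutoff_res >= nyquist_res, "cannot filter beyond Nyquist"
        ft = np.fft.fftshift(np.fft.fftn(vol))
        freq = np.fft.fftshift(np.fft.fftfreq(n, d=apix))   # cycles / Angstrom
        kx, ky, kz = np.meshgrid(freq, freq, freq, indexing="ij")
        k = np.sqrt(kx**2 + ky**2 + kz**2)
        ft[k > 1.0 / cutoff_res] = 0.0                  # sharp cutoff at the chosen resolution
        m = n * pad_factor                              # embed in a larger zero-filled box
        big = np.zeros((m, m, m), dtype=complex)
        lo = (m - n) // 2
        big[lo:lo + n, lo:lo + n, lo:lo + n] = ft
        out = np.fft.ifftn(np.fft.ifftshift(big)).real * pad_factor ** 3
        return out, apix / pad_factor                   # finer voxels, same information content

The oversampling only interpolates - no information beyond the cutoff is added - which I suppose is exactly the point of the question.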

To Marin & others, could you critique my understanding of why we even use FSC to talk about ‘resolution’ in 3DEM? (No worries if not.)

It seems to me that we want to estimate the ‘resolution limit’ in this figure from Rosenthal & Henderson 2003.
Can anyone precisely define what this ‘resolution’ limit is?
Intuitively, it’s the spatial frequency in our reconstruction where the [signal from scattering of the beam by our molecule / structure factors of our molecule] meets the amplitude of the noise in the reconstruction.
But I don’t think I understand it in a way that precisely relates to our methods.

I figure we use FSC between half-maps to estimate something like that ‘resolution limit’ - using a threshold as a marker of where we have confidence to claim that up to this spatial frequency, our (FT) map's amplitudes and phases [reflect/describe/are determined by] the scattering of the electron beam by the molecule.
I.e. the FSC threshold is our estimate of where the half-map correlation significantly exceeds what can be expected for noise, and therefore we claim it’s due to common structure factors really existing in the molecule.
(Assuming the half-maps don’t have correlated noise thanks to gold-standard reconstruction and phase-randomisation check)
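By "FSC between half-maps" I mean roughly this computation (a minimal numpy sketch for cubic half-maps on the same grid, with shells one Fourier pixel wide; whichever threshold one prefers is then applied to the curve it returns):

    import numpy as np

    def fsc(half1, half2, apix):
        """Fourier Shell Correlation between two cubic half-maps.
        Returns spatial frequencies (1/Angstrom) and the FSC per shell."""
        n = half1.shape[0]
        f1 = np.fft.fftshift(np.fft.fftn(half1))
        f2 = np.fft.fftshift(np.fft.fftn(half2))
        freq = np.fft.fftshift(np.fft.fftfreq(n, d=apix))
        kx, ky, kz = np.meshgrid(freq, freq, freq, indexing="ij")
        r = np.sqrt(kx**2 + ky**2 + kz**2)
        shell = np.round(r * n * apix).astype(int)      # shell index in Fourier pixels
        nshells = n // 2 + 1                            # up to Nyquist
        fsc_vals = np.zeros(nshells)
        for s in range(nshells):
            m = shell == s
            num = np.real(np.sum(f1[m] * np.conj(f2[m])))
            den = np.sqrt(np.sum(np.abs(f1[m])**2) * np.sum(np.abs(f2[m])**2))
            fsc_vals[s] = num / den if den > 0 else 0.0
        freqs = np.arange(nshells) / (n * apix)         # spatial frequency of each shell
        return freqs, fsc_vals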
P.S. Am I oversimplifying if I say that we claim the map features are totally due to the molecule’s electron scattering up to the resolution limit? After all we filter our maps to the threshold res and then boost the amplitudes (sharpen the map). Does the ‘reliability’/relationship of the resulting map's features to the molecule’s structure factors trail off gradually in the range where amplitudes have been boosted? I was thinking as if it doesn’t.

How to determine where we are confident the map is determined by the molecule’s scattering?
Marin’s sigma-factor curves? Van Heel 1987 https://doi.org/10.1016/0304-3991(87)90010-6
"A rough significance threshold value of two standard deviations above the expected random background is normally used in this application. This threshold value has no absolute validity, but rather serves to compare the quality of the results of similar experiments”

Marin’s half-bit threshold? Van Heel & Schatz 2005 https://doi.org/10.1016/j.jsb.2005.05.009
It seems fair to say that the half-bit criterion is calibrated to a 0.5 figure of merit (like the 0.143 threshold but properly accounting for the number of voxels).
"Whereas σ-factor curves indicate the resolution level at which one has collected information significantly above the noise level, the information curves indicate the resolution level at which enough information has been collected for interpretation.”

Shouldn’t, ideally, the questions of "whether information is significantly above the noise level", or "is enough for interpretation” be the exact same question? What’s so great about that 0.5FOM calibration?
Presumably because we have so much prior information about the chemical structure of proteins, EM maps at high resolution could be judged for their ‘interpretability’ against what we expect of proteins. I haven’t read about the Q-score yet; is it better than 0.5FOM?
If we completely ignore prior information about what proteins look like - what does the ‘interpretability' of an EM map mean? Surely that would have to involve distinguishing ‘signal’ from ‘noise’ without relating it to 0.5FOM, i.e. a sigma factor FSC threshold.
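The 0.5FOM calibration, as I understand it from Rosenthal & Henderson 2003: the expected correlation of the full-dataset map against a perfect reference is Cref = sqrt(2*FSC/(1+FSC)), and FSC = 0.143 is simply the point where Cref = 0.5. A two-line check:

    import numpy as np

    def cref_from_fsc(fsc):
        """Rosenthal & Henderson (2003): expected correlation of the map from the
        full dataset against a perfect reference, given the half-map FSC."""
        fsc = np.asarray(fsc, dtype=float)
        return np.sqrt(2.0 * fsc / (1.0 + fsc))

    print(cref_from_fsc(0.143))   # ~0.5, i.e. the origin of the 0.143 threshold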

I guess my questions are:
- Is the 0.5FOM calibration the only, or the best, thing linking the ‘resolution limit’ concept from the attached figure to the half-map FSC?
- Is the ‘resolution limit’ something useful in relation to our experimental methods, anyway?
- Imagine an EM map with uniform resolution (haha..). Would placing a sharp FSC threshold at the ‘resolution limit’ and Fourier filtering there perfectly separate map features arising from the molecule’s electron scattering from noise? Or do the structure factors get smeared into the noise gradually across Fourier shells, and does that screw up the map features? (See the toy sketch below.)
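Toy sketch for that last question - no claim that this is how real maps behave, just per-shell Gaussian ‘structure factors’ shared by two half-sets, plus independent noise, with a smooth signal falloff (all the numbers are arbitrary):

    import numpy as np

    rng = np.random.default_rng(0)
    nshells, nvox = 60, 5000                  # toy: 60 shells, 5000 voxels per shell
    shells = np.arange(nshells)
    sig_power = np.exp(-shells / 12.0)        # smooth, B-factor-like signal falloff
    noise_power = 0.2                         # flat noise floor in each half map

    fsc = np.zeros(nshells)
    for s in shells:
        common = rng.normal(0, np.sqrt(sig_power[s]), nvox)   # shared 'structure factors'
        h1 = common + rng.normal(0, np.sqrt(noise_power), nvox)
        h2 = common + rng.normal(0, np.sqrt(noise_power), nvox)
        fsc[s] = np.dot(h1, h2) / np.sqrt(np.dot(h1, h1) * np.dot(h2, h2))

    cut = np.argmax(fsc < 0.143)              # first shell below the 0.143 threshold
    print("threshold shell:", cut)
    print("signal-to-noise there:", sig_power[cut] / noise_power)   # weak, but not zero

The signal power decays smoothly, so at (and beyond) the threshold shell there is still some signal mixed into the noise, and the retained shells below it still contain noise - a sharp cutoff cannot perfectly separate the two.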

Thanks & all the best,
Teige


[Attached figure (page39image1280.png): the ‘resolution limit’ plot from Rosenthal & Henderson 2003]


On 13 Feb 2020, at 16:45, Tim Gruene <tim.gruene at univie.ac.at> wrote:


Hi Marin,

Crystallography has long moved away from the term 'resolution'; see e.g.
https://www.cell.com/structure/fulltext/S0969-2126(18)30138-2. It is merely a
ballpark number: it is good to know whether crystallographic data were cut
at 1, 2, or 3 Angstrom, but beyond that it is not very important.

What counts is the interpretation of the model and the conclusions that can be
drawn about the system under study. Judging whether those conclusions are
justified requires a broader understanding of crystallography, and resolution
plays only a minor role in it. It is more useful to take a look at the
crystallographic map itself.

EM is totally different from crystallography, so why would one mix concepts
between the fields?

Best,
Tim

On Thursday, February 13, 2020 12:07:15 AM CET Marin van Heel wrote:
Hi Tim,
Good to hear from you!  No longer at PSI??
See... You are already touching upon one of the logical breaking points in
the resolution story...!  X-ray crystallography resolution criteria like
R-factors make absolutely no sense outside the field of crystallography and
of structural biology.  It is the result of a hybrid iterative optimisation
process between the phases of a model structure and the measured amplitudes
of a diffraction experiment!  The FRC/FSC resolution criteria, in contrast,
are universal quality metrics not at all coupled to Cryo-EM or structural
biology.  Using structural biology arguments like how well I see an alpha
helix or how well I see the hole in an aromatic ring as an assessment
criterion of whether a metric is good or not is a waste of time!  (Moreover
filtering a map can completely change its appearance without changing its
information content). Even some of my own (ex-)students and (ex-)postdocs
sometimes completely miss this fundamental point. The FRC and FSC criteria
are now used as quality metrics in all walks of image science like X-ray
tomography and super-resolution light microscopy, fields of science where
atomic coordinates of proteins are not an issue. The FRC / FSC functions
are universal and very direct metrics that compare both the amplitudes and
the phases of two independent measurements of images or 3D-densities of the
same object. For more details, see the 2017 bioRxiv paper and references
therein (https://www.biorxiv.org/content/10.1101/224402v1) and check my
#WhyOWhy tweets (@marin_van_heel). See also: van Heel - Unveiling ribosomal
structures: the final phases - Current opinion in structural biology 10
(2000) 259-264.

Cheers,
Marin

On Wed, Feb 12, 2020 at 11:22 AM Tim Gruene <tim.gruene at univie.ac.at> wrote:
Dear Marin,

I did not read the entire thread, nor the manuscript you point at - apologies
in case this has been discussed before.

What about a practical approach to determine the resolution of a cryoEM map:
one could take a feature with scales of interest, e.g. an alpha-helix, and
shift and/or rotate it in steps of, say, 0.3 A in several directions to see at
which magnitude (degree / distance) refinement does not take the helix back to
its original position (within error margins).

One could also take a Monte-Carlo approach and do an arbitrary number of
random re-orientations of such a helix, refine, and calculate the variation in
position and rotation.

This would reflect my understanding of resolution, much more than any
statistical descriptor.
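A rough numpy sketch of that Monte-Carlo bookkeeping (refine_into_map is a hypothetical placeholder for whatever rigid-body refinement one would actually run, and the perturbation magnitudes are arbitrary):

    import numpy as np

    def monte_carlo_precision(helix_coords, map_vol, refine_into_map,
                              n_trials=50, max_shift=1.0, max_rot_deg=5.0, rng=None):
        """Perturb a rigid fragment (e.g. an alpha-helix) by random shifts and
        rotations, re-refine it into the map, and report the spread of the
        recovered positions. refine_into_map(coords, map_vol) -> coords is a
        placeholder, not a real API."""
        rng = np.random.default_rng() if rng is None else rng
        recovered = []
        for _ in range(n_trials):
            # random small rotation about the fragment centre, plus a random shift
            axis = rng.normal(size=3)
            axis /= np.linalg.norm(axis)
            angle = np.deg2rad(rng.uniform(-max_rot_deg, max_rot_deg))
            K = np.array([[0, -axis[2], axis[1]],
                          [axis[2], 0, -axis[0]],
                          [-axis[1], axis[0], 0]])
            R = np.eye(3) + np.sin(angle) * K + (1 - np.cos(angle)) * (K @ K)  # Rodrigues
            centre = helix_coords.mean(axis=0)
            shift = rng.uniform(-max_shift, max_shift, size=3)
            perturbed = (helix_coords - centre) @ R.T + centre + shift
            recovered.append(refine_into_map(perturbed, map_vol))
        # RMSD of each recovered copy from the original position
        rmsds = [np.sqrt(np.mean(np.sum((r - helix_coords) ** 2, axis=1)))
                 for r in recovered]
        return np.mean(rmsds), np.std(rmsds)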

Best regards,
Tim

On Wednesday, February 12, 2020 1:46:48 PM CET Marin van Heel wrote:
Hi Laurence,

One thing is certain: the 0.143 threshold is RUBBISH and all CC50 etc are
also based on the same SLOPPY STATISTICS as are all fixed-valued FSC
thresholds. This controversy has been raging for a long long time and the
errors made were extensively described (again) in our most recent paper
(Van Heel & Schatz 2017 BioRxiv:
https://www.biorxiv.org/content/10.1101/224402v1) which has been downloaded
more than 3000 times. Further papers on the issue are in the pipeline.

The math BLUNDER behind this controversy is simple: the inner product between
a signal vector and a noise vector is NOT zero (but rather proportional to
SQRT(N), where N is the length of the vectors) and cannot be left out of the
equations. This error goes back to a paper published in Nature in 1975 and
has since been repeated frequently, including in the first paper promoting
the erroneous 0.143 FSC threshold. The consequences of this blunder in
current processing are serious, especially when these erroneous metrics are
used as an optimisation criterion in iterative refinements at resolutions
close to Nyquist. I get tired of facing this systematic misuse of the FSC
function, which I myself introduced into the literature in 1982/1986, and
people nevertheless feel they know better (with no scientific arguments to
support it!) and feel justified to use it beyond its definition range, and
to continue to ignore the correct math. To counter this systematic abuse of
my brain child - over decades - I feel the need to use CLEAR LANGUAGE!
Have fun!
Marin
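A quick toy check of that cross-term (unit-variance random vectors: the mean of the inner product is zero, but its typical magnitude grows like SQRT(N)):

    import numpy as np

    rng = np.random.default_rng(1)
    for n in (10**3, 10**4, 10**5, 10**6):
        signal = rng.normal(size=n)           # unit-variance 'signal' vector
        noise = rng.normal(size=n)            # independent unit-variance 'noise' vector
        dot = np.dot(signal, noise)
        print(f"N={n:>8}  <s,n> = {dot:10.1f}   <s,n>/sqrt(N) = {dot / np.sqrt(n):6.2f}")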

--
--
Tim Gruene
Head of the Centre for X-ray Structure Analysis
Faculty of Chemistry
University of Vienna

Phone: +43-1-4277-70202

GPG Key ID = A46BEE1A


