[3dem] [ccpem] Minimum standards for FSC reporting?

Marin van Heel marin.vanheel at googlemail.com
Tue Jun 13 13:29:06 PDT 2017


Hi Sjors,

The principles of the Fourier transform and the sampling theorem (= stay 
away from Nyquist) have not changed over the last 35 years. Yet, I have 
seen much more FSC misuse recently than I did in previous decades (see 
previous posts). So the 2/3rd rule remains an old principle worth
sticking to!

I agree with you that oversampling helps! So you are padding your images
with zeros to make them four times larger and then doing your
interpolations in Fourier space? An image and its Fourier transform
contain the same information, so - up to a certain level - that is fine
with me. ("Up to a certain level" meaning: padding an image with zeros
creates a sharp cut-off at the edge, which violates the sampling
theorem.)
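
For concreteness, a minimal NumPy sketch of that kind of zero-padding
(the padding factor and image size below are only placeholders):

    import numpy as np

    img = np.random.rand(64, 64).astype(np.float32)  # stand-in for a particle image

    pad = 2                      # 2x finer Fourier sampling = 4x the pixels in 2D
    n = img.shape[0]
    padded = np.zeros((pad * n, pad * n), dtype=np.float32)
    lo = (pad * n - n) // 2
    padded[lo:lo + n, lo:lo + n] = img           # image centred in a sea of zeros

    ft_oversampled = np.fft.fftshift(np.fft.fft2(padded))

The sharp edge of that zero border is exactly the cut-off mentioned
above.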

But then again, why not just sample the image correctly - adhering to
the 2/3rd rule - instead of under-sampling your data? The computations
will take just as long when you double the sampling rate (= halve the
pixel size), and you would not be violating the sampling rules.

So “I would encourage you” to do that! (You may even get away with only 
a 20% reduction in the pixel size)
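
Put as a back-of-the-envelope calculation (my reading of the rule, with
numbers chosen purely for illustration):

    # 2/3-Nyquist rule: the claimed resolution should stay below 2/3 of the
    # Nyquist frequency 1/(2*apix), i.e. below 1/(3*apix). So for a target
    # resolution d (in Angstrom) the pixel size should satisfy apix <= d/3.
    def max_pixel_size(target_resolution_A):
        return target_resolution_A / 3.0

    print(max_pixel_size(3.0))   # -> 1.0 A/pixel for a 3 A target

    # Cost of a 20% finer pixel size: CPU, memory and I/O scale roughly
    # with the cube of the linear sampling factor.
    print((1.0 / 0.8) ** 3)      # -> ~1.95, i.e. about a factor of 2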

By the way, for a twice-oversampled FT of a 3D volume you will need 8
times the data volume…! Is that really what you rely on?
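
To put a rough number on that (box size and precision are just
assumptions):

    import numpy as np

    box = 400                                         # hypothetical box size, voxels
    voxel_bytes = np.dtype(np.complex64).itemsize     # 8 bytes per Fourier voxel

    original    = box ** 3 * voxel_bytes / 1e9        # ~0.5 GB
    oversampled = (2 * box) ** 3 * voxel_bytes / 1e9  # ~4.1 GB
    print(oversampled / original)                     # -> 8.0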

Have fun,

Marin

=================================================================

On 12/06/2017 14:23, Sjors Scheres wrote:
> Hi Marin,
>
> I'm not sure about the tone of this discussion anymore. The 2/3
> rule-of-thumb was probably a good idea to check for artefacts when the FSC
> was misused in many applications in the past.
>
> All interpolations in RELION are done in the 2x oversampled Fourier domain.
> Tests have shown that the errors these introduce in the reconstructions are
> negligible compared to many other factors until much closer to Nyquist than
> the 2/3 limit you describe. I would encourage you to do these
> tests yourself: downscale your data by Fourier-cropping to a pixel size
> close to the estimated resolution, and re-refine the structure with the
> downsampled images. I'm not sure about IMAGIC, but at least in RELION,
> you'll find that the final FSC curves will not deviate until very close to
> Nyquist.
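
For anyone who wants to try the test described above outside of any
particular package, a bare-bones Fourier-cropping sketch in NumPy
(assuming square images and an even target size; nothing RELION- or
IMAGIC-specific) might look like this:

    import numpy as np

    def fourier_crop(img, new_size):
        """Downscale a square image by keeping only the centre of its FT."""
        n = img.shape[0]
        ft = np.fft.fftshift(np.fft.fft2(img))
        lo = (n - new_size) // 2
        ft_small = ft[lo:lo + new_size, lo:lo + new_size]
        out = np.fft.ifft2(np.fft.ifftshift(ft_small)).real
        return out * (new_size / n) ** 2  # keep grey values on a comparable scale

    small = fourier_crop(np.random.rand(256, 256).astype(np.float32), 128)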
>
> Best regards,
> Sjors
>
>
>> Hi Sjors,
>>
>> It is good that you agree that the way I defined the FRC/FSC
>> normalization - some 35 years ago - means that filtering does not
>> fundamentally affect this function. (A quick & sloppy search in Google
>> Scholar shows you never cited our original papers on the FRC/FSC, so I
>> was not sure you were familiar with our design considerations.)
>>
>> a) But then you choose to disagree with our 2/3 Nyquist rule (see, for
>> example: Van Heel et al. (2000) “Towards atomic resolution”; and: van
>> Heel & Schatz (2005)). Strangely enough you agree and you disagree at
>> the same time…!  Literally you state:  “If you've chosen a relatively
>> low magnification, then you could very well end up with non-zero FSCs
>> all the way up to Nyquist.” And in the next sentence: “Depending on
>> which program you use, interpolations will suffer more or less the
>> closer you go to Nyquist”. Indeed, that is exactly the point: if you
>> under-sample the data you run into problems at high resolutions, so it
>> is better to stay on the safe side (and simply not use "a relatively
>> low magnification")!
>>
>> b) The FSC oscillating around zero beyond the 2/3rd Nyquist limit is an
>> internal consistency test showing that you indeed did not under-sample
>> your data! Since you choose to disagree with the 2/3 Nyquist
>> under-sampling rule you must necessarily disagree with any such
>> consistency check.
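
Such a check is easy to automate; a crude sketch, assuming the FSC is
already available as a 1D array with one value per shell and Nyquist as
the last shell:

    import numpy as np

    def tail_oscillates_around_zero(fsc, tol=0.1):
        """True if the FSC beyond 2/3 Nyquist fluctuates around zero like noise."""
        fsc = np.asarray(fsc)
        tail = fsc[int(len(fsc) * 2 / 3):]
        # A near-zero mean with sign changes looks like noise; a systematically
        # positive tail hints at under-sampling or interpolation artifacts.
        return abs(tail.mean()) < tol and (tail > 0).any() and (tail < 0).any()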
>>
>> So how can you disagree/agree with the 2/3 Nyquist rule if it makes
>> perfect scientific sense?  Ah... I see why: there is a looming “conflict
>> of interest” issue here! If you refine the sampling by, say, 20%, then
>> your computational requirements go up by a factor of about 2
>> ((1/0.8)^3 ≈ 1.95) - in terms of CPU, memory usage, and I/O. If you do
>> everything "in-core" and on the raw data you will thus continuously run
>> into resource problems!
>>
>> In other words, you do agree with the scientific principles we expressed
>> decades ago, but you nevertheless disagree with the idea overall, since
>> that suits the way of working you rely on! You accept the fact that
>> interpolation artifacts intermix with genuine high-resolution results
>> and that you can no longer tell one from the other. (Referees take heed!)
>>
>> Cheers ;)
>>
>> Marin
>>
>> On 29/05/2017 18:42, Sjors Scheres wrote:
>>> Hi Marin,
>>>
>>> I agree with your general point that filtering of the maps does not matter
>>> for FSC calculation. However, I disagree with your rules a) and b). If
>>> you've chosen a relatively low magnification, then you could very well end
>>> up with non-zero FSCs all the way up to Nyquist. Depending on which
>>> program you use, interpolations will suffer more or less the closer you go
>>> to Nyquist, but this does not necessarily keep you from reaching
>>> resolutions beyond 2/3 Nyquist. Regarding your rule c), perhaps I'm
>>> partially to blame, as the solvent mask-corrected FSC that RELION writes
>>> out is capped at zero. I will have a look at the code and try to fix this
>>> in the next release.
>>>
>>> I also agree with Oli that submission of unmasked half-maps plus the mask
>>> used for FSC calculation would be a good idea. As Alan remarked, you can
>>> already do this (as supplementary maps) at the EMDB. I would encourage
>>> everyone to do this.
>>>
>>> Best regards,
>>> Sjors
>>>
>>>
>>>
>>>
>>>> Sure Oli!
>>>>
>>>> I fully agree that two maps should always be deposited (for each 3D
>>>> reconstruction) and that those two maps should be unmasked (serious
>>>> errors can be made while masking).
>>>> However, the filtering state of the two maps is by itself not so
>>>> relevant, because of the built-in FSC normalization! That was my main
>>>> point!
>>>>
>>>> Among the many FSC errors that I have seen in the flood of cryo-EM
>>>> papers, the more serious ones include: a) under-sampling the data and
>>>> thus claiming a resolution beyond 2/3 of the Nyquist frequency; b) the
>>>> FSC should oscillate around zero beyond 2/3rd Nyquist, whereas in many
>>>> publications the FSC remains positive up to the Nyquist frequency; c) in
>>>> many publications the vertical FSC axis starts at "0" and goes to "1", so
>>>> one cannot even verify the oscillations around the "0" axis. I also
>>>> don't like using the same automatically generated 3D mask for the two
>>>> half volumes. I just now did a Google image search for "Fourier Shell
>>>> Correlation" and below is the result. I have no idea whose FSCs I am
>>>> looking at, but a majority violate at least one of the basic rules (and
>>>> I am not even counting the ones using incorrect fixed-valued thresholds
>>>> like 0.5 or 0.143).
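
On point c) alone: letting the vertical axis go below zero makes the
oscillations visible at a glance. A throwaway matplotlib sketch, with
placeholder data standing in for a real curve:

    import numpy as np
    import matplotlib.pyplot as plt

    freq = np.linspace(0.0, 0.5, 100)                              # placeholder, 1/Angstrom
    fsc = np.exp(-8.0 * freq) + 0.05 * np.random.randn(freq.size)  # placeholder curve

    plt.plot(freq, fsc)
    plt.axhline(0.0, color='k', linewidth=0.5)  # draw the zero line explicitly
    plt.ylim(-0.2, 1.05)                        # do NOT start the y-axis at 0
    plt.xlabel('spatial frequency (1/A)')
    plt.ylabel('FSC')
    plt.show()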
>>>>
>>>> Cheers
>>>> Marin
>>>>
>>>>
>>>>
>>>>
>>>> On 28/05/2017 13:38, Oliver Clarke wrote:
>>>>> That's all well and good, but without deposition of the unfiltered
>>>>> half maps and the mask used to calculate the FSC it is not possible to
>>>>> reproduce the resolution calculations of the authors, because only one
>>>>> map is deposited, it is sharpened and low pass filtered, and the mask
>>>>> used for FSC calculation is often neither deposited nor described.
>>>>>
>>>>> That seems worth addressing, and it's fairly straightforward to do so.
>>>>>
>>>>> Cheers
>>>>> Oli.
>>>>>
>>>>> On May 28, 2017, at 1:46 PM, Marin van Heel
>>>>> <0000057a89ab08a1-dmarc-request at JISCMAIL.AC.UK> wrote:
>>>>>
>>>>>> Dear All,
>>>>>>
>>>>>> Much misunderstanding persists on the relatively straightforward
>>>>>> issue of the FSC...
>>>>>>
>>>>>> 1) In the first place: please do read the primary literature rather
>>>>>> than relying on second-hand or third-hand references where
>>>>>> errors/misunderstandings have accumulated. The first mention in the
>>>>>> literature of the "Fourier Shell Correlation" is in: George Harauz
>>>>>> and Marin van Heel, "Exact filters for general geometry three
>>>>>> dimensional reconstruction", Optik 73 (1986) 146-156. The how and
>>>>>> why of the FSC normalization of the amplitudes is explicitly
>>>>>> described in the original paper(s). (You can find more in Wikipedia:
>>>>>> "Fourier Shell Correlation".)
>>>>>>
>>>>>> 2) Now about the consequences of that normalization: Any filtering
>>>>>> that does not zero a specific spatial frequency will affect the
>>>>>> numerator and the denominator of the FSC equation in exactly the same
>>>>>> way! This is independent of whether 3D reconstruction #1, or #2, (or
>>>>>> both #1 and #2) is/are filtered or not. This means that filtering of
>>>>>> the maps will NOT affect the FSC! I actually have written a paper
>>>>>> about it (Marin van Heel: "Unveiling ribosomal structures: the final
>>>>>> phases", Current Opinion in Structural Biology 10 (2000) 259-264;
>>>>>> ask me for a pdf if you have trouble finding it). Quoting from this
>>>>>> paper: “The bottom line … is that there is no wrong way of
>>>>>> filtering the data, as its information content is not normally
>>>>>> affected. The one and only thing one can do wrong is to interpret the
>>>>>> map incorrectly.”
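
To make point 2) concrete, here is a deliberately naive NumPy version of
the FSC over spherical shells (not the code of any particular package);
any filter that is constant and non-zero within a shell multiplies the
numerator and the denominator of that shell by the same amount and
therefore drops out:

    import numpy as np

    def fsc(map1, map2):
        """Naive FSC between two cubic half-maps of identical box size."""
        n = map1.shape[0]
        n_shells = n // 2
        f1 = np.fft.fftshift(np.fft.fftn(map1))
        f2 = np.fft.fftshift(np.fft.fftn(map2))

        ax = np.arange(n) - n // 2            # voxel coordinates around the origin
        kx, ky, kz = np.meshgrid(ax, ax, ax, indexing='ij')
        shell = np.minimum(np.sqrt(kx**2 + ky**2 + kz**2).astype(int), n_shells)

        num = np.zeros(n_shells + 1)          # cross term, per shell
        d1 = np.zeros(n_shells + 1)           # power of half-map 1, per shell
        d2 = np.zeros(n_shells + 1)           # power of half-map 2, per shell
        np.add.at(num, shell, (f1 * np.conj(f2)).real)
        np.add.at(d1, shell, np.abs(f1) ** 2)
        np.add.at(d2, shell, np.abs(f2) ** 2)
        return num[:n_shells] / np.sqrt(d1[:n_shells] * d2[:n_shells] + 1e-30)

Low-pass filtering either half-map (or both) with a filter of that kind
therefore leaves the curve essentially unchanged; only shells that the
filter zeroes out become undefined.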
>>>>>>
>>>>>> 3) Thus, the fact that you don’t see certain details in the map for
>>>>>> a given level of the FSC curve probably says more about your
>>>>>> representation choices than about the map. Low-pass filtering a map
>>>>>> to the 0.5 value of the FSC as a way to avoid “over-interpretation”
>>>>>> is in general a bad idea. You would probably be killing (the
>>>>>> visibility of) the high-res info as a self-fulfilling prophecy. On
>>>>>> the other hand, relying entirely on black-box programs that in some
>>>>>> mysterious way boost the visibility of high-res noise beyond any
>>>>>> reasonable FSC value can equally be a bad idea. Please do keep in
>>>>>> mind that the final interpretation of your map is your own
>>>>>> responsibility!
>>>>>>
>>>>>>    Cheers,
>>>>>>
>>>>>>    Marin
>>>>>>
>>>>>>
>>>>>> -------- Forwarded Message --------
>>>>>> Subject: 	Re: [ccpem] Minimum standards for FSC reporting?
>>>>>> Date: 	Fri, 26 May 2017 23:08:34 -0400
>>>>>> From: 	Jillian Chase <jillian.d.chase at GMAIL.COM>
>>>>>> Reply-To: 	Jillian Chase <jillian.d.chase at GMAIL.COM>
>>>>>> To: 	CCPEM at JISCMAIL.AC.UK
>>>>>>
>>>>>>
>>>>>>
>>>>>> Hi John,
>>>>>>
>>>>>> Thanks for your reply. It is possible that I was viewing the
>>>>>> unsharpened map. I imported that map into RELION for targeted
>>>>>> post-processing, based on threshold values from viewing the map in
>>>>>> Chimera, resulting in a more reasonable 4 A. I'll double check which I
>>>>>> imported.
>>>>>>
>>>>>> Still puzzling, though: the cryoSPARC map with post-processing in
>>>>>> RELION shows more side-chain density than what I see with the identical
>>>>>> particle set processed entirely in RELION. I've been using a hybrid of
>>>>>> both programs to generate the best maps possible. Has anyone done more
>>>>>> quantitative tests using both programs who might have some input?
>>>>>>
>>>>>> Thanks again,
>>>>>> Jillian
>>>>>>
>>>>>> Sent from my iPhone
>>>>>>
>>>>>> On May 26, 2017, at 10:22 PM, John Rubinstein
>>>>>> <john.rubinstein at utoronto.ca> wrote:
>>>>>>
>>>>>>> Hi Jillian,
>>>>>>>
>>>>>>> Recently in our group, one cryoSPARC user was accidentally
>>>>>>> downloading structures from the experiments overview page rather
>>>>>>> than getting the sharpened final maps from the experiment details
>>>>>>> page. The maps from the experiments overview page can be selected
>>>>>>> for further processing, but are not sharpened and will look worse
>>>>>>> than expected for their resolution. Is it possible you’ve been
>>>>>>> looking at the unsharpened maps?
>>>>>>>
>>>>>>> Best wishes,
>>>>>>> John
>>>>>>>
>>>>>>> --
>>>>>>> John Rubinstein
>>>>>>> Molecular Medicine Program
>>>>>>> The Hospital for Sick Children Research Institute
>>>>>>> 686 Bay Street, Rm. 20-9705
>>>>>>> Toronto, ON
>>>>>>> Canada
>>>>>>> M5G 0A4
>>>>>>> Tel: (+001) 416-813-7255
>>>>>>> Fax: (+001) 416-813-5022
>>>>>>> www.sickkids.ca/research/rubinstein
>>>>>>>
>>>>>>>> On May 26, 2017, at 9:03 PM, Jillian Chase
>>>>>>>> <jillian.d.chase at GMAIL.COM> wrote:
>>>>>>>>
>>>>>>>> Hi,
>>>>>>>>
>>>>>>>> I've also noticed significantly higher FSC resolution estimates
>>>>>>>> with cryoSPARC vs RELION, which do not seem realistic upon
>>>>>>>> inspection (i.e. a 4 A RELION post-processed map looks very different
>>>>>>>> from a 4 A cryoSPARC map). Has anyone else noticed this? How are you
>>>>>>>> handling it?
>>>>>>>>
>>>>>>>> Best,
>>>>>>>> Jillian
>>>>>>>>
>>>>>>>> Sent from my iPhone
>>>>>>>>
>>>>>>>>> On May 26, 2017, at 8:47 PM, Oliver Clarke
>>>>>>>>> <olibclarke at GMAIL.COM> wrote:
>>>>>>>>>
>>>>>>>>> Hi all,
>>>>>>>>>
>>>>>>>>> I've seen several high-impact cryo-EM structures recently with
>>>>>>>>> "headline" global FSC resolutions that do not seem plausible based
>>>>>>>>> on inspection of the map.
>>>>>>>>>
>>>>>>>>> In each case, the resolution was based on results out of
>>>>>>>>> relion_postprocess, but no details were given about mask
>>>>>>>>> calculation or the volume of the mask compared to the model, and
>>>>>>>>> only the final map was deposited, not the half maps (so checking
>>>>>>>>> workings was not possible).
>>>>>>>>>
>>>>>>>>> I think that at a bare minimum, reporting either the volume of the
>>>>>>>>> mask compared to the volume of the map at the suggested contour
>>>>>>>>> level, or simply displaying an overlay of the mask on the model,
>>>>>>>>> should be mandatory (as should deposition of unfiltered half maps
>>>>>>>>> to facilitate recalculation of the FSC).
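
A sketch of that mask-versus-map comparison, with the mask threshold and
contour level treated as plain inputs rather than anything standardized:

    import numpy as np

    def mask_to_map_volume_ratio(mask, final_map, contour_level, mask_threshold=0.5):
        """Ratio of voxels inside the FSC mask to voxels above the suggested contour."""
        mask_voxels = np.count_nonzero(mask >= mask_threshold)
        map_voxels = np.count_nonzero(final_map >= contour_level)
        return mask_voxels / max(map_voxels, 1)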
>>>>>>>>>
>>>>>>>>> Without knowledge of the mask, the FSC is meaningless,
>>>>>>>>> particularly if the author has chosen to use relion_postprocess as
>>>>>>>>> a "black box", and has chosen to automatically generate a mask
>>>>>>>>> based on an initial threshold without subsequently inspecting it.
>>>>>>>>>
>>>>>>>>> (There have also been a couple of structures using the pymol
>>>>>>>>> 'carve' option in extremely misleading ways without disclosing its
>>>>>>>>> use or the map contour level, but that is a rant for another day!)
>>>>>>>>>
>>>>>>>>> Thoughts/debate welcome! :)
>>>>>>>>>
>>>>>>>>> Cheers
>>>>>>>>>
>>>>>>>>> Oli

-- 
==============================================================

     Prof Dr Ir Marin van Heel

     Research Professor at:

     Laboratório Nacional de Nanotecnologia - LNNano
     CNPEM/LNNano, Campinas, Brazil
     Brazilian mobile phone  +55-19-981809332
                            (041-19-981809332 TIM)
