Phenomenology, Fundamental Physics and Inconsistent Truths

"There is no use trying; one can't believe impossible things."

"I dare say you haven't had much practice. When I was your age, I always did it for half an hour a day. Why, sometimes I've believed as many as six impossible things before breakfast."

This week, a short, innocuous little astrophysics paper appeared in Nature discussing the inferred, apparently constant, surface density of cold dark matter in the inner parts of galaxies.

This somewhat startled fellow SciBling Ethan, enough for him to bang out a new post - discussing why cold dark matter is to be preferred to an alternative model popularly referred to as MOND.

I thought Ethan was a bit harsh, and commented, leading to Ethan's new reply:
that, yes, it is evil to promote MOND over Dark Matter.

Tsk. These young 'uns...
Fortunately I have achieved Zen.
Not that I ever, in my most deranged hallucinations, thought I'd ever find myself defending MOND in a public forum...

The paper, Gentile et al., Nature, 461, 627 (2009), discusses the apparently constant central surface mass density of disk galaxies, over several orders of magnitude in galaxy mass and luminosity. When the data are fit with a cold dark matter inspired density profile for the galaxy halos, the break radius at which the inferred cold dark matter density starts decreasing corresponds to an apparently near constant acceleration.
The word "MOND" does not actually appear in the paper, though half the references in the paper are to MOND related papers.
The result itself is non-controversial - observed disk galaxies appear to have constant density centers, and extended halos whose density declines approximately as 1/r^2 over essentially all distances for which decent data is available.
Equivalently: the galaxy rotation curves rise linearly and then are flat.
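The two regimes map onto simple spherical mass models; a quick sketch (toy profiles with illustrative round numbers, not fits to any real galaxy):

```python
import numpy as np

G = 4.301e-6  # Newton's constant in kpc (km/s)^2 / Msun

def v_constant_density(r_kpc, rho0):
    # rho = rho0 -> M(<r) = (4/3) pi rho0 r^3 -> v = sqrt(GM/r) rises linearly
    return np.sqrt(4.0 * np.pi * G * rho0 / 3.0) * r_kpc

def v_isothermal(r_kpc, rho0, r0_kpc):
    # rho = rho0 (r0/r)^2 -> M(<r) = 4 pi rho0 r0^2 r -> v is independent of r
    return np.sqrt(4.0 * np.pi * G * rho0) * r0_kpc * np.ones_like(r_kpc)

r = np.linspace(0.5, 30.0, 60)                    # kpc
v_in = v_constant_density(r, rho0=1.0e7)          # rho0 in Msun/kpc^3, illustrative
v_out = v_isothermal(r, rho0=1.0e7, r0_kpc=8.0)   # flat at ~186 km/s for these numbers
```

A constant-density center plus a 1/r^2 envelope is exactly the "rises linearly, then flat" rotation curve.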

So what?

Well, there has been a problem in astronomy for about 100 years.
First noted by Oort, it was really firmly established by Vera Rubin as a major large scale problem.

The problem is that when you look at stellar kinematics, the mass observed is not sufficient to account for the observed speeds.
Crudely speaking: v^2/r >> GM/r^2
where v is the observed (dispersion in the) speed of stars, r is the length scale, and M is the (observed) mass.
Oort first noted this when looking at the spread of velocities of stars near the Sun. These stars rotate about the Milky Way with the Sun, but with some scatter in random speed.
The mass of stars locally, the surface density of stellar mass in the disk, was insufficient to account for the spread in speeds, and the disk ought to be "puffing out" as the stars spread apart.
Oort noted a simple solution: that there simply needed to be some unobserved "dark" matter.
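The size of the discrepancy is easy to estimate: rearranging v^2/r ~ GM/r^2 gives a "dynamical mass" M ~ v^2 r / G. A sketch with illustrative Milky-Way-like round numbers (the luminous mass below is an assumed figure, not a measurement):

```python
G = 4.301e-6  # Newton's constant in kpc (km/s)^2 / Msun

def dynamical_mass(v_kms, r_kpc):
    # Setting v^2/r = G M / r^2 gives the mass needed to hold a circular orbit
    return v_kms**2 * r_kpc / G

m_dyn = dynamical_mass(220.0, 50.0)  # flat 220 km/s rotation out to 50 kpc
m_lum = 6.0e10                       # assumed luminous (stars + gas) mass, Msun
print(f"M_dyn ~ {m_dyn:.1e} Msun, ratio ~ {m_dyn / m_lum:.0f}x luminous")
```

For these inputs the dynamical mass comes out near 6e11 Msun, roughly an order of magnitude more than the luminous matter.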

As it happens, it needed to be a lot of dark matter, more in fact than the visible matter.
There is such matter, for example dim dwarf "M stars" would do nicely, and there are a lot of them - we know now just how many, and there are some, but not enough to explain Oort's result.

Vera Rubin, much later, noted that this problem extended to the dynamics of galaxies as a whole, and in fact was worse there. Not only is the spread in speeds too high; the amplitude of rotation is much too high given the observed stellar mass. Worse than that, the rotation curve as a function of radius is near constant at large radii, implying that the relative abundance of unobserved mass is much higher at large radii.

A series of subsequent studies extended the problem to essentially all large scales. It is not just an issue for the outer regions of disk galaxies, it is a problem for elliptical galaxies, clusters of galaxies, and in fact the structure of the universe as a whole.

In a series of brilliant papers, a true tour de force of modern astrophysics, it was shown in detail, that there is a single simple solution to this observational discrepancy:
there needs to be "dark matter" in the universe - mass that is not visible. It does not absorb or emit electromagnetic radiation, and as a first approximation only couples to the rest of the universe through its gravitational interaction.
Further, the matter need merely be "cold" - that is, its kinetic temperature, its intrinsic velocity dispersion, need be much less than the velocity scales associated with galaxies - crudely speaking, the dispersion must be less than about 30 km/sec.

This is a solution for the mass discrepancy on scales ranging from size of galaxies, through clusters of galaxies to the size of the universe.

For a couple of decades there were some major unresolved discrepancies with cold dark matter: first, we did not know if the matter was "baryonic" - made of normal stuff but in a form which was unobservable - candidate mass collections included low mass stars, burnt out stars, cold clumps of gas, and black holes.
As it happened, while all these things exist in some abundance, none come close to being abundant enough to account for the inferred dark matter.
Further, cosmological constraints strongly limit the total density of normal matter in the universe, and it is much less than needed to account for dark matter.

So, cold dark matter cannot be normal matter: it must be something more exotic.

This is not a problem: modern particle physics predicts, in essentially any extension of the minimal standard model, that there ought to be other particles which would have mass but no charge (and hence not couple to electromagnetic radiation).
In fact there are lots of models for extra mass, ranging from axions, through lowest mass supersymmetric partner particles, to wimpzillas.
Proposed masses range from ultra-low mass particles with masses so low their Compton wavelengths are the size of galaxies, to near Planck mass particles; and those are the non-exotic models. The proposed particle mass ranges over almost 50 orders of magnitude, from 10^-22 eV/c^2 to over 10^14 GeV/c^2.

Embarrassing, but not a problem in and of itself, and actually quite good fun for imaginative theorists.
There is no direct detection of any of the proposed candidate particles for cold dark matter, though there are direct detection searches underway, which may in principle detect some of the candidates.

Now, a LOT of dark matter is needed. And for a couple of decades we did not know how much more. On small (galactic) scales, there seemed to be a need for maybe 3-10 times more mass than in normal matter; on the scale of the universe as a whole the need was for more like 30 times more mass.
This discrepancy showed up most acutely in the study of clusters of galaxies: when modeled in detail, they indicated a mass density of maybe ~0.3, measured as a fraction of the so-called "critical density" of the universe, but when we looked on larger scales, the mass density really needed to be closer to 1.0 in these units.
A significant discrepancy.

The solution to this problem was another brilliant piece of astrophysics - the matter density is in fact about 0.3 - about 80% of which is cold dark matter, and the rest normal matter. The missing 0.7 in density is "dark energy" - about which I will say no more.
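In round numbers (approximate values, not the latest fits), the budget works out as:

```python
# Rough concordance budget, as fractions of the critical density of the universe
omega_matter = 0.3
omega_baryon = 0.05                      # normal (baryonic) matter
omega_cdm = omega_matter - omega_baryon  # ~0.25, i.e. ~80% of the matter is dark
omega_lambda = 1.0 - omega_matter        # ~0.7 of "dark energy", closing to flatness
```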

This all fits, it is a beautiful model, and is consistent with general relativity and can be fit into most any extension of standard model particle physics.

There are some issues: other than that we have no idea at all what the matter is, and the addition of dark energy - which is actually quite beautiful if looked at correctly.

The problem is in the details. In particular, cold dark matter models predict a particular power spectrum of density fluctuation for mass concentrations in the universe.
The simple extrapolation of these models leads to far too many low mass galaxies at late times.
For example, there ought to be thousands of dwarf galaxies in the local group of galaxies, whereas there are actually only a few dozen.
Some of these dwarf galaxies will have merged into the big galaxies, some may be "dark" - they lost their baryonic gas and are there as lumps of dark matter with no stars, but the details of how this could happen are complicated and there is no successful model for this.
More worryingly, cold dark matter predicts the density profile of galaxy halos - they ought to have a shallow divergent density cusp in the center, with the density diverging roughly as 1/r for small r, and going as about 1/r^3 at large radii.
But observed galaxies require the central density to be approximately constant, and the density at large radii to fall off like 1/r^2.
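The contrast is easiest to see in the logarithmic slope, d ln(rho)/d ln(r), of the two profile families; a sketch using an NFW-like cusped profile and a pseudo-isothermal cored one (unit scale radii, purely illustrative):

```python
import numpy as np

def rho_nfw(r, rs=1.0):
    # CDM-inspired (NFW-like) halo: ~1/r cusp inside rs, ~1/r^3 outside
    x = r / rs
    return 1.0 / (x * (1.0 + x) ** 2)

def rho_cored(r, rc=1.0):
    # Pseudo-isothermal halo: constant-density core, ~1/r^2 at large r
    return 1.0 / (1.0 + (r / rc) ** 2)

def log_slope(rho, r):
    # d ln(rho) / d ln(r), evaluated numerically
    return np.gradient(np.log(rho(r)), np.log(r))

r = np.logspace(-3, 3, 601)
# NFW slope runs from ~ -1 (cusp) to ~ -3; the cored slope from ~0 (flat) to ~ -2
```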

This is a genuine problem - solving it requires interaction - either indirect coupling to the normal matter, or self-interaction for the dark matter - both of which are problematic.
It is again a messy problem, and it requires some "conspiracy" for the transition from constant density to declining density to match the observed velocity profiles.

In the mean time, there came MOND.

MOND is not a theory of physics.

MOND has since its conception inspired some exploration of extensions of general relativity which would be "MOND like", though these have not gone far.

MOND is a "what you see is what you get" model.
Crudely speaking, MOND assumes the observed mass is what is there, and then asks what must be done to make the observations make sense.
Milgrom noted the simple fact that the kinematic problems occur at approximately the same acceleration, and postulated a modification of Newton's law: that there is, essentially, a minimum acceleration.
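A sketch of what Milgrom's postulate does to a rotation curve, using the commonly adopted "simple" interpolating function mu(x) = x/(1+x) (one choice among several in the MOND literature; the mass and radii below are purely illustrative):

```python
import numpy as np

G = 6.674e-11    # m^3 / (kg s^2)
a0 = 1.2e-10     # m/s^2, Milgrom's acceleration scale
MSUN = 1.989e30  # kg
KPC = 3.086e19   # m

def mond_accel(a_newton):
    # Solve a * mu(a/a0) = a_N with mu(x) = x/(1+x); the quadratic in a has the
    # closed-form root below. Deep-MOND limit (a_N << a0): a -> sqrt(a_N * a0).
    return 0.5 * a_newton * (1.0 + np.sqrt(1.0 + 4.0 * a0 / a_newton))

M = 1.0e11 * MSUN                        # illustrative baryonic galaxy mass
r = np.array([10.0, 50.0, 100.0]) * KPC  # radii in m
a_newton = G * M / r**2
v_mond = np.sqrt(mond_accel(a_newton) * r) / 1.0e3  # circular speed, km/s
v_newt = np.sqrt(a_newton * r) / 1.0e3
# v_mond flattens toward the asymptotic (G*M*a0)**0.25 ~ 200 km/s,
# while v_newt falls off Keplerian, v ~ 1/sqrt(r)
```

The deep-MOND asymptote v^4 = G M a0 is also why MOND "builds in" a Tully-Fisher-like relation.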

The annoying thing about MOND is that it works over large scales. More than just galaxies, though we ought not to get into that here and now.

That is it. That is what the Nature paper is about - there is more data on galaxies and it still shows that damned "universal acceleration" as a critical parameter.

So... this is crackpot territory, right?
Physics theories don't get to just add ad hoc parameters and trash all of physics, right?
New theories have to fit everything else first, and then go further.

Well, yes... in theory... in practice, not so much.
Don't believe it? Well, without invoking the history of string theory, consider a better known counterexample:

Schrodinger's equation: the starting point for modern quantum mechanics, arguably the most successful theory of physics ever.
It is not relativistic.
Yet it was written after Einstein, and Schrodinger knew physical theory ought to be consistent with special relativity.
The relativistic Schrodinger equation (the Klein-Gordon equation) is wrong.
Relativistic quantum mechanics had to wait some years for Dirac theory, and the predictions and implications of Dirac theory were nuts - they just happened to be correct and rapidly verified by experiment.

Further, and we need to remember this as we critique phenomenological models,

both General Relativity and Quantum Field Theory are Inconsistent and one or both are necessarily incorrect!

General Relativity violates unitarity, maximally in some sense.
Relativistic quantum field theory breaks down completely at small scales and cannot be done in a dynamic curved metric. It does not couple correctly to gravity.

The only modern theory that comes close to reconciling this problem is string theory, which is beautiful, but we do not know if it is a correct description of reality, or how to calculate the low energy limit. String theory extends general relativity, and some variants, if you squint at them right, lead to effective forces on large scales - i.e. they are MOND like in some sense.
Not that I think these variants are likely to be correct, but they are there and people play with them.

and hovering in the background is dark energy, which is consistent with general relativity and can be generated in quantum field theory, but which has no physical theory at all.
A cruel person would note that in some sense Cold Dark Matter failed, and the model was resuscitated by adding a new parameter - Ω_Λ != 0.
Arguably this came with two further parameters, w and w'.
That is a big new parameter space for a successful theory.

Now, when I (or my minions) run cosmological simulations, I run them with cold dark matter and dark energy - it would be relatively trivial to change the simulations to MOND models, but I have never done so.
I may one day, but I do not think MOND is likely enough to be correct for the effort to be worth it. I don't like MOND, but I think it is useful to keep around and it is a good way to highlight a puzzle that may either be a minor coincidence or a path to new fundamental insight.

Or, if Martin Rees were to offer me a bet on dark matter vs MOND I'd want odds of somewhere around 50-1 to 100-1 in favour of dark matter.
And low stakes at that.

But, flicking through a dynamics text, I see MOND is mentioned, briefly, in Binney and Tremaine's "Galactic Dynamics", for example.
I personally spend maybe 15 minutes on MOND and other alternatives when I teach graduate level classes on the topic.

MOND is not evil, mentioning it is not evil.
MOND is a falsifiable phenomenological model that is good for parametric studies of certain dynamical models - it provides a new scale to ponder, the apparent fundamental acceleration, and a bit of a mystery, namely the recurrence of this scale in different situations.

Cold Dark Matter is a model within the context of broader physical theory;
it is woefully incomplete, its success required the addition of new parameters (namely the dark energy fraction), and fitting reality will likely require some further new physics.
It is probably more correct as a description of reality than any other current model, but it ain't scripture and it is testable.
Further, the observations are not the theory - there is a huge series of layers of interpretation and modeling going from the data to the final model context.
There is a certain inherent virtue in presenting parametric rather than model dependent interpretations of data.

Scientists must test models - that does not mean celebrating their successes, it means finding where they fail.
Yes, that means MOND also must be tested.
Theories are tested to failure. That means competing models should continually be thrown up as alternatives to provide hypothesis testing.

I think I must now go to the AAS meeting, 'cause several people owe me a beer or three, whether they know it or not.
'course I'll need those beers after my cosmology friends let me out of the corner to which they will assuredly drag me...



and the addition of dark energy - which is actually quite beautiful if looked at with beer goggles.


By Ethan (not him… (not verified) on 09 Oct 2009 #permalink

And I should probably add that I liked your essay a lot, except for the part I just mocked.

By Ethan (not him… (not verified) on 09 Oct 2009 #permalink

So, which of the TLA expansions of "FYT" did you intend?

Dark energy really is quite beautiful if looked at from the perspective of fundamental physics - it arises naturally as a term in General Relativity, and has a natural interpretation in quantum field theory.

As an additional fitting term in cosmological modeling it sucks. I do not know any astrophysicist who openly made a strong case for bringing it back before the data got there, and certainly no one predicted its current estimated numerical value.
Some serious people were talking in private about the need to bring in a Lambda - I wish I had listened to them then.

"Scientists must test models - that does not mean celebrating their successes, it means finding where they fail...Theories are tested to failure. That means competing models should continually be thrown up as alternatives to provide hypothesis testing."

Well why don't you test or rerun my experiments that support the irreverent notion that it is "radiation" that mediates the gravitational force.

You won't because you and all your colleagues are fixated on the 300 year-old "dumb" belief that some unknown, mysterious property of mass can either attract other mass or warp space. This ancient belief is up there with the 1000 year-old belief held by the Scholastics that some unknown, mysterious property of the earth gave it the ability to make the heavens revolve around the earth in a 24 hour period.

Yet you and all your colleagues firmly subscribe to the validity of the Tully-Fisher relation, where a galaxy's luminosity is proportional to its highest orbital velocity, and the mass-luminosity relation, where a star's mass is highly correlated with its luminosity.

I have placed a 1000 watt heat element below a ~1068 gm hollow copper sphere that is suspended by a wooden dowel to a thermally isolated force sensor. Located above the sphere are three copper containers filled with ice. After power has been applied for 6 minutes the infrared radiation from the heat element has produced a 1.9%, or 20 gm, increase in the gravitational mass of the sphere. This particular experiment, four others giving similar results and a heat-based gravity theory can be found here

So, which of the TLA expansions of "FYT" did you intend?

"Fixed your typo" - nothing more offensive. :-)

As for the rest, maybe beer brings wisdom. For the record, I was one of the people privately predicting the resurgence of Lambda back in the 80's. I get no points for this. Lots of other people had the same idea, and if you don't the courage to go public you don't deserve the credit. I also thought it was ugly then and I think it's ugly now. Sometimes the universe just doesn't listen to me.

By Ethan (not him… (not verified) on 09 Oct 2009 #permalink

@Ethan - phew.

yeah, not going out with good ideas is often a problem.
I wrote a short essay on "atomic dark matter" a couple of years ago, but walked away from it when the initial reaction was negative - and it is now a mildly hot idea for explaining the anomalous inconsistent hints of cdm detection.
On a previous occasion, I mentioned to a colleague that I and another colleague had noted a correlation in data that later became well known, but decided it wasn't interesting enough to publish.
His immediate reply was "of course, the data wasn't good enough then and no one would have believed you".
He was mostly right.

@Peter - I can't be arsed to do a low cost simulation using MOND to check how it affects structure formation; I sure as hell am not going to start fiddling with hot copper spheres to test an experiment that has more flaws in its design than a mere theorist can count.
I'm sorry, but you're "not even wrong".


I'm not saying that working on MOND is evil. As you correctly point out, people who are working on MOND are trying to account for a phenomenon that dark matter has difficulty accounting for. (FYI. many simulations that aren't run by people named Navarro, Frenk or White indicate that the inner density cusp goes as ~1/r^1.5, but I digress.) There may yet be some scientific merit that comes along with this work.

But to allow MOND to be presented as an alternative to dark matter, to present to the public a picture of a Universe where dark matter is not an absolute necessity is irresponsible and dishonest about our scientific conclusions. For a scientist to do that, IMO, is evil. I can talk all day about the great things that the steady-state model gets you, but at the end of the day, I know better than to tout its successes at the expense of the big bang.

How is this any different, or any more ethical? If what it leads to is bad, and we know that, isn't it unethical to promote it before it's ready?

Moore-like central profiles are an even worse fit to the actual data... 'course if you just go straight to homologous spherical collapse you recover the singular isothermal...

You did note that the paper is in fact cast entirely in the language of cold dark matter and the phrase MOND is not to be found anywhere in the text?

It is quite possible that noting that a0 can be coupled to physics of dark matter will tell us something interesting about CDM properties, it is interesting that looking at new data in a different way leads to the same conclusion.

As for Steady State - you channeling Rob Knop?
Anyway, if inflation is eternal, then the universe is in fact steady state on large enough scales, and the cosmological principle is perfect.
A concept that is in the literature but is currently not testable. Explains a lot though...

Quantum relativity has a unification which occurs by the solution to the correlation function for the set of virtual force photons, applied to the atomic topological function. This builds the picoyoctometric, 3D, interactive video atomic model imaging equation.
The atom's RQT (relative quantum topological) data point imaging function is built by combination of the relativistic Einstein-Lorenz transform functions for time, mass, and energy with the workon quantized electromagnetic wave equations for frequency and wavelength. The atom labeled psi (Z) pulsates at the frequency {Nhu=e/h} by cycles of {e=m(c^2)} transformation of nuclear surface mass to forcons with joule values, followed by nuclear force absorption. This radiation process is limited only by spacetime boundaries of {Gravity-Time}, where gravity is the force binding space to psi, forming the GT integral atomic wavefunction. The expression is defined as the series expansion differential of nuclear output rates with quantum symmetry numbers assigned along the progression to give topology to the solutions.
Next, the correlation function for the manifold of internal heat capacity particle 3D functions is extracted by rearranging the total internal momentum function to the photon gain rule and integrating it for GT limits. This produces a series of 26 topological waveparticle functions of the five classes; {+Positron, Workon, Thermon, -Electromagneton, Magnemedon}, each the 3D data image of a type of energy intermedon of the 5/2 kT J internal energy cloud, accounting for all of them.
Those energy data values intersect the sizes of the fundamental physical constants: h, h-bar, delta, nuclear magneton, beta magneton, k (series). They quantize nuclear dynamics by acting as fulcrum particles. The result is the picoyoctometric, 3D, interactive video atomic model data point imaging function, responsive to keyboard input of virtual photon gain events by relativistic, quantized shifts of electron, force, and energy field states and positions.
Images of the h-bar magnetic energy waveparticle of ~175 picoyoctometers are available online at with the complete RQT atomic modeling guide titled The Crystalon Door, copyright TXu1-266-788. TCD conforms to the unopposed motion of disclosure in U.S. District (NM) Court of 04/02/2001 titled The Solution to the Equation of Schrodinger.

I found this interesting quote:

"In fact, just a few years ago it was common for DM pundits to reject MOND because 'it cannot be framed relativistically'. There were indeed various stumbling blocks on the way to this goal, but they proved surmountable. Today there are an handful of relativistic MOND gravity theories which disprove that pessimistic assessment."

from: Jacob D. Bekenstein, Relativistic MOND as an alternative to the dark matter paradigm, Nuclear Physics A, Volume 827, Issues 1-4, 15 August 2009, Pages 555c-560c

What say the DM pundits to this? Is it still evil?

Many thanks for the interesting post, but I think there is a missing ^2 on the right hand side of the following equation:

v^2/r >> GM/r

Touting one's favorite model overly strongly isn't unethical. Even doing unethical things isn't necessarily evil. (Exercise: explain the difference between unethical and immoral.) Evil is a strong word and should be reserved for things that actually are evil.

I guess the other question here is why someone thinks this paper is so ground breaking that it should be in Nature. rho ~ v^2/r^2 and the anticorrelation of rho with r0 has long been known.

MOND is amusing in its ability to resist most attempts to refute it, but it is too ugly to be an attractive theory of the universe. This is one reason that not that many people work on it - because they don't think it will be a success in the long run.

Steinn, is it really that easy to run a cosmological simulation with MOND instead of dark matter? There are two issues I can see here: one is that in MOND, far-field effects are not negligible (for example, if you are inside a spherical shell of matter and off-center, its gravitational force on you is not zero, while it is zero in Newtonian gravity. This is an issue when trying to figure out what MOND predicts for grav. lensing).

Second is that MOND isn't a theory. It has no GR metric and no way for us to calculate structure formation or the evolution of the scale factor. One could take a standard spectrum of perturbations and an open-universe cosmology and try to evolve it with a MOND gravity law, but it is not clear to me that doing so makes any sense at all.

What level of improvement/agreement/clarification one could achieve by applying a two-component virial theorem to the data?

I thought it would be interesting to point to a previous work of mine and collaborators (back from 2003) which shows that the central dark matter halo densities for a large data sample ranging from dwarf ellipticals to clusters of galaxies, based on the application of the two-component virial theorem (2VT) to these systems, do not show universality. Only the abstract is available:

Title:The case against scale-invariant central halo densities: implications for the self-interacting dark matter scenarios in the context of the two-component virial theorem
Authors: Ribeiro, A. L. B.; Dantas, C. C.; Capelato, H. V.; Carvalho, R. R.
Publication: Boletim da Sociedade Astronômica Brasileira (ISSN 0101-3440), vol.23, no.1, p. 163-163

I will attempt to find the poster PDF related to that report and make it opportunely available.

More on the 2VT can be found here:

Title: The Two-Component Virial Theorem and the Physical Properties of Stellar Systems
Authors: Dantas, Christine C.; Ribeiro, André L. B.; Capelato, Hugo V.; de Carvalho, Reinaldo R.
Publication: The Astrophysical Journal, Volume 528, Issue 1, pp. L5-L8, 2000

I've been away from that research for some years now, so I am not updated with the literature.

Two things from a meeting I was at this summer (which I think Ben was at as well, certainly a Ben was there).

1) Stacy McGaugh says that MOND does not work at cluster scales. He basically argues that MOND is really interesting for what it tells us about galaxy formation, ie., why do galaxies form in such a way that MOND works? He did not argue that MOND was interesting as theory in and of itself.

2) Simon White was of the opinion that we do not take MOND seriously. Rather, we are just reflexively dismissive. He said he did not think it would survive rigorous testing, but no one has ever really done that.

Point #2 relates to Ben's comment, could big N-body simulations really be done with MOND? If so, one could predict the power spectrum, given a variety of input conditions, and start comparing with various measured correlation functions.

Dark matter is the greatest fantasy of all time. Its very existence is pure speculation. Why not just be honest, why must the scientists try to find something to hide their ignorance? It is Ok to be ignorant, everyone is ignorant about something.

When dark energy is understood to be a complementary energy and its actual qualities known then the fantasy of dark matter will fade away with its foolish proponents.

James E Gambrell

By James E Gambrell (not verified) on 10 Oct 2009 #permalink

Well, you can easily do local modeling of MOND, the implementation of the dynamics is trivial, at least superficially there may be subtleties when it comes to actually doing it, I suppose.
A full blown cosmological simulation runs into the question of the initial conditions - with no FRW metric and presumably no inflation there is no rational choice for a power spectrum.
So, to do it, you either have to do a pseudo-GR model - which is clearly inconsistent, but just takes standard cosmology at some finite redshift slice, and then modify the potential gradients in the classical limit, leaving everything else the same.
Or, set up a Newtonian ballistically expanding universe with Gaussian scale free perturbations of the "right amplitude", in the linear regime, and then just run a pseudo-Newtonian simulation.
To do more than that requires either a full theory of MOND, either something in the TeVeS direction, or string inspired GR extensions with O(R) terms.

The reason a finite fraction of the community keeps an eye on MOND modeling, is partly the "coincidence" that t_H ~ c/a0 - if that is trivially true, I have not heard a convincing explanation, so it might be a hint about physics; and the other reason is that we keep running into approximately that acceleration in other contexts.
At the very least Cold Dark Matter must be doing something to arrange the disk-halo conspiracy, so maybe it is hinting at CDM interaction terms; or it is giving us a direct look at dark energy physics... darned if I know what though.
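A back-of-the-envelope check of that coincidence, with an assumed round H0 of 70 km/s/Mpc:

```python
import math

c = 2.998e8             # m/s
H0 = 70.0e3 / 3.086e22  # ~70 km/s/Mpc in 1/s; 1/H0 is roughly the Hubble time t_H
a0 = 1.2e-10            # m/s^2, the MOND acceleration scale

ratio = (c * H0) / a0
# c * H0 ~ 6.8e-10 m/s^2, so a0 ~ c * H0 / (2*pi) to within ~10% -
# either a numerical coincidence or a hint
```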

t_H ~ c/a0

That IS interesting.

t_H ~ c/a0

The problem with this is that it directly implies either a random coincidence, or that a0 should evolve in a particular way so as to preserve this relation. At early times a0 is larger so the non-Newtonian influence is greater. There is also another way of cooking the proportionalities to make a0 proportional to dark energy density.

If a0 evolves then the kinematics of galaxies at earlier times could be significantly different. This is a testable prediction, which is good. (It might also affect things like gravitational lensing and high-z clusters, but I haven't looked into that.)

There was an undergrad here who worked on using the measured evolution in the Tully-Fisher relation, for ex from Kassin's paper, to test this. It is actually a pretty promising test. See

Unfortunately, this paper resulted in email exchanges with certain MONDians that suggested to me that they were more interested in preserving the theory than understanding the observations. I didn't feel like spending a lot of my time correcting their issues, because they can be very tenacious. Some people also seem to think that the TeVeS generalization can magically produce an evolution of a0 that is more or less always indistinguishable from dark matter. At this point I started to lose interest. The undergrad has finished up so I'm not sure when the paper is going to be revised.

Ben (that Ben)

@Ben - Yup.

I don't think there is really a "fundamental acceleration", although it is probably worth a couple of years of underemployed particle theorists to see if something like that can be extracted from string theory.

I really think the immediate interest in MOND like models is how it highlights the disk-halo conspiracy and whether that is providing a clue to dark matter physics, or possibly dark energy.

Anyway, I did a quick peek at ADS - just under 500 papers mention MOND in the abstract, 16,500 papers mention dark energy, and 32,800 mention dark matter.
Sounds about right, about 100:1 odds for MOND ;-)

How does MOND deal with these lensing maps that appear to show complex dark matter distributions in colliding clusters? It seems like those would be difficult to replicate with gravity that was only a function of the visible mass, but surely the MONDians have thought about it...

@Anne - right now they probably point at Rachel Bean's paper and snicker...

There is no good model for weak lensing in MOND. Requires a relativistic extension or something like f(R) gravity.

Gentile et al., Nature, 461, 627, 2009, is working in the one sector where MOND achieves its successes, so its result is not surprising. Its principal claim, that the luminous matter surface density is constant, is not surprising either, since that implies a constant central surface brightness, which goes way back to Freeman, K. C. 1970, ApJ, 160, 811, and probably earlier. What is surprising is that Nature published such an unsurprising result. That required some negligent and sloppy referees.

But wait: negligent and sloppy refereeing is par for the course so there is nothing surprising here.