All solar system bodies that are directly irradiated by galactic cosmic rays emit enough neutrons to allow measurement from space. These leakage neutron fluxes are indicators of the surface composition, with the sensitivity depending on the neutron energy (1). Recent work proposes geochemical interpretations of these fluxes: the thermal energy range is sensitive to iron, titanium, rare earth elements and thorium (2, 3), the epithermal energy range is sensitive to hydrogen, samarium and gadolinium (2), and the fast energy range is representative of the average soil atomic mass (4). Nevertheless, these studies assume a composition that is uniform within the footprint of the spectrometer and independent of depth. We show in this abstract that a stratified composition could significantly change the flux intensity and complicate the interpretation of the measurements. The neutron leakage flux results from a competition between production effects (dominant at high energies) and diffusion-capture effects (mostly important at low energies). On the one hand, the elements that produce the largest numbers of neutrons in typical lunar compositions, iron and titanium, also have large neutron absorption cross sections. On the other hand, the maximum of neutron intensity does not occur at the surface but at a depth of about 180 g cm⁻². Therefore, an iron- and/or titanium-rich soil (strong neutron production) overlain by a top layer poorer in iron and/or titanium (i.e., more transparent to neutrons) should show an enhanced leakage flux compared with a uniform composition.
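
The qualitative argument above can be illustrated with a minimal one-dimensional random-walk sketch: neutrons are born at depth near the production maximum, diffuse in column density, and are either captured (more likely in Fe/Ti-rich material) or leak through the surface. All parameters below (mean free path, source spread, per-collision capture probabilities, top-layer thickness) are illustrative assumptions, not measured lunar values; the sketch only demonstrates the direction of the stratification effect described in the text.

```python
# Toy 1-D Monte Carlo sketch of thermal-neutron leakage from a layered soil.
# Numerical values are assumed for illustration only; the point is the
# qualitative comparison between a uniform Fe/Ti-rich soil and the same
# soil overlain by a more neutron-transparent (Fe/Ti-poor) top layer.
import random

SURFACE = 0.0          # column depth of the surface, g/cm^2
MFP = 20.0             # assumed scattering mean free path, g/cm^2
SOURCE_PEAK = 180.0    # depth of maximum neutron production, g/cm^2 (from text)
N_NEUTRONS = 200_000

def capture_prob(depth, top_thickness, p_top, p_bulk):
    """Capture probability per collision; lower in the Fe/Ti-poor top layer."""
    return p_top if depth < top_thickness else p_bulk

def leakage_fraction(top_thickness, p_top, p_bulk, rng):
    escaped = 0
    for _ in range(N_NEUTRONS):
        # assumed source profile: Gaussian spread around the production peak
        depth = max(rng.gauss(SOURCE_PEAK, 60.0), 1.0)
        while True:
            # 1-D step: up or down by one exponentially distributed free path
            depth += rng.choice((-1.0, 1.0)) * rng.expovariate(1.0 / MFP)
            if depth <= SURFACE:
                escaped += 1      # neutron leaks out and can be "measured"
                break
            if rng.random() < capture_prob(depth, top_thickness, p_top, p_bulk):
                break             # neutron absorbed (e.g. by Fe or Ti)
    return escaped / N_NEUTRONS

rng = random.Random(42)
uniform = leakage_fraction(top_thickness=0.0,  p_top=0.08, p_bulk=0.08, rng=rng)
layered = leakage_fraction(top_thickness=50.0, p_top=0.02, p_bulk=0.08, rng=rng)
print(f"uniform Fe/Ti-rich soil : leakage fraction {uniform:.3f}")
print(f"Fe/Ti-poor top layer    : leakage fraction {layered:.3f} (expected larger)")
```

Under these assumed parameters the stratified case leaks a noticeably larger fraction of neutrons than the uniform case, consistent with the enhancement argued for above; the actual magnitude of the effect would of course require full neutron transport calculations with realistic compositions.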