A&A 383, 302-308 (2002)
DOI: 10.1051/0004-6361:20011686

New information recovered from the Pioneer 11 meteoroid experiment data

V. Dikarev1,2 - E. Grün1,3


1 - Max-Planck-Institut für Kernphysik, Heidelberg, Germany
2 - Astronomical Institute of St. Petersburg State University, Russia
3 - Hawaii Institute of Geophysics and Planetology, University of Hawaii, Honolulu, USA

Received 22 October 2001 / Accepted 27 November 2001

Abstract
Data of the Pioneer 11 meteoroid experiment are re-evaluated. A probabilistic model of the dust detector is constructed with no assumption on the flux of particles, using only the built-in redundancy of the instrument. The analysis of the redundant data strongly suggests that the instrument suffered a failure at launch that disabled a significant fraction of its impact sensors. This failure reduced the total sensitive area of the detector, so that the fluxes derived earlier under the assumption that the instrument was in good health underestimated the true fluxes. We apply our model to re-derive the true particle fluxes, now taking the reduction of the initial sensor number into account. In effect, we implement a kind of in-flight calibration of a dust detector in the natural meteoroid environment. We end up with higher true fluxes and wider confidence intervals that represent the best knowledge of the instrument's in-flight characteristics.

Key words: methods: statistical - instrumentation: detectors - meteors, meteoroids


1 Introduction

The Pioneer 10 and 11 spacecraft carried in-situ dust detectors into the outer solar system (Humes 1980). To the present day these detectors have provided the only dust flux measurements beyond the orbit of Jupiter, apart from a cleverly conceived interpretation of data from the plasma wave instrument on board Voyager 1 and 2 (Gurnett et al. 1997). The Pioneer meteoroid experiments produced somewhat inconsistent results, since the instruments did not operate as expected, yet interest in the data has remained very high owing to the scarcity of dust measurements in the outer solar system.

The Pioneer 11 dust detector consisted of 234 pressurized gas cells and a device to monitor the pressure in each cell. When a meteoroid punctured the cell wall, the gas escaped from the cell, and the loss of pressure was detected. The detector electronics could not distinguish between individual cells, but the cells were divided into two channels of 108 and 126 cells, respectively, that recorded penetrations independently. This provided a means for a redundancy check of the dust experiment results. The impact counts of the two channels turned out to be inconsistent, while the twin instrument on board Pioneer 10 provided useful information from only one of its channels.

In this work, we re-evaluate the data of the Pioneer 11 dust experiment. We describe the dust detector in Sect. 2 and then construct a model of the experiment (Sect. 3) with no assumption on the flux of particles that puncture the detector cells. The modeled phenomenon is similar to a coin tossing game: a detection in either channel corresponds to a head or a tail. In the case of the Pioneer dust experiment, however, each penetration disables the punctured cell and thereby changes the channel's sensitive area. We incorporate this feature of the instrument in our probabilistic model.

The model provides the expectation, standard deviation and other probabilistic characteristics of the number of impacts in each channel. This information is used to assess the quality of the actual data under the assumption that the instrument was in good health. In Sect. 4 we show that the experiment data were dramatically inconsistent and that an explanation is required. We find, however, that the data pass the redundancy check if the initial ratio of cells in the channels was different from its setup value (Sect. 5), and suggest a reason: demolition of some cells during the spacecraft launch. Based on our model, we re-derive particle fluxes from the impact counts and estimate error margins, now taking the uncertainty of the initial cell number ratio into account (Sect. 6), and discuss the implications of the new findings for meteoroid models in Sect. 7. Conclusions are drawn in Sect. 8.

Figure 1: Results of the Pioneer 11 meteoroid experiment. Time history of the impacts in pressurized cells.

  
2 The Pioneer 11 meteoroid experiment

In this section, we reproduce the part of the instrument description from Humes et al. (1974) and Humes (1980) that is sufficient to construct our model. For more details see these papers and references therein.

The dust detector on Pioneer 11 consisted of 234 pressurized gas cells divided into two "channels'' for redundancy. The gas pressure in each cell was monitored as follows. A high voltage of 525 V was applied across the electrodes inserted in each cell, and the cells were pressurized to 1175 torr at $295^\circ$. At that temperature, conduction begins when the pressure is reduced to $\approx$130 torr and stops when the pressure falls below $\approx$2 torr. If a meteoroid punctures the wall, the gas escapes from the cell, and conduction is measured over a small range of pressure. The on-board electronics then increments the counter of the corresponding channel. In order to prevent multiple increments of the counter during long gas leakages through very small holes, the electronics was turned off for a dead time of 80 min after each penetration. If this time is not sufficient, the counter advances again, and the electronics is turned off for the next 80 min.

The cells were mounted on the back side of the high-gain antenna. The spacecraft was spinning, with the spin axis parallel to the antenna direction. The two channels had nearly identical spin-averaged fields of view, although their constituent sensors had different instantaneous fields of view due to unequal shadowing by the spacecraft body. The rotation period of the spacecraft was very short ($\sim$10 s), much shorter than the time between meteoroid detections. Therefore, the probability of detecting a meteoroid in a certain channel depends solely on the number of active cells in the channel, not on the flux direction of the meteoroids. This is similar to the probabilities of head and tail in a coin tossing game. In contrast to that game, a punctured cell is disabled forever and cannot detect meteoroids any more. The total sensitive area of the detector therefore decreased as it accumulated impact statistics. The original setup of the instrument included 108 cells in channel 0 and 126 cells in channel 1. The sensitive area of one cell is  $2.45\times10^{-3}\;\mbox{m}^2$.
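To illustrate this depleting-cell analogy, here is a minimal Monte Carlo sketch in Python; the cell numbers and the total impact count are the values quoted in the text, and the assumption that every active cell is equally likely to be hit by the next impactor is just the spin-averaging argument above:

import random

def simulate_channel_counts(a0=108, a1=126, n_impacts=115):
    """Draw which channel records each puncture; the probability is
    proportional to the number of still-active cells in the channel."""
    z = u = 0  # punctures counted so far in channels 0 and 1
    for _ in range(n_impacts):
        active0, active1 = a0 - z, a1 - u
        if random.random() < active0 / (active0 + active1):
            z += 1  # a channel-0 cell is punctured and permanently disabled
        else:
            u += 1
    return z, u

print(simulate_channel_counts())  # counts scatter around the setup 6:7 split

Repeated runs of this sketch give channel counts that scatter around the 108:126 split, which is the behaviour tested quantitatively in the following sections.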

Figure 2: Results of the Pioneer 11 meteoroid experiment. The detector was composed of two data channels, denoted by symbols 0 and 1, each attached to an independent set of identical sensors, and all impact records were marked by the channel symbol. We sorted the records in time and removed all but the channel symbols. The sequence of channel symbols is interesting in its own right, since it allows for a redundancy check of the experiment results. It contains 115 symbols, 73 zeroes and 42 ones.

The penetration history is available for the time interval between 1973 and 1983. In total, 115 impacts were recorded with timing accurate enough to infer fluxes, of which 73 were in channel 0 and 42 in channel 1. The cumulative number of impacts is shown as a function of mission time in Fig. 1.

  
3 The model

Let ${\cal X} = X_1 X_2 \ldots X_N$ be the sequence of N channel symbols put in the order of detection time. Define the number of zeroes (impacts detected in channel 0) as $Z({\cal X})$, and the number of ones (impacts detected in channel 1) as  $U({\cal X})$. For example, the first five impacts in the Pioneer 11 dust detector (Fig. 2) form ${\cal X} = 11\,000$, $Z({\cal X}) = 3$ and $U({\cal X}) = 2$. Our goal is to find the probability of observing the sequence $\cal X$ as a result of the experiment.

Whatever is the flux of particles bombarding the dust detector, the spin-averaged probability of detection in a certain channel is proportional to the number of working cells in this channel. So one writes

$\displaystyle P(\{X_1 = 0\}) = {A_0 \over A_0 + A_1}$     (1)
$\displaystyle P(\{X_1 = 1\}) = {A_1 \over A_0 + A_1}$     (2)

where $A_0$ and $A_1$ are the numbers of cells in channels 0 and 1, respectively. Each detection changes the number of cells in a channel, so that
 
$\displaystyle P (X_1 X_2 \ldots X_N) = P (X_1)\, P (X_2 / X_1) \times \ldots \times P (X_N / X_1 X_2 \ldots X_{N-1})$     (3)

where P(A/B) means, as usual, the conditional probability of the event A given the event B. Calculation of the probability  $P({\cal X})$ is easy if one knows the conditional probabilities $P (X_k / X_1 X_2 \ldots X_{k-1})$.

After $k-1>0$ detections the number of cells in the channels will change according to the number of zeroes and ones in ${\cal X}_{k-1} = X_1 X_2 \ldots X_{k-1}$. Channel 0 will keep $A_0-Z(X_1 X_2 \ldots X_{k-1})$ cells, channel 1 will consist of $A_1-U(X_1 X_2 \ldots X_{k-1})$ cells. The conditional probabilities are

$\displaystyle P(\{X_k = 0\} / {\cal X}_{k-1}) = {A_0 - Z_{k-1} \over A_0 + A_1 - k + 1}$     (4)
$\displaystyle P(\{X_k = 1\} / {\cal X}_{k-1}) = {A_1 - U_{k-1} \over A_0 + A_1 - k + 1}$     (5)

where the shortcuts $Z_{k-1}$ and $U_{k-1}$ stand for the numbers of zeroes and ones in $X_1 X_2 \ldots X_{k-1}$. This fact can be written in the algebraic form

\begin{displaymath}
P(X_k / {\cal X}_{k-1}) = {(A_0 - Z_{k-1})^{1-X_k}\, (A_1 - U_{k-1})^{X_k} \over A_0 + A_1 - k + 1}
\end{displaymath} (6)

which after substitution in the expansion (3) yields a surprisingly simple expression

\begin{displaymath}
P({\cal X}) =
{(A_0 + A_1 - N)! \over (A_0 + A_1)!}\,
{A_0! \over (A_0 - Z({\cal X}))!}\,
{A_1! \over (A_1 - U({\cal X}))!}\cdot
\end{displaymath} (7)

Interestingly, although each impact changes the ratio of cell numbers in the channels, the probability of observing a sequence $\cal X$ does not depend on the order of the symbols  $X_1 X_2 \ldots X_N$.
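A short numerical sketch of Eq. (7), evaluated through log-factorials (math.lgamma) to avoid overflow for realistic cell numbers; the example values are the first five Pioneer 11 impacts quoted above:

from math import lgamma, exp

def log_sequence_probability(a0, a1, z, u):
    """log P(X) from Eq. (7); it depends only on the counts z = Z(X), u = U(X)."""
    if z > a0 or u > a1:
        return float("-inf")  # more punctures than cells: impossible sequence
    n = z + u
    return (lgamma(a0 + a1 - n + 1) - lgamma(a0 + a1 + 1)   # (A0+A1-N)!/(A0+A1)!
            + lgamma(a0 + 1) - lgamma(a0 - z + 1)           # A0!/(A0-Z)!
            + lgamma(a1 + 1) - lgamma(a1 - u + 1))          # A1!/(A1-U)!

# first five impacts: X = 11000, hence Z(X) = 3 and U(X) = 2
p = exp(log_sequence_probability(108, 126, 3, 2))

The guard against more punctures than cells anticipates the generalization $P({\cal X})=0$ introduced at the end of this section.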

The probability (7) allows one to calculate moments of various functions of the random sequence $\cal X$. The expectation of the number of zeroes  $Z({\cal X})$, for example, is given by

 \begin{displaymath}
EZ = \sum_{\cal X} Z({\cal X}) P({\cal X}),
\end{displaymath} (8)

the dispersion is provided by $E\, (Z - EZ)^2$, and so on. It is not practical, however, to sweep through the whole set of $2^N$ sequences  $X_1 X_2 \ldots X_N$ in the sum (8). Since the probability  $P({\cal X})$ does not depend on the order of symbols in the sequence, a simplified formula is applicable:

 \begin{displaymath}
EZ = \sum_{k=0}^N k \, P_k \, C(N,k)
\end{displaymath} (9)

where $P_k$ is the probability of a sequence containing k zeroes, and $C(N,k) = N! \, / \, [k! \, (N-k)!]$.
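As a sketch, Eq. (9) and the corresponding dispersion can be evaluated by a direct sum; the distribution of Z over sequences of fixed length is in fact the hypergeometric distribution of drawing N cells without replacement from the two channels, which is why the sum below runs over the hypergeometric support only:

from math import lgamma, exp

def log_comb(n, k):
    return lgamma(n + 1) - lgamma(k + 1) - lgamma(n - k + 1)

def moments_of_Z(a0, a1, n):
    """Expectation and dispersion of Z via Eq. (9); P_k is the probability of one
    particular sequence with k zeroes and C(N,k) counts such sequences."""
    ez = ez2 = 0.0
    for k in range(max(0, n - a1), min(n, a0) + 1):
        log_pk = (lgamma(a0 + a1 - n + 1) - lgamma(a0 + a1 + 1)
                  + lgamma(a0 + 1) - lgamma(a0 - k + 1)
                  + lgamma(a1 + 1) - lgamma(a1 - (n - k) + 1))
        w = exp(log_pk + log_comb(n, k))  # P_k * C(N,k)
        ez += k * w
        ez2 += k * k * w
    return ez, ez2 - ez * ez  # EZ and the dispersion E(Z - EZ)^2

ez, dz = moments_of_Z(108, 126, 115)  # EZ = 115*108/234, about 53.1 for the setup values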

In the following sections, we will test various hypotheses about the dust instrument against the data retrieved. The tests will include variation of the initial cell numbers $A_0$ and $A_1$, so it may happen that in an intermediate calculation $Z({\cal X}) > A_0$ or $U({\cal X}) > A_1$. In such a situation, we generalize the probability (7) by putting  $P({\cal X})=0$.

  
4 Inconsistency of experiment data

Evaluation of Eq. (9) for Z and U gives a trivial result

 \begin{displaymath}
{EZ \over N} = {A_0 \over A_0 + A_1},\;\;\;
{EU \over N} = {A_1 \over A_0 + A_1}
\end{displaymath} (10)

which is the same as the expectation values for Bernoulli trials with $p=A_0/(A_0+A_1)$ and $q=1-p=A_1/(A_0+A_1)$. The ratio of penetration counts in channels 0 and 1, i.e. $Z({\cal X}){:}U({\cal X})$, should, on average, resemble the initial ratio of cell numbers in the channels. For the Pioneer 10 and 11 meteoroid experiments, the original setup ratio was  $A_0{:}A_1 = 108{:}126 = 6{:}7$. However, the actual penetration counts reveal a different ratio, $Z{:}U \approx 2{:}1$.

Since the penetration counts differ so strongly from the expected ratio, we are interested to know how far the actual count can deviate from the expectation (10) due to statistical fluctuations. One way to get the answer is to calculate the standard deviation (square root of the dispersion) of the random value  $Z({\cal X})$ and compare it with the deviation of the experiment result. The dispersion is

 \begin{displaymath}
{DZ \over N} = {A_0 A_1 \over (A_0 + A_1)^2} \; {A_0 + A_1 - N \over A_0 + A_1 - 1}
\end{displaymath} (11)

and in Fig. 3 we plot the number of zeroes  $Z({\cal X})$ as a function of the length of the sequence $\cal X$, along with the expectation value EZ bracketed by $\pm\sqrt{DZ}$ margins. Note that the dispersion (11) is similar to the dispersion of Bernoulli trials when $N \ll A_0 + A_1$. It vanishes when the number of cells is exhausted, i.e. $N=A_0+A_1$.

By the end of the sequence, the deviation of the Pioneer 11 penetration count ratio from the expectation exceeds $5\sigma$. For a Gaussian distribution, this would be sufficient to reject an experiment result; our case of a discrete distribution, however, may need a more thorough consideration. Using our model (7) we can directly calculate the probability of such a high deviation of the experiment from the expectation:

 \begin{displaymath}
P(\{Z \ge Z_{\rm pio}\}) = \sum_{k=Z_{\rm pio}}^N P_k \, C(N,k),
\end{displaymath} (12)

where  $Z_{\rm pio}>EZ$ is the result of the experiment. Evaluation of Eq. (12) for the data retrieved, with the instrument assumed to be in good health, i.e. N=115, $Z_{\rm pio} = 73$, $A_0=108$ and $A_1=126$, yields $P = 1.5\times10^{-7}$. Only one experiment in 7.5 million would result in such a high deviation due to statistical fluctuations. We must either reject such experiment results or admit that something is wrong in our understanding of the instrument.
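Since $P_k\,C(N,k)$ coincides with the probability mass function of the hypergeometric distribution noted above (drawing N cells without replacement from $A_0+A_1$ cells, of which $A_0$ belong to channel 0), the deviation probability (12) can be checked with a standard library routine. A minimal sketch using scipy, with the numbers quoted in the text:

from scipy.stats import hypergeom

a0, a1, n_impacts, z_pio = 108, 126, 115, 73
# arguments: total cells, channel-0 cells, number of impacts drawn without replacement
rv = hypergeom(a0 + a1, a0, n_impacts)

p_tail = rv.sf(z_pio - 1)               # P(Z >= Z_pio), Eq. (12)
n_sigma = (z_pio - rv.mean()) / rv.std()
print(p_tail, n_sigma)                  # of the order of 1e-7 and beyond 5 sigma, as quoted above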

  
5 Inference of cell number ratio from experiment data

Already a first glance at the graph in Fig. 3 suggests that the ratio $A_0{:}A_1$, which determines the path of the expectation line, can be adjusted so as to embrace the actual experiment results within the interval of permissible fluctuations. This adjustment, however, has to be explained.


Figure 3: Deviation of the Pioneer 11 detector measurements from the theoretical expectation based on the assumption that the instrument was in good health. Shown is the cumulative number of impacts in channel 0 as a function of the total number of impacts.

Consider a possible partial damage of the instrument. One can suggest that some cells were punctured or disconnected from the power supply, and thereby disabled, at the launch of the spacecraft. Note that channel 1 of the almost identical Pioneer 10 dust detector did not work at all. Such a loss of sensors would change the ratio $A_0{:}A_1$.

A ratio $A_0{:}A_1$ is considered good if the measured deviation of $Z{:}U$ from this ratio is small. Using Eq. (12) when $Z_{\rm pio}>EZ$ and

 \begin{displaymath}
P(\{Z \le Z_{\rm pio}\}) = \sum_{k=0}^{Z_{\rm pio}} P_k \, C(N,k),
\end{displaymath} (13)

when $Z_{\rm pio} \le EZ$, we can test various hypotheses on $A_0$ and $A_1$ against the actual data. Indeed, the larger the probability (12) or (13), the better the hypothesis.

For a fixed value of $A_0=108$ (that is, assuming no cell of channel 0 was affected by the failure), the best ratio is given by $A_1=62$. The confidence interval around this optimum is large, however: for all $A_1$ from 43 to 75 the deviation probability is above 5% (95% confidence interval). The hypothesis that the cells of channel 0 survived is supported, to some degree, by the Pioneer 10 meteoroid experiment, whose channel 0 recorded 95 penetrations out of its 108 cells, which means that channel 0 of Pioneer 10 probably survived the launch. If the failure that disabled channel 1 on Pioneer 10 and the one that substantially reduced the number of active cells in channel 1 on Pioneer 11 were of the same nature, and if the survival of the cells in channel 0 on Pioneer 10 was not just fortunate, it is reasonable to start with all 108 cells in channel 0 on Pioneer 11 as well.
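A sketch of this scan for fixed $A_0=108$, reusing the hypergeometric form of the model; the 5% threshold and the scan range are those discussed in the text:

from scipy.stats import hypergeom

def deviation_probability(a0, a1, n_impacts, z_obs):
    """Probability of a deviation at least as large as the observed one:
    Eq. (12) when z_obs > EZ, Eq. (13) otherwise."""
    rv = hypergeom(a0 + a1, a0, n_impacts)  # total cells, channel-0 cells, impacts
    if z_obs > rv.mean():
        return rv.sf(z_obs - 1)             # P(Z >= z_obs)
    return rv.cdf(z_obs)                    # P(Z <= z_obs)

# fix A0 = 108 and scan the initial number of active cells in channel 1
probs = {a1: deviation_probability(108, a1, 115, 73) for a1 in range(42, 127)}
best_a1 = max(probs, key=probs.get)                      # should come out near 62
likely_a1 = [a1 for a1, p in probs.items() if p > 0.05]  # roughly 43 to 75, per the text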

However, this may not be true, and we allow $A_0$ to vary as well. We then obtain a region of likely parameter values (Fig. 4). In order to convert penetration counts into fluxes, the sensitive area of the instrument is required. Figure 4 shows, however, that there is a range of likely areas, so we cannot simply pick a pair ($A_0$, $A_1$) and use the sum multiplied by the single-cell area. We have to treat the sensitive area of the instrument in a probabilistic way. It is worthwhile to emphasize that, unlike previous works, this paper has to consider both the impact counts and the instrument's sensitive area as unknown, random entities given by their probabilities only.


Figure 4: The map of probabilities to observe the deviation of the number of impacts in the detector's channel 0 from its expectation for different initial active sensor numbers in the two channels.

  
6 Corrected penetration fluxes

Consider the problem of inference of the flux from the count taken by an instrument. The count is related to the flux through the Poisson distribution

 \begin{displaymath}
P (\lambda, n) = {\lambda^n \over n!} \, {\rm e}^{-\lambda}.
\end{displaymath} (14)

The maximum likelihood principle suggests that  $\hat \lambda = n$ should be taken as the best estimate of the flux $\lambda$ given the count n. However, the count n could also result from a measurement of a flux  $\lambda \ne \hat\lambda$ because of statistical fluctuations. That is why confidence limits are often calculated, i.e. the lower and upper bounds of the interval bracketing all fluxes $\lambda$ that could lead to the count n with probabilities above an adopted confidence level.

The confidence limits for the Poisson distribution (14) are given by the equations

  
$\displaystyle \sum_{k=n}^\infty P(\lambda_{\rm lower}, k) = {\alpha\over2},$     (15)
$\displaystyle \sum_{k=0}^n P(\lambda_{\rm upper}, k) = {\alpha\over2}$     (16)

where n is the result of the measurement, $\lambda_{\rm lower}$ and $\lambda_{\rm upper}$ are the lower and the upper confidence limits for the flux, respectively, and $1-\alpha$ is the confidence level, or the probability to get the true flux in the interval  $[\lambda_{\rm lower}, \lambda_{\rm upper}]$. Following Humes (1980) we set the confidence level to 90%, i.e. $\alpha=0.1$.
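Eqs. (15)-(16) can be inverted in closed form with chi-square quantiles, using the standard identity between the cumulative Poisson and chi-square distributions; a minimal sketch using scipy:

from scipy.stats import chi2

def poisson_confidence_limits(n, alpha=0.1):
    """Exact Poisson confidence limits, Eqs. (15)-(16):
    P(X >= n | lam) = alpha/2 gives lam_lower = chi2.ppf(alpha/2, 2n)/2,
    P(X <= n | lam) = alpha/2 gives lam_upper = chi2.ppf(1 - alpha/2, 2n + 2)/2."""
    lam_lower = 0.5 * chi2.ppf(alpha / 2, 2 * n) if n > 0 else 0.0
    lam_upper = 0.5 * chi2.ppf(1 - alpha / 2, 2 * (n + 1))
    return lam_lower, lam_upper

lo, hi = poisson_confidence_limits(10)   # 90% limits for a bin with n = 10 penetrations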

In order to apply the described method to the Pioneer 11 data, one has to bin the impact events into the time intervals of interest. Naturally, any researcher is interested in a high time resolution of the flux. However, only a limited number of impact events is available, and the fewer impacts per bin, the worse the accuracy of the flux estimates.

Moreover, when working with the Pioneer 11 meteoroid experiment data, one has to deal with a sensitive area that is consumed by the impacts, so that the Poisson mean

 \begin{displaymath}
\lambda \approx F \cdot A \cdot T
\end{displaymath} (17)

is a function of the true penetration flux F, the changing sensitive area A and the exposure time T. Therefore, in order to keep the approximate Eq. (17) applicable, it is necessary to ensure that the change of the sensitive area A within each time bin is sufficiently small, i.e. that n is not too large.

As a compromise, we choose time bins such that n is not far from 10, and set the boundaries of the intervals according to the Pioneer 11 orbital segments of interest. The maximum likelihood estimates of the flux and the confidence limits are calculated for each time bin. They are plotted in Fig. 5 (top). Note that in contrast to Humes (1980) we merge data from the two channels (we sum both penetration counts and sensitive areas) and apply the Poisson distribution instead of $\chi^2$. Therefore, we obtain different results.
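A sketch of the per-bin estimate implied by Eq. (17) together with the limits above; the bin contents used in the example are placeholders, not the mission data, and the single-cell area is the value quoted in Sect. 2:

CELL_AREA = 2.45e-3  # m^2 per cell, Sect. 2

def flux_in_bin(n, active_cells, exposure_days, alpha=0.1):
    """Penetration flux and confidence limits for one time bin, F ~ n / (A*T)."""
    area = active_cells * CELL_AREA                       # merged sensitive area, m^2
    lam_lo, lam_hi = poisson_confidence_limits(n, alpha)  # from the sketch above
    scale = area * exposure_days
    return n / scale, lam_lo / scale, lam_hi / scale      # per m^2 per day

# e.g. a bin with 10 penetrations, ~200 cells still active and 300 days of exposure
f_ml, f_lo, f_hi = flux_in_bin(10, 200, 300.0)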

Our next step is to incorporate the newly established knowledge of the active cell number. In addition to the Poissonian noise (14), we have an uncertainty of the sensitive area of the instrument. Any pair of active cell numbers $A_0$ and $A_1$ in the two channels is now assigned a probability  $P_{A_0, A_1} (Z_{\rm pio})$ of yielding the sequence of channel symbols containing  $Z_{\rm pio}$ zeroes,

\begin{displaymath}P_{A_0, A_1} (Z_{\rm pio}) = P_k \, C(N,k)
\end{displaymath}

for $k=Z_{\rm pio}$ in terms of Eq. (9). For small n, the probabilities  $P_{A_0, A_1} (Z_{\rm pio})$ and  $P(\lambda, n)$ are independent. Then the maximum likelihood estimate of the flux is obtained by maximizing

\begin{displaymath}\max_{\lambda,A_0,A_1} P_{A_0,A_1} (Z_{\rm pio}) \times P (\lambda, n),
\end{displaymath} (18)

which yields  $\hat \lambda = n$ and  $\hat F = \hat \lambda / [A T]$ where A is the area corresponding to the maximum  $P_{A_0, A_1} (Z_{\rm pio})$.

The deviation probabilities (12) and (13), in turn, give the confidence level for our choice of A0 and A1. The confidence limits for the flux (per cell per exposure time) in the case of unknown active cell number are given by

  
$\displaystyle \sum_Z P_{A_0,A_1} (Z) \times \sum_{k=n}^\infty P (F_{\rm lower}^{A_0,A_1} \cdot (A_0 + A_1), k) = {\alpha\over4},$     (19)
$\displaystyle \sum_Z P_{A_0,A_1} (Z) \times \sum_{k=0}^n P (F_{\rm upper}^{A_0,A_1} \cdot (A_0 + A_1), k) = {\alpha\over4}$     (20)

with the further choice of the minimum  $F_{\rm lower}^{A_0,A_1}$ and the maximum  $F_{\rm upper}^{A_0,A_1}$ over all solutions of Eqs. (19)-(20) for different combinations of $A_0$ and $A_1$. The sums over Z run over all Z that deviate from the expectation $EZ(A_0,A_1)$ as strongly as  $Z_{\rm pio}$ or more.
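A sketch of how Eqs. (19)-(20) can be solved numerically. It reuses deviation_probability() from the sketch in Sect. 5 (the sum over Z in Eqs. (19)-(20) is exactly that quantity), expresses the flux per cell per exposure time as in the text, and the search ranges (at least 73 cells in channel 0, at least 42 in channel 1, up to the setup numbers) are only illustrative assumptions:

from scipy.stats import poisson
from scipy.optimize import brentq

def flux_limits_unknown_area(n, z_pio=73, n_impacts=115, alpha=0.1,
                             a0_range=range(73, 109), a1_range=range(42, 127)):
    """Per-cell flux limits from Eqs. (19)-(20) when (A0, A1) is unknown."""
    f_lo, f_hi = float("inf"), 0.0
    for a0 in a0_range:
        for a1 in a1_range:
            p_dev = deviation_probability(a0, a1, n_impacts, z_pio)
            if p_dev < alpha / 4:          # Eqs. (19)-(20) have no solution here
                continue
            cells = a0 + a1
            lower = brentq(lambda f: p_dev * poisson.sf(n - 1, f * cells) - alpha / 4,
                           1e-12, 1e3)
            upper = brentq(lambda f: p_dev * poisson.cdf(n, f * cells) - alpha / 4,
                           1e-12, 1e3)
            f_lo, f_hi = min(f_lo, lower), max(f_hi, upper)
    return f_lo, f_hi

Restricting a0_range to the single value 108 corresponds to the middle panel of Fig. 5, while the full ranges correspond to the bottom panel.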


Figure 5: Fluxes on the Pioneer 11 dust detector inferred under different assumptions on the number of active sensors at the beginning of the mission. Top: the original setup's number of sensors is assumed. Middle: all sensors in channel 0 are assumed active at the beginning of the mission, while the number of sensors in channel 1 is only required to satisfy the confidence level. Bottom: a view free of a-priori assumptions on the initial active sensor numbers; the numbers of sensors in both channels are only required to satisfy the confidence level. The heliocentric distance on the upper scale shows where the spacecraft was located at the time on the lower scale; "J'' stands for the Jupiter fly-by, "P'' is the perihelion distance (3.73 AU) of the post-Jupiter orbit, "S'' is the Saturn fly-by. The fluxes during the fly-bys are not shown.

In other words, this procedure extracts the minimum and maximum fluxes that could have led to the recorded count n under the hypothesis ($A_0$, $A_1$) and the corresponding sensitive area, whose likelihood is indicated by $P_{A_0, A_1} (Z_{\rm pio})$.

The middle and bottom panels in Fig. 5 show the fluxes and confidence intervals calculated using Eqs. (19)-(20). The middle panel is generated under the assumption that all cells in channel 0 survived the launch, while an unknown fraction of the cells in channel 1 could have been destroyed. The probabilities  $P_{A_0, A_1} (Z_{\rm pio})$ for $A_0=108$ and arbitrary $A_1\le126$ were used. Since the optimum ratio is $A_0{:}A_1=108{:}62$, the acting sensitive area of the detector is lower than that of the original setup, and the inferred fluxes are raised. Note that the confidence intervals in the top and middle panels overlap only partially. The setup ratio $A_0{:}A_1$ (top panel) is extremely improbable in view of $P_{A_0, A_1} (Z_{\rm pio})$, and the lowest fluxes still detectable with the sensitive area of the original setup become too small to have been detected with the partially disabled sensitive area. Those lowest fluxes are therefore disregarded by the procedure in the middle panel.

The bottom panel is generated under no a-priori assumption on the initial number of cells. The probabilities $P_{A_0, A_1} (Z_{\rm pio})$ for arbitrary $A_0\le108$ and $A_1\le126$ were used. This freedom results in very large confidence intervals. Moreover, it turns out that small $A_0$ and $A_1$ have a somewhat higher probability  $P_{A_0, A_1} (Z_{\rm pio})$ than large ones, and therefore the maximum likelihood estimator prefers a small sensitive area, leading to high fluxes. The confidence limits in the middle panel, however, lie inside the confidence intervals of the bottom panel.

The bottom panel shows the least tendentious view of the Pioneer 11 meteoroid experiment results. Although it may seem too pessimistic, since only weak constraints can be put on meteoroid models, it represents the best that the experiment's data can give when processed without any unfounded assumptions.

  
7 Implications for meteoroid models

We have re-evaluated data of the Pioneer 11 meteoroid experiment using built-in redundancy of the instrument. We applied a dedicated model to re-derive the true particle fluxes, and obtained higher fluxes and wider confidence intervals that represent the best knowledge of the instrument's in-flight characteristics. The newly established wide confidence intervals put weak constraints on the meteoroid models in the outer solar system. In particular, the conclusion of Humes (1980) that the data of Pioneer 11 demand a population of meteoroids in randomly oriented and highly eccentric orbits should be re-addressed. This population was included in the meteoroid model (Divine 1993, "halo'' population) to reproduce the Pioneer data, and according to the model it dominates particle number densities in the outer solar system.

The new information we have recovered from the experiment data is that the experiment results are of less utility than previously thought, but the constraints on the penetration flux are better justified. Note that when fitting models to the least tendentious view of the Pioneer 11 results in Fig. 5, one should resist the temptation to reproduce the maximum likelihood estimate of the flux shown on the plot. Figure 6 demonstrates how weak the dependence of the confidence limits on the confidence level is. This stems from our inability to constrain the total sensitive area of the partially damaged instrument. All we can do based on the analysis of the sequence of channel hits is to estimate the ratio $A_0/A_1$, while the inference of fluxes requires $A_0+A_1$, which determines the total sensitive area. The actual lower limit of the total sensitive area is set by the total number of penetrations detected over the mission; the upper limit is due to the original setup number of sensors reduced by the obvious demolition of cells in channel 1.

Figure 6: Fluxes on the Pioneer 11 dust detector inferred under no assumption on the initial number of active sensors at the beginning of the mission; confidence levels of 75% (top) and 50% (bottom) are used to estimate the confidence limits. Notations are the same as in Fig. 5. It is seen that, due to the poorly constrained total sensitive area of the instrument, the confidence limits do not converge to the maximum likelihood estimate of the penetration flux, so that the maximum likelihood estimate should not be used as a reference for meteoroid models. Models are already good if they predict penetration fluxes within the confidence limits.

Although we were able to show that the true fluxes can be inferred from impact counts made with an unknown sensitive area of the instrument, it is obvious that the useful information shrank dramatically because of the malfunction. However, the probabilistic description of the instrument employed in this work may be beneficial for other in-situ dust experiments, implementing the idea of in-flight calibration of a dust detector in the natural meteoroid environment.

Note that in this work we confined ourselves to the problem of inferring the true penetration fluxes. Inference of the fluxes of real particles characterised by their masses and impact velocities has to rely on the calibration of detection thresholds, and the uncertainty of the calibration results and of their extrapolation to the velocities encountered in space will further soften the constraints on the models.

  
8 Conclusion

We have re-analysed the data from the Pioneer 11 meteoroid experiment using the built-in redundancy of the instrument and making no assumption about the flux of impactors. According to our redundancy check, made possible by the newly developed probabilistic model of the instrument, the experiment data were dramatically inconsistent, demanding an explanation. We found, however, that the data pass the redundancy check if the initial ratio of active impact sensors was different from its setup value, and suggest a reason: demolition of some sensors during the spacecraft launch. The hypothesis of partial disability of the instrument is very strong: only one experiment in 7.5 million would result in such an anomaly due to statistical fluctuations.

We re-derive the particle fluxes from the impact counts and estimate error margins, now taking the uncertainty of the instrument's sensitive area into account. This work is an attempt at in-flight calibration of a dust detector in the natural meteoroid environment. We end up with higher true fluxes and wider confidence intervals. Meteoroid models are now subject to weaker constraints from the Pioneer 11 experiment, but those constraints represent the best knowledge of the instrument's in-flight characteristics.

Acknowledgements
The complete set of event ground-confirm times for the Pioneer 11 meteoroid experiment is available from the National Space Science Data Center (NSSDC) on microfiche. We thank Dr. Markus Landgraf, who digitized the data and kindly shared the ASCII files with us.

References

 
Copyright ESO 2002