The Restoration of HST Images and Spectra II
About Event
Location
Space Telescope Science Institute (STScI)
3700 San Martin Drive
Baltimore, MD 21218
Description
This volume contains the collected papers presented at the Second Workshop on the Restoration of Images and Spectra from the Hubble Space Telescope, which was held at the Space Telescope Science Institute, Baltimore, Maryland, on 18 and 19 November 1993. The Workshop was attended by 126 registered participants. Fifty-five papers were presented in a combination of oral and poster sessions.
Proceedings edited by Robert J. Hanisch and Richard L. White.
Notes
The Space Telescope Science Institute (STScI) is a short ride away from major airports and train stations. Find information related to driving directions and relevant transportation resources on our Getting Here page.
Accordion
-
Speakers: Robert J. Hanisch and Richard L. White
Abstract: This volume contains the collected papers presented at the Second Workshop on the Restoration of Images and Spectra from the Hubble Space Telescope, which was held at the Space Telescope Science Institute, Baltimore, Maryland, on 18 and 19 November 1993. The Workshop was attended by 126 registered participants. Fifty-five papers were presented in a combination of oral and poster sessions. The Workshop was sponsored by the Image Restoration Project of the Space Telescope Science Institute. Although work on image restoration began soon after the spherical aberration problem of the HST was discovered in the summer of 1990, a dedicated effort in the area of image restoration did not get underway until the fall of 1992. Since that time, however, a small group led by Bob Hanisch and Rick White has been doing research and implementing software aimed at improving the quality of HST images. The members of this group included Ivo Busko, Jinger Mo, Nailong Wu, and Nancy Hamilton. Collaborators in the project included Don Lindler, John Krist, and Dave Baxter. Our work was also supported (through several small research contracts) by Aggelos Katsaggelos, Bobby Hunt, David Redding, and their co-workers. The Project was run out of the Advanced Systems Group of the Science Computing and Research Support Division of ST ScI.
The first workshop on HST image restoration (August 1990) presented a large variety of algorithms and approaches, but relatively little had been done to adapt these approaches to the peculiarities of HST images and spectra. The attitudes were both optimistic and, perhaps, a bit naive. HST image restoration turned out to be extremely complicated, with both spatially and temporally variable point spread functions and a variety of other instrumental effects that made high quality restorations difficult. Over the past eighteen months we have learned how to deal with many of these problems, and how to modify algorithms and generate better quality PSFs. Although HST image restoration is far from the point where an arbitrary image can be tossed into a "black box" deconvolution machine, good quality restoration can indeed be obtained in a large number of cases. The papers in the current volume demonstrate this quite convincingly. Perhaps the most stringent requirement on image restoration has been to retain photometric linearity. Through studies of the convergence properties of algorithms such as the Richardson-Lucy method and various Maximum Entropy approaches, we now have a good understanding of how to proceed with an image restoration so that photometric integrity is maintained. Perhaps the most profound effect of the efforts on image restoration has been to open up a new method of dealing with optical imaging data that had previously been utilized only in a segment of the astronomical community (e.g., radio astronomy).
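Since the Richardson-Lucy method recurs throughout these proceedings, a minimal sketch may help fix ideas. This is our illustrative Python, not the STSDAS implementation; it assumes a single shift-invariant PSF centered in an array the same size as the image, Poisson noise, and periodic boundaries:

```python
import numpy as np

def richardson_lucy(image, psf, n_iter=50, eps=1e-12):
    """Basic Richardson-Lucy deconvolution.

    Assumes a shift-invariant PSF centered in an array of the same shape as
    `image`, Poisson noise, and periodic boundaries (FFT convolution).
    """
    psf = psf / psf.sum()                          # a normalized PSF conserves flux
    psf_ft = np.fft.fft2(np.fft.ifftshift(psf))    # move the PSF center to the origin
    estimate = np.full(image.shape, image.mean(), dtype=float)
    for _ in range(n_iter):
        # Blur the current estimate to predict the observed image.
        model = np.real(np.fft.ifft2(np.fft.fft2(estimate) * psf_ft))
        ratio = image / np.maximum(model, eps)     # data / prediction
        # Correlate the ratio with the PSF (conjugate in Fourier space)
        # and apply the multiplicative correction.
        estimate = estimate * np.real(np.fft.ifft2(np.fft.fft2(ratio) * np.conj(psf_ft)))
    return estimate
```

Because each iteration multiplies by a non-negative correction, the estimate stays non-negative and its total flux matches that of the data, which is the photometric-linearity property emphasized above.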
The wonderfully successful HST Servicing Mission in December 1993 has led to HST recovering virtually all of its original design goals for optical performance. The first science images to be returned to the ground are simply awesome. The improved optical performance of the telescope allows HST to recapture the sensitivity that was irretrievably lost in the aberrated images and to tackle the observational problems that require high dynamic range. It is our expectation, however, that the algorithms and PSF modeling expertise we have developed to deal with pre-servicing mission images will continue to be useful on WFPC-II and FOC+COSTAR data. It should be possible to increase dynamic range in crowded fields, where the diffraction spikes of stars overlap to create an apparent background haze, and allow studies of the galaxies underlying quasars that are otherwise obstructed by features of the unaberrated PSF. In some cases super-resolution may be possible, extending HST's capacity for seeing detail to beyond the diffraction limit.
We would like to extend thanks to those who helped to make the Image Restoration Project and this second Workshop possible. The Project was supported by the Hubble Space Telescope Project Office at the NASA Goddard Space Flight Center. We would especially like to thank Dr. Stanley Sobieski, ST ScI's Technical Officer at the time, for his support and encouragement. Other NASA officials who were especially supportive of this work include Dr. Robert Stachnik of NASA Headquarters and Dr. Jan Hollis of NASA GSFC. Within ST ScI we thank Dr. Ron Allen for his support and guidance, and thank John Krist, the developer of the Tiny TIM PSF modeling software, which has made it possible to restore many images for which observed PSFs were not otherwise available. John's work is based on the TIM PSF modeling code developed by Chris Burrows and Hashima Hasan, who also contributed to this project through their monitoring of the time variability of the PSF. Pierre Bely also provided useful inputs in this regard, especially in the area of spacecraft pointing effects.
-
Speakers: Aggelos K. Katsaggelos, Moon Gi Kang, and Mark R. Banham
Title: Adaptive Regularized Restoration Algorithms Applied to HST Images
Abstract: This paper analyzes the performance of two set theoretic-based iterative image restoration algorithms for Hubble Space Telescope (HST) degraded images. The iterative adaptive constrained least squares and frequency adaptive constrained least squares algorithms are optimized here for HST data, and applied to several simulated and real degraded HST images. Evaluations of both the flux linearity and resolution enhancement of these algorithms are presented and compared to results obtained by the Richardson-Lucy algorithm (Lucy 1974). These results indicate that the iterative algorithms investigated here are quite suitable for HST data, and provide excellent results in terms of all evaluation criteria tested.
Speakers: R. C. Puetter and R. K. Piña
Title: Pixon-Based Image Restoration
Abstract: This paper presents the theory of the pixon, the fundamental unit of picture information, and its application to Bayesian image reconstruction. Examples of the applications of these methods to artificial and real data are presented. These examples demonstrate that pixon-based methods produce results superior to both pure Goodness-of-Fit (i.e. Maximum Likelihood) methods and the best examples of Maximum Entropy methods.
Speakers: James M. Coggins, Laura Kellar Fullton and Bruce W. Carney
Title: Iterative/Recursive Deconvolution with Application to HST Data
Abstract: A new deblurring algorithm has been developed that involves both iteration and recursion and is linear, flux-conserving, noise resistant, and faster to converge than extant iterative deblurring methods. Mathematical analysis shows that the recursive component of the algorithm provides the accelerated convergence. A demonstration is provided using a simulated star field image blurred using an approximation to the point spread function of the Hubble Space Telescope.
Speakers: B. Bundschuh, D. Schneider, and H. Schulz
Title: Adaptive Least Squares Image Restoration Using Whitening Filters of Short Length
Abstract: This paper presents a regularized least squares algorithm for the restoration and reconstruction of images. A whitening filter of short length provides the regularization function. An adaptive version of the algorithm is developed by matching a weighting function to the regularization function. The adaptive regularization simultaneously leads to proper noise suppression and enhanced resolution of discontinuities.
Speakers: Gordon Chin, Stephen L. Mahan and William E. Blass
Title: Image Restoration and Super-Resolution by Novel Applications of a Neural Network
Abstract: A new instrument paradigm is proposed based on the discovery of a method to determine a robust inverse point spread function for a scientific observing instrument modeled as a linear system. As a result of this discovery, it is possible to contemplate an instrumental extension which results in the recovery of a major portion of lost resolution due to the blurring effects of the PSF. Implementation of the instrumental extension and the resulting resolution enhancement is independent of prior knowledge of or access to the observed data. The method is applied to HST images as well as several one dimensional spectral data sets. Results of HST recoveries are compared to Richardson-Lucy.
Speaker: Nailong Wu
Title: Model Updating in the MEM Algorithm
Abstract: The model updating technique used in the Maximum Entropy Method (MEM) task for image restoration in IRAF/STSDAS is described in detail. The properties of the method, including faster convergence in iteration and reduced nonlinearity of the solution in photometry, are discussed and shown by statistics and illustrations.
Speakers: K. Bouyoucef, D. Fraix-Burnet and S. Roques
Title: Interactive Deconvolution with Error Analysis
Abstract: In the general framework of regularization of inverse problems, we define the main contours of the reconstruction algorithm IDEA (Interactive Deconvolution with Error Analysis) that is developed in our group, giving only some methodological principles. The deconvolution problem is stated in terms of weighted spectral interpolation: the amount and the nature of the interpolation to be performed are related - in a quantitative manner - to the choice of a synthetic coverage linked to the target resolution. We illustrate this deterministic viewpoint on the SN1987A image of HST and compare the result with two Bayesian approaches: the Richardson-Lucy Method (RLM) and the Maximum Entropy Method (MEM) in terms of gain in resolution, error propagation, speed of convergence, and some relevant astrophysical criteria.
Speaker: Hans-Martin Adorf
Title: Towards HST Restoration with a Space-Variant PSF, Cosmic Rays and Other Missing Data
Abstract: The open problem of restoring full Wide Field and Planetary Camera (WF/PC) image frames is considered. Several novel algorithms, or modifications to existing algorithms, are described that are useful for restoring undersampled (multi-) frames degraded by a space-variant point spread function (SV-PSF) and by irregular faults. The algorithms comprise a faithful rotation operator for sufficiently sampled images, an alternative method for treating regular and/or irregular missing data, and an algorithm for SV-PSF restoration combining the classical sectioned restoration method of Trussel & Hunt with the generalized "co-addition" restoration method of Lucy & Hook. With these algorithms a comprehensive full-frame WF/PC restoration now appears to be feasible.
Speaker: L. B. Lucy
Title: Image Restorations of High Photometric Quality
Abstract: An image restoration technique is described that achieves high photometric accuracy for both stars and distributed emission. The technique makes use of observer-supplied information that some objects in the field are point sources and thereby eliminates the ringing and photometric bias that arise with conventional restoration procedures. A hierarchy of codes is described based on this two-channel decomposition of astronomical images. The more sophisticated of these codes incorporate simultaneous estimation of the PSF and are thus especially relevant for ground-based imaging. Observing programs are described in which these codes effectively transfer the resolution of HST to ground-based images.
Speakers: R. N. Hook and L. B. Lucy
Title: Image Restorations of High Photometric Quality. II. Examples
Abstract: A generalized form of the Lucy-Richardson restoration method is described in which images in the object plane are divided into two channels. One of these contains point sources (i.e., delta-functions) and the other models a smooth background distribution. The latter image is regularized by the use of an entropy term and hence has enforced smoothness. This approach avoids problems encountered in the photometry of the results of non-linear restoration methods. Examples of the use of this technique on standard simulated HST data frames are given, including some assessments of its photometric accuracy. Implementations running under both IRAF and MIDAS are available.
-
Speakers: Kevin M. Perry and Stanley J. Reeves
Title: Generalized Cross-Validation as a Stopping Rule for the Richardson-Lucy Algorithm
Abstract: This paper presents a criterion for stopping non-linear iterative algorithms, specifically the Richardson-Lucy algorithm that is widely used to restore images from the Hubble Space Telescope. The criterion is based on generalized cross-validation and is also computed iteratively. We will present examples displaying the power of the stopping rule, and will discuss the abilities and shortcomings of this method. We also present a caveat about the method.
Speaker: Richard L. White
Title: Image Restoration Using the Damped Richardson-Lucy Method
Abstract: A modification of the Richardson-Lucy iteration that reduces noise amplification in restored images is described.
Speakers: Jean-Luc Starck, Fionn Murtagh, and Albert Bijaoui
Title: Image Restoration with Denoising Using Multi-Resolution
Keywords: Multiresolution analysis, wavelet, image processing, image restoration, deconvolution
Abstract: This paper shows how an effective noise suppression strategy can be incorporated into algorithms for the solution of the inverse problem. The residual in the fit of the restored image, at each iteration, is analyzed using a wavelet transform. In order to suppress noise, only significant wavelet coefficients are retained. The effectiveness of this procedure is demonstrated.
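As a rough illustration of the noise-suppression strategy in the Starck, Murtagh, and Bijaoui abstract above, the sketch below uses a deliberately simplified one-scale decomposition: a 3x3 box filter stands in for a wavelet transform, and detail coefficients below a 3-sigma significance threshold are zeroed. The function names, the single scale, and the box filter are our assumptions, not the paper's algorithm:

```python
import numpy as np

def box_blur(x):
    """3x3 moving average with periodic boundaries (crude smoothing plane)."""
    out = np.zeros_like(x, dtype=float)
    for dy in (-1, 0, 1):
        for dx in (-1, 0, 1):
            out += np.roll(np.roll(x, dy, axis=0), dx, axis=1)
    return out / 9.0

def significant_residual(residual, sigma, k=3.0):
    """Keep only statistically significant structure in a fit residual.

    One-scale analogue of wavelet-based noise suppression: the residual is
    split into smooth + detail planes; detail coefficients below k*sigma are
    treated as noise and zeroed, then the planes are recombined.
    """
    smooth = box_blur(residual)
    detail = residual - smooth
    detail = np.where(np.abs(detail) >= k * sigma, detail, 0.0)
    return smooth + detail
```

In a restoration loop, the residual (image minus blurred model) would be filtered this way before being fed back into the iteration, so that only significant structure drives the update.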
Speakers: R. Molina, J. Mateos, and J. Abad
Title: Prior Models and the Richardson-Lucy Restoration Method
Abstract: Following the Bayesian paradigm for image restoration we show how smoothness constraints can be incorporated into the R-L method. We also examine different noise models and study their approximation by Gaussian models that are robust to detector errors.
Speakers: Jorge Núñez and Jorge Llacer
Title: HST Image Restoration with Variable Resolution
Abstract: Bayesian and maximum entropy algorithms that use a constant balancing parameter present large residuals in the low or the high S/N regions depending on the value chosen for the parameter. We present the development of a variant of the Bayesian algorithm with entropy prior (called FMAPE) that uses a spatially variable balancing parameter. This variation of the balancing parameter implies a variable resolution in the restoration, allowing high resolution with low bias in high S/N regions and a lower resolution with fewer artifacts in poor S/N regions. We have applied the algorithm to the FOC and WF/PC cameras of the HST.
Speakers: P. Benvenuti, F. Maggio, and S. Seatzu
Title: Regularization and Smoothing for the Restoration of Hubble Space Telescope Images
Abstract: Two numerical methods for the restoration of noisy images from the Hubble Space Telescope are presented. The first one stems from a B-spline expansion followed by the Tikhonov regularization method. The second one is based on a coarse smoothing coupled with the Tikhonov regularization. The effectiveness of these methods is illustrated by the restoration of some images already proposed in the literature.
Speakers: Donald L. Snyder, Carl W. Helstrom, Aaron D. Lanterman, Mohammad Faisal, and Richard L. White
Title: Compensation for Read-Out Noise in HST Image Restoration
Abstract: Data acquired with the charge coupled device camera on the HST are modeled as an additive Poisson-Gaussian mixture, with the Poisson component representing cumulative counts of object-dependent photoelectrons, object-independent photoelectrons, bias electrons and thermoelectrons, and the Gaussian component representing read-out noise. Two methods are examined for compensating for read-out noise. One method is based upon approximating the Gaussian read-out noise by a Poisson noise and then using the expectation-maximization (modified Richardson-Lucy) algorithm for Poisson distributed data to effect the compensation. This method has been used for restoring HST images. The second method directly uses the expectation-maximization algorithm derived for the Poisson-Gaussian mixture data. This requires the determination of the conditional-mean estimate of the Poisson component of the mixture, which is accomplished by the evaluation of a nonlinear function of the data. The second method requires more computation than the first, but modest improvements in the quality of the restorations are realized, particularly for fainter objects.
-
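The first method described by Snyder et al. above, approximating zero-mean Gaussian read-out noise of variance sigma^2 by Poisson noise with mean sigma^2, amounts to a one-line change in the Richardson-Lucy iteration: add sigma^2 to both the data and the blurred model. The sketch below is our reading of that idea under simplifying assumptions (shift-invariant centered PSF, periodic boundaries), not the authors' code:

```python
import numpy as np

def rl_readout_compensated(image, psf, sigma_ro, n_iter=50, eps=1e-12):
    """Richardson-Lucy with Gaussian read-out noise approximated as Poisson.

    Zero-mean read-out noise of standard deviation `sigma_ro` is modeled as
    Poisson noise with mean sigma_ro**2, so that constant is added to both
    the data and the blurred model before forming the correction ratio.
    """
    psf = psf / psf.sum()                          # normalized PSF
    psf_ft = np.fft.fft2(np.fft.ifftshift(psf))
    data = image + sigma_ro ** 2                   # shifted ("noise-padded") data
    estimate = np.full(image.shape, max(float(image.mean()), eps))
    for _ in range(n_iter):
        # Predicted image: blurred estimate plus the Poisson-equivalent bias.
        model = np.real(np.fft.ifft2(np.fft.fft2(estimate) * psf_ft)) + sigma_ro ** 2
        ratio = data / np.maximum(model, eps)
        # Multiplicative update via correlation with the PSF.
        estimate = estimate * np.real(np.fft.ifft2(np.fft.fft2(ratio) * np.conj(psf_ft)))
    return estimate
```

With sigma_ro set to zero this reduces to the plain Richardson-Lucy iteration; the added constant keeps the ratio well-behaved where read-out noise drives pixel values near or below zero.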
Speakers: H. Hasan and P. Y. Bely
Title: Effect of OTA Breathing on Hubble Space Telescope Images
Abstract: The effect of short term focus changes on Hubble Space Telescope images after the refurbishment mission is examined. Based on simulations we conclude that image quality will not be degraded significantly for the cameras if focus variations are within +/- 5 microns in the blue and within +/- 2 microns in the UV. The tolerance for the spectrographs is wider, about +/- 10 microns.
Speakers: J. Mo and R. J. Hanisch
Title: Comparisons Between Observed and Model PSFs in WFPC Image Restoration
Abstract: A demonstration of HST WFPC image restoration has been made using both empirical and simulated PSFs. Empirical PSFs are obtained from the WFPC PSF library and also extracted from the image to be restored. Simulated PSFs are generated by the Tiny TIM package. The best results obtained thus far rely upon being able to determine a high S/N empirical PSF from the data frame. In order to achieve accurate restorations of data with high dynamic range, excellent PSF models are required. PSF models are less critical when the intrinsic source structure is smooth compared to the size of the PSF.
Speakers: Raadhakrishnan Poovendran, John E. Dorband, and Jan M. Hollis
Title: FOC Image Restoration Using Calculated PSFs on Parallel Architectures
Abstract: We describe our image restoration efforts which use an analytical model of the point spread function (PSF) of the Hubble Space Telescope Faint Object Camera (FOC). Our approach is based on the Zernike polynomial modeling of the wave front phase using two phase retrieval algorithms. The difficulties involved in modeling the PSF and validating our approach are illustrated using actual FOC and simulated data which have been processed on a massively parallel computer.
Speakers: Scott R. McNown and Bobby R. Hunt
Title: Approximate Shift-Invariance by Warping Shift-Variant Systems
Abstract: A method is presented in which a signal, degraded by a linear shift-variant system, will undergo a warping such that the resulting warped signal will be approximately described by a warped original signal filtered by a linear shift-invariant system. The warping is a limited class of coordinate transformations, for which adjacent points do not cross each other after the transformation. This results in a signal that may appear stretched in some places and compressed in others (and curved if the signal is two-dimensional). The purpose of this distortion is to make the space-variant impulse response (which can be viewed as a space-invariant impulse response which has been warped in the original signal domain) vary as little as possible. In particular cases, a transformation can be found which will result in no impulse response variations. For most cases, however, the impulse response will still have some space variance, which the warping seeks to minimize. The residual variance will be ignored (this error must be small in order for this method to work well), and an "average" impulse response in the warped domain will be assumed. This allows for space-invariant restoration of the warped signal, with all of its attendant advantages in speed and reduced complexity.
Speakers: David Redding, Meemong Lee, and Sam Sirlin
Title: Improved Prescription Retrieval and PSF Modeling Code
Abstract: The success of HST image restoration depends critically on the ability to obtain, either through direct observation or through detailed optical modeling, a good estimate of the point spread function. PSF models are, in principle, superior to observed PSFs owing to the lack of noise and the ability to compute the PSF on a higher density pixel grid. We describe an approach to PSF modeling for the HST which utilizes two complementary programs: (1) a sophisticated prescription retrieval code, and (2) a hybrid PSF modeling code that combines both ray-tracing and diffraction propagators. The PSF modeling code will be made available to the HST user community through the STScI.
Speaker: Richard L. White
Title: Better HST Point-Spread Functions: Phase Retrieval and Blind Deconvolution
Abstract: The accuracy of HST image restorations is often limited by the quality of point-spread functions that are available. Phase retrieval methods are being applied to HST PSFs in an effort to improve the agreement between observed PSFs and PSFs computed using optical modeling programs. The use of blind deconvolution to make better use of observed PSFs is also discussed.
Speakers: Timothy J. Schulz and Stephen C. Cain
Title: Simultaneous Phase Retrieval and Deblurring for the Hubble Space Telescope
Abstract: Most methods proposed for restoring images acquired by the Hubble Space Telescope rely on prior knowledge of the telescope's point-spread function; however, for many images, this function is not known precisely and must be inferred from the noisy measured data. In this paper, we address this problem and discuss a maximum-likelihood estimation technique for simultaneously determining the nature of the aberrations and for recovering the underlying object from a noisy, degraded image.
Speakers: Julian C. Christou, Stuart M. Jefferies, and Mark W. Robison
Title: Blind Deconvolution of HST Simulated Data
Abstract: We apply an iterative deconvolution algorithm, which has the capability to recover both the object and point spread function from a single image or multiple images, to simulated HST star cluster data. The algorithm uses error metric minimization to enforce known physical constraints on both the reconstructed object and point spread function. The reconstructed object is shown to preserve the photometry inherent in the observed image. The use of multiple observations improves the signal-to-noise ratio of the reconstructed...
-
Speaker: Ivan R. King
Title: Some Problems of Practical Image Restoration
Abstract: This discussion begins by noting that stellar photometry is often done quite effectively on unrestored images. Even when restoration is necessary, one must be wary of sophisticated methods, many of which distort photometric values. Fourier methods are photometrically reliable, and Wiener filtering leads to reasonably good restorations. The statistics of pixels in a restoration presents new problems, which have been only partly solved. A completely unsolved problem is presented: estimating the arithmetic difference between two images that have different PSFs and different S/N. The discussion concludes with a plea for methods that are available and transparent to the ordinary user of HST.
Speaker: J. Biretta
Title: WFPC and WFPC 2 Instrumental Characteristics
Abstract: We summarize instrumental properties and problems in the WFPC and WFPC 2 cameras which are likely to complicate image restoration.
Speaker: Dave Baxter
Title: Restoration of FOC Imaging Data: Some Considerations
Abstract: We review and discuss some of the considerations, relating to FOC imaging, which will affect or at least influence the quality of image restorations. Some of these effects derive from the nature of the instrument and the calibration procedures; however, others are the responsibility of the individuals designing the observations. We also discuss the expected status of these effects after the deployment of COSTAR.
Speaker: Peter Challis
Title: FOC Observations of SN 1987A: The Movie
Abstract: HST has now observed SN 1987A six times over the past 3 years with the FOC as part of the Supernova Intensive Study program. To help visualize the evolution of the debris and circumstellar matter, a movie was created from these observations. The movie provides a unique perspective on the history of SN 1987A covering the epoch 1278 - 2424 days since explosion.
Speaker: D. S. Briggs
Title: Superresolution of SN 1987A: A Comparative Deconvolution Study
Abstract: Supernova 1987A in the Large Magellanic Cloud presents an unprecedented opportunity to observe the evolution of a supernova at all wavelengths. While optical observations with the HST Faint Object Camera have obtained resolutions of 0.1", we are limited in the radio to the resolution obtainable with the Australia Telescope Compact Array (ATCA). At the highest frequency of 8.8 GHz, this corresponds to a synthesized beam width of 0.9". At this resolution the radio supernova is distinct from a point source, but few physical conclusions can be drawn. We present here superresolved images from the ATCA with an effective resolution of 0.5". These reveal a spherical shell-like structure with a radius of 0.6", and an additional component of emission aligned with the circumstellar ring imaged with the FOC. Care is required due to the delicate nature of the imaging problem, and we present evidence that the overall image structure is plausible.
Speakers: S. M. Simkin and P. T. Robinson
Title: WFPC Image Restoration and Undersampling Problems
Abstract: We describe two sets of Planetary Camera data for the same object. One was observed in such a way that it is adequately sampled (according to the Nyquist theorem) while the other was not adequately sampled. We discuss a variant of the "roll deconvolution" technique and demonstrate the advantages of this observing strategy for minimizing detector flaws. We show the results of attempting to restore each set of data using both the STSDAS "Lucy" and Weir's "MEM" routines. Our conclusion: The sampling theorem applies to WF/PC data. Surprise!?
Speaker: W. Freudling
Title: Sub-Stepping as a WFPC Observing Strategy
Abstract: WFPC images offer a wide field of view but significantly undersample the PSF. A strategy to recover some of the resolution lost by undersampling is to split the exposure time into several images, which are shifted by a fraction of the pixel size. These images can subsequently be combined with a modified Richardson-Lucy algorithm. One of the challenges of this approach is the treatment of cosmic ray hits. A procedure to simultaneously co-add images with different PSFs in the presence of CR hits is presented here. An IRAF implementation (Freudling, 1993) was tested on both simulations and real WFPC data. These tests show the viability of the proposed strategy.
Speakers: J. R. Walsh and L. B. Lucy
Title: Optimal Combination of Sub-Stepped Spectra
Abstract: The small aperture of the GHRS projects to just one detector diode. In order to achieve adequate sampling, the data are sub-stepped in half or quarter diode steps. Using the Richardson-Lucy algorithm, optimal combination of these sub-stepped spectra can be achieved in order to regain resolution. The technique, which can be generally applied to spectra which are undersampled by the detector, is described and some examples are shown.
Speakers: Athanassios Diplas, Edward A. Beaver, Phillip R. Blanco, Robert K. Piña, and Richard C. Puetter
Title: Application of the Pixon Based Restoration to HST Spectra and Comparison to the Richardson-Lucy and Jansson Algorithms: Restoration of Absorption Lines
Abstract: We discuss the application of the Pixon Based Image Restoration Method on spectroscopic HST data and compare the results with those obtained from the Richardson-Lucy and Jansson algorithms. In order to better evaluate the performance of the various algorithms we have also used artificial data.
-
Speaker: I. C. Busko
Title: Evaluation of Image Restoration Algorithms Applied to HST Images
Abstract: This work reports results on intercomparison of image restoration algorithms, when used in the specific context of stellar fields imaged by the HST WFPC. Properties such as fidelity to the original image and photometric linearity, as well as computational performance, were evaluated.
Speakers: Don Lindler, Sara Heap, Jarita Holbrook, Eliot Malumuth, Dara Norman, and Patricia C. Vener-Saavedra
Title: Star Detection, Astrometry, and Photometry in Restored PC Images
Abstract: We have evaluated various image restoration techniques (both linear and non-linear) applied to a simulated crowded star field observed with the HST Planetary Camera (PC). Evaluation criteria included star detection, astrometry, and photometry. Numerous restoration artifacts made star detection difficult for the linear restorations. The non-linear methods give much better star detection and slightly better astrometry results. Aperture photometry measurements in images restored with the non-linear restoration methods show significant non-linearities. The fluxes of fainter stars are systematically underestimated. This problem is not seen in the images restored with the linear methods. However, we see significantly increased RMS scatter in the photometry results for images restored with a linear method. We found that a hybrid approach which combines the result of a non-linear method with a linear method will give linear photometry results with lower RMS scatter.
Speakers: L. K. Fullton, B. W. Carney, K. A. Janes, J. M. Coggins, and P. Seitzer
Title: Improved Photometry of HST Data With Iterative/Recursive Deconvolution Techniques
Keywords: Photometry, Deconvolution, Image-Restoration, Clusters
Abstract: Image restoration results are presented using a new iterative/recursive method for removing a linear, spatially-invariant blur from an image. We have implemented this algorithm and used it to restore Hubble Space Telescope (HST) Planetary Camera images of the globular cluster NGC 6352. The resulting color-magnitude diagram illustrates the photometric accuracy which can be obtained from images deconvolved using our technique with the PSFs currently available from the Space Telescope Science Institute. If better PSFs become available, we believe the color-magnitude diagram could improve significantly. For comparison, we have analyzed the unrestored images with PSF-fitting photometry.
Speakers: M. Bertero, F. Maggio, E.R. Pike, and D. Fish
Title: Assessment of Methods Used for HST Image Reconstruction
Abstract: An assessment of the performance of various inversion algorithms is presented and compared with the Richardson-Lucy method, the currently favored approach used for the restoration of Hubble Space Telescope images. Numerical validations are made in the case of two simulated images.
Speaker: Peter B. Stetson
Title: DAOPHOT Reductions of the Simulated Cluster Field
Abstract: I present the results of DAOPHOT reductions of the simulated star cluster images provided by STScI's Image Restoration Project.
Speakers: P. Linde, R. Snel, and S. Spännare
Title: Precision Photometry at the LMC Center: Simulating Post-COSTAR HST Observations
Abstract: We intend to use the restored HST to study age structure and chemical evolution of the stellar population at the center of the LMC. To obtain the necessary observational parameters, Strömgren photometry is planned using the Planetary Camera. To study the reduction techniques needed, a set of simulated PC images has been made. The stellar density in the images is based on an extrapolated luminosity function, and the images show a high degree of crowding. Two of the generally available photometric software packages, ROMAFOT and DAOPHOT, have been used to analyse the images. Results show that the photometric accuracy reached with these packages is not enough for high precision photometry in this configuration. We show that the two major contributors to the errors are undersampling and the unusual shape of the point spread function. Adaptation of locally developed software is currently in progress to deal with these problems.
Speakers: F. Fusi Pecci, L. Federici, G. Parmeggiani, F. R. Ferraro, C. Cacciari, G. Iannicola, C. E. Corsi, R. Buonanno, F. Zavatti, and O. Bendinelli
Title: Photometric and Surface Brightness Measurements of Simulated HST Restored Images
Abstract: Stellar photometry of simulated HST uncorrected and corrected images of a crowded star cluster has been carried out before and after deconvolution (with RMG, R-L, MEM, etc.) to optimize the reduction procedures and compare the degree of completeness and photometric accuracy actually achievable in the various cases. Experiments are also presented on the description of the surface brightness profiles of simulated clusters with different deconvolution techniques and sizes of the adopted PSF and data matrices.

Speakers: Laura Ferrarese and Holland C. Ford
Title: Surface Brightness Parameters from Deconvolved PC Images of Elliptical Galaxies
Abstract: The spherical aberration affecting the HST primary mirror (Burrows et al. 1992) can seriously alter the core brightness profile of elliptical galaxies. Excellent results in recovering the original HST performance have been obtained by means of deconvolution techniques, improved and updated over the past two years to suit HST data (Hanisch 1993). In this paper we test the performance of deconvolution in recovering the original surface brightness profile in the case of elliptical galaxies observed with the HST Planetary Camera (PC). We analyze whether the results of deconvolution depend on the intrinsic morphological properties of the galaxy (e.g., the brightness profile or the ellipticity), or on small variations of the PSF due to jitter, position on the chip, or spectral shape of the point source.

Speakers: Kavan U. Ratnatunga, Richard E. Griffiths, and Stefano Casertano
Title: Maximum Likelihood Estimation of Galaxy Morphology: Faint HST Wide Field Camera Images
Abstract: A modeling approach based on the maximum likelihood method has been developed to extract quantitative morphological and structural parameter estimates for faint galaxy images obtained with the Hubble Space Telescope (HST) Wide Field Planetary Camera (WFPC). We model both the galaxy image and the instrumental characteristics of the WFPC, including the complex Point Spread Function (PSF), the error in the Analog-to-Digital Converter (ADC), and the positive noise bias due to faint cosmic rays and undetected warm pixels. Because convolved galaxy images are compared directly with the observations, we avoid the need for deconvolution, which is difficult and potentially unstable for faint images.

Speakers: P. Nisenson, E. Falco, R. Gonsalves, and S. Ebstein
Title: High Resolution Measurements from HST Power Spectra
Abstract: Power spectral analysis (PSA) has proven to be an important tool for high spatial resolution measurements from ground-based interferometry. It is particularly useful for parameter estimation of stellar diameters, binary separations, and stellar asymmetries. While many images from HST are far too complicated for PSA, there is a class of scientific problems for which it is a very effective technique for handling the problems created by the HST point spread function. We have been applying PSA to images of gravitationally lensed QSOs and other objects, with some surprising results.

Speakers: R. A. Gonsalves, T. S. Zaccheo, S. M. Ebstein, and P. Nisenson
Title: Cramér-Rao Bound - Accuracy of HST Image Restoration
Abstract: The Cramér-Rao bound is an analytical expression that describes the minimum obtainable mean square error associated with a given estimate of a parameter. This paper presents a compact and simple form of the bound for restoration techniques such as an inverse or a Wiener filter estimate. We present both 1-D and 2-D examples.

Speakers: Adeline Caulet and Wolfram Freudling
Title: Imaging Performance of HST Compared to Future Ground-Based Adaptive Optics Systems
Abstract: Simulations of a distant cluster of galaxies at z >= 2 are presented for the HST cameras COSTAR-corrected FOC, WFPC II, the Advanced Camera, and for an adaptive optics system that may equip a future 8-meter class telescope. Except for the FOC and PC II, the faint objects are detected in 200 minutes, with identifiable morphological types. The Advanced Camera offers the biggest advantages due to its large field, great sensitivity, and high angular resolution.
Speaker: William C. Keel
Title: Scientific Results from Deconvolved Images
Abstract: I review various regimes in which deconvolution is and is not the technique of choice for the analysis of HST images. The difficulty of getting adequately deconvolved images depends largely on the required angular field, attainable signal-to-noise, and the dynamic range of the target; this last factor limits many interesting investigations in the presence of spherical aberration, exacerbating noise amplification and uncertain knowledge of the point-spread function. Photometric validation issues are also important; real data are used to show how well completely different approaches agree on the intensity profiles of faint galaxies.

Some examples of scientific results in the realm of galaxy structure and evolution are given which have required deconvolved data. Some highlights are disk structures near galactic nuclei, fine structure in synchrotron jets, morphological evolution of medium- and high-redshift galaxies, and significant galaxy merging at moderate redshifts. An additional set of results has been greatly aided by deconvolution, including the study of concentrated cores in galaxies and structural parameters of faint galaxies.
Speaker: T. J. Cornwell
Title: Where Have We Been, Where Are We Now, Where Are We Going?
Abstract: Keywords: Summary, Wisdom, Flatulence

Three years ago, STScI hosted the predecessor of this meeting. I attempt to contrast our understanding of HST deconvolution now with what it was then. The papers presented here can be split into two categories: deconvolution algorithms in general, and tricky HST-dependent details. I discuss how we have progressed in these two areas and where we might expect to be at a putative next meeting in the series, three years from now.