Proceedings Volume 6271

Modeling, Systems Engineering, and Project Management for Astronomy II


View the digital version of this volume at the SPIE Digital Library.

Volume Details

Date Published: 12 June 2006
Contents: 7 Sessions, 48 Papers, 0 Presentations
Conference: SPIE Astronomical Telescopes + Instrumentation 2006
Volume Number: 6271

Table of Contents

All links to SPIE Proceedings will open in the SPIE Digital Library.
  • Modeling I: Ground-based Telescopes
  • Project Management
  • Systems Engineering I: Space Telescopes and Instruments
  • Systems Engineering II: Ground-based Telescopes
  • Modeling I: Ground-based Telescopes II
  • Modeling II: Space Telescopes and Instruments
  • Poster Session
Modeling I: Ground-based Telescopes
Integrated modeling concepts for OWL
The European Southern Observatory (ESO) has started to develop the Integrated Model (IM) for OWL by applying software tools that were already used for an IM of the Very Large Telescope Interferometer. The OWL IM focuses on model reliability and computational efficiency. The former is addressed by a step-wise implementation and the use of both tested tools and validated sub-models; the validation methods include plausibility checks against hand calculations, experiments, measurements, or comparisons with other computational models. The latter is addressed by exploiting the sparsity pattern of the equations governing the structural dynamics and the control loops, and by applying advanced model reduction techniques to the dynamic model of the telescope structure, which in particular aim to avoid the loss of local structural flexibility caused by modal truncation. Different condensation methods are shown and their performance is evaluated. The paper shows the architecture of the integrated model with its components such as telescope structures, optics, control loops and disturbance models for wind load, seismic ground acceleration and atmospheric turbulence. Results illustrating the capability of the modeling approach are presented.
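The modal-truncation pitfall the abstract highlights (loss of local structural flexibility) can be illustrated with a minimal sketch; the three-mass chain and all numerical values below are invented stand-ins for a real telescope structure model:

```python
import numpy as np

# Hypothetical 3-DOF spring-mass chain standing in for a telescope structure.
M = np.diag([2.0, 1.0, 1.0])                      # mass matrix
K = np.array([[ 4.0, -2.0,  0.0],
              [-2.0,  3.0, -1.0],
              [ 0.0, -1.0,  1.0]])                # stiffness matrix

# Mass-normalized modes from the generalized eigenproblem K phi = w^2 M phi.
Mih = np.diag(1.0 / np.sqrt(np.diag(M)))
w2, V = np.linalg.eigh(Mih @ K @ Mih)             # eigenvalues in ascending order
Phi = Mih @ V

# Modal truncation: keep only the lowest mode.
T = Phi[:, :1]
K_red = T.T @ K @ T

# Static tip load: the truncated model underestimates local (tip) flexibility,
# because the discarded modes also contribute positive flexibility at the tip.
f = np.array([0.0, 0.0, 1.0])
x_full = np.linalg.solve(K, f)
x_red = T @ np.linalg.solve(K_red, T.T @ f)
print(x_full[2], x_red[2])
```

Condensation methods such as those compared in the paper aim to close exactly this gap between the truncated and the full static response.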
Status of the integrated model of the Euro50
The Euro50 is a proposed 50m extremely large telescope for optical and infrared wavelengths. To study and predict the performance of the complete telescope system, an integrated model combining the structural model of the telescope, optics models, the control systems and the adaptive optics has been established. Wind and atmospheric disturbances are also included in the model. The integrated model is written in MATLAB and C. To satisfy memory demands and to achieve acceptable execution times, 64-bit MATLAB is used and part of the model is run on a shared memory machine using OpenMP. We present results from simulations with a complete integrated single conjugate adaptive optics model. Various sensor and actuator geometries are evaluated. A comparison of wind loading and atmospheric turbulence effects is also presented. The model shows that the telescope will be essentially seeing-limited under wind load without AO correction.
Optimised external computation for the Euro50 MATLAB based integrated model
In previous work we countered the computational demands of integrated modelling by developing and using a parallel toolkit for MATLAB. However, the use of an increasingly realistic model makes the computational requirements much larger, particularly in wavefront sensing, reaching a point where simulating several seconds of real time was no longer practical, taking up to 3 weeks per simulated second. In response to this problem we have developed optimised C code to which MATLAB offloads computation. This code has numerous advantages over native MATLAB computation. It is portable, scalable using OpenMP directives, and can run remotely using Remote Procedure Calls (RPCs). It has opened up the possibility of exploiting high-end Itanium- and Opteron-based shared memory systems, optimised third-party libraries and aggressive compiler optimisation. These factors, combined with hand-tuning, give a performance increase of the order of 100 times. The interface to the rest of the model remains the same, so the overall structure is unchanged. In addition we have developed a similar system based on Message Passing Interface version 2 (MPI-2), which allows us to exploit clusters. Here we present an analysis of the techniques used and the gains obtained, along with a brief discussion of future work.
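The offload pattern described above (the model keeps calling the same interface while the heavy kernel runs on a parallel backend) can be sketched in Python, with threads standing in for the authors' C/OpenMP/MPI-2 kernels; `heavy_kernel` is a made-up placeholder, not their wavefront-sensing code:

```python
from concurrent.futures import ThreadPoolExecutor

def heavy_kernel(chunk):
    # Placeholder for an expensive inner loop (e.g. a wavefront-sensing step).
    return sum(x * x for x in chunk)

def compute(data, parallel=True, workers=4):
    """Same interface either way; only the execution backend changes."""
    chunks = [data[i::workers] for i in range(workers)]
    if parallel:
        with ThreadPoolExecutor(workers) as ex:
            partials = list(ex.map(heavy_kernel, chunks))
    else:
        partials = [heavy_kernel(c) for c in chunks]
    return sum(partials)

data = list(range(10_000))
print(compute(data, parallel=False) == compute(data, parallel=True))  # True
```

Keeping the interface fixed, as in `compute` above, is what lets the rest of the model remain unchanged while the backend is swapped.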
Project Management
Cost of Quality (CoQ) metrics for telescope operations and project management
This study describes the goals, foundational work, and early returns associated with establishing a pilot quality cost program at the Robert C. Byrd Green Bank Telescope (GBT). Quality costs provide a means to communicate the results of process improvement efforts in the universal language of project management: money. This scheme stratifies prevention, appraisal, internal failure and external failure costs, and seeks to quantify and compare the up-front investment in planning and risk management versus the cost of rework. An activity-based Cost of Quality (CoQ) model was blended with the Cost of Software Quality (CoSQ) model that has been successfully deployed at Raytheon Electronic Systems (RES) for this pilot program, analyzing the efforts of the GBT Software Development Division. Using this model, questions that can now be answered include: What is an appropriate length for our development cycle? Are some observing modes more reliable than others? Are we testing too much, or not enough? How good is our software quality, not in terms of defects reported and fixed, but in terms of its impact on the user? The ultimate goal is to provide a higher quality of service to customers of the telescope.
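The CoQ stratification described above can be sketched with a small, invented ledger (all figures are illustrative, not GBT data):

```python
# Hypothetical Cost-of-Quality ledger using the four categories from the
# CoQ model: prevention, appraisal, internal failure, external failure.
costs = [
    ("write test plan",           "prevention",       4_000),
    ("code review",               "prevention",       6_000),
    ("regression testing",        "appraisal",       10_000),
    ("fix pre-release bug",       "internal_failure", 3_000),
    ("patch observer-found bug",  "external_failure", 15_000),
]

totals = {}
for _, category, amount in costs:
    totals[category] = totals.get(category, 0) + amount

# Up-front investment (conformance) vs. cost of rework (nonconformance):
conformance = totals.get("prevention", 0) + totals.get("appraisal", 0)
nonconformance = totals.get("internal_failure", 0) + totals.get("external_failure", 0)
print(totals)
print(conformance, nonconformance)
```

Comparing the two aggregates over successive development cycles is one way such a pilot program can answer "are we testing too much, or not enough?" in monetary terms.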
Project management at a university
Managing instrumentation projects, large or small, involves a number of common challenges: defining what is needed, designing a system to provide it, producing it in an economical way, and putting it into service expeditiously. Doing these things in a university environment provides unique challenges and opportunities not found in the environment of large projects at NASA or national labs. I address this topic from the viewpoint of knowledge of two such projects, the development of OAO-2 at the University of Wisconsin and the relocation of Fairborn Observatory to the Patagonia Mountains in Arizona, as well as my own development of the Tennessee State 2-m Automatic Spectroscopic Telescope. For the university environment, I argue for a more traditional management style that relies on more informal techniques than those used in large-scale projects conducted by big bureaucratic institutions. This style identifies what tasks are really necessary and eliminates as much wasteful overhead as possible. I discuss many of the formalities used in project management, such as formal reviews (PDR, CDR, etc.) and Gantt charts, and propose other ways of achieving the same results more effectively. The university environment acutely requires getting the right people to do the project, both in terms of their individual personalities, motivation, and technical skills and in terms of their ability to get on with one another. Two critical challenges confronting those doing such projects in universities are 1) keeping the contractors on task (the major challenge for anyone doing project management) and 2) dealing with the purchasing systems in such institutions.
Reliability, availability, and maintainability: a key issue for ELTs
Hundreds of mirror segments, thousands of high-precision actuators, highly complex mechanical, hydraulic, electrical and other technology subsystems, and extremely sophisticated control systems: an ELT system consists of millions of individual parts and components, each of which may fail and lead to a partial or complete system breakdown. Component and system reliability not only influences the acquisition costs of a product; it also defines the necessary maintenance work and the required logistic support. Taking the long operational lifetime of an ELT into account, reliability and maintainability are among the main contributors to the system life cycle costs. Reliability and maintainability are key characteristics of any product and have to be designed into it from the very beginning; they can neither be tested into a product nor introduced by numerous maintenance activities. This presentation explains the interconnections between Reliability, Availability, Maintainability and Safety (RAMS) and outlines the necessary RAMS and Reliability Centered Maintenance (RCM) processes and activities during the entire life cycle of an ELT and an ELT instrument, from the initial planning to the eventual disposal phase.
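The scale argument in this abstract (millions of parts, each of which may fail) and the standard availability notion can be made concrete with back-of-the-envelope arithmetic; all numbers below are illustrative, not ELT figures:

```python
import math

# Series-reliability sketch: n independent components, each with per-year
# failure probability p. Even extremely reliable parts add up at ELT scale.
n = 1_000_000
p = 1e-8                          # illustrative per-part failure probability
r_system = (1 - p) ** n           # probability the whole chain survives a year
print(r_system)                   # ~ exp(-n*p), i.e. about a 1% failure chance

# Steady-state availability from MTBF and MTTR (hours, illustrative values):
mtbf, mttr = 2_000.0, 8.0
availability = mtbf / (mtbf + mttr)
print(availability)
```

This is why maintainability (the MTTR term) matters as much as raw reliability: for fixed MTBF, halving repair time directly raises availability.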
The role of project science in the Chandra X-ray Observatory
The Chandra X-ray Observatory, one of NASA's Great Observatories, has an outstanding record of scientific and technical success. This success results from the efforts of a team comprising NASA, its contractors, the Smithsonian Astrophysical Observatory, the instrument groups, and other elements of the scientific community, including the thousands of scientists who utilize this powerful facility for astrophysical research. We discuss the role of NASA Project Science in the formulation, development, calibration, and operation of the Chandra X-ray Observatory. In addition to serving as an interface between the scientific community and the Project, Project Science performed what we term "science systems engineering". This activity encompasses translation of science requirements into technical requirements and assessment of the scientific impact of programmatic and technical trades. We briefly describe several examples of science systems engineering conducted by Chandra Project Science.
Systems Engineering I: Space Telescopes and Instruments
Developing a NASA strategy for the verification of large space telescope observatories
Julie A. Crooke, Johanna A. Gunderson, John G. Hagopian, et al.
In July 2005, the Office of Program Analysis and Evaluation (PA&E) at NASA Headquarters was directed to develop a strategy for verification of the performance of large space telescope observatories, which occurs predominantly in a thermal vacuum test facility. A mission model of the expected astronomical observatory missions over the next 20 years was identified along with performance, facility and resource requirements. Ground testing versus alternatives was analyzed to determine the pros, cons and break points in the verification process. Existing facilities and their capabilities were examined across NASA, industry and other government agencies as well as the future demand for these facilities across NASA's Mission Directorates. Options were developed to meet the full suite of mission verification requirements, and performance, cost, risk and other analyses were performed. Findings and recommendations from the study were presented to the NASA Administrator and the NASA Strategic Management Council (SMC) in February 2006. This paper details the analysis, results, and findings from this study.
The advanced modeling, simulation, and analysis capability roadmap vision for engineering
This paper summarizes a subset of the Advanced Modeling, Simulation and Analysis (AMSA) Capability Roadmap that was developed for NASA in 2005. The AMSA Capability Roadmap Team was chartered "to identify what is needed to enhance NASA's capabilities to produce leading-edge exploration and science missions by improving engineering system development, operations, and science understanding through broad application of advanced modeling, simulation and analysis techniques." The AMSA roadmap stressed the need for integration, not just within the science, engineering and operations domains themselves, but also across these domains. Here we discuss the roadmap element pertaining to integration within the engineering domain, with a particular focus on implications for future observatory missions. The AMSA products supporting the system engineering function are mission information, bounds on information quality, and system validation guidance. The engineering roadmap element contains five sub-elements: (1) Large-Scale Systems Models, (2) Anomalous Behavior Models, (3) Advanced Uncertainty Models, (4) Virtual Testing Models, and (5) Space-based Robotics Manufacture and Servicing Models.
Optical verification of the James Webb Space Telescope
Brian McComas, Rich Rifelli, Allison Barto, et al.
The optical system of the James Webb Space Telescope (JWST) is split between two of the Observatory's elements, the Optical Telescope Element (OTE) and the Integrated Science Instrument Module (ISIM). The OTE optical design consists of an 18-segment hexagonal primary mirror (25 m2 clear aperture), a secondary mirror, a tertiary mirror, and a flat fine steering mirror used for fine guidance control. All optical components are made of beryllium. The primary and secondary mirror elements have hexapod actuation that provides six-degree-of-freedom rigid body adjustment. The optical components are mounted to a very stable truss structure made of composite materials. The OTE structure also supports the ISIM. The ISIM contains the Science Instruments (SIs) and the Fine Guidance Sensor (FGS) needed for acquiring mission science data and for Observatory pointing and control, and provides mechanical support for the SIs and FGS. The optical performance of the telescope is a key metric for the success of JWST. To ensure proper performance, the JWST optical verification program is a comprehensive, incremental, end-to-end verification program which includes multiple, independent cross-checks of key optical performance metrics to reduce the risk of on-orbit telescope performance issues. This paper discusses the verification testing and analysis necessary to verify the Observatory's image quality and sensitivity requirements. This verification, comprising both test and analysis, starts at the component level and ends with Observatory-level verification at the NASA Johnson Space Center.
Optimizing the cryogenic test configuration for the James Webb Space Telescope
Tony L. Whitman, Thomas R. Scorse
The optical system test for the James Webb Space Telescope (JWST) requires one of the largest available vacuum chambers operating at cryogenic temperatures (<45K). The test includes verification of a segmented 6.6m-diameter primary mirror with a radius of curvature of 16m. In addition, the chamber must accommodate flight instruments and test equipment above and below the telescope. ITT, under contract for this test, teamed with Northrop Grumman Space Technology (NGST), the prime contractor, and the NASA Goddard Space Flight Center (GSFC) to optimize the test configuration. The new configuration minimizes operation within the chamber, leverages recent technologies in interferometry to minimize ground-based environmental influences, and utilizes the chamber itself as the test bench to reduce the cryogenic thermal mass. The result enables an effective system-level test in Chamber A at the NASA Johnson Space Center (JSC).
Johnson Space Center Vacuum Chamber A as a partial metering structure for James Webb Space Telescope optical testing
In precision optical testing, it is desirable to provide a unified metering structure between the optical test source and the test article to limit the effects of incoming vibration sources. In this manner, the entire optical test structure may be vibration isolated as a single unit. Cryogenic temperature requirements for the James Webb Space Telescope make it cost prohibitive to maintain a single optical metering structure. This paper demonstrates the advantages and challenges of separately isolating the test source from the telescope, while using the Johnson Space Center Vacuum Chamber A as the metering structure supporting the two isolated assemblies.
Systems Engineering II: Ground-based Telescopes
Using a formal requirements management tool for system engineering: first results at ESO
The attention to proper requirement analysis and maintenance is growing in modern astronomical undertakings. The increasing degree of complexity that current and future generations of projects have reached requires substantial system engineering efforts and the usage of all available technology to keep project development under control. One such technology is a tool which helps manage relationships between deliverables at various development stages, and across functional subsystems and disciplines as different as software, mechanics, optics and electronics. The immediate benefits are traceability and the ability to perform impact analysis. An industrially proven tool for requirements management is presented, together with the first results across some projects at ESO and a cost/benefit analysis of its usage. Experience gathered so far shows that the extensibility and configurability of the tool on the one hand, and its integration with common documentation formats and standards on the other, make it a promising solution even for small-scale system development.
System engineering in the ALMA project
Christoph Haupt, Richard Sramek, Koh-Ichiro Morita
The Atacama Large Millimeter and Sub-millimeter Array (ALMA) is a sub-millimeter-wavelength radio telescope under construction in northern Chile at an altitude of 5,000 meters. The ALMA telescope will be composed of 66 to 80 high-precision antennas plus their electronics systems, all of which operate as a single instrument. This telescope will observe the cold regions of the Universe with unprecedented depth and clarity. These regions, which are often optically dark, shine brightly in the sub-millimeter portion of the electromagnetic spectrum. ALMA is a partnership between institutions in Europe, North America, Japan and the Republic of Chile and is currently one of the largest ground-based astronomy projects under construction. ALMA is a complex and technically challenging instrument, and its development and construction are dispersed over four continents. Such a project requires a strong system engineering team if it is to come together as a complete system and meet its performance objectives. ALMA System Engineering activities can be divided into: System Design and Analysis, Product Assurance, Prototype System Integration, and System Integration in Chile. This paper reports on these System Engineering activities and achievements. It also describes how the System Engineering team is staffed and organized and reports on some early technical achievements.
Systems engineering for the conceptual design of the Thirty Meter Telescope
A large, ground-based telescope project like the Thirty Meter Telescope faces unique systems engineering challenges stemming from the inevitably distributed, international nature of the endeavor, as well as the relative novelty of formal systems engineering in ground-based astronomy. The TMT project established solid systems engineering processes in the conceptual design phase, including project-lifecycle configuration management, requirements engineering, and a web-based document library. The outlines of the highest-level engineering documents, the Observatory Requirements and the Observatory Architecture, are presented. As integrated and end-to-end modeling is an organic part of systems engineering in support of performance allocation and trade studies, an important example, the analysis of wind- and thermally-induced local seeing, is briefly described in the paper.
Systems engineering in practice: can rigour and creativity co-exist?
Hermine Schnetler, Philip Rees, Ian Egan
Systems engineering as a discipline has been established for many years, being utilised to good effect most notably in the defence industry. Its introduction in a formalised way to the UK ATC is relatively recent. Although a good start has been made in embedding the process within the lifecycle model, much work is still required to refine the systems engineering elements to cope with the complex (internationally collaborative) business model, the need to nurture creativity in the design process and the translation into a highly challenging cost-driven technology domain. This paper explores the current status of systems engineering at the UK ATC, shows where further work is needed, and how improvements can be made to meet the challenges of next generation telescopes and instrumentation. It is shown why the discipline is necessary, especially given that projects often comprise diverse global teams (both small and large), and it indicates the pitfalls of a tendency in the early stages of a project to focus on solutions rather than robust requirements capture. Finally, despite the obvious value and yet often ill-understood rigours of systems engineering, it is shown how innovation and creativity can be promoted rather than stifled.
Applied systems engineering in a small scale project
Since the start of design efforts in 2003, the design of the Optical Bench Assembly for MIRI has been detailed and finalized. MIRI ('Mid Infrared Instrument') is the combined imager and integral field spectrometer for the mid infrared under development for the James Webb Space Telescope. MIRI is developed by a combined European-US consortium. As part of this consortium, ASTRON develops the Spectrometer Main Optics. Working in such a large international consortium requires a focus on traceability of requirements, design, interface and verification data. This is achieved using several systems engineering practices such as requirements analysis and allocation, technical performance management and configuration management. These processes are tailored to the complexity and scale of the project. The paper summarizes these practices and provides examples of the tailoring process and the system engineering tools used.
Modeling I: Ground-based Telescopes II
Limitations on Earth-like planet detection with perfect and real coronagraphs on ELTs
Extremely Large Telescopes are very promising for detecting and characterizing Earth-like planets because of their high angular resolution and the increased number of collected photons. We study the impact of aberrations on this detection and the limitations they impose. We consider an extreme adaptive optics device upstream of a perfect coronagraph. Even with the high Strehl ratio provided, the coronagraphic image is not sufficient to detect Earth-like planets. Indeed, the contrast between such a planet and its star is about 10^-10 in the near infrared. As a consequence, a calibration device downstream of the coronagraph must be used to reach this contrast. We model a realistic system taking into account dynamic aberrations left uncorrected by the adaptive optics, static aberrations of the optical system and differential static aberrations due to the calibration channel. Numerical simulations compare the respective assets of a 30-meter telescope at a median site to those of a 15-meter telescope at Dome C. In both cases, we must control common static aberrations to 8 nm and differential aberrations to 0.1 nm. Beyond this limitation due to speckle noise, and despite the large collecting area, another limitation is set by photon noise. We also compare these results to simulations made with real coronagraphs and with an obstructed pupil.
Mitigation of third order spherical, coma, and astigmatism using segmented mirrors
Constructing telescope pupil mirrors, most traditionally the primary mirror, from a series of hexagonal segments enables the construction of larger telescopes and offers a degree of aberration mitigation beyond what is feasible with traditional monolithic mirror design. The degree of correction possible varies as a function of the number of segments used to construct the mirror, the nature of the mirror surface, and the degrees of freedom available to adjust each segment. This presentation discusses the degree of aberration mitigation possible as the number of segments increases from 6 to 60 for a constant-aperture system. It includes considerations for powered or flat segmented mirrors for the cases where segmentation at pupil positions other than the primary might be considered. It also includes the effects of three levels of segment adjustability: (a) piston and tilt only, (b) six degrees of freedom, and (c) six degrees of freedom plus radius-of-curvature adjustment.
A multi-layered thermal model of backup structures for mm-wavelength radio telescopes
A. Greve, D. R. Smith, M. Bremer
An unfavourable influence that degrades the performance of any millimeter-wavelength radio telescope is the deformation of the reflector surface due to temperature differences in the supporting backup structure. To avoid, or at least reduce, this influence, backup structures are typically protected by a rear-side cladding, insulation at the panel inner side, and ventilation or climatization of the air inside the backup structure. During the design of a mm-wavelength telescope, the layout of a thermal protection system is made, based on experience gained on other telescopes and on thermal model calculations of the complete backup structure. The thermal programs available today allow the construction of a multi-layered backup structure model, consisting of the backup structure tube network (with or without ventilation/climatization), the panels, the insulation behind the panels, and the rear-side cladding. We provide a guideline for the construction of such a multi-layered thermal model, and demonstrate that realistic temperature gradients across and through a backup structure can be calculated. These gradients can be used in a finite element model to calculate the reflector surface deformations, which in turn can be used in a diffraction program to calculate the radio beam pattern.
Perturbation rejection control strategy for OWL
B. Sedghi, L. Mišković, M. Dimmler
The control system of the ESO 100m telescope (OWL) has to reject slow and fast perturbations in several subsystems. In this paper we focus on the wind rejection control strategies for two subsystems: the main axes and the segmented mirror. It is shown that, facing the same disturbance, the two control designs have to deal with completely different problems: control of a flexible SISO (single-input, single-output) system for the altitude axis versus a dynamically coupled MIMO (multi-input, multi-output) system for the segmented mirror. For both subsystems feasible solutions are given.
Modeling wind-buffeting of the Thirty Meter Telescope
The Thirty Meter Telescope project is designing a 30m diameter ground-based optical telescope. Unsteady wind loads on the telescope structure due to turbulence inside the telescope enclosure impact the delivered image quality. A parametric model is described that predicts the optical performance due to wind with sufficient accuracy to inform relevant design decisions, including control bandwidths. The model is designed to be sufficiently computationally efficient to allow rapid exploration of the impact of design parameters or uncertain/variable input parameters, and includes (i) a parametric wind model, (ii) a detailed structural dynamic model derived from a finite element model, (iii) a linear optical response model, and (iv) a control model. Model predictions with the TMT structural design are presented, including the parametric variation of performance with external wind speed, desired wind speed across the primary mirror, and optical guide loop bandwidth. For the median mountaintop wind speed of 5.5 m/s, the combination of dome shielding, minimized cross-sectional area, and control results in acceptable image degradation.
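The parametric chain described above (wind spectrum, structural response, control rejection) can be sketched with a toy one-mode model; the spectra and every parameter value below are invented for illustration, not TMT numbers:

```python
import numpy as np

# Frequency grid for the spectral calculation.
f = np.linspace(0.01, 50.0, 5000)                 # Hz
df = f[1] - f[0]

# (i) von Karman-like wind pressure PSD (arbitrary units, made-up outer scale).
f0 = 0.3
psd_wind = 1.0 / (1.0 + (f / f0) ** 2) ** (17.0 / 12.0)

# (ii)+(iii) single structural mode standing in for the FEM-derived model:
# squared magnitude of a resonance at fs with damping ratio zeta.
fs, zeta = 8.0, 0.02
H2 = 1.0 / ((1 - (f / fs) ** 2) ** 2 + (2 * zeta * f / fs) ** 2)

# (iv) control model: first-order guide-loop rejection below bandwidth f3db.
def residual_rms(f3db):
    rejection = (f / f3db) ** 2 / (1 + (f / f3db) ** 2)
    return np.sqrt(np.sum(psd_wind * H2 * rejection) * df)

# Residual image motion (arbitrary units) vs. guide-loop bandwidth.
for bw in (0.1, 1.0, 10.0):
    print(bw, residual_rms(bw))
```

Sweeping a parameter such as the loop bandwidth through a cheap spectral model like this is what makes rapid tradespace exploration feasible before committing to a design.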
TMT wind model validation with measurements on Keck and Gemini
Wind disturbance is an important parameter in the assessment of telescope performance, especially for extremely large telescopes like TMT. In the process of estimating the response of the TMT telescope structure to wind buffeting, we carried out measurements on the Keck and Gemini observatories in order to better understand the possible variations in the wind response of these different telescopes. The measurements were used to validate the TMT wind model as well as to explain the mechanism through which wind disturbances affect the pointing of the elevation axis. We estimated the wind torque acting on the elevation axis from the measured disturbance transfer function and position error of the mount control. Furthermore, the quality of the correlation between wind speed and torque disturbance was investigated to evaluate the feasibility of mitigating the wind disturbance by introducing feed-forward or adaptive control schemes in the mount control.
Strategies for estimating mirror and dome seeing for TMT
Mirror and dome seeing greatly influence the optical performance of large ground-based telescopes. This study describes a strategy for modeling the effects of passive ventilation on the optical performance of the Thirty Meter Telescope (TMT). Computational Fluid Dynamic (CFD) analyses are combined with thermal analyses to model the effects of turbulence and thermal variations within the airflow around the TMT telescope-enclosure configuration. An analytical thermal model based on Newton's cooling law and incorporating a conduction heat flux and a radiation term is used to track the primary mirror temperature throughout the night. A semi-empirical seeing model is used to relate mirror temperature and wind speed to seeing. Different external wind speeds, mirror heat fluxes and ambient thermal temporal gradients are investigated and comparisons are made.
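The mirror-temperature tracking described above can be sketched as a Newton-cooling-law integration with a linearized radiative term toward the sky; every coefficient below is invented for illustration, not a TMT value:

```python
# Toy mirror-temperature model: Newton convective cooling toward ambient plus
# a linearized radiative sink toward the (colder) effective sky.
h = 0.5             # convective coefficient, 1/hour (scaled by heat capacity)
k_rad = 0.05        # linearized radiative coefficient, 1/hour
T_sky_offset = 15.0 # effective sky assumed ~15 C below ambient

def simulate(T0, T_amb, hours=10.0, dt=0.01):
    """Forward-Euler integration of dT/dt = -h(T-Ta) - k(T-(Ta-offset))."""
    T, t = T0, 0.0
    while t < hours:
        dTdt = -h * (T - T_amb) - k_rad * (T - (T_amb - T_sky_offset))
        T += dTdt * dt
        t += dt
    return T

# Mirror starts 5 C warm; by the end of the night it settles slightly BELOW
# ambient because of radiative cooling to the sky.
T_end = simulate(T0=5.0, T_amb=0.0)
print(T_end)
```

A mirror dropping below ambient is exactly the regime where mirror seeing changes character, which is why the paper couples such a temperature model to a semi-empirical seeing model.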
Lessons learned from parametric structural modeling of a large ground-based observatory
In the conceptual design phase for a large ground-based observatory, it is often necessary to make major design decisions affecting output figures of merit before sufficiently detailed models are available for predicting results. While a single "point design" may be selected based on expert opinion, for new and complex structures the optimal design that meets both cost and science requirements may not be obvious without analysis of many models. One solution to evaluating different designs early in the design process is to create a parametric model of the telescope structure and predict the dynamic behavior using an integrated model. An integrated model is an environment where the major disciplines of disturbance, optics, controls and structures are modeled. In this way a large tradespace across different design configurations and parametric values can be rapidly evaluated using metrics such as image motion and estimated system cost. This paper describes the steps taken by the MIT Space Systems Laboratory (MIT SSL) for large ground-based observatory preliminary design using a parametric integrated model. The parametric finite element structural model is described and representative results are shown. In particular, this paper describes lessons learned about the advantages and challenges encountered during development and implementation of the parametric integrated model, such as the usefulness of a visualization tool and the importance of subsystem model modularity.
Wind vibration analyses of Giant Magellan Telescope
Frank W. Kan, Daniel W. Eggers
The Giant Magellan Telescope (GMT) will be a 21.5-m equivalent aperture optical-infrared telescope located in Chile. The segmented mirror surface consists of seven 8.4-m diameter mirrors. The telescope structure will be inside a carousel-type enclosure. This paper describes the wind vibration analyses performed on the baseline configuration of the GMT structure during the conceptual design phase. The purposes of the study were to determine order of magnitude pointing and focus errors resulting from the dynamic response of the telescope structure to wind disturbances, and to identify possible changes to the structure to reduce the optical errors. In this study, wind pressure and velocity data recorded at the 8-m Gemini South Telescope were used to estimate the dynamic wind load on the GMT structure. Random response analyses were performed on a finite element model of the telescope structure to determine the structural response to dynamic wind loading and the resulting optical errors. Several areas that significantly contribute to the optical errors were identified using the spectral response curves and the mode shapes. Modifications to these areas were developed and analyzed to determine their effects on the optical performance, and an improved design was developed for the next phase of the design.
Active optical alignment of the Advanced Technology Solar Telescope
The Advanced Technology Solar Telescope (ATST) is a complex off-axis Gregorian design to be used for solar astronomy. In order to counteract the effects of mirror and telescope structure flexure, the ATST requires an active optics alignment strategy. This paper presents an active optics alignment strategy that uses three wavefront sensors distributed in the ATST field-of-view to form a least-squares alignment solution with respect to RMS wavefront error. The least-squares solution is realized by means of a damped least-squares linear reconstructor. The results of optical modelling simulations are presented for the ATST degrees-of-freedom subject to random perturbations. Typical results include residual RMS wavefront errors of less than 20 nm. The results quoted include up to 25 nm RMS wavefront sensor signal noise, random figure errors on the mirrors of up to 500 nm amplitude, a random decenter range of up to 500 μm, and random tilts over a range of up to 0.01 degrees (36 arcsec).
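The damped least-squares linear reconstructor described in this abstract can be sketched in a few lines. The following is an illustrative toy, not the ATST implementation: the sensitivity matrix A, the damping factor lam, and the problem dimensions are all assumed for the example.

```python
# Toy sketch of a damped least-squares linear reconstructor
# (illustrative only; A, lam, and the problem sizes are assumed).
import numpy as np

def damped_least_squares(A, b, lam=1e-2):
    """Solve min ||A x - b||^2 + lam^2 ||x||^2, i.e. x = (A^T A + lam^2 I)^-1 A^T b."""
    n = A.shape[1]
    return np.linalg.solve(A.T @ A + lam**2 * np.eye(n), A.T @ b)

# Hypothetical case: 3 alignment degrees of freedom sensed by 6 WFS signals.
rng = np.random.default_rng(0)
A = rng.standard_normal((6, 3))                   # assumed sensitivity matrix
x_true = np.array([0.5, -0.2, 0.1])               # injected misalignment
b = A @ x_true + 0.01 * rng.standard_normal(6)    # noisy sensor signals
x_hat = damped_least_squares(A, b)                # reconstructed misalignment
```

The damping term lam trades noise rejection against reconstruction bias; with a small lam and modest sensor noise the estimate closely recovers the injected perturbation.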
Modeling II: Space Telescopes and Instruments
Thermal modelling of cooled instrument: from the WIRCam IR camera to CCD Peltier cooled compact packages
Philippe Feautrier, Eric Stadler, Mark Downing, et al.
In the past decade, new thermal modelling tools have been offered to system designers. These modelling tools have rarely been used for cooled instruments in ground-based astronomy. In addition to an overwhelming increase in PC computing capabilities, these tools are now mature enough to drive the design of complex astronomical instruments that are cooled. This is the case for WIRCam, the new wide-field infrared camera installed on the CFHT in Hawaii on the Mauna Kea summit. This camera uses four 2K×2K Rockwell Hawaii-2RG infrared detectors and includes 2 optical barrels and 2 filter wheels. This camera is mounted at the prime focus of the 3.6m CFHT telescope. The mass to be cooled is close to 100 kg. The camera uses a Gifford-McMahon closed-cycle cryo-cooler. The capabilities of the I-deas thermal module (TMG) are demonstrated for our particular application: predicted performances are presented and compared to real measurements after integration on the telescope in December 2004. In addition, we present thermal modelling of small Peltier-cooled CCD packages, including the thermal model of the CCD220 Peltier package (fabricated by e2v technologies) and cold head. ESO and the OPTICON European network have funded e2v technologies to develop a compact packaged Peltier-cooled 8-output back-illuminated L3Vision CCD. The device will achieve sub-electron read noise at frame rates up to 1.5 kHz. The development, fully dedicated to the latest generation of adaptive optics wavefront sensors, has many unique features. Among them, the ultra-compactness offered by a Peltier package integrated in a small cold head including the detector drive electronics is a way to achieve excellent performance for adaptive optics systems. All these models were carried out using an ordinary PC laptop.
Dynamic tailoring and tuning of structurally connected TPF interferometer
Robust performance tailoring and tuning is a design methodology developed for high-performance, high-risk missions, such as NASA's Terrestrial Planet Finder (TPF). It improves the level of mission confidence obtainable in the absence of a fully integrated system test prior to launch. If uncertainty is high and performance requirements are difficult to meet, existing robust design techniques cannot always guarantee performance. Therefore, robust design is extended to include tuning elements that compensate for performance deviations due to parametric modeling uncertainty after the structure is built. In its early stages, the design is tailored for performance, robustness and tuneability. The incorporation of carefully chosen tuning elements guarantees that, once built, the structure will be tuneable to bring performance within requirements. In this paper we apply the dynamic tailoring and tuning methodologies to an integrated model of a structurally-connected TPF interferometer. The problem of interferometer design is posed in a robust design framework, and appropriate tuning parameters such as reaction wheel isolator corner frequency and primary mirror support design are identified. Robust performance tailoring with tuning is applied to the problem using structural optimization to obtain a design that manages uncertainty through robustness and tuning authority, and achieves a superior design to those generated with dynamic tailoring and robust design alone.
Fidelity of telescope subsystem models and the influence on simulation accuracy
Analysis of complex interdisciplinary systems such as large telescopes is usually performed using simulation models due to the expense of hardware testbeds. The level of fidelity of these simulation models, or the relative closeness to which the model simulates reality for the behavior under investigation, is often not assessed in a quantitative manner. Rather, the model is described qualitatively as being of either "low" or "high" fidelity, often progressing from lower to higher fidelity models as the candidate designs are down-selected. This paper provides a quantitative assessment of fidelity for structural subsystems for large telescope models based on the Nyquist criterion, shows how it influences simulation accuracy, and applies it to a space telescope model. This metric will be useful for assessing the fidelity of an existing structural finite element telescope model or in creating a new model having sufficient fidelity (sufficient accuracy).
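A Nyquist-style fidelity check of the kind this abstract alludes to can be illustrated very simply. The sketch below is an assumption-laden toy, not the paper's actual metric: it merely verifies that a structural model's highest retained modal frequency exceeds the disturbance bandwidth of interest by a Nyquist-like margin (the factor of 2 is an assumed choice).

```python
# Hedged illustration of a Nyquist-style model-fidelity check
# (not the paper's metric; the margin factor of 2 is an assumption).
def nyquist_fidelity_ok(modal_freqs_hz, disturbance_bw_hz, margin=2.0):
    """A structural model can only represent dynamics up to its highest
    retained mode; require that frequency to exceed the disturbance
    bandwidth by the given Nyquist-like margin."""
    return max(modal_freqs_hz) >= margin * disturbance_bw_hz

# Hypothetical model retaining modes at 1.5, 8, 42, and 95 Hz,
# checked against a 30 Hz disturbance bandwidth.
ok = nyquist_fidelity_ok([1.5, 8.0, 42.0, 95.0], disturbance_bw_hz=30.0)
```

A model that fails such a check would need additional retained modes (higher fidelity) before its simulation results in that band could be trusted.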
Systems engineering and performance modeling of the Gemini High-Resolution Near-Infrared Spectrograph (HRNIRS)
The High-resolution Near-infrared Spectrograph (HRNIRS) concept for the Gemini telescopes combines a seeing-limited R ~ 7000 cross-dispersed mode and an MCAO-fed near diffraction-limited R ~ 20000 multi-object mode into a single compact instrument operating over the 0.9 - 5.5 μm range. We describe the systems engineering and performance modeling aspects of this study, emphasizing simulations of high-precision radial velocity measurements in the Gemini Cassegrain-focus instrument environment.
Thermal models of the Planck/LFI QM/FM instruments
Maurizio Tomasi, Giorgio Baldan, Marco Lapolla, et al.
The ESA Planck mission is the third generation (after COBE and WMAP) space experiment dedicated to the measurement of the Cosmic Microwave Background (CMB) anisotropies. Two instruments will be integrated onboard: the High Frequency Instrument (HFI), an array of bolometers, and the Low Frequency Instrument (LFI), an array of pseudo-correlation HEMT radiometers. In this paper we will discuss the development of analytical and numerical models to estimate the thermal behavior of LFI, both in steady-state and transient conditions. We then describe their application to the qualification model (QM) tests. QM test data were also used to calibrate the numerical models. Finally, we show some examples about how these models can be used in predicting the instrument performances and the impact of thermal systematic effects on the scientific results.
Poster Session
Data modeling for analytical astronomical simulation VV&A
James W. Handley, Holger M. Jaenisch, Michael L. Hicklen, et al.
This paper presents a novel analytical method for comparing simulation data with measured flight data, with the goal of proving Equivalence and Consistency between the model and the real data. Our method overcomes the problems of disparity in the inputs to the simulation and of a varying number of parameters between the simulation and the measured flight data. Our method derives analytical Data Models that are analyzed in frequency space, yielding quantitative assessment values for the model performance relative to the measured flight data. The model output for a sensor and its "real-world" measured flight data are collected for comparison.
Bringing it all together: a unique approach to requirements for wavefront sensing and control on the James Webb Space Telescope (JWST)
The opto-mechanical design of the 6.6 meter James Webb Space Telescope (JWST), with its actively-controlled secondary and 18-segment primary mirror, presents unique challenges from a system engineering perspective. To maintain the optical alignment of the telescope on-orbit, a process called wavefront sensing and control (WFS&C) is employed to determine the current state of the mirrors and calculate the optimal mirror move updates. The needed imagery is downloaded to the ground, where the WFS&C algorithms to process the images reside, and the appropriate commands are uploaded to the observatory. Rather than use a dedicated wavefront sensor for the imagery as is done in most other applications, a science camera is used instead. For the success of the mission, WFS&C needs to perform flawlessly using the assets available among the combination of separate elements (ground operations, spacecraft, science instruments, optical telescope, etc.) that cross institutional as well as geographic borders. Rather than be yet another distinct element with its own set of requirements to flow to the other elements as was originally planned, a novel approach was selected. This approach entails reviewing and auditing other documents for the requirements needed to satisfy the needs of WFS&C. Three actions are taken: (1) when appropriate requirements exist, they are tracked by WFS&C; (2) when an existing requirement is insufficient to meet the need, a requirement change is initiated; and finally (3) when a needed requirement is missing, a new requirement is established in the corresponding document. This approach, deemed a "best practice" at the customer's independent audit, allows for program confidence that the necessary requirements are complete, while still maintaining the responsibility for the requirement with the most appropriate entity.
This paper describes the details and execution of the approach; the associated WFS&C requirements and verification documentation; and the implementation of the primary database tool for the project, DOORS (Dynamic Object-Oriented Requirements System).
Modeling the cost of detector controller support in a multi-instrument observatory
Multi-instrument observatories have followed two different approaches to scientific detector controllers: the ESO approach, with standard detector controllers for all instruments, and the Gemini approach, with individual controllers for each instrument. Observatories may consider the transition to standard controllers in current and new instruments when pursuing objectives such as a reduction in engineering resources and an improvement in data quality. Both alternatives have their pros and cons, but it is sometimes difficult to justify the transition. The entrance of open-source projects onto the detector controller scene may help change the paradigm in this area, especially if investment in hardware is reduced. The problem is discussed here, with emphasis on modeling the real costs of supporting multiple detector controllers in an observatory, in an attempt to justify the use of standard detector controllers.
A survey of ground operations tools developed to plan and validate the pointing of space telescopes and the design for WISE
WISE, the Wide Field Infrared Survey Explorer, is scheduled for launch in June 2009. The mission operations system for WISE requires a software modeling tool to help plan, integrate and simulate all spacecraft pointing and verify that no attitude constraints are violated. In the course of developing the requirements for this tool, an investigation was conducted into the design of similar tools for other space-based telescopes. This paper summarizes the ground software and processes used to plan and validate pointing for a selection of space telescopes; with this information as background, the design for WISE is presented.
SPITZER bandmerge GUI
M. Pesenson, W. Roby, J. Fowler, et al.
A graphical user interface (GUI) for bandmerging is presented. The purpose of the Bandmerge GUI is to provide an integrated graphical user interface for running the bandmerge module and its support modules, giving astronomers an interactive tool for bandmerging. The bandmerge module identifies multi-band detections of an individual point source and merges the information in the different bands into a single record of the source. The Java application provides an interface to downlink software that is normally invoked on the command line. With the Bandmerge GUI, a SPITZER general user can select the data to be processed, specify processing parameters, and invoke the Bandmerge pipelines.
Study on position error of fiber positioning measurement system for LAMOST
Yi Jin, Chao Zhai, Xiaozheng Xing, et al.
An investigation is carried out of the measuring precision of the measurement system applied to the optical fiber positioning system for LAMOST. In the fiber positioning system, the geometrical coordinates of the fibers must be measured in order to verify the precision of fiber positioning, and this is one of the most pivotal problems. The measurement system consists of an area CCD sensor, an image acquisition card, a lens and a computer. Temperature, vibration, lens aberration and the CCD itself can all cause measuring error. Because fiber positioning is a dynamic process in which the fibers reverse, additional error is introduced. The paper focuses on analyzing how the different states of the fibers influence the measuring precision. The fibers are glued together so that their relative positions remain fixed while they rotate around a common point. The distances between fibers are measured under different experimental conditions, and the influence of the fibers' state is then obtained from the change in these distances. The influence of the different factors on position error is analyzed theoretically and experimentally. Position error can be decreased by changing the lens aperture setting and by polishing the fibers.
Architecture of the integrated model of the Euro50
The Euro50 is a proposed 50m extremely large telescope for optical and infrared wavelengths. To study and predict the performance of the complete telescope system, an integrated model combining the structural model of the telescope, optics models, the control systems and the adaptive optics has been established. Wind and atmospheric disturbances are also included in the model. The model is written in MATLAB and C. It is general and modular and built around dedicated ordinary differential equation solvers. The difference in time constants between subsystems is exploited to speed up calculations. The solvers can handle discontinuities and subsystem mode changes. The high degree of modularity allows different telescope designs to be modelled by rearranging subsystem blocks. Certain subsystems, for instance adaptive optics, can also run in a standalone fashion. Parts of the model are parallelized for execution on a large shared memory machine. The resulting architecture of the integrated model and sample results using the code for different telescope models are presented.
TMT studies on thermal seeing modeling: mirror seeing model validation
Mirror and dome seeing are critical effects influencing the optical performance of large ground-based telescopes. Computational Fluid Dynamics (CFD) and optical models that simulate mirror seeing in the Thirty Meter Telescope (TMT) are presented. The optical model used to quantify the effects of seeing utilizes the spatially varying refractive index resulting from the expected theoretical flow field, while the developed CFD post-processing model estimates the corresponding refractive-index structure constant (Cn2) distribution. The seeing effects corresponding to a flat plate are calculated, and plots of seeing versus temperature difference, velocity and plate length are generated. The discussion presented contains comparisons of the results from the two models with published empirical relations.
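The link between a Cn2 distribution and a seeing estimate can be sketched with the standard Kolmogorov relations. This is a hedged illustration of the textbook formulas, not TMT's CFD post-processing model; the uniform Cn2 profile, path length, and wavelength below are all assumed.

```python
# Hedged sketch: integrated Cn^2 -> Fried parameter r0 -> seeing FWHM,
# using standard Kolmogorov-turbulence relations (not the TMT model).
import numpy as np

def fried_parameter(cn2_profile, dz, wavelength=500e-9):
    """r0 = [0.423 k^2 * integral(Cn^2 dz)]^(-3/5), with k = 2*pi/lambda."""
    k = 2.0 * np.pi / wavelength
    integral = np.sum(cn2_profile) * dz          # simple rectangle rule
    return (0.423 * k**2 * integral) ** (-3.0 / 5.0)

def seeing_fwhm_arcsec(r0, wavelength=500e-9):
    """Seeing-limited FWHM ~ 0.98 * lambda / r0, converted to arcseconds."""
    return 0.98 * wavelength / r0 * 206265.0

# Assumed example: uniform Cn^2 = 1e-15 m^(-2/3) over a 10 m path near the mirror.
cn2 = np.full(100, 1e-15)
r0 = fried_parameter(cn2, dz=0.1)
fwhm = seeing_fwhm_arcsec(r0)
```

Even this toy shows the strong sensitivity of seeing to the integrated Cn2 along the optical path, which is why local temperature differences at the mirror surface matter so much.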
Integrated modeling approach for an active optics system
In all the new-technology telescopes (medium, large, extremely large), the active optics is one of the most delicate systems. Sometimes only a perfect control of the mirrors can justify the whole telescope project. No established simulation tool exists; one of the reasons is that the approach can be different for segmented or monolithic, medium or large mirrors, and so each telescope design staff usually develops its own simulation system. In this paper an integrated modeling approach is proposed, combining finite element analysis, dynamical systems simulation methods and optical performance analysis.
Making it real: computer graphics and astronomical instrumentation
Photo-realistic computer graphics software provides a valuable visualisation tool in the development of astronomical instrumentation. The perceived realism can convey a valuable sense of feasibility or unfeasibility of a concept. We here look at POV-Ray, an open-source ray-tracing graphics package that has been in use for several years at the Anglo-Australian Observatory. AAO applications include development of both static images and short movies for detailed visualisations of conceived instruments, production of high-quality images for publication and publicity, rapid generation of illustrative figures and simple geometrical and optical studies.
Numerical modeling of influence telescope segment placement at aperture and its quality on focusing radiated field
Aleksey P. Maryasov, Nicolay P. Maryasov
The feasibility of using scalar diffraction theory to calculate the influence of different telescope aperture segments, and of their surface quality represented as phase inhomogeneity, on the quality of the focused radiated field is analyzed. An algorithm and a program for numerical calculation are developed and described that are free from the assumption of optical-system axial symmetry and from constraints of the type kr >> 1. The values of phase inhomogeneity determine the quality requirements for high-aperture and/or segmented mirrors, which in the general case are not axially symmetric. The Strehl ratio is used as the metric for the influence of segment placement and segment quality on the resulting focus. The influence of phase inhomogeneity is found to decrease as segments move away from the axis (in the axially symmetric case), and then to increase somewhat near the aperture edge. The developed method makes it possible to obtain the amplitude and phase distribution for a real surface at arbitrary distances from the telescope mirror. It provides a basis for further research on correcting images of distant stars using the calculated point spread function of the optical system, and is also promising for increasing resolution.
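The Strehl-ratio metric used in this abstract is commonly approximated from the RMS wavefront error via the extended Maréchal approximation. The following is an illustrative sketch of that standard relation, not the authors' numerical program; the example wavefront error and wavelength are assumed.

```python
# Illustrative sketch of the Strehl-ratio metric via the extended
# Marechal approximation: S ~ exp(-(2*pi*sigma/lambda)^2),
# where sigma is the RMS wavefront error (valid for small aberrations).
import math

def strehl_marechal(sigma_wfe, wavelength):
    """Strehl ratio for RMS wavefront error sigma_wfe (same units as wavelength)."""
    return math.exp(-(2.0 * math.pi * sigma_wfe / wavelength) ** 2)

# Assumed example: 50 nm RMS wavefront error observed at 632.8 nm.
s = strehl_marechal(50e-9, 632.8e-9)
```

A Strehl ratio near 1 indicates near-diffraction-limited focusing, so the metric directly quantifies how much segment misplacement or surface error degrades the focus.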
Analytical modeling of the optical transfer function of a segmented telescope with/without adaptive optics correction of the telescope's dynamical aberrations
An all-analytic optical transfer function (OTF) tool for characterizing the performance of a large segmented telescope with/without adaptive optics (AO) correction of the telescope's dynamical aberrations is presented. This tool is to be applied to the determination of the Thirty Meter Telescope (TMT) optical error budget, for both telescope aberrations and AO system specifications. It takes into account the effect of the dynamical aberrations of all optical surfaces from all the hexagonal segments to the tertiary mirror, and includes as an option AO correction of these errors. Here we present the mathematical development of the method, and give an example of application to a 73-segment, 10-m telescope, without AO correction.
Practical aspects of image system validation using trans-illumination
Andrew T. Cochrane, Garth C. Robins, Gary J. Baker, et al.
The effects upon imaging due to varying the spatial coherence of the illumination in an optical system are studied. A rotating diffuser is located directly behind the object in an optical system and is trans-illuminated with spatially coherent monochromatic light. The statistical properties of the diffuser surface determine the scattering cone angle and the partial coherence effects in the image. A model is presented that can be used to determine the diffuser properties required to yield incoherent imaging. Two metrics are used to determine if an image is incoherent: the apparent transfer function and image contrast.
System analysis tools for an ELT at ESO
Engineering of complex, large-scale systems like the ELT designs currently investigated and developed in Europe and North America requires powerful and sophisticated tools within specific technical disciplines such as mechanics, optics and control engineering. However, even analyzing a single component of the telescope, such as the telescope structure, necessitates a system approach to evaluate the structural effects on the optical performance. This paper shows several software tools developed by the European Southern Observatory (ESO) which focus on the system approach in the analyses: Using modal results of a finite element analysis, the SMI-toolbox allows an easy generation of structural models of different sizes and levels of accuracy for the control design and closed-loop simulations. The optical modeling code BeamWarrior was developed by ESO and Astrium GmbH (Germany) especially for integrated modeling and interfacing with a structural model. Within BeamWarrior, displacements and deformations can be applied in an arbitrary coordinate system, and hence also in the global coordinates of the FE model, avoiding error-prone transformations. In addition to this, a sparse state-space model object was developed for Matlab to gain computational efficiency and reduce memory requirements by exploiting the sparsity pattern of both the structural models and the control architecture. As one result, these tools allow building an integrated model in order to reliably simulate interactions, cross-coupling effects and system responses, and to evaluate global performance. In order to evaluate disturbance effects on the optical performance in open loop more efficiently, an optical evaluation toolbox was built in the FE software ANSYS which performs Zernike decomposition and best-fit computation of the deformations directly in the FE analysis.
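The payoff of a sparse state-space representation for a modally-reduced structural model can be seen in a minimal sketch. This is not ESO's Matlab toolbox; it is an assumed toy in Python/SciPy where the modal frequencies and uniform damping ratio are invented for illustration. The system matrix of a modal model is built from diagonal blocks, so sparse storage and multiplication scale with the number of nonzeros rather than with n².

```python
# Minimal sketch (not ESO's toolbox): sparse state-space matrix of a
# modally-reduced structure. Modal frequencies and damping are assumed.
import numpy as np
import scipy.sparse as sp

n_modes = 1000
omega = np.linspace(1.0, 100.0, n_modes) * 2 * np.pi   # modal frequencies (rad/s)
zeta = 0.02                                            # assumed uniform modal damping

# State x = [q; qdot]; A = [[0, I], [-diag(omega^2), -diag(2*zeta*omega)]].
I = sp.identity(n_modes)
A = sp.bmat([[None, I],
             [sp.diags(-omega**2), sp.diags(-2.0 * zeta * omega)]], format="csr")

x = np.ones(2 * n_modes)
xdot = A @ x   # sparse matvec: cost O(nnz), not O(n^2)
```

Here A has only 3 nonzeros per mode (3000 total) instead of the 4,000,000 entries of the equivalent dense 2000×2000 matrix, which is exactly the kind of saving the sparse model object exploits in closed-loop simulations.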
Stray-light sources from pupil mask edges and mitigation techniques for the TPF Coronagraph
Stray-light sources from pupil plane masks that may limit Terrestrial Planet Finder Coronagraph (TPF-C) performance are characterized [1,2], and mitigation strategies are discussed to provide a guide for future development. Rigorous vector simulation with the Finite-Difference Time-Domain (FDTD) method is used to characterize waveguiding effects in narrow openings, sidewall interactions, manufacturing tool-marks, manufacturing roughness, mask tilt, and cross-wavelength performance of thick silicon mask structures. These effects cause stray light that is not accounted for in scalar thin-mask diffraction theory, the most important of which are sidewall interactions, waveguiding effects in narrow openings, and tilt. These results have been used to improve the scalar thin-mask theory used to simulate the TPF-C with the Integrated Telescope Model [3]. Of particular interest are simulations of 100 μm thick vertical sidewall openings that model features typically found on Ripple masks [4] fabricated by Reactive Ion Etching (RIE) processes [5]. This paper contributes fundamental data for systematically modeling these effects in end-to-end system simulation. Leakage straight through the mask material varies greatly with wavelength, especially in silicon (an attractive mask material due to the precision manufacturing techniques developed by the IC industry). Coating the silicon with 200 nm of chrome effectively mitigates the leakage without causing additional scattering. Thick-mask diffraction differs from the predictions of scalar thin-mask theory because diffraction spreading is confined by the mask's sidewalls. This confinement can make a mask opening look electromagnetically larger or smaller than designed, by up to 3λ per vertical sidewall on a 50 μm thick mask, yet this can be reduced by an order of magnitude by undercutting the sidewalls by 20°.
These confinement effects are sensitive to mask tilt (if light reaches the sidewalls) which can lead to an imbalance in stray-light sources and an extra wavelength of effective opening change on the illuminated sidewall.
Project management of an imaging optical interferometer
The Magdalena Ridge Observatory Interferometer (MROI) is part of a new observatory dedicated to astronomical research. It is a 6 element optical interferometer currently in its construction phase, with a planned phase B of 10 elements. The observatory is located within 32 km from the centre of the Very Large Array (VLA) at an altitude of approximately 3230 meters. The design is optimized for faint source imaging. This makes it one of the most advanced high spatial resolution optical instruments available to the scientific community. With a staffing of up to 20 scientists and engineers, and a large fraction of the telescopes, buildings, and delay lines outsourced to industry and consortium partners, it aims for an aggressive schedule to have first fringe with 6 telescopes in late 2009. A project this size in budget, tight milestones and deadlines, requires professional management. In this paper we address the basic principles that are followed in the project management approach. We describe a generic approach and at some instances the implementation chosen at MROI.