Focus on the Science, Not the Scientist
Ken Sembach, sembach [at] stsci.edu
Science thrives best in an equitable and open environment. The Hubble Space Telescope owes its unparalleled scientific productivity in no small part to its constitution as an international observatory, welcoming proposals from anyone in the worldwide astronomical community. As the prime steward for Hubble science, the Institute is committed to ensuring that the community competes for observing time on a level playing field. Fairness and equity are an integral part of the peer-review process used for proposal evaluations, so this is a commitment that we take very seriously.
It is disquieting to see a systematic offset in the success rates of proposals led by male and female principal investigators (PIs). While small for any given observing cycle, this offset cannot be explained as statistical noise in the review process, as it is persistent and always in the same sense: women do not fare as well as expected based on the percentage of proposals submitted. This leads us to believe that subtle, but important, unconscious biases may be a factor in the proposal reviews. Such biases are well documented in many disciplines and are known to affect the peer review process (see bibliography in reference ). Hubble is not the only observatory showing this effect—similar trends have been seen in the proposal selection processes for other observatories as well. This raises the possibility that common biases affect proposal assessments, perhaps extending beyond gender to race, geographic location, home institution, seniority, and other factors that have not yet been fully evaluated.
Following discussions with the Space Telescope Users Committee (STUC), we implemented progressive changes in the proposal format over the past several observing cycles to obscure the identity of the PI while preserving the investigator list in an alphabetized order. We also brought in an outside expert, Dr. Stefanie Johnson (University of Colorado, Leeds School of Business), as a consultant and observer at the Cycle 25 Time Allocation Committee (TAC) meeting in 2017. Dr. Johnson has extensive professional experience in developing strategies to mitigate unconscious bias. Her analysis of the Hubble results shows that the proposal format changes we have implemented led to more balanced results in the preliminary grades, which were submitted prior to the face-to-face discussions of the proposals. However, a gender bias was still evident in the final set of proposals recommended to the Director for implementation after the in-house panel review. Dr. Johnson noted that some proposal discussions continued to focus on the proposers rather than scientific objectives. Her recommendation was to consider fully anonymizing the identities of the proposers, which would have the added benefit of minimizing conflicts of interest and reducing unknown biases, as well as those identified. While this approach is common in other fields, it represents a significant departure from the norm for much of the astronomical community.
With that in mind, I asked the STScI Science Mission Office (SMO) and the HST Mission Office (HSTMO) to constitute a working group [see Table 1] to identify an appropriate process for rendering proposals anonymous and developing instructions and guidelines for writing and reviewing proposals. As part of their deliberations, the working group solicited community feedback, receiving 60 responses: many were highly supportive, others less so. Separately, I also received letters, some in support and others suggesting alternatives to fully anonymizing the proposals. Some alternative suggestions favored a random draw of proposals meeting minimum qualifications over peer review; some suggested making the entire process open so that both reviewers and proposers are known to each other; and others advocated actively reordering proposal rankings near the cutoff to achieve a specified result. In my opinion, these alternative approaches neither preserve the value of the peer-review process nor produce a level playing field in a manner consistent with the thoughtfulness and effort invested in these proposals by the community.
| Table 1: Working Group on Anonymizing Proposals | |
| --- | --- |
| Lou Strolger, STScI [chair] | Neill Reid, STScI [SMO; ex officio] |
| Tom Brown, STScI [HSTMO; ex officio] | Christina Ritchie, JPL |
| Peter Garnavich, Notre Dame | Paule Sonnentrucker, STScI |
| Stefanie Johnson, U. Colorado | Michael Strauss, Princeton |
| Mercedes Lopez Morales, CfA [STUC] | Brian Williams, STScI |
| Andrea Prestwich, CfA | |
The most pressing area of concern raised by the community centers on vetting the proposing teams—specifically, ensuring that proposers have the expertise to write not only compelling proposals, but also to conduct the science proposed. In earlier days, this was a primary evaluation criterion for Hubble proposals, but in recent years has been de‑emphasized. There is also some unease about the effort required to write proposals that will need to conform to new guidelines for proposal anonymity.
The recommendations of the working group address these concerns directly. In short, the proposals will be reviewed strictly on science merit, without revealing the proposers' names. Proposals will need to be written in a manner that eliminates self-identification. Specific recommendations for how that should be done, along with additional information, are available at the websites below.[2, 3] After the science evaluations and ranking are completed, the names of the investigators and a self-provided description of qualifications will be reviewed for proposals ranked above the science merit cutoff. These latter expertise assessments will not alter the proposal rankings, but any identified deficiencies will be brought to the attention of the Director for consideration in the final approval process.
Based on the working group's recommendations and the endorsement of the Space Telescope Institute Council (STIC) and the Goddard Space Flight Center HST Project Office, the Institute will implement an anonymous review process in Cycle 26. I’d like to thank everyone involved in reaching this decision. Our overarching goals are to place the focus of the reviews squarely on the science and to give every proposal team an equal platform to make their case. I recognize that this change may be unfamiliar to some, yet I steadfastly believe that it is the fairest and most defensible approach to ensuring impartial reviews of all proposals. The Institute will continue to monitor and publish the outcomes of the peer reviews, and measure progress against the well-established history of the observatory. Time will tell, but I am confident we will look back and view this change as a positive watershed moment for the mission.
Webb @ STScI Update
Klaus Pontoppidan, pontoppi [at] stsci.edu; Bonnie Meinke, meinke [at] stsci.edu; Nikole Lewis, nlewis [at] stsci.edu
Staff at the Institute continue to work toward supporting science operations with the James Webb Space Telescope. The Science and Operations Center (S&OC) software package version 2.0 was delivered to NASA, and new versions of the suite of user tools are available for proposal planning. The launch delay of Webb, announced by NASA on March 27, led to a delay of the GO Cycle 1 proposal deadline to no earlier than February 1, 2019.
Launch delay and the Cycle 1 proposal deadline
Based on recommendations made by the JWST Standing Review Board, NASA is re-scheduling Webb's launch window for 2020. Given those circumstances, the Institute will delay the Cycle 1 GO/AR proposal deadline until no earlier than February 1, 2019. A revised proposal schedule will be developed in consultation with the JWST Users Committee, the JWST Project, and representatives from the European and Canadian Space Agencies. Proposals already submitted in response to the Cycle 1 Call will not be carried over and will need to be resubmitted.
Science and Operations Center status
The Institute is responsible for developing the JWST Science & Operations Center (S&OC), which includes the Flight Operations Subsystem (FOS), the Wavefront Sensing and Control Subsystem (WSS), the Project Reference Database Subsystem (PRDS), the Proposal Planning Subsystem (PPS), and the Data Management Subsystem (DMS). Version 2.0 of the S&OC was verified and delivered to NASA in October 2017; it verified more than 100 requirements and keeps the project on track for a timely delivery of the flight build. S&OC version 2.0.1 verified more than 30 additional requirements at the end of March 2018.
The user tool suite has been updated to include new tools and to enhance the functionality of existing ones. This includes releases up to version 25.4.4 of the Astronomer's Proposal Tool (APT) and version 1.2.2 of the Exposure Time Calculator (ETC). The APT updates are particularly useful for observers of non-sidereal (solar system) objects and for users of Aladin for visualizing proposals. The ETC updates include significant enhancements for coronagraphic modes, as well as improved treatment of dithered observations. The background radiance that Webb is subjected to, as used by the ETC, can be explored using the JWST Backgrounds Tool (JBT).
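The kind of calculation an ETC performs can be illustrated with the standard point-source signal-to-noise equation. This is a deliberately simplified sketch, not the JWST ETC itself (which models throughputs, PSFs, and backgrounds in far greater detail), and all the numerical values below are placeholders rather than JWST instrument parameters:

```python
import math

def point_source_snr(src_rate, bkg_rate_per_pix, dark_rate, read_noise, npix, t):
    """Simplified CCD-equation SNR estimate for a point source.

    src_rate         -- source count rate in the extraction aperture (e-/s)
    bkg_rate_per_pix -- background count rate per pixel (e-/s/pix)
    dark_rate        -- dark current per pixel (e-/s/pix)
    read_noise       -- read noise per pixel (e- RMS)
    npix             -- number of pixels in the extraction aperture
    t                -- exposure time (s)
    """
    signal = src_rate * t
    noise = math.sqrt(signal
                      + bkg_rate_per_pix * npix * t
                      + dark_rate * npix * t
                      + npix * read_noise ** 2)
    return signal / noise

# Illustrative, made-up values: a bright source observed for 100 s
snr = point_source_snr(100.0, 10.0, 0.01, 10.0, 25, 100.0)
```

Improved background models, as in the ETC update described above, enter this estimate through the background term, which for faint sources can dominate the noise budget.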
New releases include the Space Telescope Image Product Simulator (STIPS), which enables full-field simulations of the Near-Infrared Camera (NIRCam) and Mid-Infrared Instrument (MIRI) imaging modes. Observers of solar system objects can now also calculate visibilities for moving objects using the Moving Target Visibility Tool (MTVT). Similarly, users of the Near-Infrared Spectrograph (NIRSpec) multi-object spectroscopy mode can visualize their observations with the NIRSpec Observation Visualization Tool (NOVT).
Exploring the GTO and ERS programs
The Institute received 106 proposals for the Director's Discretionary Early Release Science Program (DD-ERS). Following the recommendations of the Time Allocation Committee and a technical review, 13 programs were selected by the STScI Director. The programs span a wide range of science areas and observing modes, and include investigators from 18 countries. Detailed information about the Early Release Science programs can be found on the Webb science website. For any accepted science program, it is possible to download the detailed APT file using the Program Information Tool to see exactly what is planned. Some GTO teams have waived their Exclusive Access Period, such that their data become public immediately after they are obtained. This also means that archival proposals for Cycle 1 can be based on these programs. For those concerned about avoiding duplications in their proposals, it is now possible to search MAST for potentially duplicated targets (GTO and ERS for Cycle 1 proposals).
The DD-ERS programs will be presented and discussed at the 232nd summer AAS in Denver, CO during four special sessions on "Distant Galaxies and Cosmic Dawn," "Nearby and Resolved Galaxies," "Astrochemistry and the ISM," and "Planets."
Webb outreach at South by Southwest
Several Institute scientists were at the South by Southwest (SXSW) Festival in Austin, TX to give talks about Webb and its science possibilities. Nikole Lewis, Bonnie Meinke, Joel Green, and Jason Kalirai spoke on two separate panels: "All These Worlds are Yours" and "What Will the Webb Telescope Discover First?"—the first about how we use space telescopes to characterize other worlds, and the second about the bold science goals of Webb in Cycle 1. Each panel received ~100 in-person attendees. Joel Green and Jason Kalirai also joined a panel discussion on Webb hosted by "Huge" and "Quartz Media."
HST @ STScI Update
Rachel Osten, osten [at] stsci.edu
This Newsletter article summarizes recent Hubble happenings. Changes to how research grant funds are distributed will reduce the amount of obligated but unspent funds. Single-cycle grants will be limited to three years plus one possible no-cost extension. The number of no-cost extensions to grants will be limited. The Phase Ⅱ budget and proposal deadline will be strictly enforced in Cycle 26 and future cycles. Phase Ⅰ observing proposal requests for instruments subject to health and safety issues must describe observations in sufficient detail to demonstrate safety. All visit-level and most exposure-level observing restrictions must now be specified in the Phase Ⅰ proposal. A special session at the 231st AAS meeting highlighted the scientific results enabled by Hubble's UV Initiative. Overall, the observatory and its instruments are healthy and operating nominally. Note, however, that one of the solid-state recorders on Hubble experienced increased errors, leading to a change in how it is being used. Gyroscope #2 displays large changes in its bias levels, leading to increased jitter, and very recently Gyroscope #1 suffered an unrecoverable error.
Changes to Grants Administration
A key component of the success of Hubble has been robust support of approved observing programs through grants for data reduction, analysis, and publication of scientific results. Over the last several years, the rate at which grant funds have been spent has been significantly lower than the rate at which funds are distributed to grantee institutions. This discrepancy accounts for a large amount of unspent funds in the Hubble grants program. Starting in October 2017, the Institute's Grants Administration office changed the funding allocation schedule for Hubble grants in order to reduce the pool of funds that are obligated but unspent (also known as uncosted carryover). The total amount of committed funding has not changed.
The new process changes the timing over which funds are disbursed. As before, funding amounts are determined as percentages of the approved grant total. Future allotments will still be made in increments; however, they will be based upon invoicing and payments rather than a pre-established schedule. Additional allotments of funding can always be made available upon request, generally within one business day. The increments are based on the total amount of the award; see Table 1 for a summary. These changes will provide institutions with immediate access to funding through automatic funding releases. If no expenditures are reported within the first nine months of the award, a notice will be sent to the grantee. Should no expenditures be reported within the first twelve months, the available funding will be reduced by 50%. The grant award amount is not affected; only the funding available at the next increment is. This applies to all new grant awards. Funding for existing awards will follow the schedule laid out in the most recent applicable grant award or amendment document.
| Approved Amount | Available at Award | Available at 90% Expended |
| --- | --- | --- |
| Up to $30,000 | 100% | — |
| Up to $50,000 | 50% | 50% |
| Greater than $50,000 | 20% | 20% in equal increments |
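The allotment schedule in the table reduces to a simple rule for the funds released automatically at award. The sketch below is a schematic reading of the published schedule, not an authoritative statement of grant terms; consult the Grants Administration website for the governing language:

```python
def initial_allotment(approved_amount):
    """Funding released automatically at award, per the allotment schedule.

    Later increments are released as invoicing shows roughly 90% of the
    currently available funds have been expended.
    """
    if approved_amount <= 30_000:
        return approved_amount        # 100% available up front
    elif approved_amount <= 50_000:
        return 0.5 * approved_amount  # 50% at award, 50% at 90% expended
    else:
        return 0.2 * approved_amount  # 20% at award, rest in 20% increments

# e.g., a $200,000 award starts with $40,000 available
</

```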
In addition, the number of no-cost extensions will be limited to one. The initial period of performance for an award is three years. A request with justification to extend the performance for up to 12 months will generally be approved. No subsequent extension requests will be accepted.
More detailed information regarding changes to the funding allocation process is available at the STScI Grants Administration website (http://www.stsci.edu/institute/grants).
These changes are necessary for the Hubble grants program to remain healthy and efficient.
Enforcement of deadlines/restrictions
For most of Hubble's on-orbit operations, the submission process for observing time and associated funding has occurred in two phases with different deadlines. The Phase Ⅰ deadline applies to initial requests for observing time and the associated science justification, as well as to descriptions of archival or theoretical science programs. The Phase Ⅱ deadline applies to the detailed observing specifications for programs awarded telescope time, and to grant proposals associated with approved programs. The Phase Ⅰ deadline has always been firm (late submissions are generally not accepted); the Phase Ⅱ deadline is now strict as well. Observing specifications for approved programs must be received by the Phase Ⅱ deadline and must be described in sufficient detail for long-range planning to commence. Programs that do not meet this deadline are subject to cancellation. Successful Phase Ⅰ proposers must also submit grant proposals by the Phase Ⅱ deadline, or the program risks not receiving any funding.
Detailed visit- and exposure-level specifications have generally been provided at the time of the Phase Ⅱ proposal deadline. New for Cycle 26 is the requirement that all special requirements placed at the visit level, and most at the exposure level, be stated explicitly in the Phase Ⅰ observing proposal with a brief justification. Visit-level special restrictions include orientation requirements, guiding, timing requirements, conditional requirements, and special observation requirements. Exposure-level special requirements are those pertaining to target acquisition, target position, timing, special observations, implementation requirements, and special communications.
These additions are described in the Cycle 26 Call for Proposals.
Preparing a Phase Ⅰ observing proposal involves motivating the science and describing the proposed observations. Observers proposing to use instruments with health and safety concerns are required to include information in their description of observations that establishes the safety of the measurements. This information must now be included in the Phase Ⅰ observing proposal. Programs that do not contain it may be subject to cancellation.
AAS special session: Astrophysics enabled by Hubble's Ultraviolet Initiative
Observations with the Hubble Space Telescope open a window on ultraviolet (UV) wavelengths, enabling astrophysics that would not otherwise be possible. The Ultraviolet Astrophysics Legacy Initiative, in place since Cycle 21, recognizes the unique resource Hubble has in its access to the UV portion of the electromagnetic spectrum and encourages proposals for this finite resource. The 231st meeting of the American Astronomical Society, which took place at National Harbor in January 2018, featured a special session entitled "Science Enabled by Hubble's Ultraviolet Initiative." The session highlighted some of the programs undertaken as part of the UV initiative, covering the breadth of science done with UV observations. The invited talks set the landscape for future science at these wavelengths, and presented the key role UV observations will have in interpreting results from future missions like the James Webb Space Telescope. The speakers and their talk titles are listed below. A webcast of their presentations is available at this location (HST Special AAS 231).
Jelle Kaastra (SRON Netherlands Institute for Space Research); Deciphering AGN Outflows with Joint UV and X‑ray Observations
Daniela Calzetti (University of Massachusetts Amherst); Linking the Scales of Star Formation
Bill Sparks (STScI); Evidence of Plumes on Europa from FUV Observations with HST
Marc Rafelski (STScI); The Ultraviolet Hubble Ultra Deep Field
Jessica Werk (U. of Washington); Why Circumgalactic Matter Matters in Galaxy Evolution
Hubble's instruments continue to operate nominally, extending the scientific life of the observatory and enabling ground-breaking astrophysics.
Solid-State Recorder 3
In November 2017, Hubble's ground system began detecting increased errors when playing back science data received from the telescope. The errors originated on Solid-State Recorder 3 (SSR‑3), which was placed on the telescope in 1999. A tiger team formed to investigate possible causes and decided to use a different bus in one of the memory cards on SSR‑3 as a workaround. There is no indication of any imminent failure of any parts.
Gyroscopes
Gyroscope 2 (Gyro-2) on Hubble has been showing signs of aging since the fall of 2017, with occasional large bias-rate jumps and an increased level of jitter. The former has required periodic gyro scale corrections to ensure successful target acquisitions and re-acquisitions. These behaviors sometimes result in slightly extended acquisition times, delaying setting of the Take-Data flag and impacting a small number of science observations. Recent modifications to acquisition timing rules are expected to greatly reduce this problem without impact to target visibility times.
Jitter measurements with the Fine Guidance Sensors (FGSs) show a secular increase over the last year, with a marked uptick around October 2017. In addition to large bias-rate drifts, Gyro-2 exhibits relatively high-frequency noise (0.01 to 1 Hz), larger than that of any previous HST gyro since launch. This noise has resulted in increased boresight jitter as measured by the FGSs and science instruments. Each science instrument team evaluated the jitter amplitude threshold above which science goals become compromised. The minimum threshold for science impact is 10–15 mas, depending on the instrument and the needs of the science programs. No changes are currently planned, but should the jitter increase beyond 15 mas, changes to the gyro configuration may be implemented.
In the midst of careful scrutiny of Gyro-2’s behavior, Gyro-1 suffered an unrecoverable failure on the morning of Saturday, April 21. Its lifetime was consistent with those of other standard flex leads. This failure leaves four working gyros on the observatory, three of which have enhanced flex leads with an expected lifetime roughly five times longer than the standard ones. We do not want to retire Gyro-2 prematurely: the longer we use it, the longer the observatory can remain in Three-Gyro Mode for science observations. The current plan is to remain in Three-Gyro Mode until only two functional gyros remain, then drop to Reduced-Gyro Mode on a single gyro, with modest degradation in performance.
Community Missions Office (CMO) Update: Focus on TESS
Marc Postman, postman [at] stsci.edu; William Sparks, sparks [at] stsci.edu
The TESS mission promises to produce a new, valuable, rich survey of exoplanets. The targets observed will be observable by other telescopes for follow-up studies and characterization. The TESS Data Archive, resident in the MAST facility, has passed its ground system testing and will be available for data mining through the Institute's archive portal.
The TESS mission
On April 18, NASA's latest exoplanet detection mission—the Transiting Exoplanet Survey Satellite (TESS)—lifted off from the Kennedy Space Center onboard a SpaceX Falcon 9 launch vehicle. This began TESS's two-year-long survey to discover thousands of new exoplanets within the local solar neighborhood. The observational strategy employed by TESS will be to scan the entire sky over 24 months, covering 2300 square degrees about every 27 days. TESS is designed to focus on detection of transiting planets orbiting stars within about 200 light-years from the Sun. Of the thousands of new worlds predicted to be discovered by TESS, about 300 are expected to be no larger than twice the size of the Earth, some of which may orbit within the habitable zones of their host stars. By contrast, NASA's Kepler mission largely monitored a single field of view in the Cygnus–Lyra region for many years and discovered thousands of exoplanets around more distant stars up to 3000 light-years from the Sun. The more distant worlds found by Kepler are mostly too faint for more detailed follow-up with other telescopes. The much closer exoplanets to be discovered by TESS will be more amenable to follow-up with other space-based telescopes, like Webb, or large ground-based telescopes. These follow-up observations will allow researchers to characterize the temperature, density, and atmospheric chemistry of many of these newly found worlds.
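The photometric precision that transit surveys like TESS and Kepler require can be seen from a standard back-of-the-envelope relation (not taken from this article): the fractional dip in stellar flux during a transit is the square of the planet-to-star radius ratio.

```python
def transit_depth_ppm(r_planet_km, r_star_km):
    # Fractional flux dip as the planet crosses the stellar disk,
    # in parts per million: depth = (Rp / R*)**2
    return (r_planet_km / r_star_km) ** 2 * 1e6

# An Earth-size planet transiting a Sun-like star dims it by only ~84 ppm,
# which is why space-based photometry is needed for the smallest worlds.
earth_sun = transit_depth_ppm(6371.0, 695_700.0)
```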
TESS Data Products
TESS will acquire high-precision photometry of more than 200,000 stars during its two-year survey with a cadence of approximately two minutes. These targets will be read out as "postage stamps" and made available to the community as target pixel files (TPFs) and calibrated light curves. In addition, the full image frame will be read out approximately every 30 minutes. These Full-Frame Images (FFIs) will enable users to conduct photometry on any target within each 24 × 96 degree field of view. TESS will also engage the broader community of astronomers with its guest investigator program. Researchers will be able to request up to 20,000 additional sources for study during the course of TESS's all-sky survey. All of the data from TESS will be ingested and made available for distribution to the community via the Mikulski Archive for Space Telescopes here at the Institute.
The TESS data archive ground-system at MAST has passed all its pre-launch tests and is ready for community use. Indeed, some pre-flight TESS data products are already available at MAST. These products include the TESS Input Catalog (TIC), which is a catalog of every optically luminous, persistent object in the sky that extends down to the limiting magnitudes of its component catalogs. The TIC contains more than half a billion objects, and includes catalog information from Gaia, 2MASS, SDSS, ALLWISE, APASS, Tycho-2, UCAC4, LAMOST, RAVE, APOGEE, KIC and EPIC. The TIC will allow users to:
- Look up information, including coordinates and stellar properties, for any target for which the TESS mission produces a light curve.
- Select planet-search stars, taking into account background flux contamination levels.
- Obtain stellar parameters, such as stellar radii, which the processing pipeline will use when calculating planet properties.
- Identify false positives, including background sources.
In addition to the TIC, a TESS Candidate Target List (CTL) is available. The CTL is a subset of the TIC, whose purpose is to provide a set of stars optimized to enable the TESS primary objectives. A majority of the two-minute cadence targets will be selected from the CTL catalog.
In support of the 6th end-to-end test (ETE-6), the most realistic pixel-level simulation of TESS science data to date was created. These simulations include hidden transit data (to allow for community data challenges) and are available from MAST [https://archive.stsci.edu/tess/ete-6.html]. We note that while this is the best simulation made so far, it relies on several assumptions and simplifications that may not reflect actual mission operations, keyword values, and instrument and spacecraft behavior.
MAST Portal to data
After the TESS launch, MAST staff will place our data management system into its operational configuration to ensure the smooth flow of data into the archive and subsequent access for researchers through the MAST Discovery Portal [https://mast.stsci.edu/portal/Mashup/Clients/Mast/Portal.html]. We will provide documentation for users and continue to work on enhancements to the tools available for retrieval and analysis of TESS data, especially the full-frame images, which we anticipate will be a very heavily used scientific resource.
WFIRST Mission Update
Jeffrey Kruk, jeffrey.w.kruk [at] nasa.gov
The Wide-Field InfraRed Survey Telescope (WFIRST) is designed to have a field of view 100 times larger than Hubble's with comparable sensitivity and resolution, providing survey-sized data sets with space-based resolution. WFIRST's surveys will explore the nature of dark energy and discover and characterize exoplanets, and will revolutionize a large number of topics in general astrophysics. The WFIRST Project has been in Formulation since February of 2016, and recently completed a major mission review.
WFIRST Mission milestones
The Wide-Field InfraRed Survey Telescope (WFIRST) Project is nearing completion of Phase A. Phase A is the first half of the Formulation phase of the NASA Project lifecycle, in which mission requirements, conceptual designs, and a preliminary project implementation plan are developed, and new technologies are brought to a level of maturity suitable for flight.
WFIRST formally entered Phase A in February 2016 after a successful Mission Concept Review in December 2015. The WFIRST Project at NASA's Goddard Space Flight Center worked closely throughout Phase A with Mission partners and stakeholders, including NASA's Jet Propulsion Laboratory (JPL), the Institute, the Infrared Processing and Analysis Center (IPAC), and the Formulation Science Working Group.
Phase A culminates in a combined System Requirements Review (SRR) and Mission Definition Review (MDR), which assesses whether the mission requirements are necessary and sufficient to meet the objectives, and whether the mission design implements the requirements within the available cost and schedule resources. This review was successfully completed in February 2018, and the final decision on initiation of Phase B is expected in mid-April. The WFIRST Project is continuing to execute the FY18 plan while Congress deliberates on the FY19 budget.
Primary activities in Phase B include completion of detailed requirements development, preliminary design of the observatory and ground systems, definition of interfaces between all mission elements and subsystems, and development of a detailed project implementation plan. Any remaining technology development must also be concluded. In practice, considerable progress was already made during Phase A on the designs of the optics, the supporting structure and thermal-control systems, and some of the electronics. Those designs will continue to mature in Phase B as efforts on the other aspects of the design ramp up.
Science-driven requirements for WFIRST
The wide field of view (Figure 1), Hubble-quality imaging, and sensitivity spanning 0.5–2.0 microns make WFIRST uniquely suited for a wide range of astrophysics investigations that cannot be addressed with any other facility. The Observatory performance requirements were defined by the needs of several key projects: studies of the expansion history and growth of structure in the universe by means of weak gravitational lensing, Type Ia supernovae, and the spatial-redshift correlation of galaxies, and a study of exoplanet demographics by means of gravitational microlensing.
While these key projects provided useful scientific guidance for requirement definition, the actual WFIRST observing programs will not be specified until close to launch, in order to optimize the programs based on the scientific landscape at the time. The programs will also be designed to address as wide a range of astrophysics as possible. We are exploring means to maximize community participation in each of these large programs. Substantial time will also be set aside for surveys proposed by members of the community to address investigations that require data not provided by the core surveys. All observing time will be awarded competitively, and all data will be public with no proprietary period. It is anticipated that the WFIRST archive will provide an unparalleled resource for investigations as yet unknown, hence funding will also be provided for a robust archival research program.
Evolution of WFIRST's design during formulation
While many design details of the WFIRST observatory have evolved since the Mission Concept Review in December 2015, most of the characteristics of interest to users have not changed. The observatory will be placed in a halo-like orbit about the second Sun-Earth Lagrange point (Figure 2), with a field of regard defined by lines of sight with angles relative to the Sun of 54 degrees to 126 degrees. This means that a little more than half the sky is accessible at any given time, and regions within 36 degrees of the ecliptic poles are accessible continuously. The average daily data volume that can be transmitted to the ground is 11 terabits. As the mission is required to survey large areas of the sky, the observatory design is optimized for rapid slewing and settling. A slew of 0.4 degrees, the size of the narrow dimension of the wide-field channel field of view, will take just over 60 seconds including guide star acquisition at the new field. The prime mission duration is five years after commissioning, with propellant sized for ten years of operations. The observatory is designed to be serviceable, so extension of the mission life beyond ten years would be possible.
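The quoted sky fractions follow directly from the Sun-angle limits. The sketch below is a quick geometric check of those statements, not part of the mission documentation:

```python
import math

SUN_ANGLE_MIN, SUN_ANGLE_MAX = 54.0, 126.0  # field-of-regard limits (deg)

def in_field_of_regard(sun_angle_deg):
    """True if a line of sight at this solar elongation is observable."""
    return SUN_ANGLE_MIN <= sun_angle_deg <= SUN_ANGLE_MAX

# Fraction of the celestial sphere between the two Sun-angle cones:
# (cos 54 deg - cos 126 deg) / 2 ~ 0.59, "a little more than half the sky".
frac = (math.cos(math.radians(SUN_ANGLE_MIN))
        - math.cos(math.radians(SUN_ANGLE_MAX))) / 2.0

def continuously_visible(ecliptic_lat_deg):
    """Targets within 36 deg of an ecliptic pole (|beta| >= 54 deg) never
    leave the field of regard as the observatory orbits the Sun."""
    return abs(ecliptic_lat_deg) >= 90.0 - 36.0
```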
During Phase A, multiple studies were conducted exploring the impact of design choices on the observatory and the science potential of WFIRST data. One of the trade studies conducted during Phase A was optimization of the telescope temperature. The warmest components of the telescope are now designed for a temperature under 270K, as opposed to the 284K temperature at MCR, resulting in a significant decrease in thermal background at long wavelengths. For limiting magnitudes typical of the planned high-latitude galaxy survey, the reduction in exposure time is roughly a factor of two in the reddest filter, and more for fainter sources.
The Wide-Field instrument (WFI) comprises the Wide-Field Channel (WFC) and Integral-Field Channel (IFC). The WFC focal plane consists of 18 4k x 4k near-infrared HgCdTe detectors with a pixel scale of 0.11 arcseconds per pixel; the total active area is 0.281 square degrees—roughly the size of the full Moon. The main change since MCR was the addition of a filter spanning 0.48–0.76 microns, so there are now seven filters with a total bandpass of 0.48–2.0 microns. The extension of the bandpass towards short wavelengths greatly enhances the capabilities of WFIRST for studies of stellar populations, provides an additional means of testing for astrophysical systematic effects in Type Ia supernovae studies, and generally increases flexibility in responding to future opportunities. The grism bandpass is presently 1.0–1.93 microns, which for Hα-emitting galaxies corresponds to redshifts 0.55–1.9, and the dispersion is designed to provide 0.1% redshift precision. The multiplexing enabled by this wide field of view grism is critical for the large surveys envisioned for WFIRST: for a representative wide-area galaxy redshift survey program, there will be roughly 2000 redshifts measured in this range per WFIRST pointing. Further optimization of the filter bandpasses and of the grism dispersion will be assessed in Phase B, but large changes are not anticipated.
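The quoted Hα redshift range follows directly from the grism bandpass. A quick illustrative check (variable names and the adopted rest wavelength are ours; the small difference from the quoted lower bound of 0.55 presumably reflects the exact bandpass definition used by the project):

```python
# Redshift range over which H-alpha falls within the grism bandpass:
# (1 + z) * lambda_rest = lambda_observed
HALPHA = 0.65628              # rest wavelength of H-alpha, in microns
lam_min, lam_max = 1.0, 1.93  # grism bandpass, in microns

z_min = lam_min / HALPHA - 1
z_max = lam_max / HALPHA - 1
print(f"z = {z_min:.2f} to {z_max:.2f}")  # prints z = 0.52 to 1.94
```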
One big change in the WFC design since MCR was cooling the focal plane passively instead of with a cryocooler. This was enabled by a complete overhaul of the optical layout that moved the focal plane much closer to the exterior surface of the observatory. Passive cooling reduces development risk for the instrument and also increases the likelihood of extending the operating lifetime well beyond the five-year design requirement.
The Integral Field Channel employs two image slicers that feed a common spectrometer and focal plane. The image slicers provide fields of view that are 9 square arcsec and 36 square arcsec in size, with spatial sampling of 0.15" and 0.3", respectively. The bandpass is 0.42–2.0 microns, with spectral resolution varying from 75 to 150. The IFC has been optimized for spatially resolved spectroscopy of faint sources in general, and Type Ia supernovae in particular. The spatial sampling provided by the image slicer enables more accurate subtraction of the host galaxy light from that of the supernova and greatly reduces the zodiacal light background in comparison to the slitless spectra provided by the grism. The resulting spectra will be used for typing and sub-typing of supernovae, measuring their redshifts, and measuring their spectral energy distributions for determination of the reddening caused by dust in the host galaxies. The IFC is slated to be contributed by one of the international partners; details of the IFC design will be established when the partnership is finalized.
The second major instrument on WFIRST is a coronagraph, which will be the first coronagraph in space with active wavefront control. It includes both shaped-pupil and hybrid-Lyot architectures, autonomous high-order and low-order wavefront control, and ultra-low-noise detectors. Downstream from the coronagraph are an imaging camera and an integral field spectrograph, both operating at visible wavelengths. The coronagraph is a technology demonstration instrument, so there are no formal scientific performance requirements. However, projected performance is that flux ratios between host star and potential planets of 10⁻⁸–10⁻⁹ will be achievable, which would enable characterization of giant exoplanet atmospheres and of exozodiacal dust and debris disks. The present plan is to make the coronagraph available to the community through a Participating Scientist Program. This approach is common for planetary science instruments, but the details of how best to define this for the WFIRST coronagraph are still under discussion.
Future WFIRST development
The Formulation Science Working Group and the Science Investigation Teams have been an integral part of the requirements development and validation efforts throughout Phase A. As we move into Phase B, the Project and the science teams will be working closely with the Institute and IPAC as increasing emphasis is given to approaches to data processing and analysis, and towards means of optimizing survey definition and observing policies in general. We plan to engage the broader community in these efforts in order to maximize the scientific potential of the WFIRST mission. We anticipate a competition in 2021 that solicits a follow-on Science Working Group that would design the observational program.
Jeffrey Kruk is the WFIRST Project Scientist at NASA's Goddard Space Flight Center.
First Class of NHFP Fellows SelectedAndy Fruchter, fruchter [at] stsci.edu
Twenty-four young astrophysicists have been selected for the first class of the new NASA Hubble Fellowship Program. As described in an earlier newsletter article (The New NASA Hubble Fellowship Program), NASA is combining the Einstein, Hubble and Sagan fellowships into a single new program, the NASA Hubble Fellowship Program (NHFP). The NHFP will, like the fellowships it supersedes, support outstanding postdoctoral scientists to pursue independent research in any area of NASA Astrophysics, using theory, observation, experimentation, or instrumental development.
The new NHFP preserves the legacy of NASA's previous postdoctoral fellowship programs; once selected, fellows are named to one of three sub-categories corresponding to NASA's "big questions":
How does the Universe work? – Einstein Fellows
How did we get here? – Hubble Fellows
Are we alone? – Sagan Fellows
The NHFP received 350 applications from astrophysicists who are completing their PhD this year, or who have completed their PhD in the last three years. In January, six topical panels met to sort through the applications. About one-half of the new fellows were chosen directly by the panels; the remainder were chosen by a committee made up of the panel chairs and the Selection Chair, working from short lists created by the panels. Panel members and chairs were chosen largely from the U.S. astrophysical community, but a number came from Europe or Canada.
Fellows receive a salary and funds for their research and can take the fellowship to a U.S. host institution of their choice. However, no more than two new fellows are permitted to go to any one institution in a given year. The list below provides the names of the 2018 awardees, their host institutions, and their proposed research topic:
- Kate Alexander, Northwestern University, Quantifying the Diversity of Relativistic Transients with Radio Observations
- Benedikt Diemer, Harvard University, Mapping the True Boundary of Dark Matter Halos with the Splashback Radius
- Ke Fang, Stanford University, The Highest-energy Electromagnetic Counterparts to Neutron Star Mergers
- Maximiliano Isi, Massachusetts Institute of Technology, Fundamental Physics in the Era of Gravitational Wave Astronomy
- Ben Margalit, University of California–Berkeley, Interpreting the Diverse Transient Sky
- Aaron Smith, Massachusetts Institute of Technology, Radiation Signatures of the First Galaxies and Supermassive Black Holes
- Vladimir Zhdankin, Princeton University, First-Principles Modeling of Astrophysical Turbulence in Collisionless, Nonthermal Plasmas
- Philip Cowperthwaite, Carnegie Observatories, Driving the Growth of Joint Gravitational Wave and Electromagnetic Astronomy
- Daniel Goldstein, Jet Propulsion Laboratory, Putting a New Generation of Strongly Lensed Supernovae to Work
- Max Gronke, University of California–Santa Barbara, Casting (Lyman-Alpha) Light on Galaxy Formation
- Melodie Kao, Arizona State University, How Do Substellar Objects Generate Magnetic Fields?
- Charlotte Mason, Smithsonian Astrophysical Observatory, Revolutionizing Reionization with JWST
- Aaron Meisner, California Institute of Technology, Revealing the Sun's Coolest, Nearest Neighbors with NEOWISE-Reactivation
- Erica Nelson, Harvard University, The Emergence of Galactic Structure
- Anna Schauer, University of Texas–Austin, Minihaloes: Formation Sites of the First Stars and the Onset of Reionization
- Irene Shivaei, University of Arizona, Unveiling the Obscured Early Universe in the JWST Era
- Tuguldur Sukhbold, The Ohio State University, Core-Collapse Supernovae Across Metallicities and Engines
- Jamie Tayar, University of Hawaii–Institute for Astronomy, Subgiants: Models, Rotation, Convection, and Planets
- Yuan-Sen Ting, Institute for Advanced Study, Chemically Tagging the Milky Way
- Ian Czekala, University of California–Berkeley, A Uniform Measurement of Pre-Main Sequence Stellar Masses and System Architectures Using Protoplanetary Disks
- Johan Mazoyer, Jet Propulsion Laboratory, Can We Detect Exo-Earths with Future Large Space-Based Coronagraphic Instruments?
- Erik Petigura, California Institute of Technology, The Origin of Small Planets
- Kamber Schwarz, University of Arizona, The Evolution of Volatile Molecules from Protoplanetary Disks to Exoplanet Atmospheres
- Daniel Tamayo, Princeton University, A Million-Fold Speedup in the Dynamical Characterization of Exoplanet Systems
The NHFP is funded through NASA's contract with the Institute to support the Hubble Space Telescope. The program is administered by the Institute for NASA in collaboration with the NASA Exoplanet Science Institute (NExScI) at the California Institute of Technology and the Chandra X-ray Center at the Smithsonian Astrophysical Observatory. The program is led by Andy Fruchter (STScI), Paul Green (CXC), and Dawn Gelino (NExScI). Kartik Sheth is the NASA Headquarters official overseeing the program.
Webb Ground Segment Integration and Test: Preparation for LaunchPaul Lee, plee [at] stsci.edu; Christopher Hanley, chanley [at] stsci.edu; Ariel Bowers, abowers [at] stsci.edu; Mark Abernathy, abernathy [at] stsci.edu
The Webb Integration and Test Program is working hard to bring the ground system and the flight hardware, including the Science Instruments, Integrated Science Instrument Module (ISIM), and Optical Telescope Element (OTE), together for the mission. This article summarizes the milestones successfully reached thus far by the OED I&T team, and we highlight the upcoming activities leading up to the launch of Webb.
What is Integration and Test (I&T) and why is it important?
Take the development of a new car. Many different parts and components have to be designed, developed, built, and tested. Teams of engineers are involved in crafting these parts, but the collection of parts and components does not automatically become a car. That collection of parts cannot drive you from point A to point B. The I&T team, in collaboration with System Engineering and Development, has the responsibility to assemble the parts into the final product, and to make the sum of the parts work together as an integrated whole.
A key responsibility of the Operations and Engineering Division (OED) I&T Team is to integrate the Webb Ground Segment. The team works meticulously throughout the I&T process to bring separate systems together from end to end, resolving issues and problems along the way. The team often serves as pioneers as they are the first to operate the latest integrated Ground Segment as a whole. These campaigns verify and validate system interfaces by exercising, for the first time, data flowing among different sites, communication networks, different software systems, and human operator workflows.
Preparing for the launch of the James Webb Space Telescope
As we approach the launch of the James Webb Space Telescope, the OED I&T team has been actively involved on a number of fronts, bringing the instruments and the ground system together for the mission. For the past several years the spacecraft and instrument hardware I&T has been the main focus. The OED I&T team supported the integration and testing of the science instruments and the ISIM at the Goddard Space Flight Center (GSFC), as well as the OTIS (Optical Telescope Element + Integrated Science Instrument Module) thermal vacuum testing at the Johnson Space Center (JSC).
At the Institute, the I&T team has been busy testing the various software subsystems of the Science and Operations Center (S&OC) and integrating these subsystems into the larger Webb ground system. In the fall of 2017, engineers delivered the second release of the ground system software for the Webb S&OC and successfully interfaced it with external facilities such as the Deep Space Network (DSN). At present, the I&T team continues to test and integrate newer software releases into the S&OC, and we are also integrating and testing the backup Mission Operations Center (bMOC) at GSFC to ensure operational continuity should the primary MOC at the Institute become unavailable.
The Integration and Testing of the Webb Ground Segment and the S&OC
Multiple teams with diverse expertise and experience contribute to the integration and testing of the Webb S&OC. Composed of members from Science Operations, Flight Operations, Systems Engineering, the Information Technology Services Division, Development, and the I&T Branch, this "Team of Teams" works together with professionalism, integrity, and mutual respect to complete many integration and test campaigns. The execution of these campaigns required careful planning and coordination among the Institute teams, external partners, and stakeholders at different geographic locations across different time zones. Together, the team successfully verified and validated the systems. Furthermore, the many Government-witnessed campaigns demonstrated the functionality and operability of the integrated system to the key stakeholder, successfully reaching the appropriate level of system maturity.
The milestones we have reached thus far…
We have completed the following campaigns:
- Science instrument cryovac testing at GSFC
- OTIS thermal vacuum testing at JSC
  - Provided on- and off-console support for the OTIS test activities and the science instrument cryovac testing
- S&OC subsystems I&T
  - Completed requirement and discrepancy testing for 16 builds of the Proposal Planning Subsystem (PPS), 8 builds of the Data Management Subsystem (DMS), 7 builds of the Wavefront Sensing Subsystem (WSS), and 3 builds of the Flight Operations Subsystem (FOS)
- S&OC Releases 1 & 2
  - Completed verification and validation acceptance testing of the integrated S&OC, witnessed by the Government
- Ground Segment (GSEG) #1
  - Completed ground-segment verification testing of the S&OC and spacecraft interfaces
  - Conducted initial data-flow tests through various external interfaces, such as the Space Network (SN), Deep Space Network (DSN), Flight Dynamics Facility (FDF), and the Observatory simulator at Northrop Grumman Aerospace Systems (NGAS)
  - Completed over 50 end-to-end data-flow verification and validation tests
What are the upcoming milestones (as of early March 2018)?
- S&OC Release 2.0.1 with various new subsystem builds – April 2018
  - Complete additional verification and validation acceptance testing of the integrated S&OC, witnessed by the Government
  - This version of the S&OC enhances the end-to-end data flow of the system
- S&OC Build 4.2 with various new subsystem builds – Winter 2018
  - Complete additional verification and validation testing of the integrated S&OC
  - This version of the S&OC will be used in testing with the Observatory during GSEG #3 and #4
- S&OC to Spacecraft test (GSEG #2) – August 2018
  - Completes ground-segment verification testing of flight-to-ground requirements
- Integrated Observatory with the S&OC (GSEG #3) – December 2018
  - GSEG #3 is the formal end-to-end validation test of the integrated Observatory with the Ground Segment
- Final Observatory to S&OC test (GSEG #4) – February 2019
  - GSEG #4 is the final integrated Observatory test with the "Flight Build" of the S&OC
- End-to-end tests between the S&OC, SN, DSN, FDF, and NGAS – through launch
  - As new versions of the S&OC and of the systems used by our external partners are released, we continue testing to ensure the ground segment remains operational
- Launch site to S&OC interface test – Launch minus 3 months
  - A test of the ability of the S&OC to communicate with the Webb launch site
STIS Echelle Blaze Function Correction ToolPaule Sonnentrucker; Malinda Baer; Charles Proffitt; and Sean Lockwood
Changes in the STIS echelle optics alignment over time induce wavelength shifts in the blaze function. These can generate flux calibration inconsistencies of up to 10% in the overlapping regions of each pair of spectral orders. We present a new Python-based STIS tool called 'stisblazefix' that is designed to allow users to correct for the blaze function shifts in their echelle spectra.
The 'stisblazefix' tool
The Space Telescope Imaging Spectrograph (STIS) onboard the Hubble Space Telescope is a versatile imaging spectrograph operating with two types of detectors: charge-coupled device (CCD) and Multi-Anode Microchannel Array (MAMA). These detectors offer both imaging and spectroscopic capabilities. In particular, the STIS/MAMA modes are the only ones offering high spectral resolution echelle spectroscopy (R ~ 114,000) in the ultraviolet (UV; 1150 Å to 3100 Å). Proper calibration of the echelle modes requires that the blaze function be well aligned for each echelle order. Failure to properly track and correct for blaze function shifts can lead to flux inconsistencies of 10% or more in the overlapping regions of spectral orders, and tends to introduce unphysical ripples in the spectrum when orders are combined, thus limiting the scientific and archival value of echelle data. To correct for this systematic behavior, CALSTIS uses WAVECAL lamp exposures to track the shifts of the STIS blaze function on the detector over time. After the STIS repair during Servicing Mission 4 (SM4) in 2009, the time-dependent coefficients of the blaze-function corrections were set to zero, as the time dependence of the baseline for the blaze shifts had yet to be evaluated. As time elapsed since SM4, an increasing misalignment of the blaze function developed in the STIS echelle data.
Calibration of the time dependence of the blaze shifts for post-SM4 STIS echelle observations is in progress, and updated coefficients that include the time-dependent part of the blaze function shift have been delivered to the pipeline for the E140H modes (Monroe 2017; STAN August 2017). However, for the E230M, E230H, and E140M settings, the time-dependent change of the blaze shift is not yet included in the calibration of post-SM4 data. E140M updates are pending.
Because potentially significant residual calibration uncertainties remain for data taken with those modes, a tool was developed to allow users to empirically find the shifts that best align the blaze functions for any given echelle observation (see Baer et al. 2018; STIS ISR 2018-01). This Python-based tool, called 'stisblazefix', is designed to find the set of shifts to the sensitivity curves of the individual orders that makes the calibrated flux in the wavelength overlap between adjacent spectral orders most consistent.
Rather than attempting to recreate the detailed CALSTIS procedure for defining the sensitivity curve of each order, 'stisblazefix' instead recovers the final applied sensitivity curve from the tabulated "net" and "flux" vectors in the extracted spectra in the x1d files produced by CALSTIS. The calibrated flux vector is produced by dividing the net flux vector by the sensitivity curve; therefore, the original sensitivity curve is simply equal to the net flux divided by the calibrated flux. Locations where the tabulated flux is zero and the sensitivity cannot be directly recovered are filled in by interpolation by the tool.
Once the sensitivity curve has been recovered, it is shifted by a specified number of pixels for each spectral order, and the flux values, errors, and flux overlap residuals are recalculated for that set of shifts. Given a set of initial guesses, the tool iteratively finds the set of blaze function shifts that minimizes the error-weighted flux residuals in the order-overlap regions. Once the best shift values are determined, new FITS files with corrected flux and error vectors are generated, with the "x1d" suffix of the original files replaced by "x1f". The package also produces diagnostic plots that compare the spectra before and after correction, the flux residuals before and after correction, and the final shifts applied (in pixels) as a function of echelle order. Figure 1 displays one example application of the 'stisblazefix' package to STIS E230H data.
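The sensitivity-curve recovery and shifting described above can be sketched in a few lines of NumPy. This is a simplified illustration of the logic, not the actual 'stisblazefix' implementation; the function names, the fractional-pixel interpolation, and the edge handling are our own assumptions.

```python
import numpy as np

def recover_sensitivity(net, flux):
    """Recover the sensitivity curve applied by CALSTIS from the tabulated
    'net' and 'flux' vectors of one spectral order: since flux = net / sens,
    it follows that sens = net / flux. Pixels where the tabulated flux is
    zero are filled in by linear interpolation."""
    net = np.asarray(net, dtype=float)
    flux = np.asarray(flux, dtype=float)
    ok = flux != 0
    sens = np.empty_like(flux)
    sens[ok] = net[ok] / flux[ok]
    idx = np.arange(flux.size)
    sens[~ok] = np.interp(idx[~ok], idx[ok], sens[ok])
    return sens

def shift_sensitivity(sens, shift_pix):
    """Shift the sensitivity curve along the dispersion axis by shift_pix
    pixels (may be fractional), holding the edge values constant."""
    idx = np.arange(sens.size)
    return np.interp(idx - shift_pix, idx, sens)

def corrected_flux(net, flux, shift_pix):
    """Recalibrate one order's flux with a shifted sensitivity curve."""
    sens = recover_sensitivity(net, flux)
    return np.asarray(net, dtype=float) / shift_sensitivity(sens, shift_pix)
```

In the real tool, the per-order shift passed to a routine like `corrected_flux` would then be varied to minimize the error-weighted flux residuals in the order-overlap regions.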
Caveats: While the 'stisblazefix' package is a useful tool for correcting a wide variety of echelle data, it is still subject to some limitations. For instance, it might not be possible to obtain reliable results for observations dominated by scattered light or detector background. Sources dominated by strong emission lines might be overly sensitive to the details of the emission contribution to the order overlap regions. The assumption that the blaze shift can be applied to the entire sensitivity curve for each spectral order breaks down in some cases. Users should decide whether this tool is necessary for their data and examine the results and diagnostic plots carefully to ascertain whether it offers a useful improvement over the standard CALSTIS pipeline processing.
Additional information regarding the derivation of the blaze function corrections and installation of the 'stisblazefix' package can be found at https://stisblazefix.readthedocs.io.
LINEAR: A Novel Algorithm for Reconstructing Slitless Spectroscopy from Hubble/WFC3Russell Ryan, rryan [at] stsci.edu; Stefano Casertano, stefano [at] stsci.edu; and Nor Pirzkal, npirzkal [at] stsci.edu
Slitless spectroscopy often employs a grism, the combination of a prism and a transmission grating, to provide a complete spectroscopic view of any scene. The properties of the grism are often tuned so that the incident light projects "in-line," allowing the element to be used with the same imaging detector as the imaging modes, without additional optical components. Arguably, the biggest advantage of slitless spectroscopy is the very high multiplexing (potentially thousands of sources in a narrow Hubble field of view), which mitigates selection biases.
Classic challenges and solutions
The engineering and scientific advantages of slitless spectroscopy are clear, however they come with a host of computational and analysis difficulties. Perhaps the most significant challenge arises when the spectral traces of neighboring sources overlap on the detector, which leads to ambiguity in measuring the spectrum for the overlapping sources. This issue motivated the concept of contamination to describe the flux associated with the overlapping sources in the field. The contamination is often estimated by ancillary broadband colors or by reorienting the spacecraft, thereby changing the dispersion direction with respect to the astrophysical sources (see Figure 1).
The spacecraft reorientation introduces an additional complication: the effective spectral resolution is set by the native line-spread function of the optics, modulated by the spatial extent of the source projected along the dispersion direction. In analogy with classic long-slit spectroscopy, compact sources will have higher spectral resolution, and hence the one-dimensional spectra will depend on the orientation of the spacecraft. Therefore, one can obtain a high spectral-resolution/low signal-to-noise spectrum from the individual orients, or a low spectral-resolution/high signal-to-noise spectrum from their combination.
A new approach: LINEAR
We have developed a new algorithm for reconstructing the optimal, one-dimensional spectrum of each source when multiple orients are available (Ryan, Casertano, & Pirzkal 2018). The algorithm describes the flux in a grism-image pixel as a weighted sum over all sources and wavelengths. The weights constitute a matrix (W) that transforms the incident spectra into the brightness of a collection of pixels, and includes instrumental effects (spectral trace, dispersion, flat field, sensitivity, and pixel area) and scene-dependent effects (source brightness and morphology). This matrix has dimensionality of the number of knowns by the number of unknowns, which are the number of distinct pixels and the number of spectral points, respectively. For most situations this results in an overdetermined system, so the solution is found by least-squares minimization. Additionally, the matrix will be very sparse for a background-dominated scene, so the solution can be obtained with far greater computational efficiency by using techniques designed to exploit that sparsity. Paige & Saunders (1982) developed an iterative algorithm (LSQR) to minimize the regularized least-squares problem, and the regularization parameter can be estimated under various heuristic assumptions (such as roughly equal contributions from the data-dominated and smoothness-dominated terms; Hansen 1992).
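The kind of sparse, damped least-squares solve described above can be illustrated with SciPy's LSQR implementation. The toy system below is ours for illustration only: it mimics the shape and sparsity of a W-matrix (many more pixels than spectral unknowns) but is not the LINEAR code, which is written in IDL.

```python
import numpy as np
from scipy.sparse import random as sparse_random, eye
from scipy.sparse.linalg import lsqr

rng = np.random.default_rng(42)

# Toy analogue of the W-matrix: far more grism pixels (rows) than
# spectral unknowns (columns), and very sparse.
n_pix, n_spec = 2000, 100
W = sparse_random(n_pix, n_spec, density=0.01, random_state=42, format='csr')
W = W + eye(n_pix, n_spec, format='csr') * 0.5  # ensure every unknown is sampled

x_true = rng.uniform(1.0, 2.0, n_spec)         # the "incident spectra"
b = W @ x_true + rng.normal(0.0, 1e-3, n_pix)  # pixel brightnesses plus noise

# LSQR iteratively minimizes ||W x - b||^2 + damp^2 ||x||^2,
# exploiting the sparsity of W; damp plays the role of the
# regularization parameter discussed in the text.
x_hat = lsqr(W, b, damp=1e-4)[0]
```

Because LSQR only ever touches the non-zero elements of W, the cost scales with the ~10⁸ stored entries rather than the full 10⁶ × 10⁵ matrix size quoted for the HUDF case.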
Use case: The Hubble Ultra-Deep Field
To highlight the power of this methodology, we extract the grism spectroscopy taken by WFC3/IR with G141 (programs 12099 and 12177) in the Hubble Ultra-Deep Field (HUDF; Beckwith et al. 2006). The data have a total exposure time of 43.3 ks in 36 images that are spread over three orients. There are ~1000 sources brighter than F140W < 26 mag, which results in a W-matrix that is ≈10⁶ × 10⁵ with ≈10⁸ non-zero elements (Ryan, Casertano, & Pirzkal 2018). On a 3.5 GHz, six-core Apple workstation with 32 GB of memory, these calculations required ~18 hours.
In Figure 2, we show five sources extracted with LINEAR (red) and the individual-orient combination (blue) after subtracting the contamination estimated from the broadband photometry. LINEAR provides two key improvements over the orient-averaged approach: signal-to-noise and spectral resolution. In Figure 3, we show a zoom of panels (b) and (d), where the He Ⅰ emission becomes readily apparent and the Hβ/[O Ⅲ] emission lines are easily separated.
The HUDF data represent a single use case where LINEAR improves upon classic extraction techniques, but we foresee benefits in any situation where contamination is expected to be large or the sources are resolved, such as crowded fields (galaxy or star clusters) or spatially resolved spectroscopy. This has important implications for Webb, which will provide the astronomical community with three instruments capable of grism spectroscopy. These instruments have two grism elements, which disperse along pixel rows or columns, so users can expect to have two orients for each field without moving the telescope. Although more work is needed to accommodate two distinct optical elements, this methodology will be important for working with multi-orient data. In the longer term, surveys with strict constraints on spectroscopic completeness (such as WFIRST or Euclid) may require advanced algorithms for addressing source contamination.
The LINEAR codebase is predominately written in IDL, with several components in C, and is being prepared for distribution on GitHub. Until such time, please email the authors for a preliminary release.
- Beckwith, S. V. W., et al. 2006, AJ, 132, 1729
- Hansen, P. C. 1992, SIAM Review, 34, 4, 561
- Paige, C. C. & Saunders, M. A. 1982, ACM Trans. Math. Softw. 8, 2
- Ryan, R. E., Casertano, S., & Pirzkal, N. 2018, PASP, 130, 034501
- Spergel, D., et al. 2015, arXiv: 1503.03757
Data Science in Astronomy, Three WaysJoshua Peek, jegpeek [at] stsci.edu
What is Data Science? Much like "Big Data," Data Science is a term of art from industry used so broadly that it is unclear what it means. This article does not attempt to define Data Science for everyone but, instead, dives into what Data Science means at the Institute, and what it could mean for astronomical science centers around the world. It's worth noting that the Data Science Mission Office here at the Institute isn't just about Data Science: we worry about all aspects of data management here. We are going to talk about three separate branches of Data Science at the Institute: (1) Data-Driven Astronomy; (2) Statistics, Machine Learning, and Algorithms; and (3) Predictive Analytics and Data Mining.
Data-Driven Astronomy (DDA) is the production of astronomical knowledge from archival data sets. DDA is similar to industrial data science in that the data sets are not taken with the experiment in mind, but rather are a byproduct of other processes or investigations. DDA is often practiced on large homogeneous data sets like the Sloan Digital Sky Survey or the Cosmic Assembly Near-infrared Deep Extragalactic Legacy Survey (CANDELS), but can also be applied to more heterogeneous data sets like the Hubble Legacy Archive. These scientific explorations often involve similar tool kits: astronomical data awareness, deep understanding of selection biases, and sophisticated multipoint statistics. Since DDA approaches often leverage very large data sets, DDA experts study subtle systematic effects that only become dominant over Poisson noise with larger amounts of data, and devise data analysis strategies to suppress these effects. This runs contrary to the view that Data Scientists ignore the details of the data they work with; successful DDA requires staying quite close to the data, and often involves working closely with instrument and survey specialists. DDA is not the classical mode for astronomy at the Institute, which was founded to operate one of the most productive "Guest Observer" telescopes in history. Enhancing DDA at the Institute will allow us to make even better use of MAST, of large data sets around the world, and, in the future, of WFIRST.
Statistics, Machine Learning, and Algorithms
Interaction with data requires algorithms. Defined broadly enough, nearly all technical work at the Institute could be defined as algorithmic interactions with data, which is certainly too broad a definition for Data Science. The Statistics, Machine Learning, and Algorithms (SMLA) branch of Data Science at the Institute is related to the development of advanced methodologies for interacting with astronomical data and metadata. Statistical approaches to data sets large and small have always been critical to measuring the parameters of our universe, and the development of detailed and advanced astrostatistical techniques has become a core part of astronomical investigations and the astronomical literature. Machine Learning, which is less concerned about model parameter estimation and more concerned about prediction, is also becoming a core way astronomers interact with data. These tools, though, are not limited to measuring values of scientific interest. Advanced statistical and (especially) machine learning methodologies are becoming more and more useful for data reduction, processing, flagging, visualization, and manipulation tasks that we would typically associate with functional work at the Institute. Thus the SMLA branch of Data Science bridges functional and scientific work at the Institute.
Predictive Analytics and Data Mining
Like many modern companies, the Institute holds data, runs compute infrastructure, and has users that interact with these data and systems. Thus, it behooves us to take a page from the corporate world and examine our metadata carefully to optimize the performance of our products, services, and systems. The Predictive Analytics and Data Mining (PADM) branch of Data Science at the Institute is engaged in the collection and investigation of our non-astronomy data holdings to further our mission. While PADM may use various methodologies in common with SMLA (and to a lesser extent DDA), there is a much stronger focus on data flows, user interaction patterns, and text. In particular, the Institute generates, holds, and works with vast quantities of text data, and PADM strengthens our ability to search, characterize, and serve this textual data.
Taken together, these three approaches to data science at the Institute will help us fulfill our mission, both by producing cutting-edge science and by serving the astronomical community.
SDW2017: Scientific Detectors WorkshopJohn MacKenty, mackenty [at] stsci.edu; Elena Sabbi, sabbi [at] stsci.edu
The Institute hosted the sixth installment of a successful series of Scientific Detector Workshops on September 24–29 in Baltimore. This international workshop brought together a diverse group of scientists and engineers who develop, produce, implement, and operate the most advanced imaging sensors used in scientific instrumentation. SDW2017 provided a look into the future of detectors for a broad range of space- and ground-based astronomical applications, with contributions on supporting technologies and a few mind-stretching special talks.
By design, and aided by generous corporate sponsorships, this series of workshops has combined talks from leading experts in the technological and scientific fields, together with considerable time for discussions during the oral/poster sessions, interactive roundtables, and group social and cultural activities.
Noteworthy discussion topics included the next generation of instruments for most of the major ground-based telescopes, including the upcoming generation of extremely large telescopes. The technologies behind both the scientific sensors and, most importantly, the elements of these telescopes' adaptive optical systems were discussed from multiple viewpoints.
Many of the leading detector manufacturing companies talked about their near- and longer-term product plans, while research teams from the U.S., Europe, and Asia presented exciting new developments in detectors. A strong interest in the development of complementary metal-oxide-semiconductor (CMOS) detectors and high-speed infrared detectors was apparent, generating considerable discussion among those involved in designing future ground and space systems.
Highly informative talks outside of the daily interests of most participants included Gary Sims on the detectors used at the National Ignition Facility at LLNL (where the radiation-damage problems are far more challenging than those faced by space astronomers), Roger Smith on JT coolers, and Klaus Reif on shutters for astronomical cameras. View Dr. Reif's talk on the Institute's webcast for a fascinating look into a topic often taken for granted. Dr. Adam Riess provided a scientific talk on his precision measurement of the Hubble Constant and its potential implications for the discovery of new physics—a piquant reminder to the technologists laboring on the intricate details of their detectors of the far-reaching consequences of their work.
On the final morning, technology experts involved in the studies for the next generation of NASA's flagship missions (HabEx, Lynx, LUVOIR, and OST) talked about their future detector needs and engaged in a round table discussion with the participants.
This meeting attracted 135 registered participants, plus a good number of local individuals who seized the opportunity to learn about, and share experience with, the technologies and behaviors of the detectors upon which so many of the present activities and future aspirations of astronomical science depend.
The meeting was the brainchild of James Beletic (Teledyne Imaging Sensors) who chaired the organizing committee with the support of Paola Amico (European Southern Observatory), John MacKenty (STScI), Martin Roth (Astrophysical Institute of Potsdam/innoFSPE), and Elena Sabbi (STScI).
View the detailed schedule of the meeting.
View the webcast.
Science Platforms/Server-Side AnalyticsArfon Smith, arfon [at] stsci.edu
In late February, the Institute hosted a three-day workshop on the topic of "Science Platforms," aka "Server-Side Analytics."
The term "Science Platform" is growing in popularity and is generally used to refer to an integrated system of web applications, programmatic services (Application Programming Interfaces, or APIs), and an interactive analysis environment that are connected to a collection of astronomical data services. A second, related term, "Server-Side Analytics" is also in use—this term is generally used to describe the idea of providing a mechanism for astronomers to upload (and execute) their analyses close to large datasets. Server-Side Analytics was highlighted by NASA's Big Data Task Force as a major priority for big data missions such as WFIRST. At this meeting, we discussed both Science Platforms and Server-Side Analytics, the relationship between them, and the work that different centers are doing to support next-generation services for their communities to leverage big data.
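The core idea of Server-Side Analytics is that the user's computation travels to the data, and only a small result travels back. All names in the sketch below are hypothetical; a real platform would expose this pattern through authenticated web APIs and a sandboxed execution environment rather than a direct function call.

```python
# Hypothetical "archive-side" holdings: a million rows that we would rather
# not ship to every user's laptop.
LARGE_DATASET = [{"id": i, "flux": (i * 37) % 100} for i in range(1_000_000)]

def run_server_side(analysis):
    """Execute the user's uploaded analysis next to the data and return
    only its (small) result, instead of transferring the full data set."""
    return analysis(LARGE_DATASET)

# The "uploaded" analysis: count bright sources above a threshold.
bright_count = run_server_side(lambda rows: sum(1 for r in rows if r["flux"] > 90))
```

The user downloads a single integer rather than a million records; for missions like WFIRST, where the data volumes make bulk download impractical, this inversion of data flow is the whole point.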
Many science centers and projects, including the Institute, the Large Synoptic Survey Telescope, the National Optical Astronomy Observatory, and the Johns Hopkins University, are currently developing science-platform-like functionality. Building upon the community's strong interest in these topics at ADASS 2017 in Chile late last year, we gathered 50 members of the community for a hands-on collaboration meeting focused on sharing approaches to science platforms, documenting common data-analysis patterns and workflows, identifying challenges different implementers are facing, and prototyping solutions to address those challenges.
The workshop started with each of the major implementers, i.e. a team building something close to a 'science platform', giving an overview of their project and the audience(s) they were building for. Time was also made available for workshop participants to try out each tool being presented. Speakers in this session included:
- SciServer (Gerard Lemson)
- NOAO Datalab (Knut Olsen)
- National Data Service (Kenton Guadron McHenry)
- LSST (Frossie Economou)
- STScI (Christian Mesh)
- WholeTale (Kacper Kowalik)
- DES (Matias Carrasco-Kind)
- ESAC (Vicente Navarro)
The remainder of the meeting was dedicated to short lightning talks by engineers and astronomers on ideas/challenges/questions they wanted to explore with others during afternoon breakout sessions. Notes from the workshop were captured in this GitHub repository and a follow-on meeting is planned for later this year.
Science with Precision AstrometryLaura Watkins, lwatkins [at] stsci.edu; Andrea Bellini, bellini [at] stsci.edu
Astrometry is one of the oldest branches of astronomy, and here at the Space Telescope Science Institute we have both a rich history and a promising future in astrometric studies. From the very early days when the Institute was founded, it was realized that the ability to accurately point and guide the Hubble Space Telescope would require a precise and reliable astrometric catalog, which led to the creation of the Guide Star Catalog (GSC), and later the Hubble Source Catalog (HSC). Hubble also has state-of-the-art astrometric detectors, and over the last 15 years it has revolutionized the field of high-precision astrometry with the Advanced Camera for Surveys, the Wide Field Camera 3, and the interferometric Fine Guidance Sensors.
We are now looking forward to the launch of the James Webb Space Telescope in 2020, which will offer the same high-precision astrometric capabilities as Hubble. Already, several Webb guaranteed-time programs are set to pursue a variety of astrometric studies. Further in the future, the Wide-Field Infrared Survey Telescope (WFIRST) will launch in the mid-2020s. WFIRST has a much wider field of view than either Hubble or Webb, but will have comparable astrometric capabilities, thus opening up new areas of scientific investigation. Combined, data from Hubble, Webb, and WFIRST will provide an extremely long time baseline for astrometry.
However, it is not just Hubble, Webb, and WFIRST that can and will enable precise astrometric studies. We are entering an exciting new era of astrometry, with many new missions, either currently taking observations or planned and under construction, that will bring further advances in this field. Specifically, the second Gaia data release will be made available in April; it will be crucial for both science and calibration purposes, and in particular will revolutionize Milky Way science. Future ground-based facilities—such as the Large Synoptic Survey Telescope (LSST) and the Extremely Large Telescope (ELT)—will enable astrometric studies from the ground with precisions previously achievable only from space.
With the second Gaia data release anticipated in late April 2018, and a host of astrometric facilities coming online over the next decade, it seemed a particularly auspicious time to host a meeting on astrometry, and the Institute was an excellent venue for it. So on 13–15 March, the Institute hosted a mini-workshop on "Science with Precision Astrometry," which featured 69 registered attendees, 12 invited talks, a further 26 contributed talks, and 10 posters.
The workshop began with an overview of current astrometric facilities and techniques. This session featured a talk on Gaia and its short- and long-term prospects for the scientific community; multiple talks on how Gaia data can be used to improve calibration for other observatories, in this case for Hubble and the HSC, and for the Dark Energy Camera; and an overview of the spatial-scanning techniques that have been used to such good effect on Hubble.
There followed five sessions that spanned an extremely broad range of science topics that rely upon or greatly benefit from precise astrometric measurements. These sessions included talks on the solar system, extra-solar planets, binary stars, astrometric microlensing by stellar-mass and intermediate-mass black holes, the internal kinematics of globular clusters and open clusters, the Galactic center, bulge and halo, the dynamical properties of the Local Group and its satellites, AGN jets, supernova remnants, and the cosmological distance scale.
The workshop finished with a session that looked towards the future, focusing on a handful of upcoming facilities on the ground and in space, including Theia (a possible successor to Gaia), LSST, WFIRST, and the Extremely Large Telescope (ELT).
The posters spanned a similarly rich selection of topics as the talks, discussing current state-of-the-art techniques, future facilities, and a variety of science topics.
We are moving into an era of multi-wavelength astronomy, and that was very much reflected in our workshop. The contributions covered a very wide range of wavelengths and included existing or anticipated data from many different observatories. Further, many contributions focused on the existing or anticipated synergies between different missions, and the extra value that can come from correlating multi-mission data and using one dataset to calibrate another. Astrometry is the best tool to do this. Already Gaia data are being used to calibrate a host of other missions; this calibration will be further improved with the second data release, and will be extremely advantageous for both astrometric and non-astrometric studies. Current astrometric facilities are pushing new frontiers and the future is extremely bright.
We would like to thank the event coordinators Roz Baxter and Sherita Hanna, and the AV team of Calvin Tullos and Thomas Marufu for their assistance behind the scenes that made everything run so smoothly, and the other members of the SOC—Jay Anderson, Anthony Brown, Michael Fall, Andrea Ghez, Steve Lubow, Vera Kozhurina-Platais, William O'Mullane, Elena Sabbi, Johannes Sahlmann, Kailash Sahu, Linda Smith, David Soderblom, Tony Sohn, Roeland van der Marel, and Rosemary Wyse—for their help in putting together such an exciting meeting.
Recordings of all talks are available here: STScI webcast
Hybrid Astronomy Careers at STScIMax Mutchler, mutchler [at] stsci.edu
Inclusive work culture
The Space Telescope Science Institute was created as a unique institution to operate a very special space mission: the Hubble Space Telescope. Having a first-rate scientific staff directly engaged in operations, and with a vested interest in the scientific productivity of the mission, was a core organizational principle from the outset. It soon became clear that the same general idea had to be applied to the many other professions that would also be needed (and in some cases created and defined) for mission success. Attracting and retaining a competent technical staff has contributed significantly to staff stability at the Institute and the success of Hubble. As the Institute meets the challenge of transitioning to a more nimble multi-mission institute, our future success will require an increasingly versatile and diverse staff. We have focused intently on creating an inclusive work culture that acknowledges and supports our entire staff—developing diversity statements, a code of conduct, and a strategic plan for diversity and inclusion. Who and how we hire is also critical to our ability to build the workplace culture needed for our future. We must challenge historical biases that prevent us from securing all the talented people we need.
Hybrid career paths
The Institute has become a leader in defining what have been called "alternative" or "non-traditional" career paths in astronomy, many of which do not require a PhD. In some cases, these terms are unhelpful and have undermined the goal of promoting a broader range of career paths. These paths may be better described as "hybrid" astronomy careers which have evolved in response to the needs of conducting modern space astronomy missions. We realized we needed to create not just jobs, but well-defined career paths that reflect the expertise and high degree of dedication of our technical staff. As a result, we have a large and diverse demographic enjoying long-term careers supporting the biggest astronomy projects of our generation—hardly "alternative," but still lacking visibility.
Since 2011, I have interviewed hundreds of undergraduates, graduate students, and postdoctoral researchers, mostly with astronomy/physics backgrounds, and hired roughly 70 technical staff. These interviews represent a rather broad survey of career preparation and planning in astronomy. It has been consistently clear that almost all astronomy students at all levels are still largely advised to prepare exclusively for the canonical path towards a PhD, independent research, and tenure. Despite much talk of broadening the idea of a successful career in astronomy, there still appears to be very little awareness, support, or intentional preparation for other paths. In addition to creating a more multi-dimensional astronomical workforce, offering multiple pathways forward provides the best avenue to engage diverse students, particularly those who have been historically excluded, and allows them to find a career in astronomy that provides the proper balance for their personal and professional needs.
We need your help
The low visibility of numerous stable careers at a high-visibility institute like ours is quite striking. It is an unfortunate barrier that we must still work to overcome in our recruitment efforts. Although we widely advertise our job postings online, we have found that word-of-mouth is still most effective. So if you teach or advise or know a student who is trying to visualize a career in astronomy, especially any students who have historically been excluded from our field, please make them aware of the hybrid career paths available at the Institute. If you are visiting us, bring them along to get a close look at the work we do, and to expand their views on how they might pursue a career. Consider curriculum adjustments that better match the skills described in our job postings. We need your help to find our future workforce. But even more importantly, we owe this greater awareness and pragmatism to the many bright students who are attracted to our field. This broader perspective would make our field more welcoming to people who have been historically excluded, by providing multiple opportunities to find a workable path forward.
Astronomy Visualizations: The Cinematic Science PipelineFrank Summers, summers [at] stsci.edu; Greg Bacon, bacon [at] stsci.edu; and Zolt Levay, levay [at] stsci.edu
In the Office of Public Outreach, we have a long tradition of creating exemplary graphics and videos in support of a wide variety of programs including press releases, web sites, and educational activities. In 2003, we made an ambitious foray into large-format film with the IMAX short Hubble: Galaxies Across Space and Time. This project indirectly led to our prominent role in helping to create 13 minutes of visualizations for the 2010 IMAX release Hubble 3D. Since then, our team has been regularly producing astronomy visualizations from Hubble images, science data, and computer simulations.
One of our visualization goals has been that each new project not only creates an enjoyable and scientifically sound presentation, but also enables us to learn, test, and adapt new techniques from the film industry into our production processes. This increasing sophistication has also been accompanied by significant change in the scale and diversity of our videos, including 4K‑pixel resolution, 60‑frames-per-second time resolution, and planetarium dome and virtual reality formats. As our team, skill set, and experience have grown, we have developed a robust, yet flexible pipeline for producing what we like to call "cinematic science."
In describing the stages of this pipeline, it helps to use a specific project as an example. Below, we will discuss various aspects of the Flight Through the Orion Nebula in Visible and Infrared Light visualization that was released in January 2018. For reference, that video is available at: http://hubblesite.org/video/1003/science. This visualization was created in collaboration with the Spitzer Mission under the auspices of NASA's Universe of Learning project.
The foundation of all our work is the science data. For press releases, the source is often a Hubble image, but we utilize appropriate resources for the desired presentation. The few astronomical catalogs with three-dimensional interpretations, like stellar parallax or galaxy redshift, enable visualizations with a minimum of extrapolation. Simulations also provide 3D data and include dynamic change over time that naturally tells a story. Procuring an appropriate suite of data sets is always the initial task.
For the Orion Nebula visualization, our base images are from Hubble and Spitzer observations. We also use some wider views from Robert Gendler and NASA's Wide-field Infrared Survey Explorer (WISE) in order to fill out the fields and create a softened edge that lets the nebula "float" in a 3D space. The stars within the nebula come from the Hillenbrand catalog and an infrared catalog provided by Mario Gennaro. We also employ lists of proplyds, bow shocks, disks, and jets. Background stars come from the Hipparcos catalog, adjusted for the location of the nebula.
For the three-dimensional interpretations, we use scientific knowledge as available, scientific intuition as possible, and scientifically guided artistic license as needed. The availability of science papers that can be used to provide or guide volumetric modeling is one of the most important factors in choosing a visualization target. These usually simplistic models can be enhanced by applying physical principles like thermal pressure, radiative transfer, and dynamical flow to envision reasonable structures. Still, there are often aspects of a visualization where no observations exist and a plausible extrapolation is required.
Our starting point for Orion was a set of 3D interpretations of the central region around the Trapezium stars, provided in papers by Bob O’Dell and collaborators. We worked with local expert Massimo Robberto to extend that model across the Hubble image, and with Robert Hurt and other experts at Caltech/IPAC to extend the model to the infrared layer. The nebula was sculpted as a surface in a computer graphics program and then re-interpreted as a point cloud to create a gaseous feel. In that last step, we found the commercial, surface-based point-generation algorithms to be inadequate for representing a nebula, and developed our own volume-based method.
The Hipparcos stars have well-defined positions, while those of stars within the nebula are estimated relative to the foreground gas layer, called Orion's "veil." Hence, the depth aspect of the stellar positions is adjusted to fit the sculpted model and also augmented for the incompleteness of the observations and catalogs. Finally, the bow shocks, proplyds, disks, and jets are sculpted to approximate the shapes seen in the images, and placed in appropriate positions relative to the stars that create or shape each structure.
With the fast and ongoing development of computer graphics software for commercial films, there is a full suite of potential tools to use in our scientific work. We use many of the same packages as Hollywood films, although we often use a limited subset of features. For example, many of our astronomy objects are self-luminous, making the sophisticated set of lighting design tools superfluous for some of our projects.
However, our visualizations have some aspects that are not in demand by commercial films and require us to develop our own tools. The most important in-house tool is a renderer, called pointillism, that can efficiently handle point clouds of tens of millions of mostly transparent points on desktop workstation hardware.
The main and veil layers of the Orion Nebula are rendered with the pointillism renderer using its point-cloud mode. A star mode in that same software, which employs proper dimming with distance for unresolved stars, is used for all of the stars in the scenes. Note that accurate rendering of the largest stars, such as the Trapezium stars, would completely overwhelm the scene, and thus such stars are artificially limited in brightness to maintain a coherent story. Commercial software is used to render the proplyd and bow-shock elements. For the 4K render, 14 separate layers were produced at 3840 × 2160 resolution in 7200 frames for a total of over 800 billion pixels rendered.
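The "proper dimming with distance" for unresolved stars is the inverse-square law, and the brightness limiting applied to the Trapezium stars can be layered on top of it. A minimal sketch of the principle (not the actual pointillism code, whose internals are not published here):

```python
import math

def apparent_flux(luminosity, distance):
    """Inverse-square dimming: an unresolved point source fades as 1/d^2."""
    return luminosity / (4.0 * math.pi * distance ** 2)

def render_brightness(luminosity, distance, clamp=None):
    """Screen brightness for a point star, optionally clamped so that very
    bright stars (like the Trapezium) do not overwhelm the scene."""
    b = apparent_flux(luminosity, distance)
    return min(b, clamp) if clamp is not None else b

# Doubling the camera distance quarters the rendered brightness:
near = render_brightness(1.0, 1.0)
far = render_brightness(1.0, 2.0)
```

The clamp is the artistic-license step the text describes: physically faithful rendering of the brightest stars would wash out the surrounding nebula, so their brightness is capped to keep the scene readable.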
The main reason for going to the extra effort of rendering each element in the scene on a separate layer is to provide flexibility in the post-processing stage. As the layers are composited together, the look of each element can be adjusted individually, and contrasts between elements can be balanced. An important step is also the final color correction, as the hues produced by the visualization techniques invariably drift a bit from the source material. Titles, credits, and other sequence editing are completed during post processing.
The contrast of gas and star layers was a difficult balance in the Orion Nebula. The veil-gas layer captures different tones when seen first against the bright gas from above, and then against the dark, starry sky from below. In addition, the passage through the veil gas produced more light absorption than desired, so an extra veil render was done and added in to smooth the look. The balance between star sets is important to provide both background context and the visual impression of being inside a star-forming region.
To complete the visual story of Orion, we used a zoom from the Milky Way to Orion created by the European Southern Observatory. The opening 2D sequence sets a familiar stage that the 3D visualization then dives into. Music is a crucial component for cinematic presentation, and we found an excellent piece on the Free Music Archive.
Our channels of distribution are all via the internet. Our press releases and video gallery are housed at hubblesite.org. On the video gallery pages, visitors can not only watch the sequence, but also download the movie files and, in some cases, the source frames. Documentary producers and informal science professionals at museums often request the frames for incorporation into their products. We also maintain an Astronomy Visualizations playlist on our Hubble Space Telescope channel on YouTube. Our social media team works in concert with NASA social media to promote the visualizations via Facebook, Twitter, and other social networks.
The last part of the pipeline is to repeat the process for the immersive formats of planetarium domes and virtual reality. These renders cover the full sphere around the camera and can bring up new considerations and problems to solve. For example, small artifacts of the environment map render appeared when the camera passed through gas clouds in Orion. A new technique, near-field raytracing, was developed to correct those frames. Many visualizations do not have the full 3D information for these immersive formats, but those that can be done are greatly appreciated by these audiences.
The Orion sequence has quickly become one of our most popular ever. Within a couple of weeks, the sequence garnered over 700,000 views on Hubble and NASA YouTube postings, and countless more on other social media sites as it was re-posted by outlets like the Washington Post, National Geographic, and Gizmodo.
Although left until last in this article, the story is actually the starting point for any visualization. The end product of this cinematic science pipeline is not just a cool video sequence—it is a cool video sequence that enhances the public's understanding about some aspect of astronomy. The reason for putting such extensive effort into a visual product is to present an idea that is difficult or impossible to convey otherwise. As an example, if asked about what happens when two galaxies collide, most of the public would be hard pressed to envision the result. Yet show them a visualization and they immediately have a memorable representation of its dynamical development. NASA's Universe of Learning collaboration works to develop such instructive astronomy visualizations and promote STEM awareness in a broad range of audiences.
For the Orion Nebula, the viewer instinctively creates a mental model of how star-forming regions are structured, and can adapt and apply such ideas to the interpretation of other astronomical images. That effortless knowledge transfer, which is inherent in visualizations, is why we strive to create scientifically reasonable 3D interpretations—to counteract common misconceptions, and to present a focused and relevant story. The combination of astronomy data with cinematic techniques can make powerful visualizations that intuitively deliver a scientific message.
Acknowledgement: This material is based upon work supported by NASA under cooperative agreement award number NNX16AC65A. Any opinions, findings, and conclusions or recommendations expressed in this material are those of the author(s) and do not necessarily reflect the views of the National Aeronautics and Space Administration.
Robert Brown and Knox Long RetireDavid Soderblom, drs [at] stsci.edu
Two astronomers have recently retired after long careers at the Institute.
Robert Brown joined the Institute in August 1982 as the third hire of our brand-new scientific staff. One year later, he was named NASA Project Scientist for the Hubble Space Telescope and Chair of the HST Science Working Group. Brown relocated from 1983 to 1985 from Baltimore to the Marshall Space Flight Center in Huntsville, Alabama, the NASA center responsible for building Hubble. After he returned to the Institute in 1985, Brown headed the Special Studies Office and later the Strategic Communications office.
In 1990, Brown and Holland Ford led the HST Strategy Panel, which defined the comprehensive solution to Hubble's spherical aberration—notably including the invention of COSTAR (Corrective Optics Space Telescope Axial Replacement), which astronauts installed in Hubble on the first servicing mission in December 1993. COSTAR enabled three of the launch-era instruments (FOC, FOS, and GHRS) to regain diffraction-limited imaging performance.
In 1992, Brown led the Future of Space Imaging study, which recommended the development of the instrument that became the Advanced Camera for Surveys.
In 1996–2000, Brown led the Hubble Second Decade study, which recommended the Hubble Treasury Program, the concept of the Virtual Observatory, and the near-infrared channel of Wide Field Camera 3.
For many years, 2002–2015, Brown edited the Institute Newsletter and, 1999–2005, the Annual Report.
Brown's research has focused on planetary physics and later the challenge of exoplanets. He discovered the Io sodium cloud (1974), published the first accurate measure of the rotation rate of Uranus (1977), and, with Christopher Burrows, developed a methodology for evaluating the ability of a space telescope—starting with Hubble—to detect and characterize exoplanets (1990).
Brown is now living in the Sonoran Desert south of Tucson, growing chiltepin peppers, panning for gold, and exploring canyon lands with camera and Jeep.
Knox Long started at the Institute in 1990, shortly after Hubble's launch. He initially worked on establishing and operating the data archive for Hubble, and then served as head of the Science Support Division, which at that time was responsible for technical support of Hubble observers. He then worked for about a decade as the Lead Instrument Scientist, and then Mission Scientist at the Institute for the fledgling Webb mission. Since 2011, he has served as an instrument scientist for WFC3, concentrating on activities to characterize the IR detector.
Long obtained his Ph.D. from Caltech in 1976. His thesis involved studying soft X-rays from the Large Magellanic Cloud and our Galaxy using sounding rockets. Joining the X-ray astrophysics group at Columbia University, he worked on observations of Galactic X-ray binaries with OSO-8 and carried out some of the first studies of supernova remnants and other X-ray sources in nearby galaxies with the Einstein Observatory. In 1982, he came to Johns Hopkins to be the project scientist for the Hopkins Ultraviolet Telescope, which flew on the Space Shuttle in 1990 and 1994. While at the Institute, he has continued to work in a variety of science areas—supernova remnants, cataclysmic variables, X-ray sources in normal galaxies and Active Galactic Nuclei—using a variety of space- and ground-based observatories, including Hubble and Chandra. He is also one of the principal authors of a modern Monte Carlo radiative transfer code used to model winds in accretion disk systems.
Long lives in Chicago with his wife, and is looking forward to continuing various research projects as an emeritus astronomer, enjoying the vibrant life of downtown Chicago, continuing to fish and to travel, and visiting family and friends in the Washington/Baltimore area.
In Memoriam: Nolan WalbornAlex Fullerton, fullerton [at] stsci.edu
Nolan R. Walborn, one of the original members of the research staff at STScI and a renowned expert on the spectral classification of hot, massive stars, died on February 22. His passing marks the end of an era for both the Institute and the hot-star community.
Walborn was born in Bloomsburg, PA, but moved with his family to Argentina when he was 8 years old. He attended local schools and, as a result, was fluent in Spanish, something that often caught other Spanish speakers off guard upon their first meeting! He maintained a deep affection for South America throughout his life and retained close ties with the astronomical community there.
Walborn returned to the U.S. for undergraduate studies in physics at Gettysburg College, PA, and graduated summa cum laude in 1966. He moved to the Yerkes Observatory of the University of Chicago for graduate work in the field of stellar spectroscopy with W. W. Morgan—the "M" in "MK spectral classification." Larry Marschall of Gettysburg College, who was also a graduate student at Yerkes at the time, remembers Walborn as a clear, logical thinker who wasn't hesitant to offer a fresh, and sometimes contrarian, view on the issues. "We discussed these topics informally at the Yerkes Observatory or at Friday fish-fries in the local restaurants."
In his dissertation, Walborn established precise luminosity criteria for early O‑type spectra and clarified details of the temperature classification for late O- and early B‑type stars. This work launched his career as a pre-eminent classifier of the spectra of early-type stars. Nolan's keen eye and fastidious attention to detail were perfectly suited to this type of work, which led to numerous discoveries over the course of his 50-year research career.
After completing his dissertation in 1970, Walborn held a postdoctoral position at the David Dunlap Observatory, University of Toronto, during which he helped inaugurate the University of Toronto Southern Observatory on Las Campanas, Chile. He was hired as a staff astronomer at the Cerro Tololo Inter-American Observatory in 1973, which positioned him to continue his research into early-type stars in the Galaxy and Magellanic Clouds, and to widen his interests to include explorations of the ejecta of η Carinae, the interstellar medium of the Carina region, and the properties of massive young star clusters.
Walborn's long association with the Institute began in 1979 when he was seconded from CTIO to the AURA corporate office to help draft the AURA/JHU proposal to NASA to manage the Space Telescope Science Institute. He was especially involved with drafting the "science management" section, and later recalled: "After AURA won the competition, I heard that one of the winning points of the AURA proposal was the Science Management section, which gave me a warm feeling." Following a stint as a NASA/NRC Senior Research Associate at NASA's Goddard Space Flight Center, he joined the research staff of the fledgling Institute in 1984.
Walborn held a variety of roles during his 34-year career at the Institute. He was a founding member of the General Observer Support Branch, which established the Institute's core commitment to serving the Hubble user community. After Hubble's launch in 1990, Nolan provided dedicated support for multiple generations of spectroscopic instruments—Goddard High-Resolution Spectrograph, Space Telescope Imaging Spectrograph (STIS), Cosmic Origins Spectrograph—with the particular aim of enabling complicated, "impossible" programs to be scheduled and their scientific goals achieved, while also ensuring the health and safety of the delicate UV detectors. "Nolan unfailingly had the best interests of both the science and the workings of Hubble at heart," said John Debes, the current STIS Branch Manager.
Above all else, Walborn was a classifier of stellar spectra. He championed the morphological approach that underlies spectral classification and thought deeply about the role the method plays in scientific investigations. In his view, the search for similarities, differences, trends, and relationships provided the informational framework that is a prerequisite for meaningful investigation of the physical origin of a new phenomenon. In his own words: "Of course, morphology does not explain anything; rather, it properly formulates the phenomenon to be explained." His philosophical perspective on the linkage between "Nature," our imperfect but ever-improving "Image of Nature," and the ultimate goal of "Understanding of Nature" is articulated in the Walborn Diagram, which he spoke about frequently and defended fiercely.
Walborn's own work to study shapes, patterns, and trends in the line spectra of O‑type stars led to the recognition of a wide variety of new phenomena, many of which are still awaiting adequate physical explanations. By-products of this work include landmark atlases of the spectra of early-type stars in the ultraviolet (International Ultraviolet Explorer, Hubble), far-ultraviolet (Copernicus, Hopkins Ultraviolet Telescope, Far Ultraviolet Spectroscopic Explorer), and X‑ray (Chandra) wavebands. Although he championed the spectroscopic explorations provided by these new windows on the universe, he always came back to the optical region of the spectrum. He was an early convert to classification with high S/N digital data obtained with modern detectors, and was particularly proud of his recent work to classify all the O‑type spectra in the Galactic O‑Star Spectroscopic Survey (GOSSS; 590 stars) and VLT-Flames Tarantula Survey (VFTS; 213 stars) with newly refined criteria. He was continuing to work on a variety of projects until the very end of his life.
Walborn was happy to engage in the "cut and thrust" of spirited debates, whether on details of stellar spectra, the necessity of the morphological approach to research, or the political and sociological issues of the day. In the context of their work on η Carinae, Roberta Humphreys, Professor Emeritus at the University of Minnesota, recalls that "he could be both insightful and critical, but with a twinkle in his eye and a sly grin afterwards." Nolan was also a tireless raconteur, who would happily entertain gatherings with stories of astronomy and astronomers, as well as his many experiences in different places and times. His more playful side would often emerge after the banquet dinner at an astronomical conference. Linda Smith, head of the STScI Instruments Division, attended many of the same conferences as Nolan over the years, and noted that along with scientific discussions, "his biggest joy at conferences, however, was to dance the Macarena. He also taught the dance to many conference attendees over the years."
Nolan Walborn was a unique individual with a rich legacy of achievements. His colleagues at the Institute and many friends around the world mourn his passing, and express sincere condolences to his family. Though his insights and mentorship will be keenly missed, his accomplishments will continue to guide massive star research for years to come.