Modeling Results and Baseline Design for an RF-SoC-Based Readout System
for Microwave Kinetic Inductance Detectors: Building upon existing signal processing techniques and open-source software,
this paper presents a baseline design for an RF System-on-Chip Frequency
Division Multiplexed readout for a spatio-spectral focal plane instrument based
on low temperature detectors. A trade-off analysis of different FPGA carrier
boards is presented in an attempt to find an optimum next-generation solution
for reading out larger arrays of Microwave Kinetic Inductance Detectors
(MKIDs). The ZCU111 RF SoC FPGA board from Xilinx was selected, and it is shown
how this integrated system promises to increase the number of pixels that can
be read out (per board) which enables a reduction in the readout cost per
pixel, the mass and volume, and power consumption, all of which are important
in making MKID instruments more feasible for both ground-based and space-based
astrophysics. The on-chip logic capacity is shown to form a primary constraint
on the number of MKIDs which can be read, channelised, and processed with this
new system. As such, novel signal processing techniques are analysed, including
Digitally Down Converted (DDC)-corrected sub-maximally decimated sampling, in
an effort to reduce logic requirements without compromising signal to noise
ratio. It is also shown how combining the ZCU111 board with a secondary FPGA
board will allow all 8 ADCs and 8 DACs to be utilised, providing enough
bandwidth to read up to 8,000 MKIDs per board-set, an eight-fold improvement
over the state-of-the-art, and important in pursuing 100,000 pixel arrays.
Finally, the feasibility of extending the operational frequency range of MKIDs
to the 5-10 GHz regime (or possibly beyond) is investigated, and some
benefits and consequences of doing so are presented. | astro-ph_IM |
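The digital down-conversion step mentioned above can be illustrated in a few lines. This is a minimal NumPy sketch of mixing a probe tone to baseband and decimating, not the paper's FPGA implementation; the sample rate, tone frequency, and decimation factor are arbitrary illustrative values.

```python
import numpy as np

def ddc(signal, f_tone, fs, decim):
    """Digitally down-convert a real signal: mix the tone at f_tone to
    baseband, low-pass with a boxcar filter, then decimate by `decim`."""
    n = np.arange(len(signal))
    mixed = signal * np.exp(-2j * np.pi * f_tone * n / fs)  # shift tone to DC
    kernel = np.ones(decim) / decim                          # crude low-pass
    filtered = np.convolve(mixed, kernel, mode="same")
    return filtered[::decim]                                 # reduced sample rate

# A probe tone such as those used to read out MKID resonators (values hypothetical)
fs, f0, decim = 1_000_000.0, 125_000.0, 8
t = np.arange(4096)
tone = np.cos(2 * np.pi * f0 * t / fs)
baseband = ddc(tone, f0, fs, decim)
# After mixing, cos -> 0.5*(DC + image at 2*f0); the filter suppresses the image
print(round(abs(baseband[200:300].mean()), 2))  # → 0.5
```

Decimating by less than the maximum allowed factor (as in the sub-maximally decimated scheme the paper analyses) trades logic for guard bandwidth; here the boxcar stands in for the real filter chain.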
A superconducting focal plane array for ultraviolet, optical, and
near-infrared astrophysics: Microwave Kinetic Inductance Detectors, or MKIDs, have proven to be a
powerful cryogenic detector technology due to their sensitivity and the ease
with which they can be multiplexed into large arrays. An MKID is an energy
sensor based on a photon-variable superconducting inductance in a lithographed
microresonator, and is capable of functioning as a photon detector across the
electromagnetic spectrum as well as a particle detector. Here we describe the
first successful effort to create a photon-counting, energy-resolving
ultraviolet, optical, and near infrared MKID focal plane array. These new
Optical Lumped Element (OLE) MKID arrays have significant advantages over
semiconductor detectors like charge coupled devices (CCDs). They can count
individual photons with essentially no false counts and determine the energy
and arrival time of every photon with good quantum efficiency. Their physical
pixel size and maximum count rate is well matched with large telescopes. These
capabilities enable powerful new astrophysical instruments usable from the
ground and space. MKIDs could eventually supplant semiconductor detectors for
most astronomical instrumentation, and will be useful for other disciplines
such as quantum optics and biological imaging. | astro-ph_IM |
Light Curve Classification with DistClassiPy: a new distance-based
classifier: The rise of synoptic sky surveys has ushered in an era of big data in
time-domain astronomy, making data science and machine learning essential tools
for studying celestial objects. Tree-based (e.g. Random Forests) and deep
learning models represent the current standard in the field. We explore the use
of different distance metrics to aid in the classification of objects. For
this, we developed a new distance metric based classifier called DistClassiPy.
The direct use of distance metrics is an approach that has not been explored in
time-domain astronomy, but distance-based methods can increase the
interpretability of the classification result and decrease the computational
cost. In particular, we classify light curves of variable stars by comparing
the distances between objects of different classes. Using 18 distance metrics
applied to a catalog of 6,000 variable stars in 10 classes, we demonstrate
classification and dimensionality reduction. We show that this classifier
matches state-of-the-art performance but has lower computational requirements and
improved interpretability. We have made DistClassiPy open-source and accessible
at https://pypi.org/project/distclassipy/ with the goal of broadening its
applications to other classification scenarios within and beyond astronomy. | astro-ph_IM |
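As a conceptual sketch of distance-metric classification (not DistClassiPy's actual interface), one can classify feature vectors by their distance to per-class centroids under any metric SciPy's `cdist` supports; the feature names and values below are invented for illustration.

```python
import numpy as np
from scipy.spatial.distance import cdist

def distance_classify(X_train, y_train, X_test, metric="cityblock"):
    """Assign each test vector to the class whose centroid is nearest,
    for any metric supported by scipy's cdist. A conceptual sketch of
    distance-based classification, not DistClassiPy's API."""
    classes = np.unique(y_train)
    centroids = np.vstack([X_train[y_train == c].mean(axis=0) for c in classes])
    d = cdist(X_test, centroids, metric=metric)   # (n_test, n_classes) distances
    return classes[d.argmin(axis=1)]

# Toy "light-curve features" (e.g. amplitude, period): two separable classes
rng = np.random.default_rng(0)
X0 = rng.normal([0.0, 0.0], 0.1, size=(20, 2))
X1 = rng.normal([1.0, 1.0], 0.1, size=(20, 2))
X = np.vstack([X0, X1])
y = np.array([0] * 20 + [1] * 20)
pred = distance_classify(X, y, np.array([[0.05, -0.02], [0.9, 1.1]]))
print(pred.tolist())  # → [0, 1]
```

Swapping `metric` (Euclidean, Canberra, Chebyshev, ...) is what lets a classifier of this kind compare the 18 metrics the abstract mentions.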
A Bayesian method for the analysis of deterministic and stochastic time
series: I introduce a general, Bayesian method for modelling univariate time series
data assumed to be drawn from a continuous, stochastic process. The method
accommodates arbitrary temporal sampling, and takes into account measurement
uncertainties for arbitrary error models (not just Gaussian) on both the time
and signal variables. Any model for the deterministic component of the
variation of the signal with time is supported, as is any model of the
stochastic component on the signal and time variables. Models illustrated here
are constant and sinusoidal models for the signal mean combined with a Gaussian
stochastic component, as well as a purely stochastic model, the
Ornstein-Uhlenbeck process. The posterior probability distribution over model
parameters is determined via Monte Carlo sampling. Models are compared using
the "cross-validation likelihood", in which the posterior-averaged likelihoods
for different partitions of the data are combined. In principle this is more
robust to changes in the prior than is the evidence (the prior-averaged
likelihood). The method is demonstrated by applying it to the light curves of
11 ultracool dwarf stars, claimed by a previous study to show statistically
significant variability. This is reassessed here by calculating the
cross-validation likelihood for various time series models, including a null
hypothesis of no variability beyond the error bars. 10 of 11 light curves are
confirmed as being significantly variable, and one of these seems to be
periodic, with two plausible periods identified. Another object is best
described by the Ornstein-Uhlenbeck process, a conclusion which is obviously
limited to the set of models actually tested. | astro-ph_IM |
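The cross-validation likelihood described above can be written compactly. The notation here (K partitions, model M, parameters θ) is ours, chosen to be consistent with the abstract's description of posterior-averaged likelihoods combined over data partitions:

```latex
% Partition the data D into K blocks D_1, ..., D_K. For block k, average the
% likelihood of the held-out block over the posterior from the remaining data:
L_k \;=\; p(D_k \mid D_{-k}, M)
      \;=\; \int p(D_k \mid \theta, M)\, p(\theta \mid D_{-k}, M)\, \mathrm{d}\theta ,
\qquad
\log L_{\mathrm{CV}} \;=\; \sum_{k=1}^{K} \log L_k .
```

Because each factor conditions on data rather than on the prior alone, the combined score is less sensitive to the choice of prior than the evidence $p(D \mid M)$.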
Bias-Free Estimation of Signals on Top of Unknown Backgrounds: We present a method for obtaining unbiased signal estimates in the presence
of a significant background, eliminating the need for a parametric model for
the background itself. Our approach is based on a minimal set of conditions for
observation and background estimators, which are typically satisfied in
practical scenarios. To showcase the effectiveness of our method, we apply it
to simulated data from the planned dielectric axion haloscope MADMAX. | astro-ph_IM |
Arbitrary Transform Telescopes: The Generalization of Interferometry: The basic principle of astronomical interferometry is to derive the angular
distribution of radiation in the sky from the Fourier transform of the electric
field on the ground. What is so special about the Fourier transform? Nothing,
it turns out. I consider the possibility of performing other transforms on the
electric field with digital technology. The Fractional Fourier Transform (FrFT)
is useful for interpreting observations of sources that are close to the
interferometer (in the atmosphere for radio interferometers). Essentially,
applying the FrFT focuses the array somewhere nearer than infinity. Combined
with the other Linear Canonical Transforms, any homogeneous linear optical
system with thin elements can be instantiated. The time variation of the
electric field can also be decomposed into other bases besides the Fourier
modes, which is especially useful for dispersed transients or quick pulses. I
discuss why the Fourier basis is so commonly used, and suggest it is partly
because most astrophysical sources vary slowly in time. | astro-ph_IM |
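The premise, imaging as a transform of measured field correlations, can be shown in its ordinary Fourier form with a one-dimensional toy; swapping the inverse FFT for another transform (e.g. an FrFT for near-field sources) is the generalization the paper proposes. The source positions below are arbitrary.

```python
import numpy as np

# Far-field imaging toy: the sky brightness and the correlations of the
# electric field across an array form a Fourier pair (van Cittert-Zernike),
# so imaging amounts to an inverse Fourier transform of the visibilities.
n = 64
sky = np.zeros(n)
sky[10], sky[40] = 1.0, 0.5                 # two point sources
visibilities = np.fft.fft(sky)              # measured field correlations
image = np.fft.ifft(visibilities).real      # imaging = inverse FFT
print(np.argmax(image), round(image[40], 2))  # → 10 0.5
```

A digital correlator free to apply any linear transform at this step could, as the paper argues, refocus the array or change basis entirely.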
The Rapid Imaging Planetary Spectrograph: The Rapid Imaging Planetary Spectrograph (RIPS) was designed as a long-slit
high-resolution spectrograph for the specific application of studying
atmospheres of spatially extended solar system bodies. With heritage in
terrestrial airglow instruments, RIPS uses an echelle grating and order-sorting
filters to obtain optical spectra at resolving powers up to R~127,000. An
ultra-narrowband image from the reflective slit jaws is captured concurrently
with each spectrum on the same EMCCD detector. The "rapid" portion of RIPS'
moniker stems from its ability to capture high frame rate data streams, which
enables the established technique known as "lucky imaging" to be extended to
spatially resolved spectroscopy. Resonantly scattered emission lines of alkali
metals, in particular, are sufficiently bright to be measured in short
integration times. RIPS has mapped the distributions of Na and K emissions in
Mercury's tenuous exosphere, which exhibit dynamic behavior coupled to the
planet's plasma and meteoroid environment. An important application is daylight
observations of Mercury at solar telescopes since synoptic context on the
exosphere's distribution comprises valuable ground-based support for the
upcoming BepiColombo orbital mission. As a conventional long slit spectrograph,
RIPS has targeted the Moon's surface-bound exosphere where structure in
linewidth and brightness as a function of tangent altitude is observed. At the
Galilean moons, RIPS can study the plasma interaction with Io and place new
constraints on the sputtered atmosphere of Europa, which in turn provides
insight into the salinity of Europa's subsurface ocean. The instrumental design
and construction are described herein, and these astronomical observations are
presented to illustrate RIPS' performance as a visiting instrument at three
different telescope facilities. | astro-ph_IM |
Experiments with calibrated digital sideband separating downconversion: This article reports on the first step in a focused program to re-optimize
radio astronomy receiver architecture to better take advantage of the latest
advancements in commercial digital technology. Specifically, an L-Band
sideband-separating downconverter has been built using a combination of careful
(but ultimately very simple) analog design and digital signal processing to
achieve wideband downconversion of an RFI-rich frequency spectrum to baseband
in a single mixing step, with a fixed-frequency Local Oscillator and stable
sideband isolation exceeding 50 dB over a 12 degree C temperature range. | astro-ph_IM |
A Lunar L2-Farside Exploration and Science Mission Concept with the
Orion Multi-Purpose Crew Vehicle and a Teleoperated Lander/Rover: A novel concept is presented in this paper for a human mission to the lunar
L2 (Lagrange) point that would be a proving ground for future exploration
missions to deep space while also overseeing scientifically important
investigations. In an L2 halo orbit above the lunar farside, the astronauts
aboard the Orion Crew Vehicle would travel 15% farther from Earth than did the
Apollo astronauts and spend almost three times longer in deep space. Such a
mission would serve as a first step beyond low Earth orbit and prove out
operational spaceflight capabilities such as life support, communication, high
speed re-entry, and radiation protection prior to more difficult human
exploration missions. On this proposed mission, the crew would teleoperate
landers and rovers on the unexplored lunar farside, which would obtain samples
from the geologically interesting farside and deploy a low radio frequency
telescope. Sampling the South Pole-Aitken basin, one of the oldest impact
basins in the solar system, is a key science objective of the 2011 Planetary
Science Decadal Survey. Observations at low radio frequencies to track the
effects of the Universe's first stars/galaxies on the intergalactic medium are
a priority of the 2010 Astronomy and Astrophysics Decadal Survey. Such
telerobotic oversight would also demonstrate capability for human and robotic
cooperation on future, more complex deep space missions such as exploring Mars. | astro-ph_IM |
Optical capabilities of the Multichannel Subtractive Double Pass (MSDP)
for imaging spectroscopy and polarimetry at the Meudon Solar Tower: The Meudon Solar Tower (MST) is a 0.60 m telescope dedicated to spectroscopic
observations of solar regions. It includes a 14-meter focal length spectrograph
which offers high spectral resolution. The spectrograph works either in
classical thin slit mode (R > 300000) or 2D imaging spectroscopy (60000 < R <
180000). This specific mode is able to provide high temporal resolution
measurements (1 min) of velocities and magnetic fields upon a 2D field of view,
using the Multichannel Subtractive Double Pass (MSDP) system. The purpose of
this paper is to describe the capabilities of the MSDP at MST with available
slicers for broad and thin lines. The goal is to produce multichannel
spectra-images, from which cubes of instantaneous data (x, y, $\lambda$) are
derived, in order to study the plasma dynamics and magnetic fields (with
polarimetry). | astro-ph_IM |
The small size telescope projects for the Cherenkov Telescope Array: The small size telescopes (SSTs), spread over an area of several square km,
dominate the CTA sensitivity in the photon energy range from a few TeV to over
100 TeV, enabling detailed exploration of the very high energy
gamma-ray sky. The proposed telescopes are innovative designs providing a wide
field of view. Two of them, the ASTRI (Astrofisica con Specchi a Tecnologia
Replicante Italiana) and the GCT (Gamma-ray Cherenkov Telescope) telescopes,
are based on dual mirror Schwarzschild-Couder optics, with primary mirror
diameters of 4 m. The third, SST-1M, is a Davies-Cotton design with a 4 m
diameter mirror. Progress with the construction and testing of prototypes of
these telescopes is presented. The SST cameras use silicon photomultipliers,
with preamplifier and readout/trigger electronics designed to optimize the
performance of these sensors for (atmospheric) Cherenkov light. The status of
the camera developments is discussed. The SST sub-array will consist of about
70 telescopes at the CTA southern site. Current plans for the implementation of
the array are presented. | astro-ph_IM |
The Footprint Database and Web Services of the Herschel Space
Observatory: Data from the Herschel Space Observatory is freely available to the public
but no uniformly processed catalogue of the observations has been published so
far. To date, the Herschel Science Archive does not contain the exact sky
coverage (footprint) of individual observations and supports search for
measurements based on bounding circles only. Drawing on previous experience in
implementing footprint databases, we built the Herschel Footprint Database and
Web Services for the Herschel Space Observatory to provide efficient search
capabilities for typical astronomical queries. The database was designed with
the following main goals in mind: (a) provide a unified data model for
meta-data of all instruments and observational modes, (b) quickly find
observations covering a selected object and its neighbourhood, (c) quickly find
every observation in a larger area of the sky, (d) allow for finding solar
system objects crossing observation fields. As a first step, we developed a
unified data model of observations of all three Herschel instruments for all
pointing and instrument modes. Then, using telescope pointing information and
observational meta-data, we compiled a database of footprints. As opposed to
methods using pixellation of the sphere, we represent sky coverage in an exact
geometric form allowing for precise area calculations. For easier handling of
Herschel observation footprints with rather complex shapes, two algorithms were
implemented to reduce the outline. Furthermore, a new visualisation tool to
plot footprints with various spherical projections was developed. Indexing of
the footprints using Hierarchical Triangular Mesh makes it possible to quickly
find observations based on sky coverage, time and meta-data. The database is
accessible via a web site (http://herschel.vo.elte.hu) and also as a set of
REST web service functions. | astro-ph_IM |
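As a sketch of the exact-geometry approach (as opposed to pixellation of the sphere), the solid angle of a spherical triangle has a closed form; this is a generic illustration of such precise area calculations, not code from the Footprint Database itself.

```python
import numpy as np

def spherical_triangle_area(v1, v2, v3):
    """Exact solid angle (steradians) of a spherical triangle given unit
    vertex vectors, via L'Huilier's theorem -- the kind of exact area
    computation an unpixellated footprint representation allows."""
    def arc(a, b):                       # great-circle side length
        return np.arccos(np.clip(np.dot(a, b), -1.0, 1.0))
    a, b, c = arc(v2, v3), arc(v1, v3), arc(v1, v2)
    s = (a + b + c) / 2.0
    t = (np.tan(s / 2) * np.tan((s - a) / 2)
         * np.tan((s - b) / 2) * np.tan((s - c) / 2))
    return 4.0 * np.arctan(np.sqrt(max(t, 0.0)))

# One octant of the sphere covers 4*pi/8 = pi/2 steradians
octant = spherical_triangle_area(np.array([1., 0., 0.]),
                                 np.array([0., 1., 0.]),
                                 np.array([0., 0., 1.]))
print(round(octant, 6))  # → 1.570796
```

A complex footprint outline decomposes into such triangles, so its area follows by summation with no pixellation error.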
The Chinese space millimeter-wavelength VLBI array - a step toward
imaging the most compact astronomical objects: The Shanghai Astronomical Observatory (SHAO) of the Chinese Academy of
Sciences (CAS) is studying a space VLBI (Very Long Baseline Interferometer)
program. The ultimate objective of the program is to image the immediate
vicinity of the supermassive black holes (SMBHs) in the hearts of galaxies with
a space-based VLBI array working at sub-millimeter wavelengths and to gain
ultrahigh angular resolution. To achieve this ambitious goal, the mission plan
is divided into three stages. The first phase of the program is called Space
Millimeter-wavelength VLBI Array (SMVA) consisting of two satellites, each
carrying a 10-m diameter radio telescope into elliptical orbits with an apogee
height of 60000 km and a perigee height of 1200 km. The VLBI telescopes in
space will work at three frequency bands, 43, 22 and 8 GHz. The 43- and 22-GHz
bands will be equipped with cryogenic receivers. The space telescopes,
observing together with ground-based radio telescopes, enable the highest
angular resolution of 20 micro-arcsecond ($\mu$as) at 43 GHz. The SMVA is
expected to conduct a broad range of high-resolution observational research,
e.g. imaging the shadow (dark region) of the supermassive black hole in the
heart of the galaxy M87 for the first time, studying the kinematics of water
megamasers surrounding the SMBHs, and exploring the power source of active
galactic nuclei. Pre-research funding has been granted by the CAS in October
2012, to support scientific and technical feasibility studies. These studies
also include the manufacturing of a prototype of the deployable 10-m
space-based telescope and a 22-GHz receiver. Here we report on the latest
progress of the SMVA project. | astro-ph_IM |
Initial simulation study on high-precision radio measurements of the
depth of shower maximum with SKA1-low: As LOFAR has shown, using a dense array of radio antennas for detecting
extensive air showers initiated by cosmic rays in the Earth's atmosphere makes
it possible to measure the depth of shower maximum for individual showers with
a statistical uncertainty less than $20\,g/cm^2$. This allows detailed studies
of the mass composition in the energy region around $10^{17}\,eV$ where the
transition from a Galactic to an Extragalactic origin could occur. Since
SKA1-low will provide a much denser and very homogeneous antenna array with a
large bandwidth of $50-350\,MHz$ it is expected to reach an uncertainty on the
$X_{\max}$ reconstruction of less than $10\,g/cm^2$. We present first results
of a simulation study with focus on the potential to reconstruct the depth of
shower maximum for individual showers to be measured with SKA1-low. In
addition, possible influences of various parameters such as the numbers of
antennas included in the analysis or the considered frequency bandwidth will be
discussed. | astro-ph_IM |
The GLENDAMA Database: This is the first version (v1) of the Gravitational LENses and DArk MAtter
(GLENDAMA) database accessible at http://grupos.unican.es/glendama/database The
new database contains more than 6000 ready-to-use (processed) astronomical
frames corresponding to 15 objects that fall into three classes: (1) lensed QSO
(8 objects), (2) binary QSO (3 objects), and (3) accretion-dominated radio-loud
QSO (4 objects). Data are also divided into two categories: freely available
and available upon request. The second category includes observations related
to our yet unpublished analyses. Although this v1 of the GLENDAMA archive
incorporates an X-ray monitoring campaign for a lensed QSO in 2010, the rest of
frames (imaging, polarimetry and spectroscopy) were taken with NUV, visible and
NIR facilities over the period 1999-2014. The monitorings and follow-up
observations of lensed QSOs are key tools for discussing the accretion flow in
distant QSOs, the redshift and structure of intervening (lensing) galaxies, and
the physical properties of the Universe as a whole. | astro-ph_IM |
Gammapy - A Python package for γ-ray astronomy: In the past decade imaging atmospheric Cherenkov telescope arrays such as
H.E.S.S., MAGIC, VERITAS, as well as the Fermi-LAT space telescope have
provided us with detailed images and spectra of the gamma-ray universe for the
first time. Currently the gamma-ray community is preparing to build the
next-generation Cherenkov Telescope Array (CTA), which will be operated as an
open observatory. Gammapy (available at https://github.com/gammapy/gammapy
under the open-source BSD license) is a new in-development Astropy affiliated
package for high-level analysis and simulation of astronomical gamma-ray data.
It is built on the scientific Python stack (Numpy, Scipy, matplotlib and
scikit-image) and makes use of other open-source astronomy packages such as
Astropy, Sherpa and Naima to provide a flexible set of tools for gamma-ray
astronomers. We present an overview of the current Gammapy features and example
analyses on real as well as simulated gamma-ray datasets. We would like Gammapy
to become a community-developed project and a place of collaboration between
scientists interested in gamma-ray astronomy with Python. Contributions
welcome! | astro-ph_IM |
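To give a flavour of the high-level statistics such packages supply, here is a self-contained sketch of the standard on/off significance of Li & Ma (1983, eq. 17), a staple of gamma-ray analysis; this is an illustration of the kind of computation involved, not Gammapy's API.

```python
import numpy as np

def li_ma_significance(n_on, n_off, alpha):
    """Detection significance of an on/off counting measurement
    (Li & Ma 1983, eq. 17). alpha is the on/off exposure ratio."""
    n_tot = n_on + n_off
    term_on = n_on * np.log((1 + alpha) / alpha * n_on / n_tot)
    term_off = n_off * np.log((1 + alpha) * n_off / n_tot)
    return np.sqrt(2.0 * (term_on + term_off))

# 200 counts on-source vs 100 off-source with equal exposures
print(round(li_ma_significance(200, 100, 1.0), 2))  # → 5.83
```

With no excess (`n_on == alpha * n_off`) the significance is zero, as expected.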
An investigation of lucky imaging techniques: We present an empirical analysis of the effectiveness of frame selection
(also known as Lucky Imaging) techniques for high resolution imaging. A
high-speed image recording system has been used to observe a number of bright
stars. The observations were made over a wide range of values of D/r0 and
exposure time. The improvement in Strehl ratio of the stellar images due to
aligning frames and selecting the best frames was evaluated as a function of
these parameters. We find that improvement in Strehl ratio by factors of 4 to 6
can be achieved over a range of D/r0 from 3 to 12, with a slight peak at D/r0 ~
7. The best Strehl improvement is achieved with exposure times of 10 ms or less
but significant improvement is still obtained at exposure times as long as 640
ms. Our results are consistent with previous investigations but cover a much
wider range of parameter space. We show that Strehl ratios of >0.7 can be
achieved in appropriate conditions whereas previous studies have generally shown
maximum Strehl ratios of ~0.3. The results are in reasonable agreement with the
simulations of Baldwin et al. (2008). | astro-ph_IM |
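The frame selection procedure evaluated above can be sketched with a toy shift-and-add; peak brightness stands in for the Strehl ratio, and the star images are synthetic single-pixel frames with random tip/tilt shifts.

```python
import numpy as np

def lucky_stack(frames, keep_fraction=0.1):
    """Frame selection ('lucky imaging') sketch: rank short exposures by
    peak brightness (a Strehl proxy), keep the sharpest fraction, shift
    each so its peak lands at the centre, and average."""
    scores = [f.max() for f in frames]
    order = np.argsort(scores)[::-1]          # best frames first
    n_keep = max(1, int(len(frames) * keep_fraction))
    h, w = frames[0].shape
    stack = np.zeros((h, w))
    for idx in order[:n_keep]:
        py, px = np.unravel_index(np.argmax(frames[idx]), (h, w))
        stack += np.roll(frames[idx], (h // 2 - py, w // 2 - px), axis=(0, 1))
    return stack / n_keep

# Synthetic test: identical star images with random tip/tilt shifts
rng = np.random.default_rng(1)
psf = np.zeros((16, 16)); psf[8, 8] = 1.0
frames = [np.roll(psf, rng.integers(-3, 4, size=2), axis=(0, 1))
          for _ in range(50)]
result = lucky_stack(frames, keep_fraction=0.2)
print(result[8, 8])  # → 1.0  (selected frames re-centred coherently)
```

On real data the selection matters because only some frames catch favourable atmospheric moments; here every frame is equally sharp, so the sketch only shows the align-and-average mechanics.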
Early Science Results from SOFIA, the World's Largest Airborne
Observatory: The Stratospheric Observatory For Infrared Astronomy, or SOFIA, is the
largest flying observatory ever built, consisting of a 2.7-meter diameter
telescope embedded in a modified Boeing 747-SP aircraft. SOFIA is a joint
project between NASA and the German Aerospace Center (Deutsches Zentrum für
Luft- und Raumfahrt, DLR). By flying at altitudes up to 45,000 feet, the
observatory
gets above 99.9 percent of the infrared-absorbing water vapor in the Earth's
atmosphere. This opens up an almost uninterrupted wavelength range from
0.3-1600 microns that is in large part obscured from ground based
observatories. Since its 'Initial Science Flight' in December 2010, SOFIA has
flown several dozen science flights, and has observed a wide array of objects
from Solar System bodies, to stellar nurseries, to distant galaxies. This paper
reviews a few of the exciting new science results from these first flights
which were made by three instruments: the mid-infrared camera FORCAST, the
far-infrared heterodyne spectrometer GREAT, and the optical occultation
photometer HIPO. | astro-ph_IM |
Energy spectra of abundant cosmic-ray nuclei in the NUCLEON experiment: The NUCLEON satellite experiment is designed to directly investigate the
energy spectra of cosmic-ray nuclei and the chemical composition (Z=1-30) in
the energy range of 2-500 TeV. The experimental results are presented,
including the energy spectra of different abundant nuclei measured using the
new Kinematic Lightweight Energy Meter (KLEM) technique. The primary energy is
reconstructed by registration of spatial density of the secondary particles.
The particles are generated by the first hadronic inelastic interaction in a
carbon target. Then additional particles are produced in a thin tungsten
converter, by electromagnetic and hadronic interactions. | astro-ph_IM |
Measurement errors and scaling relations in astrophysics: a review: This review article considers some of the most common methods used in
astronomy for regressing one quantity against another in order to estimate the
model parameters or to predict an observationally expensive quantity using
trends between object values. These methods have to tackle some of the awkward
features prevalent in astronomical data, namely heteroscedastic
(point-dependent) errors, intrinsic scatter, non-ignorable data collection and
selection effects, data structure and non-uniform population (often called
Malmquist bias), non-Gaussian data, outliers and mixtures of regressions. We
outline how least square fits, weighted least squares methods, Maximum
Likelihood, survival analysis, and Bayesian methods have been applied in the
astrophysics literature when one or more of these features is present. In
particular we concentrate on errors-in-variables regression and we advocate
Bayesian techniques. | astro-ph_IM |
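The simplest method the review surveys, weighted least squares with heteroscedastic (point-dependent) errors, has a closed form for a straight line. A minimal sketch, with no intrinsic scatter or selection effects and invented data:

```python
import numpy as np

def weighted_line_fit(x, y, sigma):
    """Fit y = a + b*x by minimising chi^2 = sum((y - a - b*x)^2 / sigma^2)
    with per-point Gaussian errors sigma (closed-form weighted least squares)."""
    w = 1.0 / sigma**2
    S, Sx, Sy = w.sum(), (w * x).sum(), (w * y).sum()
    Sxx, Sxy = (w * x * x).sum(), (w * x * y).sum()
    delta = S * Sxx - Sx**2
    a = (Sxx * Sy - Sx * Sxy) / delta
    b = (S * Sxy - Sx * Sy) / delta
    return a, b

x = np.array([0.0, 1.0, 2.0, 3.0])
y = np.array([1.0, 3.0, 5.0, 7.0])          # exactly y = 1 + 2x
sigma = np.array([0.1, 0.2, 0.1, 0.3])      # unequal error bars
a, b = weighted_line_fit(x, y, sigma)
print(round(a, 6), round(b, 6))  # → 1.0 2.0
```

Once intrinsic scatter, errors in `x`, or selection effects enter, this closed form is biased, which is precisely why the review moves on to errors-in-variables and Bayesian treatments.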
The optical imager Galileo (OIG): This paper describes the construction, installation, and operation of the
Optical Imager Galileo (OIG), a scientific instrument dedicated to imaging in
the visible. OIG was the first instrument installed at the focal plane of the
Telescopio Nazionale Galileo (TNG) and has been used extensively for the
functional verification of several parts of the telescope (for example the
optical quality, the rejection of spurious light, the active optics, and the
tracking); several parts of the TNG informatics system (instrument commanding,
telemetry, and data archiving) were likewise verified through extensive use of
OIG. The paper also provides a framework for further development of the
imaging instrumentation at the TNG. OIG, coupled with the first near-IR camera
(ARNICA), has been the workhorse instrument during the first period of
experimental and scientific telescope scheduling. | astro-ph_IM |
VERITAS Telescope 1 Relocation: Details and Improvements: The first VERITAS telescope was installed in 2002-2003 at the Fred Lawrence
Whipple Observatory and was originally operated as a prototype instrument.
Subsequently the decision was made to locate the full array at the same site,
resulting in an asymmetric array layout. As anticipated, this resulted in less
than optimal sensitivity due to the loss in effective area and the increase in
background due to local muon initiated triggers. In the summer of 2009, the
VERITAS collaboration relocated Telescope 1 to improve the overall array
layout. This has provided a 30% improvement in sensitivity corresponding to a
60% change in the time needed to detect a source. | astro-ph_IM |
An Atmospheric Cerenkov Telescope Simulation System: A detailed numerical procedure has been developed to simulate the mechanical
configurations and optical properties of Imaging Atmospheric Cerenkov Telescope
systems. To test these procedures a few existing ACT arrays are simulated.
First results from these simulations are presented. | astro-ph_IM |
The ASTROID Simulator Software Package: Realistic Modelling of
High-Precision High-Cadence Space-Based Imaging: The preparation of a space-mission that carries out any kind of imaging to
detect high-precision low-amplitude variability of its targets requires a
robust model for the expected performance of its instruments. This model cannot
be derived from simple addition of noise properties due to the complex
interaction between the various noise sources. While it is not feasible to
build and test a prototype of the imaging device on-ground, realistic numerical
simulations in the form of an end-to-end simulator can be used to model the
noise propagation in the observations. These simulations not only allow
studying the performance of the instrument, its noise source response and its
data quality, but also the instrument design verification for different types
of configurations, the observing strategy and the scientific feasibility of an
observing proposal. In this way, a complete description and assessment of the
objectives to expect from the mission can be derived. We present a
high-precision simulation software package, designed to simulate photometric
time-series of CCD images by including realistic models of the CCD and its
electronics, the telescope optics, the stellar field, the jitter movements of
the spacecraft, and all important natural noise sources. This formalism has
been implemented in a software tool, dubbed ASTROID Simulator. | astro-ph_IM |
denmarf: a Python package for density estimation using masked
autoregressive flow: Masked autoregressive flow (MAF) is a state-of-the-art non-parametric density
estimation technique. It is based on the idea (known as a normalizing flow)
that a simple base probability distribution can be mapped into a complicated
target distribution that one wishes to approximate, using a sequence of
bijective transformations. The denmarf package provides a scikit-learn-like
interface in Python for researchers to effortlessly use MAF for density
estimation in their applications to evaluate probability densities of the
underlying distribution of a set of data and generate new samples from the
data, on either a CPU or a GPU, as simple as "from denmarf import
DensityEstimate; de = DensityEstimate().fit(X)". The package also implements
logistic transformations to facilitate the fitting of bounded distributions. | astro-ph_IM |
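The normalizing-flow idea can be illustrated with a single hand-picked bijection (MAF instead learns a sequence of them from data). This toy is not denmarf's interface, which is the scikit-learn-like fit the abstract quotes; it only shows the change-of-variables mechanics.

```python
import numpy as np

# Map a standard normal base variable z through the bijection x = sigmoid(z)
# and evaluate the induced density on (0, 1) via the change of variables:
# p_x(x) = p_z(logit(x)) * |d logit / dx|.
def base_logpdf(z):
    return -0.5 * z**2 - 0.5 * np.log(2 * np.pi)

def flow_pdf(x):
    z = np.log(x / (1 - x))            # inverse bijection (logit)
    log_det = -np.log(x * (1 - x))     # |d logit/dx| = 1 / (x * (1 - x))
    return np.exp(base_logpdf(z) + log_det)

# The induced density must still integrate to one over (0, 1)
xs = np.linspace(1e-6, 1 - 1e-6, 200_001)
dx = xs[1] - xs[0]
print(round(flow_pdf(xs).sum() * dx, 3))  # → 1.0
```

The logistic map here also mirrors the bounded-support transformation the package implements: a distribution on a finite interval can be pulled back to an unbounded one before fitting the flow.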
SHIMM: A Versatile Seeing Monitor for Astronomy: Characterisation of atmospheric optical turbulence is crucial for the design
and operation of modern ground-based optical telescopes. In particular, the
effective application of adaptive optics correction on large and extremely
large telescopes relies on a detailed knowledge of the prevailing atmospheric
conditions, including the vertical profile of the optical turbulence strength
and the atmospheric coherence timescale. The Differential Image Motion Monitor
(DIMM) has been employed as a facility seeing monitor at many astronomical
observing sites across the world for several decades, providing a reliable
estimate of the seeing angle. Here we present the Shack-Hartmann Image Motion
Monitor (SHIMM), which is a development of the DIMM instrument, in that it
exploits differential image motion measurements of bright target stars.
However, the SHIMM employs a Shack-Hartmann wavefront sensor in place of the
two-hole aperture mask utilised by the DIMM. This allows the SHIMM to provide
an estimate of the seeing, unbiased by shot noise or scintillation effects. The
SHIMM also produces a low-resolution (three-layer) measure of the vertical
turbulence profile, as well as an estimate of the coherence timescale. The
SHIMM is designed as a low-cost, portable instrument. It is composed of
off-the-shelf components so that it is easy to duplicate and well-suited for
comparisons of atmospheric conditions within and between different observing
sites. Here, the SHIMM design and methodology for estimating key atmospheric
parameters will be presented, as well as initial field test results with
comparisons to the Stereo-SCIDAR instrument. | astro-ph_IM |
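The DIMM-style estimate the SHIMM builds on converts the variance of differential image motion into the Fried parameter r0, and hence the seeing angle. A hedged sketch using the standard Sarazin & Roddier longitudinal-variance coefficients (assumed textbook values, not numbers from this paper):

```python
import numpy as np

def dimm_longitudinal_variance(r0, D, d, wavelength):
    """Variance (rad^2) of longitudinal differential image motion for two
    apertures of diameter D separated by d, given Fried parameter r0
    (all lengths in metres; Sarazin & Roddier 1990 coefficients)."""
    K_l = 2.0 * (0.179 * D ** (-1.0 / 3.0) - 0.0968 * d ** (-1.0 / 3.0))
    return K_l * wavelength ** 2 * r0 ** (-5.0 / 3.0)

def r0_from_dimm_variance(sigma2_l, D, d, wavelength):
    """Invert the variance relation to recover r0 from a measured variance."""
    K_l = 2.0 * (0.179 * D ** (-1.0 / 3.0) - 0.0968 * d ** (-1.0 / 3.0))
    return (K_l * wavelength ** 2 / sigma2_l) ** (3.0 / 5.0)

def seeing_arcsec(r0, wavelength):
    """Seeing angle (FWHM) in arcseconds: 0.98 * lambda / r0."""
    return np.degrees(0.98 * wavelength / r0) * 3600.0
```

A Shack-Hartmann sensor supplies many such aperture pairs at once, which is what lets the SHIMM fit a low-resolution turbulence profile rather than a single integrated seeing value.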
Pan-STARRS Photometric and Astrometric Calibration: We present the details of the photometric and astrometric calibration of the
Pan-STARRS1 $3\pi$ Survey. The photometric goals were to reduce the systematic
effects introduced by the camera and detectors, and to place all of the
observations onto a photometric system with consistent zero points over the
entire area surveyed, the ~30,000 square degrees north of $\delta$ = -30
degrees. The astrometric calibration compensates for similar systematic effects
so that positions, proper motions, and parallaxes are reliable as well. The
Pan-STARRS Data Release 2 (DR2) astrometry is tied to the Gaia DR1 release. | astro-ph_IM |
The Photometric LSST Astronomical Time-series Classification Challenge
(PLAsTiCC): Selection of a performance metric for classification
probabilities balancing diverse science goals: Classification of transient and variable light curves is an essential step in
using astronomical observations to develop an understanding of their underlying
physical processes. However, upcoming deep photometric surveys, including the
Large Synoptic Survey Telescope (LSST), will produce a deluge of low
signal-to-noise data for which traditional labeling procedures are
inappropriate. Probabilistic classification is more appropriate for the data
but is incompatible with the traditional metrics used on deterministic
classifications. Furthermore, large survey collaborations intend to use these
classification probabilities for diverse science objectives, indicating a need
for a metric that balances a variety of goals. We describe the process used to
develop an optimal performance metric for an open classification challenge that
seeks probabilistic classifications and must serve many scientific interests.
The Photometric LSST Astronomical Time-series Classification Challenge
(PLAsTiCC) is an open competition aiming to identify promising techniques for
obtaining classification probabilities of transient and variable objects by
engaging a broader community both within and outside astronomy. Using mock
classification probability submissions emulating archetypes of those
anticipated of PLAsTiCC, we compare the sensitivity of metrics of
classification probabilities under various weighting schemes, finding that they
yield qualitatively consistent results. We choose as a metric for PLAsTiCC a
weighted modification of the cross-entropy because it can be meaningfully
interpreted. Finally, we propose extensions of our methodology to ever more
complex challenge goals and suggest some guiding principles for approaching the
choice of a metric of probabilistic classifications. | astro-ph_IM |
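The chosen metric, a weighted modification of the cross-entropy, can be sketched as a per-class average log-loss combined under user-chosen weights. This is a schematic of the idea, not the exact PLAsTiCC weighting scheme:

```python
import numpy as np

def weighted_log_loss(y_true, probs, weights=None, eps=1e-15):
    """Weighted cross-entropy: average -log p(true class) within each class,
    then combine the per-class losses under user-chosen weights."""
    probs = np.clip(probs, eps, 1.0)       # guard against log(0)
    n_classes = probs.shape[1]
    if weights is None:
        weights = np.ones(n_classes)
    per_class = np.array([-np.mean(np.log(probs[y_true == c, c]))
                          for c in range(n_classes)])
    return np.sum(weights * per_class) / np.sum(weights)
```

Averaging within each class first keeps rare classes from being swamped by common ones; the weights then encode the relative scientific value of classifying each class well.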
AstroInformatics: Recommendations for Global Cooperation: Policy Brief on "AstroInformatics, Recommendations for Global Collaboration",
distilled from panel discussions during S20 Policy Webinar on Astroinformatics
for Sustainable Development held on 6-7 July 2023.
The deliberations encompassed a wide array of topics, including broad
astroinformatics, sky surveys, large-scale international initiatives, global
data repositories, space-related data, regional and international collaborative
efforts, as well as workforce development within the field. These discussions
comprehensively addressed the current status, notable achievements, and the
manifold challenges that the field of astroinformatics currently confronts.
The G20 nations present a unique opportunity due to their abundant human and
technological capabilities, coupled with their widespread geographical
representation. Leveraging these strengths, significant strides can be made in
various domains. These include, but are not limited to, the advancement of STEM
education and workforce development, the promotion of equitable resource
utilization, and contributions to fields such as Earth Science and Climate
Science.
We present a concise overview, followed by specific recommendations that
pertain to both ground-based and space data initiatives. Our team remains
readily available to furnish further elaboration on any of these proposals as
required. Furthermore, we anticipate further engagement during the upcoming G20
presidencies in Brazil (2024) and South Africa (2025) to ensure the continued
discussion and realization of these objectives.
The policy webinar took place during the G20 presidency in India (2023).
Notes based on the seven panels will be separately published. | astro-ph_IM |
Searching for Extraterrestrial Intelligence with the Square Kilometre
Array: The vast collecting area of the Square Kilometre Array (SKA), harnessed by
sensitive receivers, flexible digital electronics and increased computational
capacity, could permit the most sensitive and exhaustive search for
technologically-produced radio emission from advanced extraterrestrial
intelligence (SETI) ever performed. For example, SKA1-MID will be capable of
detecting a source roughly analogous to terrestrial high-power radars (e.g. air
route surveillance or ballistic missile warning radars, with an equivalent
isotropic radiated power, EIRP, of ~10^17 erg sec^-1) at 10 pc in less than 15 minutes,
and with a modest four beam SETI observing system could, in one minute, search
every star in the primary beam out to ~100 pc for radio emission comparable to
that emitted by the Arecibo Planetary Radar (EIRP ~2 x 10^20 erg sec^-1). The
flexibility of the signal detection systems used for SETI searches with the SKA
will allow new algorithms to be employed that will provide sensitivity to a
much wider variety of signal types than previously searched for.
Here we discuss the astrobiological and astrophysical motivations for radio
SETI and describe how the technical capabilities of the SKA will explore the
radio SETI parameter space. We detail several conceivable SETI experimental
programs on all components of SKA1, including commensal, primary-user, targeted
and survey programs and project the enhancements to them possible with SKA2. We
also discuss target selection criteria for these programs, and in the case of
commensal observing, how the varied use cases of other primary observers can be
used to full advantage for SETI. | astro-ph_IM |
VISION: A Six-Telescope Fiber-Fed Visible Light Beam Combiner for the
Navy Precision Optical Interferometer: Visible-light long baseline interferometry holds the promise of advancing a
number of important applications in fundamental astronomy, including the direct
measurement of the angular diameters and oblateness of stars, and the direct
measurement of the orbits of binary and multiple star systems. To advance, the
field of visible-light interferometry requires development of instruments
capable of combining light from 15 baselines (6 telescopes) simultaneously. The
Visible Imaging System for Interferometric Observations at NPOI (VISION) is a
new visible light beam combiner for the Navy Precision Optical Interferometer
(NPOI) that uses single-mode fibers to coherently combine light from up to six
telescopes simultaneously with an image-plane combination scheme. It features a
photometric camera for calibrations and spatial filtering from single-mode
fibers with two Andor Ixon electron multiplying CCDs. This paper presents the
VISION system, results of laboratory tests, and results of commissioning on-sky
observations. A new set of corrections have been determined for the power
spectrum and bispectrum by taking into account non-Gaussian statistics and read
noise present in electron-multipying CCDs to enable measurement of visibilities
and closure phases in the VISION post-processing pipeline. The post-processing
pipeline has been verified via new on-sky observations of the O-type supergiant
binary $\zeta$ Orionis A, obtaining a flux ratio of $2.18\pm0.13$ mag with a
position angle of $223.9\pm1.0^{\circ}$ and separation $40.6\pm1.8$ mas over
570-750 nm, in good agreement with expectations from the previously published
orbit. | astro-ph_IM |
Errors, chaos and the collisionless limit: We simultaneously study the dynamics of the growth of errors and the question
of the faithfulness of simulations of $N$-body systems. The errors are
quantified through the numerical reversibility of small-$N$ spherical systems,
and by comparing fixed-timestep runs with different stepsizes. The errors add
randomly, before exponential divergence sets in, with exponentiation rate
virtually independent of $N$, but scale saturating as $\sim 1/\sqrt{N}$, in
line with theoretical estimates presented. In a third phase, the growth rate is
initially driven by multiplicative enhancement of errors, as in the exponential
stage. It is then qualitatively different for the phase space variables and
mean field conserved quantities (energy and momentum); for the former, the
errors grow systematically through phase mixing, for the latter they grow
diffusively. For energy, the $N$-variation of the `relaxation time' of error
growth follows the $N$-scaling of two-body relaxation. This is also true for
angular momentum in the fixed stepsize runs, although the associated error
threshold is higher and the relaxation time smaller. Due to shrinking
saturation scales, the information loss associated with the exponential
instability decreases with $N$ and the dynamical entropy vanishes at any finite
resolution as $N \rightarrow \infty$. A distribution function depending on the
integrals of motion in the smooth potential is decreasingly affected. In this
sense there is convergence to the collisionless limit, despite the persistence
of exponential instability on infinitesimal scales. Nevertheless, the slow
$N$-variation in its saturation points to the slowness of the convergence. | astro-ph_IM |
Expectations on the mass determination using astrometric microlensing by
Gaia: Context. Astrometric gravitational microlensing can be used to determine the
mass of a single star (the lens) with an accuracy of a few percent. To do so,
precise measurements of the angular separations between lens and background
star with an accuracy below 1 milli-arcsecond at different epochs are needed.
Hence only the most accurate instruments can be used. However, since the
timescale is in the order of months to years, the astrometric deflection might
be detected by Gaia, even though each star is only observed on a low cadence.
Aims. We want to show how accurately Gaia can determine the mass of the lensing
star. Methods. Using conservative assumptions based on the results of the
second Gaia Data release, we simulated the individual Gaia measurements for 501
predicted astrometric microlensing events during the Gaia era (2014.5 -
2026.5). For this purpose we use the astrometric parameters of Gaia DR2, as
well as an approximative mass based on the absolute G magnitude. By fitting the
motion of lens and source simultaneously we then reconstruct the 11 parameters
of the lensing event. For lenses passing by multiple background sources, we
also fit the motion of all background sources and the lens simultaneously.
Using a Monte-Carlo simulation we determine the achievable precision of the
mass determination. Results. We find that Gaia can detect the astrometric
deflection for 114 events. Further, for 13 events Gaia can determine the mass
of the lens with a precision better than 15% and for 13 + 21 = 34 events with a
precision of 30% or better. | astro-ph_IM |
Light curve completion and forecasting using fast and scalable Gaussian
processes (MuyGPs): Temporal variations of apparent magnitude, called light curves, are
observational statistics of interest captured by telescopes over long periods
of time. Light curves afford the exploration of Space Domain Awareness (SDA)
objectives such as object identification or pose estimation as latent variable
inference problems. Ground-based observations from commercial off the shelf
(COTS) cameras remain inexpensive compared to higher precision instruments,
however, limited sensor availability combined with noisier observations can
produce gappy time-series data that can be difficult to model. These external
factors confound the automated exploitation of light curves, which makes light
curve prediction and extrapolation a crucial problem for applications.
Traditionally, image or time-series completion problems have been approached
with diffusion-based or exemplar-based methods. More recently, Deep Neural
Networks (DNNs) have become the tool of choice due to their empirical success
at learning complex nonlinear embeddings. However, DNNs often require large
training data that are not necessarily available when looking at unique
features of a light curve of a single satellite.
In this paper, we present a novel approach to predicting missing and future
data points of light curves using Gaussian Processes (GPs). GPs are non-linear
probabilistic models that infer posterior distributions over functions and
naturally quantify uncertainty. However, the cubic scaling of GP inference and
training is a major barrier to their adoption in applications. In particular, a
single light curve can feature hundreds of thousands of observations, which is
well beyond the practical realization limits of a conventional GP on a single
machine. Consequently, we employ MuyGPs, a scalable framework for
hyperparameter estimation of GP models that uses nearest neighbors
sparsification and local cross-validation. MuyGPs... | astro-ph_IM |
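The nearest-neighbors sparsification behind MuyGPs can be illustrated with a toy local GP predictor: each query epoch is regressed on its k nearest training epochs only, replacing the cubic cost in the full dataset size with a cubic cost in k. This is an illustrative sketch, not the MuyGPs package API:

```python
import numpy as np

def rbf_kernel(t1, t2, length_scale=1.0):
    """Squared-exponential covariance between two sets of observation times."""
    d = t1[:, None] - t2[None, :]
    return np.exp(-0.5 * (d / length_scale) ** 2)

def local_gp_predict(t_train, y_train, t_query, k=10, noise=1e-4, length_scale=1.0):
    """Predict each query epoch from its k nearest training epochs only,
    so the per-point cost is O(k^3) instead of O(n^3) for the full GP."""
    mean = np.empty(len(t_query))
    for i, tq in enumerate(t_query):
        idx = np.argsort(np.abs(t_train - tq))[:k]          # k nearest neighbors
        K = rbf_kernel(t_train[idx], t_train[idx], length_scale) + noise * np.eye(k)
        kv = rbf_kernel(t_train[idx], np.array([tq]), length_scale)[:, 0]
        mean[i] = kv @ np.linalg.solve(K, y_train[idx])     # posterior mean
    return mean

# Toy light curve: recover values between sampled epochs
t = np.linspace(0.0, 2.0 * np.pi, 50)
y = np.sin(t)
pred = local_gp_predict(t, y, np.array([1.0, 2.0, 3.0]))
```

In MuyGPs the kernel hyperparameters are additionally tuned by local cross-validation over the same neighborhoods, which is what keeps training scalable as well.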
Radiative Cooling II: Effects of Density and Metallicity: This work follows Lykins et al. discussion of classic plasma cooling function
at low density and solar metallicity. Here we focus on how the cooling function
changes over a wide range of density (n_H < 10^12 cm^(-3)) and metallicity
(Z < 30 Z_sun). We find that high densities enhance the ionization of elements such as
hydrogen and helium until they reach local thermodynamic equilibrium. By charge
transfer, the metallicity changes the ionization of hydrogen when it is
partially ionized. We describe the total cooling function as a sum of four
parts: those due to H&He, the heavy elements, electron-electron bremsstrahlung
and grains. For the first 3 parts, we provide a low-density limit cooling
function, a density dependence function, and a metallicity dependence function.
These functions are given with numerical tables and analytical fit functions.
For grain cooling, we discuss only the ISM case. We then obtain a total cooling
function that depends on density, metallicity and temperature. As expected,
collisional de-excitation suppresses the heavy elements cooling. Finally, we
provide a function giving the electron fraction, which can be used to convert
the cooling function into a cooling rate. | astro-ph_IM |
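The decomposition described, with each component written as a low-density-limit cooling curve scaled by separable density and metallicity factors, can be assembled as below. The component shapes here are toy placeholders, not the paper's fitted functions:

```python
def total_cooling(T, n_H, Z, parts):
    """Total cooling function Lambda(T, n_H, Z): each component is a
    low-density-limit curve Lambda0(T) times a density factor f_n(n_H)
    and a metallicity factor f_Z(Z); the components are then summed."""
    return sum(L0(T) * f_n(n_H) * f_Z(Z) for (L0, f_n, f_Z) in parts.values())

# Toy component shapes (illustrative placeholders only)
parts = {
    "H_He":     (lambda T: 1e-23 * (T / 1e4) ** 0.5,
                 lambda n: 1.0 / (1.0 + n / 1e6),   # suppression toward LTE
                 lambda Z: 1.0),
    "metals":   (lambda T: 2e-23 * (T / 1e4) ** 0.3,
                 lambda n: 1.0 / (1.0 + n / 1e4),   # collisional de-excitation
                 lambda Z: Z),
    "ee_brems": (lambda T: 1e-27 * (T / 1e4) ** 0.5,
                 lambda n: 1.0,
                 lambda Z: 1.0),
}
```

The cooling rate per unit volume then follows by multiplying by the appropriate particle densities, using the electron-fraction function the paper provides to convert the cooling function into a cooling rate.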
The SKA and the Unknown Unknowns: As new scientists and engineers join the SKA project and as the pressures
come on to maintain costs within a chosen envelope it is worth restating and
updating the rationale for the 'Exploration of the Unknown' (EoU). Maintaining
an EoU philosophy will prove a vital ingredient for realizing the SKA's
discovery potential. Since people make the discoveries enabled by technology, a
further axis in capability parameter space, the 'human bandwidth', is emphasised.
Using the morphological approach pioneered by Zwicky, a currently unexploited
region of observational parameter space can be identified viz: time variable
spectral patterns on all spectral and angular scales, one interesting example
would be 'spectral transients'. We should be prepared to build up to 10 percent
less collecting area for a given overall budget in order to enhance the ways in
which SKA1 can be flexibly utilized. | astro-ph_IM |
Inviscid SPH: In smooth-particle hydrodynamics (SPH), artificial viscosity is necessary for
the correct treatment of shocks, but often generates unwanted dissipation away
from shocks. We present a novel method of controlling the amount of artificial
viscosity, which uses the total time derivative of the velocity divergence as
shock indicator and aims at completely eliminating viscosity away from shocks.
We subject the new scheme to numerous tests and find that the method works at
least as well as any previous technique in the strong-shock regime, but becomes
virtually inviscid away from shocks, while still maintaining particle order. In
particular sound waves or oscillations of gas spheres are hardly damped over
many periods. | astro-ph_IM |
RAFTER: Ring Astrometric Field Telescope for Exo-planets and Relativity: High precision astrometry aims at source position determination to a very
small fraction of the diffraction image size, in high SNR regime. One of the
key limitations to such goal is the optical response variation of the telescope
over a sizeable FOV, required to ensure bright reference objects to any
selected target. The issue translates into severe calibration constraints,
and/or the need for complex telescope and focal plane metrology. We propose an
innovative system approach derived from the established TMA telescope concept,
extended to achieve high filling factor of an annular field of view around the
optical axis of the telescope. The proposed design is a very compact, 1 m class
telescope compatible with modern CCD and CMOS detectors (EFL = 15 m). We
describe the concept implementation guidelines and the optical performance of
the current optical design. The diffraction limited FOV exceeds 1.25 square
degrees, and the detector occupies the best 0.25 square degree with 66 devices. | astro-ph_IM |
The Electromagnetic Characteristics of the Tianlai Cylindrical
Pathfinder Array: A great challenge for 21 cm intensity mapping experiments is the strong
foreground radiation which is orders of magnitude brighter than the 21cm
signal. Removal of the foreground takes advantage of the fact that its
frequency spectrum is smooth while the redshifted 21cm signal spectrum is
stochastic. However, a complication is the non-smoothness of the instrument
response. This paper describes the electromagnetic simulation of the Tianlai
cylinder array, a pathfinder for 21 cm intensity mapping experiments. Due to
the vast scales involved, a direct simulation requires large amount of
computing resources. We have made the simulation practical by using a
combination of methods: first simulate a single feed, then an array of feed
units, finally with the feed array and a cylindrical reflector together, to
obtain the response for a single cylinder. We studied its radiation pattern,
bandpass response and the effects of mutual coupling between feed units, and
compared the results with observation. Many features seen in the measurement
result are well reproduced in the simulation, especially the oscillatory
features which are associated with the standing waves on the reflector. The
mutual coupling between feed units is quantified with S-parameters, which
decrease as the distance between the two feeds increases. Based on the
simulated S-parameters, we estimate the correlated noise which has been seen in
the visibility data, the results show very good agreement with the data in both
magnitude and frequency structures. These results provide useful insights on
the problem of 21cm signal extraction for real instruments. | astro-ph_IM |
A fast 2D image reconstruction algorithm from 1D data for the Gaia
mission: A fast 2-dimensional image reconstruction method is presented, which takes as
input 1-dimensional data acquired from scans across a central source in
different orientations. The resultant reconstructed images do not show
artefacts due to non-uniform coverage in the orientations of the scans across
the central source, and are successful in avoiding a high background due to
contamination of the flux from the central source across the reconstructed
image. Due to the weighting scheme employed this method is also naturally
robust to hot pixels. This method was developed specifically with Gaia data in
mind, but should be useful in combining data with mismatched resolutions in
different directions. | astro-ph_IM |
Basic Survey Scheduling for the Wide Field Survey Telescope (WFST): Aiming at improving the survey efficiency of the Wide Field Survey Telescope,
we have developed a basic scheduling strategy that takes into account the
telescope characteristics, observing conditions, and weather conditions at the
Lenghu site. The sky area is divided into rectangular regions, referred to as
`tiles', with a size of 2.577 deg * 2.634 deg, slightly smaller than the focal
area of the mosaic CCDs. These tiles are continuously filled in annulars
parallel to the equator. The brightness of the sky background, which varies
with the moon phase and distance from the moon, plays a significant role in
determining the accessible survey fields. Approximately 50 connected tiles are
grouped into one block for observation. To optimize the survey schedule, we
perform simulations by taking into account the length of exposures, data
readout, telescope slewing, and all relevant observing conditions. We utilize
the Greedy Algorithm for scheduling optimization. Additionally, we propose a
dedicated dithering pattern to cover the gaps between CCDs and the four corners
of the mosaic CCD array, which are located outside of the 3 deg field of view.
This dithering pattern helps to achieve relatively uniform exposure maps for
the final survey outputs. | astro-ph_IM |
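The Greedy Algorithm step can be sketched as: repeatedly observe the not-yet-visited tile that maximises a merit score net of the slew cost from the current pointing. The scores and slew model below are toy values, not the WFST merit function:

```python
def greedy_schedule(tiles, score, slew_cost):
    """Greedy scheduler: repeatedly observe the not-yet-visited tile that
    maximises (scientific score - slew cost from the current pointing)."""
    schedule, remaining, current = [], set(tiles), None
    while remaining:
        best = max(remaining,
                   key=lambda t: score(t) - (slew_cost(current, t)
                                             if current is not None else 0.0))
        schedule.append(best)
        remaining.remove(best)
        current = best
    return schedule

# Toy example: 4 tiles with made-up scores and a slew cost proportional
# to tile separation
order = greedy_schedule(
    [0, 1, 2, 3],
    score=lambda t: {0: 1.0, 1: 5.0, 2: 4.0, 3: 0.0}[t],
    slew_cost=lambda a, b: 0.6 * abs(a - b),
)
```

In a real scheduler the score would fold in sky background, moon distance, airmass, and exposure plus readout overheads, recomputed at each step as conditions evolve.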
Baryon acoustic oscillations from Integrated Neutral Gas Observations:
Broadband corrugated horn construction and testing: The Baryon acoustic oscillations from Integrated Neutral Gas Observations
(BINGO) telescope is a 40-m~class radio telescope under construction that has
been designed to measure the large-angular-scale intensity of HI emission at
980--1260 MHz and hence to constrain dark energy parameters. A large focal
plane array comprising 1.7-metre diameter, 4.3-metre length corrugated feed
horns is required in order to optimally illuminate the telescope. Additionally,
very clean beams with low sidelobes across a broad frequency range are
required, in order to facilitate the separation of the faint HI emission from
bright Galactic foreground emission. Using novel construction methods, a
full-sized prototype horn has been assembled. It has an average insertion loss
of around 0.15 dB across the band, with a return loss around -25 dB. The main
beam is Gaussian with the first sidelobe at around $-25$ dB. A septum polariser
to separate the signal into the two hands of circular polarization has also
been designed, built and tested. | astro-ph_IM |
Hi-fi phenomenological description of eclipsing binary light variations
as the basis for their period analysis: In-depth analysis of eclipsing binary (EB) observational data collected for
several decades can inform us about a lot of astrophysically interesting
processes taking place in the systems. We have developed a wide-ranging method
for the phenomenological modelling of eclipsing binary phase curves that
enables us to combine even very disparate sources of phase information. This
approach is appropriate for the processing of both standard photometric series
of eclipses and data from photometric surveys of all kind. We conclude that
mid-eclipse times, determined using the latest version of our 'hi-fi'
phenomenological light curve models, as well as their accuracy, are nearly the
same as the values obtained using much more complex standard physical EB
models. | astro-ph_IM |
Calibration database for the Murchison Widefield Array All-Sky Virtual
Observatory: We present a calibration component for the Murchison Widefield Array All-Sky
Virtual Observatory (MWA ASVO) utilising a newly developed PostgreSQL database
of calibration solutions. Since its inauguration in 2013, the MWA has recorded
over thirty-four petabytes of data archived at the Pawsey Supercomputing
Centre. According to the MWA Data Access policy, data become publicly available
eighteen months after collection. Therefore, most of the archival data are now
available to the public. Access to public data was provided in 2017 via the MWA
ASVO interface, which allowed researchers worldwide to download MWA
uncalibrated data in standard radio astronomy data formats (CASA measurement
sets or UV FITS files). The addition of the MWA ASVO calibration feature opens
a new, powerful avenue for researchers without a detailed knowledge of the MWA
telescope and data processing to download calibrated visibility data and create
images using standard radio-astronomy software packages. In order to populate
the database with calibration solutions from the last six years we developed
fully automated pipelines. A near-real-time pipeline has been used to process
new calibration observations as soon as they are collected and upload
calibration solutions to the database, which enables monitoring of the
interferometric performance of the telescope. Based on this database we present
an analysis of the stability of the MWA calibration solutions over long time
intervals. | astro-ph_IM |
Polarimetric characterization of segmented mirrors: We study the impact of the loss of axial symmetry around the optical axis on
the polarimetric properties of a telescope with segmented primary mirror when
each segment is present in a different aging stage. The different oxidation
stage of each segment as they are substituted in time leads to non-negligible
crosstalk terms. This effect is wavelength dependent and it is mainly
determined by the properties of the reflecting material. For an aluminum
coating, the worst polarimetric behavior due to oxidation is found for the blue
part of the visible. Contrarily, dust -- as modeled in this work -- does not
significantly change the polarimetric behavior of the optical system.
Depending on the telescope, there might be segment substitution sequences that
strongly attenuate this instrumental polarization. | astro-ph_IM |
Simulation and Analysis Chain for Acoustic Ultra-high Energy Neutrino
Detectors in Water: Acousticneutrinodetectionisapromisingapproachforlarge-scaleultra-highenergyneutrinodetectorsinwater.In
this article, a Monte Carlo simulation chain for acoustic neutrino detection
devices in water will be presented. The simulation chain covers the generation
of the acoustic pulse produced by a neutrino interaction and its propagation to
the sensors within the detector. Currently, ambient and transient noise models
for the Mediterranean Sea and simulations of the data acquisition hardware,
equivalent to the one used in ANTARES/AMADEUS, are implemented. A pre-selection
scheme for neutrino-like signals based on matched filtering is employed, as it
is used for on-line filtering. To simulate the whole processing chain for
experimental data, signal classification and acoustic source reconstruction
algorithms are integrated in an analysis chain. An overview of design and
capabilities of the simulation and analysis chain will be presented and
preliminary studies will be discussed. | astro-ph_IM |
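The matched-filter pre-selection of neutrino-like signals can be sketched as correlating the data stream against a zero-mean, unit-energy template and flagging samples above a trigger threshold. This is an illustrative sketch, not the ANTARES/AMADEUS on-line filter:

```python
import numpy as np

def matched_filter(data, template):
    """Correlate the data stream against a zero-mean, unit-energy template."""
    t = template - template.mean()
    t = t / np.sqrt(np.sum(t ** 2))
    return np.correlate(data, t, mode="valid")

def preselect(data, template, threshold):
    """Indices where the filter output exceeds the trigger threshold."""
    out = matched_filter(data, template)
    return np.flatnonzero(out > threshold)

# Toy bipolar pulse buried in an otherwise empty trace
template = np.sin(np.linspace(0.0, 2.0 * np.pi, 16))
data = np.zeros(100)
data[30:46] += template
mf = matched_filter(data, template)
```

Normalising the template makes the output directly comparable across pulse shapes, so one threshold can serve several candidate signal classes.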
A new method of CCD dark current correction via extracting the dark
information from scientific images: We have developed a new method to correct dark current at relatively high
temperatures for Charge-Coupled Device (CCD) images when dark frames cannot be
obtained on the telescope. For images taken with the Antarctic Survey
Telescopes (AST3) in 2012, due to the low cooling efficiency, the median CCD
temperature was -46$^\circ$C, resulting in a high dark current level of about
3$e^-$/pix/sec, even comparable to the sky brightness (10$e^-$/pix/sec). If not
corrected, the nonuniformity of the dark current could even outweigh the
photon noise of the sky background. However, dark frames could not be obtained
during the observing season because the camera was operated in frame-transfer
mode without a shutter, and the telescope was unattended in winter. Here we
present an alternative, but simple and effective method to derive the dark
current frame from the scientific images. Then we can scale this dark frame to
the temperature at which the scientific images were taken, and apply the dark
frame corrections to the scientific images. We have applied this method to the
AST3 data, and demonstrated that it can reduce the noise to a level roughly as
low as the photon noise of the sky brightness, solving the high noise problem
and improving the photometric precision. This method will also be helpful for
other projects that suffer from similar issues. | astro-ph_IM |
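The correction step, scaling a dark frame derived at one temperature to the science-frame temperature, can be sketched with the common rule of thumb that silicon dark current doubles every few degrees. The doubling interval below is an assumed illustrative value, not the AST3 calibration:

```python
import numpy as np

def scale_dark(dark_rate, t_dark, t_sci, doubling_temp=6.0):
    """Scale a dark-current frame (e-/pix/s) from the temperature at which it
    was derived to the science-frame temperature, assuming the dark current
    doubles every `doubling_temp` degrees C (an assumed rule of thumb)."""
    return dark_rate * 2.0 ** ((t_sci - t_dark) / doubling_temp)

def correct_science(science, dark_rate, t_dark, t_sci, exptime):
    """Subtract the temperature- and exposure-scaled dark signal."""
    return science - scale_dark(dark_rate, t_dark, t_sci) * exptime

# Example: a flat 1 e-/pix/s dark derived at -52 C, applied to a -46 C frame
dark = np.ones((2, 2))
corrected = correct_science(np.full((2, 2), 100.0), dark, -52.0, -46.0, exptime=10.0)
```

The key point of the paper's method is that `dark_rate` itself is extracted from the science images, so this scaling can be applied even when no shutter-closed dark frames exist.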
Exploiting the geomagnetic distortion of the inclined atmospheric
showers: We propose a novel approach for the determination of the nature of ultra-high
energy cosmic rays by exploiting the geomagnetic deviation of muons in nearly
horizontal showers. The distribution of the muons at ground level is well
described by a simple parametrization providing a few shape parameters tightly
correlated to $X^\mu_\mathrm{max}$, the depth of maximal muon production, which
is a mass indicator tightly correlated to the usual parameter $X_\mathrm{max}$,
the depth of maximal development of the shower. We show that some constraints
can be set on the predictions of hadronic models, especially by combining the
geomagnetic distortion with standard measurement of the longitudinal profile.
We discuss the precision needed to obtain significant results and we propose a
schematic layout of a detector. | astro-ph_IM |
Astrometric and photometric standard candidates for the upcoming 4-m
ILMT survey: The International Liquid Mirror Telescope (ILMT) is a 4-meter class survey
telescope that has recently achieved first light and is expected to swing into
full operations by 1st January 2023. It scans the sky in a fixed 22' wide strip
centered at the declination of $+29^\circ21'41''$ and works in Time Delay
Integration (TDI) mode. We present a full catalog of sources in the ILMT strip
that can serve as astrometric calibrators. The characteristics of the sources
for astrometric calibration are extracted from Gaia EDR3 as it provides a very
precise measurement of astrometric properties such as RA ($\alpha$), Dec
($\delta$), parallax ($\pi$), and proper motions ($\mu_{\alpha^{*}}$ &
$\mu_{\delta}$). We have crossmatched the Gaia EDR3 with SDSS DR17 and
PanSTARRS-1 (PS1) and supplemented the catalog with apparent magnitudes of
these sources in g, r, and i filters. We also present a catalog of
spectroscopically confirmed white dwarfs with SDSS magnitudes that may serve as
photometric calibrators. The catalogs generated are stored in a SQLite database
for query-based access. We also report the offsets in equatorial positions
compared to Gaia for an astrometrically calibrated TDI frame observed with the
ILMT. | astro-ph_IM |
The AstroSat Observatory: AstroSat is India's first Ultra-violet (UV) and X-ray astronomy observatory
in space. The satellite was launched by the Indian Space Research Organisation
on a Polar Satellite Launch Vehicle on 28 September 2015 from Sriharikota Range
north of Chennai on the eastern coast of India. AstroSat carries five
scientific instruments and one auxiliary instrument. Four of these consist of
co-aligned telescopes and detectors mounted on a common deck of the satellite
to observe stars and galaxies simultaneously in the near- and far-UV
wavelengths and a broad range of X-ray energies (0.3 to 80 keV). The fifth
instrument consists of three X-ray detectors and is mounted on a rotating
platform on a side that is oriented 90 degrees with respect to the other
instruments to scan the sky for X-ray transients. An auxiliary instrument
monitors the charged particle environment in the path of the satellite. | astro-ph_IM |
Recovering simulated planet and disk signals using SCALES aperture
masking: The Slicer Combined with Array of Lenslets for Exoplanet Spectroscopy
(SCALES) instrument is a lenslet-based integral field spectrograph that will
operate at 2 to 5 microns, imaging and characterizing colder (and thus older)
planets than current high-contrast instruments. Its spatial resolution for
distant science targets and/or close-in disks and companions could be improved
via interferometric techniques such as sparse aperture masking. We introduce a
nascent Python package, NRM-artist, that we use to design several SCALES masks
to be non-redundant and to have uniform coverage in Fourier space. We generate
high-fidelity mock SCALES data using the scalessim package for SCALES' low
spectral resolution modes across its 2 to 5 micron bandpass. We include
realistic noise from astrophysical and instrument sources, including Keck
adaptive optics and Poisson noise. We inject planet and disk signals into the
mock datasets and subsequently recover them to test the performance of SCALES
sparse aperture masking and to determine the sensitivity of various mask
designs to different science signals. | astro-ph_IM |
Spatial intensity interferometry on three bright stars: The present article reports on the first spatial intensity interferometry
measurements on stars since the observations at Narrabri Observatory by Hanbury
Brown et al. in the 1970s. Taking advantage of the progress in recent years
on photon-counting detectors and fast electronics, we were able to measure the
zero-time delay intensity correlation $g^{(2)}(\tau = 0, r)$ between the light
collected by two 1-m optical telescopes separated by 15 m. Using two marginally
resolved stars ($\alpha$ Lyr and $\beta$ Ori) with R magnitudes of 0.01 and
0.13 respectively, we demonstrate that 4-hour correlation exposures provide
reliable visibilities, whilst a significant loss of contrast is found on
$\alpha$ Aur, in agreement with its binary-star nature. | astro-ph_IM |
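The zero-delay correlation $g^{(2)}(\tau=0)$ quoted above reduces to a normalized product of the two intensity streams. A minimal sketch on synthetic photon counts, not the experiment's actual pipeline:

```python
import numpy as np

def g2_zero_delay(i1, i2):
    """Zero-time-delay intensity correlation g2(0) between two intensity
    (photon-count) time streams from separate telescopes. A basic
    normalized-product estimator."""
    i1 = np.asarray(i1, dtype=float)
    i2 = np.asarray(i2, dtype=float)
    return np.mean(i1 * i2) / (np.mean(i1) * np.mean(i2))

# Independent shot-noise streams give g2(0) ~ 1; correlated (bunched)
# light observed within a coherence cell gives g2(0) > 1.
rng = np.random.default_rng(0)
a = rng.poisson(100, size=100_000)
b = rng.poisson(100, size=100_000)
print(round(g2_zero_delay(a, b), 3))  # ~1.0 for independent streams
```

In practice the excess correlation above 1 is tiny and scales with the squared visibility, which is why hours of integration are needed.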
Overview of lunar detection of ultra-high energy particles and new plans
for the SKA: The lunar technique is a method for maximising the collection area for
ultra-high-energy (UHE) cosmic ray and neutrino searches. The method uses
either ground-based radio telescopes or lunar orbiters to search for Askaryan
emission from particles cascading near the lunar surface. While experiments
using the technique have made important advances in the detection of
nanosecond-scale pulses, only at the very highest energies has the lunar
technique achieved competitive limits. This is expected to change with the
advent of the Square Kilometre Array (SKA), the low-frequency component of
which (SKA-low) is predicted to be able to detect an unprecedented number of
UHE cosmic rays.
In this contribution, the status of lunar particle detection is reviewed,
with particular attention paid to outstanding theoretical questions, and the
technical challenges of using a giant radio array to search for nanosecond
pulses. The activities of SKA's High Energy Cosmic Particles Focus Group are
described, as is a roadmap by which this group plans to incorporate this
detection mode into SKA-low observations. Estimates for the sensitivity of
SKA-low phases 1 and 2 to UHE particles are given, along with the achievable
science goals with each stage. Prospects for near-future observations with
other instruments are also described. | astro-ph_IM |
High-resolution wide-band Fast Fourier Transform spectrometers: We describe the performance of our latest generations of sensitive wide-band
high-resolution digital Fast Fourier Transform Spectrometer (FFTS). Their
design, optimized for a wide range of radio astronomical applications, is
presented. Developed for operation with the GREAT far infrared heterodyne
spectrometer on-board SOFIA, the eXtended bandwidth FFTS (XFFTS) offers a high
instantaneous bandwidth of 2.5 GHz with 88.5 kHz spectral resolution and has
been in routine operation during SOFIA's Basic Science since July 2011. We
discuss the advanced field programmable gate array (FPGA) signal processing
pipeline, with an optimized multi-tap polyphase filter bank algorithm that
provides a nearly loss-less time-to-frequency data conversion with
significantly reduced frequency scallop and fast sidelobe fall-off. Our digital
spectrometers have been proven to be extremely reliable and robust, even under
the harsh environmental conditions of an airborne observatory, with
Allan-variance stability times of several 1000 seconds. An enhancement of the
present 2.5 GHz XFFTS will double the number of spectral channels (to 64k),
offering spectroscopy with even better resolution during Cycle 1 observations. | astro-ph_IM |
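The multi-tap polyphase filter bank at the heart of such an FFTS can be sketched in a few lines; the prototype filter (windowed sinc) and tap count below are illustrative textbook choices, not the XFFTS firmware parameters:

```python
import numpy as np

def pfb_channelize(x, n_chan, n_taps=4):
    """Critically sampled polyphase filter bank: each output spectrum uses
    n_taps * n_chan input samples weighted by a prototype low-pass filter,
    which suppresses spectral leakage and frequency scalloping relative to
    a plain windowed FFT."""
    L = n_taps * n_chan
    # Prototype filter: sinc matched to the channel width, Hamming-tapered.
    h = np.sinc(np.arange(L) / n_chan - n_taps / 2) * np.hamming(L)
    n_spectra = (len(x) - L) // n_chan + 1
    spectra = []
    for k in range(n_spectra):
        seg = x[k * n_chan : k * n_chan + L] * h
        # Sum the taps of each polyphase branch, then transform.
        summed = seg.reshape(n_taps, n_chan).sum(axis=0)
        spectra.append(np.abs(np.fft.rfft(summed)) ** 2)
    return np.array(spectra)

# A tone centred on a channel concentrates its power there.
fs, n_chan = 1024, 32              # 32 Hz channel width
t = np.arange(8192) / fs
x = np.cos(2 * np.pi * 128 * t)    # tone at 128 Hz = channel 4
spec = pfb_channelize(x, n_chan).mean(axis=0)
print(int(np.argmax(spec)))        # → 4
```

The tap-sum before the FFT is exactly the trick that gives the "nearly loss-less time-to-frequency conversion" with fast sidelobe fall-off.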
The ADS All-Sky Survey: The ADS All-Sky Survey (ADSASS) is an ongoing effort aimed at turning the
NASA Astrophysics Data System (ADS), widely known for its unrivaled value as a
literature resource for astronomers, into a data resource. The ADS is not a
data repository per se, but it implicitly contains valuable holdings of
astronomical data, in the form of images, tables and object references
contained within articles. The objective of the ADSASS effort is to extract
these data and make them discoverable and available through existing data
viewers. The resulting ADSASS data layer promises to greatly enhance workflows
and enable new research by tying astronomical literature and data assets into
one resource. | astro-ph_IM |
Modern middleware for the data acquisition of the Cherenkov Telescope
Array: The data acquisition system (DAQ) of the future Cherenkov Telescope Array
(CTA) must be efficient, modular and robust to be able to cope with the very
large data rate of up to 550 Gbps coming from many telescopes with different
characteristics. The use of modern middleware, namely ZeroMQ and Protocol
Buffers, can help to achieve these goals while keeping the development effort
to a reasonable level. Protocol Buffers are used as an on-line data format,
while ZeroMQ is employed to communicate between processes. The DAQ will be
controlled and monitored by the Alma Common Software (ACS). Protocol Buffers
from Google are a way to define high-level data structures through an
interface description language (IDL) and a meta-compiler. ZeroMQ is a middleware
that augments the capabilities of TCP/IP sockets. It does not implement very
high-level features like those found in CORBA for example, but makes use of
sockets easier, more robust and almost as effective as raw TCP. The use of
these two middlewares enabled us to rapidly develop a robust prototype of the
DAQ including data persistence to compressed FITS files. | astro-ph_IM |
Geostationary Antenna for Disturbance-Free Laser Interferometry (GADFLI): We present a mission concept, the Geostationary Antenna for Disturbance-Free
Laser Interferometry (GADFLI), for a space-based gravitational-wave
interferometer consisting of three satellites in geostationary orbit around the
Earth. Compared to the nominal design of the Laser Interferometer Space Antenna
(LISA), this concept has the advantages of significantly decreased requirements
on the telescope size and laser power, decreased launch mass, substantially
improved shot noise resulting from the shorter 73000 km armlengths, simplified
and less expensive communications, and an overall lower cost which we (roughly)
estimate at $1.2B. GADFLI preserves much of the science of LISA, particularly
the observation of massive black-hole binary coalescences, although the SNR is
diminished for all masses in the potential designs we consider. | astro-ph_IM |
Betelgeuse scope: Single-mode-fibers-assisted optical interferometer
design for dedicated stellar activity monitoring: Betelgeuse has gone through a sudden shift in its brightness and dimmed
mysteriously. This is likely caused by a hot blob of plasma ejected from
Betelgeuse and then cooled to obscuring dust. If true, it is a remarkable
opportunity to directly witness the formation of dust around a red supergiant
star. Today's optical telescope facilities are not optimized for time-evolution
monitoring of the Betelgeuse surface, so in this work, we propose a low-cost
optical interferometer. The facility will consist of $12 \times 4$ inch optical
telescopes mounted on the surface of a large radio dish for interferometric
imaging; polarization-maintaining single-mode fibers will carry the coherent
beams from the individual optical telescopes to an all-in-one beam combiner. A
fast steering mirror assisted fiber injection system guides the flux into
fibers. A metrology system senses vibration-induced piston errors in optical
fibers, and these errors are corrected using fast-steering delay lines. We will
present the design. | astro-ph_IM |
STARFORGE: Toward a comprehensive numerical model of star cluster
formation and feedback: We present STARFORGE (STAR FORmation in Gaseous Environments): a new
numerical framework for 3D radiation MHD simulations of star formation that
simultaneously follow the formation, accretion, evolution, and dynamics of
individual stars in massive giant molecular clouds (GMCs) while accounting for
stellar feedback, including jets, radiative heating and momentum, stellar
winds, and supernovae. We use the GIZMO code with the MFM mesh-free Lagrangian
MHD method, augmented with new algorithms for gravity, timestepping, sink
particle formation and accretion, stellar dynamics, and feedback coupling. We
survey a wide range of numerical parameters/prescriptions for sink formation
and accretion and find very small variations in star formation history and the
IMF (except for intentionally-unphysical variations). Modules for
mass-injecting feedback (winds, SNe, and jets) inject new gas elements
on-the-fly, eliminating the lack of resolution in diffuse feedback cavities
otherwise inherent in Lagrangian methods. The treatment of radiation uses
GIZMO's radiative transfer solver to track 5 frequency bands (IR, optical, NUV,
FUV, ionizing), coupling direct stellar emission and dust emission with gas
heating and radiation pressure terms. We demonstrate accurate solutions for
SNe, winds, and radiation in problems with known similarity solutions, and show
that our jet module is robust to resolution and numerical details, and agrees
well with previous AMR simulations. STARFORGE can scale up to massive ($>10^5
M_\odot $) GMCs on current supercomputers while predicting the stellar
($\gtrsim 0.1 M_\odot$) range of the IMF, permitting simulations of both high-
and low-mass cluster formation in a wide range of conditions. | astro-ph_IM |
The Simons Observatory: A fully remote controlled calibration system
with a sparse wire grid for cosmic microwave background telescopes: For cosmic microwave background (CMB) polarization observations, calibration
of detector polarization angles is essential. We have developed a fully remote
controlled calibration system with a sparse wire grid that reflects linearly
polarized light along the wire direction. The new feature is a
remote-controlled system for regular calibration, which has not been possible
in sparse wire grid calibrators in past experiments. The remote control can be
achieved by two electric linear actuators that load or unload the sparse wire
grid into a position centered on the optical axis of a telescope between the
calibration time and CMB observation. Furthermore, the sparse wire grid can be
rotated by a motor. A rotary encoder and a gravity sensor are installed on the
sparse wire grid to monitor the wire direction. They allow us to achieve
detector angle calibration with expected systematic error of $0.08^{\circ}$.
The calibration system will be installed in small-aperture telescopes at Simons
Observatory. | astro-ph_IM |
Ground Layer Adaptive Optics: PSF effects on ELT scales: To a certain extent, the behavior of the Adaptive Optics correction for
Extremely Large Telescopes scales with diameter. But in Ground Layer
Adaptive Optics, the combined effect of a large Field of View and the large
overlap of Guide Star pupil footprints at high atmospheric altitude introduces
severe changes in the behavior of the correction, returning a very different
distribution of the energy when going from the known 8-10 meter class to 100 m
diameters. In this paper we identify the reasons for, and the nature of, these
different behaviors. | astro-ph_IM |
The Generalized Spectral Kurtosis Estimator: Due to its conceptual simplicity and its proven effectiveness in real-time
detection and removal of radio frequency interference (RFI) from radio
astronomy data, the Spectral Kurtosis (SK) estimator is likely to become a
standard tool of a new generation of radio telescopes. However, the SK
estimator in its original form must be developed from instantaneous power
spectral density (PSD) estimates, and hence cannot be employed as an RFI
excision tool downstream of the data pipeline in existing instruments where any
time averaging is performed. In this letter, we develop a generalized estimator
with wider applicability for both instantaneous and averaged spectral data,
which extends its practical use to a much larger pool of radio instruments. | astro-ph_IM |
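The generalized estimator reduces to a simple moment ratio over M accumulations of N-averaged power estimates, with shape factor d; a sketch of the published form (expectation 1 for Gaussian noise):

```python
import numpy as np

def generalized_sk(p, N=1, d=1.0):
    """Generalized spectral-kurtosis estimator (after Nita & Gary).
    p : M power estimates for one channel, each already an average of N
        raw spectra; d is the statistical shape factor (1 for the PSD of
        complex Gaussian noise). Expectation is 1 for pure noise."""
    p = np.asarray(p, dtype=float)
    M = p.size
    s1 = p.sum()
    s2 = (p ** 2).sum()
    return ((M * N * d + 1) / (M - 1)) * (M * s2 / s1 ** 2 - 1)

rng = np.random.default_rng(1)
# Gaussian noise -> exponentially distributed instantaneous PSD;
# average N of them to emulate a time-averaged spectrometer output.
M, N = 1024, 16
p = rng.exponential(1.0, size=(M, N)).mean(axis=1)
print(round(generalized_sk(p, N=N), 2))  # ~1.0 for noise
```

RFI-contaminated channels deviate from 1 and can be flagged with a simple threshold, which is what makes the estimator attractive for real-time excision.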
An Information Theory Approach on Deciding Spectroscopic Follow Ups: Classification and characterization of variable phenomena and transient
phenomena are critical for astrophysics and cosmology. These objects are
commonly studied using photometric time series or spectroscopic data. Given
that many ongoing and future surveys are in time-domain and given that adding
spectra provide further insights but requires more observational resources, it
would be valuable to know which objects should we prioritize to have spectrum
in addition to time series. We propose a methodology in a probabilistic setting
that determines a-priory which objects are worth taking spectrum to obtain
better insights, where we focus 'insight' as the type of the object
(classification). Objects for which we query its spectrum are reclassified
using their full spectrum information. We first train two classifiers, one that
uses photometric data and another that uses photometric and spectroscopic data
together. Then for each photometric object we estimate the probability of each
possible spectrum outcome. We combine these models in various probabilistic
frameworks (strategies) which are used to guide the selection of follow up
observations. The best strategy depends on the intended use, whether it is
getting more confidence or accuracy. For a given number of candidate objects
(127, equal to 5% of the dataset) for taking spectra, we improve class
prediction accuracy by 37%, as opposed to the 20% of the best non-naive
(non-random) baseline strategy. Our approach provides a general framework for follow-up
strategies and can be extended beyond classification and to include other forms
of follow-ups beyond spectroscopy. | astro-ph_IM |
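One simple instance of such a strategy is to rank candidates by the entropy of their photometric class posterior, since the most uncertain classifications gain the most from a spectrum. This is an illustrative stand-in; the paper combines several probabilistic frameworks:

```python
import numpy as np

def follow_up_ranking(p_photo):
    """Rank objects for spectroscopic follow-up by the Shannon entropy of
    each row of class posteriors (one row per object): highest-entropy,
    i.e. most ambiguous, objects first."""
    p = np.clip(np.asarray(p_photo, dtype=float), 1e-12, 1.0)
    entropy = -(p * np.log(p)).sum(axis=1)
    return np.argsort(entropy)[::-1]   # most uncertain objects first

# Three objects with two-class posteriors: the 50/50 object ranks first.
posteriors = np.array([[0.9, 0.1], [0.5, 0.5], [0.7, 0.3]])
print(follow_up_ranking(posteriors))   # → [1 2 0]
```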
Robust dimensionality reduction for interferometric imaging of Cygnus A: Extremely high data rates expected in next-generation radio interferometers
necessitate a fast and robust way to process measurements in a big data
context. Dimensionality reduction can alleviate computational load needed to
process these data, in terms of both computing speed and memory usage. In this
article, we present image reconstruction results from highly reduced
radio-interferometric data, following our previously proposed data
dimensionality reduction method, $\mathrm{R}_{\mathrm{sing}}$, based on
studying the distribution of the singular values of the measurement operator.
This method comprises a simple weighted, subsampled discrete Fourier transform
of the dirty image. Additionally, we show that an alternative gridding-based
reduction method works well for target data sizes of the same order as the
image size. We reconstruct images from well-calibrated VLA data to showcase the
robustness of our proposed method down to very low data sizes in a 'real data'
setting. We show through comparisons with the conventional reduction method of
time- and frequency-averaging, that our proposed method produces more accurate
reconstructions while reducing data size much further, and is particularly
robust when data sizes are aggressively reduced to low fractions of the image
size. $\mathrm{R}_{\mathrm{sing}}$ can function in a block-wise fashion, and
could be used in the future to process incoming data by blocks in real-time,
thus opening up the possibility of performing 'on-line' imaging as the data are
being acquired. MATLAB code for the proposed dimensionality reduction method is
available on GitHub. | astro-ph_IM |
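The core operation, a weighted subsampled Fourier transform of the dirty image, can be sketched as below. The weight-based cell selection is an illustrative stand-in for the paper's singular-value analysis of the measurement operator, and all names are ours:

```python
import numpy as np

def reduce_dirty_image(dirty_image, op_weights, keep_fraction=0.05):
    """Reduce gridded data to a small vector: Fourier-transform the dirty
    image, keep only the cells where the measurement operator carries the
    largest weights (standing in for its largest singular values), and
    whiten by those weights. Sketch of an R_sing-style reduction."""
    F = np.fft.fft2(dirty_image).ravel()
    w = np.asarray(op_weights, dtype=float).ravel()
    n_keep = max(1, int(keep_fraction * F.size))
    idx = np.argsort(w)[-n_keep:]           # best-conditioned Fourier cells
    return F[idx] / np.sqrt(w[idx]), idx    # reduced, noise-whitened data

rng = np.random.default_rng(3)
img = rng.normal(size=(64, 64))             # toy dirty image
weights = rng.random((64, 64)) + 0.1        # toy gridded sampling density
reduced, idx = reduce_dirty_image(img, weights)
print(reduced.size, img.size)               # 204 coefficients from 4096 pixels
```

Because the selection and whitening are fixed by the operator, the same reduction can be applied block-by-block as data arrive, which is what enables the 'on-line' imaging mentioned above.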
Tidal Accelerometry: Exploring the Cosmos Via Gravitational Correlations: Newtonian gravitation is non-radiative but is extremely pervasive and
penetrates equally into every medium because it cannot be shielded.
Extraterrestrial gravity is responsible for Earth's trajectory. However, its
correlation, or geodesic deviation, is manifested as semi-diurnal and diurnal
tides. Tidal signals, A(t) are temporal modulations in the field differential
which can be observed in a wide variety of natural and laboratory situations.
A(t) is a quasi-static, low frequency signal which arises from the relative
changes in positions of the detector and source and is not part of the
electromagnetic spectrum. Isaac Newton was the first to recognize the
importance of tides in astrometry and attempted to estimate the lunar mass from
ocean tides. Through a case study we show how the systematics of the gravitational
correlation can be used for calibration and de-trending which can significantly
increase the confidence level of high precision experiments. A(t) can also be
used to determine the distribution of celestial masses independently of the
"1-2-3" law. Guided by modern advances in gravity wave detectors we argue that
it is important to develop high-precision accelerometry. With a resolution of
about a nanometer it will be possible to determine solar system masses and detect
the SMBH at the center of our galaxy. Observations of the gravitational
correlation can potentially open up yet to be explored vistas of the cosmos. | astro-ph_IM |
The miniJPAS Survey: A Study on Wavelength Dependence of the Photon
Response Non-uniformity of the JPAS-{\it Pathfinder} Camera: Understanding the origins of small-scale flats of CCDs and their
wavelength-dependent variations plays an important role in high-precision
photometric, astrometric, and shape measurements of astronomical objects. Based
on the unique flat data of 47 narrow-band filters provided by JPAS-{\it
Pathfinder}, we analyze the variations of small-scale flats as a function of
wavelength. We find moderate variations (from about $1.0\%$ at 390 nm to
$0.3\%$ at 890 nm) of small-scale flats among different filters, increasing
towards shorter wavelengths. Small-scale flats of two filters close in central
wavelengths are strongly correlated. We then use a simple physical model to
reproduce the observed variations to a precision of about $\pm 0.14\%$, by
considering the variations of charge collection efficiencies, effective areas
and thicknesses between CCD pixels. We find that the wavelength-dependent
variations of small-scale flats of the JPAS-{\it Pathfinder} camera originate
from inhomogeneities of the quantum efficiency (particularly charge collection
efficiency) as well as the effective area and thickness of CCD pixels. The
former dominates the variations in short wavelengths while the latter two
dominate at longer wavelengths. The effects on proper flat-fielding as well as
on photometric/flux calibrations for photometric/slit-less spectroscopic
surveys are discussed, particularly in blue filters/wavelengths. We also find
that different model parameters are sensitive to flats of different
wavelengths, depending on the relations between the electron absorption depth,
the photon absorption length and the CCD thickness. In order to model the
wavelength-dependent variations of small-scale flats, a small number (around
ten) of small-scale flats with well-selected wavelengths are sufficient to
reconstruct small-scale flats in other wavelengths. | astro-ph_IM |
Forward Global Photometric Calibration of the Dark Energy Survey: Many scientific goals for the Dark Energy Survey (DES) require calibration of
optical/NIR broadband $b = grizY$ photometry that is stable in time and uniform
over the celestial sky to one percent or better. It is also necessary to limit
to similar accuracy systematic uncertainty in the calibrated broadband
magnitudes due to uncertainty in the spectrum of the source. Here we present a
"Forward Global Calibration Method (FGCM)" for photometric calibration of the
DES, and we present results of its application to the first three years of the
survey (Y3A1). The FGCM combines data taken with auxiliary instrumentation at
the observatory with data from the broad-band survey imaging itself and models
of the instrument and atmosphere to estimate the spatial- and time-dependence
of the passbands of individual DES survey exposures. "Standard" passbands are
chosen that are typical of the passbands encountered during the survey. The
passband of any individual observation is combined with an estimate of the
source spectral shape to yield a magnitude $m_b^{\mathrm{std}}$ in the standard
system. This "chromatic correction" to the standard system is necessary to
achieve sub-percent calibrations. The FGCM achieves reproducible and stable
photometric calibration of standard magnitudes $m_b^{\mathrm{std}}$ of stellar
sources over the multi-year Y3A1 data sample with residual random calibration
errors of $\sigma=5-6\,\mathrm{mmag}$ per exposure. The accuracy of the
calibration is uniform across the $5000\,\mathrm{deg}^2$ DES footprint to
within $\sigma=7\,\mathrm{mmag}$. The systematic uncertainties of magnitudes in
the standard system due to the spectra of sources are less than
$5\,\mathrm{mmag}$ for main sequence stars with $0.5<g-i<3.0$. | astro-ph_IM |
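The chromatic correction amounts to synthetic photometry of the same source through the observed and the standard passband, relative to a reference spectrum. A toy version in the photon-counting convention (the passbands and source spectrum below are invented for illustration):

```python
import numpy as np

def chromatic_correction(wl, f_source, s_obs, s_std):
    """Magnitude correction taking an observed magnitude to the standard
    passband for a source of known spectral shape, relative to a flat
    (constant f_lambda) reference. Toy version of an FGCM-style
    'chromatic correction'."""
    def rate(f, s):
        # Photon detection rate through passband s; the wavelength grid is
        # uniform, so the d-lambda factor cancels in the ratios below.
        return np.sum(f * s * wl)
    f_ref = np.ones_like(wl)
    return -2.5 * np.log10(
        (rate(f_source, s_std) / rate(f_source, s_obs))
        * (rate(f_ref, s_obs) / rate(f_ref, s_std))
    )

# A red source through a slightly shifted copy of the standard passband.
wl = np.linspace(400.0, 550.0, 500)                # nm
s_std = np.exp(-0.5 * ((wl - 475.0) / 20.0) ** 2)  # standard passband
s_obs = np.exp(-0.5 * ((wl - 478.0) / 20.0) ** 2)  # this exposure's passband
f_red = (wl / 475.0) ** 2                          # red source spectrum
print(round(chromatic_correction(wl, f_red, s_obs, s_std), 4))
```

By construction the correction vanishes for the flat reference spectrum; a few nm of passband shift acting on a sloped spectrum produces a correction of order 10 mmag, which is why the chromatic term matters for sub-percent calibration.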
Cosmic-CoNN: A Cosmic Ray Detection Deep-Learning Framework, Dataset,
and Toolkit: Rejecting cosmic rays (CRs) is essential for the scientific interpretation of
CCD-captured data, but detecting CRs in single-exposure images has remained
challenging. Conventional CR detectors require experimental parameter tuning
for different instruments, and recent deep learning methods only produce
instrument-specific models that suffer from performance loss on telescopes not
included in the training data. We present Cosmic-CoNN, a generic CR detector
deployed for 24 telescopes at the Las Cumbres Observatory, which is made
possible by the three contributions in this work: 1) We build a large and
diverse ground-based CR dataset leveraging thousands of images from a global
telescope network. 2) We propose a novel loss function and a neural network
optimized for telescope imaging data to train generic CR detection models. At
95% recall, our model achieves a precision of 93.70% on Las Cumbres imaging
data and maintains a consistent performance on new ground-based instruments
never used for training. Specifically, the Cosmic-CoNN model trained on the Las
Cumbres CR dataset maintains high precisions of 92.03% and 96.69% on Gemini
GMOS-N/S 1x1 and 2x2 binning images, respectively. 3) We build a suite of tools
including an interactive CR mask visualization and editing interface, console
commands, and Python APIs to make automatic, robust CR detection widely
accessible by the community of astronomers. Our dataset, open-source codebase,
and trained models are available at https://github.com/cy-xu/cosmic-conn. | astro-ph_IM |
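The headline numbers (precision at 95% recall) come from sweeping the detector's pixel-wise probability threshold; a sketch of that evaluation on synthetic scores (the toy data below are not from the Cosmic-CoNN model):

```python
import numpy as np

def precision_at_recall(prob, truth, recall=0.95):
    """Precision of a pixel-wise detector at a fixed recall operating
    point: lower the score threshold until `recall` of the true CR pixels
    are recovered, then report the precision there."""
    prob = np.asarray(prob).ravel()
    truth = np.asarray(truth).ravel().astype(bool)
    order = np.argsort(prob)[::-1]                   # pixels by score, descending
    hits = np.cumsum(truth[order])                   # true positives among top-k
    k = np.searchsorted(hits, recall * truth.sum())  # smallest cut reaching recall
    return hits[k] / (k + 1)

rng = np.random.default_rng(2)
truth = rng.random(10_000) < 0.01                    # ~1% CR pixels
prob = np.clip(truth * 0.8 + rng.normal(0, 0.1, truth.size) + 0.1, 0, 1)
print(round(float(precision_at_recall(prob, truth)), 3))
```

Fixing the recall rather than the threshold makes the comparison fair across instruments with different score calibrations, which is the point of the cross-telescope benchmark above.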
Interoperable geographically distributed astronomical infrastructures:
technical solutions: The increase of astronomical data produced by a new generation of
observational tools poses the need to distribute data and to bring computation
close to the data. Trying to answer this need, we set up a federated data and
computing infrastructure involving an international cloud facility, EGI
federated, and a set of services implementing IVOA standards and
recommendations for authentication, data sharing and resource access. In this
paper we describe the technical problems faced; specifically, we show the design,
technological and architectural solutions adopted. We depict our technological
overall solution to bring data close to computation resources. Besides the
adopted solutions, we propose some points for an open discussion on
authentication and authorization mechanisms. | astro-ph_IM |
Ultra High Molecular Weight Polyethylene: optical features at millimeter
wavelengths: The next generation of experiments for the measurement of the Cosmic
Microwave Background (CMB) requires more and more the use of advanced
materials, with specific physical and structural properties. An example is the
material used for receiver's cryostat windows and internal lenses. The large
throughput of current CMB experiments requires a large diameter (of the order
of 0.5m) of these parts, resulting in heavy structural and optical requirements
on the material to be used. Ultra High Molecular Weight (UHMW) polyethylene
(PE) features high tensile strength and good transmissivity in the
frequency range of interest. In this paper, we discuss the possibility of using
UHMW PE for windows and lenses in experiments working at millimeter
wavelengths, by measuring its optical properties: emissivity, transmission and
refraction index. Our measurements show that the material is well suited to
this purpose. | astro-ph_IM |
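The measured quantities (index, loss) map directly onto window transmission through a standard slab model. A back-of-the-envelope sketch with indicative UHMW PE numbers (n ~ 1.52 and a loss tangent of a few 1e-4 at mm waves are typical literature values, not the paper's measurements):

```python
import math

def slab_transmission(n, tan_delta, d_m, freq_hz):
    """Power transmission of a flat dielectric slab in the incoherent
    limit (averaging over Fabry-Perot fringes), from the refractive
    index n and loss tangent."""
    c = 2.998e8
    R = ((n - 1.0) / (n + 1.0)) ** 2                     # single-surface reflectance
    alpha = 2.0 * math.pi * n * freq_hz * tan_delta / c  # power absorption coeff. [1/m]
    a = math.exp(-alpha * d_m)                           # single-pass absorption factor
    return (1.0 - R) ** 2 * a / (1.0 - (R * a) ** 2)

# A 2 cm thick window at 150 GHz.
print(round(slab_transmission(1.52, 3e-4, 0.02, 150e9), 3))
```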
EZ: A Tool for Automatic Redshift Measurement: We present EZ (Easy redshift), a tool we have developed within the VVDS
project to help in redshift measurement from optical spectra. EZ has been
designed with large spectroscopic surveys in mind, and in its development
particular care has been given to the reliability of the results obtained in an
automatic and unsupervised mode. Nevertheless, the possibility of running it
interactively has been preserved, and a graphical user interface for results
inspection has been designed. EZ has been successfully used within the VVDS
project, as well as the zCosmos one. In this paper we describe its architecture
and the algorithms used, and evaluate its performances both on simulated and
real data. EZ is an open source program, freely downloadable from
http://cosmos.iasf-milano.inaf.it/pandora. | astro-ph_IM |
Toward a large bandwidth photonic correlator for infrared heterodyne
interferometry: Infrared heterodyne interferometry has been proposed as a practical
alternative for recombining a large number of telescopes over kilometric
baselines in the mid-infrared. However, the current limited correlation
capacities impose strong restrictions on the sensitivity of this appealing
technique. In this paper, we propose to address the problem of transport and
correlation of wide-bandwidth signals over kilometric distances by introducing
photonic processing in infrared heterodyne interferometry. We describe the
architecture of a photonic double-sideband correlator for two telescopes, along
with the experimental demonstration of this concept on a proof-of-principle
test bed. We demonstrate the \textit{a posteriori} correlation of two infrared
signals previously generated on a two-telescope simulator in a double-sideband
photonic correlator. A degradation of the signal-to-noise ratio of $13\%$,
equivalent to a noise factor $\text{NF}=1.15$, is obtained through the
correlator, and the temporal coherence properties of our input signals are
retrieved from these measurements. Our results demonstrate that photonic
processing can be used to correlate heterodyne signals with a potentially large
increase of detection bandwidth. These developments open the way to photonic
processing of wide bandwidth signals for mid-infrared heterodyne
interferometry, in particular for a large number of telescopes and for direct
imager recombiners. | astro-ph_IM |
A novel method for the absolute energy calibration of large-scale
cosmic-ray detectors using radio emission of extensive air showers: Ultra-high energy cosmic rays impinging onto the atmosphere induce huge
cascades of secondary particles. The measurement of the energy radiated by
these air showers in form of radio waves enables an accurate measurement of the
cosmic-ray energy. Compared to the well-established fluorescence technique, the
radio measurements are less dependent on atmospheric conditions and thus
potentially reduce the systematic uncertainty in the cosmic-ray energy
measurement significantly. Two attractive aspects are that the atmosphere is
transparent to MHz radio waves and the radio emission can be calculated from
first-principles using classical electrodynamics. This method will be discussed
for the Engineering Radio Array (AERA) of the Pierre Auger Cosmic-Ray
Observatory. AERA detects radio emission from extensive air showers with
energies beyond $10^{17}~$eV in the 30 - 80 MHz frequency band and consists of
more than 150 autonomous radio stations covering an area of about 17$~$km$^2$.
It is located at the same site as the Auger low-energy detector extensions
enabling combinations with various other measurement techniques. | astro-ph_IM |
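The first-principles quantity behind the method is the energy fluence of the radio pulse at each station: the time-integrated Poynting flux, $f = \epsilon_0 c \int |E(t)|^2\,dt$, conventionally quoted in eV/m$^2$. A minimal sketch (the pulse parameters are invented for illustration):

```python
import numpy as np

def energy_fluence(efield, dt):
    """Energy fluence (eV/m^2) of a sampled electric-field trace E(t) in
    V/m with sample spacing dt in seconds: the time-integrated Poynting
    flux eps0 * c * sum(E^2) * dt, converted from joules to eV."""
    eps0, c, e = 8.854e-12, 2.998e8, 1.602e-19
    return eps0 * c * np.sum(np.asarray(efield) ** 2) * dt / e

# A 100 uV/m Gaussian pulse of ~20 ns width, sampled at 1 ns.
t = np.arange(-200e-9, 200e-9, 1e-9)
E = 100e-6 * np.exp(-0.5 * (t / 20e-9) ** 2)
print(f"{energy_fluence(E, 1e-9):.2f} eV/m^2")
```

Integrating the fluence footprint over the ground then yields the total radiated energy, the calorimetric observable that makes the calibration largely independent of atmospheric conditions.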
Fiber modal noise mitigation by a rotating double scrambler: Fiber modal noise is a performance limiting factor in high-resolution
spectroscopy, both with respect to achieving high signal-to-noise ratios or
when targeting high-precision radial velocity measurements, with multi-mode
fiber-fed high-resolution spectrographs. Traditionally, modal noise is reduced
by agitating or "shaking" the fiber. This way, the light propagating in the
fiber is redistributed over many different modes. However, in the case of fibers
with only a limited number of modes, e.g. at near-infrared wavelengths or in
adaptive-optics assisted systems, this method becomes very inefficient. The
strong agitation that would be needed stresses the fiber and could lead to
focal ratio degradation, or worse, to damaging the fiber. As an alternative
approach, we propose to make use of a classic optical double scrambler, a
device that is already implemented in many high-precision radial-velocity
spectrographs, to mitigate the effect of modal noise by rotating the
scrambler's first fiber end during each exposure. Because of the rotating
illumination pattern of the scrambler's second fiber, the modes that are
excited vary continuously. This leads to very efficient averaging of the modal
pattern at the fiber exit and to a strong reduction of modal noise. In this
contribution, we present a prototype design and preliminary laboratory results
of the rotating double scrambler. | astro-ph_IM |
Point-spread function ramifications and deconvolution of a signal
dependent blur kernel due to interpixel capacitive coupling: Interpixel capacitance (IPC) is a deterministic electronic coupling that
results in a portion of the collected signal incident on one pixel of a
hybridized detector array being measured in adjacent pixels. Data collected by
light sensitive HgCdTe arrays which exhibit this coupling typically goes
uncorrected or is corrected by treating the coupling as a fixed point spread
function. Evidence suggests that this IPC coupling is not uniform across
different signal and background levels. This variation invalidates assumptions
that are key in decoupling techniques such as Wiener Filtering or application
of the Lucy- Richardson algorithm. Additionally, the variable IPC results in
the point spread function (PSF) depending upon a star's signal level relative
to the background level, among other parameters. With an IPC ranging from 0.68%
to 1.45% over the full well depth of a sensor, as is a reasonable range for the
H2RG arrays, the FWHM of JWST's NIRCam F405N band is degraded from 2.080 pix
(0".132), as expected from the diffraction pattern, to 2.186 pix (0".142) when the
star is just breaching the sensitivity limit of the system. For example, when
attempting to use a fixed PSF fitting (e.g. assuming the PSF observed from a
bright star in the field) to untangle two sources with a flux ratio of 4:1 and
a center to center distance of 3 pixels, flux estimation can be off by upwards
of 1.5% with a separation error of 50 millipixels. To deal with this issue an
iterative non-stationary method for deconvolution is here proposed,
implemented, and evaluated that can account for the signal dependent nature of
IPC. | astro-ph_IM |
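A minimal forward model of the signal-dependent coupling described above: a nearest-neighbour kernel whose coupling fraction varies with pixel signal. The linear dependence (and its direction) is an illustrative assumption; only the 0.68-1.45% range comes from the text:

```python
import numpy as np

def apply_ipc(image, alpha_lo=0.0068, alpha_hi=0.0145, full_well=1e5):
    """Apply a signal-dependent nearest-neighbour IPC kernel: each pixel
    shares a fraction alpha (interpolated across the 0.68%-1.45% range
    with signal level) with each of its four neighbours. Charge is
    conserved; np.roll wraps at the edges, fine for this toy frame."""
    frac = np.clip(image / full_well, 0.0, 1.0)
    alpha = alpha_lo + (alpha_hi - alpha_lo) * frac   # per-pixel coupling
    leaked = alpha * image
    out = image - 4.0 * leaked                        # charge kept in the pixel
    for shift, axis in [(1, 0), (-1, 0), (1, 1), (-1, 1)]:
        out += np.roll(leaked, shift, axis=axis)      # charge gained from neighbours
    return out

img = np.zeros((7, 7))
img[3, 3] = 5e4                                      # a 'star' at half well depth
obs = apply_ipc(img)
print(round(obs[3, 4] / obs[3, 3], 4))               # neighbour/centre ratio
```

Because alpha depends on the local signal, the effective PSF of a faint star differs from that of a bright one, which is exactly why a fixed-kernel deconvolution (Wiener, Lucy-Richardson) breaks down.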
The Physics of the Accelerating Universe Camera: The PAU (Physics of the Accelerating Universe) Survey goal is to obtain
photometric redshifts (photo-z) and Spectral Energy Distribution (SED) of
astronomical objects with a resolution roughly one order of magnitude better
than current broad band photometric surveys. To accomplish this, a new large
field of view camera (PAUCam) has been designed, built, commissioned and is now
operated at the William Herschel Telescope (WHT). With the current WHT Prime
Focus corrector, the camera covers ~1-degree diameter Field of View (FoV), of
which, only the inner ~40 arcmin diameter are unvignetted. The focal plane
consists of a mosaic of 18 2k x 4k Hamamatsu fully depleted CCDs, with high
quantum efficiency up to 1 micrometer in wavelength. To maximize the detector
coverage within the FoV, filters are placed in front of the CCDs inside the
camera cryostat (made out of carbon fiber) using a challenging movable tray
system. The camera uses a set of 40 narrow band filters ranging from ~4500 to
~8500 Angstroms complemented with six standard broad-band filters, ugrizY. The
PAU Survey aims to cover roughly 100 square degrees over fields with existing
deep photometry and galaxy shapes to obtain accurate photometric redshifts for
galaxies down to i_AB~22.5, detecting also galaxies down to i_AB~24 with less
precision in redshift. With this data set we will be able to measure intrinsic
alignments, galaxy clustering and perform galaxy evolution studies in a new
range of densities and redshifts. Here, we describe the PAU camera, its first
commissioning results and performance. | astro-ph_IM |
The KISS experiment: Mapping millimetre continuum emission has become a key issue in modern
multi-wavelength astrophysics. In particular, spectrum-imaging at low frequency
resolution is an asset for characterizing the clusters of galaxies via the
Sunyaev Zeldovich (SZ) effect. In this context, we have built a ground-based
spectrum-imager named KIDs Interferometer Spectrum Survey (KISS). This
instrument is based on two 316-pixel arrays of Kinetic Inductance Detectors
(KID) cooled to 150 mK by a custom dilution refrigerator-based cryostat. By
using Ti-Al and Al absorbers, we can cover a wide frequency range between 80
and 300 GHz. In order to preserve a large instantaneous Field of View (FoV) of
1 degree, the spectrometer is based on a Fourier Transform interferometer. This
represents a technological challenge due to the fast scanning speed that is
needed to overcome the effects of background atmospheric fluctuations. KISS has
been installed at the QUIJOTE 2.25 m telescope in Tenerife since February 2019 and
is currently in its commissioning phase. In this proceeding we present an
overview of the instrument and the latest results. | astro-ph_IM |
Investigation of infrasound noise background at Mátra Gravitational
and Geophysical Laboratory (MGGL): Infrasonic and seismic waves are supposed to be the main contributors to the
gravity-gradient noise (Newtonian noise) of the third generation subterranean
gravitational-wave detectors. This noise will limit the sensitivity of the
instrument at frequencies below 20 Hz. Investigation of its origin and the
possible methods of mitigation have top priority during the designing period of
the detectors. Therefore long-term site characterizing measurements are needed
at several subterranean sites. However, at some sites, mining activities can
occur. These activities can cause sudden changes (transients) in the measured
signal, and increase the continuous background noise, too. We have developed a
new algorithm based on discrete Haar transform to find these transients in the
infrasound signal. We found that eliminating the transients decreases the
variation of the noise spectra, and hence results in a more accurate
characterization of the background noise. We also carried out experiments for
controlling the continuous noise. Machines operating at the mine were turned on
and off systematically in order to see their effect on the noise spectra. These
experiments showed that the main contributor of the continuous noise is the
ventilation system of the mine. | astro-ph_IM |
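The transient-finding idea described above can be sketched with one level of the discrete Haar transform: large detail coefficients mark sudden jumps, and a robust threshold separates them from the continuous background. The MAD-based threshold and the factor `k` are assumptions for illustration; the paper's actual algorithm and thresholding are not specified in the abstract:

```python
import numpy as np

def haar_transients(signal, k=5.0):
    """Flag transient samples via one level of the discrete Haar transform.

    detail[i] = (x[2i] - x[2i+1]) / sqrt(2) is large where the signal jumps;
    a median-absolute-deviation (MAD) threshold flags outlying pairs.
    """
    x = np.asarray(signal, dtype=float)
    n = len(x) - len(x) % 2           # truncate to an even length
    detail = (x[0:n:2] - x[1:n:2]) / np.sqrt(2.0)
    mad = np.median(np.abs(detail - np.median(detail)))
    sigma = 1.4826 * mad              # MAD -> Gaussian-equivalent sigma
    flags = np.abs(detail) > k * sigma
    mask = np.zeros(len(x), dtype=bool)
    mask[0:n:2] = flags
    mask[1:n:2] = flags               # flag both samples of the pair
    return mask
```

Removing the flagged samples before spectral estimation is what reduces the variation of the noise spectra.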
Removing Internal Reflections from Deep Imaging Datasets: We present a means of characterizing and removing internal reflections
between the CCD and other optical surfaces in an astronomical camera. The
stellar reflections appear as out-of-focus images and are not necessarily
axisymmetric about the star. Using long exposures of very bright stars as
calibration images we are able to measure the position, size, and intensity of
reflections as a function of their position on the field. We also measure the
extended stellar point-spread function out to one degree. Together this
information can be used to create an empirical model of the excess light from
bright stars and reduce systematic artifacts in deep surface photometry. We
then reduce a set of deep observations of the Virgo cluster with our method to
demonstrate its efficacy and to provide a comparison with other strategies for
removing scattered light. | astro-ph_IM |
Performance of the ARIANNA Hexagonal Radio Array: Installation of the ARIANNA Hexagonal Radio Array (HRA) on the Ross Ice Shelf
of Antarctica has been completed. This detector serves as a pilot program to
the ARIANNA neutrino telescope, which aims to measure the diffuse flux of very
high energy neutrinos by observing the radio pulse generated by
neutrino-induced charged particle showers in the ice. All HRA stations ran
reliably and took data during the entire 2014-2015 austral summer season. A new
radio signal direction reconstruction procedure is described, and is observed
to have a resolution better than a degree. The reconstruction is used in a
preliminary search for potential neutrino candidate events in the data from one
of the newly installed detector stations. Three cuts are used to separate radio
backgrounds from neutrino signals. The cuts are found to filter out all data
recorded by the station during the season while preserving 85.4% of simulated
neutrino events that trigger the station. This efficiency is similar to that
found in analyses of previous HRA data taking seasons. | astro-ph_IM |
The Largest Russian Optical Telescope BTA: Current Status and
Modernization Prospects: The Russian 6-m telescope (BTA), once the largest telescope in the world and
now the largest optical telescope in Russia, has been successfully operating
for almost 45 years. In this paper we briefly overview the observing methods
the facility can currently provide, the ongoing projects on the development of
scientific equipment, the status of the telescope among the world's and Russian
astronomical communities, our ambitions to attract new users, and the prospects
the observatory wishes to realize in the near future. | astro-ph_IM |
Data reduction for the MATISSE instrument: We present in this paper the general formalism and data processing steps used
in the MATISSE data reduction software, as it has been developed by the MATISSE
consortium. The MATISSE instrument is the mid-infrared new generation
interferometric instrument of the Very Large Telescope Interferometer (VLTI).
It is a 2-in-1 instrument with 2 cryostats and 2 detectors: one 2k x 2k
Rockwell Hawaii 2RG detector for L&M-bands, and one 1k x 1k Raytheon Aquarius
detector for N-band, both read at high framerates, up to 30 frames per second.
MATISSE is undergoing its first tests in laboratory today. | astro-ph_IM |
Ideas for Citizen Science in Astronomy: We review the relatively new, internet-enabled, and rapidly-evolving field of
citizen science, focusing on research projects in stellar, extragalactic and
solar system astronomy that have benefited from the participation of members of
the public, often in large numbers. We find these volunteers making
contributions to astronomy in a variety of ways: making and analyzing new
observations, visually classifying features in images and light curves,
exploring models constrained by astronomical datasets, and initiating new
scientific enquiries. The most productive citizen astronomy projects involve
close collaboration between the professionals and amateurs involved, and occupy
scientific niches not easily filled by great observatories or machine learning
methods: citizen astronomers are most strongly motivated by being of service to
science. In the coming years we expect participation and productivity in
citizen astronomy to increase, as survey datasets get larger and citizen
science platforms become more efficient. Opportunities include engaging the
public in ever more advanced analyses, and facilitating citizen-led enquiry by
designing professional user interfaces and analysis tools with citizens in
mind. | astro-ph_IM |
Spectrograph design for the Asgard/BIFROST spectro-interferometric
instrument for the VLTI: The BIFROST instrument will be the first VLTI instrument optimised for high
spectral resolution up to R=25,000 and operate between 1.05 and 1.7 $\mu$m. A
key component of the instrument will be the spectrograph, where we require a
high throughput over a broad bandwidth. In this contribution, we discuss the
four planned spectral modes (R=50, R=1000, R=5000, and R=25,000), the key
spectral windows that we need to cover, and the technology choices that we have
considered. We present our plan to use Volume Phase Holographic Gratings
(VPHGs) to achieve a high efficiency $>$ 85%. We present our preliminary
optical design and our strategies for wavelength calibration. | astro-ph_IM |
Prediction on detection and characterization of Galactic disk
microlensing events by LSST: The upcoming LSST survey gives an unprecedented
opportunity for studying populations of intrinsically faint objects using the
microlensing technique. Its large field of view and aperture allow effective
time-series observations of many stars in the Galactic disk and bulge. Here, we
combine Galactic models (for |b|<10
deg) and simulations of LSST observations to study how different observing
strategies affect the number and properties of microlensing events detected by
LSST. We predict that LSST will mostly observe long duration microlensing
events due to the source stars with the averaged magnitude around 22 in r-band,
rather than high-magnification events due to fainter source stars. In Galactic
bulge fields, LSST should detect on the order of 400 microlensing events per
square degree as compared to 15 in disk fields. Improving the cadence increases
the number of detectable microlensing events, e.g., improving the cadence from
6 to 2 days approximately doubles the number of microlensing events throughout
the Galaxy. According to the current LSST strategy, it will observe some fields
900 times during a 10-year survey with an average cadence of ~4 days (I) and
other fields (mostly toward the Galactic disk) around 180 times during a 1-year
survey only, with an average 1-day cadence (II). We anticipate that the numbers
of events corresponding to these strategies are 7,900 and 34,000, respectively.
Toward similar lines of sight, LSST with the first observing strategy (I) will
detect more and on average longer microlensing events than those observable
with the second strategy. If LSST spends enough time observing near Galactic
plane, then the large number of microlensing events will allow studying
Galactic distribution of planets and finding isolated black holes among wealth
of other science cases. | astro-ph_IM |
Evaluating the efficacy of sonification for signal detection in
univariate, evenly sampled light curves using astronify: Sonification is the technique of representing data with sound, with potential
applications in astronomy research for aiding discovery and accessibility.
Several astronomy-focused sonification tools have been developed; however,
efficacy testing is extremely limited. We performed testing of astronify, a
prototype tool for sonification functionality within the Barbara A. Mikulski
Archive for Space Telescopes (MAST). We created synthetic light curves
containing zero, one, or two transit-like signals with a range of
signal-to-noise ratios (SNRs=3-100) and applied the default mapping of
brightness to pitch. We performed remote testing, asking participants to count
signals when presented with light curves as a sonification, visual plot, or
combination of both. We obtained 192 responses, of which 118 self-classified as
experts in astronomy and data analysis. For high SNRs (=30 and 100), experts
and non-experts performed well with sonified data (85-100% successful signal
counting). At low SNRs (=3 and 5) both groups were consistent with guessing
with sonifications. At medium SNRs (=7 and 10), experts performed no better
than non-experts with sonifications but significantly better (factor of ~2-3)
with visuals. We infer that sonification training, like that experienced by
experts for visual data inspection, will be important if this sonification
method is to be useful for moderate SNR signal detection within astronomical
archives and broader research. Nonetheless, we show that even a very simple,
and non-optimised, sonification approach allows users to identify high SNR
signals. A more optimised approach, for which we present ideas, would likely
yield higher success for lower SNR signals. | astro-ph_IM |
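The default brightness-to-pitch mapping tested above can be sketched as follows. The 220-880 Hz range and the linear-in-log-frequency mapping are illustrative assumptions, not astronify's actual defaults:

```python
import numpy as np

def flux_to_pitch(flux, f_min=220.0, f_max=880.0):
    """Map brightness values of a light curve to pitch frequencies.

    Interpolating in log-frequency makes equal flux steps sound like
    equal musical intervals (an assumed design choice).
    """
    flux = np.asarray(flux, dtype=float)
    span = max(float(flux.max() - flux.min()), 1e-12)
    norm = (flux - flux.min()) / span
    return f_min * (f_max / f_min) ** norm

def synthesize(freqs, tone_dur=0.1, rate=44100):
    """Render one short sine tone per data point into a single waveform."""
    t = np.arange(int(tone_dur * rate)) / rate
    return np.concatenate([np.sin(2 * np.pi * f * t) for f in freqs])
```

A transit then sounds as a brief dip in pitch against an otherwise steady tone, which is the signal participants were asked to count.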
A high precision technique to correct for residual atmospheric
dispersion in high-contrast imaging systems: Direct detection and spectroscopy of exoplanets requires high contrast
imaging. For habitable exoplanets in particular, located at small angular
separation from the host star, it is crucial to employ small inner working
angle (IWA) coronagraphs that efficiently suppress starlight. These
coronagraphs, in turn, require careful control of the wavefront which directly
impacts their performance. For ground-based telescopes, atmospheric refraction
is also an important factor, since it results in a smearing of the PSF, that
can no longer be efficiently suppressed by the coronagraph. Traditionally,
atmospheric refraction is compensated for by an atmospheric dispersion
compensator (ADC). ADC control relies on an a priori model of the atmosphere
whose parameters are solely based on the pointing of the telescope, which can
result in imperfect compensation. For a high contrast instrument like the
Subaru Coronagraphic Extreme Adaptive Optics (SCExAO) system, which employs
very small IWA coronagraphs, refraction-induced smearing of the PSF has to be
less than 1 mas in the science band for optimum performance. In this paper, we
present the first on-sky measurement and correction of residual atmospheric
dispersion. Atmospheric dispersion is measured from the science image directly,
using an adaptive grid of artificially introduced speckles as a diagnostic to
feedback to the telescope's ADC. With our current setup, we were able to reduce
the initial residual atmospheric dispersion from 18.8 mas to 4.2 mas in broadband
light (y- to H-band), and to 1.4 mas in H-band only. This work is particularly
relevant to the upcoming extremely large telescopes (ELTs) that will require
fine control of their ADC to reach their full high contrast imaging potential. | astro-ph_IM |
On the coherence loss in phase-referenced VLBI observations: Context: Phase referencing is a standard calibration technique in radio
interferometry, particularly suited for the detection of weak sources close to
the sensitivity limits of the interferometers. However, effects from a changing
atmosphere and inaccuracies in the correlator model may affect the
phase-referenced images, leading to wrong estimates of source flux densities
and positions. A systematic observational study of signal decoherence in phase
referencing, and its effects in the image plane, has not been performed yet.
Aims: We systematically studied how the signal coherence in
Very-Long-Baseline-Interferometry (VLBI) observations is affected by a
phase-reference calibration at different frequencies and for different
calibrator-to-target separations. The results obtained should be of interest
for a correct interpretation of many phase-referenced observations with VLBI.
Methods: We observed a set of 13 strong sources (the S5 polar cap sample) at
8.4 and 15 GHz in phase-reference mode, with 32 different calibrator/target
combinations spanning angular separations between 1.5 and 20.5 degrees. We
obtained phase-referenced images and studied how the dynamic range and peak
flux density depend on observing frequency and source separation.
Results: We obtained dynamic ranges and peak flux densities of the
phase-referenced images as a function of frequency and separation from the
calibrator. We compared our results with models and phenomenological equations
previously reported.
Conclusions: The dynamic range of the phase-referenced images is strongly
limited by the atmosphere at all frequencies and for all source separations.
The limiting dynamic range is inversely proportional to the sine of the
calibrator-to-target separation. We also find that the peak flux densities,
relative to those obtained with the self-calibrated images, decrease with
source separation. | astro-ph_IM |
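The reported scaling, a limiting dynamic range inversely proportional to the sine of the calibrator-to-target separation, can be written as a one-line model. The reference values anchoring the curve are placeholders; the paper's fitted constants are not quoted in the abstract:

```python
import numpy as np

def limiting_dynamic_range(sep_deg, dr_ref, sep_ref_deg):
    """Limiting dynamic range scaling as 1/sin(separation).

    dr_ref is the dynamic range measured at sep_ref_deg; both are
    hypothetical anchor values, not figures from the paper.
    """
    return dr_ref * np.sin(np.radians(sep_ref_deg)) / np.sin(np.radians(sep_deg))
```

Under this model, moving the calibrator from 1.5 to 20.5 degrees away degrades the achievable dynamic range by roughly an order of magnitude.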
Optimization of an Optical Testbed for Characterization of EXCLAIM
u-Spec Integrated Spectrometers: We describe a testbed to characterize the optical response of compact
superconducting on-chip spectrometers in development for the Experiment for
Cryogenic Large-Aperture Intensity Mapping (EXCLAIM) mission. EXCLAIM is a
balloonborne far-infrared experiment to probe the CO and CII emission lines in
galaxies from redshift 3.5 to the present. The spectrometer, called u-Spec,
comprises a diffraction grating on a silicon chip coupled to kinetic inductance
detectors (KIDs) read out via a single microwave feedline. We use a prototype
spectrometer for EXCLAIM to demonstrate our ability to characterize the
spectrometer's spectral response using a photomixer source. We utilize an
on-chip reference detector to normalize relative to spectral structure from the
off-chip optics and a silicon etalon to calibrate the absolute frequency. | astro-ph_IM |
Characterization of Skipper CCDs for Cosmological Applications: We characterize the response of a novel 250 $\mu$m thick, fully-depleted
Skipper Charged-Coupled Device (CCD) to visible/near-infrared light with a
focus on potential applications for astronomical observations. We achieve
stable, single-electron resolution with readout noise $\sigma \sim 0.18$
e$^{-}$ rms/pix from 400 non-destructive measurements of the charge in each
pixel. We verify that the gain derived from photon transfer curve measurements
agrees with the gain calculated from the quantized charge of individual
electrons to within < 1%. We also perform relative quantum efficiency
measurements and demonstrate high relative quantum efficiency at
optical/near-infrared wavelengths, as is expected for a thick, fully depleted
detector. Finally, we demonstrate the ability to perform multiple
non-destructive measurements and achieve sub-electron readout noise over
configurable subregions of the detector. This work is the first step toward
demonstrating the utility of Skipper CCDs for future astronomical and
cosmological applications. | astro-ph_IM |
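The quoted sub-electron noise follows from averaging many non-destructive reads: uncorrelated read noise averages down as 1/sqrt(N). The single-sample noise of 3.6 e- used below is an inference from the quoted 0.18 e- rms at N = 400 (0.18 * sqrt(400)), not a figure stated in the abstract:

```python
import numpy as np

def skipper_readout_noise(sigma_single, n_samples):
    """Expected noise after averaging n uncorrelated non-destructive reads."""
    return sigma_single / np.sqrt(n_samples)

# Monte Carlo check of the 1/sqrt(N) scaling for a pixel holding a fixed
# charge of 1000 e- (an arbitrary illustrative value).
rng = np.random.default_rng(1)
sigma1, n = 3.6, 400
reads = 1000.0 + sigma1 * rng.standard_normal((10_000, n))
averaged = reads.mean(axis=1)
```

The standard deviation of `averaged` lands at the 0.18 e- rms/pix level quoted for the device, which is what enables single-electron resolution.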
The Possible Detection of Dark Energy on Earth Using Atom Interferometry: This paper describes the concept and the beginning of an experimental
investigation of whether it is possible to directly detect dark energy density
on earth using atom interferometry. The concept is to null out the
gravitational force using a double interferometer. This research provides a
non-astronomical path for research on dark energy. The application of this
method to other hypothetical weak forces and fields is also discussed. In the
final section I discuss the advantages of carrying out a dark energy
density search in a satellite in earth orbit where more precise nulling of
gravitational forces can be achieved. | astro-ph_IM |
Deformable mirror-based pupil chopping for exoplanet imaging and
adaptive optics: Due to turbulence in the atmosphere, images taken from
ground-based telescopes become distorted. With adaptive optics (AO), images can
be given greater clarity, allowing for better observations with existing
telescopes; AO is also essential for ground-based coronagraphic exoplanet
imaging instruments. A disadvantage of many AO systems is that they use
sensors that cannot correct for non-common
path aberrations. We have developed a new focal plane wavefront sensing
technique to address this problem called deformable mirror (DM)-based pupil
chopping. The process involves a coronagraphic or non-coronagraphic science
image and a deformable mirror, which modulates the phase by applying a local
tip/tilt every other frame which enables correcting for leftover aberrations in
the wavefront after a conventional AO correction. We validate this technique
with both simulations (for coronagraphic and non-coronagraphic images) and
testing (for non-coronagraphic images) on UCSC's Santa Cruz Extreme AO
Laboratory (SEAL) testbed. We demonstrate that with as low as 250 nm of DM
stroke to apply the local tip/tilt this wavefront sensor is linear for
low-order Zernike modes and enables real-time control, in principle up to kHz
speeds to correct for residual atmospheric turbulence. | astro-ph_IM |
Collision-free motion planning for fiber positioner robots:
discretization of velocity profiles: The next generation of large-scale spectroscopic survey experiments such as
DESI, will use thousands of fiber positioner robots packed on a focal plate. In
order to maximize the observing time with this robotic system we need to move
in parallel the fiber-ends of all positioners from the previous to the next
target coordinates. Direct trajectories are not feasible due to collision risks
that could damage the robots and impact the survey operation and
performance. We have previously developed a motion planning method based on a
novel decentralized navigation function for collision-free coordination of
fiber positioners. The navigation function takes into account the configuration
of positioners as well as their envelope constraints. The motion planning
scheme has linear complexity and short motion duration (~2.5 seconds with the
maximum speed of 30 rpm for the positioner), which is independent of the number
of positioners. These two key advantages of the decentralization designate the
method as a promising solution for the collision-free motion-planning problem
in the next-generation of fiber-fed spectrographs. In a framework where a
centralized computer communicates with the positioner robots, communication
overhead can be reduced significantly by using velocity profiles consisting of
a few bits only. We present here the discretization of velocity profiles to
ensure the feasibility of a real-time coordination for a large number of
positioners. The modified motion planning method that generates piecewise
linearized position profiles guarantees collision-free trajectories for all the
robots. The velocity profiles fit in a few bits at the expense of higher
computational costs. | astro-ph_IM |
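The discretization step can be sketched as quantizing each velocity sample to an n-bit code and integrating the decoded, piecewise-constant velocity into a piecewise-linear position profile. The 3-bit depth and the symmetric uniform quantizer are illustrative assumptions; the abstract states only that the transmitted profiles consist of a few bits:

```python
import numpy as np

def quantize_velocity(v, n_bits=3, v_max=30.0):
    """Quantize a velocity profile (rpm) to n-bit codes.

    Returns the integer codes (what would be sent over the link) and the
    decoded velocities the positioner would actually execute.
    """
    levels = 2 ** n_bits - 1
    step = 2.0 * v_max / levels
    codes = np.clip(np.round((v + v_max) / step), 0, levels).astype(int)
    return codes, codes * step - v_max

def integrate_position(v_decoded, dt):
    """Piecewise-constant velocity integrates to a piecewise-linear
    position profile, starting from the current position (taken as 0)."""
    return np.concatenate(([0.0], np.cumsum(v_decoded) * dt))
```

The quantization error per step is bounded by half a code width, which is why the decoded trajectories can still be checked against the collision-free envelopes.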
Gaia space mission and quasars: Quasars are often considered to be point-like objects. This is largely true
and allows for an excellent alignment of the optical positional reference frame
of the ongoing ESA mission Gaia with the International Celestial Reference
Frame. But the presence of optical jets in quasars can cause shifts of the optical
photo-centers at levels detectable by Gaia. Similarly, motion of emitting blobs
in the jet can be detected as proper motion shifts. Gaia's measurements of the
spectral energy distributions of around a million distant quasars are useful
for determining their redshifts and assessing their variability on timescales from
hours to years. The spatial resolution of Gaia allows building a complete
magnitude-limited sample of strongly lensed quasars. The mission had its first public
data release in September 2016 and is scheduled to have the next and much more
comprehensive one in April 2018. Here we briefly review the capabilities and
current results of the mission. Gaia's unique contributions to the studies of
quasars are already being published, a highlight being a discovery of a number
of quasars with optical jets. | astro-ph_IM |
Adapting astronomical source detection software to help detect animals
in thermal images obtained by unmanned aerial systems: In this paper we describe an unmanned aerial system equipped with a
thermal-infrared camera and software pipeline that we have developed to monitor
animal populations for conservation purposes. Taking a multi-disciplinary
approach to tackle this problem, we use freely available astronomical source
detection software and the associated expertise of astronomers, to efficiently
and reliably detect humans and animals in aerial thermal-infrared footage.
Combining this astronomical detection software with existing machine learning
algorithms into a single, automated, end-to-end pipeline, we test the software
using aerial video footage taken in a controlled, field-like environment. We
demonstrate that the pipeline works reliably and describe how it can be used to
estimate the completeness of different observational datasets to objects of a
given type as a function of height, observing conditions etc. -- a crucial step
in converting video footage to scientifically useful information such as the
spatial distribution and density of different animal species. Finally, having
demonstrated the potential utility of the system, we describe the steps we are
taking to adapt the system for work in the field, in particular systematic
monitoring of endangered species at National Parks around the world. | astro-ph_IM |
The High Inclination Solar Mission: The High Inclination Solar Mission (HISM) is a concept for an
out-of-the-ecliptic mission for observing the Sun and the heliosphere. The
mission profile is largely based on the Solar Polar Imager concept: initially
spiraling in to a 0.48 AU ecliptic orbit, then increasing the orbital
inclination at a rate of $\sim 10$ degrees per year, ultimately reaching a
heliographic inclination of $>$75 degrees. The orbital profile is achieved
using solar sails derived from the technology currently being developed for the
Solar Cruiser mission.
HISM remote sensing instruments comprise an imaging spectropolarimeter
(Doppler imager / magnetograph) and a visible light coronagraph. The in-situ
instruments include a Faraday cup, an ion composition spectrometer, and
magnetometers. Plasma wave measurements are made with electrical antennas and
high speed magnetometers.
The $7,000\,\mathrm{m}^2$ sail used in the mission assessment is a direct
extension of the 4-quadrant $1,666\,\mathrm{m}^2$ Solar Cruiser design and
employs the same type of high strength composite boom, deployment mechanism,
and membrane technology. The sail system modelled is spun (~1 rpm) to assure
required boom characteristics with margin. The spacecraft bus features a
fine-pointing 3-axis stabilized instrument platform that allows full science
observations as soon as the spacecraft reaches a solar distance of 0.48 AU. | astro-ph_IM |
Removing visual bias in filament identification: a new goodness-of-fit
measure: Different combinations of input parameters to filament identification
algorithms, such as Disperse and FilFinder, produce numerous different output
skeletons. The skeletons are a one pixel wide representation of the filamentary
structure in the original input image. However, these output skeletons may not
necessarily be a good representation of that structure. Furthermore, a given
skeleton may not be as good a representation as another. Previously there has
been no mathematical `goodness-of-fit' measure to compare output skeletons to
the input image. Thus far this has been assessed visually, introducing visual
bias. We propose the application of the mean structural similarity index
(MSSIM) as a mathematical goodness-of-fit measure. We describe the use of the
MSSIM to find the output skeletons most mathematically similar to the original
input image (the optimum, or `best', skeletons) for a given algorithm, and
independently of the algorithm. This measure makes possible systematic
parameter studies, aimed at finding the subset of input parameter values
returning optimum skeletons. It can also be applied to the output of
non-skeleton based filament identification algorithms, such as the Hessian
matrix method. The MSSIM removes the need to visually examine thousands of
output skeletons, and eliminates the visual bias, subjectivity, and limited
reproducibility inherent in that process, representing a major improvement on
existing techniques. Importantly, it also allows further automation in the
post-processing of output skeletons, which is crucial in this era of `big
data'. | astro-ph_IM |
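The proposed goodness-of-fit measure can be sketched as the standard mean SSIM computed over a sliding window (the usual Wang et al. constants C1, C2 are used; the paper's actual window size and constants are not specified in the abstract, so the defaults below are assumptions):

```python
import numpy as np
from scipy.ndimage import uniform_filter

def mssim(img_x, img_y, win=7, data_range=1.0):
    """Mean structural similarity index with a uniform sliding window."""
    x = img_x.astype(float)
    y = img_y.astype(float)
    C1 = (0.01 * data_range) ** 2
    C2 = (0.03 * data_range) ** 2
    mu_x = uniform_filter(x, win)
    mu_y = uniform_filter(y, win)
    var_x = uniform_filter(x * x, win) - mu_x ** 2
    var_y = uniform_filter(y * y, win) - mu_y ** 2
    cov = uniform_filter(x * y, win) - mu_x * mu_y
    ssim_map = ((2 * mu_x * mu_y + C1) * (2 * cov + C2)) / (
        (mu_x ** 2 + mu_y ** 2 + C1) * (var_x + var_y + C2))
    return ssim_map.mean()
```

In the workflow described above, each candidate skeleton would be scored with `mssim(skeleton, input_image)` and the optimum skeleton is the one with the highest score, replacing the visual comparison of thousands of outputs.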
Viewpoints: A high-performance high-dimensional exploratory data
analysis tool: Scientific data sets continue to increase in both size and complexity. In the
past, dedicated graphics systems at supercomputing centers were required to
visualize large data sets, but as the price of commodity graphics hardware has
dropped and its capability has increased, it is now possible, in principle, to
view large complex data sets on a single workstation. To do this in practice,
an investigator will need software that is written to take advantage of the
relevant graphics hardware. The Viewpoints visualization package described
herein is an example of such software. Viewpoints is an interactive tool for
exploratory visual analysis of large, high-dimensional (multivariate) data. It
leverages the capabilities of modern graphics boards (GPUs) to run on a single
workstation or laptop. Viewpoints is minimalist: it attempts to do a small set
of useful things very well (or at least very quickly) in comparison with
similar packages today. Its basic feature set includes linked scatter plots
with brushing, dynamic histograms, normalization and outlier detection/removal.
Viewpoints was originally designed for astrophysicists, but it has since been
used in a variety of fields that range from astronomy, quantum chemistry, fluid
dynamics, machine learning, bioinformatics, and finance to information
technology server log mining. In this article, we describe the Viewpoints
package and show examples of its usage. | astro-ph_IM |
Overview of the Instrumentation for the Dark Energy Spectroscopic
Instrument: The Dark Energy Spectroscopic Instrument (DESI) has embarked on an ambitious
five-year survey to explore the nature of dark energy with spectroscopy of 40
million galaxies and quasars. DESI will determine precise redshifts and employ
the Baryon Acoustic Oscillation method to measure distances from the nearby
universe to z > 3.5, as well as measure the growth of structure and probe
potential modifications to general relativity. In this paper we describe the
significant instrumentation we developed for the DESI survey. The new
instrumentation includes a wide-field, 3.2-deg diameter prime-focus corrector
that focuses the light onto 5020 robotic fiber positioners on the 0.812 m
diameter, aspheric focal surface. The positioners and their fibers are divided
among ten wedge-shaped petals. Each petal is connected to one of ten
spectrographs via a contiguous, high-efficiency, nearly 50 m fiber cable
bundle. The ten spectrographs each use a pair of dichroics to split the light
into three channels that together record the light from 360 - 980 nm with a
resolution of 2000 to 5000. We describe the science requirements, technical
requirements on the instrumentation, and management of the project. DESI was
installed at the 4-m Mayall telescope at Kitt Peak, and we also describe the
facility upgrades to prepare for DESI and the installation and functional
verification process. DESI has achieved all of its performance goals, and the
DESI survey began in May 2021. Some performance highlights include RMS
positioner accuracy better than 0.1", SNR per \sqrt{\AA} > 0.5 for a z > 2
quasar with flux 0.28e-17 erg/s/cm^2/A at 380 nm in 4000s, and median SNR = 7
of the [OII] doublet at 8e-17 erg/s/cm^2 in a 1000s exposure for emission line
galaxies at z = 1.4 - 1.6. We conclude with highlights from the on-sky
validation and commissioning of the instrument, key successes, and lessons
learned. (abridged) | astro-ph_IM |
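A minimal arithmetic sketch of the figures quoted in this abstract: 5020 fiber positioners shared equally among ten petals, and the width of a resolution element, d(lambda) = lambda / R, over the 360 - 980 nm range. The pairing of a specific R value with a specific wavelength below is assumed only for illustration; the abstract states the range R = 2000 to 5000 without that mapping.

```python
# Illustrative bookkeeping for the numbers quoted in the DESI abstract:
# 5020 fiber positioners divided among ten wedge-shaped petals, each petal
# feeding one of ten spectrographs covering 360-980 nm at R = 2000-5000.

N_POSITIONERS = 5020
N_PETALS = 10

fibers_per_petal = N_POSITIONERS // N_PETALS  # 502 fibers per petal

def delta_lambda_nm(lam_nm, resolution):
    """Width of one resolution element: d(lambda) = lambda / R."""
    return lam_nm / resolution

# Resolution-element width at the band edges (the R-to-wavelength pairing
# here is an assumption for illustration, not stated in the abstract).
blue_edge = delta_lambda_nm(360.0, 2000)  # 0.18 nm
red_edge = delta_lambda_nm(980.0, 5000)   # 0.196 nm
```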
Generating artificial light curves: Revisited and updated: The production of artificial light curves with known statistical and
variability properties is of great importance in astrophysics. Consolidating
the confidence levels during cross-correlation studies, understanding the
artefacts induced by sampling irregularities, establishing detection limits for
future observatories are just some of the applications of simulated data sets.
Currently, the widely used methodology of amplitude and phase randomisation is
able to produce artificial light curves which have a given underlying power
spectral density (PSD) but which are strictly Gaussian distributed. This
restriction is a significant limitation, since the majority of observed light
curves, e.g. of active galactic nuclei, X-ray binaries, and gamma-ray bursts,
show strong deviations from Gaussianity, exhibiting `burst-like' events that
yield long-tailed probability distribution functions (PDFs). In this
study we propose a simple method which is able to precisely reproduce light
curves which match both the PSD and the PDF of either an observed light curve
or a theoretical model. The PDF can be representative of either the parent
distribution or the actual distribution of the observed data, depending on the
study to be conducted for a given source. The final artificial light curves
reproduce the statistical and variability properties of the observed source or
theoretical model, i.e. they share the same PDF and PSD. Within the
framework of Reproducible Research, the code, together with the illustrative
example used in this manuscript, is made publicly available in the form of an
interactive Mathematica notebook. | astro-ph_IM |
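The building blocks described above can be sketched in a few lines: an amplitude-and-phase-randomised Gaussian surrogate with a prescribed PSD (the Timmer & Koenig approach the abstract refers to), followed by a rank-order remapping onto samples drawn from a long-tailed target PDF. This is a simplified, non-iterative sketch of the general idea, not the paper's exact algorithm (which also has to guard against the remapping distorting the PSD); all function names and the lognormal target are illustrative assumptions.

```python
import numpy as np

rng = np.random.default_rng(42)

def tk_gaussian(psd, n):
    """Gaussian surrogate series with a prescribed one-sided PSD,
    via amplitude and phase randomisation of the Fourier coefficients."""
    freqs = np.fft.rfftfreq(n, d=1.0)[1:]        # skip the zero frequency
    amp = np.sqrt(psd(freqs))
    re = rng.normal(size=freqs.size) * amp       # randomise real part
    im = rng.normal(size=freqs.size) * amp       # randomise imaginary part
    spec = np.concatenate(([0.0], re + 1j * im))
    if n % 2 == 0:
        spec[-1] = spec[-1].real                 # Nyquist bin must be real
    x = np.fft.irfft(spec, n=n)
    return (x - x.mean()) / x.std()

def match_pdf(x, target_samples):
    """Rank-order remapping: impose the empirical PDF of `target_samples`
    while keeping the temporal ordering of `x` (and thus, approximately,
    its PSD). Both arrays must have the same length."""
    ranks = np.argsort(np.argsort(x))
    return np.sort(target_samples)[ranks]

n = 4096
gauss = tk_gaussian(lambda f: f ** -2.0, n)            # red-noise PSD ~ f^-2
target = rng.lognormal(mean=0.0, sigma=1.0, size=n)    # long-tailed target PDF
lc = match_pdf(gauss, target)                          # burst-like light curve
```

By construction `lc` has exactly the target PDF; its PSD matches the prescription only approximately, which is why the full method described in the abstract refines the result further.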