text (string, 89 to 2.49k characters) | category (19 classes) |
---|---|
Calibration of the NEVOD-EAS array for detection of extensive air
showers: In this paper we discuss the calibration of the NEVOD-EAS array which is a
part of the Experimental Complex NEVOD, as well as the results of studying the
response features of its scintillation detectors. We present the results of the
detectors' energy calibration, performed by comparing their response to
different types of particles obtained experimentally and simulated with the
Geant4 software package, as well as of the measurements of their timing
resolution. We also discuss the results of studies of the light collection
non-uniformity of the NEVOD-EAS detectors and of the accuracy of air-shower
arrival direction reconstruction, which have been performed using other
facilities of the Experimental Complex NEVOD: the muon hodoscope URAGAN and the
coordinate-tracking detector DECOR. | astro-ph_IM |
The RATT PARROT: serendipitous discovery of a peculiarly scintillating
pulsar in MeerKAT imaging observations of the Great Saturn-Jupiter
Conjunction of 2020. I. Dynamic imaging and data analysis: We report on a radiopolarimetric observation of the Saturn-Jupiter Great
Conjunction of 2020 using the MeerKAT L-band system, initially carried out for
science verification purposes, which yielded a serendipitous discovery of a
pulsar. The radiation belts of Jupiter are very bright and time variable:
coupled with the sensitivity of MeerKAT, this necessitated development of
dynamic imaging techniques, reported on in this work. We present a deep radio
"movie" revealing Jupiter's rotating magnetosphere, a radio detection of
Callisto, and numerous background radio galaxies. We also detect a bright radio
transient in close vicinity to Saturn, lasting approximately 45 minutes.
Follow-up deep imaging observations confirmed this as a faint compact variable
radio source, and yielded detections of pulsed emission by the commensal
MeerTRAP search engine, establishing the object's nature as a radio emitting
neutron star, designated PSR J2009-2026. A further observation combining deep
imaging with the PTUSE pulsar backend measured detailed dynamic spectra for the
object. While qualitatively consistent with scintillation, the magnitude of the
magnification events and the characteristic timescales are odd. We are
tentatively designating this object a pulsar with anomalous refraction
recurring on odd timescales (PARROT). As part of this investigation, we present
a pipeline for detection of variable sources in imaging data, with dynamic
spectra and lightcurves as the products, and compare dynamic spectra obtained
from visibility data with those yielded by PTUSE. We discuss MeerKAT's
capabilities and prospects for detecting more of such transients and variables. | astro-ph_IM |
Enhanced models for stellar Doppler noise reveal hints of a 13-year
activity cycle of 55 Cancri: We consider the impact of Doppler noise models on the statistical robustness
of the exoplanetary radial-velocity fits. We show that the traditional model of
the Doppler noise with an additive jitter can generate large non-linearity
effects, decreasing the reliability of the fit, especially in cases when
correlated Doppler noise is involved. We introduce a regularization of the
additive noise model that can gracefully eliminate its singularities together
with the associated non-linearity effects.
We apply this approach to Doppler time-series data of several exoplanetary
systems. It demonstrates that our new regularized noise model yields orbital
fits that have either increased or at least the same statistical robustness, in
comparison with the simple additive jitter. Various statistical uncertainties
in the parametric estimations are often reduced, while planet detection
significance is often increased.
Concerning the 55 Cnc five-planet system, we show that its Doppler data
contain significant correlated ("red") noise. Its correlation timescale is in
the range from days to months, and its magnitude is much larger than the effect
of the planetary N-body perturbations in the radial velocity (these
perturbations thus appear undetectable). Characteristics of the red noise
depend on the spectrograph/observatory, and also show a cyclic time variation
in phase with the public Ca II H & K and photometry measurements. We interpret
this modulation as a hint of the long-term activity cycle of 55 Cnc, similar to
the Solar 11-year cycle. We estimate the 55 Cnc activity period to be
$12.6^{+2.5}_{-1.0}$ yr, with the nearest minimum presumably expected in 2014
or 2015. | astro-ph_IM |
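As a worked illustration of the additive-jitter noise model this abstract builds on (the paper's regularized variant is not reproduced), the following minimal Python sketch evaluates the standard Gaussian log-likelihood with an extra jitter variance and maximizes it on synthetic residuals; all arrays and numbers are hypothetical.

```python
import numpy as np

def loglike_additive_jitter(residuals, sigma_meas, sigma_jit):
    """Gaussian log-likelihood of RV residuals with an additive jitter term.

    The per-point variance is sigma_meas**2 + sigma_jit**2 (the classical
    'white noise + jitter' model; the paper's regularized variant is not
    reproduced here).
    """
    var = sigma_meas**2 + sigma_jit**2
    return -0.5 * np.sum(residuals**2 / var + np.log(2.0 * np.pi * var))

# Hypothetical example: recover the extra scatter from fake residuals whose
# true dispersion exceeds the formal measurement errors.
rng = np.random.default_rng(0)
sigma_meas = np.full(100, 2.0)                       # formal errors [m/s]
residuals = rng.normal(0.0, 3.0, size=100)           # true scatter ~3 m/s
grid = np.linspace(0.0, 5.0, 501)
best = grid[np.argmax([loglike_additive_jitter(residuals, sigma_meas, j)
                       for j in grid])]
print(f"best-fit additive jitter ~ {best:.2f} m/s (expect ~sqrt(3^2-2^2)=2.2)")
```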
Analysis of active optics correction for a large honeycomb mirror: In the development of space-based large telescope systems, having the
capability to perform active optics correction allows correcting wavefront
aberrations caused by thermal perturbations so as to achieve
diffraction-limited performance with relaxed stability requirements. We present
a method of active optics correction used for current ground-based telescopes
and simulate its effectiveness for a large honeycomb primary mirror in space.
We use a finite-element model of the telescope to predict misalignments of the
optics and primary mirror surface errors due to thermal gradients. These
predicted surface error data are plugged into a Zemax ray trace analysis to
produce wavefront error maps at the image plane. For our analysis, we assume
that tilt, focus and coma in the wavefront error are corrected by adjusting the
pointing of the telescope and moving the secondary mirror. Remaining mid- to
high-order errors are corrected through physically bending the primary mirror
with actuators. The influences of individual actuators are combined to form
bending modes that increase in stiffness from low-order to high-order
correction. The number of modes used is a variable that determines the accuracy
of correction and magnitude of forces. We explore the degree of correction that
can be made within limits on actuator force capacity and stress in the mirror.
While remaining within these physical limits, we are able to demonstrate sub-25
nm RMS surface error over 30 hours of simulated data. The results from this
simulation will be part of an end-to-end simulation of telescope optical
performance that includes dynamic perturbations, wavefront sensing, and active
control of alignment and mirror shape with realistic actuator performance. | astro-ph_IM |
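A minimal sketch of the bending-mode construction described above, assuming a hypothetical random influence matrix in place of the paper's finite-element model: an SVD of the matrix yields modes ordered from soft to stiff, and truncating the mode set trades residual surface error against actuator force.

```python
import numpy as np

# Hypothetical influence matrix: wavefront response (n_pix samples) per unit
# force on each of n_act actuators. In the paper this comes from a
# finite-element model; here it is random just to show the algebra.
rng = np.random.default_rng(1)
n_pix, n_act = 500, 36
A = rng.normal(size=(n_pix, n_act))

# SVD of the influence matrix: columns of U are bending modes in wavefront
# space, rows of Vt are the corresponding actuator-force patterns, and the
# singular values s measure how much surface change a unit force produces
# (small s = stiff, high-order mode).
U, s, Vt = np.linalg.svd(A, full_matrices=False)

def correct(wavefront, n_modes):
    """Least-squares force solution restricted to the first n_modes modes."""
    coeffs = U[:, :n_modes].T @ wavefront            # modal amplitudes to remove
    forces = Vt[:n_modes].T @ (coeffs / s[:n_modes])
    residual = wavefront - A @ forces
    return forces, residual

wf = A @ rng.normal(size=n_act) + 0.01 * rng.normal(size=n_pix)
for k in (5, 15, 30):
    f, r = correct(wf, k)
    print(k, "modes: residual RMS", r.std(), "max |force|", np.abs(f).max())
```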
The Carnegie Astrometric Planet Search Program: We are undertaking an astrometric search for gas giant planets and brown
dwarfs orbiting nearby low mass dwarf stars with the 2.5-m du Pont telescope at
the Las Campanas Observatory in Chile. We have built two specialized
astrometric cameras, the Carnegie Astrometric Planet Search Cameras (CAPSCam-S
and CAPSCam-N), using two Teledyne Hawaii-2RG HyViSI arrays, with the cameras'
design having been optimized for high accuracy astrometry of M dwarf stars. We
describe two independent CAPSCam data reduction approaches and present a
detailed analysis of the observations to date of one of our target stars, NLTT
48256. Observations of NLTT 48256 taken since July 2007 with CAPSCam-S imply
that astrometric accuracies of around 0.3 milliarcsec per hour are achievable,
sufficient to detect a Jupiter-mass companion orbiting 1 AU from a late M dwarf
10 pc away with a signal-to-noise ratio of about 4. We plan to follow about 100
nearby (primarily within about 10 pc) low mass stars, principally late M, L,
and T dwarfs, for 10 years or more, in order to detect very low mass companions
with orbital periods long enough to permit the existence of habitable,
Earth-like planets on shorter-period orbits. These stars are generally too
faint and red to be included in ground-based Doppler planet surveys, which are
often optimized for FGK dwarfs. The smaller masses of late M dwarfs also yield
correspondingly larger astrometric signals for a given mass planet. Our search
will help to determine whether gas giant planets form primarily by core
accretion or by disk instability around late M dwarf stars. | astro-ph_IM |
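A back-of-the-envelope check of the quoted sensitivity, using the standard astrometric-signal relation $\alpha = (M_p/M_*)\,a/d$ (arcseconds, with $a$ in AU and $d$ in pc); the late-M-dwarf mass below is an assumption for illustration, not a value taken from the abstract.

```python
# Back-of-the-envelope astrometric signal for the case quoted in the abstract:
# a Jupiter-mass companion at 1 AU around a late M dwarf at 10 pc.  The
# stellar mass below is an assumption for illustration (not from the abstract).
M_planet = 9.55e-4   # Jupiter mass in solar masses
M_star   = 0.12      # assumed late-M-dwarf mass [M_sun]
a_AU     = 1.0       # orbital semi-major axis [AU]
d_pc     = 10.0      # distance [pc]

# Semi-amplitude of the star's reflex motion on the sky, in milliarcseconds.
alpha_mas = (M_planet / M_star) * a_AU / d_pc * 1e3
sigma_mas = 0.3      # quoted single-epoch accuracy [mas]
print(f"signal ~ {alpha_mas:.2f} mas, single-epoch S/N ~ {alpha_mas/sigma_mas:.1f}")
```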
Unrolling PALM for sparse semi-blind source separation: Sparse Blind Source Separation (BSS) has become a well established tool for a
wide range of applications - for instance, in astrophysics and remote sensing.
Classical sparse BSS methods, such as the Proximal Alternating Linearized
Minimization (PALM) algorithm, nevertheless often suffer from a difficult
hyperparameter choice, which undermines their results. To bypass this pitfall,
we propose in this work to build on the thriving field of algorithm
unfolding/unrolling. Unrolling PALM makes it possible to leverage the data-driven
knowledge stemming from realistic simulations or ground-truth data by learning
both PALM hyperparameters and variables. In contrast to most existing unrolled
algorithms, which assume a fixed known dictionary during the training and
testing phases, this article further emphasizes the ability to deal with
variable mixing matrices (a.k.a. dictionaries). The proposed Learned PALM
(LPALM) algorithm thus enables semi-blind source separation, which
is key to increase the generalization of the learnt model in real-world
applications. We illustrate the relevance of LPALM in astrophysical
multispectral imaging: the algorithm not only needs up to $10^4-10^5$ times
fewer iterations than PALM, but also improves the separation quality, while
avoiding the cumbersome hyperparameter and initialization choice of PALM. We
further show that LPALM outperforms other unrolled source separation methods in
the semi-blind setting. | astro-ph_IM |
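For context, a plain (non-learned) PALM iteration for the sparse BSS model $X \approx AS$ is sketched below; in LPALM the thresholds and step sizes of such iterations become trainable layer parameters, but the actual LPALM architecture is not reproduced here and all sizes are hypothetical.

```python
import numpy as np

def soft_threshold(x, thresh):
    """Proximal operator of the l1 norm (promotes sparsity in the sources)."""
    return np.sign(x) * np.maximum(np.abs(x) - thresh, 0.0)

def palm_step(X, A, S, lam):
    """One PALM iteration for the sparse BSS model X ~ A @ S.

    In unrolled/learned PALM, quantities such as the threshold `lam` (and
    possibly the step sizes) become trainable parameters of a network whose
    layers are these iterations; this is only a plain PALM sketch.
    """
    # Source update: gradient step on 0.5*||X - A S||^2, then soft-thresholding.
    L_S = np.linalg.norm(A.T @ A, 2) + 1e-12          # Lipschitz constant
    S = soft_threshold(S - (A.T @ (A @ S - X)) / L_S, lam / L_S)
    # Mixing-matrix update: gradient step + projection onto unit-norm columns.
    L_A = np.linalg.norm(S @ S.T, 2) + 1e-12
    A = A - ((A @ S - X) @ S.T) / L_A
    A /= np.maximum(np.linalg.norm(A, axis=0, keepdims=True), 1e-12)
    return A, S

# Tiny synthetic example with hypothetical sizes.
rng = np.random.default_rng(2)
A_true = np.abs(rng.normal(size=(8, 3)))
S_true = rng.normal(size=(3, 200)) * (rng.random((3, 200)) < 0.1)
X = A_true @ S_true
A, S = np.abs(rng.normal(size=(8, 3))), np.zeros((3, 200))
for _ in range(200):
    A, S = palm_step(X, A, S, lam=0.05)
print("relative reconstruction error:", np.linalg.norm(X - A @ S) / np.linalg.norm(X))
```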
Thermal control of long delay lines in a high-resolution astrophotonic
spectrograph: High-resolution astronomical spectroscopy carried out with a photonic Fourier
transform spectrograph (FTS) requires long asymmetrical optical delay lines
that can be dynamically tuned. For example, to achieve a spectral resolution of
R = 30,000, a delay line as long as 1.5 cm would be required. Such delays are
inherently prone to phase errors caused by temperature fluctuations. This is
due to the relatively large thermo-optic coefficient and long lengths of the
waveguides, in this case composed of SiN, resulting in thermally dependent
changes to the optical path length. To minimize phase error to the order of
0.05 radians, thermal stability of the order of 0.05 °C is necessary. A
thermal control system capable of stability such as this would require a fast
thermal response and minimal overshoot/undershoot. With a PID temperature
control loop driven by a Peltier cooler and thermistor, we minimized
interference fringe phase error to +/- 0.025 radians and achieved temperature
stability on the order of 0.05 °C. We present a practical system for
precision temperature control of a foundry-fabricated and packaged FTS device
on a SiN platform with delay lines ranging from 0.5 to 1.5 cm in length using
inexpensive off-the-shelf components, including design details, control loop
optimization, and considerations for thermal control of integrated photonics. | astro-ph_IM |
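A minimal sketch of the kind of discrete PID loop described above, driving a crude first-order thermal plant; the plant model and gains are invented for illustration and do not correspond to the paper's hardware.

```python
import numpy as np

def run_pid(setpoint=25.0, kp=4.0, ki=0.8, kd=0.5, dt=0.1, n_steps=2000):
    """Simulate a discrete PID loop driving a crude first-order thermal plant.

    The plant model (time constant, ambient drift) and the gains are invented
    for illustration; in the paper the actuator is a Peltier element read out
    with a thermistor and the loop is tuned on the real device.
    """
    temp, integral, prev_err = 20.0, 0.0, 0.0
    history = []
    for k in range(n_steps):
        err = setpoint - temp
        integral += err * dt
        deriv = (err - prev_err) / dt
        power = kp * err + ki * integral + kd * deriv   # Peltier drive (a.u.)
        prev_err = err
        # First-order plant: heating from the actuator plus slow ambient drift.
        ambient = 20.0 + 0.5 * np.sin(2 * np.pi * k * dt / 60.0)
        temp += dt * (0.05 * power - (temp - ambient) / 30.0)
        history.append(temp)
    return np.array(history)

T = run_pid()
print("steady-state peak-to-peak fluctuation: %.3f C" % np.ptp(T[-500:]))
```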
Characterization Of Inpaint Residuals In Interferometric Measurements of
the Epoch Of Reionization: Radio Frequency Interference (RFI) is one of the systematic challenges
preventing 21cm interferometric instruments from detecting the Epoch of
Reionization. To mitigate the effects of RFI on data analysis pipelines,
numerous inpainting techniques have been developed to restore RFI-corrupted data.
We examine the qualitative and quantitative errors introduced into the
visibilities and power spectrum due to inpainting. We perform our analysis on
simulated data as well as real data from the Hydrogen Epoch of Reionization
Array (HERA) Phase 1 upper limits. We also introduce a convolutional neural
network that capable of inpainting RFI corrupted data in interferometric
instruments. We train our network on simulated data and show that our network
is capable at inpainting real data without requiring to be retrained. We find
that techniques that incorporate high wavenumbers in delay space in their
modeling are best suited for inpainting over narrowband RFI. We also show that
with our fiducial parameters, Discrete Prolate Spheroidal Sequences (DPSS) and
CLEAN provide the best performance for intermittent "narrowband" RFI, while
Gaussian Process Regression (GPR) and Least Squares Spectral Analysis (LSSA)
provide the best performance for larger RFI gaps. However, we caution that these
qualitative conclusions are sensitive to the chosen hyperparameters of each
inpainting technique. We find these results to be consistent in both simulated
and real visibilities. We show that all inpainting techniques reliably
reproduce foreground dominated modes in the power spectrum. Since the
inpainting techniques should not be capable of reproducing noise realizations,
we find that the largest errors occur in the noise dominated delay modes. We
show that in the future, as the noise level of the data comes down, CLEAN and
DPSS are most capable of reproducing the fine frequency structure in the
visibilities of HERA data. | astro-ph_IM |
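To make the DPSS approach concrete, the sketch below fits a few discrete prolate spheroidal sequences to the unflagged channels of a one-dimensional spectrum by least squares and evaluates the model in the flagged gap; the basis parameters and test spectrum are illustrative and do not reproduce the paper's fiducial settings.

```python
import numpy as np
from scipy.signal.windows import dpss

def dpss_inpaint(vis, flags, nw=4, n_terms=8):
    """Fill flagged channels of a 1-D visibility spectrum.

    A small set of discrete prolate spheroidal sequences (DPSS) is fit to the
    unflagged channels by least squares; the fitted smooth model is then used
    in the flagged gap. Parameters (nw, n_terms) are illustrative only.
    """
    n_chan = vis.size
    basis = dpss(n_chan, nw, n_terms).T              # shape (n_chan, n_terms)
    good = ~flags
    coeffs, *_ = np.linalg.lstsq(basis[good], vis[good], rcond=None)
    model = basis @ coeffs
    filled = vis.copy()
    filled[flags] = model[flags]
    return filled

# Hypothetical example: a smooth foreground-like spectrum with a narrow RFI gap.
freq = np.linspace(0.0, 1.0, 256)
vis = np.exp(2j * np.pi * 3 * freq) * (1.0 + 0.2 * freq)
flags = np.zeros(freq.size, dtype=bool)
flags[120:128] = True                                # narrowband RFI flag
filled = dpss_inpaint(vis, flags)
print("max error in the gap:", np.abs(filled[flags] - vis[flags]).max())
```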
Analysis Methods for Gamma-ray Astronomy: The launch of the Fermi satellite in 2008, with its Large Area Telescope
(LAT) on board, has opened a new era for the study of gamma-ray sources at GeV
($10^9$ eV) energies. Similarly, the commissioning of the third generation of
imaging atmospheric Cherenkov telescopes (IACTs) - H.E.S.S., MAGIC, and VERITAS
- in the mid-2000s has firmly established the field of TeV ($10^{12}$ eV)
gamma-ray astronomy. Together, these instruments have revolutionised our
understanding of the high-energy gamma-ray sky, and they continue to provide
access to it over more than six decades in energy. In recent years, the
ground-level particle detector arrays HAWC, Tibet, and LHAASO have opened a new
window to gamma rays of the highest energies, beyond 100 TeV. Soon,
next-generation facilities such as CTA and SWGO will provide even better
sensitivity, thus promising a bright future for the field. In this chapter, we
provide a brief overview of methods commonly employed for the analysis of
gamma-ray data, focusing on those used for Fermi-LAT and IACT observations. We
describe the standard data formats, explain event reconstruction and selection
algorithms, and cover in detail high-level analysis approaches for imaging and
extraction of spectra, including aperture photometry as well as advanced
likelihood techniques. | astro-ph_IM |
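One standard ingredient of the aperture-photometry analyses mentioned above is the Li & Ma (1983) significance for on/off counting measurements; a minimal implementation follows, with made-up counts for the example.

```python
import numpy as np

def li_ma_significance(n_on, n_off, alpha):
    """Li & Ma (1983) Eq. 17 significance for an on/off counting measurement.

    n_on  : counts in the source (ON) region
    n_off : counts in the background (OFF) region
    alpha : ratio of ON to OFF exposures
    """
    term_on = n_on * np.log((1.0 + alpha) / alpha * n_on / (n_on + n_off))
    term_off = n_off * np.log((1.0 + alpha) * n_off / (n_on + n_off))
    return np.sqrt(2.0 * (term_on + term_off))

# Example with made-up counts: 130 ON events, 500 OFF events, alpha = 0.2
# (so ~100 expected background counts in the ON region).
print(f"significance = {li_ma_significance(130, 500, 0.2):.1f} sigma")
```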
Coherent Imaging with Photonic Lanterns: Photonic Lanterns (PLs) are tapered waveguides that gradually transition from
a multi-mode fiber geometry to a bundle of single-mode fibers (SMFs). They can
efficiently couple multi-mode telescope light into a multi-mode fiber entrance
at the focal plane and convert it into multiple single-mode beams. Thus, each
SMF samples its unique mode (lantern principal mode) of the telescope light in
the pupil, analogous to subapertures in aperture masking interferometry (AMI).
Coherent imaging with PLs can be enabled by interfering SMF outputs and
applying phase modulation, which can be achieved using a photonic chip beam
combiner at the backend (e.g., the ABCD beam combiner). In this study, we
investigate the potential of coherent imaging by interfering SMF outputs of a
PL with a single telescope. We demonstrate that the visibilities that can be
measured from a PL are mutual intensities incident on the pupil weighted by the
cross-correlation of a pair of lantern modes. From numerically simulated
lantern principal modes of a 6-port PL, we find that interferometric
observables using a PL behave similarly to separated-aperture visibilities for
simple models on small angular scales ($<\lambda/D$) but with greater
sensitivity to symmetries and capability to break phase angle degeneracies.
Furthermore, we present simulated observations with wavefront errors and
compare them to AMI. Despite the redundancy caused by extended lantern
principal modes, spatial filtering offers stability to wavefront errors. Our
simulated observations suggest that PLs may offer significant benefits in the
photon-noise-limited regime and in resolving small angular scales in the
low-contrast regime. | astro-ph_IM |
The ASTRO-H X-ray Astronomy Satellite: The joint JAXA/NASA ASTRO-H mission is the sixth in a series of highly
successful X-ray missions developed by the Institute of Space and Astronautical
Science (ISAS), with a planned launch in 2015. The ASTRO-H mission is equipped
with a suite of sensitive instruments with the highest energy resolution ever
achieved at E > 3 keV and a wide energy range spanning four decades in energy
from soft X-rays to gamma-rays. The simultaneous broad band pass, coupled with
the high spectral resolution of Delta E < 7 eV of the micro-calorimeter, will
enable a wide variety of important science themes to be pursued. ASTRO-H is
expected to provide breakthrough results in scientific areas as diverse as the
large-scale structure of the Universe and its evolution, the behavior of matter
in the gravitational strong field regime, the physical conditions in sites of
cosmic-ray acceleration, and the distribution of dark matter in galaxy clusters
at different redshifts. | astro-ph_IM |
JUDE (Jayant's UVIT Data Explorer) Pipeline User Manual: We have written a reference manual to use JUDE (Jayant's UVIT data Explorer)
data pipeline software for processing and reducing the Ultraviolet Imaging
Telescope (UVIT) Level~1 data into event lists and images -- Level~2 data. The
JUDE pipeline is written in the GNU Data Language (GDL) and released as
open source, so it may be freely used and modified. GDL was chosen because it is
an interpreted language allowing interactive analysis of data; thus in the
pipeline, each step can be checked and run interactively. This manual is
intended as a guide to data reduction and calibration for the users of the UVIT
data. | astro-ph_IM |
Reconstruction of Cherenkov radiation signals from extensive air showers
of cosmic rays using data of a wide field-of-view telescope: The operation of a wide field-of-view (WFOV) Cherenkov telescope is
described. The detection of extensive air showers (EAS) of cosmic rays (CR) is
based upon the coincidence with signals from the Yakutsk array. The data
acquisition system of the telescope yields signals connected with EAS
development parameters: presumably, shower age and position of shower maximum
in the atmosphere. Here we describe the method of signal processing used to
reconstruct Cherenkov radiation signals induced by CR showers. An analysis of
signal parameters results in the confirmation of the known correlation of the
duration of the Cherenkov radiation signal with the distance to the shower
core. The measured core distance dependence is used to set an upper limit to
the dimensions of the area along the EAS axis where the Cherenkov radiation
intensity is above half-peak amplitude. | astro-ph_IM |
Technical Note: Asteroid Detection Demonstration from SkySat-3 B612 Data
using Synthetic Tracking: We report results from analyzing the B612 asteroid observation data taken by
the sCMOS cameras on board Planet's SkySat-3 using the synthetic tracking
technique. The analysis demonstrates the expected sensitivity improvement in
the signal-to-noise ratio of the asteroids from properly stacking the
short-exposure images in post-processing. | astro-ph_IM |
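A minimal sketch of the shift-and-stack idea behind synthetic tracking, with an entirely synthetic image stack: frames are shifted along a trial sky velocity and co-added, so a moving source builds up coherently while the noise grows only as the square root of the number of frames.

```python
import numpy as np
from scipy.ndimage import shift

def synthetic_track(frames, dt, velocity):
    """Shift-and-add a stack of short exposures along a trial sky velocity.

    frames   : array (n_frames, ny, nx) of short exposures
    dt       : time between frames (s)
    velocity : (vy, vx) trial rate in pixels per second

    For the correct trial velocity the asteroid adds coherently while the
    noise adds only as sqrt(n_frames), which is the S/N gain the note describes.
    """
    stacked = np.zeros_like(frames[0], dtype=float)
    for i, frame in enumerate(frames):
        dy, dx = -velocity[0] * i * dt, -velocity[1] * i * dt
        stacked += shift(frame, (dy, dx), order=1, mode="constant")
    return stacked

# Hypothetical demo: a faint moving point source in pure Gaussian noise.
rng = np.random.default_rng(3)
n, ny, nx, dt = 30, 64, 64, 1.0
frames = rng.normal(0.0, 1.0, size=(n, ny, nx))
for i in range(n):
    frames[i, 32 + i // 3, 20 + i // 2] += 1.0       # ~0.33 and 0.5 px/s motion
stacked = synthetic_track(frames, dt, velocity=(1 / 3, 1 / 2))
print("peak S/N: single frame ~ 1, stacked ~", round(stacked.max() / np.sqrt(n), 1))
```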
AMEGO-X: MeV gamma-ray Astronomy in the Multimessenger Era: Recent detections of gravitational wave signals and neutrinos from gamma-ray
sources have ushered in the era of multi-messenger astronomy, while
highlighting the importance of gamma-ray observations for this emerging field.
AMEGO-X, the All-sky Medium Energy Gamma-Ray Observatory eXplorer, is an MeV
gamma-ray instrument that will survey the sky in the energy range from hundreds
of keV to one GeV with unprecedented sensitivity. AMEGO-X will detect gamma-ray
photons both via Compton interactions and pair production processes, bridging
the "sensitivity gap" between hard X-rays and high-energy gamma rays. AMEGO-X
will provide important contributions to multi-messenger science and time-domain
gamma-ray astronomy, studying e.g. high-redshift blazars, which are probable
sources of astrophysical neutrinos, and gamma-ray bursts. I will present an
overview of the instrument and science program. | astro-ph_IM |
Multi-Chroic Feed-Horn Coupled TES Polarimeters: Multi-chroic polarization sensitive detectors offer an avenue to increase
both the spectral coverage and sensitivity of instruments optimized for
observations of the cosmic-microwave background (CMB) or sub-mm sky. We report
on an effort to adapt the Truce Collaboration horn coupled bolometric
polarimeters for operation over octave bandwidth. Development is focused on
detectors operating in both the 90 and 150 GHz bands which offer the highest
CMB polarization to foreground ratio. We plan to deploy an array of 256
multi-chroic 90/150 GHz polarimeters with 1024 TES detectors on ACTPol in 2013,
and there are proposals to use this technology for balloon-borne instruments.
The combination of excellent control of beam systematics and sensitivity makes
this technology ideal for future ground, balloon, and space missions. | astro-ph_IM |
AnisoCADO: a python package for analytically generating adaptive optics
point spread functions for the Extremely Large Telescope: AnisoCADO is a Python package for generating images of the point spread
function (PSF) for the European Extremely Large Telescope (ELT). The code
allows the user to set many of the most important atmospheric and observational
parameters that influence the shape and Strehl ratio of the resulting PSF,
including but not limited to: the atmospheric turbulence profile, the guide
star position for a single conjugate adaptive optics (SCAO) solution,
differential telescope pupil transmission, etc. Documentation can be found at
https://anisocado.readthedocs.io/en/latest/ | astro-ph_IM |
AstroDAbis: Annotations and Cross-Matches for Remote Catalogues: Astronomers are good at sharing data, but poorer at sharing knowledge.
Almost all astronomical data ends up in open archives, and access to these is
being simplified by the development of the global Virtual Observatory (VO).
This is a great advance, but the fundamental problem remains that these
archives contain only basic observational data, whereas all the astrophysical
interpretation of that data -- which source is a quasar, which a low-mass star,
and which an image artefact -- is contained in journal papers, with very little
linkage back from the literature to the original data archives. It is therefore
currently impossible for an astronomer to pose a query like "give me all
sources in this data archive that have been identified as quasars" and this
limits the effective exploitation of these archives, as the user of an archive
has no direct means of taking advantage of the knowledge derived by its
previous users.
The AstroDAbis service aims to address this, in a prototype service enabling
astronomers to record annotations and cross-identifications in the AstroDAbis
service, annotating objects in other catalogues. We have deployed two
interfaces to the annotations: one astronomy-specific, using the TAP
protocol, and a second exploiting generic Linked Open Data (LOD) and RDF
techniques. | astro-ph_IM |
Building models for extended radio sources: implications for Epoch of
Reionisation science: We test the hypothesis that limitations in the sky model used to calibrate an
interferometric radio telescope, where the model contains extended radio
sources, will generate bias in the Epoch of Reionisation (EoR) power spectrum.
The information contained in a calibration model about the spatial and spectral
structure of an extended source is incomplete because a radio telescope cannot
sample all Fourier components. Application of an incomplete sky model to
calibration of EoR data will imprint residual error in the data, which
propagates forward to the EoR power spectrum. This limited information is
studied in the context of current and future planned instruments and surveys at
EoR frequencies, such as the Murchison Widefield Array (MWA), Giant Metrewave
Radio Telescope (GMRT) and the Square Kilometre Array (SKA1-Low). For the MWA
EoR experiment, we find that both the additional short baseline $uv$-coverage
of the compact EoR array, and the additional long baselines provided by TGSS
and planned MWA expansions, are required to obtain sufficient information on
all relevant scales. For SKA1-Low, arrays with maximum baselines of 49~km and
65~km yield comparable performance at 50~MHz and 150~MHz, while 39~km, 14~km
and 4~km arrays yield degraded performance. | astro-ph_IM |
CAPTURE: A continuum imaging pipeline for the uGMRT: We present the first fully automated pipeline for making images from the
interferometric data obtained from the upgraded Giant Metrewave Radio Telescope
(uGMRT) called CAsa Pipeline-cum-Toolkit for Upgraded Giant Metrewave Radio
Telescope data REduction - CAPTURE. It is a Python program that uses tasks from
the NRAO Common Astronomy Software Applications (CASA) to perform the steps of
flagging of bad data, calibration, imaging and self-calibration. The salient
features of the pipeline are: i) a fully automatic mode to go from the raw data
to a self-calibrated continuum image, ii) specialized flagging strategies for
short and long baselines that ensure minimal loss of extended structure, iii)
flagging of persistent narrow band radio frequency interference (RFI), iv)
flexibility for the user to configure the pipeline for step-by-step analysis or
special cases and v) analysis of data from the legacy GMRT. CAPTURE is
available publicly on github (https://github.com/ruta-k/uGMRT-pipeline, release
v1.0.0). The primary beam correction for the uGMRT images produced with CAPTURE
is made separately available at https://github.com/ruta-k/uGMRTprimarybeam. We
show examples of using CAPTURE on uGMRT and legacy GMRT data. In principle,
CAPTURE can be tailored for use with radio interferometric data from other
telescopes. | astro-ph_IM |
Arm-Locking with the GRACE Follow-On Laser Ranging Interferometer: Arm-locking is a technique for stabilizing the frequency of a laser in an
inter-spacecraft interferometer by using the spacecraft separation as the
frequency reference. A candidate technique for future space-based gravitational
wave detectors such as the Laser Interferometer Space Antenna (LISA),
arm-locking has been extensively studied in this context through analytic models,
time-domain simulations, and hardware-in-the-loop laboratory demonstrations. In
this paper we show that the Laser Ranging Interferometer instrument flying aboard
the upcoming Gravity Recovery and Climate Experiment Follow-On (GRACE-FO)
mission provides an appropriate platform for an on-orbit demonstration of the
arm-locking technique. We describe an arm-locking controller design for the
GRACE-FO system and a series of time-domain simulations that demonstrate its
feasibility. We conclude that it is possible to achieve laser frequency noise
suppression of roughly two orders of magnitude around a Fourier frequency of
1 Hz with conservative margins on the system's stability. We further demonstrate
that `pulling' of the master laser frequency due to fluctuating Doppler shifts
and lock acquisition transients is less than $100\,$MHz over several GRACE-FO
orbits. These findings motivate further study of the implementation of such a
demonstration. | astro-ph_IM |
Correcting for Telluric Absorption: Methods, Case Studies, and Release
of the TelFit Code: Ground-based astronomical spectra are contaminated by the Earth's atmosphere
to varying degrees in all spectral regions. We present a Python code that can
accurately fit a model to the telluric absorption spectrum present in
astronomical data, with residuals of $\sim 3-5\%$ of the continuum for
moderately strong lines. We demonstrate the quality of the correction by
fitting the telluric spectrum in a nearly featureless A0V star, HIP 20264, as
well as to a series of dwarf M star spectra near the 819 nm sodium doublet. We
directly compare the results to an empirical telluric correction of HIP 20264
and find that our model-fitting procedure is at least as good and sometimes
more accurate. The telluric correction code, which we make freely available to
the astronomical community, can be used as a replacement for telluric standard
star observations for many purposes. | astro-ph_IM |
PandExo: A Community Tool for Transiting Exoplanet Science with JWST &
HST: As we approach the James Webb Space Telescope (JWST) era, several studies
have emerged that aim to: 1) characterize how the instruments will perform and
2) determine what atmospheric spectral features could theoretically be detected
using transmission and emission spectroscopy. To some degree, all these studies
have relied on modeling of JWST's theoretical instrument noise. With under two
years left until launch, it is imperative that the exoplanet community begins
to digest and integrate these studies into their observing plans, as well as
think about how to leverage the Hubble Space Telescope (HST) to optimize JWST
observations. In order to encourage this and to allow all members of the
community access to JWST & HST noise simulations, we present here an
open-source Python package and online interface for creating observation
simulations of all observatory-supported time-series spectroscopy modes. This
noise simulator, called PandExo, relies on some aspects of Space Telescope
Science Institute's Exposure Time Calculator, Pandeia. We describe PandExo and
the formalism for computing noise sources for JWST. Then, we benchmark
PandExo's performance against each instrument team's independently written
noise simulator for JWST, and previous observations for HST. We find that
PandExo is within 10% agreement for HST/WFC3 and for all JWST
instruments. | astro-ph_IM |
Differential HBT Method for Binary Stars: Two photon correlations are studied for a binary star system. It is
investigated how the differential Hanbury Brown and Twiss (HBT) approach can be
used in order to determine orbital parameters of a binary star. | astro-ph_IM |
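A worked sketch of the squared visibility of a binary, whose baseline modulation is what the HBT intensity correlation measures; the flux ratio, separation, and wavelength are invented for illustration.

```python
import numpy as np

def binary_vis_squared(baseline_m, sep_rad, flux_ratio, wavelength_m):
    """Squared visibility of a binary made of two unresolved stars.

    The intensity-interferometry (HBT) correlation excess is proportional to
    |V|^2, and its modulation with baseline encodes the separation and flux
    ratio, which are the orbital quantities the method aims to recover.
    """
    phase = 2.0 * np.pi * baseline_m * sep_rad / wavelength_m
    r = flux_ratio
    return (1.0 + r**2 + 2.0 * r * np.cos(phase)) / (1.0 + r) ** 2

# Invented example: 1 mas separation, flux ratio 0.5, observed at 500 nm.
mas = np.pi / 180.0 / 3600.0 / 1000.0
baselines = np.linspace(0.0, 300.0, 7)               # metres
v2 = binary_vis_squared(baselines, 1.0 * mas, 0.5, 500e-9)
for b, v in zip(baselines, v2):
    print(f"B = {b:5.0f} m   |V|^2 = {v:.3f}")
```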
4MOST: Project overview and information for the First Call for Proposals: We introduce the 4-metre Multi-Object Spectroscopic Telescope (4MOST), a new
high-multiplex, wide-field spectroscopic survey facility under development for
the four-metre-class Visible and Infrared Survey Telescope for Astronomy
(VISTA) at Paranal. Its key specifications are: a large field of view (FoV) of
4.2 square degrees and a high multiplex capability, with 1624 fibres feeding
two low-resolution spectrographs ($R = \lambda/\Delta\lambda \sim 6500$), and
812 fibres transferring light to the high-resolution spectrograph ($R \sim
20\,000$). After a description of the instrument and its expected performance,
a short overview is given of its operational scheme and planned 4MOST
Consortium science; these aspects are covered in more detail in other articles
in this edition of The Messenger. Finally, the processes, schedules, and
policies concerning the selection of ESO Community Surveys are presented,
commencing with a singular opportunity to submit Letters of Intent for Public
Surveys during the first five years of 4MOST operations. | astro-ph_IM |
Optimizing Gravitational-Wave Detector Design for Squeezed Light: Achieving the quantum noise targets of third-generation detectors will
require 10 dB of squeezed-light enhancement as well as megawatt laser power in
the interferometer arms - both of which require unprecedented control of the
internal optical losses. In this work, we present a novel optimization approach
to gravitational-wave detector design aimed at maximizing the robustness to
common, yet unavoidable, optical fabrication and installation errors, which
have caused significant loss in Advanced LIGO. As a proof of concept, we employ
these techniques to perform a two-part optimization of the LIGO A+ design.
First, we optimize the arm cavities for reduced scattering loss in the presence
of point absorbers, such as those that currently limit the operating power of Advanced LIGO.
Then, we optimize the signal recycling cavity for maximum squeezing
performance, accounting for realistic errors in the positions and radii of
curvature of the optics. Our findings suggest that these techniques can be
leveraged to achieve substantially greater quantum noise performance in current
and future gravitational-wave detectors. | astro-ph_IM |
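For context, the standard relation between injected squeezing, optical loss, and detected squeezing (not specific to this paper's optimization) shows why loss control is the driving requirement; the loss values below are illustrative.

```python
import numpy as np

def measured_squeezing_db(injected_db, total_loss):
    """Detected squeezing level after optical loss.

    A pure squeezed state of `injected_db` dB passing through a channel with
    fractional power loss `total_loss` has its normalized quadrature variance
    degraded to eta*10**(-S/10) + (1 - eta), with eta = 1 - loss.  This is the
    standard relation behind the loss budgets discussed in the abstract.
    """
    eta = 1.0 - total_loss
    variance = eta * 10.0 ** (-injected_db / 10.0) + (1.0 - eta)
    return -10.0 * np.log10(variance)

# Invented illustration: ~10 dB of detected squeezing needs total losses of
# order 7% or less, even for a very strongly squeezed source.
for loss in (0.05, 0.07, 0.20, 0.30):
    print(f"loss = {loss:4.0%}  ->  {measured_squeezing_db(15.0, loss):4.1f} dB detected")
```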
SORA: Stellar Occultation Reduction and Analysis: The stellar occultation technique provides competitive accuracy in
determining the sizes, shapes, astrometry, etc., of the occulting body,
comparable to in-situ observations by spacecraft. With the increase in the
number of known Solar System objects expected from the LSST, the highly precise
astrometric catalogues, such as Gaia, and the improvement of ephemerides,
occultation observations will become more common, with a higher number of
chords in each observation. In the context of the Big Data era, we developed
SORA, an open-source Python library to reduce and analyse stellar occultation
data efficiently. It includes routines ranging from the prediction of such events to the
determination of Solar System bodies' sizes, shapes, and positions. | astro-ph_IM |
Astrophysics Source Code Library: Here we grow again!: The Astrophysics Source Code Library (ASCL) is a free online registry of
research codes; it is indexed by ADS and Web of Science and has over 1300 code
entries. Its entries are increasingly used to cite software; citations have
been doubling each year since 2012 and every major astronomy journal accepts
citations to the ASCL. Codes in the resource cover all aspects of astrophysics
research and many programming languages are represented. In the past year, the
ASCL added dashboards for users and administrators, started minting Digital
Object Identifiers (DOIs) for software it houses, and added metadata fields
requested by users. This presentation covers the ASCL's growth in the past year
and the opportunities afforded it as one of the few domain libraries for
science research codes. | astro-ph_IM |
Seeing Black Holes : from the Computer to the Telescope: Astronomical observations are about to deliver the very first telescopic
image of the massive black hole lurking at the Galactic Center. The mass of
data collected in one night by the Event Horizon Telescope network, exceeding
everything that has ever been done in any scientific field, should provide a
recomposed image during 2018. All this, forty years after the first numerical
simulations done by the present author. | astro-ph_IM |
Cn2 profile from Shack-Hartmann data with CO-SLIDAR data processing: Cn2 profile monitoring usually makes use of wavefront slope correlations or
of scintillation pattern correlations. Wavefront slope correlations provide
sensitivity to layers close to the receiving plane. In addition, scintillation
correlations allow a better sensitivity to high turbulence layers. Wavefront
slope and scintillation correlations are therefore complementary. Slopes and
scintillation being recorded simultaneously with a Shack-Hartmann wavefront
sensor (SHWFS), we propose here to exploit their correlation to retrieve the
Cn2 profile. The measurement method named COupled SLodar scIDAR (CO-SLIDAR)
uses correlations of SHWFS data from two separated stars. A maximum-likelihood
method is developed to estimate precisely the positions and intensities
corresponding to each SHWFS spot, which are used as inputs for CO-SLIDAR. First
results are presented using SHWFS real data from a binary star. | astro-ph_IM |
On Optimal Geometry for Space Interferometers: This paper examines options for orbit configurations for a space
interferometer. In contrast to previously presented concepts for space very
long baseline interferometry, we propose a combination of regular and
retrograde near-Earth circular orbits in order to achieve a faster filling of
$(u,v)$ coverage. With the rapid relative motion of the telescopes, it will be
possible to quickly obtain high quality images of supermassive black holes. As
a result of such an approach, it will be possible for the first time to conduct
high-quality studies of the close surroundings of supermassive black holes as
they evolve in time. | astro-ph_IM |
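A toy computation of the projected baseline track for a prograde/retrograde pair of circular orbits, ignoring Earth occultation, precession, and realistic mission constraints, just to illustrate how counter-rotation sweeps the $(u,v)$ plane quickly; all parameters are invented.

```python
import numpy as np

GM_EARTH = 3.986e14                       # m^3 s^-2

def circular_orbit(radius_m, t, retrograde=False, inclination=0.0, phase0=0.0):
    """Position on a circular orbit in a toy Earth-centred inertial frame."""
    omega = np.sqrt(GM_EARTH / radius_m**3) * (-1.0 if retrograde else 1.0)
    phase = omega * t + phase0
    x = radius_m * np.cos(phase)
    y = radius_m * np.sin(phase) * np.cos(inclination)
    z = radius_m * np.sin(phase) * np.sin(inclination)
    return np.stack([x, y, z], axis=-1)

# Invented parameters: a 7000 km prograde orbit and a 9000 km retrograde,
# slightly inclined orbit, observed at 1.3 mm for one day.
lam = 1.3e-3
t = np.linspace(0.0, 86400.0, 5000)
sat1 = circular_orbit(7.0e6, t)
sat2 = circular_orbit(9.0e6, t, retrograde=True, inclination=np.radians(20))

# Baseline projected onto the plane perpendicular to a source along +z,
# in wavelengths: this is the (u, v) track swept out over time.
baseline = sat1 - sat2
u, v = baseline[:, 0] / lam, baseline[:, 1] / lam
radius = np.hypot(u, v)
pa_bins = np.unique(np.round(np.degrees(np.arctan2(v, u))) % 360)
print("baseline range: %.1e to %.1e wavelengths" % (radius.min(), radius.max()))
print("1-degree position-angle bins covered in one day: %d / 360" % pa_bins.size)
```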
WAHRSIS: A Low-cost, High-resolution Whole Sky Imager With Near-Infrared
Capabilities: Cloud imaging using ground-based whole sky imagers is essential for a
fine-grained understanding of the effects of cloud formations, which can be
useful in many applications. Some such imagers are available commercially, but
their cost is relatively high, and their flexibility is limited. Therefore, we
built a new daytime Whole Sky Imager (WSI) called Wide Angle High-Resolution
Sky Imaging System. The strengths of our new design are its simplicity, low
manufacturing cost and high resolution. Our imager captures the entire
hemisphere in a single high-resolution picture via a digital camera using a
fish-eye lens. The camera was modified to capture light across the visible as
well as the near-infrared spectral ranges. This paper describes the design of
the device as well as the geometric and radiometric calibration of the imaging
system. | astro-ph_IM |
The 4m International Liquid Mirror Telescope project: The International Liquid Mirror Telescope (ILMT) project is a scientific
collaboration in observational astrophysics between the Liège Institute of
Astrophysics and Geophysics (Liège University, Belgium), the Aryabhatta
Research Institute of observational sciencES (ARIES, Nainital, India) and
several Canadian universities (British Columbia, Laval, Montréal, Toronto,
Victoria and York). Meanwhile, several other institutes have joined the
project: the Royal Observatory of Belgium, the National University of
Uzbekistan and the Ulugh Beg Astronomical Institute (Uzbekistan) as well as the
Poznań Observatory (Poland). The Liège company AMOS (Advanced
Mechanical and Optical Systems) has fabricated the telescope structure that has
been erected on the ARIES site in Devasthal (Uttarakhand, India). It is the
first liquid mirror telescope dedicated to astronomical observations.
First light was obtained on 29 April 2022 and commissioning is being conducted
at the present time. In this short article, we describe and illustrate the main
components of the ILMT. We also highlight the ILMT papers presented during the
third BINA workshop, which discuss various aspects of the ILMT science
programs. | astro-ph_IM |
The Unified Astronomy Thesaurus: The Unified Astronomy Thesaurus (UAT) is an open, interoperable and
community-supported thesaurus which unifies the existing divergent and isolated
Astronomy & Astrophysics vocabularies into a single high-quality,
freely-available open thesaurus formalizing astronomical concepts and their
inter-relationships. The UAT builds upon the existing IAU Thesaurus with major
contributions from the astronomy portions of the thesauri developed by the
Institute of Physics Publishing, the American Institute of Physics, and SPIE.
We describe the effort behind the creation of the UAT and the process through
which we plan to keep the thesaurus up to date through broad community
participation. | astro-ph_IM |
The International Pulsar Timing Array: The International Pulsar Timing Array (IPTA) is an organisation whose raison
d'être is to facilitate collaboration between the three main existing PTAs (the
EPTA in Europe, NANOGrav in North America and the PPTA in Australia) in order
to realise the benefits of combined PTA data sets in reaching the goals of PTA
projects. Currently, shared data sets for 39 pulsars are available for
IPTA-based projects. Operation of the IPTA is administered by a Steering
Committee consisting of six members, two from each PTA, plus the immediate past
Chair in a non-voting capacity. A Constitution and several Agreements define
the framework for the collaboration. Web pages provide information both to
members of participating PTAs and to the general public. With support from an
NSF PIRE grant, the IPTA facilitates the organisation of annual Student
Workshops and Science Meetings. These are very valuable both in training new
students and in communicating current results from IPTA-based research. | astro-ph_IM |
Systematics in the ALMA Proposal Review Rankings: The results from the ALMA proposal peer review process in Cycles 0-6 are
analyzed to identify any systematics in the scientific rankings that may
signify bias. Proposal rankings are analyzed with respect to the experience
level of a Principal Investigator (PI) in submitting ALMA proposals, regional
affiliation (Chile, East Asia, Europe, North America, or Other), and gender.
The analysis was conducted for both the Stage 1 rankings, which are based on
the preliminary scores from the reviewers, and the Stage 2 rankings, which are
based on the final scores from the reviewers after participating in a
face-to-face panel discussion. Analysis of the Stage 1 results shows that PIs
who submit an ALMA proposal in multiple cycles have systematically better
proposal ranks than PIs who have submitted proposals for the first time. In
terms of regional affiliation, PIs from Europe and North America have better
Stage 1 rankings than PIs from Chile and East Asia. Consistent with Lonsdale et
al. (2016), proposals led by men have better Stage 1 rankings than those led by women when
averaged over all cycles. This trend was most noticeably present in Cycle 3,
but no discernible differences in the Stage 1 rankings are present in recent
cycles. Nonetheless, in each cycle to date, women have had a lower proposal
acceptance rate than men even after differences in demographics are considered.
Comparison of the Stage 1 and Stage 2 rankings reveals no significant changes in
the distribution of proposal ranks by experience level, regional affiliation,
or gender as a result of the panel discussions, although the proposal ranks for
East Asian PIs show a marginally significant improvement from Stage 1 to Stage
2 when averaged over all cycles. Thus any systematics in the proposal rankings
are introduced primarily in the Stage 1 process and not from the face-to-face
discussions. | astro-ph_IM |
RSM detection map for direct exoplanet detection in ADI sequences: Beyond the choice of wavefront control systems or coronagraphs, advanced data
processing methods play a crucial role in disentangling potential planetary
signals from bright quasi-static speckles. Among these methods, angular
differential imaging (ADI) for data sets obtained in pupil tracking mode (ADI
sequences) is one of the foremost research avenues, considering the many
observing programs performed with ADI-based techniques and the associated
discoveries. Inspired by the field of econometrics, here we propose a new
detection algorithm for ADI sequences, deriving from the regime-switching model
first proposed in the 1980s. The proposed model is very versatile as it allows
the use of PSF-subtracted data sets (residual cubes) provided by various
ADI-based techniques, separately or together, to provide a single detection
map. The temporal structure of the residual cubes is used for the detection as
the model is fed with a concatenated series of pixel-wise time sequences. The
algorithm provides a detection probability map by considering two possible
regimes for concentric annuli, the first one accounting for the residual noise
and the second one for the planetary signal in addition to the residual noise.
The algorithm performance is tested on data sets from two instruments, VLT/NACO
and VLT/SPHERE. The results show an overall better performance in the receiver
operating characteristic space when compared with standard
signal-to-noise-ratio maps for several state-of-the-art ADI-based
post-processing algorithms. | astro-ph_IM |
The star catalogue of Wilhelm IV, Landgraf von Hessen-Kassel: Accuracy
of the catalogue and of the measurements: We analyse a manuscript star catalogue by Wilhelm IV, Landgraf von
Hessen-Kassel, from 1586. From measurements of altitudes and of angles between
stars, given in the catalogue, we find that the measurement accuracy averages
26 arcsec for eight fundamental stars, compared to 49 arcsec of the
measurements by Brahe. The computations converting altitudes to declinations
and angles between stars to celestial positions are very accurate, with errors
negligible with respect to the measurement errors. Due to an offset in the
position of the vernal equinox the positional error of the catalogue is
slightly worse than that of Brahe's catalogue, but when correction is made for
the offset -- which was known to 17th century astronomers -- the catalogue is
more accurate than that of Brahe by a factor two. We provide machine-readable
Tables of the catalogue. | astro-ph_IM |
Efficient least-squares basket-weaving: We report on a novel method to solve the basket-weaving problem.
Basket-weaving is a technique that is used to remove scan-line patterns from
single-dish radio maps. The new approach applies linear least-squares and works
on gridded maps from arbitrarily sampled data, which greatly improves
computational efficiency and robustness. It also allows masking of bad data,
which is useful for cases where radio frequency interference is present in the
data. We evaluate the algorithms using simulations and real data obtained with
the Effelsberg 100-m telescope. | astro-ph_IM |
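A minimal sketch of the linear least-squares idea on a toy gridded map: one unknown offset per scan line, one equation per pixel where the two scan directions cross, solved in a single least-squares step; the noise model is invented and this is a much-simplified stand-in for the paper's formulation.

```python
import numpy as np

# Toy basket-weaving: a map scanned once along rows and once along columns,
# with one unknown offset per scan line (a crude stand-in for scan-line noise).
rng = np.random.default_rng(5)
n = 32
sky = rng.normal(size=(n, n))
row_off = rng.normal(0, 5, size=n)                   # offset of each row scan
col_off = rng.normal(0, 5, size=n)                   # offset of each column scan
map_rows = sky + row_off[:, None] + 0.1 * rng.normal(size=(n, n))
map_cols = sky + col_off[None, :] + 0.1 * rng.normal(size=(n, n))

# Each pixel (i, j) gives one linear equation:
#   row_off[i] - col_off[j] = map_rows[i, j] - map_cols[i, j]
# Solve them all together by least squares (masking bad pixels would simply
# drop the corresponding rows of A).
i_idx, j_idx = np.meshgrid(np.arange(n), np.arange(n), indexing="ij")
A = np.zeros((n * n, 2 * n))
A[np.arange(n * n), i_idx.ravel()] = 1.0             # + row offset
A[np.arange(n * n), n + j_idx.ravel()] = -1.0        # - column offset
b = (map_rows - map_cols).ravel()
x, *_ = np.linalg.lstsq(A, b, rcond=None)
clean = 0.5 * ((map_rows - x[:n, None]) + (map_cols - x[None, n:]))
print("scan-line residual RMS before/after: %.2f / %.2f"
      % (np.std(map_rows - sky), np.std(clean - sky)))
```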
A template method for measuring the iron spectrum in cosmic rays with
Cherenkov telescopes: The energy-dependent abundance of elements in cosmic rays plays an important
role in understanding their acceleration and propagation. Most current results
are obtained either from direct measurements by balloon- or satellite-borne
detectors, or from indirect measurements by air shower detector arrays on the
Earth's surface. Imaging Atmospheric Cherenkov Telescopes (IACTs), used
primarily for $\gamma$-ray astronomy, can also be used for cosmic-ray physics.
They are able to measure Cherenkov light emitted both by heavy nuclei and by
secondary particles produced in air showers, and are thus sensitive to the
charge and energy of cosmic ray particles with energies of tens to hundreds of
TeV. A template-based method, which can be used to reconstruct the charge and
energy of primary particles simultaneously from images taken by IACTs, will be
introduced. Heavy nuclei, such as iron, can be separated from lighter cosmic
rays with this method, and thus the abundance and spectrum of these nuclei can
be measured in the range of tens to hundreds of TeV. | astro-ph_IM |
Characterization and correction of charge-induced pixel shifts in DECam: Interaction of charges in CCDs with the already accumulated charge
distribution causes both a flux dependence of the point-spread function (an
increase of observed size with flux, also known as the brighter/fatter effect)
and pixel-to-pixel correlations of the Poissonian noise in flat fields. We
describe these effects in the Dark Energy Camera (DECam) with charge dependent
shifts of effective pixel borders, i.e. the Antilogus et al. (2014) model,
which we fit to measurements of flat-field Poissonian noise correlations. The
latter fall off approximately as a power law $r^{-2.5}$ with pixel separation $r$,
are isotropic except for an asymmetry in the direct neighbors along rows and
columns, are stable in time, and are weakly dependent on wavelength. They show
variations from chip to chip at the 20% level that correlate with the silicon
resistivity. The charge shifts predicted by the model cause biased shape
measurements, primarily due to their effect on bright stars, at levels
exceeding weak lensing science requirements. We measure the flux dependence of
star images and show that the effect can be mitigated by applying the reverse
charge shifts at the pixel level during image processing. Differences in
stellar size, however, remain significant due to residuals at larger distance
from the centroid. | astro-ph_IM |
Optimal Probabilistic Catalogue Matching for Radio Sources: Cross-matching catalogues from radio surveys to catalogues of sources at
other wavelengths is extremely hard, because radio sources are often extended,
often consist of several spatially separated components, and often no radio
component is coincident with the optical/infrared host galaxy. Traditionally,
the cross-matching is done by eye, but this does not scale to the millions of
radio sources expected from the next generation of radio surveys. We present an
innovative automated procedure, using Bayesian hypothesis testing, that models
trial radio-source morphologies with putative positions of the host galaxy.
This new algorithm differs from an earlier version by allowing more complex
radio source morphologies, and performing a simultaneous fit over a large
field. We show that this technique performs well in an unsupervised mode. | astro-ph_IM |
Reaching Diverse Groups in Long-Term Astronomy Public Engagement Efforts: Professional astronomy is historically not an environment of diverse
identities. In recognizing that public outreach efforts affect career outcomes
for young people, it is important to assess the demographics of those being
reached and continually consider strategies for successfully engaging
underrepresented groups. One such outreach event, the International
Astronomical Youth Camp (IAYC), has a 50-year history and has reached ~1700
participants from around the world. We find that the IAYC is doing well in
terms of gender (59% female, 4.7% non-binary at the most recent camp) and LGBT+
representation, whereas black and ethnic minorities are lacking. In this
proceeding, we report the current landscape of demographics applying to and
attending the IAYC; the efforts we are making to increase diversity amongst
participants; the challenges we face; and our future plans to bridge these
gaps, not only for the benefit of the camp but for society overall. | astro-ph_IM |
Analysis of a Custom Support Vector Machine for Photometric Redshift
Estimation and the Inclusion of Galaxy Shape Information: Aims: We present a custom support vector machine classification package for
photometric redshift estimation, including comparisons with other methods. We
also explore the efficacy of including galaxy shape information in redshift
estimation. Support vector machines, a type of machine learning, utilize
optimization theory and supervised learning algorithms to construct predictive
models based on the information content of data in a way that can treat
different input features symmetrically.
Methods: The custom support vector machine package we have developed is
designated SPIDERz and made available to the community. As test data for
evaluating performance and comparison with other methods, we apply SPIDERz to
four distinct data sets: 1) the publicly available portion of the PHAT-1
catalog based on the GOODS-N field with spectroscopic redshifts in the range $z
< 3.6$, 2) 14365 galaxies from the COSMOS bright survey with photometric band
magnitudes, morphology, and spectroscopic redshifts inside $z < 1.4$, 3) 3048
galaxies from the overlap of COSMOS photometry and morphology with 3D-HST
spectroscopy extending to $z < 3.9$, and 4) 2612 galaxies with five-band
photometric magnitudes and morphology from the All-wavelength Extended Groth
Strip International Survey and $z < 1.57$.
Results: We find that SPIDERz achieves results competitive with other
empirical packages on the PHAT-1 data, and performs quite well in estimating
redshifts with the COSMOS and AEGIS data, including in the cases of a large
redshift range ($0 < z < 3.9$). We also determine from analyses with both the
COSMOS and AEGIS data that the inclusion of morphological information does not
have a statistically significant benefit for photometric redshift estimation
with the techniques employed here. | astro-ph_IM |
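A schematic of the generic SVM-based photometric-redshift workflow on synthetic data, using scikit-learn; SPIDERz itself is a custom classification-based package whose implementation and feature handling are not reproduced here.

```python
import numpy as np
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler
from sklearn.svm import SVR

# Synthetic stand-in for a photometric training set: five band magnitudes plus
# a "shape" feature, and a redshift to regress.  This only illustrates the
# generic SVM workflow; SPIDERz is a classification-based package and its
# actual implementation is not reproduced here.
rng = np.random.default_rng(7)
n = 2000
z = rng.uniform(0.0, 1.4, n)
mags = 22.0 + np.outer(z, np.linspace(1.0, 2.0, 5)) + 0.1 * rng.normal(size=(n, 5))
shape = 0.5 + 0.1 * rng.normal(size=(n, 1))          # uninformative morphology
X = np.hstack([mags, shape])

model = make_pipeline(StandardScaler(), SVR(C=10.0, epsilon=0.02))
model.fit(X[:1500], z[:1500])
pred = model.predict(X[1500:])
sigma = np.std((pred - z[1500:]) / (1.0 + z[1500:]))
print(f"sigma_dz/(1+z) on the held-out set: {sigma:.3f}")
```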
COMAP Early Science: II. Pathfinder Instrument: Line intensity mapping (LIM) is a new technique for tracing the global
properties of galaxies over cosmic time. Detection of the very faint signals
from redshifted carbon monoxide (CO), a tracer of star formation, pushes the
limits of what is feasible with a total-power instrument. The CO Mapping
Project (COMAP) Pathfinder is a first-generation instrument aiming to prove the
concept and develop the technology for future experiments, as well as
delivering early science products. With 19 receiver channels in a hexagonal
focal plane arrangement on a 10.4 m antenna, and an instantaneous 26-34 GHz
frequency range with 2 MHz resolution, it is ideally suited to measuring
CO($J$=1-0) from $z\sim3$. In this paper we discuss strategies for designing
and building the Pathfinder and the challenges that were encountered. The
design of the instrument prioritized LIM requirements over those of ancillary
science. After a couple of years of operation, the instrument is well
understood, and the first year of data is already yielding useful science
results. Experience with this Pathfinder will drive the design of the next
generations of experiments. | astro-ph_IM |
Stout: Cloudy's Atomic and Molecular Database: We describe a new atomic and molecular database we developed for use in the
spectral synthesis code Cloudy. The design of Stout is driven by the data needs
of Cloudy, which simulates molecular, atomic, and ionized gas with kinetic
temperatures 2.8 K < T < 1e10 K and densities spanning the low to high-density
limits. The radiation field between photon energies $10^{-8}$ Ry and 100 MeV is
considered, along with all atoms and ions of the lightest 30 elements, and ~100
molecules. For ease of maintenance, the data are stored in a format as close as
possible to the original data sources. Few data sources include the full range
of data we need. We describe how we fill in the gaps in the data or extrapolate
rates beyond their tabulated range. We tabulate data sources both for the
atomic spectroscopic parameters and for collision data for the next release of
Cloudy. This is not intended as a review of the current status of atomic data,
but rather a description of the features of the database which we will build
upon. | astro-ph_IM |
Adapting the PyCBC pipeline to find and infer the properties of
gravitational waves from massive black hole binaries in LISA: The Laser Interferometer Space Antenna (LISA), due for launch in the mid
2030s, is expected to observe gravitational waves (GWs) from merging massive
black hole binaries (MBHBs). These signals can last from days to months,
depending on the masses of the black holes, and are expected to be observed
with high signal-to-noise ratios (SNRs) out to high redshifts. We have adapted
the PyCBC software package to enable a template bank search and inference of
GWs from MBHBs. The pipeline is tested on Challenge 2a ("Sangria") of the LISA
Data Challenge (LDC), which contains MBHBs and thousands of
galactic binaries (GBs) in simulated instrumental LISA noise. Our search
identifies all 6 MBHB signals with more than $92\%$ of the optimal SNR. The
subsequent parameter inference step recovers the masses and spins within their
$90\%$ confidence interval. Sky position parameters have 8 high likelihood
modes which are recovered but often our posteriors favour the incorrect sky
mode. We observe that the addition of GBs biases the parameter recovery of
masses and spins away from the injected values, reinforcing the need for a
global fit pipeline which will simultaneously fit the parameters of the GB
signals before estimating the parameters of MBHBs. | astro-ph_IM |
Astrobiological Complexity with Probabilistic Cellular Automata: Search for extraterrestrial life and intelligence constitutes one of the
major endeavors in science, but has so far been quantitatively modeled only rarely
and in a cursory and superficial fashion. We argue that probabilistic cellular
automata (PCA) represent the best quantitative framework for modeling
astrobiological history of the Milky Way and its Galactic Habitable Zone. The
relevant astrobiological parameters are to be modeled as the elements of the
input probability matrix for the PCA kernel. With the underlying simplicity of
the cellular automata constructs, this approach enables a quick analysis of
a large and ambiguous input parameter space. We perform a simple clustering
analysis of typical astrobiological histories and discuss the relevant boundary
conditions of practical importance for planning and guiding actual empirical
astrobiological and SETI projects. In addition to showing how the present
framework is adaptable to more complex situations and updated observational
databases from current and near-future space missions, we demonstrate how
numerical results could offer a cautious rationale for continuation of
practical SETI searches. | astro-ph_IM |
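To make the mechanics of a probabilistic cellular automaton concrete, the sketch below evolves a grid of Galactic sites through a few astrobiological states driven by an input probability matrix. The states and the transition probabilities are invented for the illustration and are not taken from the paper.

```python
import numpy as np

# Hypothetical states: 0 = no life, 1 = simple life, 2 = complex life, 3 = technological
N_STATES = 4

# Input probability matrix P[i, j]: probability per time step that a cell in
# state i transitions to state j (rows sum to 1). Values are illustrative only.
P = np.array([
    [0.990, 0.010, 0.000, 0.000],
    [0.005, 0.975, 0.020, 0.000],
    [0.005, 0.000, 0.985, 0.010],
    [0.010, 0.000, 0.000, 0.990],
])

def step(grid, rng):
    """Apply one synchronous stochastic update to every cell."""
    cdf = np.cumsum(P[grid], axis=-1)          # per-cell cumulative transition probs
    u = rng.random(grid.shape)[..., None]      # one uniform draw per cell
    return (u > cdf).sum(axis=-1)              # index of the sampled new state

rng = np.random.default_rng(42)
grid = np.zeros((100, 100), dtype=int)         # initially lifeless sites
for _ in range(1000):
    grid = step(grid, rng)
print(np.bincount(grid.ravel(), minlength=N_STATES))   # census of states
```

Different input matrices then correspond to different astrobiological histories, which is the kind of output a clustering analysis like the one described above can operate on.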
A Novel Greedy Approach To Harmonic Summing Using GPUs: Incoherent harmonic summing is a technique which is used to improve the
sensitivity of Fourier domain search methods. A one dimensional harmonic sum is
used in time-domain radio astronomy as part of the Fourier domain periodicity
search, a type of search used to detect isolated single pulsars. The main
problem faced when implementing the harmonic sum on many-core architectures,
like GPUs, is the very unfavourable memory access pattern of the harmonic sum
algorithm. The memory access pattern gets worse as the dimensionality of the
harmonic sum increases. Here we present a set of algorithms for calculating the
harmonic sum that are suited to many-core architectures such as GPUs. We
present an evaluation of the sensitivity of these different approaches, and
their performance. This work forms part of the AstroAccelerate project which is
a GPU accelerated software package for processing time-domain radio astronomy
data. | astro-ph_IM |
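For reference, the basic (non-greedy, CPU) form of the incoherent harmonic sum that such GPU algorithms accelerate can be written in a few lines. The NumPy sketch below only defines the operation and exposes the strided reads that make the memory access pattern unfavourable; it does not reproduce the AstroAccelerate implementation.

```python
import numpy as np

def harmonic_sums(power, max_harmonics=8):
    """Incoherent harmonic sums of a 1-D Fourier power spectrum.

    Returns hs with hs[k-1, i] = sum_{h=1..k} power[h*i], i.e. the k-harmonic
    sum at fundamental bin i; harmonics falling off the end of the spectrum
    contribute zero power.
    """
    n = len(power)
    hs = np.zeros((max_harmonics, n))
    acc = np.zeros(n)
    for k in range(1, max_harmonics + 1):
        stretched = np.zeros(n)
        stretched[: (n + k - 1) // k] = power[::k]   # bins 0, k, 2k, ... (strided reads)
        acc += stretched
        hs[k - 1] = acc
    return hs

# Toy example: strong harmonics of a fundamental at bin 100
spec = np.random.chisquare(2, size=4096)
spec[100::100] += 10.0
hs = harmonic_sums(spec)
print(hs[7].argmax())   # the 8-harmonic sum should peak near bin 100
```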
The Simons Observatory 220 and 280 GHz Focal-Plane Module: Design and
Initial Characterization: The Simons Observatory (SO) will detect and map the temperature and
polarization of the millimeter-wavelength sky from Cerro Toco, Chile across a
range of angular scales, providing rich data sets for cosmological and
astrophysical analysis. The SO focal planes will be tiled with compact
hexagonal packages, called Universal Focal-plane Modules (UFMs), in which the
transition-edge sensor (TES) detectors are coupled to 100 mK
microwave-multiplexing electronics. Three different types of dichroic TES
detector arrays with bands centered at 30/40, 90/150, and 220/280 GHz will be
implemented across the 49 planned UFMs. The 90/150 GHz and 220/280 GHz arrays
each contain 1,764 TESes, which are read out with two 910x multiplexer
circuits. The modules contain a series of densely routed silicon chips, which
are packaged together in a controlled electromagnetic environment with robust
heat-sinking to 100 mK. Following an overview of the module design, we report
on early results from the first 220/280 GHz UFM, including detector yield, as
well as readout and detector noise levels. | astro-ph_IM |
High-Contrast Testbeds for Future Space-Based Direct Imaging Exoplanet
Missions: Instrumentation techniques in the field of direct imaging of exoplanets have
greatly advanced over the last two decades. Two of the four NASA-commissioned
large concept studies involve a high-contrast instrument for the imaging and
spectral characterization of exo-Earths from space: LUVOIR and HabEx. This
whitepaper describes the status of 8 optical testbeds in the US and France
currently in operation to experimentally validate the necessary technologies to
image exo-Earths from space. They explore two complementary axes of research:
(i) coronagraph designs and manufacturing and (ii) active wavefront correction
methods and technologies. Several instrument architectures are currently being
analyzed in parallel to provide more degrees of freedom for designing the
future coronagraphic instruments. The necessary level of performance has
already been demonstrated in-laboratory for clear off-axis telescopes
(HabEx-like) and important efforts are currently in development to reproduce
this accomplishment on segmented and/or on-axis telescopes (LUVOIR-like) over
the next two years. | astro-ph_IM |
Planck-LFI radiometers tuning: "This paper is part of the Prelaunch status LFI papers published on JINST:
http://www.iop.org/EJ/journal/-page=extra.proc5/jinst"
This paper describes the Planck Low Frequency Instrument tuning activities
performed through the ground test campaigns, from Unit to Satellite Levels.
Tuning is key to achieve the best possible instrument performance and tuning
parameters strongly depend on thermal and electrical conditions. For this
reason tuning has been repeated several times during ground tests and it has
been repeated in flight before starting nominal operations. The paper discusses
the tuning philosophy, the activities and the obtained results, highlighting
developments and changes occurred during test campaigns. The paper concludes
with an overview of tuning performed during the satellite cryogenic test
campaign (Summer 2008) and of the plans for the just started in-flight
calibration. | astro-ph_IM |
A tomographic algorithm to determine tip-tilt information from laser
guide stars: Laser Guide Stars (LGS) have greatly increased the sky-coverage of Adaptive
Optics (AO) systems. Due to the up-link turbulence experienced by LGSs, a
Natural Guide Star (NGS) is still required, preventing full sky-coverage. We
present a method of obtaining partial tip-tilt information from LGSs alone in
multi-LGS tomographic LGS AO systems. The method of LGS up-link tip-tilt
determination is derived using a geometric approach, then an alteration to the
Learn and Apply algorithm for tomographic AO is made to accommodate up-link
tip-tilt. Simulation results are presented, verifying that the technique shows
good performance in correcting high altitude tip-tilt, but not that from low
altitudes. We suggest that the method is combined with multiple far off-axis
tip-tilt NGSs to provide gains in performance and sky-coverage over current
tomographic AO systems. | astro-ph_IM |
Information field theory: Non-linear image reconstruction and signal analysis deal with complex inverse
problems. To tackle such problems in a systematic way, I present information
field theory (IFT) as a means of Bayesian, data based inference on spatially
distributed signal fields. IFT is a statistical field theory, which permits the
construction of optimal signal recovery algorithms even for non-linear and
non-Gaussian signal inference problems. IFT algorithms exploit spatial
correlations of the signal fields and benefit from techniques developed to
investigate quantum and statistical field theories, such as Feynman diagrams,
re-normalisation calculations, and thermodynamic potentials. The theory can be
used in many areas, and applications in cosmology and numerics are presented. | astro-ph_IM |
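As a concrete anchor, the simplest IFT inference problem, a linear measurement $d = Rs + n$ with Gaussian signal and noise of covariances $S$ and $N$, has the classical Wiener filter as its optimal estimator:

$$ m = D\,j, \qquad D = \bigl(S^{-1} + R^{\dagger} N^{-1} R\bigr)^{-1}, \qquad j = R^{\dagger} N^{-1} d . $$

Here $m$ is the posterior mean field, $D$ the propagator (posterior covariance), and $j$ the information source; the diagrammatic and re-normalisation techniques mentioned above can be viewed as expansions around this free-theory result for non-linear and non-Gaussian problems.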
Optimum Acceptance Regions for Direct Dark Matter Searches: Most experiments that search for direct interactions of WIMP dark matter with
a target can distinguish the dominant electron-recoil background from the
nuclear-recoil signal, based on some discrimination parameter. An acceptance
region is defined in the parameter space spanned by the recoil energy and this
discrimination parameter. In the absence of a clear signal in this region, a
limit is calculated on the dark matter scattering cross section. Here, an
algorithm is presented that allows the acceptance region to be defined a priori
such that the experiment has the best sensitivity. This is achieved through
optimized acceptance regions for each WIMP model and WIMP mass that is to be
probed. Using recent data from the CRESST-II experiment as an example, it is
shown that resulting limits can be substantially stronger than those from a
conventional acceptance region. In an experiment with a segmented target, the
algorithm developed here can yield different acceptance regions for the
individual subdetectors. Hence, it is shown how to combine the data
consistently within the usual Maximum Gap or Optimum Interval framework. | astro-ph_IM |
The BINGO Project II: Instrument Description: The measurement of diffuse 21-cm radiation from the hyperfine transition of
neutral hydrogen (HI signal) in different redshifts is an important tool for
modern cosmology. However, detecting this faint signal with non-cryogenic
receivers in single-dish telescopes is a challenging task. The BINGO (Baryon
Acoustic Oscillations from Integrated Neutral Gas Observations) radio telescope
is an instrument designed to detect baryonic acoustic oscillations (BAOs) in
the cosmological HI signal, in the redshift interval $0.127 \le z \le 0.449$.
This paper describes the BINGO radio telescope, including the current status of
the optics, receiver, observational strategy, calibration, and the site. BINGO
has been carefully designed to minimize systematics, being a transit instrument
with no moving dishes and 28 horns operating in the frequency range $980 \le
\nu \le 1260$ MHz. Comprehensive laboratory tests were conducted for many of
the BINGO subsystems and the prototypes of the receiver chain, horn, polarizer,
magic tees, and transitions have been successfully tested between 2018 - 2020.
The survey was designed to cover $\sim 13\%$ of the sky, with the primary
mirror pointing at declination $\delta=-15^{\circ}$. The telescope will see an
instantaneous declination strip of $14.75^{\circ}$. The results of the
prototype tests closely meet those obtained during the modeling process,
suggesting BINGO will perform according to our expectations. After one year of
observations with a $60\%$ duty cycle and 28 horns, BINGO should achieve an
expected sensitivity of 102 $\mu K$ per 9.33 MHz frequency channel, one
polarization, and be able to measure the HI power spectrum in a competitive
time frame. | astro-ph_IM |
Automated Speckle Interferometry of Known Binaries: Astronomers have been measuring the separations and position angles between
the two components of binary stars since William Herschel began his
observations in 1781. In 1970, Anton Labeyrie pioneered a method, speckle
interferometry, that overcomes the usual resolution limits induced by
atmospheric turbulence by taking hundreds or thousands of short exposures and
reducing them in Fourier space. Our 2022 automation of speckle interferometry
allowed us to use a fully robotic 1.0-meter PlaneWave Instruments telescope,
located at the El Sauce Observatory in the Atacama Desert of Chile, to obtain
observations of many known binaries with established orbits. The long-term
objective of these observations is to establish the precision, accuracy, and
limitations of this telescope's automated speckle interferometry measurements.
This paper provides an early overview of the Known Binaries Project and provides
example results on a small-separation (0.27") binary, WDS 12274-2843 B 228. | astro-ph_IM |
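The core of Labeyrie's reduction is averaging the Fourier power spectra of many short exposures, which preserves binary fringes beyond the seeing limit. A minimal NumPy sketch of that step follows; the synthetic frames are placeholders, and in practice the target power spectrum is also divided by that of a point-source reference star.

```python
import numpy as np

def mean_power_spectrum(frames):
    """Average Fourier power spectrum of a stack of short-exposure images.

    frames : array of shape (n_frames, ny, nx). Atmospheric phase errors average
    out in the modulus-squared of the FFT, so binary fringes survive even though
    each individual frame is speckled.
    """
    ps = np.zeros(frames.shape[1:])
    for frame in frames:
        ps += np.abs(np.fft.fft2(frame - frame.mean())) ** 2
    return ps / len(frames)

# Tiny synthetic demo: each "speckle" frame plus a fainter copy shifted by 7 px
rng = np.random.default_rng(5)
psfs = rng.random((200, 64, 64))
frames = psfs + 0.5 * np.roll(psfs, 7, axis=2)
ps = mean_power_spectrum(frames)   # fringes in ps encode the 7-px separation
print(ps.shape)
```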
LIGO series, dimension of embedding and Kolmogorov's complexity: The interpretation of the series recorded by the Laser Interferometer
Gravitational Wave Observatory is a very important issue. Naturally, it is not
free of controversy. Here we apply two methods widely used in the study of
nonlinear dynamical systems, namely, the calculation of Takens' dimension of
embedding and the spectrum of Kolmogorov's complexity, to the series recorded
in event GW150914. An increase of the former and a drop of the latter are
observed, consistent with the claimed appearance of a gravitational wave. We
propose these methods as additional tools to help identify signals of
cosmological interest. | astro-ph_IM |
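For readers unfamiliar with the first of these tools, a time-delay (Takens) embedding simply stacks lagged copies of a scalar series into vectors. A minimal sketch follows; the embedding dimension and delay chosen here are arbitrary, and estimating the appropriate dimension (e.g. by false nearest neighbours) is a separate step.

```python
import numpy as np

def delay_embed(x, dim, tau):
    """Time-delay embedding of a 1-D series x.

    Returns an array of shape (len(x) - (dim - 1) * tau, dim) whose rows are
    the delay vectors [x[i], x[i + tau], ..., x[i + (dim - 1) * tau]].
    """
    n = len(x) - (dim - 1) * tau
    return np.column_stack([x[i * tau: i * tau + n] for i in range(dim)])

# Toy example: a noisy sinusoid embedded in 3 dimensions with delay 5
t = np.arange(2000)
x = np.sin(2 * np.pi * t / 100) + 0.05 * np.random.normal(size=t.size)
vectors = delay_embed(x, dim=3, tau=5)
print(vectors.shape)   # (1990, 3)
```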
4MOST Consortium Survey 10: The Time-Domain Extragalactic Survey (TiDES): The Time-Domain Extragalactic Survey (TiDES) is focused on the spectroscopic
follow-up of extragalactic optical transients and variable sources selected
from forthcoming large sky surveys such as that from the Large Synoptic Survey
Telescope (LSST). TiDES contains three sub-surveys: (i) spectroscopic
observations of supernova-like transients; (ii) comprehensive follow-up of
transient host galaxies to obtain redshift measurements for cosmological
applications; and (iii) repeat spectroscopic observations to enable the
reverberation mapping of active galactic nuclei. Our simulations predict we
will be able to classify transients down to $r = 22.5$ magnitudes (AB) and,
over five years of 4MOST operations, obtain spectra for up to 30,000 live
transients to redshift $z \sim 0.5$, measure redshifts for up to 50,000
transient host galaxies to $z \sim 1$ and monitor around 700 active galactic
nuclei to $z \sim 2.5$. | astro-ph_IM |
Miniature X-Ray Solar Spectrometer (MinXSS) - A Science-Oriented,
University 3U CubeSat: The Miniature X-ray Solar Spectrometer (MinXSS) is a 3-Unit (3U) CubeSat
developed at the Laboratory for Atmospheric and Space Physics (LASP) at the
University of Colorado, Boulder (CU). Over 40 students contributed to the
project with professional mentorship and technical contributions from
professors in the Aerospace Engineering Sciences Department at CU and from LASP
scientists and engineers. The scientific objective of MinXSS is to study
processes in the dynamic Sun, from quiet-Sun to solar flares, and to further
understand how these changes in the Sun influence the Earth's atmosphere by
providing unique spectral measurements of solar soft x-rays (SXRs). The
enabling technology providing the advanced solar SXR spectral measurements is
the Amptek X123, a commercial-off-the-shelf (COTS) silicon drift detector
(SDD). The Amptek X123 has a low mass (~324 g after modification), modest power
consumption (~2.50 W), and small volume (6.86 cm x 9.91 cm x 2.54 cm), making
it ideal for a CubeSat. This paper provides an overview of the MinXSS mission:
the science objectives, project history, subsystems, and lessons learned that
can be useful for the small-satellite community. | astro-ph_IM |
Daemons: Detection at Pulkovo, Gran Sasso, and Soudan: During a week of the March maximum in 2011, two oppositely installed
direction-sensitive TEU-167d Dark Electron Multipliers (DEMs) recorded a flux
of daemons from the near-Earth almost circular heliocentric orbits (NEACHOs).
The flux measured from above is f \approx (8\pm3)\times10^-7 cm^-2 s^-1, and
that from below is twice smaller. The difference may be due both to specific
design features of the TEUs themselves, and to dissimilarities in the slope of
trajectories along which objects are coming from above or from below. It is
shown that the daemon paradigm enables a quantitative interpretation of DAMA
and CoGeNT experiments with no additional hypotheses. Both the experiments
record a daemon flux of $f \sim 10^{-6}$ cm$^{-2}$ s$^{-1}$ from strongly elongated
Earth-crossing heliocentric orbits (SEECHOs), predecessors of NEACHOs.
Recommendations are given for processing of DAMA/LIBRA data, which
unambiguously suggest that, in approximately half of cases (when there occur
double events in the detector, rejected in processing under a single-hit
criterion), the signals being recorded are successively excited by a single
SEECHO object along a path of ~1 m, i.e., this is not a WIMP. It is noted that
due regard to cascade events and pair interaction of ions will weaken the
adverse influence exerted by the blocking effect on the channeling of iodine
ions knocked out in the NaI(Tl) crystal. This influence will not be as
catastrophic as simplified semi-analytical models of the process suggest: one
might expect that the energy of up to ~10% of primary recoil iodine ions will be
converted into scintillation light. | astro-ph_IM |
On the Estimation of the Depth of Maximum of Extensive Air Showers Using
the Steepness Parameter of the Lateral Distribution of Cherenkov Radiation: Using Monte Carlo simulation of extensive air showers, we showed that the
depth of shower maximum, $X_{max}$, can be estimated using $P=Q(100)/Q(200)$,
the ratio of Cherenkov photon densities at 100 and 200 meters from the shower
core, which is known as the steepness parameter of the lateral distribution of
Cherenkov radiation on the ground. A simple quadratic model has been fitted to
a set of data from simulated extensive air showers, relating the steepness
parameter and the shower maximum depth. Then the model has been tested on
another set of simulated showers. The average difference between the actual
maximum depth of the simulated showers and the maximum depth obtained from the
lateral distribution of Cherenkov light is about 9 $g/cm^2$. In addition,
the possibility of a more direct estimation of the mass of the primary particle
from $P$ has been investigated. An exponential relation between these two
quantities has been fitted. Applying the model to another set of showers, we
found that the average difference between the estimated and the actual mass of
primary particles is less than 0.5 atomic mass unit. | astro-ph_IM |
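The fitting step described above is a plain least-squares quadratic in the steepness parameter. A minimal sketch with synthetic numbers follows; the coefficients and the scatter are invented for illustration and are not those of the paper.

```python
import numpy as np

rng = np.random.default_rng(1)

# Hypothetical "simulated" showers: steepness parameter P and true X_max (g/cm^2)
P_train = rng.uniform(2.0, 8.0, 500)
Xmax_true = 500.0 + 60.0 * P_train - 2.0 * P_train**2          # invented relation
Xmax_train = Xmax_true + rng.normal(0.0, 9.0, P_train.size)     # ~9 g/cm^2 scatter

# Fit the quadratic model X_max = a P^2 + b P + c
coeffs = np.polyfit(P_train, Xmax_train, deg=2)

# Apply the model to an independent "test" set and check the residuals
P_test = rng.uniform(2.0, 8.0, 500)
Xmax_test = 500.0 + 60.0 * P_test - 2.0 * P_test**2 + rng.normal(0.0, 9.0, P_test.size)
residuals = np.polyval(coeffs, P_test) - Xmax_test
print(f"mean |residual| = {np.mean(np.abs(residuals)):.1f} g/cm^2")
```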
Applied Machine-Learning Models to Identify Spectral Sub-Types of M
Dwarfs from Photometric Surveys: M dwarfs are the most abundant stars in the Solar Neighborhood and they are
prime targets for searching for rocky planets in habitable zones. Consequently,
a detailed characterization of these stars is in demand. The spectral sub-type
is one of the parameters that is used for the characterization and it is
traditionally derived from the observed spectra. However, obtaining the spectra
of M dwarfs is expensive in terms of observation time and resources due to
their intrinsic faintness. We study the performance of four machine-learning
(ML) models: K-Nearest Neighbor (KNN), Random Forest (RF), Probabilistic Random
Forest (PRF), and Multilayer Perceptron (MLP), in identifying the spectral
sub-types of M dwarfs at a grand scale by deploying broadband photometry in the
optical and near-infrared. We trained the ML models by using the
spectroscopically identified M dwarfs from the Sloan Digital Sky Survey Data
Release (SDSS) 7, together with their photometric colors that were derived from
the SDSS, Two-Micron All-Sky Survey, and Wide-field Infrared Survey Explorer.
We found that the RF, PRF, and MLP give comparable prediction accuracies of 74%,
while the KNN provides a slightly lower accuracy of 71%. We also found that these
models can predict the spectral sub-type of M dwarfs with ~99% accuracy within
+/-1 sub-type. The five most useful features for the prediction are r-z, r-i,
r-J, r-H, and g-z, and hence lacking data in all SDSS bands substantially
reduces the prediction accuracy. However, we can achieve an accuracy of over
70% when the r and i magnitudes are available. Since the stars in this study
are nearby (d~1300 pc for 95% of the stars), the dust extinction can reduce the
prediction accuracy by only 3%. Finally, we used our optimized RF models to
predict the spectral sub-types of M dwarfs from the Catalog of Cool Dwarf
Targets for TESS, and we provide the optimized RF models for public use. | astro-ph_IM |
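A stripped-down version of the random-forest step, using scikit-learn and color features like those named above; the mock training table, the color-to-sub-type trends, and the split are placeholders, not the SDSS/2MASS/WISE catalogs used in the paper.

```python
import numpy as np
from sklearn.ensemble import RandomForestClassifier
from sklearn.model_selection import train_test_split

# Hypothetical training set: five colors (r-z, r-i, r-J, r-H, g-z) that trend
# roughly linearly with sub-type, plus scatter; real inputs would be cross-matched
# photometry of spectroscopically typed M dwarfs.
rng = np.random.default_rng(0)
n = 2000
subtype = rng.integers(0, 10, n)                       # M0 ... M9
slopes = np.array([0.35, 0.20, 0.45, 0.50, 0.40])      # invented trends
offsets = np.array([1.5, 0.6, 2.5, 3.0, 2.5])
colors = offsets + slopes * subtype[:, None] + rng.normal(0, 0.15, (n, 5))

X_train, X_test, y_train, y_test = train_test_split(colors, subtype,
                                                    test_size=0.3, random_state=0)
clf = RandomForestClassifier(n_estimators=300, random_state=0).fit(X_train, y_train)
pred = clf.predict(X_test)
print("exact accuracy:      ", np.mean(pred == y_test))
print("within +/-1 sub-type:", np.mean(np.abs(pred - y_test) <= 1))
```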
The SVOM gamma-ray burst mission: We briefly present the science capabilities, the instruments, the operations,
and the expected performance of the SVOM mission. SVOM (Space-based multiband
astronomical Variable Objects Monitor) is a Chinese-French space mission
dedicated to the study of Gamma-Ray Bursts (GRBs) in the next decade. The SVOM
mission encompasses a satellite carrying four instruments to detect and
localize the prompt GRB emission and measure the evolution of the afterglow in
the visible band and in X-rays, a VHF communication system enabling the fast
transmission of SVOM alerts to the ground, and a ground segment including a
wide angle camera and two follow-up telescopes. The pointing strategy of the
satellite has been optimized to favor the detection of GRBs located in the
night hemisphere. This strategy enables the study of the optical emission in
the first minutes after the GRB with robotic observatories and the early
spectroscopy of the optical afterglow with large telescopes to measure the
redshifts. The study of GRBs in the next decade will benefit from a number of
large facilities in all wavelengths that will contribute to increase the
scientific return of the mission. Finally, SVOM will operate in the era of the
next generation of gravitational wave detectors, greatly contributing to
searches for the electromagnetic counterparts of gravitational wave triggers at
X-ray and gamma-ray energies. | astro-ph_IM |
Optimal detuning for quantum filter cavities: Vacuum quantum fluctuations impose a fundamental limit on the sensitivity of
gravitational-wave interferometers, which rank among the most sensitive
precision measurement devices ever built. The injection of conventional
squeezed vacuum reduces quantum noise in one quadrature at the expense of
increasing noise in the other. While this approach improved the sensitivity of
the Advanced LIGO and Advanced Virgo interferometers during their third
observing run (O3), future improvements in arm power and squeezing levels will
bring radiation pressure noise to the forefront. Installation of a filter
cavity for frequency-dependent squeezing provides broadband reduction of
quantum noise through the mitigation of this radiation pressure noise, and it
is the baseline approach planned for all of the future gravitational-wave
detectors currently conceived. The design and operation of a filter cavity
requires careful consideration of interferometer optomechanics as well as
squeezing degradation processes. In this paper, we perform an in-depth analysis
to determine the optimal operating point of a filter cavity. We use our model
alongside numerical tools to study the implications for filter cavities to be
installed in the upcoming "A+" upgrade of the Advanced LIGO detectors. | astro-ph_IM |
Sub-Kelvin Cooling for the BICEP Array Project: In the field of astrophysics, the faint signal from distant galaxies and
other dim cosmological sources at millimeter and submillimeter wavelengths
require the use of high-sensitivity experiments. Cryogenics and the use of
low-temperature detectors are essential to the accomplishment of the scientific
objectives, allowing lower detector noise levels and improved instrument
stability. Bolometric detectors are usually cooled to temperatures below 1K,
and the constraints on the instrument are stringent, whether the experiment is
a space-based platform or a ground-based telescope. The latter are usually
deployed in remote and harsh environments such as the South Pole, where
maintenance needs to be kept minimal. CEA-SBT has acquired a strong heritage in
the development of vibration-free multistage helium-sorption coolers, which can
provide cooling down to 200 mK when mounted on a cold stage at temperatures
<5K. In this paper, we focus on the development of a three-stage cooler
dedicated to the BICEP Array project led by Caltech/JPL, which aims to study
the birth of the Universe and specifically the unique B-mode pattern imprinted
by primordial gravitational waves on the polarization of the Cosmic Microwave
Background. Several cryogenic receivers are being developed, each featuring one
such helium-sorption cooler operated from a 4K stage cooled by a Cryomech
pulse-tube with heat lifts of >1.35 W at 4.2 K and >36 W at 45 K. The major
challenge of this project is the large masses to be cooled to sub-kelvin
temperatures (26 kg at 250 mK) and the resulting long cool-down time, which in
this novel cooler design is kept to a minimum with the implementation of
passive and active thermal links between different temperature stages. A first
unit has been sized to provide 230, 70, and 2 $\mu$W of net heat lifts at the
maximum temperatures of 2.8 K, 340 mK, and 250 mK, respectively, for a minimum
duration of 48 hours. | astro-ph_IM |
Concept of multiple-cell cavity for axion dark matter search: In cavity-based axion dark matter search experiments exploring high mass
regions, multiple-cavity design is considered to increase the detection volume
within a given magnet bore. We introduce a new idea, referred to as
multiple-cell cavity, which provides various benefits including a larger
detection volume, simpler experimental setup, and easier phase-matching
mechanism. We present the characteristics of this concept and demonstrate the
experimental feasibility with an example of a double-cell cavity. | astro-ph_IM |
KilonovaNet: Surrogate Models of Kilonova Spectra with Conditional
Variational Autoencoders: Detailed radiative transfer simulations of kilonova spectra play an essential
role in multimessenger astrophysics. Using the simulation results in parameter
inference studies requires building a surrogate model from the simulation
outputs to use in algorithms requiring sampling. In this work, we present
KilonovaNet, an implementation of conditional variational autoencoders (cVAEs)
for the construction of surrogate models of kilonova spectra. This method can
be trained on spectra directly, removing the overhead of pre-processing the
spectra, and it greatly reduces parameter inference time. We build surrogate
models of three state-of-the-art kilonova simulation data sets and present
in-depth surrogate error evaluation methods, which can in general be applied to
any surrogate construction method. By creating synthetic photometric
observations from the spectral surrogate, we perform parameter inference for
the observed light curve data of GW170817 and compare the results with previous
analyses. Given the speed with which KilonovaNet performs during parameter
inference, it will serve as a useful tool in future gravitational wave
observing runs to quickly analyze potential kilonova candidates. | astro-ph_IM |
Reduced Order Estimation of the Speckle Electric Field History for
Space-Based Coronagraphs: In high-contrast space-based coronagraphs, one of the main limiting factors
for imaging the dimmest exoplanets is the time varying nature of the residual
starlight (speckles). Modern methods try to differentiate between the
intensities of starlight and other sources, but none incorporate models of
space-based systems which can take into account actuations of the deformable
mirrors. Instead, we propose formulating the estimation problem in terms of the
electric field while allowing for dithering of the deformable mirrors. Our
reduced-order approach is similar to intensity-based PCA (e.g. KLIP) although,
under certain assumptions, it requires a considerably lower number of modes of
the electric field. We illustrate this by a FALCO simulation of the WFIRST
hybrid Lyot coronagraph. | astro-ph_IM |
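For comparison with the intensity-based PCA (KLIP-style) baseline mentioned above, the heart of such a reduction is a projection of each science frame onto the leading principal components of a reference set. The NumPy sketch below shows that generic projection with toy data; it is the intensity analogue, not the electric-field estimator proposed in the paper.

```python
import numpy as np

def klip_subtract(science, references, n_modes=6):
    """Project a science frame onto the leading PCA modes of a reference stack
    and subtract that projection (a rough model of the correlated starlight)."""
    ny, nx = science.shape
    R = references.reshape(len(references), -1)
    R = R - R.mean(axis=1, keepdims=True)
    _, _, vt = np.linalg.svd(R, full_matrices=False)   # rows of vt: orthonormal modes
    modes = vt[:n_modes]
    s = (science - science.mean()).ravel()
    model = modes.T @ (modes @ s)                      # projection onto the modes
    return (s - model).reshape(ny, nx)

# Toy usage: reference frames built from a few underlying "speckle" patterns
rng = np.random.default_rng(3)
base = rng.normal(size=(5, 64, 64))
refs = np.tensordot(rng.normal(size=(20, 5)), base, axes=1)
sci = np.tensordot(rng.normal(size=5), base, axes=1) + 0.05 * rng.normal(size=(64, 64))
print(np.std(sci), np.std(klip_subtract(sci, refs)))   # residual rms drops sharply
```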
A High Sensitivity Fourier Transform Spectrometer for Cosmic Microwave
Background Observations: The QUIJOTE Experiment was developed to study the polarization in the Cosmic
Microwave Background (CMB) over the frequency range of 10-50 GHz. Its first
instrument, the Multi Frequency Instrument (MFI), measures in the range 10-20
GHz which coincides with one of the naturally transparent windows in the
atmosphere. The Tenerife Microwave Spectrometer (TMS) has been designed to
investigate the spectrum between 10-20 GHz in more detail. The MFI bands are 2
GHz wide whereas the TMS bands will be 250 MHz wide covering the complete 10-20
GHz range with one receiver chain and Fourier spectral filter bank. It is
expected that the relative calibration between frequency bands will be better
known than the MFI channels and that the higher resolution will provide
essential information on narrow band interference and features such as ozone.
The TMS will study the atmospheric spectra as well as provide key information
on the viability of ground-based absolute spectral measurements. Here the novel
Fourier transform spectrometer design is described showing its suitability to
wide band measurement and $\sqrt{N}$ advantage over the usual scanning
techniques. | astro-ph_IM |
A Parallel Monte Carlo Code for Simulating Collisional N-body Systems: We present a new parallel code for computing the dynamical evolution of
collisional N-body systems with up to N~10^7 particles. Our code is based on
the Hénon Monte Carlo method for solving the Fokker-Planck equation, and
makes assumptions of spherical symmetry and dynamical equilibrium. The
principal algorithmic developments involve optimizing data structures, and the
introduction of a parallel random number generation scheme, as well as a
parallel sorting algorithm, required to find nearest neighbors for interactions
and to compute the gravitational potential. The new algorithms we introduce
along with our choice of decomposition scheme minimize communication costs and
ensure optimal distribution of data and workload among the processing units.
The implementation uses the Message Passing Interface (MPI) library for
communication, which makes it portable to many different supercomputing
architectures. We validate the code by calculating the evolution of clusters
with initial Plummer distribution functions up to core collapse with the number
of stars, N, spanning three orders of magnitude, from 10^5 to 10^7. We find
that our results are in good agreement with self-similar core-collapse
solutions, and the core collapse times generally agree with expectations from
the literature. Also, we observe good total energy conservation, within less
than 0.04% throughout all simulations. We analyze the performance of the code,
and demonstrate near-linear scaling of the runtime with the number of
processors up to 64 processors for N=10^5, 128 for N=10^6 and 256 for N=10^7.
The runtime reaches a saturation with the addition of more processors beyond
these limits which is a characteristic of the parallel sorting algorithm. The
resulting maximum speedups we achieve are approximately 60x, 100x, and 220x,
respectively. | astro-ph_IM |
The LSST era of supermassive black holes accretion-disk reverberation
mapping: The Vera C. Rubin Observatory Legacy Survey of Space and Time (LSST) will
detect an unprecedentedly large sample of actively accreting supermassive black
holes with typical accretion disk (AD) sizes of a few light days. This brings
us to face challenges in the reverberation mapping (RM) measurement of AD sizes
in active galactic nuclei (AGNs) using interband continuum delays. We examine
the effect of LSST cadence strategies on AD RM using our metric
AGNTimeLagMetric. It accounts for redshift, cadence, the magnitude limit, and
magnitude corrections for dust extinction. Running our metric on different LSST
cadence strategies, we produce an atlas of the performance estimations for LSST
photometric RM measurements. We provide an upper limit on the estimated number
of quasars for which the AD time lag can be computed within 0<z<7 using the
features of our metric. We forecast that the total counts of such objects will
increase as the mean sampling rate of the survey decreases. The AD time lag
measurements are expected for >1000 sources in each Deep Drilling field (DDF,
10 sq. deg) in any filter, with the redshift distribution of these sources
peaking at z~1. We find the LSST observation strategies with a good cadence (~
5 days) and a long cumulative season (~9 yr), as proposed for LSST DDF, are
favored for the AD size measurement. We create synthetic LSST light curves for
the most suitable DDF cadences and determine RM time lags to demonstrate the
impact of the best cadences based on the proposed metric. | astro-ph_IM |
Application of the optimised next neighbour image cleaning method to the
VERITAS array: Imaging atmospheric Cherenkov telescopes, such as the VERITAS array, are
subject to the Night Sky Background (NSB) and electronic noise, which
contribute to the total signal of pixels in the telescope camera. The
contribution of noise photons in event images is reduced with the application
of image cleaning methods. Conventionally, high thresholds must be employed to
ensure the removal of pixels containing noise signal. On that account,
low-energy gamma-ray showers might be suppressed during the cleaning. We
present here the application of an optimised next neighbour image cleaning for
the VERITAS array. With this technique, differential noise rates are estimated
for each individual observation and thus changes in the NSB and afterpulsing
are consistently accounted for. We show that this method increases the
overall rate of reconstructed gamma-rays, lowers the energy threshold of the
array and allows the reconstruction of low energy (E > 70 GeV) source events
which were suppressed by the conventional cleaning method. | astro-ph_IM |
Using multiobjective optimization to reconstruct interferometric data
(II): polarimetry and time dynamics: In Very Long Baseline Interferometry (VLBI), signals from multiple antennas
combine to create a sparsely sampled virtual aperture, its effective diameter
determined by the largest antenna separation. The inherent sparsity makes VLBI
imaging an ill-posed inverse problem, prompting the use of algorithms like the
Multiobjective Evolutionary Algorithm by Decomposition (MOEA/D), as proposed in
the first paper of this series. This study focuses on extending MOEA/D to
polarimetric and time dynamic reconstructions, particularly relevant for the
VLBI community and the Event Horizon Telescope Collaboration (EHTC). MOEA/D's
success in providing a unique, fast, and largely unsupervised representation of
image structure serves as the basis for exploring these extensions. The
extension involves incorporating penalty terms specific to total intensity
imaging, time-variable, and polarimetric variants within MOEA/D's
multiobjective, evolutionary framework. The Pareto front, representing
non-dominated solutions, is computed, revealing clusters of proximities.
Testing MOEA/D with synthetic datasets representative of EHTC's main targets
demonstrates successful recovery of polarimetric and time-dynamic signatures
despite sparsity and realistic data corruptions. MOEA/D's extension proves
effective in the anticipated EHTC setting, offering an alternative and
independent claim to existing methods. It not only explores the problem
globally but also eliminates the need for parameter surveys, distinguishing it
from Regularized Maximum Likelihood (RML) methods. MOEA/D emerges as a novel
and useful tool for robustly characterizing polarimetric and dynamic signatures
in VLBI datasets with minimal user-based choices. Future work aims to address
the last remaining limitation of MOEA/D, specifically regarding the number of
pixels and numerical performance, to establish it within the VLBI data
reduction pipeline. | astro-ph_IM |
Astronomy & Astrophysics in ICAD History: The International Conference on Auditory Display (ICAD) is a significant
event for researchers and practitioners interested in exploring the use of
sound in conveying information and data. Since its inception in 1994, the
conference has served as a vital forum for exchanging ideas and presenting
research findings in the field of auditory display. While the conference
primarily focuses on auditory display and sound design, astronomy has made its
presence felt in the proceedings of the conference over the years. However, it is
not until the current ICAD conference that astronomy features a dedicated
session. This paper aims to provide a statistical overview of the presence of
astronomy in the ICAD conference's history from 1994 to 2022, highlighting some
of the contributions made by researchers in this area, as well as the topics of
interest that have captured the attention of sound artists. | astro-ph_IM |
Broadband spectroscopy of astrophysical ice analogs. I. Direct
measurement of complex refractive index of CO ice using terahertz time-domain
spectroscopy: Context: Reliable, directly measured optical properties of astrophysical ice
analogs in the infrared (IR) and terahertz (THz) range are missing. These
parameters are of great importance to model the dust continuum radiative
transfer in dense and cold regions, where thick ice mantles are present, and are
necessary for the interpretation of future observations planned in the far-IR
region. Aims: Coherent THz radiation allows direct measurement of the complex
dielectric function (refractive index) of astrophysically relevant ice species
in the THz range. Methods: The time-domain waveforms and the frequency-domain
spectra of reference samples of CO ice, deposited at a temperature of 28.5 K
and annealed to 33 K at different thicknesses, have been recorded. A new
algorithm is developed to reconstruct the real and imaginary parts of the
refractive index from the time-domain THz data. Results: The complex refractive
index in the wavelength range of 1 mm - 150 ${\mu}$m (0.3 - 2.0 THz) has been
determined for the studied ice samples, and compared with available data found
in the literature. Conclusions: The developed algorithm of reconstructing the
real and imaginary parts of the refractive index from the time-domain THz data
enables, for the first time, the determination of optical properties of
astrophysical ice analogs without using the Kramers-Kronig relations. The
obtained data provide a benchmark to interpret the observational data from
current ground based facilities as well as future space telescope missions, and
have been used to estimate the opacities of the dust grains in presence of CO
ice mantles. | astro-ph_IM |
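To illustrate the kind of reconstruction involved (not the new algorithm of the paper), the textbook THz-TDS extraction takes the ratio of the Fourier transforms of the sample and reference waveforms and inverts the transmission of a plane-parallel slab while neglecting internal (Fabry-Perot) reflections. The sketch below assumes a free-standing slab and a particular FFT sign convention; the waveforms are placeholders.

```python
import numpy as np

C = 299792458.0  # speed of light, m/s

def extract_nk(t, e_ref, e_sample, thickness):
    """Approximate n(omega), k(omega) from THz-TDS waveforms of a free-standing
    slab, ignoring Fabry-Perot echoes (thick-sample approximation)."""
    freqs = np.fft.rfftfreq(len(t), d=t[1] - t[0])
    T = np.fft.rfft(e_sample) / np.fft.rfft(e_ref)       # complex transmission
    omega = 2 * np.pi * freqs
    with np.errstate(divide="ignore", invalid="ignore"):
        phase = np.unwrap(np.angle(T))
        n = 1.0 - C * phase / (omega * thickness)        # sign depends on FFT convention
        k = -C / (omega * thickness) * np.log(np.abs(T) * (n + 1.0) ** 2 / (4.0 * n))
    return freqs, n, k

# Placeholder usage (real inputs would be measured reference and ice waveforms):
# freqs, n, k = extract_nk(t, e_ref, e_sample, thickness=10e-6)
```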
Response of the XENON100 Dark Matter Detector to Nuclear Recoils: Results from the nuclear recoil calibration of the XENON100 dark matter
detector installed underground at the Laboratori Nazionali del Gran Sasso
(LNGS), Italy are presented. Data from measurements with an external 241AmBe
neutron source are compared with a detailed Monte Carlo simulation which is
used to extract the energy dependent charge-yield Qy and relative scintillation
efficiency Leff. A very good level of absolute spectral matching is achieved in
both observable signal channels - scintillation S1 and ionization S2 - along
with agreement in the 2-dimensional particle discrimination space. The results
confirm the validity of the derived signal acceptance in earlier reported dark
matter searches of the XENON100 experiment. | astro-ph_IM |
Charge Transfer Inefficiency in the Hubble Space Telescope since
Servicing Mission 4: We update a physically-motivated model of radiation damage in the Hubble
Space Telescope Advanced Camera for Surveys/Wide Field Channel, using data up
to mid 2010. We find that Charge Transfer Inefficiency increased dramatically
before shuttle Servicing Mission 4, with ~1.3 charge traps now present per
pixel. During detector readout, charge traps spuriously drag electrons behind
all astronomical sources, degrading image quality in a way that affects object
photometry, astrometry and morphology. Our detector readout model is robust to
changes in operating temperature and background level, and can be used to
iteratively remove the trailing by pushing electrons back to where they belong.
The result is data taken in mid-2010 that recovers the quality of imaging
obtained within the first six months of orbital operations. | astro-ph_IM |
Explicit expansion of the three-body disturbing function for arbitrary
eccentricities and inclinations: Since the original work of Hansen and Tisserand in the 19th century, there
have been many variations in the analytical expansion of the three-body
disturbing function in series of the semi-major axis ratio. With the increasing
number of planetary systems of large eccentricity, these expansions are even
more interesting as they allow us to obtain for the secular systems finite
expressions that are valid for all eccentricities and inclinations. We
revisited the derivation of the disturbing function in Legendre polynomial,
with a special focus on the secular system. We provide here expressions of the
disturbing function for the planar and spatial case at any order with respect
to the ratio of the semi-major axes. Moreover, for orders in the ratio of
semi-major axis up to ten in the planar case and five in the spatial case, we
provide explicit expansions of the secular system, and simple algorithms with
minimal computation to extend this to higher order, as well as the algorithms
for the computation of non-secular terms. | astro-ph_IM |
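For context, such expansions start from the classical development of the inverse mutual distance in Legendre polynomials: for position vectors $\mathbf{r}$ and $\mathbf{r}'$ with $r < r'$ and mutual angle $S$,

$$ \frac{1}{\Delta} = \frac{1}{|\mathbf{r}' - \mathbf{r}|} = \frac{1}{r'} \sum_{n=0}^{\infty} \left(\frac{r}{r'}\right)^{n} P_{n}(\cos S), $$

which converges as long as $r < r'$ (non-crossing orbits); the secular expressions discussed above follow from averaging this series over both mean anomalies.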
Agile Earth observation satellite scheduling over 20 years:
formulations, methods and future directions: Agile satellites with advanced attitude maneuvering capability are the new
generation of Earth observation satellites (EOSs). The continuous improvement
in satellite technology and decrease in launch cost have boosted the
development of agile EOSs (AEOSs). To efficiently employ the increasing
orbiting AEOSs, the AEOS scheduling problem (AEOSSP) aiming to maximize the
entire observation profit while satisfying all complex operational constraints,
has received much attention over the past 20 years. The objectives of this
paper are thus to summarize current research on AEOSSP, identify main
accomplishments and highlight potential future research directions. To this
end, general definitions of AEOSSP with operational constraints are described
initially, followed by its three typical variations including different
definitions of observation profit, multi-objective function and autonomous
model. A detailed literature review from 1997 up to 2019 is then presented in
line with four different solution methods, i.e., exact method, heuristic,
metaheuristic and machine learning. Finally, we discuss a number of topics
worth pursuing in the future. | astro-ph_IM |
Research Performance of Turkish Astronomers in the Period of 1980-2010: We investigated the development of astronomy and astrophysics research
productivity in Turkey in terms of publication output and their impacts as
reflected in the Science Citation Index (SCI) for the period 1980-2010. It
includes 838 refereed publications, including 801 articles, 16 letters, 15
reviews, and six research notes. The number of papers increased markedly after
2000, and the average number of papers per researcher is calculated as 0.89. The
total number of citations received by the 838 papers is 6938, corresponding to
approximately 8.3 citations per paper over 30 years. The publication performance
of Turkish astronomers and astrophysicists was compared with that of seven
countries that have similar gross domestic expenditure on research and
development and are members of the Organization for Economic Co-operation
and Development (OECD). Our study reveals that the output of astronomy and
astrophysics research in Turkey has gradually increased over the years. | astro-ph_IM |
Astronomical Imagery: Considerations For a Contemporary Approach with
JPEG2000: The new wide-field radio telescopes, such as: ASKAP, MWA, LOFAR, eVLA and
SKA; will produce spectral-imaging data-cubes (SIDC) of unprecedented size --
in the order of hundreds of petabytes. Serving such data as images to the
end user in traditional formats is likely to encounter significant performance
bottlenecks. We discuss the requirements for extremely large SIDCs, and in this
light we analyse the applicability of the approach taken in the JPEG2000
(ISO/IEC 15444) standards. We argue the case for the adaptation of contemporary
industry standards and technologies versus the modification of legacy astronomy
standards or the development of new ones from scratch. | astro-ph_IM |
Tokyo Axion Helioscope: The idea of a magnetic axion helioscope was first proposed by Pierre Sikivie
in 1983. Tokyo axion helioscope was built exploiting its detection principle
with a dedicated cryogen-free superconducting magnet and PIN photodiodes for
x-ray detectors. Solar axions, if they exist, would be converted into x-ray
photons in the magnetic field. The conversion is coherently enhanced even for
massive axions by filling the conversion region with helium gas. Its start-up,
search results so far, and prospects are presented. | astro-ph_IM |
Dome C site testing: long term statistics of integrated optical
turbulence parameters at ground level: We present long term site testing statistics obtained at Dome C, Antarctica
with various experiments deployed within the Astroconcordia programme since
2003. We give values of integrated turbulence parameters in the visible at
ground level and above the surface layer, vertical profiles of the structure
constant Cn2 and a statistics of the thickness of the turbulent surface layer. | astro-ph_IM |
Classification methods for noise transients in advanced
gravitational-wave detectors II: performance tests on Advanced LIGO data: The data taken by the advanced LIGO and Virgo gravitational-wave detectors
contains short duration noise transients that limit the significance of
astrophysical detections and reduce the duty cycle of the instruments. As the
advanced detectors are reaching sensitivity levels that allow for multiple
detections of astrophysical gravitational-wave sources it is crucial to achieve
a fast and accurate characterization of non-astrophysical transient noise
shortly after it occurs in the detectors. Previously we presented three methods
for the classification of transient noise sources. They are Principal Component
Analysis for Transients (PCAT), Principal Component LALInference Burst (PC-LIB)
and Wavelet Detection Filter with Machine Learning (WDF-ML). In this study we
carry out the first performance tests of these algorithms on gravitational-wave
data from the Advanced LIGO detectors. We use the data taken between the 3rd of
June 2015 and the 14th of June 2015 during the 7th engineering run (ER7), and
outline the improvements made to increase the performance and lower the latency
of the algorithms on real data. This work provides an important test for
understanding the performance of these methods on real, non-stationary data in
preparation for the second advanced gravitational-wave detector observation
run, planned for later this year. We show that all methods can classify
transients in non-stationary data with a high level of accuracy and show the
benefits of using multiple classifiers. | astro-ph_IM |
A Novel Source of Tagged Low-Energy Nuclear Recoils: For sufficiently wide resonances, nuclear resonance fluorescence behaves like
elastic photo-nuclear scattering while retaining the large cross-section
characteristic of resonant photo-nuclear absorption. We show that NRF may be
used to characterize the signals produced by low-energy nuclear recoils by
serving as a novel source of tagged low-energy nuclear recoils. Understanding
these signals is important in determining the sensitivity of direct WIMP
dark-matter and coherent neutrino-nucleus scattering searches. | astro-ph_IM |
Agile Software Engineering and Systems Engineering at SKA Scale: Systems Engineering (SE) is the set of processes and documentation required
for successfully realising large-scale engineering projects, but the classical
approach is not a good fit for software-intensive projects, especially when the
needs of the different stakeholders are not fully known from the beginning, and
requirement priorities might change. The SKA is the ultimate software-enabled
telescope, with enormous amounts of computing hardware and software required to
perform its data reduction. We give an overview of the system and software
engineering processes in the SKA1 development, and the tension between
classical and agile SE. | astro-ph_IM |
The Transients Handler System for the Cherenkov Telescope Array
Observatory: The Cherenkov Telescope Array Observatory (CTAO) will be the largest and most
advanced ground-based facility for gamma-ray astronomy. Several dozens of
telescopes will be operated at both the Northern and Southern Hemisphere. With
the advent of multi-messenger astronomy, many new large science infrastructures
will start science operations and target-of-opportunity observations will play
an important role in the operation of the CTAO. The Array Control and Data
Acquisition (ACADA) system deployed on each CTAO site will feature a dedicated
sub-system to manage external and internal scientific alerts: the Transients
Handler. It will receive, validate, and process science alerts in order to
determine if target-of-opportunity observations can be triggered or need to be
updated. Various tasks defined by proposal-based configurations are processed
by the Transients Handler. These tasks include, among others, the evaluation of
observability of targets and their correlation with known sources or objects.
This contribution will discuss the concepts and design of the Transients
Handler and its integration in the ACADA system. | astro-ph_IM |
A Study of the Effect of Molecular and Aerosol Conditions in the
Atmosphere on Air Fluorescence Measurements at the Pierre Auger Observatory: The air fluorescence detector of the Pierre Auger Observatory is designed to
perform calorimetric measurements of extensive air showers created by cosmic
rays of above 10^18 eV. To correct these measurements for the effects
introduced by atmospheric fluctuations, the Observatory contains a group of
monitoring instruments to record atmospheric conditions across the detector
site, an area exceeding 3,000 km^2. The atmospheric data are used extensively
in the reconstruction of air showers, and are particularly important for the
correct determination of shower energies and the depths of shower maxima. This
paper contains a summary of the molecular and aerosol conditions measured at
the Pierre Auger Observatory since the start of regular operations in 2004, and
includes a discussion of the impact of these measurements on air shower
reconstructions. Between 10^18 and 10^20 eV, the systematic uncertainties due
to all atmospheric effects increase from 4% to 8% in measurements of shower
energy, and 4 g/cm^2 to 8 g/cm^2 in measurements of the shower maximum. | astro-ph_IM |
WOMBAT: A Scalable and High Performance Astrophysical MHD Code: We present a new code for astrophysical magneto-hydrodynamics specifically
designed and optimized for high performance and scaling on modern and future
supercomputers. We describe a novel hybrid OpenMP/MPI programming model that
emerged from a collaboration between Cray, Inc. and the University of
Minnesota. This design utilizes MPI-RMA optimized for thread scaling, which
allows the code to run extremely efficiently at very high thread counts ideal
for the latest generation of the multi-core and many-core architectures. Such
performance characteristics are needed in the era of "exascale" computing. We
describe and demonstrate our high-performance design in detail with the intent
that it may be used as a model for other, future astrophysical codes intended
for applications demanding exceptional performance. | astro-ph_IM |
FACT: Towards Robotic Operation of an Imaging Air Cherenkov Telescope: The First G-APD Cherenkov Telescope (FACT) became operational at La Palma in
October 2011. Since summer 2012, due to very smooth and stable operation, it is
the first telescope of its kind that is routinely operated from remote, without
the need for a data-taking crew on site. In addition, many standard tasks of
operation are executed automatically without the need for manual interaction.
Based on the experience gained so far, some alterations to improve the safety
of the system are under development to allow robotic operation in the future.
We present the setup and precautions used to implement remote operations and
the experience gained so far, as well as the work towards robotic operation. | astro-ph_IM |
The Large Array Survey Telescope -- Pipeline. I. Basic image reduction
and visit coaddition: The Large Array Survey Telescope (LAST) is a wide-field telescope designed to
explore the variable and transient sky with a high cadence and to be a test-bed
for cost-effective telescope design. A LAST node is composed of 48 (32 already
deployed), 28-cm f/2.2 telescopes. A single telescope has a 7.4 deg^2 field of
view and reaches a 5-sigma limiting magnitude of 19.6 (21.0) in 20s (20x20s)
(filter-less), while the entire system provides a 355 deg^2 field of view. The
basic strategy of LAST is to obtain multiple 20-s consecutive exposures of each
field (a visit). Each telescope carries a 61 Mpix camera, and the system
produces, on average, about 2.2 Gbit/s. This high data rate is analyzed in near
real-time at the observatory site, using limited computing resources (about 700
cores). Given this high data rate, we have developed a new, efficient data
reduction and analysis pipeline. The data pipeline includes two major parts:
(i) Processing and calibration of single images, followed by a coaddition of
the visit's exposures. (ii) Building the reference images and performing image
subtraction and transient detection. Here we describe in detail the first part
of the pipeline. Among the products of this pipeline are photometrically and
astrometrically calibrated single and coadded images, 32-bit mask images
marking a wide variety of problems and states of each pixel, source catalogs
built from individual and coadded images, Point Spread Function (PSF)
photometry, merged source catalogs, proper motion and variability indicators,
minor planets detection, calibrated light curves, and matching with external
catalogs. The entire pipeline code is made public. Finally, we demonstrate the
pipeline performance on real data taken by LAST. | astro-ph_IM |
matvis: A matrix-based visibility simulator for fast forward modelling
of many-element 21 cm arrays: Detection of the faint 21 cm line emission from the Cosmic Dawn and Epoch of
Reionisation will require not only exquisite control over instrumental
calibration and systematics to achieve the necessary dynamic range of
observations but also validation of analysis techniques to demonstrate their
statistical properties and signal loss characteristics. A key ingredient in
achieving this is the ability to perform high-fidelity simulations of the kinds
of data that are produced by the large, many-element, radio interferometric
arrays that have been purpose-built for these studies. The large scale of these
arrays presents a computational challenge, as one must simulate a detailed sky
and instrumental model across many hundreds of frequency channels, thousands of
time samples, and tens of thousands of baselines for arrays with hundreds of
antennas. In this paper, we present a fast matrix-based method for simulating
radio interferometric measurements (visibilities) at the necessary scale. We
achieve this through judicious use of primary beam interpolation, fast
approximations for coordinate transforms, and a vectorised outer product to
expand per-antenna quantities to per-baseline visibilities, coupled with
standard parallelisation techniques. We validate the results of this method,
implemented in the publicly-available matvis code, against a high-precision
reference simulator, and explore its computational scaling on a variety of
problems. | astro-ph_IM |
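The central idea, forming per-antenna quantities and expanding them to per-baseline visibilities with an outer product, can be written very compactly. The sketch below is a generic point-source visibility simulator in that style and is not the matvis implementation itself; the beam model, coordinates, and units are placeholders.

```python
import numpy as np

def simulate_visibilities(antpos, source_dirs, fluxes, beams, freq):
    """Per-baseline visibilities from per-antenna factors via an outer product.

    antpos      : (n_ant, 3) antenna positions [m]
    source_dirs : (n_src, 3) unit vectors towards the sources
    fluxes      : (n_src,) source flux densities
    beams       : (n_ant, n_src) complex primary-beam values (placeholder model)
    freq        : observing frequency [Hz]
    """
    c = 299792458.0
    # Geometric phase of every antenna towards every source
    phase = np.exp(-2j * np.pi * freq / c * antpos @ source_dirs.T)   # (n_ant, n_src)
    # Per-antenna matrix Z: beam * sqrt(flux) * phase
    Z = beams * np.sqrt(fluxes)[None, :] * phase
    # All baselines at once: V[p, q] = sum_s Z[p, s] * conj(Z[q, s])
    return Z @ Z.conj().T

# Toy usage with random placeholder inputs
rng = np.random.default_rng(7)
n_ant, n_src = 16, 100
dirs = rng.normal(size=(n_src, 3))
dirs /= np.linalg.norm(dirs, axis=1, keepdims=True)
V = simulate_visibilities(rng.normal(scale=30.0, size=(n_ant, 3)), dirs,
                          rng.uniform(0.1, 1.0, n_src),
                          np.ones((n_ant, n_src)), 150e6)
print(V.shape, np.allclose(V, V.conj().T))   # (16, 16), Hermitian
```

The matrix product over all antennas at once is what replaces an explicit double loop over baselines and lends itself to vectorised and parallel execution.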
SSTRED: Data- and metadata-processing pipeline for CHROMIS and CRISP: Context: Data from ground-based, high-resolution solar telescopes can only be
used for science with calibrations and processing, which requires detailed
knowledge about the instrumentation. [...] Aims: We aim to provide observers
with a user-friendly data pipeline for data from the Swedish 1-meter Solar
Telescope (SST) that delivers science-ready data together with the metadata
needed for proper interpretation and archiving. Methods: We briefly describe
the [instrumentation]. We summarize the processing steps from raw data to
science-ready data cubes in FITS files. We report calibrations and
compensations for data imperfections in detail. Misalignment of \ion{Ca}{ii}
data due to wavelength-dependent dispersion is identified, characterized, and
compensated for. We describe intensity calibrations that remove or reduce the
effects of filter transmission profiles as well as solar elevation changes. We
present REDUX, a new version of the MOMFBD image restoration code, with
multiple enhancements and new features. [...] We describe how image restoration
is used [...]. The science-ready output is delivered in FITS files, with
metadata compliant with the SOLARNET recommendations. Data cube coordinates are
specified within the World Coordinate System (WCS). Cavity errors are specified
as distortions of the WCS wavelength coordinate with an extension of existing
WCS notation. We establish notation for specifying the reference system for
Stokes vectors [...]. [CRISPEX] has been extended to accept SSTRED output
[...]. Results: SSTRED is a mature data-processing pipeline for imaging
instruments, developed and used for the SST/CHROMIS imaging spectrometer and
the SST/CRISP spectropolarimeter. SSTRED delivers well-characterized,
science-ready, archival-quality FITS files with well-defined metadata. The
SSTRED code, as well as REDUX and CRISPEX, is freely available through git
repositories. | astro-ph_IM |
Measurements of diffusion of volatiles in amorphous solid water:
application to interstellar medium environments: The diffusion of atoms and molecules in ices covering dust grains in dense
clouds in interstellar space is an important but poorly characterized step in
the formation of complex molecules in space. Here we report the measurement of
diffusion of simple molecules in amorphous solid water (ASW), an analog of
interstellar ices, which are amorphous and made mostly of water molecules. The
new approach that we used relies on measuring in situ the change in band
strength and position of mid-infrared features of OH dangling bonds as
molecules move through pores and channels of ASW. We obtained the Arrhenius
pre-exponents and activation energies for diffusion of CO, O$_2$, N$_2$,
CH$_4$, and Ar in ASW. The diffusion energy barriers of H$_2$ and D$_2$ were
also measured, but only upper limits were obtained. These values constitute the
first comprehensive set of diffusion parameters of simple molecules on the pore
surface of ASW, and can be used in simulations of the chemical evolution of ISM
environments, thus replacing unsupported estimates. We also present a set of
argon temperature programmed desorption experiments to determine the desorption
energy distribution of argon on non-porous ASW. | astro-ph_IM |
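The reported parameters enter chemical models through the standard Arrhenius form; the sketch below shows that form with placeholder values, since the measured pre-exponents and activation energies are not quoted in this summary.

```python
# Arrhenius-type diffusion coefficient D(T) = D0 * exp(-E_dif / T), with the
# activation energy E_dif expressed in kelvin (i.e. E/k_B), as is common in
# astrochemical models. The numbers below are placeholders, not the measured
# values from the paper.
import numpy as np

def diffusion_coefficient(T, D0, E_dif):
    """Diffusion coefficient [cm^2/s] at temperature T [K]."""
    return D0 * np.exp(-E_dif / T)

# Hypothetical example values for CO on the pore surface of ASW:
D0_co = 1e-7       # cm^2/s, placeholder pre-exponent
E_co = 300.0       # K, placeholder activation energy

for T in (10.0, 15.0, 20.0):
    print(f"T = {T:4.1f} K  ->  D = {diffusion_coefficient(T, D0_co, E_co):.3e} cm^2/s")
```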
Towards 10 cm/s radial velocity accuracy on the Sun using a Fourier
transform spectrometer: The IAG solar observatory is producing high-fidelity, ultra-high-resolution
spectra (R>500000) of the spatially resolved surface of the Sun using a Fourier
Transform spectrometer (FTS). The radial velocity (RV) calibration of these
spectra is currently performed using absorption lines from Earth's atmosphere,
limiting the precision and accuracy. To improve the frequency calibration
precision and accuracy, we plan to use a Fabry-Perot etalon (FP) setup, an
evolution of the CARMENES FP design, in combination with an iodine cell. To
create an accurate wavelength solution, the iodine cell is measured in parallel
with the FP. The FP can then be used to transfer the accurate wavelength
solution provided by the iodine via simultaneous calibration of solar
observations. To verify the stability and precision of the FTS we perform
parallel measurements of the FP and an iodine cell. The measurements show an
intrinsic stability of the FTS at the level of 1 m/s over 90 hours. The
difference between the FP RVs and the iodine cell RVs shows no significant
trends during the same time span. The RMS of the RV difference between FP and
iodine cell is 10.7 cm/s, which can be largely attributed to the intrinsic RV
precisions of the iodine cell and the FP (10.2 cm/s and 1.0 cm/s,
respectively). This shows that we can calibrate the FTS to a level of 10 cm/s,
competitive with current state-of-the-art precision RV instruments. Based on
these results we argue that the spectrum of iodine can be used as an absolute
reference to reach an RV accuracy of 10 cm/s. | astro-ph_IM |
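The quoted error budget can be checked by combining the intrinsic precisions in quadrature, as in the short sketch below.

```python
# Quick consistency check of the quoted error budget: combine the intrinsic
# FP and iodine-cell precisions in quadrature and compare with the measured
# RMS of the FP-minus-iodine RV difference.
import math

rms_measured = 10.7   # cm/s, RMS of the FP - iodine RV difference
sigma_iodine = 10.2   # cm/s, intrinsic iodine-cell precision
sigma_fp = 1.0        # cm/s, intrinsic FP precision

sigma_expected = math.hypot(sigma_iodine, sigma_fp)
sigma_unexplained = math.sqrt(max(rms_measured**2 - sigma_expected**2, 0.0))

print(f"expected from intrinsic precisions: {sigma_expected:.1f} cm/s")    # ~10.2
print(f"unexplained remainder:              {sigma_unexplained:.1f} cm/s") # ~3
```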
Ray-tracing 3D dust radiative transfer with DART-Ray: code upgrade and
public release: We present an extensively updated version of the purely ray-tracing 3D dust
radiation transfer code DART-Ray. The new version includes five major upgrades:
1) a series of optimizations for the ray-angular density and the scattered
radiation source function; 2) the implementation of several data and task
parallelizations using hybrid MPI+OpenMP schemes; 3) the inclusion of dust
self-heating; 4) the ability to produce surface brightness maps for observers
within the models in HEALPix format; 5) the possibility to set the expected
numerical accuracy already at the start of the calculation. We tested the
updated code with benchmark models where the dust self-heating is not
negligible. Furthermore, we performed a study of the extent of the source
influence volumes, using galaxy models, which are critical in determining the
efficiency of the DART-Ray algorithm. The new code is publicly available,
documented for both users and developers, and accompanied by several programmes
to create input grids for different model geometries and to import the results
of N-body and SPH simulations. These programmes can be easily adapted to
different input geometries, and for different dust models or stellar emission
libraries. | astro-ph_IM |
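As a toy illustration of upgrade 4 (HEALPix surface brightness maps for observers inside the models), the sketch below bins random ray intensities into an all-sky HEALPix map; it uses healpy and is not DART-Ray code.

```python
# Toy illustration of building an all-sky surface brightness map in HEALPix
# format for an observer placed inside a model. The ray directions and
# intensities are random placeholders, not DART-Ray output.
import numpy as np
import healpy as hp

nside = 32
npix = hp.nside2npix(nside)
sb_map = np.zeros(npix)            # surface brightness accumulator
hits = np.zeros(npix)

rng = np.random.default_rng(1)
theta = np.arccos(rng.uniform(-1, 1, 10000))   # isotropic ray directions
phi = rng.uniform(0, 2 * np.pi, 10000)
intensity = rng.exponential(1.0, 10000)        # toy ray intensities

pix = hp.ang2pix(nside, theta, phi)
np.add.at(sb_map, pix, intensity)
np.add.at(hits, pix, 1.0)
sb_map = np.divide(sb_map, hits, out=np.zeros_like(sb_map), where=hits > 0)
print(npix, sb_map.mean(), sb_map.max())
```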
Design, pointing control, and on-sky performance of the mid-infrared
vortex coronagraph for the VLT/NEAR experiment: Vortex coronagraphs have been shown to be a promising avenue for
high-contrast imaging in the close-in environment of stars at thermal infrared
(IR) wavelengths. They are included in the baseline design of METIS. To ensure
good performance of these coronagraphs, a precise control of the centering of
the star image in real time is needed. We previously developed and validated
the quadrant analysis of coronagraphic images for tip-tilt sensing (QACITS)
pointing estimator to address this issue. While this approach is not
wavelength-dependent in theory, it had never been implemented for mid-IR
observations, which present specific challenges and limitations. Here, we
present the design of the mid-IR vortex coronagraph for the new Earths in the
$\alpha$ Cen Region (NEAR) experiment with the VLT/VISIR instrument and assess
the performance of the QACITS estimator for the centering control of the star
image onto the vortex coronagraph. We use simulated data and on-sky data
obtained with VLT/VISIR, which was recently upgraded for observations assisted
by adaptive optics in the context of the NEAR experiment. We demonstrate that
the QACITS-based correction loop is able to control the centering of the star
image onto the NEAR vortex coronagraph with a stability down to $0.015
\lambda/D$ rms over 4h in good conditions. These results show that QACITS is a
robust approach for precisely controlling in real time the centering of vortex
coronagraphs for mid-IR observations. | astro-ph_IM |
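The quadrant analysis underlying QACITS can be illustrated with the toy sketch below, which measures differential intensities between the halves of a fake coronagraphic image; the actual calibration law converting these signals into tip-tilt estimates is instrument-specific and not reproduced here.

```python
# Toy illustration of quadrant analysis of a coronagraphic image: differential
# intensities between image halves trace the star's offset from the vortex
# centre. The calibration law converting these signals into tip-tilt (in units
# of lambda/D), as used by QACITS, is instrument-specific and not given here.
import numpy as np

def differential_intensities(img):
    """Return normalised left/right and bottom/top differential intensities."""
    ny, nx = img.shape
    total = img.sum()
    dx = (img[:, nx // 2:].sum() - img[:, : nx // 2].sum()) / total
    dy = (img[ny // 2:, :].sum() - img[: ny // 2, :].sum()) / total
    return dx, dy

# Fake coronagraphic image: residual light skewed towards +x for illustration.
yy, xx = np.mgrid[-32:32, -32:32]
img = np.exp(-((xx - 3.0) ** 2 + yy**2) / (2 * 8.0**2))

dx, dy = differential_intensities(img)
print(f"differential signals: dx = {dx:+.3f}, dy = {dy:+.3f}")
```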
Spread spectrum for imaging techniques in radio interferometry: We consider the probe of astrophysical signals through radio interferometers
with a small field of view and baselines with a non-negligible and constant
component in the pointing direction. In this context, the visibilities measured
essentially identify with a noisy and incomplete Fourier coverage of the
product of the planar signals with a linear chirp modulation. In light of the
recent theory of compressed sensing and in the perspective of defining the best
possible imaging techniques for sparse signals, we analyze the related spread
spectrum phenomenon and suggest its universality relative to the sparsity
dictionary. Our results rely both on theoretical considerations related to the
mutual coherence between the sparsity and sensing dictionaries, as well as on
numerical simulations. | astro-ph_IM |
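The spread spectrum phenomenon can be illustrated numerically: a linear chirp modulation spreads the spectrum of a spectrally concentrated sparsity atom, lowering its coherence with the Fourier sensing vectors. The one-dimensional sketch below uses illustrative parameters and is not the simulation setup of the paper.

```python
# 1D toy illustration of the spread spectrum phenomenon: a spectrally compact
# atom is modulated by a linear chirp, which spreads its energy over many more
# Fourier modes. The atom width and chirp rate are illustrative choices only.
import numpy as np

n = 512
x = np.arange(n) - n / 2

# A spectrally concentrated "atom" (broad Gaussian); its Fourier transform is narrow.
atom = np.exp(-x**2 / (2 * 20.0**2))

w = 0.002                                  # chirp rate (illustrative)
chirp = np.exp(1j * np.pi * w * x**2)      # linear chirp modulation

def top_fraction(sig, k=32):
    """Fraction of spectral energy carried by the k strongest Fourier modes."""
    p = np.abs(np.fft.fft(sig)) ** 2
    return np.sort(p)[-k:].sum() / p.sum()

print("energy in top 32 Fourier modes, no chirp :", round(top_fraction(atom), 3))
print("energy in top 32 Fourier modes, chirped  :", round(top_fraction(atom * chirp), 3))
```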