
Jellyfish: A Large Language Model for Data Preprocessing

In this paper, we present Jellyfish, an open-source LLM that serves as a universal task solver for data preprocessing (DP). Built on the Llama 2 13B model, Jellyfish is instruction-tuned on datasets from several typical DP tasks, including error detection, data imputation, schema matching, and entity matching, and generalizes to other tasks. Remarkably, with its 13 billion parameters, Jellyfish can operate on a single, low-cost local GPU, ensuring data security and enabling further tuning. Its proficiency in understanding natural language allows users to manually craft instructions for DP tasks. Unlike many existing methods that rely heavily on prior knowledge, Jellyfish acquires domain knowledge during tuning and supports optional knowledge injection at inference time. A distinctive feature of Jellyfish is its interpreter, which explains its output decisions. To construct Jellyfish, we develop a series of pre-tuning and DP-tuning techniques. Jellyfish is equipped with an instance serializer, which automatically translates raw data into model prompts, and a knowledge injector, which optionally introduces task- and dataset-specific knowledge to enhance DP performance. Our evaluation on a range of real datasets shows that Jellyfish is competitive with state-of-the-art methods and generalizes strongly to unseen tasks. Its performance rivals that of GPT-series models, and its interpreter offers stronger reasoning capabilities than GPT-3.5. Furthermore, our evaluation highlights the effectiveness of the techniques employed in constructing Jellyfish. Our model is available at Hugging Face: https://huggingface.co/NECOUDBFM/Jellyfish
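As an illustration of how a serialized DP instance might be fed to the released checkpoint, the sketch below queries the Hugging Face model for an entity-matching pair. The model ID comes from the URL in the abstract, but the instruction wording, record serialization, and generation settings are assumptions for illustration, not the model's documented prompt format.

```python
# A minimal sketch of querying Jellyfish for an entity-matching task via
# Hugging Face transformers. The prompt format below is illustrative; the
# exact serialization expected by the model is an assumption here.
from transformers import AutoModelForCausalLM, AutoTokenizer

model_id = "NECOUDBFM/Jellyfish"  # checkpoint named in the abstract
tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForCausalLM.from_pretrained(model_id, device_map="auto")

# Hypothetical serialized instance: two product records to compare.
prompt = (
    "You are tasked with determining whether two records refer to the same "
    "real-world entity.\n"
    'Record A: [name: "iPhone 13 Pro 128GB", brand: "Apple"]\n'
    'Record B: [name: "Apple iPhone 13 Pro (128 GB)", brand: "Apple"]\n'
    "Are Record A and Record B the same entity? Answer yes or no."
)

inputs = tokenizer(prompt, return_tensors="pt").to(model.device)
outputs = model.generate(**inputs, max_new_tokens=32, do_sample=False)
# Decode only the newly generated tokens, skipping the echoed prompt.
print(tokenizer.decode(outputs[0][inputs["input_ids"].shape[1]:],
                       skip_special_tokens=True))
```

In a full pipeline, the instance serializer described in the abstract would produce the record strings automatically, and the knowledge injector would prepend optional task- or dataset-specific hints to the same prompt.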

LoTSS jellyfish galaxies: II. Ram pressure stripping in groups versus clusters

Numerous examples of ram pressure stripping in galaxy clusters are present in the literature; however, substantially less work has focused on ram pressure stripping in lower mass groups. In this work we use the LOFAR Two-metre Sky Survey (LoTSS) to search for jellyfish galaxies in ~500 SDSS groups (z < 0.05), making this the most comprehensive search for ram pressure stripping in groups to date. We identify 60 jellyfish galaxies in groups with extended, asymmetric radio continuum tails, which are found across the entire range of group mass from 10^{12.5} < M_group < 10^{14} h^{-1} M_⊙. We compare the group jellyfish galaxies identified in this work with the LoTSS jellyfish galaxies in clusters presented in Roberts et al. (2021), allowing us to compare the effects of ram pressure stripping across three decades in group/cluster mass. We find that jellyfish galaxies are most commonly found in clusters, with the frequency decreasing towards the lowest mass groups. Both the orientation of observed radio continuum tails and the positions of group jellyfish galaxies in phase space suggest that galaxies are stripped more slowly in groups relative to clusters. Finally, we find that the star formation rates of jellyfish galaxies in groups are consistent with 'normal' star-forming group galaxies, in contrast to cluster jellyfish galaxies, which have clearly enhanced star formation rates. On the whole, there is clear evidence for ongoing ram pressure stripping in galaxy groups (down to very low group masses), though the frequency of jellyfish galaxies and the strength of ram pressure stripping appear lower in groups than in clusters. Differences in the efficiency of ram pressure stripping in groups versus clusters likely contribute to the positive trend between quenched fraction and host halo mass observed in the local Universe.