Model documentation & parameters
Algorithm Version: Which model version to use.
Maximal sequence length: The maximum number of SMILES tokens in the generated molecule.
Number of samples: How many samples should be generated (between 1 and 50).
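The parameters above map onto GT4SD's Python interface. The sketch below shows how one might sample polymer blocks programmatically; the import path, class names, and the sample call follow GT4SD's conventions for this algorithm, but the exact configuration keyword names (e.g., algorithm_version) and their defaults are assumptions here, so check the gt4sd documentation for your installed version.

```python
# Minimal sketch: sampling polymer-block SMILES with GT4SD.
# Configuration keywords mirror the parameters documented above;
# their exact names may vary across GT4SD versions.
from gt4sd.algorithms.generation.polymer_blocks.core import (
    PolymerBlocks,
    PolymerBlocksGenerator,
)

# Configure the generator (defaults are used when no arguments are given).
configuration = PolymerBlocksGenerator()

# Wrap the configuration in the sampling algorithm.
algorithm = PolymerBlocks(configuration=configuration)

# Draw 10 candidate molecules (the hosted demo allows 1-50 per request).
molecules = list(algorithm.sample(10))
for smiles in molecules:
    print(smiles)
```

Note that sample yields items lazily; wrapping it in list materializes the requested number of molecules.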
Model card -- PolymerBlocks
Model Details: PolymerBlocks is a sequence-based molecular generator tuned to generate blocks of polymers (e.g., catalysts and monomers). The model relies on a Variational Autoencoder architecture as described in Born et al. (2021; iScience); an illustrative sketch of this architecture family follows the model card below.
Developers: Matteo Manica and colleagues from IBM Research.
Distributors: Original authors' code integrated into GT4SD.
Model date: Not yet published.
Model version: Initial version only.
Model type: A sequence-based molecular generator tuned to generate blocks of polymers (e.g., catalysts and monomers).
Information about training algorithms, parameters, fairness constraints or other applied approaches, and features: N.A.
Paper or other resource for more information: TBD
License: MIT
Where to send questions or comments about the model: Open an issue on the GT4SD repository.
Intended Use. Use cases that were envisioned during development: Chemical research, in particular drug discovery.
Primary intended uses/users: Researchers and computational chemists using the model for model comparison or research exploration purposes.
Out-of-scope use cases: Production-level inference, producing molecules with harmful properties.
Metrics: N.A.
Datasets: N.A.
Ethical Considerations: Unclear, please consult with original authors in case of questions.
Caveats and Recommendations: Unclear, please consult with original authors in case of questions.
Model card prototype inspired by Mitchell et al. (2019).
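Since the model card points to a Variational Autoencoder backbone (Born et al., 2021), the sketch below illustrates the general shape of a sequence VAE over SMILES tokens: an encoder compresses a token sequence into a latent Gaussian, and a decoder reconstructs token logits from a sampled latent. Everything here (layer types, dimensions, the SmilesVAE name) is a hypothetical illustration of the architecture family, not the published PolymerBlocks model.

```python
# Illustrative only: a compact sequence VAE over SMILES tokens.
# All hyperparameters and layer choices are assumptions for demonstration.
import torch
import torch.nn as nn

class SmilesVAE(nn.Module):
    def __init__(self, vocab_size=64, embed_dim=128, hidden_dim=256, latent_dim=32):
        super().__init__()
        self.embed = nn.Embedding(vocab_size, embed_dim)
        self.encoder = nn.GRU(embed_dim, hidden_dim, batch_first=True)
        self.to_mu = nn.Linear(hidden_dim, latent_dim)
        self.to_logvar = nn.Linear(hidden_dim, latent_dim)
        self.latent_to_hidden = nn.Linear(latent_dim, hidden_dim)
        self.decoder = nn.GRU(embed_dim, hidden_dim, batch_first=True)
        self.out = nn.Linear(hidden_dim, vocab_size)

    def forward(self, tokens):
        # Encode the token sequence into a latent Gaussian.
        _, h = self.encoder(self.embed(tokens))
        mu, logvar = self.to_mu(h[-1]), self.to_logvar(h[-1])
        # Reparameterization trick: sample z differentiably.
        z = mu + torch.randn_like(mu) * torch.exp(0.5 * logvar)
        # Decode token logits conditioned on z (teacher forcing on inputs).
        h0 = torch.tanh(self.latent_to_hidden(z)).unsqueeze(0)
        dec_out, _ = self.decoder(self.embed(tokens), h0)
        return self.out(dec_out), mu, logvar

# Example forward pass on a dummy batch of tokenized SMILES.
logits, mu, logvar = SmilesVAE()(torch.randint(0, 64, (2, 20)))
```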
Citation
TBD; in the meantime, please cite:
@article{manica2022gt4sd,
  title={GT4SD: Generative Toolkit for Scientific Discovery},
  author={Manica, Matteo and Cadow, Joris and Christofidellis, Dimitrios and Dave, Ashish and Born, Jannis and Clarke, Dean and Teukam, Yves Gaetan Nana and Hoffman, Samuel C and Buchan, Matthew and Chenthamarakshan, Vijil and others},
  journal={arXiv preprint arXiv:2207.03928},
  year={2022}
}